Explaining level changes

David Brierley v Graham Howe (Wed 07 May 2025)
Match won by David Brierley. Result: 11-9, 7-11, 11-6, 12-10.

Starting level for David Brierley: 2,730, level confidence: 77%. Set manually.
Starting level for Graham Howe: 2,058, level confidence: 46%.

David Brierley was expected to win as he is currently playing 33% better than Graham Howe.

David Brierley won 75% of the games and 53% of the points. The games result would be expected if he were better by around 25%; the points result would be expected if he were better by around 14% (PAR scoring). These are weighted and combined to calculate that David Brierley played 21% better than Graham Howe in this match.

As David Brierley played below his allowed range, at 2,527, his level reduction is 1.2% before damping. On the assumption that David Brierley would normally have been playing at level 2,660 (based on typical behaviour), Graham Howe played better than expected and therefore gains a pre-damping level increase of 3.2%.

Allowing for the difference in level between the players, the adjustments are reduced to 1% and 2.8% respectively.

Factoring in the relative levels of confidence, which allows players with low confidence in their levels to change more quickly, the adjustment for David Brierley changes to -0.6% and for Graham Howe stays at +2.8%.

After applying standard match damping, the adjustment for David Brierley becomes -0.4% and for Graham Howe +2.1%.

Given David Brierley's level and the type of match played, an additional damping of 14% has been applied to his level change. Given Graham Howe's level and the type of match played, an additional damping of 1.2% has been applied to his level change.

Applying the match/event weighting of 50% for 'Fair Oak Boxes', the adjustment for David Brierley is 0% and for Graham Howe is +1%.

Level confidence increases due to one more match played. David Brierley: 88%, Graham Howe: 68%. Level confidence is then reduced based on how unexpected the result is.
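The quoted percentages above can be reproduced with simple ratio arithmetic. This is a minimal sketch, not the published SquashLevels formula: it assumes the pre-match prediction is the raw level ratio, that the points comparison under PAR scoring is the raw points ratio, and it uses a games/points weighting of roughly 64/36 chosen purely so the combination lands on the quoted 21%.

```python
# Sketch of the arithmetic behind the quoted percentages.
# The 64/36 games/points weighting is an assumption chosen to
# reproduce the published 21% figure, not a documented formula.

scores = [(11, 9), (7, 11), (11, 6), (12, 10)]  # David's points first

# Pre-match prediction: level ratio 2,730 / 2,058 -> ~33% better.
predicted_adv = 2730 / 2058 - 1          # ~0.33

# Points: David won 41 of 77 points -> 53%; a 41:36 points ratio
# corresponds to being ~14% better under PAR scoring.
won = sum(a for a, b in scores)          # 41
lost = sum(b for a, b in scores)         # 36
points_share = won / (won + lost)        # ~0.53
points_adv = won / lost - 1              # ~0.14

# Games: David won 3 of 4 games -> 75%, quoted as ~25% better.
games_adv = 0.25

# Weighted combination of the two estimates -> ~21%.
combined = 0.64 * games_adv + 0.36 * points_adv
```

With these assumptions the sketch matches every quoted figure to the nearest percent, which suggests the published numbers are rounded outputs of ratios very much like these.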
David Brierley: 84%, Graham Howe: 65%.

A final adjustment of -0.1% has been made to both players as part of the automatic calibration that is performed after each match. All players in this pool will have been adjusted equally in order to remain equivalent to other player pools.

Final level for David Brierley: 2,721, level confidence: 84%.
Final level for Graham Howe: 2,076, level confidence: 65%.
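The chain of damping and weighting steps can be followed numerically for Graham Howe. A minimal sketch using the quoted figures; the multiplicative ordering is an assumption, and because the published percentages are rounded, the reconstructed final level lands near, but not exactly on, the published 2,076.

```python
# Follow Graham Howe's adjustment through the quoted damping chain.
# Ordering of the multiplicative steps is an assumption.
start = 2058.0

after_std = 2.1                        # % after standard match damping (quoted)
after_extra = after_std * (1 - 0.012)  # additional 1.2% damping -> ~2.075
final_adj = after_extra * 0.5          # 50% 'Fair Oak Boxes' weighting -> ~1.04,
                                       # displayed as +1%

# Apply the adjustment, then the -0.1% pool calibration.
final = start * (1 + final_adj / 100) * (1 - 0.001)  # ~2,077
```

The reconstruction comes out around 2,077 against the published 2,076, a gap consistent with the quoted percentages being rounded to one decimal place.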
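The confidence step earlier ("players with low confidence in their levels change more quickly") is consistent with scaling each adjustment by the player's uncertainty relative to the pair's average uncertainty, capped at 1. This formula is purely reverse-engineered from the quoted numbers (-1% becoming -0.6% for the 77%-confidence player, +2.8% unchanged for the 46%-confidence player) and is an assumption, not the published SquashLevels method.

```python
# Hypothetical confidence scaling, reverse-engineered from the quoted
# numbers; NOT a documented SquashLevels formula.
def confidence_scaled(adj_pct, conf, other_conf):
    # Scale by this player's uncertainty (1 - conf) relative to the
    # pair's mean uncertainty, capped at 1 so the low-confidence
    # player's adjustment is never inflated beyond its input.
    mean_unc = ((1 - conf) + (1 - other_conf)) / 2
    return adj_pct * min(1.0, (1 - conf) / mean_unc)

david = confidence_scaled(-1.0, 0.77, 0.46)   # ~ -0.6
graham = confidence_scaled(2.8, 0.46, 0.77)   # ~ +2.8 (capped)
```

One pair of data points cannot pin down the real formula, but the cap explains why only the high-confidence player's adjustment shrank at this step.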