
Elo Rating System: Ranking Champions League teams using Clojure Part 2

A few weeks ago I wrote about ranking Champions League teams using the Elo Rating algorithm, and since then I’ve collated data for ten years’ worth of matches, so I thought an update was in order.

After extracting the details of all those matches I saved them to a JSON file so that I wouldn’t have to parse the HTML pages every time I tweaked the algorithm. This should also make it easier for other people to play with the data.

I described the algorithm in the previous post and analysed the rankings for one season. However, it was difficult to understand why teams had been ranked in a certain order, so I drilled into the data to find out.

Since the 2012/2013 season is the freshest in my memory I started with that.

The first thing to do was load the matches from disk:

~lisp
(ns ranking-algorithms.uefa
  (:require [clj-time.format :as f]
            [clojure.data.json :as json]))

(defn as-date [date-field]
  ;; "yyyy" rather than "YYYY": in Joda-Time "YYYY" is the week-based year,
  ;; which can resolve to the wrong year for dates around New Year
  (f/parse (f/formatter "dd MMM yyyy") date-field))

(defn date-aware-value-reader [key value]
  (if (= key :date) (as-date value) value))

(defn read-from-file [file]
  (json/read-str (slurp file)
                 :value-fn date-aware-value-reader
                 :key-fn keyword))
~
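A quick REPL check shows what the value reader is doing. The JSON fragment below is made up (it just mimics the shape of a match entry in the file) and the reader is inlined so the snippet stands alone:

```clojure
(require '[clojure.data.json :as json]
         '[clj-time.format :as f]
         '[clj-time.core :as t])

;; Hypothetical match entry in the same shape as the data file
(def sample "{\"date\": \"22 Aug 2012\", \"home\": \"Celtic\"}")

(def parsed
  (json/read-str sample
                 ;; only the :date key gets turned into a DateTime
                 :value-fn (fn [k v]
                             (if (= k :date)
                               (f/parse (f/formatter "dd MMM yyyy") v)
                               v))
                 :key-fn keyword))

(t/year (:date parsed))  ;; => 2012
(:home parsed)           ;; => "Celtic"
```

Everything except `:date` comes through untouched, which is exactly why `read-from-file` only special-cases that one key.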

I’ve written previously about reifying a date masquerading as a string, but for now we want to give those matches a name and then process them.

~lisp
> (def the-matches (read-from-file "data/cl-matches-2013.json"))
#'ranking-algorithms.uefa/the-matches

> (count the-matches)
213
~

I already had a function, top-teams (https://github.com/mneedham/ranking-algorithms/blob/27f94ccf1fc6c63663c8b3d512a2470330585803/src/ranking_algorithms/core.clj#L28), which would apply the Elo algorithm across the matches, but since I wanted to drill into each team’s performance I wrapped it in another one, print-top-teams (https://github.com/mneedham/ranking-algorithms/blob/27f94ccf1fc6c63663c8b3d512a2470330585803/src/ranking_algorithms/core.clj#L121):

~lisp
(comment "other functions excluded for brevity")

(defn format-for-printing [all-matches idx [team ranking & [rd]]]
  (let [team-matches (show-matches team all-matches)]
    (merge {:rank    (inc idx)
            :team    team
            :ranking ranking
            :rd      rd
            :round   (performance team-matches)}
           (match-record team-matches))))

(defn print-top-teams
  ([number all-matches] (print-top-teams number all-matches {}))
  ([number all-matches base-rankings]
     (clojure.pprint/print-table
      [:rank :team :ranking :round :wins :draw :loses]
      (map-indexed (partial format-for-printing all-matches)
                   (top-teams number all-matches base-rankings)))))
~

~lisp
> (ranking-algorithms.core/print-top-teams 10 the-matches)

========================================================================
:rank | :team       | :ranking | :round         | :wins | :draw | :loses
========================================================================
    1 | Bayern      |  1272.74 | Final          |    10 |     1 |      2
    2 | PSG         |  1230.02 | Quarter-finals |     6 |     3 |      1
    3 | Dortmund    |  1220.96 | Final          |     7 |     4 |      2
    4 | Real Madrid |  1220.33 | Semi-finals    |     6 |     3 |      3
    5 | Porto       |  1216.97 | Round of 16    |     5 |     1 |      2
    6 | CFR Cluj    |  1216.56 | Group stage    |     7 |     1 |      2
    7 | Galatasaray |  1215.56 | Quarter-finals |     5 |     2 |      3
    8 | Juventus    |  1214.0  | Quarter-finals |     5 |     3 |      2
    9 | Málaga      |  1211.53 | Quarter-finals |     5 |     5 |      2
   10 | Valencia    |  1211.0  | Round of 16    |     4 |     2 |      2
========================================================================
~

I’ve excluded most of the functions but you can find the source in core.clj on github.

Clojure-wise I learnt about the http://clojuredocs.org/clojure_core/clojure.pprint/print-table function, which came in handy, and Elo-wise I realised that the ranking places a heavy emphasis on winning matches.
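For anyone who hasn’t used it, print-table takes an optional vector of column keys followed by a sequence of maps; a tiny example with made-up rows:

```clojure
(require '[clojure.pprint :as pp])

;; Made-up rows, just to show the shape print-table expects
(def rows [{:team "Bayern" :ranking 1272.74}
           {:team "PSG"    :ranking 1230.02}])

;; The vector of keys both selects and orders the columns
(pp/print-table [:team :ranking] rows)
```

Omitting the key vector prints every key found in the maps, in an order you don’t control, which is why I pass it explicitly above.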

If you follow the Champions League closely you’ll have noticed that Barcelona are missing from the top 10 despite reaching the semi-finals. The Elo algorithm actually ranks them in 65th position:

~lisp
================================================================================
:rank | :team      | :ranking | :round                 | :wins | :draw | :loses
================================================================================
   63 | Motherwell |  1195.04 | Third qualifying round |     0 |     0 |      2
   64 | Feyenoord  |  1195.04 | Third qualifying round |     0 |     0 |      2
   65 | Barcelona  |  1194.68 | Semi-finals            |     5 |     3 |      4
   66 | BATE       |  1194.36 | Group stage            |     5 |     3 |      4
   67 | Anderlecht |  1193.41 | Group stage            |     4 |     2 |      4
================================================================================
~

It’s all about winning

I thought there might be a bug in my implementation, but having looked through it multiple times I’m satisfied there isn’t: Barcelona’s low ranking results from losing multiple games - to Celtic, to AC Milan and twice to Bayern Munich - and from progressing past PSG in their quarter-final tie without winning either leg.

I did apply a higher weighting to matches won later in the competition, but otherwise the Elo algorithm doesn’t take tournament progress into account.
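That round weighting amounts to scaling Elo’s K-factor by the stage of the competition. A minimal sketch of the idea - the weights and K value below are illustrative, not the ones core.clj actually uses:

```clojure
(def round-weights
  ;; Illustrative multipliers, not the values used in core.clj
  {"Group stage"    1.0
   "Round of 16"    1.25
   "Quarter-finals" 1.5
   "Semi-finals"    1.75
   "Final"          2.0})

(defn expected-score
  "Standard Elo expectation of `rating` against `opponent-rating`."
  [rating opponent-rating]
  (/ 1.0 (+ 1.0 (Math/pow 10.0 (/ (- opponent-rating rating) 400.0)))))

(defn update-rating
  "Elo update with the K-factor scaled by the round's weight.
   `actual-score` is 1 for a win, 0.5 for a draw, 0 for a loss."
  [k round rating opponent-rating actual-score]
  (let [weight (get round-weights round 1.0)]
    (+ rating (* k weight (- actual-score (expected-score rating opponent-rating))))))

;; Beating an equally-rated side in the final moves the rating twice as far
;; as the same win in the group stage:
(update-rating 32 "Final" 1200 1200 1)        ;; => 1232.0
(update-rating 32 "Group stage" 1200 1200 1)  ;; => 1216.0
```

Note that a draw still only scores 0.5 regardless of round, which is why Barcelona’s two draws against PSG earned them so little despite sending them through.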

Low variation in rankings

The initial ranking of each team was 1200, so I was surprised to see that the top ranked team had only achieved a ranking of 1272 - I expected it to be higher.

I read a bit more about the algorithm and learnt that a 200-point gap in ratings signifies that the higher-ranked team should win about 75% of the time.
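That 75% figure falls out of Elo’s expected-score formula, E = 1 / (1 + 10^(-gap/400)). A quick check, including the roughly 100-point top-to-bottom spread in this data set:

```clojure
(defn win-expectancy
  "Expected score for a side rated `gap` points above its opponent."
  [gap]
  (/ 1.0 (+ 1.0 (Math/pow 10.0 (/ (- gap) 400.0)))))

(win-expectancy 200)  ;; ≈ 0.76
(win-expectancy 101)  ;; ≈ 0.64 - roughly the gap between top and bottom here
```

So even the best and worst teams in these rankings are only separated by about a 64/36 expectation, which doesn’t feel like it reflects the real gulf between them.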

For this data set the top-ranked team has 1272 points and the lowest-ranked team has 1171 - a spread of only about 100 points - so we probably need to tweak the algorithm to make it more accurate.

Accuracy of the Elo algorithm

My understanding of the Elo algorithm is that it becomes more accurate as teams play more matches, so I decided to try it out on all the matches from 2004 to 2012.

I adapted the print-top-teams function to exclude ':round' since it doesn’t make sense in this context:

~lisp
(comment "I really need to pull out the printing stuff into a function but I’m lazy so I haven’t...yet")

(defn print-top-teams-without-round
  ([number all-matches] (print-top-teams-without-round number all-matches {}))
  ([number all-matches base-rankings]
     (clojure.pprint/print-table
      [:rank :team :ranking :wins :draw :loses]
      (map-indexed (partial format-for-printing all-matches)
                   (top-teams number all-matches base-rankings)))))
~

If we evaluate that function we see the following rankings:

~lisp
> (def the-matches (read-from-file "data/cl-matches-2004-2012.json"))
#'ranking-algorithms.uefa/the-matches

> (ranking-algorithms.core/print-top-teams-without-round 10 the-matches)

==========================================================
:rank | :team          | :ranking | :wins | :draw | :loses
==========================================================
    1 | Barcelona      |  1383.85 |    55 |    25 |     12
    2 | Man. United    |  1343.54 |    49 |    21 |     14
    3 | Chelsea        |  1322.0  |    44 |    27 |     17
    4 | Real Madrid    |  1317.68 |    42 |    14 |     18
    5 | Bayern         |  1306.18 |    42 |    13 |     19
    6 | Arsenal        |  1276.83 |    47 |    21 |     18
    7 | Liverpool      |  1272.52 |    41 |    17 |     17
    8 | Internazionale |  1260.27 |    36 |    18 |     21
    9 | Milan          |  1257.63 |    34 |    22 |     18
   10 | Bordeaux       |  1243.04 |    12 |     3 |      7
==========================================================
~

The only finalists missing from this list are Monaco and Porto, who contested the final in 2004 but haven’t reached that level of performance since.

Bordeaux are the only unlikely entry in this list, and they have played roughly 60 fewer games than the other teams, which suggests we should have less confidence in their ranking. It was at this stage that I started looking at the Glicko algorithm, which calculates a rating reliability as well as the rating itself.
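Glicko captures that lack of confidence by attaching a rating deviation (RD) to every rating: in its expected-score calculation an opponent’s rating is discounted by a factor g(RD) that shrinks as RD grows, so a rating built on few games counts for less. A sketch of that factor, using the standard constants from Glickman’s Glicko paper rather than anything in my code:

```clojure
;; q is a fixed scaling constant in Glicko: ln(10)/400
(def q (/ (Math/log 10) 400))

(defn g
  "Glicko discount factor for a rating with deviation `rd`.
   1.0 for a perfectly certain rating; smaller as uncertainty grows."
  [rd]
  (/ 1.0 (Math/sqrt (+ 1.0 (/ (* 3 q q rd rd)
                              (* Math/PI Math/PI))))))

(g 0)    ;; => 1.0  - a rating we trust completely
(g 350)  ;; ≈ 0.67 - the conventional RD for a brand-new, unrated player
```

Under this scheme Bordeaux’s small sample would show up as a large RD, flagging their 10th-place rating as much less reliable than Barcelona’s.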

I’ve added instructions to the github README showing some examples, but if you have any questions feel free to ping me here or on twitter.
