Results

The table below presents the up-to-date results of the WMH Segmentation Challenge. Methods are ranked according to five evaluation criteria: Dice similarity coefficient (DSC), 95th-percentile Hausdorff distance (H95), absolute volume difference (AVD), lesion recall, and lesion F1.

| # | Team | Rank | DSC | H95 (mm) | AVD (%) | Recall | F1 |
|---|------|------|-----|----------|---------|--------|----|
| 1 | pgs | 0.0185 | 0.81 | 5.63 | 18.58 | 0.82 | 0.79 |
| 2 | sysu_media_2 | 0.0187 | 0.80 | 5.76 | 28.73 | 0.87 | 0.76 |
| 3 | sysu_media | 0.0288 | 0.80 | 6.30 | 21.88 | 0.84 | 0.76 |
| 4 | coroflo | 0.0493 | 0.79 | 5.46 | 22.53 | 0.76 | 0.77 |
| 5 | neuro.ml_2 | 0.0511 | 0.78 | 6.33 | 30.63 | 0.82 | 0.73 |
| 6 | cian | 0.0571 | 0.78 | 6.82 | 21.72 | 0.83 | 0.70 |
| 7 | nlp_logix | 0.0678 | 0.77 | 7.16 | 18.37 | 0.73 | 0.78 |
| 8 | nih_cidi_2 | 0.0803 | 0.75 | 7.35 | 27.26 | 0.81 | 0.69 |
| 9 | bigrbrain_2 | 0.0847 | 0.77 | 9.46 | 28.04 | 0.78 | 0.71 |
| 10 | bigrbrain | 0.0910 | 0.78 | 6.75 | 23.24 | 0.70 | 0.73 |
| 11 | nic-vicorob | 0.0927 | 0.77 | 8.28 | 28.54 | 0.75 | 0.71 |
| 12 | wta | 0.1012 | 0.78 | 6.78 | 16.20 | 0.66 | 0.73 |
| 13 | misp_2 | 0.1178 | 0.78 | 11.10 | 19.71 | 0.68 | 0.71 |
| 14 | k2 | 0.1531 | 0.77 | 9.79 | 19.08 | 0.59 | 0.70 |
| 15 | lrde | 0.1797 | 0.73 | 14.54 | 21.71 | 0.63 | 0.67 |
| 16 | misp | 0.1821 | 0.72 | 14.88 | 21.36 | 0.63 | 0.68 |
| 17 | ipmi-bern | 0.2625 | 0.69 | 9.72 | 19.92 | 0.44 | 0.57 |
| 18 | nih_cidi | 0.2843 | 0.68 | 12.82 | 196.38 | 0.59 | 0.54 |
| 19 | scan | 0.2898 | 0.63 | 14.34 | 34.67 | 0.55 | 0.51 |
| 20 | tig_corr | 0.3039 | 0.68 | 17.48 | 39.07 | 0.54 | 0.46 |
| 21 | livia | 0.3044 | 0.61 | 22.70 | 38.39 | 0.54 | 0.61 |
| 22 | achilles | 0.3081 | 0.63 | 11.82 | 24.41 | 0.45 | 0.52 |
| 23 | skkumedneuro | 0.3604 | 0.58 | 19.02 | 58.54 | 0.47 | 0.51 |
| 24 | tignet | 0.3909 | 0.59 | 21.58 | 86.22 | 0.46 | 0.45 |
| 25 | tig | 0.3955 | 0.60 | 17.86 | 34.34 | 0.38 | 0.42 |
| 26 | knight | 0.4239 | 0.70 | 17.03 | 39.99 | 0.25 | 0.35 |
| 27 | upc_dlmi | 0.4449 | 0.53 | 27.01 | 208.49 | 0.57 | 0.42 |
| 28 | himinn | 0.4500 | 0.62 | 24.49 | 44.19 | 0.33 | 0.36 |
| 29 | nist | 0.4830 | 0.53 | 15.91 | 109.98 | 0.37 | 0.25 |
| 30 | text_class | 0.5781 | 0.50 | 28.23 | 146.64 | 0.27 | 0.29 |
| 31 | neuro.ml | 0.6074 | 0.51 | 37.36 | 614.05 | 0.71 | 0.21 |
| 32 | hadi | 0.8940 | 0.23 | 52.02 | 828.61 | 0.58 | 0.11 |
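For reference, the volumetric metrics in the table can be reproduced from binary segmentation masks. The sketch below (NumPy, function names illustrative) computes DSC and AVD voxel-wise; it is not the challenge's official evaluation code, and note that in the challenge the Recall and F1 scores are computed per lesion rather than per voxel, which requires connected-component analysis on top of this.

```python
import numpy as np

def dice(gt, pred):
    """Dice similarity coefficient (DSC) between two binary masks."""
    gt = np.asarray(gt, dtype=bool)
    pred = np.asarray(pred, dtype=bool)
    intersection = np.logical_and(gt, pred).sum()
    return 2.0 * intersection / (gt.sum() + pred.sum())

def avd(gt, pred):
    """Absolute volume difference, as a percentage of the reference volume."""
    gt = np.asarray(gt, dtype=bool)
    pred = np.asarray(pred, dtype=bool)
    return abs(int(pred.sum()) - int(gt.sum())) / gt.sum() * 100.0

if __name__ == "__main__":
    # Toy 2D example; real WMH masks are 3D volumes.
    gt = np.array([[1, 1, 0], [0, 0, 0]])
    pred = np.array([[1, 0, 0], [0, 0, 1]])
    print(dice(gt, pred), avd(gt, pred))
```

H95 additionally needs surface-distance computations (e.g. via SciPy distance transforms) and is omitted here.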

Results presented at MICCAI 2017 in Quebec City are available on the MICCAI results page.