Results

The table below presents the up-to-date results of the WMH Segmentation Challenge. Methods are ranked according to the five evaluation criteria: Dice similarity coefficient (DSC), 95th-percentile Hausdorff distance (H95), absolute volume difference (AVD), recall for individual lesions, and F1-score for individual lesions. The Rank column gives the combined score across these criteria; lower values indicate better overall performance. Sketches of the per-case metrics and the rank aggregation follow the table.

| # | Team | Rank | DSC | H95 (mm) | AVD (%) | Recall | F1 |
|---|------|------|-----|----------|---------|--------|----|
| 1 | pgs | 0.0185 | 0.81 | 5.63 | 18.58 | 0.82 | 0.79 |
| 2 | sysu_media_2 | 0.0187 | 0.80 | 5.76 | 28.73 | 0.87 | 0.76 |
| 3 | sysu_media | 0.0288 | 0.80 | 6.30 | 21.88 | 0.84 | 0.76 |
| 4 | buckeye_ai | 0.0314 | 0.79 | 6.17 | 22.99 | 0.83 | 0.77 |
| 5 | coroflo | 0.0493 | 0.79 | 5.46 | 22.53 | 0.76 | 0.77 |
| 6 | neuro.ml_2 | 0.0511 | 0.78 | 6.33 | 30.63 | 0.82 | 0.73 |
| 7 | cian | 0.0571 | 0.78 | 6.82 | 21.72 | 0.83 | 0.70 |
| 8 | bioengineering_espol_team | 0.0596 | 0.78 | 6.24 | 28.26 | 0.78 | 0.74 |
| 9 | nlp_logix | 0.0678 | 0.77 | 7.16 | 18.37 | 0.73 | 0.78 |
| 10 | nih_cidi_2 | 0.0803 | 0.75 | 7.35 | 27.26 | 0.81 | 0.69 |
| 11 | bigrbrain_2 | 0.0847 | 0.77 | 9.46 | 28.04 | 0.78 | 0.71 |
| 12 | bigrbrain | 0.0910 | 0.78 | 6.75 | 23.24 | 0.70 | 0.73 |
| 13 | nic-vicorob | 0.0927 | 0.77 | 8.28 | 28.54 | 0.75 | 0.71 |
| 14 | rasha_improved | 0.0964 | 0.77 | 7.42 | 24.97 | 0.76 | 0.67 |
| 15 | wta | 0.1012 | 0.78 | 6.78 | 16.20 | 0.66 | 0.73 |
| 16 | dice | 0.1046 | 0.77 | 7.63 | 19.77 | 0.69 | 0.71 |
| 17 | fmrib | 0.1114 | 0.75 | 8.91 | 27.93 | 0.72 | 0.70 |
| 18 | wta_2 | 0.1132 | 0.76 | 8.21 | 21.31 | 0.67 | 0.72 |
| 19 | misp_2 | 0.1178 | 0.78 | 11.10 | 19.71 | 0.68 | 0.71 |
| 20 | uned_2 | 0.1237 | 0.76 | 8.90 | 28.83 | 0.73 | 0.63 |
| 21 | uned_contrast | 0.1355 | 0.75 | 9.90 | 44.33 | 0.77 | 0.60 |
| 22 | k2 | 0.1531 | 0.77 | 9.79 | 19.08 | 0.59 | 0.70 |
| 23 | uned | 0.1552 | 0.73 | 11.04 | 55.84 | 0.81 | 0.54 |
| 24 | rasha_simple | 0.1562 | 0.74 | 9.05 | 29.73 | 0.65 | 0.64 |
| 25 | acunet | 0.1735 | 0.65 | 9.22 | 29.35 | 0.67 | 0.67 |
| 26 | lrde | 0.1797 | 0.73 | 14.54 | 21.71 | 0.63 | 0.67 |
| 27 | misp | 0.1821 | 0.72 | 14.88 | 21.36 | 0.63 | 0.68 |
| 28 | ipmi-bern | 0.2625 | 0.69 | 9.72 | 19.92 | 0.44 | 0.57 |
| 29 | nih_cidi | 0.2843 | 0.68 | 12.82 | 196.38 | 0.59 | 0.54 |
| 30 | scan | 0.2898 | 0.63 | 14.34 | 34.67 | 0.55 | 0.51 |
| 31 | tig_corr | 0.3039 | 0.68 | 17.48 | 39.07 | 0.54 | 0.46 |
| 32 | livia | 0.3044 | 0.61 | 22.70 | 38.39 | 0.54 | 0.61 |
| 33 | achilles | 0.3081 | 0.63 | 11.82 | 24.41 | 0.45 | 0.52 |
| 34 | skkumedneuro | 0.3604 | 0.58 | 19.02 | 58.54 | 0.47 | 0.51 |
| 35 | tignet | 0.3909 | 0.59 | 21.58 | 86.22 | 0.46 | 0.45 |
| 36 | tig | 0.3955 | 0.60 | 17.86 | 34.34 | 0.38 | 0.42 |
| 37 | knight | 0.4239 | 0.70 | 17.03 | 39.99 | 0.25 | 0.35 |
| 38 | upc_dlmi | 0.4449 | 0.53 | 27.01 | 208.49 | 0.57 | 0.42 |
| 39 | himinn | 0.4500 | 0.62 | 24.49 | 44.19 | 0.33 | 0.36 |
| 40 | nist | 0.4830 | 0.53 | 15.91 | 109.98 | 0.37 | 0.25 |
| 41 | text_class | 0.5781 | 0.50 | 28.23 | 146.64 | 0.27 | 0.29 |
| 42 | neuro.ml | 0.6074 | 0.51 | 37.36 | 614.05 | 0.71 | 0.21 |
| 43 | hadi | 0.8940 | 0.23 | 52.02 | 828.61 | 0.58 | 0.11 |
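
For readers who want to reproduce the per-case numbers, the sketch below approximates the five criteria from a pair of binary segmentation masks. It is a minimal, unofficial sketch, not the challenge's evaluation script: `truth` and `pred` are assumed to be 3D boolean NumPy arrays with both masks non-empty, `spacing` is the voxel size in mm, and H95 is approximated with Euclidean distance transforms rather than exact surface distances.

```python
import numpy as np
from scipy import ndimage

def evaluate_case(truth, pred, spacing=(1.0, 1.0, 1.0)):
    """Approximate the five challenge criteria for one case.

    truth, pred: 3D boolean masks (assumed non-empty); spacing: voxel size in mm.
    """
    truth, pred = truth.astype(bool), pred.astype(bool)

    # DSC: Dice similarity coefficient (higher is better).
    dsc = 2.0 * np.logical_and(truth, pred).sum() / (truth.sum() + pred.sum())

    # AVD: absolute volume difference as a percentage of the true volume
    # (lower is better).
    avd = 100.0 * abs(pred.sum() - truth.sum()) / truth.sum()

    # H95: 95th-percentile Hausdorff distance in mm (lower is better),
    # approximated here with voxel-to-mask distance transforms.
    dist_to_truth = ndimage.distance_transform_edt(~truth, sampling=spacing)
    dist_to_pred = ndimage.distance_transform_edt(~pred, sampling=spacing)
    h95 = max(np.percentile(dist_to_truth[pred], 95),
              np.percentile(dist_to_pred[truth], 95))

    # Lesion-wise recall and F1: individual lesions are connected components.
    truth_cc, n_truth = ndimage.label(truth)
    pred_cc, n_pred = ndimage.label(pred)
    tp = np.setdiff1d(np.unique(truth_cc[pred]), [0]).size           # detected true lesions
    fp = n_pred - np.setdiff1d(np.unique(pred_cc[truth]), [0]).size  # spurious detections
    recall = tp / n_truth
    f1 = 2.0 * tp / (2.0 * tp + fp + (n_truth - tp))

    return {"DSC": dsc, "H95": h95, "AVD": avd, "Recall": recall, "F1": f1}
```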

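The overall Rank score combines the five criteria into a single number. Below is a minimal sketch, assuming a scheme of ranking all teams on each criterion, scaling each rank to [0, 1] (best = 0, worst = 1), and averaging the five scaled ranks; the official computation, in particular its tie handling, may differ.

```python
import numpy as np

def overall_rank(scores, higher_is_better):
    """scores: (n_teams, 5) array with columns DSC, H95, AVD, Recall, F1.
    higher_is_better: matching booleans per column.
    Returns one score per team in [0, 1]; lower is better."""
    n_teams = scores.shape[0]
    ranks = np.empty_like(scores, dtype=float)
    for j, hib in enumerate(higher_is_better):
        # Negate columns where higher is better so that sorting ascending
        # always puts the best team first.
        col = -scores[:, j] if hib else scores[:, j]
        # Double argsort gives 0-based ranks; scale so best = 0, worst = 1.
        ranks[:, j] = np.argsort(np.argsort(col)) / (n_teams - 1)
    return ranks.mean(axis=1)

# For the column ordering above:
# overall_rank(scores, higher_is_better=[True, False, False, True, True])
```
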
Results presented at MICCAI 2017 in Quebec City can be found here: MICCAI results.