Results

The table below presents the up-to-date results of the WMH Segmentation Challenge. Methods are ranked according to five evaluation criteria: Dice similarity coefficient (DSC), 95th-percentile Hausdorff distance (H95), absolute volume difference (AVD), recall, and F1. A lower Rank score indicates better overall performance.

| #  | Team                      | Rank   | DSC  | H95 (mm) | AVD (%) | Recall | F1   |
|----|---------------------------|--------|------|----------|---------|--------|------|
| 1  | pgs                       | 0.0185 | 0.81 | 5.63     | 18.58   | 0.82   | 0.79 |
| 2  | sysu_media_2              | 0.0187 | 0.80 | 5.76     | 28.73   | 0.87   | 0.76 |
| 3  | sysu_media                | 0.0288 | 0.80 | 6.30     | 21.88   | 0.84   | 0.76 |
| 4  | buckeye_ai                | 0.0314 | 0.79 | 6.17     | 22.99   | 0.83   | 0.77 |
| 5  | tum_ibbm                  | 0.0318 | 0.79 | 5.90     | 22.33   | 0.83   | 0.76 |
| 6  | pitt-vamp-2               | 0.0385 | 0.79 | 5.74     | 19.69   | 0.81   | 0.76 |
| 7  | coroflo                   | 0.0493 | 0.79 | 5.46     | 22.53   | 0.76   | 0.77 |
| 8  | neuro.ml_2                | 0.0511 | 0.78 | 6.33     | 30.63   | 0.82   | 0.73 |
| 9  | cian                      | 0.0571 | 0.78 | 6.82     | 21.72   | 0.83   | 0.70 |
| 10 | bioengineering_espol_team | 0.0596 | 0.78 | 6.24     | 28.26   | 0.78   | 0.74 |
| 11 | nlp_logix                 | 0.0678 | 0.77 | 7.16     | 18.37   | 0.73   | 0.78 |
| 12 | pitt-vamp                 | 0.0753 | 0.79 | 8.85     | 19.53   | 0.76   | 0.73 |
| 13 | nih_cidi_2                | 0.0803 | 0.75 | 7.35     | 27.26   | 0.81   | 0.69 |
| 14 | bigrbrain_2               | 0.0847 | 0.77 | 9.46     | 28.04   | 0.78   | 0.71 |
| 15 | uned_stand_2021           | 0.0866 | 0.79 | 6.37     | 17.56   | 0.69   | 0.73 |
| 16 | fmrib-truenet_2           | 0.0891 | 0.77 | 6.95     | 20.49   | 0.74   | 0.70 |
| 17 | bigrbrain                 | 0.0910 | 0.78 | 6.75     | 23.24   | 0.70   | 0.73 |
| 18 | uned_2                    | 0.0927 | 0.78 | 7.59     | 27.69   | 0.77   | 0.66 |
| 19 | nic-vicorob               | 0.0927 | 0.77 | 8.28     | 28.54   | 0.75   | 0.71 |
| 20 | rasha_improved            | 0.0964 | 0.77 | 7.42     | 24.97   | 0.76   | 0.67 |
| 21 | wta                       | 0.1012 | 0.78 | 6.78     | 16.20   | 0.66   | 0.73 |
| 22 | dice                      | 0.1046 | 0.77 | 7.63     | 19.77   | 0.69   | 0.71 |
| 23 | uned_enhanced_2021        | 0.1063 | 0.79 | 7.23     | 18.64   | 0.66   | 0.71 |
| 24 | fmrib-truenet             | 0.1114 | 0.75 | 8.91     | 27.93   | 0.72   | 0.70 |
| 25 | wta_2                     | 0.1132 | 0.76 | 8.21     | 21.31   | 0.67   | 0.72 |
| 26 | misp_2                    | 0.1178 | 0.78 | 11.10    | 19.71   | 0.68   | 0.71 |
| 27 | acunet_2-2                | 0.1346 | 0.71 | 8.34     | 22.58   | 0.69   | 0.70 |
| 28 | uned_contrast             | 0.1355 | 0.75 | 9.90     | 44.33   | 0.77   | 0.60 |
| 29 | k2                        | 0.1531 | 0.77 | 9.79     | 19.08   | 0.59   | 0.70 |
| 30 | uned                      | 0.1552 | 0.73 | 11.04    | 55.84   | 0.81   | 0.54 |
| 31 | rasha_simple              | 0.1562 | 0.74 | 9.05     | 29.73   | 0.65   | 0.64 |
| 32 | acunet_2-4                | 0.1642 | 0.69 | 10.30    | 26.14   | 0.63   | 0.71 |
| 33 | acunet_2-1                | 0.1696 | 0.69 | 8.96     | 28.99   | 0.65   | 0.65 |
| 34 | acunet                    | 0.1735 | 0.65 | 9.22     | 29.35   | 0.67   | 0.67 |
| 35 | lrde                      | 0.1797 | 0.73 | 14.54    | 21.71   | 0.63   | 0.67 |
| 36 | misp                      | 0.1821 | 0.72 | 14.88    | 21.36   | 0.63   | 0.68 |
| 37 | acunet_2-3                | 0.2033 | 0.66 | 10.31    | 28.40   | 0.58   | 0.67 |
| 38 | ipmi-bern                 | 0.2625 | 0.69 | 9.72     | 19.92   | 0.44   | 0.57 |
| 39 | nih_cidi                  | 0.2843 | 0.68 | 12.82    | 196.38  | 0.59   | 0.54 |
| 40 | arg                       | 0.2879 | 0.70 | 21.54    | 46.11   | 0.53   | 0.57 |
| 41 | scan                      | 0.2898 | 0.63 | 14.34    | 34.67   | 0.55   | 0.51 |
| 42 | tig_corr                  | 0.3039 | 0.68 | 17.48    | 39.07   | 0.54   | 0.46 |
| 43 | livia                     | 0.3044 | 0.61 | 22.70    | 38.39   | 0.54   | 0.61 |
| 44 | achilles                  | 0.3081 | 0.63 | 11.82    | 24.41   | 0.45   | 0.52 |
| 45 | caai_amh                  | 0.3090 | 0.62 | 15.03    | 28.54   | 0.45   | 0.57 |
| 46 | skkumedneuro              | 0.3604 | 0.58 | 19.02    | 58.54   | 0.47   | 0.51 |
| 47 | tignet                    | 0.3909 | 0.59 | 21.58    | 86.22   | 0.46   | 0.45 |
| 48 | tig                       | 0.3955 | 0.60 | 17.86    | 34.34   | 0.38   | 0.42 |
| 49 | knight                    | 0.4239 | 0.70 | 17.03    | 39.99   | 0.25   | 0.35 |
| 50 | upc_dlmi                  | 0.4449 | 0.53 | 27.01    | 208.49  | 0.57   | 0.42 |
| 51 | himinn                    | 0.4500 | 0.62 | 24.49    | 44.19   | 0.33   | 0.36 |
| 52 | nist                      | 0.4830 | 0.53 | 15.91    | 109.98  | 0.37   | 0.25 |
| 53 | text_class                | 0.5781 | 0.50 | 28.23    | 146.64  | 0.27   | 0.29 |
| 54 | neuro.ml                  | 0.6074 | 0.51 | 37.36    | 614.05  | 0.71   | 0.21 |
| 55 | hadi                      | 0.8940 | 0.23 | 52.02    | 828.61  | 0.58   | 0.11 |
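The sketch below illustrates one common way such a combined Rank score can be produced: rank every team on each metric separately, normalize each rank to [0, 1] (0 = best), and average across the five criteria. This is an assumption for illustration only; the challenge's exact ranking scheme may differ, and the `rank_scores` helper, team names, and values in the toy example are hypothetical.

```python
def rank_scores(results, higher_is_better):
    """results: {team: {metric: value}}.
    Returns {team: average normalized rank across metrics}, 0 = best."""
    teams = list(results)
    scores = {t: 0.0 for t in teams}
    for metric, higher in higher_is_better.items():
        # Sort so the best value for this metric comes first.
        ordered = sorted(teams, key=lambda t: results[t][metric],
                         reverse=higher)
        for pos, team in enumerate(ordered):
            # Normalize the position to [0, 1].
            scores[team] += pos / (len(teams) - 1)
    return {t: s / len(higher_is_better) for t, s in scores.items()}

# Toy example with two hypothetical teams and the five challenge criteria.
# For DSC, Recall, and F1 higher is better; for H95 and AVD lower is better.
higher = {"DSC": True, "H95": False, "AVD": False, "Recall": True, "F1": True}
demo = {
    "team_a": {"DSC": 0.81, "H95": 5.63, "AVD": 18.58, "Recall": 0.82, "F1": 0.79},
    "team_b": {"DSC": 0.80, "H95": 5.76, "AVD": 28.73, "Recall": 0.87, "F1": 0.76},
}
print(rank_scores(demo, higher))  # team_a wins 4 of 5 metrics → lower score
```

With two teams, `team_a` is ranked first on four metrics and second on one, giving a score of 1/5 = 0.2, while `team_b` gets 0.8.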

Results presented at MICCAI 2017 in Quebec City can be found here: MICCAI results.