Results

The table below presents the up-to-date results of the WMH Segmentation Challenge. Methods are ordered by an overall rank score computed from the five evaluation criteria: Dice similarity coefficient (DSC), 95th-percentile Hausdorff distance (H95), average volume difference (AVD), lesion recall, and lesion F1. A lower rank score is better; for the individual criteria, higher DSC, Recall, and F1 and lower H95 and AVD indicate better performance.

 #  Team                       Rank    DSC   H95 (mm)  AVD (%)  Recall  F1
 1  pgs                        0.0215  0.81  5.63      18.58    0.82    0.79
 2  sysu_media_2               0.0220  0.80  5.76      28.73    0.87    0.76
 3  sysu_media                 0.0320  0.80  6.30      21.88    0.84    0.76
 4  buckeye_ai                 0.0345  0.79  6.17      22.99    0.83    0.77
 5  tum_ibbm                   0.0349  0.79  5.90      22.33    0.83    0.76
 6  pitt-vamp-2                0.0415  0.79  5.74      19.69    0.81    0.76
 7  coroflo                    0.0520  0.79  5.46      22.53    0.76    0.77
 8  neuro.ml_2                 0.0541  0.78  6.33      30.63    0.82    0.73
 9  nus_mnndl                  0.0568  0.76  6.92      50.28    0.88    0.71
10  cian                       0.0602  0.78  6.82      21.72    0.83    0.70
11  bioengineering_espol_team  0.0625  0.78  6.24      28.26    0.78    0.74
12  nlp_logix                  0.0704  0.77  7.16      18.37    0.73    0.78
13  pitt-vamp                  0.0780  0.79  8.85      19.53    0.76    0.73
14  nih_cidi_2                 0.0833  0.75  7.35      27.26    0.81    0.69
15  bigrbrain_2                0.0876  0.77  9.46      28.04    0.78    0.71
16  uned_stand_2021            0.0890  0.79  6.37      17.56    0.69    0.73
17  fmrib-truenet_2            0.0917  0.77  6.95      20.49    0.74    0.70
18  bigrbrain                  0.0934  0.78  6.75      23.24    0.70    0.73
19  nic-vicorob                0.0954  0.77  8.28      28.54    0.75    0.71
20  uned_2                     0.0954  0.78  7.59      27.69    0.77    0.66
21  rasha_improved             0.0991  0.77  7.42      24.97    0.76    0.67
22  wta                        0.1033  0.78  6.78      16.20    0.66    0.73
23  dice                       0.1070  0.77  7.63      19.77    0.69    0.71
24  uned_enhanced_2021         0.1085  0.79  7.23      18.64    0.66    0.71
25  fmrib-truenet              0.1139  0.75  8.91      27.93    0.72    0.70
26  wta_2                      0.1154  0.76  8.21      21.31    0.67    0.72
27  misp_2                     0.1201  0.78  11.10     19.71    0.68    0.71
28  uned_enhanced_2022         0.1215  0.75  6.79      17.68    0.66    0.71
29  acunet_2-2                 0.1369  0.71  8.34      22.58    0.69    0.70
30  uned_contrast              0.1383  0.75  9.90      44.33    0.77    0.60
31  k2                         0.1549  0.77  9.79      19.08    0.59    0.70
32  uned                       0.1581  0.73  11.04     55.84    0.81    0.54
33  rasha_simple               0.1584  0.74  9.05      29.73    0.65    0.64
34  acunet_2-4                 0.1662  0.69  10.30     26.14    0.63    0.71
35  acunet_2-1                 0.1718  0.69  8.96      28.99    0.65    0.65
36  acunet                     0.1758  0.65  9.22      29.35    0.67    0.67
37  lrde                       0.1818  0.73  14.54     21.71    0.63    0.67
38  misp                       0.1842  0.72  14.88     21.36    0.63    0.68
39  acunet_2-3                 0.2051  0.66  10.31     28.40    0.58    0.67
40  ipmi-bern                  0.2635  0.69  9.72      19.92    0.44    0.57
41  nih_cidi                   0.2862  0.68  12.82     196.38   0.59    0.54
42  arg                        0.2894  0.70  21.54     46.11    0.53    0.57
43  scan                       0.2914  0.63  14.34     34.67    0.55    0.51
44  tig_corr                   0.3055  0.68  17.48     39.07    0.54    0.46
45  livia                      0.3060  0.61  22.70     38.39    0.54    0.61
46  achilles                   0.3092  0.63  11.82     24.41    0.45    0.52
47  caai_amh                   0.3101  0.62  15.03     28.54    0.45    0.57
48  skkumedneuro               0.3616  0.58  19.02     58.54    0.47    0.51
49  tignet                     0.3921  0.59  21.58     86.22    0.46    0.45
50  tig                        0.3963  0.60  17.86     34.34    0.38    0.42
51  knight                     0.4239  0.70  17.03     39.99    0.25    0.35
52  upc_dlmi                   0.4466  0.53  27.01     208.49   0.57    0.42
53  himinn                     0.4505  0.62  24.49     44.19    0.33    0.36
54  nist                       0.4836  0.53  15.91     109.98   0.37    0.25
55  text_class                 0.5782  0.50  28.23     146.64   0.27    0.29
56  neuro.ml                   0.6099  0.51  37.36     614.05   0.71    0.21
57  hadi                       0.8957  0.23  52.02     828.61   0.58    0.11
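
One common way to combine several criteria into a single score in [0, 1] is to rank all teams on each metric, normalize each rank, and average. This is only a minimal sketch of that general idea, not the challenge's official formula, which is not given on this page; the function name, the toy team names, and the normalization are all assumptions made for illustration.

```python
def rank_score(metrics, higher_is_better):
    """Hypothetical combined rank score (lower is better).

    metrics: {team: [value per criterion]}
    higher_is_better: [bool per criterion]
    Returns {team: score in [0, 1]}.
    """
    teams = list(metrics)
    n_criteria = len(higher_is_better)
    totals = {t: 0.0 for t in teams}
    for j in range(n_criteria):
        # Sort teams from best to worst on criterion j.
        ordered = sorted(teams, key=lambda t: metrics[t][j],
                         reverse=higher_is_better[j])
        for pos, t in enumerate(ordered):
            # Normalized rank: 0.0 for the best team, 1.0 for the worst.
            totals[t] += pos / (len(teams) - 1)
    # Average the normalized ranks over all criteria.
    return {t: totals[t] / n_criteria for t in teams}

# Toy example with three hypothetical teams and the five criteria
# (DSC, H95, AVD, Recall, F1); only H95 and AVD are lower-is-better.
scores = rank_score(
    {"a": [0.81, 5.63, 18.58, 0.82, 0.79],
     "b": [0.80, 5.76, 28.73, 0.87, 0.76],
     "c": [0.23, 52.02, 828.61, 0.58, 0.11]},
    higher_is_better=[True, False, False, True, True],
)
```

With ties broken by sort order, team a here ends up with the lowest (best) score and team c with the highest.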

Results presented at MICCAI 2017 in Quebec City can be found on the MICCAI results page.