Results

The table below presents the up-to-date results of the WMH Segmentation Challenge. Methods are ranked according to five evaluation criteria: Dice similarity coefficient (DSC), 95th-percentile Hausdorff distance (H95), absolute volume difference (AVD), recall, and F1. A sketch of how these metrics can be computed from a reference and a predicted segmentation follows the table.

| #  | Team               | Rank   | DSC  | H95 (mm) | AVD (%) | Recall | F1   |
|----|--------------------|--------|------|----------|---------|--------|------|
| 1  | pgs                | 0.0185 | 0.81 | 5.63     | 18.58   | 0.82   | 0.79 |
| 2  | sysu_media_2       | 0.0187 | 0.80 | 5.76     | 28.73   | 0.87   | 0.76 |
| 3  | sysu_media         | 0.0288 | 0.80 | 6.30     | 21.88   | 0.84   | 0.76 |
| 4  | anonymous_20200413 | 0.0314 | 0.79 | 6.17     | 22.99   | 0.83   | 0.77 |
| 5  | coroflo            | 0.0493 | 0.79 | 5.46     | 22.53   | 0.76   | 0.77 |
| 6  | neuro.ml_2         | 0.0511 | 0.78 | 6.33     | 30.63   | 0.82   | 0.73 |
| 7  | cian               | 0.0571 | 0.78 | 6.82     | 21.72   | 0.83   | 0.70 |
| 8  | nlp_logix          | 0.0678 | 0.77 | 7.16     | 18.37   | 0.73   | 0.78 |
| 9  | nih_cidi_2         | 0.0803 | 0.75 | 7.35     | 27.26   | 0.81   | 0.69 |
| 10 | bigrbrain_2        | 0.0847 | 0.77 | 9.46     | 28.04   | 0.78   | 0.71 |
| 11 | bigrbrain          | 0.0910 | 0.78 | 6.75     | 23.24   | 0.70   | 0.73 |
| 12 | nic-vicorob        | 0.0927 | 0.77 | 8.28     | 28.54   | 0.75   | 0.71 |
| 13 | rasha_improved     | 0.0964 | 0.77 | 7.42     | 24.97   | 0.76   | 0.67 |
| 14 | wta                | 0.1012 | 0.78 | 6.78     | 16.20   | 0.66   | 0.73 |
| 15 | dice               | 0.1046 | 0.77 | 7.63     | 19.77   | 0.69   | 0.71 |
| 16 | fmrib              | 0.1114 | 0.75 | 8.91     | 27.93   | 0.72   | 0.70 |
| 17 | wta_2              | 0.1132 | 0.76 | 8.21     | 21.31   | 0.67   | 0.72 |
| 18 | misp_2             | 0.1178 | 0.78 | 11.10    | 19.71   | 0.68   | 0.71 |
| 19 | uned_contrast      | 0.1355 | 0.75 | 9.90     | 44.33   | 0.77   | 0.60 |
| 20 | k2                 | 0.1531 | 0.77 | 9.79     | 19.08   | 0.59   | 0.70 |
| 21 | uned               | 0.1552 | 0.73 | 11.04    | 55.84   | 0.81   | 0.54 |
| 22 | rasha_simple       | 0.1562 | 0.74 | 9.05     | 29.73   | 0.65   | 0.64 |
| 23 | lrde               | 0.1797 | 0.73 | 14.54    | 21.71   | 0.63   | 0.67 |
| 24 | misp               | 0.1821 | 0.72 | 14.88    | 21.36   | 0.63   | 0.68 |
| 25 | ipmi-bern          | 0.2625 | 0.69 | 9.72     | 19.92   | 0.44   | 0.57 |
| 26 | nih_cidi           | 0.2843 | 0.68 | 12.82    | 196.38  | 0.59   | 0.54 |
| 27 | scan               | 0.2898 | 0.63 | 14.34    | 34.67   | 0.55   | 0.51 |
| 28 | tig_corr           | 0.3039 | 0.68 | 17.48    | 39.07   | 0.54   | 0.46 |
| 29 | livia              | 0.3044 | 0.61 | 22.70    | 38.39   | 0.54   | 0.61 |
| 30 | achilles           | 0.3081 | 0.63 | 11.82    | 24.41   | 0.45   | 0.52 |
| 31 | skkumedneuro       | 0.3604 | 0.58 | 19.02    | 58.54   | 0.47   | 0.51 |
| 32 | tignet             | 0.3909 | 0.59 | 21.58    | 86.22   | 0.46   | 0.45 |
| 33 | tig                | 0.3955 | 0.60 | 17.86    | 34.34   | 0.38   | 0.42 |
| 34 | knight             | 0.4239 | 0.70 | 17.03    | 39.99   | 0.25   | 0.35 |
| 35 | upc_dlmi           | 0.4449 | 0.53 | 27.01    | 208.49  | 0.57   | 0.42 |
| 36 | himinn             | 0.4500 | 0.62 | 24.49    | 44.19   | 0.33   | 0.36 |
| 37 | nist               | 0.4830 | 0.53 | 15.91    | 109.98  | 0.37   | 0.25 |
| 38 | text_class         | 0.5781 | 0.50 | 28.23    | 146.64  | 0.27   | 0.29 |
| 39 | neuro.ml           | 0.6074 | 0.51 | 37.36    | 614.05  | 0.71   | 0.21 |
| 40 | hadi               | 0.8940 | 0.23 | 52.02    | 828.61  | 0.58   | 0.11 |

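For reference, the sketch below shows one way to compute the five per-scan metrics from a reference and a predicted binary WMH mask, assuming standard definitions: voxel-wise DSC and absolute volume difference, a boundary-based 95th-percentile Hausdorff distance, and connected-component (lesion-wise) recall and F1. The function names and the handling of edge cases (e.g., empty masks) are illustrative only; the challenge's official evaluation code may differ in detail.

```python
# Illustrative metric implementations; masks are 3-D boolean numpy arrays and
# `spacing` is the voxel size in mm. This is a sketch, not the official script.
import numpy as np
from scipy import ndimage


def dsc(truth, pred):
    """Dice similarity coefficient between two binary masks."""
    overlap = np.logical_and(truth, pred).sum()
    return 2.0 * overlap / (truth.sum() + pred.sum())


def avd(truth, pred):
    """Absolute volume difference as a percentage of the reference volume."""
    return 100.0 * abs(int(pred.sum()) - int(truth.sum())) / truth.sum()


def h95(truth, pred, spacing=(1.0, 1.0, 1.0)):
    """95th-percentile symmetric surface distance (mm) between mask borders."""
    def border(mask):
        mask = np.asarray(mask, dtype=bool)
        return mask & ~ndimage.binary_erosion(mask)

    bt, bp = border(truth), border(pred)
    # Distance (in mm) from every voxel to the nearest border voxel of each mask.
    dist_to_truth = ndimage.distance_transform_edt(~bt, sampling=spacing)
    dist_to_pred = ndimage.distance_transform_edt(~bp, sampling=spacing)
    return max(np.percentile(dist_to_pred[bt], 95),
               np.percentile(dist_to_truth[bp], 95))


def lesion_recall_f1(truth, pred):
    """Lesion-wise recall and F1 based on 3-D connected components."""
    truth_cc, n_truth = ndimage.label(truth)
    pred_cc, n_pred = ndimage.label(pred)
    # A reference lesion counts as detected if any predicted voxel overlaps it.
    detected = sum(pred[truth_cc == i].any() for i in range(1, n_truth + 1))
    # A predicted lesion is a true positive if it overlaps any reference voxel.
    true_pos = sum(truth[pred_cc == j].any() for j in range(1, n_pred + 1))
    recall = detected / n_truth
    precision = true_pos / n_pred
    f1 = 2 * precision * recall / (precision + recall)
    return recall, f1
```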
Results presented at MICCAI 2017 in Quebec City can be found here: MICCAI results.