The table below presents the up-to-date results of the WMH Segmentation Challenge. Methods are ordered by a combined rank score (lower is better) computed from the five evaluation criteria: Dice similarity coefficient (DSC), 95th-percentile Hausdorff distance (H95), absolute volume difference (AVD), lesion recall, and lesion F1.
| # | Team | Rank score | DSC | H95 (mm) | AVD (%) | Recall | F1 |
|---|---|---|---|---|---|---|---|
1 | pgs | 0.0185 | 0.81 | 5.63 | 18.58 | 0.82 | 0.79 |
2 | sysu_media_2 | 0.0187 | 0.80 | 5.76 | 28.73 | 0.87 | 0.76 |
3 | sysu_media | 0.0288 | 0.80 | 6.30 | 21.88 | 0.84 | 0.76 |
4 | buckeye_ai | 0.0314 | 0.79 | 6.17 | 22.99 | 0.83 | 0.77 |
5 | tum_ibbm | 0.0318 | 0.79 | 5.90 | 22.33 | 0.83 | 0.76 |
6 | pitt-vamp-2 | 0.0385 | 0.79 | 5.74 | 19.69 | 0.81 | 0.76 |
7 | coroflo | 0.0493 | 0.79 | 5.46 | 22.53 | 0.76 | 0.77 |
8 | neuro.ml_2 | 0.0511 | 0.78 | 6.33 | 30.63 | 0.82 | 0.73 |
9 | cian | 0.0571 | 0.78 | 6.82 | 21.72 | 0.83 | 0.70 |
10 | bioengineering_espol_team | 0.0596 | 0.78 | 6.24 | 28.26 | 0.78 | 0.74 |
11 | nlp_logix | 0.0678 | 0.77 | 7.16 | 18.37 | 0.73 | 0.78 |
12 | pitt-vamp | 0.0753 | 0.79 | 8.85 | 19.53 | 0.76 | 0.73 |
13 | nih_cidi_2 | 0.0803 | 0.75 | 7.35 | 27.26 | 0.81 | 0.69 |
14 | bigrbrain_2 | 0.0847 | 0.77 | 9.46 | 28.04 | 0.78 | 0.71 |
15 | uned_stand_2021 | 0.0866 | 0.79 | 6.37 | 17.56 | 0.69 | 0.73 |
16 | fmrib-truenet_2 | 0.0891 | 0.77 | 6.95 | 20.49 | 0.74 | 0.70 |
17 | bigrbrain | 0.0910 | 0.78 | 6.75 | 23.24 | 0.70 | 0.73 |
18 | uned_2 | 0.0927 | 0.78 | 7.59 | 27.69 | 0.77 | 0.66 |
19 | nic-vicorob | 0.0927 | 0.77 | 8.28 | 28.54 | 0.75 | 0.71 |
20 | rasha_improved | 0.0964 | 0.77 | 7.42 | 24.97 | 0.76 | 0.67 |
21 | wta | 0.1012 | 0.78 | 6.78 | 16.20 | 0.66 | 0.73 |
22 | dice | 0.1046 | 0.77 | 7.63 | 19.77 | 0.69 | 0.71 |
23 | uned_enhanced_2021 | 0.1063 | 0.79 | 7.23 | 18.64 | 0.66 | 0.71 |
24 | fmrib-truenet | 0.1114 | 0.75 | 8.91 | 27.93 | 0.72 | 0.70 |
25 | wta_2 | 0.1132 | 0.76 | 8.21 | 21.31 | 0.67 | 0.72 |
26 | misp_2 | 0.1178 | 0.78 | 11.10 | 19.71 | 0.68 | 0.71 |
27 | acunet_2-2 | 0.1346 | 0.71 | 8.34 | 22.58 | 0.69 | 0.70 |
28 | uned_contrast | 0.1355 | 0.75 | 9.90 | 44.33 | 0.77 | 0.60 |
29 | k2 | 0.1531 | 0.77 | 9.79 | 19.08 | 0.59 | 0.70 |
30 | uned | 0.1552 | 0.73 | 11.04 | 55.84 | 0.81 | 0.54 |
31 | rasha_simple | 0.1562 | 0.74 | 9.05 | 29.73 | 0.65 | 0.64 |
32 | acunet_2-4 | 0.1642 | 0.69 | 10.30 | 26.14 | 0.63 | 0.71 |
33 | acunet_2-1 | 0.1696 | 0.69 | 8.96 | 28.99 | 0.65 | 0.65 |
34 | acunet | 0.1735 | 0.65 | 9.22 | 29.35 | 0.67 | 0.67 |
35 | lrde | 0.1797 | 0.73 | 14.54 | 21.71 | 0.63 | 0.67 |
36 | misp | 0.1821 | 0.72 | 14.88 | 21.36 | 0.63 | 0.68 |
37 | acunet_2-3 | 0.2033 | 0.66 | 10.31 | 28.40 | 0.58 | 0.67 |
38 | ipmi-bern | 0.2625 | 0.69 | 9.72 | 19.92 | 0.44 | 0.57 |
39 | nih_cidi | 0.2843 | 0.68 | 12.82 | 196.38 | 0.59 | 0.54 |
40 | arg | 0.2879 | 0.70 | 21.54 | 46.11 | 0.53 | 0.57 |
41 | scan | 0.2898 | 0.63 | 14.34 | 34.67 | 0.55 | 0.51 |
42 | tig_corr | 0.3039 | 0.68 | 17.48 | 39.07 | 0.54 | 0.46 |
43 | livia | 0.3044 | 0.61 | 22.70 | 38.39 | 0.54 | 0.61 |
44 | achilles | 0.3081 | 0.63 | 11.82 | 24.41 | 0.45 | 0.52 |
45 | caai_amh | 0.3090 | 0.62 | 15.03 | 28.54 | 0.45 | 0.57 |
46 | skkumedneuro | 0.3604 | 0.58 | 19.02 | 58.54 | 0.47 | 0.51 |
47 | tignet | 0.3909 | 0.59 | 21.58 | 86.22 | 0.46 | 0.45 |
48 | tig | 0.3955 | 0.60 | 17.86 | 34.34 | 0.38 | 0.42 |
49 | knight | 0.4239 | 0.70 | 17.03 | 39.99 | 0.25 | 0.35 |
50 | upc_dlmi | 0.4449 | 0.53 | 27.01 | 208.49 | 0.57 | 0.42 |
51 | himinn | 0.4500 | 0.62 | 24.49 | 44.19 | 0.33 | 0.36 |
52 | nist | 0.4830 | 0.53 | 15.91 | 109.98 | 0.37 | 0.25 |
53 | text_class | 0.5781 | 0.50 | 28.23 | 146.64 | 0.27 | 0.29 |
54 | neuro.ml | 0.6074 | 0.51 | 37.36 | 614.05 | 0.71 | 0.21 |
55 | hadi | 0.8940 | 0.23 | 52.02 | 828.61 | 0.58 | 0.11 |
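For reference, the overlap and volume metrics in the table can be computed from a pair of binary segmentation masks along these lines (a minimal NumPy sketch; the official evaluation additionally computes H95 as a 95th-percentile surface distance and derives recall and F1 lesion-wise via connected components, which this sketch does not cover):

```python
import numpy as np

def dsc(pred, truth):
    # Dice similarity coefficient: 2|A ∩ B| / (|A| + |B|)
    pred, truth = pred.astype(bool), truth.astype(bool)
    denom = pred.sum() + truth.sum()
    return 2.0 * np.logical_and(pred, truth).sum() / denom if denom else 1.0

def avd(pred, truth):
    # Absolute volume difference as a percentage of the reference volume
    t = truth.astype(bool).sum()
    return 100.0 * abs(pred.astype(bool).sum() - t) / t

# Toy 1-D "masks" standing in for 3-D segmentations
p = np.array([1, 1, 0, 0, 1])
t = np.array([1, 0, 0, 1, 1])
print(round(dsc(p, t), 2))  # 0.67
print(round(avd(p, t), 1))  # 0.0
```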
The results presented at MICCAI 2017 in Quebec City are available separately: MICCAI results.
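The idea behind a combined rank score can be sketched as follows: for each metric, map every team's value to a normalized rank in [0, 1] (0 = best), then average the five per-metric ranks. This is a hypothetical illustration using the table's values for pgs, sysu_media_2, and neuro.ml_2; the challenge's own normalization differs, so the numbers below do not reproduce the published scores:

```python
from statistics import mean

# metric -> (values for [pgs, sysu_media_2, neuro.ml_2], higher_is_better)
metrics = {
    "DSC":    ([0.81, 0.80, 0.78], True),
    "H95":    ([5.63, 5.76, 6.33], False),
    "AVD":    ([18.58, 28.73, 30.63], False),
    "Recall": ([0.82, 0.87, 0.82], True),
    "F1":     ([0.79, 0.76, 0.73], True),
}

def norm_ranks(values, higher_better):
    # Normalized rank in [0, 1]; ties share the better rank via index()
    order = sorted(values, reverse=higher_better)
    n = len(values) - 1
    return [order.index(v) / n for v in values]

scores = [mean(col) for col in zip(*(
    norm_ranks(vals, hb) for vals, hb in metrics.values()))]
print(scores)  # → [0.1, 0.4, 0.9]; lowest score ranks first
```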