Results of the ImageCLEF 2006 - Medical Automatic Annotation Task

Here are the results of the medical automatic annotation task of ImageCLEF 2006.

In 2006, 27 groups registered for the medical automatic annotation task, and 12 of these submitted a total of 27 runs (at most 5 submissions per group).

The task description can be found on the task website.

The following table gives the error rate for each submitted run, i.e. the number of misclassified images divided by the number of classified images, which equals 1 - (classification accuracy). Error rates are given in percent.
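The error-rate computation described above can be sketched as follows; the label lists here are hypothetical examples for illustration, not actual task data.

```python
def error_rate(predicted, truth):
    """Fraction of misclassified images, i.e. 1 - accuracy."""
    assert len(predicted) == len(truth), "one prediction per image"
    errors = sum(p != t for p, t in zip(predicted, truth))
    return errors / len(truth)

# Hypothetical example: 2 of 8 test images misclassified.
predicted = [3, 7, 7, 12, 5, 9, 1, 4]
truth     = [3, 7, 2, 12, 5, 9, 1, 8]
print(f"{error_rate(predicted, truth) * 100:.1f}%")  # 25.0%
```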

Group ID       Runtag                         Error rate (%)
RWTHi6         SHME                           16.2
UFR            UFR-ns-1000-20x20x10           16.7
RWTHi6         SHSVM                          16.7
MedIC_CISMeF   local+global_PCA335            17.2
MedIC_CISMeF   local_PCA333                   17.2
MSRA           WSM-msra_wsm_gray              17.6
MedIC_CISMeF   local+global_PCA450            17.9
UFR            UFR-ns-800-20x20x10            17.9
MSRA           WSM-msra_wsm_patch             18.2
MedIC_CISMeF   local_PCA150                   20.2
RWTHi6         IDM                            20.4
rwth_mi        opt                            21.5
rwth_mi        baseline                       21.7
cindi          cindi-svm-sum                  24.1
cindi          cindi-svm-product              24.8
cindi          cindi-svm-ehd                  25.5
cindi          cindi-fusion-KNN9              25.6
cindi          cindi-svm-max                  26.1
OHSU           OHSU_iconGLCM2_tr              26.3
OHSU           OHSU_iconGLCM2_tr_de           26.4
NCTU           dblab-nctu-dblab2              26.7
MU             I2R_refine_SVM                 28.0
OHSU           OHSU_iconHistGLCM2_t           28.1
ULG            SYSMOD-RANDOM-SUBWINDOWS-EX    29.0
DEU            DEU-3NN-EDGE                   29.5
OHSU           OHSU_iconHist_tr_dev           30.8
UTD            UTD                            31.7
ULG            SYSMOD-RANDOM-SUBWINDOWS-24    34.1
If you want your group ID, your runtag, or the link to your site to be changed, please contact me with the necessary details.

Further analysis and details of the results will be made available soon. For now, we provide the correct classifications for the test data.


Thomas Deselaers
Last modified: Fri Jun 30 11:17:57 CEST 2006
