Results of the ImageCLEF 2006 - Object Annotation Task

formerly known as the non-medical automatic annotation task

Here are the results of the object annotation task of ImageCLEF 2006.

In 2006, 20 groups registered for the object annotation task, and 3 of these submitted a total of 8 runs (the maximum number of submissions per group was 4).

The task description can be found on the task website.

The following table gives the error rates for the submitted runs, where error rate = number of misclassified images / number of classified images, i.e. 1 - (classification accuracy).
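As a minimal sketch, the error-rate computation described above can be written as follows (the example labels are hypothetical and not taken from the actual test data):

```python
def error_rate(predicted, truth):
    """Error rate = misclassified / classified = 1 - accuracy."""
    misclassified = sum(p != t for p, t in zip(predicted, truth))
    return misclassified / len(truth)

# Hypothetical labels for illustration only:
print(error_rate([1, 2, 2, 3], [1, 2, 3, 3]))  # → 0.25 (one of four images misclassified)
```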

Group ID   Runtag                   Error rate
RWTHi6     SHME                     0.773
RWTHi6     PatchHisto               0.802
cindi      Cindi_SVM_Product        0.832
cindi      Cindi_SVM_EHD            0.850
cindi      Cindi_SVM_SUM            0.852
cindi      Cindi_Fusion_knn         0.871
DEU CS     edgehistogr_centroid     0.882
DEU CS     colorlayout_centroid     0.932
If you would like your group ID, your runtag, or the link to your site to be changed, please contact me with the necessary details.

Further analysis and details of the results will be made available soon.

The groundtruth file for the test data is available here.


Thomas Deselaers
Last modified: Mon Jul 10 12:26:19 CEST 2006
