Here are the results of the object annotation task of ImageCLEF 2006.
In 2006, 20 groups registered for the object annotation task, and 3 of these submitted a total of 8 runs (the maximum number of submissions per group was 4).
The task description can be found on the task website.
The following table gives the error rate for each submitted run, i.e. the number of misclassified images divided by the number of classified images (equivalently, 1 − classification accuracy).
Group ID | Runtag | Error rate |
---|---|---|
RWTHi6 | SHME | 0.773 |
RWTHi6 | PatchHisto | 0.802 |
cindi | Cindi_SVM_Product | 0.832 |
cindi | Cindi_SVM_EHD | 0.85 |
cindi | Cindi_SVM_SUM | 0.852 |
cindi | Cindi_Fusion_knn | 0.871 |
DEU CS | edgehistogr_centroid | 0.882 |
DEU CS | colorlayout_centroid | 0.932 |
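The error-rate definition used above can be sketched in a few lines of Python. The labels and predictions below are hypothetical, purely for illustration; they are not taken from any submitted run.

```python
# Hypothetical example of the error-rate computation:
# error rate = misclassified images / classified images = 1 - accuracy.
predictions = ["car", "dog", "cat", "dog", "bird"]
groundtruth = ["car", "cat", "cat", "dog", "cat"]

# Count pairs where the predicted label differs from the groundtruth label.
misclassified = sum(p != g for p, g in zip(predictions, groundtruth))
error_rate = misclassified / len(predictions)
print(error_rate)  # 2 of 5 wrong -> 0.4
```

Note that this measure counts only classified images; an image left unclassified by a run would not appear in the denominator.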
Further analysis and details of the results will be made available soon.
The groundtruth file for the test data is available here.