ICPR 2020 CHART-Infographics
Competition on Harvesting Raw Tables from Infographics


Leaderboards (Updated: May 1, 2022)


Overall Ranking of Teams

Team | Points per Task on UB PMC (Tasks 1-7) | Points per Task on Adobe Synth (Tasks 1-7) | Score
DeepBlueAI | 3 2 2 4 2 3 0 | 4 2 4 4 2 3 0 | 35
Magic | 2 3 3 0 4 0 0 | 3 3 4 0 4 0 0 | 26
Lenovo-SCUT-Intsig | 4 4 4 0 0 0 0 | 4 4 4 0 0 0 0 | 24
SCUT-IntSig-Lenovo | 0 0 0 3 0 0 4 | 0 0 0 4 0 0 4 | 15
IntSig-SCUT-Lenovo | 0 0 0 0 3 4 0 | 0 0 0 0 3 4 0 | 14
IPSA | 1 1 0 0 0 0 0 | 3 1 0 0 0 0 0 | 6
PY | 0 1 1 2 0 0 0 | 0 0 0 0 0 0 0 | 4
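
Reading note: the Score column appears to be the simple sum of a team's per-task points across both test sets; this is inferred from the numbers above rather than quoted from the competition rules. A minimal sketch of that tally for the top two teams:

# Inferred tally: a team's overall Score = sum of its per-task points on both
# test sets. Point values below are copied from the table above (Tasks 1-7).
points = {
    "DeepBlueAI": ([3, 2, 2, 4, 2, 3, 0], [4, 2, 4, 4, 2, 3, 0]),
    "Magic": ([2, 3, 3, 0, 4, 0, 0], [3, 3, 4, 0, 4, 0, 0]),
}
for team, (ub_pmc, adobe_synth) in points.items():
    print(team, sum(ub_pmc) + sum(adobe_synth))  # DeepBlueAI 35, Magic 26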

Ranking for Task 1 on the UB PMC Testing Dataset

Rank | Team Name | Average Per-Class F-Measure
1 | Lenovo-SCUT-Intsig | 94.92%
2 | DeepBlueAI | 94.12%
3 | Magic | 93.65%
4 | IPSA | 90.95%

Ranking for Task 2 on the UB PMC Testing Dataset

Rank | Team Name | Average Detection IOU | Average Recognition OCR | Average F-Measure
1 | Lenovo-SCUT-Intsig | 92.07% | 94.85% | 93.44%
2 | Magic | 89.54% | 90.87% | 90.20%
3 | DeepBlueAI | 91.33% | 72.06% | 80.56%
4 | PY | 83.86% | 74.85% | 79.10%
5 | IPSA | 33.51% | 38.88% | 36.00%
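
The Average F-Measure column is consistent with the harmonic mean of the detection and recognition columns (the same relationship holds for the Adobe Synthetic Task 2 table further down). A minimal sketch of that combination, using the first-place row as input; the formula is inferred from the reported numbers, not quoted from the task description:

# Harmonic mean of the two component scores (values in percent), using the
# Lenovo-SCUT-Intsig row above as input. The formula itself is an inference.
detection_iou = 92.07
recognition_ocr = 94.85
f_measure = 2 * detection_iou * recognition_ocr / (detection_iou + recognition_ocr)
print(f"{f_measure:.2f}")  # 93.44, matching the Average F-Measure column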

Ranking for Task 3 on the UB PMC Testing Dataset

Rank | Team Name | Average Per-Class F-Measure
1 | Lenovo-SCUT-Intsig | 85.85%
2 | Magic | 81.71%
3 | DeepBlueAI | 77.19%
4 | PY | 65.38%

Ranking for Task 4 on the UB PMC Testing Dataset

Rank | Team Name | Average Recall | Average Precision | Average F-Measure
1 | DeepBlueAI | 96.04% | 95.35% | 95.69%
2 | SCUT-IntSig-Lenovo | 93.12% | 94.60% | 93.85%
3 | PY | 71.63% | 68.37% | 69.97%

* Updated on January 19, 2021, after a bug in the evaluation tool was fixed. The scores have been updated, but the ranking remained the same.
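
As with Task 2, the Average F-Measure here is consistent with the standard formula 2 * Precision * Recall / (Precision + Recall): for the DeepBlueAI row, 2 * 95.35 * 96.04 / (95.35 + 96.04) is approximately 95.69%.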

Ranking for Task 5 on the UB PMC Testing Dataset

Rank | Team Name | Average BBox Recall | Average BBox IOU
1 | Magic | 92.00% | 86.00%
2 | IntSig-SCUT-Lenovo | 93.19% | 84.92%
3 | DeepBlueAI | 86.43% | 81.78%

Ranking for Task 6 on the UB PMC Testing Dataset

Rank | Team Name | Average Visual Element Detection Score | Average Name Score | Average Data Score | Average Metric Score
1 | IntSig-SCUT-Lenovo | 88.23% | 90.42% | 76.73% | 80.15%
2 | DeepBlueAI | 87.00% | 78.54% | 55.40% | 61.18%

Ranking for Task 7 on the UB PMC Testing Dataset

Rank | Team Name | Average Visual Element Detection Score | Average Name Score | Average Data Score | Average Metric Score
1 | SCUT-IntSig-Lenovo | 85.61% | 82.08% | 69.06% | 72.32%

Ranking for Task 1 on the Adobe Synthetic Testing Dataset

Rank | Team Name | Average Per-Class F-Measure
1 | Lenovo-SCUT-Intsig | 100.00%
1 | DeepBlueAI | 100.00%
2 | Magic | 99.44%
2 | IPSA | 99.16%

Ranking for Task 2 on the Adobe Synthetic Testing Dataset

Rank | Team Name | Average Detection IOU | Average Recognition OCR | Average F-Measure
1 | Lenovo-SCUT-Intsig | 94.29% | 97.34% | 95.80%
2 | Magic | 92.88% | 92.25% | 92.56%
3 | DeepBlueAI | 44.10% | 70.19% | 54.16%
4 | IPSA | 13.48% | 20.93% | 16.40%

Ranking for Task 3 on the Adobe Synthetic Testing Dataset

Rank | Team Name | Average Per-Class F-Measure
1 | Lenovo-SCUT-Intsig | 100.00%
1 | Magic | 99.93%
1 | DeepBlueAI | 99.92%

Ranking for Task 4 on the Adobe Synthetic Testing Dataset

Rank | Team Name | Average F-Measure
1 | DeepBlueAI | 99.90%
1 | SCUT-IntSig-Lenovo | 99.80%

Ranking for Task 5 on the Adobe Synthetic Testing Dataset

Rank | Team Name | Average BBox Recall | Average BBox IOU
1 | Magic | 99.30% | 98.99%
2 | IntSig-SCUT-Lenovo | 99.67% | 94.98%
3 | DeepBlueAI | 92.82% | 91.88%

Ranking for Task 6 on the Adobe Synthetic Testing Dataset

Rank | Team Name | Average Visual Element Detection Score | Average Name Score | Average Data Score | Average Metric Score
1 | IntSig-SCUT-Lenovo | 95.37% | 99.86% | 96.20% | 97.12%
2 | DeepBlueAI | 90.67% | 94.84% | 70.30% | 76.43%

Ranking for Task 7 on the Adobe Synthetic Testing Dataset

Rank | Team Name | Average Name Score | Average Data Score | Average Metric Score
1 | SCUT-IntSig-Lenovo | 97.00% | 93.19% | 94.14%