Experimental results for the paper:
Fuzzy Multi-Instance Classifiers
S. Vluymans, D. Sánchez Tarragó, Y. Saeys, C. Cornelis, F. Herrera
IEEE Transactions on Fuzzy Systems 24(6), pp. 1395-1409, 2016
Corresponding author: sarah.vluymans@ugent.be
Below, we present the accuracy and Cohen's kappa values of all methods on all datasets. The datasets are grouped by application domain: bioinformatics, textual data applications, inductive logic programming and image categorization. We report the mean per group as well as the overall mean over all datasets. For each dataset, the row-wise highest value is printed in bold.
These two tables accompany Tables VIII and IX of the paper.
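As a note on the arithmetic behind the summary rows: the group means are plain unweighted means over the datasets in each domain, and the overall mean is taken over all 33 individual datasets (not over the four group means). A minimal sketch of this, using the BFMIC bioinformatics column copied from the first table (the function name `group_mean` is our own):

```python
# Per-dataset BFMIC accuracies, copied from the bioinformatics
# group of the accuracy table below.
bfmic_bio = {
    "Musk1": 81.522, "Musk2": 72.277, "Atoms": 73.936, "Bonds": 71.809,
    "Chains": 73.936, "AntDrugs5": 72.000, "AntDrugs10": 80.750,
    "AntDrugs20": 75.500,
}

def group_mean(values):
    """Unweighted mean over the datasets in one application domain."""
    values = list(values)
    return sum(values) / len(values)

mean_bio = round(group_mean(bfmic_bio.values()), 3)
print(mean_bio)  # 75.216, the 'Mean Bio-IT' entry for BFMIC
```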
**Accuracy (%)**

Dataset | BFMIC | IFMIC | MILES | MIWrapper | SimpleMI | MILR | BARTMIP | miSVM | CitationKNN |
---|---|---|---|---|---|---|---|---|---|
Musk1 | 81.522 | 82.609 | **88.043** | 86.957 | 73.913 | 85.870 | 84.783 | 78.261 | 86.957 |
Musk2 | 72.277 | 74.257 | 81.188 | 82.178 | 81.188 | 80.198 | **86.139** | 71.287 | 83.168 |
Atoms | 73.936 | 73.936 | **75.000** | 73.936 | 67.553 | 72.872 | 73.936 | 66.489 | 71.277 |
Bonds | 71.809 | 75.000 | **82.447** | 77.660 | 79.255 | 72.340 | 76.064 | 66.489 | 74.468 |
Chains | 73.936 | 73.404 | 82.447 | **85.106** | 78.723 | 73.404 | 80.319 | 66.489 | 72.872 |
AntDrugs5 | 72.000 | 71.750 | 72.250 | **77.250** | **77.250** | 74.750 | 76.250 | 72.000 | 72.500 |
AntDrugs10 | 80.750 | 80.750 | 80.500 | 80.750 | 78.500 | **82.250** | 81.250 | 75.500 | 78.250 |
AntDrugs20 | 75.500 | 75.500 | 77.000 | 79.000 | 76.500 | **80.250** | 78.750 | 69.750 | 69.250 |
Mean Bio-IT | 75.216 | 75.901 | 79.859 | 80.355 | 76.610 | 77.742 | 79.686 | 70.783 | 76.093 |
TREC1 | 90.750 | 92.500 | **94.500** | 90.750 | 94.250 | 84.750 | 79.000 | 88.250 | 59.250 |
TREC2 | 70.250 | 71.000 | 80.500 | 73.750 | **82.750** | 69.000 | 61.000 | 77.750 | 46.750 |
TREC3 | 77.750 | 81.000 | 75.500 | 84.750 | **88.500** | 79.500 | 63.750 | 76.750 | 49.750 |
TREC4 | 78.750 | 82.500 | 80.500 | 81.750 | **87.500** | 79.250 | 68.000 | 79.750 | 43.750 |
TREC7 | 76.000 | 75.000 | **78.500** | 74.000 | 76.250 | 72.000 | 60.500 | 72.750 | 44.500 |
TREC9 | **67.000** | 63.750 | 63.500 | 61.250 | 62.250 | 60.000 | 57.750 | 59.750 | 47.250 |
TREC10 | 77.750 | 75.250 | 77.500 | **80.500** | 75.500 | 72.250 | 64.500 | 72.750 | 47.000 |
WIR7 | **75.221** | **75.221** | 56.637 | **75.221** | 66.372 | **75.221** | 52.212 | 68.142 | 61.947 |
WIR8 | **77.876** | 72.566 | 55.752 | 71.681 | 69.027 | 71.681 | 53.097 | 58.407 | 66.372 |
WIR9 | 74.336 | 72.566 | 53.097 | **76.106** | 73.451 | 74.336 | 57.522 | 61.062 | 69.027 |
Mean Text | 76.568 | 76.135 | 71.599 | 76.976 | 77.585 | 73.799 | 61.733 | 71.536 | 53.560 |
EastWest | **80.000** | 70.000 | 75.000 | 50.000 | 65.000 | 65.000 | 70.000 | 50.000 | 50.000 |
WestEast | **80.000** | 75.000 | **80.000** | 50.000 | 55.000 | 65.000 | 55.000 | 35.000 | 50.000 |
Mean LogProg | 80.000 | 72.500 | 77.500 | 50.000 | 60.000 | 65.000 | 62.500 | 42.500 | 50.000 |
Elephant | **84.500** | **84.500** | 78.500 | **84.500** | 76.500 | 79.000 | 81.500 | 78.500 | 50.000 |
Fox | 62.500 | 59.500 | 58.500 | 56.000 | **63.500** | 57.000 | 59.500 | 50.000 | 50.000 |
Tiger | **84.000** | 82.500 | 77.500 | 81.000 | 74.000 | 77.500 | 80.000 | 79.000 | 50.000 |
Corel1vs2 | **92.500** | 91.000 | 91.000 | 91.000 | 84.500 | 83.000 | **92.500** | 84.500 | 88.500 |
Corel1vs3 | 83.500 | 83.500 | 84.500 | 83.500 | **87.500** | 86.500 | 86.000 | 82.000 | 83.000 |
Corel1vs4 | 96.500 | 97.500 | 96.500 | **99.000** | 89.000 | 88.500 | 98.500 | 92.500 | 98.000 |
Corel1vs5 | 98.500 | 97.000 | **99.000** | 92.500 | 98.500 | 90.000 | **99.000** | 85.000 | 98.000 |
Corel2vs3 | 89.500 | 88.000 | 84.000 | 82.000 | 74.000 | 82.000 | **90.500** | 69.500 | 85.000 |
Corel2vs4 | **97.000** | 86.500 | 94.000 | 91.000 | 84.500 | 89.500 | 96.000 | 94.000 | 91.500 |
Corel2vs5 | 98.000 | **100.000** | 99.000 | 91.500 | 96.000 | 95.500 | **100.000** | 99.000 | 99.500 |
Corel3vs4 | 94.000 | 88.000 | 94.000 | 91.500 | 90.500 | 89.500 | **98.000** | 58.000 | 89.500 |
Corel3vs5 | 98.500 | **100.000** | 99.000 | 95.000 | 98.000 | 84.500 | **100.000** | 97.500 | 98.500 |
Corel4vs5 | **100.000** | 98.500 | 99.000 | 93.500 | 98.000 | 96.500 | **100.000** | 95.500 | 99.500 |
Mean Image | 90.692 | 88.962 | 88.808 | 87.077 | 85.731 | 84.538 | 90.885 | 81.923 | 83.154 |
Mean | 82.013 | 80.911 | 80.738 | 80.139 | 79.492 | 78.452 | 77.616 | 73.686 | 70.465 |
**Cohen's kappa**

Dataset | BFMIC | IFMIC | MILES | MIWrapper | SimpleMI | MILR | BARTMIP | miSVM | CitationKNN |
---|---|---|---|---|---|---|---|---|---|
Musk1 | 0.627 | 0.650 | **0.760** | 0.739 | 0.478 | 0.717 | 0.695 | 0.564 | 0.739 |
Musk2 | 0.475 | 0.500 | 0.601 | 0.620 | 0.597 | 0.586 | **0.716** | 0.391 | 0.643 |
Atoms | 0.408 | **0.464** | 0.454 | 0.351 | 0.332 | 0.319 | 0.403 | 0.000 | 0.294 |
Bonds | 0.260 | 0.462 | **0.617** | 0.499 | 0.567 | 0.256 | 0.448 | 0.000 | 0.389 |
Chains | 0.334 | 0.292 | 0.614 | **0.671** | 0.511 | 0.341 | 0.531 | 0.000 | 0.374 |
AntDrugs5 | 0.437 | 0.432 | 0.444 | 0.543 | **0.545** | 0.493 | 0.524 | 0.437 | 0.449 |
AntDrugs10 | 0.602 | 0.602 | 0.605 | 0.603 | 0.564 | **0.635** | 0.615 | 0.499 | 0.556 |
AntDrugs20 | 0.493 | 0.493 | 0.535 | 0.572 | 0.528 | **0.599** | 0.568 | 0.381 | 0.375 |
Mean Bio-IT | 0.455 | 0.487 | 0.579 | 0.575 | 0.515 | 0.493 | 0.562 | 0.284 | 0.477 |
TREC1 | 0.815 | 0.850 | **0.890** | 0.815 | 0.885 | 0.695 | 0.580 | 0.765 | 0.185 |
TREC2 | 0.405 | 0.420 | 0.610 | 0.475 | **0.655** | 0.380 | 0.220 | 0.555 | -0.065 |
TREC3 | 0.555 | 0.620 | 0.510 | 0.695 | **0.770** | 0.590 | 0.275 | 0.535 | -0.005 |
TREC4 | 0.575 | 0.650 | 0.610 | 0.635 | **0.750** | 0.585 | 0.360 | 0.595 | -0.125 |
TREC7 | 0.520 | 0.500 | **0.570** | 0.480 | 0.525 | 0.440 | 0.210 | 0.455 | -0.110 |
TREC9 | **0.340** | 0.275 | 0.270 | 0.225 | 0.245 | 0.200 | 0.155 | 0.195 | -0.055 |
TREC10 | 0.555 | 0.505 | 0.550 | **0.610** | 0.510 | 0.445 | 0.290 | 0.455 | -0.060 |
WIR7 | 0.506 | **0.508** | 0.128 | 0.503 | 0.328 | 0.505 | 0.052 | 0.368 | 0.232 |
WIR8 | **0.559** | 0.455 | 0.114 | 0.434 | 0.381 | 0.435 | 0.059 | 0.180 | 0.326 |
WIR9 | 0.488 | 0.455 | 0.056 | **0.522** | 0.467 | 0.488 | 0.152 | 0.232 | 0.380 |
Mean Text | 0.532 | 0.524 | 0.431 | 0.539 | 0.552 | 0.476 | 0.235 | 0.434 | 0.070 |
EastWest | **0.600** | 0.400 | 0.500 | 0.000 | 0.300 | 0.300 | 0.400 | 0.000 | 0.000 |
WestEast | **0.600** | 0.500 | **0.600** | 0.000 | 0.100 | 0.300 | 0.100 | -0.300 | 0.000 |
Mean LogProg | 0.600 | 0.450 | 0.550 | 0.000 | 0.200 | 0.300 | 0.250 | -0.150 | 0.000 |
Elephant | **0.690** | **0.690** | 0.570 | **0.690** | 0.530 | 0.580 | 0.630 | 0.570 | 0.000 |
Fox | 0.250 | 0.190 | 0.170 | 0.120 | **0.270** | 0.140 | 0.190 | 0.000 | 0.000 |
Tiger | **0.680** | 0.650 | 0.550 | 0.620 | 0.480 | 0.550 | 0.600 | 0.580 | 0.000 |
Corel1vs2 | **0.850** | 0.820 | 0.820 | 0.820 | 0.690 | 0.660 | **0.850** | 0.690 | 0.770 |
Corel1vs3 | 0.670 | 0.670 | 0.690 | 0.670 | **0.750** | 0.730 | 0.720 | 0.640 | 0.660 |
Corel1vs4 | 0.930 | 0.950 | 0.930 | **0.980** | 0.780 | 0.770 | 0.970 | 0.850 | 0.960 |
Corel1vs5 | 0.970 | 0.940 | **0.980** | 0.850 | 0.970 | 0.800 | **0.980** | 0.700 | 0.960 |
Corel2vs3 | 0.790 | 0.760 | 0.680 | 0.640 | 0.480 | 0.640 | **0.810** | 0.390 | 0.700 |
Corel2vs4 | **0.940** | 0.730 | 0.880 | 0.820 | 0.690 | 0.790 | 0.920 | 0.880 | 0.830 |
Corel2vs5 | 0.960 | **1.000** | 0.980 | 0.830 | 0.920 | 0.910 | **1.000** | 0.980 | 0.990 |
Corel3vs4 | 0.880 | 0.760 | 0.880 | 0.830 | 0.810 | 0.790 | **0.960** | 0.160 | 0.790 |
Corel3vs5 | 0.970 | **1.000** | 0.980 | 0.900 | 0.960 | 0.690 | **1.000** | 0.950 | 0.970 |
Corel4vs5 | **1.000** | 0.970 | 0.980 | 0.870 | 0.960 | 0.930 | **1.000** | 0.910 | 0.990 |
Mean Image | 0.814 | 0.779 | 0.776 | 0.742 | 0.715 | 0.691 | 0.818 | 0.638 | 0.663 |
Mean | 0.628 | 0.611 | 0.610 | 0.595 | 0.586 | 0.554 | 0.545 | 0.443 | 0.398 |
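The kappa values above are Cohen's kappa, which corrects accuracy for the agreement expected by chance and is computed from each classifier's confusion matrix (the matrices themselves are not reproduced here). A minimal sketch for the two-class case; the confusion-matrix counts in the example are hypothetical, not taken from the experiments:

```python
def cohen_kappa(tp, fp, fn, tn):
    """Cohen's kappa for a binary confusion matrix.

    tp/fp/fn/tn: true/false positives and true/false negatives.
    """
    n = tp + fp + fn + tn
    p_observed = (tp + tn) / n  # plain accuracy
    # Chance agreement from the predicted and actual class marginals.
    p_chance = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical matrix: 85% accuracy with balanced marginals.
print(cohen_kappa(tp=40, fp=10, fn=5, tn=45))  # 0.7
```

This illustrates why a method can score 50.000 accuracy yet 0.000 kappa (as CitationKNN does on several image datasets): predicting a single class on a balanced two-class problem gives chance-level agreement.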