Table 3: Top-performing model for each feature set, along with the 3 other data-preprocessing results using the same modeling strategy (external data)

Feature Set               Processing      Algorithm   Feature Selection   mAUC    LogLoss   Brier Score   Best Model
CE_ET and F_PTR^a         None/none       SVM-P       ICC                 0.818   0.929     0.548         False
                          SD/none^b       SVM-P       ICC                 0.833   0.871     0.521         True
                          None/ComBat     SVM-P       ICC                 0.808   1.029     0.588         False
                          SD/ComBat       SVM-P       ICC                 0.817   0.949     0.564         False
CE_ET and T2_PTR^a        None/none       ENET        None                0.808   0.904     0.520         False
                          SD/none         ENET        None                0.817   0.867     0.499         False
                          None/ComBat^b   ENET        None                0.841   0.922     0.492         True
                          SD/ComBat       ENET        None                0.835   0.891     0.487         False
CE_ET, A_ET and F_PTR^a   None/none       SVM-P       ICC                 0.873   0.764     0.433         False
                          SD/none^b       SVM-P       ICC                 0.886   0.712     0.414         True
                          None/ComBat     SVM-P       ICC                 0.836   0.872     0.520         False
                          SD/ComBat       SVM-P       ICC                 0.873   0.749     0.444         False
CE_ET^a                   None/none       SVM-P       ICC                 0.819   0.881     0.499         False
                          SD/none^b       SVM-P       ICC                 0.859   0.789     0.472         True
                          None/ComBat     SVM-P       ICC                 0.821   0.962     0.531         False
                          SD/ComBat       SVM-P       ICC                 0.842   0.850     0.512         False
  • Note:—ENET indicates multinomial elastic net; SVM-P, support vector machine with polynomial kernel; LogLoss, logarithmic loss; A, ADC; F, FLAIR.

  • ^a Row indicates models using BIP only, but with otherwise the same modeling strategy.

  • ^b Row indicates the top-performing model for each feature set.
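
For reference, the three performance metrics reported above (mAUC, LogLoss, Brier score) can be computed from a model's predicted class probabilities. The sketch below is not the authors' code: it assumes mAUC denotes a macro-averaged one-vs-rest AUC, uses scikit-learn's log_loss for the multinomial logarithmic loss, and computes a multiclass Brier score as the squared difference between one-hot labels and predicted probability vectors, summed over classes and averaged over samples. The function name multiclass_metrics and the toy data are hypothetical.

```python
# Minimal sketch of the three reported metrics, assuming mAUC = macro-averaged
# one-vs-rest AUC and a sum-over-classes multiclass Brier score.
import numpy as np
from sklearn.metrics import log_loss, roc_auc_score
from sklearn.preprocessing import label_binarize


def multiclass_metrics(y_true, y_prob, classes):
    """Return (mAUC, LogLoss, Brier score) for multiclass predictions.

    y_true : (n_samples,) array-like of class labels
    y_prob : (n_samples, n_classes) predicted probabilities, columns ordered as `classes`
    classes: ordered list of class labels matching the columns of y_prob
    """
    y_prob = np.asarray(y_prob, dtype=float)
    # One-hot encode the true labels for the Brier score.
    y_onehot = label_binarize(y_true, classes=classes)

    # Macro-averaged one-vs-rest AUC (one common reading of "mAUC").
    m_auc = roc_auc_score(y_true, y_prob, multi_class="ovr",
                          average="macro", labels=classes)

    # Multinomial logarithmic loss (cross-entropy).
    ll = log_loss(y_true, y_prob, labels=classes)

    # Multiclass Brier score: squared error between one-hot labels and
    # predicted probabilities, summed over classes, averaged over samples.
    brier = np.mean(np.sum((y_onehot - y_prob) ** 2, axis=1))

    return m_auc, ll, brier


if __name__ == "__main__":
    # Hypothetical 3-class example, not data from the study.
    classes = ["A", "B", "C"]
    y_true = ["A", "B", "C", "B"]
    y_prob = np.array([
        [0.7, 0.2, 0.1],
        [0.2, 0.6, 0.2],
        [0.1, 0.3, 0.6],
        [0.3, 0.5, 0.2],
    ])
    m_auc, ll, brier = multiclass_metrics(y_true, y_prob, classes)
    print(f"mAUC={m_auc:.3f}  LogLoss={ll:.3f}  Brier={brier:.3f}")
```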