Yanou Ramon won this year’s ‘Best Paper Award’ at the Doctoral Day, during which PhD students at the Faculty of Business and Economics present their work and attend workshops.
Virtual Doctoral Day
The Doctoral Day gives PhD students the chance to present their work, to network and to exchange ideas with fellow students and faculty members. This year, due to the coronavirus pandemic, the faculty and the Antwerp Management School organised the first ever online edition. In total, 43 doctoral students gave a presentation and 141 people registered to attend the event.
Best Paper Award goes to…
Each year, the Award Committee grants the ‘Best Paper Award’. This year, the award went to Yanou’s paper ‘Increasing Global Understanding of Prediction Models on Behavioral and Textual Data using Metafeatures-based Explanation Rules’. The jury found the paper highly relevant from both an academic and a practitioner’s perspective.
“The paper further enhanced insights into prediction models based on behavioral data (such as Facebook, YouTube, …) through artificial intelligence, thereby also dealing with privacy and ethical issues. It demonstrated strong contributions and an excellent embedding of the research questions in the relevant theoretical perspectives.”
Prediction Models on Behavioral and Textual Data
In her paper, Yanou explains that machine learning models built on behavioral and textual data can be highly accurate, but are often very difficult to interpret. Rule-extraction techniques have been proposed to combine the desired predictive accuracy of complex ‘black-box’ models with explainability.
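To illustrate the general idea of rule extraction (not the paper’s specific method), the sketch below trains a black-box model and then fits a shallow, interpretable decision tree on the black box’s own predictions; the tree’s paths read as if-then rules. The synthetic data and model choices are illustrative assumptions.

```python
# Hedged sketch of generic rule extraction via a surrogate model.
# A shallow decision tree is trained to mimic the predictions of a
# black-box classifier; its paths serve as human-readable rules.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier, export_text

# Illustrative synthetic data (not the paper's behavioral/textual data)
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

black_box = RandomForestClassifier(random_state=0).fit(X, y)
y_bb = black_box.predict(X)  # labels produced by the black box

# Surrogate: depth-limited tree trained on the black box's outputs
surrogate = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y_bb)

# Fidelity: fraction of instances where the rules agree with the black box
fidelity = (surrogate.predict(X) == y_bb).mean()
rules = export_text(surrogate)
print(rules)
print(f"fidelity = {fidelity:.2f}")
```

The surrogate sacrifices some of the black box’s flexibility in exchange for a handful of readable rules, which is exactly the accuracy-explainability trade-off the paper addresses.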
“Because of the high dimensionality and sparsity, combined with many features being relevant for the prediction task, rule-extraction techniques might fail in their primary explainability goal, as the model may need to be replaced by many rules, leaving the user again with an incomprehensible explanation.”
“To address this problem, we develop and test a rule-extraction methodology based on higher-level, less-sparse ‘metafeatures’. We empirically validate the quality of the global explanation rules in terms of fidelity, stability, and accuracy over a collection of data sets, and benchmark their performance against rules extracted using the fine-grained behavioral and textual features. A key finding is that metafeatures-based explanations are better at mimicking the black-box model, as measured by the fidelity of explanations.”
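The metafeature idea described above can be sketched as follows: aggregate many sparse, fine-grained features into a few higher-level, denser ones, and extract rules in that reduced space. The block-sum grouping below is a purely illustrative assumption; the paper derives its metafeatures from behavioral and textual data.

```python
# Hedged sketch: rule extraction over aggregated "metafeatures".
# The grouping (summing fixed blocks of raw features) is illustrative
# only; it stands in for the paper's domain-derived metafeatures.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
# Sparse binary matrix mimicking fine-grained behavioral features
X = (rng.random((400, 1000)) < 0.02).astype(float)
y = (X[:, :100].sum(axis=1) > 2).astype(int)

black_box = RandomForestClassifier(random_state=0).fit(X, y)
y_bb = black_box.predict(X)

# Metafeatures: collapse each block of 100 raw features into one count,
# turning 1000 sparse columns into 10 dense ones
meta = X.reshape(X.shape[0], 10, 100).sum(axis=2)

# Extract rules in the low-dimensional metafeature space
surrogate = DecisionTreeClassifier(max_depth=3, random_state=0).fit(meta, y_bb)
fidelity = (surrogate.predict(meta) == y_bb).mean()
print(f"fidelity = {fidelity:.2f}")
```

A surrogate over ten dense metafeatures needs far fewer rules than one over a thousand sparse features, which is the intuition behind the paper’s finding that metafeatures-based explanations mimic the black box with higher fidelity.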
Yanou is part of the Applied Data Mining research group at the University of Antwerp (Engineering Department).