Ensemble Learning with Highly Variable Class-Based Performance

This paper proposes a novel model-agnostic method for weighting the outputs of base classifiers in machine learning (ML) ensembles. Our approach uses class-based weight coefficients assigned to every output class of each learner in the ensemble. This is particularly useful when the base classifiers have highly variable performance across classes. Our method generates a dense set of coefficients for the models in our ensemble by considering each model's performance on each class.
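To make the idea concrete, here is a minimal sketch of class-based weighting. It assumes per-class recall on a held-out validation set is used as the class-based score and that each base model exposes predicted class probabilities; the function names, scoring rule, and per-class normalization are illustrative choices and may differ from the exact procedure in the paper.

```python
import numpy as np

def class_based_weights(probas_val, y_val, n_classes):
    """Build a dense (n_models x n_classes) weight matrix from per-class
    validation performance. Per-class recall is used as the score here."""
    weights = np.zeros((len(probas_val), n_classes))
    for m, proba in enumerate(probas_val):          # proba: (n_val, n_classes)
        preds = proba.argmax(axis=1)
        for c in range(n_classes):
            mask = (y_val == c)
            # Recall of model m on class c (0 if the class is absent).
            weights[m, c] = (preds[mask] == c).mean() if mask.any() else 0.0
    # Normalize so the weights for each class sum to 1 across models.
    col_sums = weights.sum(axis=0, keepdims=True)
    return weights / np.where(col_sums == 0, 1.0, col_sums)

def weighted_ensemble_predict(probas_test, weights):
    """Combine per-model class probabilities with class-based weights."""
    combined = sum(w[np.newaxis, :] * p for w, p in zip(weights, probas_test))
    return combined.argmax(axis=1)
```

Each model thus contributes more strongly to the classes it predicts well, rather than receiving a single scalar weight for all classes.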

We compare our novel method to commonly used ensemble approaches such as voting and weighted averaging. In addition, we compare our approach to class-specific soft voting (CSSV), which was also designed to address variable class performance but generates a sparse set of weights by solving a linear system. We illustrate the power of this approach by applying it to an ensemble of extreme learning machines (ELMs), which are well suited to it due to their stochastic, highly variable performance across classes. We demonstrate the superiority of our approach by comparing its performance to that of simple majority voting, weighted majority voting, and class-specific soft voting on ten popular open-source multiclass classification datasets.
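For reference, the two simplest baseline combiners can be sketched as follows, assuming each base learner provides hard class predictions and, for the weighted variant, a single scalar weight per model (e.g. overall validation accuracy). CSSV is omitted here because its weights come from solving a linear system.

```python
import numpy as np

def majority_vote(preds, n_classes):
    """Simple majority voting: each model casts one unweighted vote."""
    preds = np.asarray(preds)                       # (n_models, n_samples)
    counts = np.apply_along_axis(
        lambda col: np.bincount(col, minlength=n_classes), 0, preds)
    return counts.argmax(axis=0)                    # (n_samples,)

def weighted_majority_vote(preds, model_weights, n_classes):
    """Weighted majority voting: one scalar weight per model, applied
    uniformly to all of that model's votes regardless of class."""
    preds = np.asarray(preds)
    votes = np.zeros((n_classes, preds.shape[1]))
    for w, p in zip(model_weights, preds):
        votes[p, np.arange(preds.shape[1])] += w
    return votes.argmax(axis=0)
```

The contrast with the class-based scheme above is that both baselines weight a model identically on every class, which is exactly what breaks down when per-class performance varies widely.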
