A Bias-Variance Analysis of Bootstrapped Class-Separability Weighting for Error-Correcting Output Code Ensembles
Conference proceeding

R S Smith and T Windeatt
2010 20th International Conference on Pattern Recognition, pp.61-64
08/2010

Abstract

We investigate the effects, in terms of a bias-variance decomposition of error, of applying class-separability weighting plus bootstrapping in the construction of error-correcting output code ensembles of binary classifiers. Evidence is presented to show that bias tends to be reduced at low training strength values whilst variance tends to be reduced across the full range. The relative importance of these effects, however, varies depending on the stability of the base classifier type.

Keywords: artificial neural networks; bias/variance; bootstrapping; decoding; ECOC; encoding; kernel; polynomials; support vector machines; training; weighting
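To make the construction concrete, the following Python sketch (not taken from the paper) builds an error-correcting output code ensemble in which each binary base learner is trained on its own bootstrap replicate, then estimates bias and variance of the 0/1 test error over repeated training runs. The random dense code matrix, Hamming decoding, decision-tree base learners, the Domingos-style decomposition and the Iris data are all assumptions made for illustration; the class-separability weighting studied in the paper is deliberately omitted.

# A minimal sketch, NOT the paper's implementation: a bootstrapped ECOC
# ensemble plus a Domingos-style bias/variance estimate of the 0/1 error.
# Assumptions: random dense {-1,+1} code matrix, Hamming decoding, decision
# trees as base learners, Iris data; class-separability weighting is omitted.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

def random_code_matrix(n_classes, n_bits):
    """Random {-1,+1} code matrix; redraw any column that is constant."""
    M = rng.choice([-1, 1], size=(n_classes, n_bits))
    for j in range(n_bits):
        while len(np.unique(M[:, j])) < 2:
            M[:, j] = rng.choice([-1, 1], size=n_classes)
    return M

def fit_bootstrapped_ecoc(X, y, M):
    """One binary learner per code column, each on its own bootstrap sample."""
    learners = []
    for j in range(M.shape[1]):
        idx = rng.integers(0, len(X), size=len(X))   # bootstrap indices
        y_bits = M[y[idx], j]                        # relabel classes to +/-1
        learners.append(DecisionTreeClassifier(max_depth=3).fit(X[idx], y_bits))
    return learners

def predict_ecoc(learners, M, X):
    """Hamming decoding: choose the class whose codeword is nearest."""
    bits = np.column_stack([clf.predict(X) for clf in learners])
    hamming = (bits[:, None, :] != M[None, :, :]).sum(axis=2)
    return hamming.argmin(axis=1)

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)
M = random_code_matrix(n_classes=3, n_bits=10)

# Repeat training to estimate bias (error of the modal prediction) and
# variance (mean disagreement with the modal prediction) on the test set.
preds = np.array([predict_ecoc(fit_bootstrapped_ecoc(X_tr, y_tr, M), M, X_te)
                  for _ in range(30)])
modal = np.array([np.bincount(col).argmax() for col in preds.T])
bias = np.mean(modal != y_te)
variance = np.mean(preds != modal[None, :])
print(f"bias ~ {bias:.3f}, variance ~ {variance:.3f}")

Swapping the decision tree for a more stable base learner (for example a linear support vector machine) and re-running the estimate is one way to see, in miniature, the abstract's point that the balance between bias reduction and variance reduction depends on base-classifier stability.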
