
Dr Terry Windeatt


Publications

Windeatt T, Dias K (2008) Ensemble Approaches to Facial Action Unit Classification, PROGRESS IN PATTERN RECOGNITION, IMAGE ANALYSIS AND APPLICATIONS, PROCEEDINGS 5197 pp. 551-559 SPRINGER-VERLAG BERLIN
Facial action unit (AU) classification is an approach to face expression recognition that decouples the recognition of expression from individual actions. In this paper, upper face AUs are classified using an ensemble of MLP (multi-layer perceptron) base classifiers with feature ranking based on PCA components. This approach is compared experimentally with other popular feature-ranking methods applied to Gabor features. Experimental results on the Cohn-Kanade database demonstrate that the MLP ensemble is relatively insensitive to the feature-ranking method, but optimized PCA features achieve the lowest error rate. When posed as a multi-class problem using Error-Correcting-Output-Coding (ECOC), error rates are comparable to two-class problems (one-versus-rest) when the number of features and the base classifier are optimized.
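As a rough illustration of the ensemble set-up described in this abstract, the sketch below trains an ECOC classifier with MLP base learners and compares it against one-versus-rest. It is a minimal example assuming scikit-learn; the dataset, network size and code-matrix size are illustrative choices, not those of the paper.

    # Minimal sketch: ECOC versus one-versus-rest with MLP base classifiers.
    # Dataset and parameters are illustrative only, not those used in the paper.
    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split
    from sklearn.multiclass import OneVsRestClassifier, OutputCodeClassifier
    from sklearn.neural_network import MLPClassifier

    X, y = load_digits(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    base = MLPClassifier(hidden_layer_sizes=(16,), max_iter=300, random_state=0)

    # Random code matrix with code_size * n_classes columns, one dichotomy per column.
    ecoc = OutputCodeClassifier(base, code_size=2.0, random_state=0).fit(X_tr, y_tr)
    ovr = OneVsRestClassifier(base).fit(X_tr, y_tr)

    print("ECOC accuracy:        %.3f" % ecoc.score(X_te, y_te))
    print("One-vs-rest accuracy: %.3f" % ovr.score(X_te, y_te))
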
Windeatt T (2008) Ensemble MLP classifier design, Studies in Computational Intelligence 137 pp. 133-147
Duangsoithong R, Windeatt T (2010) Correlation-Based and Causal Feature Selection Analysis for Ensemble Classifiers, ARTIFICIAL NEURAL NETWORKS IN PATTERN RECOGNITION, PROCEEDINGS 5998 pp. 25-36 SPRINGER-VERLAG BERLIN
Duangsoithong R, Windeatt T (2009) Relevant and Redundant Feature Analysis with Ensemble Classification, ICAPR 2009: SEVENTH INTERNATIONAL CONFERENCE ON ADVANCES IN PATTERN RECOGNITION, PROCEEDINGS pp. 247-250 IEEE COMPUTER SOC
Smith RS, Windeatt T (2009) The Bias Variance Trade-Off in Bootstrapped Error Correcting Output Code Ensembles, MULTIPLE CLASSIFIER SYSTEMS, PROCEEDINGS 5519 pp. 1-10 SPRINGER-VERLAG BERLIN
By performing experiments on publicly available multi-class datasets we examine the effect of bootstrapping on the bias/variance behaviour of error-correcting output code ensembles. We present evidence to show that the general trend is for bootstrapping to reduce variance but to slightly increase bias error. This generally leads to an improvement in the lowest attainable ensemble error; however, this is not always the case, and bootstrapping appears to be most useful on datasets where the non-bootstrapped ensemble classifier is prone to overfitting.
Duangsoithong R, Phukpattaranont P, Windeatt T (2013) Bootstrap Causal Feature Selection for irrelevant feature elimination, BMEiCON 2013 - 6th Biomedical Engineering International Conference
Irrelevant features may lead to degradation in the accuracy and efficiency of classifier performance. In this paper, a Bootstrap Causal Feature Selection (BCFS) algorithm is proposed. BCFS uses bootstrapping with a causal discovery algorithm to remove irrelevant features. The results are evaluated by the number of selected features and classification accuracy. According to the experimental results, BCFS is able to remove irrelevant features and provides slightly higher average accuracy than using the original features and causal feature selection. Moreover, BCFS also reduces complexity in causal graphs, which provides more comprehensibility for the causal discovery system.
Smith RS, Windeatt T (2010) Class-Separability Weighting and Bootstrapping in Error Correcting Output Code Ensembles, MULTIPLE CLASSIFIER SYSTEMS, PROCEEDINGS 5997 pp. 185-194 SPRINGER-VERLAG BERLIN
Windeatt T (2007) Ensemble neural classifier design for face recognition., ESANN pp. 373-378
Prior M, Windeatt T (2009) Improved Uniformity Enforcement in Stochastic Discrimination, MULTIPLE CLASSIFIER SYSTEMS, PROCEEDINGS 5519 pp. 335-343 SPRINGER-VERLAG BERLIN
There are a variety of methods for inducing predictive systems from
observed data. Many of these methods fall into the field of study of
machine learning. Some of the most effective algorithms in this domain
succeed by combining a number of distinct predictive elements to form
what can be described as a type of committee. Well known examples of
such algorithms are AdaBoost, bagging and random forests. Stochastic
discrimination is a committee-forming algorithm that attempts to combine
a large number of relatively simple predictive elements in an effort to
achieve a high degree of accuracy. A key element of the success of this
technique is that its coverage of the observed feature space should be
uniform in nature. We introduce a new uniformity enforcement method,
which on benchmark datasets, leads to greater predictive efficiency than
the currently published method.
Özöğür-Akyüz S, Windeatt T, Smith R (2015) Pruning of Error Correcting Output Codes by optimization of accuracy-diversity trade off, Machine Learning
Ensemble learning is a method of combining learners to obtain more reliable and accurate predictions in supervised and unsupervised learning. However, ensemble sizes are sometimes unnecessarily large, which leads to additional memory usage, computational overhead and decreased effectiveness. To overcome such side effects, pruning algorithms have been developed; since this is a combinatorial problem, finding the exact subset of ensembles is computationally infeasible. Different types of heuristic algorithms have been developed to obtain an approximate solution, but they lack a theoretical guarantee. Error Correcting Output Coding (ECOC) is one of the well-known ensemble techniques for multiclass classification, which combines the outputs of binary base learners to predict the classes for multiclass data. In this paper, we propose a novel approach for pruning the ECOC matrix by utilizing accuracy and diversity information simultaneously. All existing pruning methods need the size of the ensemble as a parameter, so the performance of the pruning methods depends on the size of the ensemble. Our unparametrized pruning method is novel in being independent of the size of the ensemble. Experimental results show that our pruning method is mostly better than other existing approaches.
Smith RS, Windeatt T (2010) Facial Action Unit Recognition using Filtered Local Binary Pattern Features with Bootstrapped and Weighted ECOC Classifiers,
Within the context of face expression classification using the facial action coding system (FACS), we address the problem of detecting facial action units (AUs). The method adopted is to train a single error-correcting output code (ECOC) multiclass classifier to estimate the probabilities that each one of several commonly occurring AU groups is present in the probe image. Platt scaling is used to calibrate the ECOC outputs to probabilities and appropriate sums of these probabilities are taken to obtain a separate probability for each AU individually. Feature extraction is performed by generating a large number of local binary pattern (LBP) features and then selecting from these using fast correlation-based filtering (FCBF). The bias and variance properties of the classifier are measured and we show that both these sources of error can be reduced by enhancing ECOC through the application of bootstrapping and class-separability weighting.
Windeatt T (2006) Accuracy/diversity and ensemble MLP classifier design., IEEE Trans Neural Netw 17 (5) pp. 1194-1211
The difficulties of tuning parameters of multilayer perceptrons (MLP) classifiers are well known. In this paper, a measure is described that is capable of predicting the number of classifier training epochs for achieving optimal performance in an ensemble of MLP classifiers. The measure is computed between pairs of patterns on the training data and is based on a spectral representation of a Boolean function. This representation characterizes the mapping from classifier decisions to target label and allows accuracy and diversity to be incorporated within a single measure. Results on many benchmark problems, including the Olivetti Research Laboratory (ORL) face database demonstrate that the measure is well correlated with base-classifier test error, and may be used to predict the optimal number of training epochs. While correlation with ensemble test error is not quite as strong, it is shown in this paper that the measure may be used to predict number of epochs for optimal ensemble performance. Although the technique is only applicable to two-class problems, it is extended here to multiclass through output coding. For the output-coding technique, a random code matrix is shown to give better performance than one-per-class code, even when the base classifier is well-tuned.
Windeatt T, Smith RS, Dias K (2008) Weighted Decoding ECOC for Facial Action Unit Classification, Applications of Supervised and Unsupervised Ensemble Methods
Windeatt T, Zor C (2012) Low training strength high capacity classifiers for accurate ensembles using walsh coefficients, Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) 7626 LNCS pp. 701-709
If a binary decision is taken for each classifier in an ensemble, training patterns may be represented as binary vectors. For a two-class supervised learning problem this leads to a partially specified Boolean function that may be analysed in terms of spectral coefficients. In this paper it is shown that a vote which is weighted by the coefficients enables a fast ensemble classifier that achieves performance close to Bayes rate. Experimental evidence shows that effective classifier performance may be achieved with one epoch of training of an MLP using Levenberg-Marquardt with 64 hidden nodes.
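The weighted vote described in this abstract can be pictured as follows. The sketch below estimates the first-order Walsh (spectral) coefficient of classifier i as the mean of y * h_i(x) over ±1-coded training decisions and labels, and uses it as a vote weight; the helper names are hypothetical and this is an illustration of the idea, not the exact estimator of the paper.

    # Sketch: first-order spectral (Walsh) coefficients as vote weights for a
    # two-class ensemble. Decisions and labels are coded as +/-1.
    import numpy as np

    def first_order_walsh_weights(H_train, y_train):
        # H_train: (n_samples, n_classifiers) array of +/-1 base decisions.
        return (H_train * y_train[:, None]).mean(axis=0)

    def weighted_vote(H, weights):
        # Return +/-1 ensemble predictions from the coefficient-weighted vote.
        return np.sign(H @ weights)

    # Toy example: seven base classifiers with increasing error rates.
    rng = np.random.default_rng(0)
    y = rng.choice([-1, 1], size=500)
    flip = rng.random((500, 7)) < np.linspace(0.2, 0.45, 7)
    H = np.where(flip, -y[:, None], y[:, None])          # noisy copies of y

    w = first_order_walsh_weights(H, y)
    print("weights:", np.round(w, 2))                    # better classifiers receive larger weights
    print("weighted-vote accuracy:", (weighted_vote(H, w) == y).mean())
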
Smith RS, Windeatt T (2010) Facial Expression Detection using Filtered Local Binary Pattern Features with ECOC Classifiers and Platt Scaling., Journal of Machine Learning Research Track 11 pp. 111-118 Microtome Publishing
We outline a design for a FACS-based facial expression recognition system and describe in more detail the implementation of two of its main components. Firstly we look at how features that are useful from a pattern analysis point of view can be extracted from a raw input image. We show that good results can be obtained by using the method of local binary patterns (LBP) to generate a large number of candidate features and then selecting from them using fast correlation-based filtering (FCBF). Secondly we show how Platt scaling can be used to improve the performance of an error-correcting output code (ECOC) classifier.
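Platt scaling, used in this paper to turn raw classifier outputs into probabilities, fits a sigmoid to the decision scores. A minimal sketch using scikit-learn's sigmoid calibration on a linear SVM is given below; the dataset and base classifier are placeholders, not the ECOC pipeline of the paper.

    # Minimal sketch of Platt scaling: calibrate raw decision scores to probabilities
    # by fitting a sigmoid (scikit-learn's method="sigmoid").
    from sklearn.calibration import CalibratedClassifierCV
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.svm import LinearSVC

    X, y = make_classification(n_samples=1000, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    raw = LinearSVC().fit(X_tr, y_tr)                     # uncalibrated decision scores
    platt = CalibratedClassifierCV(LinearSVC(), method="sigmoid", cv=5).fit(X_tr, y_tr)

    print("raw decision score (first test point):", raw.decision_function(X_te[:1])[0])
    print("Platt-scaled probability of class 1:  ", platt.predict_proba(X_te[:1])[0, 1])
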
Zor C, Windeatt T, Yanikoglu B (2011) Bias-variance analysis of ECOC and bagging using neural nets, 373/2011 pp. 59-73 Springer
One of the methods used to evaluate the performance of ensemble classifiers is bias and variance analysis. In this chapter, we analyse bootstrap aggregating (bagging) and Error Correcting Output Coding (ECOC) ensembles using a bias-variance framework, and make comparisons with single classifiers, while having Neural Networks (NNs) as base classifiers. As the performance of the ensembles depends on the individual base classifiers, it is important to understand the overall trends when the parameters of the base classifiers (nodes and epochs for NNs) are changed. We show experimentally on 5 artificial and 4 UCI MLR datasets that there are some clear trends in the analysis that should be taken into consideration while designing NN classifier systems.
Duangsoithong R, Windeatt T (2011) Hybrid correlation and causal feature selection for ensemble classifiers, Studies in Computational Intelligence 373 pp. 97-115 Springer
PC and TPDA algorithms are robust and well known prototype algorithms, incorporating constraint-based approaches for causal discovery. However, both algorithms cannot scale up to deal with high dimensional data, that is, more than a few hundred features. This chapter presents hybrid correlation and causal feature selection for ensemble classifiers to deal with this problem. Redundant features are removed by correlation-based feature selection and then irrelevant features are eliminated by causal feature selection. The number of eliminated features, accuracy, the area under the receiver operating characteristic curve (AUC) and false negative rate (FNR) of the proposed algorithms are compared with correlation-based feature selection (FCBF and CFS) and causal feature selection algorithms (PC, TPDA, GS, IAMB).
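The redundancy-removal step described above can be pictured with a much simpler filter: drop any feature that is highly correlated with one already kept. FCBF itself ranks features by symmetrical uncertainty against the class, so the sketch below (with a hypothetical drop_redundant helper) only conveys the general idea.

    # Illustrative redundancy filter: greedily keep a feature unless its absolute
    # correlation with an already-kept feature exceeds a threshold.
    import numpy as np

    def drop_redundant(X, threshold=0.9):
        corr = np.abs(np.corrcoef(X, rowvar=False))
        kept = []
        for j in range(X.shape[1]):
            if all(corr[j, k] < threshold for k in kept):
                kept.append(j)
        return kept

    rng = np.random.default_rng(0)
    base_feats = rng.normal(size=(200, 5))
    near_copies = base_feats[:, :2] + 0.01 * rng.normal(size=(200, 2))
    X = np.hstack([base_feats, near_copies])              # last two features are redundant
    print("kept feature indices:", drop_redundant(X))     # the near-copies are dropped
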
Prior M, Windeatt T (2007) An ensemble dependence measure, Artificial Neural Networks - ICANN 2007, Pt 1, Proceedings 4668 pp. 329-338 SPRINGER-VERLAG BERLIN
Zor C, Windeatt T, Yanikoglu B (2010) Bias-Variance Analysis of ECOC and Bagging Using Neural Nets, Proceedings of the Third Workshop on Supervised and Unsupervised Ensemble Methods and Their Applications, European Conference on Machine Learning pp. 109-118
Windeatt T, Duangsoithong R, Smith R (2011) Embedded feature ranking for ensemble MLP classifiers, IEEE Transactions on Neural Networks 22 (6) pp. 988-994 IEEE
A feature ranking scheme for multilayer perceptron (MLP) ensembles is proposed, along with a stopping criterion based upon the out-of-bootstrap estimate. To solve multi-class problems feature ranking is combined with modified error-correcting output coding. Experimental results on benchmark data demonstrate the versatility of the MLP base classifier in removing irrelevant features.
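The out-of-bootstrap estimate used here as a stopping criterion can be sketched briefly: each ensemble member votes only on the training patterns left out of its bootstrap sample, and the resulting vote matrix gives an error estimate without a separate validation set. The example below assumes scikit-learn and a decision-tree base classifier for brevity, not the MLP ensemble of the paper.

    # Sketch of an out-of-bootstrap (OOB) error estimate for a bagged ensemble.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=800, random_state=0)
    rng = np.random.default_rng(0)

    votes = np.zeros((len(y), 2))                          # per-pattern OOB class votes
    for _ in range(100):
        idx = rng.integers(0, len(y), len(y))              # bootstrap sample, with replacement
        oob = np.setdiff1d(np.arange(len(y)), idx)         # patterns not drawn this round
        clf = DecisionTreeClassifier(random_state=0).fit(X[idx], y[idx])
        votes[oob, clf.predict(X[oob])] += 1

    covered = votes.sum(axis=1) > 0                        # patterns with at least one OOB vote
    oob_error = np.mean(votes[covered].argmax(axis=1) != y[covered])
    print("out-of-bootstrap error estimate: %.3f" % oob_error)
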
Kittler J, Ghaderi R, Windeatt T, Matas J (2001) Face identification and verification via ECOC, AUDIO- AND VIDEO-BASED BIOMETRIC PERSON AUTHENTICATION, PROCEEDINGS 2091 pp. 1-13 SPRINGER-VERLAG BERLIN
Duangsoithong R, Windeatt T (2009) Relevance and Redundancy Analysis for Ensemble Classifiers, MACHINE LEARNING AND DATA MINING IN PATTERN RECOGNITION 5632 pp. 206-220 SPRINGER-VERLAG BERLIN
Windeatt T, Prior M, Effron N, Intrator N (2007) Ensemble-based Feature Selection Criteria., MLDM Posters pp. 168-182 IBaI publishing
Zor C, Windeatt T (2009) Upper Facial Action Unit Recognition, ADVANCES IN BIOMETRICS 5558 pp. 239-248 SPRINGER-VERLAG BERLIN
This paper concentrates on the comparison of systems that are used for the recognition of expressions generated by six upper face action units (AUs), using the Facial Action Coding System (FACS). Haar wavelet, Haar-Like and Gabor wavelet coefficients are compared, using Adaboost for feature selection. The binary classification results using Support Vector Machines (SVM) for the upper face AUs have been observed to be better than the current results in the literature, for example 96.5% for AU2 and 97.6% for AU5. In the multi-class classification case, Error Correcting Output Coding (ECOC) has been applied. Although for a large number of classes the results are not as accurate as the binary case, ECOC has the advantage of solving all problems simultaneously; and for large numbers of training samples and small numbers of classes, error rates are improved.
Dias K, Windeatt T (2015) Hybrid Dynamic Learning Systems for Regression, ADVANCES IN COMPUTATIONAL INTELLIGENCE, PT II 9095 pp. 464-476 SPRINGER-VERLAG BERLIN
Dias K, Windeatt T (2014) Dynamic ensemble selection and instantaneous pruning for regression used in signal calibration, Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) 8681 LNCS pp. 475-482
A dynamic method of selecting a pruned ensemble of predictors for regression problems is described. The proposed method enhances the prediction accuracy and generalization ability of pruning methods that change the order in which ensemble members are combined. Ordering heuristics attempt to combine accurate yet complementary regressors. The proposed method enhances the performance by modifying the order of aggregation through distributing the regressor selection over the entire dataset. This paper compares four static ensemble pruning approaches with the proposed dynamic method. The experimental comparison is made using MLP regressors on benchmark datasets and on an industrial application of radio frequency source calibration.
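A static version of the ordered aggregation that this abstract builds on can be sketched as follows: members are added greedily in whatever order most reduces the validation error of the running average, and the ensemble is then truncated at the best prefix. The greedy_order helper below is hypothetical and does not reproduce the dynamic, per-instance selection proposed in the paper.

    # Sketch of ordered aggregation for a regression ensemble (static ordering only).
    import numpy as np

    def greedy_order(preds, y_val):
        # preds: (n_members, n_val) validation predictions of each regressor.
        remaining = list(range(len(preds)))
        order, running_sum = [], np.zeros_like(y_val, dtype=float)
        while remaining:
            errs = [np.mean(((running_sum + preds[i]) / (len(order) + 1) - y_val) ** 2)
                    for i in remaining]
            best = remaining.pop(int(np.argmin(errs)))
            order.append(best)
            running_sum += preds[best]
        return order                                       # prune by keeping the best prefix

    rng = np.random.default_rng(0)
    y_val = rng.normal(size=100)
    noise = rng.uniform(0.1, 1.0, size=(10, 1))            # members of varying quality
    preds = y_val + rng.normal(scale=noise, size=(10, 100))
    print("aggregation order:", greedy_order(preds, y_val))
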
Hancock E, Wilson R, Windeatt T, Ulusoy I, Escolano F (2010) Preface: Lecture Notes in Computer Science: Structural, Syntactic and Statistical Pattern Recognition, Lecture Notes in Computer Science: Structural, Syntactic and Statistical Pattern Recognition 6218 pp. v-vi Springer
Dias K, Windeatt T (2014) Dynamic ensemble selection and instantaneous pruning for regression., ESANN
Gimel'Farb GL, Hancock E, Imiya A, Kudo M, Kuijper A, Omachi S, Windeatt T, Yamada K (2012) Preface, Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) 7626 LNCS
Zor C, Windeatt T, Kittler J (2013) ECOC Matrix Pruning Using Accuracy Information., MCS 7872 pp. 386-397 Springer
Zor C, Yanikoglu B, Windeatt T, Alpaydin E (2010) FLIP-ECOC: A greedy optimization of the ECOC matrix, Lecture Notes in Electrical Engineering: Computer and Information Sciences 62 (5) pp. 149-154 Springer
Error Correcting Output Coding (ECOC) is a multiclass classification technique, in which multiple base classifiers (dichotomizers) are trained using subsets of the training data, determined by a preset code matrix. While it is one of the best solutions to multiclass problems, ECOC is suboptimal, as the code matrix and the base classifiers are not learned simultaneously. In this paper, we show an iterative update algorithm that reduces this decoupling. We compare the algorithm with the standard ECOC approach, using Neural Networks (NNs) as the base classifiers, and show that it improves the accuracy for some well-known data sets under different settings.
Duangsoithong R, Windeatt T (2010) Hybrid Correlation and Causal Feature Selection for Ensemble Classifiers, Proceedings of the Third Workshop on Supervised and Unsupervised Ensemble Methods and Their Applications, European Conference on Machine Learning pp. 23-32
PC and TPDA algorithms are robust and well known prototype algorithms, incorporating constraint-based approaches for causal discovery. However, both algorithms cannot scale up to deal with high dimensional data, that is, more than a few hundred features. This paper presents hybrid correlation and causal feature selection for ensemble classifiers to deal with this problem. The number of eliminated features, accuracy, the area under the receiver operating characteristic curve (AUC) and false negative rate (FNR) of the proposed algorithms are compared with correlation-based feature selection (FCBF and CFS) and causal feature selection algorithms (PC, TPDA, GS, IAMB).
Duangsoithong R, Windeatt T (2010) Bootstrap feature selection for ensemble classifiers, Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) 6171 LNAI pp. 28-41
Zor C, Windeatt T, Kittler J (2013) ECOC pruning using accuracy, diversity and hamming distance information, 2013 21st Signal Processing and Communications Applications Conference, SIU 2013
Existing ensemble pruning algorithms in the literature have mainly been defined for unweighted or weighted voting ensembles, and their extensions to the Error Correcting Output Coding (ECOC) framework have not been successful. This paper presents a novel pruning algorithm to be used in the pruning of ECOC, using a new accuracy measure together with diversity and Hamming distance information. The results show that the novel method outperforms those existing in the state-of-the-art.
Smith RS, Windeatt T (2015) Facial action unit recognition using multi-class classification, Neurocomputing 150 (PB) pp. 440-448
Within the context of facial expression classification using the facial action coding system (FACS), we address the problem of detecting facial action units (AUs). Feature extraction is performed by generating a large number of multi-resolution local binary pattern (MLBP) features and then selecting from these using fast correlation-based filtering (FCBF). The need for a classifier per AU is avoided by training a single error-correcting output code (ECOC) multi-class classifier to generate occurrence scores for each of several AU groups. A novel weighted decoding scheme is proposed with the weights computed using first order Walsh coefficients. Platt scaling is used to calibrate the ECOC scores to probabilities and appropriate sums are taken to obtain separate probability estimates for each AU individually. The bias and variance properties of the classifier are measured and we show that both these sources of error can be reduced by enhancing ECOC through bootstrapping and weighted decoding.
Windeatt T, Dias K (2008) Feature ranking ensembles for facial action unit classification, ARTIFICIAL NEURAL NETWORKS IN PATTERN RECOGNITION, PROCEEDINGS 5064 pp. 267-279 SPRINGER-VERLAG BERLIN
Prior M, Windeatt T (2006) Parameter mining using the out-of-bootstrap generalisation error estimate for Stochastic Discrimination and Random Forests, 18th International Conference on Pattern Recognition, Vol 2, Proceedings pp. 498-501 IEEE COMPUTER SOC
Windeatt T, Zor C (2013) Ensemble Pruning Using Spectral Coefficients, IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 24 (4) pp. 673-678 IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Windeatt T, Prior M (2007) Stopping criteria for ensemble-based feature selection, Multiple Classifier Systems, Proceedings 4472 pp. 271-281 SPRINGER-VERLAG BERLIN
Windeatt T (2009) Weighted decoding ECOC for facial action unit classification, Studies in Computational Intelligence 245 pp. 59-77 Springer
There are two approaches to automating the task of facial expression recognition, the first concentrating on what meaning is conveyed by facial expression and the second on categorising deformation and motion into visual classes. The latter approach has the advantage that the interpretation of facial expression is decoupled from individual actions as in FACS (Facial Action Coding System). In this chapter, upper face action units (AUs) are classified using an ensemble of MLP base classifiers with feature ranking based on PCA components. When posed as a multi-class problem using Error-Correcting-Output-Coding (ECOC), experimental results on the Cohn-Kanade database demonstrate that error rates comparable to two-class problems (one-versus-rest) may be obtained. The ECOC coding and decoding strategies are discussed in detail, and a novel weighted decoding approach is shown to outperform conventional ECOC decoding. Furthermore, base classifiers are tuned using the ensemble Out-of-Bootstrap estimate, for which purpose ECOC decoding is modified. The error rates obtained for six upper face AUs around the eyes are believed to be among the best for this database.
Smith RS, Windeatt T (2011) Facial Action Unit Recognition using Filtered Local Binary Pattern Features with Bootstrapped and Weighted ECOC Classifiers, Studies in Computational Intelligence 373/2011 pp. 1-20 Springer
Within the context of face expression classification using the facial action coding system (FACS), we address the problem of detecting facial action units (AUs). The method adopted is to train a single error-correcting output code (ECOC) multiclass classifier to estimate the probabilities that each one of several commonly occurring AU groups is present in the probe image. Platt scaling is used to calibrate the ECOC outputs to probabilities and appropriate sums of these probabilities are taken to obtain a separate probability for each AU individually. Feature extraction is performed by generating a large number of local binary pattern (LBP) features and then selecting from these using fast correlation-based filtering (FCBF). The bias and variance properties of the classifier are measured and we show that both these sources of error can be reduced by enhancing ECOC through the application of bootstrapping and class-separability weighting.
Smith RS, Windeatt T (2010) A bias-variance analysis of bootstrapped class-separability weighting for error-correcting output code ensembles, Proceedings - International Conference on Pattern Recognition pp. 61-64
We investigate the effects, in terms of a bias-variance decomposition of error, of applying class-separability weighting plus bootstrapping in the construction of error-correcting output code ensembles of binary classifiers. Evidence is presented to show that bias tends to be reduced at low training strength values whilst variance tends to be reduced across the full range. The relative importance of these effects, however, varies depending on the stability of the base classifier type.
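The kind of bias/variance estimate used in such experiments can be sketched with a Domingos-style 0-1 loss decomposition over bootstrap replicates: bias is measured from the modal ("main") prediction per test point and variance from disagreement with it. The base classifier and data below are illustrative, and the weighting details of the published analysis are not reproduced.

    # Sketch: estimating 0-1 loss bias and variance over bootstrap replicates.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=1500, n_informative=5, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=500, random_state=0)

    rng = np.random.default_rng(0)
    preds = []
    for _ in range(50):                                    # 50 bootstrap replicates
        idx = rng.integers(0, len(X_tr), len(X_tr))
        clf = DecisionTreeClassifier().fit(X_tr[idx], y_tr[idx])
        preds.append(clf.predict(X_te))
    preds = np.array(preds)                                # (replicates, test points)

    main = np.array([np.bincount(col).argmax() for col in preds.T])   # modal prediction
    bias = np.mean(main != y_te)                           # main prediction disagrees with truth
    variance = np.mean(preds != main)                      # replicates disagree with main prediction
    print("bias: %.3f  variance: %.3f" % (bias, variance))
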
Smith RS, Bober M, Windeatt T (2011) A comparison of random forest with ECOC-based classifiers, Lecture Notes in Computer Science: Multiple Classifier Systems 6713 pp. 207-216 Springer
We compare experimentally the performance of three approaches to ensemble-based classification on general multi-class datasets. These are the methods of random forest, error-correcting output codes (ECOC) and ECOC enhanced by the use of bootstrapping and class-separability weighting (ECOC-BW). These experiments suggest that ECOC-BW yields better generalisation performance than either random forest or unmodified ECOC. A bias-variance analysis indicates that ECOC benefits from reduced bias, when compared to random forest, and that ECOC-BW benefits additionally from reduced variance. One disadvantage of ECOC-based algorithms, however, when compared with random forest, is that they impose a greater computational demand leading to longer training times.
Windeatt T, Zor C (2011) Minimising added classification error using walsh coefficients, IEEE Transactions on Neural Networks 22 (8) pp. 1334-1339 IEEE
Two-class supervised learning in the context of a classifier ensemble may be formulated as learning an incompletely specified Boolean function, and the associated Walsh coefficients can be estimated without knowledge of the unspecified patterns. Using an extended version of the Tumer-Ghosh model, the relationship between Added Classification Error and second order Walsh coefficients is established. In this paper, the ensemble is composed of Multi-layer Perceptron (MLP) base classifiers, with the number of hidden nodes and epochs systematically varied. Experiments demonstrate that the mean second order coefficients peak at the same number of training epochs as ensemble test error reaches a minimum.
Limshuebchuey Asavaron, Duangsoithong Rakkrit, Windeatt Terry (2016) Redundant feature identification and redundancy analysis for causal feature selection, 2015 8th Biomedical Engineering International Conference (BMEiCON) IEEE
High dimensional data can lead to low classification accuracy and long computation times because it contains irrelevant and redundant features. To overcome this problem, the dimension of the data has to be reduced. Causal feature selection is one of the methods for feature reduction, but it cannot identify redundant features. This paper presents a Parent-Children based Causal Redundant Feature Identification (PCRF) algorithm to identify and remove redundant features. The classification accuracy and the number of features reduced by the PCRF algorithm are compared with correlation feature selection. According to the results, the PCRF algorithm can identify redundant features but has lower classification accuracy than correlation feature selection.
Windeatt Terry (2018) Optimising Ensemble of Two-Class classifiers using Spectral Analysis, ICPR 2018 Proceedings IEEE
An approach to approximating the decision boundary of an ensemble of two-class classifiers is proposed. Spectral coefficients are used to approximate the discrete probability density function of a Boolean function. It is shown that the difference between first and third order coefficient approximation is a good indicator of optimal base classifier complexity. A theoretical analysis is supported by experimental results on a variety of artificial and real two-class problems.
Windeatt Terry, Zor Cemre, Camgöz Necati Cihan (2018) Approximation of Ensemble Boundary using Spectral Coefficients, IEEE Transactions on Neural Networks and Learning Systems IEEE
A spectral analysis of a Boolean function is proposed for approximating the decision boundary of an ensemble of classifiers, and an intuitive explanation of computing Walsh coefficients for the functional approximation is provided. It is shown that the difference between first and third order coefficient approximation is a good indicator of optimal base classifier complexity. When combining Neural Networks, experimental results on a variety of artificial and real two-class problems demonstrate under what circumstances ensemble performance can be improved. For tuned base classifiers, first order coefficients provide performance similar to majority vote. However, for weak/fast base classifiers, higher order coefficient approximation may give better performance. It is also shown that higher order coefficient approximation is superior to the Adaboost logarithmic weighting rule when boosting weak Decision Tree base classifiers.
Jitaree S, Windeatt Terry, Boonyapiphat P, Phukpattaranont P (2017) Classifying Breast Cancer Microscopic Images using Fractal Dimension and Ensemble Classifier, Biomedical Engineering International Conference (BMEiCON-2017) Proceedings IEEE
To improve the performance of computer-aided systems for breast cancer diagnosis, an ensemble classifier is proposed for classifying the histological structures in breast cancer microscopic images into three region types: positive cancer cells, negative cancer cells and non-cancer cells (stromal cells and lymphocyte cells). The bagging and boosting ensemble techniques are used with the decision tree (DT) learner, and are also compared with a single DT classifier. The feature used as input to the classifiers is the fractal dimension (FD) based on 12 color channels. It is computed from image datasets that are manually prepared as small cropped images with three window sizes: 128×128, 192×192 and 256×256 pixels. The results show that the boosting ensemble classifier gives the best accuracy, about 80%, for a window size of 256, although this setting gives the lowest accuracy when a single DT is used as the classifier. The results indicate that the ensemble method is capable of improving classification accuracy compared to the single classifier. The classification model using FD and the ensemble classifier could be applied in future to develop computer-aided systems for breast cancer diagnosis.
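For a rough picture of the ensemble set-up compared in this paper, the sketch below runs bagging and boosting of decision trees against a single tree; the data here are generic, not the fractal-dimension colour features of the study.

    # Sketch: bagging and boosting of decision trees versus a single tree.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=600, n_classes=3, n_informative=6, random_state=0)

    models = {
        "single DT": DecisionTreeClassifier(random_state=0),
        "bagging":   BaggingClassifier(DecisionTreeClassifier(), n_estimators=50, random_state=0),
        "boosting":  AdaBoostClassifier(DecisionTreeClassifier(max_depth=2), n_estimators=50,
                                        random_state=0),
    }
    for name, model in models.items():
        print("%-10s accuracy: %.3f" % (name, cross_val_score(model, X, y, cv=5).mean()))
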
Zor Cemre, Yanikoglu Berrin, Merdivan Erinc, Windeatt Terry, Kittler Josef, Alpaydin Ethem (2017) BeamECOC: A local search for the optimization of the ECOC matrix, ICPR 2016 Proceedings pp. 198-203 IEEE
Error Correcting Output Coding (ECOC) is a multi-class classification technique in which multiple binary classifiers are trained according to a preset code matrix such that each one learns a separate dichotomy of the classes. While ECOC is one of the best solutions for multi-class problems, one issue which makes it suboptimal is that the training of the base classifiers is done independently of the generation of the code matrix.
In this paper, we propose to modify a given ECOC matrix to improve its performance by reducing this decoupling. The proposed algorithm uses beam search to iteratively modify the original matrix, using validation accuracy as a guide. It does not involve further training of the classifiers and can be applied to any ECOC matrix.
We evaluate the accuracy of the proposed algorithm (BeamECOC) using 10-fold cross-validation experiments on 6 UCI datasets, using random code matrices of different sizes, and base classifiers of different strengths. Compared to the random ECOC approach, BeamECOC increases the average cross-validation accuracy in 83.3% of the experimental settings involving all datasets, and gives better results than the state-of-the-art in 75% of the scenarios. By employing BeamECOC, it is also possible to reduce the number of columns of a random matrix down to 13% and still obtain comparable or even better results at times.
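The idea of modifying a trained ECOC matrix without retraining the base classifiers can be conveyed with a plain hill-climbing sketch: flip one code-matrix entry at a time and keep the flip if validation accuracy of the decoding improves. The greedy_flip helper below is hypothetical and uses simulated base-classifier outputs; the paper's algorithm uses beam search rather than this single-path greedy variant.

    # Greedy sketch of post-hoc ECOC matrix optimisation guided by validation accuracy.
    import numpy as np

    def decode(H, M):
        # H: (n_val, n_cols) +/-1 base-classifier outputs; M: (n_classes, n_cols) code matrix.
        return np.argmax(H @ M.T, axis=1)                  # nearest codeword by inner product

    def greedy_flip(M, H_val, y_val):
        M = M.copy()
        best = np.mean(decode(H_val, M) == y_val)
        improved = True
        while improved:
            improved = False
            for c in range(M.shape[0]):
                for j in range(M.shape[1]):
                    M[c, j] *= -1                          # try flipping one entry
                    acc = np.mean(decode(H_val, M) == y_val)
                    if acc > best:
                        best, improved = acc, True
                    else:
                        M[c, j] *= -1                      # undo the flip
        return M, best

    # Simulated validation outputs: noisy copies of the codewords of a random matrix.
    rng = np.random.default_rng(0)
    M = rng.choice([-1, 1], size=(4, 10))                  # 4 classes, 10 dichotomies
    y_val = rng.integers(0, 4, size=300)
    H_val = np.where(rng.random((300, 10)) < 0.25, -M[y_val], M[y_val])

    print("accuracy before: %.3f" % np.mean(decode(H_val, M) == y_val))
    M_new, acc = greedy_flip(M, H_val, y_val)
    print("accuracy after:  %.3f" % acc)
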
