
Marco Visentin; Cesare Furlanello,
Time Series Boosting for the Automatic Selection of Panels in Marketing and Financial Studies,
2001

Cesare Furlanello; Stefano Merler; Stefano Menegon,
Metodi informatici WebGIS per l'analisi e la sorveglianza epidemiologica delle infezioni trasmesse da zecche [WebGIS computing methods for the epidemiological analysis and surveillance of tick-borne infections],
2001

C. Chemini; Annamaria Rizzoli; Cesare Furlanello,
La ricerca su zecche e malattie trasmesse in Trentino [Research on ticks and tick-borne diseases in Trentino],
2001

Stefano Merler; Cesare Furlanello; Barbara Larcher; Andrea Sboner,
Tuning Cost-Sensitive Boosting and Its Application to Melanoma Diagnosis,
Second International Workshop on Multiple Classifier Systems,
Springer Verlag,
2001, pp. 32-42
(Cambridge, UK, 02/07/2001 - 04/07/2001)

Cesare Furlanello; P. Lecca; S. Zamboni,
Servizi G.I.S. e metodi statistici per la gestione della tutela e della ricerca archeologica in Trentino [G.I.S. services and statistical methods for managing archaeological protection and research in Trentino],
Atti del Convegno Archeologia e Territorio,
2001

Bruno Caprile; Cesare Furlanello; Stefano Merler,
Exact Bagging with k-Nearest Neighbour Classifiers,
A formula is exhibited for the exact computation of Bagging classifiers when the base model adopted is k-Nearest Neighbour (k-NN). The formula holds in any dimension, does not require the extraction of bootstrap replicates, and yields an implementation of Bagging that is as fast as the computation of a single k-NN classifier. It also shows that Bagging with 1-Nearest Neighbour is perfectly equivalent to plain 1-NN,
2001
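The exact-Bagging computation described in the abstract can be checked numerically. The probability that the r-th nearest training point is the nearest one appearing in a bootstrap replicate of size n is (1 - (r-1)/n)^n - (1 - r/n)^n, which gives a closed-form vote weight per rank; the sketch below uses this fact with illustrative data and function names of our own (the paper's exact formulation is not reproduced here). Since the rank-1 weight is about 0.632 > 1/2, the bagged 1-NN vote always agrees with plain 1-NN:

```python
import numpy as np

def exact_bagged_1nn(X_train, y_train, X_test):
    """Exact Bagging of a 1-NN classifier, without drawing bootstrap replicates.

    w_r = (1 - (r-1)/n)^n - (1 - r/n)^n is the probability that the rank-r
    training point (r-th nearest to the query) is the nearest point present
    in a bootstrap replicate; the bagged vote for a class sums w_r over it.
    """
    n = len(X_train)
    r = np.arange(1, n + 1)
    w = (1 - (r - 1) / n) ** n - (1 - r / n) ** n   # telescopes: sums to 1
    preds = []
    for x in X_test:
        order = np.argsort(np.linalg.norm(X_train - x, axis=1))
        votes = {}
        for label, weight in zip(y_train[order], w):
            votes[label] = votes.get(label, 0.0) + weight
        preds.append(max(votes, key=votes.get))
    return np.array(preds)

def plain_1nn(X_train, y_train, X_test):
    return np.array([y_train[np.argmin(np.linalg.norm(X_train - x, axis=1))]
                     for x in X_test])

rng = np.random.default_rng(0)
X_train = rng.normal(size=(50, 2))
y_train = (X_train[:, 0] + X_train[:, 1] > 0).astype(int)
X_test = rng.normal(size=(20, 2))
# Bagging with 1-NN coincides with plain 1-NN: the rank-1 weight exceeds 1/2
assert (exact_bagged_1nn(X_train, y_train, X_test)
        == plain_1nn(X_train, y_train, X_test)).all()
```

Because no replicates are drawn, the cost is that of a single k-NN evaluation plus a weighted vote, matching the speed claim in the abstract.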

Bruno Caprile; Cesare Furlanello; Stefano Merler,
The Dynamics of AdaBoost Weights Tells You What's Hard to Classify,
The dynamical evolution of weights in the AdaBoost algorithm contains useful information about the role that the associated data points play in the building of the AdaBoost model. In particular, the dynamics induces a bipartition of the data set into two (easy/hard) classes. Easy points have no influence on the making of the model, while the varying relevance of hard points can be gauged in terms of an entropy value associated with their evolution. Smooth approximations of entropy highlight regions where classification is most uncertain. Promising results are obtained when the proposed methods are applied in the Optimal Sampling framework,
2001
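One illustrative reading of the entropy measure sketched in the abstract: normalize a single point's weight trajectory across boosting rounds into a distribution and take its Shannon entropy. An easy point's weight collapses quickly, concentrating mass in the first rounds (low entropy); a hard point's weight keeps fluctuating (high entropy). This is a sketch under our own assumptions, not the paper's exact definition:

```python
import numpy as np

def trajectory_entropy(w_traj):
    """Shannon entropy of one data point's AdaBoost weight trajectory.

    The trajectory (one weight per boosting round) is normalized into a
    distribution over rounds; quickly vanishing weights score low, while
    persistently fluctuating weights score high. Illustrative only.
    """
    p = np.asarray(w_traj, dtype=float)
    p = p / p.sum()
    p = p[p > 0]                       # 0 log 0 := 0
    return float(-(p * np.log(p)).sum())

rounds = np.arange(30)
easy_traj = 0.5 ** rounds              # weight collapses: an "easy" point
hard_traj = 0.3 + 0.2 * np.sin(rounds) # weight keeps oscillating: a "hard" point
assert trajectory_entropy(hard_traj) > trajectory_entropy(easy_traj)
```

In an Optimal Sampling setting, such a score could rank points for labelling or resampling by how uncertain their classification remains across rounds.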

Stefano Merler; Cesare Furlanello; Barbara Larcher; Andrea Sboner,
SSTBoost: Automatic Model Selection in Cost-Sensitive Boosting,
This paper introduces SSTBoost, a predictive classification methodology designed to target the accuracy of a modified boosting algorithm towards required sensitivity and specificity constraints. The SSTBoost method is demonstrated in practice on the automated medical diagnosis of cancer for a set of skin lesions (42 melanomas and 110 naevi) described by geometric and colorimetric features. A cost-sensitive variant of the AdaBoost algorithm is combined with a procedure for the automatic selection of optimal cost parameters. Within each boosting step, different weights are considered for errors on false negatives and false positives, and are updated differently for negatives and positives. Given only a target region in the ROC space, the method also completely automates the selection of the cost parameter ratio, which is typically of uncertain definition. On the cancer diagnosis problem, SSTBoost outperformed, in both accuracy and stability, a battery of specialized automatic systems based on different types of multiple classifier combinations, as well as a panel of expert dermatologists. The method can thus be applied to the early diagnosis of melanoma or to other problems in which automated cost-sensitive classification is required,
2001
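The core mechanism the abstract describes, penalizing false negatives and false positives differently within a boosting round, can be sketched as an asymmetric version of AdaBoost's exponential re-weighting. This is a minimal, illustrative update, not the paper's exact SSTBoost rule (and the automatic cost-ratio selection is not reproduced); all names and the cost values are ours:

```python
import numpy as np

def asymmetric_weight_update(w, y_true, y_pred, alpha, c_fn=2.0, c_fp=1.0):
    """One cost-sensitive boosting re-weighting step (illustrative sketch).

    Errors on positives (false negatives) are amplified by c_fn and errors
    on negatives (false positives) by c_fp, steering the next round toward
    the sensitivity/specificity trade-off encoded by the ratio c_fn/c_fp.
    Labels are in {-1, +1}; alpha is the current round's classifier weight.
    """
    miss = (y_true != y_pred).astype(float)
    cost = np.where(y_true == 1, c_fn, c_fp)   # per-example misclassification cost
    w = w * np.exp(alpha * miss * cost)        # asymmetric exponential update
    return w / w.sum()                         # keep the weights a distribution

w = np.full(4, 0.25)
y_true = np.array([1, 1, -1, -1])
y_pred = np.array([-1, 1, -1, 1])              # one false negative, one false positive
w_new = asymmetric_weight_update(w, y_true, y_pred, alpha=0.5)
assert w_new[0] > w_new[3]                     # missed positive now outweighs missed negative
```

With c_fn > c_fp the ensemble drifts toward higher sensitivity, which is the direction a melanoma screening task would require; choosing the ratio from a target ROC region is the part SSTBoost automates.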