Vegetation Cover Type Classification Using Cartographic Data for Prediction of Wildfire Behaviour

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)
47 Downloads (Pure)


Predicting the behaviour of wildfires can help save lives and reduce health, socioeconomic, and environmental impacts. Because wildfire behaviour is highly dependent on fuel type and distribution, their accurate estimation is paramount for reliable prediction of fire propagation dynamics. This paper studies the effect of combining automated hyperparameter tuning via Bayesian optimisation with recursive feature elimination on the accuracy of three boosting models (AdaB, XGB, CatB), two bagging models (Random Forest, Extremely Randomised Trees), and three stacking ensemble models, with respect to their ability to estimate the vegetation cover type from cartographic data. The models are trained on the University of California Irvine (UCI) cover type dataset using five-fold cross-validation. Feature importance scores are calculated and used in a recursive feature elimination analysis to study the sensitivity of model accuracy to different feature combinations. Our results indicate that the implemented fine-tuning procedure significantly affects the accuracy of all models investigated, with XGB achieving an overall accuracy of 97.1%, slightly outperforming the others.
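The core pipeline the abstract describes (an ensemble classifier combined with recursive feature elimination, evaluated under five-fold cross-validation) can be sketched with scikit-learn. This is an illustrative sketch, not the authors' implementation: the synthetic dataset, the Random Forest settings, and the choice of six retained features are all assumptions standing in for the UCI cover type data and the tuned hyperparameters reported in the paper.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the UCI cover type data (illustrative only).
X, y = make_classification(
    n_samples=500, n_features=12, n_informative=6,
    n_classes=3, random_state=0,
)

# Bagging ensemble; the paper tunes such models with Bayesian optimisation,
# whereas the parameters here are simply defaults.
clf = RandomForestClassifier(n_estimators=100, random_state=0)

# Recursive feature elimination: repeatedly drop the least important feature
# (ranked by the forest's feature_importances_) until six remain.
selector = RFE(estimator=clf, n_features_to_select=6, step=1).fit(X, y)
X_selected = selector.transform(X)

# Five-fold cross-validated accuracy on the reduced feature set.
scores = cross_val_score(clf, X_selected, y, cv=5)
print(f"Mean CV accuracy: {scores.mean():.3f}")
```

Varying `n_features_to_select` and re-running the cross-validation is the basic loop behind the sensitivity analysis the abstract mentions: it shows how accuracy responds as features are progressively eliminated.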
Original language: English
Article number: 76
Number of pages: 18
Issue number: 2
Early online date: 18 Feb 2023
Publication status: Published - 18 Feb 2023

Bibliographical note

© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.


  • Forest fire
  • Fuel load
  • Hyperparameter tuning
  • Machine learning (ML)
  • Ensemble models
  • Bayesian optimisation
  • Optimization


