Lucas Mentch

University of Pittsburgh


As the size, complexity, and availability of data continue to grow, scientists are increasingly relying upon black-box learning algorithms that can often provide accurate predictions with minimal a priori model specification. Tools like random forests have an established track record of off-the-shelf success and offer various ad hoc strategies for analyzing the underlying relationships among variables. Motivated by recent insights into random forest behavior, we introduce the idea of augmented bagging (AugBagg), a procedure that operates in an identical fashion to classical bagging and random forests, but which acts on a larger, augmented space containing additional randomly generated noise features. Surprisingly and counterintuitively, we demonstrate that this simple act of including extra noise variables in the model can lead to dramatic improvements in out-of-sample predictive accuracy. As a result, common notions of variable importance based on improved model accuracy may be fatally flawed, as even purely random noise can routinely register as statistically significant. Numerous demonstrations on both real and synthetic data are provided along with a proposed solution.
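To make the AugBagg idea concrete, the following is a minimal sketch of the procedure described above: append randomly generated noise columns to the design matrix, then run ordinary bagging on the augmented space. This is not the paper's implementation; for a dependency-light illustration it uses bagged linear base learners in place of the trees used in bagging and random forests, and names like `augment_with_noise` and `bagged_predict` are purely illustrative.

```python
import numpy as np

def augment_with_noise(X, n_noise, rng):
    """Append n_noise i.i.d. standard-normal noise columns to X."""
    noise = rng.normal(size=(X.shape[0], n_noise))
    return np.hstack([X, noise])

def bagged_predict(X, y, X_test, n_estimators, rng):
    """Ordinary bagging: average base learners fit on bootstrap samples.

    Base learner here is a least-squares fit (a stand-in for the trees
    used in the paper), kept so the sketch needs only numpy.
    """
    n = X.shape[0]
    Xtr = np.column_stack([np.ones(n), X])               # intercept column
    Xte = np.column_stack([np.ones(X_test.shape[0]), X_test])
    preds = []
    for _ in range(n_estimators):
        idx = rng.integers(0, n, size=n)                 # bootstrap resample
        beta, *_ = np.linalg.lstsq(Xtr[idx], y[idx], rcond=None)
        preds.append(Xte @ beta)
    return np.mean(preds, axis=0)

rng = np.random.default_rng(0)
n, p, p_noise = 200, 5, 20
X = rng.normal(size=(n, p))
y = X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=n)
X_test = rng.normal(size=(50, p))

# AugBagg: bag on the augmented space [X, noise]; test points receive
# their own freshly drawn noise columns, which carry no signal.
X_aug = augment_with_noise(X, p_noise, rng)              # shape (200, 25)
X_test_aug = augment_with_noise(X_test, p_noise, rng)
y_hat = bagged_predict(X_aug, y, X_test_aug, n_estimators=50, rng=rng)
```

The key point of the abstract is that, with tree-based learners, fitting on `X_aug` rather than `X` can improve out-of-sample accuracy even though the extra columns are pure noise, which is why accuracy-based importance measures can flag noise as significant.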

© 2018-2020 Researchers.One