Published by the Foundation for Open Access Statistics
Editors-in-chief: Bettina Grün, Torsten Hothorn, Rebecca Killick, Edzer Pebesma, Achim Zeileis
ISSN 1548-7660; CODEN JSSOBK
Authors: Rodney Sparapani, Charles Spanbauer, Robert McCulloch
Title: Nonparametric Machine Learning and Efficient Computation with Bayesian Additive Regression Trees: The BART R Package
Abstract: In this article, we introduce the BART R package, whose name is an acronym for Bayesian additive regression trees. BART is a Bayesian nonparametric, machine learning, ensemble predictive modeling method for continuous, binary, categorical, and time-to-event outcomes. Furthermore, BART is a tree-based, black-box method that fits the outcome to an arbitrary random function, f, of the covariates. The BART technique is relatively efficient computationally compared to its competitors, but large sample sizes can still be demanding. Therefore, the BART package includes efficient, state-of-the-art implementations for continuous, binary, categorical, and time-to-event outcomes that can take advantage of modern off-the-shelf hardware and software multi-threading technology. The BART package is written in C++ for both programmer and execution efficiency. It exploits multi-threading via forking, as provided by the parallel package, and via OpenMP when available and supported by the platform. The ensemble of binary trees produced by a BART fit can be stored and re-used later via the R predict function. In addition to being an R package, the installed BART routines can be called directly from C++. The BART package provides the tools for your BART toolbox.
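As a minimal sketch of the workflow the abstract describes (assuming the BART package is installed from CRAN; the data here are simulated for illustration), a continuous-outcome fit via `wbart()` can be stored and re-used later with `predict()`, and `mc.wbart()` offers the fork-based multi-threaded alternative:

```r
library(BART)

## simulated continuous outcome: y = f(x) + noise (illustrative data)
set.seed(12)
n <- 100
x <- matrix(runif(n * 2), n, 2)
y <- x[, 1]^2 + sin(pi * x[, 2]) + rnorm(n, sd = 0.1)

## fit an ensemble of binary trees to f (serial run)
fit <- wbart(x.train = x, y.train = y)

## multi-threaded alternative via forking (parallel package; not on Windows):
## fit <- mc.wbart(x.train = x, y.train = y, mc.cores = 4)

## the stored fit can be re-used later on new covariate values
xnew <- matrix(runif(10 * 2), 10, 2)
yhat <- predict(fit, newdata = xnew)  # posterior draws of f at xnew
```

Analogous functions cover the other outcome types (e.g., `pbart()` for binary and `surv.bart()` for time-to-event outcomes), each with an `mc.`-prefixed forking counterpart.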

Page views: 733. Submitted: 2019-01-23. Published: 2021-01-14.
Paper: Nonparametric Machine Learning and Efficient Computation with Bayesian Additive Regression Trees: The BART R Package (PDF)
Supplements:
BART_2.9.tar.gz: R source package (4 MB)
v97i01-replication.zip: Replication materials (70 KB)

DOI: 10.18637/jss.v097.i01

This work is licensed under the following licenses:
Paper: Creative Commons Attribution 3.0 Unported License
Code: GNU General Public License, version 2 or version 3 (or a GPL-compatible license).