Robert B. Gramacy, Professor of Statistics

Dynamic Trees for Learning and Design

dynaTree is an R package implementing sequential Monte Carlo inference for dynamic tree regression and classification models via particle learning (PL). The sequential nature of the inference, together with the active learning (AL) hooks provided, facilitates thrifty sequential design and optimization.

This software is licensed under the GNU Lesser General Public License (LGPL), version 2 or later. See the change log and an archive of previous versions.

The current version provides:

  • regression by constant and linear leaf models
  • classification by multinomial leaf models
  • sequential design for regression models by active learning heuristics based on predictive variance (ALM and ALC), and for classification boundaries by predictive entropy
  • optimization of regression models by expected improvement (EI) statistics
  • variable selection by relevance statistics and Saltelli-style sensitivity indices
  • fully online learning via retirement and active discarding for massive data, and forgetting factors for drifting concepts
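To give a flavor of the basic workflow behind the features above, here is a minimal sketch of a regression fit and prediction. It assumes the `dynaTree()` and `predict()` interface described in the package help pages; the particular data, particle count, and leaf-model choice are illustrative only.

```r
## minimal sketch: fit a dynamic tree regression to 1-d toy data
## (argument names assume the dynaTree help pages; see ?dynaTree)
library(dynaTree)

## toy data: a noisy sinusoid
X <- matrix(seq(0, 10, length = 100), ncol = 1)
y <- sin(X[, 1]) + rnorm(100, sd = 0.1)

## fit with N particles and a linear leaf model
fit <- dynaTree(X, y, N = 1000, model = "linear")

## posterior predictive summaries on a grid of new locations
XX <- matrix(seq(0, 10, length = 200), ncol = 1)
fit <- predict(fit, XX, quants = TRUE)

## visualize the predictive mean over the data
plot(X, y)
lines(XX, fit$mean, col = 2)
```

The classification and sequential-design features follow the same pattern, with the relevant leaf model (`model = "class"`) or active learning statistic requested at prediction time; see the demos below for worked examples.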

Obtaining the package

  • Download R from CRAN by selecting the version for your operating system.
  • Install the dynaTree package from within R.
    R> install.packages("dynaTree")
  • Optionally, install the akima, plgp and tgp packages, which are helpful for some of the comparisons in the examples and demos.
    R> install.packages(c("akima", "plgp", "tgp"))
  • Load the package as you would any other R library.
    R> library(dynaTree)


See the package documentation. A PDF version of the reference manual (the help pages) is also available; the help pages can also be accessed from within R. The best way to acquaint yourself with the functionality of this package is to run the demos, which illustrate the examples contained in the papers referenced below. Try starting with:

R> help(package=dynaTree)
R> ?dynaTree    # follow the examples
R> demo(package="dynaTree")   # for a listing of the demos
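The online-learning capability mentioned above can be exercised through the package's `update()` method, which folds new observations into an existing fit by particle learning rather than refitting from scratch. The following sketch assumes the `update.dynaTree` method documented in the reference manual; the batching scheme is illustrative.

```r
## sketch: sequential (online) updating of a fitted dynamic tree
library(dynaTree)

## initial fit on a first batch of data
X1 <- matrix(runif(50, 0, 10), ncol = 1)
y1 <- sin(X1[, 1]) + rnorm(50, sd = 0.1)
fit <- dynaTree(X1, y1, N = 1000, model = "constant")

## stream in a second batch without restarting inference
X2 <- matrix(runif(50, 0, 10), ncol = 1)
y2 <- sin(X2[, 1]) + rnorm(50, sd = 0.1)
fit <- update(fit, X2, y2)
```

For truly massive data streams, the retirement/active-discarding features listed above can be combined with this updating loop; see the corresponding demos for details.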