SMS scnews item created by Munir Hiabu at Thu 13 Feb 2020 1234
Type: Seminar
Modified: Thu 13 Feb 2020 1239; Mon 24 Feb 2020 1345; Tue 25 Feb 2020 1553
Distribution: World
Expiry: 5 Mar 2020
Calendar1: 26 Feb 2020 1600-1800
CalLoc1: New Law LT 104
CalTitle1: Contrast Trees and Distribution Boosting
Auth: munir@pmunir2.pc (assumed)

Statistics Seminar

Contrast Trees and Distribution Boosting

Jerome H. Friedman (Stanford University, Department of Statistics)

Wednesday 26 February, 4pm, New Law School Lecture Theatre 104


Schedule:
4pm: Talk
5pm: Drinks & Nibbles

RSVP
To secure your place, please register online (free): http://bit.ly/friedmanatusyd

Zoom (Registration not needed)
Join virtually via PC, Mac, Linux, iOS or Android: https://uni-sydney.zoom.us/j/845336313

Title: Contrast Trees and Distribution Boosting

Often machine learning methods are applied and results reported in cases where there is little or no information concerning the accuracy of the output. Simply because a computer program returns a result does not ensure its validity. If decisions are to be made based on such results, it is important to have some notion of their veracity. Contrast trees represent a new approach for assessing the accuracy of many types of machine learning estimates that are not amenable to standard (cross-)validation methods. They are easily interpreted and can be used as diagnostic tools to reveal and then understand the inaccuracies of models produced by any learning method. In situations where inaccuracies are detected, boosted contrast trees can often improve performance. A special case, distribution boosting, provides an assumption-free method for directly estimating the full conditional distribution of an outcome variable y for any given set of joint predictor variable values x.
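
For readers who would like a feel for the idea before the talk, the sketch below is a minimal illustration, not Friedman's actual algorithm: it performs a single greedy split on one predictor to locate the region where observed outcomes and a model's predictions disagree most, the basic building block that a contrast tree applies recursively. The function name, parameters and synthetic data are invented for illustration.

import numpy as np

def best_contrast_split(x, y, y_pred, min_leaf=20):
    # Greedy search for one threshold on a single predictor x that
    # maximises the contrast (absolute difference of means) between
    # observed outcomes y and model predictions y_pred inside a region.
    order = np.argsort(x)
    x, y, y_pred = x[order], y[order], y_pred[order]
    best_threshold, best_score = None, 0.0
    for i in range(min_leaf, len(x) - min_leaf):
        left_gap = abs(y[:i].mean() - y_pred[:i].mean())
        right_gap = abs(y[i:].mean() - y_pred[i:].mean())
        # Score a candidate split by the larger of the two region discrepancies.
        score = max(left_gap, right_gap)
        if score > best_score:
            best_threshold, best_score = x[i], score
    return best_threshold, best_score

# Toy example: a model that is systematically biased when x > 7.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 2000)
y = np.sin(x) + rng.normal(scale=0.3, size=x.size)
y_pred = np.sin(x) + 0.5 * (x > 7)
threshold, contrast = best_contrast_split(x, y, y_pred)
print(f"largest discrepancy found near x = {threshold:.2f} (gap = {contrast:.2f})")

A full contrast tree would apply such splits recursively over all predictors, with a more general discrepancy measure, to map out the regions of predictor space where a model is least trustworthy.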