
User talk:Bgorven

From Wikipedia, the free encyclopedia

Statistical understanding of AdaBoost


Hello Bgorven,

On March 12, 2014, you made a substantial change to the AdaBoost article which, among other things, added the following:

Specifically, in the case where all weak learners are known a priori, AdaBoost corresponds to a single iteration of the backfitting algorithm in which the smoothing splines are the minimizers of [formula not reproduced here], that is: [formula not reproduced here] fits an exponential cost function and is linear with respect to the observation.

Is the infinity in the formula a typo? Also, do you have a source for this? I tried to find information connecting AdaBoost to backfitting, but could not find any.
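For context, here is a minimal sketch of the standard exponential-loss view of AdaBoost that the quoted passage appeals to, with the "weak learners known a priori" premise modeled as a fixed pool of decision stumps. The data and stump pool are hypothetical illustrations, not anything from the article or its derivation:

```python
import math

# Hypothetical toy data: an "interval" concept that no single stump can fit.
X = list(range(8))
y = [-1, -1, 1, 1, 1, 1, -1, -1]

# Weak learners fixed a priori: threshold stumps in both orientations.
def stump(t, s):
    return lambda x, t=t, s=s: s if x >= t else -s

learners = [stump(t, s) for t in range(8) for s in (1, -1)]

def exp_loss(terms):
    # Exponential cost sum_i exp(-y_i * F(x_i)) for F = sum_m alpha_m * h_m.
    return sum(math.exp(-yi * sum(a * h(xi) for a, h in terms))
               for xi, yi in zip(X, y))

n = len(X)
w = [1.0 / n] * n        # example weights, initially uniform
model = []               # list of (alpha, learner) pairs
losses = [exp_loss(model)]
for _ in range(3):
    # Weighted training error of every candidate stump.
    errs = [sum(wi for xi, yi, wi in zip(X, y, w) if h(xi) != yi)
            for h in learners]
    j = min(range(len(learners)), key=errs.__getitem__)
    eps = errs[j]
    alpha = 0.5 * math.log((1 - eps) / eps)   # optimal step for exp loss
    model.append((alpha, learners[j]))
    # Multiplicative reweighting of the examples, then renormalize.
    w = [wi * math.exp(-alpha * yi * learners[j](xi))
         for xi, yi, wi in zip(X, y, w)]
    Z = sum(w)
    w = [wi / Z for wi in w]
    losses.append(exp_loss(model))
```

Each round multiplies the exponential loss by 2*sqrt(eps*(1-eps)) <= 1, so `losses` is strictly decreasing whenever the best stump's weighted error is below 1/2; on this toy data three rounds classify every point correctly.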

I am going to add {{fact}} for now.

« D. Trebbien (talk) 19:32, 8 May 2016 (UTC)