This document proposes modifying the AdaBoost algorithm to produce a new boosting method that yields more accurate results in the same number of iterations. It hypothesizes that eliminating the last k runs of the weak learning algorithm, where k is less than the total number of runs t, will force the algorithm to reach higher accuracy sooner. The proposal plans to implement both the new method and standard AdaBoost in C, derive a formula for the optimal k as a function of the number of iterations and the choice of weak learner, test variations of k with the new method, and compare the accuracy of the two methods on the same training sets.