Prepare for the Society of Actuaries (SOA) PA Exam with our comprehensive quiz. Study with flashcards and multiple choice questions with explanations. Master key concepts and boost your confidence!

Each practice test/flash card set has 50 randomly selected questions from a bank of over 500. You'll get a new set of questions each time!



What typical values are used for the shrinkage parameter in boosting?

  1. 0.1 and 0.05

  2. 1.0 and 0.5

  3. 0.01 and 0.001

  4. 0.1 and 0.001

The correct answer is: 0.01 and 0.001

In boosting algorithms, the shrinkage parameter, also known as the learning rate, scales the contribution of each individual learner to the overall ensemble. Smaller values make the ensemble more resistant to overfitting because each new learner makes only a modest update to the current fit, so the model learns slowly. Typical values cited in the standard references are 0.01 or 0.001, with the right choice depending on the problem. The trade-off is that a very small shrinkage parameter requires a correspondingly larger number of boosting iterations (trees) to reach good performance.

Because small learning rates are the norm, the options that include values such as 0.5 or 1.0 do not reflect common practice. The choice of 0.01 and 0.001 therefore aligns with standard boosting practice: it slows the learning process enough to control overfitting while still allowing effective training.
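
As an illustrative sketch (not part of the exam material), the Python snippet below uses scikit-learn's GradientBoostingRegressor on a synthetic dataset to show the trade-off between the shrinkage parameter (learning_rate) and the number of trees (n_estimators). The dataset and the specific parameter values are assumptions chosen for the example, not prescribed settings.

```python
# Illustrative sketch (assumed setup): compare several shrinkage
# (learning rate) settings in gradient boosting.
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Synthetic regression data (assumed for illustration only).
X, y = make_regression(n_samples=2000, n_features=20, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A smaller shrinkage parameter takes more conservative steps,
# so it is paired with a larger number of boosting iterations.
for learning_rate, n_trees in [(0.1, 100), (0.01, 1000), (0.001, 5000)]:
    model = GradientBoostingRegressor(
        learning_rate=learning_rate,  # the shrinkage parameter
        n_estimators=n_trees,         # number of boosting iterations (trees)
        max_depth=2,                  # shallow trees, as is common in boosting
        random_state=0,
    )
    model.fit(X_train, y_train)
    mse = mean_squared_error(y_test, model.predict(X_test))
    print(f"learning_rate={learning_rate:<6} n_estimators={n_trees:<5} test MSE={mse:.1f}")
```

Running a comparison like this typically shows that the smaller learning rates can match or improve test error, but only when given enough trees, which is exactly the trade-off described above.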