The XGBoost algorithm doesn't build just one decision tree; it builds hundreds to thousands of trees, learning more about the relationship between the features and the output with each iteration. In recent years, threshold analysis has become synonymous with draft analytics. A running back prospect doesn't necessarily have to be the fastest or biggest player at his position; he just has to be fast enough and big enough to succeed at running back. This theory aligns with the objective of decision trees: finding the series of splits and interactions between features that best improves model accuracy. Setting monotonic constraints was also critical for making the results interpretable. Monotonicity ensures that the relationship between an input (e.g., 40-yard dash time) and the output (starter/Pro Bowler probability) follows a directional relationship aligned with our understanding of the data.