This study examines how different split criteria behave in bagging ensembles of decision trees under classification noise, comparing traditional criteria (information gain, gain ratio, and the Gini index) with a new criterion based on imprecise probabilities. Experimental results indicate that the imprecise information gain (IIG) is more robust to noise than the other criteria. While IIG performs best as the noise level increases, the gain ratio also yields favorable results, motivating further research on predictive performance and computational cost.
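To illustrate the quantities being compared, the sketch below computes the classical entropy and Gini impurity from class counts, and an IDM-style upper (maximum) entropy of the kind typically used by imprecise-probability criteria such as IIG: the imprecise Dirichlet model widens each class probability to an interval, and the maximum entropy over that credal set is found by "water-filling" the extra mass s onto the smallest class counts. This is a minimal, hedged sketch, not the paper's exact implementation; the function names and the choice of s are illustrative assumptions.

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability vector."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def gini(probs):
    """Gini impurity of a probability vector."""
    return 1.0 - sum(p * p for p in probs)

def max_entropy_idm(counts, s=1.0):
    """Maximum entropy over the IDM credal set for class counts.

    Under the imprecise Dirichlet model each class probability lies in
    [n_i / (N + s), (n_i + s) / (N + s)].  The entropy-maximising
    distribution spreads the extra mass s over the smallest counts,
    raising them toward uniformity (water-filling).
    Illustrative sketch only; s = 1 is a common default choice.
    """
    levels = list(counts)          # counts to be raised toward uniformity
    total = sum(counts) + s
    mass = float(s)
    while mass > 1e-12:
        m = min(levels)
        idx = [j for j, v in enumerate(levels) if v == m]
        higher = min((v for v in levels if v > m), default=float("inf"))
        need = (higher - m) * len(idx)   # mass to lift the minima to the next level
        if need <= mass:
            for j in idx:
                levels[j] = higher
            mass -= need
        else:
            add = mass / len(idx)        # split remaining mass equally
            for j in idx:
                levels[j] += add
            mass = 0.0
    return entropy([v / total for v in levels])
```

In a split criterion, these impurities would be evaluated on the parent node and weighted over the children; the IIG-style criterion replaces the precise entropy with the upper entropy above, which penalises nodes whose class frequencies are poorly supported, one plausible source of its noise robustness.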