The article explores the development and verification of mathematical models for computing systems with resource virtualization, built on queuing theory. It argues that existing models cannot adequately analyze resource distribution in virtualized environments and proposes new models based on closed queuing networks for optimizing resource allocation. The study emphasizes the role of adaptive models in improving the efficiency and reliability of computing systems in contemporary IT infrastructures.
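To make the closed-network approach concrete, the sketch below applies exact Mean Value Analysis (MVA), a standard solution technique for single-class closed queuing networks, to a hypothetical virtualized node. The station names, service demands, and population size are illustrative assumptions, not figures from the article.

```python
# Sketch: exact Mean Value Analysis (MVA) for a closed queuing network.
# Station names, service demands, and the population size below are
# illustrative assumptions, not values taken from the article.

def mva(demands, population, think_time=0.0):
    """Exact MVA for a single-class closed network of queueing stations.

    demands     -- service demand D_k (seconds) at each station
    population  -- number of circulating customers N
    think_time  -- optional think (delay) time Z outside the stations

    Returns (throughput X, per-station residence times R_k,
             per-station mean queue lengths L_k).
    """
    queue_lengths = [0.0] * len(demands)
    throughput = 0.0
    residence = [0.0] * len(demands)
    for n in range(1, population + 1):
        # Residence time at a queueing station: demand times (1 + queue
        # length an arriving customer sees), which in exact MVA is the
        # mean queue length with n - 1 customers in the network.
        residence = [d * (1.0 + l) for d, l in zip(demands, queue_lengths)]
        total_response = sum(residence)
        throughput = n / (think_time + total_response)       # Little's law
        queue_lengths = [throughput * r for r in residence]  # per-station Little's law
    return throughput, residence, queue_lengths


if __name__ == "__main__":
    # Hypothetical virtualized node: CPU, disk, and network stations,
    # with 20 virtual-machine requests circulating in the closed network.
    X, R, L = mva(demands=[0.05, 0.08, 0.03], population=20)
    print(f"throughput = {X:.2f} req/s")
    for name, r, l in zip(["cpu", "disk", "net"], R, L):
        print(f"{name}: residence = {r * 1000:.1f} ms, mean queue = {l:.2f}")
```

A closed network fits this setting because a virtualized system typically serves a fixed pool of circulating requests (e.g., a bounded set of virtual machines), so the per-station queue lengths computed here can indicate which resource becomes the bottleneck as the population grows.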