Lasso And Ridge Regression For Optimized Resource Allocation In Cloud Computing

Harshala Shingne, Sountharrajan S, Karthiga M, Rajan C

Cloud computing is a recent technology that greatly helps users by providing on-demand services, thereby reducing investment and maintenance costs. Cloud resources are accumulated in data centers, which face the problem of optimally allocating resources to requesting users. Researchers have suggested several strategies based on greedy and machine learning algorithms to solve this issue. However, the allocation problem targeted by greedy algorithms is NP-hard in nature, whereas machine learning algorithms provide a near-optimal solution. Regression and classification algorithms have previously been applied to auction-based resource allocation. In this paper, Lasso and Ridge regression are applied to the DAS-2 dataset, which contains the resource requirements of users. Lasso and Ridge regression outperformed linear regression due to appropriate feature selection and the ability to resolve the multicollinearity issue, respectively. The results are evaluated by plotting residual and probability plots. The Root Mean Squared Error (RMSE) of Lasso and Ridge regression is found to be lower than that of linear regression. Both algorithms helped the Virtual Machines to be fully utilized in terms of storage and CPU.
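The comparison described above can be sketched with scikit-learn. This is a minimal illustration, not the paper's actual pipeline: synthetic regression data stands in for the DAS-2 workload trace, and the `alpha` penalty values are illustrative placeholders, not tuned hyperparameters from the study.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression, Lasso, Ridge
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Synthetic stand-in for the DAS-2 features (the real trace holds per-job
# resource requirements such as requested CPUs, memory, and runtime).
X, y = make_regression(n_samples=500, n_features=20, n_informative=5,
                       noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

models = {
    "linear": LinearRegression(),
    "lasso": Lasso(alpha=1.0),   # L1 penalty: drives weak coefficients to zero
    "ridge": Ridge(alpha=1.0),   # L2 penalty: shrinks correlated coefficients
}

rmse = {}
for name, model in models.items():
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    rmse[name] = float(np.sqrt(mean_squared_error(y_test, pred)))
    print(f"{name:6s} RMSE: {rmse[name]:.3f}")

# Lasso additionally performs feature selection: count zeroed coefficients.
zeroed = int(np.sum(models["lasso"].coef_ == 0))
print(f"lasso zeroed {zeroed} of {X.shape[1]} coefficients")
```

The L1 penalty in Lasso is what yields the feature selection the abstract refers to, while Ridge's L2 penalty stabilizes estimates when predictors are correlated (the multicollinearity case).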

Volume 12 | Issue 2

Pages: 1740-1747

DOI: 10.5373/JARDCS/V12I2/S20201215