A Smart DRL-Based Job Scheduler for Green Cloud Computing
The rapid growth of cloud computing has driven a significant increase in energy consumption, raising major environmental and economic concerns. Data centers consume approximately 7% of global electricity, a figure projected to reach 13% by 2030, and contribute roughly 2% of global emissions, comparable to the aviation industry. Existing scheduling frameworks suffer from high energy consumption, inefficient job scheduling, poor adaptability to dynamic environments, and difficulties in model optimization. These limitations motivate more efficient and adaptable solutions. This research proposes an intelligent energy-efficiency framework for cloud computing based on Deep Reinforcement Learning (DRL). The framework optimizes energy consumption while maintaining quality of service (QoS) by adjusting dynamically to real-time workloads. The proposed DRL model formulates job scheduling over state and action spaces that incorporate real-time job characteristics such as arrival rates, types, and deadlines, and uses decision-making phases and reward functions to keep scheduling effective and adaptable. The results show significant improvements over baseline frameworks: the DRL-based framework achieved up to a 70% reduction in energy consumption, a 75% improvement in response times, a 60% increase in resource utilization, and a 50% improvement in the rate of jobs completed within their QoS requirements.
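The formulation sketched above (state and action spaces built from job characteristics, and a reward that balances energy against QoS) can be illustrated with a minimal, self-contained example. Everything below, including the state fields, the toy energy model, the reward weights, and names such as `SchedulingEnv`, is a hypothetical illustration of this kind of environment, not the paper's actual implementation.

```python
import random
from dataclasses import dataclass

@dataclass
class Job:
    arrival_time: float
    job_type: int      # e.g. 0 = batch, 1 = interactive
    deadline: float    # absolute time by which the job should finish
    work: float        # abstract units of compute required

class SchedulingEnv:
    """Toy job-scheduling environment (hypothetical, for illustration).

    State  : per-server outstanding load + features of the job at the queue head.
    Action : index of the server to place the current job on.
    Reward : negative energy cost, plus a bonus if the job can meet its deadline.
    """

    def __init__(self, n_servers=4, energy_weight=1.0, qos_weight=2.0):
        self.n_servers = n_servers
        self.energy_weight = energy_weight   # assumed trade-off weights
        self.qos_weight = qos_weight
        self.reset()

    def reset(self):
        self.time = 0.0
        self.load = [0.0] * self.n_servers   # outstanding work per server
        self.job = self._next_job()
        return self._state()

    def _next_job(self):
        # Randomly generated job stream standing in for real arrival traces.
        return Job(arrival_time=self.time,
                   job_type=random.randint(0, 1),
                   deadline=self.time + random.uniform(5.0, 20.0),
                   work=random.uniform(1.0, 5.0))

    def _state(self):
        j = self.job
        return tuple(self.load) + (j.job_type, j.deadline - self.time, j.work)

    def step(self, action):
        """Assign the current job to server `action`; return (state, reward, done)."""
        finish = self.time + self.load[action] + self.job.work
        meets_deadline = finish <= self.job.deadline
        # Hypothetical energy model: cost grows when work piles onto a loaded server.
        energy_cost = self.job.work * (1.0 + self.load[action])
        reward = -self.energy_weight * energy_cost \
                 + (self.qos_weight if meets_deadline else -self.qos_weight)
        self.load[action] += self.job.work
        self.time += 1.0
        self.load = [max(0.0, l - 1.0) for l in self.load]  # servers drain work
        self.job = self._next_job()
        return self._state(), reward, False
```

A DRL agent such as a DQN would then learn a policy mapping these states to server choices; the relative reward weights encode the trade-off between energy savings and QoS, mirroring the framework's dual objective.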