Article Type: Research Article
Authors
Department of Information Technology, Faculty of Electrical and Computer Engineering, University of Sistan and Baluchestan, Zahedan, Iran
Abstract
Keywords
Article Title [English]
Authors [English]
One of the most common CPU scheduling algorithms in time-sharing systems is the round-robin algorithm. This algorithm assigns each process a time quantum, the maximum amount of time the process may hold the processor before being preempted, and the processor is allocated to the ready processes in rotation for at most one quantum at a time. The size of the quantum has a significant impact on the efficiency of the round-robin algorithm: if the quantum is too short, the number of context switches and their associated overhead increases, which lowers CPU utilization; if it is too large, the average response time of processes grows and round-robin scheduling becomes ineffective for interactive applications. The aim of this paper is to propose an effective method for determining a dynamic time quantum using machine learning. To this end, a training set is constructed whose features are the number of processes and the maximum, minimum, average, and median of their burst times, with the optimal time quantum as the class label. Machine learning classifiers are then trained on this set to predict the optimal time quantum for new samples. The experimental results show that the proposed method outperforms existing methods for determining the optimal time quantum with respect to the standard performance metrics of scheduling algorithms. For instance, compared with the genetic algorithm, which performs best among the existing methods, the proposed method improves the average waiting time by 12 milliseconds, the average number of context switches by 1.76, and the average turnaround time by about 2 milliseconds.
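As a rough illustration of the mechanism the abstract describes (not the authors' implementation), the sketch below simulates round-robin scheduling for a fixed time quantum, reports the metrics mentioned above (average waiting time, average turnaround time, number of context switches), and shows how workload features and an "optimal" quantum label could be paired to build the kind of training set described. The brute-force search over candidate quanta, the use of average waiting time as the optimality criterion, and the mention of a decision-tree classifier are assumptions made for this example.

```python
# Illustrative sketch only: round-robin simulation plus the feature/label
# pairing the abstract describes. Feature names, the candidate-quantum search,
# and the optimality criterion are assumptions made for this example.
from statistics import mean, median
from collections import deque


def round_robin(burst_times, quantum):
    """Simulate round-robin on CPU bursts that all arrive at time 0.

    Returns (avg_waiting_time, avg_turnaround_time, context_switches).
    """
    n = len(burst_times)
    remaining = list(burst_times)
    completion = [0] * n
    ready = deque(range(n))
    clock = 0
    switches = 0
    while ready:
        i = ready.popleft()
        run = min(quantum, remaining[i])   # run for one quantum or until done
        clock += run
        remaining[i] -= run
        if remaining[i] > 0:
            ready.append(i)                # preempted: back of the ready queue
        else:
            completion[i] = clock          # finished at the current clock time
        if ready:                          # another queued run follows this one
            switches += 1
    turnaround = [completion[i] for i in range(n)]          # arrival time is 0
    waiting = [turnaround[i] - burst_times[i] for i in range(n)]
    return mean(waiting), mean(turnaround), switches


def workload_features(burst_times):
    """Features named in the abstract: count, max, min, mean, median burst."""
    return [len(burst_times), max(burst_times), min(burst_times),
            mean(burst_times), median(burst_times)]


def best_quantum(burst_times, candidates=range(1, 51)):
    """Label a workload with the quantum that minimizes average waiting time
    (one plausible notion of 'optimal'; the paper's exact criterion may differ)."""
    return min(candidates, key=lambda q: round_robin(burst_times, q)[0])


if __name__ == "__main__":
    workloads = [[24, 3, 3], [10, 5, 8, 12], [6, 6, 6, 6, 30]]   # toy burst sets
    X = [workload_features(w) for w in workloads]
    y = [best_quantum(w) for w in workloads]
    print("training pairs:", list(zip(X, y)))
    # A classifier (e.g. sklearn.tree.DecisionTreeClassifier) could then be
    # fit on (X, y) and used to predict a quantum for an unseen workload.
```

With many such (features, optimal quantum) pairs generated from varied workloads, any standard classifier can be trained offline and queried at scheduling time to set the quantum dynamically, which is the idea the abstract attributes to the proposed method.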
Keywords [English]