Question
We are given a sequence of jobs with processing times t1, . . . , tn. Each job has to be processed in a continuous time interval on one of 3 processors (not 1), and the intervals on each processor cannot overlap. We want to minimize the average finishing time. In the greedy algorithm, every time a processor becomes available, among the unprocessed jobs we assign the one with the smallest processing time to that processor. Either prove that this algorithm is optimal or give a counter-example showing that it does not always minimize the average of the finishing times.
Explanation / Answer
Yes, the given greedy approach (shortest processing time first) is optimal.
Proof:
All jobs are available at time 0, so there is no starvation and every job is eventually scheduled.
There are 3 processors, so the first three jobs each have a finishing time equal to their own processing time.
Case 1:
Assume that there are exactly three jobs with processing times t1 ≤ t2 ≤ t3.
Greedy assigns one job to each processor, so each job finishes at its own processing time and the average finishing time is (t1 + t2 + t3)/3.
No schedule can finish a job earlier than its own processing time, so no algorithm can do better. Hence the greedy approach is optimal in this case.
Case 2:
Assume that there are n jobs with processing times sorted in ascending order, t1 ≤ t2 ≤ … ≤ tn, and that n is a multiple of 3.
Then the first three jobs finish at times t1, t2, and t3 respectively.
The next three jobs finish at times t1+t4, t2+t5, and t3+t6.
Continuing this pattern, the sum of the finishing times on processor 1 is (n/3)t1 + ((n/3)−1)t4 + … + 1·t(n−2); on processor 2 it is (n/3)t2 + ((n/3)−1)t5 + … + 1·t(n−1); and on processor 3 it is (n/3)t3 + ((n/3)−1)t6 + … + 1·tn.
The total of all the finishing times is the sum of these three expressions. Equivalently, a job scheduled r-th on its processor is counted (n/3 − r + 1) times in the total: once for its own finishing time, and once inside the finishing time of every job that runs after it on the same processor.
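As a sanity check, the coefficient form of the total can be compared against a direct simulation of the greedy schedule. A minimal Python sketch; the function names are illustrative, not part of the question:

```python
import heapq

def greedy_total_finish(times, m=3):
    """Simulate SPT greedy on m processors; return the sum of finishing times."""
    free = [0] * m              # next-free time of each processor
    heapq.heapify(free)
    total = 0
    for t in sorted(times):     # shortest processing time first
        start = heapq.heappop(free)   # earliest-available processor
        heapq.heappush(free, start + t)
        total += start + t
    return total

def coefficient_total(times, m=3):
    """Closed form for n a multiple of m: the i-th shortest job (0-based)
    is counted n//m - i//m times in the sum of finishing times."""
    ts = sorted(times)
    n = len(ts)
    return sum((n // m - i // m) * t for i, t in enumerate(ts))

times = [1, 2, 3, 4, 5, 6]          # n = 6, a multiple of 3
print(greedy_total_finish(times))   # → 27
print(coefficient_total(times))     # → 27
```

Both computations agree: with three processors, the three shortest jobs are each counted twice and the three longest once, giving 2·(1+2+3) + 1·(4+5+6) = 27.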
In this total, the largest coefficient, n/3, multiplies the three shortest processing times, while the coefficient 1 multiplies the three largest.
This pairing minimizes the total. Suppose some schedule gives a shorter job a smaller coefficient than a longer job; swapping the two jobs changes the total by (c_small − c_large)·(t_long − t_short) < 0, so the swap strictly decreases it. Repeating such swaps shows the minimum is achieved exactly when larger coefficients go with shorter jobs, which is what greedy does (the rearrangement inequality, phrased as an exchange argument).
Since greedy minimizes the sum of the finishing times, it also minimizes their average.
Therefore, the approach is optimal in this case.
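The exchange argument can also be checked by brute force on one small instance: among all ways of pairing the coefficients with the jobs, the sorted pairing (largest coefficient with shortest job) achieves the minimum. A small Python sketch with an illustrative instance:

```python
from itertools import permutations

times = [1, 2, 3, 4, 5, 6]     # n = 6 jobs, sorted ascending
coeffs = [2, 2, 2, 1, 1, 1]    # 3 processors, 2 jobs each: the job run
                               # first on a processor is counted twice

# Minimum of sum(c * t) over every assignment of coefficients to jobs.
best = min(sum(c * t for c, t in zip(perm, times))
           for perm in set(permutations(coeffs)))

# Greedy's pairing: largest coefficients with the shortest jobs.
greedy = sum(c * t for c, t in zip(sorted(coeffs, reverse=True), sorted(times)))

print(best, greedy)   # → 27 27
```

No pairing beats the greedy one, matching the exchange argument.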
General Case:
Now drop the assumption that n is a multiple of 3 (and allow ties among the processing times). Greedy still picks the three shortest jobs first and proceeds in the same way. If a processor ends up running k jobs, those jobs receive coefficients k, k−1, …, 1; greedy keeps the job counts per processor as balanced as possible, which keeps the coefficients as small as possible, and it still pairs the largest coefficients with the shortest jobs. The same exchange argument then applies.
Therefore, the greedy approach is optimal in this case as well.
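For the general case, greedy can be compared against an exhaustive search over every possible assignment of jobs to the three processors (within one processor, running its jobs shortest-first is clearly best, so the search only sorts there). A minimal sketch, feasible only for small n since it tries 3^n assignments; function names are illustrative:

```python
import heapq
from itertools import product

def greedy_total(times, m=3):
    # SPT list scheduling: shortest remaining job to the earliest-free processor.
    free = [0] * m
    heapq.heapify(free)
    total = 0
    for t in sorted(times):
        start = heapq.heappop(free)
        heapq.heappush(free, start + t)
        total += start + t
    return total

def optimal_total(times, m=3):
    # Exhaustive search: every assignment of jobs to machines; within one
    # machine, shortest-first ordering is best, so just sort its jobs.
    best = float("inf")
    for assign in product(range(m), repeat=len(times)):
        total = 0
        for p in range(m):
            elapsed = 0
            for t in sorted(t for t, a in zip(times, assign) if a == p):
                elapsed += t
                total += elapsed
        best = min(best, total)
    return best

times = [2, 3, 5, 7, 11]            # n = 5, not a multiple of 3
print(greedy_total(times), optimal_total(times))   # → 33 33
```

On this instance (and any other small one you try), greedy matches the exhaustive optimum.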
Hence, the greedy algorithm is optimal in every case.