Research article

Bicriteria multi-machine scheduling with equal processing times subject to release dates

  • Received: 05 January 2023 Revised: 07 May 2023 Accepted: 10 May 2023 Published: 23 May 2023
  • This paper addresses the problem of scheduling n equal-processing-time jobs with release dates non-preemptively on identical machines to optimize two criteria simultaneously or hierarchically. For simultaneous optimization of total completion time (and makespan) and maximum cost, an algorithm is presented which can produce all Pareto-optimal points together with the corresponding schedules. The algorithm is then adapted to solve the hierarchical optimization of two min-max criteria, and the final schedule has a minimum total completion time and minimum makespan among the hierarchical optimal schedules. The two algorithms provided in this paper run in O(n³) time.

    Citation: Zhimeng Liu, Shuguang Li, Muhammad Ijaz Khan, Shaimaa A. M. Abdelmohsen, Sayed M. Eldin. Bicriteria multi-machine scheduling with equal processing times subject to release dates[J]. Networks and Heterogeneous Media, 2023, 18(3): 1378-1392. doi: 10.3934/nhm.2023060




    Parallel machine scheduling has received extensive attention since the 1950s, given the wide diversity of real-world systems it represents. A variety of criteria have been considered. Among the most studied are the makespan (maximum completion time) and the total completion time, which measure the effective utilization of the machines. A second group of criteria is related to meeting due dates and thus to serving the system's customers. If the criteria are not specified, we can consider two types of general objective functions: min-sum and min-max.

    In real production, decision makers may need to consider a number of criteria simultaneously before arriving at a decision. However, it is often the case that different criteria are in conflict. A solution which is optimal with respect to a given criterion might be a poor candidate for some other criterion. Thus, in the last two decades, multicriteria optimization approaches and techniques have been increasingly applied to provide solutions where the criteria are balanced in an acceptable and profitable way [1,2].

    An important subclass in multicriteria optimization is bicriteria optimization, where only two criteria, say γ1 and γ2, are considered. There are four popular approaches in the literature: (a) Positive combination optimization: find a schedule that minimizes a positive linear combination of γ1 and γ2. (b) Constrained optimization: find a schedule that minimizes γ2 subject to an upper bound on γ1. (c) Pareto optimization (also called simultaneous optimization): find all Pareto-optimal solutions for γ1 and γ2. A feasible schedule σ is (strict) Pareto-optimal for γ1 and γ2 if there is no feasible schedule σ′ such that γ1(σ′) ≤ γ1(σ) and γ2(σ′) ≤ γ2(σ), where at least one of the inequalities is strict. The objective vector (γ1(σ), γ2(σ)) of a Pareto-optimal schedule σ is called a Pareto-optimal point [1]. A feasible schedule σ is weakly Pareto-optimal for γ1 and γ2 if there is no feasible schedule σ′ such that γ1(σ′) < γ1(σ) and γ2(σ′) < γ2(σ). (d) Hierarchical optimization (also called lexicographical optimization): find a schedule that minimizes γ2 among the set of optimal schedules minimizing γ1. In hierarchical bicriteria scheduling problems, the two criteria have different levels of importance, so they are optimized in a lexicographic fashion. Such problems appear naturally when there are several optimal solutions with respect to a specific objective and the decision maker needs to select from among them the one with the best second objective.

    In this paper, we consider the bicriteria problem of scheduling equal-processing-time jobs with release dates non-preemptively on identical machines. We apply a Pareto optimization approach to minimize the total completion time (and makespan) and maximum cost simultaneously, and we apply a hierarchical optimization approach to minimize two general min-max criteria hierarchically.

    Formally speaking, we are given a set of n jobs, J = {J1, J2, …, Jn}, to be processed on m identical machines. The machines run in parallel, and each machine can process at most one job at a time. All jobs have the same processing time p > 0. Each job Jj ∈ J has a release date rj ≥ 0 before which it cannot be processed, as well as two cost functions fj(t) and gj(t), which denote the costs incurred if the job is completed at time t. We assume that all fj and gj are regular, i.e., non-decreasing in the job completion times [3].

    A schedule assigns each job Jj to exactly one machine and specifies its completion time Cj on the machine. Given a schedule σ, let fj(Cj(σ)) and gj(Cj(σ)) be the two scheduling costs of Jj. Then fmax(σ) = maxj fj(Cj(σ)) and gmax(σ) = maxj gj(Cj(σ)) are the two maximum costs of σ. Two important special cases of maximum cost are the makespan Cmax(σ) = maxj{Cj(σ)} and the maximum lateness Lmax(σ) = maxj{Cj(σ) − dj}, where dj denotes the due date of job Jj. We omit the argument σ when it is clear to which schedule we are referring.

    The first bicriteria problem we consider is to determine Pareto-optimal schedules which simultaneously minimize the total completion time ∑Cj (and the makespan Cmax) and the maximum cost fmax. Following the notation schemes of [1,2,3], it can be denoted by P|rj, pj=p|(∑Cj, fmax) (and P|rj, pj=p|(Cmax, fmax)).

    The second bicriteria problem we consider is to determine a lexicographical optimal schedule such that the secondary criterion gmax is minimized under the constraint that the primary criterion fmax is minimized. Following the notation schemes of [1,2,3], it can be denoted by P|rj,pj=p|Lex(fmax,gmax), where the criterion mentioned first in the argument of Lex is the more important one.

    For parallel machine scheduling that considers multiple criteria, please refer to [4,5,6] for the surveys. Examples of Pareto optimization and hierarchical optimization scheduling on parallel machines can be found in [7,8,9,10,11] and [12,13,14], respectively.

    Bruno et al. [15] proved that problem P||Lex(∑Cj, Cmax) (the jobs have unequal processing times and equal release dates) is NP-hard. Gupta et al. [16] strengthened this result by showing that P||Lex(∑Cj, Cmax) is strongly NP-hard. Hence, the Pareto optimization problem P||(∑Cj, fmax) is also strongly NP-hard. Since the single-criterion problem P||Cmax is strongly NP-hard [17], the lexicographical optimization problem P||Lex(fmax, gmax) is strongly NP-hard, too. Also, problem 1|rj|(∑Cj, fmax) (the single-machine case where the jobs have arbitrary processing times and release dates) is strongly NP-hard, due to the strong NP-hardness results of Lenstra et al. [18] for problems 1|rj|∑Cj and 1|rj|Lmax. Thus, we are interested in the special case where all jobs have equal processing times.

    Although the problem setting of equal-processing-time jobs appears simple, it captures important aspects of a wide range of applications. For example, in standardized systems in practice, the products consistently have the same processing times. In networking and information systems, transmission packets also often have a constant length [19]. Since products and data packets usually arrive dynamically, it is reasonable to consider the jobs with release dates.

    For single-criterion scheduling, Kravchenko and Werner [19,20] surveyed the approaches and exposed the problems with an open complexity status for scheduling jobs with equal processing times on parallel machines. Brucker and Shakhlevich [21] characterized optimal schedules for scheduling jobs with unit processing times on parallel machines by providing necessary and sufficient conditions of optimality. Hong et al. [22] studied the problem of scheduling jobs with equal processing times and eligibility restrictions on identical machines to minimize the total completion time. For the problem with a fixed number of machines, they provided a polynomial-time dynamic programming algorithm. For the problem with an arbitrary number of machines, they provided two polynomial-time approximation algorithms with approximation ratios of 3/5 and 1.4. Vakhania [23] studied the problem of scheduling jobs with equal processing times on identical machines to minimize the maximum delivery completion time, which is defined to be the time by which all jobs are delivered. He presented an algorithm which can be considered either pseudo-polynomial, with time complexity O(qmax·mn log n), or polynomial, with time complexity O(mκn), where qmax denotes the maximum delivery time of all jobs and κ < n is a parameter which is known only after the termination of the algorithm. The maximum cost minimization problem P|rj, pj=p, d̄j|fmax can be solved by the polynomial-time algorithm developed in [19] by Kravchenko and Werner, where d̄j denotes the deadline of job Jj, before which Jj must be completed in any feasible schedule. Vakhania and Werner [24] studied the problem of scheduling jobs with equal processing times on uniform machines (which process jobs at different speeds) to minimize the maximum delivery completion time. For this problem, whose complexity status has remained open for a long time, they presented an O(λm²n log n)-time algorithm which is optimal under an explicitly stated special condition, where λ can be either of the magnitudes n or qmax.

    Tuzikov et al. [25] studied the bicriteria problems of scheduling jobs with equal processing times on uniform machines, denoted as Q|pj=p|(fmax, gmax) and Q|pj=p|(∑fj, gmax), where fmax and gmax are two min-max criteria and ∑fj is a min-sum criterion. They showed that problems Q|pj=p|(fmax, gmax) and Q|pj=p|(∑fj, gmax) can be solved iteratively in O(n⁴) and O(n⁵) time, respectively. Note that they discussed only the case of equal release dates. In this paper, we apply the framework used in [25] and extend its results to deal with release dates. In fact, the main contribution of this paper is the incorporation of release dates and general maximum costs into the problem.

    Sarin and Prakash [26] studied the lexicographical optimization problem of scheduling jobs with equal processing times and equal release dates on identical machines for various pairwise combinations of primary and secondary criteria f and g, where f, g ∈ {Tmax, ∑Tj, ∑Uj, ∑Cj, ∑wjCj}. (Please refer to [3] for the definitions.) Apart from P|pj=p|Lex(∑Uj, ∑wjCj), whose computational complexity was left open in [26], all other problems P|pj=p|Lex(f, g) studied in [26] are solvable in polynomial time. Zhao and Yuan [27] revisited the bicriteria problems of scheduling jobs with equal processing times on uniform machines. They presented a comprehensive study of the problems with respect to various regular criteria. In particular, they obtained an O(n³)-time algorithm for P|pj=p|Lex(∑Uj, ∑wjCj), solving the open problem posed in [26].

    As for the parallel-machine case with release dates and equal processing times, Simons [28] proposed the first polynomial algorithm, running in O(n³ log log n) time, for P|rj, pj=p, d̄j|∑Cj. (The algorithm also solves the feasibility problem P|rj, pj=p, d̄j|−.) Simons and Warmuth [29] further improved this bound to O(mn²). For the same problem, Dürr and Hurand [30] and López-Ortiz and Quimper [31] gave algorithms that run in O(n³) and O(min{1, p/m}·n²) time, respectively. These schedules all minimize both objectives ∑Cj and Cmax. Since the maximum lateness Lmax is upper-bounded by np/m, Fahimi and Quimper [32] remarked that problems P|rj, pj=p|(∑Cj, Lmax) and P|rj, pj=p|(Cmax, Lmax) can be solved in polynomial time, with time complexity O(log(np/m)·min{1, p/m}·n²), by a binary search that calls the algorithm in [31] at most log(np/m) times. They also extended the algorithm presented in [31] for P|rj, pj=p, d̄j|∑Cj to solve a variant of the problem in which the number of machines fluctuates over time. They further proved that minimizing the total cost of the jobs, i.e., ∑fj(Sj), for arbitrary functions fj(t) is NP-hard, where Sj denotes the start time of job Jj. They then specialized this objective function to the case where it depends only on the time, and showed that although this case is pseudo-polynomially solvable, one can derive polynomial-time algorithms for either a monotonic or a periodic cost function.

    To the best of our knowledge, problems P|rj, pj=p|(∑Cj, fmax) and P|rj, pj=p|Lex(fmax, gmax) have not been studied to date. Note that here fmax and gmax are two general min-max criteria. The above-mentioned results [28,29,30,31,32] discussed only ∑Cj, Cmax or Lmax.

    In Section 2, we present an O(n³)-time algorithm for P|rj, pj=p|(∑Cj, fmax). The algorithm also solves problem P|rj, pj=p|(Cmax, fmax). Consequently, problem P|rj, pj=p|fmax can be solved in O(n³) time, which is of independent interest. In Section 3, we adapt the algorithm to solve P|rj, pj=p|Lex(fmax, gmax) in O(n³) time. The final generated schedule also has the minimum total completion time and the minimum makespan among the lexicographical optimal schedules for fmax and gmax. Finally, we draw some concluding remarks in the last section.

    In this section we present an O(n³)-time algorithm for P|rj, pj=p|(∑Cj, fmax). As a by-product, the last schedule constructed by the algorithm is optimal for P|rj, pj=p|fmax.

    For ease of discussion, throughout the paper we represent a feasible schedule σ by a sequence Jσ(1), Jσ(2), …, Jσ(n) of jobs, where Jσ(i) is the job scheduled at the i-th position in σ, i = 1, 2, …, n. The positions in σ are indexed from 1 to n in non-decreasing order of their start times in σ (ties broken in favor of the job on the machine with the smallest index).

    Intuitively, if a job incurs a large cost when it completes late, then we need to move it to the left (i.e., start it earlier) to decrease its cost, even if it has a large release date. Consequently, in a feasible schedule some jobs with larger release dates may start earlier than jobs with smaller release dates.

    To recover a schedule from its job sequence representation, we need the following lemma whose proof can be found in [28]. Though simple, this lemma plays a non-negligible role in our algorithms. It allows us to focus on the positions of the jobs in a schedule; we do not worry about their release dates.

    Let Sσ(i) denote the start time of Jσ(i) in σ = (Jσ(1), Jσ(2), …, Jσ(n)), i = 1, 2, …, n.

    Lemma 2.1. ([28]) For any feasible schedule, a solution σ identical except in machine assignment exists and is cyclic, i.e., for any i, the jobs Jσ(i), Jσ(i+m), Jσ(i+2m), … are scheduled on the same machine. Moreover, Sσ(1) = rσ(1), Sσ(i) = max{Sσ(i−1), rσ(i)} for i = 2, 3, …, m, and Sσ(i) = max{Sσ(i−1), rσ(i), Sσ(i−m) + p} for i = m+1, m+2, …, n.
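    As a sketch, the recursion of Lemma 2.1 translates directly into code. The function below (the name `start_times` is ours) takes the release dates listed in sequence order and returns the earliest start time of each position.

```python
def start_times(releases, m, p):
    """Earliest start times of a job sequence on m identical machines
    with equal processing time p, by the recursion of Lemma 2.1.
    releases[i] is the release date of the job at (0-based) position i."""
    S = []
    for i, r in enumerate(releases):
        s = r
        if i >= 1:
            s = max(s, S[i - 1])       # positions start in non-decreasing order
        if i >= m:
            s = max(s, S[i - m] + p)   # machines are reused cyclically
        S.append(s)
    return S
```

    For the six-job instance used later in this section (releases 0, 0, 1, 2, 3, 3 with m = 3 and p = 4), the computed start times are 0, 0, 1, 4, 4, 5.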

    The ε-constraint method (see, e.g., [1,2]) provides a general way to find Pareto-optimal points: let y be the optimal value of the constrained optimization problem α|f ≤ x̂|g, and let x be the optimal value of the constrained optimization problem α|g ≤ y|f. Then (x, y) is a Pareto-optimal point for problem α||(f, g).

    The algorithm follows the framework used in [25], which repeatedly uses the ε-constraint method to construct the Pareto-optimal schedules. Please see Figure 1 for an illustration. Similar figures and illustrations can be found, e.g., in [25,33]. All circles (white solid, black solid and white dashed) in Figure 1 represent weakly Pareto-optimal points (schedules), but only the black solid circles represent Pareto-optimal points (schedules). The weakly Pareto-optimal schedules (σ1, σ2, σ3, …, which are generated in turn by Algorithm M1) are constructed in strictly decreasing order of their fmax-values, and within that order in non-decreasing order of their ∑Cj-values. Since the constraint fmax < y is used instead of fmax ≤ y, all white dashed circles are ignored by Algorithm M1. The Pareto-optimal schedules output by Algorithm M1 are π1, π2, π3, …. The last Pareto-optimal schedule has the minimum fmax-value.

    Figure 1.  Illustration of Algorithm M1.

    Let Ω(J) denote the Pareto set, which consists of all Pareto-optimal points together with their corresponding Pareto-optimal schedules. Let Π(J) denote the set of all feasible schedules for J. Let Π(J, y) ⊆ Π(J) denote the set of schedules with maximum cost (fmax-value) less than y, where y denotes a given threshold value. Obviously, Π(J, +∞) = Π(J).

    Below is the algorithm, called Algorithm M1, for constructing the Pareto set Ω(J) for P|rj, pj=p|(∑Cj, fmax). It first assigns the unassigned job with the largest release date to the i-th position (i = n, n−1, …, 1), ignoring the scheduling cost fmax (see the initial schedule σ̂ below). It then repeatedly decreases the fmax-value of the current schedule until the cost cannot be improved further. During this process, all Pareto-optimal schedules are constructed one by one.

    The initial schedule is σ̂ = {Jσ̂(1), Jσ̂(2), …, Jσ̂(n)}, where the jobs Jσ̂(1), Jσ̂(2), …, Jσ̂(n) are in non-decreasing order of their release dates (ties broken arbitrarily). Note that this order is also the non-decreasing order of their start times in σ̂ (ties broken in favor of the job on the machine with the smallest index). It is easy to see that σ̂ is optimal for P|rj, pj=p|∑Cj and P|rj, pj=p|Cmax. (Set the start times of the jobs in σ̂ by Lemma 2.1.)

    The basic idea of our algorithms is as follows: schedule the jobs backwardly (from right to left), and at each decision point always select the job with the largest release date from among the candidate jobs (see the choice of σ̂ in Step 1 of Algorithm M1 and Step 3 of Procedure A1(·), as well as the choice of π in Step 1 of Algorithm M2 and Step 3 of Procedure A2(·)). To match this idea, we treat the initial schedule σ̂ = {Jσ̂(1), Jσ̂(2), …, Jσ̂(n)} as a schedule in which the jobs Jσ̂(n), Jσ̂(n−1), …, Jσ̂(1) are in non-increasing order of their release dates.

    Algorithm M1: Input: An instance of P|rj,pj=p|(nj=1Cj,fmax).

    Output: The Pareto set Ω(J).

    Step 1. Initially, set s = 1. Let σs = {Jσs(1), Jσs(2), …, Jσs(n)} = σ̂ and ys = fmax(σs). Let Ω(J) = ∅, k = 0.

    Step 2. Run Procedure A1(ys) to get a schedule σs+1, using σs as the input schedule.

    Step 3. If σs+1 ≠ ∅, then do the following:

    (i) Set ys+1=fmax(σs+1).

    (ii) If ∑Cj(σs) < ∑Cj(σs+1), then set k = k+1 and πk = σs. Incorporate (∑Cj(πk), fmax(πk), πk) into Ω(J).

    (iii) Set s=s+1. Go to Step 2.

    Step 4. If σs+1 = ∅, then set k = k+1 and πk = σs. Incorporate (∑Cj(πk), fmax(πk), πk) into Ω(J) and return Ω(J).
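    To make the control flow of Steps 1–4 concrete, here is a minimal sketch in which a brute-force search over all job sequences stands in for Procedure A1 (so it is usable on tiny instances only; the function names and 0-based job indices are ours, and the paper's O(n³) procedure is what a real implementation would use instead).

```python
from itertools import permutations

def schedule(seq, releases, m, p):
    """Completion times of a job sequence, with start times set by Lemma 2.1."""
    S = []
    for i, j in enumerate(seq):
        s = releases[j]
        if i >= 1:
            s = max(s, S[i - 1])
        if i >= m:
            s = max(s, S[i - m] + p)
        S.append(s)
    return [s + p for s in S]

def algorithm_m1(releases, cost, m, p):
    """The epsilon-constraint loop of Algorithm M1; cost(j, c) is f_j(c).
    Returns the list of Pareto-optimal points (sum Cj, fmax)."""
    n, y = len(releases), float("inf")
    best, pareto = None, []
    while True:
        cand = None  # lexicographically smallest (sum Cj, fmax) with fmax < y
        for seq in permutations(range(n)):
            C = schedule(seq, releases, m, p)
            fmax = max(cost(j, c) for j, c in zip(seq, C))
            if fmax < y and (cand is None or (sum(C), fmax) < cand):
                cand = (sum(C), fmax)
        if cand is None:                 # Pi(J, y) is empty: Step 4
            if best is not None:
                pareto.append(best)
            return pareto
        if best is not None and best[0] < cand[0]:
            pareto.append(best)          # Step 3(ii): record a Pareto-optimal point
        best, y = cand, cand[1]          # Steps 3(i) and 3(iii)
```

    On the six-job example given later in this section, this sketch returns the single Pareto-optimal point (38, 7).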

    Procedure A1(ys):

    Input: Schedule σs = {Jσs(1), Jσs(2), …, Jσs(n)} with ys = fmax(σs).

    Output: Schedule σs+1 = {Jσs+1(1), Jσs+1(2), …, Jσs+1(n)}, which has the minimum total completion time (and the minimum makespan) among all schedules in Π(J, ys).

    Step 1. Initially, set h = 0. Let σh = {Jσh(1), Jσh(2), …, Jσh(n)}, where σh(i) = σs(i), i = 1, 2, …, n.

    Step 2. Set the start times of the jobs in σh (by Lemma 2.1): Sσh(1) = rσh(1); for i = 2, 3, …, m, let Sσh(i) = max{Sσh(i−1), rσh(i)}; for i = m+1, m+2, …, n, let Sσh(i) = max{Sσh(i−1), rσh(i), Sσh(i−m) + p}. Update the cost fj(Cj(σh)) of each job Jj ∈ J in σh.

    Step 3. Adjust σh:

    IF for all i, the inequality fσh(i)(Cσh(i))<ys holds, THEN Return σs+1=σh.

    ELSE Pick a job Jσh(i) such that fσh(i)(Cσh(i)) ≥ ys. Let E(σh(i)) = {Jσh(l) | 1 ≤ l < i, fσh(l)(Cσh(i)) < ys} denote the set of the candidate jobs at time Cσh(i).

       IF E(σh(i)) = ∅, THEN Return σs+1 = ∅.

       ELSE Find the job with the largest release date in E(σh(i)), say Jσh(e). Let Jσh(e) be scheduled at the i-th position instead of Jσh(i). Set Jx = Jσh(i). For q = i−1, i−2, …, e+1 (this ordering is used crucially), let Jσh(q) be the job in {Jσh(q), Jx} with the larger release date, and let Jx be the other job. Finally, let Jx be scheduled at the e-th position.

    Let σh+1=σh and then set h=h+1. Go to Step 2.

    Remark 1. Let us illustrate Step 3 of Procedure A1(ys) in more detail. Suppose that we find a job Jσh(i) violating its inequality. We remove Jσh(i) from position i and select the candidate job Jσh(e) from E(σh(i)) which has the largest release date. Let Jσh(e) be scheduled at position i. Treat Jσh(i) as the job in hand, denoted by Jx, for which we need to find a suitable position. We compare Jx and Jσh(i−1), and let the one with the larger release date be scheduled at position i−1. Let Jx be the other job. As will be seen in Lemma 2.2 below, the job now at position i−1 has a release date not less than the largest release date among the candidate jobs in E(σh(i−1)). Its cost at time Cσh(i−1) may be no less than ys. However, this possibility is not a concern, since the job can be moved further to the left in subsequent iterations. We continue by comparing Jx and Jσh(i−2), and so on. In this way, we find suitable positions for jobs Jσh(i), Jσh(i−1), …, Jσh(e+1). Thus, we realize the idea mentioned before: schedule the jobs backwardly and at each decision point always select the job with the largest release date from among the candidate jobs.
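    The repositioning pass described in this remark can be sketched as follows (0-based positions; the function name and variables are ours): the candidate at position e replaces the violating job at position i, and the displaced job cascades leftward, each comparison keeping the job with the larger release date at the rightmost of the two positions.

```python
def adjust(seq, releases, i, e):
    """One inequality-violation adjustment of Step 3 (0-based sketch).
    seq[q] is the index of the job at position q; releases[j] is r_j."""
    seq = seq[:]
    x = seq[i]        # displaced violating job (the "job in hand")
    seq[i] = seq[e]   # candidate with the largest release date takes position i
    for q in range(i - 1, e, -1):
        if releases[seq[q]] < releases[x]:
            seq[q], x = x, seq[q]   # keep the larger release date at position q
    seq[e] = x        # the leftover job fills the vacated position e
    return seq
```

    On the example instance below (jobs J1, …, J6 as indices 0–5 with releases 0, 0, 1, 2, 3, 3), displacing the job at position 5 in favor of the candidate at position 3 turns the sequence (J1, J2, J3, J4, J5, J6) into (J1, J2, J3, J6, J5, J4), matching the first adjustment of the worked example.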

    Example: We now demonstrate an example to illustrate Algorithm M1. There are three machines and six jobs J1, J2, …, J6 with equal processing times p = 4, where r1 = r2 = 0, r3 = 1, r4 = 2, r5 = r6 = 3, and d1 = d2 = 4, d3 = 3, d4 = 2, d5 = d6 = 1. We take fj(t) = t − dj, so that fmax = Lmax. Algorithm M1 works as follows:

    (1) σ1 = σ̂ = {J1, J2, J3, J4, J5, J6}: the jobs J1, J2, …, J6 are in non-decreasing order of their release dates. The schedule is recovered by Lemma 2.1, with ∑Cj(σ1) = 38 and Lmax(σ1) = 8.

    (2) Run Procedure A1(y1), where y1=8.

    Initially, the current schedule is σ1 = {J1, J2, J3, J4, J5, J6}. The job violating the inequality is J6. The set of the candidate jobs at time C6 is {J1, J2, J3, J4}. Since J4 has the largest release date among these candidates, it is scheduled at the sixth position instead of J6. Job J6 becomes the job in hand. We compare J6 and J5. Since r5 ≥ r6, J5 stays at the fifth position. We continue with the fourth position, which is no longer occupied because J4 has been moved from it to the sixth position. Therefore, J6 is scheduled at the fourth position. Setting the start times of the jobs by Lemma 2.1, we get the schedule {J1, J2, J3, J6, J5, J4} with ∑Cj = 38 and Lmax = 7. Since its Lmax-value 7 is smaller than y1 = 8, Procedure A1(y1) returns σ2 = {J1, J2, J3, J6, J5, J4} with ∑Cj(σ2) = 38 and Lmax(σ2) = 7. Since ∑Cj(σ1) = 38 = ∑Cj(σ2), by Step 3 of Algorithm M1, we discard σ1, since it cannot be a Pareto-optimal schedule.

    (3) Run Procedure A1(y2), where y2=7.

    (i) Initially, the current schedule is σ2 = {J1, J2, J3, J6, J5, J4}. The jobs violating the inequalities are J4, J5 and J6. The set of the candidate jobs at time C4 is {J1, J2, J3}. Since J3 has the largest release date among these candidates, it is scheduled at the sixth position instead of J4. Job J4 becomes the job in hand. We compare J4 and J5, then J4 and J6, and finally schedule J4 at the third position. Setting the start times of the jobs by Lemma 2.1, we get the schedule {J1, J2, J4, J6, J5, J3} with ∑Cj = 40 and Lmax = 7.

    (ii) Now the jobs violating the inequalities are J3, J5 and J6. We select J3 as the job in hand and adjust the schedule, obtaining {J1, J3, J4, J6, J5, J2} with ∑Cj = 42 and Lmax = 8.

    (iii) Since y2 = 7, the jobs violating the inequalities are J5 and J6. We select J5 as the job in hand and adjust the schedule, obtaining {J1, J4, J5, J6, J3, J2} with ∑Cj = 46 and Lmax = 7.

    (iv) Since y2 = 7, the jobs violating the inequalities are J2, J3 and J6. We pick the violating job J2 at the sixth position. Since the set of the candidate jobs at time C2 is empty, Procedure A1(y2) returns σ3 = ∅, implying that Π(J, y2) = ∅.

    By Step 4 of Algorithm M1, we get π1 = σ2 = {J1, J2, J3, J6, J5, J4} with ∑Cj(π1) = 38 and Lmax(π1) = 7. Finally, Algorithm M1 returns the Pareto set Ω(J) = {(38, 7, π1)}.
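    The values in this example can be checked mechanically. The sketch below (our names; jobs indexed 1–6) sets the start times by the recursion of Lemma 2.1 and evaluates ∑Cj and Lmax for the initial schedule σ1 and the returned schedule π1.

```python
# Worked example: m = 3 machines, equal processing times p = 4.
releases = {1: 0, 2: 0, 3: 1, 4: 2, 5: 3, 6: 3}
dues     = {1: 4, 2: 4, 3: 3, 4: 2, 5: 1, 6: 1}
m, p = 3, 4

def evaluate(seq):
    """Total completion time and maximum lateness of a job sequence,
    with start times set by the recursion of Lemma 2.1."""
    S = []
    for i, j in enumerate(seq):
        s = releases[j]
        if i >= 1:
            s = max(s, S[i - 1])
        if i >= m:
            s = max(s, S[i - m] + p)
        S.append(s)
    C = [s + p for s in S]
    return sum(C), max(c - dues[j] for c, j in zip(C, seq))

print(evaluate([1, 2, 3, 4, 5, 6]))  # initial schedule sigma_1
print(evaluate([1, 2, 3, 6, 5, 4]))  # Pareto-optimal schedule pi_1
```

    Both calls reproduce the values above: (38, 8) for σ1 and (38, 7) for π1.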

    Step 1 of Procedure A1(ys) can be implemented in O(n) time. Steps 2 and 3 can be implemented in O(n) time per iteration. Here, an iteration refers to one inequality-violation adjustment. In each iteration, there is a job which has to be moved to the left because of the inequality violation. Later, by Lemma 3.1, we will see that this job can never be moved back again. Hence, since there are n jobs and each job passes through at most n−1 positions (from the rightmost to the leftmost), the total number of iterations is O(n²). The running time of Procedure A1(ys) is therefore O(n³).

    The running time of Algorithm M1 is O(n³): although it is not clear how many times Algorithm M1 calls Procedure A1(·), the total number of iterations over all calls of Procedure A1(·) is still O(n²), and each iteration can be done in O(n) time.

    In Lemma 2.2 below, we prove the following: (1) Algorithm M1 schedules the jobs backwardly and always selects the job with the largest release date from among the candidate jobs; (2) in the course of the algorithm, no job moved to the left is ever moved back again; (3) at each iteration, Procedure A1(ys) constructs a schedule in which the i-th job, counting from the left, starts no later than the i-th job in any feasible schedule in Π(J, ys), i = 1, 2, …, n.

    Therefore, the schedule obtained at each iteration of Procedure A1(ys) is no worse than any feasible schedule in Π(J, ys) with respect to the objectives ∑Cj and Cmax. Hence, the final schedule (i.e., σs+1, if it exists) obtained by Procedure A1(ys) at the last iteration is in Π(J, ys) and is optimal for ∑Cj and Cmax.

    Lemma 2.2. Let σh = (Jσh(1), Jσh(2), …, Jσh(n)) be the schedule obtained at iteration h (h = 0, 1, …) of Procedure A1(ys). Let σ = (Jσ(1), Jσ(2), …, Jσ(n)) be any schedule in Π(J, ys). Then, for i = 1, 2, …, n, we have the following: (1) rσh(i) ≥ max{rj | Jj ∈ E(σh(i))}; (2) Sσh(i) ≤ Sσ(i) (and thus Cσh(i) ≤ Cσ(i)); (3) Cσh(i) ≤ Cσh+1(i).

    Proof. We prove the lemma by induction on s and h.

    First, we consider the input schedule for Procedure A1(y1), which is the initial schedule σ̂. Property (1) of the lemma clearly holds. We now prove property (2) for the base case h = 0 of the call of Procedure A1(y1). Let σ be any schedule in Π(J, y1). We compare σ̂ and σ backwardly (from right to left), looking for a difference between the jobs. Suppose that the first difference occurs at the k-th position, which is occupied by jobs Ja and Jb in σ̂ and σ, respectively. By the construction of σ̂, we know that ra ≥ rb. Since Ja is processed earlier than Jb in σ, we can safely interchange Ja and Jb in σ (regardless of whether Ja can be scheduled at the k-th position in σ), without increasing the job start or completion time at any position. Repetition of this argument shows that σ can be safely transformed into σ̂. Thus, for i = 1, 2, …, n, we have Sσ̂(i) ≤ Sσ(i), proving property (2) for the base case h = 0 of the call of Procedure A1(y1). Hence, the lemma holds for the 0-th iteration of Procedure A1(y1).

    Assume that the lemma holds for the first h iterations of Procedure A1(y1). We now consider the (h+1)-th iteration. More precisely, we observe σh+1 at the moment that it is being constructed to adjust σh during Step 3 of Procedure A1(y1), but the next round of Step 2 has not been executed yet. That is, σh+1 is obtained from σh by performing an inequality violation adjustment, but the completion times and the costs of the jobs in σh+1 have not been updated yet.

    As described in Step 3 of Procedure A1(y1), for adjusting σh, we pick a job Jσh(i) in σh such that fσh(i)(Cσh(i)) ≥ ys, and find a job Jσh(e) which has the largest release date in E(σh(i)). Let Jσh(e) be scheduled at the i-th position instead of Jσh(i). Clearly, rσh(e) = max{rj | Jj ∈ E(σh(i))}. By the inductive assumption, if we do not consider Jσh(i), then we have rσh(i−1) ≥ max{rj | Jj ∈ E(σh(i−1))}. By comparing Jx = Jσh(i) and Jσh(i−1), letting the one with the larger release date be scheduled at position i−1, and letting Jx be the other job, we ensure that rσh(i−1) ≥ max{rj | Jj ∈ E(σh(i−1))} after Jσh(i) has been taken into consideration. We continue to deal with Jx and Jσh(i−2), and so on, as described in Step 3. This proves property (1) for the (h+1)-th iteration of Procedure A1(y1).

    To prove property (2) for the (h+1)-th iteration of Procedure A1(y1), we compare σh+1 and σ backwardly, looking for a difference between the jobs. Suppose that the first difference occurs at the k-th position, which is occupied by jobs Ja and Jb in σh+1 and σ, respectively. By the inductive assumption, Cσh(k) ≤ Cσ(k), and thus fb(Cσh(k)) ≤ fb(Cσ(k)) < y1 (testing the feasibility of job Jb when it is completed at Cσh(k)), which means that job Jb is also a candidate job at time Cσh(k). By the rule of selecting a candidate job in favor of the largest release date (which has just been proved), we know that ra ≥ rb. Since Ja is processed earlier than Jb in σ, we can safely interchange Ja and Jb in σ without increasing the job start or completion time at any position. Repetition of this argument shows that σ can be safely transformed into σh+1 without increasing the job start or completion time at any position. Thus, for i = 1, 2, …, n, we have Sσh+1(i) ≤ Sσ(i), proving property (2) of the lemma for the (h+1)-th iteration of Procedure A1(y1).

    Finally, we observe σh+1 at the moment when it has been obtained and the next round of Step 2 has already been executed, that is, the moment when the completion times and the costs of the jobs in σh+1 have been updated. Clearly, for e+1 ≤ l ≤ i, we have rσh(l) ≥ rσh(e). (Otherwise, since Jσh(e) is a candidate job at time Cσh(l), by the rule of selecting a candidate job in favor of the largest release date, Jσh(e) should have been scheduled at the l-th position instead of Jσh(l).) After scheduling jobs Jσh(i), Jσh(i−1), …, Jσh(e) at their new positions, only the release date at the i-th position may become smaller; the release dates at all other positions remain unchanged or become larger. Since Jσh(1), Jσh(2), …, Jσh(n) are processed in non-decreasing order of their start times in σh, by Lemma 2.1, during the adjustment of σh none of Cσh(1), Cσh(2), …, Cσh(n) can decrease. It follows that Cσh(i) ≤ Cσh+1(i) for all i. This proves property (3) for the (h+1)-th iteration of Procedure A1(y1). Note that property (1) still holds, since the increased completion times can only shrink the sets of candidate jobs, and thus make property (1) easier to satisfy. Property (2) also holds, since setting the start times of a given job sequence as described in Lemma 2.1 makes every start time as early as possible.

    Summarizing the above, we have proved the lemma for Procedure A1($y_1$). Assume that the lemma holds for Procedures A1($y_1$), A1($y_2$), $\ldots$, A1($y_s$). We now consider Procedure A1($y_{s+1}$). Let $\sigma=(J_{\sigma(1)},J_{\sigma(2)},\ldots,J_{\sigma(n)})$ be any schedule in $\Pi(J,y_{s+1})$. Since $y_{s+1}<y_s$, we have $\Pi(J,y_{s+1})\subseteq\Pi(J,y_s)$. The input schedule for Procedure A1($y_{s+1}$) is just the output schedule of Procedure A1($y_s$). By the inductive assumption, the lemma holds for this schedule and $\sigma$. Assume that the lemma holds for the first $h$ iterations of Procedure A1($y_{s+1}$). We now consider $\sigma^{h+1}$ and $\sigma$. In almost the same manner as described above, we can prove that the lemma holds for the $(h+1)$-th iteration of Procedure A1($y_{s+1}$).

    By the principle of induction, we complete the proof.

    We get the following:

    Lemma 2.3. Let $\sigma^{last}$ (i.e., $\sigma^{s+1}$) be the schedule obtained at the last iteration of Procedure A1($y_s$). If $\sigma^{last}=\emptyset$, then $\Pi(J,y_s)=\emptyset$; otherwise, $\sigma^{last}$ is a schedule which has the minimum total completion time (and minimum makespan) among all schedules in $\Pi(J,y_s)$.

    Proof. If $\sigma^{last}=\emptyset$, then by Step 3 of Procedure A1($y_s$), at the last iteration there is a job $J_{\sigma^h(i)}$ such that $f_{\sigma^h(i)}(C_{\sigma^h}(i))\ge y_s$ and $E(\sigma^h(i))=\emptyset$, where $E(\sigma^h(i))=\{l \mid 1\le l\le i,\ f_{\sigma^h(l)}(C_{\sigma^h}(i))<y_s\}$ denotes the set of the candidate jobs at time $C_{\sigma^h}(i)$. Therefore, the first $i$ jobs in $\sigma^{last-1}$ can only be scheduled at the first $i$ positions in any schedule in $\Pi(J,y_s)$, but none of them can be scheduled at the $i$-th position. This contradiction tells us that $\Pi(J,y_s)=\emptyset$.

    If $\sigma^{last}\ne\emptyset$, then by Lemma 2.2 we have $C_{\sigma^{last}}(i)\le C_{\sigma}(i)$, $i=1,2,\ldots,n$, where $\sigma=(J_{\sigma(1)},J_{\sigma(2)},\ldots,J_{\sigma(n)})$ denotes any schedule in $\Pi(J,y_s)$. Hence, $\sigma^{last}$ is a schedule which has the minimum total completion time (and minimum makespan) among all schedules in $\Pi(J,y_s)$.

    Algorithm M1 applies the ε-constraint method of Pareto optimization. The following theorem establishes its correctness; the proof is based on Lemma 2.3 and is omitted since it is standard and implicit in Figure 1 and, e.g., [1,25].

    Theorem 2.4. Algorithm M1 constructs all Pareto-optimal points together with the corresponding Pareto-optimal schedules for $P|r_j,p_j=p|(\sum_{j=1}^{n}C_j,f_{\max})$ in $O(n^3)$ time. Consequently, problem $P|r_j,p_j=p|f_{\max}$ can also be solved in $O(n^3)$ time.

    Moreover, by Lemma 2.3, Algorithm M1 also solves $P|r_j,p_j=p|(C_{\max},f_{\max})$ in $O(n^3)$ time. Only the obtained Pareto-optimal points need to be changed accordingly; the Pareto-optimal schedules for $C_{\max}$ and $f_{\max}$ are the same as those for $\sum_{j=1}^{n}C_j$ and $f_{\max}$.
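    The ε-constraint loop that Algorithm M1 implements can be sketched as follows. This is an illustrative Python sketch, not the authors' implementation: `procedure_A1` is a hypothetical stand-in for Procedure A1($y$), assumed to return a schedule minimizing the total completion time among those with $f_{\max}$-value below the threshold $y$, or `None` when $\Pi(J,y)=\emptyset$.

```python
def pareto_frontier(jobs, procedure_A1, f_max, total_completion):
    """Illustrative epsilon-constraint loop: repeatedly tighten the bound y
    on the maximum cost and re-optimize the total completion time.

    procedure_A1(jobs, y, warm_start) is a hypothetical interface assumed to
    return a schedule with f_max-value < y minimizing total completion time,
    or None if no such schedule exists.
    """
    points = []
    y = float("inf")          # initially no bound on the maximum cost
    sigma = None
    while True:
        nxt = procedure_A1(jobs, y, sigma)
        if nxt is None:       # Pi(J, y) is empty: the frontier is complete
            return points
        sigma = nxt
        points.append((total_completion(sigma), f_max(sigma), sigma))
        y = f_max(sigma)      # next iteration demands a strictly smaller cost
```

Each appended triple records one Pareto-optimal point (total completion time, maximum cost) with its schedule; tightening the threshold to the last achieved $f_{\max}$-value is what forces every iteration to produce a new point.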

    In this section we adapt Algorithm M1 to solve $P|r_j,p_j=p|\mathrm{Lex}(f_{\max},g_{\max})$ in $O(n^3)$ time. Note that Lemma 2.1 still holds for this problem.

    During the run of Algorithm M1, we only care about the criteria $\sum_{j=1}^{n}C_j$ and $f_{\max}$, totally ignoring $g_{\max}$. To solve $P|r_j,p_j=p|\mathrm{Lex}(f_{\max},g_{\max})$, we need to incorporate $g_{\max}$ into the framework.

    Let schedule $\pi$ be the last schedule obtained upon the completion of Algorithm M1. Let $f^*=f_{\max}(\pi)$. By Theorem 2.4, $\pi$ is an optimal schedule for $P|r_j,p_j=p|f_{\max}$, i.e., $f^*=\min_{\sigma\in\Pi(J)}f_{\max}(\sigma)$. Let $\sigma$ be any schedule in $\Pi(J)$ with $f_{\max}(\sigma)=f^*$. By Lemma 2.2, the $i$-th job in $\pi$, counting from the left, starts no later than the $i$-th job in $\sigma$, $i=1,2,\ldots,n$. As we saw in the last section, this property plays a key role in solving $P|r_j,p_j=p|(\sum_{j=1}^{n}C_j,f_{\max})$. We will maintain a similar property for solving $P|r_j,p_j=p|\mathrm{Lex}(f_{\max},g_{\max})$.

    Let $\Pi(J,f^*,y)$ denote the set of the schedules in $\Pi(J)$ whose $f_{\max}$-values are equal to $f^*$ and whose $g_{\max}$-values are less than $y$.

    Below is the algorithm (Algorithm M2) for $P|r_j,p_j=p|\mathrm{Lex}(f_{\max},g_{\max})$. (The basic idea of the algorithm has been illustrated in the preceding section, before the description of Algorithm M1.) The initial schedule is $\pi$, the optimal schedule for $P|r_j,p_j=p|f_{\max}$ obtained by Algorithm M1.

    Algorithm M2:

    Input: An instance of $P|r_j,p_j=p|\mathrm{Lex}(f_{\max},g_{\max})$.

    Output: A lexicographically optimal schedule such that $g_{\max}$ is minimized under the constraint that the $f_{\max}$-value is equal to $f^*$.

    Step 1. Initially, set $s=1$. Let $\sigma^s=\pi$, $y_s=g_{\max}(\sigma^s)$.

    Step 2. Run Procedure A2($y_s$) to obtain a schedule $\sigma^{s+1}$ in $\Pi(J,f^*,y_s)$, using $\sigma^s$ as the input schedule.

    Step 3. If $\sigma^{s+1}\ne\emptyset$, then set $y_{s+1}=g_{\max}(\sigma^{s+1})$, $s=s+1$, and go to Step 2. Otherwise, return $\sigma^s$.
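    Steps 1–3 above form a simple loop. The following is a minimal Python sketch under stated assumptions: `procedure_A2` is a hypothetical interface for Procedure A2($y_s$), assumed to return a schedule in $\Pi(J,f^*,y_s)$ or `None` when that set is empty, and `g_max` evaluates the $g_{\max}$-value of a schedule.

```python
def algorithm_M2(pi, g_max, procedure_A2):
    """Sketch of Algorithm M2's outer loop: starting from the schedule pi
    produced by Algorithm M1, repeatedly tighten the bound on g_max while
    the f_max-value stays fixed at f*.  procedure_A2(y, sigma) is a
    hypothetical stand-in for Procedure A2(y_s)."""
    sigma, y = pi, g_max(pi)          # Step 1
    while True:
        nxt = procedure_A2(y, sigma)  # Step 2
        if nxt is None:               # Step 3: Pi(J, f*, y) is empty,
            return sigma              # so sigma is lexicographically optimal
        sigma, y = nxt, g_max(nxt)    # Step 3: tighten the bound and repeat
```

The loop terminates because each successful call strictly decreases the $g_{\max}$-value of the current schedule.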

    Procedure A2($y_s$):

    Input: Schedule $\sigma^s=(J_{\sigma^s(1)},J_{\sigma^s(2)},\ldots,J_{\sigma^s(n)})$ with $f_{\max}(\sigma^s)=f^*$ and $y_s=g_{\max}(\sigma^s)$.

    Output: Schedule $\sigma^{s+1}=(J_{\sigma^{s+1}(1)},J_{\sigma^{s+1}(2)},\ldots,J_{\sigma^{s+1}(n)})$ which has the minimum total completion time (and minimum makespan) among all schedules in $\Pi(J,f^*,y_s)$.

    Step 1. Initially, set $h=0$. Let $\sigma^h=(J_{\sigma^h(1)},J_{\sigma^h(2)},\ldots,J_{\sigma^h(n)})$, where $\sigma^h(i)=\sigma^s(i)$, $i=1,2,\ldots,n$.

    Step 2. Set the start times of the jobs in $\sigma^h$ (by Lemma 2.1): $S_{\sigma^h}(1)=r_{\sigma^h(1)}$; for $i=2,3,\ldots,m$, let $S_{\sigma^h}(i)=\max\{S_{\sigma^h}(i-1),r_{\sigma^h(i)}\}$; for $i=m+1,m+2,\ldots,n$, let $S_{\sigma^h}(i)=\max\{S_{\sigma^h}(i-1),r_{\sigma^h(i)},S_{\sigma^h}(i-m)+p\}$.

    Step 3. Adjust $\sigma^h$:

    IF for all $i$ the inequalities $f_{\sigma^h(i)}(C_{\sigma^h}(i))\le f^*$ and $g_{\sigma^h(i)}(C_{\sigma^h}(i))<y_s$ hold,

    THEN Return $\sigma^{s+1}=\sigma^h$.

    ELSE Pick a job $J_{\sigma^h(i)}$ such that $f_{\sigma^h(i)}(C_{\sigma^h}(i))>f^*$ or $g_{\sigma^h(i)}(C_{\sigma^h}(i))\ge y_s$. Let $E(\sigma^h(i))=\{l \mid 1\le l\le i,\ f_{\sigma^h(l)}(C_{\sigma^h}(i))\le f^*,\ g_{\sigma^h(l)}(C_{\sigma^h}(i))<y_s\}$ denote the set of the candidate jobs at time $C_{\sigma^h}(i)$.

       IF $E(\sigma^h(i))=\emptyset$, THEN Return $\sigma^{s+1}=\emptyset$.

       ELSE Find the job with the largest release date in $E(\sigma^h(i))$, say $J_{\sigma^h(e)}$. Let $J_{\sigma^h(e)}$ be scheduled at the $i$-th position instead of $J_{\sigma^h(i)}$. Set $J_x=J_{\sigma^h(i)}$. For $q=i-1,i-2,\ldots,e+1$ (this ordering is used crucially), let $J_{\sigma^h(q)}$ be the job in $\{J_{\sigma^h(q)},J_x\}$ with the larger release date, and let $J_x$ be the other job. Finally, let $J_x$ be scheduled at the $e$-th position.

    Let $\sigma^{h+1}=\sigma^h$ and then set $h=h+1$. Go to Step 2.
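    The start-time rule of Step 2 (Lemma 2.1) is straightforward to compute in linear time. Below is a small sketch under the assumption that the jobs are given by their release dates `r`, listed in the order of $\sigma^h$, with `m` identical machines and common processing time `p`; the function name `start_times` is ours, not the paper's.

```python
def start_times(r, m, p):
    """Earliest start times for jobs processed in the given order on m
    identical machines with equal processing time p (the rule of Step 2):
    the i-th job starts no earlier than its release date, no earlier than
    the previous job's start, and, once all machines are busy, no earlier
    than p time units after the start of the (i-m)-th job."""
    S = []
    for i, r_i in enumerate(r):
        s = r_i
        if i >= 1:
            s = max(s, S[i - 1])       # starts are non-decreasing
        if i >= m:
            s = max(s, S[i - m] + p)   # a machine frees up only then
        S.append(s)
    return S
```

The completion time at position $i$ is then $C_{\sigma^h}(i)=S_{\sigma^h}(i)+p$, from which the cost values $f_{\sigma^h(i)}(C_{\sigma^h}(i))$ and $g_{\sigma^h(i)}(C_{\sigma^h}(i))$ tested in Step 3 can be updated.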

    We omit the proof of Lemma 3.1 because it is very similar to that of Lemma 2.2.

    Lemma 3.1. Let $\sigma^h=(J_{\sigma^h(1)},J_{\sigma^h(2)},\ldots,J_{\sigma^h(n)})$ be the schedule obtained at iteration $h$ ($h=0,1,\ldots$) of Procedure A2($y_s$). Let $\sigma=(J_{\sigma(1)},J_{\sigma(2)},\ldots,J_{\sigma(n)})$ be any schedule in $\Pi(J,f^*,y_s)$. Then for $i=1,2,\ldots,n$, we have the following: (1) $r_{\sigma^h(i)}\ge\max\{r_j \mid j\in E(\sigma^h(i))\}$; (2) $S_{\sigma^h}(i)\le S_{\sigma}(i)$ (and thus $C_{\sigma^h}(i)\le C_{\sigma}(i)$); (3) $C_{\sigma^h}(i)\le C_{\sigma^{h+1}}(i)$.

    We get the following:

    Lemma 3.2. Let $\sigma^{last}$ (i.e., $\sigma^{s+1}$) be the schedule obtained at the last iteration of Procedure A2($y_s$). If $\sigma^{last}=\emptyset$, then $\Pi(J,f^*,y_s)=\emptyset$; otherwise, $\sigma^{last}$ is a schedule which has the minimum total completion time and minimum makespan among all schedules in $\Pi(J,f^*,y_s)$.

    Proof. If $\sigma^{last}=\emptyset$, then by Step 3 of Procedure A2($y_s$), at the last iteration there is a job $J_{\sigma^h(i)}$ such that $f_{\sigma^h(i)}(C_{\sigma^h}(i))>f^*$ or $g_{\sigma^h(i)}(C_{\sigma^h}(i))\ge y_s$, and $E(\sigma^h(i))=\emptyset$, where $E(\sigma^h(i))=\{l \mid 1\le l\le i,\ f_{\sigma^h(l)}(C_{\sigma^h}(i))\le f^*,\ g_{\sigma^h(l)}(C_{\sigma^h}(i))<y_s\}$ denotes the set of candidate jobs at time $C_{\sigma^h}(i)$. Therefore, the first $i$ jobs in $\sigma^{last-1}$ can only be scheduled at the first $i$ positions in any schedule in $\Pi(J,f^*,y_s)$, but none of them can be scheduled at the $i$-th position. This contradiction tells us that $\Pi(J,f^*,y_s)=\emptyset$.

    If $\sigma^{last}\ne\emptyset$, then by Lemma 3.1, $\sigma^{last}$ is a schedule which has the minimum total completion time and minimum makespan among all schedules in $\Pi(J,f^*,y_s)$.

    Based on Lemma 3.2, we get the following:

    Theorem 3.3. Algorithm M2 solves $P|r_j,p_j=p|\mathrm{Lex}(f_{\max},g_{\max})$ in $O(n^3)$ time.

    Moreover, by Lemma 3.2, the last schedule generated by Algorithm M2 has the minimum total completion time and minimum makespan among the lexicographically optimal schedules for $P|r_j,p_j=p|\mathrm{Lex}(f_{\max},g_{\max})$.

    In this paper we studied the bicriteria problem of scheduling equal-processing-time jobs with release dates on identical machines to minimize the total completion time (and makespan) and the maximum cost simultaneously, or to minimize two general min-max criteria hierarchically. We presented $O(n^3)$-time algorithms for both problems. For future research on Pareto optimization, it would be interesting to consider more general min-sum objective functions instead of the total completion time, such as the total weighted completion time, in combination with a maximum cost or another min-sum objective function. For lexicographical optimization, one could try to extend the results in [26,27] to the case of unequal release dates.

    The authors declare they have not used Artificial Intelligence (AI) tools in the creation of this article.

    We thank the editor and reviewers for their helpful suggestions.

    This work is supported by the Natural Science Foundation of Shandong Province, China (No. ZR2020MA030), and the Shandong Soft Science Project (No. 2021RKY02040).

    The authors express their gratitude to Princess Nourah bint Abdulrahman University Researchers Supporting Project number (PNURSP2023R61), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.

    The authors declare that there is no conflict of interest.



    [1] H. Hoogeveen, Multicriteria scheduling, Eur. J. Oper. Res., 167 (2005), 592–623. https://doi.org/10.1016/j.ejor.2004.07.011
    [2] V. T'kindt, J. C. Billaut, Multicriteria Scheduling: Theory, Models and Algorithms, Berlin: Springer Verlag, 2006.
    [3] P. Brucker, Scheduling Algorithms, J. Oper. Res. Soc., 50 (1999), 774–774. https://doi.org/10.2307/3010332
    [4] A. Nagar, J. Haddock, S. Heragu, Multiple and bicriteria scheduling: A literature survey, Eur. J. Oper. Res., 81 (1995), 88–104. https://doi.org/10.1016/0022-4049(94)00117-2
    [5] M. Pfund, J. W. Fowler, J. N. D. Gupta, A survey of algorithms for single and multi-objective unrelated parallel-machine deterministic scheduling problems, J. Chin. Inst. Ind. Eng., 21 (2004), 230–241. https://doi.org/10.1080/10170660409509404
    [6] D. M. Lei, Multi-objective production scheduling: a survey, Int. J. Adv. Manuf. Technol., 43 (2009), 926–938. https://doi.org/10.1007/s00170-008-1770-4
    [7] F. S. Erenay, I. Sabuncuoglu, A. Toptal, M. K. Tiwari, New solution methods for single machine bicriteria scheduling problem: Minimization of average flowtime and number of tardy jobs, Eur. J. Oper. Res., 201 (2010), 89–98. https://doi.org/10.1057/fsp.2010.5
    [8] Y. K. Lin, J. W. Fowler, M. E. Pfund, Multiple-objective heuristics for scheduling unrelated parallel machines, Eur. J. Oper. Res., 227 (2013), 239–253. https://doi.org/10.1016/j.ejor.2012.10.008
    [9] A. J. Ruiz-Torres, J. H. Ablanedo-Rosas, S. Mukhopadhyay, G. Paletta, Scheduling workers: A multi-criteria model considering their satisfaction, Comput. Ind. Eng., 128 (2019), 747–754. https://doi.org/10.1016/j.cie.2018.12.070
    [10] J. C. Yepes-Borrero, F. Perea, R. Ruiz, F. Villa, Bi-objective parallel machine scheduling with additional resources during setups, Eur. J. Oper. Res., 292 (2021), 443–455. https://doi.org/10.1016/j.ejor.2020.10.052
    [11] J. Mar-Ortiz, A. J. Ruiz-Torres, B. Adenso-Díaz, Scheduling in parallel machines with two objectives: analysis of factors that influence the Pareto frontier, Oper. Res., 22 (2022), 4585–4605. https://doi.org/10.1007/s12351-021-00684-9
    [12] J. N. D. Gupta, A. J. Ruiz-Torres, Minimizing makespan subject to minimum total flow-time on identical parallel machines, Eur. J. Oper. Res., 125 (2000), 370–380. https://doi.org/10.1016/S0377-2217(99)00386-0
    [13] C. H. Lin, C. J. Liao, Makespan minimization subject to flowtime optimality on identical parallel machines, Comput. Oper. Res., 31 (2004), 1655–1666. https://doi.org/10.1016/S0305-0548(03)00113-8
    [14] L. H. Su, Minimizing earliness and tardiness subject to total completion time in an identical parallel machine system, Comput. Oper. Res., 36 (2009), 461–471. https://doi.org/10.1016/j.cor.2007.09.013
    [15] J. L. Bruno, E. G. Coffman Jr, R. Sethi, Algorithms for minimizing mean flow time, Proceedings of the IFIP Congress, 74 (1974), 504–510.
    [16] J. N. D. Gupta, A. J. Ruiz-Torres, S. Webster, Minimizing maximum tardiness and number of tardy jobs on parallel machines subject to minimum flow-time, J. Oper. Res. Soc., 54 (2003), 1263–1274. https://doi.org/10.1057/palgrave.jors.2601638
    [17] E. L. Lawler, J. K. Lenstra, A. H. G. Rinnooy Kan, D. B. Shmoys, Sequencing and scheduling: Algorithms and complexity, Handbooks in Operations Research and Management Science, 4 (1993), 445–522.
    [18] J. K. Lenstra, A. H. G. Rinnooy Kan, P. Brucker, Complexity of machine scheduling problems, Annals of Discrete Mathematics, 1 (1977), 343–362.
    [19] S. A. Kravchenko, F. Werner, Scheduling jobs with equal processing times, Proceedings of the 13th IFAC Symposium on Information Control Problems in Manufacturing, 42 (2009), 1262–1267. https://doi.org/10.3182/20090603-3-RU-2001.0042
    [20] S. A. Kravchenko, F. Werner, Parallel machine problems with equal processing times: a survey, J. Sched., 14 (2011), 435–444. https://doi.org/10.1007/s10951-011-0231-3
    [21] P. Brucker, N. V. Shakhlevich, Necessary and sufficient optimality conditions for scheduling unit time jobs on identical parallel machines, J. Sched., 19 (2016), 659–685. https://doi.org/10.1007/s10951-016-0471-3
    [22] J. Hong, K. Lee, M. L. Pinedo, Scheduling equal length jobs with eligibility restrictions, Ann. Oper. Res., 285 (2020), 295–314. https://doi.org/10.1007/s10479-019-03172-8
    [23] N. Vakhania, A better algorithm for sequencing with release and delivery times on identical machines, J. Algorithms, 48 (2003), 273–293. https://doi.org/10.1016/S0196-6774(03)00072-5
    [24] N. Vakhania, F. Werner, A polynomial algorithm for sequencing jobs with release and delivery times on uniform machines, Preprints.org, 2021010142, [Preprint], (2021) [cited 2023 May 23]. Available from: https://www.preprints.org/manuscript/202101.0142/v2
    [25] A. Tuzikov, M. Makhaniok, R. Männer, Bicriterion scheduling of identical processing time jobs by uniform processors, Comput. Oper. Res., 25 (1998), 31–35. https://doi.org/10.1016/S0305-0548(98)80005-1
    [26] S. C. Sarin, D. Prakash, Equal processing time bicriteria scheduling on parallel machines, J. Comb. Optim., 8 (2004), 227–240.
    [27] Q. L. Zhao, J. J. Yuan, Bicriteria scheduling of equal length jobs on uniform parallel machines, J. Comb. Optim., 39 (2020), 637–661. https://doi.org/10.1007/s10878-019-00507-w
    [28] B. Simons, Multiprocessor scheduling of unit-time jobs with arbitrary release times and deadlines, SIAM J. Comput., 12 (1983), 294–299. https://doi.org/10.1137/0212018
    [29] B. B. Simons, M. K. Warmuth, A fast algorithm for multiprocessor scheduling of unit-length jobs, SIAM J. Comput., 18 (1989), 690–710. https://doi.org/10.1137/0218048
    [30] C. Dürr, M. Hurand, Finding total unimodularity in optimization problems solved by linear programs, Algorithmica, 59 (2011), 256–268. https://doi.org/10.1007/s00453-009-9310-7
    [31] A. López-Ortiz, C. G. Quimper, A fast algorithm for multi-machine scheduling problems with jobs of equal processing times, Symposium on Theoretical Aspects of Computer Science, 9 (2011), 380–391.
    [32] H. Fahimi, C. G. Quimper, Variants of multi-resource scheduling problems with equal processing times, in Combinatorial Optimization and Applications: 9th International Conference, Houston: Springer International Publishing, 2015.
    [33] D. Elvikis, V. T'kindt, Two-agent scheduling on uniform parallel machines with min-max criteria, Ann. Oper. Res., 213 (2014), 79–94.
  • © 2023 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)
