Article

The Due Window Assignment Problems with Deteriorating Job and Delivery Time

College of Science, Northeastern University, Shenyang 110819, China
*
Author to whom correspondence should be addressed.
Mathematics 2022, 10(10), 1672; https://doi.org/10.3390/math10101672
Submission received: 8 April 2022 / Revised: 5 May 2022 / Accepted: 11 May 2022 / Published: 13 May 2022
(This article belongs to the Special Issue Applied Computing and Artificial Intelligence)

Abstract

This paper considers the single machine scheduling problem with a due window, delivery times and deteriorating jobs, whose goal is to minimize a cost composed of the window location, window size, earliness and tardiness. Both the common due window and the slack due window are considered. The delivery time of a job depends on the actual processing times of the jobs already sequenced, and the actual processing time of a job is an increasing function of its start time. Based on the small-perturbation technique and the adjacent-exchange technique, we obtain structural properties of the problems. For the common and slack due window assignments, we prove that the two objective functions are polynomially solvable in $O(n \log n)$ time, and we propose the corresponding algorithms to obtain the optimal sequence, window location and window size.

1. Introduction

To gain a competitive advantage, modern operations management advocates that companies improve customer service. Operationally, good customer service means fulfilling orders within a specified time period. Research on due windows usually assumes three situations. First, jobs completed before the start time of the due window incur earliness costs. Second, jobs completed after the end of the due window incur tardiness costs. Third, jobs completed within the due window incur no cost. The total cost therefore combines four factors: earliness costs, tardiness costs, the start time of the window and the window size.
Based on the research gap found in the literature review, this paper studies scheduling problems involving both simple linear deterioration and past-sequence-dependent delivery times under the common and slack due window assignments. To the best of our knowledge, such results are rare for the common or slack due window assignment. In this paper, the single machine scheduling problem with delivery times and deteriorating jobs is considered. The actual processing time of a job is an increasing function of its start time, and the delivery time depends on the actual processing times of the jobs already sequenced. The goal is to minimize the window location, window size, earliness and tardiness. The small-perturbation technique and the adjacent-exchange technique are effective methods for dealing with scheduling problems. Based on them, we obtain structural properties of the problems, from which we propose polynomial-time algorithms to obtain the optimal sequence, window location and window size. The complexity of the algorithms is $O(n \log n)$.

2. Literature Review

In the field of production sequencing and scheduling, the scheduling problem with due window has received a lot of attention. In 1955, Jackson first proposed the concept of due date [1]. The scheduling problem of the optimal due date was considered by Seidmann [2] and Panwalkar [3]. In 2015, Li et al. studied the scheduling problem with the slack due window, resource allocation and learning effect [4]. In 2015, Yang et al. studied the scheduling problem with deteriorating jobs, learning effect and due window [5]. In 2016, Yin et al. studied the scheduling problem with resource allocation and slack due window [6]. In 2017, Wang et al. studied the scheduling problem with the slack due window and controllable processing times [7]. In 2020, Wang et al. considered the scheduling problem with due window and position-dependent weights [8]. In 2020, Sun et al. considered the proportional flow shop scheduling problem with due window and position-dependent weights [9]. In 2021, Yue et al. studied a scheduling model with due window and the processing time as a special function [10].
The processing time of a deteriorating job is variable: the later a job starts, the longer it takes to process. The concept of deteriorating jobs was proposed by Gupta [11]. Under the common due window assignment, Yue et al. considered the scheduling problem with deteriorating jobs [12]. In 2018, Wang et al. studied the unrelated parallel processors scheduling problem with a maintenance activity and deterioration effect [13]. In 2019, Sun et al. considered the single machine scheduling problem with machine maintenance and deteriorating jobs [14]. In 2020, Cheng et al. studied the single machine scheduling problem whose processing time was a stage function of the start time [15]. In 2020, Liang et al. studied the scheduling problem with resource allocation and deteriorating jobs [16].
After a job is processed, it should be delivered to the customer; the associated delivery duration is called the past sequence dependent (psd) delivery time. Koulamas and Kyparisis first proposed the past sequence dependent delivery time [17]. In 2012, Liu et al. presented some new results on single machine scheduling with past sequence dependent delivery times [18]. In 2014, Zhao et al. studied single machine scheduling problems with general position-dependent processing times and past sequence dependent delivery times [19]. In 2020, Mir et al. studied a parallel machine problem with delivery times and a deterioration effect [20]. In 2021, Toksari et al. studied some scheduling problems with learning effect and past sequence dependent delivery times [21]. In 2021, Wang et al. studied single machine scheduling with position-dependent weights and delivery times [22].
The remainder of this paper is organized as follows. The problem is described in Section 3, the research method is given in Section 4 and the summary is given in Section 5.

3. Notation and Problem Statement

There are $n$ independent jobs $S = \{J_1, \ldots, J_n\}$ processed from time $t_0$ ($t_0 > 0$) on a single machine. The actual processing time $p_i$ of $J_i$ is
$p_i = b_i t_i$,
where $b_i$ is the deterioration rate and $t_i$ is the start time of $J_i$. The delivery time $q_{[i]}$ of $J_{[i]}$ is
$q_{[i]} = r w_{[i]} = r \sum_{j=0}^{i-1} p_{[j]}$,
where $r$ is the delivery rate, $p_{[i]}$ is the actual processing time of $J_{[i]}$ and $w_{[i]} = \sum_{j=0}^{i-1} p_{[j]} = t_0 \prod_{j=1}^{i-1}(1 + b_{[j]})$, with $p_{[0]} = t_0$. The subscript $[i]$ indicates that the job is arranged at the $i$th position. The completion time $C_{[i]}$ of $J_{[i]}$ is
$C_{[i]} = w_{[i]} + p_{[i]} + q_{[i]} = t_0 (1 + b_{[i]} + r) \prod_{j=1}^{i-1}(1 + b_{[j]})$.
Both the common due window (CONW) and the slack due window (SLKW) are considered. Let $[d_i, d'_i]$ be the due window of $J_i$, where $d_i$ and $d'_i$ are the start time and end time of the due window, respectively. For the common due window, all jobs share the same window start time $d$ and end time $d'$. For the slack due window, $d_i = p_i + q_1$ and $d'_i = p_i + q_2$, where $q_1$ and $q_2$ are decision variables. The window size is $D = d'_i - d_i = q_2 - q_1$. $C_i$ denotes the completion time of $J_i$; the earliness of $J_i$ is $E_i = \max\{0, d_i - C_i\}$ and the tardiness of $J_i$ is $T_i = \max\{0, C_i - d'_i\}$.
The goal is to minimize the earliness, the tardiness, the start time of the window and the window size. The objective functions are
$M = \sum_{i=1}^{n}[aE_{[i]} + cT_{[i]} + ed + fD]$,
$M = \sum_{i=1}^{n}[aE_{[i]} + cT_{[i]} + eq_1 + fD]$,
where $a$, $c$, $e$ and $f$ represent the unit costs of earliness, tardiness, window start time and window size, respectively. If $C_{[h]} < d < C_{[h+1]}$ and $C_{[l]} < d' < C_{[l+1]}$ (CONW), the objective function is
$M = \sum_{i=1}^{h} a(d - C_{[i]}) + \sum_{i=l+1}^{n} c(C_{[i]} - d') + ned + nf(d' - d) = -a\sum_{i=1}^{h} C_{[i]} + c\sum_{i=l+1}^{n} C_{[i]} + [ha + n(e-f)]d + [nf - (n-l)c]d'$.
If $d = C_{[h+1]}$ and $d' = C_{[l]}$ (CONW), the objective function is
$M = \sum_{i=1}^{h} a(C_{[h+1]} - C_{[i]}) + \sum_{i=l+1}^{n} c(C_{[i]} - C_{[l]}) + neC_{[h+1]} + nf(C_{[l]} - C_{[h+1]}) = -a\sum_{i=1}^{h} C_{[i]} + c\sum_{i=l+1}^{n} C_{[i]} + [ha + n(e-f)]C_{[h+1]} + [nf - (n-l)c]C_{[l]}$.
The objective function of the SLKW model is similar to that of the CONW model.
By the three-field notation, the models can be defined as
$1 \,|\, \mathrm{CONW}, p_i = b_i t_i, q_{psd} \,|\, \sum_{i=1}^{n}[aE_{[i]} + cT_{[i]} + ed + fD]$,
$1 \,|\, \mathrm{SLKW}, p_i = b_i t_i, q_{psd} \,|\, \sum_{i=1}^{n}[aE_{[i]} + cT_{[i]} + eq_1 + fD]$,
where $q_{psd}$ represents the past-sequence-dependent delivery time.
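As a quick illustrative check (our code, not part of the paper), the closed form for $C_{[i]}$ can be compared against the step-by-step definition $C_{[i]} = w_{[i]} + p_{[i]} + q_{[i]}$:

```python
# Illustrative check (not from the paper): the closed form
# C_[i] = t0*(1 + b_[i] + r)*prod_{j<i}(1 + b_[j]) agrees with the
# step-by-step definition C_[i] = w_[i] + p_[i] + q_[i].
def completion_recursive(b, t0, r):
    C, w = [], t0
    for bi in b:
        p = bi * w            # p_[i] = b_[i] * t_[i], with t_[i] = w_[i]
        q = r * w             # q_[i] = r * w_[i]
        C.append(w + p + q)
        w += p                # start time of the next job
    return C

def completion_closed(b, t0, r):
    C, prod = [], 1.0
    for bi in b:
        C.append(t0 * (1 + bi + r) * prod)
        prod *= 1 + bi
    return C
```

For example, with $t_0 = 1$, $r = 0.1$ and deterioration rates $(1, 0.3, 0.7, 2)$, both functions give the completion times $(2.1, 2.8, 4.68, 13.702)$.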

4. Research Method

In this section, we present several properties for an optimal schedule. First, we show the optimal common and slack due window positions by using the technique of small perturbations. Then, the sequence of jobs within different sets is determined by the adjacent exchange technique. Finally, we propose the polynomial time algorithms to obtain the optimal sequence.

4.1. The Problem $1 \,|\, \mathrm{CONW}, p_i = b_i t_i, q_{psd} \,|\, \sum_{i=1}^{n}[aE_{[i]} + cT_{[i]} + ed + fD]$

4.1.1. Optimal Properties of the Problem

Lemma 1.
For any job sequence, the window start time $d$ of the optimal schedule coincides with the completion time of some job.
Proof. 
(a) When $C_{[h]} < d < C_{[h+1]}$ and $d' = C_{[l]}$, $0 \le h < l \le n$, $C_{[0]} = t_0$, the objective function is
$M = -a\sum_{i=1}^{h}C_{[i]} + c\sum_{i=l+1}^{n}C_{[i]} + [ha + n(e-f)]d + [nf-(n-l)c]C_{[l]}$.
When $d = C_{[h]}$, the objective function is
$M_1 = -a\sum_{i=1}^{h-1}C_{[i]} + c\sum_{i=l+1}^{n}C_{[i]} + [(h-1)a + n(e-f)]C_{[h]} + [nf-(n-l)c]C_{[l]}$,
$M - M_1 = [ha + n(e-f)][d - C_{[h]}]$.
When $d = C_{[h+1]}$, the objective function is
$M_2 = -a\sum_{i=1}^{h}C_{[i]} + c\sum_{i=l+1}^{n}C_{[i]} + [ha + n(e-f)]C_{[h+1]} + [nf-(n-l)c]C_{[l]}$,
$M - M_2 = [ha + n(e-f)][d - C_{[h+1]}]$.
When $ha + n(e-f) < 0$, $M > M_2$; otherwise, $M \ge M_1$.
(b) When $C_{[h]} < d < C_{[h+1]}$ and $C_{[l]} < d' < C_{[l+1]}$, $0 \le h \le l < n$, the objective function is
$M = -a\sum_{i=1}^{h}C_{[i]} + c\sum_{i=l+1}^{n}C_{[i]} + [ha + n(e-f)]d + [nf-(n-l)c]d'$.
When $d = C_{[h]}$, the objective function is
$M_3 = -a\sum_{i=1}^{h-1}C_{[i]} + c\sum_{i=l+1}^{n}C_{[i]} + [(h-1)a + n(e-f)]C_{[h]} + [nf-(n-l)c]d'$,
$M - M_3 = [ha + n(e-f)][d - C_{[h]}]$.
When $d = C_{[h+1]}$, the objective function is
$M_4 = -a\sum_{i=1}^{h}C_{[i]} + c\sum_{i=l+1}^{n}C_{[i]} + [ha + n(e-f)]C_{[h+1]} + [nf-(n-l)c]d'$,
$M - M_4 = [ha + n(e-f)][d - C_{[h+1]}]$.
When $ha + n(e-f) < 0$, $M > M_4$; otherwise, $M \ge M_3$. In either case, moving $d$ to a completion time does not increase the cost. □
Lemma 2.
For any job sequence, the window end time $d'$ of the optimal schedule coincides with the completion time of some job.
Proof. 
(a) When $d = C_{[h]}$ and $C_{[l]} < d' < C_{[l+1]}$, $0 \le h \le l < n$, the objective function is
$M = -a\sum_{i=1}^{h-1}C_{[i]} + c\sum_{i=l+1}^{n}C_{[i]} + [(h-1)a + n(e-f)]C_{[h]} + [nf-(n-l)c]d'$.
When $d' = C_{[l]}$, the objective function is
$M_1 = -a\sum_{i=1}^{h-1}C_{[i]} + c\sum_{i=l+1}^{n}C_{[i]} + [(h-1)a + n(e-f)]C_{[h]} + [nf-(n-l)c]C_{[l]}$,
$M - M_1 = [nf-(n-l)c][d' - C_{[l]}]$.
When $d' = C_{[l+1]}$, the objective function is
$M_2 = -a\sum_{i=1}^{h-1}C_{[i]} + c\sum_{i=l+2}^{n}C_{[i]} + [(h-1)a + n(e-f)]C_{[h]} + [nf-(n-l-1)c]C_{[l+1]}$,
$M - M_2 = [nf-(n-l)c][d' - C_{[l+1]}]$.
When $nf - (n-l)c < 0$, $M > M_2$; otherwise, $M \ge M_1$.
(b) When $C_{[h]} < d < C_{[h+1]}$ and $C_{[l]} < d' < C_{[l+1]}$, $0 \le h \le l < n$, the objective function is
$M = -a\sum_{i=1}^{h}C_{[i]} + c\sum_{i=l+1}^{n}C_{[i]} + [ha + n(e-f)]d + [nf-(n-l)c]d'$.
When $d' = C_{[l]}$, the objective function is
$M_3 = -a\sum_{i=1}^{h}C_{[i]} + c\sum_{i=l+1}^{n}C_{[i]} + [ha + n(e-f)]d + [nf-(n-l)c]C_{[l]}$,
$M - M_3 = [nf-(n-l)c][d' - C_{[l]}]$.
When $d' = C_{[l+1]}$, the objective function is
$M_4 = -a\sum_{i=1}^{h}C_{[i]} + c\sum_{i=l+2}^{n}C_{[i]} + [ha + n(e-f)]d + [nf-(n-l-1)c]C_{[l+1]}$,
$M - M_4 = [nf-(n-l)c][d' - C_{[l+1]}]$.
When $nf - (n-l)c < 0$, $M > M_4$; otherwise, $M \ge M_3$. □
Lemma 3.
For the optimal schedule, $d$ is the completion time $C_{[h]}$ and $d'$ is the completion time $C_{[l]}$, where $h = \lceil n(f-e)/a \rceil$ and $l = \lceil n(c-f)/c \rceil$.
Proof. 
When $d = C_{[h]}$ and $d' = C_{[l]}$ for the optimal schedule, the objective function is
$M = -a\sum_{i=1}^{h-1}C_{[i]} + c\sum_{i=l+1}^{n}C_{[i]} + [(h-1)a + n(e-f)]C_{[h]} + [nf-(n-l)c]C_{[l]}$.
(a) When $d = C_{[h-1]}$ and $d' = C_{[l]}$, the objective function is
$M_1 = -a\sum_{i=1}^{h-2}C_{[i]} + c\sum_{i=l+1}^{n}C_{[i]} + [(h-2)a + n(e-f)]C_{[h-1]} + [nf-(n-l)c]C_{[l]}$,
$M - M_1 = [(h-1)a + n(e-f)][C_{[h]} - C_{[h-1]}] \le 0$.
Since $C_{[h]} > C_{[h-1]}$, it follows that $h \le 1 + n(f-e)/a$.
When $d = C_{[h+1]}$ and $d' = C_{[l]}$, the objective function is
$M_2 = -a\sum_{i=1}^{h}C_{[i]} + c\sum_{i=l+1}^{n}C_{[i]} + [ha + n(e-f)]C_{[h+1]} + [nf-(n-l)c]C_{[l]}$,
$M - M_2 = [ha + n(e-f)][C_{[h]} - C_{[h+1]}] \le 0$.
Since $C_{[h]} < C_{[h+1]}$, it follows that $h \ge n(f-e)/a$. So $n(f-e)/a \le h \le 1 + n(f-e)/a$ and $h = \lceil n(f-e)/a \rceil$.
(b) When $d' = C_{[l-1]}$ and $d = C_{[h]}$, the objective function is
$M_3 = -a\sum_{i=1}^{h-1}C_{[i]} + c\sum_{i=l}^{n}C_{[i]} + [(h-1)a + n(e-f)]C_{[h]} + [nf-(n-l+1)c]C_{[l-1]}$,
$M - M_3 = [nf-(n-l+1)c][C_{[l]} - C_{[l-1]}] \le 0$,
so $l \le 1 + n(c-f)/c$.
When $d' = C_{[l+1]}$ and $d = C_{[h]}$, the objective function is
$M_4 = -a\sum_{i=1}^{h-1}C_{[i]} + c\sum_{i=l+2}^{n}C_{[i]} + [(h-1)a + n(e-f)]C_{[h]} + [nf-(n-l-1)c]C_{[l+1]}$,
$M - M_4 = [nf-(n-l)c][C_{[l]} - C_{[l+1]}] \le 0$,
so $l \ge n(c-f)/c$. Hence $n(c-f)/c \le l \le 1 + n(c-f)/c$ and $l = \lceil n(c-f)/c \rceil$. □
Suppose $d = C_{[h]}$ and $d' = C_{[l]}$ for the optimal schedule. Partition the job sequence $\Omega$ into five sets: $\Omega_1 = \{J_{[i]} \in \Omega \mid 1 \le i \le h-1\}$, $\Omega_2 = \{J_{[i]} \in \Omega \mid i = h\}$, $\Omega_3 = \{J_{[i]} \in \Omega \mid h+1 \le i \le l-1\}$, $\Omega_4 = \{J_{[i]} \in \Omega \mid i = l\}$ and $\Omega_5 = \{J_{[i]} \in \Omega \mid l+1 \le i \le n\}$.
Lemma 4.
In the optimal schedule, the jobs in $\Omega_1$ can be processed in descending order of $b_j$.
Proof. 
Suppose $J_g$ is at the $u$th position and $J_k$ is at the $(u+1)$th position in $\Omega_1$, $\pi_1 = \{J_1, \ldots, J_g, J_k, \ldots, J_n\}$, $1 \le u < u+1 \le h-1$. The objective function of $\pi_1$ is $M_1$. Swap $J_g$ and $J_k$ to get the sequence $\pi_2 = \{J_1, \ldots, J_k, J_g, \ldots, J_n\}$, whose objective function is $M_2$. Then
$M_1 - M_2 = a t_0 (b_k - b_g)(1+r)\prod_{i=1}^{u-1}(1+b_{[i]})$.
If $b_g > b_k$, then $M_1 < M_2$. □
Lemma 5.
The deterioration rate of the job in $\Omega_2$ is less than the deterioration rate of any job in $\Omega_1$.
Proof. 
Suppose $J_g$ is at the $(h-1)$th position and $J_k$ is at the $h$th position, $\pi_1 = \{J_1, \ldots, J_g, J_k, \ldots, J_n\}$. The objective function of $\pi_1$ is $M_1$. Swap $J_g$ and $J_k$ to get the sequence $\pi_2 = \{J_1, \ldots, J_k, J_g, \ldots, J_n\}$, whose objective function is $M_2$. Then
$M_1 - M_2 = \{[(h-1)a + n(e-f)]r - a\}\, t_0 (b_g - b_k)\prod_{i=1}^{h-2}(1+b_{[i]})$.
Since $(h-1)a + n(e-f) \le 0$ by Lemma 3, if $b_k < b_g$, then $M_1 < M_2$. □
Lemma 6.
In the optimal schedule, the jobs in $\Omega_3$ can be processed in any order of $b_j$.
Proof. 
Suppose $J_g$ is at the $u$th position and $J_k$ is at the $(u+1)$th position in $\Omega_3$, $\pi_1 = \{J_1, \ldots, J_g, J_k, \ldots, J_n\}$, $h+1 \le u < u+1 \le l-1$. The objective function of $\pi_1$ is $M_1$. Swap $J_g$ and $J_k$ to get the sequence $\pi_2 = \{J_1, \ldots, J_k, J_g, \ldots, J_n\}$, whose objective function is $M_2$. Since both jobs lie inside the due window, the swap changes neither earliness nor tardiness, and
$M_1 = M_2$. □
Lemma 7.
The deterioration rate of any job in $\Omega_3$ is less than the deterioration rate of the job in $\Omega_4$.
Proof. 
Suppose $J_g$ is at the $u$th position, $h+1 \le u \le l-1$, and $J_k$ is at the $l$th position, $\pi_1 = \{J_1, \ldots, J_g, \ldots, J_k, \ldots, J_n\}$. The objective function of $\pi_1$ is $M_1$. Swap $J_g$ and $J_k$ to get the sequence $\pi_2 = \{J_1, \ldots, J_k, \ldots, J_g, \ldots, J_n\}$, whose objective function is $M_2$. Then
$M_1 - M_2 = [nf-(n-l)c]\, t_0\, r\, (b_g - b_k)\prod_{i=1, i\neq u}^{l-1}(1+b_{[i]})$.
Since $nf - (n-l)c \ge 0$ by Lemma 3, if $b_g < b_k$, then $M_1 < M_2$. □
Lemma 8.
In the optimal schedule, the jobs in $\Omega_5$ can be processed in ascending order of $b_j$.
Proof. 
Suppose $J_g$ is at the $u$th position and $J_k$ is at the $(u+1)$th position in $\Omega_5$, $\pi_1 = \{J_1, \ldots, J_g, J_k, \ldots, J_n\}$, $l+1 \le u < u+1 \le n$. The objective function of $\pi_1$ is $M_1$. Swap $J_g$ and $J_k$ to get the sequence $\pi_2 = \{J_1, \ldots, J_k, J_g, \ldots, J_n\}$, whose objective function is $M_2$. Then
$M_1 - M_2 = c t_0 (1+r)(b_g - b_k)\prod_{i=1}^{u-1}(1+b_{[i]})$.
If $b_g < b_k$, then $M_1 < M_2$. □
Lemma 9.
The deterioration rate of the job in $\Omega_4$ is less than the deterioration rate of any job in $\Omega_5$.
Proof. 
Suppose $J_g$ is at the $l$th position and $J_k$ is at the $(l+1)$th position, $\pi_1 = \{J_1, \ldots, J_g, J_k, \ldots, J_n\}$. The objective function of $\pi_1$ is $M_1$. Swap $J_g$ and $J_k$ to get the sequence $\pi_2 = \{J_1, \ldots, J_k, J_g, \ldots, J_n\}$, whose objective function is $M_2$. Then
$M_1 - M_2 = \{cr + [nf-(n-l)c]\}\, t_0 (b_g - b_k)\prod_{i=1}^{l-1}(1+b_{[i]})$.
If $b_g < b_k$, then $M_1 < M_2$. □
Suppose $J_g$ is at the $u$th position and $J_k$ is at the $v$th position in the sequence $\pi_1 = \{J_1, \ldots, J_g, \ldots, J_k, \ldots, J_n\}$, $1 \le u \le h$, $l \le v \le n$. The objective function of $\pi_1$ is $M_1$. Swap $J_g$ and $J_k$ to get the sequence $\pi_2 = \{J_1, \ldots, J_k, \ldots, J_g, \ldots, J_n\}$, whose objective function is $M_2$. Then
$M_1 - M_2 = t_0 (b_g - b_k)\prod_{i=1}^{u-1}(1+b_{[i]})\,\big\{-a - a\sum_{i=u+1}^{h}(1+r+b_{[i]})\prod_{m=u+1}^{i-1}(1+b_{[m]}) + c\sum_{i=l}^{v-1}(1+r+b_{[i]})\prod_{m=u+1}^{i-1}(1+b_{[m]}) + cr\prod_{i=u+1}^{v-1}(1+b_{[i]}) + [ha+n(e-f)](1+r+b_{[h]})\prod_{i=u+1}^{h-1}(1+b_{[i]}) + [nf-(n-l+1)c](1+r+b_{[l]})\prod_{i=u+1}^{l-1}(1+b_{[i]})\big\}$.
Define
$\pi_{uv} = -a - a\sum_{i=u+1}^{h}(1+r+b_{[i]})\prod_{m=u+1}^{i-1}(1+b_{[m]}) + c\sum_{i=l}^{v-1}(1+r+b_{[i]})\prod_{m=u+1}^{i-1}(1+b_{[m]}) + cr\prod_{i=u+1}^{v-1}(1+b_{[i]}) + [ha+n(e-f)](1+r+b_{[h]})\prod_{i=u+1}^{h-1}(1+b_{[i]}) + [nf-(n-l+1)c](1+r+b_{[l]})\prod_{i=u+1}^{l-1}(1+b_{[i]})$.
If $\pi_{uv} > 0$, $J_g$ should be at the $u$th position; otherwise, $J_g$ should be at the $v$th position.
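The sign test above can be evaluated numerically. The sketch below is our reconstruction of $\pi_{uv}$ for the case $v > l$ (helper names are ours); on the data of Example 1 in Section 4.1.2 it reproduces $\pi_{[1][4]} = 4.125$:

```python
# Hedged reconstruction of pi_uv for the CONW model (case v > l).
import math

def pi_uv(b, u, v, h, l, n, a, c, e, f, r):
    # b[i-1] is the deterioration rate of the job currently at position i
    prod = lambda lo, hi: math.prod(1 + b[i - 1] for i in range(lo, hi + 1))
    val = -a
    val -= a * sum((1 + r + b[i - 1]) * prod(u + 1, i - 1) for i in range(u + 1, h + 1))
    val += c * sum((1 + r + b[i - 1]) * prod(u + 1, i - 1) for i in range(l, v))
    val += c * r * prod(u + 1, v - 1)
    val += (h * a + n * (e - f)) * (1 + r + b[h - 1]) * prod(u + 1, h - 1)
    val += (n * f - (n - l + 1) * c) * (1 + r + b[l - 1]) * prod(u + 1, l - 1)
    return val

# layout as in Example 1, step 4(b): positions hold b = (1, 0.3, 0.7, 2)
val = pi_uv([1, 0.3, 0.7, 2], u=1, v=4, h=1, l=3, n=4, a=4, c=5, e=1, f=2, r=0.1)
# val ≈ 4.125 > 0, so the candidate job stays at position u = 1
```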

4.1.2. Optimal Algorithm

Algorithm 1 is summarized as follows:
Algorithm 1 $1 \,|\, \mathrm{CONW}, p_j = b_j t_j, q_{psd} \,|\, \sum_{i=1}^{n}[aE_{[i]} + cT_{[i]} + ed + fD]$
Input: 
$t_0$, $a$, $c$, $e$, $f$, $b_j$, $r$
Output: 
The optimal sequence, $d$, $d'$
1: First step: Sort the jobs in non-decreasing order of deterioration rate, $b_{[1]} \le \cdots \le b_{[n]}$.
2: Second step: Determine $h = \lceil n(f-e)/a \rceil$ and $l = \lceil n(c-f)/c \rceil$, and set $d = C_{[h]}$, $d' = C_{[l]}$.
3: Third step: Determine the set $\Omega_3$, which contains the $l-h-1$ jobs with the smallest deterioration rates, i.e., $b_{[1]}, \ldots, b_{[l-h-1]}$.
4: Last step: Determine the jobs of the sets $\Omega_1 \cup \Omega_2$ and $\Omega_4 \cup \Omega_5$ by the sign of $\pi_{uv}$.
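As a hedged sketch of how Algorithm 1 might be coded (our naming throughout), the first three steps translate directly; in place of the $\pi_{uv}$ rule of the last step, we substitute an exhaustive assignment of the remaining jobs, which is only practical for small $n$ but reproduces the optimal sequence of Example 1 below:

```python
# Hedged sketch of Algorithm 1 (our code, not the paper's).  Assumes h < l.
import math
from itertools import permutations

def completion_times(b, t0, r):
    # C_[i] = w + p + r*w, where w is the start time and p = b*w
    C, w = [], t0
    for bi in b:
        p = bi * w
        C.append(w + p + r * w)
        w += p
    return C

def conw_cost(b, t0, r, a, c, e, f, h, l):
    # cost with the due window fixed at d = C_[h], d' = C_[l] (Lemma 3)
    C = completion_times(b, t0, r)
    d, dp = C[h - 1], C[l - 1]
    n = len(b)
    return sum(a * max(0.0, d - Ci) + c * max(0.0, Ci - dp) for Ci in C) \
        + n * (e * d + f * (dp - d))

def algorithm1(b, t0, r, a, c, e, f):
    n = len(b)
    bs = sorted(b)                                  # first step
    h = math.ceil(n * (f - e) / a)                  # second step
    l = math.ceil(n * (c - f) / c)
    middle = bs[:l - h - 1]                         # third step: Omega_3
    rest = bs[l - h - 1:]
    best = None
    for perm in set(permutations(rest)):            # last step, simplified
        seq = list(perm[:h]) + middle + list(perm[h:])
        cost = conw_cost(seq, t0, r, a, c, e, f, h, l)
        if best is None or cost < best[0]:
            best = (cost, seq)
    return best
```

For instance, `algorithm1([2, 0.3, 1, 0.7], 1.0, 0.1, 4, 5, 1, 2)` returns the sequence with deterioration rates $(1, 0.3, 0.7, 2)$, i.e., $J_3, J_2, J_4, J_1$, in agreement with Example 1.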
Theorem 1.
For the problem $1 \,|\, \mathrm{CONW}, p_i = b_i t_i, q_{psd} \,|\, \sum_{i=1}^{n}[aE_{[i]} + cT_{[i]} + ed + fD]$, the complexity of the algorithm is $O(n \log n)$.
Proof. 
The first step requires $O(n \log n)$ time. The second and third steps are completed in constant time. The last step requires $O(n)$ time. So the complexity of the algorithm is $O(n \log n)$. □
Example 1.
There are 4 jobs processed in sequence on a single machine: $t_0 = 1$, $r = 0.1$, $b_1 = 2$, $b_2 = 0.3$, $b_3 = 1$, $b_4 = 0.7$, $a = 4$, $c = 5$, $e = 1$, $f = 2$.
Step 1. Since $b_2 \le b_4 \le b_3 \le b_1$, the sorted order is $J_2, J_4, J_3, J_1$.
Step 2. Calculate $h = \lceil n(f-e)/a \rceil = 1$ and $l = \lceil n(c-f)/c \rceil = 3$, so $d = C_{[1]}$ and $d' = C_{[3]}$.
Step 3. $J_2$ is contained in the set $\Omega_3$ and is placed at the second position.
Step 4. (a) When $u = 1$ and $v = 3$, $\pi_{[1][3]} = -3.61 < 0$, so $J_4$ is placed at the third position;
(b) When $u = 1$ and $v = 4$, $\pi_{[1][4]} = 4.125 > 0$, so $J_3$ is placed at the first position.
Therefore, the optimal sequence is $J_3, J_2, J_4, J_1$.
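The result of Example 1 can be cross-checked by brute force over all $4! = 24$ sequences, fixing $d = C_{[1]}$ and $d' = C_{[3]}$ as given by Lemma 3 (a small illustrative script of ours, not from the paper):

```python
# Independent brute-force check of Example 1.
from itertools import permutations

def conw_cost(b, t0, r, a, c, e, f, h, l):
    C, w = [], t0
    for bi in b:                      # C_[i] = w + p + r*w
        p = bi * w
        C.append(w + p + r * w)
        w += p
    d, dp = C[h - 1], C[l - 1]        # d = C_[h], d' = C_[l]
    n = len(b)
    return sum(a * max(0.0, d - Ci) + c * max(0.0, Ci - dp) for Ci in C) \
        + n * (e * d + f * (dp - d))

jobs = {'J1': 2, 'J2': 0.3, 'J3': 1, 'J4': 0.7}
best = min(permutations(jobs),
           key=lambda s: conw_cost([jobs[j] for j in s],
                                   1.0, 0.1, 4, 5, 1, 2, 1, 3))
# best == ('J3', 'J2', 'J4', 'J1'), with objective value 74.15
```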

4.2. The Problem $1 \,|\, \mathrm{SLKW}, p_i = b_i t_i, q_{psd} \,|\, \sum_{i=1}^{n}[aE_{[i]} + cT_{[i]} + eq_1 + fD]$

4.2.1. Optimal Properties of the Problem

Lemma 10.
For any job sequence, $q_1$ of the optimal schedule is $(1+r)$ times the sum of the actual processing times of some jobs, i.e., $q_1 = (1+r)\sum_{i=0}^{h-1} p_{[i]}$ for some $h$, where $p_{[0]} = t_0$.
Proof. 
(a) When $(1+r)\sum_{i=0}^{h-1}p_{[i]} < q_1 < (1+r)\sum_{i=0}^{h}p_{[i]}$ and $q_2 = (1+r)\sum_{i=0}^{l-1}p_{[i]}$, $1 \le h < l \le n$, the objective function is
$M = -a(1+r)\sum_{i=1}^{h}w_{[i]} + c(1+r)\sum_{i=l+1}^{n}w_{[i]} + [ha + n(e-f)]q_1 + [nf-(n-l)c](1+r)\sum_{i=0}^{l-1}p_{[i]}$.
When $q_1 = (1+r)\sum_{i=0}^{h-1}p_{[i]}$, the objective function is
$M_1 = -a(1+r)\sum_{i=1}^{h-1}w_{[i]} + c(1+r)\sum_{i=l+1}^{n}w_{[i]} + [(h-1)a + n(e-f)](1+r)\sum_{i=0}^{h-1}p_{[i]} + [nf-(n-l)c](1+r)\sum_{i=0}^{l-1}p_{[i]}$,
$M - M_1 = [ha + n(e-f)]\big[q_1 - (1+r)\sum_{i=0}^{h-1}p_{[i]}\big]$.
When $q_1 = (1+r)\sum_{i=0}^{h}p_{[i]}$, the objective function is
$M_2 = -a(1+r)\sum_{i=1}^{h+1}w_{[i]} + c(1+r)\sum_{i=l+1}^{n}w_{[i]} + [(h+1)a + n(e-f)](1+r)\sum_{i=0}^{h}p_{[i]} + [nf-(n-l)c](1+r)\sum_{i=0}^{l-1}p_{[i]}$,
$M - M_2 = [ha + n(e-f)]\big[q_1 - (1+r)\sum_{i=0}^{h}p_{[i]}\big]$.
When $ha + n(e-f) < 0$, $M > M_2$; otherwise, $M \ge M_1$.
(b) When $(1+r)\sum_{i=0}^{h-1}p_{[i]} < q_1 < (1+r)\sum_{i=0}^{h}p_{[i]}$ and $(1+r)\sum_{i=0}^{l-1}p_{[i]} < q_2 < (1+r)\sum_{i=0}^{l}p_{[i]}$, $1 \le h \le l \le n$, the objective function is
$M = -a(1+r)\sum_{i=1}^{h}w_{[i]} + c(1+r)\sum_{i=l+1}^{n}w_{[i]} + [ha + n(e-f)]q_1 + [nf-(n-l)c]q_2$.
When $q_1 = (1+r)\sum_{i=0}^{h-1}p_{[i]}$, the objective function is
$M_3 = -a(1+r)\sum_{i=1}^{h-1}w_{[i]} + c(1+r)\sum_{i=l+1}^{n}w_{[i]} + [(h-1)a + n(e-f)](1+r)\sum_{i=0}^{h-1}p_{[i]} + [nf-(n-l)c]q_2$,
$M - M_3 = [ha + n(e-f)]\big[q_1 - (1+r)\sum_{i=0}^{h-1}p_{[i]}\big]$.
When $q_1 = (1+r)\sum_{i=0}^{h}p_{[i]}$, the objective function is
$M_4 = -a(1+r)\sum_{i=1}^{h}w_{[i]} + c(1+r)\sum_{i=l+1}^{n}w_{[i]} + [ha + n(e-f)](1+r)\sum_{i=0}^{h}p_{[i]} + [nf-(n-l)c]q_2$,
$M - M_4 = [ha + n(e-f)]\big[q_1 - (1+r)\sum_{i=0}^{h}p_{[i]}\big]$.
When $ha + n(e-f) < 0$, $M > M_4$; otherwise, $M \ge M_3$. □
Lemma 11.
For any job sequence, $q_2$ of the optimal schedule is $(1+r)$ times the sum of the actual processing times of some jobs, i.e., $q_2 = (1+r)\sum_{i=0}^{l-1} p_{[i]}$ for some $l$, where $p_{[0]} = t_0$.
Proof. 
(a) When $q_1 = (1+r)\sum_{i=0}^{h-1}p_{[i]}$ and $(1+r)\sum_{i=0}^{l-1}p_{[i]} < q_2 < (1+r)\sum_{i=0}^{l}p_{[i]}$, $1 \le h \le l \le n$, the objective function is
$M = -a(1+r)\sum_{i=1}^{h-1}w_{[i]} + c(1+r)\sum_{i=l+1}^{n}w_{[i]} + [(h-1)a + n(e-f)](1+r)\sum_{i=0}^{h-1}p_{[i]} + [nf-(n-l)c]q_2$.
When $q_2 = (1+r)\sum_{i=0}^{l-1}p_{[i]}$, the objective function is
$M_1 = -a(1+r)\sum_{i=1}^{h-1}w_{[i]} + c(1+r)\sum_{i=l+1}^{n}w_{[i]} + [(h-1)a + n(e-f)](1+r)\sum_{i=0}^{h-1}p_{[i]} + [nf-(n-l)c](1+r)\sum_{i=0}^{l-1}p_{[i]}$,
$M - M_1 = [nf-(n-l)c]\big[q_2 - (1+r)\sum_{i=0}^{l-1}p_{[i]}\big]$.
When $q_2 = (1+r)\sum_{i=0}^{l}p_{[i]}$, the objective function is
$M_2 = -a(1+r)\sum_{i=1}^{h-1}w_{[i]} + c(1+r)\sum_{i=l+2}^{n}w_{[i]} + [(h-1)a + n(e-f)](1+r)\sum_{i=0}^{h-1}p_{[i]} + [nf-(n-l-1)c](1+r)\sum_{i=0}^{l}p_{[i]}$,
$M - M_2 = [nf-(n-l)c]\big[q_2 - (1+r)\sum_{i=0}^{l}p_{[i]}\big]$.
When $nf - (n-l)c < 0$, $M > M_2$; otherwise, $M \ge M_1$.
(b) When $(1+r)\sum_{i=0}^{h-1}p_{[i]} < q_1 < (1+r)\sum_{i=0}^{h}p_{[i]}$ and $(1+r)\sum_{i=0}^{l-1}p_{[i]} < q_2 < (1+r)\sum_{i=0}^{l}p_{[i]}$, $1 \le h \le l \le n$, the objective function is
$M = -a(1+r)\sum_{i=1}^{h}w_{[i]} + c(1+r)\sum_{i=l+1}^{n}w_{[i]} + [ha + n(e-f)]q_1 + [nf-(n-l)c]q_2$.
When $q_2 = (1+r)\sum_{i=0}^{l-1}p_{[i]}$, the objective function is
$M_3 = -a(1+r)\sum_{i=1}^{h}w_{[i]} + c(1+r)\sum_{i=l+1}^{n}w_{[i]} + [ha + n(e-f)]q_1 + [nf-(n-l)c](1+r)\sum_{i=0}^{l-1}p_{[i]}$,
$M - M_3 = [nf-(n-l)c]\big[q_2 - (1+r)\sum_{i=0}^{l-1}p_{[i]}\big]$.
When $q_2 = (1+r)\sum_{i=0}^{l}p_{[i]}$, the objective function is
$M_4 = -a(1+r)\sum_{i=1}^{h}w_{[i]} + c(1+r)\sum_{i=l+2}^{n}w_{[i]} + [ha + n(e-f)]q_1 + [nf-(n-l-1)c](1+r)\sum_{i=0}^{l}p_{[i]}$,
$M - M_4 = [nf-(n-l)c]\big[q_2 - (1+r)\sum_{i=0}^{l}p_{[i]}\big]$.
When $nf - (n-l)c < 0$, $M > M_4$; otherwise, $M \ge M_3$. □
Lemma 12.
For the optimal schedule, $q_1 = (1+r)\sum_{i=0}^{h-1}p_{[i]}$ and $q_2 = (1+r)\sum_{i=0}^{l-1}p_{[i]}$, where $h = \lceil n(f-e)/a \rceil$ and $l = \lceil n(c-f)/c \rceil$.
Proof. 
When $q_1 = (1+r)\sum_{i=0}^{h-1}p_{[i]}$ and $q_2 = (1+r)\sum_{i=0}^{l-1}p_{[i]}$ for the optimal schedule, the objective function is
$M = -a(1+r)\sum_{i=1}^{h-1}w_{[i]} + c(1+r)\sum_{i=l+1}^{n}w_{[i]} + [(h-1)a + n(e-f)](1+r)\sum_{i=0}^{h-1}p_{[i]} + [nf-(n-l)c](1+r)\sum_{i=0}^{l-1}p_{[i]}$.
(a) When $q_1 = (1+r)\sum_{i=0}^{h-2}p_{[i]}$ and $q_2 = (1+r)\sum_{i=0}^{l-1}p_{[i]}$, the objective function is
$M_1 = -a(1+r)\sum_{i=1}^{h-2}w_{[i]} + c(1+r)\sum_{i=l+1}^{n}w_{[i]} + [(h-2)a + n(e-f)](1+r)\sum_{i=0}^{h-2}p_{[i]} + [nf-(n-l)c](1+r)\sum_{i=0}^{l-1}p_{[i]}$,
$M - M_1 = [(h-1)a + n(e-f)](1+r)p_{[h-1]} \le 0$,
so $h \le 1 + n(f-e)/a$.
When $q_1 = (1+r)\sum_{i=0}^{h}p_{[i]}$ and $q_2 = (1+r)\sum_{i=0}^{l-1}p_{[i]}$, the objective function is
$M_2 = -a(1+r)\sum_{i=1}^{h}w_{[i]} + c(1+r)\sum_{i=l+1}^{n}w_{[i]} + [ha + n(e-f)](1+r)\sum_{i=0}^{h}p_{[i]} + [nf-(n-l)c](1+r)\sum_{i=0}^{l-1}p_{[i]}$,
$M - M_2 = -[ha + n(e-f)](1+r)p_{[h]} \le 0$,
so $n(f-e)/a \le h \le 1 + n(f-e)/a$ and $h = \lceil n(f-e)/a \rceil$.
(b) When $q_2 = (1+r)\sum_{i=0}^{l-2}p_{[i]}$ and $q_1 = (1+r)\sum_{i=0}^{h-1}p_{[i]}$, the objective function is
$M_3 = -a(1+r)\sum_{i=1}^{h-1}w_{[i]} + c(1+r)\sum_{i=l}^{n}w_{[i]} + [(h-1)a + n(e-f)](1+r)\sum_{i=0}^{h-1}p_{[i]} + [nf-(n-l+1)c](1+r)\sum_{i=0}^{l-2}p_{[i]}$,
$M - M_3 = [nf-(n-l+1)c](1+r)p_{[l-1]} \le 0$,
so $l \le 1 + n(c-f)/c$.
When $q_2 = (1+r)\sum_{i=0}^{l}p_{[i]}$ and $q_1 = (1+r)\sum_{i=0}^{h-1}p_{[i]}$, the objective function is
$M_4 = -a(1+r)\sum_{i=1}^{h-1}w_{[i]} + c(1+r)\sum_{i=l+2}^{n}w_{[i]} + [(h-1)a + n(e-f)](1+r)\sum_{i=0}^{h-1}p_{[i]} + [nf-(n-l-1)c](1+r)\sum_{i=0}^{l}p_{[i]}$,
$M - M_4 = -[nf-(n-l)c](1+r)p_{[l]} \le 0$,
so $n(c-f)/c \le l \le 1 + n(c-f)/c$ and $l = \lceil n(c-f)/c \rceil$. □
Suppose $q_1 = (1+r)\sum_{i=0}^{h-1}p_{[i]}$ and $q_2 = (1+r)\sum_{i=0}^{l-1}p_{[i]}$ for the optimal schedule. Partition the job sequence $\Omega$ into three sets: $\Omega_1 = \{J_{[i]} \in \Omega \mid 1 \le i \le h-1\}$, $\Omega_2 = \{J_{[i]} \in \Omega \mid h \le i \le l-1\}$ and $\Omega_3 = \{J_{[i]} \in \Omega \mid l \le i \le n\}$.
Lemma 13.
In the optimal schedule, the jobs in $\Omega_1$ can be processed in descending order of $b_j$.
Proof. 
Suppose $J_g$ is at the $u$th position and $J_k$ is at the $(u+1)$th position in $\Omega_1$, $\pi_1 = \{J_1, \ldots, J_g, J_k, \ldots, J_n\}$, $1 \le u < u+1 \le h-1$. The objective function of $\pi_1$ is $M_1$. Swap $J_g$ and $J_k$ to get the sequence $\pi_2 = \{J_1, \ldots, J_k, J_g, \ldots, J_n\}$, whose objective function is $M_2$. Then
$M_1 - M_2 = -a(1+r)t_0(b_g - b_k)\prod_{i=1}^{u-1}(1+b_{[i]})$.
If $b_g > b_k$, then $M_1 < M_2$. □
Lemma 14.
In the optimal schedule, the jobs in $\Omega_2$ can be processed in any order of $b_j$.
Proof. 
Suppose $J_g$ is at the $u$th position and $J_k$ is at the $(u+1)$th position in $\Omega_2$, $\pi_1 = \{J_1, \ldots, J_g, J_k, \ldots, J_n\}$, $h \le u < u+1 \le l-1$. The objective function of $\pi_1$ is $M_1$. Swap $J_g$ and $J_k$ to get the sequence $\pi_2 = \{J_1, \ldots, J_k, J_g, \ldots, J_n\}$, whose objective function is $M_2$. The swap changes neither earliness nor tardiness, so
$M_1 = M_2$. □
Lemma 15.
In the optimal schedule, the jobs in $\Omega_3$ can be processed in ascending order of $b_j$.
Proof. 
Suppose $J_g$ is at the $u$th position and $J_k$ is at the $(u+1)$th position in $\Omega_3$, $\pi_1 = \{J_1, \ldots, J_g, J_k, \ldots, J_n\}$, $l \le u < u+1 \le n$. The objective function of $\pi_1$ is $M_1$. Swap $J_g$ and $J_k$ to get the sequence $\pi_2 = \{J_1, \ldots, J_k, J_g, \ldots, J_n\}$, whose objective function is $M_2$. Then
$M_1 - M_2 = c(1+r)t_0(b_g - b_k)\prod_{i=1}^{u-1}(1+b_{[i]})$.
If $b_g < b_k$, then $M_1 < M_2$. □
Lemma 16.
The deterioration rate of any job in $\Omega_2$ is less than the deterioration rate of any job in $\Omega_1$.
Proof. 
Suppose $J_g$ is at the $(h-1)$th position and $J_k$ is at the $h$th position, $\pi_1 = \{J_1, \ldots, J_g, J_k, \ldots, J_n\}$. The objective function of $\pi_1$ is $M_1$. Swap $J_g$ and $J_k$ to get the sequence $\pi_2 = \{J_1, \ldots, J_k, J_g, \ldots, J_n\}$, whose objective function is $M_2$. Then
$M_1 - M_2 = [(h-1)a + n(e-f)](1+r)t_0(b_g - b_k)\prod_{i=1}^{h-2}(1+b_{[i]})$.
If $b_k < b_g$, then $M_1 < M_2$. □
Lemma 17.
The deterioration rate of any job in $\Omega_2$ is less than the deterioration rate of any job in $\Omega_3$.
Proof. 
Suppose $J_g$ is at the $(l-1)$th position and $J_k$ is at the $l$th position, $\pi_1 = \{J_1, \ldots, J_g, J_k, \ldots, J_n\}$. The objective function of $\pi_1$ is $M_1$. Swap $J_g$ and $J_k$ to get the sequence $\pi_2 = \{J_1, \ldots, J_k, J_g, \ldots, J_n\}$, whose objective function is $M_2$. Then
$M_1 - M_2 = [nf - (n-l)c](1+r)t_0(b_g - b_k)\prod_{i=1}^{l-2}(1+b_{[i]})$.
If $b_g < b_k$, then $M_1 < M_2$. □
Suppose $J_g$ is at the $u$th position and $J_k$ is at the $v$th position in the sequence $\pi_1 = \{J_1, \ldots, J_g, \ldots, J_k, \ldots, J_n\}$, $1 \le u \le h-1$, $l \le v \le n$. The objective function of $\pi_1$ is $M_1$. Swap $J_g$ and $J_k$ to get the sequence $\pi_2 = \{J_1, \ldots, J_k, \ldots, J_g, \ldots, J_n\}$, whose objective function is $M_2$. Then
$M_1 - M_2 = t_0(1+r)(b_g - b_k)\prod_{i=1}^{u-1}(1+b_{[i]})\,\big\{-a - a\sum_{i=u+1}^{h-2}\prod_{m=u+1}^{i}(1+b_{[m]}) + c\sum_{i=l}^{v-1}\prod_{m=u+1}^{i}(1+b_{[m]}) + [nf-(n-l)c]\prod_{i=u+1}^{l-1}(1+b_{[i]}) + [(h-1)a + n(e-f)]\prod_{i=u+1}^{h-1}(1+b_{[i]})\big\}$.
Define
$\pi_{uv} = -a - a\sum_{i=u+1}^{h-2}\prod_{m=u+1}^{i}(1+b_{[m]}) + [(h-1)a + n(e-f)]\prod_{i=u+1}^{h-1}(1+b_{[i]}) + c\sum_{i=l}^{v-1}\prod_{m=u+1}^{i}(1+b_{[m]}) + [nf-(n-l)c]\prod_{i=u+1}^{l-1}(1+b_{[i]})$.
If $\pi_{uv} > 0$, $J_g$ should be at the $u$th position; otherwise, $J_g$ should be at the $v$th position.

4.2.2. Optimal Algorithm

Algorithm 2 is summarized as follows:
Algorithm 2 $1 \,|\, \mathrm{SLKW}, p_i = b_i t_i, q_{psd} \,|\, \sum_{i=1}^{n}[aE_{[i]} + cT_{[i]} + eq_1 + fD]$
Input: 
$t_0$, $a$, $c$, $e$, $f$, $b_j$, $r$
Output: 
The optimal sequence, $q_1$, $q_2$
1: First step: Sort the jobs in non-decreasing order of deterioration rate, $b_{[1]} \le \cdots \le b_{[n]}$.
2: Second step: When $c \le f$ and $f \le e$, set $q_1 = q_2 = t_0$, $\Omega_1 = \Omega_2 = \emptyset$ and $\Omega_3 = \{J_{[1]}, \ldots, J_{[n]}\}$.
3: Third step: When $c > f$ and $f \le e$, set $q_1 = t_0$ and $q_2 = (1+r)\sum_{i=0}^{l-1}p_{[i]}$ with $l = \lceil n(c-f)/c \rceil$, $\Omega_1 = \emptyset$, $\Omega_2 = \{J_{[1]}, \ldots, J_{[l]}\}$ and $\Omega_3 = \{J_{[l+1]}, \ldots, J_{[n]}\}$.
4: Fourth step: Otherwise, determine $h = \lceil n(f-e)/a \rceil$ and $l = \lceil n(c-f)/c \rceil$, and set $q_1 = (1+r)\sum_{i=0}^{h-1}p_{[i]}$ and $q_2 = (1+r)\sum_{i=0}^{l-1}p_{[i]}$.
5: Last step: Set $\Omega_2 = \{J_{[1]}, \ldots, J_{[l-h]}\}$. Determine the jobs of the sets $\Omega_1$ and $\Omega_3$ by the sign of $\pi_{uv}$.
Theorem 2.
For the problem $1 \,|\, \mathrm{SLKW}, p_i = b_i t_i, q_{psd} \,|\, \sum_{i=1}^{n}[aE_{[i]} + cT_{[i]} + eq_1 + fD]$, the complexity of the algorithm is $O(n \log n)$.
Proof. 
The first step requires $O(n \log n)$ time. The second, third and fourth steps are completed in constant time. The last step requires $O(n)$ time. So the complexity of the algorithm is $O(n \log n)$. □
Example 2.
There are 4 jobs processed in sequence on a single machine: $t_0 = 1$, $r = 0.1$, $b_1 = 2$, $b_2 = 0.3$, $b_3 = 1$, $b_4 = 0.7$, $a = 4$, $c = 5$, $e = 1$, $f = 2$.
Step 1. Since $b_2 \le b_4 \le b_3 \le b_1$, the sorted order is $J_2, J_4, J_3, J_1$.
Step 2. Calculate $h = \lceil n(f-e)/a \rceil = 1$ and $l = \lceil n(c-f)/c \rceil = 3$, so $q_1 = (1+r)t_0$ and $q_2 = (1+r)w_{[3]}$.
Step 3. $J_2$ and $J_4$ are contained in the set $\Omega_2$, at the first and second positions.
Step 4. $J_3$ and $J_1$ are contained in the set $\Omega_3$, at the third and fourth positions; $\Omega_1 = \emptyset$.
Therefore, the optimal sequence is $J_2, J_4, J_3, J_1$.
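The result of Example 2 can likewise be cross-checked by brute force (our illustrative code, not the paper's): with $q_1 = (1+r)w_{[1]}$ and $q_2 = (1+r)w_{[3]}$ fixed by Lemma 12, the sequence $J_2, J_4, J_3, J_1$ attains the minimum cost over all $4!$ sequences. Ties are possible, since Lemma 14 allows any order inside $\Omega_2$.

```python
# Brute-force check of Example 2 for the SLKW model.
from itertools import permutations

def slkw_cost(b, t0, r, a, c, e, f, h, l):
    ws, ps, C, w = [], [], [], t0
    for bi in b:
        ws.append(w)                  # w_[i], the start time
        p = bi * w
        ps.append(p)
        C.append(w + p + r * w)
        w += p
    q1 = (1 + r) * ws[h - 1]          # (1+r) * sum_{i=0}^{h-1} p_[i]
    q2 = (1 + r) * ws[l - 1]
    n = len(b)
    total = n * (e * q1 + f * (q2 - q1))
    for p, Ci in zip(ps, C):          # slack window [p_i + q1, p_i + q2]
        total += a * max(0.0, p + q1 - Ci) + c * max(0.0, Ci - (p + q2))
    return total

data = {'J1': 2, 'J2': 0.3, 'J3': 1, 'J4': 0.7}
best_val = min(slkw_cost([data[j] for j in s], 1.0, 0.1, 4, 5, 1, 2, 1, 3)
               for s in permutations(data))
claimed = slkw_cost([0.3, 0.7, 1, 2], 1.0, 0.1, 4, 5, 1, 2, 1, 3)
# claimed ≈ 27.203, and it equals best_val
```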

4.3. Discussion

This paper studied scheduling problems involving both simple linear deterioration and past-sequence-dependent delivery times under the common and slack due window assignments. To the best of our knowledge, such results are rare for the common or slack due window assignment. The small-perturbation technique and the adjacent-exchange technique are effective methods for dealing with scheduling problems: based on them, we obtained structural properties of the problems, and from these properties we proposed polynomial-time algorithms to obtain the optimal sequence. However, not all scheduling problems can be solved by these techniques; specific problems need to be analyzed in detail. In future research, it is worth investigating multi-machine scheduling problems.

5. Conclusions

The single machine scheduling problem with deteriorating jobs and delivery times was considered under due window assignment. The goal was to minimize the window location, window size, earliness and tardiness. Based on the small-perturbation technique and the adjacent-exchange technique, we obtained structural properties of the problems; however, not all scheduling problems can be solved by these techniques, and specific problems need to be analyzed in detail. For the common and slack due window assignments, we proved that the two objective functions are polynomially solvable in $O(n \log n)$ time, and we proposed the corresponding algorithms to obtain the optimal sequence, window location and window size. In the future, the research can be extended to multi-machine environments, i.e., parallel machines and flow shop settings. More general deterioration models could also be considered for single machine or multi-machine scheduling.

Author Contributions

Conceptualization, J.Q.; methodology, J.Q.; validation, Y.Z.; investigation, Y.Z.; writing—original draft preparation, J.Q.; writing—review and editing, Y.Z.; supervision, Y.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Fundamental Research Funds for the Central Universities, grant numbers N2105020 (J.Q.) and N2105021 (Y.Z.), and by the Natural Science Foundation of Liaoning Province Project, grant number 2021-MS-102 (Y.Z.).

Data Availability Statement

Not applicable.

Acknowledgments

We thank the anonymous reviewers for their helpful comments and insights, which significantly improved our paper.

Conflicts of Interest

The authors declare no conflict of interest.


