Article

Analysis and Control of Partially Observed Discrete-Event Systems via Positively Constructed Formulas

Matrosov Institute for System Dynamics and Control Theory of the Siberian Branch of the Russian Academy of Sciences, 134 Lermontov St., 664033 Irkutsk, Russia
*
Author to whom correspondence should be addressed.
Computation 2024, 12(5), 95; https://doi.org/10.3390/computation12050095
Submission received: 30 November 2023 / Revised: 27 April 2024 / Accepted: 6 May 2024 / Published: 9 May 2024

Abstract: This paper establishes a connection between control theory for partially observed discrete-event systems (DESs) and automated theorem proving (ATP) in the calculus of positively constructed formulas (PCFs). The language of PCFs is a complete first-order language providing a powerful tool for qualitative analysis of dynamical systems. Based on ATP in the PCF calculus, a new technique is suggested for checking observability as a property of formal languages, which is necessary for the existence of supervisory control of DESs. In the case of violation of observability, words causing a conflict can also be extracted with the help of a specially designed PCF. With an example of the problem of path planning by a robot in an unknown environment, we show the application of our approach at one of the levels of a robot control system. The prover Bootfrost, developed to facilitate PCF refutation, is also presented. Tests show positive results and good prospects for the presented approach.

1. Introduction

The class of discrete-event systems (DESs) is a widely used modeling formalism for a large variety of man-made complex objects [1,2,3]. Logical DESs, as a subclass of DESs, represent system evolution in terms of states changing in response to the occurrence of some events at non-predetermined time instants. A logical DES is commonly represented as a finite-state automaton, the transitions of which from state to state are labeled with the letters of some finite alphabet and correspond to events occurring in the system. Sequences of such transitions form the words of a regular language that describes the behavior of the system from the high-level, or symbolic, point of view. Consequently, system properties may be described as statements over these formal expressions.
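Such a generator can be sketched concretely. The following minimal Python fragment, with a hypothetical two-state machine of our own, represents the transition structure as a partial map from (state, event) pairs to states and tests membership of a word in the generated language:

```python
# Hypothetical two-state DES: event "a" moves 0 -> 1, event "b" moves 1 -> 0.
delta = {(0, "a"): 1, (1, "b"): 0}

def generates(delta, q0, word):
    """Return True if the word (a sequence of events) is in L(G)."""
    q = q0
    for event in word:
        if (q, event) not in delta:   # transition undefined: word not generated
            return False
        q = delta[q, event]
    return True

print(generates(delta, 0, ("a", "b", "a")))  # -> True
print(generates(delta, 0, ("b",)))           # -> False
```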
One of the most popular ways to deal with logical DESs containing events that may be switched off is the supervisory control theory (SCT) [4]. SCT was developed as a tool for restricting DES behavior according to a set of constraints defined by some specification. For example, logical DESs are extensively exploited nowadays in mobile robots and robot group control, e.g., [5,6,7]. A detailed description of SCT is presented in, e.g., [1,8,9].
It is well known that formal logic aims at dealing with symbolic structures, such as formal languages, by formulating their properties as theorems that must be proven. Thus, studying logical DESs is naturally embraced by the paradigm of automated theorem proving (ATP). ATP represents an implementation of a natural human reasoning process with the help of formal logic and special computer programs called provers. Developed to help mathematicians in producing and verifying formal mathematical proofs, it is nowadays useful in program analysis, system verification, etc., providing results in a wide range of areas. The latest examples include proving the 400-year-old Kepler conjecture on sphere packing [10] and the correctness of the seL4 operating system kernel [11], not to mention the earlier proof of the four-color theorem in graph theory [12] and the verified C compiler CompCert [13]. A modern domain of ATP application is robotics, where it helps mostly in planning [14] and decision making [15]. For example, in [16], for planning and control in swarm robotics, the PDDL language is used, which is based on the classical STRIPS-style ATP. In [17], theorem proving is applied for verification of the framework for modeling the controllers of autonomous robots, combined with the automatic generation of C++ code.
We suggest a new way to study and design logical DESs that is based on ATP in the calculus of positively constructed formulas (PCFs). PCFs are first-order formulas that do not contain the negation symbol in their syntax [18,19]. The language of PCFs is a complete first-order language, providing a powerful tool for qualitative analysis of dynamical and intelligent systems. Its applications include telescope orientation [20], elevator group control [21], pursuing goals [22], and achieving targets [23]. To facilitate the inference search in the PCF language, the prover Bootfrost has been developed [24]. The most important features of the PCF calculus and its prover implementation are the following:
  • Large block data structures for representing formulas and inference rules;
  • No need to eliminate existential quantifiers via a Skolemization procedure, which decreases the complexity of the inference;
  • Compatibility with application-specific heuristics and general logical inference control heuristics;
  • Clarity of the logical inference, which helps to find formalization errors;
  • Support for the equality predicate;
  • Modifiability of semantics to support nonclassical logics.
In modern ATP, the wide usage of proof assistants, noted in [25], such as Isabelle/HOL in [10], HOL Light and Isabelle in [11], or Coq in [12], instead of fully automated provers, is caused by the fact that significant user input is often required to clarify a difficult situation that has stalled the inference. In addition to its other advantages, the prover Bootfrost combines the power of automated reasoning with the ability to implement user-designed strategies.
Our previous papers addressed how the basic problems of the supervisory control theory (SCT) [4] for logical DESs can be solved using the PCF-based technique. The issues considered so far include checking controllability to determine if a formal language restricting DES functioning may be guaranteed by a supervisor, construction of a supremal controllable sublanguage of a given specification language, and realization of a monolithic supervisor (e.g., [26]). This paper deals with partially observed DESs, in which the occurrence of some events is unavailable for observation. In this case, the property called observability of a formal language determines those specifications on DES functioning that may be ensured by supervisory control. Some effective tests for observability have already been suggested, e.g., in [27,28], where the former considers the observation function in the form of a mask, and the latter is based on the algebraic operations on processes that represent DESs. For regular languages, a fixed-point test for observability from [29] can be effectively implemented. The test employs the operator that is also associated with the algorithm for computing the infimal prefix-closed and observable superlanguage of a given language. It is used in [29] to find a solution for the supervisory control and observation problem in its general case, i.e., when a supervisor should provide a language lying in some predefined range of languages.
This paper extends the use of the PCF calculus and ATP in SCT problems. Namely, it will be shown how the polynomial-time algorithm for testing observability from [1] may be implemented with PCFs. If observability is violated, then the conflicting strings can also be found with the help of another PCF, also presented below. The main advantage of the presented PCF-based approach and the employment of the ATP technique is the declarative description of the algorithms used. In this case, the programmer only describes (declares) the properties of the required result, and the solution (method) is provided by the logical programming system as a result of searching for a logical inference of some goal statement. This is a step up from programming in imperative languages (e.g., C/C++, Java, etc.), as the programmer does not have to worry about the low-level details of the program.
There are some tools developed for the analysis and design of controlled DESs in the framework of SCT. Among them are TCT [30], DESUMA/UMDES [31,32], Supremica [33], and others. Having a user-friendly graphical interface and high-performance algorithms, Supremica supports extended finite-state machines where transitions are labeled with guards and actions in addition to events [34]. The guards and actions reference variables, which can be declared over finite integer ranges or as enumerated types. Although a PCF-based tool for SCT is a developing project, our approach suggests that both guards and actions may be expressed in the form of logical statements of any kind. Unlike the focus on large-scale industrial examples in [35,36,37], the usage of PCFs and the PCF calculus for DES control suggests exploiting logical tools for knowledge representation and processing.
The contribution of this work consists in presenting a theoretical base of the original logical approach to handling partially observed DESs. An ATP-based technique of DES specifications testing for observability is presented with the usage of PCF representation of logical DESs. This paper’s structure is the following. After the necessary preliminaries on the PCF calculus provided in the next section, in Section 3, we state the problem of the supervisory control of partially observed DESs and present a PCF for observability checking. Section 4 provides a PCF extracting conflicting strings. In Section 5, the PCF-based implementation of the supervisory control of DESs is described. Section 6 is dedicated to the prover Bootfrost and its main features. Section 7 provides a case study for the path-following problem for a mobile robot. In the conclusion, we discuss the presented approach and outline directions for our future work.

2. The PCF Calculus

The calculus of positively constructed formulas (PCFs) is based on the refutation of the negation of an original statement which is to be proved. The main idea is the following: if the negation of the statement is proved to be false, then the statement itself is true. The language of PCFs is a special variant of the language of first-order logic (FOL), which consists of first-order formulas (FOFs) built out of atomic formulas, or atoms, with the help of operators &, ∨, ¬, →, ↔, quantifier symbols ∀ and ∃, and constants True and False. We suppose the reader is familiar with the concepts of atom, literal, and term, understood in the usual sense.

2.1. The PCF Language

Let X = {x₁, …, x_k} be a set of variables, A = {A₁, …, A_m} a set of atomic formulas, and F = {F₁, …, F_n} a set of subformulas. Then, the formulas ∀x₁ … ∀x_k (A₁ & … & A_m → (F₁ ∨ … ∨ F_n)) and ∃x₁ … ∃x_k (A₁ & … & A_m & (F₁ & … & F_n)) are denoted as ∀X A : F and ∃X A : F, respectively, keeping in mind that the ∀–quantifier corresponds to the disjunction of all subformulas, and the ∃–quantifier corresponds to their conjunction. If F = ∅, then the above formulas turn into ∀X A : ∅ ≡ ∀X (A → False) and ∃X A : ∅ ≡ ∃X (A & True), since the empty disjunction is understood as False, whereas the empty conjunction is understood as True. Let ∀X A and ∃X A be abbreviations of such formulas. If X = ∅, then ∀A : F and ∃A : F are analogous abbreviations. The set of atoms A is called a conjunct. Variables from X, bound by the corresponding quantifiers, are called ∀–variables and ∃–variables, respectively. In ∃X A, a variable from X that does not appear in the conjunct A is called an unconfined variable. Note that ∀∅ : ∅ ≡ True → False ≡ False.
Definition 1 (Positively constructed formulas (PCF)). 
Let X be a set of variables and A a conjunct. Then, the following holds:
  • ∀X A and ∃X A are a ∀–PCF and an ∃–PCF, respectively.
  • If F = {F₁, …, F_n} is a set of ∃–PCFs, then ∀X A : F is a ∀–PCF.
  • If F = {F₁, …, F_n} is a set of ∀–PCFs, then ∃X A : F is an ∃–PCF.
  • Any ∀–PCF or ∃–PCF is a PCF.
A PCF starting with ∀ is called a PCF in the canonical form. Any PCF can be represented in the canonical form. If F is a noncanonical ∃–PCF, then ∀∅ : F is the canonical PCF, since ∀∅ : F ≡ True → F ≡ F. If F is a noncanonical ∀–PCF, then the canonical PCF is ∀∅ : {∃∅ : F} ≡ True → (True & F) ≡ F. Type quantifiers ∀∅ and ∃∅ are used to regularize PCFs, i.e., to transform them into the canonical form.
The term “positively” comes from the fact that, according to the definition, PCFs contain no negation operator (¬). This is also true for the semantics of the language; negation is “concealed” in the implications implicit after universal quantifiers. Note that any FOF can be represented as a PCF, since the PCF language is a special notation for classical FOFs, just like the conjunctive normal form, the disjunctive normal form, etc. The converting algorithm is presented in [38].
For easier reading, a PCF may be represented graphically as a tree structure. For example, consider a PCF representation of the FOF
F = ¬(∀x ∃y P(x, y) → ∃z P(z, z)).
An image F′ of F in the PCF language is F′ = ∀∅ : {∃∅ : {∀x : {∃y P(x, y)}, ∀z P(z, z) : {∃ False}}}. The tree-like form of the latter is
[Figure: tree representation of the PCF F′; image omitted.]
The root of a PCF tree is called a PCF root. Each PCF root’s child ∃X A is called a PCF base, the conjunct A is called a base of facts, and a PCF rooted from the base is called a base subformula. The PCF base children ∀Y B are called questions to the parent base. The subtrees of the questions are called consequents. If a question has no consequent, then the question is referred to as a goal question, and it is identical to False.
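This tree terminology can be mirrored in a small data structure. The following sketch is purely illustrative (the node layout and names are our own, not part of the PCF calculus or Bootfrost):

```python
from dataclasses import dataclass, field

@dataclass
class PCFNode:
    quant: str            # "E" for an existential node (base), "A" for a universal one (question)
    conjunct: frozenset   # base of facts (for a base) or premise (for a question)
    children: list = field(default_factory=list)   # consequents / questions

def is_goal_question(node):
    # a question with no consequents is identical to False
    return node.quant == "A" and not node.children

# a base {P(a)} carrying a single goal question with premise {P(a)}
base = PCFNode("E", frozenset({"P(a)"}), [PCFNode("A", frozenset({"P(a)"}))])
print(is_goal_question(base.children[0]))  # -> True
```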

2.2. The Inference Rule

The only axiom of the PCF calculus is ∀∅ : ∅, i.e., False. The inference rule ω of the PCF calculus is based on the search for so-called answering substitutions, i.e., substitutions of variables by terms that satisfy certain conditions.
Definition 2 (Answer). 
A question ∀Y D : Υ to a base ∃X A has an answer θ if and only if θ is a substitution Y → H ∪ X and Dθ ⊆ A, where H is the Herbrand universe based on the constant and function symbols that occur in the corresponding base subformulas.
Definition 3 (Splitting). 
Let B = ∃X A : Ψ and Q = ∀Y D : Υ, where Υ = {∃Z₁ C₁ : Γ₁, …, ∃Z_n C_n : Γ_n}; then split(B, Q) = {∃(X ∪ Z₁′) (A ∪ C₁′) : Ψ ∪ Γ₁′, …, ∃(X ∪ Z_n′) (A ∪ C_n′) : Ψ ∪ Γ_n′}, where ′ is a variable renaming operator. We say that B is split by Q. Obviously, split(B, ∀Y D) = split(B, ∀Y D : ∅) = ∅.
Definition 4 (Inference rule ω ). 
Consider some canonical PCF F = ∀∅ : Φ. Let there exist a question Q that has an answer θ to an appropriate base B ∈ Φ; then ωF = ∀∅ : (Φ ∖ {B}) ∪ split(B, Qθ).
If a question has an answer to its base, then the base subformula is split by this question. In the case of a goal question, we say that the base subformula is refuted, because split(B, ∀Y D) = ∅. The refuted base subformula B is removed from the set of base subformulas Φ, since (Φ ∖ {B}) ∪ ∅ = Φ ∖ {B}. When all base subformulas in Φ have been refuted, the formula F is also refuted, since ∀∅ : ∅ ≡ False.
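In the ground (variable-free) case, Definitions 2–4 reduce to simple set operations. The following Python sketch illustrates only that simplified case, not the full first-order rule:

```python
# Ground-case sketch: a question answers a base when its premise is contained
# in the base of facts; splitting extends the base with each consequent's facts.
def answers(premise, base_facts):
    return premise <= base_facts

def split(base_facts, consequents):
    # 'consequents' is a list of fact sets; a goal question has none,
    # so splitting by it yields the empty list of new bases (refutation)
    return [base_facts | c for c in consequents]

base = frozenset({"p", "q"})
print(answers(frozenset({"p"}), base))   # -> True
print(split(base, []))                   # -> []  (the base is refuted)
```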
Any finite sequence of PCFs F, ωF, ω²F, …, ωⁿF, where ωˢF = ω(ωˢ⁻¹F), ω¹ = ω, and ωⁿF = ∀∅ : ∅, is called an inference of F in the PCF calculus (with the axiom ∀∅ : ∅). An answer to the goal question refutes the corresponding base. When all bases of the PCF are refuted, the PCF F is reduced to ∀∅ : ∅, i.e., False. This means that F, as the negation of the statement under consideration, is unsatisfiable; therefore, the statement itself is true. Details on the PCF calculus may be found in [18,19,21,39]. In [19], the correctness and completeness of the PCF calculus with functional symbols were proven.

3. Checking Observability of a Regular Language

3.1. Partially Observable DESs

Consider a discrete event system (DES) in the form of a generator G = (Q, Σ, δ, q₀, Q_m) of a formal language [4]. Here, Q is the set of states q; Σ is the set of events; δ : Σ × Q → Q is the transition function; q₀ ∈ Q is the initial state; Q_m ⊆ Q is the set of marked states. Let L(G) be the language generated by G, and L_m(G) the language marked by G. The Ramadge–Wonham supervisory control framework assumes the existence of a means of control over G presented by a supervisor [4]. Let Σ_c be the controllable event set, Σ_uc = Σ ∖ Σ_c, Σ_c ∩ Σ_uc = ∅. The supervisor switches control patterns so that the supervised discrete event system achieves a control objective described by some regular language K.
Let G be partially observable, i.e., a set Σ_o of observable events is distinguished among all events, Σ_uo = Σ ∖ Σ_o, Σ_o ∩ Σ_uo = ∅. The observation function is usually defined as the natural projection P : Σ* → Σ_o*, which erases unobservable events: for s ∈ Σ*, P(sσ) = P(s)σ if σ ∈ Σ_o and P(sσ) = P(s) if σ ∈ Σ_uo. The supervisor only observes events from Σ_o and, based on this information, disables events in Σ_c. Denote by L(J/G) the language generated by the closed-loop behavior of the plant and the supervisor. In this paper, for simplicity of presentation, the marked language and related problems, such as the construction of nonblocking supervisory control, are not considered.
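The natural projection is straightforward to compute. A one-function Python sketch (the event names are illustrative):

```python
def natural_projection(s, observable):
    """Erase the unobservable events from the string s (a tuple of events)."""
    return tuple(e for e in s if e in observable)

# with observable events {a, b}, the unobservable u and v are erased
print(natural_projection(("u", "a", "v", "b"), {"a", "b"}))  # -> ('a', 'b')
```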
Supervisory control and observation problem (SCOP). Given a plant G over an alphabet Σ, languages L_A ⊆ L(G) and L_E ⊆ L(G), and sets Σ_o, Σ_c ⊆ Σ, construct a supervisor J for G such that L_A ⊆ L(J/G) ⊆ L_E.
The less complex problem consists in finding such control patterns that the language generated by the supervised system is equal to some desired language. Thus, the special case of SCOP is constructing a supervisor such that L(J/G) = K̄, where K is called a specification language. We refer to this problem as the basic SCOP (BSCOP). The notions of controllable and observable languages are essential in solving this problem. Let L̄ be the set of all strings that are prefixes of words of L, i.e., L̄ = {s | s ∈ Σ* and ∃t ∈ Σ* : s·t ∈ L}.
Definition 5. 
A language K is called controllable (with respect to L(G) and Σ_uc) if K̄Σ_uc ∩ L(G) ⊆ K̄. Here, K̄Σ_uc is a shortened expression denoting the concatenations of all the strings from K̄ with any of the symbols from the set Σ_uc.
Definition 6. 
The language K is observable (with respect to L(G) and P) if ∀s, t ∈ Σ*: P(s) = P(t) → (∀σ ∈ Σ)(sσ ∈ K̄ & tσ ∈ L(G) & t ∈ K̄ → tσ ∈ K̄).
If observability holds, no event needs to be enabled after one string and disabled after another, observationally identical, string in order to satisfy the specification K. The opposite situation is called a conflict.
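Definition 6 can be checked directly on small, explicitly listed language fragments. The following Python sketch is a brute-force illustration of the definition; the fragments mimic the uv-versus-ε conflict pattern discussed later in Example 1, but are our own toy data, not the automata of Figures 1 and 2:

```python
def P(s, observable):
    """Natural projection onto the observable events."""
    return tuple(e for e in s if e in observable)

def is_observable(K_bar, L, events, observable):
    """Brute-force check of Definition 6 on explicit prefix-closed sets."""
    for s in K_bar:
        for t in K_bar:                      # t in K_bar covers the 't in K_bar' premise
            if P(s, observable) != P(t, observable):
                continue
            for e in events:
                if s + (e,) in K_bar and t + (e,) in L and t + (e,) not in K_bar:
                    return False             # e must be enabled after s, disabled after t
    return True

# toy fragments: s = (u, v) and t = () share the empty projection,
# but only s may be extended by the event a within K_bar
L_G   = {(), ("u",), ("u", "v"), ("u", "v", "a"), ("a",)}
K_bar = {(), ("u",), ("u", "v"), ("u", "v", "a")}
print(is_observable(K_bar, L_G, {"u", "v", "a"}, {"a"}))  # -> False
```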
The supervisor existence criterion for BSCOP is as follows: given K ⊆ L(G), there exists a supervisor J such that L(J/G) = K̄ iff K is controllable and observable with respect to L(G) and P.
The PCF-based procedure for checking controllability was suggested in [26]. It also allows one to construct the supremal controllable sublanguage of an uncontrollable specification, which may then be chosen as a new specification and be ensured by a proper supervisor. In the rest of the paper, we show how PCFs allow testing a regular language for observability and implementing supervisory control.

3.2. Checking Observability via PCFs

Several algorithms exist to check whether a regular language K is observable. Among them is the algorithm from [1], which is polynomial with respect to the size of the automata generating K. The main idea of the algorithm is to construct an automaton for tracking two words s₁, s₂ of K̄ that have the same projection P(s₁) = P(s₂) but are such that s₁σ ∈ K̄ while s₂σ ∉ K̄. The strings s₁ and s₂ are then called conflicting because they demand different control actions from the supervisor. Let the regular language K be recognized by the finite-state automaton H. The algorithm from [1] suggests considering two copies of the automaton H and one copy of the automaton G and designing an automaton T with states of the form (h₁, h₂, q), where h₁ ∈ Q_H, h₂ ∈ Q_H, q ∈ Q_G, and a single additional state dead. Reachability of the state dead denotes the unobservability of K, because in this case there exist strings s_i ∈ K̄, i = 1, 2, 3, and some event σ such that s₂ = s₃ and P(s₁) = P(s₂), s₁σ ∈ K̄, s₃σ ∈ L(G), while s₂σ ∉ K̄, i.e., the observability condition is violated.
We prove that the above algorithm may be realized with the help of ATP in the PCF calculus. For this, some preliminary procedures are required; in particular, we determine, using logical inference only, which transitions are defined in each state of the automata involved, namely G and H. The following predicates are exploited in this procedure: a predicate Q_X(_) that corresponds to all states of the automaton X; a predicate E_X(_) that defines all events of the automaton X; and atoms δ_X(q1_i, σ_i, q2_i) that determine transitions from the state q1_i of automaton X to the state q2_i labeled with the event σ_i. Let the atom F_X(q_i, σ_j) mean that there is a transition from the state q_i of the automaton X labeled by the event σ_j, and let NoF_X(q_i, σ_j) mean the opposite, i.e., that there is no such transition. For an automaton X, consider the PCF F_PrepX (1) with the base B_PrepX = {Q_X(q_i), δ_X(q1_i, σ_i, q2_i), E_X(σ_i)}:
[Equation (1): the PCF F_PrepX; image omitted.]
During the inference search for F_PrepX, the following strategy is used: first, all possible answers to the first question are searched for, and thus atoms NoF_X(q_i, σ_j) are added into the base. Then, answers to the second question add the proper atoms F_X(q_i, σ_j) into the base while removing the respective atoms NoF_X(q_i, σ_j). The inference of the PCF F_PrepX for any finite automaton X ends due to the exhaustion of answering substitutions, since the sets used for searching for substitutions are finite.
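The net effect of this preprocessing can be written out in ordinary Python; the following is a conventional sketch of the result F_PrepX computes, not the PCF inference itself (state and event names are illustrative):

```python
# First mark every (state, event) pair as NoF (no transition defined),
# then move a pair to F wherever a transition exists.
def prep(states, events, delta):
    no_f = {(q, e) for q in states for e in events}   # first question
    f = set()
    for (q, e) in delta:                              # second question
        f.add((q, e))                                 # delta maps (state, event) -> state
        no_f.discard((q, e))
    return f, no_f

f, no_f = prep({0, 1}, {"a", "b"}, {(0, "a"): 1})
print(sorted(f))  # -> [(0, 'a')]
```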
The operator ∗, used for the deletion of the redundant atoms NoF_X(q_i, σ_j) added by the first question, demonstrates an essential feature of the PCF calculus, namely, the possibility of nonmonotonic inference. In particular, after applying the inference rule ω, the atoms that participated in the matching search with the atoms marked with ∗ in the question are removed from the base. In general, the operator ∗ affects the completeness of the PCF calculus, but for the problem considered in this paper, the inference using ∗ is always complete.
Denote by B_PrepX the base obtained as a result of the inference search of F_PrepX. Let B_Obs = B_PrepG ∪ B_PrepH ∪ {Q0_G(q0_G), Q0_H(q0_H), E_c(σ_j), E_o(σ_j), E_uo(σ_j)}, where Q0_X(_) determines the initial state of automaton X, and the atoms E_c(σ_i), E_o(σ_j), and E_uo(σ_k) define controllable, observable, and unobservable events, respectively. The predicate T(_, _, _) will be used to construct the states of the automaton T_Obs, while the predicate δ_T(_) will construct the transitions of T_Obs, and δ_T(q_H1, q_H2, q_G, σ₁, σ₂, σ₃, t_H1, t_H2, t_G) reads as the phrase “there is a transition labeled (σ₁, σ₂, σ₃) from the state (q_H1, q_H2, q_G) to the state (t_H1, t_H2, t_G) of the automaton T_Obs”.
To test observability, we employ a PCF F_Obs (2) that constructs the testing automaton T_Obs. The questions of F_Obs are listed as formulas R₁–R₆ below.
[Equation (2): the PCF F_Obs; image omitted.]
R₁: ∀q_G, q_H1 Q0_G(q_G), Q0_H(q_H1) : ∃ Q0_T(q_H1, q_H1, q_G), T(q_H1, q_H1, q_G)
R₂: ∀σ, q_H1, q_H2, q_G, t_H1, t_H2, t_G T(q_H1, q_H2, q_G), E_o(σ), δ_H(q_H1, σ, t_H1), δ_H(q_H2, σ, t_H2), δ_G(q_G, σ, t_G) : ∃ T(t_H1, t_H2, t_G), δ_T(q_H1, q_H2, q_G, σ, σ, σ, t_H1, t_H2, t_G)
R₃: ∀σ, q_H1, q_H2, q_G, t_H1 T(q_H1, q_H2, q_G), E_uo(σ), δ_H(q_H1, σ, t_H1) : ∃ T(t_H1, q_H2, q_G), δ_T(q_H1, q_H2, q_G, σ, ε, ε, t_H1, q_H2, q_G)
R₄: ∀σ, q_H1, q_H2, q_G, t_H2, t_G T(q_H1, q_H2, q_G), E_uo(σ), δ_H(q_H2, σ, t_H2), δ_G(q_G, σ, t_G) : ∃ T(q_H1, t_H2, t_G), δ_T(q_H1, q_H2, q_G, ε, σ, σ, q_H1, t_H2, t_G)
R₅: ∀σ, q_H1, q_H2, q_G T(q_H1, q_H2, q_G), E_c(σ), F_H(q_H1, σ), NoF_H(q_H2, σ), F_G(q_G, σ) : ∃ dead(q_H1, q_H2, q_G, σ), δ_T(q_H1, q_H2, q_G, σ, ε, σ, q_H1, t_H2, q_G)
R₆: ∀q_H1, q_H2, q_G, σ dead(q_H1, q_H2, q_G, σ)
Proposition 1. 
Given a partially observable DES G and a regular language K recognized by a finite-state automaton H, let in the PCF F_Obs (2) B_Obs = B_PrepG ∪ B_PrepH ∪ {Q0_G(q0_G), Q0_H(q0_H), E_c(σ_j), E_o(σ_j), E_uo(σ_j)}. Then, the inference of F_Obs always terminates, and the language K is unobservable if and only if the resulting base contains an atom dead(q_H1, q_H2, q_G, σ).
Proof. 
To prove the proposition, we consider each question of the PCF F_Obs to show that a violation of observability leads to the appearance of the atom dead(q_H1, q_H2, q_G, σ) in the base of F_Obs. The question R₁ adds into the base an atom T(q_H1, q_H1, q_G) that serves as a starting point for the inference, since no question except the last one may be answered without finding in the base a proper substitution for the atom T(_, _, _).
Questions R₂–R₄ are aimed at constructing the states and transitions of the automaton T_Obs, wherein R₂ is responsible for processing observable events, while R₃ and R₄ are responsible for processing occurrences of unobservable events. T_Obs is constructed in such a way as to ensure s₂ = s₃ and P(s₁) = P(s₂), thus implementing the main idea of the rules for constructing T_Obs from [1]. Each state of the desired automaton T_Obs has the form (h₁, h₂, q), where the first two components of the triple are some states of automaton H, and q is a state of automaton G, i.e., h₁ ∈ Q_H, h₂ ∈ Q_H, q ∈ Q_G. Thus, we consider a set of strings s_i ∈ K̄, i = 1, 2, 3, corresponding to transitions between these states. An observable event σ allows answering the question R₂, which results in adding into the base B_Obs a transition of T_Obs labeled by the triple (σ, σ, σ). If the event σ is unobservable, then two transitions are constructed: the first one is determined by the question R₄ and labeled by the triple (ε, σ, σ) to ensure s₂ = s₃, s₂σ ∈ K̄, s₃σ ∈ L(G). The second transition is constructed by the question R₃ and labeled by the triple (σ, ε, ε) to ensure P(s₁) = P(s₂), s₁σ ∈ K̄.
The question R₅ checks whether a controllable event e violates the observability condition at the current state (X₁, X₂, Q) of T_Obs reached by a set of strings s_i ∈ K̄, i = 1, 2, 3. Indeed, let there be a substitution {σ ← e, q_H1 ← X₁, q_H2 ← X₂, q_G ← Q} such that the atoms T(q_H1, q_H2, q_G), E_c(σ), F_H(q_H1, σ), NoF_H(q_H2, σ), F_G(q_G, σ) become True simultaneously. This means that for s_i ∈ K̄, i = 1, 2, 3, and for the event e, we have s₁e ∈ K̄, s₃e ∈ L(G), while s₂e ∉ K̄. By construction of T_Obs, s₂ = s₃ and P(s₁) = P(s₂), and such a combination means a violation of observability. Given such a substitution, the answer to the question R₅ adds the atom dead(q_H1, q_H2, q_G, σ) into the base. It contains the information about the state and event where the observability condition is violated. The question R₅ cannot be answered if observability is not violated by any controllable event.
The question R₆ is the goal question, an answer to which terminates the inference search. It may be answered only if the base contains the atom dead(q_H1, q_H2, q_G, σ); thus, if it has been answered, then the language K is unobservable.
The inference of the PCF F_Obs is always finite since no functional symbols are used (only calculated ones), and all sets used for searching for substitutions are finite. The inference always ends either with a refutation of the base by answering the goal question R₆, i.e., after adding the atom dead into the base, or with an exhaustion of options for searching for answering substitutions. □
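The saturation performed by the questions R₁–R₅ amounts to a worklist construction of T_Obs. The following Python sketch is a conventional (non-logical) analogue of that construction, run on a small hypothetical pair of automata of our own, not those of the figures:

```python
def check_observability(delta_H, delta_G, q0_H, q0_G, events_o, events_uo, events_c):
    """Collect the 'dead' atoms of T_Obs; a nonempty result means K is
    unobservable (the condition the goal question R6 detects)."""
    def has(delta, q, e):            # the F_X / NoF_X predicates
        return (q, e) in delta
    start = (q0_H, q0_H, q0_G)       # R1: initial state of T_Obs
    seen, stack, dead = {start}, [start], []
    while stack:
        h1, h2, g = stack.pop()
        succs = []
        for e in events_o:           # R2: an observable event moves all three copies
            if has(delta_H, h1, e) and has(delta_H, h2, e) and has(delta_G, g, e):
                succs.append((delta_H[h1, e], delta_H[h2, e], delta_G[g, e]))
        for e in events_uo:          # R3: an unobservable event on the first copy of H
            if has(delta_H, h1, e):
                succs.append((delta_H[h1, e], h2, g))
        for e in events_uo:          # R4: an unobservable event on the second copy and G
            if has(delta_H, h2, e) and has(delta_G, g, e):
                succs.append((h1, delta_H[h2, e], delta_G[g, e]))
        for e in events_c:           # R5: the conflict condition
            if has(delta_H, h1, e) and not has(delta_H, h2, e) and has(delta_G, g, e):
                dead.append((h1, h2, g, e))
        for t in succs:
            if t not in seen:
                seen.add(t)
                stack.append(t)
    return dead

# hypothetical automata exhibiting a uv-versus-epsilon conflict on event a
delta_H = {("A", "u"): "B", ("B", "v"): "C", ("C", "a"): "D"}
delta_G = {(0, "u"): 1, (1, "v"): 2, (2, "a"): 3, (0, "a"): 4}
print(check_observability(delta_H, delta_G, "A", 0, {"a"}, {"u", "v"}, {"a"}))
# -> [('C', 'A', 0, 'a')]
```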
Example 1. 
Consider a DES presented by the generator G in Figure 1 and a specification language K generated by the automaton H in Figure 2. Let Σ_uo = {u, v, c, e}, Σ_c = Σ. It may be noted that the strings s = uv and t = ε are those that cause a conflict in the system G. Indeed, the occurrence of the event a leads to the situation where P(s) = P(t), sa ∈ K̄ but ta ∉ K̄, which violates the observability condition.
Table 1 illustrates the process of adding new atoms to the base that define the states and transitions of the automaton T_Obs serving for the observability check. As one can see, the minimal inference for the PCF F_Obs with the base corresponding to the automata in Figure 1 and Figure 2 consists of four steps. Since the inference is nondeterministic, its length may vary. The choice of questions and answers occurs according to the chosen strategy (see Section 6), but whether the strategy will lead to the contradiction in a short or long way is unknown a priori. Table 1 shows the inference constructed by the prover Bootfrost, consisting of eight steps. The entire automaton T_Obs and all possible conflicts can be constructed by removing the goal question R₆ from the PCF F_Obs. Figure 3 shows the part of the automaton T_Obs constructed by the inference presented in Table 1.

4. Conflict Extracting

To examine partially observable systems, observer automata are often employed. There are standard methods of constructing such automata, so we refer readers to the literature, e.g., [1]. One of the methods consists in replacing all unobservable events with the silent event τ and converting the resulting nondeterministic automaton to a deterministic one.
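This construction can be sketched as a subset construction in Python; the automaton below is a toy of our own (the silent moves are simply the unobservable transitions):

```python
def observer(delta, q0, observable):
    """Subset construction treating unobservable transitions as silent."""
    def uo_closure(states):
        # all states reachable by unobservable (silent) transitions
        closure = set(states)
        changed = True
        while changed:
            changed = False
            for (q, e), t in delta.items():
                if q in closure and e not in observable and t not in closure:
                    closure.add(t)
                    changed = True
        return frozenset(closure)

    start = uo_closure({q0})
    states, trans, stack = {start}, {}, [start]
    while stack:
        S = stack.pop()
        for e in observable:
            targets = {delta[q, e] for q in S if (q, e) in delta}
            if not targets:
                continue
            T = uo_closure(targets)
            trans[S, e] = T
            if T not in states:
                states.add(T)
                stack.append(T)
    return start, states, trans

delta = {(0, "u"): 1, (1, "v"): 2, (2, "a"): 3, (0, "a"): 4}
start, states, trans = observer(delta, 0, {"a"})
print(sorted(start))  # -> [0, 1, 2]
```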
Example 2. 
In Example 1, the observer in Figure 4 shows that the event a in the state { 1 , 2 , 3 } causes the conflict. Indeed, to satisfy specification H in Figure 2, the event a must be disabled at state 1 but must be enabled at state 3.
If the specification language is proved to be unobservable using the PCF F_Obs, conflicting strings may be extracted from the automaton T_Obs with the help of another PCF, which we name F_Conf (Equation (3)). The PCF F_Conf processes the final version of the base B_Obs obtained during the observability checking, so we set B_Conf equal to this final base.
[Equation (3): the PCF F_Conf; image omitted.]
R₁_Conf: ∀σ, q_H1, q_H2, q_G, q_H1f, q_H2f, q_Gf, s₁, s₂, s₃ δ_T(q_H1f, q_H2f, q_Gf, s₁, s₂, s₃, q_H1, q_H2, q_G), dead(q_H1, q_H2, q_G, σ) : ∃ next(q_H1f, q_H2f, q_Gf, s₁, s₂, s₃, σ)
R₂_Conf: ∀σ, q_H1, q_H2, q_G, q_H1f, q_H2f, q_Gf, s₁, s₂, s₃, σ₁, σ₂, σ₃ next(q_H1, q_H2, q_G, s₁, s₂, s₃, σ), δ_T(q_H1f, q_H2f, q_Gf, σ₁, σ₂, σ₃, q_H1, q_H2, q_G) : ∃ next(q_H1f, q_H2f, q_Gf, σ₁·s₁, σ₂·s₂, σ₃·s₃, σ)
Proposition 2. 
Given a partially observable DES G and a regular language K recognized by a finite-state automaton H, let in the PCF F_Conf (3) B_Conf = B_Obs, where B_Obs is the final version of the base of the PCF F_Obs after the end of its inference. Then, the inference of F_Conf always terminates, and the last added atom next(q0_H, q0_H, q0_G, s₁_conf, s₂_conf, s₂_conf, e) contains the conflicting strings as its fourth and fifth arguments.
Proof. 
The question $R_1^{Conf}$ in the PCF $F_{Conf}$ is the only question that can be answered at the start of the inference. It uses the argument values of $dead(q_{H1}, q_{H2}, q_G, \sigma)$ and $\delta_T(\ldots, s_1, s_2, s_3, q_{H1}, q_{H2}, q_G)$ to obtain the last symbols of the required conflicting strings $s_1^{conf}$ and $s_2^{conf}$. As stated in the observability checking algorithm, $s_2 = s_3$ in all terms $\delta_T(\ldots, s_1, s_2, s_3, q_{H1}, q_{H2}, q_G)$.
Sequential answers to the second question $R_2^{Conf}$ occur according to the transitions of the automaton $T_{obs}$ contained in the base $B_{Conf}$ as atoms $\delta_T(q_{H1}^{f}, q_{H2}^{f}, q_G^{f}, \sigma_1, \sigma_2, \sigma_3, q_{H1}, q_{H2}, q_G)$. Having started in the state $T(q_{H1}, q_{H2}, q_G)$ of $T_{obs}$, the inference ends in the state $Q_0^T$. During the inference, the atoms $next(q_{H1}^{f}, q_{H2}^{f}, q_G^{f}, \sigma_1 \cdot s_1, \sigma_2 \cdot s_2, \sigma_3 \cdot s_3, \sigma)$ are added to the base, where, by construction of $T_{obs}$, $s_1$ and $s_2$ accumulate suffixes of the conflicting words. When the initial state $Q_0^T$ of the automaton $T_{obs}$ is reached with some substitution $\{\sigma \mapsto e, q_{H1} \mapsto X_1, q_{H2} \mapsto X_2, q_G \mapsto Q, q_{H1}^{f} \mapsto q_0^H, q_{H2}^{f} \mapsto q_0^H, q_G^{f} \mapsto q_0^G, s_1 \mapsto \hat s_1, s_2 \mapsto \hat s_2, s_3 \mapsto \hat s_3, \sigma_1 \mapsto \hat\sigma_1, \sigma_2 \mapsto \hat\sigma_2, \sigma_3 \mapsto \hat\sigma_3\}$, we have $\hat s_1 = s_1^{conf}$ and $\hat s_2 = s_2^{conf}$, where $s_1^{conf}$ and $s_2^{conf}$ are the strings causing the conflict in the case of occurrence of the event e. □
Example 3. 
For the conflict in Example 1, the inference of $F_{Conf}$ consists of two steps. When the inference terminates, the arguments of the atom $next(\_)$ in the base store the required information: in particular, the string $s_1 = uv$, the string $s_2 = \varepsilon \cdot \varepsilon = \varepsilon$, and the conflict event a. Table 2 shows the inference of $F_{Conf}$ in Example 1.

5. Controlled System Design

Let $\Gamma: Q \to 2^{\Sigma}$ be the active event function (also known as the feasible event function); $\Gamma(q)$ is the set of all events $e$ for which $\delta(q, e)$ is defined. $\Gamma(q)$ is also called the active event set (or feasible event set) of $G$ at $q$ [1]. In the case of systems with complete observations, the supervisor can be viewed as the recognizer $H$ of $\overline{K}$, i.e., an automaton that marks $\overline{K}$, or as the recognizer of the supremal controllable sublanguage of the specification $K$. Control actions for the string $s$ generated by the plant are obtained directly from the active event set $\Gamma(\delta_J(q_0^J, s))$ of the supervisor automaton after the string $s$ has been read. A supervisor $J$ thus ensures $J(s) = [\Sigma_{uc} \cap \Gamma(\delta(q_0, s))] \cup \{\sigma \in \Sigma_c : s\sigma \in \overline{K}\}$. It is known that the closed-loop behavior of the plant and the supervisor may be realized by the parallel composition of the corresponding automata, i.e., $L(J/G) = L(H \parallel G)$.
Now, our goal is to implement supervisory control in the case of partial observation of events in $G$ to solve the BSCOP. For this, in the case of an observable and controllable specification, one often employs an observer automaton. Let $H_{Obs}$ be an observer for the automaton $H$. Let $s$ be the current string of the DES $G$ and let $t = P(s)$ be the currently available observed string. Let $x_{obs}$ be the current state of $H_{Obs}$ after the execution of $t$. Since the states of the observer $H_{Obs}$ are, in general, sets of states of $H$, after the last event in $t$, the automaton $H$ could be in any one of the states in the set $x_{obs}$. Then, we have $J_P(t) = \bigcup_{x \in x_{obs}} [\Gamma_{H_{Obs}}(x)]$, where $\Gamma_{H_{Obs}}$ is the active event function of $H_{Obs}$. As stated in [1], in order to use the same technique that was used in the case of complete observation of events, it is enough, for each state $x_{obs}$, to add self-loops for the unobservable events that are feasible in the corresponding states $x \in x_{obs}$ of $H$. The resulting automaton $H_{Obs}$ guarantees the desired result $L(J_P/G) = L(H_{Obs} \parallel G)$ for a supervisor $J_P$.
Let B l o o p s = B H B H O b s { u n O b s ( e ¯ ) } . The predicate u n O b s ( _ ) , having a list of elements as an argument, is used to define all unobservable events of H. To obtain an element of a list e ¯ , the computable predicate ∈ is used. Details on the realization of lists in PCF prover Bootfrost are presented in Section 6.3. During the inference search of the PCF F l o o p s (4), new transition atoms δ O b s ( q ¯ , σ , q ¯ ) corresponding to auxiliary loops for unobservable events σ feasible in states x of H corresponding to x o b s of H O b s are added to the base B l o o p s .
$$F_{loops} = B_{loops}\ \forall\, q, \sigma, q', \sigma', \bar q, \bar q', \bar\sigma\ \big[\delta_H(q, \sigma, q'),\ \delta_{H_{Obs}}(\bar q, \sigma', \bar q'),\ q \in \bar q,\ \sigma \in \bar\sigma,\ unObs(\bar\sigma) \rightarrow \delta_{Obs}(\bar q, \sigma, \bar q)\big] \qquad (4)$$
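Procedurally, the self-loop augmentation that $F_{loops}$ encodes can be sketched as follows (the dictionary encoding of transitions is an illustrative assumption):

```python
def add_unobservable_selfloops(obs_states, obs_delta, delta_H, unobservable):
    """For every observer state q_bar and every unobservable event sigma
    feasible (in H) at some q in q_bar, add the self-loop
    (q_bar, sigma) -> q_bar."""
    augmented = dict(obs_delta)
    for q_bar in obs_states:          # observer states are sets of H-states
        for q in q_bar:
            for sigma in unobservable:
                if (q, sigma) in delta_H:      # sigma is feasible at q in H
                    augmented[(q_bar, sigma)] = q_bar  # self-loop
    return augmented
```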
In the PCF formalization, the behavior of the system under supervisory control may be described by the PCF $F_{BSCOP}$ (5). The predicate $L_{J_P/G}(\_,\_)$ is employed to store words of the controlled language $L_{J_P/G}$ as its first argument; the state to which a word brought the system is stored as the second argument. Although the language $L(J_P/G)$ is a restricted version of $L(G)$, we use a new predicate $L_{J_P/G}(\_,\_)$ to emphasize that the language constructed is the result of the joint work of the plant and the supervisor. The base $B_{BSCOP}$ consists of the sets of atoms corresponding to the transitions of the plant and the supervisor, respectively, and contains the initial atom $L_{J_P/G}(\varepsilon, 1)$.
$$F_{BSCOP} = B_{BSCOP}\ \forall\, \sigma, s, q, q', \bar q_J, \bar q'_J\ \big[L_{J_P/G}(s, q),\ \delta_G(q, \sigma, q'),\ \delta_{H_{Obs}}(\bar q_J, \sigma, \bar q'_J),\ q \in \bar q_J \rightarrow L_{J_P/G}(s \cdot \sigma, q')\big] \qquad (5)$$
The single question of $F_{BSCOP}$ may be interpreted as follows. If the system is in state $q$ and an event $\sigma$ occurs, then, according to $\delta_G$, the system switches to the specified state $q'$, and $\sigma$ is appended to the current string of events $s$ stored as the first argument of the predicate $L_{J_P/G}(\_,\_)$. The rule fires only on those strings that are allowed by the supervisor, i.e., the atoms $\delta_{H_{Obs}}(\_,\_,\_)$ limit the answers that could be found with the atoms $\delta_G(\_,\_,\_)$.
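The joint stepping of plant and supervisor expressed by this rule can be mirrored in a small enumeration sketch (an illustrative model: the supervisor is keyed directly on observer states, so the consistency check $q \in \bar q_J$ is left implicit):

```python
def controlled_words(delta_G, delta_S, q0, x0, events, max_len):
    """Enumerate words of the closed-loop language up to length max_len.
    delta_G: plant transitions {(q, e): q'};
    delta_S: supervisor transitions {(x, e): x'} over observer states."""
    frontier = {("", q0, x0)}
    language = {""}
    for _ in range(max_len):
        nxt = set()
        for w, q, x in frontier:
            for e in events:
                # An event extends a word only if BOTH automata allow it.
                if (q, e) in delta_G and (x, e) in delta_S:
                    nxt.add((w + e, delta_G[(q, e)], delta_S[(x, e)]))
        language |= {w for w, _, _ in nxt}
        frontier = nxt
    return language
```

In the toy run below, the plant allows a and b, but the supervisor never enables b, so only words over a survive.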
Example 4. 
In Example 1, let $\Sigma_{uo} = \{u, c, e\}$, $\Sigma_c = \Sigma$, i.e., the event u is now considered observable. The corresponding observer is presented in Figure 5. It may be seen that no event must be disabled and enabled simultaneously. In this case, the logical inference search of the PCF $F_{Obs}$ performed by Bootfrost terminates due to the exhaustion of all possible substitutions. When the inference search stops, the predicates in the base of $F_{Obs}$ define the states and transitions of the automaton $T_{Obs}$, but none of them is a $dead(\_)$ atom.
Table 3 shows a few steps of the inference of PCF F B S C O P in Example 1, with the observer in Figure 5 modified with the PCF F l o o p s (4).
The representation of DESs with the help of PCFs allows one to use information coming from the environment, as well as data on the functioning of the system itself in the process of constructing the logical inference. This can be realized by special logical rules represented in PCF in the form of event processing questions. Answering them triggers subinferences, in which the events coming from the environment serve as parameters used in calculations or other decisions. To implement this inference search behavior, formulas use the # label to indicate that the labeled term is a call to an external function.
Here, we utilize # in the PCF $F^{\#}_{BSCOP}$ (6), which presents a version of $F_{BSCOP}$ that obtains an event from outside. Note the atom $E(\sigma^{\#})$ in the base of $F^{\#}_{BSCOP}$, where $\sigma^{\#}$ is the result of calling the function $get\_random\_event()$, which provides a random event that can occur in the current state.
$$F^{\#}_{BSCOP} = B_{BSCOP} \cup \{E(\sigma^{\#})\}\ \forall\, \sigma, s, q, q', \bar q_J, \bar q'_J\ \big[L_{J_P/G}(s, q),\ \delta_G(q, \sigma, q'),\ \delta_{H_{Obs}}(\bar q_J, \sigma, \bar q'_J),\ q \in \bar q_J,\ E^{*}(\sigma) \rightarrow L_{J_P/G}(s \cdot \sigma, q'),\ E(\sigma^{\#})\big] \qquad (6)$$
Example 5. 
Table 4 shows the first steps of the inference of PCF F B S C O P # in Example 4. Note the difference in the last lines of Table 3 and Table 4. Unlike F B S C O P , PCF F B S C O P # does not generate words of the same length, thus simulating the real system behavior.
The PCF $F^{T\#}_{BSCOP}$ (7) presents an extended version of the PCF $F^{\#}_{BSCOP}$ that allows one not only to process events coming from outside but also to take into account the knowledge available in the system. The usage of $F^{T\#}_{BSCOP}$ is explained in Section 7.
[Equation (7): the PCF $F^{T\#}_{BSCOP}$, rendered as an image in the published version.]

6. The Prover for Refutation PCFs

To facilitate inference searches in the PCF calculus, a prover named Bootfrost is developed. The source code of the project, written in the Rust programming language, documentation, and examples, including those presented in this paper, can be found on the GitHub page [24].
A number of modern approaches have been implemented in Bootfrost: perfect sharing of terms, which significantly saves memory on some tasks; indexing of the fact base; and automatic, semiautomatic, and manual modes of logical inference search. Rust's rich type system and ownership model guarantee memory safety and thread safety, and Rust has no runtime or garbage collector; these features make it possible to build safe and efficient systems. The prover design is based on a transaction system that allows one to log and roll back any changes that occur during the logical inference, and all the algorithms used take this into account. In addition, this version of the prover is specialized for guarded PCFs. Although this limits the class of problems that can be solved, in applied problems and problems describing dynamic systems, only such formulas are usually used. Guarded PCFs are formulas in which, for every quantifier, all variables bound by the quantifier occur in the condition conjunct. For example, $\forall x, y\ A(x), B(y)$ is a subformula with guarded variables because the variables x, y occur in the conjunct $A(x), B(y)$, whereas $\forall x, y\ A(x), B(x)$ is a subformula with an unguarded variable because y does not occur in the conjunct $A(x), B(x)$.
The prover was tested using problems from the TPTP library [40] to empirically verify its soundness. About 9000 problems in the First-Order Formula format were selected, 1001 of which have guarded variables. None of the problems known to have no solution were solved, which is considered the main criterion of a prover's soundness at the CADE ATP System Competition (https://tptp.org/CASC/, accessed on 25 April 2024).
Strategies employed in Bootfrost are divided into three main groups: question selection strategies, answer selection strategies, and evaluated terms and commands.

6.1. Question Selection Strategies

At each step of the inference, the prover must select a question for which it will search for an answer. A special question selection procedure is used in Bootfrost: it scores each question by several criteria. The criteria can be configured by the user, but the default strategy scores a question as follows:
  • Is the question a goal? Goal questions are scored highest because answering them terminates the inference.
  • How many steps ago in the inference was the question answered for the last time? The most distant questions are scored the highest. Such an approach provides a high level of diversity by restricting the usage of the same question several times in a row.
  • How many times has the question been answered before? Questions with the lowest value are scored the highest.
  • What is the level of the question branching? The ones with the lowest value are scored the highest.
This general strategy sorts questions by the score from the highest to the lowest. Then, a special method tries to answer questions from the list using an answer selection strategy (see the next subsection). If an answer is found, the formula is transformed according to the inference rule ω , and the transition to the next step is performed.
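The default scoring described above might look like the following sketch (the weights and the score scale are our assumptions; Bootfrost's actual constants may differ):

```python
def score(question, step):
    s = 0.0
    if question["is_goal"]:
        s += 1000.0                        # goal questions first: answering one ends the inference
    s += step - question["last_answered"]  # prefer questions unused for many steps
    s -= question["times_answered"]        # prefer rarely answered questions
    s -= question["branching"]             # prefer low branching
    return s

def select_question(questions, step):
    # Highest score wins; ties are resolved by the first maximal element.
    return max(questions, key=lambda q: score(q, step))
```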

6.2. Answer Selection Strategies

There are two answer selection strategies in the Bootfrost prover, called the First appropriate answer and the Best answer.
The First appropriate answer strategy selects the first found answer that satisfies given criteria. Such an approach leads to effective usage of memory and CPU resources. The criteria can be configured; the default criterion is a trivial function returning $true$, which leads to selecting the first answer found.
The Best answer strategy selects the best answer from the set of all possible answers. Thus, this strategy finds all possible answers and then selects the best of them according to a special selection function, which can be configured by the user. The default function selects an answer with the lowest total weight of all terms involved. A term's weight is the number of nodes in the tree representing the term; for example, the term $A(e)$ has weight 2, and the term $A(e, f(e))$ has weight 4.
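The weight measure can be written as a short recursive function (terms modeled here as nested tuples, an encoding we assume for illustration):

```python
def weight(term):
    """Number of nodes in the term tree; a compound term is a tuple
    ('f', arg1, ...), while constants/variables are plain strings."""
    if isinstance(term, tuple):
        return 1 + sum(weight(arg) for arg in term[1:])
    return 1

# A(e)       -> ('A', 'e')              has weight 2
# A(e, f(e)) -> ('A', 'e', ('f', 'e'))  has weight 4
```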
Another auxiliary procedure used in answer searching is starting point selection. There are two variants of where to start the search: “from the last” and “from scratch”. The “from the last” method means that the next search for an answer to the question begins at the point where the prover stopped during the previous search. “From scratch” means that the procedure for finding answers starts over, i.e., without taking into account previous search results. This approach is applicable to nonmonotonic inferences and similar situations where the previous history of the inference can change over time and become irrelevant. For example, the “from scratch” method is used for constructing the sublanguage $K^C$ of an uncontrollable specification K [26]. In the PCF $F_{Sub}$, for providing a sublanguage construction, a special operator * is employed. If a question with an atom marked with the * operator has an answer, then, after applying the inference rule, the atoms in the base that participated in the matching search with the marked atom are removed from the base. The operator * can also be modeled by the command “remove-fact”, described in the next section.

6.3. Evaluated Terms and Commands

Evaluated terms (ETerms) are terms treated by the prover not as syntactic structures but as functions that must be evaluated. Currently, the ETerms implemented in the prover are arithmetic operations, comparison operations, list operations (in, notin, first, last, concatenation, length), and solve. The main framework of term evaluation is as follows: a special procedure retrieves terms by IDs from the environment; these terms are then evaluated; and, for the resulting term, an ID is calculated using the perfect sharing structure and returned as the result.
Evaluated terms that are evaluated immediately after the application of the inference rule ω are called commands. For example, “remove-fact” is a command. ETerms and commands are classified as strategies because their main purposes are to modify the process of logical inference search and to extend the logical capabilities of PCFs. With commands, it is possible to model nonmonotonic inference.
Another feature in the task description syntax for the Bootfrost prover is the ability to use lists as arguments of terms. The following computable operations are implemented:
  • Evaluated function a ¯ ++ b ¯ —concatenation of lists a ¯ and b ¯ ;
  • Evaluated predicate a \in b ¯ —the membership of element a to list b ¯ ;
  • Evaluated predicate a ¯ \subseteq b ¯ —whether a ¯ is a sublist of b ¯ ;
  • Evaluated function sort( a ¯ )—sorting list items in ascending order;
  • Evaluated function dedup( a ¯ )—deletion of repeated elements of the list.
List elements can be other terms and lists, and the above operations handle them only syntactically, without considering the possibility of unification of terms.
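The intended semantics of these purely syntactic list operations can be mirrored in a few lines of Python (our illustrative encoding; Bootfrost's internal term representation differs):

```python
def concat(a, b):      # a ++ b
    return a + b

def member(x, b):      # x \in b
    return x in b

def sublist(a, b):     # a \subseteq b: every element of a occurs in b
    return all(x in b for x in a)

def sortlist(a):       # sort(a): items in ascending order
    return sorted(a)

def dedup(a):          # dedup(a): drop repeats, keeping first occurrences
    seen, out = set(), []
    for x in a:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out
```

As in the prover, elements are compared only syntactically (by equality), with no attempt to unify terms.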

6.4. Complexity Evaluation

Let us comment on the complexity of the algorithms involved. We consider the procedure of constructing the automaton $T_{obs}$ for checking the observability of the specification language during the refutation of the PCF $F_{Obs}$ (Section 3).
As an elementary operation of PCF refutation, we consider the search for a unifying substitution for an atom. This assumption holds because the formula $F_{Obs}$ under consideration has a fairly simple structure (i.e., belongs to the Horn class), and an algorithm for finding the inference can be provided for it. For example, the algorithm may be chosen in the form of a cycle traversing the questions from the top to the bottom of the PCF's tree structure and comparing the atoms of the questions with the atoms of the base while attempting to unify these atoms. For example, to answer question $R_4$, the terms $T(q_{H1}, q_{H2}, q_G)$, $E_o(\sigma)$, $\delta_H(q_{H1}, \sigma, t_{H1})$, $\delta_H(q_{H2}, \sigma, t_{H2})$, $\delta_G(q_G, \sigma, t_G)$ must be unified. Here, the variables $q_{H1}, q_{H2}, t_{H1}, t_{H2}$ take values from the set $Q_H$, the variables $q_G, t_G$ take values from the set $Q_G$, and the variable $\sigma$ takes values from the set $\Sigma_o$. So, for the complexity, we have the expression $|Q_H|^4 \cdot |Q_G|^2 \cdot |\Sigma_o|$.
Let us consider the worst case of the possible inference search options. For the given formula, one of the options is the exhaustion of all substitutions, while answers to the second and third questions do not exist. Another option is that the answer to the second, and consequently the third, target question is found after enumerating all triplets of states in the last steps of the inference. Thus, we do not take into account the complexity of answering the second and third questions because they participate in the inference only once, as does the first question. The computational complexity of answering question $R_5$ may be evaluated as $|Q_H|^3 \cdot |Q_G| \cdot |\Sigma_{uo}|$ and of answering question $R_6$ as $|Q_H|^3 \cdot |Q_G|^2 \cdot |\Sigma_{uo}|$. Summing the complexities for $R_4$, $R_5$, $R_6$, we have $|Q_H|^4 \cdot |Q_G|^2 \cdot |\Sigma_o| + |Q_H|^3 \cdot |Q_G| \cdot |\Sigma_{uo}| + |Q_H|^3 \cdot |Q_G|^2 \cdot |\Sigma_{uo}|$. Factoring, we obtain $|Q_H|^3 \cdot |Q_G| \cdot (|Q_H| \cdot |Q_G| \cdot |\Sigma_o| + |\Sigma_{uo}| + |\Sigma_{uo}| \cdot |Q_G|)$; i.e., in big-O notation, the complexity of observability checking is $O(|Q_H|^4 \cdot |Q_G|^2 \cdot |\Sigma|)$.

7. Case Study

As an example, we consider the problem of a robot's path planning in an unknown environment. We use the model from [41,42], simplified in the part of the path-following controller and extended with additional states and events, including unobservable ones.

7.1. Problem Statement

Suppose that the robot should follow a given reference path, leaving it to avoid collisions with encountered obstacles and returning to it after completing avoidance maneuvers. Let the robot's dynamics in the horizontal plane be described [43] by the equations
$$\dot x = u \cos(\psi_B) - v \sin(\psi_B), \quad \dot y = u \sin(\psi_B) + v \cos(\psi_B), \quad \dot\psi_B = \omega, \qquad (8)$$
where ( x , y ) are the coordinates of the robot in a global reference frame { U } , ψ B is the yaw angle; u and v are the surge and sway speeds, and ω is the yaw rate (Figure 6).
Assuming that $u$ is always nonzero, we define the side-slip angle $\beta = \arctan(v/u)$ and a reference frame $\{W\}$, which is obtained by rotating $\{B\}$ around the yaw axis through the angle $\beta$. Subsequently, the kinematic Equation (8) can be rewritten as
$$\dot x = v_t \cos(\psi_W), \quad \dot y = v_t \sin(\psi_W), \quad \dot\psi_W = \omega + \dot\beta, \qquad (9)$$
where $\psi_W = \psi_B + \beta$ and $v_t = (u^2 + v^2)^{1/2}$ is the absolute value of the total velocity vector $[u\ v]^T$. The goal is to develop a path-planning solution that meets the kinematic constraint given by the minimum turning radius $R_{\min}$ and lies no closer than a safe distance $D_s$ from obstacles.
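For a quick numerical check of this kinematic model, a forward-Euler step can be written as follows (the step size and the straight-line test are illustrative choices):

```python
import math

def step(x, y, psi_w, v_t, omega_w, dt):
    """One forward-Euler step of the kinematics (9); omega_w stands for
    the yaw-rate term (omega + beta_dot)."""
    x += v_t * math.cos(psi_w) * dt
    y += v_t * math.sin(psi_w) * dt
    psi_w += omega_w * dt
    return x, y, psi_w
```

With zero yaw rate and zero initial heading, the robot advances along the x-axis at speed $v_t$.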
We assume that the robot is equipped with a multibeam forward-looking sonar (FLS) installed onboard in the horizontal plane in order to detect obstacles in the forward direction. The data obtained from the FLS can be presented as a set of pairs $(\alpha_i, \rho_i)$, where $\alpha_i$ is the beam angle counted from the robot's heading direction ($-\pi/2 < \underline{\alpha} \le \alpha_i \le \overline{\alpha} < \pi/2$), $\rho_i$ is the distance to an obstacle in the beam direction, $i = \overline{1, N_b}$, and $N_b$ is the number of beams. If no obstacles are detected in the direction of beam $i$, or $\rho_i$ is greater than the detection range $\rho_d$, then $\rho_i = \infty$.
Let $I_L \triangleq \{i : \alpha_i \le 0\}$ and $I_R \triangleq \{i : \alpha_i > 0\}$ be the sets of the left and right beams of the FLS, respectively. For each obstacle point $Q_i$, $i = \overline{1, N_b}$, we define the maximum turning radius [41,44] as
$$R_{\max}^{i} = \frac{\rho_i \cos\alpha_i}{\sin 2\beta_i} - D_s, \qquad \beta_i = \arctan\left(\frac{D_s}{\rho_i} \sec\alpha_i + \tan\alpha_i\right).$$
It can be used to evaluate the robot's ability to bypass the point $Q_i$ at a safe distance $D_s$ (Figure 7), taking into account the kinematic constraint $R_{\min}$: if $R_{\min} \le R_{\max}^{i}$, the robot can safely bypass the obstacle point $Q_i$. Define also $R_{\max}^{L} = \min_{i \in I_L} R_{\max}^{i}$, $R_{\max}^{R} = \min_{i \in I_R} R_{\max}^{i}$, and $\rho_{\min} = \min_{i \in I_L \cup I_R} \rho_i$.
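The safety test can be sketched as below; note that the grouping we use for the turning-radius formula is our reading of it, so the helper is an assumption rather than the authors' exact computation:

```python
import math

def r_max(rho_i, alpha_i, d_s):
    # Assumed grouping: beta_i = arctan((D_s / rho_i) * sec(alpha_i) + tan(alpha_i))
    beta_i = math.atan((d_s / rho_i) / math.cos(alpha_i) + math.tan(alpha_i))
    # Assumed grouping: R_max^i = rho_i * cos(alpha_i) / sin(2 * beta_i) - D_s
    return rho_i * math.cos(alpha_i) / math.sin(2.0 * beta_i) - d_s

def can_bypass(rho_i, alpha_i, d_s, r_min):
    # The robot can safely bypass obstacle point Q_i iff R_min <= R_max^i.
    return r_min <= r_max(rho_i, alpha_i, d_s)
```

Qualitatively, a distant obstacle straight ahead yields a large maximum turning radius (easy to bypass), while a close obstacle with a tight turning constraint fails the test.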
The main component of the proposed approach is DES $G_{msn}$ (Figure 8), which is designed to detect situations that require updating the current path. Since, unlike [41], we do not employ a path-following controller to drive the robot along the generated paths, a situation may occur in which the robot finds itself in a wrong position or faces some obstacle. State WPD corresponds to such a situation, and the unobservable event eER denotes transitions to this state. State RPF is the initial state. The descriptions of all possible system states are the following:
  • R P F —reference path following;
  • D O L —detouring the detected obstacle from its left side;
  • D O R —detouring the detected obstacle from its right side;
  • N R P —navigation to the reference path;
  • S O L —searching for an obstacle on the left;
  • S O R —searching for an obstacle on the right;
  • W P D —wrong position diagnosing;
  • C N P —computing a new path.
The set of system events with their triggering conditions is shown in Table 5. Events eALi and eARi, $i = \overline{1, 4}$, denote switching to the obstacle avoidance modes from the other modes. They are composite, i.e., determined by several atomic events. It is convenient to define composite events by logical formulas, such as those presented in Table 5. For example, event eAL1 determines switching to the mode of avoiding an obstacle from its left side, and it results from the occurrence of event eOLNF together with one of the events eORN, eORNF, or eON.
In Figure 8, transitions caused by uncontrollable events are shown by red arrows (eRPR, eNOR, eNOL, eOF, eOL), while unobservable events are shown by green arrows (eER).
Changing the DES state entails building a new path for the robot. Entering state DOL or DOR generates a path to avoid the detected obstacle using a modification [41] of the waypoint guidance algorithm [44]. Entering state NRP constructs a path that connects the robot's current position with the reference path, for which the path-planning algorithm from [41], based on Dubins curves [45], is used.

7.2. System Constraints and Analysis

Let the specification for the problem above be provided by the automaton $H_{msn}$ depicted in Figure 9. The desired robot behavior strategy can be expressed as follows. While DES $G_{msn}$ is designed to ensure that the robot avoids places from which it is impossible to get out using the implemented obstacle avoidance algorithms (this is guaranteed by the rules related to the obstacle avoidance modes), the specification language $K_{msn}$ imposes additional constraints, which are interpreted as “do not change the once-chosen obstacle avoidance direction until the robot returns to the reference path” (the left- or right-hand rule, ensured by the prohibition of transitions eAL3 and eAR3 and transitions ePRF) and “rebuild the path only if it is vital” (realized by the prohibition of transitions eEAP).
Before designing a supervisor as a controller that ensures specification $K_{msn}$, the supervisor existence criterion must first be checked. By refuting the special PCF developed for controllability checking in [26], the prover Bootfrost established the controllability of $K_{msn}$. The observability of $K_{msn}$ is checked via inspection of the automaton $T_{H_{msn}}$ constructed using the PCF $F_{Obs}$ (2) (Section 3). $T_{H_{msn}}$ consists of 20 states and 45 transitions and does not contain the $dead$ state, which means that $K_{msn}$ is observable. The full $T_{H_{msn}}$ may be found in the supplementary materials at [24].
Once the specification K m s n is proved to be controllable and observable, a supervisor may be designed as described in Section 5. To realize its control action, we employ the PCF F B S C O P T # (7), which allows processing events coming from outside and taking into account the knowledge available in the system. To handle composite events, predicate T # can be taken in the form
[The PCF defining the predicate $T^{\#}$, rendered as an image in the published version.]
The first question of PCF T ( e ¯ ) is a general form of a question processing a composite event in the system. The number of such questions in the instantiated PCF equals the number of composite events in the system. In the case of G m s n , there are eight such questions for all eALi and eARi, and in each of them, the predicate C o m p ( t ¯ ) is computed according to Table 5 using corresponding propositional formulas. Below is a part of the instantiated form of T # in our example:
[An instantiated fragment of $T^{\#}$ for $G_{msn}$, rendered as an image in the published version.]

7.3. Simulation Results

The path-following and obstacle avoidance mission was realized at the robotic test bed by a LEGO Mindstorms EV3 robot with a differential drive. The robot moves due to the movement of two separately controlled wheels; therefore, Equation (9) turns into the following kinematic scheme:
$$\dot x = r\,\frac{\omega_L + \omega_R}{2}\cos(\psi_W), \quad \dot y = r\,\frac{\omega_L + \omega_R}{2}\sin(\psi_W), \quad \dot\psi_W = \frac{(\omega_L - \omega_R)\,r}{l},$$
where $\omega_L$, $\omega_R$ are the corresponding angular speeds of rotation of the wheels, $r$ is the radius of a wheel, and $l$ is the distance between the wheels. The employed configuration of the EV3 robot has $2r = 56$ mm and $l = 100$ mm. To detect obstacles, the robot was equipped with a LIDAR with a viewing sector of $[-60^\circ; 60^\circ]$. For certainty, we accept $R_{\min} = 50$ cm, $D_s = 30$ cm, $R = 10$ cm, $\rho_n = 30$ cm, $\rho_f = 70$ cm, $N_b = 60$, $\rho_d = 150$ cm, $\underline{\alpha} = -60^\circ$, and $\overline{\alpha} = 60^\circ$.
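This differential-drive model can be exercised numerically with a one-step forward-Euler update using the quoted EV3 parameters (the integration scheme is our illustrative choice):

```python
import math

R_WHEEL, TRACK = 0.028, 0.100  # metres: 2r = 56 mm, l = 100 mm

def dd_step(x, y, psi, omega_l, omega_r, dt):
    """One forward-Euler step of the differential-drive kinematics."""
    v = R_WHEEL * (omega_l + omega_r) / 2.0        # linear speed
    x += v * math.cos(psi) * dt
    y += v * math.sin(psi) * dt
    psi += (omega_l - omega_r) * R_WHEEL / TRACK * dt  # yaw rate
    return x, y, psi
```

Equal wheel speeds produce straight-line motion; a wheel-speed difference turns the robot in place around its track.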
The results of the preliminary computer simulation are shown in Figure 10 and Figure 11. The robot moved along the predefined reference path, denoted by a green arrow, until it faced an obstacle. During the obstacle detouring maneuver, the obstacle on the right was lost, so DES $G_{msn}$ entered the SOL state. Then, a failure was detected by the event eER, which led $G_{msn}$ into the NRP state and immediately back to DOR due to an obstacle on the right. Figure 11 shows the changes in the state of $G_{msn}$ corresponding to changes in the robot's functioning modes. A new path was generated in the DOR, NRP, and other states, which is denoted by the dots on the line in Figure 11.
To explain the prover implementation, consider the inference steps resulting in the sequence of the first state changes in Figure 11. In the initial state RPF, the $get\_event()$ function returned two atomic events, eORNF and eON, which were passed as a list to the subinference of the $T^{\#}$ formula when its first question was checked. The subinference triggered the question containing eAR1. The list $\{eORNF, eON\}$ was substituted for $\bar s$, and the formula for the subinference was triggered:
[The PCF for the subinference, rendered as an image in the published version.]
This PCF corresponds to the negation of the propositional formula
$$(eORNF \mathbin{\&} eON) \mathbin{\&} \big(\neg(eORNF \mathbin{\&} eOLN) \vee \neg(eORNF \mathbin{\&} eOLNF) \vee \neg(eORNF \mathbin{\&} eON)\big),$$
and it is refuted in one step by answering the third question. Therefore, the constant $True$ was returned to the formula $T^{\#}$, making an answer to its first question possible. Thus, the atom $eAR1(\{eORNF, eON\})$ was added to the base. Then, by answering the question of the formula $T^{\#}$ that contains $Comp(x)$, inference control was returned to the original formula $F^{T\#}_{BSCOP}$. The event eAR1 was substituted for $\sigma$ in the consequent of its first question. Then, the atom $E(eAR1)$ was added to the base $B_{BSCOP}$, which allowed answering the second question and adding the atom $L_{J_P/G_{msn}}(\varepsilon \cdot eAR1, DOR)$ to the base. This addition caused a message to the main thread, which informed the program about the change in the DES's state.
The corresponding semi-natural experiments were carried out on a robotic stand (Figure 12), which is described in detail in [46]. Due to changes in the obstacles' configuration and a number of uncontrollable factors affecting the robot dynamics, the trajectory of the robot differs from the one obtained in the computer simulation. However, the proper processing of the unobservable fault event eER led the implemented DES $G_{msn}$ to the WPD state, which allowed the robot to avoid a collision with the obstacle.

8. Discussion

This paper continues the work of developing a new way of formalizing and solving control problems for the important class of dynamic systems known as DESs. The PCF calculus provides powerful tools for dealing with sophisticated control problems, and above, it was shown how the PCF inference helps check the observability of regular languages.
Close to our research is an approach to testing the diagnosability of DESs based on their logical representation [47]. In [47], conjunctive normal forms (CNFs) are exploited to study the diagnosability properties of DESs: automata transitions are described as a set of clauses, and then the well-known resolution method is applied to test whether failure events can be detected within a finite number of observable events. Note that CNF is less expressive than PCF for representing the automata underlying DESs. We leave the problem of the diagnosability of DESs, and the intriguing comparison with the CNF-based approach, for future research.
Moreover, the representation of DESs with the help of PCFs allows one to use information coming from the environment, as well as data on the functioning of the system itself, in the process of constructing the logical inference. This feature may help greatly in problems of supervisory controller synthesis and implementation for safety PLCs (programmable logic controllers) [48], where extended finite-state automata (EFA) are exploited. Transitions in an EFA may contain guards (i.e., logical conditions) over the variables and updates (i.e., assignments) to the variables. A simple guard, used in EFA, may replace a large automaton expressing the same requirement. In PCF, guards can be realized by event processing questions serving as special logical rules. Answers to them trigger subinferences, in which the events coming from the environment are used in calculations or other kinds of data processing.
Another promising direction of the PCF calculus's theoretical and practical development is embracing temporal logic and temporal-logic-based control of DESs [49]. Since early papers, linear time temporal logic (LTL) has been proposed for specifying and verifying the safety and liveness properties of systems (e.g., [50,51]). LTL formulas allow formalizing such statements as “nothing bad will ever occur” and “something good such as accomplishment of tasks will occur regularly”, so they cover a useful range of control specifications about finishing tasks regularly without compromising safety [49]. In subsequent years, computational tree logic (CTL) and epistemic temporal logic have been applied to deal with more sophisticated properties of DESs, e.g., the property of stability, which requires that the system eventually reach a set of states where some statement holds and stay there forever [52,53,54].
It should be noted that the Bootfrost prover has not yet been formally verified by any specific testing software. Its applications currently lie in the areas of SCT problems mentioned in this and our previous articles, as well as in several case studies. In the future, it is planned to better demonstrate the validity of the work and the broad applicability of the presented approach. We will continue to develop the PCF-based approach so that it becomes a full-fledged tool capable of solving a wide range of tasks. In future works, its application to constructing supervisors for decentralized and distributed DESs will be suggested. Moreover, if the specification language is either not controllable or not observable, one may be interested in finding the less restricting controllable and observable sublanguage of the specification. While there are effective algorithms to construct the supremal controllable sublanguage of the given language, the supremal observable sublanguage does not exist. Only a maximal observable sublanguage may be found, which is not unique in general. Finding these languages and implementing them in control systems using the PCF approach is the line of our future research. Results obtained will be embedded at the different levels of the hierarchical control system for mobile robots.

Author Contributions

Conceptualization, A.D. and N.N.; formal analysis, A.D. and N.N.; investigation, A.D. and N.N.; methodology, A.D., A.L. and N.N.; resources, A.L.; software, A.D. and A.L.; validation, A.D. and N.N.; visualization, A.D. and A.L.; writing—original draft, A.D., A.L. and N.N.; writing—review and editing, A.D. and N.N. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Ministry of Science and Higher Education of the Russian Federation, project no. 121032400051-9.

Data Availability Statement

The prover Bootfrost for the refutation of PCFs is written in the Rust programming language. The source code of the prover, documentation, and examples, including those presented in the manuscript, can be found on the GitHub page [24].

Acknowledgments

The authors thank Sergei Ul’yanov for providing the model and algorithms for obstacle avoidance, as well as Anton Tolstikhin for his enormous assistance in the practical implementation of the presented approach at the robotic test bed.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
PCF	Positively constructed formula
ATP	Automated theorem proving
DES	Discrete-event system
SCT	Supervisory control theory
CNF	Conjunctive normal form
EFA	Extended finite automaton
LTL	Linear-time temporal logic
CTL	Computation tree logic
FOF	First-order formula
FOL	First-order logic
SCOP	Supervisory control and observation problem
BSCOP	Basic supervisory control and observation problem
ID	Identifier

References

  1. Cassandras, C.G.; Lafortune, S. Introduction to Discrete Event Systems; Springer: Cham, Switzerland, 2021. [Google Scholar] [CrossRef]
  2. Lafortune, S. Discrete Event Systems: Modeling, Observation, and Control. Annu. Rev. Control Robot. Auton. Syst. 2019, 2, 141–159. [Google Scholar] [CrossRef]
  3. Seatzu, C.; Silva, M.; van Schuppen, J.H. (Eds.) Control of Discrete-Event Systems; Springer: London, UK, 2013. [Google Scholar] [CrossRef]
  4. Ramadge, P.J.; Wonham, W.M. Supervisory control of a class of discrete event processes. SIAM J. Control Optim. 1987, 25, 206–230. [Google Scholar] [CrossRef]
  5. Dai, X.; Jiang, L.; Zhao, Y. Cooperative exploration based on supervisory control of multi-robot systems. Appl. Intell. 2016, 45, 18–29. [Google Scholar] [CrossRef]
  6. Lopes, Y.K.; Trenkwalder, S.M.; Leal, A.B.; Dodd, T.J.; Groß, R. Supervisory control theory applied to swarm robotics. Swarm Intell. 2016, 10, 65–97. [Google Scholar] [CrossRef]
  7. Hill, R.C.; Lafortune, S. Scaling the formal synthesis of supervisory control software for multiple robot systems. In Proceedings of the 2017 American Control Conference (ACC), Seattle, WA, USA, 24–26 May 2017; pp. 3840–3847. [Google Scholar] [CrossRef]
  8. Wonham, W.M.; Cai, K. Supervisory Control of Discrete-Event Systems; Springer International Publishing: Cham, Switzerland, 2019. [Google Scholar]
  9. Wonham, W.; Cai, K.; Rudie, K. Supervisory control of discrete-event systems: A brief history. Annu. Rev. Control 2018, 45, 250–256. [Google Scholar] [CrossRef]
  10. Hales, T.; Adams, M.; Bauer, G.; Dang, T.D.; Harrison, J.; Hoang, L.T.; Kaliszyk, C.; Magron, V.; Mclaughlin, S.; Nguyen, T.T.; et al. A formal proof of the Kepler conjecture. Forum Math. Pi 2017, 5, e2. [Google Scholar] [CrossRef]
  11. Klein, G.; Andronick, J.; Elphinstone, K.; Heiser, G.; Cock, D.; Derrin, P.; Elkaduwe, D.; Engelhardt, K.; Kolanski, R.; Norrish, M.; et al. SeL4: Formal Verification of an Operating-System Kernel. Commun. ACM 2010, 53, 107–115. [Google Scholar] [CrossRef]
  12. Gonthier, G. Formal Proof—The Four-Color Theorem. Not. Am. Math. Soc. 2008, 55, 1382–1393. [Google Scholar]
  13. Leroy, X. Formal Verification of a Realistic Compiler. Commun. ACM 2009, 52, 107–115. [Google Scholar] [CrossRef]
  14. Karpas, E.; Magazzeni, D. Automated planning for robotics. Annu. Rev. Control Robot. Auton. Syst. 2020, 3, 417–439. [Google Scholar] [CrossRef]
  15. Zombori, Z.; Urban, J.; Brown, C.E. Prolog technology reinforcement learning prover. In Proceedings of the International Joint Conference on Automated Reasoning, Paris, France, 1–4 July 2020; Springer: Cham, Switzerland, 2020; pp. 489–507. [Google Scholar]
  16. Schader, M.; Luke, S. Planner-Guided Robot Swarms. In Proceedings of the International Conference on Practical Applications of Agents and Multi-Agent Systems, L’Aquila, Italy, 7–9 October 2020; Springer: Cham, Switzerland, 2020; pp. 224–237. [Google Scholar]
  17. Li, W.; Miyazawa, A.; Ribeiro, P.; Cavalcanti, A.; Woodcock, J.; Timmis, J. From Formalised State Machines to Implementations of Robotic Controllers. In Distributed Autonomous Robotic Systems: The 13th International Symposium; Groß, R., Kolling, A., Berman, S., Frazzoli, E., Martinoli, A., Matsuno, F., Gauci, M., Eds.; Springer International Publishing: Cham, Switzerland, 2018; pp. 517–529. [Google Scholar] [CrossRef]
  18. Vassilyev, S.N. Machine synthesis of mathematical theorems. J. Log. Program. 1990, 9, 235–266. [Google Scholar] [CrossRef]
  19. Davydov, A.; Larionov, A.; Cherkashin, E. On the calculus of positively constructed formulas for automated theorem proving. Autom. Control Comput. Sci. 2011, 45, 402–407. [Google Scholar] [CrossRef]
  20. Cherkashin, E.A.; Postoenko, A.; Vassilyev, S.N.; Zherlov, A. New Logics for Intelligent Control. In Proceedings of the Twelfth International Florida Artificial Intelligence Research Society Conference, Orlando, FL, USA, 1–5 May 1999; Kumar, A.N., Russell, I., Eds.; AAAI Press: Washington, DC, USA, 1999; pp. 257–261. [Google Scholar]
  21. Zherlov, A.K.; Vassilyev, S.N.; Fedosov, E.A.; Fedunov, B.E. Intelligent Control of Dynamic Systems; Fizmatlit: Moscow, Russia, 2000. (In Russian) [Google Scholar]
  22. Vassilyev, S.; Galyaev, A. Logical-optimization approach to pursuit problems for a group of targets. Dokl. Math. 2017, 95, 299–304. [Google Scholar] [CrossRef]
  23. Vassilyev, S.; Ponomarev, G. Automation methods for logical derivation and their application in the control of dynamic and intelligent systems. Proc. Steklov Inst. Math. 2012, 276, 161–179. [Google Scholar] [CrossRef]
  24. Larionov, A. Bootfrost. Available online: https://github.com/snigavik/bootfrost (accessed on 26 April 2024).
  25. Klein, G.; Andronick, J.; Keller, G.; Matichuk, D.; Murray, T.; O’Connor, L. Provably trustworthy systems. Philos. Trans. R. Soc. Math. Phys. Eng. Sci. 2017, 375, 20150404. [Google Scholar] [CrossRef] [PubMed]
  26. Davydov, A.; Larionov, A.; Nagul, N.V. The construction of controllable sublanguage of specification for DES via PCFs based inference. In Proceedings of the 2nd International Workshop on Information, Computation, and Control Systems for Distributed Environments, ICCS-DE 2020, Irkutsk, Russia, 6–7 July 2020; Bychkov, I., Tchernykh, A., Eds.; CEUR-WS.org: Aachen, Germany, 2020. CEUR Workshop Proceedings. Volume 2638, pp. 68–78. [Google Scholar]
  27. Cho, H.; Marcus, S.I. Supremal and maximal sublanguages arising in supervisor synthesis problems with partial observations. Math. Syst. Theory 1989, 22, 177–211. [Google Scholar] [CrossRef]
  28. Inan, K. An algebraic approach to supervisory control. Math. Control Signals Syst. 1992, 5, 151–164. [Google Scholar] [CrossRef]
  29. Rudie, K.; Murray Wonham, W. The infimal prefix-closed and observable superlanguage of a given language. Syst. Control Lett. 1990, 15, 361–371. [Google Scholar] [CrossRef]
  30. Feng, L.; Wonham, W.M. TCT: A computation tool for supervisory control synthesis. In Proceedings of the 2006 8th International Workshop on Discrete Event Systems, Ann Arbor, MI, USA, 10–12 July 2006; pp. 388–389. [Google Scholar]
  31. Lafortune, S. Desuma. Available online: https://gitlab.eecs.umich.edu/wikis/desuma (accessed on 15 April 2024).
  32. Ricker, L.; Lafortune, S.; Genc, S. DESUMA: A Tool Integrating GIDDES and UMDES. In Proceedings of the 2006 8th International Workshop on Discrete Event Systems, Ann Arbor, MI, USA, 10–12 July 2006; pp. 392–393. [Google Scholar] [CrossRef]
  33. Åkesson, K.; Fabian, M.; Flordal, H.; Vahidi, A.; Malik, R. Supremica. Available online: https://supremica.org/ (accessed on 5 April 2024).
  34. Malik, R.; Åkesson, K.; Flordal, H.; Fabian, M. Supremica–An Efficient Tool for Large-Scale Discrete Event Systems. IFAC-PapersOnLine 2017, 50, 5794–5799. [Google Scholar] [CrossRef]
  35. Åkesson, K.; Flordal, H.; Fabian, M. Exploiting modularity for synthesis and verification of supervisors. IFAC Proc. Vol. 2002, 35, 175–180. [Google Scholar] [CrossRef]
  36. Brandin, B.; Malik, R.; Malik, P. Incremental verification and synthesis of discrete-event systems guided by counter examples. IEEE Trans. Control Syst. Technol. 2004, 12, 387–401. [Google Scholar] [CrossRef]
  37. Mohajerani, S.; Malik, R.; Fabian, M. A Framework for Compositional Synthesis of Modular Nonblocking Supervisors. IEEE Trans. Autom. Control 2014, 59, 150–162. [Google Scholar] [CrossRef]
  38. Larionov, A.; Davydov, A.; Cherkashin, E. The method for translating first-order logic formulas into positively constructed formulas. Softw. Syst. 2019, 4, 556–564. [Google Scholar] [CrossRef]
  39. Larionov, A.; Davydov, A.; Cherkashin, E. The calculus of positively constructed formulas, its features, strategies and implementation. In Proceedings of the 2013 36th International Convention on Information and Communication Technology, Electronics and Microelectronics (MIPRO), Opatija, Croatia, 20–24 May 2013; pp. 1023–1028. [Google Scholar]
  40. Sutcliffe, G. The TPTP Problem Library and Associated Infrastructure. From CNF to TH0, TPTP v6.4.0. J. Autom. Reason. 2017, 59, 483–502. [Google Scholar] [CrossRef]
  41. Ulyanov, S.; Bychkov, I.; Maksimkin, N. Event-Based Path-Planning and Path-Following in Unknown Environments for Underactuated Autonomous Underwater Vehicles. Appl. Sci. 2020, 10, 7894. [Google Scholar] [CrossRef]
  42. Bychkov, I.; Ulyanov, S.; Nagul, N.; Davydov, A.; Kenzin, M.; Maksimkin, N. Hierarchical event-based control of multi-robot systems in unstructured environments. J. Phys. Conf. Ser. 2021, 1864, 012001. [Google Scholar] [CrossRef]
  43. Lapierre, L.; Soetanto, D. Nonlinear path-following control of an AUV. Ocean Eng. 2007, 34, 1734–1744. [Google Scholar] [CrossRef]
  44. Yan, Z.; Li, J.; Zhang, G.; Wu, Y. A Real-Time Reaction Obstacle Avoidance Algorithm for Autonomous Underwater Vehicles in Unknown Environments. Sensors 2018, 18, 438. [Google Scholar] [CrossRef] [PubMed]
  45. Dubins, L.E. On Curves of Minimal Length with a Constraint on Average Curvature, and with Prescribed Initial and Terminal Positions and Tangents. Am. J. Math. 1957, 79, 497–516. [Google Scholar] [CrossRef]
  46. Kostylev, D.; Tolstikhin, A.; Ulyanov, S. Development of the complex modelling system for intelligent control algorithms testing. In Proceedings of the 2019 42nd International Convention on Information and Communication Technology, Electronics and Microelectronics (MIPRO), Opatija, Croatia, 20–24 May 2019; pp. 943–948. [Google Scholar] [CrossRef]
  47. Geng, X.; Ouyang, D.; Han, C. Verifying Diagnosability of Discrete Event System with Logical Formula. Chin. J. Electron. 2020, 29, 304–311. [Google Scholar] [CrossRef]
  48. Reijnen, F.F.; Erens, T.R.; van de Mortel-Fronczak, J.M.; Rooda, J.E. Supervisory controller synthesis and implementation for safety PLCs. Discret. Event Dyn. Syst. 2022, 32, 115–141. [Google Scholar] [CrossRef]
  49. Seow, K.T. Supervisory Control of Fair Discrete-Event Systems: A Canonical Temporal Logic Foundation. IEEE Trans. Autom. Control 2021, 66, 5269–5282. [Google Scholar] [CrossRef]
  50. Thistle, J.G.; Wonham, W.M. Control problems in a temporal logic framework. Int. J. Control 1986, 44, 943–976. [Google Scholar] [CrossRef]
  51. Rawlings, B.C.; Lafortune, S.; Ydstie, B.E. Supervisory Control of Labeled Transition Systems Subject to Multiple Reachability Requirements via Symbolic Model Checking. IEEE Trans. Control Syst. Technol. 2020, 28, 644–652. [Google Scholar] [CrossRef]
  52. Jiang, S.; Kumar, R. Supervisory Control of Discrete Event Systems with CTL* Temporal Logic Specifications. SIAM J. Control Optim. 2006, 44, 2079–2103. [Google Scholar] [CrossRef]
  53. Aucher, G. Supervisory Control Theory in Epistemic Temporal Logic. In Proceedings of the International Conference on Autonomous Agents and Multi-Agent Systems, Paris, France, 5–9 May 2014; Volume 1. [Google Scholar]
  54. Ritsuka, K.; Rudie, K. Do what you know: Coupling knowledge with action in discrete-event systems. Discret. Event Dyn. Syst. 2023, 33, 257–277. [Google Scholar] [CrossRef]
Figure 1. Automaton G.
Figure 2. Automaton H as a recognizer of K̄.
Figure 3. A part of automaton T_Obs for observability checking.
Figure 4. Observer automaton for H with Σ_uo = {u, v, c, e}.
Figure 5. Observer automaton for H with Σ_uo = {u, c, e}.
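Observer automata such as those in Figures 4 and 5 are obtained by the standard subset construction restricted to observable events. A minimal sketch follows; the automaton encoding and the toy example are illustrative assumptions, not the automata from the figures:

```python
def unobs_reach(A, states, Sigma_uo):
    """Close a set of states under transitions labeled by unobservable events."""
    seen, stack = set(states), list(states)
    while stack:
        q = stack.pop()
        for e, q2 in A.get(q, {}).items():
            if e in Sigma_uo and q2 not in seen:
                seen.add(q2)
                stack.append(q2)
    return frozenset(seen)

def observer(A, q0, Sigma_uo):
    """Subset construction: observer states are sets of states of A that are
    indistinguishable after observing the same observable word."""
    init = unobs_reach(A, {q0}, Sigma_uo)
    trans, todo = {}, [init]
    while todo:
        S = todo.pop()
        if S in trans:
            continue
        trans[S] = {}
        moves = {}
        for q in S:
            for e, q2 in A.get(q, {}).items():
                if e not in Sigma_uo:       # observable events only
                    moves.setdefault(e, set()).add(q2)
        for e, T in moves.items():
            trans[S][e] = unobs_reach(A, T, Sigma_uo)
            todo.append(trans[S][e])
    return init, trans

# Toy automaton: u is unobservable, a is observable.
H = {1: {'u': 2}, 2: {'a': 3}, 3: {}}
init, trans = observer(H, 1, {'u'})
print(sorted(init))              # [1, 2]
print(sorted(trans[init]['a']))  # [3]
```

Each observer state is a frozenset, so it can serve directly as a dictionary key in the transition table.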
Figure 6. Reference frames (borrowed from [41]).
Figure 7. The maximum turning radius R_max^i (borrowed from [41]).
Figure 8. DES G_msn for the path-following mission.
Figure 9. Specification H_msn for DES G_msn.
Figure 10. The result of computer modeling of the path-following and obstacle avoidance mission with the supervised partially observed DES G_msn.
Figure 11. The changes in the state of G_msn. The marks on the line indicate moments when a new path is generated.
Figure 12. Simulation at the robotic test bed. The fault event eER led to the WPD state, and a collision with the obstacle was avoided.
Table 1. PCF F_Obs refutation progress for checking observability using automaton T_Obs.
R# | σ | (q_H1, q_H2, q_G) | (t_H1, t_H2, t_G) | Additions into the Base
1  |   |                   | (1, 1, 1)         | T(1,1,1)
3  | u | (1, 1, 1)         | (2, 1, 1)         | T(2,1,1), δ_T(1,1,1,u,ε,ε,2,1,1)
4  | u | (1, 1, 1)         | (1, 2, 2)         | T(1,2,2), δ_T(1,1,1,ε,u,u,1,2,2)
4  | u | (2, 1, 1)         | (2, 2, 2)         | T(2,2,2), δ_T(2,1,1,ε,u,u,2,2,2)
4  | v | (1, 2, 2)         | (1, 3, 3)         | T(1,3,3), δ_T(1,2,2,ε,v,v,1,3,3)
4  | v | (2, 2, 2)         | (2, 3, 3)         | T(2,3,3), δ_T(2,2,2,ε,v,v,2,3,3)
3  | v | (2, 1, 1)         | (3, 1, 1)         | T(3,1,1), δ_T(2,1,1,v,ε,ε,3,1,1)
5  | a | (3, 1, 1)         |                   | dead(3,1,1,a)
6  | a | (3, 1, 1)         |                   |
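For intuition, the property that Table 1 checks by PCF refutation can also be tested directly from the definition of observability by brute force over bounded-length words. The sketch below returns a conflicting pair of words as a witness when observability fails; the automata are a hypothetical toy example, not the paper's G and H:

```python
def run(A, q0, s):
    """State reached in A on word s, or None if s leaves the language."""
    q = q0
    for e in s:
        q = A.get(q, {}).get(e)
        if q is None:
            return None
    return q

def words_of(A, q0, depth):
    """All words of length <= depth in the (prefix-closed) language of A."""
    out, frontier = [()], [((), q0)]
    for _ in range(depth):
        nxt = []
        for s, q in frontier:
            for e, q2 in A.get(q, {}).items():
                nxt.append((s + (e,), q2))
                out.append(s + (e,))
        frontier = nxt
    return out

def check_observability(G, H, q0G, q0H, Sigma, Sigma_o, depth=4):
    """Bounded check of the definition: for all s, t in K with the same
    projection, if s·e stays in K and t·e is possible in G, then t·e must
    stay in K.  Returns a conflicting triple (s, t, e), or None."""
    P = lambda s: tuple(e for e in s if e in Sigma_o)
    K = words_of(H, q0H, depth)
    for s in K:
        for t in K:
            if P(s) != P(t):
                continue
            for e in Sigma:
                if (run(H, q0H, s + (e,)) is not None
                        and run(G, q0G, t + (e,)) is not None
                        and run(H, q0H, t + (e,)) is None):
                    return (s, t, e)
    return None

# Toy example: u is unobservable; K allows 'a' initially but not after 'u'.
G = {1: {'a': 2, 'u': 3}, 2: {}, 3: {'a': 4}, 4: {}}
H = {1: {'a': 2, 'u': 3}, 2: {}, 3: {}}
print(check_observability(G, H, 1, 1, {'a', 'u'}, {'a'}))  # ((), ('u',), 'a')
```

The witness says that the empty word and the word u look identical to the supervisor, yet the continuation a must be allowed after one and forbidden after the other, which is exactly the kind of conflict extracted by the PCF F_Conf.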
Table 2. The inference of the PCF F_Conf.
σ | (q_H1, q_H2, q_G) | (q_H1^f, q_H2^f, q_G^f) | (s_1, s_2, s_3) | (σ_1, σ_2, σ_3) | Additions
a | (3, 1, 1)         | (2, 1, 1)               |                 | (v, ε, ε)       | next(2,1,1,v,ε,ε,a)
a | (2, 1, 1)         | (1, 1, 1)               | (v, ε, ε)       | (u, ε, ε)       | next(1,1,1,u·v,ε·ε,ε·ε,a)
Table 3. The first steps of the PCF F_BSCOP inference constructing L(J^P/G).
Base Atoms Used | s | q | σ | q′ | q̄_J | q̄′_J | Additions
L_{JP/G}(ε, 1), δ_G(1, u, 2), δ_{HObs}([1,2], u, [1,2]) | ε | 1 | u | 2 | [1,2] | [1,2] | L_{JP/G}(ε·u, 2)
L_{JP/G}(ε·u, 2), δ_G(2, v, 3), δ_{HObs}([1,2], v, [3]) | ε·u | 2 | v | 3 | [1,2] | [3] | L_{JP/G}(ε·u·v, 3)
L_{JP/G}(ε·u·v, 3), δ_G(3, a, 4), δ_{HObs}([3], a, [4,5]) | ε·u·v | 3 | a | 4 | [3] | [4,5] | L_{JP/G}(ε·u·v·a, 4)
L_{JP/G}(ε·u·v·a, 4), δ_G(4, c, 5), δ_{HObs}([4,5], c, [4,5]) | ε·u·v·a | 4 | c | 5 | [4,5] | [4,5] | L_{JP/G}(ε·u·v·a·c, 5)
L_{JP/G}(ε·u·v·a, 4), δ_G(4, b, 2), δ_{HObs}([4,5], b, [1,2]) | ε·u·v·a | 4 | b | 2 | [4,5] | [1,2] | L_{JP/G}(ε·u·v·a·b, 2)
Table 4. The first steps of the PCF F_BSCOP# inference constructing L#(J^P/G).
Base Atoms Used | s | q | σ | q′ | q̄_J | q̄′_J | Additions
L_{JP/G}(ε, 1), δ_G(1, u, 2), δ_{HObs}([1,2], u, [1,2]), E(u) | ε | 1 | u | 2 | [1,2] | [1,2] | L_{JP/G}(ε·u, 2), E(σ#)
L_{JP/G}(ε·u, 2), δ_G(2, v, 3), δ_{HObs}([1,2], v, [3]), E(v) | ε·u | 2 | v | 3 | [1,2] | [3] | L_{JP/G}(ε·u·v, 3), E(σ#)
L_{JP/G}(ε·u·v, 3), δ_G(3, a, 4), δ_{HObs}([3], a, [4,5]), E(a) | ε·u·v | 3 | a | 4 | [3] | [4,5] | L_{JP/G}(ε·u·v·a, 4), E(σ#)
L_{JP/G}(ε·u·v·a, 4), δ_G(4, c, 5), δ_{HObs}([4,5], c, [4,5]), E(c) | ε·u·v·a | 4 | c | 5 | [4,5] | [4,5] | L_{JP/G}(ε·u·v·a·c, 5), E(σ#)
L_{JP/G}(ε·u·v·a·c, 5), δ_G(5, d, 1), δ_{HObs}([4,5], d, [1,2]), E(d) | ε·u·v·a·c | 5 | d | 1 | [4,5] | [1,2] | L_{JP/G}(ε·u·v·a·c·d, 1), E(σ#)
Table 5. Events of the DES G_msn.
Name | Triggering Condition | Description
eOLN | R_max^L < R_min | The obstacle detected on the left is near
eOLNF | R_min ≤ R_max^L < R_min + ΔR | The obstacle detected on the left is not far
eNOL | R_max^L = ∞ | There are no obstacles on the left
eORN | R_max^R < R_min | The obstacle detected on the right is near
eORNF | R_min ≤ R_max^R < R_min + ΔR | The obstacle detected on the right is not far
eNOR | R_max^R = ∞ | There are no obstacles on the right
eON | ρ_min < ρ_n | The detected obstacle is near
eOF | ρ_min > ρ_f | The detected obstacle is far
eEAP |  | The robot has reached the end of the avoidance path
eRPR |  | The robot has reached the reference path
eAL1 | eOLNF and (eORN ∨ eON) | The robot detours the obstacle from the left
eAL2 | eEAP ∨ eOLNF | The robot detours the obstacle from the left
eAL3 | (eOLNF ∨ eNOL) and (eORN ∨ eON) | The robot detours the obstacle from the left
eAL4 | eON ∨ eOLNF | The robot detours the obstacle from the left
eAR1 | eORNF and (eOLN ∨ eOLNF ∨ eON) | The robot detours the obstacle from the right
eAR2 | eEAP ∨ eORNF | The robot detours the obstacle from the right
eAR3 | (eORNF ∨ eNOR) and (eOLN ∨ eON) | The robot detours the obstacle from the right
eAR4 | eON ∨ eORNF | The robot detours the obstacle from the right
eRPF |  | The reference path is found
ePC |  | The path is corrected
eER |  | An error has occurred
eNPR |  | A new path is required
eNPC |  | A new path has been computed
eOL |  | The obstacle is lost
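The primitive triggering conditions in Table 5 are simple threshold tests on range measurements; they can be sketched as an event generator as follows. The function name and the threshold values are assumptions for illustration, not parameters taken from the paper:

```python
import math

def sensor_events(R_L, R_R, rho_min, R_min=5.0, dR=3.0, rho_n=2.0, rho_f=10.0):
    """Map range measurements to the primitive events of Table 5.
    R_L, R_R: ranges to the nearest obstacle on the left/right
    (math.inf when nothing is detected); rho_min: distance to the
    closest detected obstacle."""
    ev = set()
    # left side: no obstacle / near / not far
    if R_L == math.inf:
        ev.add('eNOL')
    elif R_L < R_min:
        ev.add('eOLN')
    elif R_L < R_min + dR:
        ev.add('eOLNF')
    # right side: same thresholds
    if R_R == math.inf:
        ev.add('eNOR')
    elif R_R < R_min:
        ev.add('eORN')
    elif R_R < R_min + dR:
        ev.add('eORNF')
    # closest obstacle overall: near / far
    if rho_min < rho_n:
        ev.add('eON')
    elif rho_min > rho_f:
        ev.add('eOF')
    return ev

print(sorted(sensor_events(math.inf, 4.0, 1.0)))  # ['eNOL', 'eON', 'eORN']
```

The compound events eAL1–eAR4 are then Boolean combinations of this set, as given in the table.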