Article

Safety of Uncertain Session Types with Interval Probability

by Bogdan Aman 1,2 and Gabriel Ciobanu 1,*
1 Institute of Computer Science, Romanian Academy, 700505 Iaşi, Romania
2 Faculty of Computer Science, Alexandru Ioan Cuza University, 700506 Iaşi, Romania
* Author to whom correspondence should be addressed.
Symmetry 2025, 17(2), 218; https://doi.org/10.3390/sym17020218
Submission received: 3 October 2024 / Revised: 17 January 2025 / Accepted: 30 January 2025 / Published: 1 February 2025
(This article belongs to the Section Mathematics)

Abstract: In this article, we introduce and study a new kind of multiparty session type for a probabilistic process calculus that incorporates two forms of choice: probabilistic and nondeterministic. The main novelty of our approach is the use of interval probabilities in a type system in order to deal with uncertainty in a probabilistic process calculus. We present a decidable proof system that ensures deadlock freedom, type preservation and type safety, even when several types can be assigned to a process. The new typing system represents a conservative extension of the standard typing system based on multiparty session types (removing the probabilities from processes does not affect their well-typedness). We also define a probabilistic bisimulation between processes that are typed by using the same sorting and typing.

1. Introduction

Session types represent a type discipline used to describe communication-centric systems in the framework of process calculi. For concurrent processes, session types guarantee that message-passing communication follows predefined protocols, implementing given (multiparty) session protocols without errors. In this way, session types statically describe the communication protocols, and well-typedness ensures that the runtime behavior is in accordance with these protocols.
The theory of session types operates under the assumption of no system errors (e.g., failure at the transport level), and does not account for tolerance of interactions or variability of parameters in a communication protocol [1]. In fact, most of the work on session types has focused on error-free (and non-probabilistic) scenarios; only recently have some papers focusing on failures been published [2,3,4,5,6,7]. However, many real-life communication and concurrent message-passing systems involve various uncertainties, and so they are generally probabilistic. In the real world, these systems involving network and embedded equipment exhibit different kinds of uncertainty in their behavior as a result of random physical parameters. Probability theory and stochastic processes provide the mathematical foundation for analyzing uncertain and random phenomena, and a probabilistic approach can be more realistic and helpful in analyzing communication-centric systems. The use of probabilities in session types has been presented recently in several articles [8,9,10,11,12,13,14].
In this article, we introduce new multiparty session types for a probabilistic process calculus that incorporates two forms of choice: probabilistic and nondeterministic. The main novelty of our approach is the use of interval probabilities in a type system in order to deal with uncertainty in a probabilistic process calculus; in this way, several processes can be well-typed by the same type and several types can be assigned to the same process. We present a decidable proof system that ensures deadlock freedom, type preservation and type safety.
The concept of interval probability is introduced to generalize classical probability for describing uncertainty in general. Such intervals of probabilities are mentioned in early articles [15,16]. More recently, interval probability has been used in [17] for stochastic hybrid systems to manage uncertainty and to develop an efficient formal analysis and control synthesis of (discrete-time) stochastic hybrid systems with linear dynamics. Using the Interval Markov Decision Processes (labelling transitions by intervals of probabilities) as an abstract model, the authors present a robust strategy against the uncertainties of the real-world hybrid systems. The uncertainties are viewed as the nondeterministic choice of a feasible transition probability from one state to another (under a given action); in this way, specific reachability and safety problems are solved.
As usual for multiparty session types, we consider a top-down approach in which we utilize global types (that describe the interactions among distributed processes) to enforce communication safety and local types (obtained by projection from global types) to statically type-check the processes. If processes are well-typed, then they correctly interact as prescribed by a multiparty session protocol. To formally describe these systems, we use interval probabilities in our type system in order to deal with uncertainty in the probabilistic process calculus. In this way, we obtain a formal model for complex communication-centric protocols that describes and analyzes their behavior while tolerating the inherent uncertainties produced by certain random physical parameters. We use local and global types that incorporate interval probabilities, and allow processes to incorporate two forms of choice: probabilistic internal and nondeterministic external. Note that, in general, nondeterministic behavior cannot be replaced by probabilistic behavior (only in some cases is it valid to assume that a probabilistic mechanism governs a nondeterministic choice). This is also due to the fact that the behavior produced by nondeterministic choices is unpredictable, while the one produced by probabilistic choices is predictable in the long run. Thus, the probabilistic and nondeterministic choices need to be treated separately [18].
The paper follows the standard way of dealing with session types. Section 2 presents the syntax and semantics of our probabilistic calculus incorporating two types of choice (probabilistic and nondeterministic); we use a running example to illustrate the key aspects. Section 3 introduces the global and local types, together with their connection established by a projection. Section 4 presents the new probabilistic typing system that ensures deadlock freedom, type preservation and type safety, even when several types can be assigned to a process. The new typing system represents a conservative extension; removing the probabilities from processes does not affect their well-typedness. Section 4 also defines the probabilistic relations between processes that are typed according to specific sorting and typing, and a bisimulation between processes that are typed under the same sorting and typing. Section 5 presents the conclusions, and discusses other approaches that used probabilities in typing systems.
This is a revised and extended version of [19]. In addition to some improvements in motivation, related work and by including the missing proofs, the most significant technical differences between this article and its conference version are as follows: (i) defining a probabilistic relation between processes that are typed under the same sorting and typing and corresponding definitions when the probabilities are erased; (ii) proving that the previous relation is an equivalence and that there exist several such relations that can relate two processes; (iii) defining and studying bisimulations between processes by also taking into account their types and interval probability.

2. Probabilistic Multiparty Session Calculus

Using similar notations as for the calculus presented in [20], we propose a probabilistic extension of a synchronous multiparty session process calculus. However, taking inspiration from [21], we simplify the syntax by omitting session initiations and dedicated input/output prefixes. Our calculus allows processes to incorporate two forms of choice: probabilistic internal choice and nondeterministic external choice.

2.1. Syntax

A session between multiple parties is basically a series of their interactions. In order to establish a session, the involved parties should use a shared public name that is employed to create fresh session channels that will be used only for communication between them. Table 1 presents the syntax of our probabilistic calculus, where we use p for probabilities, c and s for channels, v for values, x for variables, l for labels, r for roles and X for the process variables appearing in recursive processes.
The selection process $c[r] \oplus_{i \in I} p_i : l_i(v_i).P_i$ uses channel c to send with probability $p_i$ towards role r the labelled value $l_i(v_i)$, and then it behaves according to process $P_i$. Since all the labels in the set $\{l_i\}_{i \in I}$ are unique by construction, the selection process has $|I|$ possible continuations. We only impose that the sum of the probabilities appearing in a selection is 1, without further constraints or connections with other probabilities appearing in a process. In a similar manner, the branching process $c[r]\,\&_{i \in I}\,\{l_i(x_i).P_i\}$ uses channel c to receive from role r a labelled value. If the received value has label $l_i$, the branching process behaves according to process $P_i$ in which the received value replaces all free occurrences of the variable $x_i$. Note that the branching process is the only one that binds variables, namely each variable $x_i$ is bound within the scope of $P_i$. The sets of free channels and free variables of a process P (namely those not bound in the branching process) are denoted by $fc(P)$ and $fv(P)$, respectively, while the set of all processes is denoted by $\mathcal{P}$.
The restriction process $(\nu s)P$ represents a process P in which the scope of session s is constrained. The process definition $\mathsf{def}\ D\ \mathsf{in}\ P$ and the process call $X\langle \tilde{v}\tilde{c}\rangle$ are used to define recursive behaviors. If D from the process definition has the form $X(\tilde{x}) = P'$, then the process call $X\langle \tilde{v}\tilde{c}\rangle$ behaves as $P'$ in which $\tilde{v}\tilde{c}$ replace the variables $\tilde{x}$. Similar to the approach of [20], we allow only closed process declarations, in which the process declaration $X(\tilde{x}) = P'$ is such that $fc(P') = \emptyset$ and $fv(P') \subseteq \tilde{x}$. The inaction process $\mathbf{0}$ is a process that does nothing, while the parallel process $P \mid Q$ allows the concurrent processes P and Q to communicate on a shared channel. The error process $\mathsf{err}$ marks the fact that we have an unsuccessful communication. Processes containing holes are used as evaluation contexts, while a communication channel c is either a variable x or a channel $s[r]$ for role r in a session s, as in [22]. The values v used in processes can be Booleans, integers or strings.
Example 1.
In what follows, we describe a process that outlines a straightforward survey poll in which Alice either answers Bob ’s questions or opts out. Bob can conclude the poll if satisfied with the responses received or proceed with additional polling questions. Similar to the scenario described in [23], Alice and Bob express their preferences and receive others’ choices, each accompanied by numerical probabilities. These probabilities establish a clear numerical scale for responses.
The starting system is as follows:
System = ( ν s ) ( Bob | ( ν phoneA ) ( ν imA ) Alice ) .
The session that involves the parallel processes of Alice and Bob is denoted by s . This session encompasses two roles: r A assigned to Alice and r B assigned to Bob . Thus, within this session, Alice and Bob communicate by using channels identified by roles s [ r A ] and s [ r B ] , respectively. Additionally, phoneA and imA are sessions in which Alice and Bob can interact; these sessions can be established by performing communications within session s . It is important to note that initially the scopes of phoneA and imA are limited to Alice , as Bob is informed later if he needs to join one of these sessions. The variable x q appearing in A 1 of Alice is used to wait for a question q from Bob to which she can answer or not.
Alice = def A ( y ) = A 1 in A 2 , where the definitions of A 1 and A 2 are as follows:
$A_1 = y[r_B]\,\&\,\{\,\mathit{talk}(x_q).\,y[r_B] \oplus \{0.6 : \mathit{yes}(x_q).A\langle y\rangle,\ 0.3 : \mathit{no}(x_q).A\langle y\rangle,\ 0.1 : \mathit{quit}(x_q).\mathbf{0}\},\ \mathit{quit}.\mathbf{0}\,\}$;
$A_2 = s[r_A][r_B] \oplus \{0.6 : \mathit{phone}(\mathit{phoneA}).A\langle \mathit{phoneA}[r_A]\rangle,\ 0.35 : \mathit{im}(\mathit{IMIdA}).A\langle \mathit{imA}[r_A]\rangle,\ 0.05 : \mathit{not}(\mathit{no}).\mathbf{0}\}$.
Bob = def B ( y , q ) = B 1 in B 2 , where the definitions of B 1 and B 2 are as follows:
$B_1 = y[r_A] \oplus \{0.95 : \mathit{talk}(q).\,y[r_A]\,\&\,\{\mathit{yes}(q).B\langle y, \mathit{next}(q)\rangle,\ \mathit{no}(q).B\langle y, \mathit{next}(q)\rangle,\ \mathit{unsure}(q).B\langle y, \mathit{next}(q)\rangle,\ \mathit{quit}(q).\mathbf{0}\},\ 0.05 : \mathit{quit}.\mathbf{0}\}$;
$B_2 = s[r_B][r_A]\,\&\,\{\mathit{phone}(x_A).B\langle x_A[r_B], q_1\rangle,\ \mathit{im}(x_A).B\langle x_A[r_B], q_1\rangle,\ \mathit{not}(x_A).\mathbf{0}\}$.
The concurrent behavior of processes Alice and Bob is explained through their interaction.
  • Initial decision: Alice uses the channel with role s [ r A ] in order to decide how to respond to the poll: by phone, instant messaging or by deferring the response. Probabilities are assigned to each option, indicating Alice ’s choice.
  • Phone communication: If Alice chooses the phone option, communication occurs through a unique session channel ($\mathit{phoneA}$) representing Alice ’s phone number.
  • Bob ’s interaction: Bob initiates the interaction by invoking B ( x A [ r B ] , q 1 ) , where x A becomes phoneA after communication. Here, q 1 represents the first question for Alice to answer using the following labels: yes , no, unsure or quit . Alice excludes the unsure option for this question, as she considers it inapplicable to the question at hand.
  • Question–answer sequence: After receiving an answer, Bob proceeds to the next question in the list (represented by next ).
  • Exiting the poll: Both processes are able to get out of the poll by selecting either the quit or not label. These options have low probabilities, indicating a low likelihood of being chosen during the survey poll.

2.2. Operational Semantics

If a name s is not in fc ( P ) , then we cannot find any role r such that s [ r ] exists within fc ( P ) . Additionally, we use d p v ( D ) to represent all process variables that are declared in D and f p v ( P ) to represent all process variables that appear as free variables in P.
Operational semantics is defined by using the structural congruence as its foundation:
  $P \mid Q \equiv Q \mid P \qquad (P \mid Q) \mid R \equiv P \mid (Q \mid R) \qquad P \mid \mathbf{0} \equiv P \qquad (\nu s)\mathbf{0} \equiv \mathbf{0} \qquad \mathsf{def}\ D\ \mathsf{in}\ \mathbf{0} \equiv \mathbf{0}$
  $(\nu s)(\nu s')P \equiv (\nu s')(\nu s)P \qquad (\nu s)P \mid Q \equiv (\nu s)(P \mid Q) \quad (\text{if } s \notin fc(Q))$
  $\mathsf{def}\ D\ \mathsf{in}\ (\nu s)P \equiv (\nu s)\,\mathsf{def}\ D\ \mathsf{in}\ P \quad (\text{if } s \notin fc(D))$
  $\mathsf{def}\ D\ \mathsf{in}\ (P \mid Q) \equiv (\mathsf{def}\ D\ \mathsf{in}\ P) \mid Q \quad (\text{if } dpv(D) \cap fpv(Q) = \emptyset)$
$\mathsf{def}\ D\ \mathsf{in}\ \mathsf{def}\ D'\ \mathsf{in}\ P \equiv \mathsf{def}\ D'\ \mathsf{in}\ \mathsf{def}\ D\ \mathsf{in}\ P$
  $\quad (\text{if } (dpv(D) \cup fpv(D)) \cap dpv(D') = \emptyset = (dpv(D') \cup fpv(D')) \cap dpv(D))$
$P \equiv Q$ implies $E[P] \equiv E[Q]$.
For our probabilistic calculus, operational semantics is defined inductively according to Table 2, incorporating the previously defined structural congruence ≡ and also α -conversion. A process P is considered erroneous if and only if there exists a context E such that P = E [ err ] .
The communication between a probabilistic selection process (with role $r_1$) and a branching process (with role $r_2$) inside a session s is modelled using rule (Comm). When the process $s[r_1][r_2] \oplus_{j \in J} p_j : l_j(v_j).P_j$ selects with probability $p_k$ a branch with label $l_k$, the corresponding branch with label $l_k$ is also selected in the process $s[r_2][r_1]\,\&_{i \in I}\,\{l_i(x_i).P_i'\}$. The probabilistic selection process behaves afterwards as $P_k$, while the branching process behaves afterwards as $P_k'\{v_k/x_k\}$, where the communicated value $v_k$ replaces the variable $x_k$. It is important to note that the options provided by the probabilistic selection process may be fewer than those offered by the branching process, since the latter must account for all potential continuations based on different scenarios. However, all options in the selection set J must be present in the branching set I (i.e., $J \subseteq I$) to ensure no branch is chosen without a corresponding match. If an option k from the selection set J is absent from the branching set I (i.e., $k \in J$ and $k \notin I$) and branch k is chosen, an error $\mathsf{err}$ is produced (according to rule (Err)).
Rule (Call) initiates a process call by utilizing its definition, in which the actual parameters $\tilde{v}$ are used to substitute the formal parameters $\tilde{x}$. The side condition of rule (Call) ensures an equal number of variables and values appearing in the definition and the process call, respectively. Rule (Struct) asserts that the transition relation is invariant under structural congruence. Rule (Ctxt) specifies that a transition can occur under various evaluation contexts. The newly obtained value $p'$ associated with the reduction arrow is a normalized version of the original probability value p, adjusted according to the total number of rules applicable to both process P and process $E[P]$. The $\mathit{nextProc}$ function is used to determine the number of rules applicable to a process P. Since the $\mathit{nextProc}$ function is only relevant when applying rule (Ctxt), it implies that $\mathit{nextProc}(P) > 0$ and $\mathit{nextProc}(E[P]) > 0$, as both P and $E[P]$ are capable of performing at least one reduction step.
The nextProc function is defined as follows:
$$\mathit{nextProc}(P) = \begin{cases}
1 + \mathit{nextProc}(P') & \text{if } P \equiv s[r_1][r_2] \oplus_{j \in J} p_j : l_j(v_j).P_j \mid s[r_2][r_1]\,\&_{i \in I}\,\{l_i(x_i).P_i\} \mid P' \text{ and } J \subseteq I\\
\mathit{nextProc}(P') + \mathit{nextProc}(P'') & \text{if } P \equiv (\nu s)P' \mid P''\\
k + \mathit{nextProc}(P') + \mathit{nextProc}(P'') & \text{if } P \equiv \mathsf{def}\ D\ \mathsf{in}\ (X\langle \tilde{v}_1\rangle \mid \cdots \mid X\langle \tilde{v}_k\rangle \mid P') \mid P''\\
& \quad \text{where } D ::= X(\tilde{x}) = Q \text{ and } X\langle \tilde{v}\rangle \notin P'\\
0 & \text{otherwise.}
\end{cases}$$
The necessity of this function will become clear in the next example.
Example 2.
This example shows how the semantics works. To improve readability, the parts of the processes that are used in the reduction are underlined. Consequently, the initial process can be rewritten as follows:
$(\nu s)(\mathit{Bob} \mid (\nu\,\mathit{phoneA}, \mathit{imA})\,\mathit{Alice}) \equiv (\nu s)(\nu\,\mathit{phoneA})(\nu\,\mathit{imA})\,\big(\,\mathsf{def}\ B(y,q) = B_1\ \mathsf{in}\ \mathsf{def}\ A(y) = A_1\ \mathsf{in}\ \underline{s[r_B][r_A]}\,\&\,\{\underline{\mathit{phone}(x_A)}.B\langle x_A[r_B], q_1\rangle,\ \mathit{im}(x_A).B\langle x_A[r_B], q_1\rangle,\ \mathit{not}(x_A).\mathbf{0}\} \mid \underline{s[r_A][r_B]} \oplus \{\underline{0.6 : \mathit{phone}(\mathit{phoneA})}.A\langle \mathit{phoneA}[r_A]\rangle,\ 0.35 : \mathit{im}(\mathit{IMIdA}).A\langle \mathit{imA}[r_A]\rangle,\ 0.05 : \mathit{not}().\mathbf{0}\}\,\big) = \mathit{System}_1.$
Note that the first applied rule is (Comm), while the second one is (Ctxt) with the context $E = (\nu s)(\nu\,\mathit{phoneA})(\nu\,\mathit{imA})\ \mathsf{def}\ B(y,q) = B_1\ \mathsf{in}\ \mathsf{def}\ A(y) = A_1\ \mathsf{in}\ [\,]$. The uniqueness of the next applied rule for $\mathit{System}_1$ is captured by the fact that $\mathit{nextProc}(\mathit{System}_1) = 1$. Suppose Alice selects (with probability 0.6) her phone to respond to the survey poll:
$\xrightarrow{0.6} (\nu s)(\nu\,\mathit{phoneA})(\nu\,\mathit{imA})\,\big(\,\mathsf{def}\ B(y,q) = B_1\ \mathsf{in}\ \mathsf{def}\ A(y) = A_1\ \mathsf{in}\ B\langle x_A[r_B], q_1\rangle\{\mathit{phoneA}/x_A\} \mid A\langle \mathit{phoneA}[r_A]\rangle\,\big) \equiv (\nu s)(\nu\,\mathit{phoneA})(\nu\,\mathit{imA})\,\big(\,\mathsf{def}\ B(y,q) = B_1\ \mathsf{in}\ \mathsf{def}\ A(y) = A_1\ \mathsf{in}\ B\langle \mathit{phoneA}[r_B], q_1\rangle \mid A\langle \mathit{phoneA}[r_A]\rangle\,\big) = \mathit{System}_2.$
Observe that the rule (Call) is applied to two distinct process calls; subsequently, the (Ctxt) rule is applied. This is captured by $\mathit{nextProc}(\mathit{System}_2) = 2$. Thus, the probability that the rule (Ctxt) is applied equals the probability that the rule (Call) is applied divided by two. Suppose the process definition $P = \mathsf{def}\ B(y,q) = B_1\ \mathsf{in}\ B\langle \mathit{phoneA}[r_B], q_1\rangle$ is executed inside the context $E = (\nu s)(\nu\,\mathit{phoneA})(\nu\,\mathit{imA})\,([\,] \mid \mathsf{def}\ A(y) = A_1\ \mathsf{in}\ A\langle \mathit{phoneA}[r_A]\rangle)$, namely $\mathit{System}_2 = E[P]$:
$(\nu s)(\nu\,\mathit{phoneA})(\nu\,\mathit{imA})\,\big(\,\mathsf{def}\ B(y,q) = B_1\ \mathsf{in}\ B\langle \mathit{phoneA}[r_B], q_1\rangle \mid \mathsf{def}\ A(y) = A_1\ \mathsf{in}\ A\langle \mathit{phoneA}[r_A]\rangle\,\big) \xrightarrow{0.5} (\nu s)(\nu\,\mathit{phoneA})(\nu\,\mathit{imA})\,\big(\,\mathsf{def}\ B(y,q) = B_1\ \mathsf{in}\ B_1\{\mathit{phoneA}[r_B]/y,\ q_1/q\} \mid \mathsf{def}\ A(y) = A_1\ \mathsf{in}\ A\langle \mathit{phoneA}[r_A]\rangle\,\big)$
The evolution continues by applying the rules of Table 2.
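The normalization performed by rule (Ctxt) can be sketched in a few lines of code. This is a hedged illustration and not the paper's formalism: `next_proc` below simply counts a caller-supplied list of enabled redexes, standing in for the structural definition of nextProc, and `ctxt_probability` is a hypothetical helper name.

```python
# Illustrative sketch (not from the paper): rule (Ctxt) normalizes a redex's
# probability by the number of reductions enabled in the whole context E[P].
# Here next_proc just counts a caller-supplied list of enabled redexes.

def next_proc(enabled_redexes):
    """Number of reduction rules applicable to a process term."""
    return len(enabled_redexes)

def ctxt_probability(p, enabled_in_context):
    """Probability decorating the (Ctxt) reduction: the redex's own
    probability p, divided by the number of redexes enabled in E[P]."""
    n = next_proc(enabled_in_context)
    assert n > 0, "(Ctxt) applies only when E[P] can reduce"
    return p / n
```

For instance, in Example 2 rule (Call) is enabled for two distinct process calls, so under (Ctxt) each fires with probability 1/2, matching the reduction labelled 0.5 above.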
The evolution of the system can be presented as a sequence diagram (see Figure 1).

3. Global and Local Types Using Interval Probability

For describing typed systems with probabilistic behavior given by interval probabilities instead of precise probabilities, we use interval probability for both global and local types of our process calculus with nondeterministic external choices and probabilistic internal choices. We validate the probabilistic behaviors in session types by defining a decidable proof system in which the concept of interval probability is used to express uncertainty in the evaluation of complex concurrent processes. Interval probabilities provide a more general framework for describing uncertainty than classical probability; classical probability, as a special case, is fully compatible with the theory of interval probability [24]. We use probability intervals given by lower and upper bounds; such an interval describes the uncertainty in the behavior of the systems involving nondeterminism and probabilistic interactions. Further details on the probability intervals and operations over them are presented in [25].
The global types $G, G_i$ shown in Table 3 describe the global behavior of a probabilistic multiparty session process. We represent the probability interval $\iota$ as $[d_1, d_2]$, where $d_1, d_2 \in [0, 1]$ and $d_1 \le d_2$. If $\iota = [d, d]$, we use the shorthand notation $\iota = d$. The bounds of the intervals appearing in the global types are not randomly selected, but can be estimated statistically so that each interval initially contains the most plausible values according to our belief/understanding (similar to Bayesian interval estimation in statistics) [26]. These probability intervals can be changed (narrowed) a posteriori.
The type $r_1 \to r_2 : \{\iota_i : l_i(S_i).G_i\}_{i \in I}$ indicates that a participant with role $r_1$ uses label $l_i$ to send a message of type $S_i$ to a participant with role $r_2$, with a probability in the interval $\iota_i$; the subsequent interactions are described by $G_i$. Each $l_i$ is unique, as assumed when defining the processes. The type $\mu t.G$ is recursive; we consider only contractive recursion, namely $\mu t.t$ is not allowed. We take the equi-recursive viewpoint [27]; i.e., we identify $\mu t.G$ with $G\{\mu t.G / t\}$.
Example 3.
The global type of the phoneA session from Example 1 is:
$G_1 = \mu t.\;r_B \to r_A : \{\,\iota_1 : \mathit{talk}(\mathsf{string}).\;r_A \to r_B : \{\iota_2 : \mathit{yes}(\mathsf{string}).t,\ \iota_3 : \mathit{no}(\mathsf{string}).t,\ \iota_4 : \mathit{quit}(\mathsf{string}).\mathsf{end}\},\ \iota_5 : \mathit{quit}(\mathsf{string}).\mathsf{end}\,\}.$
If ι 5 = [ 0.95 , 1 ] , it means that Bob (the person responsible for conducting the poll) is more likely to quit than to finish the poll (as intended). However, if we want a minimal chance for Bob to stop the polling, this can be done by imposing ι 5 = [ 0 , 0.1 ] .
The global type of this example can be presented as a sequence diagram (see Figure 2).
In the global type $r_1 \to r_2 : \{\iota_i : l_i(S_i).G_i\}_{i \in I}$, one should not assign arbitrarily the bounds of the probability intervals $\iota_i = [d_1^i, d_2^i]$ for $i \in I$, but must instead adhere to certain constraints [25].
Definition 1.
Consider a set of probability intervals $\{\iota_i\}_{i \in I}$ of the form $\iota_i = [d_1^i, d_2^i]$.
The set $\{\iota_i\}_{i \in I}$ is called proper if
$$\sum_{i \in I} d_1^i \le 1 \le \sum_{i \in I} d_2^i,$$
while the set $\{\iota_i\}_{i \in I}$ is called reachable if for all $i \in I$ it holds that
$$\Big(\sum_{j \ne i} d_1^j\Big) + d_2^i \le 1 \le \Big(\sum_{j \ne i} d_2^j\Big) + d_1^i.$$
Example 4.
Consider the set of probability intervals $\{\iota_i\}_{i \in \{1,2\}}$, where $\iota_1 = \iota_2 = [0.4, 0.49]$. Note that $\{\iota_i\}_{i \in \{1,2\}}$ is not proper (since $0.49 + 0.49 = 0.98 < 1$). This means that if we choose any $p_1 \in \iota_1$, there is no $p_2 \in \iota_2$ such that $\sum_{i \in \{1,2\}} p_i = 1$. Thus, by requiring a set to be proper, we avoid a situation in which we cannot choose a probability from each probability interval such that the sum of the chosen probabilities is 1.
If we raise the upper bounds by just 0.1, then the set of probability intervals $\{\iota_i\}_{i \in \{1,2\}}$, where $\iota_1 = \iota_2 = [0.4, 0.59]$, is proper (since $0.4 + 0.4 = 0.8 \le 1 \le 0.59 + 0.59 = 1.18$). This is due to the fact that there exists at least one pair, e.g., $p_1 = 0.45 \in \iota_1$ and $p_2 = 0.55 \in \iota_2$, such that their sum is 1. However, since $0.4 + 0.59 = 0.99 < 1$, this set is not reachable, meaning that it is not the case that for any $p_1 \in \iota_1$ there exists a $p_2 \in \iota_2$ such that their sum is 1. For example, if $p_1 = 0.4$, the required $p_2 = 0.6$ is not in $\iota_2$.
Defining a set of probability intervals to be proper and reachable prevents scenarios in which we cannot pick a $p_i \in \iota_i$ for each $i \in I$ such that $\sum_{i \in I} p_i = 1$.
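Definition 1 can be checked mechanically. The sketch below is ours (the function names are illustrative; the paper only states the conditions mathematically) and tests properness and reachability of a list of interval bounds:

```python
# Hedged sketch of Definition 1 over a list of (d1, d2) interval bounds.

def is_proper(intervals):
    """Proper iff the lower bounds sum to at most 1 and the upper bounds
    sum to at least 1, so some distribution fits inside the intervals."""
    return (sum(d1 for d1, _ in intervals) <= 1
            <= sum(d2 for _, d2 in intervals))

def is_reachable(intervals):
    """Reachable iff for every i:
    (sum of the other lower bounds) + d2_i <= 1 <= (sum of the other
    upper bounds) + d1_i, i.e., every bound is attainable by some
    distribution drawn from the intervals."""
    lo = sum(d1 for d1, _ in intervals)
    hi = sum(d2 for _, d2 in intervals)
    return all((lo - d1) + d2 <= 1 <= (hi - d2) + d1
               for d1, d2 in intervals)
```

On the intervals of Example 4, `is_proper` rejects two copies of [0.4, 0.49] and accepts two copies of [0.4, 0.59], while `is_reachable` rejects the latter, exactly as discussed above.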
The local types T , T i shown in Table 4 describe the local behavior of processes, and also serve as a link between the global types and the processes of our calculus.
The selection type $r \oplus_{i \in I}\,\iota_i : {!}l_i\langle S_i\rangle.T_i$ represents a channel that can choose a label $l_i$ with a probability belonging to the probability interval $\iota_i$ (for any $i \in I$), send it to r along with a value of type $S_i$, and then proceed as $T_i$. The branching type $r\,\&_{i \in I}\,?l_i(S_i).T_i$ represents a channel that can receive a label $l_i$ from role r (for some $i \in I$, chosen by r), along with a value of type $S_i$, and then proceed as $T_i$. In both selection and branching types, the labels $l_i$ are distinct and their order does not matter. The recursive type $\mu t.T$ and the type variable t model infinite behaviors. The type $\mathsf{end}$ represents a terminated channel (and it is often omitted). The base types $S, S_0, S_1, \ldots$ can include types such as $\mathsf{bool}$, $\mathsf{int}$, and others.
We define the projection of a global type onto its corresponding local types.
Definition 2.
The projection $G \upharpoonright r$ for a participant with role r appearing in a global type G is inductively defined as:
  • $(r_1 \to r_2 : \{\iota_i : l_i(S_i).G_i\}_{i \in I}) \upharpoonright r = \begin{cases} r_2 \oplus_{i \in I}\,\iota_i : {!}l_i\langle S_i\rangle.(G_i \upharpoonright r) & \text{if } r = r_1\\ r_1\,\&_{i \in I}\,?l_i(S_i).(G_i \upharpoonright r) & \text{if } r = r_2\\ G_1 \upharpoonright r & \text{if } r_1 \ne r \ne r_2 \text{ and } \forall i, j \in I,\ G_i \upharpoonright r = G_j \upharpoonright r \end{cases}$
  • $(\mu t.G') \upharpoonright r = \mu t.(G' \upharpoonright r)$ if $G' \upharpoonright r$ is not a type variable, and $\mathsf{end}$ otherwise;
  • $t \upharpoonright r = t$;
  • $\mathsf{end} \upharpoonright r = \mathsf{end}$.
If none of the conditions are satisfied, the projection is undefined.
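Definition 2 can be read as a recursive algorithm over the syntax of global types. The following sketch uses a toy representation of our own (all constructor and field names are illustrative, not from the paper): selection keeps the intervals, branching drops them, and projection onto an uninvolved role requires all branches to agree.

```python
# Hedged sketch of the projection of Definition 2 over a toy AST of global
# types. Comm(r1, r2, branches) models r1 -> r2 : {iota_i : l_i(S_i).G_i};
# branches maps a label to (interval, sort, continuation); "end" is the
# terminated type. All names are illustrative.
from dataclasses import dataclass

@dataclass
class Comm:
    r1: str
    r2: str
    branches: dict  # label -> (interval, sort, continuation)

@dataclass
class Rec:          # mu t . G
    var: str
    body: object

@dataclass
class Var:          # type variable t
    name: str

def project(G, r):
    """Compute the projection of G onto role r; raises ValueError when
    the projection is undefined."""
    if G == "end":
        return "end"
    if isinstance(G, Var):
        return G
    if isinstance(G, Rec):
        body = project(G.body, r)
        # if r does not occur, the projected body is just the variable: end
        return "end" if body == Var(G.var) else Rec(G.var, body)
    if isinstance(G, Comm):
        if r == G.r1:   # sender side: selection type, intervals kept
            return ("select", G.r2,
                    {l: (iv, S, project(c, r))
                     for l, (iv, S, c) in G.branches.items()})
        if r == G.r2:   # receiver side: branching type, intervals dropped
            return ("branch", G.r1,
                    {l: (S, project(c, r))
                     for l, (iv, S, c) in G.branches.items()})
        projs = [project(c, r) for (_, _, c) in G.branches.values()]
        if any(p != projs[0] for p in projs):
            raise ValueError("projection undefined: branches disagree")
        return projs[0]
    raise ValueError("unknown global type")
```

Encoding the $G_1$ of Example 3 this way, the projection onto $r_A$ yields a recursive branching type, onto $r_B$ a recursive selection type, and onto any role not occurring in $G_1$ it is undefined.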
Example 5.
Consider the global type G 1 of Example 3:
$G_1 = \mu t.\;r_B \to r_A : \{\,\iota_1 : \mathit{talk}(\mathsf{string}).\;r_A \to r_B : \{\iota_2 : \mathit{yes}(\mathsf{string}).t,\ \iota_3 : \mathit{no}(\mathsf{string}).t,\ \iota_4 : \mathit{quit}(\mathsf{string}).\mathsf{end}\},\ \iota_5 : \mathit{quit}(\mathsf{string}).\mathsf{end}\,\}.$
The local types obtained from the global type G 1 by using the projection of Definition 2 for each of the two roles are as follows:
$T_A = G_1 \upharpoonright r_A = \mu t.\;r_B\,\&\,\{?\mathit{talk}(\mathsf{string}).T_A',\ ?\mathit{quit}(\mathsf{string}).\mathsf{end}\}$, where $T_A' = r_B \oplus \{\iota_2 : {!}\mathit{yes}\langle\mathsf{string}\rangle.t,\ \iota_3 : {!}\mathit{no}\langle\mathsf{string}\rangle.t,\ \iota_4 : {!}\mathit{quit}\langle\mathsf{string}\rangle.\mathsf{end}\}$;
$T_B = G_1 \upharpoonright r_B = \mu t.\;r_A \oplus \{\iota_1 : {!}\mathit{talk}\langle\mathsf{string}\rangle.T_B',\ \iota_5 : {!}\mathit{quit}\langle\mathsf{string}\rangle.\mathsf{end}\}$, where $T_B' = r_A\,\&\,\{?\mathit{yes}(\mathsf{string}).t,\ ?\mathit{no}(\mathsf{string}).t,\ ?\mathit{quit}(\mathsf{string}).\mathsf{end}\}$.
In order to state subject reduction (a property that ensures that the type of a process is preserved during its evaluation), we need to formalize how global types are modified when multiparty sessions evolve.
Definition 3
(Global type consumption). If G is a global type, then $G \xrightarrow{r_1 \to r_2 : \{\iota : l\}} G'$, denoting that G is reduced by the consumption of a communication between roles $r_1$ and $r_2$, is defined inductively as follows:
  • $r_1 \to r_2 : \{\iota_i : l_i(S_i).G_i\}_{i \in I} \xrightarrow{r_1 \to r_2 : \{\iota : l\}} G_k$ if $\exists k \in I$ s.t. $l = l_k$ and $\iota = \iota_k$;
  • $r_3 \to r_4 : \{\iota_i : l_i(S_i).G_i\}_{i \in I} \xrightarrow{r_1 \to r_2 : \{\iota : l\}} r_3 \to r_4 : \{\iota_i : l_i(S_i).G_i'\}_{i \in I}$ if $\{r_1, r_2\} \cap \{r_3, r_4\} = \emptyset$ and $\forall i.\ G_i \xrightarrow{r_1 \to r_2 : \{\iota : l\}} G_i'$.
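Definition 3 can likewise be sketched as a recursive function. Below, a global type is a nested tuple ("comm", r1, r2, branches) with branches mapping labels to (interval, sort, continuation); the representation and names are ours, not the paper's.

```python
# Hedged sketch of global type consumption (Definition 3). A global type is
# ("comm", r1, r2, branches) or "end"; branches maps a label to
# (interval, sort, continuation). Representation and names are illustrative.

def consume(G, r1, r2, label):
    """Reduce G by consuming the communication r1 -> r2 : {iota : label}."""
    if not (isinstance(G, tuple) and G[0] == "comm"):
        raise ValueError("consumption undefined")
    _, g1, g2, branches = G
    if (g1, g2) == (r1, r2):            # first clause: take branch `label`
        if label not in branches:
            raise ValueError("label not offered")
        return branches[label][2]
    if {r1, r2} & {g1, g2}:             # overlapping roles: undefined
        raise ValueError("consumption undefined")
    # second clause: disjoint roles, consume inside every continuation
    return ("comm", g1, g2,
            {l: (iv, S, consume(c, r1, r2, label))
             for l, (iv, S, c) in branches.items()})
```

On a top-level communication the function returns the chosen continuation; when the consumed communication sits under an independent one, it is consumed uniformly in all branches, mirroring the second clause.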
Example 6.
Consider the global type G 1 of Example 3:
$G_1 = \mu t.\;r_B \to r_A : \{\,\iota_1 : \mathit{talk}(\mathsf{string}).\;r_A \to r_B : \{\iota_2 : \mathit{yes}(\mathsf{string}).t,\ \iota_3 : \mathit{no}(\mathsf{string}).t,\ \iota_4 : \mathit{quit}(\mathsf{string}).\mathsf{end}\},\ \iota_5 : \mathit{quit}(\mathsf{string}).\mathsf{end}\,\}.$
Since we take the equi-recursive viewpoint [27], i.e., we identify μ t . G with G { μ t . G / t } , it follows that G 1 can be written as:
$G_1 = r_B \to r_A : \{\,\iota_1 : \mathit{talk}(\mathsf{string}).\;r_A \to r_B : \{\iota_2 : \mathit{yes}(\mathsf{string}).G_1,\ \iota_3 : \mathit{no}(\mathsf{string}).G_1,\ \iota_4 : \mathit{quit}(\mathsf{string}).\mathsf{end}\},\ \iota_5 : \mathit{quit}(\mathsf{string}).\mathsf{end}\,\}.$
Then, by Definition 3, type G 1 can be reduced as follows:
$G_1 \xrightarrow{r_B \to r_A : \{\iota_1 : \mathit{talk}\}} G_1'$, where $G_1' = r_A \to r_B : \{\iota_2 : \mathit{yes}(\mathsf{string}).G_1,\ \iota_3 : \mathit{no}(\mathsf{string}).G_1,\ \iota_4 : \mathit{quit}(\mathsf{string}).\mathsf{end}\}$.
Following the methodology presented in [1], we design a distributed system in terms of global types, which is then implemented as a set of processes that adhere to the local types derived as projections of the global types.
Given a global type G, in what follows pid ( G ) denotes the set of participants in G.
Definition 4.
A global type G is said to be well-formed if:
  • for all $r \in \mathit{pid}(G)$, the projection $G \upharpoonright r$ is defined;
  • in all its subterms of the form $r_1 \to r_2 : \{\iota_i : l_i(S_i).G_i\}_{i \in I}$, the set of intervals $\{\iota_i\}_{i \in I}$ is proper and reachable.
Example 7.
Consider the global type G 1 of Example 3:
$G_1 = \mu t.\;r_B \to r_A : \{\,\iota_1 : \mathit{talk}(\mathsf{string}).\;r_A \to r_B : \{\iota_2 : \mathit{yes}(\mathsf{string}).t,\ \iota_3 : \mathit{no}(\mathsf{string}).t,\ \iota_4 : \mathit{quit}(\mathsf{string}).\mathsf{end}\},\ \iota_5 : \mathit{quit}(\mathsf{string}).\mathsf{end}\,\},$
in which $\iota_i = [0, 1]$ for all $i \in \{1, \ldots, 4\}$, and $\iota_5 = [0.95, 1]$. As seen in Example 5, $G_1 \upharpoonright r$ is defined for all roles. Moreover, since $0 + 0 + 0 = 0 \le 1 \le 1 + 1 + 1 = 3$, the set $\{\iota_i\}_{i \in \{2,3,4\}}$ is proper (there exists at least one probability in each probability interval such that their sum is 1: e.g., $p_2 = 0 \in \iota_2$, $p_3 = 0 \in \iota_3$ and $p_4 = 1 \in \iota_4$), and since $0 + 0 + 1 = 1 \le 1 \le 1 + 1 + 0 = 2$, the set $\{\iota_i\}_{i \in \{2,3,4\}}$ is reachable (for any $i \in \{2,3,4\}$ and any $p_i \in \iota_i$, there exist $p_j \in \iota_j$ and $p_k \in \iota_k$ for the remaining indices $j, k$ such that $p_i + p_j + p_k = 1$).
Also, since $0 + 0.95 = 0.95 \le 1 \le 1 + 1 = 2$, the set $\{\iota_i\}_{i \in \{1,5\}}$ is proper (there exists at least one probability in each probability interval such that their sum is 1: e.g., $p_1 = 0.05 \in \iota_1$ and $p_5 = 0.95 \in \iota_5$), while, since $0.95 + 1 = 1.95 > 1$, the set $\{\iota_i\}_{i \in \{1,5\}}$ is not reachable (for $p_1 = 0.5 \in \iota_1$, there does not exist $p_5 = 0.5 \in \iota_5$ such that $p_1 + p_5 = 1$).
Thus, having a proper and reachable set of probabilities is often a desirable requirement; it can serve as a sanity check for a specification.

4. Interval Probability for Multiparty Session Types

4.1. Typing System

We present how the notions defined in Section 2 and Section 3 (namely, the probabilistic processes and the global/local types using probability intervals) are connected. To achieve this, we introduce sortings Γ and typings Δ , specifically
$\Gamma ::= \emptyset \mid \Gamma, x : S \mid \Gamma, X : \tilde{S}\tilde{T}$ and $\Delta ::= \emptyset \mid \Delta, s[r] : T$.
The typing system uses a map from shared names to their sorts ($S, S', \ldots$). A sorting ($\Gamma, \Gamma', \ldots$) is a finite map that associates names with sorts, and process variables with sequences of sorts and types. A typing ($\Delta, \Delta', \ldots$) tracks the usage of session channels. The disjoint union $\Delta, \Delta'$ of two typings $\Delta$ and $\Delta'$ is defined only if the sets of channels contained in the domains of these two typings are disjoint. In what follows, we denote by $\mathcal{S}$ and $\mathcal{T}$ the set of all sortings and the set of all typings, respectively.
The dynamics of the types of the interacting processes is defined as in [1] by a labelled type reduction relation $\xrightarrow{\iota}$ on typings $\Delta$ that respects the following rules:
  • $s[r_1] : r_2 \oplus_{i \in I}\,\iota_i : {!}l_i\langle S_i\rangle.T_i,\ s[r_2] : r_1\,\&_{i \in I}\,?l_i(S_i).T_i' \xrightarrow{\iota_k} s[r_1] : T_k,\ s[r_2] : T_k'$ for $k \in I$;
  • $s[r] : \mu t.T \xrightarrow{1} s[r] : T\{\mu t.T / t\}$;
  • $\Delta, \Delta'' \xrightarrow{\iota} \Delta', \Delta''$ whenever $\Delta \xrightarrow{\iota} \Delta'$.
The first rule indicates two participants with roles r 1 and r 2 that use a label l k to communicate on the session channel s a value of type S k . Notice that the reduction arrow is decorated with the probability interval of the sending type; this would be useful when proving type preservation under evolution. The second rule describes the unfolding of local types with probability interval [ 1 , 1 ] , while the third rule refers to the composition of two typings when only one of them is evolving.
We define when a typing Δ is safe, as in [28]; this property is independent of the probability intervals present in Δ .
Definition 5.
φ is a safety property of typing contexts if and only if:
  • φ ( Δ , s [ r 1 ] : r 2 ⊕ i ∈ I ι i : ! l i ⟨ S i ⟩ . T i , s [ r 2 ] : r 1 & j ∈ J ? l j ( S j ) . T j ′ ) implies I ⊆ J ;
  • φ ( Δ , s [ p ] : μ t . T ) implies φ ( Δ , s [ p ] : T { μ t . T / t } ) ;
  • φ ( Δ ) and Δ → ι Δ ′ implies φ ( Δ ′ ) .
We say Δ is safe if φ ( Δ ) holds for some safety property φ. The fact that Δ is safe is denoted by safe ( Δ ) .
If the safety property φ is not explicitly instantiated, then, as in [28], we consider φ = safe (i.e., the largest safety property, cf. Definition 5).
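The first clause of Definition 5 is the one that can fail on a given pair of endpoints; the other two clauses close it under unfolding and reduction. Under a toy encoding of a typing as a dictionary from channels s [ r ] to local types, with selection types ("select", partner, {label: ...}) and branching types ("branch", partner, {label: ...}) (our own hypothetical representation), checking the first clause amounts to a label-subset test:

```python
def first_clause_holds(typing):
    """First clause of Definition 5: whenever s[r1] selects towards r2
    and s[r2] branches towards r1, the selected labels must be among
    the offered ones (I is a subset of J). Only the labels matter here,
    so the branch payloads are irrelevant to the check."""
    for (s, r1), t in typing.items():
        if t[0] != "select":
            continue
        r2 = t[1]
        partner = typing.get((s, r2))
        if partner is not None and partner[0] == "branch" and partner[1] == r1:
            if not set(t[2]) <= set(partner[2]):
                return False
    return True
```

For instance, a selection over {yes, no} against a branching over {yes, no, unsure, quit} passes the test, while a selection offering a label the branch does not understand fails it.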
Example 8.
Consider the local types T A and T B from Example 5 and the typing:
Δ = phoneA [ r A ] : T A , phoneA [ r B ] : T B , imA [ r A ] : T A , imA [ r B ] : T B
We can verify that safe ( Δ ) holds by checking its reductions.
Note that the safety of a disjoint union of typings is inherited by each of its component typings.
Proposition 1.
If safe ( Δ , Δ ′ ) , then safe ( Δ ) .
Proof. 
By contradiction, assume that safe ( Δ ) does not hold. Then, by Definition 5 (last clause), there is Δ ″ such that Δ → ι Δ ″ and Δ ″ violates the first clause of Definition 5 (possibly after applying the second clause if some unfolding is needed). However, by the last clause of the definition of the labelled type reduction relation → ι , it holds that Δ , Δ ′ → ι Δ ″ , Δ ′ . The disjoint union Δ ″ , Δ ′ then violates the first clause of Definition 5, and therefore safe ( Δ , Δ ′ ) does not hold, which is a contradiction. Thus, we conclude that safe ( Δ ) holds. □
Example 9.
Consider from Example 8 the typing:
Δ = phoneA [ r A ] : T A , phoneA [ r B ] : T B , imA [ r A ] : T A , imA [ r B ] : T B .
Then Δ can be written as a disjoint union of two typings, namely Δ = Δ 1 , Δ 2 , where Δ 1 = phoneA [ r A ] : T A , phoneA [ r B ] : T B and Δ 2 = imA [ r A ] : T A , imA [ r B ] : T B . From Example 8, safe ( Δ ) holds; according to Proposition 1, safe ( Δ 1 ) and safe ( Δ 2 ) also hold.
In order to establish a connection between the global types G and the typings Δ , inspired by [29] we define the full projection of a global type as a typing. This full projection is useful in defining the typing system and in proving subject reduction.
Definition 6
(Full projection). If G is the global type of a session s such that pid ( G ) = I and safe ( { s [ r i ] : T i } i ∈ I ) , where for all i ∈ I it holds that G ↾ r i = T i , then the full projection of G on s is given by G ↾ s = { s [ r i ] : T i } i ∈ I .
Example 10.
Consider the global type G 1 of Example 3, the local types T A and T B obtained from G 1 in Example 5 and the safe typing Δ 1 = phoneA [ r A ] : T A , phoneA [ r B ] : T B obtained in Example 9. Then, by Definition 6, the full projection of G 1 on phoneA is G 1 ↾ phoneA = Δ 1 .
The connection between the evolution of global types and that of the corresponding typings can be expressed as in the following result.
Proposition 2.
If Δ → ι Δ ′ and safe ( Δ ) , then there exist s , r 1 , r 2 , l , G , G ′ such that Δ = Δ ″ , G ↾ s and Δ ′ = Δ ″ , G ′ ↾ s , where G reduces to G ′ by the interaction r 1 → r 2 : { ι : l } .
Proof. 
Since Δ → ι Δ ′ and safe ( Δ ) , the associated redex is in Δ s = { s [ r i ] : T i } i ∈ I , where Δ = Δ ″ , Δ s . Due to Proposition 1, it holds that safe ( Δ s ) , and thus we can write Δ s as G ↾ s for some well-formed global type G. By taking the preimage of the associated redex in G ↾ s , reducing G to G ′ by the interaction r 1 → r 2 : { ι : l } corresponds to eliminating this preimage from G; the projection G ′ ↾ s precisely gives the result of reducing G ↾ s = Δ s to G ′ ↾ s = Δ s ′ , and thus Δ ′ = Δ ″ , G ′ ↾ s . □
Example 11.
Consider the global type G 1 of Example 3, the local types T A and T B obtained from G 1 in Example 5 and the safe typing Δ 1 = phoneA [ r A ] : T A , phoneA [ r B ] : T B obtained in Example 9 such that G 1 ↾ phoneA = Δ 1 , as obtained in Example 10.
By using the labelled type reduction relation → ι on the typing Δ 1 , we have that Δ 1 → ι 1 Δ 1 ′ , where Δ 1 ′ = phoneA [ r A ] : T A ′ , phoneA [ r B ] : T B ′ . Since safe ( Δ 1 ) holds from Example 9, then by Definition 5 safe ( Δ 1 ′ ) also holds. From Example 6, it holds that G 1 reduces to G 1 ′ by the interaction r B → r A : { ι 1 : talk } . By Definition 2, the local types T A ′ = G 1 ′ ↾ r A and T B ′ = G 1 ′ ↾ r B are obtained from G 1 ′ . Thus, by Definition 6, the full projection of G 1 ′ on phoneA is G 1 ′ ↾ phoneA = Δ 1 ′ , and so the result of Proposition 2 holds.
The type-assignment system for processes is given in Table 5. We use the judgement Γ ⊢ P ▷ Δ , saying that “under the sorting Γ , process P has typing Δ ”. Actually, we have two kinds of rules: some rules using only Γ , and others using both Γ and Δ . We use the rules (TVar) and (TVal) for typing variables and values (using only the sorting Γ ). (TSelect) and (TBranch) are the rules for typing selection and branching, respectively. Just like in [30], the (TSelect) rule states that the selection process is well-typed if c [ r ] has a compatible selection (branching) type and the continuations P i (for all i ∈ I ) are well-typed with respect to the session types. Since rule (TSelect) types the selection process by using probability intervals, it should verify that the probabilities fall within the corresponding probability intervals. Instead, the (TBranch) rule states that the branching process is well-typed if each c : r from the branching type has a compatible branching process and the continuations P i (for all i ∈ I ) are well-typed with respect to the session types; thus, it allows more branches in the process than in the global type, mimicking the subtyping for session types [31]. Rule (TEnd) is standard; “ Δ end only ” means that Δ contains only end as session types. Rule (TConc) composes two processes if their typings are disjoint. (TRes) is the restriction rule for session names, claiming that ( ν s ) P is well-typed in Γ ; this happens if the full projection G ↾ s of the global type G associated with the session s appears in the typing associated with process P. (TDef) says that a process definition def X ( x ˜ c 1 … c n ) = P in Q is well-typed if both P and Q are well-typed in their typing contexts. Rule (TCall) says that a process call X ( v ˜ c 1 … c n ) is well-typed if the actual parameters v ˜ c 1 … c n have compatible types with respect to X.
Example 12.
Consider from Example 2 the process
AB = def B ( y , q ) = B 1 in def A ( y ) = A 1 in A 1 { phoneA [ r A ] / y } ∣ B 1 { phoneA [ r B ] / y , q 1 / q } , and the safe typing Δ 1 = phoneA [ r A ] : T A , phoneA [ r B ] : T B obtained in Example 9.
By using the rules of Table 5, we can type the process AB with the typing Δ 1 and the empty sorting ∅ , namely ∅ ⊢ AB ▷ Δ 1 .

4.2. Behavioral Properties of Typed Processes

We can prove that our probabilistic typing system is sound, meaning that its term-checking rules admit only consistent terms with respect to the structural congruence and operational semantics.
A subject reduction property ensures that the type of an expression is preserved during its evaluation. For proving this property, we need the following substitution, weakening and strengthening results.
Lemma 1.
  • (substitution)  If Γ , x : S ⊢ P ▷ Δ and Γ ⊢ v : S , then Γ ⊢ P { v / x } ▷ Δ .
  • (type weakening) If Γ ⊢ P ▷ Δ and Δ ′ is end only, then Γ ⊢ P ▷ Δ , Δ ′ .
  • (type strengthening) If s [ r ] ∉ fc ( P ) and Γ ⊢ P ▷ Δ , { s [ r ] : T } , then Γ ⊢ P ▷ Δ .
  • (sort weakening) If X ∉ d o m ( Γ ) and Γ ⊢ P ▷ Δ , then Γ , X : S ˜ T ˜ ⊢ P ▷ Δ .
  • (sort strengthening) If X ∉ f p v ( P ) and Γ , X : S ˜ T ˜ ⊢ P ▷ Δ , then Γ ⊢ P ▷ Δ .
Proof. 
The proofs are rather standard, by induction on the derivations of the premises, following those presented in [1]. □
The next result states that if, under the sorting Γ , a process P has typing Δ , and P is rearranged into a process P ′ by using the structural congruence ≡, then the process P ′ is typed using the same sorting Γ and the same typing Δ .
Theorem 1
(Subject equivalence).  Γ ⊢ P ▷ Δ and P ≡ P ′ imply Γ ⊢ P ′ ▷ Δ .
Proof. 
The proof is by induction on ≡, showing that if one side has a typing, then the other side has the same typing. The probability intervals in the involved types remain unchanged because the structural congruence ≡ does not consume or modify the selection processes.
  • Case P 0 P .
    ⇒ Assume Γ ⊢ P ∣ 0 ▷ Δ . By inverting the rule (TConc), we obtain Γ ⊢ P ▷ Δ 1 and Γ ⊢ 0 ▷ Δ 2 , where Δ 1 , Δ 2 = Δ . By inverting the rule (TEnd), Δ 2 is end only and dom ( Δ 1 ) ∩ dom ( Δ 2 ) = ∅ . Then, by type weakening, we obtain that Γ ⊢ P ▷ Δ , where Δ = Δ 1 , Δ 2 .
    ⇐ Assume Γ ⊢ P ▷ Δ . By rule (TEnd), it holds that Γ ⊢ 0 ▷ Δ ′ , where Δ ′ is end only and dom ( Δ ) ∩ dom ( Δ ′ ) = ∅ . By applying the rule (TConc), we obtain Γ ⊢ P ∣ 0 ▷ Δ , Δ ′ and, since Δ ′ = ∅ , by type strengthening we obtain Γ ⊢ P ∣ 0 ▷ Δ , as required.
  • Case ( ν s ) 0 0 .
    ⇒ Assume Γ ⊢ ( ν s ) 0 ▷ Δ . By inverting the rule (TRes), we obtain Γ ⊢ 0 ▷ Δ , G ↾ s , where s ∉ Δ . By inverting the rule (TEnd), Δ , G ↾ s is end only, and hence Δ is end only. Using the rule (TEnd), we obtain Γ ⊢ 0 ▷ Δ .
    ⇐ Assume Γ ⊢ 0 ▷ Δ . By rule (TRes), it holds that Γ ⊢ ( ν s ) 0 ▷ Δ 1 , where s ∉ Δ 1 , G ↾ s = { s [ r i ] : end } i ∈ I and Δ = Δ 1 , G ↾ s . Then, by type weakening, we obtain that Γ ⊢ ( ν s ) 0 ▷ Δ , as required.
  • Case P Q Q P
    ⇒ Assume Γ ⊢ P ∣ Q ▷ Δ . By inverting the rule (TConc), we obtain Γ ⊢ P ▷ Δ 1 and Γ ⊢ Q ▷ Δ 2 , where Δ 1 , Δ 2 = Δ . Using the rule (TConc), we obtain Γ ⊢ Q ∣ P ▷ Δ .
    ⇐ It follows in a similar way (actually, symmetric to ⇒).
  • Case ( P Q ) R P ( Q R )
    ⇒ Assume Γ ⊢ ( P ∣ Q ) ∣ R ▷ Δ . By inverting the rule (TConc), we obtain Γ ⊢ P ▷ Δ 1 , Γ ⊢ Q ▷ Δ 2 and Γ ⊢ R ▷ Δ 3 , where Δ 1 , Δ 2 , Δ 3 = Δ . Using the rule (TConc), we obtain Γ ⊢ P ∣ ( Q ∣ R ) ▷ Δ .
    ⇐ It follows in a similar way (actually, symmetric to ⇒).
  • Case ( ν s ) ( ν s ′ ) P ≡ ( ν s ′ ) ( ν s ) P
    ⇒ Assume Γ ⊢ ( ν s ) ( ν s ′ ) P ▷ Δ . By inverting the rule (TRes) twice, we obtain Γ ⊢ P ▷ Δ , G ↾ s , G ′ ↾ s ′ , where s ∉ Δ and s ′ ∉ Δ . Using the rule (TRes) twice, we obtain Γ ⊢ ( ν s ′ ) ( ν s ) P ▷ Δ .
    ⇐ Proof is similar (actually, symmetric to ⇒).
  • Case ( ν s ) P ∣ Q ≡ ( ν s ) ( P ∣ Q ) ( if s ∉ fc ( Q ) )
    ⇒ Assume Γ ⊢ ( ν s ) P ∣ Q ▷ Δ . Since s ∉ fc ( ( ν s ) P ) and s ∉ fc ( Q ) , if s ∈ Δ (by using type weakening), then Δ = Δ ′ , Δ ″ such that s ∉ Δ ′ , s ∈ Δ ″ and Δ ″ is end only. By type strengthening, we obtain Γ ⊢ ( ν s ) P ∣ Q ▷ Δ ′ . By inverting the rule (TConc), we obtain Γ ⊢ ( ν s ) P ▷ Δ 1 and Γ ⊢ Q ▷ Δ 2 , where Δ 1 , Δ 2 = Δ ′ . By inverting the rule (TRes), we obtain Γ ⊢ P ▷ Δ 1 , G ↾ s , where s ∉ Δ 1 . Using the rule (TConc), we obtain Γ ⊢ P ∣ Q ▷ Δ 1 , G ↾ s , Δ 2 . Since s ∉ fc ( Q ) , the typing Δ 2 does not contain types for s; from Γ ⊢ P ∣ Q ▷ Δ 1 , Δ 2 , G ↾ s with s ∉ Δ 1 , Δ 2 , by using the rule (TRes) we obtain Γ ⊢ ( ν s ) ( P ∣ Q ) ▷ Δ ′ . Since Δ ″ is end only, by using type weakening we obtain Γ ⊢ ( ν s ) ( P ∣ Q ) ▷ Δ , as required.
    ⇐ Assume Γ ⊢ ( ν s ) ( P ∣ Q ) ▷ Δ . By inverting rule (TRes), we obtain Γ ⊢ P ∣ Q ▷ Δ , G ↾ s , where s ∉ Δ . Since s ∉ fc ( Q ) , by inverting the rule (TConc), we obtain Γ ⊢ P ▷ Δ 1 , G ↾ s and Γ ⊢ Q ▷ Δ 2 , where Δ 1 , Δ 2 = Δ . Using rule (TRes), we obtain Γ ⊢ ( ν s ) P ▷ Δ 1 . By using rule (TConc), we obtain Γ ⊢ ( ν s ) P ∣ Q ▷ Δ , as required.
  • Case def D in 0 0
    ⇒ Assume Γ def D in 0 Δ . By inverting rule (TDef), we obtain Γ , x : S ˜ , X : S ˜ T 1 T n P c 1 : T 1 , , c n : T n and Γ , X : S ˜ T 1 T n 0 Δ , where D = { X ( x ˜ c 1 c n ) = P } . By inverting rule (TEnd) we obtain that Δ is only end. Applying rule (TEnd), we obtain Γ 0 Δ .
    ⇐ Assume Γ 0 Δ . By inverting rule (TEnd), we obtain that Δ is only end. Applying rule (TEnd), we obtain Γ , X : S ˜ T 1 T n 0 Δ . Consider a typable process P such that Γ , x : S ˜ , X : S ˜ T 1 T n P c 1 : T 1 , , c n : T n . By applying rule (TDef) we obtain that Γ def X ( x ˜ c 1 c n ) = P in 0 Δ , namely Γ def D in 0 Δ , as required.
  • Case def D in ( ν s ) P ( ν s ) def D in P ( if s fc ( D ) )
    ⇒ Assume Γ def D in ( ν s ) P Δ . By inverting rule (TDef), we obtain Γ , x : S ˜ , X : S ˜ T 1 T n Q c 1 : T 1 , , c n : T n and Γ , X : S ˜ T 1 T n ( ν s ) P Δ , where D = { X ( x ˜ c 1 c n ) = Q } . By inverting the rule (TRes), we obtain Γ P Δ , G s where s Δ . Using rule (TDef), we obtain Γ def D in P Δ , G s , while using rule (TRes), we obtain Γ ( ν s ) def D in P Δ .
    ⇐ Assume Γ ( ν s ) def D in P Δ . By inverting rule (TRes), we obtain Γ def D in P Δ , G s where s Δ . By inverting rule (TDef), we obtain Γ , x : S ˜ , X : S ˜ T 1 T n Q c 1 : T 1 , , c n : T n and Γ , X : S ˜ T 1 T n P Δ , G s , where D = { X ( x ˜ c 1 c n ) = Q } . Using rule (TRes), we obtain Γ , X : S ˜ T 1 T n ( ν s ) P Δ . By applying rule (TDef) we obtain that Γ def X ( x ˜ c 1 c n ) = Q in ( ν s ) P Δ , namely Γ def D in ( ν s ) P Δ , as required.
  • Case def D in ( P | Q ) ( def D in P ) | Q ( if d p v ( D ) f p v ( Q ) = )
    ⇒ Assume Γ def D in ( P | Q ) Δ . By inverting the rule (TDef), we obtain Γ , x : S ˜ , X : S ˜ T 1 T n R c 1 : T 1 , , c n : T n and Γ , X : S ˜ T 1 T n P Q Δ , where D = { X ( x ˜ c 1 c n ) = R } . By inverting the rule (TConc), we obtain Γ , X : S ˜ T 1 T n P Δ 1 and Γ , X : S ˜ T 1 T n Q Δ 2 , where Δ 1 , Δ 2 = Δ . By applying the rule (TDef), we obtain that Γ def X ( x ˜ c 1 c n ) = R in P Δ 1 , namely Γ def D in P Δ 1 . Since d p v ( D ) f p v ( Q ) = and X d p v ( D ) , then X f p v ( Q ) and so by sort strengthening we obtain Γ Q Δ 2 . By applying the rule (TConc), we obtain that Γ ( def D in P ) Q Δ .
    ⇐ Assume Γ ( def D in P ) Q Δ . By inverting rule (TConc), we obtain Γ def D in P Δ 1 and Γ Q Δ 2 , where Δ 1 , Δ 2 = Δ . By inverting rule (TDef), we obtain Γ , x : S ˜ , X : S ˜ T 1 T n R c 1 : T 1 , , c n : T n and Γ , X : S ˜ T 1 T n P Δ 1 , where D = { X ( x ˜ c 1 c n ) = R } and X d o m ( Γ ) . Since X d o m ( Γ ) , by sort weakening we obtain Γ , X : S ˜ T 1 T n Q Δ 2 . By applying rule (TConc), we obtain that Γ , X : S ˜ T 1 T n P Q Δ . By using rule (TDef), we obtain Γ def X ( x ˜ c 1 c n ) = R in ( P | Q ) Δ , namely Γ def D in ( P | Q ) Δ , as required.
  • Case def D in def D in P def D in def D in P ( if ( d p v ( D ) f p v ( D ) ) d p v ( D ) = ( d p v ( D ) f p v ( D ) ) d p v ( D ) = )
    ⇒ Assume Γ def D in def D in P Δ . By inverting rule (TDef), we obtain Γ , x : S ˜ , X : S ˜ T 1 T n R c 1 : T 1 , , c n : T n and Γ , X : S ˜ T 1 T n def D in P Δ , where D = { X ( x ˜ c 1 c n ) = R } . By inverting rule (TDef), we obtain Γ , X : S ˜ T 1 T n , x : S ˜ , X : S ˜ T 1 T n R c 1 : T 1 , , c n : T n and Γ , X : S ˜ T 1 T n , X : S ˜ T 1 T n P Δ , where D = { X ( x ˜ c 1 c n ) = R } . Since ( d p v ( D ) f p v ( D ) ) d p v ( D ) = ( d p v ( D ) f p v ( D ) ) d p v ( D ) = , it means that X and X are different. Applying rule (TDef) twice, we obtain def D in def D in P .
    ⇐ It follows in a similar way (actually, symmetric to ⇒).
  • P 0 ≡ P 0 ′ implies E [ P 0 ] ≡ E [ P 0 ′ ] . Let P ≡ E [ P 0 ] and P ′ ≡ E [ P 0 ′ ] .
    The proof is by structural induction on E .
    Case E = [ ] . This case is trivial, as P ≡ P ′ is in fact P 0 ≡ P 0 ′ .
    Case E = ( ν s ) E ′ . By assumption, Γ ⊢ ( ν s ) E ′ [ P 0 ] ▷ Δ . By inverting the rule (TRes), we obtain that Γ ⊢ E ′ [ P 0 ] ▷ Δ , G ↾ s . By structural congruence, P 0 ≡ P 0 ′ implies E ′ [ P 0 ] ≡ E ′ [ P 0 ′ ] . By induction, Γ ⊢ E ′ [ P 0 ′ ] ▷ Δ , G ↾ s . By using the rule (TRes), we obtain Γ ⊢ ( ν s ) E ′ [ P 0 ′ ] ▷ Δ , as required.
    Case E = def D in E ′ . By assumption, Γ ⊢ def D in E ′ [ P 0 ] ▷ Δ . By inverting rule (TDef), we obtain Γ , x ˜ : S ˜ , X : S ˜ T 1 … T n ⊢ Q ▷ c 1 : T 1 , … , c n : T n and Γ , X : S ˜ T 1 … T n ⊢ E ′ [ P 0 ] ▷ Δ , where D = { X ( x ˜ c 1 … c n ) = Q } . By structural congruence, P 0 ≡ P 0 ′ implies E ′ [ P 0 ] ≡ E ′ [ P 0 ′ ] . By induction, Γ , X : S ˜ T 1 … T n ⊢ E ′ [ P 0 ′ ] ▷ Δ . By using the rule (TDef), we obtain Γ ⊢ def D in E ′ [ P 0 ′ ] ▷ Δ .
    Case E = E ′ ∣ P ″ . By assumption, Γ ⊢ E ′ [ P 0 ] ∣ P ″ ▷ Δ . By inverting rule (TConc), we obtain Γ ⊢ E ′ [ P 0 ] ▷ Δ 1 and Γ ⊢ P ″ ▷ Δ 2 , where Δ 1 , Δ 2 = Δ . By structural congruence, P 0 ≡ P 0 ′ implies E ′ [ P 0 ] ≡ E ′ [ P 0 ′ ] . By induction, Γ ⊢ E ′ [ P 0 ′ ] ▷ Δ 1 . By using the rule (TConc), we obtain Γ ⊢ E ′ [ P 0 ′ ] ∣ P ″ ▷ Δ , as required. □
Example 13.
Consider from Example 12 the process
AB = def B ( y , q ) = B 1 in def A ( y ) = A 1 in A 1 { phoneA [ r A ] / y } ∣ B 1 { phoneA [ r B ] / y , q 1 / q } , where ∅ ⊢ AB ▷ Δ 1 . If there exists a process BA such that AB ≡ BA , then (by Theorem 1) it also holds that ∅ ⊢ BA ▷ Δ 1 .
The next result states that if, under the sorting Γ , a process P has the safe typing Δ , and P reduces to a process P ′ , then the process P ′ is typed using the same sorting Γ and a typing Δ ′ obtained from the typing Δ .
Theorem 2
(Subject reduction).  Γ ⊢ P ▷ Δ , safe ( Δ ) and P → p P ′ imply Γ ⊢ P ′ ▷ Δ ′ , where Δ ′ = Δ or Δ → ι Δ ′ with p · nextProc ( P ) ∈ ι .
Proof. 
The proof is by induction on the derivation of P → p P ′ . There is a case for each operational semantics rule, and for each such rule we consider the typing system rule generating Γ ⊢ P ▷ Δ . Since the reduction relations → p and → ι track the consumed probabilities in processes and the probability intervals in types, this result provides a relation between the consumed probabilities and the probability intervals at each reduction step.
  • Case (Com): s [ r 1 ] [ r 2 ] ⊕ j ∈ J p j : l j ⟨ v j ⟩ . P j ∣ s [ r 2 ] [ r 1 ] & i ∈ I { l i ( x i ) . P i ′ } → p k P k ∣ P k ′ { v ˜ k / x ˜ k } .
    By assumption, Γ ⊢ s [ r 1 ] [ r 2 ] ⊕ j ∈ J p j : l j ⟨ v j ⟩ ; P j ∣ s [ r 2 ] [ r 1 ] & i ∈ I { l i ( x i ) ; P i ′ } ▷ Δ . By inverting the rule (TConc), we obtain Γ ⊢ s [ r 1 ] [ r 2 ] ⊕ j ∈ J p j : l j ⟨ v j ⟩ ; P j ▷ Δ 1 and Γ ⊢ s [ r 2 ] [ r 1 ] & i ∈ I { l i ( x i ) ; P i ′ } ▷ Δ 2 with Δ = Δ 1 , Δ 2 . Since these can be inferred only from (TSelect) and (TBranch), we know that Δ 1 = Δ 1 ′ , s [ r 1 ] : r 2 ⊕ j ∈ J ι j : ! l j ⟨ S j ⟩ . T j and Δ 2 = Δ 2 ′ , s [ r 2 ] : r 1 & i ∈ I ? l i ( S i ) . T i ′ . By inverting the rules (TSelect) and (TBranch), we obtain that ∀ j . Γ ⊢ v j : S j , ∀ j . Γ ⊢ P j ▷ Δ 1 ′ , s [ r 1 ] : T j , p j ∈ ι j and ∀ i . Γ , x i : S i ⊢ P i ′ ▷ Δ 2 ′ , s [ r 2 ] : T i ′ . Since k ∈ J ⊆ I , from Γ ⊢ v k : S k and Γ , x k : S k ⊢ P k ′ ▷ Δ 2 ′ , s [ r 2 ] : T k ′ , by applying the substitution part of Lemma 1, we obtain that Γ ⊢ P k ′ { v k / x k } ▷ Δ 2 ′ , s [ r 2 ] : T k ′ . By applying the rule (TConc), we obtain Γ ⊢ P k ∣ P k ′ { v k / x k } ▷ Δ 1 ′ , s [ r 1 ] : T k , Δ 2 ′ , s [ r 2 ] : T k ′ . By using the type reduction relation, we obtain Δ → ι k Δ ′ , where Δ ′ = Δ 1 ′ , s [ r 1 ] : T k , Δ 2 ′ , s [ r 2 ] : T k ′ and p k ∈ ι k .
  • Case (Call): def X ( x ˜ ) = R in ( X ( v ˜ ) ∣ Q ) → 1 def X ( x ˜ ) = R in ( R { v ˜ / x ˜ } ∣ Q )
    By assumption, Γ ⊢ def X ( x ˜ ) = R in ( X ( v ˜ ) ∣ Q ) ▷ Δ . By inverting rule (TDef), we obtain Γ , x ˜ : S ˜ , X : S ˜ ⊢ R ▷ ∅ and Γ , X : S ˜ ⊢ X ( v ˜ ) ∣ Q ▷ Δ . By inverting the rule (TConc), we obtain Γ , X : S ˜ ⊢ X ( v ˜ ) ▷ Δ 1 and Γ , X : S ˜ ⊢ Q ▷ Δ 2 , where Δ 1 , Δ 2 = Δ . By inverting the rule (TCall), we obtain Γ ⊢ v ˜ : S ˜ and that Δ 1 is end only. Applying substitution (Lemma 1), we obtain Γ , X : S ˜ ⊢ R { v ˜ / x ˜ } ▷ ∅ . Since Δ 1 is end only, by type weakening (Lemma 1) we obtain that Γ , X : S ˜ ⊢ R { v ˜ / x ˜ } ▷ Δ 1 . Applying rule (TConc), we obtain Γ , X : S ˜ ⊢ R { v ˜ / x ˜ } ∣ Q ▷ Δ . Applying rule (TDef), we obtain Γ ⊢ def X ( x ˜ ) = R in ( R { v ˜ / x ˜ } ∣ Q ) ▷ Δ , as desired.
  • Case (Ctxt): P 0 → p 0 P 0 ′ implies E [ P 0 ] → p E [ P 0 ′ ] , where p = p 0 · nextProc ( P 0 ) / nextProc ( E [ P 0 ] ) . Let P ≡ E [ P 0 ] and P ′ ≡ E [ P 0 ′ ] . The proof is by structural induction on E .
    Case E = [ ] . This case is trivial, as P → p P ′ is in fact P 0 → p 0 P 0 ′ .
    Case E = ( ν s ) E . By assumption, Γ ( ν s ) E [ P 0 ] Δ . By inverting rule (TRes), we obtain that Γ E [ P 0 ] Δ , G s . By applying rule (Ctxt), P 0 p 0 P 0 implies E [ P 0 ] p E [ P 0 ] , where p = p 0 · nextProc ( P 0 ) / nextProc ( E [ P 0 ] ) . By induction, Γ E [ P 0 ] Δ , G s , where Δ , G s = Δ , G s or Δ , G s ι Δ , G s , with either Δ = Δ or G s = G s and p · nextProc ( E [ P 0 ] ) ι . By applying rule (Ctxt), E [ P 0 ] p E [ P 0 ] implies E [ P 0 ] p E [ P 0 ] , where p = p · nextProc ( E [ P 0 ] ) / nextProc ( E [ P 0 ] ) . By using the rule (TRes), we obtain Γ ( ν s ) E [ P 0 ] Δ , where Δ = Δ or Δ ι Δ and p · nextProc ( ( ν s ) E [ P 0 ] ) = p · nextProc ( E [ P 0 ] ) = p 0 · nextProc ( P 0 ) ι , as required.
    Case E = def D in E . By assumption, Γ def D in E [ P 0 ] Δ . By inverting rule (TDef), we obtain Γ , x : S ˜ , X : S ˜ T 1 T n Q c 1 : T 1 , , c n : T n and Γ , X : S ˜ T 1 T n E [ P 0 ] Δ , where D = { X ( x ˜ c 1 c n ) = Q } . By applying rule (Ctxt), P 0 p 0 P 0 implies E [ P 0 ] p E [ P 0 ] , where p = p 0 · nextProc ( P 0 ) / nextProc ( E [ P 0 ] ) . By induction, Γ , X : S ˜ T 1 T n E [ P 0 ] Δ , where Δ = Δ or Δ ι Δ and p · nextProc ( E [ P 0 ] ) ι . By applying rule (Ctxt), E [ P 0 ] p E [ P 0 ] implies E [ P 0 ] p E [ P 0 ] , where p = p · nextProc ( E [ P 0 ] ) / nextProc ( E [ P 0 ] ) . By using rule (TDef), we obtain Γ def D in E [ P 0 ] Δ , where Δ = Δ or Δ ι Δ and p · nextProc ( def D in E [ P 0 ] ) = p · nextProc ( E [ P 0 ] ) = p 0 · nextProc ( P 0 ) ι , as required.
    Case E = E P . By assumption, Γ E [ P 0 ] P Δ . By inverting rule (TConc), we obtain Γ E [ P 0 ] Δ 1 and Γ P Δ 2 , where Δ 1 , Δ 2 = Δ . By applying rule (Ctxt), P 0 p 0 P 0 implies E [ P 0 ] p E [ P 0 ] , where p = p 0 · nextProc ( P 0 ) / nextProc ( E [ P 0 ] ) . By induction, Γ E [ P 0 ] Δ 1 , with Δ 1 = Δ 1 or Δ 1 ι Δ 1 and p · nextProc ( E [ P 0 ] ) ι . By applying rule (Ctxt), E [ P 0 ] p E [ P 0 ] implies E [ P 0 ] p E [ P 0 ] , where p = p · nextProc ( E [ P 0 ] ) / nextProc ( E [ P 0 ] ) . By using the rule (TConc), we obtain Γ E [ P 0 ] P Δ , with Δ = Δ or Δ ι Δ = Δ 1 , Δ 2 and p · nextProc ( E [ P 0 ] P ) = p · nextProc ( E [ P 0 ] ) = p 0 · nextProc ( P 0 ) ι , as required.
  • Case (Struct): P ≡ P ′ , P ′ → p Q ′ and Q ′ ≡ Q imply P → p Q . We use Theorem 1. □
Example 14.
Consider from Example 12 the process
AB = def B ( y , q ) = B 1 in def A ( y ) = A 1 in A 1 { phoneA [ r A ] / y } ∣ B 1 { phoneA [ r B ] / y , q 1 / q } , where ∅ ⊢ AB ▷ Δ 1 . If AB → 0.95 AB ′ , then:
AB ′ = def B ( y , q ) = B 1 in def A ( y ) = A 1 in
phoneA [ r A ] [ r B ] ⊕ { 0.6 : yes ⟨ x q ⟩ . A ( phoneA [ r A ] ) , 0.3 : no ⟨ x q ⟩ . A ( phoneA [ r A ] ) , 0.1 : quit ⟨ x q ⟩ . 0 } ∣ phoneA [ r B ] [ r A ] & { yes ( q 1 ) . B ( phoneA [ r B ] , next ( q 1 ) ) , no ( q 1 ) . B ( phoneA [ r B ] , next ( q 1 ) ) , unsure ( q 1 ) . B ( phoneA [ r B ] , next ( q 1 ) ) , quit ( q 1 ) . 0 } .
By using the rules of Table 5, the process AB ′ can be typed using the typing Δ 1 ′ = phoneA [ r A ] : T A ′ ,  phoneA [ r B ] : T B ′ of Example 11, namely ∅ ⊢ AB ′ ▷ Δ 1 ′ . Note that from Example 11, it holds that Δ 1 → ι 1 Δ 1 ′ . Also, since nextProc ( AB ) = 1 and 0.95 ∈ ι 1 , it holds that 0.95 · 1 = 0.95 ∈ ι 1 , according to the rule (TSelect) of Table 5.
The next result states that if under the sorting Γ , a process P has the safe typing Δ , then by reducing the process P we can never reach the err process.
Corollary 1
(Type safety). If Γ ⊢ P ▷ Δ , safe ( Δ ) and P → p * P ′ , then P ′ has no error.
Proof. 
By induction on the number of performed derivations. By using Theorem 2, from Γ ⊢ P ▷ Δ and safe ( Δ ) we obtain Γ ⊢ P ′ ▷ Δ ′ such that Δ ′ = Δ or Δ → ι * Δ ′ . If there existed a context E such that P ′ ≡ E [ err ] , then P ′ would be untypable because err is untypable. Hence, P ′ has no error. □
The next result states that if, under the sorting Γ , a process P has the safe typing Δ and P can reduce, then nextProc ( P ) is positive.
Corollary 2
(Positivity of nextProc ). If Γ ⊢ P ▷ Δ , safe ( Δ ) and P → p P ′ , then nextProc ( P ) > 0 .
Proof. 
By induction on the derivation of P → p P ′ . There is a case for each operational semantics rule, and for each of them we consider the typing system rule generating Γ ⊢ P ▷ Δ .
  • Case (Com): s [ r 1 ] [ r 2 ] ⊕ j ∈ J p j : l j ⟨ v j ⟩ . P j ∣ s [ r 2 ] [ r 1 ] & i ∈ I { l i ( x i ) . P i ′ } → p k P k ∣ P k ′ { v ˜ k / x ˜ k } .
    This means that P = s [ r 1 ] [ r 2 ] ⊕ j ∈ J p j : l j ⟨ v j ⟩ . P j ∣ s [ r 2 ] [ r 1 ] & i ∈ I { l i ( x i ) . P i ′ } and thus nextProc ( P ) = 1 > 0 , as desired.
  • Case (Call): def X ( x ˜ ) = R in ( X ( v ˜ ) ∣ Q ) → 1 def X ( x ˜ ) = R in ( R { v ˜ / x ˜ } ∣ Q ) . This means that P = def X ( x ˜ ) = R in ( X ( v ˜ ) ∣ Q ) and thus nextProc ( P ) = 1 + nextProc ( Q ) . Since nextProc ( Q ) ≥ 0 , then nextProc ( P ) > 0 , as desired.
  • Case (Ctxt): P 0 → p 0 P 0 ′ implies E [ P 0 ] → p E [ P 0 ′ ] , with p = p 0 · nextProc ( P 0 ) / nextProc ( E [ P 0 ] ) . This means that P = E [ P 0 ] and thus nextProc ( P ) = nextProc ( E [ 0 ] ) + nextProc ( P 0 ) . From Γ ⊢ P ▷ Δ and safe ( Δ ) , it holds from the rules of Table 5 that there exist Γ ′ and Δ ′ such that Γ ′ ⊢ P 0 ▷ Δ ′ and safe ( Δ ′ ) . By induction, since Γ ′ ⊢ P 0 ▷ Δ ′ , safe ( Δ ′ ) and P 0 → p 0 P 0 ′ , it holds that nextProc ( P 0 ) > 0 and thus nextProc ( P ) > 0 , as desired.
  • Case (Struct): P ≡ P ′ , P ′ → p Q ′ and Q ′ ≡ Q imply P → p Q . From Γ ⊢ P ▷ Δ and P ≡ P ′ , by using Theorem 1, it holds that Γ ⊢ P ′ ▷ Δ . By induction, since Γ ⊢ P ′ ▷ Δ , safe ( Δ ) and P ′ → p Q ′ , it holds that nextProc ( P ′ ) > 0 . Also, from P ≡ P ′ , it holds that nextProc ( P ) = nextProc ( P ′ ) and thus nextProc ( P ) > 0 , as desired.
  • Case (Err): s [ r 1 ] [ r 2 ] ⊕ j ∈ J p j : l j ⟨ v j ⟩ . P j ∣ s [ r 2 ] [ r 1 ] & i ∈ I { l i ( x i ) . P i ′ } → p k err (if k ∈ J and k ∉ I ). This would mean that P ′ = err . Since Γ ⊢ P ▷ Δ , safe ( Δ ) and P → p P ′ , by Corollary 1, P ′ has no error. This contradicts P ′ = err , and thus rule (Err) is not applicable to a process P with Γ ⊢ P ▷ Δ and safe ( Δ ) . □
As in [1], an annotated process P is the result of annotating the bound names of P, e.g., ( ν s : G ) P , s ? ( x : S ) P and def X ( x ˜ : S ˜ c ˜ : T ˜ ) in . Notice that some rules without variables are the same as the ones of Table 5 and that type checking of annotated processes is decidable whenever the safety property is decidable (this follows because typings have finite-state transition systems).
Theorem 3.
Given an annotated process ( ν s : G ) P and a sorting Γ, it is decidable whether the judgement Γ ⊢ ( ν s : G ) P ▷ ∅ is derivable or not.
Proof. 
An algorithm for deriving Γ ⊢ ( ν s : G ) P ▷ ∅ can be straightforwardly obtained by inverting the annotated rules obtained from the rules of Table 5, noticing that:
  • All typing rules are deterministically invertible, i.e., at most one typing rule can be applied to any given well-typed process;
  • At each inversion, the typing contexts, sortings and type annotations in the process in the conclusion determine how the typing contexts, sortings and type annotations in the processes in the premises are generated; note that, in the worst case, when splitting a typing context Δ , one might need to try all possible typing contexts Δ 1 and Δ 2 such that Δ 1 , Δ 2 = Δ ;
  • Each rule inversion guarantees that the premises contain smaller subterms than the term in the conclusion, and so the recursive derivation eventually terminates. □
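The worst case mentioned in the second item, trying all splits Δ 1 , Δ 2 of a typing context Δ when inverting (TConc), can be sketched as follows. This is toy code with a typing encoded as a dictionary and a function name of our own; the point is only that for n channels there are 2^n candidate splits, a finite number, as the decidability argument requires:

```python
from itertools import combinations

def all_splits(delta):
    """Enumerate every pair (d1, d2) whose disjoint union is delta,
    i.e. every way to distribute the channels of delta between the
    two premises of rule (TConc)."""
    keys = list(delta)
    for r in range(len(keys) + 1):
        for chosen in combinations(keys, r):
            d1 = {k: delta[k] for k in chosen}
            d2 = {k: delta[k] for k in keys if k not in chosen}
            yield d1, d2

delta = {("s", "r1"): "T1", ("s", "r2"): "T2"}
print(len(list(all_splits(delta))))  # 4
```

In practice an implementation would prune this enumeration using the free channels of the two subprocesses, but finiteness already suffices for decidability.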
As in many approaches based on multiparty session types, our approach ensures that a typed ensemble of processes interacting on a single annotated session (namely, a process typed ( ν s : G ) ( P 1 ∣ … ∣ P n ) with each P i ( 1 ≤ i ≤ n ) interacting only on s [ r i ] ) is deadlock free. From now on, Q ↛ means that the process Q cannot evolve by means of any rule (i.e., Q is a stable process).
Theorem 4
(Deadlock freedom). Let ⊢ P ▷ ∅ , where P ≡ ( ν s : G )  ( P 1 ∣ … ∣ P n ) and each P i ( 1 ≤ i ≤ n ) interacts only on s [ r i ] of type T i , where T i = G ↾ r i . Then P is deadlock free, i.e.,
P → p * P ′ and P ′ ↛ imply P ′ ≡ 0 .
Proof. 
By inverting the rule (TRes), it holds that ⊢ P 1 ∣ … ∣ P n ▷ G ↾ s , while by inverting the rule (TConc) it holds that ⊢ P i ▷ s [ r i ] : T i . Since T i = G ↾ r i , the typing G ↾ s reduces as prescribed by G. The proof proceeds by cases depending on the structure of G.
  • Case G = r 1 → r 2 : { ι j : l j ( S j ) . G j } j ∈ J . Then there exist processes P 1 and P 2 and, in G ↾ s , the terms s [ r 1 ] : r 2 ⊕ j ∈ J ι j : ! l j ⟨ S j ⟩ . T j and s [ r 2 ] : r 1 & j ∈ J ? l j ( S j ) . T j ′ such that ⊢ P 1 ▷ s [ r 1 ] : r 2 ⊕ j ∈ J ι j : ! l j ⟨ S j ⟩ . T j and ⊢ P 2 ▷ s [ r 2 ] : r 1 & j ∈ J ? l j ( S j ) . T j ′ . By inverting the rules (TSelect) and (TBranch), we obtain that P 1 = s [ r 1 ] [ r 2 ] ⊕ j ∈ J p j : l j ⟨ v j ⟩ ; P j ″ and P 2 = s [ r 2 ] [ r 1 ] & j ∈ J { l j ( x j ) ; P j ‴ } . According to rule (Com), there exist P 1 ′ and P 2 ′ and a probability p j such that P 1 ∣ P 2 → p j P 1 ′ ∣ P 2 ′ ; by using rule (Ctxt), it holds that P → p P 1 ′ ∣ P 2 ′ ∣ P 3 ∣ … ∣ P n , as desired.
The remaining cases are proved in a similar manner. □

4.3. Properties Preserved by Removing Probabilities

As can be noticed from the rules of Table 5, the obtained types are not unique. This is due to the fact that we use probability intervals, which allow a process to be considered well-typed for any probabilities within the intervals defined by the lower and upper bounds. Having several types for the same process, we can obtain some refinements of these typings. Inspired by [32], we use an erase function that removes the probability annotations when applied to a process or a type.
Example 15.
Consider the global type G 1 of Example 3:
G 1 = μ t . r B → r A : { ι 1 : talk ( string ) . r A → r B : { ι 2 : yes ( string ) . t , ι 3 : no ( string ) . t , ι 4 : quit ( string ) . end } , ι 5 : quit ( string ) . end } .
By using the erase function we obtain the global type
erase ( G 1 ) = μ t . r B → r A : { talk ( string ) . r A → r B : { yes ( string ) . t , no ( string ) . t , quit ( string ) . end } , quit ( string ) . end } .
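A sketch of the erase function on such types, using our own toy encoding (a global interaction written ("comm", sender, receiver, {label: (interval, payload, continuation)}), plus ("mu", var, body), recursion variables and "end"): erase keeps the tree structure and only drops the probability intervals. The intervals in the snippet are placeholders, not the paper's ι 1 – ι 5 .

```python
def erase(g):
    """Remove probability annotations from a (toy) global type."""
    if isinstance(g, tuple) and g[0] == "comm":
        _, snd, rcv, branches = g
        return ("comm", snd, rcv,
                {l: (payload, erase(cont))          # the interval is dropped
                 for l, (iota, payload, cont) in branches.items()})
    if isinstance(g, tuple) and g[0] == "mu":
        return ("mu", g[1], erase(g[2]))
    return g  # "end" or a recursion variable such as "t"

g1 = ("mu", "t",
      ("comm", "rB", "rA",
       {"talk": ((0.0, 1.0), "string",
                 ("comm", "rA", "rB",
                  {"yes": ((0.0, 0.5), "string", "t"),
                   "quit": ((0.0, 0.3), "string", "end")})),
        "quit": ((0.95, 1.0), "string", "end")}))
print(erase(g1)[0])  # mu
```

The recursive shape of the definition matches the example above: after erasure each branch keeps its label, payload sort and continuation, but no interval.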
The operational semantics without probabilities is similar to the one from Table 2, while the typing system without probabilities is similar to the one from Table 5. In order to distinguish between the rules of Table 2 and Table 5 and those with probabilities erased, we add an S (from simple) in front of the name of the rules with probabilities erased. E.g., for the rule (Com) of Table 2:
s [ r 1 ] [ r 2 ] ⊕ j ∈ J p j : l j ⟨ v j ⟩ ; P j ∣ s [ r 2 ] [ r 1 ] & i ∈ I { l i ( x i ) ; P i ′ } → p k P k ∣ P k ′ { v ˜ k / x ˜ k }
its corresponding rule with probabilities erased is (SCom):
s [ r 1 ] [ r 2 ] ⊕ j ∈ J l j ⟨ v j ⟩ ; P j ∣ s [ r 2 ] [ r 1 ] & i ∈ I { l i ( x i ) ; P i ′ } → P k ∣ P k ′ { v ˜ k / x ˜ k }
Since probabilities do not influence the evolution, we obtain the following two results. The first states that if a probabilistic process can perform an evolution step, then the process with all probabilities erased can also perform an evolution step; the second states that if a process with all probabilities erased can perform an evolution step, then so can the probabilistic process.
Proposition 3.
If P → p P ′ , then e r a s e ( P ) → e r a s e ( P ′ ) .
Proof. 
By induction on the derivation of P → p P ′ .
There is a case for each operational semantics rule of Table 2.
  • Case (Com): s [ r 1 ] [ r 2 ] ⊕ j ∈ J p j : l j ⟨ v j ⟩ ; P j ∣ s [ r 2 ] [ r 1 ] & i ∈ I { l i ( x i ) ; P i ′ } → p k P k ∣ P k ′ { v ˜ k / x ˜ k } . Using the definition of the erase function, we obtain that
    erase ( P ) = erase ( s [ r 1 ] [ r 2 ] ⊕ j ∈ J p j : l j ⟨ v j ⟩ ; P j ∣ s [ r 2 ] [ r 1 ] & i ∈ I { l i ( x i ) ; P i ′ } )
    = erase ( s [ r 1 ] [ r 2 ] ⊕ j ∈ J p j : l j ⟨ v j ⟩ ; P j ) ∣ erase ( s [ r 2 ] [ r 1 ] & i ∈ I { l i ( x i ) ; P i ′ } )
    = s [ r 1 ] [ r 2 ] ⊕ j ∈ J l j ⟨ v j ⟩ ; e r a s e ( P j ) ∣ s [ r 2 ] [ r 1 ] & i ∈ I { l i ( x i ) ; e r a s e ( P i ′ ) } .
    By using rule (SCom), it holds that e r a s e ( P ) → e r a s e ( P k ) ∣ e r a s e ( P k ′ ) { v ˜ k / x ˜ k } .
    Since e r a s e ( P ′ ) = erase ( P k ∣ P k ′ { v ˜ k / x ˜ k } ) = erase ( P k ) ∣ e r a s e ( P k ′ ) { v ˜ k / x ˜ k } , it results that e r a s e ( P ) → e r a s e ( P ′ ) , as required.
The remaining cases can be proved in a similar manner. □
Proposition 4.
If e r a s e ( P ) → e r a s e ( P ′ ) , then there exists p such that P → p P ′ .
Proof. 
The proof is similar to that of the previous result (Proposition 3). □
A process remains well-typed after eliminating the probabilities in processes and the probability intervals in types.
Proposition 5.
If Γ ⊢ P ▷ Δ, then Γ ⊢ erase(P) ▷ erase(Δ).
Proof. 
The proof is by induction on the structure of process P and by using the rules of Table 5.
  • P = c[r] ⊕_{i∈I} p_i:l_i⟨v_i⟩;P_i. Assume that for all i ∈ I there exist T_i and Δ′ such that Γ ⊢ P_i ▷ Δ′, c:T_i. There exist ι_i such that p_i ∈ ι_i, and according to rule (TSelect) of Table 5 it holds that Γ ⊢ P ▷ Δ, where Δ = Δ′, c : r ⊕_{i∈I} ι_i:!l_i⟨S_i⟩.T_i. Since by induction for all i ∈ I it holds that Γ ⊢ erase(P_i) ▷ erase(Δ′, c:T_i) and erase(Δ′, c:T_i) = erase(Δ′), c:erase(T_i), then Γ ⊢ erase(P_i) ▷ erase(Δ′), c:erase(T_i). According to rule (TSelect), it holds that Γ ⊢ c[r] ⊕_{i∈I} l_i⟨v_i⟩;erase(P_i) ▷ erase(Δ′), c : r ⊕_{i∈I} !l_i⟨S_i⟩.erase(T_i). Since erase(P) = c[r] ⊕_{i∈I} l_i⟨v_i⟩;erase(P_i) and erase(Δ) = erase(Δ′), c : r ⊕_{i∈I} !l_i⟨S_i⟩.erase(T_i), it implies that Γ ⊢ erase(P) ▷ erase(Δ), as desired.
The remaining cases can be proved in a similar manner. □

4.4. Typed Probabilistic Equivalence

Since the type system uses probability intervals, several processes (more than one) can be well-typed with respect to the same type. However, such well-typed processes are connected: they can be related by means of the erase function, namely they become equal after erasing all probabilities.
Proposition 6.
If Γ ⊢ P ▷ Δ and Δ contains at least one probability interval of the form [d_1, d_2] with d_1, d_2 ∈ [0, 1] and d_1 < d_2, then there exists a process Q such that P ≢ Q, Γ ⊢ Q ▷ Δ and erase(P) ≡ erase(Q).
Proof. 
The proof is by induction on the structure of process P, by using the rules of Table 5.
  • P = c[r] ⊕_{i∈I} p_i:l_i⟨v_i⟩;P_i. Assume that for all i ∈ I there exist T_i and Δ′ such that Γ ⊢ P_i ▷ Δ′, c:T_i. There exist ι_i such that p_i ∈ ι_i, and according to rule (TSelect) of Table 5 it holds that Γ ⊢ P ▷ Δ, where Δ = Δ′, c : r ⊕_{i∈I} ι_i:!l_i⟨S_i⟩.T_i. From the assumption that Δ contains at least one probability interval of the form [d_1, d_2] with d_1, d_2 ∈ [0, 1] and d_1 < d_2, one can construct a Q such that Γ ⊢ Q ▷ Δ by considering Q = c[r] ⊕_{i∈I} p′_i:l_i⟨v_i⟩;Q_i, where for all i ∈ I it holds that p′_i ∈ ι_i and for at least one i ∈ I we have p′_i ≠ p_i. Thus, by construction, it holds that P ≢ Q. Since by induction for all i ∈ I it holds that erase(P_i) ≡ erase(Q_i), it implies that also erase(P) ≡ erase(Q), as desired.
The remaining cases are proved in a similar manner. □
However, the reverse direction of this result does not hold: if there exists a process Q such that P ≢ Q, Γ ⊢ Q ▷ Δ and erase(P) ≡ erase(Q), this does not imply that Γ ⊢ P ▷ Δ. Consider as a counterexample the process Q = c[r] ⊕ {0.4:l_1⟨v_1⟩, 0.6:l_2⟨v_2⟩} and Δ = c : r ⊕ {[0.3, 0.5]:!l_1⟨S_1⟩, [0.7, 0.8]:!l_2⟨S_2⟩}. Then the process P = c[r] ⊕ {0.2:l_1⟨v_1⟩, 0.8:l_2⟨v_2⟩} is such that P ≢ Q and erase(P) ≡ erase(Q), but it does not hold that Γ ⊢ P ▷ Δ.
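The side condition of rule (TSelect) used throughout this section, namely that each selection probability p_i must lie inside the corresponding interval ι_i, can be sketched as follows. The intervals below are illustrative (chosen so that one selection fits and the other does not); they are not exactly the numbers of the counterexample above, and in_interval/typable_selection are hypothetical helper names.

```python
# Checking the per-branch membership p_i ∈ ι_i of a probabilistic selection
# against an interval-annotated selection type.

def in_interval(p, interval):
    lo, hi = interval
    return lo <= p <= hi

def typable_selection(probs, intervals):
    """True iff each branch probability p_i falls in its interval ι_i."""
    return (len(probs) == len(intervals)
            and all(in_interval(p, i) for p, i in zip(probs, intervals)))

delta = [(0.3, 0.5), (0.5, 0.8)]   # intervals for the outputs !l1 and !l2
Q = [0.4, 0.6]                     # Q chooses l1 with 0.4 and l2 with 0.6
P = [0.2, 0.8]                     # P chooses l1 with 0.2 and l2 with 0.8

assert typable_selection(Q, delta)       # Q fits the intervals
assert not typable_selection(P, delta)   # P does not, although erasing
                                         # probabilities makes P and Q equal
```

This is exactly the phenomenon behind the counterexample: equality after erasure does not transfer typability between processes with different branch probabilities.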
Proposition 6 leads us to consider a probabilistic relation between processes that are typed under the same sorting and typing and that have equal definitions once their probabilities are erased. Formally, such a relation is defined as follows.
Definition 7.
We say that two processes P and Q are related by a sorting Γ and a typing Δ, denoted by P ∼_{Γ,Δ} Q, if Γ ⊢ P ▷ Δ, Γ ⊢ Q ▷ Δ and erase(P) ≡ erase(Q).
Example 16.
Consider from Example 13 the process
AB = def B(y, q) = B_1 in (def A(y) = A_1 in A_1{phoneA[r_A]/y}) | B_1{PhoneNoA[r_B]/y, q_1/q}, where Γ ⊢ AB ▷ Δ_1, and a process BA such that AB ≢ BA and Γ ⊢ BA ▷ Δ_1. Then, according to Definition 7, it also holds that AB ∼_{Γ,Δ_1} BA.
It is not difficult to note that for any sorting Γ and any typing Δ, the relation ∼_{Γ,Δ} is an equivalence relation (it is reflexive, symmetric and transitive).
Proposition 7.
For any sorting Γ and typing Δ, the relation ∼_{Γ,Δ} is an equivalence.
Proof. 
The fact that the relation ∼_{Γ,Δ} is an equivalence follows from:
  • Reflexivity: Consider Γ ⊢ P ▷ Δ. Since erase(P) ≡ erase(P), from Definition 7 it follows that P ∼_{Γ,Δ} P, as desired.
  • Symmetry: Consider P ∼_{Γ,Δ} Q. According to Definition 7, it holds that Γ ⊢ P ▷ Δ, Γ ⊢ Q ▷ Δ and erase(P) ≡ erase(Q), and thus Q ∼_{Γ,Δ} P.
  • Transitivity: Consider P ∼_{Γ,Δ} Q and Q ∼_{Γ,Δ} R. According to Definition 7, it holds that Γ ⊢ P ▷ Δ, Γ ⊢ Q ▷ Δ, Γ ⊢ R ▷ Δ, erase(P) ≡ erase(Q) and erase(Q) ≡ erase(R), and thus P ∼_{Γ,Δ} R. □
If two typings Δ_1 and Δ_2 differ only in their probability intervals, namely erase(Δ_1) = erase(Δ_2), then it is natural to define their intersection:
Δ_1 ⊓ Δ_2 = {s[r] : (T_1 ⊓ T_2) | s[r]:T_1 ∈ Δ_1 and s[r]:T_2 ∈ Δ_2}, where
T_1 ⊓ T_2 =
  • r ⊕_{i∈I} (ι_{1i} ⊓ ι_{2i}):!l_i⟨S_i⟩.(T_{1i} ⊓ T_{2i}), if T_j = r ⊕_{i∈I} ι_{ji}:!l_i⟨S_i⟩.T_{ji} and ι_{1i} ⊓ ι_{2i} ≠ ∅, with j ∈ {1, 2};
  • r &_{i∈I} ?l_i(S_i).(T_{1i} ⊓ T_{2i}), if T_j = r &_{i∈I} ?l_i(S_i).T_{ji}, with j ∈ {1, 2};
  • μt.(T_{1i} ⊓ T_{2i}), if T_1 = μt.T_{1i} and T_2 = μt.T_{2i};
  • T_1, if T_1 = T_2 = t or T_1 = T_2 = end;
  • undefined, otherwise.
Since the intersection of probability intervals is associative, this property naturally extends to the intersection of typings; thus, the intersection of typings is also associative.
Proposition 8.
(Δ_1 ⊓ Δ_2) ⊓ Δ_3 = Δ_1 ⊓ (Δ_2 ⊓ Δ_3).
Proof. 
By induction on the structure of the local types appearing in Δ_1, Δ_2 and Δ_3, considered whenever their intersection is defined. Assume that s[r_i]:T_j ∈ Δ_j for 1 ≤ j ≤ 3. Notice that if one of the T_j (for 1 ≤ j ≤ 3) is end or t, then we have T_1 = T_2 = T_3 and the proof is concluded. Otherwise, we have the following cases:
  • T_j = r ⊕_{i∈I} ι_{ji}:!l_i⟨S_i⟩.T_{ji}, for 1 ≤ j ≤ 3.
    By induction and by the associativity of interval intersection, we obtain that (T_1 ⊓ T_2) ⊓ T_3 = r ⊕_{i∈I} ((ι_{1i} ⊓ ι_{2i}) ⊓ ι_{3i}):!l_i⟨S_i⟩.((T_{1i} ⊓ T_{2i}) ⊓ T_{3i}) = r ⊕_{i∈I} (ι_{1i} ⊓ (ι_{2i} ⊓ ι_{3i})):!l_i⟨S_i⟩.(T_{1i} ⊓ (T_{2i} ⊓ T_{3i})) = T_1 ⊓ (T_2 ⊓ T_3), as requested.
  • T_j = r &_{i∈I} ?l_i(S_i).T_{ji}, for 1 ≤ j ≤ 3.
    By induction, we obtain that (T_1 ⊓ T_2) ⊓ T_3 = r &_{i∈I} ?l_i(S_i).((T_{1i} ⊓ T_{2i}) ⊓ T_{3i}) = r &_{i∈I} ?l_i(S_i).(T_{1i} ⊓ (T_{2i} ⊓ T_{3i})) = T_1 ⊓ (T_2 ⊓ T_3), as requested.
  • T_j = μt.T′_j, for 1 ≤ j ≤ 3.
    By induction, we obtain that (T_1 ⊓ T_2) ⊓ T_3 = μt.((T′_1 ⊓ T′_2) ⊓ T′_3) = μt.(T′_1 ⊓ (T′_2 ⊓ T′_3)) = T_1 ⊓ (T_2 ⊓ T_3). □
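The interval-level core of Proposition 8 is the intersection of closed probability intervals, which is undefined when the intervals are disjoint. A small sketch (meet is a hypothetical helper name; None models the "undefined" case):

```python
# Intersection ι1 ⊓ ι2 of closed probability intervals, each interval a
# (low, high) pair; None represents "undefined" (disjoint intervals).

def meet(i1, i2):
    if i1 is None or i2 is None:
        return None
    lo, hi = max(i1[0], i2[0]), min(i1[1], i2[1])
    return (lo, hi) if lo <= hi else None

a, b, c = (0.1, 0.6), (0.3, 0.9), (0.2, 0.5)

# The interval-level fact that lifts to typings in Proposition 8:
assert meet(meet(a, b), c) == meet(a, meet(b, c)) == (0.3, 0.5)

# Disjoint intervals make ι1 ⊓ ι2 (and hence Δ1 ⊓ Δ2) undefined:
assert meet((0.1, 0.2), (0.5, 0.6)) is None
```

The associativity assertion mirrors the proposition: grouping does not matter, so the pointwise definition of Δ_1 ⊓ Δ_2 inherits associativity directly from the intervals.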
It is worth noting that if a process is well-typed with different typings Δ_1 and Δ_2 such that erase(Δ_1) = erase(Δ_2), then it remains well-typed under the intersection of the typings Δ_1 and Δ_2. This means that if a selection process has a probability p and two well-formed types annotate it with different probability intervals ι_1 and ι_2, then the selection process remains well-typed under the intersection of the intervals ι_1 and ι_2.
Theorem 5.
If Γ ⊢ P ▷ Δ_1, Γ ⊢ P ▷ Δ_2 and erase(Δ_1) = erase(Δ_2), then Γ ⊢ P ▷ Δ_1 ⊓ Δ_2.
Proof. 
The proof is by induction on the structure of process P and by using the rules of Table 5.
  • P = c[r] ⊕_{i∈I} p_i:l_i⟨v_i⟩;P_i. Assume that for all i ∈ I there exist T_{i1}, T_{i2}, Δ′_1 and Δ′_2 such that Γ ⊢ P_i ▷ Δ′_1, c:T_{i1} and Γ ⊢ P_i ▷ Δ′_2, c:T_{i2}. There exist ι_{i1} and ι_{i2} such that p_i ∈ ι_{i1} and p_i ∈ ι_{i2}. According to rule (TSelect) of Table 5, it holds that Γ ⊢ P ▷ Δ_1 and Γ ⊢ P ▷ Δ_2, where Δ_1 = Δ′_1, c : r ⊕_{i∈I} ι_{i1}:!l_i⟨S_i⟩.T_{i1} and Δ_2 = Δ′_2, c : r ⊕_{i∈I} ι_{i2}:!l_i⟨S_i⟩.T_{i2}, together with erase(Δ′_1) = erase(Δ′_2). Since for all i ∈ I it holds that p_i ∈ ι_{i1} and p_i ∈ ι_{i2}, then also p_i ∈ ι_{i1} ⊓ ι_{i2} holds. By induction, for all i ∈ I it holds that Γ ⊢ P_i ▷ Δ′_1 ⊓ Δ′_2, c:T_{i1} ⊓ T_{i2}. By applying the rule (TSelect) of Table 5, it holds that Γ ⊢ c[r] ⊕_{i∈I} p_i:l_i⟨v_i⟩;P_i ▷ Δ′_1 ⊓ Δ′_2, c : r ⊕_{i∈I} (ι_{i1} ⊓ ι_{i2}):!l_i⟨S_i⟩.(T_{i1} ⊓ T_{i2}). Since Δ_1 ⊓ Δ_2 = Δ′_1 ⊓ Δ′_2, c : r ⊕_{i∈I} (ι_{i1} ⊓ ι_{i2}):!l_i⟨S_i⟩.(T_{i1} ⊓ T_{i2}), then Γ ⊢ P ▷ Δ_1 ⊓ Δ_2, as desired.
The remaining cases are proved in a similar manner. □
The next result shows that if two processes are related by a probabilistic relation under a sorting Γ and two typings Δ_1 and Δ_2 that differ only in the probability intervals they use (i.e., erase(Δ_1) = erase(Δ_2)), then the two processes are also related under the sorting Γ and the typing Δ_1 ⊓ Δ_2. This means that if two processes can be related under the same sorting but at least two different typings, then further typings relating the two processes can be obtained.
Proposition 9.
If P ∼_{Γ,Δ_1} Q, P ∼_{Γ,Δ_2} Q and erase(Δ_1) = erase(Δ_2), then P ∼_{Γ,Δ_1⊓Δ_2} Q.
Proof. 
From P ∼_{Γ,Δ_1} Q and P ∼_{Γ,Δ_2} Q, according to Definition 7, it holds that Γ ⊢ P ▷ Δ_1, Γ ⊢ P ▷ Δ_2, Γ ⊢ Q ▷ Δ_1, Γ ⊢ Q ▷ Δ_2 and erase(P) ≡ erase(Q). Since Γ ⊢ P ▷ Δ_1, Γ ⊢ P ▷ Δ_2 and erase(Δ_1) = erase(Δ_2), by Theorem 5 it results that Γ ⊢ P ▷ Δ_1 ⊓ Δ_2. Similarly, since Γ ⊢ Q ▷ Δ_1, Γ ⊢ Q ▷ Δ_2 and erase(Δ_1) = erase(Δ_2), by Theorem 5 it results that Γ ⊢ Q ▷ Δ_1 ⊓ Δ_2. From Γ ⊢ P ▷ Δ_1 ⊓ Δ_2, Γ ⊢ Q ▷ Δ_1 ⊓ Δ_2 and erase(P) ≡ erase(Q), by Definition 7 it results that P ∼_{Γ,Δ_1⊓Δ_2} Q, as desired. □
By defining the union operator ⊔ for typings in a similar way to the intersection operator ⊓, and since the union of probability intervals is associative, this property naturally extends to the union of typings; thus, the union of typings is also associative.
Proposition 10.
(Δ_1 ⊔ Δ_2) ⊔ Δ_3 = Δ_1 ⊔ (Δ_2 ⊔ Δ_3).
Proof. 
Similar to the proof of Proposition 8. □
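The interval-level fact behind Proposition 10 can be sketched as well. Since the text defines ⊔ only "in a similar way" to ⊓, the concrete choice below, the smallest closed interval containing both operands, is our assumption, and join is a hypothetical helper name:

```python
# Join ι1 ⊔ ι2 of closed probability intervals, assumed to be the smallest
# closed interval containing both.

def join(i1, i2):
    return (min(i1[0], i2[0]), max(i1[1], i2[1]))

a, b, c = (0.1, 0.4), (0.3, 0.9), (0.2, 0.5)

# Interval-level associativity, which lifts to Proposition 10 for typings:
assert join(join(a, b), c) == join(a, join(b, c)) == (0.1, 0.9)

# The join only widens an interval, so p ∈ ι1 implies p ∈ ι1 ⊔ ι2; this
# membership preservation is what Theorem 6 below relies on.
p = 0.2
assert a[0] <= p <= a[1]
assert join(a, b)[0] <= p <= join(a, b)[1]
```

Unlike ⊓, this join is total: no "undefined" case is needed, since two intervals always have a common enclosing interval.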
It is worth noting that a well-typed process with a typing Δ_1 remains well-typed when we consider the union of the typing Δ_1 with another typing Δ_2 such that erase(Δ_1) = erase(Δ_2). This means that if a selection process has a probability p inside the interval ι_1 of a well-formed type, then the process remains well-typed when ι_1 is widened to its union with the corresponding interval ι_2 appearing in Δ_2.
Theorem 6.
If Γ ⊢ P ▷ Δ_1 and there is Δ_2 such that erase(Δ_1) = erase(Δ_2), then Γ ⊢ P ▷ Δ_1 ⊔ Δ_2.
Proof. 
The proof is by induction on the structure of process P, using the rules of Table 5.
  • P = c[r] ⊕_{i∈I} p_i:l_i⟨v_i⟩;P_i. Assume that for all i ∈ I there exist T_{i1} and Δ′_1 such that Γ ⊢ P_i ▷ Δ′_1, c:T_{i1}. There exist ι_{i1} such that p_i ∈ ι_{i1}, and according to rule (TSelect) of Table 5 it holds that Γ ⊢ P ▷ Δ_1, where Δ_1 = Δ′_1, c : r ⊕_{i∈I} ι_{i1}:!l_i⟨S_i⟩.T_{i1}. If there exists Δ_2 such that erase(Δ_1) = erase(Δ_2), then Δ_2 = Δ′_2, c : r ⊕_{i∈I} ι_{i2}:!l_i⟨S_i⟩.T_{i2}. Since for all i ∈ I it holds that p_i ∈ ι_{i1}, then also p_i ∈ ι_{i1} ⊔ ι_{i2} holds. By induction, for all i ∈ I it holds that Γ ⊢ P_i ▷ Δ′_1 ⊔ Δ′_2, c:T_{i1} ⊔ T_{i2}. By applying the rule (TSelect) of Table 5, it holds that Γ ⊢ c[r] ⊕_{i∈I} p_i:l_i⟨v_i⟩;P_i ▷ Δ′_1 ⊔ Δ′_2, c : r ⊕_{i∈I} (ι_{i1} ⊔ ι_{i2}):!l_i⟨S_i⟩.(T_{i1} ⊔ T_{i2}). Since Δ_1 ⊔ Δ_2 = Δ′_1 ⊔ Δ′_2, c : r ⊕_{i∈I} (ι_{i1} ⊔ ι_{i2}):!l_i⟨S_i⟩.(T_{i1} ⊔ T_{i2}), then Γ ⊢ P ▷ Δ_1 ⊔ Δ_2, as desired.
The remaining cases are proved in a similar manner. □
The next result shows that if two processes are related by a probabilistic relation under a sorting Γ and a typing Δ_1, and there is a typing Δ_2 differing from Δ_1 only in the probability intervals it uses (namely erase(Δ_1) = erase(Δ_2)), then the two processes are also related under the sorting Γ and the typing Δ_1 ⊔ Δ_2. This means that two processes can remain related even when the probability intervals of their typing are widened.
Proposition 11.
If P ∼_{Γ,Δ_1} Q and there exists Δ_2 such that erase(Δ_1) = erase(Δ_2), then P ∼_{Γ,Δ_1⊔Δ_2} Q.
Proof. 
From P ∼_{Γ,Δ_1} Q, according to Definition 7, it holds that Γ ⊢ P ▷ Δ_1, Γ ⊢ Q ▷ Δ_1 and erase(P) ≡ erase(Q). Since Γ ⊢ P ▷ Δ_1 and there exists Δ_2 such that erase(Δ_1) = erase(Δ_2), by Theorem 6 it results that Γ ⊢ P ▷ Δ_1 ⊔ Δ_2. Similarly, since Γ ⊢ Q ▷ Δ_1 and there exists Δ_2 such that erase(Δ_1) = erase(Δ_2), by Theorem 6 it results that Γ ⊢ Q ▷ Δ_1 ⊔ Δ_2. From Γ ⊢ P ▷ Δ_1 ⊔ Δ_2, Γ ⊢ Q ▷ Δ_1 ⊔ Δ_2 and erase(P) ≡ erase(Q), by Definition 7 it results that P ∼_{Γ,Δ_1⊔Δ_2} Q, as desired. □
Usually, an equivalence relation requires an exact match between the transitions of two processes. Sometimes this requirement is too strong, and this is why in what follows, inspired by [33], we define and study bisimulations between processes that also take into account their types with probability intervals. We first present some notations used in the remainder of the paper.
  • A typed probabilistic relation over the set P of processes, the set S of sortings and the set T of typings is any relation R ⊆ {(P, Γ, Δ, Q) | P ∼_{Γ,Δ} Q, P, Q ∈ P, Γ ∈ S, Δ ∈ T, safe(Δ)}.
  • The identity typed probabilistic relation is
    Id_R =_df {(P, Γ, Δ, P) | P ∈ P, Γ ∈ S, Δ ∈ T, safe(Δ)}.
  • The inverse of a typed probabilistic relation R is
    R⁻¹ =_df {(Q, Γ, Δ, P) | (P, Γ, Δ, Q) ∈ R}.
  • The composition of typed probabilistic relations R_1 and R_2 is
    R_1 ∘ R_2 =_df {(P, Γ, Δ, Q) | ∃R ∈ P : (P, Γ, Δ, R) ∈ R_1 and (R, Γ, Δ, Q) ∈ R_2}.
Definition 8
(Typed Probabilistic Bisimulation). Let R be a symmetric typed probabilistic relation.
1. 
R is a typed probabilistic bisimulation if (P, Γ, Δ, Q) ∈ R and P →_p P′ imply that there exist Q′ and Δ′ such that Q →_p Q′ and (P′, Γ, Δ′, Q′) ∈ R.
2. 
The typed probabilistic bisimilarity ≃ is the union of all typed probabilistic bisimulations R.
In what follows, we prove some properties of the typed probabilistic bisimulations and the typed probabilistic bisimilarity.
Proposition 12.
1. 
Identity, inverse, composition and union of typed probabilistic bisimulations are typed probabilistic bisimulations.
2. 
≃ is the largest typed probabilistic bisimulation.
3. 
≃ is an equivalence.
Proof. 
  • We treat each relation separately, showing that it respects the conditions from Definition 8 for being a typed probabilistic bisimulation.
    (a)
    The identity relation Id_R is a typed probabilistic bisimulation. Assume (P, Γ, Δ, P) ∈ Id_R; then Γ ⊢ P ▷ Δ and safe(Δ). Consider P →_p P′; according to Theorem 2, there exist Δ′ and ι such that Γ ⊢ P′ ▷ Δ′ and Δ = Δ′ or Δ →_ι Δ′. Since safe(Δ), according to Definition 5, it also holds that safe(Δ′). According to Proposition 7, by reflexivity, it holds that P′ ∼_{Γ,Δ′} P′. Thus, it results that (P′, Γ, Δ′, P′) ∈ Id_R, as desired.
    (b)
    The inverse of a typed probabilistic bisimulation is a typed probabilistic bisimulation. Assume (P, Γ, Δ, Q) ∈ R⁻¹, namely (Q, Γ, Δ, P) ∈ R. Consider Q →_p Q′; then for some P′ and Δ′ we have P →_p P′ and (Q′, Γ, Δ′, P′) ∈ R, namely (P′, Γ, Δ′, Q′) ∈ R⁻¹. By similar reasoning, if P →_p P′ then we can find Q′ and Δ′ such that Q →_p Q′ and (P′, Γ, Δ′, Q′) ∈ R⁻¹.
    (c)
    The composition of typed probabilistic bisimulations is a typed probabilistic bisimulation. Assume (P, Γ, Δ, Q) ∈ R_1 ∘ R_2. Then for some R we have (P, Γ, Δ, R) ∈ R_1 and (R, Γ, Δ, Q) ∈ R_2. Consider P →_p P′; then for some R′ and Δ′, since (P, Γ, Δ, R) ∈ R_1, we have R →_p R′ and (P′, Γ, Δ′, R′) ∈ R_1. Also, since (R, Γ, Δ, Q) ∈ R_2, we have for some Q′ that Q →_p Q′ and (R′, Γ, Δ′, Q′) ∈ R_2. Thus, (P′, Γ, Δ′, Q′) ∈ R_1 ∘ R_2. By similar reasoning, if Q →_p Q′ then we can find P′ and Δ′ such that P →_p P′ and (P′, Γ, Δ′, Q′) ∈ R_1 ∘ R_2.
    (d)
    The union of typed probabilistic bisimulations is a typed probabilistic bisimulation. Assume (P, Γ, Δ, Q) ∈ ⋃_{i∈I} R_i. Then for some i ∈ I we have (P, Γ, Δ, Q) ∈ R_i. Consider P →_p P′; then for some Q′ and Δ′, since (P, Γ, Δ, Q) ∈ R_i, we have Q →_p Q′ and (P′, Γ, Δ′, Q′) ∈ R_i. Thus, (P′, Γ, Δ′, Q′) ∈ ⋃_{i∈I} R_i. By similar reasoning, if Q →_p Q′ then we can find P′ such that P →_p P′ and (P′, Γ, Δ′, Q′) ∈ R_i, namely (P′, Γ, Δ′, Q′) ∈ ⋃_{i∈I} R_i.
  • Since the union of typed probabilistic bisimulations is a typed probabilistic bisimulation, ≃ is a typed probabilistic bisimulation and it includes any other typed probabilistic bisimulation.
  • Proving that the relation ≃ is an equivalence requires showing that it satisfies reflexivity, symmetry and transitivity. We consider each of them in what follows:
    (a)
    Reflexivity: For any process P, P ≃ P results from the fact that the identity relation is a typed probabilistic bisimulation.
    (b)
    Symmetry: If P ≃ Q, then there exist Γ and Δ such that (P, Γ, Δ, Q) ∈ R for some typed probabilistic bisimulation R. Hence (Q, Γ, Δ, P) ∈ R⁻¹, and so Q ≃ P because the inverse relation is a typed probabilistic bisimulation.
    (c)
    Transitivity: If P ≃ R and R ≃ Q, then (P, Γ, Δ, R) ∈ R_1 and (R, Γ, Δ, Q) ∈ R_2 for some typed probabilistic bisimulations R_1 and R_2. Thus, (P, Γ, Δ, Q) ∈ R_1 ∘ R_2, and so P ≃ Q because the composition relation is a typed probabilistic bisimulation. □
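The relation operations used in Proposition 12 can be sketched concretely by representing typed relations as finite sets of tuples (P, Γ, Δ, Q). The strings below are opaque stand-ins for processes, sortings and typings, and inverse/compose are hypothetical helper names mirroring the definitions of R⁻¹ and R_1 ∘ R_2:

```python
# Typed probabilistic relations as finite sets of (P, Gamma, Delta, Q)
# tuples, with the inverse and composition from the notations above.

def inverse(rel):
    """R⁻¹: swap the two process components, keeping Γ and Δ."""
    return {(q, g, d, p) for (p, g, d, q) in rel}

def compose(rel1, rel2):
    """(P, Γ, Δ, Q) ∈ R1 ∘ R2 iff some R has (P, Γ, Δ, R) ∈ R1 and
    (R, Γ, Δ, Q) ∈ R2 (the same Γ and Δ in both tuples)."""
    return {(p, g, d, q)
            for (p, g, d, r) in rel1
            for (r2, g2, d2, q) in rel2
            if (r2, g2, d2) == (r, g, d)}

R1 = {("P", "Gamma", "Delta", "R")}
R2 = {("R", "Gamma", "Delta", "Q")}

assert compose(R1, R2) == {("P", "Gamma", "Delta", "Q")}
assert inverse(inverse(R1)) == R1
```

The transitivity argument of Proposition 12 is exactly the compose step: the intermediate process R is found with the same sorting and typing on both sides.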

4.5. Probabilistic Properties of Typed Processes

In what follows, we associate with each execution path (starting from a process P and leading to a process Q) an execution probability of going along the path. This probability is computed by using the operational semantics presented in Table 2.
Definition 9
(Evolution paths). A sequence of evolution transitions provides an evolution path ep = P_0 →_{p_1} P_1 ⋯ →_{p_k} P_k. Two evolution paths ep = P_0 →_{p_1} P_1 ⋯ →_{p_k} P_k and ep′ = P′_0 →_{p′_1} P′_1 ⋯ →_{p′_{k′}} P′_{k′} are identical if k = k′ and for all i ∈ {0, …, k} we have P_i ≡ P′_i and p_i = p′_i.
Definition 10
(Evolution probability). The probability of reaching a process P_j starting from a process P_i (with i < j) along the evolution path ep = P_i →_{p_{i+1}} P_{i+1} ⋯ →_{p_j} P_j is denoted by prob(ep, P_i, P_j) = p_{i+1} · … · p_j. The evolution probability of reaching P_j starting from a process P_i (with i < j) by taking all the possible evolution paths (without counting identical paths twice) is denoted by prob(P_i, P_j) = Σ_{ep} prob(ep, P_i, P_j).
For any process P, we can define the sets Reach k ( P ) of processes reached in k steps starting from P by using the rules of Table 2.
Definition 11
(Reachable sets). Reach_1(P) = {Q | P →_p Q}.
For k ≥ 2, Reach_k(P) = {Q | Q ∈ Reach_{k−1}(P) and Q ↛} ∪ {R | Q ∈ Reach_{k−1}(P) and Q →_p R}.
Example 17.
Consider from Example 13 the process
AB = def B(y, q) = B_1 in (def A(y) = A_1 in A_1{phoneA[r_A]/y}) | B_1{phoneA[r_B]/y, q_1/q}, and from Example 14 the process AB′ obtained from AB after a one-step reduction. Then, according to Definition 11, it holds that AB′ ∈ Reach_1(AB).
The following result guarantees that, for each well-typed restricted process, the sum of the probabilities of all possible transitions is 1. The restriction ensures that the types are derived from well-formed global types.
Theorem 7.
Given a well-typed process (νs)P such that Reach_k((νs)P) ≠ ∅ for all k ≥ 1, then
Σ_{Q ∈ Reach_k((νs)P)} prob((νs)P, Q) = 1 for every k ≥ 1.
Proof. 
By induction on the number k of the evolution steps.
  • Case k = 1 . The proof proceeds by cases depending on the value of nextProc ( ( ν s ) P ) .
    nextProc((νs)P) = 0. We have only one subcase:
    P = end. Since (νs)P = end, then Reach_1((νs)P) = ∅ and the sum is not computed.
    Since (νs)P is well-typed, this means that P ≢ c[r] ⊕_{i∈I} p_i:l_i⟨v_i⟩;P_i, and also that P ≢ c[r] &_{i∈I}{l_i(x_i);P_i}.
    nextProc((νs)P) = 1. We have two subcases:
    P = (def X(x̃) = P′ in (X(ṽc̃) | R)). Since nextProc((νs)P) = 1, it means that R is unable to evolve and only X(ṽc̃) can evolve by applying the rule (Call). Thus, by also using rule (Ctxt), it holds that (νs)P evolves with probability 1 to (νs)(def X(x̃) = P′ in (P′{ṽc̃/x̃} | R)). This means that Σ_{Q ∈ Reach_1((νs)P)} prob((νs)P, Q) = 1 (as desired).
    P = P_1 | P_2, where P_1 = s[r_1][r_2] ⊕_{j∈J} p_j:l_j⟨v_j⟩;P_j and P_2 = s[r_2][r_1] &_{i∈I}{l_i(x_i);P′_i}, meaning that the rules (Com) and (Ctxt) are applied and (νs)P evolves with probability p_k (with k ∈ J) to (νs)(P_k | P′_k{ṽ_k/x̃_k}). From the definition of P_1, it follows that Σ_{k∈J} p_k = 1. Thus, Σ_{Q ∈ Reach_1((νs)P)} prob((νs)P, Q) = Σ_{k∈J} p_k = 1 (as desired).
    nextProc((νs)P) = l, with l > 1. In this case, there is P_1 able to perform at least a (Call) or (Com) rule such that (νs)P = E[P_1] and nextProc(P_1) = m, where 1 ≤ m ≤ l. Since nextProc((νs)P) = l, then by removing P_1 from P, what remains (namely E[0]) is such that nextProc(E[0]) = l − m. Since (νs)P is well-typed, then Reach_1((νs)P) = Reach_1(E[0]) ∪ Reach_1(P_1). Thus, Σ_{Q ∈ Reach_1((νs)P)} prob((νs)P, Q) = Σ_{Q ∈ Reach_1(E[0])} prob((νs)P, Q) + Σ_{Q ∈ Reach_1(P_1)} prob((νs)P, Q) = (Σ_{1 ≤ i ≤ nextProc(E[0])} 1/nextProc((νs)P)) + m/nextProc((νs)P) = (l − m)/l + m/l = 1.
    The normalization appears due to the application of rule (Ctxt).
  • Case k > 1. Since Reach_k((νs)P) ≠ ∅, this means that Reach_{k−1}((νs)P) ≠ ∅. By the induction hypothesis, we obtain that Σ_{Q ∈ Reach_{k−1}((νs)P)} prob((νs)P, Q) = 1. Let us consider that another evolution step is performed, namely k steps starting from the process (νs)P. This means that Σ_{Q ∈ Reach_k((νs)P)} prob((νs)P, Q) can be broken into two parts: one computing the probability of the first (k − 1) steps, and another one using the probabilities obtained in the additional step k. Thus, it holds that Σ_{Q ∈ Reach_k((νs)P)} prob((νs)P, Q) = Σ_{Q_1 ∈ Reach_{k−1}((νs)P)} (prob((νs)P, Q_1) · Σ_{Q ∈ Reach_1(Q_1)} prob(Q_1, Q)). Since we have Σ_{Q ∈ Reach_1(Q_1)} prob(Q_1, Q) = 1, then
    Σ_{Q ∈ Reach_k((νs)P)} prob((νs)P, Q) = Σ_{Q_1 ∈ Reach_{k−1}((νs)P)} prob((νs)P, Q_1).
    Also, since we have Σ_{Q_1 ∈ Reach_{k−1}((νs)P)} prob((νs)P, Q_1) = 1, then
    Σ_{Q ∈ Reach_k((νs)P)} prob((νs)P, Q) = 1. □
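Definitions 10 and 11 and Theorem 7 can be checked directly on a finite toy transition system. The dictionary encoding and reach_probs below are our own sketch, not the paper's calculus; following the first clause of Reach_k, stuck processes stay in the reachable set and keep their accumulated probability mass.

```python
# A finite toy transition system: each process name maps to its list of
# probabilistic successors (probability, next process).

steps = {
    "P":   [(0.3, "Q1"), (0.7, "Q2")],   # P branches with 0.3 / 0.7
    "Q1":  [(1.0, "end")],
    "Q2":  [(1.0, "end")],
    "end": [],                           # stuck: no transitions
}

def reach_probs(proc, k):
    """Map each process in Reach_k(proc) to its total evolution probability,
    summing prob(ep, ...) over all evolution paths of length up to k; stuck
    processes keep their accumulated mass (first clause of Reach_k)."""
    dist = {proc: 1.0}
    for _ in range(k):
        new = {}
        for q, mass in dist.items():
            if not steps[q]:                      # q cannot evolve
                new[q] = new.get(q, 0.0) + mass
            for p, r in steps[q]:
                new[r] = new.get(r, 0.0) + mass * p
        dist = new
    return dist

assert reach_probs("P", 1) == {"Q1": 0.3, "Q2": 0.7}
# Theorem 7 on this system: the probabilities over Reach_k sum to 1 at
# every depth k.
assert all(abs(sum(reach_probs("P", k).values()) - 1.0) < 1e-9
           for k in range(1, 5))
```

After two steps all the mass has flowed into "end", and it stays there, which is why the sum remains 1 for every k rather than only for the depths at which transitions fire.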

5. Conclusions and Related Work

The non-probabilistic approach of multiparty session types [1] can be made more realistic in the presence of various uncertainties by using probabilities in processes and types. In our approach, we introduced a new kind of multiparty session type for a probabilistic process calculus that incorporates two forms of choice: probabilistic choice and nondeterministic choice. The main novelty of this approach is the use of interval probabilities in the type system in order to deal with uncertainty in the probabilistic process calculus, thus allowing several processes to be considered well-typed with respect to the same type, and also several types to be assigned to the same process. We also defined a probabilistic bisimulation between processes that are typed under the same sorting and typing. In our approach, the interval probabilities are used in such a way that the axioms of probability theory are satisfied. The new typing system represents a conservative extension of the standard typing system based on multiparty session types: removing the probabilities from processes does not affect their well-typedness. Moreover, the new typing system ensures deadlock freedom, type preservation and type safety.
The idea of interval probability is not new: the axioms of Kolmogorov [34], together with three additional axioms, represent the foundations of the theory of interval probabilities. This probabilistic approach adheres to the standard axioms for the probability of an event [24]: probabilities are real values from the closed interval [0, 1], and the probabilities of all outcomes sum to 1. A different approach to dealing with uncertainty is the use of lower and upper probabilities in the Dempster–Shafer theory [35]. Also, when Keynes proposed dealing with probabilities by using intervals [36], B. Russell described the work as "undoubtedly the most important work on probability that has appeared for a very long time". It is worth noting that in analytic philosophy, interval-valued functions [37] and lower/upper probabilities [38] were used in probabilistic judgements. A discussion of the differences between the two approaches (classic Bayesian and interval probability) is available in [39].
Regarding the use of probabilities in session types, there are only a few articles to mention. According to the authors of a recent article [14]: "To the best of our knowledge, Aman and Ciobanu [8,19] are the only ones who studied the concept of choreography and probabilities". The authors of [9] present a probabilistic variant of binary session types obtained by enriching the type system with fixed probabilities. The authors of [10] developed probabilistic binary session types for the runtime analysis of communicating processes, while those of [12] developed a meta-theory of probabilistic session types for automatic expected resource analysis. The authors of [11] enriched session typing with probabilistic choices for modelling cryptographic experiments, while in [13] the termination problem is investigated in a calculus of sessions with probabilistic choices.
On the other hand, there exist several articles presenting probabilistic process calculi (see [40] and its references). In the probabilistic π-calculus of [41], the defined types are useful in identifying nondeterministic probabilistic behaviours such that, in the framework of event structures, the compositionality of the parallel operator is preserved. The authors propose an initial step toward developing an effective typing discipline that incorporates probabilistic name passing by utilizing Segala automata [42] and probabilistic event structures. In comparison with that work, we simplify the approach and work directly with processes, providing a probabilistic typing in the context of multiparty session types. Additionally, we introduce interval probability in the framework of multiparty session types.

Author Contributions

These authors contributed equally to this work. Conceptualization, G.C.; formal analysis, B.A.; investigation, B.A. and G.C.; writing—original draft preparation, B.A.; writing—review and editing, G.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

No new data were created.

Acknowledgments

The authors are grateful to the anonymous reviewers for their careful reading of our manuscript, their insightful comments and suggestions that improved the manuscript.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Honda, K.; Yoshida, N.; Carbone, M. Multiparty Asynchronous Session Types. J. ACM 2016, 63, 9:1–9:67. [Google Scholar] [CrossRef]
  2. Fowler, S.; Lindley, S.; Morris, J.G.; Decova, S. Exceptional asynchronous session types: Session types without tiers. Proc. ACM Program. Lang. 2019, 3, 28:1–28:29. [Google Scholar] [CrossRef]
  3. Viering, M.; Hu, R.; Eugster, P.; Ziarek, L. A multiparty session typing discipline for fault-tolerant event-driven distributed programming. Proc. ACM Program. Lang. 2021, 5, 1–30. [Google Scholar] [CrossRef]
  4. Barwell, A.D.; Scalas, A.; Yoshida, N.; Zhou, F. Generalised Multiparty Session Types with Crash-Stop Failures. In Proceedings of the 33rd International Conference on Concurrency Theory, CONCUR, LIPIcs, Warsaw, Poland, 12–16 September 2022; Klin, B., Lasota, S., Muscholl, A., Eds.; Schloss Dagstuhl-Leibniz-Zentrum für Informatik: Wadern, Germany, 2022; Volume 243, pp. 35:1–35:25. [Google Scholar] [CrossRef]
  5. Barwell, A.D.; Hou, P.; Yoshida, N.; Zhou, F. Designing Asynchronous Multiparty Protocols with Crash-Stop Failures. In Proceedings of the 37th European Conference on Object-Oriented Programming, ECOOP, LIPIcs, Seattle, WA, USA, 17–21 July 2023; Ali, K., Salvaneschi, G., Eds.; Schloss Dagstuhl-Leibniz-Zentrum für Informatik: Wadern, Germany, 2023; Volume 263, pp. 1:1–1:30. [Google Scholar] [CrossRef]
  6. Brun, M.A.L.; Dardha, O. MAGπ: Types for Failure-Prone Communication. In Proceedings of the 32nd European Symposium on Programming on Programming Languages and Systems, ESOP 2023, Held as Part of the European Joint Conferences on Theory and Practice of Software, ETAPS, Paris, France, 22–27 April 2023; Wies, T., Ed.; Lecture Notes in Computer Science. Springer: Berlin/Heidelberg, Germany, 2023; Volume 13990, pp. 363–391. [Google Scholar] [CrossRef]
  7. Pears, J.; Bocchi, L.; King, A. Safe Asynchronous Mixed-Choice for Timed Interactions. In International Conference on Coordination Languages and Models, Proceedings of the 25th IFIP WG 6.1 International Conference, COORDINATION 2023, Held as Part of the 18th International Federated Conference on Distributed Computing Techniques, DisCoTec 2023, Lisbon, Portugal, 19–23 June 2023; Jongmans, S., Lopes, A., Eds.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2023; Volume 13908, pp. 214–231. [Google Scholar] [CrossRef]
  8. Aman, B.; Ciobanu, G. Probabilities in Session Types. In Proceedings of the Third Symposium on Working Formal Methods, FROM 2019, EPTCS, Timişoara, Romania, 3–5 September 2019; Marin, M., Craciun, A., Eds.; Volume 303, pp. 92–106. [Google Scholar] [CrossRef]
  9. Inverso, O.; Melgratti, H.C.; Padovani, L.; Trubiani, C.; Tuosto, E. Probabilistic Analysis of Binary Sessions. In Proceedings of the 31st International Conference on Concurrency Theory, CONCUR 2020, LIPIcs, Vienna, Austria (Virtual Conference), 1–4 September 2020; Konnov, I., Kovács, L., Eds.; Schloss Dagstuhl-Leibniz-Zentrum für Informatik: Wadern, Germany, 2020; Volume 171, pp. 14:1–14:21. [Google Scholar] [CrossRef]
  10. Burlò, C.B.; Francalanza, A.; Scalas, A.; Trubiani, C.; Tuosto, E. Towards Probabilistic Session-Type Monitoring. In International Conference on Coordination Languages and Models, Proceedings of the 23rd IFIP WG 6.1 International Conference, COORDINATION 2021, Held as Part of the 16th International Federated Conference on Distributed Computing Techniques, DisCoTec 2021, Valletta, Malta, 14–18 June 2021; Damiani, F., Dardha, O., Eds.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2021; Volume 12717, pp. 106–120. [Google Scholar] [CrossRef]
  11. Dal Lago, U.; Giusti, G. On Session Typing, Probabilistic Polynomial Time and Cryptographic Experiments. In Proceedings of the 33rd International Conference on Concurrency Theory, CONCUR 2022, LIPIcs, Warsaw, Poland, 12–16 September 2022; Klin, B., Lasota, S., Muscholl, A., Eds.; Schloss Dagstuhl-Leibniz-Zentrum für Informatik: Wadern, Germany, 2022; Volume 243, pp. 37:1–37:18. [Google Scholar] [CrossRef]
  12. Das, A.; Wang, D.; Hoffmann, J. Probabilistic Resource-Aware Session Types. Proc. ACM Program. Lang. 2023, 7, 1925–1956. [Google Scholar] [CrossRef]
  13. Lago, U.D.; Padovani, L. On the Almost-Sure Termination of Binary Sessions. In Proceedings of the 26th International Symposium on Principles and Practice of Declarative Programming, PPDP 2024, Milano, Italy, 9–11 September 2024; Bruni, A., Momigliano, A., Pradella, M., Rossi, M., Cheney, J., Eds.; ACM: New York, NY, USA, 2024; pp. 9:1–9:12. [Google Scholar] [CrossRef]
  14. Carbone, M.; Veschetti, A. A Probabilistic Choreography Language for PRISM. In International Conference on Coordination Models and Languages, Proceedings of the 26th IFIP WG 6.1 International Conference, COORDINATION 2024, Held as Part of the 19th International Federated Conference on Distributed Computing Techniques, DisCoTec 2024, Groningen, The Netherlands, 17–21 June 2024; Castellani, I., Tiezzi, F., Eds.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2024; Volume 14676, pp. 20–37. [Google Scholar] [CrossRef]
  15. Jonsson, B.; Larsen, K.G. Specification and Refinement of Probabilistic Processes. In Proceedings of the 1991 Sixth Annual IEEE Symposium on Logic in Computer Science, Amsterdam, The Netherlands, 15–18 July 1991; pp. 266–277. [Google Scholar] [CrossRef]
  16. Yi, W. Algebraic Reasoning for Real-Time Probabilistic Processes with Uncertain Information. In Proceedings of the International Symposium on Formal Techniques in Real-Time and Fault-Tolerant Systems, Third International Symposium Organized Jointly with the Working Group Provably Correct Systems-ProCos, Lübeck, Germany, 19–23 September 1994; Langmaack, H., de Roever, W.P., Vytopil, J., Eds.; Lecture Notes in Computer Science. Springer: Cham, Switzerland, 1994; Volume 863, pp. 680–693. [Google Scholar] [CrossRef]
  17. Cauchi, N.; Laurenti, L.; Lahijanian, M.; Abate, A.; Kwiatkowska, M.; Cardelli, L. Efficiency through uncertainty: Scalable formal synthesis for stochastic hybrid systems. In Proceedings of the 22nd ACM International Conference on Hybrid Systems: Computation and Control, HSCC 2019, Montreal, QC, Canada, 16–18 April 2019; Ozay, N., Prabhakar, P., Eds.; ACM: New York, NY, USA, 2019; pp. 240–251. [Google Scholar] [CrossRef]
  18. Stoelinga, M. An Introduction to Probabilistic Automata. Bull. EATCS 2002, 78, 176–198. [Google Scholar]
  19. Aman, B.; Ciobanu, G. Interval Probability for Sessions Types. In Logic, Language, Information and Computation, Proceedings of the 28th International Workshop, WoLLIC 2022, Iaşi, Romania, 20–23 September 2022; Ciabattoni, A., Pimentel, E., de Queiroz, R.J.G.B., Eds.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2022; Volume 13468, pp. 123–140. [Google Scholar] [CrossRef]
  20. Scalas, A.; Dardha, O.; Hu, R.; Yoshida, N. A Linear Decomposition of Multiparty Sessions for Safe Distributed Programming. In Proceedings of the 31st European Conference on Object-Oriented Programming, ECOOP 2017, LIPIcs, Barcelona, Spain, 18–23 June 2017; Müller, P., Ed.; Schloss Dagstuhl-Leibniz-Zentrum für Informatik: Wadern, Germany, 2017; Volume 74, pp. 24:1–24:31. [Google Scholar] [CrossRef]
  21. Ghilezan, S.; Jaksic, S.; Pantovic, J.; Scalas, A.; Yoshida, N. Precise subtyping for synchronous multiparty sessions. J. Log. Algebr. Methods Progr. 2019, 104, 127–173. [Google Scholar] [CrossRef]
  22. Coppo, M.; Dezani-Ciancaglini, M.; Yoshida, N.; Padovani, L. Global progress for dynamically interleaved multiparty sessions. Math. Struct. Comput. Sci. 2016, 26, 238–302. [Google Scholar] [CrossRef]
  23. Wallsten, T.S.; Budescu, D.V.; Zwick, R.; Kemp, S.M. Preferences and reasons for communicating probabilistic information in verbal or numerical terms. Bull. Psychon. Soc. 1993, 31, 135–138. [Google Scholar] [CrossRef]
  24. Weichselberger, K. The theory of interval-probability as a unifying concept for uncertainty. Int. J. Approx. Reason. 2000, 24, 149–170. [Google Scholar] [CrossRef]
  25. de Campos, L.M.; Huete, J.F.; Moral, S. Probability Intervals: A Tool for uncertain Reasoning. Int. J. Uncertain. Fuzziness Knowl.-Based Syst. 1994, 2, 167–196. [Google Scholar] [CrossRef]
  26. Lee, P.M. Bayesian Statistics: An Introduction, 4th ed.; Wiley: Hoboken, NJ, USA, 2012. [Google Scholar]
  27. Pierce, B.C. Types and Programming Languages; MIT Press: Cambridge, MA, USA, 2002. [Google Scholar]
  28. Scalas, A.; Yoshida, N. Less is more: Multiparty session types revisited. Proc. ACM Program. Lang. 2019, 3, 30:1–30:29. [Google Scholar] [CrossRef]
  29. Bejleri, A.; Yoshida, N. Synchronous Multiparty Session Types. In Proceedings of the First Workshop on Programming Language Approaches to Concurrency and Communication-cEntric Software, PLACES@DisCoTec, Oslo, Norway, 7 June 2008; Vasconcelos, V.T., Yoshida, N., Eds.; Electronic Notes in Theoretical Computer Science. Elsevier: Amsterdam, The Netherlands, 2008; Volume 241, pp. 3–33. [Google Scholar] [CrossRef]
  30. Dagnino, F.; Giannini, P.; Dezani-Ciancaglini, M. Deconfined Global Types for Asynchronous Sessions. Log. Methods Comput. Sci. 2023, 19. [Google Scholar] [CrossRef]
  31. Demangeon, R.; Honda, K. Nested Protocols in Session Types. In Proceedings of the 23rd International Conference on Concurrency Theory, CONCUR 2012, Newcastle upon Tyne, UK, 4–7 September 2012; Koutny, M., Ulidowski, I., Eds.; Lecture Notes in Computer Science. Springer: Cham, Switzerland, 2012; Volume 7454, pp. 272–286. [Google Scholar] [CrossRef]
  32. Bocchi, L.; Honda, K.; Tuosto, E.; Yoshida, N. A Theory of Design-by-Contract for Distributed Multiparty Interactions. In Proceedings of the 21th International Conference on Concurrency Theory, CONCUR 2010, Paris, France, 31 August–3 September 2010; Gastin, P., Laroussinie, F., Eds.; Lecture Notes in Computer Science. Springer: Cham, Switzerland, 2010; Volume 6269, pp. 162–176. [Google Scholar] [CrossRef]
  33. Berger, M.; Yoshida, N. Timed, Distributed, Probabilistic, Typed Processes. In Programming Languages and Systems, Proceedings of the 5th Asian Symposium, APLAS 2007, Singapore, 29 November–1 December 2007; Shao, Z., Ed.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2007; Volume 4807, pp. 158–174. [Google Scholar] [CrossRef]
  34. Kolmogorov, A. Foundations of the Theory of Probability; Chelsea Pub Co.: New York, NY, USA, 1950. [Google Scholar]
  35. Dempster, A.P. Upper and Lower Probabilities Induced by a Multivalued Mapping. In Classic Works of the Dempster–Shafer Theory of Belief Functions; Yager, R.R., Liu, L., Eds.; Studies in Fuzziness and Soft Computing; Springer: Berlin/Heidelberg, Germany, 2008; Volume 219, pp. 57–72. [Google Scholar] [CrossRef]
  36. Keynes, J.M. A Treatise on Probability; Macmillan and Co.: New York, NY, USA, 1921. [Google Scholar]
  37. Levi, I. On Indeterminate Probabilities. J. Philos. 1974, 71, 391–418. [Google Scholar] [CrossRef]
  38. Wolfenson, M.; Fine, T.L. Bayes-like Decision Making with Upper and Lower Probabilities. J. Am. Stat. Assoc. 1982, 77, 80–88. [Google Scholar] [CrossRef]
  39. Halpern, J.Y. Reasoning About Uncertainty; MIT Press: Cambridge, MA, USA, 2005. [Google Scholar]
  40. Bernardo, M.; De Nicola, R.; Loreti, M. Revisiting bisimilarity and its modal logic for nondeterministic and probabilistic processes. Acta Inform. 2015, 52, 61–106. [Google Scholar] [CrossRef]
  41. Varacca, D.; Yoshida, N. Probabilistic pi-Calculus and Event Structures. Electron. Notes Theor. Comput. Sci. 2007, 190, 147–166. [Google Scholar] [CrossRef]
  42. Segala, R.; Lynch, N.A. Probabilistic Simulations for Probabilistic Processes. Nord. J. Comput. 1995, 2, 250–273. [Google Scholar]
Figure 1. Dotted lines stand for probabilistic choices and contain the probability associated with a choice; rectangles with names in their top-left corners stand for chosen labels.
Figure 2. Dashed lines stand for interval probabilistic choices and contain the probability interval associated with the choice; rectangles with names in their top-left corners stand for chosen labels.
Table 1. Syntax of the probabilistic multiparty session calculus.
Processes
P, Q ::= c[r] ⊕_{i∈I} p_i : l_i⟨v_i⟩.P_i    (selection towards role r; I ≠ ∅, Σ_{i∈I} p_i = 1)
    | c[r] &_{i∈I} { l_i(x_i).P_i }    (branching from role r; I ≠ ∅)
    | def D in P  |  X(ṽ c̃)    (process definition & process call)
    | P | Q  |  (ν s)P    (parallel & restriction)
    | 0  |  err    (inaction & error)
Declarations
D ::= X(x̃) = P    (process declaration)
Contexts
E ::= [ ]  |  (ν s)E  |  def D in E  |  E | P    (evaluation context)
Channels
c ::= x  |  s[r]    (variable; channel with role r)
Values
v ::= c  |  true  |  false  |  3  |  …    (channel, base value)
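The process grammar of Table 1 can be encoded as an algebraic datatype. The sketch below is ours, not the paper's (all class and field names are hypothetical), and shows only the two communication forms and inaction.

```python
from dataclasses import dataclass

# Illustrative encoding of part of the Table 1 grammar (names are ours).

@dataclass
class Select:          # c[r] ⊕_{i∈I} p_i : l_i⟨v_i⟩.P_i
    channel: str
    role: str
    branches: dict     # label -> (probability, value sent, continuation)

@dataclass
class Branch:          # c[r] &_{i∈I} { l_i(x_i).P_i }
    channel: str
    role: str
    branches: dict     # label -> (bound variable, continuation)

@dataclass
class Inact:           # 0
    pass

# A selection on channel s towards role r offering a single label with probability 1.
sel = Select("s", "r", {"ok": (1.0, True, Inact())})
```

A well-formedness pass over such a tree could then check the side conditions of Table 1, e.g. that the probabilities in each `Select` sum to 1.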
Table 2. Operational semantics of the probabilistic multiparty session calculus.
s[r₁][r₂] ⊕_{j∈J} p_j : l_j⟨v_j⟩.P_j | s[r₂][r₁] &_{i∈I} { l_i(x_i).P′_i }  →_{p_k}  P_k | P′_k{ṽ_k/x̃_k}    (if k ∈ J ∩ I)    (Com)
def X(x̃) = P in (X(ṽ c̃) | Q)  →₁  def X(x̃) = P in (P{ṽ c̃/x̃} | Q)    (if x̃ = x₁ … x_n and ṽ c̃ = v₁ … v_i c_{i+1} … c_n)    (Call)
P →_p P′ implies E[P] →_{p′} E[P′]    (where p′ = p · nextProc(P) / nextProc(E[P]))    (Ctxt)
P ≡ P′ and P′ →_p Q′ and Q′ ≡ Q implies P →_p Q    (Struct)
s[r₁][r₂] ⊕_{j∈J} p_j : l_j⟨v_j⟩.P_j | s[r₂][r₁] &_{i∈I} { l_i(x_i).P′_i }  →_{p_k}  err    (if k ∈ J and k ∉ I)    (Err)
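The rules (Com) and (Err) of Table 2 can be simulated directly: a label k is drawn among the offered labels with probability p_k, and the step succeeds only if the branching side accepts k. The sketch below is illustrative (function and variable names are ours, not the paper's).

```python
import random

def com_step(selection, branch_labels, rng=random):
    """Simulate one (Com)/(Err) step.
    selection: dict label -> probability, summing to 1 (side condition of Table 1);
    branch_labels: set of labels accepted by the branching side."""
    labels = list(selection)
    weights = [selection[l] for l in labels]
    assert abs(sum(weights) - 1.0) < 1e-9, "selection probabilities must sum to 1"
    # Draw label k with probability p_k, as in the →_{p_k} transition.
    k = rng.choices(labels, weights=weights)[0]
    # (Com) if k ∈ J ∩ I, (Err) if k ∉ I.
    return (k, "ok" if k in branch_labels else "err")

label, outcome = com_step({"accept": 0.7, "reject": 0.3}, {"accept", "reject"})
```

With a branching side that omits one of the offered labels, the same call reaches `err` with the probability of that label, mirroring rule (Err).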
Table 3. Global types of the session typing.
Global    G ::= r₁ → r₂ : { ι_i : l_i(S_i).G_i }_{i∈I}  |  μt.G  |  t  |  end
Sorts     S ::= bool  |  nat  |  …
Table 4. Local types of the session typing.
Local     T ::= r ⊕_{i∈I} ι_i : !l_i⟨S_i⟩.T_i  |  r &_{i∈I} ?l_i(S_i).T_i  |  μt.T  |  t  |  end
Sorts     S ::= nat  |  bool  |  …
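The interval annotations ι_i = [l_i, u_i] of the local types only make sense if some concrete distribution fits them. The sketch below (names are ours) checks the standard coherence condition for probability intervals [25], Σ l_i ≤ 1 ≤ Σ u_i, and the p_i ∈ ι_i premise used by rule (TSelect) in Table 5.

```python
def coherent(intervals):
    """intervals: list of (low, high) pairs with 0 <= low <= high <= 1.
    True iff some distribution p with sum(p) = 1 satisfies low_i <= p_i <= high_i."""
    assert all(0 <= lo <= hi <= 1 for lo, hi in intervals)
    return sum(lo for lo, _ in intervals) <= 1 <= sum(hi for _, hi in intervals)

def point_in(intervals, probs, eps=1e-9):
    """Check that a concrete distribution fits the interval annotations,
    as required by the p_i ∈ ι_i premise of (TSelect)."""
    return (abs(sum(probs) - 1) < eps
            and all(lo - eps <= p <= hi + eps
                    for (lo, hi), p in zip(intervals, probs)))
```

For instance, the intervals [0.2, 0.5] and [0.4, 0.9] are coherent (0.6 ≤ 1 ≤ 1.4), while [0.6, 0.7] and [0.5, 0.8] are not, since their lower bounds already sum to more than 1.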
Table 5. Typing system for processes in probabilistic multiparty sessions.
(TVar)  Γ, x : S ⊢ x : S        (TVal)  Γ ⊢ v : S
(TSelect)  ∀i. Γ ⊢ v_i : S_i    ∀i. Γ ⊢ P_i ▹ Δ, c : T_i    ∀i. p_i ∈ ι_i
    ⟹  Γ ⊢ c[r] ⊕_{i∈I} p_i : l_i⟨v_i⟩.P_i ▹ Δ, c : r ⊕_{i∈I} ι_i : !l_i⟨S_i⟩.T_i
(TBranch)  ∀i. Γ, x_i : S_i ⊢ P_i ▹ Δ, c : T_i    I ⊆ J
    ⟹  Γ ⊢ c[r] &_{j∈J} { l_j(x_j).P_j } ▹ Δ, c : r &_{i∈I} ?l_i(S_i).T_i
(TEnd)  Δ end only  ⟹  Γ ⊢ 0 ▹ Δ        (TConc)  Γ ⊢ P ▹ Δ and Γ ⊢ Q ▹ Δ′  ⟹  Γ ⊢ P | Q ▹ Δ, Δ′
(TRes)  Γ ⊢ P ▹ Δ, G⟨s⟩  ⟹  Γ ⊢ (ν s)P ▹ Δ
(TDef)  Γ, x̃ : S̃, X : S̃ T₁ … T_n ⊢ P ▹ c₁ : T₁, …, c_n : T_n    Γ, X : S̃ T₁ … T_n ⊢ Q ▹ Δ
    ⟹  Γ ⊢ def X(x̃ c₁ … c_n) = P in Q ▹ Δ
(TCall)  Γ ⊢ ṽ : S̃    Δ end only  ⟹  Γ, X : S̃ T₁ … T_n ⊢ X(ṽ c₁ … c_n) ▹ Δ, c₁ : T₁, …, c_n : T_n
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
