Decision Trees for Binary Subword-Closed Languages

In this paper, we study arbitrary subword-closed languages over the alphabet {0,1} (binary subword-closed languages). For the set L(n) of words of length n belonging to a binary subword-closed language L, we investigate the depth of the decision trees solving the recognition and the membership problems deterministically and nondeterministically. In the case of the recognition problem, for a given word from L(n), we should recognize it using queries, each of which, for some i ∈ {1, ..., n}, returns the ith letter of the word. In the case of the membership problem, for a given word of length n over the alphabet {0,1}, we should decide whether it belongs to the set L(n) using the same queries. With the growth of n, the minimum depth of the decision trees solving the problem of recognition deterministically is either bounded from above by a constant, or grows logarithmically, or grows linearly. For the other types of trees and problems (decision trees solving the problem of recognition nondeterministically and decision trees solving the membership problem deterministically or nondeterministically), with the growth of n, the minimum depth of the decision trees is either bounded from above by a constant or grows linearly. We study the joint behavior of the minimum depths of the four considered types of decision trees and describe five complexity classes of binary subword-closed languages.


Introduction
In this paper, we study arbitrary binary languages (languages over the alphabet E = {0, 1}) that are subword closed: if a word w_1 u_1 w_2 ... w_m u_m w_{m+1} belongs to a language, then the word u_1 ... u_m also belongs to this language. Subword-closed languages have attracted the attention of researchers in the field of formal languages for many years [1][2][3][4][5].
For the set L(n) of words of length n belonging to a binary subword-closed language L, we investigate the depth of the decision trees solving the recognition and the membership problems deterministically and nondeterministically. In the case of the recognition problem, for a given word from L(n), we should recognize it using queries, each of which, for some i ∈ {1, ..., n}, returns the ith letter of the word. In the case of the membership problem, for a given word of length n over the alphabet E, we should decide whether it belongs to L(n) using the same queries.
For an arbitrary binary subword-closed language, with the growth of n, the minimum depth of the decision trees solving the problem of recognition deterministically is either bounded from above by a constant, or grows logarithmically, or grows linearly. For the other types of trees and problems (decision trees solving the problem of recognition nondeterministically and decision trees solving the membership problem deterministically or nondeterministically), with the growth of n, the minimum depth of the decision trees is either bounded from above by a constant or grows linearly. We study the joint behavior of the minimum depths of the four considered types of decision trees and describe five complexity classes of binary subword-closed languages.
In [6], the following results were announced without proof. For an arbitrary regular language, with the growth of n, (i) the minimum depth of the decision trees solving the problem of recognition deterministically is either bounded from above by a constant, or grows logarithmically, or grows linearly, and (ii) the minimum depth of the decision trees solving the problem of recognition nondeterministically is either bounded from above by a constant or grows linearly. Proofs for the case of decision trees solving the problem of recognition deterministically can be found in [7,8]. To apply these results to a given regular language, it is necessary to know a deterministic finite automaton (DFA) accepting this language.
Each subword-closed language over a finite alphabet is a regular language [3]. In this paper, we do not assume that binary subword-closed languages are given by DFAs, so we cannot use the results from [6][7][8]. Instead, for binary subword-closed languages, we describe simple criteria for the behavior of the minimum depths of the decision trees solving the problems of recognition and membership deterministically and nondeterministically.
This paper is a theoretical work related to the field of formal languages. It has no direct applications. In the theory of formal languages, various parameters of languages are studied, in particular the growth of the number of words of the language with the growth of the length of words and, for regular languages, the minimum number of states of the automaton accepting the language. For many years, the author has been introducing new parameters of languages into scientific use: the minimum depth of deterministic and nondeterministic decision trees for the recognition and membership problems related to the language [6][7][8][9]. The present paper continues this line of research.
There is now an extensive collection of methods for constructing decision trees. It includes (i) a variety of greedy heuristics based on measures of uncertainty, such as entropy and the Gini index [10][11][12], (ii) exact optimization algorithms based on dynamic programming, branch-and-bound search, SAT-based methods, etc. [13][14][15][16], and (iii) approximate optimization algorithms with accuracy bounds that are applicable to obtaining theoretical results about the complexity of decision trees [8,17].
In this paper, we find simple combinatorial parameters of binary subword-closed languages that make it possible to obtain bounds on the depth of the decision trees without using the effective but rather complicated methods developed in the monographs [8,17].
The rest of this paper is organized as follows. In Section 2, we consider the main notions; in Section 3, the main results; in Section 4, the proofs; and in Section 5, short conclusions.

Main Notions
Let ω = {0, 1, 2, ...} be the set of nonnegative integers and E = {0, 1}. By E*, we denote the set of all finite words over the alphabet E, including the empty word λ; by E*(n), we denote the set of words from E* of length n. Any subset L of the set E* is called a binary language. This language is called subword closed if, for any word w_1 u_1 w_2 ... w_m u_m w_{m+1} belonging to L, the word u_1 ... u_m also belongs to L, where w_i, u_j ∈ E*, i = 1, ..., m + 1, j = 1, ..., m. For any natural n, we denote by L(n) the set of words from L whose length is equal to n. We consider two problems related to the set L(n). The problem of recognition: for a given word from L(n), we should recognize it using attributes (queries) l^n_1, ..., l^n_n, where l^n_i, i ∈ {1, ..., n}, is the function from E*(n) to E such that l^n_i(a_1 ... a_n) = a_i for any word a_1 ... a_n ∈ E*(n). The problem of membership: for a given word from E*(n), we should decide whether this word belongs to the set L(n) using the same attributes. To solve these problems, we use decision trees over L(n).
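As an aside for readers who wish to experiment, the subword relation and the closure property are easy to check by brute force on finite sets of words. The following Python sketch is our own illustration, not part of the paper; the closure check relies on the fact that closure under deleting a single letter implies closure under deleting any subset of letters.

```python
def is_subword(u, w):
    """True iff u can be obtained from w by deleting letters,
    i.e., u = u_1...u_m is a scattered subword of w."""
    it = iter(w)
    return all(c in it for c in u)  # each letter of u must appear, in order

def is_subword_closed(words):
    """Brute-force check that a finite set of words is subword closed:
    deleting any single letter from a word must stay inside the set."""
    ws = set(words)
    return all(w[:i] + w[i + 1:] in ws
               for w in ws for i in range(len(w)))
```

For example, `is_subword_closed({"", "0", "1", "01"})` holds, while `{"", "01"}` fails because the one-letter subwords of "01" are missing.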
A decision tree over L(n) is a marked finite directed tree with a root that has the following properties:
• The root and the edges leaving the root are not labeled.
• Each node that is neither the root nor a terminal node is labeled with an attribute from the set {l^n_1, ..., l^n_n}.
• Each edge leaving a node that is not the root is labeled with a number from E.
A decision tree over L(n) is called deterministic if it satisfies the following conditions:
• Exactly one edge leaves the root.
• For any node that is neither the root nor a terminal node, the edges leaving this node are labeled with pairwise different numbers.
Let Γ be a decision tree over L(n). A complete path in Γ is any sequence ξ = v_0, e_0, ..., v_m, e_m, v_{m+1} of nodes and edges of Γ such that v_0 is the root, v_{m+1} is a terminal node, and v_i is the initial and v_{i+1} the terminal node of the edge e_i for i = 0, ..., m. We define a subset E(n, ξ) of the set E*(n) in the following way: if m = 0, then E(n, ξ) = E*(n). Let m > 0, the attribute l^n_{i_j} be assigned to the node v_j, and b_j be the number assigned to the edge e_j, j = 1, ..., m. Then, E(n, ξ) = {a_1 ... a_n ∈ E*(n) : a_{i_1} = b_1, ..., a_{i_m} = b_m}.
Let L(n) ≠ ∅. We say that a decision tree Γ over L(n) solves the problem of recognition for L(n) nondeterministically if Γ satisfies the following conditions:
• Each terminal node of Γ is labeled with a word from L(n).
• For any word w ∈ L(n), there exists a complete path ξ in the tree Γ such that w ∈ E(n, ξ).
• For any word w ∈ L(n) and for any complete path ξ in the tree Γ such that w ∈ E(n, ξ), the terminal node of the path ξ is labeled with the word w.
We say that a decision tree Γ over L(n) solves the problem of recognition for L(n) deterministically if Γ is a deterministic decision tree that solves the problem of recognition for L(n) nondeterministically.
We say that a decision tree Γ over L(n) solves the problem of membership for L(n) nondeterministically if Γ satisfies the following conditions:
• For any word w ∈ E*(n), there exists a complete path ξ in the tree Γ such that w ∈ E(n, ξ).
• For any word w ∈ E*(n) and for any complete path ξ in the tree Γ such that w ∈ E(n, ξ), the terminal node of the path ξ is labeled with the number 1 if w ∈ L(n) and with the number 0 otherwise.
We say that a decision tree Γ over L(n) solves the problem of membership for L(n) deterministically if Γ is a deterministic decision tree which solves the problem of membership for L(n) nondeterministically.
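The sets E(n, ξ) introduced above can be computed by brute force for small n: fixing the positions queried along a complete path together with the recorded answers determines exactly which words are consistent with that path. A small illustrative Python sketch (the function name is ours, not from the paper):

```python
from itertools import product

def path_word_set(n, answers):
    """Return E(n, xi) for a complete path whose internal nodes query the
    1-based positions i with recorded answers b, given as pairs (i, b)."""
    return {"".join(w) for w in product("01", repeat=n)
            if all(w[i - 1] == b for i, b in answers)}
```

For instance, a path that queried position 1 with answer "0" and position 3 with answer "1" in words of length 3 is consistent with exactly the words {"001", "011"}.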
Let Γ be a decision tree over L(n). We denote by h(Γ) the maximum number of nodes in a complete path in Γ that are neither the root nor a terminal node. The value h(Γ) is called the depth of the decision tree Γ.
We denote by h^ra_L(n) (h^rd_L(n)) the minimum depth of a decision tree that solves the problem of recognition for L(n) nondeterministically (deterministically). If L(n) = ∅, then h^ra_L(n) = h^rd_L(n) = 0. We denote by h^ma_L(n) (h^md_L(n)) the minimum depth of a decision tree that solves the problem of membership for L(n) nondeterministically (deterministically). If L(n) = ∅, then h^ma_L(n) = h^md_L(n) = 0.

Main Results
Let L be a binary subword-closed language. For any a ∈ E and i ∈ ω, we denote by a^i the word a ... a of length i (if i = 0, then a^i = λ). For any a ∈ E, let ā = 1 if a = 0 and ā = 0 if a = 1.
We define the parameter Hom(L) of the language L, which is called the homogeneity dimension of the language L. If, for each natural number m, there exists a ∈ E such that the word a^m ā a^m belongs to L, then Hom(L) = ∞. Otherwise, Hom(L) is the maximum number m ∈ ω such that there exists a ∈ E for which the word a^m ā a^m belongs to L. If L = ∅, then Hom(L) = 0.
We now define the parameter Het(L) of the language L, which is called the heterogeneity dimension of the language L. If, for each natural number m, there exists a ∈ E such that the word a^m ā^m belongs to L, then Het(L) = ∞. Otherwise, Het(L) is the maximum number m ∈ ω such that there exists a ∈ E for which the word a^m ā^m belongs to L. If L = ∅, then Het(L) = 0. For a binary subword-closed language L, we denote by L^C its complementary language E* \ L. The notation |L| = ∞ means that L is an infinite language, and the notation |L| < ∞ means that L is a finite language.
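Given membership access to a language, both parameters can be estimated by direct search up to a chosen horizon. The sketch below is our own illustration (the names and the truncation convention are assumptions, not the paper's): a result equal to the horizon suggests that the parameter may be infinite. As an example, we use the language of all words of the form 0^i 1^j, for which Hom is 0 while Het is infinite.

```python
def hom_up_to(member, bound):
    """Largest m <= bound with a^m abar a^m in L for some a in {0, 1}:
    Hom(L) truncated to the search horizon; `member` tests membership in L."""
    return max((m for m in range(1, bound + 1)
                if member("0" * m + "1" + "0" * m)
                or member("1" * m + "0" + "1" * m)),
               default=0)

def het_up_to(member, bound):
    """Largest m <= bound with a^m abar^m in L for some a in {0, 1}:
    Het(L) truncated to the search horizon."""
    return max((m for m in range(1, bound + 1)
                if member("0" * m + "1" * m) or member("1" * m + "0" * m)),
               default=0)

# The subword-closed language of all words of the form 0^i 1^j
# (equivalently, all words with no occurrence of the factor "10"):
sorted_lang = lambda w: "10" not in w
```

Here `hom_up_to(sorted_lang, 20)` is 0 while `het_up_to(sorted_lang, 20)` reaches the horizon, which corresponds to the case Hom(L) < ∞ and Het(L) = ∞ of the recognition problem.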
Theorem 1. Let L be a binary subword-closed language.
(a) If Hom(L) = ∞, then h^rd_L(n) = h^ra_L(n) = n for any natural n.
(b) If Hom(L) < ∞ and Het(L) = ∞, then h^rd_L(n) = Θ(log n) and h^ra_L(n) = O(1).
(c) If Hom(L) < ∞ and Het(L) < ∞, then h^rd_L(n) = O(1) and h^ra_L(n) = O(1).

Theorem 2. Let L be a binary subword-closed language.
(a) If |L| = ∞ and L^C ≠ ∅, then h^md_L(n) = Θ(n) and h^ma_L(n) = Θ(n).
(b) If |L| < ∞ or L^C = ∅, then h^md_L(n) = O(1) and h^ma_L(n) = O(1).

To study all possible types of joint behavior of the functions h^rd_L(n), h^ra_L(n), h^md_L(n), and h^ma_L(n) for binary subword-closed languages L, we consider five classes of languages L_1, ..., L_5 described in columns 2-5 of Table 1. In particular, L_1 consists of all binary subword-closed languages L with Hom(L) = ∞ and L^C = ∅. It is easy to show that the complexity classes L_1, ..., L_5 are pairwise disjoint and that each binary subword-closed language belongs to one of these classes. The behavior of the functions h^rd_L(n), h^ra_L(n), h^md_L(n), and h^ma_L(n) for languages from these classes is described in the last four columns of Table 1. For each class, the results presented in Table 1 follow from Theorems 1 and 2 and the following three remarks: (i) the condition Hom(L) = ∞ implies |L| = ∞, (ii) the condition Het(L) = ∞ implies |L| = ∞, and (iii) the condition Hom(L) < ∞ implies L^C ≠ ∅.

[Table 1. The classes L_1, ..., L_5 defined by conditions on Hom(L), Het(L), |L|, and L^C (columns 2-5), together with the behavior of h^rd_L(n), h^ra_L(n), h^md_L(n), and h^ma_L(n) for each class (last four columns).]
We now show that the classes L_1, ..., L_5 are nonempty. To this end, we consider five binary subword-closed languages L_1, ..., L_5 for which it is easy to see that L_i ∈ L_i for i = 1, ..., 5.

Proofs of Theorems 1 and 2
In this section, we prove Theorems 1 and 2. First, we consider two auxiliary statements. For a word w, we denote by |w| its length.

Lemma 1. Let L be a binary subword-closed language for which Hom(L) < ∞. Then, any word w from L can be represented in the form

w = w_1 a^i w_2 ā^j w_3, (1)

where a ∈ E, i, j ∈ ω, and w_1, w_2, w_3 are words from E* of length at most 2Hom(L) each.
Proof. Denote m = Hom(L). Then, the words 0^{m+1} 1 0^{m+1} and 1^{m+1} 0 1^{m+1} do not belong to L. Let w be a word from L. Then, for any a ∈ E, any entry of the letter a in w has at most m letters ā to the left of this entry (we call it an l-entry of a) or at most m letters ā to the right of this entry (we call it an r-entry of a). Let a ∈ E. We say that w is (i) an a-l-word if any entry of a in w is an l-entry; (ii) an a-r-word if any entry of a in w is an r-entry; and (iii) an a-b-word if w is neither an a-l-word nor an a-r-word. Let c, d ∈ {l, r, b}. We say that w is a cd-word if w is a 0-c-word and a 1-d-word.
There are nine possible pairs cd. We divide them into four groups: (a) ll and rr; (b) lr and rl; (c) lb, rb, bl, and br; and (d) bb, and consider them separately. Let w = a_1 ... a_n.
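The classification of entries and words introduced in this proof can be stated directly in code. The following Python sketch is our own (the naming is ours, and ties between "l" and "r" are resolved in favor of "l"); it computes the pair cd for a word w and a fixed m:

```python
def side_type(w, a, m):
    """Return 'l' if every entry of a in w has at most m letters abar to its
    left, 'r' if every entry has at most m letters abar to its right, and
    'b' otherwise."""
    abar = "1" if a == "0" else "0"
    pos = [i for i, c in enumerate(w) if c == a]
    if all(w[:i].count(abar) <= m for i in pos):
        return "l"
    if all(w[i + 1:].count(abar) <= m for i in pos):
        return "r"
    return "b"

def cd_type(w, m):
    """The pair cd, with c classifying the entries of 0 and d those of 1."""
    return side_type(w, "0", m) + side_type(w, "1", m)
```

For example, with m = 0 the word "0011" is an lr-word, while "0101" is a bb-word.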
We assume that w contains both 0s and 1s; otherwise, w can be represented in the form (1). (a) Let w be an ll-word. Let a_n = 0 and a_i be the rightmost entry of 1 in w. Because w is an ll-word, there are at most m 1s to the left of a_n and at most m 0s to the left of a_i. Denote w_1 = a_1 ... a_i. Then, w_1 contains at most m 0s and at most m 1s, i.e., the length of w_1 is at most 2m. Moreover, to the right of a_i, there are only 0s. Thus, w = w_1 0^{n-i}, where |w_1| = i ≤ 2m, i.e., w can be represented in the form (1).
Let a_n = 1 and a_i be the rightmost entry of 0 in w. Denote w_1 = a_1 ... a_i. Then, w_1 contains at most m 0s and at most m 1s, i.e., |w_1| ≤ 2m. Moreover, to the right of a_i, there are only 1s. Thus, w = w_1 1^{n-i}, i.e., w can be represented in the form (1).
One can prove in a similar way that any rr-word can be represented in the form (1). (b) Let w be an lr-word, a_i be the rightmost entry of 0, and a_j be the leftmost entry of 1. Then, either j = i + 1 or j < i. Let j = i + 1. Then, w = 0^i 1^{n-i}, i.e., w can be represented in the form (1). Let now j < i. Denote w_2 = a_j ... a_i. The word w has at most m 0s to the right of a_j and at most m 1s to the left of a_i. Therefore, |w_2| ≤ 2m and w = 0^{j-1} w_2 1^{n-i}, i.e., w can be represented in the form (1).
One can prove in a similar way that any rl-word can be represented in the form (1). (c) Let w be an lb-word; let a_i be the rightmost entry of 1 such that, to the left of this entry, there are at most m 0s; and let a_j be the next entry of 1 after a_i. It is clear that, to the right of a_j, there are at most m 0s, j ≥ i + 2, and all letters a_{i+1}, ..., a_{j-1} are equal to 0. Let a_k be the rightmost entry of 0. Then, to the left of a_k, there are at most m 1s. It is clear that either k = j − 1 or k > j. Denote w_1 = a_1 ... a_i. Then, |w_1| ≤ 2m. Let k = j − 1. In this case, w = w_1 0^{j-i-1} 1^{n-j+1}, i.e., w can be represented in the form (1). Let k > j. Denote w_2 = a_j ... a_k. Then, |w_2| ≤ 2m. We have w = w_1 0^{j-i-1} w_2 1^{n-k}, i.e., w can be represented in the form (1).
One can prove in a similar way that any rb-, bl-, or br-word can be represented in the form (1).
(d) Let w be a bb-word, a_i be the rightmost entry of 0 such that there are at most m 1s to the left of this entry, and a_j be the next entry of 0 after a_i. Then, there are at most m 1s to the right of a_j, j ≥ i + 2, and w = a_1 ... a_i 1 ... 1 a_j ... a_n. Denote A = {1, ..., i}, B = {i + 1, ..., j − 1}, and C = {j, ..., n}. Let a_k be the rightmost entry of 1 such that there are at most m 0s to the left of this entry, and let a_l be the next entry of 1 after a_k. Then, there are at most m 0s to the right of a_l, l ≥ k + 2, and w = a_1 ... a_k 0 ... 0 a_l ... a_n.
There are four possible types of location of a_k and a_l: (i) k ∈ A and l ∈ A; (ii) k ∈ A and l ∈ B (the combination k ∈ A and l ∈ C is impossible because all letters with indices from B are 1s, but all letters between a_k and a_l are 0s); (iii) k ∈ B and l ∈ C (the combination k ∈ B and l ∈ B is impossible for the same reason); and (iv) k ∈ C and l ∈ C. We now consider cases (i)-(iv) in detail.
(i) Let k ∈ A and l ∈ A. Then, w = a_1 ... a_k 0 ... 0 a_l ... a_i 1 ... 1 a_j ... a_n. Denote w_1 = a_1 ... a_k, w_2 = a_l ... a_i, and w_3 = a_j ... a_n. The length of w_1 is at most 2m because, to the left of a_k, there are at most m 0s and, to the left of a_i, there are at most m 1s. We can prove in a similar way that |w_2| ≤ 2m and |w_3| ≤ 2m. Therefore, w can be represented in the form (1). Cases (ii)-(iv) can be considered in a similar way.

Lemma 2.
Let L be a binary subword-closed language for which Hom(L) < ∞ and Het(L) < ∞. Then, there exists a natural number p such that |L(n)| ≤ p for any natural n.
Proof. Denote m = max(Hom(L), Het(L)). Then, the words 0^{m+1} 1^{m+1} and 1^{m+1} 0^{m+1} do not belong to L. Using Lemma 1, we obtain that each word w from L can be represented in the form w_1 a^i w_2 ā^j w_3, where a ∈ E, the length of w_k is at most t = 2m for k = 1, 2, 3, i, j ∈ ω, and i ≤ m or j ≤ m. We now evaluate the number of such words whose length is equal to n. Let k ∈ {1, 2, 3}. Then, the number of different words w_k is at most 2^0 + 2^1 + ... + 2^t < 2^{t+1}. Let us assume that the words w_1, w_2, and w_3 are fixed and |w_1| + |w_2| + |w_3| ≤ n. Then, the number of different words a^i ā^j of length n − |w_1| − |w_2| − |w_3| is at most 4(m + 1) because i ≤ m or j ≤ m. Thus, the number of words in L(n) is at most p = 2^{3t+3}(2t + 4).
Proof of Theorem 1. It is clear that h^ra_L(n) ≤ h^rd_L(n) for any natural n. (a) Let Hom(L) = ∞ and n be a natural number. Then, there exists a ∈ E such that a^n ā a^n ∈ L. Therefore, a^n ∈ L(n) and a^i ā a^{n-i-1} ∈ L(n) for i = 0, ..., n − 1. Let Γ be a decision tree over L(n) that solves the problem of recognition for L(n) nondeterministically and has the minimum depth h^ra_L(n), and let ξ be a complete path in Γ such that a^n ∈ E(n, ξ). Let us assume that there is i ∈ {0, ..., n − 1} such that the attribute l^n_{i+1} is not attached to any node of ξ that is neither the root nor the terminal node. Then, a^i ā a^{n-i-1} ∈ E(n, ξ), which is impossible. Therefore, h(Γ) ≥ n and h^ra_L(n) ≥ n. It is easy to show that h^rd_L(n) ≤ n. Thus, h^ra_L(n) = h^rd_L(n) = n for any natural n. (b) Let Hom(L) < ∞ and Het(L) = ∞. By Lemma 1, each word from L can be represented in the form w_1 a^i w_2 ā^j w_3, where a ∈ E, the length of w_k is at most t = 2Hom(L) for k = 1, 2, 3, and i, j ∈ ω. Note that either w_2 = λ or w_2 is a word of the kind ā ... a.
Let n be a natural number such that n ≥ 10t. We now describe the work of a decision tree over L(n) that solves the problem of recognition for L(n) deterministically. Let w ∈ L(n). We represent this word in the form w = L_1 L_2 L_3 A R_3 R_2 R_1, where the length of each of the words L_1, L_2, L_3, R_3, R_2, R_1 is equal to t. First, we recognize all letters in the words L_1, L_2, R_2, R_1 using 4t queries (attributes). We now consider four cases.
(i) Let L_2 = R_2 = a^t for some a ∈ E. Then, L_3 A R_3 = a^{n-4t}, and the word w is recognized.
(ii) Let L_2 = a^t for some a ∈ E, and let R_2 contain both 0 and 1. Then, R_2 has an intersection with the word w_2. It is clear that w_2 has no intersection with the word A and that L_3 A = a^{n-5t}. We recognize all letters of the word R_3. As a result, the word w will be recognized.
(iii) Let R_2 = a^t for some a ∈ E, and let L_2 contain both 0 and 1. Then, L_2 has an intersection with the word w_2. It is clear that w_2 has no intersection with the word A and that A R_3 = a^{n-5t}. We recognize all letters of the word L_3. As a result, the word w will be recognized.
(iv) Let L_2 = a^t and R_2 = ā^t for some a ∈ E. Then, we need to recognize the position of the word w_2 and the word w_2 itself. Beginning from the left, we divide L_3 A R_3 and, possibly, a prefix of R_2 into blocks of length t. As a result, we have k ≤ n/t blocks. We recognize all letters in the block with the number r = ⌈k/2⌉. If all letters in this block are equal to ā, then we apply the same procedure to the blocks with numbers 1, ..., r − 1. If all letters in this block are equal to a, then we apply the same procedure to the blocks with numbers r + 1, ..., k. If the considered block contains both 0 and 1, then we recognize t letters before this block and t letters after this block and, as a result, recognize both the word w_2 and its position. After each iteration, the number of blocks under consideration is at most one-half of the previous number of blocks. Let q be the total number of iterations. Then, after iteration q − 1, we have at least one unchecked block. Therefore, k/2^{q-1} ≥ 1 and q ≤ log_2 k + 1.
In case (i), to recognize the word w, we make 4t queries. In cases (ii) and (iii), we make 5t queries. In case (iv), we make at most t log_2(n/t) + 7t queries. As a result, we have h^rd_L(n) = O(log n).
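The block binary search of case (iv) is the source of the logarithmic bound. The following simplified Python sketch is our own illustration (the paper's procedure additionally handles the short mixed word w_2 and queries whole blocks of length t); it shows the idea on the cleanest instance, a word known to have the form a^i ā^{n-i}, which it recovers with roughly log_2 n letter queries.

```python
def recover_step_word(word):
    """Recover a word of the form a^i abar^(n-i) by binary search for the
    last position carrying the first letter. Returns the recovered word
    and the number of letter queries made."""
    n = len(word)
    queries = [0]
    def q(p):                      # the attribute l^n_p: letter at 1-based position p
        queries[0] += 1
        return word[p - 1]
    a = q(1)                       # leading letter (if i = 0, this is already abar)
    lo, hi = 1, n                  # invariant: position lo carries the letter a
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if q(mid) == a:
            lo = mid               # boundary is at mid or to its right
        else:
            hi = mid - 1           # boundary is strictly to the left of mid
    abar = "1" if a == "0" else "0"
    return a * lo + abar * (n - lo), queries[0]
```

For a word of length 16, this uses at most 1 + log_2 16 = 5 queries instead of 16.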
Because Het(L) = ∞, for any natural n, the set L(n) contains, for some a ∈ E, the words a^i ā^{n-i} for i = 0, ..., n. Then, |L(n)| ≥ n + 1, and each decision tree Γ over L(n) solving the problem of recognition for L(n) deterministically has at least n + 1 terminal nodes. One can show that the number of terminal nodes in Γ is at most 2^{h(Γ)}. Therefore, h(Γ) ≥ log_2(n + 1). Thus, h^rd_L(n) = Ω(log n) and, hence, h^rd_L(n) = Θ(log n). We now prove that h^ra_L(n) = O(1). To this end, it is enough to show that there is a natural number c such that, for each natural n and each word w ∈ L(n), there exists a subset B_w of the set of attributes {l^n_1, ..., l^n_n} such that |B_w| ≤ c and, for any word u ∈ L(n) different from w, there exists an attribute l^n_i ∈ B_w for which l^n_i(w) ≠ l^n_i(u). We now show that we can take c = 7t. In case (i), as the set B_w, we can choose all attributes corresponding to the 4t letters of the subwords L_1, L_2, R_2, and R_1. In case (ii), we can choose all attributes corresponding to the 5t letters of the subwords L_1, L_2, R_3, R_2, and R_1. In case (iii), we can choose all attributes corresponding to the 5t letters of the subwords L_1, L_2, L_3, R_2, and R_1. In case (iv), as the set B_w, we can choose all attributes corresponding to the 4t letters of the subwords L_1, L_2, R_2, and R_1 and to the 3t letters of the block containing both 0 and 1 and of the blocks that are its left and right neighbors.
(c) Let Hom(L) < ∞ and Het(L) < ∞. By Lemma 2, there exists a natural number p such that |L(n)| ≤ p for any natural n. Let n be a natural number. Then, the set L(n) contains at most p words, and there exists a subset B of the set of attributes {l^n_1, ..., l^n_n} such that |B| ≤ p^2 and, for any two different words u, w ∈ L(n), there exists an attribute l^n_i ∈ B for which l^n_i(w) ≠ l^n_i(u). It is easy to construct a decision tree over L(n) that solves the problem of recognition for L(n) deterministically by sequentially computing the attributes from B. The depth of this tree is at most p^2. Therefore, h^rd_L(n) = O(1) and h^ra_L(n) = O(1).
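The distinguishing set B used in case (c) can be built greedily: for every pair of words not yet separated, add one position where they differ. A Python sketch under our own naming; since at most one position is added per pair, |B| is less than p^2 for p words.

```python
def distinguishing_positions(words):
    """Return a set B of 1-based positions such that any two distinct words
    in `words` (all of the same length) differ at some position in B."""
    B = set()
    ws = sorted(words)
    for x in range(len(ws)):
        for y in range(x + 1, len(ws)):
            u, w = ws[x], ws[y]
            if all(u[k - 1] == w[k - 1] for k in B):
                # pair not yet separated: add the first differing position
                B.add(next(k + 1 for k in range(len(u)) if u[k] != w[k]))
    return B
```

A decision tree that queries the positions of B one after another then has depth |B| and identifies every word of the set.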

Proof of Theorem 2.
It is clear that h^ma_L(n) ≤ h^md_L(n) for any natural n. (a) Let |L| = ∞ and L^C ≠ ∅, and let w_0 be a word of minimum length in L^C. Because |L| = ∞, we have L(n) ≠ ∅ for any natural n. Let n be a natural number such that n > |w_0|, and let Γ be a decision tree over L(n) that solves the problem of membership for L(n) nondeterministically and has the minimum depth. Let w ∈ L(n) and ξ be a complete path in Γ such that w ∈ E(n, ξ). Then, the terminal node of ξ is labeled with the number 1. Let us assume that the number of nodes labeled with attributes in ξ is at most n − |w_0|. Then, we can change at most |w_0| letters in the word w so that the obtained word w' satisfies the following conditions: w_0 is a subword of w' and w' ∈ E(n, ξ). However, this is impossible because in this case w' ∉ L(n) and w' ∈ E(n, ξ), but the terminal node of ξ is labeled with the number 1. Therefore, the depth of Γ is greater than n − |w_0|. Thus, h^ma_L(n) = Ω(n). It is easy to construct a decision tree over L(n) that solves the problem of membership for L(n) deterministically and has a depth equal to n. Therefore, h^md_L(n) = O(n). Thus, h^md_L(n) = Θ(n) and h^ma_L(n) = Θ(n). (b) Let |L| < ∞. Then, there exists a natural number m such that L(n) = ∅ for any natural n ≥ m. Therefore, for each natural n ≥ m, h^md_L(n) = 0 and h^ma_L(n) = 0. Let L^C = ∅, n be a natural number, and Γ be a decision tree over L(n) that consists of the root, a terminal node labeled with 1, and an edge that leaves the root and enters the terminal node. One can show that Γ solves the problem of membership for L(n) deterministically and has a depth equal to 0. Therefore, h^md_L(n) = 0 and h^ma_L(n) = 0.
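The upper bound h^md_L(n) ≤ n in part (a) corresponds to the trivial strategy of reading the whole word. A minimal Python sketch (our own naming), assuming the finite set L(n) is available explicitly:

```python
def membership_full_read(query, n, L_n):
    """Depth-n deterministic membership test: query every attribute
    l^n_1, ..., l^n_n, then answer 1 iff the assembled word is in L(n)."""
    w = "".join(query(p) for p in range(1, n + 1))
    return 1 if w in L_n else 0
```

For example, with L(2) = {"00", "01"}, the word "01" is accepted and "11" is rejected, in both cases after exactly two queries.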

Conclusions
In this paper, we studied arbitrary binary subword-closed languages. For the set L(n) of words of length n belonging to a binary subword-closed language L, we investigated the depth of the decision trees solving the recognition and the membership problems deterministically and nondeterministically. We proved that, with the growth of n, the minimum depth of the decision trees solving the problem of recognition deterministically is either bounded from above by a constant, or grows logarithmically, or grows linearly. For the other types of trees and problems, with the growth of n, the minimum depth of the decision trees is either bounded from above by a constant or grows linearly. We also studied the joint behavior of the minimum depths of the four considered types of decision trees and described five complexity classes of binary subword-closed languages.
In this paper, we did not assume that a binary subword-closed language is given by a deterministic finite automaton accepting it, so we could not use the parameters of such an automaton to study decision tree complexity as was done in [6][7][8][9]. Instead, for binary subword-closed languages, we described simple combinatorial criteria for the behavior of the minimum depths of the decision trees solving the problems of recognition and membership deterministically and nondeterministically.
In the future, we are planning to generalize this approach to some other classes of formal languages.