1. Introduction
In the field of logic programming it has always been of great importance to reduce formulae and expressions to their minimal conjunctive or disjunctive normal forms (CNF and DNF, respectively), as this reduces the number of logic OR and AND gates needed to implement the function as a circuit. This topic has been broadly studied for Boolean logic, and well-known methods such as Karnaugh maps [1] and the Quine-McCluskey algorithm [2, 3] have served as the basis for powerful tools such as ESPRESSO [4] and BOOM [5]. While this is the case for Boolean logic, it is not for Here-and-There logic (i.e., a three-valued logic). There is a modified version of the Quine-McCluskey algorithm that takes into account the particularities of this specific logic [6], but there are no available tools implementing it. In this paper we describe a first approach to a tool that is able to minimize logic programs and theories in Here-and-There logic, leveraging the Quine-McCluskey algorithm alongside the power of Answer Set Programming.
2. Materials and Methods
The Quine-McCluskey algorithm aims to obtain a minimal normal form equivalent to any propositional theory. It can obtain either a minimal DNF from the set of models of the theory or a minimal CNF from its countermodels. The algorithm computes the set of prime implicates of a given theory from its countermodels. To obtain the minimal CNF we also need a coverage algorithm (usually Petrick's method [7]) to select a minimal subset of prime implicates that covers all of the initial countermodels of the theory.
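As a minimal sketch of the classical Boolean merging step (in Python, with function names of our own choosing; the tool itself applies the HT variant described below and works with prime implicates of countermodels rather than implicants of minterms), terms can be represented as value/mask pairs and merged whenever they differ in exactly one non-masked bit:

from itertools import combinations

def combine(a, b):
    """Merge two terms with the same don't-care mask that differ in exactly
    one remaining bit; return None if they cannot be merged."""
    (va, ma), (vb, mb) = a, b
    if ma != mb:
        return None
    diff = va ^ vb
    if diff != 0 and diff & (diff - 1) == 0:   # exactly one differing bit
        return (va & ~diff, ma | diff)
    return None

def prime_implicants(minterms):
    """Repeatedly merge terms; the terms never merged are the prime implicants."""
    current = {(m, 0) for m in minterms}
    primes = set()
    while current:
        merged, used = set(), set()
        for a, b in combinations(current, 2):
            c = combine(a, b)
            if c is not None:
                merged.add(c)
                used.update((a, b))
        primes |= current - used
        current = merged
    return primes

# Example: f(x2, x1, x0) given by the minterms {0, 1, 2, 5, 6, 7}
for value, mask in sorted(prime_implicants({0, 1, 2, 5, 6, 7})):
    print(f"value={value:03b} dontcare={mask:03b}")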
In the logic of here-and-there (HT), a formula is defined in the usual way as a well-formed combination of the operators ⊥, ∧, ∨, → with atoms from a propositional signature. We also define ¬φ := φ → ⊥, and ⊤ := ¬⊥. A theory is a set of formulas.
An HT-interpretation is a pair ⟨H, T⟩ of sets of atoms with H ⊆ T; it is a model of a given theory if it satisfies all formulas in that theory. A formula true in all models is said to be valid or a tautology, while an equilibrium model of a theory is any total model ⟨T, T⟩ of that theory such that no ⟨H, T⟩ with H ⊂ T is a model of the theory. Equilibrium Logic is the logic induced by equilibrium models.
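As an illustration of these definitions, the following brute-force Python sketch (the tuple-based formula representation and all function names are our own, not part of the tool) enumerates HT-interpretations ⟨H, T⟩ and filters the equilibrium models of a tiny theory:

from itertools import chain, combinations

def holds(world, h, t, formula):
    """Evaluate a formula at world 'h' or 't' of the HT-interpretation (h, t),
    where h and t are sets of atoms with h <= t. Formulas are nested tuples:
    ('atom', p), ('bot',), ('and', f, g), ('or', f, g), ('imp', f, g)."""
    kind = formula[0]
    if kind == 'bot':
        return False
    if kind == 'atom':
        return formula[1] in (h if world == 'h' else t)
    if kind == 'and':
        return holds(world, h, t, formula[1]) and holds(world, h, t, formula[2])
    if kind == 'or':
        return holds(world, h, t, formula[1]) or holds(world, h, t, formula[2])
    if kind == 'imp':
        # an implication must hold classically at 't' and, at 'h', either the
        # antecedent fails or the consequent holds
        there = (not holds('t', h, t, formula[1])) or holds('t', h, t, formula[2])
        if world == 't':
            return there
        here = (not holds('h', h, t, formula[1])) or holds('h', h, t, formula[2])
        return there and here
    raise ValueError(kind)

def ht_models(theory, atoms):
    subsets = list(chain.from_iterable(combinations(atoms, k)
                                       for k in range(len(atoms) + 1)))
    for t in subsets:
        for h in subsets:
            if set(h) <= set(t) and all(holds('h', set(h), set(t), f) for f in theory):
                yield (frozenset(h), frozenset(t))

def equilibrium_models(theory, atoms):
    models = set(ht_models(theory, atoms))
    return [t for (h, t) in models
            if h == t and not any(h2 < t for (h2, t2) in models if t2 == t)]

# Example: the program { a <- not b }, i.e. not b -> a, whose only
# equilibrium model is {a}
neg = lambda f: ('imp', f, ('bot',))
program = [('imp', neg(('atom', 'b')), ('atom', 'a'))]
print(equilibrium_models(program, ['a', 'b']))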
As shown in [8], logic programs constitute a CNF for HT. A logic program is a conjunction of clauses, called rules, with positive and negative bodies and positive and negative heads, of the form:
⋀ B⁺ ∧ ⋀ ¬B⁻ → ⋁ Hd⁺ ∨ ⋁ ¬Hd⁻
where B⁺ (positive body), B⁻ (negative body), Hd⁺ (positive head) and Hd⁻ (negative head) are sets of atoms, and ¬S abbreviates the negations of the atoms in S.
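For illustration, such a rule can be represented directly by its four sets (this representation and the field names are our own; they are reused in the test sketches below):

from dataclasses import dataclass

@dataclass(frozen=True)
class Rule:
    body_pos: frozenset   # atoms that must hold in the body
    body_neg: frozenset   # atoms under negation in the body
    head_pos: frozenset   # atoms in the head
    head_neg: frozenset   # negated atoms in the head

    def __str__(self):
        body = sorted(self.body_pos) + ["not " + a for a in sorted(self.body_neg)]
        head = sorted(self.head_pos) + ["not " + a for a in sorted(self.head_neg)]
        return " and ".join(body or ["true"]) + " -> " + " or ".join(head or ["false"])

print(Rule(frozenset({"a"}), frozenset({"b"}), frozenset({"c"}), frozenset()))
# prints: a and not b -> c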
2.1. Implementation
Fundamental rules are rules in which all pairwise intersections of the head and body sets are empty, with the possible exception of . Such rules can be transformed into minterm-like labels using a set of six symbols, shown in the table below.
These rules are then encoded in octal and compared pairwise using bit-by-bit operations, which allows us to perform the prime implicate generation taking into account the set of adjacent values for HT. This octal codification uses three bits per symbol to describe which truth values are covered (e.g., the symbol meaning “not 2” has all bit positions set to 1 except the one corresponding to 2).
Symbol | 2 | 1 | 0 | Symbol | 2 | 1 | 0 |
0 | 0 | 0 | 1 | not 2 | 0 | 1 | 1 |
1 | 0 | 1 | 0 | not 0 | 1 | 1 | 0 |
2 | 1 | 0 | 0 | - | 1 | 1 | 1 |
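A minimal Python sketch of this encoding (the string names used for the symbols, in particular "not2" and "not0" for the two negated symbols in the table, are our own labels):

SYMBOL_BITS = {
    "0":    0b001,   # value 0 only
    "1":    0b010,   # value 1 only
    "2":    0b100,   # value 2 only
    "not2": 0b011,   # values 0 and 1
    "not0": 0b110,   # values 1 and 2
    "-":    0b111,   # any value
}

def encode(label):
    """Pack a label (one symbol per atom) into a single integer,
    three bits per position, so that labels can be compared bitwise."""
    code = 0
    for sym in label:
        code = (code << 3) | SYMBOL_BITS[sym]
    return code

def position_subsumes(code_a, code_b):
    """True when every position of code_b is contained in the
    corresponding position of code_a (all of b's bits appear in a)."""
    return code_a | code_b == code_a

specific = encode(["2", "1", "0"])       # one concrete value per atom
general  = encode(["-", "1", "not2"])    # covers more values in every position
print(oct(specific), oct(general), position_subsumes(general, specific))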
On top of the modified version of the Quine-McCluskey method, we add a preliminary step that allows us to work with the direct translation of the rules, instead of having to expand all of the possible minterms that each aggregated rule contains. Instead of comparing which labels are totally adjacent, we check that all positions are compatible except the adjacent one. This means that every position should subsume its counterpart or, in the best case, be equal. Since we only expand the partially adjacent labels, the worst case is equivalent to expanding all of the labels into their possible minterms, while the average case is logarithmically smaller. Finally, for the minimal coverage step, since this is a well-studied problem in ASP, we leverage the ASP solver clingo to obtain the set of minimal versions of the original program, thus avoiding the implementation of Petrick's method.
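To illustrate how the coverage step can be delegated to an ASP solver, here is a hypothetical sketch using the clingo Python API; the predicate names (implicate/1, countermodel/1, covers/2, chosen/1) and the toy facts are our own assumptions and not the tool's actual encoding, and this sketch only extracts one minimum cover rather than the full set of minimal programs:

import clingo

# Toy instance: which prime implicate covers which countermodel
FACTS = """
implicate(i1). implicate(i2). implicate(i3).
countermodel(c1). countermodel(c2). countermodel(c3).
covers(i1, c1). covers(i1, c2).
covers(i2, c2). covers(i2, c3).
covers(i3, c3).
"""

# Guess a subset of implicates, require that it covers every countermodel,
# and minimise its size
ENCODING = """
{ chosen(I) : implicate(I) }.
covered(C) :- chosen(I), covers(I, C).
:- countermodel(C), not covered(C).
#minimize { 1, I : chosen(I) }.
#show chosen/1.
"""

best = []

def on_model(model):
    # clingo reports successively better models; keep the last (optimal) one
    best.clear()
    best.extend(str(sym) for sym in model.symbols(shown=True))

ctl = clingo.Control()
ctl.add("base", [], FACTS + ENCODING)
ctl.ground([("base", [])])
ctl.solve(on_model=on_model)
print(best)   # a minimum-size set of implicates covering all countermodels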
2.2. Tests
For a minimal program to be correct, it has to satisfy three conditions:
Size: The minimal program has to have at most as many rules as the original program.
Subsumption: The rules of the minimal program subsume all of the rules of the original program.
Strong Equivalence: The original program and the minimal program have exactly the same HT models, so each can replace the other in any context.
The size condition check is straightforward; only in the case that the number of rules remains the same (i.e., the original program is already minimal) do we have to check that any rewriting of the original rules is valid with respect to the other two conditions.
The subsumption condition is further detailed in [6]; to summarize, every rule of the minimal program has to subsume at least one rule of the original program, and there cannot be any rule of the original program that is not subsumed by the rules of the minimal program. This condition is checked with an ASP program that compares the original and the minimal program, verifying for each rule of the latter which of its parts are subsets of the corresponding parts of rules of the original program.
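A Python sketch of this componentwise check (the four-tuple rule representation matches the sets introduced above; the function names are ours, and the actual tool performs the check in ASP):

def rule_subsumes(rule_a, rule_b):
    """rule_a subsumes rule_b when every part of rule_a (positive body,
    negative body, positive head, negative head) is a subset of the
    corresponding part of rule_b."""
    return all(part_a <= part_b for part_a, part_b in zip(rule_a, rule_b))

def check_subsumption(minimal, original):
    """Every minimal rule must subsume some original rule, and every
    original rule must be subsumed by some minimal rule."""
    return (all(any(rule_subsumes(m, o) for o in original) for m in minimal)
            and all(any(rule_subsumes(m, o) for m in minimal) for o in original))

# Rules as (body_pos, body_neg, head_pos, head_neg)
original = [(frozenset({"a", "c"}), frozenset({"b"}), frozenset({"d"}), frozenset()),
            (frozenset({"a"}), frozenset({"b"}), frozenset({"d"}), frozenset())]
minimal = [(frozenset({"a"}), frozenset({"b"}), frozenset({"d"}), frozenset())]
print(check_subsumption(minimal, original))   # True: the single rule subsumes both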
For the strong equivalence condition, the objective is to transform both the original and the minimal program into classical propositional logic by applying a set of transformations. By comparing the classical models obtained from the transformations of the original program and of the minimal version, we can ensure the strong equivalence property.
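One standard reduction of this kind, shown here only as an illustration (the concrete transformation set used by the tool may differ), pairs every atom p with a fresh atom p′ and translates formulas as follows:
σ(⊥) = ⊥, σ(p) = p for every atom p,
σ(φ ∧ ψ) = σ(φ) ∧ σ(ψ), σ(φ ∨ ψ) = σ(φ) ∨ σ(ψ),
σ(φ → ψ) = (σ(φ) → σ(ψ)) ∧ (φ′ → ψ′),
where φ′ denotes φ with every atom p replaced by p′. Under the additional axioms p → p′ for every atom, the classical models of the translated theory correspond exactly to its HT models, so two programs are strongly equivalent precisely when their translations have the same classical models.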
3. Conclusions
We have developed a novel tool that implements the modified version of the Quine-McCluskey method for HT, leveraging Answer Set Programming both to perform the minimal coverage of the initial countermodels and to test and validate the results.
This first version of the tool is able to minimize the samples used in [6] in better times than the proof-of-concept Prolog script provided by the authors, since it does not rely on computing all of the countermodels of a logic program: it works directly with the rules of the program as input, checking which rules can potentially be subsumed and expanding only those into minterms.
As for future work, we are already studying the splitting properties of logic programs in terms of minimization, which would allow us to minimize parts of a given logic program separately, greatly reducing the number of atoms and rules that have to be handled at a time compared to the current approach.