1308
B. Kulik, A. Fridman, A. Zuenko
4 DATA AND KNOWLEDGE REPRESENTATION IN NTA
4.1 Graphs and Semantic Networks
In computers, graphs and networks are usually represented as list structures. In
artificial intelligence systems, logical inference on graphs and semantic networks is
implemented through algorithms that search for accessible vertices or through
construction of the transitive closure of a graph. However, such algorithms are not
efficient enough and are hard to parallelize. Let us now consider the way graphs are
expressed in NTA. We will use the graph presented in Figure 1 as an example.
Fig. 1. Example of a graph
This graph can be expressed as a C-system

G[XY] = [ {a}  {b, c, d, e} ]
        [ {b}  {d}          ]
        [ {c}  {a, b, d, e} ]

isomorphic to the adjacency matrix of this graph.
Composition of graphs G ◦ G, i.e. composition of a graph with itself, is used quite
often. This operation is shortly denoted as G^2. Greater “degrees” of composition
can also be used, e.g. G^3 = G ◦ G ◦ G and so on.
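As an illustration (not NTA's internal C-system machinery), a graph can be modeled as a binary relation, i.e. a plain set of (source, target) pairs, and composition computed directly. The graph below is the one from Figure 1, expanded from the C-system rows into individual arcs:

```python
# The graph of Fig. 1 as a set of arcs: a -> {b, c, d, e}, b -> {d}, c -> {a, b, d, e}
G = {('a', v) for v in 'bcde'} | {('b', 'd')} | {('c', v) for v in 'abde'}

def compose(r, s):
    """Composition r ∘ s: all pairs (x, z) such that (x, y) in r and (y, z) in s."""
    return {(x, z) for (x, y) in r for (y2, z) in s if y == y2}

G2 = compose(G, G)       # G^2: pairs connected by a path of exactly two arcs
assert ('a', 'a') in G2  # a -> c -> a
assert ('c', 'd') in G2  # c -> b -> d
```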
It is often necessary to determine the set of all the accessible vertices for each
vertex of a graph G. This information is contained in the transitive closure of the
graph, which is defined as follows.
The transitive closure of a graph G that contains n vertices is the graph G^+ each
of whose vertices is connected by an arc with every vertex accessible from it.
Transitive closure can be constructed with the following sequence of operations:
G^+ = G ∪ G^2 ∪ G^3 ∪ ... ∪ G^k,
where k ≤ n. Practically in all cases, the transformation of a finite graph G into
the graph G^+ ends before the last “degree” G^k is found. The reason for ending
this operation early is that at some step the next “degree” of the graph contributes
no arcs that were not already in the graph.
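The construction above, including the early stop, can be sketched in Python with the same set-of-pairs model of a relation (an illustration only):

```python
def compose(r, s):
    """Composition r ∘ s of binary relations given as sets of (x, y) pairs."""
    return {(x, z) for (x, y) in r for (y2, z) in s if y == y2}

def transitive_closure(g):
    """Accumulate G ∪ G^2 ∪ G^3 ∪ ..., stopping as soon as the next
    "degree" contributes no arcs that are not already in the closure."""
    closure = set(g)
    degree = set(g)
    while True:
        degree = compose(degree, g)   # the next "degree" G^k
        new_arcs = degree - closure
        if not new_arcs:              # the early stop described above
            return closure
        closure |= new_arcs

# The graph of Fig. 1: a -> {b, c, d, e}, b -> {d}, c -> {a, b, d, e}
G = {('a', v) for v in 'bcde'} | {('b', 'd')} | {('c', v) for v in 'abde'}
G_plus = transitive_closure(G)
assert ('a', 'a') in G_plus           # a is accessible from itself via c
assert ('b', 'd') in G_plus and not any(x == 'd' for (x, _) in G_plus)
```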
Let us consider the way inference in semantic networks is implemented in
NTA [15]. Any semantic network can be represented as a totality of binary re-
lations. In semantic networks, inference rules are expressed as productions whose
left part contains joins or compositions of some of these relations, and the right part
is a relation that is substituted for the left part in the semantic network or is added
to the semantic network as a new relation. Suppose that in an initial semantic
network, the existing relations R_1 and R_2 (see Figure 2) infer an additional link R_3
between the domain of the relation R_1 (vertex K) and the co-domain (target) of the
relation R_2 (vertex N). The respective rule is shown in Figure 3, where A, B, C are
variables whose values can be the vertices of the described semantic networks.
Fig. 2. Initial semantic network
Fig. 3. Example of a transformation rule for a network
In NTA language, this network can be recorded as a totality of C-systems,
namely R_1[XY] = [{K}{L}], R_2[YW] = [{L, T}{N}], R_3[XW] = [{S}{N}].
To express a production rule by means of NTA in the general case, we need to
perform the following sequence of steps:
1. calculate the join of relations contained in the left part of the rule;
2. filter the resulting relation P using the given restrictions, for instance,
a totality of facts;
3. if the rule requires substituting its left part for the right one, delete all links
contained in P from the initial knowledge base;
4. calculate the join T of relations contained in the right part of the rule;
5. filter T if necessary;
6. add all links contained in T into the knowledge base.
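The six steps above can be sketched as follows, again modeling every relation as a set of pairs; the join of the two left-part relations is taken to be their composition over the shared middle attribute, and the filter is an arbitrary predicate (both simplifying assumptions for illustration):

```python
def compose(r, s):
    return {(x, z) for (x, y) in r for (y2, z) in s if y == y2}

def apply_rule(kb, left, right, keep=lambda link: True, substitute=False):
    """Apply a production rule to a knowledge base kb (dict: name -> set of pairs).

    left       -- pair of relation names whose composition is the rule's left part
    right      -- name of the relation that receives the derived links
    keep       -- filter predicate over derived links (steps 2 and 5)
    substitute -- if True, delete the derived links from the knowledge base
                  before adding them to the right-part relation (step 3)
    """
    p = {link for link in compose(kb[left[0]], kb[left[1]]) if keep(link)}  # steps 1-2
    if substitute:                                                          # step 3
        for name in kb:
            kb[name] -= p
    kb[right] |= p                                                          # steps 4-6

# Hypothetical relations Q1, Q2, Q3 just to exercise the sketch:
kb = {'Q1': {(1, 2)}, 'Q2': {(2, 3)}, 'Q3': set()}
apply_rule(kb, ('Q1', 'Q2'), 'Q3')
assert kb['Q3'] == {(1, 3)}
```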
Classifying the rules in advance simplifies the calculations significantly. As the
rule shown in Figure 3 only requires adding a new link without deleting any other
links, we only need to calculate R_1[XY] ◦ R_2[YW] = [{K}{L}] ◦ [{L, T}{N}] =
[{K}{N}] and then add the derived n-tuple into the C-system matched to the
relation R_3: R_3[XW] = R_3[XW] ∪ (R_1[XY] ◦ R_2[YW]) = [{S}{N}] ∪ [{K}{N}] =
[{S, K}{N}]. After all the necessary transformations, the semantic network will look
as follows: R_1[XY] = [{K}{L}], R_2[YW] = [{L, T}{N}], R_3[XW] = [{S, K}{N}].
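The computation above can be checked mechanically with the same set-of-pairs model of a relation (an illustration, not the C-system operations themselves):

```python
def compose(r, s):
    return {(x, z) for (x, y) in r for (y2, z) in s if y == y2}

R1 = {('K', 'L')}               # R_1[XY] = [{K}{L}]
R2 = {('L', 'N'), ('T', 'N')}   # R_2[YW] = [{L, T}{N}]
R3 = {('S', 'N')}               # R_3[XW] = [{S}{N}]

derived = compose(R1, R2)       # [{K}{L}] ∘ [{L, T}{N}] = [{K}{N}]
assert derived == {('K', 'N')}

R3 = R3 | derived               # [{S}{N}] ∪ [{K}{N}] = [{S, K}{N}]
assert R3 == {('S', 'N'), ('K', 'N')}
```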