Published in the journal "Software systems and computational methods", issue 2014-2, in the rubric "Software for innovative information technologies", pp. 239-254.
Abstract: The study takes as initial data truth tables of three-dimensional functions of binary, ternary, and mixed logics. The values of the functions are computed via their geometric interpretations, disjunctive/conjunctive normal forms, incompletely connected artificial neural networks, and perceptrons with a hidden layer. The article reviews in detail the intermediate computation results for all of the methods mentioned above. The authors study the properties of functions of mixed logic (binary-ternary and 3-2 logics) in one, two, and three dimensions. The article presents mutually equivalent implementations of logic functions as a disjunctive normal form and as an incompletely connected artificial neural network. The continuous activation function is replaced with a ternary threshold function. The study includes the construction of disjunctive normal forms and the direct synthesis of the weight matrix of an incompletely connected artificial neural network. The perceptron is trained with the Back Propagation algorithm. Some conclusions are drawn by mathematical induction. The article shows that: 1. minimizing the number of neurons in the perceptron's hidden layer implicitly leads to the use of many-valued logics; 2. some functions of binary-ternary logic may be used to build disjunctive forms; 3. there is a bijective way to convert a disjunctive normal form into an incompletely connected artificial neural network and vice versa; 4. in one-dimensional 3-2 logic there are only eight functions, and all of them are listed; 5. the proposed structure of incompletely connected artificial neural network can implement any function of ternary logic in any dimension.
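Two of the abstract's points can be illustrated with a short sketch. The function `ternary_threshold` below is our own illustrative stand-in for the ternary threshold activation the authors substitute for a continuous one (the threshold values are assumptions, not taken from the article); the enumeration then confirms conclusion 4, that a one-dimensional 3-2 logic function maps an argument in {0, 1, 2} to a result in {0, 1}, giving 2^3 = 8 functions in total.

```python
from itertools import product

def ternary_threshold(x, low=-0.5, high=0.5):
    """Illustrative ternary threshold activation: map a real input
    to one of three logic levels {0, 1, 2}. The cut points 'low'
    and 'high' are assumed values, not the article's."""
    if x < low:
        return 0
    if x <= high:
        return 1
    return 2

# Conclusion 4: each one-dimensional 3-2 logic function assigns a
# binary result to each of the three ternary argument values, so
# there are exactly 2**3 = 8 such functions.
functions = [dict(zip((0, 1, 2), outputs))
             for outputs in product((0, 1), repeat=3)]
assert len(functions) == 8
```

Each dictionary in `functions` is one truth table of a one-dimensional 3-2 logic function, which is the complete list the abstract refers to.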
Keywords: XOR problem, perceptron, separating hyperplane, activation function, perfect disjunctive form, binary-ternary logic, 3-2 logic, ternary logic, neural network training, Back Propagation algorithm
References:
1. Codd E. F. Extending the relational model to better capture semantics [Rasshirenie relyatsionnoy modeli dlya luchshego otrazheniya semantiki] // SUBD, 1996, № 2.
2. Date C. J. An Introduction to Database Systems [Vvedenie v sistemy baz dannykh]. Moscow: Vil'yams, 2000. 848 p.
3. URL: http://dit.ipg.pt./MBP
4. Giniyatullin V. M. Modeling of logic functions in a neural network basis [Modelirovanie logicheskikh funktsiy v neyrosetevom bazise] // Neftegazovoe delo, 2008, № 1, pp. 35-43.
5. URL: http://ru.wikipedia.org/wiki/Troichnye_funktsi