Document Type
Technical Report
Publication Date
1991-12-01
Technical Report Number
WUCS-91-52
Abstract
The goal of this article is to construct a connectionist inference engine that is capable of representing and learning nonmonotonic knowledge. An extended version of propositional calculus is developed and is demonstrated to be useful for nonmonotonic reasoning and for coping with inconsistency that may result from noisy, unreliable sources of knowledge. Formulas of the extended calculus (called penalty logic) are proved to be equivalent in a very strong sense to symmetric networks (like Hopfield networks and Boltzmann machines), and efficient algorithms are given for translating back and forth between the two forms of knowledge representation. The paper presents a fast learning procedure that allows symmetric networks to learn representations of unknown logic formulas by looking at examples. A connectionist inference engine is then sketched whose knowledge is either compiled from a symbolic representation or inductively learned from training examples. Finally, the paper shows that penalty logic can be used as a high-level specification language for connectionist networks, and as a framework into which several recent systems may be mapped.
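The sketch below is not the report's own translation algorithm, only a minimal illustration of the correspondence the abstract claims: a penalty theory (weighted propositional formulas) induces an energy function over binary units whose global minima are the preferred (minimum-violation) models, which for clauses of at most two literals is already a symmetric quadratic energy of the Hopfield/Boltzmann kind. The example theory and all names are hypothetical, chosen only to show the nonmonotonic flavor of defaults being overridden by hard rules.

```python
from itertools import product

# Hypothetical penalty theory: list of (penalty, clause) pairs; a clause is a
# tuple of literals, and a literal is (variable_name, polarity).
theory = [
    (1000.0, (("bird", True),)),                       # bird               (hard fact)
    (1000.0, (("penguin", True),)),                    # penguin            (hard fact)
    (10.0,   (("bird", False), ("flies", True))),      # bird -> flies      (default)
    (1000.0, (("penguin", False), ("flies", False))),  # penguin -> ~flies  (hard rule)
]

variables = sorted({v for _, clause in theory for v, _ in clause})

def violation_rank(assignment):
    """Sum of penalties of the clauses falsified by a {0,1} assignment."""
    rank = 0.0
    for penalty, clause in theory:
        satisfied = any(assignment[v] == int(pol) for v, pol in clause)
        if not satisfied:
            rank += penalty
    return rank

def energy(assignment):
    """Energy of the corresponding network of binary units.

    A clause (l1 v ... v lk) is violated exactly when every literal is false,
    so it contributes penalty * prod_i (1 - value(l_i)).  With clauses of at
    most two literals this expands to a quadratic form over {0,1} units, i.e.
    a symmetric-network energy (the Hopfield weights are the negated pairwise
    coefficients); longer clauses would need hidden units.
    """
    total = 0.0
    for penalty, clause in theory:
        term = penalty
        for v, pol in clause:
            term *= (1 - assignment[v]) if pol else assignment[v]
        total += term
    return total

# Check the equivalence on every truth assignment: energy equals violation
# rank, so the global energy minima are exactly the preferred models.
best = None
for bits in product((0, 1), repeat=len(variables)):
    assignment = dict(zip(variables, bits))
    assert abs(energy(assignment) - violation_rank(assignment)) < 1e-9
    if best is None or energy(assignment) < best[0]:
        best = (energy(assignment), assignment)

print("minimum-energy (preferred) model:", best)
# Here the default bird -> flies is overridden: the preferred model has
# bird=1, penguin=1, flies=0, paying only the default's penalty of 10.
```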
Recommended Citation
Pinkas, Gadi, "Representation and Learning of Propositional Knowledge in Symmetric Connectionist Networks," Report Number: WUCS-91-52 (1991). All Computer Science and Engineering Research.
https://openscholarship.wustl.edu/cse_research/670
Comments
Permanent URL: http://dx.doi.org/10.7936/K7VD6WTX