We define a notion of reasoning using world-rank-functions, independently of any symbolic language. We then show that every symmetric neural network (such as a Hopfield network or Boltzmann machine) can be seen as searching for a satisfying model of knowledge that is wired into the network's topology and weights. Several equivalent languages are then shown to describe the knowledge embedded in these networks symbolically. We extend propositional calculus by augmenting assumptions with penalties. The extended calculus (called "penalty logic") is useful for expressing default knowledge, preference between arguments, and reliability of assumptions in an inconsistent knowledge base. Every symmetric network can be described in this language, and every sentence of the language is translatable into such a network. A proof-theoretic reasoning procedure supplements the model-theoretic definitions and gives an intuitive understanding of the non-monotonic behavior of the reasoning mechanism. Finally, we sketch a connectionist inference engine for penalty logic and discuss its capabilities and limitations.
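The core idea of penalty logic can be illustrated with a minimal sketch. The clauses, penalties, and brute-force search below are hypothetical choices for illustration, not taken from the report: each assumption is a propositional clause paired with a penalty, a model's rank is the sum of penalties of the assumptions it violates, and the preferred models are those of minimal rank — the quantity a symmetric network's energy function encodes.

```python
from itertools import product

# Hypothetical penalized knowledge base over variables p, q.
# Each clause is a list of (variable, polarity) literals; the penalty
# is charged when every literal in the clause is false.
penalized_assumptions = [
    ([("p", True)], 2.0),                # p, penalty 2
    ([("p", False), ("q", True)], 1.0),  # p -> q (i.e. not-p or q), penalty 1
    ([("q", False)], 3.0),               # not q, penalty 3
]
variables = ["p", "q"]

def violation_rank(model):
    """Sum of penalties of the assumptions falsified by `model`."""
    rank = 0.0
    for literals, penalty in penalized_assumptions:
        if not any(model[v] == polarity for v, polarity in literals):
            rank += penalty
    return rank

# Exhaustive search over truth assignments stands in for the network's
# energy-minimizing dynamics on this toy knowledge base.
best = min(
    (dict(zip(variables, values))
     for values in product([False, True], repeat=len(variables))),
    key=violation_rank,
)
print(best, violation_rank(best))  # {'p': True, 'q': False} 1.0
```

The knowledge base is inconsistent (no model satisfies all three assumptions), yet the rank function still singles out a preferred model: giving up the cheapest assumption, p -> q, is better than giving up p or not-q.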
Pinkas, Gadi, "Propositional Non-Monotonic Reasoning and Inconsistency in Symmetric Neural Networks," Report Number: WUCS-91-03 (1991). All Computer Science and Engineering Research.