Stochastic Pattern Computing: A New Computing Paradigm for AI

Pentti Kanerva, SICS

Computers have always been thought of as a kind of brain and have even been called electronic brains. However, the first ones were built for NUMERIC COMPUTING; they were fast automatic calculators, or number-crunchers. Then it was realized that computers actually manipulate symbols, which gave birth to programming languages. It was also realized that the objects of computing could be not only numbers but also data structures, and so SYMBOLIC COMPUTING was born; it has dominated AI ever since. In the last 10-15 years, NEUROCOMPUTING has come to rival symbolic computing in cognitive science, spurring the Second AI Debate. The debate is about whether connectionist neurocomputing is sufficient for machine intelligence and cognition, and whether we need anything more than symbolic computing.

Stochastic Pattern Computing combines the robustness and learning of neurocomputing with the compositionality of symbolic computing, in a system that resembles numeric computing. The "numbers" that we compute with are large patterns: random vectors with thousands of dimensions. Such a vector can represent an object, a property, a relation, a function, a composed structure, or a mapping between structures. The key ideas are that all things are represented in the same mathematical space, and that new things, or concepts, are composed recursively from existing ones. The vectors are not sectioned into fields. Instead, every component of a composed vector holds some information about every one of the constituent vectors, so that the vectors are holographic or holistic--the representation is brain-like. Computing with such vectors relies on the statistical law of large numbers, which is why the vectors are very high-dimensional. The idea of stochastic pattern computing will be demonstrated with simple examples.
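As a concrete illustration, here is a minimal pure-Python sketch of the kind of computation described. The specific operations -- binding by componentwise multiplication, bundling (superposition) by componentwise majority, and comparison by a normalized dot product -- are assumptions drawn from common practice with high-dimensional random vectors, not details given in the abstract. The example composes two attribute-value pairs into a single vector and then recovers one value from the composite; the recovery works only because, by the law of large numbers, unrelated 10,000-dimensional random vectors are nearly orthogonal.

```python
import random

random.seed(1)
D = 10_000  # very high dimensionality; the law of large numbers needs this

def rand_vec():
    # A random bipolar (+1/-1) pattern vector.
    return [random.choice((-1, 1)) for _ in range(D)]

def bind(u, v):
    # Componentwise multiplication: associates two vectors.
    # It is its own inverse: bind(bind(u, v), v) == u.
    return [a * b for a, b in zip(u, v)]

def bundle(*vs):
    # Componentwise majority (ties broken toward +1): superposes
    # several vectors into one vector of the same kind.
    return [1 if sum(cs) >= 0 else -1 for cs in zip(*vs)]

def sim(u, v):
    # Normalized dot product: ~0 for unrelated random vectors, 1 for identical.
    return sum(a * b for a, b in zip(u, v)) / D

# Encode a tiny record of two attribute-value pairs as one composed vector.
# Every component of `record` carries information about all four constituents.
name, role, x, y = rand_vec(), rand_vec(), rand_vec(), rand_vec()
record = bundle(bind(name, x), bind(role, y))

# Unbinding the record with `name` yields a noisy copy of x,
# recognizable by its similarity despite the superposed clutter.
probe = bind(record, name)
print(sim(probe, x))  # well above chance (about 0.5)
print(sim(probe, y))  # near 0: y is not the value bound to `name`
```

Note that `probe` is not exactly `x`; it is a degraded pattern that is nonetheless far closer to `x` than to any unrelated vector, which is what makes the representation robust to noise.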