A Cellular Neural Associative Array for
Symbolic Vision
Christos Orovas and James Austin
Abstract
A system that combines the descriptive power of symbolic
representations with the parallel, distributed processing
model of cellular automata and the speed and robustness of
connectionist symbol processing is described. Following a
cellular-automata-based approach, the system transforms
initial symbolic descriptions of patterns into
corresponding object-level descriptions in order to identify
patterns in complex or noisy scenes. A learning algorithm based
on hierarchical structural analysis is used to learn symbolic
descriptions of objects. The underlying symbolic processing
engine of the system is a neural-network-based associative memory
(AURA), which enables the system to operate at high speed. In
addition, the use of distributed representations allows both
efficient inter-cellular communication and compact storage of rules.