
Artificial Life

Summer 1994, Vol. 1, No. 4, Pages 373-389
(doi: 10.1162/artl.1994.1.4.373)
© 1995 Massachusetts Institute of Technology
Evolving Visual Routines

Traditional machine vision assumes that the vision system recovers a complete, labeled description of the world [10]. Recently, several researchers have criticized this model and proposed an alternative that treats perception as a distributed collection of task-specific, context-driven visual routines [1, 12]. Some of these researchers have argued that in natural living systems these visual routines are the product of natural selection [11]. So far, actual implementations have relied on hand-coded, task-specific visual routines (e.g., [3]). In this article we propose an alternative approach in which visual routines for simple tasks are created through artificial evolution. We present results from a series of runs on actual camera images, in which simple routines were evolved using genetic programming techniques [7]. The results are promising: the evolved routines correctly process up to 93% of the test images, better than any algorithm we were able to write by hand.
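The genetic programming mechanism cited as [7] can be illustrated with a minimal sketch. The task below, evolving an arithmetic expression to fit a target function, is a hypothetical stand-in chosen for brevity; the article's actual routines operate on camera images, and the function set, fitness measure, and mutation-only variation here are simplifying assumptions rather than the authors' setup.

```python
import random

# Minimal Koza-style genetic-programming loop (mutation-only sketch;
# full GP also uses subtree crossover). Programs are expression trees
# built from a small function set and terminal set.

FUNCTIONS = [('add', 2), ('sub', 2), ('mul', 2)]
TERMINALS = ['x', 1.0, 2.0]

def random_tree(depth=3):
    """Grow a random expression tree up to the given depth."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    name, arity = random.choice(FUNCTIONS)
    return (name,) + tuple(random_tree(depth - 1) for _ in range(arity))

def evaluate(tree, x):
    """Interpret an expression tree at input x."""
    if tree == 'x':
        return x
    if isinstance(tree, float):
        return tree
    op, left, right = tree
    a, b = evaluate(left, x), evaluate(right, x)
    if op == 'add':
        return a + b
    if op == 'sub':
        return a - b
    return a * b

def fitness(tree):
    """Total error against a toy target f(x) = x^2 + x (lower is better)."""
    return sum(abs(evaluate(tree, x) - (x * x + x)) for x in range(-5, 6))

def mutate(tree):
    """Crude mutation: sometimes replace the individual with a fresh tree."""
    return random_tree(3) if random.random() < 0.5 else tree

def evolve(pop_size=60, generations=30):
    """Truncation selection plus mutation over a fixed-size population."""
    pop = [random_tree() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[:pop_size // 2]
        pop = survivors + [mutate(random.choice(survivors))
                           for _ in range(pop_size - len(survivors))]
    return min(pop, key=fitness)

random.seed(0)
best = evolve()
print(best, fitness(best))
```

In the article's setting the terminals and functions would be image operations (the evolved visual routines) and fitness would be measured by performance on the camera-image test set rather than by arithmetic error.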