I486/I586/H400
Artificial Life as an approach to Artificial Intelligence
Spring Semester 2011

Meeting Times: 2:30pm – 3:45pm MW
Location: Informatics West, room 107
Professor: Larry Yaeger
Office: Informatics East, room 305
Office Hours: By appointment, any afternoon or early evening I'm not in a class or meeting
Phone: (812) 856-1845
Email: larryy (at) indiana (dot) edu
Textbook (required): Braitenberg, V., Vehicles: Experiments in Synthetic Psychology
All other reading materials are available online via links on this page.
For the distinction between I486, I586, and H400 expectations, see the Requirements section below.
Introduction
What is life?
What is intelligence? How
are the two linked, how can we quantify them, and how can we leverage our
knowledge of natural intelligence to create artificial intelligence? Since every example we have of natural
intelligence is the product of the evolution of nervous systems subject to
natural selection under ecological pressures, this course explores the idea
that our best hope of creating artificial intelligence is through the evolution
of artificial nervous systems subject to natural selection in an artificial
ecology—i.e., Artificial Life.
We will study seminal papers and fundamental technologies that enable
such an endeavor, including neural networks, genetic algorithms, emergent
phenomena, information theory, and measures of complexity.
Expectations
Some of the course is conceptual in nature, some requires
the ability to think algorithmically (as in computer programming), and some
requires a degree of comfort with math that includes probabilities, summations,
and logarithms, at least. A basic introduction to probability theory and information theory is a key part of the class, and some of the readings draw on information theory and even some differential equations (though a knowledge of differential equations is not required to learn the necessary material).
A substantial amount of reading is required for the course, with undergrads expected to read at least a couple of journal articles each week and graduate students twice that. There is also a term project, which can be a software project, a report on an experiment using existing software, or a review paper on a topic relevant to the class.
Schedule and Reading List
Links under the "Topics" heading are to PowerPoint lecture notes. You can download these in advance if you want something to take notes on, but be warned: they are subject to change.
In the topics, S# means Speaker #. (Speaker topics can be inferred from the schedule, but are listed explicitly at the bottom of this page.) In the reading assignments, B# means chapter # of the Braitenberg book. L# is Lecture #.
Class | Date | Topics | Reading Assignment (after the corresponding week's class) | Additional Graduate & Honors Reading Assignments | Extras (not required)
1a | MO 10 Jan | Class intro, L1-Intro to Artificial Life | | |
1b | WE 12 Jan | Intro to Braitenberg, L2-Is it alive? | Farmer & Belin, B1 | |
 | MO 17 Jan | No class - MLK Holiday | | |
2b | WE 19 Jan | | Goldberg, B2 | |
3a | MO 24 Jan | | | |
3b | WE 26 Jan | | | |
4a | MO 31 Jan | S3, B3, discussion, test prep | Exam 1, take-home | |
4b | WE 2 Feb | Exam 1 due, L5-Neural Networks Pt. 1 - Terms & Defs | Anderson, B4 | |
5a | MO 7 Feb | S4, B4, software demos, discussion | | |
5b | WE 9 Feb | | | |
6a | MO 14 Feb | S5, B5, ALife project ideas, discussion | | |
6b | WE 16 Feb | | | |
7a | MO 21 Feb | John Beggs guest lecture? | | |
7b | WE 23 Feb | L8-Neural Nets Pt 3 – Hebbian learning via Information Theory | | |
8a | MO 28 Feb | | | |
8b | WE 2 Mar | In-class Midterm (Exam 2) | B8 | |
9a | MO 7 Mar | Return and discuss exams, S8, B8, discussion | | |
9b | WE 9 Mar | | | |
 | MO 14 Mar | No Class - Spring Break | | |
 | WE 16 Mar | No Class - Spring Break | | |
10a | MO 21 Mar | Informal presentation of project ideas | | |
10b | WE 23 Mar | | Hinton & Nowlan, Parisi, Chalmers, B9 | |
11a | MO 28 Mar | | | |
11b | WE 30 Mar | L11-Organisms simulated and real (Koko, Dolphins, Betty, Alex) | | |
12a | MO 4 Apr | | | |
12b | WE 6 Apr | | Hillis, B11, B12 | |
13a | MO 11 Apr | Olaf Sporns guest lecture | Exam 3, take-home | |
13b | WE 13 Apr | Exam 3 due, L13-Evolution of Intelligence (movies) | Yaeger, B13, B14 | |
14a | MO 18 Apr | | | |
14b | WE 20 Apr | | | |
15a | MO 25 Apr | Project Demos | | |
15b | WE 27 Apr | Project Demos (Projects due Friday Apr 29) | | |
 | WE 4 May | Final Exam 12:30pm - 2:30pm (Exam 4) Info West 109 | | |
Students will be expected to attend class, do weekly
readings, and participate in discussions.
Each student will present and lead a discussion on the material in one
topic area (see list at bottom of page).
In addition there is a semester project, for which each
student must do one
of the following. Note that there are
different requirements for graduate and undergraduate students, and grades will
be assessed accordingly.
- Write a final paper demonstrating insight into one of the topic areas (undergrads 15-20 pages; grads 20-25 pages, which should be of publishable caliber as a review paper).
- Develop a functioning ALife simulator of at least modest complexity (students should consult with the teacher before writing; for graduate students this must be more than a simple CA or Conway's Game of Life, and the work should be of publishable caliber) and provide a succinct write-up describing the software and sample results.
- Turn in a technical paper describing the results of an ALife experiment, carried out with Breve, NetLogo, Tierra, Avida, Swarm, Second Life, Polyworld, or other ALife software (students should consult with the teacher in selecting the software and topic; for graduate students the experimental design and write-up should be of publishable caliber).
Students will present / demo their projects the last week of
class.
There will be four exams: two take-home exams, an in-class midterm, and an in-class final exam. The lowest test grade will be dropped; thus the final is optional if a student is satisfied with the first three test results. The midterm may include cumulative questions, but will primarily focus on the material since the previous test. The final exam is cumulative. Dates are indicated in the schedule above.
Graduate students have additional reading assignments, and
will be responsible for the material covered in those readings.
Course Structure
Generally, each week's theme will bridge weekends, being introduced by me in the WE lecture class and then followed up with readings before the next class on MO, in order to allow students maximum time to read and prepare for discussion. On the following MO, student presentations of a topic related to the reading materials will take place, followed by a discussion of one of the chapters from the Braitenberg book, any additional material I need to communicate, and class discussion of that week's topic. (There will be exceptions to this pattern, as indicated in the schedule above, at the start of the semester, around Spring Break, or when there are guest speakers.)
Each student will prepare one presentation during the
semester on a topic related to the reading materials. We will agree on and set dates in the
first class. If there are more than
13 students in the class, we will fit extra presentations in as needed. These presentations are to be 15 to 20
minutes only. In general, I hope
students will select one of the papers referenced by the primary reading
material and use that as the basis for their presentation. But if there is a topic of particular
interest to a student in the reading material itself and I have not covered
that topic in the lecture class, that will be acceptable. Related papers not taken from the
reference list may also be acceptable, but require my approval.
Each student will also work on one final project, with
higher expectations for graduate students.
As indicated above, this project may take the form of writing your own ALife simulator, performing experiments with an existing ALife simulator, or researching and writing on a topic in
the field, in order to accommodate all computer skill levels. Students are expected to give brief
presentations describing their semester project during the last week of class,
and projects are officially due at the end of the last week of class.
Grading
Each of the four tests, class participation, the topic
presentation, and the final project will contribute to the total grade as
follows:
Participation | 5 pts | (Yes, it really does count)
Gedanken experiment | 5 pts |
Presentation | 10 pts |
Project | 20 pts |
Test 1 (take-home) | 20 pts |
Test 2 - Midterm | 20 pts |
Test 3 (take-home) | 20 pts |
Test 4 - Final | 20 pts |
Total | 120 pts | (100 pts after dropping the lowest test score)
The extended gedanken experiment will be defined in the first week. It will pose a challenge that, if answered correctly by anyone in the class at any time before the end of the semester, will earn the credit for the entire class.
Grades will not be curved and will be assigned as follows
(this is the same as used in I101 and I210, but differs from some other
classes):
A+ | 98-100 | 4.0
A | 93-97 | 4.0
A- | 90-92 | 3.7
B+ | 85-89 | 3.3
B | 80-84 | 3.0
B- | 75-79 | 2.7
C+ | 70-74 | 2.3
C | 65-69 | 2.0
C- | 60-64 | 1.7
D+ | 55-59 | 1.3
D | 50-54 | 1.0
D- | 45-49 | 0.7
F | 0-44 | 0.0
Conduct
Cooperation on the extra credit problem is encouraged, as is
discussion in general. In fact, I intend
to set up an email list to facilitate discussions outside of class. However, tests must be taken
individually, both take-home and in-class.
Cheating will be reported to the Dean of Students, according to
university policies.
General Course
Description
Artificial Life is a broad discipline encompassing the
origins, modeling, and synthesis of natural and artificial living entities and
systems. Artificial Intelligence, as a discipline, tries to model and
understand intelligent systems and behavior, typically at the human level. This class will introduce core concepts
and technologies employed in Artificial Life systems that can be used to
approach the evolution of Artificial Intelligence in computers. Key themes include:
- bottom-up design and synthesis
principles,
- definitions and measurements of
life and intelligence,
- genetic algorithms,
- neural networks,
- the evolution of learning,
- the emergence of intelligence,
- computational ecologies, and
- information theory-based measures
of complexity.
Our path through these materials will lay the theoretical
groundwork for an approach to Artificial Intelligence based on the tenets and
practices of Artificial Life—an approach which utilizes evolution to
start small and work our way up a spectrum of intelligence, from the simplest
organisms to the most complex, rather than attempting to model human-level
intelligence from the outset.
Lectures and readings will be based on seminal papers and
introductory texts in these fields, drawing from the Artificial Life conference
proceedings, and technical papers by Donald Hebb
(from which we obtain Hebbian learning), Rumelhart and McClelland (editors of and authors in the
original Parallel Distributed Processing books that launched the modern neural
network field), Ralph Linsker ("Infomax" theoretical approach to neural network
learning), Hinton and Nowlan (the "Baldwin effect"), William James ("the greatest American psychologist"),
W. Grey Walter, Tom Ray, Karl Sims, Danny Hillis, and
others. We will also read and discuss Braitenberg's
seductive and influential Vehicles book.
References (Partial)
Anderson, J. A., General Introduction, p. xiii-xxi, Neurocomputing, Foundations of Research, ed. by J. A. Anderson and E. Rosenfeld, A Bradford Book, MIT Press, Cambridge, Massachusetts, 1988
Chalmers, D., "The Evolution of Learning: An Experiment
in Genetic Connectionism" in Connectionist Models, Proceedings of the 1990
Summer School, edited by D. S. Touretzky, J. L.
Elman, T. J. Sejnowski, G. E. Hinton, Morgan
Kaufmann, San Mateo, CA, 1991
Douglas, F., "Do fruit flies dream of electric bananas?", New Scientist, 14 February 2004
Farmer, J. D., and A. d'A.
Belin, "Artificial Life: The Coming Evolution" In Artificial
Life II, edited by C. Langton, C. Taylor, J. Farmer, and S. Rasmussen. Proceedings of the Artificial Life II
Conference (in 1990), Santa Fe Institute Studies in the Sciences of Complexity
Proc. Vol. X. Addison-Wesley,
Redwood City, CA, 1992
Frye, M.A. & Dickinson, M.H., "A signature of salience in the Drosophila brain", commentary on article by Swinderen & Greenspan, Nat. Neur. 6(6) 544-546, June 2003
Giurfa, M., Zhang, S., Jenett, A., Menzel, R., Mandyam, V., "The concepts of 'sameness' and 'difference' in an insect", Nature 410(6831) 930-933, 19 Apr 2001
Goldberg, D. E., A Gentle Introduction to Genetic
Algorithms, p. 1-23, Chapter 1 of Genetic Algorithms in Search, Optimization,
and Machine Learning, by D. E. Goldberg, Addison-Wesley 1989
Hebb, D. O., Introduction (p.
xi-xix) and Chapter 4, "The first stage of perception: growth of the
assembly" (p. 60-78), The Organization of Behavior, Wiley, New York, 1949
(with introduction, p. 43-56 from Neurocomputing,
Foundations of Research, ed. by J. A. Anderson and E. Rosenfeld, A Bradford
Book, MIT Press, Cambridge, Massachusetts, 1988)
Hillis, D. W., Intelligence as an
Emergent Behavior, p. 175-189, Daedalus, Journal of the American Academy of
Arts and Sciences, special issue on Artificial Intelligence, Winter 1988
Hinton, G. E. and Nowlan, S. J., How learning can guide evolution, Complex Systems, 1:495-502, 1987
Izhikevich, Eugene M., Simple Model of
Spiking Neurons, IEEE Transactions on Neural Networks (2003) 14:1569-1572
http://www.nsi.edu/users/izhikevich/publications/spikes.htm
Izhikevich, Eugene M., Which Model to Use
for Cortical Spiking Neurons? IEEE Transactions on Neural Networks (2004)
15:1063-1070
http://www.nsi.edu/users/izhikevich/publications/whichmod.htm
Izhikevich, Eugene M., Polychronization:
Computation With Spikes, Neural Computation (2006) 18:245-282
http://www.nsi.edu/users/izhikevich/publications/spnet.htm
James, W., Association, Chapter XVI of Psychology (Briefer
Course), p. 253-279, Holt, New York, 1890 (with introduction, p. 1-14 from Neurocomputing, Foundations of Research, ed. by J. A.
Anderson and E. Rosenfeld, A Bradford Book, MIT Press, Cambridge,
Massachusetts, 1988)
Langton, C. G., Artificial Life, preface to Artificial Life,
The Proceedings of an Interdisciplinary Workshop on the Synthesis and
Simulation of Living Systems held September, 1987 in Los Alamos, New Mexico,
Santa Fe Institute Studies in the Sciences of Complexity Proc. Vol. VI., edited by C. Langton, Addison Wesley, Redwood City, CA,
1989
Langton, C. G., Computation at the Edge of Chaos: Phase
Transitions and Emergent Computation, p. 12-37, Emergent Computation,
Proceedings of the Ninth Annual International Conference of the Center for
Nonlinear Studies on Self-organizing, Collective, and Cooperative Phenomena in
Natural and Artificial Computing Networks, Los Alamos, NM, 1989, ed. Stephanie
Forrest, North Holland, 1990
Linsker, R.,
"Towards an Organizing Principle for a Layered Perceptual Network" in
Neural Information Processing Systems, ed. by D. Z. Anderson. American Institute of Physics,
New York, 1988
Linsker, R., "Self-Organization in
a Perceptual Network", Computer 21(3), 105-117, March 1988
Parisi, D., S. Nolfi,
and F. Cecconi, "Learning, Behavior, and
Evolution", Tech. Rep. PCIA-91-14, Dept. of Cognitive Processes and
Artificial Intelligence, Institute of Psychology, C.N.R., Rome, June 1991.
(Appeared in Proceedings of ECAL-91, the First European Conference on Artificial Life, December 1991, Paris; also in Varela, F., and Bourgine, P., Toward a Practice of Autonomous Systems, MIT Press, 1991.)
Ray, T. S. 1992. Evolution, ecology and
optimization of digital organisms. Santa Fe Institute working paper
92-08-042
Rumelhart, D. E. and McClelland, J. L.,
PDP Models and General Issues of Cognitive Science, p. 110-146, Chapter 4 of
Parallel Distributed Processing, Explorations in the Microstructure of
Cognition, Volume 1: Foundations, ed. by D. E. Rumelhart,
J. L. McClelland, and the PDP Research Group, A Bradford Book, MIT Press,
Cambridge, Massachusetts, 1986
Schneider, T., "Information Theory Primer",
<http://www.lecb.ncifcrf.gov/~toms/paper/primer/>
Sims, K., Evolving Virtual Creatures, Computer Graphics, Annual Conference Series (SIGGRAPH '94 Proceedings), July 1994, pp. 15-22.
Walter, W. G. (1950), "An Imitation of Life",
Scientific American, 182(5), 42-45, May 1950
Yaeger, L. S., Computational Genetics, Physiology,
Metabolism, Neural Systems, Learning, Vision, and Behavior or PolyWorld: Life in a New Context, p. 263-298, Proceedings
of the Artificial Life III Conference (in 1992), ed. Chris Langton,
Addison-Wesley, 1994
Speaker Topics
S1 – Intro to Artificial Life or Is It Alive?
Origin of life; artificial
intelligence; Charles Darwin; Alan Turing; John von Neumann; other attempts to
formally define life; applications of artificial life; multi-agent modeling;
evolutionary trends (in complexity, size, cell types, etc.); ALife software on the web (Breve,
Sugarscape, Swarm, NetLogo,
Avida, Framsticks, etc.)
S2 – Genetic Algorithms
Function optimization issues
(Charbonneau "extras" reading); real world applications (there are
many); specific application in depth; Genetic Programming methods and
applications; John Holland; cross-over versus mutation (Holland, others); differences
between computational GAs and biological genetic
evolution; evolutionary art / aesthetic selection; evolutionary robotics; GA
software on the web
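To make the vocabulary above concrete, here is a minimal, self-contained sketch of a genetic algorithm with tournament selection, one-point crossover, and per-bit mutation, applied to the classic "max-ones" toy problem. It is illustrative only and unrelated to the GA packages mentioned above:

```python
import random

def evolve(fitness, genome_len=16, pop_size=30, generations=60,
           p_mut=0.05, seed=42):
    """Minimal GA over bit-string genomes: tournament selection,
    one-point crossover, and independent per-bit mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]

    def pick():  # size-2 tournament: the fitter of two random genomes
        a, b = rng.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            mom, dad = pick(), pick()
            cut = rng.randrange(1, genome_len)            # one-point crossover
            child = mom[:cut] + dad[cut:]
            child = [bit ^ (rng.random() < p_mut) for bit in child]  # mutate
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# "max-ones": fitness is simply the number of 1 bits in the genome
best = evolve(sum)
```

Swapping in a different fitness function (and genome decoding) is all that distinguishes this toy from the real applications listed above.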
S3 – Simulated Evolution
Avida; recent reimplementations
of Sims's blocky creatures; Sims's competing block creatures work; applications
of simulated evolution (economy, language, number use, art, music etc.);
evolutionary games (Spore, SimLife, Creatures, etc.);
simulated Lamarckian evolution; evolutionary software on the web (Breve, Sugarscape, NetLogo, Avida, Framsticks, etc.)
S4 – Neural Networks, Intro
Neural network applications
(there are many); distributed vs. symbolic AI and knowledge representation;
evolving neural networks; enhancements to BackProp
for training ANNs; levels of detail used to model neurons and neural nets;
differences between artificial and biological neurons and neural nets; neural
network software on the web (for biological modeling and/or
engineering/classification work)
S5 – Neural Networks, Association & Hebb
Applications of Hebbian learning (vision, PCA/ICA, signal separation,
robotics, etc.); anti-Hebbian learning; Hebbian-learning neural network software on the web
(biological or artificial)
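For orientation, the plain Hebbian rule itself is tiny: the weight change is proportional to the product of presynaptic and postsynaptic activity. This sketch (illustrative, not taken from any course software) repeatedly presents a single input pattern, and the weight vector aligns with it; the explicit renormalization is needed because plain Hebbian learning grows weights without bound, and anti-Hebbian learning simply flips the sign of the update:

```python
import numpy as np

def hebbian_step(w, x, eta=0.1):
    """One plain Hebbian update: dw = eta * y * x,
    where y = w . x is the linear postsynaptic activity."""
    y = w @ x
    return w + eta * y * x

rng = np.random.default_rng(0)
w = 0.1 * rng.normal(size=4)            # small random initial weights
x = np.array([1.0, 0.5, -0.5, 0.0])     # a single repeated input pattern
for _ in range(100):
    w = hebbian_step(w, x)
    w /= np.linalg.norm(w)              # renormalize: plain Hebb is unstable
# w is now (up to sign) aligned with the input direction x / ||x||
```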
S6 – Information Theory
Information metrics as expected
values; the Schneider paper; discussion of results from the "n-grams"
extra exercises; applications of information theory (neural spike timings,
feature selection for inputs to ANNs, medical imaging, genetic
analysis—site conservation, encoding information about the environment,
classifying protein sequences, etc.); Kullback-Leibler
Divergence; information theory software on the web
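"Information metrics as expected values" can be made concrete in a few lines: entropy is the expected value of -log2 p(x), and the Kullback-Leibler divergence is an expected log-ratio. A small illustrative sketch (not the code from the Schneider paper):

```python
import math

def entropy(p):
    """Shannon entropy in bits: E[-log2 p(x)] over the distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits: E_p[log2 p(x)/q(x)]."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

fair   = [0.25, 0.25, 0.25, 0.25]       # uniform over 4 symbols: 2 bits
biased = [0.5, 0.25, 0.125, 0.125]      # skewed distribution: 1.75 bits
```

A uniform source maximizes entropy, and D(p || q) is nonnegative, equaling zero only when p and q match, which is what makes it useful as a distance-like measure in the applications listed above.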
S7 – Neural Networks, Information theoretic approach
to neural network learning
Application of information
theory to ANNs (neural spike timings, feature selection for inputs to ANNs,
etc.); Bayesian
techniques and applications; analyzing neural networks with information theory
(capacity, statistical mechanics, etc.); application of statistical mechanics
to network analysis (percolation on graphs, à la John Beggs)
S8 – Any of the above!
Any of the above topics, and
then some
S9 – Combining evolution and learning
Belew's analysis and extension of the
Hinton & Nowlan work (see "extras");
other Parisi papers; David Ackley's experiments with
Lamarckian evolution; other computational experiments combining learning and
evolution (Miller & Todd, French & Messinger,
others)
S10 – Animal intelligence, intelligence in simulated
organisms
Any Clayton work with scrub jays
(see "extras"); any animal cognition studies; more depth on Koko,
Alex, Kanzi, others; other bee experiments (bilateral
symmetry, human facial recognition, etc.)
S11 – Intelligence as an emergent phenomenon
Dennett's "Kinds of
Minds" (see "extras"); Hillis's
coevolution of sorting algorithms; other approaches to modeling intelligence (Cyc, Hofstadter's Copycat, SeqSee,
etc.); emergence; order out of chaos (Prigogine,
others)
S12 – Evolution of intelligence
Other evolutionary software (Sugarscape, NetLogo, Avida, Framsticks, etc.);
evolution of complexity (see "extras")
S13 – Complexity
Complexity metric from Tononi, Sporns, & Edelman
(see "extras" next to Olaf Sporns's guest
lecture); overview of different complexity measures (see Seth, Crutchfield, and
Feldman "extras"); Tononi's "Phi"
measure of complexity and consciousness (see Tononi2 in "extras");
Kolmogorov descriptive/algorithmic complexity
Links to previous student presentations may be found on the
following class home pages from previous semesters. Click on the student's name in
parentheses to download the presentation.
If the "S#" indicator is also a link, then it will point to
supporting materials for the presentation.
(Other links on these pages do not work.)
Look here for examples of previous
semester projects.