Conceptual Farm

Shuen-Huei Guan    Sheng-Yao Cho    Yu-Te Shen    Rung-Huei Liang    Bing-Yu Chen    Ming Ouhyoung*
{drake,ven,edwards,liang}@cmlab.csie.ntu.edu.tw, abpnm3@r.postjobfree.com, *abpnm3@r.postjobfree.com
Communication and Multimedia Laboratory,
Dept. of Computer Science and Information Engineering / *Dept. of Information Management,
National Taiwan University
Abstract

Conceptual Farm is a virtual reality platform for generating and observing the behaviors of different autonomous characters. By providing 1) descriptions of characters' behaviors and 2) 3D animations and sound, life-like characters in a realistic habitat can be created, modified, and made to interact with both users and other characters in real time. The flexible, manageable, and scalable nature of Conceptual Farm makes it desirable for zoological research, general education, game and film production, and even the decorative arts.

1. Introduction

Imagine you have a paintbrush in your hand. You draw some pigeons with gray feathers in a square and teach them to walk, to eat, to fly, and so on. Then the pigeons start wandering around the ground, eating the feed you spread, and flying up from time to time. An idea occurs to you: why not add kids to play with the pigeons? So you draw three children in the square, and they run amongst the pigeons for fun as soon as running is taught. As new ideas continue to strike you, the picture becomes more and more enriched.

Interactive artificial life as imagined above has been presented before [2, 3, 5, 6, 12, 13]. Based on biological theories, these systems simulate animal behaviors realistically, but they are too complicated to be integrated with a compact and systematic interface that would let users create artificial lives easily. Some languages have been proposed for describing cognitive behaviors [4, 5, 7], but they are not intuitive enough.

To simplify the process of creating artificial lives, we designed a system, Conceptual Farm, for easy creation, modification, and observation of, and interaction with, virtual autonomous characters. Four important features of Conceptual Farm are:

1. Converting simple descriptions into complex behaviors: Complex autonomous behaviors are created using compact table-based descriptions (Figure 3) and flexible scripts instead of complicated code.
2. Semi-interactive editing: Behaviors are performed in real time as users adjust character properties, except when the animation clips need to be regenerated.
3. Extensibility: Conceptual Farm is a platform that can easily realize every type of animal, including insects, mammals, and fish, and can even be extended to autonomous characters of any kind, such as aircraft or soccer players.
4. Adoptability for AFX: Given the features above, the design philosophy of Conceptual Farm can be applied to the higher levels of the Animation Framework eXtension (AFX) in MPEG-4, which are not yet clearly defined.

The first two features, which are not easily achieved by the previously mentioned approaches, are our main contributions: our system reflects users' input in characters' behaviors directly, and users can focus on characters' behaviors without troublesome programming.

2. Overview
Figure 1: Flocking pigeons in Conceptual Farm
Based on Conceptual Farm, the Dove project creates pigeons in a 3D world that can act autonomously, as shown in Figure 1. These autonomous behaviors come from a table-based system, in which the three resources of a simulated character (Plans, Navigating Styles, and Appearances) are provided by users.

This paper describes the mechanism that enables an efficient approach to simulating artificial lives as in Dove, the implementation issues, and the impact on AFX.

3. Behavioral Model

In Conceptual Farm, all simulated characters can be viewed as autonomous agents that repeatedly perceive information from World and perform certain reactions, as illustrated in Figure 2. Each agent consists of Decision Maker, Pilot, and Performer, and has its own resources provided by users. During each simulation step, Decision Maker selects a Navigating Style and an Appearance according to the information obtained from World and passes them to Pilot and Performer, respectively. Pilot determines a new position, velocity, and orientation for the next instant, and Performer outputs the proper Appearances to World. In the implementation, World is the union of all the other characters.

Let us take an example of the whole process. There is a dove. At one moment, Decision Maker decides to eat food, then Pilot steps forward a little, and finally Performer plays a head-lowering animation and a cooing sound.
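The perceive-decide-navigate-perform cycle just described can be sketched in Python. This is a minimal illustration under our own assumptions; the names (simulate_step, always_wander, etc.) are ours, not Conceptual Farm's actual API:

```python
def simulate_step(agent, world, dt):
    """One cycle of an autonomous agent: decide, navigate, perform."""
    # Decision Maker: choose a Navigating Style and an Appearance
    # from what the agent perceives in World.
    nav_style, appearance = agent["decision_maker"](world, agent)
    # Pilot: update the position for the next instant from the velocity.
    agent["position"] = [p + v * dt
                         for p, v in zip(agent["position"], agent["velocity"])]
    # Performer: output the chosen Appearance (animation, sound) to World.
    world["outputs"].append((nav_style, appearance))

# Toy Decision Maker for the example: the dove always wanders.
def always_wander(world, agent):
    return "NS_2DWander", ("wander.asf", "wander.wav")

agent = {"decision_maker": always_wander,
         "position": [0.0, 0.0, 0.0],
         "velocity": [1.0, 0.0, 0.0]}
world = {"outputs": []}
simulate_step(agent, world, 0.5)
```

After one step the agent has moved along its velocity and World has received one Appearance to render.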
Figure 2: An agent with its resources vis-a-vis World. (Diagram: the autonomous agent's Decision Maker, Pilot, and Performer exchange information with World, drawing on the user-provided resources Plans, Navigating Styles, and Appearances, respectively.)

3.1 Decision Maker

Decision Maker, as a brain with sense organs, perceives information from World and then chooses the best responding action. Influences affecting decision-making include not only events caused by users through the system UI or by other characters in the virtual world, but also internal factors such as the current action, the degree of hunger, etc. Following the rules given by users, namely Plans, Decision Maker selects the action with the highest utility and sends it to Pilot and Performer.

Plans describe 1) the relationships (occurrence probabilities) between actions and percepts, 2) the mapping of each action to its corresponding Navigating Styles and Appearances, and 3) the scope of each percept. The following is a sample of (1) and (2) from a pigeon's Plans:

                  Wander       Eating      Pursuit
See(feed)         0.2          0.0         0.8
Destroyed(feed)   0.7          0.0         0.3
Hungry            0.0          0.5         0.5
Wander            0.7          0.0         0.3
Eating(feed)      0.2          0.8         0.0
Navigating Style  NS_2DWander  NS_2DStill  NS_2DPursuit
Animation         wander.asf   eat.asf     pursuit.asf
Sound             wander.wav   eat.wav     pursuit.wav

Figure 3: Plans. (Columns are Actions; the first five rows are Percepts with occurrence probabilities, and the last three rows give each action's Navigating Style and Appearances.)

Decision Maker consists of two agents: Percept Agent and Action Agent. We refer to the c4 architecture [2] of the MIT Media Lab.

The Percept Agent senses all external events within the sensory scope of each percept. Internal self-awareness, such as the current action, is also sensed. Users can dynamically add or remove customized percept functions as well as the built-in ones, and the behaviors will change immediately. Plans not only enable realistic simulation of sophisticated behaviors through the uncertain nature of the probability values, but also, as tables, provide a compact form that is easy to manipulate.

The Action Agent is a utility-based agent. It uses the percepts stored by Percept Agent and calculates a score for every candidate action using (1). If nothing is sensed within the scope of a percept, the probability of that percept is set to zero. The resulting probability of each action is proportional to its score.

    Score(Aj) = Σ_{i=0..N} Prob(Pi, Aj) · δ(Pi)        (1)

where
    Aj: the jth action
    Pi: the ith percept
    N: the number of percepts
    Prob(Pi, Aj): the occurrence probability of the ith percept when the jth action happens
    δ(Pi): 1 if Pi occurs, 0 otherwise.
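The scoring rule (1), applied to the sample Plans of Figure 3, can be sketched as follows. This is a minimal Python sketch under our own naming; score_actions and the set of active percepts are illustrative, not the system's API:

```python
# Prob(Pi, Aj) entries from the pigeon Plans in Figure 3.
PLANS = {
    #                  Wander  Eating  Pursuit
    "See(feed)":       (0.2,   0.0,    0.8),
    "Destroyed(feed)": (0.7,   0.0,    0.3),
    "Hungry":          (0.0,   0.5,    0.5),
    "Wander":          (0.7,   0.0,    0.3),
    "Eating(feed)":    (0.2,   0.8,    0.0),
}
ACTIONS = ("Wander", "Eating", "Pursuit")

def score_actions(active_percepts):
    """Score(Aj) = sum over i of Prob(Pi, Aj) * delta(Pi)."""
    scores = [0.0] * len(ACTIONS)
    for percept, probs in PLANS.items():
        if percept in active_percepts:      # delta(Pi) = 1, else the row adds 0
            for j, p in enumerate(probs):
                scores[j] += p
    return dict(zip(ACTIONS, scores))

# A hungry pigeon that sees feed while currently wandering:
scores = score_actions({"See(feed)", "Hungry", "Wander"})
best = max(scores, key=scores.get)   # the action sent to Pilot and Performer
```

With these percepts, Pursuit scores 0.8 + 0.5 + 0.3 = 1.6 and wins over Wander (0.9) and Eating (0.5), so the pigeon pursues the feed.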
After selecting the action with the highest score, the Action Agent will send the Navigating Style to Pilot, while the corresponding animation and sound will be sent to Performer.

3.2 Pilot

For each character, Pilot determines its own path and orientation according to the Navigating Style (e.g., seek, pursuit, and wander) received from Decision Maker.

Pilot is built on Reynolds's OpenSteer library¹. Extending Reynolds's 16 common steering styles for autonomous agents [9], we provide 31 built-in Navigating Styles. Although the built-in styles meet the demands of most cases, users can also provide customized Navigating Styles through scripts. We provide a high-level script based on Small², which enhances the flexibility of Conceptual Farm.

For example, when a bird is landing with the built-in Navigating Style NS_YParabolicUp, it exhibits the unrealistic velocity-alignment problem illustrated on the left of Figure 4. The problem can be solved by a script such as the following:

public doLanding (force, elapsedTime) {
    applyForce (force, elapsedTime)     // update position and velocity
    velocity[1] = 0                     // drop the vertical (y) component
    calculateOrientation (velocity)     // align orientation with the y-zeroed velocity
}

First, the script calls applyForce to get the new position and velocity; then it applies calculateOrientation with the y-zeroed velocity to obtain a reasonable orientation, with the result illustrated on the right side of Figure 4.

Figure 4: Two landing styles.

¹ http://opensteer.sourceforge.net
² http://www.compuphase.com/small.htm
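The effect of zeroing the vertical component before computing the orientation can be shown numerically. This is a plain-Python sketch of the idea behind the doLanding script above; the function name is ours, and real orientation handling would also track an up vector:

```python
import math

def heading_from_velocity(velocity, zero_y=True):
    """Return a unit facing vector derived from a 3D velocity.

    With zero_y=True the vertical (y) component is dropped first,
    mirroring `velocity[1] = 0` in the landing script, so a descending
    bird keeps a level body instead of pitching down along its velocity.
    """
    x, y, z = velocity
    if zero_y:
        y = 0.0
    norm = math.sqrt(x * x + y * y + z * z)
    if norm == 0.0:
        return (0.0, 0.0, 1.0)  # degenerate case: keep a default forward axis
    return (x / norm, y / norm, z / norm)

# A landing bird moving forward (+z) while dropping (-y):
v = (0.0, -3.0, 4.0)
aligned = heading_from_velocity(v, zero_y=False)  # pitched down: (0.0, -0.6, 0.8)
fixed   = heading_from_velocity(v, zero_y=True)   # level:        (0.0, 0.0, 1.0)
```

The velocity-aligned heading tilts the character downward during descent, while the y-zeroed heading keeps it facing forward, which matches the right side of Figure 4.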
3.3 Performer

Performer is responsible for the perceptible outputs of autonomous characters: animation, sound, or any expressive media provided by users. Our system handles most kinds of sounds via FMOD³. Several exporters were implemented to allow users to make animations with their favorite tools, such as Maya and 3D Studio Max.

³ http://www.fmod.org

4. Experimental Results

We used Conceptual Farm to build Dove, a simulation of the square at the Chiang Kai-Shek Memorial Hall in Taipei, where pigeons gather to find food and clean their feathers, dogs wander around, and users can spread crumbs and run amongst the pigeons (Figure 6). Taking the pigeon resource as an example, there are 13 actions and 8 percepts in its Plans, 12 animations and 4 sounds for its Appearances, and 2 scripts for its Navigating Styles.

5. AFX & Conceptual Farm

According to the characteristics mentioned above, the concept of our system can be applied to enhance current multimedia standards.

As illustrated in Figure 5, MPEG-4 proposed AFX (Animation Framework eXtension) [1] in order to provide a standardized description for computer animation and interaction, similar to its video and audio standards. AFX is layered into six components, which are, in top-down order: cognitive, behavioral, biomechanical, physics, modeling, and geometry. The last four are specified clearly and in detail, while the cognitive and behavioral components are not, since they are AI-intensive and difficult to formalize.

Figure 5: AFX in MPEG-4. (Diagram: audio, video, and AFX bitstreams are each decoded from their buffers and then composited and rendered; the AFX stream carries the decoded Animation Framework eXtension data.)

To make up for this limitation in current AFX, the concept of our system provides a solution for editing
and functioning⁴ data in the cognitive and behavioral layers, and connects them with multimedia in the other components of AFX, and even with other parts of MPEG-4. The formalized data used to describe characters' behaviors in Conceptual Farm is also suitable for storage and transmission, which is characteristic of standardized data.

⁴ We use the word "functioning" instead of "playing" because artificial intelligence is involved in addition to animation.

6. Future Work and Conclusions

Conceptual Farm provides an easy way to create virtual lives. There are, however, some limitations. Characters cannot adapt themselves to the environment because they cannot modify their own Plans. In addition, our table-based input mechanism cannot replace hard-coded logic for user-character interaction, which depends on the simulated character, the environment, and the input device.

In summary, we demonstrate a novel approach to creating interactive artificial lives with our virtual reality system, Conceptual Farm. The system simulates character behaviors realistically with compact and formalized input descriptions and preserves flexibility by introducing scripts. It also suggests a practical method for standardizing and easily manipulating the dynamic and real-time properties within the cognitive and behavioral levels of the current AFX in MPEG-4.

Figure 6: Simulating pigeons with Conceptual Farm

7. Acknowledgements

This research was supported in part by National Science Council grant 92-2622-E002-002. We are grateful to Kuei-Yuan Zheng, Ping-Chun Kuo, Wei-Chih Liao, and Tien-Jung Huang (National Taiwan University of Arts) for their technical help. We also thank Wan-Chun Ma for his comments on MPEG-4 AFX.

8. References

[1] ISO/IEC 14496-16:2003(E), Information technology - Coding of audio-visual objects - Part 16: Animation Framework eXtension (AFX).

[2] R. Burke, D. Isla, M. Downie, Y. Ivanov, and B. Blumberg, "Creature Smarts: The Art and Architecture of a Virtual Brain," Proc. of Game Developers Conference, San Jose, CA, 2001, pp. 147-166.

[3] R. Burke and B. Blumberg, "Using an Ethologically-Inspired Model to Learn Apparent Temporal Causality for Planning in Synthetic Creatures," Proc. of the First International Joint Conference on Autonomous Agents and Multiagent Systems (AAMAS), Bologna, Italy, 2002, pp. 326-333.

[4] L. Chen, K. Bechkoum, and G. Clapworthy, "A logical approach to high-level agent control," Proc. of the Fifth International Conference on Autonomous Agents, Montreal, Quebec, Canada, May 2001, pp. 1-8.

[5] J. Funge, X. Tu, and D. Terzopoulos, "Cognitive modeling: knowledge, reasoning and planning for intelligent characters," Proc. of SIGGRAPH, ACM Press, Los Angeles, CA, August 1999, pp. 29-38.

[6] M.P. Johnson, A. Wilson, B. Blumberg, C. Kline, and A. Bobick, "Sympathetic Interfaces: Using a Plush Toy to Direct Synthetic Characters," Proc. of CHI, ACM Press, Pittsburgh, May 1999, pp. 152-158.

[7] J.E. Laird, "It Knows What You're Going to Do: Adding Anticipation to a Quakebot," Proc. of the Fifth International Conference on Autonomous Agents, ACM Press, Montreal, Canada, 2001, pp. 385-392.

[8] K. Perlin and A. Goldberg, "Improv: A System for Scripting Interactive Actors in Virtual Worlds," Proc. of SIGGRAPH, ACM Press, New Orleans, August 1996, pp. 205-216.

[9] C. Reynolds, "Steering Behaviors for Autonomous Characters," Proc. of Game Developers Conference, San Jose, CA, 1999, pp. 763-782.

[10] K. Sims, "Evolving Virtual Creatures," Proc. of SIGGRAPH, ACM Press, New York, July 1994, pp. 15-22.

[11] D. Terzopoulos, "Artificial life for computer graphics," Communications of the ACM, Vol. 42, No. 8, August 1999, pp. 33-42.

[12] X. Tu and D. Terzopoulos, "Artificial Fishes: Physics, Locomotion, Perception, Behavior," Proc. of SIGGRAPH, ACM Press, New York, 1994, pp. 43-49.

[13] H.S. Yang, H.-J. Park, and Y.-J. Cho, "Interactive Artificial Life based on Behavior and Perception in a Virtual Environment," Proc. of International Conference on Multimedia & Expo, IEEE, New York, July 30 - August 2, 2000, pp. 207-210.