“A Nervous System Model for Direct Dynamics Animation”
Direct dynamics animation consists of synthesizing the movements of a model from the specification of its physical properties (mass and moment of inertia), the joint constraints between its constituent parts, the conditions of contact with other bodies, and the forces that act on it. This approach has the advantage of generating animations with physical realism. The problem, which remains an active research topic, is the control of the model: “What forces must be applied to the model to generate the desired movement?”
The solution of the problem presented in this work assumes that the studied model consists of a structure of rigid bodies connected by joints, whose movements are generated by internal actuators with forces defined by a nervous system. Using artificial neural networks and evolutionary computation, the proposed controller is capable of adapting itself to control different articulated models and of generating varied types of movement while maintaining stability even under small variations of the terrain.
At its core, the presented model has a Central Pattern Generator (CPG) based on neural oscillators, whose activity is regulated by the sensory module to maintain the balance of the structure and the stability of the movement in response to environment variations.
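The abstract does not specify which neural-oscillator model the CPG uses. As a minimal illustrative sketch, a common choice in the CPG literature is the two-neuron Matsuoka oscillator with mutual inhibition and adaptation; the parameter values below are generic examples, not taken from this work:

```python
def matsuoka_step(state, dt, tau=0.25, tau_a=0.5, beta=2.5, w=2.5, u=1.0):
    """One Euler step of a two-neuron Matsuoka oscillator.

    state = (x1, x2, v1, v2): membrane potentials and adaptation variables.
    tau, tau_a: time constants; beta: adaptation gain; w: mutual inhibition;
    u: tonic input. These values are illustrative defaults only.
    """
    x1, x2, v1, v2 = state
    y1, y2 = max(0.0, x1), max(0.0, x2)          # rectified firing rates
    dx1 = (-x1 - beta * v1 - w * y2 + u) / tau   # neuron 1, inhibited by y2
    dx2 = (-x2 - beta * v2 - w * y1 + u) / tau   # neuron 2, inhibited by y1
    dv1 = (-v1 + y1) / tau_a                     # adaptation tracks firing
    dv2 = (-v2 + y2) / tau_a
    return (x1 + dt * dx1, x2 + dt * dx2, v1 + dt * dv1, v2 + dt * dv2)

def simulate(steps=3000, dt=0.01):
    """Integrate the oscillator and return its output y1 - y2 over time."""
    state = (0.1, 0.0, 0.0, 0.0)                 # small asymmetry breaks the tie
    out = []
    for _ in range(steps):
        state = matsuoka_step(state, dt)
        out.append(max(0.0, state[0]) - max(0.0, state[1]))
    return out

signal = simulate()
```

The alternating output `y1 - y2` would drive a joint actuator; in a full controller, sensory feedback terms would be added to the tonic input `u` to entrain the rhythm to the body and terrain.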
For adaptation to the articulated structure and the learning of movements, the controller has a cognitive module responsible for searching the neural parameters through genetic algorithms, and for evolving the feedback networks (sensory responses to environment variations) through genetic programming.
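The abstract does not detail the genetic algorithm used by the cognitive module. A minimal sketch of the idea, with an assumed real-valued encoding of oscillator parameters and a toy fitness function standing in for the physics-based gait evaluation (the `TARGET` vector and all operator choices are hypothetical):

```python
import random

random.seed(0)  # deterministic run for reproducibility

# Hypothetical "ideal" parameter vector; in the real system, fitness would
# come from simulating the articulated model and scoring the resulting gait.
TARGET = [0.25, 0.5, 2.5, 2.5]

def fitness(genome):
    """Toy fitness: negative squared distance to TARGET (higher is better)."""
    return -sum((g - t) ** 2 for g, t in zip(genome, TARGET))

def evolve(pop_size=40, genes=4, generations=60, mut_sigma=0.1):
    """Truncation selection + blend crossover + Gaussian mutation."""
    pop = [[random.uniform(0.0, 3.0) for _ in range(genes)]
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        elite = scored[: pop_size // 4]          # keep the best quarter
        children = []
        while len(children) < pop_size - len(elite):
            a, b = random.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]          # blend crossover
            child = [g + random.gauss(0, mut_sigma) for g in child]  # mutation
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

best = evolve()
```

In the described system, each fitness evaluation would instead run the articulated model in the physics simulator and score the resulting movement, and a separate genetic-programming pass would evolve the feedback networks.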
Results are presented for the control of humanoid, cheetah, frog, luxo, and luxo-2 models, the last two sharing the same topology but differing in body sizes and joint degrees of freedom. All models are tested on flat terrain, and ODE (http://www.ode.org/) was used for the physics simulation.