Self-modeling robot

video://www.youtube.com/watch?v=rkGaQqkyWII

Here’s a tight ass robot!

While simple yet robust behaviors can be achieved without a model at all, here we show how low-level sensation and actuation synergies can give rise to an internal predictive self-model, which in turn can be used to develop new behaviors. We demonstrate, both computationally and experimentally, how a legged robot automatically synthesizes a predictive model of its own topology (where and how its body parts are connected) through limited yet self-directed interaction with its environment, and then uses this model to synthesize successful new locomotive behavior before and after damage. The legged robot learned how to move forward based on only 16 brief self-directed interactions with its environment. These interactions were unrelated to the task of locomotion, driven only by the objective of disambiguating competing internal models. These findings may help develop more robust robotics, as well as shed light on the relation between curiosity and cognition in animals and humans: creating models through exploration, and using them to create new behaviors through introspection.
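The quoted abstract describes the core loop: keep several candidate self-models, pick the physical test the models disagree about most, run it on the real robot, and keep the models that best explain what actually happened. Here is a minimal toy sketch of that loop in Python. Everything in it (the linear stand-in models, the true_robot function, the mutation step, the population sizes) is a hypothetical simplification for illustration; the actual robot evolves candidate body topologies in a physics simulator rather than tuning a few numbers.

import random

NUM_MODELS = 16          # candidate self-models kept in parallel (assumption)
NUM_INTERACTIONS = 16    # matches the 16 self-directed interactions in the text

def make_random_model():
    # Stand-in for a candidate body topology: here just four random weights.
    return [random.uniform(-1, 1) for _ in range(4)]

def predict(model, action):
    # Sensor reading this candidate model expects for a given action.
    return sum(w * a for w, a in zip(model, action))

def true_robot(action):
    # Stand-in for the physical robot's actual sensor response.
    hidden = [0.5, -0.3, 0.8, 0.1]   # the "real" body the robot must discover
    return sum(w * a for w, a in zip(hidden, action))

def disagreement(models, action):
    # How much the candidate models disagree about this action's outcome.
    preds = [predict(m, action) for m in models]
    mean = sum(preds) / len(preds)
    return sum((p - mean) ** 2 for p in preds)

models = [make_random_model() for _ in range(NUM_MODELS)]

for _ in range(NUM_INTERACTIONS):
    # Curiosity-driven test: pick the action the models disagree about most.
    candidates = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(50)]
    action = max(candidates, key=lambda a: disagreement(models, a))

    # Execute it and keep the models that best explain the observed result.
    observed = true_robot(action)
    models.sort(key=lambda m: abs(predict(m, action) - observed))
    survivors = models[: NUM_MODELS // 2]

    # Refill the population with mutated copies of the survivors.
    models = survivors + [
        [w + random.gauss(0, 0.1) for w in random.choice(survivors)]
        for _ in range(NUM_MODELS - len(survivors))
    ]

print("best surviving self-model:", [round(w, 2) for w in models[0]])

The point of the disagreement-maximizing step is that each physical test is chosen to be maximally informative about which self-model is right, which is why so few interactions (16 in the paper) are enough before the winning model can be used to plan a gait.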

via http://zedomax.com/image/icon/make.jpg

2 Responses to Self-modeling robot

  1. girrrrrrr2 says:

    awesome… but this is too techie for me at the moment…

  2. Izl says:

    there's robots that teach themselves how to walk and everything


