Here’s a tight ass robot!
While simple yet robust behaviors can be achieved without a model at all, here we show how low-level sensation and actuation synergies can give rise to an internal predictive self-model, which in turn can be used to develop new behaviors. We demonstrate, both computationally and experimentally, how a legged robot automatically synthesizes a predictive model of its own topology (where and how its body parts are connected) through limited yet self-directed interaction with its environment, and then uses this model to synthesize successful new locomotive behavior before and after damage. The legged robot learned how to move forward based on only 16 brief self-directed interactions with its environment. These interactions were unrelated to the task of locomotion, driven only by the objective of disambiguating competing internal models. These findings may help develop more robust robotics, as well as shed light on the relation between curiosity and cognition in animals and humans: creating models through exploration, and using them to create new behaviors through introspection.
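The core trick in that abstract — choosing actions purely to disambiguate competing internal models — can be sketched in a few lines. This is a toy illustration, not the paper's actual method: the "body" here is a single unknown parameter, and names like `observe` and `TRUE_SLOPE` are invented for the example. The loop picks whichever action makes the candidate models disagree most, observes the outcome, and discards models the observation contradicts.

```python
# Toy sketch of model disambiguation via self-directed experiments.
# Hypothetical "true" body: the outcome of action a is TRUE_SLOPE * a.
TRUE_SLOPE = 3

def observe(action):
    """Ground-truth sensor reading for an action (stands in for the robot)."""
    return TRUE_SLOPE * action

# Competing candidate self-models, each a different guessed slope.
candidates = [1, 2, 3, 4, 5]
actions = list(range(1, 11))

for step in range(16):  # the paper's robot used 16 brief interactions
    if len(candidates) == 1:
        break
    # Choose the action that maximizes disagreement among model predictions.
    def disagreement(a):
        preds = [m * a for m in candidates]
        return max(preds) - min(preds)
    a = max(actions, key=disagreement)
    y = observe(a)
    # Keep only the models consistent with what was actually observed.
    candidates = [m for m in candidates if m * a == y]

print(candidates)  # the surviving self-model
```

In this contrived case one well-chosen experiment settles everything; the point is that the action was selected for informativeness, not for the locomotion task itself.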
awesome… but this is too techie for me at the moment…
there's robots that teach themselves how to walk and everything