Autonomous sequence generation for a neural dynamic robot

Neurally inspired robotics already has a long history that includes reactive systems emulating reflexes, neural oscillators that generate movement patterns, and neural networks as trainable filters for high-dimensional sensory information. Neural inspiration has been less successful at the level of cognition. Decision-making, planning, and building and using memories, for instance, are more often addressed in terms of computational algorithms than through neural process models. To move neural process models beyond reactive behavior toward cognition, the capacity to autonomously generate sequences of processing steps is critical. We review a potential solution to this problem that is based on strongly recurrent neural networks described as neural dynamic systems. Their stable states perform elementary motor or cognitive functions while coupled to sensory inputs. The state of the neural dynamics transitions to a new motor or cognitive function when a previously stable neural state becomes unstable. Only when a neural robotic system is capable of acting autonomously does it become useful to a human user. We demonstrate how a neural dynamic architecture that supports autonomous sequence generation can engage in such interaction. A human user presents colored objects to the robot in a particular order, thus defining a serial order of color concepts. The user then exposes the system to a visual scene that contains the colored objects in a new spatial arrangement. The robot autonomously builds a scene representation by sequentially bringing objects into the attentional foreground. Scene memory updates if the scene changes. The robot performs visual search and then reaches for the objects in the instructed serial order. In doing so, the robot generalizes across time and space, is capable of waiting when an element is missing, and updates its action plans online when the scene changes. The entire flow of behavior emerges from a time-continuous neural dynamics without any controlling or supervisory algorithm.
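The instability-driven sequencing described in the abstract can be illustrated with a very schematic sketch. The sketch below is an assumption-laden toy model, not the paper's actual architecture: each step of the serial order is a pair of near-binary dynamic nodes, an "intention" node and a "condition-of-satisfaction" (CoS) node, integrated with explicit Euler steps; all weights, time constants, and the surrogate sensory signal are illustrative choices. When a step's CoS becomes active, it destabilizes the previously stable intention state, a completion memory keeps that step off, and the next step's intention becomes stable, so the sequence unfolds from the continuous dynamics alone, without a supervisory loop over "current step".

```python
import numpy as np

def f(u, beta=100.0):
    """Steep sigmoid: near-binary output once activation crosses zero."""
    return 1.0 / (1.0 + np.exp(np.clip(-beta * u, -50.0, 50.0)))

n = 3                         # three steps of the instructed serial order
tau, h, dt = 10.0, -1.0, 1.0  # time constant, resting level, Euler step
u_int = np.full(n, h)         # intention nodes (drive the current behavior)
u_cos = np.full(n, h)         # condition-of-satisfaction (CoS) nodes
done = np.zeros(n)            # latched memory that a step has completed
active_time = np.zeros(n)     # stand-in for the unfolding action
order = []                    # records in which order steps complete

for t in range(1000):
    f_int, f_cos = f(u_int), f(u_cos)
    # a step may only start once its predecessor has completed
    precond = np.concatenate(([1.0], done[:-1]))
    # surrogate sensory signal: "the action finished" after the intention
    # has been active for a while (on the robot this comes from perception)
    active_time = np.where(f_int > 0.9, active_time + dt, 0.0)
    sensed = (active_time > 50.0).astype(float)
    # intention: boosted by its precondition, destabilized by its own CoS,
    # and kept off by the completion memory so the sequence moves forward
    u_int += dt / tau * (-u_int + h + 2.0 * precond - 4.0 * f_cos - 2.0 * done)
    # CoS: needs both the active intention and the sensory completion signal
    u_cos += dt / tau * (-u_cos + h + 0.6 * f_int + 0.6 * sensed)
    for i in np.flatnonzero((f_cos > 0.9) & (done == 0)):
        order.append(int(i))
    done = np.maximum(done, (f_cos > 0.9).astype(float))

print(order)  # → [0, 1, 2]: the steps complete in the instructed order
```

Note the design choice this toy model shares with the reviewed approach: no variable holds "the current step". Which behavior is active is read off from which attractor the dynamics currently occupies, and transitions happen only through instabilities triggered by the (here simulated) sensory signal.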

Metadata
Author:Jan Tekülve, Adrien Fois, Yulia Sandamirskaya, Gregor Schöner
URN:urn:nbn:de:hbz:294-72775
DOI:https://doi.org/10.3389/fnbot.2019.00095
Parent Title (English):Frontiers in neurorobotics
Subtitle (English):scene perception, serial order, and object-oriented movement
Publisher:Frontiers Media
Place of publication:Lausanne
Document Type:Article
Language:English
Date of Publication (online):2020/07/02
Date of first Publication:2019/11/15
Publishing Institution:Ruhr-Universität Bochum, Universitätsbibliothek
Tag:Open Access Fonds
autonomous robot; neural dynamic modeling; reaching movement; scene perception; sequence generation
Volume:13
Issue:Article 95
First Page:95-1
Last Page:95-18
Note:
Article Processing Charge funded by the Deutsche Forschungsgemeinschaft (DFG) and the Open Access Publication Fund of Ruhr-Universität Bochum.
Institutes/Facilities:Institut für Neuroinformatik
Dewey Decimal Classification:General works, computer science, information science / Computer science
open_access (DINI-Set):open_access
Licence (English):Creative Commons - CC BY 4.0 - Attribution 4.0 International