From perception to action and vice versa: A new architecture showing how perception and action can modulate each other simultaneously

Publisher: IEEE - The Institute of Electrical and Electronics Engineers, Inc
Artificial vision systems cannot process all the information they receive from the world in real time, because doing so is prohibitively expensive and inefficient in terms of computational cost. However, inspired by biological perception systems, it is possible to develop an artificial attention model that selects only the relevant part of the scene, as human vision does. From the Automated Planning point of view, a relevant area can be seen as one where the objects involved in the execution of a plan are located. Thus, the planning system should guide the attention model to track the relevant objects. At the same time, the perceived objects may constrain the plan or provide new information that suggests modifying it, so a plan that is being executed should be adapted or recomputed taking into account the actual information perceived from the world. In this work, we introduce an architecture that creates a symbiosis between the planning and attention modules of a robotic system, linking visual features with high-level behaviours. The architecture is based on the interaction of an oversubscription planner, which produces plans constrained by the information perceived by the vision system, and an object-based attention system, able to focus on the objects relevant to the plan being executed.
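The mutual modulation the abstract describes can be sketched as a simple control loop: perception constrains which goals are feasible, the resulting plan tells attention which objects matter, and attention in turn filters what is perceived next. The following is a minimal illustrative sketch, not the paper's implementation; all names (`plan`, `relevant_objects`, `attend`) and the utility-based goal selection are hypothetical stand-ins for the actual oversubscription planner and object-based attention system.

```python
def plan(goals, visible_objects):
    # Oversubscription-style selection (toy version): keep only goals whose
    # target object has actually been perceived, ordered by utility.
    feasible = [g for g in goals if g["object"] in visible_objects]
    return sorted(feasible, key=lambda g: -g["utility"])

def relevant_objects(current_plan):
    # The plan being executed tells attention which objects matter now.
    return {step["object"] for step in current_plan}

def attend(world, focus):
    # Object-based attention: perceive only the focused part of the scene.
    return {obj for obj in world if obj in focus}

# One cycle of the feedback loop: perception constrains planning, and the
# plan redirects perception. Object names and utilities are made up.
goals = [
    {"object": "red_ball", "utility": 5},
    {"object": "blue_box", "utility": 3},
    {"object": "green_cup", "utility": 8},  # not in the scene, so dropped
]
world = {"red_ball", "blue_box"}
current_plan = plan(goals, world)        # perception constrains planning
focus = relevant_objects(current_plan)   # planning guides attention
perceived = attend(world, focus)         # attention filters perception
```

In the real architecture, the replanning step would fire whenever newly perceived objects invalidate or improve the current plan, closing the loop continuously rather than in a single pass.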
Proceedings of the 6th European Conference on Mobile Robots, held September 25-27, 2013, in Barcelona, Spain.
Intelligent robots, Mobile robots, Object tracking, Path planning, Robot vision, Color, Computational modeling, Computer architecture, Image color analysis, Planning, Robots, Visualization
Bibliographic citation
2013 European Conference on Mobile Robots (ECMR), pp. 268-273.