2007
Conference Paper
Title

GPU-accelerated affordance cueing based on visual attention

Abstract
This work focuses on the relevance of visual attention in affordance-inspired robotics. Among the robotics approaches related to Gibson's concept of affordances, attention cues have so far been treated only rudimentarily. We introduce this concept within the perception layer of our affordance-inspired robotic framework and, in this context, present a high-performance visual attention system that handles invariants in the optical array. This layer forms the basis for more sophisticated tasks, such as a "curiosity drive" that helps a robotic agent explore its environment. Our attention system, derived from VOCUS, exploits the parallel design of the graphics processing unit (GPU) and reaches real-time performance when processing online video streams at VGA resolution on a single computer platform. GPU-VOCUS is currently the fastest known visual attention system running on standard personal computers.
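The core operation behind VOCUS-style attention maps is a center-surround contrast computation over per-pixel feature maps (intensity, color, orientation), which maps naturally onto the GPU's data-parallel execution model. The sketch below is not the authors' GPU-VOCUS implementation; it is a minimal CUDA illustration of a single center-surround pass over one intensity map, with the frame size, surround radius, and kernel name (centerSurround) chosen purely for illustration.

```cuda
// Minimal sketch (not GPU-VOCUS itself): one center-surround contrast pass,
// the basic building block of VOCUS-style saliency maps, as a CUDA kernel.
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

// Each pixel's intensity is compared with the mean of its (2*R+1)^2
// neighbourhood; the absolute difference forms a local contrast (feature) map.
__global__ void centerSurround(const float* in, float* out, int w, int h, int R)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= w || y >= h) return;

    float sum = 0.0f;
    int count = 0;
    for (int dy = -R; dy <= R; ++dy)
        for (int dx = -R; dx <= R; ++dx) {
            int xx = min(max(x + dx, 0), w - 1);   // clamp at the image border
            int yy = min(max(y + dy, 0), h - 1);
            sum += in[yy * w + xx];
            ++count;
        }
    float surround = sum / count;
    out[y * w + x] = fabsf(in[y * w + x] - surround);
}

int main()
{
    const int w = 640, h = 480, R = 4;             // VGA frame, illustrative radius
    std::vector<float> host(w * h, 0.5f);
    host[240 * w + 320] = 1.0f;                    // one bright "pop-out" pixel

    float *dIn, *dOut;
    cudaMalloc((void**)&dIn,  w * h * sizeof(float));
    cudaMalloc((void**)&dOut, w * h * sizeof(float));
    cudaMemcpy(dIn, host.data(), w * h * sizeof(float), cudaMemcpyHostToDevice);

    dim3 block(16, 16);
    dim3 grid((w + block.x - 1) / block.x, (h + block.y - 1) / block.y);
    centerSurround<<<grid, block>>>(dIn, dOut, w, h, R);
    cudaMemcpy(host.data(), dOut, w * h * sizeof(float), cudaMemcpyDeviceToHost);

    printf("contrast at pop-out pixel: %f\n", host[240 * w + 320]);
    cudaFree(dIn);
    cudaFree(dOut);
    return 0;
}
```

In a full VOCUS-style pipeline this step would be repeated across several pyramid scales and feature channels, and the resulting conspicuity maps fused into a single saliency map; running all of these passes on the GPU is what yields the real-time VGA throughput described in the abstract.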
Author(s)
May, S.
Klodt, M.
Rome, Erich  
Breithaupt, Ralph
Mainwork
IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2007  
Conference
International Conference on Intelligent Robots and Systems (IROS) 2007  
Open Access
File(s)
Download (306.38 KB)
Rights
Use according to copyright law
DOI
10.24406/publica-r-355939
10.1109/IROS.2007.4399118
Language
English
Fraunhofer-Institut für Intelligente Analyse- und Informationssysteme IAIS  