  • Publication
    Investigation on stereo-ToF data fusion for the inspection of used industrial parts
    Optical systems for automated or partially automated inspection have made important contributions to ensuring the quality and functionality of technical products for many years. While commonly used to monitor the quality of newly produced goods, vision systems are also expected to play an important role in the identification and condition assessment of used industrial parts such as aged vehicle components. In this work, a passive stereo system and a latest-generation Time-of-Flight (ToF) sensor were combined into the desired sensor system. In the first step, the pixel-based information of both sensors was exploited to spatially calibrate the transformation between the left stereo camera and the ToF sensor by forming a set of 2D-3D correspondences from detected feature points. To compensate for the difference in sensor resolution, numerous interpolation points were randomly sampled on the reconstructed sparse surface mesh of the ToF sensor to create the missing sub-pixel information. The fused sensor information reduced the incompleteness by 7.81% on average across all examined components. The higher noise of the ToF measurement data in the filled-in regions could be mitigated by an adapted median kernel filter. The average deviation from a reference dataset was 1.30 mm for the stereo system, 2.51 mm for the ToF system, and 1.42 mm for the fused result. The result of this work is promising, as the quality of the surface mesh could be improved especially in critical surface areas, and the underlying RGB data can itself be used for pixel-wise classification and segmentation.
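    The abstract's final denoising step, an adapted median kernel filter on the filled-in ToF depth values, can be illustrated with a plain 1D median filter. This is a minimal sketch, not the paper's adapted kernel (whose details are not given here); the sample depth values are assumptions for illustration.

    ```python
    # Minimal sketch of median filtering noisy fill-in depth values.
    # The paper uses an "adapted median kernel"; this is a plain 1D
    # median filter for illustration only.
    def median_filter_1d(depths, kernel=3):
        """Replace each depth sample by the median of its neighborhood."""
        half = kernel // 2
        out = []
        for i in range(len(depths)):
            window = depths[max(0, i - half):i + half + 1]
            out.append(sorted(window)[len(window) // 2])
        return out

    noisy = [2.50, 2.51, 9.99, 2.49, 2.52]  # one outlier spike (assumed values, metres)
    print(median_filter_1d(noisy))          # the 9.99 spike is suppressed
    ```

    The median is preferred over a mean here because a single outlier in the window cannot drag the filtered value, which matches the goal of suppressing isolated ToF noise without blurring surfaces.
    
    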
  • Publication
    An automatic calibration approach for a multi-camera-robot system
    (2019)
    Kroeger, Ole
    ;
    Niebuhr, Carsten A.
    In this paper, we present an automated and precise calibration approach to enable vision-based robot control with a multi-camera setup. In our case, the setup consists of four fixed cameras above the workplace (eye-to-hand configuration) and a collaborative robot. The robot is used to automate the intrinsic camera calibration and the necessary process of capturing images of a calibration pattern from a maximum number of different viewpoints. For the extrinsic calibration of the cameras and the calibration to the robot, we use only two small markers permanently attached to the base of the robot. This allows a fast online calibration of the setup. Initial experiments show that the calibration method works well under laboratory conditions. Furthermore, we discuss the achieved accuracy.
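    Conceptually, the marker-based extrinsic calibration described above reduces to chaining homogeneous transforms: if a camera observes a marker that is rigidly attached to the robot base, the camera-to-base transform is the product of the camera-to-marker pose and the marker-to-base pose. The sketch below shows only this chaining step with assumed, purely illustrative matrices; it is not the paper's implementation.

    ```python
    # Hedged sketch: chaining 4x4 homogeneous transforms for eye-to-hand
    # calibration. T_cam_base = T_cam_marker @ T_marker_base.
    # All matrix values below are assumptions for illustration.
    def mat_mul(a, b):
        """Multiply two 4x4 matrices given as nested lists."""
        return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
                for i in range(4)]

    # Marker pose in the camera frame (e.g. from a pose-estimation step):
    # here a pure translation of 1.0 m along the camera's z-axis.
    T_cam_marker = [[1, 0, 0, 0],
                    [0, 1, 0, 0],
                    [0, 0, 1, 1.0],
                    [0, 0, 0, 1]]

    # Marker pose relative to the robot base (measured once): a 0.1 m
    # offset along the base's x-axis.
    T_marker_base = [[1, 0, 0, 0.1],
                     [0, 1, 0, 0],
                     [0, 0, 1, 0],
                     [0, 0, 0, 1]]

    T_cam_base = mat_mul(T_cam_marker, T_marker_base)
    print(T_cam_base[0][3], T_cam_base[2][3])  # combined translation components
    ```

    Because the markers stay fixed to the base, re-estimating only T_cam_marker per camera is what makes the fast online recalibration mentioned in the abstract possible.
    
    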
  • Publication
    Combining the Advantages of On- and Offline Industrial Robot Programming
    (2019)
    Guhl, Jan
    ;
    Nikoleizig, Sylvio
    Classic offline and online programming approaches offer different advantages but cannot provide the intuitive programming needed for frequent reconfiguration of industrial robot cells and their programs. While simulation, and with it collision checking, can be used offline, the highest precision is achieved with online teach-in. We propose a system that combines the advantages of both offline and online programming. By creating a programming framework that is independent of input and output devices, we allow the use of augmented as well as virtual reality, enabling the user to choose the programming technique that best suits their needs during the different stages of programming. Bringing both methods together allows us not only to drastically reduce programming time but also to increase the intuitiveness of human-robot interaction.
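    The device-independent framework idea above can be sketched as an abstract input-device interface that the programming core talks to, so that AR and VR front ends are interchangeable. All class and method names, and the returned poses, are illustrative assumptions, not the paper's API.

    ```python
    # Hedged sketch: a device-agnostic interface so the programming core
    # never knows whether a pose came from an AR or a VR front end.
    # Names and values are illustrative assumptions.
    from abc import ABC, abstractmethod

    class InputDevice(ABC):
        @abstractmethod
        def get_target_pose(self):
            """Return the user-specified target pose (x, y, z)."""

    class ARHeadset(InputDevice):
        def get_target_pose(self):
            return (0.4, 0.1, 0.3)   # placeholder: pose picked in augmented reality

    class VRController(InputDevice):
        def get_target_pose(self):
            return (0.5, 0.0, 0.2)   # placeholder: pose picked in virtual reality

    def program_waypoint(device: InputDevice):
        """The core is unaware of which front end supplied the pose."""
        return device.get_target_pose()

    print(program_waypoint(ARHeadset()))
    print(program_waypoint(VRController()))
    ```

    With this split, a user could rough out a program in VR (offline, with collision checking) and refine the same waypoints via AR teach-in on the real cell, which is the combination the abstract argues for.
    
    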
  • Publication
    Intuitive Roboterprogrammierung per Fingerzeig (Intuitive robot programming by pointing)
    Automated production processes are becoming increasingly important. Nevertheless, humans remain an indispensable factor, since the artificial intelligence of robots is not yet sufficient. Fraunhofer IPK has now developed a system that enables gesture-based, intuitive programming of robots for welding applications with high precision.
  • Publication
    Enabling Human-Robot-Interaction via Virtual and Augmented Reality in Distributed Control Systems
    Production and assembly lines are nowadays transforming into flexible, interconnected cells due to rapidly changing production demands. Changes include, for example, varying locations and poses of the processed workpieces and tools, as well as of the involved machinery such as industrial robots. Even a variation in the combination or sequence of different production steps is possible. For older machines, the task of reconfiguration and calibration can be time-consuming. Together with the expected upcoming shortage of highly skilled workers, this may lead to future challenges, especially for small and medium-sized enterprises. One possibility to address these challenges is to use distributed or cloud-based control for the participating machines. These approaches allow the use of more sophisticated, and therefore in most cases computationally heavier, algorithms than classic monolithic controls offer. Such algorithms range from simple visual servoing applications to more complex scenarios, like sampling-based path planning in a previously 3D-reconstructed robot cell. Moving the computation of the machine's reactions physically and logically away from the machine control complicates the supervision and verification of the computed robot paths and trajectories. This poses a potential threat, since the machine operator is hardly capable of predicting or controlling the robot's movements. To overcome this drawback, this paper presents a system that allows the user to interact with industrial robots and other cyber-physical systems via augmented and virtual reality. Topics covered in this paper are the architecture and concept of the distributed system, first implementation details, and promising results obtained using a Microsoft HoloLens and other visualization devices.
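    The supervision idea in the abstract, letting a visualization device inspect a remotely planned path before the robot executes it, maps naturally onto a publish/subscribe pattern. The sketch below is a minimal in-process stand-in for such a message bus; the topic name, path format, and `Bus` class are illustrative assumptions, not the paper's architecture.

    ```python
    # Hedged sketch: a minimal publish/subscribe bus in which an off-board
    # planner publishes a robot path and an AR/VR visualizer previews it
    # before execution. All names and values are illustrative assumptions.
    class Bus:
        def __init__(self):
            self.subscribers = {}

        def subscribe(self, topic, callback):
            """Register a callback for all future messages on a topic."""
            self.subscribers.setdefault(topic, []).append(callback)

        def publish(self, topic, message):
            """Deliver a message to every subscriber of the topic."""
            for cb in self.subscribers.get(topic, []):
                cb(message)

    bus = Bus()
    previewed = []

    # The visualization device (e.g. a HoloLens client) previews planned paths.
    bus.subscribe("planned_path", lambda path: previewed.append(path))

    # The distributed planner publishes a path before the robot moves.
    bus.publish("planned_path", [(0.0, 0.0), (0.2, 0.1), (0.4, 0.1)])
    print(previewed)
    ```

    In a real distributed deployment the bus would be a networked middleware rather than an in-process dictionary, but the decoupling is the same: the planner never needs to know which, or how many, visualization devices are watching.
    
    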