Featured in:
IEEE International Conference on Autonomous Robot Systems and Competitions, Vila Real, Portugal
Authors:
Tiago Dias, Pedro Miraldo and Nuno Goncalves
In this article, we propose a framework for applying augmented reality to non-central catadioptric imaging devices. Given a virtual object with known 3D world coordinates, the goal is to project this object into the image of a non-central catadioptric camera. We propose a solution that projects textured objects into the image in real time (up to 20 fps), covering the projection of 3D line segments, occlusions, illumination, and shading. To the best of our knowledge, this is the first time this problem has been addressed; all state-of-the-art methods are derived for central camera systems. In our experiments, we used a non-central catadioptric camera composed of a perspective camera and a spherical mirror. To evaluate the proposed approach, we defined a cube with textured faces on which each of the main steps of the framework is assessed. Finally, we used the proposed framework to project the Stanford “bunny” model into the image.
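As a rough illustration of the core geometric step, the sketch below projects a single 3D point through a spherical-mirror catadioptric system: the reflection point on the mirror is found numerically (this is Alhazen's problem, which has no simple closed form) and is then projected by the perspective camera. The function names, the Fermat-principle minimisation, and all calibration values are illustrative assumptions, not the method described in the paper.

```python
import numpy as np
from scipy.optimize import minimize_scalar


def reflection_point(P, C, O, r):
    """Point M on the mirror sphere (centre O, radius r) that reflects
    light from the world point P into the camera centre C.

    M lies in the plane spanned by O, C and P, so it is parametrised by a
    single angle; by Fermat's principle, the physical reflection path on a
    convex mirror minimises the length |P-M| + |M-C| (assumption used here).
    """
    e1 = (C - O) / np.linalg.norm(C - O)           # direction O -> C
    v = (P - O) - np.dot(P - O, e1) * e1
    e2 = v / np.linalg.norm(v)                     # completes the in-plane basis
    alpha = np.arccos(np.clip(np.dot((P - O) / np.linalg.norm(P - O), e1), -1.0, 1.0))

    def path_length(theta):
        M = O + r * (np.cos(theta) * e1 + np.sin(theta) * e2)
        return np.linalg.norm(P - M) + np.linalg.norm(M - C)

    theta = minimize_scalar(path_length, bounds=(0.0, alpha), method="bounded").x
    return O + r * (np.cos(theta) * e1 + np.sin(theta) * e2)


def project_point(P, K, R, C, O, r):
    """Pixel coordinates of the world point P seen through the mirror."""
    M = reflection_point(P, C, O, r)
    m = K @ (R @ (M - C))                          # pin-hole projection of M
    return m[:2] / m[2]


if __name__ == "__main__":
    K = np.array([[800.0, 0.0, 320.0],
                  [0.0, 800.0, 240.0],
                  [0.0, 0.0, 1.0]])                # assumed camera intrinsics
    R, C = np.eye(3), np.zeros(3)                  # camera at the origin, looking down +Z
    O, r = np.array([0.0, 0.0, 0.5]), 0.1          # assumed mirror centre and radius
    P = np.array([0.4, 0.3, 0.3])                  # a vertex of the virtual object
    print(project_point(P, K, R, C, O, r))
```

Projecting the 3D segments of a virtual object then amounts to applying this point projection to sampled points along each segment, since straight lines do not remain straight under a non-central catadioptric projection.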
Institute of Systems and Robotics, Department of Electrical and Computer Engineering, University of Coimbra