Mechanical Engineering
Bachelor Final Project
2009/2010
Group: Control Systems Technology
“Design of an augmented reality robot positioning tracking tool for RoboCup”
Student: F.B.F. Schoenmakers
Supervisor: ir. R.J.M. Janssen
Coordinator: dr.ir. W.J.A.E.M. Post
Eindhoven, September 2010
Design of an augmented reality robot positioning tracking tool for RoboCup

The goal of this bachelor final project was to create an augmented reality tool for Tech United. Since all information in the RoboCup community should be internationally accessible, it was decided to write a wiki page about this bachelor final project; a printed version of this wiki page is therefore attached. The first part is about the project itself, and the online version of it can be viewed at: http://www.techunited.nl/wiki/index.php/Greenfield_Augmented_Reality The second part is about installing OpenCV, an open-source set of libraries required for this software to run. Because the OpenCV libraries first need to be installed, a wiki page was written for this as well. The online version of this wiki can be viewed at: http://www.techunited.nl/wiki/index.php/How_to_Install_OpenCV
Greenfield Augmented Reality
From Tech United RoboCup Team Eindhoven
Author(s): Ferry Schoenmakers
This page describes the augmented reality version of the Tech United Greenfield monitoring tool.
Overview
The augmented reality version of the Greenfield monitoring tool has been developed in order to track the localization performance of the TURTLES. The TURTLES acquire their own position and the positions of other obstacles using information coming from the omnivision camera. These positions are then shown in the Greenfield monitoring tool. However, the obtained position can sometimes diverge from the 'real' position because of various factors. To determine how well the localization works, real-world data is needed, which must be compared to the TURTLES' data. In the augmented reality tool, a camera is used which overlooks the whole playfield. Images from this camera are then fed into the Greenfield monitoring tool. Below, the augmented reality tool is shown.
Contents
■ 1 Overview
■ 2 Operation
■ 3 Coupling
■ 4 Future Work
■ 5 Compatibility
■ 6 How to start the tool
■ 7 References
Page 1 of 5Greenfield Augmented Reality - Tech United RoboCup Team Eindhoven
19-9-2010http://www.techunited.nl/wiki/index.php/Greenfield_Augmented_Reality
Operation
The tool is based upon the OpenCV libraries. (See How to Install OpenCV to install
OpenCV on a DevPC.)
These C-based libraries are highly optimized and very useful for applications that need computer vision. Since the augmented reality tool shows a realtime simulation, processing times should be kept as short as possible in order to reduce lag.
■ The first step is to extract the actual playfield from the camera images. The position of the camera is undetermined, so the location of the field in the camera images is not known; the field therefore has to be detected somehow.
■ When the playfield is viewed by a camera, it is shown in perspective. In order to use the camera images as the ground surface, as shown in the figure above, the field has to be transformed into a rectangle.
■ To reduce CPU time, the location of the field in the camera images is determined before the tool starts. Automatically detecting the field every frame is far too slow for this application and definitely not robust. When initialising, the user sees a screen with the camera footage and roughly selects the field corners with the mouse. The program then finds the exact corners close to the selected positions by line detection, using a Hough-transform-based algorithm. The user interface is shown below.
Screenshot of greenfield augmented reality tool.
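The corner search described above can be made concrete. The wiki gives no code for it, but its geometric core is intersecting two detected field lines to obtain an exact corner. The sketch below assumes the Hough transform returns lines in normal form (rho, theta); the helper name `line_intersection` and the example values are hypothetical, not taken from the tool.

```python
import numpy as np

def line_intersection(rho1, theta1, rho2, theta2):
    """Intersect two lines given in Hough normal form:
    x*cos(theta) + y*sin(theta) = rho."""
    A = np.array([[np.cos(theta1), np.sin(theta1)],
                  [np.cos(theta2), np.sin(theta2)]])
    b = np.array([rho1, rho2])
    return np.linalg.solve(A, b)  # (x, y) in image coordinates

# Hypothetical example: a vertical line x = 10 and a horizontal line y = 20
corner = line_intersection(10.0, 0.0, 20.0, np.pi / 2)
```

In the tool, the line intersection closest to the user's rough mouse click would then be taken as the exact field corner.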
■ When completed, the field corners are known, so the location of the field is also known. Next, the field has to be transformed into a rectangle before passing it to the monitoring tool. This step is done by warping the image.
In order to warp an image, some points in the original image and their corresponding positions in the resulting image have to be known. The corners of the playfield are already known, and their locations in the new image are simply the corners of the rectangular output image itself. With this information a so-called mapMatrix can be calculated, after which the complete camera image can be warped. An example is shown below.
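The mapMatrix calculation can be sketched as follows. In OpenCV's C API this corresponds to `cvGetPerspectiveTransform`; the sketch below solves the same 3x3 perspective transform with plain NumPy so the underlying math is visible. The corner coordinates are made-up example values, not the tool's actual calibration.

```python
import numpy as np

def get_perspective_transform(src, dst):
    """Solve for the 3x3 mapMatrix H with dst ~ H*src, given four
    corner correspondences (what cvGetPerspectiveTransform computes)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        # Cross-multiplied projection equations, with H[2,2] fixed to 1
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, x, y):
    """Apply the mapMatrix to one pixel coordinate."""
    p = H @ np.array([x, y, 1.0])
    return p[:2] / p[2]

# Hypothetical field corners as seen in the perspective camera image
src = [(100, 50), (540, 60), (600, 420), (40, 400)]
# Corners of the rectangular output image (e.g. 640x480)
dst = [(0, 0), (640, 0), (640, 480), (0, 480)]
H = get_perspective_transform(src, dst)
```

Mapping each clicked corner through H lands on the corresponding output corner; when the full image is warped (cvWarpPerspective in the tool), every other pixel is interpolated in between.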
Coupling
The corner selection screen.
Example of OpenCV's warp function.
Retrieving rectangular images of the playfield from the camera using OpenCV works fine, but
implementing this software in the Greenfield monitoring tool gave some problems. Because of shared-memory problems, a separate executable was written to obtain the mapMatrix. When the whole initialization process is completed, the mapMatrix is written to disk in a .txt file.
■ When the actual monitoring tool starts, it calls a Matlab mex-function to retrieve the camera images. The first time this mex-function runs, it reads in the .txt file and stores the mapMatrix in memory.
■ Every next time the mex-function is called, it queries a frame from the camera, transforms the frame using the stored mapMatrix, and outputs the resulting image as an mxArray to the Greenfield monitoring tool. (An mxArray is the default matrix datatype of Matlab.)
■ When closing the tool, the mex-function is called once more with input argument 0, in order to clear all memory used by this function.
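The persistence step above — writing the mapMatrix to disk once and caching it on the mex-function's first call — can be sketched as follows. The wiki does not specify the layout of the .txt file, so the whitespace-separated 3x3 format (and the matrix values) below are assumptions for illustration only.

```python
import numpy as np

def save_map_matrix(path, H):
    """What the separate initialization executable would do:
    write the 3x3 mapMatrix to a plain .txt file."""
    np.savetxt(path, H)

def load_map_matrix(path):
    """What the mex-function would do on its first call:
    read the .txt file back and keep the matrix in memory."""
    H = np.loadtxt(path)
    assert H.shape == (3, 3), "mapMatrix must be 3x3"
    return H

# Made-up example matrix standing in for a real calibration result
H = np.array([[1.2, 0.1, -30.0],
              [0.0, 1.1, -12.0],
              [1e-4, 2e-4, 1.0]])
save_map_matrix("HMatrix.txt", H)
H_cached = load_map_matrix("HMatrix.txt")
```

On every subsequent call the cached matrix is reused, so the disk is only touched once per session.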
Future Work
Unfortunately there is still a bug in Matlab: when changing the color data of a graphical surface repeatedly, the renderer's cache fills up. This is a known bug at MathWorks for which there is regrettably still no solution. When applying the camera image to the field, exactly this happens, so the same bug arises. A workaround is available, but it only works on Windows systems. The RoboCup software is all written on Linux systems and therefore still suffers from this bug. MathWorks is working on big changes to the graphics engine, but a release date is not known yet.
Another workaround is now applied to the Greenfield Augmented Reality tool. Streaming images onto a surface element triggers the memory bug, but streaming images onto an image element does not. A drawback of this approach is that the camera needs to face the image perpendicularly in order to see it: an image element is a 2D element which Matlab does not render when rotated in 3D space. Therefore the augmented reality part of the tool currently only works when facing the field from above, perpendicular to the field itself. Hopefully Matlab will fix the bug mentioned, so that the augmented reality tool will also work in 3D view in the future.
Compatibility
To use Greenfield AR, it is necessary to connect a camera to your computer. This camera should overlook the whole playfield. Various cameras can be used; the easiest option is a webcam. In fact, every webcam that works under Linux should be accessible to OpenCV and thus to the tool. For a list of tested webcams, see the OpenCV website.
Usage of the Prosilica camera above the Tech United playfield is still in development.
How to start the tool
Okay, sounds nice. How do I run it?
■ Before running greenfield3D in augmented reality mode, make sure OpenCV is installed on your DevPC. See How to Install OpenCV for instructions.
■ Second, make sure a camera is attached to your computer.
■ Initializing the camera: before running greenfield3D, the camera has to be initialized; otherwise an error will be displayed instead of a video stream. To do so, run the m-file 'Initialize_Cam.m' located in the greenfield3D folder and follow the instructions in the Matlab console. Once you have attached a camera and initialized it, it is not necessary to initialize it again every time you start greenfield3D, as long as the camera has not moved. The initialization process creates a file named HMatrix.txt in the folder greenfield3D/src, where it stores the gathered information. Only when you attach a new camera, move the camera, or the camera stream does not align correctly with the playfield do you have to initialize again.
■ Almost there. When initialization is completed, the window closes automatically. Now you are ready to start greenfield3D. Once greenfield3D runs, press 'K' once to activate the augmented reality mode; pressing 'K' again exits the augmented reality mode. It's that easy!

This page was last modified on 17 September 2010, at 17:05.
References
■ OpenCV homepage
Retrieved from "http://www.techunited.nl/wiki/index.php/Greenfield_Augmented_Reality"
How to Install OpenCV
From Tech United RoboCup Team Eindhoven
Author(s): Ferry Schoenmakers
This page describes how OpenCV should be installed on a DevPC, in order to make all OpenCV related software work correctly.
Download
■ First you need to download the package here (http://sourceforge.net/projects/opencvlibrary/files/opencv-unix/2.1/).
■ Save the package anywhere and extract it to the robocup home folder. Make sure that the folder is named 'OpenCV-2.1.0'.
Installing
■ Next, open the OpenCV-2.1.0 folder and create a new folder in it, named 'opencv.build'. The complete path should then look like this: '/home/robocup/OpenCV-2.1.0/opencv.build'
■ For the next step, cmake-gui is needed. Acquire it by typing 'apt-get install cmake-gui' in a terminal as superuser. (To become superuser, first enter 'sudo su'.) When completed, open cmake-gui by clicking Applications > Programming > CMake.
■ In CMake, in the first field (source), select the OpenCV-2.1.0 folder.
■ In the second field (build), select the opencv.build folder.
■ Now click 'Configure' and select 'Unix Makefiles' in the popup box. A red list of options appears; the options needed are already selected. Click 'Configure' again, and 'Generate' after that, to automatically create the makefiles.
Everything is now ready to build the libraries and install them to your system.
■ In a terminal, navigate to the opencv.build folder as superuser. Do this by entering 'cd /home/robocup/OpenCV-2.1.0/opencv.build'.
■ Now build the libraries by entering 'make'. Ignore the warning 'CVAPI-EXPORTS is DEFINED'; it is nothing to worry about!
■ When finished, enter 'make install', and after that has completed, enter 'ldconfig'.
Now all libraries are built and installed on your system!
Retrieved from "http://www.techunited.nl/wiki/index.php/How_to_Install_OpenCV"
Page 1 of 2How to Install OpenCV - Tech United RoboCup Team Eindhoven
20-9-2010http://www.techunited.nl/wiki/index.php/How_to_Install_OpenCV
Screenshot of Cmake-gui.
This page was last modified on 14 September 2010, at 20:32.