Beaglebone and ROS + Webcam + remote ROS master (+OpenCV)

This tutorial assumes that you already have Ubuntu on your Beaglebone. If not, you can get it done this way:

Although Fuerte is out I don’t recommend it yet.
Connect to your BeagleBone using ssh.

A nice guide to installing ROS Electric can be found here:

Be prepared for this process to take a while, since you are going to compile the whole ROS system (the Full variant) from source. Use this line:

me@beaglebone$ rosinstall ~/ros ""

You can expect a compilation time of 8 hours or more 🙂

Since ROS itself was installed from source, don’t expect to find ARM binaries anywhere. You will have to install every additional package you’d like to use from source as well.
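The source-install workflow looks roughly like this (a sketch: the repository URL is a placeholder you must fill in from the package’s wiki page, and `~/ros` is assumed to be on your ROS_PACKAGE_PATH):

```shell
me@beaglebone$ cd ~/ros
me@beaglebone$ svn co <repository-url-of-the-package> usb_cam   # replace with the real repo URL
me@beaglebone$ rosmake usb_cam                                  # builds the package and its dependencies
```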

An easy and straightforward way to get images from a v4l-supported webcam is to use the usb_cam package. It’s fast, written in C++, and talks to the camera through the Video4Linux API.

An alternative choice is gscam. It is also written in C++, but uses GStreamer to get the image feed.
At the time of writing, gscam had dependency errors around the yaml-cpp library; feel free to play with it, and if you manage to compile it, send me a note and I’ll update the post.

Let’s bring up a roscore and a camera publisher node (usb_cam in this case).
me@beaglebone$ roscore &
me@beaglebone$ rosrun usb_cam usb_cam_node
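If your camera is not on /dev/video0 or needs a different pixel format, usb_cam takes private node parameters; a hedged example (parameter names as documented for usb_cam — check the package wiki for your version):

```shell
me@beaglebone$ rosrun usb_cam usb_cam_node _video_device:=/dev/video0 \
    _image_width:=640 _image_height:=480 _pixel_format:=yuyv
```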

In order to check what you’ve done, you’ll have to set ROS_MASTER_URI on your PC to point to the BeagleBone.

Setting the environment variable is only one part; we’ll also have to do a bit of network configuration to make sure everything will work. You should know the hostname and the network IP of both the BeagleBone and your PC. Use ifconfig to get the network IPs.
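For example, run these on each machine (the interface name eth0 is an assumption; yours may differ):

```shell
me@beaglebone$ hostname          # the machine's name, for /etc/hosts
me@beaglebone$ ifconfig eth0     # look for the "inet addr" line
```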

Edit /etc/hosts to set up the address of both computers by name.

me@beaglebone$ sudo vim /etc/hosts

add a line mapping your PC’s IP address to its hostname, e.g. `192.168.1.2 pc` (the address is a placeholder; use the one ifconfig reported)

Now the same thing on the other side:

me@pc$ sudo vim /etc/hosts

and add the BeagleBone’s entry the same way, e.g. `192.168.1.3 beaglebone` (again, use its real address)

and set the master uri accordingly. In this case it’s:

me@pc$ export ROS_MASTER_URI=http://beaglebone:11311/

After this, in the terminal where you did the export, you should be able to see the roscore running on the BeagleBone. Try it with “rostopic list”.
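A quick sanity check (the topic name is usb_cam’s default, assuming you didn’t remap it):

```shell
me@pc$ rostopic list                    # should show /usb_cam/image_raw among others
me@pc$ rostopic hz /usb_cam/image_raw   # confirms images actually arrive over the network
```

If the topic lists but no data flows, also exporting ROS_HOSTNAME on each machine to its own /etc/hosts name often helps.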

To see the image you can bring up rviz or use the image_view node, pick one 🙂

me@pc$ rosrun rviz rviz
me@pc$ rosrun image_view image_view image:=/usb_cam/image_raw

After you have all this running, you can get down to writing your own image-processing node using OpenCV 2.3.1 (the OpenCV version shipped with ROS Electric). Have fun!
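As a starting point, here is a minimal sketch of such a node in Python (hedged: the topic name is usb_cam’s default, and the conversion call is named `imgmsg_to_cv2` in later cv_bridge releases — under Electric the older `imgmsg_to_cv` API may apply instead):

```python
#!/usr/bin/env python
# Minimal image-processing node sketch; API names vary between ROS releases.
import rospy
import cv2
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

bridge = CvBridge()

def on_image(msg):
    # Convert the ROS image message into an OpenCV (numpy) array
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
    edges = cv2.Canny(frame, 100, 200)  # replace with your own processing
    rospy.loginfo("got a %dx%d frame", msg.width, msg.height)

if __name__ == "__main__":
    rospy.init_node("my_image_processor")
    rospy.Subscriber("/usb_cam/image_raw", Image, on_image)
    rospy.spin()
```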

If you have problems or you feel that the post is incomplete, feel free to ask.


17 thoughts on “Beaglebone and ROS + Webcam + remote ROS master (+OpenCV)”

  1. I’m one of the developers and biased, but have you looked at Syntro for this kind of robotics control / remote camera use? We test on small boards like the Beagles and Gumstix all the time. There is less network setup, since Syntro has a discovery system. It’s GPL/LGPL software. Our blog is here; if you have questions, ask. We’re still working on docs.

  2. I agree with you that Syntro would quite likely be more optimal in terms of hardware usage, and that ROS is a bit of an overkill in this case. On the other hand, I’d like to set things up in such a way that I can reach for the well-known ROS nodes I use and handle problems like that. I’ve seen that you’ve got the Beagle and the Arduino working quite reliably, and in the Arduino’s case ROS is not that great.

    Is there a way to use Syntro as middleware (or “driver”) for external devices while using ROS in general? I’ve seen that on a blog somewhere but just as an idea. If there isn’t and you are open to work on it we can share some ideas.

    1. The ROS build system is quite complex and based on CMake. The easiest way for you would be to use the ROS build system to generate the message headers (C++ structs) and use them in some templatized transport system. The sizes of the generated messages are known at compile time; you only have to feed them into your transport system (possibly by using some template magic).

      1. What I am imagining is a ROS node that is in fact a gateway to a Syntro subnetwork. The Syntro/ROS gateway would need to publish services and do all the ROS-y things that need to be done. The question really is how the services should be presented to maximize the usefulness for people using ROS. For example, are there standard abstractions for things like robot drive systems, IMUs etc that would make sense for the Syntro/ROS gateway to implement? Or, is it really just a case of transporting ROS messages to and from things like Arduinos so that the Syntro/ROS gateway doesn’t need to understand them. The latter case seems quite possible to do if it is useful. APIs could be provided in the small subsystems that keep things really simple there but still allow them to play with real ROS nodes. The Syntro/ROS gateway would interact with roscore and other ROS nodes as required as a sort of proxy. The other ROS nodes would think that they are dealing with standard ROS nodes.

        What do you think? If you have any specific applications in mind, we’d be very happy to work with you on this.

      2. That’s exactly what I was thinking about.
        What I have in mind is to use the ROS-generated message types in Syntro so a straightforward “mapping” can be done.
        As a user I would like to have a ROS package for the Syntro gateway, a really quick way to set it up on my computer and external devices, and a straightforward way to create a Syntro application, which would typically read sensors and do control-related things like driving a motor controller via PWM. Computationally expensive things are not meant to be done on a small device like the BeagleBone or an Arduino.

  3. Nicely done!!

    Quick question: how do you open a new terminal for the BeagleBone on the PC with a minicom console?

    As in the above steps of
    “Let’s bring up a roscore and a camera publisher node (usb_cam in this case).
    me@beaglebone$ roscore &
    me@beaglebone$ rosrun usb_cam usb_cam_node”

    once I run roscore, the terminal is occupied, I don’t know how to open a new terminal like on pc with gnome-terminal. So I am stuck here.

    Please help. Thanks a lot.

    1. I was using this through ssh with a simple gnome-terminal, but I think you can do the same when using screen to connect through the serial port.
      The & at the end of the command means that it should run in the background, so you should get your command prompt back. Have you tried pressing return a few times?:)

      1. Thank you. Here is what I get, so may have to reload and try again.

        ubuntu@omap:~/ros$ svn co
        A usb_cam/include
        A usb_cam/include/usb_cam
        A usb_cam/include/usb_cam/usb_cam.h
        A usb_cam/manifest.xml
        A usb_cam/src
        A usb_cam/src/libusb_cam
        A usb_cam/src/libusb_cam/usb_cam.cpp
        A usb_cam/src/libusb_cam/CMakeLists.txt
        A usb_cam/src/CMakeLists.txt
        A usb_cam/src/usb_cam_node
        A usb_cam/src/usb_cam_node/usb_cam_node.cpp
        A usb_cam/src/usb_cam_node/CMakeLists.txt
        A usb_cam/CMakeLists.txt
        A usb_cam/Makefile
        U usb_cam
        Checked out revision 1447.
        ubuntu@omap:~/ros$ roscd usb_cam
        roscd: No such package ‘usb_cam’

      2. roscd and all the ROS-related tools only work on packages _inside_ your ROS_PACKAGE_PATH. If it’s still not roscd-ing, it might be because the package index hasn’t been refreshed, but you can still cd there manually to run rosmake.
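For example (assuming the checkout lives in ~/ros, as in the transcript above):

```shell
me@beaglebone$ export ROS_PACKAGE_PATH=~/ros:$ROS_PACKAGE_PATH
me@beaglebone$ rospack profile      # rebuild rospack's package index
me@beaglebone$ roscd usb_cam        # should now find the package
```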

  4. Thank you so much for posting this tutorial. Instead of taking your advice and installing ROS Electric on Ubuntu 11.04, I had tried to initially install ROS Fuerte from source on Robert Nelson’s Precise Pangolin release, and it was a complete nightmare. Hours and hours of frustration, problems with log4cxx, etc. etc. etc. I finally gave up and returned to the top of your instructions, wiped the SD card and installed 11.04, and am now smoothly installing ROS Electric on the BeagleBone. 🙂 So far the install is going great (I am in the first hour of installation) I should have followed your advice and done the 11.04 / Electric install in the first place! 😀

    Thanks again for a great tutorial 🙂

  5. I am planning to make a robot with the following features
    1. Beaglebone Black + Ubuntu + ROS OR Laptop + Ubuntu with ROS + arduino
    2. Laser Range finder
    3. Kinect or other USB camera

    Controlling station
    A laptop

    What I want
    1. Autonomous movement and control from the base station of the Robot
    2. SLAM implemented
    3. The robot should send live pictures of its surroundings
    4. The live pictures would be processed by OpenCV in the base station.

    Is this model possible?

