T1ll13 robot step 1

After a couple of long nights, a lack of appetite and a fair number of headaches, I finally have something meaningful set up as a framework/platform to build on for the new robot called T1ll13 - it's a new (Millennial) species, so it needs a new type of naming structure. (It's short for Matilda.)

(this is not T1ll13 - just a stock photo)

I originally went the RPi3 route of installing Raspbian/Jessie and then attempting to build upwards from there to install ROS, as explained here:
http://wiki.ros.org/ROSberryPi/Installing%20ROS%20Kinetic%20on%20the%20Raspberry%20Pi

Everything seemed to work okay until it got to the rosbag install, which would fail with no sensible way of recovering....trust me, I tried.

I even went back a version to Indigo, but still had no joy:
http://wiki.ros.org/ROSberryPi/Installing%20ROS%20Indigo%20on%20Raspberry%20Pi
....that just gave me issues at another point further down the line.

Not being one to admit defeat, I adopted my usual approach (to work & life): find another way to the goal...just imagine running water.  Running water will always find a way around whatever is in its path in order to keep moving.

With that in mind, I looked at what I had just done on my Mac: I'd set up a VMware Fusion VM, installed Ubuntu Linux and then installed ROS (as defined here).  That went off without a hitch, so I was thinking...I wonder if I could install Ubuntu onto a Raspberry Pi 3?  Not something I've done before....but time to give it a go.

After a bit of searching, I found this site: http://phillw.net/isos/pi2/
and more specifically, I downloaded this image:
http://phillw.net/isos/pi2/ubuntu-mate-16.04-desktop-armhf-raspberry-pi.img.xz

After a quick session with Etcher and a 32GB SD card....the image was burnt and ready to go into the Raspberry Pi 3.  It booted.  Why was I surprised?  After a quick setup and a couple of reboots, I was ready to see if I could get something working.  After the usual 'sudo apt-get update/upgrade' it was time to get on with the ROS stuff.
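
(If you'd rather flash from a Linux Terminal than use Etcher, something like the following does the same job - /dev/sdX is a placeholder, so triple-check the device name before letting dd loose:)
---------------------------------------------------------------------------------------------------
# Decompress the image (-k keeps the original .xz file)
xz -dk ubuntu-mate-16.04-desktop-armhf-raspberry-pi.img.xz
# Write it to the SD card - make sure /dev/sdX really is the card!
sudo dd if=ubuntu-mate-16.04-desktop-armhf-raspberry-pi.img of=/dev/sdX bs=4M status=progress
sync
---------------------------------------------------------------------------------------------------
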
From a Terminal session, it was time to enter:
---------------------------------------------------------------------------------------------------
sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu $(lsb_release -sc) main" > /etc/apt/sources.list.d/ros-latest.list'
sudo apt-key adv --keyserver hkp://ha.pool.sks-keyservers.net:80 --recv-key 0xB01FA116
sudo apt-get update
sudo apt-get install -y ros-kinetic-desktop-full
sudo rosdep init
rosdep update
# Source the ROS environment (so the ros* commands are available in this and future shells)
echo "source /opt/ros/kinetic/setup.bash" >> ~/.bashrc
source ~/.bashrc
source /opt/ros/kinetic/setup.bash
sudo apt-get install -y python-rosinstall
sudo apt-get install -y python-roslaunch
---------------------------------------------------------------------------------------------------

Initially, the ros-kinetic-desktop package above did not have "-full" on the end, but after the last command, "python-roslaunch", failed, I found some suggestions to do it as shown above.  The "python-roslaunch" install still fails with unmet dependency errors, but that didn't seem to be an issue at this point.
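
(A quick sanity check that the install and the .bashrc sourcing actually worked - a couple of read-only commands, so nothing to break:)
---------------------------------------------------------------------------------------------------
# Should print "kinetic"
rosversion -d
# Should show ROS_DISTRO, ROS_ROOT, ROS_PACKAGE_PATH etc.
printenv | grep ROS
---------------------------------------------------------------------------------------------------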

Now it was time to plug the Xbox 360 Kinect into the RPi3.  I noticed that the Kinect has an odd cable end....not USB.  Typical.  I looked online; Amazon could get me one by the weekend, so could eBay, but that was, like, 4 days away!  After hunting around the house for 30 mins, I found the original charger for the Kinect that included a USB connector - result!  It was a bit picky about which of the USB sockets I put it into on the RPi3, but I found the top left to work okay - the Kinect green light came on!  (Also, running lsusb told me it was detected: Microsoft Corp. Xbox NUI Camera/Motor/Audio.)
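
(For reference, the lsusb output for the Kinect looks roughly like this - the bus/device numbers here are just examples and will differ, but the three Microsoft NUI entries are what you want to see:)
---------------------------------------------------------------------------------------------------
$ lsusb
....
Bus 001 Device 006: ID 045e:02b0 Microsoft Corp. Xbox NUI Motor
Bus 001 Device 007: ID 045e:02ad Microsoft Corp. Xbox NUI Audio
Bus 001 Device 008: ID 045e:02ae Microsoft Corp. Xbox NUI Camera
---------------------------------------------------------------------------------------------------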

Back in that open Terminal from earlier, we need to install the software that gets ROS working with the Kinect:
---------------------------------------------------------------------------------------------------
# Search for packages (replace "search term" with whatever you're after)
apt-cache search ros-kinetic | grep "search term"
# Install kinect packages
sudo apt-get install -y freenect
sudo apt-get install -y ros-kinetic-freenect-camera ros-kinetic-freenect-launch
sudo apt-get install -y ros-kinetic-freenect-stack ros-kinetic-libfreenect
---------------------------------------------------------------------------------------------------
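
(That "search term" in the first command is just a placeholder - for the Kinect, the search that turns up the packages above looks something like this, with the descriptions trimmed:)
---------------------------------------------------------------------------------------------------
$ apt-cache search ros-kinetic | grep freenect
ros-kinetic-freenect-camera - ....
ros-kinetic-freenect-launch - ....
ros-kinetic-freenect-stack - ....
ros-kinetic-libfreenect - ....
---------------------------------------------------------------------------------------------------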


Now it was time to open up multiple Terminal windows.....

TERM1: This is the "master" - think of it as the main brain that has to be running to process all the events that are happening (a bit like a web server):
---------------------------------------------------------------------------------------------------
$ roscore
....
started roslaunch server http://rpi3ubuntu:37735/
ros_comm version 1.12.7
SUMMARY
========

PARAMETERS
 * /rosdistro: kinetic
 * /rosversion: 1.12.7

NODES

auto-starting new master
process[master]: started with pid [2592]
ROS_MASTER_URI=http://rpi3ubuntu:11311/

setting /run_id to c2dbad48-c3d7-11e7-9e7d-b827eb8fbf84
process[rosout-1]: started with pid [2605]
started core service [/rosout]
---------------------------------------------------------------------------------------------------
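
(A quick way to check the master really is up: with just roscore running, rosnode list should show a single node, /rosout - the logging node that roscore starts:)
---------------------------------------------------------------------------------------------------
$ rosnode list
/rosout
---------------------------------------------------------------------------------------------------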

TERM2: Start the node for the Kinect.  A ROS node is a bit like an "IoT module": it'll do its own thing, gathering/capturing data, and if anything is subscribed to it, it'll publish that out to whoever is listening.  For us here, that is the roscore/master.  The /topic sub/pub concept shouldn't be anything new - we've been doing it with Message Queues (MQTT) for years now.... (there's a quick pub/sub demo just after the next block of terminal output)
---------------------------------------------------------------------------------------------------
$ roslaunch freenect_launch freenect.launch

started roslaunch server http://rpi3ubuntu:46776/
SUMMARY
========

PARAMETERS
 * /camera/.....

NODES
  /camera/

ROS_MASTER_URI=http://localhost:11311

core service [/rosout] found
....
[ INFO] Starting a 3s RGB and Depth stream flush.
[ INFO] Opened 'Xbox NUI Camera' on bus 0:0 with serial number 'A0033333335A'
---------------------------------------------------------------------------------------------------
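
(If the topic pub/sub idea is new to you, you can prove it to yourself without the Kinect at all.  With roscore still running in TERM1, publish a dummy topic in one terminal and echo it in another - very much like an MQTT publish/subscribe pair.  A minimal sketch; /chatter and the message text are just made-up examples:)
---------------------------------------------------------------------------------------------------
# Terminal A: publish a std_msgs/String message on /chatter once a second
rostopic pub -r 1 /chatter std_msgs/String "data: 'hello T1ll13'"
# Terminal B: subscribe to /chatter and watch the messages arrive
rostopic echo /chatter
---------------------------------------------------------------------------------------------------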

TERM3: Now that TERM2 should be publishing topics, we can run a command to list them, and then look at the RGB image from the Kinect with image_view:
---------------------------------------------------------------------------------------------------
$ rostopic list
/camera/depth/camera_info
....
/camera/rgb/camera_info
/camera/rgb/image_raw
/camera/rgb/image_color
....

$ rosrun image_view image_view image:=/camera/rgb/image_color
libEGL warning: DRI2: failed to authenticate
init done
[ INFO] Using transport "raw"
---------------------------------------------------------------------------------------------------
This will pop up a window showing you what the Xbox 360 Kinect is currently "seeing".
As shown here (and the purpose of ALL THIS WRITING WAS JUST TO SHOW THIS PHOTO!)


Yes, that is the re-purposed iMac G5 running Ubuntu with the Arduino IDE and an Arduino UNO plugged in, ready for some servo action later in the week.  Some random Sharp IR sensors are waiting to be used too... along with a 15-servo HAT for the RPi.

You know it's going to be a fun couple of evenings when you can see a soldering iron sitting on the corner of the desk.

So, there's the RPi3 on the bottom left, plugged into the Xbox Kinect (on top of the iMac G5), and the big monitor on the right is plugged into the RPi3, showing the 3 terminals described earlier and the output image of the Xbox Kinect, with yours truly trying to take a decent photo.... (and yes, that is a Watson T-shirt I'm wearing).

If anyone is interested, I ran up "System Monitor" and the 4 CPUs are running at 35-50%, with memory at 365MB out of 1GB.  I am streaming the image at quite a large size though - something to keep an eye on.
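
(To put some numbers on that, rostopic can report the frame rate and bandwidth of the image stream - handy for keeping an eye on how much the camera is costing us:)
---------------------------------------------------------------------------------------------------
# Frames per second arriving on the colour image topic
rostopic hz /camera/rgb/image_color
# Approximate bandwidth that topic is using
rostopic bw /camera/rgb/image_color
---------------------------------------------------------------------------------------------------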

Phew, that was step 1..... now it's time to do some further basic ROS testing and to create some new nodes and make sure that ROS is working correctly.

What ROS nodes will I be creating?.....well, obviously there will be some STT/TTS/NLU, and now that I have the Xbox Kinect working, a Visual Recognition node too.

And for the truly eagle-eyed, yes, that is an AIY Projects - Voice Kit box on the top right - I haven't gotten around to making it yet, maybe by the end of the week, but in essence it's the same thing as what I'm going to be doing anyway, just without pushing a button or using Google....

UPDATE: To get I2C to work, you need to modify /boot/config.txt - because we're using Ubuntu, we can't use raspi-config.
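
A minimal sketch of that change, assuming the usual dtparam line is what's needed - append it to /boot/config.txt, reboot, and then i2cdetect (from i2c-tools) should be able to scan the bus:
---------------------------------------------------------------------------------------------------
# Enable the I2C interface in the firmware config
echo "dtparam=i2c_arm=on" | sudo tee -a /boot/config.txt
sudo reboot
# After the reboot, install the tools and scan bus 1 for devices
sudo apt-get install -y i2c-tools
sudo i2cdetect -y 1
---------------------------------------------------------------------------------------------------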
