We got our company, Independent Robotics, incorporated and had its name officially registered.
Last night we finally finished the third version of the Aqua robot family. Like the two earlier hardware revisions, this is a hexapod (six-legged) robot that can either swim or walk. Version 2 was smaller, more compact, and a lot more capable than version 1. This third model includes more custom electronics and a gigabit-over-optical-fiber Ethernet interface for high-speed data logging.
The software has also changed a fair bit, but most of the good parts have been back-ported to the other versions of the robot, so that's not really a distinguishing feature. It's leaving for its new home at York University, where it will be used in the lab of my colleague and collaborator Mike Jenkin.
Okay, now for something lighthearted. Here is a video clip of the best of the Austrian Hexapod Dance competition at Hagenberg Technical College (Upper Austria University of Applied Sciences). The competition is part of a course in hardware/software systems engineering (HSSE).
A hexapod is, of course, something with six legs. Hexapods can have their six legs configured in various ways, and in fact our Aqua swimming robot is also a hexapod, though quite different from the ones in this video. This dance competition doesn't have much to do with brand-new science and puts a high premium on decorations, but getting the robots to do this is still no mean feat. As far as I know, a locally developed standard base hardware platform is provided to all teams. Programming the control is no doubt difficult. I believe the robots are programmed in C and controlled by an Atmel AVR microcontroller.
The Robotics: Science and Systems 2009 conference is being held at the University of Washington this year. The web site just opened for paper submissions (with a submission deadline in mid-January). As always, the conference deals with the latest and greatest science and technology in robotics research, with an emphasis on hard science, elegant mathematics, and originality.
[Update: March 19, 2009 -- review process is complete for this year. It looks like the overall trends will be consistent, but the number of workshops may be down. ]
We are near the end of a mad rush to get all the robotics equipment in our lab ready for the next sea trials. As always, no matter how much we think it will be a breeze, getting ready involves a lot of work and stress. Part of this is due to the constant pressure to push the bounds of what we can do and, once we get things stable, to add more complexity to the experiments being planned.
This year the part of the team from McGill is putting particular focus on three different classes of experiment: learning-based robot guidance, optimal data collection given constraints on time, teleoperation and tele-learning using Microsoft Robotics Studio, and enhanced controller design. See, I added a fourth experimental context just since I started this paragraph!
The learning-based guidance is based on work by my student Philippe Giguere, who uses a new learning rule to exploit the correlations in time that are present in most observations made in the real world. Using these temporal correlations, the process of learning about different classes of experience can be made easier. These classes of experience can be different terrain textures (as in our most recent RSS paper), where the robot learns how to adapt its walking modes to the type of terrain it is on. For example, on slippery terrain a more "careful" low-speed gait is appropriate.
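Philippe's actual learning rule isn't reproduced here, but the core intuition can be sketched in a few lines. The following is a deliberately simplified, hypothetical illustration (the one-dimensional feature stream, the two-cluster setup, and the window size are all my own choices, not details from the paper): because neighboring samples in time usually come from the same terrain class, smoothing labels over a temporal window cleans up a noisy per-sample classification.

```python
# Hypothetical sketch, NOT the algorithm from the RSS paper: exploit
# temporal continuity in a sensor stream by clustering samples, then
# smoothing the labels on the assumption that the terrain class changes
# slowly relative to the sampling rate.
import random

random.seed(0)

# Synthetic 1-D "terrain feature" stream: two regimes with sensor noise.
stream = ([random.gauss(0.0, 0.3) for _ in range(50)] +
          [random.gauss(2.0, 0.3) for _ in range(50)])

def kmeans_1d(xs, iters=20):
    """Tiny two-center 1-D k-means, initialized at the data extremes."""
    c0, c1 = min(xs), max(xs)
    for _ in range(iters):
        lo = [x for x in xs if abs(x - c0) <= abs(x - c1)]
        hi = [x for x in xs if abs(x - c0) > abs(x - c1)]
        if lo:
            c0 = sum(lo) / len(lo)
        if hi:
            c1 = sum(hi) / len(hi)
    return c0, c1

def classify(xs, window=7):
    c0, c1 = kmeans_1d(xs)
    raw = [0 if abs(x - c0) <= abs(x - c1) else 1 for x in xs]
    # Temporal smoothing: majority vote over a sliding window, which
    # suppresses isolated misclassifications caused by noise.
    half = window // 2
    smoothed = []
    for i in range(len(raw)):
        win = raw[max(0, i - half): i + half + 1]
        smoothed.append(1 if 2 * sum(win) > len(win) else 0)
    return smoothed

labels = classify(stream)
```

The smoothing step is where the temporal correlation pays off: a single noisy sample that lands on the wrong side of the cluster boundary gets outvoted by its neighbors, so the recovered class sequence switches only at the true regime change.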
As part of our Sea Trials in Barbados, we held an Underwater Webcast in which we transmitted live image data and robot telemetry to a class at St. George's High School back in Montreal. We did this Jan 21st, 2009. This was accompanied by live streaming audio and video that I narrated from the beach. My colleague Ioannis Rekleitis operated the Microsoft Robotics Studio (MSRS) interface that was used to collect the telemetry data. Junaed Sattar sat in at the High School and gave an associated seminar that also served as a backup in case our Internet connection failed (it didn't, but it did have to be restarted a couple of times).
On the beach, in addition to providing narration for the experiment, I also answered questions, so it was a truly interactive webcast event. I also used the camera embedded in my laptop to provide a live video feed of what was going on at the surface. Thus, we had a webcast that involved live footage from both the surface and underwater. To my knowledge, this is the first live interactive underwater webcast.