Entries : Category [ Robotics ]
Articles about robots and robotics.

18 June

We applied for and received funding from Microsoft to develop tools and infrastructure to use Microsoft Robotics Studio (MSRS) with our underwater and walking robot systems. Our work has involved the control of our robot both underwater and on land using visual markers, especially because any other type of communication underwater is so difficult. This funding is part of Microsoft's Human Robot Interaction Awards and was awarded to Ioannis Rekleitis and myself.

To quote our own project information, there are many challenges to be faced here: our Aqua vehicle moves through a variety of terrains and provides very limited sensory feedback, in the form of video footage and the state of an inertial measurement unit (IMU). We want to use only video and IMU feedback for control.

Current operations with the robot require a skilled operator capable of guiding it in either walking or swimming mode. In our project we proposed to implement a user interface that exploits the strengths of the Microsoft Robotics Studio (MSRS), providing both a control interface for the robot and a visualization tool for interpreting its visual feedback. This work would also extend RoboChat, a new method for communicating with AQUA when a direct link to a controlling console is not available: RoboChat is based on cue cards presented to AQUA's vision sensor, instructing the vehicle to perform high-level actions. While in land operations communication between an operator and a vehicle is easy to implement in a variety of ways, e.g. wireless or wired links, underwater communication is far more restrictive in terms of cost, bulk, energy, and bandwidth.
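To give a flavor of the cue-card idea, here is a minimal sketch of mapping detected fiducial marker IDs to high-level vehicle commands. The marker IDs and command names are invented for illustration; they are not the actual RoboChat vocabulary.

```python
# Illustrative sketch only -- the tag IDs and command names below are
# hypothetical, not the real RoboChat vocabulary.

# A tiny symbolic vocabulary: each fiducial tag ID maps to a high-level action.
COMMAND_TABLE = {
    1: "SWIM_FORWARD",
    2: "TURN_LEFT",
    3: "TURN_RIGHT",
    4: "SURFACE",
    5: "HOLD_STATION",
}

def interpret_cue_cards(detected_ids):
    """Translate a sequence of detected marker IDs into vehicle commands,
    ignoring unknown tags and collapsing immediate repeats (the same card
    seen over consecutive frames should issue one command, not many)."""
    commands = []
    last_id = None
    for tag_id in detected_ids:
        if tag_id == last_id:
            continue  # same card still in view
        last_id = tag_id
        cmd = COMMAND_TABLE.get(tag_id)
        if cmd is not None:
            commands.append(cmd)
    return commands

# Example: card 2 held up over several frames, then card 1.
print(interpret_cue_cards([2, 2, 2, 1]))  # ['TURN_LEFT', 'SWIM_FORWARD']
```

The repeat-collapsing step matters in practice: a card held in front of the camera is detected in many consecutive frames, and each command should fire only once per presentation.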

Some of our work with RoboChat was recently presented at the 2008 International Conference on Robotics and Automation. The Microsoft announcement is here.

By Gregory Dudek
30 June

Today I visited Wolfram Burgard's lab in Freiburg, Germany. There are numerous interesting projects going on there, including work on range estimation from video and range data, and modeling of robot kinematics using Gaussian processes (the latter was the subject of a recent paper at RSS). They also have several interesting implementation projects running, including an automated blimp, an automated helicopter, and several types of vehicles.

I also gave a talk there whose primary focus was our own underwater and amphibious robotics work at McGill. Unfortunately, the display system would not recognize my older Powerbook, and as a result the 1-hour talk started a half-hour late. This was a new experience for me, and it left me stressed enough that delivering the talk was quite uncomfortable. That said, the visit was very pleasant, interesting and enriching.

Students doing a demo of a robot for exploring and building

By Gregory Dudek
16 August

This summer I attended the International Symposium on Experimental Robotics. This meeting takes place every second year, in a different location each time. This year it was in Athens, Greece, and Natasha came along to check it out and do some sightseeing.

The Parthenon, Acropolis, Athens

As the name implies, the conference features papers that have an experimental component. The selection of topics was very diverse, ranging from medical robotics (endoscopy -- cameras on a flexible stick) to search and rescue. There were a couple of very different papers dealing with autonomous or semi-autonomous flying helicopters. A student from Andrew Ng's lab discussed their ongoing work on the control of helicopters, and the amazing dynamics they can manage. Somebody from Nick Roy's lab presented the vision-based helicopter system they have developed that recently won a search-and-rescue contest (as noted previously in this blog in entry 112).

My own presentation dealt with the Aqua underwater vehicle and I talked about some new ways we are doing visual servo control to interact with human divers. In particular, I talked about some early results on using machine learning to assist the construction of servo controllers for real-time tracking underwater where color is an important cue.
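As a rough sketch of the kind of color-based servo loop involved (our actual controllers are constructed with machine learning; the color bounds and gain below are invented for illustration), tracking a colored target and steering toward it might look like this:

```python
# Hedged sketch of a proportional visual servo on a color blob.
# The color thresholds and gain are made up for illustration; the learned
# controllers described above replace this hand-tuned segmentation step.

def find_target_centroid(frame, lower, upper):
    """Return the (x, y) centroid of pixels whose (r, g, b) values fall
    inside [lower, upper], or None if the target is not visible.
    `frame` is a list of rows of (r, g, b) tuples."""
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, (r, g, b) in enumerate(row):
            if (lower[0] <= r <= upper[0] and
                    lower[1] <= g <= upper[1] and
                    lower[2] <= b <= upper[2]):
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return sum(xs) / len(xs), sum(ys) / len(ys)

def yaw_command(frame, width, gain=0.01):
    """Proportional yaw: steer so the target centroid sits on the image
    center line. Positive output means turn right."""
    c = find_target_centroid(frame, lower=(200, 0, 0), upper=(255, 80, 80))
    if c is None:
        return 0.0  # target lost: hold heading
    error = c[0] - width / 2
    return gain * error
```

Underwater, the hard part is that color shifts dramatically with depth and turbidity, which is exactly why fixed thresholds like these fail and learned controllers become attractive.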

As usual, the conference presented a great opportunity to meet people from the community and talk about research. Unlike larger meetings, the group is fairly focussed and you end up spending plenty of time with the same (very interesting) group. For example, I got to speak at length with a couple of interesting younger researchers such as Katie Byl and Jonathan Clark, and may end up with a new research sub-project as a result. Of course, it was also great to meet many old friends and colleagues, not the least of whom was the eminent Oussama Khatib who is the lead organizer of the meeting. Natasha got to briefly monopolize the dance floor with Oussama at the end-of-conference dinner; he also turns out to be a great dancer.

A highlight of the trip, aside from the talks, was a concert one evening: a performance by the Bolshoi Orchestra held at the Acropolis! They performed "Alexander Nevsky", a cantata for mezzo-soprano, chorus and orchestra, opus 78 (1939), by Sergei Prokofiev and Vladimir Lugovskoy. The program also included the Violin Concerto No. 2, with Simos Papanas (of Thessaloniki) as the soloist, conducted by the chief conductor of the Bolshoi, Alexander Vedernikov. This was the best possible venue for a concert and made the trip truly memorable.

Natasha and Yiannis at the symphony

By Gregory Dudek
11 September

Every time I see it, I like it more.

In The Know: Are We Giving The Robots That Run Our Society Too Much Power?

By Gregory Dudek
01 October

Last week I was at the International Conference on Robots and Systems (IROS 2008) in Nice, France. This is one of the two huge annual conferences on robotics research. This one places particular emphasis on complete systems, and it has an especially large representation from Japan and Asia, being sponsored by the Robotics Society of Japan (RSJ) as well as the IEEE.

As I get older, I end up spending a larger fraction of my conference attendance time having discussions in the hallways and making plans, instead of merely attending the technical sessions. Despite that, I heard a variety of talks, spanning both areas I work in, like SLAM and underwater robotics, and topics, like the design of humanoid robots, that I only track via conference presentations.

One of our own presentations was an overview of the Aqua robot project, and how we are moving to increasing levels of autonomy with the underwater vehicle. The presentation was tricky since, rather than focus on a single narrow technical problem, it had to weave together a group of projects and problems that, together, allow the vehicle to operate semi-autonomously. In the same conference session on underwater robotics we also heard about a few types of autonomous surface vehicles and how they navigate.

Dave Meger presented work he had done with Yiannis and myself as part of his Master's thesis. That dealt with using a mobile robot to help the nodes of a sensor network estimate their own positions. In particular, as the robot explores it measures the metric embedding of the network, and it can select among various path planning strategies, which directly impact the accuracy of the localization process.
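To make the underlying idea concrete, here is a minimal sketch (not Dave's actual method) of how a robot visiting known poses can localize a static node from range measurements, via linearized least-squares trilateration:

```python
# Toy trilateration sketch. The real work described above is about choosing
# the robot's path so that measurements like these yield accurate estimates;
# this solver just shows the estimation step itself.

def trilaterate(robot_poses, ranges):
    """Estimate a node's (x, y) position from ranges measured at known
    robot positions. Subtracting the first circle equation from the others
    linearizes the system, which is then solved in a 2x2 least-squares
    sense. Needs at least 3 non-collinear poses."""
    x0, y0 = robot_poses[0]
    r0 = ranges[0]
    # Accumulate normal equations A^T A p = A^T b for the linearized system
    #   2(xi - x0) x + 2(yi - y0) y = (r0^2 - ri^2) + (xi^2 - x0^2) + (yi^2 - y0^2)
    ata = [[0.0, 0.0], [0.0, 0.0]]
    atb = [0.0, 0.0]
    for (xi, yi), ri in zip(robot_poses[1:], ranges[1:]):
        a = [2 * (xi - x0), 2 * (yi - y0)]
        b = (r0**2 - ri**2) + (xi**2 - x0**2) + (yi**2 - y0**2)
        for j in range(2):
            for k in range(2):
                ata[j][k] += a[j] * a[k]
            atb[j] += a[j] * b
    det = ata[0][0] * ata[1][1] - ata[0][1] * ata[1][0]
    x = (atb[0] * ata[1][1] - atb[1] * ata[0][1]) / det
    y = (ata[0][0] * atb[1] - ata[1][0] * atb[0]) / det
    return x, y
```

The path-planning connection is visible even in this toy: if the robot's measurement poses are nearly collinear, the determinant above approaches zero and the estimate degrades, so where the robot chooses to travel directly controls localization accuracy.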

Acropolis Conference Center in Nice, France

Shadow Hand, IROS, Nice, France

By Gregory Dudek
07 October

One of the disadvantages of underwater (or amphibious) robotics is that real experiments carry a lot of logistic overhead. Even a smallish test in the pool requires not only that the whole pool be reserved (which is hard to accomplish), but also that a fair bit of equipment be transported and installed poolside.

In practice, this usually involves a team of 6 to 8 people, including swimmers (or divers) who accompany the robot and take photos underwater. Since we rarely get more than a couple of straight hours of pool time, we need to minimize setup and take-down times. When the robot is tethered, just managing the delicate fibre-optic cable is a hassle, and spooling it up wastes valuable time. Using a more robust cable would mean a heavier cable that would interfere with the robot's behavior and the measurements of its dynamics.

In the last year or so we have been doing many more experiments without any tether. This has been a big win, and setup time (and logistic support) is now quite limited. That's great, but in a small way I miss the big-party atmosphere of a large crew.

By Gregory Dudek