Google It All

Monday, March 2, 2009

Robots growing in sophistication

Although we are surrounded by robots that we think of as automated tools, there are already some sophisticated robots in use (photo below). Remote telepresence is one of the most common applications of today's mobile, autonomous robots. Intelligence for these robots is split between an embedded microcontroller, which manages internal systems, and a laptop attached to the robot. Human operators control the robot over a wireless link, telling it to change direction, shift a camera angle, take measurements, grasp objects, and so on. For example, mobile robots can let security personnel stay in a central office and still check out unsupervised areas of a warehouse or other remote site.
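
As a concrete illustration of that control loop, here is a minimal sketch of a laptop-side command dispatcher, assuming the operator's station sends newline-delimited JSON commands over a wireless TCP link. The command names (drive, camera, gripper), their fields, and the port number are invented for this example and do not come from any particular robot.

```python
# Hypothetical laptop-side command dispatcher for a telepresence robot.
# The operator's station sends newline-delimited JSON commands over a
# wireless TCP link; the low-level actions that would go to the embedded
# microcontroller are stubbed out here as print statements.
import json
import socket

def handle_command(cmd: dict) -> None:
    """Route one operator command to a (stubbed) low-level action."""
    kind = cmd.get("type")
    if kind == "drive":
        print(f"drive: linear={cmd['linear']} m/s, angular={cmd['angular']} rad/s")
    elif kind == "camera":
        print(f"camera: pan={cmd['pan']} deg, tilt={cmd['tilt']} deg")
    elif kind == "gripper":
        print(f"gripper: {'close' if cmd['close'] else 'open'}")
    else:
        print(f"unknown command: {cmd}")

def serve(host: str = "0.0.0.0", port: int = 9000) -> None:
    """Accept one operator connection and process commands line by line."""
    with socket.create_server((host, port)) as server:
        conn, addr = server.accept()
        print(f"operator connected from {addr}")
        with conn, conn.makefile("r") as stream:
            for line in stream:
                handle_command(json.loads(line))

if __name__ == "__main__":
    serve()
```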


[Photo: Carnegie Mellon University's TagBots use Intel boards]


With advances in microchip design, nanotech sciences, software architecture, and mini-power cells, robot systems can be more than just another pair of eyes. They are already being tested and used in a variety of applications, traversing varied, even dangerous, environments and performing complex tasks on their own. For example, mil-spec iRobot Packbots have been used in Afghanistan to detect and map the locations and contents of caves. Another iRobot rover was used in the historic exploration of the southern and northern shafts leading to the Queen's Chamber in the Great Pyramid at Giza (Egypt). The rover was able to illuminate areas beyond the blocking stones in the shafts, which had last been viewed by human eyes some 4,500 years ago.

Robot mobility issues

Regardless of a robot's design or tasks, there are still three main issues with its mobility (illustrated in the sketch after this list):

* Localization: How does a robot know where it is in its environment?
* Mapping: How does the robot know the details of its environment?
* Navigation: How does a robot traverse its environment?
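
The sketch below ties the three issues together on a toy 2-D occupancy grid; the grid layout, start cell, and goal cell are all invented for illustration. The grid stands in for the robot's map, the start cell stands in for a solved localization estimate, and a breadth-first search produces a navigation path.

```python
# Toy 2-D occupancy grid: 0 = free cell, 1 = obstacle (invented layout).
# The grid is the robot's map, the start cell is its (assumed solved)
# localization estimate, and breadth-first search yields a navigation path.
from collections import deque

GRID = [
    [0, 0, 0, 1, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]

def plan_path(grid, start, goal):
    """Return a list of (row, col) cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}           # also doubles as the visited set
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            path = []                   # walk parent links back to start
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 \
                    and nxt not in came_from:
                came_from[nxt] = cell
                frontier.append(nxt)
    return None                         # goal unreachable from start

if __name__ == "__main__":
    # Localization is taken as given: the robot "knows" it is at (0, 0).
    print(plan_path(GRID, start=(0, 0), goal=(4, 4)))
```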

Intel works closely with researchers to identify novel ways for a robot to perform these mobility tasks. Intel is particularly interested in machine-vision libraries that can be used to perform localization and mapping with monocular- or stereo-vision systems. Right now, for example, most robots navigate by using infrared or radio waves simply to avoid objects in their paths. Intel software researchers, however, have recently developed several libraries that are directly applicable to robotics, and Intel's computer vision library is already used extensively by vision researchers.
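
As a hint of what vision-based localization and mapping starts from, here is a minimal feature-matching sketch using the Python bindings of OpenCV, the open-source library that originated at Intel and is presumably the library this paragraph refers to. The filenames are placeholders, and ORB is simply a convenient modern detector rather than anything named in the post; matched keypoints between consecutive camera frames are the raw material that monocular localization and mapping systems build on.

```python
# Minimal monocular feature-matching sketch with OpenCV's Python bindings.
# The image filenames are placeholders for two consecutive frames from the
# robot's camera; ORB is used here only as a convenient detector/descriptor.
import cv2

def match_frames(path_a: str, path_b: str, max_matches: int = 50):
    """Detect ORB features in two frames and return the strongest matches."""
    img_a = cv2.imread(path_a, cv2.IMREAD_GRAYSCALE)
    img_b = cv2.imread(path_b, cv2.IMREAD_GRAYSCALE)
    if img_a is None or img_b is None:
        raise FileNotFoundError("replace the placeholder image paths")

    orb = cv2.ORB_create(nfeatures=1000)
    kp_a, desc_a = orb.detectAndCompute(img_a, None)
    kp_b, desc_b = orb.detectAndCompute(img_b, None)

    # Brute-force Hamming matching suits ORB's binary descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(desc_a, desc_b), key=lambda m: m.distance)
    return kp_a, kp_b, matches[:max_matches]

if __name__ == "__main__":
    keypoints_a, keypoints_b, matches = match_frames("frame_000.png", "frame_001.png")
    print(f"kept {len(matches)} matches between the two frames")
```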

Intel has also released a test version of a technical library for building Bayesian networks to support machine-learning activities. Bayesian networks are a form of probability-based artificial intelligence. Such a network would let a robot navigate by matching sensor data to a map stored in its memory.
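
Intel's Bayesian-network library is not named here, so the sketch below uses a much simpler relative of the same idea: a hand-rolled discrete Bayes filter over a one-dimensional corridor map, showing how probabilistic updates let a robot match sensor readings (door or wall) against a map stored in memory. The corridor layout, sensor-noise probabilities, and wrap-around motion model are all assumptions made for this illustration.

```python
# Hand-rolled discrete Bayes filter over a 1-D corridor map (all values
# below are invented for illustration). The robot's stored map records
# which cells have a door; each sensor reading reweights the belief over
# cells, and each motion step shifts it, so the belief converges on the
# cells that best explain the sequence of readings.

CORRIDOR = [True, False, False, True, False, True, False, False]  # True = door

def normalize(belief):
    total = sum(belief)
    return [b / total for b in belief]

def sense(belief, saw_door, p_hit=0.8, p_miss=0.2):
    """Bayes update: weight each cell by how well it explains the reading."""
    weighted = [
        (p_hit if has_door == saw_door else p_miss) * prior
        for has_door, prior in zip(CORRIDOR, belief)
    ]
    return normalize(weighted)

def move(belief, step=1):
    """Shift belief forward by one cell (exact motion, wrap-around corridor)."""
    n = len(belief)
    return [belief[(i - step) % n] for i in range(n)]

if __name__ == "__main__":
    belief = [1.0 / len(CORRIDOR)] * len(CORRIDOR)   # start fully uncertain
    for reading in (True, False, True):              # saw: door, wall, door
        belief = move(sense(belief, reading))
    print([round(b, 3) for b in belief])
```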
