The Massachusetts Institute of Technology (MIT) Media Lab has completed testing of a radio frequency identification system known as TurboTrack. The lab’s researchers say the solution could bring a new level of flexibility and autonomy to robots in manufacturing processes, as well as in applications such as search and rescue.
The TurboTrack system is designed to pinpoint a passive UHF RFID tag’s location at the sub-centimeter level, even if the tag is moving at fairly high speed. Such a system could make it possible for a robot to understand where a tagged item is located and respond accordingly, even an item flying overhead, as in the case of a swarm of drones. In the long run, the technology is intended to offer a more effective option than computer vision for managing robotics. The group will present a paper on the technology today at the USENIX Symposium on Networked Systems Design and Implementation.
The system can accomplish highly granular localization of a tag using a standard UHF RFID interrogator, along with what the group calls “helper” devices. The helper has four built-in antennas that transmit very short RF pulses and measure the reflections of the tag’s responding signal in order to better pinpoint its location, says Fadel Adib, an assistant professor and principal investigator at the MIT Media Lab and the founding director of the Signal Kinetics Research Group.
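To make the geometry concrete, here is a minimal sketch, in Python, of how range estimates taken at four fixed antennas can be combined into a single position by least-squares multilateration. This is not the researchers’ actual algorithm, and the antenna coordinates and measured ranges below are hypothetical.

```python
# Minimal multilateration sketch: NOT the MIT group's algorithm, just an
# illustration of how four antennas' range estimates pin down one position.
import numpy as np
from scipy.optimize import least_squares

# Hypothetical positions of the helper's four antennas (meters)
ANTENNAS = np.array([
    [0.0, 0.0, 0.0],
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
])

def residuals(pos, measured_ranges):
    """Mismatch between predicted and measured tag-to-antenna distances."""
    predicted = np.linalg.norm(ANTENNAS - pos, axis=1)
    return predicted - measured_ranges

# Hypothetical ranges inferred from the reflections of the short pulses (meters)
measured = np.array([0.87, 0.62, 0.74, 0.91])

fit = least_squares(residuals, x0=np.zeros(3), args=(measured,))
print("Estimated tag position (m):", fit.x.round(3))
```

Real measurements are noisy and the tag may be moving, so the actual system involves considerably more signal processing than this toy example.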
TurboTrack is one of multiple RFID projects the Media Lab has conducted in recent years (see RFID Detects Food Safety With Innovation from MIT Media Lab Research and Lightweight Relays Enable Small Drones to Read RFID Tags Indoors). This project, however, is focused on enabling highly precise location capture for robots as they are used increasingly in manufacturing, including, most recently, in the picking and packing of goods. According to Adib, robots these days often serve as mobile devices that are no longer simply bolted to an assembly line to accomplish a single, simple task.
Many robots leverage computer vision to understand their surroundings. However, computer vision provides robots with a somewhat limited view as they navigate their way through a task. “The problem with using computer vision is that it fails in cluttered environments,” Adib says. Computer vision requires a line of sight, for instance, so if an object were oriented in such a way that it couldn’t be identified, or if it were located behind a wall or obstruction, it could not be located. In addition, vision alone cannot distinguish the individual drones in a swarm.
“We wanted to enable robots to use radio signals to outperform vision-based systems” in busy or crowded environments, Adib says. If robots had better data regarding where objects or individuals were located, he adds, they could make more intelligent decisions, such as adjusting their work or speed based on the surrounding environment, as well as prevent collisions, find and move missing objects or identify drones flying overhead. They could also collaborate with another robot to accomplish a task.
Therefore, the MIT Media Lab began working on a solution about 18 months ago. The group quickly found that it needed a way to localize a tag more precisely than is possible with standard UHF RFID technology, says Zhihong Luo, a graduate student in the Signal Kinetics Research Group. RFID makes it possible to identify an item even without a clear line of sight, but conventional UHF RFID systems still cannot pinpoint a tag with enough granularity for tasks such as assembly. Luo estimates that standard UHF technology could provide accuracy of approximately 15 to 20 centimeters (5.9 to 7.9 inches).
To accomplish more granular localization, the team added a set of helper antennas that send short-duration pulses. The short pulses alone do not provide sufficient power to interrogate a tag and receive its ID-related data, so the team incorporated a standard UHF reader to interrogate the tag and, in tandem with the helper, to listen to the tag’s responses to those short pulses.
The concept was inspired by super-resolution imaging, in which multiple low-resolution images are merged to create a single high-resolution image, Luo explains. The group wanted to do the same with RF signals. “If you have multiple cameras,” he states, “you can merge low-resolution images to create a high-resolution image.”
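As an illustration of that idea, the sketch below shows how channel measurements collected at many frequencies across a wide band can be combined, via an inverse FFT, into a sharp estimate of the round-trip delay to a tag, and hence its distance. This is not the paper’s actual signal processing; the frequencies, tag distance, and noise-free channel model are all hypothetical.

```python
# Illustrative sketch only, not the paper's actual signal processing: combining
# channel measurements taken at many frequencies across a wide band sharpens
# the delay (and therefore distance) estimate, much as merging several
# low-resolution images yields one high-resolution image.
import numpy as np

C = 3e8                          # speed of light (m/s)
true_distance = 0.734            # hypothetical tag distance (m)
tau = 2 * true_distance / C      # round-trip delay (s)

# Hypothetical measurement frequencies spanning 100 MHz
freqs = np.linspace(800e6, 900e6, 64)

# Ideal (noise-free) per-frequency response of the tag's reflection
channel = np.exp(-2j * np.pi * freqs * tau)

# An inverse FFT across the combined bandwidth turns the per-frequency
# measurements into a delay profile; the wider the band, the sharper the peak.
N = 4096
profile = np.abs(np.fft.ifft(channel, n=N))
delays = np.fft.fftfreq(N, d=freqs[1] - freqs[0])
est_tau = delays[np.argmax(profile)]

print("Estimated distance (m):", round(est_tau * C / 2, 3))
```

Each narrowband measurement on its own would give only a coarse answer; it is the stitching of measurements across the whole band that delivers the fine resolution.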
By using the multiple helper antennas to capture location very precisely, along with the interrogator to prompt tag transmissions and identify tags, the team was able to demonstrate that tags could be located on drones or other moving objects, so that robots could respond to that information and carry out an activity, such as picking up an object. The short pulses occupy a different frequency band than the interrogator: 800 to 900 MHz, rather than the standard UHF range of 902 to 928 MHz.
With the technology in place, the team has demonstrated that a moving object’s location can be tracked with surprising precision. For instance, the researchers used a tagged nano-drone to write letters in the air, and the software processing the RFID tag-read data was able to recognize those letters. They were also able to fly two nano-drones in unison, with one following the other’s movements. The software can use artificial intelligence to manage the measurements from multiple tags and then respond to that data.
The group hopes to commercialize the technology, and they are currently in conversations with some technology companies as part of that effort. They also plan to further research how the technology might be used to make robotic systems more robust and accurate.
What’s more, the researchers believe the technology could be employed for emergency response, with a reader and a helper device installed on such platforms as a swarm of drones that could fly over an area to identify anything with an attached tag. The reader and helper could also be stationary and provide location information to the software, which a robot or drone could then use to accomplish such tasks as locating a missing person. The system would also work well with active RFID tags, Adib says, in order to enable a longer read range.