Wright State University Researchers Test RFID and Ultrasound for 3-D RTLS

By Claire Swedberg

The project's team is developing a hybrid real-time location system for identifying where a nursing home resident is within a room, and whether that person is standing, sitting or has fallen.

By combining RFID with ultrasonic technology, a team of researchers from Wright State University, in Dayton, Ohio, is developing a system that can pinpoint not only where an individual with a tag is located, but also whether he or she is sitting, standing or lying on the floor. Once the technology is fully developed, says the project's leader, computer science and engineering professor Kuldip Rattan, it could help caregivers track the activities and well-being of clients, patients or loved ones, and even send an alert if those individuals have fallen.

The project is part of Wright State's "Living Laboratory," which opened in November at the Bethany Village senior community in nearby Centerville. There, a two-story house is being used as a model for new health-care technology, including mannequin-like robots that simulate members of a family who need health-care assistance. The robots are designed to help nursing students learn caregiving skills.

The RFID portion of the research has its origins at Wright-Patterson Air Force Base, where Rattan says he was researching RFID-based systems for tracking vehicles in indoor settings where GPS transmission would not be possible. He and his students then began working with Wright State's Nursing Institute and Bethany Village to develop a way to track the movement, behavior and health status of individuals such as nursing home residents.

Nursing and assisted-care facilities already employ a variety of tracking solutions that incorporate RFID or infrared (IR) technology to locate individuals within a room; in some cases, these include a transponder in a pendant featuring a button that can be pressed in the event of an emergency, such as a bad fall. Rattan's goal, however, was to develop a system that could determine not only where in the room an individual was, but also how high off the floor that person was, in other words, whether he or she was standing, sitting or down on the floor. This can be accomplished by using ultrasonic technology, Rattan says. The tag prototype that he and his student team developed comes with an ultrasound transceiver and an active EPC Gen 2 RFID transponder that transmits a unique ID number to nine sensor nodes that Rattan has installed on ceilings throughout the simulated living room.

Each sensor node contains an ultrasound emitter as well as the RFID interrogator. The tag, which students and faculty developed over the past year, receives a UHF EPC Gen 2 RFID transmission from sensor nodes, and then waits to receive the 20 kHz to 40 kHz acoustic (ultrasound) signals also transmitted by up to four sensor nodes in the vicinity. The length of time it takes for the tag to receive the ultrasound transmissions from those sensor nodes, combined with trilateration from the nodes' known positions, can then be used by the system software to pinpoint the location of the tag in 3-D to within a few centimeters. The tag employs the EPC Gen 2 air-interface RFID protocol to transmit its unique ID number and ultrasound-based location back to the sensor nodes. A gateway receives that data from the sensor nodes via a ZigBee 2.4 GHz connection.
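The distance-and-trilateration step described above can be sketched in a few lines of Python. The node layout, speed-of-sound constant and solver below are illustrative assumptions; the article does not describe the actual implementation:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C (assumed)

def tof_to_distance(time_of_flight_s):
    """Convert an ultrasound time of flight into a node-to-tag distance."""
    return SPEED_OF_SOUND * time_of_flight_s

def trilaterate(nodes, distances):
    """Estimate the 3-D tag position from four ceiling-node positions and
    their measured distances. Subtracting the first sphere equation from
    the other three yields a 3x3 linear system, solved here by Gaussian
    elimination with partial pivoting."""
    (x0, y0, z0), d0 = nodes[0], distances[0]
    M = []
    for (xi, yi, zi), di in zip(nodes[1:4], distances[1:4]):
        row = [2 * (xi - x0), 2 * (yi - y0), 2 * (zi - z0)]
        rhs = (d0**2 - di**2
               + xi**2 - x0**2 + yi**2 - y0**2 + zi**2 - z0**2)
        M.append(row + [rhs])
    for col in range(3):                          # forward elimination
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    pos = [0.0, 0.0, 0.0]                         # back substitution
    for r in (2, 1, 0):
        pos[r] = (M[r][3]
                  - sum(M[r][c] * pos[c] for c in range(r + 1, 3))) / M[r][r]
    return tuple(pos)

# Illustrative layout: four ceiling nodes (one offset in height so the
# linear system is non-degenerate) and a tag worn at hip height.
nodes = [(0.0, 0.0, 2.5), (4.0, 0.0, 2.5), (0.0, 4.0, 2.5), (4.0, 4.0, 2.8)]
tag = (1.0, 2.0, 0.9)
tofs = [math.dist(n, tag) / SPEED_OF_SOUND for n in nodes]
est = trilaterate(nodes, [tof_to_distance(t) for t in tofs])
```

A real deployment would use more than the minimum four nodes and a least-squares fit to absorb measurement noise; the exact-fit version above only shows the geometry.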

The software displays an icon representing the person wearing that tag at his or her location within the room. The software also determines how high the tag is, and therefore whether the individual is standing, sitting or lying down, and presents that information on the display.
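The height-to-posture step amounts to mapping the tag's computed height above the floor onto a small set of labels. The threshold values below are assumptions for illustration; the article does not state the cutoffs the Wright State software uses:

```python
def classify_posture(tag_height_m, standing_min_m=1.2, sitting_min_m=0.6):
    """Map the tag's computed height above the floor to a posture label.
    The threshold values are illustrative, not taken from the actual
    system."""
    if tag_height_m >= standing_min_m:
        return "standing"
    if tag_height_m >= sitting_min_m:
        return "sitting"
    return "lying down"

# A neck-worn tag at 1.4 m suggests the wearer is standing; at 0.3 m,
# he or she is likely down on the floor.
```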

Rattan's group has also installed a camera in the room and has tested integrating cameras with the system in its own laboratory. Rattan is now seeking funding to help pay for additional hardware and further research and development, with the goal of deploying the system throughout Bethany Village, he says. Once that happens, when someone falls and fails to stand back up, the system will capture that event, determine that an alert should be sent, and instruct the camera to turn to the location of that individual and take a picture. That picture, as well as the alert status, can then be sent to the caregiver's computer or mobile device, such as an iPhone.
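The "falls and fails to stand back up" condition boils down to watching the stream of posture readings and raising an alert only after the wearer has been down for longer than some grace period. A minimal sketch, in which the 30-second grace period and the alert interface are assumptions not given in the article:

```python
class FallMonitor:
    """Flag a possible fall when a tag's wearer has been lying down for
    longer than a grace period without standing back up. The 30-second
    default grace period is an illustrative assumption."""

    def __init__(self, grace_s=30.0):
        self.grace_s = grace_s
        self.down_since = None  # timestamp when "lying down" first seen

    def update(self, posture, now_s):
        """Feed one timestamped posture reading; return True to alert."""
        if posture != "lying down":
            self.down_since = None  # wearer recovered; reset the timer
            return False
        if self.down_since is None:
            self.down_since = now_s
        return (now_s - self.down_since) >= self.grace_s
```

In the full system, a True result would be the trigger for sending the alert and aiming the room camera at the tag's last computed position.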

"In that way a caregiver can look at the picture, receive the alert and see if that person is in trouble," he says.

The data being stored by the software can also help caregivers gain an understanding of how an individual spends his or her day. For example, if there is a concern that the individual is not eating, the system can track how often he or she goes to the refrigerator. In addition, Rattan says, it can track when someone approaches the stove, and a temperature sensor attached to the sensor node closest to the stove can even send an alert if the temperature rises above a specific threshold, which could indicate that someone forgot to turn the stove off.
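One plausible way to combine the stove-side temperature check with the location data is to alert only when the burner is hot and no tagged person is tracked near the stove. Both the threshold and this combination rule are illustrative assumptions, not details from the article:

```python
STOVE_ALERT_THRESHOLD_C = 80.0  # illustrative threshold (assumed)

def stove_needs_alert(temperature_c, occupied_nearby):
    """Flag a possibly forgotten burner: the stove-side sensor node reads
    above threshold while no tagged person is located near the stove.
    The occupancy check is an assumed refinement of the article's
    threshold-only description."""
    return temperature_c > STOVE_ALERT_THRESHOLD_C and not occupied_nearby
```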

The technology would provide greater independence to elderly or disabled people who would prefer to live at home rather than in assisted living, Rattan says. The group is seeking further funding as it continues to integrate the camera system into the RFID/ultrasound system and to install sensor nodes throughout the entire house. The researchers also plan to develop a smaller tag that could comfortably be worn on a wrist, he says. Currently, the tag, which the researchers built from off-the-shelf technology, is attached to a helmet and is too large to be comfortably worn around someone's neck or wrist.