First Wal-Mart and the U.S. Department of Defense, now Target and the U.S. Food and Drug Administration. RFID’s list of mandates gets longer by the month. This isn’t another techno-bubble—it’s the start of real and lasting change. Recent history reveals why.
It’s often said that RFID dates back to World War II, but this is misleading. The term RFID first appeared at the start of the 1990s. The technology as we understand it today—an integrated circuit, attached to a radio antenna, that is cheap enough to embed in a pallet, case or item—is around a decade old.
During the past 10 years, a lot has changed: Computers were deployed throughout businesses, and the Internet caught on as a way to connect them all together. By the end of the 1990s, this information infrastructure was everywhere. The only thing missing was a way to get real-time data into the system automatically.
In the mid-1990s, sharp-eyed executives started noticing that although computers were everywhere, the result wasn’t the utopia of perfect information that experts had predicted. Procter & Gamble wondered why products were out of stock. Gillette wanted to stop theft. Wal-Mart, never satisfied, looked for new ways to drive costs out of its supply chain. Gradually and independently, these companies and others felt their way to the same conclusion: They needed more and better information. And it couldn’t be a little more, or a little better. It had to be real time, all the time, and always accurate. That was the only way to know what and where everything was.
The potential benefits were enormous. If you are in the business of stuff, the stuff is your business. You make it and move it. Anything that makes you better at doing that is a fundamental advantage. This is especially true for big businesses and for businesses in mature industries. In a big business, the tiniest saving multiplied by millions of customers, or hundreds of thousands of employees, or thousands of stores, is real money. In mature industries, competitors all deliver similar benefits, making product advantage marginal. Increased efficiency is the best hope for building business. For all these reasons, a few executives in a few big companies concluded that real-time data was a source of growth with unprecedented potential.
RFID could have provided this data, but it would have been too expensive. So in 1999, P&G and Gillette, quickly followed by International Paper, Philip Morris, CHEP, Wal-Mart, the DOD and others, founded and funded the Auto-ID Center at MIT to create a system that could capture real-time business data cost-effectively. After four years, they achieved that. The next step is to start using the system. That’s why now, in 2004, these companies are announcing plans to deploy RFID technology. To the uninitiated, the barrage of news seems like an overnight sensation. In fact, it’s the latest stage in a process with roots in the 1990s.
RFID is not a fad that’s going to fade away like so many other technologies. (Remember online marketplaces?) We know this because RFID is not a solution in search of a problem. Users and their needs, not RFID vendors and inventors, led its latest and most crucial stage of development. Users built a system that could deliver real bottom-line value. RFID’s foundation is irresistible need. And that’s why this is no bubble. It’s a beginning.
Kevin Ashton was cofounder and executive director of the Auto-ID Center. He has just finished writing a book about RFID.