Is there a grand theory underpinning radio frequency identification?
Interesting question. I'm not sure there is a grand theory behind radio frequency identification, any more than there would be a grand theory behind computers. It is merely a tool that was invented, and people are learning how to use it to improve the way they do business.
I guess if there were a unified RFID theory, it would be that you could save a lot of money and become vastly more efficient if you could collect data automatically. Bar codes require a human to orient each label to a scanner and pull a trigger. This is slow and labor-intensive, so it is costly. Companies do not hire armies of people to scan a bar code each time a pallet, case or item moves from one place to another, or every time a tool is moved on a construction site.
By installing RFID readers that can communicate with RFID tags on objects automatically and remotely via radio waves, companies can collect data about everything without a lot of extra labor cost. This provides greater visibility into the location of just about everything, and that allows companies to run their businesses more efficiently.
Gordon Moore, Intel's co-founder, observed that the number of transistors on integrated circuits doubles roughly every two years; that observation became known as Moore's Law. My theory of RFID is that the efficiencies that can be gained from RFID will double every two years, because companies will keep finding new benefits in the visibility it delivers. I think we should call this "Roberti's Law."
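The doubling pattern behind both laws is simple exponential growth: a quantity that doubles every two years grows as value × 2^(years ÷ 2). Here is a minimal sketch of that arithmetic; the function name, the "efficiency index" framing, and the baseline values are illustrative assumptions, not anything from the original claim.

```python
# Illustrative sketch of a two-year doubling law (Moore's Law, or the
# hypothetical "Roberti's Law" above): value(t) = value_0 * 2 ** (t / 2).

def doubled_value(initial: float, years: float, doubling_period: float = 2.0) -> float:
    """Project a value forward, doubling every `doubling_period` years."""
    return initial * 2 ** (years / doubling_period)

# Example: an efficiency index starting at 1.0, after a decade of
# two-year doublings -> five doublings -> 2 ** 5 = 32.0.
print(doubled_value(1.0, 10))
```

Running this prints 32.0: ten years at a two-year doubling period means five doublings, so the starting value is multiplied by 2^5.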
—Mark Roberti, Founder and Editor, RFID Journal