Ask The Experts Forum
How Is RF Cable Signal Loss Calculated?
I am a regular reader of RFID Journal and find its content very useful for RFID professionals. I'd like to ask for your expertise on a question concerning RF cable loss in an overall RFID system.
I saw some information online stating that a 2 dB cable loss means 37 percent of the signal is lost by the time it reaches the other end, and that a 20 dB cable loss means 99 percent of the signal is lost. The cable length corresponding to these dB figures was not specified.
How is the loss calculated, mathematically and logically, in the above scenario (without knowing the cable length), and how were the figures in the examples (37 percent and 99 percent for losses of 2 dB and 20 dB, respectively) arrived at? I'd be grateful for your reply.
I am not an engineer, but to my knowledge, you cannot calculate the total signal loss in decibels for a coax run without knowing the cable length. For a 900 MHz signal traveling through 100 feet of RG-11 coax cable, the loss is about 5.4 dB. Over 50 feet of the same cable, the dB loss is half that (roughly 2.7 dB); over 200 feet, it doubles (roughly 10.8 dB). Different types of coax have different loss rates. I don't know of a general formula for calculating loss per 100 feet; perhaps one of our readers can chime in with additional information.
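For the arithmetic behind the reader's 37 and 99 percent figures: a loss of L dB corresponds to a remaining power fraction of 10^(-L/10), so once a dB figure is given, the percentage lost follows directly, with no cable length needed. Length only enters when converting a per-100-foot cable rating into a total dB figure. A short Python sketch (the function names are illustrative, not from any RFID library):

```python
def percent_power_lost(db_loss):
    """Percentage of signal power lost for a given attenuation in dB.

    A loss of L dB leaves a remaining power fraction of 10**(-L/10).
    """
    return (1 - 10 ** (-db_loss / 10)) * 100

def cable_loss_db(length_ft, loss_per_100ft_db):
    """Total attenuation for a cable run, given its per-100-foot rating."""
    return (length_ft / 100) * loss_per_100ft_db

# The reader's two examples: 2 dB and 20 dB
print(f"2 dB -> {percent_power_lost(2):.0f}% lost")    # 37% lost
print(f"20 dB -> {percent_power_lost(20):.0f}% lost")  # 99% lost

# RG-11 at 900 MHz, using the ~5.4 dB per 100 feet figure from the answer
for length in (50, 100, 200):
    db = cable_loss_db(length, 5.4)
    print(f"{length:>3} ft: {db:.1f} dB, {percent_power_lost(db):.1f}% of power lost")
```

Note that these percentages are of signal *power*; percentages of voltage amplitude would use 10^(-L/20) instead.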
—Mark Roberti, Editor, RFID Journal