How Is RF Cable Signal Loss Calculated?

By RFID Journal


I am a regular reader of RFID Journal, which I find very useful; the content is excellent for RFID professionals. I'd like to seek your expertise regarding a question about RF cable loss in an overall RFID system.

I saw some information online stating that a 2 dB cable loss results in a 37 percent loss of signal by the time it reaches the other end of the cable, and that a 20 dB cable loss results in a 99 percent loss of signal. The cable length corresponding to these dB figures was not specified.

How is the loss calculated mathematically and logically in the above scenario (without knowing the cable length), and how were the figures above (37 percent and 99 percent for losses of 2 dB and 20 dB, respectively) arrived at? I'd be grateful for your reply.

Best wishes,


Sharath

———


Dear Sharath,

I am not an engineer, but to my knowledge, you cannot calculate how much signal is lost as it travels through a coax cable without knowing the cable length and the cable type. For a 900 MHz signal traveling through 100 feet of RG-11 coax cable, the loss is roughly 5.4 dB. Over 50 feet of the same cable, the loss is about half that (roughly 2.7 dB); over 200 feet, it doubles (roughly 10.8 dB), because attenuation is proportional to cable length. Different types of coax have different loss rates. I don't know if there is a formula for calculating loss per 100 feet; perhaps one of our readers can chime in with additional information.
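As for the 37 percent and 99 percent figures you mention, those follow from the definition of the decibel rather than from any particular cable length. A loss in dB expresses a power ratio, loss (dB) = 10 x log10(Pin/Pout), so the fraction of power that survives is 10^(-dB/10). A 2 dB loss leaves 10^(-0.2), or about 63 percent, of the power (a 37 percent loss), while a 20 dB loss leaves 10^(-2), or 1 percent (a 99 percent loss). Below is a minimal Python sketch of both calculations; the function names are only illustrative, and the 5.4 dB per 100 feet figure is the RG-11 rating mentioned above.

    def fraction_remaining(loss_db: float) -> float:
        """Fraction of the input power that survives a given loss in dB."""
        return 10 ** (-loss_db / 10)

    def total_loss_db(loss_per_100ft_db: float, length_ft: float) -> float:
        """Total cable loss in dB, assuming attenuation scales linearly with length."""
        return loss_per_100ft_db * length_ft / 100

    print(1 - fraction_remaining(2))    # ~0.37: a 2 dB loss removes about 37% of the power
    print(1 - fraction_remaining(20))   # 0.99:  a 20 dB loss removes 99% of the power
    print(total_loss_db(5.4, 200))      # 10.8:  dB lost over 200 feet of RG-11 at 900 MHz

Once you know a cable's total loss in dB, the same conversion gives the percentage of signal that reaches the other end, whatever the cable length.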

—Mark Roberti, Editor, RFID Journal
