Rain Fade on Microwave Links
Rain fade refers primarily to the absorption of a microwave radio-frequency (RF) signal by atmospheric rain, snow or ice; these losses are especially prevalent at frequencies above 11 GHz. It also refers to signal degradation caused by electromagnetic interference at the leading edge of a storm front. Rain fade can be caused by precipitation at either the uplink or the downlink location. However, it does not need to be raining at a site for the link to be affected: the signal may pass through precipitation many miles away, especially if the satellite dish has a low look angle. An estimated 5 to 20 percent of rain fade, or satellite signal attenuation, may also be caused by rain, snow or ice accumulating on the uplink or downlink antenna reflector, radome or feed horn. Rain fade is not limited to satellite uplinks and downlinks; it also affects terrestrial point-to-point microwave links (those on the Earth's surface).
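To give a feel for how sharply rain attenuation grows with rain rate and frequency, ITU-R P.838 models the specific attenuation as a power law, gamma = k * R^alpha (dB/km), where R is the rain rate in mm/h and k and alpha are frequency- and polarization-dependent coefficients tabulated in the recommendation. The sketch below uses illustrative placeholder coefficients, not the official tables:

```python
# Sketch: specific rain attenuation via the ITU-R P.838 power law,
# gamma = k * R**alpha (dB/km). The k/alpha values below are
# illustrative placeholders; a real design must use the tabulated
# coefficients from ITU-R P.838 for the exact frequency and polarization.

def specific_attenuation(rain_rate_mm_h: float, k: float, alpha: float) -> float:
    """Specific attenuation in dB/km for a given rain rate (mm/h)."""
    return k * rain_rate_mm_h ** alpha

# Placeholder coefficients, roughly in the range seen near 12 and 23 GHz:
examples = {
    "~12 GHz (placeholder)": (0.02, 1.2),
    "~23 GHz (placeholder)": (0.12, 1.0),
}

for label, (k, alpha) in examples.items():
    for rain in (5, 25, 100):  # light shower .. tropical downpour, in mm/h
        gamma = specific_attenuation(rain, k, alpha)
        print(f"{label}, R = {rain:3d} mm/h -> {gamma:5.2f} dB/km")
```

Even with rough coefficients, the power law makes the key point: a downpour at a higher band can cost several dB per kilometre, which is why the bands above 11 GHz need explicit rain fade planning.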
Possible ways to overcome the effects of rain fade include site diversity, uplink power control, variable-rate encoding, receive antennas larger (i.e. higher gain) than required for clear-sky conditions, and hydrophobic coatings.
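Of these, uplink power control (UPC) is easy to illustrate: it is essentially a feedback loop that compares a measured level (often a satellite beacon) against a clear-sky reference and adds just enough transmit power to offset the inferred fade, capped by the amplifier's headroom. The sketch below is a minimal, hypothetical control step; the reference level, headroom and names are assumptions, not any vendor's algorithm:

```python
# Sketch: one step of a hypothetical uplink power control (UPC) loop.
# Real systems also scale the fade measured at the downlink (beacon)
# frequency up to the uplink frequency; that scaling is omitted here.

CLEAR_SKY_BEACON_DBM = -65.0   # assumed clear-sky beacon reference
MAX_UPLINK_BOOST_DB = 9.0      # assumed amplifier headroom

def uplink_boost_db(measured_beacon_dbm: float) -> float:
    """Estimate the fade from the beacon drop and command a matching
    transmit-power boost, clamped to the available headroom."""
    fade_db = CLEAR_SKY_BEACON_DBM - measured_beacon_dbm
    return min(max(fade_db, 0.0), MAX_UPLINK_BOOST_DB)

print(uplink_boost_db(-65.0))  # clear sky   -> 0.0 dB boost
print(uplink_boost_db(-71.5))  # 6.5 dB fade -> 6.5 dB boost
print(uplink_boost_db(-80.0))  # deep fade   -> clamped at 9.0 dB
```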
Two models are in general use for rain modelling: Crane and ITU. Microwave planners typically prefer the ITU model. A global map of rain-rate distribution according to the ITU model is shown below:
Used in conjunction with appropriate planning tools, this data allows prediction of the expected Operational Availability (in %) of a microwave link. Useful Operational Availability figures typically range from 99.9% (“three nines”) to 99.999% (“five nines”) and depend on the overall link budget, including frequency band, antenna sizes, modulation, receiver sensitivity and other factors.
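These percentages translate directly into permitted annual outage time, which is often the easier way to reason about them. A quick conversion (plain arithmetic, assuming a 365-day year):

```python
# Convert link availability (%) into allowed annual outage time,
# assuming a 365-day year (525,600 minutes).

MINUTES_PER_YEAR = 365 * 24 * 60

for availability_pct in (99.9, 99.99, 99.999):
    outage_min = (1 - availability_pct / 100) * MINUTES_PER_YEAR
    print(f"{availability_pct}% available -> "
          f"{outage_min:7.1f} min/year ({outage_min / 60:.2f} h of outage)")
```

So “three nines” tolerates roughly 8.8 hours of rain outage per year, while “five nines” allows only about 5.3 minutes, which is why the required fade margin grows quickly with each added nine.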
Another useful rain fade map, shown below, gives the rain rate exceeded for 0.01% of an average year (the 0.01% annual exceedance rate):
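This 0.01% exceedance rain rate, R_0.01, is the key input to the ITU terrestrial prediction method (ITU-R P.530): the specific attenuation at R_0.01 is multiplied by an effective path length that accounts for rain cells not filling the whole hop. The sketch below uses the classic distance-factor form from earlier P.530 revisions (valid for R_0.01 up to about 100 mm/h); current revisions use a revised distance factor, and the coefficients here are the same placeholders as above, so treat this as illustrative only:

```python
import math

# Sketch: path attenuation exceeded for 0.01% of the year on a
# terrestrial hop, using the classic effective-path-length form from
# earlier ITU-R P.530 revisions (illustrative; recent revisions differ).

def path_attenuation_001(r001_mm_h: float, d_km: float,
                         k: float, alpha: float) -> float:
    gamma = k * r001_mm_h ** alpha             # specific attenuation, dB/km
    d0 = 35.0 * math.exp(-0.015 * r001_mm_h)   # rain-cell size factor, km
    d_eff = d_km / (1.0 + d_km / d0)           # effective path length, km
    return gamma * d_eff                       # A_0.01, in dB

# Example: 10 km hop, R_0.01 = 42 mm/h, placeholder ~23 GHz coefficients
print(f"A_0.01 ~ {path_attenuation_001(42.0, 10.0, 0.12, 1.0):.1f} dB")
```

The resulting A_0.01 is the rain fade margin the link budget must cover to reach 99.99% availability on that hop; planning tools then scale it to other availability targets.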