Attenuation in cables is the ratio, in decibels, of the output power (or voltage) to the input power (or voltage) when the load and source impedances are matched to the characteristic impedance of the cable. If your terminations are properly done and matched, then the ratio of output to input power or voltage is the attenuation. Real-life measurements will yield loss values higher than the cable's rated attenuation, depending on the degree of mismatch. If you are analyzing it in terms of voltage ratio, attenuation can be determined with the following expression:

Attenuation (dB) = 20 log10(Vin / Vout)

where Vin = input voltage and Vout = output voltage.
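The voltage-ratio formula above can be sketched in a few lines of Python. This is a minimal illustration of the arithmetic, not a measurement procedure; the function name and example values are my own:

```python
import math

def attenuation_db(v_in, v_out):
    """Attenuation in decibels from input and output voltages.

    Assumes matched source and load impedances, so the voltage
    ratio alone determines the loss: 20 * log10(Vin / Vout).
    """
    return 20 * math.log10(v_in / v_out)

# Example: a signal enters at 1.0 V and exits at 0.5 V.
# Halving the voltage corresponds to roughly 6 dB of attenuation.
loss = attenuation_db(1.0, 0.5)
print(f"{loss:.2f} dB")  # ~6.02 dB
```

Note that a larger dB value means more loss: if Vout equals Vin, the attenuation is 0 dB.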
The Presence of Attenuation in Cables
Attenuation in cables is a loss of signal strength as the signal travels along the cable. The more attenuation the signal experiences, the weaker it arrives and the more limited your communication becomes.

One of the major causes of attenuation in cables is noise on the network. This can come from nearby power cables, radio sources, or other electrical currents. One way to combat this is to use shielded Ethernet cable.

Another thing to take into consideration when installing cable is the surrounding environment. If your cable is exposed to weather conditions and is not rated for outdoor use, attenuation can increase and signal strength will start to drop. One way to tackle this is to use outdoor-rated cable, which is built for outdoor conditions with the proper jacket material.

Probably the biggest cause of attenuation is cable distance. The recommended channel length for most categories of cable is 100 meters, which comprises 90 meters of backbone cable and 10 meters of patch cable. When you start getting into distances beyond this, you run into problems: signal strength decreases and the cable no longer performs to its category rating. So it is always recommended to follow the standards for cable terminations and cable lengths when running cables in your home or business.
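Since attenuation grows with distance, a quick back-of-the-envelope check can show why the 100-meter channel limit matters. The sketch below assumes a hypothetical loss figure of 20 dB per 100 meters (an illustrative number, not a value from any specific cable category spec):

```python
def channel_loss_db(length_m, db_per_100m=20.0):
    """Total attenuation for a cable run, assuming loss scales
    linearly with length (db_per_100m is an illustrative figure)."""
    return length_m / 100.0 * db_per_100m

def remaining_voltage_fraction(loss_db):
    """Fraction of the input voltage that survives a given loss in dB."""
    return 10 ** (-loss_db / 20.0)

# Compare a standards-compliant 100 m channel to an over-length 150 m run.
for length in (90, 100, 150):
    loss = channel_loss_db(length)
    frac = remaining_voltage_fraction(loss)
    print(f"{length:4d} m: {loss:5.1f} dB loss, "
          f"{frac:.1%} of input voltage remains")
```

Running this shows how quickly the surviving signal shrinks past the recommended length, which is why staying within the 90 m + 10 m channel budget is the standard advice.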