Abstract
Radio telemetry is commonly used in large, deep bodies of water to assess fish movement and habitat use. A factor often neglected in these studies is the influence of signal attenuation on results and conclusions. Signal attenuation depends on many factors, most importantly the depth of the transmitter in the water column and the conductivity of the water. While conducting a biotelemetry study in the Ohio River, we detected several fish in later searches that had gone undetected in prior search periods. Consequently, we hypothesized that telemetered fish in deep water may escape detection. We conducted an experiment to measure the influence of depth on the maximum distance at which a transmitter could be detected and found that an exponential decay model (distance = 0.9890 × e^(−0.2005 × depth)) best explained these data. Our results imply that radio telemetry studies may underestimate the use of deepwater habitats by fishes.
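The decay model reported in the abstract can be sketched numerically as follows. This is a minimal illustration only: the coefficients (0.9890 and 0.2005) are taken from the abstract, while the units of depth and distance and the negative sign in the exponent are assumptions consistent with an exponential decay relationship (detection distance shrinking as transmitter depth increases).

```python
import math

def detection_distance(depth, a=0.9890, b=0.2005):
    """Predicted maximum detection distance for a transmitter at a given depth.

    Assumptions: a and b are the coefficients quoted in the abstract; the
    negative exponent reflects the stated exponential *decay* with depth.
    Units are not specified in the abstract and are left unlabeled here.
    """
    return a * math.exp(-b * depth)

# Illustrative evaluation at a few depths: predicted detection
# distance declines monotonically as the transmitter goes deeper.
for depth in (0, 2, 5, 10):
    print(f"depth {depth}: predicted distance {detection_distance(depth):.4f}")
```

The practical implication mirrors the abstract's conclusion: under this model, a fish holding in deep water must be much closer to the receiver to be detected than one near the surface, so deepwater habitat use is systematically undercounted.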