Abstract
Many technical and environmental variables affect radio wave transmission and reception in aquatic environments. We used a controlled experimental design in three large North American rivers (the Illinois and Mississippi rivers in Illinois, and the Skeena River in British Columbia) to examine the effects of water conductivity, transmitter depth, electromagnetic noise, antenna height, and transmitter type on detection distance. Detection distance was significantly affected by water conductivity, transmitter depth, and electromagnetic noise from a boat motor. Detection distance and maximum depth of signal reception increased with decreasing conductivity, except at conductivities of 60 to 90 μS/cm (Skeena River). In contrast to terrestrial radiotelemetry, antenna height did not significantly affect detection distance. We also determined that a transmitter with a 3.5-V battery had an approximately 2.5 times greater detection distance than a transmitter with a 1.5-V battery. We provide quantitative results for several variables influencing radio wave transmission and reception, and we present two regression equations that researchers can use to estimate detection distances in rivers of similar conductivity and depth. Furthermore, we compared our detection distances with those calculated from theoretical equations and found that our field results yielded greater detection distances. We believe there are substantial gaps between results reported in technical papers and those obtained in field method studies; we argue for more collaborative research between aquatic telemetry users and engineers to reduce the current reliance on trial and error in telemetry study design.
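The regression equations mentioned above could be applied in code along the following lines. This is a minimal sketch only: the function name, the linear model form, and every coefficient below are illustrative placeholders, not the fitted values reported in the study; the signs merely reflect the qualitative findings (detection distance decreases as conductivity and transmitter depth increase).

```python
# Hypothetical sketch of a regression-based detection-distance estimate.
# Coefficients are illustrative placeholders, NOT the study's fitted values.

def estimate_detection_distance_m(conductivity_us_cm: float,
                                  depth_m: float,
                                  intercept: float = 500.0,
                                  b_conductivity: float = -0.8,
                                  b_depth: float = -40.0) -> float:
    """Linear model: distance = intercept + b1 * conductivity + b2 * depth.

    Negative coefficients encode the qualitative result that detection
    distance shrinks with higher conductivity and deeper transmitters;
    the magnitudes here are hypothetical.
    """
    distance = intercept + b_conductivity * conductivity_us_cm + b_depth * depth_m
    return max(distance, 0.0)  # a predicted distance cannot be negative

# Lower conductivity should yield a greater estimated detection distance.
low_cond = estimate_detection_distance_m(conductivity_us_cm=75.0, depth_m=2.0)
high_cond = estimate_detection_distance_m(conductivity_us_cm=400.0, depth_m=2.0)
print(low_cond > high_cond)  # True under these illustrative coefficients
```

A researcher fitting the published equations to their own river would replace the placeholder coefficients with the reported regression values for the conductivity range matching their site.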