Abstract
The method of matched asymptotic expansions is applied to determine the damping of gravity waves propagating in turbulent conditions. The effect of the turbulence is introduced through a general system of eddy-viscosity coefficients, whilst the turbulence itself is supposed to be confined to boundary layers adjacent to a rigid, impermeable bottom and to the free surface. The lowest-order damping in the system is found to be independent of the surface turbulence, and computations are made for a physically meaningful distribution of eddy viscosity in the lower boundary layer.