Abstract
We study the effect of losses on the phase noise of single-mode field states. The losses are described by the standard loss master equation, which we use to derive an upper bound on the growth of the phase noise as a function of time. We compare the time dependence of the phase noise of an initial coherent state to that of a state that initially has very small phase noise, with both states having the same initial mean photon number. While the small-phase-noise state is more susceptible to losses, the difference between its behaviour and that of the coherent state is not large.
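As a minimal numerical sketch of the coherent-state case: under the standard loss (amplitude-damping) master equation a coherent state remains coherent, with its amplitude decaying as exp(-kappa*t/2), so the mean photon number falls as n(t) = n0*exp(-kappa*t). Using the large-amplitude approximation that the phase uncertainty of a coherent state is roughly 1/(2*sqrt(n)), one can track how the phase noise grows with time. The function name and the decay rate symbol `kappa` below are illustrative choices, not notation from the paper.

```python
import numpy as np

def coherent_phase_noise(n0, kappa, t):
    """Approximate phase uncertainty of an initially coherent state
    after evolving for time t under the standard loss master equation.

    Assumptions (illustrative, large-amplitude regime):
      - amplitude damping keeps the state coherent,
        with n(t) = n0 * exp(-kappa * t);
      - phase uncertainty of a coherent state ~ 1/(2*sqrt(n)).
    """
    n_t = n0 * np.exp(-kappa * t)   # mean photon number decays exponentially
    return 1.0 / (2.0 * np.sqrt(n_t))

# Phase noise grows monotonically as photons leak out of the mode.
times = np.linspace(0.0, 2.0, 5)
noise = coherent_phase_noise(n0=100.0, kappa=1.0, t=times)
```

This reproduces the qualitative behaviour discussed above: as the mean photon number decays, the phase noise of the lossy coherent state increases.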