Abstract
Within the framework of generalized linear models, we explore the problem of finding maximum likelihood estimates when the design matrix depends on a nonlinear parameter vector. Generalized linear models (Nelder and Wedderburn, 1972) assume that the design matrix is given, while conditionally linear (also called partially linear) models (Golub and Pereyra, 1973; Kaufman, 1975) assume that the sample is drawn from a normal family. We combine the two techniques to find the maximum likelihood estimates of both the nonlinear and the conditionally linear parameters. In particular, three increments of the nonlinear parameter vector are defined: the reduced Gauss–Newton increment, the Kaufman increment, and the Golub–Pereyra increment. We show that the first two increments are equivalent up to initial values, and that the latter two are related by a linear transformation. Finally, we present an implementation of all three methods and compare them using numerical examples.