
Classical algorithms for approximate linear regression design applied to a first order model with heteroscedasticity

Gaffke, N.; Graßhoff, U.; Schwabe, R.

 

Preprint series: 12-08, Preprints

Publication status: Submitted to CSDA, Special Issue on Algorithms for Design of Experiments, May 29, 2012

MSC: 62K05 Optimal designs

Abstract: The basic structure of classical algorithms for the numerical computation of optimal approximate
linear regression designs is briefly summarized. The algorithms are applicable
if the convex and compact set of information matrices is simple enough that linear
functions can easily be minimized over that set. This limitation can be weakened to “fractional
minimization” of linear functions. In the case of multiple first-order regression on the
cube $[-1, 1]^k$ with heteroscedasticity caused by random coefficients with unknown means
but known dispersion matrix, such fractional linear minimization turns out to be nonconvex
quadratic minimization over the cube, which is known to be NP-hard. Hence applications
are restricted to small dimensions $k \leq 10$. In the special case that the dispersion matrix
of the random coefficients is diagonal, equivariance of the linear regression model allows
restriction to invariant designs, and the algorithms become applicable for any dimension $k$.
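
For illustration only, the inner subproblem mentioned in the abstract (minimizing a possibly indefinite quadratic over the cube $[-1, 1]^k$) can be attacked heuristically for small $k$. The following sketch is not the authors' algorithm; the matrix Q, the vector b, and the multistart projected-gradient scheme are assumptions chosen purely for demonstration.

```python
# Hypothetical sketch: heuristic minimization of a (possibly indefinite)
# quadratic q(x) = x' Q x + b' x over the cube [-1, 1]^k.
# NOT the paper's method; Q, b and the multistart projected-gradient
# scheme are illustrative assumptions only.
import numpy as np

def project_to_cube(x):
    """Clip a point onto the cube [-1, 1]^k."""
    return np.clip(x, -1.0, 1.0)

def quad(Q, b, x):
    """Quadratic objective q(x) = x' Q x + b' x."""
    return x @ Q @ x + b @ x

def minimize_quadratic_over_cube(Q, b, n_starts=50, n_iter=500, step=1e-2, seed=0):
    """Multistart projected gradient descent; only a heuristic, since the
    problem is NP-hard in general for indefinite Q."""
    rng = np.random.default_rng(seed)
    k = len(b)
    best_x, best_val = None, np.inf
    for _ in range(n_starts):
        x = rng.uniform(-1.0, 1.0, size=k)
        for _ in range(n_iter):
            grad = (Q + Q.T) @ x + b   # gradient of x'Qx + b'x
            x = project_to_cube(x - step * grad)
        val = quad(Q, b, x)
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val

if __name__ == "__main__":
    # Small toy instance (k = 3) with an indefinite Q, so the objective is nonconvex.
    Q = np.array([[1.0, 0.5, 0.0],
                  [0.5, -2.0, 0.3],
                  [0.0, 0.3, 0.5]])
    b = np.array([0.2, -0.1, 0.4])
    x_opt, val = minimize_quadratic_over_cube(Q, b)
    print("approximate minimizer:", x_opt, "value:", val)
```

Note that when the quadratic is concave, its minimum over the cube is attained at a vertex, so exhaustive enumeration of the $2^k$ vertices is exact and practical for the small dimensions $k \leq 10$ mentioned above; for genuinely indefinite Q, only heuristics such as the sketch above or global (e.g. branch-and-bound) solvers are available.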

Keywords: Information matrix; D- and I-optimality; Quasi-Newton method; nonconvex quadratic minimization; invariant design

Upload: 2012-06-08
