Indeed, the least-squares problem in step 9 of Algorithm 1 involves the small upper-Hessenberg matrix H_k, so it can be solved quickly by a sequence of Givens rotations.
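A step of this kind can be sketched generically (the shapes and names below are assumptions for illustration, not the paper's Algorithm 1): each Givens rotation zeroes one subdiagonal entry of the Hessenberg matrix, after which the reduced problem is triangular.

```python
import numpy as np

def solve_hessenberg_lsq(H, beta):
    """Solve min ||beta*e1 - H y|| for a (k+1) x k upper-Hessenberg H
    using a sequence of Givens rotations, GMRES-style."""
    m, k = H.shape                      # m = k + 1
    R = H.astype(float).copy()
    g = np.zeros(m)
    g[0] = beta
    for i in range(k):
        # Rotation chosen to zero the subdiagonal entry R[i+1, i]
        r = np.hypot(R[i, i], R[i + 1, i])
        c, s = R[i, i] / r, R[i + 1, i] / r
        G = np.array([[c, s], [-s, c]])
        R[[i, i + 1], i:] = G @ R[[i, i + 1], i:]
        g[[i, i + 1]] = G @ g[[i, i + 1]]
    # Back-substitution on the now upper-triangular k x k block
    y = np.linalg.solve(R[:k, :k], g[:k])
    return y, abs(g[k])                 # solution and residual norm
```

The final entry of the rotated right-hand side gives the residual norm for free, which is why GMRES monitors convergence this way.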

By solving the weighted least-squares problem in (3), we can obtain the local linear estimate of m(x) at x = x_0 as
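A kernel-weighted fit of this kind can be sketched as follows; this is a generic local linear smoother with a Gaussian kernel, not a reproduction of the article's equation (3) or its weights:

```python
import numpy as np

def local_linear_estimate(x, y, x0, h):
    """Local linear estimate of m(x0): fit intercept + slope by
    kernel-weighted least squares; the intercept is the estimate."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)          # Gaussian kernel weights
    X = np.column_stack([np.ones_like(x), x - x0])  # local design matrix
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)
    return beta[0]                                  # m_hat(x0)
```

Centering the design matrix at x_0 makes the intercept coefficient equal to the estimate of m(x_0) directly.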

then select an incoming column solving the least-squares problem.

In Section 2 we give a brief overview of tensor methods for nonlinear least-squares problems (tensor methods for nonlinear equations can be regarded as a special case of these).

The first three examples apply the QSip solver to minimize benchmark least-squares problems with different nonsmooth regularizers that are QS representable; the last example applies the solver to a sparse logistic-regression problem on a standard data set.

Here p_i denotes the minimizer of the least-squares problem (2.3) for the given set W.

We are concerned with the solution of large-scale linear least-squares problems.

For example, if the exact solution x^* to (1.1) is in the subspace W^[??]_p ∩ V_p, then the approximate solution obtained by solving the least-squares problem (1.1) in S_{p,j} could provide higher accuracy than the one in [MATHEMATICAL EXPRESSION NOT REPRODUCIBLE IN ASCII].

Thus (1.2) is equivalent to the l_1-regularized least-squares problem.

This problem, which we call the linearized least-squares problem for rational interpolation, is the starting point of the algorithm we recommend in this article, and we describe the mathematical basis of how we solve it in the next section.
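One standard way to solve an l_1-regularized least-squares problem of the kind mentioned above is iterative soft-thresholding (ISTA); the following is a minimal sketch of that general technique, not the article's algorithm:

```python
import numpy as np

def ista(A, b, lam, steps=500):
    """Solve min 0.5*||Ax - b||^2 + lam*||x||_1 by iterative
    soft-thresholding (ISTA): a gradient step on the smooth part
    followed by the soft-threshold (prox of the l1 term)."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        g = A.T @ (A @ x - b)            # gradient of 0.5*||Ax - b||^2
        z = x - g / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return x
```

The soft-threshold step is what produces exact zeros in the solution, which is the point of the l_1 penalty.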

Hough and Vavasis in [17] developed an algorithm to solve an ill-conditioned full-rank weighted least-squares problem, using an RRQR factorization as part of their algorithm.
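The rank-revealing idea can be illustrated with SciPy's column-pivoted QR; this is a sketch of the general technique, not the algorithm of Hough and Vavasis:

```python
import numpy as np
from scipy.linalg import qr

def numerical_rank(A, tol=1e-10):
    """Estimate numerical rank from a column-pivoted ("rank-revealing")
    QR factorization: pivoting pushes the large diagonal entries of R
    to the front, so their magnitudes expose the effective rank."""
    _, R, _ = qr(A, pivoting=True)
    d = np.abs(np.diag(R))
    if d[0] == 0.0:
        return 0
    return int(np.sum(d > tol * d[0]))
```

In a weighted least-squares solver, such a factorization lets the algorithm detect and handle the directions that the ill-conditioned weighting has effectively annihilated.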

When solving nonlinear least-squares problems, having good initial values for all the parameters is crucial for obtaining good results.
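A minimal illustration of the point, using plain Gauss-Newton on an exponential fit; the model, data, and log-based initial guesses are generic assumptions, unrelated to any particular article:

```python
import numpy as np

def gauss_newton(residual, jac, p0, iters=50):
    """Plain Gauss-Newton: each step solves a linear least-squares
    problem with the Jacobian to update the parameters."""
    p = np.asarray(p0, dtype=float)
    for _ in range(iters):
        r, J = residual(p), jac(p)
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)
        p = p + step
    return p

# Fit y = a * exp(b * t).  A crude but data-driven initialization:
# a from the first sample, b from the log-ratio of the endpoints
# (exact here because the data are noiseless).
t = np.linspace(0.0, 2.0, 30)
y = 3.0 * np.exp(-1.5 * t)
res = lambda p: p[0] * np.exp(p[1] * t) - y
jac = lambda p: np.column_stack([np.exp(p[1] * t),
                                 p[0] * t * np.exp(p[1] * t)])
p0 = [y[0], np.log(y[-1] / y[0]) / (t[-1] - t[0])]
p_hat = gauss_newton(res, jac, p0)
```

Because the objective is nonconvex in (a, b), starting far from such data-driven guesses can send Gauss-Newton to a poor local minimum or cause divergence.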

Saunders, "LSMR: an iterative algorithm for sparse least-squares problems," SIAM Journal on Scientific Computing, vol.
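LSMR is available in SciPy as scipy.sparse.linalg.lsmr; a minimal usage sketch on a synthetic sparse problem (the sizes and tolerances are illustrative choices):

```python
import numpy as np
from scipy.sparse import random as sparse_random
from scipy.sparse.linalg import lsmr

# Build a sparse overdetermined system with a consistent right-hand side.
rng = np.random.default_rng(1)
A = sparse_random(200, 50, density=0.05, random_state=1, format="csr")
b = A @ rng.standard_normal(50)

# lsmr solves min ||Ax - b||_2 iteratively, using only products with
# A and A^T -- it never forms the normal-equations matrix A^T A.
x, istop, itn, normr, *_ = lsmr(A, b, atol=1e-12, btol=1e-12,
                                maxiter=1000)
```

Avoiding A^T A is the point for sparse problems: the normal equations square the condition number and can be much denser than A itself.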

The material is geared toward scientists and engineers who analyze and solve least-squares problems, but is also suitable for a graduate or advanced undergraduate course for students with a working knowledge of linear algebra and basic statistics.

He proceeds by examining vector spaces and linear transformations, explores the Moore-Penrose pseudoinverse, introduces singular value decomposition, and describes linear equations, projections, inner product spaces, norms, linear least-squares problems, and eigenvalues and eigenvectors.
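For reference, the Moore-Penrose pseudoinverse ties several of these topics together: it is built directly from the singular value decomposition, and it yields the minimum-norm least-squares solution. A minimal sketch, equivalent in effect to numpy.linalg.pinv:

```python
import numpy as np

def pinv_via_svd(A, tol=1e-12):
    """Moore-Penrose pseudoinverse from the SVD A = U S V^T:
    invert the singular values above a cutoff, zero the rest."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    s_inv = np.zeros_like(s)
    keep = s > tol * s[0]
    s_inv[keep] = 1.0 / s[keep]
    return Vt.T @ (s_inv[:, None] * U.T)
```

Applying the result to a right-hand side, x = A^+ b, gives the same answer as a least-squares solve, which is exactly the connection the book's chapter ordering reflects.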