Questions on the various algorithms used in linear algebra computations (matrix computations).


-1
votes
1 answer
26 views

LU factorization of a $2\times 2$ singular matrix [duplicate]

My book is asking me to show that every matrix of the form $A=\begin{pmatrix}0&a\\0&b\end{pmatrix}$ has an $LU$ factorization, and to show that even if $L$ is a unit lower triangular matrix that ...
1
vote
1 answer
53 views

numerical linear algebra 101

Since I'm a programmer and I need linear algebra, I'm starting to consider how to teach myself a little numerical linear algebra; not to really optimize things right from the start, but I would like ...
0
votes
1 answer
49 views

Spline Interpolation

I have four questions about splines. Any help would be greatly appreciated. 1) Boundary conditions for cubic spline interpolation to a set of data $a=x_{1}<x_{2}<\cdots<x_{m}$, like for ...
0
votes
0 answers
29 views

How to write a matrix equation for an underdetermined system

I am having difficulty writing the following equation in matrix form that I can then feed into a computer package to find solutions. The equation I have is: $f_i=g_i(1+\alpha\exp(2\pi i\lambda))$ ...
0
votes
0 answers
22 views

Prefactoring to solve many similar linear systems

I am designing an algorithm that needs to solve many (large) linear systems of the form $$\Phi^\top D_i\Phi \vec x_i=\vec r_i,$$ where $\Phi\in\mathbb{R}^{m\times n}$ with $m>n$ is fixed. We will ...
0
votes
0 answers
36 views

LU Equation Verification and Norms

The question is as follows: Let $A = LU$ be the $LU$ factorization of an $n\times n$ matrix $A$ with $|l_{ij}|\leq 1$. Let $a_i^T$ and $u_i^T$ denote the $i$-th rows of $A$ and $U$, respectively. Verify ...
0
votes
1 answer
17 views

Simple question about equivalence of two forms of PCA as trace maximization over an implicit distribution

This may be a soft question of sorts. One formulation of principal component analysis is trace maximization: $$\arg\max_U \mathbb{E}_x \left[\operatorname{tr}(U^Txx^TU)\right],$$ for $U^TU\le I$, and we assume that there is ...
1
vote
1 answer
32 views

projection of a matrix $U$ with respect to the spectral norm of $UU^T$

I'm reading a paper that defines a projector as follows: $P_{\perp}(U)$ is a "projection" (slight abuse of terminology) with respect to the spectral norm of $UU^T$ onto the set of $d\times d$ ...
0
votes
1 answer
33 views

Positive linear combination of vectors to produce a positive vector

Given a list of vectors, I want a linear combination with positive coefficients that produces a final vector with only positive values (EDIT: this final vector is unknown; any positive vector is ...
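Finding such a combination is a linear feasibility problem, so one way to attack it is with an LP solver. A minimal sketch with SciPy's `linprog`; the matrix `V` and the unit margins are illustrative assumptions, not part of the question:

```python
import numpy as np
from scipy.optimize import linprog

# Columns of V are the given vectors (illustrative example data).
V = np.array([[1.0, -1.0],
              [-1.0, 2.0],
              [2.0, 0.0]])
n = V.shape[1]

# Feasibility LP: find c with c_i >= 1 (strictly positive coefficients)
# and V @ c >= 1 entrywise (strictly positive result). The objective is
# zero because only feasibility matters.
res = linprog(c=np.zeros(n),
              A_ub=-V, b_ub=-np.ones(V.shape[0]),
              bounds=[(1.0, None)] * n)

if res.success:
    print("coefficients:", res.x)
    print("combination:", V @ res.x)
else:
    print("no strictly positive combination found")
```

Replacing "strictly positive" with ">= 1" is harmless here because the problem is invariant under scaling of the coefficient vector.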
4
votes
1 answer
42 views

Is Householder orthogonalization/QR practicable for non-Euclidean inner products?

The question Is there a variant of the Householder QR algorithm to orthonormalize a set of vectors with respect to an inner product if no orthonormal basis is known a priori? Background Let's ...
0
votes
0 answers
27 views

Minimizing an expression with linear constraints

Given a system of under-constrained (i.e. infinite solutions) linear equations (all values will be integers, all coefficients will be 0, 1, or -1), I want to pick values for the variables to minimize ...
1
vote
1 answer
42 views

What is the range of this function?

Let $\lambda_{1}(X)$ be the larger of the two eigenvalues of a symmetric matrix $X$. For fixed real numbers $a,b,c,d$, what is the range of $\lambda_{1}\left(\operatorname{diag}\left(a,b\right)-U\cdot ...
-1
votes
0 answers
26 views

How to prove this

Let $A'$ be a principal submatrix of the matrix $A$. Prove that $w(A') \subset w(A)$, where $w(A)=\{x^HAx : x^Hx=1,\ x \in \mathbb{C}^n\}\subset\mathbb{C}$.
0
votes
1 answer
34 views

$W(A)=\{x^HAx : x^Hx=1,\ x\in \mathbb{C}^n\}$, $A\in \mathbb{R}^{n\times n}$: how do I show that this set is symmetric with respect to the real axis?

I need help to solve this task, so I would accept any suggestion: if $A\in \mathbb{R}^{n\times n}$, show that the set $W(A)=\{x^HAx : x^Hx=1,\,x\in \mathbb{C}^n\}$ is symmetric with respect ...
-1
votes
1 answer
70 views

How do I show that $W(A)$ is a symmetric set with respect to the real axis? [closed]

I need help to solve this task, so I would accept any suggestion: if $A \in \mathbb{R}^{n\times n}$ is a square matrix, show that $W(A)$ is a symmetric set with respect to the real axis.
0
votes
0 answers
11 views

Mesh based solution of thin plate spline interpolation

Irregularly spaced elevation data $\{z(x_i,y_i)\}_{i=1}^{n}$ can be interpolated using thin plate splines. One way to do this is to use RBFs: link. According to "Scattered Data Interpolation and ...
0
votes
1 answer
21 views

Given an $M \times N$ matrix, is there a way to produce an orthogonal set of $N$ vectors of length $M$, where $M < N$?

Gram-Schmidt orthogonalization would only use the first $M$ vectors to generate a basis of size $M \times M$.
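For what it's worth, the answer is no when $M < N$: $\mathbb{R}^M$ contains at most $M$ mutually orthogonal nonzero vectors, so no procedure can return $N$ of them. A quick sketch with NumPy's reduced QR; the example sizes are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
M, N = 3, 5
A = rng.standard_normal((M, N))

# Reduced QR of an M x N matrix with M < N: Q has only M columns,
# an orthonormal basis for the column space of A.
Q, R = np.linalg.qr(A)
print(Q.shape)  # (3, 3): no more than M orthogonal vectors come out
```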
0
votes
0 answers
47 views

Can I realize constraints into an optimization problem like this?

I have a question about adding constraints to optimization problems. I'm reading the book Independent Component Analysis by Hyvärinen and Oja (2001). On page 178, the following formula is given: ...
0
votes
0 answers
40 views

Is there any risk in transforming to $(B^{T} \otimes A)\operatorname{vec}(X)=\operatorname{vec}(C)$ for solving $AXB=C$ for $X$?

To solve the equation $AXB=C$ for $X$, we can use the properties of the vec operator and the Kronecker product to transform it to $(B^{T}\otimes A)\operatorname{vec}(X)=\operatorname{vec}(C)$, where ...
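The identity itself is easy to check numerically. A small sketch using column-major `vec`; the random example matrices are assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((2, 2))
X_true = rng.standard_normal((3, 2))
C = A @ X_true @ B

# vec stacks columns, so AXB = C  <=>  (B^T kron A) vec(X) = vec(C).
K = np.kron(B.T, A)
x = np.linalg.solve(K, C.flatten(order="F"))
X = x.reshape(X_true.shape, order="F")
print(np.allclose(X, X_true))  # True
```

The main risks are cost and conditioning: if $A$ is $n\times n$ and $B$ is $p\times p$, the Kronecker system is $np\times np$, so a dense direct solve jumps from $O(n^3+p^3)$ to $O(n^3p^3)$, and $\kappa_2(B^T\otimes A)=\kappa_2(A)\,\kappa_2(B)$, which can be much worse than either factor alone.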
0
votes
0 answers
28 views

Linear optimization: doubt about the Big M method

I'm trying to implement the Big M method in JavaScript, but I have some difficulty completely understanding it. I'm reading this explanation. However, I get stuck on the final step when ...
-4
votes
1 answer
49 views

Can we conclude that this matrix is positive definite? [duplicate]

Let $A$ be an $n\times m$ matrix. Suppose that the columns of $A$ are linearly independent. Can we conclude that $A^TA$ is positive definite? Could you help me with a proof? Thanks.
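Yes; one standard one-line argument runs as follows. For any $x \neq 0$,

```latex
x^{T}(A^{T}A)x \;=\; (Ax)^{T}(Ax) \;=\; \lVert Ax\rVert_2^{2} \;>\; 0,
```

since $Ax = 0$ would exhibit a nontrivial linear dependence among the columns of $A$. Together with the obvious symmetry of $A^TA$, this gives positive definiteness.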
1
vote
0 answers
43 views

What is the Moore-Penrose pseudoinverse for scaled linear regression?

The matrix equation for linear regression is: $$ \vec{y} = X\vec{\beta}+\vec{\epsilon} $$ The least-squares solution of this gives the normal equations: $$ ({\bf{X}}^T \bf{X}) \vec{\beta}= ...
1
vote
0 answers
20 views

Short-cut to a group of long sums/differences

If I have data $a,b,c,d$, and want to calculate $x=a+b-c-d$, $y=a-b-c+d$ and $z=a+b+c-d$, I can save three adds by doing $e=a-c$, $f=b-d$, then $x=e+f$,$y=e-f$, $z=a+c+f$. If I have 100 data values ...
0
votes
0 answers
31 views

Very simple question about solving linear systems numerically by using LU decomposition

Say we have the equation $Ax=b$ where we want to calculate $x$. Why is it useful to compute an $LU$ decomposition of $A$ (if possible)? What are the numerical advantages?
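The usual answer: the factorization costs $O(n^3)$ once, after which each triangular solve costs only $O(n^2)$, so it pays off whenever the same $A$ is reused with several right-hand sides (and partial pivoting keeps the elimination stable). A sketch with SciPy; the example matrix is an assumption:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4)) + 4 * np.eye(4)  # well-conditioned example

lu, piv = lu_factor(A)            # O(n^3), done once
for _ in range(3):                # each solve reuses the factors, O(n^2)
    b = rng.standard_normal(4)
    x = lu_solve((lu, piv), b)
    print(np.allclose(A @ x, b))  # True each time
```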
3
votes
1 answer
44 views

How to show that the Hessian matrix of $G$ is positive definite?

Let $\{g_i:X\subset\mathbb{R}\rightarrow\mathbb{R};\;i=1,...,m\}$ be a linearly independent set of real functions. Given $n$ points $(x_1,y_1),...,(x_n,y_n)\in X$, consider the following function ...
2
votes
1 answer
60 views

I would like a hint in order to prove that this matrix is positive definite

Let $a_{ij}$ be a real number for all $i,j\in\{1,...,n\}$. Consider the matrix below. $$B=\begin{bmatrix} \sum_{k=1}^n(a_{1k})^2 & \sum_{k=1}^na_{1k}a_{2k} & \cdots & ...
3
votes
1 answer
59 views

$\delta$ Notation in linear algebra

In the equation below, what is $\delta_{l,q}$ denoting? Is $\delta$ a standard notation, or does it have anything to do with all ones or the basis matrix, etc.? $$A_{ij}=\delta_{l,q}\left(\sum_{h=1}^n B_{l,h} + ...
2
votes
1 answer
40 views

Numerical methods for eigenvalues of $A \in \mathbb{C}^{n \times n}$

I've been writing a linear algebra library in C# for a while as an intellectual exercise, and it's gotten vastly more sophisticated than I originally thought it would. When I started adding methods ...
3
votes
1 answer
38 views

Finding the smallest subset of a set of vectors which contains another vector in the span

Consider a set $S=\{ \underline{v_1},\dots , \underline{v_n} \} $ of vectors of dimension $d<n$. Suppose for some vector $\underline{b}$ that the solution space for the matrix equation $\left[ ...
0
votes
1 answer
35 views

Spectral radius of $A$ and convergence of $A^k$

I'm trying to understand the proof of the first theorem here. Maybe it's very simple, but I would like your help because I need to understand this; I don't have much time, and my knowledge of this subject is ...
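The theorem in question, that $\rho(A)<1$ implies $A^k\to 0$, is easy to sanity-check numerically; a small sketch with an assumed example matrix:

```python
import numpy as np

A = np.array([[0.5, 0.4],
              [0.1, 0.3]])

rho = max(abs(np.linalg.eigvals(A)))   # spectral radius, about 0.62 here
Ak = np.linalg.matrix_power(A, 200)

print(rho < 1)             # True
print(np.abs(Ak).max())    # essentially zero
```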
1
vote
0 answers
31 views

How to Diagonalize an Extremely Large Sparse Matrix in SLEPc/PETSc

Dear Friends, recently I have started learning SLEPc/PETSc, but I haven't found a way to solve my problem. I have to diagonalize a big sparse matrix which comes from a two-dimensional quantum ...
1
vote
1 answer
69 views

If modulus of each one of eigenvalues of $B$ is less than $1$, then $B^k\rightarrow 0$

Let $B$ be an $n\times n$ matrix and let $X$ be the set of all eigenvalues of $B$. Prove that if $|m|<1$ then $\lim \limits_{k\rightarrow\infty}B^k=0$, where $m=\max X$. Thanks. Actually, there ...
2
votes
1 answer
74 views

Radial coordinate evaluation

Details of the question can be found in the article, equations (55, 56). A radial coordinate $R$ is defined by \begin{equation} r=\frac{2R}{\kappa(1-R^2)} \,, \end{equation} where $\kappa$ is a constant and ...
2
votes
1 answer
64 views

The Miracle of Newton-Cotes

Background and motivation: Suppose I want to approximate the integral $\int_{a}^{b}f(x)dx$ using evenly-spaced sampled values of $f$: $f(a + \frac{i}{n}(b-a)), i=0,\cdots,n$. By the linearity of ...
3
votes
1 answer
39 views

affine transformation of a polynomial

If I have a set of polynomials defined on six points of a triangle, such that $\phi_i(p_j) = \delta_{ij}$, how do I use an affine transformation to get new polynomials so that $\bar{\phi_i}(\bar{p}_j) ...
1
vote
2 answers
32 views

Space spanned by matrices

I have a set of twenty $5\times 5$ matrices $M_1, M_2, \dots, M_{20}$. I want to try to find a basis from this set and also to find relationships between these matrices. This is how I think I should approach the ...
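One common approach is exactly this flattening idea: treat each $5\times 5$ matrix as a vector in $\mathbb{R}^{25}$, stack them, and read off the rank and the null space. A sketch with stand-in random matrices (the real $M_1,\dots,M_{20}$ would replace them):

```python
import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(3)
# Stand-ins for the matrices: six random 5x5 matrices plus one
# deliberate linear combination, so there is a relationship to find.
mats = [rng.standard_normal((5, 5)) for _ in range(6)]
mats.append(mats[0] + 2 * mats[1])

V = np.array([M.flatten() for M in mats])  # one row of length 25 per matrix
rank = np.linalg.matrix_rank(V)
print(rank)                                # 6: only six are independent

# Each column c of the null space of V^T satisfies sum_i c_i M_i = 0,
# i.e. it encodes one linear relationship among the matrices.
relations = null_space(V.T)
print(relations.shape[1])                  # 1 relationship in this example
```

The rows of `V` spanning a subspace of dimension `rank` give the basis; any maximal independent subset of the matrices (e.g. found by Gaussian elimination on `V`) serves as one.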
1
vote
1 answer
72 views

Represent in a matrix form: $\sum_{ijl}(F_{il}-F_{jl})^2W_{ij}$

How can I represent this in matrix form: $\sum_{ijl}(F_{il}-F_{jl})^2W_{ij}$, where all the entries are real, $W$ is a known (constant) matrix, and $F$ is a rectangular matrix. When I say matrix ...
0
votes
0 answers
23 views

Shear decomposition

Is there an algorithm for decomposing a square matrix (or a similar matrix to it) in to shear and diagonal matrices? All the usual decompositions (Schur, SVD, QR, LU, etc.) don't seem to help. ...
1
vote
1 answer
36 views

What is the fastest algorithm to solve the eigenvector of a transition matrix of a Markov Chain?

Given a transition matrix of a Markov chain, $P$, I want to solve the left eigenvector of $P$, namely a row vector $\alpha$ such that $$ \alpha P = \alpha $$ I know the algorithm to solve a linear ...
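For small dense $P$ a direct eigensolve already works, since $\alpha P = \alpha$ just says $\alpha^T$ is an eigenvector of $P^T$ for eigenvalue $1$. A minimal sketch; the 2-state chain is an assumed example:

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# alpha P = alpha  <=>  P^T alpha^T = alpha^T (eigenvalue 1).
w, v = np.linalg.eig(P.T)
i = np.argmin(np.abs(w - 1.0))
alpha = np.real(v[:, i])
alpha /= alpha.sum()                  # normalize to a probability vector

print(alpha)                          # [5/6, 1/6] for this chain
print(np.allclose(alpha @ P, alpha))  # True
```

For large sparse chains, power iteration (repeatedly updating $\alpha \leftarrow \alpha P$) or a sparse eigensolver such as `scipy.sparse.linalg.eigs` applied to $P^T$ is usually the practical choice.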
1
vote
1 answer
65 views

How to find the unknown values in this Numerical Integration type?

Given the following type of numerical integration: $$I(f)=\int_0^1 f(x) \, dx \approx \frac 12 f(x_{0}) +c_1 f(x_1)$$ a) Find the values of the coefficient $c_1$ and the points $x_0$ and $x_1$ so ...
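One standard route for part (a): demand exactness on $1$, $x$, and $x^2$ and solve the resulting moment equations (this assumes the rule is meant to have degree of exactness $2$):

```latex
\tfrac12 + c_1 = \int_0^1 1\,dx = 1, \qquad
\tfrac12 x_0 + c_1 x_1 = \int_0^1 x\,dx = \tfrac12, \qquad
\tfrac12 x_0^2 + c_1 x_1^2 = \int_0^1 x^2\,dx = \tfrac13 .
```

The first equation forces $c_1 = \tfrac12$, so $x_0 + x_1 = 1$ and $x_0^2 + x_1^2 = \tfrac23$. Then $x_0 x_1 = \tfrac12\bigl((x_0+x_1)^2 - (x_0^2+x_1^2)\bigr) = \tfrac16$, so $x_0, x_1$ are the roots of $t^2 - t + \tfrac16 = 0$, namely $t = \tfrac12\bigl(1 \mp \tfrac{1}{\sqrt3}\bigr)$.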
0
votes
0 answers
65 views

A minimax problem

Let $\omega$ be a nonzero real number, $(\lambda_i)_{i = 1}^n$ be a sequence of $n$ real numbers, and $$u_{i}:=1-(1-\lambda_{i})\,\omega.$$ Show that if we pick ...
1
vote
0 answers
42 views

Existence criteria for the LU decomposition of a tridiagonal matrix

In this link, the following result is presented without proof: Let $a, b, c$ be the lower off-diagonal, diagonal, and upper off-diagonal elements of a tridiagonal matrix. A pivotless $LU$ ...
1
vote
1 answer
44 views

Non-monotonic decrease of residuals in Conjugate Gradients

In some of my numerical programming using conjugate gradient solvers, I noticed an alarming problem: The residuals were not monotonically decreasing to zero, but were sometimes increasing. In this ...
1
vote
0 answers
46 views

About the Generalized singular value decomposition (GSVD).

I have studied singular value decomposition (SVD) and solved a few numerical examples to understand it. Now I am studying the generalized singular value decomposition (GSVD). I followed this ...
0
votes
0 answers
23 views

Can I detect repeated eigenvalue by inverse iteration?

Suppose all eigenvalues of $A$ are nonnegative. By using inverse iteration with $A-\mu I$ for many values of $\mu\ge 0$, I can find eigenvalues of $A$. If $A$ is an $n\times n$ matrix and has $n$ different ...
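For context, a bare-bones shifted inverse iteration looks like the sketch below (the example matrix and shift are assumptions). Note that a single run only returns one vector in the eigenspace nearest the shift, so by itself it cannot reveal an eigenvalue's multiplicity; that requires deflation or several runs with orthogonalized starting vectors.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

def inverse_iteration(A, mu, iters=50):
    """Power iteration on (A - mu*I)^{-1}: converges to the eigenpair
    whose eigenvalue lies closest to the shift mu."""
    n = A.shape[0]
    lu, piv = lu_factor(A - mu * np.eye(n))  # factor once, reuse each step
    v = np.ones(n) / np.sqrt(n)
    for _ in range(iters):
        v = lu_solve((lu, piv), v)
        v /= np.linalg.norm(v)
    return v @ A @ v, v                      # Rayleigh quotient, eigenvector

A = np.diag([1.0, 2.0, 5.0])
lam, v = inverse_iteration(A, mu=1.9)
print(lam)   # about 2.0, the eigenvalue nearest the shift
```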
2
votes
3 answers
130 views

On integral of a function over a simplex

Help with the following general calculation and references would be appreciated. Let $ABC$ be a triangle in the plane. Then for any linear function of two variables $u$, $$ \int_{\triangle}|\nabla ...
0
votes
0 answers
57 views

Shifted inverse power method in Octave.

EDIT: OK, I've managed it. It was a very stupid bug... I must write p = L\(P*z0), etc. I'm trying to write a function which returns a vector $a$ (the vector of eigenvalues of a matrix $A=A^T \in ...
0
votes
0 answers
10 views

Error bound on matrix vector multiplication

I am multiplying a matrix $A$ by a vector $p$. However, the matrix $A$ isn't accurate: a very small fraction of the elements have their value changed from $a_{i,j}$ to one of $\{0, -a_{i,j}, 2a_{i,j}\}$. ...
2
votes
0 answers
28 views

Solver for sparse linearly-constrained non-linear least-squares

Reposted from stackoverflow on the advice of Nick Rosencrantz: Are there any algorithms or solvers for non-linear least-squares problems where the Jacobian is known to always be sparse, and ...
1
vote
0 answers
24 views

Algorithm to compute a similarity transformation

I have a similarity transformation of matrices of the type $B = P^{-1}AP$. It is known that $A$ and $P$ are invertible matrices, but not orthogonal. Given that I have the matrices $P$ and $A$, I ...
