# Need to calculate the values of all the entries of a $3 \times 3$ matrix?

I have an image-handling application in which I have a matrix equation as listed below:

$A \cdot R=I$

where,

$A = 3 \times 3$ matrix (constants)

$R = 3 \times 1$ matrix (column vector). Let's call this the real result.

$I = 3 \times 1$ matrix. Let's call $I$ the ideal result.

I know the values of matrix $I$ and also of $R$. I need to find the matrix $A$ that, when multiplied by $R$, would give me matrix $I$.

How can I set this up in matrix algebra and solve it to calculate $A$?

Any pointers would be helpful.

Thanks.


Let $a_{ij}$ be the entry in the $i$-th row and $j$-th column of the matrix $A$. Then each equation of the form $AR = I$ is basically 3 linear equations in the unknowns $a_{ij}$. If there are 24 pairs, then you have $24 \cdot 3$ linear equations in $9$ unknowns. This system may have no solution in general. However, you can use linear least squares to find a solution that best fits all those equations.

In fact, if you have $24$ pairs of **I** and **R**:

$$ \mathsf A \cdot \mathbf R_i = \mathbf I_i \qquad i = 1, 2, \dotsc, 24 $$

Since there are only $9$ variables, one for each entry of $A$, but $24 \times 3 = 72$ linear equations, the problem is overdetermined and there may be no solution. However, if you accept some errors **ε**,

$$ \mathsf A \cdot \mathbf R_i + \boldsymbol \epsilon_i = \mathbf I_i \qquad i = 1, 2, \dotsc, 24 $$

it is possible to find such an $A$, as this introduces $72$ extra variables (the problem becomes underdetermined). The best thing, of course, is to minimize the total error. We first write $A$ as a column of row vectors:

$$ \mathsf A = \begin{bmatrix} \mathbf A_1 \\\\ \mathbf A_2 \\\\ \mathbf A_3 \end{bmatrix} $$

then we have

$$ \mathbf R_i \cdot \mathbf A_j + \epsilon_{ij} = I_{ij} \qquad i = 1, \dotsc, 24; j = 1, 2, 3 $$

This is exactly a linear regression problem (if we minimize the sum of squares of the errors):

$$ \mathsf X \cdot \boldsymbol \beta_j + \boldsymbol \epsilon_j = \mathbf y_j $$

with

\begin{align} \mathsf X &= \begin{bmatrix} R_{1,1} & R_{1,2} & R_{1,3} \\\\ \vdots & \vdots & \vdots \\\\ R_{24,1} & R_{24,2} & R_{24,3} \end{bmatrix} \\\\ \boldsymbol\beta_j &= \begin{bmatrix} A_{j,1} \\\\ A_{j,2} \\\\ A_{j,3} \end{bmatrix} \\\\ \mathbf y_j &= \begin{bmatrix} I_{1,j} \\\\ \vdots \\\\ I_{24,j} \end{bmatrix} \end{align}

With these we can use any software supporting regression (or do it by hand) to find the best estimate of each $A_j$, and thus the whole matrix $A$.

(This can easily be generalized to $N \gg 24$ pixels.)
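A minimal sketch of the per-row regression described above, using NumPy's `lstsq`; the data here are synthetic, generated only for illustration (an assumed `A_true` and random $R_i$):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 24 known (R, I) pairs, stacked as columns.
A_true = rng.standard_normal((3, 3))      # the unknown matrix we want to recover
R = rng.standard_normal((3, 24))          # the 24 column vectors R_i
I = A_true @ R                            # the corresponding ideal results I_i

# X is the design matrix (one row per pair); y_j collects the j-th
# component of every I_i.  Each least-squares solve recovers row j of A.
X = R.T                                   # shape (24, 3)
A_est = np.empty((3, 3))
for j in range(3):
    y_j = I[j, :]                         # shape (24,)
    beta_j, *_ = np.linalg.lstsq(X, y_j, rcond=None)
    A_est[j, :] = beta_j

print(np.allclose(A_est, A_true))         # noise-free data, so recovery is exact
```

With noisy $I_i$ the same code returns the least-squares fit instead of an exact recovery.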

FIRST CASE. If you had just *one* pair of values for $I$ and $R$, as you said in your reply to Rasmus, then you would have a system of linear equations with infinitely many solutions.

For psychological reasons, let's write it this way:

$$ XA = B \ . $$

Here $A$ is your $R$, $B$ your $I$, and $X$ your $A$. If we transpose, we get a system of linear equations

$$ A^t X^t = B^t \ . $$

If $A^t = (a_1 \ a_2 \ a_3)$ and $B^t = (b_1 \ b_2 \ b_3)$, and $(x_i \ y_i \ z_i)$ is the $i$-th column of $X^t$, we have the linear equations

$$ a_1 x_i + a_2 y_i + a_3 z_i = b_i $$

for $i = 1, 2, 3$.

which, assuming $a_1 \neq 0$, you can solve like this:

$$ x_i = \frac{b_i}{a_1} - \frac{a_2}{a_1} y_i - \frac{a_3}{a_1} z_i \ . $$

Now, give $y_i$ and $z_i$ any values you want and you have a solution to your problem.
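The free choice of $y_i$ and $z_i$ can be sketched numerically; the particular values of $R$, $I$, $Y$, and $Z$ below are arbitrary examples (any choice with $a_1 \neq 0$ works):

```python
import numpy as np

# Hypothetical single pair (R, I); any values with R[0] != 0 will do.
R = np.array([2.0, -1.0, 3.0])            # (a1, a2, a3)
I = np.array([5.0, 0.5, -4.0])            # (b1, b2, b3)
a1, a2, a3 = R

# Choose y_i and z_i freely for each row i, then the x_i are forced:
# x_i = b_i/a1 - (a2/a1) y_i - (a3/a1) z_i
Y = np.array([1.0, 0.0, 7.0])             # arbitrary choices of y_i
Z = np.array([0.0, 2.0, -1.0])            # arbitrary choices of z_i
X_col = I / a1 - (a2 / a1) * Y - (a3 / a1) * Z

A = np.column_stack([X_col, Y, Z])        # row i of A is (x_i, y_i, z_i)
print(np.allclose(A @ R, I))              # A maps R onto I exactly
```

Every different choice of `Y` and `Z` yields another valid $A$, which is the infinite family of solutions described above.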

EDIT: Maybe I should elaborate my answer a little, in order to *actually* address your problem, which is one of overdetermined simultaneous systems of linear equations, as KennyTM says, since you do not have a preferred pair of values $(R, I)$, do you? In order to handle all $24$ pairs of values $(R, I)$ that you have, perhaps you should consider this SECOND CASE.

SECOND CASE. I'm sorry, but I'm changing the notation a little once more. At the end, I'll translate the solution back into yours.

Let

$$ AX = B $$

be a (simultaneous) system of linear equations such as yours, with $A$ a $24\times 3$ matrix ($24$ rows, $3$ columns), $X$ a $3\times 3$ matrix, and $B$ a $24 \times 3$ matrix.

**Hypothesis: Let's assume that our matrix $A$ (that is, your $R$) has rank $3$.**

(If this is not the case, the problem is more involved.)

Let's write the first system this way:

$$ x_1a_1 + y_1a_2 + z_1a_3 = b_1 \ . \qquad [1] $$

Here, $a_i$, $i = 1,2,3$, are the columns of $A$, $X_1 = \begin{pmatrix} x_1 & y_1 & z_1 \end{pmatrix}^t$ is the first column of $X$, and $b_1$ is the first column of $B$.

So we see at once a geometric interpretation of our system of equations: system [1] has a solution $\begin{pmatrix} x_1 & y_1 & z_1 \end{pmatrix}$ if and only if the vector $b_1$ belongs to the linear span generated by the columns of $A$:

$$ AX_1 = b_1 \quad \text{has a solution} \qquad \Longleftrightarrow \qquad b_1 \in [a_1, a_2, a_3] $$

What can we do if this is *not* the case? Look for the *closest* vector in $[a_1, a_2, a_3]$ to $b_1$.

This *closest* vector is, of course, the *orthogonal projection* of $b_1$ onto the subspace $[a_1, a_2, a_3]$.

According to Wikipedia, http://en.wikipedia.org/wiki/Orthogonal_projection, the matrix of this orthogonal projection in the standard basis of $\mathbb{R}^{24}$ is

$$ P_A = A (A^tA)^{-1}A^t \ . $$

So the best $X = \begin{pmatrix} X_1 & X_2 & X_3\end{pmatrix}$ for you solves $AX = P_A B$, that is,

$$ X = (A^tA)^{-1}A^tB $$

where $B = \begin{pmatrix} b_1 & b_2 & b_3 \end{pmatrix}$. Now, let's return to your notation: $A = X^t$, $R = A^t$, and $I = B^t$. So

$$ A = \left( (R R^t)^{-1} R \, I^t \right)^t = I R^t (RR^t)^{-1} $$

where $R$ and $I$ are now the matrices with *all* of your $R$'s and $I$'s as columns.
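A small NumPy sketch of this closed form, $A = I R^t (RR^t)^{-1}$, on synthetic data (the `A_true` and random pairs below are assumptions for illustration only):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stacked data: the columns of R and I are the 24 pairs.
A_true = rng.standard_normal((3, 3))
R = rng.standard_normal((3, 24))          # rank 3 with probability 1
I = A_true @ R

# The least-squares solution A = I R^t (R R^t)^{-1} derived above.
A_est = I @ R.T @ np.linalg.inv(R @ R.T)
print(np.allclose(A_est, A_true))         # consistent data, so recovery is exact
```

In practice a solver such as `np.linalg.lstsq` is preferred over forming the explicit inverse, for numerical stability.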

Given some $A$, you can measure how well it performs on the set of $R_1,\dots,R_{24}$ and $I_1,\dots,I_{24}$ by computing the error. For example:

$$ J = \sum_{i=1}^{24} \|A R_i - I_i \|^2 $$

where we have used the standard Euclidean norm. The goal is to find the $A$ that makes $J$ as small as possible.

We can express this more compactly by stacking the vectors $R_i$ and $I_i$ into larger $3 \times 24$ matrices. So if we let $R = [R_1 \ R_2 \ \dots \ R_{24}]$, and similarly for $I$, we can rewrite the error as:

$$ J = \|AR - I\|_F^2 $$

where we are now using the Frobenius norm.

This optimization problem can be solved in several ways. For example, you can differentiate $J$ with respect to $A$ and set the result equal to zero. This yields the optimality condition:

$$ ARR^T = IR^T $$

So any $A$ that satisfies the above equation will be optimal in the sense that it minimizes $J$. If $R$ is a full-rank matrix (very likely), then the optimal $A$ is given by:

$$ A = IR^T(RR^T)^{-1} $$
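The optimality condition $ARR^T = IR^T$ can be checked numerically even when the data are noisy and no exact solution exists; the noise level and random data below are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical noisy data: I_i = A_true R_i + noise, stacked as columns.
A_true = rng.standard_normal((3, 3))
R = rng.standard_normal((3, 24))
I = A_true @ R + 0.01 * rng.standard_normal((3, 24))

# lstsq minimises ||R^T A^T - I^T||_F, which is the same J as above.
A_est = np.linalg.lstsq(R.T, I.T, rcond=None)[0].T

# The minimiser satisfies the normal equations A R R^T = I R^T.
print(np.allclose(A_est @ R @ R.T, I @ R.T))
```

Because of the noise, `A_est` will differ slightly from `A_true`, yet it still satisfies the optimality condition exactly (up to floating-point precision).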

As Rasmus said, if you have only one $(I, R)$ pair, then there are many possibilities for $A$.

If you want any one such $A$, you can use the following.

Let $R = (x,y,z)$ and assume that $x \neq 0$.

Let $I = (a,b,c)$.

You can choose the matrix to be

```
[a/x 0 0]
[b/x 0 0]
[c/x 0 0]
```
