Sanity Checks are missives on a specific math point in need of clarification. I try to do so using the fewest words possible. Usually, this is still quite a few words.
Remember matrix-vector multiplication? We multiply a vector (call it x) by a matrix (call it A) and obtain a new vector: y = Ax. Somewhere in your travels you were told to stop thinking of this as matrix algebra and start thinking of it as a mapping. A is a matrix, yes, but it represents a linear mapping that operates on x and sends it somewhere, and we call the new thingy y.
Perhaps here you say da doi, of course a matrix-vector product is a linear transformation and a linear transformation is a matrix-vector product. But it's a bona fide theorem. Poke around a linear algebra textbook and you'll find something like: A transformation F: Rⁿ → Rᵐ is linear if and only if it is a matrix transformation. Can you prove it? I can't, but somebody can.
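If you'd rather see the correspondence than prove it, here's a minimal NumPy sketch: the matrix of a linear map falls out of applying the map to the standard basis vectors, one column at a time. (The particular function f below is a made-up example, not anything special.)

```python
import numpy as np

# The "linear map = matrix" correspondence in action: the matrix of a
# linear function is recovered column-by-column by applying the function
# to the standard basis vectors. (f is an arbitrary example of a linear map.)
def f(v):
    x, y = v
    return np.array([2*x + y, x - 3*y])  # linear by construction

e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
A = np.column_stack([f(e1), f(e2)])      # columns are f(e1) and f(e2)

v = np.array([4.0, -1.0])
assert np.allclose(A @ v, f(v))          # the matrix reproduces the map
```

Run it on any other linear f you like; the assert holds because linearity means f is completely determined by what it does to the basis.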
Today, however, we are interested in coordinate transformations. Suppose we have a mapping of the vector x into y represented by the matrix A. That is, y = Ax. Now, suppose we change the basis. In the new basis, x → x' and y → y'. Is it still true that y' = Ax'?
No, it is not.
"Why not?" leads us to the mathematical vista called a similarity transformation. But we're getting ahead of ourself. First, let's try a simple example. We'll start in the standard 2D Cartesian basis e1 = [ 1 0 ] and e2 = [ 0 1 ] then rotate 45° counterclockwise to obtain a new basis.
Let A = [ 1 0 ; 0 -1 ]. (This is Matlab notation, where rows of a matrix are separated by a semicolon. I'm using Matlab notation because formatting a matrix in html is a pain in the fur.) Let x = [ 1 1 ]. Then y = Ax = [ 1 -1 ]. (Note A is a reflection about the x axis -- I'm using geometric transformations so you can think in pictures.)
To obtain vector components in the new basis, we multiply by Q = [ cos(π/4) -sin(π/4) ; sin(π/4) cos(π/4) ]. So x' = Qx = [ 0 √2 ] and y' = Qy = [ √2 0 ]. Note Ax' = [ 0 -√2 ] ≠ y'. What has gone wrong?
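For the skeptical, here's a quick NumPy check of the counterexample, using the same A, x, and Q as above:

```python
import numpy as np

# Numerically checking the counterexample: naively applying A to the
# new-basis components x' does NOT give y'.
A = np.array([[1.0, 0.0], [0.0, -1.0]])   # reflection about the x axis
c, s = np.cos(np.pi/4), np.sin(np.pi/4)
Q = np.array([[c, -s], [s, c]])

x = np.array([1.0, 1.0])
y = A @ x                                  # [1, -1]
x_p, y_p = Q @ x, Q @ y                    # primed components

print(A @ x_p)                             # [0, -sqrt(2)] ...
print(y_p)                                 # ... but y' is [sqrt(2), 0]
assert not np.allclose(A @ x_p, y_p)       # the naive guess fails
```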
Since x and y were related in the original coordinates (to wit: y = Ax), and x' and y' are the same vectors just expressed in a new basis, we intuitively expect there to be a relation between x' and y'. Alas, our counterexample shows y' = Ax' isn't true, and one counterexample is all you need to screw the pooch. The good news is uncovering the correct form just takes a tiny splash of algebra.
Footnote: You do understand that "screws the pooch" is an engineering turn of phrase, yes? It means "an event that wrecks our plan, irrevocably and not infrequently involving explosions" (IIRC, it was invented at NASA). Every time I use this phrase in mixed company at work, I get a testy email from HR.
Note Q is invertible (geometrically, the inverse represents a 45° clockwise rotation; in fact, you can get Q⁻¹ simply by substituting -π/4 for π/4 into Q). So we can write x = Q⁻¹x' and y = Q⁻¹y', which gives Q⁻¹y' = AQ⁻¹x'. Rearranging, we obtain y' = QAQ⁻¹x', or y' = A'x', where A' = QAQ⁻¹. If we apply this to our example, we find Q⁻¹ = [ √2/2 √2/2 ; -√2/2 √2/2 ], A' = [ 0 1 ; 1 0 ], and A'x' = [ 0 1 ; 1 0 ] [ 0 √2 ] = [ √2 0 ] = y'. Huzzah.
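Here's the same derivation verified numerically (I'm letting NumPy compute the inverse rather than substituting -π/4, though either works):

```python
import numpy as np

# Verifying the derivation: A' = Q A Q^-1 maps x' to y' in the new basis.
A = np.array([[1.0, 0.0], [0.0, -1.0]])
c, s = np.cos(np.pi/4), np.sin(np.pi/4)
Q = np.array([[c, -s], [s, c]])
Qinv = np.linalg.inv(Q)            # same matrix as substituting -pi/4 into Q

x = np.array([1.0, 1.0])
y = A @ x
x_p, y_p = Q @ x, Q @ y

A_p = Q @ A @ Qinv
print(np.round(A_p, 12))           # [[0, 1], [1, 0]], matching the text
assert np.allclose(A_p @ x_p, y_p) # y' = A'x' holds
```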
A transformation of the form QAQ⁻¹ is called a similarity transformation. (Danger! The similarity transformation can be written either QAQ⁻¹ or Q⁻¹AQ, depending on whether you define Q as taking the unprimed coordinates into the primed coordinates or the primed into the unprimed.) There are all sorts of useful properties of a similarity transformation (A and A' have the same eigenvalues, for example). But our interest presently is the form itself. Look closer: A' = QAQ⁻¹. Transforming A involves two multiplications -- one factor of the forward direction (Q) and one factor of the inverse (Q⁻¹). This is usually just accepted and memorized, but it's worth thinking about. Why two factors? Why isn't A' = QA? And why one factor in each direction? Why isn't A' = QAQ or QQA? That seems like the logical extension of how vectors transform.
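The eigenvalue claim is easy to spot-check numerically with our A and Q:

```python
import numpy as np

# One of the useful properties mentioned above: a similarity transformation
# preserves eigenvalues. A and A' = Q A Q^-1 share the same spectrum.
A = np.array([[1.0, 0.0], [0.0, -1.0]])
c, s = np.cos(np.pi/4), np.sin(np.pi/4)
Q = np.array([[c, -s], [s, c]])
A_p = Q @ A @ np.linalg.inv(Q)

eig_A = np.sort(np.linalg.eigvals(A))
eig_Ap = np.sort(np.linalg.eigvals(A_p))
assert np.allclose(eig_A, eig_Ap)   # both spectra are {-1, +1}
```

(Geometrically this is no surprise: A and A' are the same reflection, just described in two different coordinate systems.)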
I assure you A' = QAQ⁻¹ is true. Algebra doesn't lie, although it does Lie (little joke there for the grad students joining us today). But it would be nice to have an intuitive explanation behind the symbols. Fortunately, an intuition is not hard to conjure. We just have to attach a few words to the algebra.
As we now know: y' = QAQ⁻¹x'.
But Q⁻¹x' = x. So we really have y' = QAx.
But Ax = y. So we really have y' = Qy.
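The lines above can be traced in code, one factor at a time:

```python
import numpy as np

# The three-step story: Q^-1 brings x' back to x, A acts in the old basis,
# and Q carries the result forward into the primed system. Composed, that
# is exactly Q A Q^-1.
A = np.array([[1.0, 0.0], [0.0, -1.0]])
c, s = np.cos(np.pi/4), np.sin(np.pi/4)
Q = np.array([[c, -s], [s, c]])
Qinv = np.linalg.inv(Q)

x_p = Q @ np.array([1.0, 1.0])     # start with the primed components x'

step1 = Qinv @ x_p                 # back to the old basis: recovers x = [1, 1]
step2 = A @ step1                  # apply A there:         gives   y = [1, -1]
step3 = Q @ step2                  # forward again:         y' = [sqrt(2), 0]

assert np.allclose(step1, [1.0, 1.0])
assert np.allclose(step3, (Q @ A @ Qinv) @ x_p)   # same as A'x' in one shot
```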
Adding more words: The factor Q⁻¹ maps x' back into x, then A operates on it to get y, then the factor Q maps the result back into the primed system. Ergo, QAQ⁻¹. Like the arrow in the FedEx logo, impossible to unsee once pointed out.
This may seem da doi, but it ain't in any textbook I've looked in. They're so busy with equivalence relations and kernels and homomorphisms that they forget to include an explanation. It's an all-too-common affliction. Don't believe me? Take the LabKitty Similarity Challenge: Google any of the following (taken from my recent browser history) and see if any lead you to clarity:
how does matrix transform
explaination similarity transformation
explanation similarity transformation
matrix triple product justification
donald trump antichrist
coordinate transformation of a matrix
wherefore art thou similarity transformation