
David Terr
Ph.D. Math, UC Berkeley

 


 

8.5. Coordinate Transformation Matrices

Brief Lecture on 2D Linear Transformations

     

Thus far in this chapter as well as the previous one, we have considered matrices solely for the purpose of solving systems of linear equations. In fact, matrices have many other applications. In this section, we look at one of them, namely coordinate transformations.

From Chapter 1, we know how to graph points in the plane in terms of their x and y coordinates. But what if we change our coordinate system, e.g. by rotating it, switching the roles of x and y, or changing the scale of x and/or y? Then we must relabel the points in order to reflect this change. How do we do this? We need to change our coordinates from (x, y) to a new pair of coordinates (x', y'). If the coordinate transformation is linear, then we can write it as a matrix equation. The most general linear coordinate transformation of R² has the following form:

  • (8.5.1a) x' = ax + by
  • (8.5.1b) y' = cx + dy

where a, b, c, and d are real-valued constants with ad - bc ≠ 0. Now we can easily see that this transformation may be expressed as a matrix equation as follows:

\[ \begin{pmatrix} x' \\ y' \end{pmatrix} = \begin{pmatrix} a & b \\ c & d \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} \]

What if we want to return to the old coordinates? Then we apply the inverse coordinate transformation, given by

\[ \begin{pmatrix} x \\ y \end{pmatrix} = \frac{1}{ad - bc} \begin{pmatrix} d & -b \\ -c & a \end{pmatrix} \begin{pmatrix} x' \\ y' \end{pmatrix} \]

These equations may be written symbolically as x' = Ax and x = A⁻¹x', where x and x' denote the column vectors with entries (x, y) and (x', y'). It is easy to see that the first of these equations implies the second: just multiply both sides on the left by A⁻¹. When written out as a system of linear equations, the inverse matrix equation becomes

  • (8.5.2a) x = (dx' - by') / (ad - bc)
  • (8.5.2b) y = (-cx' + ay') / (ad - bc)

Equations (8.5.1) and (8.5.2) tell us how points become transformed.
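For readers who like to check such formulas numerically, here is a minimal sketch in Python with NumPy (added here for illustration, not part of the original lecture); the entries a, b, c, d and the sample point are arbitrary values chosen only so that ad - bc ≠ 0.

```python
import numpy as np

# Sample coefficients with ad - bc != 0 (illustrative values only)
a, b, c, d = 1.0, 2.0, 3.0, 4.0
A = np.array([[a, b],
              [c, d]])

# Forward transformation: x' = A x
x = np.array([5.0, -1.0])
x_prime = A @ x

# Inverse transformation built from Equations (8.5.2)
det = a * d - b * c
A_inv = np.array([[d, -b],
                  [-c, a]]) / det

print(x_prime)          # the transformed coordinates (x', y')
print(A_inv @ x_prime)  # applying the inverse recovers the original point (5, -1)
```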

 

Example 1: Consider the following coordinate transformation A:

x' = 3x + 4y

y' = 2x + 3y

Determine how the following points get transformed by this transformation and its inverse.

  • (a) (3, 4)
  • (b) (16, -5)
  • (c) (1, 0)
  • (d) (0, 1)

Solution:

The matrix associated with the transformation is

\[ A = \begin{pmatrix} 3 & 4 \\ 2 & 3 \end{pmatrix} \]

and the matrix associated with its inverse is

\[ A^{-1} = \begin{pmatrix} 3 & -4 \\ -2 & 3 \end{pmatrix} \]

Thus, the inverse transformation is given by

x = 3x' - 4y'

y = -2x' + 3y'

  • (a) We have (x', y') = ((3)(3) + (4)(4), (2)(3) + (3)(4)) = (25, 18). For the inverse transformation, we have (x, y) = ((3)(3) - (4)(4), -(2)(3) + (3)(4)) = (-7, 6).
  • (b) We have (x', y') = ((3)(16) + (4)(-5), (2)(16) + (3)(-5)) = (28, 17). For the inverse transformation, we have (x, y) = ((3)(16) - (4)(-5), -(2)(16) + (3)(-5)) = (68, -47).
  • (c) We have (x', y') = ((3)(1) + (4)(0), (2)(1) + (3)(0)) = (3, 2). For the inverse transformation, we have (x, y) = ((3)(1) - (4)(0), -(2)(1) + (3)(0)) = (3, -2). Note that the coordinates of these points match those of the first columns of A and A⁻¹ respectively.
  • (d) We have (x', y') = ((3)(0) + (4)(1), (2)(0) + (3)(1)) = (4, 3). For the inverse transformation, we have (x, y) = ((3)(0) - (4)(1), -(2)(0) + (3)(1)) = (-4, 3). Note that the coordinates of these points match those of the second columns of A and A⁻¹ respectively.
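As a quick check of these computations, one can evaluate the same products numerically; the sketch below (Python with NumPy, added here for illustration) reproduces parts (a) through (d).

```python
import numpy as np

A = np.array([[3, 4],
              [2, 3]])
A_inv = np.array([[ 3, -4],
                  [-2,  3]])   # det(A) = 1, so the inverse has integer entries

points = [(3, 4), (16, -5), (1, 0), (0, 1)]
for p in points:
    p = np.array(p)
    # Print the point, its image under A, and its image under the inverse
    print(p, "->", A @ p, "and", A_inv @ p)
```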

 

It is also useful to know how various graphs become transformed by linear transformations. In this section, we will mainly concern ourselves with graphs of polygons. In the following chapter, we will also consider graphs of curves known as conic sections.

 

Example 2: Let T denote the triangle from Example 2 of Section 1.1, namely the triangle with vertices P = (-2, -2), Q = (0, 2), and R = (2, -2). Consider the following linear transformation:

x' = x + y

y' = x - y

Graph the original triangle T and the transformed triangle T'.

 

Solution: It is straightforward to show that the transformed vertices are P' = (-4, 0), Q' = (2, -2), and R' = (0, 4). Graphs of T and T' are shown below.

[Figure: the original triangle T with vertices P, Q, and R]

[Figure: the transformed triangle T' with vertices P', Q', and R']

An interesting property of linear transformations is how areas of regions get transformed. It turns out that a region R of area S gets transformed to a region R' of area S' = |D|S, where D is the determinant of the linear transformation. The absolute value must be used since areas are always nonnegative. Thus, in the above example, T' has twice the area of T since the determinant of the transformation is -2.
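A short numerical check of Example 2 (Python with NumPy, added here for illustration): it computes the transformed vertices and the factor |D| = 2 by which the area grows.

```python
import numpy as np

M = np.array([[1,  1],
              [1, -1]])          # x' = x + y, y' = x - y
vertices = np.array([[-2, -2], [0, 2], [2, -2]])   # P, Q, R

transformed = vertices @ M.T     # rows are P', Q', R'
print(transformed)               # [[-4  0] [ 2 -2] [ 0  4]]
print(abs(np.linalg.det(M)))     # 2.0 (up to rounding), the factor by which areas scale
```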

It is useful to know how the unit square gets transformed by a linear transformation. The unit square is the square with vertices at (0, 0), (1, 0), (1, 1), and (0, 1). It is easy to show that for the linear transformation given by Equations (8.5.1), the unit square gets transformed into a parallelogram with corresponding vertices at (0, 0), (a, c), (a + b, c + d), and (b, d). Since the unit square has area 1, the parallelogram to which it gets transformed has area |D| = |ad - bc|.

 

Example 3: Determine the area of the parallelogram P with vertices at (0, 0), (7, 2), (8, 6), and (1, 4).

Solution: P is the transformation of the unit square under the following linear transformation:

x' = 7x + y

y' = 2x + 4y

Since the matrix associated with this transformation has determinant 26, the area of P is 26.
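The same area computation can be confirmed numerically; here is a one-line check (Python with NumPy, added here for illustration, not part of the original lecture).

```python
import numpy as np

M = np.array([[7, 1],
              [2, 4]])           # x' = 7x + y, y' = 2x + 4y
print(np.linalg.det(M))          # 26.0 (up to rounding), the area of the parallelogram P
```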

 

There are several special types of linear transformations, including reflections, rotations, stretches, and shears. In fact, it can be shown that every linear transformation is a combination of these. Each of these is worth discussing.

 

Reflections

Lecture on Reflections in Two Dimensions

 

Linear transformations of R2 include reflections about any line passing through the origin, but we will only consider four of these, namely reflections about the x-axis, the y-axis, and the lines y = ±x. First consider reflection about the x-axis. Such a transformation leaves x invariant and negates y, so it has the form x' = x; y' = -y. The matrix associated with reflection about the x-axis is thus

\[ \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix} \]

Reflection about the y-axis leaves y invariant and negates x, i.e. we have x' = -x; y' = y. The matrix associated with reflection about the y-axis is thus

\[ \begin{pmatrix} -1 & 0 \\ 0 & 1 \end{pmatrix} \]

Reflection about the line y = x swaps x and y, i.e. we have x' = y; y' = x. The matrix associated with reflection about this line is thus

\[ \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} \]

Finally, reflection about the line y = -x sends x to -y and y to -x, i.e. we have x' = -y; y' = -x. The matrix associated with reflection about this line is thus

\[ \begin{pmatrix} 0 & -1 \\ -1 & 0 \end{pmatrix} \]

It is easy to see that each of these four reflections has determinant -1. In fact, every reflection has determinant -1.
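The sketch below (Python with NumPy, added here for illustration) builds the four reflection matrices and confirms that each has determinant -1.

```python
import numpy as np

reflections = {
    "x-axis": np.array([[ 1,  0], [ 0, -1]]),
    "y-axis": np.array([[-1,  0], [ 0,  1]]),
    "y = x":  np.array([[ 0,  1], [ 1,  0]]),
    "y = -x": np.array([[ 0, -1], [-1,  0]]),
}
for name, R in reflections.items():
    print(name, np.linalg.det(R))   # each determinant is -1
```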

 

Rotations

Lecture on Rotations in Two Dimensions

 

A counterclockwise rotation in the xy-plane by the angle θ brings about the following coordinate transformation:

x' = x cos θ - y sin θ

y' = x sin θ + y cos θ

Note that this is a linear transformation with the following transformation matrix:

\[ R[\theta] = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix} \]

It is easy to see that the inverse of this transformation is given by

\[ R[\theta]^{-1} = \begin{pmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{pmatrix} \]

which represents a clockwise rotation by the angle θ. It is also easy to see that every rotation matrix has determinant 1.
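A minimal numerical check (Python with NumPy, added here for illustration) that the inverse of a rotation matrix is the rotation by -θ and that its determinant is 1.

```python
import numpy as np

def rotation(theta):
    """Counterclockwise rotation matrix R[theta]."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

theta = 0.7
R = rotation(theta)
print(np.allclose(np.linalg.inv(R), rotation(-theta)))  # True: R[theta]^(-1) = R[-theta]
print(np.linalg.det(R))                                 # 1.0 (up to rounding)
```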

Perhaps the most beautiful property of rotations is that the product of a rotation by an angle θ and a rotation by an angle φ is equal to a rotation by an angle θ + φ, as the following calculation shows:

\[ R[\varphi]\,R[\theta] = \begin{pmatrix} \cos\varphi & -\sin\varphi \\ \sin\varphi & \cos\varphi \end{pmatrix} \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix} = \begin{pmatrix} \cos(\theta+\varphi) & -\sin(\theta+\varphi) \\ \sin(\theta+\varphi) & \cos(\theta+\varphi) \end{pmatrix} = R[\theta+\varphi] \]

This is to be expected, since the product R[φ] R[θ] represents a rotation by angle θ followed by a rotation by angle φ, which of course is equivalent to a rotation by angle θ + φ. (Note that in a product of transformation matrices, the rightmost matrix is applied first.) Another remarkable feature of rotations in R² is that they commute, i.e. R[φ] R[θ] = R[θ] R[φ].

Another nice related feature of rotations, which we will not prove, is the following:

  • (8.5.3) R[θ]ⁿ = R[nθ]

for every positive integer n. What this means is that n rotations by angle θ are equivalent to a single rotation by angle nθ. Note the similarity between Equation (8.5.3) and De Moivre's Formula: Equation (6.6.2).
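The addition, commutativity, and power properties are easy to verify numerically as well; the sketch below (Python with NumPy, added here for illustration) redefines the same rotation helper so that it stands alone.

```python
import numpy as np

def rotation(theta):
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

theta, phi, n = 0.4, 1.1, 5
print(np.allclose(rotation(phi) @ rotation(theta), rotation(theta + phi)))            # True: addition property
print(np.allclose(rotation(theta) @ rotation(phi), rotation(phi) @ rotation(theta)))  # True: rotations commute
print(np.allclose(np.linalg.matrix_power(rotation(theta), n), rotation(n * theta)))   # True: Equation (8.5.3)
```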

 

Rotations and reflections together comprise the set of linear transformations known as rigid transformations. These are transformations which preserve distances and angles. All non-rigid transformations may be thought of as stretching, squishing, or bending parts or all of the plane. A nice feature of rigid transformations is that they preserve the dot product. We leave the proof of this fact as an exercise.
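As a hint toward that exercise, one can at least check numerically that a particular rotation preserves the dot product; a minimal sketch (Python with NumPy, added here for illustration, using an arbitrary angle and arbitrary vectors):

```python
import numpy as np

R = np.array([[np.cos(0.9), -np.sin(0.9)],
              [np.sin(0.9),  np.cos(0.9)]])
u, v = np.array([2.0, -1.0]), np.array([0.5, 3.0])
print(np.dot(u, v), np.dot(R @ u, R @ v))   # the two dot products agree (up to rounding)
```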

 

Stretches

Lecture on Stretches in Two Dimensions

 

A stretch in the xy-plane is a linear transformation which enlarges all distances in a particular direction by a constant factor but does not affect distances in the perpendicular direction. We only consider stretches along the x-axis and y-axis. A stretch along the x-axis has the form x' = kx; y' = y for some positive constant k. (If k is greater than 1, this really is a stretch; if k is less than 1, it is really a compression along the x-axis, though we still call it a stretch; and if k = 1, the transformation is simply the identity, i.e. it has no effect.) The matrix associated with a stretch by a factor k along the x-axis is given by

\[ T_x[k] = \begin{pmatrix} k & 0 \\ 0 & 1 \end{pmatrix} \]

Similarly, a stretch by a factor k along the y-axis has the form x' = x; y' = ky, so the matrix associated with this transformation is

\[ T_y[k] = \begin{pmatrix} 1 & 0 \\ 0 & k \end{pmatrix} \]

It is easy to see that two successive stretches along the x-axis or the y-axis, the first by a factor k₁ and the second by a factor k₂, are equivalent to a single stretch along that axis by a factor k₁k₂. In other words, we have

  • (8.5.4a) Tx[k₂] Tx[k₁] = Tx[k₁k₂]
  • (8.5.4b) Ty[k₂] Ty[k₁] = Ty[k₁k₂]

It is also easy to see that a stretch by a factor k along one axis followed by a stretch by a factor k along the other axis (order does not matter) is equivalent to an enlargement (or shrinking if k is less than 1) of the entire xy-plane by a factor k. Finally, we see that both Tx[k] and Ty[k] have determinant k.
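A short sketch (Python with NumPy, added here for illustration) of Equation (8.5.4a) and of the determinant of a stretch, using arbitrary sample factors.

```python
import numpy as np

def stretch_x(k):
    return np.array([[k, 0.0], [0.0, 1.0]])

k1, k2 = 2.0, 1.5
print(np.allclose(stretch_x(k2) @ stretch_x(k1), stretch_x(k1 * k2)))  # True: Equation (8.5.4a)
print(np.linalg.det(stretch_x(k1)))                                     # the determinant is k1 (here 2.0)
```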

 

Shears

Lecture on Shears in Two Dimensions

 

The last special type of linear transformation we will discuss is the shear. Like stretches, shears may be performed in any direction, but we will only consider shears along the coordinate axes. A shear along the x-axis has the form

x' = x + ky

y' = y

for some constant k. The matrix associated with this shear is

\[ S_x[k] = \begin{pmatrix} 1 & k \\ 0 & 1 \end{pmatrix} \]

Similarly, a shear along the y-axis has the form

x' = x

y' = kx + y

for some constant k. The matrix associated with this shear is

\[ S_y[k] = \begin{pmatrix} 1 & 0 \\ k & 1 \end{pmatrix} \]

It is easy to see that all shears along the x or y axis have determinant 1. In fact, all shears have determinant 1. It is also easy to show that shears have the following multiplicative properties:

  • (8.5.5a) Sx[k₂] Sx[k₁] = Sx[k₁ + k₂]
  • (8.5.5b) Sy[k₂] Sy[k₁] = Sy[k₁ + k₂]
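Finally, a sketch (Python with NumPy, added here for illustration) of Equation (8.5.5a) and of the fact that shears have determinant 1, again with arbitrary sample factors.

```python
import numpy as np

def shear_x(k):
    return np.array([[1.0, k], [0.0, 1.0]])

k1, k2 = 0.8, -2.5
print(np.allclose(shear_x(k2) @ shear_x(k1), shear_x(k1 + k2)))  # True: Equation (8.5.5a)
print(np.linalg.det(shear_x(k1)))                                 # 1.0
```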

 
