1. Definitions: know the definitions of all these terms:
rank
dimension
null space
kernel
range
inhomogeneous
homogeneous
eigenvalue
eigenvector
basis
kth basis vector
nullity
Gaussian elimination
elementary row operation
characteristic equation
minimal equation
permutation
even permutation
2. Skills: be prepared to perform any of these activities:
recognize singular system of equations
perform Gaussian elimination to solve equations (a short sketch follows this list)
perform Gaussian elimination to find an inverse
perform Gaussian elimination to find the determinant
obtain general solution to equations without unique solutions
determine rank of a matrix
determine range and null space of a matrix
find the characteristic equation of a matrix
find a corresponding eigenvector of a matrix given an eigenvalue
solve for an unknown in a system of equations by using Cramer's Rule
find sign of a permutation term in the determinant
evaluate a 4 by 4 determinant by expanding on a row or column
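Most of the elimination skills above come down to the same row-reduction loop. Here is
a minimal sketch in Python with NumPy (not part of the original review sheet; the matrix
A and vector b are made-up examples) that solves A x = b by Gaussian elimination with
partial pivoting and reads off the determinant from the pivots and row swaps:

    import numpy as np

    def gauss_solve(A, b):
        """Solve A x = b by Gaussian elimination with partial pivoting.
        Also returns det(A), built up from the pivots and row swaps."""
        A = A.astype(float)
        b = b.astype(float)
        n = len(b)
        det = 1.0
        for k in range(n):
            p = k + np.argmax(np.abs(A[k:, k]))   # largest pivot in column k
            if abs(A[p, k]) < 1e-12:
                raise ValueError("matrix is singular: no unique solution")
            if p != k:
                A[[k, p]] = A[[p, k]]             # swap rows k and p
                b[[k, p]] = b[[p, k]]
                det = -det                        # a row swap flips the determinant's sign
            det *= A[k, k]
            for i in range(k + 1, n):             # eliminate entries below the pivot
                m = A[i, k] / A[k, k]
                A[i, k:] -= m * A[k, k:]
                b[i] -= m * b[k]
        x = np.zeros(n)
        for i in range(n - 1, -1, -1):            # back substitution
            x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
        return x, det

    A = np.array([[2, 1, 1], [4, 3, 3], [8, 7, 9]])
    b = np.array([1, 2, 5])
    print(gauss_solve(A, b))   # x = (0.5, -0.5, 0.5), det = 4; agrees with np.linalg.solve and np.linalg.det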
3. Factoids: identify the correct statements below and give counterexamples for, or correct, the incorrect ones:
If a matrix is singular, it has an eigenvector corresponding to eigenvalue 0.
The range of a matrix is the set of all linear combinations of its columns.
The null space and the kernel of a symmetric matrix are the same.
There is at least one eigenvector corresponding to every eigenvalue of a matrix.
If an eigenvalue has multiplicity two in the characteristic equation of M, then M has two linearly independent eigenvectors corresponding to it.
Suppose M is the two by two matrix with first row (0 1) and second row (1 0); then e^(xM) is the matrix with first row (cosh x, sinh x) and second row (sinh x, cosh x). (A numerical check follows this list.)
The dot product v·w of two column vectors, v and w, is the same as the matrix product of the transpose of v with w: v^T w.
Every real symmetric matrix has real eigenvalues and is diagonalizable by a real transformation to an orthonormal basis.
The highest power of the variable in the characteristic equation of an n by n matrix M has coefficient 1 or -1. The coefficient of the next lower power is plus or minus the sum of the diagonal elements of M.
Since the determinant of a matrix representing a transformation is independent of the basis, so is the characteristic polynomial, and hence so are the eigenvalues.
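A quick numerical check of the matrix exponential factoid above (a sketch; it uses SciPy's
expm, and the test value x = 0.7 is arbitrary):

    import numpy as np
    from scipy.linalg import expm

    x = 0.7                                    # arbitrary test value
    M = np.array([[0.0, 1.0], [1.0, 0.0]])
    lhs = expm(x * M)                          # e^(xM) computed numerically
    rhs = np.array([[np.cosh(x), np.sinh(x)],
                    [np.sinh(x), np.cosh(x)]])
    print(np.allclose(lhs, rhs))               # True: M*M = I, so the series splits into cosh and sinh terms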
4. Some conceptual questions:
Suppose we have two vectors in a 4 dimensional vector space; how can we find the area of the parallelogram their sides determine? For example, suppose the vectors are (1,0,2,2) and (1,2,2,0).
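One way to check an answer numerically (a sketch, not the only approach): the squared area
of the parallelogram is the determinant of the 2 by 2 Gram matrix of dot products of the
two vectors.

    import numpy as np

    v = np.array([1.0, 0.0, 2.0, 2.0])
    w = np.array([1.0, 2.0, 2.0, 0.0])
    # Gram matrix of dot products; its determinant is the squared area
    G = np.array([[v @ v, v @ w],
                  [w @ v, w @ w]])
    print(np.sqrt(np.linalg.det(G)))   # sqrt(9*9 - 5*5) = sqrt(56)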
A determinant can be expressed as a sum over a single row of its elements,
each multiplied by a certain cofactor.
If we pick any two rows, it can be written as the sum, over every pair of
columns, of the subdeterminant of the matrix in the given rows and those
columns multiplied by a similar kind of cofactor. What is the cofactor?
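The single-row expansion just stated is easy to write out in code; the sketch below (with
a made-up 4 by 4 matrix) does only the row version, not the two-row generalization the
question asks about.

    import numpy as np

    def det_by_row_expansion(A, row=0):
        """Determinant by cofactor expansion along one row, applied recursively."""
        n = A.shape[0]
        if n == 1:
            return A[0, 0]
        total = 0.0
        for j in range(n):
            # minor: delete the chosen row and column j
            minor = np.delete(np.delete(A, row, axis=0), j, axis=1)
            cofactor = (-1) ** (row + j) * det_by_row_expansion(minor)
            total += A[row, j] * cofactor
        return total

    A = np.array([[1., 2., 0., 3.],
                  [4., 1., 1., 0.],
                  [0., 2., 3., 1.],
                  [2., 0., 1., 1.]])
    print(det_by_row_expansion(A), np.linalg.det(A))   # the two values agree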
We can define row eigenvectors exactly like column eigenvectors.
Given a matrix A, the matrix product of a row eigenvector r corresponding
to one eigenvalue and a column eigenvector v corresponding to any other
eigenvalue is zero.
Prove this by evaluating the product r A v in two different ways.
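A numerical illustration of this statement, as a sketch (the matrix A is an arbitrary
non-symmetric example with distinct eigenvalues 2, 3, 5; row eigenvectors are taken as
eigenvectors of the transpose):

    import numpy as np

    A = np.array([[2., 1., 0.],
                  [0., 3., 1.],
                  [0., 0., 5.]])
    vals, V = np.linalg.eig(A)       # columns of V are column eigenvectors: A v = lambda v
    valsT, U = np.linalg.eig(A.T)    # if A^T u = mu u, then u^T A = mu u^T, so u^T is a row eigenvector

    r = U[:, 0]                      # row eigenvector with eigenvalue mu = valsT[0]
    j = int(np.argmax(np.abs(vals - valsT[0])))   # column eigenvector with a different eigenvalue
    v = V[:, j]

    # Evaluating r A v one way gives mu*(r.v); the other way gives vals[j]*(r.v).
    # Since mu != vals[j], the dot product r.v must be zero.
    print(r @ A @ v, valsT[0] * (r @ v), vals[j] * (r @ v))
    print(np.isclose(r @ v, 0.0))    # True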
Two matrices, A and B, commute if their product in either
order is the same: AB = BA.
Prove that if A and B do not commute, then there can be no basis
consisting of simultaneous eigenvectors of both.
You can define a composition operation on permutations. Given two permutations
on n elements, p and q, you can perform p and then perform q on the result.
Thus, starting with 123, you can interchange 2 and 3, getting 132, and then
interchange 1 and 3, getting 312. We can write this as 312 = 321*132. Show
that this composition obeys the associative law, a(bc) = (ab)c; that there is
an identity element; that each element has an inverse; and that the product of
any two permutations is a permutation. These conditions mean that the
permutations on n elements, for any n, form a group.
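A sketch of this composition in code, using one-line notation so that the tuple (3, 1, 2)
is the permutation written 312 above; the helper names compose and inverse are made up
for illustration.

    def compose(q, p):
        """Return q*p: perform p first, then q (both in one-line notation)."""
        return tuple(q[p[i] - 1] for i in range(len(p)))

    def inverse(p):
        inv = [0] * len(p)
        for position, value in enumerate(p, start=1):
            inv[value - 1] = position
        return tuple(inv)

    swap23 = (1, 3, 2)       # written 132: interchange 2 and 3
    swap13 = (3, 2, 1)       # written 321: interchange 1 and 3
    identity = (1, 2, 3)
    print(compose(swap13, swap23))                        # (3, 1, 2), i.e. 312 = 321*132
    print(compose(swap13, inverse(swap13)) == identity)   # True
    # spot-check of associativity on one triple (the proof needs a general argument)
    a, b, c = swap13, swap23, (2, 3, 1)
    print(compose(a, compose(b, c)) == compose(compose(a, b), c))   # True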
Show that even permutations also form a group (by showing that the product
of any two is another), where even permutations are those that have an even
number of even length cycles.
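That parity definition can be checked mechanically; the sketch below (helper names made up)
counts the even-length cycles of a permutation given in one-line notation.

    def cycle_lengths(p):
        """Cycle lengths of a permutation given in one-line notation (1-based values)."""
        seen, lengths = set(), []
        for start in range(1, len(p) + 1):
            if start in seen:
                continue
            length, i = 0, start
            while i not in seen:
                seen.add(i)
                i = p[i - 1]
                length += 1
            lengths.append(length)
        return lengths

    def is_even(p):
        # even permutation: an even number of even-length cycles
        return sum(1 for c in cycle_lengths(p) if c % 2 == 0) % 2 == 0

    print(is_even((1, 3, 2)))   # False: a single 2-cycle is an odd permutation
    print(is_even((2, 3, 1)))   # True: a single 3-cycle is even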
What is the smallest n for which two permutations fail to commute? Find
two permutations for that n that fail to commute.
The simplex algorithm for linear programming starts from a form of its equations
such that the origin in the basis variables is feasible, in that it obeys
all the constraints that the variables must be nonnegative. Suppose you start
instead with a form of the equations in which the origin in the basis variables
is not feasible.
Introduce one new basic variable, with constraints that it is at least 0 and
at most 1, and add appropriate terms to the other equations so that:
1. The origin in the old basis variables plus the new one is feasible.
2. When the new variable is 1, the old equations are exactly as they were
before you added the new variable.
3. Make the new variable your objective function.
Show that solving the new linear program either produces a basis for the
old for which the origin is feasible, or provides a proof that the old linear
program had no feasible points at all.
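A sketch of this phase-one idea, written in inequality form rather than the equation and
basis-variable form described above, and solved with SciPy's general-purpose linprog
instead of carrying out the simplex steps by hand; the small LP here is a made-up example,
not the one the exercise refers to.

    import numpy as np
    from scipy.optimize import linprog

    # Original problem: find x >= 0 with A x <= b. The entry b[0] is negative,
    # so the origin x = 0 is not feasible.
    A = np.array([[-1.0, -1.0],
                  [ 1.0,  2.0]])
    b = np.array([-1.0, 4.0])

    # New variable t with 0 <= t <= 1, and constraints A x - b t <= 0:
    # at x = 0, t = 0 everything holds, and at t = 1 the old constraints reappear.
    # The objective is to make t as large as possible.
    n = A.shape[1]
    A_aux = np.hstack([A, -b.reshape(-1, 1)])    # columns for x, then t
    c = np.zeros(n + 1)
    c[-1] = -1.0                                 # linprog minimizes, so minimize -t
    bounds = [(0, None)] * n + [(0, 1)]
    res = linprog(c, A_ub=A_aux, b_ub=np.zeros(len(b)), bounds=bounds)

    if res.x[-1] > 1 - 1e-9:
        print("original LP is feasible; one feasible point:", res.x[:n])
    else:
        print("t cannot reach 1, so the original LP has no feasible points")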
Introduce a variable like this explicitly for the following linear
program: