Summer 2011: Question 3

Suppose that n is a positive integer. Pick n intervals I_1,I_2,\dots,I_n on the real number line. Assume that these intervals are pairwise disjoint. Pick n real numbers a_1,a_2,\dots,a_n.

  1. Using ideas of linear algebra, prove that there is a polynomial function p(x) of degree at most n-1, with real number coefficients, so that  \int_{I_j}p(x)\,dx=a_j for each j=1,2,\dots,n.
  2. How many such polynomial functions are there? Justify your answer.
[HINT: Make a linear map. If the integral of a continuous function on an interval vanishes, then the function vanishes somewhere on the interval].


  1. [Vahid] Let p(x)=\sum_{k=0}^{n-1}e_kx^k and I_i=(c_i,d_i), and require \int_{c_i}^{d_i}p(x)\,dx=a_i. That is, for each i=1,2,\dots,n,

\int_{c_i}^{d_i}\sum_{k=0}^{n-1}e_kx^k\,dx=\sum_{k=0}^{n-1}\frac{d_i^{k+1}-c_i^{k+1}}{k+1}\,e_k=a_i.

This corresponds to n equations in n unknowns e_0,e_1,\dots,e_{n-1}:

\frac{d_i^n-c_i^n}{n}\,e_{n-1}+\frac{d_i^{n-1}-c_i^{n-1}}{n-1}\,e_{n-2}+\cdots+(d_i-c_i)\,e_0=a_i,\qquad i=1,2,\dots,n.

This is equivalent to the matrix equation MX=A, where X is the vector (e_{n-1}\, e_{n-2}\,\dots\,e_0)^T, A is the vector (a_1\,,\,a_2\,,\,\dots\,,\,a_n)^T, and M is the matrix

M=\begin{pmatrix}\frac{d_1^n-c_1^n}{n} & \frac{d_1^{n-1}-c_1^{n-1}}{n-1} & \cdots & d_1-c_1\\[2pt] \frac{d_2^n-c_2^n}{n} & \frac{d_2^{n-1}-c_2^{n-1}}{n-1} & \cdots & d_2-c_2\\ \vdots & \vdots & & \vdots\\ \frac{d_n^n-c_n^n}{n} & \frac{d_n^{n-1}-c_n^{n-1}}{n-1} & \cdots & d_n-c_n\end{pmatrix}.

A solution to this matrix equation gives a polynomial which satisfies the given condition.

  2. A solution always exists, and it is unique: there is exactly one such polynomial. It suffices to show that M is invertible, i.e. that MX=0 forces X=0. If MX=0, then the polynomial p(x)=\sum_{k=0}^{n-1}e_kx^k satisfies \int_{I_j}p(x)\,dx=0 for every j. By the hint (the integral of a continuous function vanishing on an interval forces the function to vanish somewhere on it), p has a root in each I_j. Since the intervals are pairwise disjoint, this gives n distinct roots of p. But a nonzero polynomial of degree at most n-1 has at most n-1 roots, so p=0 and hence X=0. Therefore M is invertible, and MX=A has exactly one solution for every choice of a_1,\dots,a_n.
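As a sanity check (not part of the exam solution), here is a short numerical sketch: it builds the matrix M above for a hypothetical example with n = 3, solves MX = A, and verifies that the resulting polynomial has the prescribed integrals. The intervals and targets are made up for illustration.

```python
import numpy as np

# Hypothetical example: n = 3, pairwise disjoint intervals, arbitrary targets.
n = 3
intervals = [(0.0, 1.0), (2.0, 3.0), (5.0, 6.0)]
a = np.array([1.0, -2.0, 4.0])

# Row i of M holds the coefficients of e_{n-1}, ..., e_0:
# (d^j - c^j)/j for j = n, n-1, ..., 1.
M = np.array([[(d**j - c**j) / j for j in range(n, 0, -1)]
              for (c, d) in intervals])

# X = (e_{n-1}, ..., e_0), highest-degree coefficient first.
X = np.linalg.solve(M, a)

def integral(coeffs, c, d):
    """Integrate the polynomial with the given coefficients over [c, d]."""
    antideriv = np.polyint(np.poly1d(coeffs))
    return antideriv(d) - antideriv(c)

# Each integral should reproduce the corresponding target a_i.
for (c, d), target in zip(intervals, a):
    assert abs(integral(X, c, d) - target) < 1e-9
print("all integrals match")
```

Any other choice of disjoint intervals gives an invertible M by the uniqueness argument above, so `np.linalg.solve` will succeed for any targets.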

Summer 2010: Question 1

Suppose that T:V\rightarrow V is a linear map. Suppose that v_1,v_2,\dots,v_p are eigenvectors of T with eigenvalues \lambda_1,\lambda_2,\dots,\lambda_p, and that these eigenvalues are all distinct. Prove that the eigenvectors v_1,v_2,\dots,v_p are linearly independent.


We use induction on the number of eigenvectors. Let P(m) be the proposition that the eigenvectors v_1,v_2,\dots,v_m are linearly independent.

P(1) is true because the set \{v_1\} is linearly independent if v_1\neq 0 — and eigenvectors are non-zero by definition.

Assume P(k). That is, assume

\sum_{i=1}^ka_iv_i=\mathbf{0} \Rightarrow a_i=0  for i=1,\dots,k.

Consider P(k+1): suppose \sum_{i=1}^{k+1} a_iv_i=\mathbf{0}. Apply T-\lambda_{k+1}I to both sides. Since (T-\lambda_{k+1}I)v_i=(\lambda_i-\lambda_{k+1})v_i, and the i=k+1 term is annihilated, this gives

\sum_{i=1}^{k}a_i(\lambda_i-\lambda_{k+1})v_i=\mathbf{0}.



By the inductive hypothesis (P(k)) we have that a_i(\lambda_i-\lambda_{k+1})=0 for each i=1,\dots,k. Since the eigenvalues are distinct, \lambda_i\neq\lambda_{k+1}, so a_i=0 for i=1,\dots,k. Therefore the original relation reduces to

\sum_{i=1}^{k+1} a_iv_i=a_{k+1}v_{k+1}=\mathbf{0}. Since v_{k+1}\neq\mathbf{0}, this forces a_{k+1}=0, so P(k+1) holds.


Hence by induction, the eigenvectors v_1,\dots,v_p are linearly independent \bullet
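As an illustration (not part of the proof), here is a quick numerical sketch with a hypothetical 3×3 matrix whose eigenvalues 1, 2, 3 are distinct: its eigenvectors come out linearly independent, as the theorem guarantees.

```python
import numpy as np

# Build a matrix with distinct eigenvalues 1, 2, 3 by conjugating a diagonal
# matrix with an invertible S, so the eigenvectors are not the standard basis.
D = np.diag([1.0, 2.0, 3.0])
S = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])   # det(S) = 2, so S is invertible
A = S @ D @ np.linalg.inv(S)

# Columns of eigvecs are eigenvectors of A.
eigvals, eigvecs = np.linalg.eig(A)

# Distinct eigenvalues => the eigenvector matrix has full rank,
# i.e. the eigenvectors are linearly independent.
assert np.linalg.matrix_rank(eigvecs) == 3
print("eigenvectors are linearly independent")
```

If two eigenvalues coincided, independence of a full set of eigenvectors could fail (e.g. a Jordan block has too few eigenvectors), which is exactly where the proof uses \lambda_i\neq\lambda_{k+1}.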