
This strategy is by no means optimal or exhaustive. It is intended for students who are struggling with basic integration and antidifferentiation, and who need something to help them start calculating straightforward integrals and finding antiderivatives.

TL;DR: The strategy I present for antidifferentiating a function $f$ is as follows:

1. Direct
2. Manipulation
3. $u$-Substitution
4. Parts
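As a quick illustration of how this ladder is climbed (my own example, not taken from a course sheet), consider $\displaystyle\int xe^{x^2}\,dx$. It is not a direct antiderivative, and no algebraic manipulation simplifies it, but the substitution $u=x^2$, $du=2x\,dx$ gives

$\displaystyle \int xe^{x^2}\,dx=\frac{1}{2}\int e^u\,du=\frac{1}{2}e^{x^2}+C$,

so we never need to reach for parts.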

In this short note we will explain why we multiply matrices in this “rows-by-columns” fashion. This note will only look at $2\times 2$ matrices, but it should be clear how the same idea generalises to matrices of arbitrary size.

First of all we need some objects. Consider the plane $\Pi$. By fixing an origin, an orientation (the $x$- and $y$-directions), and a scale, each point $P\in\Pi$ can be associated with an ordered pair $(a,b)$, where $a$ is the distance along the $x$-axis and $b$ is the distance along the $y$-axis. For the purposes of linear algebra we denote this point $P=(a,b)$ by

$\displaystyle P=\left(\begin{array}{c}a\\ b\end{array}\right)$.

We have two basic operations on points in the plane: we can add them together and we can scalar-multiply them. If $Q=(c,d)$ and $\lambda\in\mathbb{R}$, these are given by

$\displaystyle P+Q=\left(\begin{array}{c}a\\ b\end{array}\right)+\left(\begin{array}{c}c\\ d\end{array}\right)=\left(\begin{array}{c}a+c\\ b+d\end{array}\right)$

and

$\displaystyle \lambda\cdot P=\lambda\cdot\left(\begin{array}{c}a\\ b\end{array}\right)=\left(\begin{array}{c}\lambda\cdot a\\ \lambda\cdot b\end{array}\right)$.
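The two operations can be sketched in a few lines of Python (an illustrative sketch, not part of the note; the names `add` and `scale` are my own):

```python
# Points in the plane represented as 2-tuples (a, b).

def add(P, Q):
    """Component-wise addition: (a, b) + (c, d) = (a + c, b + d)."""
    return (P[0] + Q[0], P[1] + Q[1])

def scale(lam, P):
    """Scalar multiplication: lam . (a, b) = (lam * a, lam * b)."""
    return (lam * P[0], lam * P[1])

P, Q = (1, 2), (3, 4)
print(add(P, Q))    # (4, 6)
print(scale(2, P))  # (2, 4)
```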

Objects in mathematics that can be added together and scalar-multiplied are said to be vectors. Sets of vectors are known as vector spaces, and a feature of vector spaces is that all vectors can be written in a unique way as a sum of basic vectors.
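For example, in the plane every vector can be written in exactly one way in terms of the basic vectors $\left(\begin{array}{c}1\\ 0\end{array}\right)$ and $\left(\begin{array}{c}0\\ 1\end{array}\right)$:

$\displaystyle \left(\begin{array}{c}a\\ b\end{array}\right)=a\cdot\left(\begin{array}{c}1\\ 0\end{array}\right)+b\cdot\left(\begin{array}{c}0\\ 1\end{array}\right)$.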