So I'm near the end of this block, starting to pick up speed a little; I just need to put the finishing touches to the TMA. The unit is a game of two halves, as a football manager would say. Blocks one and two are fairly straightforward summaries of vectors, solving equations by row reduction, and a bit of coordinate geometry. The next three units then lay the foundations of vector spaces and linear algebra, covering quite a lot of ground in a short space of time: they introduce the concept of orthonormal basis vectors and show how matrices can be diagonalised. The unit also hints briefly (but not nearly enough for my tastes) that vector spaces are more than just relations between geometric vectors and can be applied to functions as well. Indeed, it was the realisation by Heisenberg and Pauli that the theory of vector spaces could be applied to quantum mechanics that led to a breakthrough in the field. I have commented on this in earlier posts.

In this post I will briefly expand on some of the hints given in M208 about the relationship between geometric vector spaces and function spaces. The simplest example to illustrate the analogy is the trigonometric functions.

If I differentiate $$\cos(\lambda x)$$ twice, where $$\lambda$$ is a scalar, then I get

$$\frac{d^2}{dx^2} \cos(\lambda x) = -\lambda^{2}\cos(\lambda x) $$

Thus $$\cos(\lambda x)$$ is an eigenfunction of the differential operator $$\frac{d^2}{dx^2}$$ with eigenvalue $$-\lambda^2$$: the situation is just as with matrices, except that instead of a matrix we now have a differential operator.
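As a quick numerical sanity check, this eigenvalue relation can be verified with a central finite difference. This is only a sketch; the value of `lam` and the sample points are arbitrary choices, not anything from M208.

```python
import math

def second_derivative(f, x, h=1e-4):
    """Central finite-difference approximation to f''(x)."""
    return (f(x + h) - 2 * f(x) + f(x - h)) / h**2

lam = 3.0
f = lambda x: math.cos(lam * x)

# d^2/dx^2 cos(lam*x) should equal -lam^2 * cos(lam*x) at every point
for x in [0.1, 0.7, 2.0]:
    assert abs(second_derivative(f, x) - (-lam**2 * f(x))) < 1e-3
```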

The trigonometric functions obey the following integrals, where $$m$$ and $$n$$ are integers with $$m \neq n$$ (the third integral in fact vanishes for all integers $$m$$ and $$n$$, as its integrand is odd):

$$\int_{-\pi}^{\pi} \cos(mx)\cos(nx)\, dx = 0 $$

$$\int_{-\pi}^{\pi} \sin(mx)\sin(nx)\, dx = 0 $$

$$\int_{-\pi}^{\pi} \cos(mx)\sin(nx)\, dx = 0 $$
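These orthogonality relations are easy to check numerically. Here is a minimal sketch using a midpoint-rule quadrature; the resolution `npts` and the choice of integers 2 and 3 are arbitrary.

```python
import math

def integrate(f, a=-math.pi, b=math.pi, npts=20000):
    """Midpoint-rule approximation to the integral of f over [a, b]."""
    h = (b - a) / npts
    return sum(f(a + (k + 0.5) * h) for k in range(npts)) * h

m, n = 2, 3  # distinct integers

# all three cross integrals should vanish
assert abs(integrate(lambda x: math.cos(m * x) * math.cos(n * x))) < 1e-6
assert abs(integrate(lambda x: math.sin(m * x) * math.sin(n * x))) < 1e-6
assert abs(integrate(lambda x: math.cos(m * x) * math.sin(n * x))) < 1e-6
```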

we can define a set of orthogonal vectors in function space involving the trigonometric functions, with the scalar product defined as

$$\langle e_{m},e_{n}\rangle = \int_{-\pi}^{\pi} e_{m}\,e_{n}\, dx $$

where $$e_{m},e_{n}$$ are functions taken from the set

$$\{1,\cos(x),\sin(x),\cos(2x),\sin(2x), \ldots, \cos(mx), \sin(mx), \ldots\}$$

To make this set orthonormal, note that

$$\int_{-\pi}^{\pi} 1 \, dx = 2\pi$$

and

$$\int_{-\pi}^{\pi}\sin^{2}(mx)\, dx = \int_{-\pi}^{\pi}\cos^{2}(mx)\, dx = \pi \quad \text{for } m > 0$$

Hence the functions

$$\frac{1}{\sqrt{2\pi}},\ \frac{\cos x}{\sqrt{\pi}}, \ldots, \frac{\cos nx}{\sqrt{\pi}},\ \frac{\sin x}{\sqrt{\pi}}, \ldots, \frac{\sin nx}{\sqrt{\pi}}$$

form an orthonormal basis set.
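The normalisation constants can be checked directly under the integral scalar product. This continues the numerical sketch: midpoint quadrature again, and the particular basis functions tested are just an arbitrary sample.

```python
import math

def inner(f, g, npts=20000):
    """Scalar product <f, g>: integral of f*g over [-pi, pi], midpoint rule."""
    h = 2 * math.pi / npts
    total = 0.0
    for k in range(npts):
        x = -math.pi + (k + 0.5) * h
        total += f(x) * g(x)
    return total * h

e0 = lambda x: 1 / math.sqrt(2 * math.pi)       # normalised constant function
c1 = lambda x: math.cos(x) / math.sqrt(math.pi)
s1 = lambda x: math.sin(x) / math.sqrt(math.pi)

for e in (e0, c1, s1):
    assert abs(inner(e, e) - 1) < 1e-6  # each has unit norm
assert abs(inner(e0, c1)) < 1e-6        # and they are mutually orthogonal
assert abs(inner(c1, s1)) < 1e-6
```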

Well, so what, you might say: all this shows is an analogy. However, the analogy is quite profound. Recall that, given a vector space and an orthonormal set of basis vectors, any function in that space can be expanded as a sum over the basis vectors. In physics this means that any complicated waveform can be reduced to a sum over the basis vectors; a square wave, for example, can be seen as a sum over its component waveforms taken from the basis set given here. Musical synthesis is essentially based on this profound yet simple idea. The technique is called Fourier decomposition; for more details see e.g.

http://en.wikipedia.org/wiki/Fourier_series
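The square-wave example above can be sketched numerically: the standard Fourier series of an odd square wave on $$[-\pi,\pi]$$ has coefficients $$4/(\pi n)$$ for odd $$n$$ and zero for even $$n$$, and the partial sums converge to the wave away from its jumps. The evaluation point and number of terms below are arbitrary choices.

```python
import math

def square(x):
    """Odd square wave: +1 on (0, pi), -1 on (-pi, 0), extended periodically."""
    return 1.0 if math.sin(x) >= 0 else -1.0

def fourier_partial_sum(x, terms=200):
    """Sum of the first `terms` non-zero Fourier modes of the square wave:
    sum over odd n of (4 / (pi * n)) * sin(n * x)."""
    return sum(4 / (math.pi * n) * math.sin(n * x)
               for n in range(1, 2 * terms, 2))

x = 1.0  # a point well away from the discontinuities at 0 and ±pi
assert abs(fourier_partial_sum(x) - square(x)) < 0.01
```

Near the jumps the partial sums overshoot (the Gibbs phenomenon), which is why the test point is chosen away from the discontinuities.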

In my next post on this topic I'll show how Gram–Schmidt orthogonalisation can be extended to polynomials.

The study of the relationship between vectors, differential operators and orthonormal basis vectors is called linear analysis. Many years ago the Open University offered a course on this topic called M201; it is a sad sign of the times that this course is no longer available. However, if you search on Amazon you might be able to pick up the textbook on which the course was based: "An Introduction to Linear Analysis" by Kreider, Kuller, Ostberg and Perkins. This book is a synthesis of the applied maths of MST209 and the vector space and analysis parts of M208, showing how pure and applied maths combine. It shows how the concept of linear vector spaces and its analogues can be used to illuminate the structure behind the solutions of both ordinary and partial differential equations. It is probably my favourite maths book at the minute and is strongly recommended.
