Take two…

Well – not much happened this fall…  OK, a lot happened, but we were all too busy to tell anyone about any of it!

The biggest lesson I learned was not to try teaching two classes while taking a full load of graduate credits. And buying a house.  And moving…  Yes, I took on a bit too much, and my posts necessarily suffered.

Winter quarter will be different!  Aside from having more time, I’ll be using my time a little more efficiently!  But – we’re here for math, so let’s discuss some of the cool things I learned in mathematics this quarter:

Stochastic Processes (that is, random processes) are cool.  Oh yes – moreover, the analysis that goes along with stochastic processes has really improved my understanding of real analysis!  The key to them both lies in measure theory (well – one of the keys…)

What’s measure theory, I hear you ask?  Well – the basic idea is to look at a particular kind of function, called a measure, that maps subsets of a space to the real numbers (or to the extended reals, where we get to treat infinity and negative infinity as numbers).  For a stochastic process, the underlying measure takes values in the interval [0,1], and it is called a probability measure.  Intuitively, length is a measure – it takes a subset of the real number line (an interval) and returns a real number (its length).  That’s fine for something nice, like an interval, or even a collection of intervals, but we want the collection of subsets to be a sigma-field, so that we can combine the subsets in the ways we normally combine sets: union, intersection, complement…  And we want the measure to extend to cover these newly formed sets – even when we take the union of countably many (that is, possibly infinitely many) sets.
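To make the “length is a measure” idea a bit more concrete, here’s a tiny numerical sketch (Python, with made-up intervals of my own choosing): finite additivity checked directly, and countable additivity approximated by partial sums of the lengths of the disjoint pieces (1/2, 1], (1/4, 1/2], (1/8, 1/4], … whose union is (0, 1].

```python
# A toy illustration of "length" as a measure on disjoint intervals.
# The intervals below are made-up examples, not anything from the course.

def length(intervals):
    """Total length of a collection of pairwise-disjoint intervals (a, b)."""
    return sum(b - a for a, b in intervals)

# Finite additivity: the measure of a disjoint union is the sum of the measures.
A = [(0.0, 1.0), (2.0, 2.5)]
B = [(3.0, 4.0)]
assert abs(length(A + B) - (length(A) + length(B))) < 1e-12

# Countable additivity, approximated by partial sums: the intervals
# (1/2, 1], (1/4, 1/2], (1/8, 1/4], ... are disjoint and their union is (0, 1],
# which has length 1.  The partial sums of the lengths approach 1.
pieces = [(2.0 ** -(n + 1), 2.0 ** -n) for n in range(40)]
print(length(pieces))   # ~1.0, approaching the measure of the union (0, 1]
```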

It turns out that a (sigma-finite) measure on a field can be extended uniquely to a measure on the generated sigma-field.  That is useful and cool, and I’m going to stop here to avoid having to recapitulate the entire quarter’s work!

My second course was Partial Differential Equations, and it was much more about the nuts and bolts of how to solve special instances of such equations than about theory.  Interestingly enough, unlike ODEs, there is no general theory for PDEs!  Thus what you mostly get are recipe solutions for special forms, solutions that cannot then be extended to cover other types of PDEs.  Sad, but true.

That is not to say that there was no theory!  There are still things that can be discussed at a higher level, and I am really looking forward to the advanced differential equations course next year!  The one big thing we looked at (in a theoretical sense) was Fourier coefficients.  Interestingly, we can look at a generalized function space as a vector space with a (countable) basis made up of orthogonal sine functions: sin(x), sin(2x), etc.  Let’s take some arbitrary function f(x) and attempt to approximate it with a (finite) series of sine functions of this form.  If we adjust the coefficients to minimize the ‘distance’ (the mean-square error), it turns out that the coefficients that minimize the error in the approximation are exactly the Fourier coefficients of the sine-series expansion of f(x).  That’s pretty neat!
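Here’s a quick numerical check of that claim – a sketch in Python/NumPy, where the target function f(x) = x(π − x) on [0, π] is just an example I picked.  Fitting the coefficients of sin(x), …, sin(Nx) by least squares gives, up to numerical error, the same numbers as the Fourier formula bₙ = (2/π) ∫ f(x) sin(nx) dx.

```python
import numpy as np

# Sketch: least-squares fit of a sine series vs. the Fourier coefficient formula.
# The target f(x) = x*(pi - x) on [0, pi] is just an illustrative choice.

N = 5                                   # number of sine terms: sin(x), ..., sin(Nx)
x = np.linspace(0, np.pi, 2001)
f = x * (np.pi - x)

# Design matrix whose columns are sin(n x); minimize ||S c - f|| in least squares.
S = np.column_stack([np.sin(n * x) for n in range(1, N + 1)])
c_lsq, *_ = np.linalg.lstsq(S, f, rcond=None)

# Fourier coefficients b_n = (2/pi) * integral_0^pi f(x) sin(n x) dx,
# approximated here by a simple Riemann sum.
dx = x[1] - x[0]
b = np.array([(2 / np.pi) * np.sum(f * np.sin(n * x)) * dx for n in range(1, N + 1)])

print(c_lsq)   # least-squares coefficients
print(b)       # Fourier coefficients -- essentially the same numbers
```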

It does turn out that, with boundary conditions, if the related ODEs (created by the method of separation – assuming u(x,t) can be written as f(x)g(t)…) have a solution, then we always get a finite or countable number of orthogonal functions in the series expansion of the solution.  As to why – well, that’s going to have to wait until next year!  A concrete instance of that pattern is sketched below.
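Here’s my own illustration of that pattern (not something lifted from the course notes): for the heat equation uₜ = k uₓₓ on [0, L] with u(0,t) = u(L,t) = 0, separation u(x,t) = f(x)g(t) forces f'' = −λf with f(0) = f(L) = 0, and the solutions are exactly the countable family sin(nπx/L).  A small Python sketch building the resulting series solution (the constants and initial condition are made-up choices):

```python
import numpy as np

# Sketch: heat equation u_t = k*u_xx on [0, L], with u(0,t) = u(L,t) = 0.
# Separation u(x,t) = f(x)g(t) gives f'' = -lambda*f, f(0) = f(L) = 0, whose
# solutions are the countable family f_n(x) = sin(n*pi*x/L), lambda_n = (n*pi/L)^2.
# Constants and the initial condition below are illustrative, not from the course.

L, k, N = 1.0, 0.1, 50
x = np.linspace(0, L, 501)
u0 = np.where(np.abs(x - L / 2) < 0.1, 1.0, 0.0)   # example initial temperature

# Coefficients of the initial condition in the sine eigenfunction basis,
# approximated by a simple Riemann sum.
dx = x[1] - x[0]
b = [(2 / L) * np.sum(u0 * np.sin(n * np.pi * x / L)) * dx for n in range(1, N + 1)]

def u(t):
    """Series solution: each mode decays like exp(-k * lambda_n * t)."""
    return sum(bn * np.exp(-k * (n * np.pi / L) ** 2 * t) * np.sin(n * np.pi * x / L)
               for n, bn in enumerate(b, start=1))

print(u(0.05).max())   # peak temperature after the heat has spread a bit
```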

That’s it for now – more to come in the new year!
