Machine Learning

This was an extremely rewarding class. I had worked on a few ML projects before, but learning the statistical foundations gave me a deeper appreciation and love for the field. I learned about the theory of generalization, regression techniques, and a variety of supervised and unsupervised learning algorithms. Here's the course website.

The class was more math-heavy than I expected, so it was initially a bit challenging to follow, but a quick review was all I needed to get back on track. I loved the mathematical assignments and the challenge of developing new theorems and algorithms from scratch.

We had a couple of rather interesting assignments. The first was to predict our professor's 5K run time based on his previous runs. We were given a dataset containing everything from the weather to the age of his youngest child. It was a lighthearted project that was a lot of fun to work on.

The second project was an open-ended assignment to explore any specific area of machine learning that we found interesting. I was inspired by a particular article on the assumptions of linear regression. Conventional least squares assumes certain properties of the noise: that it is normally distributed, homoscedastic, and free of autocorrelation.
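Under those standard assumptions, the least-squares estimate has the familiar closed form. As a minimal sketch (the data, coefficients, and noise scale below are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data satisfying the standard assumptions:
# y = X @ beta + eps, with eps i.i.d. Gaussian noise
# (constant variance, no autocorrelation).
n = 200
X = np.column_stack([np.ones(n), rng.uniform(-1.0, 1.0, n)])
true_beta = np.array([2.0, -3.0])
y = X @ true_beta + rng.normal(0.0, 0.1, n)

# Ordinary least squares: beta_hat = (X^T X)^{-1} X^T y.
# lstsq computes the same solution in a numerically stable way.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat)  # close to [2, -3]
```

When the assumptions hold, this estimator is the maximum-likelihood estimate and is unbiased with minimum variance among linear unbiased estimators.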

We wanted to derive equations for the cases where these assumptions no longer held. What would happen if the noise was heteroscedastic? Or if it followed a different distribution? It was a challenge to get these equations to the point where they were computationally easy to calculate, and we certainly ran into a few dead ends along the way, but I really enjoyed being immersed in the math. Here's our final report for the project. It originally ran twice this length, but we had to fit it into five pages.
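To give a flavor of one such generalization (not necessarily the derivation from our report): when the noise is heteroscedastic with a known per-point scale, the maximum-likelihood estimator becomes weighted least squares. A minimal sketch, with made-up data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Heteroscedastic noise: the variance grows with x, violating the
# constant-variance assumption of ordinary least squares.
n = 500
x = rng.uniform(0.1, 2.0, n)
X = np.column_stack([np.ones(n), x])
true_beta = np.array([1.0, 4.0])
sigma = 0.5 * x                       # per-point noise scale (assumed known)
y = X @ true_beta + rng.normal(0.0, sigma)

# Weighted least squares: beta_hat = (X^T W X)^{-1} X^T W y,
# with W = diag(1 / sigma_i^2). Equivalently, rescale each row
# by 1 / sigma_i so the transformed noise is homoscedastic,
# then run ordinary least squares.
Xw = X / sigma[:, None]
yw = y / sigma
beta_wls, *_ = np.linalg.lstsq(Xw, yw, rcond=None)
print(beta_wls)  # close to [1, 4]
```

The rescaling trick is why the derivation stays computationally easy in this case: it reduces the heteroscedastic problem back to a standard least-squares solve.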