Blog Post #3

Success

Throughout my college career I've always wanted to learn machine learning, and I feel I finally did here, but I've only just BEGUN to scratch the surface. There is so much to learn in a topic that is both broad and deep, spanning multiple areas of mathematics and computer science, not to mention data analysis and statistics and how to apply them to unique cases. My biggest success was finally understanding how a Convolutional Neural Network actually works, implementing one, and tweaking the hyper-parameters to improve its baseline score.

Machine learning is one of those disciplines where, when you get it, you really feel like you've accomplished something. Because it encompasses so many topics, it takes a long time to feel like you've grasped a concept, but by the time you have, you have really, really grasped it. For example, our project had two major components: first, learning how to process the audio data in a meaningful way so a computer could interpret it, and second, designing a neural network well suited to classifying that data into 10 genres. Audio analysis alone is a field of research that master's and doctoral students regularly write papers on, and there is still debate about which machine learning model is most effective for different types of audio analysis. Do you use Mel spectrograms or MFCCs? Do you even touch chroma and tonnetz features? Do you process your data as NumPy arrays, or leave it as spectrogram images for image processing? Only once you finish all of that audio analysis can you actually work on a neural network.
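To give a concrete flavor of those choices, here is a minimal sketch of the kind of feature extraction involved, using the librosa library (the file name and parameters here are illustrative placeholders, not our exact pipeline):

```python
import librosa
import numpy as np

# Load ~30 seconds of audio (the path is a placeholder).
y, sr = librosa.load("blues.00000.wav", duration=30.0)

# Option 1: a Mel spectrogram, converted to decibels -- works well
# as "image"-style input for a CNN.
mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=128)
mel_db = librosa.power_to_db(mel, ref=np.max)        # shape: (128, time_frames)

# Option 2: MFCCs -- a more compact summary of similar information.
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)   # shape: (20, time_frames)

# Optional extras: chroma and tonnetz features capture harmonic content.
chroma = librosa.feature.chroma_stft(y=y, sr=sr)     # shape: (12, time_frames)
tonnetz = librosa.feature.tonnetz(y=y, sr=sr)        # shape: (6, time_frames)

print(mel_db.shape, mfcc.shape, chroma.shape, tonnetz.shape)
```

Each of these returns a 2-D NumPy array, and whichever one you pick determines the input shape your network has to accept, which matters for what comes next.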

This is where the real work occurs, and the real pain. When I finally got a CNN working, I was thrilled. Convolutional neural networks deal in multidimensional arrays, which can cause some interesting errors because of how convolutional layers and max-pooling layers, both integral to CNNs, work: they reduce the dimensionality of your array in increments that depend on the hyper-parameters you give them. You have to tailor your layer organization to the shape of the array you feed the input layer, and learning this can be challenging (a small sketch of what I mean appears at the end of this post). There were no tutorials that taught this, or even reliable learning material we could find; it turned out that good ol' trial and error is what eventually divulged the secret.

If this project taught me one valuable lesson, it was this: tinkering is always good for the mind. Sometimes, if you don't know how to solve a problem formally (by a previously established method), you just have to think creatively and debug going in blind. Reduce the problem. Still can't figure it out? Make it simpler and reduce the problem again. It is a principle of recursion, if you think about it: find the simplest version of the problem and solve that. If you can solve that, you should be able to solve the next one, and so on and so forth. This is the value of tinkering.

I've also learned my limits. I started this degree as a married man with no kids; now, as I near graduation, I am the father of a four-month-old and a two-year-old. This makes life hard. Being married, working full time, and being a father of two seems nigh impossible at times; let's just say I'm glad to be graduating soon. This class has been a lot of fun, and I wish I had more time to work on my project, but I am so glad we are near the finish line.
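As promised, here is a minimal Keras-style sketch of the shape bookkeeping described above. The input shape and layer sizes are illustrative guesses, not our final model; the comments track how each convolution and pooling layer shrinks the array:

```python
from tensorflow import keras
from tensorflow.keras import layers

# Illustrative input: a 128-band Mel spectrogram with 1290 time frames,
# treated as a 1-channel "image". The array's shape dictates how many
# Conv/Pool layers you can stack before a dimension shrinks to zero.
model = keras.Sequential([
    layers.Input(shape=(128, 1290, 1)),
    layers.Conv2D(16, kernel_size=3, activation="relu"),  # -> (126, 1288, 16)
    layers.MaxPooling2D(pool_size=2),                     # -> (63, 644, 16)
    layers.Conv2D(32, kernel_size=3, activation="relu"),  # -> (61, 642, 32)
    layers.MaxPooling2D(pool_size=2),                     # -> (30, 321, 32)
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(10, activation="softmax"),               # one output per genre
])

# Prints the output shape of every layer -- the fastest way to debug
# the dimension mismatches described above.
model.summary()
```

Calling model.summary() before training is the cheapest form of the trial and error I mentioned: it shows exactly where a stack of layers has reduced a dimension too far, without waiting for a cryptic shape error at fit time.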
