Friday, February 24, 2017

Week 3

Hello everybody, we're back from another week of work. For the most part, I have just been expanding my knowledge of machine learning and signal processing so I can apply it later. In this post I'll give you more information on what I am learning and how I am learning it.

First of all, I mentioned briefly last week that I was auditing Dr. Berisha's class, Signals and Systems. Because I started off behind, I have spent a lot of time this week trying to catch up. I go to the class twice a week for about an hour, where we cover concepts of signal processing. The unit we are currently on is Fourier series and transforms. I'll give a short explanation of those, but, like last week, it might be a little confusing.

Fourier series and transforms are both used to take a signal in the time domain and convert it to a signal in the frequency domain. You might be asking: 'Then what's the difference between the series and the transform?' Well, the series is for periodic (repeating) signals, while the transform is for aperiodic (non-repeating) signals. This is much clearer in the math, but I don't want to bore you with long equations, even though it is very interesting to see them work; I would rather tell you how it might be used in my project. In class all the signals we look at are pretty simple; however, the concept of looking at a signal in different domains is still useful. Depending on what characteristic of speech you are looking at, it may be more useful to look at the speech with respect to time (likely useful for speaking rate) or frequency (maybe for pitch).
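To make the time-vs-frequency idea a little more concrete, here is a rough sketch in Python (my own toy example with numpy, not something from the class or the lab): build a signal out of two sine waves, then use the FFT, a computer-friendly cousin of the Fourier transform, to find those frequencies again.

```python
import numpy as np

# Time domain: 1 second of signal sampled at 1000 Hz, made of a
# 5 Hz sine wave plus a quieter 12 Hz sine wave.
fs = 1000
t = np.arange(0, 1, 1 / fs)
signal = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 12 * t)

# Frequency domain: the FFT re-expresses the same samples as a set of
# frequencies, each with a strength (magnitude).
spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)

# The two strongest frequencies are exactly the ones we put in.
peaks = freqs[np.argsort(np.abs(spectrum))[-2:]]
print(sorted(float(f) for f in peaks))  # [5.0, 12.0]
```

The same idea, applied to much messier real speech, is what would let you study something like pitch in the frequency domain.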

Aside from the class, I am still learning about machine learning and deep neural networks, though this week had a little less focus on them. I learned more independently this week, researching on my own and asking questions when I needed help. I was not able to look at more applications of neural networks in my project, however, because Prad was sick and couldn't come into the lab. Although unfortunate, this gave me an opportunity to catch up in the class and start researching probability theory. I haven't learned enough probability theory to try to tell you guys about it, but when I have I definitely will.

That is pretty much it for this week; I learned a lot, and plan to further my project with this knowledge in future weeks. But I do have one last thing to tell you about. In the lab, since the first week, there has been a Go board on the table that seemed a little out of place. On Thursday I saw it being played for the first time. Two of the students, Alan and someone I plan on meeting next week, decided to play on a whim. This actually led to a pretty interesting simulation of pieces that I really couldn't explain to you, so I'll put a picture of it and tell you what it is called: Conway's Game of Life.

[gif and picture of Conway's Game of Life]
With that final, kinda unrelated, bit of information comes the end of this post. I'll see you next week with hopefully more updates on the cough project.

Buh bye

22 comments:

  1. Looks like our physics class on Fourier transforms helped a little on this project, didn't it? Lucky you! It was really interesting to see how Fourier transforms and series could be adapted to this research, and I'm looking forward to reading more about it and probability theory in the future.

    1. Yeah, I was pretty lucky that I was able to join the class when I did. I am actually not sure if I am going to have to put Fourier analysis to use in the project or if it is just a good concept to know. I'll keep you updated.

  2. Hey Luke, although there isn't much to talk about in this post, since this was more of an update than an explanation, I'm pretty excited to read about probability theory and hopefully more on machine learning in the future!
    Also, I was wondering if you could explain how you would apply the probability theory into the cough and speech patterns.
    PS: In the 3rd paragraph in the beginning of the third sentence, "me" should be "be" ^.^

    1. Thanks Evan, good catch. As for the probability theory, it actually relates closely to the deep learning of week 2. Every time the algorithm goes back and changes the weights, it is using probability theory to determine the effect of each weight and how much they should be changed.
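      To give a feel for the 'changing the weights' step, here is a tiny sketch in plain Python with made-up numbers (my own illustration, not code from the lab; the full version in backpropagation uses calculus to get this adjustment for every weight at once):

```python
# One update for a single weight in a one-weight "network":
# prediction = weight * x, with squared error against a target.
x, target = 2.0, 10.0
weight = 3.0   # current weight (made-up starting value)
lr = 0.1       # learning rate: how big a step to take

prediction = weight * x                # 3.0 * 2.0 = 6.0
error = prediction - target            # 6.0 - 10.0 = -4.0
gradient = 2 * error * x               # slope of squared error w.r.t. the weight

weight -= lr * gradient                # step in the direction that reduces error
print(round(weight, 2))  # 4.6 -- the weight moved toward a better value
```

      Repeating that little update over and over, for every weight in the network, is basically what training is.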

  3. Hello Luke. It seems like you've had a pretty simple week of learning new information for your project. Didn't some researchers at Google develop an AI that beat some of the best Go players last year? Anyways, I look forward to reading your future posts discussing your findings from this week and hopefully more updates to the cough project.

    1. This upcoming week will actually bring a pretty good update to the cough project, but I'll leave that for Friday. The AI that beat the Go players was actually something that Ming showed me at the very beginning of the project. It is a really cool use of AI, and I think they're trying to improve upon it.

  4. Hi Luke! It must be a great opportunity to be able to audit Dr. Berisha's class. I'm just wondering, how does probability theory relate to signal processing?

    1. Probability theory doesn't relate directly to signal processing, but it is used in the neural networks from week 2. Probabilities are used to determine how much each weight affects the error and outcome.

  5. Hey Luke! It appears you've been enjoying this project and the information you are learning. You said in your post that you study deep neural networks and Fourier series/transforms more independently to get a better understanding of the subjects. What is the best way you have found to learn outside of lecture? Great work so far!

    1. I have actually been taking courses online to learn a lot of this stuff. There is a really cool course on machine learning that everyone recommended. It is all free on Coursera. This class in particular is taught by a Stanford professor, so I have learned a good amount of stuff from that.

  6. Hey Luke! I just read about what the Fourier series is, and it seems nice to see an application of it. Quick question -- what exactly is the relationship between the time and frequency domain? I expect it to be T = 1/f and vice versa, but I just want to make sure.

    1. Well, it can be a little more complicated than that. The 'T' in that formula you wrote is actually the period, so it only applies to simple periodic (repeating) signals. Even if you do have a periodic signal, it is often difficult to use that formula.

      What the Fourier series and transform actually do is replicate the signal with different sine waves, so the relation is through the frequencies of those sine waves and the time of the signal. Not sure if that makes sense, or I might have said something slightly wrong, but that is how I understand it.

  7. Hi Luke! I read your post and what you are learning about seems very intriguing. I was wondering how you are learning independently about machine learning and deep neural networks. Keep up the great work!

    1. I gave a little more detail to Adi in his comment above, but the one thing I am doing the most is going through a Coursera machine learning course.

  8. Hey there Luke, I am still a little confused about the Fourier series. How does it take a signal in the time domain and convert it to a signal in the frequency domain?

    1. It is a little confusing to explain without drawing a picture for you (that's why I avoided it in the post), but basically it adds up a bunch of sine waves at different frequencies to try to replicate the original signal. Then it uses those sine waves' frequencies to represent the signal in the frequency domain. If you want a better explanation (probably with some more visualization) there are a lot of websites with pictures.
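      In case a concrete version helps, here is a small Python sketch (my own, with numpy; not from the class) that adds up the odd-harmonic sine waves of a square wave's Fourier series and checks that the sum really does sit near +1 on the first half of the period and -1 on the second:

```python
import numpy as np

# One period of the approximation, sampled at 1000 points.
t = np.linspace(0, 1, 1000, endpoint=False)

# Partial Fourier series of a square wave: odd harmonics only,
# each sine wave scaled by 4 / (pi * k).
approx = np.zeros_like(t)
for k in range(1, 20, 2):  # k = 1, 3, 5, ..., 19
    approx += (4 / (np.pi * k)) * np.sin(2 * np.pi * k * t)

# Near the middle of each half-period the sum is close to +1 and -1,
# so the stack of sine waves really is rebuilding the square wave.
print(round(float(approx[250]), 2), round(float(approx[750]), 2))
```

      Adding more harmonics makes the flat parts flatter and the edges sharper.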

  9. Hey Luke! Sounds like you've had an interesting week! What exactly do you mean when you say "repeating" and "non repeating" signals when it comes to the series and transforms? Can't wait for next week's update!

    1. When I say repeating or non-repeating I am referring to the shape of the signal. Pretend you have a simple square wave 3 seconds long starting at time zero; then, 2 seconds after the end of the square wave, another identical square wave occurs, and that just keeps happening indefinitely. That would be repeating (I encourage you to try to draw this). Non-repeating would be if it didn't keep happening indefinitely.

  10. Hi Luke! It's great to see you go more in-depth with your project. Have you been able to choose which student's project you will be following?

    1. Yeah, for the most part I will be following along with Prad's project, the cough project. I will have more updates on that in week 4.

  11. Hey Luke! It's cool that you're getting the chance to be part of the class, learning as everyone else is, and also getting time to research on your own. Also, what exactly is Conway's Game of Life and how do you play? The gif and picture made it seem very interesting!

    1. Conway's game is a simulation where, speaking in terms of the gif, you have a grid of black (live) squares and empty squares. At each step, a live square stays alive only if it has two or three live neighbors; otherwise it goes away. An empty square becomes live if it has exactly three live neighbors. The shapes of the groups keep changing based on these conditions. I may have a detail slightly off, but that is the basic idea.
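      If it helps to see those rules written out, here is a toy version in plain Python (my own sketch, not anything from the lab) that computes one step of the game on a small grid:

```python
# One step of Conway's Game of Life. Rules: a live square (1) survives
# with exactly 2 or 3 live neighbors; an empty square (0) becomes live
# with exactly 3 live neighbors; everything else is empty next step.

def step(grid):
    rows, cols = len(grid), len(grid[0])

    def live_neighbors(r, c):
        return sum(
            grid[rr][cc]
            for rr in range(max(0, r - 1), min(rows, r + 2))
            for cc in range(max(0, c - 1), min(cols, c + 2))
            if (rr, cc) != (r, c)
        )

    return [
        [1 if live_neighbors(r, c) == 3
             or (grid[r][c] == 1 and live_neighbors(r, c) == 2) else 0
         for c in range(cols)]
        for r in range(rows)
    ]

# A "blinker": three squares in a row flip between horizontal and vertical.
blinker = [[0, 0, 0],
           [1, 1, 1],
           [0, 0, 0]]
print(step(blinker))        # [[0, 1, 0], [0, 1, 0], [0, 1, 0]]
print(step(step(blinker)))  # back to the original horizontal row
```

      Running steps over and over is what produces the moving, changing patterns in the gif.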
