Introduction to Deep Learning (CMU)
30 comments · March 19, 2025
firefax
>You will need familiarity with basic calculus (differentiation, chain rule), linear algebra, and basic probability.
So if I'm at the point where math skills, rather than programming skills, are my barrier to interesting courses like this one, does anyone know of any good resources? I don't seem able to teach myself calc from a book like I did Python.
somethingsome
I know many of them:
Probability https://youtube.com/playlist?list=PLoROMvodv4rOpr_A7B9SriE_i...
Basic algebra and calculus https://tutorial.math.lamar.edu/
Real analysis https://youtube.com/playlist?list=PL0E754696F72137EC
All of those cover the basics and start from (almost) nothing.
noisy_boy
Do you have any links for someone who did maths and stats in uni and basically forgot most of it (except very basic algebra)? I would like to wake up my near-dead math brain cells.
somethingsome
At which level? And pure mathematics or engineering or other?
dimatura
I took this course the first semester it was given. There was one TA, and now there are 24! Fun fact: The TA was the writer of Aqua's 90s hit "Doctor Jones".
npalli
>> The TA was the writer of Aqua's 90s hit "Doctor Jones".
Soren Rasted?? Which year was it? Mind blown.
meccabrepapa
I'm a software engineer with one year of experience at a small company. I have recently been learning machine learning for a company project. Would you recommend this course? I want to learn the concepts systematically.
rottc0dd
There was another HN page where this topic was discussed. Please check the following comment threads.
https://news.ycombinator.com/item?id=43391604
https://news.ycombinator.com/item?id=43395172
These resources were helpful for me. Note that [1] and [2] are about systematic understanding rather than hands-on work; [3] is a hands-on exercise for building neural networks from the ground up.
1. A fantastic resource, and the best one IMO, for getting a probabilistic perspective on machine learning from the ground up:
https://www.youtube.com/watch?v=2MuDZIAzBMY&list=PLoROMvodv4...
2. Another good free course:
https://work.caltech.edu/telecourse
3. For hands-on work after getting some background, building things from the ground up:
https://www.youtube.com/watch?v=VMj-3S1tku0&list=PLAqhIrjkxb...
meccabrepapa
thanks a lot
mliker
I recommend checking out this survey of free ML resources: https://www.trybackprop.com/blog/top_ml_learning_resources
No doubt CMU's intro to deep learning course is good, but you might find some other goodies in that link too.
ascarshen
The most valuable parts are the assignments and homework. Where can I find the code, if it's available?
fxwin
Seems like they're not available to non-students, but I'm happy to be proven wrong
BinaryMachine
I agree. This is only useful for passive learning from watching the lecture videos, which is not an ideal way to soak in the material. Even if the quizzes and assignments are just instructions, I'd be happy to experiment on my own.
janalsncm
I think for someone who hasn’t seen the material at all before it would be a lot for a semester. They don’t know what backpropagation is but by the end will understand a diffusion model? It’s ambitious, I think.
The other thing is this seems to be very CNN heavy. Four lectures on the topic seems like a lot.
Also, I don't see embeddings explicitly mentioned as a topic. They're a huge component of industrial research, and creating good embeddings and retrieving them quickly is a topic I feel students should also be exposed to. (Yes, they mention "representation" with autoencoders, but quite frankly the autoencoder's code is generally not useful for similarity metrics.)
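To make the retrieval part concrete: once you have the vectors from some trained encoder, lookup is basically normalized dot products plus a top-k. A rough PyTorch sketch, with made-up sizes and random tensors standing in for real embeddings:

    import torch
    import torch.nn.functional as F

    # Pretend these came from a trained encoder; sizes are illustrative.
    num_items, dim = 10_000, 256
    corpus = F.normalize(torch.randn(num_items, dim), dim=-1)  # item embeddings
    query = F.normalize(torch.randn(1, dim), dim=-1)           # query embedding

    # With L2-normalized vectors, cosine similarity is just a matrix product.
    scores = query @ corpus.T            # shape (1, num_items)
    top5 = scores.topk(k=5, dim=-1)      # indices of the 5 most similar items
    print(top5.indices)

At industrial scale you'd swap the brute-force matmul for an approximate nearest neighbor index, but the similarity metric is the part students need to get right first.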
Finally, it would be nice to expose students to multimodal learning; something like CLIP would be pretty neat. It's a great insight when you realize that you can train projections of multiple modalities into a shared high-dimensional space. If they're going to cover diffusion models, complexity certainly isn't a concern.
berniep
Can you recommend any tutorials/resources about designing and training simple multi-modal models like CLIP? Or should I just be reading the papers and following along?
janalsncm
Well, the core idea is to train a text encoder and an image encoder jointly with in-batch negatives. In other words, a two-tower model maximizing the diagonal and minimizing everything else. They have pseudocode in the paper.
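The objective itself is only a few lines. A rough PyTorch sketch (not the actual CLIP code; the names and temperature value are just illustrative):

    import torch
    import torch.nn.functional as F

    def clip_style_loss(img_emb, txt_emb, temperature=0.07):
        # img_emb, txt_emb: (batch, dim) outputs of the image and text towers,
        # already projected into the same space.
        img_emb = F.normalize(img_emb, dim=-1)
        txt_emb = F.normalize(txt_emb, dim=-1)

        # Entry (i, j) scores image i against text j across the whole batch.
        logits = img_emb @ txt_emb.T / temperature

        # Matching pairs sit on the diagonal; everything else in each row
        # and column acts as an in-batch negative.
        targets = torch.arange(logits.size(0), device=logits.device)
        loss_img = F.cross_entropy(logits, targets)     # image -> text
        loss_txt = F.cross_entropy(logits.T, targets)   # text -> image
        return (loss_img + loss_txt) / 2

The hard parts in practice are the encoders, the data, and the batch size, not the loss.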
mliker
> They don’t know what backpropagation is but by the end will understand a diffusion model?
That seems plausible to learn in a semester-long course, especially at an institution like CMU.
diebeforei485
> I think for someone who hasn’t seen the material at all before it would be a lot for a semester. They don’t know what backpropagation is but by the end will understand a diffusion model? It’s ambitious, I think.
Welcome to CMU :)
noisy_boy
It is not clear to me whether people who are not CMU students can get their assignments/quizzes checked by the autograder.
Isamu
There’s a Giant Eagle auditorium in Baker Hall?
vivzkestrel
In person or remote? Openly available like MIT's courses, or closed? No details are mentioned whatsoever.
yamrzou
If you go to Menu > Lectures, you'll find links to the YouTube videos.
I remember taking this course at CMU; I had no knowledge of deep learning beforehand. By the end, I had trained over 75 models in the assignments, implemented a PyTorch backend, and completed a fairly large course project, which gave me the confidence to launch my career in deep learning. I cannot recommend this course enough. You need to do all the assignments to get maximum value out of it, and it can be intense, but think of it as a bootcamp for machine learning. I still find the course materials useful in interviews.