Thanks. I agree with using computational graphs. I think backpropagation is much easier to understand through graphs if you are new to the subject. The reason I didn’t do it here is mainly because there are already a lot of guides online that do that, but fewer that introduce tensors and how they interact with deep learning. Also, I’m writing these posts primarily so that I can learn, although of course I hope other people find them useful.
I also want to add that this guide is far from complete, so I’d like to read yours to see what kinds of things I might have done better. :)
For sure! To be honest, I got a little lost reading your 3-part series here, so I think I’ll revisit it later on.
I’m newer to deep learning, so I think my goals are similar to yours (e.g. writing it up so I have a better understanding of what’s going on), but I’m still hashing out the more introductory stuff.
I’ll definitely link it here after I finish!