From Tensors To Residual Learning

From the math of tensors to the reasoning and implementation of residual learning
Tech stack
What we will build
Updated July 20th, 2024
PyTorch
Dive into an exciting journey through the core of deep learning. We begin by unraveling tensors, multi-dimensional arrays of numerical values, and examine the tensor operations at the heart of parallel computing, especially on TPU hardware, which motivated deep learning compilers such as XLA. From tensors we transition smoothly into the intricacies of autograd and the calculus that powers neural networks' forward and backward passes. You'll not only understand how these mathematical concepts interlink, but also see them in action as we explore convolutional architectures and delve into the core idea of residual learning that made ResNets so impactful in deep learning.
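To give a flavor of where the course is headed, here is a minimal sketch in PyTorch tying the pieces together: a hypothetical, simplified residual block (loosely modeled on ResNet's basic block, without batch normalization) whose output is F(x) + x, so the convolutions only need to learn the residual F(x); calling `backward()` then lets autograd trace the forward pass and compute gradients.

```python
import torch
import torch.nn as nn


class ResidualBlock(nn.Module):
    """A simplified residual block: output = ReLU(F(x) + x)."""

    def __init__(self, channels):
        super().__init__()
        # Two 3x3 convolutions; padding=1 keeps the spatial size unchanged,
        # so the skip connection can be added element-wise.
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.relu = nn.ReLU()

    def forward(self, x):
        residual = self.relu(self.conv1(x))
        residual = self.conv2(residual)
        # The skip connection: the block learns F(x), not the full mapping.
        return self.relu(residual + x)


# A small input tensor: batch of 1, 16 channels, 8x8 spatial grid.
x = torch.randn(1, 16, 8, 8, requires_grad=True)
block = ResidualBlock(16)
out = block(x)

# Autograd replays the recorded forward graph to backpropagate gradients.
out.sum().backward()
```

Because the skip connection is an identity, gradients flow back to `x` through both the convolutional path and the shortcut, which is exactly the property that lets very deep networks train.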