Deep Learning for Computer Vision
Lukáš Hudec
Slovak Technical University in Bratislava
SAT 9:00 – 12:20
Have you ever wondered what goes on “behind the scenes” of Neural Networks? How do the network and its training actually work? With all these fancy DL frameworks… is it even possible to implement your own little Neural Network from scratch and train it on your precious data? … and maybe even more?
The deep learning workshop focuses on understanding the basic operations behind a neural network – linear layers, gradient computation, backpropagation, and optimization. During the first part of the workshop, you will implement (with the help of some prepared templates) a lightweight framework (inspired by PyTorch) and your own neural network in NumPy. In the second part of the workshop, we will take a look at some more advanced NN architectural structures. Neural networks do not always have to be sequential or come straight from tf.keras.applications/torchvision.models. We will look at how to design custom implementations of some of the most widely used architectural blocks – VGG, residual, dense, and inception blocks (and maybe others?) – using PyTorch.
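To give a flavour of the first part, here is a minimal sketch of the kind of layer you will build: a NumPy linear layer with a hand-written backward pass and a plain SGD update. The class and method names (`Linear`, `forward`, `backward`) and the toy training loop are illustrative assumptions, not the workshop's actual templates.

```python
import numpy as np

class Linear:
    """Fully connected layer: y = x @ W + b, with manual gradients."""
    def __init__(self, in_features, out_features, lr=0.01):
        # Small random weights; biases start at zero.
        self.W = np.random.randn(in_features, out_features) * 0.01
        self.b = np.zeros(out_features)
        self.lr = lr

    def forward(self, x):
        self.x = x                      # cache the input for the backward pass
        return x @ self.W + self.b

    def backward(self, grad_out):
        # Gradients of the loss w.r.t. parameters and input.
        grad_W = self.x.T @ grad_out
        grad_b = grad_out.sum(axis=0)
        grad_x = grad_out @ self.W.T
        # Plain SGD update.
        self.W -= self.lr * grad_W
        self.b -= self.lr * grad_b
        return grad_x                   # propagate to the previous layer


# Toy usage: one layer, mean-squared-error loss, a few SGD steps.
layer = Linear(3, 1, lr=0.1)
x = np.random.randn(8, 3)
y = x @ np.array([[1.0], [-2.0], [0.5]])    # ground-truth linear mapping
for _ in range(200):
    pred = layer.forward(x)
    grad = 2 * (pred - y) / len(x)          # dMSE/dpred
    layer.backward(grad)
print("final MSE:", float(np.mean((layer.forward(x) - y) ** 2)))
```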
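And a similarly hedged sketch for the second part: a custom residual block written directly as a PyTorch module, roughly in the spirit of ResNet's basic block. The `ResidualBlock` name and the exact layer choices (3×3 convolutions, batch norm, a 1×1 projection on the skip path) are assumptions about what such a block might look like, not necessarily what the workshop will cover.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Basic residual block: two 3x3 convolutions plus a skip connection."""
    def __init__(self, in_channels, out_channels, stride=1):
        super().__init__()
        self.conv1 = nn.Conv2d(in_channels, out_channels, 3,
                               stride=stride, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(out_channels)
        self.conv2 = nn.Conv2d(out_channels, out_channels, 3,
                               padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(out_channels)
        # 1x1 projection on the skip path when the shape changes.
        self.shortcut = nn.Identity()
        if stride != 1 or in_channels != out_channels:
            self.shortcut = nn.Sequential(
                nn.Conv2d(in_channels, out_channels, 1,
                          stride=stride, bias=False),
                nn.BatchNorm2d(out_channels),
            )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + self.shortcut(x))   # add the skip connection


# Quick shape check.
block = ResidualBlock(64, 128, stride=2)
print(block(torch.randn(1, 64, 32, 32)).shape)     # torch.Size([1, 128, 16, 16])
```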