Explore the fascinating world of Large Language Models, understand how ChatGPT works, and dive into the mathematics behind transformer architecture and attention mechanisms.
Dive deep into optimization algorithms that power neural network training. Understand the mathematics behind SGD, Momentum, RMSprop, and Adam optimizers.
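To make the Adam update concrete, here is a minimal sketch of one Adam step in NumPy; the learning rate, beta values, and the toy quadratic objective are illustrative choices, not prescriptions from the article.

```python
# A minimal sketch of the Adam update rule; hyperparameters below are
# the common defaults, and the quadratic objective is only for illustration.
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: moment estimates, bias correction, parameter step."""
    m = beta1 * m + (1 - beta1) * grad       # first moment (mean of gradients)
    v = beta2 * v + (1 - beta2) * grad**2    # second moment (uncentered variance)
    m_hat = m / (1 - beta1**t)               # bias correction for early steps
    v_hat = v / (1 - beta2**t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Minimize f(theta) = theta^2, whose gradient is 2 * theta.
theta = np.array([1.0])
m = np.zeros_like(theta)
v = np.zeros_like(theta)
for t in range(1, 2001):
    grad = 2 * theta
    theta, m, v = adam_step(theta, grad, m, v, t)
```

Because Adam normalizes the gradient by its running second moment, the effective step size stays close to `lr` regardless of the gradient's scale, which is why it tends to need less tuning than plain SGD.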
Master time series forecasting with classical statistical methods and modern deep learning approaches. Learn when to use ARIMA, LSTM, and Transformer models.
Understand how neural networks learn through backpropagation, with complete mathematical derivations and a practical implementation written from scratch in Python.
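The from-scratch idea can be sketched as follows: a one-hidden-layer network trained on XOR with nothing but NumPy, where the backward pass is the chain rule applied layer by layer. The layer sizes, learning rate, and iteration count here are illustrative assumptions, not the article's exact code.

```python
# A minimal backpropagation sketch: one hidden layer, sigmoid activations,
# trained on XOR. All hyperparameters are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros((1, 1))
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

losses = []
for _ in range(5000):
    # Forward pass: input -> hidden -> output.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))
    # Backward pass: chain rule, propagating the error from output to input.
    d_out = (out - y) * out * (1 - out)      # dLoss/dz at the output layer
    d_h = d_out @ W2.T * h * (1 - h)         # dLoss/dz at the hidden layer
    # Gradient descent step (learning rate 1.0, an illustrative choice).
    W2 -= 1.0 * h.T @ d_out; b2 -= 1.0 * d_out.sum(axis=0)
    W1 -= 1.0 * X.T @ d_h;   b1 -= 1.0 * d_h.sum(axis=0)
```

Each `d_*` term is the derivative of the loss with respect to that layer's pre-activation; multiplying by the transposed weight matrix is what carries the error signal one layer back.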