Lecture
Location: Senate Hall
Group A: 1:00-2:00 pm
Group B: 2:30-3:30 pm
Labs
Location: Main Computer Lab
Group A: Thursday 2:00-3:00 pm
Group B: Thursday 3:00-4:00 pm
Group C: Friday 2:00-3:00 pm
Group D: Friday 3:00-4:00 pm

Week 5

October 6 - October 10, 2025

Convolutional Neural Networks

Continuing our deep learning journey, this lecture picks up where last week left off with Convolutional Neural Networks (CNNs), a specialized type of neural network designed to process data with a grid-like topology, such as images. While traditional neural networks treat every pixel as an independent input, CNNs use a series of filters, or "kernels," to automatically and adaptively learn spatial hierarchies of features, from simple edges and textures to complex objects. We'll continue working through the core components—convolutional layers, pooling layers, and fully connected layers—and see how they leverage concepts like parameter sharing and sparse connectivity to make image analysis more efficient and accurate than with standard neural networks.
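
As a quick, concrete illustration of how these components stack, here is a minimal sketch in TensorFlow/Keras; the input shape and layer sizes are illustrative assumptions, not the architecture used in the lab.

    # Minimal CNN sketch: convolutional, pooling, and fully connected layers.
    # Input shape and layer sizes are illustrative assumptions only.
    import tensorflow as tf
    from tensorflow.keras import layers, models

    model = models.Sequential([
        layers.Input(shape=(28, 28, 1)),                       # e.g. a 28x28 grayscale image
        layers.Conv2D(16, kernel_size=3, activation="relu"),   # convolutional layer: shared 3x3 filters
        layers.MaxPooling2D(pool_size=2),                      # pooling layer: downsamples feature maps
        layers.Conv2D(32, kernel_size=3, activation="relu"),
        layers.MaxPooling2D(pool_size=2),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),                   # fully connected layer
        layers.Dense(10, activation="softmax"),                # e.g. a 10-class output
    ])
    model.summary()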

Slides for lecture 5 ( Click here to download... )
Assignment
Due October 10, 2025 at 5pm Juba Time
Quiz 5 - due October 10 at 5pm Juba Time ( Click here to complete )

Week 4

September 29 - October 3, 2025

Convolutional Neural Networks

Continuing our deep learning journey, this lecture will introduce Convolutional Neural Networks (CNNs), a specialized type of neural network designed to process data with a grid-like topology, such as images. While traditional neural networks treat every pixel as an independent input, CNNs use a series of filters, or "kernels," to automatically and adaptively learn spatial hierarchies of features, from simple edges and textures to complex objects. We'll explore the core components—convolutional layers, pooling layers, and fully connected layers—and see how they leverage concepts like parameter sharing and sparse connectivity to make image analysis more efficient and accurate than with standard neural networks.
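
To make the parameter-sharing idea concrete, here is a small sketch (assuming TensorFlow/Keras and a made-up 28x28 grayscale input) that compares the parameter count of a convolutional layer with a fully connected layer of the same width:

    # Parameter sharing vs. full connectivity (shapes are illustrative only).
    import tensorflow as tf
    from tensorflow.keras import layers

    conv = tf.keras.Sequential([
        layers.Input(shape=(28, 28, 1)),
        layers.Conv2D(32, kernel_size=3),   # 32 shared 3x3 filters
    ])
    dense = tf.keras.Sequential([
        layers.Input(shape=(28, 28, 1)),
        layers.Flatten(),
        layers.Dense(32),                   # every pixel connected to every unit
    ])
    print(conv.count_params())    # 32 * (3*3*1 + 1) = 320 parameters
    print(dense.count_params())   # 28*28*1 * 32 + 32 = 25,120 parameters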

Slides for lecture 4 ( Click here to download... )
Lab: Training a neural network in TensorFlow (Click here to open notebook...)
Assignment
Due October 3, 2025 at 5pm Juba Time
Quiz 4 - due October 3 at 5pm Juba Time ( Click here to complete )

Week 3

September 22 - 26, 2025

Training and Optimizing a Neural Network

This week's lecture will delve into the mechanisms behind how a neural network learns. We'll start with backpropagation, the core algorithm that allows a network to "learn" from its errors. It works by propagating the error from the output layer backward through the network, calculating how much each neuron's weights contributed to that error. This process is essentially a fancy application of the chain rule from calculus. Next, we'll connect this to gradient descent, the optimization algorithm that uses the error information from backpropagation to adjust the weights and biases of the network. Think of gradient descent as guiding the network down a hill to the lowest point of error, with backpropagation showing it which way is "down." Finally, we will discuss overfitting, a common problem where a model learns the training data too well, memorizing noise and specific examples rather than generalizing to new data. We'll explore strategies to combat overfitting and ensure our models are robust and effective.
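
To see how these three ideas fit together in code, here is a minimal sketch of one training step using TensorFlow's GradientTape; the model, data point, and learning rate are made-up values, not the course's lab code.

    # One training step: forward pass, backpropagation, gradient-descent update.
    import tensorflow as tf

    w = tf.Variable(2.0)                        # a single weight
    b = tf.Variable(0.5)                        # a single bias
    x, y_true = tf.constant(3.0), tf.constant(7.0)
    lr = 0.01                                   # learning rate (assumed value)

    with tf.GradientTape() as tape:
        y_pred = w * x + b                      # forward pass
        loss = (y_pred - y_true) ** 2           # squared error

    dw, db = tape.gradient(loss, [w, b])        # backpropagation: chain rule gives d(loss)/dw, d(loss)/db
    w.assign_sub(lr * dw)                       # gradient descent: step "downhill" against the gradient
    b.assign_sub(lr * db)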

Slides for lecture 3 ( Click here to download... )
Lab: Introduction to Neural Networks in Keras and TensorFlow (Click here to open notebook...)
Assignment
Due September 26, 2025 at 5pm Juba Time
Quiz 3 - due September 26 at 5pm Juba Time ( Click here )

Week 2

September 15 - 19, 2025

Biological and Artificial Neural Networks

This week, we're diving deep into the fascinating world of deep neural networks, the very foundation of modern AI. Our journey begins with an exploration of the biological neural networks that inspired this technology, providing essential context for understanding how these powerful systems work. We'll then break down the core components, starting with the perceptron, and progressively build up to more complex neural networks. You'll gain a solid grasp of how data flows through these networks via forward propagation and, crucially, how we measure their performance by computing the loss of a deep neural network. Get ready to build a strong theoretical foundation for the hands-on labs to come!
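
As a small preview of forward propagation and loss, here is a sketch of a single perceptron in NumPy; the inputs, weights, and label are made-up numbers chosen only to trace the computation.

    # Forward propagation through one perceptron, followed by a binary cross-entropy loss.
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    x = np.array([0.5, -1.0, 2.0])        # inputs (illustrative)
    w = np.array([0.1, 0.4, -0.2])        # weights (illustrative)
    b = 0.3                               # bias

    z = np.dot(w, x) + b                  # weighted sum
    y_pred = sigmoid(z)                   # activation: the forward-pass output
    y_true = 1.0                          # the desired label
    loss = -(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))
    print(y_pred, loss)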

Slides for lecture 2 ( Click here to download... )
Lab: Introduction to Google Colab and Python (Click here to open notebook...)
Artificial Neural Network Playground (Click here...)
Assignment
Due September 19, 2025
Quiz 2 - due September 19 ( Click here... )

Week 1

September 8 - 12, 2025

Introduction and Logistics

In the lecture portion of Week 1, we will provide a comprehensive overview of deep learning, including its fundamental concepts, applications, and the role of neural networks. We will also discuss the TensorFlow/Keras framework, which will be the primary tool used throughout the course. In the lab session, students will set up their development environment using Google Colab, experiment with a neural network, and familiarize themselves with the TensorFlow/Keras API.
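
As a first taste of the TensorFlow/Keras API, a small model can be defined and compiled in a few lines; the sketch below runs in Google Colab, and its layer sizes are placeholders rather than part of the course material.

    # Defining and compiling a small model with the Keras Sequential API.
    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(4,)),                      # 4 input features (placeholder)
        tf.keras.layers.Dense(8, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.summary()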

Course Overview and Logistics
Overview of Deep Learning and the State of the Art
Slides for lecture 1 ( Click here to download... )
Lab: Introduction to Google Colab and Python (Click here to open notebook...)
Assignment
Due September 12, 2025
Quiz 1 - due September 12 ( Click here... )
Complete assignment 1 if you have not already done so ( Click here... )
Read chapter 1 of Deep Learning ( Click here... )
Create a Gmail account if you don't already have one ( Click here... )
Create a GitHub account if you don't already have one ( Click here... )
Create a ChatGPT account if you don't already have one ( Click here... )