This page outlines the weekly schedule for lectures, labs, assignments, and examinations. The schedule will be updated regularly to align with the University of Juba's academic calendar and holiday schedule. Reading materials, lecture slides, and lab materials will be accessible through this schedule, with links provided for downloading prior to the commencement of each lecture or lab session. If you encounter any difficulties or have questions, please contact the lead Teaching Fellow, Thiong Abraham.

Week 9

Exam Prep

This lecture will review material for the exams.

Lecture
Location: Senate Hall
Date: December 2nd, 2024
Group A: 1:00-2:00 pm
Group B: 2:30-3:30 pm
Exam materials
Slides available after the lecture
Laboratory
Sign Numbers - Project Report Submission
Complete your report and submit it after reviewing with a TF.
Group A: December 3rd, 2024 from 2-3 pm
Group B: December 3rd, 2024 from 3-4 pm
Group C: December 5th, 2024 from 2-3 pm
Group D: December 5th, 2024 from 3-4 pm

Week 8

Exam Prep

This lecture will review material for the exams.

Lecture
Location: Senate Hall
Date: November 25th, 2024
Group A: 1:00-2:00 pm
Group B: 2:30-3:30 pm
Exam materials
Slides for lecture 8 - Click here to download...
Laboratory
Sign Numbers - CNN Evaluation
Notebook available during your lab session. Click here to open...
Group A: November 26th, 2024 from 2-3 pm
Group B: November 26th, 2024 from 3-4 pm
Group C: November 28th, 2024 from 2-3 pm
Group D: November 28th, 2024 from 3-4 pm

Week 7

Advanced Topics

This lecture will cover advanced topics in deep learning.

Lecture
Location: Senate Hall
Date: November 18th, 2024
Group A: 1:00-2:00 pm
Group B: 2:30-3:30 pm
Recurrent Neural Networks
Generative Adversarial Networks
Other Architectures
Performance Metrics
Slides for lecture 7 - Click here to download...
Quiz 7 - Click here... Deadline: November 22nd, 2024.
Laboratory
Sign Numbers - CNN Implementation
Group A: November 19th, 2024 from 2-3 pm
Group B: November 19th, 2024 from 3-4 pm
Group C: November 21st, 2024 from 2-3 pm
Group D: November 21st, 2024 from 3-4 pm

Week 6

CNN Architectures

This lecture continues our study of Convolutional Neural Networks. We will examine receptive fields and how they determine what portion of the input each unit can see, survey common CNN applications, and introduce the Sign Numbers classification task that you will work on in the labs.

Lecture
Location: Senate Hall
Date: November 11th, 2024
Group A: 1:00-2:00 pm
Group B: 2:30-3:30 pm
Receptive Fields
CNN Applications
Sign Numbers Classification
Slides for lecture 6 - Click here to download...
Quiz 6 - Click here... Deadline: November 15th, 2024.
Laboratory
Sign Numbers - Data and Augmentation
Group A: November 12th, 2024 from 2-3 pm
Group B: November 12th, 2024 from 3-4 pm
Group C: November 14th, 2024 from 2-3 pm
Group D: November 14th, 2024 from 3-4 pm

Week 5

Convolutional Neural Networks (continued)

This lecture continues our introduction to Convolutional Neural Networks. We will examine the layers that make up a CNN, learn how to count parameters and estimate memory requirements, and survey well-known CNN architectures. We will also begin data collection for the Sign Number detection project.

Lecture
Location: Senate Hall
Date: November 4th, 2024
Group A: 1:00-2:00 pm
Group B: 2:30-3:30 pm
CNN Layers
Parameters and Memory
CNN Architectures
Sign Number Detection - Data Collection
Slides for lecture 5 - Click here to download...
Quiz 5 - Click here... Deadline: November 8th, 2024.
Laboratory
Introduction to Neural Networks in Keras and TensorFlow
Data Collection Instructions: Click here to open...
Group A: November 5th, 2024 from 2-3 pm
Group B: November 5th, 2024 from 3-4 pm
Group C: November 7th, 2024 from 2-3 pm
Group D: November 7th, 2024 from 3-4 pm
Assignment
Play with a Convolutional Neural Network - Click here...

Week 4

Convolutional Neural Networks

Welcome to our lecture on Convolutional Neural Networks (CNNs)! Today, we'll delve into the fundamental building blocks that make CNNs so powerful for image recognition and computer vision tasks. We'll start by understanding the concept of convolution, where a filter (or kernel) slides over an input image, performing element-wise multiplications and summations with the underlying pixels. This process extracts local features like edges and textures. Next, we'll explore pooling layers, which reduce the spatial dimensions of feature maps while preserving essential information. Techniques like max pooling and average pooling are commonly used to downsample the input and make the network more efficient. Finally, we'll discuss the role of fully connected layers, which take the output of the convolutional and pooling layers and map them to class probabilities. By combining these building blocks, CNNs can learn hierarchical representations of visual data, enabling them to accurately classify images, detect objects, and even generate realistic images.
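The sliding-filter computation and pooling described above can be sketched in a few lines of NumPy. This is a minimal illustration (the image and kernel values are made up for demonstration), not the implementation used in the labs:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution (cross-correlation, as in most deep
    learning libraries): slide the kernel over the image and sum
    the element-wise products at each position."""
    ih, iw = image.shape
    kh, kw = kernel.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(fmap, size=2):
    """Non-overlapping max pooling: downsample by keeping the
    largest activation in each size-by-size window."""
    h, w = fmap.shape[0] // size, fmap.shape[1] // size
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = fmap[i*size:(i+1)*size, j*size:(j+1)*size].max()
    return out

# A 6x6 toy "image" and a 3x3 vertical-edge filter (illustrative values).
image = np.arange(36, dtype=float).reshape(6, 6)
kernel = np.array([[1., 0., -1.],
                   [1., 0., -1.],
                   [1., 0., -1.]])
features = conv2d(image, kernel)   # 4x4 feature map
pooled = max_pool(features)        # 2x2 map after pooling
```

In Keras, these two operations are provided by the `Conv2D` and `MaxPooling2D` layers, which we will use in the labs.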

Lecture
Location: Senate Hall
Date: October 28th, 2024
Group A: 1:00-2:00 pm
Group B: 2:30-3:30 pm
Motivation for CNNs
Building blocks of CNN (convolution and pooling)
Slides for lecture 4 - Click here to download...
Quiz 4 - Click here... Deadline: November 1st, 2024.
Laboratory
Introduction to Neural Networks in Keras and TensorFlow
Group A: October 29th, 2024 from 2-3 pm
Group B: October 29th, 2024 from 3-4 pm
Group C: October 31st, 2024 from 2-3 pm
Group D: October 31st, 2024 from 3-4 pm
Assignment
Play with a Convolutional Neural Network - Click here...

Week 3

Training and Optimizing Neural Networks

This lecture delves into the core concepts of training and optimizing neural networks. We will explore the fundamental building blocks of neural networks, including neurons, layers, and activation functions. You'll learn about the backpropagation algorithm, a crucial technique for calculating gradients and updating weights to minimize loss. We'll discuss various optimization algorithms like gradient descent, stochastic gradient descent, and advanced techniques like Adam and RMSprop. The lecture will also cover regularization techniques, such as L1 and L2 regularization, dropout, and early stopping, to prevent overfitting and improve generalization. By the end of this lecture, you'll have a solid understanding of the principles behind training and optimizing neural networks, enabling you to build and fine-tune your own models effectively.
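The gradient-based weight update at the heart of backpropagation can be illustrated on a one-parameter model. This NumPy sketch (with made-up data) computes the mean-squared-error gradient via the chain rule and takes repeated gradient descent steps:

```python
import numpy as np

# Toy data for a one-weight linear model: targets follow y = 3x
# (illustrative values only).
x = np.array([1., 2., 3., 4.])
y = 3.0 * x

w = 0.0          # single weight, initialized at zero
lr = 0.05        # learning rate
losses = []

for step in range(50):
    y_hat = w * x                      # forward pass
    loss = np.mean((y_hat - y) ** 2)   # mean squared error
    # Backward pass (chain rule): dL/dw = mean(2 * (y_hat - y) * x)
    grad = np.mean(2.0 * (y_hat - y) * x)
    w -= lr * grad                     # gradient descent update
    losses.append(loss)
```

Optimizers such as Adam and RMSprop refine this basic update with per-parameter adaptive step sizes, but the core loop of forward pass, gradient, and update is the same.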

Lecture
Location: Senate Hall
Date: October 21st, 2024
Group A: 1:00-2:00 pm
Group B: 2:30-3:30 pm
Backpropagation
Training Neural Networks
Optimizing Neural Networks
Slides for lecture 3 - Click here to download...
Quiz 3 - Click here... Deadline: October 25th, 2024.
Laboratory
Introduction to Neural Networks in Keras and TensorFlow
Group A: October 22nd, 2024 from 2-3 pm, or October 23rd, 2024 from 11 am-12 pm
Group B: October 22nd, 2024 from 3-4 pm, or October 23rd, 2024 from 12-1 pm
Group C: October 24th, 2024 from 2-3 pm
Group D: October 24th, 2024 from 3-4 pm
Assignment
Play with an Artificial Neural Network - Click here...

Week 2

Artificial Neural Networks

This lecture will introduce students to the fascinating world of artificial neural networks (ANNs), drawing inspiration from biological neural networks. We will explore the fundamental components and functions of biological neurons and how these concepts have been adapted to create artificial counterparts. Students will learn how to construct ANNs with different layers, including input, hidden, and output layers, and understand the role of activation functions in shaping the network's behavior. The lecture will culminate in a discussion of deep neural networks, which leverage multiple hidden layers to tackle complex tasks such as image recognition, natural language processing, and more. By the end of this session, students will have a solid foundation in the theory and practical aspects of building ANNs.
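The ideas above can be made concrete with a tiny forward pass: each artificial neuron computes a weighted sum of its inputs plus a bias, then applies a nonlinear activation. The weights and inputs in this NumPy sketch are arbitrary illustrative values:

```python
import numpy as np

def sigmoid(z):
    """Activation function: squashes any real value into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

# A tiny network: 3 inputs -> 2 hidden neurons -> 1 output neuron.
x = np.array([0.5, -1.0, 2.0])        # example input vector
W1 = np.array([[0.1, 0.4, -0.2],      # hidden-layer weights (illustrative)
               [-0.3, 0.2, 0.5]])
b1 = np.array([0.0, 0.1])             # hidden-layer biases
W2 = np.array([0.7, -0.6])            # output-layer weights
b2 = 0.2                              # output-layer bias

hidden = sigmoid(W1 @ x + b1)         # hidden-layer activations
output = sigmoid(W2 @ hidden + b2)    # network prediction in (0, 1)
```

A deep neural network simply stacks more hidden layers of the same kind, each feeding its activations to the next.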

Lecture
Location: Senate Hall
Date: October 14th, 2024
Group A: 1:00-2:00 pm
Group B: 2:30-3:30 pm
Biological Neural Networks
Artificial Neural Networks
Slides for lecture 2 - Click here to download...
Quiz 2 - Click here... Deadline: October 18th, 2024.
Laboratory
Introduction to Neural Networks in Keras and TensorFlow
Click here to open notebook... - accessible on Lab day.
Group A: October 15th, 2024 from 2-3 pm
Group B: October 15th, 2024 from 3-4 pm
Group C: October 17th, 2024 from 2-3 pm
Group D: October 17th, 2024 from 3-4 pm
Assignment
Read chapter 6 of Deep Learning - Click here... OR Click here...
Play with an Artificial Neural Network - Click here...

Week 1

Introduction and Logistics

In the lecture portion of Week 1, we will provide a comprehensive overview of deep learning, including its fundamental concepts, applications, and the role of neural networks. We will also discuss the TensorFlow/Keras framework, which will be the primary tool used throughout the course. In the lab session, students will set up their development environment using Google Colab, experiment with a neural network, and familiarize themselves with the TensorFlow/Keras API.

Lecture
Location: Senate Hall
Date: October 7th, 2024
Group A: 1:00-2:00 pm
Group B: 2:30-3:30 pm
Course Overview and Logistics
Overview of Deep Learning and the State of the Art
Slides for lecture 1 - Click here to download...
Quiz 1 - Click here... Deadline: October 11th, 2024.
Laboratory
Introduction to Google Colab and Python
Group A: October 8th, 2024 from 2-3 pm
Group B: October 8th, 2024 from 3-4 pm
Group C: October 10th, 2024 from 2-3 pm
Group D: October 10th, 2024 from 3-4 pm
Assignment
Due October 7th
Complete assignment 1 if you have not already done so - Click here...
Read chapter 1 of Deep Learning - Click here...
Create a Gmail account if you don't already have one - Click here...
Create a GitHub account if you don't already have one - Click here...
Create a ChatGPT account if you don't already have one - Click here...