AI on the Edge LESSON 15: Use the Raspberry Pi Camera in OpenCV to Create Live Video

Hey everyone, welcome to Lesson 15 of the AI on the Edge series!

In today’s lesson, we take a very important step forward. We finally bring the Raspberry Pi Camera into our OpenCV world so we can capture live video and start building real computer vision projects. Today we learn how to pull live frames directly from the official Raspberry Pi camera using picamera2 and display them smoothly with OpenCV.

This lesson is all about building a clean, reliable foundation. I walk you through how to properly configure the Pi Camera with the modern picamera2 library — setting the resolution to 1280×720, choosing the right format, and pushing the frame rate up to 60fps. Then we bring those frames straight into OpenCV so we can see live video in a window. You’ll also learn why we use RGB888 format and how to organize your code so it stays clean as our projects get more complex.
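
To give you a feel for where we are headed, here is a minimal sketch of the kind of program we build in this lesson, assuming picamera2 and OpenCV are already installed on your Pi. It is not the exact listing from the video, but it shows the basic pattern: configure the camera, grab frames in a loop, and show them in an OpenCV window.

from picamera2 import Picamera2
import cv2

picam2 = Picamera2()

# Configure the camera: 1280x720, RGB888, 60 frames per second
picam2.preview_configuration.main.size = (1280, 720)
picam2.preview_configuration.main.format = "RGB888"
picam2.preview_configuration.controls.FrameRate = 60
picam2.preview_configuration.align()
picam2.configure("preview")
picam2.start()

while True:
    frame = picam2.capture_array()          # grab the latest frame as a numpy array
    cv2.imshow("Pi Camera", frame)          # display it in an OpenCV window
    if cv2.waitKey(1) & 0xFF == ord('q'):   # press 'q' to quit
        break

cv2.destroyAllWindows()
picam2.stop()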

Getting reliable live video from the Pi Camera is one of those foundational skills that opens the door to everything we’re going to do in this class — face detection, object tracking, color tracking, motion detection, and all the exciting AI projects still ahead. Once you have solid camera access, the real fun begins.

I kept this lesson straightforward on purpose. I want you to have a rock-solid base that you can build upon without fighting technical problems later. By the end of this video, you’ll have a clean, responsive live video stream running from your Raspberry Pi Camera, ready for all the computer vision magic we’re about to add in the coming lessons.

So fire up your Raspberry Pi, grab your camera module, and let’s get that live video rolling! As always, I encourage you to type the code along with me and experiment with it. Change the resolution, try different frame rates, and make it your own.

Are you ready? Let’s dive in!

In today’s lesson, this is the code we developed:

This is the schematic we are using in these lessons:

Fusion Hat Circuit Diagram
This is the circuit we will use moving forward in the class

AI on the Edge LESSON 14: Control LED Color With Voice Commands on Raspberry Pi 5

In Lesson 14 of AI on the Edge, we’re doing something really fun and powerful — we’re building a voice-controlled RGB LED that listens to you, changes colors on command, and even talks back with some personality! This is true edge AI running 100% locally on your Raspberry Pi with the Fusion HAT. No cloud, no internet, just fast, private, and responsive voice interaction right on your desk.

You simply speak a color — red, green, blue, cyan, magenta, yellow, off, or even quit — and the RGB LED instantly springs to life with beautiful color. But that’s not all. Every time you give a command, the system replies with a fun, playful spoken response using the Piper text-to-speech engine. It turns your Raspberry Pi into a charming little LED companion that feels alive and interactive.

In this lesson, you’ll learn how to combine local Speech-to-Text with the STT library and natural-sounding Text-to-Speech with Piper. You’ll master PWM control of a full-color RGB LED through the Fusion HAT, and you’ll see how to use Python threading plus a queue to keep the voice listening running smoothly in the background without ever locking up your main program. The code is clean, well-structured, and includes proper startup greetings, graceful shutdown, and excellent resource cleanup — exactly the kind of solid practices we love in this series.

What makes this project extra special is how it brings everything together. You get real-time voice recognition, instant hardware response, and spoken feedback — all happening locally on the edge. It’s fast, it’s private, and it’s incredibly satisfying to watch that LED light up exactly as you command while your Pi chats back at you.
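
To make the core idea concrete, here is a stripped-down sketch of the listen-in-the-background pattern. It is not the full program from the video: the real lesson drives the LED through the Fusion HAT and uses the STT library and Piper for speech, while this sketch uses gpiozero with made-up pin numbers and a typed-input stand-in for the speech recognizer. The part to focus on is the background thread that listens and the queue that hands commands to the main loop.

import queue
import threading
import time

from gpiozero import RGBLED

# Hypothetical wiring -- substitute your own pins (the lesson uses the Fusion HAT instead)
led = RGBLED(red=17, green=27, blue=22)

COLORS = {
    "red": (1, 0, 0), "green": (0, 1, 0), "blue": (0, 0, 1),
    "cyan": (0, 1, 1), "magenta": (1, 0, 1), "yellow": (1, 1, 0),
    "off": (0, 0, 0),
}

commands = queue.Queue()

def listen_for_command():
    # Stand-in for the real speech-to-text call so the sketch runs anywhere
    return input("Say (type) a color: ")

def listener():
    # Background thread: waiting for speech never freezes the main program
    while True:
        commands.put(listen_for_command().lower().strip())

threading.Thread(target=listener, daemon=True).start()

running = True
while running:
    try:
        cmd = commands.get_nowait()   # check the queue without blocking
    except queue.Empty:
        time.sleep(0.05)              # main loop stays free for other work
        continue
    if cmd == "quit":
        running = False
    elif cmd in COLORS:
        led.color = COLORS[cmd]       # PWM mixes the requested color
        # speak(cmd)                  # in the lesson, Piper says a playful reply here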

Go ahead and watch the full Lesson 14 video, grab the complete code from the description, and build this project step by step with me. Once you have it running, I want you to play with it! Add new colors, create your own funny responses, or start thinking about how you could combine this voice control with sensors or other hardware in future projects.

This is the kind of hands-on, creative AI application that makes learning so exciting. You’re not just watching — you’re building real, useful skills that put you in the driver’s seat with artificial intelligence.

Fire up that Raspberry Pi, get your Fusion HAT ready, and let’s make some colors shine while the Pi talks back. I can’t wait to see what you create with this one!

Happy building, everyone — I’ll see you in the next lesson!

This is the schematic we are using for the project:

Fusion Hat Circuit Diagram
This is the circuit we will use moving forward in the class

This is the code we developed in the video:

 

AI on the Edge LESSON 13: Control LED Brightness with Voice Commands on Raspberry Pi 5

In this video lesson we continue to expand our skills in using AI and Speech to Text (STT) to control our projects via voice commands. In this lesson, we explore using voice commands to control the brightness of an LED. We use threading so that listening for speech does not block the rest of the program. We use the red channel of the RGB LED to demonstrate this capability.
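
If you want to experiment before or after watching, here is a tiny sketch of the brightness idea, assuming gpiozero’s PWMLED on a made-up pin for the red channel. The threaded listening loop is the same queue-and-thread pattern sketched under Lesson 14 above; the helper below just shows how a recognized phrase like "set brightness to 40" can be turned into a PWM duty cycle.

from gpiozero import PWMLED

red = PWMLED(17)   # hypothetical pin for the red channel of the RGB LED

def set_brightness_from(text):
    # Look for a number in the recognized phrase and use it as a percentage
    for word in text.split():
        if word.isdigit():
            percent = max(0, min(100, int(word)))
            red.value = percent / 100   # PWMLED.value expects 0.0 to 1.0
            return percent
    return None   # no number found in the command

# Example: set_brightness_from("set brightness to 40") dims the red channel to 40%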

This is the schematic we are using, from LESSON #5.

Fusion Hat Circuit Diagram
This is the circuit we will use moving forward in the class

In the video, this is the code we developed:

 

AI on the Edge LESSON 12: Introduction to Python Threading on the Raspberry Pi

The challenge we face as we move forward in this class is that certain important functions we need are ‘Blocking’ in nature. That is, they block the remainder of the program while they wait for input. For example, imagine blinking an LED and having the user input the delay time. While the program is waiting for user input, it cannot continue to perform the blinking operation. This is also true for speech input. While the program waits for you to say something, execution of the remainder of the program stalls. To overcome this, we use threads. Threads are functions, or small snippets of code, that we can have execute in the background. In today’s lesson, I will show you how to incorporate threading into your AI projects.
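
Here is a minimal sketch of that exact situation, using gpiozero with a made-up pin number. The blink runs in a background thread, so the LED keeps blinking even while input() sits and waits for you to type a new delay. It is not the listing from the video, just the bare pattern.

import threading
import time

from gpiozero import LED

led = LED(17)     # hypothetical pin; match your own wiring
delay = 0.5       # blink delay in seconds, shared with the background thread

def blink():
    # Background thread: keeps running even while the main thread is blocked
    while True:
        led.on()
        time.sleep(delay)
        led.off()
        time.sleep(delay)

threading.Thread(target=blink, daemon=True).start()

while True:
    # input() blocks, but only this main thread -- the blinking never stops
    delay = float(input("Enter a new blink delay in seconds: "))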

In today’s lesson, this is the code we developed.

 

AI on the Edge LESSON 11: Control LED on Raspberry Pi With Voice Commands

In this video lesson I show you my solution to the homework assignment I gave in LESSON #10. The assignment was to control an LED using voice commands on the Raspberry Pi. This uses the Speech to Text expertise we developed in the last few lessons, but incorporates it into a real-world project. With this basic framework, you are now equipped to make speech part of your future Raspberry Pi projects.

This is the schematic of the circuit we are using for our AI class. We go into great detail on this schematic in LESSON #5 if you want to learn more about it.

Fusion Hat Circuit Diagram
This is the circuit we will use moving forward in the class

Now this is the code we developed in this lesson:

 

Making The World a Better Place One High Tech Project at a Time. Enjoy!