This repository contains prototypes and components of an AI event-driven architecture for controlling an 8-bit LED screen simulation. The project includes several modes, such as animations, drawing, games, and interaction with large language models (LLMs).
The repository is organized into the following prototypes:

- `C`: Various animations, movement detection, and pose/hand recognition on the Jetson Nano. Includes Snake and Brick Pong games implemented in C/C++.
- `Ellie_connected`: MobileNet for user recognition, MediaPipe as the Brick Pong hand controller, and automatic launch when a user is detected (see the hand-controller sketch after this list).
- `Ellie_connected_v2`: A more robust version with a resting animation, OpenCV-based movement detection, mode selection via hand-gesture recognition, and Brick Pong with the hand controller; it relaunches when no movement is detected (see the motion-detection sketch after this list).
- `flaskServerWith3DEffects`: YOLOv9 object detection integrated to create video effects and animations (see the YOLOv9 sketch after this list).
- `soundAndPersonRecognition`: Scripts for detecting sound levels and recognizing people with a simple OpenCV model (see the sound-level sketch after this list).
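A minimal sketch of a MediaPipe hand controller of the kind `Ellie_connected` describes for Brick Pong: the x coordinate of the index fingertip is mapped to a paddle position. The camera index, the confidence threshold, and the way the value is fed into the game are assumptions, not code from the repository.

```python
import cv2
import mediapipe as mp

# One-hand tracking; the detection confidence is an assumed setting.
hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.6)
cap = cv2.VideoCapture(0)  # assumed default camera

while True:
    ok, frame = cap.read()
    if not ok:
        break
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:
        # Landmark 8 is the index fingertip; its normalized x (0..1) becomes the paddle position.
        tip = result.multi_hand_landmarks[0].landmark[8]
        paddle_x = int(tip.x * frame.shape[1])
        print("paddle x:", paddle_x)  # a real game loop would move the paddle here
    cv2.imshow("hand controller", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
hands.close()
```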
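A minimal sketch of the OpenCV frame-differencing style of movement detection mentioned for `C` and `Ellie_connected_v2`, which could drive the resting/relaunch behaviour. The threshold values and camera index are assumptions rather than the repository's actual parameters.

```python
import cv2

MOTION_PIXELS = 5000  # assumed number of changed pixels that counts as movement

cap = cv2.VideoCapture(0)
prev_gray = None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (21, 21), 0)
    if prev_gray is not None:
        # Pixels that changed noticeably between consecutive frames.
        diff = cv2.absdiff(prev_gray, gray)
        _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
        if cv2.countNonZero(mask) > MOTION_PIXELS:
            print("movement detected")  # e.g. leave the resting animation
    prev_gray = gray
    cv2.imshow("motion", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```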
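A hedged sketch of YOLOv9 detections driving a simple video effect, in the spirit of `flaskServerWith3DEffects`. Loading the model through the `ultralytics` package and the `yolov9c.pt` checkpoint name are assumptions; the repository may integrate YOLOv9 differently, and the blur effect is only a placeholder.

```python
import cv2
from ultralytics import YOLO

model = YOLO("yolov9c.pt")  # assumed pretrained YOLOv9 checkpoint
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    boxes = model(frame, verbose=False)[0].boxes.xyxy.cpu().numpy().astype(int)
    for x1, y1, x2, y2 in boxes:
        if x2 > x1 and y2 > y1:
            # Placeholder effect: blur the inside of every detected box.
            frame[y1:y2, x1:x2] = cv2.GaussianBlur(frame[y1:y2, x1:x2], (31, 31), 0)
    cv2.imshow("effects", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```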
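A minimal sketch of sound-level detection along the lines of `soundAndPersonRecognition`, computing an RMS level over short audio blocks with NumPy. The `sounddevice` backend and the RMS threshold are assumptions; the repository's scripts may use a different audio library.

```python
import numpy as np
import sounddevice as sd

THRESHOLD = 0.02      # assumed RMS level that counts as a loud sound
SAMPLE_RATE = 16000   # assumed sample rate
BLOCK_SECONDS = 0.5

def on_audio(indata, frames, time, status):
    # Root-mean-square level of the current half-second block.
    rms = float(np.sqrt(np.mean(indata ** 2)))
    if rms > THRESHOLD:
        print(f"sound detected (rms={rms:.3f})")

with sd.InputStream(channels=1, samplerate=SAMPLE_RATE,
                    blocksize=int(SAMPLE_RATE * BLOCK_SECONDS),
                    callback=on_audio):
    sd.sleep(10_000)  # listen for ten seconds
```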
The prototypes build on the following technologies and platforms:

- Python
- C/C++
- YOLOv9
- MediaPipe
- MobileNet
- OpenCV
- PyGame
- NumPy
- Matplotlib
- Whisper
- Jetson Nano
This project is licensed under the MIT License.