Hailo’s Yocto layers allow the user to integrate Hailo’s software into an existing Yocto build environment. They include recipes for the following components (a short configuration sketch follows the list):
- PCIe driver
- Hailo-8 firmware
- HailoRT GStreamer library implementing the HailoNet element
- HailoRT library
- pyHailoRT - HailoRT Python API (wraps the run-time library)
- Hailo TAPPAS - framework for optimized execution of video-processing pipelines
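As a rough illustration of what integration looks like once the layers are part of the build, the listed components can be pulled into the image from conf/local.conf. This is a minimal sketch only; the package names used here (hailo-pci, hailo-firmware, libhailort, libgsthailo) are assumptions for illustration and should be checked against the recipes the layers actually provide for your release.

```
# conf/local.conf -- add the Hailo components to the image
# (package names are illustrative assumptions; verify against the layer recipes)
IMAGE_INSTALL:append = " hailo-pci hailo-firmware libhailort libgsthailo"

# On dunfell, the older override syntax applies instead:
# IMAGE_INSTALL_append = " hailo-pci hailo-firmware libhailort libgsthailo"
```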
These layers work with Poky.
Please follow the recommended setup procedures of your OE distribution.
Currently supported branches (as of April 2024): Dunfell, Kirkstone, and Mickledore (HailoRT only).
Unsupported branches no longer receive updates for newer Hailo software versions, but they continue to work with the compatible (older) versions they were last updated for.
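For example, a build based on Kirkstone would use the layers' kirkstone branch. The commands below are a minimal sketch of fetching and registering the layers; the repository URL, sub-layer directory names, and bitbake-layers workflow shown are assumptions, not something this document specifies.

```
# Clone the branch matching the Yocto release in use (kirkstone shown as an example)
git clone -b kirkstone https://github.com/hailo-ai/meta-hailo.git

# From an initialized build directory, register the sub-layers
# (sub-layer directory names are assumptions; list the checkout to confirm them)
bitbake-layers add-layer ../meta-hailo/meta-hailo-accelerator
bitbake-layers add-layer ../meta-hailo/meta-hailo-libhailort
```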
- For integrating HailoRT into your existing environment, see the hailo.ai developer zone documentation (registration is required for full documentation access).
- For integrating TAPPAS into your existing environment, see the documentation in the TAPPAS GitHub repository.
For the HailoRT changelog, see the hailo.ai developer zone (registration required).
Contact information and support are available at hailo.ai.
Hailo offers breakthrough AI Inference Accelerators and AI Vision Processors uniquely designed to accelerate embedded deep learning applications on edge devices.
The Hailo AI Inference Accelerators allow edge devices to run deep learning applications at full scale more efficiently, effectively, and sustainably, with an architecture that takes advantage of the core properties of neural networks.
The Hailo AI Vision Processors (SoCs) combine Hailo's patented, field-proven AI inferencing capabilities with advanced computer vision engines, delivering premium image quality and advanced video analytics.
For more information, please visit hailo.ai.