An open-source project combining the Llama 3.2 Vision model, robotic control, and brain-computer interfaces to enable intuitive (and accessible!) human-robot interaction. Built on top of open-source projects including LeRobot, Llama, and EMOTIV's Cortex API.
- Currently powered by Llama 3.2 90B Vision through Groq
- Real-time environment analysis and spatial reasoning
- Action sequence generation based on visual input (a minimal API sketch follows this list)
- Planned edge deployment using smaller models (1B and 3B parameters):
  - Local inference for improved latency
  - Reduced hardware requirements
  - Offline operation capability
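The current vision pipeline is served through Groq's hosted API. Below is a minimal sketch of what a single planning query can look like, assuming the official `groq` Python client; the model ID reflects Groq's naming at the time of writing, and the prompt, image path, and response handling are illustrative rather than the project's exact pipeline:

```python
# Sketch: one planning query against the hosted 90B vision model.
# Assumes `pip install groq`; prompt and image path are illustrative.
import base64
import os

from groq import Groq

client = Groq(api_key=os.environ["GROQ_API_KEY"])

# Encode a camera frame of the workspace as a base64 data URL
with open("workspace.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode()

response = client.chat.completions.create(
    model="llama-3.2-90b-vision-preview",  # Groq's model ID at the time of writing
    messages=[{
        "role": "user",
        "content": [
            {"type": "text",
             "text": "List the objects on the table and propose an action "
                     "sequence to pick up the red cup."},
            {"type": "image_url",
             "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
        ],
    }],
)
print(response.choices[0].message.content)
```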
- Compatible with Moss v1 robotic arm
- Precise motor control through LeRobot integration
- Support for complex manipulation tasks
- Two fully trained RL policies openly available on Hugging Face
- 20 GB+ of human-recorded demonstration data, openly shared with the community as a [Hugging Face dataset](url) (loading sketch below)
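If you want to pull the recorded demonstrations yourself, the standard `datasets` library is enough. The repository ID below is a placeholder until the dataset link above is filled in:

```python
# Sketch: loading the shared demonstrations with the `datasets` library.
# The repo ID is hypothetical; substitute the published dataset ID.
from datasets import load_dataset

ds = load_dataset("your-org/mindgrip-demonstrations", split="train")  # hypothetical ID
print(ds)            # column names and episode count
print(ds[0].keys())  # fields of one recorded sample
```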
- Direct mind control of robotic arms using EMOTIV EEG headsets
- Real-time neural signal processing
- Built on EMOTIV's Cortex API for BCI integration (connection sketch below)
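For orientation, Cortex exposes a JSON-RPC API over a local WebSocket. The sketch below shows roughly what the first handshake step looks like, assuming the `websocket-client` package and EMOTIV developer credentials; session creation, stream subscription, and error handling are omitted:

```python
# Minimal Cortex handshake sketch (JSON-RPC over WebSocket).
# Assumes `pip install websocket-client` and EMOTIV app credentials;
# not the project's full CortexInterface implementation.
import json
import ssl

import websocket

# Cortex runs locally and uses a self-signed certificate
ws = websocket.create_connection(
    "wss://localhost:6868", sslopt={"cert_reqs": ssl.CERT_NONE}
)

def call(method, params, rid=1):
    """Send one JSON-RPC request and return the parsed response."""
    ws.send(json.dumps(
        {"jsonrpc": "2.0", "id": rid, "method": method, "params": params}
    ))
    return json.loads(ws.recv())

creds = {"clientId": "YOUR_CLIENT_ID", "clientSecret": "YOUR_CLIENT_SECRET"}
token = call("authorize", creds)["result"]["cortexToken"]
# Next steps (omitted): create a session for the headset, subscribe to the
# "com" (mental command) stream, and map commands to arm actions.
```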
To run the full stack you will need:
- EMOTIV EEG headset
- Moss v1 robotic arm (assembly instructions)
- Python 3.10+
- Clone the repository:

```bash
git clone https://github.com/yourusername/mindgrip.git
cd mindgrip
```
- Install dependencies:

```bash
pip install -r requirements.txt
```
- Set up environment variables in a global `.env` with:

```bash
export GROQ_API_KEY="your_key_here"  # required for the Llama 3.2 90B Vision model
```
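If you prefer loading the key from the `.env` file at runtime rather than sourcing it in your shell, `python-dotenv` handles the `export`-prefixed format shown above. This is an optional convenience, not a project requirement:

```python
# Optional: load GROQ_API_KEY from .env at runtime.
# Assumes `pip install python-dotenv`.
import os

from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory
assert os.getenv("GROQ_API_KEY"), "GROQ_API_KEY is not set"
```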
- Follow the Moss v1 assembly guide for robotic arm setup
- Connect your EMOTIV headset following the Cortex API documentation
Example control loop (the robot interface import is illustrative; substitute your own arm setup):

```python
from mindgrip.llama import LlamaPolicy
from mindgrip.cortex import CortexInterface
from mindgrip.robot import MossArm  # hypothetical import; adapt to your arm interface

# Initialize components
policy = LlamaPolicy()
bci = CortexInterface()
robot = MossArm()  # the original snippet used `robot` without defining it

# Start control loop
while True:
    # Get BCI input
    command = bci.get_command()

    # Process with vision system
    action = policy.get_action(command)

    # Execute on robot
    robot.execute(action)
```
Roadmap:
- Initial integration with Llama 3.2 90B Vision (complete)
- Edge deployment with the 1B-parameter model
- Edge deployment with the 3B-parameter model
- Offline operation support
- Improved latency through local inference (see the sketch after this list)
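For a feel of what the edge path could look like, here is a sketch of local generation with a small Llama 3.2 model via `transformers`. Note that the 1B and 3B checkpoints are text-only (the vision variants are 11B and 90B), so this covers the language/planning side; the model ID is real but gated on Hugging Face, and the prompt and wiring into the control loop are illustrative:

```python
# Sketch: local inference with a small Llama 3.2 model.
# Requires accepting the Llama license on Hugging Face first.
import torch
from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="meta-llama/Llama-3.2-1B-Instruct",
    torch_dtype=torch.bfloat16,
    device_map="auto",  # falls back to CPU if no GPU is available
)

messages = [
    {"role": "user",
     "content": "Plan the next arm action: the cup is left of the gripper."}
]
out = pipe(messages, max_new_tokens=64)
print(out[0]["generated_text"][-1]["content"])  # assistant reply
```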
We welcome contributions! Please see our Contributing Guide for details.
This project is fully open source and licensed under the MIT License - see the LICENSE file for details.
Built with these amazing open-source projects:
- LeRobot - Robot control framework
- Llama - Vision and language model (90B parameters)
- EMOTIV Cortex Examples - BCI integration
- Moss Robot Arms - Hardware design