Using LLMs to count macro nutrients (macros)
Tested using an Nvidia GPU for acceleration. If using Nvidia, ensure you have a compatible CUDA runtime (run `nvidia-smi`) and CUDA toolkit (run `nvcc --version`) installed for building llama-cpp-python.
Copy `.env_example` to `.env` and add your API key from FoodData Central.
```sh
python -m venv env
source env/bin/activate
pip install -r requirements.txt
CMAKE_ARGS="-DLLAMA_CUDA=on" pip install llama-cpp-python --force-reinstall --no-cache-dir
python download_models.py
./start_server.sh &
python main.py
```
- Parse recipes from the web
- Parse strings to pint quantities
- Add ability to cache recipes
- Calculate macro needs based on height, weight, age, gender
- Use JSON mode in instructor so llama-3 can be used
- Add RecipeBook object that keeps track of cached recipes
- Get macro values for each ingredient
- Adjust macros based on goals: lose weight, build muscle, maintain
- Create meal plan for individual based on recipes and macro requirements
- Create meal plan for groups of people
- Create shopping list based on meal plan
- Calculate prices using grocery stores' APIs
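The "calculate macro needs" item above can be sketched with the Mifflin-St Jeor equation, one common choice for estimating basal metabolic rate; the repo may use a different formula. The activity factor and the 30/40/30 protein/carb/fat split below are illustrative assumptions, not values taken from this project.

```python
# Sketch of a macro-needs calculation using the Mifflin-St Jeor BMR
# equation. The activity factor and the 30/40/30 protein/carb/fat
# split are illustrative assumptions, not values from this repo.

def bmr_mifflin_st_jeor(weight_kg: float, height_cm: float,
                        age: int, gender: str) -> float:
    """Basal metabolic rate in kcal/day (Mifflin-St Jeor)."""
    base = 10 * weight_kg + 6.25 * height_cm - 5 * age
    return base + 5 if gender == "male" else base - 161

def daily_macros(weight_kg: float, height_cm: float, age: int,
                 gender: str, activity: float = 1.55) -> dict:
    """Grams of protein, carbs, and fat for maintenance calories."""
    tdee = bmr_mifflin_st_jeor(weight_kg, height_cm, age, gender) * activity
    return {
        "protein_g": round(tdee * 0.30 / 4),  # 4 kcal per gram of protein
        "carbs_g": round(tdee * 0.40 / 4),    # 4 kcal per gram of carbohydrate
        "fat_g": round(tdee * 0.30 / 9),      # 9 kcal per gram of fat
    }
```

Goal adjustments (lose weight, build muscle, maintain) could then scale `tdee` up or down before splitting it into macros.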
- llama-cpp-python for serving the LLM backends and function calling
- pydantic for data modeling and typing
- instructor for generating instances of pydantic models using LLMs
- pint for unit conversion
- fooddatacentral client for USDA FoodData Central API
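As a sketch of how the pydantic and instructor pieces fit together: instructor patches an OpenAI-compatible client so that a completion call can return a validated pydantic instance instead of raw text. The field names and the server URL below are illustrative assumptions, not this repo's actual schema or configuration.

```python
from typing import List
from pydantic import BaseModel

# Illustrative schemas; the repo's actual models may differ.
class Ingredient(BaseModel):
    name: str
    quantity: float
    unit: str

class Recipe(BaseModel):
    title: str
    ingredients: List[Ingredient]

# With instructor, a patched OpenAI-compatible client pointed at the
# local llama-cpp-python server can return a validated Recipe
# directly, e.g. (not run here; URL and model name are assumptions):
#
#   client = instructor.patch(OpenAI(base_url="http://localhost:8000/v1"))
#   recipe = client.chat.completions.create(
#       model="local-model",
#       response_model=Recipe,
#       messages=[{"role": "user", "content": recipe_text}],
#   )
```

Declaring the schema once in pydantic gives both runtime validation and the structure instructor uses to constrain the LLM's output.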
- USDA FoodData Central API:
U.S. Department of Agriculture, Agricultural Research Service. FoodData Central, 2019. fdc.nal.usda.gov.
- Macro Counting with Python by @AdamLiscia