Instructions to run KG_RAG on mac #34
base: main

Changes from all commits: 9dd6b7d, 12d3104, 2f2a578, df2fb85, e6c52fc, 32d7611, cb2a390, b147050
@@ -0,0 +1,11 @@ (new file)

```dockerfile
FROM mcr.microsoft.com/devcontainers/anaconda:0-3

# Copy environment.yml (if found) to a temp location so we update the environment. Also
# copy "noop.txt" so the COPY instruction does not fail if no environment.yml exists.
COPY environment.yml* .devcontainer/noop.txt /tmp/conda-tmp/
RUN if [ -f "/tmp/conda-tmp/environment.yml" ]; then umask 0002 && /opt/conda/bin/conda env update -n base -f /tmp/conda-tmp/environment.yml; fi \
    && rm -rf /tmp/conda-tmp

# [Optional] Uncomment this section to install additional OS packages.
# RUN apt-get update && export DEBIAN_FRONTEND=noninteractive \
#     && apt-get -y install --no-install-recommends <your-package-list-here>
```
@@ -0,0 +1,24 @@ (new file)

```jsonc
// For format details, see https://aka.ms/devcontainer.json. For config options, see the
// README at: https://github.com/devcontainers/templates/tree/main/src/anaconda
{
    "name": "Anaconda (Python 3)",
    "build": {
        "context": "..",
        "dockerfile": "Dockerfile"
    },

    // Features to add to the dev container. More info: https://containers.dev/features.
    // "features": {},

    // Use 'forwardPorts' to make a list of ports inside the container available locally.
    // "forwardPorts": [],

    // Use 'postCreateCommand' to run commands after the container is created.
    "postCreateCommand": "pwd && /bin/bash /workspaces/KG_RAG/.devcontainer/postCreateCommand.sh"

    // Configure tool-specific properties.
    // "customizations": {},

    // Uncomment to connect as root instead. More info: https://aka.ms/dev-containers-non-root.
    // "remoteUser": "root"
}
```
@@ -0,0 +1,3 @@ (new file)

```text
This file is copied into the container along with environment.yml* from the parent
folder. It is included to prevent the Dockerfile COPY instruction from
failing if no environment.yml is found.
```
@@ -0,0 +1,37 @@ (new file)

```bash
#!/bin/bash

# Update PATH
echo 'export PATH="$HOME/conda/bin:$PATH"' >> $HOME/.bashrc
export PATH="$HOME/conda/bin:$PATH"

# Initialize conda
conda init bash

# Source the updated .bashrc to apply changes
source $HOME/.bashrc

# Create and activate the conda environment, and install requirements
conda create -y -n kg_rag python=3.10.9
source activate kg_rag
pip install -r /workspaces/KG_RAG/requirements.txt

# Ensure the conda environment is activated for future terminals
echo 'conda activate kg_rag' >> $HOME/.bashrc


# # Update PATH
# echo 'export PATH="$HOME/conda/bin:$PATH"' >> $HOME/.bashrc
# export PATH="$HOME/conda/bin:$PATH"

# # Initialize conda
# conda init

# # Create and activate the conda environment
# conda create -y -n kg_rag python=3.10.9
# echo 'conda activate kg_rag' >> $HOME/.bashrc
# pip install -r /workspaces/KG_RAG/requirements.txt
# source $HOME/.bashrc

# # conda create -y -n kg_rag python=3.10.9
# # echo 'conda activate kg_rag'
# # pip install -r /workspaces/KG_RAG/requirements.txt
```
@@ -0,0 +1,12 @@ (new file)

```yaml
# To get started with Dependabot version updates, you'll need to specify which
# package ecosystems to update and where the package manifests are located.
# Please see the documentation for more information:
# https://docs.github.com/github/administering-a-repository/configuration-options-for-dependency-updates
# https://containers.dev/guide/dependabot

version: 2
updates:
  - package-ecosystem: "devcontainers"
    directory: "/"
    schedule:
      interval: weekly
```
@@ -0,0 +1,8 @@ (new file)

```shell
# Uncomment the following 3 lines and add the Azure API_KEY, RESOURCE_ENDPOINT and API_VERSION, if using GPT_API_TYPE="azure" in the config.yaml file.
# API_KEY=<API_KEY>
# RESOURCE_ENDPOINT=<RESOURCE_ENDPOINT>
# API_VERSION=<API_VERSION>  # Can default to "2024-02-01"

# Uncomment the following and add the OpenAI API key, if using GPT_API_TYPE="open_ai" in the config.yaml file. Make sure to comment out the variables for Azure endpoints above.
# API_KEY=<OPENAI_API_KEY>
```
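As an illustration of how these environment variables might be consumed, here is a minimal sketch using only `os.environ`; the function name `resolve_api_settings` and the returned dict layout are assumptions for illustration, not the actual KG_RAG code:

```python
import os

def resolve_api_settings(gpt_api_type: str) -> dict:
    """Collect the credentials needed for the configured API type.

    `gpt_api_type` mirrors GPT_API_TYPE from config.yaml
    ("azure" or "open_ai").
    """
    settings = {"api_key": os.environ.get("API_KEY")}
    if gpt_api_type == "azure":
        settings["resource_endpoint"] = os.environ.get("RESOURCE_ENDPOINT")
        # API_VERSION is optional; "2024-02-01" is the suggested default above.
        settings["api_version"] = os.environ.get("API_VERSION", "2024-02-01")
    return settings
```

With `GPT_API_TYPE="open_ai"`, only `API_KEY` is read, which is why the Azure variables must be commented out so the single `API_KEY` line holds the OpenAI key.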
Comment: can you try changing the absolute path to a relative path? Instead of:

try:

Since the code is run as a module from the KG_RAG directory, I think this should be fine and the users do not need to change the path. Can you please change it and test it? If it works fine, then please change it to the relative path format.
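The reviewer's suggestion can be sketched as follows; the constant name and paths are hypothetical, since the original snippet is not quoted above:

```python
import os

# Hypothetical absolute path (machine-specific, forces every user to edit it):
# VECTOR_DB_PATH = "/Users/someuser/KG_RAG/data/vectorDB"

# Relative path: resolves against the current working directory, which is the
# KG_RAG repo root when the code is run as a module from that directory.
VECTOR_DB_PATH = os.path.join("data", "vectorDB")
```

The trade-off is that a relative path only works when the process is started from the repo root, which matches how KG_RAG is documented to be run.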
```diff
@@ -392,5 +392,6 @@ def interactive(question, vectorstore, node_context_df, embedding_function_for_c
         output = llm_chain.run(context=node_context_extracted, question=question)
     elif "gpt" in llm_type:
         enriched_prompt = "Context: "+ node_context_extracted + "\n" + "Question: " + question
-        output = get_GPT_response(enriched_prompt, system_prompt, llm_type, llm_type, temperature=config_data["LLM_TEMPERATURE"])
+        chat_model_id, chat_deployment_id = get_gpt35()
+        output = get_GPT_response(enriched_prompt, system_prompt, chat_model_id, chat_deployment_id, temperature=config_data["LLM_TEMPERATURE"])
         stream_out(output)
```

Comment: Noticed that

Reply: if api type is open_ai, I think chat_deployment_id is None. Please see my response given below.

Comment: One problem with the new line 395 is that it will always call gpt-3.5, regardless of whether the user specified other gpt models, such as gpt-4. I think the better option here is to change line 395:

to:

Do you agree?
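The alternative the reviewer proposes (deriving the ids from the user's configured `llm_type` instead of hardcoding `get_gpt35()`) could look roughly like this; the function name and the exact id-mapping are assumptions, since the proposed replacement line is not quoted above:

```python
def resolve_chat_ids(llm_type, api_type):
    """Pick model/deployment ids from the user-configured llm_type.

    Unlike a hardcoded get_gpt35(), this honours e.g. llm_type="gpt-4".
    For api_type "open_ai" there is no Azure deployment, so the
    deployment id is None, matching the reviewer's observation above.
    """
    chat_model_id = llm_type
    chat_deployment_id = llm_type if api_type == "azure" else None
    return chat_model_id, chat_deployment_id
```

The call site would then be `chat_model_id, chat_deployment_id = resolve_chat_ids(llm_type, api_type)` before `get_GPT_response(...)`.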
```diff
@@ -11,7 +11,7 @@ asttokens==2.4.0
 async-lru==2.0.4
 async-timeout==4.0.3
 attrs==23.1.0
-auto-gptq==0.4.2
+# auto-gptq==0.4.2
 Babel==2.12.1
 backcall==0.2.0
 backoff==2.2.1
```

Comment: I couldn't install this or find any usage in the codebase. Can we remove this from the requirements?

Reply: I would probably keep this, because some users utilize a quantized model to run KG-RAG, and I presume this was added by them.
```diff
@@ -102,17 +102,17 @@ notebook==7.0.4
 notebook_shim==0.2.3
 numexpr==2.8.6
 numpy==1.26.0
-nvidia-cublas-cu11==11.10.3.66
-nvidia-cuda-cupti-cu11==11.7.101
-nvidia-cuda-nvrtc-cu11==11.7.99
-nvidia-cuda-runtime-cu11==11.7.99
-nvidia-cudnn-cu11==8.5.0.96
-nvidia-cufft-cu11==10.9.0.58
-nvidia-curand-cu11==10.2.10.91
-nvidia-cusolver-cu11==11.4.0.1
-nvidia-cusparse-cu11==11.7.4.91
-nvidia-nccl-cu11==2.14.3
-nvidia-nvtx-cu11==11.7.91
+# nvidia-cublas-cu11==11.10.3.66
+# nvidia-cuda-cupti-cu11==11.7.101
+# nvidia-cuda-nvrtc-cu11==11.7.99
+# nvidia-cuda-runtime-cu11==11.7.99
+# nvidia-cudnn-cu11==8.5.0.96
+# nvidia-cufft-cu11==10.9.0.58
+# nvidia-curand-cu11==10.2.10.91
+# nvidia-cusolver-cu11==11.4.0.1
+# nvidia-cusparse-cu11==11.7.4.91
+# nvidia-nccl-cu11==2.14.3
+# nvidia-nvtx-cu11==11.7.91
 onnxruntime==1.16.0
 openai==0.28.1
 overrides==7.4.0
```

Comment on lines +105 to +115: I couldn't install the specific versions. Is this required?

Reply: I know local models such as Llama and sentence transformers make use of the NVIDIA GPU to run the operations (which I tried on the Linux server), so this may be useful for that, but I haven't checked it otherwise.
```diff
@@ -185,7 +185,7 @@ tornado==6.3.3
 tqdm==4.66.1
 traitlets==5.10.0
 transformers==4.33.2
-triton==2.0.0
+# triton==2.0.0
 typer==0.9.0
 typing-inspect==0.9.0
 typing_extensions==4.8.0
```

Comment: Is this required? Couldn't install this either.

Reply: I presume this is also related to the NVIDIA GPU, so the same explanation as above applies.
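The pattern applied across these hunks (commenting out GPU-only pins so the file installs on macOS) can be automated. A minimal sketch, assuming the package names shown above and using only the standard library; the function and prefix list are illustrative, not part of the PR:

```python
import platform

# GPU-only distributions that failed to install on macOS (per the hunks above).
GPU_ONLY_PREFIXES = ("nvidia-", "triton==", "auto-gptq==")

def filter_requirements(lines, system=None):
    """Comment out GPU-only pins when installing on macOS (Darwin).

    `lines` is a list of requirement strings; other platforms get the
    list back unchanged.
    """
    system = system or platform.system()
    if system != "Darwin":
        return list(lines)
    return [
        f"# {line}" if line.startswith(GPU_ONLY_PREFIXES) else line
        for line in lines
    ]
```

This keeps a single requirements.txt in the repo while letting a macOS setup step rewrite it, rather than maintaining two divergent copies.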
Comment:
I really liked the container-based setup. However, I am aware that some users have managed to run KG-RAG without it, skipping straight from Step 1 to the 'Create a virtual environment' step. Therefore, if the container-based setup is only necessary for macOS users, could you please mark this step as 'Optional' and note that it's specifically for macOS installation?