
Topic names and Llama cpp #2080

Answered by MaartenGr
caentrena asked this question in Q&A
Jul 11, 2024 · 1 comment · 1 reply

Thank you for your kind words! You can do this by following the multiple representations documentation. Essentially, all you need to do is wrap your representation model in a dictionary, like so:

from bertopic import BERTopic
from bertopic.representation import LlamaCPP

# Use llama.cpp to load in a 4-bit quantized version of Zephyr 7B Alpha
zephyr = LlamaCPP("zephyr-7b-alpha.Q4_K_M.gguf")

# The dictionary key ("Zephyr") is the name under which this
# representation is stored alongside the default c-TF-IDF one
representation_model = {
    "Zephyr": zephyr,
}

# Create our BERTopic model
topic_model = BERTopic(representation_model=representation_model, verbose=True)
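
Once the model is fitted, you can inspect the Zephyr aspect next to the default representation and, if you like, use it as the topic labels. A minimal sketch, assuming `docs` is your own list of documents; the label-extraction step mirrors the LLM examples in the documentation and may need tweaking depending on your prompt's output format:

# A minimal sketch; `docs` is assumed to be your list of documents
topics, probs = topic_model.fit_transform(docs)

# The Zephyr aspect is stored alongside the default representation
print(topic_model.get_topic(0, full=True))

# Optionally, take the first line of each Zephyr output and use it
# as the label shown in get_topic_info() and the visualizations
zephyr_labels = [
    label[0][0].split("\n")[0]
    for label in topic_model.get_topics(full=True)["Zephyr"].values()
]
topic_model.set_topic_labels(zephyr_labels)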

Answer selected by caentrena