# Reading
This repository contains a collection of important papers related to Mixture of Experts (MoE) and Large Language Models (LLMs), along with links to their corresponding arXiv pages and to publicly available GitHub code.
**Advances in Weight Generation and Retrieval for Language Models**
| # | Paper Title | Year | Link | Code |
|---|---|---|---|---|
| 1 | Representing Model Weights with Language using Tree Experts | 2024 | arXiv | No code available |
| 2 | Deep Linear Probe Generators for Weight Space Learning | 2024 | arXiv | GitHub |
| 3 | Knowledge Fusion By Evolving Weights of Language Models | 2024 | arXiv | GitHub |
| 4 | Vector Quantization Prompting for Continual Learning | 2024 | arXiv | |
| 5 | Historical Test-time Prompt Tuning for Vision Foundation Models | 2024 | arXiv | |
Feel free to open a pull request if you find new papers or code related to MoE and LLMs. Let's keep this list growing!