diff --git a/README.md b/README.md
index 9c9e36ad07acc..6d37b43a1d5f4 100644
--- a/README.md
+++ b/README.md
@@ -1,53 +1,6 @@
-# llama.cpp
-
-![llama](https://user-images.githubusercontent.com/1991296/230134379-7181e485-c521-4d23-a0d6-f7b3b61ba524.png)
-
-[![License: MIT](https://img.shields.io/badge/license-MIT-blue.svg)](https://opensource.org/licenses/MIT)
-
-[Roadmap](https://github.com/users/ggerganov/projects/7) / [Project status](https://github.com/ggerganov/llama.cpp/discussions/3471) / [Manifesto](https://github.com/ggerganov/llama.cpp/discussions/205) / [ggml](https://github.com/ggerganov/ggml)
-
-Inference of [LLaMA](https://arxiv.org/abs/2302.13971) model in pure C/C++
-
-### Hot topics
-
-- ⚠️ **Upcoming change that might break functionality. Help with testing is needed:** https://github.com/ggerganov/llama.cpp/pull/3912
-
-----
-
-<details>
-  <summary>Table of Contents</summary>
-  <ol>
-    <li><a href="#description">Description</a></li>
-    <li><a href="#usage">Usage</a></li>
-    <li><a href="#contributing">Contributing</a></li>
-    <li><a href="#coding-guidelines">Coding guidelines</a></li>
-    <li><a href="#docs">Docs</a></li>
-  </ol>
-</details>
-
+# skywork.cpp | Run the Skywork (天工) LLM on CPU
+
+- An LLM runtime based on [llama.cpp](https://github.com/ggerganov/llama.cpp) and implemented in C/C++, which can run the [Skywork LLM](https://github.com/SkyworkAI/Skywork) directly on the CPU, with no GPU required.
 
 ## Description