update readme
@@ -25,6 +25,7 @@
## News <!-- omit in toc -->
* [2024.05.25] MiniCPM-Llama3-V 2.5 now supports streaming outputs and customized system prompts. Try it [here](https://huggingface.co/openbmb/MiniCPM-Llama3-V-2_5#usage)! A minimal usage sketch follows this list.
* [2024.05.25] MiniCPM-Llama3-V 2.5 now supports [Ollama](https://github.com/OpenBMB/ollama/tree/minicpm-v2.5/examples/minicpm-v2.5). Try it now!
* [2024.05.24] We release the MiniCPM-Llama3-V 2.5 [gguf](https://huggingface.co/openbmb/MiniCPM-Llama3-V-2_5-gguf), which supports [llama.cpp](#inference-with-llamacpp) inference and delivers smooth decoding at 6~8 tokens/s on mobile phones. Try it now!
* [2024.05.23] 🔥🔥🔥 MiniCPM-V tops GitHub Trending and Hugging Face Trending! Our demo, recommended by Hugging Face Gradio’s official account, is available [here](https://huggingface.co/spaces/openbmb/MiniCPM-Llama3-V-2_5). Come and try it out!
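For reference, below is a minimal streaming sketch in Python, following the pattern in the Hugging Face usage linked above. The `chat()` interface comes from the model's `trust_remote_code` implementation; the `stream` and `system_prompt` arguments are assumptions taken from that usage page and may differ across model revisions, and the image path is hypothetical.

```python
# Minimal sketch: streaming chat with MiniCPM-Llama3-V 2.5 (assumptions noted above).
import torch
from PIL import Image
from transformers import AutoModel, AutoTokenizer

model = AutoModel.from_pretrained(
    "openbmb/MiniCPM-Llama3-V-2_5",
    trust_remote_code=True,
    torch_dtype=torch.float16,
).to("cuda").eval()
tokenizer = AutoTokenizer.from_pretrained(
    "openbmb/MiniCPM-Llama3-V-2_5", trust_remote_code=True
)

image = Image.open("example.jpg").convert("RGB")  # hypothetical local image
msgs = [{"role": "user", "content": "Describe this image."}]

# With stream=True, chat() returns a generator of text chunks.
res = model.chat(
    image=image,
    msgs=msgs,
    tokenizer=tokenizer,
    sampling=True,
    temperature=0.7,
    stream=True,
    system_prompt="You are a helpful assistant.",  # assumed keyword for the customized system prompt
)
for chunk in res:
    print(chunk, end="", flush=True)
```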