From 663d96c887946a8d0e5b32b1e9ff3b9adfcda56c Mon Sep 17 00:00:00 2001
From: yiranyyu <2606375857@qq.com>
Date: Tue, 26 Aug 2025 18:31:16 +0800
Subject: [PATCH] update readme

---
 README.md                 |  3 ---
 README_zh.md              |  3 ---
 docs/minicpm_v2dot6_en.md |  8 ++++++++
 docs/minicpm_v2dot6_zh.md | 10 ++++++++++
 4 files changed, 18 insertions(+), 6 deletions(-)

diff --git a/README.md b/README.md
index 1eb3503..2843a55 100644
--- a/README.md
+++ b/README.md
@@ -1271,9 +1271,6 @@ Open `http://localhost:8000/` in browser and enjoy the vision mode chatbot.
 | MiniCPM-o 2.6| GPU | 18 GB | The latest version, achieving GPT-4o level performance for vision, speech and multimodal live streaming on end-side devices. | [🤗](https://huggingface.co/openbmb/MiniCPM-o-2_6)    [](https://modelscope.cn/models/OpenBMB/MiniCPM-o-2_6) |
 | MiniCPM-o 2.6 gguf | CPU | 8 GB | The gguf version, lower memory usage and faster inference. | [🤗](https://huggingface.co/openbmb/MiniCPM-o-2_6-gguf)    [](https://modelscope.cn/models/OpenBMB/MiniCPM-o-2_6-gguf) |
 | MiniCPM-o 2.6 int4 | GPU | 9 GB | The int4 quantized version, lower GPU memory usage. | [🤗](https://huggingface.co/openbmb/MiniCPM-o-2_6-int4)    [](https://modelscope.cn/models/OpenBMB/MiniCPM-o-2_6-int4) |
-| MiniCPM-V 2.6| GPU | 17 GB | Strong end-side multimodal performance for single image, multi-image and video understanding. | [🤗](https://huggingface.co/openbmb/MiniCPM-V-2_6)    [](https://modelscope.cn/models/OpenBMB/MiniCPM-V-2_6) |
-| MiniCPM-V 2.6 gguf | CPU | 6 GB | The gguf version, lower memory usage and faster inference. | [🤗](https://huggingface.co/openbmb/MiniCPM-V-2_6-gguf)    [](https://modelscope.cn/models/OpenBMB/MiniCPM-V-2_6-gguf) |
-| MiniCPM-V 2.6 int4 | GPU | 7 GB | The int4 quantized version, lower GPU memory usage. | [🤗](https://huggingface.co/openbmb/MiniCPM-V-2_6-int4)    [](https://modelscope.cn/models/OpenBMB/MiniCPM-V-2_6-int4) |
 
 ### Multi-turn Conversation
 
diff --git a/README_zh.md b/README_zh.md
index 5071726..56d1a14 100644
--- a/README_zh.md
+++ b/README_zh.md
@@ -1201,9 +1201,6 @@ python web_demos/minicpm-o_2.6/chatbot_web_demo_o2.6.py
 | MiniCPM-o 2.6| GPU | 18 GB | 最新版本，提供端侧 GPT-4o 级的视觉、语音、多模态流式交互能力。 | [🤗](https://huggingface.co/openbmb/MiniCPM-o-2_6)    [](https://modelscope.cn/models/OpenBMB/MiniCPM-o-2_6) |
 | MiniCPM-o 2.6 gguf | CPU | 8 GB | gguf 版本，更低的内存占用和更高的推理效率。 | [🤗](https://huggingface.co/openbmb/MiniCPM-o-2_6-gguf)    [](https://modelscope.cn/models/OpenBMB/MiniCPM-o-2_6-gguf) |
 | MiniCPM-o 2.6 int4 | GPU | 9 GB | int4量化版，更低显存占用。 | [🤗](https://huggingface.co/openbmb/MiniCPM-o-2_6-int4)    [](https://modelscope.cn/models/OpenBMB/MiniCPM-o-2_6-int4) |
-| MiniCPM-V 2.6| GPU | 17 GB | 提供出色的端侧单图、多图、视频理解能力。 | [🤗](https://huggingface.co/openbmb/MiniCPM-V-2_6)    [](https://modelscope.cn/models/OpenBMB/MiniCPM-V-2_6) |
-| MiniCPM-V 2.6 gguf | CPU | 6 GB | gguf 版本，更低的内存占用和更高的推理效率。 | [🤗](https://huggingface.co/openbmb/MiniCPM-V-2_6-gguf)    [](https://modelscope.cn/models/OpenBMB/MiniCPM-V-2_6-gguf) |
-| MiniCPM-V 2.6 int4 | GPU | 7 GB | int4量化版，更低显存占用。 | [🤗](https://huggingface.co/openbmb/MiniCPM-V-2_6-int4)    [](https://modelscope.cn/models/OpenBMB/MiniCPM-V-2_6-int4) |
 
 更多[历史版本模型](#legacy-models)
 
diff --git a/docs/minicpm_v2dot6_en.md b/docs/minicpm_v2dot6_en.md
index 9ef6dac..5e6cbd0 100644
--- a/docs/minicpm_v2dot6_en.md
+++ b/docs/minicpm_v2dot6_en.md
@@ -943,3 +943,11 @@ answer = model.chat(
 
 print(answer)
 ```
+
+### Model Zoo
+
+| Model | Device | Memory |          Description | Download |
+|:-----------|:--:|:-----------:|:-------------------|:---------------:|
+| MiniCPM-V 2.6| GPU | 17 GB | Strong end-side multimodal performance for single image, multi-image and video understanding. | [🤗](https://huggingface.co/openbmb/MiniCPM-V-2_6)    [](https://modelscope.cn/models/OpenBMB/MiniCPM-V-2_6) |
+| MiniCPM-V 2.6 gguf | CPU | 6 GB | The gguf version, lower memory usage and faster inference. | [🤗](https://huggingface.co/openbmb/MiniCPM-V-2_6-gguf)    [](https://modelscope.cn/models/OpenBMB/MiniCPM-V-2_6-gguf) |
+| MiniCPM-V 2.6 int4 | GPU | 7 GB | The int4 quantized version, lower GPU memory usage. | [🤗](https://huggingface.co/openbmb/MiniCPM-V-2_6-int4)    [](https://modelscope.cn/models/OpenBMB/MiniCPM-V-2_6-int4) |
diff --git a/docs/minicpm_v2dot6_zh.md b/docs/minicpm_v2dot6_zh.md
index 5b58aa3..44fbd5d 100644
--- a/docs/minicpm_v2dot6_zh.md
+++ b/docs/minicpm_v2dot6_zh.md
@@ -761,3 +761,13 @@
 
 
 
+
+
+
+### 模型库
+
+| 模型 | 设备 | 资源 |          简介 | 下载链接 |
+|:--------------|:-:|:----------:|:-------------------|:---------------:|
+| MiniCPM-V 2.6| GPU | 17 GB | 提供出色的端侧单图、多图、视频理解能力。 | [🤗](https://huggingface.co/openbmb/MiniCPM-V-2_6)    [](https://modelscope.cn/models/OpenBMB/MiniCPM-V-2_6) |
+| MiniCPM-V 2.6 gguf | CPU | 6 GB | gguf 版本，更低的内存占用和更高的推理效率。 | [🤗](https://huggingface.co/openbmb/MiniCPM-V-2_6-gguf)    [](https://modelscope.cn/models/OpenBMB/MiniCPM-V-2_6-gguf) |
+| MiniCPM-V 2.6 int4 | GPU | 7 GB | int4量化版，更低显存占用。 | [🤗](https://huggingface.co/openbmb/MiniCPM-V-2_6-int4)    [](https://modelscope.cn/models/OpenBMB/MiniCPM-V-2_6-int4) |
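As a point of reference for the model zoo rows this patch moves, below is a minimal sketch (not part of the patch) of single-image inference with the MiniCPM-V 2.6 checkpoint, following the transformers-based `model.chat` interface that docs/minicpm_v2dot6_en.md documents. The image path, question, and loading arguments are placeholders and may differ from the exact snippets in that file.

```python
# Minimal sketch (not part of the patch): single-image chat with MiniCPM-V 2.6,
# assuming the transformers remote-code interface described in docs/minicpm_v2dot6_en.md.
import torch
from PIL import Image
from transformers import AutoModel, AutoTokenizer

model_id = "openbmb/MiniCPM-V-2_6"  # see the Model Zoo table for the gguf/int4 variants
model = AutoModel.from_pretrained(
    model_id,
    trust_remote_code=True,
    torch_dtype=torch.bfloat16,  # placeholder; choose a dtype your GPU supports
).eval().cuda()
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)

image = Image.open("example.jpg").convert("RGB")  # placeholder image path
msgs = [{"role": "user", "content": [image, "What is in the image?"]}]

# chat() returns the generated answer as a string, as in the docs' examples.
answer = model.chat(image=None, msgs=msgs, tokenizer=tokenizer)
print(answer)
```

The gguf entry in the same table is a llama.cpp-format build intended for CPU inference, so it is not loaded through this transformers interface.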