diff --git a/README.md b/README.md
index a186983..b06c3d7 100644
--- a/README.md
+++ b/README.md
@@ -1268,6 +1268,7 @@ Open `http://localhost:8000/` in browser and enjoy the vision mode chatbot.
 | MiniCPM-V 4.5| GPU | 18 GB | The latest version, strong end-side multimodal performance for single image, multi-image and video understanding. | [πŸ€—](https://huggingface.co/openbmb/MiniCPM-V-4_5)&nbsp;&nbsp;&nbsp;&nbsp;[](https://modelscope.cn/models/OpenBMB/MiniCPM-V-4_5) |
 | MiniCPM-V 4.5 gguf | CPU | 8 GB | The gguf version, lower memory usage and faster inference. | [πŸ€—](https://huggingface.co/openbmb/MiniCPM-V-4_5-gguf)&nbsp;&nbsp;&nbsp;&nbsp;[](https://modelscope.cn/models/OpenBMB/MiniCPM-V-4_5-gguf) |
 | MiniCPM-V 4.5 int4 | GPU | 9 GB | The int4 quantized version, lower GPU memory usage. | [πŸ€—](https://huggingface.co/openbmb/MiniCPM-V-4_5-int4)&nbsp;&nbsp;&nbsp;&nbsp;[](https://modelscope.cn/models/OpenBMB/MiniCPM-V-4_5-int4) |
+| MiniCPM-V 4.5 AWQ | GPU | 9 GB | The AWQ quantized version, lower GPU memory usage. | [πŸ€—](https://huggingface.co/openbmb/MiniCPM-V-4_5-AWQ)&nbsp;&nbsp;&nbsp;&nbsp;[](https://modelscope.cn/models/OpenBMB/MiniCPM-V-4_5-AWQ) |
 | MiniCPM-o 2.6| GPU | 18 GB | The latest version, achieving GPT-4o level performance for vision, speech and multimodal live streaming on end-side devices. | [πŸ€—](https://huggingface.co/openbmb/MiniCPM-o-2_6)&nbsp;&nbsp;&nbsp;&nbsp;[](https://modelscope.cn/models/OpenBMB/MiniCPM-o-2_6) |
 | MiniCPM-o 2.6 gguf | CPU | 8 GB | The gguf version, lower memory usage and faster inference. | [πŸ€—](https://huggingface.co/openbmb/MiniCPM-o-2_6-gguf)&nbsp;&nbsp;&nbsp;&nbsp;[](https://modelscope.cn/models/OpenBMB/MiniCPM-o-2_6-gguf) |
 | MiniCPM-o 2.6 int4 | GPU | 9 GB | The int4 quantized version, lower GPU memory usage. | [πŸ€—](https://huggingface.co/openbmb/MiniCPM-o-2_6-int4)&nbsp;&nbsp;&nbsp;&nbsp;[](https://modelscope.cn/models/OpenBMB/MiniCPM-o-2_6-int4) |
diff --git a/README_zh.md b/README_zh.md
index 3530b61..7540562 100644
--- a/README_zh.md
+++ b/README_zh.md
@@ -1198,6 +1198,7 @@ python web_demos/minicpm-o_2.6/chatbot_web_demo_o2.6.py
 | MiniCPM-V 4.5| GPU | 18 GB | ζδΎ›ε‡Ίθ‰²ηš„η«―δΎ§ε•ε›Ύγ€ε€šε›Ύγ€θ§†ι’‘η†θ§£θƒ½εŠ›γ€‚ | [πŸ€—](https://huggingface.co/openbmb/MiniCPM-V-4_5)&nbsp;&nbsp;&nbsp;&nbsp;[](https://modelscope.cn/models/OpenBMB/MiniCPM-V-4_5) |
 | MiniCPM-V 4.5 gguf | CPU | 8 GB | gguf η‰ˆζœ¬οΌŒζ›΄δ½Žηš„ε†…ε­˜ε η”¨ε’Œζ›΄ι«˜ηš„ζŽ¨η†ζ•ˆηŽ‡γ€‚ | [πŸ€—](https://huggingface.co/openbmb/MiniCPM-V-4_5-gguf)&nbsp;&nbsp;&nbsp;&nbsp;[](https://modelscope.cn/models/OpenBMB/MiniCPM-V-4_5-gguf) |
 | MiniCPM-V 4.5 int4 | GPU | 9 GB | int4ι‡εŒ–η‰ˆοΌŒζ›΄δ½Žζ˜Ύε­˜ε η”¨ | [πŸ€—](https://huggingface.co/openbmb/MiniCPM-V-4_5-int4)&nbsp;&nbsp;&nbsp;&nbsp;[](https://modelscope.cn/models/OpenBMB/MiniCPM-V-4_5-int4) |
+| MiniCPM-V 4.5 AWQ | GPU | 9 GB | AWQι‡εŒ–η‰ˆοΌŒζ›΄δ½Žζ˜Ύε­˜ε η”¨ | [πŸ€—](https://huggingface.co/openbmb/MiniCPM-V-4_5-AWQ)&nbsp;&nbsp;&nbsp;&nbsp;[](https://modelscope.cn/models/OpenBMB/MiniCPM-V-4_5-AWQ) |
 | MiniCPM-o 2.6| GPU | 18 GB | ζœ€ζ–°η‰ˆζœ¬οΌŒζδΎ›η«―δΎ§ GPT-4o ηΊ§ηš„θ§†θ§‰γ€θ―­ιŸ³γ€ε€šζ¨‘ζ€ζ΅εΌδΊ€δΊ’θƒ½εŠ›γ€‚ | [πŸ€—](https://huggingface.co/openbmb/MiniCPM-o-2_6)&nbsp;&nbsp;&nbsp;&nbsp;[](https://modelscope.cn/models/OpenBMB/MiniCPM-o-2_6) |
 | MiniCPM-o 2.6 gguf | CPU | 8 GB | gguf η‰ˆζœ¬οΌŒζ›΄δ½Žηš„ε†…ε­˜ε η”¨ε’Œζ›΄ι«˜ηš„ζŽ¨η†ζ•ˆηŽ‡γ€‚ | [πŸ€—](https://huggingface.co/openbmb/MiniCPM-o-2_6-gguf)&nbsp;&nbsp;&nbsp;&nbsp;[](https://modelscope.cn/models/OpenBMB/MiniCPM-o-2_6-gguf) |
 | MiniCPM-o 2.6 int4 | GPU | 9 GB | int4ι‡εŒ–η‰ˆοΌŒζ›΄δ½Žζ˜Ύε­˜ε η”¨γ€‚ | [πŸ€—](https://huggingface.co/openbmb/MiniCPM-o-2_6-int4)&nbsp;&nbsp;&nbsp;&nbsp;[](https://modelscope.cn/models/OpenBMB/MiniCPM-o-2_6-int4) |
diff --git a/assets/minicpmv4_5/MiniCPM-V 4.5-8.26_img.jpeg b/assets/minicpmv4_5/MiniCPM-V 4.5-8.26_img.jpeg
index e67c808..1c94b49 100644
Binary files a/assets/minicpmv4_5/MiniCPM-V 4.5-8.26_img.jpeg and b/assets/minicpmv4_5/MiniCPM-V 4.5-8.26_img.jpeg differ