diff --git a/README.md b/README.md
index a186983..b06c3d7 100644
--- a/README.md
+++ b/README.md
@@ -1268,6 +1268,7 @@ Open `http://localhost:8000/` in browser and enjoy the vision mode chatbot.
 | MiniCPM-V 4.5 | GPU | 18 GB | The latest version, strong end-side multimodal performance for single image, multi-image and video understanding. | [🤗](https://huggingface.co/openbmb/MiniCPM-V-4_5) [🤖](https://modelscope.cn/models/OpenBMB/MiniCPM-V-4_5) |
 | MiniCPM-V 4.5 gguf | CPU | 8 GB | The gguf version, lower memory usage and faster inference. | [🤗](https://huggingface.co/openbmb/MiniCPM-V-4_5-gguf) [🤖](https://modelscope.cn/models/OpenBMB/MiniCPM-V-4_5-gguf) |
 | MiniCPM-V 4.5 int4 | GPU | 9 GB | The int4 quantized version, lower GPU memory usage. | [🤗](https://huggingface.co/openbmb/MiniCPM-V-4_5-int4) [🤖](https://modelscope.cn/models/OpenBMB/MiniCPM-V-4_5-int4) |
+| MiniCPM-V 4.5 AWQ | GPU | 9 GB | The AWQ int4 quantized version, lower GPU memory usage. | [🤗](https://huggingface.co/openbmb/MiniCPM-V-4_5-AWQ) [🤖](https://modelscope.cn/models/OpenBMB/MiniCPM-V-4_5-AWQ) |
 | MiniCPM-o 2.6 | GPU | 18 GB | The latest version, achieving GPT-4o level performance for vision, speech and multimodal live streaming on end-side devices. | [🤗](https://huggingface.co/openbmb/MiniCPM-o-2_6) [🤖](https://modelscope.cn/models/OpenBMB/MiniCPM-o-2_6) |
 | MiniCPM-o 2.6 gguf | CPU | 8 GB | The gguf version, lower memory usage and faster inference. | [🤗](https://huggingface.co/openbmb/MiniCPM-o-2_6-gguf) [🤖](https://modelscope.cn/models/OpenBMB/MiniCPM-o-2_6-gguf) |
 | MiniCPM-o 2.6 int4 | GPU | 9 GB | The int4 quantized version, lower GPU memory usage. | [🤗](https://huggingface.co/openbmb/MiniCPM-o-2_6-int4) [🤖](https://modelscope.cn/models/OpenBMB/MiniCPM-o-2_6-int4) |
diff --git a/README_zh.md b/README_zh.md
index 3530b61..7540562 100644
--- a/README_zh.md
+++ b/README_zh.md
@@ -1198,6 +1198,7 @@ python web_demos/minicpm-o_2.6/chatbot_web_demo_o2.6.py
 | MiniCPM-V 4.5 | GPU | 18 GB | Delivers outstanding end-side single-image, multi-image and video understanding. | [🤗](https://huggingface.co/openbmb/MiniCPM-V-4_5) [🤖](https://modelscope.cn/models/OpenBMB/MiniCPM-V-4_5) |
 | MiniCPM-V 4.5 gguf | CPU | 8 GB | The gguf version, lower memory usage and faster inference. | [🤗](https://huggingface.co/openbmb/MiniCPM-V-4_5-gguf) [🤖](https://modelscope.cn/models/OpenBMB/MiniCPM-V-4_5-gguf) |
 | MiniCPM-V 4.5 int4 | GPU | 9 GB | The int4 quantized version, lower GPU memory usage. | [🤗](https://huggingface.co/openbmb/MiniCPM-V-4_5-int4) [🤖](https://modelscope.cn/models/OpenBMB/MiniCPM-V-4_5-int4) |
+| MiniCPM-V 4.5 AWQ | GPU | 9 GB | The AWQ int4 quantized version, lower GPU memory usage. | [🤗](https://huggingface.co/openbmb/MiniCPM-V-4_5-AWQ) [🤖](https://modelscope.cn/models/OpenBMB/MiniCPM-V-4_5-AWQ) |
 | MiniCPM-o 2.6 | GPU | 18 GB | The latest version, providing GPT-4o-level vision, speech and multimodal streaming interaction on end-side devices. | [🤗](https://huggingface.co/openbmb/MiniCPM-o-2_6) [🤖](https://modelscope.cn/models/OpenBMB/MiniCPM-o-2_6) |
 | MiniCPM-o 2.6 gguf | CPU | 8 GB | The gguf version, lower memory usage and faster inference. | [🤗](https://huggingface.co/openbmb/MiniCPM-o-2_6-gguf) [🤖](https://modelscope.cn/models/OpenBMB/MiniCPM-o-2_6-gguf) |
 | MiniCPM-o 2.6 int4 | GPU | 9 GB | The int4 quantized version, lower GPU memory usage. | [🤗](https://huggingface.co/openbmb/MiniCPM-o-2_6-int4) [🤖](https://modelscope.cn/models/OpenBMB/MiniCPM-o-2_6-int4) |
diff --git a/assets/minicpmv4_5/MiniCPM-V 4.5-8.26_img.jpeg b/assets/minicpmv4_5/MiniCPM-V 4.5-8.26_img.jpeg
index e67c808..1c94b49 100644
Binary files a/assets/minicpmv4_5/MiniCPM-V 4.5-8.26_img.jpeg and b/assets/minicpmv4_5/MiniCPM-V 4.5-8.26_img.jpeg differ
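
For reference, a minimal sketch (not part of this patch) of how the newly listed `openbmb/MiniCPM-V-4_5-AWQ` checkpoint could be used. It assumes the AWQ build loads through the same `AutoModel` / `chat()` interface as the other MiniCPM-V checkpoints in the table above; the exact loading path for the AWQ weights may differ.

```python
# Sketch only: load the AWQ checkpoint listed in the table above, assuming it
# follows the standard MiniCPM-V usage pattern (AutoModel + trust_remote_code).
import torch
from PIL import Image
from transformers import AutoModel, AutoTokenizer

model_id = "openbmb/MiniCPM-V-4_5-AWQ"  # repo id from the added table row

model = AutoModel.from_pretrained(
    model_id,
    trust_remote_code=True,      # MiniCPM-V ships custom modeling code
    torch_dtype=torch.float16,   # assumption: fp16 activations over int4 AWQ weights
).eval().cuda()
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)

# Single-image chat, following the msgs format used elsewhere in this README.
image = Image.open("example.jpg").convert("RGB")  # hypothetical local image
msgs = [{"role": "user", "content": [image, "Describe this image."]}]

answer = model.chat(image=None, msgs=msgs, tokenizer=tokenizer)
print(answer)
```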