diff --git a/README_en.md b/README_en.md
index 78ba6ca..8426c3a 100644
--- a/README_en.md
+++ b/README_en.md
@@ -608,7 +608,7 @@ MiniCPM-Llama3-V 2.5 can run with llama.cpp now! See our fork of [llama.cpp](htt
### Inference with vLLM
-Click to see how to inference with vLLM
+Click to see how to run inference with MiniCPM-V 2.0 using vLLM (MiniCPM-Llama3-V 2.5 coming soon)
Because our pull request to vLLM is still under review, we forked the repository to build and test our vLLM demo. Here are the steps:
1. Clone our version of vLLM: