diff --git a/README_en.md b/README_en.md
index 3801cd1..446b92e 100644
--- a/README_en.md
+++ b/README_en.md
@@ -1504,7 +1504,7 @@ PYTORCH_ENABLE_MPS_FALLBACK=1 python test.py
 
 ### Deployment on Mobile Phone
 
-MiniCPM-Llama3-V 2.5 and MiniCPM-V 2.0 can be deployed on mobile phones with Android operating systems. 🚀 Click [MiniCPM-Llama3-V 2.5](http://minicpm.modelbest.cn/android/modelbest-release-20240528_182155.apk) / [MiniCPM-V 2.0](https://github.com/OpenBMB/mlc-MiniCPM) to install apk.
+MiniCPM-V 2.0 can be deployed on mobile phones with Android operating systems. 🚀 Click [MiniCPM-V 2.0](https://github.com/OpenBMB/mlc-MiniCPM) to install apk.
 
 ### Inference with llama.cpp
 MiniCPM-V 2.6 can run with llama.cpp now! See [our fork of llama.cpp](https://github.com/OpenBMB/llama.cpp/tree/minicpmv-main/examples/llava/README-minicpmv2.6.md) for more detail. This implementation supports smooth inference of 16~18 token/s on iPad (test environment:iPad Pro + M4).
@@ -1515,9 +1515,9 @@ MiniCPM-V 2.6 can run with ollama now! See [our fork of ollama](https://github.c
 
 ### Inference with vLLM
- vLLM now officially supports MiniCPM-V 2.0, MiniCPM-Llama3-V 2.5 and MiniCPM-V 2.6, Click to see.
+ vLLM now officially supports MiniCPM-V 2.6, MiniCPM-Llama3-V 2.5 and MiniCPM-V 2.0, Click to see.
 
-1. Install vLLM(==0.5.4):
+1. Install vLLM(>=0.5.4):
 ```shell
 pip install vllm
 ```
@@ -1681,10 +1681,10 @@ This project is developed by the following institutions:
 
 If you find our model/code/paper helpful, please consider cite our papers 📝 and star us ⭐️!
 
 ```bib
-@article{yao2024minicpmv,
-  title={MiniCPM-V: A GPT-4V Level MLLM on Your Phone},
-  author={Yao, Yuan and Yu, Tianyu and Zhang, Ao and Wang, Chongyi and Cui, Junbo and Zhu, Hongji and Cai, Tianchi and Li, Haoyu and Zhao, Weilin and He, Zhihui and Chen, Qianyu and Zhou, Huarong and Zou, Zhensheng and Zhang, Haoye and Hu, Shengding and Zheng, Zhi and Zhou, Jie and Cai, Jie and Han, Xu and Zeng, Guoyang and Li, Dahai and Liu, Zhiyuan and Sun, Maosong},
-  journal={arXiv preprint 2408.01800},
-  year={2024},
+@article{yao2024minicpm,
+  title={MiniCPM-V: A GPT-4V Level MLLM on Your Phone},
+  author={Yao, Yuan and Yu, Tianyu and Zhang, Ao and Wang, Chongyi and Cui, Junbo and Zhu, Hongji and Cai, Tianchi and Li, Haoyu and Zhao, Weilin and He, Zhihui and others},
+  journal={arXiv preprint arXiv:2408.01800},
+  year={2024}
 }
 ```
diff --git a/README_zh.md b/README_zh.md
index 20e6708..7d2125d 100644
--- a/README_zh.md
+++ b/README_zh.md
@@ -1704,10 +1704,10 @@ print(outputs[0].outputs[0].text)
 
 如果您觉得我们模型/代码/论文有帮助，请给我们 ⭐ 和 引用 📝，感谢！
 
 ```bib
-@article{yao2024minicpmv,
-  title={MiniCPM-V: A GPT-4V Level MLLM on Your Phone},
-  author={Yao, Yuan and Yu, Tianyu and Zhang, Ao and Wang, Chongyi and Cui, Junbo and Zhu, Hongji and Cai, Tianchi and Li, Haoyu and Zhao, Weilin and He, Zhihui and Chen, Qianyu and Zhou, Huarong and Zou, Zhensheng and Zhang, Haoye and Hu, Shengding and Zheng, Zhi and Zhou, Jie and Cai, Jie and Han, Xu and Zeng, Guoyang and Li, Dahai and Liu, Zhiyuan and Sun, Maosong},
-  journal={arXiv preprint 2408.01800},
-  year={2024},
+@article{yao2024minicpm,
+  title={MiniCPM-V: A GPT-4V Level MLLM on Your Phone},
+  author={Yao, Yuan and Yu, Tianyu and Zhang, Ao and Wang, Chongyi and Cui, Junbo and Zhu, Hongji and Cai, Tianchi and Li, Haoyu and Zhao, Weilin and He, Zhihui and others},
+  journal={arXiv preprint arXiv:2408.01800},
+  year={2024}
 }
 ```