Merge pull request #71 from HwwwwwwwH/main
Add description of vLLM support in README & README_en
README.md (+28)
@@ -27,6 +27,7 @@

## Changelog <!-- omit in toc -->

* [2024.04.23] We have added support for [vLLM](#vllm). Try it out!
* [2024.04.18] We added a MiniCPM-V 2.0 [demo](https://huggingface.co/spaces/openbmb/MiniCPM-V-2) on HuggingFace Space. Try it out!
* [2024.04.17] MiniCPM-V 2.0 now supports local [WebUI Demo](#本地webui-demo部署) deployment. Try it out!
* [2024.04.15] MiniCPM-V 2.0 can now be [fine-tuned](https://github.com/modelscope/swift/blob/main/docs/source/Multi-Modal/minicpm-v-2最佳实践.md) via the SWIFT framework, with streaming output supported!
@@ -628,6 +629,33 @@ PYTORCH_ENABLE_MPS_FALLBACK=1 python web_demo.py --device mps --dtype fp16
```
</details>

### vLLM Deployment <a id='vllm'></a>

<details>
<summary>Click to see how to deploy and run with vLLM</summary>

Since our PR to vLLM is still under review, we have forked the vLLM repository for testing in the meantime. Here are the steps:

1. First, clone our fork of vLLM:
```shell
git clone https://github.com/OpenBMB/vllm.git
```
2. Install vLLM:
```shell
cd vllm
pip install -e .
```
3. Install timm:
```shell
pip install timm==0.9.10
```
4. Run the example program (a minimal inference sketch follows these steps):
```shell
python examples/minicpmv_example.py
```
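
For reference, here is a minimal sketch of what offline inference through the fork might look like, assuming it keeps vLLM's standard `LLM`/`SamplingParams` API. The model identifier and prompt below are placeholders, and the image-input format for MiniCPM-V is fork-specific, so treat `examples/minicpmv_example.py` as the authoritative version:

```python
# Minimal sketch, not the official example. Assumes the fork exposes
# vLLM's standard offline-inference API (vllm.LLM / SamplingParams).
from vllm import LLM, SamplingParams

# Placeholder model identifier; point this at your MiniCPM-V 2.0 weights.
llm = LLM(model="openbmb/MiniCPM-V-2", trust_remote_code=True)

params = SamplingParams(temperature=0.7, top_p=0.8, max_tokens=256)

# Text-only call for illustration; image inputs follow the format
# defined in the fork's examples/minicpmv_example.py.
outputs = llm.generate(["Describe what MiniCPM-V 2.0 can do."], params)
for out in outputs:
    print(out.outputs[0].text)
```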

</details>

## Fine-tuning

### MiniCPM-V <!-- omit in toc -->

README_en.md (+25)
@@ -25,6 +25,7 @@

## News <!-- omit in toc -->

* [2024.04.23] MiniCPM-V-2.0 supports vLLM now! Click [here](#vllm) to view more details.
* [2024.04.18] We created a HuggingFace Space to host the demo of MiniCPM-V 2.0 [here](https://huggingface.co/spaces/openbmb/MiniCPM-V-2)!
* [2024.04.17] MiniCPM-V-2.0 now supports deploying a [WebUI Demo](#webui-demo)!
* [2024.04.15] MiniCPM-V-2.0 now also supports [fine-tuning](https://github.com/modelscope/swift/blob/main/docs/source/Multi-Modal/minicpm-v-2最佳实践.md) with the SWIFT framework!

@@ -620,6 +621,30 @@ PYTORCH_ENABLE_MPS_FALLBACK=1 python web_demo.py --device mps --dtype fp16
```
</details>

### Inference with vLLM <a id="vllm"></a>

<details>
<summary>Click to see how to run inference with vLLM</summary>

Because our pull request to vLLM is still under review, we forked the repository to build and test our vLLM demo. Here are the steps:

1. Clone our fork of vLLM:
```shell
git clone https://github.com/OpenBMB/vllm.git
```
2. Install vLLM:
```shell
cd vllm
pip install -e .
```
3. Install timm:
```shell
pip install timm==0.9.10
```
4. Run our demo (an optional install sanity check is sketched after these steps):
```shell
python examples/minicpmv_example.py
```
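
Before running the demo, you can optionally verify that the editable vLLM install and the pinned timm version took effect. This check is our suggestion rather than part of the official steps; both packages expose `__version__`:

```python
# Optional sanity check (not part of the official steps): confirm the
# forked vLLM and the pinned timm version are importable.
import timm
import vllm

print("vllm:", vllm.__version__)
print("timm:", timm.__version__)  # expected: 0.9.10
assert timm.__version__ == "0.9.10", "reinstall with: pip install timm==0.9.10"
```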
</details>

## Finetune