Mirror of https://github.com/OpenBMB/MiniCPM-V.git, synced 2026-02-04 17:59:18 +08:00
Update README
@@ -126,6 +126,7 @@
- [Inference on Mac](#inference-on-mac)
- [Efficient Inference with llama.cpp, ollama, vLLM](#efficient-inference-with-llamacpp-ollama-vllm)
- [Fine-tuning](#fine-tuning)
- [Awesome work using MiniCPM-V \& MiniCPM-o](#awesome-work-using-minicpm-v--minicpm-o)
- [FAQs](#faqs)
- [Limitations](#limitations)
@@ -2563,6 +2564,9 @@ We now support MiniCPM-V series fine-tuning with the SWIFT framework. SWIFT supp
Best Practices: [MiniCPM-V 1.0](https://github.com/modelscope/swift/blob/main/docs/source/Multi-Modal/minicpm-v最佳实践.md), [MiniCPM-V 2.0](https://github.com/modelscope/swift/blob/main/docs/source/Multi-Modal/minicpm-v-2最佳实践.md), [MiniCPM-V 2.6](https://github.com/modelscope/ms-swift/issues/1613).
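A minimal fine-tuning sketch with the ms-swift CLI, assuming ms-swift 2.x-style flags; the model type `minicpm-v-v2_6-chat` and demo dataset `coco-en-mini` are illustrative placeholders, and exact flag and identifier names vary between SWIFT releases, so follow the best-practice docs above for the authoritative invocation.

```shell
# Illustrative sketch only: flag names follow ms-swift 2.x conventions and may
# differ in newer releases; see the linked best-practice docs for exact usage.
pip install ms-swift -U

# LoRA fine-tune MiniCPM-V 2.6 on a small demo dataset (identifiers assumed, not verified).
CUDA_VISIBLE_DEVICES=0 swift sft \
    --model_type minicpm-v-v2_6-chat \
    --sft_type lora \
    --dataset coco-en-mini \
    --num_train_epochs 1 \
    --output_dir output
```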
## Awesome work using MiniCPM-V & MiniCPM-o
## FAQs
Click here to view the [FAQs](./docs/faqs.md)
@@ -114,6 +114,7 @@
- [Inference on Mac](#mac-推理)
- [Efficient Inference with llama.cpp, ollama, vLLM](#基于-llamacppollamavllm-的高效推理)
- [Fine-tuning](#微调)
- [More Projects Based on MiniCPM-V \& MiniCPM-o](#基于-minicpm-v--minicpm-o-的更多项目)
- [FAQs](#faqs)
- [Model Limitations](#模型局限性)
@@ -2449,6 +2450,11 @@ pip install vllm
Reference docs: [MiniCPM-V 1.0](https://github.com/modelscope/swift/blob/main/docs/source/Multi-Modal/minicpm-v最佳实践.md), [MiniCPM-V 2.0](https://github.com/modelscope/swift/blob/main/docs/source/Multi-Modal/minicpm-v-2最佳实践.md), [MiniCPM-V 2.6](https://github.com/modelscope/ms-swift/issues/1613).
## More Projects Based on MiniCPM-V & MiniCPM-o
## FAQs
Click here to view the [FAQs](./docs/faqs.md)