mirror of https://github.com/shivammehta25/Matcha-TTS.git
synced 2026-02-05 18:29:19 +08:00
Merge branch 'ONNX_BRANCH' into dev
README.md — 20 lines changed
@@ -36,7 +36,6 @@ Check out our [demo page](https://shivammehta25.github.io/Matcha-TTS) and read [

[](https://youtu.be/xmvJkz3bqw0)

## Installation

1. Create an environment (suggested but optional)
@@ -191,11 +190,19 @@ matcha-tts --text "<INPUT TEXT>" --checkpoint_path <PATH TO CHECKPOINT>

## ONNX support

> Special thanks to @mush42 for implementing ONNX export and inference support.

It is possible to export Matcha checkpoints to [ONNX](https://onnx.ai/), and run inference on the exported ONNX graph.

### ONNX export

To export a checkpoint to ONNX, first install ONNX with

```bash
pip install onnx
```

then run the following:

```bash
python3 -m matcha.onnx.export matcha.ckpt model.onnx --n-timesteps 5
```
@@ -209,7 +216,14 @@ Optionally, the ONNX exporter accepts **vocoder-name** and **vocoder-checkpoint**

### ONNX Inference

To run inference on the exported model, first install `onnxruntime` using

```bash
pip install onnxruntime
pip install onnxruntime-gpu  # for GPU inference
```

then use the following:

```bash
python3 -m matcha.onnx.infer model.onnx --text "hey" --output-dir ./outputs
```