Update README.md
README.md CHANGED
@@ -108,9 +108,9 @@ print(result)
 ```
 The punctuator will be applied to the `text/*` features, e.g.:
 ```
-
-
-
+'text/SPEAKER_00': '水をマレーシアから買わなくてはならないのです。'
+'text/SPEAKER_01': 'これも先ほどがずっと言っている。自分の感覚的には大丈夫です。けれども。今は屋外の気温、昼も夜も上がってますので、空気の入れ替えだけではかえって人が上がってきます。'
+'text/SPEAKER_02': '愚直にその街の良さをアピールしていくという。そういう姿勢が基本にあった上での、こういうPR作戦だと思うんですよね。'
 ```

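The per-speaker transcripts can be read straight out of the returned dictionary. A minimal sketch, assuming `result` is the output of the `pipe(...)` call above and that the diarized text sits under `text/SPEAKER_XX` keys as in the example output (the key layout is inferred from that example):

```python
# Minimal sketch: collect the per-speaker transcripts from the result dict.
# Assumes `result` is the dictionary returned by `pipe(...)` above and that
# each diarized transcript is stored under a "text/SPEAKER_XX" key, as shown
# in the example output.
for key, transcript in sorted(result.items()):
    if key.startswith("text/"):
        speaker = key.split("/", 1)[1]  # e.g. "SPEAKER_00"
        print(f"{speaker}: {transcript}")
```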
 - To control the number of speakers (see [here](https://huggingface.co/pyannote/speaker-diarization-3.1#controlling-the-number-of-speakers)):
@@ -124,6 +124,12 @@ or
 + result = pipe("sample_diarization_japanese.mp3", min_speakers=2, max_speakers=5)
 ```

+- Adding silence before/after the audio sometimes improves the transcription quality:
+```diff
+- result = pipe("sample_diarization_japanese.mp3")
++ result = pipe("sample_diarization_japanese.mp3", add_silence_end=0.5, add_silence_start=0.5)  # add 0.5 sec of silence before/after the audio
+```
+
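The two options above can also be combined in a single call. A sketch, assuming the `pipe` object from the earlier snippet and using only the parameters documented above:

```python
# Sketch combining the options documented above (assumes `pipe` from the
# earlier snippet): bound the speaker count and pad the audio with silence.
result = pipe(
    "sample_diarization_japanese.mp3",
    min_speakers=2,
    max_speakers=5,
    add_silence_start=0.5,  # 0.5 sec of silence prepended to the audio
    add_silence_end=0.5,    # 0.5 sec of silence appended to the audio
)
print(result)
```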
 ### Flash Attention 2
 We recommend using [Flash-Attention 2](https://huggingface.co/docs/transformers/main/en/perf_infer_gpu_one#flashattention-2)
 if your GPU allows for it. To do so, you first need to install [Flash Attention](https://github.com/Dao-AILab/flash-attention):
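A sketch of the usual setup, not taken verbatim from this README: install `flash-attn` (the command below is the one documented by the Flash Attention project), then request the Flash Attention 2 implementation through `model_kwargs` when building the pipeline. The model id here is a placeholder for the one used earlier in this README.

```python
# Sketch only: how Flash Attention 2 is typically enabled for a transformers
# pipeline. Install flash-attn first:
#
#   pip install flash-attn --no-build-isolation
#
from transformers import pipeline

pipe = pipeline(
    model="<model-id-used-earlier-in-this-README>",  # placeholder, see the README's setup section
    model_kwargs={"attn_implementation": "flash_attention_2"},
)
```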