Update README.md
@@ -4,6 +4,8 @@
<details>
<summary>Updates:</summary>
+22 Nov 2024: Updated to support large-v3-turbo
30 Sept 2024: Removed webui
5 Sept 2024: Fixed Emby response to a test message/notification. Clarified Emby/Plex/Jellyfin instructions for paths.
@@ -175,7 +177,7 @@ The following environment variables are available in Docker. They will default
| Variable | Default Value | Description |
|---------------------------|------------------------|-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| TRANSCRIBE_DEVICE | 'cpu' | Transcribes on GPU (CUDA only) or CPU. Accepts "cpu", "gpu", or "cuda". |
-| WHISPER_MODEL | 'medium' | Can be:'tiny', 'tiny.en', 'base', 'base.en', 'small', 'small.en', 'medium', 'medium.en', 'large-v1','large-v2', 'large-v3', 'large', 'distil-large-v2', 'distil-large-v3', 'distil-medium.en', 'distil-small.en' |
+| WHISPER_MODEL | 'medium' | Can be: 'tiny', 'tiny.en', 'base', 'base.en', 'small', 'small.en', 'medium', 'medium.en', 'large-v1', 'large-v2', 'large-v3', 'large', 'distil-large-v2', 'distil-large-v3', 'distil-medium.en', 'distil-small.en', 'large-v3-turbo' |
| CONCURRENT_TRANSCRIPTIONS | 2 | Number of files it will transcribe in parallel |
| WHISPER_THREADS | 4 | Number of threads to use during computation |
| MODEL_PATH | './models' | Where the WHISPER_MODEL is stored. Defaults to a 'models' folder in the directory where the script is run. |
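
A minimal compose-style sketch of how these variables might be set when running the container; the service name, image tag, and volume paths below are illustrative assumptions, not taken from this README:

```yaml
services:
  transcriber:
    image: your-transcription-image:latest   # placeholder; use the project's actual image
    environment:
      - TRANSCRIBE_DEVICE=gpu            # or "cpu" / "cuda"
      - WHISPER_MODEL=large-v3-turbo     # model added by this commit; any value from the table works
      - CONCURRENT_TRANSCRIPTIONS=2
      - WHISPER_THREADS=4
      - MODEL_PATH=/models               # illustrative container path
    volumes:
      - ./models:/models                 # persist downloaded models on the host
```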