From d3e35ba1cedae125f0e79c67b52446f4e195980f Mon Sep 17 00:00:00 2001
From: McCloudS <64094529+McCloudS@users.noreply.github.com>
Date: Thu, 26 Oct 2023 09:35:11 -0600
Subject: [PATCH] Update README.md

---
 README.md | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 3d56019..daafe48 100644
--- a/README.md
+++ b/README.md
@@ -96,7 +96,8 @@ The following environment variables are available in Docker. They will default
 | Variable | Default Value | Description |
 |---------------------------|------------------------|-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
 | TRANSCRIBE_DEVICE | 'cpu' | Can transcribe via gpu (Cuda only) or cpu. Takes option of "cpu", "gpu", "cuda". You must be running a cuda dockerfile to use the cuda/gpu options without failing. |
-| WHISPER_MODEL | 'medium' | this can be tiny, base, small, medium, large |
+| WHISPER_MODEL | 'medium' | Can be: 'tiny', 'tiny.en', 'base', 'base.en', 'small', 'small.en', 'medium', 'medium.en', 'large-v1',
+ 'large-v2', or 'large' |
 | CONCURRENT_TRANSCRIPTIONS | 2 | Number of files it will transcribe in parallel |
 | WHISPER_THREADS | 4 | number of threads to use during computation |
 | MODEL_PATH | '.' | This is where the WHISPER_MODEL will be stored. This defaults to placing it where you execute the script |
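The environment variables documented in the table this patch touches could be wired up in a compose file roughly like the sketch below. The service name, image name, and volume paths are illustrative assumptions and do not come from the patch; only the variable names and defaults are taken from the README table.

```yaml
# Sketch of a docker-compose service using the variables from the table above.
# Image name and paths are assumptions, not from the patch.
services:
  subgen:
    image: mccloud/subgen              # assumed image name
    environment:
      - TRANSCRIBE_DEVICE=cpu          # "cpu", "gpu", or "cuda" (cuda image required for gpu/cuda)
      - WHISPER_MODEL=medium           # e.g. tiny, base.en, small, medium.en, large-v2
      - CONCURRENT_TRANSCRIPTIONS=2    # files transcribed in parallel
      - WHISPER_THREADS=4              # threads used during computation
      - MODEL_PATH=/models             # where WHISPER_MODEL is downloaded and stored
    volumes:
      - ./models:/models               # persist the model across container restarts
```

Mounting MODEL_PATH to a host directory avoids re-downloading the model each time the container is recreated.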