From fb37f1c816938c466b92092c1257ddf7cd29d300 Mon Sep 17 00:00:00 2001
From: McCloudS <64094529+McCloudS@users.noreply.github.com>
Date: Thu, 8 Feb 2024 08:39:58 -0700
Subject: [PATCH] Update README.md

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 64b614b..58d65c2 100644
--- a/README.md
+++ b/README.md
@@ -117,7 +117,7 @@ You can define the port via environment variables, but the endpoints are static.
 The following environment variables are available in Docker. They will default to the values listed below.
 | Variable                  | Default Value          | Description |
 |---------------------------|------------------------|-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
-| TRANSCRIBE_DEVICE         | 'cpu'                  | Can transcribe via gpu (Cuda only) or cpu. Takes option of "cpu", "gpu", "cuda". You must be running a cuda dockerfile to use the cuda/gpu options without failing. |
+| TRANSCRIBE_DEVICE         | 'cpu'                  | Can transcribe via gpu (Cuda only) or cpu. Takes option of "cpu", "gpu", "cuda". |
 | WHISPER_MODEL             | 'medium'               | Can be:'tiny', 'tiny.en', 'base', 'base.en', 'small', 'small.en', 'medium', 'medium.en', 'large-v1','large-v2', or 'large' |
 | CONCURRENT_TRANSCRIPTIONS | 2                      | Number of files it will transcribe in parallel |
 | WHISPER_THREADS           | 4                      | number of threads to use during computation |
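
For context, the environment variables touched by this hunk are passed to the container at run time. The snippet below is a minimal sketch of how they might be set with `docker run`, using the documented defaults; the image name `mccloud/subgen` is an assumption and is not stated anywhere in this patch.

```bash
# Illustrative sketch only: the image name "mccloud/subgen" is an assumption,
# not taken from this patch. Values shown are the documented defaults.
docker run -d \
  -e TRANSCRIBE_DEVICE=cpu \
  -e WHISPER_MODEL=medium \
  -e CONCURRENT_TRANSCRIPTIONS=2 \
  -e WHISPER_THREADS=4 \
  mccloud/subgen
```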