I am getting this on my Mac M1 (Ventura 13.5.2) with Python 3.11.5:
Traceback (most recent call last):
File "/Users/user/code/project/text-generation-webui/server.py", line 29, in <module>
from modules import (
File "/Users/user/code/project/text-generation-webui/modules/ui_default.py", line 3, in <module>
from modules import logits, shared, ui, utils
File "/Users/user/code/project/text-generation-webui/modules/logits.py", line 4, in <module>
from modules.exllama import ExllamaModel
File "/Users/user/code/project/text-generation-webui/modules/exllama.py", line 22, in <module>
from generator import ExLlamaGenerator
File "/Users/user/code/project/text-generation-webui/repositories/exllama/generator.py", line 1, in <module>
import cuda_ext
File "/Users/user/code/project/text-generation-webui/repositories/exllama/cuda_ext.py", line 43, in <module>
exllama_ext = load(
^^^^^
File "/Users/user/code/project/text-generation-webui-venv/lib/python3.11/site-packages/torch/utils/cpp_extension.py", line 1284, in load
return _jit_compile(
^^^^^^^^^^^^^
File "/Users/user/code/project/text-generation-webui-venv/lib/python3.11/site-packages/torch/utils/cpp_extension.py", line 1509, in _jit_compile
_write_ninja_file_and_build_library(
File "/Users/user/code/project/text-generation-webui-venv/lib/python3.11/site-packages/torch/utils/cpp_extension.py", line 1601, in _write_ninja_file_and_build_library
extra_ldflags = _prepare_ldflags(
^^^^^^^^^^^^^^^^^
File "/Users/user/code/project/text-generation-webui-venv/lib/python3.11/site-packages/torch/utils/cpp_extension.py", line 1699, in _prepare_ldflags
extra_ldflags.append(f'-L{_join_cuda_home("lib64")}')
^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/user/code/project/text-generation-webui-venv/lib/python3.11/site-packages/torch/utils/cpp_extension.py", line 2223, in _join_cuda_home
raise EnvironmentError('CUDA_HOME environment variable is not set. '
OSError: CUDA_HOME environment variable is not set. Please set it to your CUDA install root.
make: *** [run-llama2-text-generation-webui] Error 1
My Mac doesn't have an NVIDIA GPU, so I don't have CUDA. How can I get past this error?
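For context, here is a quick sanity check (a minimal sketch, assuming a recent PyTorch build that includes the MPS backend) showing that no CUDA toolkit is visible on this machine, which is why the `_join_cuda_home()` call above fails:

```python
import os
import torch

# CUDA_HOME is unset on this machine; this is exactly what _join_cuda_home() complains about
print(os.environ.get("CUDA_HOME"))        # None

# Apple Silicon has no NVIDIA GPU, so the CUDA backend is unavailable.
# The Metal (MPS) backend is what PyTorch exposes on M1 instead.
print(torch.cuda.is_available())          # False
print(torch.backends.mps.is_available())  # True on M1 with a recent PyTorch
```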