Just installed, "failed to load" #17147
Replies: 1 comment
Unsure if relevant, but I removed my install to do a fresh one and used the git pull method (the 2nd method listed). This is the message I get at the end, where the local URL/IP address should appear:
RuntimeError: CUDA error: no kernel image is available for execution on the device
Stable diffusion model failed to load
Stable diffusion model failed to load
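If it helps narrow this down, "no kernel image is available for execution on the device" usually means the installed torch wheel wasn't built with kernels for that GPU's compute architecture. A minimal check, run with the venv's python.exe (a sketch only; device index 0 and the venv location are assumptions, adjust to your own install):

import torch

print("torch:", torch.__version__)                    # e.g. 2.7.0+cu128
print("CUDA available:", torch.cuda.is_available())

if torch.cuda.is_available():
    major, minor = torch.cuda.get_device_capability(0)
    print("GPU:", torch.cuda.get_device_name(0), f"compute capability {major}.{minor}")
    # Architectures this torch build ships kernels for; if the GPU's sm_<major><minor>
    # is missing from this list, the "no kernel image" error is expected.
    print("built for:", torch.cuda.get_arch_list())

If the capability printed there isn't in the arch list, a torch build targeting a different CUDA version is likely needed.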
Hey all,
I just followed the "Install and Run on Nvidia GPUs" steps.
I then edited webui-user.bat to add the auto-launch and auto-update lines, and this is what I am getting:
"Creating venv in directory E:\zstable-diffusion\stable-diffusion-sd.webui\webui\venv using python "C:\Users\Christian\AppData\Local\Programs\Python\Python310\python.exe"
Requirement already satisfied: pip in e:\zstable-diffusion\stable-diffusion-sd.webui\webui\venv\lib\site-packages (22.2.1)
Collecting pip
Using cached pip-25.2-py3-none-any.whl (1.8 MB)
Installing collected packages: pip
Attempting uninstall: pip
Found existing installation: pip 22.2.1
Uninstalling pip-22.2.1:
Successfully uninstalled pip-22.2.1
Successfully installed pip-25.2
venv "E:\zstable-diffusion\stable-diffusion-sd.webui\webui\venv\Scripts\Python.exe"
Python 3.10.6 (tags/v3.10.6:9c7b4bd, Aug 1 2022, 21:53:49) [MSC v.1932 64 bit (AMD64)]
Version: 1.10.1
Commit hash:
Installing torch and torchvision
Looking in indexes: https://pypi.org/simple, https://download.pytorch.org/whl/cu128
Collecting torch==2.7.0
Using cached https://download.pytorch.org/whl/cu128/torch-2.7.0%2Bcu128-cp310-cp310-win_amd64.whl.metadata (29 kB)
Collecting torchvision==0.22.0
Using cached https://download.pytorch.org/whl/cu128/torchvision-0.22.0%2Bcu128-cp310-cp310-win_amd64.whl.metadata (6.3 kB)
Collecting filelock (from torch==2.7.0)
Using cached filelock-3.20.0-py3-none-any.whl.metadata (2.1 kB)
Collecting typing-extensions>=4.10.0 (from torch==2.7.0)
Using cached typing_extensions-4.15.0-py3-none-any.whl.metadata (3.3 kB)
Collecting sympy>=1.13.3 (from torch==2.7.0)
Using cached sympy-1.14.0-py3-none-any.whl.metadata (12 kB)
Collecting networkx (from torch==2.7.0)
Using cached networkx-3.4.2-py3-none-any.whl.metadata (6.3 kB)
Collecting jinja2 (from torch==2.7.0)
Using cached jinja2-3.1.6-py3-none-any.whl.metadata (2.9 kB)
Collecting fsspec (from torch==2.7.0)
Using cached fsspec-2025.9.0-py3-none-any.whl.metadata (10 kB)
Collecting numpy (from torchvision==0.22.0)
Using cached numpy-2.2.6-cp310-cp310-win_amd64.whl.metadata (60 kB)
Collecting pillow!=8.3.*,>=5.3.0 (from torchvision==0.22.0)
Using cached pillow-11.3.0-cp310-cp310-win_amd64.whl.metadata (9.2 kB)
Collecting mpmath<1.4,>=1.1.0 (from sympy>=1.13.3->torch==2.7.0)
Using cached https://download.pytorch.org/whl/mpmath-1.3.0-py3-none-any.whl (536 kB)
Collecting MarkupSafe>=2.0 (from jinja2->torch==2.7.0)
Using cached markupsafe-3.0.3-cp310-cp310-win_amd64.whl.metadata (2.8 kB)
Using cached https://download.pytorch.org/whl/cu128/torch-2.7.0%2Bcu128-cp310-cp310-win_amd64.whl (3338.3 MB)
Using cached https://download.pytorch.org/whl/cu128/torchvision-0.22.0%2Bcu128-cp310-cp310-win_amd64.whl (7.6 MB)
Using cached pillow-11.3.0-cp310-cp310-win_amd64.whl (7.0 MB)
Using cached sympy-1.14.0-py3-none-any.whl (6.3 MB)
Using cached typing_extensions-4.15.0-py3-none-any.whl (44 kB)
Using cached filelock-3.20.0-py3-none-any.whl (16 kB)
Using cached fsspec-2025.9.0-py3-none-any.whl (199 kB)
Using cached jinja2-3.1.6-py3-none-any.whl (134 kB)
Using cached markupsafe-3.0.3-cp310-cp310-win_amd64.whl (15 kB)
Using cached networkx-3.4.2-py3-none-any.whl (1.7 MB)
Using cached numpy-2.2.6-cp310-cp310-win_amd64.whl (12.9 MB)
Installing collected packages: mpmath, typing-extensions, sympy, pillow, numpy, networkx, MarkupSafe, fsspec, filelock, jinja2, torch, torchvision
Successfully installed MarkupSafe-3.0.3 filelock-3.20.0 fsspec-2025.9.0 jinja2-3.1.6 mpmath-1.3.0 networkx-3.4.2 numpy-2.2.6 pillow-11.3.0 sympy-1.14.0 torch-2.7.0+cu128 torchvision-0.22.0+cu128 typing-extensions-4.15.0
Installing clip
Installing open_clip
Couldn't determine assets's hash: 6f7db241d2f8ba7457bac5ca9753331f0c266917, attempting autofix...
Fetching all contents for assets
'"git"' is not recognized as an internal or external command,
operable program or batch file.
Traceback (most recent call last):
File "E:\zstable-diffusion\stable-diffusion-sd.webui\webui\launch.py", line 53, in
main()
File "E:\zstable-diffusion\stable-diffusion-sd.webui\webui\launch.py", line 44, in main
prepare_environment()
File "E:\zstable-diffusion\stable-diffusion-sd.webui\webui\modules\launch_utils.py", line 443, in prepare_environment
git_clone(assets_repo, repo_dir('stable-diffusion-webui-assets'), "assets", assets_commit_hash)
File "E:\zstable-diffusion\stable-diffusion-sd.webui\webui\modules\launch_utils.py", line 176, in git_clone
current_hash = run_git(dir, name, 'rev-parse HEAD', None, f"Couldn't determine {name}'s hash: {commithash}", live=False).strip()
File "E:\zstable-diffusion\stable-diffusion-sd.webui\webui\modules\launch_utils.py", line 164, in run_git
git_fix_workspace(dir, name)
File "E:\zstable-diffusion\stable-diffusion-sd.webui\webui\modules\launch_utils.py", line 151, in git_fix_workspace
run(f'"{git}" -C "{dir}" fetch --refetch --no-auto-gc', f"Fetching all contents for {name}", f"Couldn't fetch {name}", live=True)
File "E:\zstable-diffusion\stable-diffusion-sd.webui\webui\modules\launch_utils.py", line 114, in run
raise RuntimeError("\n".join(error_bits))
RuntimeError: Couldn't fetch assets.
Command: "git" -C "E:\zstable-diffusion\stable-diffusion-sd.webui\webui\repositories\stable-diffusion-webui-assets" fetch --refetch --no-auto-gc
Error code: 1
Press any key to continue . . ."
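From what I can tell, the part that actually kills the run is '"git"' is not recognized as an internal or external command, i.e. git can't be found by the shell that launch.py spawns, so the assets autofix fails. A quick sanity check from the same environment (a sketch only; as far as I can tell, modules/launch_utils.py resolves git from a GIT environment variable and otherwise falls back to plain "git" on PATH):

import os
import shutil
import subprocess

# Resolve git the same way the launcher appears to: GIT env var, else "git" on PATH.
git = os.environ.get("GIT", "git")
resolved = shutil.which(git)
print("git resolves to:", resolved)   # None reproduces the "'git' is not recognized" error

if resolved:
    out = subprocess.run([resolved, "--version"], capture_output=True, text=True)
    print(out.stdout.strip())

If that prints None, installing Git for Windows (or fixing PATH, or setting GIT to the full path of git.exe) and re-running webui-user.bat should get past this error.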