
no model named 'torch' #740

Open
Tracked by #743
DeForestDiamond opened this issue May 4, 2023 · 55 comments · May be fixed by #743
Comments

@DeForestDiamond

I'm really not used to using the command prompt, and I'm guessing this is an issue with torch, but I even reinstalled it and I'm still getting this error. Any ideas?

(I know the absolute bare minimum about this stuff. So, please talk to me like I'm an idiot, lol)

(venv) C:\Users\Nutri>pip install -U xformers
Collecting xformers
  Using cached xformers-0.0.16.tar.gz (7.3 MB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  error: subprocess-exited-with-error

  × Getting requirements to build wheel did not run successfully.
  │ exit code: 1
  ╰─> [21 lines of output]
      Traceback (most recent call last):
        File "C:\Users\Nutri\venv\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 353, in <module>
          main()
        File "C:\Users\Nutri\venv\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 335, in main
          json_out['return_val'] = hook(**hook_input['kwargs'])
                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "C:\Users\Nutri\venv\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 118, in get_requires_for_build_wheel
          return hook(config_settings)
                 ^^^^^^^^^^^^^^^^^^^^^
        File "C:\Users\Nutri\AppData\Local\Temp\pip-build-env-w3jebqad\overlay\Lib\site-packages\setuptools\build_meta.py", line 341, in get_requires_for_build_wheel
          return self._get_build_requires(config_settings, requirements=['wheel'])
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "C:\Users\Nutri\AppData\Local\Temp\pip-build-env-w3jebqad\overlay\Lib\site-packages\setuptools\build_meta.py", line 323, in _get_build_requires
          self.run_setup()
        File "C:\Users\Nutri\AppData\Local\Temp\pip-build-env-w3jebqad\overlay\Lib\site-packages\setuptools\build_meta.py", line 488, in run_setup
          self).run_setup(setup_script=setup_script)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "C:\Users\Nutri\AppData\Local\Temp\pip-build-env-w3jebqad\overlay\Lib\site-packages\setuptools\build_meta.py", line 338, in run_setup
          exec(code, locals())
        File "<string>", line 23, in <module>
      ModuleNotFoundError: No module named 'torch'
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.
@danthe3rd
Contributor

You should install torch before installing xFormers.

@DeForestDiamond
Author

I already had torch, I needed it to install stable diffusion.

@ivanprado

I think the problem is that torch is imported in setup.py (see here https://github.com/facebookresearch/xformers/blob/main/setup.py#L23).

This is problematic when installing all the dependencies of a project from scratch in an environment. All the dependencies resolve properly and torch is going to be installed as well... but before that, the setup.py of every project is invoked, and at that stage torch has not been installed yet. I don't know the exact solution, but it seems that this breaks the contract of setup.py.

The workaround I found is to install the torch dependency in advance. For example:

pip install torch
pip install -r requirements.txt
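For illustration, the pattern at issue can be sketched like this (a hypothetical stand-in, not xformers' actual setup.py): a module-level import torch raises ModuleNotFoundError as soon as pip's isolated build hook executes setup.py, while a deferred import at least fails gracefully:

```python
# Hypothetical sketch: xformers' real setup.py does "import torch" at module
# level (line 23 in the traceback above), which raises ModuleNotFoundError in
# pip's isolated build environment. Deferring the import lets the script run
# and report the problem instead of crashing at import time.
def torch_version_or_none():
    """Return the installed torch version, or None if torch is absent."""
    try:
        import torch  # only succeeds if torch is visible to the build env
    except ImportError:
        return None
    return torch.__version__

print(torch_version_or_none())  # None when torch is missing from the env
```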

@DeForestDiamond
Author

Thank you, I'll give that a try after work :)

@worthy7

worthy7 commented May 7, 2023

I am also getting this issue, and am unsure how to solve it.

I am adding this package inside a venv which already has the latest pytorch installed. So I can only assume that this "subprocess" is doing something in another environment. I'm not a python/pip expert at all.

@JosefWN

JosefWN commented May 8, 2023

Same issue for me, torch is already installed and working but I can't install xformers. Seems specific to Windows, as I could install xformers without issues on macOS the other day.

@jinmingyi1998 jinmingyi1998 linked a pull request May 10, 2023 that will close this issue
@adampauls

adampauls commented Jun 6, 2023

I'm hitting this issue with poetry on macOS.

@jinmingyi1998

I'm hitting this issue with poetry on macOS.

One trick is to make sure torch itself is installed correctly first,
and then just pip install xformers --no-dependencies

@rkique

rkique commented Jun 8, 2023

Yep, installing pytorch from source and then pip install xformers --no-dependencies works for me.

@adampauls

@jinmingyi1998 do I understand correctly that the suggested workaround requires not putting xformers in my pyproject.toml and manually installing it instead? That's definitely not an ideal use of poetry?

Am I understanding correctly that xformers is expressly choosing to disregard the recommendation against "building packages in the destination environment" mentioned in this comment?

@danthe3rd
Contributor

That's definitely not a good situation, but we couldn't find a satisfying solution. The problem with not building in the destination environment is that you might end up building xFormers with a different version of Pytorch than the one in your environment, which will break everything.
We had some discussion in this PR: #743

@adampauls

adampauls commented Jun 12, 2023

Yes, I saw that discussion. Is it possible to provide some pre-built wheels that bake that relationship into the package name? E.g. I could declare a dependency on xformers-pytorch-2-0-1 = "^0.0.20". It's a little annoying for the library user to have to put the version number of pytorch twice, but it's less annoying than not being able to use poetry properly. And you could still use the base xformers dependency with a custom build if you don't want to use one of the already-built xformers binaries?

@danthe3rd
Contributor

You can also use the pypi wheels which are already built:
https://pypi.org/project/xformers/#history
As the binaries are built with a specific version of pytorch, they have that version pinned as a requirement. I hope this helps ...
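You can check which torch version a given wheel was pinned against from Python itself; this is a small sketch using the standard library's importlib.metadata (Python 3.8+), not anything xformers-specific:

```python
# Sketch: list the requirements an installed distribution declares, e.g. to
# see which torch version an installed xformers wheel is pinned against.
# Returns [] when the distribution is not installed or declares nothing.
from importlib import metadata

def declared_requirements(dist_name):
    try:
        return list(metadata.requires(dist_name) or [])
    except metadata.PackageNotFoundError:
        return []

# e.g. declared_requirements("xformers") might include a pin like
# "torch==2.0.1" for a wheel built against that torch release.
print(declared_requirements("xformers"))
```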

@adampauls

Hmm, maybe I missed something. The issue above happens when you use xformers = "^0.0.20" with poetry, which should be fetching one of those wheels. A minimal poetry project with pyproject.toml:

[tool.poetry]
name = "xformers-test"
version = "0.1.0"
description = ""
authors = []
readme = "README.md"
packages = [{include = "xformers_test"}]

[tool.poetry.dependencies]
python = "^3.11"
xformers = "^0.0.20"

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"

will crash on poetry install with

  Backend subprocess exited when trying to invoke get_requires_for_build_wheel
...
  ModuleNotFoundError: No module named 'torch'

I'm using

xformers-test % poetry --version
Poetry (version 1.5.1)
xformers-test % pip3 --version
pip 22.3.1 from /opt/homebrew/lib/python3.11/site-packages/pip (python 3.11)

@lynxoid

lynxoid commented Jun 13, 2023

Also running into this issue. My pip requirements file has these versions:

torch==2.0.1
notebook==6.5.4
transformers==4.30.0
matplotlib==3.7.1
scikit-learn==1.2.2
pandas==2.0.2
xformers==0.0.20

and I am running in a venv with py3.10. It looks to me that either 1) xformers is uninstalling torch before its own install or 2) xformers install is ignoring venv paths and installing on the machine natively (and so it does not see an installed torch dependency).

I tried pip install --pre xformers, pip install xformers==0.0.20, pip install xformers with the same result:

Collecting xformers
  Using cached xformers-0.0.20.tar.gz (7.6 MB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  error: subprocess-exited-with-error

  × Getting requirements to build wheel did not run successfully.
  │ exit code: 1
  ╰─> [17 lines of output]
      Traceback (most recent call last):
        File "/Users/daryafilippova/dev/environments/pytorch_2/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 353, in <module>
          main()
        File "/Users/daryafilippova/dev/environments/pytorch_2/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 335, in main
          json_out['return_val'] = hook(**hook_input['kwargs'])
        File "/Users/daryafilippova/dev/environments/pytorch_2/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 118, in get_requires_for_build_wheel
          return hook(config_settings)
        File "/private/var/folders/ft/jrq7289j5d5_mvl5phb5pnlm0000gn/T/pip-build-env-jjxyy34d/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 341, in get_requires_for_build_wheel
          return self._get_build_requires(config_settings, requirements=['wheel'])
        File "/private/var/folders/ft/jrq7289j5d5_mvl5phb5pnlm0000gn/T/pip-build-env-jjxyy34d/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 323, in _get_build_requires
          self.run_setup()
        File "/private/var/folders/ft/jrq7289j5d5_mvl5phb5pnlm0000gn/T/pip-build-env-jjxyy34d/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 487, in run_setup
          super(_BuildMetaLegacyBackend,
        File "/private/var/folders/ft/jrq7289j5d5_mvl5phb5pnlm0000gn/T/pip-build-env-jjxyy34d/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 338, in run_setup
          exec(code, locals())
        File "<string>", line 23, in <module>
      ModuleNotFoundError: No module named 'torch'
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.
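Stepping back, the common thread in these reports is pip's build isolation: the sdist's setup.py runs in a fresh temporary environment that does not contain the venv's torch. A workaround often suggested for this class of failure (my suggestion, not something verified by the maintainers in this thread) is to disable that isolation so the build sees the torch you already installed:

```shell
# Assumes torch is already installed in the active venv.
# Build-time tools must be installed manually, since --no-build-isolation
# also skips pip's automatic provisioning of build requirements.
pip install wheel setuptools ninja
# Build the sdist in the active environment (where torch is visible)
# instead of in a temporary isolated build environment.
pip install --no-build-isolation xformers==0.0.20
```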

@adampauls

adampauls commented Jun 14, 2023

You can also use the pypi wheels which are already built:

I see that there are no prebuilt binaries for MacOS in PyPI, which would explain why it tries to build locally.

@danthe3rd
Contributor

xformers is not compatible with MacOS

@drscotthawley

@danthe3rd Facebook should put that in the README in the install instructions. Would save so many of us wasted time.

(Sure it says, "Recommended, Windows & Linux", but that's much less clear than actually saying "Does not work on Mac")

@adampauls

HuggingFace complains that xformers is not available. I know that's HF's fault (or maybe torch?), but it's annoying that I can't silence the warning or that the warning exists at all. I'll figure out where the warning is coming from and file an issue there.

@sidhartha-roy

I am getting the same issue, I have a Mac M1 Pro chip.

I ensured that pytorch is installed before trying to install xformers.

>>> print(torch.__version__)
2.0.1

I still get the same error:

Collecting xformers
  Using cached xformers-0.0.20.tar.gz (7.6 MB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  error: subprocess-exited-with-error

  × Getting requirements to build wheel did not run successfully.
  │ exit code: 1
  ╰─> [21 lines of output]
      Traceback (most recent call last):
        File "/Users/sidhartharoy/research-stage/llm-databricks/env/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 353, in <module>
          main()
        File "/Users/sidhartharoy/research-stage/llm-databricks/env/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 335, in main
          json_out['return_val'] = hook(**hook_input['kwargs'])
                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/Users/sidhartharoy/research-stage/llm-databricks/env/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 118, in get_requires_for_build_wheel
          return hook(config_settings)
                 ^^^^^^^^^^^^^^^^^^^^^
        File "/private/var/folders/mv/wh9dwtcd2wg8x2d82s0rbstm0000gq/T/pip-build-env-j67vrkzj/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 341, in get_requires_for_build_wheel
          return self._get_build_requires(config_settings, requirements=['wheel'])
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/private/var/folders/mv/wh9dwtcd2wg8x2d82s0rbstm0000gq/T/pip-build-env-j67vrkzj/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 323, in _get_build_requires
          self.run_setup()
        File "/private/var/folders/mv/wh9dwtcd2wg8x2d82s0rbstm0000gq/T/pip-build-env-j67vrkzj/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 488, in run_setup
          self).run_setup(setup_script=setup_script)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/private/var/folders/mv/wh9dwtcd2wg8x2d82s0rbstm0000gq/T/pip-build-env-j67vrkzj/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 338, in run_setup
          exec(code, locals())
        File "<string>", line 23, in <module>
      ModuleNotFoundError: No module named 'torch'
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.

I tried the following options:
pip install --pre -U xformers
pip install xformers --no-dependencies

pip install ninja
pip install -v -U git+https://github.com/facebookresearch/xformers.git@main#egg=xformers

I still get the exact same error as shown above.

@DeForestDiamond
Author

I completely forgot I started this question, my bad... I ended up downloading the "A1111 WebUI Easy Installer and Launcher". It installs all the dependencies and everything you need, and has an option to install xformers. Worked like a charm.

@talhaanwarch

--no-dependencies

Getting the same on Linux with torch 2.0.1.

@dfiru

dfiru commented Aug 3, 2023

this thread is cursed

Drun555 added a commit to Drun555/InvokeAI that referenced this issue Oct 14, 2023
I'm not sure if it's the correct way of handling things, but correcting this string to '==0.0.20' fixes the xformers install for me - and maybe for others too.

Please see this thread, this is the issue I had (trying to install InvokeAI):
facebookresearch/xformers#740
@MichaelMalike

MichaelMalike commented Oct 26, 2023

Here is my solution in my environment:
First, my server is aarch64; maybe this problem doesn't exist on the x86_64 platform at all.
I got the same error messages as above when I tried pip install xformers:

ModuleNotFoundError: No module named 'torch'

But I had already installed torch, so I followed the suggestions in this issue: I tried --no-dependencies and --pre -U, but unfortunately nothing worked.
Then I suddenly realized I hadn't installed wheel yet; maybe things would change if I installed it:
pip install wheel
Indeed, after wheel was installed, the "ModuleNotFoundError: No module named 'torch'" went away and it started to compile, but I got a new error message:

ninja: build stopped: subcommand failed.
Traceback (most recent call last):
File "/stable-diffusion-webui/venv/lib/python3.10/site-packages/torch/utils/cpp_extension.py", line 1900, in _run_ninja_build
subprocess.run(
File "/stable-diffusion-webui/runtime/python/3.10.6/lib/python3.10/subprocess.py", line 524, in run
raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['ninja', '-v']' returned non-zero exit status 1.

I installed ninja
pip install ninja
But still the same compile errors.
Then I tried to download the source code of xformers:
git clone https://github.com/facebookresearch/xformers.git
git checkout v0.0.16rc425
pip install .
Still no luck, same compile errors.
Then I looked at the error messages again and found a clue:

error: ‘vld1q_f32_x2’ was not declared in this scope

I searched for the error, and it is said to be a problem with gcc on aarch64 that will be fixed in gcc-8.4.1-1.
I had gcc-10.3.1 at hand, so I tried it and it works!
CC=/gcc1031/bin/gcc CXX=/gcc1031/bin/c++ pip install .

Building wheels for collected packages: xformers
Building wheel for xformers (setup.py) ... done
Created wheel for xformers: filename=xformers-0.0.16+6f3c20f.d20231026-cp310-cp310-linux_aarch64.whl size=7641635 sha256=222f900bf3f4917368b6aa1108080ccdd59d1c2589e2553a2c94dea57cee86f0
Stored in directory: /tmp/pip-ephem-wheel-cache-hnw74sa9/wheels/6f/f5/81/43807df029d66aff5bf231868b0c3f7fbde0b32ed35d595fe7
Successfully built xformers
DEPRECATION: pytorch-lightning 1.7.6 has a non-standard dependency specifier torch>=1.9.. pip 24.0 will enforce this behaviour change. A possible replacement is to upgrade to a newer version of pytorch-lightning or contact the author to suggest that they release a version with a conforming dependency specifiers. Discussion can be found at pypa/pip#12063
DEPRECATION: torchsde 0.2.5 has a non-standard dependency specifier numpy>=1.19.
; python_version >= "3.7". pip 24.0 will enforce this behaviour change. A possible replacement is to upgrade to a newer version of torchsde or contact the author to suggest that they release a version with a conforming dependency specifiers. Discussion can be found at pypa/pip#12063
Installing collected packages: xformers
Successfully installed xformers-0.0.16+6f3c20f.d20231026

@carlthome

Also struggling with this issue now on macOS M1. Is there any reasonable workaround? I'd like to avoid running in Docker Desktop when debugging unit tests.

@aurban0

aurban0 commented Nov 1, 2023

> [quotes @MichaelMalike's aarch64 solution above]

The real solution. Many thanks.

@HuabeiYou

> [quotes @MichaelMalike's aarch64 solution above]

Thanks for the explanation! In my case (Apple M1, macOS 13.6.4), after installing wheel and torch, the error I got when building the wheel was clang: error: unsupported option '-fopenmp'. I did

CC=/opt/homebrew/opt/gcc/bin/gcc-13 CXX=/opt/homebrew/opt/gcc/bin/c++-13 pip install xformers

and it installed.

@chintanckg

None of the provided solutions are working! Is there any straightforward solution that works on an Ubuntu EC2 instance?

@VictorMustin

pip install xformers --no-dependencies

Yes, it worked for me (macos m3).

I have done :

poetry run python -m pip install --upgrade pip
poetry run python -m pip install --upgrade setuptools
poetry run python -m pip install "torch>=2.1.0"
poetry run python -m pip install wheel ninja
poetry run python -m pip install xformers --no-dependencies

Ugly, but it works...

Thanks a lot

Tried everything and doing things in this order worked for me (windows 11) :))

@ThePowerOfElectricity

ThePowerOfElectricity commented Apr 15, 2024

The issue still persists. Windows 10 // Torch 2.2.2
None of the solutions worked for me, I'm giving up on the project using it and moving on with life.

EDIT: Adding the pip flag --no-use-pep517 got me a different error message further downstream

@shiroxxsora

> [quotes the poetry workaround above]

thanks

@631068264

> [quotes @HuabeiYou's reply above]

@HuabeiYou How to find CC and CXX path ?

@HuabeiYou

@HuabeiYou

Here is my solution in my environment: Firstly my server is aarch64, and maybe there is no such problem on the x86_64 platform at all. I got the same error messages just like above when I tried to use pip install xformers:

ModuleNotFoundError: No module named 'torch'

But I already installed torch, so I followed the suggestions in this issue, I tried --no-dependencies and --pre -U, unfortunately nothing works. Then I suddenly found I didn't install wheel yet, maybe things will change if I install it pip install wheel Indeed, after the wheel is installed, "ModuleNotFoundError: No module named 'torch'" is gone away, it started to compile but I got a new error message:

ninja: build stopped: subcommand failed.
Traceback (most recent call last):
File "/stable-diffusion-webui/venv/lib/python3.10/site-packages/torch/utils/cpp_extension.py", line 1900, in _run_ninja_build
subprocess.run(
File "/stable-diffusion-webui/runtime/python/3.10.6/lib/python3.10/subprocess.py", line 524, in run
raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['ninja', '-v']' returned non-zero exit status 1.

I installed ninja pip install ninja But still same compile errors Then I tried to download the source code of xformers: git clone https://github.com/facebookresearch/xformers.git git checkout v0.0.16rc425 pip install . Still no lucky, has same compile errors. Then I looked the error messages again, and found a clue:

error: ‘vld1q_f32_x2’ was not declared in this scope

I searched the error, and it is said it's a problem of gcc in aarch64, and will be fixed in gcc-8.4.1-1 I just have gcc-10.3.1 at hand, so I tried it and it works! CC=/gcc1031/bin/gcc CXX=/gcc1031/bin/c++ pip install .

Building wheels for collected packages: xformers
Building wheel for xformers (setup.py) ... done
Created wheel for xformers: filename=xformers-0.0.16+6f3c20f.d20231026-cp310-cp310-linux_aarch64.whl size=7641635 sha256=222f900bf3f4917368b6aa1108080ccdd59d1c2589e2553a2c94dea57cee86f0
Stored in directory: /tmp/pip-ephem-wheel-cache-hnw74sa9/wheels/6f/f5/81/43807df029d66aff5bf231868b0c3f7fbde0b32ed35d595fe7
Successfully built xformers
DEPRECATION: pytorch-lightning 1.7.6 has a non-standard dependency specifier torch>=1.9.. pip 24.0 will enforce this behaviour change. A possible replacement is to upgrade to a newer version of pytorch-lightning or contact the author to suggest that they release a version with a conforming dependency specifiers. Discussion can be found at pypa/pip#12063
DEPRECATION: torchsde 0.2.5 has a non-standard dependency specifier numpy>=1.19.
; python_version >= "3.7". pip 24.0 will enforce this behaviour change. A possible replacement is to upgrade to a newer version of torchsde or contact the author to suggest that they release a version with a conforming dependency specifiers. Discussion can be found at pypa/pip#12063
Installing collected packages: xformers
Successfully installed xformers-0.0.16+6f3c20f.d20231026
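If you hit the same vld1q_f32_x2 error, it can help to confirm which gcc version your build will actually use before pointing CC/CXX at it. A small helper sketch (the function names are mine, purely illustrative):

```python
import re
import subprocess

def parse_gcc_version(version_output: str):
    """Extract (major, minor, patch) from `gcc --version` output."""
    m = re.search(r"(\d+)\.(\d+)\.(\d+)", version_output)
    return tuple(map(int, m.groups())) if m else None

def gcc_version(gcc: str = "gcc"):
    """Return the version triple of `gcc`, or None if it isn't on PATH."""
    try:
        out = subprocess.run([gcc, "--version"], capture_output=True,
                             text=True, check=True).stdout
    except (FileNotFoundError, subprocess.CalledProcessError):
        return None
    return parse_gcc_version(out)
```

Anything at or above the fixed release (gcc 8.4.1 per the note above) should build the NEON intrinsics without that error.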

Thanks for the explanation! In my case (Apple M1, macOS 13.6.4), after installing wheel and torch, the error I got when building the wheel was clang: error: unsupported option '-fopenmp'. I did

CC=/opt/homebrew/opt/gcc/bin/gcc-13 CXX=/opt/homebrew/opt/gcc/bin/c++-13 pip install xformers

and it installed.

@HuabeiYou How do I find the CC and CXX paths?

From my (very limited) understanding, the clang error was thrown because macOS's default C compiler doesn't support some options required for compiling certain dependencies. To resolve this, we need to use an alternative compiler such as gcc.

brew install gcc

After installation, you can typically find the GCC and G++ compilers under $HOMEBREW_PREFIX/bin/ or $HOMEBREW_PREFIX/opt/gcc/bin/ and you can use them as CC and CXX respectively.
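A small sketch of the same lookup in Python, in case you want to script it: pick the first Homebrew gcc found on PATH and export it as CC (the helper and candidate list are my own, illustrative only):

```python
import shutil

def pick_compiler(candidates, which=shutil.which):
    """Return the first candidate found on PATH, or None.

    `which` is injectable for testing; by default it searches PATH
    the same way the shell does.
    """
    for name in candidates:
        if which(name):
            return name
    return None

# Versioned names first: on macOS a plain `gcc` is usually clang in disguise.
cc = pick_compiler(["gcc-14", "gcc-13", "gcc-12"])
print(cc or "no Homebrew gcc found -- try `brew install gcc`")
```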

@631068264

@HuabeiYou


Worked for me. Just use brew info gcc to get the path.

@MurphLu

MurphLu commented May 16, 2024

Yes, it worked for me (macos m3).

I have done :

poetry run python -m pip install --upgrade pip
poetry run python -m pip install --upgrade setuptools
poetry run python -m pip install "torch>=2.1.0"
poetry run python -m pip install wheel ninja
poetry run python -m pip install xformers --no-dependencies

Ugly, but it works...

Thanks a lot

it worked for me (macos intel)

@shuttie

shuttie commented Jun 4, 2024

FYI had the same issue on Ubuntu 24.04 with Python 3.12, and this trick did the install without issues:

pip install wheel ninja setuptools torch
pip install xformers --no-dependencies --no-build-isolation
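With --no-build-isolation, pip builds xformers against the packages in the *current* environment, so it's worth confirming the build prerequisites are importable there first. A quick check (the helper is mine, not from the thread):

```python
import importlib.util

def missing_build_deps(mods=("torch", "wheel", "ninja", "setuptools")):
    """Return the subset of build prerequisites that are not importable
    in the current interpreter -- the one pip will use for the build
    when build isolation is disabled."""
    return [m for m in mods if importlib.util.find_spec(m) is None]

if __name__ == "__main__":
    gone = missing_build_deps()
    print("all build deps present" if not gone else f"missing: {gone}")
```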

@jeremyxie97


Hi, how can I install gcc on Windows? Thanks!

@Spacefish

Spacefish commented Jun 28, 2024

pip install wheel
pip install setuptools

fixed it for me

@KyleAWang

KyleAWang commented Jul 5, 2024

pip install setuptools
pip install xformers --no-dependencies --no-build-isolation

worked for me

@aswordok

pip install ninja wheel setuptools
git clone https://github.com/facebookresearch/xformers.git
cd xformers
git submodule update --init --recursive
pip install .
Successfully installed xformers-0.0.27+66cfba7.d20240609

worked for me.

@tevops

tevops commented Jul 30, 2024

For me, the solution was to install torch independently before installing xformers. I use a MacBook with M3 (macOS Sonoma); the workaround was to install gcc with brew (version 14 at the time of this comment) and install the package via
export clang=gcc-14; python -m pip install xformers --no-dependencies. At least the errors went away (before that, with the native clang, the issue was the unsupported option '-fopenmp').

@jayeew

jayeew commented Aug 29, 2024

pip install wheel
pip install setuptools

fixed it for me

this works for me. Thank you.

@pawamoy

pawamoy commented Sep 8, 2024

Couldn't this project simply add a pyproject.toml with the following contents?

[build-system]
requires = ["setuptools", "torch"]
build-backend = "setuptools.build_meta"
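That would likely fix it: under PEP 517 build isolation, pip builds the package in a fresh environment containing only the declared build requirements, and xformers' setup.py needs torch importable at build time. The failure mode, sketched (an illustrative helper, not the real setup.py):

```python
import importlib

def require_build_dep(name: str):
    """Mimic a setup.py that imports a dependency at build time.

    In an isolated build environment, any module not declared in
    [build-system].requires is absent -- so an undeclared `torch`
    produces exactly the error reported in this thread.
    """
    try:
        return importlib.import_module(name)
    except ModuleNotFoundError:
        raise SystemExit(
            f"No module named '{name}' -- declare it in "
            f"[build-system].requires or pass --no-build-isolation"
        )

# xformers effectively does require_build_dep("torch") when setup.py runs.
```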

@xzkeee

xzkeee commented Sep 17, 2024

pip install wheel
pip install setuptools
fixed it for me

this works for me. Thank you.

Same. It turns out my system didn't have wheel installed.

@atemate

atemate commented Sep 24, 2024

TL;DR try export VIRTUALENV_SYSTEM_SITE_PACKAGES=true

See issue python-poetry/poetry#9707

@itsnotoger

itsnotoger commented Oct 2, 2024

Windows 11 Solution

pip install wheel
Windows uses VS build tools instead of gcc for me, so install that:
https://visualstudio.microsoft.com/visual-cpp-build-tools/
Select components in the installer: Windows 11 SDK, MSVC v143 - VS 2022 build tools.
After this, it worked for me. I did not need ninja or other dependencies, and torch was found correctly.

But pay close attention to the build errors; they should tell you whether it wants gcc or another compiler.

@tuxfamily

tuxfamily commented Nov 6, 2024

Similar issues trying to install xformers on a Mac Studio M1 Max in a conda venv.

After many attempts and different errors, this was the solution for me:

brew install libomp llvm gcc
set -gx LDFLAGS "-L/opt/homebrew/opt/llvm/lib -L/opt/homebrew/opt/libomp/lib"
set -gx CPPFLAGS "-I/opt/homebrew/opt/llvm/include -I/opt/homebrew/opt/libomp/include"
sudo ln -s /opt/homebrew/Cellar/gcc/14.2.0_1/bin/gcc-14 /usr/local/bin/gcc
sudo ln -s /opt/homebrew/Cellar/gcc/14.2.0_1/bin/g++-14 /usr/local/bin/g++
sudo ln -s /opt/homebrew/Cellar/gcc/14.2.0_1/bin/cpp-14 /usr/local/bin/cpp
sudo ln -s /opt/homebrew/Cellar/gcc/14.2.0_1/bin/c++-14 /usr/local/bin/c++
pip install --no-build-isolation xformers

Notes:

  1. I'm using fish, which is why I use set -gx instead of export
  2. instead of ln -s, a cleaner set -gx CC ... would probably have worked too; in any case, I removed these symlinks after the build

Now I've this warning

WARNING[XFORMERS]: xFormers can't load C++/CUDA extensions. xFormers was built for:
    PyTorch 2.5.1 with CUDA None (you have 2.5.1)
    Python  3.12.2 (you have 3.12.2)
  Please reinstall xformers (see https://github.com/facebookresearch/xformers#installing-xformers)
  Memory-efficient attention, SwiGLU, sparse and more won't be available.
  Set XFORMERS_MORE_DETAILS=1 for more details
xformers version: 0.0.28.post3

Which is normal I guess :)

EDIT: in the end, I removed xformers because it was causing issues with some ComfyUI workflows

NotImplementedError: No operator found for `memory_efficient_attention_forward` with inputs:
     query       : shape=(1, 1962, 12, 64) (torch.float32)
     key         : shape=(1, 1962, 12, 64) (torch.float32)
     value       : shape=(1, 1962, 12, 64) (torch.float32)
     attn_bias   : <class 'NoneType'>
     p           : 0.0
`ckF` is not supported because:
    device=mps (supported: {'cuda'})
    dtype=torch.float32 (supported: {torch.bfloat16, torch.float16})
    operator wasn't built - see `python -m xformers.info` for more info

Too bad... 😟
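For others hitting the same NotImplementedError on mps: PyTorch's built-in torch.nn.functional.scaled_dot_product_attention computes the same attention result without xformers and, as far as I understand, should run on the mps backend. For reference, the math that memory_efficient_attention_forward implements, sketched in plain Python (naive version, none of the memory savings):

```python
import math

def attention(q, k, v):
    """Scaled dot-product attention over plain lists.

    q, k, v are [seq][dim]; returns softmax(q @ k.T / sqrt(dim)) @ v.
    This is the math behind memory_efficient_attention, minus the
    chunking that makes it memory-efficient.
    """
    d = len(q[0])
    out = []
    for qi in q:
        scores = [sum(a * b for a, b in zip(qi, kj)) / math.sqrt(d)
                  for kj in k]
        mx = max(scores)                      # stabilize the softmax
        exps = [math.exp(s - mx) for s in scores]
        z = sum(exps)
        weights = [e / z for e in exps]
        out.append([sum(w * vj[t] for w, vj in zip(weights, v))
                    for t in range(d)])
    return out
```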

@sanlares

Had similar issues with torch and SentenceTransformer. Solved it by creating a separate virtualenv and installing torch and SentenceTransformer before anything else.
