Replies: 2 comments 5 replies
- You're using Python 3.8; you need to reinstall and recompile xformers, and save the precompiled files for future use.
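A sketch of that rebuild-and-cache workflow, assuming a V100 (compute capability 7.0) — the paths and the `TORCH_CUDA_ARCH_LIST` value are illustrative and should be adjusted for your GPU:

```shell
# Build xformers from source for this machine's GPU and keep the wheel.
git clone https://github.com/facebookresearch/xformers.git
cd xformers
git submodule update --init --recursive
pip install -r requirements.txt

# TORCH_CUDA_ARCH_LIST=7.0 targets a V100; change it to match your card.
TORCH_CUDA_ARCH_LIST=7.0 python setup.py bdist_wheel

# The wheel in dist/ can be saved and reinstalled later without recompiling:
pip install dist/xformers-*.whl
```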
- Ah! Ok, giving that a try now. Does the pip installation work, or should it be built from https://github.com/facebookresearch/xformers?
- I'm having some trouble running the precompiled xformers wheel on a V100 GPU. The error message warns that I need to run `python setup.py build develop`, but that should be taken care of by having a precompiled package, right?

  ```
  Need to compile C++ extensions to get sparse attention suport. Please run python setup.py build develop
  ```

  and:

  ```
  RuntimeError: No such operator xformers::efficient_attention_forward_generic - did you forget to build xformers with 'python setup.py develop'?
  ```

  Thanks in advance, really appreciate the help!
Full traceback:

```
/home/zamaru/mainCharacter/sdenv/lib/python3.8/site-packages/xformers/_C.so: undefined symbol: _ZNK3c104impl13OperatorEntry20reportSignatureErrorENS0_12CppSignatureE
WARNING:root:WARNING: /home/zamaru/mainCharacter/sdenv/lib/python3.8/site-packages/xformers/_C.so: undefined symbol: _ZNK3c104impl13OperatorEntry20reportSignatureErrorENS0_12CppSignatureE
Need to compile C++ extensions to get sparse attention suport. Please run python setup.py build develop
  0%|          | 0/7200 [00:00<?, ?it/s]
zura zura
Traceback (most recent call last):
  File "/home/zamaru/mainCharacter/trainer/content/diffusers/examples/dreambooth/train_dreambooth.py", line 783, in <module>
    main()
  File "/home/zamaru/mainCharacter/trainer/content/diffusers/examples/dreambooth/train_dreambooth.py", line 648, in main
    noise_pred = unet(noisy_latents, timesteps, encoder_hidden_states).sample
  File "/home/zamaru/mainCharacter/sdenv/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1190, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/zamaru/mainCharacter/sdenv/lib/python3.8/site-packages/accelerate/utils/operations.py", line 507, in __call__
    return convert_to_fp32(self.model_forward(*args, **kwargs))
  File "/home/zamaru/mainCharacter/sdenv/lib/python3.8/site-packages/torch/amp/autocast_mode.py", line 14, in decorate_autocast
    return func(*args, **kwargs)
  File "/home/zamaru/mainCharacter/sdenv/lib/python3.8/site-packages/diffusers/models/unet_2d_condition.py", line 283, in forward
    sample, res_samples = downsample_block(
  File "/home/zamaru/mainCharacter/sdenv/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1190, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/zamaru/mainCharacter/sdenv/lib/python3.8/site-packages/diffusers/models/unet_blocks.py", line 565, in forward
    hidden_states = attn(hidden_states, context=encoder_hidden_states)
  File "/home/zamaru/mainCharacter/sdenv/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1190, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/zamaru/mainCharacter/sdenv/lib/python3.8/site-packages/diffusers/models/attention.py", line 167, in forward
    hidden_states = block(hidden_states, context=context)
  File "/home/zamaru/mainCharacter/sdenv/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1190, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/zamaru/mainCharacter/sdenv/lib/python3.8/site-packages/diffusers/models/attention.py", line 217, in forward
    hidden_states = self.attn1(self.norm1(hidden_states)) + hidden_states
  File "/home/zamaru/mainCharacter/sdenv/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1190, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/zamaru/mainCharacter/sdenv/lib/python3.8/site-packages/diffusers/models/attention.py", line 287, in forward
    out = xformers.ops.memory_efficient_attention(q, k, v, attn_bias=None, op=self.attention_op)
  File "/home/zamaru/mainCharacter/sdenv/lib/python3.8/site-packages/xformers/ops.py", line 626, in memory_efficient_attention
    return op.apply(query, key, value, attn_bias, p)
  File "/home/zamaru/mainCharacter/sdenv/lib/python3.8/site-packages/xformers/ops.py", line 257, in forward
    out, lse, rng_seed, rng_offset = cls.FORWARD_OPERATOR(
  File "/home/zamaru/mainCharacter/sdenv/lib/python3.8/site-packages/xformers/ops.py", line 45, in no_such_operator
    raise RuntimeError(
RuntimeError: No such operator xformers::efficient_attention_forward_generic - did you forget to build xformers with 'python setup.py develop'?
  0%|          | 0/7200 [00:03<?, ?it/s]
Traceback (most recent call last):
  File "/home/zamaru/mainCharacter/sdenv/bin/accelerate", line 8, in <module>
    sys.exit(main())
  File "/home/zamaru/mainCharacter/sdenv/lib/python3.8/site-packages/accelerate/commands/accelerate_cli.py", line 43, in main
    args.func(args)
  File "/home/zamaru/mainCharacter/sdenv/lib/python3.8/site-packages/accelerate/commands/launch.py", line 837, in launch_command
    simple_launcher(args)
  File "/home/zamaru/mainCharacter/sdenv/lib/python3.8/site-packages/accelerate/commands/launch.py", line 354, in simple_launcher
    raise subprocess.CalledProcessError(returncode=process.returncode, cmd=cmd)
```
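For anyone hitting the same pair of errors: the undefined-symbol warning in the traceback typically means the prebuilt `_C.so` was compiled against a different PyTorch build, and the "No such operator" failure follows from it. When recompiling, the CUDA architecture flag also has to match the GPU. A small illustrative helper (the table and function are my own sketch, not part of xformers or PyTorch) showing the `TORCH_CUDA_ARCH_LIST` value for a few common GPUs:

```python
# Map a GPU's CUDA compute capability to the TORCH_CUDA_ARCH_LIST value
# used when compiling CUDA extensions such as xformers.
COMPUTE_CAPABILITY = {
    "V100": (7, 0),
    "T4": (7, 5),
    "A100": (8, 0),
    "RTX 3090": (8, 6),
}

def arch_list_for(gpu_name: str) -> str:
    """Return the TORCH_CUDA_ARCH_LIST string for a known GPU name."""
    major, minor = COMPUTE_CAPABILITY[gpu_name]
    return f"{major}.{minor}"

# A wheel compiled only for, say, 8.6 will not expose working kernels
# on a V100, which needs 7.0:
print(arch_list_for("V100"))  # -> 7.0
```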