I believe the source of this error is this line in the flash-attention codebase. In my case, the import statement wrapped in the try-except block failed with `ModuleNotFoundError: No module named 'triton'`. Installing triton fixed the problem.
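To see which hidden dependency is actually missing, you can bypass the try-except and import the candidate modules directly, so the real `ModuleNotFoundError` surfaces instead of the generic assertion. A minimal sketch (the module names here are the usual suspects for this setup, not confirmed ones):

```python
import importlib

# Candidate modules the guarded imports may depend on.
# "rotary_emb" is the CUDA extension built alongside flash-attn;
# "triton" is a dependency the try-except block can silently mask.
for name in ("triton", "flash_attn", "rotary_emb"):
    try:
        mod = importlib.import_module(name)
        print(f"{name}: OK ({getattr(mod, '__version__', 'no __version__')})")
    except Exception as exc:  # surface the real error, not the assertion
        print(f"{name}: FAILED -> {type(exc).__name__}: {exc}")
```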
I tried the following code:
```python
from transformers import AutoConfig, AutoModelForCausalLM

model_name = 'togethercomputer/evo-1-8k-base'

model_config = AutoConfig.from_pretrained(model_name, trust_remote_code=True)
model_config.use_cache = True

model = AutoModelForCausalLM.from_pretrained(
    model_name,
    config=model_config,
    trust_remote_code=True,
)
```
This raises the error: `AssertionError: rotary_emb is not installed`.
However, rotary_emb v0.1 is installed, and flash-attention is installed as well.
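For reference, a quick way to confirm what pip actually installed (the distribution names below are assumptions; `rotary-emb` may not ship as a separate distribution and could instead be built into the flash-attn wheel):

```python
from importlib.metadata import version, PackageNotFoundError

# Distribution names as published on PyPI (assumed); adjust to your environment.
for dist in ("flash-attn", "triton", "rotary-emb"):
    try:
        print(dist, version(dist))
    except PackageNotFoundError:
        print(dist, "not installed as a separate distribution")
```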