ModuleNotFoundError: No module named 'transformers_modules.togethercomputer.evo-1-131k-base.9562f3fdc38f09b92594864c5e98264f1bfbca33.tokenizer' #53
What is the stack trace for the error you see when it crashes?
Thanks for your reply. I am still stuck with this error and could not use the Evo model yet. Here is the full trace; the same happens if I try to load the phase 2 checkpoint, or if I try to load from the Evo package instead of the auto class from HuggingFace.
To answer your question on the HuggingFace space, I tried both.
It is the first HuggingFace model imported from external classes that I have tried to run, so I had never come across such an error... Thanks for any hints!
@Zymrael in case it is relevant, I also tried to manually download the checkpoints and load from the local copies, but it didn't help (same error). I also tried updating transformers to the latest from the GitHub source, which produces the same error as well.
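In case it helps others hitting the same symptom: a stale copy of the auto-generated remote code under HuggingFace's `transformers_modules` cache can keep a broken `tokenizer` import around even after re-downloading the checkpoint. A minimal sketch of clearing that cache (the default path layout and the `clear_dynamic_module_cache` helper are assumptions, not something from this thread):

```python
import os
import shutil


def clear_dynamic_module_cache(org="togethercomputer", cache_root=None):
    """Delete the cached remote-code modules for one organization.

    transformers regenerates these files on the next
    from_pretrained(..., trust_remote_code=True) call.
    Returns True if something was deleted, False otherwise.
    """
    if cache_root is None:
        # default location used by recent transformers versions (assumption)
        cache_root = os.path.expanduser(
            "~/.cache/huggingface/modules/transformers_modules"
        )
    target = os.path.join(cache_root, org)
    if os.path.isdir(target):
        shutil.rmtree(target)
        return True
    return False
```

After clearing, re-run the loading code so the remote modules are fetched fresh.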
I have the same issue.
@oliverfleetwood did you make any progress?
Hello all, I had the same issue and found a workaround. The error I got:

**How to fix it**

Change

to
Try to load the model again, e.g.
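For reference, the loading call discussed in this thread looks roughly like the sketch below (the hypothetical `load_evo` wrapper is mine; adapt the model name and device handling to your setup):

```python
def load_evo(model_name="togethercomputer/evo-1-131k-base"):
    """Sketch of loading Evo through the HF auto classes.

    Requires transformers and enough memory for the checkpoint.
    trust_remote_code=True is what pulls in the repo's tokenizer
    and modeling code that the error in this thread complains about.
    """
    # imports kept inside the function so this file can be imported
    # without transformers installed
    from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

    config = AutoConfig.from_pretrained(model_name, trust_remote_code=True)
    tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        model_name, config=config, trust_remote_code=True
    )
    return model, tokenizer
```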
It should load the checkpoints. Make sure the model downloaded correctly from HF (https://huggingface.co/togethercomputer/evo-1-131k-base/tree/main) and check that your cache folder has all of those files in it, and then... it works. :)
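The cache check above can be done programmatically. A small sketch, where the expected file list is an assumption based on the repo tree linked above and `find_missing_files` is a hypothetical helper:

```python
import os

# assumed minimal set of files a complete local snapshot should contain,
# based on the repo tree linked above
EXPECTED_FILES = ["config.json", "tokenizer.py"]


def find_missing_files(snapshot_dir, expected=EXPECTED_FILES):
    """Return the expected files that are absent from a local snapshot dir."""
    present = set(os.listdir(snapshot_dir)) if os.path.isdir(snapshot_dir) else set()
    return [name for name in expected if name not in present]
```

If the returned list is non-empty, delete the snapshot and let `from_pretrained` re-download it.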
If the above doesn't work, try the other model.

**Environment**
Hope that helps!
@juliocesar-io thanks a lot, this fixed my issue! Following up on this, I have a couple of questions, please (maybe @Zymrael knows too?), regarding the code snippet below (a helper function for myself).

Thanks again for your assistance!
Hi all, and thanks for open-sourcing this interesting model!
I managed to install flash-attention and all the other packages, so I am able to import the Evo package.
But I am stuck with the following error:
ModuleNotFoundError: No module named 'transformers_modules.togethercomputer.evo-1-131k-base.9562f3fdc38f09b92594864c5e98264f1bfbca33.tokenizer'
This happens regardless of using the source code
or trying to load directly from HF
The error points to
transformers_modules.togethercomputer.evo-1-131k-base
regardless of which Evo checkpoint I select, and I tried to update transformers both to the latest and to "4.36.2" as shown in https://huggingface.co/togethercomputer/evo-1-131k-base/blob/main/generation_config.json. Any clue on how to solve this error, please? Thanks!