Estimate Required RAM for inference #76

Open
fmaa23 opened this issue Jun 24, 2024 · 0 comments

fmaa23 commented Jun 24, 2024

Hello,

Could someone please provide an estimate of the RAM required for inference as a function of sequence length? I tried running the example script on a machine with 16 GB of RAM, and it ran out of memory.
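
For context, my understanding is that the usual back-of-the-envelope estimate for decoder-only transformer inference is a fixed weight footprint plus a KV cache that grows linearly with sequence length. Below is a rough sketch of that calculation; all of the model parameters in it (7B parameters, 32 layers, hidden size 4096, fp16) are hypothetical placeholders rather than this repo's actual config.

```python
# Back-of-the-envelope RAM estimate for decoder-only transformer inference.
# All model parameters below are hypothetical placeholders -- substitute the
# actual config of the model being run.

def estimate_inference_ram_gb(
    n_params: float,            # total parameter count, e.g. 7e9 for a 7B model
    n_layers: int,              # number of transformer layers
    hidden_size: int,           # model (embedding) dimension
    seq_len: int,               # prompt length + generated tokens
    batch_size: int = 1,
    bytes_per_param: int = 2,   # 2 for fp16/bf16, 4 for fp32
    kv_bytes: int = 2,          # bytes per element of the KV cache
) -> float:
    # Weights are a fixed cost, independent of sequence length.
    weight_bytes = n_params * bytes_per_param

    # KV cache: two tensors (key and value) of shape
    # [batch, seq_len, hidden_size] per layer; grows linearly with seq_len.
    kv_cache_bytes = 2 * n_layers * batch_size * seq_len * hidden_size * kv_bytes

    # Rough 20% overhead for activations and workspace buffers.
    total_bytes = 1.2 * (weight_bytes + kv_cache_bytes)
    return total_bytes / 1024**3


if __name__ == "__main__":
    # Example: a hypothetical 7B-parameter fp16 model at a few sequence lengths.
    for seq_len in (2048, 8192, 32768):
        gb = estimate_inference_ram_gb(
            n_params=7e9, n_layers=32, hidden_size=4096, seq_len=seq_len
        )
        print(f"seq_len={seq_len:>6}: ~{gb:.1f} GB")
```

With numbers like these, the weights alone are around 13 GB in fp16, so 16 GB leaves very little headroom even at modest sequence lengths; the actual figures of course depend on this model's real configuration.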

Thank you.
