
Allocation on device RTX 4070 #49

Open
Milor123 opened this issue Jun 21, 2024 · 3 comments

Comments

@Milor123

Guys, is it not possible to use it on my RTX 4070 with 12 GB of VRAM?
I'm on Linux with the 4070 dedicated only to AI (my iGPU renders the desktop), and I have 48 GB of RAM.

I'd like to use it. I've tried bfloat16 and float16, but nothing works.
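For context, switching dtypes as tried above only changes the bytes per parameter, which alone rarely rescues a 12 GB card. A rough back-of-envelope sketch (the parameter count is a hypothetical SDXL-class figure, not measured from IDM-VTON):

```python
# Rough VRAM estimate for model weights alone, by dtype.
# The 2.6B parameter count is an assumed SDXL-class UNet size, NOT a
# measured IDM-VTON figure; activations, VAE, text/image encoders, and
# CUDA allocator overhead all add more on top of this.

BYTES_PER_PARAM = {"float32": 4, "bfloat16": 2, "float16": 2}

def weight_gib(n_params: int, dtype: str) -> float:
    """Return the weight footprint in GiB for a given dtype."""
    return n_params * BYTES_PER_PARAM[dtype] / 1024**3

n = 2_600_000_000  # assumed ~2.6B-parameter diffusion UNet
for dtype in BYTES_PER_PARAM:
    print(f"{dtype}: {weight_gib(n, dtype):.1f} GiB")
    # float32: 9.7 GiB; bfloat16/float16: 4.8 GiB each
```

Even at bfloat16, a pipeline with more than one UNet-sized component plus encoders and inference activations can exceed 12 GB, which is consistent with the allocation failure reported here.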

@yunlianwangluo

Same here :(
Maybe we should try better hardware...

@cardosofelipe
Contributor

12 GB of VRAM is not enough for this model. I'm running on an RTX 3090 with 24 GB, and just running IDM-VTON consumes about 80% of my VRAM. That is for inference only; for training, my GPU isn't even enough.

@txhno

txhno commented Aug 6, 2024

It's a problem with this implementation. If you use the original repository with the diffusers library, you can run the FP4 alternative, which runs even with 8 GB of VRAM.
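A quick sketch of why 4-bit weights change the picture (parameter count is again a hypothetical SDXL-class assumption, and "FP4" is taken here to mean 4 bits per weight):

```python
# Compare weight footprints at 16-bit vs 4-bit precision.
# The 2.6B parameter count is an assumed SDXL-class size, not a
# measured IDM-VTON figure.

def footprint_gib(n_params: int, bits: int) -> float:
    """Weight memory in GiB at `bits` bits per parameter."""
    return n_params * bits / 8 / 1024**3

n = 2_600_000_000  # assumed ~2.6B-parameter UNet
print(f"16-bit: {footprint_gib(n, 16):.1f} GiB")  # 4.8 GiB
print(f" 4-bit: {footprint_gib(n, 4):.1f} GiB")   # 1.2 GiB
```

At 4 bits the weights shrink to roughly a quarter of the 16-bit footprint, leaving headroom on an 8 GB card for activations and the remaining pipeline components, which is consistent with the claim above.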
