
Bug fixes for llava multimodal #5038

Open · wants to merge 22 commits into base: dev
Conversation

@szelok commented Dec 21, 2023

Checklist:

@szelok szelok changed the title Fix https://github.com/oobabooga/text-generation-webui/issues/5036 Bug fixes for llava multimodal Jan 4, 2024
@oobabooga oobabooga deleted the branch oobabooga:dev February 17, 2024 21:53
@oobabooga oobabooga closed this Feb 17, 2024
@oobabooga oobabooga reopened this Feb 17, 2024
@oobabooga (Owner)
@szelok could you merge the dev branch and check if the changes in this PR are still necessary?

@Victorivus (Contributor)

@oobabooga

> @szelok could you merge the dev branch and check if the changes in this PR are still necessary?

I just did; with only the dev branch it does not work:
[Screenshot, 2024-04-08 11:49:51]

After modifying modules/models.py with this PR's content, it works:

[Screenshot, 2024-04-08 12:00:51]

Note: I didn't touch openai/completions.py.

@randoentity (Contributor)

@oobabooga
This is still relevant as of commit a363cdf (origin/dev, Mon May 27 15:21:30 2024 +0300).

Fixed by cherry-picking 3af2cfbc3c198ddd6b351c27af868287c5eb354d from this PR.

One related error occurs when using --gpu-memory in combination with --load-in-8bit: LlavaLlamaForCausalLM doesn't define from_config. It works fine without bitsandbytes, or when --gpu-memory is not specified.
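The failure mode above can be sketched in isolation. This is a minimal, hypothetical illustration (the class names below are stand-ins, not the real transformers/LLaVA classes): code paths that build an empty model from its config via a `from_config` classmethod (as device-map computation does when memory limits are given) break when a model class doesn't provide that method, and a `hasattr` guard avoids the AttributeError.

```python
class BaseForCausalLM:
    """Stand-in for a model class that provides from_config."""

    def __init__(self, config):
        self.config = config

    @classmethod
    def from_config(cls, config):
        # Build an (empty) model instance from a config object alone.
        return cls(config)


class LlavaLikeForCausalLM:
    """Stand-in for the reported case: no from_config defined."""

    def __init__(self, config):
        self.config = config


def build_empty_model(model_class, config):
    # Guard against classes lacking from_config: fall back to
    # direct construction instead of raising AttributeError.
    if hasattr(model_class, "from_config"):
        return model_class.from_config(config)
    return model_class(config)


# Without the guard, this path would fail for the llava-like class:
#   LlavaLikeForCausalLM.from_config({})  -> AttributeError
m1 = build_empty_model(BaseForCausalLM, {})
m2 = build_empty_model(LlavaLikeForCausalLM, {})
```

Whether the real fix belongs in the loader or in the model class itself is a design choice; the PR's approach of patching modules/models.py corresponds to handling it at the loading layer.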
