
Does it support custom models? #18

Open
yuehengwu opened this issue Sep 13, 2023 · 1 comment

@yuehengwu

I tried to add the DoctorGPT model: I replaced the model files under /public/lib/vicuna-7b and updated config.json so that cacheUrl points to the local model at http://localhost:3000/lib/WebLLM/vicuna-7b/doctorGPT/.
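For reference, a config.json entry of the kind described might look like the sketch below. Only the cacheUrl value and the local URL come from this report; the surrounding structure and the other field names (models, localId) are assumptions for illustration, not necessarily the project's actual schema.

```jsonc
{
  "models": [
    {
      // Assumed identifier field; the real key name may differ.
      "localId": "doctorGPT",
      // cacheUrl as described above: points the loader at locally
      // hosted model files instead of the default remote ones.
      "cacheUrl": "http://localhost:3000/lib/WebLLM/vicuna-7b/doctorGPT/"
    }
  ]
}
```

If the error turns out to be a failed fetch, confirming that this URL actually serves the model files is a reasonable first check.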

However, the webpage shows an error:

[Screenshot 2023-09-13 18:57:08]
@Neet-Nestor

Consider trying https://github.com/mlc-ai/web-llm-chat instead, and open new model support requests as issues on the main web-llm repo.
