
Inference in batches and multiple GPU #163

Open
OmergottliebAB opened this issue Feb 12, 2024 · 1 comment

Comments

@OmergottliebAB
  1. I am trying to run inference on multiple GPUs, but I get an unauthorized-access-to-GPU error. Do I need to configure the repo accordingly? How do I use `CUDA_VISIBLE_DEVICES`?
  2. Can I run inference on batches of images?
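On the second question: independent of this repo, batched inference generally means chunking the input list and running the model once per chunk. The chunking itself can be sketched framework-agnostically; the image paths and batch size below are placeholders, not values from this thread:

```python
def batched(items, batch_size):
    """Yield successive fixed-size batches from a list of items."""
    for i in range(0, len(items), batch_size):
        yield items[i:i + batch_size]

# Example: 10 image paths, batch size 4 -> batches of 4, 4, 2
paths = [f"img_{i}.jpg" for i in range(10)]
sizes = [len(batch) for batch in batched(paths, 4)]
print(sizes)  # [4, 4, 2]
```

Each batch would then be preprocessed, stacked into a single tensor, and passed to the model in one forward call; whether the repo's inference entry point accepts such a stacked batch depends on its implementation.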
@edoproch

I ran inference on multiple GPUs on vast.ai and didn't have problems. The only thing I did was give execute privileges to `dist_test.sh`.
You can try either `chmod +x dist_test.sh` or `chmod 777 dist_test.sh`.
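For reference, a minimal launch sketch combining the permission fix above with `CUDA_VISIBLE_DEVICES` to restrict which GPUs the job can see. The `dist_test.sh` name comes from the comment above; the GPU indices, config, and checkpoint paths are placeholders, not values from this thread:

```shell
# Give the distributed test launcher execute permission (as suggested above)
chmod +x dist_test.sh        # or: chmod 777 dist_test.sh

# Expose only the GPUs this job should use (indices are an example);
# processes then see them re-numbered as cuda:0 and cuda:1
export CUDA_VISIBLE_DEVICES=0,1

# Launch with one process per visible GPU
# (config/checkpoint paths are placeholders)
./dist_test.sh configs/example_config.py checkpoints/model.pth 2
```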
