Is SeeKeR in interactive chat mode supposed to use the GPU?
Hi, I just installed SeeKeR in a new environment for ParlAI v1.6.0 and have been testing its responses to interactive chat queries. It seems pretty slow, and I noticed that GPU memory usage does not appear to change when I run it. I am using this command…
parlai i -mf zoo:seeker/seeker_dialogue_3B/model -o gen/seeker_dialogue --search-server 127.0.0.1:8080
where the search server I am running is the same one that I use for blenderbot2. Is there a command option that needs to be set to use the GPU? I am running under Windows 10.
Thanks!
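For what it's worth, ParlAI generally picks up CUDA automatically when the installed torch build reports a GPU as available, so a quick first check is whether this environment's torch can see the GPU at all. A one-liner along these lines (illustrative, not taken from the thread) would show that:

rem Illustrative check: a CPU-only torch wheel prints "None False" for the last
rem two values, which would explain why GPU memory usage never changes.
python -c "import torch; print(torch.__version__, torch.version.cuda, torch.cuda.is_available())"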

That fixed it!
But I need to note that with Windows you can’t just do a pip install parlai for a standard install. I needed to make a local copy of the requirements.txt file (I called it requirements-local.txt), where I had to comment out the sh==1.12.14 package to avoid an error (since sh is Linux-only) and change pyzmq==18.1.0 to pyzmq==18.1.1 to avoid an “invalid wheel” error. Then it installed, but there were still missing packages that I needed to install … before the parlai command would work. The parlai install brought in torch but no cudatoolkit, so I also reinstalled PyTorch with the toolkit, using the command from the PyTorch website, to be safe. For good measure, I also installed cudnn, but I don’t know whether it is needed.

Glad that worked, and thank you for the Windows install instructions! I’ll go ahead and close this for now, but please reopen if there are lingering concerns.
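For reference, a hypothetical sketch of the Windows workaround described above might look like the lines below. None of this is verbatim from the thread: the clone step, the editable install via setup.py, and the example PyTorch index URL are assumptions, and the exact PyTorch install line should be taken from the selector on pytorch.org for your CUDA version.

git clone https://github.com/facebookresearch/ParlAI.git
cd ParlAI
copy requirements.txt requirements-local.txt
rem edit requirements-local.txt by hand:
rem   - comment out the sh==1.12.14 line (the sh package is Linux-only)
rem   - change pyzmq==18.1.0 to pyzmq==18.1.1 (avoids the invalid-wheel error)
pip install -r requirements-local.txt
python setup.py develop
rem reinstall PyTorch with CUDA support, using the command from pytorch.org, e.g.:
pip install torch torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/cu113

With a CUDA-enabled torch in place, the parlai i command above should start allocating GPU memory without any extra flags, which matches the outcome reported in the thread.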