BlenderBot3: inference on a particular module
Hi, I was wondering how I can run inference on a particular module of the BB3 model. For example, the dialogue response generation module takes in "Full context + knowledge + memory sequences" and generates a response. How can I feed my own input to it?
Using the Hugging Face analogy, I'd tokenize an input string like f"{dialogue_context} {TOKEN_KNOWLEDGE} {knowledge} {TOKEN_END_KNOWLEDGE} {BEGIN_MEMORY} {memory_sequence} {END_MEMORY}" and pass it through the model. How can I accomplish this using ParlAI? I've also found the special tokens for BB3 here, but I'm not sure whether their usage is documented elsewhere (it would be great to have a guide for this).
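To make the idea concrete, here is a rough sketch of what I mean on the Hugging Face side; the checkpoint and the special-token strings below are just placeholders I made up (figuring out the real BB3 values is part of my question):

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Placeholder special-token strings -- I have not confirmed the actual
# values BB3 expects, which is exactly what I'm asking about.
TOKEN_KNOWLEDGE = "__knowledge__"
TOKEN_END_KNOWLEDGE = "__endknowledge__"
BEGIN_MEMORY = "__memory__"
END_MEMORY = "__endmemory__"

# Stand-in checkpoint; I'm not sure the BB3 weights are on the Hub.
tokenizer = AutoTokenizer.from_pretrained("facebook/blenderbot-400M-distill")
model = AutoModelForSeq2SeqLM.from_pretrained("facebook/blenderbot-400M-distill")

dialogue_context = "Hello! What did you get up to today?"
knowledge = "Hiking is a popular outdoor activity."
memory_sequence = "Partner's persona: I like hiking."

prompt = (
    f"{dialogue_context} {TOKEN_KNOWLEDGE} {knowledge} {TOKEN_END_KNOWLEDGE} "
    f"{BEGIN_MEMORY} {memory_sequence} {END_MEMORY}"
)
inputs = tokenizer(prompt, return_tensors="pt")
reply_ids = model.generate(**inputs, max_new_tokens=60)
print(tokenizer.decode(reply_ids[0], skip_special_tokens=True))
```

Essentially I'd like the equivalent of the above, but done the ParlAI way and with the correct BB3 special tokens.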
Thanks!

#4765 improves the documentation
Hi there, in #4746 I’ve written up a README describing the various ways in which to interact directly with the BB3 model (including how to show context to the model). Hopefully that answers your questions!
(Once that lands, you can find the information at this link.)
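In the meantime, here is a minimal sketch of one way to load a ParlAI model and feed it a hand-built context from Python; the zoo path below is illustrative rather than BB3-specific, and the README covers the exact model files and flags for each module:

```python
from parlai.core.agents import create_agent_from_model_file
from parlai.core.message import Message

# Illustrative zoo path -- see the BB3 README for the exact model files
# and required options for each module.
agent = create_agent_from_model_file("zoo:blender/blender_400Mdistill/model")

# Hand-build the context (e.g. with knowledge/memory appended to the text)
# and feed it as a normal observation.
agent.observe(Message({"text": "Hello, how are you?", "episode_done": False}))
reply = agent.act()
print(reply["text"])
```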