Multiple Users Using Parlai Chat Service
Hi, my task is to develop a service where multiple users can chat with BlenderBot 2. Here’s a quick overview of my general setup.
Entry Point: I’m using the websocket chat service as the entry point. I made the same modifications to socket.py as the person who posted this issue, so that the sid is set based on a user id that’s passed in.
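To make the idea concrete, here is a minimal sketch of the sid-from-user-id mapping. The class and names below are hypothetical stand-ins, not ParlAI's actual socket.py internals; the point is only that a client-supplied user id deterministically maps to one stable sid, so reconnects from the same user resolve to the same agent.

```python
# Hypothetical sketch: derive a stable sid from a client-supplied user id,
# so the chat service treats reconnects from the same user as one agent.
# Names here are illustrative, not ParlAI's socket.py API.

class SidRegistry:
    def __init__(self):
        self._sids = {}

    def sid_for(self, user_id):
        # Reuse the existing sid for a known user; otherwise derive a
        # deterministic one from the user id.
        if user_id not in self._sids:
            self._sids[user_id] = f"sid-{user_id}"
        return self._sids[user_id]

registry = SidRegistry()
print(registry.sid_for("alice"))  # sid-alice
```

Because the sid is derived from the user id rather than the connection, each user keeps a single agent identity across reconnects.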
World: My world file is essentially the same as the demo one here. A new world is created for each human agent that enters a chat, and a new BlenderBot agent is cloned for it. I am not passing in any message history, so all conversation history is handled by the BlenderBot agent.
Config: Here is my config file:

```yaml
tasks:
  default:
    onboard_world: MessengerBotChatOnboardWorld
    task_world: MessengerBotChatTaskWorld
    timeout: 1800
    agents_required: 1
task_name: chatbot
world_module: parlai.chat_service.tasks.chatbot.worlds
overworld: MessengerOverworld
max_workers: 100
opt:
  debug: True
  models:
    blenderbot2_3B:
      model: projects.blenderbot2.agents.blenderbot2:BlenderBot2FidAgent
      model_file: zoo:blenderbot2/blenderbot2_3B/model
      interactive_mode: True
      no_cuda: False
      override:
        search_server: https://www.google.com
additional_args:
  page_id: 1 # Configure Your Own Page
```
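Since YAML indentation is easy to mangle when copying a config around, a quick sanity check is to parse it and assert the fields the service relies on. This assumes PyYAML is installed; the excerpt inlined below is a trimmed copy of the config above.

```python
import yaml  # PyYAML: pip install pyyaml

# Trimmed excerpt of the chat-service config, inlined for a quick
# structural sanity check before launching the service.
excerpt = """
tasks:
  default:
    task_world: MessengerBotChatTaskWorld
    agents_required: 1
opt:
  models:
    blenderbot2_3B:
      model_file: zoo:blenderbot2/blenderbot2_3B/model
"""
config = yaml.safe_load(excerpt)
assert config["tasks"]["default"]["agents_required"] == 1
assert "blenderbot2_3B" in config["opt"]["models"]
print("config excerpt parses as expected")
```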
My issue:
The issue occurs when two users send a message to the bot simultaneously. Whenever this happens, one user’s chat breaks and I get this error message:
World default had error RuntimeError('Last observe() was episode_done, but we never saw a corresponding self_observe to clear the history, probably because you missed an act(). This was changed in #2043. File a GitHub issue if you require assistance.')
This only happens when users send messages simultaneously; if one user sends a message and another sends one after a delay, I get no errors and the bot agent’s memories/conversation history are unaffected. The delay doesn’t necessarily have to be long enough for the bot to finish responding to the first user; the error seems to occur only during one of the initial response phases.
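That symptom is consistent with a race on shared agent state: two users' observe()/act() calls interleave, so the agent sees two observe()s back to back before either act() runs. A blunt mitigation, sketched below with stand-in classes rather than ParlAI code, is to serialize each observe()+act() pair behind a lock.

```python
import threading

# Sketch: serialize observe()+act() pairs so two users' turns can never
# interleave on shared state. EchoAgent is a toy stand-in for the bot.

model_lock = threading.Lock()

class EchoAgent:
    """Toy agent whose observe/act must run as an atomic pair."""
    def __init__(self):
        self.pending = None

    def observe(self, msg):
        # Mirrors the failure mode in the traceback: a second observe()
        # before act() means a turn was dropped.
        assert self.pending is None, "observe() without a matching act()"
        self.pending = msg

    def act(self):
        reply, self.pending = f"echo: {self.pending}", None
        return reply

agent = EchoAgent()
replies = []

def user_turn(text):
    # Without the lock, two threads could both call observe() before
    # either calls act() -- exactly the interleaving that breaks the
    # shared history.
    with model_lock:
        agent.observe(text)
        replies.append(agent.act())

threads = [threading.Thread(target=user_turn, args=(f"msg{i}",)) for i in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(len(replies))  # 8 replies, no AssertionError
```

The trade-off is that responses are fully serialized, so one slow generation delays every other user; but it keeps the observe/act protocol intact without patching the agent itself.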
Additional context: I managed to stop this error by adding these lines at the start of the observe() function in torch_agent.py:

```python
if observation.get('episode_done'):
    observation.force_set('episode_done', False)
```
but this causes the bot’s persona memories to get corrupted (some of them start looking like `__NO__PERSONA__BEAM__MIN__LEN__20__`), although the conversation histories don’t seem to be affected.
Issue Analytics
- Created a year ago
- Comments: 14 (6 by maintainers)

I see; that is actually my current implementation. I wanted to see if calls could be done concurrently, but it seems getting that to work may be quite difficult. Either way, this issue seems to be resolved in BlenderBot 3, so I can just switch to that one. Thanks for the help with everything!
There are a lot of moving parts in BB2, which is part of the reason BB3 is the way it is. I’ll go ahead and close this issue, then; please reopen (or file a new one) if you feel the need to.