Limitation of the number of simultaneous events
I created a simple client/server socketio application. The server sends X events at the same time every second.
I can send up to 16 events at a time before the client gets disconnected and reconnected.
Is this a normal limitation of your library?
For my project I need to send a huge number of events simultaneously, so I was wondering whether I should switch to something else like gRPC instead.
Server
from threading import Thread
from time import sleep

import socketio
from flask import Flask

TICKER_EVENT = "TICKER_EVENT"
HOST = "localhost"
PORT = 6002
NB_EVENTS = 17

app = Flask(__name__)
sio = socketio.Server(async_mode="threading", logger=True)
app.wsgi_app = socketio.WSGIApp(sio, app.wsgi_app)

def send_events():
    sleep(3)
    while True:
        for _ in range(NB_EVENTS):
            sio.emit(TICKER_EVENT, {})
        sleep(2)

t = Thread(target=send_events)
t.start()

app.run(threaded=True, port=PORT, host=HOST)
Client
import socketio

TICKER_EVENT = "TICKER_EVENT"
HOST = "http://127.0.0.1:6002"

sio = socketio.Client()

@sio.on("connect")
def subscribe_to_tickers():
    print("connected")

@sio.on("disconnect")
def disconnect():
    print("disconnected")

@sio.on(TICKER_EVENT)
def handle_new_ticker(ticker):
    print("get ticker {}".format(ticker))

sio.connect(HOST)
Client output
get ticker {}
get ticker {}
disconnected
connected
disconnected
connected
disconnected
Issue Analytics
- Created 3 years ago
- Comments: 19 (8 by maintainers)
@timkeeler I am aware of problems with random disconnections when sending large payloads. I believe this to be a problem in the underlying WebSocket layer (either the client or the server; I have not done a thorough debugging of the problem yet). In general, WebSocket is not a good protocol for sending large amounts of data; stability decreases when you do that.
But in any case, I don’t agree that the CPU load comes from transmitting this data. Transmission of data is I/O, not CPU. If you do something with these hundreds of 1MB blocks (like concatenating them), you are likely the one causing all that CPU load.
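As a rough illustration of that point (the event name and handler below are hypothetical, not taken from this issue): appending incoming binary chunks to a list and joining them once is linear work, whereas rebuilding a buffer with += on every event re-copies everything received so far, and that copying is pure CPU load on top of the I/O that delivered the data.

chunks = []

@sio.on("large_payload")
def handle_large_payload(data):
    # data is assumed to be ~1MB of bytes per event; appending does not copy it
    chunks.append(data)
    # doing buffer += data here instead would re-copy the whole buffer on
    # every event, which is where the CPU time goes

def assemble_payload():
    # join once, after all chunks have arrived
    return b"".join(chunks)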
@adrian410 the problem is that you are receiving connections in one thread and emitting from another. The background thread is broadcasting to all clients, and it seems that sometimes the broadcast includes a client that is in the process of connecting, but isn’t fully connected yet. So this is a race condition caused by your threading choice.
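To make that concrete, here is a sketch of one possible mitigation while staying on the threading setup: keep a set of session ids whose connect handler has already completed and emit to those ids individually, instead of broadcasting to every client the server knows about. The connected_sids set, the lock, and the per-sid emit are illustrative assumptions, not a fix verified in this thread; the eventlet conversion discussed next is the recommended route.

from threading import Lock

connected_sids = set()
sid_lock = Lock()

@sio.on("connect")
def on_connect(sid, environ):
    # a client is only added once its connect handler has run
    with sid_lock:
        connected_sids.add(sid)

@sio.on("disconnect")
def on_disconnect(sid):
    with sid_lock:
        connected_sids.discard(sid)

def send_events():
    sleep(3)
    while True:
        with sid_lock:
            targets = list(connected_sids)
        for sid in targets:
            for _ in range(NB_EVENTS):
                # room=sid sends to a single client instead of broadcasting
                sio.emit(TICKER_EVENT, {}, room=sid)
        sleep(2)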
I converted your server to use eventlet, and in that case the race did not occur. While this is clearly a bug in the server, given that it only occurs when multiple threads are used, fixing it is not a high priority for me. I suggest you switch to one of the supported async frameworks (eventlet, gevent, tornado, etc.), which I assume you were going to do anyway, since the threading mode isn’t production ready.
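For reference, a minimal sketch of what that eventlet conversion could look like, reusing the constants from the report; the exact code the maintainer ran was not posted, so treat this as an approximation. The raw Thread and time.sleep are replaced by sio.start_background_task and sio.sleep, and the app is served by eventlet instead of app.run:

import eventlet
import socketio
from flask import Flask

TICKER_EVENT = "TICKER_EVENT"
HOST = "localhost"
PORT = 6002
NB_EVENTS = 17

app = Flask(__name__)
sio = socketio.Server(async_mode="eventlet", logger=True)
wsgi_app = socketio.WSGIApp(sio, app.wsgi_app)

def send_events():
    sio.sleep(3)
    while True:
        for _ in range(NB_EVENTS):
            sio.emit(TICKER_EVENT, {})
        sio.sleep(2)

# runs send_events in a greenlet managed by the async framework
sio.start_background_task(send_events)
eventlet.wsgi.server(eventlet.listen((HOST, PORT)), wsgi_app)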