I am sending a lot of data for live rendering of a heatmap plot, and the outgoing buffer fills up,
so the interface starts lagging behind the backend. In some cases an error even occurs:
> IOPub message rate exceeded.
> The Jupyter server will temporarily stop sending output
> to the client in order to avoid crashing it.
> To change this limit, set the config variable
> `--ServerApp.iopub_msg_rate_limit`.
>
> Current values:
> ServerApp.iopub_msg_rate_limit=1000.0 (msgs/sec)
> ServerApp.rate_limit_window=3.0 (secs)
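For reference, these limits can be raised in `jupyter_server_config.py` (or passed as the equivalent `--ServerApp.*` command-line flags). A minimal sketch with illustrative values:

```python
# jupyter_server_config.py -- illustrative values, not a recommendation.
# Raising the limits only silences the warning; it does not help if the
# kernel keeps outpacing the frontend.
c.ServerApp.iopub_msg_rate_limit = 10000.0  # msgs/sec (default 1000.0)
c.ServerApp.rate_limit_window = 3.0         # window in secs over which the rate is measured
```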
Is there a way to check whether the buffer still has unsent data, so I can avoid sending too often?
For now I've come up with a crude workaround: ping the frontend every 60 frames and wait for its reply
(see the sketch below), but this causes a freeze roughly once a second,
and if I ping on every frame, the FPS drops from 40 to 10.
In fact, I don't need to wait for a response from the frontend at all.
I want to push data one way at maximum speed without the buffer getting clogged.
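A minimal sketch of the ping-every-N-frames workaround described above. `send_frame` and `wait_for_ack` are hypothetical placeholders for whatever transport is actually used (a Comm, a widget trait, etc.), not a real Jupyter API:

```python
PING_EVERY = 60  # frames between blocking round trips to the frontend

def stream_heatmap(frames, send_frame, wait_for_ack):
    """Push heatmap frames one way; every PING_EVERY-th frame, block
    until the frontend acknowledges so the IOPub backlog can drain.

    send_frame(frame)  -- fire-and-forget send (placeholder)
    wait_for_ack()     -- blocking round trip (placeholder); this round
                          trip is what causes the periodic freeze
    """
    for i, frame in enumerate(frames, start=1):
        send_frame(frame)          # one-way update, no reply expected
        if i % PING_EVERY == 0:
            wait_for_ack()         # pauses the sender roughly once a second
```

The round trip exists only to pace the sender; the question is whether the unsent backlog could be observed directly instead, so no blocking acknowledgement would be needed.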