Streaming connection suddenly stops pushing events

Hi.

We are currently using two connections for the same streaming endpoints.
(Please note that I've posted a question regarding the API limits in

to avoid possible issues in this regard.)

It seems that one of the connections to the same streaming endpoint can suddenly stop pushing events even though the connection remains open. We have also observed that it may recover after a couple of hours.

Is there a reason for this behaviour and what would be the best way to recover more quickly?

Your help would be very much appreciated.

We have increased the number of concurrent connections to 7. If you create more, the longest-serving connection will be invalidated as the new one is created.
If you try to make too many connections over a short period of time, you will receive a back-off error and will have to wait a short period before attempting to re-establish a connection.
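
For what it's worth, here is a minimal sketch of how a client could respect that back-off before retrying (Python with `requests`; the endpoint URL, the assumption that the back-off error arrives as HTTP 429, and the key handling are all illustrative, not taken from the docs):

```python
import time
import requests

API_KEY = "your-api-key"  # placeholder; assumed to be sent as the basic-auth username

def connect_with_backoff(url, max_wait=300):
    """Open a streaming connection, waiting progressively longer whenever we are told to back off."""
    wait = 1
    while True:
        resp = requests.get(url, auth=(API_KEY, ""), stream=True, timeout=(10, 60))
        if resp.status_code == 429:        # assumed status code for the back-off error
            resp.close()
            time.sleep(wait)
            wait = min(wait * 2, max_wait)  # exponential back-off, capped
            continue
        resp.raise_for_status()
        return resp
```

The read timeout passed at connect time also comes in handy for spotting a stalled stream, as sketched further down.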

There is nothing in our streaming API that discriminates between active connections such that one receives events while another does not. It's a pub/sub model, so if you are connected and subscribed you will receive whatever is published.
This is difficult to diagnose from the limited information you have provided, without knowing more about what your application is doing.
If you are maintaining the connection correctly and it’s alive, you should receive traffic.
Could you be blocking yourself?
Is it possible your connection is dying and you have code which re-establishes a second connection - which you are seeing as recovery?
It would be useful to know if you are receiving any errors or what your logs are saying.
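
In case it helps with answering those questions, here is a rough sketch of logging every event and treating prolonged silence as a stall instead of waiting hours for recovery (Python with `requests`, relying on the read timeout set when the connection was opened; the assumption that the stream is newline-delimited JSON with empty heartbeat lines is mine, not from the docs):

```python
import json
import logging
import requests

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("stream-watchdog")

def consume(resp, handle_event):
    """Read the stream line by line; a read timeout or dropped socket is treated as a stall."""
    try:
        for line in resp.iter_lines():
            if not line:
                log.debug("heartbeat")      # assumed keep-alive newline with no payload
                continue
            handle_event(json.loads(line))
            log.info("event received")
    except requests.exceptions.RequestException as exc:
        # The read timeout set at connect time fires here if the server goes silent,
        # so a stall shows up in the logs instead of going unnoticed for hours.
        log.warning("stream stalled or dropped: %s", exc)
    finally:
        resp.close()
```

The caller can then reconnect (with back-off) straight away rather than waiting for the connection to recover on its own.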

Thank you!!! I think the hint about the 7 concurrent connections is the solution to my issue: at the moment we establish 8 connections in total (2 for each of the following stream endpoints: psc, filings, officers, companies), so the oldest connection gets closed. Because we have reconnect logic, the reconnect attempt for the connection that was just closed then closes another one after a delay, and so on…
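
For anyone who runs into the same eviction cascade: the simplest fix seems to be budgeting connections below the cap, e.g. one per stream, so a reconnect can never evict a healthy connection. A sketch building on the two snippets above (the base URL and path segments here are assumptions based on the endpoint names mentioned in this thread, not the exact documented paths):

```python
import threading

BASE = "https://stream.companieshouse.gov.uk"  # assumed streaming base URL; check the docs
# One connection per stream keeps us at 4 concurrent connections, comfortably
# under the 7-connection cap, so reconnecting one stream never evicts another.
STREAMS = ["psc", "filings", "officers", "companies"]  # placeholder path segments

def run_stream(name):
    while True:
        resp = connect_with_backoff(f"{BASE}/{name}")   # from the back-off sketch above
        consume(resp, handle_event=lambda event: print(name, event))

threads = [threading.Thread(target=run_stream, args=(s,), daemon=True) for s in STREAMS]
for t in threads:
    t.start()
for t in threads:
    t.join()
```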
