Is your feature request related to a problem? Please describe.
I use the universal snapshot method every minute to download 1-minute aggregate bars for all stocks, run some analysis on them, and flag any stock that I then need second-level data for. These flagged stocks (~100 by the end of the day) are stored in a list.
Describe the solution you'd like
I would like to be able to run get_aggs every 5 seconds for a whole list of stocks with a single call.
A universal snapshot at the second level would be the best-case scenario.
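For illustration only, something like the hypothetical call shape below is what I have in mind (get_aggs_batch does not exist today and the parameter names are made up; this is just to illustrate the request, not real client API):

```python
from polygon import RESTClient

client = RESTClient("API_KEY")
watchlist = ["AAPL", "MSFT"]  # grows to ~100 tickers during the day

# One request that returns the latest 5-second bar for every ticker in the list
# (hypothetical method, shown only to describe the feature being requested)
bars = client.get_aggs_batch(tickers=watchlist, multiplier=5, timespan="second")
```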
Describe alternatives you've considered
I tried using the websocket API (second-level aggregates), but I could not get it to work such that, once the list is updated, the existing connection is closed and a new connection is opened with the updated list. I think (please correct me if I am wrong) this is because the close method is asynchronous. I tried using asyncio, but I could not make it work without rewriting a significant part of my existing code. This post (https://polygon.io/blog/pattern-for-non-blocking-websocket-and-rest-calls-in-python) helped me understand the problem a bit better, but I am still not entirely clear on it. My entire current program runs on threading and the REST API.
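Roughly the pattern I was attempting, with the websocket client running in a background thread as in the linked blog post. I am assuming here that WebSocketClient's subscribe()/unsubscribe() can modify a live connection so it never has to be torn down (please correct me if that is not the intended way to change the ticker list):

```python
import threading
from polygon import WebSocketClient

watchlist = ["AAPL", "MSFT"]  # updated every minute by the REST/analysis side

ws = WebSocketClient(
    api_key="API_KEY",
    subscriptions=[f"A.{t}" for t in watchlist],  # "A." = per-second aggregates
)

def handle_msg(msgs):
    for m in msgs:
        ...  # handle each second-level aggregate

# run() blocks, so it lives in its own thread next to my existing threaded code
threading.Thread(target=ws.run, args=(handle_msg,), daemon=True).start()

def update_watchlist(added, removed):
    # Adjust subscriptions in place instead of closing and reopening the socket
    if removed:
        ws.unsubscribe(*[f"A.{t}" for t in removed])
    if added:
        ws.subscribe(*[f"A.{t}" for t in added])
```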
I also tried running get_aggs (with a 5-second interval) every 5 seconds for each stock in the list, but, unsurprisingly, that turned out to be far too slow once the list grows.
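To make that concrete: even fanning the per-ticker calls out over a thread pool (not exactly what I did, just a sketch; the dates and limit are placeholders) still means ~100 separate REST round trips every 5 seconds:

```python
from concurrent.futures import ThreadPoolExecutor
from datetime import date
from polygon import RESTClient

client = RESTClient("API_KEY")
watchlist = ["AAPL", "MSFT"]  # grows to ~100 tickers by the end of the day
today = date.today().isoformat()

def fetch(ticker):
    # Latest 5-second bars for one ticker; one HTTP request per ticker
    return ticker, client.get_aggs(ticker, 5, "second", today, today, limit=10)

with ThreadPoolExecutor(max_workers=20) as pool:
    latest = dict(pool.map(fetch, watchlist))
```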
In the worst case, I am planning to use the websocket API to get second-level data for all tradeable stocks, write the aggregated data to an on-disk SQLite database every 5 seconds, and have another process read the necessary rows for further analysis. This will certainly pull a lot of unnecessary data, since I only need to analyze ~100 stocks by the end of the day. If there are any obvious inefficiencies with this plan, please let me know.
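Rough shape of that worst-case writer, in case it helps: buffer incoming aggregates in memory and flush them to SQLite every 5 seconds (the schema, field order, and buffer are placeholders, not polygon's actual message fields):

```python
import sqlite3
import threading
import time

buffer = []            # (ticker, ts, open, high, low, close, volume) tuples
buffer_lock = threading.Lock()

def flush_every_5s(db_path="aggs.sqlite"):
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS aggs "
        "(ticker TEXT, ts INTEGER, open REAL, high REAL, "
        "low REAL, close REAL, volume REAL)"
    )
    while True:
        time.sleep(5)
        with buffer_lock:
            rows = buffer[:]
            buffer.clear()
        if rows:
            con.executemany("INSERT INTO aggs VALUES (?,?,?,?,?,?,?)", rows)
            con.commit()

# The websocket handler appends rows to `buffer`; this thread drains it to disk
threading.Thread(target=flush_every_5s, daemon=True).start()
```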
Additional context
If there is another/better way to solve this problem, please let me know -- I would be very grateful.