feature-requests's Issues

Historical queries for the Snapshot Options API

Is your feature request related to a problem? Please describe.
Currently, the Snapshot Options API only returns the current state of an option contract, not its historical states. Historical states are vital for strategy formulation/backtesting and for powering option chain analytical dashboards.

(1) Historical open_interest is currently not available. Knowing the change in the daily open_interest for an option contract is particularly useful to determine if contracts are being opened/closed/exchanged (when compared with contract volume).

(2) Historical implied_volatility is currently not available. This is incredibly useful to determine the implied volatility percentile/rank for a given contract/underlying stock.

Describe the solution you'd like
Addition/support for an optional date range query for the Snapshot - Option Contract: /v3/snapshot/options/{underlyingAsset}/{optionContract} endpoint as per your other endpoints:

Describe alternatives you've considered
(1) I currently do not have a way to retrieve historical open interest programmatically. My broker displays it on their platform, but it cannot be retrieved via their API. I have considered pulling data from Polygon once a day, but this is not feasible/scalable, especially when I need data from two years back.

(2) I currently pull historical stock prices and their one-month-forward option contract prices, and solve for the implied volatility using the Black-Scholes formula. This requires a lot of computation. It would be much easier/faster if Polygon offered historical implied volatility for option contracts.
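The workaround in (2) can be sketched as follows: price a call with Black-Scholes and back out the implied volatility with a simple bisection search. This is a minimal illustration with made-up inputs, not code against any Polygon endpoint.

```python
# Minimal sketch: recover implied volatility from an option price via
# Black-Scholes and bisection. Inputs are illustrative values only.
from math import log, sqrt, exp, erf

def _norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call_price(s, k, t, r, sigma):
    """Black-Scholes price of a European call (no dividends)."""
    d1 = (log(s / k) + (r + 0.5 * sigma**2) * t) / (sigma * sqrt(t))
    d2 = d1 - sigma * sqrt(t)
    return s * _norm_cdf(d1) - k * exp(-r * t) * _norm_cdf(d2)

def implied_vol(price, s, k, t, r, lo=1e-4, hi=5.0, tol=1e-8):
    """Bisect for the sigma that reproduces the observed price."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if bs_call_price(s, k, t, r, mid) > price:
            hi = mid
        else:
            lo = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Round-trip check: price a call at sigma = 0.30, then recover it.
price = bs_call_price(100, 100, 0.25, 0.01, 0.30)
print(round(implied_vol(price, 100, 100, 0.25, 0.01), 4))
```

Doing this for every contract and every day is exactly the computational burden the request above hopes to avoid.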

Additional context
The 13/52 week Implied Volatility Percentile/Rank is widely used/displayed and may potentially be of some value if offered.

P.S. Really appreciate the Polygon team's work on the options API endpoints thus far!

OTC stocks: distinguish between Expert Market stocks and other OTC tiers

Is your feature request related to a problem? Please describe.
The Ticker Details endpoint (/v3/reference/tickers/{ticker}) does not provide information about which OTC tier an OTC ticker belongs to. As a result, there is no way to distinguish between Expert Market OTC stocks and "regular" OTC stocks. This information is relevant because trading in Expert Market stocks is restricted, unlike stocks categorized under the other OTC tiers (i.e. OTCQX, OTCQB, Pink Current Information, Pink Limited Information).

Describe the solution you'd like
There should be a way to find out whether or not an OTC stock is classified as Expert Market. One way to do this is to add an additional field to the response of the Ticker Details endpoint (/v3/reference/tickers/{ticker}).

Add float shares to stock financial

I was searching for share float in your data, but it looks like it's missing completely. I would rather stick with Polygon and get that data here instead of moving to another provider that has it.

Data like average volume could also be useful (it could be computed from the data that already exists).

Additional Flag On Aggregate Bars To Include Bars With No Trades

Is your feature request related to a problem? Please describe.
Add an additional flag to the aggregate bars API endpoint to allow the inclusion of bars that have no eligible trades, which would incidentally leave the OHLC with no value. The purpose would be to provide accurate volume data for each bar in the time range (so that, when summed for a given day, the volumes add up to the daily volume).

My suggestion would be to send NULL values for the OHLC to indicate a lack of eligible trades (and avoid confusion with the actual value price being zero dollars).

Alternatively, an additional field could be added to the bar record containing a count of trades during that period, which would be 0, letting the user know to skip that bar when doing certain calculations.

Since this flag would default to false (not sending empty bars, the current behavior), the change would be backwards compatible and would not break existing logic that uses this endpoint.
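A consumer of the proposed empty bars (null OHLC, nonzero volume, zero trade count) might handle them like this; the bar shape below is illustrative, not the actual response schema:

```python
# Sketch of consuming the proposed empty bars: OHLC fields are None
# (JSON null) when no eligible trades occurred, but their volume still
# counts toward the daily total. Bar dicts here are illustrative.
bars = [
    {"o": 10.0, "h": 10.2, "l": 9.9, "c": 10.1, "v": 500, "n": 7},
    {"o": None, "h": None, "l": None, "c": None, "v": 120, "n": 0},  # proposed empty bar
    {"o": 10.1, "h": 10.3, "l": 10.0, "c": 10.2, "v": 800, "n": 11},
]

daily_volume = sum(b["v"] for b in bars)        # includes empty bars
priced_bars = [b for b in bars if b["n"] > 0]   # skip empties for OHLC-based math
print(daily_volume, len(priced_bars))
```

The `n` field (trade count) is the disambiguator suggested above: calculations that need prices skip `n == 0` bars, while volume sums keep them.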

Describe the solution you'd like
Currently it is not possible to get an accurate daily volume by requesting one-minute bars for a given day and summing their volumes. Accurate per-bar volume is required for certain calculations (e.g., VWAP, volume oscillators).

Describe alternatives you've considered
The only other way to fetch the volume data for a given day is via the Daily Open / Close endpoint which lacks one-minute granularity.

Additional context
Related to polygon-io/issues#70

Stocks APIs - Multiple/Array Tickers in requests and responses

It is frustrating when we need to make the same call repeatedly for different tickers. For example, if we want aggregates for the same period and options but for 1,000 tickers, it requires 1,000 API calls. It would be much nicer if an array of tickers could be passed in and an array of results returned. This applies to ALL stock endpoints.

Varying tickers are the biggest frustration, but there are times when other fields could also be arrays, for example the timestamp of Quotes NBBO VX (I want the last quote for a given ticker at an array of timestamps).

Example:
POST: /v2/aggs/tickers
BODY:
{
"tickers": ["MSFT","AAPL","TSLA"],
"multiplier": 2,
"from": "2021-01-02",
"to": "2021-01-03"
}

OR (this would have query length limitations but would still be an improvement)
GET: /v2/aggs/tickers/range/{multiplier}/{timespan}/{from}/{to}?tickers=AAPL,TSLA,MSFT,...
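Until a batch endpoint exists, the usual client-side workaround is to issue the per-ticker calls concurrently. The sketch below uses a stand-in stub where real code would perform the HTTP request to `/v2/aggs/ticker/{ticker}/range/...`:

```python
# Workaround sketch: parallelize single-ticker aggregate calls.
# fetch_aggs is a stub standing in for a real HTTP request.
from concurrent.futures import ThreadPoolExecutor

def fetch_aggs(ticker):
    # Real code would call:
    # requests.get(f"https://api.polygon.io/v2/aggs/ticker/{ticker}/range/2/day/2021-01-02/2021-01-03")
    return {"ticker": ticker, "results": []}

tickers = ["MSFT", "AAPL", "TSLA"]
with ThreadPoolExecutor(max_workers=8) as pool:
    results = {r["ticker"]: r for r in pool.map(fetch_aggs, tickers)}
print(sorted(results))
```

This still burns one API call per ticker, which is precisely the cost the batch request above would eliminate.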

Endpoint for historical market holidays / status

Is your feature request related to a problem? Please describe.
I sometimes want to find the last trading day before the current day. Normally one can just look up the previous business day, but market holidays complicate things.

Describe the solution you'd like
I would love to see another endpoint similar to v1/marketstatus/now and v1/marketstatus/upcoming that allows for querying past dates. Perhaps v1/marketstatus/history. The parameters for this endpoint would most likely be start (date) and end (date), similar to the aggs endpoint.

Describe alternatives you've considered
The best alternative currently seems to be to call the previous-close endpoint and parse the date from the t field, but that requires specifying a specific ticker even though we want a market-wide value.
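Without a history endpoint, the fallback is walking back over weekends plus a hand-maintained holiday set, which is exactly the upkeep the proposed `v1/marketstatus/history` would remove. A minimal sketch (holiday dates are illustrative):

```python
# Workaround sketch: previous trading day via weekday logic plus a
# manually maintained holiday set (dates illustrative; this set is the
# part a historical market-status endpoint would replace).
from datetime import date, timedelta

HOLIDAYS = {date(2022, 1, 17), date(2022, 2, 21)}  # must be kept up to date by hand

def previous_trading_day(d):
    d -= timedelta(days=1)
    while d.weekday() >= 5 or d in HOLIDAYS:  # 5, 6 = Saturday, Sunday
        d -= timedelta(days=1)
    return d

print(previous_trading_day(date(2022, 1, 18)))  # skips MLK Day and the weekend
```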

Use consistent IDs for companies

Is your feature request related to a problem? Please describe.

I was getting errors when using the API and found they came from trying to load data for ticker BFYT. Only after some investigation did I find that BFYT was previously ticker HIIQ; the symbol changed on March 5th, 2020. In Polygon.io, the data for HIIQ stops March 5th and the data for BFYT starts March 6th, with no link between them.

Describe the solution you'd like

As discussed in the support here, it would be really great to use consistent ids for companies and even have a field that keeps track of ticker symbol changes.

Describe alternatives you've considered
Quantopian does something like this. They use SIDs for companies. However, I don't believe they keep track of ticker changes.

Because we need to backtest our strategies, bid and ask for every candlestick is needed.

Hi,

I hope you are doing well.
This repository is amazing.

Just a little suggestion:
Because we need to backtest our strategies, bid and ask prices for every candlestick are needed. Each candlestick's bid and ask stand for the aggregated ticks over the specific time frame.
You could add a query-string parameter for it.
For example:
price="M" stands for getting the data with the middle value between bid and ask.
price="B" stands for bid
price="A" stands for ask
price="BA" stands for both.
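The four proposed price modes could be derived from bid/ask ticks like this; the tick shape is illustrative, not a real response schema:

```python
# Sketch of the proposed price= modes, applied to illustrative bid/ask ticks.
ticks = [
    {"bid": 99.8, "ask": 100.2},
    {"bid": 99.9, "ask": 100.1},
]

def series(ticks, mode):
    if mode == "B":                                        # bid
        return [t["bid"] for t in ticks]
    if mode == "A":                                        # ask
        return [t["ask"] for t in ticks]
    if mode == "M":                                        # midpoint
        return [(t["bid"] + t["ask"]) / 2 for t in ticks]
    if mode == "BA":                                       # both
        return [(t["bid"], t["ask"]) for t in ticks]
    raise ValueError(mode)

print(series(ticks, "M"))
```

Candlesticks would then be aggregated from whichever series the `price` parameter selects.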

stream well known indicators RSI, MACD Histogram, EMA

Is your feature request related to a problem? Please describe.

The feature request is not related to a problem.

Describe the solution you'd like

I would like to save time coding the processing of streamed data. It would help to either get sample code for computing some of the most-used technical indicators (RSI, MACD histogram, EMAs) in real time, or to have those indicators streamed directly.

Describe alternatives you've considered

The only alternative is to code it.


Adjusted Prices to Include Dividend Adjustment

It appears the adjusted prices only account for splits, for fully adjusted prices dividends should also be accounted for.

It would be useful to be able to request unadjusted, split-adjusted, split & dividend-adjusted prices

Adding Polygon server timestamp

Is your feature request related to a problem? Please describe.
Polygon websocket latencies tend to be very volatile during data spikes. Polygon.io claims an average latency of 25 ms, but in reality the latency distribution is ugly: latency sometimes goes well above 100 ms, even 500 ms, around market open, market close, order imbalances, and so on.
For HFT-like applications working in real time, it is very difficult to rely on the existing SIP timestamp (I am not speaking of the exchange and TRF ones), because Polygon.io servers dispatch the data with variable, not constant, latency.

Describe the solution you'd like
Each message should include a Polygon server timestamp. We would then be able to accurately estimate the final latency for a given server (VPS, dedicated) or even a PC, anywhere in the world. This would definitely make Polygon data more valuable, and not only a big historical database.

Describe alternatives you've considered
No rock-solid alternative.

Additional context
Nothing to add.

Market data channels - provide cumulative snapshot or last good aggregate(s) upon subscription

Stitching history together with real-time data presents a lot of challenges. E.g., when subscribing to 1-second aggregates I have to skip until the next 1-minute period starts, cache newly received 1-second aggregates for some time to ensure the market data endpoint returns an aggregate for the past minute, and only then concatenate history with real-time aggregates. And when the connection breaks, or for whatever reason needs to be re-established, the steps above have to be repeated.

Please implement a solution that returns a cumulative snapshot or emits all aggregates since the start of the minute.

  • When subscribing to 1-second market data channel at 9:30:30am emit either
  1. one aggregate containing OHLC + volume from 9:30:00 to 9:30:30
  2. all 1-second aggregates from 9:30:00 to 9:30:30
  • When subscribing to 1-minute market data channel at 9:30:30am emit an aggregate for period 9:29:00 - 9:30:00

Request for Aggregates ( Bars ) to have a switch to turn off results outside of RTH

Is your feature request related to a problem? Please describe.
no

Describe the solution you'd like
Add a switch for Aggregates (Bars) to turn off results outside of regular trading hours.

Describe alternatives you've considered
The alternative is to code a routine to discard those bars.

Additional context
I typically throw away bars outside of regular trading hours due to low volume on those.

Trade (tick) timestamps do not follow json conventions (causing overflow)

URL
https://api.polygon.io/v2/ticks/stocks/trades/A/2010-01-06?reverse=false&limit=50000&timestamp=1262812349393999878&apiKey=...

Result
The result at first glance looks textually correct; however, the timestamps must be quoted, because JSON parsers convert numbers to doubles under the covers.

{"results":[{
    "t":1262812349394000000,  <- note the lack of quotes in the original
    "q":2737306,
    "i":"",
    "x":4,
    "r":12,
    "s":100,
    "c":[10,12,2],
    "p":30.8668,
    "z":1}],  
    ...
}

This causes a timestamp like 1262812349394000000 to be converted to a double, whose mantissa does not have full resolution for the number. The stamp becomes 1262812349393999878 after the double conversion (close, but not exact).

For better or worse, the time stamp must be quoted. So instead should be:

{"results":[{
    "t":"1262812349394000000",   <- note the quotes
    "q":2737306,
    "i":"",
    "x":4,
    "r":12,
    "s":100,
    "c":[10,12,2],
    "p":30.8668,
    "z":1}],  
    ...
}

Expected Result
Timestamps must be quoted for proper json parsing.

Additional Notes
Javascript and many JSON parsers represent the numeric type as a double. This means that most parsers will return the wrong timestamp value (out to some resolution) due to the reduced resolution of 52 bits in the IEEE floating point representation. By convention most market data providers streaming JSON will either:

  • quote the epoch timestamp (be it in ms or ns) OR
  • use ISO 8601 style timestamps

I think the most compact form would be to continue to use the epoch time in nanoseconds, but quote it so that Javascript, Java-based JSON parsers, and other parsers following the Javascript convention of holding numeric in a double do not lose resolution when parsing. This would present a minor format change and might break some code, however. APIs that expect a numeric value rather than a string-based-long might fail to parse the messages without a minor adjustment to convert the string to a long.
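Python's `json` module happens to parse unquoted integers exactly, so the loss can be demonstrated by forcing the value through an IEEE-754 double, as JavaScript-style parsers do, and comparing with the proposed quoted form:

```python
# Demonstrating the precision loss: coercing the nanosecond timestamp
# through a double (as JavaScript-style JSON parsers do) changes it,
# while parsing it from a quoted string preserves it exactly.
ns = 1262812349394000000

as_double = int(float(ns))                 # what a double-based parser ends up with
from_string = int("1262812349394000000")   # the proposed quoted form

print(as_double == ns)     # precision was lost
print(from_string == ns)   # exact
```

The error is bounded by the double's ULP at this magnitude (256 ns), which matches the "close but not exact" values shown above.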

Scheduled Delistings

Is your feature request related to a problem? Please describe.

As an algorithmic trader, I want to avoid holding securities that will be delisted the next trading day, so I can invest elsewhere.

Describe the solution you'd like

The /v3/reference/tickers endpoint only provides "delisted_utc" when "active=false"; it only reports after the fact that a ticker has been delisted. However, we often (though not always) know in advance that certain tickers are scheduled to be delisted.

If we know a ticker is scheduled to be delisted, I want to know.

For example, ROVRW (a warrant) had news that the warrants would be exercised on 2022-01-12. I would like my code to ignore that opportunity.

Describe alternatives you've considered

I can check every warrant before I trade it - a human-in-the-loop model - though I'd rather not be chained to a desk with the strategies I'm developing.

Additional context

ENVXW was also delisted in January 2022, though without news.

I would also like it available historically (/v3/reference/tickers with day= query parameter), so I can backtest and see if delisting is scheduled for the following day. (if unexpected delisting, please don't edit historical view)

Some expectations:

  • best-effort (sometimes you don't see it coming, also might not be reported consistently)
  • can be mutable (maybe delisting is rescheduled)

Request: IPO data

Is your feature request related to a problem? Please describe.
No IPO data

Describe the solution you'd like
IPO data with insider lockup dates

For reference:
https://www.benzinga.com/apis/cloud-product/ipo-calendar/

Stock Index Historical Composition

It would be great if Polygon could provide index constituents (and historical changes) for S&P500, Nasdaq 100, and other indices. For example, I manually maintain the following configuration (json format) for S&P500 and other indices:

{
    "name": "SP500",
    "description": "S&P 500",
    "constituents": [
        "A", "AAL", "AAP", "AAPL", "ABBV", "ABC", "ABMD", "ABT", "ACN", "ADBE", "ADI", "ADM", "ADP", "ADS",
        "ADSK", "AEE", "AEP", "AES", "AFL", "AGR", "AIG", "AIV", "AIZ", "AJG", "AKAM", "ALB", "ALGN", "ALK", "ALL",
        "ALLE", "ALXN", "AMAT", "AMCR", "AMD", "AME", "AMGN", "AMP", "AMT", "AMZN", "ANET", "ANSS",
        ...
     ],
     "events": [
          { "asof": "1999-12-07", "action": "Add", "symbol": "YHOO" },
          { "asof": "1999-12-07", "action": "Remove", "symbol": "LDW" },
          { "asof": "2000-07-27", "action": "Add", "symbol": "JDSU" },
          { "asof": "2000-07-27", "action": "Remove", "symbol": "RAD" },
          ...
     ]
}
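Given a current constituent list plus an event log like the one above, a client could reconstruct historical membership by undoing every event dated after the as-of date. A sketch (symbols illustrative):

```python
# Sketch: reconstruct index membership as of a date by undoing events
# newer than that date, starting from the current constituents.
def constituents_asof(current, events, asof):
    members = set(current)
    # Undo events newer than the as-of date, most recent first.
    for ev in sorted(events, key=lambda e: e["asof"], reverse=True):
        if ev["asof"] <= asof:
            break
        if ev["action"] == "Add":
            members.discard(ev["symbol"])
        else:  # "Remove"
            members.add(ev["symbol"])
    return members

current = {"AAPL", "MSFT", "JDSU"}
events = [
    {"asof": "2000-07-27", "action": "Add", "symbol": "JDSU"},
    {"asof": "2000-07-27", "action": "Remove", "symbol": "RAD"},
]
print(sorted(constituents_asof(current, events, "2000-01-01")))
```

This is exactly the bookkeeping that has to be maintained by hand today; a Polygon-provided event log would make it reliable.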

System status via the api

We see some very useful info at https://polygon.io/system, but the endpoint v1/marketstatus/now gives trading-session info only, not Polygon system info:

{'afterHours': True,
'currencies': {'crypto': 'open', 'fx': 'closed'},
'earlyHours': False,
'exchanges': {'nasdaq': 'extended-hours',
'nyse': 'extended-hours',
'otc': 'extended-hours'},
'market': 'extended-hours',
'serverTime': '2021-07-16T16:59:34-04:00'}

Could you guys expose an endpoint that returns the info at https://polygon.io/system? That would allow us to validate the status of the polygon.io API programmatically and prevent a lot of issues.

Provision of bars with finer tenors (for example 10sec, 30sec)

Is your feature request related to a problem? Please describe.
Loading trades (ticks) in order to produce bars with granularities finer than 1min is expensive in terms of:

  • time it takes to load
  • bandwidth consumed
  • processing
  • disk space.

Describe the solution you'd like
We would like the REST API extended to allow queries for bar granularities of < 1 min. It should be straightforward to generate and return bars of arbitrary granularity (with the exception of daily bars, which are special given the auction prices and the volume outside of market hours).

Bars with no trades (volume = 0) would not be returned, reducing bandwidth and avoiding computation on the server side.

Describe alternatives you've considered
My current approach is to load ticks for my stock universe (~3000 stocks), save and compress them, and then generate 10-second bars (the current granularity of interest). Loading the ticks took ~3 days and involved ~5 TB of data. Loading bars instead would have been much more compact and time-efficient.

Notes
It would probably be impractical to precompute/save all possible bars of interest. Computing the bars on the fly from a tick series should have very little additional cost relative to the cost of retrieving and streaming ticks.
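The bar-building step described above can be sketched with the standard library: bucket trades into 10-second windows and fold each bucket into an OHLCV bar. Trade tuples here are illustrative `(epoch_seconds, price, size)` values:

```python
# Sketch: aggregate trades into 10-second OHLCV bars.
from collections import OrderedDict

def ten_sec_bars(trades, width=10):
    bars = OrderedDict()
    for ts, price, size in sorted(trades):
        bucket = ts - ts % width  # start of the 10-second window
        bar = bars.get(bucket)
        if bar is None:
            bars[bucket] = {"o": price, "h": price, "l": price, "c": price, "v": size}
        else:
            bar["h"] = max(bar["h"], price)
            bar["l"] = min(bar["l"], price)
            bar["c"] = price
            bar["v"] += size
    return bars

trades = [(1000, 10.0, 100), (1004, 10.5, 50), (1012, 10.2, 200)]
bars = ten_sec_bars(trades)
print(list(bars))  # bucket start times
```

The logic is trivial; the expensive part, as noted above, is moving the ticks over the wire, which is why server-side bars would help.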

Timestamps for High and Low Prices

This one is pretty straightforward:

Please include the timestamps of the high and low prices (e.g., for daily stock prices), so it's possible to determine when the price reached the daily (or other time interval) high or low.

Thank you.

Start Quotes Data from Arbitrary Timestamp

Is your feature request related to a problem? Please describe.
I was wondering if it's possible to start the pagination for /v2/ticks/stocks/nbbo/{ticker}/{date} from an arbitrary timestamp. From playing around with the API it seems like the timestamp parameter needs to correspond to an actual value in the dataset, otherwise the returned quotes will start at the beginning of the day.

Describe the solution you'd like
I'd like to be able to submit any timestamp in the middle of the day and get the quotes directly after this timestamp.

Describe alternatives you've considered
The alternative seems to be fetching the quotes for the entire day and then sifting through the set to find the one closest to my desired timestamp.
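That client-side alternative can be sketched with a binary search: fetch the day's quotes once, then bisect into the sorted timestamp column to find the first quote at or after an arbitrary timestamp (values illustrative):

```python
# Sketch: find the first quote at or after an arbitrary timestamp,
# after fetching the whole day's quotes. Timestamps are illustrative.
import bisect

quote_ts = [100, 250, 400, 900]    # sorted SIP timestamps for the day
quotes = ["q0", "q1", "q2", "q3"]  # the corresponding quote records

def quotes_after(t):
    i = bisect.bisect_left(quote_ts, t)
    return quotes[i:]

print(quotes_after(300))
```

This works, but it still requires downloading the full day of quotes, which is what server-side support for an arbitrary starting timestamp would avoid.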

Historical Options Greeks

Is your feature request related to a problem? Please describe.
As an options trader, I want to analyze how the greek values for a given contract change over time so that I may better inform my future trades.

Describe the solution you'd like
In the quotes endpoint and channel, a results.greeks object similar to the "option contract snapshots" endpoint is included to represent an option contract's greeks at the time of the quote.

Describe alternatives you've considered

  • Rapidly polling the option contract endpoint and storing results for future analysis
  • Deriving greeks from information already provided by option contract quotes using the Black-Scholes model

Stocks Grouped Daily (Bars) API returns invalid timestamps

The Stocks Grouped Daily (Bars) API documentation says t is "The Unix Msec timestamp for the start of the aggregate window", which should be 00:00:00 (midnight), but the endpoint returns timestamps at 16:00:00 instead. The Aggregates (Bars) API, on the other hand, works as expected.

Example below.

https://api.polygon.io/v2/aggs/grouped/locale/us/market/stocks/2023-01-09?adjusted=true

{
    "queryCount": 10953,
    "resultsCount": 10953,
    "adjusted": true,
    "results": [
        {
            "T": "TTMI",
            "v": 394280,
            "vw": 16.1078,
            "o": 15.96,
            "c": 16.08,
            "h": 16.335,
            "l": 15.96,
            "t": 1673298000000,
            "n": 5416
        },
...

1673298000000 (ms) converts to Monday, January 09, 2023 16:00:00 in time zone America/New_York (EST)

Aggregate Bars API returns timestamps as expected, example for the same ticker and date

https://api.polygon.io/v2/aggs/ticker/TTMI/range/1/day/2023-01-09/2023-01-09?adjusted=true&sort=asc&limit=120

{
    "ticker": "TTMI",
    "queryCount": 1,
    "resultsCount": 1,
    "adjusted": true,
    "results": [
        {
            "v": 394280,
            "vw": 16.1078,
            "o": 15.96,
            "c": 16.08,
            "h": 16.335,
            "l": 15.96,
            "t": 1673240400000,
            "n": 5416
        }
    ],
    "status": "OK",
    "request_id": "e2e0c5c9276e65fe7ae7ce49afda7967",
    "count": 1
}

1673240400000 (ms) converts to Monday, January 09, 2023 00:00:00 in time zone America/New_York (EST)
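The comparison can be reproduced with Python's zoneinfo, converting both millisecond timestamps from the responses above to America/New_York:

```python
# Convert the two millisecond timestamps from the responses above to
# Eastern Time: the grouped-daily bar lands at 16:00, the per-ticker
# aggregate at midnight.
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

def to_et(ms):
    dt = datetime.fromtimestamp(ms / 1000, tz=timezone.utc)
    return dt.astimezone(ZoneInfo("America/New_York"))

print(to_et(1673298000000))  # grouped daily: 2023-01-09 16:00:00-05:00
print(to_et(1673240400000))  # aggregates:    2023-01-09 00:00:00-05:00
```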

Add option to exclude dark pool from aggregate data

The OHLCV data has dark-pool data mixed in. This produces sub-optimal results for my analysis, and probably others'. TradingView and TDA/TOS don't include this data. I switched to Finnhub because they didn't seem to include it before, but now it is included.

Describe the solution you'd like
Add a boolean parameter to the aggregate API, which will exclude dark pool data.

Describe alternatives you've considered
I can try to build my own bars from the raw trade data, but I am under the impression that it requires much work to get right.
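The client-side workaround above could be sketched as follows: drop trades reported through a TRF (where dark-pool prints surface) before building bars. Treating exchange id 4 as the TRF is an assumption here; verify it against Polygon's exchange reference (/v3/reference/exchanges) before relying on it.

```python
# Sketch: exclude dark-pool (TRF-reported) trades before aggregating.
# The TRF exchange id below is an ASSUMPTION - check /v3/reference/exchanges.
TRF_EXCHANGE_IDS = {4}

trades = [
    {"p": 10.00, "s": 100, "x": 11},
    {"p": 10.05, "s": 500, "x": 4},   # assumed dark-pool print via TRF
    {"p": 10.10, "s": 200, "x": 11},
]

lit = [t for t in trades if t["x"] not in TRF_EXCHANGE_IDS]
bar = {
    "o": lit[0]["p"], "c": lit[-1]["p"],
    "h": max(t["p"] for t in lit), "l": min(t["p"] for t in lit),
    "v": sum(t["s"] for t in lit),
}
print(bar)
```

Getting this right at scale (trade conditions, corrections, late prints) is the "much work" the request alludes to, which is why a server-side flag would be preferable.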

Aggregates will return information from multiple CUSIPs

URL
https://api.polygon.io/v2/aggs/ticker/ULTR/range/1/day/2012-03-17/2022-03-17?adjusted=true&limit=50000&apiKey={...}

Result

vestedus.polygon> (def test-ds (since-date-impl "ULTR" "day" (java.time.LocalDate/parse "2012-03-17")
                                                (java.time.LocalDate/now)))
https://api.polygon.io/v2/aggs/ticker/ULTR/range/1/day/2012-03-17/2022-03-17?adjusted=true&limit=50000&apiKey={...}
#'vestedus.polygon/test-ds
vestedus.polygon> test-ds
_unnamed [1818 8]:

|           :timestamp |   :volume | :volume-weighted-average | :n-transactions |    :low |   :open |  :close |   :high |
|----------------------|----------:|-------------------------:|----------------:|--------:|--------:|--------:|--------:|
| 2012-03-19T04:00:00Z |   94694.0 |                   2.4714 |             368 |  2.4100 |  2.4100 |  2.4500 |  2.5100 |
| 2012-03-20T04:00:00Z | 1045601.0 |                   2.3942 |             853 |  2.3000 |  2.4200 |  2.3200 |  2.4400 |
| 2012-03-21T04:00:00Z |  121298.0 |                   2.2989 |             526 |  2.2500 |  2.3400 |  2.2900 |  2.3500 |
| 2012-03-22T04:00:00Z |  504113.0 |                   2.1128 |            1178 |  2.0600 |  2.2400 |  2.1300 |  2.2500 |
| 2012-03-23T04:00:00Z |  298638.0 |                   2.1287 |             665 |  2.1000 |  2.1200 |  2.1400 |  2.2100 |
| 2012-03-26T04:00:00Z |   44103.0 |                   2.1881 |             227 |  2.1504 |  2.1700 |  2.1900 |  2.2200 |
| 2012-03-27T04:00:00Z |   99757.0 |                   2.1207 |             497 |  2.0800 |  2.1900 |  2.1000 |  2.2150 |
| 2012-03-28T04:00:00Z |   62572.0 |                   2.1102 |             152 |  2.1000 |  2.1100 |  2.1000 |  2.1400 |
| 2012-03-29T04:00:00Z |  163553.0 |                   2.0131 |             542 |  1.9900 |  2.0700 |  1.9900 |  2.1100 |
| 2012-03-30T04:00:00Z |   89797.0 |                   1.9995 |             237 |  1.9800 |  2.0200 |  2.0000 |  2.0500 |
|                  ... |       ... |                      ... |             ... |     ... |     ... |     ... |     ... |
| 2022-03-02T05:00:00Z |    1475.0 |                  48.1569 |              14 | 48.1301 | 48.1700 | 48.1301 | 48.1700 |
| 2022-03-03T05:00:00Z |   36662.0 |                  48.1847 |              12 | 48.1500 | 48.7000 | 48.1800 | 48.7000 |
| 2022-03-04T05:00:00Z |    2296.0 |                  48.3091 |              26 | 48.2699 | 48.3100 | 48.2699 | 48.3100 |
| 2022-03-07T05:00:00Z |   77320.0 |                  48.1512 |              22 | 48.1400 | 48.3000 | 48.2000 | 48.3000 |
| 2022-03-08T05:00:00Z |    5671.0 |                  48.1238 |              25 | 48.0900 | 48.3000 | 48.0900 | 48.3000 |
| 2022-03-09T05:00:00Z |     487.0 |                  48.0506 |               9 | 47.9678 | 48.3000 | 48.0300 | 48.3000 |
| 2022-03-10T05:00:00Z |    2045.0 |                  47.9793 |              11 | 47.9729 | 48.0100 | 48.0102 | 48.0102 |
| 2022-03-11T05:00:00Z |     683.0 |                  48.0034 |              12 | 47.9600 | 48.0100 | 47.9600 | 48.0100 |
| 2022-03-14T04:00:00Z |   20033.0 |                  47.9463 |             182 | 47.8599 | 48.1900 | 47.9600 | 48.1900 |
| 2022-03-15T04:00:00Z |    1186.0 |                  47.9708 |              10 | 47.9300 | 47.9951 | 47.9400 | 47.9951 |
| 2022-03-16T04:00:00Z |    1128.0 |                  47.7782 |              16 | 47.4300 | 47.4300 | 47.7800 | 48.1900 |

Expected Result
The ULTR ETF started around August 2019. The system is picking up some information from the ULTR ETF and some from another CUSIP on a different exchange that used the same ticker. What I would personally like is to get information only on the ETF. Due to issue polygon-io/issues#201, I can't use the list_date to shorten the request date ranges.

This means that, in general, I can't trust the aggregates call for building out backtesting data, because it may switch CUSIPs within the same data stream.

In WebSockets, add ability to set additional parameters to further filter the messages

Is your feature request related to a problem? Please describe.
No.

Describe the solution you'd like
In the websockets, allow setting additional parameters other than symbol to further filter the messages. For example, I'd like to only see Trade messages of >= X shares.

Describe alternatives you've considered
We will import all trades, and then do the filtering on our side.

Include number of trades per aggregate window in snapshot all tickers request

The aggs endpoint returns a field called 'n' that has the number of trades in the interval.

The snapshot endpoint does not include this field. Is there a plan to include it in the future? If not, could you?

It would make sense to add it as a field in the min sub object that is returned from the api request.

Thanks!

Allow the Snapshot Options API to select multiple tickers

Is your feature request related to a problem? Please describe.
In a previous feature request, the Snapshot API was enhanced to allow users to pull snapshots for multiple stocks with one request (see polygon-io/issues#29). However, this did not carry over to the options snapshot API.

Describe the solution you'd like
Let's say I want to get the Options Snapshot for multiple options contracts. Today I have to place separate calls to "/v3/snapshot/options/{underlyingAsset}/{optionContract}". It would be much easier to place a single call to
"/v3/snapshot/options/{underlyingAsset}/{optionContract_1},{optionContract_2},...,{optionContract_n}".

Describe alternatives you've considered
If I want to get the snapshot for every options contract on a given stock, it takes 2-5 seconds to individually call every single options ticker through the current API.

Additional context
Similar to feature request polygon-io/issues#29

Websocket - Adding exchange and trf timestamps

Is your feature request related to a problem? Please describe.
Yes. The REST API provides 3 timestamps: SIP, EXG, TRF. But in real time over the websocket, only the SIP timestamp is provided.

Describe the solution you'd like
The websocket should provide all 3 timestamps: ideally at nanosecond precision, but most importantly 3 timestamps instead of only 1.

Describe alternatives you've considered
In real time, no alternatives.

Additional context
N/A.

Thanks a lot for your time and your answer++

Websocket data Json return inconvenience (golang)

Client
using "github.com/gorilla/websocket" for GoLang
Issue
Both stock trades and stock aggregates under the stocks cluster return a JSON field named "c". When I'm subscribed to both channels I have to use a custom JSON unmarshaler to determine whether a message is a trade or an aggregate before unmarshalling the rest into a struct, because for trades "c" is a []int while for aggregates it is a float64.
Expected Result
I suggest making the JSON field names unique across channels within the same cluster.
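Polygon websocket messages carry an "ev" event-type field, so one workaround is to route on it before decoding the ambiguous "c" field. Sketched here in Python terms (the Go version would do the same with a two-pass unmarshal):

```python
# Sketch: dispatch websocket messages on the "ev" event-type field
# before decoding "c", which is a condition-code list on trades ("T")
# but a close price on aggregates ("A"/"AM").
import json

def decode(raw):
    msg = json.loads(raw)
    if msg["ev"] == "T":          # trade: c is a list of condition codes
        return ("trade", list(msg.get("c", [])))
    if msg["ev"] in ("A", "AM"):  # aggregate: c is the close price
        return ("agg", float(msg["c"]))
    return ("other", None)

print(decode('{"ev":"T","c":[14,41]}'))
print(decode('{"ev":"AM","c":16.08}'))
```

This avoids the type clash, but unique field names per channel would make single-pass decoding possible in statically typed clients.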

Desktop (please complete the following information):
Ubuntu Cloud Server

by-second resolution historical aggregates

Is your feature request related to a problem? Please describe.
I'd like to look at some OHLCV data at per-second resolution. It is a happy medium between tick and minute data.

Describe the solution you'd like
Add an option for aggregate historical bar data (the /v2/aggs/ticker/ endpoint) to be returned at per-second resolution.

Describe alternatives you've considered
Rolling my own per-second bars from ticks. However,

  1. This would require loading all tick data in the timeframe
  2. This would require taking into account different trade conditions (from different exchanges?)
  3. It would seem that Polygon.io already has the infrastructure to produce minute bars; I don't want to repeat that work here.

