Comments (9)
There are two ways we could go about this:

1. Define a developer request to compress the content as an imperative, and always compress the payload. In this case, if the application asked for the payload to be compressed, the backend would need to decompress the data as part of its processing pipeline, without requiring the underlying server to support that compression (that is, no need for `Content-Encoding`, etc.). @ksylor is right that in this case, the negotiation mechanism can be based on the upload URL.
2. Define a developer request to compress as a hint (which can contain a list of multiple preferred encodings), which the browser can act on at will. In this case, we would need a negotiation mechanism to tell the server which encoding was applied. [`Content-Encoding`](https://datatracker.ietf.org/doc/html/rfc7231#section-3.1.2.1) is indeed the natural candidate.
(2) seems more robust and ergonomic, especially if we want the browser to potentially beacon the data once the renderer is gone.
/cc @mnot @reschke - as I vaguely remember conversations about the use of `Content-Encoding` as a request header field, but don't remember their details.
from beacon.
+1 from Wikimedia as well.
A few things come to mind:
- There is a perhaps not-so-obvious need for server-side support here. If an intermediary library changes its `sendBeacon()` call to enable compressed transfer encoding, then any consumer of that library with a receiving server must now support it, or things might break. Is that right? I couldn't find precedent for negotiating upload encoding in other HTTP upload mechanisms (e.g. large `x-www-form-urlencoded` POST submissions, or large uploads via HTML5 file inputs with file formats that may benefit from compression). This seems hard to negotiate in a progressive and backward-compatible manner, unlike downloads, where the request is expected to carry `Accept-Encoding`, which lets the sender toggle compression as needed. This may be fine, but it's worth considering and documenting as such.
- There is also a perhaps not-so-obvious need for that same server to retain the ability to process uncompressed submissions, as browsers presumably don't have to follow this option, at least until all browsers support it.
- That leaves us with how to spec this. Do we spec it as something the browser must implement and must follow if passed? That seems simplest, but makes it less obvious that the same code, interpreted by browsers following an older version of the spec, will silently ignore the option. I wonder if it would make sense to spec it as a hint and let the browser decide. That would make it more obvious that receiving servers can't assume all inputs are compressed, and it leaves room for "user/device knows best"-type optimisations based on whatever heuristics browser vendors and users may come up with in the future (e.g. optimise for high bandwidth, or low CPU, or skip compression below a size threshold, etc.).
- On naming: if this will involve the `Content-Encoding` header in the underlying spec and implementation, it may be worth reflecting that in the option name for consistency and familiarity, e.g. `contentEncoding: "gzip"`. On the other hand, that might invite unwarranted confidence if we go with the idea of it being a "hint" the browser may ignore, and/or if we want to support multiple choices at some point, e.g. `["deflate", "br"]`, where Brotli would be added by the developer if and when their consumer/server supports it.
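If the option does end up as an ordered list of preferred encodings, the browser-side choice could be as simple as taking the first one it supports. A sketch (the capability set and `pickEncoding` are made up for illustration):

```javascript
// Hypothetical set of encodings this browser can produce.
const BROWSER_SUPPORTED = new Set(['gzip', 'deflate']);

// Return the first developer-preferred encoding the browser supports,
// or null to send the payload uncompressed (the hint is ignorable).
function pickEncoding(preferred, supported = BROWSER_SUPPORTED) {
  for (const enc of preferred) {
    if (supported.has(enc)) {
      return enc;
    }
  }
  return null;
}
```

Under this model, `pickEncoding(["br", "gzip"])` would fall back to `"gzip"` until the browser adds Brotli support, which matches the upgrade path described above.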
/cc @ricea
+1 to this from Etsy’s perspective! Particularly with RUM data the payload can be big, and this would allow us to use compression in more cases.
Those are great points @Krinkle!
> There is a perhaps not-so-obvious need for support on the server-side here. Which means if an intermediary library changes its sendBeacon() call to enable compressed transfer encoding, then any consumer of that that has a receiving server must now support that or it might break. Is that right?

I think that is right, but it seems reasonable to me that companies would need to make a conscious decision to enable the compression feature via the parameter, and would then also be responsible for updating their observability systems to accept and decode the payload based on `Content-Encoding` and `Content-Type`. If folks are using a third-party library, they would still need some knowledge of the system that accepts the data, unless the third-party library is used to post to a third-party service, but I could definitely be wrong about that assumption!
> There is also a perhaps not-so-obvious need for that same server to also retain ability to process uncompressed submissions as browsers presumably don't have to follow this option, at least until all browsers support it.

This seems like another fair tradeoff to me. I'm not sure how widespread this is as a practice, but we already have to do similar multi-type support in our data-capture endpoints to accept both `sendBeacon` (`Content-Type: text/plain;charset=UTF-8`) and XHR (`Content-Type: application/x-www-form-urlencoded`) payloads as a fallback in non-supporting browsers. Not that we necessarily want to manage a third option, but the tradeoff might be worth it to decrease payload sizes over the wire significantly.
The complexity could potentially be alleviated by exposing the supported compression types and then using different URLs per send type, something like this?

```js
if (navigator.sendBeacon && navigator.sendBeacon.supportsEncoding("gzip")) {
  navigator.sendBeacon('/endpoint/accepts/gzip', data, {
    contentEncoding: "gzip"
  });
} else {
  navigator.sendBeacon('/endpoint/accepts/json', data);
}
```
However, if your idea of browsers treating this as a hint takes off, then a much more complex API surface would need to be exposed to do that kind of URL switching in client code. I'm not sure, though, whether that's a solid argument against treating it as a hint.
Interestingly, when I ran a test of @nicjansma's example code in Chrome 96, neither the `Content-Encoding` nor the `Content-Type` header was part of the resulting request when sending the compressed results through `navigator.sendBeacon`, so there may be a bug in the interoperability of streams and the Beacon API as currently implemented?
Given that PendingBeacon is the next big thing, should we aim for this use case to be supported there instead?
/cc @clelland
@nicjansma - I'd also love your thoughts on whether the PendingBeacon proposal obsoletes this feature request
> Given that PendingBeacon is the next big thing, should we aim for this use case to be supported there instead?
Yes, definitely! I was just reading over the PendingBeacon proposal, and it seems like it would solve quite a number of issues; it would be great to get compression support built into that API rather than as a later bolt-on.
I think that's fair to push this to PendingBeacon.
As far as I can tell, there's nothing significant you could do with `sendBeacon()` that you wouldn't be able to do (and more) with PendingBeacon?