w3c / webrtc-stats
WebRTC Statistics
Home Page: https://w3c.github.io/webrtc-stats/
License: Other
This was proposed to the list over a year ago by @alvestrand with favorable response, and has been implemented in Firefox release since 34 (December).
http://w3c.github.io/webrtc-stats/#dictionary-rtcoutboundrtpstreamstats-members has "targetBitrate" without a definition.
This needs to reference some definition - preferably a preexisting one. At least we should be able to tell whether it's with or without RTP headers, and it would also be nice to say what measurement interval is expected to produce this bitrate. Not sure one exists.
@alvestrand just noticed that serverUrl as described in https://lists.w3.org/Archives/Public/public-webrtc/2014May/0115.html is not part of the spec.
It's still more useful than just being able to figure out the type of the connection from the local type preference / priority.
Current description: "Represents the number of unique datachannels [opened/closed]."
Only DataChannels that have been fully opened should increase the closed counter when closed, so that (opened - closed) = (currently open). It may be possible for a DataChannel to go from Connecting to Closing/Closed, skipping the Open state.
Also, should "datachannel" be written as "DataChannel" or "data channel" instead?
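The invariant proposed above can be sketched as follows. This is a minimal sketch, assuming the counter names dataChannelsOpened / dataChannelsClosed on the peer-connection stats object; because only fully-opened channels increment the closed counter, the difference is the number of channels currently open.

```javascript
// Assumes counters named dataChannelsOpened / dataChannelsClosed on the
// peer-connection stats. Channels that went Connecting -> Closed without
// ever opening are counted in neither, so the difference is exactly the
// number of currently open channels.
function currentlyOpenDataChannels(pcStats) {
  return pcStats.dataChannelsOpened - pcStats.dataChannelsClosed;
}
```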
Current text:
audioLevel of type double
Only valid for audio, and the value is between 0..1 (linear), where 1.0 represents 0 dBov. Calculated as defined in [RFC6464].
RFC 6464 defines a logarithmic scale with -127 dBov pegged to silence. That's not what is meant here.
"Linear" comes close, but not quite enough, I think.
I think we want 0..1, where 0 is silence, 1 is 0 dBov, and 0.5 represents "approximately 6 dBSPL change in the sound pressure level from the maximum" (quoted from the mediacapture draft's "volume" spec).
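The intended mapping can be sketched as a conversion from the RFC 6464 level (0..127, in -dBov) to the linear 0..1 scale described above. This is only an illustration of the proposal, not spec text; the function name is made up.

```javascript
// Convert an RFC 6464 audio level (0..127, representing 0 to -127 dBov)
// to the proposed linear 0..1 scale: 0 dBov maps to 1.0, each 6 dB drop
// roughly halves the value, and 127 (silence in RFC 6464) maps to 0.
function rfc6464ToLinear(levelDbov) {
  if (levelDbov >= 127) return 0;        // RFC 6464 uses 127 for silence
  return Math.pow(10, -levelDbov / 20);  // -6 dBov -> ~0.5, 0 dBov -> 1.0
}
```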
At the moment, one is supposed to figure out whether a candidate is local or remote based on its position in a candidate pair.
This doesn't cover the case where you have candidates that aren't paired with anything - for instance IPv6 local candidates when talking to an IPv4-only remote system.
There should be an "isRemote = false" field on RTCIceCandidateStats, just like we have on RTCRTPStreamStats.
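The gap can be shown with a short sketch: classifying candidates from candidate pairs alone leaves unpaired candidates unclassifiable, which is what the proposed isRemote field would fix. Here `report` is assumed to be a Map of stats-id to stats object, and the type strings 'candidatepair' / 'candidate' are assumptions for illustration.

```javascript
// Classify candidates as local or remote by scanning candidate pairs.
// Candidates that never appear in any pair (e.g. an IPv6 host candidate
// against an IPv4-only peer) end up in `unpaired` - without an isRemote
// field there is no way to tell which side they belong to.
function classifyCandidates(report) {
  const local = new Set(), remote = new Set(), unpaired = [];
  for (const s of report.values()) {
    if (s.type === 'candidatepair') {
      local.add(s.localCandidateId);
      remote.add(s.remoteCandidateId);
    }
  }
  for (const [id, s] of report) {
    if (s.type === 'candidate' && !local.has(id) && !remote.has(id)) {
      unpaired.push(id);
    }
  }
  return { local, remote, unpaired };
}
```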
For WebRTC-1.0, stats needs to be a stable reference (with all the stats that are in the MTI set).
Adding stats needs to be a process that's clear to people who want to do it.
Suggestion from @juberti : If we accept dictionaries as stats attributes, we should embed Candidate inside CandidatePair. (Not uncontroversial.)
Per mailing list discussion: For a local RTCOutboundRTPStreamStats, the RTT is calculated based on received RTCP RR reports. The rest of the RTCP-derived data is placed in the remote RTCInboundRTPStreamStats, with an appropriate timestamp. RTT should go there too.
Consideration: Can RTT be calculated at all on local RTCInboundStreamStats?
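For reference, the RR-based computation mentioned above is the RFC 3550 one: when a Receiver Report arrives, RTT = A - LSR - DLSR, where A is the arrival time of the RR, LSR is the "last SR" timestamp echoed in the report, and DLSR is the delay since that SR was received. A sketch, assuming all values are already converted to seconds:

```javascript
// RFC 3550 round-trip computation from an RTCP Receiver Report.
// arrivalTime: when the RR arrived; lsr: "last SR" timestamp echoed in
// the RR; dlsr: delay since that SR was received. All in seconds.
function rttFromReceiverReport(arrivalTime, lsr, dlsr) {
  return arrivalTime - lsr - dlsr;
}
```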
We currently have an RTT stat on a candidate pair, but the RTP spec defines a different RTT metric (based on RTCP and timestamp correlations).
Consider whether we should expose this too, and if so, how.
RTCCodecStats has "codec" and "parameters" whereas RTCRtpCodecParameters has "mimeType" and "sdpFmtpLine" which seem to mean the same things.
If these are indeed expected to contain the same information, it would be nice if the names aligned. I slightly prefer the RTCRtpCodecParameters names; they seem more descriptive.
When a track ends or is disconnected, the stats has to do one of two things:
Proposal, in consistency with the philosophy so far: If it's ended, just report as normal; show some stat that says that the state is ended.
If it's disconnected, continue reporting (for history), but with a stat saying it's disconnected.
Assuming that there will be a pacing limit on the STUN packets, we would need:
The summary states:
A Transport carries a part of an SDP session, consisting of RTP and RTCP. When Bundle is in use, an SDP session will have only one Transport per Bundle group. When Bundle is not in use, there is one Transport per m-line.
This heavily implies that an RTCTransportStats combines RTP and RTCP for an m= section.
But later, we have rtcpTransportStatsId, defined as:
If RTP and RTCP are not multiplexed, this is the id of the transport that gives stats for the RTCP component, and this record has only the RTP component stats.
Which implies that there are separate RTCTransportStats for RTP and RTCP. This seems correct, since that would allow a 1:1 mapping from an RTCIceTransport to an RTCTransportStats. Also, it's the only way there could be a single "selected candidate pair" per RTCTransportStats.
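Under that reading, an application would follow rtcpTransportStatsId to get from the RTP transport's record to the RTCP one. A sketch, assuming `report` is a Map of stats-id to stats object:

```javascript
// Resolve the transport stats carrying the RTCP component for a given
// RTP transport record. When RTP and RTCP are multiplexed there is no
// separate RTCP record, so the RTP record itself is returned.
function rtcpTransportFor(report, rtpTransportId) {
  const rtp = report.get(rtpTransportId);
  if (!rtp) return undefined;
  if (!rtp.rtcpTransportStatsId) return rtp;  // multiplexed case
  return report.get(rtp.rtcpTransportStatsId);
}
```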
The enum of stats types needs to follow W3C conventions as stated in #5 and use dashes.
Tagging @fluffy
The "RTC[In/Out]boundRTPStreamStats : RTCRTPStreamStats" dictionaries are valid for both audio and video. RTCRTPStreamStats has a DOMString mediaType that is either "audio" or "video" to distinguish the two cases.
RTCMediaStreamTrackStats is also valid for both audio and video, but it has no mediaType member. You'd have to examine the other members (some of which are only valid for audio or video) to deduce whether it's for audio or video.
We should add "DOMString mediaType" to RTCMediaStreamTrackStats.
Alternatively we could split it up into "RTCMediaStream[Audio/Video]TrackStats : RTCMediaStreamTrackStats". Its members are mostly audio- or video-specific so why are all of the members in the same dictionary?
Looking at RTCStats descendants:
They're all "*Stats" except two. Not that it matters terribly since these are dictionaries, but shouldn't we be consistent?
https://w3c.github.io/webrtc-stats/#mststats-dict* has a member ssrcIds which is not described.
I suspect it is a list of pointers to "RTP streams", which are identified by an SSRC: https://w3c.github.io/webrtc-stats/#streamstats-dict* -- the description of this is missing too ;-)
btw, why is the mediaType defined on RTCRTPStreamStats and not in RTCMediaStreamTrackStats? It would be useful in both probably since some attributes only make sense for audio and video respectively. It would be more natural on the track stats since that is related to an RTPSender/RTPReceiver (which has a kind attribute)
If the RTCIceCandidateStats is referenced by an RTCIceCandidatePairStats, it's possible to tell. But not every candidate will be paired (for instance, before any remote candidates have been trickled).
I propose adding a boolean isRemote.
https://tools.ietf.org/html/rfc5245#section-5.7.4 doesn't have a "cancelled" state. If we want this in the RTCStatsIceCandidatePairState enum, we have to find another definition for it.
The definition for RTCTransportStats.activeConnection is "Set to true when transport is active."
But there's no definition of what "active" means. It might be that the RTCIceTransport that it corresponds to has its state in {connecting, completed}, or it might mean something else. Clarification needed.
The spec needs to say (in text) that the AttributeStats represents everything in the non-Stats object plus some more, and have the field names & types be in sync.
Result of: https://validator.w3.org/checklink?url=https://w3c.github.io/webrtc-stats/webrtc-stats.html
Line: 186 https://w3c.github.io/webrtc-stats/webrtc-stats.html
Status: 200 OK
Some of the links to this resource point to broken URI fragments (such as index.html#fragment).
Broken fragments:
https://w3c.github.io/webrtc-stats/webrtc-stats.html#widl-RTCCodec-codec (line 186)
warning Lines: 70, 77, 80, 84 http://dev.w3.org/html5/spec/webappapis.html redirected to http://w3c.github.io/html/webappapis.html
Status: 301 -> 200 OK
This is a permanent redirect. The link should be updated to point to the more recent URI.
Broken fragments:
http://dev.w3.org/html5/spec/webappapis.html#event-handlers (line 84)
http://dev.w3.org/html5/spec/webappapis.html#fire-a-simple-event (line 80)
http://dev.w3.org/html5/spec/webappapis.html#queue-a-task (line 77)
http://dev.w3.org/html5/spec/webappapis.html#eventhandler (line 70)
RTCDataChannelStats and RTCIceCandidatePairStats don't know about their transport; they should have a "transportId" member, just like RTC[In/Out]boundRTPStreamStats does.
Note: RTCTransportStats does have selectedCandidatePairId, but this only covers the selected one and in direction transport->pair, not pair->transport.
When a UA's interface changes, it would be nice to have a safe way of detecting that a change has happened. Should the implementation be encouraged to expose some kind of "version number"?
In #68, it was raised that:
Is there something that we can do better?
The term "media stream" is heavily overloaded, and should not be used unqualified.
The term "RTP stream" is used in the document, and needs a reference to the correct RFC in the terminology section.
All occurrences of "media stream" without a qualifier should be updated to reference the term (with links).
The behavior agreed on is that stats for an object, once initialized, will be available for the lifetime of the PC. When it's stopped, the clock stops advancing - the stats are the stats at the moment of stoppage, but the stats object does not disappear even if the object does.
This applies to all stats objects.
This needs to be clear.
On the one hand, they're control objects for tracks and SSRCs.
On the other hand....?
From Sami Kalliomäki [email protected]:
What we need is a way to get average QP over some period of time. I don't think adding this to the receiver side is needed by us but maybe it should be added as well. I think the approach of adding a sum of QP would be a good idea. It probably should be added to the RTCMediaStreamTrackStats structure. Maybe something like this:
qpSum of type unsigned long
Only valid for video. QP (quantization parameter) describes how much spatial detail is included in a frame. A low value corresponds to good quality. The range of the value per frame is defined by the codec being used. This parameter represents the sum of all QPs for framesDecoded on remote streams and framesSent on local streams.
The average QP has been implemented before as a goog stat but it never landed. The relevant CLs are here:
https://codereview.webrtc.org/1264693003/
https://codereview.webrtc.org/1420963005/
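The consumption side of this proposal can be sketched as follows: take two stats snapshots and divide the qpSum delta by the frames delta to get the average QP over the interval. Member names follow the proposal quoted above; the function name is made up.

```javascript
// Average QP per frame between two stats snapshots, using the proposed
// qpSum member together with framesDecoded. Returns undefined when no
// new frames were decoded in the interval.
function averageQpBetween(earlier, later) {
  const frames = later.framesDecoded - earlier.framesDecoded;
  if (frames <= 0) return undefined;
  return (later.qpSum - earlier.qpSum) / frames;
}
```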
The text about "detached" for a MediaStreamTrack is not completely clear. The text needs to say that it MUST continue to appear, with the counters frozen at the values at the time of detaching, and the timestamp set to the detach time. If it is reattached, counters will increment again.
This is an internal ticket for the editors to attend to. The subsections in section 4 do not make sense now.
RFC5245 section 8.3 talks about a candidate being "freed". I assume that for TURN candidates, this means sending a refresh request with a lifetime of 0, and for host UDP/TCP candidates, it means closing the socket.
It would be useful to have a field (on local candidates) indicating whether or not the candidate has been freed. This gives an indication of how many sockets and TURN allocations WebRTC is consuming, which can be useful information.
At the moment, the text relevant to this says:
"track"
Contains the sequence of tracks related to a specific media stream and the corresponding media-level metrics. It is accessed by the RTCMediaStreamStats.
This is obviously wrong. The RTCStatsType string for a MediaStream should be "stream", and "track" should point to RTCMediaStreamTrackStats. The term "media stream" shouldn't be used without a reference.
In the design considerations, we should include explanations of why we use "sum & count" rather than "average" - the reason being that "average" implies averaging over some fixed time interval, which may or may not be what the app wants; sum & count allows us to do getStats() twice and calculate the average between them over an app-chosen time interval - which may be long or short.
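That rationale can be sketched generically: call getStats() twice and compute the average from the deltas over the app-chosen interval. The sum/count member names here are placeholders for any sum-and-count pair of stats.

```javascript
// Average of a sum-and-count stat pair over the interval between two
// snapshots. sumKey / countKey name any sum-and-count members, e.g. the
// proposed qpSum together with framesDecoded.
function intervalAverage(earlier, later, sumKey, countKey) {
  const count = later[countKey] - earlier[countKey];
  if (count <= 0) return undefined;  // nothing counted in the interval
  return (later[sumKey] - earlier[sumKey]) / count;
}
```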
Information needed for calculating the circuit breakers from what is in the stats.
CB Trigger 1. latest timestamp of a packet for each SSRC. Covers both sent and received streams
CB Trigger 2. latest timestamp of the RTCP packet for each SSRC. I believe this is covered with the timestamp in the remote stats.
CB Trigger 3. TFRC can be computed with the information, although one piece of information that is missing is the average RTCP interval.
Currently the spec is not clear on what the units for the RTT measurements are. In one place it says seconds, and elsewhere no units are given.
Should the RTT metrics be reported in milliseconds or seconds? Since the type is double, milliseconds for the integer part may be the right resolution.
Called out by @pbos: In certain contexts, it's important to know which particular encoder is being used by a particular platform (for instance when both hardware and software codecs are available, and one or the other may be in use at any particular moment).
This is really only useful when one has intimate knowledge of the platform in use, so an implementation-defined string seems natural.
Suggestion: Add a new attribute to the RTCCodec dictionary.
Name: implementationName
Value: DOMString
Description: An implementation-specific identifier for the codec in use. This will be the encoder for outgoing streams, and the decoder for incoming streams; they may be different.
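A sketch of how an application might consume this. Everything here is hypothetical: implementationName is the attribute suggested above, 'codec' is an assumed type string, and `report` is assumed to be a Map of stats-id to stats object.

```javascript
// Collect the implementation-specific codec identifiers from a stats
// report, keyed by codec name (falling back to the stats id). Only
// meaningful when you know the platform's encoder/decoder names.
function codecImplementations(report) {
  const names = {};
  for (const [id, s] of report) {
    if (s.type === 'codec' && s.implementationName) {
      names[s.codec || id] = s.implementationName;
    }
  }
  return names;
}
```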
http://w3c.github.io/webrtc-stats/#streamstats-dict*
defines associateStatsId, but example 1 uses (and Firefox implements)
remoteNow = currentReport[now.remoteId];
For instance, "transport" needs to say that it is represented by an RTCTransportStats object (and hyperlinked).
Inboundrtp and outboundrtp have it already.
The RTCStatsType in webrtc-stats is out of sync with RTCStatsType in webrtc-pc.
rtp
(see the bug originally reported at https://www.w3.org/Bugs/Public/show_bug.cgi?id=26620)
(this assumes that getStats moves to this document as discussed during TPAC 2014)
There are a number of counters (for example sliCount) that are likely to be:
RTCInboundRTPStreamStats, RTCOutboundRTPStreamStats & RTCRTPStreamStats are defined both in webrtc-stats and webrtc-pc (although the definitions aren't the same); we need to pick one or the other (or, if needed, make one spec use partial dictionaries to complete the other).
The metrics are discussed in https://tools.ietf.org/html/draft-ietf-xrblock-rtcweb-rtcp-xr-metrics-03
For TURN candidates, it would be useful to see the protocol used to communicate with the TURN server: either UDP, TCP, or TLS over TCP. The protocol field that currently exists is just the protocol of the allocated candidate.
If we add this, it may be worth adding to RTCIceCandidate as well, though it's in a different category from the other attributes, in that it's not something that appears in the candidate SDP.