auth0-blog / blog-dombench
DOM benchmarks using different libraries
Currently, run-benchmarks.js
runs all the default benchmarks that browser-perf runs. This may include additional JavaScript on the page that could skew the numbers slightly. Does it make sense to run only specific benchmarks so that we get more accurate numbers?
For example, we could specify metrics: ['TimelineMetrics']
to collect only Timeline data while running the tests. This way, JavaScript related to other metrics like Network is not run on the page, giving a more accurate reading.
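As a rough sketch of what that configuration could look like, here is a minimal options object restricted to Timeline metrics. This assumes browser-perf's Node API accepts a metrics array in its options; the page URL, selenium address, and browser list are placeholder assumptions, not values from this repo.

```javascript
// Sketch of a browser-perf run restricted to Timeline metrics.
// The selenium address and browser list below are assumed placeholders.
const options = {
  selenium: 'http://localhost:4444/wd/hub', // assumed local selenium server
  browsers: ['chrome'],
  metrics: ['TimelineMetrics']              // collect only Timeline data
};

// With browser-perf installed, the run might look like this
// (left commented out so the snippet has no external dependency):
//
// const browserPerf = require('browser-perf');
// browserPerf('http://localhost:8000', (err, results) => {
//   if (err) throw err;
//   results.forEach(r => console.log(r));
// }, options);

console.log(options.metrics.join(','));
```

Restricting the metrics list would keep Network- and RAF-related instrumentation off the page during the Timeline runs, which is the point of the question above.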
I noticed that you mentioned running this test on a Core i5 with Chromium. I recently attempted the same test on Chrome / Mac OS X with a 2.5 GHz Core i7 and saw very different results. Can you possibly speak to this? I also made sure your branch of browser-perf was used (to enable memory analysis). Could the Homebrew-installed Python server have anything to do with this?
See my results here:
https://github.com/priley86/blog-dombench/blob/dombench2/results.csv
Hi. From reading your findings over at the auth0 blog, I'm wondering if you might be interpreting several of the results incorrectly. As you state:
Layout/Paint. This graph shows the time spent by the browser doing re-layout operations (i.e. creating an internal representation of the DOM tree after changes). I find it surprising that cito.js is the slowest in this case. Subjective performance would tell otherwise, as both cito.js and Incremental DOM feel quite snappy when interacting directly with the browser.
Less time spent in layout/paint does not really mean a framework is faster. I could make an incredibly slow framework that spends very little time in layout/paint, simply because it spends its time doing other things. Also, the more rendering loops a framework manages to push during the benchmark (higher FPS), the more time will be spent in layout/paint (assuming the frameworks invalidate/update the same number of DOM nodes on each render).
I think a better benchmark would be to run each implementation for a fixed number of runs and then compare the total time spent. This would also give a fairer picture of how much time is spent in GC etc., since all implementations would have done the same amount of work. I suspect the results would be radically different :)