concordusapps / pytest-bench
A benchmark utility for pytest.
License: MIT License
It would be useful to have a command-line option that controls the reporting unit, something like the following:
py.test --bench --bench-unit=s
The benchmark summary would then report elapsed time in seconds instead of microseconds.
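The proposed --bench-unit flag does not exist yet; internally it would amount to a unit-scaling step when the summary is formatted. A minimal sketch, where the names UNIT_SCALES and format_elapsed are illustrative, not part of the plugin's actual API:

```python
# Hypothetical sketch of unit scaling for a --bench-unit flag.
# "UNIT_SCALES" and "format_elapsed" are illustrative names only.
UNIT_SCALES = {"s": 1.0, "ms": 1e3, "us": 1e6}

def format_elapsed(seconds, unit="us"):
    """Convert an elapsed time measured in seconds to the requested unit."""
    return seconds * UNIT_SCALES[unit]

# 3.6 microseconds, reported in each unit:
print(format_elapsed(0.0000036, "s"))   # seconds
print(format_elapsed(0.0000036, "us"))  # microseconds
```

The flag would only change the displayed value; the underlying measurement stays in seconds.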
How many iterations are done by default when benchmarking? I ask because adding @bench(...)
to one of my tests increased its total duration from about 1.34s to about 120s, even though the reported mean running time was only 3.6us.
Is it possible to add a command-line argument, or a decorator argument, to control the number of iterations used for benchmarking?
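A back-of-the-envelope check of the figures in the question above shows why the iteration count matters: at a 3.6us mean, spending roughly 120s benchmarking implies tens of millions of iterations.

```python
# Implied iteration count from the numbers reported above:
# ~120 s total benchmarking time at a 3.6 us mean per call.
mean_us = 3.6
total_s = 120.0

implied_iterations = total_s / (mean_us * 1e-6)
print(f"{implied_iterations:,.0f}")  # roughly 33 million iterations
```

A flag capping the iteration count (or a time budget per benchmark) would keep fast functions from dominating the test run.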
Strange error when the benchmark expression refers to a classmethod.
Thanks for providing this tool. I ran into an issue when I tried to run it under Python 3.3:
=============================================================================== test session starts ================================================================================
platform linux -- Python 3.3.2 -- pytest-2.3.5
plugins: cov, bench
collected 1 items
test_eq.py .
----------------------------------------------------------------------------- benchmark session starts -----------------------------------------------------------------------------
collected 1 items
-------------------------------------------------------------------------------------------------------------------
Benchmark Time (s)
-------------------------------------------------------------------------------------------------------------------
Traceback (most recent call last):
File "/home/stefan/rspcas/bin/py.test", line 9, in <module>
load_entry_point('pytest==2.3.5', 'console_scripts', 'py.test')()
File "/home/stefan/rspcas/lib/python3.3/site-packages/_pytest/core.py", line 493, in main
exitstatus = config.hook.pytest_cmdline_main(config=config)
File "/home/stefan/rspcas/lib/python3.3/site-packages/_pytest/core.py", line 441, in __call__
return self._docall(methods, kwargs)
File "/home/stefan/rspcas/lib/python3.3/site-packages/_pytest/core.py", line 452, in _docall
res = mc.execute()
File "/home/stefan/rspcas/lib/python3.3/site-packages/_pytest/core.py", line 370, in execute
res = method(**kwargs)
File "/home/stefan/rspcas/lib/python3.3/site-packages/_pytest/main.py", line 109, in pytest_cmdline_main
return wrap_session(config, _main)
File "/home/stefan/rspcas/lib/python3.3/site-packages/_pytest/main.py", line 103, in wrap_session
exitstatus=session.exitstatus)
File "/home/stefan/rspcas/lib/python3.3/site-packages/_pytest/core.py", line 441, in __call__
return self._docall(methods, kwargs)
File "/home/stefan/rspcas/lib/python3.3/site-packages/_pytest/core.py", line 452, in _docall
res = mc.execute()
File "/home/stefan/rspcas/lib/python3.3/site-packages/_pytest/core.py", line 370, in execute
res = method(**kwargs)
File "/home/stefan/rspcas/lib/python3.3/site-packages/_pytest/terminal.py", line 337, in pytest_sessionfinish
self.config.hook.pytest_terminal_summary(terminalreporter=self)
File "/home/stefan/rspcas/lib/python3.3/site-packages/_pytest/core.py", line 441, in __call__
return self._docall(methods, kwargs)
File "/home/stefan/rspcas/lib/python3.3/site-packages/_pytest/core.py", line 452, in _docall
res = mc.execute()
File "/home/stefan/rspcas/lib/python3.3/site-packages/_pytest/core.py", line 370, in execute
res = method(**kwargs)
File "/home/stefan/rspcas/lib/python3.3/site-packages/pytest_bench/plugin.py", line 126, in pytest_terminal_summary
name = benchmark.name
File "/home/stefan/rspcas/lib/python3.3/site-packages/pytest_bench/plugin.py", line 37, in name
self.item.cls.__name__,
AttributeError: 'NoneType' object has no attribute '__name__'
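The traceback shows that the name property in pytest_bench/plugin.py reads self.item.cls.__name__, but item.cls is None for module-level test functions (like the one in test_eq.py), hence the AttributeError. A minimal sketch of a defensive guard; this is an illustration, not the plugin's actual code:

```python
# Sketch of a guard for the AttributeError above. "Benchmark" here is a
# stand-in for the plugin's object whose `name` property dereferences
# item.cls; the real code lives in pytest_bench/plugin.py.
class Benchmark:
    def __init__(self, item):
        self.item = item

    @property
    def name(self):
        # item.cls is None for module-level test functions, which is
        # exactly what triggers the crash in the summary report.
        if self.item.cls is not None:
            return f"{self.item.cls.__name__}.{self.item.name}"
        return self.item.name
```

With the guard, module-level benchmarks report under their bare function name instead of crashing the terminal summary.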
Otherwise the environment may be in a state the test did not intend.
Currently, @mark.bench only allows marking functions that are available in the global scope.
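This limitation (and the classmethod error reported above) follows if the benchmark expression string is evaluated against the test module's globals: a classmethod is reachable only through its class, which may not be in that namespace. A sketch under that assumption, with illustrative names:

```python
# Sketch of why only global-scope names resolve, assuming the plugin
# evaluates the benchmark expression string against the test module's
# globals. "global_fn", "Holder", and "namespace" are illustrative.
def global_fn():
    return 42

class Holder:
    @classmethod
    def method(cls):
        return 42

namespace = {"global_fn": global_fn}  # roughly what the plugin "sees"

print(eval("global_fn()", namespace))  # resolves fine

try:
    eval("Holder.method()", namespace)
except NameError as exc:
    print(exc)  # 'Holder' is not defined in the expression namespace
```

A module-level wrapper function that calls the classmethod is one workaround until the plugin resolves dotted names more broadly.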
It would also be useful to choose the reported metric, something like the following:
py.test --bench --bench-metric=calls
The benchmark summary would then report calls per second instead of the average elapsed time.
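The proposed --bench-metric flag does not exist yet; the metric itself is just the reciprocal of the mean elapsed time. A minimal sketch, with calls_per_second as an illustrative name:

```python
# Sketch of the proposed "calls" metric: calls per second is the
# reciprocal of the mean elapsed time per call. "calls_per_second"
# is an illustrative name, not part of the plugin's API.
def calls_per_second(mean_elapsed_s):
    """Throughput implied by a mean per-call elapsed time in seconds."""
    return 1.0 / mean_elapsed_s

# A 3.6 us mean corresponds to roughly 278,000 calls per second.
print(f"{calls_per_second(0.0000036):,.0f}")
```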
Please consider moving this plugin to the new pytest-dev team. See the contributing guidelines for more information.
Cheers,