
pytest-bench's People

Contributors

deni64k, mehcode


pytest-bench's Issues

Control number of iterations

How many iterations are done by default when benchmarking? I ask because adding @bench(...) to one of my tests increased its total duration from about 1.34s to about 120s, even though the reported mean running time was only 3.6us.

Is it possible to add a command-line argument, or a decorator argument, that will control the number of iterations done for benchmarking?
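One way such a control could work is a cap on both the iteration count and total wall time. The sketch below is purely hypothetical (the `iterations` and `max_seconds` arguments are not part of pytest-bench's actual API); it just illustrates the behavior being requested:

```python
import time


def bench(iterations=100, max_seconds=1.0):
    """Hypothetical decorator: run the wrapped function up to `iterations`
    times, stopping early once `max_seconds` of wall time has elapsed."""
    def decorator(func):
        def wrapper(*args, **kwargs):
            timings = []
            start = time.perf_counter()
            for _ in range(iterations):
                t0 = time.perf_counter()
                result = func(*args, **kwargs)
                timings.append(time.perf_counter() - t0)
                if time.perf_counter() - start > max_seconds:
                    break  # time budget exhausted; stop iterating
            wrapper.mean_time = sum(timings) / len(timings)
            return result
        return wrapper
    return decorator


@bench(iterations=1000, max_seconds=0.5)
def add():
    return 1 + 1
```

With a budget like this, a fast function still gets many samples, while the total benchmarking time stays bounded regardless of the per-call cost.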

Does not work under python 3.3

Thanks for providing this tool. I had an issue when I tried to run it on python3.3:

=============================================================================== test session starts ================================================================================
platform linux -- Python 3.3.2 -- pytest-2.3.5
plugins: cov, bench
collected 1 items 

test_eq.py .

----------------------------------------------------------------------------- benchmark session starts -----------------------------------------------------------------------------
collected 1 items

-------------------------------------------------------------------------------------------------------------------
Benchmark                                                                                                  Time (s)
-------------------------------------------------------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/stefan/rspcas/bin/py.test", line 9, in <module>
    load_entry_point('pytest==2.3.5', 'console_scripts', 'py.test')()
  File "/home/stefan/rspcas/lib/python3.3/site-packages/_pytest/core.py", line 493, in main
    exitstatus = config.hook.pytest_cmdline_main(config=config)
  File "/home/stefan/rspcas/lib/python3.3/site-packages/_pytest/core.py", line 441, in __call__
    return self._docall(methods, kwargs)
  File "/home/stefan/rspcas/lib/python3.3/site-packages/_pytest/core.py", line 452, in _docall
    res = mc.execute()
  File "/home/stefan/rspcas/lib/python3.3/site-packages/_pytest/core.py", line 370, in execute
    res = method(**kwargs)
  File "/home/stefan/rspcas/lib/python3.3/site-packages/_pytest/main.py", line 109, in pytest_cmdline_main
    return wrap_session(config, _main)
  File "/home/stefan/rspcas/lib/python3.3/site-packages/_pytest/main.py", line 103, in wrap_session
    exitstatus=session.exitstatus)
  File "/home/stefan/rspcas/lib/python3.3/site-packages/_pytest/core.py", line 441, in __call__
    return self._docall(methods, kwargs)
  File "/home/stefan/rspcas/lib/python3.3/site-packages/_pytest/core.py", line 452, in _docall
    res = mc.execute()
  File "/home/stefan/rspcas/lib/python3.3/site-packages/_pytest/core.py", line 370, in execute
    res = method(**kwargs)
  File "/home/stefan/rspcas/lib/python3.3/site-packages/_pytest/terminal.py", line 337, in pytest_sessionfinish
    self.config.hook.pytest_terminal_summary(terminalreporter=self)
  File "/home/stefan/rspcas/lib/python3.3/site-packages/_pytest/core.py", line 441, in __call__
    return self._docall(methods, kwargs)
  File "/home/stefan/rspcas/lib/python3.3/site-packages/_pytest/core.py", line 452, in _docall
    res = mc.execute()
  File "/home/stefan/rspcas/lib/python3.3/site-packages/_pytest/core.py", line 370, in execute
    res = method(**kwargs)
  File "/home/stefan/rspcas/lib/python3.3/site-packages/pytest_bench/plugin.py", line 126, in pytest_terminal_summary
    name = benchmark.name
  File "/home/stefan/rspcas/lib/python3.3/site-packages/pytest_bench/plugin.py", line 37, in name
    self.item.cls.__name__,
AttributeError: 'NoneType' object has no attribute '__name__'

Allow changing metric reported

Something like the following:

py.test --bench --bench-metric=calls

The benchmark summary would then report calls per second instead of the mean elapsed time.
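Since calls per second is just the reciprocal of the mean elapsed time, the reporting change is small. A sketch of what such a `--bench-metric` option could do (hypothetical; not an existing pytest-bench flag):

```python
def format_metric(mean_elapsed_s, metric="time"):
    """Hypothetical --bench-metric handling: 'time' reports the mean
    elapsed seconds; 'calls' reports the reciprocal as calls/second."""
    if metric == "calls":
        return "%.1f calls/s" % (1.0 / mean_elapsed_s)
    return "%.6f s" % mean_elapsed_s


print(format_metric(0.002))           # 0.002000 s
print(format_metric(0.002, "calls"))  # 500.0 calls/s
```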
