rkrahl / pytest-dependency
Manage dependencies of tests
License: Apache License 2.0
Is it possible to mark that a test depends on all test methods inside a specific test class?
This doesn't work:
import pytest

class TestABC:
    def test_a(self):
        pass

    def test_b(self):
        pass

@pytest.mark.dependency(depends=['TestABC'])
def test_depends_1():
    pass  # this test is skipped

@pytest.mark.dependency(depends=['TestABC::*'])
def test_depends_2():
    pass  # this test is skipped
Only this works for me:

@pytest.mark.dependency(depends=[
    'TestABC::test_a',
    'TestABC::test_b',
])
def test_depends_3():
    pass
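In the meantime, the explicit list can at least be generated instead of hand-maintained. A minimal sketch using only the standard library; the helper name class_dependencies is my own, not part of pytest-dependency:

```python
import inspect

def class_dependencies(cls):
    """Build a depends list naming every test method of a class."""
    return [
        "%s::%s" % (cls.__name__, name)
        for name, _ in inspect.getmembers(cls, inspect.isfunction)
        if name.startswith("test_")
    ]

class TestABC:
    def test_a(self):
        pass

    def test_b(self):
        pass

# Equivalent to depends=['TestABC::test_a', 'TestABC::test_b']:
deps = class_dependencies(TestABC)
```

The result can then be passed as @pytest.mark.dependency(depends=class_dependencies(TestABC)) and stays correct when test methods are added to the class.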
We have a use case with a very long test that we want to skip if ANY of the previous tests has failed, since running it would just waste time.
Would it be possible to add that to this library?
I would envision it working like:

@pytest.mark.dependency(depends="all")
diff --git a/pytest_dependency.py b/pytest_dependency.py
index a1355c3..74b6b0d 100644
--- a/pytest_dependency.py
+++ b/pytest_dependency.py
@@ -72,6 +72,11 @@ class DependencyManager(object):
             status.addResult(rep)
 
     def checkDepend(self, depends, item):
+        if depends == "all":
+            for key in self.results:
+                if not self.results[key].isSuccess():
+                    pytest.skip("%s depends on all previous tests passing (%s failed)" % (item.name, key))
+
         for i in depends:
             if i in self.results:
                 if self.results[i].isSuccess():
Another option would be to support regular expressions in the depends list. If you wanted all previous tests, you could just use "test_"; if you wanted all tests that start with "test_foo_", you could use that pattern instead.
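To illustrate how such a pattern expansion could look: the sketch below matches a regex against the names registered so far. The function matching_dependencies and the plain-dict stand-in for the results mapping are hypothetical, not part of the plugin:

```python
import re

def matching_dependencies(pattern, results):
    """Expand a regex dependency into the registered names it matches."""
    regex = re.compile(pattern)
    return [name for name in results if regex.match(name)]

# Stand-in for DependencyManager.results: test name -> outcome.
results = {"test_foo_a": True, "test_foo_b": True, "test_bar": True}

matching_dependencies("test_foo_", results)  # -> ["test_foo_a", "test_foo_b"]
matching_dependencies("test_", results)      # -> all three names
```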
Is there any intent to reorder test execution according to dependencies? The current behavior I observe is that if I have two tests, where the first depends on the second, the first is just skipped. It would be preferable to reorder them so that they can both execute, rather than relying on source code order.
import pytest

@pytest.mark.dependency(depends=["test_parent"])
def test_child():
    pass

@pytest.mark.dependency
def test_parent():
    pass
$ pytest
==== test session starts ====
platform linux -- Python 3.6.4, pytest-3.5.0, py-1.5.3, pluggy-0.6.0 -- /usr/local/bin/python
cachedir: .pytest_cache
rootdir: /tests, inifile:
plugins: dependency-0.3.2
collected 2 items
test_file.py::test_child SKIPPED
test_file.py::test_parent PASSED
==== 1 passed, 1 skipped in 0.01 seconds ====
NOTE: https://github.com/ftobia/pytest-ordering does some of this, but requires explicit ordering. I'd rather just define the dependencies and let the system figure it out.
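The core of such a reordering is a topological sort over the declared dependency edges. A sketch of just the sorting step on (name, depends) pairs, using the standard library's graphlib (Python 3.9+); wiring it into pytest would hypothetically happen in a pytest_collection_modifyitems hook, which is not shown here:

```python
from graphlib import TopologicalSorter

def reorder(tests):
    """Sort (name, depends) pairs so every dependency runs first."""
    order = list(TopologicalSorter(dict(tests)).static_order())
    return sorted(tests, key=lambda t: order.index(t[0]))

tests = [("test_child", ["test_parent"]), ("test_parent", [])]
reorder(tests)
# -> [("test_parent", []), ("test_child", ["test_parent"])]
```

TopologicalSorter also raises CycleError on circular dependencies, which a real implementation would want to report as a collection error.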
Is it possible to add a feature/parameter to run a dependent test when the status of its dependencies is failed, or otherwise not a success?
Add more documentation. We need at least a text explaining the basic concept: what this plugin is good for, a statement of the problem it solves, and how to use it.
I have a bunch of tests running in Pytest that depend on a series of tests running in a class. I'm using pytest-dependency to run some other tests in another module, but only if all the tests in this dependency class pass.
This is the set of tests that absolutely NEEDS to pass for me to proceed with the rest of the tests. The class has two methods:
@pytest.mark.dependency()
class TestThis:
    """Test steady state of network."""

    @pytest.mark.parametrize("device", params_this)
    def test_that(self, device: str) -> None:
        # do something
        assert xyz == abc

    @pytest.mark.parametrize("device", params_that)
    def test_this_as_well(self, device: str) -> None:
        # do something
        assert xyz == abc
Now, when I add only one dependency marker to the tests that follow, it works as expected. If any of the tests in TestThis::test_that fails, the rest of the tests are skipped.
@pytest.mark.dependency(
    depends=instances(
        "tests/test_xyz.py::TestThis::test_that",
        params_this,
    ),
    scope="session",
)
class TestEverything:
    """Class to test everything."""
However, when I add two dependency markers like below, the tests proceed as usual even if one or more of the tests within the dependencies fail. This is unexpected behavior AFAIK.
@pytest.mark.dependency(
    depends=instances(
        "tests/test_xyz.py::TestThis::test_that",
        params_this,
    ),
    scope="session",
)
@pytest.mark.dependency(
    depends=instances(
        "tests/test_xyz.py::TestThis::test_this_as_well",
        params_that,
    ),
    scope="session",
)
class TestEverything:
    """Class to test everything."""
I found that if a test from the test_that method fails but the tests from test_this_as_well all pass, it proceeds with the tests in TestEverything. But if tests named in the second dependency marker fail, it skips the tests as expected. I need to figure out what the solution for this is, or whether this is a bug. Can we use multiple dependency markers like this?
I'm looking for possible solutions to this problem, because I cannot combine the two dependency methods into one for the rest of my test suite to consume.
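One thing worth checking before anything else: pytest resolves stacked markers of the same name to a single "closest" marker, and if the plugin reads only that one (as I believe it does), only one of the two depends lists is ever honored. Concatenating both lists into a single marker may therefore behave as intended. A self-contained sketch; the instances() stub and the parameter lists below are placeholders for the real helpers in the suite:

```python
# Placeholder stand-ins for the real helpers in the suite.
def instances(nodeid, params):
    return ["%s[%s]" % (nodeid, p) for p in params]

params_this = ["r1", "r2"]
params_that = ["s1"]

# Both dependency lists concatenated into one list, to be used in a
# single marker instead of two stacked ones.
combined = (
    instances("tests/test_xyz.py::TestThis::test_that", params_this)
    + instances("tests/test_xyz.py::TestThis::test_this_as_well", params_that)
)
```

With a helper like this, the class would carry a single marker: @pytest.mark.dependency(depends=combined, scope="session").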
We should consider adding logging. Log messages might include which test outcomes have been registered internally and which dependencies have been considered. Most log messages would have DEBUG level; skipping a test might be logged with level INFO.
The default log level in pytest is WARNING, so all these log messages would be suppressed by default. But for debugging, one could decrease the level to DEBUG, and then these messages might be helpful to understand what is going on.
I still need to experiment with logging in pytest, though, in order to check whether this really works as I assume and to assess how useful this might be in the end.
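As a sketch of what this could look like with the standard logging module; the function names register_result and skip_dependent are hypothetical hook points, not the plugin's actual API:

```python
import logging

logger = logging.getLogger("pytest_dependency")

def register_result(name, outcome):
    """Record a test outcome; log the registration at DEBUG level."""
    logger.debug("registered outcome %r for %s", outcome, name)

def skip_dependent(item_name, dep_name):
    """Log at INFO level why a dependent test is about to be skipped."""
    logger.info("%s skipped: dependency %s did not succeed",
                item_name, dep_name)
```

With pytest's default WARNING threshold these messages stay silent; a run with --log-cli-level=DEBUG would surface them.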
The current documentation describes this use case:

@pytest.mark.dependency()
def test_a():
    pass

@pytest.mark.dependency(depends=["test_a"])
def test_b():
    pass
It would be great to avoid marking test_a:

def test_a():
    pass

@pytest.mark.dependency(depends=["test_a"])
def test_b():
    pass
What do you think?
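For what it's worth, newer versions of pytest-dependency ship an automark_dependency ini-option that registers the outcome of every test as if it carried an empty dependency marker, which covers exactly this case (check the docs of your installed version before relying on it):

```ini
# pytest.ini
[pytest]
automark_dependency = true
```

With that set, an unmarked test_a is registered and depends=["test_a"] on test_b resolves normally.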
test_b will always be skipped!
import pytest

class Tests:
    @pytest.mark.dependency()
    @pytest.mark.parametrize('x', [1, 2, 3])
    def test_a(self, x):
        assert True

    @pytest.mark.dependency(depends=["Tests::test_a"])
    def test_b(self):
        assert True
python3 -m pytest -rvAt temp/
============================================================================ test session starts ============================================================================
platform linux -- Python 3.6.9, pytest-5.4.3, py-1.8.1, pluggy-0.13.1
rootdir: /workspace
plugins: reportlog-0.1.1, dependency-0.5.1
collected 4 items
temp/test_x.py ...s [100%]
================================================================================== PASSES ===================================================================================
========================================================================== short test summary info ==========================================================================
PASSED temp/test_x.py::Tests::test_a[1]
PASSED temp/test_x.py::Tests::test_a[2]
PASSED temp/test_x.py::Tests::test_a[3]
SKIPPED [1] /usr/local/lib/python3.6/dist-packages/pytest_dependency.py:103: test_b depends on Tests::test_a
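The cause: outcomes are registered per test instance, so what gets recorded are the names Tests::test_a[1], Tests::test_a[2] and Tests::test_a[3]; the bare name Tests::test_a is never registered, so the dependency can never be satisfied. Listing every parametrized instance explicitly works; a sketch:

```python
import pytest

PARAMS = [1, 2, 3]

class Tests:
    @pytest.mark.dependency()
    @pytest.mark.parametrize('x', PARAMS)
    def test_a(self, x):
        assert True

    # Name every parametrized instance of test_a explicitly:
    @pytest.mark.dependency(
        depends=["Tests::test_a[%d]" % x for x in PARAMS])
    def test_b(self):
        assert True
```

Generating the list from the same PARAMS object the parametrize marker uses keeps the two in sync.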
I find it rather impractical for large test suites to name session- and package-scoped dependencies with the full node id. Wouldn't it be more logical to remove the package's path from the name? From what I can understand, it is even redundant to include the full path when naming a dependency in package scope.
I'll try to give an example, for the sake of clarity.
Given a project structure somewhat like:
/project
    /app1
        /tests
            test_file1.py
            test_file2.py
    /app2
        /tests
            test_file1.py
            test_file2.py
With tests:
# app1/tests/test_file1.py
@pytest.mark.dependency()
def test_a():
    pass

and

# app1/tests/test_file2.py
@pytest.mark.dependency(depends=['app1/tests/test_file1.py::test_a'], scope='package')
def test_b():
    pass
It would seem more logical to me to be able to do it like this:

# app1/tests/test_file2.py
@pytest.mark.dependency(depends=['test_file1.py::test_a'], scope='package')
def test_b():
    pass
This in turn wouldn't cause any conflict with the following (if I'm not mistaken):

# app2/tests/test_file1.py
@pytest.mark.dependency()
def test_a():
    pass

# app2/tests/test_file2.py
@pytest.mark.dependency(depends=['test_file1.py::test_a'], scope='package')
def test_b():
    pass
I hope I have explained myself as clearly as possible and would appreciate any feedback.
Thanks in advance.
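Until relative names are supported, the full node ids can at least be derived rather than hard-coded. A sketch of a hypothetical helper (package_depends is my own name, not part of pytest-dependency) that prefixes bare names with the declaring test file's package directory:

```python
import os

def package_depends(test_file, names):
    """Prefix bare 'file.py::test' names with test_file's directory.

    Hypothetical helper, not part of pytest-dependency; pass the
    declaring module's path relative to the pytest rootdir.
    """
    pkg = os.path.dirname(test_file)
    return [os.path.join(pkg, name).replace(os.sep, "/") for name in names]

# In app1/tests/test_file2.py this yields the full node ids:
package_depends("app1/tests/test_file2.py", ["test_file1.py::test_a"])
# -> ["app1/tests/test_file1.py::test_a"]
```

The same call in app2/tests/test_file2.py would yield app2/... ids, so the two packages never collide.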
PR #11 changed the message provided when a test is skipped due to missing dependencies in that the name of the skipped test was added. There are however still inconsistencies and issues with these messages that the PR did not address.
In the most basic case, the skip messages look like:
$ py.test -rs basic.py
============================= test session starts ==============================
[...]
=========================== short test summary info ============================
SKIP [1] /usr/lib/python3.4/site-packages/pytest_dependency.py:65: test_e depends on test_c
SKIP [1] /usr/lib/python3.4/site-packages/pytest_dependency.py:65: test_c depends on test_a
================ 2 passed, 2 skipped, 1 xfailed in 0.02 seconds ================
The most obvious issue is the trace line /usr/.../pytest_dependency.py:65 that appears in each skip message. It is there because skipping a test in pytest raises an exception, and this is the trace with the line of code that calls pytest.skip(). It is the same in every skip message and clutters the output without adding any useful information, so it should be removed. Unfortunately, adding this trace is hard-coded deep in the internals of pytest, so it won't be easy to get rid of.
The name of the dependent test appearing in the message is actually the name of the function or method of the test:
$ py.test -rs named.py
============================= test session starts ==============================
[...]
=========================== short test summary info ============================
SKIP [1] /usr/lib/python3.4/site-packages/pytest_dependency.py:65: test_c depends on a
SKIP [1] /usr/lib/python3.4/site-packages/pytest_dependency.py:65: test_e depends on c
================ 2 passed, 2 skipped, 1 xfailed in 0.01 seconds ================
For the name of the dependent test, the message ignores the name set with the name argument to the pytest.mark.dependency() marker, while for the dependency this setting is taken into account. This is somewhat inconsistent.
This inconsistency would be acceptable if it led to unambiguous names. Unfortunately, that is not the case. Consider the following example:
import pytest

def test_b():
    pass

class TestClass(object):
    @pytest.mark.dependency()
    @pytest.mark.xfail(reason="deliberate fail")
    def test_a(self):
        assert False

    @pytest.mark.dependency(depends=["TestClass::test_a"])
    def test_b(self):
        pass
The output looks like:
$ py.test -rs testclass.py
============================= test session starts ==============================
[...]
=========================== short test summary info ============================
SKIP [1] /usr/lib/python3.4/site-packages/pytest_dependency.py:65: test_b depends on TestClass::test_a
================ 1 passed, 1 skipped, 1 xfailed in 0.01 seconds ================
This is misleading: one may believe that the test function test_b() had been skipped, but in fact it was the method TestClass::test_b().
Currently, pipenv doesn't allow installing pytest-dependency. As stated in the following ticket, pipenv is unable to install pytest-dependency due to a poorly written setup.py:
https://github.com/kennethreitz/pipenv/issues/878
I'm trying to package your module as an RPM package, so I'm using the typical PEP 517 based build, install, and test cycle used when building packages from a non-root account:
python3 -sBm build -w --no-isolation
Here is pytest output:
+ PYTHONPATH=/home/tkloczko/rpmbuild/BUILDROOT/python-pytest-dependency-0.5.1-2.fc35.x86_64/usr/lib64/python3.8/site-packages:/home/tkloczko/rpmbuild/BUILDROOT/python-pytest-dependency-0.5.1-2.fc35.x86_64/usr/lib/python3.8/site-packages
+ /usr/bin/pytest -ra
=========================================================================== test session starts ============================================================================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0
rootdir: /home/tkloczko/rpmbuild/BUILD/pytest-dependency-0.5.1
plugins: dependency-0.5.1
collected 27 items
tests/test_01_marker.py .. [ 7%]
tests/test_02_simple_dependency.py FFFFFF [ 29%]
tests/test_03_class.py FFF [ 40%]
tests/test_03_multiple_dependency.py F [ 44%]
tests/test_03_param.py F [ 48%]
tests/test_03_runtime.py F [ 51%]
tests/test_03_scope.py FFFFFFF [ 77%]
tests/test_03_skipmsgs.py F [ 81%]
tests/test_04_automark.py FF. [ 92%]
tests/test_04_ignore_unknown.py F. [100%]
================================================================================= FAILURES =================================================================================
_______________________________________________________________________________ test_no_skip _______________________________________________________________________________
ctestdir = <Testdir local('/tmp/pytest-of-tkloczko/pytest-677/test_no_skip0')>
    def test_no_skip(ctestdir):
        """One test is skipped, but no other test depends on it,
        so all other tests pass.
        """
        ctestdir.makepyfile("""
            import pytest

            @pytest.mark.dependency()
            def test_a():
                pytest.skip("explicit skip")

            @pytest.mark.dependency()
            def test_b():
                pass

            @pytest.mark.dependency(depends=["test_b"])
            def test_c():
                pass

            @pytest.mark.dependency(depends=["test_c"])
            def test_d():
                pass
        """)
        result = ctestdir.runpytest("--verbose")
        result.assert_outcomes(passed=3, skipped=1, failed=0)
>       result.stdout.fnmatch_lines("""
            *::test_a SKIPPED
            *::test_b PASSED
            *::test_c PASSED
            *::test_d PASSED
        """)
E Failed: nomatch: '*::test_a SKIPPED'
E and: '============================= test session starts =============================='
E and: 'platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3'
E and: 'cachedir: .pytest_cache'
E and: 'rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_no_skip0, configfile: pytest.ini'
E and: 'plugins: dependency-0.5.1'
E and: 'collecting ... collected 4 items'
E and: ''
E and: 'test_no_skip.py::test_a SKIPPED (explicit skip)'
E and: 'test_no_skip.py::test_b PASSED'
E and: 'test_no_skip.py::test_c PASSED'
E and: 'test_no_skip.py::test_d PASSED'
E and: ''
E and: '========================= 3 passed, 1 skipped in 0.01s ========================='
E remains unmatched: '*::test_a SKIPPED'
/home/tkloczko/rpmbuild/BUILD/pytest-dependency-0.5.1/tests/test_02_simple_dependency.py:32: Failed
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_no_skip0, configfile: pytest.ini
plugins: dependency-0.5.1
collecting ... collected 4 items
test_no_skip.py::test_a SKIPPED (explicit skip)
test_no_skip.py::test_b PASSED
test_no_skip.py::test_c PASSED
test_no_skip.py::test_d PASSED
========================= 3 passed, 1 skipped in 0.01s =========================
_____________________________________________________________________________ test_skip_depend _____________________________________________________________________________
ctestdir = <Testdir local('/tmp/pytest-of-tkloczko/pytest-677/test_skip_depend0')>
    def test_skip_depend(ctestdir):
        """One test is skipped, other dependent tests are skipped as well.
        This also includes indirect dependencies.
        """
        ctestdir.makepyfile("""
            import pytest

            @pytest.mark.dependency()
            def test_a():
                pass

            @pytest.mark.dependency()
            def test_b():
                pytest.skip("explicit skip")

            @pytest.mark.dependency(depends=["test_b"])
            def test_c():
                pass

            @pytest.mark.dependency(depends=["test_c"])
            def test_d():
                pass
        """)
        result = ctestdir.runpytest("--verbose")
        result.assert_outcomes(passed=1, skipped=3, failed=0)
>       result.stdout.fnmatch_lines("""
            *::test_a PASSED
            *::test_b SKIPPED
            *::test_c SKIPPED
            *::test_d SKIPPED
        """)
E Failed: nomatch: '*::test_a PASSED'
E and: '============================= test session starts =============================='
E and: 'platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3'
E and: 'cachedir: .pytest_cache'
E and: 'rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_skip_depend0, configfile: pytest.ini'
E and: 'plugins: dependency-0.5.1'
E and: 'collecting ... collected 4 items'
E and: ''
E fnmatch: '*::test_a PASSED'
E with: 'test_skip_depend.py::test_a PASSED'
E nomatch: '*::test_b SKIPPED'
E and: 'test_skip_depend.py::test_b SKIPPED (explicit skip)'
E and: 'test_skip_depend.py::test_c SKIPPED (test_c depends on test_b)'
E and: 'test_skip_depend.py::test_d SKIPPED (test_d depends on test_c)'
E and: ''
E and: '========================= 1 passed, 3 skipped in 0.01s ========================='
E remains unmatched: '*::test_b SKIPPED'
/home/tkloczko/rpmbuild/BUILD/pytest-dependency-0.5.1/tests/test_02_simple_dependency.py:65: Failed
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_skip_depend0, configfile: pytest.ini
plugins: dependency-0.5.1
collecting ... collected 4 items
test_skip_depend.py::test_a PASSED
test_skip_depend.py::test_b SKIPPED (explicit skip)
test_skip_depend.py::test_c SKIPPED (test_c depends on test_b)
test_skip_depend.py::test_d SKIPPED (test_d depends on test_c)
========================= 1 passed, 3 skipped in 0.01s =========================
_____________________________________________________________________________ test_fail_depend _____________________________________________________________________________
ctestdir = <Testdir local('/tmp/pytest-of-tkloczko/pytest-677/test_fail_depend0')>
    def test_fail_depend(ctestdir):
        """One test fails, other dependent tests are skipped.
        This also includes indirect dependencies.
        """
        ctestdir.makepyfile("""
            import pytest

            @pytest.mark.dependency()
            def test_a():
                pass

            @pytest.mark.dependency()
            def test_b():
                assert False

            @pytest.mark.dependency(depends=["test_b"])
            def test_c():
                pass

            @pytest.mark.dependency(depends=["test_c"])
            def test_d():
                pass
        """)
        result = ctestdir.runpytest("--verbose")
        result.assert_outcomes(passed=1, skipped=2, failed=1)
>       result.stdout.fnmatch_lines("""
            *::test_a PASSED
            *::test_b FAILED
            *::test_c SKIPPED
            *::test_d SKIPPED
        """)
E Failed: nomatch: '*::test_a PASSED'
E and: '============================= test session starts =============================='
E and: 'platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3'
E and: 'cachedir: .pytest_cache'
E and: 'rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_fail_depend0, configfile: pytest.ini'
E and: 'plugins: dependency-0.5.1'
E and: 'collecting ... collected 4 items'
E and: ''
E fnmatch: '*::test_a PASSED'
E with: 'test_fail_depend.py::test_a PASSED'
E fnmatch: '*::test_b FAILED'
E with: 'test_fail_depend.py::test_b FAILED'
E nomatch: '*::test_c SKIPPED'
E and: 'test_fail_depend.py::test_c SKIPPED (test_c depends on test_b)'
E and: 'test_fail_depend.py::test_d SKIPPED (test_d depends on test_c)'
E and: ''
E and: '=================================== FAILURES ==================================='
E and: '____________________________________ test_b ____________________________________'
E and: ''
E and: ' @pytest.mark.dependency()'
E and: ' def test_b():'
E and: '> assert False'
E and: 'E assert False'
E and: ''
E and: 'test_fail_depend.py:9: AssertionError'
E and: '=========================== short test summary info ============================'
E and: 'FAILED test_fail_depend.py::test_b - assert False'
E and: '==================== 1 failed, 1 passed, 2 skipped in 0.01s ===================='
E remains unmatched: '*::test_c SKIPPED'
/home/tkloczko/rpmbuild/BUILD/pytest-dependency-0.5.1/tests/test_02_simple_dependency.py:98: Failed
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_fail_depend0, configfile: pytest.ini
plugins: dependency-0.5.1
collecting ... collected 4 items
test_fail_depend.py::test_a PASSED
test_fail_depend.py::test_b FAILED
test_fail_depend.py::test_c SKIPPED (test_c depends on test_b)
test_fail_depend.py::test_d SKIPPED (test_d depends on test_c)
=================================== FAILURES ===================================
____________________________________ test_b ____________________________________
@pytest.mark.dependency()
def test_b():
> assert False
E assert False
test_fail_depend.py:9: AssertionError
=========================== short test summary info ============================
FAILED test_fail_depend.py::test_b - assert False
==================== 1 failed, 1 passed, 2 skipped in 0.01s ====================
__________________________________________________________________________ test_named_fail_depend __________________________________________________________________________
ctestdir = <Testdir local('/tmp/pytest-of-tkloczko/pytest-677/test_named_fail_depend0')>
    def test_named_fail_depend(ctestdir):
        """Same as test_fail_depend, but using custom test names.
        """
        ctestdir.makepyfile("""
            import pytest

            @pytest.mark.dependency(name="a")
            def test_a():
                pass

            @pytest.mark.dependency(name="b")
            def test_b():
                assert False

            @pytest.mark.dependency(name="c", depends=["b"])
            def test_c():
                pass

            @pytest.mark.dependency(name="d", depends=["c"])
            def test_d():
                pass
        """)
        result = ctestdir.runpytest("--verbose")
        result.assert_outcomes(passed=1, skipped=2, failed=1)
>       result.stdout.fnmatch_lines("""
            *::test_a PASSED
            *::test_b FAILED
            *::test_c SKIPPED
            *::test_d SKIPPED
        """)
E Failed: nomatch: '*::test_a PASSED'
E and: '============================= test session starts =============================='
E and: 'platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3'
E and: 'cachedir: .pytest_cache'
E and: 'rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_named_fail_depend0, configfile: pytest.ini'
E and: 'plugins: dependency-0.5.1'
E and: 'collecting ... collected 4 items'
E and: ''
E fnmatch: '*::test_a PASSED'
E with: 'test_named_fail_depend.py::test_a PASSED'
E fnmatch: '*::test_b FAILED'
E with: 'test_named_fail_depend.py::test_b FAILED'
E nomatch: '*::test_c SKIPPED'
E and: 'test_named_fail_depend.py::test_c SKIPPED (test_c depends on b)'
E and: 'test_named_fail_depend.py::test_d SKIPPED (test_d depends on c)'
E and: ''
E and: '=================================== FAILURES ==================================='
E and: '____________________________________ test_b ____________________________________'
E and: ''
E and: ' @pytest.mark.dependency(name="b")'
E and: ' def test_b():'
E and: '> assert False'
E and: 'E assert False'
E and: ''
E and: 'test_named_fail_depend.py:9: AssertionError'
E and: '=========================== short test summary info ============================'
E and: 'FAILED test_named_fail_depend.py::test_b - assert False'
E and: '==================== 1 failed, 1 passed, 2 skipped in 0.01s ===================='
E remains unmatched: '*::test_c SKIPPED'
/home/tkloczko/rpmbuild/BUILD/pytest-dependency-0.5.1/tests/test_02_simple_dependency.py:130: Failed
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_named_fail_depend0, configfile: pytest.ini
plugins: dependency-0.5.1
collecting ... collected 4 items
test_named_fail_depend.py::test_a PASSED
test_named_fail_depend.py::test_b FAILED
test_named_fail_depend.py::test_c SKIPPED (test_c depends on b)
test_named_fail_depend.py::test_d SKIPPED (test_d depends on c)
=================================== FAILURES ===================================
____________________________________ test_b ____________________________________
@pytest.mark.dependency(name="b")
def test_b():
> assert False
E assert False
test_named_fail_depend.py:9: AssertionError
=========================== short test summary info ============================
FAILED test_named_fail_depend.py::test_b - assert False
==================== 1 failed, 1 passed, 2 skipped in 0.01s ====================
___________________________________________________________________________ test_explicit_select ___________________________________________________________________________
ctestdir = <Testdir local('/tmp/pytest-of-tkloczko/pytest-677/test_explicit_select0')>
    def test_explicit_select(ctestdir):
        """Explicitly select only a single test that depends on another one.
        Since the other test has not been run at all, the selected test
        will be skipped.
        """
        ctestdir.makepyfile("""
            import pytest

            @pytest.mark.dependency()
            def test_a():
                pass

            @pytest.mark.dependency()
            def test_b():
                pass

            @pytest.mark.dependency()
            def test_c():
                pass

            @pytest.mark.dependency(depends=["test_c"])
            def test_d():
                pass
        """)
        result = ctestdir.runpytest("--verbose", "test_explicit_select.py::test_d")
        result.assert_outcomes(passed=0, skipped=1, failed=0)
>       result.stdout.fnmatch_lines("""
            *::test_d SKIPPED
        """)
E Failed: nomatch: '*::test_d SKIPPED'
E and: '============================= test session starts =============================='
E and: 'platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3'
E and: 'cachedir: .pytest_cache'
E and: 'rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_explicit_select0, configfile: pytest.ini'
E and: 'plugins: dependency-0.5.1'
E and: 'collecting ... collected 1 item'
E and: ''
E and: 'test_explicit_select.py::test_d SKIPPED (test_d depends on test_c)'
E and: ''
E and: '============================== 1 skipped in 0.01s =============================='
E remains unmatched: '*::test_d SKIPPED'
/home/tkloczko/rpmbuild/BUILD/pytest-dependency-0.5.1/tests/test_02_simple_dependency.py:165: Failed
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_explicit_select0, configfile: pytest.ini
plugins: dependency-0.5.1
collecting ... collected 1 item
test_explicit_select.py::test_d SKIPPED (test_d depends on test_c)
============================== 1 skipped in 0.01s ==============================
___________________________________________________________________________ test_depend_unknown ____________________________________________________________________________
ctestdir = <Testdir local('/tmp/pytest-of-tkloczko/pytest-677/test_depend_unknown0')>
    def test_depend_unknown(ctestdir):
        """Depend on an unknown test that is not even defined in the test set.
        Note that is not an error to depend on an undefined test, but the
        dependent test will be skipped since the non-existent dependency
        has not been run successfully.
        """
        ctestdir.makepyfile("""
            import pytest

            @pytest.mark.dependency()
            def test_a():
                pass

            @pytest.mark.dependency()
            def test_b():
                pass

            @pytest.mark.dependency()
            def test_c():
                pass

            @pytest.mark.dependency(depends=["test_x"])
            def test_d():
                pass
        """)
        result = ctestdir.runpytest("--verbose")
        result.assert_outcomes(passed=3, skipped=1, failed=0)
>       result.stdout.fnmatch_lines("""
            *::test_a PASSED
            *::test_b PASSED
            *::test_c PASSED
            *::test_d SKIPPED
        """)
E Failed: nomatch: '*::test_a PASSED'
E and: '============================= test session starts =============================='
E and: 'platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3'
E and: 'cachedir: .pytest_cache'
E and: 'rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_depend_unknown0, configfile: pytest.ini'
E and: 'plugins: dependency-0.5.1'
E and: 'collecting ... collected 4 items'
E and: ''
E fnmatch: '*::test_a PASSED'
E with: 'test_depend_unknown.py::test_a PASSED'
E fnmatch: '*::test_b PASSED'
E with: 'test_depend_unknown.py::test_b PASSED'
E fnmatch: '*::test_c PASSED'
E with: 'test_depend_unknown.py::test_c PASSED'
E nomatch: '*::test_d SKIPPED'
E and: 'test_depend_unknown.py::test_d SKIPPED (test_d depends on test_x)'
E and: ''
E and: '========================= 3 passed, 1 skipped in 0.01s ========================='
E remains unmatched: '*::test_d SKIPPED'
/home/tkloczko/rpmbuild/BUILD/pytest-dependency-0.5.1/tests/test_02_simple_dependency.py:198: Failed
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_depend_unknown0, configfile: pytest.ini
plugins: dependency-0.5.1
collecting ... collected 4 items
test_depend_unknown.py::test_a PASSED
test_depend_unknown.py::test_b PASSED
test_depend_unknown.py::test_c PASSED
test_depend_unknown.py::test_d SKIPPED (test_d depends on test_x)
========================= 3 passed, 1 skipped in 0.01s =========================
____________________________________________________________________________ test_class_simple _____________________________________________________________________________
ctestdir = <Testdir local('/tmp/pytest-of-tkloczko/pytest-677/test_class_simple0')>
    def test_class_simple(ctestdir):
        """Simple dependencies of test methods in a class.
        test_a() deliberately fails, some other methods depend on it, some don't.
        """
        ctestdir.makepyfile("""
            import pytest

            class TestClass(object):

                @pytest.mark.dependency()
                def test_a(self):
                    assert False

                @pytest.mark.dependency()
                def test_b(self):
                    pass

                @pytest.mark.dependency(depends=["TestClass::test_a"])
                def test_c(self):
                    pass

                @pytest.mark.dependency(depends=["TestClass::test_b"])
                def test_d(self):
                    pass

                @pytest.mark.dependency(depends=["TestClass::test_b",
                                                 "TestClass::test_c"])
                def test_e(self):
                    pass
        """)
        result = ctestdir.runpytest("--verbose")
        result.assert_outcomes(passed=2, skipped=2, failed=1)
>       result.stdout.fnmatch_lines("""
            *::TestClass::test_a FAILED
            *::TestClass::test_b PASSED
            *::TestClass::test_c SKIPPED
            *::TestClass::test_d PASSED
            *::TestClass::test_e SKIPPED
        """)
E Failed: nomatch: '*::TestClass::test_a FAILED'
E and: '============================= test session starts =============================='
E and: 'platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3'
E and: 'cachedir: .pytest_cache'
E and: 'rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_class_simple0, configfile: pytest.ini'
E and: 'plugins: dependency-0.5.1'
E and: 'collecting ... collected 5 items'
E and: ''
E fnmatch: '*::TestClass::test_a FAILED'
E with: 'test_class_simple.py::TestClass::test_a FAILED'
E fnmatch: '*::TestClass::test_b PASSED'
E with: 'test_class_simple.py::TestClass::test_b PASSED'
E nomatch: '*::TestClass::test_c SKIPPED'
E and: 'test_class_simple.py::TestClass::test_c SKIPPED (test_c depends on T...)'
E and: 'test_class_simple.py::TestClass::test_d PASSED'
E and: 'test_class_simple.py::TestClass::test_e SKIPPED (test_e depends on T...)'
E and: ''
E and: '=================================== FAILURES ==================================='
E and: '_______________________________ TestClass.test_a _______________________________'
E and: ''
E and: 'self = <test_class_simple.TestClass object at 0x7f015c465460>'
E and: ''
E and: ' @pytest.mark.dependency()'
E and: ' def test_a(self):'
E and: '> assert False'
E and: 'E assert False'
E and: ''
E and: 'test_class_simple.py:7: AssertionError'
E and: '=========================== short test summary info ============================'
E and: 'FAILED test_class_simple.py::TestClass::test_a - assert False'
E and: '==================== 1 failed, 2 passed, 2 skipped in 0.02s ===================='
E remains unmatched: '*::TestClass::test_c SKIPPED'
/home/tkloczko/rpmbuild/BUILD/pytest-dependency-0.5.1/tests/test_03_class.py:39: Failed
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_class_simple0, configfile: pytest.ini
plugins: dependency-0.5.1
collecting ... collected 5 items
test_class_simple.py::TestClass::test_a FAILED
test_class_simple.py::TestClass::test_b PASSED
test_class_simple.py::TestClass::test_c SKIPPED (test_c depends on T...)
test_class_simple.py::TestClass::test_d PASSED
test_class_simple.py::TestClass::test_e SKIPPED (test_e depends on T...)
=================================== FAILURES ===================================
_______________________________ TestClass.test_a _______________________________
self = <test_class_simple.TestClass object at 0x7f015c465460>
@pytest.mark.dependency()
def test_a(self):
> assert False
E assert False
test_class_simple.py:7: AssertionError
=========================== short test summary info ============================
FAILED test_class_simple.py::TestClass::test_a - assert False
==================== 1 failed, 2 passed, 2 skipped in 0.02s ====================
_________________________________________________________________________ test_class_simple_named __________________________________________________________________________
ctestdir = <Testdir local('/tmp/pytest-of-tkloczko/pytest-677/test_class_simple_named0')>
def test_class_simple_named(ctestdir):
"""Mostly the same as test_class_simple(), but name the test methods
now explicitely.
"""
ctestdir.makepyfile("""
import pytest
class TestClassNamed(object):
@pytest.mark.dependency(name="a")
def test_a(self):
assert False
@pytest.mark.dependency(name="b")
def test_b(self):
pass
@pytest.mark.dependency(name="c", depends=["a"])
def test_c(self):
pass
@pytest.mark.dependency(name="d", depends=["b"])
def test_d(self):
pass
@pytest.mark.dependency(name="e", depends=["b", "c"])
def test_e(self):
pass
""")
result = ctestdir.runpytest("--verbose")
result.assert_outcomes(passed=2, skipped=2, failed=1)
> result.stdout.fnmatch_lines("""
*::TestClassNamed::test_a FAILED
*::TestClassNamed::test_b PASSED
*::TestClassNamed::test_c SKIPPED
*::TestClassNamed::test_d PASSED
*::TestClassNamed::test_e SKIPPED
""")
E Failed: nomatch: '*::TestClassNamed::test_a FAILED'
E and: '============================= test session starts =============================='
E and: 'platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3'
E and: 'cachedir: .pytest_cache'
E and: 'rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_class_simple_named0, configfile: pytest.ini'
E and: 'plugins: dependency-0.5.1'
E and: 'collecting ... collected 5 items'
E and: ''
E fnmatch: '*::TestClassNamed::test_a FAILED'
E with: 'test_class_simple_named.py::TestClassNamed::test_a FAILED'
E fnmatch: '*::TestClassNamed::test_b PASSED'
E with: 'test_class_simple_named.py::TestClassNamed::test_b PASSED'
E nomatch: '*::TestClassNamed::test_c SKIPPED'
E and: 'test_class_simple_named.py::TestClassNamed::test_c SKIPPED (test_c d...)'
E and: 'test_class_simple_named.py::TestClassNamed::test_d PASSED'
E and: 'test_class_simple_named.py::TestClassNamed::test_e SKIPPED (test_e d...)'
E and: ''
E and: '=================================== FAILURES ==================================='
E and: '____________________________ TestClassNamed.test_a _____________________________'
E and: ''
E and: 'self = <test_class_simple_named.TestClassNamed object at 0x7f015c3d85e0>'
E and: ''
E and: ' @pytest.mark.dependency(name="a")'
E and: ' def test_a(self):'
E and: '> assert False'
E and: 'E assert False'
E and: ''
E and: 'test_class_simple_named.py:7: AssertionError'
E and: '=========================== short test summary info ============================'
E and: 'FAILED test_class_simple_named.py::TestClassNamed::test_a - assert False'
E and: '==================== 1 failed, 2 passed, 2 skipped in 0.02s ===================='
E remains unmatched: '*::TestClassNamed::test_c SKIPPED'
/home/tkloczko/rpmbuild/BUILD/pytest-dependency-0.5.1/tests/test_03_class.py:79: Failed
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_class_simple_named0, configfile: pytest.ini
plugins: dependency-0.5.1
collecting ... collected 5 items
test_class_simple_named.py::TestClassNamed::test_a FAILED
test_class_simple_named.py::TestClassNamed::test_b PASSED
test_class_simple_named.py::TestClassNamed::test_c SKIPPED (test_c d...)
test_class_simple_named.py::TestClassNamed::test_d PASSED
test_class_simple_named.py::TestClassNamed::test_e SKIPPED (test_e d...)
=================================== FAILURES ===================================
____________________________ TestClassNamed.test_a _____________________________
self = <test_class_simple_named.TestClassNamed object at 0x7f015c3d85e0>
@pytest.mark.dependency(name="a")
def test_a(self):
> assert False
E assert False
test_class_simple_named.py:7: AssertionError
=========================== short test summary info ============================
FAILED test_class_simple_named.py::TestClassNamed::test_a - assert False
==================== 1 failed, 2 passed, 2 skipped in 0.02s ====================
_________________________________________________________________________ test_class_default_name __________________________________________________________________________
ctestdir = <Testdir local('/tmp/pytest-of-tkloczko/pytest-677/test_class_default_name0')>
def test_class_default_name(ctestdir):
"""Issue #6: for methods of test classes, the default name used to be
the method name. This could have caused conflicts if there is a
function having the same name outside the class. In the following
example, before fixing this issue, the method test_a() of class
TestClass would have shadowed the failure of function test_a().
Now the class name is prepended to the default test name, removing
this conflict.
"""
ctestdir.makepyfile("""
import pytest
@pytest.mark.dependency()
def test_a():
assert False
class TestClass(object):
@pytest.mark.dependency()
def test_a(self):
pass
@pytest.mark.dependency(depends=["test_a"])
def test_b():
pass
""")
result = ctestdir.runpytest("--verbose")
result.assert_outcomes(passed=1, skipped=1, failed=1)
> result.stdout.fnmatch_lines("""
*::test_a FAILED
*::TestClass::test_a PASSED
*::test_b SKIPPED
""")
E Failed: nomatch: '*::test_a FAILED'
E and: '============================= test session starts =============================='
E and: 'platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3'
E and: 'cachedir: .pytest_cache'
E and: 'rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_class_default_name0, configfile: pytest.ini'
E and: 'plugins: dependency-0.5.1'
E and: 'collecting ... collected 3 items'
E and: ''
E fnmatch: '*::test_a FAILED'
E with: 'test_class_default_name.py::test_a FAILED'
E fnmatch: '*::TestClass::test_a PASSED'
E with: 'test_class_default_name.py::TestClass::test_a PASSED'
E nomatch: '*::test_b SKIPPED'
E and: 'test_class_default_name.py::test_b SKIPPED (test_b depends on test_a)'
E and: ''
E and: '=================================== FAILURES ==================================='
E and: '____________________________________ test_a ____________________________________'
E and: ''
E and: ' @pytest.mark.dependency()'
E and: ' def test_a():'
E and: '> assert False'
E and: 'E assert False'
E and: ''
E and: 'test_class_default_name.py:5: AssertionError'
E and: '=========================== short test summary info ============================'
E and: 'FAILED test_class_default_name.py::test_a - assert False'
E and: '==================== 1 failed, 1 passed, 1 skipped in 0.01s ===================='
E remains unmatched: '*::test_b SKIPPED'
/home/tkloczko/rpmbuild/BUILD/pytest-dependency-0.5.1/tests/test_03_class.py:117: Failed
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_class_default_name0, configfile: pytest.ini
plugins: dependency-0.5.1
collecting ... collected 3 items
test_class_default_name.py::test_a FAILED
test_class_default_name.py::TestClass::test_a PASSED
test_class_default_name.py::test_b SKIPPED (test_b depends on test_a)
=================================== FAILURES ===================================
____________________________________ test_a ____________________________________
@pytest.mark.dependency()
def test_a():
> assert False
E assert False
test_class_default_name.py:5: AssertionError
=========================== short test summary info ============================
FAILED test_class_default_name.py::test_a - assert False
==================== 1 failed, 1 passed, 1 skipped in 0.01s ====================
______________________________________________________________________________ test_multiple _______________________________________________________________________________
ctestdir = <Testdir local('/tmp/pytest-of-tkloczko/pytest-677/test_multiple0')>
def test_multiple(ctestdir):
ctestdir.makepyfile("""
import pytest
@pytest.mark.dependency(name="a")
def test_a():
pytest.skip("explicit skip")
@pytest.mark.dependency(name="b")
def test_b():
assert False
@pytest.mark.dependency(name="c")
def test_c():
pass
@pytest.mark.dependency(name="d")
def test_d():
pass
@pytest.mark.dependency(name="e")
def test_e():
pass
@pytest.mark.dependency(name="f", depends=["a", "c"])
def test_f():
pass
@pytest.mark.dependency(name="g", depends=["b", "d"])
def test_g():
pass
@pytest.mark.dependency(name="h", depends=["c", "e"])
def test_h():
pass
@pytest.mark.dependency(name="i", depends=["f", "h"])
def test_i():
pass
@pytest.mark.dependency(name="j", depends=["d", "h"])
def test_j():
pass
@pytest.mark.dependency(name="k", depends=["g", "i", "j"])
def test_k():
pass
""")
result = ctestdir.runpytest("--verbose")
result.assert_outcomes(passed=5, skipped=5, failed=1)
> result.stdout.fnmatch_lines("""
*::test_a SKIPPED
*::test_b FAILED
*::test_c PASSED
*::test_d PASSED
*::test_e PASSED
*::test_f SKIPPED
*::test_g SKIPPED
*::test_h PASSED
*::test_i SKIPPED
*::test_j PASSED
*::test_k SKIPPED
""")
E Failed: nomatch: '*::test_a SKIPPED'
E and: '============================= test session starts =============================='
E and: 'platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3'
E and: 'cachedir: .pytest_cache'
E and: 'rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_multiple0, configfile: pytest.ini'
E and: 'plugins: dependency-0.5.1'
E and: 'collecting ... collected 11 items'
E and: ''
E and: 'test_multiple.py::test_a SKIPPED (explicit skip)'
E and: 'test_multiple.py::test_b FAILED'
E and: 'test_multiple.py::test_c PASSED'
E and: 'test_multiple.py::test_d PASSED'
E and: 'test_multiple.py::test_e PASSED'
E and: 'test_multiple.py::test_f SKIPPED (test_f depends on a)'
E and: 'test_multiple.py::test_g SKIPPED (test_g depends on b)'
E and: 'test_multiple.py::test_h PASSED'
E and: 'test_multiple.py::test_i SKIPPED (test_i depends on f)'
E and: 'test_multiple.py::test_j PASSED'
E and: 'test_multiple.py::test_k SKIPPED (test_k depends on g)'
E and: ''
E and: '=================================== FAILURES ==================================='
E and: '____________________________________ test_b ____________________________________'
E and: ''
E and: ' @pytest.mark.dependency(name="b")'
E and: ' def test_b():'
E and: '> assert False'
E and: 'E assert False'
E and: ''
E and: 'test_multiple.py:9: AssertionError'
E and: '=========================== short test summary info ============================'
E and: 'FAILED test_multiple.py::test_b - assert False'
E and: '==================== 1 failed, 5 passed, 5 skipped in 0.03s ===================='
E remains unmatched: '*::test_a SKIPPED'
/home/tkloczko/rpmbuild/BUILD/pytest-dependency-0.5.1/tests/test_03_multiple_dependency.py:57: Failed
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_multiple0, configfile: pytest.ini
plugins: dependency-0.5.1
collecting ... collected 11 items
test_multiple.py::test_a SKIPPED (explicit skip)
test_multiple.py::test_b FAILED
test_multiple.py::test_c PASSED
test_multiple.py::test_d PASSED
test_multiple.py::test_e PASSED
test_multiple.py::test_f SKIPPED (test_f depends on a)
test_multiple.py::test_g SKIPPED (test_g depends on b)
test_multiple.py::test_h PASSED
test_multiple.py::test_i SKIPPED (test_i depends on f)
test_multiple.py::test_j PASSED
test_multiple.py::test_k SKIPPED (test_k depends on g)
=================================== FAILURES ===================================
____________________________________ test_b ____________________________________
@pytest.mark.dependency(name="b")
def test_b():
> assert False
E assert False
test_multiple.py:9: AssertionError
=========================== short test summary info ============================
FAILED test_multiple.py::test_b - assert False
==================== 1 failed, 5 passed, 5 skipped in 0.03s ====================
______________________________________________________________________________ test_multiple _______________________________________________________________________________
ctestdir = <Testdir local('/tmp/pytest-of-tkloczko/pytest-677/test_multiple1')>
def test_multiple(ctestdir):
ctestdir.makepyfile("""
import pytest
_md = pytest.mark.dependency
@pytest.mark.parametrize("x,y", [
pytest.param(0, 0, marks=_md(name="a1")),
pytest.param(0, 1, marks=_md(name="a2")),
pytest.param(1, 0, marks=_md(name="a3")),
pytest.param(1, 1, marks=_md(name="a4"))
])
def test_a(x,y):
assert x==0 or y==0
@pytest.mark.parametrize("u,v", [
pytest.param(1, 2, marks=_md(name="b1", depends=["a1", "a2"])),
pytest.param(1, 3, marks=_md(name="b2", depends=["a1", "a3"])),
pytest.param(1, 4, marks=_md(name="b3", depends=["a1", "a4"])),
pytest.param(2, 3, marks=_md(name="b4", depends=["a2", "a3"])),
pytest.param(2, 4, marks=_md(name="b5", depends=["a2", "a4"])),
pytest.param(3, 4, marks=_md(name="b6", depends=["a3", "a4"]))
])
def test_b(u,v):
pass
@pytest.mark.parametrize("w", [
pytest.param(1, marks=_md(name="c1", depends=["b1", "b3", "b5"])),
pytest.param(2, marks=_md(name="c2", depends=["b1", "b3", "b6"])),
pytest.param(3, marks=_md(name="c3", depends=["b1", "b2", "b4"]))
])
def test_c(w):
pass
""")
result = ctestdir.runpytest("--verbose")
result.assert_outcomes(passed=7, skipped=5, failed=1)
> result.stdout.fnmatch_lines("""
*::test_a?0-0? PASSED
*::test_a?0-1? PASSED
*::test_a?1-0? PASSED
*::test_a?1-1? FAILED
*::test_b?1-2? PASSED
*::test_b?1-3? PASSED
*::test_b?1-4? SKIPPED
*::test_b?2-3? PASSED
*::test_b?2-4? SKIPPED
*::test_b?3-4? SKIPPED
*::test_c?1? SKIPPED
*::test_c?2? SKIPPED
*::test_c?3? PASSED
""")
E Failed: nomatch: '*::test_a?0-0? PASSED'
E and: '============================= test session starts =============================='
E and: 'platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3'
E and: 'cachedir: .pytest_cache'
E and: 'rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_multiple1, configfile: pytest.ini'
E and: 'plugins: dependency-0.5.1'
E and: 'collecting ... collected 13 items'
E and: ''
E fnmatch: '*::test_a?0-0? PASSED'
E with: 'test_multiple.py::test_a[0-0] PASSED'
E fnmatch: '*::test_a?0-1? PASSED'
E with: 'test_multiple.py::test_a[0-1] PASSED'
E fnmatch: '*::test_a?1-0? PASSED'
E with: 'test_multiple.py::test_a[1-0] PASSED'
E fnmatch: '*::test_a?1-1? FAILED'
E with: 'test_multiple.py::test_a[1-1] FAILED'
E fnmatch: '*::test_b?1-2? PASSED'
E with: 'test_multiple.py::test_b[1-2] PASSED'
E fnmatch: '*::test_b?1-3? PASSED'
E with: 'test_multiple.py::test_b[1-3] PASSED'
E nomatch: '*::test_b?1-4? SKIPPED'
E and: 'test_multiple.py::test_b[1-4] SKIPPED (test_b[1-4] depends on a4)'
E and: 'test_multiple.py::test_b[2-3] PASSED'
E and: 'test_multiple.py::test_b[2-4] SKIPPED (test_b[2-4] depends on a4)'
E and: 'test_multiple.py::test_b[3-4] SKIPPED (test_b[3-4] depends on a4)'
E and: 'test_multiple.py::test_c[1] SKIPPED (test_c[1] depends on b3)'
E and: 'test_multiple.py::test_c[2] SKIPPED (test_c[2] depends on b3)'
E and: 'test_multiple.py::test_c[3] PASSED'
E and: ''
E and: '=================================== FAILURES ==================================='
E and: '_________________________________ test_a[1-1] __________________________________'
E and: ''
E and: 'x = 1, y = 1'
E and: ''
E and: ' @pytest.mark.parametrize("x,y", ['
E and: ' pytest.param(0, 0, marks=_md(name="a1")),'
E and: ' pytest.param(0, 1, marks=_md(name="a2")),'
E and: ' pytest.param(1, 0, marks=_md(name="a3")),'
E and: ' pytest.param(1, 1, marks=_md(name="a4"))'
E and: ' ])'
E and: ' def test_a(x,y):'
E and: '> assert x==0 or y==0'
E and: 'E assert (1 == 0'
E and: 'E +1'
E and: 'E -0 or 1 == 0'
E and: 'E +1'
E and: 'E -0)'
E and: ''
E and: 'test_multiple.py:12: AssertionError'
E and: '=========================== short test summary info ============================'
E and: 'FAILED test_multiple.py::test_a[1-1] - assert (1 == 0'
E and: '==================== 1 failed, 7 passed, 5 skipped in 0.03s ===================='
E remains unmatched: '*::test_b?1-4? SKIPPED'
/home/tkloczko/rpmbuild/BUILD/pytest-dependency-0.5.1/tests/test_03_param.py:43: Failed
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_multiple1, configfile: pytest.ini
plugins: dependency-0.5.1
collecting ... collected 13 items
test_multiple.py::test_a[0-0] PASSED
test_multiple.py::test_a[0-1] PASSED
test_multiple.py::test_a[1-0] PASSED
test_multiple.py::test_a[1-1] FAILED
test_multiple.py::test_b[1-2] PASSED
test_multiple.py::test_b[1-3] PASSED
test_multiple.py::test_b[1-4] SKIPPED (test_b[1-4] depends on a4)
test_multiple.py::test_b[2-3] PASSED
test_multiple.py::test_b[2-4] SKIPPED (test_b[2-4] depends on a4)
test_multiple.py::test_b[3-4] SKIPPED (test_b[3-4] depends on a4)
test_multiple.py::test_c[1] SKIPPED (test_c[1] depends on b3)
test_multiple.py::test_c[2] SKIPPED (test_c[2] depends on b3)
test_multiple.py::test_c[3] PASSED
=================================== FAILURES ===================================
_________________________________ test_a[1-1] __________________________________
x = 1, y = 1
@pytest.mark.parametrize("x,y", [
pytest.param(0, 0, marks=_md(name="a1")),
pytest.param(0, 1, marks=_md(name="a2")),
pytest.param(1, 0, marks=_md(name="a3")),
pytest.param(1, 1, marks=_md(name="a4"))
])
def test_a(x,y):
> assert x==0 or y==0
E assert (1 == 0
E +1
E -0 or 1 == 0
E +1
E -0)
test_multiple.py:12: AssertionError
=========================== short test summary info ============================
FAILED test_multiple.py::test_a[1-1] - assert (1 == 0
==================== 1 failed, 7 passed, 5 skipped in 0.03s ====================
_________________________________________________________________________ test_skip_depend_runtime _________________________________________________________________________
ctestdir = <Testdir local('/tmp/pytest-of-tkloczko/pytest-677/test_skip_depend_runtime0')>
def test_skip_depend_runtime(ctestdir):
"""One test is skipped, other dependent tests are skipped as well.
This also includes indirect dependencies.
"""
ctestdir.makepyfile("""
import pytest
from pytest_dependency import depends
@pytest.mark.dependency()
def test_a():
pass
@pytest.mark.dependency()
def test_b():
pytest.skip("explicit skip")
@pytest.mark.dependency()
def test_c(request):
depends(request, ["test_b"])
pass
@pytest.mark.dependency()
def test_d(request):
depends(request, ["test_a", "test_c"])
pass
""")
result = ctestdir.runpytest("--verbose")
result.assert_outcomes(passed=1, skipped=3, failed=0)
> result.stdout.fnmatch_lines("""
*::test_a PASSED
*::test_b SKIPPED
*::test_c SKIPPED
*::test_d SKIPPED
""")
E Failed: nomatch: '*::test_a PASSED'
E and: '============================= test session starts =============================='
E and: 'platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3'
E and: 'cachedir: .pytest_cache'
E and: 'rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_skip_depend_runtime0, configfile: pytest.ini'
E and: 'plugins: dependency-0.5.1'
E and: 'collecting ... collected 4 items'
E and: ''
E fnmatch: '*::test_a PASSED'
E with: 'test_skip_depend_runtime.py::test_a PASSED'
E nomatch: '*::test_b SKIPPED'
E and: 'test_skip_depend_runtime.py::test_b SKIPPED (explicit skip)'
E and: 'test_skip_depend_runtime.py::test_c SKIPPED (test_c depends on test_b)'
E and: 'test_skip_depend_runtime.py::test_d SKIPPED (test_d depends on test_c)'
E and: ''
E and: '========================= 1 passed, 3 skipped in 0.01s ========================='
E remains unmatched: '*::test_b SKIPPED'
/home/tkloczko/rpmbuild/BUILD/pytest-dependency-0.5.1/tests/test_03_runtime.py:35: Failed
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_skip_depend_runtime0, configfile: pytest.ini
plugins: dependency-0.5.1
collecting ... collected 4 items
test_skip_depend_runtime.py::test_a PASSED
test_skip_depend_runtime.py::test_b SKIPPED (explicit skip)
test_skip_depend_runtime.py::test_c SKIPPED (test_c depends on test_b)
test_skip_depend_runtime.py::test_d SKIPPED (test_d depends on test_c)
========================= 1 passed, 3 skipped in 0.01s =========================
____________________________________________________________________________ test_scope_module _____________________________________________________________________________
ctestdir = <Testdir local('/tmp/pytest-of-tkloczko/pytest-677/test_scope_module0')>
def test_scope_module(ctestdir):
"""One single module, module scope is explicitely set in the
pytest.mark.dependency() marker.
"""
ctestdir.makepyfile("""
import pytest
@pytest.mark.dependency()
def test_a():
assert False
@pytest.mark.dependency()
def test_b():
pass
@pytest.mark.dependency(depends=["test_a"], scope='module')
def test_c():
pass
@pytest.mark.dependency(depends=["test_b"], scope='module')
def test_d():
pass
@pytest.mark.dependency(depends=["test_b", "test_c"], scope='module')
def test_e():
pass
""")
result = ctestdir.runpytest("--verbose")
result.assert_outcomes(passed=2, skipped=2, failed=1)
> result.stdout.fnmatch_lines("""
test_scope_module.py::test_a FAILED
test_scope_module.py::test_b PASSED
test_scope_module.py::test_c SKIPPED
test_scope_module.py::test_d PASSED
test_scope_module.py::test_e SKIPPED
""")
E Failed: nomatch: 'test_scope_module.py::test_a FAILED'
E and: '============================= test session starts =============================='
E and: 'platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3'
E and: 'cachedir: .pytest_cache'
E and: 'rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_scope_module0, configfile: pytest.ini'
E and: 'plugins: dependency-0.5.1'
E and: 'collecting ... collected 5 items'
E and: ''
E exact match: 'test_scope_module.py::test_a FAILED'
E exact match: 'test_scope_module.py::test_b PASSED'
E nomatch: 'test_scope_module.py::test_c SKIPPED'
E and: 'test_scope_module.py::test_c SKIPPED (test_c depends on test_a)'
E and: 'test_scope_module.py::test_d PASSED'
E and: 'test_scope_module.py::test_e SKIPPED (test_e depends on test_c)'
E and: ''
E and: '=================================== FAILURES ==================================='
E and: '____________________________________ test_a ____________________________________'
E and: ''
E and: ' @pytest.mark.dependency()'
E and: ' def test_a():'
E and: '> assert False'
E and: 'E assert False'
E and: ''
E and: 'test_scope_module.py:5: AssertionError'
E and: '=========================== short test summary info ============================'
E and: 'FAILED test_scope_module.py::test_a - assert False'
E and: '==================== 1 failed, 2 passed, 2 skipped in 0.02s ===================='
E remains unmatched: 'test_scope_module.py::test_c SKIPPED'
/home/tkloczko/rpmbuild/BUILD/pytest-dependency-0.5.1/tests/test_03_scope.py:36: Failed
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_scope_module0, configfile: pytest.ini
plugins: dependency-0.5.1
collecting ... collected 5 items
test_scope_module.py::test_a FAILED
test_scope_module.py::test_b PASSED
test_scope_module.py::test_c SKIPPED (test_c depends on test_a)
test_scope_module.py::test_d PASSED
test_scope_module.py::test_e SKIPPED (test_e depends on test_c)
=================================== FAILURES ===================================
____________________________________ test_a ____________________________________
@pytest.mark.dependency()
def test_a():
> assert False
E assert False
test_scope_module.py:5: AssertionError
=========================== short test summary info ============================
FAILED test_scope_module.py::test_a - assert False
==================== 1 failed, 2 passed, 2 skipped in 0.02s ====================
____________________________________________________________________________ test_scope_session ____________________________________________________________________________
ctestdir = <Testdir local('/tmp/pytest-of-tkloczko/pytest-677/test_scope_session0')>
def test_scope_session(ctestdir):
"""Two modules, some cross module dependencies in session scope.
"""
ctestdir.makepyfile(test_scope_session_01="""
import pytest
@pytest.mark.dependency()
def test_a():
pass
@pytest.mark.dependency()
def test_b():
assert False
@pytest.mark.dependency(depends=["test_a"])
def test_c():
pass
class TestClass(object):
@pytest.mark.dependency()
def test_b(self):
pass
""", test_scope_session_02="""
import pytest
@pytest.mark.dependency()
def test_a():
assert False
@pytest.mark.dependency(
depends=["test_scope_session_01.py::test_a",
"test_scope_session_01.py::test_c"],
scope='session'
)
def test_e():
pass
@pytest.mark.dependency(
depends=["test_scope_session_01.py::test_b"],
scope='session'
)
def test_f():
pass
@pytest.mark.dependency(
depends=["test_scope_session_02.py::test_e"],
scope='session'
)
def test_g():
pass
@pytest.mark.dependency(
depends=["test_scope_session_01.py::TestClass::test_b"],
scope='session'
)
def test_h():
pass
""")
result = ctestdir.runpytest("--verbose")
result.assert_outcomes(passed=6, skipped=1, failed=2)
> result.stdout.fnmatch_lines("""
test_scope_session_01.py::test_a PASSED
test_scope_session_01.py::test_b FAILED
test_scope_session_01.py::test_c PASSED
test_scope_session_01.py::TestClass::test_b PASSED
test_scope_session_02.py::test_a FAILED
test_scope_session_02.py::test_e PASSED
test_scope_session_02.py::test_f SKIPPED
test_scope_session_02.py::test_g PASSED
test_scope_session_02.py::test_h PASSED
""")
E Failed: nomatch: 'test_scope_session_01.py::test_a PASSED'
E and: '============================= test session starts =============================='
E and: 'platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3'
E and: 'cachedir: .pytest_cache'
E and: 'rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_scope_session0, configfile: pytest.ini'
E and: 'plugins: dependency-0.5.1'
E and: 'collecting ... collected 9 items'
E and: ''
E exact match: 'test_scope_session_01.py::test_a PASSED'
E exact match: 'test_scope_session_01.py::test_b FAILED'
E exact match: 'test_scope_session_01.py::test_c PASSED'
E exact match: 'test_scope_session_01.py::TestClass::test_b PASSED'
E exact match: 'test_scope_session_02.py::test_a FAILED'
E exact match: 'test_scope_session_02.py::test_e PASSED'
E nomatch: 'test_scope_session_02.py::test_f SKIPPED'
E and: 'test_scope_session_02.py::test_f SKIPPED (test_f depends on test_sco...)'
E and: 'test_scope_session_02.py::test_g PASSED'
E and: 'test_scope_session_02.py::test_h PASSED'
E and: ''
E and: '=================================== FAILURES ==================================='
E and: '____________________________________ test_b ____________________________________'
E and: ''
E and: ' @pytest.mark.dependency()'
E and: ' def test_b():'
E and: '> assert False'
E and: 'E assert False'
E and: ''
E and: 'test_scope_session_01.py:9: AssertionError'
E and: '____________________________________ test_a ____________________________________'
E and: ''
E and: ' @pytest.mark.dependency()'
E and: ' def test_a():'
E and: '> assert False'
E and: 'E assert False'
E and: ''
E and: 'test_scope_session_02.py:5: AssertionError'
E and: '=========================== short test summary info ============================'
E and: 'FAILED test_scope_session_01.py::test_b - assert False'
E and: 'FAILED test_scope_session_02.py::test_a - assert False'
E and: '==================== 2 failed, 6 passed, 1 skipped in 0.03s ===================='
E remains unmatched: 'test_scope_session_02.py::test_f SKIPPED'
/home/tkloczko/rpmbuild/BUILD/pytest-dependency-0.5.1/tests/test_03_scope.py:105: Failed
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_scope_session0, configfile: pytest.ini
plugins: dependency-0.5.1
collecting ... collected 9 items
test_scope_session_01.py::test_a PASSED
test_scope_session_01.py::test_b FAILED
test_scope_session_01.py::test_c PASSED
test_scope_session_01.py::TestClass::test_b PASSED
test_scope_session_02.py::test_a FAILED
test_scope_session_02.py::test_e PASSED
test_scope_session_02.py::test_f SKIPPED (test_f depends on test_sco...)
test_scope_session_02.py::test_g PASSED
test_scope_session_02.py::test_h PASSED
=================================== FAILURES ===================================
____________________________________ test_b ____________________________________
@pytest.mark.dependency()
def test_b():
> assert False
E assert False
test_scope_session_01.py:9: AssertionError
____________________________________ test_a ____________________________________
@pytest.mark.dependency()
def test_a():
> assert False
E assert False
test_scope_session_02.py:5: AssertionError
=========================== short test summary info ============================
FAILED test_scope_session_01.py::test_b - assert False
FAILED test_scope_session_02.py::test_a - assert False
==================== 2 failed, 6 passed, 1 skipped in 0.03s ====================
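Every failure in this log has the same shape: `assert_outcomes` passes, but `fnmatch_lines` fails on the `SKIPPED` lines. The reason is visible in the "nomatch"/"and:" pairs above: with pytest 6.x (6.2.5 here), verbose output appends the skip reason in parentheses, e.g. `SKIPPED (test_f depends on test_sco...)`, so a pattern ending in a bare `SKIPPED` no longer matches the whole line. A minimal sketch with the stdlib `fnmatch` module, using one of the lines from the log above, shows the mismatch and the usual remedy of a trailing `*`:

```python
from fnmatch import fnmatchcase

# Verbose output from pytest >= 6 appends the skip reason in parentheses.
line = "test_scope_session_02.py::test_f SKIPPED (test_f depends on test_sco...)"

# The expectation written for older pytest no longer matches the full line...
assert not fnmatchcase(line, "test_scope_session_02.py::test_f SKIPPED")

# ...but a trailing wildcard tolerates the appended reason.
assert fnmatchcase(line, "test_scope_session_02.py::test_f SKIPPED*")
```

This suggests the expected patterns in `tests/test_03_scope.py` only need a trailing `*` on each `SKIPPED` line to pass on both old and new pytest.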
____________________________________________________________________________ test_scope_package ____________________________________________________________________________
ctestdir = <Testdir local('/tmp/pytest-of-tkloczko/pytest-677/test_scope_package0')>
def test_scope_package(ctestdir):
"""Two packages, some cross module dependencies within the package and
across package boundaries.
"""
ctestdir.mkpydir("test_scope_package_a")
ctestdir.mkpydir("test_scope_package_b")
srcs = {
'test_scope_package_a/test_01': """
import pytest
@pytest.mark.dependency()
def test_a():
pass
""",
'test_scope_package_b/test_02': """
import pytest
@pytest.mark.dependency()
def test_c():
pass
@pytest.mark.dependency()
def test_d():
assert False
""",
'test_scope_package_b/test_03': """
import pytest
@pytest.mark.dependency(
depends=["test_scope_package_a/test_01.py::test_a"],
scope='session'
)
def test_e():
pass
@pytest.mark.dependency(
depends=["test_scope_package_a/test_01.py::test_a"],
scope='package'
)
def test_f():
pass
@pytest.mark.dependency(
depends=["test_scope_package_b/test_02.py::test_c"],
scope='package'
)
def test_g():
pass
@pytest.mark.dependency(
depends=["test_scope_package_b/test_02.py::test_d"],
scope='package'
)
def test_h():
pass
""",
}
ctestdir.makepyfile(**srcs)
result = ctestdir.runpytest("--verbose")
result.assert_outcomes(passed=4, skipped=2, failed=1)
> result.stdout.fnmatch_lines("""
test_scope_package_a/test_01.py::test_a PASSED
test_scope_package_b/test_02.py::test_c PASSED
test_scope_package_b/test_02.py::test_d FAILED
test_scope_package_b/test_03.py::test_e PASSED
test_scope_package_b/test_03.py::test_f SKIPPED
test_scope_package_b/test_03.py::test_g PASSED
test_scope_package_b/test_03.py::test_h SKIPPED
""")
E Failed: nomatch: 'test_scope_package_a/test_01.py::test_a PASSED'
E and: '============================= test session starts =============================='
E and: 'platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3'
E and: 'cachedir: .pytest_cache'
E and: 'rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_scope_package0, configfile: pytest.ini'
E and: 'plugins: dependency-0.5.1'
E and: 'collecting ... collected 7 items'
E and: ''
E exact match: 'test_scope_package_a/test_01.py::test_a PASSED'
E exact match: 'test_scope_package_b/test_02.py::test_c PASSED'
E exact match: 'test_scope_package_b/test_02.py::test_d FAILED'
E exact match: 'test_scope_package_b/test_03.py::test_e PASSED'
E nomatch: 'test_scope_package_b/test_03.py::test_f SKIPPED'
E and: 'test_scope_package_b/test_03.py::test_f SKIPPED (test_f depends on t...)'
E and: 'test_scope_package_b/test_03.py::test_g PASSED'
E and: 'test_scope_package_b/test_03.py::test_h SKIPPED (test_h depends on t...)'
E and: ''
E and: '=================================== FAILURES ==================================='
E and: '____________________________________ test_d ____________________________________'
E and: ''
E and: ' @pytest.mark.dependency()'
E and: ' def test_d():'
E and: '> assert False'
E and: 'E assert False'
E and: ''
E and: 'test_scope_package_b/test_02.py:9: AssertionError'
E and: '=========================== short test summary info ============================'
E and: 'FAILED test_scope_package_b/test_02.py::test_d - assert False'
E and: '==================== 1 failed, 4 passed, 2 skipped in 0.03s ===================='
E remains unmatched: 'test_scope_package_b/test_03.py::test_f SKIPPED'
/home/tkloczko/rpmbuild/BUILD/pytest-dependency-0.5.1/tests/test_03_scope.py:177: Failed
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_scope_package0, configfile: pytest.ini
plugins: dependency-0.5.1
collecting ... collected 7 items
test_scope_package_a/test_01.py::test_a PASSED
test_scope_package_b/test_02.py::test_c PASSED
test_scope_package_b/test_02.py::test_d FAILED
test_scope_package_b/test_03.py::test_e PASSED
test_scope_package_b/test_03.py::test_f SKIPPED (test_f depends on t...)
test_scope_package_b/test_03.py::test_g PASSED
test_scope_package_b/test_03.py::test_h SKIPPED (test_h depends on t...)
=================================== FAILURES ===================================
____________________________________ test_d ____________________________________
@pytest.mark.dependency()
def test_d():
> assert False
E assert False
test_scope_package_b/test_02.py:9: AssertionError
=========================== short test summary info ============================
FAILED test_scope_package_b/test_02.py::test_d - assert False
==================== 1 failed, 4 passed, 2 skipped in 0.03s ====================
_____________________________________________________________________________ test_scope_class _____________________________________________________________________________
ctestdir = <Testdir local('/tmp/pytest-of-tkloczko/pytest-677/test_scope_class0')>
def test_scope_class(ctestdir):
"""Dependencies in class scope.
"""
ctestdir.makepyfile("""
import pytest
@pytest.mark.dependency()
def test_a():
assert False
@pytest.mark.dependency()
def test_b():
pass
class TestClass1(object):
@pytest.mark.dependency()
def test_c(self):
pass
class TestClass2(object):
@pytest.mark.dependency()
def test_a(self):
pass
@pytest.mark.dependency()
def test_b(self):
assert False
@pytest.mark.dependency(depends=["test_a"])
def test_d(self):
pass
@pytest.mark.dependency(depends=["test_b"])
def test_e(self):
pass
@pytest.mark.dependency(depends=["test_a"], scope='class')
def test_f(self):
pass
@pytest.mark.dependency(depends=["test_b"], scope='class')
def test_g(self):
pass
@pytest.mark.dependency(depends=["test_c"], scope='class')
def test_h(self):
pass
""")
result = ctestdir.runpytest("--verbose")
result.assert_outcomes(passed=5, skipped=3, failed=2)
> result.stdout.fnmatch_lines("""
test_scope_class.py::test_a FAILED
test_scope_class.py::test_b PASSED
test_scope_class.py::TestClass1::test_c PASSED
test_scope_class.py::TestClass2::test_a PASSED
test_scope_class.py::TestClass2::test_b FAILED
test_scope_class.py::TestClass2::test_d SKIPPED
test_scope_class.py::TestClass2::test_e PASSED
test_scope_class.py::TestClass2::test_f PASSED
test_scope_class.py::TestClass2::test_g SKIPPED
test_scope_class.py::TestClass2::test_h SKIPPED
""")
E Failed: nomatch: 'test_scope_class.py::test_a FAILED'
E and: '============================= test session starts =============================='
E and: 'platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3'
E and: 'cachedir: .pytest_cache'
E and: 'rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_scope_class0, configfile: pytest.ini'
E and: 'plugins: dependency-0.5.1'
E and: 'collecting ... collected 10 items'
E and: ''
E exact match: 'test_scope_class.py::test_a FAILED'
E exact match: 'test_scope_class.py::test_b PASSED'
E exact match: 'test_scope_class.py::TestClass1::test_c PASSED'
E exact match: 'test_scope_class.py::TestClass2::test_a PASSED'
E exact match: 'test_scope_class.py::TestClass2::test_b FAILED'
E nomatch: 'test_scope_class.py::TestClass2::test_d SKIPPED'
E and: 'test_scope_class.py::TestClass2::test_d SKIPPED (test_d depends on t...)'
E and: 'test_scope_class.py::TestClass2::test_e PASSED'
E and: 'test_scope_class.py::TestClass2::test_f PASSED'
E and: 'test_scope_class.py::TestClass2::test_g SKIPPED (test_g depends on t...)'
E and: 'test_scope_class.py::TestClass2::test_h SKIPPED (test_h depends on t...)'
E and: ''
E and: '=================================== FAILURES ==================================='
E and: '____________________________________ test_a ____________________________________'
E and: ''
E and: ' @pytest.mark.dependency()'
E and: ' def test_a():'
E and: '> assert False'
E and: 'E assert False'
E and: ''
E and: 'test_scope_class.py:5: AssertionError'
E and: '______________________________ TestClass2.test_b _______________________________'
E and: ''
E and: 'self = <test_scope_class.TestClass2 object at 0x7f015bff0190>'
E and: ''
E and: ' @pytest.mark.dependency()'
E and: ' def test_b(self):'
E and: '> assert False'
E and: 'E assert False'
E and: ''
E and: 'test_scope_class.py:25: AssertionError'
E and: '=========================== short test summary info ============================'
E and: 'FAILED test_scope_class.py::test_a - assert False'
E and: 'FAILED test_scope_class.py::TestClass2::test_b - assert False'
E and: '==================== 2 failed, 5 passed, 3 skipped in 0.03s ===================='
E remains unmatched: 'test_scope_class.py::TestClass2::test_d SKIPPED'
/home/tkloczko/rpmbuild/BUILD/pytest-dependency-0.5.1/tests/test_03_scope.py:239: Failed
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_scope_class0, configfile: pytest.ini
plugins: dependency-0.5.1
collecting ... collected 10 items
test_scope_class.py::test_a FAILED
test_scope_class.py::test_b PASSED
test_scope_class.py::TestClass1::test_c PASSED
test_scope_class.py::TestClass2::test_a PASSED
test_scope_class.py::TestClass2::test_b FAILED
test_scope_class.py::TestClass2::test_d SKIPPED (test_d depends on t...)
test_scope_class.py::TestClass2::test_e PASSED
test_scope_class.py::TestClass2::test_f PASSED
test_scope_class.py::TestClass2::test_g SKIPPED (test_g depends on t...)
test_scope_class.py::TestClass2::test_h SKIPPED (test_h depends on t...)
=================================== FAILURES ===================================
____________________________________ test_a ____________________________________
@pytest.mark.dependency()
def test_a():
> assert False
E assert False
test_scope_class.py:5: AssertionError
______________________________ TestClass2.test_b _______________________________
self = <test_scope_class.TestClass2 object at 0x7f015bff0190>
@pytest.mark.dependency()
def test_b(self):
> assert False
E assert False
test_scope_class.py:25: AssertionError
=========================== short test summary info ============================
FAILED test_scope_class.py::test_a - assert False
FAILED test_scope_class.py::TestClass2::test_b - assert False
==================== 2 failed, 5 passed, 3 skipped in 0.03s ====================
____________________________________________________________________________ test_scope_nodeid _____________________________________________________________________________
ctestdir = <Testdir local('/tmp/pytest-of-tkloczko/pytest-677/test_scope_nodeid0')>
def test_scope_nodeid(ctestdir):
"""The default name of a test is the node id.
The references to the default names must be adapted according to
the scope.
"""
ctestdir.makepyfile("""
import pytest
@pytest.mark.dependency()
def test_a():
pass
@pytest.mark.dependency(
depends=["test_a"],
scope='module'
)
def test_b():
pass
@pytest.mark.dependency(
depends=["test_scope_nodeid.py::test_a"],
scope='module'
)
def test_c():
pass
@pytest.mark.dependency(
depends=["test_a"],
scope='session'
)
def test_d():
pass
@pytest.mark.dependency(
depends=["test_scope_nodeid.py::test_a"],
scope='session'
)
def test_e():
pass
class TestClass(object):
@pytest.mark.dependency()
def test_f(self):
pass
@pytest.mark.dependency(
depends=["test_f"],
scope='class'
)
def test_g(self):
pass
@pytest.mark.dependency(
depends=["TestClass::test_f"],
scope='class'
)
def test_h(self):
pass
@pytest.mark.dependency(
depends=["test_scope_nodeid.py::TestClass::test_f"],
scope='class'
)
def test_i(self):
pass
@pytest.mark.dependency(
depends=["test_f"],
scope='module'
)
def test_j(self):
pass
@pytest.mark.dependency(
depends=["TestClass::test_f"],
scope='module'
)
def test_k(self):
pass
@pytest.mark.dependency(
depends=["test_scope_nodeid.py::TestClass::test_f"],
scope='module'
)
def test_l(self):
pass
@pytest.mark.dependency(
depends=["test_f"],
scope='session'
)
def test_m(self):
pass
@pytest.mark.dependency(
depends=["TestClass::test_f"],
scope='session'
)
def test_n(self):
pass
@pytest.mark.dependency(
depends=["test_scope_nodeid.py::TestClass::test_f"],
scope='session'
)
def test_o(self):
pass
""")
result = ctestdir.runpytest("--verbose")
result.assert_outcomes(passed=7, skipped=8, failed=0)
> result.stdout.fnmatch_lines("""
test_scope_nodeid.py::test_a PASSED
test_scope_nodeid.py::test_b PASSED
test_scope_nodeid.py::test_c SKIPPED
test_scope_nodeid.py::test_d SKIPPED
test_scope_nodeid.py::test_e PASSED
test_scope_nodeid.py::TestClass::test_f PASSED
test_scope_nodeid.py::TestClass::test_g PASSED
test_scope_nodeid.py::TestClass::test_h SKIPPED
test_scope_nodeid.py::TestClass::test_i SKIPPED
test_scope_nodeid.py::TestClass::test_j SKIPPED
test_scope_nodeid.py::TestClass::test_k PASSED
test_scope_nodeid.py::TestClass::test_l SKIPPED
test_scope_nodeid.py::TestClass::test_m SKIPPED
test_scope_nodeid.py::TestClass::test_n SKIPPED
test_scope_nodeid.py::TestClass::test_o PASSED
""")
E Failed: nomatch: 'test_scope_nodeid.py::test_a PASSED'
E and: '============================= test session starts =============================='
E and: 'platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3'
E and: 'cachedir: .pytest_cache'
E and: 'rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_scope_nodeid0, configfile: pytest.ini'
E and: 'plugins: dependency-0.5.1'
E and: 'collecting ... collected 15 items'
E and: ''
E exact match: 'test_scope_nodeid.py::test_a PASSED'
E exact match: 'test_scope_nodeid.py::test_b PASSED'
E nomatch: 'test_scope_nodeid.py::test_c SKIPPED'
E and: 'test_scope_nodeid.py::test_c SKIPPED (test_c depends on test_scope_n...)'
E and: 'test_scope_nodeid.py::test_d SKIPPED (test_d depends on test_a)'
E and: 'test_scope_nodeid.py::test_e PASSED'
E and: 'test_scope_nodeid.py::TestClass::test_f PASSED'
E and: 'test_scope_nodeid.py::TestClass::test_g PASSED'
E and: 'test_scope_nodeid.py::TestClass::test_h SKIPPED (test_h depends on T...)'
E and: 'test_scope_nodeid.py::TestClass::test_i SKIPPED (test_i depends on t...)'
E and: 'test_scope_nodeid.py::TestClass::test_j SKIPPED (test_j depends on t...)'
E and: 'test_scope_nodeid.py::TestClass::test_k PASSED'
E and: 'test_scope_nodeid.py::TestClass::test_l SKIPPED (test_l depends on t...)'
E and: 'test_scope_nodeid.py::TestClass::test_m SKIPPED (test_m depends on t...)'
E and: 'test_scope_nodeid.py::TestClass::test_n SKIPPED (test_n depends on T...)'
E and: 'test_scope_nodeid.py::TestClass::test_o PASSED'
E and: ''
E and: '========================= 7 passed, 8 skipped in 0.03s ========================='
E remains unmatched: 'test_scope_nodeid.py::test_c SKIPPED'
/home/tkloczko/rpmbuild/BUILD/pytest-dependency-0.5.1/tests/test_03_scope.py:363: Failed
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_scope_nodeid0, configfile: pytest.ini
plugins: dependency-0.5.1
collecting ... collected 15 items
test_scope_nodeid.py::test_a PASSED
test_scope_nodeid.py::test_b PASSED
test_scope_nodeid.py::test_c SKIPPED (test_c depends on test_scope_n...)
test_scope_nodeid.py::test_d SKIPPED (test_d depends on test_a)
test_scope_nodeid.py::test_e PASSED
test_scope_nodeid.py::TestClass::test_f PASSED
test_scope_nodeid.py::TestClass::test_g PASSED
test_scope_nodeid.py::TestClass::test_h SKIPPED (test_h depends on T...)
test_scope_nodeid.py::TestClass::test_i SKIPPED (test_i depends on t...)
test_scope_nodeid.py::TestClass::test_j SKIPPED (test_j depends on t...)
test_scope_nodeid.py::TestClass::test_k PASSED
test_scope_nodeid.py::TestClass::test_l SKIPPED (test_l depends on t...)
test_scope_nodeid.py::TestClass::test_m SKIPPED (test_m depends on t...)
test_scope_nodeid.py::TestClass::test_n SKIPPED (test_n depends on T...)
test_scope_nodeid.py::TestClass::test_o PASSED
========================= 7 passed, 8 skipped in 0.03s =========================
_____________________________________________________________________________ test_scope_named _____________________________________________________________________________
ctestdir = <Testdir local('/tmp/pytest-of-tkloczko/pytest-677/test_scope_named0')>
def test_scope_named(ctestdir):
"""Explicitely named tests are always referenced by that name,
regardless of the scope.
"""
ctestdir.makepyfile("""
import pytest
@pytest.mark.dependency(name="a")
def test_a():
pass
@pytest.mark.dependency(
depends=["a"],
scope='module'
)
def test_b():
pass
@pytest.mark.dependency(
depends=["test_a"],
scope='module'
)
def test_c():
pass
@pytest.mark.dependency(
depends=["a"],
scope='session'
)
def test_d():
pass
@pytest.mark.dependency(
depends=["test_scope_named.py::test_a"],
scope='session'
)
def test_e():
pass
class TestClass(object):
@pytest.mark.dependency(name="f")
def test_f(self):
pass
@pytest.mark.dependency(
depends=["f"],
scope='class'
)
def test_g(self):
pass
@pytest.mark.dependency(
depends=["test_f"],
scope='class'
)
def test_h(self):
pass
@pytest.mark.dependency(
depends=["f"],
scope='module'
)
def test_i(self):
pass
@pytest.mark.dependency(
depends=["TestClass::test_f"],
scope='module'
)
def test_j(self):
pass
@pytest.mark.dependency(
depends=["f"],
scope='session'
)
def test_k(self):
pass
@pytest.mark.dependency(
depends=["test_scope_named.py::TestClass::test_f"],
scope='session'
)
def test_l(self):
pass
""")
result = ctestdir.runpytest("--verbose")
result.assert_outcomes(passed=7, skipped=5, failed=0)
> result.stdout.fnmatch_lines("""
test_scope_named.py::test_a PASSED
test_scope_named.py::test_b PASSED
test_scope_named.py::test_c SKIPPED
test_scope_named.py::test_d PASSED
test_scope_named.py::test_e SKIPPED
test_scope_named.py::TestClass::test_f PASSED
test_scope_named.py::TestClass::test_g PASSED
test_scope_named.py::TestClass::test_h SKIPPED
test_scope_named.py::TestClass::test_i PASSED
test_scope_named.py::TestClass::test_j SKIPPED
test_scope_named.py::TestClass::test_k PASSED
test_scope_named.py::TestClass::test_l SKIPPED
""")
E Failed: nomatch: 'test_scope_named.py::test_a PASSED'
E and: '============================= test session starts =============================='
E and: 'platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3'
E and: 'cachedir: .pytest_cache'
E and: 'rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_scope_named0, configfile: pytest.ini'
E and: 'plugins: dependency-0.5.1'
E and: 'collecting ... collected 12 items'
E and: ''
E exact match: 'test_scope_named.py::test_a PASSED'
E exact match: 'test_scope_named.py::test_b PASSED'
E nomatch: 'test_scope_named.py::test_c SKIPPED'
E and: 'test_scope_named.py::test_c SKIPPED (test_c depends on test_a)'
E and: 'test_scope_named.py::test_d PASSED'
E and: 'test_scope_named.py::test_e SKIPPED (test_e depends on test_scope_na...)'
E and: 'test_scope_named.py::TestClass::test_f PASSED'
E and: 'test_scope_named.py::TestClass::test_g PASSED'
E and: 'test_scope_named.py::TestClass::test_h SKIPPED (test_h depends on te...)'
E and: 'test_scope_named.py::TestClass::test_i PASSED'
E and: 'test_scope_named.py::TestClass::test_j SKIPPED (test_j depends on Te...)'
E and: 'test_scope_named.py::TestClass::test_k PASSED'
E and: 'test_scope_named.py::TestClass::test_l SKIPPED (test_l depends on te...)'
E and: ''
E and: '========================= 7 passed, 5 skipped in 0.03s ========================='
E remains unmatched: 'test_scope_named.py::test_c SKIPPED'
/home/tkloczko/rpmbuild/BUILD/pytest-dependency-0.5.1/tests/test_03_scope.py:470: Failed
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_scope_named0, configfile: pytest.ini
plugins: dependency-0.5.1
collecting ... collected 12 items
test_scope_named.py::test_a PASSED
test_scope_named.py::test_b PASSED
test_scope_named.py::test_c SKIPPED (test_c depends on test_a)
test_scope_named.py::test_d PASSED
test_scope_named.py::test_e SKIPPED (test_e depends on test_scope_na...)
test_scope_named.py::TestClass::test_f PASSED
test_scope_named.py::TestClass::test_g PASSED
test_scope_named.py::TestClass::test_h SKIPPED (test_h depends on te...)
test_scope_named.py::TestClass::test_i PASSED
test_scope_named.py::TestClass::test_j SKIPPED (test_j depends on Te...)
test_scope_named.py::TestClass::test_k PASSED
test_scope_named.py::TestClass::test_l SKIPPED (test_l depends on te...)
========================= 7 passed, 5 skipped in 0.03s =========================
__________________________________________________________________________ test_scope_dependsfunc __________________________________________________________________________
ctestdir = <Testdir local('/tmp/pytest-of-tkloczko/pytest-677/test_scope_dependsfunc0')>
def test_scope_dependsfunc(ctestdir):
"""Test the scope argument to the depends() function.
"""
ctestdir.makepyfile(test_scope_dependsfunc_01="""
import pytest
@pytest.mark.dependency()
def test_a():
pass
@pytest.mark.dependency()
def test_b():
assert False
@pytest.mark.dependency(depends=["test_a"])
def test_c():
pass
class TestClass(object):
@pytest.mark.dependency()
def test_b(self):
pass
""", test_scope_dependsfunc_02="""
import pytest
from pytest_dependency import depends
@pytest.mark.dependency()
def test_a():
assert False
@pytest.mark.dependency()
def test_b():
pass
@pytest.mark.dependency()
def test_e(request):
depends(request,
["test_scope_dependsfunc_01.py::test_a",
"test_scope_dependsfunc_01.py::test_c"],
scope='session')
pass
@pytest.mark.dependency()
def test_f(request):
depends(request,
["test_scope_dependsfunc_01.py::test_b"],
scope='session')
pass
@pytest.mark.dependency()
def test_g(request):
depends(request,
["test_scope_dependsfunc_02.py::test_e"],
scope='session')
pass
@pytest.mark.dependency()
def test_h(request):
depends(request,
["test_scope_dependsfunc_01.py::TestClass::test_b"],
scope='session')
pass
@pytest.mark.dependency()
def test_i(request):
depends(request, ["test_a"], scope='module')
pass
@pytest.mark.dependency()
def test_j(request):
depends(request, ["test_b"], scope='module')
pass
class TestClass(object):
@pytest.mark.dependency()
def test_a(self):
pass
@pytest.mark.dependency()
def test_b(self):
assert False
@pytest.mark.dependency()
def test_c(self, request):
depends(request, ["test_a"], scope='class')
pass
@pytest.mark.dependency()
def test_d(self, request):
depends(request, ["test_b"], scope='class')
pass
""")
result = ctestdir.runpytest("--verbose")
result.assert_outcomes(passed=10, skipped=3, failed=3)
> result.stdout.fnmatch_lines("""
test_scope_dependsfunc_01.py::test_a PASSED
test_scope_dependsfunc_01.py::test_b FAILED
test_scope_dependsfunc_01.py::test_c PASSED
test_scope_dependsfunc_01.py::TestClass::test_b PASSED
test_scope_dependsfunc_02.py::test_a FAILED
test_scope_dependsfunc_02.py::test_b PASSED
test_scope_dependsfunc_02.py::test_e PASSED
test_scope_dependsfunc_02.py::test_f SKIPPED
test_scope_dependsfunc_02.py::test_g PASSED
test_scope_dependsfunc_02.py::test_h PASSED
test_scope_dependsfunc_02.py::test_i SKIPPED
test_scope_dependsfunc_02.py::test_j PASSED
test_scope_dependsfunc_02.py::TestClass::test_a PASSED
test_scope_dependsfunc_02.py::TestClass::test_b FAILED
test_scope_dependsfunc_02.py::TestClass::test_c PASSED
test_scope_dependsfunc_02.py::TestClass::test_d SKIPPED
""")
E Failed: nomatch: 'test_scope_dependsfunc_01.py::test_a PASSED'
E and: '============================= test session starts =============================='
E and: 'platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3'
E and: 'cachedir: .pytest_cache'
E and: 'rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_scope_dependsfunc0, configfile: pytest.ini'
E and: 'plugins: dependency-0.5.1'
E and: 'collecting ... collected 16 items'
E and: ''
E exact match: 'test_scope_dependsfunc_01.py::test_a PASSED'
E exact match: 'test_scope_dependsfunc_01.py::test_b FAILED'
E exact match: 'test_scope_dependsfunc_01.py::test_c PASSED'
E exact match: 'test_scope_dependsfunc_01.py::TestClass::test_b PASSED'
E exact match: 'test_scope_dependsfunc_02.py::test_a FAILED'
E exact match: 'test_scope_dependsfunc_02.py::test_b PASSED'
E exact match: 'test_scope_dependsfunc_02.py::test_e PASSED'
E nomatch: 'test_scope_dependsfunc_02.py::test_f SKIPPED'
E and: 'test_scope_dependsfunc_02.py::test_f SKIPPED (test_f depends on test...)'
E and: 'test_scope_dependsfunc_02.py::test_g PASSED'
E and: 'test_scope_dependsfunc_02.py::test_h PASSED'
E and: 'test_scope_dependsfunc_02.py::test_i SKIPPED (test_i depends on test_a)'
E and: 'test_scope_dependsfunc_02.py::test_j PASSED'
E and: 'test_scope_dependsfunc_02.py::TestClass::test_a PASSED'
E and: 'test_scope_dependsfunc_02.py::TestClass::test_b FAILED'
E and: 'test_scope_dependsfunc_02.py::TestClass::test_c PASSED'
E and: 'test_scope_dependsfunc_02.py::TestClass::test_d SKIPPED (test_d depe...)'
E and: ''
E and: '=================================== FAILURES ==================================='
E and: '____________________________________ test_b ____________________________________'
E and: ''
E and: ' @pytest.mark.dependency()'
E and: ' def test_b():'
E and: '> assert False'
E and: 'E assert False'
E and: ''
E and: 'test_scope_dependsfunc_01.py:9: AssertionError'
E and: '____________________________________ test_a ____________________________________'
E and: ''
E and: ' @pytest.mark.dependency()'
E and: ' def test_a():'
E and: '> assert False'
E and: 'E assert False'
E and: ''
E and: 'test_scope_dependsfunc_02.py:6: AssertionError'
E and: '_______________________________ TestClass.test_b _______________________________'
E and: ''
E and: 'self = <test_scope_dependsfunc_02.TestClass object at 0x7f015bdd2490>'
E and: ''
E and: ' @pytest.mark.dependency()'
E and: ' def test_b(self):'
E and: '> assert False'
E and: 'E assert False'
E and: ''
E and: 'test_scope_dependsfunc_02.py:59: AssertionError'
E and: '=========================== short test summary info ============================'
E and: 'FAILED test_scope_dependsfunc_01.py::test_b - assert False'
E and: 'FAILED test_scope_dependsfunc_02.py::test_a - assert False'
E and: 'FAILED test_scope_dependsfunc_02.py::TestClass::test_b - assert False'
E and: '=================== 3 failed, 10 passed, 3 skipped in 0.05s ===================='
E remains unmatched: 'test_scope_dependsfunc_02.py::test_f SKIPPED'
/home/tkloczko/rpmbuild/BUILD/pytest-dependency-0.5.1/tests/test_03_scope.py:581: Failed
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_scope_dependsfunc0, configfile: pytest.ini
plugins: dependency-0.5.1
collecting ... collected 16 items
test_scope_dependsfunc_01.py::test_a PASSED
test_scope_dependsfunc_01.py::test_b FAILED
test_scope_dependsfunc_01.py::test_c PASSED
test_scope_dependsfunc_01.py::TestClass::test_b PASSED
test_scope_dependsfunc_02.py::test_a FAILED
test_scope_dependsfunc_02.py::test_b PASSED
test_scope_dependsfunc_02.py::test_e PASSED
test_scope_dependsfunc_02.py::test_f SKIPPED (test_f depends on test...)
test_scope_dependsfunc_02.py::test_g PASSED
test_scope_dependsfunc_02.py::test_h PASSED
test_scope_dependsfunc_02.py::test_i SKIPPED (test_i depends on test_a)
test_scope_dependsfunc_02.py::test_j PASSED
test_scope_dependsfunc_02.py::TestClass::test_a PASSED
test_scope_dependsfunc_02.py::TestClass::test_b FAILED
test_scope_dependsfunc_02.py::TestClass::test_c PASSED
test_scope_dependsfunc_02.py::TestClass::test_d SKIPPED (test_d depe...)
=================================== FAILURES ===================================
____________________________________ test_b ____________________________________
@pytest.mark.dependency()
def test_b():
> assert False
E assert False
test_scope_dependsfunc_01.py:9: AssertionError
____________________________________ test_a ____________________________________
@pytest.mark.dependency()
def test_a():
> assert False
E assert False
test_scope_dependsfunc_02.py:6: AssertionError
_______________________________ TestClass.test_b _______________________________
self = <test_scope_dependsfunc_02.TestClass object at 0x7f015bdd2490>
@pytest.mark.dependency()
def test_b(self):
> assert False
E assert False
test_scope_dependsfunc_02.py:59: AssertionError
=========================== short test summary info ============================
FAILED test_scope_dependsfunc_01.py::test_b - assert False
FAILED test_scope_dependsfunc_02.py::test_a - assert False
FAILED test_scope_dependsfunc_02.py::TestClass::test_b - assert False
=================== 3 failed, 10 passed, 3 skipped in 0.05s ====================
_______________________________________________________________________________ test_simple ________________________________________________________________________________
ctestdir = <Testdir local('/tmp/pytest-of-tkloczko/pytest-677/test_simple0')>
def test_simple(ctestdir):
"""One test fails, other dependent tests are skipped.
This also includes indirect dependencies.
"""
ctestdir.makepyfile("""
import pytest
@pytest.mark.dependency()
def test_a():
pass
@pytest.mark.dependency()
def test_b():
assert False
@pytest.mark.dependency(depends=["test_b"])
def test_c():
pass
@pytest.mark.dependency(depends=["test_c"])
def test_d():
pass
""")
result = ctestdir.runpytest("--verbose", "-rs")
result.assert_outcomes(passed=1, skipped=2, failed=1)
> result.stdout.fnmatch_lines("""
*::test_a PASSED
*::test_b FAILED
*::test_c SKIPPED
*::test_d SKIPPED
""")
E Failed: nomatch: '*::test_a PASSED'
E and: '============================= test session starts =============================='
E and: 'platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3'
E and: 'cachedir: .pytest_cache'
E and: 'rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_simple0, configfile: pytest.ini'
E and: 'plugins: dependency-0.5.1'
E and: 'collecting ... collected 4 items'
E and: ''
E fnmatch: '*::test_a PASSED'
E with: 'test_simple.py::test_a PASSED'
E fnmatch: '*::test_b FAILED'
E with: 'test_simple.py::test_b FAILED'
E nomatch: '*::test_c SKIPPED'
E and: 'test_simple.py::test_c SKIPPED (test_c depends on test_b)'
E and: 'test_simple.py::test_d SKIPPED (test_d depends on test_c)'
E and: ''
E and: '=================================== FAILURES ==================================='
E and: '____________________________________ test_b ____________________________________'
E and: ''
E and: ' @pytest.mark.dependency()'
E and: ' def test_b():'
E and: '> assert False'
E and: 'E assert False'
E and: ''
E and: 'test_simple.py:9: AssertionError'
E and: '=========================== short test summary info ============================'
E and: 'SKIPPED [1] ../../../../home/tkloczko/rpmbuild/BUILDROOT/python-pytest-dependency-0.5.1-2.fc35.x86_64/usr/lib/python3.8/site-packages/pytest_dependency.py:98: test_c depends on test_b'
E and: 'SKIPPED [1] ../../../../home/tkloczko/rpmbuild/BUILDROOT/python-pytest-dependency-0.5.1-2.fc35.x86_64/usr/lib/python3.8/site-packages/pytest_dependency.py:98: test_d depends on test_c'
E and: '==================== 1 failed, 1 passed, 2 skipped in 0.01s ===================='
E remains unmatched: '*::test_c SKIPPED'
/home/tkloczko/rpmbuild/BUILD/pytest-dependency-0.5.1/tests/test_03_skipmsgs.py:32: Failed
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_simple0, configfile: pytest.ini
plugins: dependency-0.5.1
collecting ... collected 4 items
test_simple.py::test_a PASSED
test_simple.py::test_b FAILED
test_simple.py::test_c SKIPPED (test_c depends on test_b)
test_simple.py::test_d SKIPPED (test_d depends on test_c)
=================================== FAILURES ===================================
____________________________________ test_b ____________________________________
@pytest.mark.dependency()
def test_b():
> assert False
E assert False
test_simple.py:9: AssertionError
=========================== short test summary info ============================
SKIPPED [1] ../../../../home/tkloczko/rpmbuild/BUILDROOT/python-pytest-dependency-0.5.1-2.fc35.x86_64/usr/lib/python3.8/site-packages/pytest_dependency.py:98: test_c depends on test_b
SKIPPED [1] ../../../../home/tkloczko/rpmbuild/BUILDROOT/python-pytest-dependency-0.5.1-2.fc35.x86_64/usr/lib/python3.8/site-packages/pytest_dependency.py:98: test_d depends on test_c
==================== 1 failed, 1 passed, 2 skipped in 0.01s ====================
_______________________________________________________________________________ test_not_set _______________________________________________________________________________
ctestdir = <Testdir local('/tmp/pytest-of-tkloczko/pytest-677/test_not_set0')>
def test_not_set(ctestdir):
"""No pytest.ini file, e.g. automark_dependency is not set.
Since automark_dependency defaults to false and test_a is not
marked, the outcome of test_a will not be recorded. As a result,
test_b will be skipped due to a missing dependency.
"""
ctestdir.makepyfile("""
import pytest
def test_a():
pass
@pytest.mark.dependency(depends=["test_a"])
def test_b():
pass
""")
result = ctestdir.runpytest("--verbose", "-rs")
result.assert_outcomes(passed=1, skipped=1, failed=0)
> result.stdout.fnmatch_lines("""
*::test_a PASSED
*::test_b SKIPPED
""")
E Failed: nomatch: '*::test_a PASSED'
E and: '============================= test session starts =============================='
E and: 'platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3'
E and: 'cachedir: .pytest_cache'
E and: 'rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_not_set0, configfile: pytest.ini'
E and: 'plugins: dependency-0.5.1'
E and: 'collecting ... collected 2 items'
E and: ''
E fnmatch: '*::test_a PASSED'
E with: 'test_not_set.py::test_a PASSED'
E nomatch: '*::test_b SKIPPED'
E and: 'test_not_set.py::test_b SKIPPED (test_b depends on test_a)'
E and: ''
E and: '=========================== short test summary info ============================'
E and: 'SKIPPED [1] ../../../../home/tkloczko/rpmbuild/BUILDROOT/python-pytest-dependency-0.5.1-2.fc35.x86_64/usr/lib/python3.8/site-packages/pytest_dependency.py:98: test_b depends on test_a'
E and: '========================= 1 passed, 1 skipped in 0.01s ========================='
E remains unmatched: '*::test_b SKIPPED'
/home/tkloczko/rpmbuild/BUILD/pytest-dependency-0.5.1/tests/test_04_automark.py:26: Failed
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_not_set0, configfile: pytest.ini
plugins: dependency-0.5.1
collecting ... collected 2 items
test_not_set.py::test_a PASSED
test_not_set.py::test_b SKIPPED (test_b depends on test_a)
=========================== short test summary info ============================
SKIPPED [1] ../../../../home/tkloczko/rpmbuild/BUILDROOT/python-pytest-dependency-0.5.1-2.fc35.x86_64/usr/lib/python3.8/site-packages/pytest_dependency.py:98: test_b depends on test_a
========================= 1 passed, 1 skipped in 0.01s =========================
______________________________________________________________________________ test_set_false ______________________________________________________________________________
ctestdir = <Testdir local('/tmp/pytest-of-tkloczko/pytest-677/test_set_false0')>
def test_set_false(ctestdir):
"""A pytest.ini is present, automark_dependency is set to false.
Since automark_dependency is set to false and test_a is not
marked, the outcome of test_a will not be recorded. As a result,
test_b will be skipped due to a missing dependency.
"""
ctestdir.makefile('.ini', pytest="""
[pytest]
automark_dependency = false
console_output_style = classic
""")
ctestdir.makepyfile("""
import pytest
def test_a():
pass
@pytest.mark.dependency(depends=["test_a"])
def test_b():
pass
""")
result = ctestdir.runpytest("--verbose", "-rs")
result.assert_outcomes(passed=1, skipped=1, failed=0)
> result.stdout.fnmatch_lines("""
*::test_a PASSED
*::test_b SKIPPED
""")
E Failed: nomatch: '*::test_a PASSED'
E and: '============================= test session starts =============================='
E and: 'platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3'
E and: 'cachedir: .pytest_cache'
E and: 'rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_set_false0, configfile: pytest.ini'
E and: 'plugins: dependency-0.5.1'
E and: 'collecting ... collected 2 items'
E and: ''
E fnmatch: '*::test_a PASSED'
E with: 'test_set_false.py::test_a PASSED'
E nomatch: '*::test_b SKIPPED'
E and: 'test_set_false.py::test_b SKIPPED (test_b depends on test_a)'
E and: ''
E and: '=========================== short test summary info ============================'
E and: 'SKIPPED [1] ../../../../home/tkloczko/rpmbuild/BUILDROOT/python-pytest-dependency-0.5.1-2.fc35.x86_64/usr/lib/python3.8/site-packages/pytest_dependency.py:98: test_b depends on test_a'
E and: '========================= 1 passed, 1 skipped in 0.01s ========================='
E remains unmatched: '*::test_b SKIPPED'
/home/tkloczko/rpmbuild/BUILD/pytest-dependency-0.5.1/tests/test_04_automark.py:56: Failed
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_set_false0, configfile: pytest.ini
plugins: dependency-0.5.1
collecting ... collected 2 items
test_set_false.py::test_a PASSED
test_set_false.py::test_b SKIPPED (test_b depends on test_a)
=========================== short test summary info ============================
SKIPPED [1] ../../../../home/tkloczko/rpmbuild/BUILDROOT/python-pytest-dependency-0.5.1-2.fc35.x86_64/usr/lib/python3.8/site-packages/pytest_dependency.py:98: test_b depends on test_a
========================= 1 passed, 1 skipped in 0.01s =========================
______________________________________________________________________________ test_no_ignore ______________________________________________________________________________
ctestdir = <Testdir local('/tmp/pytest-of-tkloczko/pytest-677/test_no_ignore0')>
def test_no_ignore(ctestdir):
"""No command line option, e.g. ignore-unknown-dependency is not set.
Explicitly select only a single test that depends on another one.
Since the other test has not been run at all, the selected test
will be skipped.
"""
ctestdir.makepyfile("""
import pytest
@pytest.mark.dependency()
def test_a():
pass
@pytest.mark.dependency()
def test_b():
pass
@pytest.mark.dependency()
def test_c():
pass
@pytest.mark.dependency(depends=["test_c"])
def test_d():
pass
""")
result = ctestdir.runpytest("--verbose", "test_no_ignore.py::test_d")
result.assert_outcomes(passed=0, skipped=1, failed=0)
> result.stdout.fnmatch_lines("""
*::test_d SKIPPED
""")
E Failed: nomatch: '*::test_d SKIPPED'
E and: '============================= test session starts =============================='
E and: 'platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3'
E and: 'cachedir: .pytest_cache'
E and: 'rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_no_ignore0, configfile: pytest.ini'
E and: 'plugins: dependency-0.5.1'
E and: 'collecting ... collected 1 item'
E and: ''
E and: 'test_no_ignore.py::test_d SKIPPED (test_d depends on test_c)'
E and: ''
E and: '============================== 1 skipped in 0.01s =============================='
E remains unmatched: '*::test_d SKIPPED'
/home/tkloczko/rpmbuild/BUILD/pytest-dependency-0.5.1/tests/test_04_ignore_unknown.py:35: Failed
--------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-tkloczko/pytest-677/test_no_ignore0, configfile: pytest.ini
plugins: dependency-0.5.1
collecting ... collected 1 item
test_no_ignore.py::test_d SKIPPED (test_d depends on test_c)
============================== 1 skipped in 0.01s ==============================
========================================================================= short test summary info ==========================================================================
FAILED tests/test_02_simple_dependency.py::test_no_skip - Failed: nomatch: '*::test_a SKIPPED'
FAILED tests/test_02_simple_dependency.py::test_skip_depend - Failed: nomatch: '*::test_a PASSED'
FAILED tests/test_02_simple_dependency.py::test_fail_depend - Failed: nomatch: '*::test_a PASSED'
FAILED tests/test_02_simple_dependency.py::test_named_fail_depend - Failed: nomatch: '*::test_a PASSED'
FAILED tests/test_02_simple_dependency.py::test_explicit_select - Failed: nomatch: '*::test_d SKIPPED'
FAILED tests/test_02_simple_dependency.py::test_depend_unknown - Failed: nomatch: '*::test_a PASSED'
FAILED tests/test_03_class.py::test_class_simple - Failed: nomatch: '*::TestClass::test_a FAILED'
FAILED tests/test_03_class.py::test_class_simple_named - Failed: nomatch: '*::TestClassNamed::test_a FAILED'
FAILED tests/test_03_class.py::test_class_default_name - Failed: nomatch: '*::test_a FAILED'
FAILED tests/test_03_multiple_dependency.py::test_multiple - Failed: nomatch: '*::test_a SKIPPED'
FAILED tests/test_03_param.py::test_multiple - Failed: nomatch: '*::test_a?0-0? PASSED'
FAILED tests/test_03_runtime.py::test_skip_depend_runtime - Failed: nomatch: '*::test_a PASSED'
FAILED tests/test_03_scope.py::test_scope_module - Failed: nomatch: 'test_scope_module.py::test_a FAILED'
FAILED tests/test_03_scope.py::test_scope_session - Failed: nomatch: 'test_scope_session_01.py::test_a PASSED'
FAILED tests/test_03_scope.py::test_scope_package - Failed: nomatch: 'test_scope_package_a/test_01.py::test_a PASSED'
FAILED tests/test_03_scope.py::test_scope_class - Failed: nomatch: 'test_scope_class.py::test_a FAILED'
FAILED tests/test_03_scope.py::test_scope_nodeid - Failed: nomatch: 'test_scope_nodeid.py::test_a PASSED'
FAILED tests/test_03_scope.py::test_scope_named - Failed: nomatch: 'test_scope_named.py::test_a PASSED'
FAILED tests/test_03_scope.py::test_scope_dependsfunc - Failed: nomatch: 'test_scope_dependsfunc_01.py::test_a PASSED'
FAILED tests/test_03_skipmsgs.py::test_simple - Failed: nomatch: '*::test_a PASSED'
FAILED tests/test_04_automark.py::test_not_set - Failed: nomatch: '*::test_a PASSED'
FAILED tests/test_04_automark.py::test_set_false - Failed: nomatch: '*::test_a PASSED'
FAILED tests/test_04_ignore_unknown.py::test_no_ignore - Failed: nomatch: '*::test_d SKIPPED'
======================================================================= 23 failed, 4 passed in 3.38s =======================================================================
Is there a way to have a test that is conditionally dependent on 2 tests?
We have 2 tests (test_1 and test_2) that are mutually exclusive; we control which test to run by using a marker.
We have a test which should depend on either one of them.
Is there a way to achieve it?
Pseudo code of what is needed:
@pytest.mark.dependency()
def test_1():
return
@pytest.mark.dependency()
def test_2():
return
@pytest.mark.dependency(depends="test_1" or "test_2")
def test_3():
return
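pytest-dependency has no built-in OR semantics, but outcomes can be recorded manually and checked at runtime. A minimal sketch of such a workaround; the helper names are hypothetical and not part of the plugin:

```python
# Hypothetical OR-dependency helpers, not part of pytest-dependency:
# each prerequisite test records its outcome, and the dependent test
# skips itself unless at least one of the named tests passed.
_results = {}

def record(name, passed):
    """Record whether the named test passed."""
    _results[name] = passed

def any_passed(*names):
    """Return True if at least one named test recorded a pass."""
    return any(_results.get(n, False) for n in names)
```

Inside test_3 one would then call pytest.skip(...) when any_passed("test_1", "test_2") is false.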
On Windows everything seems to be OK
platform win32 -- Python 3.6.10, pytest-6.2.2, py-1.10.0, pluggy-0.13.1
plugins: cov-2.11.1, dependency-0.5.1
collected 14 items
14 passed, 9 warnings in 37.41s
But running the same tests on linux I get
platform linux -- Python 3.6.13, pytest-6.2.2, py-1.10.0, pluggy-0.13.1
plugins: dependency-0.5.1, cov-2.11.1, depends-1.0.1
6 passed, 8 skipped, 13 warnings in 28.14s
I'm not sure why the skipping would happen, as no tests failed (and why on Windows it seems to be fine).
I am seeing a lot of errors running the test suite with pytest 6.2.2. I assume they started with 6.2.0, but I haven't isolated when.
https://github.com/pytest-dev/pytest/blob/master/doc/en/changelog.rst might give some clues to what causes it
Would you mind adding a CLI switch that allows us to NOT skip a test if its parent tests are marked as skip or xfail?
I am building a framework that uses this plugin, in which we skip tests if a test that they depend on fails - e.g., if an HTTP API call fails, there's no point in going ahead and checking its values in the database. But there are times when we don't have to make that API call, yet still need to check the database. In that case, we would like a switch (something like --ignore-skipped-tests) that lets us skip the API test case but continue to run the database test case.
I can think of having separate switches for skip and xfail.
What are your views on this? If this seems ok, I can submit a PR for the same.
Thanks! ✌️
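A sketch of how the proposed switch might be wired up, assuming the option name --ignore-skipped-tests from the request above; the option and the decision helper are hypothetical, not an existing pytest-dependency feature:

```python
# Sketch only: the option name and the ignore logic are the proposal's,
# not part of pytest-dependency as released.
def pytest_addoption(parser):
    parser.addoption(
        "--ignore-skipped-tests",
        action="store_true",
        default=False,
        help="do not skip a test whose dependencies were skipped "
             "(rather than failed)",
    )

def should_skip(dep_outcome, ignore_skipped):
    """Decide whether a dependent test must be skipped.

    dep_outcome is one of "passed", "failed", "skipped".
    """
    if dep_outcome == "passed":
        return False
    if dep_outcome == "skipped" and ignore_skipped:
        return False
    return True
```

The dependency check in the plugin would then consult should_skip() instead of treating every non-passed outcome the same way.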
I'm getting the following deprecation warnings using pytest-dependency 0.3.2 with pytest 3.8.0:
/usr/local/venv/pytest/lib64/python3.6/site-packages/pytest_dependency.py:150: RemovedInPytest4Warning: MarkInfo objects are deprecated as they contain merged marks which are hard to deal with correctly.
Please use node.get_closest_marker(name) or node.iter_markers(name).
Docs: https://docs.pytest.org/en/latest/mark.html#updating-code
depends = marker.kwargs.get('depends')
/usr/local/venv/pytest/lib64/python3.6/site-packages/pytest_dependency.py:139: RemovedInPytest4Warning: MarkInfo objects are deprecated as they contain merged marks which are hard to deal with correctly.
Please use node.get_closest_marker(name) or node.iter_markers(name).
Docs: https://docs.pytest.org/en/latest/mark.html#updating-code
name = marker.kwargs.get('name') if marker is not None else None
Need testing. Scenarios to consider:
I am using the code snippet from https://pytest-dependency.readthedocs.io/en/stable/usage.html
Expected the dependent tests to be skipped, but they passed instead.
import pytest
@pytest.mark.dependency()
def test_a():
assert 10 == 100
@pytest.mark.dependency()
def test_b():
pass
@pytest.mark.dependency(depends=["test_a"])
def test_c():
pass
@pytest.mark.dependency(depends=["test_b"])
def test_d():
pass
@pytest.mark.dependency(depends=["test_b", "test_c"])
def test_e():
pass
output
PASSED test_demo_test_cases_webhook.py::test_b
PASSED test_demo_test_cases_webhook.py::test_c
PASSED test_demo_test_cases_webhook.py::test_d
PASSED test_demo_test_cases_webhook.py::test_e
FAILED test_demo_test_cases_webhook.py::test_a - assert 10 == 100
I am unsure whether this is a feature request or an issue, but I am attempting to have a test in one file depend on a test from another. I will give a visual example below.
tests/file1.py
--- class1()
--- --- method1()
tests/file2.py
--- class2()
--- --- method2()
Now I want method2 to depend on the success of method1. Is there some way I am missing to do this? I have tried putting "Class1::method1" in the depends for method2.
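For a cross-module dependency, the depends entry has to use the full node id of the other test together with a scope wider than the default module scope. A sketch, assuming the class is actually named TestClass1 (pytest only collects classes matching its naming convention):

```python
import pytest

class TestClass2:
    @pytest.mark.dependency(
        # full node id of the other test, relative to the rootdir
        depends=["tests/file1.py::TestClass1::test_method1"],
        scope="session",
    )
    def test_method2(self):
        pass
```

The path prefix in the node id must match what pytest reports for the other test (e.g. in verbose output), so the exact string depends on the rootdir layout.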
When the plugin is active, python -m pytest --markers should list the @pytest.mark.dependency marker.
the link: https://pytest-dependency.readthedocs.io/en/stable/scope.html#explicitely-specifying-the-scope.
test_mod_01.py
import pytest
@pytest.mark.dependency()
def test_a():
print("test_a")
pass
@pytest.mark.dependency()
@pytest.mark.xfail(reason="deliberate fail")
def test_b():
print("test_b")
assert False
@pytest.mark.dependency(depends=["test_a"])
def test_c():
print("test_c")
pass
class TestClass(object):
@pytest.mark.dependency()
def test_b(self):
pass
test_mod_02.py
import pytest
@pytest.mark.dependency()
@pytest.mark.xfail(reason="deliberate fail")
def test_a():
print("test_a")
assert False
@pytest.mark.dependency(
depends=["tests/test_mod_01.py::test_a", "tests/test_mod_01.py::test_c"],
scope='session'
)
def test_e():
print("test_e")
pass
@pytest.mark.dependency(
depends=["tests/test_mod_01.py::test_b", "tests/test_mod_02.py::test_e"],
scope='session'
)
def test_f():
print("test_f")
pass
@pytest.mark.dependency(
depends=["tests/test_mod_01.py::TestClass::test_b"],
scope='session'
)
def test_g():
print("test_g")
pass
"E:\Program Files\Python37\python.exe" "E:\Program Files\JetBrains\PyCharm 2021.3\plugins\python\helpers\pycharm_jb_pytest_runner.py" --path E:/PycharmProjects/SelniumPOM-master/tests/test_mod_02.py
Testing started at 19:23 ...
Launching pytest with arguments E:/PycharmProjects/SelniumPOM-master/tests/test_mod_02.py --no-header --no-summary -q in E:\PycharmProjects\SelniumPOM-master\tests
============================= test session starts =============================
collecting ... collected 4 items
test_mod_02.py::test_a XFAIL (deliberate fail) [ 25%]test_a
@pytest.mark.dependency()
@pytest.mark.xfail(reason="deliberate fail")
def test_a():
print("test_a")
assert False
E assert False
test_mod_02.py:7: AssertionError
test_mod_02.py::test_e SKIPPED (test_e depends on tests/test_mod_01....) [ 50%]
Skipped: test_e depends on tests/test_mod_01.py::test_a
test_mod_02.py::test_f SKIPPED (test_f depends on tests/test_mod_01....) [ 75%]
Skipped: test_f depends on tests/test_mod_01.py::test_b
test_mod_02.py::test_g SKIPPED (test_g depends on tests/test_mod_01....) [100%]
Skipped: test_g depends on tests/test_mod_01.py::TestClass::test_b
======================== 3 skipped, 1 xfailed in 0.07s ========================
Process finished with exit code 0
The expected result is that test_e and test_g succeed!
@anishradvani requested in #38 that it should be possible to apply the dependency marker to a test class rather than to an individual method. Actually, it turns out that this already works. Consider:
import pytest
@pytest.mark.dependency()
@pytest.mark.xfail(reason="deliberate fail")
def test_f():
assert False
@pytest.mark.dependency(depends=["test_f"])
class TestClass(object):
def test_a(self):
pass
@pytest.mark.dependency()
def test_b(self):
pass
def test_c(self):
pass
This will have the effect that test_a and test_c will be skipped, because they depend on test_f, while test_b will be run.
I am trying to do the most trivial example, but impose ordering on fixtures, rather than tests.
I couldn't find any doc on this. Is it supported?
Please see code on stackoverflow
Thanks!
It would be nice to have an option in the dependency marker that forces the required test to be collected and executed. For example:
import pytest
@pytest.mark.dependency()
@pytest.mark.xfail(reason="deliberate fail")
def test_a():
assert False
@pytest.mark.dependency()
def test_b():
pass
@pytest.mark.dependency(depends=["test_a"])
def test_c():
pass
@pytest.mark.dependency(depends=["test_b"], collect=True)
def test_d():
pass
@pytest.mark.dependency(depends=["test_b", "test_c"])
def test_e():
pass
If only test_d is executed, the collector will also select test_b because of the hard dependency restriction collect=True.
If you think this is a good idea I can work on this feature. Kind regards
env:
pytest==5.3.4
pytest-dependency==0.4.0(installed by pip3 install pytest-dependency)
code:
@pytest.mark.dependency()
def test_a(self):
# fail
assert True
@pytest.mark.dependency()
def test_b(self):
# pass
assert True
@pytest.mark.dependency(depends=["test_a", "test_b"])
def test_c(self):
assert True
Problem:
test_c is skipped even though test_a and test_b passed.
The test suite fails with pytest 3.3.0.
The cause is the new progress style console output introduced in pytest-dev/pytest#2657. The test suite runs several tests in a testdir and checks the output. The new style console output causes the test output not to match the expected result.
import pytest
@pytest.mark.parametrize("x,y", [
pytest.mark.dependency(name="a1")((0,0)),
pytest.mark.dependency(name="a2")(pytest.mark.xfail((0,1))),
pytest.mark.dependency(name="a3")((1,0)),
pytest.mark.dependency(name="a4")((1,1))
])
def test_a(x,y):
assert y <= x
@pytest.mark.parametrize("u,v", [
pytest.mark.dependency(name="b1", depends=["a1", "a2"])((1,2)),
pytest.mark.dependency(name="b2", depends=["a1", "a3"])((1,3)),
pytest.mark.dependency(name="b3", depends=["a1", "a4"])((1,4)),
pytest.mark.dependency(name="b4", depends=["a2", "a3"])((2,3)),
pytest.mark.dependency(name="b5", depends=["a2", "a4"])((2,4)),
pytest.mark.dependency(name="b6", depends=["a3", "a4"])((3,4))
])
def test_b(u,v):
pass
@pytest.mark.parametrize("w", [
pytest.mark.dependency(name="c1", depends=["b1", "b2", "b6"])(1),
pytest.mark.dependency(name="c2", depends=["b2", "b3", "b6"])(2),
pytest.mark.dependency(name="c3", depends=["b2", "b4", "b6"])(3)
])
def test_c(w):
pass
The above code will fail with the following error in pytest 4.0.0:
custom-eggs/pluggy-0.8.0-py2.7.egg/pluggy/hooks.py:284: in __call__
return self._hookexec(self, self.get_hookimpls(), kwargs)
custom-eggs/pluggy-0.8.0-py2.7.egg/pluggy/manager.py:67: in _hookexec
return self._inner_hookexec(hook, methods, kwargs)
custom-eggs/pluggy-0.8.0-py2.7.egg/pluggy/manager.py:61: in <lambda>
firstresult=hook.spec.opts.get("firstresult") if hook.spec else False,
custom-eggs/pytest-4.0.0-py2.7.egg/_pytest/python.py:224: in pytest_pycollect_makeitem
res = list(collector._genfunctions(name, obj))
custom-eggs/pytest-4.0.0-py2.7.egg/_pytest/python.py:409: in _genfunctions
self.ihook.pytest_generate_tests(metafunc=metafunc)
custom-eggs/pluggy-0.8.0-py2.7.egg/pluggy/hooks.py:284: in __call__
return self._hookexec(self, self.get_hookimpls(), kwargs)
custom-eggs/pluggy-0.8.0-py2.7.egg/pluggy/manager.py:67: in _hookexec
return self._inner_hookexec(hook, methods, kwargs)
custom-eggs/pluggy-0.8.0-py2.7.egg/pluggy/manager.py:61: in <lambda>
firstresult=hook.spec.opts.get("firstresult") if hook.spec else False,
custom-eggs/pytest-4.0.0-py2.7.egg/_pytest/python.py:133: in pytest_generate_tests
metafunc.parametrize(*marker.args, **marker.kwargs)
custom-eggs/pytest-4.0.0-py2.7.egg/_pytest/python.py:942: in parametrize
function_definition=self.definition,
custom-eggs/pytest-4.0.0-py2.7.egg/_pytest/mark/structures.py:127: in _for_parametrize
for x in argvalues
custom-eggs/pytest-4.0.0-py2.7.egg/_pytest/mark/structures.py:110: in extract_from
belonging_definition.warn(MARK_PARAMETERSET_UNPACKING)
custom-eggs/pytest-4.0.0-py2.7.egg/_pytest/nodes.py:181: in warn
self._std_warn(_code_or_warning)
custom-eggs/pytest-4.0.0-py2.7.egg/_pytest/nodes.py:232: in _std_warn
lineno=lineno + 1 if lineno is not None else None,
E RemovedInPytest4Warning: Applying marks directly to parameters is deprecated, please use pytest.param(..., marks=...) instead.
E For more details, see: https://docs.pytest.org/en/latest/parametrize.html
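As the warning says, the fix for pytest 4 is to attach the marks via pytest.param(..., marks=...) instead of calling the mark decorator on the parameter tuple. A minimal sketch of the first parametrize list from the code above rewritten that way:

```python
import pytest

# pytest 4 syntax: marks are attached via pytest.param(), not by
# calling the mark decorator on the parameter tuple.
@pytest.mark.parametrize("x,y", [
    pytest.param(0, 0, marks=pytest.mark.dependency(name="a1")),
    pytest.param(0, 1, marks=[pytest.mark.dependency(name="a2"),
                              pytest.mark.xfail]),
    pytest.param(1, 0, marks=pytest.mark.dependency(name="a3")),
    pytest.param(1, 1, marks=pytest.mark.dependency(name="a4")),
])
def test_a(x, y):
    assert y <= x
```

The test_b and test_c parametrize lists follow the same pattern, passing the depends keyword through pytest.mark.dependency inside marks=.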
Hi,
I have a test that depends on 2 other tests:
# Test the GrowthQuality sub-class
import pytest
from SharepadDaten.calcvalues.calcs.GrowthQuality import *
@pytest.mark.parametrize("sharecode,expected", [
('RTN.L', Decimal(0.703).quantize(Decimal('.001'))),
('FL', Decimal(0.811).quantize(Decimal('.001'))),
('ALV.DE', Decimal(0.784).quantize(Decimal('.001'))),
])
@pytest.mark.dependency(
depends=["tests_calculatedvalues/test_leaseLiabilities.py::test_leaseLiabilities", "tests_calculatedvalues/test_leaseAdjCapEmp.py::test_leaseAdjCapEmp"],
scope='session'
)
def test_calcGrowthQuality(sharecode, expected, dbsession, tables):
dbSession = dbsession
SQLAlcBase = tables
sharesDetailsClass = SQLAlcBase.classes['sharesDetails']
growthQuality = GrowthQuality(sharecode, dbSession, SQLAlcBase)
assert growthQuality.value == expected
It depends on these 2 other tests, which are defined in separate .py files in the same directory:
@pytest.mark.dependency(name="test_leaseAdjCapEmp")
def test_leaseAdjCapEmp(sharecode, expected, dbsession, tables):
dbSession = dbsession
SQLAlcBase = tables
sharesDetailsClass = SQLAlcBase.classes['sharesDetails']
leaseAdjCapEmpObj = leaseAdjCapEmp(sharecode, dbSession, SQLAlcBase)
# IntermediateCalculatedValue tests should actually write to the DB, since other tests rely on these values being there
leaseAdjCapEmpObj.outputValue()
assert leaseAdjCapEmpObj.values.equals(expected)
@pytest.mark.dependency(name="test_leaseLiabilities")
def test_leaseLiabilities(sharecode, expected, dbsession, tables):
dbSession = dbsession
SQLAlcBase = tables
sharesDetailsClass = SQLAlcBase.classes['sharesDetails']
leaseLiab = leaseLiabilities(sharecode, dbSession, SQLAlcBase)
# IntermediateCalculatedValue tests should actually write to the DB, since other tests rely on these values being there
leaseLiab.outputValue()
assert leaseLiab.values.equals(expected)
When I run my tests I get:
SKIPPED [1] /usr/local/lib/python3.8/dist-packages/pytest_dependency.py:104: test_calcGrowthQuality[RTN.L-expected0] depends on tests_calculatedvalues/test_leaseAdjCapEmp.py::test_leaseAdjCapEmp
SKIPPED [1] /usr/local/lib/python3.8/dist-packages/pytest_dependency.py:104: test_calcGrowthQuality[FL-expected1] depends on tests_calculatedvalues/test_leaseAdjCapEmp.py::test_leaseAdjCapEmp
SKIPPED [1] /usr/local/lib/python3.8/dist-packages/pytest_dependency.py:104: test_calcGrowthQuality[ALV.DE-expected2] depends on tests_calculatedvalues/test_leaseAdjCapEmp.py::test_leaseAdjCapEmp
I've read https://pytest-dependency.readthedocs.io/en/stable/scope.html and I assume I've done something wrong with naming the tests, but I can't work out what. To me the name "tests_calculatedvalues/test_leaseLiabilities.py::test_leaseLiabilities" seems like it should be correct.
I am making this issue because I believe the use of pytest.mark.parametrize requires knowledge not currently in pytest-dependency's docs: the bracket syntax used in the names of parametrized tests.
I was trying to use pytest-dependency and pytest.mark.parametrize together, and couldn't figure it out. Searching the pytest-dependency docs for "parametrize" gives no results. Here's some sample code:
import pytest
class TestOneTwo:
"""Bunch of tests grouped in a class."""
PARAM_ONE = 1001
PARAM_TWO = 1002
@pytest.mark.parametrize("foo", [PARAM_ONE, PARAM_TWO])
@pytest.mark.dependency()
def test_one(self, foo: int) -> None:
assert isinstance(foo, int)
@pytest.mark.dependency(depends=[f"TestOneTwo::test_one[{PARAM_ONE}]"])
def test_two_option_1(self) -> None:
pass
@pytest.mark.dependency(depends=[f"test_one[{PARAM_ONE}]"], scope="class")
def test_two_option_2(self) -> None:
pass
I think the docs should address that use of pytest.mark.parametrize requires brackets [x] in the depends field to address parametrized tests.
Is there a way to mark a parametrized test so that a second, dependent test will run only if the first test passed for all parameters?
import pytest
import pytest_dependency
@pytest.mark.parametrize("x", [(True),(True)])
@pytest.mark.dependency()
def test_a(x):
if x:
pass
else:
assert 0
@pytest.mark.dependency(depends=["test_a"])
def test_b():
pass
I know that I can mark each parameter, but this would be easier when the test should pass on all parameters and there are a lot of them.
Currently the first two tests (i.e. test_a) pass and the second test (test_b) is skipped.
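Until such a shorthand exists, one workaround is to build the depends list from the same parameter list. A sketch, assuming pytest's default test ids, i.e. test_a[param] with str(param); the id format differs for complex or duplicated parameter values, where pytest appends disambiguating suffixes:

```python
# Build one depends entry per parametrized instance of test_a.
# Assumes pytest's default ids: test_a[<param>] with str(param).
params = [1, 2, 3]
depends = ["test_a[{}]".format(p) for p in params]
```

The resulting list can then be passed as @pytest.mark.dependency(depends=depends) on test_b, with the same params list feeding @pytest.mark.parametrize on test_a.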
Hello, I have a question:
Is it possible to add a dependency on two tests where either one suffices?
Let me explain myself:
With the line "@pytest.mark.dependency(depends=["test_b", "test_c"])", both test_b and test_c must pass for the test to run.
What I want is for the test to run if at least one of them passes. So if test_b passes and test_c fails, it should run anyway. The only case where the test shouldn't run is when both tests fail.
I know this question might seem a bit unusual, but I'm trying to create a workaround since I'm running parallel tests with xdist and I think this would help me.
Referring to the screenshots (sample code copied from the pytest-dependency doc, with slight changes: the "tests" folder removed), I am expecting test_e and test_g to pass; however, both are skipped. Kindly advise if I have done anything silly that is stopping the session scope from working properly.
Note:
Should add a page to the documentation giving hints on how to debug.
I have not experimented much with test classes so far. Need to verify that the plugin also works for test classes, add appropriate tests to the test suite, and document the usage for test classes.
Thanks for this! Would you mind updating the pypi index? The fix to not depend on import anymore from 7691ce8 is really useful.
This module is great, it's helped me out quite a bit with some complicated test dependencies that I have.
However, I cannot pip install it via requirements.txt. When I do, I see an error that pytest is not installed, even when I specify pytest before pytest-dependency.
I found a workaround: install all other dependencies via pip install -r requirements.txt, then run pip install pytest-dependency. The doc states that you should download and unpack the source and use setup.py.
I would rather use pip though. I am using a virtualenv. Just wondering what I'm doing wrong with the first approach or why I'm seeing this error. Is there a specific reason why pip is not officially supported or is that just a doc issue?
Thanks!
For the moment, the scope of the dependencies is hard coded to module. That means a test can only be marked as dependent on a test in the same module; it cannot depend on a test in a different module. The backend class DependencyManager is in principle also capable of working on session scope, but the calling hooks do not use the optional scope argument in the getManager() call.
The difficulty is that I have no idea how to define a clean API to configure the scope from the calling tests.
The online documentation is currently hosted at http://pythonhosted.org/pytest-dependency/. But pythonhosted.org has essentially been closed down without a warning, without any option to update the documentation. We urgently need to move to a new host. https://readthedocs.org/ seem to be a valid option.
2 files:
test_abc.py
import pytest

class TestAbc:
    @pytest.mark.dependency()
    def test_abc(self):
        assert False
and test_def.py:
import pytest

class TestDef:
    @pytest.mark.dependency(depends=['test_abc.py::TestAbc::test_abc'], scope='package')
    def test_def(self):
        pass
It works as expected: test_abc fails, test_def is skipped.
When I rename test_def within dependency like this:
import pytest

class TestDef:
    @pytest.mark.dependency(depends=['test_abc.py::TestAbc::test_abc'], scope='package')
    @pytest.mark.dependency(name='test_two')
    def test_def(self):
        pass
actual behaviour is: test_abc fails, test_def passes - is that correct behaviour? Shouldn't test_def be skipped?
Hi,
I have managed to successfully install pytest-dependency using pip install.
Is there a way to install it using conda install or, if that is not possible, using apt-get install? I am facing a dependency issue where I am unable to use pip.
I was checking http://pytest-dependency.readthedocs.io/en/0.3.1/ and the web, but unfortunately have not been lucky.
Hope you can advise.
Thanks.
This plugin defines the marker @pytest.mark.dependency() to mark a test as dependent on another test. But there are cases where it is difficult to know the dependency in advance; it may only turn out at run time that a test depends on another test. So it would be handy to have a function depends(other_tests) that can be called from within a test and would skip the running test if other_tests have not all run successfully. This new function would relate to the marker in a similar way as pytest.skip() relates to @pytest.mark.skip().
A possible implementation might be:
def depends(request, other_tests):
    DependencyManager.getManager(request.node).checkDepend(other_tests)
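A standalone sketch of the intended semantics, where a plain dict stands in for the DependencyManager's results and a custom exception stands in for the one pytest.skip() raises:

```python
class Skipped(Exception):
    """Stand-in for the exception raised by pytest.skip()."""

def depends(results, other_tests):
    # Skip the calling test unless every named dependency succeeded.
    for name in other_tests:
        if not results.get(name, False):
            raise Skipped("depends on %s" % name)

results = {"test_a": True, "test_b": False}
depends(results, ["test_a"])            # all dependencies passed: no skip
try:
    depends(results, ["test_a", "test_b"])
except Skipped as exc:
    print(exc)                          # depends on test_b
```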
When running the test suite with pytest 4.2.0 or newer, the test test_03_skipmsgs.py fails.
The cause is the following: this test is supposed to check the skip messages generated by pytest-dependency. It searches for lines matching something like SKIP * test_c depends on test_b in the test output. But in pytest 4.2.0 the corresponding line in the test output has been changed from
SKIP [1] [...]/pytest_dependency.py:88: test_c depends on test_b
to
SKIPPED [1] [...]/pytest_dependency.py:88: test_c depends on test_b
I believe the relevant change in pytest was in pytest-dev/pytest#4668.
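A fix in the test suite could accept both spellings. A minimal sketch of such a pattern (the exact pattern used by pytest-dependency's test suite may differ):

```python
import re

# Match both pytest < 4.2 ("SKIP") and >= 4.2 ("SKIPPED") skip report lines.
pattern = re.compile(r"SKIP(?:PED)?\s+\[1\].*: test_c depends on test_b")

old = "SKIP [1] pytest_dependency.py:88: test_c depends on test_b"
new = "SKIPPED [1] pytest_dependency.py:88: test_c depends on test_b"
print(bool(pattern.search(old)), bool(pattern.search(new)))  # True True
```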
@pytest.mark.dependency does not honor the depends parameter.
tl;dr: the dependency check does not work.
Following the example from the readthedocs.io usage section on naming tests, my file looks like
import pytest

class TestClassNamed(object):
    @pytest.mark.dependency(name="a")
    @pytest.mark.xfail(reason="deliberate fail")
    def test_a(self):
        assert False

    @pytest.mark.dependency(name="b", depends=["a"])
    def test_b(self):
        pass
Running this
PS > pytest .\test\ -v
===================== test session starts =====================
platform win32 -- Python 3.7.1, pytest-4.2.0, py-1.7.0, pluggy-0.8.1 -- c:\users\user1\.virtualenvs\project1-3kr5jz-l\scripts\python.exe
cachedir: .pytest_cache
rootdir: C:\Users\user1\Downloads\project1\test, inifile:
test/test_1.py::TestClassNamed::test_a XFAIL [ 50%]
test/test_1.py::TestClassNamed::test_b PASSED [100%]
I expected TestClassNamed::test_b to be SKIPPED, because it depends on the test named "a" and that test XFAILed. However, TestClassNamed::test_b was run and PASSED. That was unexpected.
I tried this without the @pytest.mark.xfail(reason="deliberate fail") on test_a. However, TestClassNamed::test_b still ran and PASSED. That was also unexpected.
pip list
PS > pip list
Package Version
-------------- -------
atomicwrites 1.3.0
attrs 18.2.0
colorama 0.4.1
more-itertools 5.0.0
musicbrainzngs 0.6
mutagen 1.42.0
pip 19.0.1
pluggy 0.8.1
py 1.7.0
pytest 4.2.0
setuptools 40.7.2
six 1.12.0
wheel 0.32.3
From the example output
platform win32 -- Python 3.7.1, pytest-4.2.0, py-1.7.0, pluggy-0.8.1
Originally and mistakenly reported in pytest #4737 (which has been closed).
Is there an option to disable dependencies, e.g. via some command line option?
If someone wants to run only one test suite, but it depends on tests from a different test suite, its tests will be skipped.
When there are a lot of heavily interdependent tests, the dependencies save a lot of time when all tests are run, but they are counterproductive when only part of the tests are to be run (one has to open a file and comment out the dependencies).
pytest-dependency does not seem to behave well when using pytest-xdist. Some tests are skipped when they should not be. Let's see how to quickly reproduce the bug:
test_foobar.py
import pytest
@pytest.mark.dependency()
def test_a():
pass
@pytest.mark.dependency(depends=["test_a"])
def test_b():
pass
console
virtualenv env
./env/bin/pip install pytest
./env/bin/pip install pytest-xdist
./env/bin/pip install pytest-dependency
./env/bin/pytest -n auto test_foobar.py
output
pytest test_foobar.py -n auto
============================= test session starts ==============================
platform linux -- Python 3.6.1, pytest-3.1.3, py-1.4.34, pluggy-0.4.0
rootdir: /home/eloi/dev/dependency-test, inifile:
plugins: xdist-1.18.1, dependency-0.2
gw0 [2] / gw1 [2] / gw2 [2] / gw3 [2]
scheduling tests via LoadScheduling
s.
===================== 1 passed, 1 skipped in 0.31 seconds ======================
Tests are skipped when trying to execute them filtered by a mark.
pipenv run pytest -m C999
@pytest.mark.dependency(name="test1")
def test_1(self):
    pass

@pytest.mark.dependency(depends=["test1"])
@pytest.mark.C999
def test_2(self):
    pass
tests/cms/test_posts.py::TestArticles::test_2 SKIPPED (test_2 depends on test1) [100%]
pytest == 6.2.4
pytest-dependency == 0.5.1