avocado-framework / avocado

Home Page: https://avocado-framework.github.io/

License: Other

Languages: Python 97.33%, Shell 0.95%, Makefile 0.18%, HTML 0.47%, JavaScript 0.45%, CSS 0.57%, RobotFramework 0.01%, Go 0.03%
Topics: testing, python, framework, linux

avocado's Introduction

Welcome to Avocado

Avocado is a set of tools and libraries to help with automated testing.

One can call it a test framework with benefits. Native tests are written in Python and they follow the unittest pattern, but any executable can serve as a test.

How does it work?

You should first experience Avocado by using the test runner, that is, the command line tool that will conveniently run your tests and collect their results.

To do so, please run avocado with the run sub-command, followed by a test reference, which can be either a path to the test file or a recognizable name:

$ avocado run /bin/true
JOB ID     : e0134e010afa18b55d93276ac2a790dc38db7948
JOB LOG    : $HOME/avocado/job-results/job-2023-09-06T10.55-e0134e0/job.log
  (1/1) /bin/true: STARTED
  (1/1) /bin/true: PASS (0.02 s)
RESULTS    : PASS 1 | ERROR 0 | FAIL 0 | SKIP 0 | WARN 0 | INTERRUPT 0 | CANCEL 0
JOB HTML   : $HOME/avocado/job-results/job-2023-09-06T10.55-e0134e0/results.html
JOB TIME   : 1.52 s

You probably noticed that we used /bin/true as a test and, in accordance with our expectations, it passed! Tests like this are known as exec-tests, but there is also another type of test, which we call instrumented tests (a minimal example follows the tip below).

Tip

See more in the Test types section of the Avocado User's Guide.
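
For illustration, an instrumented test is a Python class derived from avocado.Test. The sketch below mirrors the classic sleeptest example shipped with Avocado; the file name, class name and parameter are illustrative:

import time

from avocado import Test


class SleepTest(Test):

    def test(self):
        # Parameters may be supplied by variants; otherwise the default is used.
        sleep_length = self.params.get('sleep_length', default=0.1)
        time.sleep(sleep_length)

Saved as, say, sleeptest.py, it can be run with avocado run sleeptest.py.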

Why should I use it?

Multiple result formats

A regular run of Avocado will present the test results on standard output, a nice and colored report useful for human beings. But machine-readable results can also be generated.

Check the job-results folder ($HOME/avocado/job-results/latest/) to see the outputs.

Currently we support, out of the box, the following output formats:

  • xUnit: an XML format that contains test results in a structured form, used by other test automation projects such as Jenkins.
  • JSON: a widely used data exchange format. The JSON Avocado plugin outputs job information, similarly to the xUnit output plugin.
  • TAP: provides basic TAP (Test Anything Protocol) results, currently in v12. Unlike most other Avocado machine-readable outputs, this one is streamlined (per-test results).

Note

You can see the results of the latest job inside the folder $HOME/avocado/job-results/latest/. You can also specify on the command line the options --xunit, --json or --tap, followed by a filename; Avocado will write the output to the specified file.
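
For example, assuming the xUnit, JSON and TAP result plugins are enabled (they are, out of the box), a single run can produce all three files at once; the file names are arbitrary:

$ avocado run /bin/true --xunit results.xml --json results.json --tap results.tap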

When it comes to outputs, Avocado is very flexible. You can check the various output plugins. If you need something more sophisticated, visit our plugins section.

Sysinfo data collector

Avocado comes with a sysinfo plugin, which automatically gathers some system information for each job, or even between tests. This is very helpful when trying to identify the cause of a test failure.

Check out the files stored at $HOME/avocado/job-results/latest/sysinfo/:

$ ls $HOME/avocado/job-results/latest/sysinfo/pre/
'brctl show'           hostname             modules
 cmdline              'ifconfig -a'         mounts
 cpuinfo               installed_packages  'numactl --hardware show'
 current_clocksource   interrupts           partitions
'df -mP'              'ip link'             scaling_governor
 dmesg                'ld --version'       'uname -a'
 dmidecode             lscpu                uptime
'fdisk -l'            'lspci -vvnn'         version
'gcc --version'        meminfo

For more information about the sysinfo collector, please consult the Avocado User's Guide.

Job Replay and Job Diff

In order to reproduce a given job using the same data, one can use the replay subcommand, providing the hash id of the original job to be replayed. The hash id can be partial, as long as the provided part corresponds to the initial characters of the original job id and is unique enough. Alternatively, instead of the job id, you can use the string latest and Avocado will replay the latest job executed.

Example:

$ avocado replay 825b86
JOB ID     : 55a0d10132c02b8cc87deb2b480bfd8abbd956c3
SRC JOB ID : 825b860b0c2f6ec48953c638432e3e323f8d7cad
JOB LOG    : $HOME/avocado/job-results/job-2016-01-11T16.18-55a0d10/job.log
 (1/2) /bin/true: PASS (0.01 s)
 (2/2) /bin/false: FAIL (0.01 s)
RESULTS    : PASS 1 | ERROR 0 | FAIL 1 | SKIP 0 | WARN 0 | INTERRUPT 0
JOB TIME   : 0.11 s
JOB HTML   : $HOME/avocado/job-results/job-2016-01-11T16.18-55a0d10/html/results.html

The Avocado Diff plugin allows users to easily compare several aspects of two given jobs. The basic usage is:

$ avocado diff 7025aaba 384b949c
--- 7025aaba9c2ab8b4bba2e33b64db3824810bb5df
+++ 384b949c991b8ab324ce67c9d9ba761fd07672ff
@@ -1,15 +1,15 @@

 COMMAND LINE
-/usr/bin/avocado run sleeptest.py
+/usr/bin/avocado run passtest.py

 TOTAL TIME
-1.00 s
+0.00 s

 TEST RESULTS
-1-sleeptest.py:SleepTest.test: PASS
+1-passtest.py:PassTest.test: PASS

 ...

Extensible by plugins

Avocado has a plugin system that can be used to extend it in a clean way. The avocado command line tool has a built-in plugins command that lets you list the available plugins. The usage is pretty simple:

$ avocado plugins
Plugins that add new commands (avocado.plugins.cli.cmd):
exec-path Returns path to Avocado bash libraries and exits.
run       Run one or more tests (native test, test alias, binary or script)
sysinfo   Collect system information
...
Plugins that add new options to commands (avocado.plugins.cli):
remote  Remote machine options for 'run' subcommand
journal Journal options for the 'run' subcommand
...

For more information about plugins, please visit the Plugin System section of the Avocado User's Guide.

Utility libraries

When writing tests, developers often need to perform basic tasks on the OS and end up implementing these routines just to run their tests.

Avocado has more than 40 utility modules that help you perform basic operations.

Below is a small subset of our utility modules (see the usage sketch after the list):

  • utils.vmimage: provides an API to download/cache VM images (QCOW) from the official distributions' repositories.
  • utils.memory: provides information about memory usage.
  • utils.cpu: gets information about the current machine's CPU.
  • utils.software_manager: software package management library.
  • utils.download: methods to download URLs and regular files.
  • utils.archive: module to help extract and create compressed archives.
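
As a quick illustration, here is a minimal sketch using utils.vmimage; it assumes network access, and the distribution name is just an example:

from avocado.utils import vmimage

# Download (or reuse a cached copy of) a Fedora cloud image; the returned
# Image object exposes the local path of a bootable QCOW2 file.
image = vmimage.get('Fedora')
print(image.path)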

Avocado Python API

If the command line is limiting you, you can use our new API to create custom jobs and test suites:

import sys

from avocado.core.job import Job

# Build a job from a configuration dictionary: 'resolver.references' lists
# the tests to run, and job.run() returns the job's exit code.
with Job.from_config({'resolver.references': ['/bin/true']}) as job:
    sys.exit(job.run())
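
Save the snippet to a file (say, myjob.py; the name is arbitrary) and execute it with python3 myjob.py: the job behaves just like one started from the command line, including writing its results under the job-results directory.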

How to install

It is super easy; just run the following command:

$ pip3 install --user avocado-framework

This will install the avocado command in your home directory.

Note

For more details and alternative methods, please visit the Installing section of the Avocado User's Guide.

Documentation

Please refer to the Avocado documentation for the full documentation, including installation methods, tutorials and the API, or browse this site for more content.

Bugs/Requests

Please use the GitHub issue tracker to submit bugs or request features.

Changelog

Please consult the Avocado Releases for fixes and enhancements of each version.

License

Except where otherwise indicated in a given source file, all original contributions to Avocado are licensed under the GNU General Public License version 2 (GPLv2) or any later version.

By contributing you agree that these contributions are your own (or approved by your employer) and you grant a full, complete, irrevocable copyright license to all users and developers of the Avocado project, present and future, pursuant to the license of the project.

Build and Quality Status

Copr build

Code Climate Maintainability

Documentation Status

Code Style checking by Black

avocado's People

Contributors

abdhaleegit, adereis, amoskong, ana, apahim, beraldoleal, bonzini, caiocarrara, cforno12, chunfuwen, clebergnu, harish-24, hsmj1412, ldoktor, lmr, mmathesius, narasimhan-v, naresh-ibm, paulyuuu, pevogam, praveenpenguin, richtja, ruda, samiraguiar, smruti77, tiagohonorato, tntc4stl3, vaishnavibhat, wainersm, willianrampazzo


avocado's Issues

Update path of core file to case log folder

Currently the core file is generated under the Avocado framework folder:

  1. the core file will be overwritten when more than one core is generated during a test,
  2. the tester cannot see the core file in the test results.

The core file needs to be put in the test case's log folder.

Test interrupted because of Job Timeout should finish as INTERRUPTed

@ruda since this is the code you've just written, please take a look at the following execution:

$ ./scripts/avocado run sleeptest -m examples/tests/sleeptest.py.data/sleeptest.yaml --job-timeout 5s 
JOB ID     : 027a7f27af4ef455d53aefe3c12dc9d1cc71a765
JOB LOG    : /home/cleber/avocado/job-results/job-2015-04-15T19.33-027a7f2/job.log
JOB HTML   : /home/cleber/avocado/job-results/job-2015-04-15T19.33-027a7f2/html/results.html
TESTS      : 4
(1/4) sleeptest.py: PASS (0.50 s)
(2/4) sleeptest.py.1: PASS (1.00 s)
(3/4) sleeptest.py.2: ERROR (4.06 s)
(4/4) sleeptest.py.3: SKIP
PASS       : 2
ERROR      : 1
FAIL       : 0
SKIP       : 1
WARN       : 0
INTERRUPT  : 0
TIME       : 5.58 s

IMHO, test 3/4 should have finished as INTERRUPTED, just as if the job had been aborted by a CTRL-C.

Skipping tests outside of Setup()

Hi,

Currently, there is no way to call skip() outside of setUp().
Is there any design restriction on using it that way?
There might be use cases for skipping individual tests (see the sketch below).
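
For reference, a sketch of the restriction being discussed, assuming the self.skip() API as it existed at the time:

from avocado import Test


class ExampleTest(Test):

    def setUp(self):
        # Calling skip() here is allowed and marks the test as SKIP...
        self.skip('environment not supported')

    def test(self):
        # ...but calling self.skip() here is rejected by the framework.
        pass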

Thanks
Narasimhan V

Avocado: basic-example documentation needs an update

link: http://avocado-framework.readthedocs.org/en/latest/WritingTests.html#basic-example

The above link describes a basic example which users can refer to and run to get used to Avocado. Currently the .yaml contents on that page are missing the '!mux' keyword; run as-is, it will show users the error below:

14:57:33 job L0399 INFO | Multiplex tree representation:
14:57:33 job L0401 INFO | -- run
14:57:33 job L0401 INFO | -- sleeptest
14:57:33 job L0401 INFO | | -> type: builtin
14:57:33 job L0401 INFO | |-- short
14:57:33 job L0401 INFO | | -> sleep_length: 0.5
14:57:33 job L0401 INFO | |-- medium
14:57:33 job L0401 INFO | | -> sleep_length: 1
14:57:33 job L0401 INFO | -- long
14:57:33 job L0401 INFO | -> sleep_length: 5
14:57:33 job L0402 INFO |
14:57:33 job L0406 INFO | Temporary dir: /var/tmp/avocado_RMhEVQ
14:57:33 job L0407 INFO |
14:57:33 job L0414 INFO | Variant 1: /run/sleeptest/short, /run/sleeptest/medium, /run/sleeptest/long
14:57:33 job L0417 INFO |
14:57:33 job L0292 INFO | Job ID: 76915559c8f8b86a5a39f55844d5dd857124c6d8
14:57:33 job L0293 INFO |
14:57:33 test L0153 INFO | START sleeptest.py:SleepTest.test
14:57:33 multiplexer L0193 DEBUG| PARAMS (key=timeout, path=*, default=None) => None
14:57:33 stacktrace L0036 ERROR|
14:57:33 stacktrace L0039 ERROR| Reproduced traceback from: /usr/lib/python2.6/site-packages/avocado-0.27.0-py2.6.egg/avocado/core/test.py:375
14:57:33 stacktrace L0042 ERROR| Traceback (most recent call last):
14:57:33 stacktrace L0042 ERROR| File "/usr/share/avocado/tests/sleeptest.py", line 21, in test
14:57:33 stacktrace L0042 ERROR| sleep_length = self.params.get('sleep_length', default=1)
14:57:33 stacktrace L0042 ERROR| File "/usr/lib/python2.6/site-packages/avocado-0.27.0-py2.6.egg/avocado/core/multiplexer.py", line 265, in get
14:57:33 stacktrace L0042 ERROR| value = self._get(key, path, default)
14:57:33 stacktrace L0042 ERROR| File "/usr/lib/python2.6/site-packages/avocado-0.27.0-py2.6.egg/avocado/core/multiplexer.py", line 281, in _get
14:57:33 stacktrace L0042 ERROR| return param.get_or_die(path, key)
14:57:33 stacktrace L0042 ERROR| File "/usr/lib/python2.6/site-packages/avocado-0.27.0-py2.6.egg/avocado/core/multiplexer.py", line 372, in get_or_die
14:57:33 stacktrace L0042 ERROR| for _ in ret]))
14:57:33 stacktrace L0042 ERROR| ValueError: Multiple leaves contain the key 'sleep_length'; ['/run/sleeptest/short=>0.5', '/run/sleeptest/medium=>1', '/run/sleeptest/long=>5']
14:57:33 stacktrace L0043 ERROR|
14:57:33 test L0482 ERROR| Traceback (most recent call last):

14:57:33 test L0482 ERROR| File "/usr/lib/python2.6/site-packages/avocado-0.27.0-py2.6.egg/avocado/core/test.py", line 424, in _run_avocado
raise test_exception

14:57:33 test L0482 ERROR| ValueError: Multiple leaves contain the key 'sleep_length'; ['/run/sleeptest/short=>0.5', '/run/sleeptest/medium=>1', '/run/sleeptest/long=>5']

14:57:33 test L0499 ERROR| ERROR sleeptest.py:SleepTest.test -> ValueError: Multiple leaves contain the key 'sleep_length'; ['/run/sleeptest/short=>0.5', '/run/sleeptest/medium=>1', '/run/sleeptest/long=>5']
14:57:33 test L0486 INFO |

Needs a small update as mentioned below:

sleeptest: !mux    # <<--- the missing keyword
    type: "builtin"
    short:
        sleep_length: 0.5
    medium:
        sleep_length: 1
    long:
        sleep_length: 5

RFE: copy whole test directory structure to remote(virtual) machine

Hi,
Nowadays, when a test runs, only the tests mentioned on the command line are copied to the remote machine; the trouble is that I also have a support library there for some of the tests (containing a general part of the code execution).
I have to copy this helper library to the virtual (remote) machine manually.
It is connected with issue:
#350
It could be solved via some special parameter to make paths relative, like --dir, or via an option like --copy-tree, which would copy everything around the mentioned test. So when
you run: avocado run --vm... $ABC/test.py
it causes $ABC to be copied to the VMs.
Regards
Honza

Job Timeout: sub-second granularity

@ruda take a look at the following avocado execution:

./scripts/avocado run sleeptest -m examples/tests/sleeptest.py.data/sleeptest.yaml --job-timeout 1s
JOB ID     : 9b054d39ef2086b25fe194780991372b40ff4816
JOB LOG    : /home/cleber/avocado/job-results/job-2015-04-15T19.26-9b054d3/job.log
JOB HTML   : /home/cleber/avocado/job-results/job-2015-04-15T19.26-9b054d3/html/results.html
TESTS      : 4
(1/4) sleeptest.py: PASS (0.50 s)
(2/4) sleeptest.py.1: PASS (1.00 s)
(3/4) sleeptest.py.2: SKIP
(4/4) sleeptest.py.3: SKIP
PASS       : 2
ERROR      : 0
FAIL       : 0
SKIP       : 2
WARN       : 0
INTERRUPT  : 0
TIME       : 1.52 s

IMHO, for correctness, test 2/4 should not have finished as a PASS, but it should have been interrupted.

Consistent handling of python tests and shell scripts

Hi,
I've found inconsistencies in how various test types are handled.
I have tests in the /usr/share/avocado/tests directory (the default config for the root user).
I have two tests there:
inittest.sh - a shell script
checklogin.py - an Avocado-framework-based Python test

I expected to be able to run command like:

avocado run inittest.sh checklogin.py

The actual situation is that I have to use an absolute path for both:

avocado run /usr/share/avocado/tests/inittest.sh /usr/share/avocado/tests/checklogin.py

or an absolute path for inittest and a relative path, without the ".py" suffix, for checklogin, like:

avocado run /usr/share/avocado/tests/inittest.sh checklogin

This is strange and unintuitive.

  Thanks&Regards
  Honza

RFE: passing parameters to test via avocado command line

Hi,
I need a way to insert variables into tests (i.e. set environment variables for tests),
for example via some special parameter like --env
(it should set the variable for all tests):
avocado run --vm-domain $NAME --vm-username root --vm-password $PASSWD --vm-hostname $IP $AVOCADO_TEST_DIR/sources.sh --env="A=xyz" --env="B=abc"

or another possibility, set it up for each test, like:
avocado run --vm-domain $NAME --vm-username root --vm-password $PASSWD --vm-hostname $IP "A=xyz B=abc $AVOCADO_TEST_DIR/sources.sh"

  Thanks&Regards
  Honza

Unable to run scripts on LINUX platform

We are running our setup on a Linux server and facing the following issues while running the avocado script.

avocado --help
Traceback (most recent call last):
File "avocado", line 25, in
from avocado.cli.app import AvocadoApp
File "/tools/avocado/avocado-master/avocado/init.py", line 80, in
config.dictConfig(DEFAULT_LOGGING)
AttributeError: 'module' object has no attribute 'dictConfig'

Any ideas why this happens? Does Avocado run only on Fedora and Ubuntu, and not on a plain Linux OS? I tried copying the files mentioned in the error to the scripts folder, but with no success.

Regards
Sumeet Chawla

IOError: [Errno 2] No such file or directory: '/var/tmp/avocado_ugU6zs/address_pool.lock'

I don't know how to reproduce this, because the second time I tried to run the same tests, it worked as expected.

Some log messages:

16:23:30 qemu_monitor     L0608 WARNI| Could not connect to monitor 'Could not connect to monitor socket: [Errno 2] No such file or directory'
16:23:30 qemu_monitor     L0608 WARNI| Could not connect to monitor 'Could not connect to monitor socket: [Errno 2] No such file or directory'
16:23:30 vt               L0456 DEBUG| Searching for test modules that match 'type = boot' and 'provider = io-github-autotest-qemu' on this cartesian dict
16:23:30 vt               L0470 DEBUG| Found subtest module /home/ehabkost/avocado/data/avocado-vt/test-providers.d/downloads/io-github-autotest-qemu/generic/tests/boot.py
16:23:30 env_process      L0662 DEBUG| KVM version: 4.1.5-100.fc21.x86_64
16:23:30 env_process      L0683 DEBUG| KVM userspace version: 2.4.50
16:23:30 process          L0272 INFO | Running 'true'
16:23:30 storage          L0415 DEBUG| Image backup /home/ehabkost/avocado/data/avocado-vt/images/jeos-21-64.qcow2.backup already exists, skipping...
16:23:30 env_process      L0862 ERROR| [Errno 2] No such file or directory: '/var/tmp/avocado_ugU6zs/address_pool.lock'
16:23:30 process          L0272 INFO | Running 'true'
16:23:30 vt               L0524 ERROR| Exception raised during postprocessing: Failures occurred while postprocess:
Postprocess: [Errno 2] No such file or directory: '/var/tmp/avocado_ugU6zs/address_pool.lock'
16:23:30 stacktrace       L0036 ERROR| 
16:23:30 stacktrace       L0039 ERROR| Reproduced traceback from: /home/ehabkost/rh/proj/virt/avocado/avocado/core/plugins/vt.py:575
16:23:30 stacktrace       L0042 ERROR| Traceback (most recent call last):
16:23:30 stacktrace       L0042 ERROR|   File "/home/ehabkost/rh/proj/virt/avocado/avocado/core/plugins/vt.py", line 361, in runTest
16:23:30 stacktrace       L0042 ERROR|     self._runTest()
16:23:30 stacktrace       L0042 ERROR|   File "/home/ehabkost/rh/proj/virt/avocado/avocado/core/plugins/vt.py", line 484, in _runTest
16:23:30 stacktrace       L0042 ERROR|     params = env_process.preprocess(self, params, env)
[...]
16:23:30 stacktrace       L0042 ERROR|   File "/home/ehabkost/rh/proj/virt/avocado/virttest/utils_net.py", line 2531, in lock_db
16:23:30 stacktrace       L0042 ERROR|     self.lock = utils_misc.lock_file(self.db_lockfile)
16:23:30 stacktrace       L0042 ERROR|   File "/home/ehabkost/rh/proj/virt/avocado/virttest/utils_misc.py", line 229, in lock_file
16:23:30 stacktrace       L0042 ERROR|     lockfile = open(filename, "w")
16:23:30 stacktrace       L0042 ERROR| IOError: [Errno 2] No such file or directory: '/var/tmp/avocado_ugU6zs/address_pool.lock'
16:23:30 stacktrace       L0043 ERROR| 
16:23:30 test             L0491 ERROR| Traceback (most recent call last):

16:23:30 test             L0491 ERROR|   File "/home/ehabkost/rh/proj/virt/avocado/avocado/core/plugins/vt.py", line 617, in _run_avocado
    raise test_exception

16:23:30 test             L0491 ERROR| IOError: [Errno 2] No such file or directory: '/var/tmp/avocado_ugU6zs/address_pool.lock'

16:23:30 test             L0508 ERROR| ERROR io-github-autotest-qemu.reboot -> IOError: [Errno 2] No such file or directory: '/var/tmp/avocado_ugU6zs/address_pool.lock'
16:23:30 test             L0495 INFO | 

Full job result tarball available at: https://dl.dropboxusercontent.com/u/1626365/tmp/job-2015-10-02T16.23-3c858a8.tar.bz2


RFE: Idea of support multihost testing

Hi,
I'm not sure if this is the right issue tracker (avocado-virt); I expect to be able to do some multihost testing.

What I imagine now is that I need
1 server and 3 clients, and to run some part of a test on the clients and another part on the server.
This needs a few things:

  • support for some synchronization between machines
  • support for the script to know which part is the server and which is the client
  • creating X machines - this could be solved for example via VMs (there is a --vm-domain parameter given) and Avocado would clone that machine into 4 instances;
    for the remote plugin it could be solved via multiple --remote-ip $IP parameters

    Thanks&Regards
    Honza

colors of terminal multiplied for each testcase (terminal output)

Hi,
I've tested a recent avocado build from the COPR repo and looked at the output written to the console; but when I stored it to a file, there are multiplied color escape characters for each test:
:: [ BEGIN ] :: Running 'sudo -u test bash -c 'export DISPLAY=:99; avocado run ./selenium-login.py''
JOB ID : 922db57926a1a120ac4449361a17d2ecc8988b46
JOB LOG : /home/test/avocado/job-results/job-2015-10-09T05.26-922db57/job.log
TESTS : 3
(1/3) ./selenium-login.py:BasicTestSuite.testBase: �[1D-�[1D\�[1DPASS (2.61 s)
(2/3) ./selenium-login.py:BasicTestSuite.testLogin: �[1D|�[1D/�[1D-�[1D\�[1DPASS (4.42 s)
(3/3) ./selenium-login.py:BasicTestSuite.testLoginChangeTabNetworking: �[1D|�[1D/�[1D-�[1D\�[1D|�[1D/�[1D-�[1D\�[1D|�[1D/�[1DPASS (13.51 s)
RESULTS : PASS 3 | ERROR 0 | FAIL 0 | SKIP 0 | WARN 0 | INTERRUPT 0
TIME : 20.54 s

It is not a blocking issue, but in the case of thousands of test cases it can cause bigger troubles.
avocado version: 0.29.0-1.el7.centos

Regards
Honza

bad path for results for bash scripts (html page)

Hi,
I've scheduled command:

avocado run --vm --vm-domain $NAME --vm-clean --vm-username root --vm-hostname $IP inittest.sh checklogin.py

VM DOMAIN : checkmachine1-fedora-21-x86_64
VM LOGIN : [email protected]
JOB ID : 8cd8f5aa76c9d3e589ea23e6701ef35dd5d9bf5f
JOB LOG : /root/avocado/job-results/job-2015-01-30T10.07-8cd8f5a/job.log
JOB HTML : /root/avocado/job-results/job-2015-01-30T10.07-8cd8f5a/html/results.html
TESTS : 2
(1/2) /root/avocado/tests/inittest.sh: PASS (87.14 s)
(2/2) checklogin.py: FAIL (10.43 s)
PASS : 1
ERROR : 0
NOT FOUND : 0
NOT A TEST : 0
FAIL : 1
SKIP : 0
WARN : 0
TIME : 97.57 s

It is working well, but there is trouble with the results: there are bad paths for bash scripts (and their debug logs) in the links on the results.html page.
The checklogin.py debug log path is: file:///root/avocado/job-results/latest/test-results/checklogin.py/debug.log
The inittest.sh debug log path is: file:///root/avocado/tests/inittest.sh/debug.log
This path for the bash (.sh) script is wrong and does not exist.

  Regards

Command line arguments for tests

Hi,

Currently there is no way to supply command line arguments to the tests.
We either have to pass them through a config file or the multiplexer.
Can we have command line arguments, so that dynamic inputs can be given?

Thanks
Narasimhan V

avocado rpms for ppc64[le]

Hi @lmr,
Hi @ldoktor,

I used to see avocado ppc64le rpms (for Fedora) being maintained in your COPR repository (https://copr.fedoraproject.org/coprs/lmr/Autotest/), though those builds were failing.
Has there been any change in plans?
Anyway, I can still use the direct repository and do make (with a couple of dependencies resolved); let me know if any help from us is needed to bring that repo back.
Thanks in advance.

_regards,
-Satheesh.

VM plugin: how is it working now? --vm-cleanup

Hi,
I'm not sure how it is working now.
I've used virtual plugin like:
sudo avocado run --vm --vm-domain checkmachine1-fedora-21-x86_64 --vm-clean --vm-username root --vm-password testvm --vm-hostname 192.168.122.2 --xunit out2.xml compiletest.sh checklogin.py

and it seems that it creates a new snapshot, although the old one is left there. I saw in virt-manager that there are stored snapshots (the host is RHEL7).
Is this intended, or is it a bug?
It makes using this parameter very, very slow.
On Fedora 21 it works better, but I don't know if that is due to Avocado or to a bug in libvirt.
Regards
Honza

RFE: allow more fails in tests

Hi,
as we discussed, it would be nice to allow more failures in one test. to solve issue, that lots of people have to do it by own counter of fails.
It could besolved via some decorators aroound asserts in python test.
like:
self.assertTrue(False)
self.assertTrue(True)
self.assertTrue(False)
should produce:
PASS: 1
FAIL: 2
...
It would be very nice to solve this in the framework instead of having many separate solutions to the same issue.
Thanks&Regards
Honza

Avocado with option `--json` when some builtin plugin could not be enabled will dump garbage

Running Avocado with the option --json when some builtin plugin could not be enabled (e.g. no libvirt installed) will dump garbage if you want to parse the JSON result.

Example: avocado --json run sleeptest:

Could not import plugin 'RunVM': No module named libvirt
{"tests": [{"test": "sleeptest.1", "url": "sleeptest", "status": "PASS", "time": 1.0622379779815674}], "errors": 0, "skip": 0, "time": 1.0622379779815674, "debuglog": "/home/rmoura/avocado/logs/run-2014-05-24-09.36.00/debug.log", "pass": 1, "failures": 0, "total": 1}

GDB: *.gdb.connect_commands uses relative path to binary instead of absolute

This causes problems when you execute the connect command from different path:

doublefree.gdb.connect_commands (with relative path to binary):

file DEVCONF/GDB/doublefree2.py.data/doublefree
target extended-remote :20001

doublefree.gdb.sh (is fine)

#!/bin/sh
/bin/gdb -x /home/medic/avocado/job-results/job-2015-02-06T08.16-772abbb/test-results/DEVCONF/GDB/doublefree2.py/data/doublefree.gdb.connect_commands
echo -n 'C' > /home/medic/avocado/job-results/job-2015-02-06T08.16-772abbb/test-results/DEVCONF/GDB/doublefree2.py/data/doublefree.gdb.cont.fifo

Regards,
Lukáš

When avocado (or perhaps even a test) is inside a path which contains a space, it fails

When you put avocado inside /whatever/path with spaces/in/it, the test fails:

[medic@t530 avocado ]$ ./scripts/avocado run DEVCONF/Basic_usage/output_compare.sh --show-job-log
Profilers declared: ['vmstat 1', 'journalctl -f']
Profiler disabled
Not logging /proc/slabinfo (lack of permissions)
Profilers declared: ['vmstat 1', 'journalctl -f']
Profiler disabled
START DEVCONF/Basic_usage/output_compare.sh

Test instance parameters:
    id = DEVCONF/Basic_usage/output_compare.sh
    omit_non_tests = False

Default parameters:

Test instance params override defaults whenever available

Running '/home/medic/Work/Projekty/2015-DevConf - Avocado workshop/tmp/avocado/DEVCONF/Basic_usage/output_compare.sh'

Traceback (most recent call last):
  File "/home/medic/Work/Projekty/2015-DevConf - Avocado workshop/tmp/avocado/avocado/test.py", line 538, in action
    result = process.run(self.path, verbose=True, env=test_params)
  File "/home/medic/Work/Projekty/2015-DevConf - Avocado workshop/tmp/avocado/avocado/utils/process.py", line 938, in run
    cmd_result = sp.run(timeout=timeout)
  File "/home/medic/Work/Projekty/2015-DevConf - Avocado workshop/tmp/avocado/avocado/utils/process.py", line 497, in run
    self._init_subprocess()
  File "/home/medic/Work/Projekty/2015-DevConf - Avocado workshop/tmp/avocado/avocado/utils/process.py", line 298, in _init_subprocess
    env=self.env)
  File "/usr/lib64/python2.7/subprocess.py", line 711, in __init__
    errread, errwrite)
  File "/usr/lib64/python2.7/subprocess.py", line 1327, in _execute_child
    raise child_exception
OSError: [Errno 2] No such file or directory

Traceback (most recent call last):

  File "/home/medic/Work/Projekty/2015-DevConf - Avocado workshop/tmp/avocado/avocado/test.py", line 427, in runTest
    raise action_exception

OSError: [Errno 2] No such file or directory

FAIL DEVCONF/Basic_usage/output_compare.sh -> OSError: [Errno 2] No such file or directory

Not logging /proc/slabinfo (lack of permissions)

Stderr from sub processes leaking from avocado

When executing the trinity fuzzer, stderr is leaking from its subprocess, something that shouldn't be happening:

$ scripts/avocado run trinity
DEBUG LOG: /home/lmr/avocado/logs/run-2014-05-28-22.52.31/debug.log
TOTAL TESTS: 1
*** Error in `./trinity': double free or corruption (top): 0x0000000002d402d0 ***
*** Error in `./trinity': double free or corruption (!prev): 0x0000000001cd2050 ***

We need to find and fix the source of this leak.

Avocado execution needs to give better progress indication to users

So far we've implemented and run extremely short tests in avocado, which masked this issue: right now, if you execute a test that takes a fairly long time, you'll get frustrated and confused, because the test runner won't update to show that it is running said long-running test

$ scripts/avocado run trinity
DEBUG LOG: /home/lmr/avocado/logs/run-2014-05-28-22.52.31/debug.log
TOTAL TESTS: 1
...long time waiting...

So we need to wrestle with the python logging system a bit in order to show when a test is being executed, so we have something like:

$ scripts/avocado run trinity
DEBUG LOG: /home/lmr/avocado/logs/run-2014-05-28-22.52.31/debug.log
TOTAL TESTS: 1
(1/1) trinity.1:....long time waiting....

avocado run with vm plugin hangs because of pager

Our test suite stopped and hung forever, waiting for the user to quit less in a virtual machine.

On the host:

 └─avocado /usr/bin/avocado run --vm-domain checkmachine7-fedora-21-x86_64 --vm-username root --vm-password testvm --vm-hostname 192.168.123.5 ...

On the VM:

─sshd -D
  │   └─sshd    
  │       └─bash -l -c cd ~/avocado/tests; avocado list ...
  │           └─avocado /usr/bin/avocado list ...
  │               └─less -FRSX

sysinfo not collected when running in-tree

Sysinfo uses an absolute /etc path, which doesn't exist when running the in-tree version (when avocado was not already installed). We should detect and use in-tree configs.

There are similar problems in other places too. How about using the settings module to say we're using an in-tree configuration? That would be deterministic everywhere we touch settings.

@clebergnu since you implemented the configurable sysinfo, can you please have a look at this? Maybe it's just on my system...

state['status'] is None exception (virsh.dump testcases)

Hello,
I encountered a strange error case which does not reproduce at exactly the same step, but occurs during one of the virsh.dump subtests (randomly between tests #10-20 of 54). The exception occurs in avocado/core/result.py.

Below is a sample (./scripts/avocado run virsh.dump --vt-type libvirt):

(14/54) type_specific.io-github-autotest-libvirt.virsh.dump.positive_test.non_acl.memory_dump.kdump-snappy_format:
Avocado crashed: KeyError: None
Traceback (most recent call last):

File "/home/avocado-jenkins/avocado/avocado/core/job.py", line 492, in _run
timeout=self.timeout)
File "/home/avocado-jenkins/avocado/avocado/core/runner.py", line 290, in run_suite
deadline)
File "/home/avocado-jenkins/avocado/avocado/core/runner.py", line 246, in run_test
self.result.check_test(test_state)
File "/home/avocado-jenkins/avocado/avocado/core/result.py", line 98, in check_test
output_plugin.check_test(state)
File "/home/avocado-jenkins/avocado/avocado/core/result.py", line 252, in check_test
add = status_map[state['status']]

KeyError: None

Please include the traceback info and command line used on your bug report
Report bugs visiting https://github.com/avocado-framework/avocado/issues/new

Somehow state['status'] has an unsupported None value, as well as e.g. state['time_start'], which is also None; this doesn't look good.
I also tried to increase the timeout value in the virsh.dump test cfg to 60 (because for some of its subtests the default 5 seconds were not enough).

As it does not reproduce at exactly the same step, debugging is quite time consuming (when I launch a single subtest separately it doesn't crash), because the whole set of subtests has to be run each time.

Maybe there are some ideas/hints about it ?

Thank you in advance

Avocado APIs to execute processes should be able to print stdout/err as it happens

It is quite counterproductive that process.run, process.system and the others do not, by default, display stdout/err as it is generated. Due to the extremely simplistic way the subprocess wrappers were implemented, stdout/err is only stored and displayed to the user later.

Modifying it requires some fair changes to the implementation of said functions, possibly making them more similar to the autotest implementation (hopefully cleaner, but still).
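
For reference, a minimal sketch of the current behavior described above; the output only becomes available after the command finishes:

from avocado.utils import process

# run() blocks until the command completes; stdout/stderr are captured
# into the returned CmdResult instead of being echoed as they are produced.
result = process.run('echo hello')
print(result.stdout)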

Consolidate HTML plugin link paths for remote/vm plugins

Hi,
I've used command like:
sudo avocado run --vm-domain checkmachine1-fedora-21-x86_64 --vm-username root --vm-password testvm --vm-hostname 192.168.122.33 --vm-clean /tmp/avocado-test/tests/compiletest.sh /tmp/avocado-test/tests/checklogin.py

it generated HTML output to:
file:///root/avocado/job-results/job-2015-02-05T15.39-40f2dbc/html/results.html

but all links in page are bad, there are mixed LOCAL / REMOTE paths propably

        <table id="results" class="display" cellspacing="0" width="100%"><thead>
        <tr>
            <th>Start Time</th>
            <th>Test ID</th>
            <th>Status</th>
            <th>Time (sec)</th>
            <th>Info</th>
            <th>Debug Log</th>
        </tr>
        </thead>
            <tr class="success">
                <td>2015-02-05 15:38:45</td>
                <td><a href="/root/avocado/tests/tmp/avocado-test/tests/compiletest.sh">/root/avocado/tests/tmp/avocado-test/tests/compiletest.sh</a></td>
                <td>PASS</td>
                <td>42.99</td>
                <td>Not supported yet</td>
                <td><a href="/root/avocado/tests/tmp/avocado-test/tests/compiletest.sh/debug.log">debug.log</a></td>
            </tr>
            <tr class="success">
                <td>2015-02-05 15:39:28</td>
                <td><a href="/root/avocado/tests/tmp/avocado-test/tests/checklogin.py">/root/avocado/tests/tmp/avocado-test/tests/checklogin.py</a></td>
                <td>PASS</td>
                <td>9.59</td>
                <td>Not supported yet</td>
                <td><a href="/root/avocado/tests/tmp/avocado-test/tests/checklogin.py/debug.log">debug.log</a></td>
            </tr>
        </table>

Compatible with python2.6

Hi folks,
The packages needed for Python 2.6 are installed only for Travis (requirements-travis-python26.txt); should we install them in the Makefile so that all our tests can run on Python 2.6 (RHEL6)?

--help info on RHEL6 is broken

[root@astepano2 avocado]# pwd
/root/avocado
[root@astepano2 avocado]# scripts/avocado vt-bootstrap --help
Usage:
avocado [OPTION...] - GStreamer initialization

Help Options:
-h, --help Show help options
--help-all Show all help options
--help-gst Show GStreamer Options

[root@astepano2 avocado]# git log
commit bb2de39
Merge: d90ea78 63fef11
Author: Lucas Meneghel Rodrigues [email protected]
Date: Mon Aug 3 23:47:25 2015 -0300

RFE: add --dir option to avocado run (should work as path where to find tests)

Hi,
I would like to see an option --dir that would mean a relative path to tests.
I can imagine this situation:
you have the test location and where to store results configured in the avocado config file; the results are okay, but for tests it is not convenient to use absolute paths like:
avocado run /x/y/z/a.py /x/y/z/b.sh
It would be nice to have the possibility to say, from the command line, that there is another path where to find tests, like:
avocado run --dir /x/y/z a.py b.sh

There is also a second usage for this - it is good for the remote and vm plugins when running tests. For example, I have some shared functions for more than one script, which have to be copied to the remote machine so they can be used there. So the --dir option would cause the whole --dir with tests to be copied there.

 Thanks&Regards
 Honza

Investigation of avocado build failures

An interesting side effect of @clebergnu's work of ditching nose and standardizing all the unittests and functional tests is that all the tests are run at rpm build time. While a great idea, it made building our avocado packages on different chroots challenging. Here are the problems I've faced so far:

Success

epel-6-i386: OK
epel-6-x86_64: OK
epel-7-x86_64: OK
fedora-22-x86_64: OK
fedora-22-i386: OK

Failures

fedora-rawhide-i386:

======================================================================
ERROR: test_build_docs (selftests.doc.test_doc_build.DocBuildTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/builddir/build/BUILD/avocado-0.28.0/selftests/doc/test_doc_build.py", line 54, in test_build_docs
    raise DocBuildError(e_msg)
DocBuildError: 3 ERRORS and/or WARNINGS detected while building the html docs:
1) /builddir/build/BUILD/avocado-0.28.0/docs/source/api/core/avocado.core.restclient.rst:17: WARNING: autodoc: failed to import module u'avocado.core.restclient.connection'; the following exception was raised:
2) /builddir/build/BUILD/avocado-0.28.0/docs/source/api/core/avocado.core.restclient.cli.rst:18: WARNING: autodoc: failed to import module u'avocado.core.restclient.cli.app'; the following exception was raised:
3) /builddir/build/BUILD/avocado-0.28.0/docs/source/api/core/avocado.core.restclient.cli.actions.rst:18: WARNING: autodoc: failed to import module u'avocado.core.restclient.cli.actions.server'; the following exception was raised:
Full output: make: Entering directory '/builddir/build/BUILD/avocado-0.28.0/docs'
sphinx-build -b html -d build/doctrees   source build/html
Making output directory...
Running Sphinx v1.2.3
loading pickled environment... not yet created
loading intersphinx inventory from http://docs.python.org/objects.inv...
building [html]: targets for 24 source files that are out of date
updating environment: 24 added, 0 changed, 0 removed
reading sources... [  4%] Configuration
reading sources... [  8%] ContributionGuide
reading sources... [ 12%] DebuggingWithGDB
reading sources... [ 16%] DevelopmentTips
reading sources... [ 20%] GetStartedGuide
reading sources... [ 25%] Introduction
reading sources... [ 29%] MultiplexConfig
reading sources... [ 33%] Plugins
reading sources... [ 37%] ReferenceGuide
reading sources... [ 41%] ResultFormats
reading sources... [ 45%] RunningTestsRemotely
reading sources... [ 50%] WrapProcess
reading sources... [ 54%] WritingTests
reading sources... [ 58%] api/core/avocado.core
reading sources... [ 62%] api/core/avocado.core.plugins
reading sources... [ 66%] api/core/avocado.core.remote
reading sources... [ 70%] api/core/avocado.core.restclient
reading sources... [ 75%] api/core/avocado.core.restclient.cli
reading sources... [ 79%] api/core/avocado.core.restclient.cli.actions
reading sources... [ 83%] api/core/avocado.core.restclient.cli.args
reading sources... [ 87%] api/test/avocado
reading sources... [ 91%] api/utils/avocado.utils
reading sources... [ 95%] api/utils/avocado.utils.external
reading sources... [100%] index
looking for now-outdated files... none found
pickling environment... done
checking consistency... done
preparing documents... done
writing output... [  4%] Configuration
writing output... [  8%] ContributionGuide
writing output... [ 12%] DebuggingWithGDB
writing output... [ 16%] DevelopmentTips
writing output... [ 20%] GetStartedGuide
writing output... [ 25%] Introduction
writing output... [ 29%] MultiplexConfig
writing output... [ 33%] Plugins
writing output... [ 37%] ReferenceGuide
writing output... [ 41%] ResultFormats
writing output... [ 45%] RunningTestsRemotely
writing output... [ 50%] WrapProcess
writing output... [ 54%] WritingTests
writing output... [ 58%] api/core/avocado.core
writing output... [ 62%] api/core/avocado.core.plugins
writing output... [ 66%] api/core/avocado.core.remote
writing output... [ 70%] api/core/avocado.core.restclient
writing output... [ 75%] api/core/avocado.core.restclient.cli
writing output... [ 79%] api/core/avocado.core.restclient.cli.actions
writing output... [ 83%] api/core/avocado.core.restclient.cli.args
writing output... [ 87%] api/test/avocado
writing output... [ 91%] api/utils/avocado.utils
writing output... [ 95%] api/utils/avocado.utils.external
writing output... [100%] index
writing additional files... genindex py-modindex search
copying images... [ 50%] diagram.png
copying images... [100%] modules.png
copying static files... done
copying extra files... done
dumping search index... done
dumping object inventory... done
build succeeded, 3 warnings.
Build finished. The HTML pages are in build/html.
make: Leaving directory '/builddir/build/BUILD/avocado-0.28.0/docs'
/builddir/build/BUILD/avocado-0.28.0/docs/source/api/core/avocado.core.restclient.rst:17: WARNING: autodoc: failed to import module u'avocado.core.restclient.connection'; the following exception was raised:
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/sphinx/ext/autodoc.py", line 335, in import_object
    __import__(self.modname)
  File "/builddir/build/BUILD/avocado-0.28.0/avocado/core/restclient/connection.py", line 22, in <module>
    import requests
ImportError: No module named requests
/builddir/build/BUILD/avocado-0.28.0/docs/source/api/core/avocado.core.restclient.cli.rst:18: WARNING: autodoc: failed to import module u'avocado.core.restclient.cli.app'; the following exception was raised:
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/sphinx/ext/autodoc.py", line 335, in import_object
    __import__(self.modname)
  File "/builddir/build/BUILD/avocado-0.28.0/avocado/core/restclient/cli/app.py", line 26, in <module>
    from .. import connection
  File "/builddir/build/BUILD/avocado-0.28.0/avocado/core/restclient/connection.py", line 22, in <module>
    import requests
ImportError: No module named requests
/builddir/build/BUILD/avocado-0.28.0/docs/source/api/core/avocado.core.restclient.cli.actions.rst:18: WARNING: autodoc: failed to import module u'avocado.core.restclient.cli.actions.server'; the following exception was raised:
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/sphinx/ext/autodoc.py", line 335, in import_object
    __import__(self.modname)
  File "/builddir/build/BUILD/avocado-0.28.0/avocado/core/restclient/cli/actions/server.py", line 10, in <module>
    from ... import connection
  File "/builddir/build/BUILD/avocado-0.28.0/avocado/core/restclient/connection.py", line 22, in <module>
    import requests
ImportError: No module named requests
Please check the output and fix your docstrings/.rst docs
----------------------------------------------------------------------
Ran 262 tests in 58.913s
FAILED (errors=1)

fedora-rawhide-x86_64:

======================================================================
ERROR: test_build_docs (selftests.doc.test_doc_build.DocBuildTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/builddir/build/BUILD/avocado-0.28.0/selftests/doc/test_doc_build.py", line 54, in test_build_docs
    raise DocBuildError(e_msg)
DocBuildError: 3 ERRORS and/or WARNINGS detected while building the html docs:
1) /builddir/build/BUILD/avocado-0.28.0/docs/source/api/core/avocado.core.restclient.rst:17: WARNING: autodoc: failed to import module u'avocado.core.restclient.connection'; the following exception was raised:
2) /builddir/build/BUILD/avocado-0.28.0/docs/source/api/core/avocado.core.restclient.cli.rst:18: WARNING: autodoc: failed to import module u'avocado.core.restclient.cli.app'; the following exception was raised:
3) /builddir/build/BUILD/avocado-0.28.0/docs/source/api/core/avocado.core.restclient.cli.actions.rst:18: WARNING: autodoc: failed to import module u'avocado.core.restclient.cli.actions.server'; the following exception was raised:
Full output: make: Entering directory '/builddir/build/BUILD/avocado-0.28.0/docs'
sphinx-build -b html -d build/doctrees   source build/html
Making output directory...
Running Sphinx v1.2.3
loading pickled environment... not yet created
loading intersphinx inventory from http://docs.python.org/objects.inv...
building [html]: targets for 24 source files that are out of date
updating environment: 24 added, 0 changed, 0 removed
reading sources... [  4%] Configuration
reading sources... [  8%] ContributionGuide
reading sources... [ 12%] DebuggingWithGDB
reading sources... [ 16%] DevelopmentTips
reading sources... [ 20%] GetStartedGuide
reading sources... [ 25%] Introduction
reading sources... [ 29%] MultiplexConfig
reading sources... [ 33%] Plugins
reading sources... [ 37%] ReferenceGuide
reading sources... [ 41%] ResultFormats
reading sources... [ 45%] RunningTestsRemotely
reading sources... [ 50%] WrapProcess
reading sources... [ 54%] WritingTests
reading sources... [ 58%] api/core/avocado.core
reading sources... [ 62%] api/core/avocado.core.plugins
reading sources... [ 66%] api/core/avocado.core.remote
reading sources... [ 70%] api/core/avocado.core.restclient
reading sources... [ 75%] api/core/avocado.core.restclient.cli
reading sources... [ 79%] api/core/avocado.core.restclient.cli.actions
reading sources... [ 83%] api/core/avocado.core.restclient.cli.args
reading sources... [ 87%] api/test/avocado
reading sources... [ 91%] api/utils/avocado.utils
reading sources... [ 95%] api/utils/avocado.utils.external
reading sources... [100%] index
looking for now-outdated files... none found
pickling environment... done
checking consistency... done
preparing documents... done
writing output... [  4%] Configuration
writing output... [  8%] ContributionGuide
writing output... [ 12%] DebuggingWithGDB
writing output... [ 16%] DevelopmentTips
writing output... [ 20%] GetStartedGuide
writing output... [ 25%] Introduction
writing output... [ 29%] MultiplexConfig
writing output... [ 33%] Plugins
writing output... [ 37%] ReferenceGuide
writing output... [ 41%] ResultFormats
writing output... [ 45%] RunningTestsRemotely
writing output... [ 50%] WrapProcess
writing output... [ 54%] WritingTests
writing output... [ 58%] api/core/avocado.core
writing output... [ 62%] api/core/avocado.core.plugins
writing output... [ 66%] api/core/avocado.core.remote
writing output... [ 70%] api/core/avocado.core.restclient
writing output... [ 75%] api/core/avocado.core.restclient.cli
writing output... [ 79%] api/core/avocado.core.restclient.cli.actions
writing output... [ 83%] api/core/avocado.core.restclient.cli.args
writing output... [ 87%] api/test/avocado
writing output... [ 91%] api/utils/avocado.utils
writing output... [ 95%] api/utils/avocado.utils.external
writing output... [100%] index
writing additional files... genindex py-modindex search
copying images... [ 50%] diagram.png
copying images... [100%] modules.png
copying static files... done
copying extra files... done
dumping search index... done
dumping object inventory... done
build succeeded, 3 warnings.
Build finished. The HTML pages are in build/html.
make: Leaving directory '/builddir/build/BUILD/avocado-0.28.0/docs'
/builddir/build/BUILD/avocado-0.28.0/docs/source/api/core/avocado.core.restclient.rst:17: WARNING: autodoc: failed to import module u'avocado.core.restclient.connection'; the following exception was raised:
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/sphinx/ext/autodoc.py", line 335, in import_object
    __import__(self.modname)
  File "/builddir/build/BUILD/avocado-0.28.0/avocado/core/restclient/connection.py", line 22, in <module>
    import requests
ImportError: No module named requests
/builddir/build/BUILD/avocado-0.28.0/docs/source/api/core/avocado.core.restclient.cli.rst:18: WARNING: autodoc: failed to import module u'avocado.core.restclient.cli.app'; the following exception was raised:
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/sphinx/ext/autodoc.py", line 335, in import_object
    __import__(self.modname)
  File "/builddir/build/BUILD/avocado-0.28.0/avocado/core/restclient/cli/app.py", line 26, in <module>
    from .. import connection
  File "/builddir/build/BUILD/avocado-0.28.0/avocado/core/restclient/connection.py", line 22, in <module>
    import requests
ImportError: No module named requests
/builddir/build/BUILD/avocado-0.28.0/docs/source/api/core/avocado.core.restclient.cli.actions.rst:18: WARNING: autodoc: failed to import module u'avocado.core.restclient.cli.actions.server'; the following exception was raised:
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/sphinx/ext/autodoc.py", line 335, in import_object
    __import__(self.modname)
  File "/builddir/build/BUILD/avocado-0.28.0/avocado/core/restclient/cli/actions/server.py", line 10, in <module>
    from ... import connection
  File "/builddir/build/BUILD/avocado-0.28.0/avocado/core/restclient/connection.py", line 22, in <module>
    import requests
ImportError: No module named requests
Please check the output and fix your docstrings/.rst docs
----------------------------------------------------------------------
Ran 262 tests in 55.893s
FAILED (errors=1)

fedora-21-x86_64:

======================================================================
ERROR: test_badly_behaved (selftests.functional.test_interrupt.InterruptTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/builddir/build/BUILD/avocado-0.28.0/selftests/functional/test_interrupt.py", line 99, in test_badly_behaved
    wait.wait_for(wait_until_no_badtest, timeout=2)
  File "/builddir/build/BUILD/avocado-0.28.0/avocado/utils/wait.py", line 28, in wait_for
    output = func()
  File "/builddir/build/BUILD/avocado-0.28.0/selftests/functional/test_interrupt.py", line 88, in wait_until_no_badtest
    for p in psutil.pids():
AttributeError: 'module' object has no attribute 'pids'
======================================================================
ERROR: test_well_behaved (selftests.functional.test_interrupt.InterruptTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/builddir/build/BUILD/avocado-0.28.0/selftests/functional/test_interrupt.py", line 140, in test_well_behaved
    wait.wait_for(wait_until_no_goodtest, timeout=2)
  File "/builddir/build/BUILD/avocado-0.28.0/avocado/utils/wait.py", line 28, in wait_for
    output = func()
  File "/builddir/build/BUILD/avocado-0.28.0/selftests/functional/test_interrupt.py", line 129, in wait_until_no_goodtest
    for p in psutil.pids():
AttributeError: 'module' object has no attribute 'pids'
----------------------------------------------------------------------
Ran 262 tests in 57.053s
FAILED (errors=2)

fedora-21-i386:

======================================================================
ERROR: test_badly_behaved (selftests.functional.test_interrupt.InterruptTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/builddir/build/BUILD/avocado-0.28.0/selftests/functional/test_interrupt.py", line 99, in test_badly_behaved
    wait.wait_for(wait_until_no_badtest, timeout=2)
  File "/builddir/build/BUILD/avocado-0.28.0/avocado/utils/wait.py", line 28, in wait_for
    output = func()
  File "/builddir/build/BUILD/avocado-0.28.0/selftests/functional/test_interrupt.py", line 88, in wait_until_no_badtest
    for p in psutil.pids():
AttributeError: 'module' object has no attribute 'pids'
======================================================================
ERROR: test_well_behaved (selftests.functional.test_interrupt.InterruptTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/builddir/build/BUILD/avocado-0.28.0/selftests/functional/test_interrupt.py", line 140, in test_well_behaved
    wait.wait_for(wait_until_no_goodtest, timeout=2)
  File "/builddir/build/BUILD/avocado-0.28.0/avocado/utils/wait.py", line 28, in wait_for
    output = func()
  File "/builddir/build/BUILD/avocado-0.28.0/selftests/functional/test_interrupt.py", line 129, in wait_until_no_goodtest
    for p in psutil.pids():
AttributeError: 'module' object has no attribute 'pids'
----------------------------------------------------------------------
Ran 262 tests in 61.469s
FAILED (errors=2)

fedora-22-ppc64le: 

Mock Version: 1.2.12
ENTER do(['bash', '--login', '-c', '/usr/bin/rpmbuild -bs --target ppc64le --nodeps /builddir/build/SPECS/avocado.spec'], chrootPath='/var/lib/mock/fedora-22-ppc64le-mockbuilder-6508/root'shell=FalseprintOutput=Falseenv={'LANG': 'en_US.UTF-8', 'TERM': 'vt100', 'SHELL': '/bin/bash', 'PROMPT_COMMAND': 'printf "\x1b]0;<mock-chroot>\x07<mock-chroot>"', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'HOME': '/builddir', 'HOSTNAME': 'mock'}gid=135user='mockbuild'timeout=0logger=<mockbuild.trace_decorator.getLog object at 0x3fff8aae6a50>uid=1000)
Executing command: ['bash', '--login', '-c', '/usr/bin/rpmbuild -bs --target ppc64le --nodeps /builddir/build/SPECS/avocado.spec'] with env {'LANG': 'en_US.UTF-8', 'TERM': 'vt100', 'SHELL': '/bin/bash', 'PROMPT_COMMAND': 'printf "\x1b]0;<mock-chroot>\x07<mock-chroot>"', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'HOME': '/builddir', 'HOSTNAME': 'mock'} and shell False
sh: /usr/bin/python: No such file or directory
sh: /usr/bin/python: No such file or directory
sh: /usr/bin/python: No such file or directory
sh: /usr/bin/python: No such file or directory
sh: /usr/bin/python: No such file or directory
Building target platforms: ppc64le
Building for target ppc64le
Wrote: /builddir/build/SRPMS/avocado-0.28.0-2.fc22.src.rpm
Child return code was: 0
LEAVE do --> 

fedora-21-ppc64le:

Mock Version: 1.2.12
ENTER do(['bash', '--login', '-c', '/usr/bin/rpmbuild -bs --target ppc64le --nodeps /builddir/build/SPECS/avocado.spec'], chrootPath='/var/lib/mock/fedora-21-ppc64le-mockbuilder-3473/root'shell=FalseprintOutput=Falseenv={'LANG': 'en_US.UTF-8', 'TERM': 'vt100', 'SHELL': '/bin/bash', 'PROMPT_COMMAND': 'printf "\x1b]0;<mock-chroot>\x07<mock-chroot>"', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'HOME': '/builddir', 'HOSTNAME': 'mock'}gid=135user='mockbuild'timeout=0logger=<mockbuild.trace_decorator.getLog object at 0x3fffa28e6a50>uid=1000)
Executing command: ['bash', '--login', '-c', '/usr/bin/rpmbuild -bs --target ppc64le --nodeps /builddir/build/SPECS/avocado.spec'] with env {'LANG': 'en_US.UTF-8', 'TERM': 'vt100', 'SHELL': '/bin/bash', 'PROMPT_COMMAND': 'printf "\x1b]0;<mock-chroot>\x07<mock-chroot>"', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'HOME': '/builddir', 'HOSTNAME': 'mock'} and shell False
sh: /usr/bin/python: No such file or directory
sh: /usr/bin/python: No such file or directory
sh: /usr/bin/python: No such file or directory
sh: /usr/bin/python: No such file or directory
sh: /usr/bin/python: No such file or directory
Building target platforms: ppc64le
Building for target ppc64le
Wrote: /builddir/build/SRPMS/avocado-0.28.0-2.fc21.src.rpm
Child return code was: 0
LEAVE do --> 

fedora-23-i386:

Mock Version: 1.2.12
ENTER do(['bash', '--login', '-c', '/usr/bin/rpmbuild -bs --target i686 --nodeps /builddir/build/SPECS/avocado.spec'], chrootPath='/var/lib/mock/fedora-23-i386-mockbuilder-7204/root'shell=FalseprintOutput=Falseenv={'LANG': 'en_US.UTF-8', 'TERM': 'vt100', 'SHELL': '/bin/bash', 'PROMPT_COMMAND': 'printf "\x1b]0;<mock-chroot>\x07<mock-chroot>"', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'HOME': '/builddir', 'HOSTNAME': 'mock'}gid=135user='mockbuild'timeout=0logger=<mockbuild.trace_decorator.getLog object at 0x7fb84b82a690>uid=1001)
Executing command: ['bash', '--login', '-c', '/usr/bin/rpmbuild -bs --target i686 --nodeps /builddir/build/SPECS/avocado.spec'] with env {'LANG': 'en_US.UTF-8', 'TERM': 'vt100', 'SHELL': '/bin/bash', 'PROMPT_COMMAND': 'printf "\x1b]0;<mock-chroot>\x07<mock-chroot>"', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'HOME': '/builddir', 'HOSTNAME': 'mock'} and shell False
sh: /usr/bin/python: No such file or directory
sh: /usr/bin/python: No such file or directory
sh: /usr/bin/python: No such file or directory
sh: /usr/bin/python: No such file or directory
sh: /usr/bin/python: No such file or directory
Building target platforms: i686
Building for target i686
Wrote: /builddir/build/SRPMS/avocado-0.28.0-2.fc23.src.rpm
Child return code was: 0
LEAVE do --> 

fedora-23-x86_64:

Mock Version: 1.2.12
ENTER do(['bash', '--login', '-c', '/usr/bin/rpmbuild -bs --target x86_64 --nodeps /builddir/build/SPECS/avocado.spec'], chrootPath='/var/lib/mock/fedora-23-x86_64-mockbuilder-6780/root'shell=FalseprintOutput=Falseenv={'LANG': 'en_US.UTF-8', 'TERM': 'vt100', 'SHELL': '/bin/bash', 'PROMPT_COMMAND': 'printf "\x1b]0;<mock-chroot>\x07<mock-chroot>"', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'HOME': '/builddir', 'HOSTNAME': 'mock'}gid=135user='mockbuild'timeout=0logger=<mockbuild.trace_decorator.getLog object at 0x7fedadd1b6d0>uid=1001)
Executing command: ['bash', '--login', '-c', '/usr/bin/rpmbuild -bs --target x86_64 --nodeps /builddir/build/SPECS/avocado.spec'] with env {'LANG': 'en_US.UTF-8', 'TERM': 'vt100', 'SHELL': '/bin/bash', 'PROMPT_COMMAND': 'printf "\x1b]0;<mock-chroot>\x07<mock-chroot>"', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'HOME': '/builddir', 'HOSTNAME': 'mock'} and shell False
sh: /usr/bin/python: No such file or directory
sh: /usr/bin/python: No such file or directory
sh: /usr/bin/python: No such file or directory
sh: /usr/bin/python: No such file or directory
sh: /usr/bin/python: No such file or directory
Building target platforms: x86_64
Building for target x86_64
Wrote: /builddir/build/SRPMS/avocado-0.28.0-2.fc23.src.rpm
Child return code was: 0
LEAVE do --> 

fedora-23-ppc64le:

Mock Version: 1.2.12
ENTER do(['bash', '--login', '-c', '/usr/bin/rpmbuild -bs --target ppc64le --nodeps /builddir/build/SPECS/avocado.spec'], chrootPath='/var/lib/mock/fedora-23-ppc64le-mockbuilder-4866/root'shell=FalseprintOutput=Falseenv={'LANG': 'en_US.UTF-8', 'TERM': 'vt100', 'SHELL': '/bin/bash', 'PROMPT_COMMAND': 'printf "\x1b]0;<mock-chroot>\x07<mock-chroot>"', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'HOME': '/builddir', 'HOSTNAME': 'mock'}gid=135user='mockbuild'timeout=0logger=<mockbuild.trace_decorator.getLog object at 0x3fff90c36a50>uid=1000)
Executing command: ['bash', '--login', '-c', '/usr/bin/rpmbuild -bs --target ppc64le --nodeps /builddir/build/SPECS/avocado.spec'] with env {'LANG': 'en_US.UTF-8', 'TERM': 'vt100', 'SHELL': '/bin/bash', 'PROMPT_COMMAND': 'printf "\x1b]0;<mock-chroot>\x07<mock-chroot>"', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'HOME': '/builddir', 'HOSTNAME': 'mock'} and shell False
sh: /usr/bin/python: No such file or directory
sh: /usr/bin/python: No such file or directory
sh: /usr/bin/python: No such file or directory
sh: /usr/bin/python: No such file or directory
sh: /usr/bin/python: No such file or directory
Building target platforms: ppc64le
Building for target ppc64le
Wrote: /builddir/build/SRPMS/avocado-0.28.0-2.fc23.src.rpm
Child return code was: 0
LEAVE do --> 

Regression that breaks travis on master

It looks like PR #471, which is basically PR #469 without one patch, broke our selftests, and thus Travis:

======================================================================
FAIL: test_run_double_mplex (multiplex_tests.MultiplexTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/cleber/src/avocado/selftests/all/functional/avocado/multiplex_tests.py", line 104, in test_run_double_mplex
    self.run_and_check(cmd_line, expected_rc, 14 * 4)
  File "/home/cleber/src/avocado/selftests/all/functional/avocado/multiplex_tests.py", line 42, in run_and_check
    "".join(job_log_lines))
AssertionError: The multiplexed job log output has less lines than expected
19:51:28 test       L0132 INFO | START passtest.py.1
19:51:28 test       L0133 DEBUG| 
19:51:28 test       L0134 DEBUG| Test instance parameters:
19:51:28 test       L0141 DEBUG|     id = passtest
19:51:28 test       L0141 DEBUG|     sleep_length = 0.5
19:51:28 test       L0141 DEBUG|     tag = 1
19:51:28 test       L0142 DEBUG| 
19:51:28 test       L0145 DEBUG| Default parameters:
19:51:28 test       L0149 DEBUG| 
19:51:28 test       L0150 DEBUG| Test instance params override defaults whenever available
19:51:28 test       L0151 DEBUG| 
19:51:28 test       L0514 INFO | PASS passtest.py.1
19:51:28 test       L0495 INFO | 
19:51:28 test       L0132 INFO | START passtest.py.2
19:51:28 test       L0133 DEBUG| 
19:51:28 test       L0134 DEBUG| Test instance parameters:
19:51:28 test       L0141 DEBUG|     id = passtest
19:51:28 test       L0141 DEBUG|     sleep_length = 1
19:51:28 test       L0141 DEBUG|     tag = 2
19:51:28 test       L0142 DEBUG| 
19:51:28 test       L0145 DEBUG| Default parameters:
19:51:28 test       L0149 DEBUG| 
19:51:28 test       L0150 DEBUG| Test instance params override defaults whenever available
19:51:28 test       L0151 DEBUG| 
19:51:28 test       L0514 INFO | PASS passtest.py.2
19:51:28 test       L0495 INFO | 
19:51:28 test       L0132 INFO | START passtest.py.3
19:51:28 test       L0133 DEBUG| 
19:51:28 test       L0134 DEBUG| Test instance parameters:
19:51:28 test       L0141 DEBUG|     id = passtest
19:51:28 test       L0141 DEBUG|     sleep_length = 5
19:51:28 test       L0141 DEBUG|     tag = 3
19:51:28 test       L0142 DEBUG| 
19:51:28 test       L0145 DEBUG| Default parameters:
19:51:28 test       L0149 DEBUG| 
19:51:28 test       L0150 DEBUG| Test instance params override defaults whenever available
19:51:28 test       L0151 DEBUG| 
19:51:28 test       L0514 INFO | PASS passtest.py.3
19:51:28 test       L0495 INFO | 
19:51:28 test       L0132 INFO | START passtest.py.4
19:51:28 test       L0133 DEBUG| 
19:51:28 test       L0134 DEBUG| Test instance parameters:
19:51:28 test       L0141 DEBUG|     id = passtest
19:51:28 test       L0141 DEBUG|     sleep_length = 10
19:51:28 test       L0141 DEBUG|     tag = 4
19:51:28 test       L0142 DEBUG| 
19:51:28 test       L0145 DEBUG| Default parameters:
19:51:28 test       L0149 DEBUG| 
19:51:28 test       L0150 DEBUG| Test instance params override defaults whenever available
19:51:28 test       L0151 DEBUG| 
19:51:28 test       L0514 INFO | PASS passtest.py.4
19:51:28 test       L0495 INFO | 


======================================================================
FAIL: test_run_mplex_doublepass (multiplex_tests.MultiplexTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/cleber/src/avocado/selftests/all/functional/avocado/multiplex_tests.py", line 88, in test_run_mplex_doublepass
    self.run_and_check(cmd_line, expected_rc=0, expected_lines=2 * 4 * 14)
  File "/home/cleber/src/avocado/selftests/all/functional/avocado/multiplex_tests.py", line 42, in run_and_check
    "".join(job_log_lines))
AssertionError: The multiplexed job log output has less lines than expected
19:51:28 test       L0132 INFO | START passtest.py.1
19:51:28 test       L0133 DEBUG| 
19:51:28 test       L0134 DEBUG| Test instance parameters:
19:51:28 test       L0141 DEBUG|     id = passtest
19:51:28 test       L0141 DEBUG|     sleep_length = 0.5
19:51:28 test       L0141 DEBUG|     tag = 1
19:51:28 test       L0142 DEBUG| 
19:51:28 test       L0145 DEBUG| Default parameters:
19:51:28 test       L0149 DEBUG| 
19:51:28 test       L0150 DEBUG| Test instance params override defaults whenever available
19:51:28 test       L0151 DEBUG| 
19:51:28 test       L0514 INFO | PASS passtest.py.1
19:51:28 test       L0495 INFO | 
19:51:28 test       L0132 INFO | START passtest.py.2
19:51:28 test       L0133 DEBUG| 
19:51:28 test       L0134 DEBUG| Test instance parameters:
19:51:28 test       L0141 DEBUG|     id = passtest
19:51:28 test       L0141 DEBUG|     sleep_length = 1
19:51:28 test       L0141 DEBUG|     tag = 2
19:51:28 test       L0142 DEBUG| 
19:51:28 test       L0145 DEBUG| Default parameters:
19:51:28 test       L0149 DEBUG| 
19:51:28 test       L0150 DEBUG| Test instance params override defaults whenever available
19:51:28 test       L0151 DEBUG| 
19:51:28 test       L0514 INFO | PASS passtest.py.2
19:51:28 test       L0495 INFO | 
19:51:28 test       L0132 INFO | START passtest.py.3
19:51:28 test       L0133 DEBUG| 
19:51:28 test       L0134 DEBUG| Test instance parameters:
19:51:28 test       L0141 DEBUG|     id = passtest
19:51:28 test       L0141 DEBUG|     sleep_length = 5
19:51:28 test       L0141 DEBUG|     tag = 3
19:51:28 test       L0142 DEBUG| 
19:51:28 test       L0145 DEBUG| Default parameters:
19:51:28 test       L0149 DEBUG| 
19:51:28 test       L0150 DEBUG| Test instance params override defaults whenever available
19:51:28 test       L0151 DEBUG| 
19:51:28 test       L0514 INFO | PASS passtest.py.3
19:51:28 test       L0495 INFO | 
19:51:28 test       L0132 INFO | START passtest.py.4
19:51:28 test       L0133 DEBUG| 
19:51:28 test       L0134 DEBUG| Test instance parameters:
19:51:28 test       L0141 DEBUG|     id = passtest
19:51:28 test       L0141 DEBUG|     sleep_length = 10
19:51:28 test       L0141 DEBUG|     tag = 4
19:51:28 test       L0142 DEBUG| 
19:51:28 test       L0145 DEBUG| Default parameters:
19:51:28 test       L0149 DEBUG| 
19:51:28 test       L0150 DEBUG| Test instance params override defaults whenever available
19:51:28 test       L0151 DEBUG| 
19:51:28 test       L0514 INFO | PASS passtest.py.4
19:51:28 test       L0495 INFO | 
19:51:28 test       L0132 INFO | START passtest.py.1
19:51:28 test       L0133 DEBUG| 
19:51:28 test       L0134 DEBUG| Test instance parameters:
19:51:28 test       L0141 DEBUG|     id = passtest
19:51:28 test       L0141 DEBUG|     sleep_length = 0.5
19:51:28 test       L0141 DEBUG|     tag = 1
19:51:28 test       L0142 DEBUG| 
19:51:28 test       L0145 DEBUG| Default parameters:
19:51:28 test       L0149 DEBUG| 
19:51:28 test       L0150 DEBUG| Test instance params override defaults whenever available
19:51:28 test       L0151 DEBUG| 
19:51:28 test       L0514 INFO | PASS passtest.py.1
19:51:28 test       L0495 INFO | 
19:51:28 test       L0132 INFO | START passtest.py.2
19:51:28 test       L0133 DEBUG| 
19:51:28 test       L0134 DEBUG| Test instance parameters:
19:51:28 test       L0141 DEBUG|     id = passtest
19:51:28 test       L0141 DEBUG|     sleep_length = 1
19:51:28 test       L0141 DEBUG|     tag = 2
19:51:28 test       L0142 DEBUG| 
19:51:28 test       L0145 DEBUG| Default parameters:
19:51:28 test       L0149 DEBUG| 
19:51:28 test       L0150 DEBUG| Test instance params override defaults whenever available
19:51:28 test       L0151 DEBUG| 
19:51:28 test       L0514 INFO | PASS passtest.py.2
19:51:28 test       L0495 INFO | 
19:51:29 test       L0132 INFO | START passtest.py.3
19:51:29 test       L0133 DEBUG| 
19:51:29 test       L0134 DEBUG| Test instance parameters:
19:51:29 test       L0141 DEBUG|     id = passtest
19:51:29 test       L0141 DEBUG|     sleep_length = 5
19:51:29 test       L0141 DEBUG|     tag = 3
19:51:29 test       L0142 DEBUG| 
19:51:29 test       L0145 DEBUG| Default parameters:
19:51:29 test       L0149 DEBUG| 
19:51:29 test       L0150 DEBUG| Test instance params override defaults whenever available
19:51:29 test       L0151 DEBUG| 
19:51:29 test       L0514 INFO | PASS passtest.py.3
19:51:29 test       L0495 INFO | 
19:51:29 test       L0132 INFO | START passtest.py.4
19:51:29 test       L0133 DEBUG| 
19:51:29 test       L0134 DEBUG| Test instance parameters:
19:51:29 test       L0141 DEBUG|     id = passtest
19:51:29 test       L0141 DEBUG|     sleep_length = 10
19:51:29 test       L0141 DEBUG|     tag = 4
19:51:29 test       L0142 DEBUG| 
19:51:29 test       L0145 DEBUG| Default parameters:
19:51:29 test       L0149 DEBUG| 
19:51:29 test       L0150 DEBUG| Test instance params override defaults whenever available
19:51:29 test       L0151 DEBUG| 
19:51:29 test       L0514 INFO | PASS passtest.py.4
19:51:29 test       L0495 INFO | 


======================================================================
FAIL: test_run_mplex_passtest (multiplex_tests.MultiplexTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/cleber/src/avocado/selftests/all/functional/avocado/multiplex_tests.py", line 83, in test_run_mplex_passtest
    self.run_and_check(cmd_line, expected_rc, 14 * 4)
  File "/home/cleber/src/avocado/selftests/all/functional/avocado/multiplex_tests.py", line 42, in run_and_check
    "".join(job_log_lines))
AssertionError: The multiplexed job log output has less lines than expected
19:51:30 test       L0132 INFO | START passtest.py.1
19:51:30 test       L0133 DEBUG| 
19:51:30 test       L0134 DEBUG| Test instance parameters:
19:51:30 test       L0141 DEBUG|     id = passtest
19:51:30 test       L0141 DEBUG|     sleep_length = 0.5
19:51:30 test       L0141 DEBUG|     tag = 1
19:51:30 test       L0142 DEBUG| 
19:51:30 test       L0145 DEBUG| Default parameters:
19:51:30 test       L0149 DEBUG| 
19:51:30 test       L0150 DEBUG| Test instance params override defaults whenever available
19:51:30 test       L0151 DEBUG| 
19:51:30 test       L0514 INFO | PASS passtest.py.1
19:51:30 test       L0495 INFO | 
19:51:30 test       L0132 INFO | START passtest.py.2
19:51:30 test       L0133 DEBUG| 
19:51:30 test       L0134 DEBUG| Test instance parameters:
19:51:30 test       L0141 DEBUG|     id = passtest
19:51:30 test       L0141 DEBUG|     sleep_length = 1
19:51:30 test       L0141 DEBUG|     tag = 2
19:51:30 test       L0142 DEBUG| 
19:51:30 test       L0145 DEBUG| Default parameters:
19:51:30 test       L0149 DEBUG| 
19:51:30 test       L0150 DEBUG| Test instance params override defaults whenever available
19:51:30 test       L0151 DEBUG| 
19:51:30 test       L0514 INFO | PASS passtest.py.2
19:51:30 test       L0495 INFO | 
19:51:30 test       L0132 INFO | START passtest.py.3
19:51:30 test       L0133 DEBUG| 
19:51:30 test       L0134 DEBUG| Test instance parameters:
19:51:30 test       L0141 DEBUG|     id = passtest
19:51:30 test       L0141 DEBUG|     sleep_length = 5
19:51:30 test       L0141 DEBUG|     tag = 3
19:51:30 test       L0142 DEBUG| 
19:51:30 test       L0145 DEBUG| Default parameters:
19:51:30 test       L0149 DEBUG| 
19:51:30 test       L0150 DEBUG| Test instance params override defaults whenever available
19:51:30 test       L0151 DEBUG| 
19:51:30 test       L0514 INFO | PASS passtest.py.3
19:51:30 test       L0495 INFO | 
19:51:30 test       L0132 INFO | START passtest.py.4
19:51:30 test       L0133 DEBUG| 
19:51:30 test       L0134 DEBUG| Test instance parameters:
19:51:30 test       L0141 DEBUG|     id = passtest
19:51:30 test       L0141 DEBUG|     sleep_length = 10
19:51:30 test       L0141 DEBUG|     tag = 4
19:51:30 test       L0142 DEBUG| 
19:51:30 test       L0145 DEBUG| Default parameters:
19:51:30 test       L0149 DEBUG| 
19:51:30 test       L0150 DEBUG| Test instance params override defaults whenever available
19:51:30 test       L0151 DEBUG| 
19:51:30 test       L0514 INFO | PASS passtest.py.4
19:51:30 test       L0495 INFO | 

Since that build failed while still on the git fetch step, the real regression cannot be seen on Travis:

https://travis-ci.org/avocado-framework/avocado/builds/53862837

But make check fails on my machine, and it will probably fail on further PRs.

@lmr and @ruda, please take a look. The fix should be really trivial (line count change).
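
Assuming the job log format change is indeed the cause, a minimal sketch of the fix (the constant names are hypothetical; the real change is just updating the hard-coded factor passed to run_and_check):

# selftests/all/functional/avocado/multiplex_tests.py (fragment)
# The expectation is lines-per-test * number-of-variants, so a job
# log format change only requires updating the first factor.
LINES_PER_TEST = 14  # update to match the new per-test line count
VARIANTS = 4
self.run_and_check(cmd_line, expected_rc, LINES_PER_TEST * VARIANTS)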

Still troubles with the remote/vm plugin output (not JSON) when running more tests

Hi,
I have avocado version 20.0.1 and there is still some problem with remote/vm running of tests. I know it has been partially fixed, but it seems some issue still causes a traceback.
It is probably caused by running more tests in one avocado run command when they have varied output, so the remote output ends up as a mix of JSON and stray text in between.
Regards
Honza

DOMAIN : checkmachine7-fedora-21-x86_64
LOGIN : [email protected]:22
JOB ID : ebb596e3324213d96256264fa45b3ba57dc00612
JOB LOG : /home/jscotka/avocado/job-results/job-2015-02-27T03.21-ebb596e/job.log
TESTS : 3
(1/3) /root/avocado/tests/tmp/avocado-test/sources.sh: PASS (3.19 s)
(2/3) /root/avocado/tests/tmp/avocado-test/inittest.sh: PASS (3.01 s)
(3/3) /root/avocado/tests/tmp/avocado-test/compiletest.sh: PASS (133.17 s)
PASS : 3
ERROR : 0
FAIL : 0
SKIP : 0
WARN : 0
TIME : 139.37 s
DOMAIN : checkmachine7-fedora-21-x86_64
LOGIN : [email protected]:22
Avocado crashed: ValueError: Could not parse JSON from avocado remote output:

Authentication failed
Authentication failed
Permission denied
couldn't read serial dmi info: Process exited with code 1
couldn't read serial dmi info: Process exited with code 1
Avocado crashed: TypeError: ('__init__() takes exactly 2 arguments (1 given)', <class 'testlib.Error'>, ())
Traceback (most recent call last):

  File "/usr/lib/python2.7/site-packages/avocado/job.py", line 301, in _run
    failures = self.test_runner.run_suite(test_suite)

  File "/usr/lib/python2.7/site-packages/avocado/runner.py", line 171, in run_suite
    test_state = q.get()

  File "/usr/lib64/python2.7/multiprocessing/queues.py", line 376, in get
    return recv()

TypeError: ('__init__() takes exactly 2 arguments (1 given)', <class 'testlib.Error'>, ())

Please include the traceback info and command line used on your bug report
Report bugs visiting https://github.com/avocado-framework/avocado/issues/new
Traceback (most recent call last):

  File "/usr/lib/python2.7/site-packages/avocado/job.py", line 301, in _run
    failures = self.test_runner.run_suite(test_suite)

  File "/usr/lib/python2.7/site-packages/avocado/plugins/remote.py", line 74, in run_suite
    results = self.run_test(self.result.urls)

  File "/usr/lib/python2.7/site-packages/avocado/plugins/remote.py", line 61, in run_test
    "\n%s" % result.stdout)

ValueError: Could not parse JSON from avocado remote output:

Authentication failed
Authentication failed
Permission denied
couldn't read serial dmi info: Process exited with code 1
couldn't read serial dmi info: Process exited with code 1
Avocado crashed: TypeError: ('__init__() takes exactly 2 arguments (1 given)', <class 'testlib.Error'>, ())
Traceback (most recent call last):

  File "/usr/lib/python2.7/site-packages/avocado/job.py", line 301, in _run
    failures = self.test_runner.run_suite(test_suite)

  File "/usr/lib/python2.7/site-packages/avocado/runner.py", line 171, in run_suite
    test_state = q.get()

  File "/usr/lib64/python2.7/multiprocessing/queues.py", line 376, in get
    return recv()

TypeError: ('__init__() takes exactly 2 arguments (1 given)', <class 'testlib.Error'>, ())

Please include the traceback info and command line used on your bug report
Report bugs visiting https://github.com/avocado-framework/avocado/issues/new

Please include the traceback info and command line used on your bug report
Report bugs visiting https://github.com/avocado-framework/avocado/issues/new
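
A minimal sketch of a more defensive parser for the remote plugin, assuming the JSON results document is the only line of the mixed output starting with '{' (the helper name is hypothetical; a proper fix would keep test output and the JSON results on separate channels):

import json

def parse_remote_json(output):
    # Scan the mixed remote stdout and return the first line that
    # parses as a JSON document, ignoring stray test output.
    for line in output.splitlines():
        line = line.strip()
        if line.startswith('{'):
            try:
                return json.loads(line)
            except ValueError:
                continue
    raise ValueError('Could not parse JSON from avocado remote output:\n%s'
                     % output)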

tmpfs (/tmp) FS problems with QEMU - temporary dir location to be changed

Hi @lmr, @ldoktor,

Recently I faced a case (running the test virsh.blockpull.normal_test.network_disk.iscsi.notimeout.nobase.async; other similar tests may be impacted) where it failed with "TestFail: error: internal error: unable to execute QEMU command 'transaction': Could not open '/tmp/avocado_YM43eu/blk_src_file.snap1': Invalid argument". After long debugging (as the QEMU message is not very clear), I found that the problem shows up when an iscsi disk is added during the test with the argument cache='none': this causes problems because the iscsi disk file is located in /tmp/avocado_XXXX (following the new tmp dir approach from data_dir.py, instead of using the tmp dir in the home dir). /tmp is of type tmpfs, which doesn't support O_DIRECT, according to the log below:
tail avocado-vt-vm1.log
...
...
2015-11-27T09:43:59.957100Z file system may not support O_DIRECT
qemu: terminating on signal 15 from pid 26243
2015-11-27 09:44:00.567+0000: shutting down

Thus QEMU is not able to properly manipulate files located in /tmp (tmpfs). Maybe it would be better to move the tmp dir back to a home dir?
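
A minimal sketch of a probe that data_dir.py could use before settling on a temporary directory (the function name is an assumption):

import os
import tempfile

def supports_o_direct(dirname):
    # tmpfs rejects O_DIRECT opens, which breaks QEMU's cache='none';
    # probe with a throwaway file before choosing a tmp dir.
    fd, path = tempfile.mkstemp(dir=dirname)
    os.close(fd)
    try:
        os.close(os.open(path, os.O_RDONLY | os.O_DIRECT))
        return True
    except OSError:
        return False
    finally:
        os.unlink(path)

Falling back to a directory under the user's home when the probe fails would restore the old behaviour.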

Avocado crashed: NetworkError: Name lookup failed for fedora20.vm

Avocado should give a gentle error message and not crash when the provided hostname cannot be resolved.

./scripts/avocado run --remote-hostname fedora20.vm --remote-username fedora --remote-password fedora passtest
LOGIN      : [email protected]:22
Avocado crashed: NetworkError: Name lookup failed for fedora20.vm
Traceback (most recent call last):

  File "/home/rmoura/Work/avocado-remote/avocado/job.py", line 310, in _run
    failures = self.test_runner.run_suite(test_suite)

  File "/home/rmoura/Work/avocado-remote/avocado/remote/runner.py", line 80, in run_suite
    self.result.setup()

  File "/home/rmoura/Work/avocado-remote/avocado/remote/result.py", line 87, in setup
    self._copy_tests()

  File "/home/rmoura/Work/avocado-remote/avocado/remote/result.py", line 54, in _copy_tests
    self.remote.makedir(self.remote_test_dir)

  File "/home/rmoura/Work/avocado-remote/avocado/utils/remote.py", line 133, in makedir
    self.run('mkdir -p %s' % remote_path)

  File "/home/rmoura/Work/avocado-remote/avocado/utils/remote.py", line 100, in run
    timeout=timeout)

  File "/usr/lib/python2.7/site-packages/fabric/network.py", line 639, in host_prompting_wrapper
    return func(*args, **kwargs)

  File "/usr/lib/python2.7/site-packages/fabric/operations.py", line 1042, in run
    shell_escape=shell_escape)

  File "/usr/lib/python2.7/site-packages/fabric/operations.py", line 909, in _run_command
    channel=default_channel(), command=wrapped_command, pty=pty,

  File "/usr/lib/python2.7/site-packages/fabric/state.py", line 390, in default_channel
    chan = _open_session()

  File "/usr/lib/python2.7/site-packages/fabric/state.py", line 382, in _open_session
    return connections[env.host_string].get_transport().open_session()

  File "/usr/lib/python2.7/site-packages/fabric/network.py", line 151, in __getitem__
    self.connect(key)

  File "/usr/lib/python2.7/site-packages/fabric/network.py", line 143, in connect
    self[key] = connect(user, host, port, cache=self)

  File "/usr/lib/python2.7/site-packages/fabric/network.py", line 533, in connect
    raise NetworkError('Name lookup failed for %s' % host, e)

NetworkError: Name lookup failed for fedora20.vm

Please include the traceback info and command line used on your bug report
Report bugs visiting https://github.com/avocado-framework/avocado/issues/new
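
A minimal sketch of the kind of early check the remote runner could perform before handing the host to fabric (the function name is hypothetical):

import socket

def validate_hostname(host):
    # Resolve the name up front, so the user gets one friendly line
    # instead of a full fabric traceback.
    try:
        socket.gethostbyname(host)
    except socket.error as details:
        raise SystemExit("Unable to resolve '%s': %s" % (host, details))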

Avocado crashes while trying to render the HTML report

One of our users in China has reported a problem while rendering the HTML report:

# avocado run sleeptest
JOB ID     : 13a596d8c3d4ea2eb5aabace094f7857386abffa
JOB LOG    : /root/avocado/job-results/job-2015-11-24T16.31-13a596d/job.log
TESTS      : 1
 (1/1) sleeptest.py:SleepTest.test: PASS (1.01 s)

Avocado crashed: UnicodeDecodeError: 'utf8' codec can't decode byte 0x89 in position 65716: invalid start byte
Traceback (most recent call last):

  File "/usr/lib/python2.7/site-packages/avocado-0.30.0-py2.7.egg/avocado/core/job.py", line 497, in _run
    timeout=self.timeout)

  File "/usr/lib/python2.7/site-packages/avocado-0.30.0-py2.7.egg/avocado/core/runner.py", line 364, in run_suite
    self.result.end_tests()

  File "/usr/lib/python2.7/site-packages/avocado-0.30.0-py2.7.egg/avocado/core/result.py", line 66, in end_tests
    output_plugin.end_tests()

  File "/usr/lib/python2.7/site-packages/avocado-0.30.0-py2.7.egg/avocado/core/plugins/htmlresult.py", line 230, in end_tests
    self._render_report()

  File "/usr/lib/python2.7/site-packages/avocado-0.30.0-py2.7.egg/avocado/core/plugins/htmlresult.py", line 240, in _render_report
    report_contents = renderer.render(open(template, 'r').read(), context)

  File "/usr/lib/python2.7/site-packages/pystache/renderer.py", line 458, in render
    return self._render_string(template, *context, **kwargs)

  File "/usr/lib/python2.7/site-packages/pystache/renderer.py", line 406, in _render_string
    return self._render_final(render_func, *context, **kwargs)

  File "/usr/lib/python2.7/site-packages/pystache/renderer.py", line 423, in _render_final
    return render_func(engine, stack)

  File "/usr/lib/python2.7/site-packages/pystache/renderer.py", line 404, in <lambda>
    render_func = lambda engine, stack: engine.render(template, stack)

  File "/usr/lib/python2.7/site-packages/pystache/renderengine.py", line 181, in render
    return parsed_template.render(self, context_stack)

  File "/usr/lib/python2.7/site-packages/pystache/parsed.py", line 47, in render
    parts = map(get_unicode, self._parse_tree)

  File "/usr/lib/python2.7/site-packages/pystache/parsed.py", line 46, in get_unicode
    return node.render(engine, context)

  File "/usr/lib/python2.7/site-packages/pystache/parser.py", line 218, in render
    parts.append(self.parsed.render(engine, context))

  File "/usr/lib/python2.7/site-packages/pystache/parsed.py", line 47, in render
    parts = map(get_unicode, self._parse_tree)

  File "/usr/lib/python2.7/site-packages/pystache/parsed.py", line 46, in get_unicode
    return node.render(engine, context)

  File "/usr/lib/python2.7/site-packages/pystache/parser.py", line 122, in render
    return engine.escape(s)

  File "/usr/lib/python2.7/site-packages/pystache/renderer.py", line 202, in _escape_to_unicode
    return unicode(self.escape(self._to_unicode_soft(s)))

  File "/usr/lib/python2.7/site-packages/pystache/renderer.py", line 186, in _to_unicode_soft
    return self.unicode(s)

  File "/usr/lib/python2.7/site-packages/pystache/renderer.py", line 229, in unicode
    return unicode(b, encoding, self.decode_errors)

UnicodeDecodeError: 'utf8' codec can't decode byte 0x89 in position 65716: invalid start byte
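
The failure happens while pystache decodes a non-UTF-8 byte (0x89), most likely from job data fed into the template context. A minimal sketch of a tolerant decode the HTML plugin could apply to such data before rendering (the helper name is hypothetical); note that pystache's Renderer also exposes the decode_errors knob visible in the traceback above, which could be set to 'replace':

def to_text(data, encoding='utf-8'):
    # Decode untrusted job data defensively: replace undecodable
    # bytes instead of letting the report rendering crash.
    if isinstance(data, bytes):
        return data.decode(encoding, 'replace')
    return data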

Support multi avocado-vt instances in a host

Sometimes many teams share a machine to run tests (especially on new or expensive platforms like ppc). Many of our test cases are automated in avocado/autotest, which helps save testing resources, so our teams like to run this kind of test case in avocado/autotest. But avocado currently reads its configuration from /etc/avocado and copies the test configuration to the /usr/share/avocado directory, so if multiple instances run on the same host, both the avocado configuration and the test configuration get overwritten by the latest instance.
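
Until per-instance configuration is supported, one workaround sketch is to give each instance its own HOME, so that user-level configuration and job-results directories do not collide (this assumes nothing in /etc/avocado forces shared paths):

import os
import subprocess

def run_isolated(test, workdir):
    # Each team points avocado at a private HOME, so per-user config
    # (~/.config/avocado) and results (~/avocado) stay separate.
    env = dict(os.environ, HOME=workdir)
    return subprocess.call(['avocado', 'run', test], env=env)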

python-psutil EL6 compatibility

One specific functional test, selftests/all/functional/avocado/interrupt_tests.py, fails on EL6 systems when using the python-psutil package, at version 0.6.1, available from EPEL.

Ideally, it would be nice to have a development environment all based on packaged versions. Currently, a newer version must be installed via pip.

Let's examine if python-psutil 0.6.1 is enough for what avocado needs.
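
For reference, psutil renamed its process-listing API in 2.0 (get_pid_list() became pids()), which is exactly what the AttributeError shown earlier in this report points at. A small compatibility shim would keep the selftest working on both versions; a sketch:

import psutil

def all_pids():
    # psutil >= 2.0 exposes pids(); 0.6.1 (as packaged in EPEL for
    # EL6) only has get_pid_list().  Support both.
    if hasattr(psutil, 'pids'):
        return psutil.pids()
    return psutil.get_pid_list()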

Consistent handling of python tests and shell scripts (virtual, remote plugin)

Hi,
this issue is similar to:
#351

There is strange behaviour in the case of the virtual or remote plugin.
I have tests in the /usr/share/avocado/tests directory (the default config for the root user):
inittest.sh - a shell script
checklogin.py - an avocado framework based Python test

I expected to be able to use:

avocado run --remote-hostname .... inittest.sh checklogin.py

The actual situation is:
usage of: # avocado run --remote-user root --remote-hostname guest .... checklogin
works more or less well.
What is strange is that the default location of the test is /usr/share/avocado/test, while on the remote machine the test is copied to the /root/avocado/tests directory, which is inconsistent and makes it hard to debug.

Usage of:

avocado run --remote-user root --remote-hostname guest .... /usr/share/avocado/test/inittest.sh

does not work, because the test is copied to /root/avocado/tests and that is why it fails, even though this command does copy the test to the remote machine, as expected.
I have to use

avocado run --remote-user root --remote-hostname guest .... inittest.sh

but only if the test was previously copied to the /root/avocado/tests directory (see the path-translation sketch below).

 Thanks&Regards
 Honza
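
A sketch of the path translation that would make both invocations behave consistently: map whatever local reference is given to the directory the remote plugin copies tests into (the remote directory below matches this report; the helper name is hypothetical):

import os

def remote_reference(local_reference, remote_test_dir='/root/avocado/tests'):
    # The remote plugin copies tests into remote_test_dir, so an
    # absolute local path must be rewritten to its basename there.
    return os.path.join(remote_test_dir, os.path.basename(local_reference))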

Proper simple tests examples

Even though simple tests are, well, simple, let's have a couple of them in the examples directory.

A big reason for that is that we currently use wrappers as the simple test examples in the Getting Started guide (avocado list examples/wrappers), which can be confusing to new users.
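
For reference, a simple test can be any executable that exits zero on success; a minimal Python example along these lines (the file name is hypothetical) would fit the examples directory:

#!/usr/bin/env python
# A minimal "simple test": avocado only looks at the exit status,
# 0 meaning PASS and anything else FAIL.
import sys
import time

time.sleep(1)
sys.exit(0)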

plugins.html: popover info doesn't work on records on pages > 1

Hi guys, I noticed that the Info popup works only for the first 10 records. The ones on the following pages don't work, and changing the number of records per page doesn't help either: only the first 10 of them work.

Reproducer:

. ./scripts/avocado run passtest passtest passtest passtest passtest passtest passtest passtest passtest passtest this_should_produce_longer_message.py

. click on second page

. click on "Test this_should_produce_longer_message...."

It should display the full message but does nothing at all.

sysinfo subcommand should use local etc/avocado/* files

When running avocado from the git sources, the sysinfo subcommand should use the local (from the repository) etc/avocado/* files for its configuration when the global /etc/avocado is not available. Otherwise the result will not be interesting.

Here's the transcription of my session:

$ ./scripts/avocado sysinfo
Commands configured by file: /etc/avocado/sysinfo/commands
Files configured by file: /etc/avocado/sysinfo/files
Profilers configured by file: /etc/avocado/sysinfo/profilers
Profilers declared: []
Profiler disabled: no profiler commands configured
System log file not found (looked for ['/var/log/messages', '/var/log/syslog', '/var/log/system.log'])
Logged system information to /home/rmoura/Work/avocado/sysinfo-2015-04-17-10.36.37

$ ls -l /etc/avocado
ls: cannot access /etc/avocado: No such file or directory

$ find sysinfo-2015-04-17-10.36.37 
sysinfo-2015-04-17-10.36.37
sysinfo-2015-04-17-10.36.37/post
sysinfo-2015-04-17-10.36.37/post/added_packages
sysinfo-2015-04-17-10.36.37/profile
sysinfo-2015-04-17-10.36.37/removed_packages
sysinfo-2015-04-17-10.36.37/pre
sysinfo-2015-04-17-10.36.37/pre/installed_packages
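
A sketch of the fallback the sysinfo command could implement, preferring the system-wide directory and using the in-tree copy when running from a checkout (the repository-relative path is an assumption):

import os

def sysinfo_config_dir(repo_root):
    # Prefer the system-wide configuration; when it is absent (e.g.
    # when running ./scripts/avocado from git), use the in-tree files.
    system = '/etc/avocado/sysinfo'
    if os.path.isdir(system):
        return system
    return os.path.join(repo_root, 'etc', 'avocado', 'sysinfo')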

Error getting parameters from the multiplexer

A very simple avocado run revealed a possible bug in the multiplexer parameter code:

./scripts/avocado run sleeptest --multiplex examples/tests/sleeptest.py.data/sleeptest.yaml
JOB ID     : 59ac07ecd7788af2dd98e1d6e71a5d51cedc6d72
JOB LOG    : /home/cleber/avocado/job-results/job-2015-05-21T16.15-59ac07e/job.log
JOB HTML   : /home/cleber/avocado/job-results/job-2015-05-21T16.15-59ac07e/html/results.html
TESTS      : 4
(1/4) sleeptest.py: PASS (1.00 s)
(2/4) sleeptest.py.1: PASS (1.01 s)
(3/4) sleeptest.py.2: PASS (1.00 s)
(4/4) sleeptest.py.3: PASS (1.00 s)
PASS       : 4
ERROR      : 0
FAIL       : 0
SKIP       : 0
WARN       : 0
INTERRUPT  : 0
TIME       : 4.02 s

As can be seen in the log, the self.params.get calls are actually returning the default value.

16:15:53 job        L0323 INFO | Job ID: 59ac07ecd7788af2dd98e1d6e71a5d51cedc6d72
16:15:53 job        L0324 INFO | 
16:15:53 test       L0147 INFO | START sleeptest.py
16:15:53 test       L0148 DEBUG| 
16:15:53 multiplexe L0193 DEBUG| PARAMS (key=timeout, path=*, default=None) => None
16:15:53 multiplexe L0193 DEBUG| PARAMS (key=sleep_length, path=*, default=1) => 1
16:15:53 sleeptest  L0020 DEBUG| Sleeping for 1.00 seconds
16:15:54 test       L0492 INFO | PASS sleeptest.py
16:15:54 test       L0473 INFO | 
16:15:54 test       L0147 INFO | START sleeptest.py.1
16:15:54 test       L0148 DEBUG| 
16:15:54 multiplexe L0193 DEBUG| PARAMS (key=timeout, path=*, default=None) => None
16:15:54 multiplexe L0193 DEBUG| PARAMS (key=sleep_length, path=*, default=1) => 1
16:15:54 sleeptest  L0020 DEBUG| Sleeping for 1.00 seconds
16:15:55 test       L0492 INFO | PASS sleeptest.py.1
16:15:55 test       L0473 INFO | 
16:15:55 test       L0147 INFO | START sleeptest.py.2
16:15:55 test       L0148 DEBUG| 
16:15:55 multiplexe L0193 DEBUG| PARAMS (key=timeout, path=*, default=None) => None
16:15:55 multiplexe L0193 DEBUG| PARAMS (key=sleep_length, path=*, default=1) => 1
16:15:55 sleeptest  L0020 DEBUG| Sleeping for 1.00 seconds
16:15:56 test       L0492 INFO | PASS sleeptest.py.2
16:15:56 test       L0473 INFO | 
16:15:56 test       L0147 INFO | START sleeptest.py.3
16:15:56 test       L0148 DEBUG| 
16:15:56 multiplexe L0193 DEBUG| PARAMS (key=timeout, path=*, default=None) => None
16:15:56 multiplexe L0193 DEBUG| PARAMS (key=sleep_length, path=*, default=1) => 1
16:15:56 sleeptest  L0020 DEBUG| Sleeping for 1.00 seconds
16:15:57 test       L0492 INFO | PASS sleeptest.py.3
16:15:57 test       L0473 INFO | 
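
For context, the call producing the PARAMS lines above has this shape inside sleeptest (a fragment of the test method, not standalone code); with the variants from sleeptest.yaml applied, the four jobs would be expected to resolve different sleep_length values instead of always falling back to the default of 1:

import time

# fragment of the test method:
sleep_length = self.params.get('sleep_length', default=1)
self.log.debug('Sleeping for %.2f seconds', sleep_length)
time.sleep(sleep_length)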

Avocado crashed: ReaderError: unacceptable character #x007f: ...

rmoura@thinkpad:avocado% ./scripts/avocado run examples/tests/sleeptest.py -m examples/tests/sleeptest.py.data/sleeptest.yaml /usr/bin/true --job-timeout=3s
Avocado crashed: ReaderError: unacceptable character #x007f: control characters are not allowed
in "/usr/bin/true", position 0
Traceback (most recent call last):

  File "/home/rmoura/Work/avocado/avocado/job.py", line 313, in _run
    mux = multiplexer.Mux(self.args)

  File "/home/rmoura/Work/avocado/avocado/multiplexer.py", line 442, in __init__
    self.pools = parse_yamls(mux_files, filter_only, filter_out)

  File "/home/rmoura/Work/avocado/avocado/multiplexer.py", line 77, in parse_yamls
    input_tree = tree.create_from_yaml(input_yamls, debug)

  File "/home/rmoura/Work/avocado/avocado/core/tree.py", line 492, in create_from_yaml
    merge(data, path)

  File "/home/rmoura/Work/avocado/avocado/core/tree.py", line 475, in _merge
    data.merge(_create_from_yaml(path))

  File "/home/rmoura/Work/avocado/avocado/core/tree.py", line 452, in _create_from_yaml
    loaded_tree = yaml.load(stream, Loader)

  File "/usr/lib64/python2.7/site-packages/yaml/__init__.py", line 71, in load
    return loader.get_single_data()

  File "/usr/lib64/python2.7/site-packages/yaml/constructor.py", line 37, in get_single_data
    node = self.get_single_node()

  File "_yaml.pyx", line 702, in _yaml.CParser.get_single_node (ext/_yaml.c:8689)

  File "_yaml.pyx", line 905, in _yaml.CParser._parse_next_event (ext/_yaml.c:11673)

ReaderError: unacceptable character #x007f: control characters are not allowed
in "/usr/bin/true", position 0

Please include the traceback info and command line used on your bug report
Report bugs visiting https://github.com/avocado-framework/avocado/issues/new
rmoura@thinkpad:avocado% ./scripts/avocado run examples/tests/sleeptest.py /u
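
Position 0 of /usr/bin/true is the ELF magic byte 0x7f, which is exactly the "unacceptable character #x007f" the YAML reader rejects: the test reference was apparently consumed as a multiplex file (presumably because -m accepts multiple files). A sketch of a friendlier guard before parsing (the helper name is hypothetical):

import yaml

def load_mux_file(path):
    # Refuse binary input with a clear message instead of letting the
    # YAML reader crash on control characters.
    with open(path, 'rb') as mux_file:
        data = mux_file.read()
    if b'\x00' in data or b'\x7f' in data:
        raise SystemExit("'%s' does not look like a YAML multiplex file" % path)
    return yaml.safe_load(data)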

Running test synctest inside a virtual machine crashes Avocado

Running the test synctest inside a virtual machine with the --archive option turned on crashes Avocado:

Traceback (most recent call last):

  File "/usr/lib/python2.7/site-packages/avocado/job.py", line 237, in _run
    archive.create_zip(name, self.debugdir)

  File "/usr/lib/python2.7/site-packages/avocado/utils/archive.py", line 242, in create_zip
    with zipfile.ZipFile(name, 'w') as zf:

  File "/usr/lib64/python2.7/zipfile.py", line 752, in __init__
    self.fp = open(file, modeDict[mode])

IOError: [Errno 2] No such file or directory: 'run-2014-05-24-09.28.48.zip'
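
The relative archive name in the IOError suggests the current working directory is gone or has changed by the time the zip is written. A defensive sketch that anchors the archive next to the results directory instead (assuming the results dir path is absolute; this is not the actual avocado.utils.archive code):

import os
import zipfile

def create_zip(name, root):
    # Anchor the archive next to the results directory instead of
    # relying on the (possibly stale) current working directory.
    if not os.path.isabs(name):
        name = os.path.join(os.path.dirname(root), name)
    with zipfile.ZipFile(name, 'w') as zf:
        for dirpath, _, filenames in os.walk(root):
            for filename in filenames:
                path = os.path.join(dirpath, filename)
                zf.write(path, os.path.relpath(path, root))
    return name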
