
ansible-logstash-callback's People

Contributors

pgporada, ujenmr, wbagdon


ansible-logstash-callback's Issues

Name of play and play ID are not logged

Assuming the following playbook, one should expect a logged field for each play name in the playbook.

---
- name: Play 1
  hosts: localhost
  connection: local
  tasks:
    - name: Task 1
      shell: uptime

- name: Play 2
  hosts: localhost
  connection: local
  tasks:
    - name: Task 1
      shell: uname -a
...

What I would expect to see as a log entry is:

{
        "ansible_type" => "start",
               "level" => "INFO",
             "session" => "87c7bf9e-f731-11e6-845a-2477032230c8",
             "message" => "START /tmp/kitchen/default.yml",
                "type" => "ansible",
    "ansible_playbook" => "/tmp/kitchen/default.yml",
    "ansible_play_name" => "Play 1",                               <============================
    "ansible_play_id" => "82cc5a17-d834-41a4-833b-79794311abff",   <============================
                "tags" => [],
                "path" => "/tmp/kitchen/callback_plugins/logstash.py",
          "@timestamp" => 2017-02-20T05:57:58.497Z,
                "port" => 43622,
            "@version" => "1",
                "host" => "laptappy",
         "logger_name" => "python-logstash-logger",
              "status" => "OK"
}

Thanks for creating this plugin in the first place! It's been really helpful so far. 😃 👍
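
A minimal sketch of how a callback could pick these fields up, assuming the plugin keeps its python-logstash logger on self.logger the way logstash.py does; the class scaffolding and field names here are illustrative, not the plugin's actual code:

import logging

from ansible.plugins.callback import CallbackBase


class CallbackModule(CallbackBase):
    # Sketch only: the real plugin attaches a python-logstash handler instead
    # of the bare stdlib logger used here.
    CALLBACK_VERSION = 2.0
    CALLBACK_TYPE = 'aggregate'
    CALLBACK_NAME = 'logstash_play_sketch'

    def __init__(self):
        super(CallbackModule, self).__init__()
        self.logger = logging.getLogger('python-logstash-logger')

    def v2_playbook_on_play_start(self, play):
        # Play objects expose both a display name and a unique id.
        data = {
            'ansible_type': "play",
            'ansible_play_name': play.get_name(),
            'ansible_play_id': play._uuid,
        }
        self.logger.info("ansible play", extra=data)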

Ansible version that runs the playbook is not logged

It would be useful to see the version of ansible in each log message. I have a testing situation with different versions of ansible and would like to see if anything changes between these different versions.

Expected output would be something like

{
        "ansible_type" => "start",
               "level" => "INFO",
             "session" => "87c7bf9e-f731-11e6-845a-2477032230c8",
             "message" => "START /tmp/kitchen/default.yml",
                "type" => "ansible",
    "ansible_playbook" => "/tmp/kitchen/default.yml",
    "ansible_version" => "2.2.1.0",                     <==========================
                "tags" => [],
                "path" => "/tmp/kitchen/callback_plugins/logstash.py",
          "@timestamp" => 2017-02-20T05:57:58.497Z,
                "port" => 43622,
            "@version" => "1",
                "host" => "laptappy",
         "logger_name" => "python-logstash-logger",
              "status" => "OK"
}

My particular setup utilizes the tox project for testing roles/playbooks as follows

tox.ini

[tox]
minversion = 1.8
envlist = py{27}-ansible{221,220,212,202}
skipsdist = true

[flake8]
ignore = E501

[testenv]
passenv = *
deps =
    python-vagrant
    molecule
    ansible202: ansible==2.0.2.0
    ansible212: ansible==2.1.2.0
    ansible220: ansible==2.2.0.0
    ansible221: ansible==2.2.1.0
commands =
    molecule converge
    molecule verify

This setup runs a Python 2.7 virtualenv for each of Ansible 2.0.2.0, 2.1.2.0, and so on, and executes the shell commands molecule converge and molecule verify in each. Being able to get the Ansible version into my Logstash/Elasticsearch data would be really helpful.
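
For reference, a minimal sketch of where that version could come from: ansible exposes it as ansible.__version__, so the callback could merge a field like the one in the expected output above into every payload it builds.

from ansible import __version__ as ansible_version

# Extra field a callback could add to each payload it sends (sketch only).
version_field = {'ansible_version': ansible_version}
print(version_field)   # e.g. {'ansible_version': '2.2.1.0'}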

Not picking up ansible.cfg settings

I've set the options in ansible.cfg like in the examples in logstash.py but only the env vars seem to be picked up.

This does not work:

[defaults]
callback_whitelist = logstash

[callback_logstash]
server = logstash
port = 5000
pre_command = git rev-parse HEAD
type = enterprise

But if I set the environment variables instead, it works.
Am I misconfiguring something in ansible.cfg?
I'm trying to make these settings portable so that other users will also log to Logstash.
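
For reference, since Ansible 2.4 a callback plugin only reads ansible.cfg sections for options it declares in its DOCUMENTATION string. A sketch of what such a declaration could look like for this plugin follows; the option, section, and environment variable names are assumptions modeled on the examples above and would need to match what logstash.py actually reads.

DOCUMENTATION = '''
    callback: logstash
    type: notification
    short_description: Sends playbook events to a Logstash TCP input
    options:
      server:
        description: Logstash server host.
        default: localhost
        env:
          - name: LOGSTASH_SERVER
        ini:
          - section: callback_logstash
            key: server
      port:
        description: Logstash server port.
        default: 5000
        env:
          - name: LOGSTASH_PORT
        ini:
          - section: callback_logstash
            key: port
      type:
        description: Value of the "type" field attached to every message.
        default: ansible
        env:
          - name: LOGSTASH_TYPE
        ini:
          - section: callback_logstash
            key: type
'''

With a declaration like this, Ansible fills the options from the [callback_logstash] section (or from the environment variables), and the plugin reads them with self.get_option('server') after set_options() has run, as sketched under the 2.9.2 issue below.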

logstash callback, does not support setting 'options'

Hi,

I receive the below warning while executing ansible.

[DEPRECATION WARNING]: logstash callback, does not support setting 'options',
it will work for now, but this will be required in the future and should be
updated, see the 2.4 porting guide for details.. This feature will be removed
in version 2.9. Deprecation warnings can be disabled by setting
deprecation_warnings=False in ansible.cfg.

We are currently on Ansible 2.8, and I suspect I will no longer be able to use this plugin once we migrate to 2.9 in the near future.

Could you please fix this?

Thanks,
Harsha

Does not work with ansible 2.9.2

$ ansible-playbook main.yaml
ERROR! Unexpected Exception, this is probably a bug: 'CallbackModule' object has no attribute '_options'

ansible --version
ansible 2.9.2
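
Both this traceback and the deprecation warning in the previous issue appear to come from reading self._options directly instead of going through the options API introduced in Ansible 2.4. A sketch of the override the newer API expects, to be added to the plugin's CallbackModule; the option names are the assumed ones from the DOCUMENTATION sketch above, and the attributes they are stored on are illustrative.

    def set_options(self, task_keys=None, var_options=None, direct=None):
        # Let Ansible populate the options declared in DOCUMENTATION,
        # then read them back via get_option() instead of self._options.
        super(CallbackModule, self).set_options(task_keys=task_keys,
                                                var_options=var_options,
                                                direct=direct)
        self.ls_server = self.get_option('server')
        self.ls_port = self.get_option('port')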

Task IDs should be logged

Currently we get the following data

{
        "ansible_type" => "task",
               "level" => "INFO",
      "ansible_result" => "{\"changed\": false, \"msg\": \"\", \"rc\": 0, \"results\": [\"All packages providing epel-release are up to date\", \"\"]}",
             "session" => "87c7bf9e-f731-11e6-845a-2477032230c8",
     "ansible_changed" => false,
             "message" => "{\"changed\": false, \"msg\": \"\", \"rc\": 0, \"results\": [\"All packages providing epel-release are up to date\", \"\"]}",
                "type" => "ansible",
    "ansible_playbook" => "/tmp/kitchen/default.yml",
        "ansible_task" => "TASK: pgporada.repo-epel : Second pass to ensure EPEL is the latest version",
                "tags" => [],
                "path" => "/tmp/kitchen/callback_plugins/logstash.py",
        "ansible_host" => "localhost",
          "@timestamp" => 2017-02-20T06:01:12.269Z,
                "port" => 43622,
                "host" => "laptappy",
            "@version" => "1",
         "logger_name" => "python-logstash-logger",
}

In the ansible output for this task, here is all the information available

{
                    "hosts": {
                        "localhost": {
                            "_ansible_no_log": false, 
                            "_ansible_parsed": true, 
                            "changed": false, 
                            "invocation": {
                                "module_args": {
                                    "conf_file": null, 
                                    "disable_gpg_check": false, 
                                    "disablerepo": null, 
                                    "enablerepo": null, 
                                    "exclude": null, 
                                    "install_repoquery": true, 
                                    "list": null, 
                                    "name": [
                                        "epel-release"
                                    ], 
                                    "state": "latest", 
                                    "update_cache": false, 
                                    "validate_certs": true
                                }
                            }, 
                            "msg": "", 
                            "rc": 0, 
                            "results": [
                                "All packages providing epel-release are up to date", 
                                ""
                            ]
                        }
                    }, 
                    "task": {
                        "id": "edf38f61-6d47-40b0-b00e-cb3e1f172c71",               <======================
                        "name": "Second pass to ensure EPEL is the latest version"
                    }
                }
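
The id shown above is available to the callback as well: the task object passed to v2_playbook_on_task_start carries it as task._uuid, and result callbacks can reach it through result._task._uuid. A sketch of a method that could be added to the CallbackModule sketch under the play-name issue above (payload field names are illustrative):

    def v2_runner_on_ok(self, result):
        data = {
            'ansible_type': "task",
            'ansible_task': result._task.get_name(),
            'ansible_task_id': result._task._uuid,   # e.g. "edf38f61-6d47-40b0-b00e-cb3e1f172c71"
            'ansible_result': self._dump_results(result._result),
        }
        self.logger.info("ansible ok", extra=data)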

Cannot import the ansible template in Elasticsearch 7.4.1

โฏ curl -s -H 'Content-Type: application/json' -XPUT 'http://localhost:9200/_template/ansible' [email protected] --user elastic:xxxxxxxxxxxxxxxxxx | jq
{
  "error": {
    "root_cause": [
      {
        "type": "mapper_parsing_exception",
        "reason": "Root mapping definition has unsupported parameters:  [_default_ : {dynamic_templates=[{strings_as_keyword={mapping={ignore_above=1024, type=keyword}, match_mapping_type=string}}], _all={norms=false}, properties={@timestamp={type=date}, ansible_facts={norms=false, type=text}, ansible_result={norms=false, type=text}, input_type={ignore_above=1024, type=keyword}, message={norms=false, type=text}, type={ignore_above=1024, type=keyword}, tags={ignore_above=1024, type=keyword}}}]"
      }
    ],
    "type": "mapper_parsing_exception",
    "reason": "Failed to parse mapping [_doc]: Root mapping definition has unsupported parameters:  [_default_ : {dynamic_templates=[{strings_as_keyword={mapping={ignore_above=1024, type=keyword}, match_mapping_type=string}}], _all={norms=false}, properties={@timestamp={type=date}, ansible_facts={norms=false, type=text}, ansible_result={norms=false, type=text}, input_type={ignore_above=1024, type=keyword}, message={norms=false, type=text}, type={ignore_above=1024, type=keyword}, tags={ignore_above=1024, type=keyword}}}]",
    "caused_by": {
      "type": "mapper_parsing_exception",
      "reason": "Root mapping definition has unsupported parameters:  [_default_ : {dynamic_templates=[{strings_as_keyword={mapping={ignore_above=1024, type=keyword}, match_mapping_type=string}}], _all={norms=false}, properties={@timestamp={type=date}, ansible_facts={norms=false, type=text}, ansible_result={norms=false, type=text}, input_type={ignore_above=1024, type=keyword}, message={norms=false, type=text}, type={ignore_above=1024, type=keyword}, tags={ignore_above=1024, type=keyword}}}]"
    }
  },
  "status": 400
}

I think this is the solution, but I'm not sure:
https://xyzcoder.github.io/elasticsearch/nest/2019/04/12/es-70-and-nest-mapping-error.html
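
That link points in the right direction: Elasticsearch 7 removed the _default_ mapping type (and the _all field), so the template body from the error above would need its dynamic_templates and properties to sit directly under mappings. An untested sketch of the reshaped template, reconstructed from the error message (the index pattern is an assumption):

{
  "index_patterns": ["ansible-*"],
  "mappings": {
    "dynamic_templates": [
      {
        "strings_as_keyword": {
          "match_mapping_type": "string",
          "mapping": { "type": "keyword", "ignore_above": 1024 }
        }
      }
    ],
    "properties": {
      "@timestamp":     { "type": "date" },
      "ansible_facts":  { "type": "text", "norms": false },
      "ansible_result": { "type": "text", "norms": false },
      "input_type":     { "type": "keyword", "ignore_above": 1024 },
      "message":        { "type": "text", "norms": false },
      "type":           { "type": "keyword", "ignore_above": 1024 },
      "tags":           { "type": "keyword", "ignore_above": 1024 }
    }
  }
}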

help adding another field

Hi. I've been trying to modify the Python script to add a custom field that returns stderr and stdout for each task when they are not empty, and "none" otherwise.

I can't figure out the code to do this. Would you be willing to help?
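
A minimal sketch of one way to do it, assuming it would be called from the result callbacks in logstash.py (e.g. v2_runner_on_ok and v2_runner_on_failed); shell and command tasks put stdout/stderr keys in their result dict, other modules may not, hence the fallback:

def std_stream_fields(result_dict):
    """Return stdout/stderr fields for the log payload, or "none" when absent or empty."""
    return {
        'ansible_stdout': result_dict.get('stdout') or "none",
        'ansible_stderr': result_dict.get('stderr') or "none",
    }

# e.g. inside v2_runner_on_ok:
#     data.update(std_stream_fields(result._result))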

Port field in Logstash output

Hello,

What is the "port" field in your Logstash output related to?

I see the same task with the same "ansible_task_id" and timestamp but with two different port values (e.g. 46028 and 46024).

Ansible version 2.4.2.0
ELK 5.1.1
python-logstash==0.4.6

Thank you

ansible_result as string

Hi,
When using the plugin, the result from Ansible (ansible_result) is treated as a string, so it's impossible to search over the JSON structure of the result.

On the other hand, when using a Python dictionary such as:
data = {
    'status': "OK",
    'host': self.hostname,
    'session': self.session,
    'ansible_type': "task",
    'ansible_playbook': self.playbook,
    'ansible_host': result._host.name,
    'ansible_task': task_name,
    'ansible_result': result._result
}
Then it looks OK, and I'm able to use queries in Kibana like this one:
ansible_result.results.*.status: true

Dubravko
