Comments (11)

paulfantom commented on August 17, 2024

Just to clarify: after creating those files, you'd like to load them with file_sd like this:

  - file_sd_configs:
    - files:
      - /etc/prometheus/file_sd/*.yml

?

SuperQ commented on August 17, 2024

No, I would like to be able to configure them with a specific job.

See this Prometheus config and how we're currently deploying the config files.

paulfantom commented on August 17, 2024

You are using this role at FOSDEM?! So great! 🎉 Shame I cannot go there this year 😟

But back to the subject. Look at how we apply file_sd right now in defaults/main.yml. We can just change line L54 to use a wildcard, so that by default it loads all of the target files. However, if someone wants to override this behaviour, they can, simply by specifying a different prometheus_scrape_configs. This way the role could create custom target files and you could load them with custom jobs (see the sketch below). Is this OK?
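
For illustration, a rough sketch of that wildcard default (the job name here is an assumption, not the actual contents of defaults/main.yml):

  prometheus_scrape_configs:
    - job_name: "node"                        # assumed default job name
      file_sd_configs:
        - files:
            - /etc/prometheus/file_sd/*.yml   # wildcard instead of a single target file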

SuperQ commented on August 17, 2024

No, using a wildcard there goes against Prometheus's job design; it would basically lump all targets into one job. This is why I changed the default scrape_config to specify only one target file. 😃

The goal here is to allow many different jobs, each with its own target list.
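
Something like the following sketch (job names and file names are illustrative, not taken from the repo):

  scrape_configs:
    - job_name: "node"
      file_sd_configs:
        - files:
            - /etc/prometheus/file_sd/node.yml
    - job_name: "blackbox"
      file_sd_configs:
        - files:
            - /etc/prometheus/file_sd/blackbox.yml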

paulfantom commented on August 17, 2024

I get it. However, we need some default way to load those files. Imagine someone describes their targets in prometheus_targets but forgets to write a custom prometheus_scrape_configs entry; what then? With the wildcard default we ensure Prometheus will still load those targets, albeit not the way it was designed to work.
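
To make that scenario concrete, a rough sketch (hostnames are invented, and the variable layout is assumed from this discussion): prometheus_targets is set, but no matching prometheus_scrape_configs entry references the generated file, so only a wildcard default would pick it up.

  prometheus_targets:
    node:                                # the role would write something like file_sd/node.yml
      - targets:
          - node1.example.org:9100
          - node2.example.org:9100
  # prometheus_scrape_configs is left at its default, so nothing references node.yml
  # unless that default uses the wildcard discussed above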

SuperQ commented on August 17, 2024

Adjusting scrape configs and targets at the same time is basically a requirement for Prometheus anyway. You can't have targets without a correctly identified scrape config.

paulfantom commented on August 17, 2024

You are right. However, you can define it via the label system; look at this: https://github.com/cloudalchemy/demo-site/blob/master/group_vars/all/vars#L9 and then this: http://demo.cloudalchemy.org:9090/targets

One scrape config, multiple jobs.
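
Roughly, the scrape config stays as a single wildcard entry (like in the first comment), while each generated target file carries its own labels, including a job label, so the targets still show up under separate jobs on /targets. A sketch of such a target file (names and labels are illustrative, not the demo-site's actual values):

  # /etc/prometheus/file_sd/node.yml (illustrative)
  - targets:
      - node1.example.org:9100
    labels:
      job: node
      env: demo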

paulfantom commented on August 17, 2024

I don't know whether such usage is an anti-pattern or not, but it's the simplest way we could think of. With such a config you just say: here is your target list; these targets belong to this environment and do this job.

SuperQ commented on August 17, 2024

The example demo site configuration is not the right pattern to define multiple jobs.

paulfantom commented on August 17, 2024

OK, we'll change it after doing #60.

lock commented on August 17, 2024

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
