
JFrog Log Analytics

This project integrates JFrog logs into various log analytic providers through the use of fluentd as a common logging agent.

The goal of this project is to provide JFrog customers with robust log analytic solutions that they could use to monitor the JFrog unified platform microservices.

How to clone this project?

This project makes use of git submodules; for this reason, we recommend cloning with the --recursive flag to ensure all log vendor files are downloaded.

git clone https://github.com/jfrog/log-analytics.git --recursive
cd log-analytics
git submodule foreach git checkout master
git submodule foreach git pull origin master
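
If the vendor folders appear empty after cloning, the submodules were likely not initialized. A quick check using standard git commands (nothing specific to this repository):

git submodule status
git submodule update --init --recursive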

Fluentd

Fluentd is a required component to use this integration.

Fluentd provides a logging agent called td-agent, which must be installed on each node you wish to monitor logs on.

For more details on how to install Fluentd into your environment please visit:

Fluentd installation guide

JFrog Installation Configurations

Because customer installations vary, we cannot account for every possible installation. To ensure our integration works with your installation, please review:

JFrog Product Directory Structure guide

The environment variable JF_PRODUCT_DATA_INTERNAL must be defined to the correct location.

Helm based installs will already have this defined based upon the underlying docker images.

For non-k8s based installations, below is a reference to the locations used by the Docker images for each product. Note these locations may differ based upon the installation location chosen.

Artifactory: 
export JF_PRODUCT_DATA_INTERNAL=/var/opt/jfrog/artifactory/
Xray:
export JF_PRODUCT_DATA_INTERNAL=/var/opt/jfrog/xray/
Mission Control:
export JF_PRODUCT_DATA_INTERNAL=/var/opt/jfrog/mc/
Distribution:
export JF_PRODUCT_DATA_INTERNAL=/var/opt/jfrog/distribution/
Pipelines:
export JF_PRODUCT_DATA_INTERNAL=/opt/jfrog/pipelines/var/

Note: if you are using Artifactory 6.x, you will need to use the legacy environment variable ARTIFACTORY_HOME instead.
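
A quick sanity check, sketched below for the default Artifactory path (adjust the export for your product and install location), is to confirm the variable points at a directory containing the log folder fluentd will tail:

export JF_PRODUCT_DATA_INTERNAL=/var/opt/jfrog/artifactory/
ls $JF_PRODUCT_DATA_INTERNAL/log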

Root Installation

To install the td-agent agent on Red Hat UBI, run the command below:

$ curl -L https://toolbelt.treasuredata.com/sh/install-redhat-td-agent3.sh | sh

Root access is required, as this uses yum to install td-agent.
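
Once the install script completes, you can verify the agent is present and registered as a service, for example:

td-agent --version
systemctl status td-agent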

User Installation

For non-root users, to make life easier we have provided a tar.gz containing everything you need to run fluentd.

After extracting the archive under $JF_PRODUCT_DATA_INTERNAL, run fluentd with your chosen configuration file:

$JF_PRODUCT_DATA_INTERNAL/fluentd-1.11.0-linux-x86_64/fluentd <conf_file>

Updating fluentd to future releases is simple as well:

$JF_PRODUCT_DATA_INTERNAL/fluentd-1.11.0-linux-x86_64/lib/ruby/bin/gem install fluentd

Adding any fluentd plugins, such as Datadog, works in the same fashion:

$JF_PRODUCT_DATA_INTERNAL/fluentd-1.11.0-linux-x86_64/lib/ruby/bin/gem install fluent-plugin-datadog
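
To confirm which fluentd plugins are available to the bundled Ruby (for example after installing fluent-plugin-datadog above), you can list the installed gems:

$JF_PRODUCT_DATA_INTERNAL/fluentd-1.11.0-linux-x86_64/lib/ruby/bin/gem list | grep fluent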

Logger Agent

  • Applies to package manager installations only.

The default configuration file for td-agent is located at:

/etc/td-agent/td-agent.conf

You should update this configuration file and run td-agent as a service.

If you wish to run td-agent against a test configuration file (rather than as a service), you can also run:

td-agent -c fluentd.conf

Once td-agent has been installed on an Artifactory or Xray node, you will also need to install the relevant output plugin for your log vendor:

Splunk:

td-agent-gem install fluent-plugin-splunk-enterprise

Datadog:

td-agent-gem install fluent-plugin-datadog

Elastic:

td-agent-gem install fluent-plugin-elasticsearch
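
The production configuration files live in each log vendor's folder of this repository; before wiring one up, a minimal throwaway config is a useful smoke test that td-agent can tail a JFrog log at all. The snippet below is only a sketch: the log file name is an example for Artifactory 7.x, and the stdout output stands in for the vendor plugin installed above.

cat > /tmp/test-jfrog.conf <<'EOF'
<source>
  @type tail
  path /var/opt/jfrog/artifactory/log/artifactory-service.log
  pos_file /tmp/artifactory-service.log.pos
  tag jfrog.rt.service
  <parse>
    @type none
  </parse>
</source>
<match jfrog.**>
  @type stdout
</match>
EOF
td-agent -c /tmp/test-jfrog.conf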

Config Files

Fluentd requires a configuration file to know which logs to tail and how to ship them to the correct log provider.

Our configurations are saved into each log provider's folder.

These configurations need to be stored in the correct location for your installation type.

As noted above, the environment variable JF_PRODUCT_DATA_INTERNAL must be defined to the correct location (see the per-product paths listed earlier); Helm based installs will already have this defined based upon the underlying docker images. If you are running Artifactory 6.x, ensure the legacy ARTIFACTORY_HOME environment variable is set instead.
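
For a package manager (root) install, the chosen vendor configuration replaces the default td-agent configuration. A sketch, where splunk/fluent.conf.rt stands for whichever file in this repository matches your vendor and product:

sudo cp splunk/fluent.conf.rt /etc/td-agent/td-agent.conf
sudo systemctl restart td-agent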

Running as a service

By default, td-agent runs as the td-agent user; however, the JFrog log folder only grants file permissions to the artifactory or xray user.

  • Fix the group and file permissions issue in Artifactory as root:
usermod -a -G artifactory td-agent
chmod 0770 $JF_PRODUCT_DATA_INTERNAL/log
chmod 0640 $JF_PRODUCT_DATA_INTERNAL/log/*.log
  • Fix the group and file permissions issue in Xray as root:
usermod -a -G xray td-agent
chmod 0770 $JF_PRODUCT_DATA_INTERNAL/log
chmod 0640 $JF_PRODUCT_DATA_INTERNAL/log/*.log
  • Run td-agent and check its status (a quick read-permission check is sketched after this list):
systemctl start td-agent
systemctl status td-agent
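
As a quick check that the permission changes took effect, confirm the td-agent user can read the log folder (the group change from usermod applies to newly started processes, including the command below):

sudo -u td-agent ls $JF_PRODUCT_DATA_INTERNAL/log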

Running as a service as a regular user

Using systemd:

  • Create a service unit configuration file
mkdir -p ~/.config/systemd/user/
touch ~/.config/systemd/user/jfrogfluentd.service
  • Copy and paste the snippet below, update the ExecStart path to match $JF_PRODUCT_DATA_INTERNAL/ and your fluentd configuration file location, and save the file:
[Unit]
Description=JFrog_Fluentd

[Service]
ExecStart=/opt/jfrog/artifactory/var/fluentd-1.11.0-linux-x86_64/fluentd <conf_file>
Restart=always

[Install]
WantedBy=graphical.target
See man systemd.service and man systemd.unit for more options.
  • Enable service in userspace
systemctl --user enable jfrogfluentd
  • Start it and check its status (see the optional lingering note after this list)
systemctl --user start jfrogfluentd
systemctl --user status jfrogfluentd
  • Enjoy!
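
By default a user service only runs while that user has an active session; if the node should keep shipping logs after you log out, systemd's lingering feature covers this (replace <user> with the account running fluentd):

sudo loginctl enable-linger <user>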

Tools

  • Fluentd - Fluentd Logging Aggregator/Agent
  • Splunk - Splunk Logging Platform
  • Splunk HEC - Splunk HEC used to upload data into Splunk
  • Elasticsearch - Elasticsearch log data platform
  • Kibana - Elasticsearch visualization layer
  • Prometheus - Prometheus metrics and monitoring

Contributing

Please read CONTRIBUTING.md for details on our code of conduct, and the process for submitting pull requests to us.

Versioning

We use SemVer for versioning. For the versions available, see the tags on this repository.

Contact

  • Github
