Comments (6)

infinitydon commented on June 22, 2024

Hi,

1.) Yes, any recent Kubernetes cluster will do just fine (AWS, GKE, or any other cloud/bare-metal). You don't have to copy the jmeter folder inside the k8s cluster; just install the operator and then create the jmeter instance as needed (it is the same procedure that you used locally).

2.) Currently the InfluxDB data lives inside the pod, so when the jmeter instance is deleted, all the data is lost. Depending on your specific use case, you can add persistence by mounting the /data path on a PVC. To do this, you will have to custom-build the operator, adding a PVC (which needs to be created beforehand) to the influxdb deployment manifest (https://github.com/kubernauts/jmeter-operator/blob/master/roles/jmeter/tasks/main.yml). A sample snippet of what to add is given below:

          volumeMounts:
          - name: influxdb-data
            mountPath: /data
      volumes:
        - name: influxdb-data
          persistentVolumeClaim:
            claimName: influxdb-heapster-pvc
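For completeness, the PVC that the comment says must exist beforehand could look roughly like the sketch below. The name matches the `claimName` in the snippet above; the storage size, access mode, and (omitted) storage class are assumptions to adapt to your cluster:

```yaml
# Hypothetical PVC backing the influxdb /data path.
# Size and access mode are placeholders -- adjust to your cluster.
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: influxdb-heapster-pvc
spec:
  accessModes:
    - ReadWriteOnce
  resources:
    requests:
      storage: 10Gi
```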

An alternative is to disable the InfluxDB installation when creating the jmeter instance (set "influxdb_install" to false) and install InfluxDB separately, with persistence via a PVC as explained above; you could even use a central InfluxDB for all your JMeter tests. Going this route requires configuring the Grafana datasource manually, which means the initialize_cluster.sh script will not work in this scenario by default unless you adapt it.
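A jmeter instance with the bundled InfluxDB disabled might then look roughly like the sketch below. Apart from `influxdb_install`, which the comment above names, the apiVersion, kind, and other spec fields are assumptions from the sample CR, so treat deploy/crds/ in the repo as the authoritative reference:

```yaml
# Sketch of a Jmeter custom resource that skips the bundled InfluxDB.
# Field names other than influxdb_install are assumptions -- verify them
# against the sample CR shipped in deploy/crds/.
apiVersion: loadtest.jmeter.com/v1alpha1
kind: Jmeter
metadata:
  name: tqa-loadtest
spec:
  slave_replica_count: 2
  influxdb_install: "false"   # point JMeter at your own InfluxDB instead
  grafana_install: "true"
```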

3.) Yes, this can be set up with Terraform or other infrastructure tools out there; it depends solely on your experience with k8s and those tools.

from jmeter-operator.

Newbie112222 commented on June 22, 2024

Hi,
Thanks for the detailed response. Just for clarification: when you say install the operator, does that mean I have to repeat the following steps again, this time on the actual cluster instead of minikube?
The steps to use this operator are as follows (and so on):
(1.) Clone this repo: "git clone https://github.com/kubernauts/jmeter-operator.git"
(2.) Install the Jmeter CRD (custom resource definition):
"kubectl apply -f deploy/crds/loadtest_v1alpha1_jmeter_crd.yaml".

I will look into the InfluxDB data persistence piece as well, and I can also install another time-series DB on the cluster if needed, right?

I'm looking into Terraform to manage the cluster; I'll need to learn more about that tool, but I'll keep you posted. Thanks once again for all your help.

infinitydon commented on June 22, 2024

Yes, you need to repeat (1) and (2) each time if you can only manage the k8s cluster from the master nodes.
But if you can access the k8s clusters remotely, you can do this from your local computer and only have to clone the git repo once (then switch to the needed k8s context and install the operator & CRD).
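The clone-once, install-per-context flow could be scripted roughly as below. This is a sketch: the helper only prints the kubectl commands (pipe its output to `sh` to actually apply them), and the `deploy/` manifest path is an assumption based on the repo layout.

```shell
# install_cmds CTX: print the kubectl commands that install the JMeter CRD
# and operator manifests into the given kubeconfig context. Printing instead
# of executing keeps the sketch safe to run without a live cluster.
install_cmds() {
    ctx="$1"
    echo "kubectl --context=${ctx} apply -f deploy/crds/loadtest_v1alpha1_jmeter_crd.yaml"
    echo "kubectl --context=${ctx} apply -f deploy/"
}

# Example: generate the commands for two clusters from one local clone
for ctx in staging production; do
    install_cmds "$ctx"
done
```

To apply for real: `install_cmds staging | sh`.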

JMeter only supports Graphite and InfluxDB as time-series backends. If you switch to Graphite, you will have to adjust your JMeter test scripts to use Graphite instead of InfluxDB.

InfluxDB also supports the Graphite protocol (you don't need to install a separate Graphite pod), and the InfluxDB deployment in our jmeter-operator has Graphite enabled as well (in both the configmap and service manifests):

         # influxdb configmap (Graphite input, TOML):
         [[graphite]]
           enabled = true
           bind-address = ":2003"   # Graphite line-protocol listener
           database = "jmeter"      # store graphite data in this database

         # influxdb service (YAML):
          - port: 2003
            name: graphite
            targetPort: 2003

Newbie112222 commented on June 22, 2024

Hey,
So I was able to set up the whole process, and it looks like it's working as expected :) Also, I appreciate your patience in answering all my questions.

Below is my setup:

  • Set up a k8s cluster with 2 nodes in the us-east region
  • Moved the folders from the jmeter-operator repo into my working directory
  • Verified the cluster was up and running
  • Did all the steps from the repo, and all of them were successful, including Grafana etc.

I had 2 questions for you on the same:

  1. Every time I create a new cluster, do the steps above (deploying the .yaml files, installing the CRD, creating InfluxDB, etc.) need to be done again? Is there any way to avoid repeating them? (I am looking into Terraform for this.)
  2. Any ideas on how we can integrate the app logs into this setup, since InfluxDB will only show the test stats? Maybe include CloudWatch or Prometheus?

infinitydon commented on June 22, 2024

1.) Yes, you have to repeat the installation steps whenever you create a new k8s cluster. How you automate the jmeter-operator installation is entirely up to you, as long as you understand the automation tool; Terraform is just one way to achieve this.

2.) Not sure what you mean by app logs, but you should be able to use an ELK stack.

Newbie112222 commented on June 22, 2024

OK, thanks for all your help.
