
gatling-kafka's Issues

Protobuf Support

Currently the plugin supports only plain-text (String) payloads for sending and receiving data.
However, it's quite common for Kafka consumers and producers to use a schema format such as Avro or Protobuf to encode and decode messages. This is a feature request to add support for Protobuf-encoded messages.
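Until native support lands, one possible workaround (a sketch only; it assumes the plugin forwards serializer class names to the underlying KafkaProducer, as the String-based setups in other issues suggest) is to register Kafka's built-in ByteArraySerializer and hand the plugin payloads that were already encoded with Protobuf:

```java
// Sketch, not a confirmed plugin feature: switch the value serializer to raw
// bytes and do the Protobuf encoding yourself before handing data to Gatling.
KafkaProtocolBuilder kafkaProtocol = kafka
        .producerKeySerializer("org.apache.kafka.common.serialization.StringSerializer")
        .producerValueSerializer("org.apache.kafka.common.serialization.ByteArraySerializer");

// `MyMessage` is a hypothetical generated Protobuf class:
// byte[] payload = MyMessage.newBuilder().setSomeField("value").build().toByteArray();
```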

pace() does not apply properly

Hi guys, I'm trying to introduce a delay between iterations of each session with pace(), like this:

    public ScenarioBuilder getScenarioBuilder(KafkaSendActionBuilder kafkaSendActionBuilder) {
        return scenario("scenario")
                .exec(kafkaSendActionBuilder).pace(Duration.ofSeconds(10));
    }

    public void runSimulationSetup(KafkaSendActionBuilder kafkaSendActionBuilder, int sessions, int duration) {
        setUp(
                getScenarioBuilder(kafkaSendActionBuilder).injectOpen(constantUsersPerSec(sessions)
                        .during(duration))
        ).protocols(getKafkaProtocol());
    }

    // This instance initializer is declared in another class extending the one
    // with the methods above.
    {
        runSimulationSetup(kafkaFireAndForget(), Constants.NUMBER_OF_SESSIONS_BASKET, Constants.DURATION_IN_SEC_BASKET);
    }

However, the pace is not applied. I tried something similar with pause(), but that seems to let the whole simulation finish and then behave in unexpected ways; either way, the desired delay between sessions is missing.
Could you please check whether this works on your side and, if so, how to set it up so that it applies properly?
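For what it's worth, pace() sets a minimum duration per loop iteration, so it only has a visible effect when the same virtual user executes the chain more than once; with constantUsersPerSec each injected user runs the chain a single time and exits, leaving pace() nothing to space out. A sketch of the scenario wrapped in a loop (the 60-second loop duration is an illustrative value):

```java
public ScenarioBuilder getScenarioBuilder(KafkaSendActionBuilder kafkaSendActionBuilder) {
    // during(...).on(...) makes each virtual user iterate, so pace() can
    // enforce the 10-second rhythm between successive sends.
    return scenario("scenario")
            .during(Duration.ofSeconds(60)).on(
                    exec(kafkaSendActionBuilder).pace(Duration.ofSeconds(10)));
}
```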

Messages sent with SaslMechanism.scram_sha_512() report a false positive and are not received in the topic

Hi, I'm new to both Gatling and Kafka, but for my current company's needs I'm researching and building a POC to decide which Kafka plugin to use. Your plugin is the one suggested on the official Gatling site and the only active one supporting a Java implementation. My use case for now is fire-and-forget: I simply need to trigger a number of events from a Kafka producer and check the results later in our DB.

What I found is that when I apply SaslMechanism.scram_sha_512() with our test configuration, a successful (false positive) response is returned, but a subsequent check for the messages in the target topic returns an empty list. I tried many things, but the result is always a false positive, unless I use plaintext against a local Kafka without credentials, where it works fine. Furthermore, the same configuration shown below works fine with a JMeter setup and with another Kafka plugin (ru.tinkoff -> gatling-kafka-plugin_2.13). Here is the setup I am applying:

KafkaProtocolBuilder kafkaProtocol = kafka
        .broker(KafkaBroker("put your valid kafka host here", 9092))
        .acks("1")
        .producerKeySerializer("org.apache.kafka.common.serialization.StringSerializer")
        .producerValueSerializer("org.apache.kafka.common.serialization.StringSerializer")
        .credentials("put a valid username here", "put a valid password here", false, SaslMechanism.scram_sha_512())
        .addProducerProperty("security.protocol", "SASL_PLAINTEXT")
        .addProducerProperty("sasl.mechanism", "SCRAM-SHA-512")
        .addProducerProperty("sasl.jaas.config", "org.apache.kafka.common.security.scram.ScramLoginModule required username='put a valid username here' password='put a valid password here';");

P.S.: I have also tried using only the .credentials() method, and using only the .addProducerProperty("sasl.jaas.config", ...) one, but the result is always the same as described.
I would appreciate any feedback on this!
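One way to narrow this down is to test the same SCRAM credentials with the plain kafka-clients producer, outside Gatling. The sketch below (topic name and credentials are placeholders) sends a single record synchronously, so that .get() surfaces any authentication or authorization error that a fire-and-forget send would swallow; note that JAAS option values are conventionally double-quoted:

```java
import java.util.Properties;
import java.util.concurrent.TimeUnit;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;

public class ScramProbe {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "put your valid kafka host here:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("acks", "1");
        props.put("security.protocol", "SASL_PLAINTEXT");
        props.put("sasl.mechanism", "SCRAM-SHA-512");
        // JAAS option values double-quoted, as in the Kafka documentation.
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.scram.ScramLoginModule required "
                        + "username=\"put a valid username here\" password=\"put a valid password here\";");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Blocking on the returned Future surfaces SASL/broker failures
            // instead of silently treating the async send as successful.
            RecordMetadata md = producer
                    .send(new ProducerRecord<>("test-topic", "key", "value"))
                    .get(10, TimeUnit.SECONDS);
            System.out.println("Written to partition " + md.partition() + ", offset " + md.offset());
        }
    }
}
```

If this probe also "succeeds" without the message reaching the topic, the problem is likely in the broker/ACL setup rather than in the Gatling plugin.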

Consume only mode

Hi

As I understand it, there are currently only two ways to use this plugin: fire-and-forget and request-reply.
I wonder whether it is possible to use this plugin for consuming messages only?
My workflow is the following:

  1. Gatling sends messages to Rest service
  2. Message goes through a chain of services and ends up in Kafka topic
  3. Gatling consumes that topic, matches messages, etc

Thank you.
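For the matching in step 3, the correlation logic can be prototyped independently of the plugin. A minimal sketch (all names are illustrative, not plugin API) that pairs an expected correlation id with the record eventually consumed from the topic:

```java
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;

// Sketch: correlate records consumed from the Kafka topic (step 3) with the
// requests Gatling sent to the REST service (step 1).
class MessageMatcher {
    private final Map<String, CompletableFuture<String>> pending = new ConcurrentHashMap<>();

    /** Register an expected message when Gatling fires the request (step 1). */
    CompletableFuture<String> expect(String correlationId) {
        return pending.computeIfAbsent(correlationId, id -> new CompletableFuture<>());
    }

    /** Feed in every record polled from the topic; completes the matching future. */
    void onRecord(String correlationId, String value) {
        CompletableFuture<String> future = pending.remove(correlationId);
        if (future != null) {
            future.complete(value);
        }
    }
}
```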

Support for headers

That's something I could work around locally, but it would be nice to have it supported out of the box in the plugin.

In my scenario, I need to perform some "checks" on the ConsumerRecord headers. For example:

.check(jsonPath("$.headers.header1").is("value1"))

Of course, in order to be able to evaluate this JSON path, I had to extend the KafkaDsl, override the implicit function kafkaJsonPathCheckMaterializer, and then manipulate the schema of the JSON being evaluated. As shown above, the headers sit in an attribute at the JSON root, and the same goes for the original payload. Something like this:

{ "payload" : { original_payload }, "headers" : { "header1" : "value1", ... } }

I don't think this would be an acceptable solution for the plugin itself, because any existing code using the jsonPath method would break, but I hope the example serves as food for thought.

How does that sound to you?
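The wrapping itself is straightforward. A minimal sketch (HeaderEnvelope is a hypothetical helper; it assumes header values are plain strings without characters needing JSON escaping) of building the combined document:

```java
import java.util.Map;

// Sketch: wrap the original payload and the record headers into one JSON
// document so a single jsonPath check can reach both, e.g. $.headers.header1.
class HeaderEnvelope {
    static String wrap(String payloadJson, Map<String, String> headers) {
        StringBuilder sb = new StringBuilder("{\"payload\":").append(payloadJson)
                .append(",\"headers\":{");
        boolean first = true;
        for (Map.Entry<String, String> e : headers.entrySet()) {
            if (!first) {
                sb.append(',');
            }
            sb.append('"').append(e.getKey()).append("\":\"").append(e.getValue()).append('"');
            first = false;
        }
        return sb.append("}}").toString();
    }
}
```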

Java API

How does the idea of having a Java API, just like gatling-core has, sound to you? Is this something you see as adding value to this plugin?
