
Scalable cloud load/stress test for Azure Cognitive Search. Includes a pipelined solution with Apache JMeter and Terraform to dynamically provision and destroy the required infrastructure on Azure.

License: MIT License


page_type: sample
languages: yaml, python
products: azure, azure-cognitive-search
name: Azure Search Performance Tests
description: Performance testing tool for Azure Cognitive Search
urlFragment: azure-search-perf

Load Testing Pipeline with JMeter, ACI and Terraform

This pipeline helps you load test Azure Cognitive Search. It leverages Apache JMeter, an open-source load and performance testing tool, and Terraform to dynamically provision and destroy the required infrastructure on Azure. The JMeter workers and controller are hosted in Azure Container Instances (ACI), which also enables VNet injection and Private Endpoint scenarios.

Note: This is a fork of the original load-testing pipeline repo, customized for the Azure Cognitive Search (ACS) REST API and query syntax.

Key concepts

Architecture

The flow is triggered and controlled by an Azure Pipeline on Azure DevOps. The pipeline contains a set of tasks that are organized logically in SETUP, TEST, RESULTS and TEARDOWN groups.

SETUP
• Check if the JMeter Docker image exists
• Validate the JMX file that contains the JMeter test definition
• Upload the JMeter JMX file to the Azure Storage Account file share
• Provision the infrastructure with Terraform

TEST
• Run the JMeter test execution and wait for completion

RESULTS
• Show JMeter logs
• Get JMeter artifacts (e.g. logs, dashboard)
• Convert the JMeter test results (JTL format) to JUnit format
• Publish JUnit test results to Azure Pipelines
• Publish JMeter artifacts to Azure Pipelines

TEARDOWN
• Destroy all ephemeral infrastructure with Terraform

In the SETUP phase, JMeter agents are provisioned as Azure Container Instances (ACI) by Terraform, using a custom Docker image. Through JMeter's Remote Testing approach, the controller configures all workers, consolidates their results, and generates the resulting artifacts (dashboard, logs, etc.).

    The infrastructure provisioned by Terraform includes:

    • Resource Group
    • Virtual Network (VNet)
    • Storage Account File Share
    • 1 JMeter controller on ACI
    • N JMeter workers on ACI

In the RESULTS phase, a JMeter report dashboard and the test results are published at the end of each load-testing run.

    Prerequisites

You should have the following Azure resources, which are referenced by the pipeline variables in the steps below:

• An Azure subscription
• A resource group for the load-testing infrastructure
• An Azure Container Registry (ACR)
• An Azure Cognitive Search service with an index to test against

You will also need an Azure DevOps organization to host the project and pipelines.

    Getting Started using the UI

    1. Create an Azure DevOps project and import this repo

    Go to Azure DevOps, create a new project, and import this repo.

    Azure DevOps new project

Click into the Repos tab. You will get a warning saying that the repo is empty. Click on Import a repository, then for the Clone URL paste this URL: https://github.com/Azure-Samples/azure-search-performance-testing

    Import this code by cloning the repo

    2. Create a service connection in Azure DevOps

Next, create a service connection in Azure DevOps as described in the DevOps documentation. This service connection creates a service principal that allows Azure Pipelines to access the resource group.

    Note: Make sure to add the service connection at the subscription level (don't specify a resource group) so the service connection has access to install resource providers.

    Create a service connection

Make a note of the service connection name, as it will be used in the next step.

    3. Create the variable group

    Create a variable group named JMETER_TERRAFORM_SETTINGS as described in the DevOps documentation.

    Add the following variables to the variable group:

    • TF_VAR_JMETER_ACR_NAME = <your_azurecr_name>
    • TF_VAR_RESOURCE_GROUP_NAME = <your_rg_name>
    • TF_VAR_JMETER_DOCKER_IMAGE = <your_azurecr_name>.azurecr.io/jmeter
• TF_VAR_LOCATION = <your_preferred_azure_region> (e.g. eastus, westeurope)
    • AZURE_SERVICE_CONNECTION_NAME = <your_service_connection_name>
    • AZURE_SUBSCRIPTION_ID = <your_subscription_id>

    When you're finished, the variable group should look similar to the image below:

    Create a variable group

    4. Create and run the Docker pipeline

Create a pipeline with New Pipeline (blue button, right side), choose Azure Repos Git (YAML), select the repo you imported in step #1, and configure the pipeline with Existing Azure Pipelines YAML file; the path of the existing file is /pipelines/azure-pipelines.docker.yml.

Rename the new pipeline to jmeter-docker-build so you won't confuse it with the pipeline created in the next step (in the Pipelines tab, open the three-dot menu on the pipeline row to rename it).

    Run this pipeline before proceeding to the next step.

    5. Create the JMeter pipeline

Repeat the steps from step #4, but with the YAML file pipelines/azure-pipelines.load-test.yml, and rename the pipeline to jmeter-load-test.

    For this pipeline you will need some additional variables:

• API_KEY = <search_service_api_key> (mark it as a secret variable in Azure DevOps)
    • SEARCH_SERVICE_NAME = <your_search_service_name>
    • SEARCH_INDEX_NAME = <your_search_index_name>
    • TF_VAR_JMETER_JMX_FILE = sample.jmx
• TF_VAR_JMETER_WORKERS_COUNT = 1 (or as many JMeter workers as you want to scale out the load)

    Save the pipeline but don't run it yet. You may want to update sample.jmx as described in the next step.

    6. Define the test definition inside your JMX file

By default the test uses sample.jmx. This JMX file contains a test definition that performs HTTP requests against the <your-search-service-name>.search.windows.net endpoint over port 443.
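For orientation, each request the JMX sends is a standard ACS search call. The Python sketch below shows an equivalent request; reading the settings from environment variables, the api-version value, and the sample search term are illustrative assumptions, not part of the pipeline:

```python
import os
import requests  # assumes the 'requests' package is installed

# Values the pipeline passes to JMeter as variables, read here from the
# environment purely for illustration.
service = os.environ["SEARCH_SERVICE_NAME"]
index = os.environ["SEARCH_INDEX_NAME"]
api_key = os.environ["API_KEY"]

# One search query, equivalent to what each JMeter sampler sends over port 443.
url = f"https://{service}.search.windows.net/indexes/{index}/docs/search"
response = requests.post(
    url,
    params={"api-version": "2021-04-30-Preview"},  # assumption: use the API version your service supports
    headers={"api-key": api_key, "Content-Type": "application/json"},
    json={"search": "hotel", "top": 10},  # sample search term and page size
    timeout=30,
)
print(response.status_code, response.elapsed.total_seconds())
```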

    You can update the JMX file with the test definition of your preference. You can find more details on updating the test definition in jmeter/README.md.

    7. Run the JMeter Pipeline

    Now you're ready to run the pipeline:

    ui-run-pipeline

    Note: The variables you created in step #5 might not show up on this view, but rest assured they haven't been deleted.

    If you're using the default test configuration this should take about 10-15 minutes to complete.

    Viewing Test Results

JMeter test results are written to a JTL file (results.jtl) in CSV format. A Python script converts the JTL output to JUnit format during the pipeline so the results integrate fully with Azure DevOps test visualization.
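The converter shipped with this repo is the one the pipeline actually runs; the sketch below only illustrates the general idea, assuming the default JTL CSV columns (label, elapsed, success, failureMessage):

```python
import csv
import xml.etree.ElementTree as ET

def jtl_to_junit(jtl_path: str, junit_path: str) -> None:
    """Convert a JMeter JTL (CSV) file into a minimal JUnit XML report."""
    suite = ET.Element("testsuite", name="jmeter")
    tests = failures = 0
    with open(jtl_path, newline="") as f:
        for row in csv.DictReader(f):
            tests += 1
            case = ET.SubElement(
                suite, "testcase",
                name=row["label"],
                time=str(int(row["elapsed"]) / 1000.0),  # JTL 'elapsed' is in milliseconds
            )
            if row["success"].lower() != "true":
                failures += 1
                ET.SubElement(case, "failure", message=row.get("failureMessage", ""))
    suite.set("tests", str(tests))
    suite.set("failures", str(failures))
    ET.ElementTree(suite).write(junit_path, encoding="utf-8", xml_declaration=True)

# Example usage:
# jtl_to_junit("results.jtl", "output.xml")
```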

    Azure DevOps with successful requests

Error messages generated by JMeter for failed HTTP requests can also be seen in Azure DevOps. In this case, Service Unavailable usually means the request was throttled by the search service.

    Azure DevOps with failed requests
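If you want to quantify throttling outside Azure DevOps, you can count response codes directly in results.jtl. A minimal sketch, assuming the default responseCode column is present in the JTL output:

```python
import csv
from collections import Counter

# Tally HTTP response codes from the JMeter results; 503 usually indicates throttling.
with open("results.jtl", newline="") as f:
    codes = Counter(row["responseCode"] for row in csv.DictReader(f))

print(codes.most_common())           # e.g. [('200', 5900), ('503', 100)]
print("throttled requests:", codes.get("503", 0))
```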

    Viewing Artifacts

Several artifacts are published after the test ends, including a static JMeter dashboard and the JMeter logs.

    pipeline-artifacts

    You can also download these build artifacts using az pipelines runs artifact download.

After downloading the dashboard and unzipping it, open dashboard/index.html in your browser.

Some screenshots: jmeter-latencies and jmeter-dashboard.

    JMeter Test Configuration

The sample.jmx includes modules that configure the HTTP request, headers, and body that Azure Cognitive Search expects. It also includes subsections that configure the query distribution (e.g. 10 concurrent users per second for 1 minute) and a section, driven by an input CSV, that defines which search terms are sent (varying the terms avoids latency distortion caused by caching). For more details and examples, see the official JMeter documentation.
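If you generate that input CSV programmatically, a minimal sketch could look like this; the queries.csv file name, the vocabulary, and the single-column layout are assumptions and must match whatever CSV-reading element your JMX uses:

```python
import csv
import random

# Assumed vocabulary; widen it so repeated queries rarely hit a service-side cache.
terms = ["hotel", "beach", "wifi", "pool", "breakfast", "parking", "spa", "gym"]

with open("queries.csv", "w", newline="") as f:
    writer = csv.writer(f)
    for _ in range(1000):
        # Combine two random terms per row to increase query variety.
        writer.writerow([" ".join(random.sample(terms, 2))])
```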

If you struggle to add new modules to the .jmx (the syntax can be quite tricky), a practical approach is to build the module in JMeter's UI, save the configuration to a temporary .jmx file, inspect the new module there, and then embed it in your .jmx config file.

    Pipeline Configuration

All Terraform parameters can be configured through the JMETER_TERRAFORM_SETTINGS variable group. Please read JMeter Pipeline Settings for more details.

    Limitations

• Load test duration: For Microsoft-hosted agents, a pipeline can run for up to 1 hour (private project) or 6 hours (public project). You can use self-hosted agents to bypass this limitation.

• ACI on VNet regions: Not all regions currently support ACI and VNet integration. If you need private JMeter agents, you can deploy them in a different region and use VNet peering between the networks. Also note that vCPU and memory limits vary by region.

    Additional Documentation

    External References

    Contributing

    This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.

    When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

    This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.

