farfetch / loadshedding

A .NET library created to help applications apply load shedding techniques and make them easy to configure.

Home Page: https://farfetch.github.io/loadshedding/

License: MIT License

Makefile 0.10% Dockerfile 0.82% C# 94.23% JavaScript 3.91% CSS 0.94%
c-sharp concurrency-limiter csharp dotnet dotnet-core farfetch loadshedding middleware queue rate-limiting

loadshedding's People

Contributors

ailtonguitar, brmagadutra, dependabot[bot], gsferreira, joaorodriguesgithub, lpcouto, qzlp2p, ruiqbarbosa


loadshedding's Issues

[Feature Request]: Support Diagnostics.Metrics instead of prometheus-net

Is your request related to a problem you have?

N/A

Describe the solution you'd like

What about supporting System.Diagnostics.Metrics instead of prometheus-net? With Metrics, the instruments can be consumed through OpenTelemetry, for example.
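
A minimal sketch of what the mapping could look like, assuming illustrative meter and instrument names (not the library's current metric set); an OpenTelemetry MeterProvider could then collect these via AddMeter:

using System.Diagnostics.Metrics;

public static class LoadSheddingMeters
{
    // Meter name is illustrative; OpenTelemetry would subscribe with AddMeter("Farfetch.LoadShedding").
    private static readonly Meter Meter = new("Farfetch.LoadShedding");

    // Gauge-like instrument for items currently waiting in the queue.
    public static readonly UpDownCounter<long> QueueItems =
        Meter.CreateUpDownCounter<long>("http_requests_queue_items");

    // Histogram of queue waiting times, in seconds.
    public static readonly Histogram<double> QueueTime =
        Meter.CreateHistogram<double>("http_requests_queue_time", unit: "s");

    // Counter of rejected requests.
    public static readonly Counter<long> Rejected =
        Meter.CreateCounter<long>("http_requests_rejected");
}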

Are you able to help bring it to life and contribute with a Pull Request?

Yes

Additional context

No response

[Feature Request]: Report waiting time for no waiting tasks

Is your request related to a problem you have?

When a task doesn't go to the queue, a waiting time of zero is not reported for it, which causes the queue time histogram to have periods without data and not represent the complete queue time data.

If we only want the histogram of non-zero queue times, it would still be possible to exclude the zero bucket and obtain a new histogram with only the buckets > 0.

Additionally, the queue counter is never reported as 0 for an application that has never had an element in the queue, as this metric is only reported in the dequeue event.

Describe the solution you'd like

A queue time of zero should be reported in the histogram for all processed events, and the same applies to the queue counter. That way, the very first task would already report a queue size of zero and a waiting time of zero.

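A minimal sketch of the proposed behaviour, assuming hypothetical type and event names (not the library's actual implementation): the waiting time and queue size are recorded for every processed task, including the fast path where no queuing happened.

using System;
using System.Diagnostics;
using System.Threading;
using System.Threading.Tasks;

public sealed class QueueTimeReporter
{
    private readonly SemaphoreSlim _slots;
    private int _queuedItems;

    public QueueTimeReporter(int concurrencyLimit) =>
        _slots = new SemaphoreSlim(concurrencyLimit, concurrencyLimit);

    // (queueTimeSeconds, queuedItems) reported for every processed task.
    public event Action<double, int>? Processed;

    public async Task AcquireAsync(CancellationToken cancellationToken)
    {
        double queueTimeSeconds = 0;

        // Fast path: a free slot means zero waiting time, but it is still reported below.
        if (!_slots.Wait(0, cancellationToken))
        {
            Interlocked.Increment(ref _queuedItems);
            var start = Stopwatch.GetTimestamp();
            await _slots.WaitAsync(cancellationToken);
            queueTimeSeconds = Stopwatch.GetElapsedTime(start).TotalSeconds;
            Interlocked.Decrement(ref _queuedItems);
        }

        // Histogram sample and queue size are emitted even when nothing was queued.
        Processed?.Invoke(queueTimeSeconds, Volatile.Read(ref _queuedItems));
    }

    public void Release() => _slots.Release();
}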

Are you able to help bring it to life and contribute with a Pull Request?

Yes

Additional context

I've already done a local POC and I have the code that I can push to the server for comments from maintainers.

[Bug Report]: http_requests_queue_items_total is not zero after process entire queue

Prerequisites

  • I have searched issues to ensure it has not already been reported

Description

The http_requests_queue_items_total metric is used as a gauge to report the total number of tasks waiting in the queue.

Sometimes, after the queue is emptied, the gauge still reports more than 0 items in the queue.

Steps to reproduce

  1. Open the solution
  2. Go to the AdaptativeConcurrencyLimiterTests test class
  3. Change the assertions in the AssertMetrics method as shown below
  4. Run the GetAsync_WithReducedLimitAndQueueSize_SomeRequestsAreRejected test until failure (use Run Until Failure in the Test Explorer)
  5. After 5 attempts the test should fail
Assert.Contains("http_requests_concurrency_items_total{method=\"GET\",priority=\"normal\"} 0", content);
Assert.Contains("http_requests_concurrency_limit_total", content);
Assert.Contains("http_requests_task_processing_time_seconds", content);
Assert.Contains("http_requests_queue_items_total{method=\"GET\",priority=\"normal\"} 0", content);
Assert.Contains("http_requests_queue_limit_total", content);
Assert.Contains("http_requests_queue_time_seconds_sum{method=\"GET\",priority=\"normal\"}", content);
Assert.Contains("http_requests_queue_time_seconds_count{method=\"GET\",priority=\"normal\"}", content);
Assert.Contains("http_requests_queue_time_seconds_bucket{method=\"GET\",priority=\"normal\",le=\"0.0005\"}", content);
Assert.Contains("http_requests_rejected_total{method=\"GET\",priority=\"normal\",reason=\"max_queue_items\"}", content);

Expected behavior

After processing all the tasks, the http_requests_queue_items_total and http_requests_concurrency_items_total metrics must be 0 for all methods and priorities.

Actual behavior

When the test fails, http_requests_queue_items_total is 1 when it should be 0.

I also saw one failing test where http_requests_queue_items_total{method="GET"} was 1 and http_requests_queue_items_total{method="UNKNOWN"} was -1.
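
Purely as an illustration of how such values can appear (hypothetical code, not a claim about the library's implementation): if the increment and decrement of a labelled gauge resolve different method labels, one series stays at 1 and another goes to -1 even though the queue is empty.

using Prometheus;

public static class LabelMismatchExample
{
    private static readonly Gauge QueueItems = Metrics.CreateGauge(
        "http_requests_queue_items_total",
        "Total number of requests waiting in the queue.",
        "method", "priority");

    public static void Main()
    {
        QueueItems.WithLabels("GET", "normal").Inc();      // enqueue resolves method = GET
        QueueItems.WithLabels("UNKNOWN", "normal").Dec();  // dequeue resolves method = UNKNOWN
        // Result: {method="GET"} == 1 and {method="UNKNOWN"} == -1 while the queue is empty.
    }
}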

LoadShedding version

1.0.0

[Bug Report]: Load Shedding hides errors

Prerequisites

  • I have searched issues to ensure it has not already been reported

Description

I have a .NET 8 Web API that I've been building out, and I added LoadShedding to it today.

I've noticed that any time an exception is thrown, instead of returning a 500 error to the client, an empty 200 response is returned.

Steps to reproduce

  1. Create a basic .NET 8 Web API (Controllers) application
  2. Create a controller as below.
  3. Add load shedding with defaults
  4. Run the application and use a tool like Postman to hit the endpoint
[ApiController]
[Route("[controller]")]
public sealed class EchoController : ControllerBase
{
    [HttpGet("test/error")]
    public async Task<IActionResult> TestError(CancellationToken cancellationToken = default)
    {
        throw new Exception("Something went wrong");
    }
}

The same issue happens for Unauthorized responses:

    [HttpGet("test/auth")]
    [Authorize]
    public async Task<IActionResult> TestAuth(CancellationToken cancellationToken = default)
    {
        return Ok("Authorized");
    }

If you hit this endpoint while not authenticated, you also get a blank 200 response.

Expected behavior

I'd expect error responses to be preserved so that client applications can respond correctly.
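
For reference, a minimal sketch of the propagation behaviour I'd expect, written as a generic ASP.NET Core middleware with illustrative names (not the library's actual code):

using Microsoft.AspNetCore.Http;
using System.Threading.Tasks;

public sealed class SheddingMiddleware
{
    private readonly RequestDelegate _next;

    public SheddingMiddleware(RequestDelegate next) => _next = next;

    public async Task InvokeAsync(HttpContext context)
    {
        // ... admission / queueing decisions would happen here ...

        // The downstream call is awaited without a catch block that swallows
        // exceptions, so a thrown exception still surfaces as the usual 500
        // response and 401/403 challenge results keep their status codes.
        await _next(context);
    }
}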

Actual behavior

A blank 200 response is returned for errors.

LoadShedding version

1.0.0

[Feature Request]: Threshold-Based Load Shedding Feature

Is your request related to a problem you have?

The feature would introduce the ability to configure specific thresholds for system metrics (e.g., CPU usage, memory usage, request rate), beyond which the middleware would start shedding load. This approach allows for finer control over when load shedding should occur, ensuring critical resources are preserved for the most important operations and complementing auto-scaling by providing an additional layer of performance management.

Describe the solution you'd like

In a scenario where an ASP.NET Core application experiences sudden spikes in traffic, auto-scaling might not immediately accommodate the increased load, leading to potential performance degradation. With a CPU usage threshold set at 75%, the middleware can begin shedding less critical requests, maintaining the responsiveness of the application until auto-scaling can bring additional resources online.
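
A rough sketch of the idea, using hypothetical option and middleware names (nothing here is the library's existing API) and a pluggable CPU probe:

using System;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;

public sealed class ThresholdSheddingOptions
{
    public double CpuUsagePercentThreshold { get; set; } = 75;
}

public sealed class ThresholdSheddingMiddleware
{
    private readonly RequestDelegate _next;
    private readonly ThresholdSheddingOptions _options;
    private readonly Func<double> _cpuUsagePercent; // pluggable probe, e.g. fed by EventCounters

    public ThresholdSheddingMiddleware(
        RequestDelegate next,
        ThresholdSheddingOptions options,
        Func<double> cpuUsagePercent)
    {
        _next = next;
        _options = options;
        _cpuUsagePercent = cpuUsagePercent;
    }

    public async Task InvokeAsync(HttpContext context)
    {
        // Shed the request when the probe reports CPU usage above the configured threshold.
        if (_cpuUsagePercent() > _options.CpuUsagePercentThreshold)
        {
            context.Response.StatusCode = StatusCodes.Status503ServiceUnavailable;
            return;
        }

        await _next(context);
    }
}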

Are you able to help bring it to life and contribute with a Pull Request?

No

Additional context

No response
