Comments (14)
@stefanprodan The question is why you have to set/create a new counter every time. You just want to increment the counter, right?
Memory cache is just keeping data in memory. You could use Interlocked.Increment on the entry.Value.TotalRequests field, and it would be completely thread safe.
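A minimal sketch of that idea. Note the assumption: in the real library RateLimitCounter is a struct stored by value in the cache, so this sketch uses a hypothetical mutable reference-type entry (CounterEntry) with a long field that Interlocked can target.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical mutable counter entry; the real RateLimitCounter is a struct,
// so this assumes the cached entry is changed to a reference type.
public class CounterEntry
{
    public DateTime Timestamp;
    public long TotalRequests;
}

public class Program
{
    public static void Main()
    {
        var entry = new CounterEntry { Timestamp = DateTime.UtcNow };

        // 100 concurrent "requests" incrementing the same entry without any lock.
        Parallel.For(0, 100, _ => Interlocked.Increment(ref entry.TotalRequests));

        Console.WriteLine(entry.TotalRequests); // 100
    }
}
```

Because the entry lives in memory and is shared by reference, every request sees the same field and the increment is atomic without serializing the whole request pipeline.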
from aspnetcoreratelimit.
I believe that building a named locker that locks on the counter id is a better solution. The following blog post explains how to implement one: http://johnculviner.com/achieving-named-lock-locker-functionality-in-c-4-0/
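A minimal sketch of such a named locker, assuming the pattern from the linked post: one lock object per counter id, handed out from a ConcurrentDictionary, so only requests that share a counter id contend with each other.

```csharp
using System;
using System.Collections.Concurrent;

// Sketch of a named locker: one lock object per name (e.g. per counter id).
public class NamedLocker
{
    private readonly ConcurrentDictionary<string, object> _locks =
        new ConcurrentDictionary<string, object>();

    // Same name always maps to the same lock object.
    public object GetLock(string name) =>
        _locks.GetOrAdd(name, _ => new object());

    // Run the critical section while holding only the per-name lock.
    public void RunWithLock(string name, Action body)
    {
        lock (GetLock(name))
        {
            body();
        }
    }

    // Locks for expired counters can be removed so the dictionary doesn't grow forever.
    public void RemoveLock(string name) => _locks.TryRemove(name, out _);
}
```

Usage would look like `locker.RunWithLock(counterId, () => { /* get, increment, set */ });`, so two requests from different IPs never wait on each other.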
That lock is applied to the get/set of the rate limit key; it doesn't affect the rest of the request processing. How is that related to sync/async?
Req1 goes through the middleware, say IpRateLimitMiddleware, which calls processor.ProcessRequest(identity, rule), which in turn calls _core.ProcessRequest(requestIdentity, rule), which acquires a lock on a static object at lock (_processLocker).
Now Req2 comes along, flows through the same path as Req1, and waits for Req1 to release the lock at lock (_processLocker).
Isn't that bottle-necking all requests at that lock?
Imagine we are making a network call to fetch _counterStore.Get(counterId) from some sort of distributed cache. The impact will be much greater, because the lock hold time per request will be much higher (say 10 ms). If the machine receives 100 requests at the same instant, the 1st request will take a minimum of 10 ms, the 2nd a minimum of 20 ms, the 3rd 30 ms, and the 100th 100 × 10 ms = 1 s (and this keeps growing), since every request competes for the lock on that one static object.
Correct me if I'm wrong.
That lock is there for the local cache, which is in memory in the same process. If you switch to Redis as a distributed cache, there is no need for that lock because Redis has atomic get/set operations. When the Redis provider for .NET Core becomes stable I will add a lock-free distributed cache implementation to AspNetCoreRateLimit.
Got it. But even with the in-memory local cache it will slow down requests, and if a DoS attack happens, the response time of every request will be affected heavily. Throttling is meant to prevent exactly that, yet here, with the lock on a static object, all requests get bottle-necked, even the genuine ones.
I feel the lock should be replaced with a concurrent (thread-safe) dictionary/map (or something of that sort), so that requests don't have to wait on other requests (unless they come from the same IP or the same client), thus safeguarding the requests of a genuine (non-throttled) client from a throttling client (DoS attacker).
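A sketch of that suggestion, assuming a plain ConcurrentDictionary keyed by client/IP: AddOrUpdate gives an atomic per-key increment, so requests with different keys never block each other. (As noted later in the thread, this sketch has no expiry, so the dictionary would grow unbounded in a real rate limiter.)

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

// Sketch of per-client counting with ConcurrentDictionary: requests from
// different keys (IPs/clients) never wait on each other.
public class Program
{
    public static readonly ConcurrentDictionary<string, long> Counters =
        new ConcurrentDictionary<string, long>();

    // Atomically add the key with count 1, or increment the existing count.
    public static long CountRequest(string clientKey) =>
        Counters.AddOrUpdate(clientKey, 1, (_, current) => current + 1);

    public static void Main()
    {
        // 50 concurrent requests split across two IPs.
        Parallel.For(0, 50, i => CountRequest(i % 2 == 0 ? "10.0.0.1" : "10.0.0.2"));

        Console.WriteLine(Counters["10.0.0.1"]); // 25
        Console.WriteLine(Counters["10.0.0.2"]); // 25
    }
}
```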
If you use a thread-safe dictionary you will still have a lock when you look up in that dictionary; otherwise how is it thread-safe? Also, that dictionary will grow over time, and the lookup will have a greater impact on your app than a simple lock. A thread-safe dictionary has no expiry mechanism, so you will end up storing all the IPs and user ids there, which is of no use to the rate limiter.
I see that the concurrent dictionary doesn't lock the whole array, just a part of it. So it could reduce lock contention a little, but I don't see how you can clean up the dictionary since it has no expiry feature.
HttpRuntime.Cache might be the answer here instead of a plain concurrent dictionary; it would take care of expiry as well. See http://stackoverflow.com/questions/33969/best-way-to-implement-request-throttling-in-asp-net-mvc
I'm using the ASP.NET Core cache, which is similar to HttpRuntime.Cache. I think you should look at what operations I am doing inside the lock: neither the ASP.NET Core cache nor HttpRuntime.Cache has an atomic increment feature.
@stefanprodan Core's MemoryCache is thread safe and uses a concurrent dictionary under the hood.
https://github.com/aspnet/Caching
Let me explain the code inside the lock a little (counter is initialized outside the lock with a fresh timestamp and a count of 1):

lock (_processLocker)
{
    // thread safe operation
    var entry = _counterStore.Get(counterId);
    if (entry.HasValue)
    {
        // entry has not expired
        if (entry.Value.Timestamp + rule.PeriodTimespan.Value >= DateTime.UtcNow)
        {
            // increment request count (THIS IS THE CRITICAL OPERATION)
            var totalRequests = entry.Value.TotalRequests + 1;

            // deep copy
            counter = new RateLimitCounter
            {
                Timestamp = entry.Value.Timestamp,
                TotalRequests = totalRequests
            };
        }
    }

    // thread safe operation
    _counterStore.Set(counterId, counter, rule.PeriodTimespan.Value);
}
A lock-free atomic increment is offered by Redis's INCR command, and as far as I know there is no such thing in the .NET Core cache.
Wouldn't you just use Interlocked.CompareExchange?
https://github.com/Icehunter/litterbox/blob/master/LitterBox.Redis/Models/ConnectionPool.cs#L58
That's how you get "atomic" in C#; marking a field volatile only guarantees visibility, not an atomic read-modify-write.
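A sketch of a lock-free increment built on Interlocked.CompareExchange: read the current value, compute the next one, and swap it in only if nobody changed the value in between, retrying otherwise. (The field name _totalRequests is illustrative, not from the library.)

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

public class Program
{
    public static long _totalRequests;

    // Lock-free increment via a compare-and-swap retry loop.
    public static long IncrementWithCas()
    {
        while (true)
        {
            long seen = Interlocked.Read(ref _totalRequests);
            long next = seen + 1;
            // Publish 'next' only if _totalRequests is still 'seen'; otherwise retry.
            if (Interlocked.CompareExchange(ref _totalRequests, next, seen) == seen)
                return next;
        }
    }

    public static void Main()
    {
        Parallel.For(0, 1000, _ => IncrementWithCas());
        Console.WriteLine(_totalRequests); // 1000
    }
}
```

The retry loop is what Interlocked.Increment does for you in a single call; CompareExchange is the general form when the update is more than a plain +1 (e.g. also checking a timestamp).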
Check this out #63