Comments (12)
What is "limits"? I can't find any reference to it in the Sidekiq source.
The client isn't configured to scale-up on mailchimp, is that correct?
from autoscaler.
It's from sidekiq-limit_fetch, and it limits how many concurrent jobs get run at once from a specific queue. Without it, there is a chance I could be running more than 10 API requests to Mailchimp; at my current tier, that would cause a failure.
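For reference, a minimal sketch of how sidekiq-limit_fetch expresses this cap, based on that gem's documented configuration (the queue name `mailchimp` and the cap of 10 mirror the setup described above):

```ruby
# sidekiq-limit_fetch caps per-queue concurrency. The static form lives in
# config/sidekiq.yml:
#
#   :limits:
#     mailchimp: 10
#
# The gem also documents a dynamic API for adjusting the cap at runtime:
require 'sidekiq-limit_fetch'

# At most 10 mailchimp jobs will run concurrently across all processes.
Sidekiq::Queue['mailchimp'].limit = 10
```

This is a fetch-time limit, so excess jobs simply wait in the queue rather than tying up worker threads the way a connection pool would.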
Ah, nifty. I've been using connection pools for that kind of thing, but it could lock up threads while waiting.
Any thoughts on the client config?
I'm having a similar problem as Hadees... I too have two queues, and I can't get the worker to autostart, but it will stop on its own. I've tried your complex.rb sample (which, if I'm not mistaken, creates two workers), but I couldn't get that to work either. I think I was messing something up (I copied the two lines into the Procfile, but I'm not sure the settings were right)... As you can see, I'm a bit of a newbie to ruby/rails, but I'll work on this later today, and if I have any insight, I'll share it here.
@JustinLove I'm not sure what you mean about the client config. Should I have done something differently?
Your code has 'default' => heroku. It will only scale up when starting the default queue - you'd need to add 'mailchimp' => heroku.
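A sketch of what that suggestion looks like in the initializer, following autoscaler's documented client middleware (the exact surrounding setup in Hadees's app may differ):

```ruby
# Every queue that should wake the worker dyno needs its own entry in the
# client middleware's queue map.
require 'autoscaler/sidekiq'
require 'autoscaler/heroku_scaler'

heroku = Autoscaler::HerokuScaler.new

Sidekiq.configure_client do |config|
  config.client_middleware do |chain|
    # Scale up when jobs are pushed to either queue, not just 'default'.
    chain.add(Autoscaler::Sidekiq::Client,
              'default'   => heroku,
              'mailchimp' => heroku)
  end
end
```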
Well, I just tried using just the one default queue, and it still doesn't start the worker process. So maybe this should be its own issue, but I figure maybe Hadees might try having just the default queue and see if that starts up. If I scale the worker up manually, it will shut itself off once its work is done.
Perhaps it is something with my Procfile? My Procfile is currently:
web: bundle exec rails server thin -p $PORT
worker: bundle exec sidekiq
should it be something else? Is there anything else? I have the Procfile, and the simple.rb example file copied verbatim as sidekiq.rb in the config/initializers folder...
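For comparison, here is the rough shape of the single-queue setup the simple example describes, assuming autoscaler's documented middleware API (the actual simple.rb in the repository may differ in detail):

```ruby
# config/initializers/sidekiq.rb -- one scaler controls the single 'worker'
# process. The client middleware scales up when a job is pushed; the server
# middleware scales down after 60 idle seconds.
require 'autoscaler/sidekiq'
require 'autoscaler/heroku_scaler'

heroku = nil
if ENV['HEROKU_APP'] # only scale when running on Heroku
  heroku = Autoscaler::HerokuScaler.new
  Sidekiq.configure_client do |config|
    config.client_middleware do |chain|
      chain.add(Autoscaler::Sidekiq::Client, 'default' => heroku)
    end
  end
end

Sidekiq.configure_server do |config|
  config.server_middleware do |chain|
    chain.add(Autoscaler::Sidekiq::Server, heroku, 60) if heroku
  end
end
```

Note that the client half runs in the web process, so if `HEROKU_APP` (or whatever guard the initializer uses) isn't set on the web dynos, nothing will ever scale the worker up.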
I'm a hands-on kind of guy - it's really hard to say much about your application without seeing it run and probing things.
I've created a trivial Rails app that implements both of the example configurations. If you can't get this to work, then I've forgotten to document a setup step. Please run through it again and write down everything you did so that we can try to pinpoint it.
If this does work, create a fork with a minimal reproduction for the setup you are trying to use.
https://github.com/JustinLove/autoscaler_sample
Okay, I was able to get the autoscaler_sample app to work, so it looks like I'm doing something wrong with my app. I am trying to use autoscaler with the carrierwave_backgrounder gem to do image processing in a background process that autoscales; is it possible it is doing something differently that makes it incompatible? There are definitely two queues, a default queue and a "carrierwave" queue (though I could change that to anything), and I can see them both in the Sidekiq web monitoring Sinatra app. (I installed the web monitoring on the autoscaler_sample app to see if it looks the same, and it does, except for the names of the queues.) So there are definitely jobs enqueued that are waiting for workers to pick them up, but the autoscaler doesn't seem to recognize them with the carrierwave gem, though it works fine in your autoscaler_sample one. Is there some way the worker needs to be called/enqueued that makes a difference?
Sorry if this is way beyond the scope of the autoscaler gem; I will continue to look at this and/or try a different tack if need be. Thanks.
My first thought is the client config - do you have a 'carrierwave' => heroku mapping?
If you add your own job to the carrierwave queue, is it different? The gem is using the undocumented client_push method, but that just wraps the same Sidekiq::Client.push that I'm using in the sample, so it ought to be hitting the middleware.
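One way to run that experiment from a Rails console: define a trivial worker of your own on the 'carrierwave' queue and enqueue it through Sidekiq's public API. `PingWorker` here is a hypothetical stand-in, not part of either gem:

```ruby
require 'sidekiq'

# A throwaway job pinned to the same queue carrierwave_backgrounder uses.
class PingWorker
  include Sidekiq::Worker
  sidekiq_options queue: 'carrierwave'

  def perform
    Sidekiq.logger.info 'ping from carrierwave queue'
  end
end

# perform_async goes through Sidekiq::Client.push, so the autoscaler client
# middleware should see it and scale the dyno up -- provided the middleware's
# queue map includes a 'carrierwave' entry.
PingWorker.perform_async
```

If this job wakes the worker but carrierwave_backgrounder's jobs don't, that narrows the problem to how that gem enqueues its work.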
Thanks for your help, Justin. I grafted your sample app into my application to see if there's some sort of configuration problem. Your little TWEET job works just fine: it starts up and closes down the workers without a problem. I even went ahead and uploaded a file to put the image-processing worker into the default queue. Nothing happened; then I clicked the default job button, and that spooled up the worker, both jobs were completed, and the worker spun down. So it must be something with the way the carrierwave_backgrounder gem is doing things. If I can figure out how, I'll try to extract the relevant parts of that gem, stick them into my app manually, and play around.
I am having the same problem with my workers being shut down by autoscaler, but never restarted when new jobs arrive in the queues.
My Procfile is not set up the same way, though; I have no sidekiq.yml:
web: bundle exec rails server puma -p $PORT -e $RACK_ENV
critical: env HEROKU_PROCESS=critical bundle exec sidekiq -c 4 -q critical,1
default: env HEROKU_PROCESS=default bundle exec sidekiq -c 2 -q default,1
Anything obviously wrong with this setup? Should I be using the sidekiq.yml?
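For a Procfile like the one above, the gem's complex example suggests giving each Sidekiq process its own scaler, keyed by HEROKU_PROCESS. A sketch under that assumption (the exact middleware arguments may differ from the current autoscaler release):

```ruby
# config/initializers/sidekiq.rb -- two independently scaled processes,
# matching the 'critical' and 'default' Procfile entries above.
require 'autoscaler/sidekiq'
require 'autoscaler/heroku_scaler'

# One scaler per Heroku process type.
scalers = {
  'critical' => Autoscaler::HerokuScaler.new('critical'),
  'default'  => Autoscaler::HerokuScaler.new('default'),
}

Sidekiq.configure_client do |config|
  config.client_middleware do |chain|
    # Map each queue to the scaler for the process that consumes it.
    chain.add(Autoscaler::Sidekiq::Client, scalers)
  end
end

Sidekiq.configure_server do |config|
  config.server_middleware do |chain|
    me = ENV['HEROKU_PROCESS']
    # Each process watches only its own queue and scales itself down
    # after 60 idle seconds.
    chain.add(Autoscaler::Sidekiq::Server, scalers[me], 60, [me]) if scalers[me]
  end
end
```

The key point is that scale-up happens in the *client* (web) process via the queue-to-scaler map, so a missing entry there produces exactly the "shuts down but never restarts" symptom described in this thread.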