Comments (11)

j0k3r avatar j0k3r commented on May 24, 2024 1

Also, thanks for creating that issue.
I usually check the RabbitMQ queues to see if everything is working well.
In that case it was my Redis server that was full. Every message failed because of that without my noticing.

Every new release/tag should be retrieved in the next hour or so.

j0k3r avatar j0k3r commented on May 24, 2024 1

I ran a quick query on MySQL for you and I got 16,828 versions:

SELECT username, COUNT(version.id)
FROM `user`
LEFT JOIN star ON star.user_id = user.id
LEFT JOIN repo ON repo.id = star.repo_id
LEFT JOIN version ON version.repo_id = repo.id
WHERE user.id = '2591991'
GROUP BY username;

username   | COUNT(version.id)
JanJastrow | 16828

Which means you'll need at least 16828 calls (to get the release info) + 16828 calls (to convert the release notes to Markdown) = 33656 API calls before you get everything.

5000 isn't enough 🙂

j0k3r avatar j0k3r commented on May 24, 2024

No, they haven't changed anything on their side.
Banditore burns a lot of rate limit when checking for new versions. It uses cached requests to avoid burning too much, but in any case, if you have a lot of repos to check you'll hit the limit quite quickly.
This is what I explained in the README: https://github.com/j0k3r/banditore#github-client-discovery
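As a side note on that caching: GitHub does not count a conditional request against the hourly quota when it answers 304 Not Modified, which is how an HTTP response cache can save a lot of calls. A rough way to see this from the command line (the token and ETag value are placeholders):

# first request: note the ETag response header (replace <token> with a personal access token)
curl -sS -D - -o /dev/null -H "Authorization: token <token>" https://api.github.com/repos/j0k3r/banditore/tags

# repeat with If-None-Match: a 304 Not Modified answer is not counted against the 5000/hour limit
curl -sS -D - -o /dev/null -H "Authorization: token <token>" -H 'If-None-Match: "<etag-from-first-call>"' https://api.github.com/repos/j0k3r/banditore/tags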

JanJastrow avatar JanJastrow commented on May 24, 2024

Hi again,

Thank you for fixing your server.
I'd still like to run my own instance (to take the load off your shoulders), but I still hit the GitHub rate limit.

[217/731] Check 17070487 …
[218/731] Check 17148588 …
09:40:52 ERROR     [console] Error thrown while running command "-e prod banditore:sync:versions". Message: "You have reached GitHub hourly limit! Actual limit is: 5000"
["exception" => Github\Exception\ApiLimitExceedException { …},"com…

Since it's "only" 731 stars, I don't understand why I run into the 5000 limit.
Any idea how I can fix/debug this?

-- edit --
So far I have only run the console command manually; I plan to run it with cron.
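One way to debug this is to look at how much of the hourly quota is actually left after a run. A minimal sketch against GitHub's rate-limit endpoint (the token is a placeholder):

curl -sS -H "Authorization: token <token>" https://api.github.com/rate_limit
# the "core" object shows the limit, what is remaining and when the quota resets for that token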

j0k3r avatar j0k3r commented on May 24, 2024

Did you properly define redis_dsn_guzzle_cache & redis_dsn_doctrine_cache?

JanJastrow avatar JanJastrow commented on May 24, 2024
redis_dsn_guzzle_cache: 'redis://127.0.0.1:6379/2'
redis_dsn_doctrine_cache: 'redis://127.0.0.1:6379/3'

They are both set in parameters.yml.
Sorry, I don't really work with Redis, so I don't know what to set.
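To verify those DSNs actually point at a reachable Redis instance, a quick check from the shell (database numbers 2 and 3 taken from the parameters above):

redis-cli -n 2 ping
redis-cli -n 3 ping
# after a sync run, the Guzzle cache database should no longer be empty
redis-cli -n 2 dbsize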

JanJastrow avatar JanJastrow commented on May 24, 2024

I'm also not able to run the server 🤷‍♂️

sudo -u www-data php bin/console server:run -e prod
> There are no commands defined in the "server" namespace.

So I'm trying nginx instead, but I'm also struggling to find a working config for it.
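For reference, a minimal nginx server block for a Symfony 3 style layout (which is what banditore appears to use, with the front controller in web/app.php) might look like the sketch below; the hostname, document root and PHP-FPM socket are placeholders that need to match your setup:

server {
    listen 80;
    server_name banditore.example.com;              # placeholder hostname
    root /var/www/banditore/web;                    # placeholder path to the web/ directory

    location / {
        # send everything that is not a real file to the front controller
        try_files $uri /app.php$is_args$args;
    }

    location ~ ^/app\.php(/|$) {
        fastcgi_pass unix:/var/run/php/php-fpm.sock;   # adjust to your PHP-FPM socket or TCP address
        fastcgi_split_path_info ^(.+\.php)(/.*)$;
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $realpath_root$fastcgi_script_name;
        fastcgi_param DOCUMENT_ROOT $realpath_root;
        internal;
    }

    # deny direct access to any other PHP file
    location ~ \.php$ {
        return 404;
    }
}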

j0k3r avatar j0k3r commented on May 24, 2024

There is no web server shipped with banditore.
You should indeed use a dedicated web server such as nginx instead.
Do you have a running instance of Redis on your server?

JanJastrow avatar JanJastrow commented on May 24, 2024

Yes, the service is running and the logfile looks good (?):

588:M 05 Jul 13:14:44.289 * Background saving terminated with success
588:M 05 Jul 13:19:45.065 * 10 changes in 300 seconds. Saving...
588:M 05 Jul 13:19:45.076 * Background saving started by pid 7222
7222:C 05 Jul 13:19:46.143 * DB saved on disk
7222:C 05 Jul 13:19:46.148 * RDB: 1 MB of memory used by copy-on-write
588:M 05 Jul 13:19:46.182 * Background saving terminated with success
588:M 05 Jul 13:24:47.032 * 10 changes in 300 seconds. Saving...
588:M 05 Jul 13:24:47.041 * Background saving started by pid 7554
7554:C 05 Jul 13:24:48.111 * DB saved on disk
7554:C 05 Jul 13:24:48.114 * RDB: 1 MB of memory used by copy-on-write
588:M 05 Jul 13:24:48.146 * Background saving terminated with success

This is the log after I ran banditore:sync:versions

j0k3r avatar j0k3r commented on May 24, 2024

After all, I think everything works fine.
The first run of the version sync takes longer than the following ones because it needs to retrieve ALL versions/tags for each repository, so it consumes more API calls the first time.
Try to re-run the command after one hour, and so on. Once the backlog is caught up, it should be fine on a daily basis.
This is what I explained here: https://github.com/j0k3r/banditore#github-client-discovery

JanJastrow avatar JanJastrow commented on May 24, 2024

Well, that explains it a bit.
I'll now add a cronjob for an hourly run and see if it runs well.

Thank you for your help.
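For reference, an hourly cron entry for the sync could look like the following sketch (in /etc/cron.d syntax; the install path and log file are placeholders):

0 * * * *  www-data  php /var/www/banditore/bin/console banditore:sync:versions -e prod >> /var/log/banditore-sync.log 2>&1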
