ghcrawler-cli's Introduction

Crawler command line

GHCrawler is a utility for walking GitHub APIs and tracking GitHub events. This command line app allows you to control various aspects of a crawler's behavior; there is some overlap in function with the Crawler Dashboard. This project also has a simple Node client library for talking to a crawler.

Controlling a crawler

The cc utility is in the bin directory of this repo. It can be run interactively or as a single-command processor. The general format of the command line is

node cc [options] [command]

where the available options are:

-i -- Run in interactive mode

-s <url> -- Control the crawler service running at the given url. Defaults to http://localhost:3000. You can also set the CRAWLER_SERVICE_URL environment variable.

-t -- The crawler service API token to use. This can also be supplied via the CRAWLER_SERVICE_AUTH_TOKEN environment variable. If not defined in either place, the default value of secret is used.
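
For example, the service URL and API token can be supplied through the environment instead of the -s and -t options, and the CLI can then be started interactively or used to run a single command (the values below are just the documented defaults):

export CRAWLER_SERVICE_URL="http://localhost:3000"
export CRAWLER_SERVICE_AUTH_TOKEN="secret"
node bin/cc -i          # interactive prompt
node bin/cc config      # or run a single command, e.g. dump the configuration, and exit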

The available commands are:

start [count] -- Start the crawler processing with count concurrent operations. If count is not specified, 1 is the default. On a reasonably fast network a count of 10 to 15 should be sufficient. This also depends on how many tokens you are using.

stop -- Stop crawler processing. The crawler service is left running but it stops pulling requests off the queue. This is the same as start 0.

queue <requests...> -- Queues the given requests for processing. The requests parameter is a list of GitHub "org" and/or "org/repo" names (a sketch mixing both forms follows this list).

orgs <orgs...> -- Set the crawler's org filter so that it traverses only the GitHub orgs named in the given list.

config -- Dumps the crawler service's configuration to the console.

tokens <spec...> -- Set the GitHub tokens to be used by the crawler when calling GitHub APIs. The spec value is a list of token specs. Each spec has the form <token>#<trait>,<trait>... where the token is a GitHub OAuth or Personal Access token and the comma-separated list of traits identifies what permissions the token has. The available traits are: public, admin, private. You can list as many tokens and traits as you like. Note that you can also configure the GitHub tokens with the CRAWLER_GITHUB_TOKENS environment variable before starting the crawler. For example, export CRAWLER_GITHUB_TOKENS="<token1>#public;<token2>#admin".
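
As a small sketch of the request forms accepted by queue (the repo name here is a placeholder), a single call can mix whole orgs and individual repos:

node bin/cc queue contoso-d contoso-d/some-repo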

A typical sequence, shown in the snippet below, configures the crawler with a set of tokens, sets the org filter, and then queues and starts processing of the org.

> node bin/cc
http://localhost:3000> tokens 43984b2344ca575d0f0e097efd97#public 972bbdfe098098fa9ce082309#admin
http://localhost:3000> orgs contoso-d
http://localhost:3000> queue contoso-d
http://localhost:3000> start 5
http://localhost:3000> exit

New commands are being added all the time. Check the help using node bin/cc -help.

API

API doc is coming.

Contributing

Currently we are adding commands and API as they are needed. We are very keen on adding things that users find valuable and would happily take pull requests. Adding a command is quite simple: just copy and paste an existing command and add your logic. Adding useful API is great too, though it may involve adding functionality to the service (so a bit more work).

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.

ghcrawler-cli's People

Contributors

geneh, iamwillbar, jeffmcaffer, microsoftopensource, msftgits, stuartlangridge


ghcrawler-cli's Issues

Module usage broken?

Looking at the source, I think configureCrawler() should be this.configureCrawler() in the configureOrgs and configureCount methods.

It also looks like getToken() has an unused parameter tokens.
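
A minimal sketch of the failure mode described above, with illustrative names rather than the actual ghcrawler-cli source: inside a class method, a bare call to a sibling method throws because methods are only reachable through this.

class CrawlerClient {
  configureCrawler(patch) {
    console.log('sending config patch', patch);
  }
  configureOrgs(orgs) {
    // configureCrawler({ orgList: orgs });    // ReferenceError: configureCrawler is not defined
    this.configureCrawler({ orgList: orgs });  // works
  }
}
new CrawlerClient().configureOrgs(['contoso-d']);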

Adding tokens from cli fails with parsing error

node bin/cc tokens 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaa#private' makes the crawler server throw an error in its log:

crawler_1    | { SyntaxError: Unexpected token "
crawler_1    |     at parse (/opt/ghcrawler/node_modules/body-parser/lib/types/json.js:83:15)
crawler_1    |     at /opt/ghcrawler/node_modules/body-parser/lib/read.js:116:18
crawler_1    |     at invokeCallback (/opt/ghcrawler/node_modules/raw-body/index.js:262:16)
crawler_1    |     at done (/opt/ghcrawler/node_modules/raw-body/index.js:251:7)
crawler_1    |     at IncomingMessage.onEnd (/opt/ghcrawler/node_modules/raw-body/index.js:307:7)
crawler_1    |     at emitNone (events.js:86:13)
crawler_1    |     at IncomingMessage.emit (events.js:185:7)
crawler_1    |     at endReadableNT (_stream_readable.js:974:12)
crawler_1    |     at _combinedTickCallback (internal/process/next_tick.js:80:11)
crawler_1    |     at process._tickCallback (internal/process/next_tick.js:104:9)
crawler_1    |   body: '"aaaaaaaaaaaaaaaaaaaaaaaaaaaaaa#private"',
crawler_1    |   status: 400,
crawler_1    |   statusCode: 400 }

This seems to be a quoting issue: bin/cc joins all the tokens together with ; and then passes the result as a single JSON-encoded string in the request body. The body is therefore not aaaaaaaaaaaaaaaaaaaaaaaaaaaaaa#private but "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaa#private", and the receiver doesn't appear to expect the quotes; that is, the sender sends the value as JSON while the receiver expects a non-JSON body. I'm not sure whether the real bug is cc sending it wrongly or the server's expectation.
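
A small illustration of the suspected mismatch, using a placeholder value rather than the actual ghcrawler-cli code:

// JSON-encoding a string adds surrounding quotes to the payload.
const spec = 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaa#private';
const asPlainText = spec;             // aaaaaaaaaaaaaaaaaaaaaaaaaaaaaa#private
const asJson = JSON.stringify(spec);  // "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaa#private"

// If the sender posts asJson but the receiver treats the body as plain text, the
// quotes leak into the stored value; if the sender posts asPlainText but the
// receiver runs it through a JSON parser, JSON.parse throws a SyntaxError.
console.log(asPlainText, asJson);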

Notes:
I'm using crawler-in-a-box.
I've also tried talking directly to the server config API with curl, but that doesn't help; I get the same error, and if I pass the string unquoted in the body then the /config/tokens endpoint returns 404.
