warp's Issues

Continuous delivery of tricorders and toolchains

We want to keep tricorders/toolchains pre-built for the architectures we support. We can do this by writing a pipeline that installs warp using our GitHub Action (see #38) and then executes the following warp pack commands:

warp pack https://rules.warp.build/toolchains/openssl --upload
warp pack https://rules.warp.build/toolchains/erlang --upload
warp pack https://rules.warp.build/toolchains/elixir --upload
warp pack https://rules.warp.build/toolchains/git --upload
warp pack https://rules.warp.build/toolchains/cmake --upload
warp pack https://rules.warp.build/toolchains/rebar3 --upload
warp pack tricorders/beam/mix.exs --upload
warp pack tricorders/rust/Cargo.toml --upload

This pipeline should run whenever main changes, and it should generate a commit with the new hashes for all the packaged artifacts.

Add `warp fmt` command

Add a command that formats all of your code using the formatters you've already configured (or the default ones if you haven't!)

Make Rust tricorder bootstrappable

In short, this command doesn't work:

warp build tricorders/rust/Cargo.toml

because the Rust tricorder currently doesn't emit signatures for Cargo.toml files. A cargo_binary or cargo_library rule would suffice to make the tricorder build itself.

We can improve on the granularity of the caching here later.

Add flag to only read rules and manifests from local folders

This would fix the edge cases of #47 while also letting us test the rest of the system on any change to rules/manifests.

The main change here is to make calls like RuleStore::get and PublicStore::try_fetch return early after reading a file from disk. Namely, that'd be ./rules/{rules,toolchains}/** and ./store/**/manifest.json.
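
A minimal sketch of that early return, assuming the flag has been threaded into the store (the field names and error handling here are invented):

use std::path::PathBuf;

pub struct RuleStore {
    // Invented fields: where local rule files live, plus the new flag.
    local_rules_root: PathBuf,
    local_only: bool,
}

impl RuleStore {
    pub fn get(&self, name: &str) -> std::io::Result<String> {
        let local = self.local_rules_root.join(format!("{name}.js"));
        if self.local_only || local.exists() {
            // With the flag set we never touch the network:
            // rules come straight from ./rules/{rules,toolchains}/**.
            return std::fs::read_to_string(&local);
        }
        // ...otherwise fall through to the existing network fetch.
        todo!("existing network fetch")
    }
}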

Refactor Rule names to always be URLs

At the moment we have a few places where Rule names are being handled as bare strings, which can be the short-name of the rule (like mix_release) or the full name (like https://rules.warp.build/rules/mix_release.js).

I think we should just use URLs for this and normalize as soon as we get a string from a Tricorder.
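
For instance, something like this, using the url crate (the base URL for short names matches the example above; the function name is invented):

use url::Url;

/// Normalize a rule name coming from a Tricorder into a full URL.
/// Short names like "mix_release" get resolved against the default base.
fn normalize_rule_name(raw: &str) -> Result<Url, url::ParseError> {
    if raw.starts_with("http://") || raw.starts_with("https://") {
        Url::parse(raw)
    } else {
        Url::parse(&format!("https://rules.warp.build/rules/{raw}.js"))
    }
}

fn main() {
    let short = normalize_rule_name("mix_release").unwrap();
    let full = normalize_rule_name("https://rules.warp.build/rules/mix_release.js").unwrap();
    assert_eq!(short, full);
}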

Make `--force-redownload` apply to rules too

For testing purposes, the /warp/rules directory can't be trusted to contain the latest rules. We already use the --force-redownload flag to force downloading manifests for published artifacts. If we make this flag also apply to rules, redownloading them on every run, we can be sure we have the latest published rules when the code executes.

Even better would be to have a flag to read rules from the current local code, but this would be a first step.
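
The check itself should be tiny; a sketch, with invented names:

/// Decide whether a rule cached under /warp/rules can be reused.
/// With --force-redownload we treat every cached rule as stale.
fn rule_is_stale(force_redownload: bool, cached_rule: &std::path::Path) -> bool {
    force_redownload || !cached_rule.exists()
}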

Specifying implicit toolchain versions

Some toolchains do not have a version that is readable from configuration files. One example is protoc, which is usually expected to be installed globally via brew or apt.

To handle these, we should add support in the Warpfile for defining toolchain versions:

{
  "toolchains": {
    "protobuf": "3.21.12"
  }
}
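
On the implementation side, parsing that section could look roughly like this with serde (the struct is a sketch, not warp's actual Warpfile type):

use std::collections::BTreeMap;

use serde::Deserialize;

#[derive(Debug, Deserialize)]
struct Warpfile {
    /// Toolchain name -> pinned version, e.g. "protobuf" -> "3.21.12".
    #[serde(default)]
    toolchains: BTreeMap<String, String>,
}

fn main() -> Result<(), serde_json::Error> {
    let raw = r#"{ "toolchains": { "protobuf": "3.21.12" } }"#;
    let warpfile: Warpfile = serde_json::from_str(raw)?;
    assert_eq!(warpfile.toolchains["protobuf"], "3.21.12");
    Ok(())
}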

Open Source Checklist

Hi team, this is a checklist of stuff that we should have in place before announcing the first release. Would love to hear your thoughts and if anything is missing here.

  • CD:
    • CI passes for Linux and macOS
    • New releases are created on tags and published to GitHub Releases
  • Docs:
    • README.md includes basic information about the project (see https://github.com/denoland/deno/blob/main/README.md)
    • docs.warp.build is deployed from main
    • Installation page shows how to install warp from source
    • Concepts page explains our core concepts (Target, Signature, Tricorder, Phases, etc)
    • CONTRIBUTING.md includes instructions for
      • How to set up the project locally
      • How to contribute (PR etiquette, commit messages, etc)

Feel free to pick up any of this work; just make an issue for it and go bananas 🚀

cc/ @capitanu @diogomqbm @Enkronan

Add `warp license` command

Since we have a complete graph of everything that's being built, we can provide a command that will list all the licenses that are being used.
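
Since the graph is already in memory, the core of it is just a fold over the nodes; a sketch with invented node types:

use std::collections::BTreeMap;

// Invented node type; warp's real build-graph types will differ.
struct Node {
    label: String,
    license: Option<String>, // e.g. "Apache-2.0"
}

/// Group every target in the graph by the license it declares.
fn licenses_by_target(graph: &[Node]) -> BTreeMap<String, Vec<&str>> {
    let mut out: BTreeMap<String, Vec<&str>> = BTreeMap::new();
    for node in graph {
        let license = node.license.clone().unwrap_or_else(|| "UNKNOWN".into());
        out.entry(license).or_default().push(&node.label);
    }
    out
}

fn main() {
    let graph = vec![
        Node { label: "src/verl.erl".into(), license: Some("Apache-2.0".into()) },
        Node { label: "openssl".into(), license: Some("Apache-2.0".into()) },
    ];
    for (license, targets) in licenses_by_target(&graph) {
        println!("{license}: {}", targets.join(", "));
    }
}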

Add `warp gc` command

Since we know how often we are accessing certain artifacts in the local store, we can keep track of the last access to allow for routine cleanups of the store.

This will help us keep the store size down to fairly stable numbers.
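
A sketch of the cleanup pass, assuming we bump the mtime of a last_access marker file on every cache hit (the marker is a made-up convention, not something the store does today):

use std::fs;
use std::path::Path;
use std::time::{Duration, SystemTime};

/// Delete store entries that haven't been accessed within `max_age`.
fn gc(store_root: &Path, max_age: Duration) -> std::io::Result<()> {
    let now = SystemTime::now();
    for entry in fs::read_dir(store_root)? {
        let dir = entry?.path();
        let stamp = dir.join("last_access");
        let Ok(meta) = fs::metadata(&stamp) else { continue };
        let Ok(last_access) = meta.modified() else { continue };
        if now.duration_since(last_access).unwrap_or_default() > max_age {
            fs::remove_dir_all(&dir)?;
        }
    }
    Ok(())
}

fn main() -> std::io::Result<()> {
    // e.g. collect anything untouched for 30 days (store path for illustration).
    gc(Path::new("/warp/store"), Duration::from_secs(30 * 24 * 60 * 60))
}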

Make `warp pack` update local manifest files

Since we have the ./store folder in the same repo now, we can make warp pack update the hashes in the manifests automatically. This will keep us from forgetting to update them, and also aid in continuously delivering warp-packable things (like the Tricorders).

The gist of this is that after calling warp pack <target> we have access to all the build results, and also the Package that came out of the warp.pack method. This Package includes an internal manifest that includes everything we need to write to disk.

The only thing we need to make sure of is that we are not modifying the hashes of other architectures.

So if you're running warp pack on an M1 Mac, that should only modify the aarch64-apple-darwin hashes.
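
Treating the manifest as loosely-typed JSON, that guard is a one-key update (the "hashes" field name is an assumption about the manifest layout):

use serde_json::{json, Value};

/// Update only the hash of the host triple, leaving every other
/// architecture's hash exactly as it was on disk.
fn update_manifest(manifest: &mut Value, host_triple: &str, new_hash: &str) {
    manifest["hashes"][host_triple] = json!(new_hash);
}

fn main() {
    let mut manifest = json!({
        "hashes": {
            "aarch64-apple-darwin": "old-hash",
            "x86_64-unknown-linux-gnu": "untouched"
        }
    });
    update_manifest(&mut manifest, "aarch64-apple-darwin", "new-hash");
    assert_eq!(manifest["hashes"]["aarch64-apple-darwin"], "new-hash");
    assert_eq!(manifest["hashes"]["x86_64-unknown-linux-gnu"], "untouched");
}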

Add `fetch` command to try to populate the cache from the public store

I wanted to download the latest version of Git from the Public Store without having to build it locally, whenever it was available for my architecture. This command would do just that: download if possible, otherwise just say we have to build it.

$ warp fetch https://rules.warp.build/toolchains/elixir
  Downloading https://rules.warp.build/toolchains/elixir
  Finished ...

$ warp fetch https://rules.warp.build/toolchains/git
  The target "https://rules.warp.build/toolchains/git" has never been built before.
  To build it run: warp build https://rules.warp.build/toolchains/git
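
The control flow is small; a sketch with a stubbed-out store client (the real download path would go through the public store):

use std::error::Error;

// Stub for the real public-store lookup: returns whether the artifact
// was found and unpacked for the host architecture.
fn try_download(target: &str) -> Result<bool, Box<dyn Error>> {
    let _ = target;
    Ok(false) // pretend nothing is published for this architecture
}

fn fetch(target: &str) -> Result<(), Box<dyn Error>> {
    if try_download(target)? {
        println!("  Finished {target}");
    } else {
        println!("  The target \"{target}\" has never been built before.");
        println!("  To build it run: warp build {target}");
    }
    Ok(())
}

fn main() -> Result<(), Box<dyn Error>> {
    fetch("https://rules.warp.build/toolchains/git")
}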

The test reporter is reporting wrong counts

When running the command below, we see that all the right things are being handled, BUT the counts of the different events are off.

We have built 5 things, and we have cached 5 things, but there is no mention of the 11 tests that were run and cached.

; warp test --public-store-metadata-url http://localhost:9000 --force-redownload ./test/verl_SUITE.erl
     Started http://localhost:9000/tricorder/beam/manifest.json
    Readying http://localhost:9000/tricorder/beam/manifest.json
       Ready http://localhost:9000/tricorder/beam/manifest.json
   Cache-hit https://rules.warp.build/toolchains/openssl  
   Cache-hit https://rules.warp.build/toolchains/erlang  
   Cache-hit src/verl_parser.erl  
   Cache-hit src/verl.erl  
        PASS ./test/verl_SUITE.erl parse_test (CACHED)
        PASS ./test/verl_SUITE.erl compare_test (CACHED)
        PASS ./test/verl_SUITE.erl between_test (CACHED)
        PASS ./test/verl_SUITE.erl compile_requirement_test (CACHED)
        PASS ./test/verl_SUITE.erl ./test/verl_SUITE.erl (CACHED)
        PASS ./test/verl_SUITE.erl lte_test (CACHED)
        PASS ./test/verl_SUITE.erl lt_test (CACHED)
        PASS ./test/verl_SUITE.erl gt_test (CACHED)
        PASS ./test/verl_SUITE.erl is_match_test (CACHED)
        PASS ./test/verl_SUITE.erl parse_requirement_test (CACHED)
        PASS ./test/verl_SUITE.erl eq_test (CACHED)
        PASS ./test/verl_SUITE.erl gte_test (CACHED)
    Finished ./test/verl_SUITE.erl in 865ms (5 built, 5 cached)

What we'd expect to see would be:

    Finished ./test/verl_SUITE.erl in 865ms (11 passed, 0 errors, 5 built, 16 cached)

And also, one specific line where we built a library (the test/verl_SUITE.erl library) reads:

        PASS ./test/verl_SUITE.erl ./test/verl_SUITE.erl (CACHED)

where it should read:

   Cache-hit ./test/verl_SUITE.erl
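
One way to keep these counts honest is a single counter struct that every reporter event flows through, so a cached test counts as both a pass and a cache hit (the event type here is invented, not warp's real one):

#[derive(Default)]
struct Counts {
    passed: usize,
    errors: usize,
    built: usize,
    cached: usize,
}

// Invented reporter events for illustration.
enum Event {
    Built { cache_hit: bool },
    TestPassed { cache_hit: bool },
    TestFailed,
}

impl Counts {
    fn record(&mut self, event: &Event) {
        match event {
            Event::Built { cache_hit: true } => self.cached += 1,
            Event::Built { cache_hit: false } => self.built += 1,
            Event::TestPassed { cache_hit } => {
                self.passed += 1;
                if *cache_hit {
                    self.cached += 1; // cached test results are cache hits too
                }
            }
            Event::TestFailed => self.errors += 1,
        }
    }
}

fn main() {
    let mut counts = Counts::default();
    counts.record(&Event::Built { cache_hit: true });
    counts.record(&Event::TestPassed { cache_hit: true });
    counts.record(&Event::TestFailed);
    println!(
        "({} passed, {} errors, {} built, {} cached)",
        counts.passed, counts.errors, counts.built, counts.cached
    );
    // prints: (1 passed, 1 errors, 0 built, 2 cached)
}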

Move to `normpath`

The normpath crate has fixed a bunch of issues with Path manipulation that we'll need in order to be safe when shipping Windows binaries.
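
Migration should be mostly mechanical; a minimal example, assuming the crate's PathExt::normalize API:

use std::path::Path;

use normpath::PathExt;

fn main() -> std::io::Result<()> {
    // normalize() resolves `.` and `..` without producing the `\\?\`
    // verbatim paths that Path::canonicalize returns on Windows.
    let here = Path::new(".").normalize()?;
    println!("{}", here.as_path().display());
    Ok(())
}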

End-to-end Tests to Write

  • We can run warp on a new repo and we'll automatically create a Warpfile for it. No complaints.
    • If we're in a subfolder, we'll go up until we find a .git folder, which we take to be the root of the repo (see the sketch after this list).
  • We can call warp build @all and it'll just queue everything in the workspace.
  • warp build @all honors .gitignore
  • When running warp build @all, if we find unknown extensions we should just ignore them
  • When the network is unreliable, we retry a few times before giving up
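
For the subfolder case, the root lookup is just a walk up the ancestor chain; a sketch:

use std::path::{Path, PathBuf};

/// Walk up from `start` until we find a directory containing `.git`,
/// which we take to be the root of the repo.
fn find_repo_root(start: &Path) -> Option<PathBuf> {
    start
        .ancestors()
        .find(|dir| dir.join(".git").exists())
        .map(Path::to_path_buf)
}

fn main() {
    let cwd = std::env::current_dir().expect("no current dir");
    match find_repo_root(&cwd) {
        Some(root) => println!("repo root: {}", root.display()),
        None => println!("not inside a git repo"),
    }
}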

Make `warp setup` idempotent

Right now on macOS this can lead to multiple volumes being created. A simple check for /warp's existence should be enough to skip this step.
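
Roughly (the real setup steps stay behind the check):

use std::path::Path;

fn setup() -> std::io::Result<()> {
    // If /warp already exists, a previous setup created the volume;
    // skip the creation steps instead of making another one.
    if Path::new("/warp").exists() {
        return Ok(());
    }
    // ...run the existing volume-creation steps here.
    Ok(())
}

fn main() -> std::io::Result<()> {
    setup()
}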

Honor .gitignore files

Right now the glob expansion that Warp does doesn't always respect ignored files, so sometimes we pick up files that don't end up as part of the official sources.

This happened while pairing with @diogomqbm: we found a folder in the BEAM tricorder that had been gitignored, but locally it was being picked up and built by warp.

It's an edge case, but good to consider this more generally.
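
The ignore crate (the walker behind ripgrep) already implements the .gitignore semantics, so one option is to route file discovery through it instead of raw glob expansion; a sketch:

use ignore::WalkBuilder;

fn main() {
    // WalkBuilder honors .gitignore, .ignore, and global git excludes
    // by default, so gitignored folders never reach the source list.
    for result in WalkBuilder::new(".").build() {
        match result {
            Ok(entry) if entry.file_type().map_or(false, |t| t.is_file()) => {
                println!("{}", entry.path().display());
            }
            Ok(_) => {}
            Err(err) => eprintln!("walk error: {err}"),
        }
    }
}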

Support for running warp from anywhere inside of the workspace

Right now, you can only run warp with a target from the root of the project. This needs to be extended to accept relative paths from the current directory.

We should probably use the invocation dir and the workspace root to figure out the difference and then construct a relative path to the root of the workspace.
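
That difference is a join plus a strip_prefix; a sketch (ignoring `..` components, which the normpath work above would help with):

use std::path::{Path, PathBuf};

/// Rebase a target path given relative to the invocation dir so that
/// it becomes relative to the workspace root instead.
fn rebase_target(invocation_dir: &Path, workspace_root: &Path, target: &Path) -> Option<PathBuf> {
    let absolute = invocation_dir.join(target);
    absolute
        .strip_prefix(workspace_root)
        .ok()
        .map(Path::to_path_buf)
}

fn main() {
    let root = Path::new("/code/warp");
    let cwd = Path::new("/code/warp/tricorders/rust");
    let rebased = rebase_target(cwd, root, Path::new("Cargo.toml")).unwrap();
    assert_eq!(rebased, Path::new("tricorders/rust/Cargo.toml"));
}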
