
github-config's Introduction

GitHub config

This repository contains config files common to implementation and language-family CNBs.

Rules

Run scripts/sanity.sh to see if the changes you made to this repo are valid.

Run scripts/repo_rules.sh to see if your Paketo CNB GitHub repo has the recommended settings.

How do I consume this common config?

If you just wrote a new CNB, run bootstrap.sh as follows:

# type is either "implementation", "language-family", or "builder"
./scripts/bootstrap.sh --target <path/to/your/cnb> --repo-type <type>

This copies the relevant config files into your CNB repository. Commit and push the changes.

Now, to wire up your CNB repo to receive relevant updates as pull requests:

  • Append your repo name to the relevant file here
  • Configure deploy keys and secrets as required by the workflows

Submit your change to this repo as a PR. You should be all set when the PR is merged.

github-config's People

Contributors

arjun024, brayanhenao, christopherclark, dependabot[bot], dwillist, foresteckhardt, joshuatcasey, joshzarrabi, maliksalman, pacostas, paketo-bot, pbusko, phil9909, robdimsdale, ryanmoran, sophiewigmore, thitch97


github-config's Issues

Update list of buildpacks

The list of buildpacks in implementation-cnbs and language-family-cnbs is out of date. It is missing poetry, and some of the buildpacks, like the Python family, are listed as being under paketo-community instead of paketo-buildpacks. This assumes the lists are still in use; they haven't been updated in over a year.

Push buildpacks to CNB registry

Note - this issue applies to all non-Java Paketo meta and implementation buildpacks. Filing it here for now, as I'm not sure where it belongs.

We should look into publishing all meta/implementation Paketo buildpacks to the CNB registry, now that it has been implemented as per this RFC. This would involve running pack register-buildpack or leveraging existing GitHub Actions every time a new buildpack is published to GCR/Dockerhub. Ultimately, publishing all buildpacks to the CNB registry will make it easier for users to discover the buildpacks that exist.

Automate bumping jam

Whenever a release of pack is cut that involves changes to jam, we have to manually bump the version in the tools.sh script for both the implementation and language-family directories, as we did in this commit. If we forget to bump the version, buildpack repos end up with out-of-sync versions of jam and packit. We should automate this process so we don't have to remember to bump the jam version manually every time.

Post-synchronize hook for update-github-config workflows

Describe the Enhancement

I would like to be able to optionally declare a per-repository hook that can run as part of the update-github-config workflows. I would want it to run after copying files but before committing them. This will allow modification of the synchronized files on a per-repository basis.

Possible Solution

I don't want to over-generalize by providing any sort of templating functionality. I think the right level of abstraction for this problem is a shell script inside the repository which the update-github-config workflows execute if it is present. I would expect this file to exist at ./.github/sync-post-hook.

I think this post-hook logic should live inside the sync action itself, but I could fairly easily be convinced it should be a part of the workflow instead.

Motivation

The specific case I am trying to support: I want to add a secrets section to the create-stack job for some stacks but not others, while still sharing a common set of workflows for all stacks.

Inaccessible integration tests - asks for $GIT_TOKEN

When I try to run integration tests using ./scripts/integration.sh on any buildpack (go-dist logs shown here), it fails with the following error:

Fetching GIT_TOKEN
<path-to-buildpack>/scripts/.util/git.sh: line 18: lpass: command not found

...

    init_test.go:71:
        Unexpected error:
            <*errors.errorString | 0xc0004f2010>: {
                s: "unexpected response status: 401 Unauthorized",
            }
            unexpected response status: 401 Unauthorized
        occurred

Then, after I figured out that I needed the lpass CLI and installed it, it hit a different error:

Fetching GIT_TOKEN
Error: Could not find decryption key. Perhaps you need to login with `lpass login`.

...
    init_test.go:71:
        Unexpected error:
            <*errors.errorString | 0xc0004f2010>: {
                s: "unexpected response status: 401 Unauthorized",
            }
            unexpected response status: 401 Unauthorized
        occurred

The error points to go-dist init_test.go#L71, where occam's BuildpackStore/Freezer seems to expect the $GIT_TOKEN variable to be set to a valid GitHub token.

It looks like the issue is solved if I get access to the following LastPass directory:

lpass show Shared-CF\ Buildpacks/concourse-private.yml


Problem: These tests are not accessible to the community

Proposal: either make loading GIT_TOKEN optional, or fail the test and force users to pass/load their own GitHub token (not the nicest idea)
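A sketch of the first option, assuming Bash (fetch_git_token is an illustrative name, not the current function in scripts/.util/git.sh): prefer an already-exported GIT_TOKEN and only fall back to lpass when it is installed.

```shell
fetch_git_token() {
  if [ -n "${GIT_TOKEN:-}" ]; then
    # Community members can export their own token and skip LastPass entirely.
    printf '%s' "${GIT_TOKEN}"
  elif command -v lpass >/dev/null 2>&1; then
    # Path taken from the issue; requires access to the shared directory.
    lpass show --notes "Shared-CF Buildpacks/concourse-private.yml"
  else
    echo "GIT_TOKEN is not set and lpass is not installed; export GIT_TOKEN with a valid GitHub token" >&2
    return 1
  fi
}
```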

Language family buildpackages should be published to Paketo GCR Registry

Context
Now that we're publishing buildpackages as language family release artifacts, it would be great if these could also be published to GCR so that users can easily consume them.

Proposal
Publish language family buildpackages to the Paketo GCR registry (gcr.io/paketo-buildpacks/<LANGUAGE-FAMILY>)

Deliverables

  • Node.js buildpackages are available at gcr.io/paketo-buildpacks/nodejs
  • PHP buildpackages are available at gcr.io/paketo-buildpacks/php
  • .NET Core buildpackages are available at gcr.io/paketo-buildpacks/dotnet-core
  • Go buildpackages are available at gcr.io/paketo-buildpacks/go

Community Buildpackages:

  • Ruby buildpackages are available at gcr.io/paketo-community/ruby

Validate release after it is created in stacks create-release workflow

The stacks release workflow creates a GitHub release that should have several artifacts attached to it. The artifacts (e.g. *-build-receipt.txt) are crucial for future runs of the release workflow, and malformed artifacts could break future workflow runs. Therefore, it may be useful to add a step to the workflow that validates that all artifacts were uploaded successfully.

Language Family Buildpacks should have Release Notes

It would be great if we could have release notes for all Paketo language family buildpacks.

As a start, could we roll up the auto-generated release notes into the following tables:

  • Dependencies table that lists all dependencies, versions, and stack IDs
  • Default dependencies table that lists the defaults for all dependencies
  • Supported stacks table that lists the supported stacks

I'm imagining something like this:
Dependencies:

name | version | stacks
go | 1.14.2 | io.buildpacks.stacks.bionic, org.cloudfoundry.stacks.cflinuxfs3, org.cloudfoundry.stacks.tiny
go | 1.14.1 | io.buildpacks.stacks.bionic, org.cloudfoundry.stacks.cflinuxfs3, org.cloudfoundry.stacks.tiny
go | 1.13.10 | io.buildpacks.stacks.bionic, org.cloudfoundry.stacks.cflinuxfs3, org.cloudfoundry.stacks.tiny
go | 1.13.9 | io.buildpacks.stacks.bionic, org.cloudfoundry.stacks.cflinuxfs3, org.cloudfoundry.stacks.tiny
dep | 0.5.4 | io.buildpacks.stacks.bionic, org.cloudfoundry.stacks.cflinuxfs3, org.cloudfoundry.stacks.tiny
mod | 0.5.4 | io.buildpacks.stacks.bionic, org.cloudfoundry.stacks.cflinuxfs3, org.cloudfoundry.stacks.tiny

Default dependencies:

name | version
go | 1.13.*

Supported stacks:

name
io.buildpacks.stacks.bionic
org.cloudfoundry.stacks.cflinuxfs3
org.cloudfoundry.stacks.tiny

Relevant issue - paketo-buildpacks/packit#14

Add retry with exponential backoff bash helper function

Describe the Enhancement

There are a variety of situations where we observe intermittent failures running commands (e.g. docker push).

I would like to propose we add a bash helper function to retry with exponential backoff. It could look something like this.
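A minimal sketch of such a helper, assuming a POSIX-ish shell (the retry name and message wording are illustrative):

```shell
# Retry a command up to a maximum number of attempts, doubling the wait
# between attempts (1s, 2s, 4s, ...).
# Usage: retry <attempts> <command> [args...]
retry() {
  attempts="$1"
  shift
  delay=1
  n=1
  while [ "${n}" -le "${attempts}" ]; do
    if "$@"; then
      return 0
    fi
    if [ "${n}" -lt "${attempts}" ]; then
      echo "Attempt ${n} of ${attempts} failed; retrying in ${delay}s..." >&2
      sleep "${delay}"
      delay=$(( delay * 2 ))
    fi
    n=$(( n + 1 ))
  done
  echo "Command failed after ${attempts} attempts: $*" >&2
  return 1
}
```

A call site would then read something like retry 5 docker push "${image}".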

I'm not sure where the best place for this function is, given it would be used in multiple different places. I'm assuming we would put a copy in the scripts/.util/ directory for each type of repo (language family, stacks, etc) as well as in the root scripts/.util directory for use in CI. @paketo-buildpacks/tooling-maintainers does this sound about right to you?

Release Note Automation

Some of our repositories have changelogs in their release notes (check out packit's release notes for an example). This information is helpful for users, but assembling it can be time-consuming for maintainers. It's also not standardized across our buildpacks and other repos.

We should create some kind of automation that will auto-generate a changelog in the release notes when the create-draft-release workflow runs so we can have consistent notes across our community.

There should be a mechanism to propagate config files to implementation/meta cnb repos

Here's a proposal:

This is how the structure of the repository should be:

├── .github
│   ├── (the github action workflow for this repo that sends dispatch to target repos)
├── actions
│   ├── (generic actions)
├── implementation
│   ├── (things that should go into implementation cnbs only)
├── language-family
│   ├── (things that should go into lang family cnbs only)
├── common
│   ├── (things that should go into both implementation & lang family cnbs)
│   ├── (this will have the dispatch receiver github workflow)

Sender workflow

There will be three sender workflows, residing in .github.
Each will be very similar to implementation/workflows/send-dependency-update.yml, which makes use of actions/dispatch, except that:

on="push to implementation OR language-family OR common directories"
event="working-dir-update"
repo=<LIST OF TARGET REPOSITORIES> (all implementation CNBs for the first workflow, all language-family CNBs for the second, etc.)
payload: |
        {
          "commit": <HEAD>,
          "pathmap": {
            "/implementation": "/"
          }
        }

pathmap should be a mapping relative to the top-level directory of the Git working tree. It dictates how the updates in github-config should be mapped to the target repo. The above example is for the implementation cnb propagation workflow.

Receiver workflow

There will be 1 receiver workflow that will reside in common.
It will be very similar to language-family/workflows/update-buildpack-toml.yml.
On receiving a dispatch of type working-dir-update, it will clone github-config at payload.commit, copy the directory contents with src and dest dictated by pathmap, and create a PR.
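Applying one pathmap entry on the receiving side could be sketched like this (apply_pathmap_entry is an illustrative helper, not an existing script):

```shell
# Copy the contents of one github-config subdirectory (src) into the
# mapped location (dest) in the target repo's working tree.
apply_pathmap_entry() {
  config_dir="$1"   # checkout of github-config at payload.commit
  target_dir="$2"   # checkout of the target CNB repo
  src="$3"          # e.g. /implementation
  dest="$4"         # e.g. /
  mkdir -p "${target_dir}${dest}"
  cp -R "${config_dir}${src}/." "${target_dir}${dest}/"
}
```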

Auto-merging

There have to be .mergify config files in both the implementation and language-family directories. (We could have put them in common, but the conditions to merge are different.)

They will take care of merging the PR generated in the above step, completing the propagation of config files.

Blockers

  • Add support to send dispatches to multiple targets in action/dispatch

cc @thitch97

Multiple draft releases created

Context
When two or more PRs are merged into a meta buildpack close together, there is a window in which one merging PR deletes the previous draft release; the later PR then sees no draft release and creates one with the same version as the other PR, resulting in two or more draft releases of the same version.

Proposal
Figure out a way to set up the action such that it either makes it impossible to merge several PRs in rapid succession, or is smart enough to wait for the draft that is building.

Outcome
It is impossible to end up in a situation where there are multiple draft releases for the same version on a metabuildpack.

Language family buildpacks' release notes should have a summary of implementation buildpack changes

What happened?

  • What were you attempting to do?
    Understand all of the changes that went into a given language family buildpack release

  • What did you expect to happen?
    I expected that the changelog would have some helpful information about the changes that were made to the implementation buildpacks

  • What was the actual behavior? Please provide log output, if possible.
    The changelog only contained a list of PRs named "Updates buildpacks in buildpack.toml". I had to look at each of these and then look at the release notes for the corresponding implementation buildpacks that were updated in the language family in order to understand the overall feature changes for the language family buildpack.

It would be great if the language family buildpack release notes contained some sort of aggregate of the changes to the implementation buildpacks since the last language family release.

Checklist

  • I have included log output.
  • The log output includes an error message.
  • I have included steps for reproduction.

Remove mergify instructions from README

The team has moved away from using mergify for auto-approving & auto-merging PRs but the README still recommends turning it on for repos that need the functionality. We should remove those instructions.

Add automation to file issues on workflow failures

Context

Recently, we added automation to our stack release repositories to file an issue and tag maintainers if the release workflow fails.

We added this because our automation fails silently, and it's easy to miss unless you look at the workflows.

Issue

We should consider adding a similar mechanism for our buildpack release workflows so that we're made aware of failures as they happen, rather than days or weeks later when we go to cut a release. It may also be helpful to implement this automation step for the push-buildpackage workflows as well.

Outcome

A workflow step that will file an issue on release/buildpackage push failures for our buildpacks.

Add tests to stack related actions

There are some actions for the stacks workflow that are written in Go. We should write tests for these actions to ease maintenance going forward.

Make cron action to update buildpack dependencies

In issue paketo-buildpacks/packit#140 @ryanmoran outlined a full plan to manage dependency updates in a more transparent manner using the dependency server. We are breaking that issue down into workable pieces.

This issue is to implement a GitHub Actions workflow that will take advantage of the jam update-dependencies command, which is outlined in paketo-buildpacks/packit#178, to automatically keep buildpack dependencies up to date based on information from the dep-server.

The output of this workflow should be a PR whenever there are updated versions of existing dependency version lines.

Note
Keep in mind this action should be written such that it does not break buildpacks that do not yet have dependency constraints in their buildpack.toml. Buildpacks that haven't been switched over to the dep-server will still consume dependencies from the old pipeline, and should be able to do so even if the workflow exists in their .github directory.

When it's time to switch a buildpack over to this automation, the CNB can be removed from the old pipeline configuration, dependency-constraint metadata can be added to the buildpack.toml, and this workflow will begin to work.
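The guard could be as simple as checking the buildpack.toml for the constraints section before doing anything. A sketch (needs_dependency_update and run_update are illustrative names, and the grep assumes the key contains "dependency-constraints"):

```shell
# Succeeds only when the buildpack.toml has adopted dependency
# constraints; un-migrated buildpacks are skipped without failing.
needs_dependency_update() {
  grep -q 'dependency-constraints' "$1"
}

run_update() {
  if needs_dependency_update "$1"; then
    echo "running jam update-dependencies"
  else
    echo "no dependency constraints found; skipping update"
  fi
}
```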

Tool for automatically updating buildpacks in builder in CI

Hi there! 👋

We would really like to use your existing tooling for updating buildpack dependencies in our custom builder. Do you have plans to separately publish and support this update tooling? We would prefer to use this instead of creating our own!

https://github.com/paketo-buildpacks/github-config/tree/main/actions/builder/update

Alternatively, it looks like there is a Dockerfile for this tool that could perhaps be published as an image to the same location as the buildpacks and builders?

Consider running integration tests on a schedule

Recently we have observed integration test failures in some python buildpacks due to a change in the lifecycle. We only observed these failures because PRs were submitted and those PRs ran the integration tests. We then ran the integration tests on the main branches and they failed for the same reason.

We could have improved the feedback loop by running the integration tests on a schedule, so I would like to propose we do that. I assume the shared config would live in this repo.

@paketo-buildpacks/tooling-maintainers thoughts?

Fix auto-merge PR to run after labels are added

Since the introduction of the Set Validate PR Labels workflow, the auto-merge workflow step often fails to merge in cases when it should, because it runs before the labels are added and the PR is in a "blocked" state. It's not a huge problem, but maintainers then have to perform a manual step to merge the PR. This is the whole reason the auto-merge functionality exists, so we should make sure it works.

We should determine a clean way to ensure that the auto merge workflow re-runs when labels are added, or waits to run until the PR review is submitted as well as labels being added.

[Investigation] Identify un-patched packages to determine when to update stacks

The goal of the get-usns action is to automatically trigger stack rebuilds when a stack contains vulnerable packages.

Currently, the get-usns action takes a JSON array of recently patched USNs as input. It uses the input to determine whether there are new USNs to patch. It also takes a JSON array of packages as input. These are the packages that the action will look for in the USNs' "Affected Packages". If there are new USNs with relevant affected packages, the pipeline will rebuild the stack to upgrade the packages to patched versions.

The current approach is indirect: it doesn't rebuild the stack because the stack contains vulnerable versions; it infers that the stack contains vulnerable versions from the presence of a newly-seen USN.

It would be more direct to somehow trigger a stack rebuild if a USN is published that recommends upgrading a package that we have installed on our stack.

One potential way of doing this: scrape package names and versions from the "Update Instructions" page of each USN (see this USN for an example) and compare the patched version to the version installed in the stack. If the installed version is lower than the patched version, trigger a rebuild.

Some key questions to consider:

  • How can we avoid brittleness in our USN data scraping? Can we consume data from something other than human-readable HTML?
  • Are packages listed in the "Update Instructions" section enumerated in a consistent, scrape-able way? Are they versioned consistently?
  • How can we compare versions of OS packages? Is it as simple as SemVer?
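On the last question: no, Debian/Ubuntu package versions (epoch:upstream-revision) are not SemVer. dpkg ships the canonical comparison, which the action could shell out to rather than re-implementing it. A sketch, assuming dpkg is available (stack_package_outdated is an illustrative name):

```shell
# Succeeds when the installed version is strictly older than the patched
# version from the USN, per Debian version-comparison rules.
stack_package_outdated() {
  dpkg --compare-versions "$1" lt "$2"
}
```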

release-create action has no protection against duplicate releases

See [1] and [2]. They were both triggered on the same commit 1486adb and created 2 separate releases pointing to the same commit. It's not clear how they were triggered (maybe a manual re-run, maybe a GitHub Actions inconsistency), but this duplication caused the next run of the workflow to fail.

Expected behavior:
[2] should have failed with the error "Release already exists against this base commit"


[1] https://github.com/paketo-buildpacks/node-engine/runs/713774619?check_suite_focus=true
[2] https://github.com/paketo-buildpacks/node-engine/runs/717336391?check_suite_focus=true

Always update version of pack when a new version is available

When there is a new version of pack available but I already have a version installed under ./.bin/pack in an implementation buildpack, these scripts do not update pack to the newer version. I think they should.

I think there are valid arguments for not updating it, but I think it is simpler to know that we are always running against the latest version when we run integration tests

I think we would expand the logic in .util/tools.sh to run ./.bin/pack version and compare the versions.
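The comparison could be sketched as follows (version_lt is an illustrative name; GNU sort -V provides the version ordering):

```shell
# Succeeds when $1 sorts strictly before $2 in version order.
version_lt() {
  [ "$1" != "$2" ] && [ "$(printf '%s\n%s\n' "$1" "$2" | sort -V | head -n 1)" = "$1" ]
}
```

tools.sh would then re-download pack whenever version_lt "$(./.bin/pack version)" "${latest}" succeeds.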

I'm happy to submit a PR if folks agree this is worth doing.

Auto-label job should fail green

I feel that the auto-label job should fail green, i.e., exit successfully even when it is unable to apply labels. This means PRs that cannot be auto-labeled will not be marked as failing just because they cannot be auto-labeled.

Prior Art:
This is the current behavior of the auto-merge job.

Fix auto-labeler / auto-merger

Oftentimes when a Dependabot PR comes out, the auto-semver labeling step fails with the following output:
Canceling since a higher priority waiting request for 'pr_labels' exists

When this job is re-run, labels are successfully added, but then the PR auto-merge workflow does not re-run, so maintainers have to manually 1. re-run the labeler or add a label and then 2. hand merge the PR.

We should fix both the problem with the auto-labeler, and figure out how to configure the auto-merger to re-run.

Replace deprecated GHA ::set-output syntax

Looks like we will need to update the way we output info from jobs in github actions: https://github.blog/changelog/2022-10-11-github-actions-deprecating-save-state-and-set-output-commands/

TL;DR - use this syntax instead:

echo "{name}={value}" >> $GITHUB_OUTPUT

e.g.

- name: Set color
  id: random-color-generator
  run: echo "SELECTED_COLOR=green" >> $GITHUB_OUTPUT
- name: Get color
  run: echo "The selected color is ${{ steps.random-color-generator.outputs.SELECTED_COLOR }}"

Create automation for picking up new version lines from the dep-server

Background

With our move to the dep-server (api.deps.paketo.io/v1/dependency?name=dependency-name) outlined in paketo-buildpacks/packit#140, the way we will consume dependency updates across our buildpacks will change.

Part of this process requires our buildpack.toml files to contain a metadata.dependency-constraints section for each version line of a dependency we support. For example, if we support node-engine version 12.* and version 14.*, there would be two separate metadata.dependency-constraint entries in the buildpack.toml.

Issue

Our automation for this task, covered in #279, will keep our dependencies up to date for each included version line listed in a buildpack.toml. The issue is that this automation does not pick up new version lines that get published to the dep-server. We should create some type of GHA workflow that monitors the dep-server and files issues when a new version line becomes available, so that contributors/maintainers are aware of it and can add the new constraint to the buildpack.toml.

Rationale and Alternatives

The alternative would be to have the automation create PRs to add the new version lines into the buildpack.toml. This is less desirable because we may not want to consume new version lines as soon as they come out. Filing an issue about the new version line will allow maintainers to be intentional about the versions we ship in our buildpacks.

Can no longer run stacks `./scripts/create.sh` locally

Expected Behavior

It should create a stack when passed no arguments.

Current Behavior

I get the following error message:

 ‣ ./scripts/create.sh
Using jam 1.2.3
./scripts/create.sh: line 82: @: unbound variable

Possible Solution

Appears that the secrets variable needs to be handled in the case that there are no secrets to pass. I'm too lazy to figure that part out and will just run jam create-stack directly.
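The usual fix is the ${arr[@]+"${arr[@]}"} expansion, which yields zero arguments for an empty array instead of tripping set -u on older Bash. A sketch (Bash; secret_args is an illustrative name, not necessarily the variable in create.sh):

```shell
#!/usr/bin/env bash
set -euo pipefail

declare -a secret_args=()
# secret_args would be filled from --secret flags when provided; it stays
# empty here to mirror the failing case.

# A bare "${secret_args[@]}" aborts with "unbound variable" under set -u
# on Bash < 4.4 when the array is empty; this idiom expands safely.
echo jam create-stack ${secret_args[@]+"${secret_args[@]}"}
```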

Steps to Reproduce

  1. Checkout any of the stack repos
  2. Run ./scripts/create.sh

Motivations

I used to just run this helpful script and now I have to type a longer command to get the same outcome.

Add support for library repositories

Describe the Enhancement

Currently we support a few types of repositories:

  • language family buildpacks
  • component buildpacks
  • stacks
  • builders

We have started to identify areas where we would benefit from libraries (e.g. https://github.com/paketo-buildpacks/libreload-packit) and as such it would be valuable to be able to manage shared configuration from this common repository.

Possible Solution

I'm not 100% sure yet what types of configuration we would want, but I'd expect we would want some of the following:

  • PR checks (tests, linter, etc)
  • post-merge tests and draft release
  • automatic semantic versioning

I'd imagine a lot of this would look like what we already have in packit - in fact we should consider that packit could consume this newly-added library section, too.

Motivation

Manually creating github configuration and scripts for libraries is brittle and error-prone. Locating them in a common place has proven valuable for other types of repositories.

Make dependency update PR names more informative

Automated PRs generated by concourse pipelines to update dependencies have names like: "Updating version for dotnet-runtime for 3.1.X". This is handy because it's easy to determine, at a glance, what the PR is changing. (Like if you're looking at the changelog on a draft release to determine the correct semver version for the next one).

PRs opened by the update dependencies workflow are all named the same thing

It would be helpful if these PRs could somehow give a hint about the changes they made.

Pack update action

There should be an action in this repo that updates the version of pack in our scripts and images that install pack; see here.

Versions in the order definitions of metabuildpack buildpack.toml do not get updated

Context
When a new implementation buildpack version is received, the metadata.dependencies entry is updated to the new version, but the version in the order definition is not.

Proposal
We should update the versions in both the metadata.dependencies entries and the order definitions.

Acceptance Criteria
When a new implementation buildpack version comes in:

  • The metadata.dependencies entry is updated.
  • The order entries are updated.

Add test coverage to stacks actions

The get-usns Github action used for polling the Ubuntu USN RSS feed is one of the most important actions in the Paketo ecosystem -- it's the main mechanism that triggers automatic stack updates that keep our stacks secure for users. Currently, it's untested. We should add tests. Many of these may mock out an RSS feed server.

In addition, I think it'd be useful to have a test that will fail if the RSS feed or Ubuntu USN/CVE web pages change their format in a way that makes our data scraping invalid. For instance, asserting that scraping a real USN page for a certain regex returns a reasonable number of matches, a valid URL, etc.

The diff-package-receipts and release-notes actions are also written in Golang and have some degree of complexity. These should be tested as well.

Incorrect SemVer Calculation

When cutting a release of packit, we noticed that the SemVer calculation was incorrect based on the labels given to PRs in that release, which can be seen here. After looking through the SemVer calculation action, I think I have found the issue.

uri := fmt.Sprintf("%s/repos/%s/compare/%s...main", config.Endpoint, config.Repo, previous.Original())

In this API call we are hard-coded to compare against main; however, packit no longer has main as its default branch, instead having v2 be the default. We should find a way to either allow this to be changed or determine it in logic. I think it would be relatively straightforward to have the action pass the commitish it is running on and use that instead of main.
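One way to stop hard-coding main: read the repository's default_branch field from the GET /repos/{owner}/{repo} API response and build the compare URL from that. A shell sketch over a canned payload (no network; the JSON literal and sed parsing are illustrative, and the real action would do this in its Go HTTP client):

```shell
# Sample GET /repos/{owner}/{repo} payload; the real value would come
# from the GitHub API at config.Endpoint.
repo_json='{"name":"packit","default_branch":"v2"}'

default_branch="$(printf '%s' "${repo_json}" | sed -n 's/.*"default_branch" *: *"\([^"]*\)".*/\1/p')"

# Compare against the default branch instead of a hard-coded "main".
compare_url="https://api.github.com/repos/paketo-buildpacks/packit/compare/v1.0.0...${default_branch}"
echo "${compare_url}"
```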

Fix open PR action

Currently the update builder action creates a single branch, makes changes to it, and opens a PR from that branch. When that PR is merged, the branch is deleted. However, if the PR is closed, the branch is not deleted and a new PR will not be reopened on that branch, meaning there is a possibility of missing updates. This problem can be demonstrated by the following two PRs: first, paketo-buildpacks/full-builder#331 opens a branch with relevant changes. That PR was then closed by another PR through linking, and all subsequent update runs did not reopen a similar PR until the automation/builder-toml branch was deleted, after which paketo-buildpacks/full-builder#333 was created and merged.

jam update buildpack action uses out of date jam version

Currently, the update buildpack action uses jam 0.9.0; however, the latest is v0.11.0. The update includes a fix introduced in 0.10.1 that enables jam update builder to work when the registry path != buildpackage ID.

There's an issue with that fix when used with private registries (see paketo-buildpacks/packit#175), so it might not make sense to upgrade the version yet, but it would be good to have a mechanism that keeps this jam version up to date.

Update actions/release/notes

This action still hardcodes the version of jam internally; we should move to using the method from tools.json.

Add usage flag to package scripts

Could we consider adding in a usage (-h) flag to our package scripts so that users can see what options are supported in packaging a buildpack (caching, etc.)?

Dependency update PR title has blank list of new versions

Minor thing, but it's a bit weird how dependency update PRs that don't actually change any versions (they just reformat the buildpack.toml) have a title that includes an empty list of dependency versions. Like paketo-buildpacks/dotnet-core-sdk#263 .

Seems like this is because the PR title in the update workflow is always the same regardless of whether the updated versions output is empty or not:

title: "Updates buildpack.toml with new dependency versions: ${{ steps.update.outputs.new-versions }}"
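One possible fix (a sketch; the fallback title text is illustrative): branch the title on whether the new-versions output is empty, using the && / || ternary idiom supported by GitHub Actions expressions:

```yaml
title: "${{ steps.update.outputs.new-versions != '' && format('Updates buildpack.toml with new dependency versions: {0}', steps.update.outputs.new-versions) || 'Updates buildpack.toml formatting' }}"
```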

Create-draft-release workflow fails to upload assets in some cases

What happened?

  • What were you attempting to do?

Create a draft release of a buildpack.

  • What did you expect to happen?

Upon merging a pull request, the create-draft-release workflow runs successfully resulting in a draft release of the buildpack with packaged buildpack artifacts attached as release assets.

  • What was the actual behavior? Please provide log output, if possible.

The release job failed to upload the buildpack artifacts (example 1 & example 2).

Checklist

  • I have included log output.
  • The log output includes an error message.
  • I have included steps for reproduction.

Can we use the concurrency setting in GitHub workflows?

We recently experienced a large backup in the number of queued workflow runs across our GitHub org. This resulted in a massive slowdown in our ability to get PRs merged, releases cut, dependencies updated, etc.

Can we use the concurrency setting on a workflow to reduce some of the queue pressure for our org?
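Probably, yes. For example, a workflow-level concurrency block that allows one run per workflow and ref and cancels superseded runs (group and cancel-in-progress are the documented keys; whether cancellation is safe depends on the workflow):

```yaml
concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true
```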

Add docs

It would be great to see some docs around how to make use of the github config/existing actions that live in this repo so that it's easy for community members to leverage in new community buildpacks.

Add "good-first-issue" label

I have found value in the good-first-issue label for issues - it can help onboard new contributors by identifying issues that are smaller in scope and can be addressed with less context.

Any objections to adding that label to our list of labels?
