upload-cloud-storage's People

Contributors

averikitsch, bharathkkb, dependabot[bot], google-github-actions-bot, jakthom, nickdub, omrihq, robertvanhoesel, sethvargo, syprx, verbanicm

upload-cloud-storage's Issues

How to use the folder/files created in previous step

I'm looking to build all static content in one step and upload it to a bucket in the following step. As written, the action doesn't find my content.

Any guidance on how I can make it happen?

    - name: Build gulpfile
      id: 'gulpfile'
      run: npx gulp

    - id: 'auth'
      uses: 'google-github-actions/auth@v0'
      with:
        workload_identity_provider: ${{ secrets.GCP_CDN_WORKLOAD_IDENTITY }}
        service_account: ${{ secrets.GCP_CDN_SA_CREDENTIALS }}

    - name: 'test'
      run: |
        echo "content here:"
        ls ./public #all content is printed here

    - id: 'upload-folder'
      uses: 'google-github-actions/upload-cloud-storage@v0'
      with:
        path: ./public
        destination: 'static-bucket'
        parent: false
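For what it's worth: within a single job, relative paths like ./public resolve against the runner workspace that every step shares, so the snippet above should find the built files. But if the build and the upload run in separate jobs, each job gets a fresh runner and an empty workspace, and the built files must be handed over explicitly, e.g. with actions/upload-artifact / download-artifact. A hedged sketch (job names and the bucket are placeholders):

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - run: npx gulp   # builds into ./public
      - uses: actions/upload-artifact@v3
        with:
          name: public
          path: public

  upload:
    needs: build
    runs-on: ubuntu-latest
    permissions:
      contents: read
      id-token: write
    steps:
      # restore the build output into this job's fresh workspace
      - uses: actions/download-artifact@v3
        with:
          name: public
          path: public
      - uses: google-github-actions/auth@v0
        with:
          workload_identity_provider: ${{ secrets.GCP_CDN_WORKLOAD_IDENTITY }}
          service_account: ${{ secrets.GCP_CDN_SA_CREDENTIALS }}
      - uses: google-github-actions/upload-cloud-storage@v0
        with:
          path: public          # relative to the workspace
          destination: 'static-bucket'
          parent: false
```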

Error occurred when using upload-cloud-storage

  - name: Upload to Google Cloud Storage
    uses: GoogleCloudPlatform/github-actions/upload-cloud-storage@master
    with:
      credentials: ${{ secrets.GCS_CREDENTIALS }}
      path: build
      destination: ${{ env.UPLOAD_PATH_NAME }}

Error: A resumable upload could not be performed. The directory, /root/.config, is not writable. You may try another upload, this time setting options.resumable to false.
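As the error message itself suggests, one workaround (untested here) is to disable resumable uploads so the client never needs to write session metadata under /root/.config. The action exposes a resumable input for this:

```yaml
- name: Upload to Google Cloud Storage
  uses: GoogleCloudPlatform/github-actions/upload-cloud-storage@master
  with:
    credentials: ${{ secrets.GCS_CREDENTIALS }}
    path: build
    destination: ${{ env.UPLOAD_PATH_NAME }}
    resumable: false   # avoid writing resumable-session metadata to disk
```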

Re-enable load test

TL;DR

The current load test can only run about three times in a short period before we hit quota limits. This makes iterative testing and debugging extremely painful, so it's disabled.

Detailed design

We need to re-enable the load test. Ideas:

  • Move to a separate action that is allowed to fail
  • Only run on pushes to main (not on PRs)
  • Run on a cron schedule instead of on PRs and pushes

There also seems to be something weird with the authentication. The error messages we get are actually 503s from the STS service. It seems like the NPM package for Storage is doing an auth handshake for each file upload. I don't have more advanced telemetry to prove this, but it feels like a bug: the token should be cached and reused for its TTL. /cc @bcoe

Additional information

No response

Upload fails with: triggerUncaughtException(err, true /* fromPromise */);

TL;DR

Hello,

unfortunately I sometimes get an uncaught exception while uploading a test report (~500 files).
See the log output for exception details. The log shows that several files upload successfully before the exception appears.
It happens intermittently, maybe 1 in 10 tries.

Do you know if it is possible to enable a retry or handle the exception within your package?
It would be awesome to have this job always pass. 🙂

Expected behavior

Uploading files always succeeds.

Observed behavior

The upload fails due to the exception.

Action YAML

upload-report:
    name: Upload Report
    if: (success() || failure()) && needs.test-integration.result != 'skipped'
    runs-on: ubuntu-latest
    environment:
      name: dev-cluster
    env:
      JOB_NAME: ${{github.job}}
    needs:
      - test-integration
    steps:
      - name: Archive Report Download
        uses: actions/download-artifact@v3
        with:
          path: tests/output/allure-report
      - name: Authenticate to GCP
        uses: google-github-actions/[email protected]
        with:
          token_format: access_token
          workload_identity_provider: ${{ secrets.*** }}
          service_account: ${{ secrets.***}}
      - name: Authenticate to GKE cluster
        uses: google-github-actions/[email protected]
        with:
          cluster_name: ${{ secrets.*** }}
          location: ${{ secrets.*** }}
          use_internal_ip: true
      - name: Upload Report
        uses: google-github-actions/upload-cloud-storage@v1
        with:
          path: tests/output/allure-report
          parent: false
          destination: ${{ env.BUCKET_NAME }}/${{ env.BUCKET_PATH }}
          concurrency: 50

Log output

Run google-github-actions/upload-cloud-storage@v1
Upload files
Uploading /tests/output/allure-report/test-integration/index.html to gs://***
...
Uploading /tests/output/allure-report/test-integration/history/retry-trend.json
Uploading /tests/output/allure-report/test-integration/history/history.json to gs://***
Uploading /tests/output/allure-report/test-integration/history/history-trend.json to gs://***
Uploading /tests/output/allure-report/test-integration/history/duration-trend.json to gs://***

 node:internal/process/promises:279
              triggerUncaughtException(err, true /* fromPromise */);
              ^
  FetchError: request to https://pipelines.actions.githubusercontent.com/... failed, reason: 
connect ETIMEDOUT 13.107.42.16:443
      at ClientRequest.<anonymous> (/home/runner/work/_actions/google-github-actions/upload-cloud-storage/v1/dist/index.js:201:57777)
      at ClientRequest.emit (node:events:527:28)
      at TLSSocket.socketErrorListener (node:_http_client:454:9)
      at TLSSocket.emit (node:events:527:28)
      at emitErrorNT (node:internal/streams/destroy:157:8)
      at emitErrorCloseNT (node:internal/streams/destroy:122:3)
      at processTicksAndRejections (node:internal/process/task_queues:83:21) {

Cache-Control header not applied

TL;DR

Unable to apply the Cache-Control header.

Expected behavior

Uploaded file in GCS contains expected metadata

Observed behavior

Uploaded file does not have cache-control set. In my action, I see:

Unexpected input(s) 'headers', valid inputs are ['credentials', 'path', 'destination', 'gzip', 'resumable', 'predefinedAcl', 'parent', 'glob', 'concurrency']

Reproduction

Action YAML

      - name: Upload app files
        uses: google-github-actions/[email protected]
        with:
          path: ${{ env.BUILT_APP_DIR }}
          parent: false
          destination: my-bucket-name
          glob: '**/*'
          gzip: true
          headers: |-
            cache-control: public, max-age=300, must-revalidate
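The warning indicates this pinned version predates the headers input ('headers' is absent from the listed valid inputs), while logs from newer releases elsewhere in this tracker show headers being accepted. A hedged fix is to pin a release whose inputs include headers, e.g.:

```yaml
- name: Upload app files
  uses: google-github-actions/upload-cloud-storage@v1   # a release whose inputs include 'headers'
  with:
    path: ${{ env.BUILT_APP_DIR }}
    parent: false
    destination: my-bucket-name
    glob: '**/*'
    gzip: true
    headers: |-
      cache-control: public, max-age=300, must-revalidate
```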

Additional information

Relative path does not find any files

TL;DR

Expected behavior
The root path should be the same as the checkout path (at least intuitively). Instead, the file is not found when uploading. A more helpful error message would print the full path of the attempted upload.

Observed behavior
File not found error.

Reproduction

Action YAML

name: test
on: push
jobs:
  push:
    name: Push to GCS
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v2
    - uses: google-github-actions/upload-cloud-storage@main
      with:
        credentials: ${{ secrets.TEST }}
        path: ../test
        destination: test
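One possible explanation (an assumption, not verified against this repository): relative paths are resolved from the workspace root that actions/checkout populates, so ../test points outside the checked-out tree. Referencing the directory relative to the repository root should work:

```yaml
steps:
  - uses: actions/checkout@v2
  - uses: google-github-actions/upload-cloud-storage@main
    with:
      credentials: ${{ secrets.TEST }}
      path: test          # relative to the checked-out repository root
      destination: test
```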

Repository
https://github.com/GuessWhoSamFoo/gcs-push-test/

Additional information
n/a

Two-way transfer

TL;DR

This is an action to upload files to Cloud Storage; couldn't it be a two-way action?

Design

Action YAML
It could be a generic source => destination action.

steps:
  - id: upload-files
    uses: google-github-actions/transfer-cloud-storage@main
    with:
      source: /path/to/folder
      destination: bucket-name
  - id: download-files
    uses: google-github-actions/transfer-cloud-storage@main
    with:
      source: gs://bucket-name/dir/file.txt
      destination: file.txt

Resources
N/A

Additional information
N/A

Intermittent failed with [object Object] error on upload

TL;DR

Not sure why, but I ran into an Error: google-github-actions/upload-cloud-storage failed with [object Object] on an upload step. I have not seen this before, and I've been using this action for several months.

Run and workflow links are below. Nothing complicated; I am uploading a single JSON file.

[1] Run: https://github.com/mozilla-mobile/mobile-test-health/runs/5571311372?check_suite_focus=true
[2] Workflow: https://github.com/mozilla-mobile/mobile-test-health/blob/main/.github/workflows/daily.yml

Expected behavior

Successful upload

Observed behavior

Error: google-github-actions/upload-cloud-storage failed with [object Object]

Action YAML

      - uses: google-github-actions/upload-cloud-storage@v0
        if: hashFiles('output.json') != ''
        name: Upload Artifact (json) to Google Cloud Storage
        with: 
          path: ${{ steps.date.outputs.date }}_${{ matrix.configuration }}.json
          destination: mobile-reports/public/moz-mobile-test-health/${{ steps.date.outputs.day }}

Log output

No response

Additional information

No response

Action does not allow uploading single file to a bucket path

TL;DR

I am unable to use this action to upload a single file to a given path in a bucket. It seems like the action only allows specifying a prefix, not an explicit destination path.

E.g. I am unable to achieve the following: /tmp/data.yaml => gs://bucket/.acl/data.yaml

If I try to run the action with the following variables:

uses: google-github-actions/upload-cloud-storage@v1
with:
  path: /tmp/acl.yaml
  destination: bucket/.acl/acl.yaml
  parent: false
  gzip: false
  headers: |-
    content-type: text/yaml

The file is uploaded to: gs://bucket/.acl/acl.yaml/acl.yaml instead of gs://bucket/.acl/acl.yaml

Expected behavior

I would expect that a single file would be uploaded to: gs://bucket/.acl/acl.yaml

Observed behavior

A file is instead uploaded to: gs://bucket/.acl/acl.yaml/acl.yaml
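The debug output later in this issue shows why: destination is parsed as bucket plus prefix, and the source file name is always appended after that prefix. Under that behavior (and assuming the file keeps its name), the way to land the object at gs://bucket/.acl/acl.yaml would be to pass only the prefix:

```yaml
- id: gcp-upload-acl
  uses: google-github-actions/upload-cloud-storage@v1
  with:
    path: /tmp/acl.yaml
    destination: bucket/.acl    # prefix only; the action appends the file name
    parent: false
    gzip: false
    headers: |-
      content-type: text/yaml
```

Note this still doesn't allow renaming the object on upload, which is the gap the issue describes.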

Action YAML

name: upload-gcs
on: [workflow_call]
jobs:
  upload:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      id-token: write
    steps:
    - id: gh-checkout
      uses: actions/checkout@v3
    - id: setup-node
      uses: actions/setup-node@v3

    - id: setup-dependencies
      run: |
        npm install yaml

    - id: parse-acl
      uses: actions/github-script@v6
      with:
        script: |
          // script generates a YAML file with the ACL policy and saves it to /tmp/acl.yaml
          // script also outputs a "acl" output variable with the path to the generated file (/tmp/acl.yaml)

    - id: gcp-authenticate
      uses: google-github-actions/auth@v1
      with:
        workload_identity_provider: [identityProvider] # redacted
        service_account: [serviceAccount] # redacted

    - id: gcp-upload-acl
      uses: google-github-actions/upload-cloud-storage@v1
      with:
        path: ${{ steps.parse-acl.outputs.acl }} # value is /tmp/acl.yaml
        destination: gs://[bucket]/.acl/acl.yaml # bucket name redacted
        parent: false
        gzip: false
        headers: |-
          content-type: text/yaml

Log output

# I am unable to provide the entire log output here, but here's the log specific to this action.

##[debug]Evaluating condition for step: 'Run google-github-actions/upload-cloud-storage@v1'
##[debug]Evaluating: success()
##[debug]Evaluating success:
##[debug]=> true
##[debug]Result: true
##[debug]Starting: Run google-github-actions/upload-cloud-storage@v1
##[debug]Loading inputs
##[debug]Evaluating: steps.parse-acl.outputs.acl
##[debug]Evaluating Index:
##[debug]..Evaluating Index:
##[debug]....Evaluating Index:
##[debug]......Evaluating steps:
##[debug]......=> Object
##[debug]......Evaluating String:
##[debug]......=> 'parse-acl'
##[debug]....=> Object
##[debug]....Evaluating String:
##[debug]....=> 'outputs'
##[debug]..=> Object
##[debug]..Evaluating String:
##[debug]..=> 'acl'
##[debug]=> '/tmp/acl.yaml'
##[debug]Result: '/tmp/acl.yaml'
Run google-github-actions/upload-cloud-storage@v1
  with:
    path: /tmp/acl.yaml
    destination: [bucket]/.acl/acl.yaml
    parent: false
    gzip: false
    headers:
      content-type: text/yaml
    resumable: true
    concurrency: 100
    process_gcloudignore: true
  env:
    CLOUDSDK_AUTH_CREDENTIAL_FILE_OVERRIDE: [redacted]
    GOOGLE_APPLICATION_CREDENTIALS: [redacted]
    GOOGLE_GHA_CREDS_PATH: [redacted]
    CLOUDSDK_CORE_PROJECT: [redacted]
    CLOUDSDK_PROJECT: [redacted]
    GCLOUD_PROJECT: [redacted]
    GCP_PROJECT: [redacted]
    GOOGLE_CLOUD_PROJECT: [redacted]
##[debug]Computed absoluteRoot from "/tmp/acl.yaml" to "/tmp" (isDir: false)
##[debug]Computed computedGlob from "" to "acl.yaml"
##[debug]Found 1 files: ["acl.yaml"]
##[debug]Processing gcloudignore
##[debug]Using .gcloudignore at: /home/runner/work/test-gh-upload/test-gh-upload/.gcloudignore
##[debug]Parsed ignore list: [".git","gha-creds-*.json",".github"]
##[debug]Uploading 1 files: ["acl.yaml"]
##[debug]Computed bucket as "[bucket]"
##[debug]Computed prefix as ".acl/acl.yaml"
::group::Upload files
Upload files
  Uploading /tmp/acl.yaml to gs://[bucket]/.acl/acl.yaml/acl.yaml
  ##[debug]Uploading: {"destination":"[bucket]/.acl/acl.yaml/acl.yaml","metadata":{"contentType":"text/yaml"},"gzip":false,"resumable":true,"configPath":[redacted],"ts":[redacted],"source":"/tmp/acl.yaml"}
  ::endgroup::
##[debug]Node Action run completed with exit code 0
##[debug]Set output uploaded = .acl/acl.yaml/acl.yaml
##[debug]Finishing: Run google-github-actions/upload-cloud-storage@v1

Additional information

This also prevents uploading a static file with a different name. E.g. /tmp/acl.yaml => gs://bucket/.acl/github.yaml

I think the only way to make this work would be:

mkdir -p /tmp/acl
echo "tktktk" > /tmp/acl/github.yaml

Then calling the action with { path: "/tmp/acl", destination: "gs://bucket/.acl/", parent: false }

v0.10.0 - uploading a file uploads it with the containing directory

TL;DR

The upload logic changed in v0.10.0: the uploaded object is placed under a folder containing the desired file instead of being only the actual file.

Expected behavior

When uploading
path: './A/B/myfile.apk'
I expected only myfile.apk to be uploaded; this is the way it worked until v0.9.0.

Observed behavior

For some reason the logic changed in v0.10.0 to uploading 'B/myfile.apk'. This is a bug: if I had wanted to upload 'B/myfile.apk', I would have set the path to './A/B'.

Action YAML

I can't, unfortunately, as it is a private org repo, but the bug can be recreated easily: a folder is uploaded instead of just the file, as explained above.

Log output

No response

Additional information

No response

Annotation warnings: Unexpected input(s)

TL;DR

Two of the steps in our jobs that are using upload-cloud-storage are throwing warnings:

Unexpected input(s) 'path', 'parent', 'destination', 'predefinedAcl', valid inputs are ['project_id', 'workload_identity_provider', 'service_account', 'audience', 'credentials_json', 'create_credentials_file', 'export_environment_variables', 'token_format', 'delegates', 'cleanup_credentials', 'access_token_lifetime', 'access_token_scopes', 'access_token_subject', 'retries', 'backoff', 'backoff_limit', 'id_token_audience', 'id_token_include_email']

Expected behavior

There shouldn't be any warnings, since 'path', 'parent', 'destination', and 'predefinedAcl' are valid inputs.

Observed behavior

No response

Action YAML

...
jobs:
  {NAME}:
    runs-on: ubuntu-latest
    steps:
      ...
      - name: Auth
        id: auth
        uses: google-github-actions/auth@v1
        with:
          path: ${{steps.path.outputs.path}}
          parent: false
          destination: ${{ steps.cdnpath.outputs.cdnpath }}/
          predefinedAcl: publicRead
          credentials_json: ${{ secrets.CDN_WRITER }}
      - name: Upload file
        id: upload-file
        uses: google-github-actions/upload-cloud-storage@v1
        with:
          path: ${{steps.path.outputs.path}}
          parent: false
          destination: ${{ steps.cdnpath.outputs.cdnpath }}/
          predefinedAcl: publicRead
      ...
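Reading the YAML above, the warning appears to come from the Auth step: path, parent, destination, and predefinedAcl are upload-cloud-storage inputs, but they are listed under google-github-actions/auth, whose valid inputs are exactly the ones named in the warning. A hedged cleanup would keep only credentials_json on the auth step:

```yaml
- name: Auth
  id: auth
  uses: google-github-actions/auth@v1
  with:
    credentials_json: ${{ secrets.CDN_WRITER }}

- name: Upload file
  id: upload-file
  uses: google-github-actions/upload-cloud-storage@v1
  with:
    path: ${{ steps.path.outputs.path }}
    parent: false
    destination: ${{ steps.cdnpath.outputs.cdnpath }}/
    predefinedAcl: publicRead
```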

Log output

{NAME}
Unexpected input(s) 'path', 'parent', 'destination', 'predefinedAcl', valid inputs are ['project_id', 'workload_identity_provider', 'service_account', 'audience', 'credentials_json', 'create_credentials_file', 'export_environment_variables', 'token_format', 'delegates', 'cleanup_credentials', 'access_token_lifetime', 'access_token_scopes', 'access_token_subject', 'retries', 'backoff', 'backoff_limit', 'id_token_audience', 'id_token_include_email']

Additional information

No response

Cache based on Checksum

TL;DR

Upload a file only when its checksum differs from that of the corresponding Cloud Storage object.

Detailed design

If a file with the same name as a file to be uploaded already exists in the Cloud Storage bucket, it should be downloaded first and the checksums of the two files compared. The file should then only be uploaded when the checksums differ.

Additional information

No response

Clean up temporary files for resumable uploads

TL;DR

For each file, we create a resumable metadata file on disk for the client library. However, we never clean up that file. On managed runners, it's fine because the filesystem is destroyed. But for self-hosted runners, we leak these files over time.

High-level design

  • Create all files in a subdirectory dedicated to the workflow run number
  • Create a post step that deletes the directory when the job has finished

Expected behavior

Don't leak the files.

Observed behavior

No response

Action YAML

not: applicable

Log output

No response

Additional information

No response

Update to use Node 16

TL;DR

Node 12 is deprecated

Design

  • Cut a release with current features with node12 support
  • Update action.yml to uses: 'node16'
  • Update all tests to use 16 in the matrix
  • Update actions/checkout@v2 -> actions/checkout@v3 everywhere
  • Update README to note node 16 requirement for self-hosted runners (example)
  • Cut a release with node16 support
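For reference, the runtime switch itself is a one-line change in the action's metadata file (entrypoint path assumed):

```yaml
# action.yml — switch the runtime from node12 to node16
runs:
  using: 'node16'
  main: 'dist/index.js'
```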

Exception uploading static assets to GCS

TL;DR

Expected behavior

Successful upload.

Observed behavior

Exception and failed action.

{"$id":"1","innerException":null,"message":"Operations that change non-concurrent collections must have exclusive access. A concurrent update was performed on this collection and corrupted its state. The collection's state is no longer correct.","typeName":"System.InvalidOperationException, System.Private.CoreLib, Version=14.0.0.0, Culture=neutral, PublicKeyToken=XXXXXX","typeKey":"InvalidOperationException","errorCode":0,"eventId":0}

Reproduction

Action YAML

      - name: Upload assets
        uses: google-github-actions/[email protected]
        with:
          path: ${{ env.BUILT_APP_DIR }}
          parent: false
          destination: ${{ env.GCS_BUCKET }}
          glob: 'assets/*'
          gzip: true
          headers: |-
            cache-control: public,max-age=31536000,immutable

Repository

Additional information

Setting gzip to false

TL;DR

Uploading to GCS with the upload-cloud-storage action passes gzip: true as a default; can this be made an option?

Design

Action YAML
In this line the default is set to gzip: true; I'd like to be able to set it myself.

const options: UploadOptions = { gzip: true };

- name: Upload files
  id: upload-files
  uses: google-github-actions/upload-cloud-storage@main
  with:
    path: foo
    destination: bucketname
    gzip: false

Support brotli compression for uploaded assets

TL;DR

Support brotli compression as well as gzip for asset uploads

From what I've read, this can make a noticeable difference for web performance:

[Screenshot omitted: performance comparison, 2022-06-01]

Detailed design

No response

Additional information

No response

Switch away from globby

TL;DR

The latest version of globby isn't compatible because it requires us to switch to ES modules, which is a fairly large undertaking. We also aren't using all of globby's features, so it might be better to switch directly to https://github.com/mrmlnc/fast-glob, which is what globby uses under the hood.

Expected behavior

No response

Observed behavior

No response

Action YAML

not: applicable

Log output

No response

Additional information

No response

Clean up Readme re: file paths

TL;DR

The file paths used in the example don't work for me as written.

Detailed design

Now, I'm integrating these steps into an existing Actions file that does a bunch of other stuff, if that's any kind of issue.

The README example currently shows:


      with:
        path: '/path/to/file'
        destination: 'bucket-name/file'

To get it to actually work, I did:

      with:
        path: 'path/to/file'
        destination: 'bucket-name/folder'

That is, dropping the initial / from the path and removing "file" from the destination. With those changes it grabbed the file correctly from the cloned repository, and the file ended up in the bucket-name/folder location.



Additional information

No response

Improve documentation about folders uploading

TL;DR

I need to upload a folder with some DAGs (Python 3 scripts). I can use relative paths or absolute paths, but what is the default path? In the following example:

├── README.md
├── requeriments.txt
└── tarea1
    └── dag.py

Does the default path start at the README.md level, or at the .github/workflows level?
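As far as I know (worth verifying against the README), relative paths are resolved against $GITHUB_WORKSPACE, which actions/checkout populates with the repository root, i.e. the README.md level, not .github/workflows:

```yaml
- uses: actions/checkout@v2
- uses: google-github-actions/upload-cloud-storage@main
  with:
    path: tarea1        # resolved from the repository root (where README.md lives)
    destination: ${{ secrets.GCP_BUCKET_NAME }}
```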

Design

Action YAML

name: update-dags
on:
  push:
    branches:
    - main
  # We just need to synchronize
jobs:
  sincronizar:
    runs-on: ubuntu-latest
    steps:

      - name: Set up Cloud SDK
        uses: google-github-actions/setup-gcloud@master
        with:
          project_id: ${{ secrets.GCP_PROJECT_ID }}
          service_account_key: ${{ secrets.GCP_SA_KEY }}
          export_default_credentials: true

      - name: Sync dag folder
        uses: google-github-actions/upload-cloud-storage@main
        with:
          path: tarea1
          destination: ${{ secrets.GCP_BUCKET_NAME }}

gha-creds file getting pushed to the storage

TL;DR

Hi Team,

The gha-creds* file is also getting pushed to Cloud Storage along with the build folder.

Could you please check and let us know whether there is any way to ignore this file?

  - id: 'auth'
    uses: 'google-github-actions/auth@v0'
    with:
      credentials_json: '${{ secrets.***}}'
      export_environment_variables: false

  - id: 'upload-folder'
    uses: 'google-github-actions/upload-cloud-storage@v1'
    with:
      path: '.'
      destination: 'carear-metroplex-portal'
      parent: false
      process_gcloudignore: true

Thanks
Harshita
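Since process_gcloudignore is already enabled, one approach (assuming the action honors a .gcloudignore at the repository root, as the debug logs elsewhere in this tracker suggest) is to list the credentials file pattern there:

```text
# .gcloudignore at the repository root
.git
.github
gha-creds-*.json
```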

Expected behavior

No response

Observed behavior

No response

Action YAML

name: Move to Firebase storage on workflow_dispatch
'on': 
 workflow_dispatch:
    inputs:
      environment:
        description: 'Environment'
        required: true
        default: 'metroplex'
        
jobs:
  build:
    name: Build
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
        with:
          fetch-depth: 0
          path: main
      - name: Checkout ex builder repo
        uses: actions/checkout@v3
        with:
          repository: carear-io/CareARExperienceBuilder
          path: CareARExperienceBuilder
          ref: ${{ github.event.inputs.experiencebuilderbranch  }} 
          token: ${{ secrets.GIT_ACCESSTOKEN_CICD }}
    
      # [ here we have the logic to build the app ]
       
      - name: Upload Artifact
        uses: actions/upload-artifact@master
        with:
          name: build
          path: ./main/build
          
  deploy:
    name: Deploy
    needs: [build]
    runs-on: ubuntu-latest
    steps:
      - name: Download the build output
        uses: actions/download-artifact@v3
        with:
          name: build

      - id: 'auth'
        uses: 'google-github-actions/auth@v0'
        with:
           credentials_json: '${{ secrets.FIREBASE_CAREAR_METROPLEX}}'    

      - id: 'upload-folder'
        uses: 'google-github-actions/upload-cloud-storage@v1'
        with:
          path: '.'
          destination: 'carear-metroplex-portal' 
          parent: false
          process_gcloudignore: true

Log output

No response

Additional information

No response

How to Copy Files/Folder from Github Action to Google Cloud Storage (Bucket)

TL;DR

I want to push some files or a folder from a GitHub Action to a GCP bucket.

I have created a new bucket and given it owner permission.

BUCKET NAME: github_action

FOLDER NAME (created inside the bucket): testing

SOURCE FOLDER NAME: ./results.sarif

DEST FOLDER NAME: the GCP bucket

For the YAML file below, please tell me which parts I need to keep and which I can remove (I will delete the unnecessary content).

I don't know what to put here, and I don't understand how to use the four values below in a GitHub Action. I would also appreciate links on the GCP bucket permissions needed to read it.

path:

destination:

credentials_json: ${{ secrets.gcp_credentials }}

${{ steps.upload-folder.outputs.uploaded }}

But I want to push the file ./results.sarif into the GCP bucket.

Please point me to any link or blog that clearly shows the process step by step.
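Without knowing the full setup, a minimal hedged sketch follows. The key points: path and destination are inputs to the upload-cloud-storage step (not the auth step), and the workload identity provider must be the full resource name of the provider (the values below are placeholders):

```yaml
jobs:
  upload:
    runs-on: ubuntu-latest
    permissions:
      contents: 'read'
      id-token: 'write'
    steps:
      # checkout first so ./results.sarif exists in the workspace
      - uses: 'actions/checkout@v3'

      - id: 'auth'
        uses: 'google-github-actions/auth@v1'
        with:
          # placeholders — use the full resource name of your provider
          workload_identity_provider: 'projects/PROJECT_NUMBER/locations/global/workloadIdentityPools/POOL/providers/PROVIDER'
          service_account: 'SA_NAME@PROJECT.iam.gserviceaccount.com'

      - id: 'upload-file'
        uses: 'google-github-actions/upload-cloud-storage@v1'
        with:
          path: './results.sarif'
          destination: 'github_action/testing'   # bucket name, then object prefix
```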

Expected behavior

ERROR 1
Run google-github-actions/upload-cloud-storage@v1

Error: google-github-actions/upload-cloud-storage failed with: ENOENT: no such file or directory, lstat '/home/runner/work/GitHub-Action-Tests/GitHub-Action-Tests/results.sarif'

ERROR 2
codeql/upload-sarif action failed: HttpError: Resource not accessible by integration

ERROR 3
Uploading /home/runner/work/GitHub-Action/test/test.txt to gs://https:/console.cloud.google.com/scanning-vul/test/test.txt
Error: google-github-actions/upload-cloud-storage failed with: error code invalid_request: Invalid value for "audience". This value should be the full resource name of the Identity Provider. See https://cloud.google.com/iam/docs/reference/sts/rest/v1/TopLevel/token for the list of possible formats.
node:internal/process/promises:246
triggerUncaughtException(err, true /* fromPromise */);
^

Error: invalid_request
at Gaxios._request (/home/runner/work/_actions/google-github-actions/upload-cloud-storage/v1/dist/index.js:168:5830)
at processTicksAndRejections (node:internal/process/task_queues:96:5)
at async StsCredentials.exchangeToken (/home/runner/work/_actions/google-github-actions/upload-cloud-storage/v1/dist/index.js:168:84570)
at async IdentityPoolClient.refreshAccessTokenAsync (/home/runner/work/_actions/google-github-actions/upload-cloud-storage/v1/dist/index.js:168:24677)
at async IdentityPoolClient.getAccessToken (/home/runner/work/_actions/google-github-actions/upload-cloud-storage/v1/dist/index.js:168:23023)
at async IdentityPoolClient.getRequestHeaders (/home/runner/work/_actions/google-github-actions/upload-cloud-storage/v1/dist/index.js:168:23175)

Observed behavior

No response

Action YAML

Here is my gcpauth.yml file

name: Auth
on:
  push:
    branches: [ "main" ]
  workflow_dispatch:
jobs:
  build:
    runs-on: ubuntu-latest
    
    # Add "id-token" with the intended permissions.
    permissions:
      contents: 'read'
      id-token: 'write'
      
    steps:
      # actions/checkout MUST come before auth    
      - uses: 'actions/checkout@v3'
      
      - id: 'auth'
        name: 'Authenticate to Google Cloud'
        uses: 'google-github-actions/auth@v1'
        with:
          workload_identity_provider: 'https://iam.googleapis.com/projects/xxxxxxxxx/locations/global/workloadIdentityPools/workload-auth/providers/oidc/*'
          service_account: '[email protected]'
          path: ./results.sarif
          destination: github_action/testing/
          
          #credentials_json: ${{ secrets.gcp_credentials }}
          
#       - name: 'Set up Cloud SDK'
#         uses: 'google-github-actions/setup-gcloud@v1'

#       - name: 'Create files'
#         run: |-
#            mkdir -p test
#            touch test/test2.txt

#       - name: 'Use gcloud CLI'
#         run: 'gsutil -m rsync -R test/test2.txt gs://<github_action>'

      
#       - id: 'upload-file'
#         uses: 'google-github-actions/upload-cloud-storage@v1'
#         with:           
#           path: 'https://console.cloud.google.com/storage/browser/github_action'
#           destination: 'github_action/testing'
#         env:
#           files: '${{ steps.upload-folder.outputs.uploaded }}  

connect ETIMEDOUT

TL;DR

My workflow sometimes fails with a connect ETIMEDOUT error, as follows.
Is there anything wrong with my config?

Error: google-github-actions/upload-cloud-storage failed with: request to
https://storage.googleapis.com/upload/storage/v1/b/{xxx} failed, reason: connect ETIMEDOUT {ip_address}

My setting is as follows:

- name: Deploy to GCS 
  uses: 'google-github-actions/upload-cloud-storage@v0'
  with:
    path: './build'
    destination: 'some-destination'
    parent: false
    project_id: {project_name}
    concurrency: 50
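
If the timeouts correlate with many parallel connections, one mitigation worth trying is a much lower concurrency so fewer sockets are opened at once. A sketch (the value 10 is illustrative, not a recommendation):

```yaml
# Sketch: fewer parallel uploads can avoid transient connect timeouts.
- name: Deploy to GCS
  uses: 'google-github-actions/upload-cloud-storage@v0'
  with:
    path: './build'
    destination: 'some-destination'
    parent: false
    concurrency: 10
```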

Expected behavior

No response

Observed behavior

No response

Action YAML

- name: Deploy to GCS 
  uses: 'google-github-actions/upload-cloud-storage@v0'
  with:
    path: './build'
    destination: 'some-destination'
    parent: false
    project_id: {project_name}
    concurrency: 50

Log output

Uploading ...
  Error: google-github-actions/upload-cloud-storage failed with: request to https://storage.googleapis.com/upload/storage/v1/b/xxx failed, reason: connect ETIMEDOUT xx.xx.xx.xx:443
  

FetchError: request to https://storage.googleapis.com/upload/storage/v1/b/xxx failed, reason: connect ETIMEDOUT xx.xx.xx.xx:443
      at ClientRequest.<anonymous> (/home/runner/work/_actions/google-github-actions/upload-cloud-storage/v0/dist/index.js:226:57777)
      at ClientRequest.emit (node:events:402:35)
      at TLSSocket.socketErrorListener (node:_http_client:447:9)
      at TLSSocket.emit (node:events:390:28)
      at emitErrorNT (node:internal/streams/destroy:157:8)
      at emitErrorCloseNT (node:internal/streams/destroy:122:3)
      at processTicksAndRejections (node:internal/process/task_queues:83:21) {
    type: 'system',
    errno: 'ETIMEDOUT',
    code: 'ETIMEDOUT',
    config: {
      method: 'PUT',
      url: 'https://storage.googleapis.com/upload/storage/v1/b/xxx',
      headers: {
        'Content-Range': 'bytes 0-*/*',
        Authorization: '***',
        'User-Agent': 'google-api-nodejs-client/7.14.1',
        'x-goog-api-client': 'gl-node/16.13.0 auth/7.14.1',
        Accept: 'application/json'
      },
      body: Readable {
        _readableState: ReadableState {
          objectMode: false,
          highWaterMark: 16384,
          buffer: BufferList { head: null, tail: null, length: 0 },
          length: 0,
          pipes: [],
          flowing: false,
          ended: true,
          endEmitted: true,
          reading: false,
          constructed: true,
          sync: false,
          needReadable: false,
          emittedReadable: false,
          readableListening: false,
          resumeScheduled: false,
          errorEmitted: false,
          emitClose: true,
          autoDestroy: true,
          destroyed: true,
          errored: null,
          closed: true,
          closeEmitted: true,
          defaultEncoding: 'utf8',
          awaitDrainWriters: null,
          multiAwaitDrain: false,
          readingMore: false,
          dataEmitted: true,
          decoder: null,
          encoding: null,
          [Symbol(kPaused)]: true
        },
        _read: [AsyncFunction: read],
        _events: [Object: null prototype] { error: [Function (anonymous)] },
        _eventsCount: 1,
        _maxListeners: undefined,
        [Symbol(kCapture)]: false
      },
      signal: AbortSignal {},
      validateStatus: [Function (anonymous)],
      paramsSerializer: [Function: paramsSerializer],
      responseType: 'json'
    }
  }

Additional information

No response

Following the guide here to upload a static site can result in leaking credentials

TL;DR

When using this action to upload site content to a bucket, if the site content lives at the root of the repository, then this action will also upload the gcloud credentials file.

Expected behavior
Not uploading credential files.

Observed behavior
Job output:

 Uploading file: page.html to gs://bucket/page.html
Uploading file: other_page.html to gs://bucket/other_page.html
Uploading file: gha-creds-xyzxyzxyxz.json to gs://bucket/gha-creds-xyzxyzxyxz.json

Reproduction

Action YAML

name: Deploy

on:
  push:
    branches:
      - master

jobs:
  deploy:
    name: Deploy
    runs-on: ubuntu-latest
    permissions:
      contents: read
      id-token: write
    steps:
      - uses: actions/checkout@v2
      - id: auth
        uses: google-github-actions/auth@v0
        with:
          workload_identity_provider: ${{ secrets.GCLOUD_WORKLOAD_IDENTITY_PROVIDER }}
          service_account: ${{ secrets.GCLOUD_SERVICE_ACCOUNT }}
      - id: upload-artifacts
        uses: google-github-actions/upload-cloud-storage@v0
        with:
          path: ./
          destination: bucket

Additional information
The repo isn't public, but it's more or less a few raw HTML, CSS, JS, and image files checked in. The root of the repo is the root of the site. This issue isn't blocking me, and I'm going to find a way to work around it, but the bug serves as a warning to others about how easy this action makes it to accidentally leak credentials.
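
One way to avoid this is to never point path at the directory where the auth action writes its gha-creds-*.json file. A sketch that keeps the site content in a subdirectory (./public is a hypothetical location, not part of the original workflow):

```yaml
- id: upload-artifacts
  uses: google-github-actions/upload-cloud-storage@v0
  with:
    # Site files only; the gha-creds-*.json written to the repo root
    # is outside this path and is never uploaded.
    path: ./public
    destination: bucket
```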

action ignores working-directory default

Question

When you specify a defaults.run.working-directory, the action ignores it and runs from the repository root. Is this intended behavior?

My YAML file:

name: GCP Build and Deploy

on:
  push:
    branches:
      - main
jobs:
  build:
    defaults:
      run:
        working-directory: ./app

    runs-on: ubuntu-latest

    strategy:
      matrix:
        node-version: [16.x]

    steps:
      - uses: actions/checkout@v2
        with:
          persist-credentials: false

      - name: Use Node.js ${{ matrix.node-version }}
        uses: actions/setup-node@v1
        with:
          node-version: ${{ matrix.node-version }}

      - name: Install Dependencies
        run: npm install

      - name: Build
        run: npm run build

      - name: 'Configure GCP credentials'
        uses: 'google-github-actions/auth@v0'
        with:
          credentials_json: '${{ secrets.GCP_CREDENTIALS }}'

      - name: 'Upload'
        uses: 'google-github-actions/upload-cloud-storage@v0'
        with:
          path: '${{ env.SOURCE_DIR }}'
          destination: '${{ secrets.GCP_BUCKET_NAME }}'
          parent: false

env:
  SOURCE_DIR: app/build

Since I specified working-directory: ./app, why does SOURCE_DIR still have to be app/build?
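
This is expected: per the GitHub Actions workflow syntax, defaults.run.working-directory applies only to run: steps, not to uses: steps, so the action resolves path against the repository root. A sketch that makes the path explicit:

```yaml
- name: 'Upload'
  uses: 'google-github-actions/upload-cloud-storage@v0'
  with:
    # `working-directory` does not apply to `uses:` steps, so the
    # path must be given relative to the repository root.
    path: 'app/build'
    destination: '${{ secrets.GCP_BUCKET_NAME }}'
    parent: false
```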

Allow filtering files

TL;DR

Allow filtering files by specifying a glob pattern. This would be useful for static websites that generate files during their build phase that aren't meant for public consumption, such as source maps.

Design

Action YAML
The user could specify a glob pattern. Only files that match that glob pattern are uploaded.

steps:
  - id: upload-files
    uses: google-github-actions/upload-cloud-storage@main
    with:
      path: /path/to/folder
      match: **/!(*.map)
      destination: bucket-name

Resources
N/A

Additional information
N/A
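
Later releases ship this capability as a glob input rather than the proposed match. A sketch (the exact pattern semantics depend on the glob library bundled with the release):

```yaml
steps:
  - id: upload-files
    uses: google-github-actions/upload-cloud-storage@main
    with:
      path: /path/to/folder
      glob: '**/!(*.map)'   # upload everything except source maps
      destination: bucket-name
```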

Set headers for uploaded files

TL;DR

I'm using this command line at the moment:
gsutil -h "Content-Type:text/xml" -h "Cache-Control:public, max-age=600" cp file.ext gs://bucket-name
I would like to use this action but don't see how I can without the ability to set headers. Enabling headers would be nice.

Design

Action YAML

steps:
  - id: upload-files
    uses: google-github-actions/upload-cloud-storage@main
    with:
      path: /path/to/folder
      destination: bucket-name
      headers:
        - "Content-Type:text/xml"
        - "Cache-Control:public, max-age=600"
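
Later releases accept a headers input as a multiline key: value map. A sketch of the equivalent of the gsutil command above:

```yaml
steps:
  - id: upload-files
    uses: google-github-actions/upload-cloud-storage@main
    with:
      path: /path/to/folder
      destination: bucket-name
      headers: |-
        content-type: text/xml
        cache-control: public, max-age=600
```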

set-output is deprecated

TL;DR

Running the google-github-actions/upload-cloud-storage@v0 action produces the following warning:

Warning: The `set-output` command is deprecated and will be disabled soon. Please upgrade to using Environment Files. For more information see: https://github.blog/changelog/2022-10-11-github-actions-deprecating-save-state-and-set-output-commands/

Link from message: https://github.blog/changelog/2022-10-11-github-actions-deprecating-save-state-and-set-output-commands/

Expected behavior

Warning should not occur.

Observed behavior

No response

Action YAML

on:
  workflow_dispatch:

jobs:
  static:
    runs-on: ubuntu-latest

    steps:
    - id: 'upload-file'
      uses: 'google-github-actions/upload-cloud-storage@v0'
      with:
        path: './path/to/file'
        destination: 'bucket'

Log output

No response

Additional information

No response

Synchronize folder as the bucket root

TL;DR

Uploading a folder like this uploads to bucket-name/folder. I want to synchronize to the bucket root: /path/to/folder would be the root of my bucket, so /path/to/folder/a would be bucket-name/a, not bucket-name/folder/a.

steps:
  - id: upload-files
    uses: GoogleCloudPlatform/github-actions/upload-cloud-storage@master
    with:
      path: /path/to/folder
      destination: bucket-name

Design

Action YAML

I propose introducing a root input.

steps:
  - id: upload-files
    uses: GoogleCloudPlatform/github-actions/upload-cloud-storage@master
    with:
      path: /path/to/folder
      destination: bucket-name
      root: true
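
The shipped equivalent of the proposed root input is parent: false, which strips the parent directory name from uploaded object paths. A sketch:

```yaml
steps:
  - id: upload-files
    uses: google-github-actions/upload-cloud-storage@main
    with:
      path: /path/to/folder
      destination: bucket-name
      parent: false   # /path/to/folder/a uploads as bucket-name/a
```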

Want to replace all files in a folder rather than merge.

TL;DR

When I upload a new version of my website, I want to remove all existing files and replace them with my new ones. The current design merges my new files into the existing contents, which leaves unneeded (and likely deprecated) files behind.

Design

Action YAML

- name: Upload to Google
  uses: google-github-actions/upload-cloud-storage@main
  with:
    credentials: ${{ secrets.credentials}}
    path: 'sourcedir'
    destination: ${{ secrets.bucket}}
    parent: false
    remove: true

In the above, I would expect the contents of the entire bucket to be deleted. If parent is true, I would expect the contents of the corresponding folder to be deleted.

Resources
#40

  • Link to answer of current implementation

gsutil rm gs://bucket/**
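
Until the action supports deletion, a workaround is gsutil rsync with -d, which deletes remote objects that are absent locally. A sketch (requires setup-gcloud; the bucket secret name mirrors the YAML above):

```yaml
- uses: 'google-github-actions/setup-gcloud@v1'
- name: Mirror folder to bucket
  # -d deletes remote files not present locally; -r recurses; -m parallelizes.
  run: gsutil -m rsync -d -r sourcedir "gs://${{ secrets.bucket }}"
```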

v0 tag does not exist.

FYI,

I am getting the following error when trying to use this action following the example in the readme.

Error: Unable to resolve action `google-github-actions/upload-cloud-storage@v0`, unable to find version `v0`

It looks like GitHub Actions doesn't use semantic versioning and will not resolve v0 to the latest v0.4.2 tag. I am guessing a v0 tag needs to be created and updated whenever a new version is released.

Perhaps a latest branch could be used instead?

Thanks.

gzip doesn't work as a parameter

TL;DR

Passing gzip: false as a parameter in UploadOptions to Storage has no effect on the uploaded object's encoding.

Expected behavior
I expected the content-encoding not to be gzipped

Observed behavior
Screen Shot 2021-02-04 at 12 29 12 PM

Reproduction

I published a forked version of this with gzip hardcoded to false. omrihq@2f3b249

And the encoding is still gzipped.
Repository
Repo: https://github.com/omrihq/upload-cloud-storage/

Additional information
I recently merged this expecting to fix this: #15
I will happily put up another PR when I get to the bottom of it, however this seems not as straightforward as adding the flag originally.

Thanks @bharathkkb and @averikitsch for being so responsive, I appreciate it!

Failed by timeout on a large number of files

We use this action to upload reports after building our projects.
The upload fails in one of the large projects; after 8 minutes, an error occurs:

Error: Multiple errors occurred during the request. Please see the `errors` array for complete details.

    1. Request Timeout
    2. <!DOCTYPE html>
<html lang=en>
  <meta charset=utf-8>
  <meta name=viewport content="initial-scale=1, minimum-scale=1, width=device-width">
  <title>Error 408 (Request Timeout)!!1</title>
  <style><!-- skip --></style>
  <a href=//www.google.com/><span id=logo aria-label=Google></span></a>
  <p><b>408.</b> <ins>That’s an error.</ins>
  <p>Your client has taken too long to issue its request.  <ins>That’s all we know.</ins>

Characteristics of the directory for uploading:

# find target/site/clover -type f | wc -l 
    3645
# du -sh target/site/clover
107M	target/site/clover

For other smaller projects, this action works perfectly.

If I set up google-github-actions/setup-gcloud and run gsutil -mq cp -r <path> <destination>, the upload succeeds within 2 minutes.
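
One knob worth trying before switching to gsutil is the concurrency input, which caps the number of parallel uploads. A sketch (the bucket name and value are illustrative):

```yaml
- uses: 'google-github-actions/upload-cloud-storage@v0'
  with:
    path: 'target/site/clover'
    destination: 'my-bucket'   # placeholder bucket name
    concurrency: 20            # fewer simultaneous requests per batch
```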

I can't link to the code because the repository is not public.

Getting error Error: Action failed with error: Error: The "GitHub Action workflow must specify exactly one of "workload_identity_provider" or "credentials_json"!

TL;DR

Even though I am using a single credential, I am getting an error stating "Error: Action failed with error: Error: The GitHub Action workflow must specify exactly one of "workload_identity_provider" or "credentials_json"!"
Expected behavior
The workflow should work, as far as I understand.

Reproduction

Action YAML

name: Build and Deploy to Cloud Run

# Defining Triggers.
# In our case : any push to the dev_omipar branch
on:
  push:
    branches:
    - dev_omipar

# Defining ENV vars internal to the flow
env:
  PROJECT_ID: datalake-298101
  REGION: us-central1
  DEPLOYMENT_NAME: omipar-dev
  IMAGE: omipar-dev

jobs:
  setup-build-publish-deploy:
    name: Setup, Build, Publish, and Deploy
    runs-on: ubuntu-latest

    strategy:
      matrix:
        node-version: [12.x]

    steps:

    # Checking out code from github repo
    - name: Checkout
      uses: actions/checkout@v2
    - name: Use Node.js ${{ matrix.node-version }}
      uses: actions/setup-node@v1
      with:
        node-version: ${{ matrix.node-version }}
    - name: npm install and test
      run: |
        npm install
        npm test > coverage.txt
      env:
        CI: true
    - uses: papeloto/action-zip@v1
      with:
        files: coverage/ index.json
        dest: result.zip

    # Setup gcloud SDK access to cloud resources by refering to github Secrets store
    - uses: google-github-actions/setup-gcloud@master
      with:
        service_account_key: ${{ secrets.gcp_credentials }}
        project_id: ${{ env.PROJECT_ID }}


    # Configure Docker to use the gcloud command-line tool as a helper for authentication
    - run: |-
        gcloud --quiet auth configure-docker
    
    # upload
    - name: 'Upload assets to GCP bucket, CDN'
      uses: google-github-actions/[email protected]
      with:
        credentials_json: ${{ secrets.gcp_credentials }}
    - uses: google-github-actions/[email protected]
      with:
        path: coverage
        destination: omipar-release-test-case-reports
    
    # Build the Docker image
    - name: Build
      run: |-
        sudo docker build --tag "gcr.io/$PROJECT_ID/$IMAGE:latest" .
    # Push the updated build into the GCR registry
    - uses: mattes/gce-docker-push-action@v1
      with:
        creds: ${{ secrets.gcp_credentials }}
        src: gcr.io/${{ env.PROJECT_ID }}/${{ env.IMAGE}}:latest
        dst: gcr.io/${{ env.PROJECT_ID }}/${{ env.IMAGE}}:latest
        
    # - name: Send mail
    #   uses: dawidd6/action-send-mail@v3
    #   with:
    #     # Required mail server address:
    #     server_address: smtp.gmail.com
    #     # Required mail server port:
    #     server_port: 465
    #     # Optional (recommended): mail server username:
    #     username: ${{secrets.USERNAME}}
    #     # Optional (recommended) mail server password:
    #     password: ${{secrets.PASSWORD}}
    #     # Required mail subject:
    #     subject: Github Actions job result
    #     # Required recipients' addresses:
    #     to: [email protected]
    #     # Required sender full name (address can be skipped):
    #     from: Mihir Mehta # <[email protected]>
    #     # Optional whether this connection use TLS (default is true if server_port is 465)
    #     secure: true
    #     # Optional plain body:
    #     body: Build job of ${{github.repository}} completed successfully!
    #     # Optional HTML body read from file:
    #     # html_body: file://result.zip
    #     attachments: result.zip
    #     # Optional unsigned/invalid certificates allowance:
    #     ignore_cert: true
    #     # Optional converting Markdown to HTML (set content_type to text/html too):
    #     convert_markdown: true
    - name: Send email through SendGrid
      uses: peter-evans/sendgrid-action@v1
      env:
        SENDGRID_API_KEY: ${{ secrets.SENDGRID_API_KEY }}
        
    - name: Deploy to Cloud Run
      id: deploy
      uses: google-github-actions/deploy-cloudrun@main
      with:
        service: ${{env.DEPLOYMENT_NAME}}
        image: gcr.io/${{ env.PROJECT_ID }}/${{ env.IMAGE}}:latest
        project_id: ${{ env.PROJECT_ID }}
        region: ${{env.REGION}}
        credentials: ${{ secrets.gcp_credentials }}
        flags: --service-account [email protected] --memory 2Gi

Thank you

10 mb limit uploads

Question

Is there a 10 MB limit for uploads?
I used this action and saw my files being truncated to 10 MB.

Upload folder with dot in name on start

Hi guys,

I am uploading a whole folder that contains a .next directory.
This folder is missing from the Cloud Storage bucket. I tried the process three times with the same result.

Is this a bug, or is there some configuration that allows these hidden dot folders?

Screenshot 2021-08-12 at 1 47 30

Screenshot 2021-08-12 at 1 48 38

Part of my workflow:

      - name: 'Copy assets out of the image'
        run: |
          mkdir ${{ env.ASSETS_FOLDER }}
          docker images
          export CID=$(docker create pay)
          docker cp $CID:/usr/src/.next ./${{ env.ASSETS_FOLDER }}
          cp -R public ./${{ env.ASSETS_FOLDER }}
          cd ${{ env.ASSETS_FOLDER }} && ls -la

      - name: 'Upload assets to GCP bucket, CDN'
        uses: google-github-actions/upload-cloud-storage@main
        with:
          credentials: ${{ env.GCP_SA_KEY }}
          path: ${{ env.ASSETS_FOLDER }}
          destination: ${{ env.GCP_BUCKET_PATH }}
          parent: false

Suddenly began to fail with the error: path should be a `path.relative()`d string

TL;DR

I've been using 'upload-cloud-storage' in multiple projects for a while now, and everything was fine until recently, when it began to fail with the following error:

Error: google-github-actions/upload-cloud-storage failed with path should be a `path.relative()`d string, but got "D:\a\project\archive.1.0.8132.27186-b6b5283.tar.gz"

Here are the relevant parts of my GitHub workflow:

...
...
jobs:
    build:
        name: Compile and package
        runs-on: windows-latest
...
...
         -
            name: Collect UnitTests files
            id: collect-test-files
            env:
                PRODUCT_VERSION: ${{ steps.create-version-id.outputs.version }}
            run: |
                $fileName = "archive.$($env:PRODUCT_VERSION).tar.gz"

                Start-Process -NoNewWindow -FilePath 'tar.exe' -ArgumentList  "-cz", "-f $fileName", ".\UnitTestFiles" -Wait;

                $fileName = "$(Get-Location)\$fileName"

                Write-Host "::set-output name=unittest_file::$fileName"
        -
            name: Authenticate to Google Cloud
            uses: 'google-github-actions/auth@v0'
            with:
                credentials_json: '${{ secrets.GCP_CREDENTIALS }}'
        -
            name: Upload to Google Storage bucket
            uses: 'google-github-actions/upload-cloud-storage@v0'
            with:
                path: '${{ steps.collect-test-files.outputs.unittest_file }}'
                destination: 'integration-tests-cache'
                parent: false

What can I do about it?
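
A likely workaround is to hand the action a path relative to the workspace instead of the absolute Windows path. A sketch of the changed step (PowerShell, assuming the archive is created in the workspace root):

```yaml
- name: Collect UnitTests files
  id: collect-test-files
  env:
      PRODUCT_VERSION: ${{ steps.create-version-id.outputs.version }}
  run: |
      $fileName = "archive.$($env:PRODUCT_VERSION).tar.gz"
      Start-Process -NoNewWindow -FilePath 'tar.exe' -ArgumentList "-cz", "-f $fileName", ".\UnitTestFiles" -Wait;
      # Emit the relative name, not "$(Get-Location)\$fileName";
      # the upload step resolves it against the workspace.
      Write-Host "::set-output name=unittest_file::$fileName"
```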

Expected behavior

The file should be uploaded to Google Storage even when given an absolute path, not just a relative one.

Observed behavior

The upload process crashed with an error and failed the workflow.

Action YAML

...
...
jobs:
    build:
        name: Compile and package
        runs-on: windows-latest
...
...
         -
            name: Collect UnitTests files
            id: collect-test-files
            env:
                PRODUCT_VERSION: ${{ steps.create-version-id.outputs.version }}
            run: |
                $fileName = "archive.$($env:PRODUCT_VERSION).tar.gz"

                Start-Process -NoNewWindow -FilePath 'tar.exe' -ArgumentList  "-cz", "-f $fileName", ".\UnitTestFiles" -Wait;

                $fileName = "$(Get-Location)\$fileName"

                Write-Host "::set-output name=unittest_file::$fileName"
        -
            name: Authenticate to Google Cloud
            uses: 'google-github-actions/auth@v0'
            with:
                credentials_json: '${{ secrets.GCP_CREDENTIALS }}'
        -
            name: Upload to Google Storage bucket
            uses: 'google-github-actions/upload-cloud-storage@v0'
            with:
                path: '${{ steps.collect-test-files.outputs.unittest_file }}'
                destination: 'integration-tests-cache'
                parent: false



Sorry, it is a private repo, so I could not paste the whole workflow.

Log output

No response

Additional information

I am running on a Windows 2022 GitHub Actions runner.

Error: failed with error code [object Object]

TL;DR

The action fails intermittently with an obscure error (Error: google-github-actions/upload-cloud-storage failed with error code [object Object]), recently more often than before.

Expected behavior

It shouldn't fail

Observed behavior

Upload just fails with Error: google-github-actions/upload-cloud-storage failed with error code [object Object]

Action YAML

name: Sync

on:
  schedule:
    - cron: '17/30 * * * *'
  workflow_dispatch:

jobs:
  sync:
    runs-on: ubuntu-latest

    permissions:
      contents: 'write'
      id-token: 'write'

    env:
      BUCKET_NAME: linux-mirror-db
      DB_NAME: mirror.sl3
      URL_PREFIX: mirror-chunk-
      # 10MB = 10 * 1024 * 1024 = 10485760
      SERVER_CHUNK_SIZE: 10485760
      SUFFIX_LENGTH: 3

    steps:
      - uses: actions/checkout@v2

      - name: Download CSVs
        run: |
          wget https://${BUCKET_NAME}.storage.googleapis.com/fixes.csv
          wget https://${BUCKET_NAME}.storage.googleapis.com/reported_by.csv
          wget https://${BUCKET_NAME}.storage.googleapis.com/tags.csv
          wget https://${BUCKET_NAME}.storage.googleapis.com/upstream.csv
          wget https://${BUCKET_NAME}.storage.googleapis.com/cve.csv

      - name: Recreate SQLite3 DB
        run: |
          sudo apt update && sudo apt install sqlite3
          rm -f mirror.sl3
          sqlite3 mirror.sl3 '.mode csv' '.import fixes.csv fixes'
          sqlite3 mirror.sl3 '.mode csv' '.import reported_by.csv reported_by'
          sqlite3 mirror.sl3 '.mode csv' '.import tags.csv tags'
          sqlite3 mirror.sl3 '.mode csv' '.import upstream.csv upstream'
          sqlite3 mirror.sl3 '.mode csv' '.import cve.csv cve'
          sqlite3 mirror.sl3 'CREATE INDEX fixes_commit ON fixes (`commit`,`fixes`);'
          sqlite3 mirror.sl3 'CREATE INDEX fixes_fixes ON fixes (`fixes`,`commit`);'
          sqlite3 mirror.sl3 'CREATE INDEX reported_by_commit ON reported_by (`commit`,`reported_by`);'
          sqlite3 mirror.sl3 'CREATE INDEX tags_commit ON tags (`commit`,`tags`);'
          sqlite3 mirror.sl3 'CREATE INDEX upstream_commit ON upstream (`commit`,`upstream`);'
          sqlite3 mirror.sl3 'CREATE INDEX upstream_upstream ON upstream (`upstream`,`commit`);'
          sqlite3 mirror.sl3 'CREATE INDEX cve_commit ON cve (`commit`,`cve`);'
          sqlite3 mirror.sl3 'CREATE INDEX cve_cve ON cve (`cve`,`commit`);'
          sqlite3 mirror.sl3 'pragma journal_mode = delete; pragma page_size = 1024; vacuum;'
          echo "::set-output name=databaseChecksum::$(sha1sum mirror.sl3 | cut -f1 -d' ')"

      - id: 'split-db'
        run: |
          databaseLengthBytes="$(stat --printf="%s" "${DB_NAME}")"
          requestChunkSize="$(sqlite3 "${DB_NAME}" 'pragma page_size')"
          rm -f ${URL_PREFIX}*
          split "${DB_NAME}" --bytes=${SERVER_CHUNK_SIZE} --suffix-length=${SUFFIX_LENGTH} --numeric-suffixes ${URL_PREFIX}
          echo "::set-output name=requestChunkSize::${requestChunkSize}"
          echo "::set-output name=databaseLengthBytes::${databaseLengthBytes}"

      - uses: 'google-github-actions/auth@v0'
        with:
          workload_identity_provider: 'projects/799795028847/locations/global/workloadIdentityPools/github-pool/providers/github-provider'
          service_account: '[email protected]'

      - id: 'upload-db'
        uses: 'google-github-actions/upload-cloud-storage@v0'
        with:
          parent: false
          path: '${{ env.DB_NAME }}'
          destination: '${{ env.BUCKET_NAME }}'

      - id: 'upload-db-chunks'
        uses: 'google-github-actions/upload-cloud-storage@v0'
        with:
          destination: '${{ env.BUCKET_NAME }}/'
          glob: '${{ env.URL_PREFIX }}*'
          path: ./
          gzip: false
          headers: |-
            cache-control: max-age=30, must-revalidate

      - name: "Update config"
        env:
          requestChunkSize: '${{ steps.split-db.outputs.requestChunkSize }}'
          databaseLengthBytes: '${{ steps.split-db.outputs.databaseLengthBytes }}'
          databaseChecksum: '${{ steps.split-db.outputs.databaseChecksum }}'
          uploadedPath: 'https://storage.googleapis.com/${{ env.BUCKET_NAME }}/${{ env.URL_PREFIX }}'
        run: |
          echo '{
              "serverMode": "chunked",
              "requestChunkSize": '${requestChunkSize}',
              "databaseLengthBytes": '${databaseLengthBytes}',
              "serverChunkSize": '${SERVER_CHUNK_SIZE}',
              "urlPrefix": "'${uploadedPath}'",
              "suffixLength": '${SUFFIX_LENGTH}'
          }' | tee "config.json"

      - id: 'upload-config'
        uses: 'google-github-actions/upload-cloud-storage@v0'
        with:
          parent: false
          path: 'config.json'
          destination: '${{ env.BUCKET_NAME }}'
          headers: |-
            cache-control: max-age=3, must-revalidate

Log output

https://github.com/sirdarckcat/linux-1/runs/5812581320?check_suite_focus=true

https://github.com/sirdarckcat/linux-mirror/runs/5812493690?check_suite_focus=true

Additional information

This happens across different projects, with different workflows.

Getting artifact upload public URL in steps after upload

Hello,

In a GitHub Actions job, after the google-github-actions/upload-cloud-storage@main step, I would like to get the public URL of the artifact uploaded in that step.

The idea is a job using the following steps as an example:

  • Build a binary
  • Upload binary as an artifact
  • Fire off a Slack message with a public URL of said binary

How would you get the public URL of the artifact in a subsequent step?

- uses: google-github-actions/upload-cloud-storage@main
  name: Upload Artifact (json) to Google Cloud Storage
  with:
    path: file
    destination: bucket/output
    env-pub: FOOBAR
- name: Use URL for something
  run: echo "${{ env.FOOBAR }}"

The last command would give me a public URL to the uploaded artifact.
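
There is no env-pub input today, but later releases expose an uploaded output listing the uploaded object names; for a publicly readable bucket a URL can be assembled as https://storage.googleapis.com/<bucket>/<object>. A sketch (note uploaded may be a comma-separated list when several files are uploaded):

```yaml
- id: upload
  uses: google-github-actions/upload-cloud-storage@main
  name: Upload Artifact (json) to Google Cloud Storage
  with:
    path: file
    destination: bucket/output
- name: Use URL for something
  # Builds the public URL from the step output; single-file case.
  run: echo "https://storage.googleapis.com/bucket/output/${{ steps.upload.outputs.uploaded }}"
```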

release-please job failing

TL;DR

The release-please action is failing. It seems related to googleapis/release-please-action#164. A workaround is to pin to the previous version.

Expected behavior

Observed behavior

Reproduction

Action YAML

# Paste your complete GitHub Actions YAML here, removing
# any sensitive values.

Repository

Additional information

Absolute path with parent false results in base directory also being uploaded

TL;DR

Absolute path with parent false results in base directory being uploaded

Expected behavior
With a directory structure like nested/in/here/hello.txt and parent: false, only hello.txt should be uploaded to the bucket root.

Observed behavior
here/hello.txt is uploaded instead.

Reproduction

Action YAML

          path: /home/runner/work/test-upload-gcs-action/test-upload-gcs-action/nested/in/here
          parent: false

Repository

Additional information

After refactor in version v0.10.0 mime types are often wrong

TL;DR

In version v0.10.0, all of our files get the MIME type image/svg+xml in the GCP bucket.

Expected behavior

The previous version set the correct MIME type based on the file.

Observed behavior

All files (.js, .css, etc.) get image/svg+xml as the MIME type in GCP.

Action YAML

name: Deploy production

on:
  push:

jobs:
  deploy-production:
    runs-on: ubuntu-latest
    name: Deploy
    permissions:
      contents: 'read'
      id-token: 'write'

    steps:
      - uses: actions/checkout@v2
        with:
          ref: main

      - uses: actions/setup-node@v2
        with:
          node-version: '16.13'

      - name: Install
        run: npm i

      - name: Build
        run: npm run build

      - id: 'auth'
        uses: 'google-github-actions/auth@v0'
        with:
          credentials_json: '${{ secrets.GCP_CREDENTIALS_PRODUCTION }}'

      - id: 'upload-build-folder'
        uses: 'google-github-actions/upload-cloud-storage@v0'
        with:
          path: './build'
          destination: 'path/to/folder'
          parent: false

Log output

No response

Additional information

No response

Parent does not omit path

TL;DR

parent: false does not behave as expected.

Expected behavior

Only contents of directory specified in path is copied to GCS destination.

Observed behavior

The parent directory is included whether parent is set to true or false.

Reproduction

Working directory

build/
  foo/some-bundle.js
  bar/other-bundle.js

Action YAML

      - name: Upload to Google Cloud Storage
        uses: google-github-actions/upload-cloud-storage@main
        with:
          credentials: ${{ secrets.gcs_credentials }}
          path: build
          parent: false
          destination: some-bucket.example.com/some/destination

Getting error while uploading path.relative()

TL;DR

Error: google-github-actions/upload-cloud-storage failed with path should be a `path.relative()`d string

I tried a relative as well as an absolute path; neither works.

- name: 'upload file'
  uses: 'google-github-actions/upload-cloud-storage@v0'
  with:
    path: '/tmp/test.txt'
    destination: 'bucket_name/test.txt'
- name: 'upload file'
  uses: 'google-github-actions/upload-cloud-storage@v0'
  with:
    path: '../../../tmp/test.txt'
    destination: 'bucket_name/test.txt'

Expected behavior

No response

Observed behavior

No response

Action YAML

- name: 'upload file'
  uses: 'google-github-actions/upload-cloud-storage@v0'
  with:
    path: '/tmp/test.txt'
    destination: 'bucket_name/test.txt'
- name: 'upload file'
  uses: 'google-github-actions/upload-cloud-storage@v0'
  with:
    path: '../../../tmp/test.txt'
    destination: 'bucket_name/test.txt'

Log output

Error: google-github-actions/upload-cloud-storage failed with path should be a `path.relative()`d string,

Additional information

No response
