gatsby-source-s3's Introduction

Welcome to the Gatsby User Collective!

The Gatsby UC was born out of a desire to create higher-quality plugins in the Gatsby ecosystem by reducing maintainer workloads.

Goals

Most Gatsby plugin projects aren't massive. Sometimes they need regular maintenance; much of the time they do not. Either way, original authors can become unable or unwilling to maintain a plugin at any given moment, or perhaps indefinitely. This is not a condemnation; it's a fact of life that maintainers sometimes need to move on. No matter the cause, this results in plugins going unmaintained by their authors. Others are willing and able to help, but the original maintainer remains a bottleneck that cannot be bypassed. At times this means fixing bugs in less-than-ideal ways, creating duplicate implementations of plugins, and so on. The Gatsby User Collective's goal is to democratize the maintenance of these plugins so that more folks can assist in maintaining them.

Info

The details of this community are still being fleshed out. As problems arise, we will solve them together and in an open fashion. For now, check out:

  • Code of conduct [TBA]
  • Submitting a plugin to the UC [TBA]
  • ...

gatsby-source-s3's People

Contributors

dependabot[bot] · lpmraven · moonmeister · renovate[bot] · robinmetral · vacilando

gatsby-source-s3's Issues

Can't read markdown files from S3

query IndexQuery {
  allS3Image {
    edges {
      node {
        Key
        Url
      }
    }
  }
}

npm install @robinmetral/gatsby-source-s3

I am using this plugin, and my markdown files are in an S3 bucket. When I use the allS3Object query, the markdown files are not read, and we can't find a query that returns them. Please help.

This is urgent; our project is stuck on this.

Thank you

test: add e2e test case for sourcing from a private bucket

Follow-up on #21 and #24

After resolving #24, we should add an e2e test case to make sure that sourcing from private buckets is handled properly moving forward.

We need to:

  • create a private bucket with some images to source (and document the bucket's policy for the README)
  • source them in the example Gatsby site
  • assert that sourcing succeeds in Cypress (see the sketch below)
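
A minimal Cypress sketch for this case, assuming the example site tags sourced images with a hypothetical data-testid of "s3-image":

describe("sourcing from a private bucket", () => {
  it("renders images sourced from the private bucket", () => {
    cy.visit("/");
    // At least one sourced image should make it into the DOM.
    cy.get("[data-testid=s3-image]").should("have.length.greaterThan", 0);
  });
});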

test: add e2e testing for plugin

Test cases to cover:

  • sourcing from a private bucket
  • sourcing from multiple buckets
  • sourcing from a bucket with over 1000 objects (using the ContinuationToken, waiting for #20)

Ideally, the tests are run in CI against a real example Gatsby site pulling from real S3 buckets (no fixtures). This would ensure that the plugin's core functionality is covered.

Filtering images by path -- possible without the Url key?

Unlike https://github.com/DSchau/gatsby-source-s3, this plugin supports more than 1000 items in a bucket, which is GREAT, so we planned to switch all our projects to this one.

However, it does not have Url as a key.

This seems to be a major problem since we often need to show only images on certain paths (folders) in the bucket.

Is there a way to do it in V2?

Or do we have to downgrade to 1.2.0?
Will V1 continue being supported alongside V2?
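
For reference, a hedged sketch of what prefix filtering could look like with Gatsby's regex filter, assuming the node type still exposes a Key field and using a hypothetical gallery/ prefix:

import { graphql } from "gatsby";

// Query only the objects whose Key starts with "gallery/".
export const query = graphql`
  query GalleryImages {
    allS3Object(filter: { Key: { regex: "/^gallery/" } }) {
      nodes {
        Key
      }
    }
  }
`;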

Sourcing markdown files from S3

I am building a blog project that reads content dynamically from an S3 bucket. The plugin I am using is /plugins/@robinmetral/gatsby-source-s3/.

In S3 there is a bucket named tia, and in it a markdown file named post-1.md. In post-1.md there is a title, a featured image URL, a date, and content. When I use the allS3Object query, only the bucket name, key, and URL are displayed.

What I need is a query that returns the title, content, and featured image URL from post-1.md. We can't find such a query; please help me ASAP, my project is stuck on this.

Compatibility with other S3-compliant solutions (e.g. DigitalOcean Spaces)

Hi Robin, thanks for developing and maintaining this plugin. Though I'm not confident enough in my ability to create a robust and testable PR for this, I think it'd be fairly easy to change the way the plugin conducts AWS auth in order to make it compatible with other S3-compliant solutions such as DigitalOcean Spaces.

The current authentication process involves the AWS config object:

const { aws: awsConfig, buckets } = pluginOptions;
AWS.config.update(awsConfig);
const s3 = new AWS.S3();

To expand this plugin's compatibility with other S3-compliant solutions, couldn't we add a configurable endpoint to the plugin options (one that still defaults to an S3 endpoint and uses the region property) and perhaps just pass the config object to the S3 constructor?

const someOtherS3CompliantEndpoint = `https://${region}.digitaloceanspaces.com`; // e.g.
const s3 = new AWS.S3({
    endpoint: someOtherS3CompliantEndpoint,
    accessKeyId: process.env.CLIENT_KEY,
    secretAccessKey: process.env.CLIENT_SECRET
});
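
In gatsby-config.js, the options could then look something like this (hypothetical endpoint option, not currently supported by the plugin; the env var names and Space name are placeholders):

{
  resolve: `@robinmetral/gatsby-source-s3`,
  options: {
    aws: {
      accessKeyId: process.env.SPACES_KEY,
      secretAccessKey: process.env.SPACES_SECRET,
      region: "fra1",
      // Hypothetical option: passed through to the AWS.S3 constructor.
      endpoint: "https://fra1.digitaloceanspaces.com",
    },
    buckets: ["my-space"],
  },
},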

The automated release is failing 🚨

🚨 The automated release from the master branch failed. 🚨

I recommend you give this issue a high priority, so other packages depending on you could benefit from your bug fixes and new features.

You can find below the list of errors reported by semantic-release. Each one of them has to be resolved in order to automatically publish your package. I'm sure you can resolve this 💪.

Errors are usually caused by a misconfiguration or an authentication problem. With each error reported below you will find an explanation and guidance to help you resolve it.

Once all the errors are resolved, semantic-release will release your package the next time you push a commit to the master branch. You can also manually restart the failed CI job that runs semantic-release.

If you are not sure how to resolve this, here are some links that can help you:

If those don’t help, or if this issue is reporting something you think isn’t right, you can always ask the humans behind semantic-release.


Invalid npm token.

The npm token configured in the NPM_TOKEN environment variable must be a valid token that allows publishing to the registry https://registry.npmjs.org/.

If you are using Two-Factor Authentication, make sure the auth-only level is configured; semantic-release cannot publish with the default auth-and-writes level.

Please make sure to set the NPM_TOKEN environment variable in your CI with the exact value of the npm token.


Good luck with your project ✨

Your semantic-release bot 📦🚀

Handle empty bucket

Great plugin, thanks for creating it!

Sometimes I don't want to process any images, for example in my CI pipeline. An easy way to accomplish that is to do something like this in gatsby-config.js:

{
  buckets: process.env.GATSBY_SKIP_IMAGE_PROCESSING
    ? []
    : ["my-s3-bucket"],
  // ...
}

and then do GATSBY_SKIP_IMAGE_PROCESSING=1 gatsby build to pretend there are no images.

This is much easier than trying to make your GraphQL queries adapt to the environment variable, because Gatsby GraphQL queries are pretty static.

However, when I do that, I get a Gatsby error:

 ERROR #85923  GRAPHQL

There was an error in your GraphQL query:

Cannot query field "allS3Object" on type "Query".

If you don't expect "allS3Object" to exist on the type "Query" it is most likely a typo.
However, if you expect "allS3Object" to exist there are a couple of solutions to common problems:

- If you added a new data source and/or changed something inside gatsby-node.js/gatsby-config.js, please try a restart of your development server
- The field might be accessible in another subfield, please try your query in GraphiQL and use the GraphiQL explorer to see which fields you can query and what shape they have
- You want to optionally use your field "allS3Object" and right now it is not used anywhere. Therefore Gatsby can't infer the type and add it to the GraphQL schema. A quick fix is to add at least one entry with that field ("dummy content")

It is recommended to explicitly type your GraphQL schema if you want to use optional fields. This way you don't have to add the mentioned "dummy content". Visit our docs to learn how you can define the schema for "Query":
https://www.gatsbyjs.org/docs/schema-customization/#creating-type-definitions

This makes sense, but it would be nice if it Just Worked, or alternatively, if there were documentation in this plugin's README explaining how to add the missing type information.
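
For anyone hitting this, a possible workaround is to define the type explicitly in gatsby-node.js. A minimal sketch, assuming the node type is S3Object and that the field names below match what the plugin creates (check GraphiQL for your actual schema):

// gatsby-node.js
exports.createSchemaCustomization = ({ actions }) => {
  // Explicitly typing S3Object keeps allS3Object queries valid
  // even when the buckets array is empty and no nodes are sourced.
  actions.createTypes(`
    type S3Object implements Node {
      Key: String
      Bucket: String
    }
  `);
};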

V2: Error logs of 'wrong url'

Hi @robinmetral, I just upgraded to version 2; great work on getting the sourcing to work with private buckets! 🎉

I'm just testing it out with my websites. One unknown issue I'm getting is all of these error logs, which is unusual, as I can confirm the image downloads are successful and the files are added to the static folder.

(Screenshot: error logs, 2020-08-14)

I'm looking into this now.

How to handle object expiration with this plugin?

Hi Robin, first of all, thank you very much for this plugin. I had this question and wasn't sure if this is the right place for it, so I apologise in case it's not. I was building a photo gallery with more than 1000 photos, using allS3Object > edges > node > Url. The images are requested, but they return a 403 error saying the object has expired. I searched Google and found that I need to send the request with a presigned URL. I am not that technical, so I thought I should ask here whether the plugin has an option to do that. Thank you.
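
For context, the plugin doesn't document an option for this, but generating a presigned GET URL with the AWS SDK (v2) looks roughly like the sketch below; the bucket, key, and region are placeholders:

const AWS = require("aws-sdk");

const s3 = new AWS.S3({ region: "us-east-1" });

// Sign a GET request for one object; the URL stays valid for `Expires` seconds.
const url = s3.getSignedUrl("getObject", {
  Bucket: "my-bucket",
  Key: "photos/photo-1.jpg",
  Expires: 60 * 60,
});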

Error creating file node for S3 object key using Gatsby 4.4

23:08:41 PM: ERROR Error creating file node for S3 object key "myimage.jpg": TypeError: this is not a Date object.

This only started happening once we upgraded to Gatsby 4.4.
After downgrading to Gatsby 4.3 the problem disappears.

Do others see this? Is there a solution?

Attaching S3 images to another node

I just wanted to mention the idea of maybe documenting another way of using this plugin. The site I am building is an e-commerce platform with thousands of images (for all the products). This presented a major issue with using Gatsby: querying images. For a long time, I had a component that queried all images and matched them up to their respective products (like proposed in this Stack Overflow post). This is highly inefficient, throwing warnings about the duration of the query.

An alternative is to attach the imageFile to the product at the data level, rather than when trying to render.

src/gatsby-api/create-resolvers/index.js

const resolvers = {
    AWSAppSync_Product: {
        imageFile: {
            type: 'File',
            resolve: async (source, args, context, info) => {
                // Find the S3Object whose Key matches the product's image1 string.
                const node = await context.nodeModel.runQuery({
                    query: {
                        filter: {
                            Key: { eq: source.image1 }
                        }
                    },
                    type: 'S3Object',
                    firstOnly: true
                });

                // Return the local File node if the object was sourced.
                if (node && node.imageFile) return node.imageFile;
            }
        },
    },
};

module.exports = {
    resolvers
};

gatsby-node.js

exports.createResolvers = async ({ createResolvers }) => {
    createResolvers(resolvers)
}

src/components/image/index.js

import React from 'react'
import Img from 'gatsby-image'

export const Image = props => {
  if (props.imageFile && props.imageFile.childImageSharp && props.imageFile.childImageSharp.fluid) {
    return <Img className={props.imgClassName} alt={props.alt} fluid={props.imageFile.childImageSharp.fluid} />;
  }
  // Render nothing when the image hasn't been sourced, instead of returning undefined.
  return null;
};

Then use it like:

<Image
  imageFile={product.imageFile}
  alt=""
/>

AWSAppSync_Product is the type of node I am attaching my File to (it can be found in the GraphQL playground on localhost). The resolver matches the Key of the S3Object with image1 (which is a string) on the product. This allows me to directly use the product images without having to run a query inside the image component.

In my opinion, this is a valuable piece of information once you wrap your head around it and it certainly has helped me a lot. Thanks @Js-Brecht.

HTTPError: Response code 403 (Forbidden)

Above a certain number of images (a few hundred, perhaps) during build (or develop) I get a barrage of errors like this:

error failed to process https://s3.amazonaws.com/MYBUCKET/path/to/image.jpg?AWSAccessKeyId=XXX&Expires=1605742170&Signature=zzz
HTTPError: Response code 403 (Forbidden)

The AWS setup is just like in https://github.com/robinmetral/gatsby-source-s3#aws-setup ... granting s3:ListBucket and s3:GetObject for the bucket.
First I granted these just in the accessing user's IAM access policy, then also in the S3 bucket policy (I'm not sure what the difference is; perhaps it could be clarified in the setup instructions).
I even experimented with making the bucket list and items accessible to anonymous users... no help.

The S3 settings might be a red herring, since things work for a small number of images. The largest number I got to work was 175; anything above that produces a shower of these errors.
Why would it work for a small but not a larger number of images?

Happens locally as well as on the server (Netlify).

feat: handle private buckets

The plugin officially doesn't support sourcing from private buckets, although it's been reported to work in #19.

We should investigate if there's anything missing from the plugin to handle private buckets. The README should also be updated with info on how to properly set up a private S3 bucket and IAM role for sourcing with this plugin.
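
As a starting point for that README section, a minimal identity policy for the IAM user whose credentials the plugin uses might look like this (the bucket name is a placeholder):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::MY_BUCKET"
    },
    {
      "Effect": "Allow",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::MY_BUCKET/*"
    }
  ]
}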

localFile does not appear in graphiQL

Hello, I want to get the images from my AWS S3 bucket and I'm using this Gatsby plugin, but I can't query them because localFile doesn't exist.

Here is the configuration from gatsby-config.js:

{
  resolve: `@robinmetral/gatsby-source-s3`,
  options: {
    aws: {
      accessKeyId: 'my accessKeyId',
      secretAccessKey: 'my secretAccessKey',
      region: 'eu-central-1',
    },
    buckets: ['my bucket (only one bucket)'],
  },
},

I have the following S3 configuration:

  • the S3 bucket is private
  • read-only access based on an IAM user
  • the region is 'eu-central-1'

Error: failed to download

Hi @robinmetral, I am getting the error for every image:

 ERROR 

failed to process https://s3.us-east-1.amazonaws.com/<redacted bucketname>/FAJKCAN20001-gold-1.jpg
Failed to download https://s3.us-east-1.amazonaws.com/<redacted bucketname>/FAJKCAN20001-gold-1.jpg after 3 attempts

Do you have any ideas about why this might be? I also think there is an issue with the plugin: after about 200+ of these errors it crashed my terminal and my Mac; it does not end.

My first thought is that this is a permissions issue on the S3 bucket, but I am using the following CloudFormation template, which grants public access:

  ImagesBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: !Sub "<redacted bucketname>"
      AccessControl: PublicRead
    DeletionPolicy: Retain

  ImagesBucketPolicy:
    Type: AWS::S3::BucketPolicy
    Properties:
      PolicyDocument:
        Version: 2012-10-17
        Statement:
          - Sid: PublicReadForGetBucketObjects
            Effect: Allow
            Principal: '*'
            Action:
              - s3:GetObject
            Resource: !Join ["", [!GetAtt ImagesBucket.Arn, "/*"]]
      Bucket: !Ref ImagesBucket

I've also granted AmazonS3ReadOnlyAccess for programmatic access to the user of the accessKeyId.

{
  resolve: `@robinmetral/gatsby-source-s3`,
  options: {
    aws: {
      accessKeyId: process.env.GATSBY_SOURCE_S3_IMAGE_KEY_ID,
      secretAccessKey: process.env.GATSBY_SOURCE_S3_IMAGE_KEY_SECRET,
      region: 'us-east-1'
    },
    buckets: ["<redacted bucketname>"]
  }
},
