
workflow-bq-to-cloudsql

Overview

This workflow exports data from BigQuery with the EXPORT DATA statement and stores the result in Cloud Storage. The export files are then imported, one by one, into Cloud SQL. The workflow is described in detail in this Medium article.
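As a minimal sketch of the export step (not the exact contents of import.yaml), a Workflows step can submit the EXPORT DATA query through the BigQuery connector; bucket, prefix, projectid, and query are assumed to be variables assigned earlier:

# Sketch: run an EXPORT DATA statement via the BigQuery connector.
# The connector waits for the job to complete before the next step runs.
- export_to_gcs:
    call: googleapis.bigquery.v2.jobs.insert
    args:
      projectId: ${projectid}
      body:
        configuration:
          query:
            useLegacySql: false
            # The '*' wildcard is required: BigQuery may shard the export
            # into several files.
            query: ${"EXPORT DATA OPTIONS(uri='gs://" + bucket + "/" + prefix + "*.csv', format='CSV', overwrite=true, header=false) AS " + query}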

Customization

At the top of the import.yaml file, the assignment step lets you configure the following parameters:

  • bucket: The bucket to store the export files
  • prefix: The export file prefix in the export bucket. It can contain a path (path/to/file-prefix)
  • projectid: The project ID
  • query: The BigQuery query to perform to get and format the data correctly
  • instance: The name of the Cloud SQL instance
  • databaseschema: The schema name inside the Cloud SQL instance
  • importtable: The table name in the schema. The data is imported there; the table schema must match the content of your export files.

You can also use Workflow parameters to set them dynamically at execution time.
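For illustration, the assignment step could look something like the sketch below; all values are placeholders, not the repository's defaults:

main:
  params: [args]
  steps:
    - assignment:
        assign:
          - bucket: "my-export-bucket"
          - prefix: "path/to/file-prefix"
          - projectid: ${sys.get_env("GOOGLE_CLOUD_PROJECT_ID")}
          - query: "SELECT * FROM `my_dataset.my_table`"
          - instance: "my-cloudsql-instance"
          - databaseschema: "my_database"
          - importtable: "my_table"
          # To set a value at execution time instead, read it from the
          # workflow arguments, e.g.:  - bucket: ${args.bucket}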

Deployment

Create a service account with the required permissions:

# Create the service account
gcloud iam service-accounts create import-workflow

# Grant the permissions
gcloud projects add-iam-policy-binding <PROJECT_ID> --member="serviceAccount:import-workflow@<PROJECT_ID>.iam.gserviceaccount.com" --role="roles/cloudsql.admin" --condition=None
gcloud projects add-iam-policy-binding <PROJECT_ID> --member="serviceAccount:import-workflow@<PROJECT_ID>.iam.gserviceaccount.com" --role="roles/storage.admin" --condition=None
gcloud projects add-iam-policy-binding <PROJECT_ID> --member="serviceAccount:import-workflow@<PROJECT_ID>.iam.gserviceaccount.com" --role="roles/bigquery.dataViewer" --condition=None
gcloud projects add-iam-policy-binding <PROJECT_ID> --member="serviceAccount:import-workflow@<PROJECT_ID>.iam.gserviceaccount.com" --role="roles/bigquery.jobUser" --condition=None

Deploy to Workflows

gcloud workflows deploy import --source=import.yaml --service-account=import-workflow@<PROJECT_ID>.iam.gserviceaccount.com

Execution

Before running the execution, be sure that:

  • Your Cloud SQL instance is running
  • The schema and the table exist in the instance
  • The table schema is correct according to your BigQuery export

Trigger your workflow execution (add arguments if you parameterized the workflow in the Customization step):

gcloud workflows execute import
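If you parameterized the workflow, pass runtime arguments as JSON with the --data flag. The keys below are hypothetical and must match the fields your workflow reads from args:

gcloud workflows execute import --data='{"bucket":"my-export-bucket","instance":"my-cloudsql-instance"}'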

The command line provides a link to follow the execution results. Wait about a minute and check the result. You can also check your Cloud SQL database to see the imported data.
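To double-check from the database side, you can connect to the instance with gcloud. This is a minimal sketch assuming a PostgreSQL instance and placeholder names (my-cloudsql-instance, my_database, my_table):

# Connect to the Cloud SQL instance (gcloud temporarily allowlists your IP)
gcloud sql connect my-cloudsql-instance --user=postgres

# Then, at the psql prompt:
#   \c my_database
#   SELECT count(*) FROM my_table;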

License

This library is licensed under Apache 2.0. Full license text is available in LICENSE.


Issues

calling PG load_file many times might be quite slow

I haven't used GCP Workflows before, but my intuition leads me to believe that if you can merge the CSV files before importing them, the import into Postgres will be a lot faster.

This may not be an issue for you (if the BigQuery export only creates a few shards), but with clustering set on a GEOGRAPHY-type column, BigQuery created tens of thousands, and even hundreds of thousands, of shards in my experience.

https://stackoverflow.com/a/59444472/14598384

If you KNOW you will always have fewer than 32 shards, this might be another option, but I think the time improvement won't justify the uncertainty in this case (sharding is uncontrollable): https://stackoverflow.com/questions/55114393/google-cloud-storage-joining-multiple-csv-files
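For reference, merging shards could be sketched as follows, keeping the hard limit of 32 source objects per compose call in mind; all bucket and object names below are hypothetical:

# Compose up to 32 export shards into a single object
gsutil compose \
  gs://my-export-bucket/path/to/file-prefix000000000000.csv \
  gs://my-export-bucket/path/to/file-prefix000000000001.csv \
  gs://my-export-bucket/path/to/merged.csv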
