
SA360 Profit Bidder Pipeline

Disclaimer: This is not an official Google product. There is an alternative deployment architecture available here.

Objective

To create an automated data pipeline that runs daily to extract the previous day's conversion data via an SA360 data transfer, generate new conversions with calculated order profit (based on a margin data file) as revenue, and upload the new conversions back into Search Ads 360 (SA360), where they can be leveraged for Custom Bidding and/or reporting.

Solution Architecture

Please find below the architecture of the solution: *(architecture diagram)*

Requirements

The pipeline is built within a Google Cloud project instance and uses the following cloud products and technologies:

- BigQuery
	- Function:
		- SA360 to BigQuery Connector
		- Google Merchant Center (GMC) Data Transfer
		- Store the margin data file
		- Data storage
		- Data transformation
- Google Cloud Functions
	- Function:
		- Execute script to upload new conversions via SA360 API
- Google Cloud Scheduler
	- Function:
		- Trigger cloud function
- Google Cloud Storage *(optional)*
	- Function:
		- Store upload script execution/error logs
- Python 3.7
- Standard SQL
- Campaign Manager 360 (CM360)
	- Create Offline Floodlight Tag
	- Grant service account access
- Search Ads 360 (SA360)
	- Grant service account access

Setup Guide

  1. Create a Google Cloud Project instance. You may create a new project if needed or utilize an existing project, in which case you may skip this step.

  2. Enable required APIs (or APIs can be enabled as you perform each subsequent step).

  3. Create a keyless service account for your project.

  4. Grant service account product permissions

  5. (Optional) Create Cloud Storage Bucket for upload logs

    • Follow steps outlined here to create a GCS bucket named converison_upload_log
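As a sketch, the bucket can also be created from the CLI (the bucket name comes from the step above; the region is an assumption — pick whichever suits your project):

```shell
# Create the optional log bucket; region us-central1 is an assumption
gsutil mb -l us-central1 gs://converison_upload_log
```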
  6. Create BigQuery datasets to segment data.

    • Documentation to create BQ dataset.
    • Create datasets for SA360, GMC and business data.
    • SA360: 1 per required Advertiser
      • Name: <advertiser_name>
      • Default table expiration: 7 days to Never (table data is appended daily; at your discretion)
    • Google Merchant Center (GMC):
      • Name: <account_name>_GMC_feed
      • Default table expiration: 7 days to Never (table data is appended daily; at your discretion)
    • Business data (margin file)
      • Name: business_data
      • Default table expiration: Any
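The datasets above could also be created with the `bq` CLI. This is a sketch; `my_project`, `my_advertiser`, and `my_account` are placeholders, and the 7-day expiration (604800 seconds) is one point in the recommended range:

```shell
# SA360 dataset (one per advertiser), 7-day default table expiration
bq mk --dataset --default_table_expiration 604800 my_project:my_advertiser
# GMC feed dataset
bq mk --dataset --default_table_expiration 604800 my_project:my_account_GMC_feed
# Business data (margin file) dataset, no expiration
bq mk --dataset my_project:business_data
```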
  7. Create BigQuery Data Transfers

    • Create the following data transfers:
      • SA360 (1 per advertiser, as needed) [link]
        • Display Name: Any
        • Schedule: Daily (recommended to run early morning, ex: 4AM EST)
        • Dataset ID: Relevant SA360 Advertiser dataset created in Step 6
        • Agency/Advertiser ID: Both IDs can be found in SA360
      • Google Merchant Center [link]
        • Display Name: Any
        • Schedule: Daily (recommended to run early morning, ex: 4AM EST)
        • Dataset ID: Google Merchant Center dataset created in Step 6
        • Merchant ID: ID can be found in GMC
        • For this project only the Products & product issues option is required and should be checked.
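A transfer config can also be created from the CLI. This is a sketch only: the data-source ID and parameter names shown for SA360 are assumptions — verify them in the BigQuery Data Transfer UI before use, and replace the placeholder IDs with your own:

```shell
# Sketch: SA360 transfer into the advertiser dataset from Step 6.
# Data-source ID and param names are assumptions; IDs are placeholders.
bq mk --transfer_config \
  --data_source=doubleclick_search \
  --target_dataset=my_advertiser \
  --display_name="SA360 daily transfer" \
  --params='{"agency_id":"12345","advertiser_id":"67890"}'
```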
  8. Upload margin data into BigQuery

    • Manually upload margin data (.csv file format recommended) into business_data dataset.
    • A data transfer from Google Cloud Storage may also be used to automatically pull a specified file, refreshing the target table on a set schedule.
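A manual upload can equivalently be done with `bq load`. The table name `margin_data` and the local file name are assumptions for illustration:

```shell
# Load the margin CSV into the business_data dataset;
# --autodetect infers the schema from the file
bq load --autodetect --source_format=CSV \
  business_data.margin_data ./margin_data.csv
```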
  9. Create Scheduled Query

    • Create a scheduled query to run transformation queries for each advertiser.
    • Example Configuration:
      • Scheduled Query Name: <advertiser_name/id> Profit Gen or Any
      • Destination Dataset: Respective advertiser dataset created in Step 6
      • Destination Table: conversion_final_<sa360_advertiser_id>
      • Write Preference: WRITE_TRUNCATE
      • Query String: Reference query code in the sql_query folder.
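The scheduled query can also be set up via the Data Transfer Service CLI. A sketch under the assumption that the `scheduled_query` data source accepts these parameter names; the query string itself is a placeholder — use the actual code from the sql_query folder:

```shell
# Sketch: daily transformation query writing to conversion_final_<advertiser_id>
bq mk --transfer_config \
  --data_source=scheduled_query \
  --target_dataset=my_advertiser \
  --display_name="my_advertiser Profit Gen" \
  --params='{"query":"-- paste query from the sql_query folder here","destination_table_name_template":"conversion_final_67890","write_disposition":"WRITE_TRUNCATE"}'
```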
  10. Create Delegator Cloud Function

    • Create cloud function with the following configurations:
    • Step 1 configuration:
      • Function Name: cloud_conversion_upload_delegator
      • Region: us-central1
      • Trigger: Pub/Sub
      • Authentication: Require authentication
      • Advanced:
        • Memory allocated: 2 GB
        • Timeout: 540 seconds
        • Service Account: App Engine default service account
    • Step 2 configuration:
      • Runtime: Python 3.7
      • Entry point: main
      • Code: Reference code in the conversion_upload_delegator folder
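The same function can be deployed from the CLI. A sketch run from the repo root; the trigger topic name matches the `conversion_upload_delegator` topic used by Cloud Scheduler later in this guide:

```shell
# Sketch: deploy the delegator with the Step 1/Step 2 settings above
gcloud functions deploy cloud_conversion_upload_delegator \
  --region=us-central1 \
  --runtime=python37 \
  --entry-point=main \
  --trigger-topic=conversion_upload_delegator \
  --memory=2048MB \
  --timeout=540s \
  --source=./conversion_upload_delegator
```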
  11. Create Upload Cloud Function - For this step you have the option of standing up either the CM360 upload node or the SA360 upload node.

NOTE: It is recommended to utilize the CM360 API for offline conversion uploads unless your use case can only be supported by the SA360 API.

- **(Option A)** - Create CM360 Cloud Function. [Create cloud function](https://cloud.google.com/functions/docs/deploying/console) with the following configurations:
  - Step 1 configuration:
    - **Function Name:** ```cm360_cloud_conversion_upload_node```
    - **Region:** us-central1
    - **Trigger:** Pub/Sub
    - **Authentication:** Require authentication
    - **Advanced:**
      - **Memory allocated:** 256 MB
      - **Timeout:** 540 seconds
      - **Service Account:** App Engine default service account
  - Step 2 configuration:
    - **Runtime:** Python 3.7
    - **Entry point:** ```main```
    - **Code:** Reference code in the ```CM360_cloud_conversion_upload_node``` folder.

- **(Option B)** - Create SA360 Cloud Function. [Create cloud function](https://cloud.google.com/functions/docs/deploying/console) with the following configurations:
  - Step 1 configuration:
    - **Function Name:** ```sa360_cloud_conversion_upload_node```
    - **Region:** us-central1
    - **Trigger:** Pub/Sub
    - **Authentication:** Require authentication
    - **Advanced:**
      - **Memory allocated:** 256 MB
      - **Timeout:** 540 seconds
      - **Service Account:** App Engine default service account
  - Step 2 configuration:
    - **Runtime:** Python 3.7
    - **Entry point:** ```main```
    - **Code:** Reference code in the ```SA360_cloud_conversion_upload_node``` folder.
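As with the delegator, the upload node can be deployed from the CLI. A sketch for Option B, run from the repo root; the trigger topic matches the `SA360_conversion_upload` topic named in the payload examples later in this guide:

```shell
# Sketch: deploy the SA360 upload node with the settings above
gcloud functions deploy sa360_cloud_conversion_upload_node \
  --region=us-central1 \
  --runtime=python37 \
  --entry-point=main \
  --trigger-topic=SA360_conversion_upload \
  --memory=256MB \
  --timeout=540s \
  --source=./SA360_cloud_conversion_upload_node
```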
  12. Stand up Cloud Scheduler Job(s)

    • Create a Cloud Scheduler job per target advertiser. Please note that each advertiser has its own scheduled job, and frequencies should be staggered by 5 minutes within the same hour.
    • Example configuration:
      • Name: Any
      • Description: Any
      • Frequency:
        • Example: starting everyday at 6 AM staggered by 5 minutes:
          • 0 6 * * *
          • 5 6 * * *
          • 10 6 * * *
          • etc...
      • Timezone: As per your preference
      • Target: Pub/Sub
      • Topic: conversion_upload_delegator
      • Payload samples in the section below.
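A sketch of one such job via the CLI, assuming one job per advertiser staggered by 5 minutes; the job name and advertiser ID are placeholders, and the message body follows the SA360 payload sample in the next section:

```shell
# Sketch: daily 6 AM job for one advertiser (stagger later jobs: "5 6 * * *", etc.)
gcloud scheduler jobs create pubsub advertiser-67890-profit-upload \
  --schedule="0 6 * * *" \
  --time-zone="America/New_York" \
  --topic=conversion_upload_delegator \
  --message-body='{"table_name":"conversion_final_67890","topic":"SA360_conversion_upload"}'
```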

Cloud Scheduler Pub/Sub Payload Examples

CM360 sample payload:

```
{ //configurable in install.sh
  "dataset_name": <DS_BUSINESS_DATA>,
  "table_name": <CM360_TABLE>,
  "topic": <CM360_PUBSUB_TOPIC_NAME>,
  "cm360_config": {
    "profile_id": <CM360_PROFILE_ID>,
    "floodlight_activity_id": <CM360_FL_ACTIVITY_ID>,
    "floodlight_configuration_id": <CM360_FL_CONFIG_ID>
  }
}
```

SA360 sample payload:

```
{
  "table_name": "conversion_final_<SA360_advertiser_id>",
  "topic": "SA360_conversion_upload"  //hardcoded
}
```

Quick start guide

The notebook uses synthesized data and can be run in less than 30 minutes to help you comprehend the core concept and familiarize yourself with the code.

We recommend that you follow three broad phases to productionalize the solution:

  • Phase 1 - use the notebook to validate account access, etc.,
  • Phase 2 - use the test module to further test with the full stack of the services, and finally,
  • Phase 3 - operationalize the solution in your environment.

Demo solution with synthesized data

We provide synthesized test data in the test_solution folder to test the solution. Please use install.sh with the proper parameters to install the demo module.

Contributors

anaesqueda, avaz1301, dani-kay, dpanigra, pdex, pgilmon

