
snowflake-connector's Introduction

Datastax Snowflake Sink Connector for Apache Pulsar

This connector pushes messages from an Apache Pulsar topic into a Snowflake database.

This connector is open-source software, licensed under Apache 2.0, and built on top of Snowflake's Connector for Kafka (see its documentation).

Please refer to this blog post for a quick walk-through.

Installation

Please refer to Apache Pulsar IO documentation.

If you build the project from source, use pulsar-snowflake-connector/target/pulsar-snowflake-connector-0.1.0-SNAPSHOT.nar.

Configuration

| Parameter Name | Description | Default value |
| --- | --- | --- |
| batchSize | Maximum size, in bytes, of the messages the sink will batch together before flushing. | 16384 |
| lingerTimeMs | Maximum time, in milliseconds, the sink will wait while batching messages before flushing. | 2147483647 |
| topic | The topic name passed to the Kafka sink. | n/a |
| kafkaConnectorSinkClass | The kafka-connector sink class to use. | com.snowflake.kafka.connector.SnowflakeSinkConnector |
| offsetStorageTopic | The Pulsar topic used to store offsets. | snowflake-sink-offsets |
| unwrapKeyValueIfAvailable | For Record<KeyValue<>> data, use the key from the KeyValue<> instead of the one from the Record. | true |
| kafkaConnectorConfigProperties | Config properties passed through to the Kafka connector. | n/a |

The full list of kafkaConnectorConfigProperties can be found in the documentation for Snowflake's Connector for Kafka.
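The two batching parameters interact as follows: a flush is triggered when either the accumulated payload size reaches batchSize bytes or lingerTimeMs has elapsed since the first buffered message. A rough sketch of that logic in Python (an illustration of the semantics, not the connector's actual code):

```python
import time


class BatchBuffer:
    """Illustrative batcher: flush on a byte threshold or a time threshold."""

    def __init__(self, batch_size=16384, linger_time_ms=2147483647):
        self.batch_size = batch_size
        self.linger_time_ms = linger_time_ms
        self.messages = []
        self.bytes_buffered = 0
        self.first_msg_at = None

    def add(self, payload: bytes):
        # Start the linger clock when the first message of a batch arrives.
        if self.first_msg_at is None:
            self.first_msg_at = time.monotonic()
        self.messages.append(payload)
        self.bytes_buffered += len(payload)

    def should_flush(self) -> bool:
        if self.first_msg_at is None:
            return False
        elapsed_ms = (time.monotonic() - self.first_msg_at) * 1000
        # Flush if either threshold is hit.
        return (self.bytes_buffered >= self.batch_size
                or elapsed_ms >= self.linger_time_ms)

    def flush(self):
        batch, self.messages = self.messages, []
        self.bytes_buffered = 0
        self.first_msg_at = None
        return batch
```

With the defaults, the size threshold dominates; setting a small lingerTimeMs (as in the example below) makes the sink flush on a timer even when traffic is light.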

Known Issues

The snowflake.topic2table.map parameter is not supported.

Snowflake's connector expects the topic:table[,topic:table] format and does not handle Pulsar's topic URLs (persistent://tenant/namespace/topic-name).
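One way to see the mismatch: the short topic name would have to be extracted from the Pulsar topic URL before a topic:table map could be built. A minimal sketch of that transformation (the function names are illustrative, not part of the connector):

```python
def topic_short_name(topic: str) -> str:
    """Strip the persistent://tenant/namespace/ prefix from a Pulsar topic URL,
    leaving the bare topic name that Snowflake's topic:table format expects."""
    return topic.rsplit("/", 1)[-1]


def to_topic2table_map(mapping: dict) -> str:
    """Render {topic_url: table} pairs in the topic:table[,topic:table] format."""
    return ",".join(f"{topic_short_name(t)}:{table}"
                    for t, table in mapping.items())


print(to_topic2table_map(
    {"persistent://public/default/snowflake-demo": "DEMO_TABLE"}))
# prints "snowflake-demo:DEMO_TABLE"
```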

Example

```yaml
processingGuarantees: "EFFECTIVELY_ONCE"
configs:
  topic: "snowflake-demo"
  offsetStorageTopic: "snowflake-sink-offsets-demo"
  batchSize: "100"
  lingerTimeMs: "600000"
  kafkaConnectorConfigProperties:
    name: "snowflakedemo"
    connector.class: "com.snowflake.kafka.connector.SnowflakeSinkConnector"
    tasks.max: "1"
    topics: "snowflake-demo"
    buffer.count.records: "100"
    buffer.flush.time: "600"
    buffer.size.bytes: "102400"
    snowflake.url.name: "tenant.snowflakecomputing.com:443"
    snowflake.user.name: "kafka_connector_user"
    snowflake.private.key: "very_secret_key"
    snowflake.database.name: "kafka_db"
    snowflake.schema.name: "kafka_schema"
    key.converter: "org.apache.kafka.connect.storage.StringConverter"
    value.converter: "com.snowflake.kafka.connector.records.SnowflakeJsonConverter"
```

Building from source

If you want to develop and test this library, you need to build it from source:

```shell
mvn clean install -DskipTests
```

The Apache Pulsar distribution used is controlled by the properties pulsar.version and pulsar.distribution-name.

Release

```shell
mvn release:prepare -Prelease -Dresume=false
```

The GitHub release is handled by a GitHub Action whenever a tag is pushed.

snowflake-connector's People

Contributors

dlg99, nicoloboschi, nikhil-ctds, nikhilerigila09, zzzming


snowflake-connector's Issues

README.md update proposals for Pulsar newbies

Hi,
I was testing Pulsar with your snowflake-connector. This was my first Pulsar exercise, so I am summing up some issues I ran into that might be worth mentioning in the README.

  • Pulsar version dependency: the connector does not work with the latest stable Pulsar version (2.8.1). You need to build Pulsar from the master branch.
  • For my first tests, I used the Pulsar CLI client to create test messages with JSON content. The message content in the Snowflake table was Base64-encoded, and I did not understand why. Eventually I found out that the message schema matters (and the value.converter class does not): the Pulsar CLI client creates messages with the BINARY schema. Once I used the Java client and set the schema on the Producer to JSON, everything started working. I am still not sure here, but value.converter (and maybe key.converter) in kafkaConnectorConfigProperties probably has no effect, since Pulsar messages are converted to Kafka messages in the org.apache.pulsar.io.kafka.connect.KafkaConnectSink.toSinkRecord() method. This is very confusing; if some required kafkaConnectorConfigProperties are effectively ignored, that should be mentioned in the docs.
  • The configs.topic configuration property seems to be ignored as well. I created this bug: apache/pulsar#12880. Fixing it should solve the topic-to-table name mapping issue.

Anyway, after checking some source code I was able to get the connector running. So thanks for your project!
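The Base64 behavior described above can be reproduced outside Pulsar: when a payload is treated as schema-less raw bytes rather than structured JSON, a JSON converter has little choice but to encode it as a Base64 string. A minimal illustration in plain Python (no Pulsar involved):

```python
import base64
import json

# A JSON document, as it would be sent from a producer.
payload = json.dumps({"id": 1, "name": "demo"}).encode("utf-8")

# Treated as BINARY (schema-less bytes), a JSON converter must
# Base64-encode the payload to embed it in its output:
as_binary = base64.b64encode(payload).decode("ascii")

# Treated as JSON (schema-aware), the structure survives intact:
as_json = json.loads(payload)

print(as_binary)   # opaque Base64 string, like the content seen in the table
print(as_json)     # {'id': 1, 'name': 'demo'}
```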

Error on trying to configure the connector

After much trial and error, I reached a dead end with one consistent error: com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException: Unrecognized field "secureConnectBundle" (class org.apache.pulsar.io.kafka.connect.PulsarKafkaConnectSinkConfig), not marked as ignorable. That field is not being set anywhere, neither in the UI nor when using pulsar-admin.
