
E-Commerce Realtime Data Pipeline

This project covers a set of foundational skills required of a data engineer. As a data analyst planning to transition into data engineering, I used it to become acquainted with various concepts, techniques, and technologies, and I decided to share my experience.



Project Overview and Architecture

This project encompasses the design and implementation of a data pipeline tailored for an online store environment. The primary objective is to analyze the data generated by the store's operations in real-time and present meaningful insights through a dashboard interface. To achieve this goal, several key technologies have been employed, each serving a specific purpose in the data processing and visualization workflow.

Figure: e-commerce real-time data pipeline architecture


1- Python:

In this project, Python served as the primary programming language. Python is widely used in data engineering projects due to its versatility, extensive libraries, and ease of use. To simulate ecommerce data and populate the database, I utilized the Faker library. Faker is a Python library that generates fake data, such as names, addresses, and product information, which is useful for testing and development purposes. Additionally, I employed the geopy.geocoders module, specifically the Nominatim class, to generate coordinates for each city where user orders were registered. This allowed for the geocoding of location data, enabling geographic analysis and visualization within the project. You can access the code I used to generate fake data in "code/ecommerce/models".
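As a rough illustration of this approach (a sketch, not the project's actual code; the field names and user-agent string are my own assumptions), generating a fake user and geocoding their city might look like this:

    # Sketch of Faker-based data generation with Nominatim geocoding.
    from faker import Faker
    from geopy.geocoders import Nominatim

    fake = Faker()
    geolocator = Nominatim(user_agent="ecommerce_pipeline_demo")  # Nominatim requires a user agent

    user = {
        "name": fake.name(),
        "email": fake.email(),
        "city": fake.city(),
    }

    # Geocode the city so orders can be analyzed and mapped geographically.
    location = geolocator.geocode(user["city"])
    if location:  # Faker can produce city names that fail to geocode
        user["latitude"] = location.latitude
        user["longitude"] = location.longitude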

2- Apache Airflow:

Apache Airflow played a crucial role in this project by enabling the creation of Directed Acyclic Graphs (DAGs) and facilitating scheduling for data simulation tasks. As an open-source platform, Apache Airflow excels in orchestrating complex workflows and data pipelines. By defining DAGs, which encapsulate the sequence of tasks and their dependencies, I could automate the process of simulating data and executing tasks reliably and at scale. Leveraging Airflow's scheduling capabilities, I ensured efficient management of data processing tasks, ensuring timely execution and handling of dependencies between different components of the pipeline. The DAGs used in this project are available at "pipeline/dags".
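A minimal sketch of such a DAG (assuming Airflow 2.x; the dag_id, schedule, and generate_orders callable are illustrative, not the project's actual definitions):

    # Minimal data-simulation DAG sketch (Airflow 2.x assumed).
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def generate_orders():
        """Placeholder for the Faker-based simulation logic described above."""


    with DAG(
        dag_id="simulate_ecommerce_data",        # illustrative name
        start_date=datetime(2024, 1, 1),
        schedule_interval=timedelta(minutes=1),  # simulate new data every minute
        catchup=False,
    ) as dag:
        PythonOperator(task_id="generate_orders", python_callable=generate_orders)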

3- PostgreSQL:

PostgreSQL served as the Online Transaction Processing (OLTP) database for this project, tasked with storing transactional data. To begin, I meticulously crafted a database schema, which was then implemented within PostgreSQL. Connectivity to PostgreSQL was established using the psycopg2 library, a robust PostgreSQL adapter for Python. An accompanying entity relationship diagram (ERD) is provided below, offering a visual representation of the database schema. Furthermore, the SQL script utilized for creating the database tables can be accessed in "database/postgres_tables.sql".
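For reference, connecting and inserting a simulated row with psycopg2 can be sketched as follows (the database name, table, and columns are assumptions; the host port follows the docker-compose settings listed in Step 5 below):

    # Sketch of writing a simulated row to the OLTP database via psycopg2.
    import psycopg2

    conn = psycopg2.connect(
        host="localhost",
        port=65432,            # host port published by docker-compose (see Step 5)
        dbname="ecommerce",    # assumed database name
        user="postgres",
        password="postgres",
    )
    with conn, conn.cursor() as cur:  # the connection context manager commits on success
        cur.execute(
            "INSERT INTO users (name, email, city) VALUES (%s, %s, %s)",
            ("Jane Doe", "jane@example.com", "Berlin"),
        )
    conn.close()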

Figure: e-commerce entity relationship diagram (ERD)

4- Debezium and Apache Kafka:

Debezium, utilized as a Kafka source connector in this project, plays a pivotal role in enabling Change Data Capture (CDC) from PostgreSQL to Apache Kafka. Acting as a conduit for real-time streaming, Debezium captures database changes in PostgreSQL and efficiently transfers them to Kafka topics. This seamless integration ensures that Kafka consumers have access to the most up-to-date information for further processing and analysis within the data pipeline. By leveraging Kafka's distributed architecture, the combined functionality of Debezium and Kafka facilitates scalable and fault-tolerant streaming of data, empowering the project with robust real-time capabilities. The Debezium connector configurations are available as a JSON file at "docker/debezium/init-scripts/postgres-connector.json".
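A representative registration payload for such a connector is sketched below; the project's actual values live in the JSON file referenced above, so treat the hostname, database name, topic prefix, and table list here as assumptions (note that topic.prefix applies to Debezium 2.x, while older releases use database.server.name instead):

    {
      "name": "ecommerce-postgres-connector",
      "config": {
        "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
        "database.hostname": "postgres-main",
        "database.port": "5432",
        "database.user": "postgres",
        "database.password": "postgres",
        "database.dbname": "ecommerce",
        "topic.prefix": "ecommerce",
        "plugin.name": "pgoutput",
        "table.include.list": "public.users,public.orders"
      }
    }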

Role of Kafka in the Project: In this project, Kafka serves as a central data hub, facilitating the seamless transfer of real-time data between different components of the data pipeline. ClickHouse connects to Kafka and retrieves data from Kafka topics, enabling efficient data processing and analysis. Kafka's distributed architecture ensures reliable and scalable data streaming, making it an essential component for building real-time data pipelines in the project.

5- ClickHouse:

In this project, ClickHouse serves as the analytical database, seamlessly connected to Kafka through the Kafka engine. Leveraging ClickHouse's robust capabilities, I utilized various features to enhance data processing and analysis. Specifically, I employed the MergeTree family table engine, which includes versatile table types such as MergeTree, SummingMergeTree, and AggregatingMergeTree. The MergeTree engine is well-suited for time-series data and offers efficient storage and querying capabilities, making it ideal for handling large volumes of streaming data. Additionally, the SummingMergeTree and AggregatingMergeTree table engines provide aggregation capabilities, allowing for the computation of summary statistics and aggregates on the fly.

Furthermore, ClickHouse's Materialized Views feature played a critical role in the project, effectively acting as triggers for real-time data updates. Materialized views in ClickHouse allow for the precomputation and storage of query results, ensuring fast access to frequently accessed data. By defining materialized views on the Kafka table engine, I could efficiently process incoming data and update downstream tables in real-time, facilitating timely insights and analysis within the data pipeline.

Because these materialized views fire on every insert rather than on a polling schedule, downstream tables stay up to date without any explicit refresh interval. In practice, the Kafka engine table acts as the streaming source: a materialized view attached to it continuously reads new messages from the Kafka topics and writes them into the target ClickHouse tables, providing real-time updates across the pipeline.

The tables used for analytical purposes in the project start with _at_ (at: analytical table). Similarly, the Materialized Views used to populate these tables start with _mv_at_. Furthermore, I implemented Time to Live (TTL) settings for some tables to automatically delete summarized data after a specified time, ensuring the freshness of data stored in the analytical tables.
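Putting these pieces together, the pattern can be sketched in ClickHouse SQL roughly as follows. The table names, columns, and topic are assumptions following the naming convention above; the project's real DDL may differ, for example in how it unwraps Debezium's event envelope:

    -- 1) Kafka engine table: a streaming "window" onto a Kafka topic.
    CREATE TABLE kafka_orders
    (
        order_id   UInt64,
        created_at DateTime
    )
    ENGINE = Kafka
    SETTINGS kafka_broker_list = 'kafka-broker:19092',
             kafka_topic_list  = 'ecommerce.public.orders',
             kafka_group_name  = 'clickhouse_orders',
             kafka_format      = 'JSONEachRow';

    -- 2) Analytical table: pre-aggregated counts, expired after a day via TTL.
    CREATE TABLE at_orders_per_minute
    (
        minute DateTime,
        orders UInt64
    )
    ENGINE = SummingMergeTree
    ORDER BY minute
    TTL minute + INTERVAL 1 DAY;

    -- 3) Materialized view: acts as an insert trigger that continuously
    --    drains the Kafka table into the analytical table.
    CREATE MATERIALIZED VIEW mv_at_orders_per_minute TO at_orders_per_minute AS
    SELECT toStartOfMinute(created_at) AS minute,
           count() AS orders
    FROM kafka_orders
    GROUP BY minute;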

6- Grafana Dashboard:

Grafana is utilized in this project primarily for real-time visualization of data. It connects to ClickHouse via the "grafana-clickhouse-datasource" plugin. While the primary focus of this project is not on visualization, Grafana serves to demonstrate the capabilities of ClickHouse and the features employed for analytical purposes.

Figure: real-time e-commerce dashboard in Grafana

Note: When you first run the project, some dashboard panels will be empty; only the "Registered Users" panel will contain data. This is because order and transaction data are generated gradually, starting from the 10 users initially stored in the database. You can adjust this behavior by modifying the code in "code/ecommerce/models/role_user.py".

7- Docker (docker-compose):

I utilized docker-compose to streamline the deployment and interconnection of various services essential for the project's functionality. The configuration defines a comprehensive suite of 13 services, encompassing "airflow," "kafka-broker," "zookeeper," "clickhouse," "debezium," "grafana," and others. Notably, the "airflow" service orchestrates workflow management, while "kafka-broker" and "zookeeper" handle Kafka messaging infrastructure. The "clickhouse" service functions as the analytical database, with "debezium" enabling Change Data Capture (CDC) from PostgreSQL to Kafka. Grafana is deployed for monitoring and visualization purposes. Each service is meticulously configured with relevant environment variables, port assignments, health checks, and dependencies, ensuring seamless integration within the stack. Furthermore, volumes are employed to persist data for services such as Kafka, ClickHouse, Grafana, and PostgreSQL. A dedicated network ("services") facilitates efficient communication among the deployed services. This docker-compose configuration optimizes the management of the project's tech stack, fostering streamlined development, deployment, and operation processes.
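To give a feel for the structure, here is an illustrative excerpt covering two of the thirteen services. It is not the project's actual file: the image tags are assumptions, while the Debezium environment variables mirror the startup log shown in the issue further down this page:

    services:
      postgres-main:
        image: postgres:15                                # assumed tag
        command: ["postgres", "-c", "wal_level=logical"]  # logical decoding is required for CDC
        environment:
          POSTGRES_USER: postgres
          POSTGRES_PASSWORD: postgres
        ports:
          - "65432:5432"
        volumes:
          - postgres_data:/var/lib/postgresql/data
        networks: [services]

      debezium:
        image: debezium/connect:2.5      # assumed tag
        environment:                     # values as seen in the startup log
          BOOTSTRAP_SERVERS: kafka-broker:19092
          GROUP_ID: "1"
          CONFIG_STORAGE_TOPIC: connect_configs
          OFFSET_STORAGE_TOPIC: connect_offsets
          STATUS_STORAGE_TOPIC: connect_statuses
        depends_on:
          - kafka-broker
        networks: [services]

    volumes:
      postgres_data:

    networks:
      services: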


Getting Started: Running the Project

Prerequisites:

  • Ensure you have Docker and docker-compose installed on your system. If not, please follow the official Docker installation guide for your operating system.

Step 1: Clone the Repository

  1. Open your terminal.

  2. Clone the project repository from GitHub to your local machine using the following command:

    git clone https://github.com/behnamyazdan/ecommerce_realtime_data_pipeline.git
    

Step 2: Navigate to Project Directory

  1. Use the command line to navigate to the root directory of the project:

    cd ecommerce_realtime_data_pipeline
    

Step 3: Start Docker Containers

  1. Execute the following command to start all services defined in the docker-compose file:

    docker-compose up
    

    This command will build and start the Docker containers for various services in your project.

Step 4: Monitor Service Initialization

  1. During the startup process, monitor the console output to ensure all services are initialized successfully.
  2. Pay attention to the initialization tasks performed by the airflow-init and debezium-connector-init containers.
    • The airflow-init container initializes Airflow and performs necessary setup tasks. After completing its initialization, this container will stop automatically.
    • The debezium-connector-init container creates a connector for Debezium, facilitating Change Data Capture (CDC) from PostgreSQL to Kafka. After creating the connector, this container will also stop automatically.
  3. Once all initialization tasks are complete, ensure that the remaining services continue to run without errors or warnings. You can verify the Debezium connector registration as shown below.
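For example, you can confirm that the connector was registered through Kafka Connect's REST API (this assumes Connect's REST port 8083 is published to the host; adjust to the port mapping in your docker-compose file):

    curl http://localhost:8083/connectors
    curl http://localhost:8083/connectors/<connector-name>/status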


Step 5: Access Project Services

  1. Open a web browser and navigate to the following URLs to access various project services:
    • Airflow UI: http://localhost:13005 (username: airflow, password: airflow)
    • Debezium UI: http://localhost:8085 (no authentication required)
    • Kafka UI: http://localhost:8095/ (no authentication required)
    • Grafana Dashboard: http://localhost:13000 (username: admin, password: admin)
    • PostgreSQL: port 65432 (username: postgres, password: postgres)
    • ClickHouse: port 8123 (username: default, no password required)


Issues

dependency failed to start: container debezium exited (127)

Hi @behnamyazdan,

After running the "docker-compose up" command, I encountered the following error. Do you have any ideas on how to fix it? I am using a Mac with an M1 chip and have allocated 12 GB of RAM to the Docker environment.

Error log:

Attaching to airflow-init, airflow-scheduler, airflow-webserver, clickhouse, debezium, debezium-connector-init, debezium-ui, grafana, kafka-broker, kafka-ui, postgres-main, postgres_airflow, zookeeper
debezium                 | Using BOOTSTRAP_SERVERS=kafka-broker:19092
debezium                 | Plugins are loaded from /kafka/connect
debezium                 | Debezium Scripting enabled!
debezium                 | Using the following environment variables:
debezium                 |       GROUP_ID=1
debezium                 |       CONFIG_STORAGE_TOPIC=connect_configs
debezium                 |       OFFSET_STORAGE_TOPIC=connect_offsets
debezium                 |       STATUS_STORAGE_TOPIC=connect_statuses
debezium                 |       BOOTSTRAP_SERVERS=kafka-broker:19092
debezium                 |       REST_HOST_NAME=172.20.0.7
debezium                 |       REST_PORT=8083
debezium                 |       ADVERTISED_HOST_NAME=172.20.0.7
debezium                 |       ADVERTISED_PORT=8083
debezium                 |       KEY_CONVERTER=org.apache.kafka.connect.json.JsonConverter
debezium                 |       VALUE_CONVERTER=org.apache.kafka.connect.json.JsonConverter
debezium                 |       OFFSET_FLUSH_INTERVAL_MS=60000
debezium                 |       OFFSET_FLUSH_TIMEOUT_MS=5000
debezium                 |       SHUTDOWN_TIMEOUT=10000
debezium                 | --- Setting property from CONNECT_REST_ADVERTISED_PORT: rest.advertised.port=8083
debezium                 | --- Setting property from CONNECT_OFFSET_STORAGE_TOPIC: offset.storage.topic=connect_offsets
debezium                 | --- Setting property from CONNECT_KEY_CONVERTER: key.converter=org.apache.kafka.connect.json.JsonConverter
debezium                 | --- Setting property from CONNECT_CONFIG_STORAGE_TOPIC: config.storage.topic=connect_configs
debezium                 | --- Setting property from CONNECT_GROUP_ID: group.id=1
debezium                 | --- Setting property from CONNECT_REST_ADVERTISED_HOST_NAME: rest.advertised.host.name=172.20.0.7
debezium                 | --- Setting property from CONNECT_REST_HOST_NAME: rest.host.name=172.20.0.7
debezium                 | --- Setting property from CONNECT_VALUE_CONVERTER: value.converter=org.apache.kafka.connect.json.JsonConverter
debezium                 | --- Setting property from CONNECT_REST_PORT: rest.port=8083
debezium                 | --- Setting property from CONNECT_STATUS_STORAGE_TOPIC: status.storage.topic=connect_statuses
debezium                 | --- Setting property from CONNECT_OFFSET_FLUSH_TIMEOUT_MS: offset.flush.timeout.ms=5000
debezium                 | --- Setting property from CONNECT_PLUGIN_PATH: plugin.path=/kafka/connect
debezium                 | --- Setting property from CONNECT_OFFSET_FLUSH_INTERVAL_MS: offset.flush.interval.ms=60000
debezium                 | --- Setting property from CONNECT_BOOTSTRAP_SERVERS: bootstrap.servers=kafka-broker:19092
debezium                 | --- Setting property from CONNECT_TASK_SHUTDOWN_GRACEFUL_TIMEOUT_MS: task.shutdown.graceful.timeout.ms=10000
debezium                 | 2024-06-26 10:00:14,356 INFO   ||  Kafka Connect worker initializing ...   [org.apache.kafka.connect.cli.AbstractConnectCli]
debezium                 | 2024-06-26 10:00:14,364 INFO   ||  WorkerInfo values: 
debezium                 | 	jvm.args = -Xms256M, -Xmx2G, -XX:+UseG1GC, -XX:MaxGCPauseMillis=20, -XX:InitiatingHeapOccupancyPercent=35, -XX:+ExplicitGCInvokesConcurrent, -XX:MaxInlineLevel=15, -Djava.awt.headless=true, -Dcom.sun.management.jmxremote, -Dcom.sun.management.jmxremote.authenticate=false, -Dcom.sun.management.jmxremote.ssl=false, -Dkafka.logs.dir=/kafka/logs, -Dlog4j.configuration=file:/kafka/config/log4j.properties
debezium                 | 	jvm.spec = Red Hat, Inc., OpenJDK 64-Bit Server VM, 11.0.20, 11.0.20+8
debezium                 | 	jvm.classpath = /kafka/libs/activation-1.1.1.jar:/kafka/libs/aopalliance-repackaged-2.6.1.jar:/kafka/libs/argparse4j-0.7.0.jar:/kafka/libs/audience-annotations-0.12.0.jar:/kafka/libs/caffeine-2.9.3.jar:/kafka/libs/checker-qual-3.19.0.jar:/kafka/libs/commons-beanutils-1.9.4.jar:/kafka/libs/commons-cli-1.4.jar:/kafka/libs/commons-collections-3.2.2.jar:/kafka/libs/commons-digester-2.1.jar:/kafka/libs/commons-io-2.11.0.jar:/kafka/libs/commons-lang3-3.8.1.jar:/kafka/libs/commons-logging-1.2.jar:/kafka/libs/commons-validator-1.7.jar:/kafka/libs/connect-api-3.6.1.jar:/kafka/libs/connect-basic-auth-extension-3.6.1.jar:/kafka/libs/connect-json-3.6.1.jar:/kafka/libs/connect-mirror-3.6.1.jar:/kafka/libs/connect-mirror-client-3.6.1.jar:/kafka/libs/connect-runtime-3.6.1.jar:/kafka/libs/connect-transforms-3.6.1.jar:/kafka/libs/error_prone_annotations-2.10.0.jar:/kafka/libs/hk2-api-2.6.1.jar:/kafka/libs/hk2-locator-2.6.1.jar:/kafka/libs/hk2-utils-2.6.1.jar:/kafka/libs/jackson-annotations-2.13.5.jar:/kafka/libs/jackson-core-2.13.5.jar:/kafka/libs/jackson-databind-2.13.5.jar:/kafka/libs/jackson-dataformat-csv-2.13.5.jar:/kafka/libs/jackson-datatype-jdk8-2.13.5.jar:/kafka/libs/jackson-jaxrs-base-2.13.5.jar:/kafka/libs/jackson-jaxrs-json-provider-2.13.5.jar:/kafka/libs/jackson-module-jaxb-annotations-2.13.5.jar:/kafka/libs/jackson-module-scala_2.13-2.13.5.jar:/kafka/libs/jakarta.activation-api-1.2.2.jar:/kafka/libs/jakarta.annotation-api-1.3.5.jar:/kafka/libs/jakarta.inject-2.6.1.jar:/kafka/libs/jakarta.validation-api-2.0.2.jar:/kafka/libs/jakarta.ws.rs-api-2.1.6.jar:/kafka/libs/jakarta.xml.bind-api-2.3.3.jar:/kafka/libs/javassist-3.29.2-GA.jar:/kafka/libs/javax.activation-api-1.2.0.jar:/kafka/libs/javax.annotation-api-1.3.2.jar:/kafka/libs/javax.servlet-api-3.1.0.jar:/kafka/libs/javax.ws.rs-api-2.1.1.jar:/kafka/libs/jaxb-api-2.3.1.jar:/kafka/libs/jersey-client-2.39.1.jar:/kafka/libs/jersey-common-2.39.1.jar:/kafka/libs/jersey-container-servlet-2.39.1.jar:/kafka/libs/jersey-container-servlet-core-2.39.1.jar:/kafka/libs/jersey-hk2-2.39.1.jar:/kafka/libs/jersey-server-2.39.1.jar:/kafka/libs/jetty-client-9.4.52.v20230823.jar:/kafka/libs/jetty-continuation-9.4.52.v20230823.jar:/kafka/libs/jetty-http-9.4.52.v20230823.jar:/kafka/libs/jetty-io-9.4.52.v20230823.jar:/kafka/libs/jetty-security-9.4.52.v20230823.jar:/kafka/libs/jetty-server-9.4.52.v20230823.jar:/kafka/libs/jetty-servlet-9.4.52.v20230823.jar:/kafka/libs/jetty-servlets-9.4.52.v20230823.jar:/kafka/libs/jetty-util-9.4.52.v20230823.jar:/kafka/libs/jetty-util-ajax-9.4.52.v20230823.jar:/kafka/libs/jline-3.22.0.jar:/kafka/libs/jolokia-jvm-1.7.2.jar:/kafka/libs/jopt-simple-5.0.4.jar:/kafka/libs/jose4j-0.9.3.jar:/kafka/libs/jsr305-3.0.2.jar:/kafka/libs/kafka-clients-3.6.1.jar:/kafka/libs/kafka-group-coordinator-3.6.1.jar:/kafka/libs/kafka-log4j-appender-3.6.1.jar:/kafka/libs/kafka-metadata-3.6.1.jar:/kafka/libs/kafka-raft-3.6.1.jar:/kafka/libs/kafka-server-common-3.6.1.jar:/kafka/libs/kafka-shell-3.6.1.jar:/kafka/libs/kafka-storage-3.6.1.jar:/kafka/libs/kafka-storage-api-3.6.1.jar:/kafka/libs/kafka-streams-3.6.1.jar:/kafka/libs/kafka-streams-examples-3.6.1.jar:/kafka/libs/kafka-streams-scala_2.13-3.6.1.jar:/kafka/libs/kafka-streams-test-utils-3.6.1.jar:/kafka/libs/kafka-tools-3.6.1.jar:/kafka/libs/kafka-tools-api-3.6.1.jar:/kafka/libs/kafka_2.13-3.6.1.jar:/kafka/libs/lz4-java-1.8.0.jar:/kafka/libs/maven-artifact-3.8.8.jar:/kafka/libs/metrics-core-2.2.0.jar:/kafka/libs/metrics-core-4.1.12.1.jar:/kafka/libs/netty-buffer-4.1.100.Final.jar:
/kafka/libs/netty-codec-4.1.100.Final.jar:/kafka/libs/netty-common-4.1.100.Final.jar:/kafka/libs/netty-handler-4.1.100.Final.jar:/kafka/libs/netty-resolver-4.1.100.Final.jar:/kafka/libs/netty-transport-4.1.100.Final.jar:/kafka/libs/netty-transport-classes-epoll-4.1.100.Final.jar:/kafka/libs/netty-transport-native-epoll-4.1.100.Final.jar:/kafka/libs/netty-transport-native-unix-common-4.1.100.Final.jar:/kafka/libs/osgi-resource-locator-1.0.3.jar:/kafka/libs/paranamer-2.8.jar:/kafka/libs/pcollections-4.0.1.jar:/kafka/libs/plexus-utils-3.3.1.jar:/kafka/libs/reflections-0.10.2.jar:/kafka/libs/reload4j-1.2.25.jar:/kafka/libs/rocksdbjni-7.9.2.jar:/kafka/libs/scala-collection-compat_2.13-2.10.0.jar:/kafka/libs/scala-java8-compat_2.13-1.0.2.jar:/kafka/libs/scala-library-2.13.11.jar:/kafka/libs/scala-logging_2.13-3.9.4.jar:/kafka/libs/scala-reflect-2.13.11.jar:/kafka/libs/slf4j-api-1.7.36.jar:/kafka/libs/slf4j-reload4j-1.7.36.jar:/kafka/libs/snappy-java-1.1.10.5.jar:/kafka/libs/swagger-annotations-2.2.8.jar:/kafka/libs/trogdor-3.6.1.jar:/kafka/libs/zookeeper-3.8.3.jar:/kafka/libs/zookeeper-jute-3.8.3.jar:/kafka/libs/zstd-jni-1.5.5-1.jar
debezium                 | 	os.spec = Linux, amd64, 6.6.31-linuxkit
debezium                 | 	os.vcpus = 8
debezium                 |    [org.apache.kafka.connect.runtime.WorkerInfo]
debezium                 | 2024-06-26 10:00:14,364 INFO   ||  Scanning for plugin classes. This might take a moment ...   [org.apache.kafka.connect.cli.AbstractConnectCli]
debezium                 | 2024-06-26 10:00:14,428 INFO   ||  Loading plugin from: /kafka/connect/debezium-connector-spanner   [org.apache.kafka.connect.runtime.isolation.PluginScanner]
debezium                 | 2024-06-26 10:00:14,573 INFO   ||  Using up-to-date JsonConverter implementation   [io.debezium.converters.CloudEventsConverter]
zookeeper                | [2024-06-26 10:00:14,693] INFO Processing srvr command from /172.20.0.2:46630 (org.apache.zookeeper.server.NIOServerCnxn)
debezium                 | 2024-06-26 10:00:14,815 INFO   ||  Registered loader: PluginClassLoader{pluginLocation=file:/kafka/connect/debezium-connector-spanner/}   [org.apache.kafka.connect.runtime.isolation.PluginScanner]
debezium                 | 2024-06-26 10:00:14,818 INFO   ||  Loading plugin from: /kafka/connect/debezium-connector-oracle   [org.apache.kafka.connect.runtime.isolation.PluginScanner]
debezium                 | 2024-06-26 10:00:14,880 INFO   ||  Using up-to-date JsonConverter implementation   [io.debezium.converters.CloudEventsConverter]
debezium                 | 2024-06-26 10:00:14,933 INFO   ||  Registered loader: PluginClassLoader{pluginLocation=file:/kafka/connect/debezium-connector-oracle/}   [org.apache.kafka.connect.runtime.isolation.PluginScanner]
debezium                 | 2024-06-26 10:00:14,933 INFO   ||  Loading plugin from: /kafka/connect/debezium-connector-vitess   [org.apache.kafka.connect.runtime.isolation.PluginScanner]
debezium                 | 2024-06-26 10:00:14,972 INFO   ||  Using up-to-date JsonConverter implementation   [io.debezium.converters.CloudEventsConverter]
debezium                 | 2024-06-26 10:00:15,010 INFO   ||  Registered loader: PluginClassLoader{pluginLocation=file:/kafka/connect/debezium-connector-vitess/}   [org.apache.kafka.connect.runtime.isolation.PluginScanner]
debezium                 | 2024-06-26 10:00:15,010 INFO   ||  Loading plugin from: /kafka/connect/debezium-connector-sqlserver   [org.apache.kafka.connect.runtime.isolation.PluginScanner]
debezium                 | 2024-06-26 10:00:15,026 INFO   ||  Using up-to-date JsonConverter implementation   [io.debezium.converters.CloudEventsConverter]
debezium                 | 2024-06-26 10:00:15,063 INFO   ||  Registered loader: PluginClassLoader{pluginLocation=file:/kafka/connect/debezium-connector-sqlserver/}   [org.apache.kafka.connect.runtime.isolation.PluginScanner]
debezium                 | 2024-06-26 10:00:15,242 INFO   ||  Loading plugin from: /kafka/connect/debezium-connector-informix   [org.apache.kafka.connect.runtime.isolation.PluginScanner]
debezium                 | 2024-06-26 10:00:15,266 INFO   ||  Using up-to-date JsonConverter implementation   [io.debezium.converters.CloudEventsConverter]
debezium                 | 2024-06-26 10:00:15,304 INFO   ||  Registered loader: PluginClassLoader{pluginLocation=file:/kafka/connect/debezium-connector-informix/}   [org.apache.kafka.connect.runtime.isolation.PluginScanner]
debezium                 | 2024-06-26 10:00:15,305 INFO   ||  Loading plugin from: /kafka/connect/debezium-connector-mongodb   [org.apache.kafka.connect.runtime.isolation.PluginScanner]
debezium                 | 2024-06-26 10:00:15,318 INFO   ||  Using up-to-date JsonConverter implementation   [io.debezium.converters.CloudEventsConverter]
debezium                 | 2024-06-26 10:00:15,392 INFO   ||  Registered loader: PluginClassLoader{pluginLocation=file:/kafka/connect/debezium-connector-mongodb/}   [org.apache.kafka.connect.runtime.isolation.PluginScanner]
debezium                 | 2024-06-26 10:00:15,392 INFO   ||  Loading plugin from: /kafka/connect/debezium-connector-mysql   [org.apache.kafka.connect.runtime.isolation.PluginScanner]
debezium                 | 2024-06-26 10:00:15,410 INFO   ||  Using up-to-date JsonConverter implementation   [io.debezium.converters.CloudEventsConverter]
debezium                 | 2024-06-26 10:00:15,443 INFO   ||  Registered loader: PluginClassLoader{pluginLocation=file:/kafka/connect/debezium-connector-mysql/}   [org.apache.kafka.connect.runtime.isolation.PluginScanner]
debezium                 | 2024-06-26 10:00:15,447 INFO   ||  Loading plugin from: /kafka/connect/debezium-connector-db2   [org.apache.kafka.connect.runtime.isolation.PluginScanner]
debezium                 | 2024-06-26 10:00:15,480 INFO   ||  Using up-to-date JsonConverter implementation   [io.debezium.converters.CloudEventsConverter]
debezium                 | 2024-06-26 10:00:15,521 INFO   ||  Registered loader: PluginClassLoader{pluginLocation=file:/kafka/connect/debezium-connector-db2/}   [org.apache.kafka.connect.runtime.isolation.PluginScanner]
debezium                 | 2024-06-26 10:00:15,522 INFO   ||  Loading plugin from: /kafka/connect/debezium-connector-jdbc   [org.apache.kafka.connect.runtime.isolation.PluginScanner]
debezium                 | 2024-06-26 10:00:15,552 INFO   ||  Using up-to-date JsonConverter implementation   [io.debezium.converters.CloudEventsConverter]
debezium                 | 2024-06-26 10:00:15,589 INFO   ||  Registered loader: PluginClassLoader{pluginLocation=file:/kafka/connect/debezium-connector-jdbc/}   [org.apache.kafka.connect.runtime.isolation.PluginScanner]
debezium                 | 2024-06-26 10:00:15,654 INFO   ||  Loading plugin from: /kafka/connect/debezium-connector-postgres   [org.apache.kafka.connect.runtime.isolation.PluginScanner]
debezium                 | 2024-06-26 10:00:15,674 INFO   ||  Using up-to-date JsonConverter implementation   [io.debezium.converters.CloudEventsConverter]
debezium                 | 2024-06-26 10:00:15,704 INFO   ||  Registered loader: PluginClassLoader{pluginLocation=file:/kafka/connect/debezium-connector-postgres/}   [org.apache.kafka.connect.runtime.isolation.PluginScanner]
debezium                 | 2024-06-26 10:00:15,706 INFO   ||  Loading plugin from: classpath   [org.apache.kafka.connect.runtime.isolation.PluginScanner]
debezium                 | 2024-06-26 10:00:15,716 INFO   ||  Registered loader: jdk.internal.loader.ClassLoaders$AppClassLoader@3d4eac69   [org.apache.kafka.connect.runtime.isolation.PluginScanner]
debezium                 | 2024-06-26 10:00:15,717 INFO   ||  Scanning plugins with ServiceLoaderScanner took 1291 ms   [org.apache.kafka.connect.runtime.isolation.PluginScanner]
debezium                 | 2024-06-26 10:00:15,721 INFO   ||  Loading plugin from: /kafka/connect/debezium-connector-spanner   [org.apache.kafka.connect.runtime.isolation.PluginScanner]
debezium                 | #
debezium                 | # A fatal error has been detected by the Java Runtime Environment:
debezium                 | #
debezium                 | #  SIGSEGV (0xb) at pc=0x00007ffffea68481, pid=1, tid=736
debezium                 | #
debezium                 | # JRE version: OpenJDK Runtime Environment (Red_Hat-11.0.20.0.8-1.fc37) (11.0.20+8) (build 11.0.20+8)
debezium                 | # Java VM: OpenJDK 64-Bit Server VM (Red_Hat-11.0.20.0.8-1.fc37) (11.0.20+8, mixed mode, sharing, tiered, compressed oops, g1 gc, linux-amd64)
debezium                 | # Problematic frame:
debezium                 | # V  [libjvm.so+0x7f5481]  G1ParScanThreadState::copy_to_survivor_space(InCSetState, oopDesc*, markOopDesc*)+0x51
debezium                 | #
debezium                 | # No core dump will be written. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
debezium                 | #
debezium                 | # An error report file with more information is saved as:
debezium                 | # /kafka/hs_err_pid1.log
debezium                 | #
debezium                 | # If you would like to submit a bug report, please visit:
debezium                 | #   https://bugzilla.redhat.com/enter_bug.cgi?product=Fedora&component=java-11-openjdk-portable&version=37
debezium                 | #
debezium                 | 
debezium                 | [error occurred during error reporting (), id 0xb, SIGSEGV (0xb) at pc=0x00007fffff5ec899]
debezium                 | 

debezium exited with code 127
