
RisingWave

RisingWave is a distributed streaming database that provides a standard SQL interface compatible with the PostgreSQL ecosystem, so it can be integrated with existing tools without code changes. RisingWave treats streams as tables, allowing users to write complex queries over both streaming and historical data in an elegant way. With RisingWave, users can focus on their query and analysis logic without having to learn Java or the low-level API of a specific system.

This article describes how to import data from AutoMQ into the RisingWave database through RisingWave Cloud.

Prepare AutoMQ and Test Data

Refer to Deploy Locally▸ to deploy AutoMQ, ensuring network connectivity between AutoMQ and RisingWave.

Quickly create a Topic named example_topic in AutoMQ and write a test JSON message into it by following these steps.

Create Topic

Use the Apache Kafka command line tool to create the Topic. Ensure you have access to a Kafka environment and the Kafka service is running. Below is an example command to create the Topic:


./kafka-topics.sh --create --topic example_topic --bootstrap-server 10.0.96.4:9092 --partitions 1 --replication-factor 1

When executing the command, replace the topic name and the bootstrap-server address with those of the Kafka environment you are actually using.

After creating the Topic, you can use the following command to verify whether the Topic has been successfully created.


./kafka-topics.sh --describe --topic example_topic --bootstrap-server 10.0.96.4:9092

Generate Test Data

Generate test data in JSON format, matching the schema that will later be defined for the source.


{
  "id": 1,
  "name": "Test User",
  "timestamp": "2023-11-10T12:00:00",
  "status": "active"
}

Write Test Data

Use Kafka's command line tool or programming methods to write the test data into the Topic named example_topic. Below is an example using the command line tool:


echo '{"id": 1, "name": "Test User", "timestamp": "2023-11-10T12:00:00", "status": "active"}' | ./kafka-console-producer.sh --bootstrap-server 10.0.96.4:9092 --topic example_topic

Use the following command to view the data just written to the topic:


./kafka-console-consumer.sh --bootstrap-server 10.0.96.4:9092 --topic example_topic --from-beginning

When executing the command, replace the topic name and the bootstrap-server address with those of your actual Kafka environment.

Create an AutoMQ Source on RisingWave Cloud

  1. Go to RisingWave Cloud Clusters to create a cluster.

  2. Navigate to RisingWave Cloud Source to create a source.

  3. Specify the cluster and database, and log in to the database.

  4. AutoMQ is 100% compatible with Apache Kafka®, so simply click Create source and select Kafka.

  5. Configure the connector according to the RisingWave Cloud guided interface, setting the source information and schema details.

  6. Confirm the generated SQL statement and click Confirm to complete the creation of the source.

AutoMQ's default port is 9092, and SSL is not enabled by default. To enable SSL, please refer to the Apache Kafka Documentation.

In this example, set the startup mode to earliest and the data format to JSON so that all data in the topic is read from the beginning.
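
For reference, the statement generated by the guided interface should look roughly like the sketch below. The source name automq_example_source is a placeholder, and the column definitions are assumptions derived from the test message above; the exact syntax may also vary slightly between RisingWave versions.


-- Sketch of a Kafka source definition for the example_topic Topic
CREATE SOURCE IF NOT EXISTS automq_example_source (
  id INT,
  name VARCHAR,
  "timestamp" TIMESTAMP,
  status VARCHAR
) WITH (
  connector = 'kafka',
  topic = 'example_topic',
  properties.bootstrap.server = '10.0.96.4:9092',
  scan.startup.mode = 'earliest'
) FORMAT PLAIN ENCODE JSON;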

Query Data

  1. Navigate to the RisingWave Cloud Console and log into your cluster.

  2. Run the following SQL statement to access the imported data, replacing {your_source_name} with the name you specified when creating the source.


SELECT * FROM {your_source_name} LIMIT 1;
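
Once this check returns the test record, the JSON fields can be queried as ordinary columns. The example below assumes the source was created with the hypothetical name and columns shown in the sketch above; adjust it to your own source definition.


-- Project selected columns and filter on a field from the JSON payload
SELECT id, name, status
FROM automq_example_source
WHERE status = 'active'
LIMIT 10;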