RisingWave

RisingWave is a distributed stream database that offers a standard SQL interface and is compatible with the PostgreSQL ecosystem. This compatibility allows for seamless integration without the need to alter existing code. RisingWave treats streams as tables, enabling users to execute complex queries on both streaming and historical data with ease. With RisingWave, users can concentrate on query analysis logic without the necessity to learn Java or the specific underlying APIs of various systems.

This article will demonstrate how to import data from AutoMQ into the RisingWave database using RisingWave Cloud.

Prepare AutoMQ and Test Data

Refer to Deploy Locally▸ to deploy AutoMQ, ensuring network connectivity between AutoMQ and RisingWave.

Quickly create a topic named example_topic in AutoMQ and write a test JSON data into it by following the steps below.

Create Topic

Use the Apache Kafka command-line tool to create a topic, ensuring you have access to a Kafka environment and that the Kafka service is operational. Here is an example of the command to create a topic:


./kafka-topics.sh --create --topic example_topic --bootstrap-server 10.0.96.4:9092 --partitions 1 --replication-factor 1

When executing the command, replace the topic name and bootstrap-server with the actual values for your environment.

After creating the topic, use the following command to verify that the topic has been successfully created.


./kafka-topics.sh --describe --topic example_topic --bootstrap-server 10.0.96.4:9092

Generate Test Data

Generate a JSON-formatted test record; its fields should correspond to the schema you will define for the source in RisingWave later.


{
  "id": 1,
  "name": "Test User",
  "timestamp": "2023-11-10T12:00:00",
  "status": "active"
}
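Before writing the record, you can quickly check that it is valid JSON and that its field types line up with the schema you will give the RisingWave source. A minimal sketch using only Python's standard library (the field names and types are assumptions drawn from the sample record above):

```python
import json

# The test record exactly as it will be written to example_topic.
raw = '{"id": 1, "name": "Test User", "timestamp": "2023-11-10T12:00:00", "status": "active"}'

record = json.loads(raw)  # raises ValueError if the JSON is malformed

# Check the field types against the schema planned for the RisingWave source
# (roughly: id INT, name VARCHAR, timestamp TIMESTAMP, status VARCHAR).
assert isinstance(record["id"], int)
assert isinstance(record["name"], str)
assert isinstance(record["timestamp"], str)  # ISO-8601 string in the payload
assert isinstance(record["status"], str)
print("record OK:", record["id"], record["status"])
```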

Write Test Data

Use Apache Kafka®'s command line tools or programming methods to write the test data into a topic named example_topic. Here is an example using the command line tool:


echo '{"id": 1, "name": "Test User", "timestamp": "2023-11-10T12:00:00", "status": "active"}' | ./kafka-console-producer.sh --bootstrap-server 10.0.96.4:9092 --topic example_topic

To view the data that was just written to the topic, use the following command:


./kafka-console-consumer.sh --bootstrap-server 10.0.96.4:9092 --topic example_topic --from-beginning

When executing these commands, replace the topic name and bootstrap-server with the actual values for your environment.
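As noted above, the record can also be written programmatically instead of with the console tools. A hedged sketch in Python: the serialization below uses only the standard library, while the commented portion shows how the bytes could be sent with the third-party kafka-python client (an assumption; any Kafka-protocol client works, since AutoMQ is compatible with Apache Kafka®):

```python
import json

# Same test record as in the console-producer example above.
record = {
    "id": 1,
    "name": "Test User",
    "timestamp": "2023-11-10T12:00:00",
    "status": "active",
}

# Kafka carries opaque bytes; RisingWave decodes them as JSON on read.
payload = json.dumps(record).encode("utf-8")

# Sending with kafka-python (pip install kafka-python), against the same
# broker address used in the shell examples:
#
#   from kafka import KafkaProducer
#   producer = KafkaProducer(bootstrap_servers="10.0.96.4:9092")
#   producer.send("example_topic", payload)
#   producer.flush()
print(payload)
```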

Creating an AutoMQ Source on RisingWave Cloud

  1. Navigate to RisingWave Cloud Clusters to create a cluster.

  2. Go to RisingWave Cloud Source to create a source.

  3. Specify the cluster and database, and log in to the database.

  4. Since AutoMQ is 100% compatible with Apache Kafka®, simply click Create source and select Kafka.

  5. Follow RisingWave Cloud's guide to configure the connector and set the source and schema information.

  6. Review the generated SQL statement and click Confirm to complete the creation of the source.

AutoMQ defaults to port 9092 and does not have SSL enabled. To enable SSL, please consult the Apache Kafka Documentation.

In this example, you can retrieve all the data within a topic from the start by setting the startup mode to 'earliest' and opting for the JSON format.
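For reference, the statement RisingWave Cloud generates for such a Kafka source looks roughly like the sketch below. The source name, column types, topic, and broker address here are assumptions based on the test record and the address used earlier; your generated SQL may differ:

```sql
CREATE SOURCE IF NOT EXISTS your_source_name (
    id INT,
    name VARCHAR,
    "timestamp" TIMESTAMP,
    status VARCHAR
)
WITH (
    connector = 'kafka',
    topic = 'example_topic',
    properties.bootstrap.server = '10.0.96.4:9092',
    scan.startup.mode = 'earliest'
) FORMAT PLAIN ENCODE JSON;
```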

Query Data

  1. Navigate to the RisingWave Cloud Console and sign in to your cluster.

  2. Execute the following SQL statement to query the imported data, replacing your_source_name with the name you assigned when creating the source.


SELECT * FROM {your_source_name} LIMIT 1;