
Timeplus

Timeplus is a data analytics platform that specializes in stream processing. Built on the open-source streaming database Proton, it lets teams work with both streaming and historical data and serves organizations across many sectors. The platform is designed to help data engineers and platform engineers extract the most value from streaming data using SQL.

This chapter describes how to import AutoMQ data into Timeplus via the Timeplus console. Because AutoMQ is fully compatible with Apache Kafka®, you can also create a Kafka external stream to analyze AutoMQ data in place, without moving it.
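For reference, the external-stream approach looks like the following. This is a minimal sketch in Timeplus SQL; the stream name automq_stream is a placeholder, and the broker address and topic match the example used later in this chapter.

-- Minimal sketch: read the AutoMQ topic in place as a Kafka external stream.
-- automq_stream is a placeholder name; adjust brokers and topic to your setup.
CREATE EXTERNAL STREAM automq_stream (raw string)
SETTINGS type = 'kafka',
         brokers = '10.0.96.4:9092',
         topic = 'example_topic';

-- Query the live data without copying it into Timeplus storage.
SELECT raw FROM automq_stream;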

Prepare AutoMQ and Test Data

To deploy AutoMQ, follow the instructions in Deploy Locally, and ensure there is network connectivity between AutoMQ and Timeplus.

If you maintain an IP whitelist, add the static IP of the Timeplus service to it:

52.83.159.13 for cloud.timeplus.com.cn

Follow the steps below to create a topic named example_topic in AutoMQ and write a test JSON message to it.

Create Topic

To create a topic using the Apache Kafka® command-line tool, first ensure you have access to a Kafka environment and that the Kafka service is operational. Here is an example command for creating a topic:


./kafka-topics.sh --create --topic example_topic --bootstrap-server 10.0.96.4:9092 --partitions 1 --replication-factor 1

When executing the command, replace the topic name and the bootstrap-server address with those of your actual Kafka deployment.

After creating the topic, you can use the following command to confirm that the topic has been successfully established.


./kafka-topics.sh --describe --topic example_topic --bootstrap-server 10.0.96.4:9092

Generate Test Data

Produce a JSON-formatted test message such as the following:


{
  "id": 1,
  "name": "Test User",
  "timestamp": "2023-11-10T12:00:00",
  "status": "active"
}

Write Test Data

Use Kafka's command-line tools or a client library to write the test data into the topic example_topic. Here is an example using the command-line producer:


echo '{"id": 1, "name": "Test User", "timestamp": "2023-11-10T12:00:00", "status": "active"}' | ./kafka-console-producer.sh --bootstrap-server 10.0.96.4:9092 --topic example_topic

Use the following command to check the data that was just sent to the topic:


./kafka-console-consumer.sh --bootstrap-server 10.0.96.4:9092 --topic example_topic --from-beginning

When running the command, replace the topic name and the bootstrap-server address with those of your actual Kafka deployment.

Create an AutoMQ Data Source

  1. In the left navigation menu of the Timeplus console, select "Data Ingestion," then click the "Add Data" button in the upper right corner.

  2. In the dialog that appears, review the available data sources and other data addition options. Since AutoMQ is fully compatible with Apache Kafka, simply choose Apache Kafka.

  3. Provide the broker URL. For a local AutoMQ deployment without security enabled, disable TLS and authentication.

  4. Specify the AutoMQ topic name and select the data format under "Read as"; JSON, AVRO, and Text formats are supported.

    1. It is advisable to use Text to store the entire JSON document as a string, which simplifies managing schema changes.

    2. For AVRO, turn on the "automatic extraction" feature to store top-level attributes in separate columns. Be sure to provide the schema registry address, API key, and secret.

  5. In the "Preview" step, at least one event should be displayed. By default, a new data source creates a new stream in Timeplus.

  6. Assign a name to the stream and confirm the column details; you may define the event time column. If unset, the system defaults to using the ingestion time. Alternatively, you can opt to use an existing stream.

  7. After the preview, name the source, add an optional description, and finalize the configuration. Click "Finish," and the data will be immediately available in the designated stream; you can then query it as shown below.
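Once the source is running, you can verify the ingested data with a quick query. This sketch assumes the stream was named automq_stream in step 6; substitute the name you chose.

-- automq_stream is the assumed stream name from step 6.
-- A bare SELECT is a streaming query and keeps returning new events:
SELECT * FROM automq_stream;

-- Wrap the stream in table() to scan only the historical data instead:
SELECT * FROM table(automq_stream);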

AutoMQ Source Description

When leveraging an AutoMQ data source, observe the following limitations:

  1. Presently, only messages in AutoMQ Kafka topics formatted in JSON and AVRO are supported.

  2. Top-level JSON properties are converted into stream columns, with nested properties stored as String columns that can be queried using JSON functions.

  3. Numerical or Boolean types in JSON messages are converted into their corresponding types within the stream.

  4. Dates or timestamps are stored as string columns and can be converted back to DateTime using the to_time function.
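To illustrate points 2 and 4, the query below converts the stored timestamp string back to a DateTime with to_time, and shows how a nested object stored as a String column could be queried with a JSON function. The stream name and the nested payload column are assumptions for illustration, based on the test data above.

-- Stream and column names here are hypothetical examples.
SELECT
  id,
  name,
  to_time(timestamp) AS event_time  -- string column converted back to DateTime
FROM automq_stream
WHERE status = 'active';

-- If a nested object were stored as a String column, e.g. payload,
-- JSON functions can pull out individual fields:
SELECT json_extract_string(payload, 'city') FROM automq_stream;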