DataWorks: Create a data push node

Last Updated: Mar 24, 2025

DataWorks DataStudio allows you to create a data push node that obtains the query results generated by other nodes and pushes the results to DingTalk groups, Lark groups, WeCom groups, Microsoft Teams, or Email. This way, members of the groups or teams can receive the latest data at the earliest opportunity.

Overview

A data push node receives the output parameters of its ancestor nodes and uses them as its own input parameters to push data to specific destinations. For example, the data push node can use placeholders in the push content to reference the output parameters of its ancestor nodes. Data push nodes can obtain the output parameters of ancestor SQL query nodes and assignment nodes.

  • After an SQL query node queries data, the outputs parameter is generated for the node to pass the query results to its descendant nodes. For information about how to configure input and output parameters, see Configure input and output parameters.

  • If you want to use Markdown to display the content that you want to push to specific destinations, you can add ${Parameter name} placeholders to the content to reference the output parameters of the ancestor node, as shown in the sample after this list.

  • If you want to use tables to display the content that you want to push to specific destinations, you can select the fields that are queried by the ancestor SQL query node as input parameters to obtain data.
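For example, if the ancestor SQL query node outputs a field named user_cnt, a Markdown body such as the following references that field. The field name is hypothetical and serves only as an illustration:

```markdown
Daily active users: ${user_cnt}
```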

Prerequisites

  • DataWorks is activated. For more information, see Activate DataWorks.

  • A DataWorks workspace is created. For more information, see Create a workspace.

  • A workflow is created in the DataWorks workspace.

  • A serverless resource group is created. Only serverless resource groups are supported. For information about how to create and use a serverless resource group, see Create and use a serverless resource group.

Limits

  • Limits on the data size and push frequency:

    • If you want to push data to DingTalk, the data size cannot exceed 20 KB.

    • If you want to push data to Lark, the data size cannot exceed 20 KB, and the size of an image must be less than 10 MB.

    • If you want to push data to WeCom, each chatbot can send a maximum of 20 messages every minute.

    • If you want to push data to Microsoft Teams, the data size cannot exceed 28 KB.

    • If you want to push data to Email, only one email body can be added to each data push task. For more information about restrictions, see the Simple Mail Transfer Protocol (SMTP) limits of the email service that you use.

  • The data push feature is available only in DataWorks workspaces in the following regions: China (Hangzhou), China (Shanghai), China (Beijing), China (Shenzhen), China (Chengdu), China (Hong Kong), Singapore, Malaysia (Kuala Lumpur), US (Silicon Valley), US (Virginia), and Germany (Frankfurt).

Procedure

Step 1: Create an SQL query node or an assignment node

A data push node receives the output parameters of its ancestor nodes and uses them as its own input parameters to push data to specific destinations. Before you create a data push node, you must create an SQL query node or an assignment node as the ancestor node of the data push node.

Note
  • If you want to query and push MaxCompute data, you must use an assignment node to query MaxCompute data and add the outputs parameter in the Input and Output Parameters section of the Properties tab to pass the query results to a data push node. For more information, see MaxCompute data push.

  • If you want to query and push the data of other data sources, you can use an SQL query node to query data and add the outputs parameter in the Input and Output Parameters section of the Properties tab to pass the query results to a data push node. For more information, see Best practice for configuring data push nodes in a workflow.

Create an SQL query node

  1. Go to the DataStudio page.

    Log on to the DataWorks console. In the top navigation bar, select the desired region. In the left-side navigation pane, choose Data Development and O&M > Data Development. On the page that appears, select the desired workspace from the drop-down list and click Go to Data Development.

  2. Double-click the created workflow. On the configuration tab of the workflow, click the node creation icon. Then, click the desired node type in the related section based on your business requirements. In the Create Node dialog box, configure parameters and click Confirm.

  3. Double-click the created SQL query node. On the configuration tab of the node, write code for the node. A sample query is provided after this procedure.

    Note

    You cannot use data push nodes to directly query data from an ODPS SQL node. You must create an assignment node and write SQL statements for the node. For more information, see the Configure data push flows in the workflow section of the "Best practice for configuring data push nodes in a workflow" topic.

  4. In the right-side navigation pane of the configuration tab of the node, click the Properties tab. On the Properties tab, configure parameters based on your business requirements.

    For more information, see Configure basic properties, Configure time properties, Configure the resource property, Configure same-cycle scheduling dependencies, and Configure input and output parameters.

  5. On the Properties tab, click the drop-down arrow to the right of Input and Output Parameters. Click Add assignment parameter to the right of Output Parameters to add the outputs parameter for the SQL query node.

  6. In the top toolbar, click the Save icon to save the configurations.
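The following sample shows what the code of an SQL query node might look like. This is a minimal sketch: the table name orders, the field names, and the order_amount column are hypothetical, the sample assumes a non-MaxCompute data source such as MySQL, and it assumes that a scheduling parameter named bizdate is configured. The selected fields are the fields that the outputs parameter passes to the descendant data push node:

```sql
-- Query order metrics for the data timestamp of the scheduling cycle.
-- The selected fields (biz_date, order_cnt, gmv) are passed to the
-- descendant data push node through the outputs parameter.
SELECT
  '${bizdate}'      AS biz_date,   -- scheduling parameter that specifies the data timestamp
  COUNT(*)          AS order_cnt,  -- total number of orders
  SUM(order_amount) AS gmv         -- gross merchandise value
FROM orders
WHERE dt = '${bizdate}';
```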

Create an assignment node

  1. Go to the DataStudio page.

    Log on to the DataWorks console. In the top navigation bar, select the desired region. In the left-side navigation pane, choose Data Development and O&M > Data Development. On the page that appears, select the desired workspace from the drop-down list and click Go to Data Development.

  2. Double-click the created workflow. On the configuration tab of the workflow, click the node creation icon. Then, click Assignment Node in the General section. In the Create Node dialog box, configure parameters and click Confirm.

  3. On the configuration tab of the created workflow, double-click the name of the created assignment node. On the configuration tab of the node, configure the node based on your business requirements.

    You can select ODPS SQL, SHELL, or Python from the Language drop-down list to write code for the node. For more information, see Configure an assignment node. A sample is provided after this procedure.

  4. In the top toolbar, click the Save icon to save the configurations.
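The following sample is a minimal sketch of an assignment node that uses the ODPS SQL language to query MaxCompute data. The project name my_project, the table name orders, and the field names are hypothetical, and the sample assumes that a scheduling parameter named bizdate is configured. For an assignment node, the result of the last statement is used as the value of the outputs parameter, which the descendant data push node can reference:

```sql
-- Assignment node code (Language: ODPS SQL).
-- The result set of the last SELECT statement becomes the value of the
-- outputs parameter of this assignment node.
SELECT
  '${bizdate}'      AS biz_date,   -- scheduling parameter that specifies the data timestamp
  COUNT(*)          AS order_cnt,  -- total number of orders
  SUM(order_amount) AS gmv         -- gross merchandise value
FROM my_project.orders
WHERE ds = '${bizdate}';
```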

Step 2: Create a data push node

  1. Double-click the created workflow. On the configuration tab of the workflow, click the node creation icon. Then, click Data Push in the General section. In the Create Node dialog box, configure parameters and click Confirm. The following list describes the parameters.

    • Node Type: The type of the node. Select Data Push from the drop-down list.

    • Path: The path that stores the node. The path must be the same as that of the node created in Step 1.

    • Name: The name of the data push node. You can specify the name based on your business requirements.

  2. Double-click the created data push node. The configuration tab of the node appears.

  3. Add the SQL query node created in Step 1 as the ancestor node of the data push node. In the right-side navigation pane of the configuration tab of the data push node, click Properties. In the Dependencies section, select Node Name from the drop-down list below Parent Nodes, enter the name of the SQL query node in the field, and then click Create.

  4. In the Resource Group section of the Properties tab, configure the Resource Group parameter. You must select a serverless resource group that was created after June 28, 2024, which is the release date of the data push feature. If the resource group was created before June 28, 2024, submit a ticket to upgrade the resource group.

  5. Add the outputs parameter of the SQL query node created in Step 1 as the input parameter of the data push node. On the Properties tab, click the drop-down arrow to the right of Input and Output Parameters. Click Create to the right of Input Parameters. Then, enter a parameter name in the Parameter Name column and select the outputs parameter from the drop-down list in the Value Source column. Close the Properties tab.

  6. Configure the Destination, Title, and Body parameters for the data push node.

    1. Destination: Select a destination from the Destination drop-down list. If no destination is available, click Create Destination to create a destination. The following list describes the parameters for creating a destination.

      • Type: Select a push channel. DingTalk, Lark, WeCom, Microsoft Teams, and Email are supported.

      • Destination Name: Enter a name based on your business requirements.

      • WebHook: Enter the webhook URL of the DingTalk, Lark, or WeCom chatbot, or the webhook URL of Microsoft Teams. You must obtain the webhook URL from the related platform. For Email, obtain the SMTP configuration from your email service.

      Note

      Create and manage a destination in DataService Studio:

      1. Log on to the DataWorks console. In the top navigation bar, select the desired region. In the left-side navigation pane, choose Data Analysis and Service > DataService Studio. On the page that appears, select the desired workspace from the drop-down list and click Go to DataService Studio.

      2. In the lower-left corner of the Service Development tab of the DataService Studio page, click the icon. On the page that appears, click the Destination Management tab. On the tab, you can click Create Destination to create a destination and manage existing destinations. For more information, see the Create a webhook destination section in the "Data push" topic.

    2. Title: Enter a title for the data push node in the related field based on your business requirements.

    3. Body: Click Add and select Markdown or Table to configure the content that you want to push. For more information, see the Configure the content to push section in the "Data push" topic. A sample Markdown body is provided after this procedure.

      Note
      • Take note of the following items when you use an SQL query node as the ancestor node of the data push node:

        • If you select Markdown to configure the content to push in the Body section, you can directly use the field names that are queried by the SQL query node as placeholders to obtain the output parameters of the SQL query node. The placeholders must be in the ${Field name} format.

        • If you select Table to configure the content that you want to push in the Body section, you can add the fields that are queried by the SQL query node as input parameters to obtain data.

      • If you use an assignment node as the ancestor node of the data push node, you must use the names of the input parameters of the data push node as placeholders to obtain the output parameters of the assignment node. The placeholders must be in the ${Input parameter name} format.

  7. In the top toolbar, click the Save icon to save the configurations.
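The following Markdown body is a minimal sketch that assumes the ancestor node provides values named biz_date, order_cnt, and gmv, as in the earlier samples. If the ancestor is an SQL query node, the placeholders reference the queried field names; if the ancestor is an assignment node, the placeholders reference the names of the input parameters that you configured on the data push node:

```markdown
## Daily order report for ${biz_date}

- Total orders: ${order_cnt}
- Gross merchandise value: ${gmv}
```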

Step 3: Test data push flows in the workflow, commit the workflow, and then deploy nodes in the workflow

After you configure all data push flows in the workflow, double-click the workflow. On the configuration tab of the workflow, test all data push flows. If all data push flows run as expected, commit the workflow and deploy nodes in the workflow.

  1. On the configuration tab of the workflow, click the Run icon to run the workflow.

  2. After the success icon appears next to all nodes in the workflow, click the Commit icon to commit the workflow.

  3. In the Commit dialog box, select the nodes that you want to commit, enter a description, and then select Ignore I/O Inconsistency Alerts.

  4. Click Confirm.

  5. Deploy nodes. For more information, see Deploy nodes.

Best practice

DataWorks DataStudio allows you to create data push nodes in a workflow to push data to specific destinations by using the following push methods: simple data push, combined data push, script data push, conditional data push, and MaxCompute data push. For more information, see Best practice for configuring data push nodes in a workflow.

What to do next

After all nodes in the workflow are deployed, you can perform various O&M operations on the nodes in Operation Center. For more information, see Perform basic O&M operations on auto triggered nodes.