In this guide we will use Airbotics to set up data streams and collect data.

By the end of this guide, you will know how to:
  • Create a new data stream.
  • Collect data points from the stream.
  • Query the collected data points.
  • Configure the data stream.
  • Understand buffering.

To follow along you will need:

  • An account with Airbotics.
  • A valid API key with the data:read and data:write permissions.
  • A device running Ubuntu 22.04 with ROS 2 and the Airbotics agent installed.
  • Physical or SSH access to your device.

Prepare environment

Ensure you have the following two environment variables set before you get started:

export AIR_API_KEY=<your_api_key>
export AIR_ROBOT_ID=<your_robot_id>

Run the agent and the application nodes

With the Airbotics agent running on the device, start the turtlesim nodes in two separate terminals:

ros2 run turtlesim turtlesim_node
ros2 run turtlesim turtle_teleop_key

Create a data stream

To start collecting data points, we must create a data stream. The data stream defines the configuration for the data collection and is responsible for communicating this to the affected robot.

curl --request POST "<api_url>/$AIR_ROBOT_ID/streams" \
    --header "content-type: application/json" \
    --header "air-api-key: $AIR_API_KEY" \
    --data '{
        "source": "/turtle1/cmd_vel",
        "type": "geometry_msgs/msg/Twist",
        "hz": 0.2
    }'

The agent should receive a message to register the new data stream.
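
One way to confirm the registration is to watch the agent's output for the stream source. The log lines below are hypothetical sample data, not the agent's real log format, but filtering works the same way on a captured log:

```shell
# Filter a captured agent log for the stream-registration message.
# These log lines are made-up sample data for illustration.
grep '/turtle1/cmd_vel' <<'EOF'
[agent] heartbeat ok
[agent] registered stream /turtle1/cmd_vel at 0.2 Hz
[agent] collecting data points
EOF
```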

Collect data points

Our data stream is set to collect data points from the /turtle1/cmd_vel topic at a frequency of 0.2 Hz. This means data points will be collected at a maximum frequency of one every 5 seconds. Try moving the turtle around with the teleop_key node.
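
The relationship between the stream's hz setting and the gap between collected data points is simply interval = 1/hz. A quick check:

```shell
# Minimum seconds between collected data points for a given frequency
hz=0.2
awk -v hz="$hz" 'BEGIN { printf "interval: %.0f seconds\n", 1 / hz }'
# prints "interval: 5 seconds"
```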

Data points will continue to be collected indefinitely while the data stream remains enabled.

Query the data points

Depending on how long you spent moving the turtle around, you should have collected at least some data points. Make another API call to get the 10 most recent data points from the stream.

curl --request GET "<api_url>/$AIR_ROBOT_ID/data?source=%2Fturtle1%2Fcmd_vel&limit=10" \
    --header "content-type: application/json" \
    --header "air-api-key: $AIR_API_KEY"
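
Note that the source query parameter is the topic name percent-encoded. A minimal encoder for the slashes (this assumes the topic names used here contain no other reserved characters):

```shell
# Percent-encode the '/' characters in a topic name for use in a URL
topic="/turtle1/cmd_vel"
printf '%s\n' "$topic" | sed 's|/|%2F|g'
# prints "%2Fturtle1%2Fcmd_vel"
```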

You should notice from the response that there is at least a 5 second gap between each data point's sent_at timestamp. This is because we throttled the data stream to 0.2 Hz. You can read more about throttling here.
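
If you extract the sent_at values as epoch seconds (the timestamps below are made-up sample data), the gaps are easy to eyeball with awk:

```shell
# Print the gap in seconds between consecutive sent_at timestamps;
# with a 0.2 Hz stream each gap should be at least 5 seconds.
printf '%s\n' 1700000000 1700000005 1700000011 |
  awk 'NR > 1 { print $1 - prev } { prev = $1 }'
# prints:
# 5
# 6
```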

Configure the data stream

A data stream can be updated at any time: you may enable or disable it, or change its collection frequency. Let's update the data stream to collect data points at 1 Hz:

curl --request PATCH "<api_url>/$AIR_ROBOT_ID/data?source=%2Fturtle1%2Fcmd_vel" \
    --header "content-type: application/json" \
    --header "air-api-key: $AIR_API_KEY" \
    --data '{
        "hz": 1
    }'

Try moving the turtle around with the teleop_key node again and repeat the query step. You will notice the reduction in the gaps between sent_at timestamps.

You can also configure the data stream with enabled: false to pause the data point collection.

Understand buffering

You have already seen one situation where data points may be dropped: when the frequency of the data stream is lower than the frequency at which the agent receives new messages from ROS.

Buffering is a separate, built-in mechanism that aims to reduce dropped data points. For this step, disable the network on the device where the agent is running and try moving the turtle around again.

After a few seconds reconnect the device to the network and repeat the query step. If you look at the sent_at timestamps you’ll notice that the data points that the agent received while the network was disconnected were not lost. This is thanks to the buffer.

The agent will store a buffer of up to 65555 data points when it loses connection. On reconnection, it will start sending all of the buffered data points. If the buffer overflows, i.e. more than 65555 data points are received before the connection is re-established, the oldest data points are dropped to make room for newer ones in a FIFO manner.
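
The drop behaviour on overflow can be sketched with a toy buffer, using a capacity of 3 standing in for 65555:

```shell
# A toy FIFO buffer: once the buffer is full, the oldest point is
# dropped to make room for each newer one.
capacity=3
buffer=()
for point in p1 p2 p3 p4 p5; do
  buffer+=("$point")
  if [ "${#buffer[@]}" -gt "$capacity" ]; then
    buffer=("${buffer[@]:1}")   # drop the oldest point
  fi
done
echo "${buffer[@]}"
# prints "p3 p4 p5"
```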

Wrapping up

In this deep dive you have created and updated data streams, and collected and queried data points. You've also learnt why, and under what conditions, data points are dropped.

Cleaning up

If you want to delete all the data streams and data points you created in this guide, you can make an API call to delete the data stream, which will in turn delete all of the associated data points.

curl --request DELETE "<api_url>/$AIR_ROBOT_ID/data?source=%2Fturtle1%2Fcmd_vel" \
    --header "content-type: application/json" \
    --header "air-api-key: $AIR_API_KEY"