This application note describes how to integrate input from a USB camera and telemetry sensors with Foxglove. Foxglove Studio is used to visualize and analyze the video stream and telemetry data.
1. Foxglove Studio
Foxglove Studio is a powerful visualization and analysis tool designed for robotics developers and engineers. It provides a comprehensive set of features to help users visualize, analyze, and understand robotic systems and their data in real-time. Some key features of Foxglove Studio include:
Data Visualization: Foxglove Studio allows users to visualize various types of data generated by robotic systems, such as sensor data, robot state information, and control signals. It offers customizable visualizations that help users gain insights into the behavior of their robots.
Real-Time Monitoring: Users can monitor and analyze data streams in real time, enabling them to track the performance of robotic systems, identify issues, and make informed decisions quickly.
Data Playback: Foxglove Studio provides the ability to record and play back data, allowing users to review past events, debug issues, and analyze system behavior offline.
Integration with ROS: Foxglove Studio is designed to work seamlessly with the Robot Operating System (ROS), a popular framework for building robotic applications. It can subscribe to ROS topics, visualize ROS messages, and interact with ROS nodes, making it a valuable tool for ROS developers.
User-Friendly Interface: The tool features an intuitive and user-friendly interface that simplifies setting up visualizations and exploring data.
2. Foxglove Bridge
Foxglove Bridge is a ROS node that exposes a WebSocket server, allowing Foxglove Studio to connect to a live robot, subscribe to its topics, and visualize the data in real time.
3. Prerequisites
To proceed with the tests documented below, you will need the following setup:
USB camera;
NAVQ+ kit;
Linux PC for running Foxglove Studio;
NAVQ+ connected to the Linux PC via WiFi.
4. Connecting Camera
To connect the USB Camera to the NAVQ+ kit, utilize a USB Type-C to Type-C cable and insert the camera into the USB1 port of the NAVQ+. By default, the USB1 port is designated for connecting the camera, while the USB2 port is configured for using the NAVQ+ as a USB gadget.
Alternatively, you can connect the camera to a USB hub that is then plugged into the USB1 port of the NAVQ+.
5. Software Setup
5.1. Setting Up NAVQ+
Release 1.1.13 or higher must be installed on the NAVQ+.
5.2. Installing Foxglove Studio
Download Foxglove Studio from the official Foxglove website and follow the installation instructions provided. Please note that a free account is necessary to utilize the software.
Alternatively, you can access the web version of Foxglove Studio at https://app.foxglove.dev/
6. Bridging ROS Data to Foxglove Studio
To run Foxglove Bridge on NAVQ+, use the following command:
user@imx8mpnavq:~$ ros2 launch foxglove_bridge foxglove_bridge_launch.xml port:=8765
Here is an example output:
For a quick communication test, run the ROS2 demo talker and listener. Start the demo talker on one console using the following command:
user@imx8mpnavq:~$ ros2 run demo_nodes_cpp talker
On another console start the demo listener using the following command:
user@imx8mpnavq:~$ ros2 run demo_nodes_cpp listener
Here is an example output:
Run Foxglove Studio on the host machine, and follow these steps:
Click on Open connection... and enter the NAVQ+ IP address in the Websocket URL field, for example, ws://192.168.68.123:8765. Click Open to establish the connection.
In the Layout menu, choose Create new layout and then select the Log layout.
The ROS2 log window will open, showing messages arriving on the /chatter topic:
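If the connection cannot be established, it can help to verify that the bridge port is reachable from the host before troubleshooting further. Below is a minimal sketch using only the Python standard library; the address shown in the usage comment is the hypothetical example IP from this note.

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Usage (hypothetical NAVQ+ address from the text):
# port_open("192.168.68.123", 8765)
```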
7. Integrating Video Input with Foxglove Studio
Depending on the additional hardware connected to the NAVQ+, the USB camera may not be the only V4L2 device in the system. To access the camera, you need to determine the correct video device node. The following command lists all device nodes registered for the USB camera:
user@imx8mpnavq:~$ v4l2-ctl --list-devices | grep -A3 UVC
UVC Camera (046d:0825) (usb-xhci-hcd.1.auto-1.2):
        /dev/video3
        /dev/video4
        /dev/media0
In the above output, the /dev/video3 device node corresponds to the camera connected to the USB1 slot.
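The device-node lookup can also be scripted. The sketch below parses v4l2-ctl --list-devices output and returns the first /dev/video node listed under a UVC device heading; the output format is assumed to match the example above (headings flush left, device nodes indented).

```python
def find_uvc_video_node(listing):
    """Return the first /dev/video* node under a 'UVC' device heading, or None."""
    in_uvc = False
    for line in listing.splitlines():
        if not line.startswith((" ", "\t")):
            # Flush-left lines are device headings.
            in_uvc = "UVC" in line
        elif in_uvc and line.strip().startswith("/dev/video"):
            return line.strip()
    return None

example = (
    "UVC Camera (046d:0825) (usb-xhci-hcd.1.auto-1.2):\n"
    "\t/dev/video3\n"
    "\t/dev/video4\n"
    "\t/dev/media0\n"
)
print(find_uvc_video_node(example))  # /dev/video3
```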
Run the following command to start the ROS V4L2 node:
user@imx8mpnavq:~$ ros2 run v4l2_camera v4l2_camera_node --ros-args -p video_device:=/dev/video3
Here is an example output:
Verify that the camera is streaming by checking the currently open topics on the NAVQ+:
user@imx8mpnavq:~$ ros2 topic list
1711722453.283299 [7] ros2: eth1: optional interface was not found.
/camera_info
/image_raw
/image_raw/compressed
/image_raw/compressedDepth
/image_raw/theora
/parameter_events
/rosout
Check that the /camera_info topic works:
user@imx8mpnavq:~$ ros2 topic echo /camera_info
Here is an example output:
Run Foxglove Studio on the host machine, and follow these steps:
Click on Open connection... and enter the NAVQ+ IP address in the Websocket URL field, for example,
ws://192.168.68.123:8765
.Click Open to establish the connection.
In the Layout menu, choose Create new layout and then select the Image layout.
Your layout will resemble the following:
8. Integrating Data from USB ToF 3D Camera
Check if the Flexx2 camera is detected in the system by running the lsusb command:
user@imx8mpnavq:~$ lsusb
Bus 002 Device 079: ID 1c28:c033 PMD Technologies pmdModule
...
Launch the PMD ROS2 node by executing the following command:
user@imx8mpnavq:~$ ros2 launch pmd_camera_ros pmd_camera.launch.py
Check the ROS2 topic list to see that the PMD topics have appeared:
user@imx8mpnavq:~$ ros2 topic list
...
/initialpose
/move_base_simple/goal
/parameter_events
/pmd_camera/cloud
/pmd_camera/info
/pmd_camera/state
...
In Foxglove Studio, create a new layout and select the 3D layout template from the options provided.
Open the layout settings, navigate to Topics, find the /pmd_camera/cloud topic, and click on the "Toggle Visibility" icon.
The point cloud will be displayed. You can also view the /pmd_camera/info topic to display the camera settings.
9. Integrating Telemetry Sensors with Foxglove Studio
9.1. IMU Emulation
In this scenario, we will use a USB mouse as a substitute for an actual IMU sensor. The USB mouse is treated as a raw HID device, with values from mouse movements serving as simulated IMU data. The X-axis movement will feed the linear_acceleration.x value, while the Y-axis movement will feed the linear_acceleration.y value. Data is fetched every second through a timer named self.timer.
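As a sketch of this mapping in isolation, the helper below converts a raw mouse report to the simulated acceleration triple in the same way as the hid2imu.py script later in this note. The report layout (bytes 2 and 3 carrying the X and Y movement) is an assumption about this particular mouse; other mice may use a different layout.

```python
def report_to_accel(report):
    """Map a raw HID mouse report to a simulated (x, y, z) acceleration.

    Byte 2 is taken as the X movement and byte 3 as the Y movement,
    each scaled by 1/255 into the 0.0..1.0 range (an assumption,
    mirroring hid2imu.py). Reports shorter than 4 bytes yield zeros.
    """
    if len(report) > 3:
        return report[2] / 255.0, report[3] / 255.0, 0.0
    return 0.0, 0.0, 0.0

print(report_to_accel([0, 0, 51, 102]))  # (0.2, 0.4, 0.0)
```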
In this instance, the mouse is identified by VID 0x248a and PID 0x8366. Adjust the provided examples to match your specific configuration. You can determine the VID and PID of your USB mouse with the lsusb utility.
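The VID/PID lookup can also be scripted by parsing lsusb output, which may be convenient when the setup is automated. A minimal sketch follows; the example line and the "Mouse" search string are hypothetical, so adjust the match for your hardware.

```python
import re

def find_usb_ids(lsusb_output, name_fragment):
    """Return (vid, pid) for the first lsusb line whose device
    description contains name_fragment, or None if nothing matches."""
    for line in lsusb_output.splitlines():
        m = re.search(r"ID ([0-9a-f]{4}):([0-9a-f]{4}) (.*)", line)
        if m and name_fragment.lower() in m.group(3).lower():
            return m.group(1), m.group(2)
    return None

example = "Bus 001 Device 005: ID 248a:8366 Wireless Mouse"
print(find_usb_ids(example, "Mouse"))  # ('248a', '8366')
```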
9.2. Preparation
To configure udev and set the HID device mode to 0666 for user access, follow these steps:
Open the file /etc/udev/rules.d/60-hid.rules as the root user.
Add the following rule to the file:
SUBSYSTEM=="usb", ATTRS{idVendor}=="248a", ATTRS{idProduct}=="8366", MODE="0666"
Save the file and exit the text editor.
Reboot the system.
Create a Python script hid2imu.py with the following content:
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Imu
from std_msgs.msg import Header
from geometry_msgs.msg import (Quaternion, Vector3)
from std_msgs.msg import String
import hid
import time

h = hid.device()
h.open(0x248a, 0x8366)
h.set_nonblocking(1)

class ImuPublisher(Node):
    def __init__(self):
        super().__init__('imu_publisher')
        self.pub_imu = self.create_publisher(Imu, 'imu', 10)
        self.imu_msg = Imu()
        self.frame_id = self.declare_parameter('frame_id', "base_imu_link").value
        self.timer = self.create_timer(1.0, self.publish_imu_message)

    def publish_imu_message(self):
        """ Publish the sensor message with new data """
        imu_msg = Imu()
        gyro = Vector3()
        gyro.x, gyro.y, gyro.z = 0.0, 0.0, 0.0
        # Read data from HID device
        d = h.read(64)
        accel = Vector3()
        if len(d) > 3:
            accel.x, accel.y, accel.z = d[2]/255.0, d[3]/255.0, 0.0
        else:
            accel.x, accel.y, accel.z = 0.0, 0.0, 0.0
        imu_msg.angular_velocity = gyro
        imu_msg.angular_velocity_covariance[0] = 0.00001
        imu_msg.angular_velocity_covariance[4] = 0.00001
        imu_msg.angular_velocity_covariance[8] = 0.00001
        imu_msg.linear_acceleration = accel
        imu_msg.linear_acceleration_covariance[0] = 0.00001
        imu_msg.linear_acceleration_covariance[4] = 0.00001
        imu_msg.linear_acceleration_covariance[8] = 0.00001
        imu_msg.orientation_covariance[0] = 0.00001
        imu_msg.orientation_covariance[4] = 0.00001
        imu_msg.orientation_covariance[8] = 0.00001
        # add header
        imu_msg.header.stamp = self.get_clock().now().to_msg()
        imu_msg.header.frame_id = self.frame_id
        self.pub_imu.publish(imu_msg)

def main(args=None):
    rclpy.init(args=args)
    node = ImuPublisher()
    try:
        rclpy.spin(node)
    except KeyboardInterrupt:
        pass
    rclpy.shutdown()

if __name__ == '__main__':
    main()
This Python script opens the HID device, reads raw data from it, and publishes it to the /imu topic as a ROS2 node.
9.3. Running
Type the following command to run the script:
user@imx8mpnavq:~$ python3 hid2imu.py
To listen to the /imu topic on NAVQ+, run:
user@imx8mpnavq:~$ ros2 topic echo /imu
Here is an example output:
Run Foxglove Studio on the host machine, and follow these steps:
Click on Open connection... and enter the NAVQ+ IP address in the Websocket URL field, for example, ws://192.168.68.123:8765. Click Open to establish the connection.
In the Layout menu, choose Create new layout and then select the Image layout.
Within the Foxglove layout, press … and select Split right.
In the new panel, press … and select Change Panel. Select Plot in the menu:
Press the Click to add series button and add the topics /imu.linear_acceleration.x and /imu.linear_acceleration.y:
To add a second series, press the + button next to the Series option.
The resulting layout will display changing acceleration values.
10. Using Foxglove Studio for Analyzing Video Streams and Telemetry Sensors
There are several methods available for saving ROS2 data, including the use of tools like the Foxglove Agent. Here are some of the ways to save ROS2 data:
Bag Files: ROS2 provides a built-in tool, ros2 bag, that allows users to record ROS2 data streams into bag files. These bag files can later be replayed for analysis or debugging purposes.
Custom Loggers: Developers can create custom logging mechanisms within their ROS2 nodes to save specific data streams to text files, CSV files, or databases. This gives more flexibility in terms of data format and storage options.
Foxglove Agent: As mentioned earlier, the Foxglove Agent can be used not only for visualization and debugging but also for saving ROS2 data. The agent can capture and log ROS2 topics in real-time, providing a comprehensive solution for data collection and analysis.
Cloud Storage: Data collected from ROS2 can also be saved directly to cloud storage services like Amazon S3, Google Cloud Storage, or Azure Blob Storage for long-term storage and easy access from anywhere.
Database Integration: Integrating ROS2 with databases like SQLite, MySQL, or MongoDB allows for efficient storage and retrieval of data, enabling more advanced data analysis and querying capabilities.
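As an illustration of the custom-logger approach, the formatting logic can be kept free of ROS dependencies so it is easy to test on its own; a node's subscription callback would then call it for each /imu message. Below is a minimal sketch; the column layout and class name are assumptions, not part of any ROS API.

```python
import csv

class ImuCsvLogger:
    """Write IMU samples as CSV rows: stamp, accel_x, accel_y, accel_z."""

    def __init__(self, fileobj):
        self.writer = csv.writer(fileobj)
        self.writer.writerow(["stamp", "accel_x", "accel_y", "accel_z"])

    def log(self, stamp, ax, ay, az):
        # One row per received IMU sample.
        self.writer.writerow([stamp, ax, ay, az])

# Usage: inside a ROS2 subscription callback, one might call
#   logger.log(msg.header.stamp.sec,
#              msg.linear_acceleration.x,
#              msg.linear_acceleration.y,
#              msg.linear_acceleration.z)
```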
Let’s save the video stream and telemetry data from the USB camera using ros2 bag.
Start IMU node:
user@imx8mpnavq:~$ python3 hid2imu.py
Start the camera node:
user@imx8mpnavq:~$ ros2 run v4l2_camera v4l2_camera_node --ros-args -p video_device:=/dev/video3
Save the ROS2 data:
user@imx8mpnavq:~$ ros2 bag record -a
Wait for 30 seconds and terminate the program using Ctrl+C.
Copy the resultant file to your host machine. The file name will appear in the ros2 bag output and will look like rosbag2_2024_03_30-09_09_00.
In Foxglove Studio, select Open local file… and open the obtained file with the .db3 extension.
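The recorded bag can also be inspected programmatically. rosbag2's default storage backend is an SQLite database, so the topic list can be read with Python's standard sqlite3 module. This is a sketch assuming the default rosbag2 schema, in which a topics table holds name and type columns; the file path in the usage comment is hypothetical.

```python
import sqlite3

def list_bag_topics(db3_path):
    """Return (name, type) pairs from a rosbag2 .db3 file."""
    conn = sqlite3.connect(db3_path)
    try:
        # The default rosbag2 SQLite schema stores topic metadata
        # in a 'topics' table (assumption based on the db3 format).
        return conn.execute("SELECT name, type FROM topics").fetchall()
    finally:
        conn.close()

# Usage (hypothetical path):
# list_bag_topics("rosbag2_2024_03_30-09_09_00/some_recording.db3")
```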
You can replay your data in Foxglove Studio now:
You can modify the replay speed with the speed control feature and view values at any time point by clicking on the time axis: