<blockquote> <pre><code> plugins: [&quot;inflation_layer&quot;, &quot;obstacle_layer&quot;] </code></pre> </blockquote> <p>These are <strong>ordered</strong>, so the inflation layer only applies to layers below it. Invert the order and you should be good to go!</p>
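As a concrete sketch of the fix (only the reordered line from the question's local costmap, assuming the rest of the parameter file is unchanged):

```yaml
local_costmap:
  ros__parameters:
    # obstacle_layer first, inflation_layer after it, so inflation
    # is computed on top of the obstacles the layer below has marked
    plugins: ["obstacle_layer", "inflation_layer"]
```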
104536
2023-10-05T14:45:33.023
|navigation|ros2|simulation|ros-humble|nav2|
<p>I am trying to set up the nav2 parameter yaml file for my robot. The configuration I wrote for the costmaps can be seen here:</p> <pre><code>local_costmap:
  ros__parameters:
    update_frequency: 5.0
    publish_frequency: 5.0
    global_frame: odom # change here
    robot_base_frame: baseplate
    use_sim_time: True
    rolling_window: true # Setting &quot;rolling_window&quot; to true means the costmap stays centered around the robot as it moves through the world.
    width: 5
    height: 5
    resolution: 0.05
    footprint: &quot;[ [0.8, 0.6], [0.8, -0.6], [-0.8, -0.6], [-0.8, 0.6] ]&quot;
    plugins: [&quot;inflation_layer&quot;, &quot;obstacle_layer&quot;]
    filters: [&quot;keepout_filter&quot;]
    inflation_layer:
      enabled: True # Disable it to see the difference
      plugin: &quot;nav2_costmap_2d::InflationLayer&quot;
      cost_scaling_factor: 2.58 # A scaling factor to apply to cost values during inflation. Increasing the factor will decrease the resulting cost values.
      inflation_radius: 0.9 # The radius in meters to which the map inflates obstacle cost values.
    obstacle_layer:
      plugin: &quot;nav2_costmap_2d::ObstacleLayer&quot;
      enabled: True
      # observation_persistence: 15.0 # How long to store messages in a buffer to add to the costmap before removing them (s). Effectively, obstacles are not so easily forgotten.
      footprint_clearing_enabled: True # If true, the robot footprint will clear (mark as free) the space in which it travels.
      combination_method: 1 # 0 - Overwrite: overwrite the master costmap with every valid observation. 1 - Max: set the new value to the maximum of the master grid's value and this layer's value.
      observation_sources: scan
      scan:
        topic: &quot;/lidar_ign&quot;
        max_obstacle_height: 3.0 # The maximum height in meters of a sensor reading considered valid. Usually set slightly higher than the height of the robot.
        expected_update_rate: 0.5 # How often to expect a reading from a sensor, in seconds. A value of 0.0 allows infinite time between readings. This parameter is a failsafe to keep the navigation stack from commanding the robot when a sensor has failed; it should be slightly more permissive than the actual rate of the sensor.
        clearing: True
        marking: True
        data_type: &quot;LaserScan&quot;
        raytrace_max_range: 4.0 # If a ray encounters an obstacle at e.g. 100 meters, it does not clear any cells beyond the limit defined by raytrace_max_range; cells past it are still marked as &quot;unknown&quot; in the costmap.
        raytrace_min_range: 0.0
        obstacle_max_range: 3.0 # The maximum range in meters at which to insert obstacles into the costmap using sensor data.
        obstacle_min_range: 0.0 # The minimum range in meters at which to insert obstacles into the costmap using sensor data.
    keepout_filter:
      plugin: &quot;nav2_costmap_2d::KeepoutFilter&quot;
      enabled: True
      filter_info_topic: &quot;/costmap_filter_info&quot;
    transform_tolerance: 0.1
    always_send_full_costmap: True

global_costmap:
  global_costmap:
    ros__parameters:
      update_frequency: 5.0
      publish_frequency: 5.0
      global_frame: map
      robot_base_frame: baseplate
      use_sim_time: True
      height: 20
      width: 20
      footprint: &quot;[ [0.8, 0.6], [0.8, -0.6], [-0.8, -0.6], [-0.8, 0.6] ]&quot;
      resolution: 0.05
      track_unknown_space: false
      plugins: [&quot;static_layer&quot;, &quot;inflation_layer&quot;, &quot;obstacle_layer&quot;]
      filters: [&quot;keepout_filter&quot;, &quot;inflation_layer&quot;]
      static_layer:
        plugin: &quot;nav2_costmap_2d::StaticLayer&quot;
        map_subscribe_transient_local: True
      inflation_layer:
        enabled: True # Disable it to see the difference
        plugin: &quot;nav2_costmap_2d::InflationLayer&quot;
        cost_scaling_factor: 2.58 # A scaling factor to apply to cost values during inflation. Increasing the factor will decrease the resulting cost values.
        inflation_radius: 0.9 # The radius in meters to which the map inflates obstacle cost values.
      obstacle_layer:
        plugin: &quot;nav2_costmap_2d::ObstacleLayer&quot;
        enabled: False
        observation_persistence: 15.0 # How long to store messages in a buffer to add to the costmap before removing them (s). Effectively, obstacles are not so easily forgotten.
        footprint_clearing_enabled: True # If true, the robot footprint will clear (mark as free) the space in which it travels.
        combination_method: 1 # 0 - Overwrite: overwrite the master costmap with every valid observation. 1 - Max: set the new value to the maximum of the master grid's value and this layer's value. 2 - MaxWithoutUnknownOverwrite: like Max, but if the master value is NO_INFORMATION it is NOT overwritten; can be used to make the static map the dominant source of information and prevent the robot from going through places that are not present in the static map.
        observation_sources: scan
        scan:
          topic: &quot;/lidar_ign&quot;
          max_obstacle_height: 3.0 # The maximum height in meters of a sensor reading considered valid. Usually set slightly higher than the height of the robot.
          expected_update_rate: 0.5
          clearing: True
          marking: True
          data_type: &quot;LaserScan&quot;
          raytrace_max_range: 90.0 # If a ray encounters an obstacle at e.g. 100 meters, it does not clear any cells beyond the 90-meter limit defined by raytrace_max_range; the cells between 90 and 100 meters are still marked as &quot;unknown&quot; in the costmap.
          raytrace_min_range: 0.0
          obstacle_max_range: 70.0 # The maximum range in meters at which to insert obstacles into the costmap using sensor data.
          obstacle_min_range: 0.0 # The minimum range in meters at which to insert obstacles into the costmap using sensor data.
      keepout_filter:
        plugin: &quot;nav2_costmap_2d::KeepoutFilter&quot;
        enabled: True
        filter_info_topic: &quot;/costmap_filter_info&quot;
      transform_tolerance: 0.1
      always_send_full_costmap: True

map_server:
  ros__parameters:
    use_sim_time: True
    # Overridden in launch by the &quot;map&quot; launch configuration or provided default value.
    yaml_filename: &quot;&quot;
    topic_name: &quot;map&quot;
    frame_id: &quot;map&quot;

costmap_filter_info_server:
  ros__parameters:
    use_sim_time: true
    type: 0
    filter_info_topic: &quot;/costmap_filter_info&quot; # Topic to publish costmap filter information to.
    mask_topic: &quot;/keepout_filter_mask&quot; # Topic to publish the filter mask to. Should match the topic_name parameter of the Map Server instance tuned to filter mask publishing.
    base: 0.0
    multiplier: 1.0

filter_mask_server:
  ros__parameters:
    use_sim_time: true
    frame_id: &quot;map&quot;
    topic_name: &quot;/keepout_filter_mask&quot;
    yaml_filename: &quot;&quot;
</code></pre> <p>The global costmap works as expected. However, the local costmap appears empty. More specifically, it seems like the inflation layer of the local costmap does not work. I tried changing almost all parameters of the local costmap, but that did not fix the issue.</p>
Inflation Layer doesn't seem to be working in local costmap specifically
<p>Gazebo Sim does not support the material script, this is mentioned in the <a href="https://gazebosim.org/api/sim/8/migrationsdf.html" rel="nofollow noreferrer">migration guide</a>.</p> <p>The proposed alternatives are to either use plain colours, or to use textures.</p>
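As an illustrative sketch (the colour values and texture path are assumptions, not taken from the original model), the <code>&lt;script&gt;</code> block could be replaced with either a plain colour or a PBR albedo texture:

```xml
<!-- Alternative 1: plain colour (flat green standing in for the carpet) -->
<material>
  <ambient>0.0 0.4 0.0 1.0</ambient>
  <diffuse>0.0 0.4 0.0 1.0</diffuse>
  <specular>0.1 0.1 0.1 1.0</specular>
</material>

<!-- Alternative 2: PBR texture (the file path here is hypothetical) -->
<material>
  <pbr>
    <metal>
      <albedo_map>materials/textures/carpet.png</albedo_map>
    </metal>
  </pbr>
</material>
```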
104540
2023-10-06T02:04:35.593
|gazebo|simulation|gazebo-7|
<p>Environment:<br /> <span class="math-container">$\space$</span> OS: Linux Ubuntu 22.04<br /> <span class="math-container">$\space$</span> Gazebo version: Garden<br /> Problem:<br /> <span class="math-container">$\space$</span> I downloaded the SDF from <a href="https://app.gazebosim.org/OpenRobotics/fuel/models/RoboCup%202009%20SPL%20Field" rel="nofollow noreferrer">here</a>; it is supposed to look like this <a href="https://i.stack.imgur.com/e2eWu.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/e2eWu.png" alt="enter image description here" /></a> <span class="math-container">$\space$</span> but in my simulation it looks like this <a href="https://i.stack.imgur.com/hQLAw.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/hQLAw.png" alt="enter image description here" /></a> I think it is caused by an error when loading the material. In model.sdf the material element looks like this:</p> <pre><code>&lt;material&gt;
  &lt;script&gt;
    &lt;uri&gt;materials/scripts/&lt;/uri&gt;
    &lt;uri&gt;materials/textures/&lt;/uri&gt;
    &lt;name&gt;RoboCup/Carpet&lt;/name&gt;
  &lt;/script&gt;
&lt;/material&gt;
</code></pre> <p>I read Material.cc in sdformat/src and found that the script element does seem to be parsed here:</p> <pre><code>if (_sdf-&gt;HasElement(&quot;script&quot;))
{
  sdf::ElementPtr elem = _sdf-&gt;GetElement(&quot;script&quot;, errors);
  std::pair&lt;std::string, bool&gt; uriPair =
      elem-&gt;Get&lt;std::string&gt;(errors, &quot;uri&quot;, &quot;&quot;);
  if (uriPair.first == &quot;__default__&quot;)
    uriPair.first = &quot;&quot;;
  if (!uriPair.second || uriPair.first.empty())
  {
    errors.push_back({ErrorCode::ELEMENT_INVALID,
        &quot;A &lt;script&gt; element is missing a child &lt;uri&gt; element, or the &quot;
        &quot;&lt;uri&gt; element is empty.&quot;});
  }
  this-&gt;dataPtr-&gt;scriptUri = resolveURI(uriPair.first, _config, errors);

  std::pair&lt;std::string, bool&gt; namePair =
      elem-&gt;Get&lt;std::string&gt;(errors, &quot;name&quot;, &quot;&quot;);
  if (namePair.first == &quot;__default__&quot;)
    namePair.first = &quot;&quot;;
  if (!namePair.second || namePair.first.empty())
  {
    errors.push_back({ErrorCode::ELEMENT_MISSING,
        &quot;A &lt;script&gt; element is missing a child &lt;name&gt; element, or the &quot;
        &quot;&lt;name&gt; element is empty.&quot;});
  }
  this-&gt;dataPtr-&gt;scriptName = namePair.first;
}
</code></pre> <p>So I wonder whether something is going wrong in rendering instead?</p>
gazebo garden can't load material script
<p>I found the solution: I just changed the order of statements in CMakeLists.txt.</p> <p>This is what worked for me:</p> <pre><code># cmake_minimum_required(VERSION 3.0.2)
project(midas_cpp)

list(APPEND CMAKE_PREFIX_PATH &quot;~/libtorch&quot;)
list(APPEND CMAKE_PREFIX_PATH &quot;~/librealsense&quot;)
link_directories(&quot;/usr/local/include/pcl-1.13/&quot;)
link_directories(&quot;/usr/local/lib/&quot;)
list(APPEND CMAKE_PREFIX_PATH &quot;/usr/local/include/&quot;)
list(APPEND CMAKE_PREFIX_PATH &quot;/usr/local/lib/python3.6/dist-packages/torch/lib&quot;)
list(APPEND CMAKE_PREFIX_PATH &quot;/usr/local/lib/python2.7/dist-packages/torch/lib&quot;)

find_package(Torch REQUIRED)
find_package(OpenCV REQUIRED)
find_package(realsense2 REQUIRED)
find_package(PCL)

find_package(catkin REQUIRED COMPONENTS
  cv_bridge
  image_transport
  roscpp
  rospy
  sensor_msgs
  std_msgs
  PCL
)

include_directories(${PCL_INCLUDE_DIRS})
link_directories(${PCL_LIBRARY_DIRS})
add_definitions(${PCL_DEFINITIONS})

include_directories(${OpenCV_INCLUDE_DIRS})
include_directories(${PCL_INCLUDE_DIRS})

catkin_package(
#  INCLUDE_DIRS include
  LIBRARIES midas_cpp pcl_ros_filters pcl_ros_io pcl_ros_tf
  CATKIN_DEPENDS cv_bridge image_transport roscpp sensor_msgs std_msgs pcl_ros
  DEPENDS PCL
)

###########
## Build ##
###########

## Specify additional locations of header files
## Your package locations should be listed before other locations
include_directories(
  # include
  ${catkin_INCLUDE_DIRS}
)

add_executable(midas_cpp src/main.cpp)
target_link_libraries(midas_cpp &quot;${TORCH_LIBRARIES}&quot; &quot;${OpenCV_LIBS} ${catkin_LIBRARIES}&quot;)
set_property(TARGET midas_cpp PROPERTY CXX_STANDARD 14)

add_executable(realsense2_camera src/realsense_bringup.cpp)
target_link_libraries(realsense2_camera &quot;${OpenCV_LIBS} ${catkin_LIBRARIES}&quot; realsense2::realsense2)
set_property(TARGET realsense2_camera PROPERTY CXX_STANDARD 14)

# add_executable(depth_image src/depth_image.cpp)
# target_link_libraries(depth_image &quot;${OpenCV_LIBS} ${catkin_LIBRARIES}&quot;)
# set_property(TARGET depth_image PROPERTY CXX_STANDARD 14)

add_executable(camera_bringup src/camera_bringup.cpp)
target_link_libraries(camera_bringup &quot;${OpenCV_LIBS} ${catkin_LIBRARIES}&quot;)
set_property(TARGET camera_bringup PROPERTY CXX_STANDARD 14)

# add_executable(depth2cloud src/depth2cloud.cpp)
# target_link_libraries(depth2cloud &quot;${OpenCV_LIBS} ${catkin_LIBRARIES}&quot;)
# set_property(TARGET depth2cloud PROPERTY CXX_STANDARD 14)

message(&quot;${PCL_LIBRARIES}&quot;)

add_executable(depth_pcl src/depth_pcl.cpp)
target_link_libraries(depth_pcl &quot;${OpenCV_LIBS} ${catkin_LIBRARIES}&quot;)
</code></pre>
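The change that seems to matter in the listing above is that <code>find_package(PCL)</code> now runs before <code>find_package(catkin ...)</code> lists PCL as a component. A minimal sketch of that ordering (the <code>my_node</code> target and the component list are illustrative, not from the answer):

```cmake
# Locate PCL first so its variables and targets exist...
find_package(PCL REQUIRED)
# ...before catkin tries to resolve it as a component.
find_package(catkin REQUIRED COMPONENTS roscpp pcl_ros)

include_directories(${PCL_INCLUDE_DIRS} ${catkin_INCLUDE_DIRS})

add_executable(my_node src/my_node.cpp)  # hypothetical target
target_link_libraries(my_node ${catkin_LIBRARIES} ${PCL_LIBRARIES})
```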
104542
2023-10-06T05:03:49.553
|python|pcl|catkin-make|ros-noetic|pcl-ros|
<p>I am trying to use PCL filters in my ROS node to perform noise removal on the point cloud, but while building the package I am getting CMake errors.</p> <pre><code>#include &lt;pcl/filters/statistical_outlier_removal.h&gt;

pcl::StatisticalOutlierRemoval&lt;pcl::PointXYZ&gt; sor;
sor.setInputCloud(cloud_msg);
sor.setMeanK(50);             // Number of neighbors to consider for mean distance estimation
sor.setStddevMulThresh(1.0);  // Standard deviation threshold
sor.filter(*cloud_msg);       // Apply the noise filter
</code></pre> <p>Here is the catkin_make output:</p> <pre><code>-- Using PYTHON_EXECUTABLE: /usr/bin/python3
-- Using Debian Python package layout
-- Using empy: /usr/lib/python3/dist-packages/em.py
-- Using CATKIN_ENABLE_TESTING: ON
-- Call enable_testing()
-- Using CATKIN_TEST_RESULTS_DIR: /home/ketan/MiDaS/ros/build/test_results
-- Forcing gtest/gmock from source, though one was otherwise available.
-- Found gtest sources under '/usr/src/googletest': gtests will be built
-- Found gmock sources under '/usr/src/googletest': gmock will be built
-- Found PythonInterp: /usr/bin/python3 (found version &quot;3.8.10&quot;)
-- Using Python nosetests: /usr/bin/nosetests3
-- catkin 0.8.10
-- BUILD_SHARED_LIBS is on
-- BUILD_SHARED_LIBS is on
-- ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-- ~~ traversing 1 packages in topological order:
-- ~~ - midas_cpp
-- ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-- +++ processing catkin package: 'midas_cpp'
-- ==&gt; add_subdirectory(midas_cpp)
-- Caffe2: CUDA detected: 12.2
-- Caffe2: CUDA nvcc is: /usr/local/cuda/bin/nvcc
-- Caffe2: CUDA toolkit directory: /usr/local/cuda
-- Caffe2: Header version is: 12.2
-- /usr/local/cuda/lib64/libnvrtc.so shorthash is 000ca627
-- USE_CUDNN is set to 0.
Compiling without cuDNN support -- Autodetected CUDA architecture(s): 6.1 -- Added CUDA NVCC flags for: -gencode;arch=compute_61,code=sm_61 -- Eigen found (include: /usr/include/eigen3, version: 3.3.7) -- FLANN found (include: /usr/include, lib: /usr/lib/x86_64-linux-gnu/libflann_cpp.so) -- The imported target &quot;vtkParseOGLExt&quot; references the file &quot;/usr/bin/vtkParseOGLExt-7.1&quot; but this file does not exist. Possible reasons include: * The file was deleted, renamed, or moved to another location. * An install or uninstall procedure did not complete successfully. * The installation package was faulty and contained &quot;/usr/lib/cmake/vtk-7.1/VTKTargets.cmake&quot; but not all the files it references. -- The imported target &quot;vtkRenderingPythonTkWidgets&quot; references the file &quot;/usr/lib/x86_64-linux-gnu/libvtkRenderingPythonTkWidgets.so&quot; but this file does not exist. Possible reasons include: * The file was deleted, renamed, or moved to another location. * An install or uninstall procedure did not complete successfully. * The installation package was faulty and contained &quot;/usr/lib/cmake/vtk-7.1/VTKTargets.cmake&quot; but not all the files it references. -- The imported target &quot;vtk&quot; references the file &quot;/usr/bin/vtk&quot; but this file does not exist. Possible reasons include: * The file was deleted, renamed, or moved to another location. * An install or uninstall procedure did not complete successfully. * The installation package was faulty and contained &quot;/usr/lib/cmake/vtk-7.1/VTKTargets.cmake&quot; but not all the files it references. -- The imported target &quot;pvtk&quot; references the file &quot;/usr/bin/pvtk&quot; but this file does not exist. Possible reasons include: * The file was deleted, renamed, or moved to another location. * An install or uninstall procedure did not complete successfully. 
* The installation package was faulty and contained &quot;/usr/lib/cmake/vtk-7.1/VTKTargets.cmake&quot; but not all the files it references. -- OpenNI found (version: 1.5.4.0, include: /usr/include/ni, lib: /usr/lib/libOpenNI.so;libusb::libusb) -- OpenNI2 found (version: 2.2.0.33, include: /usr/include/openni2, lib: /usr/lib/libOpenNI2.so;libusb::libusb) -- Could NOT find Pcap (missing: PCAP_LIBRARIES PCAP_INCLUDE_DIRS) ** WARNING ** io features related to pcap will be disabled -- Eigen found (include: /usr/include/eigen3, version: 3.3.7) -- OpenNI found (version: 1.5.4.0, include: /usr/include/ni, lib: /usr/lib/libOpenNI.so;libusb::libusb) -- OpenNI2 found (version: 2.2.0.33, include: /usr/include/openni2, lib: /usr/lib/libOpenNI2.so;libusb::libusb) -- Could NOT find Qhull (missing: QHULL_INCLUDE_DIR) ** WARNING ** surface features related to qhull will be disabled -- looking for PCL_COMMON -- looking for PCL_KDTREE -- looking for PCL_OCTREE -- looking for PCL_SEARCH -- looking for PCL_SAMPLE_CONSENSUS -- looking for PCL_FILTERS -- looking for PCL_2D -- looking for PCL_GEOMETRY -- looking for PCL_IO -- looking for PCL_FEATURES -- looking for PCL_ML -- looking for PCL_SEGMENTATION -- looking for PCL_VISUALIZATION -- looking for PCL_SURFACE -- looking for PCL_REGISTRATION -- looking for PCL_KEYPOINTS -- looking for PCL_TRACKING -- looking for PCL_RECOGNITION -- looking for PCL_STEREO -- looking for PCL_OUTOFCORE -- looking for PCL_PEOPLE -- Using these message generators: gencpp;geneus;genlisp;gennodejs;genpy -- Configuring done (1.6s) -- Generating done (0.0s) -- Build files have been written to: /home/ketan/MiDaS/ros/build #### #### Running command: &quot;make -j12 -l12&quot; in &quot;/home/ketan/MiDaS/ros/build&quot; #### [ 25%] Built target camera_bringup [ 50%] Built target realsense2_camera [ 62%] Linking CXX executable /home/ketan/MiDaS/ros/devel/lib/midas_cpp/depth_pcl [ 87%] Built target midas_cpp /usr/bin/ld: warning: libopencv_imgcodecs.so.4.2, needed by 
/opt/ros/noetic/lib/libcv_bridge.so, may conflict with libopencv_imgcodecs.so.408 /usr/bin/ld: warning: libopencv_core.so.408, needed by /home/ketan/opencv/build/lib/libopencv_imgcodecs.so.4.8.0, may conflict with libopencv_core.so.4.2 /usr/bin/ld: CMakeFiles/depth_pcl.dir/src/depth_pcl.cpp.o: in function `DepthPcl::stateCallback(ros::TimerEvent const&amp;)': depth_pcl.cpp:(.text+0x7d1): undefined reference to `pcl::PCLBase&lt;pcl::PointXYZ&gt;::setInputCloud(std::shared_ptr&lt;pcl::PointCloud&lt;pcl::PointXYZ&gt; const&gt; const&amp;)' /usr/bin/ld: CMakeFiles/depth_pcl.dir/src/depth_pcl.cpp.o: in function `pcl::Filter&lt;pcl::PointXYZ&gt;::filter(pcl::PointCloud&lt;pcl::PointXYZ&gt;&amp;)': depth_pcl.cpp:(.text._ZN3pcl6FilterINS_8PointXYZEE6filterERNS_10PointCloudIS1_EE[_ZN3pcl6FilterINS_8PointXYZEE6filterERNS_10PointCloudIS1_EE]+0x38): undefined reference to `pcl::PCLBase&lt;pcl::PointXYZ&gt;::initCompute()' /usr/bin/ld: depth_pcl.cpp:(.text._ZN3pcl6FilterINS_8PointXYZEE6filterERNS_10PointCloudIS1_EE[_ZN3pcl6FilterINS_8PointXYZEE6filterERNS_10PointCloudIS1_EE]+0x205): undefined reference to `pcl::PCLBase&lt;pcl::PointXYZ&gt;::deinitCompute()' /usr/bin/ld: CMakeFiles/depth_pcl.dir/src/depth_pcl.cpp.o: in function `pcl::Filter&lt;pcl::PointXYZ&gt;::Filter(bool)': depth_pcl.cpp:(.text._ZN3pcl6FilterINS_8PointXYZEEC2Eb[_ZN3pcl6FilterINS_8PointXYZEEC5Eb]+0x20): undefined reference to `pcl::PCLBase&lt;pcl::PointXYZ&gt;::PCLBase()' /usr/bin/ld: CMakeFiles/depth_pcl.dir/src/depth_pcl.cpp.o: in function `void pcl::PointCloud&lt;pcl::PointXYZ&gt;::assign&lt;__gnu_cxx::__normal_iterator&lt;pcl::PointXYZ const*, std::vector&lt;pcl::PointXYZ, Eigen::aligned_allocator&lt;pcl::PointXYZ&gt; &gt; &gt; &gt;(__gnu_cxx::__normal_iterator&lt;pcl::PointXYZ const*, std::vector&lt;pcl::PointXYZ, Eigen::aligned_allocator&lt;pcl::PointXYZ&gt; &gt; &gt;, __gnu_cxx::__normal_iterator&lt;pcl::PointXYZ const*, std::vector&lt;pcl::PointXYZ, Eigen::aligned_allocator&lt;pcl::PointXYZ&gt; &gt; 
&gt;, int)': depth_pcl.cpp:(.text._ZN3pcl10PointCloudINS_8PointXYZEE6assignIN9__gnu_cxx17__normal_iteratorIPKS1_St6vectorIS1_N5Eigen17aligned_allocatorIS1_EEEEEEEvT_SE_i[_ZN3pcl10PointCloudINS_8PointXYZEE6assignIN9__gnu_cxx17__normal_iteratorIPKS1_St6vectorIS1_N5Eigen17aligned_allocatorIS1_EEEEEEEvT_SE_i]+0x36): undefined reference to `pcl::console::print(pcl::console::VERBOSITY_LEVEL, char const*, ...)' /usr/bin/ld: depth_pcl.cpp:(.text._ZN3pcl10PointCloudINS_8PointXYZEE6assignIN9__gnu_cxx17__normal_iteratorIPKS1_St6vectorIS1_N5Eigen17aligned_allocatorIS1_EEEEEEEvT_SE_i[_ZN3pcl10PointCloudINS_8PointXYZEE6assignIN9__gnu_cxx17__normal_iteratorIPKS1_St6vectorIS1_N5Eigen17aligned_allocatorIS1_EEEEEEEvT_SE_i]+0x129): undefined reference to `pcl::console::print(pcl::console::VERBOSITY_LEVEL, char const*, ...)' /usr/bin/ld: CMakeFiles/depth_pcl.dir/src/depth_pcl.cpp.o:(.data.rel.ro._ZTVN3pcl25StatisticalOutlierRemovalINS_8PointXYZEEE[_ZTVN3pcl25StatisticalOutlierRemovalINS_8PointXYZEEE]+0x20): undefined reference to `pcl::PCLBase&lt;pcl::PointXYZ&gt;::setInputCloud(std::shared_ptr&lt;pcl::PointCloud&lt;pcl::PointXYZ&gt; const&gt; const&amp;)' /usr/bin/ld: CMakeFiles/depth_pcl.dir/src/depth_pcl.cpp.o:(.data.rel.ro._ZTVN3pcl25StatisticalOutlierRemovalINS_8PointXYZEEE[_ZTVN3pcl25StatisticalOutlierRemovalINS_8PointXYZEEE]+0x28): undefined reference to `pcl::PCLBase&lt;pcl::PointXYZ&gt;::setIndices(std::shared_ptr&lt;std::vector&lt;int, std::allocator&lt;int&gt; &gt; &gt; const&amp;)' /usr/bin/ld: CMakeFiles/depth_pcl.dir/src/depth_pcl.cpp.o:(.data.rel.ro._ZTVN3pcl25StatisticalOutlierRemovalINS_8PointXYZEEE[_ZTVN3pcl25StatisticalOutlierRemovalINS_8PointXYZEEE]+0x30): undefined reference to `pcl::PCLBase&lt;pcl::PointXYZ&gt;::setIndices(std::shared_ptr&lt;std::vector&lt;int, std::allocator&lt;int&gt; &gt; const&gt; const&amp;)' /usr/bin/ld: 
CMakeFiles/depth_pcl.dir/src/depth_pcl.cpp.o:(.data.rel.ro._ZTVN3pcl25StatisticalOutlierRemovalINS_8PointXYZEEE[_ZTVN3pcl25StatisticalOutlierRemovalINS_8PointXYZEEE]+0x38): undefined reference to `pcl::PCLBase&lt;pcl::PointXYZ&gt;::setIndices(std::shared_ptr&lt;pcl::PointIndices const&gt; const&amp;)' /usr/bin/ld: CMakeFiles/depth_pcl.dir/src/depth_pcl.cpp.o:(.data.rel.ro._ZTVN3pcl25StatisticalOutlierRemovalINS_8PointXYZEEE[_ZTVN3pcl25StatisticalOutlierRemovalINS_8PointXYZEEE]+0x40): undefined reference to `pcl::PCLBase&lt;pcl::PointXYZ&gt;::setIndices(unsigned long, unsigned long, unsigned long, unsigned long)' /usr/bin/ld: CMakeFiles/depth_pcl.dir/src/depth_pcl.cpp.o:(.data.rel.ro._ZTVN3pcl25StatisticalOutlierRemovalINS_8PointXYZEEE[_ZTVN3pcl25StatisticalOutlierRemovalINS_8PointXYZEEE]+0x48): undefined reference to `pcl::FilterIndices&lt;pcl::PointXYZ&gt;::applyFilter(pcl::PointCloud&lt;pcl::PointXYZ&gt;&amp;)' /usr/bin/ld: CMakeFiles/depth_pcl.dir/src/depth_pcl.cpp.o:(.data.rel.ro._ZTVN3pcl13FilterIndicesINS_8PointXYZEEE[_ZTVN3pcl13FilterIndicesINS_8PointXYZEEE]+0x20): undefined reference to `pcl::PCLBase&lt;pcl::PointXYZ&gt;::setInputCloud(std::shared_ptr&lt;pcl::PointCloud&lt;pcl::PointXYZ&gt; const&gt; const&amp;)' /usr/bin/ld: CMakeFiles/depth_pcl.dir/src/depth_pcl.cpp.o:(.data.rel.ro._ZTVN3pcl13FilterIndicesINS_8PointXYZEEE[_ZTVN3pcl13FilterIndicesINS_8PointXYZEEE]+0x28): undefined reference to `pcl::PCLBase&lt;pcl::PointXYZ&gt;::setIndices(std::shared_ptr&lt;std::vector&lt;int, std::allocator&lt;int&gt; &gt; &gt; const&amp;)' /usr/bin/ld: CMakeFiles/depth_pcl.dir/src/depth_pcl.cpp.o:(.data.rel.ro._ZTVN3pcl13FilterIndicesINS_8PointXYZEEE[_ZTVN3pcl13FilterIndicesINS_8PointXYZEEE]+0x30): undefined reference to `pcl::PCLBase&lt;pcl::PointXYZ&gt;::setIndices(std::shared_ptr&lt;std::vector&lt;int, std::allocator&lt;int&gt; &gt; const&gt; const&amp;)' /usr/bin/ld: 
CMakeFiles/depth_pcl.dir/src/depth_pcl.cpp.o:(.data.rel.ro._ZTVN3pcl13FilterIndicesINS_8PointXYZEEE[_ZTVN3pcl13FilterIndicesINS_8PointXYZEEE]+0x38): undefined reference to `pcl::PCLBase&lt;pcl::PointXYZ&gt;::setIndices(std::shared_ptr&lt;pcl::PointIndices const&gt; const&amp;)' /usr/bin/ld: CMakeFiles/depth_pcl.dir/src/depth_pcl.cpp.o:(.data.rel.ro._ZTVN3pcl13FilterIndicesINS_8PointXYZEEE[_ZTVN3pcl13FilterIndicesINS_8PointXYZEEE]+0x40): undefined reference to `pcl::PCLBase&lt;pcl::PointXYZ&gt;::setIndices(unsigned long, unsigned long, unsigned long, unsigned long)' /usr/bin/ld: CMakeFiles/depth_pcl.dir/src/depth_pcl.cpp.o:(.data.rel.ro._ZTVN3pcl13FilterIndicesINS_8PointXYZEEE[_ZTVN3pcl13FilterIndicesINS_8PointXYZEEE]+0x48): undefined reference to `pcl::FilterIndices&lt;pcl::PointXYZ&gt;::applyFilter(pcl::PointCloud&lt;pcl::PointXYZ&gt;&amp;)' /usr/bin/ld: CMakeFiles/depth_pcl.dir/src/depth_pcl.cpp.o:(.data.rel.ro._ZTVN3pcl6FilterINS_8PointXYZEEE[_ZTVN3pcl6FilterINS_8PointXYZEEE]+0x20): undefined reference to `pcl::PCLBase&lt;pcl::PointXYZ&gt;::setInputCloud(std::shared_ptr&lt;pcl::PointCloud&lt;pcl::PointXYZ&gt; const&gt; const&amp;)' /usr/bin/ld: CMakeFiles/depth_pcl.dir/src/depth_pcl.cpp.o:(.data.rel.ro._ZTVN3pcl6FilterINS_8PointXYZEEE[_ZTVN3pcl6FilterINS_8PointXYZEEE]+0x28): undefined reference to `pcl::PCLBase&lt;pcl::PointXYZ&gt;::setIndices(std::shared_ptr&lt;std::vector&lt;int, std::allocator&lt;int&gt; &gt; &gt; const&amp;)' /usr/bin/ld: CMakeFiles/depth_pcl.dir/src/depth_pcl.cpp.o:(.data.rel.ro._ZTVN3pcl6FilterINS_8PointXYZEEE[_ZTVN3pcl6FilterINS_8PointXYZEEE]+0x30): undefined reference to `pcl::PCLBase&lt;pcl::PointXYZ&gt;::setIndices(std::shared_ptr&lt;std::vector&lt;int, std::allocator&lt;int&gt; &gt; const&gt; const&amp;)' /usr/bin/ld: CMakeFiles/depth_pcl.dir/src/depth_pcl.cpp.o:(.data.rel.ro._ZTVN3pcl6FilterINS_8PointXYZEEE[_ZTVN3pcl6FilterINS_8PointXYZEEE]+0x38): undefined reference to 
`pcl::PCLBase&lt;pcl::PointXYZ&gt;::setIndices(std::shared_ptr&lt;pcl::PointIndices const&gt; const&amp;)' /usr/bin/ld: CMakeFiles/depth_pcl.dir/src/depth_pcl.cpp.o:(.data.rel.ro._ZTVN3pcl6FilterINS_8PointXYZEEE[_ZTVN3pcl6FilterINS_8PointXYZEEE]+0x40): undefined reference to `pcl::PCLBase&lt;pcl::PointXYZ&gt;::setIndices(unsigned long, unsigned long, unsigned long, unsigned long)' /usr/bin/ld: CMakeFiles/depth_pcl.dir/src/depth_pcl.cpp.o:(.data.rel.ro._ZTVN3pcl7PCLBaseINS_8PointXYZEEE[_ZTVN3pcl7PCLBaseINS_8PointXYZEEE]+0x20): undefined reference to `pcl::PCLBase&lt;pcl::PointXYZ&gt;::setInputCloud(std::shared_ptr&lt;pcl::PointCloud&lt;pcl::PointXYZ&gt; const&gt; const&amp;)' /usr/bin/ld: CMakeFiles/depth_pcl.dir/src/depth_pcl.cpp.o:(.data.rel.ro._ZTVN3pcl7PCLBaseINS_8PointXYZEEE[_ZTVN3pcl7PCLBaseINS_8PointXYZEEE]+0x28): undefined reference to `pcl::PCLBase&lt;pcl::PointXYZ&gt;::setIndices(std::shared_ptr&lt;std::vector&lt;int, std::allocator&lt;int&gt; &gt; &gt; const&amp;)' /usr/bin/ld: CMakeFiles/depth_pcl.dir/src/depth_pcl.cpp.o:(.data.rel.ro._ZTVN3pcl7PCLBaseINS_8PointXYZEEE[_ZTVN3pcl7PCLBaseINS_8PointXYZEEE]+0x30): undefined reference to `pcl::PCLBase&lt;pcl::PointXYZ&gt;::setIndices(std::shared_ptr&lt;std::vector&lt;int, std::allocator&lt;int&gt; &gt; const&gt; const&amp;)' /usr/bin/ld: CMakeFiles/depth_pcl.dir/src/depth_pcl.cpp.o:(.data.rel.ro._ZTVN3pcl7PCLBaseINS_8PointXYZEEE[_ZTVN3pcl7PCLBaseINS_8PointXYZEEE]+0x38): undefined reference to `pcl::PCLBase&lt;pcl::PointXYZ&gt;::setIndices(std::shared_ptr&lt;pcl::PointIndices const&gt; const&amp;)' /usr/bin/ld: CMakeFiles/depth_pcl.dir/src/depth_pcl.cpp.o:(.data.rel.ro._ZTVN3pcl7PCLBaseINS_8PointXYZEEE[_ZTVN3pcl7PCLBaseINS_8PointXYZEEE]+0x40): undefined reference to `pcl::PCLBase&lt;pcl::PointXYZ&gt;::setIndices(unsigned long, unsigned long, unsigned long, unsigned long)' /usr/bin/ld: CMakeFiles/depth_pcl.dir/src/depth_pcl.cpp.o: in function 
`pcl::StatisticalOutlierRemoval&lt;pcl::PointXYZ&gt;::applyFilter(std::vector&lt;int, std::allocator&lt;int&gt; &gt;&amp;)': depth_pcl.cpp:(.text._ZN3pcl25StatisticalOutlierRemovalINS_8PointXYZEE11applyFilterERSt6vectorIiSaIiEE[_ZN3pcl25StatisticalOutlierRemovalINS_8PointXYZEE11applyFilterERSt6vectorIiSaIiEE]+0x23): undefined reference to `pcl::StatisticalOutlierRemoval&lt;pcl::PointXYZ&gt;::applyFilterIndices(std::vector&lt;int, std::allocator&lt;int&gt; &gt;&amp;)'
collect2: error: ld returned 1 exit status
make[2]: *** [midas_cpp/CMakeFiles/depth_pcl.dir/build.make:187: /home/ketan/MiDaS/ros/devel/lib/midas_cpp/depth_pcl] Error 1
make[1]: *** [CMakeFiles/Makefile2:2579: midas_cpp/CMakeFiles/depth_pcl.dir/all] Error 2
make: *** [Makefile:146: all] Error 2
Invoking &quot;make -j12 -l12&quot; failed
</code></pre> <p>From this I understand that it is not able to find the library, or the library is not linked correctly.</p> <pre><code>cmake_minimum_required(VERSION 3.0.2)
project(midas_cpp)

find_package(catkin REQUIRED COMPONENTS
  cv_bridge
  image_transport
  roscpp
  rospy
  sensor_msgs
  std_msgs
)

list(APPEND CMAKE_PREFIX_PATH &quot;~/libtorch&quot;)
list(APPEND CMAKE_PREFIX_PATH &quot;~/librealsense&quot;)
link_directories(&quot;/usr/local/include/pcl-1.13/&quot;)
list(APPEND CMAKE_PREFIX_PATH &quot;/usr/local/include/&quot;)
list(APPEND CMAKE_PREFIX_PATH &quot;/usr/local/lib/python3.6/dist-packages/torch/lib&quot;)
list(APPEND CMAKE_PREFIX_PATH &quot;/usr/local/lib/python2.7/dist-packages/torch/lib&quot;)

find_package(Torch REQUIRED)
find_package(OpenCV REQUIRED)
find_package(realsense2 REQUIRED)
find_package(PCL)
find_package(pcl_ros)

include_directories(${PCL_INCLUDE_DIRS})
link_directories(${PCL_LIBRARY_DIRS})
add_definitions(${PCL_DEFINITIONS})

include_directories(${OpenCV_INCLUDE_DIRS})
include_directories(${PCL_INCLUDE_DIRS})

###########
## Build ##
###########

## Specify additional locations of header files
## Your package locations should be listed before other locations
include_directories(
  # include
  ${catkin_INCLUDE_DIRS}
)

add_executable(midas_cpp src/main.cpp)
target_link_libraries(midas_cpp &quot;${TORCH_LIBRARIES}&quot; &quot;${OpenCV_LIBS} ${catkin_LIBRARIES}&quot;)
set_property(TARGET midas_cpp PROPERTY CXX_STANDARD 14)

add_executable(realsense2_camera src/realsense_bringup.cpp)
target_link_libraries(realsense2_camera &quot;${OpenCV_LIBS} ${catkin_LIBRARIES}&quot; realsense2::realsense2)
set_property(TARGET realsense2_camera PROPERTY CXX_STANDARD 14)

add_executable(camera_bringup src/camera_bringup.cpp)
target_link_libraries(camera_bringup &quot;${OpenCV_LIBS} ${catkin_LIBRARIES}&quot;)
set_property(TARGET camera_bringup PROPERTY CXX_STANDARD 14)

add_executable(depth_pcl src/depth_pcl.cpp)
target_link_libraries(depth_pcl &quot;${OpenCV_LIBS} ${catkin_LIBRARIES} ${PCL_LIBRARIES}&quot;)
set_property(TARGET depth_pcl PROPERTY CXX_STANDARD 14)

#############
## Install ##
#############

#############
## Testing ##
#############

# catkin_package()
</code></pre> <p>I did some Google searching and have tried various solutions, like adding pcl_ros to find_package, adding it in package.xml, and adding PCL_LIBRARIES to target_link_libraries. Nothing seems to work.</p> <p>Thanks for the help.</p>
Facing issues integrating PCL with ROS
<p>The problem comes from <code>namespace</code>.</p> <p>With:</p> <pre><code>from launch import LaunchDescription import launch_ros.actions def generate_launch_description(): return LaunchDescription([ launch_ros.actions.Node( namespace= &quot;talker&quot;, package='my_package', executable='talker'), launch_ros.actions.Node( namespace= &quot;listener&quot;, package='my_package', executable='listener'), ]) </code></pre> <p>Nodes end up in different namespaces. So the subscriber should do:</p> <pre><code>self.subscription = self.create_subscription( ChannelFloat32, '/talker/topic', # Adjust the topic namespace self.listener_callback, 10) </code></pre> <p>Alternatively, the originally posted subscriber code will work if no namespace is used:</p> <pre><code>from launch import LaunchDescription import launch_ros.actions def generate_launch_description(): return LaunchDescription([ launch_ros.actions.Node( package='my_package', executable='talker'), launch_ros.actions.Node( package='my_package', executable='listener'), ]) </code></pre> <p>Note: This solution was given to me by ChatGPT...</p>
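To see why the original subscriber misses the messages, it helps to spell out ROS 2 name expansion. The sketch below is a plain-Python illustration only — the `resolve_topic` helper is hypothetical, not part of rclpy — of how a relative topic name is combined with a node's namespace:

```python
# A plain-Python illustration of ROS 2 relative-name expansion; the
# resolve_topic helper below is hypothetical, not part of rclpy.
def resolve_topic(namespace: str, topic: str) -> str:
    """Expand a topic name the way ROS 2 does for a node in `namespace`."""
    if topic.startswith('/'):           # absolute names ignore the namespace
        return topic
    ns = '/' + namespace.strip('/')     # normalize to a leading slash
    return ('/' + topic) if ns == '/' else (ns + '/' + topic)

# The talker node launched with namespace="talker" publishes on:
print(resolve_topic('talker', 'topic'))            # /talker/topic
# ...while the listener launched with namespace="listener" subscribes to:
print(resolve_topic('listener', 'topic'))          # /listener/topic
# An absolute name bypasses the namespace, which is why the fix works:
print(resolve_topic('listener', '/talker/topic'))  # /talker/topic
```

With the namespaced launch file the two relative names resolve to different absolute topics, so the listener never sees the talker's messages.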
104552
2023-10-06T13:28:27.243
|ros2|ros2-launch|
<p>I'm using ROS2 humble under Windows. I have a very simple PyQt based publisher and subscriber:</p> <p><strong>Publisher:</strong></p> <pre><code>import rclpy from rclpy.node import Node from rclpy.executors import MultiThreadedExecutor from sensor_msgs.msg import ChannelFloat32 from PyQt5.QtWidgets import QApplication, QWidget, QPushButton, QVBoxLayout import threading import math import sys class Gui: def __init__(self,publisher): self.publisher = publisher self.widget = QWidget() self.layout = QVBoxLayout(self.widget) self.pushButton = QPushButton(self.widget) self.pushButton.setText(&quot;Press to reset publisher&quot;) self.pushButton.clicked.connect(self.publisher.resetCounter) self.layout.addWidget( self.pushButton ) self.widget.setGeometry(50,50,320,200) self.widget.show() self.publisher.widget = self.widget class DummyPublisher(Node): def __init__(self): super().__init__('dummy_publisher') self.publisher_ = self.create_publisher(ChannelFloat32, 'topic', 10) timer_period = 0.5 # seconds self.timer = self.create_timer(timer_period, self.timer_callback) self.resetCounter() self.widget = None self.t0 = self.get_clock().now().nanoseconds def timer_callback(self): msg = ChannelFloat32() msg.values = [ float((self.get_clock().now().nanoseconds-self.t0)/1e9), float(self.i), float(math.sin(self.i)), float(self.i*4) ] self.publisher_.publish(msg) self.get_logger().info('Publishing: &quot;%s&quot;\n' % str(msg.values)) self.i += 1 if self.widget: self.widget.setWindowTitle(&quot;Dummy publisher &quot; + str(self.i)) def resetCounter(self): self.i = 0 def ros_node_thread(ros_node): executor = MultiThreadedExecutor() executor.add_node(ros_node) executor.spin() def main(args=None): rclpy.init(args=args) app = QApplication(sys.argv) publisher = DummyPublisher() gui = Gui( publisher ) ros_node_thread_instance = threading.Thread(target=ros_node_thread, args=(publisher,)) ros_node_thread_instance.start() try: sys.exit(app.exec_()) finally: publisher.destroy_node() 
rclpy.shutdown() ros_node_thread_instance.join() if __name__ == '__main__': main() </code></pre> <p><strong>Subscriber:</strong></p> <pre><code>import rclpy from rclpy.node import Node from rclpy.executors import MultiThreadedExecutor from sensor_msgs.msg import ChannelFloat32 from PyQt5.QtWidgets import QApplication, QWidget, QVBoxLayout from PyQt5.QtCore import pyqtSignal, QObject, QTimer import PyQt5.QtCore import threading import sys class Gui: def __init__(self): self.widget = QWidget() self.layout = QVBoxLayout(self.widget) self.widget.setGeometry(50,50,320,200) self.widget.show() class DummyListener(Node): def __init__(self,gui): super().__init__('dummy_listener') self.gui = gui self.subscription = self.create_subscription( ChannelFloat32, 'topic', self.listener_callback, 10) self.subscription # prevent unused variable warning self.gui = gui self.count = 0 def listener_callback(self, msg): self.get_logger().info('I heard: &quot;%s&quot;' % str(msg.values)) self.count += 1 self.gui.widget.setWindowTitle(&quot;Dummy listener &quot; + str(self.count)) def ros_node_thread(ros_node): executor = MultiThreadedExecutor() executor.add_node(ros_node) executor.spin() def main(args=None): rclpy.init(args=args) app = QApplication(sys.argv) gui = Gui() listener = DummyListener( gui ) ros_node_thread_instance = threading.Thread(target=ros_node_thread, args=(listener,)) ros_node_thread_instance.start() try: sys.exit(app.exec_()) finally: listener.destroy_node() rclpy.shutdown() ros_node_thread_instance.join() if __name__ == '__main__': main() </code></pre> <p>When I execute them from different consoles, running:</p> <pre><code>ros2 run my_package talker ros2 run my_package listener </code></pre> <p>It just works fine, I see the two widgets, one with <code>Dummy publisher #</code> title being incremented and one with <code>Dummy listener #</code> title being incremented.</p> <p>However, when I use a launch script:</p> <pre><code>from launch import LaunchDescription import 
launch_ros.actions def generate_launch_description(): return LaunchDescription([ launch_ros.actions.Node( namespace= &quot;talker&quot;, package='my_package', executable='talker'), launch_ros.actions.Node( namespace= &quot;listener&quot;, package='my_package', executable='listener'), ]) </code></pre> <p>Then, I see the first widget with <code>Dummy publisher #</code> title being incremented and the other one's title remains unchanged as &quot;python&quot;. By looking deeper, I see <code>listener_callback</code> never gets called.</p> <p>Am I doing something wrong?</p>
ROS2 launch: subscriber not receiving data
<p>The only way to do this is to:</p> <ol> <li>Open a text editor</li> <li>Load the world file containing your Robot node in the text editor</li> <li>Create a new PROTO file from scratch in the text editor</li> <li>Copy your Robot node from the world file</li> <li>Paste it in the body of your PROTO node</li> </ol> <p>This whole procedure is explained in detail in this <a href="https://cyberbotics.com/doc/guide/tutorial-7-your-first-proto" rel="nofollow noreferrer">tutorial</a>.</p> <p><em>Disclaimer: I am a <a href="https://github.com/cyberbotics/webots" rel="nofollow noreferrer">Webots</a> developer working at <a href="https://cyberbotics.com" rel="nofollow noreferrer">Cyberbotics</a>.</em></p>
104561
2023-10-06T15:33:04.360
|webots|
<p>I created a custom robot using the Robot Node by importing some meshes and creating boundary boxes with primitive shapes. I want to use it in another world, and I know the process involves creating a Proto. My question is: How can I turn my robot into a PROTO? In the past, I remember that right-clicking the node would display the &quot;<strong>Export to Proto</strong>&quot; option. Is there a way to do this in <strong>Webots 2023</strong>?</p>
How can I turn custom robot into a PROTO in Webots 2023?
<p>From the source code:</p> <pre><code>//we also want to clear the robot footprint from the costmap we're using costmap_ros_-&gt;clearRobotFootprint(); </code></pre> <p><a href="https://github.com/strawlab/navigation/blob/master/dwa_local_planner/src/dwa_planner_ros.cpp" rel="nofollow noreferrer">https://github.com/strawlab/navigation/blob/master/dwa_local_planner/src/dwa_planner_ros.cpp</a></p> <pre><code> void Costmap2DROS::clearRobotFootprint(const tf::Stamped&lt;tf::Pose&gt;&amp; global_pose){ std::vector&lt;geometry_msgs::Point&gt; oriented_footprint; //check if we have a circular footprint or a polygon footprint if(footprint_spec_.size() &lt; 3){ //we'll build an approximation of the circle as the footprint and clear that double angle = 0; double step = 2 * M_PI / 72; while(angle &lt; 2 * M_PI){ geometry_msgs::Point pt; pt.x = getInscribedRadius() * cos(angle) + global_pose.getOrigin().x(); pt.y = getInscribedRadius() * sin(angle) + global_pose.getOrigin().y(); pt.z = 0.0; oriented_footprint.push_back(pt); angle += step; } } </code></pre> <p><a href="https://docs.ros.org/en/electric/api/costmap_2d/html/classcostmap__2d_1_1Costmap2DROS.html" rel="nofollow noreferrer">https://docs.ros.org/en/electric/api/costmap_2d/html/classcostmap__2d_1_1Costmap2DROS.html</a></p> <p><a href="https://github.com/strawlab/navigation/blob/master/costmap_2d/src/costmap_2d_ros.cpp" rel="nofollow noreferrer">https://github.com/strawlab/navigation/blob/master/costmap_2d/src/costmap_2d_ros.cpp</a></p> <p>Someone gave you a downvote because there is an expectation that questions are the result of failed efforts, where the OP has explained what they have tried but couldn't get to work.</p>
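The quoted C++ approximates a circular footprint as a 72-vertex polygon centered on the robot pose before clearing it from the costmap. The same idea in a standalone Python sketch (illustration only, not the actual ROS implementation):

```python
import math

def circle_footprint(radius, cx, cy, steps=72):
    """Approximate a circular robot footprint as a polygon of `steps`
    vertices, mirroring the idea in Costmap2DROS::clearRobotFootprint
    (a standalone sketch, not the actual ROS code)."""
    step = 2 * math.pi / steps
    return [(radius * math.cos(i * step) + cx,
             radius * math.sin(i * step) + cy) for i in range(steps)]

# Example: a 0.3 m inscribed radius with the robot at (1.0, 2.0)
footprint = circle_footprint(0.3, 1.0, 2.0)
print(len(footprint))  # 72 vertices, each 0.3 m from the robot center
```

That polygon is what gets rasterized into the costmap and marked free, which is how the footprint interacts with obstacle costs during trajectory scoring.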
104563
2023-10-06T16:36:41.490
|ros2|costmap|trajectory|footprint|
<p>I'm currently working with the DWAPLANNER for a person-following robot in ROS, and I'm eager to gain a deeper understanding of how this planner incorporates a robot's footprint into trajectory calculations. My goal is to make substantial modifications to the ROS navigation stack for this specific use case. Additionally, I'm keen to comprehend how this footprint interacts with the costmap. Would anyone be able to provide an in-depth explanation or direct me to valuable resources that can illuminate these two critical aspects of the DWAPLANNER in ROS? Your assistance would be greatly appreciated. Thank you!</p>
DWAPLANNER in ROS for a person-following robot: Footprint and costmap?
<blockquote> <p>I believe all of the above is correct, but if someone thinks otherwise I would welcome the correction.</p> </blockquote> <p>Yes, it's correct.</p> <blockquote> <p>The boundary conditions now include (I think) <span class="math-container">$\ddot{\theta} = 0$</span> at <span class="math-container">$t=0$</span> and <span class="math-container">$t=\tau$</span>.</p> </blockquote> <p>The 5th-order polynomial you propose already satisfies those boundary conditions.</p> <blockquote> <p>I tried things with different polynomials and/or the Taylor series approximation for cosθ</p> </blockquote> <p>I am not sure I completely follow what you are trying to achieve: you can just plug your fifth-order motion profile θ(t) into</p> <p><span class="math-container">$$ T = \frac{1}{3}ml^2\ddot{\theta} + \frac{1}{2}mgl\cos\theta $$</span></p> <p>and this yields the expression T(t) which you can evaluate for any value of t in order to calculate the torque.</p> <p>Obviously T(0) and T(τ) will no longer be zero, but that is due to the gravitational force.</p> <p>The new expression T(t) has the cosine term in it, which is somewhat less elegant than the T(t) expression for the horizontal case, but there's no need for any Taylor expansion or approximation of that cosine.</p>
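To make this concrete, here is a small numerical sketch (the values of m, l and τ are example numbers, not from the question) that plugs the quintic profile into the vertical-plane torque equation and confirms that at the endpoints only the gravity term ±(1/2)mgl remains:

```python
import math

# Plug the quintic motion profile from the question into
# T = (1/3) m l^2 * thetadd + (1/2) m g l * cos(theta).
# m, l and tau below are example values, not from the post.
m, l, g, tau = 1.0, 0.5, 9.81, 2.0

def theta(t):
    s = t / tau
    return math.pi * (10 * s**3 - 15 * s**4 + 6 * s**5)

def theta_dd(t):
    # Second time derivative of theta(t)
    s = t / tau
    return (math.pi / tau**2) * (60 * s - 180 * s**2 + 120 * s**3)

def torque(t):
    return (m * l**2 / 3) * theta_dd(t) + 0.5 * m * g * l * math.cos(theta(t))

# At t=0 and t=tau the inertial term vanishes (thetadd = 0), leaving
# only gravity: +mgl/2 at theta=0 and -mgl/2 at theta=pi.
print(round(torque(0.0), 4))   # 2.4525
print(round(torque(tau), 4))   # -2.4525
```

Evaluating `torque(t)` on a grid of t values gives the full commanded torque profile for the vertical case.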
104569
2023-10-07T12:43:16.037
|control|motor|dynamics|forward-kinematics|actuator|
<p>This seems like it should have been a well-studied problem, but I can't find any solution.</p> <p>We want to define what a motor must be commanded to do to move a single link with mass <span class="math-container">$m$</span> and length <span class="math-container">$l$</span> <span class="math-container">$180^\circ$</span> in a certain amount of time <span class="math-container">$\tau$</span>.</p> <p>In the horizontal plane, we know that the torque required to move the link is <span class="math-container">$$ T = \frac{1}{3}ml^2\ddot{\theta} $$</span> It took a little calculation, but to move the link <span class="math-container">$180^\circ$</span> in <span class="math-container">$\tau$</span> sec, a quintic polynomial can solve all of the boundary conditions <span class="math-container">$$ \theta(0) = 0 $$</span> <span class="math-container">$$ \theta(\tau) = \pi $$</span> <span class="math-container">$$ \dot{\theta}(0) = 0 $$</span> <span class="math-container">$$ \dot{\theta}(\tau) = 0 $$</span> <span class="math-container">$$ T(0) = 0 $$</span> <span class="math-container">$$ T(\tau) = 0 $$</span> The function that works is <span class="math-container">$$ \theta(t) = 10\pi\left(\frac{t}{\tau}\right)^3 - 15\pi\left(\frac{t}{\tau}\right)^4 + 6\pi\left(\frac{t}{\tau}\right)^5 $$</span> The implied torque that the actuator must deliver is <span class="math-container">$$ T(t) = 20\pi\,\frac{ml^2}{\tau^2}\left[\left(\frac{t}{\tau}\right) - 3\left(\frac{t}{\tau}\right)^2 + 2\left(\frac{t}{\tau}\right)^3\right] $$</span> I believe all of the above is correct, but if someone thinks otherwise I would welcome the correction.</p> <p>Now we want to solve this same problem in the vertical plane, where the actuator has to work against gravity. 
It can be shown from the Lagrangian or elsewhere that now the actuator has to effect <span class="math-container">$$ T = \frac{1}{3}ml^2\ddot{\theta} + \frac{1}{2}mgl\cos\theta $$</span> I would like to again come up some expression for <span class="math-container">$T(t)$</span> or <span class="math-container">$\theta$</span> but I am stumped. The boundary conditions now include (I think) <span class="math-container">$\ddot{\theta} = 0$</span> at <span class="math-container">$t=0$</span> and <span class="math-container">$t=\tau$</span>. This seems like something someone must have solved, since in a real system someone would have had to define the torque or position commands for the motor. I tried things with different polynomials and/or the Taylor series approximation for <span class="math-container">$\cos\theta$</span>. Probably I will have to resort to some kind of numerical search, but I am wondering if there is any kind of elegant closed-form expression that solves this.</p>
Controlling motion of single link
<p>No. It relies on items not available in Humble. Please upgrade to Iron or newer to use the GPS Waypoint Follower.</p>
104581
2023-10-08T21:28:26.760
|ros2|ros-humble|nav2|
<p>I saw that a GPS Waypoint Follower including a demo was very recently merged into the main branch of the nav2 repository, great work! (<a href="https://github.com/ros-planning/navigation2/pull/2814" rel="nofollow noreferrer">PR #2814</a>; <a href="https://github.com/ros-planning/navigation2_tutorials/pull/70" rel="nofollow noreferrer">demo</a>). The GPS WPF has already been backported to the ROS2 Iron distribution (<a href="https://github.com/ros-planning/navigation2/pull/3837" rel="nofollow noreferrer">PR #3837</a>).</p> <p>Will there also be a backport to ROS2 Humble, since it has LTS? I already tried to do the backport myself and cherry-picked the <a href="https://github.com/ros-planning/navigation2/commit/1efc96c3aa1b80d4926238238b56d45a9d7a8993" rel="nofollow noreferrer">respective commit</a>, but it seems like there are more API changes in nav2 that need to be migrated as well. Does anybody know what changes are needed for the GPS WPF to work with Humble? I couldn't find an up-to-date overview with all breaking changes in nav2 between Humble and Iron.</p>
Will the Navigation2 GPS WPF be backported to Humble?
<p>Welcome to Robotics Stack Exchange!</p> <p>In short, the Intel RealSense camera works with ROS 2. You are looking at the documentation of <code>librealsense</code>, which may not be up to date. Anyway, I strongly suggest installing the prebuilt packages using <code>apt</code>. Below is an example:</p> <pre><code>sudo apt install ros-&lt;ROS_DISTRO&gt;-librealsense2* sudo apt install ros-&lt;ROS_DISTRO&gt;-realsense2-* </code></pre> <p>Please note that you may need to configure your Ubuntu repositories in order to install these packages. So, please look at <a href="https://github.com/IntelRealSense/realsense-ros#--installation" rel="nofollow noreferrer">the official documentation</a>.</p>
104583
2023-10-09T04:06:44.933
|navigation|ros2|slam|realsense|realsense-camera|
<p>I am just getting started with Intel RealSense integration with ROS 2. When trying to install the Intel RealSense packages in ROS 2 Humble, <a href="https://github.com/IntelRealSense/librealsense/blob/master/doc/distribution_linux.md#installing-the-packages:%7E:text=Ubuntu%20LTS%20kernels%204.4%2C%204.8%2C%204.10%2C%204.13%2C%204.15%2C%204.18*%2C%205.0*%2C%205.3*%2C%205.4%2C%205.13%20and%205.15" rel="nofollow noreferrer">the official documentation</a> says the supported kernel versions for this package only go up to 5.4. On the contrary, by default, the kernel version for ROS 2 Humble on Ubuntu 22.04 is 6.</p> <p>Please give some input on whether ROS 2 Humble supports RealSense integration.</p>
Intel RealSense with ROS 2
<p>Since you mention that you are using <code>Ubuntu 20.04</code>, you can not use the tutorial. It is written for <code>Ubuntu 22.04</code> (<a href="https://micro.ros.org/docs/tutorials/core/first_application_rtos/freertos/#:%7E:text=Hawksbill%20on%20your-,Ubuntu%2022.04,-LTS%20computer.%20To" rel="nofollow noreferrer">https://micro.ros.org/docs/tutorials/core/first_application_rtos/freertos/#:~:text=Hawksbill%20on%20your-,Ubuntu%2022.04,-LTS%20computer.%20To</a>)</p> <p>I assume you are trying to use <strong>galactic</strong>, which is not supported any more (<a href="https://docs.ros.org/en/rolling/Releases.html" rel="nofollow noreferrer">https://docs.ros.org/en/rolling/Releases.html</a>)</p>
104586
2023-10-09T05:46:49.327
|ros2|microcontroller|micro-ros|
<p>I'm trying to set up micro-ROS on Ubuntu 20.04 using this <a href="https://micro.ros.org/docs/tutorials/core/first_application_rtos/freertos/" rel="nofollow noreferrer">website</a>, and connect ESP32 to ROS 2. However, when executing the command <code>ros2 run micro_ros_setup create_firmware_ws.sh freertos esp32</code>, I get the following error:</p> <pre><code>ERROR: the following packages/stacks could not have their rosdep keys resolved to system dependencies: rclc_parameter: Cannot locate rosdep definition for [osrf_testing_tools_cpp] rmw: Cannot locate rosdep definition for [osrf_testing_tools_cpp] rmw_implementation: Cannot locate rosdep definition for [rcpputils] rosidl_typesupport_c: Cannot locate rosdep definition for [mimick_vendor] rosidl_default_runtime: Cannot locate rosdep definition for [rosidl_typesupport_introspection_cpp] rcl_logging_noop: Cannot locate rosdep definition for [launch_testing] rosidl_typesupport_microxrcedds_c_tests: Cannot locate rosdep definition for [rosidl_typesupport_introspection_c] tracetools_launch: Cannot locate rosdep definition for [launch_ros] rcutils: Cannot locate rosdep definition for [osrf_testing_tools_cpp] rcl_action: Cannot locate rosdep definition for [osrf_testing_tools_cpp] libyaml_vendor: Cannot locate rosdep definition for [rcpputils] rcl_lifecycle: Cannot locate rosdep definition for [osrf_testing_tools_cpp] tracetools_test: Cannot locate rosdep definition for [launch_ros] test_rmw_implementation: Cannot locate rosdep definition for [rmw_dds_common] rclc_lifecycle: Cannot locate rosdep definition for [osrf_testing_tools_cpp] rosidl_typesupport_cpp: Cannot locate rosdep definition for [rcpputils] rcl: Cannot locate rosdep definition for [rcpputils] ros2trace: Cannot locate rosdep definition for [ros2cli] rclc: Cannot locate rosdep definition for [osrf_testing_tools_cpp] </code></pre> <p>Is there any solution?</p> <p>Incidentally, I implemented the following solution suggested by ChatGPT, but it didn't 
improve the situation.</p> <ul> <li>Update ROS 2 dependencies. <code>rosdep update</code></li> <li>Install System Packages, which ROS2 packages or stacks depend on. <code>rosdep install --from-paths /path/to/your/ros2/workspace --ignore-src --rosdistro &lt;your_ros_distro&gt;</code></li> <li>Rebuild the ROS 2 workspace. <code>colcon build</code></li> </ul>
Cannot locate rosdep definition, error on micro-ROS
<p>I interpret your question as follows:</p> <ul> <li>You know how to specify a custom log folder,</li> <li>You have folders <code>build</code>, <code>install</code> and <code>log</code> from running <code>colcon build</code> in your <code>ros2_ws</code> workspace,</li> <li>You currently specify <code>ros2_ws/log/</code> as log folder,</li> <li>However, you would like to log to the the latest build directory for your workspace instead (e.g. <code>ros2_ws/log/build_2023-10-09_8-51-26/</code>).</li> </ul> <p>If this is correct, then the answer is to simply specify <code>ros2_ws/log/latest</code> or <code>ros2_ws/log/latest_build</code> as log directory, as these <code>latest</code> and <code>latest_build</code> are symbolic links to the latest build directory.</p> <p>I'm not sure though what you mean with &quot;get the specific run/build directory in CPP&quot;. In any case: you can retrieve the log folder through <code>rcl_logging_get_logging_directory()</code> or <a href="https://github.com/ros2/rclcpp/blob/77c7aaf9178268c16dee8fc412537e9f67d2afe1/rclcpp/src/rclcpp/logger.cpp#L57" rel="nofollow noreferrer"><code>rclcpp::get_logging_directory()</code></a>.</p>
104593
2023-10-09T09:09:35.930
|ros-humble|logging|rclcpp|logger|
<p>I have a different process in my CPP node that writes a separate log file.</p> <p>Right now, this log is written to a file in <code>ros2_ws/log/additional_process.log</code> but I would like to put that file in the specific ROS log folder of that run, e.g. <code>ros2_ws/log/build_2023-10-09_8-51-26/additional_process.log</code>.</p> <p>Does anyone know how to get the specific run/build directory in CPP for every run/launch?</p>
Getting current log directory in a CPP Node
<p><code>ROSNode::SharedPtr</code> is a typedef inside <code>rclcpp::Node</code> that is always a <code>std::shared_ptr&lt;rclcpp::Node&gt;</code>. As such, it cannot inherit the <code>setCloud</code> method from your custom node.</p> <p>Just replace it with an explicit <code>std::shared_ptr&lt;ROSNode&gt;</code> and you should be fine.</p> <p>Not relevant for the question, but I do not get the use of <code>typename</code> in the <code>setCloud</code> argument. Additionally, the point cloud could be passed by (const) reference as it is probably a pretty large object.</p>
104594
2023-10-09T09:19:53.423
|ros2|rosnode|rclcpp|ros-foxy|
<p>I have a node class shown below and I want to call its setCloud method from another class but it gives me the error:</p> <blockquote> <p>error: ‘using element_type = class rclcpp::Node’ {aka ‘class rclcpp::Node’} has no member named ‘setCloud’ 422 | node_-&gt;setCloud(data);</p> </blockquote> <p>My Node class:</p> <pre><code>class ROSNode : public rclcpp::Node { public: ROSNode() : Node(&quot;ros_node&quot;) { msg.header.frame_id = &quot;t&quot;; points_publisher = this-&gt;create_publisher&lt;sensor_msgs::msg::PointCloud2&gt;(&quot;/t_points&quot;, 10); } void setCloud(typename pcl::PointCloud&lt;pcl::PointXYZI&gt; cloud) { pcl::toROSMsg(cloud, msg); msg.header.stamp = get_clock()-&gt;now(); points_publisher-&gt;publish(msg); }; private: sensor_msgs::msg::PointCloud2 msg; rclcpp::Publisher&lt;sensor_msgs::msg::PointCloud2&gt;::SharedPtr points_publisher; }; </code></pre> <p>Header for the other class:</p> <pre><code>class testclass{ public: testclass(); virtual ~testclass(); protected: ROSNode::SharedPtr node_; void transmitTargetDataROS(pcl::PointCloud&lt;pcl::PointXYZI&gt; data); }; </code></pre> <p>Source file for the other class:</p> <pre><code>testclass::testclass(): node_(std::make_shared&lt;ROSNode&gt;()) { rclcpp::spin(node_); } void testclass::transmitTargetDataROS(pcl::PointCloud&lt;pcl::PointXYZI&gt; data) { node_-&gt;setCloud(data); } </code></pre> <p>I copied only relevant parts from the files.</p>
How can I call a member method which is inside a ROS2 node class?
<p>To generate the objects themselves, what I found easiest was to use <a href="https://github.com/mikaelarguedas/gazebo_models" rel="nofollow noreferrer">this repo</a> which takes in a bunch of images and outputs models that Gazebo<sup>1</sup> can understand. There is a <a href="https://www.youtube.com/watch?v=A3fJwTL2O4g" rel="nofollow noreferrer">brief tutorial video</a> for the repo on YouTube as well.</p> <p>You then have to put those models in a folder that makes sense for your project and, <strong>here's the key thing</strong>:</p> <p>Wherever those models are stored, <strong>you have to put the folder they are in into the environment variable <code>GAZEBO_MODEL_PATH</code>.</strong></p> <br> <p>EXAMPLE:</p> <p>Say my workspace is in <code>~/my_ws/</code> and inside there I have a folder called <code>mymodels</code>.</p> <p>Then I would run this whenever I activate my ROS2 environment:</p> <p><code>export GAZEBO_MODEL_PATH=$GAZEBO_MODEL_PATH:mymodels</code></p> <p>Along with other things like <code>source /opt/ros/humble/setup.bash</code> etc.</p> <br> <p>One last thing: I had to go into the <code>model.sdf</code> file for each of the markers I generated and reduce its size. I don't know how the sizing in that repo is supposed to work but the marker objects it generated for me were enormous. Thankfully it's pretty easy to do in the scale tag.</p> <p>Footnotes:</p> <p>1: Gazebo Classic, in my case 11.10 on Ubuntu 22.04 with ROS2 Humble</p>
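If you want to sanity-check the environment variable from a script, something like the sketch below can help (a hypothetical helper, not part of Gazebo; it just treats `GAZEBO_MODEL_PATH` as a colon-separated list, the same way `PATH` works, and the folder names follow the example above):

```python
import os

def on_model_path(folder, env=None):
    """Return True if `folder` is listed in GAZEBO_MODEL_PATH.
    A hypothetical helper for illustration, not part of Gazebo; the
    variable is a colon-separated list of model directories."""
    env = os.environ if env is None else env
    entries = env.get('GAZEBO_MODEL_PATH', '').split(':')
    target = os.path.expanduser(folder)
    return any(os.path.expanduser(e) == target for e in entries)

# Example: after `export GAZEBO_MODEL_PATH=$GAZEBO_MODEL_PATH:~/my_ws/mymodels`
fake_env = {'GAZEBO_MODEL_PATH': '/usr/share/gazebo/models:~/my_ws/mymodels'}
print(on_model_path('~/my_ws/mymodels', fake_env))  # True
print(on_model_path('/tmp/not_listed', fake_env))   # False
```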
104598
2023-10-09T11:03:50.270
|gazebo|pose|marker|
<p>The title says it all. I would like to test my robot's ArUco recognition and processing abilities by putting some markers around environments in classic Gazebo.</p> <p>All the tutorials I see online seem to be for older versions of Gazebo and (frankly) use some very elaborate tricks.</p> <p>What is the canonical way of adding a marker to Gazebo?</p> <p>EDIT: My version of Gazebo is 11.10.2, by &quot;older versions of Gazebo&quot; I mean older versions of Gazebo classic specifically. Sorry for any confusion.</p>
How to put ArUco markers in Gazebo Classic
<p>I think the rationale is mentioned <a href="https://github.com/light-tech/ros2_control/blob/795b6edc2b84bca3e20325a8bc2cec1d7cf6872d/controller_interface/include/controller_interface/controller_interface_base.hpp#L62-L64" rel="nofollow noreferrer">here</a>: both <code>ControllerInterface</code> and <code>ChainableControllerInterface</code> inherit from <code>ControllerInterfaceBase</code>, but controllers should not.</p> <pre><code>/** * Base interface class for an controller. The interface may not be used to implement a controller. * The class provides definitions for `ControllerInterface` and `ChainableControllerInterface` * that should be implemented and extended for a specific controller. */ </code></pre>
104610
2023-10-09T19:03:41.623
|ros-control|
<p>What were the design considerations behind creating <code>controller_interface::ControllerInterfaceBase</code> in ROS2 Control? While it appears there are many <code>controller_interface::ControllerInterface</code> <a href="https://github.com/search?q=%22public%20controller_interface%3A%3AControllerInterface%22&amp;type=code" rel="nofollow noreferrer">derived classes</a>, I couldn't find any as far as my <a href="https://github.com/search?q=%22public%20controller_interface%3A%3AControllerInterfaceBase%22&amp;type=code" rel="nofollow noreferrer">GitHub search</a> shows.</p>
Design Considerations Behind Creating controller_interface::ControllerInterfaceBase
<p>According to my understanding of your question, you can launch launch files with the <a href="http://wiki.ros.org/roslaunch/API%20Usage" rel="nofollow noreferrer">roslaunch Python API</a>; note that the <code>ROSLaunch()</code> call may fail when there is no running ROS master available.</p> <p>Hope this helps.</p>
104622
2023-10-10T03:48:52.680
|ros|python|roscore|ros-noetic|
<p>What is the best approach to launch a ROS file? Do I need to check that <code>roscore</code> is running before running it?</p>
Manually controlling the ROS driver from Python
<p>Hello, I am a ROS enthusiast myself and I don't know if what I am suggesting would work, but you can give it a try.</p> <ol> <li>Yes, it is possible to use SLAM without LiDAR. You need two sensors: one to measure the distance to static features in the environment and the robot's movement toward or away from them, and another internal sensor which tells you how much your robot has travelled. Fuse these data together and you have your own SLAM.</li> <li>Sensors for underwater use, like sonar or radar, would work.</li> <li>Localisation is just how your robot understands where it is, given a little prior knowledge of the map (or environment).</li> </ol>
104631
2023-10-10T09:10:28.400
|ros2|slam|imu|lidar|nav2|
<p>I am trying to implement SLAM and robot localization for my ROV (remotely operated underwater vehicle) without using lidar or wheel encoders, since the ROV's movement uses T200 thrusters rather than wheels - it is not a differential-drive robot. However, all implementations I have found primarily use lidar and wheel encoders (and an IMU sensor). Is what I am trying to do possible? If so, how should I change SLAM to work with an IMU and any other needed sensor that is not a lidar, using the Nav2 stack and, if needed, the robot_localization package? Any references to someone who has done the same would be welcome.</p>
ROS2 SLAM without Lidar
<p>Gazebo has only the pinhole camera model, so there is no way to make a true telecentric lens. It can however be approximated by a using a small enough fov, since then the projection lines are more parallel. The question then is how many pixels, what fov, and what distance from the image plane the camera should have to achieve some px/mm resolution and also what projection distortion we would have in the worst case, i.e. on the sides of the image. I've thrown together a python script which calculates this things.</p> <pre><code>import math w_image = 0.2 d_image = 0.3 # the field of view we need for the given d_image/w_image fov = math.atan(w_image/(2*d_image))*2 # w1 is the worst case pixel distance on the image plane we allow w1 = 0.0005 h1 = d_image w = w_image/2 d1 = math.sin(math.atan(h1/w)) * w1 d2 = math.sqrt(h1**2+w**2)-math.sqrt(w1**2-d1**2) px = fov/(math.atan(d1/d2)) print(f&quot;fov: {fov:0.3f}&quot;) print(f&quot;w1: {w1*1e3:0.3f}mm&quot;) print(f&quot;px: {px:0.1f}&quot;) </code></pre> <p>here are the somewhat roundabout calculations I made for this code, so that you know what the variables stand for.</p> <p><a href="https://i.stack.imgur.com/u3VTq.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/u3VTq.png" alt="calc for the above code" /></a></p> <p>I've also plotted the actual pixel widths here:</p> <p><a href="https://i.stack.imgur.com/n18Xn.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/n18Xn.png" alt="actual pixel width" /></a></p> <p>The plot was generated with this function, you can simply append it to the code above as it relies on some variables from there.</p> <pre><code>def plot_w1(): import numpy as np import matplotlib.pyplot as plt def dist(px_i): return math.tan(fov/px*px_i)*h1 X = np.arange(0, px//2) Y = np.array([dist(px_i) for px_i in X]) Y = Y[1:] - Y[:-1] X = X[1:] plt.plot(X, Y, label=&quot;w(px_i)&quot;) plt.plot(X, [w1]*len(X), label=&quot;w1&quot;) plt.xlabel(&quot;pixel from center&quot;) 
plt.ylabel(&quot;pixel distance in mm&quot;) plt.legend() plt.show() </code></pre> <p>With this one can easily calculate the actual distance in mm for a given pixel.</p>
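As a quick sanity check of the relations used in the script (same example numbers: a 0.2 m wide image plane at 0.3 m distance; the 400 px resolution is an assumed example value):

```python
import math

# Sanity check of the fov relation used in the script above.
w_image, d_image = 0.2, 0.3

# fov chosen so the view frustum spans exactly w_image at d_image:
fov = 2 * math.atan(w_image / (2 * d_image))
print(round(fov, 4))  # 0.6435 rad

# Inverting the relation recovers the half-width of the image plane:
half_width = d_image * math.tan(fov / 2)
print(round(half_width, 6))  # 0.1 m = w_image / 2

# Approximate width covered by one pixel near the image center,
# assuming a horizontal resolution of 400 px (example value):
px = 400
center_px_width = math.tan(fov / px) * d_image
print(round(center_px_width * 1e3, 4))  # ~0.4826 mm
```

As the plot shows, off-center pixels cover slightly more than this central-pixel estimate, which is exactly the distortion budget `w1` in the script controls.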
104638
2023-10-10T19:34:30.257
|gazebo|gazebo-camera|gazebo-sensor|
<p>I'm trying to create an orthographic camera sensor for my robot model and have trouble specifying the sensor in a way that I know the px/mm dimensions of the resulting images.</p> <p>E.g. I want a camera with an 100px wide and 1px high image where the 1px would map to 1mm on the image plane.</p> <p>My goal is to use the resulting image to simulate a line sensor which can detect a robots offset from a line and the line thickness.</p> <p>I'm looking through the camera parameters available here <a href="http://sdformat.org/spec?elem=sensor" rel="nofollow noreferrer">http://sdformat.org/spec?elem=sensor</a> and have found the projection matrices, but to my understanding I also need to specify the pixel size in mm or something similar to know the actual image size, or am I wrong?</p>
How to create an orthographic camera sensor in gazebo fortress?
<p>I had the same issue with the following setup:</p> <ul> <li>A Yocto distribution running ROS2 code on ROS_DOMAIN_ID=28, exposing a WIFI hotspot and connected to the internet using an ethernet cable</li> <li>My computer connected to the WIFI hotspot and trying to access the node list on ROS_DOMAIN_ID=28</li> </ul> <p>In fact I solved the issue by removing the ethernet cable. It seems CycloneDDS chose to work on the ethernet network instead of the local network created by this hotspot. I don't know if you are in the same conditions.</p> <p>And of course if you want to solve it forever, check out the cyclonedds documentation on how to select a specific NetworkInterface.</p>
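<p>To pin CycloneDDS to a specific interface without unplugging the cable, you can point <code>CYCLONEDDS_URI</code> at a small XML config. A sketch follows; the interface name <code>wlan0</code> and the file location are assumptions for your machine, and note that older Cyclone releases use a <code>NetworkInterfaceAddress</code> element instead of <code>Interfaces</code>/<code>NetworkInterface</code>:</p>

```shell
# Write a minimal Cyclone DDS config that restricts it to the WiFi
# interface. "wlan0" and the /tmp path are placeholders -- adjust them.
cat > /tmp/cyclonedds.xml <<'EOF'
<CycloneDDS>
  <Domain id="any">
    <General>
      <Interfaces>
        <NetworkInterface name="wlan0"/>
      </Interfaces>
    </General>
  </Domain>
</CycloneDDS>
EOF

# Point Cyclone DDS at the config; add this to ~/.bashrc on both machines
# so every ROS 2 process picks it up:
export CYCLONEDDS_URI=file:///tmp/cyclonedds.xml
```

<p>After setting this on both machines (with each machine's own WiFi interface name), restart the nodes and the daemon (<code>ros2 daemon stop</code>) so discovery happens over the intended network.</p>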
104641
2023-10-10T19:57:37.190
|ros2|network|
<p>I am having a problem with my ROS 2 network. I have both the robot computer and my personal computer connected to the same WiFi network. However, if I launch the ROS 2 code on the robot and try to check the nodes and their info on my personal computer, nothing shows up; if I check on the robot computer, everything works correctly. I don't know what I need to do. As far as I know, it is not necessary to set the environment variables as was done in ROS 1, so I hope someone can give me some ideas about what the problem could be.</p> <p>I have CycloneDDS installed and set up on both computers.</p> <p>Thank you!</p>
Can't see ros2 nodes over WiFi network
<p>I solved it myself. The reason I couldn't subscribe and check the content was that <code>ROS_DOMAIN_ID</code> was set. After I set <code>ROS_DOMAIN_ID</code> back to 0, I was able to subscribe and check the content. If you're faced with the same problem, please refer to this.</p>
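<p>A quick sanity check you can run in each terminal (assuming, as in my case, that the micro-ROS agent side uses the default domain 0):</p>

```shell
# Print the domain this shell would use (unset means 0, the default):
echo "ROS_DOMAIN_ID is: ${ROS_DOMAIN_ID:-unset (defaults to 0)}"

# Make it match the agent's side, either by clearing it ...
unset ROS_DOMAIN_ID
# ... or by setting it explicitly. It must agree in every terminal,
# including the one running the micro-ROS agent:
export ROS_DOMAIN_ID=0
```

<p>Remember that <code>ros2 daemon stop</code> may be needed after changing the domain, since the daemon caches discovery information from the old domain.</p>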
104650
2023-10-11T08:23:32.877
|ros2|microcontroller|micro-ros|
<p>I'm trying to set up micro-ROS on Ubuntu 22.04 following the steps on this <a href="https://medium.com/@SameerT009/connect-esp32-to-ros2-foxy-5f06e0cc64df" rel="nofollow noreferrer">website</a>, and connect ESP32 to ROS2.</p> <p>However, though the connection is established (Step8 on this <a href="https://medium.com/@SameerT009/connect-esp32-to-ros2-foxy-5f06e0cc64df" rel="nofollow noreferrer">website</a>), it can't subscribe and check the content.</p> <p>(When executing the command <code>ros2 topic list</code>, I can't see an additional topic <code>/freertos_int32_publisher</code>.)</p> <p>Is there any solution?</p> <p><a href="https://i.stack.imgur.com/waLef.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/waLef.png" alt="enter image description here" /></a></p> <p><a href="https://i.stack.imgur.com/n3eKf.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/n3eKf.png" alt="enter image description here" /></a></p>
Can't subscribe and check the content on micro-ROS
<p>Your StackExchange link is talking about a very specific case: slam. The optimizations and metrics for building a map (slam) and for <em>efficiently</em> navigating an existing map are not necessarily the same.</p> <p>Here is an example. Imagine there are two long hallways that both lead to the goal, which is far away and not visible to the robot's sensors. The global planner has to choose one hallway, so it is useful to have the information that one hallway is obstructed. Of course, the system also needs some mechanism that eventually clears obstacles from the global costmap.</p>
104652
2023-10-11T09:15:54.100
|navigation|ros2|simulation|mapping|nav2|
<p>I recently started using the NAV2 stack and I am still trying to understand basic concepts. As I understand it, obstacle avoidance and therefore local path re-planning should be done by the controller which theoretically works at higher frequencies that the planner. I was therefore wondering why one would add the obstacle layer in the global costmap. I understand that based on the configuration of the controller sometimes local re-planning is not possible (e.g., if a Regulated Pure Pursuit controller is used). Still, however, I cannot comprehend the intuition behind using an obstacle layer in the global costmap (i.e., why would someone completely avoid doing local re-planning and go directly to global re-planning). Similar questions to mine did not give any valuable insights in the intuition behind this choice: <a href="https://robotics.stackexchange.com/questions/69286/should-global-costmap-have-an-obstacle-layer-when-using-a-slam-node">StackExchangeLink</a></p>
Should global costmap have an obstacle layer?
<p>Unfortunately, we don't have a released PID controller yet. But there is <a href="https://github.com/ros-controls/ros2_controllers/pull/434" rel="nofollow noreferrer">one PR</a> for it; I hope it gets attention again after ROSCon.</p> <p>ros2_control supports <a href="https://control.ros.org/humble/doc/ros2_control_demos/example_12/doc/userdoc.html" rel="nofollow noreferrer">controller chaining</a>, but diff_drive_controller needs an update to support chaining with the PID controller above.</p> <p><a href="https://github.com/ros-controls/control_toolbox/blob/ros2-master/src/pid.cpp" rel="nofollow noreferrer">control_toolbox::PID</a> is also released for ROS 2; I fear you have to implement that in the hardware component yourself, as you are suggesting.</p>
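<p>Since diff_drive_controller itself does not close the velocity loop for you, one option is a per-wheel PID step inside the hardware component's <code>write()</code>. The sketch below uses a minimal hand-rolled PID as a stand-in for <code>control_toolbox::Pid</code> — the class, the gains, and the <code>wheel_effort</code> helper are illustrative assumptions, not the real control_toolbox API:</p>

```cpp
#include <cassert>
#include <cmath>

// Minimal stand-in for control_toolbox::Pid. It only illustrates where a
// PID fits in a hardware component's write() loop, NOT the real API.
struct SimplePid {
  double kp, ki, kd;
  double integral = 0.0;
  double prev_error = 0.0;
  double compute(double error, double dt) {
    integral += error * dt;                        // accumulate I term
    const double deriv = (error - prev_error) / dt; // finite-difference D
    prev_error = error;
    return kp * error + ki * integral + kd * deriv;
  }
};

// Hypothetical per-wheel closed loop: commanded vs. measured wheel speed
// in rad/s -> corrected velocity sent to the hardware, analogous to the
// pids[0](joints[0].vel.data, joints[0].cmd.data, period) call above.
double wheel_effort(SimplePid &pid, double cmd, double measured, double dt) {
  return pid.compute(cmd - measured, dt);
}
```

<p>In a real hardware interface you would keep one such object per wheel as a member, call it once per <code>write()</code> cycle with the controller's commanded velocity and the encoder-measured velocity, and send the result to the motor driver.</p>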
104653
2023-10-11T10:17:30.550
|pid|ros-humble|hardware-interface|diff-drive-controller|ros2-control|
<p>I followed ros2_control diff_drive_example <a href="https://github.com/ros-controls/ros2_control_demos/tree/master/example_2" rel="nofollow noreferrer">https://github.com/ros-controls/ros2_control_demos/tree/master/example_2</a> to bring my ROS1 robot to ROS2.</p> <p>I notice there is a PID controller (pid.cpp and pid.hpp files) in ros2_control and I wonder if I have to implement/use it in my hardware_interface by my own or if it will already be used by diff_drive_controller itself?</p> <p>In my previous code for ROS1, there was a class PID which inherited from control_toolbox::PID <a href="https://github.com/hoverboard-robotics/hoverboard-driver/blob/master/include/hoverboard_driver/pid.h" rel="nofollow noreferrer">https://github.com/hoverboard-robotics/hoverboard-driver/blob/master/include/hoverboard_driver/pid.h</a> and which acts as an interface between dynamic reconfigure and control_toolbox::Pid. Inside the Hardware_interface itself, there was an instance of control_toolbox:Pid (actually of the new pid class which inherits from control_toolbox::Pid) for each wheel like this</p> <pre><code> // Init PID controller pids[0].init(nh_left, 1.0, 0.0, 0.0, 0.01, 1.5, -1.5, true, max_velocity, -max_velocity); pids[0].setOutputLimits(-max_velocity, max_velocity); pids[1].init(nh_right, 1.0, 0.0, 0.0, 0.01, 1.5, -1.5, true, max_velocity, -max_velocity); pids[1].setOutputLimits(-max_velocity, max_velocity); </code></pre> <p>and this was just used before sending commands to Hardware by serial like this</p> <pre><code> double pid_outputs[2]; pid_outputs[0] = pids[0](joints[0].vel.data, joints[0].cmd.data, period); pid_outputs[1] = pids[1](joints[1].vel.data, joints[1].cmd.data, period); // Convert PID outputs in RAD/S to RPM double set_speed[2] = { pid_outputs[0] / 0.10472, pid_outputs[1] / 0.10472 }; </code></pre> <p>Now I wonder if I still need to implement PID to Hardware_interface by myself or if the controller will already handle this for me.</p> <p>I couldn't find 
an example for diff drive together with PID, which makes me curious whether it is needed or not.</p> <p>Patrick</p>
ros2_control diff_drive_controller PID?
<p>The Nav Stack publishes where the robot is asked to go, not what it is actually doing or where it actually is. That is what the odometry provides: a measurement of what the robot is actually doing.</p>
104666
2023-10-11T15:10:36.813
|navigation|odometry|velocity|
<p><strong>So why do I have to calculate the <em>VX</em> and <em>VTH</em> for odometry when move base from navigation stack actually publishes the velocities for my robot ... i don't get that can someone plz explain</strong>.</p> <pre><code>current_time = ros::Time::now(); double DistancePerCount = (3.14159265 * 0.13) / 2626; double lengthBetweenTwoWheels = 0.25; // extract the wheel velocities from the tick signals count deltaLeft = tick_x - _PreviousLeftEncoderCounts; deltaRight = tick_y - _PreviousRightEncoderCounts; omega_left = (deltaLeft * DistancePerCount) / (current_time - last_time).toSec(); omega_right = (deltaRight * DistancePerCount) / (current_time - last_time).toSec(); v_left = omega_left * 0.065; //radius v_right = omega_right * 0.065; vx = ((v_right + v_left) / 2)*10; vy = 0; vth = ((v_right - v_left)/lengthBetweenTwoWheels)*10; double dt = (current_time - last_time).toSec(); double delta_x = (vx * cos(th)) * dt; double delta_y = (vx * sin(th)) * dt; double delta_th = vth * dt; x += delta_x; y += delta_y; th += delta_th; geometry_msgs::Quaternion odom_quat = tf::createQuaternionMsgFromYaw(th); geometry_msgs::TransformStamped odom_trans; odom_trans.header.stamp = current_time; odom_trans.header.frame_id = &quot;odom&quot;; odom_trans.child_frame_id = &quot;base_link&quot;; odom_trans.transform.translation.x = x; odom_trans.transform.translation.y = y; odom_trans.transform.translation.z = 0.0; odom_trans.transform.rotation = odom_quat; // send the transform odom_broadcaster.sendTransform(odom_trans); // Odometry message nav_msgs::Odometry odom; odom.header.stamp = current_time; odom.header.frame_id = &quot;odom&quot;; // set the position odom.pose.pose.position.x = x; odom.pose.pose.position.y = y; odom.pose.pose.position.z = 0.0; odom.pose.pose.orientation = odom_quat; // set the velocity odom.child_frame_id = &quot;base_link&quot;; odom.twist.twist.linear.x = vx; odom.twist.twist.linear.y = vy; odom.twist.twist.angular.z = vth; // publish the 
message odom_pub.publish(odom); _PreviousLeftEncoderCounts = tick_x; _PreviousRightEncoderCounts = tick_y; last_time = current_time; </code></pre>
Navigation odometry and velocities published
<p>You're wanting to stop based on a position, but you're sending velocity commands with no mechanism for recording time. If you could record the elapsed time, you could implement a <a href="https://en.wikipedia.org/wiki/Dead_reckoning" rel="nofollow noreferrer">dead reckoning</a> method that would let you estimate your position by accumulating position deltas, where each position delta is the previous speed multiplied by the elapsed time.</p> <p>If you have an angular velocity in radians per second, then you complete one revolution when you accumulate two*pi radians, or about 6.283 radians. If you monitor the time, you could do something like this:</p> <pre><code>currentRotation = 0 fullRotation = 6.283 # radians pastTime = rospy.get_time() while currentRotation &lt; fullRotation: # Get the elapsed time currentTime = rospy.get_time() timeDelta = currentTime - pastTime pastTime = currentTime # Publish velocities vel = Twist() vel.linear.x = 2.0 vel.angular.z = -1.5 # radians velocity_publisher.publish(vel) # Dead reckoning - numerically integrate the angular velocity currentRotation += abs(vel.angular.z) * timeDelta # Cleanup - once you are done rotating, stop moving vel = Twist() velocity_publisher.publish(vel) </code></pre> <p>If you want to get more accurate, you'll need to use a position sensor.</p>
104681
2023-10-12T16:21:56.060
|gazebo|ros2|ros-humble|
<p>How do I stop my turtle at specified coordinates in turtlesim? I used a Python script in which I made a turtle move in a specified manner. Then I spawned another turtle and made it move along the same coordinates. Now I cannot stop it, even after trying many ways. <a href="https://i.stack.imgur.com/QY0rM.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/QY0rM.png" alt="I used this code for turtle2" /></a> If I un-comment the line, the turtle stops after going 1/4th of the way, and when it's commented, it repeatedly goes in circles. I want it to stop after making the shape.</p>
Turtlesim: stopping a turtle at specified coordinates
<p>The best reference for supported platforms is <a href="https://www.ros.org/reps/rep-2000.html" rel="nofollow noreferrer">REP 2000</a>.</p> <p>To directly answer your question, Debian platforms are not Tier 1 for any distribution at the moment. However because they are so closely tied to Ubuntu distributions they will generally work without any changes if you pick the closest analogous system. But you will have to compile from source.</p>
104682
2023-10-12T16:54:11.830
|ubuntu|debian|linux|
<p>How well does ROS 2 work on Debian-based distros? Does it also have Tier 1 support, or only Ubuntu?</p>
Does ROS 2 work on any Debian-based distro, or only Ubuntu?
<p>Be sure you set up the <code>crazyflies.yaml</code> file correctly: <a href="https://imrclab.github.io/crazyswarm2/usage.html#crazyflies-yaml" rel="nofollow noreferrer">link</a></p>
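<p>For reference, here is a minimal <code>crazyflies.yaml</code> sketched from the crazyswarm2 docs linked above — the radio URI, marker and dynamics names are placeholders, and the exact keys may differ between versions, so treat this as a starting point rather than the authoritative format:</p>

```yaml
robots:
  cf231:
    enabled: true
    uri: radio://0/80/2M/E7E7E7E7E7   # placeholder radio address
    initial_position: [0.0, 0.0, 0.0]
    type: cf21

robot_types:
  cf21:
    motion_capture:
      enabled: true
      marker: default_single_marker
      dynamics: default
```

<p>If <code>cf231</code> is missing or disabled here, the server would not expose the <code>cf231.params...</code> parameters at all, which could explain the failed <code>ros2 param set</code>.</p>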
104687
2023-10-12T22:42:26.213
|ros|ros2|
<p>I would like to follow a simple take-off and land example without any launch files or YAML files on <a href="https://imrclab.github.io/crazyswarm2/usage.html" rel="nofollow noreferrer"><strong>ros2 crazyswarm</strong></a> (<strong>Physical Experiments part</strong>). What could be the reason for the following output? Thanks.</p> <pre><code>ntukenmez3@ae-icps-407120:~/Documents/ros2_ws$ ros2 param set crazyflie_server cf231.params.commander.enHighLevel 1 Setting parameter failed </code></pre>
ros2 parameter setting
<p>If you have specific questions wrt certain tutorials, you should post a link to those tutorials and describe what commands you try and what exact issues you run into.</p> <p>But apart from that:</p> <ul> <li><p>Gazebo loads its environment from an SDF file, which is an xml file (i.e. a <em>text</em> file). There are no binary files. It can be called <code>file.sdf</code> or <code>file.world</code>, but the content is always xml corresponding to the SDF specification.</p> </li> <li><p>The SDF specification can be found on <a href="http://sdformat.org/spec" rel="nofollow noreferrer">sdformat.org</a>.</p> </li> <li><p>Gazebo <strong>Classic</strong> tutorials are <a href="https://classic.gazebosim.org/tutorials" rel="nofollow noreferrer">here</a>. Gazebo Classic is the 'old Gazebo'. It has version <em>numbers</em>. The most recent version is 11.</p> </li> <li><p>Gazebo <strong>Sim</strong> tutorials are <a href="https://gazebosim.org/docs/harmonic/tutorials" rel="nofollow noreferrer">here</a>. Gazebo Sim is the 'new Gazebo'. It has version <em>names</em>, e.g. Fortress, Garden, Harmonic.</p> <p>Gazebo Sim used to be called 'Ignition Gazebo' until Fortress. From Garden on the name was changed to 'Gazebo Sim' due to trademark issues with 'Ignition'.</p> </li> <li><p>Gazebo and RViz are not the same. You cannot load an SDF into RViz.</p> </li> </ul> <blockquote> <p>And I'm kinda confused that many tutorials also mentioned launch file, so am I suppose to make it into a package?</p> </blockquote> <p>A launch file is nothing more than a convenience script to start up your application.</p> <p>Instead of using a launch script, you can issue each command manually from a terminal (i.e. execute <code>ros2 run &lt;package&gt; &lt;executable&gt; &lt;arguments&gt;</code> for each node, start gazebo, etc.).</p> <p>I think you should start with:</p> <ul> <li>Gazebo tutorials (i.e. without ROS),</li> <li>ROS tutorials (i.e. 
without Gazebo),</li> <li>Play around with these until you get comfortable,</li> <li>Learn the basics of launch files,</li> <li>Then go through the ROS+Gazebo tutorials.</li> </ul> <p>If that does not work, consider enrolling for a course.</p> <p>There is no shortcut in learning ROS and Gazebo.</p>
104688
2023-10-13T04:42:43.707
|gazebo|
<p>As the title mentions, when I saved my world it only generated a plain text file instead of a .world file. Though I can still open the world with <code>rosrun gazebo_ros gazebo &lt;world_name&gt;</code>, I got into trouble when trying to launch RViz with Gazebo, since all the tutorials require a .world file.</p> <p>edit:</p> <p>I can rename the file to a .world file and still launch it using <code>rosrun gazebo_ros gazebo &lt;world_name&gt;</code>, but the contents don't look like the .world file format. And I'm kind of confused that many tutorials also mention launch files, so am I supposed to make it into a package? I created the whole world using only Gazebo. The <a href="https://github.com/brian2lee/test/blob/master/test_depth2.world" rel="nofollow noreferrer">file</a> was generated using <code>file-&gt;save world as</code></p> <p>gazebo version 11.14.0</p>
why does my gazebo saved the world as plain text file?
<p>The &quot;frame_id&quot; in the header of your laserscan is not populated - RVIZ/TF has no idea where to put the data. You need to copy this data over from the range message.</p> <p>Once that is fixed though, your data will likely not show up where you expect it, since your laserscan angle_min is a large negative value - this should probably be set to 0 assuming that the TOF sensor is aligned with the frame you set for frame_id.</p>
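<p>To make the fix concrete, here is a sketch with stand-in types (the real code uses <code>sensor_msgs::Range</code> and <code>sensor_msgs::LaserScan</code>; the fallback frame name <code>tof_link</code> is an assumption and must match a frame that actually exists in your TF tree — in this driver it comes from the node's <code>frame_id</code> parameter, which defaults to an empty string):</p>

```cpp
#include <cassert>
#include <string>

// Minimal stand-ins for the ROS message types used in the question.
struct Header { std::string frame_id; };
struct Range { Header header; };
struct LaserScan { Header header; double angle_min = 0.0, angle_max = 0.0; };

// Copy the header from the range message; if the driver left frame_id
// empty, fall back to a frame known to exist in the TF tree. A single
// fixed beam along the sensor's x axis is a zero-width scan, so
// angle_min and angle_max are both set to 0.
LaserScan convert(const Range &range_msg) {
  LaserScan scan;
  scan.header = range_msg.header;
  if (scan.header.frame_id.empty())
    scan.header.frame_id = "tof_link";  // assumed frame name
  scan.angle_min = 0.0;
  scan.angle_max = 0.0;
  return scan;
}
```

<p>The cleaner alternative is to set the <code>frame_id</code> parameter of the vl53l1x node in your launch file so the range message arrives with a valid frame in the first place.</p>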
104694
2023-10-13T11:18:33.843
|ros|rviz|slam|hector-slam|ros-noetic|
<p><strong>ver: lubuntu20.04, ros1-noetic, raspberry Pi 4</strong></p> <p>I am currently thinking of doing SLAM using a <strong>ToF sensor</strong> called <strong>vl53l1x</strong>. The SLAM I plan to use is <strong>Hector-SLAM</strong>.</p> <p>However, I am confused about converting the data to the <strong>sensor_msgs::LaserScan</strong> type required for Hector-SLAM.</p> <pre><code>/* * STM VL53L1X ToF rangefinder driver for ROS * * Author: Oleg Kalachev &lt;okalachev@gmail.com&gt; * * Distributed under BSD 3-Clause License (available at https://opensource.org/licenses/BSD-3-Clause). * * Documentation used: * VL53L1X datasheet - https://www.st.com/resource/en/datasheet/vl53l1x.pdf * VL53L1X API user manual - https://www.st.com/content/ccc/resource/technical/document/user_manual/group0/98/0d/38/38/5d/84/49/1f/DM00474730/files/DM00474730.pdf/jcr:content/translations/en.DM00474730.pdf * */ #include &lt;string&gt; #include &lt;vector&gt; #include &lt;ros/ros.h&gt; #include &lt;sensor_msgs/Range.h&gt; #include &lt;vl53l1x/MeasurementData.h&gt; #include &quot;vl53l1_api.h&quot; #include &quot;i2c.h&quot; #define xSTR(x) #x #define STR(x) xSTR(x) #define CHECK_STATUS(func) { \ VL53L1_Error status = func; \ if (status != VL53L1_ERROR_NONE) { \ ROS_WARN(&quot;VL53L1X: Error %d on %s&quot;, status, STR(func)); \ } \ } int main(int argc, char **argv) { ros::init(argc, argv, &quot;vl53l1x&quot;); ros::NodeHandle nh, nh_priv(&quot;~&quot;); sensor_msgs::Range range; vl53l1x::MeasurementData data; range.radiation_type = sensor_msgs::Range::INFRARED; ros::Publisher range_pub = nh_priv.advertise&lt;sensor_msgs::Range&gt;(&quot;range&quot;, 20); ros::Publisher data_pub = nh_priv.advertise&lt;vl53l1x::MeasurementData&gt;(&quot;data&quot;, 20); // Read parameters int mode, i2c_bus, i2c_address; double poll_rate, timing_budget, offset; bool ignore_range_status; std::vector&lt;int&gt; pass_statuses { VL53L1_RANGESTATUS_RANGE_VALID, VL53L1_RANGESTATUS_RANGE_VALID_NO_WRAP_CHECK_FAIL, 
VL53L1_RANGESTATUS_RANGE_VALID_MERGED_PULSE }; nh_priv.param(&quot;mode&quot;, mode, 3); nh_priv.param(&quot;i2c_bus&quot;, i2c_bus, 1); nh_priv.param(&quot;i2c_address&quot;, i2c_address, 0x29); nh_priv.param(&quot;poll_rate&quot;, poll_rate, 100.0); nh_priv.param(&quot;ignore_range_status&quot;, ignore_range_status, false); nh_priv.param(&quot;timing_budget&quot;, timing_budget, 0.1); nh_priv.param(&quot;offset&quot;, offset, 0.0); nh_priv.param&lt;std::string&gt;(&quot;frame_id&quot;, range.header.frame_id, &quot;&quot;); nh_priv.param(&quot;field_of_view&quot;, range.field_of_view, 0.471239f); // 27 deg, source: datasheet nh_priv.param(&quot;min_range&quot;, range.min_range, 0.0f); nh_priv.param(&quot;max_range&quot;, range.max_range, 4.0f); nh_priv.getParam(&quot;pass_statuses&quot;, pass_statuses); if (timing_budget &lt; 0.02 || timing_budget &gt; 1) { ROS_FATAL(&quot;Error: timing_budget should be within 0.02 and 1 s (%g is set)&quot;, timing_budget); ros::shutdown(); } // The minimum inter-measurement period must be longer than the timing budget + 4 ms (*) double inter_measurement_period = timing_budget + 0.004; // Setup I2C bus i2c_setup(i2c_bus, i2c_address); // Init sensor VL53L1_Dev_t dev; VL53L1_Error dev_error; VL53L1_software_reset(&amp;dev); VL53L1_WaitDeviceBooted(&amp;dev); VL53L1_DataInit(&amp;dev); VL53L1_StaticInit(&amp;dev); VL53L1_SetPresetMode(&amp;dev, VL53L1_PRESETMODE_AUTONOMOUS); // Print device info VL53L1_DeviceInfo_t device_info; CHECK_STATUS(VL53L1_GetDeviceInfo(&amp;dev, &amp;device_info)); ROS_INFO(&quot;VL53L1X: Device name: %.&quot; STR(VL53L1_DEVINFO_STRLEN) &quot;s&quot;, device_info.Name); ROS_INFO(&quot;VL53L1X: Device type: %.&quot; STR(VL53L1_DEVINFO_STRLEN) &quot;s&quot;, device_info.Type); ROS_INFO(&quot;VL53L1X: Product ID: %.&quot; STR(VL53L1_DEVINFO_STRLEN) &quot;s&quot;, device_info.ProductId); ROS_INFO(&quot;VL53L1X: Type: %u Version: %u.%u&quot;, device_info.ProductType, device_info.ProductRevisionMajor, 
device_info.ProductRevisionMinor); // Setup sensor CHECK_STATUS(VL53L1_SetDistanceMode(&amp;dev, mode)); CHECK_STATUS(VL53L1_SetMeasurementTimingBudgetMicroSeconds(&amp;dev, round(timing_budget * 1e6))); double min_signal; if (nh_priv.getParam(&quot;min_signal&quot;, min_signal)) { CHECK_STATUS(VL53L1_SetLimitCheckValue(&amp;dev, VL53L1_CHECKENABLE_SIGNAL_RATE_FINAL_RANGE, min_signal * 65536)); } double max_sigma; if (nh_priv.getParam(&quot;max_sigma&quot;, max_sigma)) { CHECK_STATUS(VL53L1_SetLimitCheckValue(&amp;dev, VL53L1_CHECKENABLE_SIGMA_FINAL_RANGE, max_sigma * 1000 * 65536)); } // Start sensor for (int i = 0; i &lt; 100; i++) { CHECK_STATUS(VL53L1_SetInterMeasurementPeriodMilliSeconds(&amp;dev, round(inter_measurement_period * 1e3))); dev_error = VL53L1_StartMeasurement(&amp;dev); if (dev_error == VL53L1_ERROR_INVALID_PARAMS) { inter_measurement_period += 0.001; // Increase inter_measurement_period to satisfy condition (*) } else break; } // Check for errors after start if (dev_error != VL53L1_ERROR_NONE) { ROS_FATAL(&quot;VL53L1X: Can't start measurement: error %d&quot;, dev_error); ros::shutdown(); } ROS_INFO(&quot;VL53L1X: ranging&quot;); VL53L1_RangingMeasurementData_t measurement_data; // Main loop ros::Rate r(poll_rate); while (ros::ok()) { r.sleep(); range.header.stamp = ros::Time::now(); // Check the data is ready uint8_t data_ready = 0; VL53L1_GetMeasurementDataReady(&amp;dev, &amp;data_ready); if (!data_ready) { continue; } // Read measurement VL53L1_GetRangingMeasurementData(&amp;dev, &amp;measurement_data); VL53L1_ClearInterruptAndStartMeasurement(&amp;dev); // Publish measurement data data.header.stamp = range.header.stamp; data.signal = measurement_data.SignalRateRtnMegaCps / 65536.0; data.ambient = measurement_data.AmbientRateRtnMegaCps / 65536.0; data.effective_spad = measurement_data.EffectiveSpadRtnCount / 256; data.sigma = measurement_data.SigmaMilliMeter / 65536.0 / 1000.0; data.status = measurement_data.RangeStatus; 
data_pub.publish(data); // Check measurement for validness if (!ignore_range_status &amp;&amp; std::find(pass_statuses.begin(), pass_statuses.end(), measurement_data.RangeStatus) == pass_statuses.end()) { char range_status[VL53L1_MAX_STRING_LENGTH]; VL53L1_get_range_status_string(measurement_data.RangeStatus, range_status); ROS_DEBUG(&quot;Range measurement status is not valid: %s&quot;, range_status); ros::spinOnce(); continue; } // Publish measurement range.range = measurement_data.RangeMilliMeter / 1000.0 + offset; range_pub.publish(range); ros::spinOnce(); } // Release ROS_INFO(&quot;VL53L1X: stop ranging&quot;); VL53L1_StopMeasurement(&amp;dev); i2c_release(); } </code></pre> <p>The ToF sensor <strong>publish program</strong> is available from github.</p> <p>Also, the program that receives the data is</p> <pre><code>#include &lt;ros/ros.h&gt; #include &lt;sensor_msgs/Range.h&gt; #include &lt;sensor_msgs/LaserScan.h&gt; ros::Publisher laser_scan_pub; void chatterCallback(const sensor_msgs::Range&amp; range_msg) { // Create a LaserScan message sensor_msgs::LaserScan laser_scan_msg; laser_scan_msg.header = range_msg.header; // Set the LaserScan-specific parameters laser_scan_msg.angle_min = -M_PI / 4.0; // Start angle (usually 0 radians) laser_scan_msg.angle_max = M_PI / 4.0; // End angle (usually 0 radians) laser_scan_msg.angle_increment = M_PI / 180.0; // Angle increment (usually 0 radians) laser_scan_msg.time_increment = 0.0; // Time between measurements (usually 0 seconds) laser_scan_msg.scan_time = 0.1; // Time taken for one scan (usually 0 seconds) laser_scan_msg.range_min = range_msg.min_range; laser_scan_msg.range_max = range_msg.max_range; // Calculate ranges array laser_scan_msg.ranges.push_back(range_msg.range); // Publish LaserScan message laser_scan_pub.publish(laser_scan_msg); } int main(int argc, char** argv) { ros::init(argc, argv, &quot;sensor_data&quot;); ros::NodeHandle nh; ros::Subscriber sub = nh.subscribe(&quot;/vl53l1x/range&quot;, 10, 
chatterCallback); laser_scan_pub = nh.advertise&lt;sensor_msgs::LaserScan&gt;(&quot;/laserscan&quot;, 10); ros::spin(); return 0; } </code></pre> <p>rostopic echo <strong>/laserscan</strong> This command confirms that the <strong>Topic is being received</strong></p> <pre><code>header: seq: 202 stamp: secs: 1697194120 nsecs: 356778758 frame_id: '' angle_min: -0.7853981852531433 angle_max: 0.7853981852531433 angle_increment: 0.01745329238474369 time_increment: 0.0 scan_time: 0.10000000149011612 range_min: 0.0 range_max: 4.0 ranges: [0.8009999990463257] intensities: [] </code></pre> <p>However, when I tried to visualize the <strong>/laserscan</strong> Topic with <strong>rviz</strong>, the point cloud data was not displayed.</p> <p>The publish side, /vl53l1x/range, could be visualized with rviz</p> <p>I couldn't improve it on my own <strong>Anything is fine, so please lend me your help.</strong></p>
Regarding rviz display after converting from sensor_msgs::Range type to sensor_msgs::LaserScan type
<p><strong>Solution:</strong> Steps to follow:</p> <p>1. Create a Models folder in your package directory and store the SDF file there. I created a single folder named Box inside the Models folder:</p> <ul> <li>Models <ul> <li>Box <ul> <li>model.config</li> <li>model.sdf</li> <li>meshes(folder) <ul> <li>mesh1.dae</li> <li>mesh2.stl</li> <li>mesh3.stl</li> </ul> </li> </ul> </li> </ul> </li> </ul> <p>a. model.config file</p> <pre><code> &lt;?xml version=&quot;1.0&quot;?&gt; &lt;model&gt; &lt;name&gt;Box&lt;/name&gt; &lt;version&gt;1.0&lt;/version&gt; &lt;sdf version='1.6'&gt;model.sdf&lt;/sdf&gt; &lt;author&gt; &lt;name&gt;name&lt;/name&gt; &lt;email&gt;name@email.address&lt;/email&gt; &lt;/author&gt; &lt;description&gt;This is a box&lt;/description&gt; &lt;/model&gt; </code></pre> <p>b. model.sdf</p> <pre><code>&lt;?xml version=&quot;1.0&quot; ?&gt; &lt;sdf version=&quot;1.5&quot;&gt; &lt;model name=&quot;Box&quot;&gt; &lt;pose&gt;0 0 0 0 0 0&lt;/pose&gt; &lt;static&gt;true&lt;/static&gt; &lt;link name=&quot;link&quot;&gt; &lt;collision name=&quot;collision&quot;&gt; &lt;geometry&gt; &lt;mesh&gt; &lt;uri&gt;model://Box/meshes/mesh1.dae&lt;/uri&gt; &lt;scale&gt;0.25 0.25 0.25&lt;/scale&gt; &lt;/mesh&gt; &lt;/geometry&gt; &lt;/collision&gt; &lt;visual name=&quot;visual&quot;&gt; &lt;geometry&gt; &lt;mesh&gt; &lt;uri&gt;model://Box/meshes/mesh1.dae&lt;/uri&gt; &lt;scale&gt;0.25 0.25 0.25&lt;/scale&gt; &lt;/mesh&gt; &lt;/geometry&gt; &lt;/visual&gt; &lt;/link&gt; &lt;/model&gt; &lt;/sdf&gt; </code></pre> <p>c. How to use the launch file in Python</p> <pre><code>spawn_entity = Node(package='gazebo_ros', executable='spawn_entity.py', name=&quot;spawn_sdf_entity&quot;, arguments=[ '-entity','Box','-file', LaunchConfiguration('sdf_model'), '-x','1.0', '-y','1.0', '-z', '0.0', '-R','0.0', '-P','0.0', '-Y','0.0' ], output='screen') return LaunchDescription([ sdf_model, spawn_entity, ]) </code></pre> <p>I hope this will be useful for everyone!!</p> <p>SYSTEM:</p> <p>ROS 2 Humble, Ubuntu Linux 22</p>
104695
2023-10-13T11:46:00.250
|ros-humble|sdformat|sdf|spawn-model|
<p>I have created an Object using Blender and was able to add the Add to the Path File in Ros Humble Gazebo it worked, how to open the same SDF file using Python Launch file.</p>
Spawning an Sdf Model in ROS humble
<p>It should be initialized, and therefore all fields should be initialized, including strings and sequences. The type support code is responsible for resizing, or finalizing and then re-initializing, fields as needed.</p> <p>The docs don’t explicitly say initialized, and perhaps they should (tag me on a PR if you’d like to add that), but the implication is that you shouldn’t pass in uninitialized memory; otherwise it’s impossible to tell if the complex fields like strings and such need to be cleaned up first.</p> <p>Consider that it’s also a perfectly legitimate use case for a user to allocate one message on the stack, initialize it once (which is done automatically in C++), and then call take on it over and over again, reusing the message each time.</p>
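<p>The init-once/reuse pattern can be illustrated in plain C with stand-in types — the stand-ins mimic the semantics of the rosidl-generated <code>__init</code>/<code>__fini</code> functions and of <code>rcl_take</code>, but none of this is the actual rcl API:</p>

```c
#include <assert.h>
#include <stdlib.h>
#include <string.h>

/* Stand-in for a message with an allocated field (the real code would use
 * a rosidl-generated type such as std_msgs__msg__String). */
typedef struct { char *data; size_t capacity; } msg_t;

/* Mirrors __init / __fini semantics: init zeroes, fini frees. */
static void msg_init(msg_t *m) { m->data = NULL; m->capacity = 0; }
static void msg_fini(msg_t *m) { free(m->data); m->data = NULL; m->capacity = 0; }

/* Stand-in for rcl_take: the type support grows the field as needed,
 * which only works safely if the message was initialized first
 * (error handling omitted for brevity). */
static void fake_take(msg_t *m, const char *incoming) {
  size_t need = strlen(incoming) + 1;
  if (m->capacity < need) {
    m->data = realloc(m->data, need);
    m->capacity = need;
  }
  memcpy(m->data, incoming, need);
}

/* Initialize once, take repeatedly, finalize once. Returns 1 on success. */
static int demo(void) {
  msg_t msg;
  msg_init(&msg);
  fake_take(&msg, "hello");
  fake_take(&msg, "a longer message");
  int ok = strcmp(msg.data, "a longer message") == 0;
  msg_fini(&msg);
  return ok;
}
```

<p>Note how an uninitialized <code>msg_t</code> passed to <code>fake_take</code> would contain a garbage pointer — exactly the ambiguity the initialization requirement removes.</p>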
104696
2023-10-13T12:50:16.770
|ros2|ros-humble|c|memory|
<p>I am writing a custom type support which uses the C type support to save work. Unfortunately I failed to find exact documentation about using the C type support which left me with this question:</p> <p>Has a message to be initialized before being passed to <a href="http://docs.ros.org/en/humble/p/rcl/generated/function_subscription_8h_1ae2528dbb733a899d7721b38e58495277.html" rel="nofollow noreferrer"><code>rcl_take</code></a>?</p> <p>I looked at the <a href="https://github.com/ros2/rclc/blob/d263be2057e39f1f702f014548c60f06b5a51333/rclc_examples/src/example_executor_only_rcl.c#L156" rel="nofollow noreferrer">rclc example</a> which initialized the message but the documentation for <code>rcl_take</code> says that only an allocated struct is required in which the taken message will be copied, which would lead to a memory leak if there is already a message with pointers to allocated arrays stored there.</p>
Initializing a ROS2 Message before calling rcl_take
<p>You will need to edit /etc/apt/sources.list.d/ros2.list (or whichever file has the apt setup for ros2). Replace &quot;lunar&quot; with your ubuntu codename (for instance, &quot;bionic&quot;, &quot;focal&quot;, &quot;jammy&quot;, etc).</p>
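<p>For example, a sketch of the edit — the codename detection assumes a standard Ubuntu install, and the demo below operates on a copy in <code>/tmp</code> (on the real file you would need <code>sudo</code>):</p>

```shell
# Demo on a copy; for the real fix, run sed with sudo on
# /etc/apt/sources.list.d/ros2.list instead.
LIST=/tmp/ros2.list
echo "deb [arch=amd64] http://packages.ros.org/ros2/ubuntu lunar main" > "$LIST"

# Detect the machine's Ubuntu codename, falling back to "jammy" (assumed):
CODENAME=$(. /etc/os-release 2>/dev/null && echo "$UBUNTU_CODENAME")
CODENAME=${CODENAME:-jammy}

# Swap the bad codename; afterwards run "sudo apt update" to verify.
sed -i "s/lunar/$CODENAME/" "$LIST"
cat "$LIST"
```

<p>The 404 goes away because <code>packages.ros.org</code> only publishes Release files for actual Ubuntu codenames, and "lunar" is a ROS distro name, not an Ubuntu one.</p>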
104710
2023-10-14T08:33:17.143
|ros|
<p>I'm trying to setup ROS2 on Ubuntu but I'm getting the following errors. Any suggestions on this? Thanks</p> <pre><code>Err:6 http://packages.ros.org/ros2/ubuntu lunar Release 404 Not Found [IP: 64.50.236.52 80] E: The repository 'http://packages.ros.org/ros2/ubuntu lunar Release' does not have a Release file. N: Updating from such a repository can't be done securely, and is therefore disabled by default. N: See apt-secure(8) manpage for repository creation and user configuration details. ``` </code></pre>
Lunar Release' does not have a Release file
<p>You have to declare the parameters for the controllers in their own top-level section instead of inside the <code>controller_manager</code> section:</p> <pre><code>joint_state_broadcaster: ros__parameters: use_local_topics: true </code></pre>
104715
2023-10-14T13:19:48.520
|ros2|ros-foxy|joint-trajectory-controller|ros2-control|ros2-controllers|
<p>I'm using ros2 control foxy now,and I'm trying to define namespace for my robot. The problem is that joint_state_broadcaster doesn't publish the /joint_states to the local topic /ns/joint_states. According to this <a href="https://control.ros.org/foxy/doc/ros2_controllers/joint_state_broadcaster/doc/userdoc.html" rel="nofollow noreferrer">Link</a>, it seems like we can change the parameter use_local_topics to 'true' for using our own namespace. Have someone tried this, i'm not sure the place that I should edit. In my understanding, should be somewhere in config file. Here is my config file, but it doesn't work.</p> <pre><code># Controller manager configuration controller_manager: ros__parameters: update_rate: 200 # Hz ### Controllers available joint_state_broadcaster: type: joint_state_broadcaster/JointStateBroadcaster use_local_topics: true forward_position_controller: type: forward_command_controller/ForwardCommandController position_trajectory_controller: type: joint_trajectory_controller/JointTrajectoryController ### Properties of the controllers that we will use and definition of joints to use ### forward_position_controller: ros__parameters: joints: - j1 - j2 - j3 interface_name: position position_trajectory_controller: ros__parameters: joints: - j1 - j2 - j3 command_interfaces: - position state_interfaces: - position state_publish_rate: 50.0 # Hz, Defaults to 50 action_monitor_rate: 20.0 # Hz, Defaults to 20 allow_partial_joints_goal: false # Defaults to false open_loop_control: false allow_integration_in_goal_trajectories: true constraints: stopped_velocity_tolerance: 0.01 # Defaults to 0.01 goal_time: 0.0 # Defaults to 0.0 (start immediately) gain: j1: p: 100.0 i: 10.0 d: 3600.0 j2: p: 100.0 i: 10.0 d: 3600.0 j3: p: 100.0 i: 10.0 d: 3600.0 </code></pre>
How to set the optional parameter "use_local_topics" of joint_state_broadcaster of ros2_control (foxy)
<p>There is no position sensor ROS plugin in <code>webots_ros2</code>.</p> <p>There are two ways to get joint position data published:</p> <ul> <li>Use <a href="https://control.ros.org/master/index.html" rel="nofollow noreferrer"><code>ros2_control</code></a>. There is a <code>ros2_control</code> interface for <code>webots_ros2</code> called <a href="https://github.com/cyberbotics/webots_ros2/tree/master/webots_ros2_control" rel="nofollow noreferrer"><code>webots_ros2_control</code></a> (see usage example <a href="https://github.com/cyberbotics/webots_ros2/blob/b0361c09c13e070be76d85aa89d9d0794b6d2a0b/webots_ros2_turtlebot/resource/turtlebot_webots.urdf#L33" rel="nofollow noreferrer">here</a>). In your case, you will need to spawn <a href="https://control.ros.org/master/doc/ros2_controllers/joint_state_broadcaster/doc/userdoc.html" rel="nofollow noreferrer"><code>joint_state_broadcaster</code></a>, which publishes joint state messages. This is generally the recommended way to interact with joints in ROS, as you can achieve hard control loops, much better performance, and simpler sim2real transfer.</li> <li>Create a custom <code>webots_ros2</code> plugin (see the <a href="https://docs.ros.org/en/humble/Tutorials/Advanced/Simulators/Webots/Setting-Up-Simulation-Webots-Basic.html" rel="nofollow noreferrer">tutorial</a>). You can use it to create a joint state publisher and publish joint state messages.</li> </ul>
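<p>For the first option, a rough sketch of the URDF additions (adapted from the linked Turtlebot example; the plugin and tag names here may differ between <code>webots_ros2</code> versions, so treat them as assumptions to verify against the example) would be:</p> <pre><code>&lt;webots&gt;
    &lt;plugin type=&quot;webots_ros2_control::Ros2Control&quot;/&gt;
&lt;/webots&gt;
&lt;ros2_control name=&quot;WebotsControl&quot; type=&quot;system&quot;&gt;
    &lt;hardware&gt;
        &lt;plugin&gt;webots_ros2_control::Ros2ControlSystem&lt;/plugin&gt;
    &lt;/hardware&gt;
    &lt;joint name=&quot;hip&quot;&gt;
        &lt;state_interface name=&quot;position&quot;/&gt;
    &lt;/joint&gt;
&lt;/ros2_control&gt;
</code></pre> <p>After that, spawning <code>joint_state_broadcaster</code> via the controller manager makes the joint positions appear on <code>/joint_states</code>.</p>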
104726
2023-10-15T10:46:15.927
|ros2|webots|
<p>I've installed Webots 2023b and, after I saw that a world with an inverted pendulum is already shipped in the default installation directory, I decided to use that world for my project.</p> <p>Basically I followed <a href="https://docs.ros.org/en/humble/Tutorials/Advanced/Simulators/Webots/Setting-Up-Simulation-Webots-Basic.html" rel="nofollow noreferrer">this tutorial</a> and <a href="https://docs.ros.org/en/humble/Tutorials/Advanced/Simulators/Webots/Setting-Up-Simulation-Webots-Advanced.html" rel="nofollow noreferrer">this tutorial</a> to be able to publish sensor data directly from the URDF file itself. That means that I created this file:</p> <pre class="lang-xml prettyprint-override"><code>&lt;?xml version=&quot;1.0&quot; ?&gt;
&lt;robot name=&quot;Inverted Pendulum&quot;&gt;
    &lt;webots&gt;
        &lt;device reference=&quot;hip&quot; type=&quot;PositionSensor&quot;&gt;
            &lt;ros&gt;
                &lt;topicName&gt;/hip&lt;/topicName&gt;
                &lt;alwaysOn&gt;true&lt;/alwaysOn&gt;
            &lt;/ros&gt;
        &lt;/device&gt;
        &lt;device reference=&quot;horizontal position sensor&quot; type=&quot;PositionSensor&quot;&gt;
            &lt;ros&gt;
                &lt;topicName&gt;/horizontal_position&lt;/topicName&gt;
                &lt;alwaysOn&gt;true&lt;/alwaysOn&gt;
            &lt;/ros&gt;
        &lt;/device&gt;
        &lt;plugin type=&quot;inverted_pendulum.device_driver.InvertedPendulum&quot;/&gt;
    &lt;/webots&gt;
&lt;/robot&gt;
</code></pre> <p>but, even though everything starts correctly (no errors or warnings in Webots), running:</p> <pre class="lang-bash prettyprint-override"><code>$ ros2 topic list
</code></pre> <p>does not show any topic with the name <em>/hip</em> or <em>/horizontal_position</em>. It hangs forever waiting for data to be published.</p> <p>The simulator is not paused, since I created a driver which moves the cart back and forth.</p> <p>Any idea?</p> <p>The code for the world is the same. I changed only the name. 
But I add it here anyway, just in case.</p> <pre class="lang-js prettyprint-override"><code>#VRML_SIM R2023b utf8 EXTERNPROTO &quot;https://raw.githubusercontent.com/cyberbotics/webots/R2023b/projects/objects/backgrounds/protos/TexturedBackground.proto&quot; EXTERNPROTO &quot;https://raw.githubusercontent.com/cyberbotics/webots/R2023b/projects/objects/floors/protos/Floor.proto&quot; WorldInfo { info [ &quot;An example of hot to solve the Inverted Pendulum problem using a PID controller&quot; ] title &quot;Inverted Pendulum&quot; basicTimeStep 16 contactProperties [ ContactProperties { material1 &quot;robot_basis&quot; material2 &quot;floor&quot; coulombFriction [ 0.2 ] } ] } Viewpoint { orientation -0.0996069518072968 -0.03685053329082472 0.9943442529364971 3.666446119327704 position 13.193129047100818 10.690115274872808 4.2889817843979205 follow &quot;robot:solid&quot; } TexturedBackground { } Floor { size 1000 2 appearance PBRAppearance { baseColorMap ImageTexture { url [ &quot;https://raw.githubusercontent.com/cyberbotics/webots/R2023b/projects/default/worlds/textures/checkered_marble.jpg&quot; ] } roughness 1 metalness 0 } } Robot { rotation 0 0 1 3.14159 children [ SliderJoint { jointParameters JointParameters { axis 1 0 0 dampingConstant 1.5 } device [ LinearMotor { name &quot;horizontal_motor&quot; maxForce 40 } PositionSensor { name &quot;horizontal position sensor&quot; } ] endPoint Solid { translation 0 0 0.06 children [ DEF ROBOT_SHAPE Shape { appearance PBRAppearance { baseColor 0.2443427176317998 0.704051270313573 0.1756923781185626 roughness 1 metalness 0 } geometry Box { size 0.3 0.1 0.1 } } DEF HIP HingeJoint { jointParameters HingeJointParameters { position 0.000161402 axis 0 1 0 anchor 0 0 0.03 } device [ PositionSensor { name &quot;hip&quot; } ] endPoint DEF THIGH_BB Solid { translation 0 -0.061 0.33000000000000007 rotation 0 1 0 0 children [ Shape { appearance PBRAppearance { baseColor 0.8496833752956435 0.07072556649118791 0.09393453879606317 
roughness 1 metalness 0 } geometry DEF THIGH_BOX Box { size 0.05 0.02 0.6 } } ] boundingObject USE THIGH_BOX physics Physics { density -1 mass 0.05 centerOfMass [ 0 0.061 -0.27 ] } } } PointLight { attenuation 0 0 1 intensity 5 location 0 0 2 } ] contactMaterial &quot;robot_basis&quot; boundingObject USE ROBOT_SHAPE physics Physics { density -1 mass 1 } } } ] name &quot;Inverted_Pendulum&quot; boundingObject Box { size 200 0.1 0.01 } physics Physics { density -1 mass 30 } controller &quot;&lt;extern&gt;&quot; } </code></pre>
How to get sensor data over a topic with webots_ros2 plugin?
<p>A Gazebo Sim system plugin can only be attached to a <code>&lt;world&gt;</code>, a <code>&lt;model&gt;</code> or a <code>&lt;sensor&gt;</code>, not to a <code>&lt;link&gt;</code>. You should be getting an error message for this in the log output, I think even without verbose output enabled. In any case: for testing purposes it is advisable to enable verbose output (<code>-v 4</code>), e.g.:</p> <pre><code>ign gazebo -v 4 world.sdf </code></pre> <p>All standard Gazebo Fortress system plugins can be found in the <a href="https://github.com/gazebosim/gz-sim/tree/ign-gazebo6/src/systems" rel="nofollow noreferrer">GitHub repository</a>. See the other branches of that repository for the other Gazebo Sim versions.</p> <p>Each plugin has usage documentation in its header file. For the <code>PosePublisher</code> system see <a href="https://github.com/gazebosim/gz-sim/blob/14b0f3b42da811cdd8fb24e983c2b2598a855c91/src/systems/pose_publisher/PosePublisher.hh#L35-L66" rel="nofollow noreferrer">this part of the header file</a>.</p> <p>I haven't used that plugin yet, but given the following documentation I conclude that you just have to add the plugin to the <code>&lt;model&gt;</code> in order to get all pose information published:</p> <blockquote> <p>Attach to an entity to publish the transform of its child entities in the form of gz::msgs::Pose messages, or a single gz::msgs::Pose_V message if &quot;use_pose_vector_msg&quot; is true.</p> </blockquote> <p>The GitHub repository also holds <a href="https://github.com/gazebosim/gz-sim/tree/ign-gazebo6/examples/worlds" rel="nofollow noreferrer">example worlds</a> that show the use of each plugin. <a href="https://github.com/gazebosim/gz-sim/blob/ign-gazebo6/examples/worlds/pose_publisher.sdf" rel="nofollow noreferrer">Here</a> is the one for the pose publisher.</p>
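<p>As a rough, untested sketch based on that documentation, the plugin from your world file would move out of the <code>&lt;link&gt;</code> and attach directly to the <code>&lt;model&gt;</code>:</p> <pre><code>&lt;model name=&quot;lidar&quot;&gt;
    &lt;!-- links, sensors, etc. as before --&gt;
    &lt;plugin filename=&quot;ignition-gazebo-pose-publisher-system&quot;
            name=&quot;gz::sim::systems::PosePublisher&quot;&gt;
        &lt;publish_link_pose&gt;true&lt;/publish_link_pose&gt;
        &lt;publish_sensor_pose&gt;true&lt;/publish_sensor_pose&gt;
        &lt;use_pose_vector_msg&gt;true&lt;/use_pose_vector_msg&gt;
        &lt;static_publisher&gt;true&lt;/static_publisher&gt;
    &lt;/plugin&gt;
&lt;/model&gt;
</code></pre> <p>You can then check the output on the Gazebo side, e.g. with <code>ign topic -e -t /model/lidar/pose</code> (the exact topic name is an assumption; list the available topics with <code>ign topic -l</code>).</p>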
104733
2023-10-15T15:36:58.940
|gazebo|rviz|pose|gazebo-plugin|sensor|
<p>I am trying to publish the pose of the camera and view it in rviz2. I am using ROS 2 Humble and Gazebo Fortress. I can see the image using the image topic, but I am also not able to visualize the topic in Gazebo. Here is my SDF file.</p> <pre><code>&lt;?xml version=&quot;1.0&quot; ?&gt; &lt;sdf version=&quot;1.6&quot;&gt; &lt;world name=&quot;test&quot;&gt; &lt;plugin filename=&quot;ignition-gazebo-physics-system&quot; name=&quot;gz::sim::systems::Physics&quot;&gt; &lt;/plugin&gt; &lt;plugin filename=&quot;ignition-gazebo-sensors-system&quot; name=&quot;gz::sim::systems::Sensors&quot;&gt; &lt;render_engine&gt;ogre2&lt;/render_engine&gt; &lt;/plugin&gt; &lt;plugin filename=&quot;ignition-gazebo-user-commands-system&quot; name=&quot;gz::sim::systems::UserCommands&quot;&gt; &lt;/plugin&gt; &lt;plugin filename=&quot;ignition-gazebo-scene-broadcaster-system&quot; name=&quot;gz::sim::systems::SceneBroadcaster&quot;&gt; &lt;/plugin&gt; &lt;light type=&quot;directional&quot; name=&quot;sun&quot;&gt; &lt;cast_shadows&gt;true&lt;/cast_shadows&gt; &lt;pose&gt;0 0 10 0 0 0&lt;/pose&gt; &lt;diffuse&gt;0.8 0.8 0.8 1&lt;/diffuse&gt; &lt;specular&gt;0.2 0.2 0.2 1&lt;/specular&gt; &lt;attenuation&gt; &lt;range&gt;1000&lt;/range&gt; &lt;constant&gt;0.9&lt;/constant&gt; &lt;linear&gt;0.01&lt;/linear&gt; &lt;quadratic&gt;0.001&lt;/quadratic&gt; &lt;/attenuation&gt; &lt;direction&gt;-0.5 0.1 -0.9&lt;/direction&gt; &lt;/light&gt; &lt;model name=&quot;ground_plane&quot;&gt; &lt;static&gt;true&lt;/static&gt; &lt;link name=&quot;link&quot;&gt; &lt;collision name=&quot;collision&quot;&gt; &lt;geometry&gt; &lt;plane&gt; &lt;normal&gt;0.0 0.0 1&lt;/normal&gt; &lt;size&gt;100 100&lt;/size&gt; &lt;/plane&gt; &lt;/geometry&gt; &lt;/collision&gt; &lt;visual name=&quot;visual&quot;&gt; &lt;geometry&gt; &lt;plane&gt; &lt;normal&gt;0.0 0.0 1&lt;/normal&gt; &lt;size&gt;100 100&lt;/size&gt; &lt;/plane&gt; &lt;/geometry&gt; &lt;material&gt; &lt;ambient&gt;0.8 0.8 0.8 1&lt;/ambient&gt; 
&lt;diffuse&gt;0.8 0.8 0.8 1&lt;/diffuse&gt; &lt;specular&gt;0.8 0.8 0.8 1&lt;/specular&gt; &lt;/material&gt; &lt;/visual&gt; &lt;/link&gt; &lt;/model&gt; &lt;model name=&quot;box&quot;&gt; &lt;pose&gt;5.0 0 0.7 0 0 0&lt;/pose&gt; &lt;link name=&quot;box_link&quot;&gt; &lt;inertial&gt; &lt;inertia&gt; &lt;ixx&gt;1&lt;/ixx&gt; &lt;ixy&gt;0&lt;/ixy&gt; &lt;ixz&gt;0&lt;/ixz&gt; &lt;iyy&gt;1&lt;/iyy&gt; &lt;iyz&gt;0&lt;/iyz&gt; &lt;izz&gt;1&lt;/izz&gt; &lt;/inertia&gt; &lt;mass&gt;1.0&lt;/mass&gt; &lt;/inertial&gt; &lt;collision name=&quot;box_collision&quot;&gt; &lt;geometry&gt; &lt;box&gt; &lt;size&gt;1 1 1&lt;/size&gt; &lt;/box&gt; &lt;/geometry&gt; &lt;/collision&gt; &lt;visual name=&quot;box_visual&quot;&gt; &lt;geometry&gt; &lt;box&gt; &lt;size&gt;1 1 1&lt;/size&gt; &lt;/box&gt; &lt;/geometry&gt; &lt;material&gt; &lt;ambient&gt;0.3 0.3 0.3 1&lt;/ambient&gt; &lt;diffuse&gt;0.3 0.3 0.3 1&lt;/diffuse&gt; &lt;specular&gt;0.3 0.5 0.3 1&lt;/specular&gt; &lt;/material&gt; &lt;/visual&gt; &lt;/link&gt; &lt;/model&gt; &lt;model name=&quot;model_with_camera&quot;&gt; &lt;static&gt;true&lt;/static&gt; &lt;pose&gt;4 -6 2 0 0 1.57&lt;/pose&gt; &lt;link name=&quot;link&quot;&gt; &lt;pose&gt;0.05 0.05 0.05 0 0 0&lt;/pose&gt; &lt;visual name=&quot;visual&quot;&gt; &lt;geometry&gt; &lt;box&gt; &lt;size&gt;0.1 0.1 0.1&lt;/size&gt; &lt;/box&gt; &lt;/geometry&gt; &lt;/visual&gt; &lt;sensor name=&quot;camera&quot; type=&quot;camera&quot;&gt; &lt;camera&gt; &lt;horizontal_fov&gt;1.047&lt;/horizontal_fov&gt; &lt;image&gt; &lt;width&gt;640&lt;/width&gt; &lt;height&gt;480&lt;/height&gt; &lt;/image&gt; &lt;clip&gt; &lt;near&gt;0.1&lt;/near&gt; &lt;far&gt;100&lt;/far&gt; &lt;/clip&gt; &lt;/camera&gt; &lt;always_on&gt;1&lt;/always_on&gt; &lt;update_rate&gt;30&lt;/update_rate&gt; &lt;visualize&gt;true&lt;/visualize&gt; &lt;topic&gt;camera&lt;/topic&gt; &lt;/sensor&gt; &lt;/link&gt; &lt;/model&gt; &lt;model name=&quot;lidar&quot;&gt; &lt;pose&gt;4.05 -6 2 0 0 1.57&lt;/pose&gt; &lt;link 
name=&quot;link&quot;&gt; &lt;pose&gt;0.05 0.05 0.05 0 0 0&lt;/pose&gt; &lt;inertial&gt; &lt;mass&gt;0.1&lt;/mass&gt; &lt;inertia&gt; &lt;ixx&gt;0.000166667&lt;/ixx&gt; &lt;iyy&gt;0.000166667&lt;/iyy&gt; &lt;izz&gt;0.000166667&lt;/izz&gt; &lt;/inertia&gt; &lt;/inertial&gt; &lt;collision name=&quot;collision&quot;&gt; &lt;geometry&gt; &lt;box&gt; &lt;size&gt;0.1 0.1 0.1&lt;/size&gt; &lt;/box&gt; &lt;/geometry&gt; &lt;/collision&gt; &lt;visual name=&quot;visual&quot;&gt; &lt;geometry&gt; &lt;box&gt; &lt;size&gt;0.1 0.1 0.1&lt;/size&gt; &lt;/box&gt; &lt;/geometry&gt; &lt;/visual&gt; &lt;sensor name='gpu_lidar' type='gpu_lidar'&gt; &lt;topic&gt;lidar&lt;/topic&gt; &lt;update_rate&gt;10&lt;/update_rate&gt; &lt;lidar&gt; &lt;scan&gt; &lt;horizontal&gt; &lt;samples&gt;1024&lt;/samples&gt; &lt;resolution&gt;1&lt;/resolution&gt; &lt;min_angle&gt;-1.396263&lt;/min_angle&gt; &lt;max_angle&gt;1.396263&lt;/max_angle&gt; &lt;/horizontal&gt; &lt;vertical&gt; &lt;samples&gt;64&lt;/samples&gt; &lt;resolution&gt;1&lt;/resolution&gt; &lt;min_angle&gt;-0.261799&lt;/min_angle&gt; &lt;max_angle&gt;0.261799&lt;/max_angle&gt; &lt;/vertical&gt; &lt;/scan&gt; &lt;range&gt; &lt;min&gt;0.08&lt;/min&gt; &lt;max&gt;30.0&lt;/max&gt; &lt;resolution&gt;0.01&lt;/resolution&gt; &lt;/range&gt; &lt;/lidar&gt; &lt;alwaysOn&gt;1&lt;/alwaysOn&gt; &lt;visualize&gt;true&lt;/visualize&gt; &lt;/sensor&gt; &lt;plugin filename=&quot;ignition-gazebo-pose-publisher-system&quot; name=&quot;gz::sim::systems::PosePublisher&quot;&gt; &lt;publish_link_pose&gt;true&lt;/publish_link_pose&gt; &lt;publish_sensor_pose&gt;true&lt;/publish_sensor_pose&gt; &lt;use_pose_vector_msg&gt;true&lt;/use_pose_vector_msg&gt; &lt;static_publisher&gt;true&lt;/static_publisher&gt; &lt;/plugin&gt; &lt;/link&gt; &lt;static&gt;true&lt;/static&gt; &lt;/model&gt; &lt;/world&gt; &lt;/sdf&gt; </code></pre> <p>I tried to publish the pose of the lidar using pose-publisher-system as well. Thank you.</p>
How to publish pose of the lidar/camera/(or any dynamic/static model) in gazebo fortress?
<p>The PWM signal has an undefined reference voltage level, because there is no common ground: the servo measures the control pulse on the white wire relative to its own negative supply rail, which floats with respect to the tester's ground when the two battery packs are separate.</p> <p>Connect the negative poles of both battery packs together to establish a common ground.</p>
104739
2023-10-15T22:23:59.583
|servo|
<p><a href="https://i.stack.imgur.com/PL9kx.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/PL9kx.png" alt="enter image description here" /></a></p> <p>I am having a problem when switching my servo power supply from the microcontroller to a separate battery pack. Fortunately, I was able to replicate the problem with two different cheapo servo testers, as shown below.</p> <p>In my understanding:</p> <ol> <li>The white (control PWM) cable works as a glorified relay - it tells the servo to give or cut power.</li> <li>There is NO feedback to the servo tester from the servo potentiometer - that's an internal Servo thing to align angle based on PWM.</li> </ol> <p>With those givens, where is my gap in understanding, why does the second diagram not work, and how do I make it work?</p>
Servo Tester Only Works When Power and PWM are On Same Battery, Why?
<p>In robotics, a quaternion (i.e. 4 coordinates) is typically used as a representation of a 3D orientation. There are other possible representations for the same 3D orientation, e.g. a minimal set of 3 coordinates (Euler angles, Roll Pitch Yaw, etc.), a 3x3 rotation matrix (i.e. 9 coordinates), etc.</p> <p>For your case, your robot only has a 1D orientation, and you already have a representation for that orientation: it is given by the angle <code>ebot_controller.loc_theta</code>. You <em>could</em> express this orientation in terms of a quaternion, but there is no real need for that.</p> <p>The relation between the global and local coordinate system is given by:</p> <p><span class="math-container">$$ \begin{bmatrix}v_{x\_global}\\ v_{y\_global}\end{bmatrix} = \begin{bmatrix}cos(\theta)&amp;-sin(\theta)\\ sin(\theta)&amp;cos(\theta)\end{bmatrix}\cdot\begin{bmatrix}v_{x\_ local}\\ v_{y\_local}\end{bmatrix} $$</span> and inverse: <span class="math-container">$$ \begin{bmatrix}v_{x\_local}\\ v_{y\_local}\end{bmatrix} = \begin{bmatrix}cos(\theta)&amp;sin(\theta)\\ -sin(\theta)&amp;cos(\theta)\end{bmatrix}\cdot\begin{bmatrix}v_{x\_global}\\ v_{y\_global}\end{bmatrix} $$</span></p> <p>The rotational component is unchanged, i.e.: <span class="math-container">$$ \omega_{global} = \omega_{local} $$</span></p>
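<p>As a minimal sketch (plain Python, function name illustrative), the inverse rotation above applied to a velocity vector before publishing it as a body-frame command would look like:</p>

```python
import math

def global_to_local(vx_global, vy_global, theta):
    """Rotate a velocity vector from the global frame into the robot's
    body frame, i.e. apply the inverse rotation for heading theta."""
    vx_local = math.cos(theta) * vx_global + math.sin(theta) * vy_global
    vy_local = -math.sin(theta) * vx_global + math.cos(theta) * vy_global
    return vx_local, vy_local

# A robot facing +90 degrees sees a pure global-x velocity
# along its negative body-y axis:
vx_l, vy_l = global_to_local(1.0, 0.0, math.pi / 2)
```

<p>In the P controller from the question, you would compute <code>error_x</code>/<code>error_y</code> in the global frame as before, then pass <code>kp_x * error_x</code> and <code>kp_y * error_y</code> through this rotation before assigning the result to <code>vel.linear.x</code> and <code>vel.linear.y</code>; the angular component is unchanged.</p>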
104747
2023-10-16T06:13:06.900
|gazebo|ros2|quaternion|transformation|
<p>I have to use some sort of transformation to generate the velocity of a model using global coordinates. This code is for making the model go to specific locations (goals). The problem I am facing is that error_x and the other values are calculated in the global frame, but the linear and angular velocity is always given in the body frame of the model. I figured that maybe quaternions can help me achieve this task. I need help regarding this.</p> <pre><code>error_x = x_goal - ebot_controller.loc_x
error_y = y_goal - ebot_controller.loc_y
error_theta = theta_goal - ebot_controller.loc_theta
print(&quot;X(Error): &quot;, error_x, &quot; Y(Error):&quot;, error_y)

# Calculation of control outputs using a P controller
ebot_controller.vel.linear.x = ebot_controller.kp_x * error_x
ebot_controller.vel.linear.y = ebot_controller.kp_y * error_y
ebot_controller.vel.angular.z = ebot_controller.kp_theta * error_theta
ebot_controller.cmd_vel_pub_.publish(ebot_controller.vel)

# code below is for not letting the model crawl to the goal very slowly
if error_x &lt; 0.1 and y_goal &lt; 0.1:
    ebot_controller.index += 1
    if ebot_controller.flag == 1:
        ebot_controller.index = 0
    ebot_controller.send_request(ebot_controller.index)
</code></pre>
How to use quaternions in ROS2
<p>It looks like from the launch file that you might have to declare a static transform frame called &quot;world&quot;, and connect that to the link &quot;body&quot;. You can try this out using the following bit of code in the launch file:</p> <pre><code>import os
import typing

import launch
import launch_ros
from launch.substitutions import (
    Command,
    FindExecutable,
    LaunchConfiguration,
    PathJoinSubstitution,
)

pkg_share = launch_ros.substitutions.FindPackageShare(package=&quot;spot_description&quot;).find(&quot;spot_description&quot;)
default_model_path = os.path.join(pkg_share, &quot;urdf/spot.urdf.xacro&quot;)
default_rviz2_path = os.path.join(pkg_share, &quot;rviz/viz_spot.rviz&quot;)


def launch_setup(context: launch.LaunchContext) -&gt; typing.List[launch_ros.actions.Node]:
    namespace = LaunchConfiguration(&quot;namespace&quot;).perform(context)
    robot_description = Command(
        [
            PathJoinSubstitution([FindExecutable(name=&quot;xacro&quot;)]),
            &quot; &quot;,
            PathJoinSubstitution([pkg_share, &quot;urdf&quot;, &quot;spot.urdf.xacro&quot;]),
            &quot; &quot;,
            &quot;arm:=&quot;,
            LaunchConfiguration(&quot;arm&quot;),
            &quot; &quot;,
            &quot;tf_prefix:=&quot;,
            LaunchConfiguration(&quot;tf_prefix&quot;),
            &quot; &quot;,
        ]
    )
    robot_state_publisher_node = launch_ros.actions.Node(
        package=&quot;robot_state_publisher&quot;,
        executable=&quot;robot_state_publisher&quot;,
        parameters=[{&quot;robot_description&quot;: robot_description}],
        namespace=namespace,
    )
    joint_state_publisher_node = launch_ros.actions.Node(
        package=&quot;joint_state_publisher&quot;,
        executable=&quot;joint_state_publisher&quot;,
        name=&quot;joint_state_publisher&quot;,
        namespace=namespace,
        condition=launch.conditions.UnlessCondition(LaunchConfiguration(&quot;gui&quot;)),
    )
    joint_state_publisher_gui_node = launch_ros.actions.Node(
        package=&quot;joint_state_publisher_gui&quot;,
        executable=&quot;joint_state_publisher_gui&quot;,
        name=&quot;joint_state_publisher_gui&quot;,
        condition=launch.conditions.IfCondition(LaunchConfiguration(&quot;gui&quot;)),
        namespace=namespace,
    )
    rviz_node = launch_ros.actions.Node(
        package=&quot;rviz2&quot;,
        executable=&quot;rviz2&quot;,
        name=&quot;rviz2&quot;,
        output=&quot;screen&quot;,
        arguments=[&quot;-d&quot; + default_rviz2_path],
    )
    static_tf_node = launch_ros.actions.Node(
        package=&quot;tf2_ros&quot;,
        executable=&quot;static_transform_publisher&quot;,
        output=&quot;screen&quot;,
        arguments=[&quot;0&quot;, &quot;0&quot;, &quot;0&quot;, &quot;0&quot;, &quot;0&quot;, &quot;0&quot;, &quot;world&quot;, &quot;body&quot;]
    )
    return [
        joint_state_publisher_node,
        joint_state_publisher_gui_node,
        robot_state_publisher_node,
        rviz_node,
        static_tf_node
    ]


def generate_launch_description() -&gt; launch.LaunchDescription:
    launch_arguments = [
        launch.actions.DeclareLaunchArgument(
            name=&quot;gui&quot;, default_value=&quot;True&quot;, description=&quot;Flag to enable joint_state_publisher_gui&quot;
        ),
        launch.actions.DeclareLaunchArgument(
            name=&quot;model&quot;, default_value=default_model_path, description=&quot;Absolute path to robot urdf file&quot;
        ),
        launch.actions.DeclareLaunchArgument(
            name=&quot;rvizconfig&quot;, default_value=default_rviz2_path, description=&quot;Absolute path to rviz config file&quot;
        ),
        launch.actions.DeclareLaunchArgument(&quot;arm&quot;, default_value=&quot;false&quot;, description=&quot;include arm in robot model&quot;),
        launch.actions.DeclareLaunchArgument(
            &quot;tf_prefix&quot;, default_value='&quot;&quot;', description=&quot;apply namespace prefix to robot links and joints&quot;
        ),
        launch.actions.DeclareLaunchArgument(&quot;namespace&quot;, default_value=&quot;&quot;, description=&quot;Namespace for robot tf topic&quot;),
    ]

    return launch.LaunchDescription(launch_arguments + [launch.actions.OpaqueFunction(function=launch_setup)])
</code></pre> <p><strong>EDIT</strong> Added the whole launch file to show the changes a bit better</p>
104749
2023-10-16T07:24:29.247
|gazebo|ros2|rviz|world|
<p><a href="https://i.stack.imgur.com/OrwV5.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/OrwV5.png" alt="Error in RVIZ" /></a></p> <p>I am using the <a href="https://github.com/bdaiinstitute/spot_ros2" rel="nofollow noreferrer">ROS 2 Humble Spot</a> GitHub repo. While installing the package in the workspace and running my first command, I came across this problem.</p> <p>I tried to check the launch file for any wrong file paths but found everything in its place.</p> <p>I did not encounter any problems during the installation and everything went well. How can I fix this error and include the world frame?</p>
Robot model does not exist
<p>Which ROS distro are you using? This is a well-known &quot;feature&quot;/bug and was <a href="https://github.com/ros-controls/ros2_controllers/pull/558" rel="nofollow noreferrer">fixed for rolling/iron packages</a>. I plan to backport this fix to humble, but this might still take some time to ensure not breaking existing setups.</p>
104758
2023-10-16T12:40:41.243
|joint-trajectory-controller|ros2-controllers|
<p>I want to control a manipulator with the <a href="https://control.ros.org/humble/doc/ros2_controllers/joint_trajectory_controller/doc/userdoc.html" rel="nofollow noreferrer">Joint Trajectory Controller</a> by using the <strong>velocity command interface</strong>. I am planning a trajectory with MoveIt <code>MotionPlanning</code> of RViz. I am using ROS 2 Humble.</p> <p>I've tested several setups, but with all of them I have the same problem: when the Joint Trajectory Controller reports &quot;Goal reached, success!&quot;, <strong>the velocities aren't set to zero</strong>. I tried to play with the parameters (increasing and decreasing <code>stopped_velocity_tolerance</code> or <code>goal_time</code>, changing the <code>gains</code>, adding and removing <code>ff_velocity_scale</code>), which sometimes leads to a smaller final velocity but never to completely stopped motion.</p> <p>A simple setup to reproduce this issue is to launch <code>ros2 launch moveit2_tutorials demo.launch.py</code> from <a href="https://github.com/ros-planning/moveit2_tutorials" rel="nofollow noreferrer">moveit2_tutorials</a>, but changing from the position to the velocity interface in the <code>panda_moveit_config</code>. 
I changed</p> <ol> <li>Settings for <code>panda_arm_controller</code> in <code>ros2_controllers.yaml</code> to:</li> </ol> <pre><code>panda_arm_controller:
  ros__parameters:
    command_interfaces:
      - velocity
    state_interfaces:
      - position
      - velocity
    joints:
      - panda_joint1
      - panda_joint2
      - panda_joint3
      - panda_joint4
      - panda_joint5
      - panda_joint6
      - panda_joint7
    gains:
      panda_joint1:
        p: 10.0
        i: 1.0
        d: 1.0
        i_clamp: 1.0
        ff_velocity_scale: 1.0
      panda_joint2:
        p: 10.0
        i: 1.0
        d: 1.0
        i_clamp: 1.0
        ff_velocity_scale: 1.0
      panda_joint3:
        p: 10.0
        i: 1.0
        d: 1.0
        i_clamp: 1.0
        ff_velocity_scale: 1.0
      panda_joint4:
        p: 10.0
        i: 1.0
        d: 1.0
        i_clamp: 1.0
        ff_velocity_scale: 1.0
      panda_joint5:
        p: 10.0
        i: 1.0
        d: 1.0
        i_clamp: 1.0
        ff_velocity_scale: 1.0
      panda_joint6:
        p: 10.0
        i: 1.0
        d: 1.0
        i_clamp: 1.0
        ff_velocity_scale: 1.0
      panda_joint7:
        p: 10.0
        i: 1.0
        d: 1.0
        i_clamp: 1.0
        ff_velocity_scale: 1.0
</code></pre> <ol start="2"> <li>Setting the command_interface from <strong>position</strong> to <strong>velocity</strong> in <code>panda.ros2_control.xacro</code>:</li> </ol> <pre><code>&lt;?xml version=&quot;1.0&quot;?&gt;
&lt;robot xmlns:xacro=&quot;http://www.ros.org/wiki/xacro&quot;&gt;
    &lt;xacro:macro name=&quot;panda_ros2_control&quot; params=&quot;name initial_positions_file ros2_control_hardware_type&quot;&gt;
        &lt;xacro:property name=&quot;initial_positions&quot; value=&quot;${xacro.load_yaml(initial_positions_file)['initial_positions']}&quot;/&gt;

        &lt;ros2_control name=&quot;${name}&quot; type=&quot;system&quot;&gt;
            &lt;hardware&gt;
                &lt;xacro:if value=&quot;${ros2_control_hardware_type == 'mock_components'}&quot;&gt;
                    &lt;plugin&gt;mock_components/GenericSystem&lt;/plugin&gt;
                    &lt;param name=&quot;calculate_dynamics&quot;&gt;true&lt;/param&gt;
                &lt;/xacro:if&gt;
                &lt;xacro:if value=&quot;${ros2_control_hardware_type == 'isaac'}&quot;&gt;
                    &lt;plugin&gt;topic_based_ros2_control/TopicBasedSystem&lt;/plugin&gt;
                    &lt;param name=&quot;joint_commands_topic&quot;&gt;/isaac_joint_commands&lt;/param&gt;
                    &lt;param name=&quot;joint_states_topic&quot;&gt;/isaac_joint_states&lt;/param&gt;
                &lt;/xacro:if&gt;
            &lt;/hardware&gt;
            &lt;joint name=&quot;panda_joint1&quot;&gt;
                &lt;command_interface name=&quot;velocity&quot;/&gt;
                &lt;state_interface name=&quot;position&quot;&gt;
                    &lt;param name=&quot;initial_value&quot;&gt;${initial_positions['panda_joint1']}&lt;/param&gt;
                &lt;/state_interface&gt;
                &lt;state_interface name=&quot;velocity&quot;/&gt;
            &lt;/joint&gt;
            &lt;joint name=&quot;panda_joint2&quot;&gt;
                &lt;command_interface name=&quot;velocity&quot;/&gt;
                &lt;state_interface name=&quot;position&quot;&gt;
                    &lt;param name=&quot;initial_value&quot;&gt;${initial_positions['panda_joint2']}&lt;/param&gt;
                &lt;/state_interface&gt;
                &lt;state_interface name=&quot;velocity&quot;/&gt;
            &lt;/joint&gt;
            &lt;joint name=&quot;panda_joint3&quot;&gt;
                &lt;command_interface name=&quot;velocity&quot;/&gt;
                &lt;state_interface name=&quot;position&quot;&gt;
                    &lt;param name=&quot;initial_value&quot;&gt;${initial_positions['panda_joint3']}&lt;/param&gt;
                &lt;/state_interface&gt;
                &lt;state_interface name=&quot;velocity&quot;/&gt;
            &lt;/joint&gt;
            &lt;joint name=&quot;panda_joint4&quot;&gt;
                &lt;command_interface name=&quot;velocity&quot;/&gt;
                &lt;state_interface name=&quot;position&quot;&gt;
                    &lt;param name=&quot;initial_value&quot;&gt;${initial_positions['panda_joint4']}&lt;/param&gt;
                &lt;/state_interface&gt;
                &lt;state_interface name=&quot;velocity&quot;/&gt;
            &lt;/joint&gt;
            &lt;joint name=&quot;panda_joint5&quot;&gt;
                &lt;command_interface name=&quot;velocity&quot;/&gt;
                &lt;state_interface name=&quot;position&quot;&gt;
                    &lt;param name=&quot;initial_value&quot;&gt;${initial_positions['panda_joint5']}&lt;/param&gt;
                &lt;/state_interface&gt;
                &lt;state_interface name=&quot;velocity&quot;/&gt;
            &lt;/joint&gt;
            &lt;joint name=&quot;panda_joint6&quot;&gt;
                &lt;command_interface name=&quot;velocity&quot;/&gt;
                &lt;state_interface name=&quot;position&quot;&gt;
                    &lt;param name=&quot;initial_value&quot;&gt;${initial_positions['panda_joint6']}&lt;/param&gt;
                &lt;/state_interface&gt;
                &lt;state_interface name=&quot;velocity&quot;/&gt;
            &lt;/joint&gt;
            &lt;joint name=&quot;panda_joint7&quot;&gt;
                &lt;command_interface name=&quot;velocity&quot;/&gt;
                &lt;state_interface name=&quot;position&quot;&gt;
                    &lt;param name=&quot;initial_value&quot;&gt;${initial_positions['panda_joint7']}&lt;/param&gt;
                &lt;/state_interface&gt;
                &lt;state_interface name=&quot;velocity&quot;/&gt;
            &lt;/joint&gt;
        &lt;/ros2_control&gt;
    &lt;/xacro:macro&gt;
&lt;/robot&gt;
</code></pre> <p>Now you can run <code>ros2 launch moveit2_tutorials demo.launch.py</code>, open a second terminal and run <code>ros2 topic echo /joint_states</code>. Plan an arbitrary trajectory by using rviz and hit <code>Plan &amp; Execute</code> in the <code>MotionPlanning</code> panel of rviz. Then you should see a motion being executed and the terminal should report:</p> <pre><code>[ros2_control_node-5] [INFO] [1697458106.110702550] [panda_arm_controller]: Goal reached, success!
[move_group-4] [INFO] [1697458106.122919526] [moveit.simple_controller_manager.follow_joint_trajectory_controller_handle]: Controller 'panda_arm_controller' successfully finished
[move_group-4] [INFO] [1697458106.151076866] [moveit_ros.trajectory_execution_manager]: Completed trajectory execution with status SUCCEEDED ...
[move_group-4] [INFO] [1697458106.151414260] [moveit_move_group_default_capabilities.move_action_capability]: Solution was found and executed.
[rviz2-1] [INFO] [1697458106.151859534] [move_group_interface]: Plan and Execute request complete!
</code></pre> <p>In parallel you can check the logs of <code>ros2 topic echo /joint_states</code>, which should report <strong>non-zero velocities even after the trajectory is successfully executed</strong>. Therefore the arm keeps moving (slowly but steadily) and you should see the positions constantly changing.</p> <p>Is this the intended behavior of the Joint Trajectory Controller? Or do I just have a bad parameter setup? 
Any help or hint on what the problem could be would be highly appreciated!</p>
Joint Trajectory Controller with velocity command interface has non-zero velocity after finished trajectory
<p>I found the solution to my problem thanks to introspecting the <code>~/controller_state</code> topic with PlotJuggler.</p> <p>The problem was with the <strong>joint limits</strong>. Lowering the <code>max_velocity</code> and the <code>max_acceleration</code> in the <code>joint_limits.yaml</code> fixed the problem for me. I think the joint limits within my hardware made it impossible to follow the trajectory. Planing a much slower trajectory on the other hand fixed that.</p>
104784
2023-10-17T14:58:52.757
|joint-trajectory-controller|ros2-controllers|
<p>I want to control a manipulator with the Joint Trajectory Controller (JTC) by using the velocity command interface. I am using ros2_control and ros2_controllers on the master branch, and I have a setup where I use a <strong>velocity command interface</strong> together with a <strong>position and velocity state interface</strong> for real hardware.</p> <p>I am planning a trajectory with MoveIt MotionPlanning of RViz. After hitting <code>Plan &amp; Execute</code>, the JTC always reports <em>&quot;Goal reached, success!&quot;</em> far too early, although there is still a big <strong>position error</strong> (e.g. the joint has only covered 70% of the way). And when the JTC reports <em>&quot;Goal reached, success!&quot;</em>, the line <code>traj_msg_external_point_ptr_.initRT(set_hold_position());</code> makes the JTC try to hold the wrong position, if I understand it correctly.</p> <p>It seems to me that the <strong>sampling of the trajectory is not working properly</strong>, or rather that I have a setup that makes it operate incorrectly. I tried to debug this a little and found that the trajectory that MoveIt generates looks okay; at least the desired positions look fine. 
I tried to tweak the <strong>gains for the JTC</strong> in a lot of combinations but it mostly gets worse.</p> <p>Any idea what could cause such a behavior or what parameter I should tune?</p> <p><strong>EDIT:</strong> I used PlotJuggler to capture the data of the controller_state to better visualize the problem:</p> <p><a href="https://i.stack.imgur.com/FAUdS.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/FAUdS.png" alt="Data of controller_state topic" /></a></p> <p>These are my settings for the JTC:</p> <pre><code>joint_trajectory_controller:
  ros__parameters:
    joints:
      - joint_1
    command_interfaces:
      - velocity
    state_interfaces:
      - position
      - velocity
    state_publish_rate: 50.0    # Defaults to 50
    action_monitor_rate: 20.0   # Defaults to 20
    gains:
      joint_1:
        p: 0.5
        i: 0.0
        d: 0.0
        i_clamp: 0.0
        ff_velocity_scale: 1.0
</code></pre>
Joint Trajectory Controller with velocity command interface reports success too early
<p>It's a good question, but unfortunately I don't have access to a robot remotely like that to play around with and test. I can say, though, from user videos provided for my ROSCon 2023 talk, that we have a robot exactly as you describe using MPPI successfully. That's not to say that behavior specialized to robots of this nature couldn't be added; I'm just unfortunately not in a position to do the evaluation and testing to implement it myself, nor do I have the experience to give you a straightforward answer about what specific changes would be necessary.</p> <p>Perhaps acceleration constraints (which are planned) would do the trick. Perhaps not <em>fully</em> is my intuition.</p> <p>Ex. <a href="https://youtu.be/1n2bGVIe7Gs?si=yDbrOyzjANmQhSK1&amp;t=48" rel="nofollow noreferrer">https://youtu.be/1n2bGVIe7Gs?si=yDbrOyzjANmQhSK1&amp;t=48</a></p> <p>My email's pretty easy to find - if you have the time to implement / test &amp; this is something you'd like to collaborate over to make work, let me know!</p>
104789
2023-10-17T19:12:13.150
|path-planning|ros-humble|nav2|local-planner|
<p>We are currently developing a swerve drive robot utilizing the MPPI controller, but we're encountering challenges in transitioning from simulation to real-world operation. To provide context on the swerve drive system, our robot consists of four &quot;legs,&quot; each equipped with a motor to control wheel orientation/direction relative to the robot and another motor to manage wheel speed.</p> <p>In simulation, the robot moves smoothly and follows the desired trajectory accurately. However, in the real world, we observe that the robot's wheel orientations rapidly fluctuate between random positions while at idle before initiating its intended trajectory. Moreover, during movement, the trajectory deviates, resulting in a snake-like pattern, and the robot struggles to react effectively to dynamic obstacles.</p> <p>The problem seems to be that our direction motors move instantly to their intended positions in simulation while in reality there is a noticeable delay. Our guess is that the MPPI controller generates trajectories for the next time step, aiming for motor angles that may not be achievable within a single time step due to the speed constraints of the direction motors.</p> <p>My question is therefore:</p> <ol> <li>Is there a way to add the speeds and accelerations of our direction motors to the MPPI controller for the computation of feasible paths?</li> <li>Is there an alternative solution to our problem?</li> </ol>
Optimizing Swerve Drive Robot Performance for Real-world Implementation with MPPI Controller
<p>Matrix multiplication is not commutative, so the order of composition matters. In order to compose the transform in the robot's moving frame you should do:</p> <pre><code>m = m*transf_mat; </code></pre> <p>When you do:</p> <pre><code>m = transf_mat*m; </code></pre> <p>you are transforming the robot frame relative to your fixed (inertial) axes.</p> <p><a href="https://i.stack.imgur.com/2bzQA.jpg" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/2bzQA.jpg" alt="plot after changing the multiplication order:" /></a></p>
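<p>To make the pre- vs post-multiplication distinction concrete, here is a small editor-added Python sketch (plain 3×3 homogeneous matrices, column-vector convention as in the question's MATLAB code; the helper names are made up for illustration):</p>

```python
# Sketch (not from the original answer): compare pre- vs post-multiplication
# of 2D homogeneous transforms, column-vector convention.
from math import cos, sin, pi

def matmul(a, b):
    """3x3 matrix product for plain nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def transform(t, sx, sy):
    """Rotation by t radians combined with translation (sx, sy)."""
    return [[cos(t), -sin(t), sx],
            [sin(t),  cos(t), sy],
            [0.0,     0.0,    1.0]]

R = transform(pi / 2, 0, 0)   # turn +90 degrees
T = transform(0, 1, 0)        # step 1 unit along local +x

# Post-multiply: the step is applied in the robot's (already rotated) frame,
# so moving along local +x moves the robot along world +y.
m_body = matmul(R, T)         # the "m = m*transf_mat" pattern
# Pre-multiply: the step is applied along the fixed world axes instead.
m_world = matmul(T, R)        # the "m = transf_mat*m" pattern

origin = [0.0, 0.0, 1.0]
apply = lambda m, p: [sum(m[i][k] * p[k] for k in range(3)) for i in range(3)]
print(apply(m_body, origin))   # ≈ (0, 1): moved along world +y
print(apply(m_world, origin))  # ≈ (1, 0): moved along world +x
```

<p>Post-multiplying accumulates each step in the robot's current frame, which is what the trajectory in the question needs.</p>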
104791
2023-10-17T21:23:53.997
|ros|robot-localization|matlab|motion-planning|pose|
<p>I have a robot starting from the origin. It first turns 90 degrees (yaw), then moves to (5,10), and then it moves along its own negative x direction at each step with a translation of 1. Here is the code:</p> <pre><code>close all

%% Set up the robot, with its tip pointing along its own +x
x_points = [-1, 2, 3, 2, -1, -1];
y_points = [2, 2, 0.5, -1, -1, 2];
points = [x_points; y_points; ones(1, length(x_points))];

%% translation
x_trans = [5 1 1 1 1 1 1 1 1 1];
y_trans = [10 0 0 0 0 0 0 0 0 0];

%% rotation
theta = [pi/2 0 0 0 0 0 0 0 0 0];

clf;
plot(x_points, y_points, '-r');
grid on
hold on;
axis image
xlim([-5 20]); ylim([-5 20]);
xlabel('x')
ylabel('y');

%% initial transformation matrix (identity)
m = eye(3);

%% loop through the preset motion for the robot
for n = 1:length(x_trans)
    t = theta(n);
    sx = x_trans(n);
    sy = y_trans(n);

    % transformation matrix at each step
    transf_mat = [cos(t), -sin(t), sx;
                  sin(t),  cos(t), sy;
                  0,       0,      1];

    % composite transformation matrix
    m = transf_mat*m;

    % Compute the new points
    transf_pts = m*points;

    % Plot the points
    plot(transf_pts(1,:), transf_pts(2,:), '-k');
    grid on
    hold on;
    xlim([-5 20]); ylim([-5 20]);
    drawnow;
    pause(0.2);
end
</code></pre> <p>and here is the plot of the moving trajectory of the robot: <a href="https://i.stack.imgur.com/dA7vC.jpg" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/dA7vC.jpg" alt="enter image description here" /></a></p> <p>However, since the very first motion of the robot is to turn +90 degrees, I expect the robot to start moving downwards (in the negative y direction in world coordinates), because after the first +90 degree (yaw) rotation, the robot's negative x direction is actually the world's negative y direction. 
So shouldn't the plot look something like this?</p> <p><a href="https://i.stack.imgur.com/QqETK.jpg" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/QqETK.jpg" alt="enter image description here" /></a></p> <p>I have been thinking about this for two days, and now I am really confused. Can anyone tell me where I went wrong? Thank you so much!</p>
Pose matrices and robotics 2D movement
<p>All of the layers have an <code>enabled</code> parameter that you can dynamically set on/off as you like.</p>
104792
2023-10-18T01:30:01.427
|ros2|nav2|
<p>Is there a way to change plugins at runtime, like below, for the local and global costmaps from the BT, in order to disable <code>obstacle_layer</code> as needed?</p> <pre><code>plugins: [&quot;plugin1&quot;, &quot;plugin2&quot;] plugin1: [&quot;voxel_layer&quot;, &quot;inflation_layer&quot;] plugin2: [&quot;voxel_layer&quot;, &quot;inflation_layer&quot;, &quot;obstacle_layer&quot;] </code></pre>
Nav2 Local and Global Plugins
<p>This error often happened to me when I built the package for the RViz plugin but did not source it in the terminal that starts and runs RViz.</p> <p>Make sure you source the workspace where &quot;vision_msgs_visualization&quot; lives and run RViz in the same terminal.</p>
104810
2023-10-18T19:28:24.963
|ros|rviz|ubuntu|ros-noetic|rviz-plugins|
<p>I am trying to use the <a href="https://github.com/Kukanani/vision_msgs_visualization" rel="nofollow noreferrer">Kukanani/vision_msgs_visualization</a> plugin visualize the <code>vision_msgs/Detection3DArray</code> type of data.</p> <p>I can add in RViz by display type <code>OrkObjectDisplay</code> option. But I get the following error, when I open RViz:</p> <pre><code>[ERROR] [1697653988.707062524]: PluginlibFactory: The plugin for class 'vision_msgs_visualization/OrkObjectDisplay' failed to load. Error: Could not find library corresponding to plugin vision_msgs_visualization/OrkObjectDisplay. Make sure the plugin description XML file has the correct name of the library and that the library actually exists. </code></pre>
Error in RViz with vision_msgs/Detection3DArray visualization
<p><a href="https://jeffzzq.medium.com/ros2-image-pipeline-tutorial-3b18903e7329" rel="nofollow noreferrer">This tutorial</a> does exactly what you want, I think. They are also creating a disparity map with stereo_image_proc.</p> <p>There you will calibrate your cameras and use the opencv_cam driver to read your devices. When reading your cameras with opencv_cam you can provide a calibration file (.yaml) that you created with the calibration, and opencv_cam will publish your image_raw and camera_info topics.</p>
104811
2023-10-18T19:54:07.807
|ros-humble|camera-info|stereo-camera|disparity|
<p>I'm working with 2 csi cameras in a jetson nano with ROS2 humble and I want to do a disparity image. What I'm trying to use is stereo_image_proc disparity node, but it requires to have the camera_info topics for both of the cameras. The camera is a imx219-83 and I haven't found a driver that allows me to have a camera_info topic. Is it possible to obtain this topic ? If not, is there something that could be helpful for this situations?</p> <p>Thanks!</p>
How to use stereo_image_proc without camera_info topics?
<p>Welcome to Robotics Stack Exchange!</p> <p>Please make sure that both machines are time synchronized. Time synchronization is crucial in a multi-machine setup.</p>
104816
2023-10-18T23:10:08.557
|gmapping|ros-noetic|
<p>I'm asking this question because I'm trying to get LiDAR sensor values from a home-made robot using a Raspberry Pi and run SLAM on a PC, but I'm having trouble running it.</p> <p>Raspberry Pi and PC are connected via LAN cable, and both ROS_HOST_NAME and ROS_MASTER_URI have been set.</p> <p><a href="https://i.stack.imgur.com/ZBZTM.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/ZBZTM.png" alt="enter image description here" /></a></p> <p>Connected to Raspberry Pi with VNC Viewer. When running LIDAR and odom node SLAM on Raspberry Pi, it operates normally as above. However, mapping speed is very slow.</p> <p><a href="https://i.stack.imgur.com/XS8lZ.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/XS8lZ.png" alt="enter image description here" /></a> This is rqt_graph executed on Raspberry Pi.</p> <p><a href="https://i.stack.imgur.com/lfElk.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/lfElk.png" alt="enter image description here" /></a> This error occurs when remotely connecting to an Ubuntu PC.</p> <p><a href="https://i.stack.imgur.com/3U89v.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/3U89v.png" alt="enter image description here" /></a> This is rqt_graph run on Ubuntu.</p>
ROS Raspberry Pi <-> PC Communication error
<p>If you would still like to work with TEB, you can clone the &quot;ros2-master&quot; branch of teb_local_planner and build it (I cloned teb_local_planner into nav2_ws/src, the workspace of navigation2):</p> <p>git clone <a href="https://github.com/rst-tu-dortmund/teb_local_planner.git" rel="nofollow noreferrer">https://github.com/rst-tu-dortmund/teb_local_planner.git</a> --branch ros2-master</p> <p>After building the Nav2 workspace, you can use teb_local_planner as a Nav2 plugin in the same way as other plug-in controllers.</p>
104822
2023-10-19T09:07:18.467
|ros2|ros-humble|plugin|teb-local-planner|nav2|
<p>In the <a href="https://navigation.ros.org/setup_guides/algorithm/select_algorithm.html" rel="nofollow noreferrer">navigation plugin setup document</a>, the TEB controller is mentioned, but I can't find any documentation on setting up this controller in Nav2 Humble, and there is no humble branch in the teb_local_planner repository. So I copied the TEB parameters from Foxy and launched the navigation node, but it told me the TEB controller plugin can't be found.</p> <pre><code>[controller_server-4] [FATAL] [1697706023.360686729] [controller_server]: Failed to create controller. Exception: According to the loaded plugin descriptions the class teb_local_planner::TebLocalPlannerROS with base class type nav2_core::Controller does not exist. Declared types are dwb_core::DWBLocalPlanner nav2_mppi_controller::MPPIController nav2_regulated_pure_pursuit_controller::RegulatedPurePursuitController nav2_rotation_shim_controller::RotationShimController </code></pre> <p>So why is the TEB controller not available in ROS 2 Humble? Or did I miss something? Thanks.</p>
Why is the TEB controller not available in ROS 2 Humble?
<p>You're missing a couple of key aspects:</p> <p>The odom -&gt; base_link transform is the location of the robot (where the robot thinks it is) relative to where it started, as calculated by tracking the wheel encoders. This transform is 0,0,0,0,0,0 at the start and starts changing as soon as the robot starts moving.</p> <p>The map -&gt; odom TF is the correction between the odom location and the map. If the robot starts at the map origin, and the odom has zero error over time, then the odom -&gt; base_link TF is where the robot thinks it is (which is exactly where it is in the map because there is no odom error in this imagined robot) and the map -&gt; odom TF is 0,0,0,0,0,0.</p> <p>In reality, though, the odom error will increase over time, so your localization method (AMCL) will publish a TF that gets added (subtracted) from the odom value to tell you where the robot actually is in the map. As time continues the odom errors can increase forever, so the numerical values in the map -&gt; odom TF may get large. After a year of driving a robot around, your odom readings may suggest you're in Paris, but the map -&gt; odom TF will have slowly been adding a correction throughout the year, so you know that you actually never left Burghausen (I'm sorry for that).</p> <p>If you don't start at the map origin, then that initial TF gets seeded with the correction, and the TF is then updated with the needed corrections because the odom error is always changing. Odom is only ever where the robot thinks it is relative to where it started, using only local measurement techniques.</p>
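<p>The correction bookkeeping described above can be sketched in one dimension (editor-added illustration; the numbers are made up, not real AMCL output):</p>

```python
# Toy 1-D illustration of how the map->odom transform absorbs odometry drift
# while odom->base_link keeps integrating raw wheel odometry.
def map_to_odom(map_to_base, odom_to_base):
    """map->odom is whatever correction makes the chain consistent:
    map->base = map->odom + odom->base  (1-D, translations only)."""
    return map_to_base - odom_to_base

# At start: robot at map origin, no motion yet -> both transforms are zero.
print(map_to_odom(0.0, 0.0))         # 0.0

# Later: wheel odometry has integrated 10.0 m, but the localizer estimates
# the robot is really at 9.2 m in the map (0.8 m of accumulated drift).
odom_to_base = 10.0
correction = map_to_odom(9.2, odom_to_base)
print(correction)                    # ~ -0.8

# The robot's pose in the map is always recovered through the chain:
print(correction + odom_to_base)     # 9.2 -> where the robot actually is
```

<p>The same composition holds for full 6-DOF transforms, with matrix products and an inverse in place of subtraction.</p>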
104830
2023-10-19T10:22:04.220
|ros|localization|tf2|
<p>I would like to understand better the transformations between the frames established in <a href="https://www.ros.org/reps/rep-0105.html" rel="nofollow noreferrer">REP-105</a>. I did read the specification but I think I might be misunderstanding something.</p> <p>Let me bring one example: Suppose that I'm at the center of the city, where I already SLAM before, and have a previous static map that I could use to localize myself. Also, I know the transformation between the origin of this static map and the earth frame (in my case I would work in utm), which would result in traslation + rotation.</p> <p>That would allow me to publish this transform as static, instead of using other nodes such as Navsat using GPS information.</p> <p>When I imagine the &quot;map&quot; -&gt; &quot;odom&quot; transform, I would understand it, as the transformation between the origin of the map, and the start position of my robot (the wheel's odometry for ex, should be publishing 0s).</p> <p>If that make sense, I could publish this tf depending on the position where the robot starts, in order to set the initial pose inside my map.</p> <p>Lastly, the odom -&gt; &quot;base_link&quot; tf represents how much did the robot move from the &quot;start point&quot;.</p> <p>My question is, if I want to swap the map because I'm trying to enter a room, which should be the correct way of doing it?</p> <p>I still have the utm -&gt; base_link tf (global position) and I also have the next utm -&gt; map tf to apply. The idea is to maintain the utm -&gt; base_link tf while changing the utm -&gt; map but I could accomplish this task using the map -&gt; odom tf, the odom -&gt; base_link tf or both. Which should be the &quot;best&quot; way to handle this?</p> <p>Seeing packages such as AMCL manipulating map -&gt; odom tf, I somehow understand that the important tf is map -&gt; base_link which represents the &quot;local&quot; location of the robot. 
But I don't know if it would be correct to set the map-&gt;odom tf again based on the new local start position and set the odom -&gt; base_link tf to 0 again, or to maintain it and let AMCL deal with it.</p> <p>If this doesn't make sense, I would be grateful if someone could enlighten me on this.</p> <p><strong>Edit 1</strong>: As I mentioned at first, let's imagine I have some maps that I could relate as a typical map graph setup, and I want to travel from map_1 to map_2. I would know approximately where the robot will start on map_2 and, moreover, if I have the origin of map_2 in UTM coordinates, I could also get the UTM position of my robot. Obviously this is approximate, but I think it's better than nothing. In this case, wouldn't it be interesting to kind of &quot;reset&quot; the odom-&gt;base_link tf in order to reduce the error provided by wheel encoders, etc.? Please see this minimal example. <a href="https://i.stack.imgur.com/RkXLB.jpg" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/RkXLB.jpg" alt="Minimal example of map swaps" /></a></p> <p>Imagine we also have the utm -&gt; map tf (I should have drawn it but I forgot to, sorry for that).</p> <ol> <li><p>I start in a position other than the map origin, represented in the map -&gt; odom tf, and as it is my start position, the odom -&gt; base_link tf is all 0s.</p> </li> <li><p>My robot reaches the end of map_1, where my algorithm could decide that a map swap is needed based on the robot's future path. If it needs to go to map_2, I could swap to the next map when it's close enough, and I could update map -&gt; odom, since I somehow have a map graph and can know where the robot will come from. Obviously, I would also need to update the utm -&gt; map tf, but this is static since I have all my maps' origin coordinates measured in UTM.</p> </li> <li><p>I update the utm -&gt; map tf according to map_2. 
Since I know where base_link should be on my new map_2, I would have to correct my map -&gt; odom tf, and I could &quot;reset&quot; my odom -&gt; base_link in order to correct offsets from the previous path.</p> </li> </ol> <p>As I understand it, when we are not using an earth frame, where we only have map, odom and base_link, my main focus is to know the map -&gt; base_link tf, since it represents my position inside my map.</p> <p>When there is an earth frame, I could have my global position (utm -&gt; base_link) and also my local position (map -&gt; base_link).</p>
Correct use of transformations while using coordinate frames
<p>Both services and actions have <code>wait_for_...()</code> methods; see the <a href="https://docs.ros.org/en/rolling/Tutorials/Beginner-Client-Libraries/Writing-A-Simple-Cpp-Service-And-Client.html#write-the-client-node" rel="nofollow noreferrer">service tutorial</a> and rclcpp <a href="https://github.com/ros2/rclcpp/blob/5ffc963e1aef97928a6cc895db18d40673c655f1/rclcpp/include/rclcpp/client.hpp#L191-L198" rel="nofollow noreferrer">source code</a>:</p> <pre class="lang-cpp prettyprint-override"><code>while (!client-&gt;wait_for_service(1s)) { if (!rclcpp::ok()) { RCLCPP_ERROR(rclcpp::get_logger(&quot;rclcpp&quot;), &quot;Interrupted while waiting for the service. Exiting.&quot;); return 0; } RCLCPP_INFO(rclcpp::get_logger(&quot;rclcpp&quot;), &quot;service not available, waiting again...&quot;); } </code></pre> <p>and the <a href="https://docs.ros.org/en/rolling/Tutorials/Intermediate/Writing-an-Action-Server-Client/Cpp.html#writing-an-action-client" rel="nofollow noreferrer">action client tutorial</a>:</p> <pre class="lang-cpp prettyprint-override"><code>if (!this-&gt;client_ptr_-&gt;wait_for_action_server()) { RCLCPP_ERROR(this-&gt;get_logger(), &quot;Action server not available after waiting&quot;); rclcpp::shutdown(); } </code></pre> <p><code>wait_for_action_server()</code> can also take a duration, see the rclcpp_action <a href="https://github.com/ros2/rclcpp/blob/5ffc963e1aef97928a6cc895db18d40673c655f1/rclcpp_action/include/rclcpp_action/client.hpp#L83-L86" rel="nofollow noreferrer">source code</a>.</p> <p>For your application you'd have to loop over each of your services/actions and wait until all are available.</p>
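<p>The looping part itself is plain control flow. Here is an editor-added sketch (in Python, with a stand-in <code>is_available()</code> predicate standing where the real per-entity <code>wait_for_service(timeout)</code> / <code>wait_for_action_server(timeout)</code> calls would go, so the logic runs anywhere):</p>

```python
# Sketch of a "wait for everything" loop. is_available(name) is a stand-in
# for the real ROS wait calls -- here it just consults a fake availability
# table so the control flow can be demonstrated without a ROS graph.
import time

def wait_for_all(names, is_available, timeout_s=10.0, poll_s=0.1):
    """Block until every name reports available, or raise on timeout."""
    pending = set(names)
    deadline = time.monotonic() + timeout_s
    while pending:
        pending = {n for n in pending if not is_available(n)}
        if not pending:
            break
        if time.monotonic() > deadline:
            raise TimeoutError(f"still waiting for: {sorted(pending)}")
        time.sleep(poll_s)
    return True

# Fake availability: "service3" only comes up after a short delay.
started = time.monotonic()
def fake_is_available(name):
    if name == "service3":
        return time.monotonic() - started > 0.3
    return True

print(wait_for_all(["service1", "service2", "service3"], fake_is_available))  # True
```

<p>With real wait calls that already block per entity, the inner polling sleep disappears and the loop reduces to iterating the list once with a per-entity timeout.</p>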
104837
2023-10-19T13:36:39.140
|ros2|ros-humble|action-server|ros-service|
<p>I have a list of services and a list of actions that I need to wait for to start before I can proceed further. Is there some sort of inbuilt function in <code>rclcpp</code> or <code>rclcpp::Node</code> that I can use?</p> <p>Creating a custom service and action client type is not really a viable option for me, since I would have more than 100 services + actions combined.</p> <p>What I want is something like the following:</p> <pre><code>class SomeNode : public rclcpp::Node { public: SomeNode() : Node(&quot;service_and_action_check&quot;) { /* We do our initial setup */ std::vector&lt;std::string&gt; serviceList = {&quot;service1&quot;, &quot;service2&quot;, &quot;service3&quot;}; std::vector&lt;std::string&gt; actionList = {&quot;action1&quot;, &quot;action2&quot;, &quot;action3&quot;}; for (auto &amp;elem : serviceList) { checkIfServiceServerIsRunning(elem); } for (auto &amp;elem : actionList) { checkIfActionServerIsRunning(elem); } RCLCPP_INFO(this-&gt;get_logger(), &quot;All services and actions have been loaded.&quot;); } . . . }; </code></pre> <p>If this is not readily available, if anyone knows of a simple way to create the functions <code>checkIfServiceServerIsRunning</code> and <code>checkIfActionServerIsRunning</code>, that would be of help too.</p> <p>Thanks!</p>
rclcpp wait for a list of service/action servers to start before proceeding further
<p>The first question: do you really need a custom controller, or would a <a href="https://control.ros.org/master/doc/ros2_controllers/position_controllers/doc/userdoc.html" rel="nofollow noreferrer">position_controller</a> work (simple interface: one position vector per received message), or the <a href="https://control.ros.org/master/doc/ros2_controllers/joint_trajectory_controller/doc/userdoc.html" rel="nofollow noreferrer">joint trajectory controller</a> (it also has a topic interface, plus an action server, and it can optionally interpolate from a full trajectory rather than only a single point)?</p> <p>Have a look at the <a href="https://control.ros.org/master/doc/ros2_control_demos/example_1/doc/userdoc.html" rel="nofollow noreferrer">other examples</a> from the demos to see how to use them.</p>
104839
2023-10-19T15:36:45.227
|ros2|control|position|
<p>I am a newbie with ros2 control. I am using ros2 iron on Ubunto 22.04 (Linux Mint). I made a copy of the ros2_control demo example #7.</p> <p>This is the one that is launched with the command <code>ros2 launch ros2_control_demo_example_7 r6bot_controller.launch.py</code>, I changed the name as well as urdf file. I also interfaced it with my Arduino robot arm based on positions. My arm has 5 instead of 6 joints and only uses positions. There are no encoders on my arm. All the velocity commands and states related to velocity were taken out of example 7. In my hardware interface (copy), I have code that when a position is changed, the code should send off a state change to an Arduino robot arm...</p> <p>When I run <code>ros2 control list_controllers</code>, I see</p> <pre><code>jem_5dof_robot_arm_controller[jem_robot_arm/RobotController] active joint_state_broadcaster[joint_state_broadcaster/JointStateBroadcaster] active </code></pre> <p>When I run <code>ros2 control list_hardware_interfaces</code>, I see</p> <pre class="lang-none prettyprint-override"><code>command interfaces Joint1/position [available] [claimed] Joint2/position [available] [claimed] Joint3/position [available] [claimed] Joint4/position [available] [claimed] Joint5/position [available] [claimed] state interfaces Joint1/position Joint2/position Joint3/position Joint4/position Joint5/position </code></pre> <p>I also see am image of my robot arm in rviz... I would say &quot;things look like they are working&quot;</p> <p>My question is this...........</p> <p>How do I send a new position to my robot arm from the command line or a GUI. Either send one new position for a Joint or send the set of positions for all 5 joints. All I want to happen is, quickly send something and interact with the Arduino hardware. I believe the code I implemented in the hardware interface to my Arduino based arm will work, the question is how to I force something to call my hardware interface with an update (so as to test). 
I'd like to set a few different positions and see if the hardware is working OK. Ideally, it would be nice to make changes in rviz and send those changes to my arm/Arduino. I suspect that rviz is only a one-way visualization of the arm and has no way to move something. I suspect that once a new position is sent to my robot arm's controller [jem_5dof_robot_arm_controller], rviz will update accordingly.</p> <p>What is a simple way to send off a few new positions from the command line to test?</p>
How do I send a new position to my 5 dof robot arm?
<p>I found that the simulated IMU in gazebo fortress was already giving me filtered data (<code>imu/data</code> as per imu_filter_madgwick). I thought it would be raw <code>imu/data_raw</code> and used <a href="http://wiki.ros.org/imu_filter_madgwick" rel="nofollow noreferrer">imu_filter_madgwick</a> to filter it again which made the orientation estimation wrong.</p>
104842
2023-10-19T16:56:34.467
|robot-localization|ros-humble|ignition-fortress|
<p>ROS2 Humble, Gazebo Fortress, Gravity in gazebo set to zero.</p> <p>I am trying to fuse IMU and GPS data with <a href="https://github.com/cra-ros-pkg/robot_localization/tree/ros2" rel="nofollow noreferrer">robot_localization</a> package.</p> <p>Already went through the following <a href="http://docs.ros.org/en/noetic/api/robot_localization/html/integrating_gps.html" rel="nofollow noreferrer">tutorial 1</a>, <a href="https://navigation.ros.org/tutorials/docs/navigation2_with_gps.html" rel="nofollow noreferrer">tutorial 2</a>.</p> <p>I am publishing a static transform from map to odom.</p> <p>The parameters are the following</p> <pre><code>ekf_filter_node_map: ros__parameters: frequency: 30.0 two_d_mode: true # Recommended to use 2d mode for nav2 in mostly planar environments print_diagnostics: true debug: false publish_tf: true map_frame: map odom_frame: odom base_link_frame: base_link # the frame id used by the turtlebot's diff drive plugin world_frame: odom odom0: odometry/gps odom0_config: [true, true, false, false, false, false, false, false, false, false, false, false, false, false, false] odom0_queue_size: 10 odom0_differential: false odom0_relative: false imu0: imu/data imu0_config: [false, false, false, false, false, true, false, false, false, false, false, false, false, false, false] imu0_differential: false # If using a real robot you might want to set this to true, since usually absolute measurements from real imu's are not very accurate imu0_relative: false imu0_queue_size: 10 imu0_remove_gravitational_acceleration: true use_control: false process_noise_covariance: [ 1e-3, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1e-3, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1e-3, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.3, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.3, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.00000005, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.005, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.005, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.001, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 3.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 3.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.3, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 3.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 3.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 3.0 ] navsat_transform: ros__parameters: frequency: 30.0 delay: 3.0 magnetic_declination_radians: 0.0 yaw_offset: 0.0 zero_altitude: true broadcast_utm_transform: true publish_filtered_gps: true use_odometry_yaw: true wait_for_datum: false </code></pre> <p>I tried several values for the <code>process_noise_convariance</code> but I see that the velocity in gazebo and Rviz dont match when I sent a Twist msg with only z angular velocity. This <a href="https://youtu.be/-i-gWCwUhrw" rel="nofollow noreferrer">video</a> explains it better I think. If I use ground truth odometry from gazebo, then the robot could reach a target goal but with robot_localization , this doesnt happen.</p> <p>Is there something else that I could try out to get the localization working? Any help would be great</p> <p>thanks</p>
Gazebo and Rviz have different velocity when robot_localization package is used
<p>The above populates parameters correctly when I swapped the launchfile from what is described above to:</p> <pre><code>def generate_launch_description(): service_node = Node( package=&quot;example&quot;, executable=&quot;my_service&quot;, name=&quot;my_service&quot;, parameters=[{&quot;example&quot;: &quot;test&quot;}], ) return LaunchDescription([service_node]) </code></pre> <p>...specifically, using the <code>parameters</code> keyword argument in the service node over a launch file argument; I am unsure at the moment why, or why the tutorial I read <a href="https://docs.ros.org/en/galactic/How-To-Guides/Launch-file-different-formats.html" rel="nofollow noreferrer">here</a> suggested <code>DeclareLaunchArgument</code>.</p>
104850
2023-10-19T23:43:23.730
|ros2|
<p>I'm trying to make sense of when launch file parameters are populated in ROS2. My situation is similar to the following simplified example; I have a launch file that contains a launch function.</p> <pre><code>def generate_launch_description(): argument = DeclareLaunchArgument( &quot;argument&quot;, description=&quot;Sample parameter/argument&quot; ) service_node = Node( package=&quot;example&quot;, executable=&quot;my_service&quot;, name=&quot;my_service&quot;, ) return LaunchDescription([argument, service_node]) </code></pre> <p>...which i call via something like <code>ros2 launch example example.launch.py argument:=test</code> And the node code as such:</p> <pre><code>import rclpy from rcl_interfaces.msg import ParameterDescriptor from rclpy.node import Node class Service(Node): def __init__(self): super().__init__(&quot;llm_service&quot;) self.declare_parameter( &quot;argument&quot;, &quot;&quot;, ParameterDescriptor( description=&quot;Example argument&quot; ), ) self.srv = self.create_service(LLM, &quot;prompt&quot;, self.prompt_callback) def get(self) -&gt; str: argument = ( self.get_parameter(&quot;argument&quot;).get_parameter_value().string_value ) return argument def main(args=None): # initialize the ROS communication rclpy.init(args=args) # declare the node constructor service = Service() # pause the program execution, waits for a request to kill the node (ctrl+c) rclpy.spin(service) # shutdown the ROS communication rclpy.shutdown() if __name__ == &quot;__main__&quot;: main() </code></pre> <p>When I call the init for my service, the argument is not set, and if I called <code>get()</code> it'd return the default value of <code>&quot;&quot;</code> . If I had a callback trigger later, say through something like a service call, and I called <code>get()</code> then, I would get the correct populated value from the launch file.</p> <p>So when does that get populated? 
What is the best accepted pattern for grabbing those arguments on service startup so that I can use them for initialization?</p>
In ROS2, when launching a node service with a launch file, when is the passed parameters populated?
<p>Welcome to Robotics Stack Exchange!</p> <p>The question is hard to follow as it is not well structured. Based on the description, I feel that you have the following intention: you are trying to ask whether the practice below is followed in ROS:</p> <pre><code># This is dummy code
import ld08_driver

driver = ld08_driver()
speed = driver.GetSpeed()
</code></pre> <p><em>Please assume the above code is written in your package, which differs from the package you mentioned (<a href="https://github.com/ROBOTIS-GIT/ld08_driver" rel="nofollow noreferrer">ROBOTIS-GIT/ld08_driver</a>).</em></p> <p>Now, based on the title <em>&quot;How do I use other packages while writing my own package&quot;</em>, the answer (to the question of whether the above practice is followed in ROS) is: no, ROS does not use the above practice. Instead, you should do the following:</p> <ol> <li>Invoke the launch file by using the following command: <pre><code>ros2 launch ld08_driver ld08.launch.py </code></pre> </li> <li>Depending on the package, it may create publishers, subscribers, services, etc. You can easily list this information by executing ROS commands. However, it is strongly advised to look at the package documentation.</li> <li>In the case of <a href="https://github.com/ROBOTIS-GIT/ld08_driver" rel="nofollow noreferrer">ROBOTIS-GIT/ld08_driver</a>, it seems to create a publisher of type <code>sensor_msgs::msg::LaserScan</code>.</li> <li>In your package, you should create a subscriber of type <code>sensor_msgs::msg::LaserScan</code> which subscribes to the above topic. This is how you can get the data (laser scan) from the driver in ROS.</li> </ol> <p>Please feel free to read and try <a href="https://docs.ros.org/en/foxy/Tutorials/Beginner-Client-Libraries/Writing-A-Simple-Cpp-Publisher-And-Subscriber.html#write-the-subscriber-node" rel="nofollow noreferrer">the official ROS 2 documentation</a> at any time.</p>
104851
2023-10-20T01:21:21.420
|ros2|lidar|packages|
<p>I'm just starting out with ros2, and I have been trying to do something that I thought would be very common. I want to use the functions available inside some other package in my own node. The package in question is the ld08_driver lds-02 lidar driver <a href="https://github.com/ROBOTIS-GIT/ld08_driver/tree/ros2-devel" rel="nofollow noreferrer">github link</a> and I see that inside the /include/lipkg.h file there are functions like GetSpeed, GetTimestamp and so on. What I'm asking is: is it not possible to do something like how rclpy is imported and used, &quot;import ld08_driver&quot;, and just use those functions instead of the pre-created node?</p> <p>I do realize the question is not very well structured, but please understand I am just starting out and haven't found any solution via Google searches. I'm using ros2 foxy and for now plan to use the lidar standalone with no real purpose, just testing various code, instead of working with simulations.</p>
How do I use other packages while writing my own package
<p>Both ROS 1 and ROS 2 have all the features you'll need for this task, but achieving 5 cm accuracy using dead reckoning on uneven pavement is unlikely. For accurate localization, you'll need additional sensor(s), as well as predetermined landmarks you can rely on.</p>
104861
2023-10-20T09:36:52.630
|differential-drive|
<p>First time on this forum and I’d like to ask about a project I’d want to do. I’d like to build a robot, preferably on a differential drive base (two driving wheels, one or two caster wheels) and have an arm which can rotate mounted on top. This video provides a concept of what I’m trying to achieve although it has a four wheel skid steer base.</p> <p><a href="https://youtu.be/iE2HxXjoqeM?t=3" rel="nofollow noreferrer">https://youtu.be/iE2HxXjoqeM?t=3</a></p> <p>The reason I’m drawn to a differential base is that I’m concerned about loss of directional accuracy through turns if I was using a skid steer application. I also feel that mecanum wheels may not do so well as the robot will be working on asphalt which can be rough and slightly uneven.</p> <p>To provide some idea of the accuracy I hope to achieve the robot will need to drive straight for almost 20 meters, reverse back to the half way mark the turn 90 degrees and drive a further 15 meters and hopefully be within 50mm of it’s intended path, the more accurate the better though.</p> <p>I would like to be able to set the robot running and not need to have any input until it has completed it’s path. I would also like to incorporate some simple monitoring channels for a thermocouple which may adjust the travel speed and to be able to trigger a relay.</p> <p>Is this a reasonable goal for a ROS2 application? At this point in my research the only other option I am aware of, which may not tick all boxes is CASP (Computer Aided Simulation Program).</p> <p>I see there is a tutorial which may be suitable for this project but there has been some negative feedback citing issues of broken repository references. Planning For Differential-Drive Mobile Base and an Arm tutorial is broken · Issue #364 · <a href="https://github.com/ros-planning/moveit2_tutorials/issues/364" rel="nofollow noreferrer">https://github.com/ros-planning/moveit2_tutorials/issues/364</a></p> <p>Any thoughts or suggestions on this?</p> <p>Thanks</p>
Rotating arm mounted on a differential base?
<p>Welcome to RSE.</p> <p>ros_control is based on a realtime loop (the update methods of the controllers and the read/write methods of the hardware components, sharing data via a shared-memory layer). realtime_tools provides mechanisms for using non-realtime-safe code with this loop: e.g., publishers, or buffers exchanging data between the realtime loop and subscriber callbacks.</p> <p>It seems that you are working with ROS 1. Both the buffer and the publisher are used in several places of the <a href="https://github.com/search?q=repo%3Aros-controls%2Fros_controllers%20realtime_publisher&amp;type=code" rel="nofollow noreferrer">ros_controllers</a>, and in a simpler form in the <a href="https://github.com/ros-controls/control_toolbox/blob/noetic-devel/src/pid.cpp" rel="nofollow noreferrer">control_toolbox's PID implementation</a>.</p>
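To see what the buffer mechanism does, here is a plain-Python sketch of the idea behind `realtime_tools::RealtimeBuffer` (this is not the actual C++ API; the class and method names here merely mirror `writeFromNonRT`/`readFromRT` for illustration): the non-realtime side writes under a lock, while the realtime loop only try-locks and falls back to its last private copy, so it never blocks.

```python
# Plain-Python sketch of the RealtimeBuffer idea (not the real C++ API):
# the non-RT side (e.g. a subscriber callback) may block on the lock,
# but the realtime loop must never wait, so it only try-locks.
import threading

class RealtimeBufferSketch:
    def __init__(self, initial):
        self._lock = threading.Lock()
        self._non_rt_value = initial   # written by the non-realtime thread
        self._rt_copy = initial        # private copy used by the RT loop

    def write_from_non_rt(self, value):
        with self._lock:               # blocking here is acceptable
            self._non_rt_value = value

    def read_from_rt(self):
        # Realtime side: never block. If the writer currently holds the
        # lock, just keep using the last successfully copied value.
        if self._lock.acquire(blocking=False):
            try:
                self._rt_copy = self._non_rt_value
            finally:
                self._lock.release()
        return self._rt_copy

buf = RealtimeBufferSketch(0.0)
buf.write_from_non_rt(1.5)        # e.g. done in a subscriber callback
print(buf.read_from_rt())         # the realtime update() loop reads 1.5
```

The realtime publisher in the question's snippet works on the same principle: `trylock()` either succeeds immediately or the message is simply skipped for that cycle.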
104874
2023-10-22T03:22:15.003
|ros|ros-control|realtime|
<p>I'm trying to figure out how to use realtime_tools (mainly realtime_publisher &amp; realtime_buffer), but there is too little information about it. I can only find this:</p> <pre><code>#include &lt;realtime_tools/realtime_publisher.h&gt; bool MyController::init(pr2_mechanism_model::RobotState *robot, ros::NodeHandle &amp;n) { realtime_pub = new realtime_tools::RealtimePublisher&lt;mgs_type&gt;(n, &quot;topic&quot;, 4); return true; } void MyController::update() { if (realtime_pub-&gt;trylock()){ realtime_pub-&gt;msg_.a_field = &quot;hallo&quot;; realtime_pub-&gt;msg_.header.stamp = ros::Time::now(); realtime_pub-&gt;unlockAndPublish(); } } </code></pre> <p>Hope someone can give me more detailed help.</p> <p>Thanks a lot.</p>
realtime_publisher&realtime_buffer
<p>The question defines <span class="math-container">$x(t)$</span> as a &quot;position reference&quot;. Given the remainder of the question, e.g. <span class="math-container">$\dot{q_t} = J^{-1}(q_t)\dot{x_t}$</span>, I conclude that a full 6-DOF reference trajectory is meant (i.e. a specification for positions <strong>and</strong> orientations for the end effector).</p> <p>For a 6D reference trajectory, a 6-joint robot is non-redundant. As long as the robot does not travel through singularities throughout the trajectory <span class="math-container">$x(t)$</span>, the trajectory is feasible. But that purely depends on the start configuration of your robot (if multiple are possible) and the specified 6D trajectory. There is a one-to-one mapping between the trajectory (i.e. sequence of end effector poses) and the robot motion (i.e. sequence of joint positions).</p> <p>In other words: for a given 6D motion trajectory and a 6-joint robot, you cannot optimize joint motions towards some criteria (such as avoiding joint angle limits or reducing max joint velocities, etc).</p> <p>If you want to formulate the robot task as an optimization problem, you need extra degrees of freedom so one trajectory in task space (in your example: cartesian space) has multiple possible trajectories in joint space. The optimization problem is then to identify, from all possible trajectories in joint space, <em>that one</em> which yields minimum values for your optimization criterion.</p> <p>E.g.:</p> <ul> <li><p><strong>Consider a 6DOF trajectory with a 7 joint robot:</strong> due to the 7th joint, the robot is redundant and this redundancy can be used to minimize some optimization criterion (such as remaining clear of joint limits, or minimizing kinetic energy, etc).</p> </li> <li><p><strong>Consider a 6 joint robot, but a 5 DOF kinematic task.</strong> E.g. 
a <a href="https://www.google.com/search?sca_esv=575726020&amp;sxsrf=AM9HkKnLa0I8y7zqjSskDQSSeP4M_Ec5eg:1698049312158&amp;q=mig+welding&amp;tbm=isch&amp;source=lnms&amp;sa=X&amp;sqi=2&amp;ved=2ahUKEwiAqb_Q3ouCAxVHtKQKHT_7BQIQ0pQJegQIDhAB&amp;biw=3370&amp;bih=1304&amp;dpr=1" rel="nofollow noreferrer">MIG welding operation</a>: the weld seam geometry defines position references <span class="math-container">$x(t)$</span>, <span class="math-container">$y(t)$</span>, <span class="math-container">$z(t)$</span> and two angles <span class="math-container">$\phi(t)$</span> and <span class="math-container">$\theta(t)$</span> but the rotation of the welding gun around the axis defined by the welding wire is not relevant. Null motion around that axis can be used, e.g. to avoid singular robot positions throughout the motion along the seam.</p> </li> </ul> <p>An alternative optimization problem is if the <em>task function</em> is only <em>partially specified</em>, and you need to calculate an optimum path through the task space to realize that partial specification. E.g. instead of a full trajectory <span class="math-container">$x(t)$</span>, you specify only a start pose <span class="math-container">$x(0)$</span> and end pose <span class="math-container">$x(T)$</span> (and maybe some intermediate poses) and you calculate an optimal trajectory to realize that motion.</p>
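For the redundant (7-joint) case in the first bullet, the usual way to exploit the extra degree of freedom is the textbook pseudoinverse scheme with a null-space projection. This is a standard formulation, not something spelled out in the answer:

```latex
% Redundancy resolution (standard textbook formulation):
% J^{+} is the pseudoinverse of the Jacobian; the projector (I - J^{+}J)
% maps \dot{q}_0 into joint motions that do not disturb the task velocity
% \dot{x}, so it can be used to descend a secondary criterion H(q),
% e.g. distance from joint limits.
\dot{q} \;=\; J^{+}(q)\,\dot{x} \;+\; \bigl(I - J^{+}(q)\,J(q)\bigr)\,\dot{q}_{0},
\qquad \dot{q}_{0} \;=\; -k\,\nabla_{q} H(q)
```

For the non-redundant 6-joint case with a full 6D task, $J$ is (generically) square and invertible, so the projector vanishes, which restates the answer's point that there is nothing left to optimize.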
104882
2023-10-22T21:19:10.450
|robotic-arm|control|kinematics|motion-planning|velocity|
<p>I've been trying to utilize Model Predictive Control (MPC) scheme that I have for end-effector position reference <span class="math-container">$x(t)$</span> tracking control to build <strong>end-effector velocity reference <span class="math-container">$\dot{x}(t)$</span> tracking</strong>. Hopefully, I can design a smooth velocity reference profile from the positions reference profile so that I CAN ACHIEVE BOTH POSITION AND VELOCITY REFERENCE TRACKING.</p> <p>Let me first describe how I think the MPC should be formulated to achieve what I want:</p> <p>Assuming that we have a common 6-DOF robot manipulator without any redundancy (e.g., 6-joints cobot), and there is a way to calculate inverse of its Jacobian matrix <span class="math-container">$J^{-1}(q_t)$</span>. Then, from end-effector velocity reference profile <span class="math-container">$\dot{x}(t)$</span>, we can calculate desired joint velocities <span class="math-container">$\dot{q_t} = J^{-1}(q_t)\dot{x_t}$</span>.</p> <p>A system of equation is as following:</p> <p><span class="math-container">$\dot{q}_{t+1} = \dot{q_t} + \Delta_t a_t$</span> = <span class="math-container">$A\dot{q_t} + B\Delta_t a_t$</span></p> <p>From here, we can formulate the MPC problem as:</p> <p><span class="math-container">$min_{a_0,...a_{N-1}} {J = \sum_{t=1}^{N}||\dot{q_t} - \dot{q_g}||^2_{Q} + ||\dot{q_N} - \dot{q_g}||^2_{P}}_{} + ||a_t||^2_R$</span></p> <p><span class="math-container">$s.t. 
\dot{q}_{t+1} = A\dot{q_t} + B\Delta_t a_t$</span></p> <p><span class="math-container">$\dot{q}_{min} \leq \dot{q}_t \leq \dot{q}_{max}$</span>, <span class="math-container">$\dot{q}_0$</span> given</p> <p>Now, the question is: how am I supposed to put <strong>joint angle limits</strong> instead of joint acceleration limits (<span class="math-container">$a_{lim}$</span>) in the optimization problem above (the OEM provides only the joint angle and velocity limits)?</p> <p>I was thinking of using Quadratic Programming (QP), as the problem looks like a strictly convex optimization. But there is no place to put the angle limits as parameters in QP (maybe these limits cannot be expressed as linear constraints). I would appreciate it if anyone could provide me some advice and/or references.</p>
End effector velocity control of a 6-DOF robotic manipulator using MPC
<p>For me, the solution was as Roberto said, but I also needed to copy some <code>.dll</code> files such as <code>OgreMain.dll</code>, <code>OgreOverlay.dll</code>, <code>assimp-vc140-mt.dll</code>, <code>libcurl.dll</code>, <code>yaml-cpp.dll</code> and <code>zlib.dll</code> from <code>C:/dev/ros2_humble/opt/{other folder}/bin</code> to <code>C:/dev/ros2_humble/bin</code>, and to rename the <code>zlib.dll</code>.</p>
104892
2023-10-23T07:24:50.307
|ros2|rviz|
<p>I am trying to use RVIZ2 on Windows 10 with ROS2 Humble, without much success so far. Running (after activating <code>local_setup.ps1</code>):</p> <pre><code>ros2 run rviz2 rviz2 </code></pre> <p>Returns with the message:</p> <blockquote> <p>[ros2run]: Process exited with failure 3221225595</p> </blockquote> <p>When running the executable directly:</p> <pre><code>C:\dev\ros2_humble\bin\rviz2.exe </code></pre> <p>I get no messages at all, but the exit code is -1073741701.</p> <p>I can reproduce this with Windows Sandbox (although the numbers are slightly different), with a fresh and default install of Humble.</p> <p>ROS2 is working normally and I have installed Qt 5.12.12.<br /> What could be the problem here?</p>
ROS2 Humble on Windows 10 - RVIZ2 gives error "Process exited with failure 3221225595"
<p>The problem with the map was that I had the Durability policy in RViz set to Volatile instead of Transient Local. And the problem with the costmap was that I hadn't installed the <code>behaviortree_cpp_v3</code> package on my robot computer.</p>
104899
2023-10-23T13:12:53.143
|ros2|rviz|ros-humble|nav2|
<p>I am trying to have my robot navigate with Nav2, I have a laptop connected to the same network as the robot in order to visualize and send navigation goals, however I am getting a &quot;No map received&quot; warning in RViz2 as shown below:</p> <p><a href="https://i.stack.imgur.com/w6p2F.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/w6p2F.png" alt="RViz2 warning image" /></a></p> <p>I tried running the navigation nodes on the remote PC and the map shows successfully but I want to have the navigation run on the robot computer. I also tried running the SLAM on the robot and visualize on the remote PC and the map is showing as shown bellow, so I don't know what should I do to visualize the map when doing navigation.</p> <p><a href="https://i.stack.imgur.com/tzwCy.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/tzwCy.png" alt="enter image description here" /></a></p> <p>[EDIT]</p> <p>The problem with the map was that I had to configure the Durability policy to Transient local, however the costmap is still not showing, I compared the rqt_graph output on both cases (launching the navigation nodes from the robot computer and on my personal computer) and I see the <code>bt_navigator</code> node is not publishing <code>/bond</code> topic so I went to check the logs when launching navigation on the robot computer and I see there is a problem on the <code>bt_navigator</code> node when it is configuring, the error is the following:</p> <pre class="lang-bash prettyprint-override"><code>[lifecycle_manager-12] [INFO] [1698166413.648628234] [lifecycle_manager_navigation]: Configuring bt_navigator [bt_navigator-9] [INFO] [1698166413.648875423] [bt_navigator]: Configuring [bt_navigator-9] [ERROR] [1698166413.694414631] []: Caught exception in callback for transition 10 [bt_navigator-9] [ERROR] [1698166413.694437451] []: Original error: Could not load library: /opt/ros/humble/lib/libnav2_compute_path_to_pose_action_bt_node.so: undefined 
symbol: _ZN2BT10Blackboard15createEntryImplERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEERKNS_8PortInfoE [bt_navigator-9] [WARN] [1698166413.694469440] []: Error occurred while doing error handling. [bt_navigator-9] [FATAL] [1698166413.694478582] [bt_navigator]: Lifecycle node bt_navigator does not have error state implemented [lifecycle_manager-12] [ERROR] [1698166413.694941267] [lifecycle_manager_navigation]: Failed to change state for node: bt_navigator [lifecycle_manager-12] [ERROR] [1698166413.695032343] [lifecycle_manager_navigation]: Failed to bring up all requested nodes. Aborting bringup. </code></pre>
No costmap received Nav2
<p>So after some trial and error and looking around online for a while, I seem to have found a solution. From what I understand, multi-chain planning is possible but not multi-chain kinematics solving. Essentially this means that you need to solve the kinematics of each move group independently (MoveIt 2 does not appear to support a multi-chain kinematics solver; this doesn't seem to be specified anywhere or mentioned in the documentation) and then pass the solved joint states to the combined move group's planner.</p> <p>Essentially:</p> <ol> <li>Use individual move_groups to calculate desired angles based on pose</li> <li>extract the joint angles</li> <li>combine the joint angles in the order expected by the dual arm group (this isn't documented; I had to find it by trial and error)</li> <li>then plan the movements with the dual arm group using the combined joint angles</li> </ol> <p>Here is my code:</p> <pre><code> geometry_msgs::msg::Pose rightGoalPose = msg-&gt;poses.back(); geometry_msgs::msg::Pose leftGoalPose = msg-&gt;poses.front(); // Calculate Kinematics for each arm individually using separate move_groups leftArmMoveGroup-&gt;setJointValueTarget(leftGoalPose); rightArmMoveGroup-&gt;setJointValueTarget(rightGoalPose); std::vector&lt;double&gt; leftJointValue; std::vector&lt;double&gt; rightJointValue; leftArmMoveGroup-&gt;getJointValueTarget(leftJointValue); rightArmMoveGroup-&gt;getJointValueTarget(rightJointValue); std::vector&lt;double&gt; bothJointsSend; bothJointsSend.insert(bothJointsSend.end(), rightJointValue.begin(), rightJointValue.end()); bothJointsSend.insert(bothJointsSend.end(), leftJointValue.begin(), leftJointValue.end()); auto successCheck = bothArmsMoveGroup-&gt;setJointValueTarget(bothJointsSend); std::vector&lt;double&gt; bothJointValue; bothArmsMoveGroup-&gt;getJointValueTarget(bothJointValue); moveit::planning_interface::MoveGroupInterface::Plan plan; bool planning_success = (bothArmsMoveGroup-&gt;plan(plan) == moveit::planning_interface::MoveItErrorCode::SUCCESS); if (planning_success) { RCLCPP_INFO(LOGGER, &quot;Planning succeeded. Executing movement...&quot;); bothArmsMoveGroup-&gt;execute(plan); // rightArmMoveGroup-&gt;execute(planRight); RCLCPP_INFO(LOGGER, &quot;Movement executed successfully.&quot;); } else { RCLCPP_ERROR(LOGGER, &quot;Planning failed for both arms&quot;); } </code></pre>
104915
2023-10-23T23:24:23.103
|moveit|robotic-arm|kinematics|ros-humble|move-group|
<p>I am attempting to plan and move two arms using moveit2 at the same time. I have setup a move group that contains both arms and am using the c++ move_group_interface to attempt to plan and execute for both arms.</p> <p>When I try to plan I get the message &quot;No kinematics solver instantiated for group 'both_arms'&quot; (I have tried to add and remove &quot;both_arms&quot; from the kinematics.yaml but neither seems to work?)</p> <p>the rest of the planning and execution &quot;finishes successfully&quot; but I imagine thats because no plan is actually being sent to the move_group.</p> <p>I can successfully plan and move each arm individually but when i attempt to use the combined move group nothing seems to work</p> <p>I am also able to successfully use the rviz tool to plan and execute for both arms at the same time, its specifically with the c++ api that I am unable to.</p> <p>If anyone could point me in the right direction it would be greatly appreciated!</p> <p><strong>c++</strong></p> <pre><code> bothArmsMoveGroup-&gt;setApproximateJointValueTarget(goal_pose, &quot;left_lower_finger_1&quot;); bothArmsMoveGroup-&gt;setApproximateJointValueTarget(goal_pose, &quot;right_lower_finger_1&quot;); // Plan and execute moveit::planning_interface::MoveGroupInterface::Plan plan; bool planning_success = (bothArmsMoveGroup-&gt;plan(plan) == moveit::planning_interface::MoveItErrorCode::SUCCESS); if (planning_success) { RCLCPP_INFO(LOGGER, &quot;Planning succeeded. Executing movement...&quot;); bothArmsMoveGroup-&gt;execute(plan); RCLCPP_INFO(LOGGER, &quot;Movement executed successfully.&quot;); } else { RCLCPP_ERROR(LOGGER, &quot;Planning failed for %s&quot;, planningGroupName.c_str()); } </code></pre> <p><strong>SRDF</strong></p> <pre><code>&lt;!--This does not replace URDF, and is not an extension of URDF. This is a format for representing semantic information about the robot structure. 
A URDF file must exist for this robot as well, where the joints and the links that are referenced are defined --&gt; &lt;robot name=&quot;avatar-assembly-202303&quot;&gt; &lt;!--GROUPS: Representation of a set of joints and links. This can be useful for specifying DOF to plan for, defining arms, end effectors, etc--&gt; &lt;!--LINKS: When a link is specified, the parent joint of that link (if it exists) is automatically included--&gt; &lt;!--JOINTS: When a joint is specified, the child link of that joint (which will always exist) is automatically included--&gt; &lt;!--CHAINS: When a chain is specified, all the links along the chain (including endpoints) are included in the group. Additionally, all the joints that are parents to included links are also included. This means that joints along the chain and the parent joint of the base link are included in the group--&gt; &lt;!--SUBGROUPS: Groups can also be formed by referencing to already defined group names--&gt; &lt;group name=&quot;left_arm&quot;&gt; &lt;joint name=&quot;left_shoulder_joint_1&quot;/&gt; &lt;joint name=&quot;left_shoulder_joint_2&quot;/&gt; &lt;joint name=&quot;left_forearm_joint&quot;/&gt; &lt;joint name=&quot;left_elbow_joint&quot;/&gt; &lt;joint name=&quot;left_wrist_joint&quot;/&gt; &lt;joint name=&quot;left_hand_joint&quot;/&gt; &lt;/group&gt; &lt;group name=&quot;left_gripper&quot;&gt; &lt;link name=&quot;left_finger_1&quot;/&gt; &lt;/group&gt; &lt;group name=&quot;right_arm&quot;&gt; &lt;joint name=&quot;right_shoulder_joint_1&quot;/&gt; &lt;joint name=&quot;right_shoulder_joint_2&quot;/&gt; &lt;joint name=&quot;right_forearm_joint&quot;/&gt; &lt;joint name=&quot;right_elbow_joint&quot;/&gt; &lt;joint name=&quot;right_wrist_joint&quot;/&gt; &lt;joint name=&quot;right_hand_joint&quot;/&gt; &lt;/group&gt; &lt;group name=&quot;right_gripper&quot;&gt; &lt;link name=&quot;right_finger_1&quot;/&gt; &lt;/group&gt; &lt;group name=&quot;both_arms&quot;&gt; &lt;group name=&quot;left_arm&quot;/&gt; 
&lt;group name=&quot;right_arm&quot;/&gt; &lt;/group&gt; &lt;!--GROUP STATES: Purpose: Define a named state for a particular group, in terms of joint values. This is useful to define states like 'folded arms'--&gt; &lt;group_state name=&quot;open&quot; group=&quot;left_gripper&quot;&gt; &lt;joint name=&quot;left_finger_joint&quot; value=&quot;-0.6422&quot;/&gt; &lt;/group_state&gt; &lt;group_state name=&quot;closed&quot; group=&quot;left_gripper&quot;&gt; &lt;joint name=&quot;left_finger_joint&quot; value=&quot;0.3645&quot;/&gt; &lt;/group_state&gt; &lt;group_state name=&quot;open&quot; group=&quot;right_gripper&quot;&gt; &lt;joint name=&quot;right_finger_joint&quot; value=&quot;-0.6422&quot;/&gt; &lt;/group_state&gt; &lt;group_state name=&quot;closed&quot; group=&quot;right_gripper&quot;&gt; &lt;joint name=&quot;right_finger_joint&quot; value=&quot;0.3645&quot;/&gt; &lt;/group_state&gt; &lt;!--END EFFECTOR: Purpose: Represent information about an end effector.--&gt; &lt;end_effector name=&quot;left_gripper&quot; parent_link=&quot;left_lower_finger_1&quot; group=&quot;left_gripper&quot; parent_group=&quot;left_arm&quot;/&gt; &lt;end_effector name=&quot;right_gripper&quot; parent_link=&quot;right_lower_finger_1&quot; group=&quot;right_gripper&quot; parent_group=&quot;right_arm&quot;/&gt; &lt;/robot&gt; </code></pre> <p><strong>Kinematics.yaml</strong> (I've tried both having a the separate kinematics for the combined group and keeping is speerate, either way it doesn't seem to work)</p> <pre><code>left_arm: kinematics_solver: kdl_kinematics_plugin/KDLKinematicsPlugin kinematics_solver_search_resolution: 0.0050000000000000001 kinematics_solver_timeout: 0.0050000000000000001 right_arm: kinematics_solver: kdl_kinematics_plugin/KDLKinematicsPlugin kinematics_solver_search_resolution: 0.0050000000000000001 kinematics_solver_timeout: 0.0050000000000000001 both_arms: kinematics_solver: kdl_kinematics_plugin/KDLKinematicsPlugin kinematics_solver_search_resolution: 
0.0050000000000000001 kinematics_solver_timeout: 0.0050000000000000001 </code></pre> <p><strong>Launch File</strong></p> <pre><code>import os import yaml from launch import LaunchDescription from launch_ros.actions import Node from ament_index_python.packages import get_package_share_directory from launch.substitutions import ( Command, FindExecutable, LaunchConfiguration, PathJoinSubstitution, ) from launch_ros.substitutions import FindPackageShare from launch.actions import DeclareLaunchArgument def load_file(package_name, file_path): package_path = get_package_share_directory(package_name) absolute_file_path = os.path.join(package_path, file_path) try: with open(absolute_file_path, &quot;r&quot;) as file: return file.read() except EnvironmentError: # parent of IOError, OSError *and* WindowsError where available return None def load_yaml(package_name, file_path): package_path = get_package_share_directory(package_name) absolute_file_path = os.path.join(package_path, file_path) try: with open(absolute_file_path, &quot;r&quot;) as file: return yaml.safe_load(file) except EnvironmentError: # parent of IOError, OSError *and* WindowsError where available return None def generate_launch_description(): robot_description_semantic_config = load_file( &quot;Moveit2&quot;, &quot;config/avatar-assembly-202303.srdf&quot; ) robot_description_semantic = { &quot;robot_description_semantic&quot;: robot_description_semantic_config } robot_description_content = Command( [ PathJoinSubstitution([FindExecutable(name=&quot;xacro&quot;)]), &quot; &quot;, PathJoinSubstitution( [ FindPackageShare(&quot;avatar&quot;), &quot;urdf&quot;, &quot;avatar.xacro&quot;, ] ), ] ) robot_description = {&quot;robot_description&quot;: robot_description_content} kinematics_yaml = load_yaml( &quot;Moveit2&quot;, &quot;config/kinematics.yaml&quot; ) planning_plugin = {&quot;planning_plugin&quot;: &quot;chomp_interface/CHOMPPlanner&quot;} sim_arg = DeclareLaunchArgument( 'sim_mode', default_value='false', 
description='Launch in sim_mode if true' ) return LaunchDescription( [ sim_arg, Node( package=&quot;avatar&quot;, executable=&quot;planner_bridge_plugin&quot;, parameters=[ robot_description, robot_description_semantic, kinematics_yaml, planning_plugin, {&quot;use_sim_time&quot;: LaunchConfiguration('sim_mode')}, ], ) ] ) </code></pre>
Simultaneous dual arm movement
<p>Have a look at the <a href="https://control.ros.org/master/doc/ros2_control_demos/example_2/doc/userdoc.html" rel="nofollow noreferrer">diff_drive example</a>, this should answer question 2.</p> <p>About how to write a hardware_component <a href="https://www.youtube.com/watch?v=J02jEKawE5U" rel="nofollow noreferrer">this video</a> could help you with the first steps, or have a look at <a href="https://control.ros.org/master/doc/ros2_control/hardware_interface/doc/writing_new_hardware_component.html" rel="nofollow noreferrer">this step-by-step guide</a>.</p>
104918
2023-10-24T02:35:33.437
|ros2|control|mobile-robot|ros2-control|ros2-controllers|
<p>I am in the process of developing a custom two-wheeled mobile robot with differential control, and I want to integrate it with the ROS 2 ecosystem, specifically leveraging <strong>ros2_control</strong>. I understand that <strong>ros2_control</strong> provides a framework to connect any hardware to ROS 2, but I am having some challenges with where and how to start for specific/custom robot configuration.</p> <p><strong>Robot Details:</strong></p> <ul> <li>Two drive wheels with individual motor controllers.</li> <li>The robot uses differential control for maneuvering.</li> <li>Velocity control mechanism for each wheel.</li> <li>Encoders on each wheel for feedback.</li> </ul> <p><strong>Questions:</strong></p> <ol> <li><p>What are the fundamental steps to create a custom hardware interface for a differential-controlled robot to use with <strong>ros2_control</strong>?</p> </li> <li><p>How can I expose the readings from the wheel encoders to the <strong>joint_states</strong> topic within the ROS 2 ecosystem?</p> </li> </ol> <p>I've already gone through the official <strong>ros2_control</strong> documentation, and understood that I need to use <strong>ros2_control</strong> tags in my robot's URDF to set up the hardware interfaces and that I need to write YAML file to configure controllers. I'd greatly appreciate insights or experiences from those who have tackled the creation of the <strong>hardware interface</strong>, especially in the context of wheeled robots and encoder data integration.</p> <p>Thank you in advance for your guidance!</p>
Creating a Custom Hardware Interface for a Two-Wheeled Mobile Robot for ros2_control?
<p>Be aware that hardware components and controllers are different parts of ros2_control. You are deactivating the controller; this does not deactivate the hardware component.</p> <p>The lifecycle of hardware components is not 100% implemented as of now, and there <a href="https://github.com/ros-controls/ros2_control/issues/1103" rel="nofollow noreferrer">are some issues</a> with the current implementation. Moreover, a controller currently won't be deactivated if you deactivate the hardware component whose interfaces the controller claimed.</p> <p>Have a look at <a href="https://github.com/StoglRobotics-forks/ros2_control_demos/pull/7" rel="nofollow noreferrer">this draft example</a>, which should make things clearer.</p>
104924
2023-10-24T08:58:51.303
|controller-manager|ros2-control|
<p>I have implemented a hardware interface in ros2_control and added code to free my hardware resources in the <code>on_deactivate()</code> method.</p> <p>When I try to stop the controller with the command <code>ros2 control set_controller_state diffbot_base_controller inactive</code> and try to free the hardware, <code>on_deactivate()</code> doesn't seem to be called.</p> <p>I can't find console output from <code>deactivate</code>, and the read/write methods are still being called. The only difference from before is that the value from <code>hw_command_</code> is now 0.</p> <p>It's very confusing. Maybe I'm confused about the lifecycle concept in the <code>controller_manager</code> of ros2_control. I was hoping to get some tips or links to relevant materials.</p>
About Lifecycle, And how to call `on_deactivate()` in `hardware_interface` of ros2_control by controller_manager CLI or other method?
<p>The <code>rclcpp::QoS</code> class was introduced <a href="https://github.com/ros2/rclcpp/pull/713" rel="nofollow noreferrer">here</a>. The main reason is mentioned there: 'Adds an rclcpp version of QoS, which requires the history depth to be specified if using &quot;keep last&quot;'.</p> <blockquote> <p>why do some constructors use rclcpp::Qos and some other use rclcpp::rmw_qos_profile_t</p> </blockquote> <p>A quick look through the <code>rclcpp</code> PRs reveals e.g.:</p> <ul> <li><a href="https://github.com/ros2/rclcpp/pull/1964" rel="nofollow noreferrer">This one</a>, introducing <code>rclcpp::QoS</code> for <code>create_client</code> and</li> <li><a href="https://github.com/ros2/rclcpp/pull/1969" rel="nofollow noreferrer">This one</a> for <code>create_service</code>.</li> </ul> <p>Note that these PRs are a lot more recent than the initial PR introducing <code>rclcpp::QoS</code>.</p> <p>So I think the reason some constructors still take a <code>rmw_qos_profile_t</code> is probably just because nobody has had the time to change those yet.</p> <p>Side note on the question:</p> <p>My usual approach to finding this kind of knowledge is to:</p> <ul> <li>Go to the code on GitHub,</li> <li>Find a relevant line (e.g. a class definition or a specific method),</li> <li>Click on the line number,</li> <li>Choose 'View git blame' and</li> <li>Read through the pull requests that influenced that part of the code.</li> </ul> <p>This generally leads to design info and discussions about design choices.</p> <p>Final remark: <code>rmw_qos_profile_t</code> is not part of <code>rclcpp</code> but of <code>rmw</code>. It is defined <a href="https://github.com/ros2/rmw/blob/83445be486deae8c78d275e092eafb4bf380bd49/rmw/include/rmw/types.h#L574-L619" rel="nofollow noreferrer">here</a>.</p>
104930
2023-10-24T14:17:16.233
|ros2|dds|
<p>Can someone explain to me the difference between the <code>rclcpp::Qos</code> class and the <code>rclcpp::rmw_qos_profile_t</code> class?</p> <p>To me, it seems that the first is just an encapsulation of the latter. And if, so, why do some constructors (see <code>rclcpp::Publisher</code>) use <code>rclcpp::Qos</code> and some other (see <code>rclcpp::message_filter::Subscriber</code>) use <code>rclcpp::rmw_qos_profile_t</code>?</p>
Difference between QoS and rmw_qos_profile
<p>We believe we've figured this out. Here's what we've learned:</p> <h1>What is a Transform?</h1> <p>Consider the diagram below which shows two coordinate frames, source and destination, as well as a vector representing some hypothetical measurement. To get the source frame to align to the destination frame we'd have to rotate counterclockwise by 15 degrees. We'll define positive rotation as clockwise, so the pose of the destination frame in the source frame is negative 15 degrees. If our data vector is defined in the source frame and we want to transform it to the destination frame, we'd rotate it by a positive 15 degrees; the below diagram helps illustrate this transformation. When we refer to a transform we're talking about transforming data, and when we refer to a pose we're talking about the relationship between two coordinate frames. These concepts of transform vs. pose are diametrically opposed, in that the pose is the inverse of the transform. In more concise terms, the transform from the source frame to the destination frame is the inverse of the pose of the destination frame in the source frame.</p> <p><a href="https://i.stack.imgur.com/r33fg.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/r33fg.png" alt="Shows two coordinate frames which are rotated relative to each other and how a data vector appears in both frames." /></a></p> <p>Now let's consider a more complicated example, as illustrated in the below figure, where the frames are not only rotated 15 degrees apart but also offset from each other. In this case, the destination frame is some positive x and y distance away from the source frame, so the pose of the destination frame in the source frame would be negative 15 degrees with a positive translation.
When transforming our data vector though, we're in fact subtracting some x and y offset.</p> <p><a href="https://i.stack.imgur.com/ZAtXi.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/ZAtXi.png" alt="Shows two coordinate frames which are rotated and translated relative to each other and how a data vector appears in both frames." /></a></p> <p>Below is a helpful illustration of how the transform is encoded in a 4x4 homogeneous transform matrix (<a href="https://math.stackexchange.com/questions/1433314/homogeneous-transformation-matrix-how-to-use-it">source</a>). In particular, we use a subset of homogeneous transforms called isometric transforms, which consist only of a rotation and translation (technically an isometry can also be a reflection rather than a rotation, but our implementation only allows them to be rotations), disallowing operations such as scaling and shearing, and also preserving collinearity. More information on isometries can be found <a href="https://en.wikipedia.org/wiki/Isometry" rel="nofollow noreferrer">here</a>. The C++ <a href="https://eigen.tuxfamily.org/index.php?title=Main_Page" rel="nofollow noreferrer">Eigen</a> library is very useful for working with transforms and data, and there's already a library for converting between Eigen and the <code>geometry_msgs/TransformStamped</code>, <a href="https://wiki.ros.org/eigen_conversions" rel="nofollow noreferrer">eigen_conversions</a>.</p> <p><a href="https://i.stack.imgur.com/giBQA.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/giBQA.png" alt="Shows how a homogeneous transform (in particular a 3D isometry) is represented by a 4x4 matrix." /></a></p> <p>When we look up a transform from a source frame to a destination frame, using the <code>lookupTransform</code> API of the tf2 library, we receive the transform needed to transform a point in our source frame to a point in our destination frame.
You might've noticed previously that looking up the transform from source to destination gives the opposite relationship between the two frames compared to what was published to the transform tree. This is because the published transform tree actually contains poses, not transforms! The <code>lookupTransform</code> API then converts these poses into actual transforms by inverting them.</p> <h3>How to Transform Data</h3> <p>Below are examples of how to utilize 3D isometric transforms to modify data as well as poses:</p> <h4>Construct a transform.</h4> <h5>Inputs</h5> <ul> <li><span class="math-container">$R_A^B$</span> ∴ 3x3 rotation matrix, specifying how to rotate data in frame A to frame B.</li> <li><span class="math-container">$t_B^A$</span> ∴ 3x1 translation vector, specifying location of frame A origin relative to frame B origin.</li> </ul> <h5>Output</h5> <p><span class="math-container">$T_A^B$</span> ∴ Transform from frame A to frame B.</p> <h5>Expression</h5> <p><span class="math-container">$T_A^B = \begin{bmatrix} R_A^B &amp; t_B^A \\\ \left&lt;0,0,0\right&gt; &amp; 1 \end{bmatrix}$</span></p> <h4>Construct a pose.</h4> <h5>Inputs</h5> <ul> <li><span class="math-container">$R_A^B$</span> ∴ 3x3 rotation matrix, specifying orientation of frame A relative to frame B.</li> <li><span class="math-container">$t_B^A$</span> ∴ 3x1 position vector, specifying location of frame A origin relative to frame B origin.</li> </ul> <h5>Output</h5> <p><span class="math-container">$P_A^B$</span> ∴ Pose of frame A in frame B.</p> <h5>Expression</h5> <p><span class="math-container">$P_A^B = \begin{bmatrix} R_A^B &amp; t_B^A \\\ \left&lt;0,0,0\right&gt; &amp; 1 \end{bmatrix}$</span></p> <h4>Rotate a 3D vector in frame A to frame B.</h4> <h5>Inputs</h5> <ul> <li><span class="math-container">$R_A^B$</span> ∴ 3x3 rotation matrix, specifying how to rotate data in frame A to frame B.</li> <li><span class="math-container">$\vec{v}_A = \left&lt;{v_x, v_y, v_z}\right&gt;_A$</span> ∴ 3D
vector in frame A.</li> </ul> <h5>Output</h5> <p><span class="math-container">$\vec{v}_B = \left&lt;{v_x, v_y, v_z}\right&gt;_B$</span> ∴ 3D vector in frame B.</p> <h5>Expression</h5> <p><span class="math-container">$\vec{v}_B = R_A^B * \vec{v}_A$</span></p> <h4>Transform a point in frame A to frame B.</h4> <h5>Inputs</h5> <ul> <li><span class="math-container">$T_A^B$</span> ∴ Transform from frame A to frame B.</li> <li><span class="math-container">$\vec{p}_A = \left&lt;{p_x, p_y, p_z, 1}\right&gt;_A$</span> ∴ 3D point in frame A. One is added as the 4th element (making it a homogeneous point) so the translation is applied when we multiply by the transformation matrix.</li> </ul> <h5>Output</h5> <p><span class="math-container">$\vec{p}_B = \left&lt;{p_x, p_y, p_z, 1}\right&gt;_B$</span> ∴ 3D point in frame B.</p> <h5>Expression</h5> <p><span class="math-container">$\vec{p}_B = T_A^B * \vec{p}_A$</span></p> <h4>Composing multiple transforms into a single transform.</h4> <h5>Inputs</h5> <ul> <li><span class="math-container">$T_A^B$</span> ∴ Transform from frame A to frame B.</li> <li><span class="math-container">$T_B^C$</span> ∴ Transform from frame B to frame C.</li> </ul> <h5>Output</h5> <p><span class="math-container">$T_A^C$</span> ∴ Transform from frame A to frame C.</p> <h5>Expression</h5> <p><span class="math-container">$T_A^C = T_B^C * T_A^B$</span></p> <h4>Composing multiple poses into a single pose.</h4> <h5>Inputs</h5> <ul> <li><span class="math-container">$P_B^A$</span> ∴ Pose of frame B in frame A.</li> <li><span class="math-container">$P_C^B$</span> ∴ Pose of frame C in frame B.</li> </ul> <h5>Output</h5> <ul> <li><span class="math-container">$P_C^A$</span> ∴ Pose of frame C in frame A.</li> </ul> <h5>Expression</h5> <p><span class="math-container">$P_C^A = P_B^A * P_C^B$</span></p> <h4>Inverting an Isometry</h4> <h5>Input</h5> <p><span class="math-container">$I = \begin{bmatrix} R &amp; t \\\ \left&lt;0,0,0\right&gt; &amp; 1 \end{bmatrix}$</span> ∴ Isometric transform or pose to invert.</p> <h5>Output</h5>
<p><span class="math-container">$I^{-1}$</span> ∴ Inverted isometric transform or pose.</p> <h5>Expression</h5> <p><span class="math-container">$I^{-1} = \begin{bmatrix} R^{-1} &amp; t^{-1} \\\ \left&lt;0,0,0\right&gt; &amp; 1 \end{bmatrix}$</span>, where</p> <ul> <li><span class="math-container">$R^{-1} = transpose(R)$</span></li> <li><span class="math-container">$t^{-1} = -R^{-1} * t$</span></li> </ul> <h4>Identities</h4> <ul> <li><span class="math-container">$T_A^B = P_A^B = inverse(T_B^A) = inverse(P_B^A)$</span></li> <li><span class="math-container">$T_B^A = P_B^A = inverse(T_A^B) = inverse(P_A^B)$</span></li> </ul>
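As a sanity check, the construction, inversion, and point-transform expressions above can be exercised in a few lines of plain Python (illustration only: no ROS, tf2, or Eigen involved, and the 15-degree yaw and 2 m x offset are made-up values):

```python
import math

# Minimal sketch of the 4x4 isometry operations described above,
# using nested lists instead of Eigen. Frame names A/B are illustrative.

def matmul(a, b):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def isometry(yaw, t):
    """Build a 4x4 isometry from a z-axis rotation (rad) and a translation."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [[c, -s, 0.0, t[0]],
            [s,  c, 0.0, t[1]],
            [0.0, 0.0, 1.0, t[2]],
            [0.0, 0.0, 0.0, 1.0]]

def invert(T):
    """Invert an isometry: R^-1 = transpose(R), t^-1 = -R^-1 * t."""
    R = [row[:3] for row in T[:3]]
    Rt = [[R[j][i] for j in range(3)] for i in range(3)]
    t = [T[i][3] for i in range(3)]
    ti = [-sum(Rt[i][j] * t[j] for j in range(3)) for i in range(3)]
    return [Rt[0] + [ti[0]], Rt[1] + [ti[1]], Rt[2] + [ti[2]],
            [0.0, 0.0, 0.0, 1.0]]

def apply_tf(T, p):
    """Transform a 3D point (homogeneous coordinate 1 appended)."""
    v = p + [1.0]
    return [sum(T[i][j] * v[j] for j in range(4)) for i in range(4)][:3]

# T_A_B: transform from frame A to frame B (15 deg yaw, 2 m x offset).
T_A_B = isometry(math.radians(15.0), [2.0, 0.0, 0.0])
p_A = [1.0, 0.0, 0.0]
p_B = apply_tf(T_A_B, p_A)

# Round-tripping through the inverse recovers the original point.
p_back = apply_tf(invert(T_A_B), p_B)
```

The `invert` helper uses exactly the $R^{-1} = transpose(R)$ and $t^{-1} = -R^{-1} * t$ expressions above, and composing an isometry with its inverse yields the identity matrix.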
104935
2023-10-24T19:31:50.700
|ros|transform|tf2|
<p>According to <a href="https://math.stackexchange.com/questions/1433314/homogeneous-transformation-matrix-how-to-use-it">this</a> definition of a homogeneous transformation matrix (pictured below), a transform consists of a rotation from source to destination and the location of the source origin relative to the destination origin. So, for example, if I want to transform a point from frame A to frame B, where B is 2 meters ahead of A, my transformation matrix would have an x translation of -2 (using ROS coordinate conventions).</p> <p>To add this transform to my transform tree I'd expect to have to publish a TFMessage, with parent A and child B with a translation of -2. However, when I do this and use <code>tf2_ros::Buffer::lookupTransform</code> with the target frame as B and the source frame as A my transform contains a +2 x-translation. Additionally, this relationship between A &amp; B, with B being 2 meters ahead of A, is only visualized correctly in RVIZ if my published transform contains a +2 translation. This leads me to believe that the homogeneous transform matrix is being represented in tf2 as I expect, with the negative translation, but in ROS it is being represented in the opposite way.</p> <p>What exactly is the difference in transformation representation between ROS and tf2? Do the standard built-in tools all interpret transformations in the same way?</p> <p><a href="https://i.stack.imgur.com/mLYqp.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/mLYqp.png" alt="Definition of Homogeneous transformation matrix. Source : https://math.stackexchange.com/questions/1433314/homogeneous-transformation-matrix-how-to-use-it" /></a></p>
Inconsistencies in Transform Definition between tf2 and ROS?
<p>Welcome to Robotics Stack Exchange!</p> <h3>Problems</h3> <p>There are the following issues with your code:</p> <ol> <li><code>NodeHandle</code> is initialized before <code>ros::init()</code>, which is unacceptable.</li> <li><code>tf::TransformBroadcaster</code> is created inside a callback. Initialization should be done outside the callback; the callback should only be used for repetitive tasks.</li> <li>Similarly, <code>ros::Publisher</code> is created inside a callback, which should be avoided.</li> </ol> <h3>Solution</h3> <p>Simply define a class to encapsulate all the functionality. The <code>tf::TransformBroadcaster</code> and <code>ros::Publisher</code> can be initialized as class members.</p> <h3>Example</h3> <p>I have created a demo as an example. Please see below:</p> <pre><code>#include &lt;ros/ros.h&gt;
#include &lt;std_msgs/String.h&gt;
#include &lt;geometry_msgs/PoseStamped.h&gt;

class MyClass
{
private:
  ros::NodeHandle nh_rel;
  ros::Publisher pose_pub;
  ros::Subscriber string_sub;
  size_t counter;

public:
  MyClass();
  void stringCallback(const std_msgs::String::ConstPtr &amp;msg);
};

MyClass::MyClass()
{
  counter = 0;
  nh_rel = ros::NodeHandle(&quot;~&quot;);
  pose_pub = nh_rel.advertise&lt;geometry_msgs::PoseStamped&gt;(&quot;estimated_pose&quot;, 10);
  string_sub = nh_rel.subscribe&lt;std_msgs::String&gt;(&quot;/string/data_raw&quot;, 10, &amp;MyClass::stringCallback, this);
}

void MyClass::stringCallback(const std_msgs::String::ConstPtr &amp;msg)
{
  ROS_INFO_STREAM(&quot;Received message:&quot; &lt;&lt; msg-&gt;data &lt;&lt; &quot; counter is:&quot; &lt;&lt; counter);
  geometry_msgs::PoseStamped pose_msg;
  pose_msg.pose.position.x = counter++;
  pose_msg.pose.position.y = counter++;
  pose_msg.pose.position.z = counter++;
  pose_pub.publish(pose_msg);
}

int main(int argc, char **argv)
{
  ros::init(argc, argv, &quot;my_node&quot;);
  MyClass my_class;
  ros::spin();
  return 0;
}
</code></pre> <p>The <code>NodeHandle</code> is initialized after <code>ros::init</code> inside the class
and then publishers, subscribers, etc. are created. Pay attention to how and where they are initialized. The callback only publishes the data.</p> <h3>How to run</h3> <ol> <li><p>Start roscore with <code>$ roscore</code></p> </li> <li><p>Publish a dummy string: <code>$ rostopic pub -r 10 /string/data_raw std_msgs/String hello</code></p> </li> <li><p>Invoke the above node:</p> <pre><code>$ rosrun my_pkg my_node
[ INFO] [1698214701.051547495]: Received message:hello counter is:0
[ INFO] [1698214701.151819953]: Received message:hello counter is:3
[ INFO] [1698214701.251777275]: Received message:hello counter is:6
[ INFO] [1698214701.351677097]: Received message:hello counter is:9
[ INFO] [1698214701.451825040]: Received message:hello counter is:12
</code></pre> </li> <li><p>Verify that the data is published:</p> <pre><code>$ rostopic echo /my_node/estimated_pose
header:
  seq: 187
  stamp:
    secs: 0
    nsecs: 0
  frame_id: ''
pose:
  position:
    x: 561.0
    y: 562.0
    z: 563.0
  orientation:
    x: 0.0
    y: 0.0
    z: 0.0
    w: 0.0
---
</code></pre> </li> </ol> <p>I hope you can modify the above code to meet your requirements.</p>
104941
2023-10-25T05:35:10.213
|c++|nodehandle|
<p>Raspberry Pi 4, Lubuntu20.04, ROS1-noetic</p> <p>Please help me because I am getting an error with <strong>NodeHandle</strong>.</p> <p>My program is below (the Japanese comments have been translated into English)</p> <p>The <strong>publish node</strong> will be as follows <a href="https://github.com/rt-net/rt_usb_9axisimu_driver/tree/master" rel="nofollow noreferrer">https://github.com/rt-net/rt_usb_9axisimu_driver/tree/master</a></p> <pre><code>#include &lt;ros/ros.h&gt;
#include &lt;sensor_msgs/Imu.h&gt;
#include &lt;geometry_msgs/PoseStamped.h&gt;
#include &lt;tf/transform_broadcaster.h&gt;
#include &lt;tf/transform_datatypes.h&gt;

ros::NodeHandle nh;

double gravity = 10.1347; // gravitational acceleration

void imuCallback(const sensor_msgs::Imu::ConstPtr&amp; imu_msg)
{
  // Get the quaternion from the IMU data
  tf::Quaternion orientation;
  tf::quaternionMsgToTF(imu_msg-&gt;orientation, orientation);

  // Get Euler angles from the quaternion
  double roll, pitch, yaw;
  tf::Matrix3x3(orientation).getRPY(roll, pitch, yaw);

  // Compute the pose (simplified here)
  double x = 0.0;
  double y = 0.0;
  double z = 0.0;

  // Fill in the pose estimate message
  geometry_msgs::PoseStamped pose_msg;
  pose_msg.header = imu_msg-&gt;header;
  pose_msg.pose.position.x = x;
  pose_msg.pose.position.y = y;
  pose_msg.pose.position.z = z;

  // Broadcast the orientation from the quaternion
  static tf::TransformBroadcaster broadcaster;
  tf::Transform transform;
  transform.setOrigin(tf::Vector3(x, y, z));
  transform.setRotation(orientation);
  broadcaster.sendTransform(tf::StampedTransform(transform, imu_msg-&gt;header.stamp, &quot;imu_link&quot;, &quot;base_link&quot;));

  // Publish the pose estimate
  static ros::Publisher pose_pub = nh.advertise&lt;geometry_msgs::PoseStamped&gt;(&quot;estimated_pose&quot;, 10);
  pose_pub.publish(pose_msg);
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, &quot;self_localization_node&quot;);
  ros::Subscriber imu_sub = nh.subscribe&lt;sensor_msgs::Imu&gt;(&quot;/imu/data_raw&quot;, 10, imuCallback);
  ros::spin();
  return 0;
}
</code></pre> <p>When I run this code I get the following error</p> <pre><code>[FATAL]
[1698211644.979626372]: You must call ros::init() before creating the first NodeHandle [ERROR] [1698211645.036590112]: [registerPublisher] Failed to contact master at [:0]. Retrying... </code></pre> <p>But when I put the <strong>NodeHandle inside the main function</strong>, I get the below error when building</p> <pre><code>/home/ubuntu/catkin_ws/src/rt_usb_9axisimu_driver/src/rt_usb_9axisimu_odo_sub.cpp: In function ‘void imuCallback(const ConstPtr&amp;)’: /home/ubuntu/catkin_ws/src/rt_usb_9axisimu_driver/src/rt_usb_9axisimu_odo_sub.cpp:40:38: error: ‘nh’ was not declared in this scope 40 | static ros::Publisher pose_pub = nh.advertise&lt;geometry_msgs::PoseStamped&gt;(&quot;estimated_pose&quot;, 10); | ^~ /home/ubuntu/catkin_ws/src/rt_usb_9axisimu_driver/src/rt_usb_9axisimu_odo_sub.cpp:40:77: error: expected primary-expression before ‘&gt;’ token 40 | static ros::Publisher pose_pub = nh.advertise&lt;geometry_msgs::PoseStamped&gt;(&quot;estimated_pose&quot;, 10); | ^ make[2]: *** [CMakeFiles/imu_subscriber_odometory.dir/build.make:63: CMakeFiles/imu_subscriber_odometory.dir/src/rt_usb_9axisimu_odo_sub.cpp.o] Error 1 make[1]: *** [CMakeFiles/Makefile2:1848: CMakeFiles/imu_subscriber_odometory.dir/all] Error 2 make: *** [Makefile:141: all] Error 2 </code></pre> <p>Please help me</p>
How to improve NodeHandle build and execution errors
<p>No, only Iron and newer have the plugin-based navigators -- please upgrade!</p>
104949
2023-10-25T12:47:43.330
|navigation|ros2|plugin|nav2|
<p>Hello ROS 2 Navigators,</p> <p>I am working with the ROS 2 Humble Nav2 stack. There's a tutorial in Nav2 for writing a new Navigator plugin, but I can't find the navigators parameter in the Nav2 Humble version. Is it possible to use my own BT navigator instead of NavigateToPose in Humble?</p>
Using own bt_navigator in Humble
<p>No ready-made GUI environment exists that I know of. Assuming you're on a Linux-based system, you can just mount the X11 socket directory and set your display variable.</p> <p>For example:</p> <p>First, enter the console command</p> <pre><code>xhost +
</code></pre> <p>Start a basic Ubuntu Docker container with the X11 volume mounted and the env variable set:</p> <pre><code>docker run -it -v /tmp/.X11-unix:/tmp/.X11-unix --env DISPLAY=${DISPLAY} ubuntu bash
</code></pre> <p>Now, in the container, install x11-apps (since the base image has no GUI-based tools):</p> <pre><code>apt update &amp;&amp; apt install -y x11-apps
</code></pre> <p>and test using</p> <pre><code>xeyes
</code></pre> <p>You can even do this with GPU support using the NVIDIA Container Toolkit. For Mac and Windows, I believe you just use XQuartz or VcXsrv Windows X Server, respectively.</p> <p>Typically I prefer using Docker Compose for setting these variables and launching/building my containers.</p>
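As a sketch of that Compose approach (hypothetical file: the service name is made up, and the image and paths simply mirror the `docker run` command above rather than a tested setup), an equivalent `docker-compose.yml` could look like:

```yaml
# docker-compose.yml (illustrative)
# Mounts the X11 socket and forwards DISPLAY, mirroring the
# `docker run` flags shown above.
services:
  ros_gui:                      # hypothetical service name
    image: ubuntu
    environment:
      - DISPLAY=${DISPLAY}
    volumes:
      - /tmp/.X11-unix:/tmp/.X11-unix
    stdin_open: true            # equivalent of -i
    tty: true                   # equivalent of -t
    command: bash
```

After `xhost +`, start it with `docker compose up -d` and attach with `docker compose exec ros_gui bash`.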
104960
2023-10-25T20:50:16.617
|docker|ros-noetic|
<p>I can't seem to find a proper answer to this question by reading through Docker documentation.</p> <p>If Docker is touted as a lighter alternative to VMs installed via tools like VirtualBox, does it provide the full Ubuntu GUI development environment for ROS Noetic (for example)?</p> <p>There's an <a href="https://robotics.stackexchange.com/questions/90228/running-ros-and-its-gui-tools-through-a-docker-image">older question on this</a>, but the answers given don't make it seem like Docker provides a GUI dev environment natively. But rather through a bunch of different add-ons.</p> <p>I'm not an advanced ROS developer, so I don't want the hassle of not having the &quot;vanilla&quot; ROS dev environment of a &quot;vanilla&quot; Ubuntu OS with full GUI.</p> <p>If someone can confirm that Docker can't support GUI dev environment natively, then I'd rather stick to using VirtualBox.</p>
Does Docker provide a full Ubuntu GUI development environment like VirtualBox for ROS Noetic
<p>You can disable node log output with the CLI argument <code>--disable-external-lib-logs</code>, e.g.:</p> <pre><code>ros2 run &lt;package&gt; &lt;executable&gt; --ros-args --disable-external-lib-logs
</code></pre> <p>Similar arguments exist for <code>stdout</code> and <code>rosout</code>:</p> <p><code>--disable-stdout-logs</code> <code>--disable-rosout-logs</code></p> <p>When using launch files, the launch script also generates a log file. By default, the node stdout log is also written to that launch log file.</p> <p>This can be disabled with <code>output='screen'</code> in the launch file, e.g.:</p> <pre><code>#!/usr/bin/env python3
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    return LaunchDescription([
        Node(
            package='package_name',
            executable='executable_name',
            output='screen',
            arguments=[&quot;--ros-args&quot;, &quot;--disable-external-lib-logs&quot;]
        ),
    ])
</code></pre> <p>However, even then a folder and a launch log file are still created in <code>~/.ros/log</code>.</p> <p>I found <a href="https://answers.ros.org/question/371795/ros2-disable-logging-to-file/" rel="nofollow noreferrer">this related question</a> on the ROS Answers site.</p> <p>I tried the answer by <code>WhatTheActual01</code>, but I didn't manage to prevent log file generation that way.</p> <p>The answer by <code>hsaito</code> works fine though:</p> <ul> <li>Clone and compile <a href="https://github.com/xrgtn/nullfs" rel="nofollow noreferrer">nullfs</a>: <pre><code>sudo apt update
sudo apt install -y git libfuse-dev build-essential
git clone https://github.com/xrgtn/nullfs
cd nullfs/
make nullfs
</code></pre> </li> <li>Make a directory for, and run, nullfs: <pre><code>mkdir /tmp/empty_log_dir
./nullfs /tmp/empty_log_dir
</code></pre> </li> <li>Set the environment variable and run ROS 2: <pre><code>export ROS_LOG_DIR=/tmp/empty_log_dir
ros2 launch launch_file.launch.py
</code></pre> </li> </ul>
104972
2023-10-26T06:32:59.700
|ros2|ros-humble|logging|
<p>I read <a href="https://docs.ros.org/en/humble/Concepts/Intermediate/About-Logging.html" rel="nofollow noreferrer">https://docs.ros.org/en/humble/Concepts/Intermediate/About-Logging.html</a> but that didn't really explain how one would go about it. Is it even possible? I am on Ubuntu 22.04 and I'm on a source build ros_humble.</p>
How can I disable all logging to files under ~/.ros but still log to stderr?
<p>There's a guide on the MCAP website <a href="https://mcap.dev/guides/getting-started/ros-2#using-ros2-bag-convert" rel="nofollow noreferrer">here</a>:</p> <pre><code>$ cat &lt;&lt; EOF &gt; convert.yaml output_bags: - uri: ros2_output storage_id: mcap all: true EOF $ ros2 bag convert -i ros2_input.db3 -o convert.yaml </code></pre> <p>If you have the <a href="https://github.com/ros2/rosbag2/tree/humble/rosbag2_storage_mcap" rel="nofollow noreferrer">rosbag2_storage_mcap</a> package installed, this should work out of the box.</p>
104976
2023-10-26T07:39:55.307
|ros2|rosbag|mcap|
<p>I've got some ROS2 bags in SQLite format that I'd like to convert to MCAP format, but I cannot find any examples or tools that do it. I'd have hoped there would be an obvious option for <code>ros2 bag convert</code>, like the <code>--output-options</code> YAML's <code>storage_id:</code>, but it's not obvious to me what to use.</p>
Convert ROS2 SQLite bag to MCAP format
<p>I would start by increasing the y-axis-related values. First, check the velocity and acceleration parameters: is there any change in the planned path? Then I would increase the weight parameters related to the y axis and observe the changes. If changing only the y-axis-related values does not work, I would decrease the x-axis-related values and observe the changes again. It is possible that you have selected values that can generate commands to steer your robot sideways, but that motion is not selected as the optimal output. If none of this works, I would gradually decrease the value of the <code>weight_kinematics_nh</code> parameter down to zero. Reducing the weight on the time optimization (<code>weight_optimaltime</code>) could also help, because x-axis motion could be faster than y-axis motion in your case.</p>
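To make those suggestions concrete, the edits could look like the following fragment (the values are illustrative starting points only, not tested on your robot; change one parameter at a time and observe the planned trajectory after each change):

```yaml
TebLocalPlannerROS:
  # Give the y axis room to move
  max_vel_y: 0.3             # was 0.1
  acc_lim_y: 0.5             # was 0.2
  weight_max_vel_y: 4        # was 2
  weight_acc_lim_y: 2        # was 1
  # Reduce penalties that bias the optimizer toward car-like motion
  weight_kinematics_nh: 0.1  # was 1; lower gradually toward 0
  weight_optimaltime: 0.5    # was 1; reduce time-optimality pressure
```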
104987
2023-10-26T12:08:35.793
|ros-noetic|teb-local-planner|
<p>I'm trying teb_local_planner on omni_dir robot.</p> <p><a href="https://i.stack.imgur.com/TOC42.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/TOC42.png" alt="enter image description here" /></a> <a href="https://youtu.be/AKF6wPZgCa8?si=Y_17OSZsSo7nzhfV" rel="nofollow noreferrer">https://youtu.be/AKF6wPZgCa8?si=Y_17OSZsSo7nzhfV</a></p> <p>However, the robot moves like a car as shown in this video.</p> <p><a href="https://i.stack.imgur.com/TuSAP.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/TuSAP.png" alt="enter image description here" /></a> I want to moving holonomic like this picture.</p> <p>my setting parameter is this.</p> <pre><code>TebLocalPlannerROS: # Trajectory teb_autosize: True dt_ref: 0.3 dt_hysteresis: 0.1 max_samples: 500 global_plan_overwrite_orientation: True allow_init_with_backwards_motion: True max_global_plan_lookahead_dist: 0.5 global_plan_viapoint_sep: -1 global_plan_prune_distance: 1 exact_arc_length: False feasibility_check_no_poses: 5 publish_feedback: False # Robot max_vel_x: 0.4 max_vel_x_backwards: 0.2 max_vel_y: 0.1 max_vel_theta: 0.3 acc_lim_x: 0.5 acc_lim_y: 0.2 acc_lim_theta: 0.5 min_turning_radius: 0.0 # omni-drive robot (can turn on place!) footprint_model: type: &quot;circular&quot; radius: 0.188 # GoalTolerance xy_goal_tolerance: 0.05 yaw_goal_tolerance: 0.15 free_goal_vel: False complete_global_plan: True # Obstacles min_obstacle_dist: 0.25 # This value must also include our robot radius, since footprint_model is set to &quot;point&quot;. 
inflation_dist: 0.6 include_costmap_obstacles: False costmap_obstacles_behind_robot_dist: 1.0 obstacle_poses_affected: 30 costmap_converter_plugin: &quot;&quot; costmap_converter_spin_thread: True costmap_converter_rate: 5 # Optimization no_inner_iterations: 5 no_outer_iterations: 4 optimization_activate: True optimization_verbose: False penalty_epsilon: 0.05 obstacle_cost_exponent: 4 weight_max_vel_x: 2 weight_max_vel_y: 2 weight_max_vel_theta: 1 weight_acc_lim_x: 1 weight_acc_lim_y: 1 weight_acc_lim_theta: 1 weight_kinematics_nh: 1 # WE HAVE A HOLONOMIC ROBOT, JUST ADD A SMALL PENALTY weight_kinematics_forward_drive: 1 weight_kinematics_turning_radius: 1 weight_optimaltime: 1 # must be &gt; 0 weight_shortest_path: 0 weight_obstacle: 50 weight_inflation: 0.2 weight_dynamic_obstacle: 10 weight_dynamic_obstacle_inflation: 0.2 weight_viapoint: 1 weight_adapt_factor: 2 # Homotopy Class Planner enable_homotopy_class_planning: True enable_multithreading: True max_number_classes: 4 selection_cost_hysteresis: 1.0 selection_prefer_initial_plan: 0.9 selection_obst_cost_scale: 1.0 selection_alternative_time_cost: False roadmap_graph_no_samples: 15 roadmap_graph_area_width: 5 roadmap_graph_area_length_scale: 1.0 h_signature_prescaler: 0.5 h_signature_threshold: 0.1 obstacle_heading_threshold: 0.45 switching_blocking_period: 0.0 viapoints_all_candidates: True delete_detours_backwards: True max_ratio_detours_duration_best_duration: 3.0 visualize_hc_graph: False visualize_with_time_as_z_axis_scale: False # Recovery shrink_horizon_backup: True shrink_horizon_min_duration: 10 oscillation_recovery: True oscillation_v_eps: 0.1 oscillation_omega_eps: 0.1 oscillation_recovery_min_duration: 10 oscillation_filter_duration: 10 </code></pre> <p>What settings should I adjust?</p>
teb_local_planner question
<p>What is your fixed frame?</p> <p>I see you are changing the frame_id every time. Without a proper connection between your point cloud frame and the RViz fixed frame, RViz is not able to visualize your message. Please try a constant frame_id like &quot;laser&quot;, and make sure you are viewing it in the &quot;laser&quot; frame or another frame connected to it.</p>
104989
2023-10-26T13:25:34.113
|ros2|c++|pointcloud|
<p>I want to send a dummy PointCloud2 message to be visualized with RVIZ2, but when I try to subscribe to the pointCloud2 message, I received the following error :</p> <p>[INFO] [1698326059.380456033] [rviz]: Message Filter dropping message: frame '625' at time 1698326054,377 for reason 'Unknown'</p> <p>And I can't get rviz2 to show any frame, I populated every field of the pointcloud2 message.</p> <p>The code to generate the pointCloud2 is the following :</p> <pre><code>#include &lt;chrono&gt; #include &lt;memory&gt; #include &quot;rclcpp/rclcpp.hpp&quot; #include &quot;sensor_msgs/msg/point_cloud2.hpp&quot; using std::placeholders::_1; using namespace std::chrono_literals; using namespace std::chrono; class TestPcloud2Pub : public rclcpp::Node { public: TestPcloud2Pub() : Node(&quot;test_pcloud2_pub&quot;) { mPcloudMsg.height = 5; mPcloudMsg.width = 3; mPcloudMsg.is_dense = true; mPcloudMsg.is_bigendian = true; mPcloudMsg.point_step = 4 * 3; mPcloudMsg.row_step = mPcloudMsg.width * 4 * 3; std::vector&lt;sensor_msgs::msg::PointField&gt; pointFields; sensor_msgs::msg::PointField pointFieldX; pointFieldX.name = 'x'; pointFieldX.offset = 0; pointFieldX.datatype = 7; // float 32 pointFieldX.count = 1; sensor_msgs::msg::PointField pointFieldY; pointFieldY.name = 'y'; pointFieldY.offset = 4; pointFieldY.datatype = 7; // float 32 pointFieldY.count = 1; sensor_msgs::msg::PointField pointFieldZ; pointFieldZ.name = 'z'; pointFieldZ.offset = 8; pointFieldZ.datatype = 7; // float 32 pointFieldZ.count = 1; pointFields.push_back(pointFieldX); pointFields.push_back(pointFieldY); pointFields.push_back(pointFieldZ); mPcloudMsg.fields = pointFields; mPublisher = this-&gt;create_publisher&lt;sensor_msgs::msg::PointCloud2&gt;(&quot;test_pcloud&quot;, 10); mTimer = this-&gt;create_wall_timer(500ms, std::bind(&amp;TestPcloud2Pub::timer_callback, this)); } private: void timer_callback() { mPcloudMsg.header.frame_id = std::to_string(mFrameCounter); auto tp = system_clock::now(); auto 
tpSec = time_point_cast&lt;seconds&gt;(tp); nanoseconds ns = tp - tpSec; uint32_t timeSec = static_cast&lt;uint32_t&gt;(tpSec.time_since_epoch().count()); uint32_t timeNsec = static_cast&lt;uint32_t&gt;(ns.count()); mPcloudMsg.header.stamp.sec = timeSec; mPcloudMsg.header.stamp.nanosec = timeNsec; mPcloudMsg.data.clear(); // mPcloudMsg.data.resize(mPcloudMsg.height * mPcloudMsg.width * 4 * 3); for (int i = 0; i &lt; mPcloudMsg.width; ++i) { for (int j = 0; j &lt; mPcloudMsg.height; ++j) { float tempX = 9.5; uint8_t* x = reinterpret_cast&lt;uint8_t*&gt;(&amp;tempX); float tempY = i - 1.; uint8_t* y = reinterpret_cast&lt;uint8_t*&gt;(&amp;tempY); float tempZ = j; uint8_t* z = reinterpret_cast&lt;uint8_t*&gt;(&amp;tempZ); mPcloudMsg.data.insert(mPcloudMsg.data.end(), x, x + 4); mPcloudMsg.data.insert(mPcloudMsg.data.end(), y, y + 4); mPcloudMsg.data.insert(mPcloudMsg.data.end(), z, z + 4); } } /*std::cout &lt;&lt; &quot;pcloud size : &quot; &lt;&lt; mPcloudMsg.data.size() &lt;&lt; std::endl; std::cout &lt;&lt; &quot;Pcloud data : &quot; &lt;&lt; std::endl; for (int i = 0; i &lt; mPcloudMsg.data.size(); ++i) { std::cout &lt;&lt; std::hex &lt;&lt; (int)mPcloudMsg.data[i]; if (i % 4 == 0) std::cout &lt;&lt; &quot; &quot;; } std::cout &lt;&lt; std::endl;*/ mFrameCounter++; mPublisher-&gt;publish(mPcloudMsg); } uint64_t mFrameCounter = 0; rclcpp::TimerBase::SharedPtr mTimer; sensor_msgs::msg::PointCloud2 mPcloudMsg; rclcpp::Publisher&lt;sensor_msgs::msg::PointCloud2&gt;::SharedPtr mPublisher; }; int main(int argc, char* argv[]) { rclcpp::init(argc, argv); rclcpp::spin(std::make_shared&lt;TestPcloud2Pub&gt;()); rclcpp::shutdown(); return 0; } </code></pre> <p>I displayed the content of the message with</p> <pre><code>ros2 topic echo /test_pcloud </code></pre> <p>and the content seems fine to me.</p> <p>Does anyone have any clue on the subject? How do I do to debug it when the error outputed i 'Unknown' ?</p> <p>Thansk for your help !</p>
ROS 2 - PointCloud2 message drop when trying to display in RVIZ2
<p>I managed to achieve better results with <a href="https://docs.opencv.org/3.4/db/df6/tutorial_erosion_dilatation.html" rel="nofollow noreferrer">cv::dilate</a>.</p> <p>As far as I have tested, I haven't lost any performance, and it is really easy to implement.</p>
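To see why this works: dilation grows every set pixel into its neighborhood, which is what closes the gaps between projected points. Below is a plain-Python toy version of binary dilation with a 3x3 kernel (illustration only; the real node would call cv::dilate on the projection mask, which does the same thing far more efficiently):

```python
# Plain-Python sketch of binary dilation, the operation cv::dilate performs.
# The toy grid stands in for the image mask of projected point cloud pixels.

def dilate(grid, iterations=1):
    """Dilate a binary grid with a 3x3 square structuring element."""
    h, w = len(grid), len(grid[0])
    for _ in range(iterations):
        out = [[0] * w for _ in range(h)]
        for y in range(h):
            for x in range(w):
                # A pixel is set if any pixel in its 3x3 neighbourhood was set.
                out[y][x] = int(any(
                    grid[ny][nx]
                    for ny in range(max(0, y - 1), min(h, y + 2))
                    for nx in range(max(0, x - 1), min(w, x + 2))))
        grid = out
    return grid

# Two isolated "projected points" on a 5x5 image.
sparse = [[0, 0, 0, 0, 0],
          [0, 1, 0, 0, 0],
          [0, 0, 0, 1, 0],
          [0, 0, 0, 0, 0],
          [0, 0, 0, 0, 0]]

# After one pass the two isolated pixels grow into a connected area.
filled = dilate(sparse)
```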
105004
2023-10-27T06:35:59.760
|c++|opencv|pointcloud|
<p>I am projecting a 3D point cloud onto a 2D image arriving as ROS 2 messages. Now I want to fill the gaps between the pixels from the point cloud, so that I can see an area instead of single pixels, and publish the overlay (point cloud + camera) image. I found <a href="https://stackoverflow.com/questions/76688703/algorithm-for-overlaying-point-cloud-on-image">here</a> a similar question, but unfortunately it hasn't been answered.</p> <p>What I am trying to achieve is:</p> <p>before <a href="https://i.stack.imgur.com/KoG6J.jpg" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/KoG6J.jpg" alt="enter image description here" /></a></p> <p>after <a href="https://i.stack.imgur.com/iKqJh.jpg" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/iKqJh.jpg" alt="enter image description here" /></a></p> <p>I first tried a Voronoi diagram, but I haven't succeeded. My node is written in C++ using OpenCV for image processing.</p> <p>Does anyone have an idea how I can achieve this?</p>
colorize area of point cloud on a 2D image
<p>Open your Robot node and check the field &quot;name&quot;. That name <strong>must match exactly</strong> the name of the robot in your launch file.</p> <p>For instance, if the name in your robot node is &quot;Zachariah&quot;, then your launch file will look like:</p> <pre><code>my_robot_driver = Node( package='webots_ros2_driver', executable='driver', output='screen', additional_env={'WEBOTS_CONTROLLER_URL': controller_url_prefix() + 'Zachariah'}, parameters=[ {'robot_description': robot_description}, ] ) </code></pre>
105009
2023-10-27T10:22:10.677
|ros2|webots|
<p>I am using ROS Foxy on Ubuntu 20.04 installed using WSL 2. I have also installed the R2023b version of Webots on Windows (the host). I followed the ROS Foxy instructions to set up webots-ros2. However, when I tried running the example at <a href="https://docs.ros.org/en/foxy/Tutorials/Advanced/Simulators/Webots/Setting-Up-Simulation-Webots-Basic.html" rel="nofollow noreferrer">https://docs.ros.org/en/foxy/Tutorials/Advanced/Simulators/Webots/Setting-Up-Simulation-Webots-Basic.html</a> I got the errors below:</p> <pre><code>WARNING: Parquetry &quot;PBRAppearance&quot; &gt; ImageTexture : Texture image size of 'https://raw.githubusercontent.com/cyberbotics/webots/R2022b/projects/appearances/protos/textures/parquetry/chequered_parquetry_base_color.jpg' is not a power of two: rescaling it from -1x-1 to 0x0. WARNING: Parquetry &quot;PBRAppearance&quot; &gt; ImageTexture : Cannot load texture 'https://raw.githubusercontent.com/cyberbotics/webots/R2022b/projects/appearances/protos/textures/parquetry/chequered_parquetry_base_color.jpg': Unsupported image format. WARNING: Parquetry &quot;PBRAppearance&quot; &gt; ImageTexture : Texture image size of 'https://raw.githubusercontent.com/cyberbotics/webots/R2022b/projects/appearances/protos/textures/parquetry/chequered_parquetry_normal.jpg' is not a power of two: rescaling it from -1x-1 to 0x0. WARNING: Parquetry &quot;PBRAppearance&quot; &gt; ImageTexture : Cannot load texture 'https://raw.githubusercontent.com/cyberbotics/webots/R2022b/projects/appearances/protos/textures/parquetry/chequered_parquetry_normal.jpg': Unsupported image format. WARNING: Parquetry &quot;PBRAppearance&quot; &gt; ImageTexture : Texture image size of 'https://raw.githubusercontent.com/cyberbotics/webots/R2022b/projects/appearances/protos/textures/parquetry/chequered_parquetry_occlusion.jpg' is not a power of two: rescaling it from -1x-1 to 0x0. 
WARNING: Parquetry &quot;PBRAppearance&quot; &gt; ImageTexture : Cannot load texture 'https://raw.githubusercontent.com/cyberbotics/webots/R2022b/projects/appearances/protos/textures/parquetry/chequered_parquetry_occlusion.jpg': Unsupported image format. INFO: 'my_robot' extern controller: Waiting for local or remote connection on port 1234 targeting robot named 'my_robot'. INFO: 'my_robot' extern controller: connected. INFO: 'my_robot' extern controller: disconnected, waiting for new connection. </code></pre> <p>I need help on how to resolve this.</p> <p>Thank you in advance</p>
INFO: 'my_robot' extern controller: Waiting for local or remote connection on port 1234 targeting robot named 'my_robot'
<p><code>odom_frame</code> is set to odom even though there is no odom frame. The same goes for map. I think that needs to be fixed.</p> <pre><code>map_frame: map # Usually map. odom_frame: odom # Usually odom. base_link_frame: base_link # Frame name of the robot's base link. world_frame: odom # Usually odom. </code></pre> <p>Changing it as follows might make a difference:</p> <pre><code>odom_frame: imu_link base_link_frame: base_link # Frame name of the robot's base link. world_frame: imu_link # Usually odom. </code></pre> <p>Also, in the first place, robot_localization (a Kalman filter) may not really be intended to run with only a single input source.</p>
105010
2023-10-27T10:26:38.050
|robot-localization|raspberry-pi|ros-noetic|ekf-localization-node|ekf-localization|
<p>Version: Raspberry Pi 4, Lubuntu 20.04, ROS1 Noetic</p> <p>I am trying to estimate the position using <strong>robot_localization</strong> from the IMU. However, I am having trouble because the topic <strong>/odometry/filtered is empty.</strong></p> <p><strong>/imu/data_raw</strong></p> <pre><code>header: seq: 5143 stamp: secs: 1698400774 nsecs: 353708923 frame_id: &quot;imu_link&quot; orientation: x: 0.0 y: 0.0 z: 0.0 w: 0.0 orientation_covariance: [-1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0] angular_velocity: x: 0.0 y: 0.010641999542713165 z: 0.0021279999054968357 angular_velocity_covariance: [1.1280564100000001e-06, 0.0, 0.0, 0.0, 1.1280564100000001e-06, 0.0, 0.0, 0.0, 1.1280564100000001e-06] linear_acceleration: x: 0.8523352064937353 y: 1.101335832411796 z: -10.1753605627954 linear_acceleration_covariance: [0.0005356910249999999, 0.0, 0.0, 0.0, 0.0005356910249999999, 0.0, 0.0, 0.0, 0.0005356910249999999] </code></pre> <p><strong>/odometry/filtered</strong></p> <pre><code>ubuntu@ubiquityrobot:~$ rostopic echo /odometory/filtered WARNING: topic [/odometory/filtered] does not appear to be published yet </code></pre> <p><strong>ekf_template.launch</strong></p> <pre><code>&lt;launch&gt; &lt;node pkg=&quot;robot_localization&quot; type=&quot;ekf_localization_node&quot; name=&quot;ekf_se&quot; clear_params=&quot;true&quot;&gt; &lt;rosparam command=&quot;load&quot; file=&quot;$(find robot_localization)/params/imu_ekf_template.yaml&quot; /&gt; &lt;/node&gt; &lt;/launch&gt; </code></pre> <p><strong>imu_ekf_template.yaml</strong></p> <pre><code>frequency: 10 # Set to an appropriate update frequency (usually the publish rate of the sensor data). sensor_timeout: 0.5 # Timeout for sensor data. two_d_mode: false # Set to false for 3D pose estimation. transform_time_offset: 0.5 transform_timeout: 0.5 print_diagnostics: true debug: false debug_out_file: /path/to/debug/file.txt # Optional: file path for writing debug output. publish_tf: true # Whether to publish tf. publish_acceleration: true map_frame: map # Usually map. odom_frame: odom # Usually odom. 
base_link_frame: base_link # Frame name of the robot's base link. world_frame: odom # Usually odom. imu0: /imu/data_raw imu0_config: [false, false, false, # x, y, z position true, true, true, # x, y, z angular velocity true, true, true, # x, y, z acceleration false, false, false, # x, y, z velocity false, false, false] # gyro bias imu0_queue_size: 10 imu0_nodelay: false imu0_differential: false imu0_relative: true imu0_pose_rejection_threshold: 0.8 # Set to an appropriate value. imu0_twist_rejection_threshold: 0.8 # Set to an appropriate value. imu0_linear_acceleration_rejection_threshold: 0.8 # Set to an appropriate value. imu0_remove_gravitational_acceleration: true </code></pre> <p><strong>ekf_localization_node.cpp</strong></p> <pre><code>#include &quot;robot_localization/ros_filter_types.h&quot; #include &lt;cstdlib&gt; #include &lt;ros/ros.h&gt; int main(int argc, char **argv) { ros::init(argc, argv, &quot;ekf_navigation_node&quot;); ros::NodeHandle nh; ros::NodeHandle nh_priv(&quot;~&quot;); RobotLocalization::RosEkf ekf(nh, nh_priv); ekf.initialize(); ros::spin(); return EXIT_SUCCESS; } </code></pre>
Robot localization /odometry/filtered is not displayed
<p>You can tune the controller to more closely follow the original path using the <code>PathAlignCritic</code> critic. The <code>PathFollowCritic</code> is the critic that drives the robot forward towards the path, but not specifically aligning to it more than general following. The Alignment critic controls the weight on alignment.</p>
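As an illustration only (the numbers below are starting points relative to the config in the question, not tuned values for your robot), increasing the alignment weight relative to the follow weight could look like:

```yaml
PathAlignCritic:
  enabled: true
  cost_power: 1
  cost_weight: 20.0          # raised from 14.0 to pull trajectories onto the path
  use_path_orientations: true
  threshold_to_consider: 0.5
PathFollowCritic:
  enabled: true
  cost_power: 1
  cost_weight: 14.0          # slightly reduced from 17.0
```

Then re-test the behaviour on the bend before resorting to <code>prune_distance</code>.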
105016
2023-10-27T14:49:15.697
|ros2|nav2|
<p>I would like to use NavThroughPoses BT to navigate my robot with the MPPI controller. My problem is that the controller doesn't follow the path correctly. When the robot comes close to an intermediate point, which is on a bend, it doesn't go to the point, but cuts off the path.</p> <p><a href="https://i.stack.imgur.com/PgDKX.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/PgDKX.png" alt="enter image description here" /></a></p> <p>I tried to tune the MPPI parameters, mostly the PathFollowCritic values, but the result was no better. The only parameter that made the robot follow the path better was to reduce the value of prune_distance, but I don't think this is a good solution. My question is, which parameters should be set correctly to achieve the desired result?</p> <p>My parameter settings are:</p> <pre><code>controller_server: ros__parameters: use_sim_time: True controller_frequency: 16.0 min_x_velocity_threshold: 0.001 min_theta_velocity_threshold: 0.001 failure_tolerance: 0.3 progress_checker_plugin: &quot;progress_checker&quot; goal_checker_plugins: [&quot;general_goal_checker&quot;] controller_plugins: [&quot;FollowPath&quot;] progress_checker: plugin: &quot;nav2_controller::SimpleProgressChecker&quot; required_movement_radius: 0.25 movement_time_allowance: 10.0 general_goal_checker: stateful: True plugin: &quot;nav2_controller::SimpleGoalChecker&quot; xy_goal_tolerance: 0.05 yaw_goal_tolerance: 0.05 FollowPath: plugin: &quot;nav2_mppi_controller::MPPIController&quot; time_steps: 90 model_dt: 0.0625 batch_size: 2000 vx_std: 0.1 vy_std: 0.0 wz_std: 0.1 vx_max: 0.5 vx_min: -0.5 vy_max: 0.0 wz_max: 0.4 iteration_count: 1 prune_distance: 3.0 transform_tolerance: 0.1 temperature: 0.3 gamma: 0.015 motion_model: &quot;DiffDrive&quot; visualize: false reset_period: 1.0 TrajectoryVisualizer: trajectory_step: 5 time_step: 3 max_robot_pose_search_dist: 5.0 enforce_path_inversion: True inversion_xy_tolerance: 0.2 inversion_yaw_tolerance: 0.4 
AckermannConstrains: min_turning_r: 0.5 critics: [ &quot;ConstraintCritic&quot;, &quot;ObstaclesCritic&quot;, &quot;GoalCritic&quot;, &quot;GoalAngleCritic&quot;, &quot;PathAlignCritic&quot;, &quot;PathFollowCritic&quot;, &quot;PathAngleCritic&quot;, &quot;PreferForwardCritic&quot;, ] ConstraintCritic: enabled: true cost_power: 1 cost_weight: 4.0 GoalCritic: enabled: true cost_power: 1 cost_weight: 5.0 threshold_to_consider: 1.4 GoalAngleCritic: enabled: true cost_power: 1 cost_weight: 5.0 threshold_to_consider: 0.5 PreferForwardCritic: enabled: false cost_power: 1 cost_weight: 5.0 threshold_to_consider: 1.4 ObstaclesCritic: enabled: true cost_power: 1 repulsion_weight: 1.5 critical_weight: 20.0 consider_footprint: True collision_cost: 10000.0 collision_margin_distance: 0.1 near_goal_distance: 0.5 inflation_radius: 3.9 cost_scaling_factor: 1.5 PathAlignCritic: enabled: true use_path_orientations: True cost_power: 1 cost_weight: 14.0 max_path_occupancy_ratio: 0.05 trajectory_point_step: 3 threshold_to_consider: 0.5 offset_from_furthest: 13 PathFollowCritic: enabled: true cost_power: 1 cost_weight: 17.0 offset_from_furthest: 5 threshold_to_consider: 0.5 PathAngleCritic: enabled: true cost_power: 1 mode: 2 cost_weight: 4.0 offset_from_furthest: 20 threshold_to_consider: 1.4 max_angle_to_furthest: 0.25 </code></pre>
Path following using MPPI controller
<p>I don't know the RobotCreator, but I can provide some hints on how to build a world for Gazebo:</p> <ul> <li><p>Option 1: install Gazebo Classic and use its <a href="https://classic.gazebosim.org/tutorials?cat=build_world&amp;tut=building_editor" rel="nofollow noreferrer">building editor</a>.</p> <p>Gazebo Classic can be installed side-by-side with Gazebo Sim (i.e. 'New Gazebo') up to Gazebo Sim Fortress. From Garden on, this is no longer possible due to a name clash of the <code>gz</code> executable. But you can still compile from source or use a docker image.</p> </li> <li><p>Option 2: draw your model in a 3D modeler (e.g. Blender or Fusion360), export as <code>.dae</code> file (Collada) and import in Gazebo.</p> </li> </ul>
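If you do end up hand-writing SDF after all, note that a building is mostly static boxes, so a single wall is only a few lines (dimensions and pose below are arbitrary examples):

```xml
<model name="wall_1">
  <static>true</static>
  <pose>0 2 1.25 0 0 0</pose>
  <link name="link">
    <collision name="collision">
      <geometry><box><size>5 0.15 2.5</size></box></geometry>
    </collision>
    <visual name="visual">
      <geometry><box><size>5 0.15 2.5</size></box></geometry>
    </visual>
  </link>
</model>
```

Copy-pasting and re-posing such blocks is roughly what the building editor generates for you under the hood.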
105030
2023-10-28T12:29:05.363
|gazebo|sdformat|
<p>I want to create a world (a school, maybe) to simulate multiple robots navigating in it and trying to create a map for that school (I am using ROS 2 Humble on Ubuntu 22.04).</p> <p>So, the first simulator that came to my mind was Gazebo because it's popular, but when I followed their tutorials to create a world in the <code>sdf</code> format, it was just a tedious process and I think it would take me weeks just to make a simple world.</p> <p>So, I found some random old reddit post talking about using CAD tools like <code>freeCAD</code> to build the world and then export the design to the <code>SDF</code> file format using tools like <a href="https://github.com/maidenone/RobotCreator" rel="nofollow noreferrer">RobotCreator</a>.</p> <p>I followed the instructions on the repo by downloading it and putting the main folder under <code>/usr/bin/freecad/Mod/</code>, but when I try to export the body design in FreeCAD to an SDF file I keep getting the error <code>no module named 'importSDF'</code>. I examined the files of the repo and I think the problem comes from these 2 lines:</p> <pre><code>FreeCAD.addImportType(&quot;SDF (*.sdf)&quot;,&quot;importSDF&quot;) FreeCAD.addExportType(&quot;SDF (*.sdf)&quot;,&quot;importSDF&quot;) </code></pre> <p>which are found in the file called <code>init.py</code>. So is there any solution to my problem, or any other alternatives?</p>
no module named 'importSDF' error in RobotCreator
<p>Yes, obviously a cartesian robot can be simulated in Gazebo.</p> <p>Typically, a cartesian gantry robot has translational motion axes that are parallel to the cartesian reference frame axes, so both the forward and the inverse kinematics are trivial.</p> <p>So if you want more specific answers, you will need to ask more specific questions.</p>
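To make the "trivial" part concrete: for an ideal XYZ gantry each prismatic joint moves along one cartesian axis, so FK and IK reduce to adding or subtracting a fixed base offset (the offset below is an arbitrary example value):

```python
# Ideal gantry kinematics: joint displacements q = (qx, qy, qz) map
# straight to tool coordinates, shifted by where the gantry origin sits.
BASE = (0.25, 0.5, 0.0)  # example offset of the gantry origin in the world frame

def fk(q):
    """Forward kinematics: joint values -> tool position."""
    return tuple(qi + bi for qi, bi in zip(q, BASE))

def ik(p):
    """Inverse kinematics: tool position -> joint values."""
    return tuple(pi - bi for pi, bi in zip(p, BASE))

pose = fk((0.5, 0.5, 0.25))
```

Compare that with a 6-DOF arm, where IK needs an analytic or numeric solver; this is why MoveIt is usually overkill for a pure gantry.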
105032
2023-10-28T13:23:32.827
|gazebo|moveit|robot|cartesian|
<p>I want to simulate a cartesian (gantry) robot in a simulator and solve the forward and inverse kinematics. Is that possible with Gazebo or MoveIt? If so, which one, and how? Suggestions for any other free software are also welcome. Thanks in advance.</p>
simulator for a cartesian(gantry) robot
<p>Remove <code>ros2 pkg prefix --share</code> and type the path directly:</p> <pre><code>C:\dev\ros2_ws&gt;ros2 launch urdf_tutorial display.launch.py model:='src/urdf_tutorial/urdf/01-myfirst.urdf' </code></pre> <p>Pay attention to use the right path to your file <code>01-myfirst.urdf</code>, and be sure you are using plain single quotes <code>''</code> (backtick command substitution does not work in the Windows command prompt).</p>
105041
2023-10-29T03:36:21.550
|ros2|build|errors|urdf-tutorial|ros2-launch|
<p>I have installed ROS 2 (Humble Hawksbill) on Windows 11 and am trying to follow the <a href="https://docs.ros.org/en/humble/Tutorials/Intermediate/URDF/Building-a-Visual-Robot-Model-with-URDF-from-Scratch.html" rel="nofollow noreferrer">URDF tutorial</a>. When I run the following command:</p> <pre><code>ros2 launch urdf_tutorial display.launch.py model:=`ros2 pkg prefix --share urdf_tutorial`/urdf/01-myfirst.urdf </code></pre> <p>I get the error below:<a href="https://i.stack.imgur.com/fgTy1.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/fgTy1.png" alt="enter image description here" /></a></p> <pre><code> C:\dev\ros2_ws&gt;ros2 launch urdf_tutorial display.launch.py model:=`ros2 pkg prefix --share urdf_tutorial`/urdf/01-myfirst.urdf usage: ros2 [-h] [--use-python-default-buffering] Call `ros2 &lt;command&gt; -h` for more detailed usage. ... ros2: error: unrecognized arguments: --share urdf_tutorial`/urdf/01-myfirst.urdf </code></pre> <p>I installed the package from here: <a href="https://github.com/ros/urdf_tutorial" rel="nofollow noreferrer">https://github.com/ros/urdf_tutorial</a> and then copy-pasted it into my overlay src folder &quot;C:/dev/ros2_ws/src&quot;. Any guidance or leads would be really helpful.</p>
display.launch.py --share on ROS 2 tutorials is not working - Windows 11
<p>There is a conceptual issue here: ros2_control hardware components usually don't have a node inside and hence cannot parse ROS parameters, e.g., ones set from a launch file or the CLI.</p> <p>Two possibilities are:</p> <ul> <li>You can add a node spinning in the hardware component, but this might harm realtime constraints, e.g., see this <a href="https://github.com/PickNikRobotics/topic_based_ros2_control" rel="nofollow noreferrer">topic-based</a> component.</li> <li>Add parameters to your URDF and parse them in the hardware component, as <a href="https://github.com/ros-controls/ros2_control_demos/blob/ce7497d894114ed9fd796829caa7b696c112c072/example_1/hardware/rrbot.cpp#L39-L41" rel="nofollow noreferrer">done in the examples</a>. You can then use xacro arguments to change it from your launch file.</li> </ul>
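For the second option, the relevant URDF part could look roughly like this (the plugin name is a placeholder for your own hardware plugin; in <code>on_init()</code> you then read the value via <code>info_.hardware_parameters["port"]</code>, as the linked example does):

```xml
<ros2_control name="hoverboard" type="system">
  <hardware>
    <plugin>hoverboard_driver/HoverboardHardwareInterface</plugin>
    <!-- read in on_init() via info_.hardware_parameters["port"] -->
    <param name="port">/dev/ttyUSB2</param>
  </hardware>
  <!-- joints ... -->
</ros2_control>
```

If you turn the value into a xacro argument (<code>&lt;xacro:arg name="port" default="/dev/ttyUSB0"/&gt;</code> plus <code>$(arg port)</code> in the param), you can pass <code>port:=...</code> to the xacro command that builds <code>robot_description</code> in your launch file.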
105051
2023-10-29T18:50:23.017
|ros-humble|hardware-interface|diff-drive-controller|ros2-control|
<p>I've successfully re-written a HardwareInterface for my diff_drive robot, so I'm now able to communicate with my real robot with ros2_control. (<a href="https://github.com/PaddyCube/hoverboard-driver/tree/ros_2" rel="nofollow noreferrer">https://github.com/PaddyCube/hoverboard-driver/tree/ros_2</a>)</p> <p>Communication is done over a USB port, which is hard-coded in some config.h file. I'm now looking for a way to set the serial port as a parameter of the launch command, something like</p> <p><code>ros2 launch my_robot my_diff_Drive.py port:=&quot;/dev/ttsUSB2&quot;</code></p> <p>This is an easy task for me when I use nodes and XML launch files. As I'm not familiar with Python launch files and don't have a good understanding of ros2_control, I kindly ask some of you to explain how I can achieve this.</p> <p>My launch file currently looks like this:</p> <pre><code># # Licensed under the Apache License, Version 2.0 (the &quot;License&quot;); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an &quot;AS IS&quot; BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. 
from launch import LaunchDescription from launch.actions import DeclareLaunchArgument, RegisterEventHandler from launch.conditions import IfCondition from launch.event_handlers import OnProcessExit from launch.substitutions import Command, FindExecutable, PathJoinSubstitution, LaunchConfiguration from launch_ros.actions import Node from launch_ros.substitutions import FindPackageShare def generate_launch_description(): # Declare arguments declared_arguments = [] declared_arguments.append( DeclareLaunchArgument( &quot;gui&quot;, default_value=&quot;true&quot;, description=&quot;Start RViz2 automatically with this launch file.&quot;, ) ) # Initialize Arguments gui = LaunchConfiguration(&quot;gui&quot;) # Get URDF via xacro robot_description_content = Command( [ PathJoinSubstitution([FindExecutable(name=&quot;xacro&quot;)]), &quot; &quot;, PathJoinSubstitution( [FindPackageShare(&quot;hoverboard_driver&quot;), &quot;urdf&quot;, &quot;diffbot.urdf.xacro&quot;] ), ] ) robot_description = {&quot;robot_description&quot;: robot_description_content} robot_controllers = PathJoinSubstitution( [ FindPackageShare(&quot;hoverboard_driver&quot;), &quot;config&quot;, &quot;hoverboard_controllers.yaml&quot;, ] ) # rviz_config_file = PathJoinSubstitution( # [FindPackageShare(&quot;ros2_control_demo_description&quot;), &quot;diffbot/rviz&quot;, &quot;diffbot.rviz&quot;] # ) control_node = Node( package=&quot;controller_manager&quot;, executable=&quot;ros2_control_node&quot;, parameters=[robot_description, robot_controllers], output=&quot;both&quot;, ) robot_state_pub_node = Node( package=&quot;robot_state_publisher&quot;, executable=&quot;robot_state_publisher&quot;, output=&quot;both&quot;, parameters=[robot_description], remappings=[ (&quot;/diff_drive_controller/cmd_vel_unstamped&quot;, &quot;/cmd_vel&quot;), ], ) #rviz_node = Node( # package=&quot;rviz2&quot;, # executable=&quot;rviz2&quot;, # name=&quot;rviz2&quot;, # output=&quot;log&quot;, # arguments=[&quot;-d&quot;, 
rviz_config_file], # condition=IfCondition(gui), #) joint_state_broadcaster_spawner = Node( package=&quot;controller_manager&quot;, executable=&quot;spawner&quot;, arguments=[&quot;joint_state_broadcaster&quot;, &quot;--controller-manager&quot;, &quot;/controller_manager&quot;], ) robot_controller_spawner = Node( package=&quot;controller_manager&quot;, executable=&quot;spawner&quot;, arguments=[&quot;hoverboard_base_controller&quot;, &quot;--controller-manager&quot;, &quot;/controller_manager&quot;], ) # Delay rviz start after `joint_state_broadcaster` #delay_rviz_after_joint_state_broadcaster_spawner = RegisterEventHandler( # event_handler=OnProcessExit( # target_action=joint_state_broadcaster_spawner, # on_exit=[rviz_node], # ) #) # Delay start of robot_controller after `joint_state_broadcaster` delay_robot_controller_spawner_after_joint_state_broadcaster_spawner = RegisterEventHandler( event_handler=OnProcessExit( target_action=joint_state_broadcaster_spawner, on_exit=[robot_controller_spawner], ) ) nodes = [ control_node, robot_state_pub_node, joint_state_broadcaster_spawner, # delay_rviz_after_joint_state_broadcaster_spawner, delay_robot_controller_spawner_after_joint_state_broadcaster_spawner, ] return LaunchDescription(declared_arguments + nodes) </code></pre> <p>and in the on_activate method, where I want to use the parameter:</p> <pre><code> const rclcpp_lifecycle::State &amp; /*previous_state*/) { ====&gt;&gt;&gt; GET PORT FROM LAUNCH PARAMETER INSTEAD &lt;&lt;&lt;===== port = DEFAULT_PORT; ... if ((port_fd = open(port.c_str(), O_RDWR | O_NOCTTY | O_NDELAY)) &lt; 0) { RCLCPP_FATAL(rclcpp::get_logger(&quot;hoverboard_driver&quot;), &quot;Cannot open serial port to hoverboard&quot;); exit(-1); } </code></pre>
ros2_control want to set parameter from cli, serial port
<p>It seems you are only publishing the odometry message to the topics &quot;odom_data_euler&quot; and &quot;odom_data_quat&quot;, but not actually broadcasting the transform.</p> <p>You will typically use TransformStamped to achieve this. It is almost the same code-wise, but check this link out:</p> <p><a href="https://docs.ros.org/en/humble/Tutorials/Intermediate/Tf2/Writing-A-Tf2-Broadcaster-Cpp.html" rel="nofollow noreferrer">https://docs.ros.org/en/humble/Tutorials/Intermediate/Tf2/Writing-A-Tf2-Broadcaster-Cpp.html</a></p>
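As an untested sketch of what that can look like with the variable names from your node (your <code>orientation.z</code> field stores yaw, so it also has to be converted to a real quaternion here):

```cpp
// Broadcast odom -> base_link each cycle, e.g. at the end of update_odom().
// tf2_ros/transform_broadcaster.h is already included in your file;
// geometry_msgs/TransformStamped.h is needed in addition.
static tf2_ros::TransformBroadcaster odom_broadcaster;  // constructed after ros::init()

geometry_msgs::TransformStamped odom_tf;
odom_tf.header.stamp = odomNew.header.stamp;
odom_tf.header.frame_id = "odom";
odom_tf.child_frame_id = "base_link";
odom_tf.transform.translation.x = odomNew.pose.pose.position.x;
odom_tf.transform.translation.y = odomNew.pose.pose.position.y;
odom_tf.transform.translation.z = 0.0;
tf2::Quaternion q;
q.setRPY(0, 0, odomNew.pose.pose.orientation.z);  // z holds yaw in your code
odom_tf.transform.rotation.x = q.x();
odom_tf.transform.rotation.y = q.y();
odom_tf.transform.rotation.z = q.z();
odom_tf.transform.rotation.w = q.w();
odom_broadcaster.sendTransform(odom_tf);
```

Calling this from <code>update_odom()</code> every cycle should make odom appear as the parent of base_link in your tf tree.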
105054
2023-10-30T04:14:52.057
|ros|navigation|odometry|hector-slam|tf|
<p>I am trying to publish tf and odom. My problem is that I always get this error:</p> <pre><code>[ERROR] [1698638471.209513778]: Transform failed during publishing of map_odom transform: Lookup would require extrapolation 0.567969187s into the future. Requested time 1698638469.812094927 but the latest data is at time 1698638469.244125605, when looking up transform from frame [base_link] to frame [odom] </code></pre> <p>And when I look at my tf tree, odom is not published to base_link as I expected <a href="https://i.stack.imgur.com/m8SNu.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/m8SNu.png" alt="enter image description here" /></a></p> <p>Here's the odom publisher node:</p> <pre><code>#include &quot;ros/ros.h&quot; #include &quot;std_msgs/Int16.h&quot; #include &lt;nav_msgs/Odometry.h&gt; #include &lt;geometry_msgs/PoseStamped.h&gt; #include &lt;tf2/LinearMath/Quaternion.h&gt; #include &lt;tf2_ros/transform_broadcaster.h&gt; #include &lt;cmath&gt; // Create odometry data publishers ros::Publisher odom_data_pub; ros::Publisher odom_data_pub_quat; nav_msgs::Odometry odomNew; nav_msgs::Odometry odomOld; // Initial pose const double initialX = 0.0; const double initialY = 0.0; const double initialTheta = 0.00000000001; const double PI = 3.141592; // Robot physical constants const double TICKS_PER_REVOLUTION = 535; // For reference purposes. 
const double WHEEL_RADIUS = 0.033; // Wheel radius in meters const double WHEEL_BASE = 0.17; // Center of left tire to center of right tire const double TICKS_PER_METER = 3100; // Original was 2800 // Distance both wheels have traveled double distanceLeft = 0; double distanceRight = 0; // Flag to see if initial pose has been received bool initialPoseRecieved = false; using namespace std; // Get initial_2d message from either Rviz clicks or a manual pose publisher void set_initial_2d(const geometry_msgs::PoseStamped &amp;rvizClick) { odomOld.pose.pose.position.x = rvizClick.pose.position.x; odomOld.pose.pose.position.y = rvizClick.pose.position.y; odomOld.pose.pose.orientation.z = rvizClick.pose.orientation.z; initialPoseRecieved = true; } // Calculate the distance the left wheel has traveled since the last cycle void Calc_Left(const std_msgs::Int16&amp; leftCount) { static int lastCountL = 0; if(leftCount.data != 0 &amp;&amp; lastCountL != 0) { int leftTicks = (leftCount.data - lastCountL); if (leftTicks &gt; 10000) { leftTicks = 0 - (65535 - leftTicks); } else if (leftTicks &lt; -10000) { leftTicks = 65535-leftTicks; } else{} distanceLeft = leftTicks/TICKS_PER_METER; } lastCountL = leftCount.data; } // Calculate the distance the right wheel has traveled since the last cycle void Calc_Right(const std_msgs::Int16&amp; rightCount) { static int lastCountR = 0; if(rightCount.data != 0 &amp;&amp; lastCountR != 0) { int rightTicks = rightCount.data - lastCountR; if (rightTicks &gt; 10000) { distanceRight = (0 - (65535 - distanceRight))/TICKS_PER_METER; } else if (rightTicks &lt; -10000) { rightTicks = 65535 - rightTicks; } else{} distanceRight = rightTicks/TICKS_PER_METER; } lastCountR = rightCount.data; } // Publish a nav_msgs::Odometry message in quaternion format void publish_quat() { tf2::Quaternion q; q.setRPY(0, 0, odomNew.pose.pose.orientation.z); nav_msgs::Odometry quatOdom; quatOdom.header.stamp = odomNew.header.stamp; quatOdom.header.frame_id = 
&quot;odom&quot;; quatOdom.child_frame_id = &quot;base_link&quot;; quatOdom.pose.pose.position.x = odomNew.pose.pose.position.x; quatOdom.pose.pose.position.y = odomNew.pose.pose.position.y; quatOdom.pose.pose.position.z = odomNew.pose.pose.position.z; quatOdom.pose.pose.orientation.x = q.x(); quatOdom.pose.pose.orientation.y = q.y(); quatOdom.pose.pose.orientation.z = q.z(); quatOdom.pose.pose.orientation.w = q.w(); quatOdom.twist.twist.linear.x = odomNew.twist.twist.linear.x; quatOdom.twist.twist.linear.y = odomNew.twist.twist.linear.y; quatOdom.twist.twist.linear.z = odomNew.twist.twist.linear.z; quatOdom.twist.twist.angular.x = odomNew.twist.twist.angular.x; quatOdom.twist.twist.angular.y = odomNew.twist.twist.angular.y; quatOdom.twist.twist.angular.z = odomNew.twist.twist.angular.z; for(int i = 0; i&lt;36; i++) { if(i == 0 || i == 7 || i == 14) { quatOdom.pose.covariance[i] = .01; } else if (i == 21 || i == 28 || i== 35) { quatOdom.pose.covariance[i] += 0.1; } else { quatOdom.pose.covariance[i] = 0; } } odom_data_pub_quat.publish(quatOdom); } // Update odometry information void update_odom() { // Calculate the average distance double cycleDistance = (distanceRight + distanceLeft) / 2; // Calculate the number of radians the robot has turned since the last cycle double cycleAngle = asin((distanceRight-distanceLeft)/WHEEL_BASE); // Average angle during the last cycle double avgAngle = cycleAngle/2 + odomOld.pose.pose.orientation.z; if (avgAngle &gt; PI) { avgAngle -= 2*PI; } else if (avgAngle &lt; -PI) { avgAngle += 2*PI; } else{} // Calculate the new pose (x, y, and theta) odomNew.pose.pose.position.x = odomOld.pose.pose.position.x + cos(avgAngle)*cycleDistance; odomNew.pose.pose.position.y = odomOld.pose.pose.position.y + sin(avgAngle)*cycleDistance; odomNew.pose.pose.orientation.z = cycleAngle + odomOld.pose.pose.orientation.z; // Prevent lockup from a single bad cycle if (isnan(odomNew.pose.pose.position.x) || isnan(odomNew.pose.pose.position.y) || 
isnan(odomNew.pose.pose.position.z)) { odomNew.pose.pose.position.x = odomOld.pose.pose.position.x; odomNew.pose.pose.position.y = odomOld.pose.pose.position.y; odomNew.pose.pose.orientation.z = odomOld.pose.pose.orientation.z; } // Make sure theta stays in the correct range if (odomNew.pose.pose.orientation.z &gt; PI) { odomNew.pose.pose.orientation.z -= 2 * PI; } else if (odomNew.pose.pose.orientation.z &lt; -PI) { odomNew.pose.pose.orientation.z += 2 * PI; } else{} // Compute the velocity odomNew.header.stamp = ros::Time::now(); odomNew.twist.twist.linear.x = cycleDistance/(odomNew.header.stamp.toSec() - odomOld.header.stamp.toSec()); odomNew.twist.twist.angular.z = cycleAngle/(odomNew.header.stamp.toSec() - odomOld.header.stamp.toSec()); // Save the pose data for the next cycle odomOld.pose.pose.position.x = odomNew.pose.pose.position.x; odomOld.pose.pose.position.y = odomNew.pose.pose.position.y; odomOld.pose.pose.orientation.z = odomNew.pose.pose.orientation.z; odomOld.header.stamp = odomNew.header.stamp; // Publish the odometry message odom_data_pub.publish(odomNew); } int main(int argc, char **argv) { // Set the data fields of the odometry message odomNew.header.frame_id = &quot;odom&quot;; odomNew.pose.pose.position.z = 0; odomNew.pose.pose.orientation.x = 0; odomNew.pose.pose.orientation.y = 0; odomNew.twist.twist.linear.x = 0; odomNew.twist.twist.linear.y = 0; odomNew.twist.twist.linear.z = 0; odomNew.twist.twist.angular.x = 0; odomNew.twist.twist.angular.y = 0; odomNew.twist.twist.angular.z = 0; odomOld.pose.pose.position.x = initialX; odomOld.pose.pose.position.y = initialY; odomOld.pose.pose.orientation.z = initialTheta; // Launch ROS and create a node ros::init(argc, argv, &quot;ekf_odom_pub&quot;); ros::NodeHandle node; // Subscribe to ROS topics ros::Subscriber subForRightCounts = node.subscribe(&quot;right_ticks&quot;, 100, Calc_Right, ros::TransportHints().tcpNoDelay()); ros::Subscriber subForLeftCounts = node.subscribe(&quot;left_ticks&quot;, 
100, Calc_Left, ros::TransportHints().tcpNoDelay()); ros::Subscriber subInitialPose = node.subscribe(&quot;slam_out_pose&quot;, 1, set_initial_2d); // Publisher of simple odom message where orientation.z is an euler angle odom_data_pub = node.advertise&lt;nav_msgs::Odometry&gt;(&quot;odom_data_euler&quot;, 100); // Publisher of full odom message where orientation is quaternion odom_data_pub_quat = node.advertise&lt;nav_msgs::Odometry&gt;(&quot;odom_data_quat&quot;, 100); ros::Rate loop_rate(30); while(ros::ok()) { if(initialPoseRecieved) { update_odom(); publish_quat(); } ros::spinOnce(); loop_rate.sleep(); } return 0; } </code></pre> <p>Can anyone please tell me where I went wrong? Thank you!</p>
Transform failed during publishing of map_odom transform
<p>Did you re-source your workspace's install directory?</p>
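That is, after building the plugin package, something like this in the shell you launch from (assuming your workspace root is <code>/ws</code>, as in your <code>params_file</code> path):

```shell
cd /ws
colcon build --packages-select nav2_straightline_planner
source install/setup.bash
```

pluginlib can only discover <code>global_planner_plugin.xml</code> once the freshly built install space is sourced in the environment that starts Nav2.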
105056
2023-10-30T04:56:42.500
|ros-humble|nav2|global-planner|
<p>I followed this nav2 getting started <a href="https://navigation.ros.org/getting_started/index.html" rel="nofollow noreferrer">guide</a> without issue. I am now following this nav2 custom planner plugin <a href="https://navigation.ros.org/plugin_tutorials/docs/writing_new_nav2planner_plugin.html" rel="nofollow noreferrer">tutorial</a></p> <p>I have created a new plugin nav2_straightline_planner in my ws/src directory. And I have confirmed there are no typos between the sample code and mine. When trying to run</p> <p>$ ros2 launch nav2_bringup tb3_simulation_launch.py params_file:=/ws/src/navigation2_tutorials/nav2_straightline_planner/nav2_params.yaml</p> <p>I get the following error:</p> <p>Failed to create global planner. Exception: According to the loaded plugin descriptions the class nav2_straightline_planner/StraightLine with base class type nav2_core::GlobalPlanner does not exist. Declared types are nav2_navfn_planner/NavfnPlanner</p> <p>I have checked that my CMakeLists.txt has <code>pluginlib_export_plugin_description_file(nav2_core global_planner_plugin.xml)</code></p> <p>my global_planner_plugin.xml is the following</p> <pre><code>&lt;library path=&quot;nav2_straightline_planner&quot;&gt; &lt;class name=&quot;nav2_straightline_planner/StraightLine&quot; type=&quot;nav2_straightline_planner::StraightLine&quot; base_class_type=&quot;nav2_core::GlobalPlanner&quot;&gt; &lt;description&gt;This is an example plugin which produces straight path.&lt;/description&gt; &lt;/class&gt; &lt;/library&gt; </code></pre> <p>And in my params file I have the following:</p> <pre><code> planner_server: ros__parameters: plugins: [&quot;GridBased&quot;] use_sim_time: True GridBased: plugin: &quot;nav2_straightline_planner/StraightLine&quot; interpolation_resolution: 0.1 </code></pre>
Nav2 Custom planner plugin does not exist
<p>The answer to this question was provided in a comment by <code>Steven Macenski</code></p> <blockquote> <p>This is just a simple post processing step for the nature of the Hybrid-A* method to make the quality &quot;pop&quot; with little variations due to discrete cost search. It takes well under 0.1ms so its not a heavy operation like the other Smoothers in the Smoother Server.</p> </blockquote>
105061
2023-10-30T07:40:00.493
|navigation|ros2|ros-humble|path-planning|nav2|
<p>I was wondering what is the reasoning behind having both a smoother server (e.g., <code>a constrained smoother</code>) and defining a smoother in the planner server. Shouldn't the smoother server be enough on its own? Is there a difference between the two? Why are the default <code>w_smooth</code> parameters so different for the two (<code>0.3</code> when defined in the <code>planner server</code> and <code>2000000.0</code> when creating a <code>constrained smoother</code>)?</p>
Smoother both in global planner and independent
<p>You should set up micro-ros-setup on your Ubuntu computer and download the Arduino IDE. Then install the micro-ROS Arduino library into the Arduino IDE and try uploading an example to the Arduino Due. If a video helps, I made one about how to use micro-ROS with an ESP32 board:</p> <p><a href="https://www.youtube.com/watch?v=ggxNLKZqITU" rel="nofollow noreferrer">https://www.youtube.com/watch?v=ggxNLKZqITU</a></p>
105076
2023-10-30T14:40:34.330
|ros2|micro-ros|
<p>As the question states, I'm trying to get micro-ROS on an Aruino-DUE. I know it isn't an officially supported board, but it does have community support however I can't figure out how to install that version of micro-ROS. I'm very new so sorry if this is an obvious answer. Thank you</p>
How to use micro-ROS for an Arduino DUE
<p>You need to set <code>orientation.w = 1</code> for every quaternion in the message if you don't have another orientation you want them to have. It's unfortunate that <code>(0, 0, 0, 1)</code> isn't the default value, or that rviz couldn't automatically force quaternions to be valid (and issue an error or warning) instead of bringing down the whole application.</p> <p>In <code>to_ros_boxes_3d()</code> in <a href="https://github.com/opendr-eu/opendr/blob/master/projects/opendr_ws/src/opendr_bridge/src/opendr_bridge/bridge.py#L769-L782" rel="nofollow noreferrer">https://github.com/opendr-eu/opendr/blob/master/projects/opendr_ws/src/opendr_bridge/src/opendr_bridge/bridge.py#L769-L782</a> it sets the orientation of <code>box.bbox</code> but doesn't set it for <code>box.results[0]</code>: try adding <code>box.results[0].pose.pose.orientation.w = 1.0</code> (and if there is more than one <code>ObjectHypothesisWithPose</code>, be sure the others have <code>orientation.w = 1.0</code> too, unless you know the orientation ought to be something else).</p>
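If you want to guard against this generically before publishing, a small helper along these lines would do it (plain Python, independent of the ROS message types; <code>fix_quaternion</code> is a hypothetical name, not part of any ROS API). It replaces an all-zero quaternion with the identity and normalizes anything else:

```python
import math

def fix_quaternion(x, y, z, w, eps=1e-9):
    """Return a valid unit quaternion as a tuple.

    An (effectively) all-zero quaternion is replaced by the identity
    (0, 0, 0, 1); any other input is normalized to unit length.
    """
    norm = math.sqrt(x * x + y * y + z * z + w * w)
    if norm < eps:
        return (0.0, 0.0, 0.0, 1.0)  # identity: the safe default rviz expects
    return (x / norm, y / norm, z / norm, w / norm)
```

You would call this on every pose orientation in the <code>Detection3DArray</code> (both <code>bbox</code> and each <code>results[i]</code>) before publishing.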
105085
2023-10-30T18:58:26.350
|ros|rviz|pointcloud|artificial-intelligence|rviz-plugins|
<p>I'm using a plugin for visualization of topic type <a href="https://docs.ros.org/en/noetic/api/vision_msgs/html/msg/Detection3DArray.html" rel="nofollow noreferrer">vision_msgs/Detection3DArray</a>. I add a point cloud and there is not problem, but when I add this topic Rviz crashes with this error message:</p> <pre><code>original visual, set by message: position Vector3(0, 0, 0) with orientation Quaternion(0, 0, 0, 0) rviz: /build/ogre-1.9-kiU5_5/ogre-1.9-1.9.0+dfsg1/OgreMain/include/OgreAxisAlignedBox.h:251: void Ogre::AxisAlignedBox::setExtents(const Ogre::Vector3&amp;, const Ogre::Vector3&amp;): La declaración `(min.x &lt;= max.x &amp;&amp; min.y &lt;= max.y &amp;&amp; min.z &lt;= max.z) &amp;&amp; &quot;The minimum corner of the box must be less than or equal to maximum corner&quot;' no se cumple. Abortado (`core' generado) </code></pre> <p><a href="https://i.stack.imgur.com/NGABS.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/NGABS.png" alt="rviz terminal output quaternion error" /></a></p> <p>I get this plugin for visualization of this type of topics: <strong><a href="https://github.com/Kukanani/vision_msgs_visualization" rel="nofollow noreferrer">https://github.com/Kukanani/vision_msgs_visualization</a></strong></p> <p>There are some failures like this in internet but no one is useful for me.</p> <p><a href="https://i.stack.imgur.com/HZfVk.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/HZfVk.png" alt="rviz ork object display" /></a></p>
Rviz crash when adding topic
<p>It must be something on your side, as it works for me both with a non-recently updated source tree, as well as with a fresh <code>vcs import</code>:</p> <pre><code>$ echo $ROS_DISTRO $ source install/setup.bash $ echo $ROS_DISTRO rolling </code></pre> <p>It also works using packages:</p> <ul> <li>Pull <a href="https://hub.docker.com/layers/library/ubuntu/22.04/images/sha256-ffa841e85005182836d91f7abd24ec081f3910716096955dcc1874b8017b96c9?context=explore" rel="nofollow noreferrer">22.04 Docker image</a></li> <li>Install ROS2 Rolling per <a href="https://docs.ros.org/en/rolling/Installation/Ubuntu-Install-Debians.html" rel="nofollow noreferrer">the official instructions</a> (package <code>ros-rolling-ros-base</code>),</li> </ul> <pre><code>root@664088dbb40c:/# echo $ROS_DISTRO root@664088dbb40c:/# source /opt/ros/rolling/setup.bash root@664088dbb40c:/# echo $ROS_DISTRO rolling root@664088dbb40c:/# </code></pre> <p>EDIT: added following <code>grep</code> output:</p> <pre><code>/opt/ros2_rolling/install$ grep -rI ROS_DISTRO lib/python3.10/site-packages/rpyutils/import_c_library.py: distro = os.environ.get('ROS_DISTRO', 'rolling') lib/python3.10/site-packages/ros2doctor/api/package.py: distro_name = os.environ.get('ROS_DISTRO') lib/python3.10/site-packages/ros2doctor/api/package.py: doctor_error('ROS_DISTRO is not set.') lib/python3.10/site-packages/ros2doctor/api/platform.py: Check ROS_DISTRO environment variables and distribution installed. 
lib/python3.10/site-packages/ros2doctor/api/platform.py: distro_name = os.environ.get('ROS_DISTRO') lib/python3.10/site-packages/ros2doctor/api/platform.py: doctor_error('ROS_DISTRO is not set.') share/ros_environment/package.xml: &lt;description&gt;The package provides the environment variables `ROS_VERSION` and `ROS_DISTRO`.&lt;/description&gt; share/ros_environment/environment/1.ros_distro.dsv:set;ROS_DISTRO;rolling share/ros_environment/environment/0.ros_distro_check.sh:if [ -n &quot;$ROS_DISTRO&quot; -a &quot;$ROS_DISTRO&quot; != &quot;rolling&quot; ]; then share/ros_environment/environment/0.ros_distro_check.sh: echo &quot;ROS_DISTRO was set to '$ROS_DISTRO' before. Please make sure that the environment does not mix paths from different distributions.&quot; &gt;&amp;2 share/ros_environment/environment/1.ros_distro.sh:export ROS_DISTRO=rolling </code></pre>
105087
2023-10-30T21:15:22.210
|ros2|
<p>After sourcing a rolling install, the environment variable ROS_DISTRO is not set. Is this intentional?</p> <pre><code>tyler@fractal:~$ source /opt/ros/rolling/setup.bash tyler@fractal:~$ echo $ROS_DISTRO tyler@fractal:~$ </code></pre>
Why is ROS_DISTRO not set for rolling?
<p><strong>TL;DR</strong></p> <p>While writing this answer, I found that the Gazebo Sim sensor parameters are read through the SDFormat library, and hence all parameters are described in the <a href="http://sdformat.org/spec?ver=1.10&amp;elem=sensor" rel="nofollow noreferrer">SDFormat Specification</a>, e.g. <a href="http://sdformat.org/spec?ver=1.10&amp;elem=sensor#sensor_imu" rel="nofollow noreferrer">here</a> for the imu sensor.</p> <p><strong>Long answer</strong></p> <p>For Gazebo Sim system plugins, the documentation is in the header files, see <a href="https://robotics.stackexchange.com/a/103884/35117">this previous answer</a> for the references.</p> <p>For Gazebo Sim sensors, it seems it's a bit more elaborate:</p> <ul> <li>The sensors themselves are located in the <a href="https://github.com/gazebosim/gz-sensors/tree/gz-sensors7/include/gz/sensors" rel="nofollow noreferrer"><code>gz-sensors</code> repository</a>.</li> <li>If you read through the header and source files (e.g. <a href="https://github.com/gazebosim/gz-sensors/blob/gz-sensors7/include/gz/sensors/ImuSensor.hh" rel="nofollow noreferrer">ImuSensor.hh</a> and <a href="https://github.com/gazebosim/gz-sensors/blob/gz-sensors7/src/ImuSensor.cc" rel="nofollow noreferrer">ImuSensor.cc</a>) you will find descriptions about the internal variables of the sensor,</li> <li>However, whereas the <code>gz-sim</code> system plugins typically have a <code>configure()</code> in which parameters are read from the SDF, it seems that <code>gz-sensors</code> have little logic in their <code>load()</code> call (e.g. <a href="https://github.com/gazebosim/gz-sensors/blob/gz-sensors7/src/ImuSensor.cc#L118" rel="nofollow noreferrer">here</a>) while the actual loading of the parameters from the SDF is deferred to the <a href="https://github.com/gazebosim/sdformat" rel="nofollow noreferrer"><code>SDFormat</code> library</a>: <ul> <li>E.g. 
see the <a href="https://github.com/gazebosim/sdformat/blob/sdf13/include/sdf/Imu.hh" rel="nofollow noreferrer">IMU sensor header file</a> and <a href="https://github.com/gazebosim/sdformat/blob/sdf13/src/Imu.cc" rel="nofollow noreferrer">implementation file</a>,</li> <li>More specifically the <a href="https://github.com/gazebosim/sdformat/blob/24b61531ad49420aeddb9f59c4c08b01b59e5d29/src/Imu.cc#L81" rel="nofollow noreferrer"><code>load()</code> function</a> performs the read from the SDF.</li> </ul> </li> </ul>
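For example, per the SDFormat specification, the <code>&lt;imu&gt;</code> element supports per-axis noise parameters. A sketch of an IMU sensor using them (the noise values here are illustrative, not tuned for any real sensor):

```xml
<sensor name="imu_sensor" type="imu">
  <always_on>1</always_on>
  <update_rate>100</update_rate>
  <topic>imu</topic>
  <imu>
    <angular_velocity>
      <x>
        <noise type="gaussian">
          <mean>0.0</mean>
          <stddev>0.009</stddev>
        </noise>
      </x>
    </angular_velocity>
  </imu>
</sensor>
```

The same <code>&lt;noise&gt;</code> structure applies to the other angular velocity axes and to <code>&lt;linear_acceleration&gt;</code>; see the spec page for the full element tree.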
105092
2023-10-31T06:26:39.907
|gazebo|gazebo-plugin|gazebo-7|gazebo-model|gazebo-sensor|
<p>From the tutorial of gazebo garden <a href="https://gazebosim.org/docs/garden/sensors" rel="nofollow noreferrer">here</a>,</p> <pre><code>&lt;sensor name=&quot;imu_sensor&quot; type=&quot;imu&quot;&gt; &lt;always_on&gt;1&lt;/always_on&gt; &lt;update_rate&gt;1&lt;/update_rate&gt; &lt;visualize&gt;true&lt;/visualize&gt; &lt;topic&gt;imu&lt;/topic&gt; &lt;/sensor&gt; </code></pre> <p>I can see the tags that can be used in adding an IMU sensor. However, I am still not sure if the tags shown on that link are already the entire tags that can be used. Not only that, I also would like to know what tags can be used for other sensors as well e.g. Lidar sensor, camera sensor.</p> <p>Where can I find the list of tags that can be used for each sensors available in Gazebo Garden?</p>
Where can I find the list of SDF tags that can be used for a particular sensor in Gazebo Garden?
<p>You can look at installing via <a href="https://robostack.github.io" rel="nofollow noreferrer">Robostack</a> and see if your needs can be met that way:</p> <p><a href="https://robostack.github.io/GettingStarted.html" rel="nofollow noreferrer">https://robostack.github.io/GettingStarted.html</a></p> <p><a href="https://robostack.github.io/humble.html" rel="nofollow noreferrer">https://robostack.github.io/humble.html</a></p> <p>Some core MoveIt packages are available for binary installation. I successfully installed <code>ros-humble-moveit</code> on an existing Robostack Humble installation. I can add a motion planning panel to RViz.</p> <p>However, <code>moveit2_tutorials</code> is not available to try a more functional test per the <a href="https://moveit.picknik.ai/main/doc/tutorials/quickstart_in_rviz/quickstart_in_rviz_tutorial.html" rel="nofollow noreferrer">quickstart documentation</a>. I briefly looked into building it from source, but it looks like I'll have to try to build a number of its dependencies from source too.</p> <p>EDIT: I added a number of packages from source and got the <code>moveit2_tutorials</code> RViz quickstart working:</p> <p><a href="https://i.stack.imgur.com/rxrdD.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/rxrdD.png" alt="enter image description here" /></a></p> <p><code>moveit2_tutorials</code> depends on a number of packages that as far as I know aren't available on Windows and might be hard to build.</p> <p>I currently removed the dependencies <code>moveit_task_constructor_core</code>, <code>moveit_ros_perception</code> and <code>moveit_servo</code> from <a href="https://github.com/ros-planning/moveit2_tutorials/blob/main/CMakeLists.txt" rel="nofollow noreferrer">the <code>CMakeLists.txt</code> in <code>moveit2_tutorials</code></a> and commented out all the <code>add_subdirectory()</code> calls except for <code>add_subdirectory(doc/tutorials/quickstart_in_rviz)</code></p> <p>I will fork it and make a Windows 
branch to share later, and update with another edit, as well as share a <code>.repos</code> file. I needed to build <code>graph_msgs</code>, <code>moveit_resources</code>, <code>moveit_visual_tools</code> and <code>rviz_visual_tools</code> from source.</p> <p>--- End Edit ---</p> <p>You might give it a try and see if you can get a working installation. You can also check out <a href="https://github.com/ms-iot/ROSOnWindows/issues" rel="nofollow noreferrer">https://github.com/ms-iot/ROSOnWindows/issues</a> and see if there's any information about MoveIt.</p> <p>Are you planning to control real robot hardware? Even if you get MoveIt working with demos/simulation, hardware driver support tends to be limited.</p>
105100
2023-10-31T11:23:08.203
|ros2|moveit|installation|ros-humble|windows|
<p>Has anyone tried installing MoveIt 2 on windows? When I go to the <a href="https://moveit.ros.org/install-moveit2/binary-windows/" rel="nofollow noreferrer">official website</a> it says nothing about compatibility with ROS-humble version. It's very similar to how it doesn't mention anything for compatibility with ROS-Iron and it is still compatible in Linux. I tried this <a href="http://wiki.ros.org/Installation/Windows" rel="nofollow noreferrer">tutorial</a> but it has lack of information. Can anyone tell if there is any way to install MoveIt 2 on windows for ROS-Humble? I didn't find any resources online.</p>
Moveit 2 On Windows 11: ROS2-Humble Hawksbill
<p>I wrote some documentation <a href="https://github.com/jrutgeer/ROS2_assorted_docs/blob/main/ROS2_core/ROS2_logging.md#rosout" rel="nofollow noreferrer">here</a>: there is a hash table (<code>__logger_map</code>) that links logger and child logger names to the respective publisher.</p> <p>I think this is the reason:</p> <ul> <li>The node goes out of scope,</li> <li>This <a href="https://github.com/ros2/rclcpp/blob/fff009a75100f2afd8ef1c3863620bf5ebe67708/rclcpp/src/rclcpp/node_interfaces/node_base.cpp#L130" rel="nofollow noreferrer">calls</a> <code>rcl_logging_rosout_fini_publisher_for_node</code>,</li> <li>which <a href="https://github.com/ros2/rcl/blob/1b79535fa07fcc77c873adf5261308dac7a821ce/rcl/src/rcl/logging_rosout.c#L330-L330" rel="nofollow noreferrer">calls</a> <code>_rcl_logging_rosout_remove_logger_map</code>,</li> <li>This <a href="https://github.com/ros2/rcl/blob/1b79535fa07fcc77c873adf5261308dac7a821ce/rcl/src/rcl/logging_rosout.c#L139-L143" rel="nofollow noreferrer">iterates through</a> <code>__logger_map</code> and 'unsets' all entries associates with that node,</li> <li>Then the child logger destructs and it fails to find its entry in the <code>__logger_map</code> <a href="https://github.com/ros2/rcl/blob/1b79535fa07fcc77c873adf5261308dac7a821ce/rcl/src/rcl/logging_rosout.c#L529" rel="nofollow noreferrer">here</a>.</li> </ul> <p>Quick fix that confirms above:</p> <p>Change</p> <pre class="lang-cpp prettyprint-override"><code>rclcpp::Node::SharedPtr node = rclcpp::Node::make_shared(&quot;dut_node&quot;);` </code></pre> <p>into</p> <pre class="lang-cpp prettyprint-override"><code>static rclcpp::Node::SharedPtr node = rclcpp::Node::make_shared(&quot;dut_node&quot;);` </code></pre> <p>And there no longer is an exception, since the node outlives the loggers.</p> <p>An unrelated remark:</p> <p>Note that calling <code>get_child</code> on a <em>named</em> logger (i.e. 
a non-node logger such as <code>rclcpp::get_logger(&quot;moveit&quot;)</code>) throws an exception (as a named logger does not have a publisher).</p> <p>So in your example code it is mandatory to issue <code>getLogger() = node-&gt;get_logger();</code>, before calling <code>getStaticChildLogger()</code>.</p>
105110
2023-10-31T19:25:50.203
|ros2|rclcpp|
<p>I believe this is likely a bug. Here is a minimal example and the output with a stack trace (made with backwardcpp):</p> <pre class="lang-cpp prettyprint-override"><code>#include &lt;rclcpp/rclcpp.hpp&gt; rclcpp::Logger&amp; getLogger() { static auto logger = rclcpp::get_logger(&quot;moveit&quot;); return logger; } // make the child logger the first time we use it rclcpp::Logger getStaticChildLogger() { static auto logger = [] { auto logger = getLogger().get_child(&quot;child&quot;); RCLCPP_INFO(logger, &quot;making the child logger&quot;); return logger; }(); return logger; } int main(int argc, char** argv) { rclcpp::init(argc, argv); rclcpp::Node::SharedPtr node = rclcpp::Node::make_shared(&quot;dut_node&quot;); getLogger() = node-&gt;get_logger(); RCLCPP_INFO(getLogger(), &quot;node logger&quot;); RCLCPP_INFO(getStaticChildLogger(), &quot;child node logger&quot;); } </code></pre> <pre><code>[INFO] [1698780244.672370104] [dut_node]: node logger [INFO] [1698780244.672399117] [dut_node.child]: making the child logger [INFO] [1698780244.672402452] [dut_node.child]: child node logger terminate called after throwing an instance of 'rclcpp::exceptions::RCLError' what(): failed to call rcl_logging_rosout_remove_sublogger: Sub-logger 'dut_node.child' not exist., at ./src/rcl/logging_rosout.c:530 Stack trace (most recent call last): #19 Object &quot;&quot;, at 0xffffffffffffffff, in #18 Object &quot;/home/tyler/code/ws_moveit/install/moveit_core/lib/moveit_core/logger_from_child_dut&quot;, at 0x55756bef8624, in _start #17 Source &quot;../csu/libc-start.c&quot;, line 392, in __libc_start_main_impl [0x7fad3c229e3f] #16 Source &quot;../sysdeps/nptl/libc_start_call_main.h&quot;, line 74, in __libc_start_call_main [0x7fad3c229d96] #15 Source &quot;./stdlib/exit.c&quot;, line 143, in exit [0x7fad3c24560f] #14 Source &quot;./stdlib/exit.c&quot;, line 113, in __run_exit_handlers [0x7fad3c245494] #13 Source &quot;/opt/ros/rolling/include/rclcpp/rclcpp/logger.hpp&quot;, line 92, 
in ~Logger [0x55756bef9467] 89: rcpputils::fs::path 90: get_logging_directory(); 91: &gt; 92: class Logger 93: { 94: public: 95: /// An enum for the type of logger level. #12 Source &quot;/usr/include/c++/11/bits/shared_ptr.h&quot;, line 122, in ~shared_ptr [0x55756bef93bd] 119: * pointer see `std::shared_ptr::owner_before` and `std::owner_less`. 120: */ 121: template&lt;typename _Tp&gt; &gt; 122: class shared_ptr : public __shared_ptr&lt;_Tp&gt; 123: { 124: template&lt;typename... _Args&gt; 125: using _Constructible = typename enable_if&lt; #11 Source &quot;/usr/include/c++/11/bits/shared_ptr_base.h&quot;, line 1154, in ~__shared_ptr [0x55756bef939d] 1152: __shared_ptr(const __shared_ptr&amp;) noexcept = default; 1153: __shared_ptr&amp; operator=(const __shared_ptr&amp;) noexcept = default; &gt;1154: ~__shared_ptr() = default; 1155: 1156: template&lt;typename _Yp, typename = _Compatible&lt;_Yp&gt;&gt; 1157: __shared_ptr(const __shared_ptr&lt;_Yp, _Lp&gt;&amp; __r) noexcept #10 Source &quot;/usr/include/c++/11/bits/shared_ptr_base.h&quot;, line 705, in ~__shared_count [0x55756bef9ee6] 702: ~__shared_count() noexcept 703: { 704: if (_M_pi != nullptr) &gt; 705: _M_pi-&gt;_M_release(); 706: } 707: 708: __shared_count(const __shared_count&amp; __r) noexcept #9 Source &quot;/usr/include/c++/11/bits/shared_ptr_base.h&quot;, line 168, in _M_release [0x55756befa64c] 165: if (__gnu_cxx::__exchange_and_add_dispatch(&amp;_M_use_count, -1) == 1) 166: { 167: _GLIBCXX_SYNCHRONIZATION_HAPPENS_AFTER(&amp;_M_use_count); &gt; 168: _M_dispose(); 169: // There must be a memory barrier between dispose() and destroy() 170: // to ensure that the effects of dispose() are observed in the 171: // thread that runs destroy(). 
#8 Object &quot;/opt/ros/rolling/lib/librclcpp.so&quot;, at 0x7fad3cb27e06, in #7 Object &quot;/opt/ros/rolling/lib/librclcpp.so&quot;, at 0x7fad3cafe6d8, in rclcpp::exceptions::throw_from_rcl_error(int, std::__cxx11::basic_string&lt;char, std::char_traits&lt;char&gt;, std::allocator&lt;char&gt; &gt; const&amp;, rcutils_error_state_s const*, void (*)()) #6 Object &quot;/usr/lib/x86_64-linux-gnu/libstdc++.so.6.0.30&quot;, at 0x7fad3c6ae1fd, in std::rethrow_exception(std::__exception_ptr::exception_ptr) #5 Object &quot;/usr/lib/x86_64-linux-gnu/libstdc++.so.6.0.30&quot;, at 0x7fad3c6ae276, in std::terminate() #4 Object &quot;/usr/lib/x86_64-linux-gnu/libstdc++.so.6.0.30&quot;, at 0x7fad3c6ae20b, in #3 Object &quot;/usr/lib/x86_64-linux-gnu/libstdc++.so.6.0.30&quot;, at 0x7fad3c6a2b9d, in #2 Source &quot;./stdlib/abort.c&quot;, line 79, in abort [0x7fad3c2287f2] #1 Source &quot;../sysdeps/posix/raise.c&quot;, line 26, in raise [0x7fad3c242475] #0 | Source &quot;./nptl/pthread_kill.c&quot;, line 89, in __pthread_kill_internal | Source &quot;./nptl/pthread_kill.c&quot;, line 78, in __pthread_kill_implementation Source &quot;./nptl/pthread_kill.c&quot;, line 44, in __pthread_kill [0x7fad3c2969fc] Aborted (Signal sent by tkill() 1572362 1000) Aborted (core dumped) </code></pre>
exception thrown in rcl_logging_rosout_remove_sublogger on destruction with a child logger with static lifetime
<p>It looks like the <code>Salisbury Hand / Stanford/JPL Hand</code></p> <p>Built sometime in 1983, as per the following link.</p> <p><a href="https://www.robotgrasping.org/#Salisbury%20Hand%20/%20Stanford/JPL%20Hand" rel="nofollow noreferrer">https://www.robotgrasping.org/#Salisbury%20Hand%20/%20Stanford/JPL%20Hand</a></p> <p><a href="https://i.stack.imgur.com/z5Kje.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/z5Kje.png" alt="enter image description here" /></a></p> <p>picture from <a href="https://americanhistory.si.edu/sites/default/files/file-uploader/Salisbury%20Hand%20Rectangle.jpg" rel="nofollow noreferrer">https://americanhistory.si.edu/sites/default/files/file-uploader/Salisbury%20Hand%20Rectangle.jpg</a></p>
105116
2023-11-01T02:44:05.770
|robotic-arm|inverse-kinematics|microcontroller|
<p>I have a picture with robotic arm which comes from lego video (<a href="https://www.youtube.com/watch?v=rex8SYJRvT0" rel="nofollow noreferrer">Lego Mindstorms Robotics Invention System 2.0 tour</a>) and I was wondering when and where it was created.</p> <p>Very interesting stuff. It was probably most advanced three finger hand those days. I'll be very grateful for answer. Here is the picture:</p> <p><a href="https://i.stack.imgur.com/colm6.jpg" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/colm6.jpg" alt="enter image description here" /></a></p>
Three finger robotic arm history
<p>See <a href="https://github.com/ros2/rcutils/blob/e276dc1fe5a0e450e53423ec71d57cbe05b129f8/src/logging.c#L375-L383" rel="nofollow noreferrer">here</a> for the list of currently supported format tokens and the respective formatting handler functions.</p> <p>Wrt. time, the only formats are:</p> <ul> <li>{time}, which installs a formatting handler <a href="https://github.com/ros2/rcutils/blob/e276dc1fe5a0e450e53423ec71d57cbe05b129f8/src/logging.c#L211-L220" rel="nofollow noreferrer"><code>expand_time_as_seconds</code></a> and</li> <li>{time_as_nanoseconds}, which installs the formatting handler <a href="https://github.com/ros2/rcutils/blob/e276dc1fe5a0e450e53423ec71d57cbe05b129f8/src/logging.c#L222-L231" rel="nofollow noreferrer"><code>expand_time_as_nanoseconds</code></a>.</li> </ul> <p>Both of these handlers call <a href="https://github.com/ros2/rcutils/blob/e276dc1fe5a0e450e53423ec71d57cbe05b129f8/src/logging.c#L180-L209" rel="nofollow noreferrer"><code>expand_time()</code></a>.</p> <p>So if you want a different formatting, you will have to implement a different handler.</p>
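Since only epoch seconds/nanoseconds are available from the built-in tokens, one practical option is to post-process the log output yourself. A minimal sketch (plain Python, no ROS dependency; <code>stamp_to_iso</code> is a hypothetical helper name) that converts the epoch-seconds value printed by <code>{time}</code> into an ISO 8601 datetime:

```python
from datetime import datetime, timezone

def stamp_to_iso(stamp_seconds: float) -> str:
    """Convert an epoch-seconds timestamp (as printed by the {time}
    format token) into an ISO 8601 string in UTC."""
    return datetime.fromtimestamp(stamp_seconds, tz=timezone.utc).isoformat()

# e.g. a timestamp copied from a log line:
print(stamp_to_iso(1698780244.672370))
```

You could pipe your node's console output through a small script that applies this to the timestamp field of each line.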
105121
2023-11-01T10:31:46.050
|ros2|ros-humble|logging|
<p>I read <a href="https://docs.ros.org/en/foxy/Tutorials/Demos/Logging-and-logger-configuration.html#console-output-formatting" rel="nofollow noreferrer">https://docs.ros.org/en/foxy/Tutorials/Demos/Logging-and-logger-configuration.html#console-output-formatting</a> but it doesn't mention how to specify the datetime format to be something else than seconds, is that possible?</p>
How to specify the datetime format for ros2 logs?
<p>The error itself comes from your own (or the copied) script:</p> <pre><code> except AttributeError: print('Could not add Stabilizer log config, bad configuration.') </code></pre> <p>This message tells you that adding the log config failed, but not why. Here is the API documentation of the Crazyflie Python library's logging functionality:</p> <p><a href="https://www.bitcraze.io/documentation/repository/crazyflie-lib-python/master/user-guides/python_api/#logging" rel="nofollow noreferrer">https://www.bitcraze.io/documentation/repository/crazyflie-lib-python/master/user-guides/python_api/#logging</a></p> <p>It says that you can only create log blocks for messages that are no larger than 30 bytes. The log block 'Stabilizer' contains 36 bytes (9 x float, which is 4 bytes each), so it fails.</p> <p>If you look in the documentation, you can see that there is a possibility of declaring a 16-bit (2 byte) float. In the log block declaration, you can replace <code>'float'</code> with <code>'FP16'</code> and reduce the total message size. This should fix it.</p>
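As a quick sanity check of the 30-byte limit, here is a back-of-the-envelope calculation (plain Python; the byte sizes assume 4-byte <code>float</code> and 2-byte <code>FP16</code>, as described in the library documentation — names like <code>block_size</code> are just for illustration):

```python
# Assumed payload sizes per log variable type, in bytes.
TYPE_SIZES = {'float': 4, 'FP16': 2, 'uint16_t': 2}

def block_size(variables):
    """Total payload size in bytes for a log block given (name, type) pairs."""
    return sum(TYPE_SIZES[t] for _, t in variables)

stab_float = [(f'var{i}', 'float') for i in range(9)]  # 9 x float, as in the question
stab_fp16 = [(f'var{i}', 'FP16') for i in range(9)]    # same variables declared as FP16

print(block_size(stab_float))  # 36 -> exceeds the 30-byte limit, so add_config fails
print(block_size(stab_fp16))   # 18 -> fits comfortably
```

So even keeping all nine stabilizer variables, switching them to <code>'FP16'</code> brings the block well under the limit.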
105138
2023-11-01T23:00:28.033
|ros2|quadcopter|uav|drone|ros2-launch|
<p>I would like to define <code>stateEstimate.vx</code>, <code>stateEstimate.vy</code>, and <code>stateEstimate.vz</code> variables to access those data. I used the following code parts.</p> <pre><code>def _connected(self, link_uri): self.get_logger().info('Connected!') self._lg_stab = LogConfig(name='Stabilizer', period_in_ms=100) self._lg_stab.add_variable('stateEstimate.x', 'float') self._lg_stab.add_variable('stateEstimate.y', 'float') self._lg_stab.add_variable('stateEstimate.z', 'float') self._lg_stab.add_variable('stateEstimate.vx', 'float') self._lg_stab.add_variable('stateEstimate.vy', 'float') self._lg_stab.add_variable('stateEstimate.vz', 'float') self._lg_stab.add_variable('stateEstimate.roll', 'float') self._lg_stab.add_variable('stateEstimate.pitch', 'float') self._lg_stab.add_variable('stateEstimate.yaw', 'float') self._lg_range = LogConfig(name='Range', period_in_ms=100) self._lg_range.add_variable('range.zrange', 'uint16_t') self._lg_range.add_variable('range.front', 'uint16_t') self._lg_range.add_variable('range.right', 'uint16_t') self._lg_range.add_variable('range.left', 'uint16_t') self._lg_range.add_variable('range.back', 'uint16_t') try: self._cf.log.add_config(self._lg_stab) self._lg_stab.data_received_cb.add_callback(self._stab_log_data) self._lg_stab.error_cb.add_callback(self._stab_log_error) self._lg_stab.start() self._cf.log.add_config(self._lg_range) self._lg_range.data_received_cb.add_callback(self._range_log_data) self._lg_range.error_cb.add_callback(self._range_log_error) self._lg_range.start() except KeyError as e: print('Could not start log configuration,' '{} not found in TOC'.format(str(e))) except AttributeError: print('Could not add Stabilizer log config, bad configuration.') </code></pre> <p>When I run my python file via ROS2, I saw the following part on the terminal (<code>Could not add Stabilizer log config, bad configuration</code>). I could not get why it happens. 
However, whenever I use the above code part without <code>stateEstimate.vx</code>, <code>stateEstimate.vy</code>, and <code>stateEstimate.vz</code>, it works pretty well. Could you help us to fix it, thanks.</p> <pre><code>ntukenmez3@ae-icps-407120:~/Documents/2023-crazy-flie-1/crazyflie_ros2_experimental_2/crazyflie_ros2$ ros2 run crazyflie_ros2 crazyflie_publisher [INFO] [1698871075.735318053] [crazyflie_publisher]: Connected! Could not add Stabilizer log config, bad configuration. </code></pre>
Crazyflie: Connecting, logging and parameters-->Add logging config
<p>Alright, to clarify: the problem occurs only when you execute the command <code>rosrun map_server map_server mylimomap.yaml</code>, and not from the launch file, correct?</p> <p>Try this checklist:</p> <ol> <li>Source your ROS distribution.</li> <li>Confirm that both the .yaml and .pgm files reside in the same directory.</li> <li>Verify the filename specified inside the .yaml matches the .pgm filename.</li> <li>Execute: <code>rosrun map_server map_server path_to_your_map_folder/mylimomap.yaml</code></li> <li>Make sure the topic for the map is set to <code>/map</code></li> </ol>
105142
2023-11-02T03:31:33.790
|rviz|
<p>ubuntu 18.04 Rviz</p> <p>my current launch file </p> <pre><code>&lt;arg name=&quot;robot_namespace&quot; default=&quot;/&quot;/&gt; &lt;!-- Load URDF model --&gt; &lt;param name=&quot;robot_description&quot; command=&quot;$(find xacro)/xacro '$(find limo_description)/urdf/limo_ackerman.xacro' robot_namespace:=$(arg robot_namespace)&quot; /&gt; &lt;!-- Load joint controller configurations from YAML file to parameter server --&gt; &lt;rosparam file=&quot;$(find limo_gazebo_sim)/config/limo_ackerman_control.yaml&quot; command=&quot;load&quot;/&gt; &lt;!-- Load controllers --&gt; &lt;node name=&quot;controller_spawner&quot; pkg=&quot;controller_manager&quot; type=&quot;spawner&quot; respawn=&quot;false&quot; output=&quot;screen&quot; args=&quot;limo_state_controller limo_fl_steering_hinge_controller limo_fr_steering_hinge_controller&quot;/&gt; &lt;!-- Robot state publisher for URDF visualization --&gt; &lt;node name=&quot;robot_state_publisher&quot; pkg=&quot;robot_state_publisher&quot; type=&quot;robot_state_publisher&quot; /&gt; &lt;!-- Joint State Publisher for the wheels --&gt; &lt;node name=&quot;joint_state_publisher&quot; pkg=&quot;joint_state_publisher&quot; type=&quot;joint_state_publisher&quot;&gt; &lt;param name=&quot;use_gui&quot; value=&quot;false&quot;/&gt; &lt;!-- Set to true if you want to use the GUI --&gt; &lt;/node&gt; &lt;!-- Use robot pose ekf to provide odometry --&gt; &lt;node pkg=&quot;robot_pose_ekf&quot; name=&quot;robot_pose_ekf&quot; type=&quot;robot_pose_ekf&quot;&gt; &lt;param name=&quot;output_frame&quot; value=&quot;odom&quot; /&gt; &lt;param name=&quot;base_footprint_frame&quot; value=&quot;base_link&quot; /&gt; &lt;remap from=&quot;imu_data&quot; to=&quot;imu&quot; /&gt; &lt;/node&gt; &lt;!-- Launch AMCL for localization --&gt; &lt;node pkg=&quot;amcl&quot; type=&quot;amcl&quot; name=&quot;amcl&quot; output=&quot;screen&quot;&gt; &lt;rosparam file=&quot;$(find
limo_bringup)/param/amcl_params_omni.yaml&quot; command=&quot;load&quot; /&gt; &lt;param name=&quot;initial_pose_x&quot; value=&quot;0&quot; /&gt; &lt;param name=&quot;initial_pose_y&quot; value=&quot;0&quot; /&gt; &lt;param name=&quot;initial_pose_a&quot; value=&quot;0&quot; /&gt; &lt;param name=&quot;global_frame_id&quot; value=&quot;map&quot; /&gt; &lt;param name=&quot;map_file&quot; value=&quot;$(find limo_bringup)/maps/mylimomap.yaml&quot; /&gt; &lt;/node&gt; &lt;!-- Launch RViz with the amcl.rviz configuration --&gt; &lt;node name=&quot;rviz_amcl&quot; pkg=&quot;rviz&quot; type=&quot;rviz&quot; args=&quot;-d $(find limo_bringup)/rviz/amcl.rviz&quot; /&gt; </code></pre> <p>Currently when running:</p> <p>rosrun map_server map_server mylimomap.yaml[ INFO] [1698894029.234616479]: Loading map from image &quot;mylimomap.pgm</p> <p>It shows that:</p> <p>[ INFO] [1698893480.853439381]: Loading map from image &quot;mylimomap.pgm&quot; [ INFO] [1698893480.928262434]: Read a 4000 X 4000 map @ 0.050 m/cell</p> <p>but Rviz shows &quot;image not found&quot;. Why is that so?</p> <p>Link of Rviz to show the issue <a href="https://drive.google.com/file/d/1Xge3uHC-lcusR8IcR_xOy6k5rPaH5BvI/view?usp=sharing">https://drive.google.com/file/d/1Xge3uHC-lcusR8IcR_xOy6k5rPaH5BvI/view?usp=sharing</a></p>
Map loading in Terminal but (image not found in Rviz)
<p>According to the link you provided, there's the following parameter:</p> <blockquote> <p>/default_tolerance (double, default: 0.0) A tolerance on the goal point for the planner. The planner will attempt to create a plan that is as close to the specified goal as possible but no further than default_tolerance away.</p> </blockquote> <p>If the problem is that the planner isn't able to make a plan to a point inside an obstacle, setting this tolerance should fix it.</p>
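<p>For illustration, in a ROS 1 launch file this could look like the following (a sketch; it assumes the planner is loaded by <code>move_base</code> under its default <code>GlobalPlanner</code> namespace, so adjust the names to your setup):</p> <pre><code>&lt;node pkg=&quot;move_base&quot; type=&quot;move_base&quot; name=&quot;move_base&quot;&gt;
  &lt;param name=&quot;base_global_planner&quot; value=&quot;global_planner/GlobalPlanner&quot; /&gt;
  &lt;!-- allow the plan to end up to 0.5 m away from a goal inside an obstacle --&gt;
  &lt;param name=&quot;GlobalPlanner/default_tolerance&quot; value=&quot;0.5&quot; /&gt;
&lt;/node&gt;
</code></pre>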
105157
2023-11-02T10:17:15.463
|navigation|ros-noetic|
<p>When the goal point is designated as an obstacle area, the global planner cannot navigate. Can you make the Global Planner move as close to the obstacle area as possible?</p> <p><a href="https://wiki.ros.org/global_planner" rel="nofollow noreferrer">https://wiki.ros.org/global_planner</a> I can't find the relevant parameters in global_planner.</p>
ROS navigation to obstacle
<p>All the computers in your ROS system need the OS to provide an accurate time-of-day clock. This is accomplished by enabling an NTP service on each computer.</p>
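<p>For example, with chrony (a sketch; <code>ntp.example.org</code> is a placeholder, point every computer at the same time source):</p> <pre><code># /etc/chrony/chrony.conf on each computer in the ROS network
server ntp.example.org iburst

# On an isolated network, one machine can instead serve its own clock:
# local stratum 8
# allow 192.168.1.0/24
</code></pre> <p>You can verify synchronization on each machine with <code>chronyc tracking</code>.</p>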
105162
2023-11-02T13:07:05.800
|ros|robot-localization|laserscan|ros-noetic|tf-tree|
<p>An external localization package is performing localization between the map and base_link frames. There is a lidar attached to the laser frame, which is connected to base_link, and it is also publishing data. There is a discrepancy between the map and laser frames. When I select &quot;map&quot; as the fixed frame in RViz, I receive the &quot;Message removed because it is too old&quot; error from the lidar topic. When I set the fixed frame to base_link or the scan frame, the scan data starts to arrive, and the same error begins to come from other topics related to the map. The TF tree is as follows. Can you assist with this?</p> <p><a href="https://i.stack.imgur.com/y7Q7q.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/y7Q7q.png" alt="Laser Scan Error image" /></a> <a href="https://i.stack.imgur.com/bq9rn.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/bq9rn.png" alt="TF Tree" /></a></p> <p><a href="https://i.stack.imgur.com/FZbLK.png" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/FZbLK.png" alt="TF Tree After setting" /></a></p>
ROS-Tf-Message too old
<p>I think I know the answer, sorry for the hasty question.</p> <p>I forgot to add the header file. The clangd plugin automatically inserted a header for me, but it was the wrong one, and it does not replace the header the program actually needs.</p> <p>The auto-inserted header was:</p> <pre><code>#include &lt;std_msgs/msg/detail/int8__struct.hpp&gt;
</code></pre> <p>but what the program needs is:</p> <pre><code>#include &lt;std_msgs/msg/int8.hpp&gt;
</code></pre>
105193
2023-11-04T07:10:34.770
|ros2|publisher|ros-humble|
<h2>Version:</h2> <p>Ubuntu 22.04, ROS 2 Humble</p> <h2>ERROR description</h2> <p>I want to create a publisher that publishes <code>std_msgs::msg::Int8</code> messages, but when I write it according to the tutorials in the official documentation, I get errors:</p> <pre><code>Lock_Stop_pub = this-&gt;create_publisher&lt;std_msgs::msg::Int8&gt;(&quot;/stop&quot;, 10);
-----------------------
ERROR:
No matching member function for call to 'create_publisher' clang(ovl_no_viable_member_function_in_call)
node_impl.hpp(73, 7): Candidate function template not viable: no known conversion from 'SentryMaster' to 'rclcpp::Node' for object argument
</code></pre> <pre><code>rclcpp::Publisher&lt;std_msgs::msg::Int8&gt;::SharedPtr Lock_Stop_pub;
------------------------
ERROR:
In template: static assertion failed due to requirement 'rclcpp::is_ros_compatible_type&lt;std_msgs::msg::Int8_&lt;std::allocator&lt;void&gt;&gt;&gt;::value': given message type is not compatible with ROS and cannot be used with a Publisher
publisher.hpp(80, 3): Error occurred here
sentry_master.cpp(28, 13): In instantiation of template class 'rclcpp::Publisher&lt;std_msgs::msg::Int8_&lt;std::allocator&lt;void&gt;&gt;&gt;' requested here
</code></pre> <p>The complete minimal program is here:</p> <pre><code>class SentryMaster : public rclcpp::Node {
public:
  SentryMaster() : Node(&quot;sentry_master_node_cpp&quot;) {
    Lock_Stop_pub = this-&gt;create_publisher&lt;std_msgs::msg::Int8&gt;(&quot;/stop&quot;, 10);
  }

private:
  rclcpp::Publisher&lt;std_msgs::msg::Int8&gt;::SharedPtr Lock_Stop_pub;
};
</code></pre> <p>Then I found that I can write it like this without error:</p> <pre><code>auto Lock_Stop_pub = this-&gt;create_publisher&lt;std_msgs::msg::Int8&gt;(&quot;/stop&quot;, 10);
</code></pre> <p>Written this way, clang tells me the type of <code>Lock_Stop_pub</code> is <code>std::shared_ptr&lt;Publisher&lt;Int8_&lt;allocator&lt;void&gt;&gt;, allocator&lt;void&gt;&gt;&gt;</code>, but when I change the code back, I get the errors again.</p> <p>But why???</p> <hr /> <p><strong>Added based on comments:</strong></p> <ol> <li><p>It is worth noting that this only happens when writing the message type &quot;std_msgs::msg::Int8&quot;; it does not happen with other message types such as &quot;geometry_msgs::msg::PointStamped&quot;.</p> </li> <li><p>Other typical open-source ROS 2 programs do not report errors.</p> </li> <li><p>If I add 'count_(0)' to the code:</p> </li> </ol> <pre><code>SentryMaster() : Node(&quot;sentry_master_node_cpp&quot;), count_(0) {
  Lock_Stop_pub = this-&gt;create_publisher&lt;std_msgs::msg::Int8&gt;(&quot;/stop&quot;, 10);
}
-----------------------
ERROR:
Member initializer 'count_' does not name a non-static data member or base class
</code></pre> <ol start="4"> <li>My CMakeLists.txt</li> </ol> <pre><code>cmake_minimum_required(VERSION 3.8)
project(user_packages)

if(CMAKE_COMPILER_IS_GNUCXX OR CMAKE_CXX_COMPILER_ID MATCHES &quot;Clang&quot;)
  add_compile_options(-Wall -Wextra -Wpedantic)
endif()

# Export compile commands for clangd
set(CMAKE_EXPORT_COMPILE_COMMANDS ON)

# find dependencies
find_package(ament_cmake REQUIRED)
find_package(rclcpp REQUIRED)
find_package(geometry_msgs REQUIRED)
find_package(nav_msgs REQUIRED)

set(CMAKE_AUTOUIC ON)
set(CMAKE_AUTOMOC ON)
set(CMAKE_AUTORCC ON)
set(CMAKE_CXX_STANDARD_REQUIRED ON)

include_directories(
  include
)

set(dependencies
  rclcpp
  geometry_msgs
  nav_msgs
)

add_executable(sentry_master src/sentry_master.cpp)
ament_target_dependencies(
  sentry_master
  ${dependencies}
)

install(TARGETS sentry_master
  DESTINATION lib/${PROJECT_NAME})

install(DIRECTORY launch
  DESTINATION share/${PROJECT_NAME})

if(BUILD_TESTING)
  find_package(ament_lint_auto REQUIRED)
  # the following line skips the linter which checks for copyrights
  # comment the line when a copyright and license is added to all source files
  set(ament_cmake_copyright_FOUND TRUE)
  # the following line skips cpplint (only works in a git repo)
  # comment the line when this package is in a git repo and when
  # a copyright and license is added to all source files
  set(ament_cmake_cpplint_FOUND TRUE)
  ament_lint_auto_find_test_dependencies()
endif()

ament_package()
</code></pre>
[ROS2][std_msgs::msg::Int8] publisher and create_publisher
<p>With an associative container like this, there is no guarantee that the data will come back in the order it was written in the file.</p> <p>If order is important, one would normally use a list.</p> <p>Another option would be to iterate over your own ordered list of keys:</p> <pre class="lang-cpp prettyprint-override"><code>std::vector&lt;std::string&gt; sets = {&quot;left_front&quot;, &quot;right_front&quot;, &quot;left_back&quot;, &quot;right_back&quot;};
for (const auto&amp; s : sets)
{
  ROS_ASSERT(wheelsets[s].hasMember(&quot;position&quot;));
  ROS_ASSERT(wheelsets[s][&quot;position&quot;].getType() == XmlRpc::XmlRpcValue::TypeArray);
  ROS_ASSERT(wheelsets[s][&quot;position&quot;].size() == 2);
  ROS_ASSERT(wheelsets[s].hasMember(&quot;steer_offset&quot;));
  ROS_ASSERT(wheelsets[s][&quot;steer_offset&quot;].getType() == XmlRpc::XmlRpcValue::TypeDouble);
  ROS_ASSERT(wheelsets[s].hasMember(&quot;wheel_radius&quot;));
  ROS_ASSERT(wheelsets[s][&quot;wheel_radius&quot;].getType() == XmlRpc::XmlRpcValue::TypeDouble);
}
</code></pre>
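<p>The order observed in the question is exactly alphabetical key order, which is what a sorted associative container (<code>XmlRpcValue</code> stores a struct in a <code>std::map</code>) produces. A quick illustration in plain Python:</p>

```python
# Keys in the order they appear in the .yaml file
file_order = ["left_front", "right_front", "left_back", "right_back"]

# A sorted associative container iterates keys alphabetically instead,
# which matches the order reported in the question
iteration_order = sorted(file_order)
print(iteration_order)
# ['left_back', 'left_front', 'right_back', 'right_front']
```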
105201
2023-11-05T03:53:13.213
|ros|yaml|xmlrpc|
<p>I'm using <code>XmlRpc::XmlRpcValue</code> to receive data from a .yaml file, but it seems that the received data order is different from the order in the .yaml file.</p> <p>Here is my .yaml file:</p> <pre><code>wheelsets:
  left_front:
    position: [ 0.585, 0.557 ]
    &lt;&lt;: &amp;default
      steer_offset: 0.0
      wheel_radius: 0.1175
  right_front:
    position: [ 0.585, -0.557 ]
    &lt;&lt;: *default
  left_back:
    position: [ -0.585, 0.557 ]
    &lt;&lt;: *default
  right_back:
    position: [ -0.585, -0.557 ]
    &lt;&lt;: *default
</code></pre> <p>And here is my code:</p> <pre><code>XmlRpc::XmlRpcValue wheelsets;
nh_.getParam(&quot;wheelsets&quot;, wheelsets);
ROS_ASSERT(wheelsets.getType() == XmlRpc::XmlRpcValue::TypeStruct);
for (const auto&amp; wheelset : wheelsets)
{
  ROS_ASSERT(wheelset.second.hasMember(&quot;position&quot;));
  ROS_ASSERT(wheelset.second[&quot;position&quot;].getType() == XmlRpc::XmlRpcValue::TypeArray);
  ROS_ASSERT(wheelset.second[&quot;position&quot;].size() == 2);
  ROS_ASSERT(wheelset.second.hasMember(&quot;steer_offset&quot;));
  ROS_ASSERT(wheelset.second[&quot;steer_offset&quot;].getType() == XmlRpc::XmlRpcValue::TypeDouble);
  ROS_ASSERT(wheelset.second.hasMember(&quot;wheel_radius&quot;));
  ROS_ASSERT(wheelset.second[&quot;wheel_radius&quot;].getType() == XmlRpc::XmlRpcValue::TypeDouble);
}
</code></pre> <p>The code runs successfully, but the data I receive is in the order <code>left_back</code>, <code>left_front</code>, <code>right_back</code>, <code>right_front</code>, which I guess is alphabetical order.</p> <p>Is there any way to receive the data in the same order as in the .yaml file?</p>
how to use XmlRpc::XmlRpcValue receive data in order
<p>I had a quick look at the KDL <code>CartToJnt()</code> method. I looked at the c++ code as I am less familiar with python, but the implementation should be identical:</p> <p>The <a href="https://github.com/orocos/orocos_kinematics_dynamics/blob/7478f96d01963db105e214ba79bd43709b0a3e5a/orocos_kdl/src/chainiksolver.hpp#L42-L58" rel="nofollow noreferrer"><code>ChainIkSolverPos</code></a> is the KDL interface class for inverse position kinematics. There seem to be three implementations:</p> <ul> <li><a href="https://github.com/orocos/orocos_kinematics_dynamics/blob/master/orocos_kdl/src/chainiksolverpos_nr.hpp" rel="nofollow noreferrer"><code>ChainIkSolverPos_NR</code></a>,</li> <li><a href="https://github.com/orocos/orocos_kinematics_dynamics/blob/master/orocos_kdl/src/chainiksolverpos_nr_jl.hpp" rel="nofollow noreferrer"><code>ChainIkSolverPos_NR_JL</code></a>,</li> <li><a href="https://github.com/orocos/orocos_kinematics_dynamics/blob/master/orocos_kdl/src/chainiksolverpos_lma.hpp" rel="nofollow noreferrer"><code>ChainIkSolverPos_LMA</code></a>.</li> </ul> <p>The Baxter SDK uses the <code>ChainIkSolverPos_NR</code>: see <a href="https://github.com/RethinkRobotics/baxter_pykdl/blob/8b95af39b0f7455aecc24ed89eb254eab1165754/src/baxter_pykdl/baxter_pykdl.py#L62-L64" rel="nofollow noreferrer">here</a>.</p> <p>This is an inverse position kinematics algorithm based on 'Newton-Raphson iterations':</p> <ul> <li>It starts from an <a href="https://github.com/orocos/orocos_kinematics_dynamics/blob/7478f96d01963db105e214ba79bd43709b0a3e5a/orocos_kdl/src/chainiksolverpos_nr.cpp#L50" rel="nofollow noreferrer">initial guess</a> for the joint positions,</li> <li>It calculates the <a href="https://github.com/orocos/orocos_kinematics_dynamics/blob/7478f96d01963db105e214ba79bd43709b0a3e5a/orocos_kdl/src/chainiksolverpos_nr.cpp#L54" rel="nofollow noreferrer">forward position kinematics</a> for that guess,</li> <li>It calculates the <a 
href="https://github.com/orocos/orocos_kinematics_dynamics/blob/7478f96d01963db105e214ba79bd43709b0a3e5a/orocos_kdl/src/chainiksolverpos_nr.cpp#L56" rel="nofollow noreferrer">displacement twist</a> (i.e. the diff between the effector frame corresponding to the 'guess' and the specified end effector frame),</li> <li>It then uses an inverse velocity solver to <a href="https://github.com/orocos/orocos_kinematics_dynamics/blob/7478f96d01963db105e214ba79bd43709b0a3e5a/orocos_kdl/src/chainiksolverpos_nr.cpp#L57" rel="nofollow noreferrer">calculate joint displacements</a> that realize the displacement twist (for the solver used in the Baxter SDK, see <a href="https://github.com/RethinkRobotics/baxter_pykdl/blob/8b95af39b0f7455aecc24ed89eb254eab1165754/src/baxter_pykdl/baxter_pykdl.py#L60" rel="nofollow noreferrer">here</a>),</li> <li>Finally these joint displacements are added to the guess values, which should move the 'guess joint values' towards the 'desired joint values', and</li> <li>These steps are iterated until the solution converges, see <a href="https://github.com/orocos/orocos_kinematics_dynamics/blob/7478f96d01963db105e214ba79bd43709b0a3e5a/orocos_kdl/src/chainiksolverpos_nr.cpp#L62-L63" rel="nofollow noreferrer">here</a>.</li> </ul> <p>I did not check the other two algorithms in detail, but the <code>NR_JL</code> one is 'Newton Raphson with Joint Limits', so should be more or less identical, and the <code>LMA</code> one also iterates based on an initial guess.</p> <p>So it is clear that these algorithms will converge to one or another configuration based on the 'initial guess' values, i.e. the seed values for the algorithm.</p> <p>There is no &quot;one size fits all&quot; solution to choosing good seed values; this is highly application dependent.
E.g.:</p> <ul> <li>If, for your application, you know that the robot will always start from a known configuration, then an obvious choice for the seed values would be the joint positions of that configuration,</li> <li>If you know that there are a few 'main configurations' in your task, then you could define seed values for each of these main configurations, and choose a set based on which main configuration your target is closest to (this needs a 'closest' metric though, which is not necessarily straightforward if it comprises both position and orientation),</li> <li>If your goal is to move from one end-effector pose to another, you could calculate intermediate poses along that trajectory, and calculate for each intermediate pose the joint positions, using the previous joint positions as seed value (though this could equally well lead to undesired configurations, depending on which target poses you require in your application),</li> <li>You could calculate multiple times, with random seed values, and compare the results (but then you need a metric to discriminate a 'desired robot configuration' from an 'undesirable configuration'),</li> <li>Etc.</li> </ul> <p><strong>EDIT:</strong></p> <p>It is important to make a distinction between:</p> <ul> <li><p>A general kinematics framework, such as KDL, that is intended to work for <em>any</em> kind of robot manipulator,</p> <p>vs</p> </li> <li><p>A custom, robot-specific implementation, that is optimized for, but only works for <em>just one kind</em> of robot.</p> </li> </ul> <p>E.g. a <code>KDL::Chain</code> can be defined for a 4 DOF palletizing robot, a 5 DOF SCARA robot and a 6 DOF manipulator, and for each of those chains KDL can provide forward and inverse kinematics.</p> <p>But: if your task will always be executed by the same type of robot (e.g. a UR arm), then you can typically derive an analytical solution for the forward and inverse kinematics and implement that instead. It will be more efficient (e.g.
not iterative), and can be tailored to your needs.</p> <p>Industrial controllers typically use robot-specific implementations and not general-purpose, iterative solutions.</p> <p>There also exist general solutions for analytical inverse kinematics, e.g. <a href="http://openrave.org/docs/0.8.2/openravepy/ikfast/" rel="nofollow noreferrer">OpenRave</a>, but I have no experience with those.</p> <p>If you are looking for specific IK implementations for UR robots, then <a href="https://www.google.com/search?sca_esv=580067936&amp;sxsrf=AM9HkKkNGLAeaW9WcSl-lKVF16ryTAmzNA:1699347116132&amp;q=UR+analytical+kinematics+ros&amp;nirf=YOUR+analytical+kinematics+ros&amp;sa=X&amp;ved=2ahUKEwiGytOpwbGCAxVGhf0HHepICukQ8BYoAXoECAoQAg&amp;biw=3370&amp;bih=1304&amp;dpr=1" rel="nofollow noreferrer">this google search</a> yields some interesting results, e.g.:</p> <ul> <li><a href="https://github.com/ros-industrial/universal_robot/tree/melodic-devel/ur_kinematics" rel="nofollow noreferrer">ur_kinematics</a></li> <li><a href="https://gramaziokohler.github.io/compas_fab/latest/examples/06_backends_kinematics/01_ik_and_cartesian.html" rel="nofollow noreferrer">COMPAS FAB - Analytical kinematics</a></li> </ul> <p>And the best reference is probably the <a href="https://github.com/UniversalRobots" rel="nofollow noreferrer">official Universal Robots repository</a> which mentions <a href="https://github.com/UniversalRobots/Universal_Robots_ROS2_Driver#moveit-support" rel="nofollow noreferrer">MoveIt support</a>.</p>
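<p>The seed dependence described above is easy to reproduce outside KDL. Below is a sketch of the same Newton-Raphson iteration on a planar 2-link arm (unit link lengths, position only — this is not the UR3e kinematics or the KDL API, just an illustration; <code>fk</code> and <code>ik_newton</code> are made-up names): the same target yields the elbow-up or elbow-down joint solution purely depending on the seed:</p>

```python
import math

L1, L2 = 1.0, 1.0  # illustrative link lengths

def fk(t1, t2):
    """Forward position kinematics of a planar 2-link arm."""
    return (L1 * math.cos(t1) + L2 * math.cos(t1 + t2),
            L1 * math.sin(t1) + L2 * math.sin(t1 + t2))

def ik_newton(target, seed, iters=100, tol=1e-9):
    """Newton-Raphson IK: iterate joint corrections from an initial guess."""
    t1, t2 = seed
    for _ in range(iters):
        x, y = fk(t1, t2)
        ex, ey = target[0] - x, target[1] - y   # Cartesian error
        if math.hypot(ex, ey) < tol:
            break
        s1, s12 = math.sin(t1), math.sin(t1 + t2)
        c1, c12 = math.cos(t1), math.cos(t1 + t2)
        # Jacobian of fk, solved as a 2x2 system: J * dtheta = error
        j11, j12 = -L1 * s1 - L2 * s12, -L2 * s12
        j21, j22 = L1 * c1 + L2 * c12, L2 * c12
        det = j11 * j22 - j12 * j21
        if abs(det) < 1e-12:                    # singular configuration: give up
            break
        t1 += ( j22 * ex - j12 * ey) / det
        t2 += (-j21 * ex + j11 * ey) / det
    return t1, t2

target = (1.2, 0.8)
elbow_a = ik_newton(target, seed=(0.0, 1.0))    # seed near the positive-t2 branch
elbow_b = ik_newton(target, seed=(1.2, -1.0))   # seed near the negative-t2 branch
print(elbow_a, elbow_b)  # two different joint solutions for the same end point
```

Both results reach the same Cartesian target, but with opposite elbow configurations, which is exactly the behavior the seed values control in the KDL solvers.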
105221
2023-11-06T09:33:01.633
|inverse-kinematics|universal-robot|kdl|ik|
<p>I am using PyKDL to compute IK solutions for the UR3e arm. However, as we know, multiple solutions are available for IK of the 6-DoF arm, and I need to identify the preferred one. Industrial arms have a variable (configuration) to specify these solutions. But here I am looking for ROS-way. The selected solution will be used to define the arm's trajectory in the future. Please see below to find some cases:</p> <p>Case 1: <a href="https://i.stack.imgur.com/Lg3Ag.jpg" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/Lg3Ag.jpg" alt="enter image description here" /></a></p> <p>Case 2: <a href="https://i.stack.imgur.com/HlpeW.jpg" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/HlpeW.jpg" alt="enter image description here" /></a></p> <p>The code snippet is similar to the one employed by Baxter SDK <a href="https://github.com/RethinkRobotics/baxter_pykdl/blob/8b95af39b0f7455aecc24ed89eb254eab1165754/src/baxter_pykdl/baxter_pykdl.py#L126-L153" rel="nofollow noreferrer">here</a>.</p>
Getting preferred (elbow-up/down) IK solution in PyKDL
<p>Not really for pioneer, but came across this while trying to figure out generally how to go from URDF to SDF (and google took me here for some reason :-)</p> <p>I created robot.urdf.xacro and then:</p> <ul> <li>xacro robot.urdf.xacro &gt; robot.urdf</li> <li>gz sdf -p robot.urdf &gt; robot.sdf</li> <li>Add following to main SDF file for Gazebo:</li> </ul> <pre><code>&lt;include&gt; &lt;name&gt;Robot&lt;/name&gt; &lt;pose&gt;1 1 0 0 0 0&lt;/pose&gt; &lt;uri&gt;file://robot.sdf&lt;/uri&gt; &lt;/include&gt; </code></pre>
105241
2023-11-06T22:46:49.393
|ignition-fortress|
<p>I'm trying to include a <a href="https://app.gazebosim.org/OpenRobotics/fuel/models/Pioneer%202DX" rel="nofollow noreferrer">pioneer model</a> into an existing .sdf world using &lt;include&gt;, but the <a href="http://sdformat.org/tutorials?tut=spec_world&amp;cat=specification&amp;#models-defined-in-other-files" rel="nofollow noreferrer">existing documentation</a> is less than clear about how to do things (and simply using the names of the packages didn't work) and the <a href="https://gazebosim.org/docs/fortress/fuel_insert" rel="nofollow noreferrer">gazebo tutorials</a> only cover inserting from fuel. I tried several little variations of the examples (such as using the name specified in model.sdf and adding &quot;model://&quot; before either name), but nothing worked.</p> <p>Can someone tell me how to properly include models from other files into a .sdf world, or at least point me at some obscure tutorial that explains this properly? Thanks in advance.</p> <p>Edit: found how to fix it. The model directory must be renamed to the model name specified on the model.sdf file.</p>
How to include models from local files into existing world? (ROS2)
<p>You haven't provided a fully reproducible example to show what is actually going wrong. But based on your description it looks like you're setting that parameter twice and the one that you're trying to parameterize is lower in precedence.</p> <p>The last instance of the parameter is the one used. <a href="http://wiki.ros.org/roslaunch/XML#Evaluation_order" rel="nofollow noreferrer">http://wiki.ros.org/roslaunch/XML#Evaluation_order</a></p>
105250
2023-11-07T11:19:32.060
|rosparam|
<p>I am trying to change the parameter <code>map_frame</code> within the rosparam list as shown below, but the substitution does not work.</p> <p>I am actually passing the <code>&quot;tf_pre&quot;</code> argument from another launch file.</p> <p>This line <code>&lt;rosparam param=&quot;map_frame&quot; subst_value=&quot;True&quot;&gt;$(arg tf_pre)/map &lt;/rosparam&gt;</code> is used to pass the value of <code>tf_pre</code> to the value of <code>map_frame</code> inside the <code>&lt;rosparam&gt;</code> list.</p> <p>This does not seem to have any effect, because when I run <code>rosparam get /h1/slam_gmapping/map_frame</code> I get the wrong value.</p> <p>Does anyone have an idea what could be wrong?</p> <p>This runs in ROS Noetic, and I am trying to SLAM a multi-robot Gazebo simulation.</p> <pre><code>&lt;launch&gt;
  &lt;arg name=&quot;tf_pre&quot; default=&quot;robot1&quot; /&gt;

  &lt;node pkg=&quot;gmapping&quot; type=&quot;slam_gmapping&quot; name=&quot;slam_gmapping&quot;&gt;
    &lt;rosparam param=&quot;map_frame&quot; subst_value=&quot;True&quot;&gt;$(arg tf_pre)/map &lt;/rosparam&gt; # prefix the tf name for the map
    &lt;rosparam&gt;
      odom_frame: odom
      base_frame: base_link
      map_frame: map
      ...
    &lt;/rosparam&gt;
  &lt;/node&gt;
&lt;/launch&gt;
</code></pre>
Parameter substitution within rosparam list inside launch file
<p>I resolved it by redesigning the model in SolidWorks.</p> <p>All the wheels should be touching the floor. In my case the caster wheel was above the ground by 1 mm and Gazebo is not happy with that, so check the coordinate system and axis of each part carefully.</p>
105251
2023-11-07T11:20:44.773
|gazebo|urdf|simulation|differential-drive|sw-urdf-exporter|
<p>I generated my URDF using the SW2URDF plugin in SolidWorks, following this <a href="https://github.com/ageofrobotics/import_your_custom_urdf_package_to_ROS-main/blob/2e713d1acf99981a315667f32bbb82ab184ffcfe/Importing_URDF_Package_from_Soloidworks_in_ROS.pdf" rel="nofollow noreferrer">instruction</a>. From the instruction, I need to add a &quot;world&quot; link and joint and set its child link to the base link. (This is needed because I get the warning &quot;<strong>The root link base_link has an inertia specified in the URDF, but KDL does not support a root link with an inertia.</strong>&quot; and the model does not appear if it is not added.) One thing I noticed is that the xyz origin of the &quot;world&quot; joint is 0.17, which makes my model float in Gazebo.</p> <pre><code>&lt;joint name=&quot;world_joint&quot; type=&quot;fixed&quot;&gt;
  &lt;parent link=&quot;world&quot;/&gt;
  &lt;child link=&quot;base_link&quot;/&gt;
  &lt;origin rpy=&quot;0 0 0&quot; xyz=&quot;0.0 0.0 0.17&quot;/&gt;
&lt;/joint&gt;
</code></pre> <p><a href="https://i.stack.imgur.com/CgWWQ.jpg" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/CgWWQ.jpg" alt="enter image description here" /></a> I don't know why, but I expected gravity to take effect: even if the model starts out floating, it should drop once the simulation is running. I then checked the &quot;link&quot; of the &quot;ground_plane&quot; under the &quot;Model&quot; tab and gravity is set to False; I tried to change it to True, but it always reverts to False.</p> <p>I also tried changing the xyz origin of the &quot;world&quot; joint, expecting the model to come into contact with the ground:</p> <pre><code>&lt;joint name=&quot;world_joint&quot; type=&quot;fixed&quot;&gt;
  &lt;parent link=&quot;world&quot;/&gt;
  &lt;child link=&quot;base_link&quot;/&gt;
  &lt;origin rpy=&quot;0 0 0&quot; xyz=&quot;0.0 0.0 0.0&quot;/&gt;
&lt;/joint&gt;
</code></pre> <p>But then all the models end up at the center. What could be the problem? <a href="https://i.stack.imgur.com/YLRwl.jpg" rel="nofollow noreferrer"><img src="https://i.stack.imgur.com/YLRwl.jpg" alt="enter image description here" /></a> The model is good only when the z value is 0.17.</p> <p>BTW, I am building a differential drive robot. I also added libgazebo_ros_diff_drive to drive the wheels; no problem there, as the wheels move when I publish to /cmd_vel, but the robot does not move because it is floating.</p>
SW2URDF model is floating in Gazebo
<p>Sounds like you should leverage the power of the behavior tree to do this work and remove parts of the BT XML that aren't necessarily helpful for your application in these situations. This is one of the main benefits of Nav2 -- you should get familiar with behavior trees so that you may design your own logic for how this is handled so that you do get the optimal behavior.</p> <p><strong>This is why the Behavior Tree exists.</strong></p>
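<p>For example, a heavily trimmed navigate-through-poses tree could look like this (a sketch based on the stock Nav2 BT, with the recovery subtree stripped so a planning failure fails fast instead of triggering recoveries -- check the node names against the default XML shipped with your Nav2 version):</p> <pre><code>&lt;root main_tree_to_execute=&quot;MainTree&quot;&gt;
  &lt;BehaviorTree ID=&quot;MainTree&quot;&gt;
    &lt;PipelineSequence name=&quot;NavigateWithReplanning&quot;&gt;
      &lt;RateController hz=&quot;1.0&quot;&gt;
        &lt;ComputePathThroughPoses goals=&quot;{goals}&quot; path=&quot;{path}&quot; planner_id=&quot;GridBased&quot;/&gt;
      &lt;/RateController&gt;
      &lt;FollowPath path=&quot;{path}&quot; controller_id=&quot;FollowPath&quot;/&gt;
    &lt;/PipelineSequence&gt;
  &lt;/BehaviorTree&gt;
&lt;/root&gt;
</code></pre>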
105256
2023-11-07T14:42:43.247
|navigation|ros2|path-planning|nav2|global-planner|
<p>I have created a C++ executable whose purpose is to generate Poses based on the coordinates provided by the user. More specifically, the user provides a number of points in Cartesian coordinates (e.g., (x1,y1), (x2, y2), (x3,y3)) and then the program creates a number of poses between those points using linear interpolation, where the number of subpoints between two points is proportional to the distance between those two points (e.g., if point a has a Euclidean distance of 2 meters from b, then we will generate 4 subpoints between those two points). Then, I dynamically delete some of those poses based on whether that pose is in a grid cell with a high cost (to do that I subscribe to the global costmap).</p> <p>As the vehicle moves, more and more subpoints are deleted since the LIDAR detects either previously unseen static obstacles (e.g., a side of a cubic obstacle that was previously unseen) or dynamic obstacles. Each time a subpoint is deleted I send the new goal path to the Nav2 stack (<code>FollowPath</code>). That way we do not run into problems where the planner is trying to generate a Navigate Through Poses path which runs through a subpoint that is on top of an obstacle (that would lead the planner to fail to generate a valid path).</p> <p>This process generally works smoothly; however, sometimes our LIDAR suddenly detects part of an obstacle that is on top of a pose that we are supposed to move to (e.g., when we are turning around one of the edges of a large cubic obstacle). Before my code has time to delete this subpoint, the planner server already fails to create a plan (as no valid path is found, since we are &quot;trying&quot; to move towards a pose that is on top of an obstacle).
Then the progress of the vehicle along its path is delayed by 15 to 30 seconds (it effectively stays still), since a new global costmap is requested from the planner and we go into recovery behavior before, finally, the bt-navigator receives a goal preemption with the new NavThroughPoses path. From the logs and the terminal I can see that the planner outputs a warning that we &quot;cannot&quot; generate a valid path, and 10 ms later the subpoint is deleted by my script. However, the progress of the vehicle is delayed by around 20 seconds and not 10 ms, as the planner goes into recovery behavior etc. If my script were faster than the planner server, that problem would not occur, since the path considered by the planner would already contain only &quot;good&quot; subwaypoints.</p> <p>Any advice on how to tackle this problem? Ideally, I would like to solve this problem by editing my script (e.g., through the <code>FollowPath</code> server I create), but I am not entirely sure that this is actually possible.</p> <p><strong>Edit</strong>: Taking a look at the <code>RCLCPP INFO</code> output, one can see the following:</p> <ol> <li>In the <code>FollowPath</code> server terminal:</li> </ol> <pre><code>[WARN] [1699017481.380706307] [costmap_reader]: Subwaypoint at location (17.870499, 63.003998) with cost 40 was erased (OBSTACLE)
Size of subpoints: 34
[WARN] [1699017481.381329198] [SimpleWaypointFollower]: Number of the goal poses is 34
[INFO] [1699017481.381367433] [simple_path_waypoints_client]: Sending goal poses
[INFO] [1699017481.382638981] [simple_path_waypoints_client]: Goal accepted by server, waiting for result
</code></pre> <ol start="2"> <li>From the <code>planner</code> and rest of the <code>Nav Stack</code>:</li> </ol> <pre><code>[planner_server-4] [WARN] [1699017481.371018314] [planner_server]: GridBased: failed to create plan, no valid path found.
[planner_server-4] [WARN] [1699017481.385618140] [planner_server]: Planning algorithm GridBased failed to generate a valid path to (18.40, 56.45)
[planner_server-4] [WARN] [1699017481.386603146] [planner_server]: [compute_path_through_poses] [ActionServer] Aborting handle.
</code></pre> <p>Do note the <strong>timestamp difference</strong> between the two outputs (it is around 10 ms).</p>
Preempt global planner or give higher priority to FollowPath server
<p>If you are asking for a package that will do that, you can use <code>robot_localization</code>, but you'll run into some trouble. Your IMU, at most, will give you linear acceleration, rotational velocity, and absolute orientation. A pressure sensor, once converted to a depth measurement, will provide Z position. Without a reference for either X and Y position or at least X and Y velocity, your state estimate is going to explode.</p> <p>We're also aiming to deprecate <code>robot_localization</code> in favor of <code>fuse</code>, but we need to implement some 3D sensor models first.</p> <p><a href="https://docs.ros.org/en/noetic/api/robot_localization/html/index.html" rel="nofollow noreferrer">https://docs.ros.org/en/noetic/api/robot_localization/html/index.html</a> <a href="https://docs.ros.org/en/noetic/api/fuse_doc/html/index.html" rel="nofollow noreferrer">https://docs.ros.org/en/noetic/api/fuse_doc/html/index.html</a></p>
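<p>For the Z reference specifically, converting pressure to depth is simple hydrostatics: depth = (p - p_surface) / (rho * g). A sketch (the constants are assumptions for seawater; use roughly 1000 kg/m^3 for fresh water), whose output would go into the Z field of a <code>PoseWithCovarianceStamped</code> fed to the filter:</p>

```python
RHO = 1025.0          # assumed seawater density, kg/m^3
G = 9.80665           # standard gravity, m/s^2
P_SURFACE = 101325.0  # assumed surface pressure, Pa

def depth_from_pressure(p_abs):
    """Depth below the surface in meters (positive down) from absolute pressure in Pa."""
    return (p_abs - P_SURFACE) / (RHO * G)

print(depth_from_pressure(101325.0))  # 0.0 at the surface
print(depth_from_pressure(202650.0))  # roughly 10.08 m at about two atmospheres
```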
105261
2023-11-07T20:23:18.277
|ros2|imu|ros-humble|sensor-fusion|nav2|
<p>Can I fuse a depth sensor's or a pressure sensor's readings with an IMU to publish /odom (odometry) in the Nav2 stack? I don't have wheel encoders on my robot, and I don't know what to use. Any resources?</p>
Fusing pressure/depth sensor/IMU for Nav2
<p>As explained in <a href="https://github.com/ros2/rmw_fastrtps/issues/733" rel="nofollow noreferrer">this issue</a>, there was a change in rmw_fastrtps_cpp on rolling distro causing the unhandled error in humble. The linked PR fixes my issue.</p>
105276
2023-11-08T13:18:51.537
|ros2|ros-humble|dds|ros-iron|
<p>We experienced that humble ROS executables crash if there are any iron nodes running in the same network. This can be reproduced by running <code>ros2 topic list</code> without any sourced workspace: it returns <code>std::bad_alloc</code> without any further warning. We use fastrtps with default settings.</p> <p>A stack trace from the debugger gives:</p> <pre><code>#0  0x00007ffff62ad265 in __cxa_begin_catch () from /lib/x86_64-linux-gnu/libstdc++.so.6
#1  0x00007ffff62ae4d3 in __cxa_throw () from /lib/x86_64-linux-gnu/libstdc++.so.6
#2  0x00007ffff62a27ac in ?? () from /lib/x86_64-linux-gnu/libstdc++.so.6
#3  0x00007fff72ade915 in ?? () from /opt/ros/humble/lib/librmw_dds_common__rosidl_typesupport_fastrtps_cpp.so
#4  0x00007fff72adeeb7 in rmw_dds_common::msg::typesupport_fastrtps_cpp::cdr_deserialize(eprosima::fastcdr::Cdr&amp;, rmw_dds_common::msg::NodeEntitiesInfo_&lt;std::allocator&lt;void&gt; &gt;&amp;) () from /opt/ros/humble/lib/librmw_dds_common__rosidl_typesupport_fastrtps_cpp.so
#5  0x00007fff72adf1f7 in rmw_dds_common::msg::typesupport_fastrtps_cpp::cdr_deserialize(eprosima::fastcdr::Cdr&amp;, rmw_dds_common::msg::ParticipantEntitiesInfo_&lt;std::allocator&lt;void&gt; &gt;&amp;) () from /opt/ros/humble/lib/librmw_dds_common__rosidl_typesupport_fastrtps_cpp.so
#6  0x00007fff72753a39 in ?? () from /opt/ros/humble/lib/librmw_fastrtps_cpp.so
#7  0x00007fff727049b6 in rmw_fastrtps_shared_cpp::TypeSupport::deserialize(eprosima::fastrtps::rtps::SerializedPayload_t*, void*) () from /opt/ros/humble/lib/librmw_fastrtps_shared_cpp.so
#8  0x00007fff7242f42a in ?? () from /opt/ros/humble/lib/libfastrtps.so.2.6
#9  0x00007fff720eced2 in eprosima::fastdds::dds::DataReaderImpl::read_or_take(eprosima::fastdds::dds::LoanableCollection&amp;, eprosima::fastdds::dds::LoanableSequence&lt;eprosima::fastdds::dds::SampleInfo, std::integral_constant&lt;bool, true&gt; &gt;&amp;, int, eprosima::fastrtps::rtps::InstanceHandle_t const&amp;, unsigned short, unsigned short, unsigned short, bool, bool, bool) () from /opt/ros/humble/lib/libfastrtps.so.2.6
#10 0x00007fff720ed07a in eprosima::fastdds::dds::DataReaderImpl::take(eprosima::fastdds::dds::LoanableCollection&amp;, eprosima::fastdds::dds::LoanableSequence&lt;eprosima::fastdds::dds::SampleInfo, std::integral_constant&lt;bool, true&gt; &gt;&amp;, int, unsigned short, unsigned short, unsigned short) () from /opt/ros/humble/lib/libfastrtps.so.2.6
#11 0x00007fff726fc3e6 in rmw_fastrtps_shared_cpp::_take(char const*, rmw_subscription_s const*, void*, bool*, rmw_message_info_s*, rmw_subscription_allocation_s*) () from /opt/ros/humble/lib/librmw_fastrtps_shared_cpp.so
#12 0x00007fff726eb19f in ?? () from /opt/ros/humble/lib/librmw_fastrtps_shared_cpp.so
#13 0x00007ffff62dc253 in ?? () from /lib/x86_64-linux-gnu/libstdc++.so.6
#14 0x00007ffff7c94ac3 in start_thread (arg=&lt;optimized out&gt;) at ./nptl/pthread_create.c:442
#15 0x00007ffff7d26a40 in clone3 () at ../sysdeps/unix/sysv/linux/x86_64/clone3.S:81
</code></pre> <p>While we could maybe use different domain IDs for different projects, we'd like to use different ROS distros within the same distributed ROS project (like now, some packages are in a transition phase from humble to iron, but not all are ported yet).
The problem is not just that the topics aren't readable: the nodes crash immediately.</p> <ul> <li>Should this be possible in principle?</li> <li>Were there intentional changes that make the different distros incompatible?</li> <li>Is this a bug somewhere in the RMW layer or Fast DDS?</li> </ul> <p>Any hints would be highly appreciated!</p>
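<p>As a stopgap while both distros coexist, the domain-ID isolation mentioned above can be sketched as below. The value 42 is arbitrary; any unused ID works, as long as Humble and Iron hosts use different ones so their DDS discovery traffic never meets.</p>

```shell
# Hypothetical workaround sketch: keep Humble and Iron machines on
# separate DDS domains until every package is ported.
export ROS_DOMAIN_ID=42        # e.g. on all Humble hosts
# On Iron hosts, use a different ID instead:
#   export ROS_DOMAIN_ID=43
echo "domain: $ROS_DOMAIN_ID"
```

<p>This only avoids the crash by preventing cross-distro discovery; it does not let the two distros exchange topics.</p>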
`ros2 topic list` from humble throws std::bad_alloc if iron runs on the same network