Hello,
I am currently implementing a simulation in which a robot explores and maps an unknown environment. An exploration algorithm picks a frontier cell for the robot to move towards and sends it to move_base, which does the rest. This works for some time, but then move_base begins to produce a faulty path, one that cuts right through a high-cost region, as shown in the [figure](https://drive.google.com/folderview?id=0B-duhOA4E85LcWZrbGVVTHhZblE&usp=sharing). The Drive folder in the link also contains the rqt_graph, the tf tree, and a screenshot of the Stage simulation.
At this stage, the robot stops moving and continuously displays the warning:
**Unable to get starting pose of robot, unable to create global plan**
This is then followed by the errors:
**Cannot clear map because pose cannot be retrieved**
and
**Aborting because a valid plan could not be found. Even after executing all recovery behaviors.**
If I then teleoperate the robot a bit and restart the whole process, it continues exploring for a while, until move_base again plans a path through a high-cost region.
Am I making any obvious mistakes here? Should I retune my costmap parameters? Is there any way I can make move_base provide an alternate path that does not traverse the high-cost region?
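For reference, a minimal sketch of the inflation parameters that usually govern how strongly the planner is pushed away from high-cost cells (the values below are illustrative assumptions, not tuned settings):

```yaml
# Costmap common parameters (sketch; values are placeholders)
inflation_radius: 0.8        # how far out from obstacles cost is spread
cost_scaling_factor: 2.0     # smaller value -> cost decays more slowly,
                             # so paths are pushed further from obstacles
```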
If anyone has any suggestions at all on how I can prevent this from happening, please share them; I would greatly appreciate it.
Thank You.
↧
Path Planning error with Move_Base
↧
Steps to use IR sensor
Hi,
I built an infrared sensor ring, but I can't figure out how to use it for navigating the TurtleBot 2. I can show the distances sent by the sensors in RViz (using LaserScan messages), but I don't know how to use these data to update the costmap. I use the default TurtleBot launch files, because I also use the Kinect, bumpers, etc., and I want to add the IR sensors to this configuration. Please tell me what steps I should take, given that the hardware and the data in the correct format are ready.
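For concreteness, a sketch of what adding the ring as an extra observation source to the costmap's obstacle handling could look like (the topic, frame, and source names are assumptions for illustration):

```yaml
# costmap_common_params.yaml (sketch; names are assumed)
observation_sources: scan bump ir_ring
ir_ring:
  topic: /ir_scan
  sensor_frame: ir_ring_link
  data_type: LaserScan
  marking: true
  clearing: true
```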
Best regards
↧
↧
Simple 2D Navigation Goal makes robot drive in circles endlessly
Hi
I have created a URDF for a simple three-wheeled differential-drive robot with a laser scanner on top of it.
All packages created for ROS Hydro can be found [here](https://code.google.com/p/ros-cpp-samples/source/browse/trunk/hydro/robone/)
In the package robone_description there is a folder urdf with the robot description. In the folder launch there is a file display.launch which starts RViz with the robot model and a robot state publisher node.
The package robone_2dnav contains two launch files
1. gmapping.launch - starts SLAM gmapping
2. move_base.launch - starts move base node
Configuration for those nodes can be found in the config folder of the robone_2dnav package.
I have also created the package robone_gazebo to start a simulation of the robot.
*Please ignore the package robone_main. It has nothing to do with the issue.*
Previously I had some issues with SLAM gmapping, but now the map of the environment is good.
To reproduce the issue, I perform the following steps:
- roslaunch robone_gazebo robone.launch
- roslaunch robone_description display.launch
- roslaunch robone_2dnav gmapping.launch
- roslaunch robone_2dnav move_base.launch
- In Gazebo, the office wall will block the view. Right-click on the working area of Gazebo and in the context menu select View->Wireframe, so you'll be able to see the robot in the building's room.
- In RViz select “2D Nav Goal” in the toolbar. Click on point approximately 1 meter in front of robot.
**Result**
The robot starts making clockwise circles in the room indefinitely, or until it collides with a wall.
**Expected behavior**
Go directly to the point 1 m in front of the robot. There are no obstacles to avoid on its path.
> The question is: what am I doing wrong in this simple setup of the navigation stack?
If a bag file is needed, please provide the names of the topics you need, because a bag of all topics is too big to send.
↧
Moving from no layers to layered costmaps
Hello,
I did all the setup to get my robot working with 2D navigation in ROS Fuerte. After that, I moved to ROS Hydro and got it to work again. Now I want to change to layered costmaps, where the current map is in one layer (let's call it Obstacles) and the person I'm following is in another layer (let's call it PersonLayer). How can I do that? Is there an easier-to-understand and/or more complete tutorial than the ones in the costmap_2d section of the wiki? I'm having some trouble understanding those.
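For context, a sketch of what I understand a Hydro layered-costmap configuration to look like; the PersonLayer name and package are assumptions, and such a layer would have to be written as a costmap_2d::Layer plugin and registered via pluginlib:

```yaml
global_costmap:
  plugins:
    - {name: static_layer,    type: "costmap_2d::StaticLayer"}
    - {name: obstacles,       type: "costmap_2d::ObstacleLayer"}
    - {name: person_layer,    type: "my_pkg::PersonLayer"}   # hypothetical custom layer
    - {name: inflation_layer, type: "costmap_2d::InflationLayer"}
```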
Thanks!
↧
How to navigate without localization
We're using **turtlebot_gazebo** now, and every time we **navigate**, it uses **amcl**. However, in our task, the robot just needs to **avoid obstacles and follow a target person within a small range of view**. In short, we want to **navigate without localization**. I suspect it is something related to tf and move_base, but I am not quite clear on what to do. Can it be easily done? What should we do? *Many thanks!*
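One approach, sketched below, is to replace amcl with a static identity map->odom transform so that move_base runs purely on odometry (the frame names are assumed to match the default turtlebot setup):

```xml
<!-- Sketch: static identity transform in place of amcl -->
<node pkg="tf" type="static_transform_publisher" name="map_to_odom"
      args="0 0 0 0 0 0 map odom 100" />
```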
↧
↧
costmap beginner question [update]
I'm using move_base, and there are some local_costmap topics.
I'm following a tutorial that uses grid cells to display the cost on the map,
but I find nothing, as shown below with rqt_graph.
What should I do?


Best,
Evoe
↧
How does Dijkstra's algorithm work inside navfn?
Hi,
I have gone through [this](http://answers.ros.org/question/63370/dijkstra-global-path-planner/) and [this as well](http://answers.ros.org/question/28366/why-navfn-is-using-dijkstra/) .
Can someone explain in detail how the global path planner works?
***EDIT:***
Following your suggestion, I have visited [NavfnROS](http://docs.ros.org/api/navfn/html/classnavfn_1_1NavfnROS.html#a01635a5d9947ae1b51ce70603db0c843) and [navfn](http://docs.ros.org/api/navfn/html/classnavfn_1_1NavFn.html). In navfn_ros.cpp, at [line 197](http://docs.ros.org/api/navfn/html/navfn__ros_8cpp_source.html#l00192), the makePlan() method is defined, which calls calcNavFnDijkstra(true) at [line 285](http://docs.ros.org/api/navfn/html/navfn__ros_8cpp_source.html#l00192). calcNavFnDijkstra(true) is defined [here, at line 290](http://docs.ros.org/api/navfn/html/navfn_8cpp_source.html#l00290). It uses:
    propNavFnDijkstra(std::max(nx*ny/20, nx+ny), atStart);

    // path
    int len = calcPath(nx*ny/2);
***END***
I want to remove only Dijkstra's algorithm; I don't want to change
the **costmap generation, cell updates, or obstacle avoidance**.
I know what a potential field is, but I have no idea how it is used here. How does the gradient at [line 989](http://docs.ros.org/api/navfn/html/navfn_8cpp_source.html#l00290) help in making the global plan?
I am using ROS Fuerte.
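To make the discussion concrete, here is a minimal, self-contained Dijkstra sketch on a 4-connected grid. This is only an illustration of the algorithm, not navfn's actual implementation; navfn additionally builds a potential field over the whole grid and extracts the path by following its gradient.

```python
import heapq

def dijkstra_grid(costmap, start, goal):
    """Dijkstra's algorithm on a 4-connected grid.

    costmap: 2D list of non-negative traversal costs (None = lethal cell).
    start, goal: (row, col) tuples.
    Returns the cell path from start to goal, or None if unreachable.
    """
    rows, cols = len(costmap), len(costmap[0])
    dist = {start: 0}
    parent = {start: None}
    queue = [(0, start)]            # min-heap of (distance, cell)
    while queue:
        d, cell = heapq.heappop(queue)
        if cell == goal:
            break
        if d > dist[cell]:
            continue                # stale heap entry, already relaxed
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and costmap[nr][nc] is not None:
                nd = d + costmap[nr][nc]
                if nd < dist.get((nr, nc), float('inf')):
                    dist[(nr, nc)] = nd
                    parent[(nr, nc)] = cell
                    heapq.heappush(queue, (nd, (nr, nc)))
    if goal not in parent:
        return None
    path, cell = [], goal
    while cell is not None:         # walk parents back to start
        path.append(cell)
        cell = parent[cell]
    return path[::-1]
```

For example, on a 3x3 grid of unit costs with a lethal cell in the centre, the returned path detours around it.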
↧
amcl not reacting to /initialpose topic
Hi folks,
I've set up the navigation stack for my robot and all the frames seem to be correct (according to my tf tree: map->odom->base_link, etc.). If I move the robot, I can see it moving in the map, as well as the updated feed from all sensors.
The problem I am having is that AMCL is subscribed to the /initialpose topic, but when I give the robot an initial pose (using the "2D Pose Estimate" button), the robot does not move in rviz.
When I do "rostopic echo /initialpose" I can see the data changing, but the robot image does not move in rviz.
↧
Navigation Stack with omni-wheel Robot
Hi everybody,
I worked through the navigation stack tutorial yesterday, and now I am wondering how to get the stack running on my 3-wheeled omni-wheel robot equipped with an Arduino Nano, an old laptop, and an MS Kinect. The ROS stack is already running, as well as freenect. (It may look a bit like a Robotino.)
It would be great if you could give me some hints on how to get it running. I am especially confused about how to set up control for the three omni-wheels.
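As a starting point, and purely as a sketch with placeholder values: base_local_planner has a holonomic mode that samples strafing velocities, which seems relevant for an omni-wheel base (whether it fits a 3-wheel omni drive is an assumption on my part):

```yaml
TrajectoryPlannerROS:
  holonomic_robot: true
  y_vels: [-0.3, -0.1, 0.1, 0.3]   # strafing velocities to sample (placeholders)
  max_vel_x: 0.4                   # placeholder limits
  min_vel_x: 0.05
```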
Thanks very much in advance!
Karamba
↧
↧
move_base accuracy thresholds
Hi there,
I'm using a turtlebot to do some navigation work and have some high-level logic that determines where the turtlebot should go (I have regions of interest on my map, and this higher-level logic chooses a specific region and decides to move the robot to the centre of it).
One of the issues I ran into is that when it picks a location and passes the co-ordinates to the [move_base](http://wiki.ros.org/move_base) server (which seems to be using [global_planner](http://wiki.ros.org/global_planner?distro=hydro)), there is no guarantee that a valid plan can be found for those specific co-ordinates.
What I wanted to know is whether there is any way to set a threshold for the accuracy of the planner - in a use case like this it is not actually important that the robot moves to the exact co-ordinates specified - being *roughly* there is good enough.
I was thinking that I could probably just keep trying to generate plans by randomly picking new points within a certain radius until one succeeds, but this seemed a little inelegant.
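One option I've seen (if I read the planner docs correctly) is the `default_tolerance` parameter, which lets the planner settle for the closest valid goal within a radius; a sketch with a placeholder value:

```yaml
GlobalPlanner:
  default_tolerance: 0.5   # metres; accept the nearest plannable point (placeholder)
```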
Would appreciate any ideas/pointers on how to get around this issue :)
↧
costmap_2d local into image
Hello everyone,
I was wondering if there is already a function to convert the costmap_2d data into an image (or another kind of matrix, so that it is easier to search through).
I have managed to subscribe to the local costmap_2d and now want to check particular areas myself. Unfortunately, the format in which I receive the costmap is not very intuitive (at least for me). I was wondering if there is a simple conversion. If not, where can I find the convention for how the information is stored (columns, rows, height, width)?
[here](http://answers.ros.org/question/147846/how-costmap-works/) I saw it must be possible; I just don't know how.
I am running ROS Hydro.
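For what it's worth, the `nav_msgs/OccupancyGrid` message used here stores its data as a flat row-major array, so converting it into a row/column matrix is a simple reshape. A small self-contained sketch (the function names are mine):

```python
def grid_to_matrix(data, width, height):
    """Convert the flat row-major OccupancyGrid.data array into a
    2D matrix: matrix[row][col], with row 0 at the grid origin."""
    assert len(data) == width * height
    return [data[row * width:(row + 1) * width] for row in range(height)]

def cell_at(data, width, x, y, resolution, origin_x, origin_y):
    """Look up the value of the cell containing world point (x, y)."""
    col = int((x - origin_x) / resolution)
    row = int((y - origin_y) / resolution)
    return data[row * width + col]
```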
Kind regards,
Marcus
↧
local_path not showing in rviz while robot navigates
Hi Folks,
I'm trying to show the local path when my robot is navigating, but the path only shows for about 1/4 of a second, disappears for about 10 seconds, then repeats.
I know I am subscribing to the right topic because, like I said, I can see the local path. But why is it behaving that way?
Are there any parameters I need to update (maybe through dynamic reconfigure while the robot runs), or is there a more appropriate topic I should subscribe to?
The global path shows up fine, and it stays until the robot reaches the goal point.
↧
enable carrot_planner
Hi folks,
How can I enable the carrot_planner in the navigation stack without having to write any code?
Is there a parameter to enable it (in the same way we can enable/disable dwa)?
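For reference, a sketch of what I believe the relevant setting looks like inside the move_base launch file (the carrot_planner package also has to be installed/built so that pluginlib can find the plugin):

```xml
<node pkg="move_base" type="move_base" name="move_base" output="screen">
  <!-- select the global planner plugin by name -->
  <param name="base_global_planner" value="carrot_planner/CarrotPlanner" />
</node>
```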
Thanks
edit1:
After I added that line inside the move_base launcher to include the carrot planner, I now get this error:
[FATAL] [1424988770.617038890]: Failed to create the carrot_planner/CarrotPlanner planner, are you sure it is properly registered and that the containing library is built? Exception: According to the loaded plugin descriptions the class carrot_planner/CarrotPlanner with base class type nav_core::BaseGlobalPlanner does not exist. Declared types are navfn/NavfnROS
↧
↧
Transferring State Between Navigation Processes
I'm working on a project to enable load balancing across computers running ROS applications: specifically, a way to move robot navigation processes from one desktop computer to another. If you have a couple of desktop computers, you can use them to control your robot without your computer slowing down because it's running something like mapping or navigation, since the navigation process will hop to another machine.
This requires the process to transfer its state. Does anyone know how the navigation stack might transfer its state between instances?
Thanks
↧
dwa_local_planner: in-place rotation as FIRST goal does not work
When running navigation with dwa_local_planner and giving the **first** goal as an in-place rotation, the robot won't move (except for recovery behaviors) and gives the output below. But if you give a translation goal first, it moves normally, and from then on it can rotate in place without a problem.
    [ERROR] [...]: Footprint spec is empty, maybe missing call to setFootprint?
    [ WARN] [...]: Invalid Trajectory 0.000000, 0.000000, -0.401014, cost: -9.000000
    [ WARN] [...]: Rotation cmd in collision
    [ INFO] [...]: Error when rotating.
I think the error actually comes from line 80 in [obstacle_cost_function.cpp](https://github.com/ros-planning/navigation/blob/indigo-devel/base_local_planner/src/obstacle_cost_function.cpp) in base_local_planner; something about the footprint is not properly initialized:
    double ObstacleCostFunction::scoreTrajectory(Trajectory &traj) {
      ...
      if (footprint_spec_.size() == 0) {
        // Bug, should never happen
        ROS_ERROR("Footprint spec is empty, maybe missing call to setFootprint?");
        return -9;
      }
      ...
All this is easy to reproduce by running, for example, the turtlebot simulator and the navigation amcl demo:
$ roslaunch turtlebot_gazebo turtlebot_world.launch
$ roslaunch turtlebot_gazebo amcl_demo.launch
$ roslaunch turtlebot_rviz_launchers view_navigation.launch
EDIT:
The local costmap config used in the example above is the one from turtlebot packages, which can be found here: [local_costmap_params.yaml](https://github.com/turtlebot/turtlebot_apps/blob/indigo/turtlebot_navigation/param/local_costmap_params.yaml)
EDIT2: I use ROS Indigo on Ubuntu 14.04.2, but I've also tried with Hydro on 12.04. And yes, I installed everything via apt-get.
↧
Global planner passes through the wall
When I give a navigation goal to my robot, the global path is just a simple straight line between the goal point and the robot's current location, even passing through the wall between them. The local planner works well; it is able to navigate to the goal point when a short and simple navigation goal is given. The following is my move_base launch file:
↧
Navigation Stack - Path Planning
Hi All,
I have the navigation stack running on my husky robot. One problem I'm having is that the planned path is always too close to corners. When the robot reaches a corner, it gets stuck and starts initiating the recovery behavior (rotating). Does anyone know which parameter to change so that the path does not run as close to the edges/walls?
Picture can be seen here: http://s30.postimg.org/cqjf48z6p/path.jpg
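A sketch of the inflation-layer settings that, as far as I understand, push the path further from walls (the values are placeholders to illustrate the direction of the change):

```yaml
inflation_layer:
  inflation_radius: 1.0       # larger -> cost extends further from walls (placeholder)
  cost_scaling_factor: 3.0    # smaller -> cost decays more gradually (placeholder)
```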
Thanks
↧
↧
Publishing to Initialpose Programmatically on Turtlebot Navigation
Hi,
I am trying to set the initial pose of the turtlebot programmatically, and to do so I publish to /initialpose from some code. I am following what is done here: http://answers.ros.org/question/114631/robot-estimated-pose/
I can publish to /initialpose, but the robot will not react. When I set a new nav goal, the robot continues from its previously calculated pose. Here is what `rostopic echo /initialpose` looks like:
    header:
      seq: 1
      stamp:
        secs: 0
        nsecs: 0
      frame_id: map
    pose:
      pose:
        position:
          x: 100.0
          y: 102.0
          z: 0.0
        orientation:
          x: 0.0
          y: 0.0
          z: 0.0267568523876
          w: 0.999641971333
      covariance: [0.25, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.25, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.06853891945200942]
    ---
One interesting thing is that the header here shows seq to be 1, when I never set it to 1 in my code.
My publishing code looks like this:
    import rospy
    import geometry_msgs.msg

    def talker():
        pub = rospy.Publisher('/initialpose', geometry_msgs.msg.PoseWithCovarianceStamped, queue_size=10)
        rospy.init_node('talker', anonymous=True)
        rate = rospy.Rate(1)  # 1 Hz
        pose = geometry_msgs.msg.PoseWithCovarianceStamped()
        pose.header.frame_id = "map"
        pose.pose.pose.position.x = 100
        pose.pose.pose.position.y = 102
        pose.pose.pose.position.z = 0
        pose.pose.covariance = [0.25, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.25, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.06853891945200942]
        pose.pose.pose.orientation.z = 0.0267568523876
        pose.pose.pose.orientation.w = 0.999641971333
        rospy.loginfo(pose)
        pub.publish(pose)
        rospy.spin()

    if __name__ == '__main__':
        try:
            talker()
        except rospy.ROSInterruptException:
            pass
Does anyone know what might be happening?
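One possibility (an assumption on my part, not something confirmed above): `talker()` publishes the message immediately after creating the publisher, before amcl's subscriber has connected, so the single message may be silently dropped. A generic, self-contained sketch of a wait-before-publish helper, with the rospy usage shown only as a hypothetical comment:

```python
import time

def wait_for(predicate, timeout=5.0, poll=0.1):
    """Poll `predicate` until it returns True or `timeout` elapses.

    Returns True if the predicate became true within the timeout."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        if predicate():
            return True
        time.sleep(poll)
    return False

# Hypothetical usage before publishing (not run here):
#   wait_for(lambda: pub.get_num_connections() > 0)
#   pub.publish(pose)
```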
↧
Representing multiple attributes in a map in ROS
Hi all,
I am working on a problem where a robot has to navigate around a building which has both glass walls and brick walls. Laser sensors only work on brick walls, so I am using ultrasound sensors where there are glass walls.
Currently, you can represent three cell states in a map (Free, Occupied, and Unknown), as mentioned in http://wiki.ros.org/map_server. I am wondering if it's possible to represent more characteristics in the same map. For example, let's say I want to represent a glass wall (occupied) in the map (say, with a red color) differently from a brick wall (occupied). Later, I have to use this map for navigation and use only ultrasound sensors where there is a glass wall and the laser sensor where there is a brick wall, so I want that information in the map. Is there any existing package which can read such a map and publish it as a service (something similar to map_server), or, if not, what is the best way to go about it?
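In case it helps frame the question: one possible approach (purely a sketch of an assumed encoding, not an existing package) is to let the cell value itself carry the material class and interpret it in your own node:

```python
# Assumed cell-value encoding for illustration; these constants are
# hypothetical, not part of map_server.
FREE, GLASS_WALL, BRICK_WALL, UNKNOWN = 0, 99, 100, -1

def interpret_cell(value):
    """Return (occupied, preferred_sensor) for an encoded map cell."""
    if value == FREE:
        return (False, None)
    if value == BRICK_WALL:
        return (True, 'laser')
    if value == GLASS_WALL:
        return (True, 'ultrasound')
    return (None, None)   # unknown space
```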
Please let me know if anything is not clear.
Thanks in advance.
Naman
↧
robot_pose_ekf with Lego NXT giving wrong coordinate
I'm working on Lego NXT navigation, using both odometry and IMU data as inputs to robot_pose_ekf.
- I have my own design, which has a different wheel size than the starter robot but the same wheelbase, give or take.
- In my launch file, I'm still using the robot.urdf file from the starter robot provided by the NXT stack, which obviously doesn't look anything like what I have. But I've corrected wheel_radius and wheel_basis in base_parameters.
What I get from robot_pose_ekf/odom_combined is the following:
- If the robot runs only in a straight line, either forward or backward, the coordinates are correct. (I've measured them.)
- Once the robot makes a turn, it starts giving wrong coordinates. Mostly, position X will increase continuously even when the robot is travelling parallel to the Y axis.
Sorry for the long intro. Here are my questions:
- Where should I start to fix the problem? I figure wheel_radius is right, since the X-axis positions in a straight line are correct. Unfortunately, no matter how I adjust wheel_basis, it still behaves strangely once it starts to turn, as I described earlier.
- Does the robot.urdf file have anything to do with robot_pose_ekf? I mean, does the URDF file provide the shape and design for the position calculation (e.g. the IMU position relative to the centre of gravity), or is it there just for visualization?
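To illustrate where wheel_basis enters the odometry (a generic differential-drive sketch, not the NXT stack's actual code): the heading update divides by wheel_basis, so a wrong wheel_basis only shows up when turning, which matches the symptom above.

```python
import math

def update_pose(x, y, th, d_left, d_right, wheel_basis):
    """One differential-drive odometry step (exact arc model).

    d_left/d_right: distance travelled by each wheel, with wheel_radius
    already folded in (distance = wheel_radius * wheel rotation angle).
    wheel_basis: distance between the two wheels.
    """
    d = (d_right + d_left) / 2.0            # forward distance of base centre
    dth = (d_right - d_left) / wheel_basis  # change in heading
    if abs(dth) < 1e-9:
        # Straight-line motion: heading unchanged, wheel_basis irrelevant.
        return x + d * math.cos(th), y + d * math.sin(th), th
    # Arc motion: wheel_basis errors show up only here.
    radius = d / dth
    x += radius * (math.sin(th + dth) - math.sin(th))
    y += radius * (math.cos(th) - math.cos(th + dth))
    return x, y, th + dth
```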
One last thing: my RViz doesn't work, so I don't know how it behaves in the visualization.
↧