Create Virtual Worlds
A virtual world can be created by listing the objects in the main configuration file (see the previous section).
In addition, PyBullet objects can be added programmatically (see the next section of the documentation) or even from the command line using $ rosservice call.
We give full details for the available PyBullet object types in the following sub-sections.
For every PyBullet object type, the configuration file must contain a name tag [str].
The name given to the object must be unique with respect to all other PyBullet objects (even of different types).
Furthermore, all objects are defined with respect to a global base coordinate frame called rpbi/world.
Robot
A robot can be specified using a URDF file. Currently, this is the only format accepted by the ROS-PyBullet Interface. The parameters for specifying a robot are listed as follows.
- loadURDF (required), see the PyBullet documentation. Note, the fileName can be given relative to a ROS package using curly brackets {ros_package}. In addition, if the fileName is given as robot_description then the URDF file is retrieved from the robot_description ROS parameter - of course, this must be set in the launch file or a script somewhere.
- setJointMotorControlArray (required), see the PyBullet documentation. Only the controlMode is required to be specified. Do not specify jointIndices, targetPositions, targetVelocities, forces, or the physicsClientId.
- initial_joint_positions [dict[str: float]], the initial joint positions for the robot. You can specify any joints you wish by giving the joint name and initial joint position value. Unspecified joints default to 0.0.
- initial_revolute_joint_positions_are_deg [bool], when true the initial joint positions for revolute joints are assumed to be given in degrees, otherwise radians. Default is true.
- joint_state_publisher_hz [int], the frequency at which the robot joint states are published to ROS on the topic rpbi/NAME/joint_states with type sensor_msgs/JointState, where NAME is the PyBullet object name.
- broadcast_link_states [bool], when true the links of the robot are broadcast to ROS as tf2 frames.
- broadcast_link_states_hz [int], the frequency at which the links of the robot are broadcast to ROS.
- enabled_joint_force_torque_sensors [list[str]], list of joint names that have force-torque sensors enabled. Names of joints should correspond to those defined in the URDF. When these sensors are enabled, they are published to ROS on the topic rpbi/NAME/JOINTNAME/ft_sensor with type geometry_msgs/WrenchStamped, where NAME is the PyBullet object name and JOINTNAME is the given joint name.
- is_visual_robot [bool], when true the robot is treated as a visual object, i.e. it will not react to other objects in the environment and other objects will not react to the robot. This can be useful for debugging and also for visualizing a real robot. The default value is false.
- do_log_joint_limit_violations [bool], when the robot is a visual robot and this is true, joint limit violations are reported to the terminal.
- log_joint_limit_violations_hz [int], the frequency at which joint limit violations are checked.
- start_ik_callback [bool], when true a subscriber is started for the topic rpbi/NAME/ik of message type ros_pybullet_interface/CalculateInverseKinematicsProblem, where NAME is the name of the PyBullet object. This allows you to implement a task space controller: rather than streaming target joint states to the robot, you can stream goal states (defined as a ros_pybullet_interface/CalculateInverseKinematicsProblem message). This option can only be used when the robot is in the POSITION_CONTROL or VELOCITY_CONTROL control mode.
- color_alpha [float], the alpha value for the robot in the range [0.0, 1.0]. This allows you to make the robot transparent. By default, this option is not used.
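To make the above concrete, the following is a minimal sketch of what a robot entry might look like in the YAML configuration file. The object name, ROS package, URDF path, and joint names are placeholders, and the exact layout should be checked against the examples shipped with the interface.

```yaml
# Hypothetical robot configuration (names, paths, and values are placeholders).
name: my_robot
loadURDF:
  fileName: '{my_robot_description}/urdf/my_robot.urdf'  # resolved relative to the ROS package in braces
  useFixedBase: 1
setJointMotorControlArray:
  controlMode: POSITION_CONTROL   # assumed string form of pybullet.POSITION_CONTROL
initial_joint_positions:
  joint_1: 0.0
  joint_2: 45.0   # degrees, since initial_revolute_joint_positions_are_deg defaults to true
joint_state_publisher_hz: 50
enabled_joint_force_torque_sensors:
  - joint_6       # placeholder joint name; must exist in the URDF
```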
You can move the robot in several ways.
The most common way is to stream target joint states by publishing to the topic rpbi/NAME/joint_states/target, where NAME is the name of the PyBullet object.
Note that joint state messages must include the name field, i.e. a list of joint names that specifies the order of the position/velocity/effort attributes.
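For example, assuming a robot named my_robot with placeholder joints joint_1 and joint_2, a single target could be published from the command line with rostopic pub; the YAML form of the sensor_msgs/JointState message would look roughly like this:

```yaml
# Hypothetical message body for rpbi/my_robot/joint_states/target (sensor_msgs/JointState).
# Joint names must match those defined in the URDF.
name: [joint_1, joint_2]
position: [0.3, -1.2]
velocity: []
effort: []
```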
Another way to generate motion is to stream task space targets, see the tag start_ik_callback above.
Finally, several services are provided that will move the robot to desired states - see below.
Several ROS services are started when a PyBullet robot is instantiated.
These are listed as follows.
Note, in the following NAME is the name of the PyBullet object.
- rpbi/NAME/robot_info [ros_pybullet_interface/RobotInfo], returns information about the robot (i.e. the name, link/joint names, body unique ID, number of joints, number of degrees of freedom, joint information from the pybullet.getJointInfo method (see the PyBullet documentation), enabled force-torque sensors, and the current joint state).
- rpbi/NAME/ik [ros_pybullet_interface/CalculateInverseKinematics], computes a single IK solution. The target joint state is returned.
The following ROS services are only created for robots that are not visual (i.e. the tag is_visual_robot, see above, is omitted or set to false).
- rpbi/NAME/move_to_joint_state [ros_pybullet_interface/ResetJointState], given a target joint state and a duration, the robot is moved from the current state to the goal. The duration (in seconds) is the time it will take for the robot to move from the current state to the goal state. Note, there is no collision avoidance.
- rpbi/NAME/move_to_init_joint_state [ros_pybullet_interface/ResetJointState], moves the robot to the initial joint state specified in the YAML configuration file under the tag initial_joint_positions (see above). Note, we re-use the ros_pybullet_interface/ResetJointState service type here. That means when you call the service you will need to include the duration (i.e. the time it takes for the robot to move from the current configuration to the goal) and an empty sensor_msgs/JointState message - the joint state message will be ignored. Note, there is no collision avoidance.
- rpbi/NAME/move_to_eff_state [ros_pybullet_interface/ResetEffState], given a task space target and a duration, the robot is moved from the current configuration to a goal configuration (computed using PyBullet's Inverse Kinematics feature). Note, there is no collision avoidance.
Collision Object
Other objects will react to a collision object, but the collision object itself remains unaffected. Objects such as walls, doors, ceilings, and floors should be modelled using this object type. The parameters to set up this object are listed as follows.
- createVisualShape (required), see the PyBullet documentation.
- createCollisionShape (required), see the PyBullet documentation.
- changeDynamics, see the PyBullet documentation.
- object_tf, defines the pose of the object; if unspecified the default is the identity.
  - tf_id [str], the tf2 frame ID that defines the object pose. This frame must be defined with respect to the rpbi/world frame.
  - hz [int], the frequency at which the pose is queried. Default is 30.
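As an illustration, a static box (e.g. a floor segment) might be configured along the following lines. The name, dimensions, and tf2 frame are placeholders, and the exact layout should be checked against the shipped examples.

```yaml
# Hypothetical collision object: a thin static box used as a floor.
name: floor
createVisualShape:
  shapeType: GEOM_BOX          # assumed string form of pybullet.GEOM_BOX
  halfExtents: [2.0, 2.0, 0.05]
  rgbaColor: [0.8, 0.8, 0.8, 1.0]
createCollisionShape:
  shapeType: GEOM_BOX
  halfExtents: [2.0, 2.0, 0.05]
changeDynamics:
  lateralFriction: 1.0
object_tf:
  tf_id: rpbi/floor_frame      # placeholder tf2 frame, defined w.r.t. rpbi/world
  hz: 30
```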
Dynamic Object
You can simulate virtual objects using a dynamic object. In this case, the object's motion is completely determined by PyBullet. The parameters for this object type are as follows.
- createVisualShape (required), see the PyBullet documentation.
- createCollisionShape (required), see the PyBullet documentation.
- changeDynamics (required), see the PyBullet documentation.
- baseMass [float] (required), mass of the base.
- basePosition [list[float]], base position in the rpbi/world frame.
- baseOrientation [list[float]], base orientation in the rpbi/world frame (as a quaternion).
- resetBaseVelocity, see the PyBullet documentation. Note, the bodyUniqueId does not need to be passed. This specifies the initial velocity of the object.
- broadcast_hz [int], the frequency at which the object pose is broadcast to tf2. Default is 0 (i.e. the pose is not broadcast). The frame is always published with respect to the rpbi/world frame and given the name rpbi/NAME, where NAME is the name of the PyBullet object.
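For instance, a ball dropped from one metre above the ground might be configured roughly as follows (all names and values are placeholders):

```yaml
# Hypothetical dynamic object: a ball that falls under gravity.
name: ball
createVisualShape:
  shapeType: GEOM_SPHERE       # assumed string form of pybullet.GEOM_SPHERE
  radius: 0.1
  rgbaColor: [1.0, 0.0, 0.0, 1.0]
createCollisionShape:
  shapeType: GEOM_SPHERE
  radius: 0.1
changeDynamics:
  restitution: 0.7
baseMass: 0.5
basePosition: [0.0, 0.0, 1.0]
baseOrientation: [0.0, 0.0, 0.0, 1.0]
resetBaseVelocity:
  linearVelocity: [0.0, 0.0, 0.0]
broadcast_hz: 50
```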
Visual Object
A visual object is used primarily for visualizing real-world objects or for debugging. It is purely visual: other objects will not react to it, and it will not react to other objects. To specify this object type the following parameters can be used.
- createVisualShape (required), see the PyBullet documentation. Note, the fileName can be given relative to a ROS package using curly brackets {ros_package}. Also, the shapeType parameter can be passed as a string.
- object_tf, defines the pose of the object; if unspecified the default is the identity.
  - tf_id [str], the tf2 frame ID that defines the object pose. This frame must be defined with respect to the rpbi/world frame.
  - hz [int], the frequency at which the pose is queried. Default is 30.
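A sketch of a visual object that tracks a tf2 frame (e.g. a mesh representing a real, motion-captured object) might look like the following; the object name, package, mesh path, and frame name are placeholders.

```yaml
# Hypothetical visual object: a mesh that follows an externally published frame.
name: tracked_box
createVisualShape:
  shapeType: GEOM_MESH         # assumed string form of pybullet.GEOM_MESH
  fileName: '{my_objects}/meshes/box.obj'   # resolved relative to the ROS package in braces
  meshScale: [1.0, 1.0, 1.0]
  rgbaColor: [0.0, 0.5, 1.0, 0.5]
object_tf:
  tf_id: rpbi/mocap/box        # placeholder tf2 frame, defined w.r.t. rpbi/world
  hz: 60
```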
Soft bodies
PyBullet also implements deformable object and cloth simulation.
Soft bodies can be set up using the pybullet.loadSoftBody method, or from a URDF file.
For the URDF approach, see the next sub-section.
When using the loadSoftBody approach, you can specify the following tags.
- loadSoftBody (required), see the PyBullet documentation.
- createSoftBodyAnchor [list[list[float/int]]], pins vertices of a deformable object to the world. Note, the PyBullet documentation for createSoftBodyAnchor is limited and the exact interface is not well documented. The soft body unique ID is passed automatically, but any other parameters must be supplied.
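As a rough illustration, a cloth loaded from a mesh file might be configured along these lines. The file path and parameter values are placeholders, and the anchor entries assume that each inner list supplies the remaining positional arguments of pybullet.createSoftBodyAnchor (node index, body unique ID, link index); given the unclear interface noted above, treat this as a sketch only.

```yaml
# Hypothetical soft body: a cloth pinned to the world at two vertices.
name: cloth
loadSoftBody:
  fileName: '{my_objects}/meshes/cloth.obj'   # placeholder mesh path
  basePosition: [0.0, 0.0, 1.0]
  scale: 0.5
  mass: 0.1
  useBendingSprings: 1
  useMassSpring: 1
  springElasticStiffness: 40
# Assumption: each inner list is [nodeIndex, bodyUniqueId, linkIndex];
# -1 for bodyUniqueId/linkIndex anchors the vertex to the world.
createSoftBodyAnchor:
  - [0, -1, -1]
  - [3, -1, -1]
```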
Loading from URDF
This interface allows you to load objects directly from a URDF. The only required tag is as follows.
- loadURDF (required), see the PyBullet documentation.
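A minimal sketch (with a placeholder name, package, and path) might be:

```yaml
# Hypothetical object loaded directly from a URDF.
name: table
loadURDF:
  fileName: '{my_objects}/urdf/table.urdf'   # resolved relative to the ROS package in braces
  basePosition: [0.5, 0.0, 0.0]
  useFixedBase: 1
```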
Note, for this object type there is no ROS communication available. Future work will include an updated feature set for this object type.