
Asynchronous multi-body framework

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by Elisa191996 (talk | contribs) at 13:27, 15 June 2021 (ambf iid). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.




AMBF (Asynchronous Multi-Body Framework) is an open-source, versatile 3D simulator for robots, first released in April 2019 by Adnan Munawar. The framework provides real-time dynamic simulation of multi-bodies (robots, free bodies and multi-link puzzles) coupled with real-time haptic interaction via several haptic devices supported through CHAI-3D[1]. It allows real surgeon master consoles, with or without haptics, to be integrated to control simulated robots in real time, which makes the simulator well suited to real-time training applications for surgical and non-surgical tasks. It also supports soft bodies, enabling more realistic simulation of surgical tasks in which tissues undergo deformation, and it provides a Python client for training neural networks on real-time data with in-loop simulation.

Each simulated object is represented as an afObject, and the simulation world as an afWorld. Both expose two communication interfaces: State and Command. Through the State interface an object publishes its data outside the simulation environment, while the Command interface allows commands to be applied to the underlying afObject.
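The State/Command split described above can be sketched as follows. This is an illustrative stand-in, not the real AMBF API: class and attribute names here are hypothetical, chosen only to show data flowing out through a read-only State and in through a writable Command.

```python
# Illustrative sketch (NOT the real AMBF API): an object exposing a
# State interface (data out of the simulation) and a Command interface
# (data into the simulation), mirroring the afObject design.

class State:
    """Read-only view of an object's simulated state."""
    def __init__(self):
        self.position = (0.0, 0.0, 0.0)

class Command:
    """Commands applied to the underlying object at each step."""
    def __init__(self):
        self.force = (0.0, 0.0, 0.0)

class AfObjectSketch:
    def __init__(self, name):
        self.name = name
        self.state = State()      # flows out of the simulation
        self.command = Command()  # flows into the simulation

    def step(self, dt=0.001):
        # Toy integration: the commanded force nudges the position.
        fx, fy, fz = self.command.force
        x, y, z = self.state.position
        self.state.position = (x + fx * dt, y + fy * dt, z + fz * dt)

obj = AfObjectSketch("Cube")
obj.command.force = (1.0, 0.0, 0.0)
obj.step()
print(obj.state.position)  # the commanded force moved the object along x
```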

It supports multiple physics engines, including Bullet Physics.

It is compatible with Ubuntu 16.04 and Ubuntu 18.04, and has also been tested on macOS Mavericks and macOS Mojave.

AMBF Format - ADF
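AMBF scenes and robots are described in ADF files written in YAML. A minimal description might look like the sketch below; this is a hedged illustration whose field names follow the general pattern of AMBF's YAML files, and the exact schema may differ from the current documentation.

```yaml
# Hedged sketch of an ADF-style description (field names illustrative)
bodies: [BODY Base, BODY Link1]
joints: [JOINT Base-Link1]
high resolution path: ./meshes/high_res/
low resolution path: ./meshes/low_res/

BODY Base:
  name: Base
  mesh: base.STL
  mass: 0.5
  location:
    position: {x: 0.0, y: 0.0, z: 0.0}
    orientation: {r: 0.0, p: 0.0, y: 0.0}

JOINT Base-Link1:
  name: Base-Link1
  parent: BODY Base
  child: BODY Link1
  type: revolute
```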

IIDs

The framework allows real master consoles to be integrated to manipulate simulated bodies in real time. These consoles are referred to as Input Interface Devices (IIDs) and may or may not provide haptic feedback. Each IID is paired with a simulated dynamic end-effector (SDE) that can optionally be bound to any simulated body. Several IIDs are already included in the simulator, such as the Geomagic Phantom, the Novint Falcon, the Razer Hydra and the dVRK MTM. Others can be easily included by defining them in the input_device.yaml file[2].

This file is structured as follows:

Sigma7:
  hardware name: sigma.7
  haptic gain: {linear: 0.05, angular: 0.00}
  workspace scaling: 2
  root link: "kuka_tip"
  enable joint control: true
  location: {
    position: {x: 0.2839, y: 0.3299, z: 1.4126},
    orientation: {r: 0, p: 0, y: 0}}
  orientation offset: {r: 0.0, p: -1.57079, y: 1.57079}
  controller gain: {
    linear: {P: 20, I: 0.0, D: 0.0},
    angular: {P: 30, I: 0.0, D: 0.0}}
  pair cameras: [cameraEndoscope]

The SDE is controlled using a dynamic control law based on the motion of the IID.

The root link is the base of the SDE, to which the IID is connected. The state of the IID is usually expressed in the reference frame of the device itself, while the SDE state is expressed with respect to the world frame; a transform mapping is therefore needed to convert both states to a common frame.

The workspace scaling scales the motion of the IID in simulation.

The simulated multibody field specifies the multi-body that emulates the external device within the simulated AMBF scene. Different description files, such as grippers, can be chosen for the simulation.

The haptic gain is a set of gains controlling the force feedback applied to the IID.

The controller gain is used to scale the wrench applied to the SDE.

The pair cameras field is optional and sets one or more cameras to be paired with the IID-SDE pair.
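The interplay of workspace scaling and the frame transform can be sketched as below. This is an illustrative computation, not AMBF source code: the function name is hypothetical, the base position mirrors the "location" field of the configuration above, and rotation offsets are omitted for brevity.

```python
# Illustrative sketch (not AMBF source code): mapping an IID position,
# expressed in the device's own frame, into the world frame of its SDE.

WORKSPACE_SCALING = 2.0                   # "workspace scaling" field
BASE_POSITION = (0.2839, 0.3299, 1.4126)  # SDE root link in world frame

def iid_to_world(p_device):
    """Scale the device-frame position and offset it by the SDE base,
    yielding a world-frame target for the SDE controller.
    (Orientation offsets are omitted for brevity.)"""
    return tuple(b + WORKSPACE_SCALING * p
                 for b, p in zip(BASE_POSITION, p_device))

# A 1 cm device motion along x becomes a 2 cm motion in the world frame.
print(iid_to_world((0.01, 0.0, -0.02)))
```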

Python Client

The Python Client makes it possible to control different afObjects while maintaining a high communication speed. It manages the ROS communication, making the process of controlling simulated bodies much easier than doing so directly. Communication is bidirectional, so commands can be sent to bodies while their states are read back at the same time, using a library of Python functions. These functions can, for example, set or get the position and orientation of bodies, command the wrench acting on a body, or query the number of joints connected to it.

In order to use it, one needs to create an instance of the client and connect it to the simulation. This creates callable objects from ROS topics and initiates a shared pool of threads for bidirectional communication[3].
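The connect/handle/command workflow can be illustrated with the self-contained stand-in below. All names here are hypothetical: the real client lives in the WPI-AIM/ambf repository and builds its handles from ROS topics, whereas this sketch builds them from an in-memory table purely to show the pattern.

```python
# Schematic stand-in (hypothetical names, NOT the real ambf_client):
# connect, obtain an object handle, command it, read its state back.

class FakeSimClient:
    """Toy client: handles are created from a name->state table,
    standing in for handles created from discovered ROS topics."""
    def __init__(self):
        self._objects = {}
        self.connected = False

    def connect(self):
        # The real client would discover ROS topics and spawn
        # communication threads here.
        self.connected = True
        self._objects = {"Torus": {"pos": [0.0, 0.0, 0.0]}}

    def get_obj_handle(self, name):
        return ObjHandle(self._objects[name])

class ObjHandle:
    def __init__(self, record):
        self._record = record

    def set_pos(self, x, y, z):   # Command direction
        self._record["pos"] = [x, y, z]

    def get_pos(self):            # State direction
        return tuple(self._record["pos"])

client = FakeSimClient()
client.connect()
torus = client.get_obj_handle("Torus")
torus.set_pos(0.5, 0.0, 0.2)
print(torus.get_pos())  # → (0.5, 0.0, 0.2)
```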

Soft Bodies

Blender Add-on

ROS Communication


  1. ^ "WPI-AIM/ambf".
  2. ^ "Input Devices".
  3. ^ "Python Client".