The objective in this section is to control not only one robot, but multiple robots at the same time. Although this technique is computationally expensive when many DoFs are involved, it allows us to easily account for interactions between the robots and to adapt to changes in the environment, all without having to perform explicit inverse kinematics.
To do this, our QP controller will minimize under constraints an objective function that takes into account the whole system of robots.
In our example we will focus on manipulating a simple object with two degrees of freedom: a door. Our goal is to move the robot's hand to the door handle, turn the handle and open the door.
Note: this article assumes you are able to run and visualize the controller using ROS.
We will build a simple controller that will load JVRC1 and a door. This door is part of the mc_rtc_data package that you should already have.
Now, let's build a controller that takes multiple robots as input:
To do so, we will modify the constructor:
#include <mc_rbdyn/RobotLoader.h>
MyFirstController::MyFirstController(mc_rbdyn::RobotModulePtr rm, double dt, const mc_rtc::Configuration & config)
: mc_control::MCController({rm,
mc_rbdyn::RobotLoader::get_robot_module("env", std::string(mc_rtc::MC_ENV_DESCRIPTION_PATH) + "/../mc_int_obj_description", std::string("door")),
mc_rbdyn::RobotLoader::get_robot_module("env", std::string(mc_rtc::MC_ENV_DESCRIPTION_PATH), std::string("ground"))}, dt)
@staticmethod
def create(robot, dt):
door = mc_rbdyn.get_robot_module("env", mc_rtc.MC_ENV_DESCRIPTION_PATH + "/../mc_int_obj_description", "door")
ground = mc_rbdyn.get_robot_module("env", mc_rtc.MC_ENV_DESCRIPTION_PATH, "ground")
return MyFirstController([robot, door, ground], dt)
In this new example, we have loaded an extra robot: the door. We have done so by providing a list of robot modules to load. The loaded robots are indexed according to the order in which the modules were provided: our main robot still has index 0, while the door is now at index 1 and the ground at index 2. Consequently, the contacts must be updated as follows:
addContact({robot().name(), "ground", "LeftFoot", "AllGround"});
addContact({robot().name(), "ground", "RightFoot", "AllGround"});
self.addContact(self.robot().name(), "ground", "LeftFoot", "AllGround")
self.addContact(self.robot().name(), "ground", "RightFoot", "AllGround")
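If you want to double-check this indexing, a minimal sketch is to print the loaded robots' names in the reset function (the actual names depend on the modules you loaded):
// In the header
#include <mc_rtc/logging.h>
// In the reset function (sketch): check which robot sits at which index
mc_rtc::log::info("robot 0: {}", robots().robot(0).name()); // main robot
mc_rtc::log::info("robot 1: {}", robots().robot(1).name()); // door
mc_rtc::log::info("robot 2: {}", robots().robot(2).name()); // ground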
If you start the controller now, you will notice that the door and the robot are in the same location. For the purpose of this tutorial, we will manually reposition the door:
// In the reset function
robots().robot(1).posW(sva::PTransformd(sva::RotZ(M_PI), Eigen::Vector3d(0.7, 0.5, 0)));
# In the reset callback
self.robots().robot(1).posW(
sva.PTransformd(sva.RotZ(math.pi), eigen.Vector3d(0.7, 0.5, 0))
)
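If your robot does not start at the world origin, you may prefer to express the door's position relative to the robot's floating base; a possible sketch using the same offset:
// In the reset function (sketch): place the door relative to the robot's
// current floating-base pose instead of the world origin
const auto & X_0_robot = robot().posW();
robots().robot(1).posW(sva::PTransformd(sva::RotZ(M_PI), Eigen::Vector3d(0.7, 0.5, 0)) * X_0_robot);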
Since we have added a new articulated robot, we should add some constraints on it. We will only add a kinematics constraint and a posture task:
// In the header
std::shared_ptr<mc_solver::KinematicsConstraint> doorKinematics;
std::shared_ptr<mc_tasks::PostureTask> doorPosture;
// In the reset function
doorKinematics = std::make_shared<mc_solver::KinematicsConstraint>(robots(), 1, solver().dt());
solver().addConstraintSet(*doorKinematics);
doorPosture = std::make_shared<mc_tasks::PostureTask>(solver(), 1, 5.0, 1000.0);
solver().addTask(doorPosture);
# In the reset callback
self.doorKinematics = mc_solver.KinematicsConstraint(
self.robots(), 1, self.qpsolver.timeStep
)
self.qpsolver.addConstraintSet(self.doorKinematics)
self.doorPosture = mc_tasks.PostureTask(self.qpsolver, 1, 5.0, 1000.0)
self.qpsolver.addTask(self.doorPosture)
Note: we don't add contacts for the door yet since it has a fixed base. However, note that the contact constraint is directly available for every robot in the controller.
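If you are unsure which joints the door model provides, a quick sketch is to list them in the reset function; this is one way to find the "handle" and "door" joint names used later in this tutorial:
// In the reset function (sketch): list the door's actuated joints
for(const auto & j : robots().robot(1).mb().joints())
{
  if(j.dof() > 0)
  {
    mc_rtc::log::info("door joint: {}", j.name());
  }
}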
In this part of the tutorial we will set up the logic of our controller. We want to reach the door's handle, turn the handle, and finally open the door. Our code will reflect those three phases:
// In the header
enum DoorPhase
{
APPROACH = 0,
HANDLE,
OPEN
};
// A private property of our controller
DoorPhase phase = APPROACH;
// A new method for our controller
void switch_phase()
{
if(phase == APPROACH && 0 /** we write this condition later */)
{
/** Setup the HANDLE phase */
phase = HANDLE;
}
else if(phase == HANDLE && 0 /** we write this condition later */)
{
/** Setup the OPEN phase */
phase = OPEN;
}
}
// Call this in the run function
bool MyFirstController::run()
{
switch_phase();
return mc_control::MCController::run();
}
# Declare constants
APPROACH = 0
HANDLE = 1
OPEN = 2
# In constructor
self.phase = APPROACH
# New method for our controller
def switch_phase(self):
if self.phase == APPROACH and False: # We write this condition later
# Setup the HANDLE phase
self.phase = HANDLE
elif self.phase == HANDLE and False: # We write this condition later
# Setup the OPEN phase
self.phase = OPEN
# Call this in the run callback
def run_callback(self):
self.switch_phase()
return True
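To follow the transitions while the controller is running, one option is to expose the current phase in the GUI; a minimal C++ sketch (the category and label names are arbitrary):
// In the header
#include <mc_rtc/gui/Label.h>
// In the constructor (sketch): show the current phase in the GUI
gui()->addElement({"Door"},
                  mc_rtc::gui::Label("Phase", [this]() {
                    return std::string{phase == APPROACH ? "APPROACH" : phase == HANDLE ? "HANDLE" : "OPEN"};
                  }));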
For this phase, we will introduce the SurfaceTransformTask. It is very similar to the EndEffectorTask we used, except that the task controls a surface of the robot instead of an end-effector. This usually makes it easier to express the objective. Furthermore, we will define the target using the position of the door's handle so that we don't have to adapt our code if we decide to move the door or change the hand.
// In the header
#include <mc_tasks/SurfaceTransformTask.h>
// In the private members
std::shared_ptr<mc_tasks::SurfaceTransformTask> handTask;
// In the reset function
// Create the task and add it to the solver
handTask = std::make_shared<mc_tasks::SurfaceTransformTask>("RightGripper", robots(), 0, 5.0, 1000.0);
solver().addTask(handTask);
// Set a target relative to the handle position
handTask->target(sva::PTransformd(Eigen::Vector3d(0, 0, -0.025)) * robots().robot(1).surfacePose("Handle"));
# In the reset callback
# Create the task and add it to the solver
self.handTask = mc_tasks.SurfaceTransformTask(
"RightGripper", self.robots(), 0, 5.0, 1000.0
)
self.qpsolver.addTask(self.handTask)
# Set a target relative to the handle position
self.handTask.target(
sva.PTransformd(eigen.Vector3d(0, 0, -0.025))
* self.robots().robot(1).surfacePose("Handle")
)
We need to settle two things for phase 2: when to trigger the transition and what to do when it happens. For the first point, we will monitor the execution of the SurfaceTransformTask. For the second point, we will add a contact between the door and the robot's gripper, remove the task on the robot's gripper (it is now handled by the contact) and change the target for the handle position.
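To pick sensible thresholds for the transition, it can help to log the task's error and speed; a minimal sketch using the built-in logger (the entry names are arbitrary):
// In the reset function (sketch): log the hand task error and speed
logger().addLogEntry("handTask_eval", [this]() { return handTask->eval().norm(); });
logger().addLogEntry("handTask_speed", [this]() { return handTask->speed().norm(); });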
// Modify MyFirstController::switch_phase()
if(phase == APPROACH && handTask->eval().norm() < 0.05 && handTask->speed().norm() < 1e-4)
{
// Add a new contact
addContact({robot().name(), "door", "RightGripper", "Handle"});
// Remove the surface transform task
solver().removeTask(handTask);
// Keep the robot in its current posture
postureTask->reset();
comTask->reset();
// Target new handle position
doorPosture->target({{"handle", {-1.0}}});
// Switch phase
phase = HANDLE;
}
# Modify switch_phase(self)
if (
self.phase == APPROACH
and self.handTask.eval().norm() < 0.05
and self.handTask.speed().norm() < 1e-4
):
# Add a new contact
self.addContact(self.robot().name(), "door", "RightGripper", "Handle")
# Remove the surface transform task
self.qpsolver.removeTask(self.handTask)
# Keep the robot in its current posture
self.postureTask.reset()
self.comTask.reset()
# Target new handle position
self.doorPosture.target({"handle": {-1.0}})
# Switch phase
self.phase = HANDLE
This phase is very similar to the previous one. We will check the handle position that has been reached to trigger the transition and then set an objective for the door opening.
else if(phase == HANDLE && doorPosture->eval().norm() < 0.01)
{
// Update door opening target
doorPosture->target({{"door", {0.5}}});
// Switch phase
phase = OPEN;
}
elif self.phase == HANDLE and self.doorPosture.eval().norm() < 0.01:
# Update door opening target
self.doorPosture.target({"door": [0.5]})
# Switch phase
self.phase = OPEN
At this point you can play around with some of the parameters and see the benefits of the multi-robot approach, as you can change the following without touching anything else: the initial position of the door, the target angle for the handle and the target opening angle for the door.
Note: of course, you may end up with an unfeasible setup. For example, if you put the door 10 meters away from the robot it won't be able to reach the handle.
This tutorial concludes the introduction of mc_rtc controllers. The next tutorials focus on the advanced tools available in the framework to help you program and debug complex controllers. However, you might have noticed we have used a very crude approach to program the logic of our controller. The framework provides a much more powerful way to deal with such scenarios, as introduced in the tutorial introducing the FSM facilities and the FSM in practice tutorial. The latter re-programs this tutorial using the FSM facilities.
The full sources for this controller are available here.