We present an interactive design system that allows casual users to quickly create 3D-printable robotic creatures. Our approach automates the tedious parts of the design process while providing ample room for customization of morphology, proportions, gait and motion style. The technical core of our framework is an efficient optimization-based solution that generates stable motions for legged robots of arbitrary designs. An intuitive set of editing tools allows the user to interactively explore the space of feasible designs and to study the relationship between morphological features and the resulting motions. Fabrication blueprints are generated automatically such that the robot designs can be manufactured using 3D printing and off-the-shelf servo motors. We demonstrate the effectiveness of our solution by designing six robotic creatures with a variety of morphological features: two, four or five legs, point or area feet, actuated spines and different proportions. We validate the feasibility of the designs generated with our system through physics simulations and physically fabricated prototypes.
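The stability requirement at the heart of such a motion generator can be illustrated with a toy static-stability test: the ground projection of the center of mass must lie inside the support polygon spanned by the feet in contact. This is only a minimal sketch assuming flat ground and static (not dynamic) stability; the paper's actual optimization is far richer, and all function names here are illustrative.

```python
# Toy static-stability check: is the COM projection inside the
# support polygon of the stance feet? (Illustrative only.)

def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices in CCW order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def com_is_statically_stable(com_xy, stance_feet_xy):
    """True if the COM ground projection lies inside the support polygon."""
    hull = convex_hull(stance_feet_xy)
    n = len(hull)
    for i in range(n):
        a, b = hull[i], hull[(i + 1) % n]
        # COM must lie on the left of (or on) every CCW hull edge.
        if (b[0]-a[0])*(com_xy[1]-a[1]) - (b[1]-a[1])*(com_xy[0]-a[0]) < 0:
            return False
    return True
```

A motion generator would enforce such a condition (or a dynamic analogue of it) at every time step of the planned gait.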
We present a computation-driven approach to design optimization and motion synthesis for robotic creatures that locomote using arbitrary arrangements of legs and wheels. Through an intuitive interface, designers first create unique robots by combining different types of servomotors, 3D-printable connectors, wheels and feet in a mix-and-match manner. With the resulting robot as input, a novel trajectory optimization formulation generates walking, rolling, gliding and skating motions. These motions emerge naturally based on the components used to design each individual robot. We exploit the particular structure of our formulation and make targeted simplifications to significantly accelerate the underlying numerical solver without compromising quality. This allows designers to interactively choreograph stable, physically valid motions that are agile and compelling. We furthermore develop a suite of user-guided, semi-automatic, and fully-automatic optimization tools that enable motion-aware edits of the robot’s physical structure. We demonstrate the efficacy of our design methodology by creating a diverse array of hybrid legged/wheeled mobile robots which we validate using physics simulation and through fabricated prototypes.
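The basic mechanics of a trajectory optimization formulation can be sketched in miniature: discretize one coordinate of the motion into samples, define a cost, pin boundary values as constraints, and run a descent method. The paper's formulation handles full-body dynamics, contact and component-dependent motion modes; everything below (the smoothness cost, the solver, the names) is a simplified stand-in.

```python
# Toy direct transcription: minimize the sum of squared velocities of a
# discretized 1-D trajectory with fixed endpoints, via gradient descent.
# (A crude stand-in for a full trajectory optimization formulation.)

def optimize_trajectory(start, goal, n=9, iters=4000, lr=0.2):
    x = [start] * n
    x[-1] = goal
    for _ in range(iters):
        # Gradient of sum_k (x[k+1] - x[k])^2 w.r.t. interior samples.
        g = [0.0] * n
        for k in range(1, n - 1):
            g[k] = 2.0 * (2.0 * x[k] - x[k - 1] - x[k + 1])
        for k in range(1, n - 1):  # endpoints stay pinned
            x[k] -= lr * g[k]
    return x
```

With this cost and fixed endpoints, the optimum is simply linear interpolation; a real formulation replaces the cost with physical terms (dynamics residuals, contact constraints, actuator limits) over many degrees of freedom.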
In this paper we present an optimization-based approach for the design of cable-driven kinematic chains and trees. Our system takes as input a hierarchical assembly consisting of rigid links connected by hinge joints. The user also specifies a set of target poses or keyframes using inverse kinematics. Our approach places torsional springs at the joints and computes a cable network that allows us to reproduce the specified target poses. We start with a large set of cables that have randomly chosen routing points and gradually remove redundant cables. We then refine the routing points, taking into account the paths between poses or keyframes, in order to further reduce the number of cables and minimize the required control forces. We propose a reduced coordinate formulation that links control forces to joint angles and routing points, enabling the co-optimization of a cable network together with the required actuation forces. We demonstrate the efficacy of our technique by designing and fabricating a cable-driven, animated character, an animatronic hand, and a specialized gripper.
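The core of the reduced-coordinate idea can be illustrated with a toy: each cable contributes joint torques equal to its (nonnegative) tension times a moment-arm vector, so given target torques one can solve a nonnegative least-squares problem for the tensions and prune cables whose optimal tension is near zero. The matrix, targets and threshold below are made up for illustration; the paper additionally co-optimizes the routing points themselves.

```python
# Toy cable-tension fit: projected gradient descent on ||A t - tau||^2
# with t >= 0 (cables can only pull), followed by redundancy pruning.
# (Illustrative stand-in for the paper's co-optimization.)

def solve_tensions(moment_arms, target_torques, iters=3000, lr=0.1):
    """moment_arms[j][c]: torque at joint j per unit tension in cable c."""
    n_joints, n_cables = len(moment_arms), len(moment_arms[0])
    t = [0.0] * n_cables
    for _ in range(iters):
        # Residual r = A t - tau, using the tensions from this iteration.
        r = [sum(moment_arms[j][c] * t[c] for c in range(n_cables))
             - target_torques[j] for j in range(n_joints)]
        for c in range(n_cables):
            g = 2.0 * sum(moment_arms[j][c] * r[j] for j in range(n_joints))
            t[c] = max(0.0, t[c] - lr * g)  # project: tensions cannot push
    return t

def prune_cables(tensions, threshold=1e-4):
    """Indices of cables that actually carry load."""
    return [c for c, tc in enumerate(tensions) if tc > threshold]
```

In a small example with one antagonistic cable whose moment arms oppose the target torques, the projected solver drives that cable's tension to zero and the pruning step removes it.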
We propose a complete process for designing, simulating, and fabricating synthetic skin for an animatronics character that mimics the face of a given subject and its expressions. The process starts with measuring the elastic properties of a material used to manufacture synthetic soft tissue. Given these measurements we use physics-based simulation to predict the behavior of a face when it is driven by the underlying robotic actuation. Next, we capture 3D facial expressions for a given target subject. As the key component of our process, we present a novel optimization scheme that determines the shape of the synthetic skin as well as the actuation parameters that provide the best match to the target expressions. We demonstrate this computational skin design by physically cloning a real human face onto an animatronics figure.
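A stripped-down version of the match-to-target step: model the skin displacement as a linear combination of actuation modes (a crude stand-in for the physics-based simulation) and recover the actuation parameters that best reproduce a captured target expression by solving the normal equations of a least-squares fit. The mode shapes and targets below are invented for illustration; the actual optimization also determines the skin's shape and works with a nonlinear elastic model.

```python
# Toy actuation fit: target ~= rest + p1*modes[0] + p2*modes[1],
# solved in closed form via the 2x2 normal equations. (Illustrative.)

def fit_actuation(rest, modes, target):
    """Least-squares actuation parameters (p1, p2) for two modes."""
    d = [t - r for t, r in zip(target, rest)]  # desired displacement
    g11 = sum(a * a for a in modes[0])
    g22 = sum(b * b for b in modes[1])
    g12 = sum(a * b for a, b in zip(modes[0], modes[1]))
    b1 = sum(a * di for a, di in zip(modes[0], d))
    b2 = sum(b * di for b, di in zip(modes[1], d))
    det = g11 * g22 - g12 * g12  # Gram determinant (assumed nonzero)
    return ((g22 * b1 - g12 * b2) / det,
            (g11 * b2 - g12 * b1) / det)
```

When the target is exactly reachable by the modes, the fit recovers the generating parameters; for real captured expressions the same machinery returns the best achievable approximation.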