Simulation and Machine Learning

From the physics of soft tissues to the mechanics of complex materials, accurate simulation is key to understanding and predicting real-world behavior. However, traditional computational models are often costly, requiring trade-offs between accuracy and efficiency. Our research bridges simulation and machine learning to develop data-driven models that accelerate computations while preserving physical fidelity. By combining physics-based reasoning with modern AI techniques, we aim to push the boundaries of predictive modeling—enabling faster, more scalable, and more intelligent simulations for applications ranging from digital humans to advanced materials.

Neural Metamaterial Families

Neural Metamaterial Networks
Y. Li, S. Coros, B. Thomaszewski
ACM Transactions on Graphics (Proc. ACM SIGGRAPH Asia 2023)
PDF  Video

Nonlinear metamaterials with tailored mechanical properties have applications in engineering, medicine, robotics, and beyond. While modeling their macromechanical behavior is challenging in itself, finding structure parameters that best approximate high-level performance goals is an even harder inverse problem. In this work, we propose Neural Metamaterial Networks (NMN), smooth neural representations that encode the nonlinear mechanics of entire metamaterial families. Given structure parameters as input, NMN return continuously differentiable strain energy density functions, thus guaranteeing conservative forces by construction. Although trained on simulation data, NMN do not inherit the discontinuities caused by topological changes in finite element meshes. Instead, they provide a smooth, fully differentiable map from parameter to performance space that is well suited for gradient-based optimization. On this basis, we formulate inverse material design as a nonlinear programming problem that leverages neural networks for both objective functions and constraints. We use this approach to automatically design materials with desired stress-strain curves as well as prescribed directional stiffness and Poisson ratio profiles. We furthermore conduct ablation studies on network nonlinearities and show the advantages of our approach compared to native-scale optimization.
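To illustrate the core idea of an energy-based neural material model, here is a minimal PyTorch sketch: a small MLP maps structure parameters and a strain measure to a scalar energy density, and stresses follow from automatic differentiation, so the resulting forces are conservative by construction. Network sizes, inputs, and names are illustrative assumptions, not the architecture from the paper.

```python
import torch
import torch.nn as nn

class EnergyDensityNet(nn.Module):
    """Illustrative network: maps (structure parameters, strain measure) to a
    scalar strain energy density. Smooth activations keep the output
    continuously differentiable with respect to its inputs."""
    def __init__(self, n_params=4, n_strain=3, width=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_params + n_strain, width), nn.Softplus(),
            nn.Linear(width, width), nn.Softplus(),
            nn.Linear(width, 1),
        )

    def forward(self, params, strain):
        return self.net(torch.cat([params, strain], dim=-1))

# Stress obtained as the gradient of the learned energy density w.r.t. strain.
model = EnergyDensityNet()
params = torch.rand(1, 4)                      # hypothetical structure parameters
strain = torch.rand(1, 3, requires_grad=True)  # e.g. a strain measure in Voigt form
energy = model(params, strain).sum()
stress = torch.autograd.grad(energy, strain)[0]
```

Because forces come from differentiating a single learned scalar, the same representation can be plugged into gradient-based design loops, which is what makes the inverse-design formulation above tractable.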

Neural Subspaces

Neural Modes: Self-supervised Learning of Nonlinear Modal Subspaces
J. Wang, Y. Du, S. Coros, B. Thomaszewski
IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2024
PDF  Video

We propose a self-supervised approach for learning physics-based subspaces for real-time simulation. Existing learning-based methods construct subspaces by approximating pre-defined simulation data in a purely geometric way. However, this approach tends to produce high-energy configurations, leads to entangled latent space dimensions, and generalizes poorly beyond the training set. To overcome these limitations, our method directly minimizes the system's mechanical energy during training. We show that this leads to learned subspaces that reflect physical equilibrium constraints, resolve the overfitting issues of previous methods, and offer interpretable latent space parameters.
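The self-supervised training signal can be sketched as follows: instead of regressing precomputed simulation snapshots, a decoder from latent codes to displacements is trained by minimizing a mechanical energy evaluated on its own outputs. The energy below is a toy placeholder, and the additional constraints the paper uses to avoid trivial solutions are omitted; all dimensions and names are assumptions for illustration.

```python
import torch
import torch.nn as nn

# Illustrative decoder from a low-dimensional latent code to full-space displacements.
decoder = nn.Sequential(
    nn.Linear(3, 128), nn.ELU(),
    nn.Linear(128, 128), nn.ELU(),
    nn.Linear(128, 3000),          # e.g. 1000 vertices x 3 DOFs (hypothetical)
)

def mechanical_energy(u):
    # Placeholder for an elastic energy E(u); a real setup assembles this from the
    # discretized material model and adds constraints to rule out trivial minima.
    return (u ** 2).sum(dim=-1).mean()

opt = torch.optim.Adam(decoder.parameters(), lr=1e-3)
for step in range(1000):
    z = torch.randn(64, 3)          # sample latent codes; no precomputed dataset needed
    u = decoder(z)
    loss = mechanical_energy(u)     # self-supervised: minimize the system's energy directly
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The key contrast with purely geometric subspace fitting is that the loss is a physical quantity of the decoded configuration itself, which is why the learned subspace stays close to equilibrium states.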

Neural Cloth Simulation

HOOD: Hierarchical Graphs for Generalized Modelling of Clothing Dynamics
A. Grigorev, B. Thomaszewski, M. Black, O. Hilliges
IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2023
PDF Video

We propose a method that leverages graph neural networks, multi-level message passing, and unsupervised training to enable real-time prediction of realistic clothing dynamics. Whereas existing methods based on linear blend skinning must be trained for specific garments, our method is agnostic to body shape and applies to tight-fitting garments as well as loose, free-flowing clothing. Our method furthermore handles changes in topology (e.g., garments with buttons or zippers) and material properties at inference time. As one key contribution, we propose a hierarchical message-passing scheme that efficiently propagates stiff stretching modes while preserving local detail. We empirically show that our method outperforms strong baselines quantitatively and that its results are perceived as more realistic than state-of-the-art methods.
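As a rough structural sketch of hierarchical message passing on a garment graph, the code below runs one message-passing step on a coarse graph to carry long-range (stiff) effects, broadcasts the result back to the fine mesh, and then refines local detail. This is a generic graph-network pattern written for illustration, assuming a PyTorch setup with hypothetical edge lists and a fine-to-coarse node assignment; it is not the HOOD architecture.

```python
import torch
import torch.nn as nn

class MPStep(nn.Module):
    """One generic message-passing step on an edge list (illustrative only)."""
    def __init__(self, dim=64):
        super().__init__()
        self.edge_mlp = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, dim))
        self.node_mlp = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, x, edges):
        src, dst = edges                                    # two (E,) index tensors
        msg = self.edge_mlp(torch.cat([x[src], x[dst]], dim=-1))
        agg = torch.zeros_like(x).index_add_(0, dst, msg)   # sum messages per receiving node
        return x + self.node_mlp(torch.cat([x, agg], dim=-1))

def hierarchical_step(fine, coarse, step_fine, step_coarse, assign):
    """Propagate on the coarse graph first, then distribute updates to the fine mesh.
    `assign` maps each fine node to its coarse cluster (hypothetical precomputed pooling)."""
    x_fine, edges_fine = fine
    x_coarse, edges_coarse = coarse
    x_coarse = step_coarse(x_coarse, edges_coarse)   # long-range, stiff modes travel cheaply here
    x_fine = x_fine + x_coarse[assign]               # broadcast coarse updates to fine nodes
    return step_fine(x_fine, edges_fine)             # recover local wrinkle-scale detail
```

The point of the hierarchy is that information which would need many fine-mesh hops travels in a few coarse-graph hops, which is what keeps stiff stretching behavior tractable at real-time rates.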

Neural Topology Optimization

NTopo: Mesh-free Topology Optimization using Implicit Neural Representations
J. Zehnder, Y. Li, S. Coros, B. Thomaszewski
NeurIPS 2021
PDF Video

Recent advances in implicit neural representations show great promise for generating numerical solutions to partial differential equations. Compared to conventional alternatives, such representations employ parameterized neural networks to define, in a mesh-free manner, signals that are highly detailed, continuous, and fully differentiable. In this work, we present a novel machine learning approach for topology optimization, an important class of inverse problems with high-dimensional parameter spaces and highly nonlinear objective landscapes. To effectively leverage neural representations in the context of mesh-free topology optimization, we use multilayer perceptrons to parameterize both density and displacement fields. Our experiments indicate that our method is highly competitive for minimizing structural compliance objectives, and that it enables self-supervised learning of continuous solution spaces for topology optimization problems.
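The mesh-free flavor of this setup can be sketched as follows: two coordinate-based MLPs represent the density and displacement fields, and the objective is evaluated at randomly sampled collocation points rather than on a mesh. The loss terms below are crude placeholders standing in for the compliance objective, equilibrium PDE, and volume constraint of the actual method; the target volume fraction and all network sizes are assumptions.

```python
import torch
import torch.nn as nn

# Coordinate-based MLPs for the density and displacement fields over a 2D domain.
def make_mlp(out_dim):
    return nn.Sequential(nn.Linear(2, 128), nn.SiLU(),
                         nn.Linear(128, 128), nn.SiLU(),
                         nn.Linear(128, out_dim))

density_net = make_mlp(1)        # rho(x, y) in [0, 1] after a sigmoid
displacement_net = make_mlp(2)   # u(x, y)

params = list(density_net.parameters()) + list(displacement_net.parameters())
opt = torch.optim.Adam(params, lr=1e-3)

for step in range(1000):
    pts = torch.rand(1024, 2)                    # mesh-free: sample collocation points
    rho = torch.sigmoid(density_net(pts))
    u = displacement_net(pts)
    objective = (rho * (u ** 2).sum(-1, keepdim=True)).mean()  # placeholder for compliance/PDE terms
    volume_penalty = (rho.mean() - 0.4) ** 2                   # assumed target volume fraction
    loss = objective + 10.0 * volume_penalty
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Because the fields are continuous functions of the coordinates, resolution is decoupled from any grid: the same trained networks can be queried at arbitrary points, which is what the abstract means by a mesh-free formulation.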