2026
Journal Article
Title
Rapid Offline Training for Deep Material Networks via a Displacement-Based Laminate Formulation and a Novel Sampling Technique for a Compliance-Based Fatigue Model
Abstract
Deep Material Networks (DMNs) are based on hierarchical laminates and serve as surrogate models for the microstructure in multiscale computations. They involve a two-step process: in the initial training phase, the parameters of the laminates are identified via supervised learning on precomputed elastic data with varying stiffness tensors associated with the phases. In the second phase, the laminate parameters are fixed, and the DMN may be used for inelastic predictions, giving rise to the sought surrogate model. Once identified, DMNs lead to a tremendous speed-up compared to full-field simulations and prove rather robust with respect to changes in the constitutive behavior of the individual phases comprising the microstructure. The work at hand offers two novelties. First, we examine the initial training phase, which represents the bottleneck when using DMNs for industrial-scale problems. For laminate-based DMNs, we study three different ways to obtain the effective elastic properties of laminates. Although the different formulations lead to identical results in exact arithmetic, their numerical performance differs when used in the DMN training process. We find that the displacement-based laminate formulation leads to a speed-up factor of roughly seven compared to the two-phase Milton formulation. The second contribution concerns an innovative sampling strategy for the stiffness tensors used in the offline training for a compliance-based fatigue-damage model. The proposed strategy leads to robust training and accurate online predictions. We demonstrate the utility of our findings via dedicated computational experiments for industrial-scale glass-fiber-reinforced composites.
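As a minimal illustration of the two-phase laminate building blocks the abstract refers to (a hypothetical sketch, not the paper's displacement-based formulation), the effective 1D moduli of a two-phase laminate follow the arithmetic (Voigt) mean for loading parallel to the layers and the harmonic (Reuss) mean for loading perpendicular to them:

```python
def laminate_moduli(E1, E2, c1):
    """Effective 1D moduli of a two-phase laminate.

    E1, E2 : Young's moduli of the two phases
    c1     : volume fraction of phase 1 (0 <= c1 <= 1)

    Loading parallel to the layers yields the arithmetic (Voigt) mean;
    loading perpendicular to the layers yields the harmonic (Reuss) mean.
    """
    c2 = 1.0 - c1
    E_parallel = c1 * E1 + c2 * E2                 # Voigt bound
    E_perpendicular = 1.0 / (c1 / E1 + c2 / E2)    # Reuss bound
    return E_parallel, E_perpendicular

# Illustrative numbers only: glass fiber (~72 GPa) in a polymer
# matrix (~3 GPa) at 30% fiber volume fraction.
E_par, E_perp = laminate_moduli(72.0, 3.0, 0.3)
```

A DMN composes many such two-phase laminates hierarchically, with the mixing fractions and layer orientations identified during the offline training phase described above.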
Open Access
Rights
CC BY 4.0: Creative Commons Attribution
Language
English