2023
Journal Article
Title
A strategy to train machine learning material models for finite element simulations on data acquirable from physical experiments
Abstract
A high-quality finite element analysis of structural components requires an adequate, often complex, description of the material behaviour. Data-driven models based on artificial neural networks have been shown to possess the potential to replace current material models, as they promise, among other benefits, fast computation, high adaptability to new materials and uncoupling from closed analytical formulations of the model. However, the training of these Machine Learning Material Models (MLMM) is currently limited to data acquired from simulations using classical material models, since these directly provide the stress output as a response to a given strain input, as required to train the MLMM by supervised learning. In experiments, by contrast, only the surface strain field and the global force are measurable; local stress values in an inhomogeneous field are not obtainable.
We propose a methodology to uncouple MLMM from classical modelling, so that they are no longer surrogate models, with the promise of unleashing the full potential of data-driven materials modelling. In a two-step optimization, starting from an MLMM pre-trained on a similar material and using only data practically acquirable from experiments, we re-train or re-calibrate the MLMM to a new material. The first and major step, denoted pseudogradient-ES (evolutionary strategy), is an external optimization of the neural network's parameters, such as weights and biases, using mechanically motivated loss functions as the optimization objective. It combines an evolutionary strategy with gradient-based optimization through a special formulation of a vector that serves as an alternative to a gradient in the optimization process. The final step reduces the number of physical tests required by taking advantage of the material symmetry, using our proposed rotation-backpropagation strategy, in which new training data is constructed and used in a final training stage of the MLMM via backpropagation. The methodology is demonstrated as a proof of concept on elastic virtual experiments that, however, use only data acquirable in real physical experiments. The final MLMM describing the new material is shown running in finite element analyses.
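The pseudogradient-ES step is described only at a high level above; the minimal sketch below illustrates one plausible reading of it, in which a fitness-weighted average of random parameter perturbations serves as a pseudo-gradient for an ordinary descent update on the network's weights and biases. The toy strain-to-stress network, the surrogate "global force" loss, and all names, shapes and hyperparameters are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of the pseudogradient-ES idea (assumed reading, not the
# paper's code): evaluate a mechanically motivated loss on randomly perturbed
# network parameters and use the fitness-weighted average of the perturbations
# as a gradient substitute in a plain descent update.
import numpy as np

rng = np.random.default_rng(0)

# Toy MLMM: one hidden layer mapping 6 Voigt strain components to 6 stresses.
N_IN, N_HID, N_OUT = 6, 16, 6
N_PARAMS = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT

def unpack(theta):
    """Split the flat parameter vector into weight matrices and biases."""
    i = 0
    W1 = theta[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = theta[i:i + N_HID]; i += N_HID
    W2 = theta[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = theta[i:]
    return W1, b1, W2, b2

def mlmm(theta, strains):
    """Predict stresses for a batch of strain states."""
    W1, b1, W2, b2 = unpack(theta)
    return np.tanh(strains @ W1 + b1) @ W2 + b2

def loss(theta, strains, measured_force, area_weights):
    """Mismatch between the measured global force and the force recovered
    from predicted local stresses (a weighted sum stands in for FE assembly)."""
    stresses = mlmm(theta, strains)
    predicted_force = np.sum(area_weights[:, None] * stresses, axis=0)
    return float(np.sum((predicted_force - measured_force) ** 2))

def pseudogradient_es(theta, strains, force, area_weights,
                      pop=32, sigma=0.05, lr=1e-2, iters=200):
    """ES outer loop: the rank-weighted average of the perturbations acts as a
    pseudo-gradient for a gradient-style update of the parameters."""
    for _ in range(iters):
        eps = rng.standard_normal((pop, theta.size))
        fitness = np.array([loss(theta + sigma * e, strains, force, area_weights)
                            for e in eps])
        # Rank-normalise so the pseudo-gradient is robust to the loss scale.
        ranks = (fitness.argsort().argsort() / (pop - 1)) - 0.5
        pseudo_grad = (ranks[:, None] * eps).sum(axis=0) / (pop * sigma)
        theta = theta - lr * pseudo_grad   # descent step along the pseudo-gradient
    return theta

# Hypothetical "experimental" data: surface strain states and one global force.
strains = rng.standard_normal((50, N_IN)) * 1e-3
area_weights = rng.random(50)
measured_force = rng.standard_normal(N_OUT)

theta0 = rng.standard_normal(N_PARAMS) * 0.1   # stand-in for pre-trained weights
theta_new = pseudogradient_es(theta0, strains, measured_force, area_weights)
print("final loss:", loss(theta_new, strains, measured_force, area_weights))
```

Rank-normalising the loss values before averaging the perturbations is a common evolution-strategy choice that keeps the pseudo-gradient insensitive to the scale of the loss; whether the paper's "special formulation of a vector" uses such a weighting is an assumption here.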
Author(s)