Model Validation

Model validation is based on comparison with experimental data from camera-based instruments.
Starting Point: CEN Workshop Agreement
A recent CEN guideline recommends performing the comparison using data-reduction techniques such as image decomposition. In MOTIVATE, this procedure is applied to an aircraft sub-component test in an industrial environment.
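The guideline does not prescribe a single decomposition technique; as a minimal sketch, one common choice for generating shape descriptors from a full-field strain map is a two-dimensional Chebyshev polynomial fit, where the polynomial `order` is a parameter chosen by the analyst:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def shape_descriptors(field, order=5):
    """Decompose a 2D field (e.g. a strain map) into Chebyshev shape
    descriptors.

    Fits a 2D Chebyshev polynomial of the given order over the field's
    domain, mapped to [-1, 1] x [-1, 1], and returns the coefficients
    as a flat feature vector of length (order + 1)**2.
    """
    ny, nx = field.shape
    x = np.linspace(-1.0, 1.0, nx)
    y = np.linspace(-1.0, 1.0, ny)
    # Pseudo-Vandermonde matrix of the 2D Chebyshev basis at every pixel
    V = C.chebvander2d(*np.meshgrid(x, y), [order, order])
    coeffs, *_ = np.linalg.lstsq(
        V.reshape(-1, (order + 1) ** 2), field.ravel(), rcond=None
    )
    return coeffs  # feature vector of shape descriptors
```

Applying the same decomposition to the predicted and the measured field yields two feature vectors of equal length, which reduces the comparison of two dense images to the comparison of a handful of coefficients.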
Predicted (top left) and measured (bottom left) y-direction strain fields, in percentage strain, in the web of an I-beam subject to three-point bending; corresponding plot (right) of normalised shape descriptors with a band of acceptability shown by the dashed lines (from Hack et al., J. Strain Analysis 51(5), 2016).

We have developed novel methods for comparing predictive numerical models with full-field experimental data in order to achieve a robust, quantitative validation of the simulation. In particular, an understanding has been gained of the uncertainties in simulation and experimental data and of their influence on predictions and measurements.

Criteria allowing easy comparison and interpretation of data have been embedded in novel correlation methods that allow confidence in simulations to be established, supported by quantification of 'the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model', as the ASME definition of validation reads. Work at ULIV has demonstrated that feature vectors, obtained via image decomposition, can be used to generate relative error metrics. This preliminary work is being integrated with the prior work embedded in the CEN guideline to produce a validation methodology.
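The exact form of the ULIV error metric is not reproduced here; as a sketch of the idea, one simple relative error metric normalises the per-descriptor differences by the magnitude of the largest measured descriptor, so that small, noise-dominated descriptors do not inflate the error:

```python
import numpy as np

def relative_error_metric(pred, meas):
    """Per-descriptor relative error between predicted and measured
    feature vectors, normalised by the largest measured descriptor.

    `pred` and `meas` are feature vectors of equal length, e.g. the
    output of an image-decomposition step.
    """
    pred = np.asarray(pred, dtype=float)
    meas = np.asarray(meas, dtype=float)
    scale = np.max(np.abs(meas))
    if scale == 0.0:
        raise ValueError("measured feature vector is identically zero")
    return (pred - meas) / scale
```

A vector of such relative errors can then be plotted descriptor by descriptor, as in the figure above, and judged against a band of acceptability.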

Validation flowchart

The MOTIVATE Validation Flowchart, reproduced below, forms the basis of the MOTIVATE Protocol. A key novel feature is the evaluation of historical data. In addition, the development of the model takes priority, while physical testing is performed only if required. The processes involved are shown as coloured boxes; these include the decision sequence required to evaluate historical data for use in the validation process and the quantitative validation process described in the recently published CEN guide.

To allow an unbiased comparison of the data sets, it is recommended to implement a "double-blind" procedure. To this end, interaction between the testing and modelling teams is permitted only to clarify and agree on the test object, geometry, boundary conditions and load cases. Data are then generated by experiment (blue box) and by simulation (orange box) independently, and the results are transferred to an independent validation team that is in charge of the Quantitative Comparison (red box).

It should be noted that, prior to starting the validation process, decision-makers will need to state their expectations so that appropriate decision criteria can be adopted, for instance requirements for measurement uncertainty. While this activity of the decision-makers was not within the brief of the MOTIVATE project, their review at the end of the process is shown for completeness and for comparison with the existing ASME flowchart.
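To illustrate how the Quantitative Comparison step might turn the decision-makers' measurement-uncertainty requirements into a pass/fail statement, the sketch below uses an assumed acceptance criterion (not quoted verbatim from the CEN guide): a prediction is accepted when every shape descriptor lies within a band of plus or minus a coverage factor times the measurement uncertainty around the measured value:

```python
import numpy as np

def within_acceptance_band(pred, meas, u_meas, coverage=2.0):
    """Compare predicted and measured shape descriptors against a band
    of +/- coverage * u_meas around each measured value (a coverage
    factor k = 2 corresponds to roughly 95 % expanded uncertainty).

    Returns the per-descriptor result and an overall verdict.
    """
    pred, meas, u = (np.asarray(a, dtype=float) for a in (pred, meas, u_meas))
    ok = np.abs(pred - meas) <= coverage * u
    return ok, bool(ok.all())
```

The overall verdict would feed into the validation statement, while the per-descriptor results indicate which features of the field drive any disagreement.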

Proposed new flowchart for the process of validating simulations that allows the use of historical data. While the two strands for simulation and a dedicated validation experiment, known from the corresponding ASME flowchart, form part of the graph, the proposed flowchart opens more routes to reaching a validation statement.