- T. Siebert, E. Hack, G. Lampeas, E.A. Patterson, and K. Splitthof, Uncertainty Quantification for DIC Displacement Measurements in Industrial Environments, Experimental Techniques, under revision
- K. Dvurecenska, I. Diamantakos, E. Hack, G. Lampeas, E.A. Patterson, T. Siebert, The validation of a full-field deformation analysis of an aircraft panel – a case study, Journal of Strain Analysis for Engineering Design
- K. Dvurecenska, E. Patelli, E.A. Patterson, Probabilistic metric for validation based on strain field data, BSSM's 14th Int. Conf. on Advances in Experimental Mechanics, Belfast, UK, 10-12 Sep 2019
- E. Hack, K. Dvurecenska, G. Lampeas, E.A. Patterson, T. Siebert, E. Szigeti, Incorporating historical data in a validation process, BSSM's 14th Int. Conf. on Advances in Experimental Mechanics, Belfast, UK, 10-12 Sep 2019
- A. Dean, W.J.R. Christian, Generalised decomposition of strain fields in complex components, BSSM's 14th Int. Conf. on Advances in Experimental Mechanics, Belfast, UK, 10-12 Sep 2019
- G. Lampeas, E.A. Patterson, E. Hack, Round robin results for CWA16799:2014 – Validation of computational solid mechanics models, BSSM's 14th Int. Conf. on Advances in Experimental Mechanics, Belfast, UK, 10-12 Sep 2019
- G. Lampeas, I. Diamantakos, T. Siebert, E. Hack, E. Patterson, Uncertainty quantification of Digital Image Correlation measurements based on projected speckle patterns, EASN 2019 Conference, Athens, Greece, 04–06 Sep 2019
- A. Alexiadis, R.L. Burguete, K. Dvurecenska, E. Hack, G. Lampeas, E.A. Patterson, T. Siebert and E. Szigeti, A Novel Flow-chart for Model Validation: Is it Conceivable to Validate without New Measurements?, SEM Annual Conference, Reno, USA, 03–06 June 2019
- E. Hack, K. Dvurecenska, G. Lampeas, E.A. Patterson, T. Siebert, and E. Szigeti, Steps towards industrial validation experiments, ICEM 2018, Brussels, BE, 01-05 July 2018, Proceedings 2 (2018) 391
- K. Dvurecenska, E. Hack, G. Lampeas, T. Siebert and E.A. Patterson, Comparative study of orthogonal decomposition of surface deformation in composite automotive panel, ECCM18 - 18th European Conference on Composite Materials, Athens, HE, 24-28 June 2018
- K. Dvurecenska, E. Patelli, E.A. Patterson, What’s the probability that a simulation agrees with your experiment?, Photomechanics 2018, Toulouse, FR, 19–22 March 2018, Book of Abstracts pp.52-53
Welcome to MOTIVATE
MOTIVATE is an Innovation Action within the European Commission's Horizon 2020 Clean Sky 2 program under Grant Agreement No. 754660, supported by the Swiss State Secretariat for Education, Research and Innovation (SERI) under contract number 17.00064.
Our team comprises Airbus Operations SAS as Topic Manager, the University of Liverpool as Coordinator, and Empa, Dantec Dynamics GmbH and the Athena Research and Innovation Center as Beneficiaries.
The goal of the project is to apply a CEN validation method to numerical results from FE simulations of a sub-component test, based on measurements made using Digital Image Correlation. The results of this test at an Airbus site, as well as of preliminary tests at Empa, will be documented on this website. An introductory video can be found on Empa TV.
The work is based on a Support Action that was conducted in the FP7 project VANESSA. We have made significant progress on methods for DIC calibration and model validation, notably on measuring the quality of data comparison. A special session at the BSSM international conference in Belfast on September 11th 2019 as well as a Knowledge Exchange workshop in Zurich on November 5th 2019 were held to present and discuss the latest relevant findings. The project results have been summarized on CORDIS.
Clean Sky 2 Joint Undertaking
Clean Sky is the largest European research programme developing innovative, cutting-edge technology aimed at reducing CO2 emissions, other gaseous emissions and noise levels produced by aircraft. Funded by the EU’s Horizon 2020 programme, Clean Sky contributes to strengthening European aero-industry collaboration, global leadership and competitiveness.
The MOTIVATE project team held its final project meeting on April 30th, 2020. Due to the COVID-19 pandemic, the team gathered online to review the project outcomes, which include a novel validation flowchart; digital tools to implement it; a validation metric to quantify the extent to which a model represents an experiment; and methodologies to estimate DIC measurement uncertainty in an industrial environment.
Starting Point: CEN Workshop Agreement
We have developed novel methods for comparing predictive numerical models with full-field experimental data in order to achieve a robust, quantitative validation of the simulation. In particular, an understanding has been gained of the uncertainties in simulation and experimental data and of their influence on predictions and measurements.
Criteria allowing an easy comparison and interpretation of data have been embedded in novel correlation methods that allow confidence in simulations to be established, supported by the quantification of 'the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model', as the ASME definition of validation reads. Work at ULIV has demonstrated that feature vectors, obtained via image decomposition, can be used to generate relative error metrics. This preliminary work is being integrated with the prior work embedded in the CEN guideline to produce a validation methodology.
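To illustrate the idea of image decomposition, the sketch below compresses a full-field map (e.g., a measured or simulated strain field) into a feature vector of orthogonal-polynomial coefficients and compares two such vectors with a relative error metric. This is a minimal illustration only: the choice of a 2D Chebyshev basis, the decomposition order, and the function names are assumptions for the example, not the specific implementation used in MOTIVATE.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def feature_vector(field, order=4):
    """Decompose a 2D field into 2D Chebyshev coefficients by least squares.

    The resulting coefficient array is a compact 'feature vector' that
    represents the spatial pattern of the field.
    """
    ny, nx = field.shape
    # Map the pixel grid onto the [-1, 1] x [-1, 1] domain of the basis
    X, Y = np.meshgrid(np.linspace(-1.0, 1.0, nx), np.linspace(-1.0, 1.0, ny))
    # Design matrix of all 2D Chebyshev basis terms up to the given order
    V = C.chebvander2d(X.ravel(), Y.ravel(), [order, order])
    coeffs, *_ = np.linalg.lstsq(V, field.ravel(), rcond=None)
    return coeffs

def relative_error(fv_exp, fv_sim):
    """Relative difference between experimental and simulated feature vectors."""
    return np.linalg.norm(fv_sim - fv_exp) / np.linalg.norm(fv_exp)

# Example: identical fields give a relative error of zero
x = np.linspace(-1.0, 1.0, 20)
demo_field = np.add.outer(x**2, x)          # a simple smooth test pattern
fv = feature_vector(demo_field, order=4)
print(relative_error(fv, fv))               # 0.0
```

Comparing coefficient vectors rather than raw pixel maps makes the metric insensitive to measurement noise at fine spatial scales and allows fields from different grids to be compared on a common basis.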
The MOTIVATE Validation Flowchart, reproduced below, forms the basis of the MOTIVATE Protocol. A key novel feature is the evaluation of historical data. In addition, the development of the model takes priority, and physical testing is performed only if required. The processes involved are shown as coloured boxes; they include the decision sequence required to evaluate historical data for use in the validation process, and the quantitative validation process described in the recently published CEN guide.

To allow an unbiased comparison of the data sets, it is recommended to implement a “double-blind” procedure. To this end, interaction between the testing and modelling teams is permitted only to clarify and agree on the test object, geometry, boundary conditions and load cases. Data are then generated independently by experiment (blue box) and by simulation (orange box), and the results are transferred to an independent validation team in charge of the Quantitative Comparison (red box).

It should be noted that, prior to starting the validation process, decision-makers need to state their expectations so that appropriate decision criteria can be adopted, for instance requirements for measurement uncertainty. While this activity of the decision-makers was not within the brief of the MOTIVATE project, their review at the end of the process is shown for completeness and for comparison with the existing ASME flowchart.