HM 2019: abstracts of the talks

Molecular modeling as a tool to support lubricant formulation
By Sophie Loehlé (Total)
Lubrication plays a major role in a wide range of key sectors such as the automotive industry. The renewal of lubricants is currently driven by the quest for improved properties together with a reduced environmental footprint. It therefore becomes important to better understand the interaction between the lubricant and the surface. Molecular modeling is a powerful tool that has already proven its ability to elucidate the reactivity of lubricant additives toward a surface [1-5]. It is proposed here to review the different modes of action of friction modifiers and the associated molecular modelling developments, e.g., the coupling of density functional theory (DFT) with molecular dynamics (MD) simulation, as well as the development of automated workflows enabling quick assessment of adsorption sites, surface coverage and adsorption energies. Finally, the present work is a good example of how a fundamental study can be applied to an industrial issue, and of the future challenges in computational chemistry.
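As an illustration of the kind of automated screening workflow mentioned above, the sketch below ranks candidate adsorption sites by adsorption energy, E_ads = E(slab+molecule) − E(slab) − E(molecule). All site names and energy values are invented for illustration; they do not come from the talk.

```python
# Sketch of an automated adsorption-site screening step (hypothetical energies).
# E_ads = E(slab+molecule) - E(slab) - E(molecule); the most negative value
# identifies the preferred adsorption site.

def adsorption_energy(e_total, e_slab, e_molecule):
    """Adsorption energy in eV; negative means favourable adsorption."""
    return e_total - e_slab - e_molecule

def rank_sites(e_slab, e_molecule, site_totals):
    """Return candidate sites sorted from strongest to weakest binding."""
    return sorted(
        ((site, adsorption_energy(e_tot, e_slab, e_molecule))
         for site, e_tot in site_totals.items()),
        key=lambda pair: pair[1],
    )

if __name__ == "__main__":
    # Illustrative single-point energies (eV) for one additive on a metal slab.
    E_SLAB, E_MOL = -1520.40, -310.20
    totals = {"top": -1831.05, "bridge": -1831.35, "hollow": -1831.80}
    for site, e_ads in rank_sites(E_SLAB, E_MOL, totals):
        print(f"{site:>6}: E_ads = {e_ads:+.2f} eV")
```

In a real workflow the three total energies would come from DFT single-point or relaxation calculations over automatically generated slab-plus-adsorbate geometries.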
References:
[1] Loehlé et al., Tribology International 2014
[2] Loehlé and Righi, Lubrication Science 2017
[3] Loehlé and Righi, Lubricants 2018
[4] Peeters et al., J. Phys. Chem. C, 2019
[5] Blanck et al., Tribology Letters, Submitted
 

Molecularly detailed computer modelling of complex formulations in solution and at responsive interfaces to assist the development of cosmetic formulations: how to go further?
By Fabien Léonforte (L’Oréal R&I)

In hair care, the formulations used in shampoos and conditioners are intrinsically complex in order to address very different issues [1-3]. Most ingredients focus primarily on the main function of washing, yet very complex formulations containing 10 to 30 ingredients are needed to satisfy consumer demands. Among them we can list detergents (surfactants), conditioners, and additives that impart extra side effects (foam, etc.) or preserve and stabilize the formulations [4-10]. Most of the time, good surface conditioning can be achieved by mixing polymeric macromolecules into such ingredients. Polymers are then used to modify substrates and make them functional, to make them compatible with particular solvents, and, above all, to provide cosmetic effects on bio-interfaces. In the near future, the overarching goal would be the following: in product design across disciplines, the extensive use of computer-aided modeling is fully integrated into the classical engineering and experimentation processes; informed by rigorous engineering results, extensive experimental characterization and thorough theoretical analyses, any product undergoes a number of cycles of improvement before it reaches the market. How great would it be to endorse such a roadmap for product development in the current year, 2019? To date, however, actual practice is different. Leaving aside industrial constraints and fast time-to-market techniques that must quickly respond to economic principles, the bottlenecks mainly stem from the lack of efficient theoretical and computational methods available to tackle these multiple time- and length-scale problems. It is important that the research effort get a firm grip on complexity, a state of affairs characterized by many types of molecules with a correspondingly rich palette of intricate interactions from which targeted properties emerge. We argue that computer-aided modeling is the tool par excellence that can account for complexity.
A systematic integration of simulation methods into a development program can therefore bring product design to the next level. During this talk, I will introduce one of the new theoretically informed computational techniques devoted to taking a step further in the direction of this program. The formalism accounts for a particle-based representation of strongly, weakly and uncharged complex mixtures of macromolecules, and decouples the bonded and non-bonded molecular interactions through fluctuating external fields derived from a self-consistent field theoretical framework. It implicitly accounts for the solvent quality through a third-order virial expansion of the equation of state of the polymers [11]. The method uses the Single-Chain-in-Mean-Field algorithm [12] to sample the macromolecular dynamics in a semi-grand-canonical ensemble where dissociation states are treated in a mean-field theoretic framework. Coupled to a simultaneous treatment of the electrostatics via a generalized, non-linear Poisson-Boltzmann (NLPB) equation, the approach is able to capture the main features of polyelectrolyte macromolecular systems, such as those used in cosmetics, and their evolution in bulk formulation and at bio-mimetic interfaces. However, despite the gain of self-consistently and implicitly accounting for solvent effects, the method still requires substantial improvement: more stable and faster computational methods for solving the NLPB equation, including hybrid multi-CPU/GPU developments. We will briefly discuss the latest evolutions in that direction, in collaboration with the Institut Carnot Smiles. Finally, we will conclude with the next challenges to address within the same theoretically informed computational framework, either to improve the implicit descriptions of the macromolecules and the solvent, or to improve the computational, on-the-fly methods used to extract the thermodynamic variables that inform cosmetic performance descriptors.
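As a purely illustrative sketch of the kind of NLPB solve discussed above (not the talk's actual formalism), the following solves the reduced-unit one-dimensional Poisson-Boltzmann equation ψ'' = sinh(ψ) near a charged wall by Newton iteration on a finite-difference grid. All parameter values and names are ours.

```python
import numpy as np

# Minimal 1D nonlinear Poisson-Boltzmann solver (illustrative, reduced units):
# psi'' = sinh(psi) on [0, L], with psi(0) = psi0 (charged wall), psi(L) = 0 (bulk).
def solve_pb_1d(psi0=2.0, length=10.0, n=200, tol=1e-10, max_iter=50):
    x = np.linspace(0.0, length, n)
    h = x[1] - x[0]
    psi = psi0 * np.exp(-x)              # initial guess: Debye-Hückel-like decay
    psi[0], psi[-1] = psi0, 0.0
    for _ in range(max_iter):
        # Residual of the discretised equation at the interior nodes.
        res = (psi[:-2] - 2.0 * psi[1:-1] + psi[2:]) / h**2 - np.sinh(psi[1:-1])
        # Tridiagonal Jacobian d(res_i)/d(psi_j) of the discretised operator.
        m = n - 2
        jac = np.zeros((m, m))
        idx = np.arange(m)
        jac[idx, idx] = -2.0 / h**2 - np.cosh(psi[1:-1])
        jac[idx[:-1], idx[:-1] + 1] = 1.0 / h**2
        jac[idx[1:], idx[1:] - 1] = 1.0 / h**2
        delta = np.linalg.solve(jac, -res)
        psi[1:-1] += delta
        if np.max(np.abs(delta)) < tol:
            break
    return x, psi
```

The Newton linearisation around cosh(ψ) is what makes the nonlinear solve stable at high surface potentials; in production, the dense solve would be replaced by a sparse or multigrid solver, which is exactly where the multi-CPU/GPU work mentioned above comes in.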
References:
[1] G.S. Luengo and A. Galiano. Aqueous lubrication in cosmetics. In N.D. Spencer, editor, Aqueous Lubrication, chapter 1, pages 103–144. World Scientific, 2015.
[2] P. Hössel, R. Dieing, R. Nörenberg, A. Pfau, and R. Sander. Conditioning polymers in today’s shampoo formulations - efficacy, mechanism and test methods. Int. J. Cosmetic. Sci., 22(1):1–10, 2000.
[3] A.L.L. Hunting. Can there be cleaning and conditioning in the same product? Cosmet. Toilet., 103:73–78, 1988.
[4] E.D. Goddard and K.P. Ananthapadmanabhan. Application of Polymer-Surfactant Systems. Marcel Dekker, inc. N.Y., United States of America, 1998.
[5] E.D. Goddard and J.V. Gruber. Principles of Polymer Science and Technology in Cosmetics and Personal Care. Marcel Dekker, inc. N.Y., United States of America, 1999.
[6] E.D. Goddard. Polymer/surfactant interaction: Interfacial aspects. J. Colloid Interface Sci., 256:228–235, 2002.
[7] S.E. Morgan, K.O. Havelka, and R.Y. Lochhead. Cosmetic Nanotechnology, volume 961. ACS Symposium Series, 2007.
[8] A. Rook. The clinical importance of ‘weathering’ in human hair. Br. J. Dermatol., 95:111–112, 1976.
[9] M. Starch. Screening silicones for hair cluster. Cosmet. Toilet., 114:56–60, 1999.
[10] J. Sun, J. Parr, and D. Travagline. Stable conditioning shampoos containing high molecular weight dimethicone. Cosmet. Toilet., 117:41–50, 2002.
[11] F. Léonforte, U. Welling and M. Müller, Single-Chain-in-Mean-Field Simulations of Weak Polyelectrolyte Brushes. J. Chem. Phys. 145, 224902 (2016).
[12] K. Daoulas and M. Müller, Single chain in mean field simulations: Quasi-instantaneous field approximation and quantitative comparison with Monte Carlo simulations. J. Chem. Phys. 125, 184904 (2006).
 

Multi-scale modelling of microstructure evolution under irradiation – the time scale and the coarse graining challenge
By Christophe Domain1,3
Work carried out in collaboration with C.S. Becquart2,3
1. EDF R&D, Dpt Matériaux & Mécanique des Composants Les Renardieres, Moret sur Loing, France
2. Univ. Lille, CNRS, INRA, ENSCL, UMR 8207 - UMET - Unité Matériaux et Transformations
3. EM2VM, Joint laboratory Study and Modeling of the Microstructure for Ageing of Materials
In a classical nuclear power plant, the components near the reactor, such as the cladding (made out of Zr alloys) or the internal structures (austenitic steels) and to a lesser extent the pressure vessel (bainitic / ferritic steels), are subjected to neutron bombardment. In a fusion reactor, high energy particles (H isotopes and He) as well as 14 MeV neutrons bombard the walls (ferritic martensitic steels) and the divertor (tungsten alloys).
Under these neutron or ion irradiations, point defects and defect clusters are formed within the displacement cascades or by diffusion of defects under irradiation fluxes. The properties of these clusters have significant impact on the evolution of the microstructure under irradiation conditions. Furthermore, solutes within the alloys may affect the defect behaviour (their relative stability and their mobility) due to their more or less large interaction with them.
In order to predict and understand the ageing of structural materials, a multiscale modelling approach is being developed to simulate the microstructure evolution under irradiation. It is based on the key physical phenomena the material experiences. In this approach, one of the main components, with a strong impact on the resulting microstructure, is the set of properties of defects and defect clusters, more precisely their stability and mobility, which correspond to their binding and migration energies (the material evolution is described in terms of elementary physical mechanisms). These data are the inputs of kinetic Monte Carlo models for the modelling of the microstructure evolution.
Industrial alloys are multi-component alloys, and the number of possible interactions between solutes and defect clusters increases rapidly with the number of solutes present. A typical example is reactor pressure vessel steels, where the clusters formed under irradiation mainly contain Cu, Ni, Mn and Si, as well as P. In order to characterize all the possible migration barriers between these solutes and the point defects, a large number of DFT calculations are necessary to build cohesive models and/or migration-diffusion models.
The different strategies to model the microstructure will be discussed in terms of methods (atomic, object and hybrid kinetic Monte Carlo) and models (e.g. pair interactions, cluster expansion, neural networks), in order to overcome the time-scale challenge and coarse-grain the microstructure.
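To make the connection between migration energies and microstructure evolution concrete, here is a minimal, generic rejection-free (residence-time) kinetic Monte Carlo step, not the speakers' code; the barriers, attempt frequency and temperature are illustrative stand-ins for the DFT-derived inputs discussed above.

```python
import math
import random

# One rejection-free kinetic Monte Carlo step. Each possible event (e.g. a
# defect jump) has an Arrhenius rate k = nu0 * exp(-E_m / (kB * T)) built
# from its migration energy E_m.
KB = 8.617e-5  # Boltzmann constant, eV/K

def arrhenius(e_m, temp, nu0=1e13):
    """Jump frequency (1/s) for a barrier e_m (eV) at temperature temp (K)."""
    return nu0 * math.exp(-e_m / (KB * temp))

def kmc_step(barriers, temp, rng=random):
    """Pick one event and a time increment; returns (event_index, dt)."""
    rates = [arrhenius(e, temp) for e in barriers]
    total = sum(rates)
    # Select event i with probability rates[i] / total.
    r = rng.random() * total
    acc = 0.0
    for i, k in enumerate(rates):
        acc += k
        if r <= acc:
            break
    dt = -math.log(rng.random()) / total   # exponentially distributed residence time
    return i, dt
```

Repeating this step while updating the event catalogue after each jump yields the microstructure trajectory; the hard parallelisation point mentioned below comes from the fact that each step depends on the global rate catalogue.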
For most simulation methods, parallel algorithms have been implemented (kinetic Monte Carlo remains a hard point, though), and substantial computing resources are needed. HPC is very widely used, as is, more recently, GPU computing.
Some numerical issues faced at the different scales and with the different methods will also be pointed out.
 

Up-to-date atomistic approaches in materials science
By Mihai Cosmin Marinica1
Work carried out in collaboration with A. M. Goryaeva1, T. D. Swinburne2, C. Lapointe1, J. Dérès1, J.-B. Maillet3
1. DEN - Service de Recherches de Métallurgie Physique, CEA, Université Paris-Saclay
2. CNRS, Centre Interdisciplinaire de Nanoscience de Marseille (CINaM), Université Aix-Marseille
3. CEA-DAM, DAM-DIF
The predictive power of multiscale models that bridge the atomistic and macroscopic scales depends on the accuracy of the input information from the smaller scale [1]. Thus, the characterization of defects at the atomic scale is crucial and underpins multi-scale techniques in materials science.
Firstly, in this work we propose a new concept for the definition of structural defects in crystalline solids. A quantitative measure that describes the degree of distortion of the local atomic environment is associated with each atom. Based solely on geometric information, this atomic distortion score is provided by various outlier-detection machine learning techniques. The proposed definition of defects opens many perspectives in the field of computational materials science. The applications range from a qualitative substitute for the concept of energy per atom to the selection of the relevant structural information in techniques at the forefront of materials design, such as QM/MM or recently proposed machine learning interatomic potentials.
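A toy version of such an outlier-detection distortion score can be built from per-atom descriptor vectors and a Mahalanobis distance; this is our own illustration of the idea, not the specific method of the talk, and the descriptors here are random stand-ins for real local-environment features.

```python
import numpy as np

# Per-atom "distortion score" from outlier detection: each atom carries a
# descriptor vector of its local environment, and the score is the
# Mahalanobis distance to the distribution of (mostly bulk-like) atoms.
def distortion_scores(descriptors):
    """descriptors: (n_atoms, n_features) array; returns one score per atom."""
    mean = descriptors.mean(axis=0)
    cov = np.atleast_2d(np.cov(descriptors, rowvar=False))
    cov_inv = np.linalg.pinv(cov)        # pseudo-inverse: robust to singular cov
    diff = descriptors - mean
    # Quadratic form diff_i . cov_inv . diff_i for every atom i.
    return np.sqrt(np.einsum("ij,jk,ik->i", diff, cov_inv, diff))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    bulk = rng.normal(0.0, 0.05, size=(99, 3))   # nearly identical environments
    defect = np.array([[1.0, 1.0, 1.0]])         # one strongly distorted environment
    scores = distortion_scores(np.vstack([bulk, defect]))
    print("most distorted atom:", int(np.argmax(scores)))  # index 99
```

Atoms whose environments match the crystalline background score low; atoms near a vacancy, interstitial or dislocation core score high, which is what lets the score stand in for an energy-per-atom criterion.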
Secondly, the interaction and transformation of crystal defect networks, limited only by weak topological constraints, give rise to an extraordinarily diverse range of defect morphologies whose marginal distributions in size, character and density exhibit significant variation with temperature. This study aims at improving our understanding of the free energy landscape of point defects in metals up to the melting temperature by resorting to atomistic simulations based on electronic structure calculations. The phase space is sampled using adaptive molecular dynamics methods [2,3] and ab initio atomic forces. Moreover, taking advantage of the versatility of adaptive free energy methods, we combine the ab initio force field with machine-learning interatomic potentials [4,5]. The present approach, merging ab initio, free energy and machine learning methods, can provide quantities such as formation free energies or diffusion transition rates as key input parameters for any subsequent multi-scale simulation. We exemplify this approach in the case of point defects in tungsten and iron.
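In the spirit of the adaptive free-energy methods mentioned above (a generic textbook-style sketch, not the speakers' scheme), the example below flattens a 1D double well by depositing a small repulsive bias wherever the walker has been; the negative of the accumulated bias then estimates the free-energy profile. All parameters are invented.

```python
import math
import random

# Overdamped Langevin dynamics on U(x) = (x^2 - 1)^2 with an adaptive
# histogram bias that fills the wells and drives barrier crossings.
def force(x):
    """-dU/dx for the double well U(x) = (x^2 - 1)^2."""
    return -4.0 * x * (x * x - 1.0)

def adaptive_sampling(n_steps=200_000, dt=1e-3, kT=0.3, hill=2e-4, seed=1):
    rng = random.Random(seed)
    nbins, lo, hi = 60, -2.0, 2.0
    width = (hi - lo) / nbins
    bias = [0.0] * nbins
    x = -1.0                                   # start in the left well
    for _ in range(n_steps):
        b = min(nbins - 1, max(0, int((x - lo) / width)))
        bias[b] += hill                        # deposit a small repulsive hill
        # Biasing force: finite difference of the accumulated bias.
        bl = bias[max(0, b - 1)]
        br = bias[min(nbins - 1, b + 1)]
        fbias = -(br - bl) / (2.0 * width)
        noise = math.sqrt(2.0 * kT * dt) * rng.gauss(0.0, 1.0)
        x += (force(x) + fbias) * dt + noise   # Euler-Maruyama step
        x = min(hi - 1e-9, max(lo + 1e-9, x))  # reflecting walls
    return bias
```

Once the bias has converged, −bias[b] (up to a constant) approximates the free energy of bin b, the kind of quantity that, combined with transition rates, feeds the multi-scale models described above.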
References:
[1] R. Alexander et al. Phys. Rev. B 94, 024103 (2016).
[2] T. Lelievre, M. Rousset, G. Stoltz. J. Chem. Phys., 126, 134111, (2007).
[3] T.D Swinburne, M.-C. Marinica, Phys. Rev. Lett. 120, 135503 (2018).
[4] M.-C. Marinica et al, MiLaDy - Machine Learning Dynamics, CEA, Saclay, 2015-2019.
[5] A.M. Goryaeva, J.-B. Maillet, M.-C. Marinica. Comp. Mater. Sci. 166, 200 (2019)
 

An overview of molecular modeling in the pharmaceutical industry
By Claire Minoletti (Sanofi)
After a brief review of the drug discovery process in R&D, the various techniques used in molecular modeling will be described, together with their application at the different stages of drug design. Particular attention will be paid to emerging techniques, among others artificial intelligence and the handling of very large compound collections.
 

Accelerating molecular dynamics simulations with Tinker-HP: the importance of interdisciplinarity
By Jean-Philip Piquemal1,2,3
1. Sorbonne Université
2. Institut Universitaire de France
3. Department of Biomedical Engineering, the University of Texas at Austin
Tinker-HP (http://www.ip2ct.upmc.fr/tinkerHP) is a molecular dynamics software package that enables simulations both with latest-generation polarizable force field approaches and with hybrid QM/MM approaches combining quantum (QM) and classical (MM) physics. Tinker-HP is an evolution of the Tinker package (http://dasher.wustl.edu/tinker), of which it is part, and aims to preserve its ease of use while making efficient use of modern computing resources (supercomputers, GPU graphics cards). Tinker-HP offers a scalable high-performance computing environment giving access to simulations of large complex systems of up to millions of atoms. I will present the capabilities and performance of the software, highlighting the strong interactions between theoretical chemistry, high-performance computing and applied mathematics that made it possible to overcome certain methodological bottlenecks. Finally, I will discuss ongoing developments dedicated to the efficient use of future exascale machines.
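At the heart of the polarizable force fields mentioned above lies a self-consistent solve for the induced dipoles, μ_i = α_i (E_i^perm + Σ_{j≠i} T_ij μ_j). The toy sketch below (our own illustration in arbitrary units, not Tinker-HP code, with no Thole damping or periodic boundaries) iterates this to convergence for a pair of polarizable sites.

```python
import numpy as np

# Minimal self-consistent induced-dipole solve:
# mu_i = alpha_i * (E_perm_i + sum_{j != i} T_ij mu_j).
def dipole_tensor(r):
    """3x3 dipole-field tensor (3 rhat rhat^T - I) / |r|^3 for separation r."""
    d = np.linalg.norm(r)
    rhat = r / d
    return (3.0 * np.outer(rhat, rhat) - np.eye(3)) / d**3

def induced_dipoles(pos, alphas, e_perm, tol=1e-10, max_iter=200):
    """Jacobi iteration to self-consistency; returns (n_sites, 3) dipoles."""
    n = len(pos)
    mu = alphas[:, None] * e_perm              # zeroth-order guess
    for _ in range(max_iter):
        field = e_perm.copy()
        for i in range(n):
            for j in range(n):
                if i != j:
                    field[i] += dipole_tensor(pos[i] - pos[j]) @ mu[j]
        mu_new = alphas[:, None] * field
        if np.max(np.abs(mu_new - mu)) < tol:
            return mu_new
        mu = mu_new
    return mu

if __name__ == "__main__":
    pos = np.array([[0.0, 0.0, 0.0], [3.0, 0.0, 0.0]])
    alphas = np.array([1.0, 1.0])
    e_perm = np.array([[0.0, 0.0, 1.0], [0.0, 0.0, 1.0]])
    print(induced_dipoles(pos, alphas, e_perm))
```

Production codes replace this O(n²) Jacobi loop with preconditioned Krylov solvers and particle-mesh summation, which is exactly where the applied-mathematics and HPC collaboration highlighted in the talk comes in.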
References:
i) Tinker-HP: a Massively Parallel Molecular Dynamics Package for Multiscale Simulations of Large Complex Systems with Advanced Point Dipole Polarizable Force Fields. L. Lagardère, L.-H. Jolly, F. Lipparini, F. Aviat, B. Stamm, Z. F. Jing, M. Harger, H. Torabifard, G. A. Cisneros, M. J. Schnieders, N. Gresh, Y. Maday, P. Ren, J. W. Ponder, J.-P. Piquemal, Chem. Sci., 2018, 9, 956-972
ii) Towards Large Scale Hybrid QM/MM Dynamics of Complex Systems with Advanced Point Dipole Polarizable Embeddings. D. Loco, L. Lagardère, G. A. Cisneros, G. Scalmani, M. Frisch, F. Lipparini, B. Mennucci, J.-P. Piquemal, Chem. Sci., 2019, 10, 7200-7211
iii) Raising the Performance of the Tinker-HP Molecular Modeling Package [Article v1.0]. L. H. Jolly, A. Duran, L. Lagardère, J. W. Ponder, P. Y. Ren, J.-P. Piquemal, LiveCoMS, 2019, online. DOI: 10.33011/livecoms.1.2.10409
 

Quantum computing in all its states for new scientific applications
By Georges Uzbelger (IBM)
This talk will present this new kind of computing and its concepts, the new algorithmic approach, and its contributions, in particular in the fields of optimization and quantum simulation. A demonstration with remote access to an IBM quantum machine will be given.