
Chemistry and materials

Computational chemistry and materials science can trace their roots to the development of quantum mechanics in the 1920s, when it was realized that molecules and materials consist of charged particles in motion, governed by the laws of quantum mechanics.
Fig. 1. Reproduced with permission from U. Landman, W. D. Luedtke, N. A. Burnham and R. J. Colton, 1990. Atomistic mechanisms and dynamics of adhesion, nanoindentation, and fracture, Science, 248, 454-461.
Professor Trygve Helgaker, Department of Chemistry, University of Oslo

Scientific challenges
At the same time, it was recognized that the underlying many-body problem was computationally intractable, as even a moderately sized molecule consists of hundreds of interacting particles. Indeed, in 1929, P. A. M. Dirac, one of the founders of quantum mechanics, famously stated that "The underlying physical laws necessary for the mathematical treatment of a large part of physics and the whole of chemistry are thus completely known and the difficulty is only that the exact application of these laws leads to equations that are much too complicated to be soluble." However, at the time Dirac could not foresee the spectacular emergence of the electronic computer in the second half of the twentieth century, which made a direct numerical attack on the electronic many-body problem of chemistry and materials science possible. As a result of Moore's law, first-principles simulations of chemical systems and processes have become commonplace and are today performed more often by non-specialists than by specialists, typically in support of experimental activities and measurements. Indeed, a quick perusal of the most general journal of chemistry, the Journal of the American Chemical Society, reveals the ubiquity of computation in modern chemistry: about 40% of all articles in this journal are today supported by computation, mostly quantum mechanical. At the Department of Chemistry, University of Oslo, about one third of the scientific staff have authored scientific papers supported by computation. This is an amazing development for a science that only a few years ago was considered to be archetypically experimental and empirical in nature. Nowadays, computation is an integral part of chemistry and materials science and is widely perceived as the "third way": simulations not only play an important role in the interpretation and prediction of experimental observations, they are more and more often viewed as an alternative to experimental measurements.
   Modern computational chemistry and materials science owe their importance not just to Moore's law and the emergence of powerful computers but also to the development of flexible computational electronic-structure models and efficient numerical algorithms, followed by their implementation in general-purpose software packages, widely distributed and easily accessible to the broad community of chemists and materials scientists. The sustained further development of such codes, of which the Dalton quantum-chemistry code (developed in Norway and the other Scandinavian countries) is a good example, is essential for the future of computational chemistry and materials science, for two reasons: first, to take advantage of new developments and advances in computer technology such as massively parallel computing; second, to enable simulations of a broader range of physical observations and phenomena as well as simulations on more realistic systems. For example, whereas quantum chemistry twenty years ago was typically applied to systems containing a handful of atoms, it is nowadays often applied to hundreds of atoms. Such developments have become possible because of new electronic-structure models (e.g., accurate density-functional methods), new numerical techniques (e.g., linear-scaling techniques) and new computer hardware. Recent developments are towards systems containing thousands of atoms, employing hybrid methods that describe parts of the system at a high, first-principles level of theory, while the surroundings are described at a lower, semi-empirical or empirical level, as in the popular QM/MM (quantum mechanics/molecular mechanics) approach. Such multi-scale simulations, which employ a hierarchy of methods spanning a broad range of spatial and temporal domains, are essential for important problems such as defects in solids and enzymatic reactions.
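   To make the hybrid idea slightly more concrete, the total energy in an additive QM/MM scheme is commonly partitioned along the following lines (a generic textbook form, not a description of the Dalton code or of any particular implementation):

    E_{\mathrm{QM/MM}} = E_{\mathrm{QM}}(\text{active region}) + E_{\mathrm{MM}}(\text{environment}) + E_{\mathrm{QM\text{-}MM}}(\text{coupling}),

where the coupling term collects the electrostatic, van der Waals and boundary interactions between the quantum-mechanically treated region and its classically described surroundings.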
   Interestingly, whereas chemical simulations are being performed on ever larger systems over ever longer time spans, experiments are being performed with higher and higher spatial and temporal resolution—for example, femtosecond laser pulses are now used to probe directly the reaction mechanisms of simple molecular systems. Computationally, the challenge for such studies is to provide simulations of sufficient accuracy to match that of the experiment. The present status is that computational chemistry has reached the level of "chemical accuracy", meaning that, for small and medium-sized molecular systems, its accuracy is comparable with or even surpasses that of measured reaction enthalpies and atomization energies (about 1 kcal/mol). However, computational chemistry cannot yet routinely deliver the accuracy needed in many studies of spectroscopic processes. Also, even though reaction enthalpies and equilibrium constants can be calculated to sufficient accuracy, reaction barriers and reaction rates are more demanding, requiring a further refinement of computational models—in particular, a more reliable description of the correlated motion of electrons in molecules and solids. Indeed, apart from the development towards larger systems, the development towards higher accuracy constitutes the grand challenge of computational chemistry and materials science over the coming decade.
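   As a point of reference for the accuracy target mentioned above, the short Python snippet below simply expresses the 1 kcal/mol threshold of "chemical accuracy" in other energy units commonly used in electronic-structure work; the conversion factors are standard values and the snippet is purely illustrative.

    # "Chemical accuracy" (about 1 kcal/mol) expressed in other common energy units.
    KJ_PER_KCAL = 4.184              # thermochemical calorie, exact by definition
    KCAL_PER_MOL_PER_HARTREE = 627.5 # 1 hartree is roughly 627.5 kcal/mol
    KCAL_PER_MOL_PER_EV = 23.06      # 1 eV is roughly 23.06 kcal/mol

    chemical_accuracy = 1.0                              # kcal/mol
    print(chemical_accuracy * KJ_PER_KCAL)               # about 4.2 kJ/mol
    print(chemical_accuracy / KCAL_PER_MOL_PER_HARTREE)  # about 1.6 millihartree
    print(chemical_accuracy / KCAL_PER_MOL_PER_EV)       # about 0.043 eV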
   Turning our attention from chemistry and materials science towards the newer field of nanoscience, the situation with respect to computation and simulations becomes more complex. In nanoscience, the objects of study are structures up to about 100 nm in at least one dimension, whose building blocks are nanotubes, quantum dots, clusters and nanoparticles. Such nanostructures have been experimentally studied and manipulated since the 1980s, following the development of a number of new experimental techniques such as scanning tunneling and atomic force microscopies. New and surprising phenomena have been discovered that cannot be predicted in a simple manner from knowledge of the physical laws that operate on the atomic scale. Importantly, some of these discoveries were first made in simulations, such as the prediction by Uzi Landman and coworkers in 1990 of the growth of a nanowire of gold atoms when a nickel tip is withdrawn from a gold sheet. In other cases, simulations have been essential in unraveling the physics of important new phenomena such as giant magnetoresistance (GMR), which within a decade of its discovery in 1988 was used in commercial hard disks. In the future, simulations in nanoscience will continue to play an important role in unraveling the secrets of nanostructures. Indeed, the report from a 2002 US Department of Energy workshop on “Theory and Modeling in Nanoscience” concludes that “the country’s investment in the national nanoscience initiative will pay greater scientific dividends if it is accelerated by a new investment in theory, modeling and simulation in nanoscience”, warning that the “absence of quantitative models that describe newly observed phenomena increasingly limits progress in the field”. In this field, the fundamental challenges and opportunities for simulations lie in the broad areas of nano building blocks, complex nanostructures and nano-interfaces, and the assembly and growth of such structures. Specific areas of interest include transport mechanisms, optical properties of nanostructures and spintronics. Apart from the electronic-structure methods discussed above, important ingredients of simulations in nanoscience are methods for classical and non-classical molecular dynamics as well as Monte Carlo methods. An important special requirement for nano-sized systems is the need to deal with widely different length and time scales and to treat simultaneously materials and molecules that have traditionally been treated by different methods and techniques.
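   As an illustration of the classical molecular-dynamics ingredient mentioned above, the sketch below shows the core of such a simulation in a few lines of Python: Lennard-Jones forces integrated with the velocity-Verlet algorithm. It is a minimal teaching example in reduced units for a handful of atoms, not a production code, and all parameter values are chosen purely for illustration.

    import numpy as np

    def lj_forces(pos, eps=1.0, sigma=1.0):
        """Lennard-Jones forces and potential energy for a small cluster (reduced units)."""
        n = len(pos)
        forces = np.zeros_like(pos)
        energy = 0.0
        for i in range(n):
            for j in range(i + 1, n):
                rij = pos[i] - pos[j]
                r2 = np.dot(rij, rij)
                sr6 = (sigma**2 / r2) ** 3
                energy += 4.0 * eps * (sr6**2 - sr6)
                # Force magnitude expressed through r2 to avoid a square root.
                fscal = 24.0 * eps * (2.0 * sr6**2 - sr6) / r2
                forces[i] += fscal * rij
                forces[j] -= fscal * rij
        return forces, energy

    def velocity_verlet(pos, vel, mass=1.0, dt=0.002, steps=2000):
        """Integrate Newton's equations of motion with the velocity-Verlet algorithm."""
        forces, _ = lj_forces(pos)
        for _ in range(steps):
            vel += 0.5 * dt * forces / mass   # half kick
            pos += dt * vel                   # drift
            forces, _ = lj_forces(pos)        # new forces at updated positions
            vel += 0.5 * dt * forces / mass   # second half kick
        return pos, vel

    # A tiny illustrative system: four atoms near a tetrahedral arrangement.
    rng = np.random.default_rng(0)
    positions = rng.normal(scale=0.05, size=(4, 3)) + np.array(
        [[0, 0, 0], [1.1, 0, 0], [0, 1.1, 0], [0, 0, 1.1]], dtype=float)
    velocities = np.zeros_like(positions)
    positions, velocities = velocity_verlet(positions, velocities)

Production molecular-dynamics codes follow the same basic scheme but add neighbour lists, thermostats and periodic boundary conditions, and are heavily parallelized.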

Requirements for infrastructure
New developments in computer technology will make it possible to perform simulations in ten years' time that are unthinkable today, provided we are able to utilize the combined power of massively parallel computers in an efficient manner. For example, as demonstrated by recent benchmarks on Argonne's Blue Gene/P with 294 192 PowerPC 450 850 MHz processors (designed to run continuously at 1 PFLOPS), the Dalton code scales well to over 20,000 processing cores. However, to take full advantage of tomorrow's technology, existing codes must be upgraded and rewritten.
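   The parallel pattern behind such scaling can be sketched generically. The fragment below is not taken from Dalton or any other production code; assuming the mpi4py bindings are available, it merely shows how independent work items (say, batches of integrals or grid points) can be distributed round-robin over MPI processes and the partial results combined at the end.

    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()   # index of this process
    size = comm.Get_size()   # total number of processes

    # Hypothetical list of independent work items (e.g. batches of integrals).
    n_items = 10_000
    my_items = range(rank, n_items, size)   # simple round-robin distribution

    def evaluate(item):
        """Placeholder for an expensive, independent task."""
        return np.sin(item) ** 2

    # Each process accumulates its own partial sum ...
    local_sum = sum(evaluate(i) for i in my_items)

    # ... and the partial results are combined across all processes.
    total = comm.allreduce(local_sum, op=MPI.SUM)

    if rank == 0:
        print(f"total over {size} processes: {total:.6f}")

Such a script would be launched with, for example, "mpirun -np 1024 python script.py"; real codes apply the same pattern at a much finer granularity and combine it with shared-memory parallelism within each node.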
   From the perspective of computational chemistry and materials science, we discern both encouraging and discouraging trends in emerging computer technology. On the positive side, we note that modern computers incorporate powerful multi-core chips; graphics processing units (GPUs) that can be harnessed for number crunching, speeding up central computational tasks by an order of magnitude or more relative to central processing units (CPUs); and interconnects with improved bandwidths. On the negative side, chips are not getting much faster beyond 3 GHz, multi-core chips are hard to program effectively, and communication between CPUs and GPUs is slow.
   Quantum mechanical simulations are essentially number crunching, with typically low requirements on input and output data storage, although some applications require large (and even gain from vast) intermediate scratch storage, removed upon completion of the run. Memory requirements vary; most simulations can be carried out using 1 or 2 GB of memory, while others may require an order of magnitude more. Some methods and algorithmic developments benefit from, or even depend on, fast interconnects with large bandwidths and shared-memory architectures; the need for such architectures will therefore likely increase. On the whole, however, quantum mechanical simulations are flexible in that they can be adapted to a wide variety of computers, as reflected by the fact that the CoE Centre for Theoretical and Computational Chemistry (CTCC) during 2009 utilized a variety of platforms in its production calculations: 18,000,000 CPU hours on Stallo, 1,500,000 CPU hours on Titan, and 250,000 CPU hours on Hexagon and Njord. Currently, therefore, the CTCC (which comprises nine senior scientists) uses computing power equivalent to about 2000 processors for molecular simulations, clearly illustrating the enormous need for number-crunching capabilities in chemistry and materials science.
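   As a back-of-the-envelope check on the last figure (an illustration only, using the 2009 numbers quoted above):

    # 2009 CPU-hour usage quoted above, summed and converted to a processor-equivalent.
    stallo, titan, hexagon_njord = 18_000_000, 1_500_000, 250_000
    total_cpu_hours = stallo + titan + hexagon_njord   # 19,750,000 CPU hours

    hours_per_year = 365 * 24                          # 8760 hours
    processor_equivalent = total_cpu_hours / hours_per_year
    print(round(processor_equivalent))                 # roughly 2255, i.e. of the order of 2000 processors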
   A critical factor for all simulations in chemistry and materials science is high system stability. Many production applications require a week or more of computing time, even when fully parallelized on clusters. Typically, such calculations cannot be restarted without considerable loss of computed data and computing time. Unstable systems therefore quickly become unattractive as a tool in quantum chemistry.
   Maintenance and optimization of production codes on specific platforms is important work that cannot easily be undertaken within small research groups, even by those heavily involved in methods development. Such work is best performed by support groups within the national supercomputing framework.

Expectations from 2015
We do not foresee large changes in the way our simulations are carried out over the next five years. Most calculations will be performed on loosely connected computers; fast interconnect solutions will be utilized as they become available. Computational codes are constantly being improved and faster algorithms introduced—for example, the current experimental version of Dalton calculates electronic energies and molecular forces one to two orders of magnitude faster than the currently released version. However, such improvements never reduce the need for computing power; the increased efficiency is instead used to improve the quality of the simulations—by improving the overall accuracy or by increasing system sizes and simulation times. Therefore, even with vastly improved codes and computers, production runs will typically stretch over one or two weeks, using 500 to 1000 processors—only in this manner will it be possible to stay relevant by carrying out research at the cutting edge of chemistry and materials science.

This article is reproduced with permission of the Norwegian Research Council from the report “The scientific case for eInfrastructure in Norway” (ISBN 978-82-12-02832-6).