A world war has recently ended, and the computational machines lie dormant, waiting to be reinvigorated; but the scientific battle hasn't really come to a standstill.
Sitting in the Los Alamos lab were typical war veterans who had experience with the most modern tool the world had recently created: the computer. After the end of the war, physicists started to devote their time and attention to more arcane problems of physics and mathematics. The first among these involved the area of combinatorics. But how exactly did this calculation of permutations help to create a new field of research?
The journey begins with the paper on the Monte Carlo method by S. Ulam and N. Metropolis. Two magnates of hydrogen-bomb fame, they worked out a simple solution to the many-body problem using computers, or rather "fast computing machines". Entitled "The Monte Carlo Method", the paper gives a brief history of the previous developments in the field. Indeed, what better way to start a journey than by looking at the past? The paper lays down the foundation of a method to calculate the different configurations of many-body systems. A small step for maths, a giant leap for molecules.
Often referred to simply as "Metropolis et al.", the first ever molecular simulation was performed in the paper entitled "Equation of State Calculations by Fast Computing Machines". The paper boasts a number of influential names of hydrogen-bomb fame.
Nicholas Metropolis (senior physicist, Los Alamos group), Edward Teller (father of the hydrogen bomb), his wife Augusta Teller, the more modest Marshall Rosenbluth and finally, his wife Arianna Rosenbluth. At first, it seems a bit odd for a group of renowned physicists to have arrived at this method. However, in the post-war era they had easy access to some of the best computing machines of the time; add to this their awareness of the problem of solving higher-order integrals, especially the more cumbersome many-body problems, and it is only natural that they set themselves to work out a solution.
What sets this paper apart is the "importance sampling" technique that was used for generating configurations. A technique so revolutionary that it is recognized as one of the top 10 algorithms of the 20th century.
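To make the idea concrete, here is a minimal sketch of Metropolis-style importance sampling on a toy one-dimensional harmonic well of my own choosing (not the paper's hard-sphere system): trial moves are accepted with probability min(1, exp(-ΔU/kT)), so configurations are visited with their Boltzmann weight.

```python
import math
import random

# Toy example (my own, not from the 1953 paper): sample x from the
# Boltzmann weight exp(-U(x)/kT) for a harmonic well U(x) = x^2 / 2.
random.seed(42)

def metropolis_samples(n_steps, max_move=1.0, kT=1.0):
    x, samples = 0.0, []
    for _ in range(n_steps):
        x_new = x + random.uniform(-max_move, max_move)  # random trial move
        dU = 0.5 * x_new**2 - 0.5 * x**2                 # energy change
        # Metropolis criterion: always accept downhill moves,
        # accept uphill moves with probability exp(-dU/kT)
        if dU <= 0 or random.random() < math.exp(-dU / kT):
            x = x_new
        samples.append(x)                                # count rejections too
    return samples

s = metropolis_samples(200_000)
# equipartition predicts <x^2> = kT = 1 for this potential
mean_x2 = sum(v * v for v in s) / len(s)
```

Note that rejected moves still contribute the old configuration to the average; dropping them is a classic bug that biases the sampling.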
In the 1953 paper, the authors had acknowledged Alder and co-workers for having developed a similar method independently. Some have imputed a disagreement between Alder and his boss that stopped them from publishing the idea. However, Alder did come into the picture with his paper entitled 'Radial Distribution Function Calculated by the Monte-Carlo Method for a Hard Sphere Fluid'. Co-authored by Frankel and Lewinson, this paper is a rudimentary guide to using Monte Carlo simulations for providing molecular insights; albeit commonplace today, at the time it marked a big step for computational microscopy. The radial distribution function helps provide insight into the interactions between the molecules of the system and is analogous to an X-ray scan of the human body: it's all about the minute separations. An honorable mention of the IBM 604 computer is made, one of the most powerful computing machines of the time.
Finally, in 1957, Alder received his due fame. The first molecular dynamics simulation is attributed to Alder and Wainwright (Alder had been a student under Kirkwood). The paper, aptly entitled 'Phase Transition for a Hard Sphere System', meticulously describes how they solved the classical equations of motion on the IBM 704 computer. Published in the Journal of Chemical Physics, the authors try to observe 'phase transition' behavior using the molecular dynamics approach, something you wouldn't expect in the very first paper. Periodic boundary conditions and the radial distribution function are present, as in the earlier classics. However, there are no thermostats or barostats as of now; even the Lennard-Jones potentials are missing from the picture. Consequently, the innocuousness of the paper is something that stands out. If you give it a read, you will hardly recognize it as the very first paper on a subject that would change the course of future research methodology.
A second paper by the same duo, Alder and Wainwright, this one loses the innocence of the first and is audaciously titled 'Studies in Molecular Dynamics. I. General Method'. The title is again apt, and the paper goes into good depth on the workings, advantages and limitations of the approach. The authors are quick to enumerate the most important advantage of molecular dynamics, i.e. the possibility of calculating non-equilibrium (say, transport) properties, an inherent limitation of the Monte Carlo method. The paper is a must-read for tyros of the field, as it diligently describes the minute details of the procedure. Not only does it outline the constraints on the method's applicability, it also delineates possible solutions to some of the problems.
'Dynamics of Radiation Damage' by Gibson, Goland, Milgram and Vineyard. Remember, it is the 1960s and the world is in a phase called the 'Cold War'. The topic seems apt, suiting the needs of the time. This is often considered the first ever paper that used a continuous potential model. Alder and Wainwright had been using the hard-sphere model, which is more like a binary '1' or '0', as was the case with Metropolis et al.
The authors came up with a continuous potential of the exponential form: Born-Mayer potentials (the repulsive part) coupled with an inward force to compensate for the attractive interactions.
An interesting aspect of the paper is the reason quoted by the authors for this numerical approach: "It has seemed to us that analytical methods are inadequate and that numerical treatment with the aid of a high-speed computing machine is required". A presage of the upcoming era of computation? Once again, the IBM 704 is used, and the paper states that one time step took about a minute. Not bad indeed, considering there were 500 atoms.
Molecular simulations give the coordinates (and momenta) of the particles in the system. However, it is left to the sagacity of the scientist to make sense of the changes in phase space. Boltzmann and Maxwell did their best to lay down the base of this approach, but that wasn't the end of statistical mechanics. More equations needed to be derived to address physically relevant properties of substances and mixtures alike, not just the Helmholtz energy.
In the process, Widom made a remarkable contribution; in fact, the method described is now so ubiquitous that its significance is lost in the ensuing banality. Modestly titled "Some Topics in the Theory of Fluids", the paper relates the radial distribution function to potential-distribution theory, which allows a number of properties to be calculated directly. Remember, the properties can be calculated from the partition function; his particle insertion method, however, calculates the same by perturbing the system a bit, via the insertion of a single ghost particle. Thus, the method enables the calculation of single-component properties, most importantly the chemical potential in inhomogeneous systems. In the years to come, this paper would help establish molecular simulations firmly in industrial solvent prediction.
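A hedged sketch of the test-particle idea, on an illustrative soft-sphere toy fluid (the potential, box size and "configurations" below are my own stand-ins, not Widom's system): the excess chemical potential is μ_ex = −kT ln⟨exp(−ΔU/kT)⟩, where ΔU is the energy of inserting one ghost particle into stored configurations.

```python
import math
import random

# Illustrative parameters (assumptions, not from the paper)
random.seed(0)
L, kT = 8.0, 1.0                       # cubic box edge, temperature

def pair_energy(r2):
    return math.exp(-r2)               # toy soft repulsive interaction

def insertion_energy(ghost, config):
    """Energy change of adding one ghost particle to a configuration."""
    dU = 0.0
    for p in config:
        # minimum-image distance in the periodic box
        r2 = sum(min(abs(a - b), L - abs(a - b)) ** 2 for a, b in zip(ghost, p))
        dU += pair_energy(r2)
    return dU

# stand-ins for equilibrated configurations (random here, for brevity)
configs = [[[random.uniform(0, L) for _ in range(3)] for _ in range(20)]
           for _ in range(200)]

boltz = []
for config in configs:
    ghost = [random.uniform(0, L) for _ in range(3)]   # trial insertion point
    boltz.append(math.exp(-insertion_energy(ghost, config) / kT))

mu_ex = -kT * math.log(sum(boltz) / len(boltz))        # excess chemical potential
```

The ghost particle never perturbs the stored trajectory; it only probes it, which is why the method can be bolted onto an existing simulation after the fact. For this purely repulsive toy potential every insertion costs energy, so μ_ex comes out positive.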
Aneesur Rahman, the 'father of Molecular Dynamics', showed the world how molecular dynamics simulations are performed with artistic perfection. Published in Physical Review, 1964, and entitled 'Correlations in the Motion of Atoms in Liquid Argon', this paper is a gem.
From the Lennard-Jones potential to radial distribution functions, velocity autocorrelation functions and mean square displacements, it has all the essential ingredients of a complete MD recipe. A captivating aspect of the paper is the importance given to the correlation functions and their relation to the states of the molecules; it additionally explains the rapid decay of the radial distribution function in the Vineyard approximation and how to improve it by adding a delay.
The paper is evergreen; if you give it a read, you won't feel it's a fifty-year-old publication. The calculations were performed at the prestigious Argonne National Laboratory in Illinois, and once again, the computing machine used was among the fastest in the business. With 864 atoms of argon and a CDC 3600 computer, Rahman revealed to the world the true power of computers.
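For a flavor of the analysis, here is a hedged sketch of how a velocity autocorrelation function is computed from a stored trajectory, C(t) = ⟨v(t₀)·v(t₀+t)⟩ averaged over time origins t₀; the data below are a synthetic damped oscillation, not Rahman's argon run.

```python
import math

def vacf(velocities, max_lag):
    """Velocity autocorrelation, averaged over all available time origins."""
    out = []
    n = len(velocities)
    for lag in range(max_lag):
        acc = sum(sum(a * b for a, b in zip(velocities[t0], velocities[t0 + lag]))
                  for t0 in range(n - lag))
        out.append(acc / (n - lag))
    return out

# synthetic 1-D "velocities": a damped oscillation, qualitatively like the
# back-scattering dip in the 1964 liquid-argon curves
vs = [[math.cos(0.2 * t) * math.exp(-0.01 * t)] for t in range(500)]
c = vacf(vs, 50)
```

C(0) is just the mean-square velocity, so it bounds the rest of the curve; in a real run the Fourier transform of this curve gives the spectrum of atomic motions.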
Loup Verlet, a French physicist, came up with the ubiquitous Verlet algorithm in his paper 'Computer "Experiments" on Classical Fluids. I. Thermodynamical Properties of Lennard-Jones Molecules'. This one is an important landmark in molecular dynamics simulation. Not only did he introduce the Verlet algorithm for integrating Newton's equations more effectively, but he also brought the concept of neighbor lists into the picture. Together they reduced the computational time drastically. The paper builds upon Rahman's 864 argon atoms and explains how to reduce the computational time.
'Refinement of protein conformations using a macromolecular energy minimization procedure' by Michael Levitt and Shneior Lifson. The paper focuses on getting the most stable structure of a protein; the data are derived from experimental studies, mainly crystallography. However, the researchers aim to "refine" the structure, to make it more stable. Steepest-descent gradient minimization is used (an idea concocted by Lifson in an earlier '66 paper), with different force fields, first ignoring and then taking into consideration the non-bonded and hydrogen-bond interactions.
Almost five decades later, if you give the paper a read, you are bound to lose yourself so much in its fluency that you will end up looking for the simulation methodology section that should be there at the end. The paper took the field right up to the threshold of protein simulations; with everything in place, the scientists just had to apply Newton's second law to perform the very first protein simulations. And apply it they did!
It's the '70s now; two decades have passed since the first molecular simulation ran on machines meant to advance the field of encryption and further develop bombs. The new decade demands change, and the change comes in the shape of the simulation of real molecules.
Up until now, all the simulations were carried out using the hard-sphere model or Lennard-Jones interactions for "simple fluid systems". Meanwhile, a professor-student duo decided to move on to the next step, i.e. the simulation of molecules. Harp and Dr. Berne published their work in the paper entitled "Time-Correlation Functions, Memory Functions, and Molecular Dynamics". Having produced a master treatise on time correlation functions earlier, Berne continues to expand his study of correlation functions in this paper. However, what stands out is the use of a diatomic carbon monoxide molecule to carry out the simulations. The first of its kind; surely not the last.
In a biography, Dr. Berne notes that Frank Stillinger had initially come to him with the proposal of water simulations; however, due to the extreme computational time involved, Berne rejected the proposal! So, what happened next? Keep reading to find out :)
The leading paper in the Rahman-Stillinger series (Aneesur Rahman and Frank Stillinger), this one is probably the first ever "molecular dynamics study of liquid water". As the authors aptly note, water has not 'enjoyed the attention of a rapidly developing body of statistical mechanical theory'. The earlier simulations were mostly of noble gases that could be easily parameterized by van der Waals interactions. Water, however, posed a major problem because of its notorious hydrogen bonds. The problem is evident in the paper. Lennard-Jones does come into play, with a slight modification attributed to the erudite Ben-Naim; however, the authors feel by the end of the paper that a lot of work remains to be done to improve the potential terms. This twenty-four-page treatise would have been enough for any trivial compound, but it was just a start for our duo. They developed a penchant for studying water systems and went on to publish close to a dozen papers involving water simulations, meticulously trying to improve the results obtained and provide more insight into this ubiquitous compound.
'Isothermal molecular dynamics calculations for liquid salts' by L. V. Woodcock. Printed in the journal Chemical Physics Letters, this one provides a pleasant understanding of the common techniques that were being used in the 1970s. It explains how the Verlet algorithms are implemented to carry the simulation forward and, most importantly, comments on controlling the temperature of the system. This was the phase when people moved from the trivial NVE ensemble to the canonical (NVT) ensemble, which required the temperature to be fixed.
Remember, as of yet there are no Nose-Hoover algorithms to control the temperature. A simple scaling of the velocities is all that is possible, and it is aesthetically done in the paper: comparison of the internal energies obtained with experimental cohesive energies, calculation of a number of thermodynamic properties, and finally a demonstration that the NVT code works better than the MC code. Overall, it is a nice read to understand the basics of how things happen in an MD simulation.
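The velocity-scaling idea is simple enough to sketch (the details below are illustrative, in reduced units, not Woodcock's exact code): rescale every velocity by √(T_target/T_instant), so that the kinetic temperature, defined by (3N/2)k_B T = Σ ½mv², hits the target exactly.

```python
import math
import random

random.seed(1)

def rescale_velocities(velocities, T_target, kB=1.0, m=1.0):
    """Crude isokinetic thermostat: scale all velocities to hit T_target."""
    n = len(velocities)
    ke = 0.5 * m * sum(vx*vx + vy*vy + vz*vz for vx, vy, vz in velocities)
    T_inst = 2.0 * ke / (3.0 * n * kB)      # instantaneous kinetic temperature
    s = math.sqrt(T_target / T_inst)        # single global scaling factor
    return [(s*vx, s*vy, s*vz) for vx, vy, vz in velocities]

# 100 particles with arbitrary Maxwellian-ish velocities, forced to T = 2
vels = [(random.gauss(0, 1), random.gauss(0, 1), random.gauss(0, 1))
        for _ in range(100)]
vels = rescale_velocities(vels, T_target=2.0)
```

Unlike the later Nose-Hoover schemes, this brute-force rescaling does not sample a true canonical ensemble; it merely pins the kinetic energy, which was exactly the pragmatic spirit of the time.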
A ubiquitous part of computational simulations is the evaluation of electrostatic interactions, especially when periodic boundary conditions are implemented. It's hard enough to calculate the electrostatic interaction using Coulomb's law for three particles; for infinite particles, it's intractable. Up till now, people used to take a cutoff radius, but that method had some serious issues related to polarization of the cutoff sphere. In 1973, Barker and Watts came up with an elegant solution: reaction field theory. A mere five pages long and entitled "Monte Carlo studies of the dielectric properties of water-like models", the paper justifies the use of the depolarizing potential with more accurate results compared to a direct cutoff.
'Computer simulation of protein folding' by Michael Levitt and Arieh Warshel. This was the first ever molecular simulation of a protein, the first ever use of Langevin dynamics (molecular dynamics coupled with a stochastic part representing the solvent, which saves time and computational power), and the first ever contact maps in a publication; a number of today's 'trivial tricks of the trade' owe their origin to this paper, including the coarse-grained approach.
And most importantly, this was the "point of no return" for
molecular simulations. Levitt and Warshel went on to produce a number of gems in the field of
computational biology, establishing themselves amongst the pioneers of the field.
They were accompanied by Martin Karplus along the way, and the three went on to win the Nobel Prize in Chemistry in 2013.
Back in the late '60s, Dr. Levitt had produced a paper on the "energy minimization" of protein conformations, along with Lifson, that presaged this inevitable step in protein simulations. The picture shows how the protein folds as the molecular simulation is performed. Quite remarkable? Indeed!
Someone knocks at the door of a house named Computational Simulation:
A voice from inside: “Who is it?"
Stranger: “Martin Karplus".
The 1975 classic on protein simulation lacked just one thing: Martin Karplus. Entitled 'Dynamics of folded proteins' and published in the esteemed journal Nature, the paper demonstrates that the molecular dynamics approach can give insights that experiments are unable to provide. Having already established a reputation among his colleagues as a very young PhD (at 23), and having an equation in NMR spectroscopy named after him, this was yet another feat for Karplus.
Along with McCammon and Gelin, Karplus comments that proteins have a 'fluid-like' behavior at the atomic scale, meaning that they possess a diffusional character. Up till then, proteins were thought of as rigid bodies with temporal flexibility to perform different functions. This highlights the strength and potential of the molecular dynamics approach: it is tremendously hard to get such microscopic details of a process using experiments, even with the precision and accuracy of present-day equipment. Consequently, molecular simulation is decorously called 'a computational microscope'.
Time is a quintessential constraint in any molecular simulation technique. If only we had enough time (and, obviously, the power), we could simulate anything. As a result, it was natural to put a lot of effort into obtaining better algorithms that would allow a larger time step and help perform longer simulations. The first breakthrough in that direction came through the "SHAKE" algorithm. The time step of a simulation must be smaller than the period of the fastest motion. What if we ignore that fastest motion and make it rigid? That's what Ryckaert, Ciccotti and Berendsen answer in the paper entitled 'Numerical Integration of the Cartesian Equations of Motion of a System with Constraints..'. The algorithm keeps the desired degrees of freedom fixed and, to overcome fluctuation error, keeps them within a tolerance. Something which might seem obvious but, if ignored, can pollute the simulation altogether.
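A hedged sketch of the SHAKE idea, reduced to a single bond between two equal-mass atoms (a toy reduction, not the paper's n-alkane code): after an unconstrained step, the positions are corrected iteratively along the old bond direction until the bond length is back at its target within a tolerance.

```python
def shake_bond(r1, r2, r1_old, r2_old, d, tol=1e-8, max_iter=100):
    """Iteratively restore |r1 - r2| = d after an unconstrained move.

    Equal masses assumed; corrections on the two atoms are equal and
    opposite, so momentum is conserved.
    """
    for _ in range(max_iter):
        bond = [a - b for a, b in zip(r1, r2)]
        diff = d * d - sum(c * c for c in bond)      # constraint violation
        if abs(diff) < tol:
            break
        old_bond = [a - b for a, b in zip(r1_old, r2_old)]
        # linearized Lagrange-multiplier step along the pre-move bond vector
        g = diff / (4.0 * sum(b * o for b, o in zip(bond, old_bond)))
        r1 = [a + g * o for a, o in zip(r1, old_bond)]
        r2 = [a - g * o for a, o in zip(r2, old_bond)]
    return r1, r2

# the atoms drifted apart during the step; pull the bond back to d = 1
r1, r2 = shake_bond([0.0, 0.0, 0.0], [1.2, 0.0, 0.0],
                    [0.0, 0.0, 0.0], [1.0, 0.0, 0.0], d=1.0)
```

In the real algorithm the same loop cycles over every constrained bond until all of them are satisfied simultaneously, which is why the tolerance matters: each correction can slightly disturb a neighboring constraint.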
The classics are out in the journals, and molecular simulation is taking the world by storm. However, there are two limitations of this new tool: firstly, it uses a lot of computational power; secondly, the calculation requires a lot of time. Reducing the time required would reduce the computational power needed too; the physicists realize this, and at this moment comes another stroke of a marksman: the Brownian dynamics simulation.
The theory was old, attributed to Einstein and Robert Brown, but the approach was fresh. Entitled 'Brownian dynamics with hydrodynamic interactions', the authors, Ermak and Dr. McCammon, used the decades-old strategy to assuage the time constraint. Instead of specifying the solvent molecules explicitly, the researchers included them implicitly by adding a random force term to the equations. Further, the addition was also quintessential in modelling the random behavior of the molecules, rather than basing it on the more deterministic Newton's laws.
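A hedged sketch of a single free-draining Brownian dynamics step (the hydrodynamic coupling that is the real point of Ermak and McCammon's paper is omitted here for brevity, and all parameters are illustrative): the solvent enters only through the diffusion constant D and a Gaussian random displacement of variance 2DΔt per coordinate.

```python
import math
import random

random.seed(7)

def bd_step(r, force, D=1.0, kT=1.0, dt=1e-3):
    """One overdamped step: systematic drift plus a stochastic solvent kick."""
    return [x + (D / kT) * f * dt + random.gauss(0.0, math.sqrt(2 * D * dt))
            for x, f in zip(r, force)]

# sanity check on an ensemble of free particles (zero force): the
# mean-square displacement should grow diffusively, as 6 * D * t in 3-D
particles = [[0.0, 0.0, 0.0] for _ in range(400)]
for _ in range(100):
    particles = [bd_step(p, [0.0, 0.0, 0.0]) for p in particles]

msd = sum(x*x + y*y + z*z for x, y, z in particles) / len(particles)
# after t = 100 * dt = 0.1, expect msd near 6 * D * 0.1 = 0.6
```

Because inertia and explicit solvent are gone, the time step can be orders of magnitude larger than in full molecular dynamics, which was exactly the appeal.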
Molecular simulations were performed in the microcanonical ensemble, where N, V and E are kept constant. However, with the remarkable predictive ability of the method and its relevance to real properties, it became necessary to simulate real-life experiments that occur at sundry pressures and temperatures. The properties predicted by a simulation are reliable based on the ergodic hypothesis: that the ensemble average and the time average should be equal. Any ensemble that samples all of configurational space is bound to give the same results. The idea was simple; the implementation, however, was taking some time.
Finally, at the beginning of the '80s, Dr. Andersen came up with an answer in the paper "Molecular dynamics simulations at constant pressure and/or temperature". The paper begins by addressing the problem, then outlines how it is going to tackle it. Explanations are provided for all the different cases, with suitable examples and further explication. The author finally sums up the whole proceedings in the conclusion, and provides further insights into the limitations and advantages. Worthy of a perfect 6 in the GRE-AWA, this paper has the grip to keep a reader through every sentence. Though marked by a number of erudite equations and simple proofs, the author keeps the reader entertained by drawing subtle distinctions between the different ensembles; the mark of a genuine researcher? Indeed! The paper does not end before providing marvelous insights into the applications of these methods. Ah! How I wish to read it again.
The globalization of molecular simulations has already begun. People have realized the importance of not rewriting the same code again and again on different machines. Moreover, the advancement of computers is now making it possible to have the same computing machine at different places. As such, a common molecular dynamics code would really save the time of a number of young scientists in the field. Dr. Karplus might have had a similar view, for he released a powerful molecular dynamics package, CHARMM. Published in the Journal of Computational Chemistry and appositely titled "CHARMM: A Program for Macromolecular Energy, Minimization, and Dynamics Calculations", CHARMM was not just an MD package, but came with an inbuilt force field too. Furthermore, it could also "analyze the structural, equilibrium, and dynamic properties determined in these calculations".
The SHAKE algorithm given by Ryckaert et al. in 1977 was based on the Verlet integration algorithm. However, there is another common algorithm widely used in molecular dynamics, a modified version of the Verlet algorithm, a bit more computationally expensive but more effective: the "velocity Verlet" algorithm. In 1983, Andersen came up with the SHAKE equivalent for the velocity Verlet integrator, an important landmark for those using the VV algorithm. For reasons not disclosed in the paper, he named the algorithm "RATTLE". The paper goes to some lengths to prove why the Verlet and SHAKE algorithms are inferior to the velocity Verlet and RATTLE algorithms, respectively. To be honest, nothing is more annoying than when the RATTLE algorithm fails in your MD code.
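For contrast with plain Verlet, here is a minimal sketch of the velocity Verlet integrator itself (the RATTLE constraint step is omitted, and the toy harmonic oscillator is my own example, not Andersen's system): unlike the position form, positions and velocities are both available at every full step.

```python
def velocity_verlet(accel, x, v, dt, n_steps):
    """Velocity Verlet on a 1-D toy problem."""
    a = accel(x)
    for _ in range(n_steps):
        x = x + v * dt + 0.5 * a * dt * dt   # full position update
        a_new = accel(x)
        v = v + 0.5 * (a + a_new) * dt       # velocity from averaged accelerations
        a = a_new                            # reuse the force for the next step
    return x, v

# harmonic oscillator, a = -x, started at (x, v) = (1, 0):
# the total energy 0.5*v^2 + 0.5*x^2 should stay very close to 0.5
x, v = velocity_verlet(lambda x: -x, 1.0, 0.0, 0.01, 10_000)
energy = 0.5 * v * v + 0.5 * x * x
```

Having synchronized velocities is precisely what makes a velocity-level constraint like RATTLE (and kinetic-energy bookkeeping for thermostats) natural in this scheme.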
Monte Carlo and molecular dynamics simulations had started out of a need to solve, or rather calculate, the many-body problem. However, as the authors note, “the study of coexistence properties have been limited to single component systems”. Why? Well, because the number of simulations required for these calculations would increase “dramatically” with the number of components in the mixture. The authors are Athanassios Z. Panagiotopoulos, Ulrich Suter and Robert Reid; the paper is "Phase diagram of nonideal fluid mixtures from Monte Carlo simulation".
An inherent problem with mixtures is the difference in interaction energies, which makes it very difficult for Monte Carlo moves to be accepted. To this end, Dr. Panagiotopoulos suggested that a random transfer of a molecule can open up a space big enough to accommodate another molecule. Consequently, this leads to an increased probability of the move being accepted, and hence the simulations converge efficiently. Since phase equilibrium requires the chemical potentials to be equal, the authors resort to the test-particle approach for the same. The results verify the claim, quite aptly!
Following the success of the particle transfer move, Dr. Panagiotopoulos takes on the larger problem of simulating multi-component systems for phase coexistence behavior. He comes up with a novel solution where a single computer simulation is enough to give insights into two-phase interactions. The solution is elegant: use two different boxes or regions, with the particle transfer move making it rather easy to achieve acceptable Monte Carlo moves in such a situation. In the paper "Direct determination of phase coexistence properties of fluids by Monte Carlo simulation in a new ensemble", the author makes use of an imaginary surface to separate the two phases (this was done to avoid introducing any surface effects), while the biasing procedure is kept similar to the classic Monte Carlo method.
This paper marked a significant advancement in the field of the computational simulation of phases. The work, termed the Gibbs ensemble, was quintessential from an industrial perspective too: two- and three-phase simulations of multicomponent mixtures spawned a large number of computational studies aimed at finding more efficient industrial solvents. Even today, this field attracts many thermodynamics enthusiasts, and there are competitions like the Industrial Fluid Simulation Challenge to further improve the tool.
If molecular simulations had to be summarized in a line it would be:
“You give me configurations, and I will give you properties”. It does not matter how you create these configurations; as long as they are physically viable, they are valid configurations in the phase space. In molecular dynamics, Newton's laws dictate these configurations. In contrast, the beauty of Monte Carlo simulations lies in their randomness regarding the configurations. In 1992, in the paper "Configurational bias Monte Carlo: a new sampling scheme for flexible chains", Dr. Siepmann and Daan Frenkel took advantage of this fact in a rather shrewd way and gave us the configurational bias algorithm. The idea is simple enough: select a molecule randomly, select a part of the molecule, again randomly, and then move this part to another available position, ahem... randomly. But it doesn't end there; next you need to decide whether to accept the move or not, and well, this too is done randomly. Indeed! What a great way to extend the field of Monte Carlo simulations.
Dr. Siepmann went on to make some major contributions in the field of phase equilibria. These include the "TraPPE" force field and an open-source Monte Carlo simulation code: Towhee.
Initially, the source code was termed MDScope, and it was published under the title "MDScope - a visual computing environment for structural biology". It was the start of an open-source software that would change the course of molecular dynamics drastically and help it become a part of every other university in the world. Klaus Schulten would soon become an unforgettable name in the field, and NAMD would go on to gather close to 10,000 citations, with major and remarkable achievements almost every other year. NAMD was soon to become the soul of molecular simulations, and Klaus Schulten its beating heart. Lisa Pollock provides a beautiful history on this page, and it would take me a course in NAMD to put it with such dignity as she does.
Molecular dynamics has come a long way now. Since the '57 gem of Alder and Wainwright, which marked its beginning as a highly dubious predictive tool, it has developed into a strong methodology in itself. It's the late 1990s and Microsoft (not Google) rules the world. MS Word and Excel files are slowly invading the lives of millions of individuals. Consequently, the invasion is bound to touch anything related to computing. Dr. Livestock and his student IJ Fraser set themselves the task of evaluating Newton's laws for a multi-body system using a simple Excel spreadsheet. Entitled "Molecular dynamics on an Excel Spreadsheet", the paper performs the calculations for a small system of a few Lennard-Jones particles and obtains some useful curves to explain the results. Additionally, the authors note that they developed the code as part of an undergraduate teaching/research project. The paper is a stark reminder of the changing times, illuminating how a simulation once performed on the famous "million dollar" Atlas computers can now be performed on a personal computer.
Ever since its beginnings in the mid-1900s, every decade has seen a prominent advancement in the field of computer simulations. This is now a new century; the situation demands a change that could give a new direction to the field. The change comes from the acumen of a Stanford professor, Dr. Vijay Pande.
Dr. Pande laid the foundation of what he named the "folding at home" project. But what is it exactly? Well, it's the century of computer revolutions, and personal computers are invading every individual home in the world. However, Dr. Pande realized that computers aren't used continuously throughout the day: a machine might remain dormant during the night, or maybe when you go on vacation? Even when you are using a computer, a typical user rarely consumes all the power the computer can provide. Consequently, he devised a software to harness the computational power of millions of personal computers in different homes around the world and use it to perform computational simulations. The idea was robust and the execution sublime. It was named Folding@home, as it was meant to solve the central impasse of biology, i.e. the protein folding problem. As of 2017, the Folding@home project has around a million users, and has made some momentous discoveries in the field.
The Gibbs ensemble introduced in '87 was remarkable at solving phase equilibria problems, but it had its own limitations. The simulation of the two phases was done simultaneously, which meant that the fluctuations in one phase directly affected the fluctuations in the other. Further, as pointed out by the authors, these fluctuations drastically affect the behavior near critical points, often causing phase reversals in the two regions. At Stuttgart, Germany, two pundits in the field of phase equilibria, Dr. Vrabec and Dr. Hasse, came up with a refined method to counter the difficulty. The duo proposed the Grand Equilibrium method, in which they avoided the interaction of the fluctuations by performing two different simulations. First, the chemical potential as a function of pressure is obtained by performing an NPT run for the liquid phase; this function is then used to equilibrate the chemical potential of the vapor phase and obtain consistency between the two phases. The method was pristine, which is underscored by the fact that the duo went on to win the Industrial Fluid Simulation Challenge in 2006!
D. E. Shaw, a computer scientist from Stanford, started the D. E. Shaw research group. He made a good fortune by applying "proprietary algorithms for securities trading". However, he didn't let his earnings (a whopping 2 billion dollars) rot in bank accounts. Instead, he invested in biochemistry research, specifically in the molecular dynamics of proteins. As part of the research project, he established the supercomputer "Anton", and later "Anton 2", both designed to perform massively parallel computer simulations, aka molecular dynamics simulations. Dr. Shaw's group has indeed performed some remarkable feats in the field, and is a highly revered name on the turf. As the company's webpage illustrates, the aim of the research group is to "design novel algorithms and machine architectures for high-speed molecular dynamics" and "to study the structural changes underlying biological phenomena that occur on time scales far in excess of those previously accessible to computational study". The massive investment made by the group is a presage of the continued progress of the field for years to come.
This was a memorable moment in the history of computational simulations. In a lab at UCLA, scientists had seen the unseen and predicted the unthinkable. The paper is entitled "Discovery of a Novel Binding Trench in HIV Integrase", and indeed it is a discovery in itself! Roughly speaking, integrases are enzymes that help a virus integrate into a host cell; consequently, integrase inhibitors are drugs that prevent viruses from integrating into the host system. To expand on the biology, inhibition occurs if the drug is able to attach itself to a trench or hole present in the structure of the virus. The scientists, Dr. Schames and co-workers, predicted the existence of such a trench in the HIV integrase; what it meant was that a cure was possible by binding a drug onto it. The fact that no such trench had been identified previously using X-ray crystallography and the like is what sets this paper apart. It was an all-atom molecular dynamics simulation that provided the necessary information for a viable drug. And the information was indeed helpful, as it spurred new experimental studies and ultimately resulted in the production of an effective drug, "raltegravir", the first of its kind.
It's 2006 and NAMD recently celebrated its decade of existence. A long way indeed; however, it still has a very long path to travel, and for this it has to keep proving its mettle along the journey. NAMD doesn't let us down. Entitled "Molecular dynamics simulation of the complete satellite tobacco mosaic virus", the work involved the study of a million atoms for over 50 nanoseconds. Though that number doesn't even come close to Avogadro's number of molecules, the aim isn't getting close to reality. Schulten and co-workers focus on getting detailed information about the stability of the virus and on how it breaks up inside the host. The authors note that this is the "first ever simulation of an entire life form", and that further questions about the stability and mechanisms of the virus can be answered with more invested research.
The DESRES group of D. E. Shaw had major funding from its owner, and it wanted to prove its dominance in the field of computational simulations. For this purpose, the Anton supercomputer was put to use, and it didn't let anyone down. Entitled "Atomic-Level Characterization of the Structural Dynamics of Proteins", the paper describes the dynamics of proteins on the scale of a millisecond, 1.013 ms to be precise. The common bovine pancreatic trypsin inhibitor protein is back again, and the fundamentals of its folding and conformational changes come under close scrutiny by a team of world-renowned scientists. This is the first time a simulation has reached the scale of a millisecond in length, and the paper doesn't shy away from boasting about it, again and again. However, the work is extraordinary, not just in time length, but in the depth of analysis made from the resulting trajectories. The authors note the "unobvious finding" that modern force fields are capable of realistically describing the structure and dynamics of proteins over even extended time scales.
And did you notice? It's the start of a new decade; not a bad start, I would say ;)
It's 2011, and just a year has passed since the first millisecond simulation; scientists are yet to get over that remarkable milestone, but Dr. Shaw isn't among the ones who rest. Anton is back, and this time it surpasses its own record. Published in the prestigious Science and entitled "How Fast-Folding Proteins Fold", the paper tackles one of the central problems of modern biology. Meanwhile, Anton sets another benchmark: 1.112 ms!
The central problem of modern biology is how different protein structures encode the functions performed by the proteins. An equally important question is how proteins actually fold. Experiments to determine the folding of a single protein are arduous, at times impossible. However, the paper provides an answer for the folding of twelve structurally diverse proteins, along with insights into how close the folding mechanisms are to each other.
Obviously, the story doesn't end in 2017. Stay tuned to learn more ;)
The author acknowledges that a number of important landmarks in the history of the field have been missed. The list is by no means exhaustive, but the author will make sure that regular updates are made.
Feel free to contact me if you need a PDF version of this page, or if you have any suggestions/queries related to the content :)