And why it matters for the Enterprise
On September 23, 1992, “Divider” marked the last nuclear test conducted to date by the United States. It was the 928th nuclear test at the Nevada Test Site and the 1,032nd conducted by the US overall. It was also one that led to a rather unexpected consequence [1].
As the US Department of Energy began scaling back underground nuclear tests in the mid-1980s, the idea of simulation-based design and analysis was born. Simulations let one avoid long and cost-intensive physical tests, and a nuclear explosion was not only extremely expensive [2] but soon prohibited altogether.
Over the next 25 years, several other important developments made it possible to take multi-scale simulations beyond the realm of research labs and into the “I need it to work right now!” domain of the enterprise.
Multi-scale modeling [3] is hard to deploy in practice. The field is interdisciplinary by nature, combining knowledge of materials science and chemistry with computer science (and, more recently, data science). Add the need to understand how nature behaves at multiple scales, and the complexity grows further still.
Different phenomena dominate at different scales. This is easy to see in the quantum-classical transition facing the current generation of electronic design automation (EDA) tools in the semiconductor industry: many tools built on a classical description break down when the feature sizes of logic devices drop below 10 nm [4].
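As a rough back-of-the-envelope illustration of why the 10 nm mark matters (my own sketch, not a calculation from [4]): the thermal de Broglie wavelength of a conduction electron at room temperature already lands in the single-nanometer range, so quantum effects such as confinement and tunneling can no longer be ignored at those feature sizes.

```python
# Back-of-the-envelope estimate: the thermal de Broglie wavelength of an electron,
# lambda = h / sqrt(2 * pi * m * k_B * T). Once device features shrink toward this
# length scale, purely classical device models start to break down.
import math

h = 6.626e-34      # Planck constant, J*s
k_B = 1.381e-23    # Boltzmann constant, J/K
m_e = 9.109e-31    # free-electron mass, kg
T = 300.0          # room temperature, K

for label, mass in [("free electron", m_e),
                    ("electron in Si, effective mass ~0.26 m_e", 0.26 * m_e)]:
    wavelength = h / math.sqrt(2 * math.pi * mass * k_B * T)
    print(f"{label}: ~{wavelength * 1e9:.1f} nm")

# Both estimates come out in the single-digit-nanometer range, i.e. comparable to
# sub-10 nm transistor features, which is where classical assumptions give way.
```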
Add to this the limitations of physics-based approaches such as Density Functional Theory, which must sacrifice accuracy to keep the computational burden manageable [5], and the applicability of materials modeling as a whole starts to look questionable.
That is why, even today, materials modeling is more of a “magical skill” practiced by a chosen few and passed along like a medieval craft than a reproducible and reliable tool for product design [6]. And that is why only a few examples of successful enterprise applications exist to date.
The first wave of materials modeling companies arrived in the 1990s. It followed the early success of personal computers and promised to bring a similar transition to chemicals and materials. One notable survivor of wave 1.0 is Biovia, the modern-day successor of Accelrys [7].
After a few years, however, came the realization that moving paper documents into the virtual domain with Microsoft Word is considerably less complex than doing the same for chemical processes. Computers did not have enough power to produce accurate results transferable from one problem to another, and the modeling often failed, leaving many disappointed in it at large [8].
Nevertheless, Accelrys, and later Schrodinger, established themselves as leaders, in particular in the pharmaceutical field, where problems could be efficiently reduced and parametrized with datasets manageable by the computers of the day.
During the last five years we have seen much growth in the areas of cloud computing, big data, and machine learning.
Less noticeably for the general public, other fields have advanced as well, including atomistic modeling techniques themselves (e.g., 15,000 publications per year for Density Functional Theory alone [9]) and, very recently, quantum computing [10].
All these recent advancements provide fertile ground for the next wave of materials modeling, or, as we call it, “Materials Modeling 2.0”.
Just like “Web 2.0” (see the Wikipedia definition below), the next wave of materials modeling takes advantage of cloud computing, big data, and machine learning to improve upon the results of the first generation.
Web 2.0: websites that emphasize user-generated content, usability, and interoperability for end users. (Wikipedia)
In other words, Materials Modeling 2.0 is built on cloud computing, big data, and machine learning. It is also more accurate, thanks to the combination of improved physics-based models and machine intelligence with direct input from experiment.
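To make the “physics plus machine intelligence, anchored to experiment” idea concrete, here is a minimal, purely illustrative sketch of one common pattern, often called delta learning: a machine-learning model is trained on the discrepancy between a physics-based (e.g., DFT-computed) property and its experimentally measured value, and the learned correction is then applied to new physics-based results. The descriptors and numbers below are hypothetical placeholders, not data from any specific product or paper.

```python
# Illustrative "delta learning" sketch: correct physics-based (e.g., DFT) predictions
# with an ML model trained against experimental reference data.
# All descriptors and values are hypothetical placeholders.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical training set: simple descriptors for a handful of materials,
# the property as computed by the physics-based model, and the measured value.
descriptors = np.array([
    [1.2, 0.8],   # e.g., [mean electronegativity, mean atomic radius]
    [0.9, 1.1],
    [1.5, 0.7],
    [1.1, 0.9],
])
computed = np.array([0.6, 1.4, 0.3, 1.0])   # physics-based prediction (e.g., band gap, eV)
measured = np.array([1.1, 1.9, 0.9, 1.5])   # experimental reference values (eV)

# Train the model on the *error* of the physics-based method, using both the
# descriptors and the computed value itself as input features.
features = np.column_stack([descriptors, computed])
correction_model = RandomForestRegressor(n_estimators=100, random_state=0)
correction_model.fit(features, measured - computed)

# For a new material: run the physics-based calculation first, then add the
# learned correction to pull the result toward experiment.
new_descriptors = np.array([[1.3, 0.85]])
new_computed = np.array([0.7])
corrected = new_computed + correction_model.predict(
    np.column_stack([new_descriptors, new_computed]))
print(f"computed: {new_computed[0]:.2f} eV, corrected: {corrected[0]:.2f} eV")
```

In a real workflow the descriptors, the dataset, and the model would be far richer; the point of the pattern is simply that experimental data corrects the simulation systematically, rather than through one-off hand tuning.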
In October 2014 I visited Chevron and sat down with their R&D scientists to ask for feedback on my presentation. After I mentioned “cloud”, the mood started to deteriorate. They did not believe anyone would ever want to store information about petroleum catalysts in the cloud.
In November 2017, Chevron and Microsoft Azure announced a milestone contract centered on data from a far more valuable source for any oil and gas company: oil fields [11]. It is easy to imagine where their chemists could be now had the people we spoke with back in 2014 been a bit more forward-looking.
We ended up partnering with other customers instead, including companies in the energy space of similar size but more open to innovation, and have had a very productive and mutually valuable exchange: we learn about their problems and incorporate solutions into our product, and they get to innovate faster and stay ahead of the competition.
Software has “eaten” many areas already [12], but it has not yet reached the complex and unsexy guts of the enterprise. That is why we hear so much about Digital Transformation today [11], and that is where software can still make a great impact. For many enterprises, the digital transformation and innovation strategy already includes materials modeling.
Much as with the emergence of CAD and EDA as industries, materials modeling will change the way we develop products from the atoms up. Companies really have two choices: “wait and see”, and risk learning how other companies do it from the newspapers, or “join early”, get ahead of the game, and help establish the new standards.
What is your choice?…
[1] https://www.ctbto.org/specials/testing-times/23-september-1992-last-us-nuclear-test
[2] https://www.cnbc.com/2017/08/08/heres-how-much-a-nuclear-weapon-costs.html
[3] https://en.wikipedia.org/wiki/Multiscale_modeling
[4] https://www.nature.com/articles/nature13570
[5] http://science.sciencemag.org/content/321/5890/792.full
[6] http://www.sciencedirect.com/science/article/pii/S0927025615005820
[7] http://accelrys.com/; https://www.schrodinger.com/
[8] http://psi-k.net/docs/Economic_impact_of_modelling.pdf#page=22
[9] http://science.sciencemag.org/content/351/6280/aad3000
[10] https://research.googleblog.com/2017/10/announcing-openfermion-open-source.html
[11] https://www.chevron.com/stories/chevron-partners-with-microsoft
[12] https://a16z.com/2016/08/20/why-software-is-eating-the-world/