Natural Genesis
A Sourcebook for the Worldwide Discovery of a Creative Organic Universe

VIII. Earth Earns: An Open CoCreative Earthropocene to Astropocene PediaVerse

1. Mind Over Matter and Energy: Quantum, Atomic, Chemical, Astronomic Realms

Zheng, Xiaolong, et al. Machine Learning Material Properties from the Periodic Table using Convolutional Neural Networks. Chemical Science. 9/8426, 2018. In this Royal Society of Chemistry journal, computational chemists at Hangzhou Dianzi University and Northwest University, Xi'an achieve a novel application of this multiplex connective method, by which the atomic elements can be better studied in the 21st century.

In recent years, convolutional neural networks (CNNs) have achieved great success in image recognition with powerful feature extraction ability. Here we show that CNNs can learn the inner structure and chemical information in the periodic table. Using the periodic table as representation, and full-Heusler compounds in the Open Quantum Materials Database (OQMD) as training and test samples, a multi-task CNN was trained to output the lattice parameter and enthalpy of formation. Our results indicate that the two-dimensional inner structure of the periodic table was learned by the CNN as useful chemical information. (Abstract excerpt)
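
The multi-task setup the abstract describes can be pictured with a brief sketch. The Python code below is a minimal illustration, not the authors' released code: a small convolutional network reads a compound encoded on a two-dimensional periodic-table grid and emits two outputs, the lattice parameter and the enthalpy of formation. The grid size, layer widths, and random placeholder data are assumptions made only for the demonstration.

```python
import torch
import torch.nn as nn

class PeriodicTableCNN(nn.Module):
    """Shared convolutional trunk with two task heads, one per target property."""
    def __init__(self, rows=9, cols=18):
        super().__init__()
        self.trunk = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Flatten(),
        )
        hidden = 32 * rows * cols
        self.lattice_head = nn.Linear(hidden, 1)    # lattice parameter
        self.enthalpy_head = nn.Linear(hidden, 1)   # enthalpy of formation

    def forward(self, x):                           # x: (batch, 1, rows, cols)
        h = self.trunk(x)
        return self.lattice_head(h), self.enthalpy_head(h)

# One illustrative training step on random placeholder data. A real run would place
# each element's stoichiometric fraction at its cell of the periodic-table grid and
# draw the two targets from OQMD full-Heusler entries.
model = PeriodicTableCNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.rand(8, 1, 9, 18)
y_lattice, y_enthalpy = torch.rand(8, 1), torch.rand(8, 1)

opt.zero_grad()
pred_lattice, pred_enthalpy = model(x)
loss = nn.functional.mse_loss(pred_lattice, y_lattice) + \
       nn.functional.mse_loss(pred_enthalpy, y_enthalpy)
loss.backward()
opt.step()
```

Because the two heads share one convolutional trunk, whatever spatial regularities of the table help predict one property are available to the other, which is the sense in which the network "learns the inner structure" of the periodic table.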

Zhou, Quan, et al. Learning Atoms for Materials Discovery. Proceedings of the National Academy of Sciences. 115/E6411, 2018. A team of Chinese-American physicists at Stanford University and Temple University describes a sophisticated endeavor to apply machine computation and neural net learning so as to initiate a novel phase of crystal and chemical creativity. In the process, they noticed that the methods take on a literary and linguistic semblance, which in turn extends to quantum phases. Among the references are Distributed Representations of Words and Phrases, GloVe: Global Vectors for Word Representation, and Quantum-Chemical Insights from Deep Tensor Neural Networks (search Schutt). In this regard, a further indication is gained of how much the cosmos-to-culture span is innately textual in nature. It then occurred to me that the term “atomic” might be able to accrue an “at-omics” identity.

Exciting advances have been made in artificial intelligence (AI) during recent decades. Among them, applications of machine learning (ML) and deep learning techniques brought human-competitive performances in various tasks of fields, including image recognition, speech recognition, and natural language understanding. In this work, we show that our unsupervised machines (Atom2Vec) can learn the basic properties of atoms by themselves from the extensive database of known compounds and materials. These learned properties are represented in terms of high-dimensional vectors, and clustering of atoms in vector space classifies them into meaningful groups consistent with human knowledge. We use the atom vectors as basic input units for neural networks and other ML models designed and trained to predict materials properties, which demonstrate significant accuracy. (Abstract)
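
In spirit, the Atom2Vec procedure parallels word embedding: atoms that occur in similar chemical environments should receive similar vectors. The toy sketch below, a minimal illustration rather than the authors' dataset or code, assumes a tiny hand-picked list of binary compounds and a small vector dimension; it builds an atom co-occurrence matrix and factorizes it so that chemically kindred atoms cluster together.

```python
import numpy as np

# Tiny illustrative set of known binary compounds (assumed for the demo, not OQMD data).
compounds = [("Li", "F"), ("Li", "Cl"), ("Na", "F"), ("Na", "Cl"), ("Na", "Br"),
             ("K", "Cl"), ("K", "Br"), ("Mg", "O"), ("Ca", "O"), ("Ca", "S")]

atoms = sorted({a for pair in compounds for a in pair})
index = {a: i for i, a in enumerate(atoms)}

# Count how often each atom appears alongside each partner atom; this co-occurrence
# profile stands in for the atom's chemical "environment" across known materials.
counts = np.zeros((len(atoms), len(atoms)))
for a, b in compounds:
    counts[index[a], index[b]] += 1
    counts[index[b], index[a]] += 1

# A truncated SVD compresses every environment profile into a dense vector,
# the materials analogue of an unsupervised word vector.
U, S, _ = np.linalg.svd(counts)
dim = 4
atom_vectors = U[:, :dim] * S[:dim]

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

# Atoms filling similar chemical roles end up close together in the vector space.
print(cosine(atom_vectors[index["Na"]], atom_vectors[index["K"]]))   # alkali pair: high
print(cosine(atom_vectors[index["Na"]], atom_vectors[index["O"]]))   # unlike roles: near zero
```

In the paper itself an atom's environment is the full remainder of each compound rather than a single partner element, and the matrix spans the whole materials database, but the principle of clustering atoms by shared environments is the same.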

Summary and Outlook: We introduce unsupervised learning of atoms from a database of known existing materials and show the rediscovery of the periodic table by AI. The learned feature vectors not only capture well the similarities and properties of atoms in a vector space, but also show their superior effectiveness over simple empirical descriptors when used in ML problems for materials science. While empirical descriptors are usually designed specifically for a task, our learned vectors from unsupervised learning should be general enough to be applied to many cases. We anticipate their effectiveness and broad applicability can greatly boost the data-driven approaches in today’s materials science, especially for the recently proposed deep neural network methods (29–32), the same as the huge success of word vectors in language modeling. (5-6)
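
The outlook's point that learned atom vectors can serve as general-purpose descriptors is easy to picture: pool the vectors of a compound's constituent atoms and hand the result to an ordinary regressor. The sketch below uses placeholder embeddings, invented formulas and property values, and scikit-learn's Ridge model purely for illustration of the pipeline shape.

```python
import numpy as np
from sklearn.linear_model import Ridge

# Placeholder atom embeddings; in practice these would be learned vectors such as the
# Atom2Vec-style ones sketched above.
rng = np.random.default_rng(0)
atoms = ["Li", "Na", "K", "F", "Cl", "Br", "Mg", "Ca", "O", "S"]
index = {a: i for i, a in enumerate(atoms)}
atom_vectors = rng.normal(size=(len(atoms), 4))

def featurize(formula):
    # Pool the constituent atoms' vectors; simple averaging is one possible choice.
    return np.mean([atom_vectors[index[a]] for a in formula], axis=0)

# Invented (compound, property) pairs standing in for a curated training set.
train = [(("Na", "Cl"), 0.8), (("K", "Br"), 0.7), (("Mg", "O"), 3.2), (("Ca", "S"), 2.9)]
X = np.array([featurize(f) for f, _ in train])
y = np.array([v for _, v in train])

model = Ridge(alpha=1.0).fit(X, y)
print(model.predict(featurize(("Li", "F")).reshape(1, -1)))  # estimate for an unseen compound
```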
