Natural Genesis: A Sourcebook for the Worldwide Discovery of a Creative Organic Universe

II. Planetary Prodigy: A Global Sapiensphere Learns by Her/His Self

1. Earthificial Intelligence: Deep Neural Network Learning

Das Sarma, Sankar, et al. Machine Learning Meets Quantum Physics. Physics Today. March, 2019. In a “most read” journal article, University of Maryland and Tsinghua University, Beijing, computational theorists show how these disparate fields actually share common qualities, which can serve to inform, meld and advance each endeavor. The graphic article goes on to compare artificial neural-network states with quantum states so as to join “classical and quantum” phases, especially for communicative and computational purposes.

Deng, Dong-Ling, et al. Quantum Entanglement in Neural Network States. Physical Review X. 7/021021, 2017. University of Maryland and Fudan University, Shanghai, theorists identify, develop and extol the practical affinity of neural network cognitive geometries and operational workings with quantum phase phenomena. On reflection, how might its application to this fundamental cosmic realm imply an intrinsic cerebral character and content? A step further, if global human beings can so readily plumb such depths and intentionally apply these basic, principled methods, could a creative universe intend for their passage to our cognizance and continuance?

Machine learning, one of today’s most rapidly growing interdisciplinary fields, promises an unprecedented perspective for solving intricate quantum many-body problems. Understanding the physical aspects of the representative artificial neural-network states has recently become highly desirable in the applications of machine-learning techniques to quantum many-body physics. In this paper, we explore the data structures that encode the physical features in the network states by studying the quantum entanglement properties, with a focus on the restricted-Boltzmann-machine (RBM) architecture. Our results uncover the unparalleled power of artificial neural networks in representing quantum many-body states regardless of how much entanglement they possess, which paves a novel way to bridge computer-science-based machine-learning techniques to outstanding quantum condensed-matter physics problems. (Abstract excerpts)
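
To make the restricted-Boltzmann-machine representation concrete, here is a minimal numpy sketch, not drawn from the paper itself, of the standard neural-network quantum state ansatz in which the hidden units are summed out analytically; all parameter values are illustrative.

    import numpy as np

    def rbm_amplitude(spins, a, b, W):
        """Unnormalized RBM wavefunction amplitude for one spin configuration.

        spins : array of +/-1 visible spins, shape (n_visible,)
        a     : visible biases, shape (n_visible,)
        b     : hidden biases, shape (n_hidden,)
        W     : coupling weights, shape (n_visible, n_hidden)
        Hidden units are traced out analytically, giving a product of cosh terms.
        (For genuinely complex amplitudes the parameters would be complex-valued.)
        """
        theta = b + spins @ W                      # effective field on each hidden unit
        return np.exp(a @ spins) * np.prod(2.0 * np.cosh(theta))

    # Toy usage: 4 visible spins, 8 hidden units, random illustrative parameters.
    rng = np.random.default_rng(0)
    n_vis, n_hid = 4, 8
    a = 0.1 * rng.standard_normal(n_vis)
    b = 0.1 * rng.standard_normal(n_hid)
    W = 0.1 * rng.standard_normal((n_vis, n_hid))
    config = np.array([1, -1, 1, 1])
    print(rbm_amplitude(config, a, b, W))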

Dieleman, Sander, et al. Rotation-Invariant Convolutional Neural Networks for Galaxy Morphology Prediction. arXiv:1503.07077. Since the Galaxy Zoo citizen-science project to characterize galaxy morphologies has limitations of scale, Ghent University and University of Minnesota researchers lay out a deep neural network method to further empower the communal project. (Spiral Science)
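
The paper exploits rotational symmetry by processing multiple rotated and flipped viewpoints of each galaxy image within the network; a simpler way to convey the invariance idea is to average a classifier's predictions over transformed copies of the input, as in this illustrative sketch (the classify callable is a hypothetical stand-in for any trained model).

    import numpy as np

    def rotation_averaged_prediction(image, classify):
        """Average class probabilities over rotated and mirrored copies of an image.

        image    : 2D numpy array (a single-channel galaxy cutout)
        classify : any callable mapping an image to a probability vector
                   (hypothetical stand-in for a trained CNN)
        """
        views = [np.rot90(image, k) for k in range(4)]   # 0, 90, 180, 270 degrees
        views += [np.fliplr(v) for v in views]           # add mirrored viewpoints
        probs = np.stack([classify(v) for v in views])
        return probs.mean(axis=0)                        # invariant to those transforms

    # Toy usage with a dummy "classifier" that only summarizes pixel statistics.
    def dummy_classify(img):
        m = img.mean()
        return np.array([m, 1.0 - m])

    galaxy = np.random.default_rng(1).random((64, 64))
    print(rotation_averaged_prediction(galaxy, dummy_classify))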

Dufourq, Emmanuel and Bruce Bassett. EDEN: Evolutionary Deep Networks for Efficient Machine Learning. arXiv:1709.09161. University of Cape Town and African Institute for Mathematical Sciences theorists add a temporal depth to neural net computations by informing and integrating them with evolutionary phenomena. By turns, life’s quickening emergence might be likened to a grand educative endeavor. See also Evolving Deep Neural Networks at arXiv:1703.00548 for a companion effort.

Deep neural networks continue to show improved performance with increasing depth, an encouraging trend that implies an explosion in the possible permutations of network architectures and hyperparameters for which there is little intuitive guidance. To address this increasing complexity, we propose Evolutionary DEep Networks (EDEN), a computationally efficient neuro-evolutionary algorithm which interfaces to any deep neural network platform, such as TensorFlow. Evaluation of EDEN across seven image and sentiment classification datasets shows that it reliably finds good networks -- and in three cases achieves state-of-the-art results -- even on a single GPU, in just 6-24 hours. Our study provides a first attempt at applying neuro-evolution to the creation of 1D convolutional networks for sentiment analysis including the optimisation of the embedding layer. (Abstract)
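
EDEN itself evolves full network architectures and hyperparameters on platforms such as TensorFlow; the sketch below only illustrates the generic neuro-evolutionary loop such methods build on, with an invented chromosome encoding and a placeholder fitness function standing in for actual network training.

    import random

    # Toy "chromosome": (number of layers, units per layer, learning rate).
    def random_chromosome():
        return (random.randint(1, 6), random.choice([32, 64, 128, 256]),
                10 ** random.uniform(-4, -1))

    def mutate(chrom):
        layers, units, lr = chrom
        choice = random.randrange(3)
        if choice == 0:
            layers = max(1, layers + random.choice([-1, 1]))
        elif choice == 1:
            units = random.choice([32, 64, 128, 256])
        else:
            lr = 10 ** random.uniform(-4, -1)
        return (layers, units, lr)

    def fitness(chrom):
        # Placeholder: a real neuro-evolution run would decode the chromosome into
        # a network, train it, and return validation accuracy as the fitness score.
        layers, units, lr = chrom
        return -abs(layers - 3) - abs(units - 128) / 128 - abs(lr - 0.001) * 100

    population = [random_chromosome() for _ in range(10)]
    for generation in range(20):
        population.sort(key=fitness, reverse=True)
        survivors = population[:5]                                   # selection
        population = survivors + [mutate(random.choice(survivors)) for _ in range(5)]
    print(max(population, key=fitness))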

Dunjko, Vedran and Hans Briegel. Machine Learning & Artificial Intelligence in the Quantum Domain. Reports on Progress in Physics. 81/7, 2018. University of Innsbruck physicists scope out how these dual approaches may cross-inform and merge so as to be fruitfully applied to this deepest frontier. See also, for example, Quantum Neural Network States by Zhih-Ahn Jia, et al. at arXiv:1808.10601.

Quantum information technologies and intelligent learning systems are both emergent technologies that are likely to have a transformative impact on our society. In a growing body of recent work, researchers have been probing the question of the extent to which these fields can indeed learn and benefit from each other. Quantum ML explores the interaction between quantum computing and ML, investigating how results and techniques from one field can be used to solve the problems of the other. Beyond the topics of mutual enhancement—exploring what ML/AI can do for quantum physics and vice versa—researchers have also broached the fundamental issue of quantum generalizations of learning and AI concepts. This deals with questions of the very meaning of learning and intelligence in a world that is fully described by quantum mechanics. (Abstract excerpts)

Gencaga, Deniz. Information-Theoretic Approaches in Deep Learning. Entropy. August, 2018. An Antalya Bilim University, Turkey, informatics engineer proposes a special issue with this title, with a June 30, 2019 closing date for submissions.

Deep Learning (DL) has revolutionized machine learning, especially in the last decade. As a benefit of this unprecedented development, we are capable of working with very large Neural Networks (NNs), composed of multiple layers (Deep Neural Networks, DNNs), in many applications, such as object recognition and detection, speech recognition and natural language processing. Although many Convolutional Neural Network (CNN) and Recurrent Neural Network (RNN) based algorithms have been proposed, a comprehensive theoretical understanding of DNNs remains a major research area. In this Special Issue, we would like to collect papers focusing on both the theory and applications of information-theoretic approaches, such as Mutual Information. The application areas are diverse and include object tracking/detection, speech recognition, natural language processing, neuroscience, bioinformatics, engineering, finance, astronomy, and Earth and space sciences.
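
As one concrete instance of the information-theoretic quantities the call highlights, here is an illustrative histogram-based estimate of the mutual information between a scalar activation and a discrete label; the binning scheme and toy data are invented for the example, not taken from the special issue.

    import numpy as np

    def mutual_information(t, y, n_bins=10):
        """Histogram estimate of I(T; Y) in bits for scalar activations t, integer labels y."""
        t_binned = np.digitize(t, np.histogram_bin_edges(t, bins=n_bins))
        joint = np.zeros((n_bins + 2, int(y.max()) + 1))
        for ti, yi in zip(t_binned, y):
            joint[ti, yi] += 1
        joint /= joint.sum()                         # joint distribution p(t, y)
        pt = joint.sum(axis=1, keepdims=True)        # marginal p(t)
        py = joint.sum(axis=0, keepdims=True)        # marginal p(y)
        nz = joint > 0
        return float(np.sum(joint[nz] * np.log2(joint[nz] / (pt @ py)[nz])))

    # Toy usage: an "activation" that partly tracks a binary label.
    rng = np.random.default_rng(2)
    y = rng.integers(0, 2, size=5000)
    t = y + 0.5 * rng.standard_normal(5000)
    print(mutual_information(t, y))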

George, Daniel and Eliu Antonio Huerta. Deep Neural Networks to Enable Real-Time Multimessenger Astrophysics. arXiv:1701.00008. Along with the pervasive employ of Bayesian (search) probabilities in mid-2010s science, as these University of Illinois astronomers explain, another incisive approach is the use of artificial neural nets as a generic, self-organizing complex system of universal application. See also in this cosmic realm Deep Learning for Studies of Galaxy Morphology (1701.05917), and Star-Galaxy Classification using Deep Convolutional Neural Networks (1608.04369). (Spiral Science)

Gillet, Nicolas, et al. Deep Learning from 21cm Images of the Cosmic Dawn. arXiv:1805.02699. We cite this entry as an example of how astrophysicists from Italy, Australia, the USA, and Canada are making use of brain-like neural networks to analyze patterns across the celestial reaches and the depths of temporal origins.

Golkov, Vladimir, et al. 3D Deep Learning for Biological Function Prediction from Physical Fields. arXiv:1704.04039. As this turn to an organic AI actively advances, along with an evolutionary setting, a further rooting may be established in physical foundations.

Goodfellow, Ian, Yoshua Bengio and Aaron Courville. Deep Learning. Cambridge: MIT Press, 2016. Seen as a most comprehensive introduction, pioneer theorists Goodfellow, Google Brain, with Bengio and Courville, University of Montreal, provide a technical text on this broad, multi-faceted revolution from inept machine methods to animate algorithms akin to the way our brains learn and think.

Deep learning is a form of machine learning that enables computers to learn from experience and understand the world in terms of a hierarchy of concepts. The text offers mathematical and conceptual background, covering relevant concepts in linear algebra, probability theory and information theory, numerical computation, and machine learning. It describes deep learning techniques used by practitioners in industry, including deep feedforward networks, regularization, optimization algorithms, convolutional networks, sequence modeling, and practical methodology; and it surveys such applications as natural language processing, speech recognition, computer vision, online recommendation systems, bioinformatics, and videogames. Finally, the book offers research perspectives, covering such theoretical topics as linear factor models, autoencoders, representation learning, structured probabilistic models, Monte Carlo methods, the partition function, approximate inference, and deep generative models. (Publisher)

Gopnik, Alison. An AI that Knows the World like Children Do. Scientific American. June, 2017. The UC Berkeley child psychologist and philosopher achieves a good synopsis of this recent turn to a viable computational intelligence by modeling it on actual human cognitive processes rather than machines. The best teacher may then be how each of us was able to gain knowledge in the first place. Two paths to AI’s resurgence are thus identified: bottom-up deep iterative learning and top-down Bayesian probabilities.
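
To make the top-down Bayesian path tangible in the simplest terms, here is a toy numeric sketch of a single Bayes-rule belief update; the numbers are illustrative and not taken from Gopnik's article.

    # Bayes' rule: P(H | E) = P(E | H) * P(H) / P(E), with illustrative numbers.
    prior = 0.2                    # prior belief that hypothesis H is true
    likelihood = 0.9               # P(evidence | H)
    likelihood_alt = 0.3           # P(evidence | not H)

    evidence = likelihood * prior + likelihood_alt * (1 - prior)
    posterior = likelihood * prior / evidence
    print(round(posterior, 3))     # 0.429: the evidence raises belief from 0.2 to about 0.43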

Graupe, Daniel. Deep Learning Neural Networks. Singapore: World Scientific, 2016. A senior University of Illinois electrical engineer provides a technical introduction to their basic concepts, scope, methods, back-propagation, convolutional and recurrent aspects, and much more, along with many case studies.

Deep Learning Neural Networks is the fastest growing field in machine learning. It serves as a powerful computational tool for solving prediction, decision, diagnosis and detection problems based on a well-defined computational architecture. It has been successfully applied to a broad field of applications ranging from computer security, speech recognition, image and video recognition to industrial fault detection, medical diagnostics and finance. This work is intended for use as a one-semester graduate-level university text and as a textbook for research and development establishments in industry, medicine and financial research.
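
Since back-propagation is the training mechanism at the heart of such networks, here is a minimal illustrative numpy sketch, not taken from the book, of gradient-descent training for a small two-layer network on a toy regression target.

    import numpy as np

    rng = np.random.default_rng(3)
    x = rng.standard_normal((100, 2))          # 100 samples, 2 inputs
    y = x[:, :1] * x[:, 1:]                    # toy target: product of the two inputs

    W1, b1 = 0.5 * rng.standard_normal((2, 8)), np.zeros(8)
    W2, b2 = 0.5 * rng.standard_normal((8, 1)), np.zeros(1)
    lr = 0.1

    for step in range(500):
        # Forward pass: one hidden tanh layer, linear output, mean-squared error.
        h = np.tanh(x @ W1 + b1)
        pred = h @ W2 + b2
        err = pred - y
        loss = np.mean(err ** 2)

        # Backward pass: chain rule applied layer by layer.
        d_pred = 2 * err / len(x)
        dW2, db2 = h.T @ d_pred, d_pred.sum(axis=0)
        d_h = (d_pred @ W2.T) * (1 - h ** 2)   # derivative of tanh
        dW1, db1 = x.T @ d_h, d_h.sum(axis=0)

        # Gradient-descent parameter update.
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2

    print(round(loss, 4))                      # final training loss on the toy data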
