Papers from the journal Nature.
A total of 82 paper shares found; this page shows entries 81-82.
81.
十年 (2022-02-12 20:00):
#paper doi:10.1038/s41586-021-04223-6 Wright et al., Deep physical neural networks trained with backpropagation. Nature 601, 549-555 (2022). The legendary "everything can be a neural network": the authors propose the PNN (physical neural network) and show that it works remarkably well in mechanical, optical and electronic systems. Truly, everything can be a neural network. Seriously impressive.
IF: 50.500, Q1. Nature, 2022-01. DOI: 10.1038/s41586-021-04223-6. PMID: 35082422
Abstract:
Deep-learning models have become pervasive tools in science and engineering. However, their energy requirements now increasingly limit their scalability. Deep-learning accelerators aim to perform deep learning energy-efficiently, usually targeting the inference phase and often by exploiting physical substrates beyond conventional electronics. Approaches so far have been unable to apply the backpropagation algorithm to train unconventional novel hardware in situ. The advantages of backpropagation have made it the de facto training method for large-scale neural networks, so this deficiency constitutes a major impediment. Here we introduce a hybrid in situ-in silico algorithm, called physics-aware training, that applies backpropagation to train controllable physical systems. Just as deep learning realizes computations with deep neural networks made from layers of mathematical functions, our approach allows us to train deep physical neural networks made from layers of controllable physical systems, even when the physical layers lack any mathematical isomorphism to conventional artificial neural network layers. To demonstrate the universality of our approach, we train diverse physical neural networks based on optics, mechanics and electronics to experimentally perform audio and image classification tasks. Physics-aware training combines the scalability of backpropagation with the automatic mitigation of imperfections and noise achievable with in situ algorithms. Physical neural networks have the potential to perform machine learning faster and more energy-efficiently than conventional electronic processors and, more broadly, can endow physical systems with automatically designed physical functionalities, for example, for robotics, materials and smart sensors.
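As a rough intuition for the hybrid in situ-in silico idea described above, here is a minimal sketch in PyTorch of one way physics-aware training can route gradients. This is not the authors' code: physical_forward (a measurement on the real device) and digital_model (a differentiable surrogate of that device) are hypothetical stand-ins, and the straight-through-style trick below is just one way to realize "forward pass on hardware, backward pass through a simulation", as the abstract describes.

    import torch

    # Minimal sketch of physics-aware training (PAT), NOT the paper's code.
    # physical_forward(x, params): hypothetical stand-in for running the real
    #   physical system and measuring its output (non-differentiable).
    # digital_model(x, params): hypothetical differentiable surrogate trained
    #   to approximate the physical system.
    def pat_output(x, params, physical_forward, digital_model):
        y_phys = physical_forward(x, params).detach()  # in situ forward pass
        y_sim = digital_model(x, params)               # in silico forward pass
        # Straight-through-style estimator: the forward value equals the
        # physical output, while gradients flow through the surrogate.
        return y_sim + (y_phys - y_sim).detach()

    def train_step(x, target, params, physical_forward, digital_model, opt):
        opt.zero_grad()
        y = pat_output(x, params, physical_forward, digital_model)
        loss = torch.nn.functional.mse_loss(y, target)
        loss.backward()  # backpropagation runs through the digital surrogate
        opt.step()       # update the controllable physical parameters
        return loss.item()

    # Toy usage with hypothetical stand-ins:
    # params = torch.randn(16, requires_grad=True)
    # opt = torch.optim.Adam([params], lr=1e-2)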
82.
尹志 (2022-01-31 12:53):
#paper doi:10.1038/nature14539 LeCun, Y., Bengio, Y. & Hinton, G. Deep learning. Nature 521, 436–444 (2015). This is the 2015 Nature review written by the three giants of deep learning, one of a series of review papers Nature published to mark the 60th anniversary of AI. It surveys the then red-hot topic of deep learning, and as founders of the field the authors make its concepts, principles and applications remarkably accessible. The paper walks from supervised learning through backpropagation, reviewing mainly the principles and applications of CNNs and RNNs, and is well suited for beginners who want a full picture of deep learning as it stood at the time. In the closing section on the future of deep learning, the authors express high hopes for unsupervised learning; looking at the years since, with self-supervised learning (championed especially by Yann LeCun) becoming mainstream, that hope has indeed been answered.
IF: 50.500, Q1. Nature, 2015-May-28. DOI: 10.1038/nature14539. PMID: 26017442
Abstract:
Deep learning allows computational models that are composed of multiple processing layers to learn representations of data with multiple levels of abstraction. These methods have dramatically improved the state-of-the-art in speech recognition, visual object recognition, object detection and many other domains such as drug discovery and genomics. Deep learning discovers intricate structure in large data sets by using the backpropagation algorithm to indicate how a machine should change its internal parameters that are used to compute the representation in each layer from the representation in the previous layer. Deep convolutional nets have brought about breakthroughs in processing images, video, speech and audio, whereas recurrent nets have shone light on sequential data such as text and speech.
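To make concrete what the abstract means by backpropagation indicating "how a machine should change its internal parameters", here is a minimal two-layer network trained with hand-derived backpropagation in NumPy. It is an illustrative sketch, not code from the review.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy task: learn XOR with a two-layer network.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0.0], [1.0], [1.0], [0.0]])

    # Parameters of x -> ReLU(x W1 + b1) -> sigmoid(h W2 + b2)
    W1 = rng.normal(0, 1, (2, 8))
    b1 = np.zeros(8)
    W2 = rng.normal(0, 1, (8, 1))
    b2 = np.zeros(1)
    lr = 0.5

    for step in range(2000):
        # Forward pass: each layer computes its representation
        # from the representation in the previous layer.
        z1 = X @ W1 + b1
        h = np.maximum(z1, 0)                      # ReLU hidden layer
        p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid output

        # Backward pass: gradients of the squared-error loss
        # with respect to every internal parameter.
        dz2 = (p - y) * p * (1 - p) / len(X)
        dW2 = h.T @ dz2
        db2 = dz2.sum(axis=0)
        dh = dz2 @ W2.T
        dz1 = dh * (z1 > 0)                        # ReLU derivative
        dW1 = X.T @ dz1
        db1 = dz1.sum(axis=0)

        # Gradient step: change each parameter so as to reduce the loss.
        W1 -= lr * dW1
        b1 -= lr * db1
        W2 -= lr * dW2
        b2 -= lr * db2

    print(np.round(p, 2))  # should approach [[0], [1], [1], [0]]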