Papers from the journal Nature.
83 shared papers found in total; this page shows entries 81 - 83.
81.
傅宇蕾 (2022-02-28 18:08):
#paper Dexterous magnetic manipulation of conductive non-magnetic objects. DOI: 10.1038/s41586-021-03966-6. This paper manipulates objects that are non-magnetic but electrically conductive: an applied magnetic field (generated by permanent magnets, electromagnets or superconductors) induces eddy currents in the object, and the forces and torques arising from the interaction between the induced currents and the applied field are used for control. Unlike earlier approaches, the manipulation here is dexterous, with six degrees of freedom, achieved with multiple rotating dipole fields. The advantage is non-contact manipulation of conductive materials; the drawbacks are small forces, slow speed and poor real-time performance. For space scenarios it is still far from practical application.
IF: 50.500, Q1. Nature, 2021-10. DOI: 10.1038/s41586-021-03966-6. PMID: 34671137
Abstract:
Dexterous magnetic manipulation of ferromagnetic objects is well established, with three to six degrees of freedom possible depending on object geometry. There are objects for which non-contact dexterous manipulation is desirable that do not contain an appreciable amount of ferromagnetic material but do contain electrically conductive material. Time-varying magnetic fields generate eddy currents in conductive materials, with resulting forces and torques due to the interaction of the eddy currents with the magnetic field. This phenomenon has previously been used to induce drag to reduce the motion of objects as they pass through a static field, or to apply force on an object in a single direction using a dynamic field, but has not been used to perform the type of dexterous manipulation of conductive objects that has been demonstrated with ferromagnetic objects. Here we show that manipulation, with six degrees of freedom, of conductive objects is possible by using multiple rotating magnetic dipole fields. Using dimensional analysis, combined with multiphysics numerical simulations and experimental verification, we characterize the forces and torques generated on a conductive sphere in a rotating magnetic dipole field. With the resulting model, we perform dexterous manipulation in simulations and physical experiments.
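The dimensional-analysis framing in the abstract can be sketched concretely. The snippet below is not from the paper; the dipole moment, rotation rate, sphere radius and stand-off distance are made-up illustrative values. It evaluates the standard point-dipole field B(r) = mu0/(4*pi) * (3(m·r̂)r̂ - m)/|r|^3 for a dipole rotating in the x-y plane, together with the electromagnetic skin depth delta = sqrt(2/(mu0*sigma*omega)), whose ratio to the sphere radius is the kind of dimensionless group such an analysis works with.

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability (T*m/A)

def dipole_field(m, r):
    """Standard point-dipole field B(r) = mu0/(4*pi) * (3(m.rhat)rhat - m) / |r|^3."""
    r = np.asarray(r, dtype=float)
    rnorm = np.linalg.norm(r)
    rhat = r / rnorm
    return MU0 / (4 * np.pi) * (3 * np.dot(m, rhat) * rhat - m) / rnorm**3

def rotating_dipole(m0, omega, t):
    """Dipole moment of magnitude m0 rotating in the x-y plane at angular rate omega."""
    return m0 * np.array([np.cos(omega * t), np.sin(omega * t), 0.0])

# Illustrative (made-up) numbers: an aluminium sphere near a rotating dipole source.
sigma = 3.5e7           # conductivity of aluminium (S/m)
a = 0.05                # sphere radius (m), assumed
omega = 2 * np.pi * 20  # 20 Hz rotation rate, assumed
m0 = 50.0               # dipole moment magnitude (A*m^2), assumed
r_sphere = np.array([0.0, 0.0, 0.3])  # sphere centre relative to the dipole (m), assumed

skin_depth = np.sqrt(2.0 / (MU0 * sigma * omega))
B = dipole_field(rotating_dipole(m0, omega, t=0.0), r_sphere)

print(f"skin depth = {skin_depth * 1e3:.1f} mm, a/delta = {a / skin_depth:.2f}")
print(f"|B| at sphere centre = {np.linalg.norm(B) * 1e3:.3f} mT")
```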
82.
十年 (2022-02-12 20:00):
#paper doi:10.1038/s41586-021-04223-6 Wright et al., Deep physical neural networks trained with backpropagation. Nature 601, 549-555 (2022). The fabled "anything can be a neural network": the authors propose PNNs (physical neural networks) and train them with backpropagation, with very good results on mechanical, optical and electronic systems. Everything really can be a neural network; seriously impressive.
IF: 50.500, Q1. Nature, 2022-01. DOI: 10.1038/s41586-021-04223-6. PMID: 35082422
Abstract:
Deep-learning models have become pervasive tools in science and engineering. However, their energy requirements now increasingly limit their scalability. Deep-learning accelerators aim to perform deep learning energy-efficiently, usually targeting the inference phase and often by exploiting physical substrates beyond conventional electronics. Approaches so far have been unable to apply the backpropagation algorithm to train unconventional novel hardware in situ. The advantages of backpropagation have made it the de facto training method for large-scale neural networks, so this deficiency constitutes a major impediment. Here we introduce a hybrid in situ-in silico algorithm, called physics-aware training, that applies backpropagation to train controllable physical systems. Just as deep learning realizes computations with deep neural networks made from layers of mathematical functions, our approach allows us to train deep physical neural networks made from layers of controllable physical systems, even when the physical layers lack any mathematical isomorphism to conventional artificial neural network layers. To demonstrate the universality of our approach, we train diverse physical neural networks based on optics, mechanics and electronics to experimentally perform audio and image classification tasks. Physics-aware training combines the scalability of backpropagation with the automatic mitigation of imperfections and noise achievable with in situ algorithms. Physical neural networks have the potential to perform machine learning faster and more energy-efficiently than conventional electronic processors and, more broadly, can endow physical systems with automatically designed physical functionalities, for example, for robotics, materials and smart sensors.
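The "hybrid in situ-in silico" idea in the abstract can be sketched in a few lines: the forward pass uses the noisy, non-differentiable physical system, while gradients are routed through a differentiable digital model of it. The sketch below is a hypothetical illustration, not the paper's code: physical_system stands in for real hardware, DigitalTwin for its learned simulator, and the controllable parameters are simply fed to the hardware alongside the data.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
_W_PHYS = torch.randn(20, 10)  # fixed, unknown-to-us transfer characteristic of the hardware

def physical_system(inp):
    """Stand-in for real hardware (optical, mechanical, electronic): a fixed nonlinear
    map plus noise, treated as a black box we cannot backpropagate through."""
    with torch.no_grad():
        return torch.tanh(inp @ _W_PHYS) + 0.01 * torch.randn(inp.shape[0], 10)

class DigitalTwin(nn.Module):
    """Differentiable approximation of the physical layer, used only to route gradients."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(20, 64), nn.Tanh(), nn.Linear(64, 10))

    def forward(self, inp):
        return self.net(inp)

def pat_layer(x, theta, twin):
    """Physics-aware forward pass: the output value comes from the physical system,
    while the gradient path goes through the digital twin (straight-through substitution)."""
    inp = torch.cat([x, theta.unsqueeze(0).expand(x.shape[0], -1)], dim=1)
    y_twin = twin(inp)
    return y_twin + (physical_system(inp) - y_twin).detach()

# Toy training step: 16 data features plus 4 controllable physical parameters -> 10 outputs.
x = torch.randn(8, 16)
target = torch.randn(8, 10)
theta = nn.Parameter(torch.zeros(4))  # the controllable parameters of the hardware
twin = DigitalTwin()                  # in the full method the twin would be fitted to measured I/O data

out = pat_layer(x, theta, twin)
loss = ((out - target) ** 2).mean()
loss.backward()
print("gradient on the physical parameters:", theta.grad)
```

The detach trick keeps the forward output equal to what the hardware actually produced (so imperfections and noise are seen by the loss), while the chain rule runs entirely through the twin, which is how gradients reach the physical parameters.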
83.
尹志 (2022-01-31 12:53):
#paper doi:10.1038/nature14539 LeCun, Y., Bengio, Y. & Hinton, G. Deep learning. Nature 521, 436–444 (2015). This is the review the three giants of deep learning wrote for Nature in 2015, one of a series of reviews Nature ran to mark the 60th anniversary of AI. It surveys the then red-hot topic of deep learning, and, as founders of the field, the authors explain its concepts, principles and applications in an admirably accessible way. The review runs from supervised learning through to backpropagation, and mainly covers the principles and applications of CNNs and RNNs; it is well suited for beginners who want a full picture of deep learning as it stood at the time. In the closing section on the future of deep learning, the authors place high hopes on unsupervised learning. Looking at the past few years, with self-supervised learning, which Yann LeCun has pushed hard, becoming mainstream, those hopes have clearly been answered.
IF: 50.500, Q1. Nature, 2015-May-28. DOI: 10.1038/nature14539. PMID: 26017442
Abstract:
Deep learning allows computational models that are composed of multiple processing layers to learn representations of data with multiple levels of abstraction. These methods have dramatically improved the state-of-the-art in speech recognition, visual object recognition, object detection and many other domains such as drug discovery and genomics. Deep learning discovers intricate structure in large data sets by using the backpropagation algorithm to indicate how a machine should change its internal parameters that are used to compute the representation in each layer from the representation in the previous layer. Deep convolutional nets have brought about breakthroughs in processing images, video, speech and audio, whereas recurrent nets have shone light on sequential data such as text and speech.
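The abstract's one-line description of backpropagation (use the error signal to indicate how each layer's internal parameters should change, given the representation computed from the previous layer) is easy to make concrete. The following is a minimal two-layer example in NumPy, not taken from the review, with made-up shapes and a squared-error loss.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny two-layer network: x -> h = tanh(W1 x + b1) -> y = W2 h + b2
W1, b1 = rng.normal(size=(32, 10)) * 0.1, np.zeros(32)
W2, b2 = rng.normal(size=(1, 32)) * 0.1, np.zeros(1)

x = rng.normal(size=(10,))
target = np.array([1.0])
lr = 0.1

for step in range(100):
    # Forward pass: each layer computes its representation from the previous one.
    h = np.tanh(W1 @ x + b1)
    y = W2 @ h + b2
    loss = 0.5 * np.sum((y - target) ** 2)

    # Backward pass: propagate the error derivative layer by layer (chain rule)
    # to obtain how each internal parameter should change.
    dy = y - target                 # dL/dy
    dW2 = np.outer(dy, h)
    db2 = dy
    dh = W2.T @ dy                  # dL/dh
    dz1 = dh * (1 - h ** 2)         # through the tanh nonlinearity
    dW1 = np.outer(dz1, x)
    db1 = dz1

    # Gradient-descent parameter update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"final loss: {loss:.6f}")
```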