Literature shared by user 十年.
A total of 2 shared papers were found.
1.
十年
(2022-03-25 12:29):
#paper 10.1038/s41587-020-0740-8 Systematic classification of unknown metabolites using high-resolution fragmentation mass spectra. Yet another tool for metabolite annotation in metabolomics data processing; it belongs to the SIRIUS suite and still relies mainly on the fragmentation-tree strategy. This time the model was built with a deep neural network, and the reported cross-validation accuracy is as high as 99.7%. Many leading groups have been working on mass-spectral fragment prediction, but the accuracy has never been as high as one would hope; riding the recent wave of machine learning, hopefully it can be done better.
Abstract:
Metabolomics using nontargeted tandem mass spectrometry can detect thousands of molecules in a biological sample. However, structural molecule annotation is limited to structures present in libraries or databases, restricting analysis and interpretation of experimental data. Here we describe CANOPUS (class assignment and ontology prediction using mass spectrometry), a computational tool for systematic compound class annotation. CANOPUS uses a deep neural network to predict 2,497 compound classes from fragmentation spectra, including all biologically relevant classes. CANOPUS explicitly targets compounds for which neither spectral nor structural reference data are available and predicts classes lacking tandem mass spectrometry training data. In evaluation using reference data, CANOPUS reached very high prediction performance (average accuracy of 99.7% in cross-validation) and outperformed four baseline methods. We demonstrate the broad utility of CANOPUS by investigating the effect of microbial colonization in the mouse digestive system, through analysis of the chemodiversity of different Euphorbia plants and regarding the discovery of a marine natural product, revealing biological insights at the compound class level.
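To make the idea concrete, here is a minimal, illustrative sketch (not the authors' code) of the kind of multi-label deep neural network the CANOPUS abstract describes: a feed-forward network mapping a fingerprint-style feature vector derived from a fragmentation spectrum to independent probabilities for 2,497 compound classes. The 512-bit input size, the layer widths, and the synthetic data are assumptions made only for this demo.

```python
# Illustrative sketch of a multi-label compound-class predictor.
# Not the actual CANOPUS implementation; sizes and data are placeholders.
import torch
import torch.nn as nn

N_FEATURES = 512   # assumed spectrum-derived feature size for this demo
N_CLASSES = 2497   # number of compound classes reported in the paper

class ClassPredictor(nn.Module):
    """Multi-label classifier: one independent sigmoid output per compound class."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(N_FEATURES, 1024), nn.ReLU(),
            nn.Linear(1024, 1024), nn.ReLU(),
            nn.Linear(1024, N_CLASSES),   # raw logits; sigmoid is applied in the loss
        )

    def forward(self, x):
        return self.net(x)

model = ClassPredictor()
loss_fn = nn.BCEWithLogitsLoss()          # per-class binary cross-entropy
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Synthetic stand-in data: binary "fingerprints" and multi-hot class labels.
x = torch.randint(0, 2, (64, N_FEATURES)).float()
y = torch.randint(0, 2, (64, N_CLASSES)).float()

for step in range(5):                     # toy training loop
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
    print(f"step {step}: loss = {loss.item():.4f}")
```

The point of the sketch is only the output layer shape: each compound class gets its own probability, so one spectrum can be assigned to several classes at once.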
2.
十年
(2022-02-12 20:00):
#paper doi:10.1038/s41586-021-04223-6 Wright et al., Deep physical neural networks trained with backpropagation. Nature 601, 549-555 (2022). The legendary "everything can be a neural network": the authors propose PNNs (physical neural networks), which work remarkably well in mechanical, optical, and electronic systems. Everything really can be a neural network; seriously impressive.
Abstract:
Deep-learning models have become pervasive tools in science and engineering. However, their energy requirements now increasingly limit their scalability. Deep-learning accelerators aim to perform deep learning energy-efficiently, usually targeting the inference phase and often by exploiting physical substrates beyond conventional electronics. Approaches so far have been unable to apply the backpropagation algorithm to train unconventional novel hardware in situ. The advantages of backpropagation have made it the de facto training method for large-scale neural networks, so this deficiency constitutes a major impediment. Here we introduce a hybrid in situ-in silico algorithm, called physics-aware training, that applies backpropagation to train controllable physical systems. Just as deep learning realizes computations with deep neural networks made from layers of mathematical functions, our approach allows us to train deep physical neural networks made from layers of controllable physical systems, even when the physical layers lack any mathematical isomorphism to conventional artificial neural network layers. To demonstrate the universality of our approach, we train diverse physical neural networks based on optics, mechanics and electronics to experimentally perform audio and image classification tasks. Physics-aware training combines the scalability of backpropagation with the automatic mitigation of imperfections and noise achievable with in situ algorithms. Physical neural networks have the potential to perform machine learning faster and more energy-efficiently than conventional electronic processors and, more broadly, can endow physical systems with automatically designed physical functionalities, for example, for robotics, materials and smart sensors.
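As a rough illustration of the physics-aware training idea in the abstract (a minimal sketch under my own assumptions, not the authors' implementation): the forward pass is executed by the physical system, while gradients for backpropagation come from a differentiable digital surrogate of that system. Here the "physical system" is just a simulated noisy transform, and the sizes, noise level, and target are arbitrary demo choices.

```python
# Illustrative sketch of hybrid in situ-in silico training:
# forward pass through a (simulated) physical system, backward pass through
# a differentiable digital surrogate. Not the paper's actual code.
import torch
import torch.nn as nn

def physical_system(x, params):
    """Stand-in for real hardware: an imperfect, noisy, non-differentiable transform."""
    with torch.no_grad():
        return torch.tanh(x * params) + 0.01 * torch.randn_like(x)

def digital_model(x, params):
    """Differentiable surrogate of the physical system, used only for gradients."""
    return torch.tanh(x * params)

class PhysicsAwareLayer(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, params):
        ctx.save_for_backward(x, params)
        return physical_system(x, params)        # output comes from the "hardware"

    @staticmethod
    def backward(ctx, grad_output):
        x, params = ctx.saved_tensors
        with torch.enable_grad():
            x_d = x.detach().requires_grad_(True)
            p_d = params.detach().requires_grad_(True)
            y = digital_model(x_d, p_d)          # surrogate forward for the backward pass
            gx, gp = torch.autograd.grad(y, (x_d, p_d), grad_output)
        return gx, gp

# Toy training loop: tune the controllable parameters of the physical layer.
params = nn.Parameter(torch.randn(8))
optimizer = torch.optim.Adam([params], lr=0.05)
x = torch.randn(32, 8)
target = torch.tanh(x * 0.7)                     # arbitrary target behaviour

for step in range(20):
    optimizer.zero_grad()
    out = PhysicsAwareLayer.apply(x, params)
    loss = ((out - target) ** 2).mean()
    loss.backward()
    optimizer.step()
```

The design choice this mimics is the one the abstract emphasizes: the imperfect, noisy physical output is what the loss actually sees, so training automatically compensates for hardware imperfections even though the gradients come from an idealized digital model.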