张德祥 (2022-09-19 19:40):
#paper https://doi.org/10.48550/arXiv.2206.00426 Semantic Probabilistic Layers for Neuro-Symbolic Learning. The paper designs a predictive layer for structured-output prediction that can be plugged into a neural network and guarantees that predictions are consistent with the label constraints; by modeling intricate correlations and hard constraints, it combines probabilistic inference with logical reasoning. It is currently the only implementation that satisfies all six desiderata: probabilistic; highly expressive; guaranteed consistency with the logical constraints; general, supporting constraints expressed in a variety of formal languages; modular, so it can be embedded in a neural network and trained end-to-end; and efficient, with linear-time inference. The core idea is to realize this with constrained probabilistic circuits. Applications: pathfinding (with obstacles, waterways and other restrictions), hierarchical multi-label classification, and more.
Semantic Probabilistic Layers for Neuro-Symbolic Learning
Abstract:
We design a predictive layer for structured-output prediction (SOP) that can be plugged into any neural network guaranteeing its predictions are consistent with a set of predefined symbolic constraints. Our Semantic Probabilistic Layer (SPL) can model intricate correlations, and hard constraints, over a structured output space all while being amenable to end-to-end learning via maximum likelihood. SPLs combine exact probabilistic inference with logical reasoning in a clean and modular way, learning complex distributions and restricting their support to solutions of the constraint. As such, they can faithfully, and efficiently, model complex SOP tasks beyond the reach of alternative neuro-symbolic approaches. We empirically demonstrate that SPLs outperform these competitors in terms of accuracy on challenging SOP tasks including hierarchical multi-label classification, pathfinding and preference learning, while retaining perfect constraint satisfaction.
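Below is a minimal, illustrative sketch of the core idea as I read it: the layer restricts an expressive distribution q_theta(y | x) to the assignments y that satisfy the symbolic constraint and renormalizes, so every prediction is guaranteed consistent. Everything here (the class name ToySemanticLayer, the factorized scorer, and the brute-force enumeration of valid label vectors) is my own hypothetical stand-in for a small output space; the paper obtains the same quantities tractably, in linear time, with constrained probabilistic circuits rather than enumeration.

```python
# Sketch only, not the authors' code:
# p(y | x)  ∝  q_theta(y | f(x)) * c(y),  where c(y) = 1 iff y satisfies
# the symbolic constraint. The output space here is small enough to
# enumerate; SPL computes the product and its normalization with circuits.

import itertools
import torch
import torch.nn as nn


class ToySemanticLayer(nn.Module):
    """Maps a feature vector to a distribution over binary label vectors
    whose support is restricted to constraint-satisfying assignments."""

    def __init__(self, num_features: int, num_labels: int, constraint):
        super().__init__()
        self.scorer = nn.Linear(num_features, num_labels)  # per-label logits
        # Pre-enumerate every label vector that satisfies the constraint.
        valid = [y for y in itertools.product([0, 1], repeat=num_labels)
                 if constraint(y)]
        self.register_buffer("valid_ys",
                             torch.tensor(valid, dtype=torch.float))

    def forward(self, features):
        logits = self.scorer(features)           # (batch, num_labels)
        # Unnormalized log-score of each *valid* assignment under a fully
        # factorized q_theta (a stand-in for the expressive circuit q).
        scores = self.valid_ys @ logits.t()      # (num_valid, batch)
        log_p = torch.log_softmax(scores, dim=0) # renormalize on the support
        return log_p                             # log p(y | x) for each valid y


# Hypothetical hierarchy constraint: child label 1 implies parent label 0.
layer = ToySemanticLayer(num_features=8, num_labels=3,
                         constraint=lambda y: not (y[1] == 1 and y[0] == 0))
log_p = layer(torch.randn(4, 8))
print(log_p.exp().sum(dim=0))  # each column sums to 1 over valid assignments only
```

Because the support is clipped to the constraint's solutions before normalization, the layer can still be trained end-to-end by maximum likelihood (negative log-likelihood of the gold valid assignment), which is the training setup the abstract describes.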