In this lecture, we will look at the idea of compiling logical constraints into loss functions. After introducing the general principle, we will see two approaches: semantic loss and t-norm losses. Minimal code sketches of both losses appear after the list below.
Lectures
- Semantic loss
- T-norm losses
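
To make the semantic loss concrete, here is a minimal sketch that computes it by brute-force enumeration of satisfying assignments. It is only an illustration of the definition: the helper name `semantic_loss` and the predicate-based constraint encoding are illustrative choices, and Xu et al. (2018) instead compile the constraint into an arithmetic circuit so that the same quantity can be computed without the exponential enumeration.

```python
import itertools

import torch


def semantic_loss(probs: torch.Tensor, constraint) -> torch.Tensor:
    """Semantic loss of `constraint` w.r.t. independent Bernoulli outputs.

    probs:      shape (n,) probabilities p_i = P(x_i = 1), e.g. sigmoids.
    constraint: predicate over a 0/1 tuple, True exactly on the
                satisfying assignments of the formula.

    Returns -log WMC, where WMC = sum over satisfying x of
    prod_i p_i^{x_i} * (1 - p_i)^{1 - x_i}  (weighted model counting).
    """
    n = probs.shape[0]
    wmc = probs.new_zeros(())
    for x in itertools.product([0, 1], repeat=n):
        if constraint(x):
            mask = torch.tensor(x, dtype=torch.bool)
            # Probability of this particular satisfying assignment.
            wmc = wmc + torch.where(mask, probs, 1.0 - probs).prod()
    return -torch.log(wmc)


# Example: an exactly-one constraint over three output variables.
logits = torch.randn(3, requires_grad=True)
p = torch.sigmoid(logits)
loss = semantic_loss(p, lambda x: sum(x) == 1)
loss.backward()  # gradients push probability mass toward one-hot assignments
```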
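
A t-norm loss instead relaxes each logical connective into a differentiable operation on probabilities and penalizes the resulting soft truth value. The sketch below assumes the product t-norm, whose residuated implication for a rule A → B is true to degree min(1, p_B / p_A), together with the common mapping of a truth value t to the loss -log t; the helper name and the example probabilities are hypothetical.

```python
import torch


def implication_loss(p_a: torch.Tensor, p_b: torch.Tensor,
                     eps: float = 1e-8) -> torch.Tensor:
    """Loss for a rule A -> B under the product t-norm.

    The residuated implication is true to degree min(1, p_b / p_a), so
    -log of that truth value is the hinge-style penalty below: it is zero
    whenever the model already believes B at least as strongly as A.
    """
    return torch.clamp(torch.log(p_a + eps) - torch.log(p_b + eps), min=0.0)


# Two hypothetical antecedent/consequent probability pairs; only the
# first pair (0.9 -> 0.5) violates the rule and incurs a nonzero loss.
p_a = torch.tensor([0.9, 0.3])
p_b = torch.tensor([0.5, 0.8])
loss = implication_loss(p_a, p_b).mean()
```

Under the same -log mapping, a conjunction A ∧ B relaxed by the product t-norm contributes -log p_A - log p_B, which is why product t-norm losses often reduce to sums of familiar cross-entropy-style terms.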
Readings and References
Semantic loss and weighted model counting
- Xu, Jingyi, Zilu Zhang, Tal Friedman, Yitao Liang, and Guy Van den Broeck. 2018. “A Semantic Loss Function for Deep Learning with Symbolic Knowledge.” In International Conference on Machine Learning, 5498–5507.
- Darwiche, Adnan, and Pierre Marquis. 2002. “A Knowledge Compilation Map.” Journal of Artificial Intelligence Research 17:229–64. https://doi.org/10.1613/jair.989.
T-norms and t-norm losses
- Li, Tao, Vivek Gupta, Maitrey Mehta, and Vivek Srikumar. 2019. “A Logic-Driven Framework for Consistency of Neural Models.” In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 3924–35. Hong Kong, China: Association for Computational Linguistics. https://doi.org/10/ghvf88.
- Medina Grespan, Mattia, Meghan Broadbent, Xinyao Zhang, Katherine Axford, Brent Kious, Zac Imel, and Vivek Srikumar. 2023. “Logic-Driven Indirect Supervision: An Application to Crisis Counseling.” In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 11704–22. Toronto, Canada: Association for Computational Linguistics. https://aclanthology.org/2023.acl-long.654.
- Klement, Erich Peter, Radko Mesiar, and Endre Pap. 2013. Triangular Norms. Vol. 8. Springer Science & Business Media.