19 Papers Accepted from LTI Authors

Tuesday, June 8, 2021 - by Bryan Burtner

LTI faculty and students are once again featured prominently at this year's conference of the North American chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL HLT 2021). The conference includes 19 papers with at least one LTI author, with 23 members of the LTI community represented in total.

NAACL HLT, now in its 19th year, is one of the world’s premier conferences in the fields of computational linguistics and natural language processing. The conference takes place remotely from June 6-11.

Papers and presentations including LTI researchers are as follows:

Multilingual Multimodal Pre-training for Zero-Shot Cross-Lingual Transfer of Vision-Language Models
Po-Yao Huang, Mandela Patrick, Junjie Hu, Graham Neubig, Florian Metze and Alexander Hauptmann

StylePTB: A Compositional Benchmark for Fine-grained Controllable Text Style Transfer
Yiwei Lyu, Paul Pu Liang, Hai Pham, Eduard Hovy, Barnabás Póczos, Ruslan Salakhutdinov and Louis-Philippe Morency

GSum: A General Framework for Guided Neural Abstractive Summarization
Zi-Yi Dou, Pengfei Liu, Hiroaki Hayashi, Zhengbao Jiang and Graham Neubig

Case Study: Deontological Ethics in NLP
Shrimai Prabhumoye, Brendon Boldt, Ruslan Salakhutdinov and Alan W Black

Multi-view Subword Regularization
Xinyi Wang, Sebastian Ruder and Graham Neubig

Larger-Context Tagging: When and Why Does It Work?
Jinlan Fu, Liangjing Feng, Qi Zhang, Xuanjing Huang and Pengfei Liu

Explicit Alignment Objectives for Multilingual Bidirectional Encoders
Junjie Hu, Melvin Johnson, Orhan Firat, Aditya Siddhant and Graham Neubig

On Learning Text Style Transfer with Direct Rewards
Yixin Liu, Graham Neubig and John Wieting

Focused Attention Improves Document-Grounded Generation
Shrimai Prabhumoye, Kazuma Hashimoto, Yingbo Zhou, Alan W Black and Ruslan Salakhutdinov

Compositional Generalization for Neural Semantic Parsing via Span-level Supervised Attention
Pengcheng Yin, Hao Fang, Graham Neubig, Adam Pauls, Emmanouil Antonios Platanios, Yu Su, Sam Thomson and Jacob Andreas

MetaXL: Meta Representation Transformation for Low-resource Cross-lingual Learning
Mengzhou Xia, Guoqing Zheng, Subhabrata Mukherjee, Milad Shokouhi, Graham Neubig and Ahmed Hassan Awadallah

RefSum: Refactoring Neural Summarization
Yixin Liu, Zi-Yi Dou and Pengfei Liu

MTAG: Modal-Temporal Attention Graph for Unaligned Human Multimodal Language Sequences
Jianing Yang, Yongxin Wang, Ruitao Yi, Yuying Zhu, Azaan Rehman, Amir Zadeh, Soujanya Poria and Louis-Philippe Morency

Understanding Factuality in Abstractive Summarization with FRANK: A Benchmark for Factuality Metrics
Artidoro Pagnoni, Vidhisha Balachandran and Yulia Tsvetkov

Searchable Hidden Intermediates for End-to-End Models of Decomposable Sequence Tasks
Siddharth Dalmia, Brian Yan, Vikas Raunak, Florian Metze and Shinji Watanabe

Controlling Dialogue Generation with Semantic Exemplars
Prakhar Gupta, Jeffrey Bigham, Yulia Tsvetkov and Amy Pavel

COIL: Revisit Exact Lexical Match in Information Retrieval with Contextualized Inverted List
Luyu Gao, Zhuyun Dai and Jamie Callan

Does syntax matter? A strong baseline for Aspect-based Sentiment Analysis with RoBERTa
Junqi Dai, Hang Yan, Tianxiang Sun, Pengfei Liu and Xipeng Qiu

Neural Language Modeling for Contextualized Temporal Graph Generation
Aman Madaan and Yiming Yang

For More Information, Contact:

Bryan Burtner | bburtner@cs.cmu.edu | 412-268-2805