ServiceNow Research
Tags
Transformers
Exploring Sparse Adapters for Scalable Merging of Parameter Efficient Experts
Merging parameter-efficient task experts has recently gained growing attention as a way to build modular architectures that can be …
Samin Yeasar Arnob, Oleksiy Ostapenko, Alessandro Sordoni, Lucas Caccia
Conference on Language Modeling (COLM), 2025.
PDF
Cite
AlignVLM: Bridging Vision and Language Latent Spaces for Multimodal Understanding
Aligning visual features with language embeddings is a key challenge in vision-language models (VLMs). The performance of such models …
Ahmed Masry, Juan A. Rodriguez, Tianyu Zhang, Suyuchen Wang, Chao Wang, Aarash Feizi, Akshay Kalkunte, Abhay Puri, Xiangru Jian, Pierre-André Noël, Sathwik Madhusudhan, Marco Pedersoli, Bang Liu, Nicolas Chapados, Yoshua Bengio, Enamul Hoque Prince, Christopher Pal, Issam H. Laradji, David Vazquez, Perouz Taslakian, Spandana Gella, Sai Rajeswar Mudumba
Workshop at the International Conference on Learning Representations (ICLR), 2025.
PDF
Cite
Exploring Sparse Adapters for Scalable Merging of Parameter Efficient Experts
Merging parameter-efficient task experts has recently gained growing attention as a way to build modular architectures that can be …
Samin Yeasar Arnob, Zhan Su, Minseon Kim, Oleksiy Ostapenko, Doina Precup, Lucas Caccia, Alessandro Sordoni
Workshop at the International Conference on Learning Representations (ICLR), 2025.
PDF
Cite
TACTIS-2: Better, Faster, Simpler Attentional Copulas for Multivariate Time Series
We introduce a new model for multivariate probabilistic time series prediction, designed to flexibly address a range of tasks including …
Arjun Ashok, Étienne Marcotte, Valentina Zantedeschi, Nicolas Chapados, Alexandre Drouin
Montreal AI Symposium (MAIS), 2024.
PDF
Cite
Code
Video
TACTIS-2: Better, Faster, Simpler Attentional Copulas for Multivariate Time Series
We introduce a new model for multivariate probabilistic time series prediction, designed to flexibly address a range of tasks including …
Arjun Ashok, Étienne Marcotte, Valentina Zantedeschi, Nicolas Chapados, Alexandre Drouin
International Conference on Learning Representations (ICLR), 2024.
PDF
Cite
Code
Video
LLM aided semi-supervision for efficient Extractive Dialog Summarization
Generating high-quality summaries for chat dialogs often requires large labeled datasets. We propose a method to efficiently use …
Nishant Mishra, Gaurav Sahu, Iacer Calixto, Ameen Abu-Hanna, Issam H. Laradji
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2023.
PDF
Cite
Does entity abstraction help generative Transformers reason?
We study the utility of incorporating entity type abstractions into pre-trained Transformers and test these methods on four NLP tasks …
Nicolas Gontier, Siva Reddy, Christopher Pal
Transactions on Machine Learning Research, 2022.
PDF
Cite
Code
TACTiS: Transformer-Attentional Copulas for Time Series
The estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance. …
Alexandre Drouin, Étienne Marcotte, Nicolas Chapados
International Conference on Machine Learning (ICML), 2022.
PDF
Cite
Code
Model Card
Slides
Video
Latent Variable Sequential Set Transformers for Joint Multi-Agent Motion Prediction
Robust multi-agent trajectory prediction is essential for the safe control of robotic systems. A major challenge is to efficiently …
Roger Girgis, Florian Golemo, Felipe Codevilla, Martin Weiss, Jim Aldon D'Souza, Samira Ebrahimi Kahou, Felix Heide, Christopher Pal
International Conference on Learning Representations (ICLR), 2022.
PDF
Cite
Code
Systematic Generalization with Edge Transformers
Recent research suggests that systematic generalization in natural language understanding remains a challenge for state-of-the-art …
Leon Bergen, Timothy J. O'Donnell, Dzmitry Bahdanau
Conference on Neural Information Processing Systems (NeurIPS), 2021.
PDF
Cite