ServiceNow Research

Compositional Generalization

Exploring Sparse Adapters for Scalable Merging of Parameter Efficient Experts
Merging parameter-efficient task experts has recently attracted growing attention as a way to build modular architectures that can be …
Egocentric Planning for Scalable Embodied Task Achievement
Embodied agents face significant challenges when tasked with performing actions in diverse environments, particularly in generalizing …
On the Compositional Generalization Gap of In-Context Learning
Pretrained large generative language models have shown great performance on many tasks, but exhibit low compositional generalization …
Scaling up ML-based Black-box Planning with Partial STRIPS Models
A popular approach for sequential decision-making is to perform simulator-based search guided with Machine Learning (ML) methods like …
A Planning based Neural-Symbolic Approach for Embodied Instruction Following
The ALFRED environment features embodied instruction following tasks in simulated home environments. However, end-to-end deep learning …
Compositional Generalization in Dependency Parsing
Compositionality, or the ability to combine familiar units like words into novel phrases and sentences, has been the focus of intense …
Object-centric Compositional Imagination for Visual Abstract Reasoning
Like humans devoid of imagination, current machine learning systems lack the ability to adapt to new, unexpected situations by …
Continual Learning via Local Module Composition
Modularity is a compelling solution to continual learning (CL), the problem of modeling sequences of related tasks. Learning and then …