ServiceNow Research

Attention

Attention for Compositional Modularity
Modularity and compositionality are promising inductive biases for addressing longstanding problems in machine learning such as better …
Neural Attentive Circuits
Recent work has seen the development of general purpose neural architectures that can be trained to perform tasks across diverse data …
Pay Attention to the Activations: A Modular Attention Mechanism for Fine-Grained Image Recognition
Fine-grained image recognition is central to many multimedia tasks such as search, retrieval, and captioning. Unfortunately, these tasks …