
On the impressive performance of randomly weighted encoders in summarization tasks
In this work, we investigate the performance of untrained, randomly initialized encoders in a general class of sequence-to-sequence …
Structure Learning for Neural Module Networks
Neural Module Networks, originally proposed for the task of visual question answering, are a class of neural network architectures that …
Efficient Deep Gaussian Process Models for Variable-Sized Inputs
Deep Gaussian processes (DGPs) have appealing Bayesian properties, can handle variable-sized data, and learn deep features. Their …
Searching for Markovian Subproblems to Address Partially Observable Reinforcement Learning
In partially observable environments, an agent’s policy should often be a function of the history of its interaction with the …
Context-Aware Visual Compatibility Prediction
How do we determine whether two or more clothing items are compatible or visually appealing? Part of the answer lies in understanding …
Hierarchical Importance Weighted Autoencoders
Importance weighted variational inference (Burda et al., 2015) uses multiple i.i.d. samples to obtain a tighter variational lower bound. …
Investigating Trust Factors in Human-Robot Shared Control: Implicit Gender Bias Around Robot Voice
This paper explores the impact of warnings, audio feedback, and gender on human-robot trust in the context of autonomous driving and …
BabyAI: A Platform to Study the Sample Efficiency of Grounded Language Learning
Allowing humans to interactively train artificial agents to understand language instructions is desirable for both practical and …
Meta-Learning Framework with Applications to Zero-Shot Time Series Forecasting
Can meta-learning discover generic ways of processing time series (TS) from a diverse dataset so as to greatly improve generalization …
On Difficulties of Probability Distillation
Probability distillation has recently been of interest to deep learning practitioners as it presents a practical solution for sampling …