Deep learning has thrived mainly by training on large-scale datasets. For continual learning in applications such as robotics, however, it is critical to incrementally update the model in a sample-efficient manner. We propose a novel method that constructs the new class weights from the few labelled samples in the support set, while also updating the previously learned classes. Inspired by work on adaptive correlation filters, we propose an adaptive masked imprinted weights method. It applies masked average pooling to the output embeddings of the support set, producing a positive proxy for each new class. This proxy is then used to adaptively update the 1x1 convolutional filters responsible for the final classification. Our proposed method is evaluated on the PASCAL-5i dataset and outperforms the state of the art in 5-shot semantic segmentation. Unlike previous methods, our approach does not require a second branch to estimate parameters or prototypes, and it enables the adaptation of previously learned weights. We further propose a novel setup for evaluating incremental object segmentation, which we term incremental PASCAL (iPASCAL), on which our adaptation method is shown to outperform the baseline.
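
The two core operations described above can be sketched as follows, a minimal NumPy illustration under assumed shapes and a hypothetical blending coefficient `alpha` (not specified in the abstract): masked average pooling of support embeddings into a class proxy, and an adaptive convex-combination update of the 1x1 classifier weights in the spirit of adaptive correlation filters.

```python
import numpy as np

def masked_average_pool(feat, mask):
    """Average the embedding `feat` (C, H, W) over the pixels where the
    binary support `mask` (H, W) is 1, yielding a (C,) class proxy."""
    m = mask.astype(feat.dtype)
    denom = m.sum() + 1e-8  # avoid division by zero for empty masks
    return (feat * m[None]).sum(axis=(1, 2)) / denom

def adapt_weights(weights, proxy, cls, alpha=0.3):
    """Adaptively imprint `proxy` into row `cls` of the flattened
    1x1-conv `weights` (num_classes, C) via a convex combination;
    `alpha` is an assumed update rate, not a value from the paper."""
    w = weights.copy()
    w[cls] = (1.0 - alpha) * w[cls] + alpha * proxy
    return w

# Toy usage: one support embedding, one new class slot.
feat = np.random.rand(4, 5, 5)          # C=4 embedding channels
mask = np.zeros((5, 5)); mask[1:4, 1:4] = 1.0
proxy = masked_average_pool(feat, mask)  # (4,) positive proxy
weights = np.zeros((3, 4))               # 3 classes, imprint class 2
weights = adapt_weights(weights, proxy, cls=2, alpha=0.5)
```

For a brand-new class with no previous weight (a zero row, as here), the update reduces to scaled imprinting of the proxy itself; for an already-learned class it blends old and new evidence.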