
Explorable Super Resolution

Yuval Bahat & Tomer Michaeli

Abstract

Single image super resolution (SR) has seen major performance leaps in recent years. However, existing methods do not allow exploring the infinitely many plausible reconstructions that might have given rise to the observed low-resolution (LR) image. These different explanations of the LR image may vary dramatically in their textures and fine details, and may often encode completely different semantic information. In this work, we introduce the task of explorable super resolution. We propose a framework comprising a graphical user interface with a neural network backend, which allows editing the SR output so as to explore the abundance of plausible high-resolution (HR) explanations of the LR input. At the heart of our method is a novel module that can wrap any existing SR network, analytically guaranteeing that its SR outputs precisely match the LR input when downsampled. Besides its importance in our setting, this module is guaranteed to decrease the reconstruction error of any SR network it wraps, and can be used to cope with blur kernels different from the one the network was trained for. We illustrate our approach in a variety of use cases, ranging from medical imaging and forensics to graphics.
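
To make the consistency idea concrete, here is a minimal PyTorch sketch. It is *not* the paper's consistency enforcing module, which applies an exact analytical projection derived from the downsampling kernel; instead, it approximates the same constraint with iterative back-projection, assuming a bicubic downsampling kernel:

```python
import torch
import torch.nn.functional as F

def enforce_consistency(sr, lr, scale=4, num_iters=10):
    """Nudge an SR image toward the set of HR images whose downsampled
    version matches the observed LR image (iterative back-projection).
    A sketch of the consistency idea only; the paper's module instead
    applies an exact analytical projection for the given blur kernel."""
    x = sr.clone()
    for _ in range(num_iters):
        # How far the current estimate is from explaining the LR input.
        residual = lr - F.interpolate(
            x, scale_factor=1 / scale, mode='bicubic', align_corners=False)
        # Push the upsampled residual back into the HR estimate.
        x = x + F.interpolate(
            residual, scale_factor=scale, mode='bicubic', align_corners=False)
    return x
```

Each iteration measures the mismatch between the LR input and the downsampled estimate, then feeds the upsampled residual back, so repeated iterations drive the downsampling error toward zero.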

Try it yourself :mag:

Use our GUI to explore the infinitely many high-resolution images corresponding to an input low-resolution image. You can use our pre-trained explorable super-resolution model, or train one yourself. You can also use our consistency enforcing module (CEM) to guarantee consistency for any super-resolution model, whether or not it supports exploration. Details are in our code repository.
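
As a quick, fully synthetic check of the sketch above (a plain bicubic upsampler stands in for a real SR network; no repository code is used):

```python
# Synthetic demo: a bicubic upsampler stands in for a real SR network.
lr = torch.rand(1, 3, 32, 32)
sr = F.interpolate(lr, scale_factor=4, mode='bicubic', align_corners=False)
sr = enforce_consistency(sr, lr, scale=4)

# The corrected output should now (approximately) reproduce the
# LR input when downsampled again.
back = F.interpolate(sr, scale_factor=1 / 4, mode='bicubic', align_corners=False)
print((back - lr).abs().max())  # residual shrinks with more iterations
```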

Resources

  1. Paper (including supplementary material)
  2. CVPR 2020 oral presentation (5 mins)
  3. MIT vision seminar talk (47 mins, slides)
  4. Code

BibTeX

@inproceedings{bahat2020explorable,
  title={Explorable Super Resolution},
  author={Bahat, Yuval and Michaeli, Tomer},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={2716--2725},
  year={2020}
}