Novel scene generation, merging and stitching views using the 2D affine space

Kuntal Sengupta, Jun Ohya

Research output: Contribution to journal › Article

2 Citations (Scopus)

Abstract

In this paper we present a unified theoretical framework for novel scene synthesis, merging real and virtual worlds, and view stitching. We start with a set of real images from weakly calibrated cameras, for which we compute dense point match correspondences. For applications like novel view synthesis, one may first solve the 3D scene reconstruction problem and then render the new view. However, errors in 3D scene reconstruction usually get reflected in the quality of the generated scene, so we seek a more direct method. In this paper, we use the knowledge of dense point matches and their affine coordinate values to estimate the corresponding affine coordinate values in the new scene. Our reprojection technique is extended to other applications such as merging real and synthetic worlds, and view stitching.
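The reprojection idea described in the abstract, transferring a point into a new view via its affine coordinates rather than via explicit 3D reconstruction, can be illustrated with a minimal sketch. This is not the paper's actual algorithm (which works with dense matches across weakly calibrated views); the function names and the choice of a three-point 2D affine frame are assumptions for illustration only.

```python
import numpy as np

def affine_coords(p, o, a, b):
    """Affine coordinates (alpha, beta) of 2D point p in the frame defined by
    origin o and basis points a, b, so that p = o + alpha*(a-o) + beta*(b-o)."""
    basis = np.column_stack((a - o, b - o))   # 2x2 matrix of frame vectors
    return np.linalg.solve(basis, p - o)      # (alpha, beta)

def reproject(coords, o_new, a_new, b_new):
    """Rebuild the point in a new view from its affine coordinates and the
    positions of the same basis points in that view (assumes the affine
    coordinates transfer between the views)."""
    alpha, beta = coords
    return o_new + alpha * (a_new - o_new) + beta * (b_new - o_new)

# Toy usage: a point expressed in a reference view's affine frame ...
o, a, b = np.array([0.0, 0.0]), np.array([1.0, 0.0]), np.array([0.0, 1.0])
p = np.array([0.4, 0.7])
coords = affine_coords(p, o, a, b)

# ... transferred into a hypothetical new view where the basis points have moved.
o2, a2, b2 = np.array([10.0, 5.0]), np.array([12.0, 5.5]), np.array([10.5, 7.0])
print(reproject(coords, o2, a2, b2))   # estimated position of p in the new view
```

The key design point this sketch mirrors is that new-view positions are computed directly from matched point coordinates, so errors never pass through an intermediate 3D reconstruction step.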

Original language: English
Pages (from-to): 39-53
Number of pages: 15
Journal: Signal Processing: Image Communication
Volume: 14
Issue number: 1-2
Publication status: Published - 1998 Nov 6
Externally published: Yes

Keywords

  • Image based rendering
  • Reprojection
  • View generation

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
  • Signal Processing
  • Electrical and Electronic Engineering