Novel scene generation, merging and stitching views using the 2D affine space

Kuntal Sengupta, Jun Ohya

Research output: Contribution to journal › Article

2 Citations (Scopus)

Abstract

In this paper we present a unified theoretical framework for novel scene synthesis, merging real and virtual worlds, and view stitching. To start with, we have a set of real images from weakly calibrated cameras, for which we compute the dense point match correspondences. For applications like novel view synthesis, one may first solve the 3D scene reconstruction problem, followed by a view rendering process. However, errors in 3D scene reconstruction usually get reflected in the quality of the new scene generated, so we seek a more direct method. In this paper, we use the knowledge of dense point matches and their affine coordinate values to estimate the corresponding affine coordinate values in the new scene. Our technique of reprojection is extended for other applications like merging real and synthetic worlds, and view stitching.
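The reprojection idea summarized in the abstract can be sketched in a few lines, under the standard assumption behind 2D affine-space methods: with an affine camera model, a point expressed as an affine combination of four basis scene points maps to the same affine combination of those basis points' images in every view. The function names (`affine_coords`, `reproject`) and the least-squares recovery step below are illustrative, not the authors' implementation.

```python
import numpy as np

def affine_coords(p_views, basis_views):
    """Recover a point's affine coordinates (a, b, c) from its image
    positions in two or more reference views.

    p_views     : list of length-2 arrays, the point's position per view
    basis_views : list of (4, 2) arrays, the four basis points per view
    """
    A, b = [], []
    for p, B in zip(p_views, basis_views):
        # In each view: p - b0 = [b1-b0, b2-b0, b3-b0] @ (a, b, c)
        A.append((B[1:] - B[0]).T)   # 2x3 block per view
        b.append(p - B[0])
    A, b = np.vstack(A), np.concatenate(b)
    coords, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coords

def reproject(coords, basis_new):
    """Transfer the point into a novel view: apply the same affine
    combination to the basis points' positions in that view."""
    return basis_new[0] + (basis_new[1:] - basis_new[0]).T @ coords
```

Because the affine combination is preserved by affine projection, no explicit 3D reconstruction is needed: two reference views give four linear equations in the three unknown coordinates, and the recovered coordinates transfer the point directly into any view where the four basis points are visible.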

Original language: English
Pages (from-to): 39-53
Number of pages: 15
Journal: Signal Processing: Image Communication
Volume: 14
Issue number: 1-2
Publication status: Published - 1998 Nov 6
Externally published: Yes


Keywords

  • Image based rendering
  • Reprojection
  • View generation

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
  • Signal Processing
  • Electrical and Electronic Engineering

Cite this

Novel scene generation, merging and stitching views using the 2D affine space. / Sengupta, Kuntal; Ohya, Jun.

In: Signal Processing: Image Communication, Vol. 14, No. 1-2, 06.11.1998, p. 39-53.

@article{3736ec7b83f04965a8a1efd503cf80db,
title = "Novel scene generation, merging and stitching views using the 2D affine space",
abstract = "In this paper we present a unified theoretical framework for novel scene synthesis, merging real and virtual worlds, and view stitching. To start with, we have a set of real images from weakly calibrated cameras, for which we compute the dense point match correspondences. For applications like novel view synthesis, one may first solve the 3D scene reconstruction problem, followed by a view rendering process. However, errors in 3D scene reconstruction usually get reflected in the quality of the new scene generated, so we seek a more direct method. In this paper, we use the knowledge of dense point matches and their affine coordinate values to estimate the corresponding affine coordinate values in the new scene. Our technique of reprojection is extended for other applications like merging real and synthetic worlds, and view stitching.",
keywords = "Image based rendering, Reprojection, View generation",
author = "Kuntal Sengupta and Jun Ohya",
year = "1998",
month = "11",
day = "6",
language = "English",
volume = "14",
pages = "39--53",
journal = "Signal Processing: Image Communication",
issn = "0923-5965",
publisher = "Elsevier",
number = "1-2",

}

TY - JOUR

T1 - Novel scene generation, merging and stitching views using the 2D affine space

AU - Sengupta, Kuntal

AU - Ohya, Jun

PY - 1998/11/6

Y1 - 1998/11/6

AB - In this paper we present a unified theoretical framework for novel scene synthesis, merging real and virtual worlds, and view stitching. To start with, we have a set of real images from weakly calibrated cameras, for which we compute the dense point match correspondences. For applications like novel view synthesis, one may first solve the 3D scene reconstruction problem, followed by a view rendering process. However, errors in 3D scene reconstruction usually get reflected in the quality of the new scene generated, so we seek a more direct method. In this paper, we use the knowledge of dense point matches and their affine coordinate values to estimate the corresponding affine coordinate values in the new scene. Our technique of reprojection is extended for other applications like merging real and synthetic worlds, and view stitching.

KW - Image based rendering

KW - Reprojection

KW - View generation

UR - http://www.scopus.com/inward/record.url?scp=0032206518&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0032206518&partnerID=8YFLogxK

M3 - Article

AN - SCOPUS:0032206518

VL - 14

SP - 39

EP - 53

JO - Signal Processing: Image Communication

JF - Signal Processing: Image Communication

SN - 0923-5965

IS - 1-2

ER -