Video stream retrieval based on temporal feature of frame difference

Mikito Toguro*, Kenji Suzuki, Pitoyo Hartono, Shuji Hashimoto

*Corresponding author for this work

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    4 Citations (Scopus)

    Abstract

    In recent years, enormous amounts of digital video have become easily accessible in standardized formats such as MPEG. However, finding a desired video stream in a video database within a reasonably short time is not easy, so computationally efficient search methods are required for video stream retrieval. In this paper, we propose a novel method for video stream retrieval that uses a video stream as the search key. The objective is to find the part of a video stream stored in the database that is similar to the given key stream. The method extracts characteristics of the frame transitions in video streams and uses these characteristics for similarity matching. Experimental results are also presented to verify the efficiency of the proposed method.
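
    The abstract describes the approach only at a high level. As a rough illustration (not the paper's actual algorithm), the minimal sketch below reduces each video to a 1-D temporal signature of mean absolute frame differences and slides the key's signature over the database video's signature to find the most similar segment. The function names, the grayscale-frame input format, and the L2 matching criterion are all assumptions made for illustration.

```python
import numpy as np

def temporal_signature(frames):
    # frames: array of shape (T, H, W), grayscale frames.
    # The mean absolute difference between consecutive frames gives a
    # 1-D signature of length T-1 describing frame transitions.
    diffs = np.abs(np.diff(frames.astype(np.float64), axis=0))
    return diffs.mean(axis=(1, 2))

def best_match(db_sig, key_sig):
    # Slide the key signature over the database signature and return the
    # offset with the smallest L2 distance (assumed similarity criterion).
    n, m = len(db_sig), len(key_sig)
    dists = [np.linalg.norm(db_sig[i:i + m] - key_sig)
             for i in range(n - m + 1)]
    best = int(np.argmin(dists))
    return best, float(dists[best])

# Usage: locate a 30-frame key clip inside a 300-frame database video.
rng = np.random.default_rng(0)
db = rng.integers(0, 256, size=(300, 64, 64), dtype=np.uint8)
key = db[120:150]  # key clip cut from the database video itself
offset, dist = best_match(temporal_signature(db), temporal_signature(key))
print(offset, dist)  # expected offset: 120, distance: 0.0
```

    Matching 1-D signatures rather than raw frames is what makes such a scheme computationally cheap: the per-window cost depends only on the key length, not on the frame resolution.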

    Original language: English
    Title of host publication: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
    Volume: II
    DOIs
    Publication status: Published - 2005
    Event: 2005 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP '05 - Philadelphia, PA
    Duration: 2005 Mar 18 - 2005 Mar 23

    Other

    Other: 2005 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP '05
    City: Philadelphia, PA
    Period: 05/3/18 - 05/3/23

    Keywords

    • Feature matrix
    • Frame transition
    • Similarity matching
    • Video stream retrieval

    ASJC Scopus subject areas

    • Electrical and Electronic Engineering
    • Signal Processing
    • Acoustics and Ultrasonics
