Highly parallel fractional motion estimation engine for super hi-vision 4k×4k@60 fps

Yiqing Huang, Takeshi Ikenaga

Research output: Contribution to journal › Article

Abstract

A Super Hi-Vision (SHV) 4k×4k@60 fps fractional motion estimation (FME) engine is proposed in this paper. First, two complexity-reduction schemes are proposed at the algorithm level. By analyzing the integer motion cost of sub-blocks in each inter mode, the mode-reduction-based mode pre-filtering scheme achieves a 48% clock-cycle saving compared with the previous algorithm. By further checking the motion cost of search points around the best integer candidate, the motion-cost-oriented directional one-pass scheme provides a 50% clock-cycle saving and a 36% reduction in the number of processing units (PUs). Second, at the hardware level, two parallelism-improvement schemes, namely 16-pel processing and an MB-parallel scheme, are presented, which reduce the required operating frequency for SHV FME processing to only 145 MHz. In addition, quarter sub-sampling is adopted in our design, reducing the hardware cost of each PU by 75%. Third, a unified pixel-block loading scheme is proposed: about 28.67% to 86.39% of pixels are reused, and the corresponding memory accesses are saved. Furthermore, a parity pixel organization scheme is presented to resolve the memory-access conflicts of the MB-parallel scheme. Using TSMC 0.18 μm technology under worst-case operating conditions (1.62 V, 125°C), our FME engine achieves real-time processing for SHV 4k×4k@60 fps with 412k gates.
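To illustrate the quarter sub-sampling idea mentioned in the abstract, the sketch below compares a full-resolution sum-of-absolute-differences (SAD) against one that evaluates only 1 of every 4 pixels (every other row and column), which is why the distortion datapath of each PU can shrink by roughly 75%. This is a simplified software model, not the paper's hardware design; the function names and block layout are illustrative assumptions.

```python
def sad_full(cur, ref):
    # Full-resolution SAD: every pixel of the block contributes.
    return sum(abs(c - r)
               for row_c, row_r in zip(cur, ref)
               for c, r in zip(row_c, row_r))

def sad_quarter_subsampled(cur, ref):
    # Quarter sub-sampling: only pixels at even (row, col) positions are
    # evaluated, i.e. 1 out of every 4, so a hardware PU needs ~25% of the
    # absolute-difference/adder resources of the full-resolution version.
    return sum(abs(cur[y][x] - ref[y][x])
               for y in range(0, len(cur), 2)
               for x in range(0, len(cur[0]), 2))

# Example: a 4x4 current block versus a reference block.
cur = [[1, 1, 1, 1]] * 4
ref = [[0, 0, 0, 0]] * 4
print(sad_full(cur, ref))              # 16 pixel differences summed
print(sad_quarter_subsampled(cur, ref))  # only 4 of the 16 pixels summed
```

The sub-sampled cost is an approximation of the true distortion, but for fractional refinement around an already-good integer candidate the ranking of nearby search points is usually preserved, which is the trade-off that makes the 75% hardware saving attractive.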

Original language: English
Pages (from-to): 244-252
Number of pages: 9
Journal: IEICE Transactions on Electronics
Volume: E93-C
Issue number: 3
DOIs
Publication status: Published - 2010 Jan 1

Keywords

  • FME
  • H.264/AVC
  • Super hi-vision
  • VLSI

ASJC Scopus subject areas

  • Electronic, Optical and Magnetic Materials
  • Electrical and Electronic Engineering

