TY - GEN
T1 - A Robust Driver's Gaze Zone Classification using a Single Camera for Self-occlusions and Non-aligned Head and Eyes Direction Driving Situations
AU - Lollett, Catherine
AU - Hayashi, Hiroaki
AU - Kamezaki, Mitsuhiro
AU - Sugano, Shigeki
N1 - Funding Information:
The authors would like to thank the Driving Interface Team of Sugano's Laboratory at Waseda University, all the subjects for their support, and the Research Institute for Science and Engineering of Waseda University.
Publisher Copyright:
© 2020 IEEE.
PY - 2020/10/11
Y1 - 2020/10/11
N2 - Distracted driving is one of the most common causes of traffic accidents worldwide. Recognizing the driver's gaze direction during a maneuver could be an essential step toward avoiding such accidents. Thus, we propose a gaze zone classification system that serves as a basis for systems supporting driver situation awareness. The challenge, however, is to estimate the driver's gaze in non-ideal scenarios, specifically, in this work, scenarios involving self-occlusions or a non-aligned head and eye direction of the driver. First, to address misclassifications in self-occlusion scenarios, we designed a novel protocol in which the driver's full 3D facial geometry is reconstructed from a single 2D image using the state-of-the-art method PRNet. To address misclassifications when the driver's head and eye directions are not aligned, both eye and head information are extracted. Then, by combining different data pre-processing and deep learning methods, we obtain a classifier that remains robust when self-occlusions occur or when the driver's head and eye directions are not aligned. Our experimental results explicitly measure and show that the proposed method classifies accurately under both of the aforementioned problems. Moreover, we demonstrate that our model generalizes to new drivers while remaining a portable and extensible system, making it easily adaptable to various automobiles.
AB - Distracted driving is one of the most common causes of traffic accidents worldwide. Recognizing the driver's gaze direction during a maneuver could be an essential step toward avoiding such accidents. Thus, we propose a gaze zone classification system that serves as a basis for systems supporting driver situation awareness. The challenge, however, is to estimate the driver's gaze in non-ideal scenarios, specifically, in this work, scenarios involving self-occlusions or a non-aligned head and eye direction of the driver. First, to address misclassifications in self-occlusion scenarios, we designed a novel protocol in which the driver's full 3D facial geometry is reconstructed from a single 2D image using the state-of-the-art method PRNet. To address misclassifications when the driver's head and eye directions are not aligned, both eye and head information are extracted. Then, by combining different data pre-processing and deep learning methods, we obtain a classifier that remains robust when self-occlusions occur or when the driver's head and eye directions are not aligned. Our experimental results explicitly measure and show that the proposed method classifies accurately under both of the aforementioned problems. Moreover, we demonstrate that our model generalizes to new drivers while remaining a portable and extensible system, making it easily adaptable to various automobiles.
UR - http://www.scopus.com/inward/record.url?scp=85098847449&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85098847449&partnerID=8YFLogxK
U2 - 10.1109/SMC42975.2020.9283470
DO - 10.1109/SMC42975.2020.9283470
M3 - Conference contribution
AN - SCOPUS:85098847449
T3 - Conference Proceedings - IEEE International Conference on Systems, Man and Cybernetics
SP - 4302
EP - 4308
BT - 2020 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2020
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2020 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2020
Y2 - 11 October 2020 through 14 October 2020
ER -