In this paper, we propose a method for building a 3D environmental map from images captured by an omnidirectional camera. Conventional approaches that apply Simultaneous Localization and Mapping (SLAM) to fisheye images suffer from geometric distortion of the extracted feature points. To address this problem, we transform the images captured by the omnidirectional camera using cube mapping and apply SLAM to the video formed by concatenating the transformed face images. Feature points covering the full 360 degrees around the camera are then merged into a single map. Experimental results show that our method outperforms existing methods in both the number and the accuracy of the obtained point cloud.
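The cube-mapping step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the omnidirectional camera outputs an equirectangular panorama, and the function name `equirect_to_cube_face` and its axis conventions are our own choices for illustration.

```python
import numpy as np

def equirect_to_cube_face(equirect, face_size=256):
    """Project the front (+Z) face of a cube map from an
    equirectangular panorama (H x W x C numpy array).
    Illustrative helper; the paper may use different conventions."""
    h, w = equirect.shape[:2]
    # Pixel grid on the face plane, normalized to [-1, 1]
    u = (np.arange(face_size) + 0.5) / face_size * 2 - 1
    x, y = np.meshgrid(u, u)
    z = np.ones_like(x)                  # front face lies at z = +1
    # Ray directions -> spherical angles (longitude, latitude)
    lon = np.arctan2(x, z)               # in [-pi/4, pi/4] for this face
    lat = np.arctan2(y, np.hypot(x, z))  # in roughly [-pi/4, pi/4]
    # Spherical angles -> equirectangular pixel coordinates
    px = ((lon / np.pi + 1) / 2 * (w - 1)).astype(int)
    py = ((lat / (np.pi / 2) + 1) / 2 * (h - 1)).astype(int)
    # Nearest-neighbor sampling of the panorama
    return equirect[py, px]
```

Each of the six cube faces is a perspective (rectilinear) view with a 90-degree field of view, so feature points extracted from the faces are free of the fisheye distortion that hampers conventional SLAM.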