Omnidirectional Stereo Dataset


We present synthetic datasets for omnidirectional stereo. We virtually implement a camera rig with four fisheye cameras mounted on it, and render the datasets using Blender.

Datasets


Synthetic Urban Datasets

Each dataset consists of 1000 sequential frames of city landscapes, which we split into two parts: the first 700 frames for training and the last 300 for testing.

The scenes are rendered under three lighting conditions:

  • Sunny
  • Cloudy
  • Sunset
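
A minimal sketch of the 700/300 split described above is given below; the frame numbering and file naming are assumptions for illustration, not the official format of the release.

    # Minimal sketch of the 700/300 train-test split; frame indices are
    # assumed to run 0..999 in temporal order (file naming is hypothetical).
    NUM_FRAMES = 1000
    NUM_TRAIN = 700

    train_ids = range(NUM_TRAIN)             # frames 0-699 for training
    test_ids = range(NUM_TRAIN, NUM_FRAMES)  # frames 700-999 for testing

    train_files = [f"{i:04d}.png" for i in train_ids]
    test_files = [f"{i:04d}.png" for i in test_ids]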

Input images

Each frame provides four fisheye views, each covering a 220° field of view:

  • Front
  • Right
  • Rear
  • Left

Omnidirectional depth map

For each frame, the ground truth is provided as an omnidirectional inverse depth map, together with a reference panorama image.
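
A rough usage sketch follows; the directory layout, file names, and the assumption that the ground truth stores inverse depth values (1/d) are illustrative guesses rather than the official format of the release.

    import numpy as np
    import cv2  # OpenCV, used here only to read the fisheye images

    frame_id = 0
    views = ["front", "right", "rear", "left"]

    # Load the four fisheye views of one frame (paths are hypothetical).
    fisheye_images = [cv2.imread(f"sunny/{v}/{frame_id:04d}.png") for v in views]

    # Load the omnidirectional inverse depth map and convert it to metric depth,
    # guarding against zero values (points at infinity).
    inv_depth = np.load(f"sunny/inv_depth/{frame_id:04d}.npy")
    depth = np.where(inv_depth > 0, 1.0 / inv_depth, np.inf)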

Download

Paper


  • Changhee Won, Jongbin Ryu, and Jongwoo Lim, "SweepNet: Wide-baseline Omnidirectional Depth Estimation", in ICRA 2019. [arxiv] [code]

Citation


Will be updated soon