SeasonDepth_testset.zip (10.07 GB)
SeasonDepth: Cross-Season Monocular Depth Prediction Dataset
Version 2 2021-06-05, 05:35
Version 1 2021-06-05, 05:07
dataset posted on 2021-06-05, 05:35, authored by Hanjiang Hu, Baoquan Yang, Zhijian Qiao, Ding Zhao, Hesheng Wang
Changing environments pose a great challenge to outdoor visual perception and scene understanding for robust long-term autonomous driving and mobile robots, where depth-auxiliary geometric information plays an essential role in robustness under challenging scenes. Although monocular depth prediction has been well studied recently, little work has focused on depth prediction across multiple environmental conditions, e.g., changing illumination and seasons, owing to the lack of such a real-world dataset and benchmark. In this work, a new cross-season scaleless monocular depth prediction dataset, SeasonDepth (available at https://seasondepth.github.io), is derived from the CMU Visual Localization dataset through structure from motion. To benchmark depth estimation performance under different environments, we evaluate representative and recent state-of-the-art open-source supervised, self-supervised, and domain-adaptation depth prediction methods from the KITTI benchmark using several newly formulated metrics. Through extensive experimental evaluation on the proposed dataset without fine-tuning, we analyze the influence of multiple environments on the mean and variance of performance, showing that long-term monocular depth prediction is far from solved. We further suggest promising solutions, especially self-supervised stereo geometry and multi-task training, to enhance robustness to changing environments.
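Because the dataset is scaleless (depth comes from structure from motion, so absolute scale is unknown), predictions must be aligned to the ground truth before computing error. A minimal sketch of one common convention, per-image median scaling followed by absolute relative error as used in KITTI self-supervised evaluation, is shown below; the exact metrics formulated in the SeasonDepth benchmark may differ, and the function name here is illustrative.

```python
import numpy as np

def scaleless_absrel(pred, gt, eps=1e-6):
    """Scale-invariant absolute relative depth error (illustrative).

    Aligns the scaleless prediction to ground truth by matching medians,
    then averages |pred - gt| / gt over valid pixels.
    """
    mask = gt > eps                                 # ignore invalid (zero) depths
    pred, gt = pred[mask], gt[mask]
    pred = pred * np.median(gt) / np.median(pred)   # per-image median alignment
    return float(np.mean(np.abs(pred - gt) / gt))
```

Under this convention a prediction that differs from the ground truth only by a global scale factor scores zero error, which is the desired behavior for SfM-derived, scaleless depth maps.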