Info hash | f81ba64cfd597b0185d5d66ffe74a75b6ee0a80d |
Last mirror activity | 13:21 ago |
Size | 82.69GB (82,687,724,241 bytes) |
Added | 2024-12-25 20:49:26 |
Views | 11 |
Hits | 27 |
ID | 5262 |
Type | multi |
Downloaded | 10 time(s) |
Uploaded by | casey |
Folder | urban_traversability |
Num files | 23 files |
Mirrors | 7 complete, 0 downloading = 7 mirror(s) total |
urban_traversability (23 files)
farm_road/country_road/country_road_testing.bag | 6.74GB |
farm_road/country_road/country_road_training.bag | 1.28GB |
farm_road/greenhouse/greenhouse.bag | 6.75GB |
farm_road/greenhouse/greenhouse2.bag | 5.51GB |
farm_road/greenhouse/greenhouse3.bag | 3.10GB |
KU_campus/01-parking_lot/parking_lot.bag | 644.49MB |
KU_campus/02-wheelchair_ramp/wheelchair_ramp.bag | 251.97MB |
KU_campus/02-wheelchair_ramp/wheelchair_ramp2.bag | 188.43MB |
KU_campus/02-wheelchair_ramp/wheelchair_ramp3.bag | 232.16MB |
KU_campus/03-campus_south/campus_south.bag | 3.30GB |
KU_campus/03-KU_campus/raw/urban_campus (raw).bag | 6.96GB |
KU_campus/03-KU_campus/raw/urban_campus_large (raw).bag | 12.37GB |
KU_campus/03-KU_campus/raw/urban_campus_large (raw).bagivm6k62e.part | 1.35GB |
KU_campus/03-KU_campus/raw/urban_campus_small (raw).bag | 1.17GB |
KU_campus/03-KU_campus/urban_campus.bag | 4.71GB |
KU_campus/03-KU_campus/urban_campus_small.bag | 1.16GB |
KU_campus/04-campus_east/campus_east.bag | 719.22MB |
KU_campus/07-campus_full/campus_full.bag | 4.71GB |
KU_campus/08-campus_full_long/campus_full_long.bag | 12.39GB |
KU_campus/09-campus_road_camera/campus_road_camera.bag | 8.41GB |
KU_campus/10-KU_innovation_hall_4F/ku_innovation_hall_4F.bag | 732.57MB |
LICENSE | 0.55kB |
README.txt | 0.37kB |
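All recordings above are ROS1 bag files. As a minimal sketch of how one might inspect them (assuming a ROS1 environment, e.g. Noetic, that provides the `rosbag` Python package; the topic names inside each bag are not documented in this listing and must be discovered per file):

```python
# Minimal sketch: list the topics recorded in one of the bags.
# Assumes a ROS1 environment providing the `rosbag` Python package.
import rosbag

bag_path = "KU_campus/01-parking_lot/parking_lot.bag"  # any bag from the list

with rosbag.Bag(bag_path) as bag:
    info = bag.get_type_and_topic_info()
    for topic, meta in info.topics.items():
        print(f"{topic}: {meta.msg_type} ({meta.message_count} msgs)")

    # To read messages, pass a topic name discovered above, e.g.:
    # for topic, msg, t in bag.read_messages(topics=["/some_topic"]):
    #     ...
```

Without a full ROS install, the pip-installable `rosbags` package can read ROS1 bags in a similar fashion.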
Type: Dataset
Tags: traversability, laser radar, mobile robots, navigation, robot vision systems, terrain types, urban environments, robot sensing systems, conservative prediction, cumulative incidence, entropy regularization, geometry, grid cells, human operator, image analysis, intrinsic risk, landforms, learning models, learning strategies, LiDAR perception, mobile robot, navigation experiment, neural network, noise measurement, noisy labels, prediction accuracy, prediction confidence, prediction model, risky areas, robot trajectory, safety, self-supervised learning methods, self-supervised learning, semantic scene understanding, semantics, service robots, speed bumps, statistical distribution, step height, terrain classification, terrain map, terrain mapping, topographic maps, vision-based navigation, weak labels, mapping, confirmation bias
Bibtex:
@article{cho2024learning,
  title     = {Learning Self-Supervised Traversability With Navigation Experiences of Mobile Robots: A Risk-Aware Self-Training Approach},
  author    = {Cho, Ikhyeon and Chung, Woojin},
  journal   = {IEEE Robotics and Automation Letters},
  year      = {2024},
  publisher = {IEEE},
  url       = {https://github.com/Ikhyeon-Cho/urban-traversability-dataset},
  abstract  = {Mobile robots operating in outdoor environments face the challenge of navigating various terrains with different degrees of difficulty. Therefore, traversability estimation is crucial for safe and efficient robot navigation. Current approaches utilize a robot's driving experience to learn traversability in a self-supervised fashion. However, providing sufficient and diverse experience to the robot is difficult in many practical applications. In this paper, we propose a self-supervised traversability learning method that adapts to challenging terrains with limited prior experience. One key aspect is to enable prioritized learning of scarce yet high-risk terrains by using a risk-sensitive approach. To this end, we train a neural network through a risk-aware instance weighting scheme. Another key aspect is to leverage traversability pseudo-labels on the basis of a self-training scheme. The proposed confidence-regularized self-training generates high-quality pseudo-labels, thereby achieving reliable data augmentation for unexperienced terrains. The effectiveness of the proposed method is verified in extensive real-world experiments, ranging from structured urban environments to complex rugged terrains.},
  keywords  = {mapping, traversability, laser radar, mobile robots, navigation, robot vision systems, terrain types, urban environments, robot sensing systems, confirmation bias, conservative prediction, cumulative incidence, entropy regularization, geometry, grid cells, human operator, image analysis, intrinsic risk, landforms, learning models, learning strategies, LiDAR perception, mobile robot, navigation experiment, neural network, noise measurement, noisy labels, prediction accuracy, prediction confidence, prediction model, risky areas, robot trajectory, safety, self-supervised learning methods, self-supervised learning, semantic scene understanding, semantics, service robots, speed bumps, statistical distribution, step height, terrain classification, terrain map, terrain mapping, topographic maps, vision-based navigation, weak labels},
  license   = {CC BY-NC-SA 4.0: https://creativecommons.org/licenses/by-nc-sa/4.0/}
}
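The abstract names two mechanisms: a risk-aware instance weighting scheme and confidence-regularized self-training. The sketch below is a generic illustration of those techniques, not the authors' implementation; the risk weight, confidence threshold, and entropy term are placeholder assumptions.

```python
# Rough sketch of risk-weighted training and confident pseudo-labeling.
# The weighting function and threshold are illustrative assumptions only.
import torch
import torch.nn.functional as F

def risk_weighted_loss(logits, labels, risk):
    # Per-sample cross-entropy scaled by a hypothetical risk score in [0, 1],
    # up-weighting scarce, high-risk samples.
    ce = F.cross_entropy(logits, labels, reduction="none")
    return (ce * (1.0 + risk)).mean()

def confident_pseudo_labels(logits, threshold=0.9):
    # Keep pseudo-labels only where predicted confidence clears a threshold;
    # also report mean prediction entropy as a regularization signal.
    probs = logits.softmax(dim=-1)
    conf, pseudo = probs.max(dim=-1)
    mask = conf >= threshold
    entropy = -(probs * probs.clamp_min(1e-8).log()).sum(dim=-1)
    return pseudo[mask], mask, entropy.mean()

# Example: 8 grid cells, 3 terrain classes
logits = torch.randn(8, 3)
labels = torch.randint(0, 3, (8,))
risk = torch.rand(8)
loss = risk_weighted_loss(logits, labels, risk)
pl, mask, ent = confident_pseudo_labels(logits)
```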