A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020; you can also visit the original URL.
The file type is application/pdf.
Beyond Controlled Environments: 3D Camera Re-Localization in Changing Indoor Scenes
[article]
2020
arXiv
pre-print
In this paper, we adapt 3RScan - a recently introduced indoor RGB-D dataset designed for object instance re-localization - to create RIO10, a new long-term camera re-localization benchmark focused on indoor ...
We also examine in detail how different types of scene change affect the performance of different methods, based on novel ways of detecting such changes in a given RGB-D frame. ...
We think this benchmark closes a gap in the literature by going beyond controlled indoor environments, similar to recent high-impact benchmarks modelling outdoor scene changes [7, 66, 84, 85]. ...
arXiv:2008.02004v1
fatcat:tzzjbltmh5g5jiykgmzd5rh7na
Interactive 3D modeling of indoor environments with a consumer depth camera
2011
Proceedings of the 13th international conference on Ubiquitous computing - UbiComp '11
In this work we utilize (Kinect style) consumer depth cameras to enable non-expert users to scan their personal spaces into 3D models. ...
Detailed 3D visual models of indoor spaces, from walls and floors to objects and their configurations, can provide extensive knowledge about the environments as well as rich contextual information of people ...
This work was supported by an Intel grant, and partially by NSF grants IIS-0963657 and IIS-0812671, by ONR MURI grant N00014-09-1-1052, and through collaborative participation in the Robotics Consortium ...
doi:10.1145/2030112.2030123
dblp:conf/huc/DuHRCGSF11
fatcat:tpa27yrelngjxfxabwr337jbji
JRDB: A Dataset and Benchmark of Egocentric Visual Perception for Navigation in Human Environments
[article]
2020
arXiv
pre-print
The dataset has been annotated with over 2.3 million bounding boxes spread over 5 individual cameras and 1.8 million associated 3D cuboids around all people in the scenes, totaling over 3500 time-consistent ...
Our dataset incorporates data from traditionally underrepresented scenes such as indoor environments and pedestrian areas, all from the ego-perspective of the robot, both stationary and navigating. ...
We hypothesize that people cluster more tightly in indoor environments, and that indoor scenes include significantly more occluders in the point-cloud data, rendering 3D detection in indoor environments harder ...
arXiv:1910.11792v2
fatcat:wquzzegmznh2vasn2z3trkugsq
Towards Vision Based Navigation in Large Indoor Environments
2006
2006 IEEE/RSJ International Conference on Intelligent Robots and Systems
It is shown that in a larger office environment, the proposed algorithm generates location estimates which are topologically correct, but statistically inconsistent. ...
environments. ...
In an indoor environment, the potential is high for scenes or regions in an image that are very similar in appearance. ...
doi:10.1109/iros.2006.282487
dblp:conf/iros/MiroZD06
fatcat:6yd65eqeujbvxlyjnw5mjuauji
Vision-based SLAM using natural features in indoor environments
2005
2005 International Conference on Intelligent Sensors, Sensor Networks and Information Processing
true that depth information is difficult to rely on, particularly for measurements beyond a few meters (in fact the full 3D state is observable, but here robot motion is constrained to 2D and only the ...
camera. ...
to changes in the scene such as lighting conditions. ...
doi:10.1109/issnip.2005.1595571
fatcat:ibha2mvafzbxpbpfgzdqadoh2a
Interactive Environment-Aware Handheld Projectors for Pervasive Computing Spaces
[chapter]
2012
Lecture Notes in Computer Science
This paper presents two novel handheld projector systems for indoor pervasive computing spaces. These projection-based devices are "aware" of their environment in ways not demonstrated previously. ...
They offer both spatial awareness, where the system infers location and orientation of the device in 3D space, and geometry awareness, where the system constructs the 3D structure of the world around it ...
3D scene and testing for a 3D intersection with the room mesh). ...
doi:10.1007/978-3-642-31205-2_13
fatcat:swwdpvynmjepteaeoyogx7fftq
Robust 3D Position Estimation in Wide and Unconstrained Indoor Environments
2015
Sensors
In this paper, a system for 3D position estimation in wide, unconstrained indoor environments is presented that employs infrared optical outside-in tracking of rigid-body targets with a stereo camera rig ...
Recently, a number of commercially available ILS applications such as Google Indoor Maps [8], SensionLab [9] as well as Indoors [10] emerged to localize a smartphone (and thus its user) by fusing mobile ...
Nowadays, optical tracking is widely used for position estimation in indoor localization, fostered by the fact that the majority of these setups are used, up till now, in controlled indoor environments. ...
doi:10.3390/s151229862
pmid:26694388
pmcid:PMC4721782
fatcat:vqotuasr5bcczefzltgecxkgca
An experimental study of spatial sound usefulness in searching and navigating through AR environments
2015
Virtual Reality
The collected data suggest that the use of spatial sound in AR environments can be a significant factor in searching and navigating for hidden objects within indoor AR scenes. ...
What is more, 3D sound was a valuable cue for navigation in AR environments. ...
doi:10.1007/s10055-015-0274-4
fatcat:diefdyg5rbep5iduworlf7i3cy
Visual 3D Model-based Tracking toward Autonomous Live Sports Broadcasting using a VTOL Unmanned Aerial Vehicle in GPS-Impaired Environments
2015
International Journal of Computer Applications
Experimental results are demonstrated in 3 different environments including static scenes, real broadcast video, and indoor flying. ...
or hexacopter in GPS-impaired environments. ...
This local window search approach will be able to perform re-initialisation with low computation demands. ...
doi:10.5120/21709-4825
fatcat:vznslqcq35hmfn5623uzpgypmq
Human-Robot Perception in Industrial Environments: A Survey
2021
Sensors
This paper presents a survey on sensory equipment useful for human detection and action recognition in industrial environments. ...
Autonomous and collaborative robots able to adapt to varying and dynamic conditions of the environment, including the presence of human beings, will have an ever-greater role in this context. ...
Author Contributions: All the authors, listed in alphabetical order, equally contributed to this work. ...
doi:10.3390/s21051571
pmid:33668162
pmcid:PMC7956747
fatcat:zkneqhwbwvh7nigscwogcagbly
UGV-UAV Object Geolocation in Unstructured Environments
[article]
2022
arXiv
pre-print
In this paper, we present a UGV-UAV object detection and geolocation system, which performs perception, navigation, and planning autonomously at real scale in unstructured environments. ...
Our system is validated through data-driven offline tests as well as a series of field tests in unstructured environments. ...
We would also like to thank our academic partners including CMU DeLight Lab, TAMU Vehicle Systems & Control Lab, Penn State University Applied Research Lab, and CMU Software Engineering Institute. ...
arXiv:2201.05518v1
fatcat:rpawblf7uncopobfcuhs365r4q
Autonomous Flight using a Smartphone as On-Board Processing Unit in GPS-Denied Environments
2013
Proceedings of International Conference on Advances in Mobile Computing & Multimedia - MoMM '13
We demonstrate the UAV's capabilities of mapping, localization and navigation in an unknown 2D marker environment. ...
The UAV is able to map, locate and navigate in an unknown indoor environment fusing vision based tracking with inertial and attitude measurements. ...
Furthermore, we will integrate depth sensing image devices to allow for dense 3D mapping and localization in unconstrained indoor environments. ...
doi:10.1145/2536853.2536898
dblp:conf/momm/LeichtfriedKMK13
fatcat:ouqjjl5zjrc6hby47aaviovolu
Multi-view 3D Human Pose Estimation in Complex Environment
2011
International Journal of Computer Vision
We introduce a framework for unconstrained 3D human upper body pose estimation from multiple camera views in complex environment. ...
In the subsequent hypothesis verification stage, the candidate 3D poses are re-projected into the other camera views and ranked according to a multi-view likelihood measure. ...
This research was in part funded by the MultimediaN project. ...
doi:10.1007/s11263-011-0451-1
fatcat:gad62w62k5adpflqa6l2udbxr4
Detection, Location and Grasping Objects Using a Stereo Sensor on UAV in Outdoor Environments
2017
Sensors
The use of stereo cameras improves the learning stage, providing 3D information and helping to filter features in the online stage. ...
[3, 4], but varies in the use of stereo cameras to improve the learning process (as they provide 3D information) and introduces improved feature filtering. ...
Acknowledgments: This work has been carried out in the framework of the AEROARMS (SI-1439/2015) EU-funded projects and the AEROMAIN (DPI2014-59383-C2-1-R) Spanish National Research project. ...
doi:10.3390/s17010103
pmid:28067851
pmcid:PMC5298676
fatcat:tznuq7wuk5fsdcft3r7352s5sm
3D Semantic VSLAM of Indoor Environment Based on Mask Scoring RCNN
2020
Discrete Dynamics in Nature and Society
obtain camera pose changes. ...
In order to achieve object detection and semantic segmentation for both static and dynamic objects in indoor environments, and then construct a dense 3D semantic map with a VSLAM algorithm, a Mask Scoring ...
Acknowledgments: This study was supported in part by the National Natural Science Foundation of China (grant no. 61801323), the Science and Technology Project Fund of Suzhou (grant nos. ...
doi:10.1155/2020/5916205
fatcat:45e43i3v7vajpmajdjbc36ncay