Sensing and Processing for 3D Computer Vision: 3rd Edition

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Sensing and Imaging".

Deadline for manuscript submissions: 31 December 2024

Special Issue Editor


Prof. Dr. Denis Laurendeau
Guest Editor
Computer Vision and Systems Laboratory, Laval University, 1665 Rue de l'Université, Université Laval, Quebec City, QC G1V 0A6, Canada
Interests: 3D sensors; active vision; 3D image processing and understanding; modelling; geometry; 3D sensing and modelling for augmented and virtual reality; applications of 3D computer vision

Special Issue Information

Dear Colleagues,

This Special Issue welcomes research articles that address 3D sensing technology as well as the use of advanced 3D sensors in computer vision. Original contributions on novel active 3D sensors, stereo reconstruction approaches, and sensor calibration techniques are solicited. Articles on 3D point cloud/mesh processing, geometric modeling, and shape representation and recognition are also of interest. Applications of 3D sensing and modeling to metrology, industrial inspection and quality control, augmented/virtual reality, heritage preservation, the arts, and other fields are equally welcome.

Prof. Dr. Denis Laurendeau
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • active/passive 3D sensors
  • sensor calibration
  • stereo reconstruction
  • point cloud/mesh processing
  • geometry
  • modeling and representation
  • shape analysis and recognition
  • applications of 3D vision

Published Papers (1 paper)

Research

16 pages, 21787 KiB  
Article
Expanding Sparse Radar Depth Based on Joint Bilateral Filter for Radar-Guided Monocular Depth Estimation
by Chen-Chou Lo and Patrick Vandewalle
Sensors 2024, 24(6), 1864; https://doi.org/10.3390/s24061864 - 14 Mar 2024
Abstract
Radar data can provide additional depth information for monocular depth estimation. It provides a cost-effective solution and is robust in various weather conditions, particularly when compared with lidar. Given the sparse and limited vertical field of view of radar signals, existing methods employ either a vertical extension of radar points or the training of a preprocessing neural network to extend sparse radar points under lidar supervision. In this work, we present a novel radar expansion technique inspired by the joint bilateral filter, tailored for radar-guided monocular depth estimation. Our approach is motivated by the synergy of spatial and range kernels within the joint bilateral filter. Unlike traditional methods that assign a weighted average of nearby pixels to the current pixel, we expand sparse radar points by calculating a confidence score based on the values of spatial and range kernels. Additionally, we propose the use of a range-aware window size for radar expansion instead of a fixed window size in the image plane. Our proposed method effectively increases the number of radar points from an average of 39 points in a raw radar frame to an average of 100 K points. Notably, the expanded radar exhibits fewer intrinsic errors when compared with raw radar and previous methodologies. To validate our approach, we assess our proposed depth estimation model on the nuScenes dataset. Comparative evaluations with existing radar-guided depth estimation models demonstrate its state-of-the-art performance.
(This article belongs to the Special Issue Sensing and Processing for 3D Computer Vision: 3rd Edition)
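To illustrate the kind of expansion scheme the abstract describes, the sketch below shows a joint-bilateral-filter-inspired propagation of sparse radar depth guided by an image. It is a minimal sketch under stated assumptions, not the authors' implementation: the grayscale guidance image, the parameter values (sigma_s, sigma_r, conf_thresh), the metric window radius, and the focal-length-based window scaling are all illustrative choices made here for clarity.

```python
import numpy as np

def expand_radar_depth(guidance, radar_depth, focal=500.0, metric_radius=0.5,
                       sigma_s=8.0, sigma_r=0.1, conf_thresh=0.3):
    """Expand sparse radar depth using a joint-bilateral-style confidence score.

    guidance    : (H, W) float array, grayscale guidance image in [0, 1]
    radar_depth : (H, W) float array, metric depth at radar returns, 0 elsewhere
    Returns (expanded_depth, confidence), both (H, W) float arrays.
    All parameters are illustrative assumptions, not values from the paper.
    """
    H, W = guidance.shape
    expanded = np.zeros((H, W), dtype=np.float32)
    confidence = np.zeros((H, W), dtype=np.float32)

    ys, xs = np.nonzero(radar_depth)
    for y0, x0 in zip(ys, xs):
        d = float(radar_depth[y0, x0])
        # Range-aware window: a fixed metric radius projects to fewer pixels
        # the farther away the radar return is, so the window shrinks with depth.
        half = max(2, int(round(focal * metric_radius / max(d, 1e-3))))

        y_lo, y_hi = max(0, y0 - half), min(H, y0 + half + 1)
        x_lo, x_hi = max(0, x0 - half), min(W, x0 + half + 1)

        yy, xx = np.mgrid[y_lo:y_hi, x_lo:x_hi]
        # Spatial kernel: Gaussian on pixel distance to the radar point.
        w_spatial = np.exp(-((yy - y0) ** 2 + (xx - x0) ** 2) / (2.0 * sigma_s ** 2))
        # Range kernel: Gaussian on guidance-intensity difference.
        diff = guidance[y_lo:y_hi, x_lo:x_hi] - guidance[y0, x0]
        w_range = np.exp(-(diff ** 2) / (2.0 * sigma_r ** 2))

        conf = w_spatial * w_range
        # Propagate the radar depth itself (not a weighted average) to pixels
        # whose confidence exceeds the threshold and any previous assignment.
        mask = (conf > conf_thresh) & (conf > confidence[y_lo:y_hi, x_lo:x_hi])
        expanded[y_lo:y_hi, x_lo:x_hi][mask] = d
        confidence[y_lo:y_hi, x_lo:x_hi][mask] = conf[mask]

    return expanded, confidence
```

In practice the guidance image would be the camera frame registered to the radar, and the kernel widths, confidence threshold, and window radius would be tuned (or learned) per dataset; the sketch only shows how spatial and range kernels can be combined into a per-pixel confidence for propagating sparse radar depths.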
