Time of Flight (TOF) Cameras

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Physical Sensors".

Deadline for manuscript submissions: closed (20 August 2019) | Viewed by 8883

Special Issue Editor


Prof. Franco Docchio
Guest Editor
Department of Mechanical and Industrial Engineering, University of Brescia, 25121 Brescia, Italy
Interests: optoelectronics; lasers and their applications; metrology; measurements; laser-tissue interactions; laser engineering; laser safety; vision in industry; robotics; optics in biomedicine

Special Issue Information

Dear Colleagues,

Time-of-flight (TOF) cameras represent an efficient alternative to classical depth sensors, mostly based on triangulation systems, and provide depth images at high frame rates, simplifying several challenging tasks such as shape analysis, people and object detection, object classification, etc. TOF cameras measure the 3D position of each point in the scene by evaluating the “time-of-flight” of a light signal emitted by the camera and reflected back from each point. TOF cameras are compact, with no moving parts. As such, they can easily be miniaturized, for integration into mobile devices such as smartphones and tablets. As a result, researchers have started exploring their potential application in industrial and medical fields.
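The measurement principle can be illustrated numerically. For a pulsed TOF camera the depth is simply d = c·t/2, where t is the round-trip time; for the more common continuous-wave devices, the round-trip time is recovered from the phase shift φ of the reflected modulated signal, giving d = c·φ/(4π·f_mod). A minimal sketch with illustrative values (not tied to any specific camera):

```python
from math import pi

C = 299_792_458.0  # speed of light, m/s

def depth_from_time(t_round_trip: float) -> float:
    """Pulsed TOF: depth is half the round-trip distance."""
    return C * t_round_trip / 2.0

def depth_from_phase(phase: float, f_mod: float) -> float:
    """Continuous-wave TOF: the phase shift of the modulated signal
    encodes the round-trip time; unambiguous up to c / (2 * f_mod)."""
    return C * phase / (4.0 * pi * f_mod)

# A 10 MHz modulation gives an unambiguous range of ~15 m;
# a phase shift of pi/4 then corresponds to a depth of ~1.87 m.
d = depth_from_phase(pi / 4, 10e6)
```

Note that the unambiguous range shrinks as the modulation frequency rises, which is why practical cameras trade depth resolution against range.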

The aim of this Special Issue is to collect papers from academic and industrial players with original, previously unpublished research about new trends and solutions to the technological problems of applied metrology and sensing in different fields.

Potential topics of interest include, but are not limited to, the following:

  • TOF camera design and engineering
  • Performance evaluations of TOF cameras
  • Comparison between TOF cameras and triangulation-based or other sensing devices
  • Embedding of TOF cameras in smartphones and portable devices
  • TOF cameras and sensor fusion
  • TOF camera data processing
  • TOF camera-based range systems
  • Applications of TOF cameras to automotive, robotic, and aerospace systems
  • Applications of TOF cameras in biomedical systems
  • Other applications of TOF cameras

Prof. Franco Docchio
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Time of flight
  • Cameras
  • Depth measurement
  • Smartphones
  • Miniaturization
  • Robots
  • Automotive
  • Aerospace
  • Biomedical

Published Papers (2 papers)


Research

19 pages, 7614 KiB  
Article
Indirect Time-of-Flight Depth Sensor with Two-Step Comparison Scheme for Depth Frame Difference Detection
by Donguk Kim and Jaehyuk Choi
Sensors 2019, 19(17), 3674; https://doi.org/10.3390/s19173674 - 23 Aug 2019
Cited by 5 | Viewed by 4034
Abstract
A depth sensor with integrated frame difference detection is proposed. Instead of frame difference detection using light intensity, which is vulnerable to ambient light, the difference in depth between successive frames can be acquired. Because the conventional time-of-flight depth sensor requires two frames of depth-image acquisition with four-phase modulation, it has large power consumption, as well as a large area for external frame memories. Therefore, we propose a simple two-step comparison scheme for generating the depth frame difference in a single frame. With the proposed scheme, only a single frame is needed to obtain the frame difference, with less than half of the power consumption of the conventional depth sensor. Because the frame difference is simply generated by column-parallel circuits, no access of the external frame memory is involved, nor is a digital signal processor. In addition, we used an over-pixel metal–insulator–metal capacitor to store temporary signals for enhancing the area efficiency. A prototype chip was fabricated using a 90 nm backside illumination complementary metal–oxide–semiconductor (CMOS) image sensor process. We measured the depth frame difference in the range of 1–2.5 m. With a 10 MHz modulation frequency, a depth frame difference of >10 cm was successfully detected even for objects with different reflectivity. The maximum relative error from the difference of the reflectivity (white and wooden targets) was <3%.
(This article belongs to the Special Issue Time of Flight (TOF) Cameras)
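For context, the conventional four-phase scheme that the paper improves upon samples the correlation between the received signal and the reference at 0°, 90°, 180°, and 270°; the phase, and hence the depth, follows from an arctangent of the sample differences, which cancels the ambient offset and the signal amplitude. A sketch of that conventional computation (illustrative only, not the authors' two-step column-parallel circuit):

```python
from math import atan2, cos, pi

C = 299_792_458.0  # speed of light, m/s

def depth_four_phase(c0, c1, c2, c3, f_mod):
    """Conventional four-phase CW-TOF depth: the arctangent of the
    sample differences removes both ambient offset and amplitude."""
    phase = atan2(c1 - c3, c0 - c2) % (2 * pi)
    return C * phase / (4 * pi * f_mod)

# Simulate the four correlation samples for a target at 2.0 m;
# the offset b and amplitude a drop out of the arctangent.
f_mod, d_true, a, b = 10e6, 2.0, 0.4, 1.0
phi = 4 * pi * f_mod * d_true / C
samples = [b + a * cos(phi - k * pi / 2) for k in range(4)]
d_est = depth_four_phase(*samples, f_mod)  # ~2.0 m
```

Since this scheme needs all four samples per depth frame, and a frame *difference* needs two such frames, the paper's single-frame comparison halves the acquisition cost.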

16 pages, 3011 KiB  
Article
Driver Face Verification with Depth Maps
by Guido Borghi, Stefano Pini, Roberto Vezzani and Rita Cucchiara
Sensors 2019, 19(15), 3361; https://doi.org/10.3390/s19153361 - 31 Jul 2019
Cited by 14 | Viewed by 3506
Abstract
Face verification is the task of checking if two provided images contain the face of the same person or not. In this work, we propose a fully-convolutional Siamese architecture to tackle this task, achieving state-of-the-art results on three publicly-released datasets, namely Pandora, High-Resolution Range-based Face Database (HRRFaceD), and CurtinFaces. The proposed method takes depth maps as the input, since depth cameras have been proven to be more reliable in different illumination conditions. Thus, the system is able to work even in the case of the total or partial absence of external light sources, which is a key feature for automotive applications. From the algorithmic point of view, we propose a fully-convolutional architecture with a limited number of parameters, capable of dealing with the small amount of depth data available for training and able to run in real time even on a CPU and embedded boards. The experimental results show acceptable accuracy to allow exploitation in real-world applications with in-board cameras. Finally, exploiting the presence of faces occluded by various head garments and extreme head poses available in the Pandora dataset, we successfully test the proposed system also during strong visual occlusions. The excellent results obtained confirm the efficacy of the proposed method.
(This article belongs to the Special Issue Time of Flight (TOF) Cameras)
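The verification decision in a Siamese setup reduces to embedding both inputs with the same (weight-shared) network and thresholding a distance between the two embeddings. A toy illustration of that decision rule, where the encoder is a stand-in mean-pooling stub rather than the authors' fully-convolutional model, and the threshold `tau` is arbitrary:

```python
from math import sqrt

def encode(depth_map, pool=4):
    """Stand-in shared encoder: mean-pool a 2D depth map into a flat
    embedding. A real system would apply the trained CNN here."""
    h, w = len(depth_map), len(depth_map[0])
    emb = []
    for i in range(0, h, pool):
        for j in range(0, w, pool):
            block = [depth_map[y][x]
                     for y in range(i, min(i + pool, h))
                     for x in range(j, min(j + pool, w))]
            emb.append(sum(block) / len(block))
    return emb

def same_person(map_a, map_b, tau=0.1):
    """Siamese decision: accept if the embedding distance is below tau."""
    ea, eb = encode(map_a), encode(map_b)
    dist = sqrt(sum((x - y) ** 2 for x, y in zip(ea, eb)))
    return dist < tau

a = [[0.5] * 8 for _ in range(8)]
b = [[0.5] * 8 for _ in range(8)]   # identical depth map -> distance 0
c = [[0.9] * 8 for _ in range(8)]   # uniformly offset depth map
same_person(a, b)   # True
same_person(a, c)   # False
```

Because both branches share weights, training pulls embeddings of the same identity together and pushes different identities apart, so a single distance threshold suffices at inference time.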
