Special Issue "Multimedia Sensor Networks for Mission-Critical Surveillance Applications"
A special issue of Journal of Sensor and Actuator Networks (ISSN 2224-2708).
Deadline for manuscript submissions: closed (31 December 2013)
Prof. Dr. Congduc Pham
LIUPPA Laboratory, University of Pau, Avenue de l’Université, BP1155, 64013 Pau Cedex, France
Interests: wireless sensor networks; mission-critical; surveillance application; MAC; routing; congestion control
The monitoring capabilities of Wireless Sensor Networks (WSNs) make them very suitable for large-scale surveillance systems. A large number of applications related to the environment (agriculture, water, forests, fire detection, ...), the military, buildings, health (elderly care, home monitoring, ...), disaster relief and emergency management, and area and industrial surveillance have already been studied from the WSN perspective. Most of these surveillance applications have very specific needs due to their inherently critical nature: they are usually associated with security and have a high level of criticality, which makes them difficult to deploy with the current state of technology. Moreover, the purely scalar nature of traditional sensor nodes can be limiting for more complex applications such as object detection, surveillance, recognition, localization, and tracking. Therefore, a wide range of emerging WSN applications can be strengthened by adding multimedia capabilities, such as images, to traditional sensors. In the domain of surveillance applications that are mission-critical in nature, adding visual capabilities raises new challenges.
This special issue aims to gather latest research and development achievements in the field of Wireless Multimedia Sensor Networks (WMSN) for mission-critical surveillance applications. Original papers that address the most current issues and challenges are solicited. Topics of interest include, but are not limited to:
- Real-time and QoS mechanisms
- Dynamic criticality management, dynamic scheduling and dynamic resource management
- Cooperation, cross-layer mechanisms for advanced multimedia traffic management
- Congestion control for real-time multimedia traffic
- Advanced information/data management for mission-critical applications
- Dedicated MAC layers for multimedia traffic targeted to mission-critical applications
- Advanced and adaptive routing schemes for image transfer targeted to mission-critical applications
- Networked sensors and robots for mission-critical applications
- Image sensing techniques for very energy constrained devices
- Optimized and robust image encoding techniques for very energy constrained devices and lossy environments
- Distributed vision processing algorithms and fusion of vision
- 3D scene analysis from distributed image sensors
- Optimized algorithms for multimodal intrusion detection systems
- Multimedia-oriented middleware for mission-critical applications
- Multi-sensor oriented multimedia GIS for disaster management
- Prototypes, proofs-of-concept and new multimedia sensor hardware
Prof. Dr. Congduc Pham
Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once registered, authors can access the submission form. Manuscripts can be submitted until the deadline. Papers will be published continuously (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.
Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are refereed through a peer-review process. A guide for authors and other relevant information for the submission of manuscripts are available on the Instructions for Authors page. Journal of Sensor and Actuator Networks is an international peer-reviewed Open Access quarterly journal published by MDPI.
Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 300 CHF (Swiss Francs). English correction and/or formatting fees of 250 CHF (Swiss Francs) will be charged in certain cases for those articles accepted for publication that require extensive additional formatting and/or English corrections.
Article: Collaborative 3D Target Tracking in Distributed Smart Camera Networks for Wide-Area Surveillance
J. Sens. Actuator Netw. 2013, 2(2), 316-353; doi:10.3390/jsan2020316
Received: 26 March 2013; in revised form: 26 April 2013 / Accepted: 14 May 2013 / Published: 30 May 2013
Article: Energy-Efficient Packet Relaying in Wireless Image Sensor Networks Exploiting the Sensing Relevancies of Source Nodes and DWT Coding
J. Sens. Actuator Netw. 2013, 2(3), 424-448; doi:10.3390/jsan2030424
Received: 24 May 2013; in revised form: 19 June 2013 / Accepted: 26 June 2013 / Published: 10 July 2013
The list below represents only planned manuscripts. Some of these manuscripts have not yet been received by the Editorial Office. Papers submitted to MDPI journals are subject to peer review.
Type of Paper: Article
Title: Parallel Computational Intelligence based Multi-Camera Surveillance System
Authors: Sergio Orts-Escolano, Jose Garcia-Rodriguez, Vicente Morell, Miguel Cazorla and Juan Manuel Garcia-Chamizo
Affiliation: University of Alicante, Spain; E-Mail: firstname.lastname@example.org
Abstract: In this work we present a multi-camera surveillance system based on self-organizing neural networks that represent events in video. The system processes several tasks in parallel using GPUs (Graphics Processing Units). It addresses multiple vision tasks at various levels, such as segmentation, representation or characterization, and analysis and monitoring of movement, in order to build a robust representation of the environment and interpret the elements of the scene. The vision module is integrated into a global system that operates in a complex environment, receiving images from multiple acquisition devices at video frequency and offering relevant information to higher-level systems that monitor and take decisions in real time. The system must also meet a set of requirements, including time constraints, high availability, robustness, high processing speed, and re-configurability. We have built a system able to represent and analyze the motion in several image sequences acquired by a multi-camera network and to process multisource data in parallel on a multi-GPU architecture.
Last update: 29 October 2013