Review
Peer-Review Record

An Overview of Platforms for Big Earth Observation Data Management and Analysis

Remote Sens. 2020, 12(8), 1253; https://doi.org/10.3390/rs12081253
by Vitor C. F. Gomes 1,2,*, Gilberto R. Queiroz 2,* and Karine R. Ferreira 2,*
Reviewer 1: Anonymous
Reviewer 2: Anonymous
Reviewer 3: Anonymous
Submission received: 15 March 2020 / Revised: 2 April 2020 / Accepted: 6 April 2020 / Published: 16 April 2020
(This article belongs to the Special Issue Spatial Data Infrastructures for Big Geospatial Sensing Data)

Round 1

Reviewer 1 Report

Major comments

  • The authors should clarify what a platform is, at least in the context of the paper. This will help readers understand why the paper compares, e.g., GEE (software and infrastructure) with ODC (software) or with openEO (an API specification).
  • The authors should clarify the criteria that led to the selection of the seven platforms (e.g., whether these are the only existing "Big EO Data Management and Analysis" platforms, the most widely used ones, etc.).
  • It is difficult to trace the assessments of the capabilities back to their justification in the platform descriptions. I suggest the authors first introduce the capabilities, then describe the platforms while highlighting whether each capability is satisfied (fully, partially, or not at all), and finally present the summary table.


Minor comments:

  • Please rephrase "Based on these demands and needs presented by these authors" (page 14, line 436).
  • The Autonomy capability seems misleading; it is in fact defined as "the capacity of the scientific community to participate in the governance and development of the platform." My suggestion is to rename this capability "Open Governance".

Author Response

Please see the attachment.

Author Response File: Author Response.pdf

Reviewer 2 Report

This is an interesting comparison, which aims to give a state of the art of platforms.

I was surprised to find JEODPP in the list (which, to my knowledge, is a "closed" platform accessible only to specific users), whereas the existence of the five EC-funded DIAS platforms (open and accessible to all users) was not even mentioned. I would therefore invite the authors to also consider this large investment from the EC in their analysis.

In addition, I believe one element of interest for readers is the ownership of the developed algorithms: do the algorithms remain the property of the developer, or does the hosting platform have the right to assimilate them and use them elsewhere?

Author Response

Please see the attachment.

Author Response File: Author Response.pdf

Reviewer 3 Report

Manuscript titled: An Overview of Platforms for Big Earth Observation Data Management and Analysis. The manuscript has been submitted as an article, although the title itself suggests its proper classification: "Overview" is synonymous with "Review". Beyond the title, the structure of the manuscript demonstrates this: there are no hypotheses or research challenges, no objectives, and no final conclusions section. The subject of the review is highly relevant given the continuous emergence of new remote sensing data sources, which repeatedly produce huge amounts of information, mostly multispectral images, with the different cadences associated with satellite missions and constellations. The standards developed within the OGC, and later sanctioned by ISO Technical Committee 211 as the ISO 19000 family, do not cover the infrastructures that must provide processing capacity over these data sources, nor the evolving analysis needs of the scientists working with these data. To help clarify and improve the review manuscript, the following comments and/or criticisms are offered.

  • Page 2, lines 38-40. It is criticized, without giving evidence, that most analysis of EO data is done on files accessible through the Internet via different protocols (HTTP, FTP, and SSH). OGC has produced standards such as WCS, with EO profiles, that effectively facilitate access to the data; downloading is indeed done in the form of files, but only after a first level of processing (format change, CRS, resampling, range selection, etc.).
  • Page 2, lines 41-46. SDIs are presented and criticized for only facilitating access to EO data in the form of files. It is really a decision of the stakeholders how to organize the data offered in the infrastructures; I am referring to Coverages and the associated services: WCS, WPS, WCPS, etc.
  • Page 2, line 69. The WCPS and WTSS services are presented together. The first is an OGC standard while the second is not; this should be indicated. This should not make the authors think that I disagree with the latter service, which I personally find very interesting and useful for time series analysis.
  • Page 2, lines 167-169. The authors should better justify this statement, for example by incorporating a reference.
  • Page 3, lines 90-93. Here appear what could be considered premises (desired aspects of this type of platform). Reviewing the manuscript and trying to relate them to the other groups of criteria used when reviewing or comparing the six infrastructures, this sounds strange and is not well coordinated. The review is complete, although the systematic process proposed has not been carried out 100% for the six platforms analysed.
  • Page 3, lines 97-100. This passage lists the questions raised when reviewing the six platforms, but when reading the text of all of them it is difficult or impossible to find the answers. In this sense the systematic process is not complete and leaves a feeling of incompleteness.
  • As for the question of architecture, in some cases the official one is presented, in others an inferred one, and in others a UML use case diagram. This does not seem well harmonized. There is also a lack of harmony in the footnotes: some platform descriptions provide many, others none at all. Footnotes may not be needed for some platforms, but they are overused for others (ODC).
  • Page 7, line 284. In referencing the organizations or institutions that use the ODC platform, figures are given according to reference [40]. The date of these figures should be indicated, as the number may be larger now if more than a year has passed.

In Section 3, where the platforms are analyzed from the point of view of the demands and needs of the authors — which, together with the descriptions in Section 2, constitutes the contribution of this manuscript — 10 desirable capabilities have been chosen, based on the previous works of Camara et al. and Ariza-Porras et al. The Data access interoperability category, identified as part of collaborative work, is not well explained and I do not share it: interoperable data access is facilitated by standards, either for data exchange or for service APIs. What does it have to do with collaborative work? Regarding the qualitative classification (Yes, No, and Partial) of the 10 aspects (Data abstraction, Processing abstraction, etc.), it seems adequate for summary Table 1. The problem I find is that in the subsequent figures, which compare two of these 10 dimensions at a time, the "Partial" values are plotted at positions that are not the same for all platforms so labeled. The authors should explain how the qualitative variable was quantified for comparison, since no common criterion is maintained. For example, in Figure 12, SEPAL, catalogued with Extensibility "Yes", does not reach the maximum, while openEO, with "Partial", has the same level in the graph. The same happens in Figure 12 with this category and infrastructure. I would like to conclude by indicating that I find the work adequate and interesting, although I consider it a review, and it requires an additional round to complete and analyze the issues indicated.

Author Response

Please see the attachment.

Author Response File: Author Response.pdf

Round 2

Reviewer 3 Report

The authors have taken into account the comments of the first review and have corrected, added, or removed information appropriately. In relation to the last comment, about considering the Planet platform in the comparison, they have reasoned and justified its non-inclusion by noting the current lack of APIs for developing functionality on that platform, taking into account the definition of platform proposed in this review. A couple of minor comments that would surely come out in proofreading, which I advance here: Page 2, line 9: the "e" that serves as a conjunction at the end of the sentence should be an "and". Page 14, lines 448-449 and 451-452 seem to repeat part of a sentence; please review this paragraph to avoid the duplication.