Article

Terminal Congestion Analysis of Container Ports Using Satellite Images and AIS

Kodai Yasuda, Ryuichi Shibasaki, Riku Yasuda and Hiroki Murata
1 School of Engineering, The University of Tokyo, Tokyo 113-8656, Japan
2 Research Center for Advanced Science and Technology, The University of Tokyo, Tokyo 153-8904, Japan
* Author to whom correspondence should be addressed.
Remote Sens. 2024, 16(6), 1082; https://doi.org/10.3390/rs16061082
Submission received: 12 January 2024 / Revised: 28 February 2024 / Accepted: 18 March 2024 / Published: 20 March 2024
(This article belongs to the Section Engineering Remote Sensing)

Abstract

This study proposes the use of satellite images and vessels’ automatic identification system (AIS) data to evaluate the congestion level of container ports for operational efficiency analysis, which has not been attempted in previous studies. The congestion level in container yards is classified by developing a convolutional neural network (CNN) model and an annotation tool that reduces the workload of creating training data. The annotation tool calculates the number of vertically stacked containers and the reliability of each container cell in a detection area by focusing on the shadows cast by the containers. Subsequently, a high-accuracy CNN model is developed for end-to-end prediction of congestion levels. Finally, as an example of dynamic efficiency analysis of container terminals using satellite images, the estimated average number of vertically stacked containers in the yard is related to the elapsed time between the image capture time and the vessel arrival or departure time obtained from the automatic identification system data. This study contributes a prototype for dynamically estimating the number of vertically stacked containers and the congestion level of container terminals from satellite images without statistical information, as well as an analysis of their relationship with the timing of vessel arrivals acquired from AIS data.

1. Introduction

With increasing global competition in the port industry, improving container port efficiency has become a critical issue in recent years. The operational efficiency of each port and terminal is a significant factor in determining which ports shipping companies use. Efficient operation is also necessary from an environmental perspective because it reduces the time each vessel spends in port; thus, the port industry can engage in sustainable economic development by saving fuel and reducing emissions. Many factors influence port efficiency, including the availability of quay and dock facilities, the quality and competitiveness of connections to road and rail services, and the total number of cranes and berth use at each port. However, one challenge is the lack of uniform criteria for evaluating efficiency across different ports. Accordingly, many studies have been conducted to address this need.
Big data have been widely used in the maritime industry. One example is the automatic identification system (AIS) installed on vessels [1]. Since the late 2000s, the development of satellites and information technology has led to the expansion of AIS data and other information about vessel movements, and the number of businesses that accumulate and use such data has increased. Several studies have used these data in logistics and maritime economics, as summarized in [2,3,4,5]. Satellite images are also increasingly expected to be useful in the maritime industry. Detecting vessels in satellite images is important for ocean observation and disaster relief. It has also gained substantial attention from a maritime security perspective, such as the detection and monitoring of poaching and of maritime accidents involving vessels without an AIS. In addition, recent developments in deep learning and computational technologies have enabled real-time, high-speed, large-scale image recognition, and resources for the rapid image processing of vessels and ports have become available. This has led to more deep learning research on vessel detection in images. However, owing to the lack of large datasets, most studies have focused on vessel detection rather than port terminals, and few studies have associated satellite images with AIS data.
To address these shortcomings, this study considers container yard congestion among the factors affecting port efficiency and contributes to the literature by proposing, for the first time, a method for its dynamic assessment using satellite images and AIS data without statistical information. Specifically, this study aims to classify the congestion of container yards into several levels using a machine learning model. Container yard conditions change from moment to moment and are significantly influenced by the season, day of the week, time of day, and timing of vessel arrivals and departures. Although counting the total number of containers in the yard from satellite images is a direct approach, it requires considerable time, especially for extracting the height of stacked containers. Therefore, this study first develops an annotation tool that generates training data by focusing on the shadows cast by the containers. Subsequently, a convolutional neural network (CNN) is applied for the end-to-end classification of yard congestion. Finally, AIS data are used to analyze the relationship between yard congestion and vessel arrival and departure times as an example of dynamic efficiency analysis of container terminals using satellite images.
For third parties who cannot obtain operational data directly from operators, various statistics and field surveys are otherwise necessary to investigate container yard congestion. Considering the current availability of inexpensive satellite images captured with high frequency owing to the development of space technology [6], this study aims to serve as a benchmark for when such images are abundantly available. The remainder of this paper is organized as follows: Section 2 summarizes the related literature and the data used in this study and positions this paper; Section 3 and Section 4 describe the annotation tool and the CNN model with their estimation results, respectively; Section 5 analyzes the relationship between yard congestion and the timing of vessel arrivals and departures; and finally, Section 6 summarizes the conclusions.

2. Literature Review and Data

2.1. Literature Review

Many studies have used various explanatory variables and data to evaluate the efficiency of port operations, including vessel capacity, container handling volume, berth use rate [7], number of voyage days for cargo ships and tankers [8], and AIS data [9,10]. Some studies have applied data envelopment analysis [11,12,13] or stochastic frontier analysis [14,15,16] to evaluate the efficiency of port operations. However, most of these studies have been based on statistical information and have not provided dynamic assessments [5]. Moreover, it is impossible to analyze ports for which statistical data are unavailable.
Studies on ship detection using deep learning have increased in recent years. For example, Liu et al. [17] presented a high-resolution ship dataset and proposed ship annotation and some development tools based on Google Earth images. Yang et al. [18] proposed rotation-dense feature pyramid networks applied to Google Earth images. Zhang and Zhang [19] applied a CNN for ship classification and size detection using RADARSAT-2, TerraSAR-X, and Sentinel-1 images. Wang et al. [20] used a CNN for highly accurate ship detection in environments with different scales and backgrounds using Gaofen-3 and Sentinel-1 images. Graziano et al. [21] improved the accuracy of matching synthetic aperture radar (SAR) images and AIS data by estimating vessel speeds from TerraSAR-X images. Štepec et al. [22] proposed a vessel detection method using Sentinel-2 and Planet Dove images and evaluated it on a large-scale dataset that was collected and automatically annotated with the help of AIS data. Hou et al. [23] proposed an automatic sea segmentation, ship detection, and SAR-AIS matchup procedure and presented an extensible marine target taxonomy of 15 primary ship categories, 98 sub-categories, and many non-ship targets using high-resolution C-band SAR Gaofen-3. Ping et al. [24] detected vessels from daily Planet Labs satellite images and related the number of vessels with the monthly cargo throughput acquired from the port statistics. Suo et al. [25] proposed the port-use prosperity index and applied it to six ports around the Bohai Sea, China, to evaluate the operational efficiency of ports using high-resolution optical satellite images such as Tianhui-1, Ziyuan-3, Gaofen-1, and Gaofen-2. They calculated the area occupied by cargo ships berthed along the coastline. Xu et al. [26] proposed a method called Lite-YOLOv5 for onboard ship detection using Sentinel-1 SAR images. Paolo et al. [27] classified Sentinel-1 SAR and Sentinel-2 optical images using deep learning to detect vessels. These detected vessels were compared with AIS data to determine whether the vessels were publicly tracked or not. Thus, many studies have focused on vessel detection in images and developed analytical techniques.
However, image analysis studies focusing on port terminals are limited. Yong et al. [28] applied high-resolution optical satellite images, such as Pleiades, WorldView-2, and Beijing-2, to monitor container terminal construction from 2010 to 2017 at the ports of Colombo and Hambantota in Sri Lanka. Li et al. [29] used Google Earth images to monitor the expansion of the Ajmr Port in the Philippines from 2009 to 2018. Sengupta and Lazarus [30] measured patterns of seaward expansion in 65 of the world’s top 100 container ports from satellite imagery in Google Earth Engine during 1990–2020. Yao et al. [31] proposed processing and analysis procedures for large-scale SAR image annotation, including port infrastructure as one of the classes, using TerraSAR-X images. Liu et al. [32] estimated the cargo handling capacity of ports using DMSP-OLS nighttime light data. Murata et al. [33] identified the operational status of container terminals from high-resolution nighttime-light CE-SAT-IIB satellite imagery. These studies focused on the container terminal as a whole rather than on the activity inside the terminal.
Meanwhile, the Japan Aerospace Exploration Agency [34,35] used ALOS-2 PALSAR-2 and Sentinel-1 images to periodically observe the extent of container storage areas in the port of Nagoya after the COVID-19 pandemic, although it focused only on the spread in the planar direction. Yu et al. [36] also calculated the daily average change in the number of containers in container ports using Sentinel-2 optical images to relate port container numbers to economic activity. They counted the number of pixels in each satellite image classified as containers and took this as a proxy for the number of containers in the port; however, the number of stacked layers could not be recognized precisely owing to the limited spatial resolution. In contrast, Johnsen [37,38] used TerraSAR-X images to understand the stacked structure of containers in three container terminals at the port of Oslo; the proposed method enables the detection of changes in the number of stacked containers between two different dates.
In summary, to the best of the authors’ knowledge, no studies have evaluated port efficiency using satellite images and AIS data together. The main reason is that AIS data cannot be directly used as training data in port image analysis because they contain no information on container movements within terminals, and other training data are difficult to obtain. Therefore, this study fills this research gap by applying image analysis to estimate the port congestion level, using an annotation tool to create the training data and a CNN. In particular, this study attempts to estimate the number of vertically stacked containers from their shadows, which has not been examined in previous studies. Subsequently, the operational efficiency of container terminals is evaluated by combining the estimated congestion level with the departure and arrival times of vessels acquired from AIS data as an example of dynamic efficiency analysis using the estimated values.

2.2. Data Used in This Study

This study uses satellite images because they are available to third parties and comparable across multiple ports and terminals, although other image sources, such as ground-based cameras or drones, may allow for more detailed analysis. Among the various satellite images applied in previous studies, as summarized in Section 2.1, we use imagery from Google Earth Pro. Murata et al. [39] noted that Google Earth imagery provides a more detailed spatial resolution than high-resolution satellite imagery such as WorldView-3 multispectral (1.6 m pixel resolution) and panchromatic (0.4 m pixel resolution) images. Malarvizhi et al. [40] also recommended Google Earth images as the best option for urban-related analyses.
This study uses satellite images of five berths of three terminals in the Oi Container Terminals, port of Tokyo, collected from Google Earth Pro for six days between 2018 and 2020, as shown in Table 1. The Oi Container Terminals, the core container terminals of the port of Tokyo, stretch 2354 m along a continuous seven-berth deep-water quay wall. There are four terminals, and the seven berths are equipped with 20 container cranes in total, allowing 10,000 TEU-class large container vessels to berth.
This study also uses AIS data in Section 5 to confirm when vessels arrived at and departed from each berth immediately before and after each image was captured. AIS is a communication system that transmits vessel dynamics, such as position, speed over ground, and course over ground, as well as voyage-related information, such as draft and destination, at irregular intervals ranging from a few seconds to a few minutes. It is mandatory for all oceangoing ships of 300 gross tons or more, all domestic ships of 500 gross tons or more, and all passenger ships. AIS data provide the time-series location (longitude and latitude) of each vessel; however, note that the average reporting interval of AIS data normally provided by commercial providers is about 1 h when a vessel is stopped and about 10 min when it is moving. We extract AIS data from the Seasearcher database (Lloyd’s List Intelligence), as described in detail in Appendix A.

3. Developing an Annotation Tool

3.1. Overview of the Tool

Estimating the density of containers by quickly viewing satellite images is inadequate for creating training data to classify yard congestion levels with machine learning because containers are stacked vertically in a yard. However, visually counting the number of vertically stacked containers in a yard requires considerable time and effort. Therefore, this study proposes a tool that automatically counts the number of containers in the yard, with annotation by a developer where necessary, and outputs the congestion level as training data.
Two primary container sizes (20 ft and 40 ft) are considered in creating this tool, as shown in Table 2. In detecting containers, several studies, such as [34,35,36], have focused on counting the containers spread over the storage area in the planar direction. However, detecting the number of containers stacked in the vertical direction is more challenging. This study addresses this issue by focusing on the shadows cast by the containers, which have often been used to estimate the height of objects [41]. Although adaptive binarization or semantic segmentation could be introduced to extract shadows, this study uses the hue-saturation-value (HSV) model of the image. The HSV model is a nonlinear transformation of the RGB color space, with hue indicating the color type, saturation the color vividness, and value the color brightness.
The number of containers in each detection area of a container yard is estimated by focusing on the shadows cast by the containers. Figure 1 illustrates the flow of the proposed tool. The estimation reliability of each cell is calculated sequentially by setting a cell corresponding to each container. The container occupancy rate and congestion level are then calculated after manually adjusting low-reliability cells. The processing steps of the tool are as follows:
Step 1: Set the detection area.
Step 2: Identify shadows by binarization using HSV values and preset the configuration.
Step 3: Set a cell for each container and calculate the average pixel value for each cell to detect irregularities.
Step 4: Sequentially estimate the number of vertically stacked containers in each cell and their reliability.
Step 5: Recalculate after adjusting low-reliability cells.
Step 6: Classify the congestion level based on the calculated container occupancy rates.
Figure 1. Flows of the annotation tool.

3.2. Detailed Description of the Tool

3.2.1. Step 1: Setting Detection Area

First, a container yard is divided into several detection areas to create the training data and congestion classification. The user of the tool should visually set up the detection areas for preparation, as shown in Figure 2.

3.2.2. Step 2: Identifying Shadows by Binarization Using HSV Values and Presetting Configuration

In the second step, shadows made by the containers are identified by setting the threshold HSV values. Specifically, if all HSV values of a pixel are below the thresholds, it is judged as a shadow pixel, and 0 is set as the pixel value. Otherwise, the pixel value is set to 1. This study set the threshold HSV values on a trial-and-error basis, as shown in Table 3. Figure 3 provides an example image after shadow extraction.
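As an illustration of this step, the following minimal Python sketch produces such a shadow mask with OpenCV. The threshold values follow Table 3; the strict comparison, the hue scaling, and the file name are assumptions not specified in the text.

```python
import cv2
import numpy as np

# Threshold HSV values from Table 3 (H = 180, S = 108, V = 80).
# Note: OpenCV stores 8-bit hue in [0, 179], so H = 180 effectively accepts all hues.
H_TH, S_TH, V_TH = 180, 108, 80

def shadow_mask(image_bgr: np.ndarray) -> np.ndarray:
    """Return a binary array: 0 for shadow pixels, 1 otherwise."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    h, s, v = cv2.split(hsv)
    is_shadow = (h < H_TH) & (s < S_TH) & (v < V_TH)
    return np.where(is_shadow, 0, 1).astype(np.uint8)

# Hypothetical usage:
# img = cv2.imread("oi_detection_area.png")
# binary = shadow_mask(img)
```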
Subsequently, the tool user defines a length direction vector L, a width direction vector W, and a shadow direction vector S per container for each container type (20 ft and 40 ft), as well as the starting point s0 of each detection area, by clicking in the image with the mouse pointer, as shown in Figure 4. In addition, the number of containers in the length direction (NL) and their sizes (20 ft or 40 ft, L_list), the number of containers in the width direction (NW), and the maximum number of vertically stacked containers (NV) for each detection area are obtained visually.

3.2.3. Step 3: Setting a Cell for Each Container and Its Shadow and Calculating the Average Pixel Value for Each Cell to Detect Irregularities

In the third step, a cell is set for each container in the image using the tool based on the obtained L, W, L_list, NW, and s0. Let i and j be integers taking values from 0 to NL − 1 and 0 to NW − 1, respectively, and let Cellc(i, j) denote the container cell. Subsequently, a shadow cell Cells(i, j, k) is generated for each container cell Cellc(i, j) based on the relationship among L, W, and S. Figure 5 shows examples of setting up a container cell and a shadow cell.
The average pixel value for each container and shadow cell is then calculated as follows. First, the average pixel value Dc(i, j) (0 ≤ Dc(i, j) ≤ 1) in the container cell Cellc(i, j) is calculated to determine whether the target cell is concave (i.e., whether the number of vertically stacked containers in the target cell is smaller than that in the neighboring cells). Notably, because Cellc(i, j) is a parallelogram-shaped domain, its average pixel value is calculated over a reduced number of pixels, as shown in Figure 6, to shorten the calculation time. Specifically, the pixels and their neighbors are extracted along the straight line that passes through the intersection of the diagonals of the domain and is parallel to the y-axis of the image.
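The exact cell construction is not fully specified in the text; the following sketch illustrates one plausible implementation, under the assumption that the corners of Cellc(i, j) are obtained by offsetting s0 by the length-direction vectors of the preceding positions and j width-direction vectors, and that the average pixel value is taken along the vertical line through the intersection of the cell’s diagonals (a simplification of the reduced-pixel scheme in Figure 6).

```python
import cv2
import numpy as np

def cell_corners(s0, L_vecs, W, i, j):
    """Corners of container cell Cell_c(i, j) as a parallelogram in pixel coordinates.
    L_vecs[i'] is the length-direction vector of position i' (20 ft or 40 ft);
    the layout rule (columns advance by the accumulated L vectors, rows by W) is an assumption."""
    origin = s0 + sum(L_vecs[:i]) + j * W
    return np.array([origin,
                     origin + L_vecs[i],
                     origin + L_vecs[i] + W,
                     origin + W])

def cell_average(binary, corners):
    """Average binarized pixel value D_c along the vertical line through the
    intersection of the cell's diagonals (simplified version of Figure 6)."""
    mask = np.zeros(binary.shape, dtype=np.uint8)
    cv2.fillPoly(mask, [np.round(corners).astype(np.int32)], 1)
    cx = int(round(corners[:, 0].mean()))      # x-coordinate of the diagonal intersection
    rows = mask[:, cx].astype(bool)            # image rows covered by the cell at x = cx
    return float(binary[rows, cx].mean()) if rows.any() else 1.0

# Hypothetical usage (s0, L_vecs, and W are obtained in Step 2):
# corners = cell_corners(np.array([520.0, 310.0]), L_vecs, np.array([0.0, 14.0]), i=2, j=4)
# Dc = cell_average(binary, corners)
```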
Subsequently, the average pixel value Ds(i, j, k) (0 ≤ Ds(i, j, k) ≤ 1) in the shadow cell Cells(i, j, k) is calculated starting from k = 1, and k0 is set to the first k at which Ds(i, j, k) exceeds a criterion. This study sets the criterion to 0.5, the midpoint between pure shadow (Ds(i, j, k) = 0) and non-shadow (Ds(i, j, k) = 1). Thereafter, the number of shadow cells S(i, j) created by Cellc(i, j) is calculated according to Equation (1).
\[
S(i, j) = k_0 - 1
\tag{1}
\]

3.2.4. Step 4: Sequentially Estimating the Number of Vertically Stacked Containers in Each Cell and Its Reliability

In the fourth step, the number of vertically stacked containers in Cellc(i, j), denoted by N(i, j), and its reliability P(i, j) are determined sequentially from the first row of each detection area according to Equation (2a,b). First, if the average pixel value Dc(i, j) in the container cell exceeds the criterion set in Step 3 (i.e., 0.5 in this study), the cell is judged as “not concave”, and the number of vertically stacked containers N(i, j) is determined from N(i − 1, j) and S(i, j). The reliability P(i, j) is defined by how far the average pixel value of each shadow cell lies from the criterion, that is, how clearly each shadow cell is judged as shadow or non-shadow.
By contrast, if the cell is judged as concave (i.e., Dc(i, j) is less than or equal to the criterion), the number of vertically stacked containers N(i, j) is randomly generated based on the distribution of the observed data because determining it by visual inspection is challenging. The reliability P(i, j) is set to 0.
if Dc(i, j) > 0.5:
\[
\begin{cases}
N(i, j) = S(i, j), & \forall j,\ i = 1 \\
N(i, j) = N(i - 1, j) + S(i, j), & \forall j,\ i \ge 2 \\
P(i, j) = \dfrac{1}{k_0} \displaystyle\sum_{k=1}^{k_0} \left| 1 - \dfrac{D_s(i, j, k)}{0.5} \right|, & \forall (i, j)
\end{cases}
\tag{2a}
\]
otherwise:
\[
\begin{cases}
N(i, j) = m, & \forall (i, j) \\
P(i, j) = 0, & \forall (i, j)
\end{cases}
\qquad m: \text{randomly generated from the observed data}
\tag{2b}
\]
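A minimal sketch of this sequential procedure for a single column index j is given below. It assumes that the reliability is the mean distance of the shadow-cell values from the criterion; the helper Ds_fn, the loop bound NV + 1, and the 0-based indexing of the first row are illustrative assumptions.

```python
import random
import numpy as np

CRIT = 0.5  # shadow / non-shadow criterion introduced in Step 3

def estimate_stack_heights(Dc, Ds_fn, NL, NV, observed_counts):
    """Sequentially estimate N(i, j) and P(i, j) for one column j (Equation (2a,b)).

    Dc[i]           : average pixel value of container cell Cell_c(i, j)
    Ds_fn(i, k)     : average pixel value of shadow cell Cell_s(i, j, k), k = 1, 2, ...
    observed_counts : pool of visually counted stack heights for drawing m
    """
    N = np.zeros(NL, dtype=int)
    P = np.zeros(NL)
    for i in range(NL):
        if Dc[i] > CRIT:                       # cell judged "not concave"
            Ds_vals, k0 = [], NV + 1           # fallback if no shadow cell exceeds CRIT
            for k in range(1, NV + 2):
                d = Ds_fn(i, k)
                Ds_vals.append(d)
                if d > CRIT:
                    k0 = k
                    break
            S = k0 - 1                         # Equation (1): number of shadow cells
            N[i] = S if i == 0 else N[i - 1] + S
            # reliability: mean distance of the shadow-cell values from the criterion
            P[i] = float(np.mean([abs(1.0 - d / CRIT) for d in Ds_vals]))
        else:                                  # concave cell: draw m from the observed data
            N[i] = random.choice(observed_counts)
            P[i] = 0.0
    return N, P
```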

3.2.5. Step 5: Recalculating after Adjusting Low-Reliability Cells

In the fifth step, the above calculation is repeatedly applied to low-reliability container cells, that is, cells for which P(i, j) is less than the criterion (set to 0.5 in this study, consistent with the criteria introduced in Steps 3 and 4). Specifically, low-reliability cells are highlighted in the image, the number of vertically stacked containers in Cellc(i, j) is counted visually, the value is updated, and the recalculation in Step 4 is repeated. This process continues until no cells need revision (i.e., until P(i, j) for all cells is greater than or equal to the criterion). The numbers of vertically stacked containers counted in this step are also recorded as observed data and fed back into the stochastic variable m in Equation (2b) of the previous step.

3.2.6. Step 6: Classifying the Congestion Level Based on the Calculation Results

The last step calculates the container occupancy rate D for each detection area using Equation (3). Furthermore, using the natural number z, the areas are classified into NV classes (congestion levels), each with a width of 1/NV, as shown in Equation (4); class z indicates that between z − 1 and z containers are stacked on average in the detection area. We define z as the congestion level of the container terminal.
\[
D = \frac{\sum_{i} \sum_{j} N(i, j)}{N_L \times N_W \times N_V}
\tag{3}
\]
\[
\text{class } z: \quad \frac{z - 1}{N_V} \le D < \frac{z}{N_V}, \qquad z = 1, 2, \ldots, N_V
\tag{4}
\]
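For example, given the matrix of estimated stack heights N(i, j) for a detection area, Equations (3) and (4) reduce to a few lines of Python; the example values below are hypothetical.

```python
import math
import numpy as np

def congestion_class(N, NV):
    """Container occupancy rate D (Equation (3)) and congestion level z (Equation (4)).

    N  : (NL, NW) array of estimated numbers of vertically stacked containers
    NV : maximum number of vertically stacked containers in the detection area
    """
    NL, NW = N.shape
    D = N.sum() / (NL * NW * NV)
    z = min(math.floor(D * NV) + 1, NV)   # class z covers (z - 1)/NV <= D < z/NV
    return D, z

# Hypothetical 6 x 6 detection area: three containers stacked everywhere except one empty cell.
N = np.full((6, 6), 3)
N[0, 0] = 0
print(congestion_class(N, NV=4))          # D ~ 0.73 -> congestion level 3
```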

3.3. Evaluation of the Developed Tool

The tool developed in this study is evaluated from two perspectives: the calculation time and estimation accuracy. For evaluation, 30 sample detection areas with 36 cells (NL = NW = 6) are randomly selected.
Regarding the calculation time, Figure 7 shows the frequency distribution of the percentage of low-reliability cells that must be corrected in Step 5 among the 36 cells. The average percentage of low-reliability cells is 0.17. Thus, introducing the annotation tool reduces the manual workload, on average, to less than one-fifth of that required when all containers are checked visually.
Regarding estimation accuracy, the number of containers counted visually is considered the correct value and compared with the calculation results using the proposed tool. Figure 8 compares the correct and predicted values of the container occupancy rate for each detection area. The root-mean-squared error (RMSE) and mean absolute error (MAE) are 0.0243 and 0.107, respectively. Many samples are overestimated, and some overestimate container occupancy rates by more than 10%.

3.4. Discussion of Estimation Results

This section developed an annotation tool to classify container yard congestion because ground-truth training data must be prepared. By focusing on the shadows cast by containers, the number of vertically stacked containers and the reliability of each container cell in the detection areas are sequentially estimated using binarization based on HSV values. The container occupancy rate in each detection area is estimated after manually correcting low-reliability cells. The workload of this tool is approximately one-fifth of that required to count all containers visually, and the RMSE and MAE are 0.0243 and 0.107, respectively.
Container occupancy rates are calculated using the developed tool for 163 detection areas (in six days of satellite data) in the Oi Container Terminals. The target images are classified into four congestion levels according to Equation (4) because the maximum number of vertically stacked containers in these images is four (NV = 4) in all cases. Figure 9 shows the frequency distribution and percentage of each congestion level, indicating that most data belong to Class 3, followed by Classes 4, 2, and 1.
Table 4 summarizes the characteristics of the images for which the estimated container occupancy rates differ significantly from the visually observed values, primarily for two reasons. The first is observed in detection areas close to the quayside that significantly overlap with the quay cranes, as shown in Figure 10a. This issue cannot be solved without visual correction because the cranes obscure the containers in the images.
The second is observed when the ground is exposed in the detection area, as shown in Figure 10b. This misclassification occurs because the storage area and the surrounding driving paths are not separated in the HSV binarization of Step 2. Although the number of misclassifications is small because such sparse situations are rarely observed in the dataset of this study, this issue should be addressed in the future. One possible solution is to use semantic segmentation to separate container pixels from ground pixels inside the container yard.

4. Image Classification by CNN

4.1. CNN Overview

The method described in the previous section (Section 3) requires acquiring preliminary information about the detection areas, such as their size, shadow direction, and container size sequence, and repeating corrections based on reliability, which is time-consuming and requires significant work. Therefore, this study applies machine learning to classify container congestion levels end to end with the same accuracy as the annotation tool developed in the previous section.
Specifically, a CNN is used as the machine learning model. A CNN is the most popular deep learning method for images, especially for object detection rather than segmentation, and uses convolution instead of matrix multiplication in at least one layer [42]. Feature extraction is performed by combining convolutional and pooling layers to reduce the number of parameters, which would become large and require considerable computation time if the entire network consisted of fully connected layers. In a convolutional layer, a feature map is created by convolving a kernel with the input image; the kernel size and stride width are set in advance, and feature extraction is performed by sliding the kernels over the input. In a pooling layer, the dimensionality of the image is reduced while retaining important features; in max pooling, only the maximum value in each domain is output and passed to the next layer while the filter slides over the feature map. A pooling layer is typically placed after multiple convolutional layers.
This study uses a CNN model with four convolutional layers, two pooling layers, three dropout layers, and two fully connected layers. The data for training and validation comprise 151 images (80% for training and 20% for validation), excluding the 12 misclassified images described in Section 3.4.
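The exact hyperparameters are not reported; the following Keras sketch reproduces only the stated layer counts (four convolutional, two pooling, three dropout, and two fully connected layers). The filter numbers, kernel sizes, dropout rates, input size, and optimizer are assumptions.

```python
from tensorflow.keras import layers, models

def build_cnn(input_shape=(128, 128, 3), n_classes=4):
    """CNN matching the layer counts in the text; all other settings are assumptions."""
    return models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, padding="same", activation="relu"),
        layers.Conv2D(32, 3, padding="same", activation="relu"),
        layers.MaxPooling2D(2),
        layers.Dropout(0.25),
        layers.Conv2D(64, 3, padding="same", activation="relu"),
        layers.Conv2D(64, 3, padding="same", activation="relu"),
        layers.MaxPooling2D(2),
        layers.Dropout(0.25),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),            # fully connected layer 1
        layers.Dropout(0.5),
        layers.Dense(n_classes, activation="softmax"),   # fully connected layer 2
    ])

model = build_cnn()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_images, train_labels, validation_data=(val_images, val_labels), epochs=50)
```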

4.2. Estimation Results

Table 5 summarizes the three evaluation indices. In the table, the accuracy rate represents the correct prediction rate, the recall rate is the rate of correctly predicted samples among those that actually belong to each class, and the precision rate is the rate of samples that actually belong to each class among those predicted as that class. Table 6 shows the confusion matrix of the estimation results for the 30 validation images.
Table 5 indicates that the accuracy and recall rates for Class 4 are lower than those for the other congestion levels. Table 6 also shows 11 underpredicted data points for Class 4. In addition to the insufficient network structure of the CNN and the limited number of training samples, the misclassification suggests poor decision accuracy of the annotation tool for Class 4. In particular, the complicated shapes of container shadows when the terminal is congested may affect the accuracy of both the CNN and the annotation tool. The micro-average (the rate of data for which the predicted levels match the correct ones relative to the total data) obtained from Table 6 is 0.63. The macro-average (the average accuracy rate over the congestion levels) is 0.78. The micro-average is used as a reference because the number of data points is skewed among the congestion levels in this study. Therefore, it can be concluded that the output is consistent with that of the annotation tool for 63% of the data.
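These two averages can be reproduced directly from Tables 5 and 6, as the short computation below shows.

```python
import numpy as np

# Confusion matrix from Table 6 (rows: observed classes 1-4, columns: predicted classes 1-4).
cm = np.array([[1, 0,  0, 0],
               [0, 3,  0, 0],
               [0, 0, 10, 0],
               [0, 2,  9, 5]])

micro_average = np.trace(cm) / cm.sum()              # (1 + 3 + 10 + 5) / 30 = 0.63
accuracy_rates = np.array([1.00, 0.93, 0.70, 0.47])  # per-class accuracy rates from Table 5
macro_average = accuracy_rates.mean()                # ~ 0.78

print(round(micro_average, 2), round(macro_average, 2))
```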
In summary, this section developed a CNN model that predicts the congestion level directly from images to achieve end-to-end processing, because classification using the annotation tool proposed in the previous section requires significant work. The validation results indicate that the micro-average accuracy rate is 0.63 and that all misclassified items are underestimated.

5. Congestion Analysis with AIS Data

This section presents an example analysis of the relationship between the estimated container occupancy rate in a yard from the previous sections and vessel arrivals and departures identified from AIS data, because the congestion level of container terminals is considered to fluctuate depending on the timing of containership arrivals and departures. Generally, the number of export containers stacked in a yard peaks right before a vessel arrives, whereas the number of import containers peaks right after a vessel departs. This is because the handling time for discharging from and loading onto a vessel (normally a few hours to half a day) is considerably shorter than the period over which export and import containers are carried into and out of the terminal by trucks (a few days to more than one week).
Based on the AIS data of vessels arriving at and departing from each terminal immediately before and after each image was captured, the average number of vertically stacked containers in all detection areas of each terminal (NX for export and NM for import containers), calculated from the output of the annotation tool, is compared with the elapsed time TX until the arrival of the first vessel after image capture (or TM from the departure of the last vessel before image capture). Figure 11 shows the definitions of the elapsed times TX and TM. Note that if a vessel was berthed at the time of image capture, the elapsed time is negative, as shown in the figure. Details of the estimation of the elapsed times from the AIS data are described in Appendix A. NX and NM are expected to be higher (more containers are expected to remain in the yard) as the absolute values of TX and TM become smaller (i.e., the closer the image capture time is to the vessel arrival or departure time).
One issue in this analysis is that export and import containers cannot be differentiated in satellite images. Therefore, the yard layout and the breakdown of loaded, unloaded, and empty containers for each detection area are obtained from Terminal X. Notably, the times at which these breakdowns were obtained differ from the image capture times. Thus, the general rates of export and import containers (rX(a, b) and rM(a, b)) for each detection area a of berth b are estimated from the observed data at multiple time points, as shown in Table 7 (note that most empty containers are exported from the port of Tokyo). Accordingly, the average numbers of vertically stacked export and import containers for each terminal, NX and NM, are calculated as
\[
N_X = \frac{\sum_{b} \sum_{a} r_X(a, b) \cdot z(a, b)}{\sum_{b} \sum_{a} r_X(a, b)}, \qquad
N_M = \frac{\sum_{b} \sum_{a} r_M(a, b) \cdot z(a, b)}{\sum_{b} \sum_{a} r_M(a, b)},
\tag{5}
\]
where z(a, b) is the congestion level of detection area a of berth b. Equation (5) assumes that each terminal uses all detection areas in the same manner wherever a containership berths in the terminal, reflecting current operations. For Terminals Y and Z, for which observed numbers of containers are unavailable, the same number of vertically stacked containers in all detection areas is assumed for export and import. The estimated average numbers of vertically stacked containers in Terminal X do not differ significantly between exports and imports, indicating that the estimation method for separating export and import containers in this study is still inadequate.
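A small sketch of Equation (5) is given below; the dictionary keys (a, b) and the example values are hypothetical.

```python
def stacked_average(rates, z):
    """Weighted average number of vertically stacked containers, Equation (5).

    rates[(a, b)] : export (or import) rate of detection area a at berth b (Table 7)
    z[(a, b)]     : congestion level of detection area a at berth b
    """
    denominator = sum(rates.values())
    numerator = sum(rates[key] * z[key] for key in rates)
    return numerator / denominator if denominator > 0 else float("nan")

# Hypothetical terminal with two detection areas at one berth:
r_export = {(1, 1): 0.5, (2, 1): 0.5}
z_level  = {(1, 1): 3,   (2, 1): 4}
N_X = stacked_average(r_export, z_level)   # -> 3.5
```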
First, by checking the coherence between the satellite images and the AIS data, the satellite image from one of the six days (16 October 2018) used in the previous sections is excluded from the following analyses because of a contradiction: no berthing vessels were found in the satellite image, whereas the AIS data indicated vessels berthing, possibly owing to an inaccurate image capture date. Figure 12 shows the relationship between the estimated average number of vertically stacked containers (NX or NM) and the elapsed time until the arrival of the first vessel (TX) or from the departure of the last vessel (TM) at the three terminals for the other five days. The figure indicates a negative correlation between the maximum value of the average number of vertically stacked containers NX (or NM) and the absolute value of the elapsed time |TX| (or |TM|), represented by the envelope curve in the figure.
Although the above analysis focuses on the overall trends of the three terminals, Figure 12 indicates that each terminal has different characteristics; for example, the number of vertically stacked containers in Terminal X was generally smaller than in the other terminals and was not correlated with the elapsed time. In addition, it is generally known that more containers are loaded onto and discharged from larger vessels. Therefore, the relationships between the average numbers of vertically stacked containers NX and NM and vessel size are investigated in Figure 13, grouped by the absolute value of the elapsed time. The figure indicates that, in Terminals X and Y, the average number of vertically stacked containers NX (or NM) within the same elapsed-time group was larger for larger vessels. In addition, the average number of vertically stacked containers NX (or NM) was smaller in these terminals when the absolute value of the elapsed time |TX| (or |TM|) was larger. However, these tendencies were not observed in Terminal Z.
Considering the results shown in Figure 12 and Figure 13, the average number of vertically stacked containers correlated more with the elapsed time than with vessel size in Terminals Y and Z, although the sample size is insufficient. By contrast, it correlated more with vessel size than with the elapsed time in Terminal X. One possible explanation is that the effect of vessel size is more clearly observable in Terminal X because its average number of vertically stacked containers NX (or NM) is relatively small (i.e., the terminal is not congested). By contrast, the changes in the number of vertically stacked containers are more dynamic in Terminals Y and Z because stricter time management of yard operations is necessary under heavier congestion.
In conclusion, as an example of dynamic efficiency analysis of container terminals, this section investigated the relationship between the estimated average number of vertically stacked containers and the elapsed time between image capture and vessel arrival or departure acquired from the AIS data. A negative correlation is observed between the maximum value of the estimated average number of vertically stacked containers and the absolute value of each elapsed time, although the number of samples used in the analysis is small. More specifically, because stricter time management of yard operations is necessary in congested terminals, the closer the image capture time is to the vessel arrival or departure time, the greater the container occupancy rate of the terminal. In the uncongested terminal, by contrast, the average number of vertically stacked export and import containers increases with vessel size because more containers are loaded onto and discharged from larger vessels. Since these discussions are based on a very limited number of samples, increasing the sample size and timeliness remains a significant issue. In addition, all satellite images were captured at the same time (9:00 a.m.), as shown in Table 1; thus, variation in capture time is necessary for further analysis because many vessels arrived in the early morning (right before the image capture time) and left around midnight (several hours before the image capture time), as observed in Figure 12, although these terminals operate throughout the day in principle.

6. Conclusions

This study proposed a method to dynamically estimate the container occupancy rate in container yards and classify their congestion level using satellite images, by developing an annotation tool to reduce the workload of creating the training data and an end-to-end classification model using machine learning. The relationship between the congestion level and the elapsed time until vessel arrival (or from vessel departure) was examined using AIS data as an example of dynamic efficiency analysis of container terminals using satellite images.
This study proposed a prototype for dynamically estimating the number of vertically stacked containers and the congestion level of container terminals using satellite images without statistical information, and examined the relationship between the congestion level and the timing of vessel arrivals by combining it with AIS data. The proposed method can contribute to dynamic operational analyses of container terminals, which is particularly beneficial for third parties who cannot obtain operational data directly from operators but wish to understand and compare the overall efficiency of terminal operations. While this study aims to serve as a benchmark for when high-resolution satellite images become abundantly available, the Google Earth images used in this study are not updated in real time, making them unsuitable for monitoring daily status changes. However, satellite constellations with 30 cm resolution and higher revisit frequency are planned for the near future [6]; therefore, if images with spatial resolution nearly comparable to Google Earth imagery become available with high frequency, real-time monitoring of container terminals and measurement of operational efficiency by third parties would become feasible. The approach can also be applied to analyzing container cargo flows and congestion on land, such as trucks, railways, inland depots, and warehouses, where comprehensive real-time information on the movement of transport means, comparable to AIS, is unavailable to third parties. In addition, the findings regarding the relationships among the yard congestion level, its fluctuation over time, and vessel size improve understanding of the essence of container yard operations.
However, some issues require additional investigation. First, the proposed annotation tool could be improved further. For example, although this study evaluated the accuracy of the tool by comparing it with the number of containers counted visually, it should be better evaluated with observed data acquired from the terminals to ensure objectivity. More advanced techniques for developing annotation tools, such as ensemble learning and active learning (or optimal experimental design), which combine multiple learning processes and automatically decide the direction of learning, can also be introduced. Moreover, semantic segmentation can be applied to differentiate pixels with cranes or those without containers (i.e., the ground area and driving path in the terminal) from vertically stacked containers to reduce misclassified images. Validation of some criteria used in the tool is also necessary.
Increasing the sample size and timeliness are also significant issues, as discussed above. If the development of satellite technology enables satellite image acquisition at a lower cost and with higher frequency, more sophisticated models and analyses will become possible. SAR images can also improve the precision of the analysis. More detailed analyses of the relationship between yard congestion and vessel arrival or departure are also necessary by increasing the sample size. By incorporating the dataset representing various situations of container terminals, we can conduct the dynamic and real-time assessment of port operation efficiencies, including comparative analysis with the current static assessment using statistics.
Other methods of measuring the operational efficiency of container terminals using satellite images are also possible, such as observing the operation of quay cranes combined with vessel berthing information from the AIS. Another example is detecting the length of truck queues outside the gates of container yards, because yard congestion causes many problems outside the terminal, including truck congestion. Future work should focus on detecting congestion at the destinations to which yard congestion propagates and analyzing its relationship with yard congestion.

Author Contributions

Conceptualization, K.Y. and R.S.; methodology, K.Y.; software, K.Y.; validation, K.Y. and R.S.; formal analysis, K.Y. and R.S.; investigation, K.Y., R.S., R.Y. and H.M.; resources, R.S.; data curation, K.Y. and R.S.; writing—original draft preparation, K.Y. and R.S.; writing—review and editing, R.S., R.Y. and H.M.; visualization, K.Y.; supervision, R.S.; project administration, R.S.; funding acquisition, R.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by JSPS KAKENHI, grant number 20H00286.

Data Availability Statement

Data are contained within the article or available on request due to some restrictions.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

Using the AIS data, we identify the vessels that arrived at and departed from each terminal immediately before and after each image was captured, as follows. First, polygons were set in the water area in front of the target terminals, as shown in Figure A1. Subsequently, the AIS data of containerships with capacities of more than 300 TEU (i.e., excluding barge vessels) that stopped in the polygon within one week before or after the date and time the satellite image was captured were extracted. This study assumed that a vessel stopped when its speed over ground was less than one knot inside the polygon. Among these containerships, the first vessel arriving at either berth of the terminal after the image was captured (for the export analysis) and the last vessel departing from either berth before the image was captured (for the import analysis) were determined.
If a containership stopped at either berth of the terminal when the image was captured, the vessel was considered the first arrival and last departure vessel, as explained in Figure 11 of Section 5. However, as an exceptional case, if the vessel arrived within 1 h before image capture, the vessel was considered the first arrival vessel but not the last departure vessel (i.e., the previous vessel was considered to be the last departure vessel instead), because of the lag between the times at which the vessel stopped and when the cargo discharge started. Similarly, if the vessel departed within 1 h after image capture, the vessel was considered the last departure vessel but not the first arrival vessel (i.e., the next vessel was regarded as the first arrival vessel instead).
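The selection rule described above can be summarized in the following sketch. It assumes that the AIS records have already been filtered to stops (speed over ground below one knot) inside the berth polygon and aggregated to one arrival and one departure time per vessel call; the DataFrame layout and column names are assumptions.

```python
import pandas as pd

def first_arrival_last_departure(calls, t_cap, lag=pd.Timedelta("1h")):
    """Pick the first-arriving and last-departing containerships around the capture time.

    calls : DataFrame with one row per vessel call and columns 'arrival', 'departure'
            (stops of >300 TEU containerships inside the berth polygon, +/- one week)
    t_cap : timestamp at which the satellite image was captured
    """
    arrivals, departures = [], []
    for _, c in calls.iterrows():
        berthed = c["arrival"] <= t_cap <= c["departure"]
        if berthed:
            if c["arrival"] >= t_cap - lag:        # arrived within 1 h before capture:
                arrivals.append(c["arrival"])      # counts as first arrival only
            elif c["departure"] <= t_cap + lag:    # departs within 1 h after capture:
                departures.append(c["departure"])  # counts as last departure only
            else:                                  # ordinary berthed vessel: counts as both
                arrivals.append(c["arrival"])
                departures.append(c["departure"])
        else:
            if c["arrival"] > t_cap:
                arrivals.append(c["arrival"])
            if c["departure"] < t_cap:
                departures.append(c["departure"])
    first_arrival = min(arrivals) if arrivals else None
    last_departure = max(departures) if departures else None
    # Elapsed times as defined in Figure 11 (negative if the vessel was berthed at capture).
    T_X = first_arrival - t_cap if first_arrival is not None else None
    T_M = t_cap - last_departure if last_departure is not None else None
    return T_X, T_M
```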
Figure A1. Example of a polygon (red area) related to a target terminal for extracting AIS data of berthing vessels (yellow arrow).

References

  1. Tu, E.; Zhang, G.; Rachmawati, L.; Rajabally, E.; Huang, G.-B. Exploiting AIS data for intelligent maritime navigation: A comprehensive survey from data to methodology. IEEE Trans. Intell. Transp. Syst. 2018, 19, 1559–1582. [Google Scholar] [CrossRef]
  2. Svanberg, M.; Santén, V.; Hörteborn, A.; Holm, H.; Finnsgård, C. AIS in maritime research. Mar. Policy 2019, 106, 103520. [Google Scholar] [CrossRef]
  3. Yang, D.; Wu, L.; Wu, S.; Jia, H.; Li, X.K. How big data enriches maritime research—A critical review of Automatic Identification System (AIS) data applications. Transp. Rev. 2019, 39, 755–773. [Google Scholar] [CrossRef]
  4. Kanamoto, K.; Murong, L.; Nakashima, M.; Shibasaki, R. Can maritime big data be applied to shipping industry analysis?—Focusing on commodities and vessel sizes of dry bulk carriers. Marit. Econ. Logist. 2021, 23, 211–236. [Google Scholar] [CrossRef]
  5. Filom, S.; Amiri, A.M.; Razavi, S. Applications of machine learning methods in port operations—A systematic literature review. Transp. Res. Part E Logist. Transp. Rev. 2022, 161, 102722. [Google Scholar] [CrossRef]
  6. Planet Labs. Our Next-Generation Satellite Constellation Pelican Is Expected to Deliver Very-High-Resolution and Rapid-Revisit Capabilities. 2022. Available online: https://www.planet.com/pulse/our-next-generation-satellite-constellation-pelican-is-expected-to-deliver-very-high-resolution-and-rapid-revist-capabilities/ (accessed on 4 July 2023).
  7. Chen, L.; Zhang, D.; Ma, X.; Wang, L.; Li, S.; Wu, Z.; Pan, G. Container port performance measurement and comparison leveraging ship GPS traces and maritime open data. IEEE Trans. Intell. Transp. Syst. 2016, 17, 1227–1242. [Google Scholar] [CrossRef]
  8. Scully, B.; Mitchell, K.N. Archival Automatic Identification System (AIS) Data for Navigation Project Performance Evaluation. Coastal and Hydraulics Engineering Technical Note, Coastal Hydraulics Laboratory, The U.S. Army Engineer Research and Development Center (ERDC/CHL CHETN-IX-40). 2015. Available online: https://apps.dtic.mil/sti/tr/pdf/ADA623191.pdf (accessed on 4 July 2023).
  9. Farhadi, N.; Parr, S.A.; Mitchell, K.N.; Wolshon, B. Use of nationwide automatic identification system data to quantify resiliency of marine transportation systems. Transp. Res. Rec. J. Transp. Res. Board 2016, 2549, 9–18. [Google Scholar] [CrossRef]
  10. Shibasaki, R.; Kanamoto, K.; Suzuki, T. Estimating global pattern of LNG supply chain: A port-based approach by vessel movement database. Marit. Policy Manag. 2020, 47, 143–171. [Google Scholar] [CrossRef]
  11. Zhou, P.; Ang, B.W.; Poh, K.L. A survey of data envelopment analysis in energy and environmental studies. Eur. J. Oper. Res. 2008, 189, 1–18. [Google Scholar] [CrossRef]
  12. Zheng, X.B.; Park, N.K. A study on the efficiency of container terminals in Korea and China. Asian J. Shipp. Logist. 2016, 32, 213–220. [Google Scholar] [CrossRef]
  13. Xu, Y.; Ishiguro, K. Measuring the efficiency of automated container terminals in China and Korea. Asian Transp. Stud. 2019, 5, 584–599. [Google Scholar] [CrossRef]
  14. Culliname, K.; Song, D.-W.; Gray, R. A stochastic frontier model of the efficiency of major container terminals in Asia: Assessing the influence of administrative and ownership structures. Transp. Res. Part A Policy Pract. 2002, 36, 743–762. [Google Scholar] [CrossRef]
  15. Tongzon, J.; Heng, W. Port privatization, efficiency and competitiveness: Some empirical evidence from container ports (terminals). Transp. Res. Part A Policy Pract. 2005, 39, 405–424. [Google Scholar] [CrossRef]
  16. Wiegmans, B.; Witte, P. Efficiency of inland waterway container terminals: Stochastic frontier and data envelopment analysis to analyze the capacity design- and throughput efficiency. Transp. Res. Part A Policy Pract. 2017, 106, 12–21. [Google Scholar] [CrossRef]
  17. Liu, Z.; Yuan, L.; Weng, L.; Yang, Y. A high resolution optical satellite image dataset for ship recognition and some new baselines. In Proceedings of the 6th International Conference on Pattern Recognition Applications and Methods (ICPRAM 2017), Porto, Portugal, 24–26 February 2017; pp. 324–331. [Google Scholar] [CrossRef]
  18. Yang, X.; Sun, H.; Fu, K.; Yang, J.; Sun, X.; Yan, M.; Guo, Z. Automatic ship detection in remote sensing images from Google Earth of complex scenes based on multiscale rotation dense feature pyramid networks. Remote Sens. 2018, 10, 132. [Google Scholar] [CrossRef]
  19. Zhang, T.; Zhang, X. High-speed ship detection in SAR images based on a grid convolutional neural network. Remote Sens. 2019, 11, 1206. [Google Scholar] [CrossRef]
  20. Wang, Y.; Wang, C.; Zhang, H.; Dong, Y.; Wei, S. A SAR dataset of ship detection for deep learning under complex backgrounds. Remote Sens. 2019, 11, 765. [Google Scholar] [CrossRef]
  21. Graziano, M.D.; Renga, A.; Moccia, A. Integration of Automatic Identification System (AIS) data and single-channel Synthetic Aperture Radar (SAR) images by SAR-based ship velocity estimation for maritime situational awareness. Remote Sens. 2019, 11, 2196. [Google Scholar] [CrossRef]
  22. Štepec, D.; Martinčič, T.; Skočaj, D. Automated system for ship detection from medium resolution satellite optical imagery. In Proceedings of the Oceans 2019 MTS/IEEE Seattle, Seattle, WA, USA, 27–31 October 2019; pp. 1–10. [Google Scholar] [CrossRef]
  23. Hou, X.; Ao, W.; Song, Q.; Lai, J.; Wang, H.; Xu, F. FUSAR-Ship: Building a high-resolution SAR-AIS matchup dataset of Gaofen-3 for ship detection and recognition. Sci. China Inform. Sci. 2020, 63, 140303. [Google Scholar] [CrossRef]
  24. Ping, Y.; Zhao, J.; Shi, H.; Zhang, Q.; Xiang, B. An approximate system for evaluating real-time port operations based on remote sensing images. Int. J. Remote Sens. 2021, 42, 783–793. [Google Scholar] [CrossRef]
  25. Suo, A.; Xu, J.; Li, X.; Wei, B. Evaluation of port prosperity based on high spatial resolution satellite remote sensing images. Chin. Geogr. Sci. 2020, 30, 889–899. [Google Scholar] [CrossRef]
  26. Xu, X.; Zhang, X.; Zhang, T. Lite-YOLOv5: A lightweight deep learning detector for on-board ship detection in large-scene Sentinel-1 SAR images. Remote Sens. 2022, 14, 1018. [Google Scholar] [CrossRef]
  27. Paolo, F.S.; Kroodsma, D.; Raynor, J.; Hochberg, T.; Davis, P.; Cleary, J.; Marsaglia, L.; Orofino, S.; Thomas, C.; Halpin, P. Satellite mapping reveals extensive industrial activity at sea. Nature 2024, 625, 85–91. [Google Scholar] [CrossRef] [PubMed]
  28. Yong, G.; Yuehong, C.; Yuanxin, J.; Xian, G.; Shan, H. Dynamic monitoring the infrastructure of major ports in Sri Lanka by using multi-temporal high spatial resolution remote sensing images. J. Geogr. Sci. 2018, 28, 973–984. [Google Scholar] [CrossRef]
  29. Li, H.; Wu, M.; Tian, D.; Wu, L.; Niu, Z. Monitoring and analysis of the expansion of the Ajmr Port, Davao City, Philippines using multi-source remote sensing data. PeerJ 2019, 7, e7512. [Google Scholar] [CrossRef]
  30. Sengupta, D.; Lazarus, E.D. Rapid seaward expansion of seaport footprints worldwide. Commun. Earth Environ. 2023, 4, 440. [Google Scholar] [CrossRef]
  31. Yao, W.; Dumitru, C.O.; Loffeld, O. Semi-supervised hierarchical clustering for semantic SAR image annotation. IEEE J. Sel. Top. Appl. 2016, 9, 1993–2008. [Google Scholar] [CrossRef]
  32. Liu, A.; Wei, Y.; Yu, B.; Song, W. Estimation of Cargo Handling Capacity of Coastal Ports in China Based on Panel Model and DMSP-OLS Nighttime Light Data. Remote Sens. 2019, 11, 582. [Google Scholar] [CrossRef]
  33. Murata, H.; Shibasaki, R.; Imura, N.; Nishinari, K. Identifying the operational status of container terminals from high-resolution nighttime-light satellite image for global supply chain network optimization. Front. Remote Sens. 2023, 4, 1229745. [Google Scholar] [CrossRef]
  34. Japan Aerospace Exploration Agency. On Change of Port and Harbors after COVID-19 Pandemic (In Japanese). Available online: https://earth.jaxa.jp/covid19/industry/index.html (accessed on 4 July 2022).
  35. Hamamoto, K.; Kuze, A.; Tadono, T.; Sobue, S.; Ishizawa, J.; Ohyoshi, K.; Murakami, H.; Kawamura, K.; Ikehata, Y. JAXA’s Earth Observation Data Analysis on COVID-19. In Proceedings of the 2021 IEEE International Geoscience and Remote Sensing Symposium IGARSS, Brussels, Belgium, 11–16 July 2021; pp. 1362–1365. [Google Scholar] [CrossRef]
  36. Yu, H.; Hao, X.; Wu, L.; Zhao, Y.; Wang, Y. Eye in outer space: Satellite imageries of container ports can predict world stock returns. Humanit. Soc. Sci. Commun. 2023, 10, 383. [Google Scholar] [CrossRef]
  37. Johnsen, T. Change detection and detailed analysis of stacking configuration of container in TerraSAR-X SAR images. In Proceedings of the 2010 IEEE Radar Conference, Arlington, VA, USA, 10–14 May 2010. [Google Scholar]
  38. Johnsen, T. Coherent change detection in SAR images of harbors with emphasis on findings from container backscattering. In Proceedings of the 2011 IEEE Radar Conference, Kansas City, MO, USA, 23–27 May 2011. [Google Scholar]
  39. Murata, H.; Yonezawa, C. Detection of submerged aquaculture raft using a drone-based multispectral camera. In Proceedings of the 43rd Asian Conference on Remote Sensing (ACRS), Ulaanbaatar, Mongolia, 3–5 October 2022; Available online: https://a-a-r-s.org/proceeding/ACRS2022/ACRS22_89.pdf (accessed on 5 January 2024).
  40. Malarvizhi, K.; Kumar, S.V.; Porchelvan, P. Use of high resolution Google Earth satellite imagery in landuse map preparation for urban related applications. Procedia Technol. 2016, 24, 1835–1842. [Google Scholar] [CrossRef]
  41. Wang, T.; Li, Y.; Yu, S.; Liu, Y. Estimating the volume of oil tanks based on high-resolution remote sensing images. Remote Sens. 2019, 11, 793. [Google Scholar] [CrossRef]
  42. Goodfellow, I.; Bengio, Y.; Courville, A. Deep Learning; MIT Press: Cambridge, MA, USA, 2016. [Google Scholar]
Figure 2. Example of detection area set in a container yard.
Figure 3. Example of binarization using HSV values.
Figure 4. Example of presetting configuration of L, W, S, s0, NL, L_list, NW, and NV.
Figure 5. Setting of a container cell Cellc(i, j) and shadow cell Cells(i, j, k).
Figure 6. Pixels used for calculating the average pixel value in the container or shadow cell.
Figure 7. Histogram of the rate of the number of cells that need to be corrected manually.
Figure 8. Comparison between observed and predicted container occupancy rates.
Figure 9. Distribution of estimated container occupancy rates and congestion levels.
Figure 10. Example images of detected areas where the estimated container occupancy rates differ significantly.
Figure 11. Definition of elapsed time between the arrival time of the next vessel (or departure time of the last vessel) and the time of image capture.
Figure 12. Relationship between the arrival time of the next vessel (or departure time of the last vessel) and estimated average number of vertically stacked containers.
Figure 13. Relationship between the estimated average number of vertically stacked containers, vessel size, and elapsed time by export and import.
Table 1. Satellite images of the Oi Container Terminal used in this study.
Date | Time (in Japan Time) | Note
11 January 2018 | 9:00 a.m. |
16 October 2018 | 9:00 a.m. | Excluded from the analysis of Section 5 because of non-correspondence with AIS data
30 November 2018 | 9:00 a.m. |
1 January 2019 | 9:00 a.m. |
25 October 2019 | 9:00 a.m. |
16 December 2020 | 9:00 a.m. |
Table 2. External dimensions of primary ISO containers. (Source: International Organization for Standardization).
Type | 20 Feet | 40 Feet | 40 Feet High Cube
Length | 6058 mm | 12,192 mm | 12,192 mm
Width | 2438 mm | 2438 mm | 2438 mm
Height | 2591 mm | 2591 mm | 2896 mm
Table 3. Threshold HSV values. (Source: authors).
H | S | V
180 | 108 | 80
Table 4. Features and number of images with significantly different estimation results.
Feature | Estimated Congestion Level 1 | Level 2 | Level 3 | Level 4 | Total
(1) Quay cranes are largely overlapped | 0 | 0 | 0 | 1 | 1
(2) Ground is exposed in the detection area | 1 | 0 | 3 | 0 | 4
Both (1) and (2) | 0 | 0 | 1 | 3 | 4
Others | 1 | 2 | 0 | 0 | 3
Total | 2 | 2 | 4 | 4 | 12
Table 5. Metrics by congestion level.
Metric | Congestion Level 1 | Level 2 | Level 3 | Level 4
Accuracy rate | 1 | 0.93 | 0.7 | 0.47
Recall rate | 1 | 1 | 1 | 0.24
Precision rate | 1 | 0.6 | 0.53 | 1
Table 6. Confusion matrix of the CNN.
Observed \ Predicted Congestion Level | 1 | 2 | 3 | 4 | Total
1 | 1 | 0 | 0 | 0 | 1
2 | 0 | 3 | 0 | 0 | 3
3 | 0 | 0 | 10 | 0 | 10
4 | 0 | 2 | 9 | 5 | 16
Total | 1 | 5 | 19 | 5 | 30
Table 7. Estimated rate of export and import containers for each terminal.
Detection Area | Terminal X | Terminal Y | Terminal Z
 | Berth 1 | Berth 2 | Berth 3 | Berth 4 | Berth 5
 | Export | Import | Export | Import | Export | Import | Export | Import | Export | Import
1 | 0.5 | 0.5 | 0.5 | 0.5 | 0.5 | 0.5 | 0.5 | 0.5 | 0.5 | 0.5
2 | 0.5 | 0.5 | 0.5 | 0.5 | | | | | |
3 | 0 | 1 | 0 | 1 | | | | | |
4 | 0.5 | 0.5 | 0.5 | 0.5 | | | | | |
5 | 1 | 0 | 1 | 0 | | | | | |