**Figure 1.**
Map of the study area, showing a false color image ([R,G,B] = [NIR,R,G]) from a January 2019 Sentinel-2 mosaic.

**Figure 2.**
S1 and S2 image preprocessing method.

**Figure 3.**
Finding optimum parameters for segment generation, and using the segments to create segmented and K-means cluster images.

**Figure 4.**
Assessment of generated segments against a polygon outlining one of the 75 test fields. In this example, the total area error is the sum of the areas of the red and green shapes, and there are three segments intersecting the polygon outlining the field.
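The area error described in this caption is the area of the symmetric difference between a segment and the reference field polygon. A minimal sketch of that idea, simplified to axis-aligned rectangles `(xmin, ymin, xmax, ymax)` rather than the arbitrary polygons used in the study (function names are illustrative):

```python
def rect_area(r):
    """Area of an axis-aligned rectangle (xmin, ymin, xmax, ymax)."""
    return max(0.0, r[2] - r[0]) * max(0.0, r[3] - r[1])

def intersection_area(a, b):
    """Overlap area of two axis-aligned rectangles (0 if disjoint)."""
    ix = (max(a[0], b[0]), max(a[1], b[1]), min(a[2], b[2]), min(a[3], b[3]))
    return rect_area(ix) if ix[0] < ix[2] and ix[1] < ix[3] else 0.0

def area_error(field, segment):
    """Symmetric-difference area: field area missed plus segment area
    spilling outside the field (the red and green shapes in Figure 4)."""
    inter = intersection_area(field, segment)
    return (rect_area(field) - inter) + (rect_area(segment) - inter)

# A 10 x 10 field and an equal-sized segment shifted 2 units right:
print(area_error((0, 0, 10, 10), (2, 0, 12, 10)))  # 40.0
```

For real field and segment polygons the same quantity would be computed with a polygon library's symmetric-difference operation rather than rectangle arithmetic.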

**Figure 5.**
Web apps for collecting (**a**) training and (**b**) validation data.

**Figure 6.**
Generating band samples from training data and filtering samples to include only those belonging to points with evergreen or summer growth.

**Figure 7.**
Selecting optimum supervised classifiers and parameters per band combination.

**Figure 8.**
Method of generating pixel-based, object-based and refined object-based maps.

**Figure 9.**
The average number of segments per field polygon geometry, as a function of (**a**) the SNIC segmentation size parameter, and (**b**) the average segment area error. Multiple segment merging settings are shown. For example, $2 \times \Delta < 0.1$ merges adjacent segments if the mean NDVI difference across all months is less than 0.1, with two iterations.
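The merging rule in this caption can be sketched as repeated passes over segment adjacencies, joining neighbours whose mean NDVI (averaged over all months) differs by less than a threshold. A hypothetical union-find sketch (the segment IDs, NDVI values, and the choice to keep the representative's NDVI rather than an area-weighted mean are all illustrative, not the paper's implementation):

```python
def merge_segments(mean_ndvi, adjacency, threshold=0.1, iterations=2):
    """Merge adjacent segments whose mean-NDVI difference is below threshold.

    mean_ndvi: {segment_id: mean NDVI across all months}
    adjacency: list of (id_a, id_b) neighbour pairs
    Returns {segment_id: merged_group_id}.
    """
    parent = {s: s for s in mean_ndvi}

    def find(s):
        # Walk up to the group representative, compressing the path.
        while parent[s] != s:
            parent[s] = parent[parent[s]]
            s = parent[s]
        return s

    for _ in range(iterations):
        for a, b in adjacency:
            ra, rb = find(a), find(b)
            if ra != rb and abs(mean_ndvi[ra] - mean_ndvi[rb]) < threshold:
                parent[rb] = ra  # merge b's group into a's group

    return {s: find(s) for s in mean_ndvi}

labels = merge_segments({1: 0.30, 2: 0.35, 3: 0.80}, [(1, 2), (2, 3)])
print(labels)  # {1: 1, 2: 1, 3: 3}: segments 1 and 2 merge; 3 stays separate
```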

**Figure 10.**
Analysis of K-means (k = 3) clustering of training points. (**a**) shows the number of training points per class and per cluster. (**b**) shows the maximum NDVI for each training point per cluster, with classes grouped into Perennial, Annual and Other. (**c**–**e**) show the time-series for training points from each cluster, including the mean and 1 and 2 standard deviations from the mean.

**Figure 11.**
Time series NDVI for the 12 classes within K-means cluster 0. The graphs show the mean value for all training points per class, along with one and two standard deviations from the mean. The classes shown are: (**a**) Almond, (**b**) Annual, (**c**) Cherry, (**d**) Citrus, (**e**) Forest, (**f**) Hazelnut, (**g**) Olive, (**h**) Other, (**i**) Plum, (**j**) Stonefruit, (**k**) Vineyard and (**l**) Walnut.

**Figure 12.**
Sample classification accuracy for: (**a**) one month and (**b**) all combinations of two months S2 5-band images.

**Figure 13.**
Example 33 km² area of the classified maps. (**a**) K-means clustering with k = 3. (**b**) Object-based classified map using S1 radar time series (TS). (**c**) Pixel-based classification using S2(10) + S1(2) TS. (**d**) Proportion of pixels belonging to the majority class within each segment. (**e**) Refined object-based map using S2(10) + S1(2) TS features with a proportion threshold of 0%. (**f**) Refined object-based map using S2(10) + S1(2) TS features with a proportion threshold of 80%.

**Figure 14.**
Overall accuracy of object-based classified maps generated from the segmented image, assessed using the random validation segments. All results use the SVM classifier with the RBF kernel, apart from those marked “CART” and “RF”. The optimal SVM RBF parameters used for each classification are shown on the bars. Additional notes: **1** CART parameters: MinSplitPopulation: 1, MinLeafPopulation: 1, MaxDepth: 10. **2** RF parameters: NumberOfTrees: 128, VariablesPerSplit: 16, MinLeafPopulation: 2. **3** SVM RBF supervised classification applied over the whole map (i.e., without using K-means clustering to filter samples and classify some Annual and Other areas).

**Figure 15.**
Trading off average producers’ and users’ accuracies by adjusting the threshold for the proportion of pixels in each segment belonging to the majority class.
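The refinement behind this trade-off relabels a segment with its majority pixel class only when that class's share of the segment's pixels meets a threshold. A minimal sketch of that decision rule (class names and the 60% default are illustrative):

```python
from collections import Counter

def refine_segment(pixel_classes, threshold=0.6):
    """Return the majority class if its share of the segment's pixels
    reaches the threshold; otherwise None (segment left unrefined)."""
    counts = Counter(pixel_classes)
    majority, count = counts.most_common(1)[0]
    proportion = count / len(pixel_classes)
    return majority if proportion >= threshold else None

print(refine_segment(["Citrus"] * 8 + ["Annual"] * 2))  # Citrus (80% >= 60%)
print(refine_segment(["Citrus"] * 5 + ["Annual"] * 5))  # None (50% < 60%)
```

Raising the threshold keeps only high-purity segments (improving users’ accuracy at the cost of producers’ accuracy), which is the trade-off the figure plots.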

**Figure 16.**
Final classified map using S2(10) + S1(2) time series features, and a majority pixel proportion threshold of 60%. The location of the detailed maps in Figure 13 is indicated by the black inset rectangle, and points show the locations of the validation segments.

**Figure 17.**
Confusion matrices for the final map accuracy. (**a**) Count-based. (**b**) Area-based.

**Table 1.**
Definitions of the nine perennial crop classes and three remaining classes.

Group | Class | Notes
---|---|---
Perennial crops | Citrus | Includes common oranges (mainly the Valencia) and navel oranges Citrus sinensis. There are also some Grapefruit Citrus paradisi, Lemon Citrus limon and Mandarin Citrus reticulata orchards.
 | Almond | Prunus dulcis.
 | Cherry | Prunus avium.
 | Plum | Prunus domestica, which in this area are mainly used to produce dried prunes.
 | Stonefruit | Other stonefruit, which includes small areas of Nectarines Prunus persica var. nucipersica, Peaches Prunus persica and Apricots Prunus armeniaca.
 | Olive | Olea europaea.
 | Hazelnut | Corylus avellana.
 | Walnut | Juglans regia.
 | Vineyard | Vitis vinifera, mainly used to grow grapes for wine production in this area.
Other areas | Annual | Annual crops, which includes those mainly grown over the summer (such as rice, cotton and maize) and those grown over the winter (such as barley, canola and wheat), as well as melons and vegetables.
 | Forest | Trees other than those used to produce a crop, such as native forest areas, mainly consisting of evergreen Eucalyptus.
 | Other | All other areas, including water, buildings, grass and cropping areas not planted during the study year.

**Table 2.**
Sentinel-1 (S1) and Sentinel-2 (S2) bands. NDVI = (NIR − R)/(NIR + R) was also computed at 10 m resolution, and NDVI together with the S2(4) bands formed the S2(5) grouping.

Grouping | Band | Abbreviation | Approx. Band Center | Resolution (m)
---|---|---|---|---
S1(2) | VV | VV | 5.4 GHz | 10
 | VH | VH | 5.4 GHz | 10
S2(4) | Blue | B | 490 nm | 10
 | Green | G | 560 nm | 10
 | Red | R | 660 nm | 10
 | Near infrared | NIR | 830 nm | 10
S2(10) (includes S2(4)) | Red edge 1 | RE1 | 700 nm | 20
 | Red edge 2 | RE2 | 740 nm | 20
 | Red edge 3 | RE3 | 780 nm | 20
 | Near infrared narrowband | NIRN | 860 nm | 20
 | Short wave infrared 1 | SWIR1 | 1610 nm | 20
 | Short wave infrared 2 | SWIR2 | 2200 nm | 20
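The NDVI band described in the Table 2 caption can be computed per pixel from the red and near-infrared reflectances. A minimal sketch on plain floats (the actual processing would apply this across whole image bands, and the zero-denominator fallback is an assumption for illustration):

```python
def ndvi(nir, red):
    """NDVI = (NIR - R) / (NIR + R), ranging over [-1, 1].
    Returns 0.0 when both bands are zero to avoid division by zero."""
    denom = nir + red
    return (nir - red) / denom if denom else 0.0

print(ndvi(0.5, 0.1))  # ~0.667: dense green vegetation
print(ndvi(0.2, 0.2))  # 0.0: equal reflectance, e.g. bare soil or water edge
```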

**Table 3.**
Band combinations used in supervised classification. TS = time series. Agg = aggregate bands computed per pixel over the twelve months. Months are indicated as YYMM (e.g., 1805 is May 2018).

Designation | Bands | Months | Features
---|---|---|---
S1(2) TS | VV, VH | 1805, 1806, …, 1904 | 24
S2(4) 1 month | B, G, R, NIR | 1806 | 4
S2(5) 1 month | B, G, R, NIR, NDVI | 1806 | 5
S2(10) 1 month | B, G, R, NIR, RE1, RE2, RE3, NIRN, SWIR1, SWIR2 | 1806 | 10
S2(5) 2 month | B, G, R, NIR, NDVI | 1809, 1810 | 10
NDVI TS | NDVI | 1805, 1806, …, 1904 | 12
S2(4) TS | B, G, R, NIR | 1805, 1806, …, 1904 | 48
S2(10) TS | B, G, R, NIR, RE1, RE2, RE3, NIRN, SWIR1, SWIR2 | 1805, 1806, …, 1904 | 120
S2(10) + S1(2) Agg | B, G, R, NIR, RE1, RE2, RE3, NIRN, SWIR1, SWIR2, VV, VH | Min, Mean, Max | 36
S2(10) + S1(2) TS | B, G, R, NIR, RE1, RE2, RE3, NIRN, SWIR1, SWIR2, VV, VH | 1805, 1806, …, 1904 | 144

**Table 4.**
Area error and number of segments per field geometry for selected sets of SNIC segmentation and merging parameters. The parameters in the middle row were chosen.

Size | Compactness | Merging | Area Error Per Field (ha) | Segments Per Field
---|---|---|---|---
20 | 0.4 | $1 \times \Delta < 0.05$ | 0.31 | 44.65
40 | 0.4 | $2 \times \Delta < 0.05$ | 0.38 | 9.88
80 | 0.2 | $1 \times \Delta < 0.1$ | 0.94 | 3.48