Wind–Wave Modeling: Where We Are, Where to Go

We perform a critical analysis of the present approach in wave modeling and of the related results. While acknowledging the good quality of the best present forecasts, we point out the limitations that appear when we focus on the corresponding spectra. Apart from the meteorological input, these are traced back to the spectral approach at the base of the present operational models, and the consequent approximations involved in properly modeling the various physical processes at work. Future alternatives are discussed. We then focus our attention on how, given this situation, to deal today with the estimate of maximum wave heights, both in the long term and for a specific situation. For this, and within the above limits, a more precise evaluation of the wave spectrum is shown to be a mandatory condition.


Current State of the Field
We all agree that, by and large, the present situation of the best operational models is quite satisfactory. Figure 1 shows one of the latest comparisons between altimeter (in this case Sentinel 3A) significant wave heights Hs and the corresponding results of the European Centre for Medium-Range Weather Forecasts (ECMWF, Reading, UK). A 2.5 cm worldwide bias and 0.091 scatter index speak for themselves. It is significant that ESA (the European Space Agency) actually verifies (not calibrates) its altimeter Hs versus the ECMWF data. Similar results are reported by the National Centers for Environmental Prediction (NOAA-NCEP, Maryland, USA), the UK Meteorological Office (UKMO, Exeter, UK), and Meteo France (Toulouse, France). It is instructive that a similar statement on the quality of the results [1] was offered ten years ago, pointing out however some substantial limitations concerning two main items: a frequent miss of the Hs peak values, and an often crude approximation of the spectra. If not quantitatively, on a qualitative basis this is still true today. The aim of this paper is to perform a (short) historical analysis of the development of the daily used wave models, pointing out where the present problems come from.
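For concreteness, the verification statistics quoted above can be computed in a few lines. The sketch below assumes paired model-altimeter collocations and takes the scatter index as the standard deviation of the model-minus-observation differences normalized by the mean observed value; conventions vary between centers, so this is one plausible definition, and the numbers are purely hypothetical:

```python
import numpy as np

def verification_stats(model, obs):
    """Bias and scatter index (SI) of model Hs against altimeter Hs.
    Here SI = std(model - obs) / mean(obs); other centers may use the
    RMS difference instead of the standard deviation."""
    model = np.asarray(model, dtype=float)
    obs = np.asarray(obs, dtype=float)
    diff = model - obs
    bias = diff.mean()
    si = diff.std() / obs.mean()
    return bias, si

# Hypothetical paired collocations, in meters
obs = np.array([1.0, 2.1, 3.0, 4.2, 2.5])
model = np.array([1.1, 2.0, 3.1, 4.0, 2.6])
bias, si = verification_stats(model, obs)
```

A worldwide comparison would of course use millions of such pairs, but the statistics are the same.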

A Brief History of Wave Modeling Development
Granted the obvious relationship between wind and waves, the first quantification was achieved with the Sverdrup and Munk [2] pragmatic (but reasoned) relationships among wind speed, fetch, time, and wave height and period. This was later reconsidered by Bretschneider, leading to the worldwide and long-used SMB abacus and method. The substantial breakthrough came in 1952 with the suggestion by Pierson and Marks to describe the sea conditions (in a given area and at a given time) as the superposition of sinusoidal waves, each specified in frequency, direction, and height (hence energy). This opened the door to mathematicians. Sinusoids are an easily handled function, suitable for a number of manipulations and physical assumptions. We still live with this hypothesis. The input function, i.e., the generation of waves by wind, was formulated in 1957 by Phillips [3] and Miles [4], and Hasselmann [5] then quantified the interactions and conservative exchange of energy among the spectral components. White-capping, the energy loss by wave crests breaking in deep water, had to wait 17 years after the formulation of the input function before Hasselmann [6], in connection with the JONSWAP (JOint North Sea WAve Project) experiments and report [7], proposed an empirical, but at least quantified and spectrum-dependent, usable expression suitable for practical applications. The adult age of wave modeling was finally reached in 1988 [8] with the reduced Discrete Interaction Approximation (DIA) parameterization [9] of the non-linear interactions, and the consequent formulation of the first third-generation model WAM. The overall situation was well pictured in the so-called WAM book by Komen et al. [10]. Other models followed on the same route: WAVEWATCH [11], SWAN [12], and the one by the Danish Hydraulic Institute.
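The Pierson and Marks view of the sea as a superposition of sinusoids can be made concrete in a few lines. The snippet below synthesizes the surface elevation at one point from a toy discrete spectrum with random phases, using the deep-water dispersion relation σ² = gk; all numerical values are illustrative, not taken from any operational model:

```python
import numpy as np

g = 9.81                                      # gravity acceleration (m/s^2)
rng = np.random.default_rng(0)

freqs = np.linspace(0.05, 0.5, 25)            # component frequencies (Hz)
dirs = np.deg2rad(np.linspace(-60, 60, 13))   # directions about the wind (rad)
# Toy amplitude spectrum peaking at 0.1 Hz (illustrative only)
amps = 0.05 * np.exp(-((freqs - 0.1) / 0.05) ** 2)

x, y, t = 100.0, 50.0, 0.0                    # one point in space-time
eta = 0.0
for f, a in zip(freqs, amps):
    k = (2 * np.pi * f) ** 2 / g              # deep water: sigma^2 = g k
    for th in dirs:
        phase = rng.uniform(0, 2 * np.pi)     # random phase of each component
        eta += (a / len(dirs)) * np.cos(
            k * (x * np.cos(th) + y * np.sin(th)) - 2 * np.pi * f * t + phase)
```

A modern operational spectrum carries on the order of a thousand such components (e.g., 36 frequencies times 36 directions), which is what makes the hypothesis so computationally convenient.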
Today we live in a slow evolution of the 1994 situation. Improvements have been made on some aspects of the source functions, computers are faster, and we have gone to progressively higher resolutions. However, at the risk of a rather crude statement, we can say that most of our improvements in the last three decades have been associated with the parallel improvements in the definition and accuracy of the driving wind fields. Indeed, during the last 30 years the only basic conceptual improvement has been the acknowledgement by Janssen [13] of the two-way interaction with the atmosphere, and the consequent need to use a coupled model. Including the interaction with ocean currents, the process is still on its way.

Analysis of the Present Situation
As seen in Figure 1, the Hs results are on average pretty good, at least in the best modeling centers. However, we still often miss the peaks of the, especially higher, storms. Figure 2 is indicative: on a generally good estimate of the whole, the peak model value is too low. Of course, especially for hurricanes and typhoons, we could blame the wind information, but it is correct to ponder how much our basic physical hypotheses can actually hold in these situations. One indeed wonders, given that Katrina, one of the worst hurricanes in the Gulf of Mexico (2005), has been simulated hundreds of times in attempts to obtain the correct results.

Figure 3 (courtesy of Jesus Portilla Yandun) reports a frequent problem with spectra, especially in the so-called cross-sea conditions (one of the worst conditions for navigation). Granted a very good fit in Hs (bias virtually null), the right panel shows a completely different situation for the spectrum, with a strong underestimate of the wind sea component (at 0.2 Hz) and a compensating overestimate of the swell (at 0.09 Hz). Examples in 2D spectra also abound, typically with a shifted peak frequency or a directional distribution that is too wide. The practical implications for everyday use will be discussed in the last part of Section 2.2.
This leads naturally to the problem of resolution, both in frequency and direction. Nature is continuous (at the time scale we deal with), and our discretization is obviously an approximation. The problem becomes macroscopic when identifying the peak frequency in a storm or, probably more, with the long distance propagation of swell.
We proceed now to a critical analysis of the single processes at work.

Generation by Wind
The papers by Phillips [3] and Miles [4] were complementary to each other because the Phillips process, based on an oscillating atmospheric pressure on the surface, leads to a first minimal corrugation of the surface on which the positive feedback mechanism of Miles can work. Quantitatively the Phillips process is trivial, but it is required (in principle) because the Miles one has no effect on a flat surface. All this has been beautifully shown by Benetazzo et al. [14]. They reported how in a wind wave tank, a layer of oil a few molecules thick, killing the initially expected 2-3 cm long wavelets, de facto impeded any wave generation. Even under a 12 m/s wind, the surface remained flat and calm. Without touching here on the problem of the reason for the first wavelets, in practice today most of the models ignore the Phillips process, imposing a minimal background noise (typically 5 cm Hs) or a minimal energy connected to the local wind.
Focusing on the Miles process, the overall theoretical picture is of the wind smoothly flowing over each single sinusoidal component with a consequent adaptation of the wind vertical profile, all this being almost independent of any other spectral component (there are typically 1000 of them in a modern spectrum). Attempts to correct for some of these aspects have been made (see the WW3 manual at https://github.com/NOAA-EMC/WW3/wiki/Manual). However, these are only approximations to the truth. This representation appears far too simple to anyone who has had the chance to look at a stormy sea: one wave interacting with another one, wind cutting the crests, crests breaking, sharp ones crossing each other, and wind detaching from the surface at the sharp crests (certainly so at each breaker). Banner and Melville [15] already showed 44 years ago the substantial differences between the wind flow over breaking and non-breaking waves, and in strong generative conditions a large percentage of the waves is breaking. It is only the fact that we cannot visualize wind that allows us to play with this (almost) linear hypothesis. Wind is extremely irregular, and so is the input to the ocean.

Dissipation by White-Capping
In several fields of atmospheric research, the number of white-caps is parameterized as a function of wind speed; see for instance [16]. On the opposite side, most wave modelers model the energy loss by white-capping as a function of the wave spectrum (see the cited [6]). An ample overview of the problem has been given by Babanin [17]. The truth lies in the middle. It is indeed instructive to look at a stormy sea under a gusty, rapidly varying wind. While the spectrum is not supposed to change on the minute scale (at least according to present theories), it is obvious to see how the number and intensity of whitecaps follow the single gusts of the wind speed. Incidentally, we point out how in a rapidly varying wave field the very concept of spectrum may lose part of its meaning.
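The wind-speed-only parameterizations mentioned above typically take a power-law form for the whitecap coverage. The sketch below uses the widely quoted Monahan and O'Muircheartaigh coefficients as an example; whether reference [16] uses exactly this fit is an assumption on our part:

```python
def whitecap_fraction(u10):
    """Fraction of the sea surface covered by whitecaps as a function of
    the 10-m wind speed u10 (m/s), using the power-law form
    W = a * u10**b with the often-cited Monahan-O'Muircheartaigh
    coefficients. Purely wind-dependent, in contrast with the
    spectrum-based dissipation terms used in wave models."""
    return 3.84e-6 * u10 ** 3.41

# Under this fit, a 15 m/s wind whitecaps roughly 4% of the surface.
w = whitecap_fraction(15.0)
```

The steep exponent is the point: a gust from 12 to 15 m/s nearly doubles the coverage, consistent with the visual impression that breaking follows the single gusts while the spectrum barely changes.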

Non-Linear Interactions
This is formally the best-formulated process, although also the least intuitive one. There is no visualization of it, certainly not from the point of view of a mariner looking at a waving sea. On the other hand, it is an integrated experience that, with the wind continuing to blow, waves get longer, as suggested by the intra-spectrum energy flow quantified with the Hasselmann formulation [5]. The practical problem is that the full estimate of this process is computationally far beyond any practical operational application, hence the already cited solution of the Discrete Interaction Approximation (DIA) of Hasselmann et al. [9]. DIA considers only a few exchanges (the main ones) and, this notwithstanding, it uses 10% of the present ECMWF wave modeling computer budget. It was 15% until a short while ago, before some optimization by Jean Bidlot (personal communication) and an increased load by other parts of the model.
A more fundamental point is that the Hasselmann [5] approach considers only resonant interactions in quadruplets. In doing so, we focus on only one aspect of the problem, for two reasons. First, there is a general feeling (e.g., [18,19]) that quasi-resonant interactions should also be considered. Second, we could, or should, look at higher orders, something practically unreachable at the moment, if only for the tremendous analytical difficulties involved.
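The resonance conditions in question are compact enough to state in code: a quadruplet of deep-water components interacts resonantly when both the wavenumber vectors and the frequencies balance, k1 + k2 = k3 + k4 and σ1 + σ2 = σ3 + σ4, with σ = √(g|k|). The sketch below checks the two conditions for a trial quadruplet (illustrative vectors, not an actual DIA configuration):

```python
import numpy as np

g = 9.81

def sigma(k):
    """Deep-water dispersion: sigma = sqrt(g |k|)."""
    return np.sqrt(g * np.linalg.norm(k))

def resonance_mismatch(k1, k2, k3, k4):
    """Return the wavenumber and frequency mismatches of a quadruplet.
    Exact resonance requires both to vanish."""
    dk = np.linalg.norm(k1 + k2 - (k3 + k4))
    ds = abs(sigma(k1) + sigma(k2) - sigma(k3) - sigma(k4))
    return dk, ds

# Degenerate but exactly resonant example: {k1, k2} = {k3, k4} (rad/m)
k1 = np.array([0.10, 0.00])
k2 = np.array([0.05, 0.03])
dk, ds = resonance_mismatch(k1, k2, k1, k2)
```

Finding the non-trivial solutions of these two conditions over a ~1000-component spectrum, and integrating the associated transfer, is what makes the full computation prohibitive and DIA necessary.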
Limiting ourselves to the presently reachable targets, DIA has been much criticized because its approximation to the results derived with the full evaluation (EXACT-NL) is crude, the more so as soon as the spectrum diverges from the classical JONSWAP shape of a generative sea. Several improved (i.e., more complicated) solutions have been proposed, all hitting the limit of the allowable computer time. Note that, given the substantial amount of energy exchanged following non-linear interactions, the approximation of DIA has a non-trivial influence on the shape of the resulting spectrum.
A fact regularly ignored is that the coefficients proposed in 1985 by Hasselmann et al. [9] for using DIA had been carefully optimized for certain frequency and directional resolutions. Given the progressive increase in this respect, the coefficients should be accordingly updated, but this is not done.
The highly theoretical formulation and the lack of any physical perception of the process somehow hide all the implications. Non-linear interactions are based on the assumption of linear waves, in particular on the classical relationship σ² = gk (with σ the circular frequency, g the gravity acceleration, and k the wavenumber). One of the classical images of active generation conditions is (see Figure 4) the irregular distribution of curved crests. One explanation for the tendency of a crest to be forward at its center is that, as a rule, this is where a crest is indeed higher. If, as the theory suggests, the wave phase celerity grows with the height (see, among others, [20]), the curved crest follows naturally. However, if this is indeed the case, then the resonance conditions for the non-linear interactions [5] are also no longer valid.
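The amplitude dependence of the celerity invoked here can be quantified with the leading-order Stokes correction to the linear dispersion relation, c ≈ c₀(1 + (ka)²/2) in deep water, where c₀ = √(g/k) follows from σ² = gk. The sketch below (illustrative numbers) shows the few-percent speed-up of a steep crest:

```python
import math

g = 9.81

def phase_speed(k, a=0.0):
    """Deep-water phase speed of a wave with wavenumber k (rad/m) and
    amplitude a (m). a = 0 gives the linear value c0 = sqrt(g/k) from
    sigma^2 = g k; a > 0 adds the leading-order Stokes correction
    c ~ c0 * (1 + (k a)**2 / 2)."""
    c0 = math.sqrt(g / k)
    return c0 * (1.0 + 0.5 * (k * a) ** 2)

# A steep wave (ka = 0.3) travels about 4.5% faster than its linear
# counterpart: enough for the higher central part of a crest to run
# ahead and curve the crest line.
k = 2 * math.pi / 100.0          # 100 m wavelength
speedup = phase_speed(k, a=0.3 / k) / phase_speed(k) - 1.0
```

A 4-5% celerity difference between the high and low parts of a crest is exactly the kind of departure from linearity that the resonance conditions of [5] do not account for.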

Wave-Current Interactions
Given the accuracy we are aiming at in a professional fore- or hindcast, it is mandatory to also consider the interactions with ocean currents. The theory has been exhaustively dealt with (see, among others, the summary by Soulsby et al. [21], the model description by Tolman [11], and the formulation of the problem by [20]), but the real problem lies in the proper definition of a current field. Circulation models, because of resolution, lack of data, and what, borrowing a wave expression, we can call "low-pass filtering", get the fields right only up to a certain scale, beyond which the local variability is either ignored or is, in practice, noise. The problem is that these non-considered wiggles can affect the local wave conditions to an appreciable extent, both in mild and severe storms (a good example is given by [22]). There are multiple reasons for this. As for atmospheric modeling, circulation models lack sufficiently detailed data assimilation. Similarly to wind waves, we can also suppose a sort of non-linear interaction cascade, presently poorly represented, if not fully absent, in the standard approach. Finally, the local currents depend much on the waves themselves. Although not strictly within the scope of this paper, this last subject deserves a short detour. In nature, most of the momentum (80-90%) and energy passed by the wind to the ocean goes into waves; the remaining part goes into currents driven directly by the wind stress. Note that all this is crudely approximated in our modeling because our spectra typically stop at 1 Hz (i.e., 1.56 m long waves), while the so-called tail of the spectrum is parameterized under assumptions that are still a subject of discussion. In any case, the waves will soon lose about 90% of these quantities via breaking, in so doing enhancing and driving ocean currents and turbulence. McWilliams and Restrepo [23] were quite successful in driving the world ocean circulation with only wave breaking.
The point of concern is that large-scale circulation modelers often bypass the intermediate wave phase, forcing the whole atmospheric input to go into the current field. The ensuing approximations are obvious, as are the implications for a correct estimate of the wave field.

A Summary of the Situation
In the above sub-sections, we have conducted a rather quick, qualitative (but based on quantitative results) review of the present approach in wave modeling. Are we too pessimistic? We do not think so. The fact that a careful definition of the source functions, guided by long-term experience, leads to good estimates of the significant wave heights on a global basis should not let us forget the limitations of the present approach. Even more, also accepting the use of the spectrum, it is increasingly apparent that non-linear wave dynamics dominates its evolution. Linear wave physics and the empirical parameterizations developed do not allow for reproducing the correct spectral distribution.
Given this short excursus, where should we act for further improvement? This is what we discuss in the next section.

Future Directions
"Future directions" or, in other words, "where to go next" does not have a unique reply. First of all, we can "play" with the source functions and the overall coupling of the system as done until now. Notwithstanding our criticism of the present spectral approach, there is still room for (in our opinion, limited) improvement. However, precise limits stand along this way, and in the long term a different approach should be followed, which we will discuss shortly. A separate alternative is the choice between an engineeristic approach (we need practical results tomorrow) and, at the opposite end, the longer-term aim of improving our basic knowledge of the overall process. Obviously, the latter choice is the background for long-term advances. We will touch on the various options in sequence.

The Processes at Work and Their Coupling
The basic processes at work in active generation are obvious at a qualitative look: the wind puts energy into the waves (input), which in turn lose part of it via white-capping (output). With the less obvious process of non-linear interactions, these are, and were, the key elements of the energy balance equation, and hence of all our wind wave models (see [24] and, in more recent times, [10]). Much effort has been put into improving the input and output terms, via both theoretical approaches and experiment/measurement campaigns. Although a certain level of success has been repeatedly claimed, basically on the Hs values, the problem still lies with the spectra. Indeed, there are reasons to wonder when we think that all the brilliant and sophisticated wind wave generation campaigns, measuring wind profile, momentum flux, detailed wave evolution, etc., have necessarily been performed on a much reduced scale and in strongly limited conditions. Both Snyder et al. [25] and Young and Babanin [26], the classical references spanning the period of substantial wave model development, deserve our admiration. However, although later verified and possibly adjusted to real-world events, we cannot forget that there is a four orders of magnitude difference between the energy present in their experiments and a hurricane. The cited example of Katrina, like any other disastrous storm, is indicative of the situation. This is not surprising, on one hand for the uncertainty in the input and output energy distribution, and on the other hand for the mentioned approximation to the true progressive redistribution of energy by the non-linear term. For the input, the obvious limit is the incompatibility between what is happening on a growing sea and the Miles scheme. As a matter of fact, that we can nevertheless obtain good-quality Hs values should by itself be a telling piece of information.
A possible further refinement for the future could be a more realistic description of the wind field surface boundary layer. With this we do not mean, e.g., a more correct uniform wind vertical profile. We refer to a description more adherent to the reality of a highly turbulent and locally variable wind field. As we have already said, but we prefer to stress again, the main reason why we actually "simplify" our description of the wind field close to the surface is that we cannot see it. Although with a slightly different meaning, "what the eye doesn't see, the heart doesn't grieve over" summarizes well the situation.
Mutatis mutandis, similar, probably stronger, considerations hold for white-capping. This is obviously a non-linear process, the single sinusoids being unable to break. Babanin [17] offers a wide panorama of the problem and the various approaches used for practical applications. However, the main point to aim at is to consider white-capping as a function not only of the spectrum, but also of the driving wind. As we already mentioned, the direct evidence of a stormy sea, better if under the action of a rapidly varying wind speed, strongly suggests that the wind itself affects the intensity of breaking, both in terms of the number and dimensions of single breakers.
Pushing further in this direction, it has recently been suggested [27] that the obvious strong interconnection between wind input and white-capping should be interpreted as evidence that these two processes are indeed parts of a single one. As a matter of fact, as mariners know very well, rain has the surprising effect (for a computer modeler) of smoothing the sea surface. This drastically decreases the input by wind to the field, at the same time canceling to a large extent any white-cap from the sea surface. If, as we have good reasons to believe, this is true, it shows how much we still have to learn about air-sea interaction.
For non-linear interactions, the problem is computer time. The common choice to devote any newly available time to an increase in grid resolution does not leave much room. However, we should distinguish between apparent (more details) and true improvement (better quality of the results). A test we suggest is to start from a given situation (in terms of resolution and non-linear approximation) and, using the same extra computer time, run two tests: (1) with increased spatial resolution, (2) with an improved estimate of the non-linear interactions. A comparison, both between the two solutions and the measured data, will provide meaningful insight into the respective limits and advantages. Higher resolution is required where there are strong spatial gradients. This is typical of the coastal environment and, the obvious example, hurricanes and typhoons. The solution is to use unstructured grids, nowadays a common choice. This leaves the problem open for hurricanes, for which the solution is a dynamically varying grid (in space and time) according to where a higher resolution is required. This line of research has been pursued by Gorman et al. [28], but the interest seems to have slowed down. Besides the necessary increase in computer time, the main problem seems to be the tendency of the many and repetitive interpolations to smooth the fields, which implies losing (part of) the advantage of a locally increased resolution.
When we look at wave problems from a wider perspective, we realize that the key player is the moving interface between the atmosphere (wind) and ocean (undulations and currents). Because each one of these affects the other two, coupling is the mandatory word. Given the purpose of this paper, we do not delve into the implications for climate or, e.g., ocean mixing. For the ocean, the interaction with current is a rather well-defined subject. As mentioned in the first section, the problem may be the correct estimate of the current field. Note that, while usually the attention is on the surface current, implicitly assuming this to be true also in the wave-affected (and hence wave-affecting) layer, the current field can be, and often is, three-dimensional. Ardhuin et al. [29] have discussed the implied differences with respect to the uniform case, but this is hardly considered in operational applications. It is something to remember, especially when dealing with extreme conditions (in the next sub-section) or particularly large waves.

The Engineeristic Point of View
The average good model results for significant wave heights may provide a satisfactory reply to most regular activities. We are left with the frequent miss of the peaks, whose solution also depends on a higher quality of the corresponding wind fields and (as discussed in the previous section) on a further improvement of the source functions (but see the next sub-section). The further requirement of an engineer concerns the correct estimate of the possible extreme conditions (the tail of the probability distribution of significant wave heights) and, for given conditions, the single wave and crest heights.
The former problem has been amply treated in the literature; see, among others, [30,31]. Different methods are applied, depending on the length of the available series. The recent availability of long-term hindcasts, e.g., the most recent ERA5 of ECMWF, with a relatively good spatial resolution (31 km and 0.36° for the atmosphere and wave fields, respectively, see https://www.ecmwf.int/en/forecasts/datasets/reanalysis-datasets/era5), provides decades of data at any desired position. Using ERA5, or the previous ERA-Interim (80 km and 1° resolution for wind and waves) reanalysis, higher resolution local wave hindcasts will undoubtedly be, and have been, done. Recent examples are [32] and the recent global one by Jean Bidlot (personal communication). Using ERA5 winds, but with an improved WAM model, he reran a global reanalysis with 36 directions and 36 frequencies at 18 km resolution. However, if the higher peaks are frequently underestimated (previous sub-section), our statistics will fail exactly in the range of interest. One crude possibility, certainly not elegant to say the least, is to compare the hindcast peaks with recorded ones, deriving an estimate of the misses. Given the implied cost of structures, this is a very critical approach.
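As an illustration of the extrapolation step discussed above, the sketch below fits a Gumbel (EV-I) distribution to a series of 40 annual maxima by the method of moments and reads off the 100-year value. The data are synthetic stand-ins for a hindcast series at a point of interest; an actual study would of course also examine goodness of fit and the single-family assumption:

```python
import numpy as np

# Synthetic 40-year series of annual-maximum Hs (m), drawn here from a
# Gumbel distribution by inverse sampling; a real study would take these
# from a long hindcast (e.g., ERA5) or a measured record.
rng = np.random.default_rng(42)
annual_max = 3.5 - 0.5 * np.log(-np.log(rng.uniform(size=40)))

# Method-of-moments Gumbel fit: scale from the standard deviation,
# location from the mean (Euler-Mascheroni constant ~0.5772).
euler_gamma = 0.5772156649
scale = annual_max.std(ddof=1) * np.sqrt(6.0) / np.pi
loc = annual_max.mean() - euler_gamma * scale

def return_value(t_years):
    """Hs with annual exceedance probability 1/t_years under the fit."""
    return loc - scale * np.log(-np.log(1.0 - 1.0 / t_years))

hs_100 = return_value(100.0)
```

The fragility of the exercise is evident: the 100-year value is a pure extrapolation of the fitted tail, so outliers like the two 6 m peaks discussed below either distort the fit or, if they belong to a different family of events, invalidate it altogether.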
A possibly more serious point is the extrapolation of the time series extremes, typically the yearly maxima. One of the implied assumptions is that all the storms belong to the same "family" of events. If different kinds of storms affect an area, different statistics, to be then combined into a single one, need to be used. However, things may not be so simple. Cavaleri et al. [33] have recently reported about the statistics of one of the longest recorded Hs time series in the world, spanning the 1979-to-present period [34]. The series has been, and is, recorded at the ISMAR oceanographic tower [35], 15 km offshore from the Venice coastline at the upper end of the Adriatic Sea, east of Italy (Figure 5). The highest waves are locally due to the sirocco storms, blowing from the southeast all along the basin. The distribution of yearly maxima is also reported in Figure 5 for 40 years of measurements at the ISMAR tower (courtesy of Angela Pomaro). While the general distribution seems to tail out at about 4.5 m, the two peaks at 6 m (22 December 1979 and 29 October 2018) stand by themselves. This is further confirmed by the hindcast of the 1966 storm, the worst in documented history, whose peak, in terms of both the damage and the model value, has been estimated at well above 6 m (the tower was not there at the time). The question is open, with Cavaleri et al. [33] suggesting the possibility that indeed, although apparently similar, these three storms do belong to a different family of events. This moves the problem to the meteorological side, but leaves it open. From our present point of view, the physics and genesis of these apparently exceptional events, especially if beyond the limits of any available statistics, should be carefully analyzed before reporting them as, e.g., 1000-year events or similar.
Our last subject for the engineering approach deals with single wave extremes. The discussed non-linear behavior of an active sea surface (sharp high crests and low flatter troughs) naturally leads to non-linear statistics. The subject has been dealt with extensively in the literature; see, among others, [36][37][38] with progressively second- and third-order non-linear statistics. In most cases, these statistics fit the data well, but this is not always the case.
Figure 6 reports the wave height statistics from a one-hour record at the peak of the just-cited 29 October 2018 storm. The data follow the Tayfun and Fedele [38] distribution. This leads us to the fashionable subject of freak waves. By definition an unusual, exceptional event, the wish to have more of them (in itself a contradiction in terms) led most of the community to lower the formal limit from H > 2.2 Hs to H > 2.0 Hs. This is ridiculous because, with this lower limit, even the linear Rayleigh statistics imply a freak event about every 3000 waves, corresponding to ten hours or less in a storm, and much less in mild conditions. In any case, the whole matter has been demystified by analyzing the three-dimensional + time behavior of a relatively large section of the sea surface. This has recently been made possible with the video-stereo recording system described by Benetazzo et al. [39]. Extensive related data are available from the ISMAR oceanographic tower (Adriatic Sea, see Figure 5) and later from the Gageocho-ORS tower managed by KORDI in the Yellow Sea between China and Korea. In the 50 × 60 m² area monitored from the ISMAR tower, and with a 60 m typical wavelength during active wave conditions, single wave heights higher than 2 Hs appear at some position every few tens of seconds. Most of them last a few seconds. For a point recording, what makes them apparently rare is the low probability of their happening at exactly the recording point. However, the probability increases quite rapidly for a ship or a rig. What made the Draupner and Andrea waves famous [40,41] was their happening in severe wave conditions (11.9 and 9.2 m Hs, respectively) and, of course, their being recorded. Consider that in those conditions similar wave heights were likely to appear in the whole surrounding area.
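The "one in about 3000 waves" figure follows directly from the linear narrow-band (Rayleigh) height statistics, for which the exceedance probability is P(H > αHs) = exp(−2α²). A quick back-of-the-envelope check (the 12 s storm period is our assumption, used only to convert wave counts into hours):

```python
import math

def rayleigh_exceedance(alpha: float) -> float:
    """P(H > alpha * Hs) for linear, narrow-band (Rayleigh) wave heights."""
    return math.exp(-2.0 * alpha ** 2)

# Lowered "freak" threshold H > 2.0 Hs: about one wave in 3000
print(round(1.0 / rayleigh_exceedance(2.0)))          # -> 2981

# Original H > 2.2 Hs threshold: roughly one wave in 16,000
print(round(1.0 / rayleigh_exceedance(2.2)))

# With an assumed 12 s mean period in a storm, ~3000 waves pass a point
# in about ten hours
print(f"{2981 * 12 / 3600:.1f} h")                    # -> 9.9 h
```

The factor of roughly five between the two thresholds is exactly why lowering the limit inflates the count of "freak" events so dramatically.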
The basic message is that these once supposedly anomalous events are the norm, with a well-defined probability. The final question is how to estimate this probability. This takes us back to the subject of Section 1, i.e., to the accuracy with which we can model the 2D wave spectra in a storm. The crucial point (see [42]) is that this probability estimate depends critically on the shape of, and the values in, the spectrum. Figure 7, panels a and b, shows two relatively similar spectra (the first one a wind sea JONSWAP-type spectrum with a cos² directional distribution, the second one a similar wind sea spectrum with a superposed swell) with the same significant wave height of 1.95 m. The third panel, c, provides the corresponding probabilities of the maximum wave heights (extreme value probabilities). There are appreciable differences. For instance, the two results for the 1/100 probability, respectively 4.09 m and 3.95 m, differ by more than 3%. This is by no means negligible, especially in the extreme range.
Figure 7c. Extreme value probability of the maximum wave height in a 20 min sea state given the two spectra in the left and middle panels. The parent probability distribution is Naess' [43], accounting for the bandwidth of the spectrum in the frequency range.
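For orientation, the order of magnitude of such extreme-value estimates can be reproduced with the simplest possible model: N independent, Rayleigh-distributed wave heights, ignoring the spectral bandwidth that Naess' [43] distribution accounts for. The 8 s mean period, and hence N ≈ 150 waves in 20 min, is our assumption:

```python
import math

def hmax_quantile(hs: float, n_waves: int, exceedance: float) -> float:
    """Level exceeded by the maximum wave height of a sea state with the
    given probability, assuming n_waves independent Rayleigh heights."""
    f = (1.0 - exceedance) ** (1.0 / n_waves)     # required per-wave CDF level
    return hs * math.sqrt(-0.5 * math.log(1.0 - f))

# 20 min sea state with an assumed 8 s mean period -> ~150 waves;
# Hs = 1.95 m as in Figure 7
h100 = hmax_quantile(1.95, 150, 1.0 / 100.0)
print(f"1/100 maximum wave height: {h100:.2f} m")  # ~4.3 m
```

The value lands above the 4.09 and 3.95 m quoted in the text, which is precisely the point: the bandwidth correction, and hence the detailed shape of the spectrum, moves the answer by amounts that matter in the extreme range.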

The Long-Term Perspective
The mentioned foreseen and proposed activities are still based on the spectral approach. While in the short term, as indicated, improvements, albeit limited, are still possible, we need to explore whether, in the longer term, alternative approaches exist and can be pursued.
We wonder what the developments of wave modeling would have been had Marks (of Pierson and Marks, [44]), an optics scientist, not suggested in 1952 to his friend Willard the use of spectral analysis. Most likely, the field was ripe and someone else would have followed that route. However, let us assume that spectral analysis was not available. Alternatives? We would have probably followed some other statistics of the sea surface, such as the wave height distribution or similar, basically developing further along the original route of [2] and adapting the interactions with the atmosphere and the ocean to this point of view. "Necessity is the mother of invention", or using the exact 500-year-old expression, "need taught him wit". So, no doubt we would have devised clever uses of the available information, congratulating ourselves for the achievements. However, now we have powerful computers at our disposal and this allows for more realistic approaches.
In our previous criticism (Section 1), we have pointed out how the real interaction between wind and waves is much more complicated than represented in the spectral approach. If this is, and it is, the case, then we must aim at a fully realistic representation of this interaction. In practice, we need to produce a "true" dynamical representation of the wavy sea surface and to model the 3D motion of the atmosphere above it. The latter must be without any constraint on what is happening, hence allowing boundary detachment, flow inversion in the wave trough behind (from the wind perspective) a breaking or particularly sharp crest, etc. This does not mean being able to picture, wave by wave, the true ocean, but we need at least to assemble a fully realistic representation of it, with the same integral results step by step, leading, in the same conditions, to statistically and dynamically identical characteristics of the real sea. Of course this is presently impossible, but this must be our target and the route in our mind.
Such a deterministic approach is not new. Coastal, sediment, and harbor engineers, among others, have repeatedly aimed at describing the detailed behavior of waves, with and without sediments, close to coasts, and in interaction with offshore or harbor structures. Within our specific field of interest, Iafrati [45] has already tried to simulate, in one spatial dimension, the detailed physical interaction between wind and a dynamically varying wave profile. The results, albeit limited with respect to our indicated ideal, are instructive in that they show how complicated the process and the interactions are. Indeed, our idealized simulation also has obvious limits. For instance, which spatial resolution should we use for our surface modeling? We know there is no obvious lower limit. As an example, the centimetric wavelets are important in determining the roughness of the surface. On the other hand, we must put a limit somewhere, unavoidably parameterizing the smaller-scale processes.
Granted this long-term target, what should we aim at for the time being? Our suggestion is to make an idealized experiment limited in time and space. Consider a one-point model, in practice an infinite ocean with spatially uniform conditions. Starting from an initial spectrum, and of course with a given wind, on the one hand we integrate the process in the traditional way. At the same time, using the initial spectrum, we make a realistic representation of the two-dimensional sea surface. Mind you, this must not be a single random superposition of sinusoids; in doing so, we would be back at the original problem. The surface must be fully realistic, with higher, sharper crests and lower, flatter troughs. To this end, we need to evolve the wind and wave fields according to the basic fluid dynamic equations. For the wave part, this can be done at different levels of sophistication (e.g., to the second order following [37]; to higher orders, solving the Euler equations by numerical methods, see for instance [46,47]), but this is a technical matter that for the moment we leave aside. We then need to assess the results, and here we somehow go back to general statistics or even the spectrum. We can obviously evaluate the spectrum of the integrated surface and compare it with the "traditionally" modeled one. More than this, we can compare the various spectral moments, as these provide valid information on the non-linearity of the surface. For instance, we can "measure" the crest distribution in the 2D integrated field and compare it with what is derivable from the spectrum using the usual spectral model.
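As an illustration of why a single random superposition of sinusoids is not enough, the sketch below builds a 1D linear surface from a discretized Pierson-Moskowitz-type spectrum and then applies a narrow-band, Tayfun-type second-order correction (a strong simplification of the full second-order theory of [37]; the spectrum parameters and the 50 km transect are illustrative choices). The linear surface is Gaussian, with essentially zero skewness; the corrected one shows the sharper crests and flatter troughs, i.e., the positive skewness, that the non-linear statistics rely on.

```python
import numpy as np

rng = np.random.default_rng(1)
g = 9.81

# Discretized Pierson-Moskowitz-type spectrum (illustrative; peak at 0.1 Hz)
f = np.linspace(0.05, 0.5, 200)          # frequencies [Hz]
df = f[1] - f[0]
s = 5.0e-4 * f**-5 * np.exp(-1.25 * (0.1 / f)**4)
a = np.sqrt(2.0 * s * df)                # component amplitudes [m]
k = (2.0 * np.pi * f)**2 / g             # deep-water dispersion [1/m]
phi = rng.uniform(0.0, 2.0 * np.pi, f.size)

x = np.arange(0.0, 50_000.0, 0.5)        # a 50 km transect
eta1 = np.zeros_like(x)                  # linear (Gaussian) surface
quad = np.zeros_like(x)                  # its quadrature (Hilbert) companion
for ai, ki, pi in zip(a, k, phi):
    eta1 += ai * np.cos(ki * x + pi)
    quad += ai * np.sin(ki * x + pi)

# Narrow-band second-order (Tayfun-type) correction: sharper crests,
# flatter troughs. The full second-order theory [37] also includes
# cross-interaction terms, omitted here for brevity.
fbar = (f * s).sum() / s.sum()           # mean frequency
kbar = (2.0 * np.pi * fbar)**2 / g       # corresponding mean wavenumber
eta2 = eta1 + 0.5 * kbar * (eta1**2 - quad**2)

def skewness(z: np.ndarray) -> float:
    zc = z - z.mean()
    return float((zc**3).mean() / (zc**2).mean() ** 1.5)

print(f"Hs ~ {4.0 * eta1.std():.2f} m")
print(f"skewness: linear {skewness(eta1):+.3f}, second order {skewness(eta2):+.3f}")
```

The skewness difference is exactly the kind of integral, non-linearity-sensitive diagnostic that a comparison between the deterministic surface and the spectral model could be built on.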
The described approach has two purposes. Conceived as a long-term solution, in the short term it can be used to check the realism of the spectral method. Of course, this can be looked at from two different perspectives. In the one just hinted at, we implicitly assume that the fluid dynamic integration of the wind and wave fields leads to the right solution. This is of course debatable, especially in the initial, probably cruder, attempts. In the first stage, the traditional approach would probably be the reference target. However, while the latter has the well-defined limits we have discussed, the 3D field approach is expected to improve over time, increasingly approaching the field truth. Ideally, we should have a reference truth. In most cases this would be a one-point spectrum, with a limited amount of information. A much better reference would be a 3D video of the sea surface, such as the one described in Section 2.2.
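Whatever the reference, the comparison itself is standard signal processing. A minimal sketch (the three-component record below is a made-up stand-in for an integrated or measured surface): estimate the frequency spectrum of the record, take its moments m_n = ∫ fⁿ S(f) df, and check the derived integral quantities, e.g., Hs = 4√m0, against the directly computed ones.

```python
import numpy as np

# A made-up 30 min elevation record at 2 Hz, standing in for a simulated
# (or measured) surface: three free components with arbitrary phases.
fs = 2.0
t = np.arange(0.0, 1800.0, 1.0 / fs)
eta = (1.0 * np.cos(2 * np.pi * 0.10 * t + 0.3)
       + 0.5 * np.cos(2 * np.pi * 0.15 * t + 1.1)
       + 0.2 * np.cos(2 * np.pi * 0.25 * t + 2.0))

# One-sided periodogram estimate of the frequency spectrum S(f)
n = t.size
spec = np.abs(np.fft.rfft(eta)) ** 2 / (n * fs)   # two-sided density
spec[1:-1] *= 2.0                                  # fold onto positive frequencies
freq = np.fft.rfftfreq(n, 1.0 / fs)
df = freq[1] - freq[0]

# Spectral moments and the derived integral parameters
m0 = (spec * df).sum()
m1 = (freq * spec * df).sum()
hs_spectral = 4.0 * np.sqrt(m0)
hs_direct = 4.0 * eta.std()        # straight from the record
tm = m0 / m1                       # mean period Tm01
print(f"Hs: spectral {hs_spectral:.3f} m vs direct {hs_direct:.3f} m")
print(f"mean period Tm01: {tm:.1f} s")
```

By Parseval's theorem the two Hs estimates must agree; in a real comparison, any residual discrepancy between the moments of the integrated surface and those of the spectrally modeled one is the quantity of interest.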
These are long-term targets, but we believe that, with the present computer power, a dedicated effort or project, especially if extended over time, would provide highly valuable results, putting the spectral approach in the right perspective and opening the way to the, possibly not so close, but certainly higher-level, future of wave modeling.

Summary
We have discussed the limits of present wave models. The fundamental limitation appears to be the implicit spectral approach. Valuable as it has been for several decades in allowing a process-by-process approach when estimating the evolution of a wave field, its basic concept, together with the implied formulation of the various processes at work, has an intrinsic limit. This is connected to the physical fact, and evidence, that all the spectral components, in both their kinematics and dynamics, depend on what is happening across the whole spectrum. Although attempts have been made to take this into account, these are still patches, often brilliant but always limited. The simple suggestion that the two presently, in a way, opposite processes of generation and dissipation should be considered as parts of a single process is an indication of the substantial steps ahead we still have to take. It is not easy to leave a safe, reliable, and long-tested solution. However, a time may come when conditions and needs urge us to undertake a new, certainly reasoned, adventure. Of course, this does not imply abandoning the well-paved route. The two courses need to be followed in parallel, comparing results and exchanging experience, with the new one, probably after some on-the-way adjustments, progressively taking over more and more ground. It is a long-term process that needs to be well planned and followed with attention. On a different scale, it can be compared to the well-planned and managed group development of the first third-generation spectral model, WAM. That was the work of a limited group, about ten persons, lasting several years. It may appear historical from today's perspective, but on a different scale it can be a good reference for how to plan and manage long-term, ambitious, but fascinating ideas and projects.
We itemize the main points of this paper in the following:
1. The present wave modeling activity shows good results for significant wave heights.
2. However, the approximation for the higher peak wave heights is still not as good as the general performance.
3. This implies, for the peaks, a poor, frequently low, estimate of the possible extremes and, for the spectra, a poor estimate of the maximum heights and the corresponding wave shape in a storm.
4. While room for improvement still exists, the basic limitation is found in the spectral approach, and the consequent limitations in dealing with the overall integral representation of a stormy sea.
5. It is also suggested that the traditionally opposite processes of input by wind and dissipation by white-capping should be considered as parts of a single process.
6. In the longer term, the drastic change seems to lie in the three-dimensional + time dynamical representation of the interacting wind and wave fields.
7. Starting from a given situation (spectrum), we suggest the realization of a 3D non-linear surface (higher crests, flatter troughs), to be then numerically integrated (air and water) via the basic fluid dynamics equations.
8. Leaving any practical application for the far future, as a first approach we suggest the use of a one-grid-point model.
9. Progressive improvements in the deterministic results, compared with the spectral ones, will lead in time to a more realistic representation of the interaction between a turbulent wind and a waving surface.
10. It would be useful to carry out this exercise in correspondence with similarly detailed measured wave conditions.
11. This suggested effort should be carried out by an organized team, with solid leadership and a coordinated effort.
Author Contributions: The conceptualization was coordinated by L.C., together with A.B. and F.B. All authors participated in the writing of the paper. All authors have read and agreed to the published version of the manuscript.
Funding: This work has been partially carried out as part of the Copernicus Marine Environment Monitoring Service (CMEMS) LATEMAR project. CMEMS is implemented by Mercator Ocean in the framework of a delegation agreement with the European Union.