method to distinguish mainshocks from all other earthquakes is not available (and it may not exist at all), and the ground shaking of earthquakes removed by the declustering technique (e.g., aftershocks) can be very damaging. Nonetheless, as regards the testing phase, to maintain coherency and to avoid comparing apples and oranges, any consistency test should consider only the data associated with the selected mainshocks, excluding, for example, strong ground shaking at sites that was caused by aftershocks. If we want to consider all earthquakes or all ground-shaking data in the testing phase, the NSHM has to be corrected for declustering. Different techniques for this purpose have already been proposed in the scientific literature and are used in MPS19 (Meletti et al., 2019) and in the most recent NSHMs of the United States (Field et al., 2023) and New Zealand (Gerstenberger et al., 2023).

One of the most remarkable features of NSHM testing in Italy is the rich database of macroseismic intensities, which are not measured ground-shaking data but may be used to mimic them. However, this kind of data may be affected by significant problems that have to be taken into account. Here we list some of the most important ones: (i) the large uncertainties in converting macroseismic intensity into numerical values of shaking, such as accelerations and velocities, and vice versa; (ii) the substantial uncertainties affecting macroseismic intensities of the past, due, for example, to the cumulative effect of the earthquakes of a seismic sequence and to the type of soil, which is not considered in the hazard model (by definition, NSHM refers to rigid ground); (iii) the fact that macroseismic intensity data sometimes depend on the research group estimating them.

Another point worth mentioning is that a rigorous testing phase must be based on solid statistical techniques. Sometimes the outcomes of NSHM are analysed using ad hoc techniques whose statistical properties have not been properly investigated, or that are even based on untenable assumptions. It goes without saying that this attitude cannot lead to any reliable judgement on NSHM.

Last, but not least, almost all consistency tests carried out so far are based on the mean (or median) hazard model, de facto neglecting the so-called epistemic uncertainty. In other words, two models having the same mean hazard, but a quite different dispersion of the logic-tree branches (or alternative models) around the central value, are considered the same. It is easy to show that this attitude leads to asymmetrical conclusions, i.e., if a NSHM passes the test considering only the mean hazard it may be deemed consistent with the data, but if it does not pass the test, it cannot necessarily be considered inconsistent with the data. In a more formal approach, the proper scientific interpretation of seismic hazard estimates requires a probabilistic framework that admits epistemic uncertainties on aleatory variables. This is not straightforward because, to subjectivists, all probabilities are epistemic, whereas to frequentists, all probabilities are aleatory. The inadequacy of purely subjectivist and
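To fix ideas on the declustering correction mentioned above, the following minimal Python sketch illustrates only the general principle under a strong simplifying assumption: a single, hypothetical inflation factor c is applied uniformly to the mainshock-only exceedance rates. The corrections actually adopted in MPS19 and in the US and New Zealand models are considerably more elaborate, and every numerical value below is invented for illustration.

# Illustrative sketch only: a deliberately simplified way to "undo" declustering
# in a hazard curve, assuming aftershocks inflate the mainshock-only exceedance
# rate by a constant factor c > 1 at every site and ground-motion level.
import numpy as np

def exceedance_prob(rate_per_year, t_years):
    # Poisson probability of at least one exceedance in t_years
    return 1.0 - np.exp(-rate_per_year * t_years)

rates_mainshock = np.array([1e-2, 2e-3, 4e-4])   # hypothetical annual exceedance rates
c = 1.3                                          # hypothetical aftershock inflation factor

rates_full = c * rates_mainshock                 # crude declustering correction
print(exceedance_prob(rates_mainshock, 50.0))    # comparable with mainshock-only data
print(exceedance_prob(rates_full, 50.0))         # comparable with all recorded shaking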
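As an example of the kind of transparent statistical test advocated above, the sketch below compares the number of sites where a given ground-motion level was exceeded with the number expected from the model, using a Poisson approximation to the sum of independent site-by-site exceedance probabilities. The independence assumption, the site probabilities and the observed count are all hypothetical, and only mainshock-caused exceedances are assumed to be counted, for coherence with a declustered model.

# Minimal count-based consistency test (illustrative assumptions throughout)
import numpy as np
from scipy.stats import poisson

p_site = np.array([0.02, 0.05, 0.01, 0.03, 0.04])  # model exceedance probabilities per site
n_observed = 3                                      # sites where the level was actually exceeded

# Poisson approximation to the sum of independent Bernoulli variables
lam = p_site.sum()
p_value = poisson.sf(n_observed - 1, lam)           # P(N >= n_observed | model)
print(f"expected {lam:.2f} exceeding sites, observed {n_observed}, p-value {p_value:.3f}")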
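Finally, the asymmetry related to epistemic uncertainty can be made concrete with a toy example: two hypothetical three-branch logic trees with identical mean exceedance rate but different dispersion assign different probabilities to the same observed number of exceedances, so a test based on the mean hazard alone cannot distinguish them. All rates, weights and counts below are invented for illustration.

# Why the logic-tree dispersion matters, not only the mean hazard (toy example)
import numpy as np
from scipy.stats import poisson

T = 50.0          # observation window in years (hypothetical)
n_obs = 4         # observed exceedances (hypothetical)
weights = np.array([1/3, 1/3, 1/3])

model_a = np.array([0.02, 0.02, 0.02])    # tight branches, mean annual rate 0.02
model_b = np.array([0.002, 0.02, 0.038])  # same mean rate, much larger dispersion

def marginal_prob(rates):
    # weight-averaged Poisson probability of observing n_obs exceedances in T years
    return np.sum(weights * poisson.pmf(n_obs, rates * T))

print(marginal_prob(model_a), marginal_prob(model_b))  # the two models are not equivalent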