Abstract

As with any situation that involves economic risk, refineries may share their risk with insurers. The decision process generally includes modelling to determine to what extent the process area can be damaged. At the extreme end of such modelling are the so-called Estimated Maximum Loss (EML) scenarios, which predict the maximum loss a particular installation can sustain. Unfortunately, no standard model exists for this purpose, and insurers therefore reach different results by applying different models and different assumptions. A study has consequently been conducted on a case at a Swedish refinery where several scenarios had previously been modelled by two different insurance brokers using two different software tools, ExTool and SLAM. This study reviews the concept of EML and analyses the models used to determine which parameters are most uncertain. A third tool, EFFECTS, was also employed in an attempt to reach a conclusion with higher reliability.

1. Introduction

Petroleum refineries share their inherent safety problems with many other chemical processing industries. The raw material, as well as almost all of the products, is highly flammable, can give rise to vapour cloud explosions (VCEs), and is toxic above a certain threshold value. Much of the processing, as well as the storage, is done at higher than ambient pressure, not uncommonly above 10 bar. This means that loss of containment will lead to rapid discharge rates.

In addition to the threat to the workforce, an economic risk is associated with processing flammable compounds, not only due to the direct impact of a fire or explosion but also due to the cost of business interruption (BI) in the case of a shutdown. In many cases the BI after an accident is far more expensive than the actual repair costs following a fire or explosion.

The highest property damage (PD) reported to date is from a VCE that occurred in Pasadena, Texas, in 1989. It is estimated that the cost to rebuild the plant was around 869 million USD (in 2002 USD) [1]. The BI cost in this case was a further 700 million USD. Notice that here the BI and PD costs are roughly the same, whereas the average, calculated from 119 accidents, is that BI exceeds PD by a factor of 2.7 [2].

A decision has to be made by the operator: how much of the financial risk should an outside party carry, and how much money is a fair price for that service? The decision process generally includes modelling of various scenarios to determine to what extent the process area can be damaged. At the extreme end of such modelling are the so-called Estimated Maximum Loss (EML) scenarios. These models try to predict the maximum loss a particular installation can sustain due to an accident. Within the refinery industry these scenarios usually consist of a number of different vapour cloud explosions. For obvious reasons such scenarios are riddled with uncertainties; some scenarios are dismissed by one party and deemed plausible by another.

Unfortunately, the gas explosion models available today are by no means perfect. The models are occasionally off by a factor of two, regardless of whether they are empirical models or based on computational fluid dynamics [3]. Recently, even more doubt has been cast on the models in use today, since none of them are able to predict the damage seen at the Buncefield oil depot [4].

Two different EML studies have been carried out at the refinery in Lysekil, Sweden, by two different brokers, for confidentiality reasons henceforth referred to as “Broker A” and “Broker B”. Apart from the differences arising from the use of different modelling tools, there is also no set standard for which assumptions to base an EML study upon. Thus the two previous studies have reached very different conclusions.

The maximum property damages estimated by Broker A and Broker B are 2 390 000 000 SEK and 6 430 000 000 SEK, respectively. Broker B has identified five different scenarios that are more expensive on a property damage basis than the highest one from Broker A. Even when the scenarios are based on the same process equipment failure the numbers differ. For instance, a major breach of V2505, an intermediate storage tank for a mixture consisting mainly of butane, would lead to a PD of either 1 470 000 000 SEK or 4 100 000 000 SEK, a difference of almost a factor of three.

In order to make a sound business decision, a remodelling of the proposed scenarios has been conducted. The sources of the differences between the previous models have been determined, and some recommendations and thoughts on the concept of EML are given.

2. Aim of the Study

The main objectives of this study are as follows:
(i) to compare EML studies carried out by two different insurance brokers for a Swedish refinery,
(ii) to remodel and remove some of the uncertain parameters by using a third-party software,
(iii) to review the EML concept and highlight areas with uncertainties that need improvement in the future.
This paper is prepared on the basis of a Master of Science thesis carried out at Chalmers University of Technology [6].

3. Methodology

A definition of the EML concept was given by Canaway [7]: “The effect of spillage of flammable substance or inventory from the largest discrete circuit and so forth. In the EML analysis, no prediction of the ignition source location may be made in order to reduce the damage level.”

In this study an EML is defined as a single release of inventory from a vessel and the resulting formation of a drifting vapour cloud. An ignition following the formation of the vapour cloud generates an explosion, thus causing property damage. In general, domino effects are not modelled in EML studies, and the same single-accident methodology has been used in this study.

The two brokers' EML calculations were studied in detail in order to identify and determine the sources of deviation. Thereafter, all scenarios investigated by the two companies were remodelled using EFFECTS, a software developed by TNO in which all physical models are described in the Yellow Book [8]. Figure 1 summarizes the steps taken.

The two most “expensive” scenarios presented by the two companies have also been remodelled to find out which of the two should be considered the more reliable. During this modelling a more thorough method has been used than the one employed by the two companies. Since EFFECTS allows linking of scenarios, it has been possible to start with a pipe connection failure, proceed with a spray release, then model the dispersion of the gas cloud, and thereafter model the overpressure after ignition. Input data as precise as possible have been used, including the height of release and the normal filling degree of the vessels. Where distillation columns have been selected as points of origin, a more accurate calculation of the release rates has been used.

A less improbable version of a containment failure has been used in all the modelling conducted: the bottom flange on each studied vessel has been modelled as ruptured, and the release rate of the resulting jet calculated. The brokers' models instead assume a sudden burst, where the whole inventory appears outside the vessel all at once. Although the probabilities of a catastrophic vessel failure and of the total rupture of the piping connection modelled herein are not that different, 2 and 5 cpm, respectively (cpm = failure frequency of 10^-6 per year) [9], the ruptured-pipe model allows for more accurate modelling of the following step, the dispersion of the cloud.
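To make the effect of this choice concrete, a minimal sketch of the kind of jet release-rate estimate used for a ruptured bottom flange is given below. It uses the standard orifice/Bernoulli relation, and all input values (hole diameter, liquid density, vessel pressure, liquid head, discharge coefficient) are illustrative assumptions rather than data from the study.

```python
import math

def liquid_discharge_rate(d_hole_m, rho_kg_m3, p_vessel_pa, p_ambient_pa,
                          liquid_head_m, cd=0.62, g=9.81):
    """Steady liquid release rate (kg/s) through a ruptured flange,
    estimated with the standard orifice (Bernoulli) equation."""
    area = math.pi * (d_hole_m / 2.0) ** 2
    # Driving pressure: vessel overpressure plus static head of liquid above the hole.
    dp = (p_vessel_pa - p_ambient_pa) + rho_kg_m3 * g * liquid_head_m
    velocity = math.sqrt(2.0 * dp / rho_kg_m3)
    return cd * area * rho_kg_m3 * velocity

# Illustrative numbers only (not data from the refinery study):
# a 6-inch (0.152 m) bottom connection on a butane vessel at roughly 10 bar(g).
m_dot = liquid_discharge_rate(d_hole_m=0.152, rho_kg_m3=580.0,
                              p_vessel_pa=11.0e5, p_ambient_pa=1.013e5,
                              liquid_head_m=2.0)
print(f"Estimated release rate: {m_dot:.0f} kg/s")
```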

Although of interest to the operator, the concept of BI has been neglected in this study. BI is a consequence of a major accident and can be derived from the damage caused by that accident. Rather than guessing delivery times for different process equipment and the time needed for investigation and possible reengineering of the process, the estimation of BI has been left to those better suited to make it.

4. Modelling Vapour Cloud Explosions

Broker A uses the SLAM software, which is based on the Congestion Assessment Method (CAM). Broker B uses the ExTool software, which is based on the TNT equivalency method. EFFECTS is based on the Multi-Energy method.

A brief description of the three different empirical methods mentioned will be given as well as the reason for not using CFD, the most accurate modelling technique.

TNT Equivalency Method (ExTool)
The TNT equivalency method assumes that a vapour cloud explosion is similar to the explosion of a high-explosive charge of TNT.
A pressure-distance curve yields the peak pressure, where the distance is scaled with a TNT mass equivalent. The TNT equivalent is obtained as the product of the explosion yield and the mass of hydrocarbons in the vapour cloud in accordance with (1):

W_TNT = 10 η W_HC,  (1)

where η is the empirical yield factor, normally set between 0.03 and 0.05, and W_HC is the mass of hydrocarbons in the cloud. The factor 10 is used since most hydrocarbons have roughly ten times higher heat of combustion than TNT; the ratio between the heats of combustion (ΔH_c,fuel/ΔH_c,TNT) can be used for other fuel types. The main weakness of the TNT method is that the yield factor and the pressure-distance curve are based on empirical data and are not theoretically derived. Also, since TNT is a solid-state explosive, the difference in physical behaviour between TNT and gas explosions is substantial. With this method the difference between the predicted overpressure and that of a real VCE is most pronounced close to, and far away from, the centre of the explosion. The method has a weak theoretical basis, but it is used because it is simple and under most circumstances gives a reliable upper estimate [3].
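As a minimal numeric sketch of the equivalency step, the snippet below evaluates (1) together with the conventional Hopkinson-scaled distance used to read a pressure-distance curve; the cloud mass and yield factor are illustrative values, not data from the study.

```python
def tnt_equivalent_mass(cloud_mass_kg, yield_factor=0.04):
    """TNT-equivalent charge mass per (1): W_TNT = 10 * eta * W_HC.
    The factor 10 approximates the hydrocarbon-to-TNT heat of combustion ratio."""
    return 10.0 * yield_factor * cloud_mass_kg

def scaled_distance(distance_m, w_tnt_kg):
    """Hopkinson-scaled distance Z = R / W^(1/3), used to read the peak
    overpressure from an empirical pressure-distance curve."""
    return distance_m / w_tnt_kg ** (1.0 / 3.0)

# Illustrative only: a 900 kg hydrocarbon cloud with a 4% yield factor.
w_tnt = tnt_equivalent_mass(900.0, yield_factor=0.04)
z_100 = scaled_distance(100.0, w_tnt)
print(f"TNT equivalent: {w_tnt:.0f} kg, scaled distance at 100 m: {z_100:.1f} m/kg^(1/3)")
```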

Multi-Energy Concept (EFFECTS)
The multi-energy concept assumes that only the confined or obstructed part of a vapour cloud gives rise to overpressure [11]. A combustion-energy-scaled distance R̄ is related to the distance R from the explosion centre according to (2):

R̄ = R / (E/p_0)^(1/3),  (2)

where p_0 is the atmospheric pressure and E is the total amount of combustion energy. E is calculated as the product of the combustion energy per unit volume and the congested cloud volume V. Since the combustion energy per unit volume of a stoichiometric hydrocarbon-air mixture is relatively constant regardless of the type of hydrocarbon, it is common to estimate the combustion energy according to (3):

E ≈ 3.5 MJ/m³ × V.  (3)

Data from explosion experiments have been fitted to the scaled distance R̄ and the overpressure for different charge strengths, dependent on, for example, the strength of the ignition source and the level of congestion. The charge strength is given a value in the range of one to ten, where ten represents a detonation.
Setting the charge strength and the total combustion energy are the main sources of uncertainty in the multi-energy concept.
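The scaling in (2)-(3) can be sketched as follows. The congested volume is an assumed illustrative figure, and the final lookup of overpressure from the Multi-Energy blast charts for a given charge strength is not reproduced here.

```python
P_ATM = 1.013e5   # atmospheric pressure [Pa]
E_VOL = 3.5e6     # combustion energy of a stoichiometric hydrocarbon-air mix [J/m^3]

def combustion_energy(congested_volume_m3):
    """Total combustion energy E per (3), using the ~3.5 MJ/m^3 rule of thumb."""
    return E_VOL * congested_volume_m3

def me_scaled_distance(distance_m, energy_j, p0=P_ATM):
    """Combustion-energy-scaled distance per (2): R_bar = R / (E/p0)^(1/3)."""
    return distance_m / (energy_j / p0) ** (1.0 / 3.0)

# Illustrative only: a 965 m^2 congested area with an assumed 2 m cloud depth.
E = combustion_energy(965.0 * 2.0)
print(f"E = {E/1e9:.1f} GJ, scaled distance at 100 m: {me_scaled_distance(100.0, E):.2f}")
# The scaled distance and the chosen charge strength (1-10) are then used to read
# the dimensionless overpressure from the Multi-Energy blast charts [11].
```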

Congestion Assessment Method (SLAM)
An assessment of the congested region is first done in order to obtain a reference pressure P_ref, which is an estimate of the maximum overpressure generated by a deflagration of a vapour cloud of propane [12]. The reference pressure is estimated with a decision tree that first takes confinement into account, then congestion or obstacles in the confined area, and last whether there are strong ignition sources. There are some similarities between the choice of charge strength in the Multi-Energy method and the choice of P_ref in CAM. If the vapour cloud does not consist of propane, the reference pressure is multiplied by a fuel factor to obtain a maximum source pressure. With the maximum source pressure, the overpressure at a specific distance is given by fitted data. CAM uses data from the MERGE (Modelling and Experimental Research into Gas Explosions) project.
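The CAM workflow described above can be sketched roughly as below. Note that the decision-tree multipliers and the fuel factor are placeholders chosen purely for illustration; they are not the published CAM values.

```python
def reference_pressure(confined, congested, strong_ignition):
    """Rough sketch of the CAM decision-tree idea: pick a reference overpressure
    for a propane cloud from qualitative confinement/congestion/ignition answers.
    The multipliers below are placeholders for illustration, NOT CAM's values."""
    p_ref_bar = 0.1
    if confined:
        p_ref_bar *= 4.0
    if congested:
        p_ref_bar *= 3.0
    if strong_ignition:
        p_ref_bar *= 2.0
    return p_ref_bar

def max_source_pressure(p_ref_bar, fuel_factor=1.0):
    """Non-propane fuels scale the reference pressure with a fuel factor."""
    return p_ref_bar * fuel_factor

# Illustrative: a fuel assumed to be more reactive than propane (fuel factor 1.5).
p_src = max_source_pressure(reference_pressure(True, True, False), fuel_factor=1.5)
print(f"Maximum source pressure (illustrative): {p_src:.2f} bar")
```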

CFD Models (Computational Fluid Dynamics)
A number of different CFD models are available today, but one has to be aware of their limitations, since the models are by no means perfect even for simple geometries. MERGE was an EU-funded project that tried to determine the accuracy of some explosion models. A cuboidal pipe array, shown in Figure 2, was filled with gas and thereafter ignited in the centre. The results, shown in Figure 3, display a considerable spread even for such a simple geometry.
Using a CFD model to predict the damage within a refinery would not only require a large amount of time for the actual modelling; first of all, a three-dimensional model of the refinery would be needed. Further, the accuracy of the model would be limited by the fact that a normal desktop computer today is unable to make the mesh fine enough. If a mesh were applied to a typical refinery area, the length of each finite volume would be too large to yield an answer precise enough to warrant the amount of time and work needed for the modelling.
After an extensive study that included 27 large-scale experiments, Ledin (1997) [3] drew the following conclusion: “My interpretation of the outcome of JIP-2 is that confidence can be attached to the model predictions only if the new geometry strongly resembles one of the two geometries in the database. It must be emphasised that even with the use of what appears to be in principle a more advanced model, that is, CFD-based, outside its area of validation/calibration it may in fact give little overall reduction in uncertainties over the use of simpler modelling approaches.”
The obstacle geometries of a typical refinery are thus too complex to be handled by the available CFD models, so a simpler model type should be used to screen for EML scenarios.

4.1. Property Damage Cost Estimation

Each process unit has an estimated total cost for rebuilding, and each overpressure is associated with a specific damage percentage. Today it is common to use specific threshold values; for example, an overpressure between 150 and 350 mBar corresponds to 40% damage. One thus calculates the damage to each sub-process area and thereafter sums up to reach the total damage cost. In reality, damage is related both to the duration of the overpressure and to the specific geometry of the structure [13]. One should also consider the mechanical properties of the structure, reflection, and so forth. However, calculations usually consider only peak overpressure and positive impulse [14].
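The threshold-band bookkeeping can be sketched as follows. The 150-350 mBar band corresponding to 40% damage follows the example above, while the remaining bands, rebuild costs, and overpressures are hypothetical illustrations.

```python
# Hypothetical threshold bands: (upper limit in mBar, damage fraction).
# The 150-350 mBar -> 40% band follows the example in the text; the rest are placeholders.
DAMAGE_BANDS = [(150, 0.10), (350, 0.40), (700, 0.70), (float("inf"), 1.00)]

def damage_fraction(peak_overpressure_mbar):
    """Stepwise mapping from peak overpressure to a damage fraction."""
    for upper_mbar, fraction in DAMAGE_BANDS:
        if peak_overpressure_mbar <= upper_mbar:
            return fraction

def total_property_damage(units):
    """Sum damage over process units given (rebuild_cost, peak_overpressure) pairs."""
    return sum(cost * damage_fraction(p_mbar) for cost, p_mbar in units)

# Illustrative: three sub-process areas with rebuild costs in MSEK and their overpressures.
units = [(800.0, 400.0), (1200.0, 220.0), (500.0, 90.0)]
print(f"Estimated PD: {total_property_damage(units):.0f} MSEK")
```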

5. Analysis

The first section presents the costs for five scenarios modelled by the two brokers. The second section discusses where in the two models the sources of difference originate; the focus is on damage due to overpressure, overpressure decay, releasable inventory, cloud weight, and cloud drifting. The third section shows the results of our study using a third software, EFFECTS. Modelling has been performed on all scenarios previously studied by the brokers, but details are given only for two scenarios, D1538 (drum) and T2302 (tower), due to the limited space of this paper.

5.1. Section One

The data used by Broker A and Broker B in their modelling is given in Table 1.

The operating volume in the towers was estimated according to two assumptions: the bottom level was assumed to be 3 m high, and the height of the liquid above each tray was assumed to be 0.05 m. For vessels and drums the assumption was that the liquid inventory was 50% of the total volume of the vessel.
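A minimal sketch of these inventory assumptions is given below; the tower diameter, tray count, and drum volume are illustrative and are not values from Table 1.

```python
import math

def tower_liquid_volume(diameter_m, n_trays, bottom_level_m=3.0, tray_liquid_m=0.05):
    """Liquid inventory in a tower: a 3 m bottom level plus 0.05 m of liquid
    on each tray, per the assumptions stated above."""
    area = math.pi * (diameter_m / 2.0) ** 2
    return area * (bottom_level_m + n_trays * tray_liquid_m)

def drum_liquid_volume(total_volume_m3, fill_fraction=0.5):
    """Vessels and drums: liquid assumed to occupy 50% of the total volume."""
    return total_volume_m3 * fill_fraction

# Illustrative geometry only: a 4 m diameter, 40-tray tower and a 120 m3 drum.
print(f"Tower inventory: {tower_liquid_volume(4.0, 40):.0f} m3")
print(f"Drum inventory:  {drum_liquid_volume(120.0):.0f} m3")
```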

V stands for Vessel, which in these cases are horizontal cylinders; D stands for Drum, which are vertical cylinders; and T stands for Tower.

Every scenario has its origin and ignition point within the process area of the refinery. Larger vessels exist outside the process area, for example within the storage area, but the cost of damage to the process equipment far outweighs the cost of damage to the storage equipment.

The estimated property damage costs (million SEK) for all scenarios are presented in Table 2.

5.2. Section Two

A number of modelling parameters could explain the difference between ExTool and SLAM. Among them, five were identified as the most critical potential sources of the difference: damage threshold values, overpressure decay, releasable inventory, cloud weight calculations, and allowed cloud drift.

Damage Thresholds
SLAM and ExTool employ two different sets of threshold values to calculate the damage percentage of process equipment. For clarity the values are shown as curves in Figure 4, but they are used as threshold values within the actual programs. For example, the whole area affected by an overpressure between 138 and 345 mBar (2–5 psi) will be 40% damaged according to ExTool.
Since the damage percentage, and the subsequent cost, depends on the overpressure as well as the ignition point, it is impossible to say exactly how big an impact the different sets of threshold values have. One thing can be said, though: for large explosions the ExTool threshold values give rise to higher costs.

Overpressure Decay
The pressure decay has been modelled with the Matlab software. In both scenarios 100 kmol of gas was used. For the typical alkane propane, a yield factor of 6% was used within the TNT model: 4% for a straight alkane, raised by 2 percentage points to account for confinement. For propene a yield factor of 9% was used, since within CAM propene has a fuel factor 1.5 times that of propane [12]. This gives a fairer view of the actual pressure decay within the models. Congestion is in both cases set as typical within the CAM method.
As can be seen in Figure 5, with the selected parameter values the distance to a certain overpressure does not differ much in the near field. However, in the far field the TNT model gives a lower overpressure than the CAM model.
This implies that a scenario that uses the TNT model for its pressure decay would in fact give lower costs. In the brokers' reports, however, ExTool yields higher costs than SLAM. Thus it is probably not the use of the TNT or CAM method for overpressure decay per se that gives rise to the differences in costs.

Releasable Inventory
For the brokers' modelling, the releasable amount had been set to a standard value of 50 percent of the vessel or drum volume. For towers, the bottom liquid height was assumed to be 3 m and the height of the liquid on each tray was assumed to be 0.05 m.
The two brokers estimate the releasable material in towers in different ways. Broker A allows only the part at the bottom of the tower to participate in the vapour cloud formation, whereas according to Broker B the whole content of a tower may participate in cloud formation.
In this study a more thorough survey was conducted on the inventory of the equipment. These results are shown in Section Three.

Cloud Weights
How to calculate the cloud weight is not part of the TNT or CAM models, but the software tools employed for the overpressure decay calculations contain a method to determine it. ExTool calculates the cloud weight as two times the flash fraction F [15], which in turn is calculated according to (4):

F = c_p(T_i) (T_i − T_b) / ΔH_vap(T_i),  (4)

where c_p is the heat capacity, ΔH_vap the heat of vaporisation, T_i the initial temperature of the inventory, and T_b the boiling point at atmospheric pressure. Both the heat capacity and the heat of vaporisation are chosen at the initial temperature of the inventory. Since they depend on temperature, ExTool overestimates the flash fraction.
SLAM calculates the cloud weight as the flash fraction alone [16], with the flash fraction calculated according to (5):

F = c_p(T_m) (T_i − T_b) / ΔH_vap(T_b),  with T_m = (T_i + T_b)/2,  (5)

where the heat of vaporisation is chosen at the boiling point of the compound at atmospheric conditions, whereas the heat capacity is chosen at the mean temperature between the boiling temperature and the initial reference temperature. Since the heat of vaporisation and the heat capacity are both temperature dependent, the same temperature should be used when choosing the physical parameters for (5); with these choices the quotient gives an underestimation of the flash fraction. Moreover, using only the flash fraction to calculate the cloud weight omits the entrainment phenomenon, further decreasing the total cloud weight.
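A minimal numeric sketch of the flash-fraction expression, and of the resulting ExTool-style (2F) versus SLAM-style (F) cloud weights, is given below. The property values are approximate propane figures, and the two tools' different temperature conventions described above are not distinguished here.

```python
def flash_fraction(cp_j_per_kg_k, t_initial_k, t_boil_k, dh_vap_j_per_kg):
    """Flash fraction F = cp * (T_initial - T_boil) / dH_vap, bounded to [0, 1]."""
    f = cp_j_per_kg_k * (t_initial_k - t_boil_k) / dh_vap_j_per_kg
    return min(1.0, max(0.0, f))

# Approximate propane-like properties (illustrative literature values):
cp = 2500.0      # J/(kg K), liquid heat capacity
dh_vap = 4.3e5   # J/kg, heat of vaporisation near the boiling point
t_init = 293.0   # K, storage temperature
t_boil = 231.0   # K, atmospheric boiling point

F = flash_fraction(cp, t_init, t_boil, dh_vap)
mass_released = 50000.0  # kg, illustrative released inventory

cloud_extool_style = 2.0 * F * mass_released  # 2F: crude allowance for entrained aerosol
cloud_slam_style = F * mass_released          # F alone: entrainment omitted
print(f"F = {F:.2f}, 2F-style cloud: {cloud_extool_style:.0f} kg, "
      f"F-style cloud: {cloud_slam_style:.0f} kg")
```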

Cloud Drifting
ExTool has a clearly defined method for calculating the maximum cloud drift: after modelling an ignition at the point of release, the cloud is allowed to travel within the 138 mBar isobar to find the position associated with the highest cost. Two objections to this method can be raised. First, since there is no connection between wind speed and dispersion, the cloud contains the same total weight no matter how far it travels. Secondly, this implies that the larger the cloud, or the more reactive it is, the longer it will be allowed to travel before ignition.
For SLAM no exact data on cloud drifting has been found, but it appears that the centre of ignition is normally within 75 m of the release point. However, so-called “engineering judgement” has been used to override the initial ignition point in one of the cases, D1538. This can be done by the user if a reasonable additional drift would induce a significant rise in cost.
It is not reasonable to think that a major part of the differences in damage costs can be attributed to these small differences in allowed cloud drift: ExTool scans a large part of the refinery, and SLAM is overridden if the cost is maximized outside the initial iteration zone.

5.3. Section Three

Using the software EFFECTS instead of SLAM or ExTool eliminates two of the parameters mentioned above, cloud drifting and cloud weight.

ExTool and SLAM use (4) and (5), respectively, to calculate the cloud weight. This crude method is surpassed by the use of so-called coupled models in EFFECTS. The main advantage of EFFECTS is the ability to model a chain of events, each with its specific method, feeding the result of each step into the following model. The scenarios have been modelled in the following fashion: TPDIS (bottom venting) → Spray Release → Dense Gas Dispersion → Dense Gas Explosive Mass. This means that the results from each submodel have been fed into the next to arrive at the quantity of interest, the cloud weight. ExTool uses the TNT model for its overpressure generation and decay, and SLAM uses the CAM method to determine the centre overpressure and the subsequent decay after ignition. In EFFECTS the coupling of the models is continued by linking an explosion model based on the Multi-Energy concept to the dense gas explosive mass. For more information on the three software tools, see the Yellow Book [8] for EFFECTS, the ExTool theory manual [15], and the guidance document for SLAM [17].
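The coupling idea can be sketched as a simple pipeline in which each submodel's output becomes the next submodel's input. The function names and returned values below are stand-ins for illustration only; they are not the actual EFFECTS interfaces, and the numbers are dummies.

```python
# Sketch of the coupled-model chain; the functions are placeholders that only
# illustrate how each submodel's output feeds the next. They are NOT the actual
# EFFECTS interfaces, and the returned numbers are dummies.

def tpdis_bottom_venting(vessel):
    return {"fluid": vessel["contents"], "mass_flow_kg_s": 380.0}

def spray_release(jet):
    return {**jet, "airborne_fraction": 0.7}

def dense_gas_dispersion(source, stability="F", wind_m_s=1.5):
    return {**source, "cloud_area_m2": 3050.0}

def dense_gas_explosive_mass(cloud):
    return {**cloud, "explosive_mass_kg": 900.0}

def multi_energy_blast(explosive, charge_strength=7):
    return {**explosive, "charge_strength": charge_strength}

state = {"contents": "propene"}
for step in (tpdis_bottom_venting, spray_release, dense_gas_dispersion,
             dense_gas_explosive_mass, multi_energy_blast):
    state = step(state)  # each result becomes the input of the next submodel
print(state)
```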

Each scenario has its starting point in a complete rupture of the nearest flange on the bottom pipe of the specific process equipment.

For the cloud weight calculations in EFFECTS, four parameters are considered key factors due to their high impact on the end result of the cloud weight modelling. These four parameters are presented in Table 3.

The scenarios have been calculated for different stability classes and wind speeds; however, stability class F and a wind speed of 1.5 m/s were found to be the worst circumstances in every case. It should be noted that wind speeds below 1.5 m/s have not been tested, since the dispersion model is not considered valid at such low wind speeds.

The cloud weights calculated by three different tools are shown in Table 4. These values are affected to a great extent by the releasable inventory presented in Table 5.

Further elimination of uncertainty was achieved by using process data and drawings of the process equipment to improve the certainty of the releasable-material parameter. For releasable material within towers, the amount of liquid that can pass through one tray per second is lower than the decay in explosive mass due to dispersion. Therefore only the bottom content has been considered as taking part in the release when estimating the maximum explosive mass. As shown in Table 5, the difference between the estimated amount and the real amount is significant.

Using EFFECTS also eliminates the cloud drift parameter, since it is possible to see the explosive mass at each given time and thus to rule out extremely long drifts when the additional drift time severely reduces the explosive mass.

For the modelling of the overpressure, the obstruction has been considered low, the ignition source strength high, and parallel plane confinement has been deemed present. According to Kinsella [19], these assumptions give blast strengths ranging from 5 to 7. In accordance with the EML concept, the highest blast strength in the range (7) has been chosen.

A comparison of blast prediction models for vapour cloud explosions, carried out in 2001 at the NRC, shows how data from different models fit the observed pressure data from the Flixborough accident and the accident in La Mède [20]. The close fit in this comparison for the Multi-Energy method with blast strength 7 indicates that this is a valid choice.

As for the threshold value parameter, there is no consensus on which threshold values to use, and it is beyond the scope of this study to investigate such values further. However, as can be seen in Figure 6, the values used by ExTool appear to be quite conservative. All scenarios studied in EFFECTS have been modelled with both sets of threshold values to show the difference between the results. With the limited amount of data acquired it would be presumptuous to point to either set as definitively correct or incorrect.

Scenario 1: D1538
Drum 1538 contains 100% propene and is situated about 6 m above the ground. Average values for the pressure, temperature, and volume inside the drum have been taken from process data averaged over the period 2007-11-27 to 2007-12-27; the data are thus set to [bar], [°C], and [m3]. To estimate a worst-case scenario, a total rupture of a 6′′ pipe situated 6.05 m above the ground has been simulated in EFFECTS.
The digitized cloud shape 17 s after the release, representing the maximum explosive mass, is shown in Figure 7. The cloud drifted approximately 200 m, which is considered far but not unreasonable. The time for the digitalization has been taken from the graph shown in Figure 8, and the total area of the cloud has been taken from the graph shown in Figure 9.
The total area of the cloud is 3050 m2, corresponding to 900 kg of explosive mass. The confined area was approximated, at an onsite inspection carried out between the dispersion and explosion steps, to be 965 m2. Figure 10 shows the different damage zones using the thresholds from SLAM, and Figure 11 shows the damage zones using the threshold values from ExTool.

Scenario 2: T2302
The main content of Tower 2302 is C5+ (79 w%). The temperature within the vessel has been chosen as [°C]. This is not a true process value, but since EFFECTS can only handle single-component releases, the temperature was fitted to the initial pressure within the vessel, [bar]. The volume able to participate in the cloud formation is [m3]. The release is modelled as coming from a 20′′ pipe situated 5.3 m above the ground, and the inventory was modelled as pure heptane.
The digitized cloud shape 11 s after the release, representing the maximum explosive mass, is shown in Figure 12. In the figure it can be seen that the cloud has spread both westward and eastward from the release point. This is not considered impossible, since it is an inherent trait of a denser-than-air gas to move both upwind and downwind of the release point. Since the cloud has spread into two separate process areas, two simultaneous explosions are modelled; the distance between these two process areas is larger than the critical separation distance for the Multi-Energy method [21]. The time for the digitalization has been taken from Figure 13, and the total area of the cloud has been taken from Figure 14.
The total area of the cloud is 14803 m2 and the total explosive mass is 28300 kg.
The confined area was approximated at an onsite inspection between the dispersion and explosion steps; the two areas are 3531 and 1010 m2, respectively. Figure 15 shows the different damage zones using the thresholds from SLAM, and Figure 16 shows the damage zones using the threshold values from ExTool.

6. Summary of the Results

Since there is no mechanism in EFFECTS that allows the calculation of cost, an alternative way of estimating costs was employed: using Adobe Photoshop CS2, the number of pixels belonging to each process unit within each damage zone was counted to obtain the percentage of that unit falling inside each zone.
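A scripted alternative to this manual pixel count, sketched below, would count pixels per damage-zone colour in an annotated image. The file name, zone colours, and colour-to-damage mapping are assumptions for illustration only.

```python
from PIL import Image

# Hypothetical RGB colour of each damage zone in the annotated image -> damage fraction.
ZONE_COLOURS = {
    (255, 0, 0): 1.00,    # full damage
    (255, 165, 0): 0.40,  # 40% damage
    (255, 255, 0): 0.10,  # 10% damage
}

def zone_pixel_shares(path):
    """Share of coloured pixels falling in each damage zone of an annotated plot."""
    img = Image.open(path).convert("RGB")
    counts = {colour: 0 for colour in ZONE_COLOURS}
    for pixel in img.getdata():
        if pixel in counts:
            counts[pixel] += 1
    total = sum(counts.values()) or 1
    return {ZONE_COLOURS[colour]: n / total for colour, n in counts.items()}

print(zone_pixel_shares("damage_zones_d1538.png"))  # hypothetical file name
```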

Two apparent trends can be seen in Table 6. Except for D1538, this study gave higher damage costs than those calculated by both Broker A and Broker B, and higher costs are predicted when the ExTool threshold values are used. It has previously been mentioned that the ExTool threshold values seem to be conservative; however, this is perhaps done in order to account for the steeper pressure decay of TNT compared to gas explosions.

It is also apparent that, in all the scenarios studied, the final cost suggested by EFFECTS is more in accordance with ExTool than with SLAM. The percentage of the releasable inventory that is turned into an explosive cloud is also most consistent between ExTool and EFFECTS. This suggests that the simplification of using two times the flash fraction to account for entrainment effects is acceptable.

As can be seen from Table 6, there are large deviations in releasable inventory between the scenarios modelled in EFFECTS and those modelled in SLAM and ExTool. Most of these differences could have been avoided if a more thorough survey of the refinery inventory had been done from the start.

7. Discussion

As a starting point for this study, a definition of an EML scenario was needed. Although a written definition was eventually found that could cover the work about to be undertaken, as well as the brokers' previous studies, this was not the only definition found. A number of different abbreviations can be found in the literature: PML (probable maximum loss), MCL (maximum credible loss), MFL (maximum foreseeable loss), EML (estimated maximum loss), and NML (normal maximum loss). All of these abbreviations, which more or less imply the same thing, come with their own probability intervals; however, both the intervals and the definitions differ between sources. Thus even choosing a proper probability level for an EML is not as easy as it might sound. The way the two brokers handle the release from towers clearly mirrors this lack of a clear-cut definition. Not only does an investigator have to find a proper definition and a proper probability interval; the investigator also has to choose one definitive source for accident probabilities.

For a refinery of the size we have studied, 140 000 m2, it is unlikely that the modelling of EML scenarios actually helps the decision process. With clever use of the models it is possible to make most of the scenarios vary from denting the closest equipment to total annihilation of the whole process area. This kind of modelling may only be useful when, no matter how far the model is stretched, there are still parts of the refinery outside the blast radius, although one should remember that domino effects are neglected in EML modelling.

In order to choose a model, one must consider the purpose of the modelling as well as the necessary precision of the model. Table 7 lists a set of criteria to help with such a decision. The criteria are defined as follows:
(i) Transparency: the ease of finding out, and interpreting, how a certain model works;
(ii) Input Demand: the time that needs to be spent collecting the data necessary to use the model;
(iii) Complexity: the level of knowledge needed to use the model, which also reflects how much influence the choice of analyst may have;
(iv) Precision: the model's ability to accurately reflect the reality of a VCE.

Since EML is so ill defined, and the number of uncertainties in dispersion, drift, ignition, discharge rates, damage thresholds, equipment failures, and so forth is so high, there is no reason to aim for a time-consuming, although somewhat more accurate, model. As a primary choice for EML modelling, if such modelling must be conducted at all, ExTool stands out with its ease of interpretation owing to its high transparency and low complexity; the short time needed to collect the input data further strengthens its position. Explosion modelling is a specialist field, and it is highly inappropriate to mask the very high uncertainty in the modelling behind complex models. No matter the choice of model, though, increased openness about its limitations would be in order.

When comparing the models one can see that the energy released as overpressure can be matched by setting parameters such as the confinement percentage and the yield coefficient at the right levels. In the results reported herein we see the effect of these two parameters: the energy released as overpressure has been higher for the Multi-Energy method, used by EFFECTS, simply because we have judged the confined area to be larger than what the yield coefficient used by Broker B in ExTool (TNT model) corresponds to. One remark concerning the three methods dissected here must therefore be made: any of the three methods can be made to fit most historical cases. In the TNT method the analyst can adjust the yield, in SLAM (CAM method) the reference pressure can be adjusted, and in EFFECTS (Multi-Energy method) the charge strength and confinement values can be adjusted. All three overpressure models are also dependent on a sensible choice of vapour cloud weight, which in historical cases can seldom be definitively known. Hence any claim of historical accuracy, including the ones made within this report, should be taken with a grain of salt. This flexibility of the models is a strength as well as a weakness: as with any software model, it is almost always possible to get the response one wants. Modelling must therefore be done by a competent analyst and with a predefined set of rules that must not be broken.

8. Conclusion

The EML concept as it is used today is a rather loosely defined method for computing the maximum damage due to a large-scale accident. Moreover, any modelling at conditions as extreme as those used for EML scenarios is bound to be uncertain, since the scale is balancing on the edge of the validity range of any model used. Hence improvements should be made not only to the models used but also to the EML concept itself. Clear cutoff values for the probability of an accident should be used to avoid the “not plausible” argument sometimes heard. As for improving the models themselves, no clear reason for working with threshold values for overpressure damage can be found; a continuous curve seems more fitting in the age of computers. The possibility of shifting such a curve to account for differences in overpressure tolerance between different types of process equipment could also be explored. Further, phenomenological models have been left out of this study, but the results of such a model could prove interesting, at least as a comparison.

All in all, there are many aspects to investigate further in order to make potential loss predictions more reliable, and this should be well worthwhile since much money is at stake when plant owners and insurers decide on insurance limits and premiums.