Lab 5: Sea-Level Rise

CEVE 421/521

Published: Fri., Feb. 16

There are two objectives of this lab:

  1. To familiarize ourselves with an increasingly complex model of our house-elevation problem
  2. To conduct exploratory modeling to understand the implications of different parameter values and how they affect our decision-making

Setup

The usual

As always:

  1. Clone the lab repository to your computer
  2. Open the lab repository in VS Code
  3. Open the Julia REPL and activate, then instantiate, the lab environment
  4. Make sure you can render: run quarto render template.qmd in the terminal.
    • If you run into issues, try running ] build IJulia in the Julia REPL (] enters the package manager).
    • If you still have issues, try opening up blankfile.py. That should trigger VS Code to give you the option to install the Python extension, which you should do. Then you should be able to open a menu in the bottom right of your screen to select which Python installation you want VS Code to use.
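The activate-and-instantiate step above can also be done with the Pkg API instead of the ] shortcuts. A quick sketch, assuming you run it from the lab folder:

```julia
# Equivalent to `] activate .` followed by `] instantiate`
using Pkg
Pkg.activate(".")  # activate the lab environment in the current folder
Pkg.instantiate()  # download and install the pinned dependencies
```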

Load packages

using CSV
using DataFrames
using DataFramesMeta
using Distributions
using Plots
using StatsPlots
using Unitful

Plots.default(; margin=5Plots.mm)

Local package

We’re starting to accumulate a lot of code describing our model. A good way to store this model is by creating a local package. I have created a package called HouseElevation that contains the model code. You don’t need to do anything special to install it, and you don’t need to edit the code, though I’d encourage you to have a look around!

When we work with local packages, it’s common to use another package called Revise. This is a cool package that will automatically propagate any changes you make to the package to any code that uses the package. You don’t need to worry about this for now – just load them.

using Revise
using HouseElevation

Building the model

We’ve added a bit of complexity to our model. In this section, we walk through each of its components.

House

We will consider a single house and neglect uncertainty in the depth-damage function and other house parameters.

  • Neglect uncertainty in the depth-damage function
  • Consider a single building
  • Put all relevant information into a House object:
    • Depth-damage function
    • Area
    • Cost (USD)
    • Elevation relative to gauge
    • Metadata

We can create a House as follows – note that we’re using a let...end block to create the House object. This means that any variables defined inside the block are not available outside the block, which is a good way to avoid “polluting the global namespace.”

house = let
    haz_fl_dept = CSV.read("data/haz_fl_dept.csv", DataFrame) # read in the file
    desc = "one story, Contents, fresh water, short duration"
    row = @rsubset(haz_fl_dept, :Description == desc)[1, :] # select the row I want
    area = 500u"ft^2"
    height_above_gauge = 10u"ft"
    House(
        row;
        area=area,
        height_above_gauge=height_above_gauge,
        value_usd=250_000,
    )
end

We can then use the House object to calculate the damage to the house for a given flood depth. Let’s convert the damage to dollars by multiplying the fraction (given by our depth-damage function) by the value of the house. For example:

let
    depths = uconvert.(u"ft", (-7.0u"ft"):(1.0u"inch"):(30.0u"ft"))
    damages = house.ddf.(depths) ./ 100
    damages_1000_usd = damages .* house.value_usd ./ 1000
    scatter(
        depths,
        damages_1000_usd;
        xlabel="Flood Depth",
        ylabel="Damage (Thousand USD)",
        label="$(house.description)\n($(house.source))",
        legend=:bottomright,
        size=(800, 400),
        yformatter=:plain, # prevents scientific notation
    )
end

We can also use the House object to calculate the cost of raising the house to a given elevation. We use the elevation_cost function like this:

elevation_cost(house, 10u"ft")
67620.0

and again we can plot this.

let
    elevations = 0u"ft":0.25u"ft":14u"ft"
    costs = [elevation_cost(house, eᵢ) for eᵢ in elevations]
    scatter(
        elevations,
        costs ./ 1_000;
        xlabel="Elevation",
        ylabel="Cost (Thousand USD)",
        label="$(house.description)\n($(house.source))",
        legend=:bottomright,
        size=(800, 400),
        yformatter=:plain, # prevents scientific notation
    )
end

Sea-level rise

We will sample many different scenarios of sea-level rise

We’re modeling sea-level rise following the approach of Oddo et al. (2017). Essentially, we use five parameters: \(a\), \(b\), \(c\), \(t^*\), and \(c^*\). The local sea-level in year \(t\) is given by equation 6 of Oddo et al. (2017):

\[ \mathrm{SLR}= a + b(t - 2000) + c (t - 2000)^2 + c^* \, \mathbb{I} (t > t^*) (t - t^*) \]

The authors note:

In this model, the parameters \(a\), \(b\), and \(c\) represent the reasonably well-characterized process of thermosteric expansion as a second-order polynomial. It also accounts for more poorly understood processes, including potential abrupt sea-level rise consistent with sudden changes in ice flow dynamics. Here, \(c^*\) represents an increase in the rate of sea-level rise that takes place at some uncertain time, \(t^*\), in the future.

This is, of course, a highly simplified model. However, the parameters can be calibrated to match historical sea-level rise using a statistical inversion method (i.e., throwing out any parameter values that don’t match the historical record). One could also calibrate the parameters to match other, more complex, physics-based models. We’ll use Monte Carlo simulations from Oddo et al. (2017), available on GitHub. These were actually calibrated for the Netherlands, but we’ll pretend that sea-level rise at your location matches (which – as we know – it doesn’t).
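Equation 6 translates almost directly into Julia. The sketch below shows the general shape of such a type; the actual Oddo17SLR implementation lives in the HouseElevation package, and the field names and types here are assumptions for illustration:

```julia
# Illustrative sketch of equation 6 — the packaged Oddo17SLR type plays this role
struct SLRSketch
    a::Float64      # sea-level anomaly in the year 2000
    b::Float64      # linear rate of rise
    c::Float64      # quadratic (acceleration) term
    tstar::Float64  # uncertain year when the rate of rise increases
    cstar::Float64  # added rate of rise after tstar
end

# Make instances callable, so slr(t) evaluates equation 6 in year t
function (s::SLRSketch)(t)
    return s.a + s.b * (t - 2000) + s.c * (t - 2000)^2 +
           s.cstar * (t > s.tstar) * (t - s.tstar)
end
```

Making the struct callable is what lets the plotting code below write s.(years) to evaluate a scenario over a range of years.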

slr_scenarios = let
    df = CSV.read("data/slr_oddo.csv", DataFrame)
    [Oddo17SLR(a, b, c, tstar, cstar) for (a, b, c, tstar, cstar) in eachrow(df)]
end
println("There are $(length(slr_scenarios)) parameter sets")

We can plot these scenarios to get a sense of the range of sea-level rise we might expect.

let
    years = 1900:2150
    p = plot(;
        xlabel="Year",
        ylabel="Mean sea-level (ft)\nwith respect to the year 2000",
        label="Oddo et al. (2017)",
        legend=false
    )
    for s in rand(slr_scenarios, 250)
        plot!(p, years, s.(years); color=:lightgrey, alpha=0.5, linewidth=0.5)
    end
    p
end

The key insight you should take from this plot is that uncertainty in future sea level increases over time!

Storm surge

We will consider parametric uncertainty in the storm surge

The next component of the model is the storm surge (i.e., the height of the flood above mean sea-level). We can model the water level at the gauge as the sum of the local sea-level and the storm surge. We can then model the water level at the house as the water level at the gauge minus the elevation of the house above the gauge.

We will consider parametric uncertainty in the storm surge. From Lab 3, you should have a GeneralizedExtremeValue distribution for the storm surge. We can sample parameters from a range centered on that estimate. For example, in the Lab 3 example we had GeneralizedExtremeValue(5, 1.5, 0.1). We can use the following function to create a distribution for the storm surge:

function draw_surge_distribution()
    μ = rand(Normal(5, 1))
    σ = rand(Exponential(1.5))
    ξ = rand(Normal(0.1, 0.05))
    GeneralizedExtremeValue(μ, σ, ξ)
end

We can then call this function many times to get many different distributions for the storm surge. For example,

[draw_surge_distribution() for _ in 1:1000]
Important

This is NOT statistical estimation. We are not saying anything at all about whether these parameters are consistent with observations. In fact, even when parameters are uncertain, sampling around a point estimate in this manner usually produces lots of parameter values that are highly implausible. Here, we are just exploring the implications of different parameter values. Building a better model for storm surge is a great idea for your final project!
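To get a feel for how widely these sampled distributions vary, you can overlay the densities of a few draws (a sketch using the StatsPlots recipe that plots a Distribution directly):

```julia
let
    p = plot(;
        xlabel="Storm Surge",
        ylabel="Density",
        legend=false,
    )
    # each draw is a different GEV distribution; plot its density curve
    for d in [draw_surge_distribution() for _ in 1:50]
        plot!(p, d; color=:lightgrey, alpha=0.5, linewidth=0.5)
    end
    p
end
```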

Discount rate

We will consider parametric uncertainty in the discount rate.

The discount rate is an important economic parameter in our NPV analysis. Some elements of discounting are perhaps not random (e.g., how much do you value the future versus the present?), while others are very much random (e.g., what is the opportunity cost of spending money now?). We will model this by treating the discount rate as a random variable, though more sophisticated analyses are possible. We can use the following function:

function draw_discount_rate()
    return rand(Normal(0.04, 0.02))
end

Note that we are now defining the discount rate as a proportion (from 0 to 1) rather than a percentage (from 0 to 100).
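As a quick sanity check on those units: with the rate expressed as a proportion, a cash flow \(t\) years from now is discounted by \(1/(1+r)^t\). A small illustration with hypothetical numbers:

```julia
# Present value of $1,000 received t years from now, at a 4% discount rate
r = 0.04                   # a proportion (0 to 1), not a percentage
pv(t) = 1_000 / (1 + r)^t  # standard discounting
pv(50)                     # ≈ 140.71 — distant cash flows count for little
```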

Running a simulation

In the notation we’ve seen in class, we have a system model \(f\) that takes in a state of the world \(\mathbf{s}\), an action \(a\), and outputs some metric or metrics. I’ve reproduced this in our model, adding one extra piece: a ModelParams object that contains all the parameters of the model that don’t change from one simulation to the next.

In our model, the ModelParams are the house characteristics (area, value, and depth-damage curve) and the years we’re considering. You should consider different time horizons!

p = ModelParams(
    house=house,
    years=2024:2083
)

The next step is to create an object to hold our state of the world (SOW). We can create one like this. In the next step, we’ll want to sample a large ensemble of SOWs.

sow = SOW(
    rand(slr_scenarios),
    draw_surge_distribution(),
    draw_discount_rate()
)

Last, we need to define our action. For now, our action is very simple: we raise the house to a fixed elevation. However, in the future we might have a more complex action (e.g., when the sea level exceeds some threshold \(t_1\), raise the house by some fixed amount \(t_2\), which has two parameters). We define our action as follows:

a = Action(3.0u"ft")

Finally, we have a function to run the simulation. This function takes in the model parameters, the state of the world, and the action, and returns the NPV of the action. Please have a look at run_sim.jl to see how this is implemented!

res = run_sim(a, sow, p)
-273044.24141314684
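Conceptually, the simulation combines the pieces above: pay the construction cost up front, then accumulate discounted expected flood damages over the time horizon. The following is a heavily simplified sketch of that idea only — field names, the Monte Carlo sample size, and other details are assumptions, and the real logic is in run_sim.jl:

```julia
# Illustrative sketch — NOT the actual implementation; see run_sim.jl
function run_sim_sketch(a, sow, p)
    # 1. Up-front cost of elevating the house
    construction_cost = elevation_cost(p.house, a.Δh_ft)

    # 2. Expected annual damages: sea level plus sampled surges, minus the
    #    house's height above the gauge (including the elevation action),
    #    converted to dollars via the depth-damage function
    eads = map(p.years) do year
        depths = rand(sow.surge_dist, 10_000) .+ sow.slr(year) .-
                 (p.house.height_above_gauge + a.Δh_ft)
        mean(p.house.ddf.(depths) ./ 100) * p.house.value_usd
    end

    # 3. Discount each year's expected damages to the present and sum
    years_from_now = p.years .- first(p.years)
    npv_damages = sum(eads ./ (1 + sow.discount_rate) .^ years_from_now)

    return -(construction_cost + npv_damages)
end
```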

Exploratory modeling

Now that you’ve figured out how this model works, it’s your turn to conduct some exploratory modeling. In template.qmd, I’ve provided only the code required to load packages.

Apply the model to your site

  1. Build your own house object, based on the house you’ve been using (or you can switch if you’d like)
    1. Briefly explain where you got the area, value, and depth-damage curve from
    2. Plot the depth-damage curve
    3. Plot the cost of raising the house to different elevations from 0 to 14 ft
  2. Read in the sea-level rise data
  3. Modify my code to create a function to draw samples of storm surge and the discount rate. Explain your modeling choices!
  4. Define an illustrative action, SOW, and model parameters, and run a simulation.

Large ensemble

Now that you’ve got the model working for your site, you should run a large ensemble of simulations (explain how you interpret “large”).

  1. Sample many SOWs (see below)
  2. Sample a range of actions. You can do this randomly, or you can look at just a couple of actions (e.g., 0, 3, 6, 9, 12 ft) – explain your choice.
  3. Run the simulations for each SOW and action. You can use a for loop for this.
  4. Create a DataFrame of your key inputs and results (see below)

Here’s how you can create a few SOWs and actions and run the simulations for each:

sows = [SOW(rand(slr_scenarios), draw_surge_distribution(), draw_discount_rate()) for _ in 1:10] # for 10 SOWs
actions = [Action(3.0u"ft") for _ in 1:10] # these are all the same
results = [run_sim(a, s, p) for (a, s) in zip(actions, sows)]
10-element Vector{Float64}:
 -119809.48968875225
      -1.186686027284644e6
  -65399.083502911075
  -70227.35536411834
  -62245.679107963915
  -64911.20307788288
  -61567.0
 -303261.33731304156
  -95120.13989756959
  -61567.85528667119

Here’s how you can create a dataframe of your results. Each row corresponds to one simulation, and the columns are the inputs and outputs of the simulation.

df = DataFrame(
    npv=results,
    Δh_ft=[a.Δh_ft for a in actions],
    slr_a=[s.slr.a for s in sows],
    slr_b=[s.slr.b for s in sows],
    slr_c=[s.slr.c for s in sows],
    slr_tstar=[s.slr.tstar for s in sows],
    slr_cstar=[s.slr.cstar for s in sows],
    surge_μ=[s.surge_dist.μ for s in sows],
    surge_σ=[s.surge_dist.σ for s in sows],
    surge_ξ=[s.surge_dist.ξ for s in sows],
    discount_rate=[s.discount_rate for s in sows],
)
10×11 DataFrame
 Row   npv          Δh_ft   slr_a    slr_b    slr_c        slr_tstar   slr_cstar   surge_μ   surge_σ    surge_ξ      discount_rate
   1   -1.19809e5   3.0     22.7455  1.97232  0.00108138   2055.61     13.6815     5.71354   1.23384     0.132045    0.0305316
   2   -1.18669e6   3.0     31.8042  2.12253  0.00181862   2075.83     26.0206     2.72266   6.29061     0.18237     0.0337339
   3   -65399.1     3.0     45.1381  2.67168  0.00475014   2073.01     14.3629     4.89454   1.07941     0.0557278   0.0696326
   4   -70227.4     3.0     34.2526  2.15362  0.00179496   2023.02     12.3762     5.18212   0.807767    0.0591513   0.0147343
   5   -62245.7     3.0     46.1374  2.71067  0.00596875   2052.41     26.0308     5.45452   0.441162    0.138966    0.0513225
   6   -64911.2     3.0     41.5385  2.29426  0.00299289   2096.37     30.6067     4.54426   1.44636    -0.0583326   0.0244682
   7   -61567.0     3.0     16.885   1.8469   0.000219119  2087.38     17.5534     4.78217   0.168667    0.114206    0.0454478
   8   -3.03261e5   3.0     41.1331  2.46154  0.00395867   2089.86      9.03578    3.90958   2.74709     0.12845     0.044172
   9   -95120.1     3.0     44.1679  2.46237  0.00460655   2078.93     31.6353     3.86119   1.83269     0.0248341   0.0406796
  10   -61567.9     3.0     41.0524  2.0722   0.00136389   2051.64      8.46521    3.84943   0.344563    0.0705257   0.017751

Analysis

Now, analyze your results. You can use scatterplots and other visualizations, or any other statistical analyses that you think may be helpful. Remember that the goal is to understand how different parameter values affect the success or failure of different actions.

Some questions to consider:

  • When do you get the best results?
  • When do you get the worst results?
  • What are the most important parameters?
  • If you had unlimited computing power, would you run more simulations? How many?
  • What are the implications of your results for decision-making?
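As a starting point for this analysis, the @df macro from StatsPlots makes it easy to scatter one sampled input against the outcome. A sketch, assuming you have built the df DataFrame from the previous section:

```julia
# Does a higher discount rate make elevating look better or worse?
@df df scatter(
    :discount_rate,
    :npv ./ 1_000;
    xlabel="Discount Rate",
    ylabel="NPV (Thousand USD)",
    label="",
)
```

Repeating this for each input column (or coloring points by Δh_ft once you vary the action) is a quick way to spot which parameters drive the results.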

References

Oddo, P. C., Lee, B. S., Garner, G. G., Srikrishnan, V., Reed, P. M., Forest, C. E., & Keller, K. (2017). Deep uncertainties in sea-level rise and storm surge projections: Implications for coastal flood risk management. Risk Analysis, 0(0). https://doi.org/ghkp82