
Comments (4)

roaldarbol commented on July 28, 2024

As an aside, would there be any way of knowing whether such a discrete event has a delayed effect? Say the lights turn on and off over a 5-minute period, but the effect of that event plays out over the following hour, despite the light no longer being on.

from mvgam.

nicholasjclark commented on July 28, 2024

Hi @roaldarbol, thanks for the question and the interesting example. For this kind of problem, I'd probably ignore the smooth of time and focus instead on the different structural changes that are happening. I've found a lot of use for including predictors that represent the time since some event happened. In this case, including a measure of the time since entering the "b" group can be an effective way to investigate the delayed effect of the "b" condition. Of course, if there are consistent temporal dynamics you'd need to deal with, you could include those as well, but I've left them out here to keep the point simple:

library(mvgam)
library(marginaleffects)
library(dplyr)
library(ggplot2); theme_set(theme_bw())

# Simulate data
t <- seq(from = 0, to = 20, length.out = 1000)
v1 <- rep(1, 500)
v2 <- rep(2, 500)
noise <- rnorm(1000, mean = 0.5, sd = 1)
values <- c(v1,v2) + noise
a <- rep("a", 500)
b <- rep("b", 500)
df <- data.frame(
  time = t,
  value = values,
  # Don't use the name 'group' as it causes problems for marginaleffects
  grp = c(a,b)
) |> 
  mutate(value = if_else(time > 15, value - ((time-14)^1.2), value)) |>
  # Add a time since the start of condition 'b' as a new variable
  mutate(time_since_b = c(rep(0, 500),
                          1:500))
df <- bind_rows(df, df, df, df, df) |> 
  mutate(time = row_number(),
         season = as.integer(time / 1000),
         series = factor("A"))

# Fit a model that estimates the nonlinear, delayed impact of entering
# condition 'b'
mod <- mvgam(value ~ grp + 
               s(time_since_b, k = 15),
             data = df,
             family = gaussian(),
             # Meanfield variational inference should be just fine for this model,
             # and will be lightning fast
             algorithm = 'meanfield',
             backend = 'cmdstanr')
#> Compiling Stan program using cmdstanr
#> 
#> Start sampling
#> ------------------------------------------------------------ 
#> EXPERIMENTAL ALGORITHM: 
#>   This procedure has not been thoroughly tested and may be unstable 
#>   or buggy. The interface is subject to change. 
#> ------------------------------------------------------------ 
#> Gradient evaluation took 0.000375 seconds 
#> 1000 transitions using 10 leapfrog steps per transition would take 3.75 seconds. 
#> Adjust your expectations accordingly! 
#> Begin eta adaptation. 
#> Iteration:   1 / 250 [  0%]  (Adaptation) 
#> Iteration:  50 / 250 [ 20%]  (Adaptation) 
#> Iteration: 100 / 250 [ 40%]  (Adaptation) 
#> Iteration: 150 / 250 [ 60%]  (Adaptation) 
#> Iteration: 200 / 250 [ 80%]  (Adaptation) 
#> Success! Found best value [eta = 1] earlier than expected. 
#> Begin stochastic gradient ascent. 
#>   iter             ELBO   delta_ELBO_mean   delta_ELBO_med   notes  
#>    100      -623427.781             1.000            1.000 
#>    200      -386878.079             0.806            1.000 
#>    300      -320377.370             0.606            0.611 
#>    400        -9371.998             8.751            1.000 
#>    500        -7400.788             7.054            0.611 
#>    600        -7129.433             5.885            0.611 
#>    700        -7051.640             5.046            0.266 
#>    800        -7060.191             4.415            0.266 
#>    900        -7063.776             3.925            0.208 
#>   1000        -7053.622             3.532            0.208 
#>   1100        -7019.789             3.433            0.038   MAY BE DIVERGING... INSPECT ELBO 
#>   1200        -7045.633             3.372            0.011   MAY BE DIVERGING... INSPECT ELBO 
#>   1300        -7126.481             3.352            0.011   MAY BE DIVERGING... INSPECT ELBO 
#>   1400        -7040.557             0.035            0.011 
#>   1500        -7017.499             0.009            0.005   MEAN ELBO CONVERGED   MEDIAN ELBO CONVERGED 
#> Drawing a sample of size 500 from the approximate posterior...  
#> COMPLETED. 
#> Finished in  9.3 seconds.
summary(mod, include_betas = FALSE)
#> GAM formula:
#> value ~ grp + s(time_since_b, k = 15)
#> 
#> Family:
#> gaussian
#> 
#> Link function:
#> identity
#> 
#> Trend model:
#> None
#> 
#> N series:
#> 1 
#> 
#> N timepoints:
#> 5000 
#> 
#> Status:
#> Fitted using Stan 
#> 1 chains, each with iter = 500; warmup = ; thin = 1 
#> Total post-warmup draws = 500
#> 
#> 
#> Observation error parameter estimates:
#>              2.5%  50% 97.5% Rhat n.eff
#> sigma_obs[1] 0.95 0.98     1  NaN   NaN
#> 
#> GAM coefficient (beta) estimates:
#>             2.5%  50% 97.5% Rhat n.eff
#> (Intercept) 0.34 0.37  0.39  NaN   NaN
#> grpb        0.96 1.00  1.00  NaN   NaN
#> 
#> Approximate significance of GAM smooths:
#>                  edf Ref.df Chi.sq p-value    
#> s(time_since_b) 9.94     14  17780  <2e-16 ***
#> ---
#> Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
#> 
#> Posterior approximation used: no diagnostics to compute

# Plot the delayed effect of entering the 'b' condition
plot_predictions(mod, by = c("time_since_b"), 
                 newdata = datagrid(time_since_b = unique,
                                    grp = 'b'),
                 points = 0.2)

# Plot an estimate of the structural change when first entering condition 'b'
plot_predictions(mod, by = 'grp',
                 newdata = datagrid(time_since_b = 0,
                                    grp = unique))

# Plot full predictions
fc <- hindcast(mod)
plot(fc)
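As a side note, the `time_since_b` covariate above is built by hand with `rep()` and `1:500` because condition "b" starts once and never ends. For events that switch on and off (like the lights in the original question), a small general-purpose helper can compute "time steps since the most recent event" from a logical indicator. This is a sketch, not part of mvgam; the function name `time_since_event()` is made up here, and it follows the reprex's convention of returning 0 before any event has occurred:

```r
# Sketch: steps elapsed since the most recent TRUE in an event indicator.
# Returns 0 before the first event (matching the rep(0, 500) convention above)
# and resets to 0 at each new occurrence, then counts upward.
time_since_event <- function(event) {
  out <- integer(length(event))
  last <- NA_integer_
  for (i in seq_along(event)) {
    if (isTRUE(event[i])) last <- i
    out[i] <- if (is.na(last)) 0L else i - last
  }
  out
}

time_since_event(c(FALSE, TRUE, FALSE, FALSE, TRUE, FALSE))
#> [1] 0 0 1 2 0 1
```

The resulting vector can be dropped straight into the data frame and wrapped in `s()` exactly as `time_since_b` is in the model above, letting the smooth estimate how the effect decays after each event ends.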

Created on 2024-06-27 with reprex v2.0.2

