---
title: "README"
output: rmarkdown::html_vignette
vignette: >
%\VignetteIndexEntry{README}
%\VignetteEngine{knitr::rmarkdown}
%\VignetteEncoding{UTF-8}
---
```{r, include = FALSE}
knitr::opts_chunk$set(
collapse = TRUE,
comment = "#>"
)
```
```{r setup}
library(NAPrior)
```
## Description
The **NAPrior** package facilitates the implementation of the Network Meta-Analytic Predictive (NAP) prior framework, specifically designed to address changes in the Standard of Care (SoC) during ongoing randomized controlled trials (RCTs). The framework synthesizes in-trial data from both pre- and post-SoC change periods by leveraging external trial data—specifically the head-to-head comparisons between the original and new SoC that established the new SoC—to bridge the two phases of evidence.
To ensure robust inference, the package implements two robustified priors: (i) the mixture NAP (mNAP), which incorporates a noninformative prior component via a fixed mixing weight, and (ii) the elastic NAP (eNAP), which adaptively adjusts the mixing weight, and hence the degree of information borrowing, based on the consistency between direct and indirect evidence. The NAP framework is fully prespecifiable, easy to calibrate, and computationally straightforward. This package provides a comprehensive toolkit to calibrate the eNAP tuning parameters, generate NAP priors, simulate operating characteristics, and obtain posterior distributions.
## Installation
### System requirements
The **NAPrior** package fits Bayesian models using JAGS via the **R2jags** interface.
Please ensure that JAGS is installed on your system before running model functions.
- macOS: `brew install jags`
- Ubuntu/Debian: `sudo apt-get install jags`
- Windows: download and run the JAGS installer from the project's SourceForge page (<https://sourceforge.net/projects/mcmc-jags/>)
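As a quick sanity check, the sketch below looks for the JAGS executable on the system `PATH`; note that on Windows JAGS may be installed without being on the `PATH`, in which case **R2jags** can usually still locate it.
```{r check_jags, eval=FALSE}
# Optional check: an empty string suggests the JAGS executable is not on the PATH
Sys.which("jags")
```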
### Install and load
```{r install, eval=FALSE, include=TRUE}
install.packages("devtools")
devtools::install_github("EstravenZZZ/NAPrior")
```
```{r include=TRUE}
library(NAPrior)
```
## Quick start
Consider a scenario in which the SoC changes mid-trial in a registrational or pivotal RCT that was initially designed to compare an experimental treatment $E$ against an SoC $C_1$ using 1:1 randomization of $2N$ patients. At the outset, a total of $2n_1$ patients are randomized between $E$ and $C_1$. Subsequently, a mid-trial SoC change occurs, and another $2n_2$ patients are randomized between $E$ and the new SoC $C_2$. As a result, the trial data consist of two components: data directly comparing $E$ versus $C_1$ (denoted as $D_{E,C_1}$) and data directly comparing $E$ versus $C_2$ (denoted as $D_{E,C_2}$). Meanwhile, summary statistics from the external trial(s) that motivated the SoC change are also available (denoted as $D_{C_2,C_1}$). The primary objective of the RCT is to evaluate the treatment effect of $E$ versus $C_2$, denoted as $\theta_{E, C_2}$. For estimating $\theta_{E, C_2}$, $D_{E,C_2}$ provides direct evidence while $D_{E,C_1}$ and $D_{C_2,C_1}$ provide indirect evidence.
Let $y_{E,C_2}$, $y_{E,C_1}$, and $y_{C_2,C_1}$ denote the estimated log hazard ratios (log-HRs) calculated from the datasets $D_{E, C_2}$, $D_{E, C_1}$, and $D_{C_2, C_1}$, respectively, with sampling variances $s^2_{E,C_2}$, $s^2_{E,C_1}$, and $s^2_{C_2,C_1}$. These estimates can be derived, for example, using the Cox proportional hazards model. Following standard meta-analytic convention, we assume that $s^2_{E,C_2}$, $s^2_{E,C_1}$, and $s^2_{C_2,C_1}$ are known. This package accommodates external data ($D_{C_2,C_1}$) from either a single trial or multiple trials by allowing users to supply a scalar or a vector for the `y_C2C1` and `s_C2C1` arguments accordingly.
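For instance, below is a minimal sketch of how the direct-evidence log-HR and its sampling variance could be derived for the post-change data using the **survival** package; the data frame `dat_EC2`, with columns `time`, `status`, and `arm`, is hypothetical and not part of this package.
```{r cox_sketch, eval=FALSE}
library(survival)
# Cox model for the E vs C2 comparison; `arm` coded 1 = E, 0 = C2 (hypothetical data)
fit_EC2 <- coxph(Surv(time, status) ~ arm, data = dat_EC2)
y_EC2 <- unname(coef(fit_EC2))        # estimated log-HR for E vs C2
s_EC2 <- unname(vcov(fit_EC2)[1, 1])  # sampling variance of the log-HR
```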
### 1) Calibrate the tuning parameters for the eNAP prior
The eNAP uses a dynamic weight based on the Bucher test statistic $Z$, which quantifies the extent of consistency between direct and indirect evidence and is mapped to the mixing weight through an elastic function:
$$
w(Z) = \{ 1 + \exp\bigl(a + b \log(Z + 1)\bigr)\}^{-1},
$$
where the Bucher test statistic is
$$
Z = \frac{\bigl| y_{E,C_2} - (y_{E,C_1} - y_{C_2,C_1}) \bigr|}{\sqrt{s^2_{E,C_2} + s^2_{E,C_1} + s^2_{C_2,C_1}}}.
$$
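As a reference, the sketch below reproduces the two formulas above directly in R; the function names `elastic_weight()` and `bucher_Z()` are purely illustrative and not part of the package API.
```{r elastic_sketch, eval=FALSE}
# Elastic mapping from the Bucher statistic Z to the mixing weight for given (a, b)
elastic_weight <- function(Z, a, b) {
  1 / (1 + exp(a + b * log(Z + 1)))
}

# Bucher test statistic from the three log-HR estimates and their sampling variances
bucher_Z <- function(y_EC2, y_EC1, y_C2C1, s_EC2, s_EC1, s_C2C1) {
  abs(y_EC2 - (y_EC1 - y_C2C1)) / sqrt(s_EC2 + s_EC1 + s_C2C1)
}
```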
To calibrate the tuning parameters $(a, b)$, the user provides:
- $\delta$: A clinically meaningful difference on the log-HR scale; differences beyond this threshold indicate strong inconsistency between direct and indirect evidence.
- $t_1$: The target posterior mixing weight under exact consistency ($Z = 0$), typically set close to 1 (e.g., $t_1 = 0.999$).
- $t_0$: The target posterior mixing weight under strong inconsistency, i.e., at $Z_\delta = \delta/\sqrt{s^2_{E,C_2} + s^2_{E,C_1} + s^2_{C_2,C_1}}$, typically set close to 0 (e.g., $t_0 = 0.05$).
These inputs calibrate $(a, b)$ such that the updated posterior weight $w'$ satisfies $w'(0) = t_1$ and $w'(Z_\delta) = t_0$. A companion `plot()` method is provided for visualizing the resulting updated posterior weight; simply call `plot()` on the returned object to view the weighting function as a function of evidence consistency.
```{r tune_eNAP, echo=TRUE}
tuned_ab <- tune_param_eNAP(
  y_C2C1 = c(-0.4, -0.5, -0.5),        # log-HRs (C2:C1) from three external trials
s_EC2 = 0.12^2, # Var(E:C2)
s_EC1 = 0.16^2, # Var(E:C1)
s_C2C1 = c(0.12^2, 0.11^2, 0.15^2), # vector => multiple external trials
delta = 0.5, # clinically meaningful difference on log-HR
t1 = 0.999, # near full borrowing at Z = 0
t0 = 0.05 # near zero borrowing at Z(delta)
)
c(tuned_ab$a, tuned_ab$b)
plot(tuned_ab)
```
### 2) Construct NAP priors
To construct an mNAP or eNAP prior, use the `NAP_prior()` function. This requires summary statistics for the indirect evidence ($D_{E,C_1}$ and $D_{C_2,C_1}$) and a choice of weighting method ("adaptive" for eNAP or "fixed" for mNAP). For mNAP, a fixed mixing weight must be specified (a weight of 1 results in the NAP prior, assuming full transitivity). For eNAP, the function requires the tuning parameters $(a,b)$. A companion `plot()` method is available to visualize the resulting prior distribution.
Since the dynamic weight for eNAP depends on direct evidence ($D_{E,C_2}$) that may not yet be observed (e.g., at the time of the SoC change), `NAP_prior()` offers two operational modes:
- **When assumed direct evidence is provided:**\
If an assumed log-HR and sampling variance for $D_{E,C_2}$ are provided, the function calculates the corresponding dynamic weight and the resulting prior under those assumptions.
- **When no direct evidence is provided:**\
If no assumed direct evidence is supplied, the function returns the informative (NAP) and vague components separately, allowing for later synthesis.
The function returns an object of class `"NAP_prior"`, which serves as the input for subsequent simulations or posterior data analyses.
#### NAP
```{r NAP_example, echo=TRUE}
NAP_test1 <- NAP_prior(
weight_mtd = "fixed", w = 1, # w = 1 ⇒ informative component only (NAP)
y_EC1 = -0.36, s_EC1 = 0.16^2,
y_C2C1 = -0.30, s_C2C1 = 0.14^2 # single external → FE
)
NAP_test1$table
plot(NAP_test1)
```
#### mNAP with a fixed weight of 0.5
```{r mNAP_example, echo=TRUE, warning=FALSE}
mNAP_test1 <- NAP_prior(
weight_mtd = "fixed", w = 0.50, # fixed mixture weight
y_EC1 = -0.36, s_EC1 = 0.16^2,
y_C2C1 = -0.30, s_C2C1 = 0.14^2, # single external → FE
tau0 = 1000
)
mNAP_test1$table
plot(mNAP_test1, main = "mNAP prior")
```
#### eNAP without direct data $D_{E,C_2}$ provided
```{r eNAP_example1, echo=TRUE}
eNAP_test1 <- NAP_prior(
weight_mtd = "adaptive",
a = tuned_ab$a, b = tuned_ab$b, # from calibration
y_EC1 = -0.36, s_EC1 = 0.16^2, # E:C1 (current, pre-change)
y_C2C1 = c(-0.28, -0.35, -0.31), # C2:C1 (external trials)
s_C2C1 = c(0.12^2, 0.11^2, 0.15^2),
tau0 = 1000 # vague variance
)
eNAP_test1$table
```
#### eNAP with direct data $D_{E,C_2}$ provided
```{r eNAP_example2, echo=TRUE, warning=FALSE}
eNAP_test2 <- NAP_prior(
weight_mtd = "adaptive",
a = tuned_ab$a, b = tuned_ab$b, # from calibration
y_EC1 = -0.36, s_EC1 = 0.16^2, # E:C1 (current, pre-change)
y_C2C1 = c(-0.28, -0.35, -0.31), # C2:C1 (external trials)
s_C2C1 = c(0.12^2, 0.11^2, 0.15^2),
y_EC2 = 0, s_EC2 = 0.2^2,
tau0 = 1000 # vague variance
)
eNAP_test2$table
plot(eNAP_test2, main = "eNAP prior")
```
### 3) Simulate operating characteristics
With the object previously obtained from `NAP_prior()`, use the `NAP_oc()` function to simulate operating characteristics.
```{r oc_example, echo=TRUE, message=FALSE}
set.seed(123)
# Run JAGS quietly
oc <- NULL
invisible(capture.output(
oc <- NAP_oc(
NAP_prior = eNAP_test1,
theta_EC2 = 0.00, # true log-HR for E:C2
n_EC2 = 400, # total N for the post-SoC-change period
lambda = 1, # randomization ratio for the post-SoC-change period
sim_model = "Weibull", # "Exponential" or "Weibull"
model_param = c(shape = 1.2, rate = 0.05),
nsim = 100,
iter = 3000,
chains = 4
),
type = "output" # capture JAGS console output
))
# Now show only the summary in the knitted doc
summary(oc)
```
### 4) Calculate posterior
To obtain the posterior distribution of the target estimand $\theta_{E,C_2}$, pass the object generated by the `NAP_prior()` function, along with the observed direct evidence (log-HR and sampling variance from $D_{E,C_2}$), to the `NAP_posterior()` function.
```{r conduct_example, echo=TRUE}
res <- NAP_posterior(
NAP_prior = eNAP_test1,
y_EC2 = -0.20,
s_EC2 = 0.12^2,
iter = 4000,
chains = 4
)
res$posterior_sum # mean, sd, 95% CI, prob_E_better, prior weight, post weight
res$enap_prior # actual eNAP prior at analysis time with realized dynamic weight
# res$jags_fit is stored but does not print by default
```
## Core functions
- `tune_param_eNAP()` — Calibrates the eNAP tuning parameters $(a, b)$ based on user-defined consistency targets.
- `NAP_prior()` — Generates mNAP and eNAP prior objects using indirect evidence and specified weighting methods.
- `NAP_posterior()` — Computes the posterior distribution of the target estimand by combining a `NAP_prior` object with observed direct evidence ($D_{E,C_2}$).
- `NAP_oc()` — Simulates operating characteristics.
## JAGS model specification
The package includes default BUGS/JAGS model strings as **data objects**:
- `jags_model_FE`
- `jags_model_RE`
You may pass
- a **path** to a `.txt` model file (e.g., `model = "path/to/model_RE.txt"`), or
- a **string** containing BUGS/JAGS code (e.g., `model = jags_model_RE`).
Internally, models are routed through `.model_path_from()`, which writes text to a temporary file and passes it to `R2jags::jags()`.
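For example, here is a minimal sketch of supplying the bundled random-effects model either as a string or as a file path, assuming the fitting function accepts a `model` argument as illustrated above.
```{r model_arg_sketch, eval=FALSE}
# 1) Pass the bundled model string directly
res_str <- NAP_posterior(
  NAP_prior = eNAP_test1,
  y_EC2 = -0.20, s_EC2 = 0.12^2,
  model = jags_model_RE
)

# 2) Write the same string to a .txt file and pass the path instead
model_file <- file.path(tempdir(), "model_RE.txt")
writeLines(jags_model_RE, model_file)
res_path <- NAP_posterior(
  NAP_prior = eNAP_test1,
  y_EC2 = -0.20, s_EC2 = 0.12^2,
  model = model_file
)
```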
## Tips & diagnostics
- **Inputs:** Sampling variances must be positive and finite.
- **Stability:** If the tuning parameters $|a|$ or $b$ reach the internally prespecified upper limits (by default 5 and 50, respectively), we recommend reducing $t_1$, increasing $t_0$, or enlarging $\delta$ to enhance the reliability of posterior inference.
- **Reproducibility:** Set a seed before simulations (`set.seed()`).
- **JAGS messages:** Lines about “resolving undeclared variables” are normal. Most JAGS errors arise from name or length mismatches in the data or model.
## Citing
If you use **NAPrior** in publications, please cite both the methodological paper and this package (full citation to be added once available).
## Maintainer
Chunyi Zhang ()