Laplace contours#
This page shows how to visualize Laplace-approximation posteriors using GetDist, starting from a Laplace approximation computed with DerivKit.
The focus here is what to do next once you already have a Laplace approximation: how to turn it into confidence contours or samples for quick inspection, comparison, and plotting.
This is intended for fast exploration and visualization, not as a replacement for full likelihood-based inference or full MCMC sampling.
If you are looking for:

- how the Laplace approximation is defined and interpreted, see ForecastKit
- how to compute a Laplace approximation with DerivKit, see Laplace approximation
Two complementary visualization workflows are supported:

- conversion of the Laplace Gaussian into an analytic Gaussian for GetDist
- Monte Carlo samples drawn from the Laplace Gaussian, returned as getdist.MCSamples
Both outputs can be passed directly to GetDist plotting utilities (e.g. triangle / corner plots).
Notes and conventions#
- The Laplace approximation is a local Gaussian centered on the MAP point theta_map. In other words, the Laplace mean is theta_map by construction.
- The Laplace covariance is the inverse Hessian of the negative log-posterior at theta_map (up to numerical regularization).
- getdist.MCSamples.loglikes stores minus the log-posterior (up to an additive constant), following GetDist conventions.
- For strongly non-Gaussian posteriors or curved degeneracies far from the MAP, consider using DALI or a full sampler instead.
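Because the example model used on this page is linear-Gaussian, the posterior is exactly Gaussian, so the Laplace output can be checked against the closed-form answer. A minimal numpy-only sketch (reusing the same matrices as the examples below; `laplace_approximation` itself is not needed for the check):

```python
import numpy as np

# Same linear-Gaussian model as the examples below.
observed_y = np.array([1.2, -0.4])
design_matrix = np.array([[1.0, 0.5], [0.2, 1.3]])
data_prec = np.linalg.inv(np.array([[0.30**2, 0.0], [0.0, 0.25**2]]))
prior_mean = np.zeros(2)
prior_prec = np.linalg.inv(np.array([[1.0**2, 0.0], [0.0, 1.5**2]]))

# Hessian of the negative log-posterior, constant for a linear-Gaussian model:
#   H = A^T C^-1 A + P^-1
hessian = design_matrix.T @ data_prec @ design_matrix + prior_prec

# Closed-form posterior covariance and mean. The cov and theta_map returned by
# laplace_approximation should match these to numerical accuracy.
post_cov = np.linalg.inv(hessian)
post_mean = post_cov @ (design_matrix.T @ data_prec @ observed_y
                        + prior_prec @ prior_mean)
```

This is a useful regression check when tuning `dk_kwargs`: for this model, any disagreement between `out["cov"]` and `post_cov` is purely numerical differentiation error.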
Analytic Gaussian (no sampling)#
Convert the Laplace approximation into an analytic Gaussian object compatible with GetDist, then plot contours using GetDist.
>>> import numpy as np
>>> from getdist import plots as getdist_plots
>>> from derivkit.forecasting.getdist_fisher_samples import fisher_to_getdist_gaussiannd
>>> from derivkit.forecasting.laplace import laplace_approximation
>>> # Linear-Gaussian likelihood with a Gaussian prior (the posterior is exactly Gaussian)
>>> observed_y = np.array([1.2, -0.4], dtype=float)
>>> design_matrix = np.array([[1.0, 0.5], [0.2, 1.3]], dtype=float)
>>> data_cov = np.array([[0.30**2, 0.0], [0.0, 0.25**2]], dtype=float)
>>> data_prec = np.linalg.inv(data_cov)
>>> prior_mean = np.array([0.0, 0.0], dtype=float)
>>> prior_cov = np.array([[1.0**2, 0.0], [0.0, 1.5**2]], dtype=float)
>>> prior_prec = np.linalg.inv(prior_cov)
>>> def neg_logposterior(theta):
...     theta = np.asarray(theta, dtype=float)
...     r = observed_y - design_matrix @ theta
...     nll = 0.5 * (r @ data_prec @ r)
...     d = theta - prior_mean
...     nlp = 0.5 * (d @ prior_prec @ d)
...     return float(nll + nlp)
>>> # Expansion point (can be an initial guess; Laplace returns the MAP it finds/uses)
>>> theta_map0 = np.array([0.0, 0.0], dtype=float)
>>> out = laplace_approximation(
...     neg_logposterior=neg_logposterior,
...     theta_map=theta_map0,
...     method="finite",
...     dk_kwargs={"stepsize": 1e-3, "num_points": 5, "extrapolation": "ridders", "levels": 4},
... )
>>> theta_map = np.asarray(out["theta_map"], dtype=float)
>>> cov = np.asarray(out["cov"], dtype=float)
>>> theta_map.shape == (2,) and cov.shape == (2, 2)
True
>>> # Convert Laplace (mean+cov) into a GetDist analytic Gaussian via Fisher precision
>>> fisher = np.linalg.pinv(cov)
>>> gnd = fisher_to_getdist_gaussiannd(
...     theta0=theta_map,
...     fisher=fisher,
...     names=["theta1", "theta2"],
...     labels=[r"\theta_1", r"\theta_2"],
...     label="Laplace (Gaussian)",
... )
>>> gnd is not None
True
(figure: GetDist triangle plot of the analytic Laplace Gaussian contours)
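The contours GetDist draws for a 2D Gaussian are ellipses of constant Mahalanobis distance, so their geometry can be sanity-checked directly from the Laplace covariance without any plotting. A numpy-only sketch, using a stand-in covariance (2.30 and 6.18 are the commonly quoted 2-degree-of-freedom chi-square thresholds for 68.3% and 95.4%):

```python
import numpy as np

# Stand-in for `cov` returned by laplace_approximation.
cov = np.array([[0.08, 0.02], [0.02, 0.05]])

# Eigen-decomposition gives the ellipse orientation and axis scales.
eigvals, eigvecs = np.linalg.eigh(cov)

# 2-dof chi-square thresholds enclosing ~68.3% / ~95.4% probability mass.
chi2_levels = {"68%": 2.30, "95%": 6.18}

# Semi-axes of each confidence ellipse: sqrt(chi2 * eigenvalue).
semi_axes = {label: np.sqrt(c * eigvals) for label, c in chi2_levels.items()}

# Orientation of the major axis (eigh sorts eigenvalues ascending, so the
# last eigenvector belongs to the major axis), in degrees.
angle_deg = np.degrees(np.arctan2(eigvecs[1, -1], eigvecs[0, -1]))
```

If the ellipses produced this way disagree with the GetDist contours, the usual culprit is a non-positive-definite covariance from a noisy Hessian estimate.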
Sampling from the Laplace Gaussian#
For more flexibility (e.g. marginal histograms, bounds, or combining with other samples), draw Monte Carlo samples from the Laplace Gaussian and plot them with GetDist.
>>> import numpy as np
>>> from getdist import plots as getdist_plots
>>> from derivkit.forecasting.getdist_fisher_samples import fisher_to_getdist_samples
>>> from derivkit.forecasting.laplace import laplace_approximation
>>> observed_y = np.array([1.2, -0.4], dtype=float)
>>> design_matrix = np.array([[1.0, 0.5], [0.2, 1.3]], dtype=float)
>>> data_cov = np.array([[0.30**2, 0.0], [0.0, 0.25**2]], dtype=float)
>>> data_prec = np.linalg.inv(data_cov)
>>> prior_mean = np.array([0.0, 0.0], dtype=float)
>>> prior_cov = np.array([[1.0**2, 0.0], [0.0, 1.5**2]], dtype=float)
>>> prior_prec = np.linalg.inv(prior_cov)
>>> def neg_logposterior(theta):
...     theta = np.asarray(theta, dtype=float)
...     r = observed_y - design_matrix @ theta
...     nll = 0.5 * (r @ data_prec @ r)
...     d = theta - prior_mean
...     nlp = 0.5 * (d @ prior_prec @ d)
...     return float(nll + nlp)
>>> out = laplace_approximation(
...     neg_logposterior=neg_logposterior,
...     theta_map=np.array([0.0, 0.0], dtype=float),
...     method="finite",
...     dk_kwargs={"stepsize": 1e-3, "num_points": 5, "extrapolation": "ridders", "levels": 4},
... )
>>> theta_map = np.asarray(out["theta_map"], dtype=float)
>>> cov = np.asarray(out["cov"], dtype=float)
>>> fisher = np.linalg.pinv(cov)
>>> samples = fisher_to_getdist_samples(
...     theta0=theta_map,
...     fisher=fisher,
...     names=["theta1", "theta2"],
...     labels=[r"\theta_1", r"\theta_2"],
...     store_loglikes=True,
...     label="Laplace (samples)",
... )
>>> dk_yellow = "#e1af00"
>>> line_width = 1.5
>>> plotter = getdist_plots.get_subplot_plotter(width_inch=3.6)
>>> plotter.settings.linewidth_contour = line_width
>>> plotter.settings.linewidth = line_width
>>> plotter.triangle_plot(
...     samples,
...     params=["theta1", "theta2"],
...     filled=False,
...     contour_colors=[dk_yellow],
...     contour_lws=[line_width],
...     contour_ls=["-"],
... )
>>> samples.numrows > 0
True
(figure: GetDist triangle plot of the Laplace samples)
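If you would rather not go through the DerivKit helper, equivalent samples can be drawn directly from the Laplace Gaussian with numpy and then wrapped in getdist.MCSamples yourself. A minimal sketch of the sampling step, using stand-in values for `theta_map` and `cov` (the helper's internals may differ):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for theta_map and cov returned by laplace_approximation.
theta_map = np.array([0.6, -0.4])
cov = np.array([[0.08, 0.02], [0.02, 0.05]])

# Draw Gaussian samples via a Cholesky factor: theta = mean + L z, z ~ N(0, I).
n_samples = 20_000
chol = np.linalg.cholesky(cov)
samples = theta_map + rng.standard_normal((n_samples, 2)) @ chol.T

# Minus the log-posterior up to an additive constant (the GetDist `loglikes`
# convention noted above): 0.5 * (theta - mean)^T cov^-1 (theta - mean).
diff = samples - theta_map
loglikes = 0.5 * np.einsum("ni,ij,nj->n", diff, np.linalg.inv(cov), diff)
```

The sample mean and covariance converge to `theta_map` and `cov` as the sample count grows, which is a quick way to check that a covariance you pass to GetDist is the one you intended.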