MeasureTheory API

MeasureTheory.CorrCholesky (Type)
CorrCholesky(n)

Cholesky factor of a correlation matrix of size n. Transforms $n(n-1)/2$ real numbers to an $n×n$ lower-triangular matrix L, such that L*L' is a correlation matrix (positive definite, with unit diagonal).

Notes

If

  • z is a vector of n IID standard normal variates,
  • σ is an n-element vector of standard deviations,
  • C is obtained from CorrCholesky(n),

then Diagonal(σ) * C.L * z is a zero-centered multivariate normal variate with the standard deviations σ and correlation matrix C.L * C.U.
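The recipe above can be sketched as follows. This assumes CorrCholesky(n) acts as a TransformVariables transform, so that `dimension` and `transform` apply to it; the variable names are illustrative.

```julia
# Hedged sketch: building a correlated multivariate normal draw from
# CorrCholesky, assuming it supports the TransformVariables interface.
using MeasureTheory, TransformVariables, LinearAlgebra

n = 4
t = CorrCholesky(n)

# n(n-1)/2 unconstrained reals parameterize the Cholesky factor
x = randn(dimension(t))        # dimension(t) == n * (n - 1) ÷ 2
C = transform(t, x)            # C.L * C.U is a correlation matrix

σ = [0.5, 1.0, 2.0, 1.5]       # standard deviations
z = randn(n)                   # IID standard normal variates
y = Diagonal(σ) * C.L * z      # zero-centered MvNormal draw
```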

MeasureTheory.For (Method)
For(f, base...)

For provides a convenient way to construct a ProductMeasure. There are several options for the base. With Julia's do notation, this can look very similar to a standard for loop, while maintaining a semantic structure that's easier to work with.


For(f, base::Int...)

When one or several Int values are passed for base, the result is treated as depending on CartesianIndices(base).

julia> For(3) do λ Exponential(λ) end |> marginals
3-element mappedarray(MeasureBase.var"#17#18"{var"#15#16"}(var"#15#16"()), ::CartesianIndices{1, Tuple{Base.OneTo{Int64}}}) with eltype Exponential{(:λ,), Tuple{Int64}}:
 Exponential(λ = 1,)
 Exponential(λ = 2,)
 Exponential(λ = 3,)
julia> For(4,3) do μ,σ Normal(μ,σ) end |> marginals
4×3 mappedarray(MeasureBase.var"#17#18"{var"#11#12"}(var"#11#12"()), ::CartesianIndices{2, Tuple{Base.OneTo{Int64}, Base.OneTo{Int64}}}) with eltype Normal{(:μ, :σ), Tuple{Int64, Int64}}:
 Normal(μ = 1, σ = 1)  Normal(μ = 1, σ = 2)  Normal(μ = 1, σ = 3)
 Normal(μ = 2, σ = 1)  Normal(μ = 2, σ = 2)  Normal(μ = 2, σ = 3)
 Normal(μ = 3, σ = 1)  Normal(μ = 3, σ = 2)  Normal(μ = 3, σ = 3)
 Normal(μ = 4, σ = 1)  Normal(μ = 4, σ = 2)  Normal(μ = 4, σ = 3)

For(f, base::AbstractArray...)

In this case, base behaves as if the arrays are zipped together before applying the map.

julia> For(randn(3)) do x Exponential(x) end |> marginals
3-element mappedarray(x->Main.Exponential(x), ::Vector{Float64}) with eltype Exponential{(:λ,), Tuple{Float64}}:
 Exponential(λ = -0.268256,)
 Exponential(λ = 1.53044,)
 Exponential(λ = -1.08839,)
julia> For(1:3, 1:3) do μ,σ Normal(μ,σ) end |> marginals
3-element mappedarray((:μ, :σ)->Main.Normal(μ, σ), ::UnitRange{Int64}, ::UnitRange{Int64}) with eltype Normal{(:μ, :σ), Tuple{Int64, Int64}}:
 Normal(μ = 1, σ = 1)
 Normal(μ = 2, σ = 2)
 Normal(μ = 3, σ = 3)

For(f, base::Base.Generator)

For Generators, the function maps over the values of the generator:

julia> For(eachrow(rand(4,2))) do x Normal(x[1], x[2]) end |> marginals |> collect
4-element Vector{Normal{(:μ, :σ), Tuple{Float64, Float64}}}:
 Normal(μ = 0.255024, σ = 0.570142)
 Normal(μ = 0.970706, σ = 0.0776745)
 Normal(μ = 0.731491, σ = 0.505837)
 Normal(μ = 0.563112, σ = 0.98307)
MeasureTheory.LKJCholesky (Type)
LKJCholesky(k=3, η=1.0)
LKJCholesky(k=3, logη=0.0)

LKJCholesky(k, ...) gives the k×k LKJ distribution (Lewandowski et al 2009) expressed as a Cholesky decomposition. As a special case, for C = rand(LKJCholesky(k=K, η=1.0)) (or equivalently C = rand(LKJCholesky(k=K, logη=0.0))), C.L * C.U is uniform over the set of all K×K correlation matrices. Note, however, that in this case C.L and C.U are not sampled uniformly (because the multiplication is nonlinear).

The logdensity method for this measure applies for LowerTriangular, UpperTriangular, or Diagonal matrices, and will "do the right thing". The logdensity does not check if L*U yields a valid correlation matrix.

Valid values are $η > 0$. When $η > 1$, the distribution is unimodal with a peak at I, while $0 < η < 1$ yields a trough. $η = 2$ is recommended as a vague prior.
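A small sketch of drawing from this measure and recovering the correlation matrix, assuming `rand` accepts the measure directly as in the docstring's example:

```julia
# Hedged sketch: sample an LKJCholesky factor and form the correlation
# matrix. η = 2.0 corresponds to the vague prior recommended above.
using MeasureTheory

C = rand(LKJCholesky(k = 3, η = 2.0))  # Cholesky-factor sample
Ω = C.L * C.U                          # a 3×3 correlation matrix
# Ω has unit diagonal and is positive definite
```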

Adapted from https://github.com/tpapp/AltDistributions.jl

MeasureTheory.asparams (Function)

asparams builds on TransformVariables.as to construct bijections to the parameter space of a given parameterized measure. Because this is only possible for continuous parameter spaces, constraints can be used to assign fixed values to any subset of the parameters.


asparams(::Type{<:ParameterizedMeasure}, ::StaticSymbol)

Return a transformation for a particular parameter of a given parameterized measure. For example,

julia> asparams(Normal, static(:σ))
asℝ₊

asparams(::Type{<: ParameterizedMeasure{N}}, constraints::NamedTuple) where {N}

Return a transformation for a given parameterized measure subject to the named tuple constraints. For example,

julia> asparams(Binomial{(:p,)}, (n=10,))
TransformVariables.TransformTuple{NamedTuple{(:p,), Tuple{TransformVariables.ScaledShiftedLogistic{Float64}}}}((p = as𝕀,), 1)

asparams(::ParameterizedMeasure)

Return a transformation with no constraints. For example,

julia> asparams(Normal{(:μ,:σ)})
TransformVariables.TransformTuple{NamedTuple{(:μ, :σ), Tuple{TransformVariables.Identity, TransformVariables.ShiftedExp{true, Float64}}}}((μ = asℝ, σ = asℝ₊), 2)
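One common use of such a transformation is mapping unconstrained reals into the parameter space. The following sketch assumes the object returned by asparams supports TransformVariables' `dimension` and `transform`:

```julia
# Illustrative sketch: map unconstrained reals into Normal's parameter
# space via the asparams transformation (μ = asℝ, σ = asℝ₊).
using MeasureTheory, TransformVariables

t = asparams(Normal{(:μ, :σ)})
θ = transform(t, randn(dimension(t)))  # NamedTuple with θ.σ > 0
d = Normal(θ.μ, θ.σ)                   # a valid Normal measure
```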
MeasureTheory.@half (Macro)
@half dist([paramnames])

Starting from a symmetric univariate measure dist ≪ Lebesgue(ℝ), create a new measure Halfdist ≪ Lebesgue(ℝ₊). For example, @half Normal() creates HalfNormal(), and @half StudentT(ν) creates HalfStudentT(ν).
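A minimal sketch of the resulting measure, assuming HalfNormal is defined by `@half Normal()` as described and that `rand` accepts it directly:

```julia
# Hedged sketch: samples from a @half-generated measure lie on ℝ₊.
using MeasureTheory

d = HalfNormal()
xs = [rand(d) for _ in 1:100]
all(x -> x ≥ 0, xs)   # support is the nonnegative half-line
```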

MeasureTheory.@parameterized (Macro)
@parameterized <declaration>

The <declaration> gives a measure and its default parameters, and specifies its relation to its base measure. For example, @parameterized Normal(μ,σ) declares that Normal is a measure with default parameters μ and σ. The result is equivalent to

struct Normal{N,T} <: ParameterizedMeasure{N}
    par :: NamedTuple{N,T}
end
KeywordCalls.@kwstruct Normal(μ,σ)
Normal(μ,σ) = Normal((μ=μ, σ=σ))

See KeywordCalls.jl for details on @kwstruct.
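Given the equivalence above, the generated type should admit both positional and keyword construction, with parameters stored in the `par` NamedTuple. A hedged usage sketch:

```julia
# Sketch of what the @parameterized expansion provides, per the
# docstring's stated equivalence.
using MeasureTheory

d1 = Normal(0.0, 2.0)          # positional, from Normal(μ,σ) = Normal((μ=μ, σ=σ))
d2 = Normal(μ = 0.0, σ = 2.0)  # keyword form via KeywordCalls.@kwstruct
d1.par == d2.par == (μ = 0.0, σ = 2.0)
```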
