Fix advanced usage guide (#1785)
devmotion authored Feb 25, 2022
1 parent 25a32ef commit 02170ed
Showing 1 changed file with 21 additions and 32 deletions.
53 changes: 21 additions & 32 deletions docs/src/using-turing/advanced.md
First, define a type of the distribution, as a subtype of a corresponding distribution type:


```julia
struct CustomUniform <: ContinuousUnivariateDistribution end
```

### 2. Implement Sampling and Evaluation of the log-pdf
Second, define `rand` and `logpdf`, which will be used to run the model.


```julia
# sample in [0, 1]
Distributions.rand(rng::AbstractRNG, d::CustomUniform) = rand(rng)

# p(x) = 1 → logp(x) = 0
Distributions.logpdf(d::CustomUniform, x::Real) = zero(x)
```
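
A quick way to sanity-check these definitions is to draw a sample and evaluate the log-density directly; this is a usage sketch under the definitions above, not part of the guide itself:

```julia
using Distributions, Random

d = CustomUniform()
x = rand(Random.default_rng(), d)  # draws a value in [0, 1]
logpdf(d, x)                       # returns 0.0, since p(x) = 1 on the support
```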

### 3. Define Helper Functions
and then we can call `rand` and `logpdf` as usual, where

To read more about Bijectors.jl, check out [the project README](https://github.com/TuringLang/Bijectors.jl).
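
For reference, a minimal sketch of what the helper definitions discussed in this section could look like for `CustomUniform`, assuming the support is `[0, 1]` as above and using the `Logit` bijector to map it to the whole real line:

```julia
using Bijectors

# Declare the support of the distribution (part of the Distributions.jl API).
Distributions.minimum(d::CustomUniform) = 0.0
Distributions.maximum(d::CustomUniform) = 1.0

# Transform samples from [0, 1] to ℝ for samplers such as HMC that require
# unconstrained parameters.
Bijectors.bijector(d::CustomUniform) = Bijectors.Logit(0.0, 1.0)
```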


## Update the accumulated log probability in the model definition

Turing accumulates log probabilities in an internal data structure that is accessible through
the internal variable `__varinfo__` inside of the model definition (see below for more details about model internals).
However, since users should not have to deal with internal data structures, a macro `Turing.@addlogprob!` is provided
that increases the accumulated log probability. For instance, this allows you to
[include arbitrary terms in the likelihood](https://github.com/TuringLang/Turing.jl/issues/1332).

Note that `@addlogprob!` always increases the accumulated log probability, regardless of the provided
sampling context. For instance, if you do not want to apply `Turing.@addlogprob!` when evaluating the
prior of your model but only when computing the log likelihood and the log joint probability, then you
should [check the type of the internal variable `__context__`](https://github.com/TuringLang/DynamicPPL.jl/issues/154)
such as

```julia
if DynamicPPL.leafcontext(__context__) !== Turing.PriorContext()
    Turing.@addlogprob! myloglikelihood(x, μ)
end
```
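
For illustration, here is a hedged sketch of how this check and `@addlogprob!` might be combined inside a model; the model and the likelihood function `myloglikelihood` are made up for this example:

```julia
using Turing, DynamicPPL

# An arbitrary extra likelihood term (illustrative).
myloglikelihood(x, μ) = loglikelihood(Normal(μ, 1), x)

@model function demo(x)
    μ ~ Normal()
    # Only add the term when the log likelihood or log joint is evaluated,
    # not when evaluating the prior.
    if DynamicPPL.leafcontext(__context__) !== Turing.PriorContext()
        Turing.@addlogprob! myloglikelihood(x, μ)
    end
end
```

Evaluating the prior of this model then skips the extra term, while log likelihood and log joint computations include it.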

## Model Internals


The `@model` macro accepts a function definition and rewrites it such that a call of the function generates a `Model` struct for use by the sampler.
Models can be constructed by hand without the use of a macro.
Taking the `gdemo` model as an example, the macro-based definition

```julia
using Turing
@model function gdemo(x)
    s² ~ InverseGamma(2, 3)
    m ~ Normal(0, sqrt(s²))
    x .~ Normal(m, sqrt(s²))
end
model = gdemo([1.5, 2.0])
```

can also be implemented (a bit less generally) with the macro-free version

```julia
using Turing

# Create the model function.
function gdemo(model, varinfo, context, x)
    # Assume s² has an InverseGamma distribution.
    s², varinfo = DynamicPPL.tilde_assume!!(
        context,
        InverseGamma(2, 3),
        Turing.@varname(s²),
        varinfo,
    )

    # Assume m has a Normal distribution.
    m, varinfo = DynamicPPL.tilde_assume!!(
        context,
        Normal(0, sqrt(s²)),
        Turing.@varname(m),
        varinfo,
    )

    # Observe each value of x[i] according to a Normal distribution.
    DynamicPPL.dot_tilde_observe!!(context, Normal(m, sqrt(s²)), x, Turing.@varname(x), varinfo)
end
gdemo(x) = Turing.Model(:gdemo, gdemo, (; x))

# Instantiate a Model object with our data variables.
model = gdemo([1.5, 2.0])
```
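
Either way of constructing `model` yields a `Turing.Model`, so, assuming the hand-written evaluator follows the calling convention expected by DynamicPPL (as in the example above), it can be passed to the usual sampling interface. A hedged usage sketch:

```julia
using Turing

# Draw posterior samples from the hand-constructed model.
chain = sample(model, NUTS(), 1_000)
```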

## Task Copying