Published by De Gruyter, June 1, 2021

Optimising Poisson bridge constructions for variance reduction methods

  • Casper H. L. Beentjes

Abstract

In this paper we discuss different Monte Carlo (MC) approaches to generate unit-rate Poisson processes and provide an analysis of Poisson bridge constructions, which form the discrete analogue of the well-known Brownian bridge construction for a Wiener process. One of the main advantages of these Poisson bridge constructions is that they, like the Brownian bridge, can be effectively combined with variance reduction techniques. In particular, we show here, in practice and proof, how we can achieve orders of magnitude efficiency improvement over standard MC approaches when generating unit-rate Poisson processes via a synthesis of antithetic sampling and Poisson bridge constructions. At the same time we provide practical guidance as to how to implement and tune Poisson bridge methods to achieve, in a mean sense, (near) optimal performance.

MSC 2010: 65C05; 65C40; 60-08

A Proofs

A.1 Proof of Proposition 5.1

Without loss of generality we take $t \in [t_r, t_{r+1}]$ for some $1 \le r < R$. Next we use the fact that if we condition the Poisson processes on the start of the interval, i.e. on $Y^{(1)}(t_r), \dots, Y^{(M)}(t_r)$, and the end of the interval, i.e. on $Y^{(1)}(t_{r+1}), \dots, Y^{(M)}(t_{r+1})$, the increments of the Poisson processes are binomially distributed for $t \in [t_r, t_{r+1}]$, i.e.

\[ Y^{(m)}(t) - Y^{(m)}(t_r) \sim \mathcal{B}\bigl(N_r^{(m)}, p\bigr), \]

where

\[ p = \frac{t - t_r}{t_{r+1} - t_r} \quad \text{and} \quad N_r^{(m)} = Y^{(m)}(t_{r+1}) - Y^{(m)}(t_r). \]

Then note that in general we can decompose the MSE for $t \in [t_r, t_{r+1}]$ as

\[ \mathrm{MSE}(t) = M \operatorname{Var}\Biggl[ \frac{1}{M} \sum_{m=1}^{M} Y^{(m)}(t) \Biggr] = \frac{1}{M} \sum_{n,m} \operatorname{Cov}\bigl[ Y^{(n)}(t), Y^{(m)}(t) \bigr], \]

and in particular this holds for $t = t_r$ and $t = t_{r+1}$ as well. Next we use the law of total covariance to express each covariance term in the double sum over all $n, m$ as

\[ \operatorname{Cov}\bigl[ Y^{(n)}(t), Y^{(m)}(t) \bigr] = \mathbb{E}\Bigl[ \operatorname{Cov}\bigl[ Y^{(n)}(t), Y^{(m)}(t) \mid \mathcal{F}_r \bigr] \Bigr] + \operatorname{Cov}\Bigl[ \mathbb{E}\bigl[ Y^{(n)}(t) \mid \mathcal{F}_r \bigr], \mathbb{E}\bigl[ Y^{(m)}(t) \mid \mathcal{F}_r \bigr] \Bigr], \]

where $\mathcal{F}_r = \sigma\{ Y^{(1)}(t_r), \dots, Y^{(M)}(t_r), Y^{(1)}(t_{r+1}), \dots, Y^{(M)}(t_{r+1}) \}$ is the sigma algebra generated by the random variables comprising the Poisson skeleton at times $t_r$ and $t_{r+1}$. Using the independence of the Poisson processes conditional on the skeleton, we see immediately that

\[ \operatorname{Cov}\bigl[ Y^{(n)}(t), Y^{(m)}(t) \mid \mathcal{F}_r \bigr] = \begin{cases} 0, & n \ne m, \\ N_r^{(m)} p(1-p), & n = m. \end{cases} \]

In addition, since the Poisson skeleton is unbiased by assumption, we know that

\[ \mathbb{E}\bigl[ N_r^{(m)} \bigr] = t_{r+1} - t_r \]

and thus

\[ \mathbb{E}\Bigl[ \operatorname{Cov}\bigl[ Y^{(n)}(t), Y^{(m)}(t) \mid \mathcal{F}_r \bigr] \Bigr] = \begin{cases} 0, & n \ne m, \\ (t_{r+1} - t_r) \left( \dfrac{t - t_r}{t_{r+1} - t_r} \right) \left( \dfrac{t_{r+1} - t}{t_{r+1} - t_r} \right), & n = m. \end{cases} \]

Using the fact that the Poisson process conditional on its endpoints is given by the value of the Poisson skeleton at t r plus a binomial contribution, we find

\[ \mathbb{E}\bigl[ Y^{(n)}(t) \mid \mathcal{F}_r \bigr] = p\, Y^{(n)}(t_{r+1}) + (1-p)\, Y^{(n)}(t_r). \]

This, in turn, implies that the covariance of the conditional expectation can be factored as

\[
\begin{aligned}
\operatorname{Cov}\Bigl[ \mathbb{E}\bigl[ Y^{(n)}(t) \mid \mathcal{F}_r \bigr], \mathbb{E}\bigl[ Y^{(m)}(t) \mid \mathcal{F}_r \bigr] \Bigr]
&= (1-p)^2 \operatorname{Cov}\bigl[ Y^{(n)}(t_r), Y^{(m)}(t_r) \bigr] \\
&\quad + p^2 \operatorname{Cov}\bigl[ Y^{(n)}(t_{r+1}), Y^{(m)}(t_{r+1}) \bigr] \\
&\quad + p(1-p) \operatorname{Cov}\bigl[ Y^{(n)}(t_r), Y^{(m)}(t_{r+1}) \bigr] \\
&\quad + p(1-p) \operatorname{Cov}\bigl[ Y^{(n)}(t_{r+1}), Y^{(m)}(t_r) \bigr].
\end{aligned}
\]

Finally, summing both contributions to the total covariance over all $n, m$ then yields equation (5.7).

B Poisson bridge implementation details

In this appendix we provide explicit descriptions of the algorithms used in the midpoint-based and median-based Poisson bridge constructions. This is largely based on [4, Chapter 15] with minor corrections and additions to allow for more memory-efficient implementations. We use MATLAB index notation for vectors throughout this appendix.

B.1 Efficient sampling of uniform order statistics

We first describe in Algorithm 1 an efficient implementation to sequentially generate the order statistics of uniform random variables on a finite interval, based on [4, Section 15.6]; it is a numerically stable version of the exponential spacings method [2, Section V.3.1]. This subroutine is used in both the median-based and midpoint-based Poisson bridge constructions.

Algorithm 1 (UOS($a, b, N$)). This generates $N$ ordered uniform random variates on $(a, b)$.
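The exponential spacings idea behind Algorithm 1 can be sketched as follows; the function name and NumPy-based interface are ours, and we use ordinary sampling rather than the numerically stabilised inverse-transform variant of [4, Section 15.6]. Normalised partial sums of $N + 1$ i.i.d. Exp(1) variates have the same joint law as the order statistics of $N$ uniforms, so the output is sorted by construction in $O(N)$ without an explicit sort.

```python
import numpy as np

def uos(a, b, n, rng=None):
    """Generate n ordered uniform random variates on (a, b) via
    exponential spacings: the normalised partial sums of n+1 i.i.d.
    Exp(1) variates are distributed as the order statistics of n
    uniforms on (0, 1), which we then rescale to (a, b)."""
    rng = rng or np.random.default_rng()
    e = rng.exponential(size=n + 1)        # n+1 exponential spacings
    s = np.cumsum(e)                       # partial sums S_1, ..., S_{n+1}
    return a + (b - a) * s[:-1] / s[-1]    # S_i / S_{n+1} rescaled to (a, b)
```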

B.2 Midpoint-based Poisson bridge

The full midpoint-based Poisson bridge construction is described in Algorithm 2, which uses two subroutines: Algorithm 4 to construct a Poisson skeleton, based on [4, Section 15.3], and Algorithm 5 to fill in such a Poisson skeleton efficiently using Algorithm 1. The full Poisson bridge construction uses the fact that multiple steps of the Poisson bridge can be sampled independently by the independent increment property of a Poisson process. Although in Algorithm 2 we sample a Poisson process on $t \in [0, \lceil \tau_f/\tau_s \rceil \tau_s)$ rather than $t \in [0, \tau_f)$, we can achieve the latter by building in an early stopping checkpoint in step 6 of Algorithm 2, i.e. in the filling-in stage (done via Algorithm 5).

Note that to use variance reduction techniques in combination with the Poisson bridge method we simply propose to sample the uniform random variables, $\mathbf{U}$, that are used to construct the Poisson skeleton via a variance reduction technique, such as antithetic sampling. Consequently, the Poisson and binomial random variates in the midpoint-based Poisson bridge construction need to be generated via inverse transform sampling. For the Poisson random variables we use the fast inverse Poisson CDF from [6]. To sample the binomial random variables via an inverse CDF we need to solve

\[ u = I_{1/2}(N - k, 1 + k), \]

where $I_x(a, b)$ is the regularised incomplete beta function, for $k$ given $u \in [0, 1]$; this is a relatively costly operation. For small $N$ this can be done efficiently using look-up tables, but for large $N$ such an approach is inefficient. Instead, to invert the incomplete beta function, we use an adaptation of the ideas in [6], which yield a fast approximation to the inverse of the incomplete gamma function.
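To illustrate why inverse transform sampling matters here: a single uniform drives the whole binomial draw, which is exactly what makes antithetic pairing possible. Below is a minimal summation-based inverse binomial CDF (function name ours), a slow but simple stand-in for the incomplete-beta inversion described above.

```python
from math import comb

def binom_inv_cdf(u, n, p=0.5):
    """Smallest k with P(X <= k) >= u for X ~ Binomial(n, p), by direct
    CDF summation. Fine for modest n; for large n the text inverts the
    regularised incomplete beta function instead."""
    cdf = 0.0
    for k in range(n + 1):
        cdf += comb(n, k) * p**k * (1.0 - p)**(n - k)
        if cdf >= u:
            return k
    return n  # guard against floating-point round-off for u close to 1

# Antithetic pair driven by u and 1 - u: negatively correlated draws.
u = 0.1
k_plus, k_minus = binom_inv_cdf(u, 10), binom_inv_cdf(1.0 - u, 10)
```

For $p = 1/2$ the antithetic pair is symmetric about $n/2$, so the two draws balance each other, which is the source of the variance reduction.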

Algorithm 2 (PB_midpoint($\tau_f, \tau_s, L, \mathbf{U}$)). This generates a single Poisson process $Y(t)$ for $t \in [0, \lceil \tau_f/\tau_s \rceil \tau_s)$ via the midpoint-based Poisson bridge construction with $L$ levels and step size $\tau_s$.
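The midpoint-splitting recursion at the heart of a single bridge step can be sketched as below. The function name and interface are ours, and we draw directly from NumPy's generators for brevity; the actual Algorithm 2 instead drives the Poisson and binomial draws by inverse transform from the supplied uniforms $\mathbf{U}$, so that variance reduction applies.

```python
import numpy as np

def pb_midpoint_step(tau_s, levels, rng=None):
    """Sketch of one midpoint-based Poisson bridge step on [0, tau_s):
    draw the total count on the interval, then recursively split each
    subinterval's count at its midpoint with a Binomial(N, 1/2) variate."""
    rng = rng or np.random.default_rng()
    counts = np.array([rng.poisson(tau_s)])           # total count on [0, tau_s)
    for _ in range(levels):
        left = rng.binomial(counts, 0.5)              # counts left of each midpoint
        counts = np.column_stack((left, counts - left)).ravel()
    t = np.linspace(0.0, tau_s, counts.size + 1)      # dyadic skeleton times
    return t, np.concatenate(([0], np.cumsum(counts)))  # skeleton values Y(t)
```

The skeleton is then filled in by placing the counts within each dyadic subinterval via uniform order statistics, as in Algorithm 1.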

B.3 Median-based Poisson bridge

The full median-based Poisson bridge construction is described in Algorithm 3, which uses two subroutines: Algorithm 6 to construct a Poisson skeleton and Algorithm 7 to fill in such a Poisson skeleton. Note that to use variance reduction techniques in combination with the Poisson bridge method we simply make sure that the uniform random variables, $\mathbf{U}$, used to construct the Poisson skeleton are sampled via a variance reduction technique, such as antithetic sampling.

To implement a single step of the median-based Poisson bridge we can almost directly follow [4, Section 15.2]. Note, however, that [4, Section 15.2] describes a (hybrid) median-based Poisson bridge of fixed internal time length, i.e. for a given internal time $\tau$ one first samples $Y(\tau) \sim \mathcal{P}(\tau)$, as we did in the midpoint-based Poisson bridge method, after which the median-based procedure constructs a Poisson skeleton that is filled in via the exponential spacings method. We can, however, replace this initial step by fixing $N_s$ and sampling the arrival epoch $\tau_{N_s}$ as described above from a gamma distribution; a complete description of this procedure is given in Algorithms 3, 6 and 7. Similarly to the midpoint-based Poisson bridge in Algorithm 2, the median-based approach in Algorithm 3 samples a Poisson process for $0 \le Y(t) \le \lceil N_f/N_s \rceil N_s$ rather than $0 \le Y(t) \le N_f$. We can, however, achieve the latter by building in an early stopping checkpoint in step 7 of Algorithm 3, i.e. in the filling-in stage (done via Algorithm 7).
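The median bisection can be sketched as follows; the function name and interface are ours, and we draw beta and gamma variates directly rather than by inverse transform. The key fact is that, conditional on the arrival epochs $\tau_i$ and $\tau_j$, an intermediate arrival $\tau_m$ ($i < m < j$) satisfies $(\tau_m - \tau_i)/(\tau_j - \tau_i) \sim \mathrm{Beta}(m - i,\, j - m)$, which for the median index reduces to the $\mathrm{Beta}(\alpha, \alpha)$ or $\mathrm{Beta}(\alpha, \alpha - 1)$ cases mentioned below.

```python
import numpy as np

def pb_median_step(n_s, levels, rng=None):
    """Sketch of a median-based Poisson skeleton: fix the number of
    arrivals n_s, sample the last arrival epoch from Gamma(n_s, 1),
    then recursively fill in the median arrival of each index interval
    with a scaled beta variate."""
    rng = rng or np.random.default_rng()
    t = {0: 0.0, n_s: rng.gamma(n_s)}   # arrival index -> arrival epoch
    intervals = [(0, n_s)]
    for _ in range(levels):
        nxt = []
        for i, j in intervals:
            if j - i < 2:               # nothing left to bisect
                continue
            m = (i + j + 1) // 2        # median arrival index
            # (tau_m - tau_i)/(tau_j - tau_i) ~ Beta(m - i, j - m)
            t[m] = t[i] + (t[j] - t[i]) * rng.beta(m - i, j - m)
            nxt += [(i, m), (m, j)]
        intervals = nxt
    return t
```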

In order to use variance reduction methods the gamma and beta random variates need to be generated via inverse transform sampling. For the beta random variates this means that we have to invert the regularised incomplete beta function, i.e. solve $u = I_x(a, b)$ for $x$ given $u \in [0, 1]$, which is a relatively costly operation. However, as noted in [4, Section 3.3], the beta random variables used in the median-based Poisson bridge are always distributed according to $\mathrm{Beta}(\alpha, \alpha)$ or $\mathrm{Beta}(\alpha, \alpha - 1)$, so we can use a fast inversion method tailored to the symmetrical beta distribution, such as [8]. For the case $\mathrm{Beta}(\alpha, \alpha - 1)$ we postulate that the identities relating $I_x(\alpha, \alpha)$ and $I_x(\alpha, \alpha - 1)$, combined with a method for the symmetrical beta distribution, can provide fast inversion too. For the gamma random variates we can, for example, use the efficient numerical inversion of the incomplete gamma function ratios [5] to sample via inverse transform sampling. Finally, after generating the Poisson skeleton we fill in the remaining reaction times using uniform order statistics, which we generate in exactly the same way as described in Section 3.1.
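As a generic (if slow) fallback for such inversions, any continuous monotone CDF can be inverted by bisection; the helper name is ours, and we demonstrate it on $\mathrm{Beta}(2, 2)$, a symmetrical case with the closed-form CDF $I_x(2, 2) = 3x^2 - 2x^3$. Tailored methods such as [8] are of course far faster in practice.

```python
def inv_cdf_bisect(cdf, u, lo=0.0, hi=1.0, tol=1e-12):
    """Generic inverse-transform step: solve cdf(x) = u by bisection,
    valid for any continuous non-decreasing CDF on [lo, hi]."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if cdf(mid) < u:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Beta(2, 2) has CDF I_x(2, 2) = 3x^2 - 2x^3 in closed form.
beta22_cdf = lambda x: 3 * x**2 - 2 * x**3
x = inv_cdf_bisect(beta22_cdf, 0.5)   # median of the symmetric Beta(2, 2)
```

By the symmetry $I_{1-x}(\alpha, \alpha) = 1 - I_x(\alpha, \alpha)$, antithetic inputs $u$ and $1 - u$ map to outputs that are mirror images about $1/2$, which is what makes the symmetrical beta case so convenient for antithetic sampling.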

Algorithm 3 (PB_median($N_f, N_s, L, \mathbf{U}$)). This generates a single Poisson process $Y(t)$ for $0 \le Y(t) \le \lceil N_f/N_s \rceil N_s$ via the median-based Poisson bridge construction with $L$ levels and step size $N_s$.

B.4 Subroutines

Algorithm 4 (stepPB_midpoint($\tau_s, L, \mathbf{U}$)). This generates a single Poisson skeleton for the Poisson process $Y(t)$ for $t \in [0, \tau_s)$ via the midpoint-based Poisson bridge construction with $L$ levels.

Algorithm 5 (fillPB_midpoint($\mathbf{M}, \mathbf{t}$)). This fills in a single Poisson skeleton generated by the midpoint-based Poisson bridge construction.

Algorithm 6 (stepPB_median($N_s, L, \mathbf{U}$)). This generates a single Poisson skeleton for the Poisson process $Y(t)$ for $0 \le Y(t) \le N_s$ via the median-based Poisson bridge construction with $L$ levels.

Algorithm 7 (fillPB_median($\mathbf{j}, \mathbf{t}$)). This fills in a single Poisson skeleton generated by the median-based Poisson bridge construction.

References

[1] C. H. L. Beentjes, Variance reduction techniques for chemical reaction network simulation, Ph.D. thesis, University of Oxford, 2020.

[2] L. Devroye, Non-Uniform Random Variate Generation, Springer, New York, 1986. doi:10.1007/978-1-4613-8643-8.

[3] B. L. Fox, Generating Poisson processes by quasi-Monte Carlo, Lecture notes (1996).

[4] B. L. Fox, Strategies for Quasi-Monte Carlo, Internat. Ser. Oper. Res. Management Sci. 22, Springer, New York, 1999. doi:10.1007/978-1-4615-5221-5.

[5] A. Gil, J. Segura and N. M. Temme, Efficient and accurate algorithms for the computation and inversion of the incomplete gamma function ratios, SIAM J. Sci. Comput. 34 (2012), no. 6, A2965–A2981. doi:10.1137/120872553.

[6] M. B. Giles, Algorithm 955: Approximation of the inverse Poisson cumulative distribution function, ACM Trans. Math. Software 42 (2016), no. 1, 1–22. doi:10.1145/2699466.

[7] P. Glasserman, Monte Carlo Methods in Financial Engineering, Springer, New York, 2003. doi:10.1007/978-0-387-21617-1.

[8] P. L'Ecuyer and R. Simard, Inverting the symmetrical beta distribution, ACM Trans. Math. Software 32 (2006), no. 4, 509–520. doi:10.1145/1186785.1186786.

[9] C. Lemieux, Monte Carlo and Quasi-Monte Carlo Sampling, Springer Ser. Statist., Springer, New York, 2009. doi:10.1007/978-0-387-78165-5_5.

[10] P. A. Maginnis, M. West and G. E. Dullerud, Exact variance-reduced simulation of lattice continuous-time Markov chains with applications in reaction networks, Bull. Math. Biol. 81 (2019), no. 8, 3159–3184. doi:10.1007/s11538-019-00576-2.

[11] B. Moskowitz and R. E. Caflisch, Smoothness and dimension reduction in quasi-Monte Carlo methods, Math. Comput. Model. 23 (1996), no. 8–9, 37–54. doi:10.1016/0895-7177(96)00038-6.

[12] A. B. Owen, Monte Carlo Theory, Methods and Examples, book in progress, http://statweb.stanford.edu/~owen/mc/.

Received: 2020-10-28
Revised: 2021-04-12
Accepted: 2021-04-26
Published Online: 2021-06-01
Published in Print: 2021-09-01

© 2021 Walter de Gruyter GmbH, Berlin/Boston
