Noura Dawass
2.2. NUMERICAL COMPUTATION OF w(x)

2.2.1. IMPORTANCE SAMPLING ALGORITHM FOR COMPUTING p(i)

In the algorithm below, we show how the probability distribution function p(i) and the weight function W(i) are computed. The algorithm presented in this work is for a 3D subvolume; however, it is trivial to adjust it to other dimensions. The algorithm consists of the following steps:

1. Set the bin width Δr and the maximum allowed displacement for random displacements.

2. Set the weight function W(i) to zero for all bins.

3. Choose two random points (P1 and P2) inside the subvolume V.

4. For each sampling cycle (we typically performed 10^11 cycles):

   (a) Select a single point, P1 or P2, at random. Assume that P_i is selected (the other point is denoted by P_j).

   (b) Give point P_i a random displacement, leading to P_new.

   (c) Check whether this new position falls inside the subvolume. If it does not, skip to step (f); otherwise, continue with step (d).

   (d) Determine the normalized distance, r/L_max, between P_new and P_j, and determine the bin number corresponding to this distance, i_new. The bin number corresponding to the old distance is denoted by i_old.

   (e) Accept the displacement if a uniformly distributed random number between 0 and 1 is smaller than exp[W(i_new) − W(i_old)]; otherwise, reject the displacement. If the move is accepted, update P_i and i_old such that P_i = P_new and i_old = i_new.

   (f) Compute the normalized distance between P_i and P_j and the bin number i corresponding to that distance, and update the sampling of the observed (biased) probability distribution function p_biased(i).

5. After a large number of cycles, remove the bias caused by the weight function:

   p(i) = p_biased(i) exp[−W(i)]   (2.1)

6. Update and save W(i) for the consecutive computations of p(i) using an iterative updating scheme:

   W(i) → W(i) − (1/2) ln p_biased(i)   (2.2)

   and shift W(i) so that its minimum equals zero.
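The steps above can be sketched in Python. This is a minimal illustration, not the implementation used in this work: the bin count, number of cycles (far fewer than the 10^11 used in practice), maximum displacement, and the choice L_max = √3·L (the diagonal of a cubic subvolume of edge L) are all assumptions made here for the sketch.

```python
import math
import random

def sample_p_biased(L=1.0, n_bins=100, n_cycles=5000, max_disp=0.1, W=None):
    """One pass of the importance-sampling loop: returns p_biased(i) and W(i).

    Distances are normalized by L_max = sqrt(3)*L (assumed cube diagonal)."""
    if W is None:
        W = [0.0] * n_bins                       # step 2: W(i) = 0 for all bins
    L_max = math.sqrt(3.0) * L
    dr = 1.0 / n_bins                            # step 1: bin width in r / L_max

    def bin_of(p, q):
        r = math.dist(p, q) / L_max              # normalized distance
        return min(int(r / dr), n_bins - 1)

    # step 3: two random points inside the cubic subvolume V
    pts = [[random.uniform(0.0, L) for _ in range(3)] for _ in range(2)]
    i_old = bin_of(pts[0], pts[1])

    p_biased = [0] * n_bins
    for _ in range(n_cycles):                    # step 4
        k = random.randrange(2)                  # (a) pick P1 or P2 at random
        new = [c + random.uniform(-max_disp, max_disp) for c in pts[k]]
        if all(0.0 <= c <= L for c in new):      # (c) reject moves leaving V
            i_new = bin_of(new, pts[1 - k])      # (d) bin of the new distance
            # (e) accept with probability min(1, exp[W(i_new) - W(i_old)])
            if random.random() < math.exp(W[i_new] - W[i_old]):
                pts[k] = new
                i_old = i_new
        p_biased[i_old] += 1                     # (f) sample current state
    return p_biased, W

def unbias_and_update(p_biased, W):
    """Steps 5 and 6: remove the bias and update the weight function."""
    # step 5: p(i) = p_biased(i) * exp[-W(i)]
    p = [pb * math.exp(-w) for pb, w in zip(p_biased, W)]
    # step 6: W(i) -> W(i) - (1/2) ln p_biased(i), skipping empty bins
    W_new = [w - 0.5 * math.log(pb) if pb > 0 else w
             for w, pb in zip(W, p_biased)]
    shift = min(W_new)                           # shift so min(W) = 0
    return p, [w - shift for w in W_new]
```

In a full computation, the returned W(i) would be fed back into `sample_p_biased` and the two functions iterated until p_biased(i) is approximately flat, at which point all bins of p(i) are sampled with comparable statistics.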