Abstract:
Importance sampling is a widely used technique for reducing run-time in the simulation of digital systems. It is well known that its effectiveness degrades in the presence of system memory (e.g., systems employing a Viterbi decoder). This paper presents modifications to conventional importance sampling that yield improvements when system memory is present. A new technique for applying importance sampling in bursts is also developed, and the effect of this burst weighting scheme on simulated systems employing Viterbi decoding is analysed. The use of control variables is also introduced as an additional variance-reduction technique.
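To make the baseline concrete, the following is a minimal sketch of conventional importance sampling for a memoryless rare-event probability, here P(X > t) with X ~ N(0, 1). The mean-shifted biasing density and the function names are illustrative assumptions, not the scheme developed in the paper.

```python
import math
import random


def mc_estimate(t, n, seed=0):
    # Plain Monte Carlo estimate of P(X > t) for X ~ N(0, 1).
    # For large t almost no samples land in the rare region,
    # so the estimate is noisy (or zero) unless n is huge.
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n) if rng.gauss(0.0, 1.0) > t)
    return hits / n


def is_estimate(t, n, seed=0):
    # Importance sampling: draw from a biasing density g = N(t, 1),
    # which concentrates samples in the rare region, and correct
    # each qualifying sample by the likelihood ratio f(x)/g(x).
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(t, 1.0)  # sample from the biasing density g
        if x > t:
            # f(x)/g(x) for N(0,1) over N(t,1) = exp(t^2/2 - t*x)
            total += math.exp(t * t / 2.0 - t * x)
    return total / n
```

With the same sample budget, the weighted estimator has far lower variance for large t; the complication the paper addresses is that when the system has memory (e.g., a Viterbi decoder), error events span many symbols and the per-sample weighting above no longer applies directly.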