This paper has two purposes. The first is to propose and justify a new synchronizer timing model, called the jitter model, which has general application to metastable reliability analysis. The second is to apply the jitter model to show that redundancy cannot improve the metastable reliability of synchronizers, contradicting previous work by El-Amawy. The jitter model extends previous synchronizer input timing models by incorporating the effects of circuit noise: the noise translates into jitter, a random time displacement, of the previously proposed deterministic aperture model. The jitter model is supported by simulation, circuit analysis, and experimental work. A CMOS D-type flip-flop is simulated in detail to give the work a practical emphasis, and an experimental bistable device has been constructed to examine the behavior of synchronizers in the presence of noise. Statistical results obtained from the experimental bistable support the jitter model for metastability, while the model for metastability proposed by El-Amawy is shown to be invalid. The original treatment, by the author and Cantoni, of the nonviability of redundancy techniques for improving synchronizer metastable reliability is extended here to incorporate the effects of circuit noise, again contradicting the work of El-Amawy. The paper thus highlights how sensitive the metastable reliability of redundant synchronizers is to modeling assumptions.
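The distinction between the two models can be sketched numerically. The following Monte Carlo fragment is illustrative only: the aperture width, jitter standard deviation, and function names are assumptions for the sketch, not values or definitions taken from the paper. Under the deterministic aperture model, a data transition inside a fixed critical window always causes failure; under the jitter model, circuit noise randomly displaces that window on each event, so the failure probability becomes a smooth function of the input transition time.

```python
import random

def failure_prob_deterministic(t, aperture=(0.0, 1.0)):
    # Deterministic aperture model: a transition inside the fixed
    # critical window always fails, outside it never does (hard step).
    return 1.0 if aperture[0] <= t <= aperture[1] else 0.0

def failure_prob_jitter(t, aperture=(0.0, 1.0), sigma=0.1,
                        trials=200_000, seed=1):
    # Jitter model (illustrative): circuit noise shifts the aperture
    # by a Gaussian displacement on each event, smearing the hard
    # step into a smooth failure probability.
    rng = random.Random(seed)
    fails = 0
    for _ in range(trials):
        shift = rng.gauss(0.0, sigma)
        if aperture[0] + shift <= t <= aperture[1] + shift:
            fails += 1
    return fails / trials
```

For example, a transition just outside the nominal window (say `t = -0.05` with `sigma = 0.1`) never fails under the deterministic model but fails with noticeable probability under the jitter model, which is the qualitative behavior the paper's experimental bistable exhibits.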
- Asynchronous inputs
- Metastable behavior