

Introduction

Plasma etch is a critical process in semiconductor manufacturing, and accurate automatic detection of the end of the etch process (endpoint detection) is important for reliable wafer processing.

Figure 1: An illustrative example of a change-point detection problem from plasma etching. Data are from a commercial LAM 9400 plasma etch machine.
\includegraphics [width=4.5in,height=2.0in]{eps/tc31.eps}

Figure 2: An example of interferometry sensor data from plasma etching: (a) (top) a waveform pattern that indicates the end of the plasma etch process, marked by dotted lines; (b) (bottom) another run of the same process in which we wish to detect a similar pattern (also marked by dotted lines).
\includegraphics [width=4.5in]{eps/tb01.eps}

Figures 1 and 2 illustrate two approaches to endpoint detection using interferometry sensor data from plasma etching:

Change-point detection
In Fig. 1, the data are well described by two quadratic curves joined at a ``change-point". Process engineers are interested in detecting this change-point in real time for the purpose of endpoint detection; a sketch of the two-quadratic fit is given below, after the second problem.

Pattern matching
In Fig. 2, a signature pattern (enlarged in the box in Fig. 2(a)) is found to be a good indicator of the endpoint. Given one example pattern (e.g., from a test run of the process), can we find similar patterns in subsequent runs (e.g., in Fig. 2(b))? A naive sliding-window matcher is sketched below for contrast with the model-based approach developed in this paper.
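As a concrete (offline) illustration of the change-point problem in Fig. 1, the following sketch scans candidate change-points, fits a separate quadratic to each side by least squares, and keeps the split with the smallest total squared residual. The synthetic signal, the minimum-segment-length guard, and the use of NumPy polynomial fitting are illustrative assumptions; the practical interest is in detecting the change-point in real time.

\begin{verbatim}
import numpy as np

def two_quadratic_changepoint(y, min_seg=5):
    """Index that best splits y into two least-squares quadratic segments."""
    t = np.arange(len(y))
    best_tau, best_cost = None, np.inf
    for tau in range(min_seg, len(y) - min_seg):
        cost = 0.0
        for lo, hi in ((0, tau), (tau, len(y))):
            coeffs = np.polyfit(t[lo:hi], y[lo:hi], deg=2)   # quadratic fit
            resid = y[lo:hi] - np.polyval(coeffs, t[lo:hi])
            cost += float(np.sum(resid ** 2))                # total squared error
        if cost < best_cost:
            best_tau, best_cost = tau, cost
    return best_tau

# Example on synthetic data whose trend changes shape at t = 60.
rng = np.random.default_rng(0)
t = np.arange(100)
y = np.where(t < 60, 0.01 * t ** 2, 36.0 + 0.5 * (t - 60)) + 0.2 * rng.standard_normal(100)
print(two_quadratic_changepoint(y))   # expected to be close to 60
\end{verbatim}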
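For the pattern-matching problem of Fig. 2, a naive baseline simply slides the example pattern over a new run and scores each offset by mean-removed squared distance. The sketch below assumes both signals are one-dimensional NumPy arrays; such a rigid matcher cannot stretch or compress the pattern in time, which is precisely the flexibility the segmental semi-Markov model adds.

\begin{verbatim}
import numpy as np

def sliding_match(pattern, run):
    """Score every offset of `run` against `pattern` (lower = better match)."""
    m = len(pattern)
    p = pattern - pattern.mean()                  # remove the pattern's level
    scores = np.empty(len(run) - m + 1)
    for i in range(len(scores)):
        w = run[i:i + m]
        scores[i] = float(np.sum((w - w.mean() - p) ** 2))
    return scores                                 # np.argmin(scores) gives the best offset
\end{verbatim}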

We formulate the above two problems in a general framework of segmental semi-Markov models, which extends the standard hidden Markov model (HMM) to allow explicit state durations (the semi-Markov model, e.g., Ferguson 1980) and segmental observations (the segmental model, e.g., Holmes and Russell 1999).

For example, in the context of change-point detection in Fig. 1, we have two states: ``pre-change-point", and ``post-change-point". Similarly, for pattern matching in Fig. 2, if we approximate the pattern as a sequence of linear segments (by piecewise linear segmentation, see Fig. 3), state $i$ corresponds to the $i$-th segment of the pattern.
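To make this concrete, the sketch below shows one possible per-state parameterization of a segmental semi-Markov model: each state carries an explicit duration distribution and a segment-level observation model. The Gaussian duration model and the linear-segment-plus-Gaussian-noise observation model are illustrative assumptions rather than the specific choices of this paper; for the change-point problem of Fig. 1, two such states (with quadratic segment models) would play the roles of ``pre-change-point" and ``post-change-point".

\begin{verbatim}
import numpy as np
from dataclasses import dataclass

@dataclass
class SegmentState:
    mean_dur: float      # mean segment duration (explicit, non-geometric)
    std_dur: float       # std. dev. of segment duration
    slope: float         # expected slope of the linear segment
    intercept: float     # expected value at the segment start
    noise_std: float     # std. dev. of observation noise about the line

    def log_duration(self, d):
        # log N(d; mean_dur, std_dur^2): the semi-Markov part of the model
        return -0.5 * ((d - self.mean_dur) / self.std_dur) ** 2 \
               - np.log(self.std_dur * np.sqrt(2 * np.pi))

    def log_segment(self, y):
        # log-likelihood of an observed segment y_1 ... y_d as a whole:
        # the segmental part of the model
        t = np.arange(len(y))
        resid = y - (self.intercept + self.slope * t)
        return float(np.sum(-0.5 * (resid / self.noise_std) ** 2
                            - np.log(self.noise_std * np.sqrt(2 * np.pi))))
\end{verbatim}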

Figure 3: The example waveform pattern from Figure 2(a) and its piecewise linear representation.
\includegraphics [width=4.5in,height=2.0in]{eps/ta26.eps}
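As one way of producing a representation like the one in Fig. 3, the sketch below uses a simple top-down segmentation: fit a line, split at the worst-fit sample, and recurse until the maximum residual falls below a threshold. The threshold max_err and the top-down strategy are assumptions made for illustration; the paper does not prescribe a particular segmentation algorithm here.

\begin{verbatim}
import numpy as np

def _segment(y, lo, hi, max_err, breaks):
    """Recursively split y[lo:hi] until each piece is well fit by a line."""
    t = np.arange(lo, hi)
    slope, intercept = np.polyfit(t, y[lo:hi], deg=1)
    resid = np.abs(y[lo:hi] - (intercept + slope * t))
    if resid.max() <= max_err or hi - lo <= 3:
        return
    split = lo + int(np.argmax(resid))          # split at the worst-fit sample
    split = min(max(split, lo + 2), hi - 2)     # keep both halves non-trivial
    _segment(y, lo, split, max_err, breaks)
    breaks.append(split)
    _segment(y, split, hi, max_err, breaks)

def piecewise_linear_breakpoints(y, max_err=0.5):
    """Breakpoint indices of a piecewise linear approximation of y."""
    breaks = []
    _segment(y, 0, len(y), max_err, breaks)
    return [0] + breaks + [len(y)]
\end{verbatim}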

When the states of the Markov model are not directly observed (i.e., hidden), the states $s_1 s_2 \ldots s_t \ldots$ must be inferred from the observed data $y_1 y_2 \ldots y_t \ldots$. For the standard HMM, there are efficient algorithms to compute $p(s_t=i\vert y_1 y_2 \ldots y_t \ldots)$ (the forward-backward algorithm) and the most likely state sequence $\hat{\mathbf{s}}=\hat{s}_1 \hat{s}_2 \ldots \hat{s}_t \ldots$, i.e., the sequence that maximizes $p(\mathbf{s}=s_1 s_2 \ldots s_t \ldots \vert y_1 y_2 \ldots y_t \ldots)$ (the Viterbi algorithm). These algorithms can be extended to our segmental semi-Markov models; in the next section, we give a Viterbi-like algorithm to compute the most likely state sequence in a segmental semi-Markov model.
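For reference, the sketch below is a log-space Viterbi recursion for a standard HMM with a discrete observation alphabet; the discrete emissions and the parameter shapes are assumptions made only to keep the example short, and the next section extends this recursion to segmental semi-Markov models.

\begin{verbatim}
import numpy as np

def viterbi(log_pi, log_A, log_B, obs):
    """Most likely state sequence under a standard HMM (inputs in log space).

    log_pi : (K,)   initial state log-probabilities
    log_A  : (K, K) transition log-probabilities, log_A[i, j] = log p(j | i)
    log_B  : (K, M) emission log-probabilities over M discrete symbols
    obs    : length-T sequence of symbol indices
    """
    K, T = log_A.shape[0], len(obs)
    delta = np.empty((T, K))            # best log-prob of a path ending in state k at time t
    psi = np.zeros((T, K), dtype=int)   # argmax predecessor, for backtracking
    delta[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A       # scores[i, j]: leave i, enter j
        psi[t] = np.argmax(scores, axis=0)
        delta[t] = scores[psi[t], np.arange(K)] + log_B[:, obs[t]]
    states = [int(np.argmax(delta[-1]))]             # best final state
    for t in range(T - 1, 0, -1):                    # backtrack
        states.append(int(psi[t, states[-1]]))
    return states[::-1]
\end{verbatim}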


Xianping Ge
2000-05-16