The semiconductor industry is moving toward 20 nm nodes and below. As the overlay (OVL) budget tightens at these advanced nodes, accuracy in every nanometer of OVL error becomes critical. Process owners must choose OVL targets and methods for their process wisely; otherwise the reported OVL can be inaccurate, resulting in yield loss. The same problem can occur when the target sampling map is chosen incorrectly and includes asymmetric targets, which bias the correctable terms and corrupt the wafer. Total measurement uncertainty (TMU) is the main parameter process owners use when choosing an OVL target for each layer, but toward the 20 nm node and below, TMU alone will not be enough for accurate OVL control. KLA-Tencor has introduced a quality score named ‘Qmerit’ for its imaging-based overlay (IBO) targets, computed on-the-fly in X and Y for each OVL measurement point. This Qmerit score enables process owners to select compatible targets that provide accurate OVL values for their process and thereby improve yield. Together with K-T Analyzer’s ability to detect symmetric targets across the wafer and within the field, the Archer tools will continue to provide an independent, reliable measurement of OVL error at the next advanced nodes, enabling fabs to manufacture devices that meet their tight OVL error budgets.
To fulfill the ever-tightening requirements of advanced-node overlay budgets, overlay metrology is becoming increasingly sensitive to even the smallest imperfections in the metrology target. Under certain circumstances, inaccuracy due to such target imperfections can become the dominant contribution to the metrology uncertainty, and it cannot be quantified by the standard TMU contributors. In this paper we describe a calibration method that makes the overlay measurement robust to target imperfections without diminishing its sensitivity to the target overlay. The basic assumption of the method is that the overlay measurement result can be approximated as the sum of two terms: the accurate overlay and the measurement inaccuracy (independent of the conventional contributors). While the first term (the “real overlay”) is robust to the measurement conditions, the target-induced inaccuracy is known to depend on them. This dependence on measurement conditions is used to obtain a quantitative estimate of the inaccuracy by means of the overlay quality merit described in previous publications. The paper includes the theoretical basis of the method as well as experimental validation.
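The abstract's two-term assumption can be illustrated with a toy calculation. Below is a minimal sketch (not the paper's actual algorithm) assuming the inaccuracy term scales linearly with a known per-condition quality merit, so that extrapolating to zero merit recovers the condition-independent overlay; the numbers are synthetic.

```python
import numpy as np

# Hypothetical readings of one target under several measurement conditions.
# Assumed model (per the abstract's decomposition):
#   measured_i = real_overlay + inaccuracy_i,
# where inaccuracy_i scales with a per-condition quality merit q_i while
# the real overlay does not depend on the condition.
q = np.array([0.0, 0.5, 1.0, 1.5])        # quality merit per condition (assumed known)
measured = np.array([3.0, 3.4, 3.8, 4.2])  # overlay readings [nm], synthetic

# Linear regression over conditions; the intercept at q = 0 (zero
# inaccuracy) estimates the condition-independent "real overlay".
slope, intercept = np.polyfit(q, measured, 1)
real_overlay = intercept
print(real_overlay)  # extrapolated accurate overlay [nm]
```

Here the slope quantifies the target-induced inaccuracy per unit of quality merit, which is the sense in which the merit enables a quantitative inaccuracy estimate.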
Overlay control is one of the most critical areas in advanced semiconductor processing. Maintaining
optimal product disposition and control requires high quality data as an input. Outliers can contaminate lot
statistics and negatively impact lot disposition and feedback control. Advanced outlier removal methods
have been developed to minimize their impact on overlay data processing. Rejection methods in use today
are generally based on metrology quality metrics, raw data statistics and/or residual data statistics.
Shortcomings of typical methods include the inability to detect multiple outliers as well as the unnecessary
rejection of valid data. As the semiconductor industry adopts high-order overlay modeling techniques,
outlier rejection becomes even more important than it is for linear modeling. In this paper we discuss the use of
robust regression methods to eliminate outliers more accurately. We show the results of an
extensive simulation study, as well as a case study with data from a semiconductor manufacturer.
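As a minimal sketch of the robust-regression idea, the snippet below uses iteratively reweighted least squares with Huber weights and a MAD-based scale; the abstract does not specify this exact scheme, and the data are synthetic, with two injected flyers.

```python
import numpy as np

# Synthetic overlay data along one field coordinate, with two flyers.
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 30)
y = 2.0 * x + 0.5 + rng.normal(0, 0.05, 30)
y[5] += 3.0   # injected outlier
y[20] -= 4.0  # injected outlier

A = np.column_stack([x, np.ones_like(x)])
w = np.ones_like(y)
for _ in range(20):  # iteratively reweighted least squares (IRLS)
    sw = np.sqrt(w)
    coef, *_ = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)
    r = y - A @ coef
    s = 1.4826 * np.median(np.abs(r))  # robust scale via MAD
    # Huber weights: full weight for small residuals, downweight flyers.
    w = np.minimum(1.0, 1.345 * s / np.maximum(np.abs(r), 1e-12))

# Flag points whose residual exceeds 3 robust sigmas.
outliers = np.abs(y - A @ coef) > 3 * s
print(coef, np.where(outliers)[0])
```

Because the scale estimate and the fit are both robust, multiple flyers are detected together without masking each other, which is the shortcoming of naive residual-based rejection that the abstract mentions.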
A primary concern when selecting an overlay sampling plan is the balance between accuracy and throughput. Two
significant inflections in the semiconductor industry require even more careful sampling consideration: the transition
from linear to high order overlay control, and the transition to dual patterning lithography (DPL) processes. To address
the sampling challenges, an analysis tool in KT-Analyzer has been developed to enable quantitative evaluation of
sampling schemes for both stage-grid and within-field analysis. Our previous studies indicated (1) the need for fully
automated solutions that remove individual interpretation from the optimization process, and (2) the need for improved
algorithms for this automation; both are described here.
KEYWORDS: Semiconducting wafers, Overlay metrology, Process control, Time metrology, Monte Carlo methods, Optical lithography, Statistical methods, Control systems, Immersion lithography, Lithography
Overlay metrology and control have been critical for successful advanced microlithography for many years, and are
taking on an even more important role as time goes on. Due to throughput constraints it is necessary to sample only a
small subset of overlay metrology marks, and typical sample plans are static over time. Standard production monitoring
and control involves measuring sufficient samples to calculate up to 6 linear correctables. As design rules shrink and
processing becomes more complex, however, it is necessary to consider higher order models with additional degrees of
freedom for control, fault detection, and disposition. This, in turn, requires a higher level of sampling and careful
consideration of flyer removal. Given the throughput constraints, however, rigorous statistical methods are needed to establish a
baseline sampling plan. This study focuses on establishing a 3x nm node immersion
lithography production-worthy sampling plan for 3rd order modeling, verification of the accuracy, and proof of
robustness of the sampling. In addition we discuss motivation for dynamic sampling for application to higher order
modeling.
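The "up to 6 linear correctables" mentioned above can be made concrete with a small fit. The sketch below uses one common parameterization (translation Tx/Ty, scaling Mx/My, rotation R, non-orthogonality N); exact sign and term conventions vary by tool and fab, and the data are synthetic.

```python
import numpy as np

# Assumed linear overlay model (one common convention):
#   dx = Tx + Mx*x - R*y
#   dy = Ty + My*y + (R + N)*x
# with x, y in mm and dx, dy in nm, so slope terms are in nm/mm (~ppm).
rng = np.random.default_rng(2)
x, y = rng.uniform(-150, 150, (2, 50))          # 50 site coordinates [mm]
Tx, Ty, Mx, My, R, N = 2.0, -1.0, 3.0, 1.0, 2.0, 0.5
dx = Tx + Mx * x - R * y + rng.normal(0, 0.3, 50)
dy = Ty + My * y + (R + N) * x + rng.normal(0, 0.3, 50)

# Stack both equations into one least-squares system for the 6 unknowns.
O, Z = np.ones(50), np.zeros(50)
top = np.column_stack([O, Z, x, Z, -y, Z])      # dx rows
bot = np.column_stack([Z, O, Z, y, x, x])       # dy rows
A = np.vstack([top, bot])
coef, *_ = np.linalg.lstsq(A, np.concatenate([dx, dy]), rcond=None)
print(coef)  # estimates of [Tx, Ty, Mx, My, R, N]
```

Higher-order (e.g., 3rd order) models simply extend the design matrix with polynomial terms in x and y, which is why they demand more sample sites and more careful flyer removal than this 6-parameter fit.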
Isolated and dense patterns were formed at process layers from gate through back-end on wafers using a 90 nm logic device process with ArF lithography under various lithography conditions. Pattern placement errors (PPE) between AIM grating and BiB marks were characterized for line widths from 1000 nm down to 140 nm. As pattern size was reduced, overlay discrepancies grew larger, a tendency confirmed by optical simulation with simple coma aberration. Furthermore, incorporating such small patterns into conventional marks resulted in significant degradation of metrology performance, while performance on small-pattern segmented grating marks was excellent. Finally, the data also show good correlation between the grating mark and specialized design-rule-feature SEM marks, with poorer correlation between the conventional mark and the SEM mark, confirming that the new grating mark significantly improves overlay metrology correlation with device patterns.
Modern overlay metrology tools achieve the required metrology accuracy by controlling critical asymmetries in the imaging optics, and by compensating for the remaining asymmetries through TIS-calibration. We extend our study on the TIS-WIS interaction in stepper alignment optics to the overlay metrology tool, and propose a new method for characterizing residual TIS. This method is based on the examination of the through-focus behavior of the metrology tool on a wafer with a simple, TIS-sensitive structure.
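The standard TIS calibration that the abstract builds on can be stated in two lines. Under 180° wafer rotation the true overlay flips sign while the tool-induced asymmetry does not, so the two contributions separate; the readings below are example values only.

```python
# Standard TIS (tool-induced shift) calibration from a 0/180-degree pair:
#   TIS        = (ovl_0 + ovl_180) / 2   (tool asymmetry, rotation-invariant)
#   calibrated = (ovl_0 - ovl_180) / 2   (true overlay, flips sign with wafer)
ovl_0, ovl_180 = 5.2, -4.6            # example readings [nm]
tis = (ovl_0 + ovl_180) / 2           # 0.3 nm tool-induced shift
calibrated = (ovl_0 - ovl_180) / 2    # 4.9 nm TIS-corrected overlay
print(tis, calibrated)
```

The residual-TIS characterization proposed in the paper addresses whatever asymmetry this averaging does not remove, by examining its through-focus behavior on a TIS-sensitive structure.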