In-situ Composition Analysis of Direct Metal Deposition Process using Optical Emission Spectroscopy of Laser Induced Plasma

By: Lijun Song and Jyoti Mazumder

Center for Laser Aided Intelligent Manufacturing, University of Michigan

The ability to monitor elemental composition in real time during laser-aided rapid manufacturing is essential both for estimating elemental losses and for composition control. Traditionally, X-ray fluorescence (XRF) and energy-dispersive X-ray spectroscopy (EDS) have been widely used for elemental analysis after parts are fabricated. However, both methods are time consuming because of sample preparation and post-processing. The laser-induced plasma, a by-product of the direct metal deposition process, makes real-time multi-element analysis possible once the plasma parameters are calibrated.

The Center for Laser Aided Intelligent Manufacturing at the University of Michigan has developed an approach that monitors the composition of laser-manufactured parts using multiple calibration curves to improve the accuracy and repeatability of the analysis. Chromium composition prediction with ±3.5% uncertainty and 4.1% resolution has been achieved for chromium contents ranging from 9% to 37% during direct metal deposition of chromium and H13 tool steel powders.

During the direct metal deposition process, a fiber-pigtailed high-resolution spectrometer monitors the laser-generated plasma. After careful selection of emission lines from the different elements, baseline removal, and line fitting, three types of second-order polynomial calibration curves are constructed, based on multiple line intensity ratios, plasma temperature, and electron density. Figure 1 shows an example of a calibration curve based on the line intensity ratio between chromium and iron emissions during direct metal deposition of a mixture of H13 tool steel and pure chromium powders.
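A line-ratio calibration curve of this kind can be sketched as a second-order polynomial fit of known compositions against measured Cr/Fe intensity ratios. The data values below are purely illustrative placeholders, not the published calibration data:

```python
import numpy as np

# Illustrative calibration set (NOT the published data): Cr weight
# fractions (%) verified offline, and the Cr/Fe emission line intensity
# ratios measured from the laser-induced plasma for each composition.
cr_composition = np.array([9.0, 15.0, 20.0, 25.0, 30.0, 37.0])
cr_fe_ratio = np.array([0.42, 0.78, 1.10, 1.45, 1.83, 2.40])

# Second-order polynomial calibration curve: line ratio -> composition.
coeffs = np.polyfit(cr_fe_ratio, cr_composition, deg=2)
calibration = np.poly1d(coeffs)

def predict_cr(ratio):
    """Predict Cr composition (%) from a measured Cr/Fe intensity ratio."""
    return float(calibration(ratio))
```

In practice each emission line pair would yield its own curve of this form, fitted from reference samples of known composition.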

Figure 1. Calibration curve for the chromium composition and the Cr/Fe line ratio

Real-time composition analysis has been performed using the three kinds of calibration curves. The relative composition deviation is 25% with a single line intensity ratio calibration curve and decreases to 7% with four line intensity ratio calibration curves. With seven line intensity ratio calibration curves, the relative composition deviation drops to 5%. Incorporating the electron density calibration curve reduces the composition variation further, to 3.5%.
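The improvement from combining curves can be illustrated by averaging the per-curve predictions for a single spectrum: if the individual curve errors are roughly independent, the scatter of the combined estimate shrinks as more curves are included. The numbers below are hypothetical, for illustration only:

```python
import numpy as np

# Hypothetical predictions (%) for one plasma spectrum, one value per
# line-ratio calibration curve (seven curves, same true Cr content).
predictions = np.array([21.3, 19.2, 20.6, 18.9, 20.1, 19.7, 20.4])

# Combined estimate: average across calibration curves.
combined = predictions.mean()

# Under an independence assumption, the standard error of the combined
# estimate scales roughly as 1/sqrt(N) of the per-curve scatter.
per_curve_std = predictions.std(ddof=1)
combined_std = per_curve_std / np.sqrt(len(predictions))
```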

To test the system resolution, direct metal deposition experiments with nominal Cr compositions of 19.80% and 23.66% (both verified using EDS) were performed; the compositions measured by the system are shown in Figure 2 and Figure 3, respectively. For the nominal Cr composition of 19.80%, the measured average Cr composition is 19.41%. Twice the standard deviation of the measured composition is taken as the system resolution, which is 4.1%. For the nominal Cr composition of 23.66%, the measured average Cr composition is 24.87%, and twice the standard deviation is 3.47%.
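The resolution figure follows directly from the definition above: take the time series of composition measurements over a run and report twice its standard deviation. A minimal sketch, using made-up measurement values rather than the experimental data:

```python
import numpy as np

# Hypothetical in-situ Cr measurements (%) over one deposition run
# with a nominal 19.80% Cr powder mixture (illustrative values only).
measured = np.array([19.6, 18.3, 20.1, 19.9, 18.7,
                     20.8, 19.2, 18.6, 21.0, 17.9])

mean_cr = measured.mean()                 # average measured composition
resolution = 2.0 * measured.std(ddof=1)  # 2-sigma taken as system resolution
```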

Figure 2. Chromium composition measurement for 19.80% chromium direct metal deposition process
Figure 3. Chromium composition measurement for 23.66% chromium direct metal deposition process