Get In-Line: How To Choose In-Line Metrology For Lithium-Ion Battery Production 

Contributed Commentary by Chris Burnett, Thermo Fisher Scientific 

November 1, 2023 | The number of technologies reliant on rechargeable batteries is growing rapidly, driven by society’s increasing shift to electric vehicles and other electric devices. This rise in demand for lithium-ion batteries is forcing manufacturers to optimise their processes to ramp up production without sacrificing quality. The use of in-line metrology—such as in-line thickness or coating weight gauges—during the electrode manufacturing process is essential, as variations in coating thickness, an uneven profile, or undetected defects can drastically impact battery performance and increase the risk of thermal runaway. This article discusses the technical considerations battery manufacturers should bear in mind when choosing in-line metrology, to ensure an efficient process and a consistent, high-quality end product.

The global initiative to move away from fossil fuels and embrace clean energy sources has led to the development of new alternative technologies. However, using this energy in our daily lives requires a way to store it, and lithium-ion batteries have become a popular choice, as they offer a longer lifespan and higher energy and power densities than other solutions currently available. Lithium-ion batteries can safely store large amounts of energy and deliver a stable, predictable flow of electricity, and they are used in electric vehicles (EVs) and energy grids as well as consumer goods. However, for this technology to be fully adopted, mass production needs to become more efficient and the number of faulty batteries must be minimised through strict quality control (QC). The performance and safety of lithium-ion batteries are greatly affected by the uniformity of the electrode coating and separator film. A precise method to measure the coating weight and thickness of these materials without slowing down the manufacturing process is therefore essential, and installing in-line metrology on the production line can achieve this.

Gauging Thickness 

Electrode coatings are very thin, and variations of even a few microns can greatly affect battery cell performance. Inadequate anode material will render the battery unable to accommodate the lithium ions and electrons delivered by the cathode during charging. Similarly, non-uniform cathode material can diminish battery capacity and power, requiring more frequent charging. Ensuring a uniform coating on battery electrodes is therefore critical for safe and effective battery functionality. Investing in high-end in-line metrology—such as coating weight gauges—will help to detect non-uniformity and defects in a timely manner. A coating weight gauge operates with a sensor scanning back and forth across a moving web of conductive substrate (a copper or aluminium foil in the case of battery electrodes), allowing real-time measurement and defect detection. The efficiency of this measurement solution is determined by two factors: the choice of sensor and the scanning speed.
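To make the gauging principle concrete, the short Python sketch below shows how a transmission-type beta or x-ray gauge can convert measured beam attenuation into a coating weight, assuming simple exponential attenuation. The attenuation coefficient and intensity readings are hypothetical values chosen for illustration, not figures for any particular sensor.

```python
import math

# Illustrative sketch only: a transmission-type coating weight gauge (beta or
# x-ray) infers mass per unit area from how strongly the coated web attenuates
# the beam, assuming simple exponential attenuation. The coefficient below is
# a hypothetical placeholder, not a vendor specification.
MU = 0.012  # assumed mass attenuation coefficient, cm^2/mg (hypothetical)

def coating_weight(i_coated: float, i_substrate: float, mu: float = MU) -> float:
    """Return coating weight in mg/cm^2 from transmitted intensities.

    i_substrate: beam intensity measured through the bare foil (reference).
    i_coated:    beam intensity measured through foil plus electrode coating.
    """
    return -math.log(i_coated / i_substrate) / mu

# Example: the coated web transmits 88% of the intensity seen on bare foil.
print(f"{coating_weight(0.88, 1.00):.2f} mg/cm^2")  # ~10.65 mg/cm^2
```

In practice such gauges are calibrated against reference samples of known coating weight rather than a single textbook coefficient, but the inverse relationship between transmitted intensity and mass loading is the principle at work.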

Sensor 

There are many types of sensors available, and the optimal choice depends on the material and properties to be analysed. For example, low-density materials such as polypropylene or polyethylene, commonly used for separator films, are best measured using infrared, beta radiation, or very low energy x-ray sensors. Confocal lasers are best suited to measuring the thickness of the coated electrode, while the mass loading, or coating weight, on an electrode is measured during the coating process with an x-ray or beta sensor. Sensor resolution, precision, and beam size should also be considered when choosing a measurement sensor; this is especially important in battery production, where reliable, high-resolution measurement of edge defects is vital.
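As a rough illustration of this selection logic, the sketch below encodes the pairings described above as a simple lookup. The category labels and function are hypothetical placeholders for illustration, not part of any instrument's interface.

```python
# Hypothetical sketch of the sensor selection guidance described in the text;
# the keys and sensor family names are illustrative labels, not a product API.
SENSOR_GUIDE = {
    ("separator film", "thickness / basis weight"): ["infrared", "beta", "very low energy x-ray"],
    ("electrode", "coated thickness"): ["confocal laser"],
    ("electrode", "coating weight"): ["x-ray", "beta"],
}

def suggest_sensors(material: str, measured_property: str) -> list[str]:
    """Return candidate sensor families for a material/property pair."""
    return SENSOR_GUIDE.get((material, measured_property), ["consult an applications specialist"])

print(suggest_sensors("electrode", "coating weight"))  # ['x-ray', 'beta']
```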

Speed 

Scanning speed is also important when setting up an in-line measurement gauge. Since the sensor traverses perpendicular to the processing direction of the substrate, it scans the material in a zig-zag pattern, leaving some areas unmeasured and allowing some defects to pass undetected. The scanning speed therefore determines the percentage of the material that is measured, and although the obvious solution might be to increase scan speed, this may reduce accuracy. If the speed is too high, the response time of the sensor may be insufficient to generate an accurate representation of the material, blurring the data collected and increasing the risk of undetected defects. Ultimately, this becomes a balancing act between measurement accuracy and the amount of material measured.
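The geometry of this trade-off can be sketched in a few lines of Python. The example below estimates what fraction of the web area a zig-zag scan actually measures and how far each reading is smeared across the web by the sensor's response time; the line speed, beam size, and response time are assumptions chosen only to illustrate the trend, not recommended settings.

```python
# Rough geometric sketch of the coverage/accuracy trade-off; all numbers are
# illustrative assumptions, not recommended operating parameters.

def zigzag_coverage(web_speed_m_min: float, scan_speed_m_min: float,
                    web_width_m: float, beam_dia_mm: float,
                    response_time_s: float) -> tuple[float, float]:
    """Return (fraction of web area measured, cross-direction smear in mm).

    Each traverse paints a diagonal stripe roughly one beam diameter wide, and
    the web advances web_speed * traverse_time between stripes, so coverage is
    approximately the beam diameter divided by that advance. Faster scanning
    adds stripes but smears each reading over response_time * scan_speed in
    the cross direction.
    """
    traverse_time_s = (web_width_m / scan_speed_m_min) * 60.0
    advance_per_pass_mm = (web_speed_m_min / 60.0) * traverse_time_s * 1000.0
    coverage = min(1.0, beam_dia_mm / advance_per_pass_mm)
    cd_smear_mm = response_time_s * (scan_speed_m_min / 60.0) * 1000.0
    return coverage, cd_smear_mm

# Example: 60 m/min line speed, 1 m wide web, 20 mm beam, 50 ms sensor response.
for scan_speed in (10.0, 30.0, 90.0):  # traverse speed in m/min
    cov, smear = zigzag_coverage(60.0, scan_speed, 1.0, 20.0, 0.05)
    print(f"scan {scan_speed:5.1f} m/min -> coverage {cov:5.1%}, CD smear {smear:5.1f} mm")
```

In this illustration, tripling the traverse speed triples the fraction of material measured but also triples the cross-direction smear of each reading, which is exactly the balancing act described above.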

The uniformity of electrode coatings and separator film is critical to safe and effective battery operation, and manufacturers are increasingly using in-line metrology solutions to measure the coating weight and thickness of these materials. Determining the right system for each application involves careful selection of sensor type and optimisation of scan speed to minimise the risk of undetected defects. A suitable setup will allow continuous measurement of the electrode and separator film materials without compromising throughput, reducing the risk of defects escaping detection and increasing overall productivity.

 

Chris Burnett studied physics at Worcester Polytechnic Institute in Massachusetts and has held a number of positions at Thermo Fisher Scientific, including Director of Sensor Development and Manager of Systems Integration for the flat-sheet gauging business. With over 25 years’ experience, he currently helps drive solutions for the battery industry in his role as senior manager, applications. He can be reached at christopher.burnett@thermofisher.com.