Chapter 2. Big data changes the factories - AMORE STORIES - ENGLISH
#Jang Saetbyeol
2017.04.27


Introducing columns written by members of Amorepacific Group

Columnist: Jang Saetbyeol
Amorepacific Amundsen Camp


Introduction

 Hello, I'm Jang Saetbyeol, and I'm back today with more cases of using big data. To make the familiar yet somewhat daunting topic of big data relatable, I'll focus on topics related to our jobs and on business trends affecting Amorepacific. Last time, in Chapter 1, we covered the planning stage, the starting point from which a product is created. This time, we'll look at cases of using big data in the production and quality control stages, where products are actually made.

 Having so many diverse facilities linked with one another in the production stage means that a great amount of data is produced. The effects of applying big data are easy to quantify in metrics such as defect rate, yield and cost, and indeed there have been quite a few cases of big data drastically reducing costs. A recent trend in quality control is making quicker and more accurate judgments with algorithms learned from data, rather than relying on the conventional method of skills-based discretion.

Recalling the Six Sigma

 Let me mention a term that has now become a memory for many: Six Sigma. It created a sensation in the manufacturing industry from the late 1990s to the early 2000s. I'm sure you've heard about it on site, or as a story of the past.
  • Normal Distribution and Six Sigma (Source : THN)

 Sigma (σ) is the statistical term for the standard deviation of a population. In a normal distribution, the most commonly known probability distribution, about 68% of values fall within 1 sigma of the mean, 95.4% within 2 sigma, and 99.7% within 3 sigma; in quality terms, these are the fractions of fair-quality (non-defective) goods. As the number before 'sigma' grows, the fair-quality rate approaches 100% ever more steeply. At 6 sigma, 99.99966% of products are fair quality, corresponding to 3.4 PPM (parts per million): only about 3.4 defective products per one million produced (this figure follows the industry convention of allowing a 1.5-sigma drift in the process mean). In practice, 6 sigma is considered the lowest defect rate that is feasible in business.
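
 For readers who like to check the numbers, the sigma-level percentages above can be reproduced directly from the normal distribution. The sketch below is illustrative and not part of the original column; it also shows how the industry convention of a 1.5-sigma process drift yields the famous 3.4 PPM figure:

```python
from math import erf, sqrt

def within_k_sigma(k):
    """Fraction of a normal population within ±k standard deviations."""
    return erf(k / sqrt(2))

def dpmo(k, shift=0.0):
    """Defects per million opportunities at a k-sigma level.

    The industry's '3.4 DPMO at Six Sigma' assumes the process mean
    drifts by 1.5 sigma over time (shift=1.5), counting only the
    near tail of the distribution."""
    tail = 0.5 * (1 - erf((k - shift) / sqrt(2)))  # one-sided tail beyond (k - shift) sigma
    return tail * 1_000_000

for k in (1, 2, 3):
    print(f"{k}-sigma: {within_k_sigma(k):.1%} fair quality")
print(f"6-sigma with 1.5-sigma drift: {dpmo(6, shift=1.5):.1f} DPMO")
```

 Without the 1.5-sigma drift, a pure 6-sigma process would have only about 0.002 defects per million; the 3.4 PPM figure quoted in Six Sigma literature bakes in that allowance for real-world drift.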

 Before discussing cases of big data, why have I brought up Six Sigma? Setting aside its detailed methodology, the general framework of Six Sigma rests on thorough analysis of statistical data, which made it completely different from the business methodologies it largely replaced. Statistical techniques and data were actively applied at multiple stages: measurement, analysis, design, optimization and verification. Unlike the older approaches driven by the intuition of managers or decision makers, Six Sigma adopted a strictly scientific and objective approach. In this sense, it looks like a forerunner of big data, which has a remarkable influence over all businesses today. Indeed, just as Six Sigma once did, big data is now steadily increasing its influence over the manufacturing process.

Change will come to you

 In November 2016, GE held its fifth annual Minds + Machines event. There, GE Vice Chair Beth Comstock emphasized the speed of change and said GE would be the guide to such digital innovation. In the past we listened to music on CDs and tapes and shopped at offline stores; digital technology has since completely transformed those structures, and we will face this kind of disruptive innovation in every field.
  • Minds + Machines 2016 / Source : GE Report

 At Minds + Machines 2015, GE Chairman Jeffrey Immelt surprised everyone by declaring that "GE is on track to be a top 10 software company in the world by 2020," adding that they must "improve productivity again." Why did he talk about 'productivity,' the focus of manufacturing in the past, while discussing software and digital technology?

 He stated that productivity growth across the industry was just 0.5% between 2011 and 2015, whereas GE surpassed this figure as it pursued a shift to digital technology. As he pointed out, productivity gains seem to have reached their limit on the machinery side, and further gains won't come easily without digital technology and data.

 Immelt closed his speech by imploring employees to "embrace change, so that change will come to you." Is everyone ready to embrace change? It's no exaggeration to say that a company's success or failure will depend on its preparation and attitude toward embracing change.

Changing companies will survive

  • GE's Brilliant Factory / Source : GE

 In the first chapter, I mentioned that it's now necessary to manufacture products that consumers want, quite unlike the past, when we manufactured products that manufacturers wanted to make. Indeed, manufacturing is shifting from the manufacturer-centered automated processes of the past, typified by mass production, mass marketing and fixed plans, to intelligent factories that meet the demands and tastes of consumers.

 In the past, the way to lower production costs was to build factories in developing countries with low labor, land and resource costs. Now, greater cost efficiency can come from building an adequately sized factory close to the consumer market (even in an advanced country) and operating it more effectively. GE and Siemens are representative examples of this process innovation.

 First, in GE's case, there is the Brilliant Factory opened in Pune, India, in 2015. Unlike in the past, when a single factory made just one or two main products, this Brilliant Factory can manufacture products for all four of GE's business fields (aviation, electric power, transport, and oil and gas) through a multi-module approach that supports various manufacturing modes. This is possible because the factory is optimized by collecting and using data in real time.

 Apparel makers like Adidas and Nike have also recently shown great interest in this kind of factory that manufactures multiple products at once. Unlike the previous era, when consumers simply chose from ready-made goods and sizes, recent years have brought greater demand for strictly customized products, from colors and fabrics to the curves and detailed measurements of the foot, along with a greater need for productivity.

 Let's look at Siemens, another case of applying big data to manufacturing. We can see how the company made a breakthrough even in a productivity dry spell, when yield would no longer go up.
  • Panorama of Siemens Amberg Factory in Germany / Source : Siemens

 The Amberg factory in Bavaria, Germany, has all of its devices connected in a communications network, and at least 75% of all processes and flows run automatically. The factory's current yield is 99.9989%, meaning about 11 defects per million products, close to the ideal yield achievable in reality under the aforementioned Six Sigma. After the change into a smart factory, productivity increased ninefold with the same number of workers and machines.

 Siemens didn't keep this smart factory know-how in-house; it has expanded its scope of business by exporting the know-how as an IT system called the "Digital Twin." The system simulates virtual images of the actual production line and environment within a digital space.

 By simulating the process from the product design stage with a Digital Twin, a single production line can produce millions of options (for a car, for example, combinations of colors, tires and wheels). Using this system, the Italian luxury car brand Maserati reduced its vehicle development period from 30 months to 16 months.

 Big data and digital technologies like the IoT and 3D printing are being integrated across all areas of production and manufacturing to address what marketers care about most these days: personalization and customization. In this way, business is turning mass customization into an achievable reality.

Quality is vital

 Quality control and improvement are tasks faced by all manufacturers. Defects are the biggest issue in manufacturing, since once they occur it takes immense cost and time to determine their cause and resolve them. Merely removing one defective product doesn't solve everything. Samsung Electronics Chairman Lee Kunhee compared defects to cancer, saying that "even a single defective product out of a million will make the buyer think that all of the million products are defective." Defects can completely change a company's image and brand perception.

 This is why countless manufacturers are more interested in preventive quality control, or quality prediction, than in after-the-fact quality control. Next, we'll look at cases of using big data for quality, which matters even more to a company's longevity, although it differs slightly from productivity.

Sample survey vs. complete enumeration survey

 The special glass manufacturer Schott has replaced the old sample-survey method of quality control with big data. Conventional quality control in production is based on sampling: selecting and testing a number of finished goods from a specific line, much like a poll taken before an election. The sample is used to reasonably estimate the whole population, instead of a complete enumeration survey.

 Polling every qualified voter for an election would be like actually holding the vote and counting the ballots. To save time and cost, we estimate the result from a small number of samples (in conventional survey methodology, almost the only complete enumeration survey was the Population and Housing Census, conducted every five years). Likewise, it's difficult or inefficient to measure the quality of every finished product in manufacturing. Sometimes a destructive test is even required to determine a product's durability, so sample surveys have been widely adopted.
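
 The poll analogy can be made concrete with a small simulation. This is a hypothetical sketch, not Schott's actual procedure: a random sample of 1,000 units estimates the defect rate of a million-unit run, with the usual 95% margin of error for a proportion.

```python
import random

random.seed(42)

# Hypothetical production run: 1,000,000 units with a true 2% defect rate
population = [1 if random.random() < 0.02 else 0 for _ in range(1_000_000)]

# Complete enumeration: inspect every unit (slow and expensive in reality)
true_rate = sum(population) / len(population)

# Sample survey: inspect 1,000 randomly chosen units instead
sample = random.sample(population, 1_000)
p_hat = sum(sample) / len(sample)

# 95% margin of error for a proportion (normal approximation)
margin = 1.96 * (p_hat * (1 - p_hat) / len(sample)) ** 0.5

print(f"true defect rate: {true_rate:.4f}")
print(f"estimated from sample: {p_hat:.4f} ± {margin:.4f}")
```

 Inspecting a thousand units instead of a million pins the defect rate down to within about one percentage point, which is exactly the trade-off that makes sampling attractive when inspection is costly or destructive.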

 In the era of big data, however, sample surveys are increasingly being replaced by complete enumeration, either through technology that saves the necessary time and cost or by securing additional data.
  • Glass production process / Source : Google Image

 The special glass tubes for pharmaceutical packaging, Schott's main product, go through multiple, extremely meticulous processes. Glass melted in a furnace at more than 1,600 degrees Celsius is poured out at several meters per second, then cooled into the desired shapes. By adjusting the speed or air entrainment, the thickness of the glass and the diameter of the tube can be controlled to within 0.01 mm. The cooled glass is cut to the required lengths and sent to final processing.

 Under the conventional sample survey, completeness is tested after processing by shock-testing a few selected products or taking precision measurements. Instead of such statistical spot checks, Schott collects data with various sensors and devices from the moment the glass leaves the furnace: laser measurements of the diameter, thickness and deviation of the glass, and foreign-matter inspection with high-performance cameras. The measured data are processed in real time to flag products that may be defective or deviate from normal, so they can be removed before shipment.

Prevent defects in advance

 Unlike Schott's case, many manufacturers detect issues in the production process in advance and take immediate measures to reduce the defect rate, rather than surveying the quality of finished goods.

 A leading display manufacturer in Korea also applied big data to quality control of its TFT-LCD process. The company collected and measured roughly 2,000 data items in the course of making a single product. Many kinds of data were built up and stored, but the company found it difficult to use the data to maintain production quality.
  • Control chart for facility monitoring / Source : Infinity QS

 The main graphical tool in quality monitoring is the control chart, which shows whether the manufacturing process is stable and whether each quality attribute stays within fixed upper and lower limits. In the example above, the monitoring panel sends a notification when a value leaves a certain band. It's highly intuitive, but what if there are 100 of these graphs? What if there are 1,000? Can a person check them all?

 The level of data collection, measurement and management varies by production process, but electronics facilities generate countless types of sensor data, from thousands to tens of thousands of items. With as many as 2,000 items, this company couldn't determine quality defects in advance with a conventional dashboard, so it built an anomaly detection model on big data to detect facility anomalies preemptively.

 Unlike the past, when verification was required whenever an individual item exceeded its control limit, anomalies can now be monitored across the whole process at once, which has increased the efficiency of quality managers and considerably reduced the actual defect rate as well.
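
 One common way to "integrate the whole" is to collapse many sensor charts into a single anomaly score per reading. The sketch below is my own illustration, not the company's actual model; it aggregates per-sensor z-scores into one number (a simplified, diagonal-covariance cousin of the Mahalanobis distance):

```python
import statistics
from math import sqrt

def fit(baseline):
    """Per-sensor mean and standard deviation from in-control baseline
    readings. baseline: list of readings, each a list of sensor values."""
    columns = list(zip(*baseline))
    return [(statistics.mean(c), statistics.stdev(c)) for c in columns]

def anomaly_score(reading, params):
    """One aggregate score per reading: root-mean-square of the
    per-sensor z-scores, so hundreds of charts collapse into one."""
    zs = [(v - mu) / sd for v, (mu, sd) in zip(reading, params)]
    return sqrt(sum(z * z for z in zs) / len(zs))

# Hypothetical baseline: pressure, temperature offset, vibration
baseline = [
    [100, 5.0, 0.50],
    [101, 5.1, 0.49],
    [99,  4.9, 0.51],
    [100, 5.0, 0.50],
    [102, 5.1, 0.52],
    [98,  4.9, 0.48],
]
params = fit(baseline)

print(round(anomaly_score([100.5, 5.05, 0.505], params), 2))  # low: near baseline
print(round(anomaly_score([104.0, 5.30, 0.560], params), 2))  # high: all sensors drift
```

 A single threshold on this score replaces thousands of per-item limits; a real deployment would use a full covariance matrix (Hotelling's T²) or a learned model, but the monitoring idea is the same.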

Blending big data into factories

 POSCO also began building a smart factory two years ago at Gwangyang Steelworks, the world's biggest single steel mill. After selecting 39 tasks for reducing defects and saving costs, it is now conducting continuous experiments and research.

 Data such as pressure and temperature are collected and recorded for each item in the production process, just as in the display and electronics examples above. The finest unit of data collection is 0.005 seconds, and the data flow reaches as much as 1 TB a day for a single process. The accumulated data are used to determine the cause of any defect and to take follow-up measures.

 Moreover, removing cracks from manufactured steel products after production is an essential step, given the nature of steelmaking. Since big data made it possible to precisely predict the possibility of cracks, the number of crack-removal processes, which had required much time and cost, has been reduced, saving production costs considerably.

 Rather than keeping to itself a smart factory that delivers cost reduction, quality improvement and fewer facility defects, POSCO continues research to expand the system to other companies and businesses in cooperation with GE. It has recently accelerated its data- and digital-driven change, including plans to build its own data center.

In conclusion

 Today we looked at cases of using big data in manufacturing, where products are actually made, following the planning stage we covered last time. In this era of the Fourth Industrial Revolution, we're seeing rapid digital-driven change even in manufacturing, a field that long seemed traditional.

 This change will not stay within factory walls; it will reach our daily lives through the spread of customized production and quality levels beyond imagination. I look forward to the expansion of smart factories, so that we no longer have to fit our bodies and tastes into ready-made clothes and shoes. I'll be back in the next column with more cases of big data used in our business.
