2018 Outlook: Rapid commercial truck growth, lower sales with better profitability in autos
Features - Cover Story
Following three years of more than 17 million light cars and trucks sold, 2018 may miss that target, but production will stay near record levels, and industry profits and investments are projected to climb.
Following seven years of constant gains, auto sales fell slightly in 2017 (see infographic, pg. 24-25). While declines were concentrated in small, fuel-efficient, inexpensive vehicles, booming sport utility vehicle (SUV), crossover, and truck sales brought in massive profit margins.
In addition, on the heavy truck side of the motor vehicle world, Class 8 trucks are expected to extend fantastic gains from the second half of 2017 into this year. ACT Research Vice President Steve Tam says freight growth is boosting profitability for fleet managers, making it much easier to justify big equipment purchases this year (see sidebar, page 17).
Predictions for 2018 are mixed. Experts offer full-year sales estimates ranging from about 16.5 million to about 17.5 million, with most centering around 16.7 million – the prediction of the National Automobile Dealers Association (NADA). The underlying factors that have kept sales higher than 17 million for the past three years – a growing economy, easy access to credit, and new features that attract new buyers – remain in place. Experts expect sales to plateau at historically high levels for the foreseeable future.
NADA Chairman Mark Scarpelli says, “Every dealer in America, myself included, would be thrilled with a seasonally adjusted annualized rate of above 16 million. Because it means that, one, the market is stable, and two, that demand is still healthy. And both factors are true in this case. We are looking at a stable market where demand – particularly for light trucks, SUVs, and crossovers – continues to be very healthy.”
General Motors executives predict 2018 sales in the high 16 million range, in line with NADA’s outlook. Company chief economist Mustafa Mohatarem says in addition to increasing strength of the general economy, low unemployment is leading to wage growth, making it easier for individuals to add car payments this year. On top of that, the recently passed tax cuts could boost take-home pay for many.
“Many consumers will see their take-home pay rise because of tax reform. That will keep the broad economy growing, and help keep sales at very healthy levels even as the Fed increases interest rates,” Mohatarem says.
Interest rates, low since the recession, are rising. However, monthly car payments haven’t increased because many buyers are opting for longer-term loans – credit bureau Experian reports an average loan term of 69 months, an all-time record. If rates climb sharply, monthly payments could rise, even with the longer loan terms, but Mohatarem and others don’t expect dramatic increases.
Automakers known for fuel efficiency, such as Toyota and Hyundai/Kia, have traditionally earned most of their sales from smaller cars. With gas prices low, consumer tastes continue to shift toward crossovers, SUVs, and trucks, so those companies are radically altering their lineups. Toyota, for example, has expanded production of its RAV4 SUV and plans new Lexus crossovers for this year.
Lexus General Manager Jeff Bracken says, “In 2018, Lexus dealers will have even more options for customers as we bring 15 all-new and special edition models to the market.”
As Pierre Labat (see sidebar, page 20), vice president of global automotive for Novelis Aluminum, notes, at the Los Angeles Auto Show, the vast majority of new production vehicles were SUVs and crossovers. Companies also showed off several electric vehicles and plug-in hybrids, but those were mostly concept cars that won’t go into production this year.
GM’s U.S. sales chief Kurt McNeil says that the automaker has been planning for the demand shift, adding new crossovers to Buick and Cadillac and refreshing Chevrolet’s large-vehicle lineup.
“We are starting 2018 with very lean inventories for such a strong industry, and we see more room to grow because Chevrolet, Buick, and GMC will have a full year of sales of their all-new crossovers, and we are going to launch the industry’s best full-size pickups,” McNeil says.
Though sales of all-electric vehicles (EVs) and plug-in hybrid electric vehicles (PHEVs) are expected to remain a tiny fraction of the industry in 2018 (1.1% of sales in 2017), increasing availability could lead to big sales increases. In January, Tesla Motors lowered production targets for the Model 3 EV, saying it won’t hit 5,000-vehicles-per-week targets before the end of June. But even with that miss, Tesla’s sales should more than double from the 101,312 models delivered to customers in 2017. In addition, GM’s Chevy Bolt EV hit a milestone in December with more than 3,000 sales.
Nissan plans to launch an updated Leaf EV in 2018, Chrysler Pacifica Hybrid PHEV minivans will have a full year’s availability (the vehicle started shipping in April and had limited availability until the third quarter of 2017), and Toyota’s Prius Prime PHEV will enter its second full sales year with higher availability.
As the Center for Automotive Research’s Brett Smith notes, Ford sells 6.3 F-Series trucks for every EV or PHEV sold, but the growth rate of electrified vehicles is increasing, and many industry observers expect dramatic gains throughout the next several years.
Carefully analyzing feature measurement data can tell manufacturers when a product is falling out of spec. Monitoring quality variations over time can help experts identify process problems, telling manufacturers where to invest in new equipment or systems.
But post-processing measurement data has limits. Users can tell when a product is nearing the limitations of its specs, but they often don’t know exactly what’s causing the drift. It could be tool wear, heat generated in a continuous manufacturing process, variations in base materials, or other causes.
Loads of analytical data are great, but identifying root causes generally requires an up-close look at processes. Seeing the process over time and in real time can provide insights far greater than many charts or tables. So, many manufacturers have been adding cameras to their process-monitoring systems. However, managing that image data and making it usable is a huge task, says Mat Daniel, vice president of operations for Sciemetric Inc., a Canadian process control software company recently purchased by the TASI Group in Cincinnati, Ohio.
“The cost of cameras, the cost of processing, the ability of software to do some really smart stuff, and be a really reliable source of data, that’s been driving what we do,” Daniel says.
As digital cameras have become cheaper and more capable, they’ve become more useful in industrial settings. The challenge, he adds, is tying those images to the process data already being collected.
Matching images to processes
Call it the Industrial Internet of Things (IIoT) or Industry 4.0, the push to connect production machinery with monitoring systems has been a massive trend the past decade. Manufacturers are collecting more data from equipment, hoping to glean important information that will boost productivity and quality.
“The knee-jerk reaction when people start getting into Industry 4.0 is to collect all the data you can and figure out what to do with it later. A lot of people are collecting data into these big, huge databases with no plans for how to use it,” Daniel says. “There’s a better way of doing things, and vision can be a very important part of it.”
Sciemetric’s QualityWorX Vision software pairs images from cameras mounted in and around production machines with process data from that equipment. If quality managers at a manufacturing plant notice process changes, such as higher cutting forces on a machining center, they can call up images from that machine in real time to identify what’s causing the variation.
“We can merge image data into a consolidated part history,” Daniel says. “With that, you can compare part images from early in a part’s production history to the ones coming off the line months or years later. We take the complexity of managing all of that image data out of the manufacturer’s hands.”
A problem with recording image data is that it’s massive compared to process information. Measurements from a scanner or touch probe generally get recorded as simple text points measured in bytes or kilobytes. Modern digital cameras generate megabytes of data in seconds.
For example, an automotive engine producer can generate 40MB of image data every 40 seconds as equipment monitors machining processes on bore honing and other machining steps. At full resolution, that 1MB-per-second rate produces nearly 29GB of data in a single eight-hour shift. Manufacturers that want that level of data collection have to figure out where to store that amount of information in the short run, how to transfer such large files to servers, and how much data they want to keep long term.
Daniel says Sciemetric’s engineers work with customers to determine the best approaches to those storage and processing questions on a case-by-case basis. A few best practices have emerged, but the answer generally comes down to each company’s goals for its quality initiatives.
Full, raw image data tends to have a short shelf life, he explains – typically about three months’ worth of raw footage. Quality problems that require enlarging images up to 800x magnification to look for detailed wear or production marking tend to show up early in the process. Once that window has passed, most manufacturers want reference data to compare parts and processes through time. In those cases, a lower-resolution photo works well.
“Capturing low-res versions in one database, alongside the process data, eliminates the need for huge raw files,” Daniel says.
Sciemetric’s software saves JPEG images at about 75% image quality to a database, tying those images to process information. Daniel explains that the lower-resolution images tend to be more than adequate for quality control and legal/corporate/warranty accountability.
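The arithmetic behind that trade-off is simple to sketch. The figures below are assumptions drawn from the example above – roughly 1MB per second of raw image data over an eight-hour shift – plus a purely hypothetical 20:1 size reduction for the low-resolution JPEGs:

```python
# Back-of-the-envelope storage estimate for image-capture data.
# Assumed: ~1MB/s of raw image data (per the engine-plant example above),
# an 8-hour shift, and an illustrative 20:1 reduction for low-res JPEGs.

def shift_storage_mb(rate_mb_per_s: float, shift_hours: float = 8.0) -> float:
    """Raw image data generated over one shift, in megabytes."""
    return rate_mb_per_s * shift_hours * 3600

raw_mb = shift_storage_mb(1.0)    # 28,800 MB, i.e. nearly 29GB per shift
compressed_mb = raw_mb / 20       # hypothetical 20:1 JPEG reduction

print(f"raw: {raw_mb / 1000:.1f} GB, compressed: {compressed_mb / 1000:.2f} GB")
```

Even a modest compression ratio turns an unmanageable per-shift raw volume into something a single database can retain for years alongside the process data.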
Vision management system design
System architectures vary depending on company goals, existing hardware, and other factors. Daniel says vision management systems tend to have similar components:
Cameras – placed inside equipment, within quality stations, or along material handling systems; capture image data as parts move by for post-processing
Gateways – computers connected to cameras and process monitors; collect image and process data
Network – typically wired or wireless Internet connections between gateways and central servers
Servers – computer systems that can be on-site, off-site, or in the cloud; collect data from gateways for processing
“If the network’s down, the gateway allows you to cache the data. You can store images at the gateway level, so you’re not moving these huge data files across the network,” Daniel says.
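The store-and-forward behavior Daniel describes can be sketched in a few lines. This is an illustrative toy, not Sciemetric's implementation – the `Gateway` class, its methods, and the in-memory stand-in for the central server are all invented for the example:

```python
import queue

class Gateway:
    """Toy sketch of a floor-level gateway: cache records locally,
    forward them to the server only when the network is available."""

    def __init__(self):
        self.cache = queue.Queue()   # local buffer for image/process records
        self.sent = []               # stand-in for the central server

    def collect(self, record, network_up: bool):
        self.cache.put(record)       # always land data locally first
        if network_up:
            self.flush()

    def flush(self):
        # Drain the local cache to the server once connectivity returns.
        while not self.cache.empty():
            self.sent.append(self.cache.get())

gw = Gateway()
gw.collect({"part": "A1", "img": "a1.jpg"}, network_up=False)  # cached only
gw.collect({"part": "A2", "img": "a2.jpg"}, network_up=True)   # flush both
print(len(gw.sent))
```

Caching at the gateway means a network outage costs nothing but latency: no image or process record is lost, and the large files cross the network only when it is up.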
How users allocate funds and equipment to the various parts of the image system can determine overall cost and functionality. Smaller manufacturers that operate one or two plants can invest more heavily in gateways, using floor-level machines to collect and analyze data. Network and server spending can be minimized in those cases, as smaller companies can use those portions of the system for backup and occasional analysis.
Large, multi-national users will likely invest more in network and servers, giving quality managers in corporate offices the ability to pull up real-time images and data of processes running simultaneously in the United States, Europe, and Asia.
To get the most out of QualityWorX Vision or any quality management system, Daniel says many companies are moving responsibility for data from information technology (IT) departments to stand-alone data processing divisions. Those data managers are the ones who typically decide what data to collect, how to design the systems, and what improvements the company should pursue.
“It takes someone with a vision for data to drive the process. Typically, a lot of organizations have been good at building equipment or systems monitoring,” Daniel says. “But we’re seeing more need for a data person who thinks through how to architect and manage their data strategically.”
Behind the glory of being an innovator is the arduous process of being the first to figure out how to deliver something new. Michigan-based Plasan Carbon Composites knows this firsthand as a supplier of carbon-fiber components for the automotive industry.
Carbon fiber has been around for decades in expensive supercars and airplanes, but it’s still relatively new to mass automotive production. The composite material uses a base of plastics that are mixed, spun into fibers, and carbonized through a heating process that can exceed 5,000°F.
Carbon fiber’s light weight and high strength create higher-performing wind-turbine blades and commercial aircraft that are 20% lighter than aluminum-based designs. Applied to automotive vehicles, carbon fiber lowers vehicle mass, improving gas mileage.
A series of technical manufacturing challenges has limited its appeal to mass-market producers.
Plasan is helping push automakers toward higher adoption. The company already produces carbon-fiber components such as hoods, roofs, and side panels for popular performance vehicles such as the Chevrolet Corvette, Dodge Viper, and Ford Mustang. The company is mixing research and development, proactive development of engineered solutions, and automated manufacturing processes to pioneer carbon fiber mass production.
Traditionally, large pressure vessels known as autoclaves have been used for curing carbon fiber. But cycle times can be as long as 90 minutes, making them too inefficient to meet production targets for vehicles such as the Corvette Stingray.
The solution – first-of-their-kind pressure presses at a new 200,000ft² production facility outside Grand Rapids, Michigan. The high-speed presses abandon the autoclaves’ convection mass-heating process and use proprietary technology to directly heat the tool mold surface for faster heat transfer to the carbon fiber.
The presses reduced curing time from 90 minutes to less than 20 minutes, among the fastest curing times in the world for the production of Class A, carbon-fiber vehicle components.
While cure times improved dramatically, pressure presses introduced new, complex process variables not experienced with autoclaves. The presses had no way of capturing and logging process data, so production personnel could not analyze the history of press cycles or use process data to identify product issues.
“The press process was highly variable, and we needed to straighten it out,” says Danny McKinnon, controls engineer for Plasan Carbon Composites. “My biggest challenge was being blind to what was happening on the off shifts. I couldn’t see what happened, and that limited my ability to troubleshoot and resolve issues.”
Variability in the presses led to higher-than-expected scrap and quality defects within the parts. This slowed production and created significant production losses, with scrap carbon-fiber components costing the company capacity and causing delivery issues.
Plasan turned to Rockwell Automation to help troubleshoot the problems. The goal was to implement software that could serialize and track each vehicle part going through the presses, and record and report process parameters within the equipment. The target – drive scrap to less than 4%, down from autoclaves’ historical 10% rate.
Plant managers implemented Rockwell Automation’s FactoryTalk Historian Site Edition (SE) and FactoryTalk VantagePoint EMI (enterprise manufacturing intelligence) software. FactoryTalk Historian SE captures process variables as each product goes through any of the facility’s seven presses. A serial number for each product, created by Plasan’s manufacturing execution system (MES), allows the Historian to associate process data to each part. VantagePoint EMI software uses this information to deliver real-time quality and performance dashboards and Microsoft Excel-based production reports.
With his process and quality teams, McKinnon can use the software to monitor products and 15 press process settings. If quality issues arise, managers can review process settings to investigate and remedy potential issues. Additionally, operations personnel can use daily production reports for each press to review key metrics, such as average cycle times and overall equipment effectiveness (OEE).
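The core idea – keying every press cycle's process settings to the part's serial number so any cycle can be reviewed later – can be illustrated generically. This sketch is not FactoryTalk's actual API; the serial numbers, setting names, and values below are invented for the example:

```python
# Generic illustration of serialized process tracking (not FactoryTalk's API):
# each part's serial number maps to the press settings recorded for its cycle.

from statistics import mean

history: dict[str, dict[str, float]] = {}   # serial number -> process settings

def record_cycle(serial: str, settings: dict[str, float]) -> None:
    history[serial] = settings

# Hypothetical cycles for two parts:
record_cycle("PCC-0001", {"cure_temp_F": 290.0, "cycle_min": 19.5})
record_cycle("PCC-0002", {"cure_temp_F": 310.0, "cycle_min": 18.8})

# A daily-report metric such as average cycle time falls out directly:
avg_cycle = mean(c["cycle_min"] for c in history.values())
print(f"average cycle: {avg_cycle:.2f} min")
```

With the data keyed this way, a quality manager investigating a scrapped part can pull the exact settings of its cycle rather than guessing from shift-level averages.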
With the ability to track parts and processes, scrap rates at Plasan fell more than 50%.
“There are so many variables to our process that it had seemed almost impossible to figure out,” McKinnon says. “The Historian software helped our team track each variable to finally see a pattern as to why we were getting scrap parts. It’s greatly reduced our scrap in the press room and is setting a new standard for quality in our plant.”
Operators also use the software to better coordinate production by reviewing cycle times to understand exactly how long parts should be in the presses, then use that information to set up schedules and staffing.
More than 70,000 carbon-fiber vehicle parts have been serialized and stored since implementing the software, helping the plant produce more than 400 carbon-fiber components per day.
The company is also extending Historian software use to other areas of production. The software tracks temperature and humidity in the plant because raw material must be kept cool and within specific temperature and moisture ranges. Otherwise, the material can dry out and become unmanufacturable.
“Temperature changes can create dry lines of the individual carbon fibers, which greatly affects part quality,” McKinnon says. “We then have to address imperfections in our finishing area by Dremeling out the dry lines and filling them back in, which is a long and tedious process. This is just another area where the Historian software can help us refine our processes and drive up quality.”
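A range check of the kind described – flagging plant conditions that drift outside the raw material's spec – is simple to sketch. The limit values below are invented for illustration; in practice they would come from the material supplier's specification:

```python
# Hypothetical environmental check for carbon-fiber raw material storage.
# Spec limits are invented for this example.

SPEC = {"temp_F": (60.0, 75.0), "rh_pct": (30.0, 55.0)}

def check_environment(reading: dict) -> list:
    """Return an alert string for every reading outside its spec range."""
    alerts = []
    for key, (lo, hi) in SPEC.items():
        value = reading[key]
        if not lo <= value <= hi:
            alerts.append(f"{key}={value} outside {lo}-{hi}")
    return alerts

print(check_environment({"temp_F": 80.5, "rh_pct": 40.0}))
```

Feeding historian data through checks like this turns after-the-fact rework in the finishing area into an alarm that fires while the material can still be saved.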
For McKinnon, additional uses for the software have been a bonus.
“I initially wanted a Historian solely so I could look at last night’s cycles to see what happened,” he says. “I wasn’t even thinking about what else historical analysis could do for us. But getting our hands on it and exploring its different uses has opened my eyes to what we can really do.”
Richard Childress Racing (RCR), an elite NASCAR organization founded in 1969, fields three teams in the NASCAR Monster Cup Series and five teams in the NASCAR XFINITY Series. It has accumulated more than 200 victories and 17 championships across NASCAR’s top series, including two Daytona 500 wins and three Brickyard 400 victories.
Achieving that kind of success demands a team of hundreds of skilled people: drivers, crews, management, clerical workers, and manufacturing personnel.
When the car doesn’t succeed in competition, RCR Manufacturing Manager Rocky Helms says, “You don’t want someone to say, ‘Well, we ordered a part that probably would have made us run faster, but the manufacturing department didn’t get it done in time. It’s your fault we didn’t run faster.’”
After a car is built, the team’s machine shop constantly modifies and improves components. Team engineers and track personnel devise performance-improving upgrades, or find weaknesses that can be remedied with a new part geometry.
“A part that makes you a half-a-second faster each lap can be the difference between winning the race or finishing 20th,” Helms says. “It might take updates of 10 different parts to gain two-or-three tenths of a second on the track.”
Intense competition has driven race teams to assemble sophisticated in-house CNC machining operations to produce increasingly complex parts, maximize control over the processes, and quickly update parts.
“Every machine tool, every day, is running a part that usually has to go to the race track or to a test in the next couple of days,” Helms says. “Machine utilization is important for any manufacturer, but every part we make is needed the next day or sometimes the same day.”
RCR’s team uses CNC machining simulation software to speed machining output by revealing programming errors before they cause delays on the shop floor. It initially employed a simulation program that was embedded in the shop’s computer-aided design/computer-aided manufacturing (CAD/CAM) package. Using cut-line data, the simulator validated the CNC cutting program, sent it through a post-processor program, and translated it to G-code. However, the G-code sometimes produced unexpected machine motion or other errors, problems not apparent until the code got to the machine.
When operators edited defective G-code at the machine, they couldn’t simulate and test the result because the shop’s simulation program didn’t analyze G-code. In addition, the simulation package did not cover a machine’s full range of motion, fixturing, or components; and simulated tool holders did not match those in use.
“You would spend at least one to two hours after you had posted doing editing and then make sure you had everything corrected the best you could,” Helms says.
To minimize the time lost waiting for program corrections, RCR invested in Spring Technologies’ NCSIMUL, a machine simulation software package that analyzes CAM programs and indicates errors so they can be corrected before the post-processor generates G-code. The software also examines the G-code to determine how the program performs in relation to the part, the machine setup, and the machine components. Errors are flagged so users can correct the code and eliminate potential crashes. The software compares the simulated part geometry to the original CAD model, based on the toolpaths and a kinematic model of the machine tool involved.
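One of the simplest classes of error such a simulator catches is a move whose target coordinates exceed the machine's travel. The toy checker below illustrates the idea only – it is not how NCSIMUL works internally, and the axis limits and sample program are invented:

```python
import re

# Toy G-code check: flag linear/rapid moves whose target coordinates fall
# outside the machine's travel. Axis limits (mm) are made up for the example.

LIMITS = {"X": (0.0, 400.0), "Y": (0.0, 300.0), "Z": (-150.0, 0.0)}
WORD = re.compile(r"([XYZ])(-?\d+\.?\d*)")   # axis letter + signed value

def out_of_travel(gcode: str) -> list:
    errors = []
    for line_no, line in enumerate(gcode.splitlines(), 1):
        for axis, value in WORD.findall(line):
            lo, hi = LIMITS[axis]
            if not lo <= float(value) <= hi:
                errors.append(f"line {line_no}: {axis}{value} outside {lo}..{hi}")
    return errors

program = "G0 X10.0 Y10.0\nG1 X450.0 F200\nG0 Z-160.0"
for err in out_of_travel(program):
    print(err)
```

A real simulator goes far beyond this – modeling fixturing, tool holders, and full machine kinematics – but the payoff is the same: the error surfaces on a monitor instead of in a crashed spindle.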
Helms says simulation has become more important as part modifications have evolved from purely structural changes to include aerodynamic factors.
“Our part geometries have become way more complex in the last two to three years. Everything on the car has become aero-dependent,” Helms says. “We’re starting to look at the underside of the car for aerodynamic advantages. Many chassis components previously were designed purely to meet structural or geometric needs, but now they also are modified based on how air flows around them.”
Engineers attach sheet metal and other materials to cars in wind tunnels and use computational fluid dynamics (CFD) software analysis to find changes that make the car faster. When an experimental part produces an improvement, the engineers make a 3D CAD model of it and Helms’ shop machines it. RCR uses 3D printers to make non-structural prototype parts that, if successful, can be machined in steel, aluminum, or titanium for racing.
Free-form contours of aerodynamic parts require complex 3D surfacing machining programs. To support those complex designs, RCR added a 9-axis Okuma Multus 4000 mill-turn machine. The machine tool has an upper milling head that can rotate 240° and a bottom turret with tools for turning. The machine can transfer parts from main spindle to sub-spindle and perform simultaneous machining on both spindles.
“None of the programmers are familiar with this type of machine, so it’s a lot better to simulate the process through NCSIMUL instead of getting it to the machine and implementing it in a piece of material,” Helms says. “With the software, the programmer can see errors on the monitor and correct them, versus having an operator crash a $750,000 machine.”
He adds that the complexity of aerodynamically optimized parts, coupled with the machine’s sophistication and part programming options, demand simulation.
“When you get into surfacing, it’s easy to make a small programming mistake and not notice it until you’re holding the part in your hand. The NCSIMUL representation can be compared to your (CAD) model to make sure you got everything right,” Helms explains.
Helms has seen massive growth in NASCAR competition. Intensifying competitiveness has prompted RCR manufacturing operations to grow from two machines and two operators to 18 machines and a staff of 22 that works two shifts. Amid that fast-moving environment, NCSIMUL simulation software is helping RCR meet competitive challenges and continue its success.
“When I got into racing 23 years ago, there were 10 competitor cars that legitimately could win a race,” Helms says. “Now, any of the top 25 to 30 cars could win.”
The U.S. Department of Transportation (DOT) is requesting public input on several proposals that could increase the number of autonomous vehicles on public roads. The agency is developing the Federal Automated Vehicle Policy (FAVP) 3.0, “A Vision for Safety 3.0.” It’s a series of rules and guidelines that could determine how technology companies and automakers incorporate technology into future vehicles.
“Autonomous vehicle technologies will have a tremendous impact on society in terms of safety, mobility, and security,” Transportation Secretary Elaine L. Chao says. “Policy makers need to preserve the creativity and innovation that is part of the American tradition and allow innovation to flourish.”
FAVP 3.0 will emphasize a unified, intermodal approach to automated driving systems (ADS) policy. It aims to enable the safe integration of automated surface transportation systems, including cars, trucks, light rail, infrastructure, and port operations.
Autonomous driving policies and proposals have been scattered across federal agencies and among different technical issues within individual agencies. Chao’s office has consolidated the various policy proposals on one DOT website, making it easier for interested parties to participate in public comments.
Regulators certify Cummins Westport’s natural gas commercial truck engine
The U.S. Environmental Protection Agency (EPA) and California’s Air Resources Board (CARB) have certified the emissions levels of Cummins Westport’s 2018 ISX12N natural gas commercial truck engine.
The ISX12N meets CARB’s optional Low NOx standard of 0.02 grams per brake horsepower-hour (g/bhp-hr), a 90% reduction from engines operating at the EPA NOx limit of 0.2g/bhp-hr. The ISX12N also meets 2017 EPA greenhouse gas emission (GHG) requirements. The ISX12N is the first Class 8 truck engine for larger heavy-duty vehicles to certify to the 0.02g/bhp-hr optional standard.
ISX12N natural gas engines will be available with ratings from 320hp to 400hp and up to 1,450 lb-ft of peak torque. The ISX12N is designed for line haul, regional haul, refuse, and vocational trucks. Production will begin in February 2018.
ISX engines offer customers the choice of using compressed natural gas (CNG), liquefied natural gas (LNG), or renewable natural gas (RNG) as a fuel. RNG is pipeline-quality natural gas produced from the decomposition of organic waste, which can come from dairy farms, landfills, and urban waste treatment plants.