Solution Study
Monday, December 08
03:15 PM - 03:40 PM
Live in Dearborn, Michigan
The automotive industry is undergoing a seismic shift as Large Language Models (LLMs), Vision Language Models (VLMs), and multimodal AI applications move from research labs into vehicles, factories, and customer experiences. These technologies promise transformative capabilities, from predictive maintenance and autonomous driving to intelligent voice assistants and real-time visual inspection, but they also introduce unprecedented strain on memory subsystems. This talk explores how the explosion of multimodal workloads is reshaping memory architectures across the automotive value chain. We’ll examine the specific challenges Tier 1 suppliers and OEMs face in deploying these models at scale, including bandwidth bottlenecks, latency constraints, and thermal limits. We’ll also highlight how the market is responding, with innovations in memory hierarchy and system-level optimization, to meet the demands of AI-driven automotive platforms.
Obi Okeke is a seasoned technology leader with over 25 years in semiconductors and two decades driving automotive innovation. As Director of Business Development for the Americas and Global OEMs at Micron, Obi partners with automakers and Tier 1 suppliers to align advanced memory technologies (LPDDR5/6, GDDR6, HBM, UFS, and NAND/SSD) with emerging SDV architectures, ADAS compute, and immersive infotainment workloads. He specializes in translating system requirements such as bandwidth, latency, endurance, safety, and security into scalable memory solutions that meet ISO 26262 and cybersecurity standards. Renowned for bridging technical insight with business strategy, Obi collaborates with engineering teams and ecosystem partners to accelerate adoption of next-generation memory for intelligent, connected vehicles. He holds an Executive MBA from the University of Ottawa and an M.S. in Integrated Manufacturing Systems from the University of Birmingham, UK.
