As the adoption of generative and agentic AI accelerates, the challenges for memory as a key enabler of AI/ML processing architectures continue to grow. Balancing the demands for ever-greater bandwidth and capacity with the needs of power efficiency, thermal management and increased reliability is increasingly difficult. Continued advances in high-performance HBM and GDDR memories, as well as mainstream DDR and LPDDR memories, remain a strategic industry imperative. In addition, a suite of new technologies including multiplexed rank DIMMs (MRDIMM), CXL and processing-in-memory is needed to meet upcoming AI requirements. In this panel, we'll discuss the evolution of memory technologies and the challenges the industry faces on the road ahead for future AI chips and systems.

Steven Woo
I was drawn to Rambus to focus on cutting-edge computing technologies. Throughout my 15+ year career, I've helped invent, create and develop means of driving and extending performance in both hardware and software solutions. At Rambus, we are solving challenges that are completely new to the industry and that arise from highly sophisticated, advanced deployments.
As an inventor, I find myself approaching each challenge like a room filled with 100,000 puzzle pieces, where it is my job to figure out how they all go together without knowing what the finished picture is supposed to look like. For me, the job of finishing the puzzle is as enjoyable as the actual process of coming up with a new, innovative solution.
For example, RDRAM®, our first mainstream memory architecture, was implemented in hundreds of millions of consumer, computing and networking products from leading electronics companies including Cisco, Dell, Hitachi, HP and Intel. We did a lot of novel things that required inventiveness: we pushed the envelope and delivered state-of-the-art performance without making actual changes to the infrastructure.
I'm excited about the new opportunities as computing becomes more and more pervasive in our everyday lives. In a world full of data, my job and my fellow inventors' job will be to stay curious, maintain an inquisitive approach and create solutions that are technologically superior and that seamlessly intertwine with our daily lives.
After an inspiring work day at Rambus, I enjoy spending time with my family, being outdoors, swimming, and reading.
Education
- Ph.D., Electrical Engineering, Stanford University
- M.S., Electrical Engineering, Stanford University
- Master of Engineering, Harvey Mudd College
- B.S., Engineering, Harvey Mudd College

Taeksang Song
Taeksang is a Corporate VP at Samsung Electronics, where he leads a team dedicated to pioneering cutting-edge technologies including CAMM, MRDIMM, the CXL memory expander, fabric-attached memory solutions and processing near memory to meet the evolving demands of next-generation, data-centric AI architectures. He has 20 years of professional experience in memory and sub-system architecture, interconnect protocols and system-on-chip design, and in collaborating with CSPs to enable heterogeneous computing infrastructure. Prior to joining Samsung Electronics, he worked at Rambus Inc., Micron Technology and SK hynix in lead architect roles for emerging memory controllers and systems.
Taeksang received his Ph.D. from KAIST, South Korea, in 2006. He has authored or co-authored over 20 technical papers and holds over 50 U.S. patents.