Opening Speech – Challenges & Opportunities of LLM and Generative AI: A Hardware Perspective


Recent advancements in Large Language Models (LLMs) and Generative Artificial Intelligence (Generative AI) have created exciting opportunities for developers, as exemplified by ChatGPT, which can mimic human dialogue and decision-making. However, the enormous computing power and energy these models demand from the underlying hardware pose considerable cost and engineering challenges, owing to their huge parameter counts and complex neural network architectures.

This talk will explore the potential of designing more efficient hardware for the deployment and inference of large language models from an integrated circuit (IC) perspective. Methods of AI model compression, such as quantization, pruning, and knowledge distillation, will be covered, along with AI chip design based on Computing-in-Memory (CIM), Deep-Learning Accelerators (DLAs), and Chiplets. The talk will also discuss the hardware and software co-design methodology and tools needed to improve design quality and productivity.
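As a purely illustrative sketch of one compression method named above (not the speaker's specific technique), symmetric per-tensor int8 post-training quantization maps floating-point weights onto 256 integer levels, shrinking model storage roughly 4x at a small accuracy cost:

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric per-tensor int8 quantization: map floats to [-127, 127]."""
    scale = np.max(np.abs(weights)) / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float weights from the int8 codes."""
    return q.astype(np.float32) * scale

# Hypothetical example weights, just to exercise the round trip.
rng = np.random.default_rng(0)
w = rng.standard_normal((4, 4)).astype(np.float32)
q, s = quantize_int8(w)
w_hat = dequantize(q, s)
# Rounding error per element is at most half a quantization step (scale / 2).
print(np.max(np.abs(w - w_hat)) <= s / 2 + 1e-6)
```

In practice, LLM deployments typically use finer-grained (per-channel or per-group) scales and lower bit widths, but the storage and bandwidth savings follow the same principle.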


Dr. Ming-Der Shieh

CTO of Electronic and Optoelectronic System Research Laboratories (EOSL)

Industrial Technology Research Institute (ITRI)
