Abstract

ChatGPT captured the global imagination and triggered mainstream AI adoption, but it represents merely one visible peak in a vast technological landscape that most people, including many working in tech, fundamentally misunderstand. Behind every ChatGPT query lies an intricate stack: specialized AI accelerators (GPUs, TPUs, emerging neuromorphic chips), massive data centers consuming gigawatts of power, sophisticated training pipelines requiring months and millions of dollars, alignment techniques like RLHF that shape model behavior, and deployment infrastructure serving hundreds of millions of users simultaneously. Beyond LLMs, a parallel revolution is unfolding: computer vision systems achieving superhuman performance in medical imaging, reinforcement learning enabling humanoid robots, generative models designing new materials and therapeutics, and agentic systems that reason, plan, and act autonomously. Understanding this complete landscape, from the technology stack and hardware constraints to the market dynamics and geopolitical implications, is essential for anyone seeking to build impactful AI systems or companies, yet that understanding remains fragmented across research papers, industry reports, and insider knowledge that is rarely documented publicly.

This lecture provides that comprehensive view, synthesized from direct experience across the full AI value chain. From my time at Samsung Semiconductor, I will decode the hardware layer: why GPU architectures proved transformative for deep learning, how memory bandwidth bottlenecks constrain model scaling, why the semiconductor supply chain has become a geopolitical battleground, and what next-generation AI accelerators promise. From Amazon, I will illuminate production ML systems at scale: how recommendation engines actually work, why A/B testing and production monitoring matter more than model architecture, how to generate $200M+ in revenue through ML systems, and what separates research prototypes from products serving hundreds of millions of users. From Erudio Bio, I will reveal the research-to-startup journey in AI-powered biotech: why AlphaFold 3’s breakthrough creates a $1 trillion market opportunity, how AI enables new diagnostics and therapeutics impossible with traditional methods, what regulatory and clinical validation actually requires, and how we secured Gates Foundation funding to scale cancer diagnostics globally. Each domain offers complementary lessons: semiconductors teach the hardware constraints that bound all AI systems, e-commerce demonstrates production engineering at massive scale, and biotech shows how AI creates entirely new categories of value.

For KSEA members, Korean-American scientists and engineers navigating careers across academia, industry, and entrepreneurship, this lecture addresses the questions that matter most: How do you identify genuine AI opportunities versus hype? What technical skills remain valuable as tools rapidly evolve? Should you specialize deeply or develop broad cross-domain expertise? How do you transition from research to startups, or from large companies to founding your own venture? Having made multiple pivots myself (Samsung → Amazon → Erudio Bio, Korea → US, math/engineering → entrepreneurship/business development/technical sales), I will share hard-won insights about what actually matters: not predicting which specific technology wins, but developing frameworks for evaluating opportunities as the landscape shifts; not maximizing depth in one narrow area, but becoming a “connector” who synthesizes insights across domains; not waiting for permission or perfect information, but cultivating the judgment to act decisively under uncertainty. The goal is not inspiration through success stories but practical wisdom for navigating the AI revolution’s opportunities and pitfalls, because while the technology is new, the patterns of innovation, the market dynamics, and the human challenges of building impactful ventures remain timeless.