Intern - LLM Serving System Research (Summer 2026)
SK hynix America
About this role
An internship opportunity at SK hynix America focused on research into large language model (LLM) serving systems, particularly systems built on CXL pooled memory. The role involves analyzing distributed LLM serving systems, reading and implementing research papers, and contributing to system improvements.
Skills
Qualifications
About SK hynix America
skhynix.com
SK hynix is a South Korean semiconductor company and one of the world’s leading makers of memory and storage solutions, primarily producing DRAM, NAND flash, SSDs, and related memory modules for PCs, servers, mobile devices, and automotive applications. The company focuses on advanced memory technology development, large-scale manufacturing, and integrated system solutions to support data centers, consumer electronics, and industrial customers worldwide. Known for heavy R&D investment and global supply capabilities, SK hynix emphasizes innovation, reliability, and sustainability under the tagline “Technology Innovator for a Better World.”
Recent company news
SK hynix America Revenue Soars 75% in One Year
1 week ago
Exclusive: SK Hynix speeds up new chip fab opening to meet memory demand, executive says
Jan 14, 2026
Chey Tae-won becomes first SK hynix America chairman
Nov 14, 2025
SK hynix Wins GSA Award 2025 For Financial Management
Dec 6, 2025
SK Hynix mulls AI strategy hub in US
1 month ago
About SK hynix America
Headquarters
San Francisco, CA
Company Size
201-500 employees
Founded
2018
Industry
Technology
Glassdoor Rating
4.2 / 5
Salary
$52k – $100k
per year
More jobs at SK hynix America
Similar Jobs
Research Associate (Computer Science/Artificial Intelligence)
Nanyang Technological University Singapore
2026 Intern, Speech Research (Summer/Fall)
Samsung Research America
Staff Engineer - Embedded Firmware Development - CXL / PCIe / Memory
Marvell Technology
Machine Learning Intern - Dynamic KV-Cache Modeling for Efficient LLM Inference
d-Matrix
Principal Software Engineer – Large-Scale LLM Memory and Storage Systems
NVIDIA
LLM Engineers
Masterworks