Research Scientist Intern, Systems ML - SW/HW Co-Design - Inference

Menlo Park, CA
The AI System SW/HW Co-design team's mission is to explore, develop, and help productize high-performance software and hardware technologies for AI at datacenter scale. We achieve this via concurrent design and optimization of many aspects of the system, such as models, algorithms, numerics, performance, and AI hardware including compute, networking, and storage. In essence, we drive the AI HW roadmap at Meta and ensure our existing and future AI workloads and software are well optimized for and suited to the hardware infrastructure.

Meta is seeking Research Scientist Interns to join our AI & Systems Co-Design HPC & Inference team to drive the definition of our next-generation AI inference and training system architectures. The team works across hardware types (GPUs, ASICs), model types (recommendation models, LLMs, LDMs), and workloads (training and inference). The team drives innovation on:
  • Low-precision numerics for training and inference
  • ML operator/kernel optimizations for training and inference
  • End-to-end inference performance across model, software, system, and accelerator
  • Performance modeling and simulation
  • HPC software optimizations
  • GPU/ASIC optimizations
  • Software libraries, models, and frameworks

In this role, you will work cross-functionally with internal software and platform engineering teams to understand workload and infrastructure requirements. You will drive technology pathfinding, roadmap definition, and co-design activities to deliver new capabilities and efficient systems for our fleet. You will also work with external industry partners to influence their roadmaps and build the best products for Meta's infrastructure. Join our team and help shape one of the largest infrastructure footprints, which powers Meta's applications used by billions of people across the globe.

Our team at Meta AI offers internships lasting twelve (12) to sixteen (16) weeks, with various start dates throughout the year. To learn more about our research, visit https://ai.facebook.com.
Research Scientist Intern, Systems ML - SW/HW Co-Design - Inference Responsibilities
  • Develop tools and methodologies for large scale workload analysis and extract representative benchmarks (in C++/Python/Hack) to drive early evaluation of upcoming platforms.
  • Analyze evolving Meta workload trends and business needs to derive requirements for future offerings. Apply in-depth knowledge of how AI/ML systems interact with the surrounding compute and storage systems.
  • Utilize extensive understanding of CPUs (x86/ARM), GPUs (Nvidia/AMD/Intel), collectives, and systems to identify bottlenecks and enhance product/service efficiency. Collaborate closely with software developers to re-architect services, improve the codebase through algorithm redesign, reduce resource consumption, and identify hardware/software co-design opportunities.
  • Identify industry trends, analyze emerging technologies and disruptive paradigms. Conduct prototyping exercises to quantify the value proposition for Meta and develop adoption plans. Influence vendor hardware roadmap and broader ecosystem to align with Meta's roadmap requirements.
  • Work with Software Services, Product Engineering, and Infrastructure Engineering teams to find the optimal way to deliver the hardware roadmap into production and drive adoption.
Minimum Qualifications
  • Currently has, or is in the process of obtaining, a PhD in Computer Science or a related STEM field.
  • Must obtain work authorization in the country of employment at the time of hire, and maintain ongoing work authorization during employment.
  • Experience with hardware architecture, compute technologies, and/or storage systems.
Preferred Qualifications
  • Intent to return to the degree program after the completion of the internship/co-op.
  • Track record of achieving results as demonstrated by grants, fellowships, patents, as well as first-authored publications at leading workshops or conferences such as MICRO, ISCA, HPCA, ASPLOS, ATC, SOSP, OSDI, MLSys or similar.
  • Architectural understanding of CPUs, GPUs, accelerators, networking, and systems.
  • Some experience with large-scale infrastructure, distributed systems, and full-stack analysis of server applications.
  • Experience or knowledge in developing and debugging in C/C++, Python and/or PyTorch.
  • Experience driving original scholarship in collaboration with a team.
  • Experience leading a team in solving analytical problems using quantitative approaches.
  • Interpersonal experience: cross-group and cross-cultural collaboration.
  • Experience in theoretical and empirical research and in answering questions with research.
  • Experience communicating research to public audiences of peers.
About Meta
Meta builds technologies that help people connect, find communities, and grow businesses. When Facebook launched in 2004, it changed the way people connect. Apps like Messenger, Instagram and WhatsApp further empowered billions around the world. Now, Meta is moving beyond 2D screens toward immersive experiences like augmented and virtual reality to help build the next evolution in social technology. People who choose to build their careers by building with us at Meta help shape a future that will take us beyond what digital connection makes possible today—beyond the constraints of screens, the limits of distance, and even the rules of physics.
Meta is committed to providing reasonable support (called accommodations) in our recruiting processes for candidates with disabilities, long term conditions, mental health conditions or sincerely held religious beliefs, or who are neurodivergent or require pregnancy-related support. If you need support, please reach out to accommodations-ext@fb.com.
$7,500/month to $11,333/month + benefits

Individual compensation is determined by skills, qualifications, experience, and location. Compensation details listed in this posting reflect the base hourly rate, monthly rate, or annual salary only, and do not include bonus, equity or sales incentives, if applicable. In addition to base compensation, Meta offers benefits. Learn more about benefits at Meta.
Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, reproductive health decisions, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, genetic information, political views or activity, or other applicable legally protected characteristics. You may view our Equal Employment Opportunity notice here. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. We may use your information to maintain the safety and security of Meta, its employees, and others as required or permitted by law. You may view Meta Pay Transparency Policy, Equal Employment Opportunity is the Law notice, and Notice to Applicants for Employment and Employees by clicking on their corresponding links. Additionally, Meta participates in the E-Verify program in certain locations, as required by law.

Meta is committed to providing reasonable accommodations for qualified individuals with disabilities and disabled veterans in our job application procedures. If you need assistance or an accommodation due to a disability, you may contact us at accommodations-ext@fb.com.