Meta was built to help people connect and share, and over the last decade our tools have played a critical part in changing how people around the world communicate with one another. With over a billion people using the service and more than fifty offices around the globe, a career at Meta offers countless ways to make an impact in a fast-growing organization.
Meta’s Core AI team is seeking a Research Scientist Intern with a focus on Multimodal and Generative AI. Our team is pioneering AI research across text, audio, and video domains, with a mission to develop AI-driven foundational models and their applications. We are committed to advancing state-of-the-art algorithms, promoting open research, and fostering scientific innovation in all aspects of AI for language, including language modeling, natural language understanding and generation, audiovisual learning, on-device/personalized LM, and multimodal applications.
As a Research Scientist Intern, you will play a crucial role in developing cutting-edge models and algorithms in AI research. We are seeking a candidate with expertise in vision, audio, and multimodal learning. The ideal candidate will have a strong background in deep learning and general machine learning, coupled with a deep passion for computer vision and audio/speech processing. In this position, you will work with domain experts to understand the challenges and build state-of-the-art foundational models to tackle them. Our internships are twelve (12) to twenty-four (24) weeks long, and we have various start dates throughout the year.
Research Scientist Intern, Multimodal and Generative AI (PhD) Responsibilities
Lead and contribute to cutting-edge foundation model research that leads to publications at top-tier CV/ML conferences
Perform research to tackle unsolved real-world problems and push the state of the art in multimodal learning
Independently design and implement algorithms, train advanced foundational models on large datasets, and evaluate their performance
Define, plan and execute cutting-edge deep learning research to advance AR/VR experiences
Develop novel deep learning techniques to achieve state-of-the-art accuracy within the constraints of on-device and real-time execution
Collaborate with other research scientists and software engineers to develop innovative deep learning techniques for vision, audio, user interface and other use-cases
Communicate experimental results and recommendations clearly, both within the group and to cross-functional groups
Minimum Qualifications
Currently in the process of obtaining a PhD in Artificial Intelligence or a related field
Research experience in one or more of these areas: machine learning, deep learning, computer vision, audio/speech processing or related fields
Knowledge of state-of-the-art deep learning methods and neural networks
Experience working with machine learning libraries such as PyTorch, JAX, etc.
Experience with scripting languages such as Python and shell scripts
Must obtain work authorization in the country of employment at the time of hire, and maintain ongoing work authorization during employment
Preferred Qualifications
Intent to return to the degree program after the completion of the internship
Experience developing scalable machine learning models in at least one of the following areas: large language models, natural language understanding or generation, efficient training and inference, multimodal learning, or related areas
Experience with large-scale model training, implementing algorithms, and evaluating language systems
Proven track record of achieving significant results as demonstrated by publications at leading conferences/journals such as NeurIPS, ICLR, ICML, CVPR, ICCV, ICASSP, Interspeech, AAAI, IEEE TASLP or similar
Experience working and communicating cross functionally in a team environment
Experience solving complex problems and comparing alternative solutions, trade-offs, and diverse points of view to determine a path forward
For those who live in or expect to work from California if hired for this position, please click here for additional information.
Meta builds technologies that help people connect, find communities, and grow businesses. When Facebook launched in 2004, it changed the way people connect. Apps like Messenger, Instagram and WhatsApp further empowered billions around the world. Now, Meta is moving beyond 2D screens toward immersive experiences like augmented and virtual reality to help build the next evolution in social technology. People who choose to build their careers by building with us at Meta help shape a future that will take us beyond what digital connection makes possible today—beyond the constraints of screens, the limits of distance, and even the rules of physics.
$7,800/month to $11,293/month + benefits
Individual compensation is determined by skills, qualifications, experience, and location. Compensation details listed in this posting reflect the base hourly rate, monthly rate, or annual salary only, and do not include bonus, equity, or sales incentives, if applicable. In addition to base compensation, Meta offers benefits. Learn more about benefits at Meta.
Equal Employment Opportunity and Affirmative Action
Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, reproductive health decisions, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, genetic information, political views or activity, or other applicable legally protected characteristics. You may view our Equal Employment Opportunity notice here.
Meta is committed to providing reasonable support (called accommodations) in our recruiting processes for candidates with disabilities, long-term conditions, mental health conditions, or sincerely held religious beliefs, or who are neurodivergent or require pregnancy-related support. If you need support, please reach out to accommodations-ext@fb.com.