ServiceNow

Staff Applied Research Scientist - Pretraining/Fine-tuning

Join ServiceNow as a Staff Applied Research Scientist in Santa Clara, CA. Drive LLM innovations, collaborate with teams, and enjoy competitive pay and benefits.

ServiceNow Modules: Virtual Agent

Job description

Posted on: November 12, 2024

The Advanced Technology Group (ATG) at ServiceNow is seeking a Staff Applied Research Scientist to drive significant innovations in Large Language Models (LLMs) for enterprise language generation. The successful candidate will apply existing methods and develop new ones to solve real-world challenges and datasets, collaborate with cross-functional teams to identify and prioritize requirements, and work closely with researchers and applied research scientists to validate and drive innovations.

Requirements

  • PhD with 3 years' experience; or equivalent work experience
  • Expertise in Python, OOP, Design Patterns
  • Experience with LLM pretraining is preferred but not mandatory
  • Experience with instruction fine-tuning and other fine-tuning techniques is a must
  • Experience with reinforcement learning is preferred but not mandatory
  • Experience with various transformer architectures (auto-regressive, sequence-to-sequence, etc.)
  • Ability to read the latest papers and experiment with their ideas
  • Good publication record in top-tier conferences such as ICLR, NeurIPS, ICML, ACL, EMNLP, AAAI, etc.
  • Ability to communicate research findings to both technical and non-technical stakeholders

Benefits

  • Base pay of $165,800 - $290,200
  • Equity (when applicable)
  • Variable/incentive compensation
  • Health plans, including flexible spending accounts
  • 401(k) Plan with company match
  • ESPP
  • Matching donations
  • Flexible time away plan
  • Family leave programs

Requirements Summary

PhD with 3 years' experience (or equivalent work experience), expertise in Python, experience with LLMs and transformer architectures, and a good publication record in top-tier conferences.