
Stanford Research Computing is hiring a
Principal Storage Architect & Team Lead

About Stanford Research Computing

Stanford Research Computing is a collaboration between University IT and the office of the Vice Provost and Dean of Research. The group operates HPC environments for researchers, provides consultations on research projects, and offers contract support to individual labs, departments, and schools.

Job Description

A hybrid role leading the storage team and setting the direction for three large storage environments: Oak (file storage), Fir (fast scratch for the Sherlock cluster), and Elm (object storage layered on tape). You'll be the technical manager, guiding both strategy and execution. Knowledge of Lustre, InfiniBand, and petabyte-scale storage is important.

Remote Conditions

Hybrid: on-site at Stanford, CA, with remote work allowed as part of the role

Salary

Not Specified

Benefits

Relocation incentive; transit passes; 403(b) match; healthcare; 30+ days off per year; on-site parking not provided; benefits are publicly documented at cardinalatwork.stanford.edu/benefits-rewards

Tech Tags

Senior Role

Date Listed

01 May, 2026 (6 days ago)

Stanford Research Computing has 2 other jobs listed
