
Preferred Networks is hiring an
LLM Inference Optimization Engineer

About Preferred Networks

Preferred Networks is an AI company based in Tokyo working across the stack, from AI chips and computing infrastructure to LLMs and products. You may already know us indirectly if you've used software we've built, such as Optuna or CuPy (or Chainer, back in the day). We are designing in-house chips (MN-Core series) and training LLMs (PLaMo series). Our team is actively hiring for two roles related to these endeavors.

Job Description

Improve the inference engine powering our API service and maintain PLaMo implementations in open source projects such as vLLM.
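To give a flavor of the stack involved, below is a minimal sketch of serving a PLaMo model through vLLM's offline Python API. The model ID pfnet/plamo-2-1b, the sampling settings, and the trust_remote_code flag are illustrative assumptions, not details taken from this listing.

# Minimal sketch (assumptions noted above): run a PLaMo checkpoint on vLLM.
from vllm import LLM, SamplingParams

# Depending on the vLLM version, PLaMo's custom architecture may require
# trust_remote_code=True so the modeling code ships with the checkpoint.
llm = LLM(model="pfnet/plamo-2-1b", trust_remote_code=True)

# Batch a prompt through the engine with basic sampling settings.
params = SamplingParams(temperature=0.7, max_tokens=128)
outputs = llm.generate(["What makes LLM inference fast?"], params)

for out in outputs:
    print(out.outputs[0].text)

Optimization work on an engine like this typically happens beneath that API surface: request batching, KV-cache management, and kernel performance.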

Location

Remote

Remote Conditions

Remote within Japan; relocation to Japan required; visa and relocation support provided.

Salary

Not Specified

Benefits

Visa and relocation support

Date Listed

02 May, 2026

Preferred Networks has 1 other job listed


