
krea is hiring a
Distributed Systems Engineer & Data Engineer
About krea
A company of 6 engineers building a new type of creative tool. Older creative tools wrapped computer graphics advances in a performant, controllable user interface (e.g. brightness, hue, and saturation controls), and we think modern tools should do the same for the latest AI research (e.g. diffusion models).
Job Description
Build distributed systems to process massive amounts of image, video, and 3D data (billions of files, petabytes), solving scaling bottlenecks as you go. Build data pipelines and deploy ML models to make sense of raw data (e.g. find clean scenes within videos). Learn ML engineering from world-class researchers on a small, tight-knit team.

As you build data pipelines, you'll contribute to foundation image, video, and world models from 0 → production, and see them used by millions of Krea users. Play with massive amounts of compute on huge Kubernetes GPU clusters; our main GPU cluster takes up an entire datacenter from our provider.

Tech stack: Python, PyTorch, Kubernetes, and a rotating cast of data tools (e.g. DuckDB, massive relational DBs, PyArrow).

You should apply if you are an excellent generalist engineer with strong backend experience and an intuition for systems design. Bonus points for deep experience with distributed systems, Kubernetes, or prior ML or data experience (not required!). Cool side projects are a green flag.
Location
San Francisco, CA
Salary
Not Specified
Benefits
Not Specified