Perception Engineer (Member of the Technical Staff)

Transfyr Bio

Cambridge, MA, USA
Posted on Jan 26, 2026

Member of the Technical Staff - Perception Engineer

About Transfyr

Transfyr is building physical AI for science.

Why is it that a professional athlete has dramatically more information about every play they make than a scientist has about the cause of any experimental failure? At Transfyr, we are building the infrastructure to make real-world scientific work legible, transferable, and reproducible.

Modern science is capable of extraordinary outcomes, but many of the most important insights never become explicit: how experiments are actually executed, how protocols drift over time, how experts make game-time decisions on the fly, why experiments fail on Tuesdays. This tacit knowledge is rarely captured, making it difficult to reliably reproduce results, much less hand off protocols to new team members or collaborators. We believe our systematic failure to capture tacit knowledge is holding back the entire industry.

We’re building systems that operate directly in real laboratory environments to elucidate, capture, and interpret this missing information. Our platform records and analyzes multimodal data about how scientific work is performed and turns it into durable, operational knowledge. In doing so, we are also building the world’s largest commercial dataset on real-world scientific execution.

This foundation is critical not only for driving elite human performance today, but for enabling meaningful automation tomorrow. Physical AI systems cannot learn from outcomes alone; they require rich, grounded records of how work is actually done in the real world.

Want to learn more? You can read some of our writings here.

The Role

Perception engineers at Transfyr build the systems that allow our platform to see and understand real-world scientific work as it actually happens. You will design and deploy perception systems that operate in active laboratory environments, interpreting human actions, object interactions, instruments, and automation hardware under real-world conditions. This is not offline benchmarking on clean datasets: your work must function in messy, changing environments with imperfect lighting, occlusions, evolving protocols, and long-tail edge cases.

The role demands strong perception fundamentals, practical engineering judgment, and high agency. You will work closely with scientists and operators, observe failure modes firsthand, and iterate until systems are robust enough to be trusted as part of critical scientific workflows.

We're building a team, and we have needs across levels:

  • Hands-on builders early in their careers who are excited to build and ship real systems quickly.

  • Senior engineers who enjoy shaping system boundaries, abstractions, and long-term technical direction.

This role is in-person in Cambridge, MA (other locations may open in the future, feel free to reach out even if Boston is not currently an option for you).

What you’ll accomplish with us:

  • Capture the Unspoken: Design perception systems that make real-world scientific execution legible, translating physical actions and system state into structured, reusable data.

  • Give Our AI Models “Sight”: Build and deploy computer vision systems that operate reliably in live laboratory environments, tracking people, objects, instruments, and automation under real-world variability.

  • Build the Pipeline: Develop multimodal perception pipelines that integrate video with audio, sensor data, metadata, and experimental context, in close partnership with software and AI/ML engineers.

  • The Real World Is Right (Your Model Is Incomplete): Work directly alongside scientists and operators in our Cambridge-based laboratory and at customer sites, observing failure modes firsthand and iterating until systems are robust to changing conditions (lighting, layouts, equipment, etc.).

  • Scale the Science: Enable transferable science and future automation by building a physical AI layer that supports workflows generalizing across labs, teams, and geographies. Allow scientists to focus on exploration rather than bookkeeping by enabling truly passive information capture.

  • Enable the Robot Future: Enable generalizable automation in scientific settings by building a robust physical AI layer that understands the actions that are taken to achieve experimental outcomes.

Who you are:

  • High agency. You don’t wait for ideal data or perfect specs. You take ownership of problems, identify what needs to be done, and push work forward.

  • Biased toward action. You value momentum, can move quickly without being careless, and know when to trade elegance for progress.

  • Successful in ambiguity. You can make progress on long-tail problems where ground truth is incomplete and success criteria evolve.

  • Thoughtful. You understand tradeoffs between model complexity, reliability, latency, and operational burden, and choose deliberately.

  • Clear, direct communicator. You close loops, surface issues early, and work effectively with colleagues who may not share your technical background.

  • Intense. You care deeply about the mission, are willing to work hard when it matters, and take pride in helping a small team do outsized work.

What you know:

  • Computer Vision: You have deep experience in computer vision (object detection and tracking, pose estimation, activity recognition, video understanding), and you know how to make these models work in messy, high-stakes environments.

  • Modern VLMs: New advances come out daily. You track the ecosystem and are excited about bringing new AI tools and systems into your problem solving and pipelines, from VLMs and VLAs to that new long-video-understanding model that might just be SOTA for the task.

  • Modern ML Stack: You’re fluent in Python and PyTorch. You’ve deployed models to the cloud (AWS/GCP) and, ideally, have experience with edge deployment for real-time inference using frameworks like TFLite and TensorRT.

  • Engineering Rigor: You can build a functional prototype in a weekend, but you also understand how to take prototypes to production. There’s a time and place for both and you can distinguish between them.

We don’t expect candidates to have experience with every modality or tool we use. We value strong fundamentals, curiosity, and the ability to learn quickly over narrow specialization.

Other things we like to see:

  • A passion for and experience in science

  • A passion for and experience with AI

  • Demonstrated experience working in fast-moving/ambiguous environments (like startups!)

The basics:

  • Competitive compensation (cash + equity)

  • Full benefits (low/no-cost health insurance options, HSA, 401(k) with matching, lunch subsidy, etc.)