Senior Data Engineer

Sanctuary Computer

Worldwide
$180,000 - $270,000 / year
full-time
senior
Posted January 12, 2026
via Remote OK

About This Role

We are recruiting a Sr. Data Engineer for a client in the AI / health & wellness space (Original Job Post link).

About garden3d

We are a worker-owned creative collective, innovating on everything from brands and IRL communities to IoT devices and cross-platform apps. We share profit, open source everything, spin out new businesses, and invest in exciting ideas through financial and/or in-kind contributions. Our client roster includes Google, Stripe, Figma, Hinge, Black Socialists in America, ACLU, Pratt, Parsons, Mozilla, The Nobel Prize, MIT, Gnosis, Etsy & Gagosian. We're the software team behind innovative products like The Light Phone & Mill, and a global, decentralized community space collective called Index Space.

We think of garden3d as a collective for creative people, prioritizing a happy, talented, and diverse studio culture. We work on projects that bring value to our world, and we balance deep care for the work we do with a genuine curiosity about life outside of our jobs.

About the client

Our client is an early-stage AI startup based in NYC (but open to remote team members). The founders have experience building and scaling successful ventures, including a 9-figure exit.

Who we're looking for

We're looking for a Senior Data Engineer with deep expertise in designing and owning data pipelines, workflow orchestration, and complex data integrations. You'll play a key role in evolving our data ingestion architecture from an existing in-house, code-defined workflow system backed by queues to a more scalable and observable orchestration layer using Prefect (illustrated in the sketch further below).

In this role, you'll lead the development and optimization of pipelines handling both structured and unstructured data from a wide range of sources, including web crawls and scrapers. You'll be expected to make architectural decisions, ensure reliability and scalability, and establish best practices for workflow design, monitoring, and performance as our data platform grows. You'll also work across a variety of initiatives to find cost-effective, high-quality, pragmatic solutions to complex problems.

Responsibilities will include:
• Monitoring and maintaining data pipelines, troubleshooting new errors, and addressing format drift
• Extracting and enriching additional data elements from diverse sources
• Reprocessing and validating large datasets in batch workflows
• Designing and integrating new data sources into existing pipelines
• Aligning and integrating extracted data with the core application data model to ensure consistency and usability
• Participating in code reviews, providing constructive feedback to teammates, and ensuring adherence to best practices
• Contributing to project success by keeping a close eye on team velocity, project scope, budget, and timeline
• Negotiating with clients to align project scope with budget and timeline, if needed

Who you are

The person we're looking for is happy, relaxed, and easy to get along with. They're flexible on anything except compromises that would lower their usually outstanding work quality. They work "smart" by carefully managing their workflow and intelligently staggering features that have dependencies; they prefer deep work but are OK coming up to the surface now and then for top-level / strategic conversations. We believe people with backgrounds or interests in design, art, music, food, or fashion tend to have a well-rounded sense of design & quality, so a variety of hobbies or side projects is a big nice-to-have!
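To give a concrete (purely illustrative) picture of the Prefect-based orchestration layer mentioned above, here is a minimal flow sketch. The source name, retry settings, and the placeholder fetch/normalize/load tasks are assumptions for illustration, not the client's actual pipeline.

```python
# Minimal Prefect (2+) ingestion flow sketch: fetch -> normalize -> load.
# All task bodies are placeholders; a real pipeline would call crawlers,
# scrapers, or APIs and write to the application's data store.
from prefect import flow, task, get_run_logger


@task(retries=3, retry_delay_seconds=60)
def fetch_raw(source: str) -> list[dict]:
    """Pull raw records from one source (web crawl, scraper, API, ...)."""
    return [{"source": source, "payload": "..."}]


@task
def normalize(records: list[dict]) -> list[dict]:
    """Map raw records onto the core application data model."""
    return [{"source": r["source"], "text": str(r["payload"])} for r in records]


@task
def load(records: list[dict]) -> int:
    """Persist normalized records; return the count for observability."""
    return len(records)


@flow(name="ingest-source")
def ingest(source: str = "example-crawl") -> int:
    logger = get_run_logger()
    raw = fetch_raw(source)
    clean = normalize(raw)
    count = load(clean)
    logger.info("Ingested %s records from %s", count, source)
    return count


if __name__ == "__main__":
    ingest()
```

Compared with an in-house, queue-backed workflow system, a layer like this gives per-task retries, scheduling, and run observability out of the box, which is the kind of migration this role would own.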
Must Have Competencies:
• Senior-level Python expertise
• Experience with data/workflow orchestration tools (e.g., Prefect, Airflow, Dagster)
• A thorough understanding of ETL & data transformation for ingestion into industry-standard LLMs (OpenAI, Claude, etc.)
• Familiarity with Large Language Models (LLMs)
• Skilled in interfacing with APIs (OpenAI, Google Gemini/Vertex, etc.) using wrapper libraries such as Instructor, LiteLLM, etc.
• Practical experience in prompt engineering
• Ability to work with structured outputs and potentially tool calling (see the sketch after these lists)
• 5+ years of general experience in backend (Ruby on Rails, Elixir Phoenix, Python Django, or Node Express) and/or native app development (React Native, Flutter, Android, AOSP, Kotlin/Java)

Nice to Have Competencies:

We're always pitching for new and exciting technology niches. Some of the areas below are relevant to us!
• Experience with Google Cloud Platform (GCP), particularly Cloud Run and Cloud Tasks
• Knowledge of search technologies, including embeddings and vector databases for semantic search, as well as keyword-based search (BM25)
• Familiarity with PySpark for batch data processing
• Experience working with LLMs, Vector Databases, and other generalist AI-enabled application patterns
• Client-facing experience: working directly with customers to gather requirements and provide technical solutions
• Product management experience: defining product roadmaps and collaborating closely with s...
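As a rough illustration of the "wrapper libraries" and "structured outputs" items above, here is a minimal sketch using Instructor with the OpenAI client. The schema, model name, and prompt are assumptions for illustration only, not part of the client's codebase.

```python
# Minimal structured-output sketch: Instructor wraps the OpenAI client so the
# LLM response is parsed and validated against a Pydantic model.
import instructor
from openai import OpenAI
from pydantic import BaseModel


class Listing(BaseModel):
    """Target schema the LLM response is validated against (illustrative)."""
    title: str
    company: str
    salary_min: int | None = None
    salary_max: int | None = None


client = instructor.from_openai(OpenAI())

listing = client.chat.completions.create(
    model="gpt-4o-mini",  # assumption: any chat-completions-capable model
    response_model=Listing,
    messages=[
        {
            "role": "user",
            "content": "Extract the listing fields from: "
                       "'Senior Data Engineer at Acme, $180k-$270k'",
        },
    ],
)
print(listing.model_dump())
```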

Ready to Apply?

Click the button below to visit the company's application page.

Apply for this Position