Attunement

We automate compliance for behavioral health, saving clinicians hours…

Founding Engineer

$150 - $300
San Francisco, CA, US
Job type
Full-time
Role
Engineering, Full stack
Experience
3+ years
Visa
Will sponsor
Skills
Python, React, Software Security, Amazon Web Services (AWS)
Angie Muller
Founder

About the role

Location: Onsite / San Francisco

Stage: Seed

Type: Full-time, founding team

attunement.ai

Engineering Observability and Accountability into AI for Behavioral Health

Attunement is building the compliance infrastructure for AI in behavioral health. Our goal is to make AI systems in clinical settings auditable, explainable, and accountable by design.

Today, clinics using Attunement cut audit preparation time by 80% and documentation costs by 40%. We are building the technical standard for safety and integrity in AI-assisted behavioral care.

What You'll Do

As an early engineer, you’ll design and implement the technical foundation for compliant and reliable AI in healthcare. You’ll build systems alongside our forward-deployed engineer and product designer to make compliance and transparency operational.

Your work will include:

  • The core compliance intelligence layer: secure, explainable, and continuously learning from real clinical workflows.
  • Data pipelines that connect with EHRs and healthcare APIs (FHIR, HL7) to create real-time, auditable feedback loops.
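To make the "auditable feedback loop" idea concrete, here is a loose sketch in Python. It is purely illustrative (none of these class or event names come from Attunement's actual codebase): a hash-chained, append-only audit log over AI-assisted documentation events, so that every recorded decision can later be verified as untampered.

```python
import hashlib
import json


def _entry_hash(prev_hash: str, payload: dict) -> str:
    # Hash the previous entry's hash together with the canonical JSON of
    # this payload, so altering any earlier record breaks the whole chain.
    blob = prev_hash + json.dumps(payload, sort_keys=True)
    return hashlib.sha256(blob.encode()).hexdigest()


class AuditLog:
    """Append-only, hash-chained log of AI-assisted events (toy example)."""

    GENESIS = "0" * 64  # sentinel hash for the first entry

    def __init__(self):
        self.entries = []  # list of (entry_hash, payload)

    def append(self, payload: dict) -> str:
        prev = self.entries[-1][0] if self.entries else self.GENESIS
        h = _entry_hash(prev, payload)
        self.entries.append((h, payload))
        return h

    def verify(self) -> bool:
        # Recompute the chain from the start; any mismatch means tampering.
        prev = self.GENESIS
        for h, payload in self.entries:
            if h != _entry_hash(prev, payload):
                return False
            prev = h
        return True


log = AuditLog()
log.append({"event": "note_drafted", "model": "example-model", "clinician": "dr_a"})
log.append({"event": "note_approved", "clinician": "dr_a"})
assert log.verify()

# Tampering with a recorded payload is detectable:
log.entries[0] = (log.entries[0][0], {"event": "note_drafted", "model": "other"})
assert not log.verify()
```

In a real deployment the payloads would be structured FHIR/HL7 resources and the log would live in durable storage; the hash chain is just one simple way to make "every AI-assisted decision is traceable" mechanically checkable.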

You Might Be Right for This If

  • You’ve built production systems end-to-end, backend to frontend, in security-sensitive or regulated environments (HIPAA, SOC 2, or similar).
  • You’ve worked with healthcare data standards (FHIR, HL7, or EHR integrations) and understand the nuance of data lineage, auditability, and interoperability.
  • You have experience with LLMs or ML Ops, particularly in designing explainability, safety, or audit systems around AI models.
  • You’re fluent in React / Next.js and Python / FastAPI (or equivalent frameworks), with strong fundamentals in database architecture, API design, and observability.
  • You care deeply about reliability, data integrity, and user trust.
  • (Bonus) You have a background or strong interest in clinical psychology, AI safety, or human-centered systems design, and you want to build software that genuinely improves human wellbeing.

Why this matters

This role shapes how AI systems are integrated into healthcare. You’ll collaborate with a founding team with backgrounds in neuroscience, AI safety, and clinical psychology to define the technical and ethical standards for responsible AI in clinical environments.

You’ll have meaningful ownership, early equity, and the opportunity to influence not only the product architecture but also the principles that govern how AI supports human decision-making in care.

About Attunement

Attunement (YC W24) is engineering observability and accountability into AI for behavioral health. We’re building secure infrastructure that connects clinical data pipelines, model outputs, and audit systems so every AI-assisted decision in care is traceable and explainable. Clinics using Attunement stay continuously audit-ready, protect revenue, and set a new bar for transparency in digital mental-health tools.

Attunement
Founded: 2024
Batch: W24
Team Size: 3
Status: Active
Location: San Francisco
Founders
Angie Muller
Founder