Senior Level Data Engineer / Healthcare / Python / Django / PostgreSQL
San Diego, California
Hybrid
Full Time
$160k - $180k
Our client is a mission-driven technology company focused on transforming how individuals access mental health and substance use care. Based in Encinitas, CA, their platform leverages modern software and intelligent systems to streamline the connection between people in need and verified care providers. Partnering with school districts, government agencies, and organizations across the country, they offer a secure, data-informed coordination system that simplifies what can often be a complex and overwhelming process.

As a Senior Data Engineer, you will play a pivotal role in modernizing and enhancing the company's data infrastructure, with a focus on improving reliability and enabling new AI, analytics, and operational capabilities. This role involves taking the lead in evolving their PostgreSQL database schema, stabilizing data pipelines, integrating observability tools, and supporting data-driven projects. It's a great opportunity for someone passionate about building scalable solutions in a collaborative, fast-paced environment while making a meaningful difference in people's lives.

Required Skills & Experience
Applicants must be currently authorized to work in the US on a full-time basis now and in the future.
- 4+ years in a Data Engineer or similar role
- Bachelor’s degree in Computer Science, Engineering, Logistics, or related field (preferred)
- Strong experience with PostgreSQL, including database design, schema optimization, performance tuning, and managing large datasets
- Familiarity with orchestration tools such as Airflow, dbt, or similar
- Strong experience with Python, Django & SQL for data processing, automation, and infrastructure work
- Experience setting up CI/CD & monitoring
- Familiarity with streaming technologies such as Kafka
Responsibilities
- 100% Data Engineering (Python, Django, SQL)
- Evolve the core PostgreSQL schema to support scalable, accurate matching at the provider and practitioner levels.
- Re-architect and stabilize data pipelines for greater resilience, modularity, and maintainability.
- Build and manage reliable batch and streaming pipelines using Airflow and Kafka.
- Implement CI/CD for data workflows and automate manual processes.
- Develop clean, well-structured datasets to power reporting, operational insights, and machine learning.
- Enhance data validation to ensure high-quality inputs for AI/ML systems.
- Partner with backend engineers to integrate data solutions into a Django-based platform.
- Maintain clear documentation and promote reproducible, reliable data workflows.
- Help bring structure and scalability to a fast-paced, evolving data environment.
Benefits
- Medical, Dental, and Vision Insurance
- Unlimited PTO
- Details on additional compensation available upon request