Senior DataOps Engineer
Chicago, IL
Hybrid
Full Time
$140k - $160k
A Fortune 500 company based in the Chicagoland area is hiring a Senior Infrastructure Operations Data Engineer (DataOps) to help scale and stabilize its modern data platforms. With over $17B in revenue and one of the largest B2B e-commerce footprints in North America, the team is in the midst of a next-generation data transformation. The tech stack includes Databricks, Snowflake, Python, Spark, Airflow, Terraform, Kubernetes, and AWS.
This isn't just another senior engineer role: it's an opportunity to own platform reliability for enterprise-scale data systems. You'll split your time between data engineering and infrastructure operations, working cross-functionally with platform, analytics, and data teams to improve observability, automate cost tracking, enforce SLAs, and streamline CI/CD. If you want to mentor junior engineers, solve real platform problems, and influence tooling and strategy, this is your playground.
Required Skills & Experience
- 5+ years in data engineering or infrastructure roles
- Strong experience with Snowflake or Databricks
- Proficiency in AWS (Glue, Lambda, S3, Athena, DynamoDB)
- Hands-on experience with CI/CD pipelines (GitHub Actions, Jenkins, or GitLab)
- Experience with Terraform, Docker, and Kubernetes
- Familiarity with Airflow or Databricks Workflows
- Strong Python and Spark for batch/streaming ETL
- Scripting experience in Bash (or another Unix shell) or PowerShell
- Experience defining and enforcing SLAs
- Experience with Datadog or other observability tools
- Background in MLOps
- Power BI development or admin knowledge
Tech Breakdown
- 50% Data Engineering (Snowflake / Databricks / Spark / Python / Airflow)
- 50% Infrastructure Operations (CI/CD, IaC, Containerization, AWS, Monitoring)
- 80% Hands-On Engineering
- 10% Mentorship
- 10% Cross-Team Collaboration
The Offer
- Bonus eligible: 10% annual bonus
- Medical, Dental, and Vision Insurance (starting Day 1)
- 18 PTO Days + 6 Holidays
- 401(k) with 6% automatic company contribution
- Tuition reimbursement and financial wellness tools
- Parental leave and mental health support
Applicants must be currently authorized to work in the US on a full-time basis now and in the future.
#LI-MW9