Senior Infrastructure Operations/DevOps
Chicago, Illinois
Hybrid
Full Time
$120k - $150k
You will be responsible for managing and enhancing the performance of data platforms and analytics operations: developing observability and monitoring solutions, overseeing end-to-end operational workflows, and ensuring platform reliability and efficiency in alignment with business objectives.
The company is located at the Merchandise Mart in Chicago, IL, and the role follows a hybrid model.
This position does not provide sponsorship.

What You Will Be Doing:
- Serve as an Infrastructure Operations Data Engineer, responsible for maintaining high-performance, scalable data platforms that align with business objectives.
- Act as a Subject Matter Expert (SME) in data operations, optimizing processes, managing cost-efficient pipelines, and implementing standardized performance and quality metrics.
- Stay current with industry trends and emerging technologies, assessing tools and platforms for their potential value to the organization.
- Define and uphold Service Level Agreements (SLAs) to ensure reliable and timely data platform operations.
- Improve operational workflows by applying best practices that streamline and enhance process efficiency.
- Provide guidance and mentorship to junior team members.
What You Will Need:
- 3+ years of experience building robust CI/CD pipelines with automated testing using tools such as Jenkins, GitHub Actions, or GitLab CI/CD.
- Proven success in monitoring and optimizing the performance of databases, data lakes, and data pipelines.
- 3+ years of experience orchestrating workflows and data pipelines using Databricks Workflows or Apache Airflow.
- Experience developing automated tools and processes for cost tracking and utilization reporting.
- 3+ years of hands-on experience building batch and streaming ETL pipelines using Spark, Python, Terraform, Snowflake, or Databricks.
- Skilled in containerization and orchestration with tools such as Docker and Kubernetes, along with shell scripting (Bash, Unix, or Windows).
- Familiarity with AWS services including Glue, Athena, Lambda, S3, and DynamoDB.
- Demonstrated expertise in managing the data lifecycle, with a strong focus on data quality functions such as standardization, transformation, rationalization, linking, and matching.
- Experience with Power BI development and administration is highly desirable.