Motion Recruitment | Jobspring | Workbridge

Data Engineer / Hybrid in the Loop

Chicago, Illinois

Hybrid

Direct Hire

$130k - $160k

An innovative data and insights organization operating within fast-evolving, highly regulated consumer markets is seeking a Data Engineer to join its growing platform team. This company builds large-scale data pipelines, ingesting millions of daily signals across retail, product inventory, pricing, and promotional activity to power real-time market intelligence products.

This role sits at the intersection of data architecture, pipeline design, and analytics engineering—focused on transforming fragmented, multi-source datasets into clean, reliable systems that directly drive customer-facing insights. You’ll be joining a team that values curiosity, end-to-end ownership, and thoughtful engineering in an industry where accuracy, timeliness, and transparency are essential.

This is a full-time, hybrid role based in the Chicago Loop, offering significant opportunities to shape the underlying data ecosystem behind a market-leading analytics platform.

Required Skills & Experience
  • 4–6 years of experience in data engineering or software engineering with a strong foundation in modern data technologies.
  • Proficiency in Python and SQL, including experience building and scaling production-grade pipelines.
  • Experience with dbt, Snowflake, or similar cloud-based data warehouses.
  • Solid understanding of cloud infrastructure (preferably AWS—S3, EC2, Lambda).
  • Experience working with large, complex datasets across multiple data sources.
  • Ability to diagnose pipeline issues, analyze anomalies, and enforce data quality and lineage standards.
  • Strong communication skills and a collaborative approach to working with cross-functional teams.
Desired Skills & Experience
  • Familiarity with workflow orchestration tools such as Prefect.
  • Experience with Terraform, Docker, or other IaC/DevOps tools.
  • Background with retail, point-of-sale, or other high-volume marketplace data.
  • Exposure to scraping, ingestion frameworks, or high-throughput ETL/ELT pipelines.
  • Interest in building internal tooling that improves experimentation, observability, and governance.
  • Ability to thrive in fast-moving, product-driven environments.
What You Will Be Doing
Tech Breakdown:
  • 60% pipeline development, data modeling, architecture, and system optimization
  • 40% analysis, debugging, cross-functional collaboration, and tooling
Daily Responsibilities:
  • Build and enhance scalable pipelines that aggregate data from diverse retail, market, and product sources.
  • Design and maintain robust data models using dbt and Snowflake to support analytics, reporting, and product-led insights.
  • Investigate data flows to identify inconsistencies, quality issues, or architectural gaps—then implement improvements.
  • Develop tooling that increases visibility into pipeline health, data quality, and operational metrics.
  • Collaborate with engineering, product, and analytics stakeholders to translate business needs into technical solutions.
  • Evaluate new architectural patterns, AWS services, and ingestion strategies to improve efficiency and scale.
  • Maintain strong documentation, lineage tracking, and monitoring frameworks.
The Offer
You will receive the following benefits:
  • Medical, dental, and vision coverage options
  • Competitive salary
  • Flexible work hours with a hybrid schedule in the Chicago Loop
  • Opportunities for professional development and continued technical growth

Posted by: Olivia Policastro
