Senior DevOps Engineer
Arlington, Virginia
Hybrid
Full Time
$160k - $180k
In this role, you will oversee the deployment, management, and optimization of our data warehouse infrastructure. You will work closely with data engineering, data science, and software engineering teams to design, implement, and maintain scalable, reliable, and high-performance data solutions.
The company is located in Herndon, VA, and the role will follow a hybrid work model.
What You Will Be Doing:
- Design, implement, and maintain data warehouse infrastructure to ensure high availability and optimal performance.
- Develop and manage CI/CD pipelines for automated deployment and testing of data solutions.
- Oversee the management of durable data queues.
- Deploy and maintain services within Kubernetes environments.
- Monitor, troubleshoot, and enhance ETL/ELT workflows and database performance.
- Collaborate with data engineers and analysts to optimize data models and improve query efficiency.
- Manage integrations between the data warehouse and enterprise tools, including analytics platforms, data lakes, and external/internal data sources.
- Implement and maintain Infrastructure as Code (IaC) to ensure consistent environment setups.
- Enforce security and compliance measures, including access control, data encryption, and routine audits.
- Support data science product deployments and data pipelines.
- Develop and maintain technical documentation, including architecture diagrams, workflows, and standard operating procedures.
- Assist DevOps teams with infrastructure and deployment requests.
- Stay current with emerging technologies and best practices in data engineering and DevOps.
What You Will Need:
- 4+ years of experience in DevOps or related roles with a focus on data warehouse technologies.
- Strong expertise in data warehouse platforms (e.g., Snowflake, Redshift, BigQuery, Azure Synapse).
- Proficiency with cloud platforms (AWS, Azure, or GCP).
- Hands-on experience with CI/CD tools (e.g., Jenkins, GitHub Actions).
- Skilled in scripting and automation using Scala, Python, Bash, TypeScript, Node.js, or similar languages.
- Experience working with ETL/ELT tools and frameworks (e.g., Airflow).
- Familiarity with Infrastructure as Code tools (e.g., Terraform, Pulumi).
- Strong knowledge of big data storage, optimization techniques, and data modeling best practices.
- In-depth understanding of security and compliance standards for data warehouses.
- Proficient in Kubernetes, with hands-on experience managing GKE.
- Experience with Apache Spark.
- Experience with real-time data streaming and durable queues (e.g., Kafka, Spark Streaming, Google Pub/Sub, NATS).
- Certifications in cloud platforms or data warehouse technologies.
- Bachelor's degree in Computer Science or equivalent industry experience.
This position doesn’t provide sponsorship.