Senior Data Engineer EDM

  • Python
  • Advanced SQL
  • Snowflake
  • DBT
  • AWS
  • Airflow

Experience - 7–10 years of relevant experience
Location - Bangalore (Hybrid)
Primary Skills - Snowflake, DBT, SQL (advanced), Python, Git
Secondary Skills - AWS, Airflow, Power BI

Position Overview
We are looking for a highly skilled Data Engineer to support the design, development, and optimization of enterprise data pipelines and analytics platforms. The role will focus on building reliable, scalable data solutions that enable timely and actionable insights for business stakeholders.
You will work closely with Data & Analytics teams to transform, process, and deliver high-quality datasets using modern data stack technologies such as Snowflake, DBT, and AWS. The primary objective is to ensure efficient data flow, strong data quality, and support for analytics and reporting use cases.
________________________________________
Key Responsibilities
• Develop and maintain data pipelines for ingestion, transformation, and loading of data from multiple sources using SQL, Python, and DBT
• Build and optimize data models and transformation layers in DBT
• Work extensively with Snowflake (SnowSQL, Snowpipe, query optimization, cost management)
• Support AWS-based data architecture (S3, Glue, Lambda, Redshift, CloudWatch)
• Implement and maintain workflow orchestration using Airflow or similar tools
• Enhance existing ETL/ELT workflows and improve performance and scalability
• Ensure data quality, validation, and reliability through monitoring and testing frameworks
• Collaborate with BI and Analytics teams to deliver analytics-ready datasets
• Troubleshoot data issues and support ad-hoc data requests
• Follow best practices for version control (Git), documentation, and deployment
________________________________________
Minimum Requirements
• 7–10 years of relevant experience in Data Engineering / Analytics Engineering
• Strong hands-on expertise in:
  o Snowflake (must-have)
  o DBT (must-have)
  o SQL (advanced) and Python
• Experience working with AWS services (S3, Glue, Lambda, Redshift, etc.)
• Experience with workflow orchestration tools (Airflow preferred)
• Strong understanding of data warehousing, ETL/ELT, and data modeling concepts
• Working knowledge of Power BI
• Experience with Git/version control systems
________________________________________
Preferred Requirements
• Exposure to data observability / data quality tools
• Experience with streaming technologies (Kafka, Kinesis)
• Familiarity with data governance and cataloging tools
• Experience working in cross-functional or distributed teams
