DBT Engineer
2 weeks ago
Job Title: DBT Developer - Pune
About Us
Capco, a Wipro company, is a global technology and management consulting firm.
Awarded Consultancy of the Year at the British Bank Awards, Capco has been ranked among the
Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence
across 32 cities worldwide, we support 100+ clients across the banking, financial services, and energy
sectors. We are recognized for our deep transformation execution and delivery.
WHY JOIN CAPCO?
You will work on engaging projects with the largest international and local banks, insurance
companies, payment service providers, and other key players in the industry, delivering projects
that will transform the financial services industry.
MAKE AN IMPACT
We bring innovative thinking, delivery excellence, and thought leadership to help our clients
transform their business. Together with our clients and industry partners, we deliver
disruptive work that is changing energy and financial services.
BE YOURSELF AT WORK
Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.
CAREER ADVANCEMENT
With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking
their career into their own hands.
DIVERSITY & INCLUSION
We believe that diversity of people and perspective gives us a competitive advantage.
Job Title: DBT Developer - Pune
Location: Pune
Work Mode: Hybrid (3 days WFO - Tues, Wed, Thurs)
Shift Time: 12:30 PM to 9:30 PM
Job Summary
We are seeking a skilled and detail-oriented DBT Engineer to join our cross-functional Agile
team. In this role, you will be responsible for designing, building, and maintaining modular,
reliable data transformation pipelines using dbt (Data Build Tool) in a Snowflake
environment. You will collaborate closely with backend and frontend engineers, product
managers, and analysts to create analytics-ready data models that power application
features, reporting, and strategic insights. This is an exciting opportunity for someone who
values clean data design, modern tooling, and working at the intersection of engineering
and business.
Key Responsibilities
- Design, build, and maintain scalable, modular dbt models and transformation
pipelines using dbt Core; dbt Cloud experience is good to have.
- Thoroughly understand dbt architecture, with experience writing Python operators
in dbt flows and strong experience writing Jinja code, macros, seeds, etc.
- Write SQL to transform raw data into curated, tested datasets in Snowflake.
- Apply data modeling techniques such as Data Vault and dimensional modeling
(Kimball/Inmon).
- Collaborate with full-stack developers and UI/UX engineers to support application
features that rely on transformed datasets.
- Work closely with analysts and stakeholders to gather data requirements and
translate them into reliable data models.
- Enforce data quality through rigorous testing, documentation, and version control in
dbt.
- Participate in Agile ceremonies (e.g., stand-ups, sprint planning) and manage tasks
using Jira.
- Integrate dbt into CI/CD pipelines and support automated deployment practices.
- Monitor data performance and pipeline reliability, and proactively resolve issues.
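The responsibilities above center on modular dbt models with Jinja. As a rough illustrative sketch only (the model, source, and column names here are hypothetical, not part of this role's codebase), an incremental dbt model might look like:

```sql
-- models/marts/fct_daily_orders.sql  (hypothetical example)
{{ config(materialized='incremental', unique_key='order_date') }}

select
    order_date,
    count(*)          as order_count,
    sum(order_amount) as total_amount
from {{ ref('stg_orders') }}
{% if is_incremental() %}
  -- on incremental runs, only process dates newer than what is already built
  where order_date > (select max(order_date) from {{ this }})
{% endif %}
group by order_date
```

In a real project this file would sit in the dbt models directory, with not_null/unique tests and column documentation declared in an accompanying schema.yml, which is how the testing and documentation duties listed above are typically enforced.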
Mandatory Qualifications & Skills
- 3-5 years of experience in data engineering or analytics engineering, with a focus
on SQL-based data transformation.
- Hands-on production experience using dbt Core or dbt Cloud as a primary
development tool.
- Strong command of SQL and solid understanding of data modeling best practices
(e.g., star/snowflake schema).
- Proven experience with Snowflake as a cloud data warehouse.
- Python skills for data pipeline integration or ingestion.
- Familiarity with Git-based version control workflows.
- Strong communication and collaboration skills, with the ability to work across
engineering and business teams.
- Experience working in Agile/Scrum environments and managing work using Jira.
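To give a flavor of the SQL-based transformation and data-quality testing skills listed above, here is a minimal self-contained sketch that uses Python's built-in sqlite3 as a stand-in for a warehouse (all table and column names are invented for illustration):

```python
import sqlite3

# In-memory SQLite stands in for Snowflake; schema and data are invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (order_id INTEGER, order_amount REAL, order_date TEXT);
    INSERT INTO raw_orders VALUES
        (1, 120.0, '2024-01-01'),
        (2,  80.0, '2024-01-01'),
        (3, 200.0, '2024-01-02');
""")

# Transform raw rows into a curated daily summary -- the kind of model dbt manages.
conn.execute("""
    CREATE TABLE fct_daily_orders AS
    SELECT order_date,
           COUNT(*)          AS order_count,
           SUM(order_amount) AS total_amount
    FROM raw_orders
    GROUP BY order_date
""")

rows = conn.execute(
    "SELECT order_date, order_count, total_amount FROM fct_daily_orders ORDER BY order_date"
).fetchall()

# Simple data-quality checks, analogous to dbt's not_null and unique tests.
assert all(r[0] is not None for r in rows)     # not_null on order_date
assert len({r[0] for r in rows}) == len(rows)  # unique on order_date
```

In dbt itself these checks would be declared as schema tests rather than hand-written assertions; the sketch only mirrors the idea.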
Nice-to-Have Skills
- Knowledge of data orchestration tools (e.g., Apache Airflow) is a big plus.
- Exposure to CI/CD pipelines and integrating dbt into automated workflows.
- Experience with cloud platforms such as AWS.
- Familiarity with Docker and container-based development.
- Understanding of how data is consumed in downstream analytics tools (e.g.,
Looker, Tableau, Power BI).
Preferred Experience
- A track record of building and maintaining scalable dbt projects in a production
setting.
- Experience working in cross-functional teams involving developers, analysts, and
product managers.
- A strong sense of ownership, documentation habits, and attention to data quality
and performance.
If you are keen to join us, you will be part of an organization that values your contributions,
recognizes your potential, and provides ample opportunities for growth. For more
information, follow us on Twitter, Facebook, LinkedIn, and YouTube.
-
DBT Snowflake
2 weeks ago
Pune, Maharashtra, India | LTIMindtree | Full time
Job description: 5+ years' experience with DBT and Snowflake. Design, develop, and maintain ELT data pipelines using DBT with Snowflake as the cloud data warehouse. Collaborate with data analysts, data scientists, and business stakeholders to gather requirements and translate them into scalable DBT models. Build modular, reusable, and well-documented DBT models following...
-
Snowflake DBT
5 days ago
Pune, Maharashtra, India | Cognizant | Full time | ₹12,00,000 - ₹36,00,000 per year
Skills: Snowflake, DBT. Experience: 6 to 9 years. Location: AIA-Pune. We are looking for a skilled and detail-oriented Snowflake & DBT Developer with strong experience in data engineering and transformation. The ideal candidate will have hands-on expertise in Snowflake, DBT (with Jinja and SQL), and a working knowledge of Python. You will be responsible for...
-
Data Engineer
1 week ago
Pune, Maharashtra, India | Arting Digital Private Limited | Full time | ₹15,00,000 - ₹28,00,000 per year
Position: Data Engineer (Snowflake + DBT + Airflow). Location: Pune, Ahmedabad. Experience: 5 years. Working Mode: Hybrid. Skills: Snowflake, Apache Airflow, Terraform, DBT, Git, SQL, Spark, Python, Data Warehousing, CI/CD Pipelines. Key Responsibilities: Design, implement, and optimize data pipelines and workflows using Apache Airflow. Develop incremental and full-load strategies...
-
Python Data Engineer
2 weeks ago
Pune, Maharashtra, India | Wissen Infotech | Full time | ₹9,00,000 - ₹12,00,000 per year
Python (Pandas, PySpark); data engineering and workflow optimization; Delta Tables, Parquet. Good-to-have: Databricks; Apache Spark, DBT, Airflow; advanced Pandas optimizations; PyTest/DBT testing frameworks.
-
Senior Data Platform Engineer
2 weeks ago
Pune, Maharashtra, India | Zywave, Inc. | Full time | ₹6,00,000 - ₹18,00,000 per year
Job Title: Senior Data Platform Engineer. Location: Pune, India. Work Mode: Work From Office (WFO), 5 days a week. Shift Timing: 12:00 PM to 9:00 PM IST. About Zywave: Zywave is a leading provider of InsurTech solutions, empowering insurance brokers and agencies with innovative software tools to grow and manage their business. We are building a modern data platform...
-
Senior Data Platform Engineer
4 hours ago
Pune, Maharashtra, India | Zywave | Full time | ₹20,00,000 - ₹25,00,000 per year
Job Title: Senior Data Platform Engineer. Location: Pune, India. Work Mode: Work From Office (WFO), 5 days a week. Shift Timing: 12:00 PM to 9:00 PM IST. About Zywave: Zywave is a leading provider of InsurTech solutions, empowering insurance brokers and agencies with innovative software tools to grow and manage their business. We are building a modern...
-
Lead Data Engineer
1 week ago
Pune, Maharashtra, India | Arting Digital Private Limited | Full time | ₹20,00,000 - ₹25,00,000 per year
Position: Lead Data Engineer. Experience: 7-10 years. Location: Pune / Ahmedabad. Mode: Hybrid. Key Skills: Data Engineering, SQL, Python, Snowflake, dbt, Apache Airflow; Cloud Platforms: AWS / Azure / GCP; CI/CD, Git, Infrastructure as Code (Terraform); Team Leadership & Stakeholder Collaboration. Responsibilities: Design, build, and optimize scalable data pipelines...
-
Data Engineer
3 weeks ago
Pune, Maharashtra, India | Bajaj Technology Services | Full time
We are looking for an immediate joiner for the below position: Data Engineer (ADF, SQL, Python). Experience: 5 to 10 years. Working Hours: 4 AM to 1 PM IST. Remote. Interview Process: 1st round virtual; 2nd round face to face (Pune). Primary Skills: Azure Data Factory (ADF), SQL, DBT, Python, Snowflake. Primary Role: We are seeking a skilled Senior...
-
Data Engineer Azure
2 hours ago
Pune, Maharashtra, India | EXL Talent Acquisition Team | Full time | ₹6,00,000 - ₹18,00,000 per year
Role: Azure Data Engineer. Location: all EXL locations. Work Mode: Hybrid. Key Responsibilities: Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT. Build and maintain data integration workflows from various data sources to Snowflake. Write efficient and optimized SQL queries for data extraction and transformation. Work...
-
Data Engineer 2
2 weeks ago
Pune, Maharashtra, India | Talentonova | Full time | ₹9,00,000 - ₹12,00,000 per year
Position: Data Engineer. Location: PAN India. Duration: Contract to hire. Description: Strong knowledge of the DBT tool and Snowflake is a must. Good experience in end-to-end implementation of AWS development projects, especially the Glue and Lambda services.