DBT Developer
1 week ago
Job Title: DBT Developer - Pune
About Us
"Capco, a Wipro company, is a global technology and management consulting firm. Awarded with Consultancy of the year in the British Bank Award and has been ranked Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With our presence across 32 cities across globe, we support 100+ clients across banking, financial and Energy sectors. We are recognized for our deep transformation execution and delivery.
WHY JOIN CAPCO?
You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry; these projects will transform the financial services industry.
MAKE AN IMPACT
We bring innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.
#BEYOURSELFATWORK
Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.
CAREER ADVANCEMENT
With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.
DIVERSITY & INCLUSION
We believe that diversity of people and perspectives gives us a competitive advantage.
Job Title: DBT Developer - Pune
Location: Pune
Work Mode: Hybrid (3 days WFO - Tues, Wed, Thurs)
Shift Time: 12:30 PM to 9:30 PM
Job Summary
We are seeking a skilled and detail-oriented DBT Engineer to join our cross-functional Agile team. In this role, you will be responsible for designing, building, and maintaining modular, reliable data transformation pipelines using dbt (Data Build Tool) in a Snowflake environment. You will collaborate closely with backend and frontend engineers, product managers, and analysts to create analytics-ready data models that power application features, reporting, and strategic insights. This is an exciting opportunity for someone who values clean data design, modern tooling, and working at the intersection of engineering and business.
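For illustration only, here is a minimal sketch of the kind of analytics-ready model this role builds; the source, model, and column names are hypothetical and not taken from the posting:

```sql
-- models/staging/stg_payments.sql
-- Hypothetical staging model: select from a declared source, then rename and
-- type-cast columns so downstream marts work with a clean, consistent shape.

with source as (

    -- 'raw' and 'payments' are placeholder names declared in a sources.yml file
    select * from {{ source('raw', 'payments') }}

),

renamed as (

    select
        id                                as payment_id,
        order_id,
        lower(payment_method)             as payment_method,
        amount_cents / 100.0              as amount,
        cast(created_at as timestamp_ntz) as created_at
    from source

)

select * from renamed
```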
Key Responsibilities
- Design, build, and maintain scalable, modular dbt models and transformation pipelines using dbt Core; dbt Cloud experience is good to have.
- Demonstrate a thorough understanding of dbt architecture, with experience writing Python operators in dbt flows and strong experience writing Jinja code, macros, seeds, etc. (a data-quality test sketch follows this list).
- Write SQL to transform raw data into curated, tested datasets in Snowflake.
- Apply data modeling techniques such as Data Vault and dimensional modeling (Kimball/Inmon).
- Collaborate with full-stack developers and UI/UX engineers to support application features that rely on transformed datasets.
- Work closely with analysts and stakeholders to gather data requirements and translate them into reliable data models.
- Enforce data quality through rigorous testing, documentation, and version control in dbt.
- Participate in Agile ceremonies (e.g., stand-ups, sprint planning) and manage tasks using Jira.
- Integrate dbt into CI/CD pipelines and support automated deployment practices.
- Monitor data performance and pipeline reliability, and proactively resolve issues.
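As a hedged sketch of the Jinja and data-quality work described above (the test name and columns are illustrative), a reusable generic test in dbt is written in SQL plus Jinja; any rows the compiled query returns are reported as failures by dbt test:

```sql
-- tests/generic/test_not_negative.sql
-- Hypothetical reusable generic test: attach not_negative to a column in a
-- model's schema.yml and dbt test runs this query; returned rows are failures.

{% test not_negative(model, column_name) %}

select
    {{ column_name }} as offending_value
from {{ model }}
where {{ column_name }} < 0

{% endtest %}
```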
Mandatory Qualifications & Skills
- 3–5 years of experience in data engineering or analytics engineering, with a focus on SQL-based data transformation.
- Hands-on production experience using dbt Core or dbt Cloud as a primary development tool.
- Strong command of SQL and a solid understanding of data modeling best practices (e.g., star/snowflake schema; see the fact-model sketch after this list).
- Proven experience with Snowflake as a cloud data warehouse.
- Python skills for data pipeline integration or ingestion.
- Familiarity with Git-based version control workflows.
- Strong communication and collaboration skills, with the ability to work across engineering and business teams.
- Experience working in Agile/Scrum environments and managing work using Jira.
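To make the star-schema and Snowflake modeling expectations concrete, here is a minimal, hypothetical sketch (the staging models, keys, and columns are assumptions, not taken from the posting) of a dbt fact model:

```sql
-- models/marts/fct_orders.sql
-- Hypothetical star-schema fact model: one row per order, carrying foreign
-- keys to dimension tables plus additive measures, built from staging models
-- via ref() so dbt tracks lineage between models.

{{ config(materialized='incremental', unique_key='order_id') }}

with orders as (

    select * from {{ ref('stg_orders') }}

),

payments as (

    select
        order_id,
        sum(amount) as order_amount
    from {{ ref('stg_payments') }}
    group by order_id

)

select
    orders.order_id,
    orders.customer_id,                            -- foreign key to dim_customers
    orders.ordered_at,
    coalesce(payments.order_amount, 0) as order_amount
from orders
left join payments
    on orders.order_id = payments.order_id

{% if is_incremental() %}
  -- on incremental runs, only pick up orders newer than those already loaded
  where orders.ordered_at > (select max(ordered_at) from {{ this }})
{% endif %}
```

Materializing large fact tables incrementally keeps Snowflake compute predictable, and declaring a unique_key lets dbt merge late-arriving updates instead of duplicating rows.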
Nice-to-Have Skills
- Knowledge of data orchestration tools (e.g., Apache Airflow).
- Exposure to CI/CD pipelines and integrating dbt into automated workflows.
- Experience with cloud platforms such as AWS.
- Familiarity with Docker and container-based development.
- Understanding of how data is consumed in downstream analytics tools (e.g., Looker, Tableau, Power BI).
Preferred Experience
- A track record of building and maintaining scalable dbt projects in a production setting.
- Experience working in cross-functional teams involving developers, analysts, and product managers.
- A strong sense of ownership, documentation habits, and attention to data quality and performance.
If you are keen to join us, you will be part of an organization that values your contributions, recognizes your potential, and provides ample opportunities for growth. For more information, follow us on Twitter, Facebook, LinkedIn, and YouTube.
-
DBT Snowflake
5 days ago
Pune, Maharashtra, India | LTIMindtree | Full time. Job description: 5+ years exp, DBT, Snowflake. Design, develop and maintain ELT data pipelines using DBT with Snowflake as the cloud data warehouse. Collaborate with data analysts, data scientists and business stakeholders to gather requirements and translate them into scalable DBT models. Build modular, reusable and well-documented DBT models following...
-
DBT Engineer
1 week ago
Pune, Maharashtra, India | Persistent | Full time | ₹ 12,00,000 - ₹ 36,00,000 per year. About Position: We are hiring; experience with Terraform and Snowflake is a must. Looking for technically strong candidates to grow with Persistent. Role: DBT Engineer. Location: All Persistent Locations. Experience: 5+ Years. Job Type: Full Time Employment. What You'll Do: Terraform: Minimum 3 years of hands-on experience in designing, implementing, and managing...
-
Snowflake DBT
3 days ago
Pune, Maharashtra, India | Virtusa | Full time | ₹ 1,00,00,000 - ₹ 2,00,00,000 per year. You bring extensive experience in DBT, including designing and developing technical architecture, data pipelines, and performance scaling. You have leveraged tools to integrate Talend data and ensure data quality in a big data environment. Your PL/SQL skills are very strong, particularly with queries, procedures, and JOIN operations. You have hands-on...
-
Sr. Developer
7 days ago
Pune, Maharashtra, India | Iilika Groups | Full time | ₹ 15,00,000 - ₹ 25,00,000 per year. Job Title: Sr. Developer - SQL, Snowflake, DBT (Data Build Tool). Number of Positions: 4. Job Type: Contract (Remote). Industry: IT. Experience: 6+ Years. Qualification: Any IT-related degree. Budget: As per industry standards. Joining: Immediate Joiner Preferred. Job Description: We are looking for experienced Sr. Developers with expertise in SQL, Snowflake, and DBT (Data...
-
Snowflake Developer
1 week ago
Pune, Maharashtra, India | Proficient | Full time | ₹ 8,00,000 - ₹ 12,00,000 per year. DBT, Snowflake, Kafka, and SQL; advanced SQL, ANSI SQL, hands-on MS SQL, DataStage, GitHub, Kafka, ETL, data modeling, DataStage or ETL, SQL query
-
Pune, Maharashtra, India | EXL | Full time | ₹ 12,00,000 - ₹ 36,00,000 per year. Description: Role: Azure Data Engineer. Location: All EXL Locations. Work Mode: Hybrid. Key Responsibilities: Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT. Build and maintain data integration workflows from various data sources to Snowflake. Write efficient and optimized SQL queries for data extraction and...
-
Data Engineer with strong Data Vault 2.0
3 days ago
Pune, Maharashtra, India | TESTQ | Full time | ₹ 15,00,000 - ₹ 25,00,000 per year. Strong expertise in data transformation and pipeline development (SQL, Python, DBT). Strong expertise and experience in the Data Vault 2.0 modeling framework. IDE-savvy, working with the dbt CLI locally and GitHub Copilot. Build dbt models following DV 2.0 and star schemas. Working knowledge of Snowflake. Ability to design and optimize scalable data workflows. Exposure...
-
Python + SQL Developer
5 days ago
Pune, Maharashtra, India | Growel Softech Pvt. Ltd. | Full time | ₹ 12,00,000 - ₹ 36,00,000 per year. Python + SQL Developer / Python Data Integration Engineer. Role and Responsibilities: As the Data Integration Engineer, you will play a pivotal role in shaping the future of our data integration engineering initiatives. You will remain actively involved in the technical aspects of the projects. Your responsibilities will include: Hands-On Contribution. Continue to...
-
Data Engineer
23 hours ago
Pune, Maharashtra, India | Innova ESI | Full time | ₹ 10,00,000 - ₹ 25,00,000 per year. Must-have skills: Snowflake + DBT. Experience: 6+ Years. Good-to-have skills: PL/SQL. Location(s): Pune. Detailed JD: Data Engineer with deep expertise in DBT (Data Build Tool), Snowflake, and PL/SQL to join our growing data team. The person will be responsible for designing, developing, and maintaining robust data transformation pipelines that support business...
-
Pune, Maharashtra, India | EXL | Full time | ₹ 12,00,000 - ₹ 36,00,000 per year. Description: We are seeking a Senior Data Engineer to build and optimize data systems that power batch processing, real-time streaming, pipeline orchestration, data lake management, and data cataloging. You will have the opportunity to use your expertise in solving big data problems and apply design thinking, coding and analytical skills to develop data...