Data Engineer – Snowflake and DBT | 2025HP12003/#2Zdafega

2 weeks ago


Hyderabad, India Mindverse Consulting Services Full time

Job Description

Job Summary
We are looking for an experienced and results-driven Data Engineer to join our growing Data Engineering team. The ideal candidate will be proficient in building scalable, high-performance data transformation pipelines using Snowflake and dbt or Matillion, and able to work effectively in a consulting setup. In this role, you will be instrumental in ingesting, transforming, and delivering high-quality data to enable data-driven decision-making across the client’s organization.

Job Responsibilities
1. Design and implement scalable ELT pipelines using dbt on Snowflake, following industry-accepted best practices.
2. Build ingestion pipelines from various sources, including relational databases, APIs, cloud storage, and flat files, into Snowflake.
3. Implement data modelling and transformation logic to support a layered architecture (e.g., staging, intermediate, and mart layers, or a medallion architecture) that enables reliable and reusable data assets.
4. Leverage orchestration tools (e.g., Airflow, dbt Cloud, or Azure Data Factory) to schedule and monitor data workflows.
5. Apply dbt best practices: modular SQL development, testing, documentation, and version control.
6. Perform performance optimizations in dbt/Snowflake through clustering, query profiling, materialization, partitioning, and efficient SQL design.
7. Apply CI/CD and Git-based workflows for version-controlled deployments.
8. Contribute to a growing internal knowledge base of dbt macros, conventions, and testing frameworks.
9. Collaborate with stakeholders such as data analysts, data scientists, and data architects to understand requirements and deliver clean, validated datasets.
10. Write well-documented, maintainable code using Git for version control and CI/CD processes.
11. Participate in Agile ceremonies, including sprint planning, stand-ups, and retrospectives.
12. Support consulting engagements through clear documentation, demos, and delivery of client-ready solutions.

Essential Skills

Required Qualifications
• 3 to 5 years of experience in data engineering roles, with 2+ years of hands-on experience in Snowflake and dbt or Matillion (Matillion-DPC is highly preferred, but not mandatory).
• Experience building and deploying dbt models in a production environment.
• Expert-level SQL and a strong understanding of ELT principles and patterns, as well as data modelling (Kimball/dimensional modelling preferred).
• Familiarity with data quality and validation techniques, e.g., dbt tests and dbt docs.
• Experience with Git, CI/CD, and deployment workflows in a team setting.
• Familiarity with orchestrating workflows using tools like dbt Cloud, Airflow, or Azure Data Factory.

Core Competencies
o Data Engineering and ELT Development: Building robust and modular data pipelines using dbt; writing efficient SQL for data transformation and performance tuning in Snowflake; managing environments, sources, and deployment pipelines in dbt (see the illustrative model sketch below).
o Cloud Data Platform Expertise: Strong proficiency with Snowflake, including warehouse sizing, query profiling, data loading, and performance optimization; experience working with cloud storage (Azure Data Lake, AWS S3, or GCS) for ingestion and external stages.

Technical Toolset
o Languages & Frameworks: Python for data transformation, notebook development, and automation; SQL, with a strong grasp of querying and performance tuning.
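For illustration only (not an additional requirement): a minimal sketch of the kind of layered, modular dbt-on-Snowflake models described above. The project layout, source, model, and column names are hypothetical.

```sql
-- models/staging/stg_orders.sql  (hypothetical staging model; source and columns are assumed)
-- Staging layer: light cleanup of one raw source, materialized as a view.
{{ config(materialized='view') }}

select
    order_id,
    customer_id,
    cast(order_date as date)   as order_date,
    lower(trim(order_status))  as order_status,
    amount                     as order_amount
from {{ source('raw', 'orders') }}
```

```sql
-- models/marts/fct_daily_orders.sql  (hypothetical mart model built on the staging layer)
-- Mart layer: aggregated fact, materialized as a table and clustered for Snowflake pruning.
{{ config(materialized='table', cluster_by=['order_date']) }}

select
    order_date,
    count(*)          as order_count,
    sum(order_amount) as total_order_amount
from {{ ref('stg_orders') }}
group by order_date
```

In a real engagement these models would carry schema.yml tests (e.g., unique and not_null on order_id) and documentation, and would be promoted through Git-based CI/CD using dbt build and dbt test.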
Best Practices and Standards
o Knowledge of modern data architecture concepts, including layered architecture (e.g., staging → intermediate → marts, or a medallion architecture).
o Familiarity with data quality, unit testing (dbt tests), and documentation (dbt docs).

Security & Governance
o Access and Permissions: Understanding of access control within Snowflake (RBAC), role hierarchies, and secure data handling (see the illustrative sketch at the end of this posting). Familiarity with data privacy policies (GDPR basics) and encryption at rest and in transit.

Deployment & Monitoring
o DevOps and Automation: Version control using Git and experience with CI/CD practices in a data context. Monitoring and logging of pipeline executions, with alerting on failures.

Soft Skills
o Communication & Collaboration: Ability to present solutions and handle client demos and discussions. Work closely with onshore and offshore teams of analysts, data scientists, and architects. Ability to document pipelines and transformations clearly. Basic Agile/Scrum familiarity – working in sprints and logging tasks. Comfort with ambiguity, competing priorities, and fast-changing client environments.

Education
o Bachelor’s or master’s degree in Computer Science, Data Engineering, or a related field.
o Certifications in data engineering, such as Snowflake SnowPro or dbt Certified Developer, are a plus.

Mandatory / Most Preferred Skills for This Role
• Must have experience in Snowflake.
• Must have experience in DBT or Matillion (Matillion-DPC is highly preferred).
• Must have experience in SSIS.

Background Check
Required – no criminal record.

Others
• Interview process: 2–3 technical rounds.
• This is a 5-day work-from-office role in Hyderabad.
• You must be open to relocation or travel.
• You must join immediately.

Requirements
Snowflake, DBT/Matillion, Azure Cloud, Data Factory, Databricks, Data Warehousing, SQL, ETL/ELT, SSIS, Cloud Storage
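Purely as an illustration of the Snowflake RBAC concepts listed under Security & Governance, a minimal sketch of a role hierarchy with schema-scoped grants; every role, warehouse, database, and schema name here is hypothetical.

```sql
-- Hypothetical Snowflake role-based access control (RBAC) setup:
-- a read-only analyst role nested under a broader transformer role.
create role if not exists analyst_ro;
create role if not exists transformer;

-- Role hierarchy: transformer inherits the analyst's read privileges.
grant role analyst_ro to role transformer;

-- Read access to the mart layer for the analyst role.
grant usage on warehouse analytics_wh to role analyst_ro;
grant usage on database analytics to role analyst_ro;
grant usage on schema analytics.marts to role analyst_ro;
grant select on all tables in schema analytics.marts to role analyst_ro;
grant select on future tables in schema analytics.marts to role analyst_ro;

-- Write access to the transformation schema for dbt jobs running as transformer.
grant usage, create table, create view on schema analytics.staging to role transformer;
```

The point of the sketch is role inheritance plus schema-level (including FUTURE) grants rather than per-user privileges; actual naming, hierarchy, and scope would follow the client’s governance standards.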




  • Bengaluru, Hyderabad, Pune, India KPI Partners Full time

    Job Title: Data Engineer. Location: Hybrid (Hyderabad/Bangalore/Pune). Experience: 3+ years. Employment Type: Full-time. Mandatory Skills: Azure, DBT, Snowflake, SQL. Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field. Proven experience as a Data Engineer with a strong focus on DBT and Snowflake. Proficiency in SQL and programming...


  • Hyderabad, India Tata Consultancy Services Full time

    TCS is hiring... Role: Snowflake DBT Developer. Experience: 6–8 years. Location: Hyderabad, Chennai, Pune, Bengaluru, Bhubaneshwar, Kochi. Notice period: Immediate to 30 days. Job Description: 1. Extensive experience in Snowflake's data build tool (dbt). 2. Builds and manages data transformation pipelines directly within Snowflake using dbt (data build tool). 3...



  • Bengaluru, Hyderabad, Pune, India Tata Consultancy Services Full time

    Greetings from Tata Consultancy Services (TCS). TCS has always been in the spotlight for being adept in the next big technologies. What we can offer you is a space to explore varied technologies and quench your techie soul. What we are looking for: Snowflake DBT. Location: Chennai/Bengaluru/Hyderabad/Pune/Kochi/Bhubaneshwar. Interview Mode: Virtual mode (Microsoft...

  • Data Engineer

    3 weeks ago


    Hyderabad, India Tech Mahindra Full time

    Skills: Snowflake + DBT + Iceberg + Apache Flink. Experience: 5 to 10 years. Notice Period: Immediate to 15 days. Location: Hyderabad. About the Role: We are seeking a highly skilled Data Engineer to design, build, and optimize modern data pipelines and data models that power our analytics and data products. The ideal candidate will have strong experience with Snowflake and DBT, and...

  • Snowflake Developer

    1 week ago


    Hyderabad, India KPI Partners Full time

    Role: DBT + Snowflake Developer (with OpenFlow experience). Overview: We are seeking a DBT & Snowflake Data Engineer with strong SQL and ELT modeling experience. The role involves validating and refining DBT models, ensuring transformation accuracy, and optimizing Snowflake SQL pipelines. Knowledge of OpenFlow-based orchestration is preferred. Key...
