Pentaho, ETL Tool

4 days ago


Gurgaon, Haryana, India PureSoftware Pvt Ltd Full time ₹ 6,00,000 - ₹ 18,00,000 per year

We are seeking an experienced ETL Developer with strong Pentaho Data Integration (PDI) expertise to support new solution implementations and modernization of existing data workflows. The role involves re-engineering current processes built using Shell scripts, Java, or other legacy automation tools into scalable Pentaho jobs. The candidate will also contribute to cloud migration efforts to Azure, ensuring enterprise-grade performance, security, and maintainability.

Key Responsibilities:

Design, develop, and maintain ETL workflows in Pentaho (PDI) based on existing processes in Shell scripts, Java, or other automation tools.

Implement new, efficient, and scalable data integration pipelines in Pentaho to meet evolving business requirements.

Analyze and reverse-engineer current data workflows to build equivalent solutions in Pentaho.

Support migration of existing on-prem or custom data solutions to Azure Cloud, integrating with services like Azure Blob Storage, ADF, Azure SQL, Key Vault, etc.

Work with various source and target systems such as Oracle, PostgreSQL, SQL Server, CSV, JSON, XML, APIs, etc.

Develop parameterized, modular, and reusable Pentaho transformations and jobs.

Perform data validation, reconciliation, error handling, and logging within the ETL framework.

Optimize Pentaho jobs for performance and monitor scheduled job execution.

Ensure data quality and governance, aligning with enterprise and compliance standards (e.g., GDPR, HIPAA).

Collaborate with business analysts, architects, and data engineers to deliver solutions aligned with functional needs.

Document ETL design, data flow, and operations for ongoing support and enhancements.

Participate in Agile ceremonies, provide estimates, and track tasks using tools like JIRA.
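The validation and reconciliation duty above can be sketched as a small check that compares source and target row counts and returns an audit record. The sketch below is illustrative only: it uses SQLite in place of the actual Oracle/PostgreSQL/SQL Server endpoints, and the `stg_orders`/`dw_orders` table names are hypothetical.

```python
import sqlite3

def reconcile_counts(conn, source_table, target_table):
    """Compare row counts between a source and a target table.

    Returns an audit record; a count mismatch signals a
    reconciliation failure that the ETL framework should log.
    """
    cur = conn.cursor()
    src = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    return {"source": src, "target": tgt, "match": src == tgt}

if __name__ == "__main__":
    # In-memory stand-in for the real staging and warehouse databases.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE stg_orders (id INTEGER);
        CREATE TABLE dw_orders  (id INTEGER);
        INSERT INTO stg_orders VALUES (1), (2), (3);
        INSERT INTO dw_orders  VALUES (1), (2), (3);
    """)
    print(reconcile_counts(conn, "stg_orders", "dw_orders"))
    # → {'source': 3, 'target': 3, 'match': True}
```

In a real PDI deployment the same comparison would typically run as a job step after each load, with the audit record written to a logging table rather than printed.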

Technical Skills:

Experience in rewriting/refactoring legacy scripts into ETL jobs using visual tools like Pentaho.

Strong background in data processing workflows implemented in Shell scripts, Java, or similar tools.

Hands-on experience with Azure Cloud Services relevant to data migration:

• Azure Data Factory

• Azure Blob Storage

• Azure SQL / Synapse

• Azure Key Vault / Managed Identity

Proficient in SQL, stored procedures, and performance tuning.

Experience with data validation, audit logging, and data quality frameworks.

Knowledge of file-based, API-based, and database-based integration techniques.

Version control using Git/GitLab, and awareness of CI/CD practices for ETL deployments.

Familiarity with Agile/Scrum/SAFe methodologies, and use of JIRA/Confluence.

Familiarity with Apache Hop and Power BI is a plus.

Experience in data archival, purging, and retention policy implementation.
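Parameterized, reusable jobs of the kind listed above are typically launched from the command line via PDI's `kitchen.sh` utility, which accepts `-file`, `-level`, and `-param:NAME=VALUE` options. The helper below only assembles that argument list (the job path and parameter names are illustrative); actually running it would require a local Pentaho installation.

```python
def kitchen_command(job_file, params, log_level="Basic"):
    """Build the argv for PDI's kitchen.sh launcher with named parameters.

    `job_file` and the keys of `params` are hypothetical examples;
    real values come from the parameters defined on the job itself.
    """
    cmd = ["kitchen.sh", f"-file={job_file}", f"-level={log_level}"]
    for name, value in sorted(params.items()):
        cmd.append(f"-param:{name}={value}")
    return cmd

if __name__ == "__main__":
    print(kitchen_command("/etl/jobs/load_orders.kjb",
                          {"RUN_DATE": "2025-01-01", "ENV": "dev"}))
    # → ['kitchen.sh', '-file=/etl/jobs/load_orders.kjb', '-level=Basic',
    #    '-param:ENV=dev', '-param:RUN_DATE=2025-01-01']
```

Building the command in one place like this is also a common pattern when wiring scheduled ETL runs into a CI/CD pipeline, since the same parameter set can be reused across dev and prod.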



  • Big Data Engineer

    1 week ago


    Gurgaon, Haryana, India MyCareernet Full time ₹ 15,00,000 - ₹ 25,00,000 per year

    Company: Indian / Global Digital Organization. Key Skills: PySpark, AWS, Python, Scala, ETL. Roles and Responsibilities: Develop and deploy ETL and data warehousing solutions using Python libraries and Linux bash scripts on AWS EC2, with data stored in Redshift. Collaborate with product and analytics teams to scope business needs, design metrics, and build...

  • ETL Testing

    2 weeks ago


    Gurgaon, Haryana, India Talent Worx Full time ₹ 6,00,000 - ₹ 18,00,000 per year

    Position: ETL Testing Engineer. Location: Remote. Experience: 3-6 years. We are seeking a skilled ETL Testing Engineer to join our team. In this role, you will be responsible for validating the ETL processes and ensuring that data is accurately extracted, transformed, and loaded from various sources to data warehouses. You will work closely with data analysts and...

  • ETL Developer

    4 days ago


    Gurgaon, Haryana, India Flairdeck Consulting Full time ₹ 4,28,000 - ₹ 13,00,000 per year

    Job Summary: We are seeking a skilled and experienced ETL Developer with strong proficiency in Informatica PowerCenter, Teradata, and end-to-end ETL processes. The candidate will be responsible for the design, development, testing, and maintenance of scalable ETL workflows to support enterprise data integration, warehousing, and reporting solutions. Key...

  • ETL Developer

    1 week ago


    Gurgaon, Haryana, India Arting Digital Full time ₹ 12,00,000 - ₹ 36,00,000 per year

    Position: ETL Developer. Experience: 4-9 Years. Location: Gurugram. Notice Period: Immediate - 15 days. Job Summary: We are seeking a skilled ETL Developer with strong expertise in Apache NiFi, AWS, and SQL to design, develop, and maintain robust data pipelines. The ideal candidate will have hands-on experience in building scalable ETL workflows, integrating multiple...

  • ETL Test Engineer

    4 days ago


    Gurgaon, Haryana, India Capgemini Full time ₹ 6,00,000 - ₹ 18,00,000 per year

    Role Overview: We are looking for an experienced Software Development Engineer in Test (SDET) with strong expertise in ETL testing, database validation, and Python-based automation. The ideal candidate will ensure data integrity and accuracy across ETL pipelines and data warehouse systems, while leveraging deep technical and analytical...

  • ETL Tester

    1 week ago


    Gurgaon, Haryana, India Adescare Technologies INC Full time ₹ 4,00,000 - ₹ 8,00,000 per year

    Job Title: ETL Tester. Location: Gurugram, India. Experience: 3-6 Years. Work Mode: Onsite / Hybrid (as per project requirement). Employment Type: Full-time / Contract. About the Role: We are looking for a detail-oriented ETL Tester with strong hands-on experience in SQL, API testing, Linux, and SOAP UI. The ideal candidate should be able to validate data across multiple...

  • ETL Developer

    4 days ago


    Gurgaon, Haryana, India BlackRock Full time ₹ 1,00,00,000 - ₹ 2,00,00,000 per year

    Location: Gurgaon, Haryana. Team: Data Operations. Job Requisition #: R252867. Date posted: Nov. 17, 2025. About this role: Business Unit Overview: BlackRock's US Wealth Advisory business ("USWA") manages the firm's relationships with US retail financial services firms and their advisors, who ultimately serve end investors. Representing a full suite of...

  • Data Build Tool

    2 weeks ago


    Gurgaon, Haryana, India Ampcus Tech Full time ₹ 10,000 - ₹ 50,000 per year

    Role: DBT Architect. Location: Gurgaon. DBT Architect with 10+ years of hands-on experience in ETL and the DBT tool; stakeholder management. Attitude: Collaborative & persuasive, self-motivated, research-oriented, hands-on, committed, always-on learner. Should have customer focus - understanding the customer, define the problem, and develop...

  • ETL Developer

    4 days ago


    Gurgaon, Haryana, India BlackRock Full time ₹ 1,00,00,000 - ₹ 2,00,00,000 per year

    About this role: Business Unit Overview: BlackRock's US Wealth Advisory business ("USWA") manages the firm's relationships with US retail financial services firms and their advisors, who ultimately serve end investors. Representing a full suite of strategies – from iShares ETFs and mutual funds to SMAs, model portfolios, alternatives, portfolio solutions, and...


  • Gurgaon, Haryana, India Appzlogic Full time ₹ 8,00,000 - ₹ 20,00,000 per year

    About the Role: We are seeking an experienced ETL Test Engineer with strong expertise in cloud-based ETL tools, Azure ecosystem, and advanced SQL skills. The ideal candidate will have a proven track record in validating complex data pipelines, ensuring data integrity, and collaborating with cross-functional teams in an Agile environment. Location...