Data Analyst with ETL Expertise

2 weeks ago


Noida, Uttar Pradesh, India Eice Technology Full time ₹ 9,00,000 - ₹ 12,00,000 per year

Responsibilities

Design, Develop, and Maintain ETL Pipelines: Create, optimize, and manage Extract, Transform, Load (ETL) processes using Python scripts and Pentaho Data Integration (Kettle) to move and transform data from various sources into target systems (e.g., data warehouses, data lakes); a minimal sketch of such a pipeline appears after this list.

Data Quality Assurance: Implement rigorous data validation, cleansing, and reconciliation procedures to ensure the accuracy, completeness, and consistency of data.

Data Sourcing and Integration: Work with diverse data sources, including relational databases (SQL Server, MySQL, PostgreSQL), flat files (CSV, Excel), APIs, and cloud platforms.

Performance Optimization: Identify and implement improvements for existing ETL processes to enhance data load times, efficiency, and scalability.

Troubleshooting and Support: Diagnose and resolve data-related issues, ensuring data integrity and timely availability for reporting and analysis.

Documentation: Create and maintain comprehensive documentation for all ETL processes, data flows, and data dictionaries.

Collaboration: Work closely with data engineers, data scientists, business analysts, and other stakeholders to understand data requirements and deliver robust data solutions.

Ad-hoc Analysis: Perform ad-hoc data analysis and provide insights to support business decisions as needed.
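
To make the pipeline and data-quality items above concrete, here is a minimal sketch of the kind of Python ETL flow this role describes: extract a flat file, apply basic cleansing and validation with pandas, and load the result into a PostgreSQL staging table via SQLAlchemy. The file path, connection string, columns, and table name are hypothetical placeholders, not details from this posting.

```python
# Minimal ETL sketch: CSV -> pandas cleansing/validation -> PostgreSQL.
# All paths, credentials, columns, and table names are illustrative assumptions.
import pandas as pd
from sqlalchemy import create_engine

SOURCE_FILE = "data/orders.csv"                      # hypothetical source flat file
TARGET_URI = "postgresql+psycopg2://etl_user:secret@localhost:5432/warehouse"
TARGET_TABLE = "stg_orders"                          # hypothetical staging table


def extract(path: str) -> pd.DataFrame:
    """Read the raw flat file."""
    return pd.read_csv(path)


def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Basic cleansing and validation: dedupe, type-cast, enforce required fields."""
    df = df.drop_duplicates()
    df["customer_id"] = df["customer_id"].astype("Int64")
    df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
    # Reject rows that fail simple completeness checks.
    invalid = df["order_date"].isna() | df["customer_id"].isna()
    if invalid.any():
        print(f"Dropping {int(invalid.sum())} invalid rows")
    return df[~invalid]


def load(df: pd.DataFrame, uri: str, table: str) -> None:
    """Append the cleaned frame into the warehouse staging table."""
    engine = create_engine(uri)
    df.to_sql(table, engine, if_exists="append", index=False)


if __name__ == "__main__":
    load(transform(extract(SOURCE_FILE)), TARGET_URI, TARGET_TABLE)
```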

About the Role:

We are looking for a skilled and passionate Data Engineer with 3 to 4 years of experience in building robust ETL pipelines using both visual ETL tools (preferably Kettle/Pentaho) and Python-based frameworks. You will be responsible for designing, developing, and maintaining high-quality data workflows that support our data platforms and reporting environments.

Key Responsibilities:

Design, develop, and maintain ETL pipelines using Kettle (Pentaho) or similar tools.

Build data ingestion workflows using Python (Pandas, SQLAlchemy, psycopg2); a sketch of one such workflow, with error handling and logging, follows this list.

Extract data from relational and non-relational sources (APIs, CSV, databases).

Perform complex transformations and ensure high data quality.

Load processed data into target systems such as PostgreSQL, Snowflake, or Redshift.

Implement monitoring, error handling, and logging for all ETL jobs.

Maintain job orchestration via shell scripts, cron, or workflow tools (e.g., Airflow).

Work with stakeholders to understand data needs and deliver accurate, timely data.

Maintain documentation for pipelines, data dictionaries, and metadata.
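
As a rough illustration of the ingestion, error-handling, and logging responsibilities above, the sketch below pulls JSON from a REST API with requests, logs each stage, and bulk-loads the records into PostgreSQL with psycopg2. The endpoint URL, DSN, staging table, and record fields are assumptions made up for the example.

```python
# Sketch of a logged, error-handled API ingestion job (assumed endpoint/table/credentials).
import logging

import psycopg2
import requests
from psycopg2.extras import execute_values

API_URL = "https://api.example.com/v1/customers"     # hypothetical REST endpoint
DSN = "dbname=warehouse user=etl_user password=secret host=localhost"

logging.basicConfig(level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("customer_ingest")


def extract() -> list[dict]:
    """Pull JSON records from the source API (assumed to return a list of objects)."""
    resp = requests.get(API_URL, timeout=30)
    resp.raise_for_status()
    return resp.json()


def load(rows: list[dict]) -> int:
    """Bulk-insert records into a staging table using psycopg2."""
    values = [(r["id"], r["name"], r.get("email")) for r in rows]
    with psycopg2.connect(DSN) as conn, conn.cursor() as cur:
        execute_values(
            cur,
            "INSERT INTO stg_customers (id, name, email) VALUES %s "
            "ON CONFLICT (id) DO NOTHING",
            values,
        )
    return len(values)


if __name__ == "__main__":
    try:
        rows = extract()
        log.info("Extracted %d records", len(rows))
        log.info("Loaded %d records", load(rows))
    except requests.RequestException:
        log.exception("Extraction failed")
        raise
    except psycopg2.Error:
        log.exception("Load failed")
        raise
```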

Requirements:

3 to 4 years of experience in Data Engineering or ETL development.

Hands-on experience with Kettle (Pentaho Data Integration) or similar ETL tools.

Strong proficiency in Python (including pandas, requests, datetime, etc.).

Strong SQL knowledge and experience with relational databases (PostgreSQL, SQL Server, etc.).

Experience with source control (Git), scripting (Shell/Bash), and config-driven ETL pipelines.

Good understanding of data warehousing concepts, performance optimization, and incremental loads (an incremental-load sketch follows this list).

Familiarity with REST APIs, JSON, XML, and flat file processing.
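
For the incremental-load point above, one common pattern is a watermark query: read the latest `updated_at` already in the warehouse, pull only newer rows from the source, and append them. The sketch below assumes hypothetical `orders`/`dw_orders` tables and an `updated_at` column; it is an illustration of the pattern, not this employer's pipeline.

```python
# Watermark-based incremental load sketch (assumed table and column names).
import pandas as pd
from sqlalchemy import create_engine, text

SOURCE_URI = "postgresql+psycopg2://app:secret@source-db:5432/app"
TARGET_URI = "postgresql+psycopg2://etl_user:secret@warehouse:5432/dw"

src = create_engine(SOURCE_URI)
tgt = create_engine(TARGET_URI)

# 1. Find the high-water mark already loaded into the warehouse.
with tgt.connect() as conn:
    watermark = conn.execute(
        text("SELECT COALESCE(MAX(updated_at), '1970-01-01') FROM dw_orders")
    ).scalar()

# 2. Pull only rows changed since the last load.
delta = pd.read_sql_query(
    text("SELECT * FROM orders WHERE updated_at > :wm"),
    src,
    params={"wm": watermark},
)

# 3. Append the delta; dedup/merge logic would live in the warehouse in a real pipeline.
delta.to_sql("dw_orders", tgt, if_exists="append", index=False)
```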

Good to Have:

Experience with job scheduling tools (e.g., Airflow, Jenkins); a minimal scheduling sketch follows this list.

Familiarity with cloud platforms (AWS, Azure, or GCP).

Knowledge of Data Lakes, Big Data, or real-time streaming tools is a plus.

Experience working in Agile/Scrum environments.
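
For the scheduling item above, a minimal Airflow DAG might look like the sketch below, which wires a single PythonOperator to a daily cron schedule. It assumes Airflow 2.4+ (where the `schedule` argument is available); the DAG id, schedule, and callable are illustrative only.

```python
# Minimal Airflow DAG sketch for scheduling a daily ETL job (names are illustrative).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_daily_etl() -> None:
    """Placeholder for the actual extract/transform/load routine."""
    print("running daily ETL")


with DAG(
    dag_id="daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",   # every day at 02:00
    catchup=False,
) as dag:
    PythonOperator(task_id="run_daily_etl", python_callable=run_daily_etl)
```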

Soft Skills:

Strong analytical and problem-solving skills.

Self-motivated and able to work independently and in a team.

Good communication skills with technical and non-technical stakeholders.

Industry

Software Development

Employment Type

Full-time


  • Data Analyst

    3 weeks ago


    Noida, Uttar Pradesh, India Velodata Global Pvt Ltd Full time

    Job Title: BI Developer / Data Analyst Location: Bengaluru (Karnataka), Noida (Uttar Pradesh) Mode of Interviews: 2 rounds (Virtual) Experience Brackets & Compensation 4–6 years: Up to ₹18–20 LPA (based on profile & exposure). 6–8 years: Up to ₹28 LPA (based on profile & exposure). Joining Requirement Immediate joiners preferred. Maximum 15–20...

  • ETL Tester

    2 weeks ago


    Noida, Uttar Pradesh, India Strategic Talent Partner Full time

    Role: QA Automation Engineer. Job Description: As a QA Automation Engineer specializing in Data Warehousing, you will play a critical role in ensuring that our data solutions are of the highest quality. You will work closely with data engineers and analysts to develop, implement, and maintain automated testing frameworks for data validation, ETL processes,...

  • Data Engineer

    3 weeks ago


    Noida, Uttar Pradesh, India NTT DATA Full time

    Req ID 303115. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable and forward-thinking organization, apply now. We are currently seeking a Data Engineer to join our team in Noida, Uttar Pradesh (IN-UP), India. Job Duties: As a data engineer you will have a...

  • ETL QA

    1 week ago


    Noida, Uttar Pradesh, India Iris Software Full time ₹ 15,00,000 - ₹ 28,00,000 per year

    Key Responsibilities: Data Testing Strategy & Execution: Design, develop, and execute comprehensive test plans and test cases for data-centric applications, ETL processes, data warehouses, data lakes, and reporting solutions. SQL-Driven Validation: Utilize advanced SQL queries to perform complex data validation, data reconciliation, data integrity checks,...

  • ETL Data Engineer

    2 weeks ago


    Noida, Uttar Pradesh, India Lumiq Full time ₹ 1,04,000 - ₹ 1,30,878 per year

    Who we are: LUMIQ is the leading Data and Analytics company in the Financial Services and Insurance (FSI) industry. We are trusted by the world's largest FSIs, including insurers, banks, AMCs, and NBFCs, to address their data challenges. Our clients include 40+ enterprises with over $10B in deposits/AUM, collectively representing about 1B customers globally...



  • ETL Architect

    2 weeks ago


    Greater Noida, Uttar Pradesh, India Ipeople Infosysteams LLC Full time

    Hi, hope you are doing well. I am hiring for a position with a client. Please go through the JD below and let me know your interest at swapnil.t@ipeopleinfosystems.com. Role: ETL Architect. Location: Greater Noida (onsite). Type: Full time. Experience: 10-16 years. We are seeking a skilled ETL Architect Engineer having an experience of...

  • ETL Tester

    29 minutes ago


    Noida, Uttar Pradesh, India Iris Software Full time ₹ 20,00,000 - ₹ 25,00,000 per year

    Data Testing Strategy & Execution: Design, develop, and execute comprehensive test plans and test cases for data-centric applications, ETL processes, data warehouses, data lakes, and reporting solutions. SQL-Driven Validation: Utilize advanced SQL queries to perform complex data validation, data reconciliation, data integrity checks, and data quality...