
ETL Developers with Experience in Airflow, Snowflake
4 days ago
Job Title: ETL Developers
Job Location: Coimbatore
Type: WFO
Job description
Key Responsibilities:
1. ETL Design and Development:
Design and develop efficient, scalable SSIS packages to extract, transform, and load data between systems.
Translate business requirements into technical ETL solutions using data flow and control flow logic.
Develop reusable ETL components that support modular, configuration-driven architecture.
2. Data Integration and Transformation:
Integrate data from multiple heterogeneous sources: SQL Server, flat files, APIs, Excel, etc.
Implement business rules and data transformations such as cleansing, standardization, enrichment, and deduplication.
Manage incremental loads, full loads, and slowly changing dimensions (SCD) as required.
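The incremental-load and SCD handling described above most often means a Type 2 dimension: unchanged rows are skipped, changed rows expire the old version and insert a new one. A minimal sketch in Python (the table shape and column names such as `valid_from`/`is_current` are hypothetical, not from this posting):

```python
from datetime import date

def apply_scd2(dimension, incoming, key, tracked, today=None):
    """Minimal Type 2 slowly-changing-dimension merge.

    dimension: existing rows, each with 'valid_from', 'valid_to', 'is_current'.
    incoming:  rows from the source extract.
    key:       natural-key column name.
    tracked:   columns whose change triggers a new version.
    """
    today = today or date.today()
    current = {r[key]: r for r in dimension if r["is_current"]}
    out = list(dimension)
    for row in incoming:
        old = current.get(row[key])
        if old is None:
            # brand-new key: insert first version
            out.append({**row, "valid_from": today, "valid_to": None, "is_current": True})
        elif any(old[c] != row[c] for c in tracked):
            # tracked attribute changed: expire old version, insert new one
            old["valid_to"] = today
            old["is_current"] = False
            out.append({**row, "valid_from": today, "valid_to": None, "is_current": True})
        # unchanged rows are left alone (incremental behaviour)
    return out
```

In SSIS this logic typically lives in a Slowly Changing Dimension transformation or a T-SQL MERGE; the sketch only illustrates the versioning rule itself.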
3. SQL and Database Development:
Write complex T-SQL queries, stored procedures, and functions to support data transformations and staging logic.
Perform joins, unions, aggregations, filtering, and windowing operations effectively for data preparation.
Ensure referential integrity and proper indexing for performance.
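A windowing pattern that comes up constantly in staging logic is ROW_NUMBER()-based deduplication: keep only the latest row per natural key. The T-SQL form is nearly identical; shown here against SQLite via Python so it is self-contained (assumes SQLite >= 3.25 for window functions; table and column names are illustrative):

```python
import sqlite3

# Deduplicate a staging table, keeping the most recent row per natural key.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_customer (cust_id INT, city TEXT, load_ts TEXT);
    INSERT INTO stg_customer VALUES
        (1, 'Erode',      '2025-01-01'),
        (1, 'Coimbatore', '2025-01-02'),
        (2, 'Salem',      '2025-01-01');
""")
rows = conn.execute("""
    SELECT cust_id, city FROM (
        SELECT cust_id, city,
               ROW_NUMBER() OVER (PARTITION BY cust_id
                                  ORDER BY load_ts DESC) AS rn
        FROM stg_customer
    ) WHERE rn = 1
    ORDER BY cust_id
""").fetchall()
```

After this query, each `cust_id` appears once with its latest `city`; in SQL Server the same statement works with `ROW_NUMBER() OVER (...)` unchanged.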
4. Performance Tuning:
Optimize SSIS packages by tuning buffer sizes, using parallelism, and minimizing unnecessary transformations.
Tune SQL queries and monitor execution plans for efficient data movement and transformation.
Implement efficient data loads for high-volume environments.
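For high-volume destinations, the usual first fix is committing in fixed-size batches instead of row-by-row (analogous to tuning SSIS buffer and batch sizes). A small batching helper as a sketch:

```python
def batched(rows, size):
    """Yield fixed-size batches so a high-volume load can commit in
    chunks rather than one row at a time."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch  # trailing partial batch
```

Each yielded batch would then feed one bulk insert or one transaction commit.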
5. Error Handling and Logging:
Develop error-handling mechanisms and event logging in SSIS using Event Handlers and custom logging frameworks.
Implement restartability, checkpoints, and failure notifications in workflows.
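The restartability requirement mirrors SSIS checkpoint files: a failed run records which steps completed, so a rerun resumes from the first incomplete step. A minimal sketch, assuming a JSON checkpoint file and hypothetical step names:

```python
import json
import os

def run_with_checkpoint(steps, checkpoint_path):
    """Run named ETL steps in order, skipping steps already recorded
    as done in the checkpoint file. On failure the checkpoint survives,
    so the next run resumes from the first incomplete step."""
    done = set()
    if os.path.exists(checkpoint_path):
        with open(checkpoint_path) as f:
            done = set(json.load(f))
    executed = []
    for name, fn in steps:
        if name in done:
            continue            # completed in a previous run
        fn()                    # raises on failure, leaving checkpoint intact
        done.add(name)
        executed.append(name)
        with open(checkpoint_path, "w") as f:
            json.dump(sorted(done), f)
    return executed
```

Failure notifications would hang off the exception path (e.g. an alert in the caller's except block), which the sketch leaves to the orchestrator.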
6. Testing and Quality Assurance:
Conduct unit and integration testing of ETL pipelines.
Validate data outputs against source and business rules.
Support QA teams in user acceptance testing (UAT) and defect resolution.
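Validating outputs against source typically starts with a reconciliation check: row counts match and no keys were dropped or invented in flight. One way to sketch that check (function and field names are illustrative):

```python
def reconcile(source_rows, target_rows, key):
    """Basic ETL validation: compare source and target extracts on
    row count and key coverage."""
    src_keys = [r[key] for r in source_rows]
    tgt_keys = [r[key] for r in target_rows]
    return {
        "count_match": len(source_rows) == len(target_rows),
        "missing_in_target": sorted(set(src_keys) - set(tgt_keys)),
        "unexpected_in_target": sorted(set(tgt_keys) - set(src_keys)),
    }
```

A unit test would assert `count_match` is true and both key-difference lists are empty before a pipeline is promoted.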
7. Deployment and Scheduling:
Package, deploy, and version SSIS solutions across development, test, and production environments.
Schedule ETL jobs using SQL Server Agent or enterprise job schedulers (e.g., Control-M, Tidal).
Monitor and troubleshoot job failures and performance issues.
8. Documentation and Maintenance:
Maintain documentation for ETL designs, data flow diagrams, transformation logic, and job schedules.
Update job dependencies and maintain audit trails for data pipelines.
9. Collaboration and Communication:
Collaborate with data architects, business analysts, and reporting teams to understand data needs.
Provide technical support and feedback during requirements analysis and post-deployment support.
Participate in sprint planning, status reporting, and technical reviews.
10. Compliance and Best Practices:
Ensure ETL processes comply with data governance, security, and privacy regulations (HIPAA, GDPR, etc.).
Follow team coding standards, naming conventions, and deployment protocols.
Required Skills & Experience:
- 4–8 years of hands-on experience with ETL development using SSIS.
- Strong SQL Server and T-SQL skills.
- Solid understanding of data warehousing concepts and best practices.
- Experience with flat files, Excel, APIs, or other common data sources.
- Familiarity with job scheduling and monitoring (e.g., SQL Agent).
- Strong analytical and troubleshooting skills.
- Ability to work independently and meet deadlines.
Preferred Skills:
- Exposure to Azure Data Factory or cloud-based ETL tools.
- Experience with Power BI or other reporting platforms.
- Experience in healthcare, finance, or regulated domains is a plus.
- Knowledge of version control tools like Git or Azure DevOps.