Current jobs related to Data engineer - Nadiad - Delta Air Lines
-
Data Modeler
3 weeks ago
Nadiad, India | HCLTech | Full time. Looking for immediate joiners, 5+ to 7 yrs experience. Key responsibilities: Design and implement conceptual, logical, and physical data models aligned with business and regulatory requirements. Develop and maintain ETL/ELT pipelines to support data ingestion, transformation, and integration across systems. Collaborate with DevOps, BI, and application teams to...
-
Data Engineer
1 week ago
Nadiad, India | Tata Consultancy Services | Full time. TCS has been a great pioneer in feeding the fire of young techies like you. We are global leaders in the technology arena, and there's nothing that can stop us from growing together. TCS is hiring for the skill "Azure Databricks Engineer". Role: Azure Sr. Data Engineer. Required technical skill set: Azure Data Factory, Databricks, Data Lake, automation and...
-
Data Engineer
3 weeks ago
Nadiad, India | Talent Corner HR Services Pvt Ltd | Full time. ONLY BANGALORE CANDIDATES APPLY. Position: AWS Data Engineer. Experience: 5-12 years. Must-have skills: Python, AWS, SQL; Data Warehousing is a must. Location: JP Nagar, Bangalore. Salary: industry standards. Work mode: WFO, with 3 days WFH per month. Interview mode: direct F2F interview (Manager + HR rounds on the same day). Interview duration: 1 hour 30 minutes. Skills and...
-
Backend and Data Pipeline Engineer
2 weeks ago
Nadiad, India | JRD Systems | Full time. Job role: Backend and Data Pipeline Engineer - Python. Location: Remote. Job type: Full time. Only immediate joiners. Job summary: The Team: We're investing in technology to develop new products that help our customers drive their growth and transformation agenda. These include new data integration, advanced analytics, and modern applications that address new...
-
Senior AWS Data Engineer
1 day ago
Nadiad, India | CYAN360 | Full time. Position: Senior AWS Data Engineer. Location: Remote. Salary: open. Work timings: 2:30 PM to 11:30 PM IST. Need someone who can join immediately or within 15 days. Responsibilities: Design, develop, and deploy end-to-end data pipelines on AWS cloud infrastructure using services such as Amazon S3, AWS Glue, AWS Lambda, Amazon Redshift, etc. Implement data...
-
Senior Data Engineer
2 days ago
Nadiad, India | iVoyant | Full time. Our client is looking for a Senior Python Data Engineer who not only builds pipelines but also understands business context, data modeling, and why certain schemas or architectures are used. They expect strong Python problem-solving, practical experience with Polars (must have), Pandas, and DuckDB, and the ability to design data pipelines end to end. The role...
-
Data Engineer (Snowflake & Data Vault 2.0)
Nadiad, India | Gravity Infosolutions, Inc. | Full time. Role: Data Engineer with Snowflake & Data Vault 2.0 experience. Type: Contract. Duration: 6 months+, extendable. Experience: 5+ years. Location: Remote. Shift: UK shift. Job description / project details: The project involves setting up a new software tool to meet audit requirements and is therefore critical. Most data to be migrated is already in Snowflake; some will be...
-
AI Data Engineer
1 week ago
Nadiad, India | Turing | Full time. Role overview: We're looking for experienced AI data engineers skilled in Python to collaborate with one of the world's top Large Language Model (LLM) companies. Your work will directly help improve how AI models think, reason, and code. In this role, you'll generate and evaluate high-quality data used to fine-tune and benchmark LLMs. You'll design...
-
Data Engineer
4 weeks ago
About Delta Tech Hub: Delta Air Lines (NYSE: DAL) is the U.S. global airline leader in safety, innovation, reliability, and customer experience. Powered by our employees around the world, Delta has led the airline industry in operational excellence for a decade while maintaining our reputation for award-winning customer service. With our mission of connecting the people and cultures of the globe, Delta strives to foster understanding across a diverse world and serve as a force for social good. Delta has fast emerged as a customer-oriented, innovation-led, technology-driven business. The Delta Technology Hub contributes directly to these objectives. It sustains our long-term aspirations of delivering niche, IP-intensive, high-value, and innovative solutions, supports various teams and functions across Delta, and is an integral part of our transformation agenda, working seamlessly with a global team to create memorable experiences for customers.

Responsibilities:
- Data pipeline development and maintenance: Design, build, and optimize scalable ETL/ELT pipelines to ingest data from diverse sources such as APIs, cloud platforms, and databases. Ensure pipelines are robust, efficient, and capable of handling large volumes of data.
- Data integration and harmonization: Implement data transformation and enrichment processes to support analytics and reporting needs.
- Data quality and monitoring: Develop and implement data validation and monitoring frameworks to ensure data accuracy and consistency. Troubleshoot and resolve issues related to data quality, latency, or performance.
- Collaboration with stakeholders: Partner with cross-functional teams, analysts, and data scientists to understand data requirements and translate them into technical solutions. Provide technical support and guidance on data-related issues or projects.
- Tooling and automation: Leverage cloud-based solutions and frameworks (e.g., AWS) to streamline processes and enhance automation. Maintain and optimize existing workflows while continuously identifying opportunities for improvement.
- Documentation and best practices: Document pipeline architecture, data workflows, and processes for both technical and non-technical audiences. Follow industry best practices for version control, security, and data governance.
- Continuous learning and innovation: Stay current with industry trends, tools, and technologies in data engineering and marketing analytics. Recommend and implement innovative solutions to improve the scalability and efficiency of data systems.

What you need to succeed (minimum qualifications):
- Bachelor of Science degree in Computer Science or equivalent
- Extensive experience with databases and data platforms (AWS preferred)
- 2+ years of hands-on experience designing, implementing, and managing large-scale data and ETL solutions using AWS compute, storage, and database services (S3, Lambda, Redshift, Glue, Athena, etc.)
- Proficiency in Python, SQL, and PySpark
- 2-3 years of post-degree professional experience as a data engineer developing and maintaining data pipelines
- Experience in data quality, data modeling, data analytics/BI, data enrichment, security, and governance
- Understanding of concepts such as normalization, SCD (slowly changing dimensions), and CDC (change data capture)
- Experience working on streaming event platforms such as Kafka/Kinesis
- Strong knowledge of relational and non-relational databases
- Proficiency in dbt for data transformation and modeling
- Good understanding of data warehouses, ETL/ELT, and AWS architecture (using Glue, SQS, SNS, S3, Step Functions, etc.)
- Strong understanding of orchestration tools such as Airflow
- Ability to create clean, well-designed code and systems
- Proven ability to work with large and complex datasets
- Strong analytical and programming skills with the ability to solve data-related challenges efficiently
- Strong attention to detail and a commitment to data accuracy
- Proven ability to learn new data models quickly and apply them effectively in a fast-paced environment
- Excellent communication skills with the ability to present complex data findings to both technical and non-technical audiences
- Ability to work collaboratively in a team environment

Behavioral competencies:
- Ability to work in collaborative environments and embrace diverse perspectives
- Communicate clearly and concisely, express thoughts and ideas effectively, and embrace cultural differences with respect when engaging with others
- Ability to engage effectively with peers and stakeholders to build strong partnerships
- Prioritize, maintain focus, and consistently deliver on commitments
- Proactively understand customer expectations and willingness to create customer-based solutions

What will give you a competitive edge (preferred qualifications):
- Experience working with AWS to develop data pipelines
- AWS certifications: Solutions Architect or Developer Associate
- Experience migrating data pipelines and systems to modern cloud-based solutions