
Consultant - Data Engineer, AWS + Python, Spark, Kafka for ETL
5 days ago
Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Our industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to breakthrough solutions, we tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today.

Inviting applications for the role of Consultant - Data Engineer, AWS, Python, Spark, Kafka for ETL.

Responsibilities:
- Develop, deploy, and manage ETL pipelines using AWS services, Python, Spark, and Kafka.
- Integrate structured and unstructured data from various data sources into data lakes and data warehouses.
- Design and deploy scalable, highly available, and fault-tolerant AWS data processes using AWS data services (Glue, Lambda, Step Functions, Redshift).
- Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness.
- Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms.
- Migrate application data from legacy databases to cloud-based solutions (Redshift, DynamoDB, etc.) for high availability at low cost.
- Develop application programs using big-data technologies such as Apache Hadoop and Apache Spark together with appropriate cloud services such as Amazon AWS.
- Build data pipelines by building ETL (Extract-Transform-Load) processes.
- Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data.
- Analyze business and functional requirements, which involves reviewing existing system configurations and operating methodologies as well as understanding evolving business needs.
- Analyze requirements and user stories in business meetings, assess the impact of requirements across platforms and applications, and convert business requirements into technical requirements.
- Participate in design reviews to provide input on functional requirements, product designs, schedules, and potential problems.
- Understand the current application infrastructure and suggest cloud-based solutions that reduce operational cost and require minimal maintenance while providing high availability and improved security.
- Perform unit testing on modified software to ensure that new functionality works as expected while existing functionality continues to work in the same way.
- Coordinate with release management and other supporting teams to deploy changes to the production environment.

Qualifications we seek in you:

Minimum Qualifications:
- Experience in designing and implementing data pipelines, building data applications, and performing data migration on AWS.
- Strong experience implementing data lakes using AWS services such as Glue, Lambda, Step Functions, and Redshift; experience with Databricks is an added advantage.
- Strong experience in Python and SQL.
- Proven expertise in AWS services such as S3, Lambda, Glue, EMR, and Redshift.
- Advanced programming skills in Python for data processing and automation.
- Hands-on experience with Apache Spark for large-scale data processing.
- Experience with Apache Kafka for real-time data streaming and event processing.
- Proficiency in SQL for data querying and transformation.
- Strong understanding of security principles and best practices for cloud-based environments.
- Experience with monitoring tools and implementing proactive measures to ensure system availability and performance.
- Excellent problem-solving skills and the ability to troubleshoot complex issues in a distributed, cloud-based environment.
- Strong communication and collaboration skills to work effectively with cross-functional teams.

Preferred Qualifications/Skills:
- Master's degree in Computer Science, Electronics, or Electrical Engineering.
- AWS Data Engineering and cloud certifications; Databricks certifications.
- Experience with multiple data integration technologies and cloud platforms.
- Knowledge of Change and Incident Management processes.

Why join Genpact?
- Be a transformation leader: work at the cutting edge of AI, automation, and digital innovation.
- Make an impact: drive change for global enterprises and solve business challenges that matter.
- Accelerate your career: get hands-on experience, mentorship, and continuous learning opportunities.
- Work with the best: join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
- Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a "starter kit," paying to apply, or purchasing equipment or training.
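To make the core responsibility concrete, here is a minimal sketch of the kind of batch ETL pipeline the posting describes, written in PySpark; the S3 paths and column names are illustrative assumptions, not details from the role.

```python
# Minimal PySpark ETL sketch: read raw JSON from S3, clean it, write
# partitioned Parquet back to a lake location. Paths and columns are
# illustrative placeholders, not taken from the posting.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: structured/semi-structured input from a raw S3 zone.
raw = spark.read.json("s3://example-raw-bucket/orders/")

# Transform: drop malformed rows, normalize types, derive a partition key.
clean = (
    raw.filter(F.col("order_id").isNotNull())
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_ts"))
)

# Load: columnar output partitioned for downstream Redshift Spectrum/Athena.
clean.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-curated-bucket/orders/"
)
```

Partitioning by a date column keeps downstream warehouse scans cheap, which is the kind of cost-effectiveness the responsibilities call out.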
-
AWS Data Engineer - SageMaker
1 week ago
Bangalore, India YASH Technologies Full time
Primary skillsets: AWS services including Glue, PySpark, SQL, Databricks, Python. Secondary skillset: any ETL tool, GitHub, DevOps (CI/CD). Mandatory skill set: Python, PySpark, SQL, AWS, with experience designing, developing, testing, and supporting data pipelines and applications. Strong understanding and hands-on experience with AWS services like EC2, S3,...
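As an illustration of the Glue/PySpark skill set this listing emphasizes, below is a skeleton of a typical AWS Glue job; the catalog database, table, and S3 path are placeholder names, not details from the posting.

```python
# Skeleton of an AWS Glue PySpark job: catalog read, simple transform,
# Parquet write. Database/table names and the S3 path are placeholders.
import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue = GlueContext(SparkContext.getOrCreate())
job = Job(glue)
job.init(args["JOB_NAME"], args)

# Read a table registered in the Glue Data Catalog.
dyf = glue.create_dynamic_frame.from_catalog(
    database="example_db", table_name="example_table"
)

# Drop obviously unusable rows via a plain Spark DataFrame.
df = dyf.toDF().dropna(subset=["id"])

# Write curated Parquet back to S3.
df.write.mode("overwrite").parquet("s3://example-curated-bucket/example_table/")

job.commit()
```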
-
Data engineer
1 week ago
Bangalore, India LTIMindtree Full time
Job Role: Data Engineer. Experience: 12-16 years. Notice Period: Immediate to 30 days. Job Location: Bengaluru or Pan India. Primary Skills: Big Data, Spark, Scala, and AWS. Secondary Skills: Kubernetes, Kafka. Job Description: 5+ years of software development experience with a deep understanding of algorithms, data structures, design patterns, data...
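For the Kafka secondary skill mentioned here, a minimal Python consumer loop might look like the following sketch (using the kafka-python client); the topic, broker address, and payload fields are assumptions for illustration.

```python
# Hedged sketch of a Kafka consumer loop with kafka-python; the broker
# address, topic, and payload shape are illustrative assumptions.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "events",                                  # hypothetical topic
    bootstrap_servers="localhost:9092",        # placeholder broker
    group_id="etl-consumers",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # Downstream: validate, enrich, and hand off to the pipeline.
    print(event.get("event_type"), message.offset)
```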
-
Databricks + Python + AWS
1 week ago
Bengaluru, Karnataka, India ESK Technologies Full time
**Department**: Software Development
We are looking for **Databricks professionals** with 7-10 years of experience in designing data pipelines, ETL processes, and cloud-based data platforms using **Spark**, **Databricks**, and **AWS**. Proficiency in **Java/Python**, data structures, and modern engineering practices like **CI/CD** and **DevSecOps** is...
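One way the CI/CD and DevSecOps practices this listing names show up in Spark code is by keeping transforms as pure, unit-testable functions; the sketch below assumes hypothetical column names and a common deduplication step.

```python
# One way to keep Spark logic CI-friendly: isolate transforms as pure
# functions over DataFrames so they can be unit-tested outside Databricks.
# Column names here are illustrative.
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F
from pyspark.sql.window import Window

def deduplicate_latest(df: DataFrame) -> DataFrame:
    """Keep the most recent record per key, a common curation step."""
    w = Window.partitionBy("id").orderBy(F.col("updated_at").desc())
    return (df.withColumn("rn", F.row_number().over(w))
              .filter(F.col("rn") == 1)
              .drop("rn"))

if __name__ == "__main__":
    # Tiny local check of the transform, runnable in a CI pipeline.
    spark = SparkSession.builder.master("local[1]").getOrCreate()
    data = [(1, "2024-01-01"), (1, "2024-02-01"), (2, "2024-01-15")]
    df = spark.createDataFrame(data, ["id", "updated_at"])
    assert deduplicate_latest(df).count() == 2  # one row per id
```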
-
Aws Data Architect
1 week ago
Bengaluru, Karnataka, India Infogain Full time
ROLES & RESPONSIBILITIES
Key Responsibilities:
- Design and implement scalable, reliable, and high-performance data architectures to support business needs.
- Develop and maintain real-time data streaming solutions using Kafka and other streaming technologies.
- Utilize AWS cloud services to build and manage data infrastructure, ensuring security,...
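A real-time streaming solution of the kind described here is often built with Spark Structured Streaming over Kafka; the sketch below uses placeholder broker, topic, and sink paths (and requires the spark-sql-kafka package on the cluster).

```python
# Sketch of a real-time ingestion path: Spark Structured Streaming reading
# from Kafka and landing decoded records in S3. Broker, topic, and paths
# are placeholders, not details from the listing.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-stream").getOrCreate()

stream = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")  # placeholder
         .option("subscribe", "events")                     # hypothetical topic
         .load()
)

# Kafka delivers bytes; decode the value before parsing downstream.
decoded = stream.select(F.col("value").cast("string").alias("payload"))

query = (
    decoded.writeStream.format("parquet")
           .option("path", "s3://example-stream-sink/events/")
           .option("checkpointLocation", "s3://example-stream-sink/_chk/")
           .start()
)
query.awaitTermination()
```

The checkpoint location is what gives the stream fault tolerance and exactly-once file output on restart.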
-
AWS Data Engineer
Bangalore, India Acme Services Full time
Job Title: AWS Data Engineer. Location: Bangalore, Gurgaon, and Pune. Experience: 5 to 10 years. Employment Type: Full-time.
About the Role: We are seeking a highly skilled AWS Data Engineer to design, build, and optimize scalable data pipelines and platforms. The ideal candidate will have strong expertise in AWS cloud services, Snowflake, Python,...
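For the AWS-plus-Snowflake stack this role centers on, a common pattern is staging files in S3 and loading them with COPY INTO; the sketch below uses the snowflake-connector-python client, and every credential, stage, and table name is a placeholder.

```python
# Hedged sketch of loading S3-staged files into Snowflake with COPY INTO.
# All identifiers and credentials below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    user="EXAMPLE_USER",
    password="example-password",     # use a secrets manager in practice
    account="example-account",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)
try:
    cur = conn.cursor()
    # Assumes an external stage pointing at the curated S3 prefix.
    cur.execute("""
        COPY INTO RAW.ORDERS
        FROM @curated_stage/orders/
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
finally:
    conn.close()
```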
-
Data Engineer - Snowflake, AWS Glue, Kafka, API
Bengaluru East, Karnataka, India NTT DATA North America Full time ₹12,00,000 - ₹36,00,000 per year
Req ID: 337802. NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Data Engineer with Snowflake, AWS Glue, Kafka, and API experience to join our team in Bangalore, Karnataka (IN-KA), India (IN). 6+ years...
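Combining AWS Glue with downstream Snowflake loads, as this role suggests, usually means triggering and monitoring Glue jobs programmatically; here is a hedged boto3 sketch in which the job name and region are placeholders.

```python
# Sketch of orchestrating an AWS Glue job from Python with boto3;
# the job name and region are illustrative placeholders.
import time
import boto3

glue = boto3.client("glue", region_name="ap-south-1")

run = glue.start_job_run(JobName="example-snowflake-load")
run_id = run["JobRunId"]

# Poll until the run reaches a terminal state.
while True:
    state = glue.get_job_run(JobName="example-snowflake-load", RunId=run_id)
    status = state["JobRun"]["JobRunState"]
    if status in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        print("Glue run finished:", status)
        break
    time.sleep(30)
```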
-
Urgent Search: Sr. Data Engineer
3 days ago
Bangalore, Karnataka, India Visa Full time
Company Description: Visa is a world leader in payments and technology, with over 259 billion payments transactions flowing safely between consumers, merchants, financial institutions, and government entities in more than 200 countries and territories each year. Our mission is to connect the world through the most innovative, convenient, reliable, and secure...
-
Data Engineer - AWS/Python
2 weeks ago
HAL Bangalore Airport, India RAPSYS TECHNOLOGIES PTE LTD Full time
We're Hiring: Data Engineer - AWS/Python/Airflow. We are seeking an experienced Data Engineer to design, build, and maintain scalable data pipelines and infrastructure. The ideal candidate will have expertise in AWS cloud services, Python programming, and Apache Airflow to support our data-driven initiatives and analytics needs. Location: Bangalore Urban,...
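Since this role pairs AWS and Python with Apache Airflow, here is a minimal DAG sketch showing a daily extract-then-load dependency; the DAG id, task bodies, and schedule are illustrative assumptions.

```python
# Minimal Airflow 2.x DAG sketch for a daily extract -> load flow;
# task bodies and names are illustrative placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull from source API / S3")

def load():
    print("write to warehouse")

with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+ keyword; older versions use schedule_interval
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_load
```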