Hadoop Big Data Engineer
5 days ago
Responsibilities:
• Onboarding and updating data deliveries on our Hadoop data lake
• Developing frameworks to process and deploy new data delivery methods or file formats (JSON, XML, CSV, Protobuf, Avro, Parquet, etc.)
• Creating views and tables for our users with optimal performance for their use case
• Connecting and interfacing with systems via APIs
• Supporting both Development and Operations teams in quickly solving complex issues that hinder the team and/or stakeholders
• Coordinating with source teams that produce data and with the consumers who use it for their use cases
• Supporting in the design, building, and testing of new services and extending existing services on our platform
• Team collaboration on complex challenges, code reviews and testing new functionalities
• Evaluating stakeholder requirements and guiding the innovation team towards the most suitable solution architecture
• Identifying common development patterns and ensuring generic solutions and architecture are applied where possible
• Taking the design lead for complex requirements that go beyond the team's standard work
• Continuing to move the teams towards more automation and efficient ways of working, to improve quality, eliminate manual mistakes, and make deliveries more predictable
Education, Experience, and Licensing Requirements:
As an engineer, you ideally have the following qualifications or experience:
• Extensive knowledge of data modelling, data warehousing, relational databases, and Hadoop big data stores
• Hadoop, specifically HDFS, Hive, and SQL
• Java/Scala and Spark, with 5+ years of general software design/development experience
• Shell scripting / Shell commands
• Git and Confluence
• Bachelor's Degree in Computer Science, Information Systems or other area of academic study (IT experience can substitute for a Bachelor's Degree)
• DevOps mindset
• Able to learn quickly in a complex multitool environment
Extra assets are:
• Python, Groovy, Ansible
• HBase and Phoenix
• Apache NiFi and Kafka
• General hands-on experience with (Cloudera's) Hadoop Suite
Domain
• Telecom Domain knowledge - mainly Fault, Performance, Configuration and Capacity Management Data
-
Big Data/Hadoop Engineer
2 weeks ago
Bengaluru, Karnataka, India · Rackspace Technology · Full time · ₹ 15,00,000 - ₹ 25,00,000 per year
We are hiring for Big Data / Hadoop Engineer. Exp: 3+ yrs. Location: Bangalore (3 days WFO). Notice Period: Less than 30 days only.
Big Data Development: Design, build, and maintain robust and scalable data pipelines for ETL/ELT processes using tools within the Hadoop ecosystem. Programming & Scripting: Develop complex data processing applications using Spark...
-
Rackspace - Big Data/Hadoop Engineer
2 weeks ago
Bengaluru, Karnataka, India · Rackspace Technology · Full time · ₹ 80,000 - ₹ 1,50,000 per year
Job Description: Big Data/Hadoop Engineer. Location: India - Bangalore. Department: Public Cloud - Offerings and Delivery, Cloud Data Services / Hybrid. Experience: 3 years. Notice Period: Less than 30 days only. Work Arrangement: 3 days Work From Office.
Job Responsibilities: Big Data Development: Design, build, and maintain robust and scalable data pipelines for...
-
Hadoop Administrator
9 hours ago
Bengaluru, India · Tehno Right · Full time
Job Role: Hadoop Administrator (role open for multiple locations) - WFH and WFO
Job description: What is your role? You will manage Hadoop clusters, data storage, server resources, and other virtual computing platforms. You perform a variety of functions, including data migration, virtual machine set-up and training, troubleshooting end-user problems,...
-
ITSS-Big Data/Hadoop/Splunk Developer
Bengaluru, Karnataka, India · NTT DATA · Full time
Req ID: 325776. We are currently seeking an ITSS-Big Data/Hadoop/Splunk Developer-HUMJP00032981 to join our team in Bangalore, Karnataka (IN-KA), India (IN). Azure Data Factory, Azure Synapse, and Databricks. About NTT DATA...
-
Big Data Engineer
1 week ago
Bengaluru, India · IT Firm · Full time
Role Overview: As a Big Data Engineer, you will be responsible for building and maintaining scalable data pipelines and architectures that support data analytics, reporting, and real-time processing. You will collaborate with data scientists, business analysts, and software engineers to design robust solutions that drive data-driven decisions. Mandatory Skill...
-
Senior Big Data Developer
6 days ago
Bengaluru, India · Info Origin Inc · Full time
Job Title: Senior Big Data Developer. Required Experience: 6+ Years. Location: Bangalore. Employment Type: Full-time.
Key Responsibilities:
• Develop, test, and deploy high-performance data pipelines using Hadoop, Spark, Kudu, and HBase.
• Implement ETL/ELT workflows and ensure data integrity and scalability.
• Work closely with Data Architects...
-
Big Data - PySpark Developer - Hadoop
2 weeks ago
Bengaluru, Karnataka, India · Infosys · Full time · ₹ 12,00,000 - ₹ 36,00,000 per year
Key Responsibilities: Deploying a Hadoop cluster; maintaining a Hadoop cluster; adding and removing nodes using cluster monitoring tools like Cloudera Manager; configuring NameNode high availability; and keeping track of all running Hadoop jobs. Implementing, managing, and administering the overall Hadoop infrastructure. Knowledge of all the components in the...