Hadoop and Kafka
4 days ago
Travitons Technologies Pvt. Ltd.
3 - 15 Years
INR 0.5 - 1.5 LPA
India
**Job Description**:
**Responsibilities**:
1. Design, develop, and implement data processing solutions leveraging Hadoop ecosystem tools such as HDFS, MapReduce, Spark, Hive, and HBase.
2. Develop and maintain scalable, fault-tolerant data pipelines to ingest, process, and analyze large volumes of data.
3. Collaborate with cross-functional teams to gather requirements, understand business objectives, and translate them into technical solutions.
4. Ensure data quality, integrity, and security throughout the entire data lifecycle.
5. Stay updated with the latest technologies and best practices in big data processing and streaming.
**Requirements**:
1. Bachelor's degree in Computer Science, Information Technology, or a related field. (Master's degree preferred)
2. Proven experience in designing, developing, and implementing big data solutions using the Hadoop ecosystem.
3. Strong proficiency in programming languages such as Java, Scala, or Python.
4. Hands-on experience with Apache Kafka, including topics, partitions, producers, consumers, and Kafka Connect.
5. Solid understanding of distributed computing principles and large-scale data processing frameworks.
6. Experience with SQL and NoSQL databases for data storage and retrieval.
7. Familiarity with containerization technologies like Docker and orchestration tools like Kubernetes.
8. Excellent problem-solving skills and the ability to troubleshoot complex issues in distributed environments.
9. Strong communication and interpersonal skills with the ability to collaborate effectively in a team environment.
10. Experience with Agile development methodologies is a plus.
**Mandatory Skills**: Hadoop, Kafka, Java, SQL
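Requirement 4 above asks for hands-on Kafka knowledge of topics, partitions, producers, and consumers. As a quick illustration of why keyed records preserve per-key ordering, here is a minimal sketch of key-to-partition assignment. Note this is an assumption-laden sketch: Kafka's actual default partitioner uses murmur2 hashing; MD5 is substituted here purely for illustration.

```python
# Sketch of how a Kafka-style partitioner maps a record key to a
# partition: hash the key, then take the hash modulo the partition
# count. (Kafka's Java client really uses murmur2; hashlib.md5 is a
# stand-in for illustration only.)
import hashlib


def assign_partition(key: str, num_partitions: int) -> int:
    digest = hashlib.md5(key.encode("utf-8")).digest()
    # Interpret the first 4 bytes as an unsigned int, then wrap it
    # into the valid partition range [0, num_partitions).
    return int.from_bytes(digest[:4], "big") % num_partitions


# Records with the same key always land in the same partition,
# which is what gives Kafka its per-key ordering guarantee.
p1 = assign_partition("user-42", 6)
p2 = assign_partition("user-42", 6)
assert p1 == p2
assert 0 <= p1 < 6
```

Because consumers read each partition in order, routing all records for one key to one partition keeps that key's events in sequence even across a multi-partition topic.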
About Company / Benefits
100% Work From Home
Flexible working hours
**Role**:
Hadoop and Kafka / 100% Remote Work
**Location**:
India
**Work Exp.**:
3-15 Years
**Job Type**:
Full Time
**Salary**:
INR 0.5-1.5 LPA
-
Hadoop Developer
3 weeks ago
India Tata Consultancy Services Full time
TCS || Hadoop Developer || Urgent Requirement. Hello LinkedIn Community, an exciting opportunity awaits at TCS! We are seeking individuals to join us as an Associate Consultant in the role of "Hadoop Developer." Here are the key details: Education: Minimum 15 years of full-time education (10th, 12th, and Graduation). Skills: Big Data, Hadoop Administration,...
-
Hadoop Developer
4 weeks ago
Kolkata, West Bengal, India Tata Consultancy Services Full time
TCS || Hadoop Developer || Urgent Requirement. Hello LinkedIn Community, an exciting opportunity awaits at TCS! We are seeking individuals to join us as an Associate Consultant in the role of "Hadoop Developer." Here are the key details: Education: Minimum 15 years of full-time education (10th, 12th, and Graduation). Skills: Big Data, Hadoop Administration,...
-
Big data-Pyspark Developer-Hadoop
1 week ago
Bengaluru, India Infosys Full time
Job Description. Skillset required: 1. Excellent knowledge of UNIX/Linux OS. 2. Knowledge of core Java is a plus but not mandatory. 3. Good understanding of OS concepts, process management, and resource scheduling. 4. Basics of networking, CPU, memory, and storage. 5. Good hold of shell scripting. Deploying a Hadoop cluster, maintaining a Hadoop cluster, adding...
-
Senior Hadoop Admin
7 days ago
India Capco Full time ₹12,00,000 - ₹36,00,000 per year
About Us: Capco, a Wipro company, is a global technology and management consulting firm. Named Consultancy of the Year at the British Bank Awards, and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With our presence across 32 cities across the globe, we support 100+ clients across banking, financial and energy...
-
Apache Spark and Hadoop Developer
5 days ago
Bengaluru, India Tata Consultancy Services Full time
Job Description: Greetings from TCS! Skillset: Apache Spark and Hadoop. Location: Bangalore. Experience Range: 4 - 10 years. Must-have skills: Design, build, and maintain scalable big data pipelines for processing and analyzing large datasets using Spark, Airflow, Bodo, Flume, Flink, etc. Utilize technologies like Hadoop, Spark, Kafka, and NoSQL databases....
-
Big Data Developer
2 weeks ago
India Tata Consultancy Services Full time
Role: Big Data Streaming Ingestion Developer. Required Technical Skill Set: Hortonworks suite of tools, including Kafka, Scala, HBase, and Phoenix. Location of Requirement: Chennai/Mumbai/Pune. Must-Have: Hands-on experience with Hadoop tools (Kafka, Scala, HBase, and Phoenix); experience with RDBMS concepts; ETL concepts; DevOps; Bitbucket. Good-to-Have:...
-
Big Data Full-time
4 days ago
India HyreFox Consultants Full time
**POSITION: Big Data Engineer**. **Primary skills**: Apache Hadoop, Apache Spark, Apache Kafka. **Roles & Responsibilities**: Developing Hadoop systems. Loading disparate data sets and conducting pre-processing services using Hive or Pig. Finalizing the scope of the system and delivering Big Data solutions. Managing the communications between the...
-
Data Operations Associate
5 days ago
India MP DOMINIC AND CO Full time
Job Summary: Join the Data Operations team as a graduate supporting the smooth flow and reliability of data across business units and countries. Perform daily monitoring and management of data loads, batch jobs, and ETL workflows in various systems. Conduct data validation, reconciliation, and quality checks to ensure data accuracy, completeness...
-
Sr Data Engineer
1 week ago
India Photon Group Full time ₹12,00,000 - ₹24,00,000 per year
Description. Key Responsibilities: Develop, configure, and optimize Apache DolphinScheduler workflows for data processing and automation. Design and implement ETL pipelines, job scheduling, and workflow orchestration solutions. Troubleshoot and resolve performance, scalability, and reliability issues in DolphinScheduler. Integrate DolphinScheduler with big data...
-
Data Architect
1 week ago
India InOrg Full time ₹12,00,000 - ₹36,00,000 per year
Job Summary: We're seeking an experienced Data Architect / Senior Data Architect / Associate Data Architect with expertise in data engineering across major data platforms. The ideal candidate will have a strong background in Python, SQL, ETL, and data modeling, with experience in tools like Teradata, Informatica, Hadoop, Spark, PySpark, ADF, Snowflake, and Big Data....