
AWS & PySpark
4 days ago
From 4 to 9 year(s) of experience
- ₹ Not Disclosed by Recruiter
- Mumbai, Pune, Chennai, Bangalore/Bengaluru

**Roles and Responsibilities**

**Must have skills**:
1. PySpark (Python + Spark) or Spark with Scala (Spark here meaning Apache Spark).
2. Cloud (AWS or Azure), preferably AWS.

Good to have: Databricks (PySpark), Snowflake, AWS (S3, Glue, ...)
**Primary skill**: Data Engineer, Big Data, Spark (or PySpark or Spark with Scala)
**Secondary skill**: Cloud (AWS or Azure), Python
**JD**:
Primarily looking for a data engineer with expertise in building data pipelines using PySpark and Spark SQL on Hadoop distributions such as AWS EMR, Databricks, Cloudera, etc.
Must have - PySpark, AWS S3, EMR, Big data
Good to have - Databricks, Snowflake, Talend
**Requirements**:
Other ideal qualifications include experiences in:
Should be very proficient in large-scale data operations using PySpark and overall very comfortable using Python
Familiarity with AWS compute, storage and IAM concepts.
Experience in working with S3 Data Lake as the storage tier.
Any ETL background (Talend, AWS Glue, etc.) is a plus but not required
Cloud warehouse experience (Snowflake, etc.) is a huge plus
Carefully evaluates alternative risks and solutions before taking action
Optimizes the use of all available resources
Develops solutions to meet business needs that reflect a clear understanding of the objectives, practices and procedures of the corporation, department and business unit
- Role: Other
- Salary: Not Disclosed by Recruiter
- Industry: IT Services & Consulting
- Functional Area: Other
- Role Category: Other
- Employment Type: Full Time, Permanent
- Key Skills: S3, PySpark, Azure, Scala, Cloud, Spark, Apache, AWS, Python
- Education (UG): Any Graduate
**Company Profile**:
CHANGE LEADERS CONSULTING
CL is an HR Tech & Consulting firm engaged in Talent management for Analytics & Big Data Organizations.
- Contact Company: CHANGE LEADERS CONSULTING
-
Python with PySpark
5 days ago
Mumbai, India | INFOBEANS TECHNOLOGIES | Full time
Role: Python with PySpark
Location: Bangalore, Chennai, Hyderabad, Pune, Mumbai, Noida, Indore
Experience: 4+ years
Key Skills: Python, PySpark, AWS
Job Category: Python Development
What will your role look like: Python programming language, PySpark, Linux shell scripting, data integration...
-
AWS Databricks
2 weeks ago
Bengaluru, Hyderabad, Mumbai, India | LTIMindtree | Full time | US$ 60,000 - US$ 1,20,000 per year
Primary Skill: AWS Databricks, PySpark, Glue
Secondary Skill: Python, PySpark
-
Expert AWS Data Architect
1 week ago
Mumbai, Maharashtra, India | beBeeDataEngineer | Full time | ₹ 15,00,000 - ₹ 25,00,000
As a seasoned AWS data engineer, you will play a pivotal role in designing and developing large-scale data pipelines and applications using AWS services and Python/PySpark.
About the Job: We are seeking an experienced professional to join our team as an AWS Data Engineer. The successful candidate will be responsible for architecting, building, and maintaining...
-
PySpark Developer
21 hours ago
Mumbai, Maharashtra, India | Artech | Full time | ₹ 20,00,000 - ₹ 25,00,000 per year
Role & responsibilities: You will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that project goals are met, facilitating discussions to address challenges, and guiding your team through the development process. You will also be...
-
AWS Data Engineer
2 weeks ago
Mumbai, Maharashtra, India | Ahana Systems & Solutions | Full time | ₹ 9,00,000 - ₹ 12,00,000 per year
We are looking for a skilled AWS Data Engineer (Lead) to join our team in Mumbai. The ideal candidate will have 4+ years of experience in designing, building, and optimizing data pipelines and workflows using AWS cloud technologies. You should be proficient in SQL, PySpark, and Python and have hands-on experience with AWS...
-
AWS Data Engineer
22 hours ago
Mumbai, Maharashtra, India | Apex One | Full time | ₹ 8,00,000 - ₹ 12,00,000 per year
Hands-on experience with AWS services including S3, Lambda, Glue, API Gateway, and SQS. Strong skills in data engineering on AWS, with proficiency in Python, PySpark & SQL. Experience with batch job scheduling and managing data dependencies. Knowledge of data processing tools like Spark and Airflow. Automate repetitive tasks and build reusable frameworks...
-
Senior AWS Data Pipeline Expert
1 week ago
Mumbai, Maharashtra, India | beBeeDataEngineer | Full time | ₹ 40,00,000 - ₹ 50,00,000
About the Role: We are seeking a highly skilled Senior AWS Data Engineer to join our data engineering team. The ideal candidate will have deep expertise in building scalable data pipelines using Apache Spark, PySpark, SQL, and Python, along with hands-on experience in the AWS ecosystem.
-
AWS Data Engineer
2 weeks ago
Mumbai, Maharashtra, India | Infogain | Full time | ₹ 15,00,000 - ₹ 20,00,000 per year
Role Summary: We are looking for a highly skilled AWS Databricks Data Engineer to design, develop, and optimize data pipelines and lakehouse solutions on AWS using Databricks. The ideal candidate will have strong experience in PySpark, SQL, Delta Lake, AWS native services, Unity Catalog and building scalable data processing solutions for...
-
AWS Data Engineer
6 days ago
Mumbai, Maharashtra, India | Infogain | Full time | ₹ 6,00,000 - ₹ 8,00,000 per year
Role Summary: We are looking for a highly skilled AWS Databricks Data Engineer to design, develop, and optimize data pipelines and lakehouse solutions on AWS using Databricks. The ideal candidate will have strong experience in PySpark, SQL, Delta Lake, AWS native services, Unity Catalog and building scalable data processing solutions for...
-
Aws Data Architect
1 day ago
Mumbai, India | Xpheno | Full time
Experience: 4 - 8 years
Locations: Bangalore, NCR, Mumbai, Pune, Chennai, Hyderabad
- Explore and learn the latest Data & Analytics and Databricks Platform features/technologies to provide new capabilities and increase efficiency.
- Design and build production data pipelines/ETL jobs from ingestion to consumption within a big data architecture, using Python, PySpark,...