
Cloudious - Senior Data Engineer - ETL/PySpark
3 weeks ago
Requirements:
- 8+ years of professional software engineering experience, including 3 to 4 years of customer-facing international exposure.
- At least 2 years of experience interacting with senior technology and business leaders.
- Exceptional leadership, communication, and stakeholder management skills.
- Experience leading innovation and automation agendas for data engineering organizations.
- Experience leading organizational hiring for data engineering.
- Has interfaced with customers and helped the organization de-escalate situations.
- Developing ETL pipelines involving big data.
- Developing data processing/analytics applications primarily using PySpark.
- Experience developing applications on the cloud (AWS), primarily using services for storage, compute, ETL, data warehousing (DWH), analytics, and streaming.
- Clear understanding and ability to implement distributed storage, processing and scalable applications.
- Experience working with SQL and NoSQL databases.
- Ability to write and analyze SQL, HQL, and other query languages for NoSQL databases.
- Proficiency in writing distributed and scalable data processing code using PySpark, Python, and related libraries.
- Experience developing applications that consume services exposed as REST APIs.
- Special consideration given for experience supporting GTM strategy and pre-sales teams.
- Experience working with container-orchestration systems like Kubernetes.
- Experience working with any enterprise-grade ETL tools.
- Experience and knowledge of Adobe Experience Cloud solutions.
- Experience and knowledge of Web Analytics or Digital Marketing.
- Experience and knowledge of Google Cloud platforms.
- Experience and knowledge of Data Science, ML/AI, R, or Jupyter.
-
Data Engineer
4 weeks ago
Delhi, Delhi, India · Haruto Technologies LLP · Full time
Role: GCP Data Engineer
Location: Remote
Experience Required: 5+ Years
About the Role: We are looking for an experienced GCP Data Engineer who has strong expertise in building and optimizing data pipelines, big data processing, and data warehousing solutions on Google Cloud Platform. The ideal candidate should be hands-on with BigQuery, DataProc, PySpark,...
-
Data Engineer
4 weeks ago
Delhi, Delhi, India · V2Solutions · Full time
Role: Data Engineer
Location: Remote
Exp: 5 to 9 years
Key Responsibilities:
- Design, develop, and maintain scalable ETL/ELT data pipelines using PySpark, SQL, and Python.
- Work with AWS data services (Glue, Redshift, S3, Athena, EMR, Lambda, etc.) to manage large-scale data processing.
- Implement data ingestion, transformation, and integration from...
-
ETL Developer
4 weeks ago
Delhi, Delhi, India · Intraedge Technologies Ltd. · Full time
Job Title: ETL Developer (DataStage, AWS, Snowflake)
Experience: 5-7 Years
Location: Remote
Job Type: Full-time
About the Role: We are looking for a talented and motivated ETL Developer / Senior Developer to join our data engineering team. You will work on building scalable and efficient data pipelines using IBM DataStage (on Cloud Pak for Data), AWS...
-
Pyspark Developer
3 days ago
Delhi, Delhi, India · VAK Consulting LLC · Full time · ₹ 15,00,000 - ₹ 25,00,000 per year
PySpark: 4+ years of hands-on experience with the following GCP tools: BigQuery, Dataproc, Cloud Composer/Airflow, Cloud Storage
- Develop and optimize ETL/ELT pipelines using Dataproc, Cloud Composer, BigQuery
- Optimize complex SQL queries and data processing workflows
- Strong experience with DevOps processes
- Independently collaborate with cross-functional teams to...
-
Sr. Azure Data Engineer
1 day ago
Delhi, Delhi, India · ALIQAN Technologies · Full time · ₹ 6,00,000 - ₹ 18,00,000 per year
Job Overview: We are urgently looking for a highly skilled and experienced Senior Azure Data Engineer to join our team. The ideal candidate must have a solid background in Azure Data Factory (ADF), Databricks, PySpark, and strong hands-on experience with modern data tools and platforms. This is a client-facing role requiring excellent communication skills,...
-
Data Validation Engineer
4 weeks ago
Delhi, Delhi, India · VSHR UNNATI LLP · Full time
About the Role: We are looking for a detail-oriented Data Validation Engineer to ensure the accuracy, quality, and consistency of data across our platforms. The ideal candidate will be proficient in PySpark and have strong experience working with Databricks to build automated validation frameworks that verify ingestion, transformation, and reporting...
-
Data Engineer
3 weeks ago
Delhi, Delhi, India · EGISEDGE TECHNOLOGIES PVT LTD · Full time
Job Title: Data Engineer
Data Engineer at Egisedge Technologies Pvt Ltd is a highly skilled role involving designing, developing, and maintaining scalable ETL/ELT data pipelines using Databricks (PySpark) on Azure/AWS/GCP.
Key Responsibilities:
- Design, develop, and maintain scalable ETL/ELT data pipelines using Databricks (PySpark) on Azure/AWS/GCP.
- Develop clean,...
-
Senior Data Engineer
4 weeks ago
Delhi, Delhi, India · Vikash Technologies · Full time
Senior Data Engineer with strong expertise in SQL, Python, Azure Synapse, Azure Data Factory, Snowflake, and Databricks. The ideal candidate should have a solid understanding of SQL (DDL, DML, query optimization) and ETL pipelines, while demonstrating a learning mindset to adapt to evolving technologies.
Responsibilities:
- Design and implement...
-
Senior Data Engineer
4 weeks ago
Delhi, Delhi, India · Aptus Data Labs · Full time
Job Title: Senior Data Engineer
Experience: 5-8 Years
Employment Type: Full-Time
About the Role: Aptus Data Labs is looking for a talented and proactive Senior Data Engineer to help build the backbone of our enterprise data and AI initiatives. You'll work on modern data lake architectures and high-performance pipelines in AWS, enabling real-time insights and...
-
Lead Data Engineer/Manager/Senior Manager
4 weeks ago
Delhi, Delhi, India · Huquo Consulting · Full time
Key Responsibilities:
- Lead and mentor a team of data engineers in designing, developing, and maintaining scalable data pipelines.
- Architect, build, and optimize ETL workflows using Python, PySpark, and SQL.
- Collaborate with data scientists, analysts, and business teams to understand data requirements and deliver reliable solutions.
- Implement and manage...