Cloudious - Senior Data Engineer - ETL/PySpark

3 days ago


Delhi, Delhi, India Cloudious LLC Full time

Requirements:

- 8+ years of professional software engineering experience, mostly focused on the areas below.

- 3 to 4 years of customer-facing international exposure.

- At least 2 years of experience interacting with senior technology and business leaders.

- Exceptional leadership, communication and stakeholder management skills.

- Experience leading innovation and automation agendas for data engineering organizations.

- Experience leading organizational hiring for data engineering.

- Have interfaced with customers and helped the organization de-escalate situations.

- Developing ETL pipelines involving big data.

- Developing data processing/analytics applications primarily using PySpark (see the sketch after this posting).

- Experience developing applications on the cloud (AWS), mostly using services related to storage, compute, ETL, DWH, analytics and streaming.

- Clear understanding of, and ability to implement, distributed storage, distributed processing and scalable applications.

- Experience working with SQL and NoSQL databases.

- Ability to write and analyze SQL, HQL and other query languages for NoSQL databases.

- Proficiency in writing distributed and scalable data processing code using PySpark, Python and related libraries.

- Experience developing applications that consume services exposed as REST APIs.

- Special consideration given for experience supporting GTM strategy and pre-sales teams.

- Experience working with container-orchestration systems like Kubernetes.

- Experience working with enterprise-grade ETL tools.

- Experience and knowledge of Adobe Experience Cloud solutions.

- Experience and knowledge of Web Analytics or Digital Marketing.

- Experience and knowledge of Google Cloud platforms.

- Experience and knowledge of Data Science, ML/AI, R or Jupyter.

(ref:hirist.tech)
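
For context, here is a minimal PySpark ETL sketch of the kind of work the requirements above describe: reading raw data from cloud storage, applying distributed transformations, and writing a partitioned result for downstream analytics. All paths, column names and the aggregation logic are hypothetical illustrations, not taken from the posting.

```python
# Minimal PySpark ETL sketch: extract raw CSV events, clean and aggregate them,
# then load the result as partitioned Parquet. All names/paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-orders-etl").getOrCreate()

# Extract: read raw order events from (hypothetical) cloud storage.
orders = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("s3a://example-bucket/raw/orders/")
)

# Transform: drop malformed rows and aggregate revenue per customer per day.
daily_revenue = (
    orders
    .filter(F.col("order_id").isNotNull() & (F.col("amount") > 0))
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("customer_id", "order_date")
    .agg(
        F.sum("amount").alias("daily_revenue"),
        F.count("order_id").alias("order_count"),
    )
)

# Load: write partitioned Parquet for downstream analytics / DWH ingestion.
(
    daily_revenue.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3a://example-bucket/curated/daily_revenue/")
)

spark.stop()
```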

  • Delhi, Delhi, India beBeeData Full time ₹ 20,00,000 - ₹ 27,00,000

    Senior Data Engineer: Our organization seeks a highly experienced Senior Data Engineer to lead data engineering initiatives. This senior professional will have a strong background in software development, with 8+ years of experience designing and implementing big data ETL pipelines and data processing/analytics applications using PySpark. Key...


  • Delhi, Delhi, India beBeeDataEngineer Full time ₹ 8,00,000 - ₹ 12,00,000

    Job Title: Design, build and maintain scalable data pipelines using Python, PySpark and Airflow. Responsibilities: - Design, build, and maintain scalable data pipelines using Python, PySpark, and Airflow. - Develop and optimize ETL workflows on Cloudera Data Platform (CDP). - Implement data quality checks, monitoring, and alerting mechanisms. - Ensure data...


  • Delhi, Delhi, India beBeeData Full time ₹ 15,00,000 - ₹ 20,10,000

    We are looking for a skilled Data Architect to lead the development of scalable data pipelines. This role requires expertise in designing and implementing data transformation workflows using advanced technologies such as PySpark and Scala Spark. Key Responsibilities: Data Pipeline Development: Design, develop, and maintain scalable ETL pipelines for batch data...

  • PySpark Developer

    5 days ago


    Delhi, Delhi, India VAK Consulting LLC Full time ₹ 12,00,000 - ₹ 20,00,000 per year

    PySpark: 4+ years of hands-on experience with the following GCP tools: BigQuery, Dataproc, Cloud Composer/Airflow, Cloud Storage. Develop and optimize ETL/ELT pipelines using Dataproc, Cloud Composer, and BigQuery. Optimize complex SQL queries and data processing workflows. Strong experience with DevOps processes. Independently collaborate with cross-functional teams to...

  • Data Engineer

    6 days ago


    Delhi, Delhi, India Haruto Technologies LLP Full time

    Role: GCP Data Engineer. Location: Remote. Experience Required: 5+ Years. About the Role: We are looking for an experienced GCP Data Engineer who has strong expertise in building and optimizing data pipelines, big data processing, and data warehousing solutions on Google Cloud Platform. The ideal candidate should be hands-on with BigQuery, DataProc, PySpark,...

  • Data Engineer

    6 days ago


    Delhi, Delhi, India V2Solutions Full time

    Role: Data Engineer. Location: Remote. Experience: 5 to 9 years. Key Responsibilities: - Design, develop, and maintain scalable ETL/ELT data pipelines using PySpark, SQL, and Python. - Work with AWS data services (Glue, Redshift, S3, Athena, EMR, Lambda, etc.) to manage large-scale data processing. - Implement data ingestion, transformation, and integration from...

  • ETL Developer

    2 weeks ago


    Delhi, Delhi, India IntraEdge Full time

    Title: ETL Developer – DataStage, AWS, Snowflake. Experience: 5–7 Years. Location: Remote. Job Type: Full-time. About the Role: We are looking for a talented and motivated ETL Developer / Senior Developer to join our data engineering team. You will work on building scalable and efficient data pipelines using IBM DataStage (on Cloud Pak for Data), AWS Glue...


  • Delhi, Delhi, India beBeeDataQualityEngineer Full time ₹ 80,00,000 - ₹ 1,00,00,000

    About the Job: Our client is a global IT services company with operations across over 50 locations worldwide. Founded in 1996, they specialize in digital engineering and IT services, helping clients modernize their technology infrastructure and adopt cloud and AI solutions. The ideal candidate will have hands-on experience in developing and executing tests...


  • Delhi, Delhi, India beBeeDataEngineering Full time ₹ 1,00,00,000 - ₹ 2,00,00,000

    Job Opportunity: Data Engineering Expert. The ideal candidate will be responsible for designing and developing scalable data pipelines using PySpark, SQL, and Python. Key responsibilities include collaborating with cross-functional teams to drive business decisions through data-driven insights, implementing best practices for data governance, security, and...


  • Delhi, Delhi, India beBeeDataEngineering Full time ₹ 2,00,00,000 - ₹ 2,50,00,000

    Job Title: Senior Data Engineer. Job Description: As a Senior Data Engineer, you will play a key role in designing, developing, and maintaining large-scale data processing systems. You will work closely with cross-functional teams to deliver high-quality solutions that meet business needs. Required Skills and Qualifications: 8+ years of professional...