
PySpark Developer
2 weeks ago
PySpark
4+ years of hands-on experience with the following GCP tools: BigQuery, Dataproc, Cloud Composer/Airflow, Cloud Storage
Develop and optimize ETL/ELT pipelines using Dataproc, Cloud Composer, BigQuery
Optimize complex SQL queries and data processing workflows
Strong experience with DevOps processes
Independently collaborate with cross-functional teams to understand data requirements
Lead technical design discussions and architecture reviews
Preferred: experience building pipelines using Cloud Pub/Sub and Cloud Functions
Locations: Mumbai, Delhi NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
-
Snowflake Developer With PySpark
15 hours ago
Delhi, India · DigiHelic Solutions Pvt. Ltd. · Full time
Job Role: Snowflake Developer
Experience: 6-10 Years
Location: Trivandrum/Kochi/Bangalore/Chennai/Pune/Noida/Hyderabad
Work Model: Hybrid
Mandatory Skills: Snowflake, PySpark, ETL, SQL
Must Have Skills
Data Warehouse:
- Design, implement, and optimize data warehouses on the Snowflake platform.
- Ensure effective utilization of Snowflake features for scalable and...
-
PySpark Developer
2 days ago
Delhi, Delhi, India · VAK Consulting LLC · Full time · ₹ 15,00,000 - ₹ 25,00,000 per year
PySpark
4+ years of hands-on experience with the following GCP tools: BigQuery, Dataproc, Cloud Composer/Airflow, Cloud Storage
Develop and optimize ETL/ELT pipelines using Dataproc, Cloud Composer, BigQuery
Optimize complex SQL queries and data processing workflows
Strong experience with DevOps processes
Independently collaborate with cross-functional teams to...
-
Data Engineer
6 days ago
Delhi, India · Tata Consultancy Services · Full time
TCS is hiring for PySpark Data Engineer for Bangalore
Role: Data Engineer with mandatory PySpark knowledge
Required Technical Skill Set: Python, PySpark, BigQuery
Desired Experience Range: 7 to 10 yrs
Location: Bangalore
Desired Competencies (Technical/Behavioral Competency)
Development, production support, and delivery of Python, BigQuery, SQL, GCP, Airflow based...
-
IIT Python PySpark Engineers
2 weeks ago
Delhi, India · Appice · Full time
Company Description: Appice, a product of Semusi, is a machine learning-based mobile marketing automation platform that revolutionizes how banks and fintech companies engage with customers. With over 12 years of experience in mobile and enterprise software, Appice offers solutions for mobile acquisition, engagement, retention, and monetization through...
-
Data Engineer
16 hours ago
Delhi, India · Namasys Analytics · Full time
We’re Hiring: Data Engineer (AWS | Python | PySpark | SQL)
Work Type: 100% Remote
Experience: 5–7 years
Notice Period: Immediate joiners preferred; up to 15 days acceptable
About the Role: We are seeking a highly skilled Data Engineer to design, develop, and optimize robust data pipelines and cloud-based architectures. You will work with...
-
Cloudious - Senior Data Engineer - ETL/PySpark
3 weeks ago
Delhi, Delhi, India · Cloudious LLC · Full time
Requirements:
- 8+ years of professional software engineering, mostly focused on the following:
- 3 to 4 years of customer-facing international exposure.
- At least 2 years of experience interacting with technology and business senior leaders.
- Exceptional leadership, communication, and stakeholder management skills. Leading innovation and automation agendas for...
-
Senior Data Engineer
2 weeks ago
Delhi Division, India · Markovate · Full time
Job Description: We are looking for an experienced Senior Data Engineer to lead the development of scalable AWS-native data lake pipelines, with a strong focus on time series forecasting, upsert-ready architectures, and enterprise-grade data governance. This role demands end-to-end ownership of the data lifecycle, from ingestion to partitioning, versioning,...
-
MethodHub - AWS Data Engineer - PySpark/SQL
3 weeks ago
Delhi, Delhi, India · Method Hub Software Limited · Full time
Title: AWS Data Engineer
Experience: 8 yrs to 15 yrs
Location: Remote
Notice Period: Immediate
This role is highly hands-on and requires experience building robust data pipelines and data assets on AWS.
- 4+ years of strong programming experience in Python/PySpark.
- Should be able to write clean, production-grade code.
- Look for mentions of Spark SQL...
-
Databricks Developer
4 weeks ago
Delhi, Delhi, India · NPG Consultants · Full time
Databricks Developer: We are seeking a Databricks Developer to enhance our data engineering team. This role focuses on building scalable data pipelines, optimizing performance, and enabling analytics innovation.
Responsibilities:
- Develop and optimize Spark pipelines using Databricks and Delta Lake.
- Write production-grade PySpark/Scala code for ETL and data...