Python PySpark with AWS
3 days ago
**Responsibilities**:
1. Analyze various data sources present with the client, including structured and unstructured data formats
2. Map entities across different data sources to identify relationships and data inconsistencies
3. Analyze existing reports created with Tableau and other tools to understand data usage and identify gaps
4. Collaborate with data engineers to define data extraction, transformation, and loading (ETL) processes
5. Develop a target data model with tables and relationships representing the business domain
6. Document the data model and its components for future reference
7. Support data visualization efforts by providing insights and analysis
**Required skills**:
1. Strong analytical and problem-solving skills
2. Experience with data analysis tools and techniques (SQL, Python, Excel)
3. Proficiency in data visualization tools (Tableau, Power BI)
4. Understanding of data modeling concepts
5. Excellent communication and collaboration skills
**Good to have**:
- Familiarity with querying data from Salesforce
- Understanding of data integration tools like Informatica and their use in data analysis projects
**About Virtusa**
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 30,000 people globally that cares about your growth — one that seeks to provide you with exciting projects and opportunities, and work with state-of-the-art technologies throughout your career with us.
Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.
Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
-
Big Data PySpark
3 days ago
Andhra Pradesh, India Virtusa Full time
Overall 10+ years of experience with data warehouse and Hadoop platforms. MUST have experience with Python/PySpark and Hive in Big Data environments. Should have strong skills in writing complex SQL queries and a good understanding of data warehouse concepts. Exposure to migrating a legacy data warehouse platform to Hadoop will be a big...
-
Andhra Pradesh, India Growel Softech Pvt. Ltd. Full time ₹ 9,00,000 - ₹ 12,00,000 per year
1. Hands-on industry experience designing and coding from scratch in AWS Glue/PySpark with services like S3, DynamoDB, Step Functions, etc.
2. Hands-on industry experience designing and coding from scratch in Snowflake
3. Experience in PySpark/Snowflake of 1 to 3 years, with around 5 years of overall experience building data/analytics solutions
Level: Senior...
-
AWS Data Engineer
3 days ago
Andhra Pradesh, India Inityinfotech Full time ₹ 15,00,000 - ₹ 25,00,000 per year
AWS Data Engineer
Experience: 5+ years
Location: Remote
Job Description: Design, develop, and implement performant ETL pipelines using the Python API (PySpark) of Apache Spark on AWS EMR. Write reusable, testable, and efficient code. Integrate data storage solutions in Spark, especially AWS S3 object storage. Performance tuning of PySpark...
-
Databricks + PySpark
1 week ago
Andhra Pradesh, India Virtusa Full time
**Detailed Job Description for Databricks + PySpark Developer**:
- Data Pipeline Development: Design, implement, and maintain scalable and efficient data pipelines using PySpark and Databricks for ETL processing of large volumes of data.
- Cloud Integration: Develop solutions leveraging Databricks on cloud platforms (AWS/Azure/GCP) to process and analyze...
-
Python, PySpark, SQL, GCP
2 days ago
Uttar Pradesh, India Tata Consultancy Services Full time
TCS presents an excellent opportunity for Python, PySpark, SQL, GCP
Job Location: TCS Noida Yamuna
Experience required: 7-12 yrs
Walk-in Interview Date: 08-Nov-25 (Saturday)
Must-Have: ReactJS and a Python framework for the backend (preferably FastAPI). Kubernetes/Docker (preferably AKS). Strong hands-on experience. Handling large volumes of data on web pages (preferably...
-
Python Dev
7 days ago
Andhra Pradesh, India Virtusa Full time
Python Developer
- Strong proficiency in Python, including experience with data manipulation libraries like Pandas, NumPy, and PySpark.
- Strong in writing and optimizing complex SQL queries for data retrieval, transformation, and analysis within Snowflake and PostgreSQL.
- Design, implement, and optimize end-to-end data pipelines using Python, including tasks...
-
PySpark Internship
5 days ago
Bhopal, Madhya Pradesh, India SAA Consultancy Full time
**PySpark Internship**
**Join SAA (Synergistic AI Analytics)**
**We Are Hiring — PySpark Data Engineering Intern**
**Location**: Bhopal, MP (Work from Office)
**Internship Duration**: 3-6 months
**Experience**: Fresher / Final-year student / 0-2 years
Are you passionate about data and ready to build real data engineering projects in a modern...
-
Java AWS
1 week ago
Andhra Pradesh, India Virtusa Full time
5+ years of hands-on experience in Java. Experience with solution architecture and designs using the AWS platform and services (design, usage, security, performance, etc.). Experience with IAM, S3, Athena, Python or NodeJS, Glue, Lambda, DMS, RDS, Redshift, CloudFormation, and other AWS serverless resources. IaC using AWS CDK. AWS Solution Architect Associate certification is a big plus, with the...
-
Python Scripting
2 weeks ago
Andhra Pradesh, India Virtusa Full time
Python Scripting Lead
Primary Skills: Python scripting, data, scikit-learn, PySpark, data mining, AI
JD: Design and build new tools, as well as enhancements to our existing surveillance and monitoring tools, for the compliance business community. Develop solutions using Python, Hadoop, Spark, and Hive/SQL. Work on projects that require deep knowledge of Python and...
-
Data Engineer
17 hours ago
Andhra Pradesh, India Growel Softech Pvt. Ltd. Full time ₹ 15,00,000 - ₹ 25,00,000 per year
Location: Any location in India
Years of Experience: Level 2 (3 to 5 years)
Data Engineer for Data Journey USBIE. Experience developing and deploying application code using SQL, Python, and Spark. 3-5 years of experience developing and deploying data pipelines in the cloud. 3-5 years of experience in AWS using Glue, Athena, Lambda, Secrets Manager, Redshift...