
Python PySpark with AWS
1 week ago
**Responsibilities**:
1. Analyze various data sources present with the client, including structured and unstructured data formats
2. Map entities across different data sources to identify relationships and data inconsistencies
3. Analyze existing reports created with Tableau and other tools to understand data usage and identify gaps
4. Collaborate with data engineers to define data extraction, transformation, and loading (ETL) processes
5. Develop a target data model with tables and relationships representing the business domain
6. Document the data model and its components for future reference
7. Support data visualization efforts by providing insights and analysis
**Required skills**:
1. Strong analytical and problem-solving skills
2. Experience with data analysis tools and techniques (SQL, Python, Excel)
3. Proficiency in data visualization tools (Tableau, Power BI)
4. Understanding of data modeling concepts
5. Excellent communication and collaboration skills
**Good to have**:
Familiarity with querying data from Salesforce.
Understanding of data integration tools such as Informatica and their use in data analysis projects.
**About Virtusa**
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 30,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us.
Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.
Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
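Responsibility 5 above, developing a target data model with tables and relationships, can be sketched in plain SQL. The sketch below is purely illustrative: the customer/order domain and all table and column names are hypothetical (the posting does not name the client's entities), and stdlib `sqlite3` stands in for whatever database the engagement actually uses.

```python
import sqlite3

# Hypothetical target data model for illustration only; real entity
# names would come from mapping the client's source systems.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customer (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    region      TEXT
);
CREATE TABLE sales_order (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
    order_date  TEXT,
    amount      REAL
);
""")
conn.execute("INSERT INTO customer VALUES (1, 'Acme', 'South')")
conn.execute("INSERT INTO sales_order VALUES (10, 1, '2024-01-05', 250.0)")

# Once entities from different silos are mapped into one model, a join
# across the documented relationship answers cross-source questions.
row = conn.execute("""
    SELECT c.name, o.amount
    FROM sales_order AS o
    JOIN customer AS c USING (customer_id)
""").fetchone()
print(row)  # → ('Acme', 250.0)
```

Documenting each foreign-key relationship like this (responsibility 6) is what later lets Tableau reports join the tables without ad-hoc guesswork.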
-
PySpark QA
2 weeks ago
Andhra Pradesh, India Virtusa Full time
**JOB DESCRIPTION**
**Skill: PySpark QA**
**Role / Tier: Lead Software Engineer / Tier 2**
**Experience: 6 - 9 years**
Primary skills: Big Data technologies as listed below. Hadoop / Big Data (HDFS, Python, Spark SQL, MapReduce) with PySpark; building CI/CD pipelines; using Spark APIs to cleanse, explore, aggregate, transform, store & analyse data; installing, configuring,...
-
Big Data PySpark
1 week ago
Andhra Pradesh, India Virtusa Full time
Overall 10+ years of experience in data warehouse and Hadoop platforms. Must have experience with Python/PySpark and Hive in Big Data environments. Should have strong skills in writing complex SQL queries and a good understanding of data warehouse concepts. Exposure to migration of a legacy data warehouse platform to the Hadoop platform will be a big...
-
Andhra Pradesh, India Growel Softech Pvt. Ltd. Full time ₹ 9,00,000 - ₹ 12,00,000 per year
1. Hands-on industry experience in design and coding from scratch in AWS Glue/PySpark with services like S3, DynamoDB, Step Functions, etc.
2. Hands-on industry experience in design and coding from scratch in Snowflake
3. Experience in PySpark/Snowflake of 1 to 3 years, with around 5 years of overall experience in building data/analytics solutions
Level: Senior...
-
AWS Data Engineer
4 days ago
Andhra Pradesh, India Inityinfotech Full time ₹ 15,00,000 - ₹ 25,00,000 per year
AWS Data Engineer
Experience: 5+ years
Location: Remote
Job Description
Design, development, and implementation of performant ETL pipelines using the Python API (PySpark) of Apache Spark on AWS EMR. Writing reusable, testable, and efficient code. Integration of data storage solutions in Spark, especially with AWS S3 object storage. Performance tuning of PySpark...
-
AWS Python
6 days ago
Andhra Pradesh, India Virtusa Full time
JD for Data Engineer (Python): at least 5 to 8 years of experience in AWS Python programming; able to design, build, test & deploy code. Should have experience using the following AWS services: AWS SQS, AWS MSK, AWS RDS Aurora DB, Boto3. Very strong SQL knowledge is a must; should be able to understand and build complex queries. He/she should be closely...
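The "build complex queries" requirement above typically means CTEs and window functions rather than single-table selects. The sketch below shows one such pattern, ranking rows within a partition; the `payment` table and its columns are invented for the example, and stdlib `sqlite3` is used only so it runs anywhere, whereas the posting targets AWS RDS Aurora.

```python
import sqlite3

# Toy data standing in for a real table on Aurora; names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE payment (dept TEXT, amount REAL);
INSERT INTO payment VALUES
    ('ops', 100), ('ops', 300), ('eng', 200), ('eng', 500);
""")

# CTE + window function: the top payment per department in one pass,
# instead of a correlated subquery per row.
rows = conn.execute("""
    WITH ranked AS (
        SELECT dept, amount,
               RANK() OVER (PARTITION BY dept ORDER BY amount DESC) AS rnk
        FROM payment
    )
    SELECT dept, amount FROM ranked WHERE rnk = 1
    ORDER BY dept
""").fetchall()
print(rows)  # → [('eng', 500.0), ('ops', 300.0)]
```

The same `RANK() OVER (PARTITION BY ...)` syntax works on Aurora (both MySQL 8+ and PostgreSQL flavors), which is why it is a common interview exercise for roles like this one.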
-
Python AWS
4 days ago
Andhra Pradesh, India Virtusa Full time
Python AWS
Job Summary: We are looking for a highly motivated and skilled back-end engineer with expertise in Python, FastAPI, and AWS cloud services to join our engineering team. This role focuses on building and deploying robust APIs and MCP (Microservice Control Plane) server components, leveraging modern containerization and cloud-native technologies. Key...
-
AWS DE
6 days ago
Andhra Pradesh, India Virtusa Full time
Must have: strong hands-on experience in AWS-native services; Python/PySpark data engineering (S3, EMR, Glue, Sqoop); serverless experience (Lambda, Step Functions); strong SQL; AWS Athena, CloudFormation templates, Crawler; strong data analysis and troubleshooting skills; datamart / data warehousing; multi-dimensional OLAP; star/snowflake schema design. **About...
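The star-schema design mentioned above has a concrete shape: a central fact table of measures keyed to small dimension tables, so OLAP rollups become simple join-and-group queries. The sketch below is a minimal illustration; every table and column name is invented, and stdlib `sqlite3` is used only for a runnable example where the role would use Athena or a warehouse engine.

```python
import sqlite3

# Minimal star schema: one fact table pointing at two dimensions.
# All names here are illustrative, not from the posting.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    revenue     REAL
);
INSERT INTO dim_date VALUES (1, 2023), (2, 2024);
INSERT INTO dim_product VALUES (1, 'books'), (2, 'toys');
INSERT INTO fact_sales VALUES (1, 1, 50), (2, 1, 70), (2, 2, 30);
""")

# Typical OLAP rollup: revenue by year and category via dimension joins.
rows = conn.execute("""
    SELECT d.year, p.category, SUM(f.revenue)
    FROM fact_sales AS f
    JOIN dim_date    AS d ON d.date_key    = f.date_key
    JOIN dim_product AS p ON p.product_key = f.product_key
    GROUP BY d.year, p.category
    ORDER BY d.year, p.category
""").fetchall()
print(rows)
```

A snowflake schema differs only in that the dimensions themselves are further normalized (e.g. `dim_product` referencing a separate `dim_category`), trading wider joins for less redundancy.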
-
.Net with AWS
4 days ago
Andhra Pradesh, India Virtusa Full time
Role: .Net Developer
**Skills**: Minimum 8 years of development and design experience in C# and AWS Lambda. Comprehensive knowledge of the C# programming language. Strong knowledge of RESTful APIs, AWS Lambda, Terraform, and ReactJS. Hands-on experience using MongoDB. Familiarity with other cloud services (e.g., S3, Athena, CloudWatch, CloudFormation, DynamoDB),...
-
GCP Python
4 days ago
Andhra Pradesh, India Virtusa Full time
**Skills**: Python knowledge with PySpark, Pandas, and Python objects; knowledge of Google Cloud Platform (GCP Cloud Storage, Dataproc, BigQuery); strong SQL & advanced SQL; Spark writing skills in PySpark; DWH data warehousing concepts & dimensional modeling; Git; any GCP certification.
Roles & Responsibilities: perform data analytics,...
-
AWS Data Engineer
4 weeks ago
Gurgaon, Haryana / Uttar Pradesh, India Epergne Solutions Full time
Job Role: AWS Data Engineer
Job Location: Pune, Hyderabad, Chennai, Mysuru, Bhubaneswar, Mangalore, Trivandrum, Chandigarh, Jaipur, Nagpur, Indore, Gurgaon
Experience: 7 years
Job Roles & Responsibilities: design, develop, and maintain data pipelines and assets on AWS; optimize and refactor legacy PySpark / Spark SQL code for performance and maintainability; ...