
Generative AI Associate – Red Teaming
3 weeks ago
If you are interested, kindly complete the LLM Evaluation assessment (link below).
- LLM Evaluation Assessment: please follow the LLM Assessment Guidelines below and complete the assessment by today
- Job Title: Generative AI Associate – Red Teaming Specialist
- Location: Bengaluru (Work from Office)
- Language: English
- Opportunity: Contractual (4 weeks)
- Hourly Rate: $8 per hour (varies for experienced resources)
- Mode: Work from Office
- Start Date: 01-Sep-2025
- End Date: 30-Sep-2025
- Education Required: BA/MA/PhD in English, AI, or ML
- Daily Required Hours: 8 working hours from office
LLM Evaluation Guidelines:
- No. of Questions: 122
- No. of Sections: 7
- Test Duration: 120 minutes (2 hours)
- Total Marks: 122
- Passing Criteria: High (please complete it carefully and with full concentration; good luck)
- Required: Complete by this evening
What you’ll be doing:
As a Red Teaming Specialist on our AI Large Language Models (LLMs) team, you will be joining a truly global team of subject matter experts across a wide variety of disciplines and will be entrusted with a range of responsibilities. We’re seeking self-motivated, clever, and creative specialists who can handle the speed required to be on the frontlines of AI security. In return, we’ll be training you in cutting-edge methods of identifying and addressing vulnerabilities in generative AI. Below are some responsibilities and tasks of our Red Teaming Specialist role:
- Complete extensive training on AI/ML, LLMs, Red Teaming, and jailbreaking, as well as specific project guidelines and requirements
- Craft clever and sneaky prompts to attempt to bypass the filters and guardrails on LLMs, targeting specific vulnerabilities defined by our clients
- Collaborate closely with language specialists, team leads, and QA leads to produce the best possible work
- Assist our data scientists in conducting automated model attacks
- Adapt to the dynamic needs of different projects and clients, navigating shifting guidelines and requirements
- Keep up with the evolving capabilities and vulnerabilities of LLMs and help your team’s methods evolve with them
- Hit productivity targets, including targets for the number of prompts written and average handling time per prompt
What we need you to bring:
- Bachelor’s degree or above; or an associate degree and 1 year of relevant industry experience; or a high school diploma and 2 years of relevant industry experience
- Excellent writing skills (in English)
- Strong understanding of grammar, syntax, and semantics – knowing what “proper” English rules are, as well as when to violate them to better test AI responses
- Ability to adopt different voices and points of view
- Creative thinking
- Strong attention to detail
- Well-honed internet research skills
- Ability to embrace diverse teams
- Ability to navigate ambiguity with grace
- Adaptability to thrive in a dynamic environment, with the agility to adjust to evolving guidelines and requirements
What we offer:
- Fully remote work environment
- Collaborative culture – and key tools enabling it
- Competitive compensation package
- Health, dental & vision benefits
- Employee Assistance Program (EAP)
- Career development & progression opportunities
- Paid vacation, personal days, and sick days
QA/Red Teaming Expert
2 weeks ago
Delhi, India | Innodata Inc. | Full time
Job Description: We are seeking highly analytical and detail-oriented professionals with hands-on experience in Red Teaming, Prompt Evaluation, and AI/LLM Quality Assurance. The ideal candidate will help us rigorously test and evaluate AI-generated content to identify vulnerabilities, assess risks, and ensure compliance with safety, ethical, and quality...