Prompt injection

Prompt injection image
Prompt injection is a security attack that happens when someone intentionally manipulates the input to a Generative AI system like a chatbot or code generator to make it behave in ways the designer didn’t intend.

It’s done by crafting inputs to Gen AI systems in order to confuse, hijack, or redirect the AI’s response by messing with its underlying structure.

For software testers, it's a way to test for input attacks on LLM-based systems. Just like a SQL injection or XSS, but here the payload is language and words designed to interfere with the model or system prompts.
Leave Complexity Behind image
No-Code Test Automation with Low-Code integration via Robot Framework and TestBench
Explore MoT
Leading with Quality image
Tue, 30 Sep
A one-day educational experience to help business lead with expanding quality engineering and testing practices.
MoT Software Testing Essentials Certificate image
Boost your career in software testing with the MoT Software Testing Essentials Certificate. Learn essential skills, from basic testing techniques to advanced risk analysis, crafted by industry experts.
Leading with Quality
A one-day educational experience to help business lead with expanding quality engineering and testing practices.
This Week in Testing image
Debrief the week in Testing via a community radio show hosted by Simon Tomes and members of the community
Subscribe to our newsletter
We'll keep you up to date on all the testing trends.