Software2 months

Security Red-Teaming for Code Gen Assistant

Client

DevTools Startup

Overview

A startup building a coding assistant needed to ensure their tool did not introduce security vulnerabilities into user codebases.

The Challenge

The model was susceptible to prompt injection, often revealing system instructions and generating SQL injection vulnerabilities when asked.

Our Solution

We performed aggressive red-teaming, creating a library of 500+ attack vectors. We then fine-tuned the model on "refusal" datasets and implemented output filtering.

Tech Stack

Azure OpenAILangChainPythonNemo GuardrailsPinecone

Key Results

security Score

Improved from C- to A+

vulnerability Block

100% of known injection attacks blocked

trust

Successfully passed SOC2 Type II AI controls

Security Score Improved from C- to A+

100% of known injection attacks blocked

Successfully passed SOC2 Type II AI controls

Ready to achieve similar results?

Kaycore Technologies | AI Quality Engineering & Validation