The AI Guard Lab Tool is used to evaluate the efficacy of the CrowdStrike AIDR AI Guard API against labeled datasets. It supports both malicious prompt injection detection and topic-based detection. - View it on GitHub
Star
2
Rank
4172466