Responsible Prompting is an LLM-agnostic tool that dynamically supports users in crafting prompts that embed responsible intentions and helps them avoid harmful, adversarial prompts.