Responsible Prompting is an LLM-agnostic tool that aims to dynamically support users in crafting prompts that embed responsible intentions and help avoid harmful, adversarial prompts. View it on GitHub.