LLM Guard: Open-Source Toolkit for Securing Large Language Models
By James Patterson | September 22, 2023

LLM Guard provides an extensive set of evaluators for both the inputs and outputs of large language models, offering sanitization, detection of harmful language, and protection against data leakage.
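To make the idea of input scanning concrete, here is a minimal, stdlib-only sketch of the kind of prompt sanitization such a toolkit performs. This is not LLM Guard's actual API: the function name, regex patterns, and blocklist below are illustrative assumptions, and real scanners use far more robust detection.

```python
import re

# Illustrative sketch only -- the patterns and blocklist are assumptions
# for the demo, not LLM Guard's real detection logic.
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
}
BLOCKLIST = {"password", "credential"}  # hypothetical harmful-term list

def scan_prompt(prompt: str) -> tuple[str, bool]:
    """Redact PII-like spans and flag blocked terms.

    Returns (sanitized_prompt, is_valid), where is_valid is False
    when the prompt contains a blocklisted term.
    """
    sanitized = prompt
    for label, pattern in PII_PATTERNS.items():
        sanitized = pattern.sub(f"[REDACTED_{label}]", sanitized)
    is_valid = not any(term in sanitized.lower() for term in BLOCKLIST)
    return sanitized, is_valid

sanitized, ok = scan_prompt("Email alice@example.com about the password reset")
print(sanitized)  # the email address is replaced with [REDACTED_EMAIL]
print(ok)         # False: "password" is on the blocklist
```

In practice, a pipeline like this runs on every prompt before it reaches the model, and a symmetric set of output scanners checks model responses for leakage before they reach the user.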