
Latest Posts

Nightfall AI and Snyk Partner to Offer Developers AI-Powered Secrets Scanning

SAN FRANCISCO, Oct. 12, 2023—Nightfall AI, the leader in cloud Data Leak Prevention (cloud DLP), today announced a partnership with Snyk, a leading developer security provider, to offer developers AI-powered secrets-scanning capabilities. Snyk and Nightfall are partnering to co-sell Nightfall's DLP platform alongside Snyk's Developer Security Platform.

Nightfall Partnered with Snyk to Make Developers' Lives Easier. Here's How.

As we announced earlier today, Nightfall is thrilled to team up with Snyk to provide a state-of-the-art security solution for developers working in every phase of the code-to-cloud lifecycle. But that’s just the “What”—now let’s dive into the “Why” and the “How.”

AI Development Tools that Security Teams Should Know About and How to Secure Them

Following the rush to Artificial Intelligence (AI), many companies have introduced new tools and services to the software supply chain. These tools can be used to develop a wide range of AI applications, such as chatbots, virtual assistants, and image recognition systems.

Nightfall Named A Leader in Data Loss Prevention (DLP) by G2

Nightfall has been named a Leader in Data Loss Prevention (DLP), Sensitive Data Discovery, and Data Security in G2’s Fall ‘23 rankings. We’d like to extend a huge thank you to all the customers and supporters who made this possible. This past season, the Nightfall team has been working tirelessly to innovate new ways to keep customers safe in the cloud.

7 Ways Security Teams Can Save Time With AI

AI has already revolutionized the way we work. ChatGPT, GitHub Copilot, and Zendesk AI are just a few of the tools that are taking over day-to-day tasks like generating customer support emails, debugging code, and much, much more. Yet despite all of these advancements, security teams are under more intense pressure than ever to mitigate rapidly evolving risks. Compounded by a global shortage of over 3.4 million cybersecurity workers, security teams are in need of a solution, and fast.

Do You Use ChatGPT at Work? These are the 4 Kinds of Hacks You Need to Know About.

From ChatGPT to DALL-E to Grammarly, there are countless ways to leverage generative AI (GenAI) to simplify everyday life. Whether you’re looking to cut down on busywork, create stunning visual content, or compose impeccable emails, GenAI’s got you covered. However, it’s vital to keep a close eye on your sensitive data at all times.

Worried About Leaking Data to LLMs? Here's How Nightfall Can Help.

Since the widespread launch of GPT-3.5 in November of last year, we’ve seen a meteoric rise in generative AI (GenAI) tools, along with an onslaught of security concerns from both countries and companies around the globe. Tech leaders like Apple have warned employees against using ChatGPT and GitHub Copilot, while other major players like Samsung have even gone so far as to completely ban GenAI tools. Why are companies taking such drastic measures to prevent data leaks to LLMs, you may ask?