AI Security - Black Duck Launches Signal to Mitigate Risks

Black Duck has launched Signal, a new AI application security tool designed to address risks in AI-generated code. The launch comes as developer reliance on AI coding assistants continues to grow. Signal promises to strengthen security and governance in software development, supporting safer coding practices.


Original Reporting

IT Security Guru · Guru Writer

AI Summary

CyberPings AI · Reviewed by Rohit Rana

🎯 Basically, Black Duck created a new tool to help keep AI-written code safe from security problems.

What Happened

Black Duck has unveiled Black Duck Signal™, a groundbreaking AI application security solution. This tool is specifically designed to tackle the security challenges that arise from AI-native software development. As AI coding assistants become commonplace, industry analysts predict that 90% of enterprise developers will rely on these tools by 2028. However, existing security tools struggle to keep pace with the rapid evolution of AI-generated code.

Jason Schmitt, CEO of Black Duck, emphasized the urgency of this issue, stating, "AI is no longer just accelerating development; it’s actively authoring software." Signal aims to bridge this gap by providing a robust framework for managing the complexities of AI-driven code.

A Different Architecture for a Different Problem

Unlike traditional application security testing (AST) tools, Signal employs an agentic AI architecture. This innovative approach utilizes a coordinated system of specialized AI security agents. These agents collaborate to analyze code, evaluate vulnerabilities, prioritize risks, and recommend or implement fixes with a reasoning process akin to human logic.

At the core of this system is ContextAI, Black Duck’s proprietary application security model. Trained on extensive, human-validated security data, it enables Signal to make informed decisions about risk and remediation. This capability is particularly vital for identifying complex vulnerabilities that conventional tools often overlook.

Proof in the Wild

To demonstrate Signal’s effectiveness, Black Duck cited a real-world example where their Cybersecurity Research Center used the tool to discover a previously unknown authentication bypass vulnerability in Gitea, a popular open-source Git platform. This incident highlights Signal's ability to identify significant logic flaws that traditional tools might miss entirely.

Signal integrates seamlessly into existing developer tools, such as AI coding assistants and IDEs, providing continuous analysis as code is written. This proactive approach helps surface issues before they reach a commit, significantly reducing the risk of vulnerabilities slipping through the cracks.

Governance at AI Scale

Beyond merely detecting vulnerabilities, Black Duck positions Signal as an enterprise governance tool. As AI coding assistants increasingly take on the role of software developers, organizations face heightened challenges surrounding security, compliance, and trust. Signal equips security and engineering leaders with the visibility and control necessary to manage AI-generated software effectively.

With its innovative architecture and proactive capabilities, Black Duck Signal is now generally available, marking a significant step forward in securing the future of AI-driven software development. This tool aims to empower organizations to harness the benefits of AI while mitigating the associated risks.

🔒 Pro Insight

Signal's unique architecture could set a new standard for application security tools in the age of AI-driven development.

📅 Story Timeline

Story broke by IT Security Guru

Covered by Help Net Security

Covered by Dark Reading
