Identity management and security have grown tougher with the rise of AI agents, and SailPoint hopes to remedy that. Meanwhile, ransomware statistics from the FBI show a shift away from data as the primary objective and toward operational disruption as the real leverage point. And IoT still hasn't lived up to its potential.
Q&A: SailPoint EVP Chandra Gnanasambandam Discusses the Next Evolution of Identity
Enterprises are rapidly moving from a human-centric model of security to one that must account for both humans and autonomous AI agents. As organizations deploy these agents across cloud environments, applications, and endpoints, they face a growing governance gap. Unlike traditional users, AI agents can act independently and at machine speed, often without clear ownership or consistent controls. As these non-human identities multiply, quickly outnumbering human users, organizations are being forced to rethink identity, access, and governance at a fundamental level. Tech-Channels discussed this shift at RSA with Chandra Gnanasambandam, EVP of Product and Chief Technology Officer at SailPoint.

To address the shift, SailPoint this week announced Agentic Fabric, a new solution designed to secure AI agents and other non-human identities at scale. Extending the company's Identity Security Cloud, Agentic Fabric provides a unified, identity-centric approach that connects discovery, visibility, governance, authorization, and protection in a single platform. By mapping every AI agent to its human owner, associated data, and systems, the platform gives organizations the context needed to enforce accountability, manage risk, and maintain control as AI adoption accelerates.
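The core idea of tying each agent back to an accountable human can be sketched as a simple data model. This is purely illustrative: the `AgentIdentity` record, its fields, and the `is_governed` check are hypothetical names invented for this sketch, not SailPoint's API or schema.

```python
from dataclasses import dataclass, field

@dataclass
class AgentIdentity:
    """Illustrative record linking a non-human identity to its accountable context."""
    agent_id: str
    human_owner: str                                   # person accountable for the agent's actions
    systems: list = field(default_factory=list)        # systems the agent is allowed to touch
    data_scopes: list = field(default_factory=list)    # data the agent may access

    def is_governed(self) -> bool:
        # An agent with no named human owner is exactly the governance gap described above
        return bool(self.human_owner)

agent = AgentIdentity("invoice-bot-01", human_owner="j.doe",
                      systems=["erp"], data_scopes=["invoices:read"])
print(agent.is_governed())  # True
```

The point of the model is that ownership is a required attribute of the identity itself, so an unowned agent is detectable rather than invisible.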
2,100 Attacks Later: What the FBI Report Reveals About Ransomware’s New Strategy
More than 2,100 ransomware incidents targeted U.S. critical infrastructure in 2025 alone, nearly double the number of reported data breach threats across the same sectors. The latest figures from the FBI's Internet Crime Complaint Center challenge the framing of ransomware as a persistent threat, something cyclical, almost predictable in its evolution. What we are seeing is a move away from data as the primary objective and toward operational disruption as the real leverage point. Critical infrastructure, by design, cannot afford downtime. Hospitals cannot pause operations. Energy grids cannot "fail gracefully." Manufacturing lines cannot simply restart without consequence. Ransomware exploits the one constraint these systems cannot escape: continuity. Groups like Akira, Qilin, and Lynx are not operating as isolated actors but as structured, scalable businesses. Their model, ransomware-as-a-service, reflects a shift toward industrialization within cybercrime.
$27 Billion for Compute: Why AI Is Forcing a Rethink of the Cloud Itself
The most interesting part of Meta’s $27 billion agreement with Nebius is not the number, even though it is large enough to command attention. It is what that number represents: a shift in where power now resides in the technology stack. With $12 billion in dedicated capacity and the option to purchase up to $15 billion more over five years, the company is effectively securing access to a resource that is becoming both critical and constrained. Rather than flexibility, this is about control over a supply chain that now directly shapes product capability. AI workloads behave differently from the systems that defined the previous generation of cloud computing. They are not short-lived, bursty, or easily distributed across generic infrastructure. Training large models requires sustained, high-intensity compute over long periods, often tied to specific hardware configurations. Interruptions are costly, inefficiencies scale quickly, and access to the right infrastructure at the right time becomes a limiting factor.
Axios Supply Chain Attack Shows Why Software Delivery Pipelines Need Hardening
Two malicious Axios releases published to the npm registry within 39 minutes of each other were removed approximately three hours later, but they were nonetheless a formidable reminder of just how fragile today's software supply chains can be. A public database of over two million JavaScript packages and accompanying metadata, the npm registry is the largest software registry in the world, serving over 17 million developers, around three quarters of all JavaScript developers globally.
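One common hardening step against exactly this kind of malicious-release window is to pin dependencies to exact versions rather than accepting any compatible patch release. A minimal sketch of the idea in a `package.json` fragment (the version number here is illustrative, not a specific recommendation):

```json
{
  "dependencies": {
    "axios": "1.7.7"
  }
}
```

Without a leading `^` or `~`, a fresh install will not silently pick up a newer, possibly compromised release; combined with a committed lockfile and reproducible installs (`npm ci`), the dependency set only changes when a developer deliberately updates it.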
The Illusion of Intelligence: Why IoT Still Hasn’t Delivered What It Promised
A convoy moves through a remote stretch of road. No central command, no human coordination, yet every vehicle adjusts its route in real time, responding to data flowing invisibly between machines. Traffic patterns shift, delays are avoided, and decisions are made before drivers even realize there was a problem. It sounds like a glimpse into the future. In many ways, it already exists. The Internet of Things (IoT) promises a world where systems no longer wait for instructions, where data moves faster than human awareness, and where decisions begin to emerge from the network itself rather than from any single point of control. And yet, for most organizations, that promise still feels just out of reach.
Why Iceberg V3 Is Pushing Data Platforms Toward Greater Interoperability
Snowflake's recent expansion of support for Apache Iceberg V3, part of its push toward more interoperable data management, should help organizations meet the increasing board-level pressures of rising data-platform costs, AI readiness, and vendor lock-in fears. In April, Snowflake also expressed its commitment to adding broader V3 capabilities at the Iceberg Summit, making clear that open interoperability is no longer merely a "nice-to-have" for software companies, but a competitive baseline and operational necessity in enterprise data management. The company presents its latest move as a more effective and efficient way to access, govern, and analyze data across multiple platforms, rather than being boxed in by proprietary constraints.