Beyond Third Parties: Supply Chain Cybersecurity in 2026

Gyala is reshaping the rules of the game in a scenario where trust is no longer an option.

The question this article raises — and that no organization can afford to ignore — is not “are we secure?”. It is far more uncomfortable: do we truly know who our security depends on?

IT, OT, and IoT are now deeply interconnected: unpatchable PLCs, distributed IoT sensors, cloud and SaaS dependencies, geopolitical friction, persistent attacks, and the shadow risks generated by external maintainers and free AI tools. In this landscape, it is clear that the implicit-trust model is no longer viable in cybersecurity.

It is precisely from this awareness that Gyala’s approach emerges: no implicit trust, but real-time anomaly detection, behavioral analytics, native automation, and cross-layer correlation across IT, OT, and IoT environments. With Agger, detection and reaction become adaptive and configurable down to the individual endpoint, enabling immediate and context-aware response.
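Agger's actual rule engine is not public, but the idea of detection and reaction "configurable down to the individual endpoint" can be illustrated with a minimal sketch. All names and thresholds below are hypothetical, chosen only to show how a per-endpoint policy can drive a different reaction for a PLC than for an office workstation.

```python
from dataclasses import dataclass, field

@dataclass
class EndpointPolicy:
    # Per-endpoint thresholds: an OT PLC tolerates far less deviation than an IT workstation.
    max_conn_per_min: int = 60
    allowed_ports: set = field(default_factory=lambda: {443})

def evaluate(endpoint_id: str, policy: EndpointPolicy,
             observed_conn_per_min: int, observed_port: int):
    """Return the reaction for one endpoint, based on that endpoint's own policy."""
    if observed_port not in policy.allowed_ports:
        return ("isolate", endpoint_id)   # unexpected protocol on this node: react immediately
    if observed_conn_per_min > policy.max_conn_per_min:
        return ("alert", endpoint_id)     # anomalous volume: escalate, do not block yet
    return ("allow", endpoint_id)

# A PLC gets a far stricter policy than a generic default.
plc_policy = EndpointPolicy(max_conn_per_min=5, allowed_ports={502})  # Modbus/TCP only
print(evaluate("plc-01", plc_policy, 3, 502))    # ('allow', 'plc-01')
print(evaluate("plc-01", plc_policy, 3, 8080))   # ('isolate', 'plc-01')
```

The point of the sketch is the design choice: the same engine produces different, context-aware reactions because the policy travels with the endpoint, not with a central, one-size-fits-all ruleset.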

Supply Chain Cybersecurity 2026: From Defense to Strategy

By 2026, cybersecurity has definitively evolved from a defensive discipline into a strategic lever for business resilience. The traditional model, based on implicit trust in suppliers, technologies, and connections, can no longer withstand a continuously evolving digital ecosystem. Geopolitical instability, the rise of supply chain attacks, and the convergence of IT, OT, and IoT are redefining the very concept of perimeter.

But the real breaking point lies elsewhere: from a cybersecurity perspective, the supply chain is no longer a chain.

It has become a distributed, continuous, and non-deterministic system, where every node — whether a supplier, a device, or a digital service — is an integral part of the infrastructure. Relationships are no longer linear but dynamic: they evolve over time, expand, and intersect. In this context, risk is no longer external to the organization but intrinsic to its day-to-day operations.

Third Parties and Persistent Access: Why Traditional Third Party Risk Management Is No Longer Sufficient

The traditional Third Party Risk Management (TPRM) approach, based on periodic assessments and compliance checks, was designed for a static world. Today, access is persistent, integrations are continuous, and technological dependencies are often not fully visible. It is no longer sufficient to know whether a supplier is trustworthy; it is necessary to understand how that supplier interacts in real time with systems, data, and processes.
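The difference between periodic assessment and continuous verification can be made concrete with a small sketch. The vendor names, systems, and maintenance windows below are invented for illustration; the idea is simply that every third-party session is checked as it happens, against the terms actually agreed with that supplier.

```python
from datetime import datetime, timezone

# Hypothetical registry of agreed remote-maintenance windows (UTC hours) per supplier.
MAINTENANCE_WINDOWS = {
    "vendor-a": (8, 18),
}

def check_session(vendor: str, target_system: str, ts: datetime) -> str:
    """Evaluate one third-party access at the moment it occurs."""
    window = MAINTENANCE_WINDOWS.get(vendor)
    if window is None:
        return f"DENY: {vendor} has no registered access agreement for {target_system}"
    start, end = window
    if not (start <= ts.hour < end):
        return f"FLAG: {vendor} accessed {target_system} outside its maintenance window"
    return f"OK: {vendor} -> {target_system}"

print(check_session("vendor-a", "scada-hmi",
                    datetime(2026, 3, 2, 23, 15, tzinfo=timezone.utc)))
# FLAG: vendor-a accessed scada-hmi outside its maintenance window
```

A quarterly questionnaire would never see that 23:15 session; a continuous check flags it while the connection is still open.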

The convergence of IT, OT, and IoT further amplifies this complexity. OT systems designed for isolated environments now coexist with distributed IT infrastructures and IoT devices that are difficult to govern. Even legitimate access today — remote maintenance, software updates, or the use of external services — can turn into a compromise vector.

Shadow IT Risk and Invisible Dependencies: Legacy Systems, Uncontrolled AI, and Indirect Supply Chains

The landscape is further complicated by the emergence of less visible but increasingly relevant risks: unpatchable legacy technologies, indirect supply chains that are difficult to trace, uncontrolled use of AI tools, and free services that introduce potential data exposure channels. These elements are no longer exceptions but part of the operational baseline of modern organizations.

If the context is this dynamic, the approach to security cannot be static.

Prevention alone is not enough: it is necessary to understand infrastructure behavior and respond coherently based on context.

This is where a new paradigm emerges, in which cybersecurity evolves into cyber resilience. No longer a centralized defense based on generic rules, but a distributed capability to detect anomalies, correlate events, and trigger automated responses directly where risk materializes.

In this direction, platforms like Agger represent a concrete evolution. The ability to adapt detection and reaction rules at the individual endpoint level, combined with cross-layer correlation and native automation, enables a drastic reduction in response time and limits the impact of incidents even in complex, interconnected environments.
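Cross-layer correlation can be sketched in a few lines. The rule and event names below are hypothetical: the idea is that two low-severity signals on different layers, arriving within a short window, become one high-severity incident.

```python
from collections import deque

WINDOW_SECONDS = 120            # assumed correlation window
recent = deque()                # (timestamp, layer, event)

def ingest(ts: float, layer: str, event: str) -> str:
    """Ingest one event and decide whether the recent window warrants escalation."""
    recent.append((ts, layer, event))
    # Drop events that have fallen out of the correlation window.
    while recent and ts - recent[0][0] > WINDOW_SECONDS:
        recent.popleft()
    # Cross-layer rule: IT activity followed closely by OT activity is escalated.
    layers = {l for _, l, _ in recent}
    if {"IT", "OT"} <= layers:
        return "ESCALATE: correlated IT/OT activity"
    return "observe"

print(ingest(0.0, "IT", "suspicious_login"))     # observe
print(ingest(45.0, "OT", "plc_write_request"))   # ESCALATE: correlated IT/OT activity
```

Neither event alone would justify blocking anything; correlated, they justify an automated reaction precisely where the risk materializes.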

In today’s digital supply chain, security is no longer a perimeter to defend, but a system to understand, govern, and make resilient.

Technological Sovereignty and Cloud: When Dependency on Hyperscalers Becomes an Operational Constraint

Technological sovereignty is one of those concepts that, for years, has been treated as a matter of strategic positioning — almost abstract — until the data began to tell a far more concrete and far less reassuring story.

Today, over 70% of the European cloud market is controlled by the three major U.S. hyperscalers, while European providers account for around 15% (TechRadar). Looking deeper, 53% of installed capacity in European data centers is concentrated among just ten operators, seven of which are American (Agenda Digitale).

At first glance, the issue already seems clear.

But the interesting point is not the concentration — it is how it is perceived.

According to ZeroUno, 54% of large European enterprises do not consider local providers truly competitive, and 37% of Italian companies are evaluating — or have already initiated — repatriation strategies.

Here, a tension emerges that deserves careful attention.

On one side, dependence on a few global players is no longer debatable — it is structural. On the other, when it comes to reducing that dependence, alternatives are perceived as insufficient, creating a situation in which awareness of the problem grows faster than the ability to solve it.

It is within this gap that technological sovereignty ceases to be a political objective and becomes an operational dependency.

Because it is not simply about where data resides, but about who actually controls:

  • the infrastructure

  • the access models

  • the management logic

  • and ultimately, the operating conditions of the business

It is no coincidence that, despite the growth of investments in sovereign cloud, hyperscalers continue to dominate the European market, even when it comes to “sovereign” services (Broadcom News and Stories).

This raises a less comfortable but inevitable question.
If infrastructure is localized but control remains elsewhere, can we truly speak of sovereignty?

Or, more realistically, are we redefining the problem without solving it?

Perhaps the point is not to build a complete alternative — which today simply does not exist — but to start recognizing dependency for what it is: not a potential risk, but a condition already embedded in operating models.

At that point, the question changes. It is no longer "how to become sovereign," but rather how much autonomy remains when technological choices are already, to a large extent, constrained upstream.

Article published in ICT Security Magazine