Before we dive too deep, let’s agree on some definitions — and here we will rely on two eminently trustworthy sources. “Zero Trust is the term for an evolving set of cybersecurity paradigms that move defenses from static, network-based perimeters to focus on users, assets, and resources. Zero Trust assumes there is no implicit trust granted to assets or user accounts based solely on their physical or network location (i.e., local area networks versus the Internet) or based on asset ownership (enterprise or personally owned),” explained NIST in a 2020 piece on Zero Trust Architecture.
The US Department of Defense (DoD) further argued that “The foundational tenet of the Zero Trust Model is that no actor, system, network, or service operating outside or within the security perimeter is trusted. Instead, we must verify anything and everything attempting to establish access.”
Zero Trust and Zero Trust Network Access (ZTNA) represent an architecture, a new way of doing business from a security perspective. In the old days, you would walk in, sign in to your computer and, once signed in, have access to all of the resources you were granted. That model left end users with a lot of passwords to remember.
Let’s look at how all this changes with Zero Trust.
How Zero Trust Affects Customers
Authentication is done continuously.
Authorization is done continuously.
Access control is done at more than one point.
Devices are now subject to authentication, authorization and access control.
One slow authentication or authorization server can severely degrade customer experience.
Your existing authentication and authorization solution, even if it works today, cannot support Zero Trust without significant reengineering for performance and resilience. Access control, authentication and authorization must occur before a user is granted access to any information.
At the same time, the device must also authenticate to the network. This requires a methodology for device authentication that includes device health checks to make sure the device meets a minimum standard before it is allowed to send information back and forth across the network.
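To make the health-check idea concrete, here is a minimal sketch of a deny-by-default posture gate. The field names (`os_patch_level`, `disk_encrypted`, `av_running`) and threshold values are hypothetical illustrations, not a real product's schema:

```python
# Minimal sketch of a device health-check gate (all field names hypothetical).
# A device must meet every baseline requirement before it may exchange traffic.

MINIMUM_STANDARD = {
    "os_patch_level": 202406,  # oldest acceptable patch release (YYYYMM)
    "disk_encrypted": True,
    "av_running": True,
}

def device_meets_standard(posture: dict) -> bool:
    """Return True only if every baseline requirement is satisfied."""
    return (
        posture.get("os_patch_level", 0) >= MINIMUM_STANDARD["os_patch_level"]
        and posture.get("disk_encrypted", False)
        and posture.get("av_running", False)
    )

healthy = {"os_patch_level": 202501, "disk_encrypted": True, "av_running": True}
stale = {"os_patch_level": 202305, "disk_encrypted": True, "av_running": False}
```

Note the deny-by-default shape: a device missing any attribute fails the check rather than slipping through.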
Why Things Go South
The infrastructure that supports all these authentication, authorization and device access control systems is easily overwhelmed with requests. When this happens, end users have a bad experience: they are denied access, offered slow access or their access is indeterminate. Maybe you’ve experienced this, where access is good one day and bad the next, or good one hour and bad the next. It can even change within minutes.
These bad customer experiences occur because your infrastructure is not robust enough to handle the load. In short, IT needs to know what's working and what's not, and needs confidence that the plumbing connecting the pieces of your infrastructure and security components is resilient and high performance.
The infrastructure has moved from a perimeter-based security model to a distributed security model where security decisions are made throughout the entire architecture, across more than one component, and are no longer reliant on a confusing, vulnerable smorgasbord of end user passwords.
The Zero Trust Heavy Load
As you can see, there is a lot of activity continuously requesting validation of who you are, what you are, what you're trying to access and whether you have access rights.
Special Zero Trust Needs of the Government Market
Shops within the US government have special needs, as they require an authentication model for users that's based upon an X.509 v3 certificate. This could be in the form of a CAC card, PIV card or perhaps a USB token that has a certificate on it.
The way these certificate-based access models work is that a user presents the certificate in order to access resources on the network. This security handshake establishes an encrypted connection, such as a TLS or HTTPS-based connection.
This security handshake has to be performed at the right place in order to gain proper access to the multiple tiers in the security architecture.
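As a sketch of what that handshake requirement looks like in code, the snippet below uses Python's standard `ssl` module to configure both sides of a mutual-TLS connection. The certificate and key paths are hypothetical placeholders (in a CAC/PIV deployment the credential typically comes from smart-card middleware, which this sketch does not model):

```python
import ssl

# Server side: refuse the handshake unless the client presents a valid certificate.
server_ctx = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
server_ctx.verify_mode = ssl.CERT_REQUIRED       # deny by default: no cert, no connection
server_ctx.minimum_version = ssl.TLSVersion.TLSv1_2

# Client side: present an X.509 certificate during the TLS handshake.
# Paths are hypothetical; a real deployment would load the CAC/PIV credential
# via smart-card middleware rather than flat files.
def make_mutual_tls_client(cert_path: str, key_path: str) -> ssl.SSLContext:
    ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    ctx.load_cert_chain(certfile=cert_path, keyfile=key_path)
    return ctx
```

The key point for Zero Trust architects is where `CERT_REQUIRED` is enforced: it must sit at the tier that makes the access decision, not just at an outer proxy.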
There are mechanisms other than certificates for multifactor authentication. You've perhaps heard of SAML, OAuth 2.0 or OIDC, which are token-based authentication mechanisms. These are portable across the network, which means one can use the same token to authenticate at multiple places without having to recreate it at each hop.
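The portability point can be illustrated with a JWT-style token, the format commonly carried by OIDC and OAuth 2.0. The sketch below builds a toy token and shows two services reading the same claims from the same string; the signature step is deliberately omitted here, and any production service must cryptographically verify it before trusting the claims:

```python
import base64
import json

# Sketch: why bearer tokens are portable. The same token string can be presented
# to multiple services; each decodes (and, in production, verifies) the claims.

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWT segments are encoded."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

header = b64url(json.dumps({"alg": "RS256", "typ": "JWT"}).encode())
payload = b64url(json.dumps(
    {"sub": "alice", "iss": "https://idp.example", "aud": "app1"}).encode())
token = f"{header}.{payload}.signature-omitted-in-this-sketch"

def read_claims(tok: str) -> dict:
    """Decode the claims segment. Real services MUST also verify the signature."""
    seg = tok.split(".")[1]
    seg += "=" * (-len(seg) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(seg))
```

Because `token` is just a string, it travels with the request across tiers, which is exactly the property certificate-only architectures lack.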
As you look at how you're going to do Zero Trust from an authentication perspective, you need to take a hard look at how your authentication framework is set up today. And if you're based entirely upon certificate-based authentication, you'll need to make modifications.
The DoD lays out the tenets of Zero Trust as follows:
“Assume a Hostile Environment. There are malicious personas both inside and outside the network. All users, devices, and networks/environments are treated as untrusted.
Presume Breach. There are hundreds of thousands of attempted cybersecurity attacks against DOD networks every day. Consciously operate and defend resources with the assumption that an adversary has a presence within your environment. Enhanced scrutiny of access and authorization decisions to improve response outcomes.
Never Trust, Always Verify. Deny access by default. Every device, user, application/workload, and data flow are authenticated and explicitly authorized using least privilege, multiple attributes, and dynamic cybersecurity policies.
Scrutinize Explicitly. All resources are consistently accessed in a secure manner using multiple attributes (dynamic and static) to derive confidence levels for contextual access to resources. Access to resources is conditional and access can dynamically change based on action and confidence levels resulting from those actions.
Apply Unified Analytics. Apply unified analytics for Data, Applications, Assets, Services (DAAS) to include behavioristics, and log each transaction.”
For the first tenet, assume that user credentials may have been compromised and that the device trying to connect to your network has been compromised. Build your information systems and solutions around the concept that not everything inside the path to information is trustworthy.
That requires continuous verification and validation. Some components in your architecture that used to work just fine for a single morning sign-on start falling down under the demand of authenticating users repeatedly across the network.
The Need for Macro-Segmentation, Micro-Segmentation and SDN
Device compliance is critical in the Zero Trust model, and this can be addressed by applying micro-segmentation and macro-segmentation to your network environment. This means that each and every subnet across your network (or perhaps even smaller components) is separately contained, and access decisions are required to move from one segment of your network to another.
These segments (both logical and physical) are isolated and controlled via granular access and policy restrictions. As your perimeter becomes granular through macro-segmentation, micro-segmentation provides greater protections and controls over the Data/Assets/Applications/Services (DAAS). This is vital if you are to control privileged access, manage internal and external data flows and prevent lateral movement.
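A segmentation policy ultimately boils down to an explicit allow-list of which segments may talk to which. Here is a minimal sketch; the segment names and permitted flows are purely illustrative:

```python
# Sketch: a deny-by-default policy table for traffic between network segments.
# Segment names and permitted flows are illustrative, not a real topology.

ALLOWED_FLOWS = {
    ("web-tier", "app-tier"),   # front end may call the application tier
    ("app-tier", "db-tier"),    # application tier may call the database tier
}

def flow_permitted(src_segment: str, dst_segment: str) -> bool:
    """Deny by default; only explicitly listed segment pairs may communicate."""
    return (src_segment, dst_segment) in ALLOWED_FLOWS
```

Note there is no `("web-tier", "db-tier")` entry: an attacker who compromises the front end cannot reach the database directly, which is precisely the lateral movement segmentation is meant to prevent.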
Government Market Demands Deep Software Supply Chain Visibility
The software supply chain is an interesting aspect of Zero Trust for the government market. In fact, the government is beginning to ask for a software bill of materials (SBOM). How was this application built? What are the components that comprise the application?
They are also asking for validation of corporate identity. Is the software vendor influenced by foreign entities? What does its investor portfolio look like?
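An SBOM of the kind described above is typically a structured document listing every component and its version. The sketch below parses a toy CycloneDX-style SBOM; the components and versions shown are invented for illustration, not any real product's bill of materials:

```python
import json

# Sketch: reading component names and versions out of a minimal
# CycloneDX-style SBOM. The document below is a toy example.
sbom_json = """
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.5",
  "components": [
    {"type": "library", "name": "openssl", "version": "3.0.13"},
    {"type": "library", "name": "zlib", "version": "1.3.1"}
  ]
}
"""

sbom = json.loads(sbom_json)
# Map each component to its version so it can be checked against
# vulnerability feeds or procurement policy.
components = {c["name"]: c["version"] for c in sbom["components"]}
```

With the components in hand, a reviewer can answer exactly the questions the government is asking: what is inside the application, and where did it come from.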
Load Balancing and the Segmented Network
Load balancers are architected to easily handle network segments. For instance, a single load balancer could have 10 or 20 network segments, and individual network segments can be assigned to each application. Here the load balancer, equipped to handle authentication, can be the front end the user connects to.
With this model, all the backend components live on micro-segments of the network. They don't talk to each other. If one gets compromised, it can't compromise the one next to it because it doesn't have access to the one next to it.
The goal for the segmentation structures is to contain an exploit at the lowest level possible – applications and workloads.
Watch and Learn About Zero Trust and Customer Experience
In a recent webinar, “Excelling the Customer Experience in a Zero-Trust World,” Mike Bomba, Progress' senior principal solutions architect and 36-year veteran of the Department of Defense, detailed the ins and outs of Zero Trust, and how you can become a Zero Trust master.
Doug Barney was the founding editor of Redmond Magazine, Redmond Channel Partner, Redmond Developer News and Virtualization Review. Doug has also served as Executive Editor of Network World, Editor in Chief of AmigaWorld and Editor in Chief of Network Computing.