The Five Barriers to Enterprise AI Adoption

April 23, 2026

AI is changing the role of the developer, shifting work from coding toward agent orchestration and higher-level architecture. While not the only area seeing results, developer productivity is where AI’s traction is most visible and measurable, particularly among individual developers, small teams, and greenfield shops free of legacy systems and layered processes.

Enterprise adoption is different. Scaling those results, and those from other use cases, beyond pilots and isolated efforts requires addressing five key barriers. Enterprises that do will pull ahead; those that don't will find themselves increasingly outpaced by peers that did.

Note: this is v1; any and all feedback is welcome.

The Five Barriers

  1. Security and Governance. If security and governance are not addressed, AI not only fails to reach its full potential but becomes a liability, opening the door to breaches, unauthorized actions, and compliance failures. In the agentic era, traditional enterprise security models that assume human decision checkpoints no longer apply. Agents act autonomously, at machine speed and at scale. They require a governance layer between agents and infrastructure to enforce policies, monitor behavior, and audit actions.
  2. Data Access and Quality. AI utility is directly proportional to the quality and accessibility of internal data. In large enterprises, that data is scattered across disconnected databases and geographies, and without deliberate efforts to unify, clean, and make it accessible, AI remains narrow and underpowered. If you don't have a data strategy, you don't have an AI strategy.
  3. Integration into Systems and Workflows. For AI to succeed beyond individual tooling, it needs to be brought in-process and woven into existing software, workflows, and decision points, a non-trivial exercise in large enterprises with decades of legacy tooling. The challenge is less about deploying models and more about reworking how work flows through the organization.
  4. Organizational Structure and Roles. To stay competitive, enterprises will need to rethink how they are organized, what teams and roles are needed, and what skills are required. This includes changes to how developers, operations, and business stakeholders interact and make decisions. While important when leveraging AI for efficiency, it becomes essential when the goal is discovering and developing new opportunities, which requires reinventing both business models and the structures that support them.
  5. Culture: Adapting and Adopting. Adoption is constrained as much by culture as by technology. Whether employees embrace AI depends heavily on whether leadership creates an environment that encourages and supports its use, through clear communication, visible commitment, and concrete examples of what good looks like. Without that, adoption stays inconsistent and localized. Equally important is transparency around how roles will evolve. When employees are left to fill in the blanks themselves, anxiety around job security sets in, and that anxiety can paralyze large parts of the organization. While many see AI as an opportunity, many others, lacking context or clear expectations, see it as a threat.
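To make the first barrier concrete, here is a minimal sketch of what a governance layer between agents and infrastructure might look like: every agent action passes through a policy check and lands in an audit log before touching anything. The `GovernanceLayer` class, the operation names, and the allowlist policy model are all hypothetical illustrations, not a description of any particular product.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AgentAction:
    agent_id: str
    operation: str   # hypothetical operation names, e.g. "vm.provision"
    target: str

@dataclass
class GovernanceLayer:
    # Toy policy model: map each operation to the set of permitted targets.
    policies: dict
    audit_log: list = field(default_factory=list)

    def authorize(self, action: AgentAction) -> bool:
        """Check the action against policy and record the decision."""
        allowed = action.target in self.policies.get(action.operation, set())
        self.audit_log.append({
            "time": datetime.now(timezone.utc).isoformat(),
            "agent": action.agent_id,
            "operation": action.operation,
            "target": action.target,
            "decision": "allow" if allowed else "deny",
        })
        return allowed

gov = GovernanceLayer(policies={"vm.provision": {"dev-cluster"}})
print(gov.authorize(AgentAction("agent-7", "vm.provision", "dev-cluster")))  # True
print(gov.authorize(AgentAction("agent-7", "db.delete", "prod-db")))         # False
```

The point of the sketch is the shape, not the code: policy enforcement and auditing sit between the agent and the infrastructure, so autonomous actions at machine speed still leave a reviewable trail.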

Pau for now…


How Infoblox Reinvents Network Services for the Multi-Cloud Era

April 21, 2025

Presented at Cloud Field Day in Santa Clara


Earlier this year, I had the opportunity to participate as a delegate at Cloud Field Day in Santa Clara. As delegates, we engaged directly with the presenting companies, offering feedback on what resonated, what needed clarification, and how their strategies could evolve.

The first presenter was Infoblox, a company that merges networking and security into a unified solution. More specifically, they are focused on DDI — that’s DNS, DHCP, and IPAM. Other than an acronym of acronyms, what exactly is DDI? I soon found out as Chief Product Officer Mukesh Gupta explained how this combination of “boring” network services is critical in today’s messy, manual, and fragmented hybrid multi-cloud environments.


What Really Is DDI and Why Does It Matter?

DDI is about managing the “naming,” “numbering,” and “locating” of everything connected to a network — whether it’s a laptop, server, phone, or cloud service. Specifically, it is made up of three foundational network services:

  • DNS (Domain Name System): Translates human-readable domain names (like google.com) into IP addresses.
  • DHCP (Dynamic Host Configuration Protocol): Automatically assigns IP addresses to devices on a network.
  • IPAM (IP Address Management): Manages the allocation, tracking, and planning of IP addresses across an organization.

These services form the invisible infrastructure behind every enterprise network. Mukesh described DDI as the “electricity” of networking — when it goes down, everything stops.
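The DNS piece of DDI is easy to see in action: name-to-address translation is a single call in most languages. This Python sketch uses only the standard library; `localhost` is chosen purely because it resolves without any network access.

```python
import socket

# DNS in one call: translate a human-readable name into an IP address.
# "localhost" is defined locally, so no network round trip is needed.
print(socket.gethostbyname("localhost"))  # 127.0.0.1

# The same call resolves public names, e.g. socket.gethostbyname("google.com"),
# which is the lookup that happens behind nearly every connection you open.
```

Every application in the enterprise leans on this lookup constantly, which is exactly why Mukesh's "electricity" analogy holds: when resolution fails, everything downstream fails with it.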


Multi-Cloud Challenges and DDI

Mukesh outlined three key trends currently reshaping enterprise infrastructure:

  1. Hybrid multi-cloud adoption
    Most organizations now operate across a mix of public cloud providers and on-premises infrastructure.
  2. SaaS-first, cloud-first strategies
    Enterprises are rapidly moving off legacy systems (especially following Broadcom’s acquisition of VMware) in favor of cloud-native approaches.
  3. Increasing cybersecurity threats
    Attacks are more frequent, more sophisticated, and more damaging than ever before.

These trends introduce real complexity for DDI. Key challenges include:

  • Fragmented DNS systems across multiple clouds
  • Inconsistent APIs that make automation difficult and expensive
  • IP address conflicts due to disconnected systems
  • Stale DNS records that introduce security vulnerabilities
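The IP-conflict problem above can be illustrated in a few lines. The subnet names and CIDR ranges below are invented for the example; the overlap check itself is just Python's standard-library `ipaddress` module, which is the kind of reconciliation that disconnected per-cloud tools never perform.

```python
import ipaddress

# Hypothetical subnet allocations recorded in two disconnected IPAM sources.
cloud_a = {"prod-vpc": "10.0.0.0/16", "dev-vpc": "10.1.0.0/24"}
cloud_b = {"analytics-vnet": "10.0.128.0/20", "ml-vnet": "10.2.0.0/24"}

def find_overlaps(a, b):
    """Return pairs of subnet names whose address ranges overlap."""
    conflicts = []
    for name_a, cidr_a in a.items():
        for name_b, cidr_b in b.items():
            if ipaddress.ip_network(cidr_a).overlaps(ipaddress.ip_network(cidr_b)):
                conflicts.append((name_a, name_b))
    return conflicts

# 10.0.128.0/20 sits inside 10.0.0.0/16, so the two teams have collided.
print(find_overlaps(cloud_a, cloud_b))  # [('prod-vpc', 'analytics-vnet')]
```

When each cloud team allocates addresses in isolation, nothing catches this collision until traffic starts going to the wrong place; a unified IPAM view is what makes the check possible across clouds.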

Real-world example:
A major New York bank allowed cloud teams to use native DNS tools. One day, a simple typo in a DNS entry brought down the entire bank for four hours, costing them millions.


Infoblox’s Answer: An Integrated Platform

To address these pain points, Infoblox introduced the Infoblox Universal DDI™ Product Suite. This integrated platform provides a centralized, automated, and cloud-managed way to run critical network services (DNS, DHCP, IPAM) across complex hybrid and multi-cloud environments.

Key Features:

  • Unified management layer
    Manage DNS across on-prem, branch, and cloud from a single interface.
  • Universal IPAM & asset visibility
    Real-time insights into IP usage and resource status.
  • Conflict detection & stale record resolution
    Automatically identify and resolve subnet overlaps and outdated DNS entries.
  • Built-in security
    Use DNS as a security control point to detect and block threats.

The platform supports physical, virtual, and cloud-based DNS servers, and integrates with automation tools like Terraform and Ansible. It also maintains backward compatibility via API replication, ensuring existing workflows stay intact.


Security Through DNS

One of the most compelling elements of Infoblox’s platform is how it uses DNS as a security layer.

Since nearly every internet communication starts with a DNS query, Infoblox can analyze DNS traffic patterns to:

  • Detect ransomware activity
  • Prevent data exfiltration
  • Block malicious domains in real time

By combining DNS logs with threat intelligence feeds, Infoblox transforms a foundational service into a proactive security shield.
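As a rough illustration of the idea (not Infoblox's actual detection logic), a DNS-layer filter can combine a threat-intelligence blocklist with a simple heuristic for exfiltration-style queries, which tend to encode stolen data in long, random-looking subdomain labels. The domains, threshold values, and entropy heuristic here are all assumptions made for the sketch.

```python
import math
from collections import Counter

BLOCKLIST = {"malware.example.net"}  # stand-in for a threat-intel feed

def shannon_entropy(s):
    """Bits of entropy per character; random-looking strings score high."""
    counts = Counter(s)
    return -sum(c / len(s) * math.log2(c / len(s)) for c in counts.values())

def classify(domain):
    if domain in BLOCKLIST:
        return "block"
    label = domain.split(".")[0]
    # Long, high-entropy labels are a common sign of DNS tunneling or
    # exfiltration; the 30-char / 3.5-bit thresholds are arbitrary here.
    if len(label) > 30 and shannon_entropy(label) > 3.5:
        return "flag"
    return "allow"

print(classify("google.com"))           # allow
print(classify("malware.example.net"))  # block
print(classify("a9f3k2q8z7x1m4b6c0d5e8g2h7j1k9q3w5.evil.example"))  # flag
```

Because the check runs at the DNS layer, it sees the attempt before any payload connection is made, which is what makes DNS such an effective control point.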


The Future is Unified DDI

As enterprises deepen their multi-cloud investments, unified management and visibility across distributed infrastructure become invaluable. Infoblox’s Universal DDI™ Product Suite delivers this, allowing organizations to manage DNS, DHCP, and IP address assignments consistently across data centers, cloud providers, and edge environments — all from a single interface.

While DNS, DHCP, and IPAM may seem behind-the-scenes, they are essential to:

  • Prevent outages
  • Accelerate cloud operations
  • Strengthen enterprise security

In a world where spreadsheets and siloed tools can bring down billion-dollar operations, Infoblox’s Universal DDI is definitely worth checking out.

Pau for now…


Gazzang – One of the 10 Austin startups to check out at SXSW

March 10, 2012

Last night we held our first SXSW meetup at Opal Divine’s, which is very close to the worldwide headquarters of Gazzang, a company that last week was named by GigaOm as one of The 10 Austin startups you need to meet at SXSW 2012. Gazzang focuses on securing your data in the cloud via transparent data encryption.

Given the proximity and the promise of free beer, I was able to twist the arms of four members of their development team and get them to join us. Here is a quick video featuring Dustin Kirkland, Sergio Pena, Hector Acosta, and Eddie Garcia.

Pau for now…


OSCON: ex-NASA cloud lead on his OpenStack startup, Piston

July 31, 2011

Last week at OSCON in Portland, I dragged Josh McKenty away from the OpenStack one-year anniversary (that’s what Josh is referring to at the very end of the interview) to do a quick video. Josh, who headed up NASA’s Nebula tech team and has been involved with OpenStack from the very beginning, recently announced Piston, a startup that will productize OpenStack for enterprises.

Here is what the always entertaining Josh had to say:

Some of the ground Josh covers:

  • What, in a nutshell, will Piston be offering?
  • Josh’s work at NASA and how he got involved in OpenStack
  • Timing around Piston’s general release and GA
  • The roles he plays on the OpenStack boards
  • What their offering will have right out of the chute and their focus on big data going forward

Extra-credit reading

Pau for now…


Dell Joins Cloud Security Alliance

April 4, 2010

I recorded this interview with David Lang earlier this year and have been meaning to post it for the longest time. David is Dell’s program manager for federal security, which means he is in charge of the team that supports the security requirements for all of Dell’s businesses that face the federal government. He’s based in DC, but I was able to grab a bit of his time when he was out visiting Austin.

Some of the topics David tackles:

  • Dell’s joining of the Cloud Security Alliance at the end of last year.
  • What the CSA is and does.
  • David’s interesting background: his many years as a special agent in the Air Force doing computer and espionage investigations, and how that path led him to the cloud.
  • How David addresses questions around cloud security and what type of environments you find in federal space.
  • The balancing act between availability, security, and cost, and where Homeland Security would want to use the public cloud.

Extra-credit reading

Pau for now…