
Every Platform in the DXP Scorecard

The Arch of the North begins an honest, platform-by-platform assessment of all 26 platforms in the DXP Scorecard. Not rankings. Not vendor pitches. Implementation truth.

[Image: DXP Scorecard mapping]

Why I'm Writing About Every Platform in the DXP Scorecard

There's a photograph from the early days of cartography that has always stayed with me. It shows a room full of mapmakers hunched over drafting tables, each one tracing coastlines they had never visited from accounts provided by sailors who had every reason to embellish. The maps were beautiful. They were also wrong in ways that sent ships into rocks.

I think about that image every time I open an analyst report.

The Problem with Maps Drawn from a Distance

For the better part of two decades, enterprise technology leaders have relied on a small number of analyst reports to guide platform decisions worth millions of dollars. The Gartner Magic Quadrant. The Forrester Wave. These reports became the lingua franca of procurement conversations, cited in boardrooms and referenced in RFPs with the reverence of settled science.

But here's what thirty years of building on these platforms has taught me: the map is not the territory. And the maps we've been using have some serious gaps.

The 2025 Gartner Magic Quadrant for Digital Experience Platforms evaluated 16 vendors. The 2025 Forrester Wave for DXPs included just nine. These are the two most referenced platform evaluations in the enterprise technology landscape, and together they cover a fraction of the platforms that real engineering teams are actually building on.

Where is Sanity? Where is Storyblok? Where are Strapi, Payload CMS, Hygraph, or Joomla? These platforms power production systems at organizations of every size, yet they fall below the revenue thresholds or outside the category definitions that determine analyst inclusion. They are invisible on the maps that procurement teams are using to navigate.

The DXP Scorecard at dxpscorecard.com was built to address exactly this problem. It evaluates 26 platforms across more than 120 scored criteria spanning nine categories, with the framework continuing to expand as the market evolves. It covers enterprise DXPs, headless CMS platforms, traditional content management systems, and open source alternatives. It does not require vendor participation fees. It does not impose revenue thresholds. And its methodology is open for anyone to inspect, challenge, or build upon.

But the Scorecard is a dataset. It tells you how platforms score. It does not tell you what those numbers feel like when you are three months into a migration and your team is discovering the distance between marketing promises and production reality.

That's what this series is for.

What This Series Will Cover

Over the coming weeks, I am going to write about every single platform evaluated in the DXP Scorecard. All 26 of them. Not as product reviews or vendor comparisons, but as honest assessments from someone who has spent three decades building digital experiences on platforms across the entire spectrum.

Here is the complete list, organized by category:

Enterprise DXPs: Adobe Experience Manager, Sitecore AI, Sitecore XP, Optimizely SaaS CMS, Optimizely PaaS DXP, Bloomreach, Magnolia, Acquia, Liferay, HCL Digital Experience, Kentico Xperience, Salesforce Experience Cloud, and Uniform.

Headless CMS Platforms: Sanity, Contentful, Contentstack, Storyblok, Kontent.ai, and Hygraph.

Traditional and Open Source CMS: Drupal, WordPress VIP, Strapi, Payload CMS, Joomla, HubSpot CMS, and Umbraco.

For each platform, I will cover what the Scorecard data reveals, what the numbers cannot capture about the actual implementation experience, who the platform genuinely serves well, who should look elsewhere, and what the migration landscape looks like. I will close each assessment with an architect's perspective on where the platform sits in the broader story of digital experience.

Why the Architect's View Matters

I want to be clear about something. I am not a neutral observer. I have opinions about these platforms that have been shaped by decades of building on them, debugging them at two in the morning, watching upgrade paths go sideways, and celebrating the moments when a well-chosen foundation made everything that followed easier.

But I am also not selling you any of them.

The technology industry has a peculiar relationship with honesty. Vendors cannot afford to acknowledge limitations. Analyst reports face structural incentives that favor large, well-funded platforms over technically excellent smaller ones. Implementation partners often have commercial relationships that color their recommendations.

What I can offer is the view of someone who has actually built the buildings. Not someone who evaluated the blueprints or attended the sales presentation, but someone who poured the foundation, wired the electrical, and came back two years later to see what was holding up and what was cracking.

Some of these assessments will be uncomfortable for vendors who prefer their platforms be discussed in marketing language. Some will challenge assumptions that have hardened into conventional wisdom. A few might surprise people who have already written off platforms that deserve a second look.

That's the point.

How This Series Connects to the Scorecard

Every post in this series will reference specific scores from the DXP Scorecard, and I will explain what those scores mean in practical terms. But the Scorecard is the starting point, not the conclusion.

The Scorecard evaluates platforms across nine weighted categories: Core Content Management, Platform Capabilities, Technical Architecture, Platform Velocity and Health, Total Cost of Ownership, Build Complexity, Maintenance Burden, Use-Case Fit, and AI and Automation Readiness. The framework includes more than 120 scored criteria and continues to expand as the market evolves. Each score includes a reasoning statement and a confidence level based on the quality of evidence available.
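To make that structure concrete, here is a minimal sketch, in TypeScript, of how a single scored criterion might be modeled. The type names, fields, and the simple-average roll-up are my own illustration of the shape described above, not the Scorecard's actual schema or weighting.

```typescript
// Illustrative only: these names and the averaging logic are my own sketch,
// not the DXP Scorecard's published schema.
type Category =
  | "Core Content Management"
  | "Platform Capabilities"
  | "Technical Architecture"
  | "Platform Velocity and Health"
  | "Total Cost of Ownership"
  | "Build Complexity"
  | "Maintenance Burden"
  | "Use-Case Fit"
  | "AI and Automation Readiness";

type Confidence = "high" | "medium" | "low";

interface ScoredCriterion {
  category: Category;     // one of the nine weighted categories
  criterion: string;      // e.g. "Content versioning and rollback" (hypothetical)
  score: number;          // the numeric score assigned to the platform
  reasoning: string;      // the reasoning statement behind the score
  confidence: Confidence; // confidence level based on evidence quality
}

// A category result rolls up its criteria; a plain average stands in here
// for whatever weighting the framework actually applies.
function categoryScore(criteria: ScoredCriterion[], category: Category): number {
  const inCategory = criteria.filter((c) => c.category === category);
  if (inCategory.length === 0) return 0;
  const total = inCategory.reduce((sum, c) => sum + c.score, 0);
  return total / inCategory.length;
}
```

The detail worth noticing is the pairing of every numeric score with a reasoning statement and a confidence level. That pairing is what makes the framework auditable rather than a bare ranking.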

That framework provides structure. What I add is context. A platform might score well on raw capability but carry operational complexity that only becomes visible after your team has been living with it for six months. Another might score modestly on features but deliver an implementation experience so clean that your developers are productive from week one. These are the kinds of distinctions that matter in real platform decisions, and they are exactly what I intend to illuminate.

The Order of Things

I am starting with the platforms that generate the most procurement activity and the most debate: Adobe Experience Manager, the Sitecore family, and both sides of the Optimizely story. These are the heavyweights that dominate enterprise shortlists, and they deserve to be examined first because so many organizations are making consequential decisions about them right now.

From there, I will move through the modern headless contenders that are reshaping what enterprise content management looks like. Then the specialized enterprise platforms that serve specific industries and use cases. And I will finish with the open source foundations that power an enormous portion of the web, often without the analyst coverage or marketing budgets of their commercial competitors.

Every platform gets the same treatment. The same structure. The same honesty.

A Note on What This Is Not

This series is not a ranking. It is not designed to declare winners or crown a "best" platform. The entire premise of the DXP Scorecard is that platform decisions are multidimensional, and the right choice depends on your organization's specific needs, technical capabilities, budget constraints, and long-term strategy.

What this series will do is give you the implementation context that no dataset can provide. It will help you understand not just what a platform can do on paper, but what it actually demands from the teams who build on it and the organizations that operate it over time.

If you are a CTO evaluating platforms for a major initiative, this series will give you the kind of honest technical assessment that is difficult to find in vendor materials or analyst reports. If you are a marketing director trying to understand why your engineering team is passionate about one platform over another, this series will translate that technical perspective into terms that connect to business outcomes. And if you are an architect like me, navigating these decisions for the organizations that trust you with their digital foundation, I hope this series becomes a resource you return to.

Twenty-six platforms. Over one hundred and twenty criteria across nine categories, and growing. Three decades of perspective.

Let's begin: 26 platforms in 26 weeks. I'm sure a 27th will make an appearance by the time I'm done.

Danny-William
The Arch of the North

Sr Solution Platform Architect

HT Blue