December 15, 2022

A Summary: 7 Tips for a Successful Digital Transformation

This summary recounts the AB Tasty Summit 2020 fireside chat with Margaret Wise, CRO, and Luke Barton, VP of Client Services, at Arke. In this session they discussed 7 Tips for a Successful Digital Transformation: essential steps that have consistently worked for Arke customers and can work for you, too.

While focusing on the 7 most important tips for digital transformation success, Margaret and Luke also highlighted how transformation must be treated as an organizational change initiative, and how effective leadership, planning, identifying goals, alignment to KPIs, and ongoing measurement will yield positive results.

1. Align on a shared taxonomy.

This seemingly obvious tip often gets overlooked in the rush to kick off a project, until confusion and the cost and time of clarification and realignment become apparent. Being able to speak the same lexicon is imperative from the very beginning of a Digital Transformation.

Luke suggests conducting a “Pre-Discovery” before a project gets kicked off. This creates the opportunity to align and organize with your core group, then your stakeholder group, and then partners, and to compress your “Sprint 0” timeframe.

Another area where shared taxonomy comes into play is analytics: what data gets captured, what platform does it come from, where will it be stored, and what value does it show against KPIs and business goals?

TIP: Start to rough in your data collection and reporting framework – even if it’s imperfect – before you kick off, so you know the goals of your analytics, KPIs, key reports, repositories, and designated sources of truth BEFORE you start the endeavor.

2. Know what problems your Digital Transformation should solve.

Margaret begins the conversation by laying out two differing approaches: 1) doing something big, with a digital transformation roadmap, or 2) making smaller incremental changes that eventually add up to organizational digital transformation.

Luke’s experience has historically skewed heavily toward organizations tackling big issues – almost 75% of programs. The classic example involves a new leader joining a company and immediately seeing a need to re-work something: brand, website, CRM, ERP, process re-org, structural re-org, etc.

“Philosophically, this ‘go big or go home’ approach can work exceptionally well, even with failures along the way. Often, when leaders get broad buy-in across the organization and people truly believe in the value of the result, companies rally to make it work.”

Regarding the roadmap question, Luke explained, “One intrinsic risk to this approach is that the primary initiative is the biggest and the first – if it has challenges or does not finish on time, other objectives are not served, and key results are missed.” Luke cited instances where “successful groups ladder their goals in longer increments” (for example: here’s our 1-year goal, our 2-year goal, and our 3-year goal). Longer roadmaps tend to provide adequate time for larger programs to complete, deliver valuable outcomes, and show organizational adoption. Supporting departments can also align around smaller incremental goals and results that must happen along the way (smaller wins for aligned departments), even when the primary focus is the bigger initiative.

More recently, Luke explained, we’re seeing companies approach more “winnable initiatives that are smaller scale, but higher in number” (re-balancing the earlier percentages to 60% smaller, 40% big audacious goals). The best leaders are the ones who can chart a 3-year plan with enough detail in the first year, and enough flexibility in subsequent years to adapt along the way, while still driving consistent strategic alignment over the years. Budgets tend to be consistent when they yield predictable ROI that is closely aligned to KPIs.

“As far as comparing results across approaches – it is a toss-up. There is no right answer. Market conditions, financial positions, leadership biases, operational buy-in, etc. will always inform the approach,” said Luke.

TIP: Projects typically fail when they attempt to solve too much. Have the hard conversations up front to focus on the key problems you want to solve, prioritize them, and then set the expectation that all decisions are vetted against these priorities.

3. Build a business case for each problem to prioritize effort.

Margaret started this topic by asking: “One important part of what we do when we help our clients develop a roadmap is to prioritize steps. For many companies that are under-invested in digital tech, it can feel like everything is a priority. We look at cost/benefit analysis, but there are also dependencies and a logic to making sure the foundation is solid. What do you see as foundational needs, and where do you go from there?”

A key aspect of evaluating priorities is finding simple, accurate ways to measure and compare. By contrast, looking for the perfect way to tie everything together, every time, for every context, can be costly, time-consuming, and a drain on resources… and may not give you useful decision-support guidance.

Evaluating what you can measure with the readily available data you have is a key first step. Challenging that model with a fresh lens is critical, too, because many times you can develop a new perspective that informs your decisions. Once you put that measurement approach through its paces and feel confident that it provides accurate reporting, you can break down each step of the process, experience, or solution (whether you are building new or improving) and find ways to optimize it before you embark on major changes or new expenses. Seek ways to build or improve your solution in a manner that gets you to market quickly instead of perfectly. “Getting it out there” gives you the best opportunity to pressure-test what you’ve done, then adapt and iterate until you’re confident you’re reaching diminishing returns with the approach.

“Getting it out there gives you the best opportunity to pressure test what you’ve done and then adapt and iterate”

This optimization approach is highly repeatable and helps advance your baselines or benchmarks quickly, stacking up some wins before making the big bets. “A common approach we see is one where we solidify the analytics and measurement approach, optimize experiences, then roll out some test-and-learn scenarios,” says Luke, “and then, once we know more about the user base, start rolling out personalized experiences that create real benefits.”

TIP: Adopt a simplified, distilled way to summarize the business case for each initiative within the program.


Note: Although obvious and simple, attaching projections and deadlines to initiatives helps prioritize them.
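To illustrate that note, once each initiative has a projected return and a deadline attached, even a crude value-per-month score can rank them. This is a minimal sketch; the initiative names, dollar figures, and dates are hypothetical, not from the session:

```python
from datetime import date

# Hypothetical initiatives with a projected annual value and a target deadline.
# All names and numbers here are illustrative.
initiatives = [
    {"name": "Site search revamp", "projected_value": 250_000, "deadline": date(2023, 6, 30)},
    {"name": "Checkout A/B tests", "projected_value": 400_000, "deadline": date(2023, 3, 31)},
    {"name": "CRM re-platform",    "projected_value": 900_000, "deadline": date(2024, 12, 31)},
]

def months_until(deadline, today=date(2023, 1, 1)):
    """Rough number of months from `today` until the deadline."""
    return (deadline.year - today.year) * 12 + (deadline.month - today.month)

# One simple score: projected value per month of runway.
# Higher means more value, delivered sooner.
for item in initiatives:
    item["score"] = item["projected_value"] / max(months_until(item["deadline"]), 1)

ranked = sorted(initiatives, key=lambda i: i["score"], reverse=True)
for item in ranked:
    print(f'{item["name"]}: {item["score"]:,.0f} per month')
```

A near-term test program can outrank a much larger re-platform on this score, which is exactly the “smaller winnable initiatives first” pattern described above.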

4. Designate ownership by an Executive Sponsor or Steering Committee.

Margaret asked: New roles have emerged, such as Chief Customer Officer and Chief Digital Officer. For companies that are showing both progress and success in Digital Transformation/CX, where do you see ownership for this sitting?

“I see a trend moving away from the traditional ‘Marketing vs. IT’ balance toward a ‘Digital’ leader or a Digital Center of Excellence,” explains Luke.

Over the past few years, Digital has become a structured practice that, in the context of customers, seems to have risen above the peer relationship of IT vs Marketing. It’s a blended practice that ties it all together. Digital rolls up to revenue so everything is viewed through that lens, as opposed to a situation where the technology that drives a digital solution rolls up to an IT organization that may focus more on cost of ownership instead of revenue. Focusing on revenue as opposed to cost changes the value proposition of Digital Initiatives.

TIP: Regardless of where ownership sits, it’s imperative that it ties to corporate KPIs and is a senior enough role to span the business and remove roadblocks.

5. Align the goals of your digital transformation to your corporate KPIs.

Most companies have matured their goals past vanity metrics. But sometimes there is still a gap between a goal for marketing (for instance, engagement) and the corporate goal of increased revenue, share of wallet, or retention. How can companies think about connecting the dots from micro goals to strategic corporate KPIs?

Think about it in terms of the client journey. Micro goals are conversions in each stage of the journey, but when you dial in on each stage and are able to progress a prospect through the funnel faster, you’ve created conversion velocity and lowered your CAC. When we talk about connecting the dots, certain metrics cannot be viewed on their own in a vacuum, but rather in relationship to other goals. Here’s an example: you may designate certain marketing metrics as leading indicators of revenue metrics, even to the point of ratios that paint a picture like, “Historically, when we’ve amplified a solution that increased {Metric X} by {Y} percent, we see an increase in {Metric Z} within {this number of months}.” We don’t always find perfect attribution for every KPI, but evaluating trends helps inform priorities.
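The leading-indicator relationship described here can be roughed out from a simple monthly export. The sketch below uses made-up numbers (not data from the session) to show the mechanics: compute month-over-month changes for an engagement metric and for revenue, then check which lag gives the strongest correlation between them.

```python
# Illustrative monthly data (hypothetical): an engagement metric and revenue,
# constructed so that revenue changes tend to follow engagement changes.
engagement = [100, 110, 125, 120, 140, 150, 160, 155]
revenue    = [500, 505, 510, 540, 560, 545, 600, 630]

def pct_change(series):
    """Month-over-month percentage change."""
    return [(b - a) / a * 100 for a, b in zip(series, series[1:])]

def lagged_correlation(leading, trailing, lag):
    """Pearson correlation between `leading` and `trailing` shifted by `lag` months."""
    x = leading[:-lag] if lag else leading
    y = trailing[lag:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

eng_delta = pct_change(engagement)
rev_delta = pct_change(revenue)

# Try a few lags to find where engagement best "leads" revenue.
best_lag = max(range(0, 4), key=lambda k: lagged_correlation(eng_delta, rev_delta, k))
print(f"Engagement changes best predict revenue changes {best_lag} month(s) later")
```

In practice this only informs priorities rather than proving attribution, which matches the caveat above: trends, not perfect attribution.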

Luke offered a great example: A company wants to increase its share of customers’ wallets. They know that winning existing business is less expensive than winning new business. They also know that expanding a customer’s spend means marketing to and serving a customer segment who is not the “buyer,” but the user or practitioner – the influencer. So, they amplify low-cost enhancements to the marketing for that segment to influence their decision to buy more products and services from the “bottom up” instead of from the top down.

TIP: In a vacuum, the metrics surrounding a program may not immediately show increased revenue. The idea is to compare against future sales numbers for existing customers, where you will clearly see a trend.

6. Create a framework methodology for measuring success.

“One of the biggest challenges I’ve seen is how companies create a framework for measuring success,” Margaret explains. It requires good hygiene around data management and analytics, and it closes the loop back on aligned taxonomy and measurement, focused on the problems you were trying to solve.

Margaret asked Luke to provide an example of creating a framework for measuring success.

“There is one that is top of mind for me. It’s the problem teams face when they rename or re-organize a process while the process is going on,” says Luke. Here’s an example: take a sales funnel with 7 steps in it. Let’s say the team decides to reimagine that 7-step process a new way, maybe with only 5 steps, and decides to use some of the same (or similar) labels for the steps of the new 5-step process – you can imagine the confusion that will ensue. If that group didn’t identify a map from “new name” to “old name” and “old name” to “new name,” then reporting gets confusing and potentially flawed, especially when new parties analyze data across time periods using different labels. The same thing can happen with Conversion Goals in Google Analytics – “We used to call it this, then we changed the user experience and flow and noticed that conversion funnels were misaligned, so we re-worked those. With enough of those, analytics gets really messy,” says Luke.

To avoid this, we do see teams extracting data from various systems and importing it into a reporting solution they control. This helps bridge natural changes to solutions over time and keeps reporting as consistent as possible.
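One lightweight way to keep that bridge is a translation table stored alongside the data. The stage labels below are hypothetical (the session doesn’t name actual funnel stages); this is a minimal sketch of mapping old 7-step labels onto a new 5-step scheme before reporting, plus the reverse map for auditing:

```python
from collections import defaultdict

# Hypothetical mapping from old 7-step funnel labels to new 5-step labels.
# Kept in version control next to the reporting code so every analyst
# translates historical data the same way.
OLD_TO_NEW = {
    "Awareness":     "Discover",
    "Interest":      "Discover",
    "Consideration": "Evaluate",
    "Evaluation":    "Evaluate",
    "Intent":        "Commit",
    "Negotiation":   "Negotiate",
    "Purchase":      "Purchase",
}

# Reverse map ("new name" -> "old name(s)") for auditing old reports.
NEW_TO_OLD = defaultdict(list)
for old, new in OLD_TO_NEW.items():
    NEW_TO_OLD[new].append(old)

def normalize_stage(label):
    """Translate an old stage label to its new name; new labels pass through."""
    return OLD_TO_NEW.get(label, label)

# Events exported from two eras of the same funnel, with mixed labels.
events = ["Awareness", "Evaluate", "Intent", "Purchase", "Negotiation"]
normalized = [normalize_stage(e) for e in events]
print(normalized)  # stage names are now comparable across time periods
```

The design point is that the mapping lives in one controlled place, so renames stop silently fragmenting reports the way Luke describes.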

TIP: The right framework will set you up for frequent communication and transparency. This is required to enable a culture that supports curiosity.

7. Develop a mindset of experimentation and continuous improvement.

Gartner recently stated that the organizations that ranked highest in martech effectiveness were embracing Agile Marketing. Margaret asked: Can you leave us with a couple of examples of experimentation in action? What kinds of impacts have you seen?

Some of the most common phrases we hear when working with customers not yet maximizing their MarTech ecosystem include things like…

“We know we’re only using 25% of this platform’s capabilities…”

“We’re not staffed enough to use it all, so we only use this and this…”

“It can’t do this one thing the way we really want it to, so we’re going to migrate to something else…”

By contrast, customers who are getting the most from their ecosystem say things like:

“Let’s find out if {insert platform} can do {this}, and if so, let’s figure out if we can get a lift in {X} if we do it a lot…”

“I just found out that we can ‘integrate’ {Platform X} with {Platform Y} well enough if we just do {this}. It’s not perfect, but we’ll be much better off…”

“We know that exploration and curiosity drive adoption and value realization,” says Luke. For example, one of our customers was rolling out a new product detail page, and the primary call to action involved a user selecting a date range before proceeding. There are several ways to do that in a desktop view and a mobile view, so we presented a few options, but the team could not decide. Rather than arm-wrestle, we first deployed a new measurement tool to track user interactions in a very detailed way, then deployed the major update and measured. Within less than a day, it was clear that the chosen approach was creating friction for some users. We deployed a hot fix within a few days; the problem disappeared and conversions went up dramatically.

In another example, with a big-box retail customer, we were able to measure what users were NOT doing to determine why a certain KPI, “add to cart (ATC),” was low. We discovered that increasing the “activity timeout” timer – so that users weren’t logged out too quickly – increased the number of conversions dramatically and immediately.

Many customers see how experimentation is vital to aligning with an organization’s unique culture, and how it yields continued success.

RESOURCE: Want to read more about Building a Culture of Experimentation? Read the Harvard Business Review article.

Looking forward – where are we headed?

If I had to bet today on whether more companies will embark on a single major initiative or a collection of smaller ones, I’d bet on the latter. I think the trend toward maximizing returns with EXISTING technology will persist for another 8 to 18 months (most of our clients would say they have “plenty of technology”), or toward discrete, best-of-breed solutions that don’t claim to do everything (read: clear value proposition and less overlap). I think pressure to get to market is increasing, so optimizing what you have and showing returns quickly is being prioritized over huge investments of time and budget like re-platforming.

In conjunction with that trend, we’re seeing clients emphasize analytics as a practice across the team, not one contained only within an Analytics group. This means teams are able to make data-driven decisions more easily than before. It also helps with a shift in mindset – aligning teams toward KPIs that are accessible, meaningful, and adaptable, and prioritizing work for the most yield in the shortest amount of time.

Get the most from your digital investments.

For more information on how to maximize the value of your digital investments, reach out to us and get started today.

By Michael Stewart | December 15, 2022

About the author:

Michael Stewart is the Director of Marketing at Arke. He's a Savannah College of Art and Design graduate with more than 30 years of omni-channel marketing and ad creative experience.