Shifting technology forces are reshaping digital transformation. The rush to the cloud is old news – but organisations that lifted and shifted are now on a journey of cloud modernisation. The building blocks of this journey are cloud-native services – strung together to build more powerful, more complex and more valuable capabilities faster than ever before.
The appeal of cloud-native services is this: they are easy to pick up and integrate, and they all live within one set of infrastructure, one set of project and deployment tooling and one set of governance standards. The services themselves are commodities, and the lines between the services used for transactional software, data engineering and even data science are increasingly blurred.
This technology paradigm is driving several broader changes in the industry. It wasn’t long ago that you’d find a data engineering team working in total isolation – maybe on Hadoop running on some local boxes – and your data scientists were probably doing most of their work on their own machines, on data they manually downloaded, whilst your software teams enjoyed all the cutting-edge tooling. These days, however, if you look inside most modern engineering businesses, you might find it hard to tell at first glance who is a software engineer and who is a data engineer.
The inefficiency of isolation
As most businesses begin to demand near real-time insight, forward-looking, predictive analytics and slicker, more engaging customer experiences, it’s no longer realistic to look at software, data and platform in isolation.
But operationally, many organisations haven’t reacted to this yet. Quite often, in our experience, a closer look at companies ‘doing digital transformation’ reveals separate software, data and business intelligence teams, each with its own management structure and goals. Software projects are often standalone, run as part of an application or modernisation strategy. Data engineering teams focus on data strategy and building common data models, with business intelligence often sitting in finance. Data science, if it happens at all, happens in isolation, as does IT.
These separate teams, management structures and disparate goals make executing efficiently against a modern digital strategy exceptionally difficult. They also obscure how these technology silos connect and build on each other to drive common outcomes.
A model we have adopted when thinking about how to design a successful strategy and operational implementation for a digital transformation that traverses software, data and platform is the value pyramid.
The digital value pyramid
The pyramid analogy is important because you can’t build the next layer without all the layers below it in place. The base layer is hardware and platforms, which have been commoditised to the point of being largely strategically unimportant. The next layer up is software apps: transactional systems – your usual business applications. The next layer is actionable intelligence: information that helps humans make decisions, such as business intelligence. Next is predictive analytics: forward-looking analytics that predict the outcomes of decisions. Lastly, at the very top of the pyramid, is predictive automation: both predicting the optimal outcomes and automatically taking action.
From a CEO’s perspective, standalone software and data engineering projects are usually justified on grounds of efficiency, business continuity, risk reduction and hygiene. To get projects sanctioned in the upper layers of the pyramid – actionable intelligence, predictive analytics and predictive automation – you need to be really clear on the tangible link to business value, which can be tough to prove: how does this improve the competitiveness of the business, increase revenue or increase gross margin?
To justify spending in those areas, your teams need to really understand their own business and how it makes money, and give the leadership team confidence that the associated business change can be delivered. So how does this model help? The answer lies in the idea of vertical capabilities.
A vertical capability is not concerned with fitting into a given technology grouping. Instead, it is concerned with using technology to create a new or improved, clearly defined business capability, with its own goals, KPIs and value profile.
For example, “understand when an item is likely to go out of stock and the impact that may have” is a clear vertical capability. It slices up through the layers all the way to predictive analytics but stops short of predictive automation.
If a business had the budget and appetite, it could extend that capability to “automatically manage stock so items are kept at the optimal level”, pushing it up into predictive automation, or pursue another clear vertical capability such as “plan the most efficient delivery route in real time”, which slices all the way up to predictive analytics.
It is worth noting that not all vertical capabilities reach for the top of the pyramid. Many capabilities naturally top out at software apps or actionable intelligence.
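To make the pyramid and the idea of a capability “slicing up” through its layers concrete, here is a minimal illustrative sketch. The layer names come from the article; the class names, fields and the example capability are hypothetical, purely to show the relationship:

```python
from dataclasses import dataclass
from enum import IntEnum


class PyramidLayer(IntEnum):
    # Ordered bottom-to-top: each layer builds on the ones below it.
    HARDWARE_AND_PLATFORM = 1
    SOFTWARE_APPS = 2
    ACTIONABLE_INTELLIGENCE = 3
    PREDICTIVE_ANALYTICS = 4
    PREDICTIVE_AUTOMATION = 5


@dataclass
class VerticalCapability:
    """A clearly defined business capability with its own goals and KPIs."""
    name: str
    top_layer: PyramidLayer  # the highest layer the capability reaches

    def layers(self) -> list[PyramidLayer]:
        # A capability needs every layer from the base up to its top:
        # you can't build a layer without the ones below it in place.
        return [layer for layer in PyramidLayer if layer <= self.top_layer]


stock_forecast = VerticalCapability(
    name="understand when an item is likely to go out of stock",
    top_layer=PyramidLayer.PREDICTIVE_ANALYTICS,
)

# This capability slices through four layers but stops short of automation.
print(len(stock_forecast.layers()))                                    # 4
print(PyramidLayer.PREDICTIVE_AUTOMATION in stock_forecast.layers())   # False
```

Extending the capability to “automatically manage stock” would simply mean raising its `top_layer` to `PREDICTIVE_AUTOMATION` – the lower layers stay in place and are built upon.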
Placing a capability at the centre of a transformation project involves a big shift in thinking.
One of the biggest shifts is the move from isolated departments and strategies to integrated teams formed around realising capabilities. These teams may include many different technology practices working closely together towards a common goal, from software engineers through to data scientists. It involves stepping back from designing large, sweeping data pipelines, warehouses and data models, and instead expanding and modernising small services to support your capability.
On a macro level, this approach to transformation has fundamental implications, impacting everything from how you hire, how teams are assigned to projects and how costs are allocated across a business, to how entire strategies are formulated. It requires combining strategic and operational thinking in an integrated capability roadmap aligned toward a common goal.
It aligns teams of data engineers and software engineers to collaborate effectively and focus on business value. Of course, businesses will ultimately need multiple capabilities developed at the same time – some going right to the top of the pyramid, others remaining in the software layers. The key to success, as always in business, is communication between these defined and closely integrated project roles. There is a certain level of complexity to manage; the most pressing task is making sure teams effectively leverage common technologies and patterns.
The impact of this approach, born out of the drive to the cloud, is a switch from separate strategies and projects per technology area to an integrated strategic roadmap that focuses on vertical capabilities. And if you get this right, you’ll fly.
About the author: Stewart Smythe is CEO of Ascent.