In a previous post, I wrote about the first lesson of Couchbase’s Data Dilemma research – that, despite businesses having clear ambitions to radically improve new customer experiences and engagement, those ambitions aren’t being met. Today we’re going to look at one of the clear culprits: data.
When it comes to offering new digital experiences and services, an inability to use data can prove crippling. 90 percent of digital decision makers report that their ambitions to use data for new digital services are being held back – whether by a lack of resources, the complexity of using multiple technologies, or a reliance on legacy database technology. In fact, 41 percent of respondents have had digital projects fail outright in their organization because the legacy database couldn’t support them: 15 percent had projects fail after significant time and resources were invested, 29 percent have had to reduce the scope of a project due to the cost of making changes to legacy technology, and 14 percent have had to delay projects significantly. Across these statistics, legacy databases are the common thread.
Pointing the finger at the legacy database
Almost every enterprise has a legacy database: reliable, relational technology from Oracle, IBM, Microsoft or one of a host of others, which has proven invaluable at tasks such as processing transactions swiftly, reliably and consistently – critical for online businesses. However, digital experiences that foster customer engagement place new demands on the database that legacy technology cannot always meet. These new experiences have to be agile – the enterprise must be able to develop new applications that use data in different ways almost on the fly. They have to be reliable and always available, with no downtime allowed for maintenance or any other reason. They have to be scalable – able to scale massively to accommodate the success of new applications, seasonality, or other surges in interactions. And they must guarantee performance, even as applications engage with end users through increasingly complex, interconnected and varied data. Legacy databases simply aren’t up to these tasks.
This won’t improve over time: in the very near future, databases will face entirely new demands that, again, legacy solutions simply cannot cope with. For instance, only 41 percent of organizations say they can use data in real time – that is, as soon as it is recorded. On average, the most recent customer data that organizations’ databases can use is 28 hours old. Ask yourself whether you would use any supplier or service provider that took more than a day to react to your choices, and then ask whether these databases can cope with providing a user experience that occurs essentially in real time. There is also the question of new technology – consumers want to immerse themselves in virtual or augmented reality, to experience the Internet of Things, or to receive services backed by Artificial Intelligence. Yet only 19 percent of digital decision makers believe their database could completely support these new technologies if their organization began using them tomorrow; 59 percent say it could support them only to some extent, and 22 percent not at all, meaning investment is still needed in many places.
Clearly, legacy databases are an impediment to true digital innovation. If enterprises can’t overcome that impediment, what’s the worst that could happen? Part 3 will explain just that.