Martin Gaffney, vice-president EMEA at Yugabyte, shares a cautionary tale of picking the wrong software for a modern omnichannel service, and explains why such an approach won’t work long-term without edge computing
Once upon a time, there was (and still very much is) a global retail giant running a vast e-commerce site. Featuring thousands of sellers and millions of products, the site was supported by multiple applications that handled large volumes of data and determined how all the products for sale were listed, displayed, grouped, filtered, searched and presented.
They aimed to give users a personalised, seamless buying experience. To do this, the organisation, quite rightly, was making full use of microservices at scale. These were powered by a cloud-native technology stack exploiting all the right names: Kafka, Akka Streams, and Apache Cassandra as the database. And all these great services were being deployed in a multi-data centre topology, using a multi-cloud deployment strategy.
You’re probably beginning to suspect an enterprise CIO ‘but’ is coming. And you’d be right. It might surprise you, though, that even though this wasn’t (and isn’t) a telco, an IoT company or a real-time monitoring system, the ‘gotcha’ was around edge computing.
Close to the edge
If you’ve been monitoring the progress of edge, this won’t surprise you. Edge is a distributed computing paradigm focused on bringing computation and data storage closer to the sources of data to reduce latency and save bandwidth.
If, as many do, you see ‘edge’ as a tool for specific applications rather than for business, well, things have moved on. If you run databases in the cloud, you may need to update your thinking around edge computing.
Before we get back to my customer experience war story, a brief explanation is in order.
Edge is not a new topic for the Internet of Things or 5G people. Really, it’s not that new for any company that runs a network to support a business; it’s just that expectations have changed. Most industries have some requirement to keep track of big pieces of plant and kit, be it manufacturing, transportation, logistics, energy, or oil and gas. In the past, if it took a few seconds to get data back when you called up a request from your server, it didn’t matter. But business sped up, you needed that answer back more quickly, and you started asking many more, and much smaller, things how they were doing too.
In the past, retailers had their own on-site computer that talked back to the company network and reported stock levels and all that good stuff. All this was fine when there wasn’t a rush or any issues, but a network failure between a high street store and the main data centre or cloud farm meant (among other things) that the checkouts went down and you couldn’t trade until it was fixed centrally. While that was happening, you were losing money and customers were voting with their feet.
The in-store network is now much better. If I’m in Waitrose with the little handheld device they supply, scanning an intriguing box of cornflakes, that purchase gets zapped to the equivalent of the local server in the manager’s office and is added to my virtual basket. The speed of that microtransaction isn’t important, but if there were no local network and it had to be pushed all the way to the retailer’s data centre in Bracknell to reach the price database, checkout would be really slow, especially given the multiple transactions by multiple shoppers.
I’m a picky customer, so I like to do some online price and product checking before I buy. If I see some great cornflakes on the website, or some amazing new antioxidant prune juice concoction for my health kick, and it isn’t in store after you promised me it would be, I wouldn’t be too happy.
If I am online and buy the last Apple MacBook Pro 64GB in Rose Gold in your shop, you had better tell the store right now, before you take my money. You must be able to tell them this is now sold, so they cannot sell it to the customer at the front of the queue who is eager to buy it.
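To make that concrete, here is a rough sketch of what that ‘tell the store right now’ step could look like against a PostgreSQL-compatible database. The table and column names (inventory, reservations, sku, store_id, qty) are illustrative only, not any real retailer’s schema; the point is simply that the stock check, the decrement and the reservation happen as one atomic unit, so the last MacBook can only be sold once.

```python
# A minimal sketch, assuming a PostgreSQL-compatible endpoint; the table and
# column names (inventory, reservations, sku, store_id, qty) are hypothetical.
import psycopg2

conn = psycopg2.connect("host=db.example.com dbname=retail user=app password=secret")

def reserve_last_unit(sku: str, store_id: str, order_id: str) -> bool:
    """Decrement stock and record the reservation in one ACID transaction."""
    with conn:                       # commits on success, rolls back on error
        with conn.cursor() as cur:
            # Conditional decrement: only succeeds if at least one unit remains.
            cur.execute(
                """
                UPDATE inventory
                   SET qty = qty - 1
                 WHERE sku = %s AND store_id = %s AND qty > 0
                """,
                (sku, store_id),
            )
            if cur.rowcount == 0:
                return False         # already sold; say so before taking the money
            # Record the reservation so the store holds the item for collection.
            cur.execute(
                "INSERT INTO reservations (order_id, sku, store_id) "
                "VALUES (%s, %s, %s)",
                (order_id, sku, store_id),
            )
    return True
```

Because the decrement is conditional on qty > 0 and runs inside a transaction, two shoppers racing for the same last unit cannot both succeed; one of them gets a clear ‘sold out’ answer before any money changes hands.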
Keeping the busy digital-savvy shopper happy
So, in a world full of people like me, who want to interact with the retailer on that local network, but also in the cloud, data interactions have become more complicated than they used to be.
The store now needs to make sure it’s got my item on the shelf, or in the backroom ready for me if I’ve reserved it. If I travel to pick it up and it’s been sold to someone else, I’m not going to be pleased. Picture the annoyed tweets and poor Google reviews, which nobody can afford now that we all shop and compare in cyberspace. To effectively deliver the 360-degree customer service you have promised, you need to tie the two (the shop network and the bigger company cloud network) very tightly together.
Ultimately, the low latency and efficient message turnaround I expect from my industrial edge, I now need in my previously more relaxed retail context.
Another aspect of this is the ultra-fast, optimised pricing you now need. You can’t leave that to some buyer in the main office; the computers have got to be doing this with all your EPOS and SKU data in real time. The best way of managing this and achieving true commercial ‘edge’ capability is via a distributed database.
So, you distribute part of your main working product database into each location. Every supermarket outlet gets a copy of the price and product location database, which is constantly updated in real time. If it gets disconnected from the network, it doesn’t matter; prices are the latest ones it had before disconnecting, so it can carry on and resynchronise when the network comes back.
Even better, because of the way these databases are written, integration between the edge (the store) and the cloud is quick, interactive and automated. You don’t have to worry about batch jobs running at the end of the day, or about manual intervention to keep the scripting, monitoring, checking and recovery working effectively. All that is now built in, because these are the intrinsic features of a modern distributed SQL database.
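As an illustration of what ‘built in’ means here, the sketch below (hypothetical tables again, any PostgreSQL-compatible driver) updates the live price and its audit history in a single transaction. Once that commit lands, a distributed SQL database takes care of getting it to every region and every store replica, with no end-of-day batch run.

```python
# A minimal sketch, again assuming a PostgreSQL-compatible distributed SQL
# cluster; the prices and price_history tables are hypothetical.
import psycopg2

conn = psycopg2.connect("host=db.example.com dbname=retail user=pricing password=secret")

def reprice_product(sku: str, new_price: float, reason: str) -> None:
    """Change the live price and write the audit trail as one atomic commit."""
    with conn, conn.cursor() as cur:
        cur.execute(
            "UPDATE prices SET unit_price = %s, updated_at = now() WHERE sku = %s",
            (new_price, sku),
        )
        cur.execute(
            "INSERT INTO price_history (sku, unit_price, reason) VALUES (%s, %s, %s)",
            (sku, new_price, reason),
        )
    # After the commit, every connected store reads the new price; a store that
    # is cut off keeps serving the last price it saw and catches up on reconnect.
```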
Right tool for an edge-y job
This brings us back to my real-world customer example. The client was trying to do all this with NoSQL, and it just wasn’t cutting it.
The company ended up having unsustainable difficulties maintaining true consistency across their range. Too many times, items showed as available when they were not, or sales were lost because nobody knew an item was available at the right time in the right place. This ticked their customers and their selling partners off. Worst of all, the central IT team was spending unbudgeted money fixing and unpicking the inconsistencies in the database.
In technical terms, the NoSQL database they’d chosen was simply not adequate for the systems of record that formed the core of their commercial responsibilities.
The good news is that by going for a distributed SQL option instead, they achieved extremely high-volume ACID-transactional consistency across a multi-region deployment, including multi-table transactions. They also saw very low latency for high-volume reads, and the system is now far more bullet-proof in terms of uptime and resiliency.
In a world where you want to achieve high customer service with superior customer experience, and gain full value from omnichannel at the data level, a distributed, transactional database with edge computing may be your best option.
It’s good to have a choice, but even better to get what you thought you were buying in the first place.