
An Expert Guide to Using Digital Platforms


What is a Digital Platform?

A Digital Platform is a low-cost way of bringing your existing IT systems together into a unified platform. By integrating those systems, a Digital Platform allows your business to be more competitive in the modern marketplace. There are many elements and types of Digital Platform, which is why using them requires planning and patience. Overall, the correct Digital Platform strategy can transform a business. Embracing modernisation in this way may feel overwhelming at the start, but following the expert advice in this guide to ensure the process is completed properly will pay off in the future.


The Problem

Managing digital transformation in the 21st century isn’t easy. Over the years, organisations have built up tens or hundreds of legacy IT systems (either off the shelf or bespoke) that implement strategic and tactical business processes. Typically, these systems are stand-alone or have only a limited ability to integrate with the wider enterprise. Some organisations have chosen to meet the challenge of modernising their systems by spending tens of millions of pounds on expensive ERP systems or other systems integration platforms. These projects often go off the rails in terms of cost, time and achieving objectives.


The Solution

A Digital Platform is a cloud-based approach to IT system integration, allowing these disparate legacy systems to evolve into a modern, more flexible IT estate. The approach is to create cloud-based components that allow the legacy systems to exchange information with each other and, crucially, with a modern API that external systems can use.

Digital Platforms usually consist of these components (a short code sketch of the main roles follows the list):

  • A modern identity platform for login
  • A modern, friendly API
  • A Data Broker
  • An Enterprise Service Bus to allow communication between components
  • A variety of Digital Adapters
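
To make these roles concrete, here is a minimal sketch in Python (the class and method names are our own, purely illustrative, and not taken from any particular product):

from abc import ABC, abstractmethod

class DigitalAdapter(ABC):
    """Fronts one legacy system and translates between it and the platform."""

    @abstractmethod
    def fetch(self, entity: str, key: str) -> dict:
        """Read data about an entity (e.g. a Product) from the legacy system."""

    @abstractmethod
    def update(self, entity: str, key: str, data: dict) -> None:
        """Write data back to the legacy system."""

class DataBroker:
    """Dispatches requests to all registered Digital Adapters and collates the results."""

    def __init__(self, adapters: list[DigitalAdapter]):
        self.adapters = adapters

    def collect(self, entity: str, key: str) -> dict:
        merged: dict = {}
        for adapter in self.adapters:
            merged.update(adapter.fetch(entity, key))
        return merged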

Digital Platform Example

For the last 20 years, Acme Energy Company (AEC) has sold its services to corporate customers as a complete managed service. This has been very profitable. AEC has built up an array of billing, operations, metering, invoicing and other systems, all of which were best of breed when bought. These systems do not play well together and data is often (inaccurately) rekeyed between systems.

In recent years, the energy market has moved away from single managed services into a market for an individual “pick n’ mix” approach where customers buy different services from different providers. AEC’s systems do not support this model and so AEC would like to replace all of its systems. The current estimate is around eight years to complete this work, by which time AEC will have become uncompetitive.

Instead, AEC has chosen to implement a Microsoft Azure based Digital Platform. Each existing system will have a Digital Adapter that will allow the Digital Platform to read data from it and write data to it. The platform will be fronted by a modern REST API that will allow customers and App developers to harness individual services provided by AEC, rather than AEC providing a complete managed service.


Scope and Versioning

As time goes on, behind the scenes, AEC can replace the ageing legacy systems with modern ones. Each replacement system will have its own Digital Adapter, allowing a legacy system to be swapped out seamlessly without API users noticing the change.

This means that one would expect the Digital Platform to change over time. Given the pace of change in modern businesses, this change may occur rapidly! As such, the Digital Platform should be developed iteratively, with each new iteration built to production standard and delivering valuable end-user functionality. The scope of each iteration should be understood in terms of the business objectives that it serves.

Next, we look at how to architect cloud-based platforms for digital systems integration.


How to Architect Cloud Based Platforms for Digital Systems Integration

Typically, a Digital Platform design consists of the platforms and components introduced in the previous chapter: an Identity Platform, a modern API, a Data Broker, an Enterprise Service Bus and a set of Digital Adapters, together with a Security Manager that authorises requests.

The diagram below illustrates how these components fit together.


The Switchboard Approach

Later in this guide, we will explore the different strategies for implementing a Digital Platform, such as Switchboard, Sync and Hybrid. For now, we will focus on the Switchboard approach.

In this approach, a user with a modern iOS mobile App accesses the Digital Platform. The sequence of events is:

  1. The user requests information using the modern API relating to a particular Product
  2. The Identity Platform authenticates the user
  3. The Security Manager authorises the request
  4. The Data Broker dispatches messages requesting information about the Product to all Digital Adapters
  5. Each Digital Adapter simultaneously returns all information that it has relating to the Product to the Data Broker
  6. The Data Broker collates all the data and returns it to the App via the API response

This sequence of events can be synchronous or asynchronous depending on the availability and reliability of the systems a Digital Adapter is connected to.
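
As a rough illustration of the asynchronous variant of this sequence, a Data Broker might fan the request out to every Digital Adapter concurrently and collate whatever comes back. The sketch below is a minimal Python simulation; the adapter names and data are invented:

import asyncio

async def query_adapter(adapter_name: str, product_id: str) -> dict:
    # In a real platform this would call the Digital Adapter over the service bus;
    # here we simply simulate a per-adapter response.
    await asyncio.sleep(0)  # stand-in for network latency
    return {f"{adapter_name}_data": f"details for {product_id}"}

async def broker_collect(product_id: str, adapter_names: list[str]) -> dict:
    # Step 4: dispatch the request to all Digital Adapters at once
    responses = await asyncio.gather(
        *(query_adapter(name, product_id) for name in adapter_names)
    )
    # Step 6: collate everything into a single API response body
    collated: dict = {"productId": product_id}
    for response in responses:
        collated.update(response)
    return collated

print(asyncio.run(broker_collect("PROD-001", ["billing", "metering", "operations"])))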


The Microsoft Azure Implementation

Many cloud based platforms can be used for this type of digital systems integration. The diagram below shows a Microsoft Azure based implementation of the Digital Platform as described above:
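
As a rough guide to how the logical components might map onto Azure services (the specific service choices below are assumptions, apart from those named elsewhere in this guide):

# Assumed mapping of logical components to Azure services (illustrative only)
AZURE_COMPONENT_MAP = {
    "Modern API": "Azure API Management fronting Azure Functions",
    "Identity Platform": "Azure Active Directory (OAuth / OpenID Connect)",
    "Enterprise Service Bus": "Azure Service Bus",
    "Data Broker": "Azure Functions with a cloud-hosted database",
    "Digital Adapters": "Azure Functions, one per legacy system",
    "Monitoring and logging": "Azure Application Insights",
}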


The Benefits

It can be seen that the Digital Platform approach allows a modern REST API to easily overlay legacy systems through the use of cloud technologies and Digital Adapters.

Next, we look at how to do data modelling for your new Digital Platform.


How to do Data Modelling for your New Digital Platform

What do you Mean by Data Model?

The Data Model is a description of the data (from a business perspective) that the Digital Platform will manage. It describes the names and descriptions of the types of data that your Digital Platform will hold, and the relationships between them.

It will be very useful to refer to as you develop the Digital Platform and it will also allow the clients of the Digital Platform (other computer systems, Apps, Alexa etc.) to understand the information that they will have access to.

In reality, you will probably create multiple Data Models for different services. For example, you may have different functional models, such as a Billing Data Model and a Sales Data Model, or you may have different models for different regions. There are many different types of Data Model; the ones you use will depend on your exact business needs and operations.
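
Purely for illustration (the entity and field names below are assumptions, not taken from any real model), a tiny fragment of a Billing Data Model could be sketched like this before being drawn up formally in UML:

from dataclasses import dataclass, field
from datetime import date

@dataclass
class Customer:
    customer_id: str          # unique identifier agreed across the organisation
    name: str
    billing_address: str

@dataclass
class Invoice:
    invoice_number: str
    customer: Customer        # relationship: every Invoice belongs to one Customer
    issue_date: date
    line_items: list[str] = field(default_factory=list)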


Formats

Whatever type of Data Model you choose, it will typically be represented using a standard notation such as a UML class diagram or an Entity Relationship diagram. This is usually accompanied by a Data Dictionary that gives a more detailed description of each individual piece of information.

An example UML diagram is shown here:

A fragment of the companion Data Dictionary may look like this:


Scoped, Necessarily Complicated and Flexible

As discussed at the start of the article, each iteration of the Digital Platform should be scoped to deliver valuable business functionality that delivers benefits to users of the Digital Platform. As such, the Data Model for the Digital Platform will be similarly versioned. Each iteration will change the overall Data Model in a defined manner.

The Data Model for each iteration should cover only the data required to deliver the business objectives for that iteration (and the previous iterations). The Data Model should therefore be as simple as possible, without shying away from necessary complexity. Many systems have failed over the years because they were over-simplified to the point of not being fit for purpose.

Next, we look at how to map data from existing legacy systems into your new Digital Platform.


How to Map Data From Existing Legacy Systems for your New Digital Platform

How to Map Data

The Data Mapping process is often a complex one, involving specialists in business functions and IT from across your organisation. It is common to discover multiple sources of data, some of which may conflict (more on cleansing data later on). Part of the mapping exercise is to define the rules for deciding which set of data is definitive when data from different systems conflicts.


User Mapping and Technical Mapping

Data Mapping often comprises two processes, usually carried out in sequence. User Mapping involves using a legacy system and recording where information from the Data Model can be found in its user interface. This usually involves filling in a document and taking screenshots as examples.

This User Mapping document acts as a shared understanding of what information will become part of the Digital Platform Data Model.

Technical Mapping usually happens next. This data mapping process consists of taking the User Mapping documents and interrogating the legacy systems from a technical perspective. This involves technical staff, such as computer programmers and database specialists, using APIs and database queries to discover how to access and modify the data specified in the User Mapping document.

The purpose of the Technical Mapping documents is to discover enough information to allow each Digital Adapter to be created later on. The Digital Adapters are the components that retrieve and update information in legacy systems at the request of the Data Broker in the Digital Platform (see the architecture chapter earlier in this guide).
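
A Technical Mapping entry could be captured in a structured form along these lines (the fields, tables and systems named below are invented for illustration):

from dataclasses import dataclass

@dataclass
class TechnicalMapping:
    model_field: str      # field in the Digital Platform Data Model
    legacy_system: str    # which legacy system holds it
    access_method: str    # how the Digital Adapter reads/writes it
    notes: str = ""

mappings = [
    TechnicalMapping(
        model_field="Customer.billing_address",
        legacy_system="Legacy CRM",
        access_method="SQL: SELECT addr_line1, addr_city FROM cust WHERE cust_id = ?",
        notes="Address is split across two columns; the adapter must concatenate them.",
    ),
    TechnicalMapping(
        model_field="Invoice.issue_date",
        legacy_system="Billing system",
        access_method="SOAP API: GetInvoiceHeader(invoiceNumber)",
    ),
]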

You can download a short example of a Data Map here that contains User Mapping and Technical Mapping information.


Next, we look at how to cleanse data in existing legacy systems for your new Digital Platform.


How to do Data Cleansing In Existing Legacy Systems for your New Digital Platform

What is Data Cleansing and What are its Benefits?

Now, we should consider the data cleansing process within the legacy systems.

Typically, the data in legacy systems can be of poor quality and inconsistent between systems. For example, a customer’s name may be Smith in one system and Jones in another. It is also possible that one system uses a number to uniquely identify each customer while another system uses a completely different alphanumeric code.

The process of normalising and correcting data across your systems is known as cleansing.


Scoping And Cleansing

Earlier, we discussed the need to scope each iteration of your Digital Platform. This scoping will allow us to limit the amount of time-consuming and expensive data cleansing that we will need to undertake at any one time.

Although you should automate your data cleansing as much as possible, cleansing frequently remains a partly manual task requiring time and money. You may choose not to cleanse some data and instead trust your new Digital Platform (specifically the Data Broker component described in our earlier section) to work out the “most correct” data when client Apps request it.

Ideally, you’d cleanse data before incorporating a system into the Digital Platform, but sometimes it is simply uneconomic to do so!
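
As a small illustration of the “most correct” resolution described above, a Data Broker might resolve conflicting values with a simple source-precedence rule. The precedence order, systems and fields below are assumptions:

# Records for the same customer returned by different systems
records = {
    "crm":     {"name": "Smith", "email": "j.smith@example.com"},
    "billing": {"name": "Jones", "email": None},
}

# Assumed precedence: the CRM is treated as definitive, billing as fallback
PRECEDENCE = ["crm", "billing"]

def resolve(records: dict[str, dict]) -> dict:
    resolved: dict = {}
    for source in reversed(PRECEDENCE):   # apply lowest precedence first...
        resolved.update({k: v for k, v in records[source].items() if v is not None})
    return resolved                       # ...so higher precedence overwrites it

print(resolve(records))   # {'name': 'Smith', 'email': 'j.smith@example.com'}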


Other Benefits of the Data Cleansing Process

There are other benefits to cleansing your data. Especially in global organisations, cleansing for a Digital Platform is a great opportunity for agreeing on common coding and identification practices for key information in your organisation. This could be agreeing a global system for labelling products, identifying customers or coding projects. In itself, this will improve cross-border understanding and co-operation within your organisation.

Next, we look at API design for a digital transformation.


Application Programming Interface Design for a Digital Transformation

The next key activity is to design an API (Application Programming Interface) that client mobile Apps, progressive web Apps and Internet of Things Apps can utilise to create a great customer experience.


Anatomy of a Modern Application Programming Interface

A modern API is typically a REST API that exchanges JSON over the HTTPS protocol. This approach maximises the number of client App types that can make use of the API.

REST APIs have URL structures like this:
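
For illustration, a typical URL for the customer order described below might look like this (the host name is a placeholder):

GET https://api.example.com/v1/orders/12345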

This example would retrieve JSON data that represents customer order number 12345 like this:

{
  "orderNumber": "12345",
  "status": "Shipped",
  "deliveryDate": "2019-06-26T13:00:00Z",
  "... etc ..."
}

REST APIs use HTTP verbs such as GET, POST, PUT and DELETE to express the operation being carried out.
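
For example, a client written in Python using the requests library might exercise these verbs against the illustrative URL above (the endpoints and payloads are assumptions):

import requests

BASE = "https://api.example.com/v1"

# GET: retrieve an existing order
order = requests.get(f"{BASE}/orders/12345").json()

# POST: create a new order
created = requests.post(f"{BASE}/orders", json={"productCode": "ELEC-STD"})

# PUT: replace the order with an updated representation
requests.put(f"{BASE}/orders/12345", json={**order, "status": "Cancelled"})

# DELETE: remove the order
requests.delete(f"{BASE}/orders/12345")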


Versioning

Notice in the example above that “v1” forms part of the REST API URL. This is part of a common API versioning scheme: whenever the contract of an API changes, a new version is created, which simply increments the version number. API providers should seek to maintain older versions of their APIs for as long as practical, because many Apps will use the API and not all users will update their Apps in a timely manner. Usage of API versions should be monitored and versions retired when clients no longer use them.

Next, we look at authentication provided by an identity provider for an API.


Authentication Provided by an Identity Provider for an API

A modern Digital Platform API typically uses the industry standards OAuth and OpenID Connect to allow clients to authenticate to the API. This functionality is typically provided by a third-party Identity Provider such as Okta, Auth0 or Microsoft Azure Active Directory.

With OAuth, when the user tries to access a protected resource (such as the order above), they must provide a “bearer token”, issued by the Identity Provider, to prove their identity. If the user does not provide a valid token, then they will be prompted to log on to the Identity Provider to obtain one.
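
In practice, the client simply attaches the bearer token to each API request. A minimal Python sketch follows; how the token is obtained from the Identity Provider varies by provider and is not shown:

import requests

access_token = "eyJraWQiOi..."   # bearer token issued by the Identity Provider (truncated)

response = requests.get(
    "https://api.example.com/v1/orders/12345",
    headers={"Authorization": f"Bearer {access_token}"},
)

if response.status_code == 401:
    # No valid token: the client must send the user to the Identity Provider to log on
    print("Unauthorised - obtain a new token from the Identity Provider")
else:
    print(response.json())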


Tokens

Tokens are typically provided as JSON Web Tokens (JWTs). The decoded payload of a JWT looks something like this:

{
  "ver": 1,
  "jti": "AB.j5d093ynt095y4nt45uth409gn4mh59",
  "iss": "https://okta.somewhere.com/oauth2/8947dh63487326hx8",
  "aud": "p349tcu4ogjmpgeo",
  "iat": 1561553243,
  "exp": 1561556843,
  "cid": "uictnp9cny3t9",
  "uid": "ciu3tm958mct089",
  "scp": [
    "openid",
    "profile"
  ],
  "sub": "someone@somdomain.com"
}

This JWT payload identifies information including the following (a short decoding sketch follows the list):

  • Issuer: okta.somewhere.com (the Identity Provider that issued this token)
  • Expires: the date and time (as a UNIX epoch) when this token expires
  • Audience: The application that this token authorises access to
  • And more…
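
During development it can be handy to inspect these claims directly. The sketch below decodes the payload segment of a JWT without verifying its signature; real verification needs the Identity Provider's published keys and a proper JWT library:

import base64
import json

def decode_jwt_payload(token: str) -> dict:
    # A JWT is three base64url-encoded segments: header.payload.signature
    payload_segment = token.split(".")[1]
    # Restore the padding that base64url encoding strips
    padded = payload_segment + "=" * (-len(payload_segment) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))

# WARNING: this only *reads* the claims; it does NOT verify the token's signature,
# so it must never be used to make authorisation decisions.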

Providers

OAuth and OpenID Connect are very widely implemented by identity providers such as Okta, Auth0, Microsoft Azure Active Directory, Twitter, Facebook, Google and many, many more. Next, we look at implementing an API with the help of Microsoft Azure API Management!



Implementing Using Microsoft Azure API Management

Microsoft Azure provides an excellent component for managing your APIs in the cloud: API Management. Typically, the API’s code would be implemented using a mechanism such as an Azure Function or a web application. Azure API Management then fronts that implementation and provides features such as out-of-the-box integration with OAuth providers (so that application code does not have to implement complex authentication) and API definition using OpenAPI documents (in YAML or JSON).
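
As a rough sketch of the kind of code that might sit behind API Management, here is a minimal HTTP-triggered Azure Function using the Python programming model (it requires the azure-functions package, the route and field names are illustrative, and in a real deployment API Management would handle the OAuth validation in front of it):

import json
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    # Route assumed to be orders/{orderNumber} in the function's configuration
    order_number = req.route_params.get("orderNumber")

    # In a real platform this would come from the Data Broker
    order = {"orderNumber": order_number, "status": "Shipped"}

    return func.HttpResponse(
        json.dumps(order),
        status_code=200,
        mimetype="application/json",
    )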


Benefits of Azure API Management

Azure API Management provides the following benefits:

  • Scalable cloud platform
  • A single place to manage APIs
  • The ability to aggregate other web services into a single API tool
  • Built-in logging and usage insights via Application Insights
  • Connectivity beyond Azure
  • Supports standard protocols such as OAuth

Next, we look at data synchronisation strategies for digital transformation.


Data Synchronisation Strategies for Digital Transformation

Behind the scenes of your new Digital Platform, you will likely be aggregating data from multiple legacy systems and issuing instructions across various systems. For example, retrieving complete information about a customer may require pulling information from a CRM system, an ERP system, a web shop and others. All of this requires a strategy for bringing that information together and keeping it up to date.

For this chapter, we will focus on data retrieval. However, the same principles apply to updating data.

There are two primary data synchronisation strategies: periodic synchronisation of data and real-time switchboard.


Switchboard

A “real-time switchboard” system deals with requests for information (e.g. the customer information described above) by providing a single point of contact that queries the subordinate systems (CRM, ERP, etc.) in real time for each inbound request. This has the advantage of providing the latest information to the requester. The approach is only suitable where the systems being contacted have some kind of API access and are reasonably stable and quick to respond. In environments with a large number of legacy systems, this approach is unlikely to work well.


Sync

Where systems are unstable, slow to respond or lack a good API, a periodic synchronisation of data may be a superior approach. In this approach, at set intervals (once per hour, once per day, etc.), the Data Broker requests data about different data items (such as Orders) from all the different systems. It then collates the results and updates its own internal database, which holds a copy of the received data.

This means that users making requests do not have to wait for unreliable systems that may not respond (the Data Broker database will be cloud-based and very reliable). It also means that the data provided will always be somewhat out of date: if the synchronisation job runs once per day, data could be up to 24 hours old.
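
A periodic synchronisation job can be as simple as a scheduled loop. The sketch below uses invented system names and an in-memory store; in Azure this would more naturally be a timer-triggered Function writing to a database:

import time
from datetime import datetime, timezone

SYNC_INTERVAL_SECONDS = 3600  # e.g. once per hour

local_store: dict[str, dict] = {}   # the Data Broker's own copy of the data

def fetch_orders_from(system_name: str) -> dict[str, dict]:
    # Stand-in for calling that system's Digital Adapter
    return {"12345": {"status": "Shipped", "source": system_name}}

def run_sync(systems: list[str]) -> None:
    for system in systems:
        for order_id, data in fetch_orders_from(system).items():
            data["synced_at"] = datetime.now(timezone.utc).isoformat()
            local_store[order_id] = {**local_store.get(order_id, {}), **data}

if __name__ == "__main__":
    while True:
        run_sync(["billing", "metering", "operations"])
        time.sleep(SYNC_INTERVAL_SECONDS)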


Digital Adapters

In either approach, Digital Adapters are typically built to front the legacy systems and provide modern access for the Data Broker. They are usually required because the legacy systems do not play well with more modern systems!


Hybrid

In a more sophisticated platform, a mix of both approaches may be suitable to accommodate the varying reliability of the legacy systems that hold the data being requested.

Next, we look at access logging, legal issues and debugging for your Digital Platform.


Access Logging, Legal Issues and Debugging for your Digital Platform

Digital Platforms typically have various monitoring and logging mechanisms. Sometimes simple availability monitoring is enough; sometimes a more detailed access log is needed.


Access Log

In low-trust environments, it may be important to track which users are accessing which data, and when, in order to discover suspicious patterns of activity. For example, in financial institutions, unusual patterns of credit card activity may be flagged.
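
A structured access log entry only needs to capture who accessed what, and when. A minimal sketch, with field names of our own choosing:

import json
import logging
from datetime import datetime, timezone

access_logger = logging.getLogger("access")
logging.basicConfig(level=logging.INFO)

def log_access(user_id: str, resource: str, action: str) -> None:
    # Write one structured (JSON) access-log line per request
    access_logger.info(json.dumps({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user_id,
        "resource": resource,
        "action": action,
    }))

log_access("someone@somdomain.com", "/v1/orders/12345", "GET")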


Legal Issues

Access Logging is a thorny legal issue and legal advice should ALWAYS be sought when implementing such a system. Digital Transformations typically span multiple countries and legal jurisdictions, where data protection legislation differs and the degree of consent required for monitoring system usage (if such monitoring is legal at all) will vary.


Debug Support

Logs can also allow the replay of previous data changes requested by users, which is very useful when tracking down difficult-to-reproduce bugs. If a transaction log is kept as part of the data synchronisation or modification processes, then those transactions may be replayable by a tester or programmer when tracking down system errors. This typically requires some forethought and incorporation into the overall system architecture.


Microsoft Azure Application Insights

Microsoft Azure Application Insights is an excellent application performance management and logging platform. It provides extensive performance information (request rate, time to process requests, failure rates, etc.) for cloud-based applications and offers a great way to unify logging across a widely distributed Digital Platform. It comes with a powerful query language and is highly recommended for use in Digital Transformations!

There you have it, our expert guide to using Digital Platforms. We appreciate there’s a lot to take in there! So, if you have any questions or want to know more about how we can help you with your Digital Platform, then please contact us.
