Blockchain in the compliance arsenal

By Mervyn Mooi

Blockchain technology may support some data management efforts, but it’s not a silver bullet for compliance.

Amid growing global interest in the potential of blockchain technology to support data management, enterprises may be questioning its role in compliance, particularly as the deadline looms for compliance with the European Union's General Data Protection Regulation (GDPR).

For South African enterprises, compliance with the Protection of Personal Information (POPI) Act and alignment with the GDPR are a growing concern. Because the GDPR and POPI are designed to foster best practice in data governance, it is in the best interests of any company, no matter where in the world it is based, to follow their guidelines for data quality, access, life cycle management and process management.

At the same time, blockchain is attracting worldwide interest from a storage efficiency and optimisation point of view, and many companies are starting to wonder whether it can effectively support data management, security and compliance. One school of thought holds that moving beyond crypto-currency, blockchain’s decentralised data management systems and ledgers present new opportunities for more secure, more efficient data storage and processing.

However, there are still questions around how blockchain will align with best practice in data management and whether it will effectively enhance data security.


Currently, using blockchain to store data may be beneficial for historical accounting and tracking/lineage purposes (since it is immutable), but numerous factors limit blockchain's ability to support GDPR, POPI and other compliance requirements.

Immutability pros and cons

Because public blockchains are immutable, once data is stored in blockchains, it cannot be changed or deleted. This supports auditing by keeping a clear record of the original, and of every change made to the data. However, while blockchain stores data lineage economically, it does not address data quality or integration issues.
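The immutability described here comes from each block embedding a hash of its predecessor, so altering any earlier record breaks every hash that follows it. A minimal sketch in Python, with invented record fields for illustration:

```python
import hashlib
import json

def block_hash(record: dict, prev_hash: str) -> str:
    """Hash a record together with the previous block's hash."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

# Build a tiny two-block chain.
genesis = block_hash({"customer": "A123", "action": "created"}, "0" * 64)
second = block_hash({"customer": "A123", "action": "address updated"}, genesis)

# Tampering with the first record produces a different hash, which no
# longer matches the hash embedded in the second block, so the change
# is detectable by anyone re-verifying the chain.
tampered = block_hash({"customer": "A123", "action": "deleted"}, "0" * 64)
assert tampered != genesis
```

This is why a blockchain preserves lineage well: the history cannot be rewritten silently, only appended to.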

It should also be noted that this same immutability could raise compliance issues with the GDPR's 'right to be forgotten' provisions, which dictate the circumstances under which records must be deleted or purged.

In a public blockchain environment, deletion is not feasible. Indeed, in many cases it would be neither realistic nor constructive to destroy all records. This is an area where local enterprises would need to consider carefully how closely they want to align with the GDPR, and whether encrypting data to put it beyond use would suffice to meet the right-to-be-forgotten guidelines.
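The "encryption to put data beyond use" idea is sometimes called crypto-shredding: only ciphertext goes on the immutable chain, while the key lives in deletable off-chain storage; destroying the key renders the on-chain record unreadable. A toy sketch of the pattern (the XOR cipher and key-store names are illustrative only; a real system would use a vetted cipher such as AES and a managed key store):

```python
import os

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Toy one-time-pad cipher for illustration only; use real AES in practice."""
    return bytes(b ^ k for b, k in zip(data, key))

key_store = {}                        # off-chain, deletable key storage

record = b"name=Jane Doe;email=jane@example.com"
key = os.urandom(len(record))         # one key per data subject
key_store["subject-42"] = key

on_chain = xor_bytes(record, key)     # only ciphertext is written to the chain

# "Forgetting" the subject: delete the key. The ciphertext remains
# immutably on the chain but can no longer be decrypted by anyone.
del key_store["subject-42"]
```

Whether key destruction legally satisfies erasure under the GDPR remains a matter of interpretation, which is exactly the alignment question raised above.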

Publicly stored data concerns

In addition to the right-to-be-forgotten issue, there is the challenge that data protection, privacy and accessibility are always at risk when data is stored in a public domain, such as the cloud or a blockchain environment. Enterprises weighing the storage optimisation benefits of blockchain would therefore also have to consider whether core and confidential data is stored locally on private chains; whether those chains are subject to security and access rules; and whether the chain registries in the distributed blockchain environment are protected and subject to availability rules.

Blockchain environments also present potential processing limitations: enterprises will have to consider whether blockchain allows parts of the chain stored for a particular business entity, such as a customer (or its versions), to be accessed and processed separately by different parties (data subjects) and/or processes.

Data quality question

The pros and cons of blockchain's ability to support the storage, management and security of data are just one side of the compliance coin: data quality is also a requirement of best practice data management. Quality is not a function of blockchain and therefore cannot be guaranteed by it. Indeed, blockchain will store even unqualified data, before it has been cleansed and validated.

Enterprises will need to be aware of this, and consider how and where such data will be maintained. The issues of data integration and impact analysis also lie outside the blockchain domain.
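Because an immutable store will happily preserve bad data forever, quality gating has to happen before anything is committed. A hypothetical sketch (the field names, rules and `commit_if_clean` helper are invented for illustration):

```python
import re

EMAIL = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate(record: dict) -> list:
    """Return a list of quality problems; an empty list means the record is storable."""
    problems = []
    if not record.get("customer_id"):
        problems.append("missing customer_id")
    if not EMAIL.match(record.get("email", "")):
        problems.append("invalid email")
    return problems

def commit_if_clean(chain: list, record: dict) -> bool:
    """Only append to the (immutable) chain once the record passes quality rules."""
    if validate(record):
        return False          # quarantine for cleansing instead of storing
    chain.append(record)      # immutable once written
    return True
```

The same principle applies to any append-only store: validation and cleansing sit outside it, upstream of the write.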

IDC notes: “While the functions of the blockchain may be able to act independently of legacy systems, at some point blockchains will need to be integrated with systems of record,” and says there are therefore opportunities for “blockchain research and development projects, [to] help set standards, and develop solutions for management, integration, interoperability, and analysis of data in blockchain networks and applications”.

While blockchain is set to continue making waves as ‘the next big tech thing’, it remains to be seen whether this developing technology will have a significant role to play in compliance and overall data management in future.


Sub-second analytical BI time to value still a pipe dream

Internet search engines with instant query responses may have misled enterprises into believing all analytical queries should deliver split second answers.

With the advent of Big Data analytics hype and the rapid convenience of internet searches, enterprises might be forgiven for expecting to have all answers to all questions at their fingertips in near real time.


Unfortunately, getting trusted answers to complex questions is far more complicated and time-consuming than typing a search query. Behind the scenes of any internet search, a great deal of preparation has already been done in order to serve up appropriate answers. Google, for instance, dedicates vast high-end resources to continually preparing the data needed to answer a search query instantly. But even Google cannot answer broad questions or make forward-looking predictions. Where the data is known and trusted, the preparation has been done, rules have been applied and the search parameters are limited (as on a property website, for example), almost instant answers are possible, but this is not true BI or analytics.

Within the enterprise, matters become a lot more complicated. When an end user seeks an answer to a broad query, such as when a marketing firm wants to mine social media to find affinity for a certain range of products over a six-month period, a great deal of 'churn' must take place in the background to deliver answers. This is not a split-second process, and it may deliver only general trend insights rather than trusted, quality data that can serve as the basis for strategic decisions.

When end users are given the power to run their own BI/analytics queries, lengthy churn must take place. Every time a query, report or instance of data access is converted into useful BI/analytical information for end-consumers, a great deal of preparation work must be done along the way: identify data sources > access > verify > filter > pre-process > standardise > look up > match > merge > de-duplicate > integrate > apply rules > transform > format > present > distribute.
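That preparation chain can be pictured as a sequence of stage functions applied in order; each stage consumes and produces a set of records. The stage implementations below are simplified stand-ins for what are, in practice, substantial processes:

```python
from functools import reduce

# Simplified stand-ins for a few of the preparation stages.
def access(rows):
    return rows                                            # fetch from sources

def verify(rows):
    return [r for r in rows if r.get("id") is not None]    # drop unidentifiable rows

def standardise(rows):
    return [{**r, "name": r["name"].strip().title()} for r in rows]

def dedup(rows):
    seen, out = set(), []
    for r in rows:
        if r["id"] not in seen:
            seen.add(r["id"])
            out.append(r)
    return out

PIPELINE = [access, verify, standardise, dedup]

def run(rows):
    """Apply each preparation stage in order: the 'churn' behind every query."""
    return reduce(lambda acc, stage: stage(acc), PIPELINE, rows)

raw = [{"id": 1, "name": " alice "}, {"id": 1, "name": "ALICE"}, {"id": None, "name": "bob"}]
print(run(raw))   # → [{'id': 1, 'name': 'Alice'}]
```

Each stage here runs in microseconds on three rows; over millions of rows and a dozen-plus stages, the same chain is what turns 'instant' queries into hours of background work.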

Because most queries have to traverse, link and process millions of rows of data and possibly trillions of words from within the data sources, this background churn could take hours, days or even longer.

A recent TDWI study found that organisations are dissatisfied with the time it takes for the chain of processes involved in BI, analytics and data warehousing to deliver valuable data and insights to business users. The organisations attributed this, in part, to ill-defined project objectives and scope, a lack of skilled personnel, data quality problems, slow development, or the inability to access all relevant data.

The problem is that most business users are not BI experts and do not all have analytical minds, so the discover-and-report method may be iterative (and therefore slow), and in many cases the outputs are not of the expected quality. The results may also be inaccurate, as data quality rules may not have been applied and data linking may not be correct, as it would be in a typical data warehouse where data has been qualified and pre-defined/derived.

In a traditional setup, with a structured data warehouse where all the preparation is done in one place, once only, and then shared many times, supported by quality data and predefined rules, sub-second answers may be possible. But even in this scenario, sub-second insights are often not achieved, since time to insight also depends on properly designed data warehouses, server power and network bandwidth.

Users tend to confuse search and discovery over flat raw data that is already there with information and insight generation at the next level. In more complex BI/analytics, all the preparation work must be done from the beginning each time a query is run, and the necessary churn can take a significant amount of time.

Therefore, demanding faster BI ‘time to value’ and expecting answers in sub-seconds could prove to be a costly mistake. While it is possible to gain some form of output in sub-seconds, these outputs will likely not be qualified, trusted insights that can deliver real strategic value to the enterprise.

By Mervyn Mooi, Director at Knowledge Integration Dynamics (KID)

 

Big data best practices, and where to get started

Big data analytics is on the ‘to do’ list of every large enterprise, and a lot of smaller businesses too. But perceived high costs, complexity and the lack of a big data game plan have hampered adoption in many South African businesses.

By Mervyn Mooi, Director, The Knowledge Integration Dynamics Group

Big data as a buzzword gets thrown around a great deal these days. Experts talk about zettabytes of data and the potential goldmines of information residing in the wave of unstructured data circulating in social media, multimedia, electronic communications and more.

As a result, every business is aware of big data, but not all of them are using it yet. In South Africa, big data analytics adoption is lagging for a number of reasons: not least of them, the cost of big data solutions. In addition, enterprises are concerned about the complexity of implementing and managing big data solutions, and the potential disruptions these programmes could cause to daily operations.

It is important to note that all business decision makers have been using a form of big data analytics for years, whether they knew it or not. Traditional business decision making has always been based on a combination of structured, tabular reports and a certain amount of unstructured data – be that a phone call to consult a colleague or a number of documents or graphs – and the analytics took place at the discretion of the decision maker. What has changed is that the data has become digital; it has grown exponentially in volume and variety, and now analytics is performed within an automated system. To benefit from the new generation of advanced big data analytics, there are a number of key points enterprises should keep in mind:

  • Start with a standards-based approach. To benefit from the almost unlimited potential of big data analytics, enterprises must adopt an architected, standards-based approach to data/information management implementation, which includes business requirements-driven integration, data and process modelling, quality management and reporting, to name a few competencies.


In context of an organized approach, an enterprise first needs to determine where to begin on its big data journey. The Knowledge Integration Dynamics Group is assisting a number of large enterprises to implement their big data programmes, and we have formulated a number of preferred practices and recommendations that deliver almost instant benefits and result in sustainable and effective big data programmes.

  • Proof of concept unlocks big value. Key to success is to start with a proof of concept (or pilot project) in a department or business subject area that has the most business "punch" or is of the most importance to the organisation. In a medical aid company, for example, the claims department might be the biggest cost centre and carry the most focus. The proof of concept or pilot for this first subject area should not be a throwaway effort, but rather a solution that can later be quickly productionised, with relevant adjustments, and reused as a template (or "footprint") for programmes across the enterprise.
  • Get the data, questions and outputs right. Enterprises should also ensure that they focus on only the most relevant data and know what outputs they want from it. They should carefully select the data/information for analytics that will give the organisation the most value for the effort. Furthermore, the metrics and reports that the organisation generates and measures itself by must also be carefully selected and adapted to specific business purposes. And of course, the quality and trustworthiness of sourced data/information must be ensured before analytical models and reports are applied to it.
  • Get the right tools. In many cases, enterprises do not know how to apply the right tools and methodologies to achieve this. Vendors are moving to help them by bringing to market templated solutions that are becoming more flexible in what they offer, so allowing organisations to cherry pick the functionality, metrics and features they need. Alternatively, organisations can have custom solutions developed.
  • It’s a programme, not a project. While proofs of concept typically show immediate benefits, it is important for organisations to realise that the proof of concept is not the end of the journey – it is just the beginning. Implementing the solution across the enterprise requires strategic planning, adoption of a common architected approach (e.g. to eliminate data silos and wasted/overlapping resources), and effective change management and collaboration initiatives to overcome internal politics and potential resistance and ensure the programme delivers enterprise-wide benefits.