The wielder, not the axe, propels plunder aplenty

By Mervyn Mooi, director at Knowledge Integration Dynamics (KID)

Business intelligence is a fairly hot topic today – good news for me and my ilk – but that doesn’t mean everything about it is new and exciting. The rise and rise of BI has seen the technologies mature – driven by a sweeping round of acquisitions and consolidations in the industry just a few years ago – which has created something of a standardisation of tools.


We have dashboards and scorecards, data warehouses and all the old Scandinavian-sounding LAPs: ROLAP, MOLAP, OLAP and possibly a Ragnar Lothbrok or two. And, as the Vikings knew, without some means to differentiate, everyone in the industry becomes a me-too, which means that’s what their customers ultimately get. And that makes it very hard to win battles.

 

Building new frameworks around tools to achieve some sense of differentiation achieves just that: only a sense of differentiation. In fact, even when it comes to measurements, most measures, indicators and references in BI today are calculated in a common manner across businesses. They typically use financial measures, such as monthly revenues, costs, interest and so on. The real difference, however, comes in how the data is prepared and in the rules that are applied to those functions.

 

A basic example that illustrates the point: let’s say the Vikings want to invade England and make off with some loot. Before they can embark on their journey of conquest they need to ascertain a few facts. Do they have enough men to defeat the forces in England? Do they have enough ships to get them there? Do they know how to navigate the ocean? Are their ships capable of safely crossing? Can they carry enough stores to see them through the campaign or will they need to raid settlements for food when they arrive? Would those settlements be available to them? How much booty are they likely to capture? Can they carry it all home? Will it be enough to warrant the cost of the expedition?

 

The simple answer is that the first time they set sail they had absolutely no idea, because they had no data. It was a massive risk of exactly the type most organisations aim to avoid these days. So before they could even begin to analyse the pros and cons they had to get at the raw data itself. And that’s the same issue most organisations have today. They need the raw data, but they don’t need it, in the Viking context, from travellers and mystics, spirits and whispers carried on the wind. It must be good-quality data derived from reliable sources and a good geographic cross-section. And it is in preparing their facts – checking that they are correct, that they come from reliable sources, that there has been no case of broken telephone – that businesses will truly make a difference. Information is king in war because it allows a much smaller force to figure out where to maximise its impact upon a potentially much larger enemy. The same is true in business today.

 

Before the Vikings could begin to loot and pillage they had to know where they could put ashore quickly to effect a surprise raid with overwhelming odds in their favour. In business you could say that you need to know the basic facts before you drill down for the nuggets that await.

 

The first Viking raids were small, but they grew larger as the Vikings’ information about England grew. Pretty soon they had banded their tribes or groups together, shared their knowledge and were working toward a common goal: getting rich by looting England. In business, too, divisions, units or operating companies may individually gain knowledge that it makes sense to share with the rest, so that everyone works toward the most sought-after plunder: the overall business strategy.

 

Because the tools and technologies supply common functionality and businesses or implementers can put them together in fairly standard approaches as they choose, the real differentiator for BI is the data itself and how the data is prepared – what rules are applied to it before it enters the BI systems. Preparation is king.
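
To make the point concrete, here is a minimal sketch in Python of what such preparation rules might look like before a common measure such as monthly revenue enters the BI layer. The column names, thresholds and list of trusted sources are invented for illustration, not a prescription.

# A sketch of pre-BI data preparation rules (pandas).
# Column names, thresholds and trusted sources are illustrative assumptions.
import pandas as pd

TRUSTED_SOURCES = {"erp", "billing", "crm"}  # hypothetical reliable systems

def prepare_revenue_data(raw: pd.DataFrame) -> pd.DataFrame:
    """Apply business rules before the data reaches the BI tools."""
    df = raw.copy()
    # Rule 1: accept records only from sources we trust.
    df = df[df["source"].isin(TRUSTED_SOURCES)]
    # Rule 2: reject obviously broken records (missing or negative revenue).
    df = df.dropna(subset=["monthly_revenue"])
    df = df[df["monthly_revenue"] >= 0]
    # Rule 3: standardise the reporting period so measures compare alike.
    df["period"] = pd.to_datetime(df["date"]).dt.to_period("M")
    # The measure itself is calculated the same way everywhere;
    # the preparation above is where the differentiation happens.
    return df.groupby("period", as_index=False)["monthly_revenue"].sum()

The measure at the end is the same monthly-revenue calculation any business would run; everything above it is the preparation that separates whispers from reliable reports.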

 

These rules are what ultimately differentiate information based on wind-carried whispers from reliable reports from spies abroad. Which would you prefer with your feet on the deck?

 

Contact

 

Knowledge Integration Dynamics, Mervyn Mooi, (011) 462-1277, mervyn.mooi@kid.co.za

Thought Bubble, Jeanné Swart, 082-539-6835, jeanne@thoughtbubble.co.za

 

 


Big data: don’t adopt if you can’t derive value from it

Amid massive big data hype, KID warns that not every company is geared to benefit from costly big data projects yet.

By Mervyn Mooi, director at Knowledge Integration Dynamics (KID)


Big data has been a hot topic for some time now, and unfortunately, many big data projects still fail to deliver on the hype. Recent global studies point out that it’s time for enterprises to move from big data implementation and spend to actually acting on the insights gleaned from big data analytics.

 

But turning big data analytics into bottom-line benefits requires a number of things, including market maturity, the necessary skills, and processes geared to actioning insights. In South Africa, very few companies have these factors in place to allow them to benefit from significant big data projects. Despite the hype about the potential value to be derived from big data, value derivation is in truth still in its infancy.

 

Locally, we find the early adopters have been major enterprises like banks, where big data tools are necessary for sifting through massive volumes of structured and unstructured data to uncover trends and run affinity and sentiment analysis. But while they have the necessary advanced big data tools, we often find that these new technologies deliver little more than a sense of confirmation, rather than the surprise findings and bottom-line benefits they had hoped for.

 

This may be due to processes that result in slow application of new insights, as well as to a dire shortage of the new data science skills that marry technical, analytics and strategic business know-how. Currently, the process of big data management is often disjointed from start to finish: companies may be asking the right questions and gaining insights, but unless those insights are delivered rapidly and actually used effectively, the whole process is rendered ineffective. There is little point in having a multi-million rand big data infrastructure if the resulting insights aren’t applied at the right time in the right places.

 

The challenge now is around the positioning, management and resourcing of big data as a discipline. Companies with large big data implementations must also face the challenges of integration, security and governance at scale. We also find there are many misconceptions about big data, what it is, and how it should be managed. There is an element of fear about tackling this ‘brave new world’ of technology when, in reality, big data might better be seen as the evolution of BI.

 

Most commonly, we see companies feeling pressured to adopt big data tools and strategies when they aren’t ready and are not positioned to benefit. As with many technologies, hype and ‘hard sell’ may convince companies to spend on big data projects when they are simply not equipped to use them. In South Africa, only the major enterprises, research organisations and perhaps players in highly competitive markets stand to benefit from big data investments. For most of the mid-market, there is little to be gained from being a big data early adopter. We are already seeing cheaper cloud-based big data solutions coming to market, and – as with any new technology – we can expect more of these to emerge in future. Within a year or two, big data solutions will be more competitively priced and simpler, will require fewer skilled resources to manage, and may then become more viable for small to mid-market companies. Until then, many may find that more effective use of their existing BI tools, and even simple online searches, meets their current needs for market insights and information.

 

Unless there is a compelling reason to embark on a major big data project now, the big data laggards stand to benefit in the long run. This is particularly true for those small and mid-size companies currently facing IT budget constraints. These companies should be rationalising, reducing duplication and waste, and looking to the technologies that support their business strategies, instead of constantly investing in new technology simply because it is the latest trend.

 

Mervyn Mooi, Knowledge Integration Dynamics, (011) 462 1277

Jeanne Swart, Thought Bubble, 082 539 6835

Data (Information) Governance: a safeguard in the digital economy

Global interest in Data Governance is growing, as organisations around the world embark on Digital Transformation and Big Data management to become more efficient and competitive. But while data is being used in myriad new ways, the rules for effective governance must prevail.

By Mervyn Mooi, director at Knowledge Integration Dynamics (KID)


The sheer volume and variety of data coming into play in the increasingly digital enterprise presents massive opportunities for organisations to analyse this data and apply the insights derived from it to achieve business growth and realise efficiencies. Digital transformation has made data management central to business operations and created a plethora of new data sources and challenges. New technology is enabling data management and analysis to be more widely applied, supporting organisations that increasingly view data as a strategic business asset to be used for competitive advantage.

To stay ahead, organisations have to be agile and quick in this regard, which has prompted some industry experts to take the view that data governance needs a new approach: data discovery carried out first, before data governance rules are decided on and applied in an agile, scalable and iterative way.

While approaching data management, analysis and associated data governance in an iterative way, using smaller packets of data, makes sense, the rules that are applied must still comply with legislation and best practice; as a prerequisite, these rules should therefore be formalised before any data project or data discovery is undertaken. Governance rules must be consistent and support the organisation’s overall governance framework throughout the lifecycle of each data asset, regardless of where and when the data is generated, processed, consumed and retired.
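
As a minimal sketch of this principle – the rule names and checks below are invented for illustration – governance rules can be declared once, up front, and then enforced unchanged at every stage of the lifecycle:

# A sketch: governance rules formalised once, before any project work,
# then applied identically at every lifecycle stage.
# Rule names and checks are illustrative assumptions only.
from typing import Callable, Dict

GOVERNANCE_RULES: Dict[str, Callable[[dict], bool]] = {
    "no_raw_id_numbers":     lambda rec: "id_number" not in rec,
    "source_is_recorded":    lambda rec: bool(rec.get("source")),
    "retention_tag_present": lambda rec: "retain_until" in rec,
}

def enforce(record: dict, stage: str) -> dict:
    """Apply the same rule set whether the data is being generated,
    processed, consumed or retired."""
    failures = [name for name, rule in GOVERNANCE_RULES.items()
                if not rule(record)]
    if failures:
        raise ValueError(f"{stage}: record violates {failures}")
    return record

Because the rule set is defined before discovery begins, an iterative project can still move quickly: each new packet of data simply passes through the same enforce() gate.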

In an increasingly connected world, data is shared and analysed across multiple platforms all the time – by both organisations and individuals. Most of that data is being governed in some way, and where it is not, there is risk. Governed data is secure, applied correctly and of good quality (reliable), and – crucially – it helps mitigate both legal and operational risk. Poor-quality data alone is a significant cause for concern among global CEOs: a recent Forbes Insights and KPMG study found that 45% of CEOs say their customer insight is hindered by a lack of quality data, and 56% say they have concerns about the quality of the data they base their strategic decisions on, while Gartner reports that the average financial impact of poor-quality data could amount to around $9.7 million annually. On top of this, the potential costs of unsecured data or non-compliance could be significant. Fines, lawsuits, reputational damage and the loss of potential business from highly regulated business partners and customers are among the risks faced by organisations that fail to implement effective data governance frameworks, policies and processes.

Ungoverned data results in poor business decisions and exposes the organisation and its customers to risk. Internationally, data governance is taking top priority as organisations prepare for new legislation such as the EU’s General Data Protection Regulation (GDPR), which is set to come into effect next year, and as organisations such as Data Governance Australia launch a new draft Code of Practice setting benchmarks for the responsible collection, use, management and disclosure of data. South Africa, perhaps surprisingly, is at the forefront here, with its POPI regulations and wide implementation of other guidelines such as King III and Basel. New Chief Data Officer (CDO) roles are being introduced around the world.

Now more than ever, every organisation has to have an up-to-date data governance framework in place and, more importantly, have the rules articulated or mapped into its processes and data assets. Organisations must look from the bottom up, to ensure that the rules on the floor align with the compliance rules and regulations from the top. These rules and conditions must be formally mapped to the actual physical rules and technical conditions in place throughout the organisation. By doing this, the organisation can illustrate that its data governance framework is real and articulated into its operations, across the physical business and technical processes, methodologies, access controls and data domains of the organisation, ICT included. This mapping process should ideally begin with a data governance maturity assessment upfront. Alongside this, the organisation should deploy dedicated data governance resources for sustained stewardship.
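
To illustrate what such a bottom-up mapping might look like in practice – the control identifiers below are invented for the example, though the regulations named are real – each top-level policy can be tied explicitly to the technical controls that enforce it:

# A sketch of a governance mapping: top-level compliance rules tied to
# the physical/technical controls that enforce them on the floor.
# Control identifiers are hypothetical.
POLICY_TO_CONTROLS = {
    "POPI: personal information must be access-restricted": [
        "db.grants: SELECT on customer PII limited to role 'steward'",
        "etl.masking_job: hash id_number before warehouse load",
    ],
    "Retention: records kept no longer than mandated": [
        "scheduler.purge_job: archives then deletes expired records",
    ],
}

def coverage_report(mapping: dict) -> None:
    """Show, for audit purposes, where each policy is physically
    articulated, and flag policies with no mapped control."""
    for policy, controls in mapping.items():
        status = "MAPPED" if controls else "GAP"
        print(f"[{status}] {policy}")
        for control in controls:
            print(f"    -> {control}")

A report like this is the kind of evidence that can demonstrate to auditors that the governance framework is articulated into operations rather than sitting in a policy document.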

Mapping the rules and conditions, and configuring the relevant toolsets to enforce data governance, can be a complex and lengthy process. But it is necessary in order to entrench data governance throughout the organisation. Formalised data governance mapping shows where and how the organisation has implemented data governance, demonstrating that policies are entrenched throughout its processes, and so supporting audit while reducing both compliance risk and operational risk.

To support agility and rapid delivery iterations for data management and analysis initiatives, data governance can be “sliced” specifically for the work at hand and applied in an iterative fashion, organically covering all data assets over time.