Data (Information) Governance: a safeguard in the digital economy

Global interest in Data Governance is growing, as organisations around the world embark on Digital Transformation and Big Data management to become more efficient and competitive. But while data is being used in myriad new ways, the rules for effective governance must prevail.

By Mervyn Mooi, director at Knowledge Integration Dynamics (KID)


The sheer volume and variety of data coming into play in the increasingly digital enterprise presents massive opportunities for organisations to analyse that data and apply the resulting insights to achieve business growth and realise efficiencies. Digital transformation has made data management central to business operations and created a plethora of new data sources and challenges. New technology is enabling data management and analysis to be more widely applied, supporting organisations that increasingly view data as a strategic asset for gaining a competitive advantage.

To stay ahead, organisations have to be agile and quick in this regard, which has prompted some industry experts to argue that data governance needs a new approach: data discovery carried out first, before data governance rules are decided on and applied in an agile, scalable and iterative way.

Approaching data management, analysis and the associated governance in an iterative way, using smaller packets of data, makes sense. However, the rules that are applied must still comply with legislation and best practice, and as a prerequisite these rules should be formalised before any data project or data discovery is undertaken. Governance rules must be consistent and support the organisation's overall governance framework throughout the lifecycle of each data asset, regardless of where and when the data is generated, processed, consumed and retired.

In an increasingly connected world, data is shared and analysed across multiple platforms all the time – by both organisations and individuals. Most of that data is governed in some way, and where it is not, there is risk. Governed data is secure, correctly applied and of reliable quality, and – crucially – it helps mitigate both legal and operational risk. Poor quality data alone is a significant concern among global CEOs: a recent Forbes Insights and KPMG study found that 45% of CEOs say their customer insight is hindered by a lack of quality data, and 56% have concerns about the quality of the data on which they base strategic decisions, while Gartner reports that the average financial impact of poor quality data is around $9.7 million annually. On top of this, the potential costs of unsecured data or non-compliance can be significant. Fines, lawsuits, reputational damage and the loss of potential business from highly regulated partners and customers are among the risks faced by an organisation that fails to implement effective data governance frameworks, policies and processes.

Ungoverned data results in poor business decisions and exposes the organisation and its customers to risk. Internationally, data governance is taking top priority as organisations prepare for new legislation such as the EU's General Data Protection Regulation (GDPR), set to come into effect next year, and as bodies such as Data Governance Australia launch a draft Code of Practice setting benchmarks for the responsible collection, use, management and disclosure of data. South Africa, perhaps surprisingly, is at the forefront here with its POPI regulations and wide implementation of other guidelines such as King III and Basel. New Chief Data Officer (CDO) roles are being introduced around the world.

Now more than ever, every organisation has to have an up-to-date data governance framework in place and, more importantly, have its rules articulated or mapped into its processes and data assets. Organisations must look from the bottom up, to ensure that the rules on the floor align with the compliance rules and regulations from the top: the governance rules and conditions must be formally mapped to the actual physical rules and technical conditions in place throughout the organisation. By doing this, the organisation can demonstrate that its data governance framework is real and articulated into its operations, across physical business and technical processes, methodologies, access controls and data domains, ICT included. This mapping process should ideally begin with a data governance maturity assessment, and the organisation should deploy dedicated data governance resources for sustained stewardship.
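
To make this mapping concrete, the sketch below shows one minimal way a rule-to-control mapping could be recorded and checked in code. It is purely illustrative: the rule wordings, control names and check functions are hypothetical, not drawn from any specific governance toolset or regulation text.

```python
# Illustrative sketch: governance rules mapped to technical controls.
# All rule wordings, controls and checks are hypothetical examples.

GOVERNANCE_MAP = {
    "Personal data must be encrypted at rest": {
        "control": "database encryption (TDE) enabled",
        "check": lambda env: env.get("tde_enabled", False),
    },
    "Data retained no longer than necessary": {
        "control": "retention job purges records older than 7 years",
        "check": lambda env: env.get("retention_days", 0) <= 7 * 365,
    },
}

def audit(environment: dict) -> None:
    """Report whether each governance rule's mapped control passes."""
    for rule, mapping in GOVERNANCE_MAP.items():
        status = "PASS" if mapping["check"](environment) else "FAIL"
        print(f"{status}  {rule}  ->  {mapping['control']}")

# Example run against a (hypothetical) environment description.
audit({"tde_enabled": True, "retention_days": 2000})
```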

Mapping the rules and conditions, and duly configuring the relevant toolsets to enforce data governance, can be a complex and lengthy process. But it is necessary in order to entrench data governance throughout the organisation. Formalised data governance mapping shows the world where and how the organisation has implemented data governance, demonstrating that policies are entrenched throughout its processes, supporting audit and reducing both compliance and operational risk.

To support agility and speed of delivery in data management and analysis initiatives, data governance can be “sliced” specifically for the work at hand and applied in an iterative fashion, organically covering all data assets over time.

 

 

Risks en route to cloud

By Veemal Kalanjee, Managing Director at Infoflow – part of the KID group

Security in the cloud worries many companies, but security and risk management during migration should be of greater concern.


Security and control of data are commonly cited as being among the top concerns of South African CIOs and IT managers. There is a prevailing fear that business-critical applications and information hosted anywhere but on-premises are at greater risk of being lost or accessed by cyber criminals.

In fact, data hosted by a reputable cloud service provider is probably far safer than data hosted on-premises and secured by little more than a firewall.

What many businesses overlook, however, is the possibility that the real business risks and data security issues could occur before the data has actually moved to the cloud, or during the migration to the cloud.

When planning a move to the cloud, rushing the process poses risks. Poor selection of the cloud service provider, failure to ensure data quality and security, and overlooking critical integration issues can endanger both data security and business continuity.

Large local companies have failed to achieve ambitious plans to rapidly move all their infrastructure and applications to the cloud due to an ‘eat the elephant whole’ approach, which can prove counter-productive and risky. To support progress to the cloud while mitigating risk, cloud migrations should be approached in small chunks instead, as this allows for sufficient evaluation and troubleshooting throughout the process.

Look before leaping

Before taking the plunge, companies must carefully evaluate their proposed cloud service and environment, and strategically assess what data and applications will be moved.


Businesses must consider questions around what cloud they are moving to, and where it is hosted. For example, if the data will be hosted in the US, issues such as bandwidth and line speed come into play: companies must consider the business continuity risks of poor connections and distant service providers.

They must also carefully assess the service provider’s continuity and disaster recovery plans, the levels of security and assurances they offer, and what recourse the customer will have in the event of data being lost or compromised or the service provider going out of business. Moving to the cloud demands a broader understanding of security technologies and risk among all project team members than was needed previously, in non-cloud environments.

In addition, when considering a move to the public cloud, one aspect that cannot be mitigated away is that what was once an environment used exclusively by the company is, in the public cloud, a multi-tenant shared environment, which potentially brings its own security risks.

It is up to the company to perform a comprehensive due diligence analysis on the cloud vendor to ensure the multitude of security risks are adequately addressed through preventative security measures put in place by the vendor.

Data on the move

Once a suitable cloud vendor has been identified, the data to be migrated must be assessed, its quality must be assured, and the data must be effectively secured.

The recommended first step is to identify the data to be migrated, considering, for example:
* Are there inactive customers on this database?
* Should the company retain that data, archiving it on-premises, and move only active customers to the cloud?

Once the data to be migrated has been identified, the company must review the quality of this data, identifying and addressing anomalies and duplicates before moving to the next phase of the cloud migration. Since poor quality data can undermine business success, the process of improving data quality ahead of a cloud migration can actually improve business operations, and so help mitigate overall business risk.
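
As a rough illustration of these two steps, the sketch below filters out inactive customers and then flags duplicates and missing contact details before migration. It is a minimal example using pandas; the file and column names (customers.csv, last_order_date, email) are assumptions, not a prescribed schema.

```python
import pandas as pd

# Hypothetical customer extract; file and column names are illustrative.
customers = pd.read_csv("customers.csv", parse_dates=["last_order_date"])

# Step 1: identify the data to migrate - e.g. only customers active in
# the last two years; the rest can be archived on-premises.
cutoff = pd.Timestamp.today() - pd.DateOffset(years=2)
active = customers[customers["last_order_date"] >= cutoff]
archive = customers[customers["last_order_date"] < cutoff]

# Step 2: review quality before moving - flag duplicates and records
# with missing contact details for remediation.
duplicates = active[active.duplicated(subset=["email"], keep=False)]
missing_contact = active[active["email"].isna()]
print(f"To migrate: {len(active)}, to archive: {len(archive)}")
print(f"Duplicates: {len(duplicates)}, missing emails: {len(missing_contact)}")

# Deduplicate before the move, keeping each customer's latest record.
clean = (active.sort_values("last_order_date")
               .drop_duplicates(subset=["email"], keep="last"))
```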

Moving data from the company’s internal network to an external network can present a number of risks.

Adequate levels of data encryption and/or masking must be applied and a secure transport layer implemented to ensure the data remains secure, wherever it is.
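
As an illustrative sketch only, the snippet below uses the Python cryptography package to encrypt a sensitive record before it leaves the internal network. In practice the transport itself would also run over TLS, and the key would be held in a key management service rather than generated in code.

```python
from cryptography.fernet import Fernet

# In production the key lives in a key management service, never
# alongside the data; it is generated here only for illustration.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"id=1042;name=Jane Dlamini;account=62123456789"

# Encrypt before the record leaves the internal network...
token = cipher.encrypt(record)

# ...and decrypt only once it is inside the target environment.
assert cipher.decrypt(token) == record
```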

In the move to the cloud, the question of access must also be considered – both for individual users and for enterprise applications. It is important to consider all points of integration to mitigate business continuity issues: in many cloud migrations, companies overlook integration points that were never documented, creating business continuity challenges. A robust cloud integration solution simplifies this task.

The risk of business processes failing should also be considered during the migration to the cloud. Companies must allocate sufficient time for testing – running systems in parallel for a period to ensure they all function as expected.

While there are risks in moving to the cloud, when the process is approached strategically and cautiously, there are many potential benefits to the migration process. Done well, the process can result in better quality data, a more strategic approach to data management and security, and more streamlined business processes.

Big data: over-hyped and under-utilised


By Mervyn Mooi, director at Knowledge Integration Dynamics (KID)

 

The spectre of big data analytics is driving businesses everywhere to re-evaluate their strategies and consider massive investments to monetise their data. But many are missing the point – big data is available to virtually everyone without significant investment, and it is being under-utilised within the enterprise right now.

 

Too many enterprises hold the mistaken belief that to get value from big data, they must invest heavily in infrastructure and software solutions that will allow them to gather practically all the internal and external, structured and unstructured data that exists, store it in massive data reservoirs and then embark on lengthy data analytics processes to arrive at insights.

 

This belief holds them back from fully capitalising on the big data they already have access to. Budget constraints and perceived complexity are limiting their use of data beyond the walls of their own enterprises. This need not be the case.

Big data has been hyped to a point where it has become daunting to many, yet in reality it is just the next level of the BI, fact-finding and business logic that has existed for years.  Big data practice simply delivers quicker value to end-users through enablement factors such as the internet, the cloud and the availability of feature-rich tools.


Big data at its most basic

 

Many of these tools are affordable and scalable down to a single user anywhere on the planet. For example, a consumer concerned about his health might use his smartphone to go online and research high cholesterol symptoms and treatment. He uses a search engine to distil the massive volumes of big data that exist on the subject, assesses the information, and makes an informed decision to consult a doctor. This is big data methodology, and big data analytics tools, in use in their simplest form.

 

On a larger scale, a car dealer might assess his sales figures and expand his insight by following social media opinions about the car models he sells, studying industry forecasts and trends, and reading columns about buyer priorities. By bringing additional, external inputs into his data, he positions himself to offer better deals or models more likely to sell.

In these cases, the value of the data analysis comes from distilling only the relevant data from multiple sources to support decision-making.

 

Big data as broader BI

 

In large enterprises, large amounts of data already exist – often in silos within the BI, CRM, customer service centre and sales divisions. This data, supplemented with external data from quality research, social media sentiment analysis, surveys and other sources, becomes big data that can be harnessed to deliver more advanced insights for a competitive edge. Big data is not as big as it sounds, and organisations do not need to invest millions to start benefiting from it. They just need to start looking outside the organisation and bringing in information that is relevant to the business case they want to address.

For many, this will be the extent of their big data analytics needs, and it is achievable with the technologies, skills and data they already have access to. Big data practice accommodates less skilled analysts and is not pitched only at experienced BI practitioners or data scientists. Nor should it be the task of IT.

 

In fact, big data practice should be the preserve of business managers, who are best placed to determine what questions should be asked, what external factors impact the business, what information will be relevant, and what steps should be taken once insights are obtained from data analysis. Business managers, as the data stewards and subject matter experts, will require certain technology tools to analyse the data, but these BI tools are typically user-friendly, and little training is needed to master them.

 

Big data moves for big business

 

In major enterprises that see potential long-term business value in a big data investment, a simple way to assess that value is to outsource big data analysis before taking the plunge. This allows the enterprise to determine whether the investment will deliver on its promise.

 

Whether outsourced or implemented internally, enterprises must determine at the outset what their objectives for big data projects are, to ensure that these projects deliver on expectations. Big data practice is agile and can be applied to almost any data to deliver a wide range of insights. It is not enough for enterprises to vaguely seek to ‘monetise’ data.

 

This term, which is merely a new spin on ‘data franchising’, remains meaningless without clear business objectives for the big data analysis exercise. To be effective, data analytics must be applied in a strategic way to achieve specific business outcomes.

 

Five data protection approaches to take seriously in 2017

Information security remains a grudge purchase for many, but SA business needs to pay urgent attention to key lessons learnt from increasingly sophisticated breaches.

 

By Veemal Kalanjee, Managing Director at Infoflow – part of the KID group

 

In the past year, we have witnessed increasingly bold and sophisticated attacks on corporate and personal data around the world. The fact that there has been no common modus operandi in these attacks should be cause for concern among businesses everywhere, since this means attacks are unpredictable and harder to mitigate. We’ve seen significant IT organisations breached, and even security-savvy victims tricked into parting with passwords. Clearly, the standard security protocols are no longer enough and data security must be built into the very fabric of the business.

Five key lessons South African businesses need to take from data breach patterns of the past year are:

Security is a C-suite problem. IT professionals are well aware of the risks, but in many cases the rest of the C-suite sees security as a grudge purchase. This is understandable, because the reality is that most C-level executives are focused on stretching their dwindling budgets to address business-critical initiatives, and protection against data breaches often takes a back seat.

But protection of personal information is becoming legislated and it is only a matter of time before C-suite members are held personally accountable for breaches. Business owns the data and is ultimately responsible for any breaches that occur, regardless of the measures that IT might put in place. The business itself stands to fail if a significant breach occurs.


Business, therefore, needs visibility into where the organisation's data breach vulnerabilities lie, and needs to participate actively in helping IT ensure that policies are implemented and adapted to address ever-changing security threats. The C-suite cannot afford to sit back and ‘see what happens’ – it needs to determine the risk immediately and weigh it against the investment, time and effort it is prepared to spend on mitigating that risk.

Cloud caution is warranted. For years, South African businesses were cautious about the security and sovereignty of their data in the cloud. A lack of clearly defined policies (or any policies, for that matter) often dissuaded organisations from moving to the cloud.

Now, many have moved to cloud, but typically through a hybrid or private model, with data security top of mind. This approach means organisations cannot fully optimise the scalability and other benefits of the public cloud, but it also means that their own data security policies can be applied to protecting their data at all times.

Data classification and DLP strategies are crucial. Classification of sensitive data is an extremely important step in implementing a data loss prevention strategy. This classification becomes the point of departure for understanding where sensitive data lies, how much of it is susceptible to breach and how the organisation is tracking it in terms of protecting its sensitive data assets. Organisations may well have their data centres locked down, but if sensitive data also resides in email, test and development environments or unprotected workflow systems, it remains at risk.

Advanced solutions must be harnessed to manage the data classification process and give C-level users a holistic view into where they stand in terms of protection of data.
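
To make the classification step more concrete, here is a deliberately simple sketch that scans tabular data for likely sensitive fields using pattern matching. The patterns and file name are illustrative assumptions; real DLP tooling classifies far more robustly.

```python
import re
import pandas as pd

# Illustrative patterns only - real classifiers are far more thorough.
PATTERNS = {
    "email": re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+"),
    "sa_id_number": re.compile(r"\b\d{13}\b"),   # 13-digit SA ID format
    "card_number": re.compile(r"\b\d{16}\b"),
}

def classify(df: pd.DataFrame, sample: int = 100) -> dict:
    """Label each column with the sensitive-data patterns it matches."""
    findings = {}
    for col in df.columns:
        values = df[col].dropna().astype(str).head(sample)
        hits = [name for name, rx in PATTERNS.items()
                if values.str.contains(rx).any()]
        if hits:
            findings[col] = hits
    return findings

# Hypothetical usage: flag sensitive columns before the data spreads
# into email, test environments or unprotected workflow systems.
print(classify(pd.read_csv("customer_extract.csv")))
```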

Security doesn’t end at encryption. While encryption is an important step in securing data, it is not a foolproof solution for all threats. Encryption is a great mechanism to prevent data access in the case of the theft of physical hardware, but it is just as important to protect data assets from unauthorised access within the organisation.

Some of the biggest data breaches in the past have been due to employees having full access to all systems and leaking sensitive information without any physical theft of hardware. Data masking is an important consideration in preventing this type of unauthorised access.

An example is production systems that are replicated to multiple test environments. Often the data on production has some level of protection, but as soon as it is “cloned” to the test system, this protection is dropped and unauthorised users are able to access all sensitive information.
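
A minimal sketch of masking data on its way into a test environment might look like the following. The hashing approach and column names are assumptions for illustration, chosen so that masked values stay consistent across cloned tables without exposing the originals.

```python
import hashlib
import pandas as pd

def mask(value: str) -> str:
    """Deterministic pseudonym: the same input always yields the same
    token, so joins across cloned tables keep working."""
    return hashlib.sha256(value.encode()).hexdigest()[:12]

def clone_for_test(prod: pd.DataFrame, sensitive: list) -> pd.DataFrame:
    """Copy a production table, masking sensitive columns en route."""
    test = prod.copy()
    for col in sensitive:
        test[col] = test[col].astype(str).map(mask)
    return test

# Hypothetical usage when refreshing a test system from production.
prod = pd.DataFrame({
    "account": ["62123456789", "62987654321"],
    "name": ["Jane Dlamini", "Pieter Botha"],
    "balance": [15200.50, 820.00],   # non-sensitive, left intact
})
print(clone_for_test(prod, sensitive=["account", "name"]))
```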

Ongoing education remains key. Enforcing security policies doesn't only mean applying technology to monitor employees' usage of the company's data assets; it also implies a culture shift in the processes of the business. This is often the biggest stumbling block, and ongoing staff education is needed to help staff understand the importance of data security, identify the various risks and possible attack modes, and understand their roles in securing sensitive data. It is not enough to post notices and have policies in place – ongoing awareness programmes must teach staff about phishing, scamming and the mechanisms hackers use to gain access.

In South Africa, financial services appears to be the leader in terms of data security best practice, mainly due to legislation, international guidelines and the sensitivity of the data the sector works with. However, many other sectors hold highly sensitive data too.  All businesses need to learn from international breach trends and move to assess their data security risk and improve their security strategies.

It’s not the data management – it’s you

When poor quality data, duplicated effort and siloed information impact operational efficiencies, organisations might feel inclined to point a finger at data management. But it's not data management that's broken, it's enterprise strategy.

By Mervyn Mooi, director at Knowledge Integration Dynamics (KID)

Recently, international thought leaders speculated that data management might be ‘broken’ due to the growth in silos of data and a lack of data standardisation. They pointed out that data silos were still in place, much as they were back in the 1980s, and that the dream of standardised, centralised data providing a single view was as elusive as ever.


In South Africa, we also see frustrated enterprise staff increasingly struggling to gain control of growing volumes of siloed, duplicated and non-standardised data. This is despite the fact that most organisations believe they have data management policies and solutions in place.

The truth of the matter is – data management is not what’s at fault. The problem lies in enterprise-wide data management strategies, or the lack thereof.

Data management per se is never really broken. Data management refers to a set of rules, policies, standards and governance applied to data throughout its lifecycle. While most organisations have these in place, they do not always have uniform data management standards throughout the organisation. Various operating units may have their own legacy models which they believe best meet their needs. In mergers and acquisitions, new companies may come aboard, each bringing their own tried and trusted data management policies. Each operating unit may be under pressure to deliver business results in a hurry, so they continue doing things in whatever way has always worked for them.

The end result is that there is no standardised model for data management across the enterprise. Efforts are duplicated, productivity suffers and opportunities are lost.

In many cases where questions are raised about the effectiveness of data management, one will find that it is not being applied at all. Many companies are not yet mature in terms of data management and will continue to experience issues, anomalies and politics in the absence of enterprise-wide data management. But this will start to change.

In businesses across the world, but particularly in Africa, narrower profit margins and weaker currencies are forcing management to look at back end processes for improved efficiencies and cost cutting. Implementing more effective data management strategies is an excellent place for them to start.

Locally, some companies are now striving to develop enterprise-wide strategies to improve data quality and bring about more effective data management. Large enterprises are hiring teams and setting up competency centres to clean data at enterprise level and move towards effective master data management: a single view of the customer, used in a common way across the various divisions.
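
As a simple illustration of what a single view of the customer involves at the data level, the sketch below consolidates duplicate customer records from two divisions by normalising the matching key. It is a toy example; real master data management applies far more sophisticated matching and survivorship rules.

```python
import pandas as pd

# Hypothetical customer records held separately by two divisions.
retail = pd.DataFrame({"name": ["Jane Dlamini"],
                       "email": ["JANE@EXAMPLE.COM "]})
insurance = pd.DataFrame({"name": ["J. Dlamini"],
                          "email": ["jane@example.com"]})

combined = pd.concat([retail, insurance], ignore_index=True)

# Normalise the matching key so trivially different records line up.
combined["match_key"] = combined["email"].str.strip().str.lower()

# Survivorship (toy rule): keep the longest name seen for each key.
golden = (combined.sort_values("name", key=lambda s: s.str.len())
                  .groupby("match_key", as_index=False)
                  .last())
print(golden[["match_key", "name"]])
```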

Enterprise-wide data management standards are not difficult to implement technology-wise. The difficult part is addressing the company politics that stand in the way and driving the change management needed to overcome people's resistance to new ways of doing things. You may even find resistance to improved data management efficiencies simply because manual processes and inefficient data management keep more people in jobs – at least for the time being.

But there is no question that enterprise-wide standards for data management must be introduced to overcome silos of data, silos of competency, duplication of effort and sub-par efficiency. Large local enterprises, particularly banks and other financial services companies, are starting to follow the lead of international enterprises in addressing this area of operational inefficiency. Typically, they find that the most effective way to overcome the data silo challenge is to slowly adapt their existing ways of working, aligning with new standards in a piecemeal fashion that adheres to the grand vision.

The success of enterprise-wide data management strategies also rests a great deal on management: a strong mandate from enterprise-level executives is needed to secure the buy-in and compliance required to achieve efficiencies and enable common practices. In the past, C-suite executives were not particularly strong in driving data management and standards – they were typically focused on business results, and nobody looked at operating costs as long as the service was delivered. Now, however, business is focusing more on operating and capital costs, and discovering that data management efficiencies translate into better revenues.

With enterprise-wide standards for data management in place, the later consumption and application of that data is highly dependent on users' requirements, intent and discipline in maintaining the data standards. Data items can be redefined, renamed or segmented in line with divisional needs and processes. But as long as the data is not manipulated out of context or in an unprotected manner, and as long as governance is not overlooked, overall data quality and standards will not suffer.

How to tell if your organisation is strategically data driven

Striving to become a ‘data driven organisation’ is not enough, says Knowledge Integration Dynamics (KID).

By Mervyn Mooi, director at Knowledge Integration Dynamics (KID)

There is a great deal of focus on the ‘data driven organisation’ now. But this focus misses the point – everyone is data driven to some degree. The question should be: are you strategically data driven?

Everyone – from the man in the street to the large enterprise – is driven by data. This data might emerge from weather reports, calendars, meeting schedules and commitments. A plethora of data drives decisions and processes all the time. But this does not mean data is being used effectively. In fact, in this scenario, the data drives us. Only when data is used strategically can we turn the situation around so that we drive the data, using it as a powerful tool to improve business.


While there is always room for improvement and innovation in the gathering, management and application of data, many companies are already strategically data driven. These companies are relatively easy to identify, based on a number of traits they have in common:

  • Innovation and market disruption. Innovation can happen as a once-off ‘accident’, but a sustainable business that consistently innovates and disrupts is certainly basing its success on the strategic use of data. The sustainably innovative enterprise harnesses quality internal and external data and analytics to inform business decisions, improve products and customer experience, and maintain its competitive edge.
  • A culture of rationalisation. When a company is strategically data driven, it has achieved a clear understanding of where its resources can be put to the best use, where bottlenecks and duplication occur, and how best to improve efficiencies. A company with a culture of rationalisation, a focus on deduplication and a tendency to automate and reduce manual interventions clearly has good insight into its internal data.
  • A ‘Governance over all’ approach to business and operations. Only an organisation with quality data delivering effective insights into all spheres of the business is in a position to apply effective rules and governance over all systems, operations and processes.
  • Decisions are based on interrogating the right data with the right questions, using the right models.  A strategically data driven organisation does not tolerate poor quality data or interrogate this data in a haphazard fashion. The widespread use of quality data and analytics is evident in every aspect of the business, and is the basis of every decision within the organisation. The strategically data driven organisation also routinely tests new theories, asks the ‘what if’ questions, and constantly monitors and evaluates outcomes to add to the quality of its data and analytics.
  • ‘More than fair’ labour practices. Organisations with a good grasp of their data know what impact employee skills development and job satisfaction have on business processes and revenues. Strategically data driven organisations tend to leverage their skills investments with better working conditions, incentives, salaries, training and perks.
  • Strong leadership at all levels. Strong leadership is the base enabler for the evolution of all the other traits; and strong leaders are supported and measured by data. Data is the lifeblood of the organisation, supporting good leadership by allowing managers to improve efficiencies, ensure effective resource allocation, monitor and improve employee performance and measure their own performance as managers.

 

Any organisation not displaying these traits needs to be asking: “Are we taking an organised approach to data usage and information consumption in order to make our business more effective? Are we using our data to effectively look both inward and outward; finding areas for improvement within our operations and scope for innovation and business growth in our market?”

Big data vs accuracy: don’t believe everything you read

Analysis of unstructured big data has the potential to complement and enhance structured data analysis, delivering a range of interesting new insights that support decision-making within the organization. But companies should not believe everything that may be derived from the data, warns KID.

By Mervyn Mooi, director at Knowledge Integration Dynamics (KID)

Organizations around the world are buying into the hope that big data analytics, which includes combining structured and unstructured data, will deliver an all-knowing ‘crystal ball’ to drive competitiveness. Last year, Gartner reported that big data analytics services alone were a $40 billion market and growing fast.


A major part of the big data appeal is the promise of combining accurate and structured internal data with fast-changing unstructured external data, offering a complete picture of the market environment and the organization’s own position within it.

However, while unstructured external data can add useful new inputs to information gathering and decision-making processes, it cannot be considered 100% accurate. In some cases it will not be even close to accurate, and it cannot be counted on as a basis for making crucial business decisions.

What proportion of unstructured external data is brought into the big data mix, and how much credence is given to it, depends on the questions to be addressed, the organization’s willingness to accept discrepancies in the data when answering a particular question, and the importance of the decisions to be made based on the big data analysis. Searching for useful insights in unstructured external big data may also require a few passes before acceptable data is identified.

For example, a new car dealership looking for prospective customers might rely entirely on external data to build a leads list. They might use a search engine to identify companies in the area of the dealership; then narrow down the list to companies likely to need cars and likely to have the budget for new cars. The resulting leads list is a good start, but may still require verification calls to determine whether the prospective customers are still in business, still based in the area and likely to be interested.

A bank investigating new branches and new markets might combine its own structured customer data with unstructured external data such as a map, to plot a visual representation of where existing customers are, and where there are gaps with potential for marketing to new customers. This insight may require further clarification and does not guarantee new customers in the blank spots on the map, but it does give the bank useful information to work with.

When an organization is seeking insights for business-critical decisions, the ratio of qualified structured data to unstructured external data should be around 90:10, with unstructured external data serving to complement the analysis, not form the basis of it. This is because structured, high-value data is traditionally bound by compliance and quality controls (for example, it lives in ACID-compliant databases) and can be trusted.
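
As a purely illustrative reading of that 90:10 heuristic, a blended confidence score for a decision input could weight the two sources accordingly. The numbers below are invented to show the arithmetic, not drawn from any real methodology.

```python
# Hypothetical blended confidence for one decision input.
structured_confidence = 0.95    # qualified, governed internal data
unstructured_confidence = 0.60  # scraped social/web signals

blended = 0.9 * structured_confidence + 0.1 * unstructured_confidence
print(f"Blended confidence: {blended:.2f}")
# ~0.92: still dominated by the trusted structured source,
# as the 90:10 heuristic intends.
```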

When using big data analytics, organizations should also note that deriving business value from big data is not an exact science, and there are no guarantees. For instance, a company using its own data in combination with unstructured data to assess interest in its products might count visits to its website as an indicator of its popularity.

While the visitor figures might be accurate, the assumptions made based on the figures could be completely wrong, since visitors to the site could have stumbled across it by accident or have been using it for comparison shopping and have no interest in buying the products.

Big data analytics is helpful for traversing high volumes of unstructured data and supplementing the company's existing, qualified data. But, depending on the answers needed, big data will need to achieve greater degrees of accuracy and reliability before business-critical decisions can be made based on its analysis.