Data (Information) Governance: a safeguard in the digital economy

Global interest in Data Governance is growing, as organisations around the world embark on Digital Transformation and Big Data management to become more efficient and competitive. But while data is being used in myriad new ways, the rules for effective governance must prevail.

By Mervyn Mooi, director at Knowledge Integration Dynamics (KID)

The sheer volume and variety of data coming into play in the increasingly digital enterprise presents massive opportunities for organisations to analyse this data/information and apply the insights derived therefrom to achieve business growth and realise efficiencies. Digital transformation has made data management central to business operations and created a plethora of new data sources and challenges. New technology is enabling data management and analysis to be more widely applied; supporting organisations that are increasingly viewing data as a strategic business asset that could be utilised for gaining a competitive advantage.

To stay ahead, organisations have to be agile and quick in this regard, which has prompted some industry experts to take the view that data governance needs a new approach; with data discovery carried out first, before data governance rules are decided on and applied in an agile, scalable and iterative way.

While it makes sense to approach data management, analysis and the associated data governance iteratively, using smaller packets of data, the rules applied must still comply with legislation and best practice; as a prerequisite, these rules should be formalised before any data project or data discovery is undertaken. Governance rules must be consistent and support the organisation's overall governance framework throughout the lifecycle of each data asset, regardless of where and when the data is generated, processed, consumed and retired.

In an increasingly connected world, data is shared and analysed across multiple platforms all the time – by both organisations and individuals. Most of that data is being governed in some way, and where it is not, there is risk. Governed data is secure, applied correctly and reliable in quality – and, crucially, it helps mitigate both legal and operational risk. Poor quality data alone is a significant concern among global CEOs: a recent Forbes Insights and KPMG study found that 45% of CEOs say their customer insight is hindered by a lack of quality data, and 56% have concerns about the quality of the data on which they base their strategic decisions. Gartner, meanwhile, reports that the average financial impact of poor quality data could amount to around $9.7 million annually. On top of this, the potential costs of unsecured data or non-compliance can be significant: fines, lawsuits, reputational damage and the loss of potential business from highly regulated partners and customers are among the risks faced by any organisation that fails to implement effective data governance frameworks, policies and processes.

Ungoverned data results in poor business decisions and exposes the organisation and its customers to risk. Internationally, data governance is taking top priority as organisations prepare for new legislation such as the EU's General Data Protection Regulation (GDPR), set to come into effect next year, and as bodies such as Data Governance Australia launch a draft Code of Practice setting benchmarks for the responsible collection, use, management and disclosure of data. South Africa, perhaps surprisingly, is at the forefront here with its POPI regulations and wide implementation of other guidelines such as King III and Basel. New Chief Data Officer (CDO) roles are being introduced around the world.

Now more than ever, every organisation has to have up-to-date data governance frameworks in place and, more importantly, have the rules articulated or mapped into its processes and data assets. It must look from the bottom up, to ensure that the rules on the floor align with the compliance rules and regulations from the top. These rules and conditions must be formally mapped to the actual physical rules and technical conditions in place throughout the organisation. By doing this, the organisation can demonstrate that its data governance framework is real and articulated into its operations – across physical business and technical processes, methodologies, access controls and data domains, ICT included. This mapping process should ideally begin with an upfront data governance maturity assessment. Alongside this, the organisation should deploy dedicated data governance resources for sustained stewardship.
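As a hypothetical sketch of what such a mapping might look like in practice (the rule identifiers and control names below are invented for illustration), recording each policy rule against the technical controls that enforce it makes unenforced rules easy to surface:

```python
# Hypothetical mapping of governance rules (from policy) to the
# technical controls that enforce them "on the floor".
POLICY_RULES = {
    "POPI-8.1": "Personal data must be masked in non-production systems",
    "POPI-19":  "Personal data must be encrypted in transit",
    "KING-5.3": "Access to financial data must be role-restricted",
}

# Controls actually configured in the environment, keyed by the rule they enforce.
IMPLEMENTED_CONTROLS = {
    "POPI-8.1": ["test-db masking job"],
    "POPI-19":  ["TLS on all external endpoints"],
}

def governance_gaps(rules, controls):
    """Return the rule IDs that have no technical control mapped to them."""
    return sorted(r for r in rules if not controls.get(r))

print(governance_gaps(POLICY_RULES, IMPLEMENTED_CONTROLS))
```

A report like this gives auditors concrete evidence of where the framework is articulated into operations, and where it is still only on paper.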

Mapping the rules and conditions, and duly configuring the relevant toolsets to enforce data governance, can be a complex and lengthy process. But these steps are necessary to entrench data governance throughout the organisation. Formalised data governance mapping shows the world where and how the organisation has implemented data governance, demonstrating that policies are entrenched throughout its processes, supporting audit and reducing both compliance and operational risk.

To support agility and speed of delivery for data management and analysis initiatives, data governance can be "sliced" specifically for the work at hand and applied in an iterative fashion, organically covering all data assets over time.

 

 

Risks en route to cloud

By Veemal Kalanjee, Managing Director at Infoflow – part of the KID group

Security in the cloud worries many companies, but security and risk management during migration should be of greater concern.

Security and control of data are commonly cited as being among the top concerns of South African CIOs and IT managers. There is a prevailing fear that business-critical applications and information hosted anywhere but on-premises are at greater risk of being lost or accessed by cyber criminals.

In fact, data hosted by a reputable cloud service provider is probably far safer than data hosted on-premises and secured by little more than a firewall.

What many businesses overlook, however, is the possibility that the real business risks and data security issues could occur before the data has actually moved to the cloud, or during the migration to the cloud.

When planning a move to the cloud, rushing the process poses risks of its own. Poor selection of the cloud service provider, failure to ensure data quality and security, and overlooked integration issues can endanger both data security and business continuity.

Large local companies have failed to achieve ambitious plans to rapidly move all their infrastructure and applications to the cloud due to an ‘eat the elephant whole’ approach, which can prove counter-productive and risky. To support progress to the cloud while mitigating risk, cloud migrations should be approached in small chunks instead, as this allows for sufficient evaluation and troubleshooting throughout the process.

Look before leaping

Before taking the plunge, companies must carefully evaluate their proposed cloud service and environment, and strategically assess what data and applications will be moved.

Businesses must consider questions around what cloud they are moving to, and where it is hosted. For example, if the data will be hosted in the US, issues such as bandwidth and line speed come into play: companies must consider the business continuity risks of poor connections and distant service providers.

They must also carefully assess the service provider’s continuity and disaster recovery plans, the levels of security and assurances they offer, and what recourse the customer will have in the event of data being lost or compromised or the service provider going out of business. Moving to the cloud demands a broader understanding of security technologies and risk among all project team members than was needed previously, in non-cloud environments.

In addition, when considering a move to the public cloud, one aspect that cannot be avoided is that what was once an environment for the company's exclusive use becomes a multi-tenant shared environment, which potentially brings its own security risks.

It is up to the company to perform comprehensive due diligence on the cloud vendor to ensure that this multitude of security risks is adequately addressed by the preventative security measures the vendor has put in place.

Data on the move

Once a suitable cloud vendor has been identified, the data to be migrated must be assessed, its quality must be assured, and the data must be effectively secured.

The recommended first step is to identify the data to be migrated, considering, for example:
* Are there inactive customers on this database?
* Should the company retain that data, archiving it on-premises, and move only active customers to the cloud?
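A minimal sketch of that identification step, assuming each record carries a last-activity date and using an illustrative two-year inactivity cut-off:

```python
from datetime import date, timedelta

# Hypothetical customer records; "last_activity" drives the split.
customers = [
    {"id": 1, "name": "Acme",  "last_activity": date(2017, 3, 1)},
    {"id": 2, "name": "Bongi", "last_activity": date(2014, 6, 12)},
    {"id": 3, "name": "Cele",  "last_activity": date(2017, 1, 20)},
]

def split_for_migration(records, today, inactive_after_days=730):
    """Active records are migrated; the rest are archived on-premises."""
    cutoff = today - timedelta(days=inactive_after_days)
    migrate = [r for r in records if r["last_activity"] >= cutoff]
    archive = [r for r in records if r["last_activity"] < cutoff]
    return migrate, archive

migrate, archive = split_for_migration(customers, today=date(2017, 6, 1))
```

The 730-day threshold is purely illustrative; the actual retention rule should come from the company's data governance and archiving policies.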

Once the data to be migrated has been identified, the company must review the quality of this data, identifying and addressing anomalies and duplicates before moving to the next phase of the cloud migration. Since poor quality data can undermine business success, the process of improving data quality ahead of a cloud migration can actually improve business operations, and so help mitigate overall business risk.
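The duplicate-identification part of that quality review might be sketched as follows, using exact matching on a normalised key (real-world matching usually also needs fuzzy rules for typos and name variants):

```python
def deduplicate(records, key_fields=("email",)):
    """Keep the first occurrence of each key; later duplicates are flagged.
    Matching here is exact on a normalised key only."""
    seen, clean, dupes = set(), [], []
    for rec in records:
        key = tuple(str(rec[f]).strip().lower() for f in key_fields)
        if key in seen:
            dupes.append(rec)
        else:
            seen.add(key)
            clean.append(rec)
    return clean, dupes

rows = [
    {"email": "jane@example.com", "name": "Jane"},
    {"email": "JANE@example.com ", "name": "Jane D."},  # duplicate after normalising
    {"email": "sipho@example.com", "name": "Sipho"},
]
clean, dupes = deduplicate(rows)
```

Flagged duplicates would then be reviewed and merged before the migration proceeds, rather than silently discarded.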

Moving data from the company’s internal network to an external network can present a number of risks.

Adequate levels of data encryption and/or masking must be applied and a secure transport layer implemented to ensure the data remains secure, wherever it is.
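As a sketch of the masking side, assuming a salted one-way hash is acceptable for the use case (the salt value and field names below are placeholders, and transport security such as TLS is a separate layer):

```python
import hashlib

def mask_value(value, salt="s3cret-salt"):
    """One-way, deterministic masking: the same input always maps to the
    same token, so joins across datasets still work, but the original
    value cannot be recovered from the token. The salt here is an
    illustrative placeholder -- in practice it must be generated and
    stored securely."""
    digest = hashlib.sha256((salt + value).encode("utf-8")).hexdigest()
    return "MASK-" + digest[:12]

record = {"name": "Jane Doe", "id_number": "8001015009087"}
masked = {k: mask_value(v) if k == "id_number" else v
          for k, v in record.items()}
```

Whether hashing, format-preserving encryption or reversible encryption is appropriate depends on whether the cloud side ever needs the original values back.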

In the move to the cloud, the question of access must also be considered – both for individual users and for enterprise applications. It is important to consider all points of integration to mitigate business continuity issues. In many cloud migrations, companies tend to overlook points that haven’t been documented and integrated, presenting business continuity challenges. A robust cloud integration solution simplifies this task.

The risk of business processes failing should also be considered during the migration to the cloud. Companies must allocate sufficient time for testing – running systems in parallel for a period to ensure they all function as expected.

While there are risks in moving to the cloud, when the process is approached strategically and cautiously, there are many potential benefits to the migration process. Done well, the process can result in better quality data, a more strategic approach to data management and security, and more streamlined business processes.

Five data protection approaches to take seriously in 2017

Information security remains a grudge purchase for many, but SA business needs to pay urgent attention to key lessons learnt from increasingly sophisticated breaches.

 

By Veemal Kalanjee, Managing Director at Infoflow – part of the KID group

 

In the past year, we have witnessed increasingly bold and sophisticated attacks on corporate and personal data around the world. The fact that there has been no common modus operandi in these attacks should be cause for concern among businesses everywhere, since this means attacks are unpredictable and harder to mitigate. We’ve seen significant IT organisations breached, and even security-savvy victims tricked into parting with passwords. Clearly, the standard security protocols are no longer enough and data security must be built into the very fabric of the business.

Five key lessons South African businesses need to take from data breach patterns of the past year are:

Security is a C-suite problem. IT professionals are well aware of the risks, but in many cases, the rest of the C-suite sees security as a grudge purchase. This is understandable, because the reality is that most C-level executives are focused on maximising their dwindling budgets to address business-critical initiatives, and protection against data breaches often takes a back seat.

But protection of personal information is becoming legislated and it is only a matter of time before C-suite members are held personally accountable for breaches. Business owns the data and is ultimately responsible for any breaches that occur, regardless of the measures that IT might put in place. The business itself stands to fail if a significant breach occurs.

Business, therefore, needs visibility into where the vulnerabilities for data breaches lie within the organisation, and needs to actively participate in helping IT ensure that policies are implemented and adapted to address ever-changing security threats. The C-suite cannot afford to sit back and 'see what happens' – it needs to immediately determine the risk and weigh it up against the investment, time and effort it wants to spend on mitigating that risk.

Cloud caution is warranted. For years, South African businesses were cautious about the security and sovereignty of their data in the cloud. A lack of clearly defined policies (or of any policies, for that matter) often dissuaded organisations from moving to the cloud.

Now, many have moved to cloud, but typically through a hybrid or private model, with data security top of mind. This approach means organisations cannot fully optimise the scalability and other benefits of the public cloud, but it also means that their own data security policies can be applied to protecting their data at all times.

Data classification and DLP strategies are crucial. Classification of sensitive data is an extremely important step in implementing a data loss prevention strategy. This classification becomes the point of departure for understanding where sensitive data lies, how much of it is susceptible to breach and how the organisation is tracking it in terms of protecting its sensitive data assets. Organisations may well have their data centres locked down, but if sensitive data also resides in email, test and development environments or unprotected workflow systems, it remains at risk.
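As a hypothetical starting point for such classification, sensitive fields can be detected with simple pattern scans. The patterns below are illustrative toys; production DLP tools ship far more robust detectors with validation logic:

```python
import re

# Illustrative detectors only -- real DLP classification is far richer.
PATTERNS = {
    "email":    re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "sa_id":    re.compile(r"\b\d{13}\b"),   # South African ID number shape
    "card_pan": re.compile(r"\b\d{16}\b"),   # bare 16-digit card-number shape
}

def classify(text):
    """Return the set of sensitive-data categories detected in a text field."""
    return {label for label, pat in PATTERNS.items() if pat.search(text)}

hits = classify("Contact jane@example.com, ID 8001015009087")
```

Running such a scan across email stores, test databases and workflow systems is what turns "we think our data centre is locked down" into an actual inventory of where sensitive data lies.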

Advanced solutions must be harnessed to manage the data classification process and give C-level users a holistic view into where they stand in terms of protection of data.

Security doesn’t end at encryption. While encryption is an important step in securing data, it is not a foolproof solution for all threats. Encryption is a great mechanism to prevent data access in the case of the theft of physical hardware, but it is just as important to protect data assets from unauthorised access within the organisation.

Some of the biggest data breaches in the past have been due to employees having full access to all systems and leaking sensitive information without the physical theft of hardware. Data Masking is an important consideration to prevent this type of unauthorised access.

An example is production systems that are replicated to multiple test environments. Often the data on production has some level of protection, but as soon as it is “cloned” to the test system, this protection is dropped and unauthorised users are able to access all sensitive information.

Ongoing education remains key. Enforcing security policies doesn't only mean applying technology to monitor and track employees' usage of the company's data assets; it also implies an inherent culture shift in the processes of the business. This is often the biggest stumbling block to overcome, and ongoing staff education is needed to help staff understand the importance of data security, recognise the various risks and possible attack modes, and appreciate their own roles in securing sensitive data. It is not enough to post notices and have policies in place – ongoing awareness programmes must teach staff about phishing, scamming and the mechanisms hackers use to gain access.

In South Africa, financial services appears to be the leader in terms of data security best practice, mainly due to legislation, international guidelines and the sensitivity of the data the sector works with. However, many other sectors hold highly sensitive data too.  All businesses need to learn from international breach trends and move to assess their data security risk and improve their security strategies.

It’s not the data management – it’s you

When poor quality data, duplicated effort and siloed information impact operational efficiencies, organisations might feel inclined to point a finger at data management. But it's not data management that's broken, it's enterprise strategy.

By Mervyn Mooi, director at Knowledge Integration Dynamics (KID)

Recently, international thought leaders speculated that data management might be ‘broken’ due to the growth in siloes of data and a lack of data standardisation. They pointed out that data siloes were still in place, much like they were back in the 1980s, and that the dream of standardised, centralised data providing a single view was as elusive as ever.

In South Africa, we also see frustrated enterprise staff increasingly struggling to gain control of growing volumes of siloed, duplicated and non-standardised data. This is despite the fact that most organisations believe they have data management policies and solutions in place.

The truth of the matter is – data management is not what’s at fault. The problem lies in enterprise-wide data management strategies, or the lack thereof.

Data management per se is never really broken. Data management refers to a set of rules, policies, standards and governance for data throughout its life-cycle. While most organisations have these in place, they do not always have uniform data management standards throughout the organisation. Various operating units may have their own legacy models, which they believe best meet their needs. In mergers and acquisitions, new companies come aboard, each bringing their own tried and trusted data management policies. Each operating unit may be under pressure to deliver business results in a hurry, so it continues doing things in whatever way has always worked for it.

The end result is that there is no standardised model for data management across the enterprise. Efforts are duplicated, productivity suffers and opportunities are lost.

In many cases, where questions are raised around the effectiveness of data management, one will find that it is not being applied at all. Unfortunately, many companies are not yet mature in terms of data management and will continue to experience issues, anomalies and politics in the absence of enterprise wide data management. But this will start to change in future.

In businesses across the world, but particularly in Africa, narrower profit margins and weaker currencies are forcing management to look at back end processes for improved efficiencies and cost cutting. Implementing more effective data management strategies is an excellent place for them to start.

Locally, some companies are now striving to develop enterprise-wide strategies to improve data quality and bring about more effective data management. Large enterprises are hiring teams and setting up competency centres to clean data at enterprise level and move towards effective master data management: a single view of the customer, used in a common way across various divisions.

Enterprise-wide data management standards are not difficult to implement technology-wise. The difficult part is addressing the company politics that stand in the way, and driving the change management needed to overcome people's resistance to new ways of doing things. You may even find resistance to improved data management efficiencies simply because manual processes and inefficient data management keep more people in jobs – at least for the time being.

But there is no question that enterprise-wide standards for data management must be introduced to overcome siloes of data, siloes of competency, duplication of effort and sub-par efficiency. Large local enterprises, particularly banks and other financial services companies, are starting to follow the lead of international enterprises in addressing this area of operational inefficiency. Typically, they find that the most effective way to overcome the data silo challenge is to slowly adapt their existing ways of working to align with new standards, in a piecemeal fashion that adheres to the grand vision.

The success of enterprise wide data management strategies also rests a great deal on management: you need a strong mandate from enterprise level executives to secure the buy-in and compliance needed to achieve efficiencies and enable common practices. In the past, the C-suite business executives were not particularly strong in driving data management and standards – they were typically focused on business results, and nobody looked at operating costs as long as the service was delivered. However, now business is focusing more on operating and capital costs and discovering that data management efficiencies will translate into better revenues.

With enterprise-wide standards for data management in place, the later consumption and application of that data is highly dependent on users' requirements, intent and discipline in maintaining the data standards. Data items can be redefined, renamed or segmented in line with divisional needs and processes. But as long as the data is not manipulated out of context or in an unprotected manner, and as long as governance is not overlooked, overall data quality and standards will not suffer.

How to tell if your organisation is strategically data driven

Striving to become a ‘data driven organisation’ is not enough, says Knowledge Integration Dynamics (KID).

By Mervyn Mooi, director at Knowledge Integration Dynamics (KID)

There is a great deal of focus on the ‘data driven organisation’ now. But this focus misses the point – everyone is data driven to some degree. The question should be: are you strategically data driven?

Everyone – from the man in the street to the large enterprise – is driven by data. This data might emerge from weather reports, calendars, meeting schedules and commitments. A plethora of data drives decisions and processes all the time. But this does not mean data is being used effectively. In fact, in this scenario, the data drives us. Only when data is used strategically can we turn the situation around so that we drive the data, using it as a powerful tool to improve business.

While there is always room for improvement and innovation in the gathering, management and application of data, many companies are already strategically data driven. These companies are relatively easy to identify, based on a number of traits they have in common:

  • Innovation and market disruption. Innovation can happen as a once-off ‘accident’, but a sustainable business that consistently innovates and disrupts is certainly basing its success on the strategic use of data. The sustainably innovative enterprise harnesses quality internal and external data and analytics to inform business decisions, improve products and customer experience, and maintain its competitive edge.
  • A culture of rationalisation. When a company is strategically data driven, it has achieved a clear understanding of where its resources can be put to the best use, where bottlenecks and duplication occurs and how best to improve efficiencies. A company with a culture of rationalisation, a focus on deduplication and a tendency to automate and reduce manual interventions clearly has good insight into its internal data.
  • A ‘Governance over all’ approach to business and operations. Only an organisation with quality data delivering effective insights into all spheres of the business is in a position to apply effective rules and governance over all systems, operations and processes.
  • Decisions are based on interrogating the right data with the right questions, using the right models.  A strategically data driven organisation does not tolerate poor quality data or interrogate this data in a haphazard fashion. The widespread use of quality data and analytics is evident in every aspect of the business, and is the basis of every decision within the organisation. The strategically data driven organisation also routinely tests new theories, asks the ‘what if’ questions, and constantly monitors and evaluates outcomes to add to the quality of its data and analytics.
  • ‘More than fair’ labour practices. Organisations with a good grasp of their data know what impact employee skills development and job satisfaction have on business processes and revenues. Strategically data driven organisations tend to leverage their skills investments with better working conditions, incentives, salaries, training and perks.
  • Strong leadership at all levels. Strong leadership is the base enabler for the evolution of all the other traits; and strong leaders are supported and measured by data. Data is the lifeblood of the organisation, supporting good leadership by allowing managers to improve efficiencies, ensure effective resource allocation, monitor and improve employee performance and measure their own performance as managers.

 

Any organisation not displaying these traits needs to be asking: “Are we taking an organised approach to data usage and information consumption in order to make our business more effective? Are we using our data to effectively look both inward and outward; finding areas for improvement within our operations and scope for innovation and business growth in our market?”

Companies still fail to protect data

Despite having comprehensive information security and data protection policies in place, most South African businesses are still wide open to data theft and misuse, says KID.

By Mervyn Mooi, Director at the Knowledge Integration Dynamics Group

Numerous pieces of legislation, including the Protection of Personal Information (POPI) Act, and governance guidelines like King III, are very clear about how and why company information, and the information companies hold on partners and customers, should be protected. The penalties and risks involved in not protecting data are well known too. Why then, is data held within South African companies still inadequately protected?

In our experience, South African organisations have around 80% of the necessary policies and procedures in place to protect data. But the physical implementation of those policies and procedures is only at around 30%. Local organisations are not alone – a recent IDC study has found that two-thirds of enterprises internationally are failing to meet best practice standards for data control.

The risks of data loss or misuse are present at every stage of data management – from gathering and transmission through to destruction of data. Governance and control are needed at every stage. A company might have its enterprise information systems secured, but if physical copies of data – like printed documents or memory sticks – are left lying around an office, or redundant PCs are sent for recycling without effective reformatting of the hard drives, sensitive data is still at risk. Many overlook the fact that confidential information can easily be stolen in physical form.

Many companies fail to manage information sharing by employees, partners and other businesses. For example, employees may unwittingly share sensitive data on social media: what may seem like a simple tweet about drafting merger documents with the other party might violate governance codes. Information shared with competitors in exploratory merger talks might be misused by the same competitors later.

We find that even larger enterprises with policies in place around moving data to memory sticks and mobile devices don't clearly define what confidential information is, so employees tweet, post or otherwise share information without realising they are compromising the company's data protection policies. For example, an insurance firm might call a client and ask for the names of acquaintances who might also be interested in its product, but under the POPI Act, this is illegal. There are myriad ways in which sensitive information can be accessed and misused, with potentially devastating outcomes for the company that allows it to happen. In a significant breach, someone may lose their job, or there may be penalties or a court case as a result.

Most organisations are aware of the risks and may have invested heavily in drafting policies and procedures to mitigate them. But the best-laid governance policies cannot succeed without effective implementation. Physical implementation begins with analysing data risk: discovering, identifying and classifying the data, and assessing its risk based on value, location, protection and proliferation. Once the type and level of risk have been identified, data stewards need to take tactical and strategic steps to ensure data is safe.

These steps within the data lifecycle need to include:

  • Standards-based data definition and creation, to ensure that security and privacy rules are implemented from the outset.
  • Strict provisioning of data security measures such as data masking, encryption/decryption and privacy controls to prevent unauthorised access to and disclosure of sensitive, private, and confidential information.
  • The organisation also needs to securely provision test and development data by automating data masking, data sub-setting and test data-generation capabilities.
  • Attention must also be given to data privacy and accountability by defining access based on privacy policies and laws – for instance, who may view personal, financial, health or confidential data, and when.
  • Finally, archiving must be addressed: the organisation must ensure that it securely retires legacy applications, manages data growth, improves application performance, and maintains compliance with structured archiving.
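One of the steps above – defining access based on privacy policies – can be sketched as a simple policy lookup. The roles and data categories below are hypothetical stand-ins for whatever the organisation's policy defines:

```python
# Hypothetical privacy-policy table: which roles may view which data categories.
ACCESS_POLICY = {
    "financial": {"finance_analyst", "auditor"},
    "health":    {"medical_officer"},
    "personal":  {"hr_admin", "auditor"},
}

def may_view(role, category):
    """Deny by default: access is granted only if the policy explicitly
    lists the role against the data category."""
    return role in ACCESS_POLICY.get(category, set())
```

The deny-by-default design choice matters: an unknown category or role yields no access, rather than accidental disclosure.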

 

Policies and awareness alone are not enough to address the vulnerabilities in data protection. The necessary guidelines, tools and education exist, but to succeed, governance has to move off paper and into action. The impact of employee education is temporary – it must be refreshed regularly, and it must be enforced with systems and processes that entrench security within the database, at file level, server level, network level and in the cloud. This can be a huge task, but it is a necessary one when architecting for the future.

In the context of the above, a big question to ponder is: has your organisation mapped the rules, conditions, controls and standards (RCCSs), as translated from accords, legislation, regulation and policies, to its actual business and technical processes and data domains?

 

Big data follows the BI evolution curve

Big Data analysis in South Africa is early in its maturity levels, and has yet to evolve in much the same way as BI did 20 years ago, says Knowledge Integration Dynamics.

By Mervyn Mooi, director at Knowledge Integration Dynamics (KID)

Big data analysis tools aren’t ‘magical insight machines’ spitting out answers to all business’s questions: as is the case with all business intelligence tools, there are lengthy and complex processes that must take place behind the scenes before actionable and relevant insights can be drawn from the vast and growing pool of structured and unstructured data in the world.

South African companies of all sizes have an appetite for big data analysis, but because the country’s big data analysis segment is relatively immature, they are still focused on their big data strategies and the complexity of actually getting the relevant data out of this massive pool of information. We find many enterprises currently looking at technologies and tools like Hadoop to help them collate and manage big data. There are still misconceptions around the tools and methodologies for effective big data analysis: companies are sometimes surprised to discover they are expecting too much, and that a great deal of ‘pre-work’, strategic planning and resourcing is necessary.

Much like the early days of BI, big data analysis starts as a relatively unstructured, ad hoc discovery process; but once patterns are established and models are developed, the process becomes a structured one.

And in the same way that BI tools depend on data quality and relationship linking, big data must be qualified in some form prior to being used. The data needs to be profiled for flaws, which must be cleansed (quality); it must be put into relevancy (relationships); and it must be timeous in the context of what is being searched or reported on. Methods must be devised to qualify much of the unstructured data, as a big question remains around how trusted and accurate information from the internet will be.
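A minimal profiling pass of the kind described – checking how complete and how varied a field is before the data is trusted – might look like the sketch below. The field values are illustrative, and real profiling would also check formats, ranges and referential links:

```python
def profile_field(values):
    """Minimal data-profiling sketch: completeness and distinctness of a field."""
    total = len(values)
    missing = sum(1 for v in values if v in (None, "", "N/A"))
    distinct = len({v for v in values if v not in (None, "", "N/A")})
    return {
        "total": total,
        "missing": missing,
        "completeness": round((total - missing) / total, 2) if total else 0.0,
        "distinct": distinct,
    }

# Profile a hypothetical country-code column.
stats = profile_field(["ZA", "ZA", None, "KE", "", "ZA"])
```

Profiles like this, run over each candidate field, tell the team which data needs cleansing before any analysis model is built on top of it.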

The reporting and application model that uses this structured and unstructured data must be addressed, and the models must be tried and tested. In the world of sentiment analysis and trend forecasting based on ever-changing unstructured data, automated models are not always the answer. Effective big data analysis also demands human intervention from highly skilled data scientists who have both business and technical experience. These skills are still scarce in South Africa, but we are finding a growing number of large enterprises retaining small teams of skilled data scientists to develop models and analyse reports.

As local big data analysis matures, we will find enterprises looking to strategise on their approaches, the questions they want to answer, what software and hardware to leverage and how to integrate new toolsets with their existing infrastructure. Some will even opt to leverage their existing BI toolsets to address their big data analysis needs.  BI and big data are already converging, and we can expect to see more of this taking place in years to come.