Big data: over-hyped and under-utilised


By Mervyn Mooi, director at Knowledge Integration Dynamics (KID)

 

The spectre of big data analytics is driving businesses everywhere to re-evaluate their strategies and consider massive investments to monetise their data. But many are missing the point – big data is available to virtually everyone without significant investment, and is being under-utilised within the enterprise right now.

 

Too many enterprises hold the mistaken belief that to get value from big data, they must invest heavily in infrastructure and software solutions that will allow them to gather practically all the internal and external, structured and unstructured data that exists, store it in massive data reservoirs and then embark on lengthy data analytics processes to arrive at insights.

 

This belief holds them back from fully capitalising on the big data they already have access to. Budget constraints and perceived complexity are limiting their use of data beyond the walls of their own enterprises. This need not be the case.

Big data has been hyped to a point where it has become daunting to many, yet in reality it is just the next level of the BI, fact-finding and business logic that have existed for years. Big data practice simply delivers quicker value to end-users through enablement factors such as the internet, the cloud and the availability of feature-rich tools.


Big data at its most basic

 

Many of these tools are affordable and scalable to a single user anywhere on the planet. For example, a consumer with a concern about his health might use his smartphone to go online and research high cholesterol symptoms and treatment. He uses a search engine to distil the massive volumes of big data that exist on the subject, he assesses the information, and makes an informed decision to consult a doctor based on that information. This is big data analytics methodology and tooling in use in its simplest form.

 

On a larger scale, a car dealer might assess his sales figures and expand his insight by following social media opinions about the car models he sells, studying industry forecasts and trends, and reading columns about buyer priorities. By bringing additional, external inputs into his data, he positions himself to offer better deals or models more likely to sell.

In these cases, the value of the data analysis comes from distilling only the relevant data from multiple sources to support decision-making.

 

Big data as broader BI

 

In large enterprises, large amounts of data already exist – often in siloes within the BI, CRM, customer service centre and sales divisions. This data, supplemented with external data from quality research, social media sentiment analysis, surveys and other sources, becomes big data that can be harnessed to deliver more advanced insights for a competitive edge. Big data is not as big as it sounds, and organisations do not need to invest millions to start benefiting from it. They just need to start looking outside the organisation and bringing in information that is relevant to the business case they want to address.

For many, this will be the extent of their big data analytics needs, and it is achievable with the technologies, skills and data they already have access to. Big data practice accommodates less skilled analysts and is not pitched only at experienced BI practitioners or data scientists. Nor should it be the task of IT.

 

In fact, big data practice should be the preserve of business managers, who are best placed to determine what questions should be asked, what external factors impact on the business, what information will be relevant, and what steps should be taken once insights are obtained from data analysis. Business managers, who are the data stewards and subject matter experts, will require certain technology tools to analyse the data, but these BI tools are typically user-friendly and little training is needed to master them.

 

Big data moves for big business

 

For major enterprises that see potential long-term business value in a big data investment, a simple way to assess that value is to outsource big data analysis before taking the plunge. This will allow the enterprise to determine whether the investment will deliver on its promise.

 

Whether outsourced or implemented internally, enterprises must determine at the outset what their objectives for big data projects are, to ensure that they deliver on expectations. Big data practice is agile and can be applied to any data to deliver any insight. It is not enough for enterprises to vaguely seek to ‘monetise’ data.

 

This term, which is merely a new spin on ‘data franchising’, remains meaningless without clear business objectives for the big data analysis exercise. To be effective, data analytics must be applied in a strategic way to achieve specific business outcomes.

 


Five data protection approaches to take seriously in 2017

Information security remains a grudge purchase for many, but SA business needs to pay urgent attention to key lessons learnt from increasingly sophisticated breaches.

 

By Veemal Kalanjee, Managing Director at Infoflow – part of the KID group

 

In the past year, we have witnessed increasingly bold and sophisticated attacks on corporate and personal data around the world. The fact that there has been no common modus operandi in these attacks should be cause for concern among businesses everywhere, since this means attacks are unpredictable and harder to mitigate. We’ve seen significant IT organisations breached, and even security-savvy victims tricked into parting with passwords. Clearly, the standard security protocols are no longer enough and data security must be built into the very fabric of the business.

Five key lessons South African businesses need to take from data breach patterns of the past year are:

Security is a C-suite problem. IT professionals are well aware of the risks, but in many cases, the rest of the C-suite sees security as a grudge purchase. This is understandable, because the reality is that most C-level executives are focused on maximising their dwindling budgets to address business-critical initiatives, and protection against data breaches often takes a back seat.

But protection of personal information is becoming legislated and it is only a matter of time before C-suite members are held personally accountable for breaches. Business owns the data and is ultimately responsible for any breaches that occur, regardless of the measures that IT might put in place. The business itself stands to fail if a significant breach occurs.


Business, therefore, needs visibility into where the vulnerabilities to data breaches lie within the organisation, and needs to actively participate in assisting IT to ensure that policies are implemented and adapted to address ever-changing security threats. The C-suite cannot afford to sit back and ‘see what happens’ – it needs to immediately determine the risk and weigh it up against the investment, time and effort it is prepared to spend on mitigating that risk.

Cloud caution is warranted. For years, South African businesses were cautious about the security and sovereignty of their data in the cloud. A lack of clearly defined policies (or any policies, for that matter) often dissuaded organisations from moving to the cloud.

Now, many have moved to the cloud, but typically through a hybrid or private model, with data security top of mind. This approach means organisations cannot fully exploit the scalability and other benefits of the public cloud, but it also means that their own data security policies can be applied to protect their data at all times.

Data classification and DLP strategies are crucial. Classification of sensitive data is an extremely important step in implementing a data loss prevention (DLP) strategy. This classification becomes the point of departure for understanding where sensitive data lies, how much of it is susceptible to breach, and how well the organisation is protecting its sensitive data assets. Organisations may well have their data centres locked down, but if sensitive data also resides in email, test and development environments or unprotected workflow systems, it remains at risk.

Advanced solutions must be harnessed to manage the data classification process and give C-level users a holistic view of where they stand in terms of data protection.
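As a concrete illustration of where classification can start, here is a minimal, purely illustrative Python sketch (not tied to any particular DLP product); the detection patterns, labels and sample records are assumptions chosen for the example:

```python
# Minimal, illustrative sketch of rule-based data classification for DLP.
# Not a production tool: patterns and labels are assumptions for illustration only.
import re

# Hypothetical detection rules: sensitivity label -> regular expression
PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "sa_id_number": re.compile(r"\b\d{13}\b"),             # 13-digit SA ID shape
    "card_number": re.compile(r"\b(?:\d[ -]?){15,16}\b"),  # crude card-number shape
}

def classify(text: str) -> set[str]:
    """Return the set of sensitivity labels detected in a piece of text."""
    return {label for label, pattern in PATTERNS.items() if pattern.search(text)}

if __name__ == "__main__":
    records = [
        "Customer query logged by jane.doe@example.com",
        "Applicant ID 8001015009087, please verify before onboarding",
        "Meeting moved to 14:00 in boardroom B",
    ]
    for record in records:
        labels = classify(record)
        status = ", ".join(sorted(labels)) if labels else "no sensitive data detected"
        print(f"{record!r} -> {status}")
```

Commercial classification tools add far richer detection (dictionaries, context, machine learning) and feed their findings into the dashboards that give executives that holistic view.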

Security doesn’t end at encryption. While encryption is an important step in securing data, it is not a foolproof solution for all threats. Encryption is a great mechanism to prevent data access in the case of the theft of physical hardware, but it is just as important to protect data assets from unauthorised access within the organisation.

Some of the biggest data breaches in the past have been due to employees having full access to all systems and leaking sensitive information without any physical theft of hardware. Data masking is an important measure to prevent this type of unauthorised access.

An example is production systems that are replicated to multiple test environments. Often the data on production has some level of protection, but as soon as it is “cloned” to the test system, this protection is dropped and unauthorised users are able to access all sensitive information.
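The sketch below illustrates that point, assuming a hypothetical production table represented as Python dictionaries: sensitive fields are masked or pseudonymised before records are copied to a test environment. The field names and masking choices are illustrative assumptions, not a prescription for any specific system.

```python
# Illustrative sketch: mask sensitive fields before cloning production data to test.
# Field names and masking rules are assumptions for the example only.
import hashlib

def mask_record(record: dict) -> dict:
    masked = dict(record)
    # Deterministic pseudonym so joins and lookups still work in the test system
    masked["name"] = "cust_" + hashlib.sha256(record["name"].encode()).hexdigest()[:10]
    # Keep only the last three digits of the ID number
    masked["id_number"] = "*" * (len(record["id_number"]) - 3) + record["id_number"][-3:]
    # Replace the email entirely; test systems rarely need a real address
    masked["email"] = "masked@example.invalid"
    return masked

if __name__ == "__main__":
    production_rows = [
        {"name": "Jane Doe", "id_number": "8001015009087",
         "email": "jane@example.com", "balance": 1520.00},
    ]
    test_rows = [mask_record(row) for row in production_rows]
    print(test_rows)
```

The design point is that masking happens as part of the cloning process itself, so the protection applied in production is never silently dropped on the way to test.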

Ongoing education remains key. Enforcement of security policies doesn’t only mean applying technology to monitor and track employees’ usage of the company’s data assets; it also implies a culture shift in the processes of the business. This is often the biggest stumbling block to overcome, and ongoing staff education is needed to help staff understand the importance of data security, identify the various risks and possible attack modes, and understand their roles in securing sensitive data. It is not enough to post notices and have policies in place – ongoing awareness programmes must teach staff about phishing, scamming and the mechanisms hackers use to gain access.

In South Africa, financial services appears to be the leader in terms of data security best practice, mainly due to legislation, international guidelines and the sensitivity of the data the sector works with. However, many other sectors hold highly sensitive data too.  All businesses need to learn from international breach trends and move to assess their data security risk and improve their security strategies.

It’s not the data management – it’s you

When poor quality data, duplicated effort and siloed information impact operational efficiencies, organisations might feel inclined to point a finger at data management. But it’s not data management that’s broken, it’s enterprise strategy.

By Mervyn Mooi, director at Knowledge Integration Dynamics (KID)

Recently, international thought leaders speculated that data management might be ‘broken’ due to the growth in siloes of data and a lack of data standardisation. They pointed out that data siloes were still in place, much like they were back in the 1980s, and that the dream of standardised, centralised data providing a single view was as elusive as ever.


In South Africa, we also see frustrated enterprise staff increasingly struggling to gain control of growing volumes of siloed, duplicated and non-standardised data. This is despite the fact that most organisations believe they have data management policies and solutions in place.

The truth of the matter is – data management is not what’s at fault. The problem lies in enterprise-wide data management strategies, or the lack thereof.

Data management per se is never really broken. Data management refers to a set of rules, policies, standards and governance for data throughout its lifecycle. While most organisations have these in place, they do not always have uniform data management standards throughout the organisation. Various operating units may have their own legacy models which they believe best meet their needs. In mergers and acquisitions, new companies may come aboard, each bringing their own tried and trusted data management policies. Each operating unit may be under pressure to deliver business results in a hurry, so it continues doing things in whatever way has always worked for it.

The end result is that there is no standardised model for data management across the enterprise. Efforts are duplicated, productivity suffers and opportunities are lost.

In many cases, where questions are raised around the effectiveness of data management, one will find that it is not being applied at all. Unfortunately, many companies are not yet mature in terms of data management and will continue to experience issues, anomalies and politics in the absence of enterprise-wide data management. But this will start to change in future.

In businesses across the world, but particularly in Africa, narrower profit margins and weaker currencies are forcing management to look at back-end processes for improved efficiencies and cost cutting. Implementing more effective data management strategies is an excellent place for them to start.

Locally, some companies are now striving to develop enterprise-wide strategies to improve data quality and bring about more effective data management. Large enterprises are hiring teams and setting up competency centres to clean the data at enterprise level and move towards effective master data management for a single view of the customer that is used in a common way across the various divisions.
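As a highly simplified illustration of what a single view of the customer involves under the hood, the Python sketch below groups records from two hypothetical divisional systems on a normalised comparison key. Real master data management platforms apply far richer matching, survivorship and stewardship rules; the field names and matching key here are assumptions for the example.

```python
# Simplified sketch of record matching behind a 'single view of the customer'.
# The fields and the comparison key are illustrative assumptions.
from collections import defaultdict

def match_key(record: dict) -> tuple:
    """Build a crude comparison key from a normalised name and ID number."""
    name = "".join(record["name"].lower().split())
    return (name, record["id_number"].strip())

def single_view(*sources: list) -> dict:
    """Group records from multiple divisional sources by comparison key."""
    grouped = defaultdict(list)
    for source in sources:
        for record in source:
            grouped[match_key(record)].append(record)
    return dict(grouped)

if __name__ == "__main__":
    crm = [{"name": "Jane Doe", "id_number": "8001015009087", "segment": "retail"}]
    billing = [{"name": "JANE  DOE", "id_number": "8001015009087 ", "arrears": 0}]
    for key, records in single_view(crm, billing).items():
        print(key, "->", records)
```

Even this toy example shows why enterprise-wide standards matter: two divisions hold the same customer under differently formatted values, and only a common key lets the records be reconciled.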

Enterprise-wide data management standards are not difficult to implement technology-wise. The difficult part is addressing the company politics that stand in the way and driving the change management needed to overcome people’s resistance to new ways of doing things. You may even find resistance to improved data management efficiencies simply because manual processes and inefficient data management keep more people in jobs – at least for the time being.

But there is no question that enterprise-wide standards for data management must be introduced to overcome siloes of data, siloes of competency, duplication of effort and sub-par efficiency. Local large enterprises, particularly banks and other financial services enterprises, are starting to follow the lead of international enterprises in moving to address this area of operational inefficiency. Typically, they find that the most effective way to overcome the data silo challenge is to slowly adapt their existing ways of working to align with the new standards, in a piecemeal fashion that adheres to the grand vision.

The success of enterprise-wide data management strategies also rests a great deal on management: you need a strong mandate from enterprise-level executives to secure the buy-in and compliance needed to achieve efficiencies and enable common practices. In the past, C-suite business executives were not particularly strong in driving data management and standards – they were typically focused on business results, and nobody looked at operating costs as long as the service was delivered. Now, however, business is focusing more on operating and capital costs and discovering that data management efficiencies will translate into better revenues.

With enterprise-wide standards for data management in place, the subsequent consumption and application of that data is highly dependent on users’ requirements, intent and discipline in maintaining those standards. Data items can be redefined, renamed or segmented in line with divisional needs and processes. But as long as the data is not manipulated out of context or in an unprotected manner, and as long as governance is not overlooked, the overall data quality and standards will not suffer.

How to tell if your organisation is strategically data driven

Striving to become a ‘data driven organisation’ is not enough, says Knowledge Integration Dynamics (KID).

By Mervyn Mooi, director at Knowledge Integration Dynamics (KID)

There is a great deal of focus on the ‘data driven organisation’ now. But this focus misses the point – everyone is data driven to some degree. The question should be: are you strategically data driven?

Everyone – from the man in the street to the large enterprise – is driven by data. This data might emerge from weather reports, calendars, meeting schedules and commitments. A plethora of data drives decisions and processes all the time. But this does not mean data is being used effectively. In fact, in this scenario, the data drives us. Only when data is used strategically can we turn the situation around so that we drive the data, using it as a powerful tool to improve business.


While there is always room for improvement and innovation in the gathering, management and application of data, many companies are already strategically data driven. These companies are relatively easy to identify, based on a number of traits they have in common:

  • Innovation and market disruption. Innovation can happen as a once-off ‘accident’, but a sustainable business that consistently innovates and disrupts is certainly basing its success on the strategic use of data. The sustainably innovative enterprise harnesses quality internal and external data and analytics to inform business decisions, improve products and customer experience, and maintain its competitive edge.
  • A culture of rationalisation. When a company is strategically data driven, it has achieved a clear understanding of where its resources can be put to the best use, where bottlenecks and duplication occur and how best to improve efficiencies. A company with a culture of rationalisation, a focus on deduplication and a tendency to automate and reduce manual interventions clearly has good insight into its internal data.
  • A ‘Governance over all’ approach to business and operations. Only an organisation with quality data delivering effective insights into all spheres of the business is in a position to apply effective rules and governance over all systems, operations and processes.
  • Decisions are based on interrogating the right data with the right questions, using the right models.  A strategically data driven organisation does not tolerate poor quality data or interrogate this data in a haphazard fashion. The widespread use of quality data and analytics is evident in every aspect of the business, and is the basis of every decision within the organisation. The strategically data driven organisation also routinely tests new theories, asks the ‘what if’ questions, and constantly monitors and evaluates outcomes to add to the quality of its data and analytics.
  • ‘More than fair’ labour practices. Organisations with a good grasp of their data know what impact employee skills development and job satisfaction have on business processes and revenues. Strategically data driven organisations tend to leverage their skills investments with better working conditions, incentives, salaries, training and perks.
  • Strong leadership at all levels. Strong leadership is the base enabler for the evolution of all the other traits; and strong leaders are supported and measured by data. Data is the lifeblood of the organisation, supporting good leadership by allowing managers to improve efficiencies, ensure effective resource allocation, monitor and improve employee performance and measure their own performance as managers.

 

Any organisation not displaying these traits needs to be asking: “Are we taking an organised approach to data usage and information consumption in order to make our business more effective? Are we using our data to effectively look both inward and outward; finding areas for improvement within our operations and scope for innovation and business growth in our market?”

Big data vs accuracy: don’t believe everything you read

Analysis of unstructured big data has the potential to complement and enhance structured data analysis. Applied well, big data analysis can deliver a range of interesting new insights that enhance and support decision-making within the organisation. But companies should not always believe all that may be derived from the data, warns KID.

By Mervyn Mooi, director at Knowledge Integration Dynamics (KID)

Organisations around the world are buying into the hope that big data analytics, which includes combining structured and unstructured data, will deliver an all-knowing ‘crystal ball’ to drive competitiveness. Last year, Gartner reported that big data analytics services alone constituted a $40 billion market, and one that is growing fast.


A major part of the big data appeal is the promise of combining accurate and structured internal data with fast-changing unstructured external data, offering a complete picture of the market environment and the organisation’s own position within it.

However, while unstructured external data could add useful new methods for information gathering and decision making processes, it cannot be considered 100% accurate. In some cases, it will not be even close to accurate, and cannot be counted on as a basis for making crucial business decisions.

What proportion of unstructured external data is brought into the big data mix, and how much credence is given to it, depends on the questions to be addressed, the organisation’s willingness to accept discrepancies in the data when answering a particular question, and the importance of the decisions to be made based on the big data analysis. Searching for useful insights in unstructured external big data may also require a few passes before acceptable data is identified.

For example, a new car dealership looking for prospective customers might rely entirely on external data to build a leads list. They might use a search engine to identify companies in the area of the dealership; then narrow down the list to companies likely to need cars and likely to have the budget for new cars. The resulting leads list is a good start, but may still require verification calls to determine whether the prospective customers are still in business, still based in the area and likely to be interested.

A bank investigating new branches and new markets might combine its own structured customer data with unstructured external data such as a map, to plot a visual representation of where existing customers are, and where there are gaps with potential for marketing to new customers. This insight may require further clarification and does not guarantee new customers in the blank spots on the map, but it does give the bank useful information to work with.

When an organisation is seeking insights for business-critical decisions, the ratio of qualified structured data to unstructured external data should be around 90:10, with unstructured external data serving to complement the analysis, not form the basis of it. This is because structured (high-value) data is traditionally bound by compliance and quality controls (the ACID properties of transactional systems) and can be trusted.

When using big data analytics, organisations should also note that deriving business value from big data is not an exact science, and there are no guarantees. For instance, a company using its own data in combination with unstructured data to assess interest in its products might count visits to its website as an indicator of its popularity.

While the visitor figures might be accurate, the assumptions made based on the figures could be completely wrong, since visitors to the site could have stumbled across it by accident or have been using it for comparison shopping and have no interest in buying the products.

Big data analytics is helpful for traversing high volumes of unstructured data and supplementing the company’s existing, qualified data. But, depending on the answers needed, big data will need to achieve greater degrees of accuracy and reliability before business-critical decisions can be made based on its analysis.

Big data: don’t adopt if you can’t derive value from it

Amid massive big data hype, KID warns that not every company is geared to benefit from costly big data projects yet.

By Mervyn Mooi, director at Knowledge Integration Dynamics (KID)

Big data has been a hot topic for some time now, and unfortunately, many big data projects still fail to deliver on the hype. Recent global studies are pointing out that it’s time for enterprises to move from big data implementations and spend, to actually acting on the insights gleaned from big data analytics.


But turning big data analytics into bottom-line benefits requires a number of things, including market maturity, the necessary skills, and processes geared to actioning insights. In South Africa, very few companies have these factors in place to allow them to benefit from significant big data projects. Despite the hype about the potential value to be derived from big data, in truth value derivation is still in its infancy.

Locally, we find the early adopters have been major enterprises like banks, where big data tools are necessary for sifting through massive volumes of structured and unstructured data to uncover trends and run affinity and sentiment analysis. But while they have the necessary advanced big data tools, we often find that these new technologies are delivering little more than a sense of confirmation, rather than the surprise findings and bottom-line benefits they hoped for.

This may be due to processes that result in slow application of new insights, as well as to a dire shortage of the new data science skills that marry technical, analytics and strategic business know-how. Currently, the process of big data management is often disjointed from start to finish: companies may be asking the right questions and gaining insights, but unless these insights are delivered rapidly and companies actually use them effectively, the whole process is rendered ineffective. There is little point in having a multi-million rand big data infrastructure if the resulting insights aren’t applied at the right time in the right places.

The challenge now is around the positioning, management and resourcing of big data as a discipline. Companies with large big data implementations must also face the challenges of integration, security and governance at scale. We also find there are many misconceptions about big data, what it is, and how it should be managed. There is an element of fear about tackling the ‘brave new world’ of technology, when in reality, big data might be seen as the evolution of BI.

Most commonly, we see companies feeling pressured to adopt big data tools and strategies when they aren’t ready, and are not positioned to benefit. As with many technologies, hype and ‘hard sell’ may convince companies to spend on big data projects when they are simply not equipped to use them. In South Africa, only the major enterprises, research organisations and perhaps players in highly competitive markets stand to benefit from big data investments. For most of the mid-market, there is little to be gained from being a big data early adopter. We are already seeing cheaper cloud-based big data solutions coming to market, and – as with any new technology – we can expect more of these to emerge in future. Within a year or two, big data solutions will become more competitively priced, simpler and less demanding of skilled resources to manage, and may then become more viable for small to mid-market companies. Until then, many may find that more effective use of their existing BI tools, and even simple online searches, meets their current needs for market insights and information.

Unless there is a compelling reason to embark on a major big data project now, the big data laggards stand to benefit in the long run. This is particularly true for those small and mid-size companies currently facing IT budget constraints. These companies should be rationalising, reducing duplication and waste, and looking to the technologies that support their business strategies, instead of constantly investing in new technology simply because it is the latest trend.

 

 

Companies still fail to protect data

Despite having comprehensive information security and data protection policies in place, most South African businesses are still wide open to data theft and misuse, says KID.

By Mervyn Mooi, Director at the Knowledge Integration Dynamics Group

Numerous pieces of legislation, including the Protection of Personal Information (POPI) Act, and governance guidelines like King III, are very clear about how and why company information, and the information companies hold on partners and customers, should be protected. The penalties and risks involved in not protecting data are well known too. Why, then, is data held within South African companies still inadequately protected?

In our experience, South African organisations have around 80% of the necessary policies and procedures in place to protect data. But the physical implementation of those policies and procedures is only at around 30%. Local organisations are not alone – a recent IDC study has found that two-thirds of enterprises internationally are failing to meet best practice standards for data control.


The risks of data loss or misuse are present at every stage of data management – from gathering and transmission through to destruction of data. Governance and control are needed at every stage. A company might have its enterprise information systems secured, but if physical copies of data – like printed documents or memory sticks – are left lying around an office, or redundant PCs are sent for recycling without effective reformatting of the hard drives, sensitive data is still at risk. Many overlook the fact that confidential information can easily be stolen in physical form.

Many companies fail to manage information sharing by employees, partners and other businesses. For example, employees may unwittingly share sensitive data on social media: what may seem like a simple tweet about drafting merger documents with the other party might violate governance codes. Information shared with competitors in exploratory merger talks might be misused by the same competitors later.

We find that even larger enterprises with policies in place around moving data to memory sticks and mobile devices don’t clearly define what confidential information is, so employees tweet, post or otherwise share information without realising they are compromising the company’s data protection policies. For example, an insurance firm might call a client and ask for the names of acquaintances who might also be interested in its product, but under the POPI Act, this is illegal. There are myriad ways in which sensitive information can be accessed and misused, with potentially devastating outcomes for the company that allows this to happen. In a significant breach, someone may lose their job, or there may be penalties or a court case as a result.

Most organisations are aware of the risks and may have invested heavily in drafting policies and procedures to mitigate them. But the best-laid governance policies cannot succeed without effective implementation. Physical implementation begins with analysing data risk: discovering, identifying and classifying sensitive data, and assessing its risk based on value, location, protection and proliferation. Once the type and level of risk have been identified, data stewards need to take tactical and strategic steps to ensure data is safe.

These steps within the data lifecycle need to include:

  • Standards-based data definition and creation, to ensure that security and privacy rules are implemented from the outset.
  • Strict provisioning of data security measures such as data masking, encryption/decryption and privacy controls to prevent unauthorised access to and disclosure of sensitive, private, and confidential information.
  • The organisation also needs to securely provision test and development data by automating data masking, data sub-setting and test data-generation capabilities.
  • Attention must also be given to data privacy and accountability by defining access based on privacy policies and laws – for instance, who may view personal, financial, health or confidential data, and when (a minimal sketch of such an access rule follows this list).
  • Finally, archiving must be addressed: the organisation must ensure that it securely retires legacy applications, manages data growth, improves application performance, and maintains compliance with structured archiving.
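To make the privacy point above concrete, here is a minimal Python sketch of a deny-by-default access rule applied before data is returned. The roles, data categories and policy table are hypothetical assumptions for illustration, not a reference implementation of POPI requirements.

```python
# Minimal sketch of policy-based access to sensitive data categories.
# Roles, categories and the policy table are illustrative assumptions.
POLICY = {
    "personal":     {"hr_officer", "compliance_officer"},
    "financial":    {"finance_analyst", "compliance_officer"},
    "health":       {"medical_underwriter"},
    "confidential": {"executive", "compliance_officer"},
}

def may_view(role: str, category: str) -> bool:
    """Grant access only if the policy explicitly allows the role for the category."""
    return role in POLICY.get(category, set())

def fetch(role: str, category: str, value: str) -> str:
    # Deny by default: never expose data the policy does not explicitly cover
    return value if may_view(role, category) else "*** access denied by privacy policy ***"

if __name__ == "__main__":
    print(fetch("finance_analyst", "financial", "Account balance: R15 200"))
    print(fetch("finance_analyst", "health", "Chronic condition: hypertension"))
```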

 

Policies and awareness are not enough to address the vulnerabilities in data protection. The necessary guidelines, tools and education exist, but to succeed, governance has to move off paper and into action. The impact of employee education is temporary – it must be refreshed regularly, and it must be enforced with systems and processes that entrench security within the database, at file level, server level, network level and in the cloud. This can be a huge task, but it is a necessary one when architecting for the future.

In the context of the above, a big question to ponder is: has your organisation mapped the rules, conditions, controls and standards (RCCSs), as translated from accords, legislation, regulation and policies, to your actual business and technical processes and data domains?