Risks en route to cloud

By Veemal Kalanjee, Managing Director at Infoflow – part of the KID group

Security in the cloud worries many companies, but security and risk management during migration should be of greater concern.


Security and control of data are commonly cited as being among the top concerns of South African CIOs and IT managers. There is a prevailing fear that business-critical applications and information hosted anywhere but on-premises are at greater risk of being lost or accessed by cyber criminals.

In fact, data hosted by a reputable cloud service provider is probably far safer than data hosted on-premises and secured by little more than a firewall.

What many businesses overlook, however, is the possibility that the real business risks and data security issues could occur before the data has actually moved to the cloud, or during the migration to the cloud.

When planning a move to the cloud, rushing the process poses risks. Poor selection of the cloud service provider, failure to ensure data quality and security, and overlooking critical integration issues can threaten both data security and business continuity.

Large local companies have failed to achieve ambitious plans to rapidly move all their infrastructure and applications to the cloud due to an ‘eat the elephant whole’ approach, which can prove counter-productive and risky. To support progress to the cloud while mitigating risk, cloud migrations should be approached in small chunks instead, as this allows for sufficient evaluation and troubleshooting throughout the process.

Look before leaping

Before taking the plunge, companies must carefully evaluate their proposed cloud service and environment, and strategically assess what data and applications will be moved.

Cloud migrations should be approached in small chunks

Businesses must consider questions around what cloud they are moving to, and where it is hosted. For example, if the data will be hosted in the US, issues such as bandwidth and line speed come into play: companies must consider the business continuity risks of poor connections and distant service providers.

They must also carefully assess the service provider’s continuity and disaster recovery plans, the levels of security and assurances they offer, and what recourse the customer will have in the event of data being lost or compromised or the service provider going out of business. Moving to the cloud demands a broader understanding of security technologies and risk among all project team members than was previously needed in non-cloud environments.

In addition, when considering a move to the public cloud, one aspect cannot be mitigated: what was once an environment used exclusively by the company on-premises is now a multi-tenant, shared environment, which potentially brings its own security risks.

It is up to the company to perform comprehensive due diligence on the cloud vendor to ensure that these security risks are adequately addressed by the preventative measures the vendor has put in place.

Data on the move

Once a suitable cloud vendor has been identified, the data to be migrated must be assessed, its quality must be assured, and the data must be effectively secured.

The recommended first step is to identify the data to be migrated, considering, for example:
* Are there inactive customers on this database?
* Should the company retain and archive that data on-premises, and move only active customers to the cloud?

Once the data to be migrated has been identified, the company must review the quality of this data, identifying and addressing anomalies and duplicates before moving to the next phase of the cloud migration. Since poor quality data can undermine business success, the process of improving data quality ahead of a cloud migration can actually improve business operations, and so help mitigate overall business risk.
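
To make these first steps concrete, the short Python sketch below illustrates one way of splitting a customer list into records to migrate and records to archive, and of flagging likely duplicates before the move. The records and field names (`id`, `email`, `active`) are illustrative assumptions, not a prescribed schema.

```python
from collections import defaultdict

# Illustrative customer records; the field names are assumptions, not a prescribed schema.
customers = [
    {"id": 1, "name": "Acme Ltd",  "email": "info@acme.example", "active": True},
    {"id": 2, "name": "ACME Ltd.", "email": "Info@Acme.example", "active": True},
    {"id": 3, "name": "Old Corp",  "email": "old@corp.example",  "active": False},
]

# Step 1: split active customers (candidates for migration) from inactive ones (archived on-premises).
to_migrate = [c for c in customers if c["active"]]
to_archive = [c for c in customers if not c["active"]]

# Step 2: flag likely duplicates in the migration set by normalising a key field.
by_email = defaultdict(list)
for c in to_migrate:
    by_email[c["email"].strip().lower()].append(c)
duplicates = {email: recs for email, recs in by_email.items() if len(recs) > 1}

print(f"{len(to_migrate)} records to migrate, {len(to_archive)} to archive")
print(f"{len(duplicates)} duplicate group(s) to resolve before migration")
```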

Moving data from the company’s internal network to an external network can present a number of risks.

Adequate levels of data encryption and/or masking must be applied and a secure transport layer implemented to ensure the data remains secure, wherever it is.
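
As a rough illustration of that point, the sketch below masks an identifier and encrypts the sensitive value before it leaves the internal network. It assumes the third-party Python `cryptography` package and a symmetric key held on-premises; the field names and sample values are invented for illustration.

```python
# Requires the third-party 'cryptography' package (pip install cryptography).
from cryptography.fernet import Fernet

def mask_id_number(id_number: str) -> str:
    """Mask all but the last four characters of an identifier before it leaves the network."""
    return "*" * (len(id_number) - 4) + id_number[-4:]

# Symmetric key kept on-premises; only masked and encrypted values travel to the cloud environment.
key = Fernet.generate_key()
fernet = Fernet(key)

record = {"customer": "J. Smith", "id_number": "8001015009087"}

payload = {
    "customer": record["customer"],
    "id_number": mask_id_number(record["id_number"]),  # masked copy for everyday use in the cloud system
    "id_number_encrypted": fernet.encrypt(record["id_number"].encode()).decode(),  # recoverable only with the key
}

print(payload)
```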

In the move to the cloud, the question of access must also be considered – both for individual users and for enterprise applications. It is important to consider all points of integration to mitigate business continuity issues. In many cloud migrations, companies tend to overlook integration points that have not been documented, which presents business continuity challenges. A robust cloud integration solution simplifies this task.

The risk of business processes failing should also be considered during the migration to the cloud. Companies must allocate sufficient time for testing – running systems in parallel for a period to ensure they all function as expected.

While there are risks in moving to the cloud, when the process is approached strategically and cautiously, there are many potential benefits to the migration process. Done well, the process can result in better quality data, a more strategic approach to data management and security, and more streamlined business processes.

It’s not the data management – it’s you

When poor quality data, duplicated effort and siloed information impact operational efficiencies, organisations might feel inclined to point a finger at data management. But it’s not data management that’s broken; it’s enterprise strategy.

By Mervyn Mooi, director at Knowledge Integration Dynamics (KID)

Recently, international thought leaders speculated that data management might be ‘broken’ due to the growth in siloes of data and a lack of data standardisation. They pointed out that data siloes were still in place, much like they were back in the 1980s, and that the dream of standardised, centralised data providing a single view was as elusive as ever.


In South Africa, we also see frustrated enterprise staff increasingly struggling to gain control of growing volumes of siloed, duplicated and non-standardised data. This is despite the fact that most organisations believe they have data management policies and solutions in place.

The truth of the matter is – data management is not what’s at fault. The problem lies in enterprise-wide data management strategies, or the lack thereof.

Data management per se is never really broken. Data management refers to a set of rules, policies, standards and governance applied to data throughout its life cycle. While most organisations have these in place, they do not always have uniform data management standards across the organisation. Various operating units may have their own legacy models, which they believe best meet their needs. In mergers and acquisitions, new companies come aboard, each bringing their own tried and trusted data management policies. Each operating unit may be under pressure to deliver business results in a hurry, so they continue doing things the way that has always worked for them.

The end result is that there is no standardised model for data management across the enterprise. Efforts are duplicated, productivity suffers and opportunities are lost.

In many cases where questions are raised around the effectiveness of data management, one will find that it is not being applied at all. Unfortunately, many companies are not yet mature in terms of data management and will continue to experience issues, anomalies and politics in the absence of enterprise-wide data management. But this will start to change in future.

In businesses across the world, but particularly in Africa, narrower profit margins and weaker currencies are forcing management to look at back end processes for improved efficiencies and cost cutting. Implementing more effective data management strategies is an excellent place for them to start.

Locally, some companies are now striving to develop enterprise-wide strategies to improve data quality and bring about more effective data management. Large enterprises are hiring teams and setting up competency centres to clean data at enterprise level and move towards effective master data management for a single view of the customer that is used in a common way across various divisions.

Enterprise-wide data management standards are not difficult to implement technology-wise. The difficult part is addressing the company politics that stand in the way and driving the change management needed to overcome people’s resistance to new ways of doing things. You may even find resistance to improved data management efficiencies simply because manual processes and inefficient data management keep more people in jobs – at least for the time being.

But there is no question that enterprise-wide standards for data management must be introduced to overcome siloes of data, siloes of competency, duplication of effort and sub-par efficiency. Local large enterprises, particularly banks and other financial services enterprises, are starting to follow the lead of international enterprises in addressing this area of operational inefficiency. Typically, they find that the most effective way to overcome the data silo challenge is to slowly adapt their existing ways of working to align with new standards, in a piecemeal fashion that adheres to the grand vision.

The success of enterprise-wide data management strategies also rests a great deal on management: you need a strong mandate from enterprise-level executives to secure the buy-in and compliance needed to achieve efficiencies and enable common practices. In the past, C-suite business executives were not particularly strong in driving data management and standards – they were typically focused on business results, and nobody looked at operating costs as long as the service was delivered. Now, however, business is focusing more on operating and capital costs and discovering that data management efficiencies translate into better revenues.

With enterprise-wide standards for data management in place, the later consumption and application of that data is highly dependent on users’ requirements, intent and discipline in maintaining the data standards. Data items can be redefined, renamed or segmented in line with divisional needs and processes. But as long as the data is not manipulated out of context or in an unprotected manner, and as long as governance is not overlooked, overall data quality and standards will not suffer.

SA companies are finally on the MDM and DQ bandwagon

Data integration and data quality management have become important factors for many South African businesses, says Johann van der Walt, MDM practice manager at Knowledge Integration Dynamics (KID).

We have always maintained that solid data integration and data quality management are essential building blocks for master data management (MDM), and we’re finally seeing that customers believe this too. One of the primary drivers is the desire for service-oriented architecture (SOA) solutions, for which MDM is a prerequisite for effectiveness. SOA relies on core data such as products, customers, suppliers, locations and employees, and it is on this core data that companies build the capacity for lean manufacturing, supplier collaboration, e-commerce and business intelligence (BI) programmes. Master data also feeds transactional and analytics systems, so bad quality master data can significantly impact revenues and customer service, as well as company strategies.

Taken in the context of a single piece of data, MDM simply means ensuring one central record of a customer’s name, a product ID or a street address, for example. But the scale matters: in 2013 McKinsey found that companies employing more than 1 000 people hold, on average, around 200 terabytes of data. Getting even a small percentage of that data wrong can have wide-ranging ramifications for operational and analytical systems, particularly as companies attempt to roll out customer loyalty programmes or new products, let alone develop new business strategies. It can also negatively impact business performance management and compliance reporting. In the operational context, transactional processing systems refer to the master data for order processing, for example.
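
As a simple illustration of what “one central record” means in practice, the sketch below merges two versions of the same customer into a master record using a basic most-recently-updated survivorship rule. The source systems, field names and records are hypothetical; real MDM tools apply far richer matching and survivorship logic.

```python
from datetime import date

# Two versions of the same customer held by different systems (hypothetical records).
crm_record = {"customer_id": "C-1001", "name": "Jane Mokoena", "address": "12 Main Rd, Cape Town",
              "updated": date(2015, 3, 1), "source": "CRM"}
erp_record = {"customer_id": "C-1001", "name": "J. Mokoena", "address": "12 Main Road, Cape Town",
              "updated": date(2016, 8, 15), "source": "ERP"}

def build_master_record(records):
    """Basic survivorship rule: take attribute values from the most recently updated source."""
    newest = max(records, key=lambda r: r["updated"])
    return {
        "customer_id": newest["customer_id"],
        "name": newest["name"],
        "address": newest["address"],
        "sources": sorted(r["source"] for r in records),  # keep lineage of contributing systems
    }

print(build_master_record([crm_record, erp_record]))
```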


MDM is not metadata, which refers to technical details about the data. Nor is it data quality. However, MDM must have good quality data in order to function correctly. These are not new concerns; both MDM and data quality have existed for as long as companies have run multiple data systems. Today, though, the concerns are exacerbated by the volume and complexity of data, the most acute demand for compliance in the history of business, and the proliferation of business systems such as CRM, ERP and analytics. Add to that the fact that many companies use multiple instances of these systems across their various operating companies, divisions and business units, often extending across geographies, time zones and languages. All of this combines to create a melting pot of potential error with far-reaching consequences unless MDM is correctly implemented on a foundation of good quality data.

None of these concerns even touches on big data or the cloud yet. Without first ensuring MDM is properly and accurately implemented around the core systems, companies don’t have a snowball’s hope in hell of succeeding at any big data or cloud data initiatives. Big data adds successive layers of complexity depending on the scope of the data and the variety of sources. Shuffling data into the cloud, too, introduces a complexity that the vast majority of businesses, especially those outside of the top 500, simply cannot cope with. With big data alone, companies can expect their stored data to grow by an average of 60% annually, according to IDC. That can be a frightening prospect for CIOs and their IT teams when they are still struggling to grapple with the data feeding their core systems.

While MDM is no longer a buzzword and data quality is an issue as old as data itself, they are certainly crucial elements that South African companies are addressing today.

 

 

Why your business needs canonical data and how to achieve it

By Mervyn Mooi

Most companies did not suddenly emerge with hundreds or thousands of employees, hundreds of thousands of customers and millions in turnover. They grew organically, or by merging with, acquiring or being acquired by other companies.

Growth, though mostly desirable in the business world, creates its own problems and challenges within an organisation. And it is likely the cause of one of your biggest headaches – alignment. For example: one part of the business has no idea what another is up to, and executives struggle to get at the information they need to make educated decisions about how to run the growing, merged or acquired business. Furthermore, it is an enormous manual process to capture, recapture and synchronise data across the full spectrum of systems – including data on printed sheets of paper or in apps on mobile devices.

Most programs pass data or information to and from a data store, such as a database, file or storage repository: apps, processes and other programs all operate against an underlying data store. Even Outlook is just a fancy front-end sitting on top of a data store, and business systems are no different. Many business applications and processes share a data store – and some companies go a step further, creating data warehouses and master hubs that collate, integrate and standardise the data from many sources.
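
A minimal sketch of that standardisation step is shown below: records from two hypothetical source systems, each with its own field names, are mapped into one canonical customer format. The mappings, field names and sample data are assumptions for illustration only.

```python
# Illustrative source records from two systems that name the same facts differently.
billing_row = {"cust_no": "1001", "cust_name": "Jane Mokoena", "tel": "0215550101"}
crm_row = {"customerId": "1001", "fullName": "Jane Mokoena", "phoneNumber": "021 555 0101"}

# Canonical field names and per-source mappings (assumed for illustration).
MAPPINGS = {
    "billing": {"customer_id": "cust_no", "name": "cust_name", "phone": "tel"},
    "crm": {"customer_id": "customerId", "name": "fullName", "phone": "phoneNumber"},
}

def to_canonical(row: dict, source: str) -> dict:
    """Translate a source-specific record into the shared canonical form."""
    mapping = MAPPINGS[source]
    canonical = {target: row[source_field] for target, source_field in mapping.items()}
    canonical["phone"] = canonical["phone"].replace(" ", "")  # normalise formatting on the way in
    canonical["source_system"] = source
    return canonical

print(to_canonical(billing_row, "billing"))
print(to_canonical(crm_row, "crm"))
```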


The next problem then is that there are so many data stores…