Barry Lerner, Huawei’s regional CIO for South Pacific solutions marketing, says avoiding being deluged in a data storm of useless information requires clearly defining the expected business outcomes.
Cloud Supplement: How should service providers undertake big data?
Barry Lerner: As service providers start to explore the insights, efficiencies, value and competitive edge that big data can bring to their companies, they must first decide how to embark on such a journey. Big data must be business driven with well-defined expected outcomes. Each department or organization needs to define its own individual big data requirements.
Big data must not be undertaken as a technology initiative, nor driven by the IT department. Each department will need to clearly define its requirements, such as improving customer loyalty, streamlining processes, improving revenue, optimizing the network or driving top-line growth through predictive analytics. These requirements need to be clearly defined and agreed upon before any project spending begins.
How do service providers identify what data they need to capture and keep?
The biggest big data challenge I see for service providers today is to identify the correct and most useful data sources to tap, and then to understand where to find the value in that data. You need to start with the basic questions that pertain to your department’s mission and KPIs.
If you go and capture everything, you will soon find yourself deluged in a data storm, up to your neck in useless information. Typical questions will be: “How are customers and potential customers using technology?” “What are they not doing with technology, or what technology are they avoiding?” “Are my customers happy? If not, why?” “What will my customers be doing six months or a year from now?” And, of course, “What do I do to meet my customers’ needs?”
The internal side of a service provider (the OSS function) can use analytics to address questions such as:
- Where network failures occur
- Where they have to improve response time or uptime
- Where shifts in customer behavior, drawn from the customer-facing analysis, will impact the network or call for network expansion
- How concepts such as virtualization can allow the operations team to respond to shifts in customer activity, even things like time of day variances.
Is there a true life cycle to big data? Which data has a short life cycle, and which data has a long life cycle?
Everyone always asks this question. The simple answer is that all life cycles truly depend on the data and the business requirements. Operational data usually has a short life cycle but can be kept for trend analysis. For example, certain customer data can expose busy hour traffic, type of traffic or types of devices that are connected during that busy hour. This data may be considered most valuable when new, and less valuable as time passes.
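The “most valuable when new” idea can be sketched as a simple decay score. The exponential form and the 30-day half-life below are illustrative assumptions, not figures from the interview:

```python
import math

# Illustrative sketch: score a record's analytic value by age.
# The half-life parameter is an assumed example, not a stated figure.
def data_value(age_days: float, half_life_days: float = 30.0) -> float:
    """Value in [0, 1]: 1.0 when new, halving every half_life_days."""
    return math.pow(0.5, age_days / half_life_days)
```

A retention policy could then archive or delete records once their score falls below a business-agreed threshold.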
Do you believe that life cycle management matters more with big data?
There are three major concerns with life cycle management of big data that service providers should be aware of.
1. Big data grows at a vicious rate. There are plenty of statistical examples that demonstrate how much unstructured and semi-structured data a service provider generates, and how fast. A perfect example is the advent of service providers entering digital services environments such as cloud or M2M, which has created a whole new type of data chatter from sensors, web activity and so on. Therefore, it is critical to understand that you cannot keep data forever. You need to develop a life-cycle management strategy that includes archiving and deletion policies, business rules and IT automation.
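An archiving-and-deletion policy of the kind described can be expressed as simple per-class rules plus an automated disposition check. The data classes and retention windows below are illustrative assumptions, not recommended values:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical retention rules per data class; the categories and
# windows are illustrative assumptions, not recommended policy.
@dataclass
class RetentionRule:
    data_class: str           # e.g. "sensor", "web_log", "billing"
    archive_after: timedelta  # move to cheaper cold storage
    delete_after: timedelta   # purge entirely

RULES = {
    "sensor":  RetentionRule("sensor",  timedelta(days=7),   timedelta(days=90)),
    "web_log": RetentionRule("web_log", timedelta(days=30),  timedelta(days=365)),
    "billing": RetentionRule("billing", timedelta(days=365), timedelta(days=365 * 7)),
}

def disposition(data_class: str, created: datetime, now: datetime) -> str:
    """Return the lifecycle action for a record: keep, archive, or delete."""
    rule = RULES[data_class]
    age = now - created
    if age >= rule.delete_after:
        return "delete"
    if age >= rule.archive_after:
        return "archive"
    return "keep"
```

Running such a check on a schedule is the “IT automation” piece; the rules themselves should come from the business-agreed policies.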
2. Most big data is ephemeral by nature. Not only does it grow like dandelions in spring, but much of it is quickly outdated. Sensor data, social media mining and even web logs can be analyzed for trends, but when you’re using it for real-time decisions, as companies often are today, this data all too quickly becomes irrelevant.
3. Out-of-date big data can undermine the results of your business analytics. One reason there’s a huge focus right now on streaming data and real-time analytics is that companies are trying to manage by exception. Organizations don’t care when things go right, only when things go wrong. With data, which can be unpredictable and come in many different sizes and formats, the process can be a nightmare.
We need to start thinking about how we are going to manage this incoming mass of unstructured and semi-structured data in our data centers; with images, videos and documents growing at a clip of 80%, we may never be able to lift our heads above water.
How do you believe service providers should approach managing big data?
I believe there are two major steps service providers need to undertake to manage big data. First, classify the data as strategic and operational. Second, ensure that the IT department works with the business organizations to set up roadblocks and checkpoints, via policies, so that the flow of big data through the portals is managed before it enters data centers or the cloud.
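These two steps — classification, then a policy checkpoint at the point of ingestion — can be sketched as follows. The source names and the admission rule are hypothetical assumptions for illustration:

```python
# Illustrative sketch: classify incoming records as strategic or
# operational, then gate them before they reach the data center.
# Source names and the admission rule are assumed examples.
STRATEGIC_SOURCES = {"crm", "billing", "churn_model"}
OPERATIONAL_SOURCES = {"netflow", "syslog", "sensor"}

def classify(source: str) -> str:
    if source in STRATEGIC_SOURCES:
        return "strategic"
    if source in OPERATIONAL_SOURCES:
        return "operational"
    return "unclassified"

def checkpoint(record: dict) -> bool:
    """Policy gate: admit only records with a known class and a
    business owner assigned, per the agreed policies."""
    return classify(record["source"]) != "unclassified" and bool(record.get("owner"))
```

The point of the gate is that the policy is applied jointly by IT and the owning business organization before storage costs are incurred, not after.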
Get back to the basics by setting up some old-fashioned data management meetings, this time about big data. These meetings need to address both the strategic and operational levels. The IT department needs to be aggressive in the service provider IT environment, instituting proven technologies, governance, policies, procedures and rules to harness the big data that is created by, and transits, all the organizations and departments.
Also build time dimensions into your data quality processes and metrics:
- How often do we need to get this data into the internal systems?
- How frequently does this data need to be refreshed to be useful?
- When is data too dated to provide value to the business?
- When do you archive or back up data?
- When do you remove data as it becomes dated?
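The time-dimension questions above translate directly into per-feed freshness metrics. A minimal sketch, where the feed names, refresh intervals and staleness limits are all illustrative assumptions:

```python
from datetime import datetime, timedelta

# Hypothetical freshness policies; feed names and windows are
# illustrative assumptions, not recommended values.
FRESHNESS = {
    "busy_hour_traffic": {"refresh": timedelta(hours=1), "stale": timedelta(days=30)},
    "device_inventory":  {"refresh": timedelta(days=1),  "stale": timedelta(days=180)},
}

def quality_status(feed: str, last_refreshed: datetime, now: datetime) -> str:
    """Answer the time-dimension questions for one feed:
    fresh enough, due for a refresh, or too dated to provide value."""
    policy = FRESHNESS[feed]
    age = now - last_refreshed
    if age >= policy["stale"]:
        return "too_dated"      # candidate for archive or removal
    if age >= policy["refresh"]:
        return "needs_refresh"
    return "fresh"
```

Reporting these statuses per feed gives the data management meetings a concrete metric to review rather than an open-ended debate.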