IBM recently hosted its Big Data and Analytics Analyst Insights conference in Toronto to emphasize the strategic importance of this topic to the company and to highlight recent and forthcoming advancements in its big data and analytics software. Our firm followed the presentations with interest. My colleagues Mark Smith and Tony Cosentino have commented on IBM’s execution of its big data strategy and its approach to analytics, and Ventana Research has conducted benchmark research on the challenges of big data.
The perennial challenge for the IT industry observer is to be skeptical enough to avoid being taken in by overblown claims (often, the next big thing isn’t) without missing turning points in the evolution of technology. “Big data” has always been with us – it’s the amount that constitutes “big” that has changed. A megabyte was considered big data in the 1970s but is a trivial amount today. There’s no question that the hype around big data is excessive, but beneath the noise is considerable substance. Big data is of particular interest today because the scope and depth of the data that can be analyzed rapidly and turned into useful information are so large as to enable transformations in business management. The effect will be to raise computing – especially business computing – to a higher level.
The IBM event demonstrated that the technology for handling big data analytics to support more effective business computing is falling into place. It’s not all there yet but should mature considerably over the next several years. Yet while technological barriers are falling, there are other issues organizations will need to resolve. For example, the conversation in the conference sessions frequently turned to the people element of successfully deploying big data and analytics. These discussions confirmed an important finding of our big data benchmark research: more than three-fourths of organizations find staffing and training people for big data roles to be a challenge. It’s useful to remind ourselves that there will be the usual lag between what technology makes possible and the diffusion of new information-driven management techniques.
The conference focused mostly on the technology supporting big data and analytics but included examples of conceivable use cases for that technology. For example, there was a session on better management of customer profitability in the banking industry. Attempts to use software to optimize customer profitability go back to the 1980s, but results have been mixed at best, and a great many users continue to perform analysis in spreadsheets using data stored in business silos. IBM speakers described a generic offering built on Cognos TM1 that automates the fusion of multiple bank data sources feeding a range of profitability-related analytic applications. The aim is to enable more precise decisions on pricing, rates and fees, among other factors. This application enables consistent costing methodologies, including activity-based costing for indirect and shared expenses, to promote a more accurate assessment of the economic profitability of offers to customers. A good deal of the value in this offering is that it puts the necessary data in one place, giving executives and managers a more consistent and complete data set than they typically have. As well, the product’s use of in-memory analytic processing enables much faster calculations. Faster processing of a more complete data set supports more iterative, what-if analyses that can be used to explore the impact of different strategies in setting objectives for a new product or service or to examine alternatives for exploiting market opportunities or addressing threats.
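To illustrate the mechanics, here is a minimal sketch of an economic-profitability calculation that combines direct revenue and costs with activity-based allocation of indirect expenses, plus a simple what-if scenario. All names, rates and figures are hypothetical and are not drawn from IBM’s offering.

```python
# Illustrative sketch of customer profitability with activity-based costing.
# All rates, drivers and figures are hypothetical; a production system such
# as the TM1-based offering described above would source them from bank systems.

ACTIVITY_RATES = {            # allocated cost per unit of each activity driver
    "branch_visit": 4.50,
    "call_center_contact": 6.25,
    "online_transaction": 0.08,
}

def economic_profit(customer, capital_charge_rate=0.10):
    """Revenue minus direct costs, allocated activity costs and a capital charge."""
    activity_cost = sum(
        ACTIVITY_RATES[driver] * volume
        for driver, volume in customer["activity_volumes"].items()
    )
    return (customer["revenue"]
            - customer["direct_cost"]
            - activity_cost
            - capital_charge_rate * customer["allocated_capital"])

customer = {
    "revenue": 1200.0,            # interest and fee income
    "direct_cost": 400.0,         # funding and direct servicing costs
    "allocated_capital": 2500.0,  # capital attributed to the relationship
    "activity_volumes": {"branch_visit": 10,
                         "call_center_contact": 4,
                         "online_transaction": 600},
}

base = economic_profit(customer)

# What-if: a 25-basis-point fee increase on a hypothetical $100,000 balance.
# Only the revenue side is modeled here; a fuller analysis would also model
# changes in customer behavior.
scenario = dict(customer, revenue=customer["revenue"] + 0.0025 * 100_000)
print(f"base profit: {base:.2f}; change from fee increase: "
      f"{economic_profit(scenario) - base:+.2f}")
```

In a real deployment the activity drivers and rates would come from the fused data sources described above, and a what-if analysis would typically vary several pricing levers at once.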
As the Internet did, big data will change business models and drive the creation of new products and services. The dramatic drop in the cost of instrumenting machines of all types and connecting them to a data network (part of the concept of the Internet of Things) is already changing how companies manage their productive assets, for example, by optimizing maintenance using sensors and telematics to enhance uptime while minimizing repair expense. Decades ago, companies monitored production parameters to ensure quality. More recent technologies can extend the speed and scope of what’s monitored and provide greater visibility into production conditions and trends. Representatives from BMW were on hand at the conference to talk about their experience in improving operations with predictive maintenance. Centrally monitoring the in-service performance of equipment and capital assets is old hat for airlines and jet engine manufacturers. For them, the economic benefits of optimizing maintenance to maximize availability and uptime were significant enough to warrant the large investment they started making decades ago. The same basic techniques can be used for early detection of warranty issues, such as identifying specific vehicles subject to “lemon law” provisions. From IBM’s perspective, these new technologies will enhance the value of Maximo, its asset management software, by extending its functionality to incorporate monitoring and analytics that help users explore options and optimize responses to specific conditions.
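As a concrete, if simplified, illustration of the underlying technique, the sketch below flags a piece of equipment for inspection when a monitored reading drifts outside its normal operating band. The threshold rule and the readings are hypothetical, not BMW’s or IBM’s method; production systems fuse many telematics signals with learned failure models.

```python
# Minimal predictive-maintenance sketch: flag a sensor reading that drifts
# outside its trailing operating band. The data and threshold rule are
# purely illustrative.
from statistics import mean, stdev

def drift_alerts(readings, window=20, sigmas=3.0):
    """Yield (index, value) for readings far outside the trailing window's band."""
    for i in range(window, len(readings)):
        history = readings[i - window:i]
        mu, sd = mean(history), stdev(history)
        if sd > 0 and abs(readings[i] - mu) > sigmas * sd:
            yield i, readings[i]

# Hypothetical vibration readings with a developing fault at the end.
vibration = [1.0 + 0.02 * (i % 5) for i in range(60)] + [1.8, 2.1, 2.4]
for idx, value in drift_alerts(vibration):
    print(f"inspect unit: reading {value:.2f} at sample {idx} is outside the normal band")
```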
IBM Watson is the company’s poster child for the transformative effect of big data and analytics on how organizations operate. The company’s objective is to enable customers to achieve better performance and outcomes through systems that learn from interactions and provide evidence-based responses to queries. My colleagues Mark Smith and Richard Snow have written about Watson in the contexts of cognitive computing and its application to customer service, and we awarded IBM Watson Engagement Advisor our 2012 Technology Innovation Award. Conference presenters gave an extensive review of progress to date with Watson, featuring innovative ways to use it for diagnosis and treatment in medicine as well as to provide customer support.
Although this was not directly related to big data, IBM also used the conference to announce Cognos Disclosure Management (CDM) 10.2.1, which will be available both on-premises and in a cloud-based SaaS version. CDM facilitates the creation, editing and publishing of highly structured enterprise documents that combine text and numbers and are created repeatedly and collaboratively, including ones that incorporate eXtensible Business Reporting Language (XBRL) tags. The new version offers improvements in scalability and tagging over the earlier FSR offering. The SaaS version, available on a per-seat subscription basis, will make this technology feasible for midsize companies, enabling them to save the time of highly skilled individuals as well as enhance the accuracy and consistency of, for example, regulatory filings and board books. A SaaS option also will help IBM address the requirements of larger companies that prefer a cloud option.
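For readers unfamiliar with XBRL, the tagging that CDM automates amounts to wrapping each reported number in machine-readable metadata about what it is, whose it is and what period it covers. The sketch below builds a drastically simplified instance fragment by hand; real filings are produced against official taxonomies by tools such as CDM, and the element names and values here are hypothetical.

```python
# Simplified sketch of what XBRL tagging looks like under the hood. The
# identifiers, dates and figures are hypothetical and abbreviated.
import xml.etree.ElementTree as ET

XBRLI = "{http://www.xbrl.org/2003/instance}"
ET.register_namespace("xbrli", "http://www.xbrl.org/2003/instance")
root = ET.Element(XBRLI + "xbrl")

# A context ties each tagged number to a reporting entity and period.
ctx = ET.SubElement(root, XBRLI + "context", id="FY2013")
entity = ET.SubElement(ctx, XBRLI + "entity")
ET.SubElement(entity, XBRLI + "identifier",
              scheme="http://www.sec.gov/CIK").text = "0000000000"
period = ET.SubElement(ctx, XBRLI + "period")
ET.SubElement(period, XBRLI + "startDate").text = "2013-01-01"
ET.SubElement(period, XBRLI + "endDate").text = "2013-12-31"

# A unit declares the currency; numeric facts must reference one.
unit = ET.SubElement(root, XBRLI + "unit", id="USD")
ET.SubElement(unit, XBRLI + "measure").text = "iso4217:USD"

# The tagged fact itself: a revenue figure (a real filing would use a
# taxonomy-qualified name such as us-gaap:Revenues).
fact = ET.SubElement(root, "Revenues", contextRef="FY2013",
                     unitRef="USD", decimals="-3")
fact.text = "1234000"

print(ET.tostring(root, encoding="unicode"))
```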
Most of the use cases presented at the conference were extensions and enhancements of well-worn uses for information technology. However, when it comes to business, the bottom line is what matters, not novelty. Adoption of technology occurs fastest when “new” elements of its use are kept to a minimum. The rapid embrace of the Internet in North America and other developed regions of the world was a function of the substantial investment that had been made over the previous decades in personal computers and local- and wide-area communications networks as well as the training and familiarity of people with these technologies. Big data and the analytics that enable us to apply it have a similar base of preparation. Over the next five years we can expect advances in practical use that benefit businesses and their customers.
Regards,
Robert Kugel – SVP Research
Robert Kugel leads business software research for ISG Software Research. His team covers technology and applications spanning front- and back-office enterprise functions, and he runs the Office of Finance area of expertise. Rob is a CFA charter holder and a published author and thought leader on integrated business planning (IBP).
Ventana Research’s Analyst Perspectives are fact-based analysis and guidance on business, industry and technology vendor trends. Each is prepared in accordance with Ventana Research’s strict standards for accuracy and objectivity to ensure it delivers reliable and actionable insights; each is reviewed and edited by research management and approved by the Chief Research Officer. No individual or organization outside of Ventana Research reviews any Analyst Perspective before it is published. If you have any issue with an Analyst Perspective, please email ChiefResearchOfficer@isg-research.net.