ITOA News

Three Enablers That Bring New Breakthroughs for ITOA

By Jin Zhang, Sr. Director of Product, CA Technologies

ITOA (IT Operations Analytics) is undergoing significant change, led by analytics. Gartner identifies "advanced, pervasive and invisible analytics" as one of the top 10 technology trends for 2015 and predicts, "Analytics will take center stage as the volume of data generated by embedded systems increases and vast pools of structured and unstructured data inside and outside the enterprise are analyzed."1 With advanced analytics, ITOA has become more data-driven and more real-time, creating greater value for business, especially in the areas of operational efficiency, business opportunity discovery, and customer experience optimization.

Such breakthroughs don't happen overnight or in a vacuum. Three significant technology enablers are laying the groundwork for fully leveraging the value of ITOA.

The first enabler is the network effect of big data, which makes it possible to derive predictive and actionable intelligence. Big data has significantly enriched the possibilities of analytics through both volume and mix. First, big volume is finally here: instead of megabytes or gigabytes, we now routinely talk about terabytes, petabytes, and more. Volume at this scale is game-changing because it increases the overall statistical significance of any data analysis. Second, the mix of data has vastly broadened, especially through the incorporation of social media data. Where a particular user name or IP address once provided only limited knowledge about its source, we can now discover that user's interests via Facebook, Twitter, LinkedIn and more, and such public data can be assembled into a user profile. A wealth of user profiles built from public data can, in turn, generate insights for IT Operations.
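
As a loose illustration of how a broader data mix changes what a single operations event can tell us, the sketch below joins a bare log record with a hypothetical, pre-collected table of public profile data. Every field name and record here is invented for the example; it is not a description of any particular product.

```python
# Illustrative sketch only: how public profile data (assumed to be already
# collected and licensed for use) might enrich a bare IT-operations log entry.
# All field names and sample records below are hypothetical.

# A raw operations event typically carries little context about its source.
raw_event = {"user": "jdoe", "ip": "203.0.113.42", "latency_ms": 840}

# Hypothetical lookup table built from public social-media data.
public_profiles = {
    "jdoe": {"interests": ["cloud", "mobile banking"], "region": "US-West"},
}

def enrich(event, profiles):
    """Attach a user profile to an event so downstream analytics can
    segment behavior by interest or region instead of by raw user name."""
    profile = profiles.get(event["user"], {})
    return {**event, **{"profile_" + k: v for k, v in profile.items()}}

if __name__ == "__main__":
    print(enrich(raw_event, public_profiles))
    # {'user': 'jdoe', 'ip': '203.0.113.42', 'latency_ms': 840,
    #  'profile_interests': ['cloud', 'mobile banking'],
    #  'profile_region': 'US-West'}
```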

The second enabler of the ITOA breakthrough is the advance of computing, or more specifically, data science accelerated by that advance. Search on terms like "data science" and "data scientist" today and you will find top entries such as "5 most lucrative careers!" and "Data Scientist Highest Paying," which instantly convey the excitement around data careers. Data science, however, is not a new field. Modeling and statistical analysis have been around for a long time, especially in research. What is changing is the speed and cost of computing. Hardware advances have brought costs down significantly, and cloud computing lets us consume computing resources as a service. We can now sign up with a cloud provider such as Amazon Web Services in a matter of minutes to run a computing job; rather than investing in and maintaining a large computing infrastructure, we pay only for the time and resources we consume. This kind of computing advancement has pushed data science from the back end to the front end, from the research lab to the user interface, from analyzing historical data to making real-time recommendations. In fact, it is now so accessible that "data is embedded right in the product, to enable real-time decisions."2
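
To make that shift from the back end to the front end concrete, here is a minimal sketch of the pattern the paragraph describes: a model is fitted once on historical observations, then embedded in the request path to return a real-time recommendation. The features, sample data, and decision threshold are all hypothetical.

```python
# Minimal sketch of "data science moving to the front end": a model is fitted
# offline on historical data, then called inline to make a real-time decision.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical historical observations:
# [cpu_load, error_rate] -> 1 if an incident followed, else 0.
X_hist = np.array([[0.2, 0.01], [0.3, 0.02], [0.8, 0.20], [0.9, 0.35],
                   [0.4, 0.03], [0.7, 0.15], [0.95, 0.40], [0.1, 0.00]])
y_hist = np.array([0, 0, 1, 1, 0, 1, 1, 0])

model = LogisticRegression().fit(X_hist, y_hist)   # the "back-end" step

def score_request(cpu_load, error_rate):
    """The "front-end" step: a real-time recommendation inside the product."""
    p_incident = model.predict_proba([[cpu_load, error_rate]])[0, 1]
    return "throttle traffic" if p_incident > 0.5 else "proceed"

print(score_request(0.85, 0.25))   # e.g. 'throttle traffic'
```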

The third enabler is deep learning. To me this is the most interesting enabler, with huge potential, because it is still being developed. With all the buzzwords circulating in this space, it is worth briefly reviewing the evolution of machine learning. Machine learning, as defined by Wikipedia, is "a subfield of computer science that evolved from the study of pattern recognition and computational learning theory in artificial intelligence."3 Deep learning is the part of the broader machine-learning field concerned with learning representations of data; it makes machine learning smarter so that it can handle more abstract concepts. According to Reza Zadeh of Stanford University,4 in the 1990s Google was very successful in leveraging machine learning for search, page rank and spam email filtering. Similarly, in the last few years deep learning has become significant, with its applicable fields expanding from robotics and speech recognition to enterprise IT.6 This evolution from machine learning to deep learning is especially valuable for enterprises in the ITOA space because of data correlation: by correlating end-user data, system and network data, application monitoring data and data from many other sources, the pattern recognition capability of ITOA is greatly enhanced, opening huge new possibilities for the combination of deep learning and advanced analytics.
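
As a rough sketch of what that combination can look like, the example below feeds correlated signals from several monitoring sources into a small neural network that learns to flag incident-like patterns. It uses Keras with entirely synthetic data and made-up feature meanings, and it shows the mechanics only, not a production model.

```python
# Sketch only: a small feed-forward network over correlated monitoring signals.
# Data, labels, and feature meanings are synthetic and purely illustrative.
import numpy as np
from tensorflow import keras

# Hypothetical feature matrix: each row concatenates signals from several
# sources for the same time window: end-user response time, host CPU,
# network retransmits, application error count.
rng = np.random.default_rng(0)
X = rng.random((256, 4))
# Hypothetical labels: 1 if the window was later classified as an incident.
y = (X[:, 0] + X[:, 2] > 1.2).astype("float32")

# A deliberately tiny network; real deployments would use far more data
# and richer architectures.
model = keras.Sequential([
    keras.Input(shape=(4,)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(8, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)

# Score a new observation that correlates all four sources at once.
print(model.predict(np.array([[0.9, 0.4, 0.8, 0.1]]), verbose=0))
```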

The three enablers of big data, cloud computing and deep learning have resulted in significant breakthroughs for ITOA. Why, then, is it important for us to understand them?

For users, it matters because understanding the three enablers will help them make the right decisions about their ITOA strategy. "As more data is generated and more systems are incorporated into the network," stated Cassandra Balentine of Software Magazine, "emerging uses for ITOA and AOA simplify complex environments in a way that IT can gather meaningful insight."7 That understanding allows users to seek out these "emerging uses" so they can simplify complex environments and gain meaningful insight.

For vendors, knowledge of the three enablers is critical to defining a product niche and discovering new market opportunities. Many vendors are now focusing on the ability to analyze large volumes of data from a variety of sources in real time. It would be highly unlikely for any of them to deliver on this ability without an in-depth understanding of the three enablers: big data, cloud computing and deep learning.

References

  1. "Gartner Identifies the Top 10 Strategic Technology Trends for 2015", Gartner, 2015, http://www.gartner.com/newsroom/id/2867917
  2. Davenport, Thomas. "Analytics 3.0", Harvard Business Review, December 2013
  3. Wikipedia, https://en.wikipedia.org/wiki/Deep_learning
  4. Beyer, David. "On the evolution of machine learning. From linear models to neural networks: an interview with Reza Zadeh", O'Reilly, http://radar.oreilly.com/2015/05/on-the-evolution-of-machine-learning.html
  5. Zhang, Jin. "Using Artificial Intelligence to Solve Complex Problems", Grace Hopper Conference 2015, http://schedule.gracehopper.org/session/using-artificial-intelligence-to-solve-complex-problems/, 2015
  6. Cappelli, Will. "IT Operations Analytics: Pattern-Based Strategies in the Data Center", Gartner, 2014
  7. Balentine, Cassandra. "Better performance faster innovation", http://www.softwaremag.com/better-performance-faster-innovation/, 2015

About Jin Zhang
Jin Zhang is Sr. Director of Product at CA Technologies and a passionate technology leader who currently leads analytics there. Prior to this, Jin led Apigee to its IPO as its VP of Engineering and was an engineering executive with IBM, responsible for managing large teams and delivering eight-figure revenue. She is a member of the Palo Alto Lean In Circle and one of the sponsors and speakers for the Bay Area Girls Geek Dinner. Jin is passionate about diversity and mentoring, and her favorite technology spaces are data, cloud, and mobile.

Copyright © 2016 Evolven Software
