IT Today is brought to you by Auerbach Publications


The Top Trends Shaping Analytics

By Mark Burton

In a dynamic and fast-paced business environment, organizations need to make decisions more quickly than ever. Where should they focus their limited dollars and resources? Which customers should they target? What's the best marketing approach? Since a reliable crystal ball has yet to be invented, businesses need to look at what happened in the past and use that information to predict what might happen in the future. This is what analytics is all about.

But while the practice of collecting and extracting intelligence from business information is not new (database and reporting systems have been in commercial use since the 1960s), today's analytic requirements are evolving dramatically. Business managers need answers today or tomorrow, not next month or next year. They need to capture and make sense of massive volumes of data spanning both traditional sources, such as transactional systems, and an ever-expanding array of data from online and mobile devices. And in an economy that's forcing everyone to do more with less, they need scalable, affordable and simple-to-use solutions. Here are the top trends that I see shaping analytics in 2010 and beyond:

Dynamic Analysis Replaces Static Reporting
For most organizations, the days of simply churning out quarterly revenue reports are over. Current business decision cycles are far more urgent than they used to be. Decision makers need to know what happened yesterday, an hour ago, or even minutes ago in order to optimize campaigns, maximize efficiency and preserve cash. And they can't just run the same static report over and over again, because the questions that need to be answered are constantly changing. For example, today an online retailer may need to know how many purchases resulted from a new e-mail marketing campaign. Tomorrow, the retailer will want to understand how its brand is discussed on social media sites like Twitter. The highly dynamic nature of business in a Web 2.0 world demands analytic solutions that not only deliver fast response times on big data, but also support ad hoc queries that marry streams of data from many diverse systems and sources.
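To make the retailer example concrete, that kind of ad hoc, cross-source question boils down to a join across data streams. The sketch below is purely illustrative (the table and column names are hypothetical, and an in-memory SQLite database stands in for whatever analytic platform is actually in use):

```python
# Hypothetical sketch: marrying e-mail campaign click data with
# transactional purchase data to count campaign-attributed sales.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Stream 1: clicks captured from an e-mail marketing system.
cur.execute("CREATE TABLE campaign_clicks (customer_id INTEGER, campaign TEXT)")
cur.executemany("INSERT INTO campaign_clicks VALUES (?, ?)",
                [(1, "spring_sale"), (2, "spring_sale"), (3, "newsletter")])

# Stream 2: orders captured from the transactional system.
cur.execute("CREATE TABLE purchases (customer_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO purchases VALUES (?, ?)",
                [(1, 40.0), (3, 25.0), (4, 60.0)])

# Today's ad hoc question: how many purchases, and how much revenue,
# came from customers who clicked the new campaign?
cur.execute("""
    SELECT COUNT(*), SUM(p.amount)
    FROM purchases p
    JOIN campaign_clicks c ON p.customer_id = c.customer_id
    WHERE c.campaign = 'spring_sale'
""")
attributed_orders, attributed_revenue = cur.fetchone()
print(attributed_orders, attributed_revenue)  # customer 1 is the only campaign clicker who bought
```

Tomorrow's question (brand mentions on Twitter, say) would simply be a different query against a different source, which is exactly why ad hoc capability matters more than canned reports.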

The Data Storm Will Continue to Gather Strength
As everything from shopping to socializing has moved online or into the mobile arena, businesses now need to serve, track, analyze and store more information than ever before. And while they struggle to manage the deluge, data retention regulations requiring that information be kept for longer periods of time only add to the challenge. In a recent blog post on enterprise data warehouse growth, Forrester analyst James Kobielus estimated that the need for data warehouse capacity will double every two to three years over the next decade, with volumes expanding into the petabyte range.
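That estimate is easy to sanity-check with back-of-the-envelope arithmetic: doubling every two and a half years (the midpoint of the cited range) over a decade means four doublings, a 16x expansion. The 100 TB starting point below is a hypothetical figure, not one from the text:

```python
# Back-of-the-envelope growth check; the starting size is hypothetical,
# and the doubling cadence is the midpoint of the "every 2-3 years" estimate.
start_tb = 100.0
doubling_period_years = 2.5
horizon_years = 10

growth_factor = 2 ** (horizon_years / doubling_period_years)  # four doublings
end_tb = start_tb * growth_factor

print(growth_factor)  # 16.0
print(end_tb)         # 1600.0 TB, i.e. about 1.6 PB
```

So a warehouse that starts anywhere near the hundreds-of-terabytes range lands in petabyte territory well within the decade.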

And it's not just that there's so much more data; there is also a much broader spectrum of data sources. Valuable information is being generated by a host of devices, including smart phones, web servers, e-readers, Xboxes and even GPS systems. The traditional, hardware-centric data management platforms typically used by large enterprises simply aren't flexible enough to keep up. To overcome the storage and scalability challenges posed by the data storm, organizations will need to seek out new tools that help them load data faster, set up databases quickly without complex tuning, store data more compactly, and reduce the cost and resources involved in managing it all.

Central IT Is Constrained
It's no secret that IT budgets have tightened over the last few years, and service levels and lead times have invariably suffered. Yet when it comes to analytics, time is more critical than ever. How can business units reconcile their "I need it now" intelligence demands with IT processes that involve lengthy review cycles, rigid requirements and delivery timelines that can stretch to months? Clearly, this approach is out of step with the highly dynamic and quickly evolving nature of today's information landscape.

When I was executive vice president of Worldwide Sales at MySQL, our team ran into this exact problem. We urgently needed to make key decisions that required timely analysis of incoming data spanning lead flow, conversion rates and other indicators that affected revenue growth. Our central IT department was tied up with a host of other priorities (changing the ERP system, managing the network, maintaining e-mail systems) and couldn't deliver the answers we needed fast enough. Instead of waiting for central IT, we built a departmental data mart ourselves by layering a business intelligence tool on top of our open source database platform. This framework enabled us to fulfill our immediate reporting needs without going through a lengthy SLA process. The flexibility of the solution also let business users add new queries on a daily and weekly basis to get the insight required to run such a dynamic business.

This example mirrors what goes on in all types of businesses today. Organizations need affordable tools and technologies that can either help central IT respond as fast as the business demands, or give business units a simple way to do it themselves.

Faster, Cheaper and Simpler Is the New Analytics Paradigm
How do you efficiently extract, integrate, mine and analyze millions of rows and terabytes of online data, event data and call detail logs generated by thousands of devices and millions of transactions? How do you transform data into intelligence fast enough for it to be useful? And how do you get the database performance required while consuming minimal processing power and disk space? These are the challenges that modern data management must address.

Traditional data warehousing requires custom configuration, a massive and expensive hardware footprint, and ongoing resources to store and maintain ever-expanding volumes of information. It's not uncommon for a large organization to keep an army of database administrators on staff to model data, create custom indexes and optimize poorly performing queries. This is not sustainable in a world where information requirements change at lightning speed.

Faster, cheaper and simpler is the new analytics paradigm. It is being enabled in part by the rise of open source, as well as by technologies that can compress 10 terabytes of data into a high-performing database of less than a terabyte, deliver fast responses to both planned and ad hoc queries, and remain easy to deploy and administer. Simplicity and low maintenance are perhaps the most important legs of the stool. By freeing users from the constraints of cumbersome data management platforms that require proprietary knowledge and skills, organizations can save money on IT resources, servers and equipment. More importantly, they can get the answers they need in time to make decisions that have a positive business impact. And the beauty of the open source model is that users can try a solution out before they commit, in many cases for free.

Yes, corporate information ecosystems are more complex than ever. But this also means that valuable sources of knowledge are just waiting to be tapped by the next generation of analytics. Is your organization ready to evolve?

About the Author
Mark Burton is CEO of Infobright. He has 30 years of sales and marketing experience, highlighted by executive sales roles at MySQL AB and Informatica. At MySQL, as EVP of Worldwide Sales and Alliances, he built a worldwide sales engine that monetized the MySQL open source brand, culminating in a $1 billion acquisition by Sun Microsystems in February 2008. At Informatica, Mark served as senior vice president of Americas Sales and eventually group vice president of Worldwide Sales and Alliances, helping grow Informatica from $12 million to over $150 million in sales and leading to a successful IPO in April 1999 and a secondary offering in September 2000. Other positions include several vice president of sales and marketing roles at early stage companies, executive-in-residence at Partech International Venture Partners, and regional sales management at Oracle. Mark also serves on the board of directors of open-source leaders Zend and MuleSource as well as on the advisory board for Jaspersoft. Read Mark's Blog.

© Copyright 2010 Auerbach Publications