Business Requirements: Not One Size Fits All

September 7th, 2013 | Big Data, Blog

The total cost of operations (TCO) of Business Intelligence (BI) systems is often measured in three categories: time-to-completion of projects, on-budget completion of projects, and cost per user of BI applications. One key process in every project impacts all three categories: business requirements engineering. An effective requirements methodology ensures that project scope is clearly understood and costs are accurately estimated. At the same time, when we deliver what users actually want, usage and adoption of the solution grow the user base. Why, then, do so many programs not take a closer look at the effectiveness of their approach to this key part of the process?

Big Data and Cloud Data Warehousing Knowledge: Do You Have the Technical Skills You Need?

September 6th, 2013 | Big Data, Blog

BI professionals are used to working with a wide range of products and platforms and typically carry a substantial tool belt that lets them work across many different technologies. Over the past couple of months I took the opportunity to experiment with technologies that are entering the data warehousing ecosystem: the Cloudera sandbox, the Hortonworks sandbox, the IBM BigInsights sandbox, and Amazon Redshift.

The Cloud Is Ready for Your Data Warehouse. Are You?

August 29th, 2013 | Big Data, Blog

The business world continues to evaluate and implement the cloud for some of its IT requirements. The concept of the cloud as a viable IT storage solution as well as a way to cut costs is gaining momentum. But it might prompt the question: is the cloud the right place for a data warehouse? This is an interesting question for many, and a problematic question for some.

A Better Data Warehouse Benchmark

July 25th, 2013 | Big Data, Blog

We often work with clients to benchmark database performance during the vendor selection process, and as a result we have become experts at working with the various standardized testing data sets available to the public. Almost anyone who has been involved in a performance test is familiar with the Transaction Processing Performance Council (TPC) benchmarks.
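Whatever data set is used, the core of any such benchmark is the same: run each query several times against a loaded database and record elapsed wall-clock time. The sketch below illustrates that timing-harness idea in Python against an in-memory SQLite database; the table and query here are illustrative stand-ins, not part of the TPC suite, and a real engagement would target the candidate warehouse platform instead.

```python
import sqlite3
import time

def benchmark_query(conn, sql, runs=5):
    """Run a query several times and return the average elapsed seconds."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        conn.execute(sql).fetchall()  # fetchall() forces full result materialization
        timings.append(time.perf_counter() - start)
    return sum(timings) / len(timings)

# Build a tiny in-memory table as a stand-in for a benchmark data set.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE lineitem (qty INTEGER, price REAL)")
conn.executemany(
    "INSERT INTO lineitem VALUES (?, ?)",
    [(i % 50, i * 1.5) for i in range(10_000)],
)

avg = benchmark_query(conn, "SELECT qty, SUM(price) FROM lineitem GROUP BY qty")
print(f"average elapsed: {avg:.6f}s")
```

In practice the interesting comparisons come from running the same queries at several data volumes and concurrency levels, since vendors differ far more under load than on a single warm query.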

Is Amazon’s Redshift a Game Changer?

July 5th, 2013 | Big Data, Blog

Now and then new technologies, ideas, and even buzzwords come along that fundamentally change the way people look at the IT game. When Amazon first released Amazon Web Services (AWS), it changed the game of cloud-based data centers by introducing pay-as-you-go pricing for servers and storage. By replacing large up-front capital infrastructure expenditures with much lower costs that customers could scale as their businesses grew, Amazon grew its own business by fostering many more entrants into the e-commerce space, many of whom also turned to Amazon's logistics and fulfillment services. That was a game-changing moment that was definitely a win-win for all parties.

Are Those Data Clouds on the Horizon?

June 20th, 2013 | Big Data, Blog

So, would you ever even consider putting a data warehouse in the cloud? With the cloud’s huge capacity, quick deployments and high availability—all at really low costs—it’s hard to ignore the possibilities. Ever since the iOLAP management team told me we were new integration partners with the Amazon Web Services (AWS) Redshift platform, I have been thinking about the cloud-based data warehouse concept.

Avoiding Big Failure with Big Data

June 7th, 2013 | Big Data, Blog

Companies are being pressured on multiple fronts to do something amazing with Big Data. But should the brakes be applied, even slightly, to consider some lessons from the past? Historically, large data warehouse projects failed at surprisingly high rates. How do we learn from these collective past mistakes and increase the odds that Big Data efforts will provide Big Returns?

Big Data Is Opening Doors, but Maybe Too Many

June 1st, 2013 | Big Data, Blog

From a May 23, 2013 article in the New York Times Technology section: "In the 1960s, mainframe computers posed a significant technological challenge to common notions of privacy. That's when the federal government started putting tax returns into those giant machines, and consumer credit bureaus began building databases containing the personal financial information of millions of Americans. Many people feared that the new computerized databanks would be put in the service of an intrusive corporate or government Big Brother."

Big Data and Relational Databases: “Both/And” or “Either/Or”?

May 29th, 2013 | Big Data, Blog

In the May 21, 2013 issue of BI This Week, TDWI author Steve Swoyer wrote an interesting article offering insights into both practitioner and vendor thinking on the pragmatics of Big Data, Hadoop, and relational databases.