Common Sense Rules Apply to Big Data Usage

By: Cortland J. Fondon

The term “Big Data” may sound like something out of a near-future science fiction movie about totalitarian technology run amok. In reality, it describes a quiet yet powerful trend that has taken over big-business IT management now that enterprise systems are commonplace and widely understood. In short, the term refers to leveraging the combined information of multiple database systems. Earlier versions of the concept appeared as data mining, and the benefits evolved further under data analytics.

More specifically, when Big Data is applied to customer information, it allows far greater leverage of customer behavior and data than CRM systems originally offered. The goal, of course, is to glean valuable pattern and trend predictions from the information collected on customers. That includes everything from basic identification and spending data to customer behavior and opinions with regard to services or products. Ideally, the resulting summary should predict what a customer will respond to and likely spend on, versus what will land the wrong way and be ignored in future marketing to that target.

However, running reports willy-nilly gets expensive. It requires specialized software, people who know how to run the tools, and plenty of custom slicing and dicing of data to get the right picture. All of that work adds up to a significant price tag, so finding a more efficient approach that delivers the same results or better is key to Big Data management.

The first step is to focus on what is desired – clearly defined report parameters make it far easier to determine which data to pull to meet reporting expectations. Playing a guessing game just results in a lot of wasted time and money, with no viable information for decision-making.
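To make the idea concrete, here is a minimal sketch, assuming a hypothetical customer orders table queried through Python’s standard sqlite3 module: the report parameters are pinned down first, and only the columns needed to answer them are pulled.

```python
import sqlite3
from dataclasses import dataclass

# Report parameters defined up front, so we know exactly which
# data to pull before touching the database.
@dataclass
class ReportParams:
    start_date: str  # inclusive, ISO format, e.g. "2024-01-01"
    end_date: str    # exclusive, ISO format
    region: str

def pull_report_data(conn: sqlite3.Connection, params: ReportParams):
    """Pull only the columns needed to meet the report's expectations.

    The 'orders' table and its columns are illustrative assumptions,
    not a prescribed schema.
    """
    query = """
        SELECT customer_id, order_total, order_date
        FROM orders
        WHERE order_date >= ? AND order_date < ? AND region = ?
    """
    return conn.execute(
        query, (params.start_date, params.end_date, params.region)
    ).fetchall()
```

Because the parameters are explicit, anyone reviewing the report later can see exactly what question the data was pulled to answer.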

Second, ensuring data integrity is critical – the raw information pulled should be accurate and objective. If it is skewed by outside influences or gets corrupted, it produces useless reports. This follows the old IT principle of “garbage in, garbage out,” so controlling how input occurs can make a big difference in the quality of the information summarized.
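One common way to control input is to validate records at the point of entry, before they can corrupt downstream reports. The sketch below assumes a simple customer record; the specific field rules are illustrative only.

```python
import re

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate_customer_record(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record is clean.

    Rejecting bad input at entry time is the cheapest place to
    enforce 'garbage in, garbage out'.
    """
    problems = []
    if not record.get("customer_id"):
        problems.append("missing customer_id")
    if not EMAIL_RE.match(record.get("email", "")):
        problems.append("malformed email")
    spend = record.get("total_spend")
    if not isinstance(spend, (int, float)) or spend < 0:
        problems.append("total_spend must be a non-negative number")
    return problems

# Example: a record with a bad email is flagged before it enters the system.
print(validate_customer_record(
    {"customer_id": "C123", "email": "not-an-email", "total_spend": 42.0}
))
```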

Third, always include the human factor in report evaluation. Simply running reports often produces odd results that raise people’s eyebrows. Good reports are delivered with context and an explanation of the environment the data was collected in. Indirect influences can have big impacts on how data behaves, but those influences won’t always be reflected in the reports (e.g., the star salesperson has been out sick with the flu, so the sales numbers dropped; the sickness won’t show up in the report, but the sales figures will).
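One lightweight way to build that context in is to flag figures that deviate sharply from expectations so a human reviewer must attach an explanation before the report ships. The structure and threshold below are purely illustrative.

```python
from dataclasses import dataclass, field

@dataclass
class ReportLine:
    metric: str
    value: float
    expected: float
    notes: list[str] = field(default_factory=list)  # human-supplied context

def flag_anomalies(lines: list[ReportLine], threshold: float = 0.25) -> list[ReportLine]:
    """Mark lines that deviate from expectations by more than the
    threshold, prompting a reviewer to add an explanation."""
    for line in lines:
        if line.expected and abs(line.value - line.expected) / line.expected > threshold:
            line.notes.append("ANOMALY: needs reviewer explanation")
    return lines

# The flu example from above: the sales drop is flagged, and the
# reviewer records the indirect cause the raw numbers can't show.
sales = ReportLine("east_region_sales", 41_000.0, 80_000.0)
flag_anomalies([sales])
sales.notes.append("Star salesperson out sick with the flu this period.")
print(sales)
```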

Common sense applies to all three rules. When they are followed, the resulting reports can be extremely helpful. Data is immensely powerful, but it needs to be targeted to be beneficial, and that requires practical planning instead of fishing expeditions.

