Search ERP Arena

Jun 3, 2014

Making BIG DATA, small.

The term "big data" has caught on with the many business IT teams that find themselves drowning in large data pools. Anyone who has been involved in processing or working with large amounts of data will be able to tell you about the challenges of deciphering the useful information within it.

Big data can be used as a blanket term for any collection of data sets so large and complex that it becomes difficult to process using on-hand database management tools or traditional data processing applications. The volume of business data worldwide, across all companies, doubles every 1.2 years (Wikipedia).

If you are an organization that runs an ERP system, chances are there is a lot of data flowing through your IT and logistics network that you have designed your ERP system to capture, so it can be reported through BI tools to the key decision makers in the company. However, on many occasions these individuals are drowning in data but starved of information.

This is nothing new; almost everyone has experienced it. The best way, then, to make big data small is to understand what you are looking to get out of it.

  • Know who is giving you the big data, and filter in only the data points that will have an impact on the final decision.
  • Understanding what the end result should be will help eliminate the noise in the big data.
  • Understanding the decision-making process itself will help surface the relevant data sets from the overall data pool.
  • Setting up use cases will help break the big data down into specific use-case scenarios.
  • Understanding the roles that will be using the big data (by "role" I mean the role of the business objective in the company's overall strategic initiative, not an individual's job role) is another way to break the analysis down into smaller chunks.
  • Most importantly, the nature of the decision to be made will determine whether we are actually dealing with big data at all, or whether a small sample of the data would suffice.
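The steps above can be sketched in a few lines of code. The example below is a minimal illustration, not a real ERP integration: the transaction records, field names, and the "EU pricing" decision are all invented for the sake of the sketch.

```python
import random

# Hypothetical ERP transaction records (invented for illustration).
transactions = [
    {"order_id": i,
     "region": "EU" if i % 2 else "US",
     "amount": 100 + (i % 50),
     "warehouse_note": "free-text noise"}
    for i in range(10_000)
]

# 1. Keep only the data points that impact the decision
#    (here: region and amount; "warehouse_note" is noise).
relevant = [{"region": t["region"], "amount": t["amount"]}
            for t in transactions]

# 2. Scope the data to the decision at hand
#    (here: a pricing decision for the EU region only).
eu_only = [r for r in relevant if r["region"] == "EU"]

# 3. If the nature of the decision tolerates it, a small
#    representative sample may suffice instead of the full set.
random.seed(42)  # fixed seed so the sketch is repeatable
sample = random.sample(eu_only, k=500)

avg_amount = sum(r["amount"] for r in sample) / len(sample)
```

Notice how quickly the "big" data shrinks: 10,000 records become 5,000 relevant ones, and a 500-record sample is enough to estimate an average for this kind of decision.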


All of the above are simple tasks you've probably already done in your day-to-day work, but when dealing with big data we can become so engrossed in it that we think every bit of data is important, and we end up creating our own variation of big data and confusing ourselves in the process.

Hope you found this post informative.