The Big Data bandwagon has picked up momentum, and consultants, professors, organisers, writers, pundits, crooks, cheats and equity firms are all queuing up to get aboard. Rarely has a bandwagon attracted so much attention and so many passengers.
The basic premises of big data are:
There is a common perception that more data is always better than less.
Greater volume, variety and velocity of data create further avenues of potential knowledge.
It is possible to answer all questions through big data, which also makes it far easier to predict the future.
The questions that still create ripples are these: can we build an accurate picture of the future through big data, or is it just a glittering mirage shimmering far away in the heat of a desert? Is it the final truth, or a bandwagon of overstated commitments and mirage dreams?
The truth is that the solution to business problems, and the identification of strategic opportunities, often lies within the bounds of little data, not big data. You do not need to boil the ocean to measure its salt content, nor eat the whole steer to know it is tough.
Corporate decision makers would be better served by relying on tools from the world of little data, which are tried and tested, rather than on illusory big data. Sampling theory states that, with a random sample, it is possible to measure the behaviour or mood of an entire population by surveying only a small number of people.
A sample of 2,000 suffices to predict the winner of the Lok Sabha elections. A random sample of 200-300 is sufficient to predict the response of the whole population to a new product.
These examples of little data show that survey research is comparatively inexpensive yet quite accurate. However, it depends on the researcher's knowledge of the source, stimulus, context and history. It is also important that the measuring instruments are tried and tested, and that the researcher has normative data, quality assurance and controls.
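A minimal sketch of why such small samples work, using the standard margin-of-error formula for a proportion estimated from a simple random sample (the function name here is illustrative, not from any particular survey toolkit):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion estimated
    from a simple random sample of size n.
    p = 0.5 is the worst case; z = 1.96 is the 95% normal quantile."""
    return z * math.sqrt(p * (1 - p) / n)

# A sample of 2,000 pins the estimate down to within about 2.2 points:
print(round(margin_of_error(2000) * 100, 1))  # 2.2

# Even 250 respondents keep the error near 6 points:
print(round(margin_of_error(250) * 100, 1))   # 6.2
```

The key point is that the error shrinks with the square root of the sample size, independent of how large the population is, which is why a few thousand respondents can stand in for hundreds of millions of voters.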
It is true that the relevance of data, information, choices and perceptions varies from one individual to another; nevertheless, collecting information from a part of the population can give a general view of people's perceptions and choices. So, by collecting samples, we can analyse and complete our research.
Data is the primary requirement of any research work; most importantly, the analysis is incomplete without a proper amalgamation of data. But I believe that, with analytical tools, we can assess the real value of the data behind any piece of research.
That said, keeping a big-data wagon is not, in my view, a bad or useless thing to do. In fact, I always try to collect extra data so that I gain a better understanding of my work; then, if my guide asks me to change something or to add more information at the last moment, I do not have to scramble to collect it.