Types Of Big Data And Large Data

Big data is a term used to describe collections of data, structured and unstructured, so huge that traditional database management techniques are rendered useless: both storing and analysing this data become a problem. Big data comes in various types and is commonly characterised by four Vs: Volume, Velocity, Variety and Veracity.

The storage problem was first addressed by the Google File System, a distributed file system that Google created for its own use to store data automatically across many machines. With the amount of big data multiplying every minute, technologies like cloud computing came into the picture; among them, Hadoop stands out. Hadoop is an open-source platform on which big data can be stored, managed and analysed. Organisations like Amazon and LinkedIn use the Hadoop framework to connect with their customers.

Another key technology is MapReduce. MapReduce maps the input data into intermediate records and then reduces them into the final output, with clusters of computers working in parallel to process the data independently.

With the further growth of data, more technologies will be explored, and if things proceed as they are, there may soon come a time when we will not even have to look for jobs: the right job for our capabilities will be offered to us based on an understanding of our IQ, our likes and dislikes, our nature, our hobbies, and so on. Big Data comprises data, structured and unstructured, of such vast quantities that managing it, or even properly utilising it, poses a problem.
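The map and reduce phases described above can be sketched in miniature with a single-machine word count. This is a toy model of the idea, not Hadoop's actual API; the function names and the in-memory shuffle step are illustrative assumptions, standing in for work a real cluster would distribute across machines.

```python
from itertools import groupby
from operator import itemgetter

# Map step (illustrative): emit a (word, 1) pair for each word in a line.
def map_phase(line):
    for word in line.split():
        yield (word.lower(), 1)

# Reduce step (illustrative): sum all the counts emitted for one word.
def reduce_phase(word, counts):
    return (word, sum(counts))

def mapreduce_word_count(lines):
    # Map: each line is processed independently (in parallel on a real cluster).
    intermediate = [pair for line in lines for pair in map_phase(line)]
    # Shuffle: group the intermediate pairs by key, as the framework
    # would do between the map and reduce phases.
    intermediate.sort(key=itemgetter(0))
    # Reduce: collapse each group of pairs into a single output record.
    return dict(
        reduce_phase(word, (count for _, count in group))
        for word, group in groupby(intermediate, key=itemgetter(0))
    )

counts = mapreduce_word_count(["big data is big", "data is everywhere"])
print(counts)  # {'big': 2, 'data': 2, 'everywhere': 1, 'is': 2}
```

Because every map call and every reduce call touches only its own slice of the data, the framework can scatter them across thousands of machines: that independence, not the word counting itself, is what makes the pattern suit big data.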