Big data management is the organization, administration and governance of large volumes of both structured and unstructured data belonging to an organization.
The goal of big data management is to ensure a high level of data quality and accessibility for business intelligence and big data analytics applications. Corporations, government agencies and other organizations employ big data management strategies to help them contend with fast-growing pools of data, typically involving many terabytes or even petabytes stored in a variety of file formats. Effective big data management particularly helps companies locate valuable information in large sets of unstructured and semistructured data from various sources, including call detail records, system logs, sensors, images and social media sites.
Characteristics
- Variety: To the existing landscape of transactional and demographic data such as phone numbers and addresses, information in the form of photographs, audio streams, video, and a host of other formats now adds a multiplicity of data types, about 80% of which are completely unstructured.
- Volume: This trait refers to the immense amounts of information generated every second via social media, cell phones, cars, transactions, connected sensors, images, video, and text. At terabyte, petabyte, or even zettabyte scale, these volumes can only be managed by big data technologies.
- Velocity: Information is streaming into data repositories at a prodigious rate, and this characteristic alludes to the speed of data accumulation. It also refers to the speed with which big data can be processed and analyzed to extract the insights and patterns it contains. These days, that speed is often real-time.
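To make the velocity trait concrete, the sketch below shows one common way to summarize a fast-arriving stream in near real time: a rolling time window that counts only recent events instead of storing and batch-processing everything. The class name, the 60-second window, and the sample timestamps are all hypothetical illustrations, not part of any particular big data product.

```python
from collections import deque


class RollingWindowCounter:
    """Counts events seen within the last `window_seconds` seconds.

    Illustrates handling velocity: events arrive continuously and are
    summarized on the fly, with old events evicted as they age out.
    """

    def __init__(self, window_seconds=60):
        self.window_seconds = window_seconds
        self.timestamps = deque()  # arrival times, oldest first

    def record(self, timestamp):
        self.timestamps.append(timestamp)
        self._evict(timestamp)

    def count(self, now):
        self._evict(now)
        return len(self.timestamps)

    def _evict(self, now):
        # Drop events that have aged out of the window.
        while self.timestamps and now - self.timestamps[0] > self.window_seconds:
            self.timestamps.popleft()


# Simulate a burst of sensor readings with explicit timestamps (seconds).
counter = RollingWindowCounter(window_seconds=60)
for t in [0, 10, 20, 70, 80]:
    counter.record(t)
print(counter.count(now=80))  # prints 3: only the events at 20, 70 and 80 remain
```

The same windowing idea underlies stream-processing frameworks; this in-memory version just makes the eviction logic visible.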
Big data management involves various processes, such as the following:
- Performing regular database maintenance to sustain performance.
- Monitoring and ensuring the availability of all big data resources through a centralized interface/dashboard.
- Implementing and monitoring big data analytics, big data reporting and other similar solutions.
- Ensuring the security of big data repositories and controlling access to them.
- Using techniques such as data virtualization to reduce data duplication and improve big data operations with faster access and less complexity.
- Ensuring the efficient design and implementation of data life-cycle processes that deliver the highest quality results.
- Ensuring that data are captured and stored from all sources as desired.
- Implementing data virtualization techniques so that a single data set can be used by multiple applications/users simultaneously.
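As a minimal sketch of the data virtualization idea in the last two points, the example below uses SQLite views: one physical table serves two different consumers through tailored virtual datasets, with no copy of the underlying rows ever made. The table, view, and column names are hypothetical, chosen only to echo the call detail records mentioned earlier.

```python
import sqlite3

# One physical dataset held in a single in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE call_records (caller TEXT, region TEXT, duration_sec INTEGER)"
)
conn.executemany(
    "INSERT INTO call_records VALUES (?, ?, ?)",
    [("alice", "EU", 120), ("bob", "US", 300), ("carol", "EU", 45)],
)

# Two "virtual" datasets over the same rows: a filtered slice for one
# team and an aggregate summary for another. Neither duplicates data.
conn.execute(
    "CREATE VIEW eu_calls AS SELECT * FROM call_records WHERE region = 'EU'"
)
conn.execute(
    "CREATE VIEW regional_minutes AS "
    "SELECT region, SUM(duration_sec) / 60.0 AS minutes "
    "FROM call_records GROUP BY region"
)

print(conn.execute("SELECT COUNT(*) FROM eu_calls").fetchone()[0])  # prints 2
print(
    conn.execute(
        "SELECT minutes FROM regional_minutes WHERE region = 'EU'"
    ).fetchone()[0]
)  # prints 2.75
```

Both consumers query their view as if it were a table of its own, so an update to `call_records` is immediately visible to every view, which is the simultaneity the last bullet describes.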