The term "big data" once inspired awe in readers and users alike. However, with the deluge of digital information growing at such a pace, the term no longer seems adequate. Maybe we should call it gigantic data instead? Gigabit brought this topic to our attention in its article, "Three barriers to effective data analytics."
Many companies have solved, or largely solved, the problem of sheer data volume. Cloud storage makes it easy to store data, and many organizations have invested heavily in data warehouses to manage it. Unfortunately, data flowing in from a growing number of sources has created data islands across databases and legacy archival systems. The result is inefficient data duplication and multiple, disconnected repositories with inconsistent structures.
So how do we manage that? The first and most important step is to take an organization-wide approach to data strategy and architecture. It is possible to build on existing investments in data warehouses, which may already hold huge volumes of historical data and have established processes for ingesting new data.
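To make the data-island problem concrete, here is a minimal sketch (purely illustrative; all names and schemas are hypothetical) of consolidating records from two disconnected sources with inconsistent structures into one canonical schema, deduplicating on a shared key:

```python
def normalize(record, field_map):
    """Rename a record's fields to the canonical schema."""
    return {canonical: record[source] for canonical, source in field_map.items()}

def consolidate(sources):
    """Merge records from several sources, keeping one copy per ID."""
    merged = {}
    for records, field_map in sources:
        for record in records:
            row = normalize(record, field_map)
            merged.setdefault(row["id"], row)  # first source seen wins
    return list(merged.values())

# Two hypothetical "data islands" describing the same customers
# with different field names.
crm = [{"cust_id": 1, "full_name": "Ada Lovelace"}]
legacy = [{"ID": 1, "NAME": "A. Lovelace"}, {"ID": 2, "NAME": "Alan Turing"}]

rows = consolidate([
    (crm, {"id": "cust_id", "name": "full_name"}),
    (legacy, {"id": "ID", "name": "NAME"}),
])
# rows now holds one record per customer in a consistent structure
```

In practice this mapping-and-deduplication step is what an organization-wide data architecture standardizes, so every new source is ingested into the same agreed schema rather than spawning another island.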
Melody K. Smith
Sponsored by Access Innovations, the world leader in thesaurus, ontology, and taxonomy creation and metadata application.