The new ‘Data Economy’
In some cases, data itself has become more valuable than the products a company sells, becoming its own ‘economy’ that can be sold, traded or commodified. It’s understandable, then, that large organisations are more eager than ever to ensure they are capturing data in the right way and using it to its full potential.
Whilst the opportunity is clear, realising the full potential of your data is a major challenge for most organisations.
Both the volume and complexity of the data sets that we are dealing with are growing at an exponential rate. While many organisations are looking at how they manage this problem, we are seeing a growing number looking at it as a strategic opportunity.
For many Data Professionals, 2018 brings to the table a directive to unlock the potential of organisational data and develop a platform capable of driving ‘data-led’ decision making. This directive is typically driven by the strategic management teams, as its promises of growth, greater ROI, improved processes and superior insights for strategic planning mean data is a critical element for any organisation looking to become an industry leader.
The first part of the equation is usually the easiest part; whether traditional data warehouses, data lakes or hybrid models, ingesting and storing data is quite achievable. But the risk lies in applying the correct management and ongoing governance to ensure the data maintains its integrity, is trusted by the business, and is able to provide the most valuable output.
There is no point in collecting and storing data if it is impossible to extract or use in a relevant manner. Simply storing large volumes of data without an effective governance framework kicks the issue down the line, compounds problems and risks creating a vast data swamp.
Data warehouse or data lake? Or is there a better approach?
With such obvious opportunity and a growing directive from the executive team, how do you start developing a platform that can keep pace with the agile business processes it must serve?
Traditional enterprise data warehousing has struggled to keep pace with the rate of change, scale and variety of today’s workloads. Anyone who has dealt with enterprise data warehouses understands the pain of managing and interacting with them.
Loading, validating and managing data in an ever-changing business environment is an ongoing, yet specialised and complex, task. This means it is costly, frustrating and laborious for both the business and IT, and struggles to keep pace with modern agile business practices.
At the other end of the scale, data lakes are either the next logical step in an organisation’s information management journey or an easier and more flexible alternative to building a data warehouse.
The ability to avoid the cost and effort of defining data structures up front is a significant advantage. You can achieve huge time and cost savings by deferring this task to the point where the data will actually be used, and to the people who know it best: the business people doing the analysis.
However, this is where you must safeguard the integrity of your data and avoid the agility/integrity trade-off. Failure to understand the need for governance, quality, context and security/access can quickly pollute your data lake, muddying the waters and turning it into a data swamp.
Rebalancing the data agility/data integrity trade-off
If neither the highly structured traditional enterprise data warehouse approach nor the more flexible data lake approach provides an answer to the equation, then can any data warehouse realistically be expected to keep up with the pace of today’s business?
This was a question pondered by Dan Linstedt, the creator of the Data Vault 2.0 methodology.
Data Vault 2.0 is an innovative data architecture that has been proven at the multi-petabyte solution level, spanning both relational and NoSQL data stores in a single logical data warehouse. This allows you to store data where it is most suited, but with a single management framework governing the integrity of all your data.
Data Vault 2.0 was built from the ground up to deal with the real-world data challenges that most businesses are facing today. Data Vault 2.0 delivers improved total cost of ownership, greatly enhanced operational agility and traceable data governance.
When you build a Data Vault solution, you don’t have to re-engineer to cope with new business demands. It’s a foundation that can be used to scale your business when you want to, without re-engineering your IT solution.
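To make that foundation concrete, here is a minimal sketch of the core Data Vault pattern — hubs holding nothing but business keys, and satellites holding descriptive attributes versioned by load date — using Python’s built-in sqlite3. The table and column names are illustrative examples of the pattern, not an official Data Vault 2.0 schema.

```python
# Illustrative hub/satellite sketch of the Data Vault pattern.
# Table and column names are hypothetical, not a prescribed schema.
import sqlite3
import hashlib
from datetime import datetime, timezone

def hash_key(business_key: str) -> str:
    # Data Vault 2.0 commonly derives surrogate keys by hashing the business key.
    return hashlib.md5(business_key.encode("utf-8")).hexdigest()

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Hub: one row per unique business key, nothing else.
    CREATE TABLE hub_customer (
        customer_hk   TEXT PRIMARY KEY,
        customer_id   TEXT NOT NULL,
        load_date     TEXT NOT NULL,
        record_source TEXT NOT NULL
    );
    -- Satellite: descriptive attributes, versioned by load date.
    CREATE TABLE sat_customer_details (
        customer_hk   TEXT NOT NULL REFERENCES hub_customer(customer_hk),
        load_date     TEXT NOT NULL,
        name          TEXT,
        city          TEXT,
        record_source TEXT NOT NULL,
        PRIMARY KEY (customer_hk, load_date)
    );
""")

now = datetime.now(timezone.utc).isoformat()
hk = hash_key("CUST-001")
conn.execute("INSERT INTO hub_customer VALUES (?, ?, ?, ?)",
             (hk, "CUST-001", now, "crm_system"))
conn.execute("INSERT INTO sat_customer_details VALUES (?, ?, ?, ?, ?)",
             (hk, now, "Acme Pty Ltd", "Sydney", "crm_system"))

# Changed attributes arrive as new satellite rows, so history is preserved
# and the hub never needs re-engineering when the business changes.
row = conn.execute("""
    SELECT h.customer_id, s.name, s.city
    FROM hub_customer h JOIN sat_customer_details s USING (customer_hk)
""").fetchone()
print(row)  # ('CUST-001', 'Acme Pty Ltd', 'Sydney')
```

Because new sources and attributes land as additional satellites rather than changes to existing tables, the model can absorb new business demands without the re-engineering described above.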
Rebalancing the data integrity/agility equation and unlocking the strategic value of your organisational data requires understanding the potential of Data Vault 2.0 and why it is the methodology of choice for many of the world’s leading organisations.
Download our eBook Unlocking Data Vault 2.0 to understand how deploying Data Vault 2.0 can drive strategic decision-making and cut TCO by over 50%.
If you’re a data professional looking to gain certification in this best practice methodology for data warehousing, register for one of our upcoming courses in Sydney, Brisbane & Melbourne.