Enterprises struggle to identify the critical business data needed to extract actionable insights, wasting resources and hindering decisions and profitability.
Money down the drain
Together, compute and data fuel modern business operations, but data management is hugely painful. And there is too much of it, creating new traffic jams and new bottlenecks every second.
A bottleneck at the root of modern commerce
Three interlocking operational problems create the root bottleneck:

- Converting data assets into current assets is not possible;
- which makes it hard to rapidly calculate accurate costs and profitability;
- which, in turn, makes it hard to continuously improve data quality.
And there are interlocking cultural barriers, because we all speak different languages:

- Data Architects and Engineers need cost metrics to optimize cleaning.
- Data Scientists need to optimize performance to achieve extreme accuracy.
- Business Teams need assets with operating value to optimize for profitability.
Intangibility is the operating barrier
Intangible assets are assets that lack physical substance. Examples include customer preferences, patents, actionable insights and predictions, brand reputation, and the tacit knowledge and expertise embedded in a business's culture, operations, and know-how.

Intangible assets have immense business value, often as the driver of decision-making, but that value is hard to measure, manage, and operationalize. Intangible assets such as data are hard to use, store, and transfer: they derive value from knowledge, rights, and future benefits; their value can be subjective and context-dependent; and that value is vulnerable to technological change and shifts in consumer behavior.

The significance of data is that it is a fundamental asset: its value influences and drives the efficiency, operating cost, and value-add of compute, the cost to deliver electronically generated goods and services.

The challenge with data is that it possesses no intrinsic value, so it cannot be used to systematically measure and improve its own quality. This makes data dirty and awkward to clean and prepare, which in turn makes it difficult to profitably standardize and streamline data-driven systems.
There is no good solution today, only high-touch, boutique workarounds, because currently "there is no standard to measure the value of data." (1)
The need for rapid intrinsic monetization
Compute and data costing are hard to achieve, and data profit-and-loss calculations are a guess. Teams rely on traditional valuation methods that were never designed to improve industrial operations; in fact, these are workarounds for the inability to rapidly determine value-add.

We need business metrics so we can apply statistical improvement methods, such as the 80/20 Rule (Pareto analysis), to data quality. But that requires a cash value to supply the "x" and "y" variables. This is not possible today, simply because there is no denominator: intangible assets have $0.00 daily value.
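To make the point concrete, here is a minimal sketch of the 80/20 screen described above. It assumes the missing denominator exists, i.e. a hypothetical daily dollar value per data asset; the asset names and figures are invented for illustration:

```python
# Hypothetical daily dollar values per data asset (invented figures;
# in reality these denominators do not exist today, which is the problem).
assets = {
    "customer_events": 4200.0,
    "product_catalog": 1800.0,
    "clickstream": 950.0,
    "support_tickets": 400.0,
    "legacy_exports": 150.0,
}

def pareto_subset(values, threshold=0.80):
    """Return the smallest set of assets whose combined value reaches
    `threshold` of the total value (the classic 80/20 screen)."""
    total = sum(values.values())
    ranked = sorted(values.items(), key=lambda kv: kv[1], reverse=True)
    chosen, running = [], 0.0
    for name, value in ranked:
        chosen.append(name)
        running += value
        if running / total >= threshold:
            break
    return chosen

print(pareto_subset(assets))
# → ['customer_events', 'product_catalog']
```

With real cash values in place of the invented ones, a team could focus cleaning effort on the few assets that carry most of the value, rather than treating all data as equally worth the cost.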
References
(1) Mike Fleckenstein, Chief Data Strategist, MITRE, "A Review of Data Valuation" (Harvard Data Science Review, MIT Press, 2023): https://hdsr.mitpress.mit.edu/pub/1qxkrnig/release/1