"Organizing data" is commonly presented as a choice among frameworks. That suits vendors and tidies slides—but it does nothing for profit and loss.

Few debates are as solemn as the one about data architecture. The agenda fills with technical nouns—lake, mesh, fabric—and soon everyone has forgotten that the results will be priced in economic verbs: sell, save, avoid.

The question that should dominate the discussion is "What is our data worth now, and how might its worth rise?" Structure matters only insofar as it is the cheapest way to make that number go up: through concrete revenue uplift, cost avoided, capital saved or risk reduced.

Treating data as an asset requires the same scepticism applied to any asset. If a dataset cannot be tied to a measurable change in cash flows or the probability of an expensive event, it is inventory—carrying costs without turnover.

This lens clarifies many quarrels. Consider a retailer with two competing enthusiasms: a clean re-platforming of the warehouse, and a quick-fix improvement to customer identification that would sharpen offers. The outcome of this debate is not predetermined, but the way to evaluate it is: cost, revenue, risk.
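One way to make that evaluation concrete is a back-of-the-envelope expected-value comparison. The sketch below is illustrative only: the function name and every figure in it are invented, and in practice each input would be an estimate with its own uncertainty.

```python
# Back-of-the-envelope comparison of the retailer's two options.
# All figures are hypothetical, in thousands per year.

def expected_net_value(revenue_uplift, cost_saved, risk_reduction,
                       build_cost, run_cost):
    """Economic verbs in, one comparable number out."""
    return revenue_uplift + cost_saved + risk_reduction - build_cost - run_cost

# The warehouse re-platform: no new revenue, some savings, large build cost.
replatform = expected_net_value(
    revenue_uplift=0, cost_saved=300, risk_reduction=100,
    build_cost=900, run_cost=50,
)

# The identity fix: direct revenue uplift, modest cost to build and run.
identity_fix = expected_net_value(
    revenue_uplift=450, cost_saved=0, risk_reduction=0,
    build_cost=120, run_cost=60,
)

print(f"re-platform: {replatform}, identity fix: {identity_fix}")
```

The point is not the arithmetic but the discipline: both enthusiasms are forced onto the same scale before either gets a budget.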

None of this requires heroics in measurement. You're not trying to calculate dollars per byte. A workable house metric is a gross data margin: attribute the income and savings of data-enabled features, subtract acquisition, engineering, tooling and compliance costs, and view the ratio over time. Imperfect attribution will not ruin the signal; indifference will.
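The gross-data-margin calculation described above can be written down in a few lines. This is a minimal sketch under stated assumptions: the function name, the quarterly figures, and the choice of a simple ratio are all illustrative, not a standard.

```python
def gross_data_margin(attributed_income, attributed_savings,
                      acquisition, engineering, tooling, compliance):
    """Income and savings from data-enabled features, net of the cost
    of acquiring, engineering, tooling and complying, as a ratio of
    the gross figure."""
    gross = attributed_income + attributed_savings
    costs = acquisition + engineering + tooling + compliance
    return (gross - costs) / gross

# Hypothetical quarterly figures; only the trend over time matters.
quarters = [
    dict(attributed_income=800, attributed_savings=200,
         acquisition=150, engineering=400, tooling=100, compliance=50),
    dict(attributed_income=950, attributed_savings=250,
         acquisition=150, engineering=380, tooling=110, compliance=60),
]
for i, q in enumerate(quarters, start=1):
    print(f"Q{i} gross data margin: {gross_data_margin(**q):.1%}")
```

Imperfect attribution shifts the level of the ratio; it rarely reverses the direction of the trend, which is the signal the metric exists to surface.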

Infrastructure is foundational only if you can estimate its enabling value across multiple use cases. Two practical habits help here: First, assume decay: addresses go stale; models drift; the present value of a dataset falls without maintenance. Second, treat your data as a portfolio: a few speculative bets for compound value, and a flow of frequent wins that pay today's bills.
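The decay assumption can be modelled crudely with a constant fractional loss per year, applied across the portfolio. The decay rates and dataset names below are hypothetical; in practice each rate would be estimated per dataset from how quickly its contents go stale.

```python
def decayed_value(initial_value, annual_decay, years):
    """Value of an unmaintained dataset after `years`, assuming a
    constant fraction of its value is lost each year."""
    return initial_value * (1 - annual_decay) ** years

# Hypothetical portfolio: speculative bets decay fast if neglected;
# slow-moving historical data holds its value longer.
portfolio = {
    "customer-identity graph": dict(initial_value=500, annual_decay=0.25),
    "store-footfall history":  dict(initial_value=120, annual_decay=0.05),
    "campaign-response log":   dict(initial_value=200, annual_decay=0.40),
}
for name, d in portfolio.items():
    print(f"{name}: {decayed_value(years=3, **d):.0f} after 3 years")
```

Even this crude model makes the maintenance argument: the assets with the highest headline value are often the ones whose value evaporates fastest without upkeep.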

To turn the argument into practice, data product managers need fluency in both finance and engineering, and the willingness to say "not yet" to elegant abstractions and "yes" to the unglamorous work that moves a number. Their tools are unromantic: short experiments, explicit acceptance criteria, and a willingness to stop. The mark of seriousness is not the diagram but the kill rate.

None of this denies that structure can enable value. It denies only that structure predicts it. Frameworks are means. When they follow demand, firms buy only the scaffolding they need, when they need it, and can explain the purchase without resort to prophecy. When frameworks lead, programmes expand to fill budgets, and the future pays the bill.

The rule, then, is brief enough to test every meeting on the subject. First ask what the data is worth. Next ask what would raise that worth soonest. Only then ask what structure would achieve it at the lowest expected cost.

Manage the price of data, and let the architecture follow. Value first, plumbing later.