A recent wide-ranging, erudite, and politically salient Substack post by Justin Smith defends the thesis that "we may identify a vast process currently underway whereby we are moving, in our efforts to understand reality and to harness it to our ends, from its to bits. It seems to me the real story of the past few decades, a story that easily explains the rise in popularity of the simulation argument...is of a fairly total regime change, from physics to informatics —here I am adapting the French informatique, which names the research domain in question far more appropriately than "computer science"— as the Prima Scientia or ultimate science of reality." (The post may be gated, so you will have to trust me on its interest or take a subscription to Justin's excellent Substack, The Hinternet.) Presumably, by 'regime' Smith is suggesting that the wide world of learning is fundamentally ordered, and (he makes an explicit nod to Hacking here) ordering, and authoritative in some sense.
In these digressions, I have been joking over the years that economics will soon be displaced by computer science. But, as is usual, while I have been focusing on a change in one tree, Justin notices the shifts in the wider forest.
Unfortunately, Smith never defines 'information' (or its non-trivial cognate, 'units of information processing'), despite its playing a key role in his overall argument. That's probably a feature and not a bug of his argument, because part of his point is that, in virtue of the significance and fluidity (within constraints) of symbolic representation, information is not (ahh) ontically stable, eternally out there.
'Inform' has Latinate roots (to give form to, to shape, to educate, etc.), and, perhaps, there are echoes of the scholastic doctrine of hylomorphism lurking here. (Not so long ago, I was pleased to learn that computer scientists use 'hylomorphism' to refer to a certain kind of recursive function, which struck me as very funny and profound at once.) While I was barely consciously contemplating the possible significance of this, through the mystery of the associative process and the rapidity of modern networks (and Google), I looked up and downloaded George Stigler's (1961) "The Economics of Information." This paper has over 12,000 citations, and, alongside his work on economic regulation, it was crucial to his importance to the development of economics (for which he received a Nobel).
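For the curious: in functional programming, a 'hylomorphism' names the composition of an unfold (an anamorphism, which generates a structure from a seed) with a fold (a catamorphism, which collapses it). A minimal Python sketch of the idea (my illustration, not anything from Smith or Stigler):

```python
def hylo(coalg, alg, init, seed):
    """Hylomorphism: an unfold composed with a fold.

    `coalg` (the coalgebra) grows the structure: given a seed it returns
    either None (stop) or a (value, next_seed) pair. `alg` (the algebra)
    consumes it: it combines a value with the folded remainder.
    """
    step = coalg(seed)
    if step is None:
        return init
    value, rest = step
    return alg(value, hylo(coalg, alg, init, rest))

# The textbook example: factorial as a hylomorphism.
# Unfold n into the (virtual) list n, n-1, ..., 1; fold it with multiplication.
def factorial(n):
    return hylo(
        lambda k: None if k == 0 else (k, k - 1),  # coalgebra: count down
        lambda x, acc: x * acc,                    # algebra: take the product
        1,
        n,
    )
```

The "matter" (the intermediate list) never exists as a whole: it is consumed as it is produced, which is part of why the scholastic-sounding name is apt.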
Somewhat amusingly, Stigler also fails to give a definition of information. He tends to use 'knowledge,' 'advertising,' subjective beliefs, and the comparison of prices by consumers as ways of understanding and describing information. Information can be 'pooled' and has a 'flow.' Most famously, information is costly (and can be bought and sold).
That information is costly is both part of the cause and part of the effect of the dispersion of prices. (In an older tradition of economics -- as David M. Levy taught me, one can think of Wicksteed -- prices were thought to be uniform in a market at a given time; although earlier, during the 18th century, Adam Smith and his contemporary, James Steuart, thought that prices dispersed, too, although their analyses are not identical.) I don't mean to suggest that this exhausts the role of information in Stigler's analysis of economic life, because it connects means and ends, but more about that some other time.
In Stigler's paper, I encountered a fascinating passage that is worth quoting and discussing (and this is the real point of this digression):
The maintenance of appreciable dispersion of prices arises chiefly out of the fact that knowledge becomes obsolete. The conditions of supply and demand, and therefore the distribution of asking prices, change over time. There is no method by which buyers or sellers can ascertain the new average price in the market appropriate to the new conditions except by search. Sellers cannot maintain perfect correlation of successive prices, even if they wish to do so, because of the costs of search. Buyers accordingly cannot make the amount of investment in search that perfect correlation of prices would justify. The greater the instability of supply and/or demand conditions, therefore, the greater the dispersion of prices will be.
In addition, there is a component of ignorance due to the changing identity of buyers and sellers. There is a flow of new buyers and sellers in every market, and they are at least initially uninformed on prices and by their presence make the information of experienced buyers and sellers somewhat obsolete.
The amount of dispersion will also vary with one other characteristic which is of special interest: the size (in terms of both dollars and number of traders) of the market. As the market grows in these dimensions, there will appear a set of firms which specialize in collecting and selling information. They may take the form of trade journals or specialized brokers. Since the cost of collection of information is (approximately) independent of its use (although the cost of dissemination is not), there is a strong tendency toward monopoly in the provision of information: in general, there will be a "standard" source for trade information. (Stigler 1961, 220)
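Stigler's mechanism in the first paragraph of the quote can be made concrete with a toy search model. The sketch below is my construction, not Stigler's own formalism: asking prices are uniform on [10, 20], each additional quote costs a fixed fee, and a buyer samples whatever number of quotes minimizes expected total outlay (best price found plus search costs). Raising the search cost shrinks the optimal amount of search, which is exactly why costly information lets dispersion persist.

```python
import random
import statistics

random.seed(0)

def expected_best_price(n_quotes, trials=10000):
    """Monte Carlo estimate of the expected lowest asking price when a
    buyer samples n_quotes prices drawn uniformly from [10, 20]."""
    return statistics.mean(
        min(random.uniform(10, 20) for _ in range(n_quotes))
        for _ in range(trials)
    )

def optimal_searches(search_cost, max_n=15):
    """Number of quotes minimizing expected outlay:
    (expected best price found) + (cost per quote) * (quotes taken)."""
    return min(
        range(1, max_n + 1),
        key=lambda n: expected_best_price(n) + search_cost * n,
    )

# Cheap search -> buyers canvass many sellers; dear search -> few.
cheap = optimal_searches(0.1)
dear = optimal_searches(1.0)
```

With uniform prices the expected best of n quotes has the closed form 10 + 10/(n + 1), so the optimal number of searches is roughly sqrt(10/c) − 1 for search cost c: about nine quotes at c = 0.1 but only two at c = 1.0. The less buyers search, the less pressure there is on sellers to keep prices correlated.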
Chicago economists are not exactly known for assuming that any market will tend toward a monopoly. As I noted last week, in the 1950s Stigler's work was, in fact, prominent for undermining empirically the Marxist (and Marx-inspired) thesis that markets naturally tend to monopoly.
Now, later, when (in part inspired by the work of the sociologist Merton and in part because of his interest in which economics PhDs would be hired where) Stigler turned to the sociology of scientific knowledge, he tackled this tendency toward monopoly in knowledge production head on. I will return to that some other time. But in the 1961 paper he leaves the point in the quoted passage alone.
It would be tempting here to discuss the role of Google or Wikipedia or the Stanford Encyclopedia of Philosophy as standard sources of information in our ecologies. (And to what degree ChatGPT can displace these or become one itself.) But I want to close with an important broader context in which I situate Stigler's passage.
The liberal tradition — with which Stigler was very familiar — understands a core function of the state to be that of a machinery of record: a collector and disseminator of accurate public data. A lot of our social practices, inside and outside the market, presuppose a social infrastructure in which the machinery of record is reliable, allowing public authorities and private actors to plan their activities. For that to happen, the public must be well-informed, and the only way citizens can possibly be well-informed on complex matters of policy is for state experts and bureaucracies to organize and process information. Elsewhere, in joint (forthcoming) work with Nick Cowen, I have argued that Walter Lippmann's (1922) Public Opinion puts this vision forward.[1]
In fact, the germ of Lippmann’s idea is expressed in the closing paragraph of John Stuart Mill’s and Harriet Taylor’s On Liberty (1869, Ch. 5), where they claim that “the greatest dissemination of power consistent with efficiency” should be allied with “the greatest possible centralisation of information, and diffusion of it from the centre.” In context, it is clear (recall) that they are responding to Tocqueville’s fear that preventing democratic despotism through local self-government makes an incompetent art of government inevitable.
So, lurking in Stigler (1961) is a causal account that both explains the origin of the state and justifies the state's continued existence as a (perhaps even natural) authoritative monopoly provider of the machinery of record. (In fact, this is how Adam Smith explains the epistemic origin of the state: as guarantor of measures and weights, and of the soundness of currency.) And even in a bit-rich world this will remain so, indeed all the more necessarily.
Of course, you may think I have shifted too quickly from a monopoly enterprise to a government. Marxists and Libertarians may object that the account hinted at here effaces the violence of the state. This is undoubtedly so, but I consider this a feature not a bug of the argument.
[1] In the age of cheap computer power and powerful data collection, private institutions and individuals are capable of organizing and disseminating complex and large amounts of data. But in general they are not capable of coordinating public policy authoritatively.
The flipside of this attitude about "its to bits" can be found in this good analysis from Tim Lee: https://www.understandingai.org/p/software-didnt-eat-the-world