Mining Idioms in Thirdness

Dennis Bouvard (@dennisbouvard)

June 26, 2024

The origin of humanity is the creation of the ostensive sign and the future of humanity is the endless generation of new ostensives. When I speak of “technology” as the stack of scenes, scenic design and the establishment of pedagogical platforms, that is what I am referring to: scenes are designed so as to bring something new into view, something we could point to; scenes are stacked so that each scene can subcontract to other scenes the production of new ostensives out of what is tacit in existing scenes; and pedagogy, or learning, is nothing other than being able to say, jointly with others, “this is the same,” with regard to something that only exists as something to point to because of the configuration of that scene. Writing, or inscripture, has as its end the generation of idioms, or self-referencing, self-constituting, world-creating discourses that maximize the possibility of new ostensives. What I’m writing now should serve to reconfigure some scene I can’t imagine centuries from now. I work with and against David Olson’s concept of metalanguage, as it is instantiated in Francis-Noël Thomas and Mark Turner’s Clear and Simple as the Truth, which is predicated on the fiction that reader and writer stand on the same scene, with the writer pointing to the doings and happenings on that scene, precisely because it is both true as a theory of composition and to be redesigned insofar as it is the scene of metaphysics. Note that there can be no learning on that scene of writing: both reader and writer are always already configured so as to see what is, after all, right in front of us and immediately intelligible once seen. That scene starts getting frayed at the edges once we acknowledge that it is constructed and itself a product of technology, or the stack of scenes, and that those scaffolding and supplemental scenes can interfere with the “view” offered on this one, and do so irremediably once attention is directed toward them. The basic configuration remains intact, which is what makes the fiction of the scene of writing so powerful, but now we have several scenes, upon which the writer might be taken to be standing with various readers, some of whom are to be figured as themselves writing on other scenes, which themselves open up. This line of thinking gets us to the point where it’s most economical and productive to think of writing for the algorithms that will determine the configuration of all those scenes. If we’re thinking, then, in terms of producing idioms that enter into the algorithms in such a way as to maximize the pedagogical performativity of the most performative platforms (“performative” in the linguistic sense of “doing things with words,” or creating scenes), then the models we’re working with are those of protocols and mining—in the sense of authenticating an exchange in such a way as to generate currency for future exchanges.

This can be (hypothetically) operationalized through the (hypothetical) Thirdness apparatus. We have here a prediction market, where what is predicted is the judgment to be rendered on actual or potential conflicts articulated as cases. We don’t want to issue odds, so we try to construct cases that are as close to even as possible, which will often involve constructing cases around conflicts at the margins of what would ordinarily be taken to be the event in question: for example, rather than predicting the result of a trial, or even a would-be trial, which will almost always tilt toward guilty or not guilty, we might construct a case in which we imagine the plaintiff suing a media outlet for incitement over a particular claim about the case. These marginal cases, if considered pedagogically, will be more important than the center of the case because they offer routes of circulation back to the tributary center, where data pertinent to the stacking of scenes and to preparing the bringing of more cases to more effective judgment (i.e., capping the vendetta on one end and aborting the antinomic vendetta through law, the attack on the nomos, on the other end) is gathered, filtered, and mapped. Plus, judgment is here a pedagogical site, where the future officer class can learn to transform resentments into social energies.
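To make the shape of such a near-even case concrete, here is a minimal sketch; the Case structure, the probability estimate, and the tolerance for “evenness” are all illustrative assumptions rather than any actual Thirdness specification.

```python
from dataclasses import dataclass

@dataclass
class Case:
    """A hypothetical Thirdness case: a conflict framed as a binary judgment."""
    description: str      # the conflict, articulated as a case
    outcome_a: str        # one possible judgment
    outcome_b: str        # the other possible judgment
    estimated_p_a: float  # current estimate that outcome_a will be the judgment

def is_marginal(case: Case, tolerance: float = 0.05) -> bool:
    """A case is worth posing only if it sits close to even odds.

    If the estimate tilts too far one way, the case would be reframed
    around a conflict at the margins of the original event (e.g., suing
    the media outlet for incitement rather than predicting the verdict).
    """
    return abs(case.estimated_p_a - 0.5) <= tolerance

# Example, using the reframed case from the text and an assumed estimate.
case = Case(
    description="Plaintiff sues a media outlet for incitement over a claim about the trial",
    outcome_a="Judgment for the plaintiff",
    outcome_b="Judgment for the outlet",
    estimated_p_a=0.52,
)
print(is_marginal(case))  # True: close enough to even to be worth posing
```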

So far, I’ve been speaking of this as a bet one would lay down once a case is announced (with a time frame for the case to be studied and judgment rendered)—one is betting on one’s own greater discernment into the case, the knowledge base drawn upon, and the intellectual habits of the team involved in rendering judgment. I consider this all to be critical “human capital” development of an unprecedented type. I’ve been thinking about these bets being made in conventional currencies, with tokens or coupons being issued that turn into a specifically Thirdness currency. Since Thirdness aspires to become the world company (either by taking over all other companies or being taken over by a company more capable of actualizing the Thirdness program), it imagines Thirdness currency becoming the only currency. This project is not only compatible with but made more precise when conjoined with Brute Computation economic forecasting—investing in and eventually taking over and remaking the most powerful companies would generate myriad new cases. Thirdness assumes the intensive formulation of ever lower thresholds for the laws of libel, defamation, fraud, slander and incitement in particular, tending asymptotically towards their abolition. (It might be very interesting to convert intellectual property cases into fraud cases: after all, intellectual property “theft” comes down to presenting as your own work what has been done by others, and the work of sifting through the threads of some intellectual work and determining where due recognition is called for might be less arbitrary than trying to determine who “owns” an idea and the monetary damages owed through “stealing” it—and if the “ideas” are real, public acknowledgement will provide the best material compensation anyway.) Without “cases” there is no reality—this cannot be put strongly enough. An anonymous society built up out of administrative abstractions is, as Blaise Agüera y Arcas points out in Who Are We Now?, the source of all of our “identities,” i.e., resentments, which therefore cannot be made visible and accessible other than through cases, which ultimately return to the juridical. All of our intellectual and therefore economic and technological energies come from simultaneously multiplying cases and dissolving them into new tacit norms, and social improvement comes from decreasing the lag between the multiplication and the dissolving to the point where the two approach articulation within single gestures. It is this practice that is in turn to be converted into one of idiom protocols and mining.
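One could imagine the conversion of conventional-currency bets into Thirdness tokens along the following lines; the Ledger, the conversion rate, and the payout multipliers are placeholders of my own, a sketch under stated assumptions rather than a currency design.

```python
from dataclasses import dataclass, field

@dataclass
class Bet:
    bettor: str
    case_id: str
    stake_usd: float   # conventional currency staked on the case
    prediction: str    # the judgment the bettor expects

@dataclass
class Ledger:
    """Hypothetical ledger converting settled bets into Thirdness tokens."""
    token_balances: dict = field(default_factory=dict)

    def settle(self, bet: Bet, judgment: str, rate: float = 1.0) -> float:
        """Issue tokens on settlement; correct predictions mint more.

        The rate and the 2x/0.5x multipliers are arbitrary placeholders.
        """
        multiplier = 2.0 if bet.prediction == judgment else 0.5
        minted = bet.stake_usd * rate * multiplier
        self.token_balances[bet.bettor] = self.token_balances.get(bet.bettor, 0.0) + minted
        return minted

ledger = Ledger()
ledger.settle(Bet("bettor-1", "case-42", 100.0, "for the plaintiff"),
              judgment="for the plaintiff")
print(ledger.token_balances)  # {'bettor-1': 200.0}
```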

We have this gap between the bet laid and the judgment rendered, and so far this gap is going unutilized. Rather than a simple bet, which leaves the bettor waiting passively for a judgment that then ends his participation, have the bet set in motion a process involving the bettor. The either/or choice directly present to the bettor is linked through an algorithm (revised with each bet) to previous either/or choices (previous bets). This would tie the individual bet to a logic of continuity that the judgment would, on one level, implicitly reject or confirm, while, on another level, it would redirect that logic or path through the data. The bettor could then, in the interim, reject or confirm the paths presented to him, thereby selecting a context for the judgment and for the bettor’s judgment on the judgment. It would be like intervening in the weights to be selected as the program works its way through the neural network on the way to the predicted outcome. The judgment by the Thirdness team, then, would rebound upon the path through the data selected by the bettor and in this way function like a kind of appeals court or peer review, even if implicitly (the Thirdness team would not have access to this mining work of the bettor), adding to the weights given to the various paths, rejected and confirmed. Rejected judgments are fed back into the data along with the confirmed ones, functioning as minority opinions that might be retrieved by some future majority. This is how the system learns.
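A minimal sketch of that loop, assuming the simplest possible representation of a path and of the team’s judgment; the weight multipliers and the judgment_supports_path callable are stand-ins, not a description of any existing system.

```python
from dataclasses import dataclass, field

@dataclass
class Path:
    """A chain of previous either/or choices linked to the current bet."""
    prior_bets: list              # identifiers of earlier either/or choices
    weight: float = 1.0           # how strongly this path informs the prediction
    confirmed_by_bettor: bool = False

@dataclass
class CaseRecord:
    paths: list = field(default_factory=list)
    minority_opinions: list = field(default_factory=list)

    def bettor_selects(self, path: Path, confirm: bool) -> None:
        """In the gap between bet and judgment, the bettor confirms or
        rejects each presented path, selecting a context for the judgment."""
        path.confirmed_by_bettor = confirm

    def apply_judgment(self, judgment_supports_path) -> None:
        """The team's judgment rebounds on the bettor's selections,
        adjusting weights like an implicit appeals court or peer review.
        Rejected combinations are retained as minority opinions rather
        than discarded; this is how the system learns."""
        for path in self.paths:
            agrees = judgment_supports_path(path)
            if path.confirmed_by_bettor and agrees:
                path.weight *= 1.5   # confirmed on both levels
            elif path.confirmed_by_bettor or agrees:
                path.weight *= 1.1   # partially confirmed
            else:
                path.weight *= 0.8   # doubly rejected, but kept
                self.minority_opinions.append(path)
```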

If we have these various paths running from the setting up of an either/or decision to the making of the decision itself, then each of these paths can be generated as a discourse or, in Thirdness terms, an idiom, which can be extracted and turned into “exchangeable” language while also being recirculated back into the data. In other words, each path could be turned into a sentence or series of sentences that would articulate the stack of scenes in an iterable way. So, for example, a sentence would construct some relation between, say, a plaintiff suing a media outlet and a judge’s decision on whether to allow evidence enhanced through a machine learning process. We could think of this by analogy to the stock of commonplaces ancient and Renaissance rhetoricians collected, repeated, remembered, varied and so on in the course of producing their discourses. This is material to think with. These mined idioms can then function as currency within the Thirdness system or, at first, as metacurrency, since we’re still assuming traditional bets using conventional money. The idioms, which come first to the bettor who mined them, give that bettor access, only some of it needing to reach awareness, to the system of weights—the idioms would eventually filter back into the system and become available in some integrated form, but would nevertheless provide a temporary, but very important, advantage to their holder (or hodler, if we like).
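The extraction of an idiom from a confirmed path might look, in the crudest terms, like the following; the sentence template and the registry crediting the miner are deliberately simple stand-ins for whatever articulation of the stack of scenes an actual system would perform.

```python
def mine_idiom(path_elements: list, miner: str, idiom_registry: dict) -> str:
    """Render a confirmed path through the data as a single iterable
    sentence (an idiom) and credit it, first, to the bettor who mined it."""
    idiom = "The case in which " + " is articulated with ".join(path_elements) + "."
    idiom_registry.setdefault(miner, []).append(idiom)
    return idiom

registry = {}
print(mine_idiom(
    ["a plaintiff suing a media outlet for incitement",
     "a judge's decision to admit machine-enhanced evidence"],
    miner="bettor-1",
    idiom_registry=registry,
))
```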

Mining idioms makes Thirdness more closely approximate a mode of central intelligence pegged to singular succession in perpetuity. We’re learning along with the machine now. If we read Peirce in a radical way, every utterance not only makes a prediction or proposes a certain probability regarding future events but weighs in on that probability—it’s not too hard at this point in history to see that predictions are really ways of increasing the likelihood that something will or won’t happen. We can start to think in terms of an exhaustively performative language, in the sense of utterances that don’t mean but make things happen. Here, language, technology and currency all converge—what is tacit and probable is made more explicit and more probable by excluding from one’s discourse anything that doesn’t intimate a particular way of increasing explicitness and probability. This is not cynical rhetoric, where you try to manipulate people within a particular scene and hope they don’t realize they’ve been manipulated until you’re safely off the scene. Those are the idioms that don’t last or get minted. I’ll return to the concept of “inscripture” here, and frame it in terms of the more familiar question of what makes a particular text last—what makes for a “classic”? We can’t separate the intrinsic qualities of the text or work from the institutional structures embedded in traditions that perpetuate certain understandings, interpretations, exchanges, pedagogies, rituals and so on around the work. (An interesting experiment—perhaps there is some Borges story like this?—would be to bring all the canon-making firepower to bear on some utterly mediocre text and see what it would take, what kinds of contextual complexity would need to be constructed, to confer upon it the needed profundity. Of course, there are people who would attack much, if not all, of the existing canon along these lines.) Up until modernity, writers and artists (and there were never that many of them) could anticipatorily curate their work by situating themselves within communities and networks of patronage that increased the likelihood of the perpetuation of their work. It was probably very rare that one became an artist without serving some kind of apprenticeship within such communities and networks—maybe it wouldn’t even occur to one. Now, the proof of concept and work of an idiom would have to be meme-ological, that is, your idioms need to be able to fit into and eventually replace existing discourses. There will be a qualitative dimension to such idioms—they can’t be gossip or topical insults. They need to withstand sustained anthropological scrutiny. They need to be able to convert opposing theses into samples testifying to their own reality. They need to infiltrate all disciplines. They need to consist of a tissue of overlapping and inter-referencing mini-arguments that reinforce without simply repeating each other. Maybe the best way of summing all this up is in terms of the irreversible and undeniable impact the system of idioms would have on a Large Language Model—an impact that is predictable to the extent one has mastered the mode of idiom generation itself and that in the same proportion contributes to greater mastery. The equivalent of a community and network of patronage today would be a company designed to provide intelligence useful in proportion to one’s proximity to power and one’s commitment to making singular succession in perpetuity more likely and imminent.
That is the institutional framework within which the asymptotically self-abolishing idioms of Thirdness can operate and proliferate. But this institutional framework itself needs to be organized so that discourse flows accountably into upper level decisions, or into decisions that could understudy upper level decisions—that is, the institution has to be a pipeline, an ongoing dress rehearsal for governance and intelligence operatives. That’s what Thirdness aims at—in the meantime, oscillations in our approximation to such conditions index the meaning of our idioms. Approximate speaking only of the 50/50 splits, as I explored in The Same Sentence, focus on cases and the cases within cases and the cases behind cases, and you will be idiom mining. That, then, is the protocol: keep approximating, ever more closely, the making of every declarative claim about something in the world that is equally likely and unlikely to be the case, or, more precisely, build performativity into the declarative (embed it with imperatives) in such a way that reality will be permitted to unfold until either one out of two ostensives could be equally reasonably predicted or, again to be more precise, a scene around either of two ways of saying “this is the same” can be imagined with an equivalent number of choreographable moves in either case.
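As a toy formalization of that closing criterion, one might check whether a scene can be imagined around either of two ways of saying “this is the same” with an equivalent number of choreographable moves; the move counts, the tightening schedule, and the function itself are assumptions introduced purely for illustration.

```python
def passes_protocol(moves_scene_a: int, moves_scene_b: int, iteration: int) -> bool:
    """Accept a declarative as idiom mining when the two imaginable scenes
    are choreographable in an equivalent number of moves.

    The tolerance tightens with each iteration, standing in for the
    'keep approximating' of the protocol; the halving schedule is an
    arbitrary placeholder.
    """
    tolerance = max(1, 8 // (2 ** iteration))  # 8, 4, 2, 1, 1, ...
    return abs(moves_scene_a - moves_scene_b) <= tolerance

print(passes_protocol(12, 15, iteration=0))  # True: within the loose early tolerance
print(passes_protocol(12, 15, iteration=3))  # False: the threshold has tightened
```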