Work-related updates:
- Last quarter I completed a first version of the back-end of our app at Q10E Labs, so we have started work on middleware to orchestrate data storage and synchronization across devices and users. It turns out that the combination of user outcomes we are targeting (local-first storage, end-to-end encryption, multi-user collaboration, and an option to self-host to increase user trust) has not been explored before, so we cannot use off-the-shelf solutions. This phase will likely keep us busy for another month or two.
- At the start of the year I became interested in writing a guide to help younger, less experienced people navigate the arrival of LLMs in the tech industry, with regard to their career development. I ran interviews over two months to gather input from people in my network, and have just published the resulting report: LLMs in Software Engineering: What Experienced Practitioners Actually See.
Reading notes follow.
❦❦❦
The following three pieces are all, at heart, about large upcoming shifts in the invisible power dynamics underlying the fabric of Western society.
In Why the Most Valuable Things You Know Are Things You Cannot Say, an anonymous author reminds us that humans learn to do complex things in ways they cannot afterwards explain in language. “The inability to articulate the model is not evidence of a crude model. It is evidence of a model too sophisticated for the transmission channel.” This, in turn, creates a tragedy of governance: “Institutions that allocate authority based on legible credentials systematically promote book-smart people over street-smart people, which works in knowledge domains and fails catastrophically in judgment domains.” The author then points out that many organizations mishandle expertise by trying too hard to codify “level-four”, intuitive knowledge into processes and frameworks that become brittle and vulnerable to changing circumstances. The main takeaway is a warning:
The most valuable forms of human expertise are precisely the forms that resist formalisation. They are learnable but not teachable, acquirable but not transmissible, demonstrable but not articulable. Every attempt to compress them into a transmissible format destroys the information that makes them valuable. The only reliable method of developing them, which is prolonged calibration through direct experience, is the one method that cannot be scaled, standardised, or accelerated beyond a modest degree.
In What's Missing in the ‘Agentic’ Story, Mark Nottingham reminds us that most of the artifacts that connect us through the internet are really intermediaries, acting on our behalf to do things with and for other people. We learned to trust them because, for the longest time, the layers of technology were arranged so that the interests of their producers and users were aligned. That has been changing in recent years, with more and more parties abusing this trust.
Within this context, the author points out that standardized, uniform “user agent” programs, like web browsers, act as a protective shield for the interests of end-users. This has everything to do with bargaining power: forced to negotiate features, prices, and information site by site, individuals would lose; we would give up out of exhaustion. A standardized user agent forces sites to comply with the agent's expectations, lest they lose its multitude of users as an audience.
Against this backdrop, the author points out that LLMs today lack a well-defined “user agent” role. We see too many one-off projects and custom interfaces built on LLMs, so that each individual user gets a wholly different experience of connected internet services. Organizing our interactions in this fully customized way forfeits the shield of collective bargaining against nefarious actors, and we have not yet fully understood the consequences of that choice.
The author proposes:
Creating an agent role for AI – with all of the benefits to the user and market that brings – will require constraining the tools that it can call in a fashion that becomes ‘normal’, so that people can depend on how it behaves. That might involve standard tool APIs with appropriate constraints, permission models, sandboxing (TEE or otherwise), and much more.
All of these issues are currently swept up under the carpet of ‘security’ in many AI discussions. We need to start talking about them with more nuance. Security is a defensive posture; agency is a collective bargain.
In How Silicon Valley Is Turning Scientists Into Exploited Gig Workers, Hirsh Chitkara, writing for The Nation, points out that private equity is currently plundering the scientific arm of the US government (both by assassinating the reputation of academic institutions and by manipulating the government to defund them, in the interest of capturing their scientific researchers), sacrificing the collective, long-term opportunities of public science on the altar of short-term gains for venture capital. Or, as the author puts it:
“The new bargain struck by Silicon Valley conflates wealth generation with progress. It is akin to deciding that a tree’s roots no longer need to be watered because the fruit comes only from its branches.”
The question, for me, is whether, and how, we can restore this balance once the current hype wave dies down.
❦❦❦
I was also happy to learn from:
- De machinerie van de volkshuisvesting (“The machinery of public housing”) by Carlijn Kingma (Follow The Money). How government and private interests orchestrate housing in the Netherlands, what broke in the last twenty years, and what we could do to fix it.
- If America's So Rich, How'd It Get So Sad? by Derek Thompson. The most intriguing finding is that happiness has generally decreased in English-speaking cultures and increased in non-English-speaking ones, though the author points out that the latter also suffered less inflation, so it's not all about language. The proposed explanation is that since 2020 we have had to weather more crises than usual, we are more exposed to them through online news, and we have less ability to weather them through social connection (too much individualism plus poisoned social media) and leisure activities (too much inflation).
- Why Swedish Schools Are Bringing Back Books by Joshua Cohen. This is a bit of good news masquerading as a scientific finding: kids do better in school without devices and with books.
❦❦❦
References:
- LLMs in Software Engineering: What Experienced Practitioners Actually See, April 2026.
- Anonymous, Why the Most Valuable Things You Know Are Things You Cannot Say, Dead Neurons.
- Mark Nottingham, What's Missing in the ‘Agentic’ Story, 2026.
- Hirsh Chitkara, How Silicon Valley Is Turning Scientists Into Exploited Gig Workers, The Nation.
- Carlijn Kingma, De machinerie van de volkshuisvesting, Follow The Money.
- Derek Thompson, If America's So Rich, How'd It Get So Sad?, 2026.
- Joshua Cohen, Why Swedish Schools Are Bringing Back Books, Undark, April 2026.