Along with (in a way that I haven’t been able to entirely compose together) CHERI https://www.cl.cam.ac.uk/research/security/ctsrd/cheri/ , and @cwebber ’s work on Mark-Miller-like capabilities.
Sorry, that sounds conspiratorial and I didn’t mean it to. What I mean is that a successful tool or service builds on the browser, or Android/POSIX say, or inside Twitter or Facebook’s ecosystem. Those environments are rich and complex and provide coherent and comprehensible abstractions for doing almost everything that their creators and sponsors want to do (and would like you to do) in the world.
In earlier years, we were lobbying and picking and working on the abstraction towers we hoped would lead to a better world, but now it feels like those directions have been buried by the buildings built above them. You can think of this as co-option, but another way to see it is as a narrowing of options after a period of abstractions that tended toward general innovation -- a post-Cambrian winnowing? Sorry for all the metaphors; I’m still trying to name and frame this.
I don't know the name for this act, but adversarial interop will do for now. You wire yourself up to the existing abstraction framework, and pull it in a new direction. But you only do that to the degree that the abstraction fails to be able to stop you, and to the degree that you can comprehend what the abstraction presents
Virtual Machines are a good present-day example of this approach: they show the possibilities, and the challenges. Essentially, VMs engulf almost all of the software stack of an entire operating system. Something like Qubes runs them within its (fairly thin) hypervisory world, where it can wrap them in its own model.
VMs have to do a little bit of adversarial interop -- they don't just wrap around the code, they push past its boundaries a little too. They dig a bit deeper into the old abstraction. They'll emulate the machine, but they'll also work out details about individual applications, and bring that to the surface.
We mostly avoid doing this, because, frankly, it's /really difficult/ and /extremely fragile/. Abstractions have interfaces, and those interfaces are stable.
Everything else is what the abstraction is *meant to be hiding*.
But we are at a moment where we *have* to dig deeper -- and perhaps all this computational power can assist us.
We have an abstraction in which *what we want* -- data, patterns, power -- is buried deep, deep below us, and we are trying to build tottering, tall, highly experimental alternatives of our own.
I can run an existing app on my new computing environment, because I can -- given our current powers -- emulate it precisely. But how can my new computing environment *understand* what the app is doing, saying, processing?
It needs to be able to dig deeper: Rather than a bunch of syscalls, we need to be able to recognise certain computations -- turn the emulator's understanding of file and network operations into an understanding of photograph collections, contact updating, calendar sync
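To make the idea above concrete: here's a minimal, hypothetical sketch of "lifting" a raw trace of syscall-like events into semantic labels. Everything in it -- the rule table, the `lift_events` helper, the sample trace -- is invented for illustration, not taken from any real tracer or emulator; a real system would need stateful, far richer pattern recognition.

```python
# Toy sketch: turn low-level (syscall, argument) pairs into higher-level
# semantic events ("photo-library access", "calendar sync", ...).
# All names and rules here are hypothetical illustrations.
import re
from typing import Iterable

# Each rule maps a regex over "syscall:argument" strings to a
# human-meaningful label. Real rules would need to track state
# across many events, not match single lines.
EVENT_RULES = [
    (re.compile(r"openat?:.+/DCIM/.+\.(jpg|heic)$", re.I), "photo-library access"),
    (re.compile(r"openat?:.+/contacts\.db$"), "contact update"),
    (re.compile(r"connect:.+:5232$"), "calendar sync (CalDAV port)"),
]

def lift_events(trace: Iterable[tuple[str, str]]) -> list[str]:
    """Return the semantic labels recognised in a raw trace, in order."""
    labels = []
    for syscall, arg in trace:
        line = f"{syscall}:{arg}"
        for pattern, label in EVENT_RULES:
            if pattern.search(line) and label not in labels:
                labels.append(label)
    return labels

# A fabricated trace fragment, the kind an emulator might observe.
sample_trace = [
    ("openat", "/sdcard/DCIM/IMG_0001.jpg"),
    ("read", "fd=3"),
    ("connect", "203.0.113.7:5232"),
]
print(lift_events(sample_trace))
# -> ['photo-library access', 'calendar sync (CalDAV port)']
```

The hard part, of course, is that real applications don't announce themselves this cleanly -- the whole point of the thread is that this recognition problem is deep.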
In many ways, this is always the overambitious goal of any reverse-engineer, or someone re-writing an existing system.
But, if we were to view this not as a doomed pursuit, but a topic of active research and activism, using modern tools, how deep down could we dive? How large, complex, universal and interconnected are the things we could dredge from the current system?
I ask this because of its similarity to what modern applied commercial and academic computing is working on. They, too, are faced with understanding a large, ever-changing, overcomplex and non-compliant environment, recognising patterns within it, and then turning those patterns into large-scale understanding.
Only in this case, they are applying it to better understand, and therefore control, us.
And in my case, we are applying it to understand and control our own digital environment.
Well, imagine what's happening with RedoxOS or even something as extreme as Interim or GNU Mes. If you start at the very base of the device abstraction *and* obtain complex objects at the very top of that stack, you might be able to construct a consistent, high-level OS that skips a lot of the construction work in-between.
God, again I'm sorry, I should have written this as an essay. I hadn't realised how well-formed it was in my head, and how incoherent it must appear when I'm typing it as I go.
Also, disappointing if you thought I was offering an easier solution, rather than just more hard work!
Still, maybe it's provoked some wild thoughts in you too. Let me know! I'm here or email@example.com.
Okay, the next thing I read after this stream-of-consciousness was this Hacker News thread: https://news.ycombinator.com/item?id=22149866
Which seems to be talking in the same rough ballpark.
@crwbot @mala I see some valuable ideas here and hope I can participate in the next phase of the discussion. In particular, there's really some meat in the idea "observe the stack at multiple levels and pull out the connections between things happening at the machine code level and things happening at the higher level", whether that's in the social network running in the browser, or in the network stack, or whatever