Burning Computing

Posted on Jun 18, 2023

burning.computing was an article I wrote some time back, aimed at giving the craft of building top-quality software a philosophical background again. I can come up with some examples from the 90s fairly easily: the GPL, maybe Affero from the 2000s, and Phil Zimmermann. I wonder if I could make semantic queries against Wikipedia. Asking my offline LLM, it said Wikipedia has 6.1 million articles, so that's probably indexable with pgvector for a cosine similarity search: if each article needs roughly a 1k-dimensional embedding, that's a database of about 12 GB with 16-bit floats, 24 GB with 32 bits…
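
A rough sketch of how that search could look, assuming Postgres with the pgvector extension and some external embedding step; the table name, the 1024-dim size, and the connection string are placeholders, not a real pipeline:

```python
# Sketch: cosine similarity search over article embeddings with pgvector.
# Assumes Postgres with the pgvector extension installed; table/column names,
# the dimension, and where the embeddings come from are all placeholders.
import psycopg2

conn = psycopg2.connect("dbname=wiki")
conn.autocommit = True
cur = conn.cursor()

cur.execute("CREATE EXTENSION IF NOT EXISTS vector")
cur.execute("""
    CREATE TABLE IF NOT EXISTS articles (
        id        bigserial PRIMARY KEY,
        title     text,
        embedding vector(1024)
    )
""")
# Approximate index for cosine distance (pgvector's <=> operator).
cur.execute("""
    CREATE INDEX IF NOT EXISTS articles_embedding_idx
    ON articles USING ivfflat (embedding vector_cosine_ops) WITH (lists = 100)
""")

def to_vector_literal(embedding):
    # pgvector accepts the text form '[0.1,0.2,...]'
    return "[" + ",".join(str(x) for x in embedding) + "]"

def most_similar(query_embedding, k=5):
    q = to_vector_literal(query_embedding)
    cur.execute(
        """
        SELECT title, 1 - (embedding <=> %s::vector) AS cosine_similarity
        FROM articles
        ORDER BY embedding <=> %s::vector
        LIMIT %s
        """,
        (q, q, k),
    )
    return cur.fetchall()
```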

I just saw the “Joan is Awful” episode of the new season of Black Mirror. Not quite what it used to be; it doesn’t go as far with its critique of society as it used to. I think we need a reboot of our digital fingerprints; I can’t see how to trigger it, but we should. I find myself using llama.cpp’s CPU build because it loads much faster. I think it makes sense to keep it open, as long as it’s one of the best models (Guanaco 33B and Falcon 40B have been pretty good IMHO).
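
A minimal sketch of what running one of these models on CPU looks like through the llama-cpp-python bindings; the model path and prompt are placeholders, not my actual setup:

```python
# Sketch: load a local quantized model on CPU via llama-cpp-python.
# The model path is a placeholder; point it at whatever GGML file you have.
from llama_cpp import Llama

llm = Llama(model_path="./models/guanaco-33B.ggmlv3.q4_0.bin",
            n_ctx=2048, n_threads=8)
out = llm("Q: How many articles does English Wikipedia have? A:",
          max_tokens=64, stop=["Q:"])
print(out["choices"][0]["text"].strip())
```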

Here are the contents of “Burning Computing”:

  1. Radical Self Reliance: Your computing is not going to be done for you by others. You should not rely on other people’s hardware unless you are strict about the API and the kind and level of service you expect from them. You can’t expect others to compute your stuff for free; you have to pay for it, or do it yourself.

  2. Leave No Trace: No Data Out Of Place (DOOP). Data’s place is wherever is closest to its source or sink, according to the individuals and organizations that generated that data, bought access to it, or received it. Once I have a piece of information, I’m free to use it however I want. Information wants to be libre; it can’t be contained; but you can choose with whom you share it. Unfortunately, the universe’s entropy dictates that there is no going back; yeah, it’s cool that we are trying to come up with laws to manage information that you wish you could take back, but there are more realistic and honest ways to handle this.

  3. Radical Inclusion: Universal Filesystem

Drawing inspiration from IPFS, we recommend polluting “/” with more semantic meaning for the user. What if `ls /twitter/followers | wc -l` returned your number of followers on Twitter? It sounds like there might be something there from a UX perspective, but it turns out that events are more important than documents. A rough sketch of the idea follows below.
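
Here is a toy, read-only version of that idea using the fusepy library; the follower list is hard-coded as a stand-in for a real API call, and the mountpoint is made up:

```python
# Toy sketch: expose a (fake) follower list as files under a mountpoint, so
# `ls /tmp/twitter/followers | wc -l` counts followers. fusepy is assumed;
# the data is hard-coded instead of coming from a real Twitter API call.
import errno
import stat
import time

from fuse import FUSE, FuseOSError, Operations

FOLLOWERS = ["alice", "bob", "carol"]  # stand-in for data fetched elsewhere


class FollowerFS(Operations):
    def getattr(self, path, fh=None):
        now = time.time()
        if path in ("/", "/followers"):
            return dict(st_mode=(stat.S_IFDIR | 0o555), st_nlink=2,
                        st_ctime=now, st_mtime=now, st_atime=now)
        name = path.rsplit("/", 1)[-1]
        if path.startswith("/followers/") and name in FOLLOWERS:
            return dict(st_mode=(stat.S_IFREG | 0o444), st_nlink=1, st_size=0,
                        st_ctime=now, st_mtime=now, st_atime=now)
        raise FuseOSError(errno.ENOENT)

    def readdir(self, path, fh):
        if path == "/":
            return [".", "..", "followers"]
        if path == "/followers":
            return [".", ".."] + FOLLOWERS
        raise FuseOSError(errno.ENOENT)


if __name__ == "__main__":
    # After mounting: `ls /tmp/twitter/followers | wc -l` prints 3.
    FUSE(FollowerFS(), "/tmp/twitter", foreground=True, ro=True)
```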

  4. Deplatformization: Interacting with services through their APIs is a great way to build siloed platforms (see Bezos’ API Mandate memo at Amazon). But that is not necessarily what is best for everyone, if we aim towards a model of computing that respects the more immediate nature of how data is created and shared. One should not have siloed files with siloed data that is incredibly hard to swap around.

  5. Radical Transparency: The free software movement had the idea that software should be free to use and modify at one’s whim, which I regard as one of its best ideas. But this sidesteps the question (now that the user does not necessarily know how to program) of what skills are required to modify the software. How easy should we strive to make it? What level of knowledge and understanding should a user need to make modifications to the program? It’s a lot harder to make the system easy to modify through some kind of interface, and exposing the source code is not enough.

  6. Immediacy: Programming efficiently is about data, and about the frequency of reads and writes to that data. Immediacy as a developer means being aware of, and anticipating, how often you are going to need to access that data, and creating algorithms and data structures that work efficiently for that expected frequency (see the toy sketch after this list). It also means moving as few electrons as possible to achieve the desired result, because moving more electrons takes more time (or bandwidth/capacity) by definition.

  7. Communal Sense: One should have as much control and opt-in as possible over data being sent outward from their devices, and data policies that are as reasonable as one can achieve. It’s a matter of style to be conscious about your data and log storage, to keep as little of other people’s data as possible, and to try to comply with their preferences. But it must be OK to decline.
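
As a toy sketch of the Immediacy point (sizes and names made up): when membership checks vastly outnumber insertions, paying once to build a set (an index, really) beats scanning a list on every read.

```python
# Toy illustration: if reads vastly outnumber writes, pay once to build an
# indexed structure (a set) instead of scanning a list on every lookup.
import time

words = [f"word{i}" for i in range(100_000)]            # the "dataset"
queries = [f"word{i}" for i in range(0, 100_000, 200)]  # reads dominate writes

t0 = time.perf_counter()
hits_list = sum(1 for q in queries if q in words)       # O(n) scan per read
t1 = time.perf_counter()

word_set = set(words)                                   # one-time indexing cost
hits_set = sum(1 for q in queries if q in word_set)     # O(1) lookup per read
t2 = time.perf_counter()

print(f"list scans: {t1 - t0:.3f}s  set lookups: {t2 - t1:.3f}s  "
      f"same result: {hits_list == hits_set}")
```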

Maybe we do need a computing environment built along these ideas. Right now, I find myself stuck on my dependencies while trying to restart my Matrix server locally with no inbound traffic. I need to find a way to retrieve my old configuration for my matrix-docker-ansible-deploy setup. It’s on my NAS. The NAS was always accessed from a local network, “airgapped”, and requires a DHCP address, which I don’t have configured on helios. I need to do that! Also, I’m traveling really soon, so I need to get my setup tight for working remotely from my MacBook. I chose it over the Framework because it’s more powerful, just works, and NixOS distracts me from actually doing stuff. It’s great at home, not on the go.

Once I get matrix-docker-ansible-deploy working, I need to update it to use traefik and DNS challenges instead of publishing my IP. It gets logged anyway, but I can use a VPN. Once that’s done, I’ll set up a new account and link my Telegram, Signal, and WhatsApp. Then I’ll create a backup of my old setup and freeze it, without losing messages in the meantime.

I also need to find a new apartment in Berlin, somewhere so I don’t have to move anymore; these short-term rentals are killing me. I would not leave such a paper trail as I did with this one, using Amazon and Lieferando almost every week with a credit card in my name. Lieferando offers payments with Bitcoin, and I used the Lightning Network and Muun to pay.

So, in order to procrastinate more, I’m starting a folder named “construction” for my new personal digital life, using the ideas from johnnydecimal.com.
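
Something like this little script to scaffold it; the area and category names are made up, only the area/category layout follows Johnny.Decimal:

```python
# Sketch: create a Johnny.Decimal-style skeleton under ./construction.
# The area and category names below are placeholders, not my actual system.
from pathlib import Path

AREAS = {
    "10-19 Life Admin": ["11 Documents", "12 Finances", "13 Housing"],
    "20-29 Computing": ["21 Dotfiles", "22 Matrix", "23 NAS"],
}

root = Path("construction")
for area, categories in AREAS.items():
    for category in categories:
        (root / area / category).mkdir(parents=True, exist_ok=True)

print(*sorted(p.as_posix() for p in root.rglob("*")), sep="\n")
```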