Interfaces

Posted on Sep 13, 2022

What information do we need on our phones to make communication systems and the social exchange of data and resources viable?

  • Most of the world runs on SAP and Excel.
  • SMS and WhatsApp are also critical in business
  • Emails!
  • Catalogs of information
  • Video calls and in-person meetings
  • Calendars and rolodexes embedded in our phones
  • Trellos and boards for multimedia management
  • Papers, science publications

What kinds of mechanisms have been lost from the early days of BBS?

  • gopher/finger
  • email lists (still relevant in some circles)
  • irc (today’s discord vibes)
  • icq -> whatsapp, instagram

What kinds of systems did we not envision?

  • Infinite scroll of images or videos
  • Youtube explainers for anything
  • Social, Local, Mobile ecosystem
  • everyone has a gaming console in their pockets
  • everyone is connected 24 hours a day through notifications

The UDP universal packet flood is real, in the shape of system notifications.

How to share these ideas socially is also highly relevant. What kind of device is the other person using? In which environment?

More malleable software means destroying the concept of the application and providing immediate access to the data.

Interconnected online is the idea that collaboration online has consequences in local environments. More importantly, the transfer of data from one person to another person in another part of the world can generate consequences in brain chemistry and alter the physical world. That’s the other part of Snow Crash: informational drugs that cause changes in the real world.

One of the most interesting phenomena of this kind is Bitcoin. Although purely an informational system, it connects the minds of many users around the world, generating a consensus system on the basis of impossible-to-fake proof-of-thermodynamical-work. Now, the world of blockchains seems more inclined towards joining the structures of more conventional organizations, the institutions we already have, which are a combination of various “proof-of-social-work” systems. Examples of these are the investments of time and effort in the fields of politics, science, and research.

But the world still needs “neutral-politics” systems. The Red Cross and the global humanitarian efforts are a testament to the need for neutrality in certain aspects of day-to-day life. When all the other systems fail, Bitcoin’s neutrality is the mechanism to stay alive. Real-world crises are diminished thanks to the existence of financial flows of this kind.

So: Bitcoin blocks, as self-enforced thermodynamical information, are something we can trust. The operating system of the future has a UNIX-like root tree interface with one folder dedicated to Bitcoin:

 /
 └─ bitcoin
    ├─ by-address
    ├─ by-block
    ├─ by-block-height
    ├─ by-transaction
    └─ by-utxo
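A path lookup in that tree might behave like this minimal sketch, with an in-memory index standing in for a real node’s data (all names here are hypothetical):

```javascript
// Hypothetical resolver for paths under /bitcoin. A real system would
// back each view with a node's lookups; here a tiny in-memory index
// stands in (seeded with the genesis block at height 0).
const index = {
  'by-block-height': {
    '0': '000000000019d6689c085ae165831e934ff763ae46a2a6c172b3f1b60a8ce26f',
  },
  'by-block': {}, 'by-transaction': {}, 'by-address': {}, 'by-utxo': {},
};

function resolveBitcoinPath(path) {
  // e.g. "/bitcoin/by-block-height/0" -> the genesis block hash
  const [, root, view, key] = path.split('/');
  if (root !== 'bitcoin' || !(view in index)) {
    throw new Error('unknown path: ' + path);
  }
  return index[view][key] ?? null; // null when nothing is indexed there
}
```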

What else can we be certain of?

Well, the IPFS system of hashing content is definitely self-verifiable information. We’ll include it, but not IPNS, because of the danger it brings of drifting out of sync with the outside world.

 /
 ├─ bitcoin
 └─ ipfs
    ├─ baf124823...
    └─ bafwjv92t...

And now, let’s focus on my most private information and data: calendar, maps, contact list. All of that information would live in my ideal folder system.

 /
 ├─ bitcoin
 ├─ ipfs
 └─ personal
    ├─ bookmarks
    ├─ browsing-history
    ├─ calendar
    ├─ contacts
    ├─ forums
    ├─ instant-messages
    ├─ mail
    ├─ maps
    └─ photos

This would present me with a new way to interact with information and media, different from the everyday 2D web interfaces. We’d be missing a good video interface, but that’s easy to create.

The existence of a build system like Nix is also critical. With a combination of simple software and a reproducible build system, the software of the future has a lot more potential for testability, an application of machine learning that is about to explode and make the creation of software much easier. Reproducibility of environments is tightly coupled with functional programming, since it lets us interact with systems in a more natural way.

Mapping from the current status of the folder to the end user is the task of the computer, but the user should always be in control of how that information is displayed. Ultimate malleability of programs means that everything the user is doing needs to have its source code attached for review, documentation, or alteration.

The problem then becomes one of correctly mapping versions, distributing those versions, deciding how much oversight is enough, and trusting that the software does exactly what we intend, not just what we claim, because the interpretation of a user command can lead to the system failing to function as designed.

Going back to email: people’s communication changes. You now have to write a Subject. It’s bad netiquette to send short emails with no subject and “hi, wasup?” as the body. Using a protocol as designed imprints behavior changes that are sometimes unintended from the design perspective.

But, back to the filesystem, maybe the first thing we should have there are the sources of the programs, the programs themselves compiled to binary, type information that each language might have…

So, back to it, it now looks like this:

 /
 ├─ bitcoin
 ├─ ipfs
 ├─ programs
 |  ├─ binary-by-alias
 |  |  └─ "hello_world" --> ../binary-by-hash/d093...bfe5
 |  ├─ binary-by-hash
 |  |  └─ "d093...bfe5" --> /ipfs/baf...32492342 (hash of binary)
 |  ├─ source-by-alias
 |  |  └─ "hello_world" --> ../source-by-hash/d093...bfe5
 |  └─ source-by-hash
 |     └─ "d093...bfe5" --> /ipfs/baf...d093bfe5 equivalent
 └─ personal

The interesting thing I found here is that source code speaks for itself, but hashes do not. It would be very handy to keep reverse mappings of all the hashes generated, in order to work around the preimage problem.

Also, the relation source -> hash is similar to that of source -> binary, given the same context (which hash function to use, which compiler configuration).

Those mappings are subject to a certain metadata equivalence, for example the architecture the binary was built for, the compiler flags used, or the versions and hashes of the libraries depended on during compilation. Sometimes it might make sense that the binary is not even rendered in memory as a string of bytes: maybe it lives only in the interpreter’s memory.

My JavaScript head went ahead and developed this as a client-side JavaScript “dynamic linker” for functions that can be defined in the frontend. Wanting to share the computing environment over the network, I took on the task of shipping JavaScript over WebSocket connections.
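The “dynamic linker” idea can be reduced to a few lines: function sources arrive over the wire (plain strings below stand in for WebSocket messages, and the names are hypothetical) and get compiled on demand:

```javascript
// Hypothetical client-side "dynamic linker": function sources received
// at runtime are compiled with the Function constructor and kept in a
// table, callable by name.
const linked = new Map();

function link(name, params, body) {
  // new Function(...params, body) compiles a source string into a callable.
  linked.set(name, new Function(...params, body));
}

function call(name, ...args) {
  const fn = linked.get(name);
  if (!fn) throw new Error('unlinked function: ' + name);
  return fn(...args);
}
```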

Probably the same ideas are bouncing back and forth behind the closed doors of Hole Punch, or in Spritely Goblins’ early development. For now, what seems most interesting is: how do we share the ability to transform the medium over the network? What are the learning patterns that can make this an awesome building block, even if the other person has almost no training as a developer?

But the interface I got is not malleable enough. The source code editor should be a lot prettier, and the functions executable with one click. Experimenting with packaging libraries and how to create an API is probably an arms race; that is not as interesting as an experiment we could actually do.

We could start to re-create the NPM package registry. Or move to another language (instead of loading JavaScript, load Wasm for compatibility?) and open up a global arena of customization of sources, data, interconnection of sources of data across devices, and, most interestingly, ways to deal with private data. Maybe an encryption system?

But we need object capabilities to dwell on this. Creating unscoped executions is not a good idea; the scope should limit the kinds of things plugins can do. At least everything is distributed as source, but how trustworthy can packages of thousands of dependencies be?
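A sketch of capability-scoped execution: the plugin receives only the abilities it is explicitly handed, instead of ambient access to the whole environment. (A real sandbox needs much more than this, e.g. locking down the JS globals as SES/Hardened JavaScript does; this only illustrates the shape of the idea.)

```javascript
// Capability-style plugin execution sketch: the plugin body can only
// reach what is passed in through `caps`. NOT a real sandbox -- JS
// globals are still visible and would also need to be locked down.
function runPlugin(source, caps) {
  const fn = new Function('caps', `"use strict"; ${source}`);
  return fn(caps);
}
```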

Another problem lies with the visual identification of sources. If one source looks very similar to another, or the changes are hidden behind UTF-8 tricks, then verification requires remote attestation (or local review) to guarantee changed/sanitized versions. This is not a problem at the bootstrap stage, when operators can be assumed to be trustworthy. The trusted setup should be fixed at a later stage.
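A cheap first line of defense against those UTF-8 tricks is to flag any invisible or non-ASCII code points before human review (real defenses, such as those against Trojan Source attacks, also specifically inspect bidi control characters):

```javascript
// Flag code points outside printable ASCII so a reviewer knows the
// source needs a closer look (homoglyphs, invisible characters, etc.).
// Newlines and tabs are allowed as ordinary formatting.
function suspiciousCodePoints(source) {
  const flagged = [];
  for (const ch of source) { // iterates by code point, not by UTF-16 unit
    const cp = ch.codePointAt(0);
    if (cp > 0x7e || (cp < 0x20 && ch !== '\n' && ch !== '\t')) {
      flagged.push('U+' + cp.toString(16).toUpperCase().padStart(4, '0'));
    }
  }
  return flagged;
}
```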

The goal is to create a system that can read and generate input/output interactions while coding, so that an AI can be trained on the kinds and types of expected results by measuring the reactions of the developer. Correctly configured an API server? Correlate the increase in heart rate and facial expressions indicating joy with the keys typed and in which context. All of this, reproducible by default, so that machines can learn the context-sensitive indications of “good system behavior”, which can also lead to automated detectors of downtime.

Cheers,
Claude