Canal Swans #30 - Permacomputing & Infinite Time
In recent years, Permacomputing has become a useful umbrella of interest for me.
It’s not a word that can be precisely pinned down, which is part of what I like about it, and it doesn’t have a ton of history as a term, so there is still a feeling of openness about what it could be.
One nice definition of Permacomputing comes from Fiber Space:
"Said with tongue in cheek, permacomputing is a radically slightly more sustainable approach to computer and network technology inspired by permaculture. It is both a concept and a small nascent community of practice oriented around issues of resilience and regenerativity in digital technology derived, among others, from permaculture principles. In a time where computing epitomises industrial waste, permacomputing encourages the maximising of hardware lifespans, minimising energy use and focussing on the use of already available computational resources."
There is a sense of creatively simplifying and lightening, but not necessarily a reactionary return to the past.
This one-page “Lowtech Manifesto”, written in 1999, still holds up remarkably well.
Because Permacomputing looks to a different relation to time than the industrial technology world, there can be a sense of entering a different timescale. It fits, then, that the "Lowtech Manifesto" would still be relevant twenty-five years later.
There is a quality of lightness here, one that can take many forms, and it feels related to Fruitful's Ultralight class.
At the very least, I’ve managed to nerd-snipe myself with this topic and find there’s something beautiful about it. I could easily imagine spending multiple lifetimes just trying to make beautiful, simple & maintainable tools with others doing the same.
I wanted to write this little intro in a way that would hopefully give a sense of my interest to anyone reading this newsletter. The rest of the newsletter is probably only interesting if you have some familiarity with HTML and making websites, as it gets a bit more into specific projects I've been working on and wanted to share.
Permacomputing Projects
The most recent one was prompted by trying to update the canalswans website and running into build errors. I ended up replacing my old Gatsby website with a combination of Hugo and a new simple tool I created called Thimble, which I describe as "the world's most minimal single-page-application front-end framework". A link to the blog post is here, and I have also included the full text below.
I have also been working on Lichen-Markdown here and there with @abekonge@sunbeam.city; it is a new take on "the simplest possible CMS that is friendly enough for non-technical users".
There is also a list of related tools in the "permacomputing" section of the Solidarity Infrastructures website.
It's a strange fascination, trying to make beautiful things in just a few hundred lines of code. We'll see where it leads, and I'm happy to hear about anything related. I hope everyone is managing, and thank you for reading. I sometimes feel like I'm sending out broadcasts of niche special interests that are probably irrelevant to most people, but I thought I would share here in any event, as who knows what connections there might be.
- Max
Static Site Generators, Permacomputing, Software Decay & 87 Lines Of Plain JavaScript
This is a story of static site generators, software decay, and an attempt at a solution using 87 lines of plain JavaScript called thimble-switcher, largely related to this post by @tty@sunbeam.city:
My code from 5 years ago: "security vulnerabilities", "this project is dead". Super Mario Bros. 3 NES ROM: continues to work flawlessly 36 years later. I can kiiind of understand why these are different, but not entirely. Is it the Internet connectivity that makes the difference? Or are there other effects at play? It feels like there are more factors, but I can't pin anything down. I just want to write software that can be done and stay useful. Thoughts welcome, as I continue to grapple with this.
First, the short backstory of a website that decayed over many years:
Software Decay
In 2016, I made my personal website, and later a blog called Canal Swans (the predecessor of this website), using tools that later stopped working for me. The tools I chose were based on my software interests at the time, before I had heard of permacomputing or had a decade of lived experience watching tool ecosystems change.
Originally I chose to make the websites using Gatsby, a static site generator that produces single-page-application websites via React, largely because React seemed like an industry standard, and I liked that the "single-page-application" front-end made navigating the website feel snappy and smooth.
A couple of times, after trying to make a small update to the content of the website or write a new post, it refused to build, and Node or Gatsby complained. I had updated Node at some point for some other project, and now the Gatsby and Node versions were upset with each other. I started using nvm and made a comment in the README reminding me to only use Node 14, which I thought would solve the issue once and for all, as I hubristically declared in this blog post.
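(As an aside, the usual way to pin this with nvm, rather than a README note, is a one-line .nvmrc file at the project root:

```
14
```

Running a bare nvm use in that directory then switches to Node 14 automatically, without anyone needing to remember the version.)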
It did improve things, and for some years it seemed I was able to update my website without updating Node or Gatsby, as Node progressed to version 20 and I stayed happily on 14.
However, in this past year, when I added new content I started getting cryptic error messages. Usually this could be fixed by clearing the cache (rm -rf node_modules/, reinstall, and try again), but I started having to do this frequently, and it was a pain. Building the site also took 50 seconds, and there was some weird glitchiness in the navigation of the wiki section. Some of these problems might have been solvable by updating to the latest Node and Gatsby, then updating my code to accommodate breaking changes and meet the new specifications, but I didn't want to update my static site generator. I just wanted to occasionally update the actual content of my website, not do extra programming work.
So in order to do less programming, I did a lot more programming, as is often the case, and decided I would switch to a different, more minimal, hopefully evergreen framework.
Thimble Switcher In 87 Lines, Fediverse Software Commons
I posed a question to the Fediverse asking what the most minimal front-end framework for client-side switching between pages would be, and did a bunch of research. I found a bunch of "micro" frameworks, including MithrilJS, SlingJS, Hyperapp, htm, NanoJSX, Preact, uhtml, reefjs, and htmx, all of which were much smaller than React but still overkill for what I was actually trying to do.
Then, based on a suggestion from @j12i@weirder.earth, I attempted a solution in plain JavaScript, and was pleased to achieve a result in 87 lines whose user experience was, to me, comparable to Gatsby.
I then added ASCII art to the JavaScript file header, as well as code to set how long cached pages should last before they expire, and incorporated error-handling suggestions from a generous, spontaneous code review by @forest@pixie.town. The total line count is now 144, but still within the ballpark of the first working version.
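To give a sense of the approach, here is a minimal sketch of the same idea (this is not the actual thimble-switcher source, and names like fetchPage, switchTo, and CACHE_TTL_MS are illustrative): intercept clicks on internal links, fetch the target page, swap its <main> into the current document, and update the browser history, with a simple time-based cache standing in for the expiry logic mentioned above.

```javascript
// Not the actual thimble-switcher source, just a minimal sketch of the same
// approach: intercept internal link clicks, fetch the target page, swap its
// <main> into the current document, and update the browser history.

const cache = new Map();             // url -> { html, fetchedAt }
const CACHE_TTL_MS = 5 * 60 * 1000;  // assumed expiry: cached pages last 5 minutes

async function fetchPage(url) {
  const cached = cache.get(url);
  if (cached && Date.now() - cached.fetchedAt < CACHE_TTL_MS) {
    return cached.html;
  }
  const response = await fetch(url);
  if (!response.ok) throw new Error(`HTTP ${response.status} for ${url}`);
  const html = await response.text();
  cache.set(url, { html, fetchedAt: Date.now() });
  return html;
}

async function switchTo(url, pushState = true) {
  const doc = new DOMParser().parseFromString(await fetchPage(url), "text/html");
  const newMain = doc.querySelector("main");
  if (!newMain) throw new Error(`no <main> found in ${url}`);
  document.querySelector("main").replaceWith(newMain);
  document.title = doc.title;
  if (pushState) history.pushState({}, "", url);
}

// Intercept clicks on same-origin links; fall back to a normal page load on error.
document.addEventListener("click", (event) => {
  const link = event.target.closest?.("a[href]");
  if (!link || link.origin !== location.origin) return;
  event.preventDefault();
  switchTo(link.href).catch(() => { location.href = link.href; });
});

// Keep the back and forward buttons working.
window.addEventListener("popstate", () => {
  switchTo(location.href, false).catch(() => location.reload());
});
```

One difference worth noting: as discussed further below, the real Thimble Switcher prefetches pages ahead of time, while this sketch only fetches on demand.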
I ran git ls-files | xargs wc -l on the facebook/react repository, and found the total line count was 416,795.
Thus the total number of lines of my front-end code was greatly reduced (144 lines versus 416,795), while maintaining 90% of the functionality I was actually using.
The remaining 10% would be the nice way Gatsby handles images (with a sort of smooth lazy load), and some image carousels I had built on a few pages of my personal website.
I'm curious whether I could also replace both of these with plain JavaScript and achieve functional parity with the previous version.
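For the lazy-loading piece, a plain-JavaScript version might look something like this (a hypothetical sketch, not anything from Gatsby or Thimble; the data-src attribute and loaded class are illustrative conventions): images start as lightweight placeholders, and an IntersectionObserver swaps in the real image shortly before it scrolls into view.

```javascript
// Hypothetical sketch of plain-JavaScript lazy loading, in the spirit of
// Gatsby's smooth image handling (not Gatsby's actual code). Images start
// as lightweight placeholders, with the real URL in a data-src attribute.

const lazyObserver = new IntersectionObserver((entries, observer) => {
  for (const entry of entries) {
    if (!entry.isIntersecting) continue;
    const img = entry.target;
    img.src = img.dataset.src;  // start loading the real image
    img.addEventListener("load", () => img.classList.add("loaded"), { once: true });
    observer.unobserve(img);    // each image only needs this once
  }
}, { rootMargin: "200px" });    // begin loading a little before the image is visible

document.querySelectorAll("img[data-src]").forEach((img) => lazyObserver.observe(img));
```

Modern browsers also support a native loading="lazy" attribute on images, which may cover the basic case with no JavaScript at all; the smooth fade-in would then be a single CSS transition on the loaded class.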
Here is the old Gatsby version for reference.
Some might call this bit of JavaScript a simple script or snippet, but in the spirit of permacomputing, and of elevating simple solutions rather than belittling them, I enjoyed giving it a name, Thimble Switcher, and thinking of it as a framework. It's just a very minimal framework, and a flexible one that could be subbed into any static website that has a <main> section on each page. It's also a framework that requires no extra build step, and one that I hope will remain evergreen for many years to come.
We can check in ten years from now and see how it went.
Permacomputing, Lichen-Markdown & Infinite Tradeoffs
A critic could say, "but Max, Thimble Switcher prefetches pages that the user might not even end up visiting, how can you call this permacomputing?"
I would agree that if your site currently loads pages on demand in the standard way, then adding this JavaScript snippet makes your site more complicated and resource-intensive, not less.
However, Thimble Switcher is in multiple ways more lightweight than Gatsby or React, so if you are switching from one of those tools, which internally uses quicklink for prefetching, then it makes the build process that supports your site simpler and more efficient.
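For illustration, prefetching in this spirit can itself be tiny. Here is a hypothetical plain-JavaScript sketch (not Thimble Switcher's actual code) that warms the browser's HTTP cache when a reader hovers over an internal link, so that a later click feels instant:

```javascript
// Hypothetical sketch of prefetch-on-hover, in the spirit of quicklink
// (not Thimble Switcher's actual code).

const prefetched = new Set();

document.addEventListener("mouseover", (event) => {
  const link = event.target.closest?.("a[href]");
  if (!link || link.origin !== location.origin || prefetched.has(link.href)) return;
  prefetched.add(link.href);
  fetch(link.href).catch(() => prefetched.delete(link.href)); // allow a retry on failure
});
```

The tradeoff the critic points at is visible right in the code: every hover may spend a request the reader never uses.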
Or perhaps someone who is deciding what tool to use will feel less of a draw toward React, knowing they can have client-side page-switching without it.
More generally, I see "permacomputing" as an approach, not a prescription of a specific tool or set of features that are "enough"; what counts as enough is always contextual and subjective. Distributing writing using natural fibers and dyes, or orally, are both beautiful approaches that are more resource-efficient than any website.
This was similar to the reasoning behind Lichen-Markdown, which is less minimal than the original Lichen, but could be more minimal than other options, depending on what is being considered.
Permacomputing, Human Readable Code & Open Source
I appreciate that Thimble and Lichen use fewer computing resources than some alternatives, but possibly even more than that, I appreciate that small codebases can be human-readable.
Open source advocates often bring up the empowerment of being able to fork and modify open source code (the second freedom), which is not without value. But with large and complicated codebases, even a trained programmer may not be able to modify the source code without years of yak shaving, leaving the second freedom more theoretical than actual.
Bret Victor points to this as well in his concise one-page essay "Radical Decentralization, Radical Empowerment and Dynamicland":
However, for true decentralization to be possible, the ideas and abilities must be learnable. Many modern technologies are too complex to be learned and practiced by communities, and are instead bound to industrial modes of production, creating a class divide between “developers” and “consumers”. These complex technologies will never be decentralizable, and any attempts to decentralize them will fail.
I don't think there is a pure binary here between "true decentralization" and "untrue": complicated open source code can still empower more groups to make changes than if it were closed-source, and complexity is itself a spectrum. Still, simplicity of understanding and modification is an aspect of decentralization that is often overlooked by decentralized-technology advocates.
I appreciate that with just a few files, I can have a sense of what all the code does.
Hugo & Compiled Languages From The Perspective Of Permacomputing
After getting rid of Gatsby and React, I ended up deciding to use Hugo to build this site, partly inspired by coming across the blog of the creator of Lean Web Club and ReefJS and seeing that they use Hugo, along with other factors.
Using Hugo, the build time for my website is 97ms, versus 50 seconds with Gatsby: roughly a five-hundredfold difference, and one that makes adding content to the site feel much lighter.
@decentral1ze@varia.zone wrote a beautiful assessment of the Go programming language from a permacomputing perspective for permacomputing.net, highlighting some of the positives and negatives.
In my case, I can see how a 50-second Gatsby build might incentivize me to get a faster, more modern computer, whereas with Hugo my computer could clearly be even slower than it is and it would be fine. In this light, I think building things in compiled languages so that they can be more resource-efficient can be seen as a permacomputing practice, although as always there are different tradeoffs and contexts to consider.
Lastly, after changing the tools this site was built with, I also re-organized the content, inspired largely by Laurel Schwulst’s metaphor of notebooks in “writing & worlding”. You can read more about the organization of this site here.
Let's see in ten years whether the Hugo build is as broken as my Gatsby site was, or if this new setup fares better.
☽