Unease, writing, and thinking

On the missed opportunity of digital gardens, a blurry gap in the tools we have online for knowledge sharing, and how having AIs write for us might be another step in the wrong direction.

Note → I tried to shove three things into one section of issue No.251, but I ran out of time to make them a coherent whole, and the newsletter was already long enough, so I moved all of it into its own post. Still not coherent, but now split into parts.


A couple of years ago, I used Grant For The Web funds to turn the Sentiers archive into something like a digital garden, splitting every issue into multiple ‘notes’ that were all tagged and meant to be increasingly interlinked with ‘wiki’ backlinks. It was a great learning experience: Eleventy was a pleasure to work with, and using an Obsidian vault as the ‘cms’ was fun to build and fun to use.

But I never properly got into the habit of backlinking. Unless I missed some activity on the website, it was never really used much by anyone else, and the ‘build process’ started getting pretty lengthy. There are technical fixes I could have spent time on, but since it also takes longer to publish each issue as both a bunch of notes and the email format, and since the garden hasn’t helped my thinking or archiving process, a couple of weeks ago I decided to rip out a good bit of that mechanism. The old pages are all still online so nothing breaks, but the issues no longer have the ‘See note’ links and can only be read as they were sent: one long ‘post’ instead of an assemblage of notes.

It’s been a relief: issues are a bit quicker to get online, the Eleventy build process is super fast once again, and I think that will get me back to a looser format that doesn’t always look like three featured articles, some shorts, often the Futures section, and then the Asides.


I don’t have a clear line through all of these or a great conclusion, but the digital gardens of two or three years back that never became collaborative, or died outright, my missing habit of backlinking, the weight of using some tools, the suckiness of search engines, the missing space for thinking, and renewed intentions around good curation practices all seem to point to something missing, something insufficient: an unease with the volume of information we face and our grasping for some control. The conclusion is not new, but ‘where it hurts’ seems different.

In a way, it reminds me of the dark forest theory of the internet. Yancey Strickler was pointing at how so many of us were retreating from large social networks and looking for something smaller, more intimate, often more targeted. The massive scale of Facepalm and the bird site left us uncomfortable (or worse). The links above seem to point to a similar kind of unease, but this one sits between a craving for knowledge and an overwhelming amount of information. We don’t currently have the right tools for either of these uneases.


Tangentially, AI, especially ChatGPT, seems related. In theory it could be one of those tools, providing clearer answers, but it is actually fed on that same overflow of unvalidated information and gives us the same incomplete or plain wrong answers. It is also, perhaps, the opposite of what we need, at least for the unease above. The culture of productivity, capitalism’s demands, and these flows of information don’t leave us enough time to reflect, to slow down. Slowing down is doable, but it’s like swimming against the current. And now we should remove another one of our tools for thinking? Hand off to an AI our opportunities to think through writing? Speed that up too? Automate it? Optimize it? What happens to our time for thinking and reflection then? I love trying these tools out, and they will be good for some things, but automating writing seems like another step in the wrong direction.
