Vibe augmentation
I did some vibe coding yesterday, as one does these days, and had some thoughts. It was going to be a post on LinkedIn, but ended up a short article, so I'm also posting it here.

Everyone’s talking about vibe coding these days—prompting an AI to code for you without doing much code review, if any—and of course more broadly about AI and its impacts. As I’ve mentioned numerous times in the past, I’m more interested in “humans in the loop” and augmentation than in replacement.
For months now I’ve been trying various ways of chatting with my Sentiers archive. In part because it’s a curated newsletter and not a bunch of essays, I haven’t been satisfied with the results so far.
Since such a setup would help me in parts of the pre-research for a couple of projects, yesterday I decided to optimise the archive for use with AIs. And of course, I used an AI to help me do it (Cursor). Things move quickly in this field, and I found it was already better than just a couple of months back.
I’m not going to go into too much detail, but I wanted to standardise the format of the issues, since it has evolved through the years and lived in different CMSes. I also wanted to “chunk” the featured articles into their own files—I had that back in the days of treating the archive as a digital garden, but not for the last 60-some issues. And I wanted a one-file index of everything that’s been in the newsletter.
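As a rough illustration, the one-file index step could be sketched with a short script like the one below. The front matter fields (`title`, `issue`) and the file layout are my assumptions for the example, not the archive’s actual schema:

```python
# Hypothetical sketch: build a one-file JSON index from a folder of
# markdown issue files, reading each file's '---'-delimited front matter.
import json
from pathlib import Path

def parse_front_matter(text):
    """Parse a minimal '---'-delimited front matter block into a dict."""
    lines = text.splitlines()
    if not lines or lines[0].strip() != "---":
        return {}
    meta = {}
    for line in lines[1:]:
        if line.strip() == "---":
            break  # end of the front matter block
        if ":" in line:
            key, _, value = line.partition(":")
            meta[key.strip()] = value.strip()
    return meta

def build_index(folder, out="index.json"):
    """Collect front matter from every .md file into one JSON index."""
    entries = []
    for path in sorted(Path(folder).glob("*.md")):
        meta = parse_front_matter(path.read_text(encoding="utf-8"))
        meta["file"] = path.name
        entries.append(meta)
    Path(out).write_text(json.dumps(entries, indent=2), encoding="utf-8")
    return entries
```

The same list of dicts could just as easily be rendered out as markdown or HTML instead of JSON, which is what makes trying multiple index formats cheap.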
Some things to note regarding this vibe coding angle:
- It’s not always the case, but here it definitely saved me an order of magnitude of time. Hours instead of days for sure.
- This project would have been pushed back a number of times if I had to code it by hand, and it wasn’t going to be coded by someone else for a fee.
- I gave detailed prompts in some cases, but in others I just explained the situation and asked for the LLM’s “opinion,” usually with better and faster results than trying to give it specs. (Before devs intervene: this was a scripting exercise in a few steps, not anything close to an app!)
The nugget that led me to write this here is the ease of adding things I wouldn’t have taken the time to do by myself. And again, never would have paid someone to do; it’s too much of a nice-to-have and not vital for my work. (Sorry, I’m not going to explain every “weird” word in there.)
- “Maybe I could output the index in different formats (json, markdown, html) and try them out.” Done.
- “Let’s re-order the metadata front matter in all the old files.” Done.
- “Let’s add css style to the html index, make it look like my website, just for fun.” Done.
- “Oh! Let’s ‘wikilink’ all the markdown files so they can open as an Obsidian vault and be interlinked.” Done.
- “There are some old Nunjucks tags in a bunch of files, let’s remove them. Oh, and remove the extra empty lines that have appeared through the other steps.” Done.
- “Let’s create a unified reference of all tags used.” Done.
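A couple of the steps above (stripping leftover Nunjucks tags, squeezing extra blank lines, wikilinking titles) could be sketched like this; the tag patterns and the naive title matching are assumptions for illustration, not the actual scripts:

```python
# Hypothetical sketch of two cleanup steps: removing Nunjucks template
# tags and collapsing runs of blank lines, plus naive Obsidian wikilinking.
import re

NUNJUCKS_TAG = re.compile(r"{%.*?%}|{{.*?}}")  # assumed tag shapes
EXTRA_BLANKS = re.compile(r"\n{3,}")

def clean_markdown(text):
    """Remove Nunjucks tags, then squeeze 3+ newlines down to 2."""
    text = NUNJUCKS_TAG.sub("", text)
    return EXTRA_BLANKS.sub("\n\n", text)

def wikilink(text, titles):
    """Wrap known article titles in [[...]] so Obsidian interlinks them.

    Naive: assumes titles don't overlap or already appear wikilinked.
    """
    for title in titles:
        text = text.replace(title, f"[[{title}]]")
    return text
```

Each of these is the kind of five-minute ask that would otherwise sit on a someday list indefinitely.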
All of this to say: in this case I got something done that I might not have had time for otherwise. It didn’t take any work away from anyone. And I was able to add a bunch of features almost as fast as I could think of them, things I wouldn’t have taken the time for if I had to code by hand.
I’m aware of the environmental hit. One calculation I don’t remember seeing: let’s say it took me three hours instead of 30, what’s the impact of chatting on and off for three hours vs searching the web, opening multiple pages, and having my laptop working for ten times as long? Also, in this case I was using the default LLM in Cursor, but I could have done the same with a local model if I had more time (it’s slower) and double the unified memory in my laptop, which runs on hydro power.
Finally, kind of like using local models (with Ollama and Msty), the level of technical know-how, or let’s say the difficulty of learning this stuff, is about the same as learning html 25-30 years ago. I.e., not that hard if you take a bit of time.
—
I’m not planning on changing fields again and doing AI consulting, but if you have questions about the above, or would like a longer post about some part, like maybe the use of Msty for local AI, ping me or comment below.