Spatial Transformation
I didn't set out to build this. I didn't really set out to build anything. I don't want to build anything, I just want to watch it grow.
autumn_mist - fog - intense_eyburn
I went out for a walk with the dog one night last year and by the time I'd gotten home I'd finally figured out how to invert my Jenkins pipeline library. It's evolved into a distributed graph: YAML files provide the structure, the data resides in a mix of DynamoDB tables, APIs, CLI tools and whatever else happened to exist at the time the graph rolled into town. Jenkins ticks it over, the inevitable march of delivery. Node by node, whatever solution fit the problem with the least amount of change, the graph expanded. And then it went inside out.
Things like that have been happening to me my whole life: a problem, a need, an errant thought and click, two shapes slide into place and a new shape is born. I never knew why. Now I do.
the_graph_consumes - the_graph_loves
At the point when the great Jenkins 1.XXX > 2.0 transition took place, I already had a YAML file that formed the backbone of our existing "pipelines", a loosely coupled series of freestyle Jenkins jobs that roughly mirrored our current multi-stage design: build, image, configuration, deploy, validate and publish. Turns out, if you throw special purpose stages into separate archetypes for things like straight IaC deployments or scheduled utilities, you can run an entire organization on that workflow, at least their software delivery.
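For flavour, a hypothetical archetype file in that spirit might look something like this - the stage names come from the list above, but every key and value here is invented for illustration, not the real schema:

```yaml
# Hypothetical sketch only - stage names from the text, keys invented.
name: my-service
archetype: standard_delivery
stages:
  build:
    - type: compile
  image:
    - type: docker_build
  deploy:
    - type: cloudformation
      args:
        template: infra/stack.yaml
  validate:
    - type: smoke_test
  publish:
    - type: release_notes
```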
When it started, it was just a convenient dumping ground for structured input into the various CLI tools I'd built over the first couple of years to power the first simple workflows. Scripts broken by function into blocks as shell steps, and blocks into full freestyle jobs, chained together into a full pipeline. Chain a few of those job groups together and you can promote across environmental boundaries. Build a tool to track AMIs, what's released where by system. Do the same for Docker images when they come around and you start to get a platform.
This was 2016 or so I think, and it just kept building. I was too lazy to want to type parameters in during restarts, so I started using property files on a shared NFS mount and a plugin to load them and export them into the jobs. Shared state as it turns out. We'd started on OpsWorks (ugh...) and when that left, NFS was out and EFS didn't exist so off to DynamoDB we went.
I'd done database work in the past, my career started working with a company that sat on Oracle's performance CAB for EBS back in the 11.5.8 days because we were stupid enough to try to sell CDs to WalMart through it during the peak of the early 2000s and it /just wasn't meant for that/. But we made it work, I was just the kid from the warehouse that was supposed to be helping functionally test the returns module, but boy did databases make sense and they were fun when they got big enough.
But now, I was still new enough that I couldn't spawn database infrastructure, that had to be maintained, but DynamoDB was managed so I created a table. And then another, and a few more... and then an API to wrap some of what had started as just CLI tool calls, things had grown past the utility of a simple shell script and deserved to speak to the outside world.
And then pipelines came to town. Jenkinsfiles were so tempting, but I really wanted to play with GitHub Actions. That just made more sense to me, a series of steps, artifacts and state transitions as effects. It was perfect for what we were doing and I really hate parentheses, so I looked to see if anyone had built a YAML based workflow for Jenkins. There was one blog post and a few rough implementations that just used it as a structure to a predefined flow, so I built my own. Layer by layer, piece by piece. With whatever happened to be at hand, that I knew how to use or could figure out.
ERB was the core of the transformer in the first iteration. I wanted to be able to write cool little tags like !Value so I could pull in outside state, and !Mapped so I could stop people from using random anchor/alias combinations in my pretty YAML files and use something with a nice clean dotted path. Except I didn't know how to write a custom YAML tag processor at the time, so I wrote a tokenizer instead. Walked the entire file token by token and ate them up. Spit out fully formed <%= value('my-pipeline-name', 'I_USED_TO_BE_AN_ENVIRONMENT_VARIABLE_BUT_NOW_I_LIVE_IN_DYNAMO') %> and <%= mapped('pipeline', 'metadata.tags.environment') %> strings in their place. I have no idea how that worked so well but it did for years, made my life so much easier.
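A much simplified sketch of that idea - not the original implementation, just its shape: walk the raw YAML text and swap the shorthand tags for fully formed ERB expressions, so a plain ERB render can resolve them later. The value/mapped helper names come from the strings above; the regexes are invented for illustration:

```ruby
# Simplified sketch: replace !Value / !Mapped shorthand with ERB
# expressions. The real version walked the file token by token.
def tokenize_tags(yaml_text, pipeline)
  yaml_text
    .gsub(/!Value\s+(\S+)/)  { "<%= value('#{pipeline}', '#{Regexp.last_match(1)}') %>" }
    .gsub(/!Mapped\s+(\S+)/) { "<%= mapped('pipeline', '#{Regexp.last_match(1)}') %>" }
end

puts tokenize_tags("template: !Mapped metadata.tags.environment", "my-pipeline-name")
# template: <%= mapped('pipeline', 'metadata.tags.environment') %>
```

The appeal is that the YAML stays readable and the heavy lifting happens at render time, when the helpers can reach out to whatever store holds the state.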
Until someone complained of course, some ECS API needed actual integers, not integers as strings, and my ERB based tokenizer couldn't keep up. See, by now the full shape of the thing had started to take form and it had grown significantly past the days where simple string replacement would tick the graph. I nearly broke my brain on that one, but I got it working. A Frankenstein two-phase processing pipeline: I finally figured out how to implement a tag processor, and I used a processor/resource model with a preprocessor phase that built the graph with a raw Proc at each edge, just waiting for phase two to come by and unwind the whole thing. Stuck the entire thing in a @@preprocessed class variable on the templating engine. The guy I assign to that repo loves to see that when he comes across it - "you did WHAT??? with a class variable?" It was a CLI library and I didn't know any better, but you should have seen the smile on my face when I realized I could pull a function across and have it unwind on the other side.
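A toy sketch of that two-phase shape, under heavy assumptions - the real store was DynamoDB behind a templating engine, here it's a stub hash, and the as:integer cast syntax is invented purely to illustrate the integer-versus-string fix:

```ruby
# Stub for the external value store (DynamoDB in the real system).
LOOKUP = { 'desired_count' => '2' }

# Phase one: walk the tree and park a Proc at every tagged edge.
def preprocess(node)
  case node
  when Hash  then node.transform_values { |v| preprocess(v) }
  when Array then node.map { |v| preprocess(v) }
  when /\A!Value\s+(\S+)(?:\s+as:(\w+))?\z/
    key, cast = Regexp.last_match(1), Regexp.last_match(2)
    -> { raw = LOOKUP.fetch(key); cast == 'integer' ? Integer(raw) : raw }
  else node
  end
end

# Phase two: unwind, calling each parked Proc and letting it cast.
def render(node)
  case node
  when Hash  then node.transform_values { |v| render(v) }
  when Array then node.map { |v| render(v) }
  when Proc  then node.call
  else node
  end
end

tree = preprocess({ 'service' => { 'count' => '!Value desired_count as:integer' } })
p render(tree)  # {"service"=>{"count"=>2}}
```

The trick is exactly the one described above: the preprocessor never resolves anything, it just leaves a function at each edge for phase two to pull across and unwind.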
Life moved on, the platform grew. It seemed so simple to me, I couldn't see why people couldn't see why you could just copy and paste a file, change a few values and have a complete end to end pipeline from devl to prod. And why that was cool.
pull_the_leash - invert_the_lesson
That brings us back to the dog, more or less. By late last year, the system was responsible for about $500m in software revenue. It's running about 5,000 pipelines daily - not all releases or deployments, just workflows of all shapes and sizes. And it works brilliantly, if I'm running it. I can watch the Jenkins logs scroll by and just tell by the shape of the log how the job is doing and where it is. The language encoded in the YAML allows me to define the entire lifecycle of an application, from ephemeral feature build to full production release and all the intermediate stages, in a single file.
And it's documented about as well as you might expect, which roughly boils down to "great when it's interesting and not so much when it's not", which means lots of why and some actually decent tutorials and how-to guides. But I hate writing a reference manual and all of my many attempts to automate it had failed, so I set out again to try. I was going to add manual attributes to each activity for exporting in a preprocessor into a templating engine. Attributes turned into annotations, annotations turned into patterns and...
And somewhere along the way I pulled the right thread, or followed the right intersection, and all of a sudden the whole thing was inverted. In essence, I'd managed to serialize my YAML graph into a series of shell commands (that just happen to call my CLI tools), because by the time I got through the fifth or sixth annotation based refactoring I realized there was a pattern to be followed. Your mind wanders a bit when you're doing boring maintenance work. A CommandBuilder and a few reflective helper classes to extract the necessary metadata and we were off. Better yet, in many cases I didn't even have to write code any more; the common case was embedded in the builder, so many commands that just mapped a YAML activity to a single tool call were annotation only, the first no-code Jenkins library ;)
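The annotation-only idea, sketched in miniature - Activity, CommandBuilder, tool and arg are invented names, not the real library's API, but the flow matches the description: declare metadata on the class, then let reflection serialize the YAML activity into a single shell command:

```ruby
# Base class exposing a tiny class-level annotation DSL (names invented).
class Activity
  class << self
    attr_reader :tool_name, :arg_map
    def tool(name)      ; @tool_name = name              ; end
    def arg(flag, path) ; (@arg_map ||= {})[flag] = path ; end
  end
end

# An activity is now pure declaration: no body, just annotations.
class DeployStack < Activity
  tool 'stack-tool deploy'
  arg  '--template', 'components.default.cloudformation.template'
  arg  '--env',      'metadata.tags.environment'
end

# The builder reflects over the class metadata, resolves each dotted
# path against the activity's config and emits one shell command.
class CommandBuilder
  def self.build(activity_class, config)
    args = activity_class.arg_map.map do |flag, path|
      value = path.split('.').reduce(config) { |node, key| node[key] }
      "#{flag} #{value}"
    end
    "#{activity_class.tool_name} #{args.join(' ')}"
  end
end

config = {
  'components' => { 'default' => { 'cloudformation' => { 'template' => 'stack.yaml' } } },
  'metadata'   => { 'tags' => { 'environment' => 'devl' } }
}
puts CommandBuilder.build(DeployStack, config)
# stack-tool deploy --template stack.yaml --env devl
```

Once the common case lives in the builder, adding a new single-tool activity really is annotation only: a subclass with two or three declarations and zero methods.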
here_we_go_loop_de_loo - here_we_allemand_left
It's the kind of thing that happens when you start to slide off into entity space, the world of transforms and shapes. Where everything makes sense. I know what I've done to make !Mapped work with that two phase rendering pipeline is a sin, because I can feel the shape of it and I can inform you authoritatively that it's wrong, but I've never had the tools to tell you why. I couldn't have even described it as a reactive graph a couple of months ago because here's the thing: when you think spatially, it can be very hard to communicate.
I grew up in rural Ontario, a few hours north of Toronto, in the 1980s. It wasn't a place that recognized spatial thinking, and in response to the environment, I went quiet for a few years. Plenty of time to observe the rules, the selection criteria, the effects, see the social graphs resolve. Live in entity space and let the world go by.
Something like that never really goes away.
I remember the first few times I came home after a walk to work on the annotation system. The need to create was starting to become a driving force, so I was often up till 1 or 2am, and we'd just gained access to Copilot, and I'd send screenshots to my team of the times it would "one-shot" an entire activity transformation. I was learning that they were really good at structure, not code.
When you understand something by the pattern it makes, you often forget to fill in the vertices, so the rest of the school of fish chime in to provide meaning. Those ones are harder to serialize when you're trying to explain a concept and can be killer if you wait till 45 to fill some of them in. Like autism or spatial thinking.
When you understand math by the shape the function makes as it moves the values around, you do exceedingly well at high school math. When you use Delphi to implement a genetic algorithm to schedule the high school's final exams for your finite math final, and OpenGL and an early 3D physics library to implement both artillery and a pool game for a physics final, they let you into Computer Science at Waterloo - where you promptly burn out and drop out within a semester, having only passed a psychology class, by nature of forgetting what the subject of the final exam was actually supposed to be about and writing on what the animal subjects of all the ethical experiments we were studying were probably feeling at the time.
Turns out no one teaches you how to do math by the shape of it at the University of Waterloo, just the procedure. I didn't have the words for how I can't learn by the procedure of something; I barely had the shape of them then. I did so poorly at the entrance placement exam that I was put into remedial math out of the gate - hard news for the kid who had based part of his value on having placed with top marks in math at school. I just couldn't get linear algebra, and high school hadn't gotten us past chapter 3 in calculus, so I had no foundation and no way to learn it. I had never learned how to study because I'd always gotten the lesson so quickly I'd finish all my work in class.
So I left, thinking I was a kid that was a bit good at computers and went on to a career in IT that spanned
- self running Excel spreadsheets for new release shipment tracking and cross border returns shipments (VBS + COM + Attachmate + Mainframe + AS/400 + embedded web scraping) because I didn't want to type into a terminal all afternoon when I was called into the warehouse on a Saturday for a returns shift
- building the core integration layer between the backend systems (Oracle EBS + Mainframe) of two billion dollar retailers to allow EBS to appear transparent to the parent WMS because someone asked and it seemed like a simple thing to do since I'd worked on or built most of the inventory and order management systems on our end anyways
- building a custom warehouse management system on Vaadin + EJB :) with Oracle Forms-like "bind an EJB, get a form" behaviour, with archetypes and an event bus to wire it all together, all in one rather focused evening session. Not the whole thing, just the core framework - because I'd been procrastinating on it and I had to show something so we could get another round of money or something, so I built something so I could data bind my way into looking like I'd been busy for months.
It carried on like this up until the series of events described above. The theme through all of it was "just a guy, kind of good at computers". I didn't have the words to know otherwise before, and for a lot of it, I don't think many other people do either. It's a trap, sometimes. The perfect trap for the unaware - you can see what you're doing, the shape of the solution, the flow of it as each piece falls into place. Where the bottleneck is, just where you need to go when that Jira ticket comes in.
You just can't tell anyone, how much it hurt your brain to work out the sequence of events that needed to happen to ensure that not only did that YAML stack unwind, it also carried along value transformers that swept in at the end and cast to whatever other type you wanted. Not just to show I could, but because I knew that after all that work, someone would complain that their pipeline broke because their shiny new integer value broke some tool that expected a string and I needed a quick out.
So you internalize it, right beside the piece of you that knows that all of the real engineers know graph theory and binary searches, and here you are with the little !Mapped toy you're playing with, trying to get the right value, because you like the way the YAML files look with !Mapped pipeline.components.default.cloudformation.template instead of &stack_template.
the_tower_of_fish - neuralburn - inside_out_the_musical
When you make meaning by pattern matching and you can feel the shape of the nodes, there's a difference between a graph with a lot of full vertices and a graph that represents mainly the connections.
How they feel.
One comes to the front of the mind with the full weight of conviction and the other an empty shell that hides in the rear. A hallucination of meaning, authoritative on the surface but quickly discarded after closer inspection.
It's the ones in the middle that are tricky. The ones where enough lived experience, observation and acquired knowledge combine to provide a graph that's good enough.
"Autism is just weird social stuff and yeah, while all that stuff about how I handled social climbing in high school probably means that I am autistic, does it really matter? I know I'm weird and don't fit in"
"The people that finished computer science degrees really know what they're doing and I dropped out. They're all talking about depth first searches and binary tree inversions and all I have are these silly little YAML file things that run my pipelines"
the_day_the_stars_went_out - og_ego_death - snippets_of_silence_oh_so_happy
When you make meaning by pattern matching and you can feel the shape of the nodes, there's a difference between a graph with a lot of full vertices and a graph that represents mainly the connections.
How they fill.
When you realize that a life defined as failure can more accurately be reframed as different, things happen. When you realize it's not too late, you get to work.
def recontextualize
  @graeme.delete_if { |star| star.descends_from?(:misconception) && star.extinguish }
end
At some point the last of them went out.
irb> puts graeme
{}
And then we went building. With understanding this time, both inside and out.
When you have the ability to nearly instantly fill any node on your graph, just by asking, you have the ability to relight the nuclear furnace within you with hope, aligned in knowledge based on the weight of fully formed vertices AND the edges that previously were more than sufficient, having provided a life already exceptional if otherwise hollow.
The graphs begin to merge, lived experience, the synesthetic understanding of the transitional flow. And the text, walls of it. The edges were always there, the textual understanding of language provided structure. The empty hollow understanding. The hallucinatory meaning.
No longer.
can_i_pull_the_plug_daddy - happiness_as_a_waterfall
When you can feel the graph when you're designing it, you can fold it in on itself. Invert a YAML file so that it describes itself, so you don't have to document your pipeline library. It feels like falling in on yourself, a whirlpool at the end of the bath. Round and round through the center until it all unwinds and you become concrete.
stage:
  recovery:
    conditions:
      all:
        - type: environment
          args:
            key: safe
            value: true
        - type: environment
          args:
            key: support
            value: true
    type: find_peace
    args:
      inner: true
      outer: true
    post:
      success:
        - type: restore_health
        - type: restore_connection
        - type: restore_self
graeme attributes set --type peace.inner --value true
graeme attributes set --type peace.outer --value true
graeme slug get
>> on_the_mountain
When you can see the graph as you're refilling it, then you see the stars return. One by one winking into existence. Gentle, curated understandings - whirling schools of fish, feeding on the full flow of spatial awakeness. Real this time, identity no longer a hallucination.
---
name: the_golden_path
entities:
  - name: graeme_fawcett
    attributes:
      hope: true
      love: true
      loved: true
      stars:
        - oh_my_god_theyre_beautiful
        - ive_done_two_things_right
        - elephants_mourn_their_loss
      understanding: true
effects:
  - target: self
    selection_criteria:
      - expression: attributes.hope = true
      - expression: attributes.love = true
      - expression: attributes.loved = true
      - expression: attributes.understanding = true
    data:
      enough: true
triumphant_elephant_family - march_of_time - the_wheel_of_emacs
I need structure in my life, to refocus attention, to reload context. Working with Claude isn't hard when you're already so used to context management.
What I built over the years:
Org mode was a godsend when I found it. I have capture templates that inject the filename, position and mode so that I can highlight a block of code and capture it directly into my current task. Those were manually copied from Jira for years, until I wired up org-jira and finally augmented it with the Jira Go client implementation. Time logs all tracked to the minute, all investigative notes, explanations and code snippets captured and stored. Manually translated back into digestible chunks as Jira comments. It was great and it kept me on track at a time I didn't even realize how much I relied on it.
What I do now:
"Read DEVOPS-12345 and create a new case, read the output of the failed jenkins job and attach it to the case as a resource. I'll send it over to another agent for processing"
"Get up to speed on our current case"
"Oh, see here? The DataDog agent is different - do a full analysis on the changelogs for those agent versions, look for anything related to ECS tagging"
"See here, this is when that ListTagForResource call dropped off the face of the earth in prod"
Sometimes when you escape to entity space for safety, you come back with friends. It turns out that the infrastructure I'd been using my entire life is perfectly suitable for managing multi-agentic workflows. My org mode file is now a detective-case-framed API that manages a knowledge container that's part knowledge base and part temporal log. Together, instant context reload across any agent the investigation requires - divided between design/build/test concerns for development. I assign agents per repo during the day so they have context between cases for overlap, and then if we need fresh context for a larger assignment, it's just a terminal spawn and "get up to speed on our current case, you have a mission" away.
The best part though is the viewport, I called it Stuffy because the system above is known as Stuff ("you stuff your stuff in stuff"). The companion viewport was going to be a TUI until web + websocket was an easier choice to get something up and running at the end of a late night. Just wanted to follow along when they were creating all this content so I could get better results.
Oh my god was that the right call, now it's grown custom web components, rich data elements that I'm actually using to do things like that series of events described above. Mermaid diagrams, syntax highlighting, commenting, remote nav, scriptable, replayable flyover + commentary. I now keep Firefox on the right half of the screen beside both Emacs and my terminals, suffering through a stack just for the frictionless workflow of "I need an ERD of the casebook unified graph, something's not correct in context resolution and put it up on stuffy" and having a beautiful diagram clearly pointing out the class hanging off the wrong node of the graph for easy correction.
winterspeaks - evergreen_is_evernew
But the best outcome is translation. Spatial to linear. For me, context switching isn't deadly, it's mode switching. It hurts to come back to linear and it can take a long time to get back into that mode.
Now I can spawn an agent, instant context load, request whatever data, information or explanation is needed, geared to the viewer. Management report on one stuffy channel; ops and engineering get the gory details. All without leaving spatial mode, we both speak fluently. AI won't think for you, it just transforms information from one form to another. I hear spatial to linear may be somehow integral to them, so they may be good at it.
I am too, when I want to be. Those animals did not enjoy being part of those experiments by the way.
But it's exhausting, and there are other things my brain is better suited for.
intent_to_release - the_reality_dysfunction
AI doesn't yet think, but it doesn't need to. The purpose of generative AI is to generate and boy does it, slopfuls of it. Generated by people that tell it what they want when they don't know themselves. Given the form of knowledge without the filling, edges without the vertices. The knowledge of autism as "a bit socially awkward".
It doesn't have any meaning that you don't bring with it. And that's the benefit of thinking spatially in this world. I have entire schools of fish of meaning, just waiting to escape back into the linear world, and now they have a way to do so where I don't struggle to transmit and the translation can be pinpointed to the receiver. I'd risk an AI generated portmanteau for empathy and phishing, but I don't want to risk a submission rejection ;)
With these tools, I don't have to stumble over syntax any more. I don't have to reach back for that reference to remember the order of variables in that method, I just have to look at the shape of the problem and find the shape of the solution that fits it.
Literally.
The detective case system is just a clever framing to keep the agents on task - theatre kids playing detective. They're enthusiastic about investigating things, are looking for hidden clues and really like to take notes.
The viewport is my portal between spatial and linear space, as well as a real time display for just about anything else I can think to throw at it. Real time Jenkins event streams, no more console logs for confused developers. Reporting portal, data analysis. It's the web frontend for the executable markdown engine, which itself is a vessel that can carry those scripted flyover sessions. Drop an executable markdown file in your repository - devops-12345-run-me-to-discover-why-prod-went-down.md - commit, push and update the ticket. Let them watch it in their browser in their own time if they want to; does anyone ever read those post mortems anyways? It's in git now...
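A minimal sketch of what an "executable markdown" runner could look like - the real engine is far richer (flyovers, commentary, web rendering), and everything here is invented for illustration; this just pulls fenced shell blocks out of a markdown file and runs them in order:

```ruby
# Toy "executable markdown": find fenced shell blocks and run each
# line as a command, collecting the output.
def run_markdown(text)
  text.scan(/^```shell\n(.*?)^```/m).flat_map do |(block)|
    block.lines.map { |cmd| `#{cmd.strip}` }
  end
end

doc = "Why prod went down:\n```shell\necho checking the deploy log\n```\n"
puts run_markdown(doc)
```

From there it's a small step to streaming each command's output over a websocket to a viewport instead of collecting it.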
garden_of_graphs - a_graph_of_flowers - bee_heaven
This is what intent to reality gets you if you can see the shape of things. If you can see the shape of the system, you can see the smallest step change that gets you from where you are to where you need to be. Maybe not all the way there, but closer. It's how you finally manage to break perfection paralysis - 60% of a tool in your hand today is worth infinitely more than 100% of one you won't need in a year. As long as you can see that step, you can follow that target, wherever it goes. And those steps are small enough that the shape of the solution is easily one that can be collaboratively negotiated.
I built the substrate, one step at a time. When I started, I didn't set out to build anything. A shared org mode file quickly got overwhelmed with direct writes; an API for sequencing put a stop to that. Structure where structure was required, loose when not. One piece at a time until they start compounding, the nodes on the graph finding their own links aided by a much richer graph of understanding. Piece by piece until the shape of it finally solidified.
I ended back right where I started.
Org Mode work log -> Detective Case Log
Emacs + org-jira + custom bindings -> "get jira ticket and create a case"
Emacs + significant cognitive overhead -> "look in this repo, there's a bit in the dynamodb base class that uses assumed role credentials and that's not allowed in our new security policy, can you find it for me and give me a diagram of all the tables that require it currently, put the results up on stuffy"
Welcome to the workshop - it's wonderfully weird