The "20-year nostalgia cycle" is probably something you've noticed, even if you've not really paid attention to it. It's the idea that each generation rejects the aesthetic of the one that immediately precedes it, while rehabilitating the look, feel, and sounds of the one before it.
Your kids probably hate your music but really love Grandpa and Grandma's playlists.
In fashion, it's known as Laver's Law (1937): the same garment reads as "indecent" ten years ahead of its time, "smart" in its own season, "ridiculous" twenty years after, and "charming" again seventy years later.
Technology experiences its own oscillations. First with hardware and hosting:
1960s-1970s: Mainframes. Centralized. Compute was scarce and expensive; you time-shared a single machine. IBM owned this.
Late 1970s-1980s: Minicomputers. First decentralization. DEC, Data General, Wang. Departments could afford their own machine. Cheaper compute moved processing closer to the user.
1980s-1990s: PCs and client-server. Full decentralization. Compute on every desk. The mainframe became the "back office"; databases stayed central but applications ran locally.
Mid-1990s-2000s: Web/thin client. Partial recentralization. Browsers as universal clients, applications served from data centers, but companies still ran their own servers. This is when ASP (application service provider, the SaaS precursor) first appeared and mostly failed because bandwidth wasn't ready.
2006-present: Cloud/SaaS. Heavy recentralization. AWS launched in 2006, Salesforce had been proving the SaaS model since 1999, and by ~2012 cloud-first was the default. Compute, storage, and applications all rented from a handful of hyperscalers. Functionally a return to the time-share model with better UX.
2020s-now: Edge / repatriation / on-prem revival. Partial decentralization beginning. Cloud repatriation is a real and measurable trend (37signals' "leaving cloud" being the loud example, but quieter cases are everywhere — companies hitting the wall on AWS bills and rebuilding on Hetzner/colo). Edge computing pushes inference closer to users. Local LLMs running on Apple Silicon and consumer GPUs are the consumer-facing version.
Software architecture has its own mood swings.
Monolith → SOA (early 2000s) → Microservices (2010s) → "Distributed monolith" backlash → Modular monolith / "right-sized services"
The languages we choose swing between static and dynamic on a slightly slower cycle.
Statically typed (Java/C++) → Dynamic (Ruby/Python/JS) → Statically typed again (Rust/Go/TS)
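A toy sketch of the trade-off driving that swing (the function names and values here are invented for illustration, not from any real codebase): dynamic typing optimizes for how fast you can write, static typing for how fast you catch mistakes.

```typescript
// Dynamic-era style: any[] accepts anything, so a stray string only
// surfaces at runtime, as silent string concatenation.
function totalDynamic(prices: any[]): any {
  return prices.reduce((sum, p) => sum + p, 0);
}
console.log(totalDynamic([10, 4.5]));   // 14.5
console.log(totalDynamic([10, "4.5"])); // "104.5" -- oops

// Static-era style: the same mistake is now a compile-time error.
function totalTyped(prices: number[]): number {
  return prices.reduce((sum, p) => sum + p, 0);
}
console.log(totalTyped([10, 4.5]));     // 14.5
// totalTyped([10, "4.5"]);             // rejected before it ever ships
```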
The underlying paradigm swings too, on an even longer period, since it tends to lag the choice of language.
Imperative → OOP → Functional → "Multi-paradigm"
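The same kind of toy sketch (again, names invented) shows the paradigm pendulum inside a single language, and why "multi-paradigm" won: both versions are useful, depending on what you're expressing.

```typescript
// Imperative: say how, step by step, with mutable state.
function sumOfEvenSquaresImperative(xs: number[]): number {
  let total = 0;
  for (const x of xs) {
    if (x % 2 === 0) {
      total += x * x;
    }
  }
  return total;
}

// Functional: say what, as a pipeline of pure transformations.
const sumOfEvenSquaresFunctional = (xs: number[]): number =>
  xs
    .filter((x) => x % 2 === 0)
    .map((x) => x * x)
    .reduce((sum, sq) => sum + sq, 0);

console.log(sumOfEvenSquaresImperative([1, 2, 3, 4])); // 20
console.log(sumOfEvenSquaresFunctional([1, 2, 3, 4])); // 20
```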
And finally, the cycle most relevant to anyone in the SWE field: how software gets built.
If you want to know the future of software, you only have to look at the past. The parallels between the current trends in software, specifically the unholy matrimony of the hyperscalers and SaaS, and the arc of Warrant's career, are apt:
The hyperscalers, and the SaaS that runs on them, aren't going anywhere. Poison still plays state fairs. AWS in 2040 will still be running the COBOL-equivalent workloads for the handful of banks and other corporate customers that can't absorb the risk of migrating.
The rest of us will have moved on by then, as the cycles of the past show we will. Hair bands were sold to teenagers in the 80s as identity goods, and identity goods are price-insensitive only up to the limit of an allowance. Hyperscalers and SaaS are sold to CFOs as operational expense, and operational expense is price-insensitive only up to the limit of board patience; multiple consecutive years of double-digit inflation have pushed that patience to its limit.
Teenagers in 1991 didn't need a board meeting; they just needed Nevermind.