
From Geocities to GPT



I’ve been trying to find my first website for ages now. The McPhee Family Pets, circa 1999.

Hours down rabbit holes of the Wayback Machine, trying every possible URL combination I can think of. Was it heartland/prairie? EnchantedForest? Did I use underscores or hyphens? The internet has swallowed it whole, along with Porygon’s Cave and whatever I called Horsea’s page (Horsea’s Haven? Horsea’s Hideout? The name floats just out of reach).

There’s something devastating about losing these first creative digital expressions. Like they existed in some parallel internet that’s been paved over. I was ten years old with a chinchilla, pet rats, mice, chickens – the list was genuinely ridiculous – and I believed each one deserved their own dedicated webpage. Their own corner of the internet, pure “here is my rabbit named Libby and here are three facts about her” energy.

I remember the old web though. Not all of it, but fragments. Like how our dial-up plan gave us an allowance for New Zealand hosted websites versus international ones, so I’d browse locally hosted tutorials for hours. There was one about frames that explained them like a dinner plate – your main content (meat) in the middle, navigation (salad) on the side, maybe a footer (dessert) down the bottom. I thought this was the most brilliant metaphor at the time. I probably spent weeks just moving frame borders around, watching content reflow, feeling like an architect.

Then there was Vikimouse and the MousePad Kids – this website where you could adopt virtual mice that lived in elaborately crafted pixel houses. Someone called Vikimouse had made each one pixel by pixel. I’d stare at them, trying to understand how someone had that much patience. How they knew which pixel should be brown and which should be tan to make it look like wood grain. I’d view source on everything, trying to decode the magic. And then I would populate my home page with entire families of adopted, digital, pixel-art rodents.

The platform wandering started early. Geocities, Tripod, Bravepages, Angelfire – I was chasing free. Zero dollar budget, minimal ads, maximum creative control. Each platform migration was like trying on a different digital identity. Would THIS be the place where my Pokemon fan sites would finally look professional? (They never did. But they had auto-playing MIDI files and that’s what really mattered.)

I joined a forum called Young Coders, or something close to that. We’d share JavaScript snippets we’d found – mouse trails, falling snow, those eyes that followed your cursor around the page. Copy, paste, pray it worked. When it did, you felt like you’d just cast an actual spell. When it didn’t, you’d spend hours hunting for the missing semicolon, not knowing that twenty-five years later you’d still be hunting for missing semicolons, just now with better error messages.

As I grew into a teenager, things got more sophisticated. Or at least, I thought they did. Dreamweaver felt like cheating after hand-coding everything. Macromedia Flash was pure magic – suddenly things could MOVE. Not just blink tags and marquees, but actual animation. I made band fan pages with the dedication of a digital shrine builder. Learned some ASP.NET because it sounded important and grown-up, and eventually grew into PHP, where I got my first taste of the pre-WordPress bbPress.

The webrings were their own special kind of commitment. You’d apply to join one – “Anna’s Pokemon Paradise is applying to join the Elite Water Pokemon Webring” – and wait anxiously for approval. Then you’d get this chunk of HTML to add to your site with Previous and Next buttons, making you part of this infinite loop of similarly obsessed people. I was probably in twelve different rings at one point. Pokemon ones, virtual pet ones, one for just about anything.

Guestbooks were mandatory. You weren’t a real website without a guestbook. Mine was from Bravenet, plastered with whatever background GIF I thought was sophisticated that week. The entries were always the same – “Cool site!” “Love the pics!” and occasionally someone would actually write something substantial and you’d feel like you’d made it. Like your website was a real place people visited, not just pixels you were shouting into the void.

Then came the band fan pages. I had opinions and they needed dedicated web spaces. Frames for everything. Left frame: navigation with each band member’s name in a different font. Top frame: band logo I’d painstakingly cut out of a larger image in Paint Shop Pro. Main frame: “News” that I’d copied from other fan sites, maybe a gallery of images that took seventeen years to load on dial-up. I probably had a disclaimer somewhere about not owning the images, as if Sony Music was going to come after a fourteen-year-old in New Zealand.

I remember the exact moment I discovered Google in beta. I kept a catalogue of search engines and web directory listings before that (I don’t say catalogue metaphorically, I kid you not – I had a clearfile folder where I would write down the URL of every search engine and directory I could find). But Google was just… empty. A logo, a search box, two buttons. It felt wrong, like someone had forgotten to finish building it. Where were all the portal features? The weather? The news? But then you searched for something and it actually found what you wanted. Not seventeen pages of garbage with your result buried on page twelve. It was unsettling how good it was.

CSS Zen Garden broke my brain entirely. This was maybe 2003? I’d been tables-for-layout loyal, defending my nested tables like they were a personal religion. Then someone showed me CSS Zen Garden – the exact same HTML, completely transformed just by changing the stylesheet. I spent hours viewing source, trying to understand how the garden became the ocean became the subway map. It was like finding out you’d been painting with your fingers while everyone else had brushes.

I think I tried to recreate every single design. Failed spectacularly. But in that failure, I started to understand the cascade, specificity, the box model (though IE6 would torture us with that for years to come). Started to grasp that we were trying to teach browsers our intent. That HTML was supposed to be structure, CSS was presentation, and mixing them was… wrong somehow? Though I definitely kept using inline styles for “just this one quick thing” for an embarrassingly long time after.

We spent the next two decades getting really good at this conversation with browsers. Teaching them to understand that when we said “display: flex” we meant “please for the love of god just center this div.” Learning their quirks – Safari would do this, Chrome would do that, and IE… well, IE would do whatever it felt like. We learned to speak their language, to think in their logic.

And now here we are, trying to teach AI to understand context, and it’s like being ten years old staring at Vikimouse’s pixel art again. We know there’s something magical here, something transformative. But we’re still copy-pasting code snippets and praying they work. Still hunting for missing semicolons, just now they’re in JSON configs for MCP servers instead of JavaScript mouse trails.

The thing is, AI doesn’t understand context the way we learned to understand the cascade.

When I’m debugging why an MCP server won’t talk to my tools properly, it feels exactly like debugging why my frames wouldn’t resize in Netscape Navigator. Except now instead of teaching a browser that frameborder="0" means “please don’t draw that ugly gray line,” I’m teaching Claude that when I say “search my previous conversations about MCP” I mean actual conversations, not some hallucinated memory of conversations that never happened.

I’ve been experimenting with MCP servers for a little while now, and it’s giving me the same feeling as those early days of copying JavaScript snippets. You know something powerful is happening, but you’re not entirely sure why it works when it works. Just last week I spent three hours trying to figure out why my context wasn’t passing through properly, only to discover I had the wrong quotation marks. Not missing ones – the wrong kind. Curly quotes instead of straight ones. In 1999, it was forgetting to close a font tag. In 2025, it’s Unicode characters that look identical but aren’t.
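If you want to catch that bug before losing three hours to it, the check is simple enough to script. Here’s a minimal sketch (my own throwaway helper, not part of any MCP tooling) that flags curly quotes in a config file and straightens them – JSON only accepts the straight ASCII double quote, so anything smart-quoted will fail to parse:

```python
import json

# The Unicode lookalikes that sneak in when you paste from a chat
# window or a word processor, mapped to their straight equivalents.
CURLY_QUOTES = {
    "\u2018": "'",   # left single quote
    "\u2019": "'",   # right single quote
    "\u201c": '"',   # left double quote
    "\u201d": '"',   # right double quote
}

def find_curly_quotes(text):
    """Return (line, column, character) for every curly quote found."""
    hits = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        for col, ch in enumerate(line, start=1):
            if ch in CURLY_QUOTES:
                hits.append((lineno, col, ch))
    return hits

def straighten(text):
    """Replace curly quotes with straight ASCII quotes."""
    for curly, straight in CURLY_QUOTES.items():
        text = text.replace(curly, straight)
    return text

# A config pasted with smart quotes: looks fine, won't parse.
config = "{\u201ccommand\u201d: \u201cnode\u201d}"
print(find_curly_quotes(config))   # four offending characters
print(json.loads(straighten(config)))
```

Run against a real config, `find_curly_quotes` tells you exactly where the impostors are hiding, which beats squinting at two visually identical characters at midnight.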

The documentation situation feels familiar too. Back then, you’d have seventeen browser tabs open (once we got tabs – remember when opening a new site meant opening a whole new window?), each with a different tutorial that explained things slightly differently. Now I have seventeen tabs of Anthropic docs, GitHub repos, and Discord conversations where someone’s figured out something that isn’t documented anywhere yet. We’re all still collectively teaching each other, just now it’s in Slack threads instead of Young Coders forums.

But here’s what’s making me think: We got really good at teaching browsers to understand us. It took twenty-five years, but we did it. We went from table-based layouts and spacer GIFs to CSS Grid and container queries. From “best viewed in Internet Explorer 5” badges to responsive designs that work on everything from a watch to a wall-mounted TV (and perhaps even your fridge!).

What I’m wondering is – what will teaching AI look like in twenty-five years? Right now, we’re in the Geocities era of AI interaction. We’re copy-pasting prompts like we used to copy-paste JavaScript snow effects. We’re joining the AI equivalent of webrings – Discord servers and GitHub repos where people share their successful MCP configurations. We’re building the 2025 equivalent of “The McPhee Family Pets” – earnest, ambitious projects that probably won’t exist in their current form in five years, let alone twenty-five.

I found a screenshot the other day of a website I made in 2001. It had a splash page. Remember splash pages? “Click here to enter” with some elaborate Flash animation that everyone immediately clicked through. It seemed so important at the time – the grand entrance to your digital space. Now it’s almost embarrassing to look at. What will we think of our current AI interactions in 2049? Will we laugh at how we used to manually configure context windows? Will prompt engineering seem as quaint as table-based layouts?

Sometimes I wonder if those lost websites – The McPhee Family Pets, Porygon’s Cave, Horsea’s whatever-it-was – are better off disappeared. They exist now exactly as they should: perfect in memory, terrible in reality. They were never about being good websites. They were about that feeling when your HTML finally worked, when your frame borders aligned, when someone actually signed your guestbook.

That’s what I’m chasing now with MCP servers and AI tools. Not the perfect implementation, but that moment when something clicks into place. When the context passes through correctly and suddenly your tool can see your previous conversations. When the AI understands not just what you’re saying but what you mean. It’s the same magic, just with better error messages and worse documentation.

We spent decades teaching browsers to understand our intent. Now we’re teaching AI. The difference is, this time I’m not ten years old with unlimited time and a chinchilla. I’m thirty-nine with a toddler, a full-time job, and approximately seventeen minutes of free time per day. But I still get that same feeling when something finally works. That same urge to view source on everything, to understand the magic.

Don’t judge – we all started somewhere. And honestly? We’re all starting somewhere again.

The web I grew up with is gone – not just my websites, but that whole version of the internet where teenagers could build shrines to their pets and their favourite bands without thinking about SEO, TikTok videos, engagement metrics or whether an AI could do it better. But maybe that’s okay. Maybe each generation gets their own version of the web to figure out, to break, to build weird things on.

I just hope somewhere out there, some ten-year-old is building the AI equivalent of The McPhee Family Pets. Teaching GPT about their pet chickens. Making something wonderfully terrible that they’ll try to find in twenty-five years and fail.

That’s the web I want to help build.

