This is a little project I did last summer to explore glitching games. I've long been inspired by speedrunning, and by glitches like the Missingno glitch in Pokémon Red and Blue.
I love being able to pull games apart to see how they work, and getting an insight into what the data looks like from the perspective of the machine. For example, the Missingno glitch shows us what happens when the Game Boy interprets the player's name as the parameters of a Pokémon.
With this project I worked on modifying GenesisPlus, an open source Sega Mega Drive emulator for Mac, which is bundled as part of OpenEmu. I added a system for scripting interactions with the console's memory, so it can write new values into memory on certain triggers: a period of time passing, a button being pressed, or another value in memory changing.
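To give a flavour of the idea, here's a minimal sketch of that kind of trigger-based memory scripting. This is illustrative only: the class and method names (`MemoryScript`, `on_timer`, `on_memory_equals`) are invented for this example and aren't from the actual GenesisPlus modification, which works on the emulator's real memory rather than a stand-in `bytearray`.

```python
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class Trigger:
    # condition takes (memory, frame_count) and returns True when the
    # trigger should fire; firing writes `value` into `address`.
    condition: Callable[[bytearray, int], bool]
    address: int
    value: int


@dataclass
class MemoryScript:
    memory: bytearray
    triggers: List[Trigger] = field(default_factory=list)
    frame: int = 0

    def on_timer(self, frames: int, address: int, value: int) -> None:
        # Fire once a given number of frames has elapsed.
        self.triggers.append(
            Trigger(lambda m, f: f == frames, address, value))

    def on_memory_equals(self, watch_addr: int, watch_val: int,
                         address: int, value: int) -> None:
        # Fire whenever another memory location holds a particular value.
        self.triggers.append(
            Trigger(lambda m, f: m[watch_addr] == watch_val, address, value))

    def step(self) -> None:
        # Called once per emulated frame: check every trigger and apply
        # its memory write if the condition holds.
        self.frame += 1
        for t in self.triggers:
            if t.condition(self.memory, self.frame):
                self.memory[t.address] = t.value
```

A button-press trigger would work the same way, with the condition reading the emulator's controller state instead of memory. Chaining triggers off memory values is what makes glitch-like cascades possible: one write flips a value that another trigger is watching.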
I then worked on turning this into something that would be fun to play in its own right. I added networking features so that players playing separate copies of the game can interact with each other, using glitching to fuel a competitive challenge.
The video below talks about how I created this, how it works, the thought process I went through, and the Sonic the Hedgehog 2 variant that emerged as a result. I hope you enjoy watching!
ELIZA was the world’s first digital psychotherapist. Created from 1964 to 1966, long before Siri and Cortana, long even before the first commercial videogames, ELIZA was an AI that had conversations with its users.
A user, communicating with ELIZA through a terminal, would be asked a question about themselves, and ELIZA would listen, prompting the user with questions.
Except ELIZA had no idea what was going on. ELIZA only created the illusion of understanding, using pattern-matching and substitution to parrot the user’s own words back in the form of a question.
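The trick is remarkably simple. Here's a minimal sketch of ELIZA-style pattern-matching and substitution; these few rules are illustrative inventions, not Weizenbaum's original DOCTOR script, which had a far richer rule set and keyword-ranking system.

```python
import re

# Swap first-person words for second-person so the echo reads naturally.
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

# Each rule pairs a pattern with a question template that parrots the
# captured fragment back at the user.
RULES = [
    (re.compile(r"i am (.*)", re.IGNORECASE), "Why are you {}?"),
    (re.compile(r"i feel (.*)", re.IGNORECASE), "Why do you feel {}?"),
    (re.compile(r"my (.*)", re.IGNORECASE), "Tell me more about your {}."),
]


def reflect(fragment: str) -> str:
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())


def respond(utterance: str) -> str:
    text = utterance.strip().rstrip(".!")
    for pattern, template in RULES:
        match = pattern.match(text)
        if match:
            return template.format(reflect(match.group(1)))
    # Fallback when nothing matches: a contentless prompt to keep talking.
    return "Please, go on."
```

With no model of meaning at all, `respond("I am worried about my future")` yields "Why are you worried about your future?" — which is exactly the illusion users found so convincing.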
ELIZA’s conversational ability grew over time - not through machine learning, but through users adding new rules and behaviours to her script. She was an illusion, non-sentient and entirely artificial. Nevertheless, users were reported as having meaningful conversations with her. ELIZA talked them through their problems. They found the experience comforting, often revealing to themselves inner feelings they hadn’t acknowledged.
Joseph Weizenbaum, the creator of the program, was dismissive of this response. He had created ELIZA as a parody of artificial intelligence, to demonstrate the superficiality of communication between man and machine. He felt the popular response was merely a result of humanity’s tendency to anthropomorphise the world around them.
Regardless of what was really going on under the hood, users had a meaningful human experience with ELIZA. Whether or not the machine was actually intelligent is not important. Even whether or not users actually believed that the device was intelligent is, arguably, of little consequence.
For the end user, their emotional response was the entirety of the experience. The banality of the program only mattered if believing it to be artificial affected that response.
Maybe it was enough to simply play along with the artifice.