november 19

artificial intelligence class is rarely a place where i come to any sort of revelation. i mean sure, the concepts are new and exciting at times, but mostly i'm a little drowsy or overwhelmed by cs whizzes. so today i zoned out, but in a new way... kinda like a mental exercise or experiment. i imagined i had no idea what was going on, no history or previous knowledge of the subject matter, and as people were talking around me, all i could really conceive of was how silly they sounded, what their body language was like, how they interacted. all the context was lost, and it was pretty interesting.

as i walked away from the building, i conceptualized the experience like this: people hold entire world models in their heads, some all the time, some created and finished during certain occasions. like when you go to a movie, and you suspend all disbelief and get absorbed, taking in what you see and hear and creating a world that you temporarily exist in. you make a model, an environment that you have to keep track of, a whole context for action and thought. it's also like someone programming... you have to keep the grand order of the program in the back of your head, hold the whole thing there and access it when a problem appears, referencing lightning-fast thru all the ideas and data that you've accumulated. then when the program is done, you are allowed to release all that memory space, and you feel somehow refreshed. fiction writers have to do it too... they create a world in their heads, explain it to us, and have characters move realistically (pertaining to the local rules of realism in that world) in that environment, all the while retaining every detail. so in a class, you have all the past material and vocabulary set when you walk in, and you then call up that little world, put yourself in it, act inside it, then tuck it away again till next class. i just decided to take a little walk outside looking in.

i have great admiration for people like writers who can hold such vast universes in their minds, and hold them realistically. a computer has the most difficult time doing just this, modeling a realistic world based on whatever internal structures might exist. it is one of the things that keeps artificial intelligences from truly being useful, or more interestingly, alive in a way that we would want them to be - aware and active (vs. passive), thinking realistically and intuitively, joking, or dreaming.

i also wondered... how much is our everyday life a journey thru our own world model, our own environment that we are continuously creating and interpreting? we have our own world views with local logic and specific vocabularies of words and possibilities, histories and memories, expectations, other crazy shit that we pick up along the way. so what happens when we take a stroll outside of that?

