TMI in our digital age

i think most people would agree that i am generally an embracer of technology and of using technology to enrich work, lives, arts, &c. in middle school and high school i was the geek who was addicted to video games, excited to learn how to use computers, and spent hours logging on to BBS’s all over the Pennsylvania area to chat, play online games, and the like. I’m an advocate of technology in classical music, having composed several works for live performer and an ‘intelligent’ computer that reacts to what’s being played or to the performer breaking an infrared beam. In the job prior to my current one, I was part of a team of very tech-savvy reporting and reporting-system analysts, and we were always enthusiastic about (as my boss liked to put it) “moving reporting into the 21st century”, streamlining as many data points as we could so that the company could receive relevant data quickly, accurately, and with as little human intervention or manipulation as possible.

But there’s a distinction i make between technology that serves as positive enrichment and technology that hinders progress. A while back i wrote a reaction to the Robotic Drumstick Haptic Guidance System, and i still stand by its thesis: such a device is poorly conceived as a pedagogical tool, and anyone who uses it as the basis for their musical knowledge and understanding could become an excellent “note player” but a poor musician.

i’ve also gone off on why i further disliked iPods once they could start playing movies, and i still find value in that stance, although i think it needs to be refined somewhat. There’s no doubt that sometimes kids need attention and sometimes a parent needs to focus on other things. Distractions are a good answer to that, but they need to be approached cautiously: first in the kind of distraction involved (i like to think that some degree of cognitive distraction is better than nonsense distraction), and second in the mindset that distractions of that nature should never be an excessive or complete answer to everything (if the iPod runs out of battery during a long car ride and the parents have no idea what to do because they’ve never actually talked to their kid in the car before, something has gone wrong). In that sense, the use and/or abuse of technology is a matter of degrees and of where to define the threshold at which something moves from enriching/harmless distraction to harmful, with potential long-term negative effects.

And now there’s a new technology trend that i feel is teetering dangerously away from its initial positive enrichment toward progress-hindering, backwards thinking: too much accessible information.

In the decades during which the World Wide Web developed and grew, the prevailing mindset went through various stages. In the early days, it was “i can find useful academic information.” As the internet became more mainstream and information outside of academics started to gain a presence on the web, the mindset evolved into, “I might be able to find some of the answers i need on the web.” And then, in what i consider the post-Google era, it evolved into, “I can find anything on the web!”, or, slightly more sinister, “Why can’t i find everything on the web?”

in a lot of ways, i think the easy access to any sort of information or opinion, and the ability for so many people to connect in ways that weren’t possible before, is fantastic and has a lot of potential to land on the positive-enrichment side of things. the problem is that there’s as much useless information as useful information out on the internet, and the ability to pull up anything at any point can make it too easy for people to transfix themselves on trivia that ultimately serves no real purpose. with the recent surge of mobile internet set in motion by Apple and the iPhone, people can now feed their habit of merrily finding out whatever they want, whenever they want, whether they need to or not.

Let’s take a hypothetical example and compare mentalities:

You’re walking in the park or on a long car ride or whatever with a friend, and you’re discussing the three live-action x-men movies. In trying to compare them, you remember that in the last movie Kitty Pryde has more of a spotlight role than in the previous two, and that triggers a question: “wasn’t Kitty Pryde played by a different actress in the second movie? maybe even the first?”

in today’s mobile internet world, finding the answer is a snap: pull out your smartphone, go to IMDB or wikipedia, and find the answer you’re looking for instantly.

in yesterday’s world of internet-houses-all-information, you have to wait until you’re in front of a computer to find the answer. So one of two things happens: a) after the long car ride, you remember that this was information you wanted to know, so you find a computer, find your answer, and receive satisfaction for having answered an unanswered question, or b) you completely forget that you were curious about this tidbit of trivia and the question never gets answered, which is fine because you don’t remember asking it in the first place.

in the pre-internet era, finding the answer would be damned difficult. likely it would involve more thought than the information really warrants: trying to trigger a memory, calling up someone else who has seen the movies on the off chance that they know the answer, or something similar. And eventually you work out the answer in your head (or what you think is the answer), or else you let it go or shelve it for later and move on with your life.

What’s striking to me about all of these scenarios is that the end result doesn’t actually change anything or fulfill any sort of enrichment. Whether you discover the answer to that question, or to *any* trivia question, or not, the path your life is taking remains the same. You could say that now you know something you didn’t, but that doesn’t say much about how well you will retain that information (and in a world where the information is readily at your fingertips, there is less incentive to retain it on your own), nor does it speak to the value of the information.

So then you may argue, “if the end result is the same, then why does it matter? If immediate access to the information is a different means to the same sort of end, then i don’t see the problem.”

The problem is two-fold:

First, the easier it is to discover useless information, the more useless information people will fill their lives with. In the above example, particularly with IMDB and wikipedia, it becomes too easy to start link-hopping to tangential articles, statistics, and other random findings. Oh, that’s right, Kitty was played by Ellen Page in the last x-men movie. I wonder what else she was in? Ooooh, she had the lead role in Juno! I loved that movie! When did that come out again? oh, i didn’t know that John Malkovich produced it! That “Being John Malkovich” movie was so cool. Didn’t that have John Cusack in it?… and on and on and on, so that a harmless curiosity with a simple ten-second answer turns into a thirty-minute tangent filled with information that is likely forgotten a month later, and that thirty minutes could have been used in a different way. And sometimes that thirty minutes can turn into hours of wasted time.

Secondly, becoming used to a paradigm in which information is expected to be so accessible can in turn cause a new kind of psychological anxiety when that information is no longer accessible or when a particular piece of information is not easy to find. this is well parodied in the South Park episode Over Logging, and it’s also reminiscent of the reason why i decided a long time ago never to wear a wristwatch, which i blogged about on oscillate in 2004:

many many years ago i wore a watch around my wrist and… I reached a point where i would look at the time every two minutes out of habit, and that evolved into a *need* to know what time it was every second. I remember distinctly the first time i forgot my watch or lost my watch and there was no time piece nearby. i was in a state of total panic. I felt so afraid and insecure and alone and kept on looking around everywhere for something or someone to tell me what time it was. After that i… vowed never to ever wear a wristwatch on a regular basis ever again, opting for some sort of pocket timekeeper instead. because of this, a) i’m a much more relaxed individual, and b) i’ve developed the skill of knowing pretty accurately what time it is when asked even if the last time i checked a watch was hours before.

While not exactly analogous, i think it’s a close enough resemblance: we’ve reached a point in our culture where the expectation of information is so great that any information gap, regardless of its value, can cause stress.

Again, the issue i have isn’t really with the technology itself, it’s with how it’s being applied. And it’s something that i have to be particularly careful about because of my own addiction to information. i love absorbing a wide variety of information whether important or not, and it’s for this reason that i’ve determined that mobile internet and smartphones are something i need to keep out of my life or put under strict restrictions on how and when they’re used. i’ve developed enough bad internet habits as it is.

Originally posted on darkblog resonate. I prefer any thoughts or comments there.

Apple needs to fill the gap.

i have to give Steve Jobs props for revitalizing Apple when it was a dying company, introducing all things iBrand back in the late ’90s and early ’00s. The first iMac was noteworthy for its attempt to make computers fashionable and helped establish the momentum that paved the way for the iPod, the iBook, the iLife software suite, and now the iPhone craze.

As a loyal supporter of Apple computers since about 1994, i have mixed feelings about the direction Jobs has taken the company. On the one hand, it’s nice to see a company that was such an underdog to Microsoft bring itself back into the spotlight, and i admire it for finding ways to evolve outside of its original box and continue to push technology innovation and trends. The iPod pretty much blew away every existing portable MP3 player at the time through its marketing; the iBook (and now the MacBook) has helped make laptops of any sort more mainstream, affordable, and trendy; and the iPhone sent all of the competing mobile phone manufacturers scattering like chickens with their heads cut off to develop their own touchscreen smartphones.

But a side effect of the growth and development of that level of iCraze is that Apple’s flagship desktop line (currently the Mac Pro) has distanced itself even further from the mass market.

yesterday when i went to the lakeside mall i decided to skim my way through the new Apple store that had only recently opened there. the last time i was in an apple store was a couple of years ago in san francisco, and that time i went in with the attitude of ‘let’s just wander around’, as opposed to this time, which was ‘let’s assess the situation’.

and as i walked around this particular store, i saw iPods and iPod accessories, iPhones and iPhone accessories, iMacs and iMac accessories, and MacBooks – none of which i was looking for. There was no sign of the Mac Pro, no corner where a user looking for a more powerful computer with expandability out the wazoo could find information. It made me think the store should change its name from the Apple Store to the iTrend Store.

And this reflects a particular attitude that Apple seems to have about their two lines of desktop computers. The Mac Pro is a powerful machine and has been generally well received by critics, but once Apple went Intel it made the Mac Pro such an Ultimate High-End Machine that it doesn’t cater well to the consumer market. The base model starts at about $2300 (without monitor), and customizing the machine to give it more oomph can easily put it into the $3500-$4000 range. For what you get, that’s not unreasonable (from what i understand after basic digging), but the bottom line is still pretty steep and more computer than most people need.

Which is fine, because it’s nice that that option is available, but the problem is that the only cheaper alternative is the iMac. The base model of the iMac is $1200 (no separate monitor needed) and can be upgraded and oomphed up to specs that approach the low end of the Mac Pro at a relatively much lower cost. And i’d be completely happy with that, except that the All-In-One design of the iMac restricts the kind of expandability that i’ve always had and still want in my desktops. i want multiple RAM slots and multiple PCI slots and multiple hard drive and optical bays. i want the ability to add a second monitor to my setup and then replace it if i get a new one or need to move my current monitor somewhere else. I want to be able to put in a RAID card or upgrade my graphics card. etcetera.

Ideally it would be nice if Apple brought back the PowerMac series as a reasonable compromise to fill that gap: consumer-level processing options but with the expandability of the Mac Pro. I believe the audience is out there – the people who want a compact and efficient workstation that gets the job done but can be modded as time goes by. A PowerMac G6 could start somewhere in the middle of the iMac price range and ramp up to the beginning of the Mac Pro range, offering similar if not identical processor specs to the iMac.

But honestly i don’t see that happening any time in the near future. Apple’s desktop computers already seemed to be taking a backseat in development before the iPhone came out; now, between the newest MacBook Pros, the MacBook Airs, the iPhones, the iPods, &c., i think that the Mac desktops will continue to fade into a niche obscure market and fanbase comparable to that of Linux.

Which for me means two options: buy an old Mac Pro or G5 off of a distributor site that’s cheaper and more in line with what i want, or, for the first time in many years, consider buying/building a Windows machine as my main operating computer.

Buying a Windows machine as my main computer seems absurd because i’m much more comfortable with macintosh hardware and software, and i have all of these programs and files and archives of things that are Mac only. I hate Windows Vista, am not terribly fond of Windows XP, and don’t relish having to find a whole new suite of applications that will likely be unable to read my mac files.

And yet it still falls under consideration simply because of the question: “what do i really need in a computer, and how much is that need worth?” Against all other considerations it seems horribly imbalanced, but it’s a valid concern, since there are many other things i should be using my money for other than a $4k computer, and i bet i could build a PC that meets my needs for half that price (although i’m not sure it would last as long).

But we’ll see. All of this is moot until 2009 in any case, so when it becomes relevant i’ll look at the offerings, both current and recent past, and assess the situation then.

originally posted on darkblog resonate. comments are preferred there.

shift in video game target audiences

video games have evolved a great deal since their introduction a few decades ago, and to me, the past couple of years have shown an interesting shift in the popular video game trend and its audience, one that feels like it’s bringing the entire history of video gaming around full circle.

in its infancy, “video game” meant “arcade game”, starting (essentially) with Pong and then developing into a thriving arcade culture of individuals who plopped in quarter after quarter gobbling pellets, shooting asteroids or space invaders, or jumping over barrels. And while my personal experience in arcades growing up didn’t match the stereotype of angsty/rebellious teenagers, society definitely bought into that impression on both sides of the fence, and as the popularity of video games rose, so did the concern of parents that video games were a bad influence on youth. Video games are a waste of money, they make our kids uninterested in reading, they make our kids violent or lose touch with the real world, &c.

It’s impossible to say where video games would be right now if the Nintendo Entertainment System hadn’t revitalized the home video game industry after the video game crash of 1983. I think it was likely a mixed blessing for arcade machine developers; on the one hand, the success of the NES console took people away from the arcades and funneled more money into cartridges, but on the other hand, if the NES hadn’t brought video gaming surging back into popular culture, the arcade industry would probably have died on its own.

The interesting thing to note about the arcade industry versus the home industry is how those competing yet co-dependent paths slowly diverged over time, both in society’s attitudes toward them and in the experiences they tried to create. During the third and fourth generations of home consoles, from the mid-80s to the late 90s, home consoles were still “behind” when it came to replicating the arcade experience. The graphics weren’t as sharp, the home joystick didn’t have the same sort of “feel” as an arcade joystick, and more importantly, home consoles couldn’t match the social aspect of arcade video gaming, particularly in the early 90s when Street Fighter II and Mortal Kombat brought people back to the arcades. But the home console market of that era was able to compete in a way the prior home console market had failed to, because it had a particular slice of video game aesthetic that wasn’t meant to replicate the arcade experience; it was meant to stand on its own. Super Mario Brothers, Legend of Zelda, Metroid, Sonic the Hedgehog, and early RPGs like the early Final Fantasy and Dragon Warrior games helped define the home market audience versus the arcade audience.

It was the next generation of video game consoles (Playstation, N64, Saturn) that started to shift the dynamics and attitudes in game development, as technology and graphics for home consoles started to accelerate and create the market that still has strong influence today. The long platformer/RPG and other “console specialized” sorts of games still had a strong following, but it was also around this time that consoles had advanced enough to create a truer arcade experience, or even an experience that (in some views) *surpassed* the arcade experience. And when the generation after that came out years later (PS2, Xbox, Gamecube, Dreamcast), the arcade video game industry had to change its tactics to keep the arcade experience unique, which is how games with non-standard controllers rose to dominance, particularly music video games like Dance Dance Revolution and other bemani.

Through these decades of video game history, the overwhelming majority of consoles and systems were still aimed at the ever-changing youth. Video games that were smash hits in the 8-bit era were abandoned as a home market aesthetic in favor of games that emphasized graphic superiority and/or a greater sense of epicism. and as that philosophy of “better graphics! more dazzle! who cares about gameplay? just blow things up!” gained momentum and became a standard to uphold in entertainment in general (don’t even get me started on Michael Bay’s Transformers), it created a separation between the older and newer generations of gamers, leaving older gamers in the dust.

Until a new video game aesthetic started to creep into the mainstream, one which in its infancy was pretty invisible to the likes of me but is now impossible to ignore: the online casual flash game.

I’m not sure when casual flash games rose to such popularity, but it’s evident how strong a foothold they have in the new video gaming culture, not just because of the popularity of sites like kongregate, yahoo games, and the casual game apps that exist on facebook &c., but also because of how much prominence casual games have on the current-gen consoles. The PS3 and Xbox 360 certainly still have their genre of hardcore gamers looking for games that make full use of their power to deliver the Next Dazzling first person shooter/racing game/sports game, but both of these consoles also have an entire online paradigm dedicated to downloading and buying casual games, not unlike what’s possible on the internet. In fact, some of the games available through those consoles’ online services were found on the internet first and developed into enhanced versions, such as N+ and Flow.

In addition to this, you have the Wii. Nintendo’s whole marketing strategy for the Wii, aside from its innovative controls, is that it’s the video game console for the whole family, and with launch titles such as Wii Sports, Wii Play, and the like, it’s clear that part of the new controller design is optimized to enhance the casual game experience through the unique Wii interface.

When i think about how and why casual games have risen to such prominence, a few key factors come into play. First off, i feel that the online casual flash game was the first video game genre targeted toward older people, particularly corporate office workers. Even small businesses have integrated high-speed internet into their infrastructure, and when people need a break and are tired of reading news or looking at pictures or whatever, more and more of them find a casual flash game to occupy their time. it’s the new version of the newspaper crossword puzzle or word scramble, and it succeeds at grabbing that new audience because a) the games are generally simpler in concept and execution than typical video games (compare pointing and clicking or finding words to executing a hadouken), and b) the games are generally short to finish, an instant gratification/momentary distraction sort of thing rather than a long, involved mission with more walking and random encounters than people want to have even in real life.

Secondly, there’s the ease with which any random joe can program and develop a quality casual game. As opposed to console games, which require a team of programmers and artists and what have you to put together, Flash is easy enough to learn that a basic game can be a one-man show, and with sites like kongregate, it can gain free and instant exposure to tens of thousands of people. It’s even hit the point where those who can’t comprehend Flash can go to sites like simcarnival, where a special application makes the process even easier, requiring practically no programming experience whatsoever.
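To give a sense of just how little code a basic casual game needs, here’s a minimal sketch of a “click the target” game. It’s written in TypeScript against the browser canvas API rather than Flash/ActionScript, purely as an illustration; the canvas element id, the sizes, and the scoring logic are arbitrary assumptions, not anything from a real game.

```typescript
// Minimal "click the target" casual game sketch.
// Assumes an HTML page with <canvas id="game" width="400" height="300"></canvas>.
const canvas = document.getElementById("game") as HTMLCanvasElement;
const ctx = canvas.getContext("2d")!;

let score = 0;
const target = { x: 200, y: 150, r: 20 };

// Move the target to a random spot that stays fully on screen.
function respawn(): void {
  target.x = target.r + Math.random() * (canvas.width - 2 * target.r);
  target.y = target.r + Math.random() * (canvas.height - 2 * target.r);
}

// Score a point when a click lands inside the target circle.
canvas.addEventListener("click", (e: MouseEvent) => {
  const rect = canvas.getBoundingClientRect();
  const dx = e.clientX - rect.left - target.x;
  const dy = e.clientY - rect.top - target.y;
  if (dx * dx + dy * dy <= target.r * target.r) {
    score++;
    respawn();
  }
});

// Redraw the target and the score every animation frame.
function draw(): void {
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  ctx.beginPath();
  ctx.arc(target.x, target.y, target.r, 0, Math.PI * 2);
  ctx.fillStyle = "tomato";
  ctx.fill();
  ctx.fillStyle = "black";
  ctx.fillText(`Score: ${score}`, 10, 20);
  requestAnimationFrame(draw);
}
draw();
```

The whole thing is a respawning circle, a click test, and a redraw loop – the kind of core one person could bang out in an afternoon and then spend the rest of their time polishing.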

Third, and in my opinion most significant, some of the casual games that have come out of this have risen to true brilliance, and this is where i feel the video game trend has come full circle. Sure, there are current, more standard video games with their own sense of brilliance and success, such as WoW or the Final Fantasy series or GTA or Mortal Kombat, but it’s been a long time since there has been a video game whose brilliance matches the sensibility of how Pac Man and Tetris and Centipede and Asteroids were brilliant, or how Legend of Zelda and the original Super Mario Brothers were brilliant: despite their seeming simplicity in concept, gameplay, and graphics, they never get tiresome or old.

And because of all of this, i have a suspicion that the Big 3 console companies are on their last legs in the video game market unless momentum can be rebuilt by the likes of Rock Band and Guitar Hero. Otherwise, i strongly suspect that people will soon be more likely to buy a $5 texas hold ’em application on their smartphone or pull up a game of chain factor or their favorite kongregate game than to spend $50+ on a console video game.

Originally posted on darkblog resonate. I prefer any thoughts or comments there.