Wednesday, June 27, 2007

Information Overload Hacks, Part 3

In my previous posts on this topic, I discussed the information overload the wired worker deals with, and some of the causes of the overload. I wrote that part 3 would be an overview of a possible solution to one component of the problem- email overload.

Well, the plan has changed- the main point of Part Three is that there is no Part Three.

For each blog entry I post, there are five or six that never make it past draft- I realize halfway through the writing that the topic is not actually that interesting, not germane to the subject of this blog, or that I'm just not doing a very good job of expressing myself. Well, Part Three is different- it's the first time I've spiked a post because it was apparently too good.

I’m not referring to its literary merit, or even its pure entertainment or informational value- I’m talking about the basic idea. I was planning to outline a possible software solution to the problem of email overload; a combination of technology and design that, if built, could substantially help humans manage the flood of email they receive every day.

I ran an overview of the idea past a couple of people who are much smarter than I am- and they convinced me that there is a real possibility to develop some intellectual property from my concept- property that I shouldn’t be freely distributing on my blog. I had intended the exercise to be a thought experiment to demonstrate how a fresh review of an entrenched problem can lead to novel solutions. These smart people pointed out that there are people who start businesses and launch companies on similar premises, and that I’d be a dope not to at least explore that avenue (especially given some of the funded companies out there with no discernible utility, let alone business plan).

Given my schedule (and my full time job) it’s highly unlikely that I would be able to make much progress on it on my own- but maybe some Red Bull fueled fit of wireframing will lead to something I can hand off for development. If productized, my idea won’t change the world- but it might make email just a little bit better to use, and it would certainly make my life easier- and I may even end up answering that email you sent me ten months ago.

Saturday, June 23, 2007

A Message To Architects of the Past, and the Future

Here's a suggestion: If you're going to name your building the 'Steel Building,' make sure it's made out of, you know, steel.

Slapping a steel entrance, oddly reminiscent of a 1930's robot, onto your extremely conventional brick building doesn't make it right.

Extensive Googling failed to turn up any substantive information about why this building wears its misfit name. In the absence of any actual facts, I'm going to presume it was the headquarters for some minor steel industry office in the past, and they didn't deem it worth going over the top with an all steel building (unlike Alcoa, which uses aluminum in its offices as much as possible).

Real world architects aren't the only ones guilty of hyperbolic naming. Information architects, and the marketing departments they support, are quite fond of declarations that don't quite fit the structure they're stamped on... for instance, can there really be 109,000 "premiere online destinations" on the Web?

Similarly, is your website really 'easy to use' if you feel compelled to issue a press release explaining that you now have an index page that links to other pages on your site? Holy New Web Paradigm, Batman!

A truly great interactive experience, like a real steel building, doesn't have to label itself as great. Hopefully architects of the future- both structural and informational- will resist the temptation of patently false claims (such as this page that is inexplicably returned by a search for 'simple instructions').

Friday, June 22, 2007

Seven Sevens

Tonight I had dinner with an old friend and a group of young Drupal architects. Their company, Lullabot, has enjoyed great success, and they described how it grew organically from the two founders to an organization of about a dozen rockstars. At that point I reflexively said, 'Stop at 45!'

The 'Bots were understandably curious about that arbitrary number, so I explained how I have been a member of several startups- all of which ceased to be 'fun' places to work around the time they crossed the 50 person mark. However, I couldn't pinpoint exactly why this was the case.

Further discussion and subsequent pondering led me to a realization that 50 is just a hair over seven groups of seven. There seems to be something resonant about the number seven in the human calculus; besides all the seven references in our culture (seven days a week, seven seas, seven years bad luck) there also appears to be a hardwired capacity to manage information in groups of seven. Usability professionals speak of the (sometimes derided) 'seven plus or minus two' rule when designing interactive experiences- a menu with three items seems like too few to be consequential, whereas a side nav with twenty items is an overbearing list.

Just as web designers tune their sitemaps to achieve perceptual comfort in their information architecture, it appears humans prefer to build their social structures in similar fashion when creating organizations where interdepartmental communication is important. The department I am currently part of, while a sub group of a much larger organization, operates in a close-knit startup fashion- and yes, there are eight sub groups, each with roughly 5-9 employees (Two of the smaller groups could arguably be lumped into a single function if not for the presence of two VPs in the top roles, thus crowding what would otherwise be a streamlined org, bringing us down to seven groups).

Like a burgeoning website losing the familiar and personal touch of its quaint beta days when it begins to add more sections and navigation layers, a company that expands beyond the seven sevens becomes too large to be your extended family anymore. You cease getting to know new employees personally, and start to define them via their job functions first and personalities second. You pine for the 'good old days,' when things seemed less complicated even though the workload was proportionately higher. You tighten your circle of mental responsibility to those in your department, and suddenly you're not pulling for the company as a whole, but for the success of your circle vs. those in that other, suddenly foreign department.

These small association groups can be extremely powerful- enough so to jump apparent gaps, like an electrical spark. My small division has been thrown into the mix of a joint venture between my company and a traditional competitor- yet the virtual teams have melded together into strong working groups of about- yep- seven full time individuals (I have a hilariously uniform org chart devised by a consultant who I suspect did not ponder the metaphysical implications of their perfectly aligned Excel spreadsheet).

So it's all well and good when we can organize according to our psychological imperative- how touchy feely. What happens when this guideline breaks? My experience shows that it creates a flawed organization.

In my previous job, my boss received instructions from above to maintain a 'flat' organization- reduce the number of levels between him and the lowest employee, in an effort to reduce bureaucracy and increase accountability. As a result, he ended up with 18 direct reports- all of them design managers rolling up to his VP level position.

Each of these managers, in turn, led a team of one to three designers, each dedicated to a particular business unit in the company. The result was nearly universal dissatisfaction- operating in micro-teams, the designers had no clear relief when their workload exploded; two more hands on each team would have led to more psychological comfort that help was available for those overflow events.

Meanwhile, the eighteen design managers had less than 15 minutes a week of interaction with their manager, whose communications would often come in the form of late night email responses to inquiries sent weeks before and likely outdated by the time of reply. Finally, the VP himself was run ragged, exhausted by having to keep track of 18 teams' worth of status and continually falling behind on tasks both mundane and important- he simply had too much on his plate. Additionally, he wasn't doing his career any favors, because he never had the time or mental bandwidth to demonstrate to *his* boss that he could be a strategic leader instead of a frantic email responder.

The ultimate conclusion? The entire department was killed in a fiery bus crash when the VP fell asleep at the wheel. Actually, nothing that dramatic- the VP asked for and received permission to reorganize his department, raise a couple of lieutenants to Director level, and restore harmony and balance by moving to a more reasonable number of subgroups to manage (I'll let you guess the number). I hear it's a much nicer place to work nowadays.

The next time I have to design a department org chart maybe I'll apply some guidelines based on this seven sevens concept- and decree that managers and directors can have no more than seven directs and seconds, and VPs can have no more than seven directs. It may lead to some interesting dynamics and efficiencies in organizations where problems are typically solved by throwing bodies at them- if a manager knows they only have a limited number of resources, I bet they'll make sure they have the strongest candidates possible in place, and be more vocal about copping to their limitations and asking for cross departmental help or relief when faced with a sudden overflow situation.
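The decree above is simple enough to check mechanically. Here's a minimal sketch in Python- the org-chart structure, names, and the flat dictionary representation are all my own inventions for illustration, not any real department:

```python
# Seven sevens check: no manager may carry more than seven direct reports.
# 'org' maps each manager's name to a list of their reports; reports that
# are themselves keys in 'org' are managers, everyone else is an IC.
MAX_DIRECTS = 7

def violations(org):
    """Return (manager, report_count) pairs that break the rule."""
    return [(manager, len(reports))
            for manager, reports in org.items()
            if len(reports) > MAX_DIRECTS]

org = {
    "VP Design": ["Dir A", "Dir B"],
    "Dir A": ["d1", "d2", "d3", "d4", "d5", "d6", "d7", "d8"],  # 8 directs
    "Dir B": ["d9", "d10"],
}
print(violations(org))  # -> [('Dir A', 8)]
```

Running this against my old department's 18-direct VP would flag exactly the problem that took a reorg to fix.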

Would such a structured and rigid policy be a help or a hindrance to organizational success? I suspect the former, as project failures are usually attributable to human inabilities to recognize self-limitations. Perhaps by imposing ground rules that restrict us to our psychological comfort zones we can reduce that source of project uncertainty and failure.

At the very least, it will result in some very pretty org charts.

Wednesday, June 20, 2007

Earth Friendly Crusaders or Ironic Parasites?

I just finished reading a story about Freeganism on the New York Times website. In a nutshell, they are a social movement that attempts to reduce their impact on the planet by living off of other people's castoffs.

On the face of it, their story is a good soundbite... they are the ultimate recyclers, extracting everything from household goods to food from the considerable piles of garbage generated by our capitalist society. However, like most good soundbites, it starts to ring hollow upon deeper analysis.

The Freegans are railing against the very system that allows their lifestyle to exist. Without the constant churn and turnover of consumer goods, and the excess wealth created by the productivity consumerism incentivizes, there wouldn't be dumpsters full of mostly functioning consumer electronics and conveniences for them to raid. Visit an impoverished country like Mexico or Malaysia, or any of the myriad African countries Marie hangs out in, and you'll see this phenomenon happening as a well defined part of the economy. There are no piles of anything worth taking in the hills outside of Ensenada... the locals have made an art out of squeezing every last bit of function out of everything they've got, and not because they're trying to make a statement.

Back here in the states our Freegans don't even bother to stockpile food- because they know there will always be more in the trash tomorrow. So, the crusaders against waste are comfortable in their lifestyle precisely because of waste. It's no surprise that the movement is strongest in New York City, our country's greatest concentration of wealth and population (and garbage) in a single area.

So what is this post doing in a blog about invention and interaction?

I'm irked by the statements of some of these individuals, well meaning as they are, because they are only able to sustain their lifestyle due to the inventions of others. They could have even less impact on the earth by moving out to the wilderness, and living on what they 'find' out there- but I suspect not a one of them would last a week without the goods and services designed, produced, and delivered by those more industrious and productive than them. The goals of Freeganism and the discipline of design are the same- to improve the world. However, the designer wishes to devise, to create, to improve that which exists in order to foster a positive change- it would be unthinkable for a designer to stand up and say, 'Let's not do anything new, and just try and reuse what we've got.' Whether it's vapid consumer goods or a bionic arm for amputees, it takes a desire to improve the world- or at least your own economic situation- to make things happen. All I expect from the Freegans are more wacky human interest news stories with sanctimonious statements about all the detergent we throw away.

To me, the real irony will come once commercial recycling becomes effective and automated enough to make it worth mining the trillions of tons of junkyards and landfills out there (I'm convinced this will be a booming business within twenty years). Suddenly, the Freegans will be competing in the garbage heaps with robotic recyclers out to make a profit from the base elements in the garbage- and their way of life will be extinguished by capitalist dumpster divers mining refuse for a living.

Friday, June 15, 2007

Why Safari on Windows? Here's Why.

Om Malik asks, 'why Safari on Windows?'

Om notes that alternative browsers don't thrive on the Windows platform, and that releasing software (even a free browser) to the wider Windows platform opens Apple up to the slings and arrows of outrageous (and unfriendly) pundits.

Om speculates that Apple's motives center around switchers- the more Apple interfaces you can acclimate Windows users to prior to their switch, the easier the transition becomes.

I think there's a more subtle, longer term motive to this plan, and it can be summed up in one Ballmeresque sentence: "Developers, developers, DEVELOPERS!"

Here's my reasoning:

1) The iPhone runs Safari.

2) The only way to write 'apps' for the iPhone is to create Safari-compatible widgets.

3) Making Safari available to Windows users makes it easier for Windows developers to create iPhone widgets, widening the pool of potential iPhone developers by a factor of 10.

Safari on Windows, and all the costs and PR headaches that go along with it, is a component of Apple's master plan to democratize mobile application development and create a platform owned by device makers (like Apple) and less dependent on carrier control.

Wednesday, June 13, 2007

iPhone: Crippled Also-Ran or Clever Revolutionary?

In the wake of Steve Jobs' Worldwide Developers Conference keynote, there's been a firestorm of discussion regarding the lack of a 'true' SDK for the much-anticipated iPhone. Most of the negative feedback centers around the assertion that without the ability to write 'true' software apps for the iPhone, Apple is shutting out development of really useful apps. Instead, developers (and by extension, users) will have to make do with widgets- AJAX applications designed to execute in a browser running off information stored on servers.

The interactive experiences I invent are pretty much all browser based applications, so I come into this argument with a bias- but what truly interests me is not what I can get out of the iPhone. In this case, I'm more interested in the business perspective, and what this decision will do to the mobile platform in general.

In my work, I speak with interactive professionals who work in the mobile space, and their #1 complaint is the endless multiplicity of platforms they need to develop and deploy to. #2 is the management layer of the various carriers who control those platforms. As a result, it's a lot harder to gain a critical mass of users in the mobile application space than in the web space- there's no equivalent to making a quick Flash game and having it available to 90% of users overnight.

If you're a web designer, the situation is akin to there being fourteen different browsers you would have to build and test for- plus, these browsers operate differently for Yahoo! users, for AOL users, for MSN users, etc.

Despite its apparent popularity, the iPhone isn't going to change this situation overnight. Even Apple has stated they'd be happy with a 1% marketshare in 2008- that's not going to represent a market mover in terms of platforms.

Instead, I think what Jobs & Co. are doing is more subversive- they are democratizing the mobile application space. That's the core of the dismay we're hearing from developers- the flip side of their argument (they can't make real apps) is that AJAX apps are somehow fake. This is an echo of the old back end vs. front end non-argument that erupts all over the interactive industry map when web developers meet software engineers.

When all it takes is a copy of Dreamweaver, an O'Reilly book, and some copy-and-pasted JavaScript to make a web app, we will see an explosion of single purpose widgets proudly proclaiming their iPhone compatibility. Most of these will be utter rubbish- and the message boards and ratings on Hotscripts and Versiontracker will serve to banish them into well deserved obscurity. However, some will be gems- as a frequent air traveler and motorcycle rider, I rely on the OS X Dashboard 'Radar In Motion' widget to predict whether I will be delayed or drenched. It is completely immaterial to me that this widget is just an HTML fragment and not a compiled blob of code- it does what I need it to do, elegantly and conveniently.

No matter how many Diggers cry foul, when Jobs stands up at Macworld San Francisco and proclaims that thousands of apps for the iPhone have been developed since launch, no consumer is going to care that they aren't 'real' apps. All they are going to care about is: Does this widget let me do something I couldn't do, or make something easier to do? I firmly believe that all the 'fake' developers need is imagination, insight and focus to supply that user experience, despite the sandbox Apple is making them play in.

Tuesday, June 12, 2007

Not So Smart Phones

As part of my job, I carry a Blackberry, which I use as a mobile phone, an email client, and a web browsing device. It has undeniably benefited my productivity and flexibility- I'm tapping this note as I wait in a train, stopped in a tunnel for some unknown reason- but through my incessant usage I've identified some ways in which it could be enormously improved:

1) Always show the time!

The only screen in which the time is visible is the 'home' screen, which is only visible when I'm either doing nothing, or in the process of selecting an app to launch. Once I'm actually doing something (such as composing this blog post on the train) the current time is hidden, and I would have to save this message as a draft and exit the 'mail' application, or look at my watch- which seems like a brutish workaround for an otherwise advanced piece of technology.

2) Allow sensible multitasking

Multitasking has different demands on a mobile device than a desktop computer- it doesn't make sense to have both your web browser and your email client open at the same time, due to limited screen real estate and the methods of interaction unique to mobile email and web. However, there are times when I wish for at least limited multitasking- mainly when I'm using the Blackberry as a phone. Typically, I'll be gabbing away on the phone and the individual on the other side will ask me about my availability for a meeting, or need some info from an email I've sent or received- and I'll have to apologize, explaining that the info they need is on the device I'm using to speak with them, and that I won't be able to access it until I terminate the call.

This may be a limitation of my current device- a corp-issued Blackberry 7250- my carrier's version of the software, or even myself- there may be a way to swap into another app while I'm talking. If there is, I haven't found it- during a call all I see is a button indicating how to end the call. Being able to look up contacts or access email during a call would be a welcome improvement (and something I seem to recall was possible on my dear departed Sidekick II- but that may just be nostalgia talking).

3) More info about incoming email, please!

Similar to the note above about poor multitasking, I'd like some more informative notification of the content of incoming e-mails.

When I'm tapping away at an email, or browsing the web, new incoming email is denoted by a blinking red light on the device. Urgent messages from my boss get the same treatment as a piece of spam... blink blink blink. If I want to stay on top of incoming emails I need to interrupt my session to view the latest messages.

A brief popup preview, with the subject, sender and first line of text, overlaid on the top of the screen, would be extremely helpful. If it's something I want to see, I could 'jog' up and select it to view the full message, or I could 'cancel' to dismiss it if its presence offends me- or I could just wait a few seconds for it to disappear on its own. Either way, I could continue my current task secure in the knowledge that I wasn't missing anything important.
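The interaction above is really just a tiny state machine. Here's a hedged sketch of it in Python- the class name, timeout value, and method names are all hypothetical, not anything from the Blackberry SDK:

```python
# A minimal model of the popup preview: show sender, subject, and first
# line; the user can jog-select to open, cancel to dismiss, or simply
# let it time out on its own.
import time

PREVIEW_TIMEOUT = 5.0  # seconds before the popup disappears by itself

class PreviewPopup:
    def __init__(self, sender, subject, first_line):
        self.sender = sender
        self.subject = subject
        self.first_line = first_line
        self.shown_at = time.monotonic()
        self.state = "showing"

    def render(self):
        # The one-line overlay drawn at the top of the screen.
        return f"{self.sender}: {self.subject} - {self.first_line}"

    def jog_select(self):
        # User jogged up and selected: open the full message.
        self.state = "opened"

    def cancel(self):
        # User explicitly dismissed the popup.
        self.state = "dismissed"

    def tick(self):
        # Called periodically by the UI loop; auto-dismiss after timeout.
        if self.state == "showing" and time.monotonic() - self.shown_at > PREVIEW_TIMEOUT:
            self.state = "expired"
```

The key design point is the third exit path: doing nothing is a valid response, so the popup must never demand interaction before the user can return to their task.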

4) Spell check- even a painfully slow one, please!

This entire post was tapped out on my Blackberry- it's how I pass the time on the various trains and buses I travel on each day. The email composing software is savvy enough to catch common misspellings- it will automatically correct a transposition in 'their,' for instance- but it falls down on grammar (it won't catch the dropped 'd' in 'peas an carrots'), and it won't catch longer manglings, such as 'portotyppe'.

This weak spellchecker is unfortunate, since the cramped ergonomics of the thumboard and the pint sized screen make it more likely that I'm going to commit a typo. I understand that space and processing power are limited on a handheld. How about a function where I can submit my composed email to some online spell checker that then returns it with errors flagged along with a little XML payload that lets me choose from possible guesses? Google and Blogger seem to do a pretty good job with online spell checkers- I'd wait ten seconds for a submit and response if it means a cleaner outgoing email from my Blackberry.
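The submit-and-flag round trip described above can be sketched locally with the standard library standing in for the imaginary online service. The word list here is a toy, and the payload shape is my own guess at what such a service might return- nothing below is a real Blackberry or Google API:

```python
# A stand-in for the round-trip spell check: submit composed text, get
# back each unknown word mapped to a short list of candidate corrections
# (the 'little payload that lets me choose from possible guesses').
import difflib
import re

# Toy dictionary; a real service would have a full word list.
DICTIONARY = {"a", "and", "carrots", "check", "email", "of", "peas",
              "prototype", "quick", "the"}

def spell_check(text):
    flagged = {}
    for word in re.findall(r"[a-zA-Z]+", text.lower()):
        if word not in DICTIONARY:
            # Up to three fuzzy-matched suggestions per unknown word.
            flagged[word] = difflib.get_close_matches(word, DICTIONARY, n=3)
    return flagged

print(spell_check("a quick check of the portotyppe"))
# -> {'portotyppe': ['prototype']}
```

Ten seconds of latency for a round trip like this would be a fine trade for catching 'portotyppe' before it goes out the door.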

I suspect some or most of these shortcomings will be addressed by upcoming handsets, more purpose built for mobile work (such as the iPhone) and more aware of the challenges and opportunities posed by mobile multitasking. None of my requests require new technology to be developed- they just require some thoughtfulness and good interface design.

Monday, June 11, 2007

Information Overload Hacks, part 2

In my last post in this thread, I wrote about a corporate cry for help- the internal '24' campaign at a big-name branding agency, designed to create a sense of urgency around emails and voice messages that reference the numeral '24.'

I by no means blame the individual or organization for succumbing to a social hack to solve the problems of information overload, as the issue they're grappling with affects all of us in the connected world. Fred Wilson went so far as to declare email bankruptcy- the public admission that he could not keep up with the constant flood of email pouring in throughout the day.

The root of the problem is not that our society is flawed, or that people are workaholics and send too many emails. I blame email itself- the medium, and the technology that expresses it. Like an aging trolley system in a burgeoning metropolis, email has failed to keep up with the task at hand.

The problem with email is that the presentation obscures the important information. We each receive emails of wildly varying importance and urgency each day, but aside from some incredibly minor interface elements (a tiny '!', or language in the subject line such as 'URGENT' or, at the company I mentioned, '24'), there is no immediately obvious way to differentiate the noise from the truly important emails.

Instead, you begin to methodically examine each email:

- who is the sender? Is it an important colleague, client or loved one?
- if one of the above, does the subject line reference anything I am especially interested in, or does it de-escalate the message urgency or interest?
- assuming the message passes one of the tests above, you then need to open the email, read and evaluate the contents, and then make a decision- reply? Trash it? Defer it for future action?
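The three steps above transcribe almost literally into code. A minimal sketch- the sender lists and keywords are placeholders I made up, and step three is deliberately left to the human:

```python
# The manual triage loop, written out as a filter. Only steps 1 and 2
# can be automated; step 3 is the part no filter can do for you.
IMPORTANT_SENDERS = {"boss@example.com", "client@example.com"}
HOT_TOPICS = {"urgent", "deadline", "launch"}

def triage(sender, subject):
    # Step 1: who is the sender? Important colleague, client, loved one?
    if sender not in IMPORTANT_SENDERS:
        return "skim later"
    # Step 2: does the subject reference anything I care about right now?
    if not any(topic in subject.lower() for topic in HOT_TOPICS):
        return "skim later"
    # Step 3: open, read, evaluate, and decide- reply, trash, or defer.
    return "open and decide"

print(triage("boss@example.com", "Launch date slipping- need input"))
# -> open and decide
```

Note that even this toy version requires hand-maintained lists of senders and topics- which is exactly the burden a smarter email client should lift off the user.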

While the three steps above are simple in themselves, the true cost comes from the frequency and variability of the action. Making hundreds of such evaluation / decision loops a day would be fairly easy if you were inspecting identical flower pots on an assembly line for defects- the work would become automatic, and you might even describe it as mindless. That's because you would be able to focus on the small set of possibilities in each decision set- is there a crack in the pot? Is the rim malformed?

With corporate email, the scenarios you need to instantly load into local memory, evaluate, and render a decision upon are practically endless- anything (in my case) from obscure automatically generated emails informing me of a forgotten contractor's building access expiring or some helpdesk in India failing to receive my expense receipts, to cloaked cries for help from a staffer unhappy with their job and their relationship with their coworkers. The danger I find is missing the latter in the flood of the former- but ignoring the 'noise' brings its own risks (I've suffered the wrath of the expense receipts department before, and I dare not repeat the experience).

The constant mental switching is what tires us out and frustrates our desires to find continuity and focus in our daily lives. I hear it all the time when staffers feel like they are being unduly buffeted by administrative or procedural requests- 'I just want to come in, sit down, and do my job.' They (rightly) don't consider the constant peppering of emails from all points of the compass to be part of their core duties, and they fear it endangers their ability to succeed at what they were hired for.

Ok, I promised a solution in the last post, and I'll deliver- I think the problem can be mitigated by rethinking what e-mail is, and how email clients work. Obviously, changing a piece of software is not going to alter patterns of behavior that have emerged as part of the information economy, but it will prevent email from taking a bad problem and making it much, much worse.

The core of the problem lies in email's weak answer to the question of importance sorting- the current framework puts the onus on the sender to indicate importance to the receiver. How in the heck is someone external to me, possibly thousands of miles and several organizational rat mazes away, supposed to know what I'm most concerned about in any given moment?

Email, and all similar messaging, must change to be more user centered- to be more aware of the context in which the receiver is evaluating the messages. It must give the user control- not to sort, delete, or search emails; that ground has already been covered, within its limitations. Email must evolve to the next stage, where it becomes a partner that helps you understand and evaluate the incoming information, and provides clarity and control over your precious decision making time and mental bandwidth.
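To make the 'partner' idea concrete: one way a receiver-centered client could work is to score each incoming message against the receiver's own context (who they actually reply to, what they're working on) rather than the sender's self-declared urgency. Every signal and weight below is invented for the sketch- this is a thought experiment, not the design I spiked:

```python
# Receiver-side importance scoring: the receiver's context, not the
# sender's '!' flag, drives the ranking.
def importance(msg, my_context):
    score = 0
    # People I actually reply to matter more than any urgency flag.
    score += 3 * my_context["reply_counts"].get(msg["sender"], 0)
    # Messages touching my current projects rise to the top.
    if any(p in msg["subject"].lower() for p in my_context["active_projects"]):
        score += 5
    # Sender-declared urgency is a weak signal at best.
    if msg.get("urgent_flag"):
        score += 1
    return score

context = {"reply_counts": {"alice": 4}, "active_projects": ["redesign"]}
inbox = [
    {"sender": "noreply", "subject": "Expense receipt reminder", "urgent_flag": True},
    {"sender": "alice", "subject": "Redesign feedback"},
]
ranked = sorted(inbox, key=lambda m: importance(m, context), reverse=True)
print([m["sender"] for m in ranked])  # -> ['alice', 'noreply']
```

The point of the sketch is the inversion: the automated 'URGENT' receipt reminder loses to a plain message from a frequent correspondent about an active project- the cloaked cry for help beats the noise.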

In my third and final post on this topic, I'll outline what a next generation email application that performs these functions might look like, and what it's going to take to get us there.

Wednesday, June 6, 2007

Beta: The 'Under Construction' of the Aughts

Does anyone remember this icon?

It was the image you placed on your website back in the '90s to signify that you weren't quite done with the page, and that the viewer should pardon any poor table alignments, or groovy background images that didn't tile correctly, or perhaps your accidental wrapping of the entire site text in an unfortunate blink tag.

The usage of these icons faded away once web designers clued up a little, and realized that the best web pages were continually evolving- and that you did no one a favor (most of all, yourself) by pointing out the deficiencies of your creation.

Fast forward to the present, and we have the concept of 'Beta.' Slapping a Beta tag on your otherwise open and functional site signifies that you're not quite done with your app, and that the user should pardon wonky CSS, or a Flash movie that doesn't quite play right, or the bad Ajax that makes your text blink every time the user moves their mouse.

Today, I sat in a room of interactive professionals helping to conceive of a next generation user experience around media consumption, and I challenged them to live without the Beta crutch. If we're going to have a Beta, let it be a real one- with a closed, limited group of beta testers, who are tasked with running the site through its paces and reporting on any last minute tweaks and fixes that need to be made.

I think Beta goes both ways- it can be an excuse for limited or half-baked functions, but I bet it sometimes appears when the site owners want the aroma of a fast-paced, innovative release cycle, rather than a cautious, test-until-it-works old school model.

My advice to would-be Beta launchers: If it's just a launch where you didn't get everything in there that you wanted, call that version 1 and pick up version 2 where you left off.

Tuesday, June 5, 2007

With The Spinning Blades, and the Screaming...

Seen on the Captivate elevator screen this morning:

"Lawn mower accidents- all of them preventable- send 9,400 kids to emergency rooms each year."

OK, so I understand the point of putting 'preventable' in the sentence, to increase the senselessness of the carnage. However, doesn't it raise the question of what an unpreventable lawn mower accident looks like?

"We were walking on the beach... nowhere NEAR a lawn... when the mower emerged from the waves and bolted up the sand, shredding sunbathers left and right! We tried to run, but it happened so fast- there was no way this could have been prevented!"

Information Overload Hacks

I had a meeting at a big-name branding agency today, and saw something that reminded me how much work we user experience professionals still have to do to make technology really useful for humans.

It was a poster on the bathroom door, with a giant numeral '24' on it. The text of the poster outlined the new office wide '24' initiative- any message (email or voice) that referenced '24' in the subject or body was to be acted upon, responded to, and satisfactorily resolved within 24 hours.

This amused, saddened, and enlightened me all at the same time. It perfectly illustrates how humans make up for deficiencies in applications by hacking around them out of desperation.

Doesn't just about every email system have a method to mark an email 'Urgent?' I can only assume that so many people marked emails 'urgent,' that eventually that function lost its meaning. I bet at that point, users started to add the word 'URGENT' in all caps to messages, until that method was likewise ignored.

Somewhere, someone dropped the ball after ignoring too many 'URGENT' emails, and an operations lead decided the way to reset the urgency meter was to introduce a novel way of indicating that something is urgent- one people had not yet developed an immunity to- and then reinforce it with a company wide awareness program using official posters to impress upon everyone the seriousness of the 24 initiative.

I can just imagine that the overloaded office drones, informed via poster of their misbehavior and the new top down initiative to correct it, began to immediately subvert the system with messages like:

24- lunch plans

Hey, what are you doing for lunch today? 24.

- since that is obviously a question that must be answered well within 24 hours, otherwise it's hopelessly out of date.

The effectiveness of this policy will be directly proportional to its enforcement. Will the first person to take 26 hours to reply lose their job?

Next post in this thread: attacking the root of the '24' problem.