Captology: A Bad Name for an Interesting Idea

by William Grosso

Ever have one of those nights when you discover that there's somehow, magically, a hole in your brain, and an obvious idea that you've completely overlooked? It happened to me last night, at a talk on Captology.

You see, I've been designing and building a shareware application in my spare time (it's a fun and useful product; we'll ship the first version in August). So, I've been working hard on things like memory footprint and usability, and finding a feature set that captures the core functionality (so it's useful) while still being implementable by a small team of programmers (as of yesterday, there are three of us working part-time).

Last night, I went to Mark Finnern's Bay Area Futurists Salon. BJ Fogg was talking about Captology. The idea behind captology is simple: computers (and other technological devices) can be used to persuade people to do things. Not just by displaying advertisements, but also by subtly rewarding "correct behaviors." The element of interactivity that computers provide makes them enormously different from more traditional attempts at persuasion (for example, billboard advertisements).

This is interesting. Designing software to persuade people to do things. It's not the approach I typically take (except in the inadvertent extreme case of "well, if they do that, the software will crash. And then they'll learn not to do that"). And, with some exceptions (software for the children's market) it's not the approach most software takes (though things like wizards and training features are often added later). Most software development is about functionality and usability, and only incidentally about modifying the user.

Or is that true? I'm now beginning to wonder.

In any case, the scope of this is huge. It's not just about getting people to buy stuff. You can also try to change opinions, get people to join the army, and so on.
Examples ranged from Amazon's Gold Box to the talking Barney.

And I realized last night that I'd been missing something. The goal of a shareware product is to get people to buy it. So maybe, the UI, instead of being designed for functionality and ease of use, ought to be designed to convince users to buy it. Maybe I should be trying to think of subtle conditioning techniques that will encourage the user to spend money.

In the light of morning, I tend to doubt it. Or rather, I think that designing for functionality and ease of use is the way to convince people to buy shareware. The plural aspect ("people" and not "person") is important, though: a user interface designed to convince a particular user to buy a piece of shareware is not necessarily the way to get a large number of sales. I think simplicity, ease of use, and functionality win the day when you factor in publicity channels and word of mouth. Trying too hard to capture a single user's money is a "win the battle and lose the war" sort of thing.

But captology's an interesting idea, nonetheless. And something we should all know more about, if only to avoid being conditioned by our computers.

Also highly recommended: the related web-credibility project.

Got a captological example to share?


2003-06-22 16:15:01
Use Captology ...

Hi Bill,

For your shareware project you may still want to take a page or two from the Captology book.

For example, I like one of Fogg's suggestions: extend the time the software takes to load the longer you use it without paying for it. For half a year now the spell checker from ieSpell has helped me erase my worst spelling errors. I always wanted to donate some money for this cool tool. Today I did, so that I only look half bad here.

I am positive that I would have paid them for their excellent work earlier if they had reminded me, with an ever-increasing delay, that I hadn't.

Just a thought, Mark.
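[Editor's note: the increasing-delay idea above can be sketched in a few lines of Python. The function names, growth rate, and cap below are illustrative assumptions, not taken from ieSpell or any real product.]

```python
import time

def startup_delay(days_unregistered, base=0.5, growth=0.15, cap=10.0):
    """Delay (in seconds) that grows the longer the trial goes unpaid.

    Linear growth with a ceiling: annoying enough to nudge, never so
    long that the tool becomes unusable (rates here are assumptions).
    """
    if days_unregistered <= 0:
        return 0.0
    return min(base + growth * days_unregistered, cap)

def launch(days_unregistered, registered=False):
    """Pause briefly at startup for unregistered users, then proceed."""
    delay = 0.0 if registered else startup_delay(days_unregistered)
    time.sleep(delay)  # the gentle nudge: a pause before the app appears
    return delay
```

Paying makes `registered` true and the delay vanishes, which is the "reward" half of the conditioning.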

2003-06-22 20:03:00
The ethical version sounds like a potentially more tactful approach to nagware and crippleware.

I could list many examples of annoying and obnoxious ware, but here's my example of Nudgeware:

The print feature doesn't exist.
In place of the print menu item is a menu item "Print Feature Overview", or, to be blatantly honest, "Print Feature Sales Pitch".
When the user sees this, they know what to expect, and then, on their own time, will read what the print feature does in light of the value of the product. At this point they may even get the opportunity to try the feature for a limited number of days, or forever, since they took the time to read the spiel. (Somehow the latter reminds me of a floor-cleaner sales presentation...)

Really nifty nudgeware would contain a list of "features I find value in", built up by the software as the user accepts an (unobtrusive) opportunity to add a feature to the list while using it.
The user learns the value, and the developers learn the need (if the user chooses to send the feature list, earning them an x% discount).

When googling for nudgeware I found it may already be the name of a product; apologies if there really is a Nudgeware(tm).
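[Editor's note: the "features I find value in" mechanism could be sketched as below. The class name, prompt flow, and discount rule are hypothetical, invented for illustration.]

```python
from collections import Counter

class FeatureValueList:
    """Tracks features the user has explicitly marked as valuable."""

    def __init__(self):
        self.valued = Counter()

    def offer(self, feature, accepted):
        # Called when the user answers an unobtrusive
        # "add this to my list?" prompt.
        if accepted:
            self.valued[feature] += 1

    def discount_percent(self, per_feature=2, cap=20):
        # Hypothetical rule: a small discount per shared feature, capped.
        return min(per_feature * len(self.valued), cap)

fl = FeatureValueList()
fl.offer("print", True)
fl.offer("export-pdf", True)
fl.offer("toolbar-skins", False)
print(fl.discount_percent())  # → 4
```

The user gets a tangible reward for telling the developers what matters, which is the mutually beneficial version of the persuasion loop.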

2006-04-03 17:31:08
A less draconian view
Putting aside the evil side of captology for a moment, there is actually an interesting ramification of this notion, which is to train people to use software effectively by rewarding them. A reward can be very subtle. I have a Sudoku game running on my Treo that flashes and makes a nice little noise every time you successfully fill in a row or column. This "reward" actually does give me a small sense of satisfaction that I miss when I play the newspaper version with a pencil. The same principle can be used by the software to "train" users to perform various tasks, which is not actually a bad thing if both you and the users want to work as effectively as possible. Getting a reward when you complete a task actually motivates you to stay focused on the task.
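[Editor's note: the Sudoku reward described above boils down to a completeness check wired to a feedback cue. A minimal sketch, with a stand-in `reward` callback in place of the Treo's flash and sound:]

```python
def row_complete(grid, r):
    """True when row r of a 9x9 Sudoku grid holds exactly the digits 1-9."""
    return sorted(grid[r]) == list(range(1, 10))

def on_move(grid, r, reward=lambda: print("ding!")):
    # Fire a small reward cue whenever the just-edited row is complete:
    # the subtle positive reinforcement described above.
    if row_complete(grid, r):
        reward()
        return True
    return False
```

The reward costs nothing to implement, yet it is exactly the operant-conditioning loop captology describes.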

This is not to say that rewards can't be used for evil purposes, too, such as getting people in a phone center to process more calls per hour than is actually healthy for them or their customers, but there does seem to be an "up" side as well.

The interesting philosophical issue is the one addressed by B.F. Skinner in his "Walden Two," which posits a utopian society where everyone is manipulated (by positive reinforcement) into living a pretty good life. The people at the receiving end of the manipulation not only know that it's going on, but welcome it, since their lives are improved. The whole situation is morally and ethically ambiguous, of course, but it is interesting.