We've gotten used to seeing ads that are targeted directly at us. When we use Google to search the web we see text ads related to the content of our search. When we use Google's Gmail online mail system, as many of us now do for business as well as personal email, we see ads generated from the content of our own messages. Ten years ago that would have seemed creepy; now it seems like nothing.
There's another kind of targeting going on, one that looks not at you and me as individuals, but at what we can call, for want of a better term, human nature. Of course, that's pretty much the definition of marketing – but this is a closer and more scientific look than ever, using MRI scans.
That's right, magnetic resonance imaging. Psychologist Dan Ariely, author of Predictably Irrational, is talking about what he and Gregory S. Berns call "neuromarketing." "The most promising application of neuroimaging methods to marketing," they say, "may come before a product is even released — when it is just an idea being developed."
So, no more sitting around a conference table tossing out ideas about what might make a product appealing to this or that type of person. No more guesswork. Now products will be designed and positioned based on an intricate map of the human brain. Nor is the approach limited to product design: it could lend itself to "gauging people's reactions to food, entertainment, buildings and more," the researchers say.
Presented with a product or service developed this way, we likely wouldn't know it. We'd simply find the product appealing, because it was designed to be. It would fit us like a puzzle piece cut to interlock precisely with its neighbor.
Here's the downside as I see it: we're already hardwired to be attracted to things that are bad for us, for society, or both. Things like sweets; addictive drugs such as opiates, painkillers, sleep medications, and crack; even underage sex partners. Making it still easier for marketers to appeal to our natural instincts, including our baser ones, could push us even harder toward things people want to sell us even when we ought not to have them or do them.
The upside? The same techniques could serve positive goals: getting people to eat healthy foods they would normally find less appealing than unhealthy ones, for example, or designing an energy-efficient light source with a "warmer," more pleasant color.
The possibilities are almost as myriad as the neurons in our brains. The question is, how much more manipulation do we want to accept?