We risk a lot when we allow computers to take over every part of our lives, right down to empathy.

Greg Milner's Pinpoint: How GPS is Changing Technology, Culture, and Our Minds offers a fascinating insight into how we give ourselves to technology, with sometimes dubious returns.

The book examines how the Global Positioning System has changed human behaviour because we trust it blindly, sometimes fatally: "Death by GPS", as Milner puts it, with people allowing their navigation devices to lead them down impossible routes, into lakes or off broken bridges.

Milner analyses several scientific studies and concludes that GPS has had a profound effect on humanity, and not always for the better.

"We have come to depend on GPS, a technology that, in theory, makes it impossible to get lost. Not only are we still getting lost, we may actually be losing a part of ourselves," Milner writes.

The message in Milner's book is that in outsourcing our cognitive capabilities to information technology systems, we gain speed and convenience, but risk switching off critical faculties and abilities that have helped us survive for millennia.

Blind faith in technology is nothing new, of course, and it won't go away, either. Technology continues to evolve, and our lives are becoming ever more integrated with it and dependent on it.

What's more, technology really does seem to seep in everywhere. Shortly after I read about "Death by GPS", a friend pointed me to the innocuously named Crystal website (crystalknows.com).

Crystal is a freemium service that tries to help people with "empathetic communication" because, uh, we're different, and it can be difficult to understand one another.

The site siphons up your LinkedIn profile, and Crystal "analyses your personality based on any public data written by you or about you, and confirms accuracy from people in your network".

Even though I know I shouldn't let third parties access my LinkedIn profile, I threw caution to the wind and allowed Crystal to analyse it (and yes, it appears you can revoke the site's access afterwards).

It suggested people with whom I could improve my relationships ... Walter H White, a high school chemistry teacher in Albuquerque, New Mexico, looked familiar, but how about Barack Obama?

I always wanted to communicate with the President of the United States, with empathy.

Crystal assembled a long list of "relationship advice" for me about Barack, and we seemed pretty similar.

I can even ask Crystal to write emails to Barack for me, on subjects such as getting a pay rise, and the site will then try to get the communication style right.

It didn't go quite so well.

The suggested pay rise email read like sycophantic grovelling that I hope never leaves my computer. Crystal's relationship advice for the President of the United States is hilarious too, such as how we should give feedback to each other: "Barack is naturally supportive and avoids conflict, so he will tend to avoid criticising Juha in most situations, even if it's necessary. Juha, on the other hand, will often speak out loud when providing feedback to Barack, and may not completely organise their thoughts first, causing Barack to read into some words too deeply."

Yeah, right. That's nonsense, but I am fascinated by the idea that we need machine-generated advice on empathy, such a human concept.

Perhaps Crystal will improve, thanks to user input and better analysis, to the point that it and similar services become so good that we start to trust them the way we trust GPS.

What part of our humanity would atrophy if empathy became a computer-generated service?
