
How to tell if someone is lying

Wouldn’t it be grand if we had x-ray vision to separate the liars from the trustworthy? Being able to tell whether the used car salesman really is selling this car cheap, or whether the shop assistant really did add that extra item to your bill inadvertently. Or, at work, being able to trust a colleague who seems charming enough but may just be screwing whatever they can out of you.

 

Lying is a deceptively complex concept. As we all create our own realities on the fly, the idea that my reality is not the same as yours is built into the system. We can disagree about what just happened and be entirely consistent with our own internal radar. This lends itself to gradations of truth.

The idea of a white lie plays on the need to protect someone from something. We lie in their best interests. Following this logic, the lies that we worry about are those that are designed to favour someone else over us.

Perhaps this opaqueness is why ‘thou shalt not lie’ did not make the top 10 commandments.

 

Lying is not a natural act. The sociopaths amongst us may be better at it, but it remains a challenge even for the most self-deceptive. Firstly, you must rearrange the facts to fit with the evidence. This is not always easy, hence the saying that a liar will get caught in their own web.

Secondly, there is an element of guilt. We’re social creatures who like to roll with the herd. Our emotional response to breaking the rules is to feel bad. Like all emotions, guilt is likely to motivate the liar to behave in ways that are consistent with how they feel. Perhaps they will compensate for their actions by acting in some other altruistic way.

So how can you tell if someone is lying? Here are some techniques from the professionals.

1) Pose a credibility question – this was a favourite tactic of a friend who was a stock analyst. He would start interviews with pretty straightforward questions to build trust and relax his quarry. Then, as he began to ratchet up the complexity of the questions, he would slip in one where he already knew the answer – and that the interviewee would be expected to know too. A wrong answer here is a strong tell that things are not as they seem.

2) Deny, deflect, delay – we don’t like lying as a rule. It requires effort even for those who have limited emotional empathy, as they need to construct an alternate reality that still fits with the evidence. For this reason, the preferred first course of action is to try not to answer the question at all. So a politician may suggest that the question needs to be considered by some committee process, or that it is something they could not have knowledge of. If someone is being evasive with their responses, the chances are that they are avoiding the truth.

3) Body language – while our words may say one thing, our movements can often say another. We often unconsciously register when someone’s body language is out of sync with their words. In fact, it’s been said that over 90% of communication is non-verbal. People who are lying will often try to create distance between themselves and the question – physically pushing back to create extra space. They may also close themselves off – folding arms or legs, or turning to one side – rather than meeting your gaze or opening their posture. This is not an exact science – some of us are nervous around people at the best of times. So it is relative, which is why detectives will try to establish a behavioural baseline before administering a polygraph test.


 

Restoring the symmetry with platform coops

Platform coops align interests

We’ve been migrating to the Borg since computers were invented — integrating our lives ever more deeply into our machines. It’s not a trend that can be stopped. Instead, as Kevin Kelly suggests, we have to ‘civilize’ it.

The immediate problem is one of alignment — the business models driving many of our machines are misaligned with our collective interests. The issue only becomes more pressing as we begin adopting AI-driven personal assistants. The more we rely on machines, the more certain we need to be that they are acting in our interests. With platform coops, we aim to align interests to foster that certainty.

Exhibit A — asymmetric data-tracking

The Migration began inside corporates. We started gluing resource-planning systems together in the 1970s, and have long since moved on to patching the entire enterprise into a coherent whole. Over the last decade, integration has begun to jump the corporate fence as it has pushed up the supply chain. But while the ubiquity of the chip may promise interactive access to each of us, integrating with customers has proved elusive.

This is why Customer Relationship Management systems still offer about the same value as a library card — they are more archives than platforms to exchange value. Most corporates remain once removed from their customers.

The response to this problem has been to develop data-tracking. This is an attempt to better understand us by collecting our data exhaust and following our movements through the world, both virtual and real.

Tracking by corporates is asymmetric. Data is collected in the shadows, under conditions that are at best tacitly agreed to in never-read terms-of-use tomes. Corporations know this, which is why they hide their activities rather than being transparent.

In short, tracking creates a surveillance economy, not a sharing economy.

Transparency leads to choice

The crazy thing is that we humans are keen on sharing. We want customised interactions with business and government. We hate spam and poorly informed conversations. And we understand that a corporation must know who we are to give us what we want. The alternative is that we remain a statistic, a number that gravitates to the lowest common denominator as scale increases.

So there is another way corporates could respond to the problem. They could just ask — chances are we will be willing to share. It’s about being transparent. If it’s clear what you’d like us to share, then we can agree to it, even if only tacitly.

This is the type of approach that the proponents of Vendor Relationship Management espouse. “VRM tools provide customers with both independence from vendors, and better ways of engaging with vendors.”

Aligning interests

Logically then, those who use tracking are worried that we won’t agree to share. Their bet is that if they hide their activities, they’ll get away with them. There is something fundamentally wrong with this proposition.

This is one of the reasons why platform coops are so important. Where the customers are the owners, there is a self-regulating system to protect the interests of users. Because our technologies are working for us, we can be confident that they are not leaking data we may not want to share. Conversely, we are motivated to develop tools that make sharing simpler and more effective — as that is what our customers want.

Our view is that the tipping point for shifting from surveillance to sharing will be relatively low. All it takes is one successful alternative to demonstrate its value, and the tracking approach quickly becomes uneconomic for any type of enterprise.

So if we are to cross the final frontier, let’s do it on our terms. That way we can be confident that when we are rubbing virtual shoulders with a personalised corporate avatar, they aren’t fleecing our pockets too.


 

Who do personal bots work for?

So I have my personal butler who comes when I pull on the bell-pull. He fetches me music, orders in dinner, and even claims to automatically restock the pantry. But what happens below stairs is a little bit of a mystery to me. I know that he scurries away to his closet and waits patiently for my next call, but how does one keep oneself busy below decks?

It turns out that our personal bot builders have plenty of ideas for keeping the butler busy. And herein lies a problem for bot-builders – one that they’ve responded to by deciding they can do more than simply sell us smart machines. They’re building the ‘operating systems’ for our daily existence. And if we become reliant on these systems, then they can do a whole lot more to monetize our needs and wants. For example, how about clipping every transaction that the bots intermediate? You may not even have to pay for your shiny new fridge if the fridge can take over the task of buying your daily provisions.

We can see the opportunity in this – a bot for every home, mixing our drinks and picking up the dog poo. But this business model poses one of the biggest challenges with personal bots – who are they really working for?

In the olden days, my butler worked for me. For a modest salary, he lived under my stairs, ate my food and polished my silverware. It would have been unusual indeed if he had taken a margin on the food he purchased for my family, even if there was always the chance of a little slippage. The point is that the cost of the butler was effectively a fixed fee paid by me – and this served to align our interests.

But what are the motivations of a bot that delivers profits to a third party based on the transactions it can intermediate? It sounds suspiciously like it is motivated to sell me more, and in ways that generate the highest profits for someone else.

We are being sold a fairy tale that a few companies can build the operating systems for our lives. This is not the way that evolution works. It thrives on difference and competition. We would be wise to encourage this – to have the bot-builders compete away their ‘clip-the-ticket’ business models – before we all become dependent on our digital butlers.


In a conversation with the CEO of one of Australia’s largest health insurers, we discussed the demographic problem our society faces as upwards of 400 people a day turn 75. The demand for aged care services is outstripping supply. As the head of a mutual, he posed the question of where I would prefer to have my parents looked after – in a mutually owned aged care home or a for-profit one. It’s the same question about alignment of interests. I joked that they could get a mutually-owned fleet of personal bots. Perhaps it is not such a silly idea…


 

Making Markets

I have been a student of markets since my best friend introduced me to the vagaries of supply and demand as the school’s preferred supplier of cannabis. I’ve traded currencies, commodities, equities and debt across listed and unlisted global markets. This is a collection of thoughts on what makes a market work and what causes markets to fail – and therein lies opportunity.


 

Engaging people

For the last decade, I’ve been working to understand how we can engage and motivate people to action through technology. It’s a challenge that crosses consumer anthropology with behavioural economics and emerging technologies. These articles are a collection of lessons from this journey.


 

Valuing Time

Our time is precious. Yet as the world speeds up, we seem to have less of it. The problem is that we humans can’t keep up with our machines. These articles are about the challenge of making the most of our time.