I hope that, by the end of this piece, I can show you that methods over two thousand years old are alive and well in today's digital space, and that they were at work in the recent Cambridge Analytica and Facebook revelations. I also hope to show that the same method is used every day by the hospitality sector's biggest digital players.
Most right-minded people would agree that gathering data through seemingly innocent questions and interfaces, then analysing and interpreting that data to build a view of behaviour and wants, is fine; but that using that data to shape unrelated and unlinked areas of people's lives is completely unacceptable.
But I wonder whether, up until the point that they 'removed' the data from Facebook, all Cambridge Analytica did was exploit human behaviour: our own inherent laziness and weakness.
What if that same understanding of human behaviour was driving the digital hospitality space and was one of the main drivers for TripAdvisor’s, Airbnb’s and Booking.com’s success?
Let's start by trying to understand whether most people really care if what they see as 'harmless' personal data is used to inform unrelated news, opinion and content creation and curation. If I told anyone in the street that, armed only with their favourite colour, animal, country and food, I could influence every political decision they make over the next six months, they would basically laugh in my face.
I mean, it's completely and utterly absurd; an individual's political beliefs are just that, and there is no way such trivia would influence any final decision they made, right? So why would an individual care if companies try to use harmless data for other means? They are not going to fall for that, are they?
But that’s the accusation aimed at Cambridge Analytica and Facebook.
Taking this into the hospitality space – we know that TripAdvisor, Airbnb and Booking.com (let’s call them TAB) collate and harvest huge amounts of ‘harmless’ personal data from users – not just feedback on accommodation and restaurants but also from bland surveys asking about likes and dislikes in their travel space (all anonymised of course… well maybe anonymised).
They take this data and can push 'subtle' content, offers and recommendations to people based on seemingly unrelated and innocent data, going well beyond browser history and the sites a user has visited. All without any email address being shared… (because that would be against GDPR, right?)
This is exactly what Cambridge Analytica did (before going seriously rogue), in fact exactly what most marketing departments do.
So, what if we add to this 'harmless' soft data the inherent human condition of laziness?
Sure, there are a great many people who understand, and are deeply concerned by, the political, economic and cultural complexities facing society. Many of them are well informed and take a deep interest in these subjects, going out of their way to educate themselves so that they can see the full picture.
But a huge majority of people do not spend much time working through the detail; life gets in the way. This is where you could argue that Cambridge Analytica were being smart: they played on the fact that most people are lazy and dislike reading long-winded newspaper articles, and they identified Facebook as the ideal place for people to consume 'news' in a very passive way.
Does this ring any bells? Laziness is the very human behaviour that TAB base their entire business models on. We also know that they spend considerable sums on social media advertising content. And the exploitation of laziness doesn't stop there: "Can't be bothered to look for restaurants, places to see and things to do? Well, here are some of those too; why go anywhere else online?"
These companies also know that social media in general plays not just on people's desire to be social but, in many cases, on their laziness and inertia; it is, after all, an ever-scrolling world of indifference. They know this plays straight into the hands of anyone who can gain access to personal information, who can then refine their messages constantly until they hit a 'sales nerve'.
This isn’t insight gained through analytical excellence – it’s a very simple understanding of human behaviour and laziness – which you can gain by talking to anyone over a drink.
So we now have two inputs into our algorithm: harmless data and human laziness.
Now we add groupthink. Just as the Remain campaign were lazy and didn't challenge the Brexit camp openly enough, the Democrats didn't really believe that Trump could win and consequently waged the wrong campaign, or at least failed to push their messages across in a way that connected with less-educated or lazier groups.
Again, does any of this ring bells in terms of 'following' the crowd? True, that's what recommendations are based on. But if those drawn to TAB's services come because they are 'lazy', have knowingly provided 'harmless' data, and have everything they need in one place, it's easy for them all to buy into the same narrative: "The best and most flexible accommodation is on Airbnb", "Booking.com is the cheapest", "You can get everything you need from everywhere on TripAdvisor", whether any of that is true or not. That's groupthink, and it's the last of the three simple steps in our algorithm.
We therefore have:
- Asking for and taking ‘harmless’ personal data
- Applying the fundamentals of human laziness
- Identifying and adding groupthink outputs and trends
Which gives you the ability to define the content, message and channel required to influence.
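Purely for illustration, the three steps above could be sketched as a toy scoring function. Every name, weight and message below is hypothetical and invented for this sketch; nothing here reflects how any of these companies actually build their systems.

```python
# A toy sketch of the three-step "algorithm": harmless data + laziness + groupthink.
# All inputs, names and weights are hypothetical examples, not real data.
from collections import Counter

def build_profile(harmless_answers):
    """Step 1: collect 'harmless' data (favourite colour, food, travel likes...)."""
    return dict(harmless_answers)

def most_passive_channel(engagement):
    """Step 2: exploit laziness by picking the channel where attention is most
    passive (a scrolled feed beats a long-form article)."""
    return max(engagement, key=engagement.get)

def crowd_narrative(peer_choices):
    """Step 3: find the groupthink narrative, i.e. what 'everyone else' is saying."""
    return Counter(peer_choices).most_common(1)[0][0]

def target_message(profile, engagement, peer_choices):
    """Combine the three inputs into a content / message / channel decision."""
    channel = most_passive_channel(engagement)
    narrative = crowd_narrative(peer_choices)
    hook = profile.get("favourite_activity", "travel")
    return f"Show '{narrative}' content about {hook} on {channel}"

profile = build_profile({"favourite_colour": "blue",
                         "favourite_activity": "city breaks"})
engagement = {"social feed": 0.9, "email": 0.3, "long-form article": 0.1}
peers = ["Airbnb is best", "Airbnb is best", "Booking.com is cheapest"]

print(target_message(profile, engagement, peers))
# prints: Show 'Airbnb is best' content about city breaks on social feed
```

The point of the sketch is how little it needs: a handful of trivia, a crude ranking of where people pay the least attention, and a count of what the crowd already believes.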
It really is an algorithm as old as time, because, however long its circumference, life and civilisation go in circles. The Athenians and the ancient Chinese knew that playing on people's inertia and groupthink can create a false sense of dependency and trust, allowing messages to be used to (depending on your point of view) manipulate or persuade.
Cambridge Analytica knew this, Mark Zuckerberg knows this, and TripAdvisor, Airbnb and Booking.com know this. Today, although the delivery method has changed, the algorithm is the same as it ever was.