AI is powerful. AI is transformational. AI is the future. All true statements, yet all worryingly pointing to the fact that humans are rendering themselves surplus to requirements. Nicholas Larder, digital strategy director, Mediavest, thinks we shouldn’t be so quick to let AI take over. Writing exclusively for ExchangeWire, Larder bluntly explains why AI still needs the human touch and asks whether it’s not all just hype.
Winston Smith, that tortured employee of The Ministry of Truth, famously railed against his all-powerful overlords in 1984 as he questioned the morals of his endeavours. Though a futile rebellion in the end, Orwell’s Big Brother could have done away with Winston completely, had he been able to comprehend the power that could be wielded by the omnipotent, omniscient algorithm.
Facebook firing its entire trending topics team, in deference to the aforementioned algorithm, was predictably an unmitigated disaster. Republican propaganda, McMasturbators, and four-letter words rose to the top, as the ‘purist’ algorithmic aggregator sought to identify those topics being shared most readily. And this being the homogeneous gloop that is Facebook, those articles were typically of the sensationalist, clickbait nature. With checks and balances by sentient beings no longer in place, ‘the algorithm’ was free to run amok, opening a rather worrying window into what the future of news could look like. The concept of the filter bubble is already well entrenched; and, despite Facebook’s apparently benign desire to influence the news, there are many out there who would seek to game such an algorithm to their own ends. How long before ‘2+2=5’ starts trending?
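To see why a share-count-only curator is so easy to game, consider a toy sketch of the idea (this is an illustration of the general approach, not Facebook’s actual system): rank stories purely by how often they are shared, with no human review of what rises to the top.

```python
# Toy 'purist' trending aggregator: rank stories purely by share count.
# Illustrative only -- not Facebook's actual algorithm.
from collections import Counter

def trending(shares, top_n=3):
    """shares: list of story titles, one entry per share event.

    Returns the top_n most-shared titles, most popular first.
    """
    return [title for title, _ in Counter(shares).most_common(top_n)]

# Hypothetical share data: the sensational hoax is shared most.
shares = (
    ["Sensational clickbait hoax"] * 900
    + ["Measured policy analysis"] * 120
    + ["Local charity drive"] * 40
)

print(trending(shares))
# The most-shared story tops the list regardless of its accuracy --
# exactly the gap the human editors used to fill.
```

With no editorial check between raw share counts and the front page, anyone who can manufacture shares can manufacture the news.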
Similarly, a glitch in Bing’s translation service saw ‘Daesh’ translated as ‘Saudi Arabia’. Outrage and threats of Bing boycotts quickly followed, though it remains to be seen how much of the latter would have any impact on Bing’s minuscule market share.
Both examples offer a stark reminder that Gartner were probably right to slot ‘machine learning’ into the apex of their annual Hype Cycle (aka The Peak of Inflated Expectations), which was released last month. Those familiar with the Hype Cycle will be aware that the next phase of development is the Trough of Disillusionment.
Facebook, Google, and others are rightly placing great stock in the future importance of machine learning and artificial intelligence, investing heavily and bringing consumer products to market that are starting to unlock their potential. But, despite the recent endeavours of DeepMind and AlphaGo, Google Now, DeepDream, and the nascent Facebook ‘M’ assistant, the ‘intelligence’ bit is still overly reliant on the flesh-and-blood non-machine.
This has not stopped ‘AI’ from worming its way into the contemporary advertising lexicon, often a key propagator of those ‘Inflated Expectations’: see also Big Data, Augmented Reality, Gamification et al, with a clamour of companies looking to appropriate the latest zeitgeist to their own ends. Witness the cringe-inducing rebranding of Tribune Media as Tronc, whose corporate video presents a rallying cry for their future as a fully data-driven, machine-learning, automated news organisation. Or, in Nathan Barley vernacular, a ‘self-facilitating media node’. So good it should be parody, but sadly isn’t.
There is no doubt that an algorithmic approach is at the core of the future advertising model – just look at how Google’s humble PageRank has propelled that former PhD project to the goliath we know today.
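The core of that PhD project can be sketched in a few lines. This is a simplified illustration of the PageRank idea (the production system operates on the full web graph with many refinements): a page’s score is a damped sum of the scores of the pages that link to it, computed by repeated iteration. The example graph below is invented for illustration.

```python
# Minimal PageRank sketch (illustrative; real PageRank adds many refinements).
# A page's rank is a damped share of the ranks of pages linking to it.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal rank
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly across all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical three-page web: A links to B and C, B to C, C back to A.
scores = pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]})
# "C" accumulates the most rank, since both A and B link to it.
```

The point of the sketch is that the ranking emerges mechanically from the link structure, with no human judging any individual page — which is precisely why it scaled, and precisely why such systems invite gaming.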
But leaving the machines to get on with it today is fraught with potential mishap, exemplified by Facebook’s overeager curator, and the invitation which now lies wide open to game Bing’s translator by fair means or foul.
AI might indeed be the future, but we shouldn’t be so eager to hand the reins to our robot overlords. Fortunately, we still need humans (for now).