Social media is failing. It promised to bring us closer together by allowing people to share ideas more easily. The opposite is happening – the breadth of information we receive online is narrowing. Instead of being presented with differing, opposing and challenging views that circle an idea, we are being blinkered. We see just one view of a news story – our own.
Algorithms monitor our likes, dislikes and behaviour, and use the model they’ve built up as a basis for the content they serve to us. The software knows what you like, what you believe and – most likely – for whom you’ll vote.
These systems are only interested in recommending content that aligns with our beliefs. Nobody spends time clicking on what they don't like. Platforms over-provide users with information they know those users agree with and, by the same token, suppress ideas they know those users dislike.
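The dynamic described above can be illustrated with a toy sketch. This is not any platform's real algorithm – the topic labels, story data and ranking rule are invented for illustration – but it shows how a feed that simply ranks content by past engagement will, over time, push agreeable stories up and dissenting ones down.

```python
# Toy illustration (not a real platform's algorithm) of engagement-driven
# filtering: the feed learns a user's leanings from clicks, then ranks
# stories so agreeable content crowds out the rest.
from collections import defaultdict

def build_profile(click_history):
    """Count how often the user engaged with each stance."""
    profile = defaultdict(int)
    for stance in click_history:
        profile[stance] += 1
    return profile

def rank_feed(stories, profile):
    """Order stories by how well each matches past engagement."""
    return sorted(stories, key=lambda s: profile[s["stance"]], reverse=True)

# Hypothetical click history: three agreeable clicks, one dissenting.
clicks = ["pro_policy_x", "pro_policy_x", "pro_policy_x", "anti_policy_x"]
stories = [
    {"title": "The case against policy X", "stance": "anti_policy_x"},
    {"title": "Why policy X works", "stance": "pro_policy_x"},
]

feed = rank_feed(stories, build_profile(clicks))
print(feed[0]["title"])  # the agreeable story rises to the top
```

Nothing here is sinister in isolation – each step just optimises for engagement – yet the cumulative effect is exactly the narrowing the article describes.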
With so much of politics now playing out online, this kind of confirmation bias is neutering political discussion. It strengthens existing biases and political prejudices. It is narrowing political, cultural and social awareness.
This is the echo chamber.
BCS recently commissioned Demos to conduct research on the echo chamber effect.