Social media — to blame, or just a mirror?
By Robert Hudson, President, ITPA
Wednesday, 27 March, 2019
As you are no doubt aware, there was a horrific shooting in Christchurch on 15 March, with 50 dead as a result. Like the New Zealand Prime Minister, we won’t be naming the alleged shooter — but we will absolutely talk about some of the ‘root causes’ being blamed for the atrocity.
Politicians, ‘social commentators’ and the mainstream media have started to blame technology and gaming for the radicalisation of the alleged shooter. Yet numerous studies have repeatedly found that computer gaming (even ‘violent’ games where shooting is the main objective, or games that allow and even encourage violence against non-player characters or other players) does not cause people to be violent in real life.
Here’s some light reading on the subject:
For the people in the back: Video games don’t cause violence
New study shows violent video games do not make teens more aggressive
New Study Shows That There Is No Link Between Violent Video Games And Aggression In Teenagers
Do Violent Video Games Make Kids More Violent?
Can Video Games Cause Violence?
No evidence to support link between violent video games and behavior
New Research Underscores Video Games Do Not Lead to Real-World Violence
It’s time to end the debate about video games and violence
The other main technology (I use the term loosely) being blamed is social media. Apparently the alleged shooter was radicalised on social media, and social media is therefore to blame for his actions.
Social media has two distinct meanings here, though. On one hand, there is the technology people use to access it: the apps themselves, in their various forms. On the other, there are the people who write those apps and who run and maintain the companies behind them. There is no doubt that social media companies could and should do more to prevent the spread of hatred online, and that they are obligated, through the social contract that underpins a functional society, to do so to the best of their abilities.
But it’s certainly not as simple as just writing an algorithm, as has been suggested by some. Prime Minister Scott Morrison claimed, “If they can write an algorithm to make sure that the ads they want you to see can appear on your mobile phone, then I’m quite confident they can write an algorithm to screen out hate content on social media platforms.”
This shows a poor understanding of the complexity of the technologies that underpin social media sites — sadly unsurprising, given our government’s repeated demonstration that it simply does not understand even basic technology issues. You only have to look at past or ongoing failures such as the nbn, the 2016 Census issues, ATO online services failures, Robodebt, My Health Record and so on.
It also completely ignores other factors, such as the very nature of live streaming services (and the sheer volume of live streamed content being created at any one time — gone are the days of the internet being a centralised content delivery system). And it ignores one other critical issue.
Social media is just a mirror. Social media sites themselves do not create content; they simply reflect it. And sometimes, like a shaving or make-up mirror, they amplify it at the same time. What they are increasingly reflecting of late is an unsavoury aspect of our society: one in which hatred and bigotry are commonplace, actively encouraged by the people participating in them and ignored by those who are not interested.
This is not an issue of their creation. For that, we need to look more closely at what’s happening in our society in general, and take a long, hard look at the dog whistling, marginalisation and outright mistreatment in public life, by politicians and the media, of those who are in some way different to ‘us’, whether by gender, sexuality, religion, race or social demographic. For too long, we’ve stood by and let our standards in this area slip, and no amount of focusing on technology will fix that.
Like any tool, social media itself has no agenda or bias and is not the root cause of the problems we’re seeing, any more than a computer, mobile phone, television or printing press is. The problem with social media (and mainstream traditional media) isn’t solvable with algorithms. The problem must be solved at the human level, by addressing the attitudes and actions of those who publish what is displayed on the computer/phone/television/printed page.
We need to take a good look at ourselves. The standard we walk past is the standard we accept, be that online, in our media or with our politicians. Whether we like his specific action or not, Egg Boy showed us an example of someone not willing to walk past the standard of behaviour he was seeing. Whilst we don’t all need to egg a right-wing politician, we do need to make sure in our professional lives that we are adhering to our own standards and our code of ethics.
Information Technology Professionals Association (ITPA) is a not-for-profit organisation focused on continual professional development for its 18,700 members. To learn more about becoming an ITPA member, and the range of training opportunities, mentoring programs, events and online forums available, go to www.itpa.org.au.