We must wake up to the dangers of tech firms

12 Sep 2018


Last week's podcast between Joe Rogan and Elon Musk was documented by most newspapers, predominantly for the nine per cent drop in Tesla's share price. More often than not, investors will tell you that drop was irrelevant.


Instead, the primary focus should have been Musk's urgent call to senators about the current state, and prospects, of the monopolistic technology firms that have burgeoned in little over a decade.


It is no secret that Musk detests Google, its parent company Alphabet, and Facebook. He sees the current path of these conglomerates as a catastrophe in waiting for society's access to news and, thus, its informed decision-making. The notion that more clicks equate to more sales has hampered the freedom of a supposedly infinite web space. In his latest UnHerd article, Jamie Bartlett notes psychologist Robert Epstein's research suggesting that Google could be shifting the outcome of elections, topical discussions and the bounds of our freedom of speech by as much as twenty-five per cent. The public sphere has irrefutably become a private enterprise.


This has caused me to question the morality of these upper echelons and whether they have restricted our freedom and enhanced repression. Are we on our way to District 9 as the Capitol pits us against one another, through social media outlets, in a cyber Hunger Games?


Writing in The Economist this week, Yuval Noah Harari noted the dangers of tech companies' algorithms collecting so much information about people that they may come to know you better than you know yourself. Indeed, 'just as divine authority was legitimised by religious mythologies and human authority was justified by the liberal story, the coming technological revolution might establish the authority of big data algorithms, undermining the very idea of individual freedom'.


A trans-cybernetic system, as Musk coins it, is 'a cybernetic connection between humans and machine. Google, for example, is a collective AI, all plugged in as nodes on a network; feeding the network and collectively programming AI'.


Mr Harari delves further, warning that the more the world is tailored to individual needs and personality traits, the more people will become passive recipients of AI decisions, their autonomy and capacity for free thinking wilfully withering away.

The lamentable effects of these products are hardly hidden either. Business Insider has documented several case studies of web developers who do not let their own children use the products and apps they have created. Even Steve Jobs revealed to the New York Times that he prohibited his kids from using the newly released iPad, telling reporter Nick Bilton: "We limit how much technology our kids use at home".


A study last year found that Silicon Valley parents have serious concerns about the impact their products have on children's psychological and social development.


Saurav Koduri, a former programmer, told Business Insider: "Tech companies know that the sooner you get kids, adolescents or teenagers used to your platform, the easier it becomes to make it a lifelong habit of malaise scrolling. It is no coincidence that Google has made a push into schools with Google Docs and Google Classroom".


In fact, a number of low-tech schools have popped up in Silicon Valley in an effort to reintroduce the basics, keeping children away from screen-based devices until they turn 14. This is a far cry from most public schools, where children are required to use such devices in lessons. What is it that these wealthy executives, who document and now decide our every search, song choice and suggestion, know about their products that their customers don't?


Every week these companies also debate the extent of belligerent content on their feeds, deciding what should stay up and what should be deleted. These decisions are implemented algorithmically, then filtered down to thousands of content reviewers around the world, who act as supposed new 'ministries of truth'. Never has access to content been controlled by so few firms. Murdoch must seem a saint to some now.


Musk and Harari agree on the solution: the concentration of AI's power in the hands of a few can only be prevented by regulating the ownership of data. Musk ruefully recalls his attempts to warn senators of the dangers of technology companies and the imperative of regulation:


"I tried to tell people to slow down with AI and regulate, but nobody listened. Regulations are very slow. Usually there will be some new technology that will cause damage or death, there will be an outcry, a year will pass, there will be an insight committee, rule-making, oversight and eventually regulations. This all takes years".


His daunting comparison to the motor industry will hopefully leave you with a sobering thought about how to solve this issue before it becomes irreparable.


"The motor industry fought seatbelts for years, almost a decade. Eventually, after many people died, they insisted on regulation. This is not relevant to AI, as it will be too late."

