Pure Manipulation on the Net? – 1.2019

Social networks are loved, hated and feared. Facebook was recently called "the most dangerous weapon against democracy" by the French media scholar Frédéric Filloux. Facebook & Co do not care whether users click on cat videos or right-wing hate speech. What matters is that users use the platform as often and as long as possible, so the big advertising money keeps flowing. And users spend more time on a network when they are served content that matches their worldview. But social networks are not alone in this. If you are interested in Audi and pick up a car magazine, what do you read first? The review of the Audi, of course. We prefer to read and watch content that corresponds to our worldview.
On the networks, algorithms come into play. Most people are familiar with the purchase recommendations of the big shopping sites: "Customers who bought this also bought that." The algorithms of Facebook & Co work, in short, the same way: the content shown to you is selected according to your own usage behavior and that of similar users. This is what researchers call the "filter bubble". But we have always had filter bubbles, on TV and in the big newspapers, for example. Only there they are dictated by the media corporations and by politics. So I simply ask: which is more democratic?
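The co-occurrence idea behind "customers who bought this also bought that" can be sketched in a few lines of Python. The purchase data here is entirely made up for illustration; real recommendation systems are of course far more elaborate:

```python
from collections import Counter

# Hypothetical purchase histories: one set of items per user.
purchases = [
    {"Audi magazine", "car wax", "tyre gauge"},
    {"Audi magazine", "car wax"},
    {"cat calendar", "cat food"},
    {"Audi magazine", "tyre gauge"},
]

def also_bought(item, histories):
    """Count how often other items co-occur with `item` across
    all users -- the core of 'customers who bought this also
    bought that'. Ties are broken alphabetically."""
    counts = Counter()
    for basket in histories:
        if item in basket:
            counts.update(basket - {item})
    return sorted(counts, key=lambda i: (-counts[i], i))

print(also_bought("Audi magazine", purchases))
# → ['car wax', 'tyre gauge']
```

Replace "bought" with "clicked" or "liked" and you have, in miniature, the logic that keeps serving you content your similar users already engaged with.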
In my opinion, you can easily outsmart the algorithms. You just have to use online media consciously.

From here on, only our online readers are reading 😉
Have fun with the extended editorial by Harald Matousek:

Intellectual incest
As already described, filter bubbles are certainly not unique to social networks. Nevertheless, they are a problem. Those inside them are sealed off from opposing arguments, which leads to a kind of intellectual incest.
This can be observed, for example, among radical FPÖ supporters in Austria or among AfD fans in Germany. These users stay almost exclusively in "their" groups, where links and information from right-wing networks are deliberately fed to them; as a result, these groups often radicalize even further and are then no longer open to arguments from "outside". The same applies, of course, to groups on the left. If users always get the same content, they develop a similar worldview and validate one another. You simply cannot reach people who are "trapped" in their networks. The diagnosis seems clear, but a simple solution is not in sight.

Education is important!
For me, the purposeful and promising approach would be to educate people. We could give users the tools they need to better understand media and social networks, along with their algorithms. Far too little is done against the lack of media literacy. People could handle messages and opinions better if they understood that the filter bubble does not reflect the full range of opinions and that many messages never reach the user at all. They could then assess social networks more accurately and engage with them more critically. But do politicians want that? Do they want us to engage critically with the networks? The filter problem of the analog, established media (ORF, KRONE, PRESSE, STANDARD, etc.) is quite similar to that of the social networks. Responsible, critical users would therefore transfer their new view of social networks to the analog, well-known media as well. Many would not like that, I think.
Finally, a few new questions arise for me. Do only the established media enjoy the protection of press freedom, or does it also apply to Facebook & Co?
Do algorithms also have a right to press freedom?
Would intervening in the algorithms also be an interference with press freedom?
My opinion: if you answer all the questions above with "yes", then a forced intervention in the algorithms of social networks would also be an interference with press freedom. Or how do you see it?

I would also like to give you a good tip from many years of experience with social networks, so that you are not manipulated and trapped in the "echo chamber" of Facebook & Co: like, share and comment on topics you would not otherwise pay attention to. Believe me, the algorithm learns very fast, and you will be surprised how quickly you encounter new arguments and information. As in analog life, you should always seek out a different opinion and, above all, question anything that looks too radical or lurid, and google its sources.

Finally, let me state that Facebook & Co, including their algorithms, were and are remarkable inventions for our society. You just have to use them correctly and carefully. Then the new media are both useful and fun!

In that spirit, I wish you a lot of fun reading on at www.news-online.at

Yours, Harald Matousek
Editor-in-Chief & Editor

Your opinion directly to: matousek@news-online.at
