noname223
Archangel
- Aug 18, 2020
- 5,426
I think this is an interesting question. If I could post this thread in off-topic I would have more energy to spend on elaborating. In this subforum no one will care anyway, not even when I bait people with controversial takes.
As always, I don't have a fucking clue what I am talking about, so take my words with a grain of salt.
Here are some things I summed up:
In general, ChatGPT seems to have a (slight?) liberal left-wing bias. Overall, however, it is far more neutral than most humans. I read Jordan Peterson's tweet about it, and to me there is a fallacy in it: confusing neutrality with objectivity. That reflects a postmodern understanding of truth. Neutrality in that sense would require being neutral about facts, as if we always had to listen to both sides (false balance). I think the US media system suffers a lot from exactly that notion. Neutrality does not mean we have to reiterate pseudo-scientific claims. We don't have to repeat doubts about climate change, or the idea that 9/11 might have been an inside job. Just because these conspiracy theories and fake news exist, we are not obliged to give these people a platform. Contrarian positions should and must be allowed. But it is obvious that the enemies of liberal democracy want to fight democracy with its own merits, to beat it by its own rules.
Karl Popper elaborated perfectly on this with the paradox of tolerance. As Wikipedia (en.wikipedia.org) puts it:
The paradox of tolerance states that if a society is tolerant without limit, its ability to be tolerant is eventually seized or destroyed by the intolerant. Karl Popper described it as the seemingly self-contradictory idea that, in order to maintain a tolerant society, the society must retain the right to be intolerant of intolerance.
In my country, Germany, we had to learn that lesson the hard way. Partly because of that, our constitution has so far prevented a more severe polarization, and German democracy is more stable than in some other Western countries at the moment.
I like to read a newspaper with very contrarian positions. Some of them really make me angry, but I still enjoy reading it because it shows me other intellectual positions. That said, I see that many people on the internet have gone down the rabbit hole and arrived at positions far removed from any facts or serious discussion. I used to try to argue with such people, but many of them radicalized more and more, and I gave up. I think fake news and conspiracy theories are a huge struggle for democracies, and the best medicine has not been invented yet. To be clear, there is a difference between holding serious contrarian positions and simply spreading fake news out of brainwashing. One has to be careful, because liberal elites tend to use the term "conspiracy theory" as a weapon to delegitimize positions. It is a topic with many nuances, and some of my statements may be slightly polemical. There have been true, or at least plausible, conspiracies; the COVID lab-leak theory is an example. Still, way too many people go with their gut feeling, which causes many biases, and populist parties exploit that human tendency by appealing to simple (or let's say simplistic) reasoning.
Now to the core topic. Yes, ChatGPT is biased in many instances. It presents itself as neutral and above the fray, which is not true. The AI was trained by people with their own tendencies; it was fed with data produced by humans, and humans have biases.
To a certain degree, ChatGPT does admit that the data it was trained on probably contained prejudices and stereotypes. Personally, I think it still downplays this.
I don't have full knowledge of how exactly they filtered content like hate speech, but the censorship of such content was likely more than the bare minimum, just to be safe.
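To illustrate what I mean, here is a purely hypothetical sketch (not how OpenAI actually does it, which is not public): if moderation works roughly like a classifier score plus a cutoff, then choosing the cutoff "just to be safe" means borderline content gets blocked along with the clearly harmful content.

```python
# Hypothetical sketch of threshold-based content moderation.
# The names, scores, and thresholds are made up for illustration only.

def moderate(harm_score: float, threshold: float) -> str:
    """Block content whose estimated harm score meets the threshold."""
    return "blocked" if harm_score >= threshold else "allowed"

# Imagine a classifier assigned these harm scores (0 = harmless, 1 = harmful):
contents = {"clearly harmful": 0.95, "borderline": 0.55, "harmless": 0.10}

strict = {name: moderate(s, threshold=0.5) for name, s in contents.items()}
lenient = {name: moderate(s, threshold=0.9) for name, s in contents.items()}

# Both settings block the clearly harmful case and allow the harmless one,
# but only the strict ("safe") threshold also blocks the borderline case.
```

The point is just that a company erring on the side of caution will naturally pick the strict threshold, which is exactly the "more than the bare minimum" effect I suspect.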
I had my own experience with this. I asked for legal advice on something that is probably a grey area; at least that is what I read from other experts. I could imagine that, simply because the company behind ChatGPT wanted to be on the safe side, it favored the stricter interpretation and called the thing illegal.
So it is likely that ChatGPT favors opinions and stances that are favorable to the company behind it.
This thread got a little longer than I expected. It was fun to write, despite the fact that probably only about 45 people will read it.
ChatGPT is not politically neutral
Since its launch last Wednesday, the AI language model ChatGPT has attracted more than a million users, scores of opinion pieces, and some very well-founded concerns. The chatbot may be among the most sophisticated of its kind, and was developed by OpenAI, the tech company — which was also...
unherd.com