> How would DDG do this even if they wanted? Send fact checkers to the ground?
If you want DDG to independently verify every decision it makes via primary sources, you are going to get less useful search results. DuckDuckGo doesn't have a team of scientists to reproduce every research paper they see. Nevertheless, they can decide to intervene in situations where they are reasonably certain that a source isn't trustworthy.
Of course, people are free to disagree with them. Is the disagreement here that people think DuckDuckGo is blocking sites that aren't misinformation? That's difficult to debate one way or the other given that we don't know the list of sites, but my personal priors are that the sites probably aren't the victims of smear campaigns; they probably are peddling deliberate misinformation. Once again, though, arguing that DuckDuckGo is wrong about whether these sources are trustworthy is not the same as saying that they shouldn't be able to downrank a bad news source without first forming their own team of investigative journalists.
----
> I agree. The problem here is they are only flagging disinformation from one side as factually incorrect, and do not even bother to do so for disinformation coming from the other side, thus creating bias, which is political in nature for the reasons explained above.
So, there are two things here:
First, yes, search engines have bias for the same reason that all ranking systems have bias. Remember that DuckDuckGo is literally in the business of ranking certain sites above other sites. There is no one in the world and no algorithm that is capable of ranking information without incorporating some degree of worldview into that decision about how rankings should work. This bias is why we use search engines, and it's why diversity in search engines would be a good thing. We want sorting systems to have opinions about how information should be sorted.
This is still very difficult to talk about when the word "political" is being used in such a broad sense. Do you mean political in the sense that all editorial decisions are political by nature because they either reinforce or question a status quo? Or do you mean political in a more narrow way -- that applying more strict standards to a subgroup of sources is the thing that makes this political? If you mean "political" in a broad sense, then sure, I agree, but also there's no such thing as a web search engine that is apolitical in that broad sense and I question whether it's possible to build one that is apolitical without also being completely useless for most users. If you mean political in the second sense, that there is a narrow category of political topics and the lack of fairness is the thing that makes it political... again, I just don't understand how you square that with the regular filtering that search engines do all the time.
When Google Ads pays special attention to ads for lockpicks because they're a popular spam category, but doesn't scrutinize other ads to the same degree, is that suddenly political?
The second issue I have here: if the problem is a lack of flagging misinformation in other contexts, why would the answer not be more rigorous flagging of that misinformation? Why would the answer necessarily be that DuckDuckGo results should be a free-for-all whenever someone searches for the word Ukraine? There's a big jump from "I think they're not doing a thorough enough job and I think they're taking sides in a conflict" to "they shouldn't even be trying to do this at all".
There are some services where that viewpoint makes sense, but I don't see how DDG is one of them. I personally have argued that companies like Cloudflare fundamentally shouldn't be in the business of releasing content filters at all. I personally have argued that TLDs shouldn't be involved in censorship. I have personally argued that ISPs should not be allowed to filter content that is not illegal. The important difference is that none of those are companies whose primary service is sorting content; none of them are companies that we go to with the explicit request to give us information based on what they think is relevant and accurate.
How do you make the jump from disapproval of DDG's standard for misinformation and how it's applied to the idea that they shouldn't be involved in filtering of misinformation at all?
----
TLDR, I still don't really understand why editorial decisions about political content are a slippery slope, but abandoning editorial decisions based on a word ("political") that doesn't seem particularly rigorously defined isn't also a slippery slope.
> How do you make the jump from disapproval of DDG's standard for misinformation and how it's applied to the idea that they shouldn't be involved in filtering of misinformation at all?
It is quite easy to disapprove of their current standard because, according to it, misinformation can only come from a "Russian" source, and we all know there is much more misinformation in the world than that. Can we agree on this?
I am not saying that a search engine shouldn't be involved in filtering misinformation. On the contrary, I think that DDG (and any other search engine) should absolutely be in the business of filtering all the misinformation they can. The key here is "all".
But being selective, in this case based on a particular political view (I use the word political in the context of world politics), introduces a bias that may negatively affect users, without any particular benefit.
Sure. I think it's reasonable to ask DuckDuckGo to apply more rigorous standards across the board.
I'll offer a weak note in their defense that I suspect part of the reason they don't is specifically to avoid sliding down a slippery slope and breaking this balance between neutrality and editorial decisions about content. I suspect that DuckDuckGo would say that there is a volume and kind of misinformation happening here that they are willing to address, but that applying the standards too broadly would result in them making decisions in other contexts where they feel less confident and in pushing their editorial line too far.
However, I don't think it's unreasonable at all to disagree with them on that assessment, and I think it's extremely reasonable to ask why DuckDuckGo feels safer about downranking certain kinds of misinformation and feels nervous about taking stances about other misinformation.
My hope is that if it's somehow possible for anything positive at all to come out of Russia's invasion of Ukraine, it's in part that people become more conscientious and critical about other conflicts (and narratives about conflicts) that we tend to take for granted or ignore.
> However, I don't think it's unreasonable at all to disagree with them on that assessment, and I think it's extremely reasonable to ask why DuckDuckGo feels safer about downranking certain kinds of misinformation and feels nervous about taking stances about other misinformation.
It may look like that at first glance, but assuming an even spread of DDG's 100M users, users from Russia, China, India, Indonesia, and the Middle East may be more sensitive to other kinds of misinformation (in this case, from Western media).
Since this is easily half of the world's population, I would argue that the stance they took could just as easily be seen as "extremely unreasonable" at the same time.