It's true that the Facebook app or company didn't suddenly kill them. But my claim was very specific ("Misinformation spread on Facebook led to the genocide in Myanmar") and well backed up by the evidence.
It's the same as the role Radio Télévision Libre des Mille Collines (RTLM) played during the Rwandan genocide.
The report concludes that, prior to this year, we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence. We agree that we can and should do more.[1]
and
In a surprising concession before the Senate intelligence committee in September 2018, Chief Operating Officer Sheryl Sandberg even accepted that Facebook may have a legal obligation to take down accounts that incentivize violence in countries like Myanmar. Sandberg called the situation “devastating” and acknowledged that the company needed to do more, but highlighted that Facebook had put increased resources behind being able to review content in Burmese. Shortly before the hearing, Facebook announced that it had taken the unusual step of removing a number of pages and accounts linked to the Myanmar military for “coordinated inauthentic behaviour” and in order to prevent them from “further inflam[ing] ethnic and religious tensions.”[2]
The report Facebook itself commissioned and posted at [1] says:
"the consequences for the victim are severe, with lives and bodily integrity placed at risk from incitement to violence ... and there is a high likelihood of these risks occurring in practice (they have occurred in the past and are happening today)" (page 35).
I think it's a pretty reasonable thing to believe when Facebook itself says so.
Misinformation spread on Facebook led to the genocide in Myanmar[1].
Yes, it's fueling a bit of radicalization, which isn't great.
Radicalization is the real problem, because it leads to rejection of democracy as a method for resolving disagreements.
[1] https://www.reuters.com/article/us-myanmar-rohingya-facebook...