Tuesday 21 February 2023

gonzalez v google (10. 563)

During oral arguments, the US Supreme Court entertained a case stemming from the 2015 Paris terror attacks contending that the internet giant and parent company of YouTube effectively acted as a recruitment platform for Islamic State violence by hosting and algorithmically promoting harmful content and ought to be held liable for what users post on its sites, as publishers would be for seditious or dangerous material. Heretofore, host companies have been shielded from legal responsibility for third-party screeds and recommendations that can potentially deputise and radicalise through their affirmation and reinforcement (admittedly a search engine’s raison d’être) under a provision of the law called Section 230, a carve-out of the 1996 Communications Decency Act that holds that those operators are not the authors of what people choose to share and propagate. Without that measure of immunity, it is feared that US companies would be exposed to lawsuits and severely disincentivised from hosting anything that one might find objectionable by any standard. Though the court and the twenty-six words that the argument hinges on may not provide a sufficient framework to define defamation and danger (and again, the internet is not America and such regulations should be taken in context), the justices are trying to parse the difference between inclusion and amplification.