{"id":53340,"date":"2022-08-23T01:00:37","date_gmt":"2022-08-23T01:00:37","guid":{"rendered":"https:\/\/harchi90.com\/google-refuses-to-reinstate-mans-account-after-he-took-medical-images-of-sons-groin-technology\/"},"modified":"2022-08-23T01:00:37","modified_gmt":"2022-08-23T01:00:37","slug":"google-refuses-to-reinstate-mans-account-after-he-took-medical-images-of-sons-groin-technology","status":"publish","type":"post","link":"https:\/\/harchi90.com\/google-refuses-to-reinstate-mans-account-after-he-took-medical-images-of-sons-groin-technology\/","title":{"rendered":"Google refuses to reinstate man’s account after he took medical images of son’s groin | Technology"},"content":{"rendered":"
Google has refused to reinstate a man's account after it wrongly flagged medical images he took of his son's groin as child sexual abuse material (CSAM), the New York Times first reported. Experts say it's an inevitable pitfall of trying to apply a technological solution to a social problem.
Experts have long warned about the limitations of automated child sexual abuse image detection systems, particularly as companies face regulatory and public pressure to help address the existence of sexual abuse material.
"These companies have access to a tremendously invasive amount of data about people's lives. And still they don't have the context of what people's lives actually are," said Daniel Kahn Gillmor, a senior staff technologist at the ACLU. "There's all kinds of things where just the fact of your life is not as legible to these information giants." He added that the use of these systems by tech companies that "act as proxies" for law enforcement puts people at risk of being "swept up" by "the power of the state."
The man, identified only as Mark by the New York Times, took pictures of his son's groin to send to a doctor after realizing it was inflamed. The doctor used that image to diagnose Mark's son and prescribe antibiotics. When the photos were automatically uploaded to the cloud, Google's system identified them as CSAM. Two days later, Mark's Gmail and other Google accounts, including Google Fi, which provides his phone service, were disabled over "harmful content" that was "a severe violation of the company's policies and might be illegal", the Times reported, citing a message on his phone. He later found out that Google had flagged another video he had on his phone and that the San Francisco police department had opened an investigation into him.
Mark was cleared of any criminal wrongdoing, but Google has said it will stand by its decision.
"We follow US law in defining what constitutes CSAM and use a combination of hash matching technology and artificial intelligence to identify it and remove it from our platforms," said Christa Muldoon, a Google spokesperson.
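Google's statement names two distinct mechanisms: hash matching, which compares an uploaded file against a database of digests of previously identified abuse images, and machine-learning classifiers, which try to flag previously unseen images. Below is a minimal sketch of the hash-matching idea only, assuming a placeholder set of known-image digests (the hash value shown is illustrative, not real); production systems use perceptual hashes such as Microsoft's PhotoDNA rather than exact cryptographic digests, so that resized or re-encoded copies of a known image still match.

```python
import hashlib

# Placeholder stand-in for a database of digests of previously
# identified images, as maintained by clearinghouses. The value
# below is illustrative only.
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def sha256_of_file(path: str) -> str:
    """Compute the SHA-256 digest of a file's bytes."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def is_flagged(path: str) -> bool:
    """Flag a file if its digest appears in the known-hash set.

    Exact cryptographic hashing only catches byte-identical copies.
    Real systems use perceptual hashes that tolerate resizing and
    re-encoding, plus ML classifiers for never-before-seen images.
    """
    return sha256_of_file(path) in KNOWN_HASHES
```

The distinction matters in this case: hash matching, by construction, can only flag copies of images already in the database, so newly taken photographs such as Mark's could only have been flagged by the machine-learning side of the pipeline.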
Muldoon added that Google staffers who review CSAM were trained by medical experts to look for rashes or other issues. The reviewers themselves, however, were not medical experts, and medical experts were not consulted when reviewing each case, she said.