{"id":53340,"date":"2022-08-23T01:00:37","date_gmt":"2022-08-23T01:00:37","guid":{"rendered":"https:\/\/harchi90.com\/google-refuses-to-reinstate-mans-account-after-he-took-medical-images-of-sons-groin-technology\/"},"modified":"2022-08-23T01:00:37","modified_gmt":"2022-08-23T01:00:37","slug":"google-refuses-to-reinstate-mans-account-after-he-took-medical-images-of-sons-groin-technology","status":"publish","type":"post","link":"https:\/\/harchi90.com\/google-refuses-to-reinstate-mans-account-after-he-took-medical-images-of-sons-groin-technology\/","title":{"rendered":"Google refuses to reinstate man’s account after he took medical images of son’s groin | Technology"},"content":{"rendered":"
\n

Google has refused to reinstate a man’s account after it wrongly flagged medical images he took of his son’s groin as child sexual abuse material (CSAM), the New York Times first reported. Experts say it’s an inevitable pitfall of trying to apply a technological solution to a social problem.

Experts have long warned about the limitations of automated child sexual abuse image detection systems, particularly as companies face regulatory and public pressure to help address the existence of sexual abuse material.

“These companies have access to a tremendously invasive amount of data about people’s lives. And still they don’t have the context of what people’s lives actually are,” said Daniel Kahn Gillmor, a senior staff technologist at the ACLU. “There’s all kinds of things where just the fact of your life is not as legible to these information giants.” He added that the use of these systems by tech companies that “act as proxies” for law enforcement puts people at risk of being “swept up” by “the power of the state.”

The man, identified only as Mark by the New York Times, took pictures of his son’s groin to send to a doctor after realizing it was inflamed. The doctor used that image to diagnose Mark’s son and prescribe antibiotics. When the photos were automatically uploaded to the cloud, Google’s system identified them as CSAM. Two days later, Mark’s Gmail and other Google accounts, including Google Fi, which provides his phone service, were disabled over “harmful content” that was “a severe violation of the company’s policies and might be illegal”, the Times reported, citing a message on his phone. He later found out that Google had flagged another video he had on his phone and that the San Francisco police department had opened an investigation into him.

Mark was cleared of any criminal wrongdoing, but Google has said it will stand by its decision.

“We follow US law in defining what constitutes CSAM and use a combination of hash matching technology and artificial intelligence to identify it and remove it from our platforms,” said Christa Muldoon, a Google spokesperson.

Muldoon added that Google staffers who review CSAM were trained by medical experts to look for rashes or other issues. They themselves, however, were not medical experts, and medical experts were not consulted when reviewing each case, she said.
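The detection pipeline Muldoon describes is not public. As a rough, hypothetical sketch of the “hash matching” half of that approach (the AI classifiers that flag never-before-seen images are a separate component), the Python below compares a fingerprint of an uploaded file against a set of fingerprints of previously identified material. The hash value and file name are invented for illustration; real systems such as Microsoft’s PhotoDNA use perceptual hashes that tolerate resizing and re-encoding, rather than the exact cryptographic hash shown here.

```python
# Illustrative sketch only - not Google's implementation.
# Idea: fingerprint each uploaded file and check it against a database of
# fingerprints of known abuse imagery. Hash value and file name are hypothetical.
import hashlib
from pathlib import Path

# Hypothetical set of fingerprints of previously identified images.
KNOWN_HASHES = {
    "3f79bb7b435b05321651daefd374cd21b3c1fdd271c4b04e2f3d6c9f0d7e8a12",
}

def sha256_of_file(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_flagged(path: Path) -> bool:
    """Flag a file whose fingerprint matches the known-hash set.

    An exact cryptographic hash only catches byte-identical copies; production
    systems use perceptual hashes plus AI classifiers for novel images.
    """
    return sha256_of_file(path) in KNOWN_HASHES

if __name__ == "__main__":
    sample = Path("upload.jpg")  # hypothetical upload
    if sample.exists():
        print(is_flagged(sample))
```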


That’s just one way these systems can cause harm, according to Gillmor. To address, for instance, any limitations algorithms might have in distinguishing between harmful sexual abuse images and medical images, companies often have a human in the loop. But those humans are themselves inherently limited in their expertise, and getting the proper context for each case requires further access to user data. Gillmor said it was a much more intrusive process that could still be an ineffective method of detecting CSAM.

“These systems can cause real problems for people,” he said. “And it’s not just that I don’t think that these systems can catch every case of child abuse, it’s that they have really terrible consequences in terms of false positives for people. People’s lives can be really upended by the machinery and the humans in the loop simply making a bad decision because they don’t have any reason to try to fix it.”

Gillmor argued that technology wasn’t the solution to this problem. In fact, it could introduce many new problems, he said, including creating a robust surveillance system that could disproportionately harm those on the margins.

“There’s a dream of a sort of techno-solutionists thing, [where people say], ‘Oh, well, you know, there’s an app for me finding a cheap lunch, why can’t there be an app for finding a solution to a thorny social problem, like child sexual abuse?’” he said. “Well, you know, they might not be solvable by the same kinds of technology or skill set.”
