Jonkman Microblog

Conversation

Notices

  1. GeniusMusing (geniusmusing@nu.federati.net)'s status on Friday, 06-Aug-2021 19:36:15 EDT
    Apple explains how iPhones will scan photos for child-sexual-abuse images (Ars Technica)
    https://nu.federati.net/url/282291

    >Shortly after reports today that Apple will start scanning iPhones for child-abuse images, the company confirmed its plan and provided details in a news release and technical summary.
    >
    >"Apple's method of detecting known CSAM (child sexual abuse material) is designed with user privacy in mind," Apple's announcement said. "Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC (National Center for Missing and Exploited Children) and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users' devices."
    >
    >Apple provided more detail on the CSAM detection system in a technical summary and said its system uses a threshold "set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account."
    >
    >The changes will roll out "later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey," Apple said. Apple will also deploy software that can analyze images in the Messages application for a new system that will "warn children and their parents when receiving or sending sexually explicit photos."
    >...

    Related:
    Apple's Plan to "Think Different" About Encryption Opens a Backdoor to Your Private Life (Electronic Frontier Foundation)
    https://nu.federati.net/url/282292
    In conversation Friday, 06-Aug-2021 19:36:15 EDT from nu.federati.net permalink

    Attachments

    1. Apple explains how iPhones will scan photos for child-sexual-abuse images
      from Ars Technica
      Apple offers technical details, claims 1-in-1 trillion chance of false positives.
    2. Apple's Plan to "Think Different" About Encryption Opens a Backdoor to Your Private Life
      from Electronic Frontier Foundation
      Apple has announced impending changes to its operating systems that include new “protections for children” features in iCloud and iMessage. If you’ve spent any time following the Crypto Wars, you know what this means: Apple is planning to build a backdoor into its data storage system and its messaging system.
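
    As a rough illustration of the mechanism the quoted announcement describes, the sketch below shows threshold-based matching against a database of known image hashes. It is only a toy, not Apple's actual system: the hash function (plain SHA-256 standing in for a perceptual hash like NeuralHash), the threshold value, and the folder layout are all assumptions made up for the example.

      # Toy sketch: hash each photo on-device and count matches against a set of
      # known hashes; nothing is flagged until the match count crosses a threshold.
      # Stand-ins: SHA-256 instead of a perceptual hash, and an invented threshold.
      import hashlib
      from pathlib import Path

      KNOWN_HASHES: set[str] = set()   # stand-in for the NCMEC-provided hash database
      MATCH_THRESHOLD = 30             # hypothetical; Apple has not published the real number

      def file_hash(path: Path) -> str:
          # A real system would use a perceptual hash so near-duplicates still match;
          # a cryptographic hash just keeps the sketch short.
          return hashlib.sha256(path.read_bytes()).hexdigest()

      def library_exceeds_threshold(photo_dir: str) -> bool:
          matches = sum(1 for p in Path(photo_dir).glob("*.jpg")
                        if file_hash(p) in KNOWN_HASHES)
          return matches >= MATCH_THRESHOLD   # only then would anything be reported

    The point of the threshold is that a single chance match is not supposed to flag an account; the "one in one trillion" figure quoted above is Apple's claim about how unlikely it is to cross that threshold by accident.
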
    1. lnxw48a1 (lnxw48a1@nu.federati.net)'s status on Friday, 06-Aug-2021 20:18:05 EDT
      in reply to
      Oh, this is going to be bad. If any parents still try to do the old-style "bearskin rug and bare bottom" picture, the sheriff will be visiting them.

      Or the "just-born baby, before the nurse wraps the private parts" picture that some parents take.
      In conversation Friday, 06-Aug-2021 20:18:05 EDT from nu.federati.net permalink
    2. lnxw48a1 (lnxw48a1@nu.federati.net)'s status on Friday, 06-Aug-2021 20:24:09 EDT
      in reply to
      Also, phone pics usually carry unwanted metadata, which is especially unwelcome if you're taking illegal photos, because that info can lead police to you. It makes me suspect that only people who are brand new to such things would use a mobile computing device to take those photos.
      In conversation Friday, 06-Aug-2021 20:24:09 EDT from nu.federati.net permalink
      1. GeniusMusing (geniusmusing@nu.federati.net)'s status on Friday, 06-Aug-2021 20:31:43 EDT
        in reply to
        Also, misidentification of images still happens. I couldn't find the article I was looking for, but I did find this.

        UK Police Have a Porn-Spotting AI That Gets Confused by Desert Photos (PetaPixel)
        https://petapixel.com/2017/12/20/uk-police-porn-spotting-ai-gets-confused-desert-photos/

        snip

        >“Sometimes it comes up with a desert and it thinks it’s an indecent image or pornography,” says Met digital forensics head Mark Stokes. “For some reason, lots of people have screen-savers of deserts and it picks it up thinking it is skin color.”

        snip

        I've heard the same thing about AIs and some fruits.
        In conversation Friday, 06-Aug-2021 20:31:43 EDT from nu.federati.net permalink

        Attachments

        1. UK Police Have a Porn-Spotting AI That Gets Confused by Desert Photos
          By Michael Zhang from PetaPixel
        1. GeniusMusing (geniusmusing@nu.federati.net)'s status on Friday, 06-Aug-2021 20:33:52 EDT
          in reply to
          Things seem to be moving pretty fast...

          PSA: Apple Can't Run CSAM Checks On Devices With iCloud Photos Turned Off (Slashdot)
          https://nu.federati.net/url/282298
          In conversation Friday, 06-Aug-2021 20:33:52 EDT from nu.federati.net permalink

          Attachments

          1. PSA: Apple Can't Run CSAM Checks On Devices With iCloud Photos Turned Off - Slashdot
            An anonymous reader quotes a report from iMore: Apple announced new on-device CSAM detection techniques yesterday and there has been a lot of confusion over what the feature can and cannot do. Contrary to what some people believe, Apple cannot check images when users have iCloud Photos disabled. App...
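
        To see why a crude detector can mistake a desert for skin, as described in the PetaPixel quote above: if a classifier leans heavily on the fraction of "skin-coloured" pixels, sand falls into the same warm, red-dominant colour band as many skin tones. The sketch below is a deliberately naive illustration of that failure mode, not the Met's actual system; the RGB rule and the cutoff value are invented for the example.

          # Naive "skin ratio" heuristic that mistakes desert photos for skin.
          # Requires Pillow (pip install Pillow). Thresholds are illustrative only.
          from PIL import Image

          def looks_like_skin(rgb):
              r, g, b = rgb
              # Rough "skin-ish" band: warm, red-dominant pixels. Desert sand
              # (around (210, 180, 140)) passes this test just as skin does.
              return r > 95 and g > 40 and b > 20 and r > g > b and (r - b) > 15

          def skin_ratio(path: str) -> float:
              pixels = list(Image.open(path).convert("RGB").getdata())
              return sum(looks_like_skin(p) for p in pixels) / len(pixels)

          # A desert screensaver can score above a naive cutoff such as 0.5,
          # just like a close-up photo of skin would.

        Real classifiers are far more sophisticated, but the quoted failure shows the same underlying issue: colour statistics alone do not determine what an image depicts.
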
    3. Alexandre Oliva (moved to @lxo@gnusocial.jp) (lxo@gnusocial.net)'s status on Sunday, 08-Aug-2021 17:04:53 EDT
      in reply to
      apple learns from their MAFIAA business partners how useful it can be to pretend to think of the children and to fight abuse, in order to push abusive practices down customers' throats
      In conversation Sunday, 08-Aug-2021 17:04:53 EDT from gnusocial.net permalink
      1. GeniusMusing (geniusmusing@nu.federati.net)'s status on Thursday, 19-Aug-2021 16:47:15 EDT
        in reply to
        From the "But wait! There's more fail" department.

        Apple's Not Digging Itself Out of This One
        https://gizmodo.com/apples-not-digging-itself-out-of-this-one-1847509340

        >Well, that didn’t take long. Online researchers say they have found flaws in Apple’s new child abuse detection tool that could allow bad actors to target iOS users. However, Apple has denied these claims, arguing that it has intentionally built in safeguards against such exploitation.
        >
        >It’s just the latest bump in the road for the rollout of the company’s new features, which have been roundly criticized by privacy and civil liberties advocates since they were initially announced two weeks ago. Many critics view the updates—which are built to scour iPhones and other iOS products for signs of child sexual abuse material (CSAM)—as a slippery slope towards broader surveillance.
        >
        >The most recent criticism centers around allegations that Apple’s “NeuralHash” technology—which scans for the bad images—can be exploited and tricked to potentially target users. This started because online researchers dug up and subsequently shared code for NeuralHash as a way to better understand it. One Github user, AsuharietYgvar, claims to have reverse-engineered the scanning tech’s algorithm and published the code to his page. Ygvar wrote in a Reddit post that the algorithm was basically available in iOS 14.3 as obfuscated code and that he had taken the code and rebuilt it in a Python script to assemble a clearer picture of how it worked.
        >
        >Problematically, within a couple of hours, another researcher said they were able to use the posted code to trick the system into misidentifying an image, creating what is called a “hash collision.”
        >...
        In conversation Thursday, 19-Aug-2021 16:47:15 EDT from nu.federati.net permalink

        Attachments

        1. Apple's Not Digging Itself Out of This One
          from Gizmodo
          The ongoing drama surrounding the company's proposed new child abuse prevention tools has taken another turn.
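
        For context on what a "hash collision" means in the quoted Gizmodo piece: a perceptual hash deliberately throws away most of an image so that near-duplicates map to the same value, which is also why two unrelated images can be crafted to share a hash. The sketch below uses a classic 64-bit average hash ("aHash") as a stand-in; it is not the reverse-engineered NeuralHash code, just an illustration of how little information such a hash keeps.

          # Toy perceptual hash: 8x8 average hash, 64 bits per image.
          # Requires Pillow. Two images "collide" when these 64 bits are equal,
          # which an attacker can engineer because so little information is kept.
          from PIL import Image

          def average_hash(path: str) -> int:
              img = Image.open(path).convert("L").resize((8, 8))  # 8x8 grayscale thumbnail
              px = list(img.getdata())
              avg = sum(px) / len(px)
              bits = 0
              for i, p in enumerate(px):
                  if p > avg:
                      bits |= 1 << i      # one bit per pixel: above or below the average
              return bits

          # A "collision" here just means average_hash(a) == average_hash(b) for two
          # images that look nothing alike; the filenames below are hypothetical:
          # average_hash("innocent.jpg") == average_hash("crafted_adversarial.jpg")
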

Jonkman Microblog is a social network, courtesy of SOBAC Microcomputer Services. It runs on GNU social, version 1.2.0-beta5, available under the GNU Affero General Public License.

Creative Commons Attribution 3.0 All Jonkman Microblog content and data are available under the Creative Commons Attribution 3.0 license.
