Archived version: https://archive.li/TbziV

Google is launching new privacy tools to allow users to have more control over unwanted personal images online and ensure explicit or graphic photos do not appear easily in search results.

Updates to Google's policies on personal explicit images mean that users will be able to request the removal of non-consensual explicit imagery of themselves that they no longer wish to be visible in search results.

The update means that even if an individual created and uploaded explicit content to a website, and no longer wishes for it to be available in search, they will be able to request its removal from Google search. The forms for submitting requests have also been simplified. The policy does not apply to images users are currently and actively commercialising.

The policy also applies to websites containing personal information.

Google will also roll out a new dashboard, initially available only in the US in English, that will let users know which search results display their contact information. Users can then quickly request the removal of these results from Google. The tool will also send a notification when new results with a user’s information pop up in search.

A new blurring setting in SafeSearch will also be implemented as the default on Google search for users who do not already have SafeSearch filtering on. Explicit imagery, such as adult or graphic violent content, will be blurred by default when it appears in search results. The setting can be turned off at any time, unless you are a supervised user on a public network that has kept this setting as default and locked it.

For instance, in an image search for “injury”, explicit results will be blurred so that users are not shown graphic content.

Google initially announced this safeguard in February and it will be launched globally in August.

  • WhyIDie@lemmy.ml · 1 year ago

    “If you could tell us which images you really don’t want seen, along with confirming its association with you, we’ll take it off the search results and definitely not add it to our dataset on you. Pinky promise.”

    • _bonbon_@lemm.ee · 1 year ago

      LMAO Lemmy should really have rewards so I can give you one for this

      • WhyIDie@lemmy.ml · 1 year ago

        if it was you that downvoted me, know that it wasn’t me that downvoted you

I really hope upvotes/downvotes become public eventually, like they are on kbin, for times like this when a random passerby causes a misunderstanding

  • owlinsight@lemm.ee · 1 year ago

The day I trust Google for anything privacy-related is the day I’ll have lost my mind
