Check out my digital garden: The Missing Premise.

  • 0 Posts
  • 22 Comments
Joined 1 year ago
Cake day: June 30th, 2023

  • And in this debacle, I don’t WISH to be anti-social; I’m anti-social, but not voluntarily. I’m in my prime years and I need friends and relationships at this age, but my privacy standpoint is getting in the way of that.

    But this isn’t a conundrum you chose. That’s why people here are so into privacy. Instagram is social, sure, but is that the kind of socializing you want? Really? We know it’s bad for the mental health of teenage girls. What’s to desire about that? What’s to desire about an algorithm that actively tries to get you hooked on the app?

    These are the kinds of questions behind the privacy communities, among others.

    Also, don’t lie to women. Extreme things usually only look extreme until a person understands them. Explain yourself and give them an opportunity to come around and/or be willing to make compromises. Having an Instagram account you use every now and then to verify your humanity in a virtual world seems reasonable to me.



  • This is a good question.

    Open Empathic’s answer is that, because AI is becoming more embedded in our lives, an “understanding” of emotions on AI’s part will help people in a variety of ways, both within and outside of industries like healthcare and education, and, of course, in general commercial endeavors. As far as they’re concerned, AI is a tool that will help encourage “ethical” human decision-making.

    On the other hand… we have a ton of different ethical theories, and industries ignore them wholesale to make profits. To me, this looks like standard-grade techno-bro hubris: they intend to use “disruptive” technology to “revolutionize” whatever. The exploitative, profit-making social hierarchy isn’t being challenged. The Hollywood writers’ strikes have just begun, for example. Once Open Empathic starts making breakthroughs in artificial emotional intelligence, the strikes will return and be even more prolonged, if not broken altogether.

    I’d answer your question this way: people who care about other people should be deeply concerned.

    Even without a focus on empathy, ChatGPT’s responses in a healthcare setting were rated as more empathic than physicians’. At best, empathic AI gets used to teach people how to be more empathetic toward other humans, with people needing the AI less and less over time. Far more likely, human communication becomes mediated through empathic AI (with some company making a lot of money off the platform doing the mediating) and the quality of face-to-face human interaction deteriorates.