• 0 Posts
  • 28 Comments
Joined 1 year ago
Cake day: June 15th, 2023


  • Snapchat is not the only problem here, but it is a problem.

    If they can’t guarantee their recommendations are clean, they shouldn’t be offering recommendations. Even to adults. Let people find accounts to connect with on their own, or by consulting some third party’s curated list.

    If not offering recommendations destroys Snapchat’s business model, so be it. The world will continue on without them.

    It really is that simple.

    Using buggy code (because all nontrivial code is buggy) to offer recommendations only happens because these companies are cheap and lazy. They need to be forced to take responsibility where it’s appropriate. This does not mean that they should be liable for the identity of posters on their network or the content of individual posts—I agree that expecting them to control that is unrealistic—but all curation algorithms are created by them and are completely under their control. They can provide simple sorts based on data visible to all users (see the sketch at the end of this comment), or leave things to spread externally by word of mouth. Anything beyond that should require human verification, because black box algorithms demonstrably do not make good choices.

    It’s the same thing as the recent Air Canada chatbot case: the company is responsible for errors made by its software, to about the same extent as it is responsible for errors made by its employees. If a human working for Snapchat had directed “C.O.” to the paedophile’s account, would you consider Snapchat to be liable (for hiring the kind of person who would do that, if nothing else)?
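    To put “simple sorts based on data visible to all users” in concrete terms, here’s a minimal Python sketch. Everything in it is hypothetical (the field names and ranking criteria are my own illustration, not Snapchat’s actual data model); the point is that the ordering is deterministic and auditable by anyone looking at the same public data.

        # Hedged sketch only: every field and criterion here is invented
        # for illustration, not taken from any real platform's data model.
        from dataclasses import dataclass
        from datetime import date

        @dataclass
        class Account:
            handle: str
            public_followers: int  # follower count visible to every user
            created: date          # join date visible to every user

        def simple_sort(accounts: list[Account]) -> list[Account]:
            # Deterministic, auditable ordering: most followers first,
            # oldest account breaks ties. No per-user black box involved.
            return sorted(accounts, key=lambda a: (-a.public_followers, a.created))

        demo = [Account("a", 120, date(2020, 3, 1)),
                Account("b", 450, date(2022, 7, 9)),
                Account("c", 450, date(2019, 1, 15))]
        print([acct.handle for acct in simple_sort(demo)])  # ['c', 'b', 'a']

    A user who disputes a suggestion can recompute the list themselves, which is exactly what a black box makes impossible.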


  • Yes, they should. They chose to deploy the algorithm rather than using a different algorithm, a human-curated suggestion set, or nothing at all. It’s like a store offering free one-per-purchase bonus items while knowing a few of them are soaked in a contact poison that will make anyone who touches them sick.

    If your business uses a black box to serve clients, you are liable for the output of that black box, and if you can’t find a black box that doesn’t produce noxious output, then either don’t use one or put a human in the loop. Yes, that human will cost you money; that’s why my suggestion at the end was to use a single common feed, to reduce the labour. If they can’t get enough engagement from a single common feed to support the business, maybe the business should be allowed to die.

    The only leg Snapchat has to stand on here is the fact that “C.O.” was violating their TOS by opening an account when she was under the age of 13, and may well have claimed she was over 18 when she was setting up the account.


  • Bunch of things going on here.

    On the one hand, Snapchat shouldn’t be liable for users’ actions.

    On the other hand, Snapchat absolutely should be liable for its recommendation algorithms’ actions.

    On the third hand, the kid presumably lied to Snapchat in order to get an account in the first place.

    On the fourth hand, the kid’s parents fail at basic parenting in ways that have nothing to do with Snapchat: “If you get messages on-line that make you uncomfortable or are obviously wrong, show them to a trusted adult—it doesn’t have to be us.” “If you must meet someone you know on-line in person, do it in the most public place you can think of—mall food courts during lunch hour are good. You want to make sure that if you scream, lots of people will hear it.” “Don’t ever get into a car alone with someone you don’t know very well.”

    Solution: make suggestion algorithms opt-in only (if they’re useful, people will opt in). Don’t allow known underage individuals to opt in; if you feel the need to fill the space in the interface, restrict them to a human-curated “general feed” that’s the same for everyone who hasn’t opted in (a rough sketch of that gating follows below). Get C.O. better parents.

    None of that will happen, of course.
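    For what it’s worth, the gating described above is only a few lines of code. Here’s a hedged Python sketch; every name in it is invented for illustration, and where the age cutoff sits is a policy choice, not a claim about any real TOS.

        from dataclasses import dataclass

        # Hypothetical sketch of the policy, not any platform's real API.
        GENERAL_FEED = ["human-curated item 1", "human-curated item 2"]
        MINIMUM_OPT_IN_AGE = 18  # policy choice, purely illustrative

        @dataclass
        class User:
            verified_age: int | None  # None = unknown or unverified
            opted_in: bool

        def black_box_recommendations(user: User) -> list[str]:
            # Stand-in for whatever opaque recommender the platform runs.
            return ["algorithmic suggestion"]

        def feed_for(user: User) -> list[str]:
            # Known-underage users never see algorithmic suggestions.
            if user.verified_age is not None and user.verified_age < MINIMUM_OPT_IN_AGE:
                return GENERAL_FEED
            # Everyone else sees the algorithm's output only after opting in.
            if user.opted_in:
                return black_box_recommendations(user)
            return GENERAL_FEED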


  • Actually, what really matters is not the quality of your code or the disruptiveness of your paradigm, or whether you can outlive the competitors that existed when you started up, but whether you can keep the money coming. The rideshares in particular will fail over time in any country with labour laws that allow drivers to unionize—if the drivers make a sane amount of money, the company’s profits plummet, and investors and shareholders head for the hills. Netflix is falling apart already because the corporations with large libraries of content aren’t so happy to license them anymore, and they’re scrambling to make up the revenue they’ve lost. Google will probably survive only because its real product is the scourge of humanity known as advertising.

    Again, it’s all business considerations, not technical ones. Remember the dot-com boom of the 1990s, or are you not old enough? A lot of what’s going on right now looks like the 2.0 (3.0? 4.0?) release of the same thing. A few of these companies will survive, but more of them will fold, and in some cases their business models will go with them.


  • What we actually need is more variety in rendering engines. There were never that many, and two or three (Presto, Trident, and EdgeHTML, originally codenamed Spartan, if you count it) have been killed off within the past ten years. All that’s left are two lineages: Google’s Blink and its barely-there parent WebKit (in Apple’s Safari), and Mozilla’s Gecko and its barely-there child Goanna (in Pale Moon).

    Unfortunately, the rendering engine is probably the largest single chunk of code in a browser, and writing a new one (or even forking an existing one) is non-trivial.