• 0 Posts
  • 9 Comments
Joined 1 year ago
Cake day: August 7th, 2023

  • No, I’d say it has more to do with improved usability and better overall design leaving them unable to fix issues when they do occur. There isn’t one specific company or system to blame. Nearly everything has, for better or worse, been boiled down into a webapp with minimal potential for error.

    It’s also not really fair to compare Gen Z to Millennials, since Millennials have had nearly twice as much time to figure things out.


  • LLMs only predict the next token. Sometimes those predictions are correct, sometimes they’re incorrect. Larger models trained on a greater number of examples make better predictions, but they are always just predictions. This is why incorrect responses often sound plausible even when they don’t make logical sense.

    Fixing hallucinations is therefore more about decreasing the rate of inaccurate predictions than about fixing a discrete bug in the model itself.
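
    The point above can be sketched with a toy model. This is not how a real LLM works internally (a hypothetical tiny corpus and bigram counts stand in for a trained transformer), but it shows the core idea: the next token is sampled from a probability distribution over what has followed the context before, so the output is always a statistical guess, never a fact lookup.

    ```python
    import random

    # Hypothetical toy corpus standing in for training data.
    corpus = "the cat sat on the mat the cat ate the food".split()

    # Count bigram frequencies: how often each token follows each context token.
    counts = {}
    for prev, nxt in zip(corpus, corpus[1:]):
        counts.setdefault(prev, {})
        counts[prev][nxt] = counts[prev].get(nxt, 0) + 1

    def predict_next(token):
        """Sample the next token in proportion to observed frequency."""
        options = counts[token]
        tokens = list(options)
        weights = [options[t] for t in tokens]
        return random.choices(tokens, weights=weights)[0]

    # In the corpus "the" is followed by "cat", "mat", or "food", so any of
    # those can be sampled -- fluent-sounding, but only ever a prediction.
    print(predict_next("the"))
    ```

    Scaling this up to a model with billions of parameters makes the distribution sharper and the guesses better, but it doesn’t change the mechanism: a confident-sounding wrong answer is just a high-probability token sequence that happens to be false.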