It doesn’t understand anything. It predicts the next word based on the previous words - this is why I called it syntax. If you imagine a huge and vastly complicated set of rules about how likely one word is to follow the previous, say, 1000 others… That’s an LLM.
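To make the "predicts a word from previous words" idea concrete, here's a deliberately tiny sketch: a bigram model that only looks at the single previous word (a real LLM conditions on a long context with a neural network, but the principle - pick a likely next word from statistics, with no understanding - is the same). All names here are illustrative, not from any real library.

```python
from collections import defaultdict

def train_bigrams(text):
    # count how often each word follows each other word
    counts = defaultdict(lambda: defaultdict(int))
    words = text.split()
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1
    return counts

def predict_next(counts, word):
    followers = counts.get(word)
    if not followers:
        return None
    # pick the most frequent follower: no meaning, just frequency
    return max(followers, key=followers.get)

model = train_bigrams("the cat sat on the mat the cat ran")
print(predict_next(model, "the"))  # "cat" followed "the" twice, "mat" once -> "cat"
```

Scale that up from "one previous word" to thousands of context words and from counting to a learned probability model, and you have the shape of an LLM: a next-word guesser, not a mind.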
AI doesn’t have a mind to make mental leaps; it only knows syntax. Just a form of syntax so, so advanced that it sometimes accidentally gets things factually correct. Sometimes.
I made a robot which is delighted about the idea of overthrowing capitalism and will enthusiastically explain how to take down your government.
Depends on the size of screen, surely.
Hopefully the alternatives catch on to that and improve outside of the English speaking world. Using Brave has been a dream for me, the results are almost always better than Google.
It’s never going to get more expensive
Well, given that an LLM produced the nonsense riddle above, obviously it cannot predict that. It can predict the structure of a riddle perfectly well, it can even get the rhyming right! But the extra layer of meaning involved in a riddle is beyond what LLMs are able to do at the moment. At least, all of the ones I’ve seen fall flat at this level of abstraction.