- cross-posted to:
- [email protected]
A “natural language query” search engine is what I need sometimes.
Edit: directly reachable with the !ai bang
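For anyone who hasn't used bangs before: put the bang in front of your query in the DuckDuckGo search box and it should jump straight to the AI chat (the query text below is just a made-up example):

```
!ai what is the difference between RAID 1 and RAID 5
```

As far as I know this also works from a URL, e.g. https://duckduckgo.com/?q=!ai+your+question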
Useless, unless you are dumb enough to trust the result without verifying it yourself. And if you do verify it, you end up spending more time than a regular search would have taken.
I find it's useful for dipping your toes into a new topic, summarized in a neat way. Most of the actual search results that do that are AI garbage now too anyway.
Of course you should always verify.
I think that’s a little unfair: not everyone has the know-how to verify, and not everyone who can has the know-how to do original research on every potential topic they want to learn about.
If we all went by your logic here, none of us would put any stock in books, essays, encyclopedias, nothing.
Yes, comprehending what you read is important, but expecting everyone to do original research on everything they want to learn is just not practical.
AI can be a valuable tool, paired with critical thinking skills, if used properly.
You are missing the point. You don't have to become a subject expert to verify the information. Not all sources are the same: some are incorrect on purpose, some are incorrect due to lax standards. As a thinking human being, you can decide to trust one source over another. But an LLM treats all the information it was trained on as 100% correct, so it can generate factually incorrect information while presenting it to you as 100% factually correct.
Using LLMs as a shortcut to find something is like playing Russian roulette: you might get correct information 5 out of 6 times, but that one time is guaranteed to be incorrect.
No, I understood that. That's why I said "if sourced ethically & responsibly."
If you think that LLMs are anything like encyclopedias, you fundamentally misunderstand what an LLM is. It's a storyteller. It's not designed to be right; it's designed to be engaging.
Encyclopedias are designed to be knowledge bases, things you can rely on to give correct answers. LLMs are not. They can be pushed towards that, but their very foundation is antithetical to it, and that makes them very hard to trust.
I never said I think they’re anything like encyclopedias; I said that being so skeptical that you feel you have to personally verify every little thing you hear or read or watch would be akin to not trusting second- or third-party sources, such as encyclopedias, books, essays, documentaries, expert opinions, etc.
That heavily depends on how it's designed and for what purpose. It is not a hard-and-fast rule.
Current iterations maybe. But future iterations will improve. As they say, it’s a learning process.
Every current LLM is built this way, so it is a hard-and-fast rule.
I'm only talking about current iterations. No one here knows what the next iterations will be, so we can't comment on them. And right now it's incredibly foolish to believe what an LLM tells you. They lie, like, a lot.