This is what people don’t get. Information is always unreliable when not from a trusted source. Just because it’s easier to generate that kind of information now doesn’t mean it’s a new problem.
Being dramatically easier IS a problem, though.
Yes, but not a really big one, since people should learn how to deal with information and judge its trustworthiness anyway.
Should learn, yes, but are they? Who is teaching them? In my experience, many people don't seem to know how to judge the accuracy of information online.
They seem to go by how convincing it sounds and how smart the person sounds. So convincing pseudoscience is all it takes to leave a bunch of people sure it must be legit, and no one is really teaching them otherwise.
Amazon is feeding into this by taking advantage of people's trust in large companies. People seem to assume that, well, it's Amazon, a big global company, so it must be trustworthy, and thus most of what it sells is too.
I don't think most people are even aware that a lot of the things on Amazon come from third-party sellers.
That's often the case with AI-critical stories.
Most of the time, ML is blamed for a problem whose root lies much deeper.