Bullshit. As if the people training the LLMs are the same ones building the datasets. And once the dataset is built it can be used to train all models, so it doesn’t affect them.
This happened to me earlier today, too. My comment landed in the wrong thread and ended up completely out of context. There are definitely some issues to work out.
Wrong thread?
Yeah, the fuck? I’ve reloaded and I’m somewhere else than where I wrote the comment.
Lemmy has teleportation? Damn, it’s a lot better than reddit.