LLMs inherently just distill knowledge into an average output. So whenever companies complain about this, they're ultimately just complaining that their product is being used as designed. www.nbcnews.com/tech/secu…