Otter Raft@lemmy.ca to Medicine@mander.xyz · English · 8 days ago
AI medical tools found to downplay symptoms of women, ethnic minorities - Ars Technica (arstechnica.com)
cross-posted to: fuck_ai@lemmy.world, technology@lemmy.zip, arstechnica_index@ibbit.at, publichealth@mander.xyz
WhyDoYouPersist@lemmy.world · English · 7 days ago
Seriously, AI is trained on human-generated data, so this should be a no-brainer.
Otter Raft@lemmy.ca (OP) · English · 7 days ago
Yep, companies are pushing AI models as a "fair" and "unbiased" alternative to human workers. In reality, LLMs will be biased depending on their training data.