• Tb0n3@sh.itjust.works · 17 points · 9 days ago

It’s funny that anyone would expect models trained on information from current doctors not to have the same blind spots.

  • LostWanderer@fedia.io · 18 points (6 down) · 9 days ago

    Imagine, a hallucination engine mostly developed by white men and trained on data gathered by white men failing to treat symptoms experienced by women and ethnic minorities seriously. Who would’ve guessed this outcome?!

    • Buelldozer@lemmy.today · 3 points (5 down) · 8 days ago

      Imagine a hallucination engine being developed globally by white men in China on data gathered by white men in India.

      Wait…what?

      Truth is that even here in the United States roughly 30% of the AI workforce is not male. You can hop online, look at pictures from AI conferences around the world, and notice a high participation of non-male-presenting people, sometimes approaching half the audience.