• 1 Post
  • 23 Comments
Joined 1 year ago
Cake day: July 20th, 2023



  • One of my big worries with the way people are using LLMs is that users are being trained to trust whatever the models spit out. Hey Google, what’s the nutritional content of peanuts? And people are learning not to ask where the information came from or to check sources.

    One of the many reasons this worries me is that very soon these businesses are going to need to recoup the billions they’re spending, and I wonder how long until these systems start feeding paid promotions to a population that’s been trained to accept whatever they’re told. Imagine what some businesses, or governments, would pay to have exactly their choice of words produced on demand in response to knowledge queries.

  • br3d@lemmy.world to Technology@lemmy.world · *Permanently Deleted*
    10 months ago

    Nobody can audit the code for all their apps. Even in an ideal world where all the code is open, people don’t have the skills or the time. Sensor permissions are supposed to be a system that lets people have strong confidence in apps without needing those skills and that time, so the inability to control this sensor is a problem, but it’s an OS problem.
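
    To make the asymmetry concrete, here is a minimal Android sketch in Kotlin (mine, not from the original comment): camera access goes through the runtime permission system, while a motion sensor can be read with no prompt at all. The Activity name is hypothetical; the sensor and permission APIs are standard Android.

    ```kotlin
    import android.Manifest
    import android.content.Context
    import android.content.pm.PackageManager
    import android.hardware.Sensor
    import android.hardware.SensorEvent
    import android.hardware.SensorEventListener
    import android.hardware.SensorManager
    import android.os.Bundle
    import android.util.Log
    import androidx.appcompat.app.AppCompatActivity
    import androidx.core.content.ContextCompat

    // Hypothetical Activity, used only for illustration.
    class SensorDemoActivity : AppCompatActivity(), SensorEventListener {

        override fun onCreate(savedInstanceState: Bundle?) {
            super.onCreate(savedInstanceState)

            // Camera: gated by the runtime permission system, so the user
            // is prompted and can refuse.
            val cameraGranted = ContextCompat.checkSelfPermission(
                this, Manifest.permission.CAMERA
            ) == PackageManager.PERMISSION_GRANTED
            Log.d("SensorDemo", "camera permission granted: $cameraGranted")

            // Accelerometer: there is no permission to check. Any installed
            // app can register a listener silently; this is the gap in OS
            // control the comment is pointing at.
            val sensorManager = getSystemService(Context.SENSOR_SERVICE) as SensorManager
            sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)?.let { accel ->
                sensorManager.registerListener(this, accel, SensorManager.SENSOR_DELAY_NORMAL)
            }
        }

        // Raw motion data arrives here without the user ever being asked.
        override fun onSensorChanged(event: SensorEvent) {}

        override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) {}
    }
    ```

    Because there is no permission to request, no app-level setting can surface this choice to the user; only the OS can add one.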