• theluddite@lemmy.ml · 10 months ago

    I do a lot of writing of various kinds, and I could not disagree more strongly. Writing is a part of thinking. Thoughts are fuzzy, interconnected, nebulous things, impossible to communicate in their entirety. When you write, the real labor is converting that murky thought-stuff into something precise. It’s not uncommon in writing to have an idea all at once that takes many hours and thousands of words to communicate. How is an LLM supposed to help you with that? The LLM doesn’t know what’s in your head; using it is diluting your thought with statistically generated bullshit. If what you’re trying to communicate can withstand being diluted like that without losing value, then whatever it is probably isn’t meaningfully worth reading. If you use LLMs to help you write stuff, you are wasting everyone else’s time.

    • TropicalDingdong@lemmy.world · 10 months ago

      How is an LLM supposed to help you with that?

      I have it read and review a couple of paragraphs of a research article, many, many times, to create a distribution of what was likely said in those paragraphs, in a tabular format. I’ll also work with it to create an outline of an idea I’m working on, to keep me focused and help develop my research plan. I’ll then ask it to drill down into each sub-point and give me granular points to focus on. Obviously I’m steering, but it’s not too difficult to use it in such a way that it creates a scaffolding for you to work from.

      If you use LLMs to help you write stuff, you are wasting everyone else’s time.

      If you aren’t using LLMs to help you write stuff, you are wasting your own time.