• bizdelnick@lemmy.ml
    2 months ago

    One of the four essential freedoms is the freedom to study the software and modify it. Studying means training your brain on the open source code. Can one use their brain to write proprietary code after studying some copylefted code?

    • chgxvjh [he/him, comrade/them]@hexbear.net
      2 months ago

      If you study a code base and then implement something similar yourself without attribution, there is a good chance you are committing a form of plagiarism.

      In other contexts, such as academic writing, this approach would be considered a clear and uncontroversial case of plagiarism.

      • bizdelnick@lemmy.ml
        2 months ago

        There’s no such word as “plagiarism” in free licenses or in copyright law. One either violates copyrights or patents, or one doesn’t. Copyleft licenses do not forbid what you call plagiarism. If you want to forbid that, as well as training LLMs on your code, you need a new type of license. However, I’m unsure whether such a license could be considered free by the FSF or approved by the OSI.

  • bizarroland@lemmy.world
    2 months ago

    LLMs are tools. They’re not replacements for human creativity, and they are not reliable sources of truth. They are interesting tools and toys that you can play with.

    So have fun and play with them.

    • geolaw@lemmygrad.ml
      2 months ago

      LLMs consume vast amounts of energy and fresh water and emit a lot of carbon. That is enough for me not to want to “play” with them.

      • m532@lemmygrad.ml
        2 months ago

        I have a solution: it’s called China.

        They have solar panels, which neither use water nor produce CO2/CH4, and they can train the AI (the energy-intensive part).

        Then you download the AI from the internet and can use it 100,000 times; it will use less energy than a washing machine, and neither consume water nor produce CO2/CH4.

      • 87Six@lemmy.zip
        2 months ago

        That’s only because they’re implemented haphazardly, to save as much money as possible, produce as fast as possible, and cut every corner they can.

        And that’s caused by the leadership of these companies. AI in general is okay. LLMs are meh, but I don’t see the LLM concept itself as the devil, the same way shovels weren’t the devil during the gold rush.