• mfed1122@discuss.tchncs.de
    link
    fedilink
    English
    arrow-up
    10
    ·
    4 hours ago

This reminds me of that scene in Breaking Bad where the two morons were talking about how if you ask an undercover cop if they’re a cop, they legally have to tell you the truth

  • Natanael@infosec.pub
    link
    fedilink
    arrow-up
    7
    ·
    6 hours ago

Not how copyright works. Combining something with creative height with something that lacks it leaves the combined work protected only in the part with creative height; the rest is unprotected.

(Bots cannot achieve creative height, by definition, in law.)

  • Zoabrown@lemmy.world
    link
    fedilink
    arrow-up
    2
    ·
    7 hours ago

    The conversation around ownership and AI is less about absolutes and more about clarity and accountability.

  • JATth@lemmy.world
    link
    fedilink
    English
    arrow-up
    7
    ·
    13 hours ago

Stick to GPL licensing of your code whenever possible and the garbage EEE (embrace, extend, extinguish) can’t subdue you.

If they plagiarize it, they kinda owe you the honor.

However, plagiarism is still plagiarism, so you better actually write some of your code by hand.

    • douglasg14b@lemmy.world
      link
      fedilink
      arrow-up
      1
      ·
      15 minutes ago

Sure they can. Because what are you going to do, financially ruin yourself in a lawsuit you’re going to lose against Microsoft or some other mega corp who doesn’t give a shit about GPL licensing?

      If it’s not enforced with teeth then it doesn’t matter anymore.

  • iglou@programming.dev
    link
    fedilink
    arrow-up
    45
    arrow-down
    1
    ·
    1 day ago

    That sounds like complete bullshit to me. Even if the logic is sound, which I seriously doubt, if you use someone’s code and you claim their license isn’t valid because some part of the codebase is AI generated, I’m pretty sure you’ll have to prove that. Good luck.

    • Kushan@lemmy.world
      link
      fedilink
      English
      arrow-up
      5
      ·
      7 hours ago

I work for a large enterprise firm; our corporate lawyer has told me about this exact scenario, so I’m inclined to believe it’s real.

      That being said, for established projects it won’t be that hard to prove the non-AI bit because you have a long commit history that predates the tooling.

Even if you were to assume that all commits after a certain date were AI generated, the OP is slightly off in their assertion that any AI code suddenly makes the whole thing public domain; it would only happen if a majority of the codebase was AI coded (and provably so).

      So yes all the vibe coded shite is a lost cause, but stuff like Windows isn’t in any danger.

        • Kushan@lemmy.world
          link
          fedilink
          English
          arrow-up
          5
          ·
          7 hours ago

          I think that’s actually quite sensible, our lawyer wasn’t flagging some clear cut legal certainty, he was flagging risk.

          Risk can be mitigated, even if the chance of it panning out is slim.

          • iglou@programming.dev
            link
            fedilink
            arrow-up
            3
            ·
            5 hours ago

            A bit besides the point, but it is pretty crazy to me that we’re moving towards a world where if you create by yourself, you’re outcompeted, but if you use AI like everyone else, you own nothing.

    • Tartas1995@discuss.tchncs.de
      link
      fedilink
      arrow-up
      6
      ·
      8 hours ago

There was a case in which a monkey took a picture and the owner of the camera wanted to publish the photo. PETA sued and lost, because an animal can’t hold any copyright; a human author is required for copyright.

      https://en.wikipedia.org/wiki/Monkey_selfie_copyright_dispute

As you’ll also find in the Wikipedia article, this case is used to argue that AI-generated content is not by a human author and consequently not copyrightable.

      • iglou@programming.dev
        link
        fedilink
        arrow-up
        2
        ·
        7 hours ago

        I’d argue that this is a different scenario, as AI is a tool, not a being. At least at this point.

        A complex tool, but really just a tool. Without the human input, it can’t do shit.

        • Natanael@infosec.pub
          link
          fedilink
          arrow-up
          3
          ·
          6 hours ago

There are already rulings (both in the USA and elsewhere) holding that a prompt to an LLM or image generator isn’t enough to count the result as the human’s expression, thus no copyright.

You need both human expression and creative height to get copyright protection.

        • draco_aeneus@mander.xyz
          link
          fedilink
          arrow-up
          2
          ·
          6 hours ago

          Exactly. If I use online Photoshop or whatever, and I use the red eye removal tool, I have copyright on that picture. Same if I create a picture from scratch. Just because someone like OpenAI hosts a more complex generator doesn’t mean a whole new class of rules applies.

Whoever uses a tool, regardless of its complexity, is both responsible for and the beneficiary of the result.

          • Natanael@infosec.pub
            link
            fedilink
            arrow-up
            2
            ·
            6 hours ago

Not quite how copyright law works. Photoshop and similar tools give you copyright because they capture your expression.

An LLM is more like work-for-hire, but unlike a human artist it doesn’t qualify for copyright protection, and therefore neither do you

            https://infosec.pub/comment/20390963

            • draco_aeneus@mander.xyz
              link
              fedilink
              arrow-up
              2
              ·
              6 hours ago

              Well, not how USA copyright works, but point well taken. It seems I was too naïve in my understanding of copyright.

    • CanadaPlus@lemmy.sdf.org
      link
      fedilink
      arrow-up
      7
      ·
      1 day ago

      If there was an actual civil suit you’d probably be able to subpoena people for that information, and the standard is only more likely than not. I have no idea if the general idea is bullshit, though.

      IANAL

    • GalacticSushi@lemmy.blahaj.zone
      link
      fedilink
      arrow-up
      15
      ·
      1 day ago

      I do not give Facebook or any entities associated with Facebook permission to use my pictures, information, messages, or posts, both past and future.

    • brianary@lemmy.zip
      link
      fedilink
      arrow-up
      18
      ·
      1 day ago

The Windows FOSS part, sure, but unenforceable copyright seems quite possible, though probably not court-tested. I mean, AI basically ignored copyright to train in the first place, and there is precedent for animals not getting copyright for taking pictures.

      • CanadaPlus@lemmy.sdf.org
        link
        fedilink
        arrow-up
        14
        ·
        1 day ago

        If it’s not court tested, I’m guessing we can assume a legal theory that breaks all software licensing will not hold up.

Like, maybe the code snippets that are AI-made can themselves be stolen, but not the other parts of the project.

  • Kokesh@lemmy.world
    link
    fedilink
    English
    arrow-up
    23
    arrow-down
    5
    ·
    1 day ago

As it should. All the idiots calling themselves programmers because they tell a crappy chatbot what to write, based on stolen knowledge. What warms my heart a little is the fact that I poisoned everything I ever wrote on StackOverflow just enough to screw with AI slopbots. I hope I contributed my grain of sand toward making this shit a little worse.

    • DeathsEmbrace@lemmy.world
      link
      fedilink
      arrow-up
      3
      ·
      1 day ago

Do it in a way that a human can understand but AI fails at. I remember my early days, and you guys are my MVPs for helping me figure shit out.

      • Chakravanti@monero.town
        link
        fedilink
        English
        arrow-up
        1
        arrow-down
        4
        ·
        24 hours ago

Most “humans” don’t understand reality. So your postulative challenge invention isn’t going to find the break you seek to divine. Few exist. I’ve yet to find many who can even recognize the notion that this language isn’t made to mean what you think you’re attempting to finagle it into.

Evil Money Right Wrong Need…

Yeah… I could go on and on, but there are five sticks humans do not cognate; the public consent about their meaning Will Never be real. The closest you find to any such is imagination, and the only purpose there is to help the delirious learn to cognate the difference and see reality for what it may be.

Good fucking luck. Half the meat zappers here think I am an AI because I break the notion of consent to any notion of a cohesive language. I won’t iterate that further because I’ve already spelt out why.

  • ricecake@sh.itjust.works
    link
    fedilink
    arrow-up
    20
    ·
    1 day ago

    That’s not what that research document says. Pretty early on it talks about rote mechanical processes with no human input. By the logic they employ there’s no difference between LLM code and a photographer using Photoshop.

  • Evil_Shrubbery@thelemmy.club
    link
    fedilink
    arrow-up
    20
    ·
    1 day ago

    By that same logic LLMs themselves (by now some AI bro had to vibe code something there) & their trained datapoints (which were on stolen data anyway) should be public domain.

    What revolutionary force can legislate and enforce this?? Pls!?

    • CanadaPlus@lemmy.sdf.org
      link
      fedilink
      arrow-up
      4
      arrow-down
      1
      ·
      edit-2
      22 hours ago

      By that same logic LLMs themselves (by now some AI bro had to vibe code something there)

      I’m guessing LLMs are still really really bad at that kind of programming. The packaging of the LLM, sure.

      & their trained datapoints

      For legal purposes, it seems like the weights would be generated by the human-made training algorithm. I have no idea if that’s copyrightable under US law. The standard approach seems to be to keep them a trade secret and pretend there’s no espionage, though.

      • Evil_Shrubbery@thelemmy.club
        link
        fedilink
        arrow-up
        1
        ·
        1 day ago

        The packaging of the LLM, sure.

Yes, totally, but OP says a small bit affects “possibly the whole project,” so I wanted to point out that probably includes AIs, Windows, etc. too.

    • Lung@lemmy.world
      link
      fedilink
      arrow-up
      222
      ·
      2 days ago

      It’s just unsettled law, and the link is basically an opinion piece. But guess who wins major legal battles like this - yep, the big corps. There’s only one way this is going to go for AI generated code

      • Droechai@piefed.blahaj.zone
        link
        fedilink
        English
        arrow-up
        24
        ·
        1 day ago

Worst case is that it’s the owner of the agent that receives the copyright, so all vibe coded stuff outside local AI will be claimed by the big corpos

        • Grail@multiverse.soulism.net
          link
          fedilink
          English
          arrow-up
          21
          arrow-down
          1
          ·
          1 day ago

          I actually think that’s the best case because it would kill enterprise adoption of AI overnight. All the corps with in-house AI keep using and pushing it, but every small to medium business that isn’t running AI locally will throw it out like yesterday’s trash. OpenAI’s stock price will soar and then plummet.

          • Grimy@lemmy.world
            link
            fedilink
            arrow-up
            25
            ·
            1 day ago

            The big AI companies would just come out with a business subscription that explicitly gives you copyright.

          • iegod@lemmy.zip
            link
            fedilink
            arrow-up
            3
            ·
            1 day ago

            Unlikely since, as you say, it would deter business. OpenAI already assigns rights of output to the end user according to their licensing and terms.

            • marlowe221@lemmy.world
              link
              fedilink
              English
              arrow-up
              1
              ·
              20 hours ago

No attempt to argue with you personally is intended here. But your comment raises another question that I’m not sure the law has answered yet.

              What rights does OpenAI have in the output of ChatGPT in the first place? Because if the answer is “Not much” then their transfer of rights to the output to the user doesn’t necessarily mean much.

              After all, OpenAI can only transfer rights that they have. If they don’t have any to begin with… 🤷‍♂️

              • iegod@lemmy.zip
                link
                fedilink
                arrow-up
                1
                ·
                20 hours ago

                Yep, totally fair question, and one that’s being tested legally on many fronts. Rulings are generally siding with AI companies on the training side (using copyrighted works to train models is fair use) but there aren’t many decisions yet about output. The next few years will be interesting.

    • PlzGivHugs@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      39
      arrow-down
      2
      ·
      2 days ago

      It is true that AI work (and anything derived from it that isn’t significantly transformative) is public domain. That said, the copyright of code that is a mix of AI and human is much more legally grey.

In other kinds of work, where elements can be more easily separated, individual elements may have different copyright. For example, a comic was made using AI-generated images. It was ruled that all the images were thus public domain. Despite that, the text and the layout of the comic were human-made, and so the copyright to those was owned by the author. Code obviously can’t be so easily divided up, and it will be much harder to define what is transformative or not. As such, it’s a legal grey area that will probably be decided on a case-by-case basis.

      • ssfckdt@lemmy.blahaj.zone
        link
        fedilink
        arrow-up
        18
        ·
        2 days ago

Yeah, it’s like products that include FOSS in them: they only have to release the FOSS stuff, not their proprietary code. (Was kind of cute to find the whole GNU license buried in the menus of my old TiVo…)

      • I Cast Fist@programming.dev
        link
        fedilink
        arrow-up
        9
        ·
        1 day ago

So, you’re telling me I can copy-paste 100% of some of the AI slop books on Amazon and resell them as mine? Brb, gonna make a shit site, an absolute diarrhea

    • Successful_Try543@feddit.org
      link
      fedilink
      arrow-up
      11
      arrow-down
      3
      ·
      2 days ago

      If the AI generated code is recognisably close to the code the AI has been trained with, the copyright belongs to the creator of that code.

        • grue@lemmy.world
          link
          fedilink
          arrow-up
          4
          ·
          1 day ago

          Since AIs are trained on copyleft code, either every output of AI code generators is copyleft, or none of it is legal to use at all.

      • inari@piefed.zip
        link
        fedilink
        English
        arrow-up
        4
        ·
        2 days ago

        I may be wrong but I think current legal understanding doesn’t support this

        • Successful_Try543@feddit.org
          link
          fedilink
          arrow-up
          7
          ·
          edit-2
          1 day ago

          Under U.S. law, to prove that an AI output infringes a copyright, a plaintiff must show the copyrighted work was “actually copied”, meaning that the AI generates output which is “substantially similar” to their work, and that the AI had access to their work.[4]

          Wikipedia – AI and copyright

I’ve found a similar formulation in an official German document before posting my above comment. Essentially, it doesn’t matter whether you “stole” (copied) somebody else’s code yourself and used it in your work or did so by using an AI.

      • ranzispa@mander.xyz
        link
        fedilink
        arrow-up
        17
        ·
        1 day ago

        The use of AI tools to assist rather than stand in for human creativity does not affect the availability of copyright protection for the output. Copyright protects the original expression in a work created by a human author, even if the work also includes AI-generated material.

I’m not sure where you get that from; I’m pretty sure vibe coding still complies with these indications

        • Tatar_Nobility@lemmy.ml
          link
          fedilink
          arrow-up
          3
          arrow-down
          2
          ·
          1 day ago

“AI-generated” works can be copyrighted, but only on the condition that the AI-generated elements are explicitly mentioned in the “Excluded Material” field. In other words, the parts generated by AI are not protected; only the parts that are expressed by human creativity are. Courts in the U.S. have already rejected registration for many AI works because of that. Regardless, it’s still a contentious matter.

          P.S. I am completely opposed to (generative) AI as well as the copyright system. I’m just stating my findings researching the law and court cases.

        • tempest@lemmy.ca
          link
          fedilink
          arrow-up
          1
          ·
          1 day ago

          As mentioned elsewhere in this thread it won’t matter either way unless tested in court and that will never happen for most companies.

      • HappyFrog@lemmy.blahaj.zone
        link
        fedilink
        arrow-up
        10
        ·
        1 day ago

        Did you even read your own report? It says that AI works are copyrightable in certain circumstances, not that they make a whole project public:

Copyright law has long adapted to new technology and can enable case-by-case determinations as to whether AI-generated outputs reflect sufficient human contribution to warrant copyright protection. As described above, in many circumstances these outputs will be copyrightable in whole or in part—where AI is used as a tool, and where a human has been able to determine the expressive elements they contain. Prompts alone, however, at this stage are unlikely to satisfy those requirements.

        • Tatar_Nobility@lemmy.ml
          link
          fedilink
          arrow-up
          2
          arrow-down
          4
          ·
          1 day ago

“AI-generated” works can be copyrighted, but only on the condition that the AI-generated elements are explicitly mentioned in the “Excluded Material” field. In other words, the parts generated by AI are not protected; only the parts that are expressed by human creativity are. Courts in the U.S. have already rejected registration for many AI works because of that.

          P.S. I am completely opposed to (generative) AI as well as the copyright system. I’m just stating my findings researching the law and court cases.

    • FiniteBanjo@feddit.online
      link
      fedilink
      English
      arrow-up
      3
      arrow-down
      17
      ·
      2 days ago

      The part that is untrue is the “public domain” part. If you generate code then you don’t own it because the actual human work that went into creating it was done by the owner of the AI Model and whatever they trained on.

      • hoshikarakitaridia@lemmy.world
        link
        fedilink
        arrow-up
        31
        ·
        edit-2
        2 days ago

        Iirc it’s even funnier: the relevant case law comes from Naruto v Slater. A case about a monkey taking a selfie and a photographer failing to acquire copyright of it (https://en.wikipedia.org/wiki/Monkey_selfie_copyright_dispute).

The copyright would have belonged to whoever shot the selfie, but because it was the monkey, and animals aren’t juristic entities, it could not hold copyright. Therefore, as it stands and as new case law outlines, AIs are compared to monkeys: the copyright would fall onto them, but an AI isn’t a juristic entity either, and therefore the copyright just vanishes and no one can claim it.

        The wikipedia page suggests current cases on generative AI directly build on this.

        • ch00f@lemmy.world
          link
          fedilink
          arrow-up
          11
          ·
          2 days ago

It was an especially interesting case because there was a question of whether the photographer lied about who actually took the picture. He could either claim the monkey took it and lose the copyright, or claim he took it and have it lose all value.

        • FiniteBanjo@feddit.online
          link
          fedilink
          English
          arrow-up
          2
          arrow-down
          9
          ·
          2 days ago

See, that’s kind of what I’m talking about. The monkeys who pressed the buttons to make the AI generate the code aren’t the computer, and they aren’t the user; they’re the employees at the AI company. My advice is that until laws are properly in place, we shouldn’t use AI for any generative industry.

          • Susaga@sh.itjust.works
            link
            fedilink
            English
            arrow-up
            11
            arrow-down
            2
            ·
            2 days ago

The AI company didn’t do shit. They stole apples from someone else’s tree and threw them in a blender. They didn’t make the apples, nor did they buy them, so they don’t legally own the juice.

            • Klear@quokk.au
              link
              fedilink
              English
              arrow-up
              6
              arrow-down
              1
              ·
              2 days ago

              While I agree with the general idea, please don’t call piracy “stealing”. It’s not stealing, whether you do it or some giant corpo.

              • Sonotsugipaa@lemmy.dbzer0.com
                link
                fedilink
                English
                arrow-up
                4
                ·
                2 days ago

                While I agree², their use of “steal” makes sense in the analogy because the apple doesn’t belong to the “thief”; besides, you can’t pirate an apple

                • sgibson5150@slrpnk.net
                  link
                  fedilink
                  arrow-up
                  4
                  ·
                  1 day ago

                  Evidently you can!

                  “Apple piracy is a problem,” said Lynnell Brandt, chief executive officer of Proprietary Variety Management, the Yakima firm hired to handle the Cosmic Crisp rollout. “Some are propagating trees without permission. Some are stolen in the middle of the night.”

            • FiniteBanjo@feddit.online
              link
              fedilink
              English
              arrow-up
              2
              arrow-down
              7
              ·
              2 days ago

              If the AI produces verbatim the licensed works of others then the others own it.

              If the AI took individual unlicensed elements and pieced them together then the AI Company owns it.

              In any and every case, neither the User nor the Public Domain owns it. Moral of the story is: never use AI for anything.

              • Susaga@sh.itjust.works
                link
                fedilink
                English
                arrow-up
                10
                arrow-down
                2
                ·
                2 days ago

                The AI company stole other people’s code, threw it into a blender, and is selling the output. They didn’t do any real work, and they don’t own the materials. They have no legal claim over the result. You do not own a car you made from stolen parts, no matter how many cars you stole from.

                Stop trying to imply your buddies at AI companies have value.

                • FiniteBanjo@feddit.online
                  link
                  fedilink
                  English
                  arrow-up
                  2
                  arrow-down
                  7
                  ·
                  2 days ago

                  We appear to be talking in circles.

                  I’m literally sitting here telling people it isn’t safe to use AI Code, you’re doing the opposite, and you’re accusing me of being buddies with the Slop Companies?

      • TheTechnician27@lemmy.world
        link
        fedilink
        English
        arrow-up
        22
        arrow-down
        4
        ·
        edit-2
        2 days ago

        You’re just making shit up. The US Court of Appeals for the DC Circuit has affirmed that AI-generated work is in the public domain. Put up or shut up.


        Edit: Additionally, the US Copyright Office writes:

        As the agency overseeing the copyright registration system, the [Copyright] Office has extensive experience in evaluating works submitted for registration that contain human authorship combined with uncopyrightable material, including material generated by or with the assistance of technology.

        • reabsorbthelight@lemmy.world
          link
          fedilink
          arrow-up
          9
          ·
          2 days ago

How does this work in practice? Someone would have to prove that it’s AI generated, which isn’t straightforward.

Also, I’m not clear this permits the release of code covered by a trade secret or under NDA.

So while the court ruled it’s public domain, could it still be prevented from release? Like, a Microsoft employee couldn’t just dump sections of the AI code to the internet, I imagine.

          https://www.upcounsel.com/patents-trademarks-copyrights-and-trade-secrets

          Competitive advantage: Trade secrets can cover information that would not qualify for patents or copyright but still has economic value.

I would imagine dumping Microsoft code to the internet would get you sued under the NDA

          • TheTechnician27@lemmy.world
            link
            fedilink
            English
            arrow-up
            7
            ·
            2 days ago

            The answer is that it’s messy and that I’m not qualified to say where the line is (nor, I think, is anyone yet). The generated parts are not copyrightable, but you can still have a valid copyright by bringing together things that aren’t individually copyrightable. For example, if I make a manga where Snow White fights Steamboat Willie, I’ve taken two public domain elements and used them to create a copyrightable work.

            So it’s not like the usage of AI inherently makes a project uncopyrightable unless the entire thing or most of it was just spat out of a machine. Where’s the line on this? Nobody (definitely not me, but probably nobody) really knows.

            As for courts ever finding out, how this affects trade secret policy… Dunno? I’m sure a Microsoft employee couldn’t release it publicly, because as you said, it’d probably violate an NDA. If there were some civil case, the source may come out during discovery and could maybe be analysed programmatically or by an expert. You would probably subpoena the employee(s) who wrote the software and ask them to testify. This is just spitballing, though, over something that’s probably inconsequential, because the end product is prooooobably still copyrightable.

            This kind of reminds me of the blurry line we have in FOSS, where everyone retains the copyright to their individual work. But if push comes to shove, how much does there need to be for it to be copyrightable? Where does it stop being a boilerplate for loop and start being creative expression?

        • Grimy@lemmy.world
          link
          fedilink
          arrow-up
          1
          arrow-down
          2
          ·
          1 day ago

          It begins by asking “whether the `work’ is basically one of human authorship, with the computer [or other device] merely being an assisting instrument, or whether the traditional elements of authorship in the work (literary, artistic, or musical expression or elements of selection, arrangement, etc.) were actually conceived and executed not by man but by a machine.” [23]

          In the case of works containing AI-generated material, the Office will consider whether the AI contributions are the result of “mechanical reproduction” or instead of an author’s “own original mental conception, to which [the author] gave visible form.” [24]

          The answer will depend on the circumstances, particularly how the AI tool operates and how it was used to create the final work.[25] This is necessarily a case-by-case inquiry.

          That’s the rest of what you posted. I guess you just didn’t read it, right? Even if it comes right after and is part of the same paragraph. What a joke.

          • TheTechnician27@lemmy.world
            link
            fedilink
            English
            arrow-up
            3
            ·
            edit-2
            1 day ago

            I clarified this a bit in a follow-up comment, but my first comment was simplifying for the sake of countering:

            [it’s not in the public domain] because the actual human work that went into creating it was done by the owner of the AI Model and whatever they trained on.

            Their claim that the copyright for AI-generated works belongs to the model creator and the authors of the training material – and is never in the public domain – is patent, easily disprovable nonsense.

            Yes, I understand it’s more nuanced than what I said. No, it’s not nuanced in their favor. No, I’m not diving into that with a pathological liar (see their other comments) when it’s immaterial to my rebuttal of their bullshit claim. I guess you just didn’t read the claim I was addressing?

        • FiniteBanjo@feddit.online
          link
          fedilink
          English
          arrow-up
          1
          arrow-down
          12
          ·
          2 days ago

          Technology is an extremely vague word in this context. If the US Court of Appeals for the DC Circuit has affirmed that then I haven’t heard of it, it’s not posted here, and most importantly: such rules are not currently enshrined in law.

              • Warl0k3@lemmy.world
                link
                fedilink
                arrow-up
                8
                arrow-down
                3
                ·
                2 days ago

                No it’s that you’re trying to walk back a provably false claim and then deflect the claims by pretending the people calling you out are doing so because they like AI instead of, you know, valuing the truth.

                • FiniteBanjo@feddit.online
                  link
                  fedilink
                  English
                  arrow-up
                  1
                  arrow-down
                  7
                  ·
                  2 days ago

                  I walk back no claims. The AI Companies have more claim on ownership of the output than the public. Don’t use Slop Code, it’s not safe.