Updates to Consumer Terms and Privacy Policy

(anthropic.com)

35 points | by meetpateltech 7 hours ago

7 comments

  • jimmont 5 hours ago

    For those who use AI/LLMs that retrain on your input, I assume you realize this commoditizes your intellectual work? It effectively makes use of that work the way these companies already used copyrighted intellectual property. This is much like the commons appropriations made for railroad development, the reinterpretation of fair use, etc.

    • bloomca 5 hours ago

      I don't understand why you would opt in to share your data. Is it because you believe it would help improve the model and you would benefit from that? Or is it something altruistic?

      • poly2it 1 hour ago

        I'd assume the average user, already suffering from cookie pop-up fatigue, won't pay much attention to these privacy toggles.

      • Juminuvi 2 hours ago

        I always assumed the folks who intentionally do this either work for the company, are associated with it, or are in some way part of a QA pilot user group.

      • jimmont 4 hours ago

        I think it's just a general lack of awareness of the effect, or in many instances alternative economic incentives, like academics who want to commoditize their intellectual output through all available distribution channels. Tyler Cowen, for example. The AI companies are in a race to the bottom.

  • WaxProlix 6 hours ago

    At least in the Settings pane, the slider is kinda ambiguous as to whether you're opted in or not.

    https://postimg.cc/2V7mM77C vs https://postimg.cc/1nF1HGzh

    • AlexandrB 3 hours ago

      How is it that in 2025 the UI is worse than what we had in Windows 98? A checkbox would be unambiguous here.
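
      A minimal sketch of the difference, assuming a plain DOM settings page (the element IDs and label text here are hypothetical, not Anthropic's actual markup):

        // TypeScript: a checkbox whose label states the effect leaves no
        // doubt about what the current state means.
        const checkbox = document.createElement("input");
        checkbox.type = "checkbox";
        checkbox.id = "training-opt";
        checkbox.checked = false; // unchecked = not opted in

        const label = document.createElement("label");
        label.htmlFor = "training-opt";
        label.textContent = "Allow my chats to be used for model training";

        document.body.append(checkbox, label);

        // A bare slider shows only a position; without an "On"/"Off" state
        // label next to it, the user has to guess which side means opted in.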

    • 4b11b4 5 hours ago

      Thought so too. I assumed it was checked by default, so I hit it once.

  • dpcx 7 hours ago

    I don't love that this is opted in by default, but I'm happy that they're at least offering an opt-out.

    • roughly 5 hours ago

      I dunno, I feel like we’ve seen this play often enough - “option to opt-out” is absolutely going to be the first feature slated for elimination on the product roadmap - “after all, only 5% of customers are using it.”

      • jkaplowitz 5 hours ago

        I agree with everything you’ve said, but I’m also happy that they’re forcing both new and existing users to make a choice in order to continue using Claude under the new terms, rather than silently starting to train on data from existing users who take no action.

        Like you, I would have preferred that the UI for the choice didn’t make opt-in the default. But at least this is one of the rare times where a US company isn’t simply assuming or circumventing consent from existing users who ignore the advance notification, even in countries without EU-style privacy laws. So thank you, Anthropic, for that form of respect.

  • lostmsu 6 hours ago

    Were they not using the data from Claude Code for training before this change? After this change, will they not train on my code if I switch this off (Claude Pro sub)?

    • jkaplowitz 5 hours ago

      From their FAQ at the bottom of the linked page:

      “Previous chats with no additional activity will not be used for model training.”

      So I guess they weren’t. You can switch it off and keep it that way.

  • owebboy 7 hours ago

    eek. opt-in default. 5-year retention. i knew something like this was coming, but it's a hard pill to swallow

    • lenerdenator 5 hours ago

      I mean, it's great that it at least has an opt-out, but the whole appeal of Anthropic for me, and of giving them money, was that they explicitly didn't do anything with your data. Or at least that was the impression I had.

      When you see this kind of thing, it makes you wonder what else they'll try to do to get around your opt-out.