Updates to Consumer Terms and Privacy Policy

(anthropic.com)

52 points | by meetpateltech 162 days ago

10 comments

  • jimmont 162 days ago

    For those who use AI/LLMs that retrain on your input, I assume you realize this commoditizes your intellectual work? It effectively exploits that work the same way these companies already used copyrighted intellectual property. This is much the same as the commons appropriations made for railroad development, the reinterpretation of fair use, etc.

    • WaxProlix 162 days ago

      At least in the Settings pane, the slider is kinda ambiguous as to whether you're opted in or not.

      https://postimg.cc/2V7mM77C vs https://postimg.cc/1nF1HGzh

      • AlexandrB 162 days ago

        How is it that in 2025 the UI is worse than what we had in Windows 98? A checkbox would be unambiguous here.
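
        For what it's worth, a labeled checkbox encodes both the setting and its current state in one glance. A minimal TypeScript sketch (hypothetical element ID and wording, nothing from Anthropic's actual UI):

            // A labeled checkbox: the label names the setting, and
            // checked/unchecked maps directly to on/off, unlike an
            // unlabeled slider whose active side is anyone's guess.
            const box = document.createElement("input");
            box.type = "checkbox";
            box.id = "train-on-my-data"; // hypothetical ID
            box.checked = false; // unchecked = data not used for training

            const label = document.createElement("label");
            label.htmlFor = box.id;
            label.textContent = "Allow my chats to be used for model training";

            document.body.append(box, label);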

        • 4b11b4 162 days ago

          Thought so too. I assumed it was checked by default, so I hit it once.

        • dpcx 162 days ago

          I don't love that this is opt-in by default, but I'm happy that they're at least offering an opt-out.

          • roughly 162 days ago

            I dunno, I feel like we’ve seen this play often enough - “option to opt-out” is absolutely going to be the first feature slated for elimination on the product roadmap - “after all, only 5% of customers are using it.”

            • ptx 161 days ago

              The terms "opt-in" and "opt-out" indicate what the default is, so "... by default" is redundant. "Opt-in" means that you can opt (choose) to be in while the default is out.

              In this case, since the default is in unless you opt out, it's opt-out.
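
              To make the distinction concrete, here is a minimal TypeScript sketch (hypothetical names, not Anthropic's actual settings model): the term encodes what happens when the user does nothing.

                  // null means the user never touched the toggle.
                  type UserChoice = "enable" | "disable" | null;

                  // With no explicit choice, the default wins.
                  function trainingAllowed(defaultOn: boolean, choice: UserChoice): boolean {
                    return choice === null ? defaultOn : choice === "enable";
                  }

                  // Opt-in scheme: default off, so inaction means no training.
                  console.assert(trainingAllowed(false, null) === false);

                  // Opt-out scheme (this case): default on, so inaction means training.
                  console.assert(trainingAllowed(true, null) === true);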

              • jkaplowitz 162 days ago

                I agree with everything you’ve said, but I’m also happy that they’re forcing users, both new and existing, to make a choice to continue using Claude under the new terms, rather than silently turning training on for existing users who take no action.

                Like you, I would have preferred that the UI for the choice didn’t make opt-in the default. But at least this is one of the rare times where a US company isn’t simply assuming or circumventing consent when existing users in countries without EU-style privacy laws ignore the advance notification. So thank you, Anthropic, for that form of respect.

                • esbranson 161 days ago

                  The terms say it is opt-out, not opt-in, despite the wordplay.

                  > We may use Materials ... unless you opt out of training through your account settings.

                  [1] https://github.com/OpenTermsArchive/GenAI-versions/commit/d8...

                  • croes 161 days ago

                    It's opt-out, so it's in by default. Opt-in would mean it's out by default and would be a good thing.

                  • bloomca 162 days ago

                    I don't understand why you would opt in to share your data. Is it because you believe it would help improve the model and you would benefit from it? Or is it something altruistic?

                    • poly2it 162 days ago

                      I'd assume the lay user, already suffering from cookie pop-up fatigue, won't pay much attention to these privacy toggles.

                      • jimmont 162 days ago

                        I think it's just a general lack of awareness of the effect, or in many instances different economic incentives, like academics who want to commoditize their intellectual output across all available distribution channels. Tyler Cowen, for example. The AI companies are in a race to the bottom.

                        • Juminuvi 162 days ago

                          I always assumed the folks who intentionally do this either work for the company, are associated with the company, or are in some way part of a QA pilot user group.

                          • croes 162 days ago

                            It's opt-out, so if you missed that change you're sharing your data without knowing it.

                          • esbranson 161 days ago

                            TOSBack.org, supported by the Electronic Frontier Foundation, lists changes in terms and policies sequentially.[1][2]

                            [1] https://opentermsarchive.org/en/collections/genai/

                            [2] https://github.com/OpenTermsArchive/GenAI-versions/tree/main...

                            • lostmsu 162 days ago

                              Were they not using the data from Claude Code for training before this change? After this change, will they not train on my code if I switch this off (Claude Pro sub)?

                              • jkaplowitz 162 days ago

                                From their FAQ at the bottom of the linked page:

                                “Previous chats with no additional activity will not be used for model training.”

                                So, I guess they weren’t. You can switch it off and keep it that way.

                              • adubashi 162 days ago

                                5-year retention is nuts. Amazon’s for AWS is 30 days.

                                • LocalH 162 days ago

                                  "Accept" or "Not Now". Weasel words that mean they'll spam you about it again at some point.

                                  • lenerdenator 162 days ago

                                    I mean, it's great that there's at least an opt-out, but the whole appeal of Anthropic, and of giving them money, was that they explicitly didn't do anything with your data. Or at least that was the impression I had.

                                    When you see this kind of thing it makes you wonder what else they'll try to do to get around your opt-out.

                                    • croes 161 days ago

                                      Everything data-related should be opt-in; otherwise they just try to sneak it in.

                                    • owebboy 162 days ago

                                      eek. opt-in default. 5 year retention. i knew that something like this was coming, but it's a hard pill to swallow

                                      • croes 161 days ago

                                        Opt-out default