20 comments

  • vovavili 19 minutes ago

    Replacing an 11.6GB Parquet file every 5 minutes strikes me as a bit wasteful. I would probably use Apache Iceberg here.

    • ai-inquisitor 10 minutes ago

      It's not doing that. If you look at the repository, it's adding a new commit with tiny Parquet files every 5 minutes. This recent one was only a 20.9 KB Parquet file: https://huggingface.co/datasets/open-index/hacker-news/commi... and the ones before it were a median of 5 KB: https://huggingface.co/datasets/open-index/hacker-news/tree/...

      The bigger concern is how large the git history is going to get on the repository.

      • zerocrates 8 minutes ago

        "The dataset is organized as one Parquet file per calendar month, plus 5-minute live files for today's activity. Every 5 minutes, new items are fetched from the source and committed directly as a single Parquet block. At midnight UTC, the entire current month is refetched from the source as a single authoritative Parquet file, and today's individual 5-minute blocks are removed from the today/ directory."

        So it's not really one big file getting replaced all the time. Though a less extreme variation of that is happening day to day.

        • fabmilo 17 minutes ago

          Was thinking the same thing. Probably once a day would be more than enough, and if you really want minute-by-minute granularity, a delta file from the previous day should suffice.

        • maxloh 8 minutes ago

          Could you also release the source code behind the automatic update system?

          • xnx 2 hours ago

            The best source for this data used to be Clickhouse (https://play.clickhouse.com/play?user=play#U0VMRUNUIG1heCh0a...), but it hasn't updated since 2025-12-26.

            • epogrebnyak 16 minutes ago

              Wonder why the median vote count is 0; it seems every post gets at least a few votes - maybe this wasn't the case in the past

              • epogrebnyak 15 minutes ago

                Ahhh, I got it the moment I asked: there are usually no votes on comments

              • imhoguy 16 minutes ago

                Yay! So much knowledge in just 11GB. Adding to my end of the World hoarding stash!

                • robotswantdata 28 minutes ago

                  Where’s the opt out ?

                  • john_strinlai 26 minutes ago

                    Hacker News is very upfront that they do not really care about deletion requests or anything of that sort, so the opt-out is to not use Hacker News.

                    • ratg13 12 minutes ago

                      Create a new account every so often, don’t leave any identifying information, occasionally switch up the way you spell words (British/US English), and alternate using different slang words and shorthand.

                      • tantalor 18 minutes ago

                        The back button

                      • gkbrk 2 hours ago

                        My Hacker News items table in ClickHouse has 47,428,860 items, and it's 5.82 GB compressed and 18.18 GB uncompressed. What makes Parquet compression worse here, when both formats are columnar?

                        • 0cf8612b2e1e 2 hours ago

                          Sorting, compression algorithm +level, and data types can all have an impact. I noted elsewhere that a Boolean is getting represented as an integer. That’s one bit vs 1-4 bytes.

                          There is also flexibility in what you define as the dataset. Skinnier but more focused tables could save space vs. a wide table that covers everything, which will probably break compressible runs of data.
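                          A rough stdlib sketch of the type-width point (illustrative only; the flag data here is made up): a million sparse 0/1 flags take 8,000,000 bytes as 64-bit integers but 125,000 bytes bit-packed, which is roughly what a Parquet BOOLEAN column looks like before any compression is applied.

```python
import array
import random

random.seed(0)
# Sparse 0/1 flags, like a deleted/dead column might be
flags = [random.random() < 0.05 for _ in range(1_000_000)]

# Stored as 64-bit integers:
as_ints = array.array("q", (int(f) for f in flags))

# Bit-packed, roughly what a Parquet BOOLEAN column is before compression:
packed = bytearray((len(flags) + 7) // 8)
for i, f in enumerate(flags):
    if f:
        packed[i // 8] |= 1 << (i % 8)

print(len(as_ints) * as_ints.itemsize)  # 8000000 bytes
print(len(packed))                      # 125000 bytes
```

                          Compression and dictionary encoding narrow the gap in practice, but the narrower type gives the codec far less work to do.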

                          • xnx 2 hours ago

                            Parquet has a few compression options. Not sure which one they are using.

                            • hirako2000 2 hours ago

                              Plus, Parquet isn't the least wasteful format; native DuckDB storage, for instance, compacts better. That's not just down to the compression algorithm, which, as you say, has a few options for Parquet.
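                              How much the codec itself matters can be illustrated with the stdlib (zlib and lzma standing in here for Parquet's snappy/gzip/zstd options; the column-like data is made up):

```python
import lzma
import zlib

# Repetitive, sorted, column-like data - the kind of runs sorting preserves.
data = b"".join(str(i // 100).zfill(8).encode() for i in range(100_000))

raw = len(data)
fast = len(zlib.compress(data, level=1))  # cheap codec, bigger output
tight = len(lzma.compress(data))          # expensive codec, smaller output
print(raw, fast, tight)
```

                              Cheap codecs trade size for speed; that per-column choice is part of why two columnar stores can differ by gigabytes on the same data.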

                          • brtkwr 20 minutes ago

                            This comment should make it into the download in a few mins.

                          • kshacker 59 minutes ago

                            Good for a demo, but every 5 minutes? Why?

                            • Imustaskforhelp 46 minutes ago

                              I can think of some good use cases for it. Personally I really appreciate the 5-minute updates.

                            • mlhpdx 1 hour ago

                              Static web content and dynamic data?

                              > The archive currently spans from 2006-10 to 2026-03-16 23:55 UTC, with 47,358,772 items committed.

                              That’s more than 5 minutes ago by a day or two. No big deal, but a little bit depressing this is still how we do things in 2026.

                              • voxic11 35 minutes ago

                                That is just the archive part; if you finished reading the paragraph you would know that items since 2026-03-16 23:55 UTC "are fetched every 5 minutes and committed directly as individual Parquet files through an automated live pipeline, so the dataset stays current with the site itself."

                                So to get all the data you need to grab the archive and all the 5 minute update files.

                                archive data is here https://huggingface.co/datasets/open-index/hacker-news/tree/...

                                update files are here (I know it's called "today" but it actually includes all the update files, which span multiple days at this point) https://huggingface.co/datasets/open-index/hacker-news/tree/...
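                                So a full read is the archive plus the updates, with newer batches winning per item id. A minimal sketch of that merge rule using plain dicts (the Parquet loading itself, e.g. via DuckDB or pyarrow, is omitted; the ids and fields are hypothetical):

```python
def merge_items(archive_rows, update_batches):
    """Combine archive rows with 5-minute update batches; last write wins."""
    by_id = {row["id"]: row for row in archive_rows}
    for batch in update_batches:       # batches in chronological order
        for row in batch:
            by_id[row["id"]] = row     # a re-fetched item replaces older data
    return sorted(by_id.values(), key=lambda r: r["id"])

archive = [{"id": 1, "score": 10}, {"id": 2, "score": 3}]
updates = [[{"id": 2, "score": 5}], [{"id": 3, "score": 1}]]
print(merge_items(archive, updates))
# [{'id': 1, 'score': 10}, {'id': 2, 'score': 5}, {'id': 3, 'score': 1}]
```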

                                • john_strinlai 31 minutes ago

                                  >if you just would finish reading the paragraph

                                  probably uncalled for

                                • xandrius 47 minutes ago

                                  I don't get what you meant with this comment.

                                  • john_strinlai 38 minutes ago

                                    The data updates every 5 minutes, but the description on Hugging Face says the last update was 2 days ago.

                                    They are suggesting that the Hugging Face description should automatically update the date and item count when the data gets updated.

                                    • voxic11 34 minutes ago

                                      No that is the date at which the bulk archive ends and the 5 minute update files begin, so it should not be updated.

                                • alstonite 1 hour ago

                                  What happened between 2023 and 2024 to cause the usage dropoff?

                                  • ghgr 1 hour ago

                                    I'd say it's less a usage dropoff and more a reversion to the mean after Covid

                                    • tehjoker 1 hour ago

                                      That's a possible hypothesis, but there was also a rising trend beforehand; it wasn't stable.

                                    • imhoguy 23 minutes ago

                                      Return to office

                                    • lyu07282 1 hour ago

                                      Please upload to https://academictorrents.com/ as well if possible

                                      • palmotea 2 hours ago

                                        > At midnight UTC, the entire current month is refetched from the source as a single authoritative Parquet file, and today's individual 5-minute blocks are removed from the today/ directory.

                                        Wouldn't that lose deleted/moderated comments?

                                        • BoredPositron 1 hour ago

                                          I guess that's the point.

                                          • Imustaskforhelp 44 minutes ago

                                            Couldn't someone create an automatic script which just copies the files, say, 5 minutes before midnight UTC?

                                        • 0cf8612b2e1e 2 hours ago

                                          Under the Known Limitations section

                                            deleted and dead are integers. They are stored as 0/1 rather than booleans.
                                          
                                          Is there a technical reason to do this? You have the type right there.

                                            • Imustaskforhelp 47 minutes ago

                                              As someone who made a project analysing Hacker News with ClickHouse, I really feel like this is a project made for me (especially the updated-every-5-minutes aspect, which would have helped my project back then too!).

                                              Your project actually helps me a ton with one of the new Hacker News project ideas that I had put on the back burner.

                                              I had thought of making a ping website where people can just write @username, and a service detects it and sends mail to that username if they have signed up (similar to a service run by someone from the HN community which mails you every time someone responds to your thread directly, but this time as a sort of ping).

                                              [The idea came when I tried to ping someone to show them something relevant and thought, wait a minute, something like a ping that mails you might be interesting. I tried to see if I could use Algolia or another service to hook things up, but nothing made much sense back then, so the idea stayed in the back of my mind. This dataset sort of solves it by being updated every 5 minutes.]

                                              Your 5-minute updates really make it possible. I will look at what I can do with that in a few days, but I am seeing some discrepancy in the 5-minute updates, as the last one in the readme seems to be 16 March, so I would love to know whether it is really updated every 5 minutes - it feels phenomenal if true, and it's exciting to think of the new possibilities it unlocks.

                                            • tonymet 1 hour ago

                                              what's the license for HN content?

                                              • echelon 1 hour ago

                                                At this point, you can train on anything without repercussion.

                                                Copyright doesn't seem to matter unless you're an IP cartel or mega cap.

                                                • marginalia_nu 54 minutes ago

                                                  Laughs nervously in jurisdiction without fair use doctrine

                                              • Onavo 2 hours ago

                                                 Is it possible to only download a subset? E.g. Show HNs or HN Whoishiring. The Show HNs and HN Whoishiring posts are very useful for classroom data science, i.e. a very useful set of data for students to learn the basics of data cleaning and engineering.

                                                • nelsondev 2 hours ago

                                                   It’s date-partitioned, so you could download just a date range. It’s also Parquet, so you can download just specific columns with the right client.
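                                                   Once a month's rows are loaded (DuckDB or pyarrow would handle the Parquet reading), pulling out the Show HN or Who is hiring subset is a one-line title predicate. A stdlib sketch with made-up rows:

```python
def subset(rows, prefix):
    """Keep items whose title starts with the given prefix."""
    return [r for r in rows if (r.get("title") or "").startswith(prefix)]

rows = [
    {"id": 1, "title": "Show HN: My side project"},
    {"id": 2, "title": "Ask HN: Who is hiring? (March 2026)"},
    {"id": 3, "title": None},  # comments have no title
]
print([r["id"] for r in subset(rows, "Show HN:")])                # [1]
print([r["id"] for r in subset(rows, "Ask HN: Who is hiring?")])  # [2]
```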

                                                • lokimoon 1 hour ago

                                                  You are the product

                                                  • waynesonfire 19 minutes ago

                                                    Your reward is the endorphin hit from writing this comment.

                                                  • bstsb 2 hours ago

                                                    what’s the license? “do whatever the fuck you want with the data as long as you don’t get caught”? or does that only work for massive corporations

                                                  • GeoAtreides 2 hours ago

                                                    is the legal page a placeholder, do words have no meaning?

                                                    https://www.ycombinator.com/legal/

                                                    Mods, enforce your license terms, you're playing fast and loose with the law (GDPR/CPRA)

                                                    • Retr0id 2 hours ago

                                                      Which terms are not being enforced? (not disagreeing I just don't feel like reading a large legal document)

                                                      • GeoAtreides 2 hours ago

                                                        > By uploading any User Content you hereby grant and will grant Y Combinator and its affiliated companies

                                                        The user content is supposed to be licensed only to Y Combinator and (bleah) its affiliated companies (which are many - all the startups they fund, for example).

                                                        • jmalicki 1 hour ago

                                                          Curious why it should be on HackerNews to enforce restrictions on content they only license from you?

                                                          If it's owned by you and only licensed by HN shouldn't you be the one enforcing it?

                                                          • AndrewKemendo 1 hour ago

                                                            Seems like they are trying to do that through the stated legal intermediary (YC)

                                                          • zamadatix 1 hour ago

                                                            If you carry on the quote two more words:

                                                            > ... a nonexclusive

                                                            I.e. this section is saying that additional rights to the content you post ALSO go to YC, not that YC is guaranteeing that it (+ friends) will be the only one to hold these rights, or that it will enforce for you who else may hold rights to your publicly shared content.

                                                            There's a more intricate conversation to be had with GDPR and public data on forums in general but that's wholly unrelated to what YC's legal page says and still unlikely to end up in an alarming result.

                                                            • ryandvm 1 hour ago

                                                              That agreement is largely about "Personal Information", not the posts and comments.

                                                              That said, there are "no scraping" and "commercial use restricted" carve-outs for the content on HN. Which honestly is bullshit.

                                                            • ungruntled 2 hours ago

                                                              None that I could see:

                                                              > Your submissions to, and comments you make on, the Hacker News site are not Personal Information and are not "HN Information" as defined in this Privacy Policy.

                                                              > Other Users: certain actions you take may be visible to other users of the Services.

                                                              • GeoAtreides 2 hours ago

                                                                I mean, just because they say the comments are not PI doesn't make it so.

                                                                • ungruntled 1 hour ago

                                                                  That’s a good point. I’m only referring to the terms they used in the privacy policy.

                                                            • ryandvm 1 hour ago

                                                              Eh, fuck that agreement. I'm kind of old school in that I believe if you put it on the internet without an auth-wall, people should be allowed to do whatever they want with it. The AI companies seem to agree.

                                                              Then again, I'm not the guy that is going to get sued...

                                                              • Ylpertnodi 1 hour ago

                                                                > I believe if you put it on the internet without an auth-wall, people should be allowed to do whatever they want with it.

                                                                I agree. It's the owners of the sites that have to follow rules, not us.

                                                                • kmeisthax 1 hour ago

                                                                  "I'm kind of old school in that I believe if you put grass on the ground without a fence, people should be allowed to do whatever they want with it. The noblemen with a thousand cows seem to agree."

                                                                  And that, my friends, is how you kill the commons - by ignoring the social context surrounding its maintenance and insisting upon the most punitive ways of avoiding abuse.

                                                                  • petercooper 55 minutes ago

                                                                      Context is important, but isn’t HN’s social context, in particular, that the site is entirely public, easily crawled through its API (which apparently has next to no rate limits) and/or Algolia, and has been archived and mirrored in numerous places for years already?

                                                                    • echelon 58 minutes ago

                                                                      Signal and information are not grass.

                                                                      Grass and property require upkeep. Radio waves and electromagnetic radiation do not.

                                                                      I don't want your dog to piss on my lawn and kill my grass. But what harm does it cause me if you take a picture of my lawn? Or if I take a picture of your dog?

                                                                      If I spend $100M making a Hollywood movie - pay employees, vendors, taxes - contribute to the economic growth of the country - and then that product gets stolen and given away completely for free without being able to see upside, that's a little bit different.

                                                                      But my Hacker News comment? It's not money.

                                                                      I think there are plausible ways to draw lines that protect genuine work, effort, and economics while allowing society and innovation to benefit from the commons.

                                                                  • hsuduebc2 2 hours ago

                                                                      How is he breaking GDPR here?

                                                                    • andrewmcwatters 2 hours ago

                                                                      They already refuse to comply with CPRA, instead electing to replace your username with a random 6(?) character string, prefixed with `_`, if I remember correctly.

                                                                      I know, because I've been here since maybe 2015 or so, but this account was created in 2019.

                                                                      So any PII you have mentioned in your comments is permanent on Hacker News.

                                                                      I would appreciate it if they gave users the ability to remove all of their personal data, but in correspondence and in writing here on Hacker News itself, Dan has suggested that they value the posterity of conversations over the law.