  • scottLobster 1 day ago
    Unfortunately there's no money in privacy, and a lot of money in either outright selling data or cutting costs to the bare minimum required to avoid legal liability.

    Wife and I are expecting our third child, and despite my not doing much googling or research into it (we already know a lot from the first two) the algorithms across the board found out somehow. Even my instagram "Explore" tab that I accidentally select every now and then started getting weirdly filled with pictures of pregnant women.

    It is what it is at this point. Also I finally got my last settlement check from Equifax, which paid for Chipotle. Yay!

    • jablongo 23 hours ago
      Interestingly, in healthcare there is a correlation between companies that license/sell healthcare data to others (usually they try to do this in a revocable way with very stringent legal terms, but sometimes they just sell it outright if there is enough money involved) and their privacy stance... and it's not what you would think. Often it's these companies that are pushing for more stringent privacy laws and practices. For example, they could claim that they cannot share anonymized data with academic researchers because of xyz virtuous privacy rules, when they are actually the ones making money off of selling patient data. It's an interesting phenomenon I have observed while working in the industry, and it seems to refute your claim that "there's no money in privacy". Another way to think about it is that they want to induce a lower overall supply of the commodity they are selling, and they do this by championing privacy rules.
      • philipallstar 10 hours ago
        Totally agree. Have seen this in the UK with the NHS.
    • supertrope 1 day ago
      As new moms tend to change their consumer purchasing habits, they are coveted by advertisers. http://www.nytimes.com/2012/02/19/magazine/shopping-habits.h... Certain cohorts and keywords are very valuable, so even searching a medical condition once or clicking on a hiring ad for an in-demand job can shift ads in that direction for a long time.
      • Larrikin 16 hours ago
        It seems more important than ever to have self-hosted apps or browser extensions that will intermittently search for these valuable keywords. AdNauseam is much better than bare uBlock Origin for the same reason.
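
        For illustration, a minimal sketch of such a keyword-noise tool, with an assumed keyword list and a stand-in search endpoint (swap in whatever engine or ad network you actually want to confuse):

          # Periodically issue searches for high-value ad keywords to pollute the
          # interest profile built from your traffic. Endpoint and keywords are
          # illustrative placeholders, not a real product.
          import random, time, urllib.parse, urllib.request

          DECOY_KEYWORDS = [
              "prenatal vitamins", "mesothelioma lawyer", "business class flights",
              "luxury SUV lease", "mortgage refinance rates", "online MBA programs",
          ]

          def decoy_search(keyword: str) -> None:
              url = "https://html.duckduckgo.com/html/?q=" + urllib.parse.quote(keyword)
              req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
              with urllib.request.urlopen(req, timeout=10) as resp:
                  resp.read()  # fetch and discard; the request itself is the point

          if __name__ == "__main__":
              while True:
                  decoy_search(random.choice(DECOY_KEYWORDS))
                  time.sleep(random.uniform(600, 3600))  # irregular intervals look more human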
      • scottLobster 19 hours ago
        Yeah, I'm less shocked that it got picked up and more shocked by how quickly it spread to literally every platform we use, even those that wouldn't have much if any hint that it was happening.

        There's clearly quite the active market for this information.

    • miki123211 6 hours ago
      > the algorithms across the board found out somehow.

      It's worth keeping in mind that this is basically untrue.

      In most of these algorithms, there's no "is_expecting: True" field. There are just some strange vectors of mysterious numbers, which can be more or less similar to other vectors of mysterious numbers.

      The algorithms have figured out that certain ad vectors are more likely to be clicked if your user vector exhibits some pattern, and that some actions (keywords, purchases, slowing down your scroll speed when you see a particular image) should make your vector go in that direction.
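
      A toy sketch of that mechanism, assuming a generic embedding / cosine-similarity setup (the names, dimensions, and numbers here are made up, not any platform's real code):

        # Toy model of embedding-based ad targeting. A user is a vector, each ad is a
        # vector, and "likely to click" just means the vectors point in similar directions.
        import numpy as np

        def cosine(a, b):
            return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

        rng = np.random.default_rng(0)
        user = rng.normal(size=64)      # learned from behavior; no "is_expecting" field anywhere
        ad_baby = rng.normal(size=64)   # learned from who tends to click this ad
        ad_car = rng.normal(size=64)

        # A behavioral signal (keyword, purchase, lingering on an image) nudges the user
        # vector toward the vectors of users who did the same thing.
        signal = ad_baby + rng.normal(scale=0.5, size=64)
        user = 0.8 * user + 0.2 * signal

        print("baby ad score:", cosine(user, ad_baby))   # goes up after the nudge
        print("car ad score:", cosine(user, ad_car))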

    • kleiba 9 hours ago
      > Unfortunately there's no money in privacy

      But there should be, and there should be punishments for data breaches, or at least compensation for those affected. Then there would be an incentive for corporations to take their users' privacy more seriously.

      Your personal data is basically the currency of the digital world. This is why data about you is collected left, right, and center. It's valuable.

      When I trust a bank to safely lock away my grandmother's jewelry, I have to pay for it, but in return, if it just so happens that the bank gets broken into and all my possessions get stolen, at least I'll get (some) compensation.

      When I give my valuable data to a company, I have already paid them (with the data themselves), but I have no case whatsoever if they get compromised.

    • maxtaco 1 day ago
      Also on the front page of HN right now is a job posting for Optery (YC W22). Seems like they are growing really fast.
    • vasco 1 day ago
      Could be as simple as buying a bunch of scent-free soap / lotion and some specific vitamin supplements. Target was able to detect pregnancy reliably back in 2012 from just its own shopping data.
      • throwway120385 1 day ago
        Regular purchase of prenatal vitamins is probably a huge marker for either being pregnant or intention to become pregnant.
        • simulator5g 23 hours ago
          Just shopping in the store and lingering by those products for a few moments is enough for the algorithm to detect a possible pregnancy. They use Bluetooth beacons & camera software to see how long you look at everything in the store.
          • Larrikin 16 hours ago
            Facial recognition may be possible. BLE beacons were a useful technology that is now dead because even ten years ago it was being abused for this. It's fully blocked now unless you jump through a ton of hoops.
    • jerlam 1 day ago
      Also possible they have your location if you went to the hospital. Maybe from any Meta "partners" or third party brokers.
  • gen220 1 day ago
    FYI, there's a .gov-maintained portal where healthcare companies in the U.S. are legally obliged to publish data breaches. It's an interesting dataset!

    https://ocrportal.hhs.gov/ocr/breach/breach_report.jsf

    • fhsm 19 hours ago
      This is a suboptimal characterization of this site.

      I think it would be less wrong to say this is where covered entities report breaches of PHI (whether their own or a BA's) that they discover and that trigger the immediate reporting obligation.

      This is a narrower scope of coverage and shallower depth of epistemic obligation than you implied.

    • mexicocitinluez 8 hours ago
      One of my favorite HIPAA stories is about a doctor who utilized his patient list when sending out campaign-related information when he was running for local office. Over 2 decades of schooling and still didn't understand how stupid this was.
  • didgetmaster 23 hours ago
    Almost every company issues a 'we care about your privacy' statement, but there are often very few 'money where your mouth is' resources to back that up.

    This is why I am almost always very reluctant to give out any information that is not absolutely necessary to provide me the service that I need. If they don't know it, they can't leak it.

    Every company wants you to fill out their standard form that tries to get you to volunteer way more info than they really need.

    • rationalist 16 hours ago
      If it's a paper form, I leave it blank. If it's a digital form and required, I put in the business's own phone number, address, etc.
  • sehugg 1 day ago
    > Several maps created to assist the agency with decisions — like where to open new offices and allocate certain resources — were made public through incorrect privacy settings between 2021 and 2025 ... the mapping website was unable to identify who viewed the maps ... implemented a secure map policy that prohibits uploading customer data to public mapping websites.

    So a state employee/contractor (doesn't say) uploaded unaggregated customer records to a mapping website hosted on the public internet?

  • 1970-01-01 22 hours ago
    And everyone was fired, the top management has stepped down, and the fines were so massive that nobody ever took a chance with sloppy security ever again. Oh, it's actually the opposite of all that.
  • xbar 1 day ago
    The last time this happened, did the AG prosecute the person who discovered the vulnerable data?
    • larrymcp 1 day ago
      Ah, I think I recall the story you're referring to: reporter Josh Renaud of the St. Louis Post-Dispatch discovered that a public web site was exposing Social Security numbers of teachers in Missouri. He notified the site's administrators, and later published a story about the leak after it was fixed.

      The governor of Missouri at the time, Mike Parson, called him a hacker and advocated prosecuting him. Fortunately the prosecutor's office declined to file charges though.

  • tonymet 23 hours ago
    I've built healthcare SaaS APIs that required custom integrations with EHR partners, as well as consulted on similar apps for others.

    On top of common OWASP vulnerabilities, the bigger concern is that EHR and provider service apps do not have the robust security practices needed to defend against attacks. They aren't doing active pen testing, red-teaming, supply chain auditing -- all of the recurring and costly practices necessary to ensure asset security.

    There are many regulations, HIPAA being the most notable, but their requirements and the audit process are incredibly primitive. They are still using a 1990s threat model. Despite HIPAA audits being expensive, the discoveries are trivial, and they are not recurring, so vulnerabilities can be introduced between the audit itself and the delivery of the audit summary.

  • cosmotic 1 day ago
    I'm sure they "take security very seriously".
    • A4ET8a8uTh0_v2 1 day ago
      I will admit that a level of fatigue has reached me as well. I am not even sure what would be an appropriate remedy at this point. My information has been all over the place given multiple breaches the past few years (and, I might add, my kid's info too, as we visited a hospital for her once).

      Anyway, short of collapsing the current data broker system, I am not sure what the answer is. The Experian debacle showed us they are too politically entrenched to be touched by regular means.

      At this point, I am going through life assuming most of my data is up for grabs. That is not a healthy way to live though.

      • stackskipton 1 day ago
        >I am not even sure what would be an appropriate remedy at this point.

        It will have to be political, and it's got to be fines/damages that are business-impacting enough for companies to pause and be like A) Is it worth collecting this data and storing it forever? and B) If I don't treat InfoSec as an important business function, it could cost me my business.

        It's also clear that certification systems do not work, and any law/policy around them should not offer any upside for acquiring them.

        EDIT: I also realize in United States, this won't happen.

        • dajtxx 18 hours ago
          I agree, but I think the problem will be that if the consequences are that dire, entire classes of business will cease to exist OR the cost of doing things properly will be passed on to the consumer.

          I struggle to see how data brokers, social media, etc are a net benefit to society so would be happy to see those sorts of businesses cease to exist, but I suspect I'm in the minority.

          • miki123211 6 hours ago
            The entire targeted advertising industry is basically a progressive tax.

            The "social contract" is that many services are fully or partially financed by advertising. Rich people produce more ad revenue (because they spend more), but they get the same quality of service, effectively subsidizing access for the poorer part of the population, who couldn't afford it otherwise.

            If we break this social contract down, companies will still try to extract as much revenue as possible, but the only way to do that will be through feature gating, price discrimination, and generally making your life a misery unless you make a lot of money.

        • closeparen 20 hours ago
          The State of Illinois is going to lose its "business" already for other reasons. Do you think there is a reasonable privacy regime that prevents health systems from knowing where their patients live or using that information to site clinics?
          • fc417fc802 18 hours ago
            Why is my data freely and instantly available within a centralized "health system" to begin with? Why can't we implement a digital equivalent of clunky paper records? Everything E2EE. Local storage requiring in person human intervention to access. When a new provider wants my records from an old one there should be a cryptographic dance involving all three parties. Signed request, signed patient authorization, and then reencryption for the receiving party using the request key.

            What the health system should impose is a standard for interoperability. Not an internal network that presents a juicy target.
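
            A minimal sketch of that dance using PyNaCl, with invented message formats and keys assumed to be distributed out of band (illustrative only, not a real health-records protocol):

              # Three parties: patient, old provider, new provider. The new provider signs a
              # request carrying its encryption key, the patient signs an authorization, and
              # the old provider verifies both before re-encrypting the record to that key.
              from nacl.signing import SigningKey
              from nacl.public import PrivateKey, PublicKey, SealedBox

              # Long-lived identity (signing) keys, assumed exchanged out of band.
              patient_id = SigningKey.generate()
              new_provider_id = SigningKey.generate()

              # Ephemeral encryption key pair generated by the new provider for this request.
              request_key = PrivateKey.generate()

              # 1. New provider signs a request that embeds its encryption public key.
              request = b"transfer-request|patient:12345|key:" + bytes(request_key.public_key)
              signed_request = new_provider_id.sign(request)

              # 2. Patient signs an authorization referencing that exact request.
              signed_authorization = patient_id.sign(b"authorize|" + request)

              # 3. Old provider verifies both signatures (BadSignatureError if forged), then
              #    re-encrypts the locally held record to the key carried in the request.
              verified = new_provider_id.verify_key.verify(signed_request)
              patient_id.verify_key.verify(signed_authorization)
              recipient_key = PublicKey(verified[-32:])  # the 32 key bytes appended above
              ciphertext = SealedBox(recipient_key).encrypt(b"<patient record plaintext>")

              # 4. Only the new provider's private key can open the result.
              print(SealedBox(request_key).decrypt(ciphertext))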

      • closeparen 1 day ago
        This has nothing to do with the "data broker system." Reading between the lines it was more of a "shadow IT" issue where employees were using some presumably third-party GIS service for a legitimate business purpose but without a proper authentication & authorization setup.
        • A4ET8a8uTh0_v2 23 hours ago
          Assuming your tea-leaf reading is correct, that particular third party would not even exist in its current form without the 'data broker ecosystem'. It is, genuinely, the original sin.
          • closeparen 20 hours ago
            A website where you can upload POIs to a shareable map seems like one of those things that's so obvious and so useful it would exist under almost any economic arrangement of the advertising industry.

            I get that data brokers and big tech are a much sexier topic, but this breach - like so many of the most pressing threats to our privacy - is a mundane shortage of competence and giving-a-shit in the IT activities of boring old organizations.

            • A4ET8a8uTh0_v2 19 hours ago
              Heh. The shareable map is operated by someone, and to that someone the information other people crowdsourced for them for free is even more valuable. If you want a more relatable example, I would point to a defunct effort (karma or something.. I can't find the specifics now) where people were invited to crowdsource all sorts of info on other people. It only got shut down because it was too on the nose. On the other hand, something like the shareable map you mention is more easily defensible...

              << I get that data brokers and big tech are a much sexier topic, but this breach - like so many of the most pressing threats to our privacy - is a mundane shortage of competence and giving-a-shit in the IT activities of boring old organizations.

              I posit that both could be true at the same time.

              • fc417fc802 17 hours ago
                I think OSM would exist regardless of data brokers. Free services ingesting that data and letting a user annotate it would also exist. People create and operate all sorts of little projects for fun.
      • miki123211 6 hours ago
        Did you actually suffer any negative consequences of these breaches?

        I see so many comments about how punishments for data breaches should be increased, but not a single story about quantifiable harm that any of those commenters has suffered from them.

      • hmokiguess 1 day ago
        If you want to get more stressed about it and consider the impending dystopian future, I invite you to think about the “harvest now, decrypt later” potential reality that quantum computing is going to enable.

        At some point, everything that we have ever assumed to be confidential and secure will be exposed and up for grabs.

      • skeptic_ai 1 day ago
        Change name to a very common one. Much better privacy.
        • fhsm 18 hours ago
          I’m from a culture in which families use a very small number of highly conserved names and non-standard name positions. I’ve noticed this is sufficient to confuse the low-rent data brokers that do statistical linkage. My parents, grandparents, siblings, and children have all at various points shared addresses and landlines and have overlapping names. The brokers are very unclear on how many people are involved, what sex, what generations, what states.
        • EvanAnderson 22 hours ago
          I grew up around some people with the last name "Null". I often wonder how they're doing for data privacy today.
    • SilverElfin 1 day ago
      Would you like 2 years of credit monitoring? Or perhaps you can get $5 from this class action settlement.
      • rationalist 15 hours ago
        I don't even understand paid credit monitoring.

        Each of the big three credit bureaus offers a free account where they email me if something changes and allow me to freeze and thaw my credit.

  • Xeoncross 1 day ago
    Restrict data collection? It would kill all startups and firmly entrench a terrible provider monopoly who can comply.

    Have the government own data collection? Yeah, I don't even know where to start with all the problems this would cause.

    Ignore it and let companies keep abusing customers? Nope.

    Stop letting class-action lawsuits slap the company's wrists and then give $0.16 payouts to everyone?

    What exactly do we do without killing innovation, building moats around incumbents, giving all the power to politicians who will just do what the lobbyists ask (statistically), or accepting things as is?

    • nemomarx 1 day ago
      Why do the start ups need to collect data like this?
      • thinkingtoilet 1 day ago
        I work for a medical technology company. How do you propose we service our customers without their medical data?
        • rainonmoon 9 hours ago
          I just registered CVEs in several platforms in a related industry, the founders of whom likely all asked themselves a similar question. And yet, it's the wrong question. The right one is, "Does this company need to exist?" I don't know you or your company. Maybe it's great. But many startups are born thinking there's a technological answer to a question that requires a social/political one. And instead of fixing the problem, the same founders use their newfound wealth to lobby to entrench the problem that justifies their company's existence, rather than resolves the need for it to exist in the first place. "How do you propose we service our customers without their medical data?" Fix your fucked healthcare system.
        • nemomarx 1 day ago
          Does it need to be hosted on your servers? Could you provide something to the customers where they host the data or their local doctors office does it?

          Can you delete it after the shortest possible period of using it, potentially? Do you keep data after someone stops being a customer or stops actively using the tech?

          • fhsm 17 hours ago
            Record retention is covered by a complex set of overlapping regulations and contracts. It depends on much more than date of service: M&A activity, interstate operations, subsequent changes in patient mental status, etc. can all cause the retention horizon to change well after the last encounter.

            As all the comments in this thread suggest, the cost of having an extra record, even an extra breached record, is low. The cost of failing to produce a required medical record is high.

            Put this together with dropping storage prices, razor-thin margins, and IT estates made out of thousands of specialized point solutions cobbled together with every integration pattern ever invented, and you get a de facto retention of infinity paired with a de jure obligation of could-be-anything-tomorrow.

          • 35fbe7d3d5b9 20 hours ago
            Professionally, my company builds one of the largest EHR-integrated web apps in the US.

            Ask me how many medical practices connect every day via IE on Windows 8.

          • closeparen 23 hours ago
            Having seen this world up close, the absolute last place you ever want your medical data to be is on the Windows Server in the closet of your local doctors office. The public cloud account of a Silicon Valley type company that hires reasonably competent people is Fort Knox by comparison.
            • fc417fc802 17 hours ago
              Yeah, but a local private practice is a fairly small target. No one is going to break into my house just to steal my medical records, for example.

              This could also be drastically improved by the government spearheading a FOSS project for medical data management (archival, backup, etc). A single offering from the US federal government would have a massive return on investment in terms of impact per dollar spent.

              Maybe the DOGE staff could finally be put to good use.

              • nradov 5 hours ago
                You seem to be confused about how this works. Attackers use automated scripts to locate vulnerable systems. Small local private practices are always targeted because everything is targeted. The notion of the US federal government offering an online data backup service is ludicrous, and wouldn't have even prevented the breach in this article.
                • fc417fc802 4 hours ago
                  > Attackers use automated scripts to locate vulnerable systems.

                  I'm aware. I thought we were talking about something a bit higher effort than that.

                  > online data backup service

                  That isn't what I said. I suggested federally backed FOSS tooling for the specific usecase. If nothing else that would ensure that low effort scanners came up empty by providing purpose built software hardened against the expected attack vectors. Since it seems we're worrying about the potential for broader system misconfiguration they could even provide a blessed OS image.

                  The breach in the article has nothing to do with what we're talking about. That was a case of shadow IT messing up. There's not much you can do about that.

        • cheeseomlit 1 day ago
          Ask for it?
          • pear01 1 day ago
            I hope you're joking...

            Otherwise it would suggest you think the problem is that they didn't ask? When was the last time you saw a customer read a terms of service? Or better yet, reject a product because of said terms once they hit that part of the customer journey?

            The issue isn't about asking; it's that, for take-your-pick of reasons, no one ever says no. The asking is thus pro forma and irrelevant.

    • troupo 1 day ago
      We apply crippling fines on companies and executives that let these breaches happen.

      Yes, some breaches (actual hack attacks) are unavoidable, so you don't slap a fine on every breach. But the vast majority of "breaches" are pure negligence.

    • ourmandave 1 day ago
      Honestly I'd take the 16 cents. Usually it's a discount voucher for a product you'd never buy.

      Or if it's a freebie then it's hidden behind a plain text link 3 levels deep on their website.

    • gassi 1 day ago
      [deleted]
    • apercu 1 day ago
      > Restrict data collection? It would kill all startups and firmly entrench a terrible provider monopoly who can comply.

      That's a terrible argument for allowing our data to be sprayed everywhere. How about regulations with teeth that prohibit "dragons" from hoarding data about us? I do not care what the impact is on the "economy". That ship sailed with the current government in the US.

      Or, both more and less likely, cut us in on the revenue. That will at least help offset some of the time we have to waste doing a bunch of work every time some company "loses" our data.

      I'm tired of subsidizing the wealth and capital class. Pay us for holding our data or make our data toxic.

      Obviously my health provider and my bank need my data. But no one else does. And if my bank or health provider need to share my data with a third party it should be anonymized and tokenized.

      None of this is hard; we simply lack will (and most consumers, like voters, are pretty ignorant).

    • logicchains 1 day ago
      The solution is to anonymize all data at the source, i.e. use a unique randomized ID as the key instead of someone's name/SSN. Then the medical provider would store the UID->name mapping in a separate, easily secured (and ideally air-gapped) system, for the few times it is necessary to use it.
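
      A minimal sketch of that split, with plain dicts standing in for the two stores (illustrative only):

        # Records carry only a random UID; the UID -> identity mapping lives in a
        # separate, tightly controlled (ideally air-gapped) store.
        import uuid

        identity_vault = {}    # the separate, locked-down system
        clinical_records = {}  # the system that integrations and analytics touch

        def register_patient(name: str, ssn: str) -> str:
            uid = str(uuid.uuid4())  # random, reveals nothing about the person
            identity_vault[uid] = {"name": name, "ssn": ssn}
            return uid

        def add_record(uid: str, diagnosis_code: str) -> None:
            clinical_records.setdefault(uid, []).append(diagnosis_code)

        uid = register_patient("Jane Doe", "000-00-0000")
        add_record(uid, "Z34.90")
        # A breach of clinical_records alone exposes no direct identifiers;
        # re-identification then requires the vault (or, as noted below, enough
        # quasi-identifiers in the records themselves).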
      • nradov 5 hours ago
        What a silly idea. That would completely prevent federally mandated interoperability APIs from working. While privacy breaches are obviously a problem, most consumers don't want care quality and coordination harmed just for the sake of a minor security improvement.

        https://www.cms.gov/priorities/burden-reduction/overview/int...

      • dredmorbius 20 hours ago
        ...use a unique randomized ID as the key...

        33 bits is all that's required to individually identify any person on Earth.

        If you'd like to extend that to the 420 billion or so who've lived since 1800, that extends to 39 bits, still a trivially small amount.
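
        For concreteness, the arithmetic behind those figures (you need ceil(log2(N)) bits to distinguish N people; population estimates are approximate):

          import math
          print(math.ceil(math.log2(8.1e9)))   # ~8.1 billion people alive today -> 33
          print(math.ceil(math.log2(420e9)))   # ~420 billion since 1800 (per above) -> 39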

        Every bit[1] of leaked data cuts that set in half, and simply anonymising IDs does virtually nothing by itself to obscure identity. Such critical medical and billing data as date of birth and postal code are themselves sufficient to narrow things down remarkably, let alone a specific set of diagnoses, procedures, providers, and medications. Much as browser fingerprints are often unique or nearly so without any universal identifier, so are medical histories.

        I'm personally aware of diagnostic and procedure codes being used to identify "anonymised" patients across multiple datasets dating to the early 1990s, and of research into de-anonymisation in Australia as of the mid-to-late 1990s. Australia publishes anonymisation and privacy guidelines, e.g.:

        "Data De‑identification in Australia: Essential Compliance Guide"

        <https://sprintlaw.com.au/articles/data-de-identification-in-...>

        "De-identification and the Privacy Act" (2018)

        <https://www.oaic.gov.au/privacy/privacy-guidance-for-organis...>

        It's not sufficient merely to substitute an alternative primary key; you must also fuzz the data, including birthdates, addresses, diagnostic and procedure codes, treatment dates, etc., all of which both reduces the clinical value of the data and is difficult to do sufficiently.

        ________________________________

        Notes:

        1. In the "binary digit" sense, not in the colloquial "small increment" sense.

  • citizenpaul 1 day ago
    I've been saying this forever. Computer security is and always will be nothing more than theater, with some minimal effort to cover the bases, like hiring an infosec team and then ignoring them. No one in charge cares about security because the number of people in charge punished for these breaches is still ZERO.
  • theLegionWithin 19 hours ago
    One more reason to overhaul the system. If a health care provider has a security incident they should be sued for the value of the data - and if that bankrupts them, then other providers will (hopefully) learn from that mistake. Sort of like OSHA.
  • teeray 1 day ago
    Sounds like some patients are in for some lucrative free credit and identity monitoring /s
  • renewiltord 1 day ago
    Until we can guarantee privacy and security maybe it’s best we shut down Illinois health care system.
    • jckahn 1 day ago
      Assuming this is a serious comment, what do you propose instead if the health system is shut down?
  • rvz 1 day ago
    I just heard a chorus of AI agents rejoicing that there's more private data now made publicly available to train on.
    • swores 1 day ago
      It's not often anybody writes a sentence that combines cynicism/negativity about AI with anthropomorphising AI agents!