srndsnd 5 days ago

To me, what's missing from that set of recommendations is some method to increase the liability of companies who mishandle user data.

It is insane to me that I can be notified via physical mail of months-old data breaches, some of which contained my Social Security number, and that my only recourse is to set credit freezes with multiple credit bureaus.

  • arminiusreturns 5 days ago

    I agree. Let me tell you about what just happened to me. After a very public burnout and spiral, a friend rescued me and I took a part-time gig helping a credit card processing company. About 2 months ago, the owner needed something done while I was out, and got their Uber driver to send an email. They emailed the entire customer database, including bank accounts, socials, names, addresses, and finance data, to a single customer. When I found out (it was kept hidden from me for 11 days), I said "This is a big deal, here are all the remediations, and besides PCI we have 45 days by law to notify affected customers." The owner said "we aren't going to do that", and thus I had to turn in my resignation and am now unemployed again.

    So I, for trying to do the right thing, am now scrambling for work, while the offender pretends nothing happened, having potentially violated the entire customer base, and will likely suffer no penalty unless I report it to PCI, which I would get no reward for.

    Why is it that everywhere I go, management is always doing shady stuff? I just want to do linuxy/datacentery things for someone who's honest... /cry

    My mega side project isn't close enough to do a premature launch yet. Despite my entire plan being to forgo VC/investors, I'm now considering compromising.

    • aftbit 5 days ago

      >Why is it that everywhere I go, management is always doing shady stuff?

      Well here's a cynical take on this - management is playing the business game at a higher level than you. "Shady stuff" is the natural outcome of profit motivation. Our society is fundamentally corrupt. It is designed to use the power of coercive force to protect the rights and possessions of the rich against the threat of violence by the poor. The only way to engage with it AND keep your hands clean is to be in a position that lets you blind yourself to the problem. At the end of the day, we are all still complicit in enabling slave labor and are beneficiaries of policies that harm the poor and our environment in order to enrich our lives.

      >unless I report it to PCI, which I would get no reward for.

      You may be looking at that backwards. Unless you report it to PCI, you are still complicit in the mishandling of the breach, even though you resigned. You might have been better off reporting it over the owner's objections, then claiming whistleblower protections if they tried to terminate you.

      This is not legal advice, I am not a lawyer, I am not your lawyer, etc.

      • arminiusreturns 5 days ago

        I did verify with an attorney that, since I wasn't involved and made sure the owner knew what was what, I had no legal obligation to disclose.

        • HansardExpert 4 days ago

          What about your moral obligation?

          • arminiusreturns 4 days ago

            People are so quick to judge without knowing the details. The situation is more complex than I'm willing to go into here, and I'm comfortable with my decision.

      • positus 5 days ago

        The problem isn't society or profit motivation. It's people. Humanity itself is corrupt. There aren't "good people" and "bad people". There's only "bad people." We're all bad people, just some of us are more comfortable with our corruption being visible to others to a higher degree.

        • ragnese 5 days ago

          > We're all bad people, just some of us are more comfortable with our corruption being visible to others to a higher degree.

          If the GP's story is true (and I have no reason to suspect otherwise), then there are clearly differences in the degree of "badness" between people. GP chose to resign from his job, while his manager chose to be negligent and dishonest.

          So, even if we're all bad people, there are less bad and more bad people, so we might as well call the less bad end of the spectrum "good". Thus, there are good and bad people.

          • positus 5 days ago

            I understand your perspective, but I maintain that "good" (morally pure) isn't a category any of us belong to. We're all lying, hateful people to one extent or another, and lying hateful people aren't "good", even if we haven't lied or hated as much as other lying, hateful people. "Less evil" isn't synonymous with "good".

            The argument that profit motivation is the origin of shady business practices ignores the existence of those businesses which pursue profit in an ethical manner. The company I work for, for instance, is highly motivated to produce a profit, but the way we go about obtaining that profit is by providing our customers with products that have real value, at fair (and competitive) prices, and by providing consistently excellent customer support. Our customers are *very* satisfied with our products and services, and they show their satisfaction with extreme brand loyalty. The profit we make year over year allows us to increase the quality of life for our employees, and keeps our employees highly motivated towards serving our customers. We pursue the good of our customers alongside our own, and we avoid shady business practices like the plague.

            • Ajedi32 a day ago

                as it is written:
              
                  None is righteous, no, not one;
                    no one understands;
                    no one seeks for God.
               
                  All have turned aside; together they have become worthless;
                    no one does good,
                    not even one.
              
              Romans 3:10-12
            • BlueTemplar 4 days ago

              Your attempt to make the situation seem asymmetric by arbitrarily defining "good" as absolute good and "evil" as everything else is unconvincing. (Why not the opposite, which would be equally ridiculous?)

              • positus a day ago

                It's like a glass of pure water. If there is anything else in the glass, it can no longer be called a glass of pure water. Likewise, the presence of evil of any quantity or quality in a person disqualifies them from being good. Apple trees do not yield blueberries, and good men do not do evil things. So if we discern in our thoughts, words, and actions the presence of evil, we can know with absolute certainty that we, too, are evil.

                Or is it supposed that hating each other, lying to one another, stealing from one another, murdering one another, failing to fulfill contract, covenant, and commitment to one another are things that should be considered good?

            • aftbit 4 days ago

              See also "The Good Place" which is an absolute sleeper of a TV show.

        • throwaway92024 4 days ago

          No. There are no good or bad people. But people do good or bad things, all the time.

        • idle_zealot 4 days ago

          What is this even supposed to mean? Profit motivation is a concept invented by humans for humans to apply. If it leads to unexpected or undesirable outcomes then it's a bad idea. A system that requires all participants be paragons of some definition of virtue to produce good results is fundamentally unsuited for human beings.

    • ValentinA23 5 days ago

      The DOJ has just launched a corporate whistleblower program; you should look into it, maybe it covers your case:

      https://www.justice.gov/criminal/criminal-division-corporate...

      >As described in more detail in the program guidance, the information must relate to one of the following areas: (1) certain crimes involving financial institutions, from traditional banks to cryptocurrency businesses; (2) foreign corruption involving misconduct by companies; (3) domestic corruption involving misconduct by companies; or (4) health care fraud schemes involving private insurance plans.

      >If the information a whistleblower submits results in a successful prosecution that includes criminal or civil forfeiture, the whistleblower may be eligible to receive an award of a percentage of the forfeited assets, depending on considerations set out in the program guidance. If you have information to report, please fill out the intake form below and submit your information via CorporateWhistleblower@usdoj.gov. Submissions are confidential to the fullest extent of the law.

    • TinyRick 5 days ago

      Why would you resign? You could have reported it yourself and then you would have whistleblower protections - if the company retaliated against you (e.g. fired you), you then would have had a strong lawsuit.

      • arminiusreturns 5 days ago

        Because I don't want to be associated with companies that knowingly break the law and violate regulations. I've long had a reputation of integrity, and it's one of the few things I have left, having almost nothing else.

        • TinyRick 5 days ago

          So you would rather be known as someone who had an opportunity to report a violation, and chose not to? From my perspective it seems like you decided against acting with integrity in this situation - the moral thing would have been to report the violation, but you chose to look the other way and resign.

          • 1659447091 5 days ago

            > it seems like you decided against acting with integrity in this situation ... you chose to look the other way and resign.

            I agree with this statement.

            This isn't a judgement; we all have to make choices. The "right" choice (the one that aligns with integrity) is usually the one that is the least self-serving and even temporarily harmful. They did what was right for them, and that's okay, but it was not the choice of integrity.

            • Dylan16807 5 days ago

              How is quitting right for them? They chose a path that's bad for the users and bad for them.

              • 1659447091 5 days ago

                Because that is the choice they made for themselves.

                How it plays out after is another matter entirely. But the choice was what they seemed to think was right, for them, at the time. Thus it was the right choice for them. It doesn't mean it was the right choice in terms of integrity, or the right choice for me, or you, or anyone whose data got caught up in it. Nor was it the right choice in terms of receiving a paycheck the next week.

                But the way it was explained, it doesn't seem like they went out of their way to pick a "wrong" choice, specifically. They picked what they felt was the right one, for them, at that time. There were less ethical options to choose as well, and those were not picked either.

                • Dylan16807 5 days ago

                  Someone choosing an action does not at all mean it's the right choice for them.

                  • 1659447091 5 days ago

                    I believe we are talking about two separate things.

                    You appear to be talking about the external consequences of choices, while I am talking about them making a choice based on what they believed was the inner rightness of their choice. They did not want to be associated with a company like that, so they made the choice not to be -- because it aligned with their inner knowing of not wanting to be a part of that company. The rightness or wrongness in terms of external consequences is not what makes the choice right or wrong -- for them.

                    • Dylan16807 5 days ago

                      But they left the vast majority of the morality on the table. They even talked to a lawyer to avoid reporting. So in the sense of making the choice that aligns with inner rightness and makes them moral, they still made a bad choice.

                      • 1659447091 5 days ago

                        > making the choice that aligns with inner rightness

                        Again, I am talking about -- them -- not anyone else or what anyone else thinks of it outside of them. I am not talking about "inner rightness" in general, I am talking about "what they believed was the inner rightness of their choice" -- their inner rightness. You seem to be talking about what -- you and/or others -- may believe from an outside perspective. My outside perspective is that they made the choice that did not align with integrity. But that does not mean it was not the right choice for them.

                        And again, they made the right choice, for them -- at that time. How that plays out after is neither here nor there, and your labeling it a "bad" choice for them is akin to saying that they have no real agency over their choices, and that we outside of them are the final say in what is good or bad for that person.

                        • Dylan16807 4 days ago

                          Again, I am talking about things internal to that specific person just as much as you are. Not external anything.

                          You are trying to focus on what they believed in that moment, but I see no reason to use that in an analysis of whether their actions fit their own morals. Sometimes people make mistakes even by their own rules. If we only care about what someone thought right in the heat of the moment, that category of mistake would be impossible, and it's not impossible. Saying that mistakes are possible is not overriding agency.

                          The core of it is in this line "the choice was what they seemed to think was right, for them, at the time. Thus it was the right choice for them". I don't agree with that logic at all. Humans are not good enough at following their own motivations and principles. They are impulsive and bad at analysis. You can't assume that their choices will always be consistent with their personal parameters of right and wrong.

                          Also, saying I think someone made a mistake is not denying agency. Don't be so melodramatic. Nowhere am I claiming to have the final say. I merely have the right to an opinion.

                          • 1659447091 4 days ago

                            I was never talking about if they made a mistake or not. That is after the fact and outside the scope of what I have been saying. I know it matters, but that is not within the scope of my first comment that started this.

                            I took the little information they gave and from that the only true logical conclusion was they made the right choice for them at that moment. Full Stop.

                            You’re the one bringing the extra opinions into the matter and reading into a simple thing far too much. Most of the above I agree with you on outside of this particular thread. It has nothing to do with the very narrow scope of my original comment and attempted clarification.

                            Neither of us can know 100% what was right or wrong for them in that moment, but based on the information given -- A. they no longer felt right about being associated with that place, for reasons they deemed important enough to come to this conclusion, and B. they aligned their actions with that inner knowledge -- it was the right action (choice) for that person. If they changed their mind later, it does not change the immutable facts of that moment. It simply provides a new set of choices and options that is outside the scope of my original comment.

                            • Dylan16807 4 days ago

                              > I was never talking about if they made a mistake or not. That is after the fact and outside the scope of what I have been saying. I know it matters, but that is not within the scope of my first comment that started this.

                              When I say mistake here, I specifically mean "mistake as far as their goal of making the right choice". And I mean that in the moment, using knowledge they have at that time, just like you're defining "right choice". Nothing after the fact nor outside the scope.

                              > I took the little information they gave and from that the only true logical conclusion was they made the right choice for them at that moment. Full Stop.

                              I don't see how they gave enough information to be sure, but more importantly you seemed to make a generic statement that anyone making a choice like that would be making the right choice, and that's what I really object to.

                              > You’re the one bringing the extra opinions

                              I am not! Please stop misreading me! Why won't you listen to what I'm saying about my own argument?

                              > Neither of us can know 100% what was right or wrong for them in that moment,

                              Please explain how "neither of us can know 100%" can be true at the same time as "only true logical conclusion was they made the right choice for them at that moment. Full Stop."

                              > A, B

                              Remember that not reporting the company was also part of the choice they made. The basic description of the choice was to report, quit, both, or neither, and they chose to quit.

                              > If they changed their mind later, it does not change the immutable facts of that moment. It simply provides a new set of choices and options that is outside the scope of my original comment.

                              I'm not talking about whether someone might change their mind later with new information, per se. I'm making the objectively true claim that people don't always think things through, meaning their choice might fail to represent the knowledge and priorities they had at the time.

                              • arminiusreturns 4 days ago

                                Since this thread seems interested, I will reveal the main issue: based on my analysis, reporting to PCI and the state AG would have absolutely destroyed this (very) small business and the business owner, someone who helped me in a very rough time, and thus I felt both options were bad/wrong, but resigning without whistleblowing was the better of the two. I appreciate the analysis and really hope I made the right choice, and if I change my mind, I can still whistleblow on this, which has been a solace in the struggle over this decision.

          • qup 5 days ago

            I wonder if I was part of the database that got emailed.

            • arminiusreturns 5 days ago

              Very unlikely, this is a very small operation with a tiny customer base.

    • mikeodds 5 days ago

      As in... his actual Uber driver? He just handed his laptop over?

      • arminiusreturns 5 days ago

        Yes. The owner is old, and going blind, but refuses to sell or hand over day-to-day ops to someone else, and thus must ask for help on almost everything. I even pulled on my network to find a big processor with a good reputation to buy the company, but after constant delays and excuses for not engaging with them, I realized that to the owner the business is both their "baby" and their social life, neither of which they want to lose.

  • runjake 5 days ago

    YMMV, but it took me 15 minutes start to finish to freeze my credit with the 3 bureaus using the following instructions.

    https://www.nerdwallet.com/article/finance/how-to-freeze-cre...

    • tombrossman 4 days ago

      YMMV indeed.

      Since moving overseas 15 years ago, I have tried numerous times and it simply is not possible. All the forms require a U.S. mailing address to register. Same for online access to your Social Security account.

      There are an estimated 10 million Americans living overseas. Taken together, we are the equivalent of the 11th largest state. All of us are completely blind to what is happening with our credit records and Social Security accounts.

      At this point I think the only way this gets fixed is massive fraud/exploitation by organized crime, so that these organizations finally address the problem.

      • PNewling 4 days ago

        > There are an estimated 10 million Americans living overseas

        Curious how you found this number, have a source?

        This made me pretty curious, but I couldn't find any official numbers. The closest 'official' numbers that I could find are from the Federal Voting Assistance Program [0] and that lists 4.4 million people, but only 2.8 million of those being adults.

        [0] https://www.fvap.gov/info/interactive-data-center/overseas

        • tombrossman 3 days ago

          Strange that someone down-voted you, as this is a fair question.

          > Curious how you found this number, have a source?

          I don't have the source handy but have seen the estimated 10 million figure cited repeatedly. But maybe it is about a million too high, as the US Department of State estimates nine million in this 2020 publication: https://travel.state.gov/content/dam/travel/CA-By-the-Number...

          This Wikipedia page has a lot more info for those interested: https://en.wikipedia.org/wiki/Emigration_from_the_United_Sta...

          Using FVAP stats to me seems problematic, because just like the general population, many US citizens do not bother registering to vote (though they do acknowledge this on the page you linked to and try to control for it).

          State likely has a more accurate estimate from knowing how many passport renewals originate from overseas addresses. I am sure some Americans renew or replace their passports while merely travelling overseas, but I cannot imagine this is a routine practice.

    • wwmiller3 4 days ago

      Unfortunately, that isn’t enough to mitigate identity theft. Someone leveraging the recent National Public Data breach opened a checking and savings account using my identity (no credit checks are performed in doing so), then committed wire fraud using the accounts.

      • jamesmotherway 3 days ago

        Banks use various other services such as Early Warning. Still, it's absurd the lengths we need to go to for any level of assurance against fraud.

    • saagarjha 4 days ago

      Ok, but this is something that shouldn't be my problem. And it's not just that; I have to go unfreeze it if someone needs to run a credit check.

      • runjake 4 days ago

        Right, but you've got to do what's within your control, unless you're planning a Senate campaign and plan to resist significant and lucrative lobbying operations against you.

  • bilekas 5 days ago

    > To me, what's missing from that set of recommendations is some method to increase the liability of companies who mishandle user data.

    As nice as this is on paper, it will never happen; lobbyists exist. Not to be tinfoil hat, but why would any lawmaker slap the hand that feeds them?

    Until there is an independent governing body which is permitted to regulate the tech industry as a whole, it won't happen. Consider the FDA: they decide which drugs and ingredients are allowed, and that's all fine. There could be a regulating body which could determine the risk to people's mental health for example from 'features' of tech companies etc. But getting that body created will require a tragedy, like the one that led to the creation of the FDA in the first place. [1]

    That's just my 2cents.

    1 : https://www.fda.gov/about-fda/fda-history/milestones-us-food....

    • Aerroon 5 days ago

      >There could be a regulating body which could determine the risk to people's mental health for example from 'features' of tech companies etc.

      I think ideas like this are why it's not going to happen.

      Our understanding of mental health is garbage. Psychiatry used to be full of quackery and very well still might be. Treatment for something like depression boils down to "let's try drugs in a random order until one works". It's a field where a coin flip rivals the accuracy of studies. Therefore any regulating body on that will just be political. It will be all about the regulators "doing something" because somebody wrote enough articles (propaganda).

      Problems like this are why people aren't interested in supporting such endeavors.

      • intended 5 days ago

        That is not the treatment for depression.

        This argument reduces mental health to medication, which leaves aside everything from the history of mental health (asylums and witch burnings to today), to leaps in medicine (from lobotomies to SNRIs, bipolar meds, and more), to simply better diagnoses.

        There are certainly tons of people here who have benefited from mental health professionals - overextending the flaws in psych simply to dismiss the idea of a watchdog is several unsupported arguments too far.

        • night862 5 days ago

          I disagree, in brief because the practical side of psychiatry is medication-dominated, mostly because medical research is difficult and expensive.

          There are some non-medication treatments for some psychiatric symptoms, such as those caused by trauma (prominently, EMDR), that some hail as actual cures, and maybe even for depression (I am clearly not a doctor), but in the case of depression I think you'll find it's quite medication-heavy.

          The reason for this is that psychiatrists are medical doctors and psychiatry is a medical field, which is of course bounded by the means of medical science. This is not to say there is some "magic" at work which science could never understand--not at all. It is merely the case that medical doctors are a research-paper-oriented bunch, and most of the medical research which makes it into practice relates either to anatomy or to pharmaceutical interventions.

          Most of the treatments we have are pharmaceutical medications because most of our research dollars have gone into pharmaceutical research.

          I decided to edit this comment to add: In my personal opinion, it is probable that psychiatrists et al., writ large as it were, have already figured out how to cure depression. Only, we cannot really manage to employ it because it isn't a pill, therapy, device, or surgery.

          • throwaway92024 4 days ago

            Ironically, the proposed biological mechanism behind EMDR is totally incorrect, and everyone knows it, outside of some diehard polyvagal theory (also totally bunk) adherents. But the treatments do work for a lot of people, probably because it's just exposure therapy by another name.

      • pwillia7 4 days ago

        Psychiatry is useful in the way statistics is useful for math models we don't fully understand. Statistics lets us get at answers with enough data even though we don't really understand the underlying model at play.

        There's a whole host of 'sciences' that are kind of 2nd tier like this, psychiatry being one of them. Once we understand enough neuroscience, it seems likely to me that psychiatry will get consumed by neuroscience, which will splinter into categories more useful for day-to-day life as it grows (like a psychiatrist).

        Super book on the subject and also talks about the rising bar for individual culpability as we understand more about the brain: https://www.amazon.com/Incognito-Secret-Lives-David-Eagleman...

    • pwillia7 4 days ago

      Civil disobedience is the only way stuff like this happens in America. You're right about the incentives of those in power, but how do you think we got emancipation, women's suffrage, organized labor rights, prohibition, and the end of prohibition?

      • tacocataco 4 days ago

        The prohibition isn't over. The war on drugs is still going strong, even with marijuana legalization in many states.

    • samfundev 4 days ago

      While I also worry about lobbying, we'll have to lobby harder.

  • pkphilip 4 days ago

    But reveal any "classified" information about the govt and you will end up in jail. The severe asymmetry between what a citizen can do and what the govt gives itself the right to do is crazy.

  • bhhaskin 5 days ago

    If your identity gets stolen, you should be able to sue all the companies that had a leak.

  • layer8 5 days ago

    I’m completely sympathetic to making companies more liable for data security. However, until data breaches regularly lead to severe outcomes for subjects whose personal data was leaked, and those outcomes can be causally linked to the breaches in an indisputable manner, it seems unlikely for such legislation to be passed.

    • wepple 5 days ago

      I forgot where I saw this, but the US govt recently announced that they see mass PII theft as a legitimate national security issue.

      It’s not just that you or I will be inconvenienced with a bit more fraud or email spam, but rather that large nation-state adversaries having huge volumes of data on the whole population can be a significant strategic advantage.

      And so far we typically see email+password+SSN as the worst data leaked; I expect attackers will put in more effort to get better data where possible: images, messages, GPS locations, etc.

      • kragen 5 days ago

        yes, privacy is not an individual problem; it's a civil defense problem, and not just when your opponent is a nation-state. we already saw this in 02015 during the daesh capture of mosul; here's the entry from my bookmarks file:

        https://www.facebook.com/dwight.crow/media_set?set=a.1010475... “#Weaponry and morale determine outcomes. The 2nd largest city of Iraq (Mosul) fell when 1k ISIS fighters attacked “60k” Iraqi army. 40k soldiers were artifacts of embezzlement, and of 20k real only 1.5k fought - these mostly the AK47 armed local police. An AK47 loses to a 12.7mm machine gun and armored suicide vehicle bombs. Finally, the attack was personal - soldiers received calls mid-fight threatening relatives by name and address. One army captain did not leave quickly enough and had two teenage sons executed.” #violence #Iraq #daesh

        of course the americans used this kind of personalized approach extensively in afghanistan, and the israelis are using it today in lebanon and gaza, and while it hasn't been as successful as they hoped in gaza, hamas doesn't exactly seem to be winning either. it's an asymmetric weapon which will cripple "developed" countries with their extensive databases of personal information

        why would a politician go to war in the first place if the adversary has the photos and imeis of their spouse, siblings, and children, so they have a good chance of knowing where they are at all times, and the politician can't hope to protect them all from targeted assassination?

        the policy changes needed to defend against this kind of attack are far too extreme to be politically viable. they need to be effective at preventing the mere existence of databases like facebook's social graph and 'the work number', even in the hands of the government. many more digital pearl harbors like the one we saw this week in lebanon will therefore ensue; countries with facebook, credit bureaus, and national identity cards are inevitably defenseless

        imposing liability on companies whose data is stolen is a completely ineffective measure. first, there's no point in punishing people for things they can't prevent; databases are going to get stolen if they're in a computer. second, the damage done even at a personal level can vastly exceed the recoverable assets of the company that accumulated the database. third, if a company's database leaking got your government overthrown by the zetas or daesh, what court are you going to sue the company in? one operated by the new government?

        • treypitt 5 days ago

          Are you saying you think more critical government databases than OPM or security clearance rosters are inevitably going to be breached? I'd like to think the government or corporation can effectively protect some databases at least...

          • kragen 5 days ago

            those are already pretty bad, but i think the really dangerous ones are things like verizon's billing records and customer location history, credit card transaction histories, license plate registrations, credit bureau histories, passport biometrics, enough voice recordings from each person for a deepfake, public twitter postings, etc.

            consider https://en.wikipedia.org/wiki/1943_bombing_of_the_Amsterdam_...:

            > The 1943 bombing of the Amsterdam civil registry office was an attempt by members of the Dutch resistance to destroy the Amsterdam civil registry (bevolkingsregister), in order to prevent the German occupiers from identifying Jews and others marked for persecution, arrest or forced labour. The March 1943 assault was only partially successful, and led to the execution of 12 participants. Nevertheless, the action likely saved many Jews from arrest and deportation to Nazi extermination camps.

            to avoid partisan debate, imagine a neo-nazi group takes over the us, which presumably we can all agree would be very bad. after they took over, how hard would it be for them to find all the jews? not just make a list of them, but physically find them? (much easier than it was in 01943, i'm sure we can agree.) how hard would it be for them to find all the outspoken anti-fascists? where could those anti-fascists hide?

            now, step it up a notch. how hard would it be for them to find all the jews before they take over? it wouldn't be that hard if the databases leak. and if you feel safe because you're not jewish, rest assured that neo-nazis aren't the only groups who are willing to use violence for political ends. someone out there wants you dead simply because of the demographic groups you belong to. the reason you haven't been seeing widespread political violence previously is that it hasn't been a winning strategy

            the situation is changing very fast

          • intended 5 days ago

            Hey, on a long enough timeline the answer will tend towards yes.

            Do note that this isn’t just an American problem.

            Your data is probably on DBs in other nations.

            Plus - the playbook is to target weaker nations and then use them as staging grounds to target stronger nations.

      • dantheman 5 days ago
        • wepple 5 days ago

          Very aware of that. That to me seemed like a targeted attack by a tracked APT group. What I’m referring to above is that the more vanilla attacks (ex: popular online mattress store gets popped) actually have national security implications, despite seeming like just an inconvenience

          • grugq 5 days ago

            > Even minutiae should have a place in our collection, for things of a seemingly trifling nature, when enjoined with others of a more serious cast, may lead to valuable conclusion.

            — George Washington.

    • EasyMark 5 days ago

      They’d need a lot less security if they stopped spying on us and saving all of our most critical ID data, period.

    • Onavo 5 days ago

      Then instead of regulating the companies, make SSNs easily revocable and unique per service. I don't understand why Americans are so opposed to a national ID despite the fact that every KYC service uses SSNs and driver's licenses.

      • mapt 5 days ago

        The expansion of KYC and the hegemonic dominance of our global financial intelligence network are recent infringements on our privacy that would not necessarily pass popular muster if they became well-known.

        Most of our population is still living in a headspace where transactions are effectively private and untraceable, from the cash era, and has not considered all the ways that the end of this system makes them potential prey.

        The fact is that the market is demanding a way to identify you both publicly and privately, and it will use whatever it needs to, including something fragile like telephone-number 2FA, where you have no recourse when something goes wrong. It's already got a covert file on you a mile long, far more detailed than anything the intelligence agencies have bothered putting together. The political manifestation of anti-ID libertarians is wildly off base.

        • monksy 5 days ago

          The concern about organizations and the government feeling that they need to track you is very valid. Why does the government need to make sure your "hand job from a friend" Venmo payment to your friend is "legally legit"? (You can get transactions flagged for this, and the moderator will shame you.)

          Are you correct in what's going on? Yes. Are we placed in this with no option to resist? For the most part yes.

      • candiddevmike 5 days ago

        Because they're the mark of the beast or a step towards fascism or something.

        I don't think it would take much to convert Real IDs into a national ID; they are as close as they can get without "freaking people out".

        • Nevermark 5 days ago

          Emphasizing that the number can be changed would really help there.

          People could even generate their own number (a private key) that they never give out; it would appear differently to each account manager verifying it, and could still be replaced. (A rough sketch of one way this could work is below.)

          When you choose your own number, it's only the Mark of the Beast if you are the Beast! * **

          * 666, 13, 69 and 5318008 expressly prohibited.

          ** Our offices only provide temporary tattoos.
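
          A minimal sketch of that idea, assuming a keyed hash over a user-held secret (the scheme and names are illustrative, not any existing standard, and this only shows how the per-service numbers could differ and be rotated, not how a verifier would check them):

            # Hypothetical sketch: per-service identifiers derived from one
            # user-held secret. Scheme and names are illustrative only.
            import hashlib
            import hmac
            import secrets

            def new_user_secret() -> bytes:
                """Generate a secret the user keeps; it is never given to any service."""
                return secrets.token_bytes(32)

            def service_identifier(user_secret: bytes, service_name: str) -> str:
                """Derive an identifier unique to this (user, service) pair.

                Different services see unrelated values, and rotating
                user_secret replaces every derived identifier at once.
                """
                digest = hmac.new(user_secret, service_name.encode(), hashlib.sha256)
                return digest.hexdigest()[:16]  # truncated for readability

            secret = new_user_secret()
            print(service_identifier(secret, "bank.example"))     # one value
            print(service_identifier(secret, "telecom.example"))  # a different value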

    • deegles 5 days ago

      Nearly everyone's data has been leaked already. Any strong protections would only protect people who haven't been born yet imo.

    • mapt 5 days ago

      "What fraction of the FBI and CIA do the Communists have blackmail material on?"

  • nimbius 5 days ago

    I think the only reason we're seeing this revelation from a federal agency after 20 years is to boost the government's case against TikTok.

  • closeparen 4 days ago

    Shared secrets are criminally negligent security architecture in 2024. We can authenticate identity and authorize payment without giving the relying party a token to leak or abuse. The energy behind this problem is good, but "everyone try harder to protect the shared secrets entrusted to you" would be a tragic waste of it.
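
    For illustration, here is a minimal sketch of the kind of challenge-response flow that avoids shared secrets, assuming Ed25519 signatures via the third-party Python "cryptography" package (the flow and names are illustrative, not any particular standard): the relying party stores only a public key, so a stolen database gives an attacker nothing to replay.

      import os
      from cryptography.exceptions import InvalidSignature
      from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

      # User side: a key pair generated and kept on the user's device.
      user_key = Ed25519PrivateKey.generate()
      registered_public_key = user_key.public_key()  # only this is sent to the service

      # Service side: issue a fresh one-time challenge for this login attempt.
      challenge = os.urandom(32)

      # User side: prove possession of the private key by signing the challenge.
      signature = user_key.sign(challenge)

      # Service side: verify against the stored public key; nothing held by the
      # service is a reusable secret.
      try:
          registered_public_key.verify(signature, challenge)
          print("authenticated")
      except InvalidSignature:
          print("rejected")

    Real deployments of this principle (e.g. passkeys/WebAuthn) layer key management, replay protection, and attestation on top, but the relying party still stores only a public key.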

    • DoctorOetker 4 days ago

      > [...] would be a tragic waste of it.

      The first time would have been a tragedy, from then on it has been farce after farce.

      Imagine a world where companies would have to prove the necessity of storing specific factoids. It would only take one security researcher to prove it unnecessary, invalidating that class of "legitimate interests".

      Today this value judgement happens in human brains, like the (correct) judgement in your comment. If we want to scale it objectively we would have to switch to formal verification. A whole industry of compliance checking could come to exist where a company wants to get its operations screened for compliance issues, so as not to suffer criminal negligence penalties.

      • closeparen 4 days ago

        The problem here is the payments industry (continuing to issue and accept "credit card numbers") and the voters (refusing to authorize a proper national ID). An individual entity that has to conduct business under these circumstances has no real alternative.

        You are not being harmed by the storage or leakage of a few bytes, that's ridiculous. You are being harmed by the financial industry and government's insistence that knowledge of these bytes is sufficient to take your property or hold a debt against you.

  • trinsic2 5 days ago

    Sounds like a bunch of crap the industry is already trying to sell the public, and no, it's not working, and yes, we can do without it.

  • m463 4 days ago

    that, and removing the commercial trackers from this ftc.gov webpage...

    googletagmanager.com googleapis.com fontawesome.com addtoany.com

    sigh

  • alsetmusic 5 days ago

    Regulation is key, but I don’t see it as likely when our society is poisoned by culture war bs. Once we put that behind us (currently unlikely), we can pass sane laws reining in huge corporations.

  • OkeyDokey2 5 days ago

    [flagged]

    • dylan604 5 days ago

      This does nothing for them being able to continue with shadow profiles and inferences about you based on data they gather from others in your social network. It is well beyond "data you provide". Like waaaaay beyond.

  • 2OEH8eoCRo0 5 days ago

    I get a feeling that liability is the missing piece in a lot of these issues. Section 230? Liability. Protection of personal data? Liability. Minors viewing porn? Liability.

    Lack of liability is screwing up the incentive structure.

    • brookst 5 days ago

      I think I agree, but people will have very different views on where liability should fall, and on whether there should be a malicious / negligent / no-fault model.

      Section 230? Is it the platform or the originating user that's liable?

      Protection of personal data? Is there a standard of care beyond which liability lapses (e.g. a nation state supply chain attack exfiltrates encrypted data and keys are broken due to novel quantum attack)?

      Minors viewing porn? Is it the parents, the ISP, the distributor, or the creator that's liable?

      I'm not here to argue specific answers, just saying that everyone will agree liability would fix this, and few will agree on who should be liable for what.

      • TheOtherHobbes 5 days ago

        It's not a solvable problem. Like most tech problems it's political, not technical. There is no way to balance the competing demands of privacy, security, legality, and corporate overreach.

        It might be solvable with some kind of ID escrow, where an independent international agency managed ID as a not-for-profit service. Users would have a unique biometrically-tagged ID, ID confirmation would be handled by the agency, ID and user behaviour tracking would be disallowed by default and only allowed under strictly monitored conditions, and law enforcement requests would go through strict vetting.

        It's not hard to see why that will never happen in today's world.

        • malfist 5 days ago

          > It's not a solvable problem

          Lawnmower manufacturers said the same thing about making safe lawnmowers. Until government regulations forced them to.

          • brookst 4 days ago

            Well, perpetual motion is also considered unsolvable. Perhaps the right regulation would make it happen?

            Or... maybe that line of reasoning isn't super strong.

            • malfist 4 days ago

              There's a big difference between breaking the physical laws of the universe and Big Tech not wanting to spend money on moderators.

              • brookst 4 days ago

                "Big" is pretty vague there.

                But my point was that you asserted that regulations can overcome claims of impossibility. I was just illustrating that that is not always true, and therefore we can't assume it is true here.

          • ToucanLoucan 5 days ago
            • Aerroon 5 days ago

              Well, something to consider is that part of why everything is so much more expensive these days is that a lot of the solutions to those problems add costs. That cost needs to be absorbed by the price.

              One of the reasons it's so expensive to build a house is safety regulations. They exist for a reason, but they nevertheless add a substantial cost to building a house. If you had mandated such costs on people living in 1870, then a lot fewer people could've afforded a house.

              • tacocataco 4 days ago

                We should stop begging the for-profit house-building industry to make affordable housing. Just use the Army Corps of Engineers and build the mega-cities from Judge Dredd.

              • malfist 4 days ago

                Building codes are written in blood.

                • ToucanLoucan 4 days ago

                  In fact, you can say that about most regulations of any sort. This seems to hit people's brains the same way anti-vaxxers are so cavalier about things like measles: we effectively eradicated it via vaccination (though it's now making a comeback thanks to anti-vaxxers) and people have forgotten how horrible it is. In the same way, people forget that, sure, houses used to be cheaper to build. And they also burned down a hell of a lot more, or were constructed poorly and not to code, or were built too close together, etc. etc. etc.

                  Just about every regulation that exists for every product or thing or way of doing something was written not just after a death; that's usually not enough. It takes a substantial number of deaths that can be directly linked to the thing the regulation covers, and only then does it become law.

                • Aerroon 4 days ago

                  Yes, and?

                  It still adds an enormous amount of cost. Despite that, buildings (and even bridges) still collapse. What is often not asked is how many people lack housing because those building codes made it unaffordable for them.

      • StanislavPetrov 5 days ago

        >Protection of personal data? Is there a standard of care beyond which liability lapses (e.g. a nation state supply chain attack exfiltrates encrypted data and keys are broken due to novel quantum attack)?

        There absolutely should be, especially for personal data collected and stored without the express written consent of those being surveilled. They should have to get people to sign off on the risks of having their personal data collected and stored, be legally prevented from collecting and storing the personal data of people who haven't consented and/or be liable for any leaking or unlawful sharing/selling of this data.

  • zeroonetwothree 5 days ago

    If you aren’t directly harmed yet, what liability would they have? I imagine that if your identity is stolen and it can be tied to a breach, then they would already be liable.

    • kibwen 5 days ago

      The fact that my data can be stolen in the first place is already outrageous, because I neither consented to allowing these companies to have my data, nor benefit from them having my data.

      It's like if you go to an AirBNB and the owner sneaks in at night and takes photos of you sleeping naked and keeps those photos in a folder on his bookshelf. Would you be okay with that? If you're not directly harmed, what liability would they have?

      Personal data should be radioactive. Any company retaining it better have a damn good reason, and if not then their company should be burned to the ground and the owners clapped in irons. And before anyone asks, "personalized advertisements" is not a good reason.

      • ryandrake 5 days ago

        That's the big problem with relying on tort law to curb this kind of bad corporate behavior: The plaintiff has to show actual injury or harm. This kind of bad behavior should be criminal, and the state should be going after companies.

      • JumpCrisscross 5 days ago

        > before anyone asks, "personalized advertisements" is not a good reason

        The good reason is growth. Our AI sector is based, in large part, on the fruits of these data. Maybe it's all baloney, I don't know. But those are jobs, investment, and taxes that e.g. Europe has skipped out on and that America and China are capitalising on.

        My point, by the way, isn't pro surveillance. I enjoy my privacy. But blanket labelling personal data as radioactive doesn't seem to have any benefit to it outside emotional comfort. Instead, we need to do a better job of specifying which data are harmful to accumulate and why. SSNs are obviously not an issue. Data that can be used to target e.g. election misinformation are.

        • thfuran 5 days ago

          So you're saying it's all vastly valuable and that's why it is right that it is taken without consent or compensation?

          • JumpCrisscross 5 days ago

            > it's all vastly valuable and that's why it is right that it is taken without consent or compensation?

            No, I'm saying it's a commons with a benefit to utilisation. A lot of discussions around data involve zealots on both sides. (One claiming it's their god-given right to harvest everyone's personal information. The other acting like it's the crime of the century for their email address to be leaked.)

        • rockskon 4 days ago

          See - your problem is you think you're talking to politicians, Facebook-era journalists, or disinfo activists.

          Most people here have thought about the topic of privacy in the modern era far more than some 70-year-old politician has.

      • pc86 5 days ago

        I mean, it's pretty clear that you are directly harmed if someone takes naked photos of you without your knowledge or consent and then keeps them. It's not a good analogy, so if we want to convince people like the GP of the points you're making, you need to make a good case, because that is not how the law is currently structured. "I don't like ads" is not a good reason, and comments like this that are seething with rage and hyperbole don't convince anyone of anything.

        • drawkward 5 days ago

          What is the harm? It is not obvious to me, if the victim is unaware...unless you are alleging simply that there is some ill-defined right to privacy. But if that is so, why does it apply to my crotch and not my personal data?

          • simoncion 5 days ago

            These are exactly my questions. If I never, ever know about those pictures and never, ever have my life affected by those pictures, what is the actual harm to me?

            If the answer to them ends up being "Well, it's illegal to take non-consensual nudie pictures.", then my follow-up question is "So, why isn't the failure to protect my personal information also illegal?".

            To be perfectly clear, I do believe that the scenario kibwen describes SHOULD be illegal. But I ALSO believe that it should be SUPER illegal for a company to fail to secure data that it has on me. Regardless of whether they are retaining that information because there is literally no way they could provide me with the service I'm paying them for without it, or if they're only retaining that information in the hopes of making a few pennies off of it by selling it to data brokers or whoever, they should have a VERY SERIOUS legal obligation to keep that information safe and secure.

            • lcnPylGDnU4H9OF 5 days ago

              > to fail to secure data that it has on me

              Just want to point out that the company is usually also doing what it can to get other information about you without your consent based on other information it has about you. It's a lot closer to the "taking non-consensual nudie pictures" than "fail to secure data" makes it sound.

        • JumpCrisscross 5 days ago

          > it's pretty clear that you are directly harmed if someone takes naked photos of you without your knowledge or consent and then keeps them

          Sure. In those cases, there are damages and that creates liability. I'm not sure what damages I've ever faced from any leak of e.g. my SSN.

          • pixl97 5 days ago

            I mean, most people won't until the day they find out there's a house in Idaho under their name (and yes, I've seen just this happen).

            The problem here is that, because of all these little data leaks, you as an individual now bear a cost ensuring that others out there are not using your identity, and if it happens you have to clean up the mess by pleading it wasn't you in the first place.

          • pc86 4 days ago

            The real kicker is trying to prove which leak your SSN came from. If your SSN gets leaked by 3 different companies, and 6 months later someone uses your identity to commit some crime, you can't have each company share 1/3 of the blame.

            • BlueTemplar 4 days ago

              I don't see why they couldn't all share the full blame?

              It's not like there's a "conservation of blame" law.

      • ranger_danger 5 days ago

        >I neither consented to allowing these companies to have my data, nor benefit from them having my data.

        I think both of those are debatable.

      • lesuorac 5 days ago

        I don't think thats a proper parallel.

        I think a better example would be You (AirBnB Host) rent a house to Person and Person loses the house key. Later on (perhaps many years later), You are robbed. Does Person have liability for the robbery?

        Of course it also gets really muddy because you'll have been renting the house out for those years, and during that time many people will have lost keys. So does liability get divided? Is it the most recent lost key?

        Personally, I think it should just be some statutory damages of probably a very small amount per piece of data.

        • pixl97 5 days ago

          The particular problem comes in because the amount of data lost tends to be massive when these breaches occur.

          It's kind of like the idea of robbing a minute from someone's life. It's not very much to an individual, but across large populations it's a massive theft.

          • lesuorac 5 days ago

            Sure, and if you pay a statutory fine times 10 million, then it becomes a big deal, and therefore companies would be incentivized to protect it better the larger they get.

            Right now they probably get some near free rate to offer you credit monitoring and dgaf.

        • polygamous_bat 5 days ago

          > I think a better example would be You (AirBnB Host) rent a house to Person and Person loses the house key.

          This is not a direct analogue; a closer analogy would be the guest creating a copy of the key (why?) without my direct consent (signing a 2138-page "user agreement" doesn't count) and, at some later point when I am no longer renting to them, losing the key.

          • lesuorac 5 days ago

            I'm still much more interested in the answer to who is liable for the robbery.

            Just the robber? Or are any of the key-copiers (instead of key-losers, whatever) also liable?

            • Dylan16807 5 days ago

              I don't really care about the answer to that specific question, where there's only one household.

              What I will say is that the guy who has copies of 20000 people's keys should get in trouble if he loses his hoard.

        • 8note 5 days ago

          This version loses multiple parts that are important:

          1. I have no control over what was stored
          2. I have no control over where the storage is

          The liability in this case is with the homeowner/host, as you have, and had, full ability to change out the locks.

          To make it more similar, I think you'd need one of the guests to have taken some amount of art off the wall and brought it to a storage unit you don't have access to, and then to have the art stolen from the storage unit later.

          It's not as good as the naked pictures example because what's been taken is copies of something sensitive, not the whole thing

        • mistrial9 4 days ago

          > You (AirBnB Host) rent a house to Person

          This is an outrageously incorrect analogy... you ASSUME property ownership in the first statement. Where are personal legal records analogous to owned property? By whom?

    • drawkward 5 days ago

      Go ahead, post your phone number here. It's not directly harmful.

    • halJordan 5 days ago

      This is the traditional way of thinking, and a good question, but it is not the only way.

      An able-bodied person can fully make complaints against any business that fails its Americans with Disabilities Act obligations. In fact, these complaints by able-bodied well-doers are the de facto enforcement mechanism, even though these people can never suffer damage from that failure.

      The answer is simply to legislate the liability into existence.

    • idle_zealot 5 days ago

      That's the whole problem with "liability", isn't it? If the harms you do are diffuse enough then nobody can sue you!

    • squeaky-clean 5 days ago

      The same way you can get ticketed for speeding in your car despite not actually hitting anyone or anything.

    • drawkward 5 days ago

      Surveillance apologist.

    • bunderbunder 5 days ago

      This is exactly why thinking of it in terms of individual cases of actual harm, as Americans have been conditioned to do by default, is precisely the wrong way to think about it. We're all familiar with the phrase "an ounce of prevention is worth a pound of cure", right?

      It's better to think of it in terms of prevention. This fits into a category of things where we know they create a disproportionate risk of harm, and we therefore decide that the behavior just shouldn't be allowed in the first place. This is why there are building codes that don't allow certain ways of doing the plumbing that tend to lead to increased risk of raw sewage flowing into living spaces. The point isn't to punish people for getting poop water all over someone's nice clean carpet; the point is to keep the poop water from soaking the carpet in the first place.

      • supertrope 5 days ago

        Safety rules are written in blood. After a disaster there’s a push to regulate. After enough years we only see the costs of the rules and not the prevented injuries and damage. The safety regulations are then considered annoying and burdensome to businesses. Rules are repealed or left unenforced. There is another disaster…

        • bunderbunder 5 days ago

          Tangentially, there was an internet kerfuffle about someone getting in trouble for having flower planters hanging out the window of their Manhattan high rise apartment a while back, and people's responses really struck me.

          People from less dense areas generally saw this as draconian nanny state absurdity. People who had spent time living in dense urban areas with high rise residential buildings, on the other hand, were more likely to think, "Yeah, duh, this rule makes perfect sense."

          Similarly, I've noticed that my fellow data scientists are MUCH less likely to have social media accounts. I'd like to think it's because we are more likely to understand the kinds of harm that are possible with this kind of data collection, and just how irreparable that harm can be.

          Perhaps Americans are less likely to support Europe-style privacy rules than Europeans are because Americans are less likely than Europeans to know people who saw first-hand some of what was happening in Europe in the 20th century.

vundercind 5 days ago

Behind the ball by 15 years to start taking this seriously and beginning to think about pushing back, but better late than never.

Next please rein in the CRAs.

  • flycaliguy 5 days ago

    I think Snowden was bang on when in 2013 he warned us of a last chance to fight for some basic digital privacy rights. I think there was a cultural window there which has now closed.

    • orthecreedence 5 days ago

      Snowden pointed and everyone looked at his finger. It was a huge shame, but a cultural sign that the US is descending into a surveillance hell hole and people are ok with that. As someone who was (and still is) vehemently against PRISM and NSLs and all that, it was hard to come to terms with. I'm going to keep building things that circumvent the "empire" and hope people start caring eventually.

      • digging 5 days ago

        > and people are ok with that

        I've seen no evidence of this. People mostly either don't understand it or feel powerless against it.

        • dylan604 5 days ago

          There's also a vast number of people who were just too young to be aware of Snowden's revelations. These people are now primarily on TikTok and whatnot, and I doubt there's much in those feeds to bring it to light while they directly feed the beast of data hoarding.

        • davisr 5 days ago

          > I've seen no evidence of this

          Over 99% of Americans point a camera at themselves while they take a shit.

          • lcnPylGDnU4H9OF 5 days ago

            And I'd bet over 99% of those people have never once considered that said camera could even be capable of saving any data without them operating it.

            • davisr 5 days ago

              Very doubtful they've not considered it. When I go to coffee shops, I see maybe a quarter-to-half the laptops have a shade over the webcam. But when I see people using their phones, I've never once seen them use a shade, piece of tape, or post-it note.

              They use the front-facing camera of their phone so often that the recurring inconvenience of removing a shade outweighs, for them, the long-term risk of malware snapping an exposing photo.

              • digging 5 days ago

                But do you think they're taking a measured inventory of the possible consequences, both personal and societal, and saying, "No, I don't value that" ?

                Extremely few decisions that people make are deeply calculated with cold logic. Most decisions are primarily unconscious, automatic, and emotional.

                Example: A person hears it's good to have a webcam cover, so they get one. Nobody mentions doing it for their phone, so they never even think about it. Then someday a friend does mention it, but that would be an inconvenient change, so the person's gut puts up resistance against considering it too strongly. They give in to their emotional response, instead of doing the hard work of changing their emotions based on the knowledge they have.

                At no point in the above scenario would the person state "I don't think mass surveillance is a bad thing." For me, that's what I mean when I say people "aren't ok with it."

                If one's definition of people being "ok with mass surveillance" just means they tolerate it, that they don't sufficiently resist it (and what level of resistance is sufficient? For a person with a webcam cover but no phone cam cover? Does adding a phone cam cover mean they've declared their opposition to mass surveillance?), then how can you say people aren't okay with literally everything evil or wrong? Most people just won't summon enough activation energy to fight any given injustice around them, no matter how egregious it is. That's not a reflection of their morals and values, it's a reflection of how fucking tired we all are.

                I would challenge you to offer up in detail how strongly you have worked to resist mass surveillance in your life. You're logged in and posting on HN, so my guess is, you haven't worked hard enough at it according to someone's metric. Do you have a cover on your phone camera? Just the front one or both? Do you have a cover on the microphones? Do you let others add your number in their contacts or do you refuse to ever give out your real phone number?

                • davisr 4 days ago

                  I don't even have a cell phone! I have a Libreboot-ed X200 running a 100%-libre distribution of GNU/Linux. My MAC address is randomized every hour. I pulled the OnStar fuse from my car 10 years ago. I run my own email server, and use E2E encryption when my recipient knows how. I use Tor and VPNs. My home phone number is public and posted to my website, and I can't control who has it anyway.
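
                  Hourly MAC randomization like the above can be scripted fairly simply. A minimal sketch, assuming a Linux machine with the `ip` tool, root privileges, and a wireless interface named wlan0 (the interface name and the plain hourly loop are illustrative assumptions; this is just one way to do it, not necessarily how the commenter does it, and NetworkManager's built-in per-connection randomization or a systemd timer would be more typical):

                    import random
                    import subprocess
                    import time

                    IFACE = "wlan0"  # assumption: replace with your actual interface name

                    def random_mac() -> str:
                        # Locally administered (0x02 bit set), unicast (0x01 bit clear) first octet.
                        octets = [0x02] + [random.randint(0x00, 0xFF) for _ in range(5)]
                        return ":".join(f"{o:02x}" for o in octets)

                    def set_mac(iface: str, mac: str) -> None:
                        # Requires root; briefly takes the interface down to change its address.
                        subprocess.run(["ip", "link", "set", "dev", iface, "down"], check=True)
                        subprocess.run(["ip", "link", "set", "dev", iface, "address", mac], check=True)
                        subprocess.run(["ip", "link", "set", "dev", iface, "up"], check=True)

                    if __name__ == "__main__":
                        while True:
                            set_mac(IFACE, random_mac())
                            time.sleep(3600)  # once an hour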

                  • tacocataco 4 days ago

                    If people had a better work/life balance, perhaps more would have the mental bandwidth to jump through an Olympic gold winning amount of hoops like you did.

                    The rest would probably use the extra free time to raise their kids/be with family.

                    • davisr 4 days ago

                      It might seem like the Olympics when it's all put together like that, but it's really just one thing at a time, and none of them required any kind of real effort. They were just choices. When it was time to buy a new laptop, I picked the one that had my interests. When it was time to upgrade my OS, I chose the one that respected me. When my cell phone contract was due to be renewed, I said, "no thank you," and bought a VoIP base station instead. I understood how my car, and reportage of my driving habits, were connected to the outside world and I severed that connection by pulling a simple fuse.

                      If people only did one change per year, even that would be enough to change the winds pushing our mass surveillance.

              • chiefalchemist 5 days ago

                The cover over the webcam might not be for security per se. It could be they don't want anyone at work - or home? - to accidentally see where they are. If you cover the camera you don't have to worry about any such accidents.

                My gut says that for most people that's the reason.

            • timeon 4 days ago

              When I explain privacy issues with some service or product to anyone, the answer usually is 'yeah but it is more convenient so...'

          • verisimi 4 days ago

            Best comment I've seen on hn, maybe ever. Perfect refutation, comic, on point.

          • doctorpangloss 5 days ago

            Snowden couldn't convince people that the privacy he was talking about meant a limit on government power. Not sensitive data. And honestly, nobody cares about anyone taking a shit.

            You can advocate for limiting govt. power ("LGP") without leaking any NSA docs. I don't think a single story about "LGP" changed due to the leaks. Everyone knows the government can do a lot of violence to you. So it's very hard.

            If you're a high drama personality, yeah you can conflate all these nuanced issues. You can make privacy mean whatever you want.

        • immibis 5 days ago

          I've seen no evidence people aren't ok with that. Most people around me didn't care about the Snowden revelations. It was only tech people who tightened up security.

          • orthecreedence 5 days ago

            This is my experience as well. I talked to a LOT of people after the Snowden debacle (techies and otherwise) and the general attitude was "so what? they aren't using the information for anything bad!" or "I have nothing to hide!" (in this thread, for instance: https://news.ycombinator.com/item?id=41594775)

            I think people don't really understand what an enormous sleeping dragon the entire thing is.

            • digging 5 days ago

              > I think people don't really understand what an enormous sleeping dragon the entire thing is.

              Isn't that what I said? Mostly we're debating semantics. My deeper point is that it's counterproductive and borderline misanthropic to argue "People just don't care about evil being done!" whereas the argument that "People seriously have no idea yet what they're 'agreeing' to" opens the door to actual solutions, for one inclined to work on them.

              • neom 5 days ago

                But what is the "enormous sleeping dragon" my mom, dad, little sister, and teenage cousins need to understand? And even once it's patently clear, is it certain not to just result in another "...and???"

        • zo1 4 days ago

          Not true at all. I'm a tech person, understand it all and the implications, and I'm not being a doomer about it.

          The more people faff about and fight for privacy as a misguided absolute, the fewer discussions we can have about ethical, safe and managed uses of surveillance. Privacy advocates have this weird habit of thinking they speak for everyone, which they don't.

        • ajsnigrutin 5 days ago

          But won't you think of the children!

          (EU is trying to implement chat control again...)

          We need more real-world analogies... "see, this is like having a microphone recording everything you say in this bar"... "see, this is like someone ID-ing you in front of every store and recording which stores you've visited, and then following you inside to see what products you look at. See, this is like someone looking at your clothes and then pasting higher price tags on products. ..."

        • 6stringronin a day ago

          Maybe you missed this article where many of the replies say they are fine with facial scanning at airports. Digital rights removal is the slow boiling frog.

          "Federal civil rights watchdog sounds alarm over Feds use of facial recognition"

          https://news.ycombinator.com/item?id=41603698

          The mentality of people in tech has drastically shifted to "oh well..."

        • tommiegannert 5 days ago

          >> and people are ok with that

          > I've seen no evidence of this. People mostly either don't understand it for feel powerless against it.

          Isn't feeling powerless and being ok with it ultimately the same thing: complacency?

      • Clubber 5 days ago

        >and people are ok with that.

        All the propagandists said he was a Russian asset, as if even if that were true, it somehow negated the fact that we were now living under a surveillance state.

        >Snowden pointed and everyone looked at his finger.

        This is a great way of putting it.

        • lesuorac 5 days ago

          > it somehow negated the fact that we were now living under a surveillance state.

          There have long been surveillance programs, and also numerous laws outlining the responsibilities of telecom providers to enable wiretapping.

          There's really nothing new from Snowden besides the names of a bunch of people to go kill cause they're spies.

          FISA [1] isn't a private law either.

          https://en.wikipedia.org/wiki/COINTELPRO

          https://en.wikipedia.org/wiki/Mass_surveillance_in_the_Unite...

          Note: 2006 (Klein) predates 2013 (Snowden)

          https://en.wikipedia.org/wiki/Room_641A

          [1]: https://en.wikipedia.org/wiki/Foreign_Intelligence_Surveilla...

          • roenxi 4 days ago

            There was something substantially new after Snowden though - prior to his leaks, if you pointed out what the US government was likely up to, people would laugh at the idea and ask for more sources. Afterwards they tended to just accept it.

            There was a big cultural shift from the default assumption in polite company being "They're spying on Middle Easterners" to "they're spying on everyone, everywhere" when talking about US spying.

          • simoncion 5 days ago

            > There's long been surveillance programs and also numerous laws outlining the responsibilities of telecom provides to enable wire tapping.

            Laws which the telecoms were knowingly and willfully breaking for years.

            You do remember that Congress gave them retroactive immunity? [0][1] You do know that this was only granted because people COULD sue (and were suing) them because of the information made public by Snowden and others?

            [0] <https://www.aclu.org/news/national-security/retroactive-tele...>

            [1] See Title II of this bill <https://www.congress.gov/bill/110th-congress/house-bill/6304>

          • Clubber 5 days ago

            >There's really nothing new from Snowden besides the names of a bunch of people to go kill cause they're spies.

            https://en.wikipedia.org/wiki/2010s_global_surveillance_disc...

            • lesuorac 5 days ago

              You are dense. Imagine a government authorizes 10B for a bridge and then in 5 years a bridge shows up.

              Now instead, imagine in 1978 [1] a government authorizes a "United States federal law that establishes procedures for the surveillance and collection of foreign intelligence on domestic soil", in 2008 [2] amends it so it's no longer a big deal whether the target is foreign or not, and then 5 years later it turns out they're doing just that.

              These bills are not secret. Were not secret. Have never been secret. It's not my fault you didn't read them but it doesn't make Snowden novel.

              [1]: https://en.wikipedia.org/wiki/Foreign_Intelligence_Surveilla...

              [2]: https://en.wikipedia.org/wiki/Foreign_Intelligence_Surveilla...

              • Clubber 5 days ago

                >You are dense.

                Well, maybe you're one of those propagandists. If you can't attack the idea, attack the person, right?

                Hand waves, nothing new to see here, carry on.

                The bills aren't what was exposed; it was more the techniques and scope. Like PRISM and XKeyScore and circumventing laws by sharing intelligence on US citizens with allies who aren't restricted by US laws. Spying on allied governments, etc. You know, that stuff.

                You should really click on the link.

                https://en.wikipedia.org/wiki/2010s_global_surveillance_disc...

                • dialup_sounds 5 days ago

                  https://en.wikipedia.org/wiki/Project_SHAMROCK

                  https://en.wikipedia.org/wiki/Martin_and_Mitchell_defection

                  https://en.wikipedia.org/wiki/Church_Committee

                  https://en.wikipedia.org/wiki/ECHELON

                  Et cetera. These aren't new issues. The obsession with Snowden as a messianic figure is unhelpful in contextualizing the information.

                  • Clubber 4 days ago

                    >The obsession with Snowden as a messianic figure is unhelpful in contextualizing the information.

                    Damn the gall, give it a rest. Again, methods and scope. Which one of those exposes PRISM, XKeyScore and the NSA infiltrating Google servers? Which one of those exposes the companies that willingly "integrated" with the NSA?

                    Which of those exposes US government spying on allied governments, recording private conversations, etc?

                    Saying Snowden didn't reveal anything is a silly hill to die on. What is your motive in minimizing the exposure? Do you work for one of the companies implicated in participating in PRISM or something?

                    Microsoft joined PRISM on 9/11/2007 (fitting)

                    Yahoo joined PRISM on 3/12/08

                    Google joined PRISM on 1/4/09

                    Facebook joined PRISM on 6/3/09

                    YouTube joined PRISM on 9/24/10

                    Skype joined PRISM on 2/6/11

                    AOL joined PRISM on 3/11/11

                    Apple, the last holdout on the list, joined PRISM 10/12 (after Jobs died).

                    OP, this is why it seems nobody cares: there are plenty of people trying to sway public opinion on the matter by minimizing it. Nobody wants to believe their government would do things like this, so when someone offers that, "hey it's not so bad," they want to believe it. We've always been at war with Eastasia.

                    • dialup_sounds 4 days ago

                      Weird take. You cast all these aspersions on me and say I'm trying to "sway public opinion by minimizing" when I'm literally telling one person to learn more about these subjects so they'll be less ignorant.

                      • Clubber 9 hours ago

                        >These aren't new issues. The obsession with Snowden as a messianic figure

                        I'll give you the benefit of the doubt, maybe we weren't communicating well, but I felt you were certainly minimizing it by claiming what Snowden revealed weren't "new issues," which they certainly were, and that anyone who thinks differently has an "obsession with Snowden as a messianic figure," which is an attempt to discredit. Was that your objective?

  • devonbleak 5 days ago

    It makes me irrationally angry that I suddenly started getting spam emails from Experian. Like motherfucker I never consented for you to have my data, then you leak it all, now you're sending me unsolicited junk email? It's just such bullshit that I'm literally forced to have a relationship with these companies to freeze my credit or else I'm at the mercy of whoever they decide to release my information to without my authorization.

    • nicholasjarnold 5 days ago

      Yep. It sucks. Zero consequences of any import for those companies as far as I'm aware too. Tiny fines end up being "cost of doing business". Then they get to externalize their failures onto us by using terms like "Identity Theft", which indicates something was stolen from ME and is now MY problem.

      In actuality some not-well-maintained systems owned by <corp> were hacked or exposed or someone perpetrated fraud on a financial institution and happened to use information that identifies me. It's really backwards.

      PSA: If you haven't already, go freeze your credit at Experian, TransUnion, Equifax and Innovis. It will make the perpetration of this type of fraud much more difficult for adversaries.

      • singleshot_ 5 days ago

        PSA pro tip: they will try to steer you toward “locking” your account. Don’t fall for it. Freeze your account.

        • Hugsun 4 days ago

          Do you know why they do this?

          • singleshot_ 4 days ago

            No. I have some guesses. A credit reporting company can probably sell access to their data for good money if the account is locked but not if it is frozen?

            Put otherwise if a bank asks experian to look at my credit report and experian tells them to take a hike because my account is frozen, that’s not worth much money to the bank. But that’s the only credit account configuration that has any value to me, so I’ll insist on it.

            I think “freezing” and the dynamics thereof are established by federal law, while “locked” is a thing the companies made up so they had an account setting that they could provide that would give the illusion of security, while maintaining the ability to sell information associated with the account.

            In other words: evil people do evil things when we aren’t paying sufficient attention. It’s our job to hold them accountable.

    • twoodfin 5 days ago

      My pet solution has been to make the credit reporters liable for transmitting false information to the CRAs.

      Chase tells Experian I opened a new line of credit with them, but it later is demonstrated that it was a scammer with my SSN? Congratulations, $5,000 fine.

      Of course this all gets priced in to the cost and availability of consumer credit. Good! Now the lenders have an incentive to drive those costs down (cheaper, better identity verification) to compete.

      • lotsofpulp 5 days ago

        The solution is much simpler. Put all of the consequences of being defrauded by a borrower onto the lender.

        If a lender wants to be repaid, then they need to show the borrower all the evidence they have for proof that the borrower entered into the contract.

        If all a lender has is the fact that a 9 digit number, date of birth, name, and address were entered online, then the borrower simply has to say “I did not enter that information”, and the lender can go pound sand.

        Guarantee all the lenders will tighten up their operations very quickly, and consequently so will the accuracy of the loans that appear on one’s credit report.

        • ryandrake 5 days ago

          Right. This is a problem between the lenders and the people who stole from the lenders. The person whose name/number was used shouldn't even be part of the transaction or part of the problem.

          They call it "Identity Theft" instead of what it should be called: Bank fraud. The term "Identity Theft" 1. needlessly pulls an otherwise uninvolved person into the mix, suddenly making it their problem too, and 2. downplays the bank's negligence.

          If someone uses my name to take out a loan, and the bank stupidly lets them, this shouldn't even remotely be my problem. I shouldn't even have to know about it. This is the bank's problem from their own stupidity.

        • twoodfin 5 days ago

          Lenders hand over bad loans to collection agencies (“accept the consequences”) all the time. Cost of doing business. That an innocent person’s credit is destroyed is just collateral damage from their perspective.

        • sib 5 days ago

          "Put all of the consequences of being defrauded by a borrower onto the lender" - that seems a bit strange.

          Imagine saying "put all of the consequences of getting robbed onto the bank, not the robber"

          • lotsofpulp 5 days ago

            Who bears the consequences of their home being robbed? Or mugged on the street? Or a contractor taking payment for services and then disappearing?

            Why are we subsidizing lenders by putting this ridiculous burden on people who have nothing to do with the lender’s business?

            The lender can pay to appropriately verify their borrower’s identity, or go to court and sue for damages like everyone else has to.

      • trinsic2 5 days ago

        Can you describe how you make them liable in this arrangement?

        • twoodfin 5 days ago

          You can challenge entries your credit report today. Win the challenge, whoever reported the entry is liable to the Feds. Maybe add a modest bounty for the injured taxpayer.

    • rkagerer 5 days ago

      That's not an irrational reaction.

  • newsclues 5 days ago

    The long term consequences of 9/11.

disambiguation 5 days ago

Ever stop and think it's funny that Meta, Google, etc. are worth billions because they figured out how to legally fill a database with information about you? In any other time in history some might call it spying, but well, they figured out how to do it legally, and it's worth billions. Meanwhile from a technical standpoint, remotely logging your data is a trivial thing, with consent of course. It's like we made this imaginary wall (law) and spent billions building a road around that wall, and that's equivalent to economic prosperity. Similar idea with streaming services versus file sharing.

  • weixiyen 5 days ago

    they are valuable b/c they built something billions of people use. I suspect the revenue loss from every one of these FTC recommendations being implemented would not have a material impact on either of the businesses you mentioned

  • cscurmudgeon 5 days ago

    Spying is done without consent.

    Why do people keep saying social media is just a database?

    • Llamamoe 5 days ago

      The consent you give to web services isn't much better than if an electrician said "hey, can you tap this button to give me consent to work on your house?" and then installed undetectable hidden microphones inside every surface in your apartment.

      All of the UX of online consent forms exists to misinform, trick, and get users used to agreeing to sell their digital soul.

    • WickyNilliams 4 days ago

      Facebook will create shadow profiles of you even if you've never signed up, never created a profile. They'll take your number from other people's contacts via WhatsApp. They'll do facial scans of you on photos other people upload.

      Even if you've never visited their site.

      Where's the consent there?

  • pwillia7 4 days ago

    Wait --- the emperor isn't wearing any clothes?!?!?

    • MetaWhirledPeas 4 days ago

      I wouldn't say that. If it meets our needs/wants and we are willing to pay for it, that represents value, no matter how silly it sounds. People pay money for plenty of nonsense: cosmetics, junk food, DLC. The fact that it's artificially derived (laws begetting paid workarounds) doesn't change the value proposition. For data gluttons the investment in data acquisition pays off. There are plenty of people paying money to work around laws, especially tax laws. Tax decisions have a scale from wisdom (an individual making prudent financial decisions) to deviance (a company playing shell games with businesses and bank accounts) but the line can be blurry.

      I do think the situation is dystopian though. Sharing data without explicit case-by-case consent should be disallowed.

cynan123 5 days ago

Lina Khan has been on a tear. She actually seems to care about online human rights.

  • montag 5 days ago

    I think this effort is positive, but a bit misdirected. Think data breach liability. Facebook and YouTube are willing and capable defenders of sensitive customer data. Watch the AshleyMadison documentary, by contrast: arrogant disregard for customer privacy and almost no culpability. These smaller, irresponsible players are where consumers are most vulnerable.

    • gopkarthik 5 days ago

      Agreed. Mid-sized/smaller players are the places which have very poor data & security practices. Especially when they require PII as part of their operations.

      Meta, Google are much better stewards of their users' data. One misconception I see is claiming these companies sell user data. I'd instead say that they sell user attention.

      • justapassenger 5 days ago

        They don’t sell user data for a very simple reason - it’s a crappy business, as you can charge much, much more with recurring sales of heavily obfuscated access to the data than with a one-off sale of said data.

        When you think about it - incentives are kind of aligned with user privacy (kind of, as there’s much more to the story than this simplistic point of view)

        • soylentcola 4 days ago

          The way I've always seen it is that they'd never directly sell user data because that is their most valuable and prized asset. It's what allows them to sell targeted access to users for so much money.

          • mistrial9 4 days ago

            sort-of agree... the accuracy, timeliness and completeness of any real-world data will change... many comments assume some kind of perfect knowledge, which is never the case. Selling snapshots of current profile understanding is not only better business, but can also evolve with the enterprise over years; meanwhile selling raw user profiles as a whole is more like "damaged goods at a discount" for quick profit maximizers. This said, "surveillance capitalism" is rife and inherently a step backwards from a democratic society IMHO, due to information asymmetry, secrecy and income generation.

        • pwillia7 4 days ago

          The data was always a commodity to produce other goods (ads)

  • xhevahir 5 days ago

    I will be surprised if she's still there six months from now. Trump will remove her if he becomes president; whereas if Harris wins, and the GOP take the Senate--a pretty likely scenario--I fear Harris won't hesitate to use Khan as a bargaining chip to gain confirmation of her appointments.

GeekyBear 5 days ago

This portion is particularly problematic:

> many companies engaged in broad data sharing that raises serious concerns regarding the adequacy of the companies’ data handling controls and oversight.

  • mrmetanoia 5 days ago

    It would be wonderful if the staff report recommendations were taken seriously by our legislators. I think I'll send a copy of this to my reps and say hi.

shawn-butler 5 days ago

The full report[0] is a good read; don't just read the summary.

>>> But these findings should not be viewed in isolation. They stem from a business model that varies little across these nine firms – harvesting data for targeted advertising, algorithm design, and sales to third parties. With few meaningful guardrails, companies are incentivized to develop ever-more invasive methods of collection. >>>

[0]: https://www.ftc.gov/system/files/ftc_gov/pdf/Social-Media-6b...

CatWChainsaw 5 days ago

Surveillance is cancerous. It keeps on growing, feeding on justification for every data point "just because", and then eventually it kills you.

EasyMark 5 days ago

Let’s add automakers to the list as well, with all the cameras and microphones spying in auto cabins.

SamuelAdams 5 days ago

> The report found that the companies collected and could indefinitely retain troves of data, including information from data brokers, and about both users and non-users of their platforms.

As a non-user of many social media platforms, is there anything I can do to prevent companies from collecting data about me? It feels wrong that companies you do not sign up for are still finding and processing data about you.

darby_nine 5 days ago

This is truly interesting from a dialectical perspective. The current narrative is that data is simultaneously infinitely valuable and presents zero liability. This contradiction can't hold forever (though it can hold longer than any of us are alive, of course)

I suspect it will break in the direction of the narrative that "data wasn't that valuable anyway", regardless of how disingenuous this sentiment is. Nothing else preserves the economic machine while simultaneously dismissing the concerns of consumers. Perhaps we'll get special protection for stuff like SSNs to make it seem like politicians are acting on behalf of their constituents (even though a competent manager of a rational society would simply ban use of the SSN as a form of identification, as it is basically public information).

BlueTemplar 4 days ago

Far be it from me to try to defend platforms, but I have been wondering for years now:

How are data deletion requests supposed to be handled in practice, when the only way to be sure is to physically destroy the hardware the data was stored on? (Especially for transistor-based storage, and even more so when wear leveling is being used.)

Or is this actually a "pinky promise" by the company not to restore the data (or else they will face legal consequences)?
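
One approach commonly used in practice (a simplification, and not something claimed above) is crypto-shredding: store each user's records encrypted under a per-user key and honor a deletion request by destroying the key, so any ciphertext lingering on wear-leveled flash or in backups becomes unreadable. A minimal sketch, assuming the Python cryptography package and an in-memory dictionary standing in for a real, securely erasable key-management system:

    from cryptography.fernet import Fernet  # pip install cryptography

    class UserStore:
        """Toy per-user encrypted store illustrating crypto-shredding."""

        def __init__(self):
            self._keys = {}     # user_id -> key (a real system would use a KMS/HSM)
            self._records = {}  # user_id -> list of ciphertexts

        def put(self, user_id, data):
            key = self._keys.setdefault(user_id, Fernet.generate_key())
            self._records.setdefault(user_id, []).append(Fernet(key).encrypt(data))

        def get(self, user_id):
            f = Fernet(self._keys[user_id])
            return [f.decrypt(token) for token in self._records[user_id]]

        def delete_user(self, user_id):
            # Destroying the key is what makes leftover ciphertext unrecoverable,
            # even if copies survive on wear-leveled flash or in old backups.
            self._keys.pop(user_id, None)
            self._records.pop(user_id, None)

    store = UserStore()
    store.put("alice", b"alice@example.com")
    store.delete_user("alice")  # "deletion" = key destruction

The guarantee is of course only as strong as the erasability of the key store itself, which is why the key material is kept small and centralized rather than scattered across the data.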

advael 4 days ago

I honestly think that if half of the FTC's lawsuits under Lina Khan succeed, she'll have more meaningful positive change in the life of the average American to point to than the last six or so presidents

seydor 5 days ago

A little hypocritical when it comes from various government organizations all over the western world. Surveillance companies are essential for police to be able to easily gather data when needed fast. It is a happy accident that surveillance is so lucrative for advertising and also so effective for policing.

  • janalsncm 5 days ago

    Different parts of government might disagree on the best course of action but I wouldn’t call that disagreement hypocrisy per se.

    It’s also not true that it’s an irresolvable conflict. Yes the cops can and do buy your phone location data, but even if we said that was fine and should continue, that doesn’t also mean that any schmuck should be able to buy real-time Supreme Court justice location data from a broker.

herf 5 days ago

Please make it so my kids can watch a YouTube video required by school without watching 20 YouTube shorts after. That's all I want.

fred_bellows 4 days ago

from the report: "While the Order did not explicitly request that the Companies report all the types of Personal Information collected.." Why wouldn't they ask for all the personal information that they collect? Can anyone explain this?

mont_tag 4 days ago

Captain Renault: I'm shocked, shocked to find that surveillance is going on in here!

blondelegs 5 days ago

Yes thank you for listening BRAVO BRAVO BRAVO

JackOfCrows 5 days ago

Shocked, gambling, establishment, etc.

ianopolous 5 days ago

We really need e2ee social media that's designed to protect, not addict people.

  • janalsncm 5 days ago

    “E2ee social media” isn’t a coherent concept. E2ee has to do with how information is transferred, not what is transferred.
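
    As for what the term does mean: end-to-end encryption just says that the platform in the middle only ever relays ciphertext while the endpoints hold the keys. A toy sketch, assuming the Python cryptography package and a pre-shared key (a real messenger would derive the key via an exchange such as X25519/Double Ratchet):

      from cryptography.fernet import Fernet  # pip install cryptography

      # Assumption: both ends already share this key; real protocols negotiate it.
      shared_key = Fernet.generate_key()

      server_inbox = []  # the "platform" only ever sees these opaque bytes

      def alice_send(message: str) -> None:
          server_inbox.append(Fernet(shared_key).encrypt(message.encode()))

      def bob_receive() -> list:
          return [Fernet(shared_key).decrypt(t).decode() for t in server_inbox]

      alice_send("hello")
      assert bob_receive() == ["hello"]
      # The server stored and forwarded the message but could not read it.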

hermannj314 4 days ago

They are pretending to take this topic seriously because there is an election coming up. Your government is in bed with big tech, no one is coming to save you, everyone is on their own, expect no quarter.

jarbus 3 days ago

Lina Khan has been the best part of the Biden administration, by far. I wish she'd run for president.

hnpolicestate 5 days ago

Imagine the respect the government has for your intelligence publishing this while purchasing said surveilled user data.

  • carom 5 days ago

    The government is large and consists of multiple organizations with different goals.

  • bbarnett 5 days ago

    There is no single "the government".

    Instead, "The Government" is like a huge community. They are all supposed to adhere to the same code, but like any community there are members who look for a way to bypass the law without quite crossing it.

    That's what said purchases are. And parts of the community, even within the same branch of a government department, may do things other parts are not really aware of, or don't agree with.

    • hollerith 5 days ago

      Although you have a valid point, I object to your calling it a community because communities don't have constitutions and cannot throw people in jail if they break the community's rules. Also, a community has much less control over who becomes a member of the community than a government has over who it employs.

  • timeon 4 days ago

    Do people really have this simplistic mindset?

yieldcrv 5 days ago

Wait till the FTC discovers Full Story

russdpale 5 days ago

Instead of making stupid recommendations, which are laughable, the government should actually enforce them.

  • layer8 5 days ago

    “The government” isn’t a singular entity, and the FTC is an independent agency.

ryanisnan 5 days ago

I love the cognitive dissonance on display within the federal government.

One arm: "everyone is a criminal; spy on everyone"

Other arm: "hey you shouldn't really harvest all of that data"

  • jlarocco 5 days ago

    The cognitive dissonance is in the voters and users.

    Even right here on HN, where most people understand the issue, you'll see conversations and arguments in favor of letting companies vacuum up as much data and user info as they want (without consent or opt-in), while also saying it should be illegal for the government to collect the same data without a warrant.

    In practice, the corporations and government have found the best of both worlds: https://www.wired.com/story/fbi-purchase-location-data-wray-... Profit for the corporation, legal user data for the government.

    • spacemadness 5 days ago

      HN is filled with folks that wrote the code in question, or want to create similar products. And they hate to have it pointed out that these tools may cause harm so they thrash around and make excuses and point fingers. A tale as old as this site.

      • mrmetanoia 5 days ago

        I often have to remind myself who hosts this board and that I am hanging out on a site for successful and aspiring techno-robber-barons.

        • singleshot_ 5 days ago

          Explaining that modern technology is user-hostile and destructive to society is nowhere more on-topic than on Paul Graham’s ego blog. While it might be true to say the site is “for” robber barons, there are a lot more users here than the ones you described.

        • sabbaticaldev 5 days ago

          > I am hanging out on a site for successful and aspiring techno-robber-barons.

          that’s how we first arrive here (all of us). Time passes though, and most of us fail; then we become proper people capable of reasoning

        • 2OEH8eoCRo0 5 days ago

          Complete with egotistical and ironic appropriation of the word hacker.

    • neuralRiot 5 days ago

      >The cognitive dissonance is in the voters and users.

      People really need to learn to say “NO” even if that means an inconvenience “Your personal information might be shared with our business partners for metrics and a customer tailored experience” no thanks, “what is your phone number? so I can give you 10% discount” no thanks, “cash or credit?” Cash, thanks, “login with google/ apple/ blood sample” no thanks

    • BeetleB 5 days ago

      Anti-disclaimer: I'm not one of those folks.

      However, that's not at all a cognitive dissonance. Fundamentally, there's a difference between governments and private companies, and it is fairly basic to have different rules for them. The government cannot impinge on free speech, but almost all companies do. The government cannot restrict religion, but to some extent, companies can. Etc.

      Of course, in this case, it's understandable to argue that neither side should have that much data without consent. But it's also totally understandable to allow only the private company to do so.

      • jlarocco 5 days ago

        There is fundamentally a difference between corporations and the government, but it's still a cognitive dissonance. These aren't the laws of physics - we chose to have different rules for the government and corporations in this case.

        There are plenty of cases where the same rules apply to both the government and corporations.

    • doctorpangloss 5 days ago

      There isn’t a single intellectually honest harm associated with the majority of app telemetry and for almost all ad data collection. Like go ahead and name one.

      Once you say some vague demographic and bodily autonomy stuff: you know, if you’re going to invoke “voters,” I’ve got bad news for you. Some kinds of hate are popular. So you can’t pick and choose what popular stuff is good or what popular stuff is bad. It has to be by some objective criteria.

      Anyway, I disagree with your assessment of the popular position anyway. I don’t think there is really that much cognitive dissonance among voters at all. People are sort of right to not care. The FTC’s position is really unpopular, when framed in the intellectually honest way as it is in the EU, “here is the price of the web service if you opt out of ads and targeting.”

      You also have to decide if ad prices should go up or down, and think deeply: do you want a world where ad inventory is expensive? It is an escape valve for very powerful networks. Your favorite political causes like reducing fossil fuel use and bodily autonomy benefit from paid traffic all the same as selling junk. The young beloved members of Congress innovate in paid Meta campaign traffic. And maybe you run a startup or work for one, and you want to compete against the vast portfolio of products the network owners now sell. There’s a little bit of a chance with paid traffic but none if you expect to play by organic content creation rules: it’s the same thing, but you are transferring money via meaningless labor of making viral content instead of focusing on your cause or business. And anyway, TikTok could always choose to not show your video for any reason.

      The intellectual framework against ad telemetry is really, really weak. The FTC saying it doesn’t change that.

      • photonthug 5 days ago

        > There isn’t a single intellectually honest harm associated with the majority of app telemetry and for almost all ad data collection. Like go ahead and name one.

        You’ve already signaled that you’re ready and willing to dismiss any of the many obvious reasons why this is bad. But let’s flip it. What intellectually honest reason do you have for why it would be wrong if I’m watching you while you sleep? If I inventory your house while you’re away, and sell this information to the highest bidder? No bad intentions of course on my part, these things are just my harmless hobby and how I put bread on the table.

        In my experience literally everyone who argues that we don’t really have a need for privacy, or that concerns about it are paranoid or that there’s no “real” threat.. well those people still want their own privacy, they just don’t respect anyone else’s.

        More to the point though, no one needs to give you an “intellectually honest” reason that they don’t want to be spied on, and they don’t need to demonstrate bad intentions or realistic capabilities of the adversary, etc. If someone threatens to shoot you, charges won’t be dropped because the person doesn’t have a gun. The threat is extremely problematic and damaging in itself, regardless of how we rank that persons ability to follow through with their stated intent.

        • doctorpangloss 5 days ago

          > What intellectually honest reason do you have for why it would be wrong if I’m watching you while you sleep? If I inventory your house while you’re away, and sell this information to the highest bidder?

          This is an interesting idea, but it's a pretty far analogy from app telemetry or ad data collection. If you're really saying, "would it be wrong for me as a camera app developer to collect the videos end users record?" I suppose the answer would really be, "It depends." Like that's what Instagram does, it collects videos end users record. But without their permission? I guess not, no, but that's pretty obvious. The same would be true if you made firmware for security cameras, which happened to be pointed at my bedroom. I suppose if you asked for permission, and I granted it, go ahead - if you didn't ask for permission, I would be surprised why you would need to collect the videos as a firmware developer. The house inventory thing is the same tack - are you talking about, does it make sense for Amazon to sell my purchase history, or something? I guess they asked for permission, go ahead... Nobody forces me to use Amazon or whatever.

          Instagram, Amazon, etc. do the things they do with permission. And I don't think anyone who is fully educated is surprised what the idea is for the transactional attribution data it collects. There's lying by omission, which is bad, but that is an issue of leadership and education. Everyone in the EU still chooses telemetry and free over no telemetry and paid service, when it is spelled out to them. It's too bad that leadership has to be taken in that form, but there's no alternative in the regime they built there.

          If this is just a competition over the leadership and education of laypeople, so be it, but this real life experiment keeps happening, and the people who try to inject drama into ad telemetry keep losing, so I really don't think it's just about lying. There is a real lack of harm.

          > reason that they don’t want to be spied on

          Nobody forces you to use Instagram. If you think ad data attribution is a form of spying, go for it. Delete the free social media apps. I don't use them. I don't have Instagram, TikTok, etc. I spend less than 10m a week watching something on YouTube. I don't even have a TV in my house. Do you see? They are not enriching your life.

          > In my experience literally everyone who argues... well those people still want their own privacy, they just don’t respect anyone else’s.

          In my experience this is pure projection. I respect when people don't want to give permission to Instagram to collect ad telemetry when they choose to not install the app. Of course, you say these things on the Internet, but you, you personally, are not going to migrate off of Gmail, which does all the same things. This is all really about vibes, about vibes being vibesy against social media, but not vibes being vibesy against Gmail, which would be a major inconvenience to say no to, and it would suck to have to pay $35/mo for e-mail - at the very least!

          • photonthug 5 days ago

            So basically your argument is everything is fine because consumers can opt out. Another tired old argument where even the people saying it don’t really believe it.

            You can’t even rent a hotel room without giving them an email and a phone number they don’t need, and are looking to sell. If this works for you.. the person at the counter probably faked it rather than arguing with you. Some people will be happy when menus disappear and you need to install an app. What happens when you can’t check out of the grocery store without the requisite likes-and-subscribes? What happens when your flashlight app has 37 page ToS that says they reserve the right to steal your contact list for the purposes of selling flashlight apps? All is well because you can see in the dark, and no one makes you choose anything? Well I hope there’s healthy competition amongst the manufacturers of your pacemaker, and they don’t inform your insurance company that your health is deteriorating..

            If you’ve got no sense of right or wrong beyond what is legally permissible, just exercise your imagination a bit to look at the likely future, and ask yourself if that’s how you really want to live.

            • doctorpangloss 4 days ago

                All you have to do is tell me how you are harmed by a hotel selling your email address for marketing. The flashlight thing sounds like a bad actor and doesn’t have anything to do with anyone’s opinions about privacy, and it doesn’t sound like it has anything to do with Meta or YouTube. I’d be most interested in naming a specific harm in something that their app telemetry does.

              • photonthug 4 days ago

                I’m harmed because I did not consent to it, and that should really be enough for you. What intellectually honest reason do you have that it’s ok to coerce others into things that they don’t want?

      • jlarocco 4 days ago

        > There isn’t a single intellectually honest harm associated with the majority of app telemetry and for almost all ad data collection. Like go ahead and name one.

        The harm is the privacy violation. App telemetry needs to be "opt-in", and people should know who can see the data and how it's being used.

      • timeon 4 days ago

        Are you in ad telemetry industry?

      • arminiusreturns 5 days ago

        The intelligence agencies literally use ad data to do "targeted killing" what are you even talking about?

        Ex-NSA Chief: 'We Kill People Based on Metadata'...

        • walrushunter 4 days ago

          So terrorists? You're concerned data logging may harm terrorists?

          • arminiusreturns 4 days ago

            I got my start in the military in anti-terrorism and I find your comment foolishly naive, provocative, and short-sighted beyond measure.

        • doctorpangloss 5 days ago

          Can you define a harm suffered by the people that the FTC represents? What about the EU beneficiaries of the GDPR? This is sincere, it is meant to advance to a real and interesting conversation.

          • arminiusreturns 5 days ago

            I think privacy violations are a harm in themselves, but you seem to have already dismissed this issue, so I'll move on. How about behavioral manipulation via microtargeting, economic harm via price discrimination, reselling of the data via monetization to unscrupulous aggregators or third parties, general security reduction (data and metadata sets could be used for APT, etc), or the chilling effect of being tracked all the time in this way?

            • doctorpangloss 5 days ago

              > How about behavioral manipulation via microtargeting...

              I don't know. Ads are meant to convince you to buy something. Are they "behavioral manipulation?" Are all ads harmful?

              > ...economic harm via price discrimination...

              Should all price discrimination be "illegal?" This is interesting because it makes sense for the FTC and for anti-trust regulators to worry about consumer prices. Price discrimination in software services - the thing I know about - helps the average consumer, because it gets richer people to pay more and subsidize the poor.

              > reselling of the data via monetization to unscrupulous aggregators or third parties

              "Unscrupulous" is doing a lot of work here.

              > ...general security reduction...

              Gmail and Chrome being free ad subsidized has done a lot more for end user security than anything else. Do you want security to be only for the rich? It really depends how you imagine software works. I don't know what APT stands for.

              > chilling effect of being tracked all the time in this way?

              Who is chilled?

              I guess talk about some specific examples. They would be really interesting.

    • itronitron 5 days ago

      And in Europe, everyone and their dog uses WhatsApp

      • ale42 4 days ago

        Not everyone, but almost... and it's the same in other places (it was already the case in Buenos Aires when I went there a few years ago). And of course when you tell people that there are better alternatives, many of them don't want "another app"... (but then they install one full of trackers hoping to get some kind of prize at the local supermarket).

  • bee_rider 5 days ago

    It isn’t cognitive dissonance, the state does lots of things we’re not supposed to do. Like we’re not supposed to kill people, but they have whole departments built around the task.

    Should the state do surveillance? Maybe some? Probably less? But the hypocrisy isn’t the problem, the overreach is.

  • cvnahfn 5 days ago

    The FTC is under the president's authority. This is election pandering, same as Zuckerberg's backpedaling on government censorship.

    This is for getting votes from the undecided.

    Everything will be back to normal (surveillance, data collection and censorship) after the election.

    • layer8 5 days ago

      The FTC is bipartisan, no more than three of the five commissioners can belong to the same party. The present report was unanimously voted by all five.

    • munk-a 5 days ago

      I don't know if you've been watching but the FTC has actually been extremely proactive during this cycle. Lina Khan is an excellent steward and has pushed for a lot of policy improvements that have been sorely needed - including the ban (currently suspended by a few judges) on non-competes.

      It is disingenuous to accuse the FTC of election pandering when they've been doing stuff like this for the past four years consistently.

      • srndsnd 5 days ago

        And has sued Amazon for their use of anti-competitive pricing.

        This is just what Khan's FTC does.

    • singleshot_ 5 days ago

      Begs the question of agency authority, which is manifestly not resolved. You will find that the election’s results will affect the eventual resolution of the question of the unitary executive quite dramatically.

  • kiba 5 days ago

    There are different organizations with different opinions. The government isn't a monolithic entity.

  • bitwize 5 days ago

    And it's not just here.

    The EU: Unlike the barbarians across the pond, we actually protect people's privacy rights.

    Also the EU: ChAt CoNtRoL

    • ryanisnan 5 days ago

      The problem seems deeply fundamental to what it means to be a human.

      On one hand, there's a lack of clear leadership, unifying the societal approach, on top of inherently different value systems held by those individuals.

      It seems like increasingly, it's up to technologists, like ones who author our anti-surveillance tools, to create a free way forward.

    • whimsicalism 5 days ago

      this view presupposes the state as “just another actor” as opposed to a privileged one that can take actions that private actors can’t

      • lupusreal 5 days ago

        In the matter of corporations vs governments, if you tally up number of people shot it's clear which of the two is more dangerous. You would think Europe of all regions would be quick to recognize this.

        I don't like corporations spying on me, but it doesn't scare me nearly as much as the government doing it. In fact the principal risk from corporations keeping databases is giving the government something to snatch.

        • whimsicalism 5 days ago

          Because the government has a monopoly on violence. I would much prefer that to corporations being able to wage war themselves.

          • lupusreal 5 days ago

            Who is arguing for corporations to wage war? What an absolutely insane strawman. What I am arguing against is letting governments grant themselves the ability to spy on their own populations on an unprecedented scale, because governments "waging war" (mass murder) against their own people is a historically common occurrence.

      • Karunamon 5 days ago

        Those privileged actions are mostly irrelevant when discussing mass surveillance. Doubly so since they can just buy or acquire the data from corps.

    • immibis 5 days ago

      The EU has multiple parts. One part keeps asking for chat control, and another part keeps saying no.

  • whimsicalism 5 days ago

    It seems entirely reasonable/consistent that we would allow some capabilities among publicly sanctioned, democratically legitimate actors while prohibiting private actors from doing the same.

    In fact, many such things fall into that category.

  • daedrdev 5 days ago

    I would be worried if the state was conscious of what it itself was doing as a whole

  • layer8 5 days ago

    Since the federal government isn’t a single mind (nor a hive mind), a cognitive dissonance can only be meaningfully located on the observer’s side.

DaleNeumann 5 days ago

"According to one estimate, some Teens may see as many as 1,260 ads per day.200 Children and Teens may be lured through these ads into making purchases or handing over personal information and other data via dark patterns"

There is a long trail of blood behind Google, Facebook, Amazon, etc.

  • 93po 5 days ago

    Even with ad blockers, we still see tons of ads. Corporate news outlets like CNN constantly run front-page stories that are just paid promotion for some product or service wrapped in a thin veil of pseudo-journalism. Product placement is everywhere too. Tons of Reddit front-page content is bot-upvoted content that is actually just a marketing campaign disguised as some TIL or meme or sappy story.

    • FMecha 2 days ago

      That does not even account for offline ads, such as billboards.

  • walrushunter 4 days ago

    Blood??? Some kid spending their allowance on a shitty phone game does not make them bleed.

short_sells_poo 5 days ago

[flagged]

  • dragonwriter 5 days ago

    > People criticize the clunky attempts by the EU to reign this in, and yes I agree the execution leaves much to be desired. It's still vastly better than the complete laissez-faire approach of the US authorities.

    This is kind of weird as a response to a report by a US regulatory agency that is making specific policy requests for legislation to address this.

    • short_sells_poo 5 days ago

      Apologies, I was unclear: I'm not criticizing this report; I'm criticizing the lack of action over the past decade or so.

doctorpangloss 5 days ago

Simple questions:

Should ad prices be lower or higher?

Should YouTube be free for everyone, or should it cost money?

  • beezlebroxxxxxx 5 days ago

    Having ads does not require mass surveillance --- that's really just something social media companies have normalized because it's the particular business model they have adopted, and it's the one that makes them the most money possible.

    • goosejuice 5 days ago

      Well put. Targeting, and more specifically retargeting, is the problem.

      Most companies can't afford to not do this when their competitors are. Hence the need for regulation.

  • janalsncm 5 days ago

    Those are useful questions but I don’t think they’re the only ones that matter. Here’s another one for consideration:

    What is the minimum level of privacy that a person should be entitled to, no matter their economic status?

    If we just let the free market decide these questions for us, the results won’t be great. There are a lot of things which shouldn’t be for sale.

    • doctorpangloss 5 days ago

      > What is the minimum level of privacy that a person should be entitled to, no matter their economic status?

      This is an interesting question: maybe the truth is, very little.

      I don't think that user-identified app telemetry is below that minimum level of privacy. Knowing what I know about ad tracking in Facebook before Apple removed app identifiers, I don't think any of that was below the minimum level.

      This is a complex question for somewhat historical reasons: privacy has traditionally been understood as a limit on government power, as opposed to something like "what would be the impact if this piece of data were more widely known about me?" We're talking about the latter, but I think people feel very strongly about the former.

      Anyway, I answered your questions. It's interesting that no one really wants to engage with the basic premise: do you want these services to be free or not? Is it easy to conceive that people simply never choose the paid version of the service? What proof do you need that normal people (1) understand the distinction between privacy as a barrier to government enforcement versus privacy as a notion of sensitive personal data, and (2) will almost always view themselves as safe from the government, probably rightly so, and so will almost always choose the free+ads version of any service? Just as they have been coming out ahead for the last 30 years, they are likely to keep coming out ahead, in this country.

      • BriggyDwiggs42 5 days ago

        The issue to me is that these companies have operated and continue to operate by obfuscating the nature of their surveillance to users. This isn’t a system of informed consent to surveillance in exchange for free services; it’s a system of duping ordinary people into giving up sensitive personal information by drawing them in with a free service. I’m almost certain this model could still exist without the surveillance. They could still run ads; the ads would be less targeted.

      • janalsncm 5 days ago

        I didn’t mean to evade your questions, but my opinion is as follows:

        Yes I want YouTube to be free, but not if that requires intrusive surveillance.

        People who pay for YouTube aren’t opted out of surveillance as far as I can tell. So I reject the premise of your question, that people are choosing free because they don’t value privacy. They haven’t been given the choice in most cases.

        On a tangential note, you previously asked if ads should be more expensive. It’s possible that ads should be less expensive, since they may be less effective than ad spend would suggest: https://freakonomics.com/podcast/does-advertising-actually-w...

mgraczyk 5 days ago

"these surveillance practices can endanger people’s privacy, threaten their freedoms, and expose them to a host of harms, from identify theft to stalking."

Is there any evidence that any of these things have ever happened as a result of this sort of data collection? I'm not talking about data posted to social media, I'm talking about the specific data collection described in this FTC press release.

  • mu53 5 days ago

    I have been stalked and harassed by an Apple employee using data they were able to glean from their access at Apple.

    The impossible part is proving the abuse. All of these companies keep their databases, access controls, and everything they possibly can about these data lakes secret. The simple fact of the matter is that you will never have any evidence that someone looked you up in a database.

    It is really easy for someone to walk that line while still being obvious enough to intimidate.

    • mgraczyk 5 days ago

      Apple wasn't listed and (outside the app store) doesn't collect the data described in the press release.

      • stiffenoxygen 5 days ago

        They absolutely do. In fact, they even tried to encrypt user data so as to be less invasive than other companies, but the FBI sued them and said: no, you can't do that; you need to keep that data so we can subpoena you.

      • stiffenoxygen 5 days ago

        They mentioned practices that corporations do. I think any corporation that collects data on you counts here. I don't think it's worth it to only talk about the examples provided in the article.

      • drawkward 5 days ago

        So imagine the possible abuses by people at companies who do.

  • dogman144 5 days ago

    Not only is there evidence of harms, there is a whole industry focused on fixing the problem for those wealthy enough, or incentivized enough, to care.

    Do a bit of googling; ADINT and RTB tracking are the search terms that will get you there (there's a rough sketch of what an RTB bid request carries below).

    Or, continue being confidently dismissive of something serious people are taking very seriously. I am sorry if this FTC report targeted the source of your RSUs or otherwise motivated set of incentives, but there's no free lunch. The consequences of your viewpoint, held collectively over the last decade, are finally landing.
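
    To make the "RTB tracking" part concrete, here is a rough sketch in Python of the kind of data a real-time-bidding bid request can carry. The field names are loosely modeled on OpenRTB and the values are made up; treat it as an illustration, not a spec reference:

      # Illustrative only: a pared-down bid request of the sort an ad exchange
      # broadcasts to many bidders for every ad slot. Any bidder (or anyone
      # merely listening) can log the device ID and location it contains.
      bid_request = {
          "id": "abc123",
          "app": {"bundle": "com.example.flashlight"},        # app showing the ad
          "device": {
              "ifa": "38400000-8cf0-11bd-b23e-10b96e40000d",  # advertising ID
              "geo": {"lat": 37.77, "lon": -122.42},           # device location
          },
          "user": {"id": "exchange-user-4711"},                # exchange-side user ID
      }

      # A persistent device identifier plus a location fix, emitted with every ad request.
      print(bid_request["device"]["ifa"], bid_request["device"]["geo"])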

    • mgraczyk 5 days ago

      > targeted the source of your RSUs or otherwise motivated

      I don't currently have any financial interest in any of these companies

      > but ADINT and RTB tracking will get you there for search terms.

      These are good things. Do you have any examples of harm that has been caused by ADINT or RTB? Prosecuting criminals doesn't count for me.

  • orthecreedence 5 days ago

    Your comment is really coming across as "well, nothing bad has happened yet so who cares?" If that's not the case, please let me know how you meant it. If it is the case, surely you can imagine a world in which dragnet surveillance of people who have an expectation of privacy can be abused by corporations, institutions, or private individuals. It really doesn't take a lot of imagination to picture this world.

    • mgraczyk 5 days ago

      It's been ubiquitous for around 20 years now (Google started doing mass surveillance for display ads in the early 2000s) and nothing bad has happened, so yes that's my point.

      If nothing bad happens for decades, and that is inconsistent with your model of danger, then the model is probably wrong

      • orthecreedence 5 days ago

        Your argument boils down to "yes, someone has had a gun pointed at my head for quite some time now, but they haven't pulled the trigger yet so I don't see the problem."

        • mgraczyk 5 days ago

          No, I'm arguing that it's not actually a gun, and my evidence is that there are 2 billion "guns" that have been pointed at 2 billion people's heads for years, and nobody has been hurt.

          It's more like a flashlight than a gun

          • orthecreedence 5 days ago

            > It's more like a flashlight than a gun

            I disagree, and again, I implore you to use your imagination. If private messages (not just yours but someone else's) were suddenly to become public or institutional knowledge, what damning things might happen? What influence might some have over others? What dynamics could or would shift as a result?

            I'm comfortable making the claim that you aren't really thinking this through, at all, in any meaningful way.

            • mgraczyk 5 days ago

              The FTC press release is not talking about private messages, that is not the kind of data they are asking to protect. Private messages are already generally protected in the way the FTC is asking for.

            • immibis 5 days ago

              What was the fallout last time this happened? Was it like pulling the triggers of guns pointed at people's heads?

      • ryandrake 5 days ago

        If you don't think anything bad happens from personal data being accessed without one's consent, please reply to this comment and share:

        1. Your full name

        2. Your home address

        3. Your social security number (if you're American)

        4. Your mother's maiden name

        If you're right, then you have nothing to worry about.

        • mgraczyk 5 days ago

          None of this data is included in the FTC report. They are not talking about this.

          My full name is Michael Graczyk, I live in San Francisco, and none of these companies know any more detail than that about the questions you asked.

          • tway_GdBRwW 5 days ago

            Michael, I disagree with your point but I recognize your integrity. You just posted your name and city, and your HN profile shares more personal information.

            I respect that you are willing to stand behind your claim. Best of success with your current venture.

          • lcnPylGDnU4H9OF 5 days ago

            > none of these companies know any more detail than that about the questions you asked

            I suspect you mean that you haven't provided these companies with these details. What reason do you have to think they don't know those details?

            • mgraczyk 5 days ago

              They don't know these details because they have never asked. It's not the sort of detail that would be useful for ads (except my home address)

      • tway_GdBRwW 5 days ago

        > nothing bad has happened

        ummm, WTF?

        10x increase in teen suicide doesn't qualify as "bad"?

        or repeated DOJ lawsuits against Facebook because their advertising practices result in highly effective racial discrimination?

  • drawkward 5 days ago

    Wait for the AI tools Larry Ellison wants to give to law enforcement to retroactively connect/hallucinate the dots.

negativeonehalf 5 days ago

Targeted advertising is a good thing. It lets people who make stuff more efficiently connect with people who want that stuff.

The FTC chair is complaining that companies "monetize that data to the tune of billions of dollars a year," but all this means is that this service is tremendously valuable.

The Internet's targeted advertising system is a major achievement of modern information technology and data science, and we dismantle it at our peril.

  • ickelbawd 5 days ago

    Really? Peril? We’ll be in perilous danger if we don’t maintain targeted, invasive ad tracking? Get a grip.

    • negativeonehalf 4 days ago

      I am saying that we should not destroy a major source of prosperity. Targeted advertising is far more effective than untargeted advertising because it lets you show ads to people who might actually have an interest in what you're selling.

      • Loughla 4 days ago

        That can be done without the massive accumulation of data on each of us.

        You join a Facebook group for fashion? Ads for fashion.

        You join a Facebook group for woodworking? Ads for tools.

        See how that doesn't require any personal information?
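
        A minimal sketch of that idea, purely illustrative (the topics, ad inventory, and function here are hypothetical, not any real ad platform's API): contextual targeting keys the ad off the topic of the group or page being viewed, so nothing about the individual viewer needs to be collected.

          # Contextual targeting: pick ads based on the topic of the group/page
          # being viewed; no per-user profile or tracking is involved.
          ADS_BY_TOPIC = {
              "fashion": ["Ad: summer dresses", "Ad: sneaker sale"],
              "woodworking": ["Ad: chisels", "Ad: dust collectors"],
          }

          def pick_ads(group_topic: str, n: int = 1) -> list[str]:
              """Return up to n ads for the topic of the current group."""
              return ADS_BY_TOPIC.get(group_topic, ["Ad: generic brand"])[:n]

          print(pick_ads("woodworking"))  # ['Ad: chisels']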

kart23 5 days ago

> Profound Threats to Users Can Occur When Targeting Occurs Based on Sensitive Categories

> Targeted ads based on knowledge about protected categories can be especially distressing. One example is when someone has not disclosed their sexual orientation publicly, but an ad assumes their sexual orientation. Another example is when a retailer identifies someone as pregnant and targets ads for baby products before others, including family, even know about the pregnancy. These types of assumptions and inferences upon which targeted advertising is based can in some instances result in emotional distress, lead to individuals being misidentified or misclassified, and cause other harms.

If this is one of the biggest harms the FTC can come up with, then honestly, as a consumer, I don't really care. Having free YouTube is worth getting a few mistargeted ads, or I CAN JUST TURN TARGETED ADS OFF. Advertising isn't someone harassing you; it's an ad that I can close or just report as not being accurate. I'd really be interested to hear from someone who thinks getting a mistargeted ad is in the top 10 most stressful things in their life.

What I would really be interested in is the raw responses from the companies, not this report.

  • carb 5 days ago

    > I CAN JUST TURN TARGETED ADS OFF

    The only reason you have the option to do this is because of groups pushing back against advertising companies. Ad companies have no incentive to offer the option to disable targeting.

    If you like having this option available, then you should like this FTC report and the position they are taking.

    • kart23 5 days ago

      > If you like having this option available, then you should like this FTC report and the position they are taking.

      I can like some positions and actions the FTC has taken, like requiring the ability to turn off targeted ads, and not like others, like this one. This is among the biggest problems in politics right now: supporting a political party doesn't mean you need to back 100% of their opinions and policies; that's how change is effected in successful democratic systems.

      • stiffenoxygen 5 days ago

        > I can like other positions and actions the FTC has done, like requiring the ability to turn off targeted ads, and not like others, like this one

        They weren't saying that was the case; I think you're misunderstanding them here. But they are 100% correct: you are benefiting from other people fighting against this mass surveillance and yet speaking against it. I think you should do some research on why privacy is important and challenge yourself and your potentially entrenched beliefs.

        • kart23 5 days ago

          Read my first comment. I definitely agree privacy is important. All I'm saying is that this is not one of the harms we should be worrying about when saying targeted advertising is a problem, and I don't understand why this is an important issue that we should care about when targeted advertising can be turned off:

          "Profound Threats to Users Can Occur When Targeting Occurs Based on Sensitive Categories"