>There is also lack of clarity on what constitutes “sale” of information, retail lobbyists and attorneys advising retailers said.
It's embarrassing that the law isn't more explicit that this also covers "sharing" where money doesn't change hands, but I wouldn't expect any more from California's legislators; my default assumption is that they'll be captured by the industry in their own backyard.
I guess it will be up to courts to decide if sharing data counts as payment-in-kind.
(t) (1) “Sell,” “selling,” “sale,” or “sold,” means selling, renting, releasing, disclosing, disseminating, making available, transferring, or otherwise communicating orally, in writing, or by electronic or other means, a consumer’s personal information by the business to another business or a third party for monetary or other valuable consideration.
Even if Google doesn't pay you for your data, you still have the right (as a resident of California) to prevent Google from selling on your data directly, or from using your data as part of a sale (e.g. targeted advertising).
My experience is the "confusion" is in what constitutes "other valuable consideration": Can a company offer a personal data marketplace using credits that can only be used to obtain other personal data, and can only be obtained by sharing personal data? (e.g. the old data.com connect model). The code suggests maybe, but I suspect the Attorney General will take as broad a view as possible, so I'd steer clear of any startups that think this is a good idea.
> Even if Google doesn't pay you for your data, you still have the right (as a resident of California) to prevent Google from selling on your data directly, or in using your data as part of a sale (e.g. targeted advertising).
Except the definition you quoted doesn't say anything about "using" the information to provide a product (i.e. targeted advertising), it only talks about actually transferring that information to "another business or third party for monetary or other valuable consideration". So if the information doesn't leave Google's servers then it seems like it doesn't apply to Google.
1. The ad tag Google delivers to publishers captures personal data like IP addresses and cookies, non-personal but potentially privacy-leaking data such as the URL the user is visiting, and, somewhere in between, marketing segments the user may belong to. It then delivers that information (usually as JSON) to hundreds or even thousands of different advertisers using a protocol called OpenRTB.
2. The ad tag being served can also include an impression tracker. This is usually a pixel (literally an <img> tag!) that points to the advertiser's server, where they record counts and sometimes media spend; because it's a third-party server, that advertiser automatically receives the IP address and cookies, the URL the user is visiting, and so on.
3. One of Google's products includes custom segments, which contain IP addresses and cookies for one or more ad exchanges. This is actually delivered as a flat file to buyers, and while Google itself does not offer this service publicly (so conceivably the contracts could be updated to be CCPA-compliant), other exchanges that are otherwise like Google certainly do not.
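To make point 1 concrete, here is a hedged, stripped-down sketch of the kind of JSON payload an OpenRTB bid request carries. The `site`, `device`, and `user` objects are real OpenRTB 2.x field names, but the values and overall shape are illustrative, not a complete request:

```python
import json

# Illustrative OpenRTB-style bid request; real requests carry many more
# fields. Note how much user-identifying data rides along with each one.
bid_request = {
    "id": "auction-123",
    "imp": [{"id": "1", "banner": {"w": 300, "h": 250}}],
    "site": {"page": "https://example.com/some-article"},  # URL being visited
    "device": {"ip": "203.0.113.7", "ua": "Mozilla/5.0 ..."},  # IP + browser
    "user": {
        "id": "exchange-cookie-id",     # the exchange's cookie for this user
        "buyeruid": "buyer-cookie-id",  # the bidder's own (synced) cookie
    },
}

payload = json.dumps(bid_request)
# This payload goes out to every bidder in the auction, so hundreds of
# parties can receive the IP, URL, and cookie IDs for a single page view.
```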
It is entirely possible your point is accurate for someone who doesn't provide ad exchange services or impression trackers or custom segments (such as Facebook), but it is also likely that some party selling a product that is derived from the use of this data will be considered in-scope. I would steer clear of companies that favor an alternate interpretation until the Attorney General has had a say.
>I suspect the Attorney General will take as broad a view as possible, so I'd steer clear from any startups that think this is a good idea.
He was one of two AGs to not join other state AGs in their antitrust investigation of Facebook [1], so I'm not holding my breath. Private right of action is what's going to hold businesses accountable, even though it's also been crippled [2].
the law isn't more explicit that this also covers "sharing" where money doesn't change hands
Our lawyers tell me that it does cover any in-kind exchange. For example, if we offered a customer summit in partnership with one of our product manufacturers (as we often do), and if that manufacturer helps underwrite the cost of the event, then our providing them a list of attendees would count as a sale.
It feels a bit pedantic to say that about a post from a week ago, but the whole point of the article was that they had until January 1st to comply, which has now passed. When I first saw the article title, I assumed it would be an article about how companies were out of compliance already since the date had passed, but it's not.
Well, kind of. The law technically goes into effect on Jan 1, but the CA Attorney General is not allowed to enforce it until July 1. This was part of a compromise brokered around the fact that it's taken a long while to nail down the final wording of the law. While some companies are already compliant, others are taking this as a grace period to get their act together. Which is necessary because of the short timeline.
As far as I'm aware, the actual text of the regulation is still not finalized. The most recent update to the text was in October, but the AG office was gathering public feedback in early December. They still haven't released their findings from the feedback. The October version was more a Release Candidate, if you will.
>You want to sell my purchase/demographics data? No problem. Just let me know how much or what i'll get for it and i'll decide if it worth to opt-in.
It should be allowed to be the cost of admission to the site. If you don't like that advertising and your own metadata let you view the site for free, you could just not visit (or, when you decline consent, they could block you).
Websites shouldn't be required to provide you content for free.
Maybe for some or even most sites this makes sense. But Walmart, Home Depot, etc., from the article are businesses open to the public, and as such operate under laws governing public businesses. Few would put up with Walmart making them sign a contract letting them sell their personal demographic information in order to enter a public store (except if every public business did this, removing the choice - currently how most of the internet works). Why should websites for public businesses be different?
You may say that credit and debit cards do this already (and they do), but you can still pay cash in public stores. Of course they can track you with your phone's bluetooth identifiers, facial recognition, etc., but why as a society should we allow public businesses to require this? Once we permit one place to do it, others will follow.
You don't need a membership to enter. Specifically, the Costco near me has a liquor store inside but can't legally have it be part of the store. It's a separate store within Costco, and they won't stop you from walking in, buying liquor, and leaving. You can even grab a cheap hot dog on the way.
Costco does not do that because of the goodness of their heart. They do that because state laws mandate liquor stores to allow anyone to buy liquor without being part of a club or membership. You can thank your state's liquor lobby for that.
Sure, that's what paid membership websites are for.
But no website should be allowed to pull a fast one on its visitors by quietly harvesting visitor data and sharing it with other entities.
If you're going to give me a free kiss, please ask my consent before infecting me with a personal-data-stealing disease.
Of course, if you like that kind of kissing and don't give a shit about others, the government should still step in to protect its citizens.
Honestly, I would happily pay many websites if they allowed me to, instead of being ad-supported. I bet Facebook makes less than $10/month selling targeted ads, tracking info, etc. about me (much less considering my ad blockers, etc.). I would happily pay for the site so I am no longer the product, but an actual customer.
Websites aren't required to provide you content for free, and (for the most part) shouldn't be. But they also shouldn't be able to claim that there's an implicit acceptance of TOS by visiting the site. If they want to make it a cost of accessing the site, they should put the equivalent of a paywall in front of the site for data.
If I make an unauthenticated HTTP GET request to a server, and it responds with a 200, it can't claim after the fact that receiving that request means I agreed to something.
Hi! The E.U. makes us put up warnings for cookies. These are silly things that every service on the WWW uses: don't worry, little user, we're just doing what everyone else does!
[accept] [go away]
By Visiting This Website, You Agree to Have Cookies Placed on Your Device.
I don't know I'd call it a "farce;" it pretty closely mirrors my experience browsing the World Wide Web these days.
EU added a hell of an inefficiency to the WWW by making user education the responsibility of every site with no way for a user to pre-emptively signal "Yeah, I get it, and I don't care."
The farce is saying that the popup is needed "because we use cookies". It's because you're using cookies to track users, their PII and sell information and ads.
The EU didn't add any inefficiencies, rather, it's forcing the data abuse by websites to come to light.
If you use cookies only for login/session tracking then guess what, no popup is needed.
Also, if the user has expressed their consent (or lack thereof) once, then the popup is not needed anymore (google/fb/etc. do exactly that).
So, as a user, how can I signal "I don't care; stop showing me this popup on every site?"
Because if the answer is "per-site signaling," that's soft encouragement to keep using the same sites so I don't need to see that tedious notification and it rewards big players over small players (and defeats some of the benefit of the WWW as a hyperlinked network of data that's fairly location-agnostic).
> as a user, how can I signal "I don't care; stop showing me this popup on every site?"
That's on the website, unfortunately.
> that's soft encouragement to keep using the same sites so I don't need to see that tedious notification and it rewards big players over small players
I agree; I would want the same option (since I use privacy browser extensions, I don't care). It's not the user's fault that small sites are playing dumb and using some generic, annoying "we value your privacy" popup BS.
> This is FUD, proportionality is used when calculating penalties
Can you point to where in the law that guarantee is given? Furthermore, given the previous penalty was "none," any penalty can probably be considered significant for website operators who were previously assuming nearly zero risk in running their sites.
Paragraph 1.
> Each supervisory authority shall ensure... in each individual case be effective, proportionate and dissuasive.
Paragraph 2. a)
> the nature, gravity and duration of the infringement... as well as the number of data subjects affected and the level of damage suffered by them
Also, as we are discussing this, a lot of this will now apply to CCPA and (in some cases) COPPA.
> Also, as we are discussing this, a lot of this will now apply to CCPA and (in some cases) COPPA
Yes, and COPPA has terrified YouTube content creators.
Wiggle language like "proportionate" (especially when paired with "dissuasive") doesn't assuage the fears of website admins, because it isn't a dollars-and-cents (or, in this case, euros) amount. It's at the behest of a judge, which is not a risk space an admin (particularly one running a site as a secondary function, not as core to their business model) wants to take on. So the law, as structured, encourages those popups everywhere forever. Annoying, to say the least.
I don't think I'm spreading the FUD; I think the FUD comes from the ambiguity in the penalties described in the law itself.
I'll just add that: if they're so worried about the "huge penalties" why are they taking the cheapest way out and using the first non-compliant popup they came across instead of trying to understand the law and come up with an actual solution?
Probably because nobody's gotten sued for using the same banner yet. At the end of the day, this law has added a burden very few web site admins wanted; they don't want to understand a law to provide content on the Internet (especially when that was unnecessary only a few years ago, and especially when it isn't perceived as adding value to the user experience). I'm not surprised if people try and do the minimum legally necessary to comply with its ambiguity and pray they don't become the test case to resolve that ambiguity.
CCPA is no doubt inferior to GDPR for that very reason. But to keep things in perspective, CCPA was only passed as a compromise because Alastair Mactaggart, a rich real estate developer, forced their hand by threatening to make it a ballot initiative. It's unlikely that California would have led on this issue without the threat of a referendum.
Fortunately he's back at it again with a new ballot initiative, which he's funding himself, to improve on CCPA [1]:
>Mr. Mactaggart said his 2020 state ballot initiative, among other things, would create a state enforcement agency, limit targeted advertisements based on geolocation and add items covered by the “negligent data breach” section, which would allow consumers to pursue legal action in more instances of a hack.
>Most significantly, Mr. Mactaggart said, his new effort would make it harder to adjust the current law any further. The initiative includes a “purpose and intent” section that requires any amendment to the law to be in the service of protecting consumers’ rights to privacy, a legally binding clause that Mr. Mactaggart said would prevent industry from chipping away at the measure.
...
>The lobbying against Mr. Mactaggart’s earlier initiative kept it from the ballot in 2018, and legislators instead passed the privacy law. “It was the right thing to do because there was no guarantee at the ballot box,” Mr. Mactaggart said of his agreement to drop his measure. “Now, it’s different because we have the law.”
>Mr. Mactaggart said he would do whatever it takes to pass his initiative next year, including spending millions of his own money. He bankrolled his last campaign almost entirely himself, spending more than $3 million. Mr. Mactaggart will have to collect more than 623,000 signatures for his new initiative to qualify for the ballot. A survey in October by Goodwin Simon Strategic Research of 777 registered voters in California found that most supported Mr. Mactaggart’s initiative.
Any reason why this could not be implemented right in the browser? If the browser sends the specific header, either allowing it or disabling it, the server doesn't need to show the opt-in/out nag.
There's no reason a government like California couldn't require honoring DNT (within their jurisdiction, of course, as with the CCPA) and provide penalties for violations.
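There is prior art for exactly this kind of browser signal: the `DNT` header, and more recently `Sec-GPC` (Global Privacy Control). A minimal server-side sketch of honoring it, under the assumption that a regulator treated either header as a binding opt-out:

```python
def needs_consent_banner(headers: dict) -> bool:
    """Decide whether to show a consent banner, given the request headers.

    DNT (Do Not Track) and Sec-GPC (Global Privacy Control) are real HTTP
    headers; treating either as a binding CCPA-style opt-out is the
    assumption this sketch makes.
    """
    if headers.get("DNT") == "1" or headers.get("Sec-GPC") == "1":
        # The browser already expressed the user's preference globally:
        # honor it, suppress tracking, and skip the per-site popup.
        return False
    return True
```

A site taking this approach would simply never render the banner for opted-out visitors, instead of asking on every domain.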
My phone has Google Fi VPN. This VPN doesn't do everything, but websites now think I'm in Mountain View. I am nowhere close. Is a VPN a better solution? I understand that Google might not be the best company to trust here, but they did give it to me for free and it is working fine.
This is how it should be; it’s how GDPR defined it too.
Instead, the rules weren’t tight enough to stop the current shitshow we now have. Your explicit opt in is a nice single button click, on a modal that fills the entire screen. It tells you nothing about what happens, it’s just easy.
Meanwhile, the rest is hidden away. And you still don’t know if those changes have any effect, especially if the tracking is also done server side. You’ll still get opted into all the emails by default, the needy ones that have to remind you every day that you made an account, and make you log in to disable them (to update their active monthly user count I’m sure).
Of course, it’s working that way because companies share your data with so many random third parties that opting in for each one would scare you away. And most of them do similar things, it’s probably different teams or departments insisting on using their own tools.
I don’t blame advertising for this per se; I blame growth hacking too.
>Instead, the rules weren’t tight enough to stop the current shitshow we now have. Your explicit opt in is a nice single button click, on a modal that fills the entire screen. It tells you nothing about what happens, it’s just easy.
I wouldn't blame GDPR for any company's deficient consent process. That's just what they believe to be in compliance with GDPR; it may not actually be. Many of these questions are being litigated by privacy activists like Max Schrems.
"""
A Walmart source with knowledge of the matter told Reuters the company is “working through a lot of ambiguities in the law, for example, the language around loyalty programs and if retail companies can offer them going forward.”
"""
If loyalty programs run afoul of the law, it'll be interesting to see whether this creates an axis of competition between companies that drop loyalty programs both inside and outside CA, and companies that continue to offer them in states outside CA.
Loyalty programs seem a mixed bag for consumers; some are actually into them, some hate them (seeing them as inefficiency that pushes labor onto the consumer).
Except there is absolutely no problem with loyalty programs. There is only a problem with collecting data for loyalty programs. Loyalty programs used to be done with stamps on paper cards, if you had a card full of stamps, you could get a rebate or whatever.
Also, I don't hate them (the ones that collect data, that is) because they push inefficiency onto the customer, but because I have to pay more so other people get paid for helping corporations manipulate me more effectively.
Given how much fraud is happening with customer data (that is: it being acquired under a pretense and then used for a different purpose), I very much doubt it.
> require more physical product be printed, and are less convenient for users than online solutions.
Except that's kinda orthogonal. The paper solution is easier to demonstrate to be free of back doors, but there is no reason why you couldn't build an "online solution" that doesn't do any tracking. You could just hand out random tokens without ever associating them with a transaction, and then accept sets of ten of those to redeem for a rebate, say.
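A minimal sketch of that token idea (all names are made up): the store issues unlinkable random tokens instead of ink stamps and redeems any ten of them, so it learns nothing about who earned what.

```python
import secrets

issued = set()  # tokens handed out at purchase time, tied to no one

def issue_stamp() -> str:
    """Hand the customer an unguessable random token instead of an ink stamp."""
    token = secrets.token_hex(16)
    issued.add(token)
    return token

def redeem(tokens: list[str]) -> bool:
    """Trade exactly ten previously issued tokens for a rebate."""
    if len(tokens) != 10 or not all(t in issued for t in tokens):
        return False
    issued.difference_update(tokens)  # single-use: no photocopied cards
    return True
```

Because each token is random and single-use, this also resists the photocopied-card fraud that plagues paper schemes, without building any customer profile.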
> Given how much fraud is happening with customer data (that is: it being acquired under a pretense and then used for a different purpose), I very much doubt it.
Sorry; I was unclear. More prone to fraud against the loyalty program, i.e. customers buying the appropriate stamp and photocopying a dozen instances of the card to make "every 10th visit" into "every visit."
The privacy violation that infuriates me the most is that Visa and Mastercard sell your purchase data, and I don't believe CCPA applies to them. Like cell phone companies selling tower-based location data, the credit card's sale of data is unavoidable and completely hidden from consumers.
> The privacy violation that infuriates me the most is that Visa and Mastercard sell your purchase data, and I don't believe CCPA applies to them. Like cell phone companies selling tower-based location data, the credit card's sale of data is unavoidable and completely hidden from consumers.
If they do business in CA, why wouldn't CCPA apply to them? Even for HIPAA-covered entities, the CCPA still applies, just not all the parts (specifically there are exceptions for health data).
While cell-phone companies could function without keeping records of my location, credit cards couldn't function without keeping records of my purchases—at least until the next billing cycle. Somehow this makes their abuse of that information less surprising to me—which doesn't mean I like it any better!
That isn't actually true. It would be perfectly possible to build a payment system where the participants by default don't know the identity of most other parties.
Like, it would be perfectly possible for a merchant to encrypt the transaction description with a public key of the customer, and their bank only submitting the encrypted record with the amount and the account to debit to the card issuer, who would debit the account and pass on the encrypted record to their customer.
That's just a random idea, but the point is that you could achieve much the same result with a lot less data collection if you actually cared to.
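A toy sketch of that flow (hypothetical names throughout, with the public-key encryption stubbed out to stay dependency-free): the merchant encrypts the line items to the customer's key, so the issuer sees only the amount, the account, and an opaque blob.

```python
def encrypt_for_customer(description: str, customer_pubkey: str) -> bytes:
    # Stand-in for real hybrid public-key encryption (e.g. via the
    # `cryptography` package); only the customer's private key could
    # recover `description` in an actual system.
    return b"<opaque:" + str(abs(hash((description, customer_pubkey)))).encode() + b">"

def merchant_submits(description: str, amount: float, account: str,
                     customer_pubkey: str) -> dict:
    """What the merchant's bank would forward to the card issuer."""
    blob = encrypt_for_customer(description, customer_pubkey)
    return {"account": account, "amount": amount, "details": blob}

record = merchant_submits("2x espresso, 1 croissant", 9.50, "acct-42", "pk-alice")
# The issuer can debit the account without learning what was purchased:
assert b"espresso" not in record["details"]
```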
I'm OK with the banks or even the card companies keeping a record of transactions. But Google boasts that it buys 80% of all the world's credit card transaction data for its profiling. That's just not cool.
Look at it this way: Would you prefer to receive generic, un-targeted advertisements? Or would you rather see something more likely suited to your interests?
The truth is "profiling" / ad targeting is actually a benefit to the consumer.
... so insisted the Surveillance Valley reality distortion field of the late 2000's, leading to an ever-growing backlash now including this law.
No, pervasive surveillance is not a benefit to me because it facilitates smoother attempts at psychological manipulation. It's generally a good thing when spam stands out as much as possible, making it easier to ignore.
I can receive targeted advertisements without being surveilled and sold.
It's been done for hundreds of years with billboards, newspapers, magazines, radio, and television. Heck, even web sites can do it, and did for many years before Google even existed.
Not in the traditional sense, no. Also those same magazines are selling their subscriber lists to other advertisers and data companies. Privacy is an illusion and always has been.
One of the reasons I use my Apple Card is because it's a step in the right direction.
I don't remember exactly what Tim Cook said in the announcement, but I believe it was something to the effect that Goldman Sachs can't use the information for marketing. Whether that means internal marketing, or whether they're selling it, wasn't specified.
It's not ideal, but it's a start; and other than cash, the only way to vote with my dollars.
I often think these efforts will just continue to empower the larger players and frustrate the user. No one is going to go to all these companies/websites/apps and ask them to delete their data, and then do it again tomorrow, next week, next month, next year... whenever they collect more data. We need a mechanism that creates so much data, both legitimate and artificial, that these systems prove worthless.
Edit: I'm curious about the effectiveness of browser extensions like AdNauseam. I've used it on and off and was amused by how much I 'cost' the ad companies, but for all I know it just adds another data point: my machine clicks every ad that comes across it.
By the year 2030, the amount of fine print and opt-ins at the top of each website - in order to comply with every state and country's bespoke data regulations - will be larger than the warnings on cigarette packs.
Sorry if you're being sarcastic and it's totally gone over my head, but if you're not processing personal data you don't need to explain all the ways you don't process personal data. Otherwise you might as well explain all the other ways you're not breaking the law (which is going to be a long list).
It's impossible to build a website that doesn't process personal information at all, because the California law explicitly defines your IP address as personal information.
Assuming it works like GDPR, that is presumptively acceptable on multiple grounds: (a) it is necessary to provide you business services (serving you the website), and (b) you are not retaining/logging it.
California didn't outlaw the internet, guys; you're just being hysterical because you're going to have to ease back on your data collection a bit.
It's presumptively acceptable, yes. But the comment I responded to was saying that you don't have to even explain it, and I don't think that's true; the law makes it pretty clear you do.
Except the _website_ doesn't have to know the IP address? It is perfectly sufficient for your IP stack to know it while the connection is established, and then immediately forget it.
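And even where a server does log requests, the address can be truncated before it is stored. Here is a sketch of the masking technique common log anonymizers use (the /24 and /48 mask sizes are a widespread convention, not a legal standard):

```python
import ipaddress

def anonymize_ip(ip: str) -> str:
    """Zero the host bits so the stored value no longer identifies a
    single machine: keep a /24 prefix for IPv4 and a /48 for IPv6."""
    addr = ipaddress.ip_address(ip)
    prefix = 24 if addr.version == 4 else 48
    net = ipaddress.ip_network(f"{ip}/{prefix}", strict=False)
    return str(net.network_address)

# anonymize_ip("203.0.113.77") -> "203.0.113.0"
```

Running every address through a function like this before it reaches the access log keeps aggregate statistics usable while the identifying bits are forgotten immediately.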
If you delegate the processing to a service provider, you're still responsible for it. Someone has to do it on your behalf, which is isomorphic to you doing it.
IANAL, but my understanding is that it's not even clear you can avoid the "don't sell my data" language in CCPA even if you don't sell/share/collect data.
> IANAL, but my understanding is that it's not even clear you can avoid the "don't sell my data" language in CCPA even if you don't sell/share/collect data.
This is how my company's counsel (and other attorneys with whom our counsel consults) has read the CCPA. If you don't sell data, you don't need to include language to opt out of data sales, since you don't do it. If that changes, you have to notify customers of the change and provide the button.