They’ve been sued in IL as well on the same grounds, and Clearview’s defense that they aren’t a private entity because they "only" do business with public entities (e.g., law enforcement) is definitely not going to fly here. This settlement establishes how seriously the state of Illinois is taking non-consensual biometric information gathering, especially facial recognition.
This is a reductive argument, and it holds up to neither social nor legal norms.
Making data publicly accessible does not grant users of it carte blanche to use it however they wish. The data may be copyrighted: Viewing it does not grant the viewer rights to republish it or claim it as their own work. An image may contain a likeness: Viewing it does not grant the viewer rights to claim endorsement by that individual, or to use their likeness in ways of which the individual does not approve.
Beyond copyright and likeness rights, a likeness contains biometric information. Having access to that likeness, at least by the laws of the State of Illinois, does not grant the user rights to biometrics derived from that image.
To be fair here, it kind of is the question, right?
I mean, say you and I are at a neighborhood kid's birthday party with our own kids. Suppose further that you take some pictures, and then post them publicly. Maybe to your FB?
Well, I never really gave you permission to use my image on your FB. Let alone publicly. I know that I certainly would never consent to publishing images of my children publicly. Further, I suspect the birthday party's host wouldn't give you permission to post his/her family's images on FB either.
So here we are, with all of these people on your public FB. None having signed or even given verbal authorization for any sort of release. And now FB starts running FaceRec on all the information you just gave them. To top it all off, my family and I don't even use FB. Never have. So it really can't even be claimed that we consented to facial recognition via the terms and conditions of service.
I outlined all of that to make this point: just because a person's image is in public doesn't mean that the person consented to the image being public. Especially images taken in private spaces (like birthday parties at some kid's house).
> So here we are, with all of these people on your public FB. None having signed or even given verbal authorization for any sort of release. And now FB starts running FaceRec on all the information you just gave them. To top it all off, my family and I don't even use FB. Never have. So it really can't even be claimed that we consented to facial recognition via the terms and conditions of service.
> I only outlined all of that to outline this, just because a person's image is in public doesn't mean that the person consented to the image being public.
But what part of this would, even hypothetically, give you a cause of action against Facebook? What's Facebook supposed to have done wrong? You have a cause of action against your friend who illegitimately provided your photos to Facebook, not against Facebook who relied on the legal assurance your friend provided that he had the right to post those photos.
They ran the facial recognition software without knowing whether they had everyone's consent. Ignorance isn't an excuse for breaking the law; the onus is on FB to make sure all the people pictured have consented. I would not say that they have done all they can to check by merely putting it in the T&Cs that it's the user's responsibility. They are hiding behind their T&Cs, knowing full well that they could be scanning people they have no consent from.
Comically, once they run the scans they should be able to be certain they don't have consent whenever a face doesn't match one of their users, and should throw that data away.
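To make the "match, then discard" idea above concrete, here is a minimal sketch of such a consent gate. Everything here is invented for illustration (the function names, the toy one-number "embeddings," the threshold matcher); it is not Facebook's actual pipeline, just the shape of the check the comment describes:

```python
def gate_face_data(detected_faces, consenting_users, match):
    """Keep only the face data that matches a user who has opted in.

    detected_faces:   list of (face_id, embedding) found in an upload
    consenting_users: dict of user_id -> reference embedding for opted-in users
    match:            predicate deciding whether two embeddings are the same person
    """
    kept = []
    for face_id, emb in detected_faces:
        if any(match(emb, ref) for ref in consenting_users.values()):
            kept.append((face_id, emb))
        # No match means we cannot show consent, so the data is dropped here
        # instead of being retained.
    return kept


# Toy demo: 1-D "embeddings" and a simple distance-threshold matcher.
faces = [("f1", 0.90), ("f2", 0.10)]
users = {"alice": 0.88}
close = lambda a, b: abs(a - b) < 0.05
print(gate_face_data(faces, users, close))  # only f1 survives
```

The point of the sketch is that the discard step is cheap to implement once the scan has already been run; retaining non-matching biometric data is a policy choice, not a technical necessity.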
FB offers a service that allows you to share your pictures with your friends among other things. Before you upload a picture Facebook asks you if you have all of the rights to the pictures you are uploading (albeit they ask that in the fine print of their TOS.) FB offers ancillary services based on those images you shared. If you are uploading images you don’t have the rights to how is this FB’s problem?
It doesn't mean they did their due diligence to confirm their data was clean of non-consenting individuals. They are clearly abusing their users' ignorance. The signed T&Cs of one person shouldn't be enough when you're dealing with the PII of multiple people, in my opinion. You need everyone's explicit consent.
There is a disparity here between the real world and software services, I think. When my real estate agent's software asked me to put in my roommate's details, I just had to check a checkbox saying he consented. Yet when it came to the paper contracts, we all had to sign individually. We need the latter for software, or else PII is going to keep spilling all over the place and we'll forget we ever had privacy.
I don't think it's unfair to scrutinize a company in such a powerful position as FB, either. They know they have non-consenting people in their photo database, it's just inevitable, and they are hiding behind their T&Cs hoping that's enough in a court of law. Ethically, they've failed already, and clearly in some states they're breaking the law too.
I find this type of logic to be inconsistent with the innovative culture of tech, and a fair bit hypocritical. On one hand, we have a new, fairly unprecedented technology (at least unprecedented on the scale on which it’s used) yet we are relying on tort law that predates the contemplation of anything of this magnitude. Why?
It was reasonable in 1990 to expect that the mechanism you suggest here would be highly effective in most cases. We are in uncharted waters now, and unlike the Titanic, I think we should proceed with caution.
Wouldn't it be similar to any copyright protection claim, similar to YouTube (or Napster, or LimeWire, or MegaUpload, or...)? If they don't legally have the right to share the media, the default action is to remove it without even investigating the claim (because of the scale Facebook and YouTube operate at).
Assuming Facebook cannot legally host the media, and they choose to do so after being informed of its contention, they are choosing to say they are legally in the clear (or that they don't care about the law, which seems a lot more likely).
IANAL, but it seems simple enough on the surface. Of course Facebook will claim otherwise, and without being required to make it easy to report contentious media, very few people will actually do so; and sharing a photo without permission is not the same as copyright infringement. I still think the same basic logic applies, though.
Case law already exists on the right of people who appear in photos to stop the photographer from publishing the photos; the issue is not original to the internet at all.
If I recall correctly, there is no such right, because it would effectively eliminate photography.
This only applies in a public setting though (or anywhere where privacy is not to be expected). So any photos taken of friends/strangers in the privacy of one's home (or car, or work, or...) would be exempt, as I understand it.
Intent matters in law. Facebook knows a substantial portion of their users will not acquire permission. In fact, their UX at every turn is a honeypot, so they doubly know.
Well, if the birthday party were in, say, Illinois, it's obvious that FB is in violation of Illinois law with respect to facial recognition.
Further, it being a child's birthday party, at a private residence, the photographer's action becomes a tort in the vast majority of states. If you were a big enough pain in FB's butt, you could use that to stop distribution of the images at a minimum.
It's true that the easiest method of guaranteeing ironclad, legally enforceable privacy with respect to this facial recog stuff is to live in Illinois. But that doesn't mean the situation is hopeless for people who live in other states. It's just a matter of whether or not you want to do the legwork of making the claims. Which, depending on the state you live in, may have to be based on anything from a privacy violation like Illinois's, to contributory copyright infringement if you took the photo and forwarded it to your friend who then posted it.
So it would depend on both the situation and the state and municipality in which you reside. I can't give you one catchall, because there isn't one: these laws are uneven across the country and very much in flux right now.
But beware: if it concerns stuff that is already on FB, or on partner sites with similar ToS, one has signed away one's rights. IANAL, but this is from the FB Terms:
>> Specifically, when you share, post or upload content that is covered by intellectual property rights on or in connection with our Products, you grant us a non-exclusive, transferable, sub-licensable, royalty-free and worldwide licence to host, use, distribute, modify, run, copy, publicly perform or display, translate and create derivative works of your content (consistent with your privacy and application settings). This means, for example, that if you share a photo on Facebook, you give us permission to store, copy and share it with others. [..]
>> When you delete content, it's no longer visible to other users; however, it may continue to exist elsewhere on our systems where:
>> - your content has been used by others in accordance with this licence and they have not deleted it (in which case, this licence will continue to apply until that content is deleted);
The Privacy Policy states that you are responsible for whom you share your content with, after which the above license applies.
I’m not worried about photos of myself that I upload, but am somewhat worried about photos of me that other people took. For example, there exist photos of me hanging out with my friends who are wearing kink gear, and some of those photos have been posted online. Everyone involved is having a good time and everything’s legal. Some of my coworkers might find the photos offensive, but those people aren’t browsing through kink photo galleries, so they aren’t going to see them. But I don’t want these photos to come up if you search for my name online. I also don’t want to engage in some censorship war trying to get them taken down.
Facial recognition tech, used in this way, seems to throw a bomb into a fragile peace where people can coexist with each other despite having incompatible values.
Wow, I'm glad that the rest of the country isn't following the Illinois example. I remember when the Google Arts & Culture app released its "which classic painting do you look like" feature, it was blocked from use in Illinois; I guess this law is why.
I get that people don't think twice about giving it away. But it's another thing for that to be the moral justification for defending the merits of facial recognition.
Pardon my take, but:
A large uprising is starting to swell in the name of privacy, but the swing voter is concerned about what is essentially the visual equivalent of the "Which Disney Princess are you?" quiz.
The moral framework that concludes that people need to be coddled against making the wrong decision about sharing their face data for stuff like this is a bit condescending. Why do we assume people haven't considered the risks and decided they think either the odds of worst case scenarios are low or they aren't bothered by them?
It feels a little bit like the gun control debate. Taking away people's freedoms because of the possible worst case outcome can result in a worse overall situation.
If people aren't bothered by how facial recognition is used, stepping in and asserting we know better and they need a legal protection seems premature and overreaching.
No, because I don't see my face as a private thing. Nor is the exact distance between my pupils, or the profile of my nose, or any other information that can be gleaned from a photograph of my face.
It's not that it's a private thing, it's that it's yours to control.
There aren't technical limitations to the use (just as there aren't technical limitations to copying and freely distributing all digital media), so we either institute legal limitations or accept that it's allowed. There are many negative aspects of allowing it, so we should probably make sure we take the time to look at the repercussions of both stances carefully (that is, more carefully than "I don't have a problem therefore allow it").
Of course my face is mine to control. I can wear makeup, grow out or shave my hair, tattoo the whole thing if I care to.
Images of my face created by other people may be a different story entirely. Does Donald Trump or Boris Johnson own every photo of them on the AP newswire?
These arguments get way more nuanced with new tech. What about your fingerprint? Your signature? The cadence of every syllable you say when you call tech support or order takeout? Anyone could build any of these things over a single weekend. Tomorrow, Google could enable some feature on Android to recognize your face or voice on any phone in the world.
We'll need a better answer to deepfakes than "You cannot do anything with an image of another person's face."
Unless, that is, we want to be governed by faceless legislators or entertained by faceless celebrities. There should be some way to divide responsibility for a person's image that isn't totalitarian in either direction (either "you cannot use a person's face without their consent," which kills visual news as a practice, or "every face is fair game for anyone to use at any time," which feels invasive to the individual).
I agree; in retrospect my comment was a bit facile. In the long run I don't think anyone will have any rights with respect to images of her or his own (clothed or naked) body. We're not in the long run yet. It will take some time for most of us to be comfortable with that.
In my view, it's more important to create a sort of symmetry with respect to images and other data than it is to preserve particular customs that are problematic in light of modern technology. That is, it's probably OK that every e.g. FBI agent has access to thousands of images of me, or even my complete genome, as long as I have access to thousands of images and the complete genome of every FBI agent. We learned in kindergarten that "knowledge is power". Like power, knowledge is not symmetrical. A federal prosecutor having power over me doesn't necessarily mean I have power over that federal prosecutor.
If we've learned anything in the decade just past (to be clear, that's an open question), it is that authoritarian structures are easily hacked by the authoritarians who run them. It may be that e.g. the Department of Justice wasn't always constructed to capriciously surveil and/or construct false cases against innocents (although, was that before or after they harassed MLK and other civil rights leaders?), but ISTM at least the cases of Aaron Swartz and Carter Page, to cover both ends of the political spectrum, show that to be the case now. If the tools of knowledge/power are increasing in power, we all need to have access to those tools.
> If I put information about myself out there on a public site, I still have a right to say that people can't download and use that information?
It's my understanding that scraping is legal, but as the OP states, you could be fined over half a billion dollars if you use that information. What part of the law/article was unclear?
Technically it's only illegal to use the data because the subjects of the data never gave you express written permission to do so, and you have to get that in Illinois before doing any kind of facial recognition.
In many jurisdictions, yes. In most European countries, people are allowed to download publicly available information about others, but they can only use it for private purposes: they cannot use it commercially or distribute it without consent.
I find slightly strange this notion that the internet, as a public space, is some sort of voyeuristic free-for-all where one forfeits every right. If this were not stopped, it would render the entire public internet hostile and borderline unusable, which would be a shame for a medium with so much potential.
My view is that while it shouldn't matter, collecting a database of such information is potentially dangerous enough to society to require regulation - particularly when it is most dangerous in the hands of the very same government you would ask to regulate itself.
Yes, they're called personality rights, and they are protected in some jurisdictions. A Frenchman who was taking a leak in his own front yard when the Google Street View car passed and photographed him sued Google to erase his image.
If I go out for a walk in the park, I still have a right to say that people can’t murder me and sell my organs?
???
As a (supposedly democratic) society we (supposedly) decide what is acceptable and what is not. These decisions are, in the eyes of the universe, quite arbitrary. You may be discovering that now, but some of us have known it for a while.