Who’s Watching Whom? The Growing Battle Over Your Digital Likeness

Walk through any bustling city center, shopping district, or large mall, and you’ll see them: sleek, bright, high-resolution digital signage displays promoting food, fashion, tech, entertainment – essentially, anything with a profit margin.  They’re hard to ignore.  But as you glance at these screens, you might want to ask yourself: are they looking back?

It’s not a sci-fi scenario anymore.  For the past few years, digital signage vendors have quietly integrated AI-powered cameras and sensors into displays that can detect and analyze who is standing nearby.  These systems can estimate age, gender, how many people are watching, what part of the screen they’re focused on, what kind of clothes they’re wearing – even their emotional reaction.  In some cases, the signage adapts its content in real time to serve up ads that it believes match the viewer profile.  It’s advertising that’s not just targeted – it’s reactive.
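
To make the mechanics concrete, here is a minimal sketch in Python of what such a pipeline might look like.  It uses OpenCV’s stock face detector; the viewer-profile model and the ad inventory are hypothetical placeholders, not any vendor’s actual system.  Note what is absent: no consent check appears anywhere in the loop.

import cv2  # OpenCV ships with a stock Haar-cascade face detector

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

# Hypothetical ad inventory, keyed by an estimated audience segment.
AD_INVENTORY = {
    "young_adult": "sneaker_promo.mp4",
    "adult": "phone_upgrade.mp4",
    "default": "brand_loop.mp4",
}

def estimate_segment(face_crop) -> str:
    # Placeholder for a vendor's age/gender/emotion classifier.
    # Real deployments run a trained model here; this stub just
    # returns the generic segment.
    return "default"

def choose_ad(frame) -> str:
    # Detect any faces near the display and pick content to match.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return AD_INVENTORY["default"]  # nobody watching: play the generic loop
    x, y, w, h = faces[0]  # profile the first face found
    segment = estimate_segment(frame[y:y + h, x:x + w])
    return AD_INVENTORY.get(segment, AD_INVENTORY["default"])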

This raises a series of thorny questions that society is only just beginning to grapple with.

If you’re walking through a public space, have you consented to being scanned, analyzed, and databased by a retail display?  Should you have to?  And if your image or biometric data is captured and processed – who owns it?

The Kiss Cam and the Fine Print

We got a fresh reminder of how complicated this issue has become with the recent “kiss cam” controversy at a Coldplay concert.  A couple’s uncomfortable moment was broadcast to thousands of spectators and then circulated online, raising serious questions about consent and the use of personal imagery.

As a lifelong Mets baseball fan, I was immediately reminded of the fine print on the back of a baseball ticket – the legalese that grants the team rights to your image while you’re in the stadium.  By entering, you agree to be filmed, photographed, and possibly broadcast.  That clause may seem buried, but at least it exists.

[Image: the back of a 2018 Mets baseball ticket]

Today, though, most of us don’t carry printed tickets for events.  We scan a barcode from our mobile device.  Is that same legal release embedded in the digital transaction?  Do we know?  Is it displayed?  Assumed?  Ignored?  And if a baseball team feels obligated to spell out the rights they’re claiming, does that imply that other events – or, say, a retail store using facial recognition – don’t have that right unless they explicitly state it?

We are quickly heading into a world where the boundaries of consent are blurred beyond recognition.  At what point does capturing your image become an act of surveillance rather than customer engagement?

Your Data, Someone Else’s Asset

This isn’t just about faces on a billboard.  It extends to our voices, our DNA, and our digital behaviors.  Just look at the current bankruptcy proceedings for 23andMe.  The court allowed third parties to bid for the company’s user database – a trove of personal genetic data collected from people who, in many cases, were simply curious about their ancestry.

I was never foolish enough to submit my DNA to a service like that.  The idea that my biological identity could be stored in a for-profit company’s cloud server, repackaged, and resold?  Hard pass.  But now the possibility that this data could change hands entirely – to bidders with unknown intentions – is exactly the nightmare scenario privacy advocates warned about.  And here we are.

In a political climate less fractured and distracted, this would be headline news.  We’d be talking about digital and biometric ownership in congressional hearings, not just tech blogs.  Instead, it’s been quietly happening while everyone’s distracted by… well, you know.

Office Surveillance, by Default

Even in the professional world, the push toward capturing and commodifying our biometric data is accelerating.  Modern collaboration platforms – Microsoft Teams, Zoom, and others – increasingly rely on facial and voice recognition to “enhance” the meeting experience.  The goal is to automatically assign quotes and action items to the correct person in transcripts and summaries.

That used to require a user to opt in.  Now, it’s often the default.  You’re assumed to be OK with it unless you take action to opt out – assuming that’s even possible in your organization.
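
To see the layers involved, here is a hypothetical sketch in Python of how a platform might wire this up: an organization-wide setting that enables the feature, and a per-user flag whose default value decides whether participation is opt-in or opt-out.  The names and structure are illustrative only – not Teams’, Zoom’s, or anyone else’s actual implementation.

from dataclasses import dataclass

@dataclass
class TenantPolicy:
    # The org-level setting that enables the feature for everyone.
    biometrics_enabled: bool = True

@dataclass
class UserSetting:
    name: str
    # Whether this starts as False (opt-in) or True (opt-out) is
    # exactly the kind of quiet default shift described above.
    biometrics_consent: bool = True

def attribute_speaker(policy, user, voice_match):
    # Label a transcript line.  The biometric match is used only when
    # both the tenant policy and the user's consent flag allow it.
    if policy.biometrics_enabled and user.biometrics_consent and voice_match:
        return voice_match
    return "Unknown speaker"  # fall back to an anonymous label

# With a default-on consent flag, a user who never chose anything
# is attributed by name; only an explicit opt-out prevents it:
policy = TenantPolicy()
print(attribute_speaker(policy, UserSetting("Alex"), "Alex"))  # Alex
print(attribute_speaker(policy, UserSetting("Alex", biometrics_consent=False), "Alex"))  # Unknown speaker

The technical part is easy; the hard part is everything surrounding that second flag.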

But opting out isn’t really a neutral act, is it?  In a large enterprise, declining to participate in a platform’s data ecosystem might get you labeled as a non-team player, someone resistant to change.  That’s not just a UX flaw – it’s the very definition of an implied contract made under duress.  “Sure, you can opt out… but don’t expect that choice to be without consequences.”

These systems are being implemented under the banner of productivity and innovation, but what they really do is normalize a deeper form of surveillance – one that quietly shifts the balance of control away from individuals and toward institutions that are eager to monetize every aspect of human behavior.

It’s Time for a Digital Bill of Rights

We need to start asking tougher questions – not just about whether someone can capture your digital likeness, but whether they should.  And under what terms?

Is there any scenario where a retail display should be allowed to analyze our face without our consent?  Should enterprise collaboration tools be required to provide meaningful opt-out mechanisms without fear of retribution?  Should biometric databases – whether built from voice, face, or DNA – be treated as publicly tradable assets?

Right now, the answer to all of those questions, sadly, seems to be a shrug.  But shrugging our way into a world where our likenesses, voices, and even genetic codes are treated as corporate commodities is not a sustainable strategy.

If we don’t address the issue of biometric and digital identity ownership soon, we may look back at kiss cams as quaint.  The real danger won’t be public embarrassment – it will be total loss of control over who we are, how we’re represented, and who profits from it.

Because the next time you look at a screen, it won’t just be showing you an ad.  It’ll be building a profile, making decisions, and – most troubling – assuming it has every right to do so.

=========================================================

This article was written by David Danto and contains solely his own personal opinions. David has over four decades of experience providing problem-solving leadership and innovation in media and unified communications technologies for various firms in the corporate, broadcasting, and academic worlds, including AT&T, Bloomberg LP, FNN, Morgan Stanley, NYU, Lehman Brothers, and JP Morgan Chase. He is a Principal Analyst at TalkingPointz and can be reached at DDanto@talkingpointz.com.

Comments

Ilya Bukshteyn

Corporate VP, Microsoft Teams Calling, Devices, and Premium Experiences

2w

David Danto I can’t speak for Zoom, but in Teams a user always has to opt in for audio and/or facial biometrics. There was a tenant-level policy (to enable users to opt in) that was changed from default off to default on, but any one particular user still always has to opt in.

Lofty Whitaker

Regional Sales Manager East at Audix Microphones

2w

So what would opting out of this situation look like, and how would the machine recognize that you have opted out? There are certain patterns you can wear that scramble facial recognition, for example. (Thanks for the reminder, Terri.) Maybe wearing that pattern somewhere on our clothing could indicate we don’t approve of being analyzed or profiled. Then what does that look like? Does the machine blank out your image, leaving a digital ghost? Lastly, how would we know whether the machine actually opted you out, and how much trust would we have in that? As usual, David, you raise more questions than answers, and weighty questions they are. Let’s hope everyone joins the conversation.

Every time I walk past an interactive kiosk display in the city streets of New York or in a shopping mall, I instinctively wonder whether there’s a camera looking back at me. I’d guess chances are there is.

Jonathan Grover

Product and Business Strategy Leader | Creating Superpowers for People and Products

3w

So many interesting themes unpacked here. What is our right to anonymity anymore? I believe that’s evolving, considering the digitally connected society we live in and the level of benefits we glean from it – benefits which, I know, we may not have explicitly asked for. Take, for (unfortunate) example, calling 911 without giving your location. Don’t you want them to be able to track you in order to help you? I love your notion of a Digital Bill of Rights – but I wonder if that is enough. Inasmuch as fine print and legalese and rights are supposed to be enforceable, the judge in the 23andMe case is allowing the sale of customer data on the grounds that it “involves a sale of customer data only in a technical sense.” What good are the rights if the courts decide not to enforce them? Still, all that said, I’d love for society to have a better codified framework for how things are done and what acceptable practice is – at a minimum, for transparency.
