
Using Your Voice Against You

In his new book, Penn professor Joseph Turow warns about a growing smart tech industry determined to use our voices in ways that should alarm us. It’s not too late to take back control.


What if the smart speakers in our homes and cars began listening not only to our voice commands but to how we say them? Do your voice and syntax make you sound wealthy and educated? Maybe your voice’s plummy tones drip with status and could catapult you into a marketer’s exclusive target audience. Your voice and vocal patterns might score you some good deals from companies trying to attract and cater to you as a customer.

But what if your smart speaker analyzes your voice and concludes it sounds like it belongs to someone who is poor, or a certain ethnicity, or outside of a particular target demographic it privileges? What if your voice could give away your health status, age and size? Do you sound like you might have a heart condition or need to drop some pounds?




Imagine being turned down for a mortgage or a job, or having to pay a higher rate for health insurance, just because of the way a coder has correlated what your voice signals about wealth, status and physical attributes. Amazon and Google already hold numerous patents relating to voice profiling. How will they use them in the future with the personal voice data already in their possession?

Penn professor and author Joseph Turow explores these dystopian—though mostly theoretical at the moment—ideas in his most recent book, The Voice Catchers: How Marketers Listen In to Exploit Your Feelings, Your Privacy, and Your Wallet. In it, Turow fires up a figurative flare gun to put us on alert as to how our voices might be used against our own best interests in the future.

After all, Americans have already welcomed intimate surveillance into our homes, cars, and schools through hundreds of millions of smart speakers and phones running voice assistants like Amazon’s Alexa and Apple’s Siri. Turow says that while you’ve been giving voice commands to a “helpful voice assistant,” you’ve also been allowing a company to compile data points and amass a profile on you, to be used in ways we cannot yet predict. Turow is concerned with voice profiling technology that uses advanced machine learning and neural-network programs to analyze your value to those running these programs.
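To make that abstraction concrete, here is a minimal sketch of what such a pipeline could look like, written in Python with the librosa and scikit-learn libraries. Everything in it, from the acoustic features to the random training labels to the very idea of a “high-value customer” score, is an illustrative assumption, not a description of any company’s actual system.

    # Hypothetical illustration only: not Amazon's, Google's, or any
    # company's actual system. It shows the general shape of a
    # voice-profiling pipeline: summarize a clip as acoustic features,
    # then score the speaker with a classifier trained on labels the
    # speaker never sees.
    import numpy as np
    import librosa  # audio analysis library, used for feature extraction
    from sklearn.ensemble import RandomForestClassifier

    def voice_features(wav_path):
        """Summarize a voice clip as 13 mean MFCCs plus pitch mean and spread."""
        y, sr = librosa.load(wav_path, sr=16000)
        mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)  # rough timbre
        f0 = librosa.yin(y, fmin=50, fmax=400, sr=sr)       # pitch contour
        return np.concatenate([mfcc.mean(axis=1), [f0.mean(), f0.std()]])

    # Toy stand-in for a marketer's labeled voice corpus: 200 random
    # 15-dimensional feature vectors tagged "high-value" (1) or not (0).
    rng = np.random.default_rng(seed=0)
    X_train = rng.normal(size=(200, 15))
    y_train = rng.integers(0, 2, size=200)

    model = RandomForestClassifier(n_estimators=50).fit(X_train, y_train)
    # To score a caller (assuming a local file "caller.wav" exists):
    # score = model.predict_proba(voice_features("caller.wav").reshape(1, -1))[0, 1]

The unsettling part is not the math, which is ordinary classification, but the labels: the speaker never learns what categories their voice is being sorted into, which is precisely Turow’s point.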

To Turow, the most important questions for us as we face this nascent industry that is still in its “scale-building” phase—that is, the period when it works to normalize surveillance and integrate these devices into all aspects of our daily lives—are: Are we going to pump the brakes and take legislative or regulatory measures to protect our rights to our own biometric identities, the physical characteristics used to identify us—and that we can’t change the way we would a hacked password? Or will we shrug our shoulders and later regret being seduced by the convenience of what seemed harmless?

Turow, a professor at Penn’s Annenberg School for Communication, has spent decades studying the intersection of marketing, digital media and society; he has authored 12 books, edited five and written more than 160 articles on mass media industries. The Voice Catchers came as a natural outgrowth of his 2017 book The Aisles Have Eyes: How Retailers Track Your Shopping, Strip Your Privacy, and Define Your Power.

I spoke to him recently from his Bala Cynwyd home about what’s at stake with “voice profiling”—and what we can do to keep Big Tech from controlling another aspect of our lives in the not-too-distant future.

Sarah Jordan: What is your elevator pitch to convince people to proceed with more caution when using an Amazon Alexa or Google Assistant?

Joseph Turow: In my mind, what’s at stake is people giving up a relatively unchangeable part of their bodies to marketers without knowing what they will do with it. Will they use that information, or share it with other marketers, in ways that come back to bite you? There could be discriminatory activities taking place. For example, if you’re on a call with a contact center and you’re angry, you may not know that the center has computers analyzing your emotions in real time, weighing them against your importance to the company, linking them to a personality profile, and triaging you to a particular agent who is good at upselling your type of person. This is all going on behind the scenes, so there’s a kind of loss of control of our sense of being, our body, literally.
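To illustrate the kind of triage Turow is describing, here is a hypothetical Python sketch. Every score, threshold, and agent pool name in it is invented for illustration; no contact-center vendor’s actual logic is being shown.

    # Hypothetical sketch of contact-center call triage as Turow describes
    # it: a real-time emotion score and an estimated customer value decide
    # which agent pool answers. All names and thresholds are invented.
    from dataclasses import dataclass

    @dataclass
    class Caller:
        anger: float           # 0..1, output of a voice-emotion model
        lifetime_value: float  # dollars the company expects from this caller

    def route_call(c):
        """Return the agent pool for a caller (illustrative rules only)."""
        if c.anger > 0.7 and c.lifetime_value > 1000:
            return "retention specialists"   # valuable and angry: soothe
        if c.lifetime_value > 1000:
            return "upsell team"             # calm and valuable: pitch more
        if c.anger > 0.7:
            return "de-escalation queue"     # angry, low value: contain
        return "general queue"

    print(route_call(Caller(anger=0.8, lifetime_value=2500)))  # retention specialists

The routing logic itself is trivial; the substance, and the opacity Turow objects to, lies in the hidden scores feeding it.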

Do we really want companies to do voice profiling without our being able to talk back? To have our voice profile affect the kind of deals, discounts, and opportunities we are offered if we are perceived as more desirable by a set of data points? Once we accept that level of analytics, what happens when a company does marketing for political candidates and uses our voice in real time to figure out what kind of pitch they could give you? Or if governments use this information, can we cross the border based on how we sound?


SJ: The way the medical community has a consensus of concern about cloning with its potential for unknowable outcomes, there doesn’t seem to be the same concern about voice profiling and home surveillance and its potential for future unsavory outcomes. Somehow these devices have been normalized.

JT: During the time I was writing this book, I found no articles that went into any detail about the idea that voice profiling could be a problem during this scale-building phase. For example, there’s an Amazon patent for “voice-based determination” of physical characteristics of users that describes a scenario in which a woman walks into her apartment and says, Alexa, I’m hungry, give me a recipe. Alexa says, You sound like you have a cold. Do you want chicken soup? Would you like some cough drops? I can get them delivered to you in an hour. Amazon today owns a pharmacy. This is convenient, but only because Alexa is analyzing how you sound.

Companies know how to get people on board by making it easy and alluring to use new technology. Amazon’s Prime Day just passed, and in past years the company has sold Alexa (Echo) devices for as little as $15 apiece. These devices become a part of everyday living. That’s the seductive part, but Amazon, Google, customer contact firms, and others minimize the surveillance part, not disclosing what they will do with the information they collect.

SJ: In your book, you write about home builders that tout the desirability of “truly connected homes” with listening devices built discreetly into the walls, and the use of voice assistants in schools so young children become used to these devices. Are there other tech developments that have gotten your attention?

JT: Amazon’s Halo, which is an activity tracker like a Fitbit. I got one that I asked my wife to use, but she got annoyed by its “tone” tracker. It records what you say, among other things, and then tells you how healthy your relationships supposedly are based on how your voice sounds to your spouse, or boss, or friends. Yet I have never seen any information about Halo’s training set. [The initial set of data used to teach a program how to compute and process information.] So how does it support these claims? The only thing I can figure is that it’s compared with Alexa speech records. Amazon never makes clear how it arrives at its conclusions. I have no idea if it’s accurate because there’s no proof.

SJ: Is the problem with voice devices that we are agreeing to be surveilled and giving away freedoms with uninformed consent for the sake of convenience?

JT: Right. And we are opening ourselves up to discrimination. Consent can never be fully informed, because the companies themselves can’t articulate how the information they are collecting now will be used in the future.

If people mobilize now, there’s still an opportunity to take the proper precautions. The industry is not set in stone… yet. There needs to be a public discussion about these issues and how they’ll affect our future.

In research I’ve conducted with colleagues, we find that most Americans are resigned. That is, they would like to control the use of their information but don’t believe it will happen. We did major national surveys in 2015 and 2018 and found that 58 to 63 percent of Americans are resigned to the use of their information.

SJ: If you go into someone’s home that has a listening device, do you tell them why they should ditch it?

JT: I haven’t, but a woman who is a Duke Law professor read my book and commented on Goodreads that she asks people to shut off all devices when she goes into their homes. That may become normative behavior for some parts of the population.

SJ: Are there others who are taking this as seriously as you? Who else would need to “get it” to make appropriate safeguards?

JT: I’ve moved from the position that we have to educate people to realizing that’s impossible to do adequately. It’s not that we shouldn’t inform people about what’s going on, but I’ve spent so much of my life trying to understand this stuff; why would I expect regular people to know it? Increasingly, I think that reasonable regulation is the way to go. I’m arguing that voice profiling should be illegal. There’s no reason to cross that barrier of biometric profiling and allow this information to be exploited in the future by marketers, political campaigners, and frankly, even governments. There’s something here to worry about, and our information should be protected.

SJ: Why is Europe in the lead with implementing safer practices? What are they getting right?

JT: Europe has always believed that protecting data and privacy is a fundamental human right. That’s not the case in the United States. We look at it in a sectoral way, so there are rules about finance data, rules of use for kids under 13, rules under HIPAA. Some states have legislation to rein in biometric surveillance: Illinois, Texas, and Washington. Illinois’s 2008 Biometric Information Privacy Act is so far the only state law allowing private suits and recovery of damages for violations. That’s creating some blowback and a whole lot of legal issues. To my mind, what’s ideal is a federal law that outlaws biometric profiling in marketing and perhaps other areas of society.


SJ: What can we do here in the U.S.?

JT: It’s easy to say that people should not use devices sold by firms that don’t rule out profiling people’s voices, but unfortunately that would include some of the biggest tech names. Yet there are things we can and should do. As individuals, it’s important to let companies like Amazon, Google, Spotify, and even McDonald’s (which has begun testing voice profiling in Chicago drive-throughs) know via email and other communications that voice profiling is not, and will not be, OK—that their use of it diminishes their credibility and your trust in them. Collective action toward change is critical.

Contact your local and national political representatives and urge them to pass laws that outlaw voice profiling in marketing. Also, support advocacy groups such as Access Now and the Electronic Frontier Foundation that chastise companies publicly for bad behavior and push state and national legislators toward laws that protect people from intrusive technologies. Access Now, for example, recently petitioned Spotify to renounce a patent it received that helps it profile the voices of its app users by gender, age, and other characteristics. The more collective voices against voice profiling, the better.


Header image by tua ulamac / Flickr
