Regulators See Tech Tricks Hurting Consumers

Tech companies, subscription apps and e-commerce sites have for years used subtle design tricks to nudge people into decisions or purchases they might not otherwise have made. There’s even a name for the tactic: dark patterns.

Now a crackdown may be coming.

Members of Congress, consumer protection agencies, nonprofit watchdog groups and academic researchers have all announced plans this year to increase their scrutiny of dark patterns, laying the groundwork for possible legal action and promising to bring more clarity and fairness to the way people navigate the internet.

The tactics in question are probably familiar to anyone who has used a web browser or smartphone: the extra hoops to jump through to cancel a subscription, a confusingly worded question that results in an avalanche of spam emails, or a sign-up process that requires extra work to turn on privacy features.

“They’re trying to use emotional blackmail or coercive techniques to try and get you to do something,” said Harry Brignull, a London-based expert in online user experience design who coined the term “dark patterns” in 2010.

There is often money at stake, and the financial costs add up as more online services adopt subscription models. Last year, then-President Donald Trump’s re-election campaign was accused of tricking people into recurring weekly payments.

In other cases, heavy-handed design tactics carry intangible costs in privacy, time or sanity.

Last month, the Federal Trade Commission issued a warning that it would take a closer look, particularly at patterns that “trick or trap” consumers into subscription services. The commission and state agencies have brought many cases against subscription services and other companies in the past, but the FTC said enforcement hadn’t gone far enough.

“The number of ongoing cases and the high volume of complaints demonstrate that there is widespread and ongoing harm to consumers in the marketplace,” the FTC said in a 15-page statement. It said it receives thousands of complaints a year about tactics such as automatic renewals.

Many alleged dark patterns are already the subject of close scrutiny from other quarters, including class-action lawsuits in which the phrase “dark patterns” has come up recently in disputes over refunds or recurring subscriptions. In 2015, LinkedIn agreed to pay $13 million to settle a class-action lawsuit over the design of its sign-up process.

The FTC’s statement amounted to a wake-up call for companies that may now be engaged in similar practices. The commission has been studying the issue for months, and its new chairwoman, Lina Khan, who was appointed by President Joe Biden, has been a longtime critic of tech companies.

Brignull, who runs a Twitter account cataloging examples of dark patterns, said he believes regulatory action is needed because the problem will not resolve itself. Even when app or website designers are committed to being more ethical, they usually don’t have the final say in a business, he said.

“Having an appeal to ethics or self-regulation doesn’t work,” he said. “Organizations need to know where the guide rails are. Otherwise, they’ll just look at other successful businesses and say, ‘This is what we need to do.’”


A big question facing the FTC and other enforcement agencies: Where should they draw the line between a harmless nudge and a manipulative practice?

Part of the answer may be the money lost by consumers, as the FTC has focused on costly subscriptions rather than other forms of harm. But the commission also laid out rules that businesses should generally follow to stay on the right side of the law. Among them: Marketers must obtain “express informed consent” from consumers for subscriptions, including those that start out free and convert to paid plans.

The law firm Fenwick & West said in a note to clients after the FTC’s statement that businesses should review their existing websites and apps to make sure they’re compliant, and “revamp” them if necessary.

Another federal agency, the Securities and Exchange Commission, is examining a related issue in app design: the “gamification” of investing through design elements. In August, it asked for public comment on whether design choices encourage investors to trade more often, invest in different products or change their investment strategies.

Representative Sean Casten, D-Ill., who has called for further government study of the gamification of stock trading, previously urged the SEC to look at trading apps like Robinhood.

“Online trading platforms like Robinhood make money using the same psychological tricks that Silicon Valley originally developed to get us hooked on games like Candy Crush and Farmville,” he said in a press release.

Robinhood responded to the SEC with a 37-page public comment letter defending its practices.

“It’s important not to confuse gamification with simple, intuitive design,” Aparna Chennapragada, Robinhood’s chief product officer, said in a separate statement. “Our app has made investing more affordable and accessible, helping to remove barriers for a whole new generation of investors. We will continue to provide our customers with an exceptional user experience and look forward to working with the SEC on this matter.”

But the concern about potential dark patterns is broader than costly subscriptions or stock trading.

In California, government lawyers could use a state privacy law that took effect last year to crack down on apps or websites with manipulative designs, under regulations finalized by the state attorney general’s office in March.

The regulations deal with opting out of the sale of personal information, and they say the opt-out process must “require minimal steps.” A website also cannot use confusing language such as double negatives (“Don’t Not Sell My Personal Information”).

The California attorney general’s office said in a statement that it actively monitors compliance with the privacy law but had no public enforcement actions to share.

Lucy Bernholz, director of the Digital Civil Society Lab at Stanford University, said part of the reason there have been so few solutions to dark patterns is that there has not been enough research. But that, too, is changing.

In May, a coalition of nonprofits and researchers launched the Dark Patterns Tip Line to collect examples from frustrated consumers who felt pressured into making unwanted choices. Backers include Consumer Reports and the Electronic Frontier Foundation, and the tip line is hosted at Stanford.

It received 700 submissions after its initial marketing campaign, Bernholz said, and while the tips don’t go to law enforcement or necessarily lead to immediate solutions for consumers, the line will provide data and examples for research and teaching.

One of the questions the researchers might look at, she said, is whether different populations experience dark patterns more frequently – for example, whether online services that cater to low-income people are more likely to employ them.

“Are vulnerable populations targeted more often? Are they discriminatory?” she asked.

Change could come slowly, Bernholz said, but she was optimistic it would come.

“The more we can draw attention to the prevalence of dark patterns in everyday life and the harm they cause, the more we might actually start to see a change in people’s experiences when they sign up for something, and then a change that the big providers have to make,” she said.

