FTC’s Brill sees consumer consent as key for health, finance apps

When normal people use a new app, they rarely wade through the dense terms of service. Many just click “OK” and hope for the best. This might be fine for a game of Candy Crush, but it can be risky in the case of apps that monitor things like your bank account or heartbeat.

On March 18, you can find out why from FTC Commissioner Julie Brill, a leading authority on privacy in the age of apps, and one of our guests at Structure Data in New York City.

Brill told me in Washington last week that her agency is concerned about gaps in existing privacy law, especially in how data is stored and sold.

“When it comes to hospitals, insurers and doctors, we have a law that’s well known and well used [i.e. HIPAA],” she said. “Outside of that, when it comes to health tech and wearables, there’s a lot of deeply sensitive information that can be analyzed.”

Brill pointed to an FTC study of 12 health and fitness apps released last spring. It showed how personal health data, which is normally kept within the closed loops of the medical community, can trickle out through these apps to analytics and advertising companies. Here’s an FTC slide that illustrates the point:

[FTC slide: how data from health and fitness apps flows to analytics and advertising companies]

A similar information sprawl can occur with financial apps, which many consumers use to track spending or obtain rewards.

The result, Brill said, is that data gathered for one purpose, such as counting steps or tracking spending, can get used for another without the consumer’s knowledge. In a worst-case scenario, the data could become a means for insurance companies or employers to discriminate against those who have experienced health or financial trouble.

One way to prevent this, she said, involves improving the consent and transparency process for apps that deal in sensitive data, such as those that collect health or financial information, or precise geo-locations. In these cases, Brill sees a potential solution in encouraging app makers to obtain affirmative consent if they want to use a consumer’s data out of context.

“So if the consumer downloads an app to monitor some of her vital statistics, and the health information is being used to provide that information to the consumer herself – to monitor her weight or blood pressure  – that is part of the context that the consumer understands when she downloads the app, and affirmative consent is not needed,” Brill stated in a follow-up email. “However, if the company is going to share this health information with third parties, like advertising networks, data brokers or data analysts, then that is a collection and use that is outside the context of the relationship that the consumer understood, and affirmative consent should be required.”

The challenge, of course, is for the Federal Trade Commission to find a way to improve privacy protection without subjecting vibrant parts of the economy to pointless or burdensome regulations. Brill said she’s aware of this and, in any event, formal rules or laws (including a Privacy Act like that proposed last week by President Obama) may be a long time coming.

“I believe industry can do lots before any legislation happens,” she said. “Legislation will take a long time, and this industry is taking off — so if industry can do best practices, it will allow appropriate business practices to flourish.”

To hear more about how (and if) the FTC can find a practical way to protect consumers, come join us on March 18-19 at Structure Data, where you’ll meet other leaders of the data economy, including executives from Google, Twitter, BuzzFeed and Amazon.

An earlier version of this story misspelled HIPAA as HIPPA. It has since been corrected.

FTC shuts down “revenge porn” operation, but imposes no fine

In one of the most despicable scams on the internet, Colorado resident Craig Brittain ran a website that posted nude photographs of hundreds of women alongside their Facebook profiles and other personal information, then blackmailed the victims into paying bogus lawyer sites, also controlled by Brittain, in the hope of having the photos expunged.

On Thursday, the Federal Trade Commission announced it has shut down Brittain’s operation, including the nude photo site Isanybodydown.com, and ordered him to destroy the photos. And while Brittain is banned from running a so-called “revenge porn” site in the future, he will not face any penalty unless he disobeys the order.

According to the FTC, Brittain obtained the photographs in various ways, including posing as a woman on Craigslist to persuade women to trade photos. He also encouraged people to submit photos anonymously to his website, and offered “bounties” of $100 to those who could provide additional photos and other information, such as a woman’s hometown, that attested to her identity, increasing the chances that his desperate victims would pay to get the photos removed.

The payment part of the scam, which amounted to blackmail, involved websites run by Brittain with names like “Takedown Hammer” and “Takedown Lawyer” that purported to help the victims. This arrangement resembles the “mug shot” operations, exposed by the New York Times, in which sleazy websites that embarrass people with mug shots work hand-in-glove with other sites that claim to restore reputations.

Brittain is hardly the first to be targeted by authorities for running a revenge porn operation. In late 2013, San Diego police arrested a man who ran a similar scam through a website called Ugotposted, and charged him with extortion.

Brittain will not face a fate similar to that of the San Diego man because, for now, the FTC lacks criminal powers and the ability to fine first-time offenders. (This might change under a new White House bill to give the agency more teeth, a topic we will be discussing with FTC Commissioner Julie Brill this March in New York at Gigaom’s Structure Data event.)

Meanwhile, the debate over how to address revenge porn continues. The U.K. has passed a law to outlaw it, and activists like Charlotte Laws have called for such measures in the United States, though some worry about the implications for free speech.

Firms may face new $16,500 privacy fines under White House bill

Many consumers have grown resigned to the parade of privacy breaches that occur when apps or big tech companies like Facebook or Google misuse their personal data. These incidents typically result in a slap on the wrist for the offending company, but that could change under a new privacy law the White House is expected to propose next month.

According to Politico, which offers details from three Administration sources, the proposed law would work by strengthening data protection rules and by greatly increasing the power of the Federal Trade Commission to impose fines.

This last point is significant since the FTC, which is the country’s de facto privacy cop, is often incapable of meting out real punishments, even in the event of the most egregious privacy breaches. A recent example is a company that used a free flashlight app to harvest personal data from 50 million Android users, yet avoided even a fine. As a result, newer companies like Snapchat may be tempted to play fast and loose with privacy, knowing there will be few consequences.

While the FTC has been able to punish repeat offenders, including Facebook and Google, through the use of 20-year consent decrees, the companies sometimes appear to treat such measures as just a cost of doing business. But under the new bill, the FTC would pack more of a punch, including a new power to impose fines of $16,500 per violation per day:

The agency, under the administration’s proposal, would gain the power to issue civil penalties against companies, sources said. Currently, the FTC can only levy fines when companies break existing privacy or security settlements with the agency. But the bill would empower the FTC to slap businesses with penalties of $16,500 per violation per day for breaking the law, one source indicated. Other portions of the bill would firm up the FTC’s legal authority over nonprofits and telecom companies.

If passed, the law would also reportedly expand the FTC’s oversight of data brokers and of emerging areas of tech like facial recognition software. It may also give consumers new power to learn what information internet companies possess about their personal lives, though it would not go as far as Europe’s controversial “right to be forgotten” law.

While the FTC’s additional powers may be welcomed by many consumers, the proposed law could prove contentious in Congress and with the increasingly powerful tech lobby. Opponents are likely to claim that stricter controls on data and privacy could inhibit innovation and risk ensnaring emerging industries in red tape.

To learn more, and to hear directly from the FTC, come join Gigaom at Structure Data in New York City on March 18, where I’ll be speaking with FTC Commissioner Julie Brill.