FTC’s Brill sees consumer consent as key for health, finance apps

When normal people use a new app, they don’t wade through dense terms of service. Many just click “OK” and hope for the best. That might be fine for a game of Candy Crush, but it can be risky with apps that monitor things like your bank account or heartbeat.

On March 18, you can find out why from FTC Commissioner Julie Brill, a leading authority on privacy in the age of apps, and one of our guests at Structure Data in New York City.

Brill told me in Washington last week that her agency is concerned about gaps in existing privacy law, especially in how data is stored and sold.

“When it comes to hospitals, insurers and doctors, we have a law that’s well known and well used [i.e. HIPAA],” she said. “Outside of that, when it comes to health tech and wearables, there’s a lot of deeply sensitive information that can be analyzed.”

Brill pointed to an FTC study of 12 health and fitness apps released last spring. It showed how such apps can let personal health data, normally confined to the closed loops of the medical community, trickle out to analytics and advertising companies. Here’s an FTC slide that illustrates the point:

[FTC slide: data flows from health and fitness apps to analytics and advertising companies]

A similar information sprawl can occur with financial apps, which many consumers use to track spending or obtain rewards.

The result, Brill said, is that data gathered for one purpose, such as counting steps or tracking spending, can get used for another without the consumer’s knowledge. In a worst-case scenario, the data could become a means for insurance companies or employers to discriminate against those who have experienced health or financial trouble.

One way to prevent this, she said, involves improving the consent and transparency process for apps that deal in sensitive data, such as those that collect health or financial information, or precise geo-locations. In these cases, Brill sees a potential solution in encouraging app makers to obtain affirmative consent if they want to use a consumer’s data out of context.

“So if the consumer downloads an app to monitor some of her vital statistics, and the health information is being used to provide that information to the consumer herself – to monitor her weight or blood pressure  – that is part of the context that the consumer understands when she downloads the app, and affirmative consent is not needed,” Brill stated in a follow-up email. “However, if the company is going to share this health information with third parties, like advertising networks, data brokers or data analysts, then that is a collection and use that is outside the context of the relationship that the consumer understood, and affirmative consent should be required.”

The challenge, of course, is for the Federal Trade Commission to find a way to improve privacy protection without subjecting vibrant parts of the economy to pointless or burdensome regulations. Brill said she’s aware of this and, in any event, formal rules or laws (including a Privacy Act like that proposed last week by President Obama) may be a long time coming.

“I believe industry can do lots before any legislation happens,” she said. “Legislation will take a long time, and this industry is taking off — so if industry can do best practices, it will allow appropriate business practices to flourish.”

To hear more about how (and if) the FTC can find a practical way to protect consumers, come join us on March 18-19 at Structure Data, where you’ll meet other leaders of the data economy, including executives from Google, Twitter, BuzzFeed and Amazon.

An earlier version of this story misspelled HIPAA as HIPPA. It has since been corrected.

Here’s a draft of the consumer privacy “Bill of Rights” act Obama wants to pass

The White House has released a draft of the Consumer Privacy Bill of Rights Act of 2015. It outlines the steps companies need to take to tell people what data they’re collecting and what they’re doing with that information. It also suggests data opt-out options for identifying details like email addresses and passport numbers. The bill also mandates that companies give people information about how they store the data they collect, for how long, and how consumers can view those details.

Here are some of the key points:

  1. Employee data is excluded from these disclosure requirements, as is information collected to fend off a cybersecurity threat.
  2. Companies must tell people whom they can contact at the organization regarding any privacy questions.
  3. If a person withdraws his or her consent for data collection, the company has 45 days to delete the specific information on the user.
  4. Companies need to thoughtfully design their privacy notifications for users, considering everything from the size of the device displaying the notification to the timing when these notices appear.
  5. Companies must delete user data after it has fulfilled its purpose.
  6. Smaller businesses with five or fewer employees are exempt from these requirements.

There’s plenty more to parse in the 24-page document, which is chock-full of terms like “clear,” “transparent” and “individual control.” Some politicians have already spoken out against the draft, saying it puts consumers at risk by lessening, not increasing, their protections. The Hill reported that one Democratic official argued that the White House’s bill takes away some of the Federal Trade Commission’s power to prosecute companies that abuse consumer data.

This is an early draft that will go through several revisions before being voted on by the House and Senate. If passed, it will hold tech companies accountable for the way they treat consumer privacy. The FTC, state attorneys general, and people who use the technology will have the power to bring civil suits against organizations that violate the bill.

FTC shuts down “revenge porn” operation, but imposes no fine

In one of the most despicable scams on the internet, Colorado resident Craig Brittain ran a website that posted nude photographs of hundreds of women alongside their Facebook profiles and other personal information, then blackmailed the victims into paying bogus lawyer sites, controlled by Brittain himself, in the hope of expunging the photos.

On Thursday, the Federal Trade Commission announced it has shut down Brittain’s operation, including the nude photo site Isanybodydown.com, and ordered him to destroy the photos. And while Brittain is banned from running a so-called “revenge porn” site in the future, he will not face any penalty unless he disobeys the order.

According to the FTC, Brittain obtained the photographs in various ways, including posing as a woman on Craigslist, where he persuaded other women to trade photos. He also encouraged people to submit photos anonymously to his website, and offered “bounties” of $100 to those who could provide additional photos and other information, such as the women’s home town, that attested to their identities — increasing the chances his desperate victims would pay to get the photos removed.

The payment part of the scam, which amounted to blackmail, involved websites run by Brittain with names like “Takedown Hammer” and “Takedown Lawyer” that purported to help the victims. This arrangement resembles the “mug shot” operations, exposed by the New York Times, in which sleazy websites that embarrass people with mug shots work hand-in-glove with other sites that claim to restore reputations.

Brittain is hardly the first to be targeted by authorities for running a revenge porn operation. In late 2013, San Diego police arrested a man who ran a similar scam through a website called Ugotposted, and charged him with extortion.

Brittain does not face a similar fate as the San Diego man since, for now, the FTC lacks criminal powers or the ability to fine first-time offenders. (This might change under a new White House bill to give the agency more teeth — a topic we will be discussing with FTC Commissioner Julie Brill this March in New York at Gigaom’s Structure Data event).

Meanwhile, the debate over how to address revenge porn continues. The U.K. has passed a law to outlaw it, and activists like Charlotte Laws have called for such measures in the United States, though some worry about the implications for free speech.

Firms may face new $16,500 privacy fines under White House bill

Many consumers have grown resigned to the parade of privacy breaches that occur when apps or big tech companies like Facebook or Google misuse their personal data. These incidents typically result in a slap on the wrist for the offending company, but that could change under a new privacy law the White House is expected to propose next month.

According to Politico, which offers details from three Administration sources, the proposed law would work by strengthening data protection rules and by greatly increasing the power of the Federal Trade Commission to impose fines.

This last point is significant since the FTC, which is the country’s de facto privacy cop, is often incapable of meting out real punishments, even in the event of the most egregious privacy breaches. A recent example is a company that used a free flashlight app to steal personal data from 50 million Android users but avoided even a fine. As a result, newer companies like Snapchat may be tempted to play fast and loose with privacy, knowing there will be few consequences.

While the FTC has been able to punish repeat offenders, including Facebook and Google, through the use of 20-year consent decrees, the companies sometimes appear to treat such measures as just a cost of doing business. But under the new bill, the FTC would pack more of a punch, including a new power to impose fines of $16,500 per violation per day:

The agency, under the administration’s proposal, would gain the power to issue civil penalties against companies, sources said. Currently, the FTC can only levy fines when companies break existing privacy or security settlements with the agency. But the bill would empower the FTC to slap businesses with penalties of $16,500 per violation per day for breaking the law, one source indicated. Other portions of the bill would firm up the FTC’s legal authority over nonprofits and telecom companies.

If passed, the law would also reportedly expand the FTC’s oversight of data brokers and of emerging areas of tech like facial recognition software. It may also give consumers new power to learn what information internet companies possess about their personal lives, though it would not go as far as Europe’s controversial “right to be forgotten” law.

While the additional FTC powers may be welcomed by many consumers, the proposed law could prove contentious in Congress and with the increasingly powerful tech lobby. Opponents are likely to claim that stricter controls on data and privacy could inhibit innovation and risk entangling emerging industries in red tape.

To learn more, and to hear directly from the FTC, come join Gigaom at Structure Data in New York City on March 18, where I’ll be speaking with FTC Commissioner Julie Brill.

Yelp says FTC investigation shows its reviews are not fixed

Customer review site Yelp can be a source of stress for small business owners, who fear that a poor ranking or negative comment can sabotage sales. Some businesses even claim that Yelp plays on their fears by manipulating rankings in order to extort them into buying ads.

Yelp, however, has for years adamantly denied that it rigs its rankings. And on Tuesday, the company’s claim to integrity got a boost after the Federal Trade Commission reportedly concluded an investigation into Yelp’s recommendation software and sales practices, and decided to take no action:

“[T]he FTC recently concluded a deep inquiry into our business practices and informed us that it will not be taking any action against Yelp. The FTC looked into our recommendation software, what we say to businesses about it, what our salespeople say about our advertising programs, and how we ensure that our employees are not able to manipulate the ratings and reviews that we display on our platform,” Yelp wrote in a new blog post.

News of the investigation first surfaced last April when the Wall Street Journal reported that the FTC received more than 2,000 complaints about Yelp from businesses between 2008 and 2014. Yelp, however, points out that this number amounts to only a tiny fraction of the number of its overall listings, which cover everything from restaurants to barbers to airports.

Yelp also earned another victory in September when a federal appeals court in California threw out a class action suit brought by a dentist and others, which claimed Yelp had “extorted” them by tying ad purchases to favorable review placement. The court’s decision, however, was based on the grounds that Yelp has a legal right to engage in “hard bargaining” if it wishes; the court did not make a finding about the company’s actual business practices.

After the court ruling, Yelp described the complaints as “conspiracy theories” and pointed to a Harvard Business School study that concluded the company does not manipulate its reviews based on who advertises with it.

The FTC investigation into Yelp occurred at a time when social media has given customers an unprecedented degree of power to sound off about businesses. The phenomenon has even led some companies to insert non-disparagement clauses into their contracts; a hotel in New York, for instance, last year threatened to withhold $500 from a bride’s security deposit for every negative review posted online by her wedding guests.

Feds issue final order over Snapchat privacy incidents, but no fine

What do regulators do when a popular app deceives users about photos that “disappear forever” and scrapes their contact list without permission? Not much.

On Wednesday morning, the Federal Trade Commission announced the approval of a final order against Snapchat, a popular messaging app that lets people send text and images that disappear after a few seconds.

The service, which initially gained notoriety as a teen sexting app, landed in hot water this spring after it became apparent that users could deploy easy workarounds to capture permanent copies of the photos that were supposed to “disappear forever.”

According to an FTC complaint published this spring, Snapchat not only deceived users with such false marketing promises, but it also collected information from iPhone contact lists without permission, and employed lax security measures that exposed 4.6 million users to a data breach.

This week’s order serves to formally implement the terms of the settlement that were announced in March. Notably, the terms do not include any sort of financial repercussions for the company or its executives.

Instead, Snapchat’s punishment consists of a 20-year consent decree, which requires the company to comply with a series of obligations, including the implementation of a privacy program.

Such decrees, which the FTC also has in force against tech companies like Facebook and Google, provide the agency with a means to slap down harsher penalties, including multimillion dollar fines, in the event of future privacy breaches.

The downside of the Snapchat consent decree, however, is that it may reinforce perceptions among Silicon Valley startups that it’s okay to blow off privacy precautions, since little of consequence happens to first-time violators. Indeed, earlier this year, the maker of an Android flashlight app secretly recorded the location of 50 million people, but faced no fine from the FTC.

The original headline of this story used the word “breach.” That word has been changed to “incidents” in light of the fact that the primary issue in the FTC case — the capturing of photos — was facilitated by third party apps, rather than by overcoming Snapchat’s security measures.

FTC shuts down massive “PC cleaner” scam

Every year, tens of thousands of consumers spend millions of dollars on services that purport to “clean” their computers and protect them from spyware and malware. In reality, many of these services are elaborate scams involving fake tech support and sleazy telemarketers.