Designing security into the internet of things

I never thought, as a product designer, that I’d be writing a piece about security and the internet of things, but it turns out to be useful to think about, and it shapes the way we design interactions for a connected world. For me this goes back to the Fitbit sex scandal where, given the opportunity, users were all too happy to tag their physical activity with a little too much information.

Fitbit (see disclosure) had assumed that transparency about one’s own data would make people responsible for it. No such luck, so they had to impose a “private by default” setting overnight. Understanding the privacy and security landscape we will encounter in the world of connected devices means rethinking our policies around both the data and the objects. Only then can we design interactions that make sense and tools people can use.

What I and many others are suggesting would probably extend existing Consumer Protection and Data Protection Acts around the world, because they often assume a straightforward publishing process: data gets published or posted. Yet in the internet of things, lots of different directions, platforms and owners may be involved in any given interaction. The firmware provider, the sensor manufacturer in charge of calibration, the app developers, the data centers, the API developers, social media sites and more will all play a role.

My efforts here can also be seen as adding to the recently published Bill of Rights from Limor Fried, the founder of Adafruit, on the same subject.

Let’s start with an example: a Wi-Fi-enabled scale. You weigh yourself with it and it posts your weight to a Twitter feed of your choice (private or public). We’re assuming there’s a Wi-Fi chip, a pressure sensor and maybe other sensors included in the scale for future use. There are software and firmware updates. There’s a cloud service where the data is stored. There’s an app that helps you keep track, and maybe that app has an API so you can get recommendations about dieting.


So in service design terms, the scale and resulting services have lots of potential “touch points.” But what happens when insurance companies start to follow the hashtag on Twitter and send you messages hoping you’ll sign up for their health insurance if you’re thin? Or, conversely, remind you that you are too fat for their current policy?

What if the Wi-Fi packets can be sniffed, so someone can ascertain whether you weigh too little to be a threat to a really bulky robber? What if brands start to sell you healthy salads, shakes and more based on the weight-loss trend in your tweets? Say a hacker sniffs the data packets sent by your scale and it turns out there are more sensors producing data that isn’t currently used (like a tiny speaker/mic), sensors that can tell when you’re at home.

What if there’s a database issue and you’re shown data online that doesn’t match what your product tells you? Who do you believe? Even though some of these scenarios are a little extreme, they paint a useful picture for a conversation about what can be done about data rights and who should do it. Here are some ideas:


  • Consumers should have the right to know what data is being collected about them and why.
  • Reasonable efforts should be made to protect confidentiality and privacy of the consumer.
  • Explicit permission should be granted by the consumer if a third party or service provider receives requests to de-anonymize the data set.

None of this exists right now. If I buy the scale, it’s pretty opaque why data is being gathered, apart from whatever mobile experience I may have. I don’t know what other sensors are there, pushing out data, but I should. This means we might end up with short URLs printed at the bottom of every connected object, pointing to the data being gathered by that particular object. A Data Collection Act. Something akin to the provenance and recycling signs on most plastic goods. This is also data someone might be able to “claim” the way someone can claim a Twitter account if they are the owner.
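
As a sketch of what such a short URL could resolve to, here is a hypothetical machine-readable manifest for the Wi-Fi scale. Every field name, value and URL here is invented for illustration; no such standard exists today:

```python
import json

# Hypothetical "Data Collection Act" manifest a short URL printed
# on the scale could resolve to. The schema is an assumption.
manifest = {
    "product": "wifi-scale",
    "manufacturer": "example.com",
    "sensors": [
        {"type": "pressure", "purpose": "weight measurement", "active": True},
        {"type": "microphone", "purpose": "unused / reserved", "active": False},
    ],
    "data_collected": ["weight", "timestamp"],
    "shared_with": ["cloud-storage", "twitter (opt-in)"],
    "claim_url": "https://example.com/claim/SERIAL",  # owner can claim the data
}

# Machine-readable for services, human-readable when pretty-printed.
print(json.dumps(manifest, indent=2))
```

The point of the `active: False` entry is exactly the scenario above: dormant sensors would be declared rather than discovered by packet sniffing.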



  • Consumers must be granted license to any machine-generated data that is created, collected or otherwise generated that relates to them.
  • Service providers should inform data subjects that deleting all copies of data may be technically unfeasible once published.
  • Where data is collected from public space, consumers and service providers should have a role in decision-making and governance.
  • Consumers should have the right to remain anonymous, and/or have the ability to license data on an anonymous basis and/or at a different granularity/resolution (e.g. temporal or spatial).
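
To make the granularity point concrete, a data subject could license weekly averages instead of individual weigh-ins. A minimal sketch, with invented readings, of what reducing temporal resolution looks like:

```python
from collections import defaultdict
from datetime import date

# Invented weigh-in data: (date, kg)
readings = [
    (date(2013, 10, 1), 80.2),
    (date(2013, 10, 2), 80.0),
    (date(2013, 10, 8), 79.4),
    (date(2013, 10, 9), 79.6),
]

# Reduce temporal resolution: one averaged value per ISO week,
# so individual daily weigh-ins are never shared.
weekly = defaultdict(list)
for day, kg in readings:
    weekly[day.isocalendar()[1]].append(kg)

coarse = {week: round(sum(v) / len(v), 1) for week, v in weekly.items()}
print(coarse)  # weekly averages only, not per-visit data
```

The same idea applies spatially: a city-block centroid instead of exact coordinates.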

This is very much part of the conversation about the value of data. Like selling your home-generated energy back to the grid. If my scale is helping a company understand weight fluctuations across the year and inform the sizes that should be stocked in stores, it should be both transparent but also possibly remunerative to the original producer of the data.


These points are also about engaging in a conversation about the public/private nature of data. If I take my weight data to the gym, can the gym use it for its own analytics about how much progress people make at this gym? If my city scale can weigh me and connect to the same data, can it send me Foursquare recommendations via Twitter because it knows which restaurants have more salads around me now that it knows where I am?

  • Service providers should clearly publish the relationship between the data and the sensors, as well as links to any APIs they and others develop.
  • Service providers and sensor manufacturers should publish, in a machine- and human-readable form, a link to their security and risk assessments.

This is interesting and challenging. At the moment, there are limited engagements between data providers and people building services on top of the APIs they’ve exposed. Transport for London’s API, when it was published, helped create great games like Chromaroma, which you played with your Oyster card, but every API change meant the game’s designers had to constantly keep up. Transport for London itself doesn’t have a good understanding of its own impact on the app ecology and, from what I hear, the most important effect of opening up the API was that traffic to its website halved. I’m certain there are better ways to create value for everyone when the data relationships are made transparent.

Building business models around data

With a lot of data comes a big opportunity, but also a lot of responsibility. As a final thought we should discuss how to release data, when it’s the foundation of your business.

Data owners should release data, once it is requested:

  • without imposed delay, based on the accessibility principle above;
  • at the resolution at which it has been acquired;
  • to the data subject for as long as the provider hosts the data, and for at least a pre-agreed duration of time;
  • on an anonymous basis if so requested by the data subject.
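
The release rules above could be sketched as a single export function. The record shape, field names and `anonymous` flag are all assumptions for illustration:

```python
# Sketch of a data-release routine honouring the rules above:
# full acquisition resolution by default, direct identifiers
# stripped when the data subject asks for an anonymous release.
def release(records, anonymous=False):
    out = []
    for r in records:
        row = dict(r)  # keep every field at the resolution acquired
        if anonymous:
            row.pop("user_id", None)  # strip the direct identifier
        out.append(row)
    return out

records = [{"user_id": "u42", "weight_kg": 80.2, "ts": "2013-10-01T08:00:00Z"}]
print(release(records, anonymous=True))
```

Note what this sketch does not solve: delay, hosting duration and re-identification risk are policy questions, not a `pop()` call.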

This has huge repercussions for the world of consumer electronics and big data providers. Imagine the data requirements of connected razor blades around the world if shavers become Wi-Fi-connected. You could re-order blades automatically, but the manufacturer would have to store, and give you access to, the data it mines for the life of the company, not just the life of the product.


That data storage cost will have to be either passed on to the customer through a subscription model (get blades auto-magically when you run out) or recouped by making the shaver much more expensive, which will work for the higher end of the market (just look at connected cars and high-end fridges). Companies will essentially be faced with a choice: make their data accessible at a cost to the consumer, or absorb the cost themselves to become more competitive.

When these principles are applied practically, they have a direct impact not only on how software is developed, but also on how hardware is introduced into spaces, how packaging indicates that “data will be collected, see this URL,” how APIs are developed and how we sell the idea of connectivity as a useful, safe technology to the end consumer.

There’s a lot to design into a safe and secure connected product, but it’s a great, meaty problem that both technologists and designers should tackle head on. Not only do we need to deal with a world where the NSA/GCHQ is reading our emails, but we need to build in agency to empower people to understand and act on the data cloud they produce around themselves. There has to be a valuable, well-designed experience there; otherwise, as with the smart meter debate taking place in the UK, fear-mongering will prevent it from moving forward.

Alexandra Deschamps-Sonsino is one of the partners on the Eyehub project and the founder of the Good Night Lamp. This essay is part of a package for our Mobilize conference Oct 16th and 17th in San Francisco.

Disclosure: Fitbit is backed by True Ventures, a venture capital firm that is an investor in the parent company of this blog, Giga Omni Media. Om Malik, founder of Giga Omni Media, is also a venture partner at True.