It was the Three-Course Dinner Gum that served as Violet Beauregarde’s downfall at Willy Wonka’s Chocolate Factory and introduced multiple generations to the curious possibilities of food’s future. Now, more than fifty years since the publication of the Roald Dahl classic, we’re on the brink of innovations that might make twentieth-century fiction look more like a forecasting engine. As the way we cook, eat and interact with our food evolves, what does the future of eating look like?
Let’s start in the kitchen
Many an embroidered wall-hanging will tell you that the kitchen is the heart of the home. Today, that heart holds many possibilities for innovation, some of which are already in play. A growing number of smart refrigerators on the market offer touch-screen, wi-fi-enabled doors: yes, you can watch cat videos, but you can also check how many eggs you have in stock while you’re at the market.
Similarly, wi-fi oven ranges make it possible to adjust oven temperatures from afar and check whether you left your burners on after leaving the house. The connectivity plays out in a few different ways: some appliances connect to your smartphone, many hook up with smart home systems or digital assistants (see Whirlpool and Nest, or GE and Alexa), and others plug into their own smart home platforms (see Samsung’s SmartThings Hub).
But if you’re not ready to invest in new built-in appliances, there are other entry points to smarter cooking. Cuciniale, for example, promises to deliver perfectly cooked meats by connecting your steak to your smartphone through its multisensor probe. The June Intelligent Oven also uses sensors to improve timing and preparation, and can even recognize what food it’s cooking.
These (as well as the bigger appliances) have the appeal of ease and convenience and may also elevate our cooking skills much in the same way digital has improved our photography. (Think of “seared” as a filter you can simply tap to apply to your tuna.)
Those holding out for a fully hands-off solution might find projects like UK startup Moley Robotics’ robotic kitchen of interest. Moley offers a pair of wall-embedded arms that can prepare and cook your meals. (No indication if it also does dishes.) Meanwhile, thanks to artificial intelligence, robots are learning how to cook the same way many humans pick up tips: through YouTube. It’s all quite compelling, though, for now at least, it’s still more convenient to just order a pizza.
What about the actual food?
A more savory aspect of the future of food is, naturally, the food itself. One fairly easy trend to identify is the move toward more health-conscious eating—there are plenty of studies to support this, but you really only need to see that McDonald’s sells apple slices for confirmation. Technology is ready to enable this trend, with apps that offer calorie counts from pictures of food and devices like Nima that scan food for gluten and other allergens.
In a way that mirrors the fragmenting of media experiences, we’re also moving toward an era of more customized meals. That’s not simply Ethan-won’t-eat-zucchini-so-make-him-a-hot-dog-customization, but rather food that is developed to mirror our specific preferences, adjust to allergies and even address specific nutritional deficiencies. Success here relies on access to dietary insights, be it through logged historical eating patterns, blood levels and/or gut microbiome data. (New York Magazine has an interesting piece on the use of microbiome data to create your own personal food algorithm.)
And while it’s easy to imagine more personalized diets at home, we can count on technology to support that same customized approach while we’re eating out. Increasingly, restaurants like Chili’s, Applebee’s, Olive Garden and Buffalo Wild Wings are introducing tableside tablets to increase efficiency and accuracy in orders and payments. As restaurant-goers take more control of how food is ordered, expect more customization in what is ordered.
Are we redefining food?
Given the rise of allergies and food intolerance, it’s not difficult to imagine a world of highly-customized eating. More unexpected in the evolution of eating is the work being done in neurogastronomy. This is a field that is approaching flavor from a neural level—actually rewiring the brain to change our perception of taste. In other words, neurogastronomy could make a rice cake register as delicious as ice cream cake. By fundamentally changing the types of food from which we derive pleasure, neurogastronomy could essentially trick us into healthier eating.
Then there is the emerging camp that eschews eating in favor of more efficient food alternatives. Products like the provocatively-named Soylent and the much-humbler-sounding Schmilk offer a minimalist approach to nutrition (underscored by minimalist packaging), sort of like Marie Kondo for your diet. While this level of efficiency may have appeal in today’s cult of busy-ness, there is something bittersweet about stripping food to the bare nutritional essentials, like eliminating the art of conversation in favor of plain, cold communication.
Another entry from the takes-some-time-to-get-used-to department comes from a team of Danish researchers. With the goal of addressing the costly challenge of food storage in space, CosmoCrops is working on a way to 3D-print food. There are already a number of products available that offer 3D-printed food (check out this Business Insider article for some cool food sculptures), but CosmoCrops is unique in its aim to reduce storage needs by printing food from bacteria. To that end, they are developing a ‘super-bacterium’ that can survive in space. (What could possibly go wrong?)
Where is the opportunity?
It’s probably too soon to tell if we’ll be more likely to nosh on bacteria burgers or pop nutritional powder pills come 2050. What is easier to digest today is the fact that connectivity is coming to eating. For the home kitchen, it won’t happen immediately—the turnover for built-in appliances isn’t as quick as, say, televisions and costs are still high. This means there’s still time for the contenders, both the appliance builders and the smart technology providers, to figure out which features will tip the kitchen in their favor.
From a dietary perspective, there is an opportunity in bridging the gap between our diet and technology. Restaurants will want to explore how to use technology to support more customized food preferences, but the broader question may be what will make it possible—and acceptable, in terms of privacy—to analyze personal data in order to develop meals that align with our unique food preferences as well as our specific nutritional needs? Maybe it’s a wearable that links your gut bacteria to ingredients stocked in the fridge, a toothbrush that reads your saliva, or (to really close the loop) the diagnostic toilet.
With innovation happening on many tracks, the possibilities for our future cooking and eating are both broad and captivating. What will lunch look like in the next fifty, twenty, or even ten years? To borrow from Willy Wonka (who actually borrowed from Oscar Wilde): “The suspense is terrible. I hope it’ll last.”

What’s a Store For?

The first e-commerce transaction—a music CD, pizza, or weed, depending on who you ask—took place around thirty years ago. That means the first truly native e-commerce generation is now in charge of its own foot traffic and armed with at least one device that spares it the trouble of leaving the house. This, paired with the broader shift in consumer behavior across all generations, means brick and mortars need to find new ways to compete with digital to inspire visits and sales. Stores are evolving and, along the way, challenging the very notion of what a store is for.
Up against digital
A big part of brick and mortar’s evolution is digital integration. Today, retailers are working to enhance and personalize customer experience by connecting to consumers in-store through their mobile devices—building apps, targeting ads, and using beacons. You can find many examples of digital integration today, though online retailer Rebecca Minkoff’s flagship store in New York offers one of the more comprehensive ones; its interactive wall and dressing rooms have been credited with tripling expected clothing sales. Timberland also just launched its first connected store while Nordstrom’s commitment to digital integration has been credited with 50% growth in revenue over 5 years. (They just hired a former Amazon exec to serve as CTO.) Target, too, is getting into the mix, launching an LA25 initiative where it’s testing 50 of its top enhancements in 25 Los Angeles stores.
The IRL advantage
But digital integration is not the only strategy; retailers can also draw on the in-real-life (IRL) advantages of the physical space. Immediacy is one, with more retailers enabling online ordering and pick-up in store or curbside. It’s competitive because few exclusively online retailers can offer this instant gratification, but it is not necessarily a long-term strategy given that online fulfillment will continue to evolve and speed up.
More effective is the opportunity to build community. Oftentimes, this comes in the form of caffeine; Barnes & Noble was an early innovator here, adding a Starbucks to a New Jersey store back in 1993. Since then, many retailers have adopted or tested in-store cafes, including Urban Outfitters, Target, Restoration Hardware, and Kohl’s. Along the same lines, Target, Whole Foods, and Nordstrom, among others, are offering cocktails in some stores. When trying to attract customers and increase dwell time, there’s an advantage in offering something that can’t be instantly downloaded, like coffee, booze, and yes, maybe even tattoos. (See Whole Foods.)
Meanwhile, another concept that keeps popping up is—ahem—the pop-up shop. The pop-up shop’s currency is urgency: if customers don’t come now, they risk missing out forever. Bloomingdale’s is hosting a pop-up inspired by the musical Hamilton, while Macy’s is bringing in pop-ups as part of the reinvention of its Brooklyn store. The pop-up also presents a low-risk testing ground for online retailers, one compelling example being Warby Parker’s touring store housed in a school bus.
But…is it a store?
As brick and mortar adapts, becoming more deeply integrated with digital, acting as a fulfillment center and expanding to offer drinks and other services, the classic definition of “store” begins to fragment. Already, the “store” has lost its longstanding position as the finale of the customer purchase funnel, in no small part because the purchase funnel itself is an antiquated concept. Savvy retailers and brands in general now think of the consumer experience as an ongoing loop, with consumers moving from digital to physical and back until, eventually, there may be no clear delineation between the two. This emphasis on the overall experience changes the expectations of stores. It also opens opportunities for more types of brands to invest in physical locations.
For example, last year, waits of more than an hour formed at the Museum of Feelings in downtown New York City. The museum invited visitors to walk through a sensory presentation of each feeling: Optimism, Joy, Invigorated, Exhilarated and Calm, while its exterior changed color to reflect the social mood of New York. You might argue that this wasn’t actually a store, but then it wasn’t actually a museum either; the Museum of Feelings was a branded retail experience for Glade, generating buzz for an otherwise not-so-buzzed-about brand.
More recently, Samsung launched Samsung 837, a “first-of-its-kind cultural destination, digital playground and marketing center of excellence.” Samsung 837 serves as a showcase for innovation, offering what may be the first virtual reality experience for many visitors and providing Instagram-friendly experiences like the walk-through Social Media Gallery. But what’s unique about Samsung’s space is that nothing is sold there. It’s an experience—an opportunity for Samsung to tell its story and give visitors a way to get excited about the brand they’ll buy in the future.
In cases like these, brick and mortars serve as a marketing vehicle—an opportunity for brands to curate their own presence for customers, just as social media gave them a format for operating as media companies. It’s a trend that makes Amazon’s decision to open its own brick and mortars seem strategic. But is the return there?
It always comes back to data
The ability to more accurately track consumer activity gives brick and mortars a host of insights. Not only can the more connected store know what was purchased, it can also see which products compelled the most research and price comparisons, or inspired the most trips to the fitting room. It can engage with in-store customers via social media, encourage and measure posts from the store and, increasingly, tap into emotional analytics. Further, more sophisticated attribution measurement is making it possible to determine which investments drove traffic to the store, even without a purchase.
Though it would be inaccurate to suggest that traffic and sales aren’t still the key performance indicators for most stores, this broader set of data, if put to use, can help a retailer optimize beyond the limits of its four walls—especially critical at a time when stores are closing so rapidly that CNN wrote “Store Closings are the Hottest Trend in Retail.”
Where to go from here
Digital has an odd way of creating challenges and then presenting solutions for the challenges it creates. It offers a range of ways to add genuine value, from brand awareness to interaction, coupled with pop-up flexibility. If retailers are savvier about embracing this value, they’ll stand a better chance of attracting customers. If not, they’re not only missing out on opportunities in the near term, they’re limiting their future prospects for growth—after all, isn’t it a waste to see a store as a mere fulfillment outlet?

The Untapped Market for Wearables and Aging

Aging societies worldwide pose serious challenges to both healthcare delivery and how it is financed. The financial challenges fall not only on the healthcare system, but also on families and caregivers who spend significant time and money caring for the elderly. Carers UK, a non-profit organization, has estimated that over 2 million people have given up jobs and over 3 million have cut back hours of work in order to care for sick or disabled elderly relatives. Gallup estimates that lost productivity due to caregiving amounts to over $25 billion annually in the US alone.
These figures do not count the economic impact of chronic diseases and other issues that affect the elderly. Falls alone are estimated to cost nearly $15 billion/year and are a major cause of death due to sequelae (additional complications) after the fall.
While the headlines on wearables tend to focus on athletes and those who are already in good health, the business case for wearables that manage issues afflicting the elderly, coupled with the time and financial constraints on their caregivers, may be even more robust, provided the right solutions can be made available.
A brief look at the current market for wearables reveals that solutions for the elderly are beginning to enter the market. Personal emergency response systems (PERS), which allow the wearer to summon emergency assistance in the event of a fall, have been available for many years. These devices are worn on the wrist or around the neck and activated with a button. The next generation of PERS, such as Artemis or Amulyte, has broader applications for safety while also considering fashion and the use of smartphones.
Even more importantly, we need to harness sensors to predict the onset of conditions that may lead to falls and other health issues before they happen. AgeWell Biometrics offers a cloud-based platform that connects to a broad range of wearable devices and enables healthcare professionals to evaluate an individual’s stability and the risk of falls and other neurological issues.
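To make the idea concrete, here is a deliberately simplified sketch of how sensor data might feed a stability assessment. It is not how AgeWell Biometrics or any real platform works; the sway metric and threshold are invented for illustration.

```python
import statistics

# Hypothetical cut-off for postural sway variance; a real model would be
# clinically validated, not a single hand-picked number like this one.
SWAY_THRESHOLD = 0.05

def sway_score(samples: list[float]) -> float:
    """Variance of lateral acceleration while standing still:
    higher variance suggests a less stable posture."""
    return statistics.pvariance(samples)

def flag_fall_risk(samples: list[float]) -> bool:
    """Flag the wearer for follow-up when sway exceeds the threshold."""
    return sway_score(samples) > SWAY_THRESHOLD

steady = [0.01, 0.02, 0.01, 0.02, 0.01]      # small, consistent readings
unsteady = [0.30, -0.25, 0.40, -0.35, 0.20]  # large side-to-side swings

print(flag_fall_risk(steady))    # False
print(flag_fall_risk(unsteady))  # True
```

The value of such a score is less in any single reading than in its trend over weeks, which is exactly the kind of longitudinal data a cloud platform can surface to a clinician.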
In addition, the market for remote monitoring technologies that utilize sensors to measure cardiac function, track illness symptoms (e.g. Samsung Gear Smartwatches) or track location while safeguarding privacy (CarePredict) looks set to grow. In the next several years we expect to see many of the expensive medical devices found in intensive care units become wearable sensors worn on the wrist or in clothing.
This development will help health systems transition patients out of hospitals (where they can be at risk of infection) and into the home sooner, hopefully for a speedier recovery. Other applications focus on conditions such as Parkinson’s disease (treating hand tremors, for example) or on helping those suffering from Alzheimer’s (Smart Sole).
In order to realize the benefits of wearables for the elderly, we will need to improve how this data is captured and shared with the right healthcare provider and/or caregiver at the right time. However, advances made in recent years are on the right track to deliver solutions that will make a real difference, in terms of both cost and quality of life.
Photo credit: Neill Kumar

Mobile Security: putting the consumerisation genie back in the bottle

Since the arrival of the first consumer-bought smartphones, enterprise security has been under threat. That all-important chain of defense against security risks has been undermined by its weakest link: people, who in this case use non-standard devices to conduct business, making corporate data vulnerable to attack.
The alternative, rolling out company-issued mobile devices, has not been an easy path to follow. When historical market leader BlackBerry lost its leading position to Apple and Google’s Android, companies also lost much of their ability to control corporate messaging and applications from a central point.
From the perspective of the IT shop, the consequence has been fragmentation, which has undermined the ability to deliver a coherent response in security terms. While solutions such as Mobile Device Management have existed, they have been seen as onerous; also, some devices (in particular those based on Android) have been seen as less secure.
Looking more broadly, many organisations have ended up adopting an approach in which corporate devices are used alongside personal equipment for business use. The genie of consumerisation is out of the bottle, say the pundits. But now that devices exist that can deliver on an organisation’s mobile security needs, the question is: can it be put back?
The answer lies in addressing the source of the challenge, which is not the device but the person using it. Human beings assess risk all the time, and indeed, we are very good at it. In the case of a mobile device for example, we are prepared to put up with a small amount of discomfort if it will get us the result we want: sending a message, say.
If the discomfort is too great, we will assess other risks, such as, “What happens if I get caught using my personal phone?” If the answer is nothing, then the chances are that the behavior will continue. With this in mind, anyone deploying a mobile solution needs to consider two variables: the discomfort it causes, and the cost of avoiding the discomfort.
Considering the discomfort first, the point of any mobile solution is to enable productivity. Different security features — such as encrypted data storage, separation of apps and so on — may be applicable to different business scenarios.
Defining a solution appropriate for an organisation or group requires familiarity with the security features available on a device and the risks they mitigate. Better knowledge makes for more flexibility, reduced operational overhead and therefore increased probability of a successful deployment.
An equal partner to product knowledge should be an understanding of the organisation concerned, the data assets to be protected and what constitutes their acceptable use. If a policy is in place, this may need to be reviewed: note that it needs to be signed off at the top of the organisation to be effective.
Once a standard configuration has been defined, it will require testing. Too often, enterprise mobile security can fail “for want of a nail” — insufficient licenses on the RADIUS server for example, or lack of WiFi cover in areas where authentication takes place. Users need a solution that works from day one, or they will immediately lose confidence in it.
Putting all these measures in place can help minimize discomfort, but they need to go hand in hand with measures to ensure that the capabilities cannot be circumvented. Note that we are talking about the organisation’s most important asset — its people — who will respond far better to inclusionary tactics than to draconian ones.
As well as understanding why secure mobile working technologies are being deployed, employees need to know that they must act as a strong link in the chain, not a weak one. An Acceptable Use Policy should be enforceable, in that a staffer at any level will have their card marked if they attempt to circumvent it.
In addition, the genie should be given a clear timescale for getting back in the bottle. For example, in an ‘anything goes’ environment which mixes personal and corporate mobile equipment, individuals should be given a cut-off date following which corporate data access will only be possible via a secure device.
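The cut-off date amounts to a simple access rule, which can be sketched as below. This is an illustrative policy check, not the API of any real MDM product; the record fields, baseline criteria and dates are all hypothetical.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical device posture record; field names are illustrative.
@dataclass
class Device:
    owner: str
    managed: bool      # enrolled in the corporate device management system
    encrypted: bool    # storage encryption enabled
    os_patched: bool   # OS at or above the minimum patch level

CUTOFF = date(2016, 6, 1)  # example cut-off date announced to staff

def may_access_corporate_data(device: Device, today: date) -> bool:
    """Before the cut-off, any device may connect; after it, only
    devices meeting the security baseline are allowed through."""
    if today < CUTOFF:
        return True
    return device.managed and device.encrypted and device.os_patched

personal = Device("alice", managed=False, encrypted=True, os_patched=True)
corporate = Device("alice", managed=True, encrypted=True, os_patched=True)

print(may_access_corporate_data(personal, date(2016, 5, 1)))   # True: grace period
print(may_access_corporate_data(personal, date(2016, 7, 1)))   # False: cut-off passed
print(may_access_corporate_data(corporate, date(2016, 7, 1)))  # True: compliant device
```

Encoding the policy this way makes the transition transparent: everyone can see exactly which criteria their device must meet and by when.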
A final question is about sustainability: how to keep it all going? Reporting is important, with deprovisioning perhaps the most critical element — it is one thing to know that resources have been allocated to the right people, but more important still to know that any rights — and indeed devices — have been returned on change of role or exit from the company.
The bottom line, and the most fundamental challenge, is ensuring that any shiny new corporate devices deliver on what they are supposed to do — in this case, enabling mobile users to stay productive without compromising on corporate risk. Provide people with usable security they will not try to circumvent, and you avoid consigning devices to the desk drawer.
If you’re interested in improving your business’s mobile security operations, join us for our upcoming webinar: Evolving Enterprise Security for the Mobile-First World. This webinar is presented by GigaOm’s Jon Collins, with sponsorship by Samsung. Register now for the webinar taking place on Wednesday, March 9 from 1 to 2pm EST.

Virtual Reality: Just for fun? It won’t stay that way!

While the 1992 film Toys, starring the late, great Robin Williams, did not meet with universal acclaim (it registered a paltry 26% at the review site Rotten Tomatoes, despite receiving two Oscar nominations for its artistic merit), it contained at least one notable scene. This involved Williams, as toy designer Leslie Zevo, sitting on a sofa with his sister Alsatia (played by Joan Cusack) and wearing what looked like eye masks. As the pair rocked, screamed and waved their hands in the air it became clear that they were watching a roller coaster simulation.
Spot the date: over a quarter of a century has now passed since virtual reality (VR) headsets first entered the popular consciousness. A number of challenges have had to be overcome — not least insufficient screen resolution and movement tracking, both blamed for queasiness when headsets are worn.
But also, cost. I can remember, back then, considering the scenario of a young rebel on a city metro train, sporting cool-looking glasses that beamed images onto his retinas. Even if this were possible, it would be cost-prohibitive. But it is coming.
A mere three years have passed since Palmer Luckey first set himself the task of producing a low-cost VR headset. Following a luck(e)y break when he met John Carmack, creator of the seminal first-person shooter Doom, Luckey followed in the footsteps of so many entrepreneurs and left college to pursue his dreams.
The Kickstarter campaign for the Oculus Rift head-mounted display raised nearly $2.5m, and while the device is yet to be released, its technology is already built into Samsung Gear devices. Even as it arrives, however, the Rift is offering more potential than just viewing images and videos.
To understand why VR in general, and the Rift in particular, are set to be such game changers, we need to consider not just the headset but how it fits with a range of other technologies: augmented reality, for example, which links visuals with context-based data, or motion tracking from the likes of Leap or Kinect.
It’s not how any one technology delivers that matters; rather, it is how they can be used in combination. You can think of all the pieces as components of a new range of solutions, which have applicability in retail, in healthcare, in navigation, in geology and (as per Mr Williams) in film, media and all forms of entertainment.
Audi’s “world’s first fully digital car showroom,” based in London, is one example of how VR can benefit the retail experience. Audi integrated a Samsung Gear VR headset, immersing customers in a tour of the car’s features – you can even take a virtual test drive. While this set-up claims to be the first of its kind, it is unlikely to be the last. As such solutions become prevalent, and as an inevitable consequence of the laws of supply and demand, the components will also become cheaper even as they diversify in form and function.
What other applications might we see? We might see a resurgence in virtual worlds such as Second Life; more likely however is that VR will become part of our daily lives. As such it is important for any organisation to consider the implications, which can come from a number of directions. It may be that VR has applicability within the business — in R&D for example, or in data visualization. Equally, it has potential to change behaviors, for example in how people work and relate to their colleagues.
The bottom line is: today, for many, VR still looks like a fun gadget, and indeed it is, outside of certain domains. So have some of that fun — for a few hundred dollars, invest in some headsets and try them out. That way, when VR becomes more than fun, you will have a more solid perspective on how to integrate VR into your business strategies.

The Shape of Shopping in 2016: Holiday Shopping Trends Inform the Coming Year

We’ve made it through that time of year again, when consumers struggle to manage busy schedules, tightening budgets and the pressure to bolster holiday memories with the perfect gift. For the trend watchers among us, the holiday season also turns out to be the most wonderful time of year to gain some insight into current shopping behaviors. How were shoppers shopping this season, and what can that tell us about the year ahead?
A rise in online (and mobile) shopping
It’s not surprising that e-commerce is continuing to grow its share of the shopping experience, with mobile’s role increasing. According to the IBM US Retail Black Friday report, more traffic was generated from mobile devices than desktop on Black Friday, driving a 30% increase in sales via mobile devices compared to the previous year. Cyber Monday told a similar story; Adobe data showed that 49% of shopping visits could be attributed to mobile devices. This is both an opportunity and a challenge for retailers; they’ll need to ensure their properties are optimized to engage and convert mobile users while also exploring new incentives to drive traffic to brick and mortar locations.
Shopping starts earlier; fragments
And while Black Friday and Cyber Monday have long served as the groundhog-like indicator for holiday sales, there are signs of a shift. A Google/Ipsos Media CT study indicated that shoppers are starting earlier; in 2014, 61% began gift research before Halloween and 48% completed shopping by Cyber Monday. This may be a reflection of increasingly fragmented shopping driven by m-commerce; the same study showed that marathon shopping excursions are giving way to spare time shopping on mobile devices. In 2016, retailers will need to employ year-round top-of-mind strategies in order to capture more of those shopping moments.
What we’re buying
There was no shortage of buzzworthy gadgets in 2015, but did that buzz translate to sales? While there are some discrepancies in the estimates around Apple Watch’s overall sales, recent data from Best Buy stated that customers purchased twice as many wearables this year compared to the 2014 holiday season. That 100% increase is notable, but also suggests that the true Wearable Revolution has been postponed until at least 2016.

Meanwhile, other gadgets experienced a few bumps in the sleigh ride leading into the holiday season. Citing safety concerns, the Federal Aviation Administration established a registration policy for drone owners, with fines for non-compliance. Hoverboards (more accurately known as “self-balancing scooters”) aimed for explosive holiday growth, but some were removed from shelves as it became evident that they were at risk of, yes, exploding. Both still made a strong showing under the tree, though both Adobe and IBM’s data showed that Samsung TVs ranked as the most popular purchase on Black Friday 2015.
The wallet goes mobile, sort of
Consumers had multiple point-of-sale payment options to choose from this holiday season as the mobile wallet wars heated up. Despite the intensity of the battlefield, however, a study from InfoScout showed single-digit activity in this area. Credit cards, in comparison, accounted for 79% of in-store payments, making mobile payments another entry in the “maybe next year” category. Noting that younger generations will drive growth in this area, eMarketer projects a 210% increase in mobile payments in 2016.
Social commerce debuts
The pressure to monetize social media is on for both brands and platforms, and so 2015 saw a push for social commerce. Facebook, Twitter, Instagram and Pinterest each promoted “buy buttons” in 2015, designed to usher social networkers down the path to purchase. Still, while some brands and users experimented with the option, there was not yet a groundswell of activity in social commerce. Like mobile payments, the buy button needs a little more time to heat up.
This holiday season showed us that consumers are developing a mobile shopping habit, one that may have a measurable impact on brick and mortar traffic and sales in 2016, but other trends will take a little more time to hit the mainstream. For social commerce, it remains to be seen how easily users in a social mindset can be converted to buyers, but brands (and the platforms that support them) are motivated to monetize this critical channel. Mobile payments have the opportunity to gain a lot of ground in 2016, as long as providers continue to educate shoppers and address obstacles like security concerns. While change won’t happen overnight, it’s very likely that by this time next year we’ll see a notable increase in items (perhaps wearables and safer hoverboards) purchased in non-traditional ways.

This post is sponsored by Samsung. All thoughts and opinions are my own.

Thinking differently about digital signage

Screens, it seems, are everywhere these days. Printers and watches, tablets and smartphones all have high-resolution digital displays; front rooms and shared spaces across the globe are being furnished with increasingly large monitors. It’s a consequence of the relationship between increasing fabrication quality and falling prices, reducing what might be called the ‘threshold of suitability’ – simply put, the point at which deploying a screen becomes cost effective.
Unsurprisingly, this trend is affecting not only broadcast media but also retailers, sports and entertainment venues, hospitals and university campuses. Sometimes the need can be quite simple: consider, for example, the use of digital screens to show up-to-date bus times at transit points. Doing so saves a not-insignificant amount of rigmarole, from printing and distributing timetables to maintaining a list of what has been updated where.
Once such a facility is in place it can serve as a basis for additional features – in this case, pushing out service changes or problems, serving adverts, updating the bus location in real time or even reporting emergencies. A screen is also just one step away from being an interactive terminal, accessible via a smartphone, watch or other device.
Screens can scale up, and scale out. In terms of size the sky is the limit, as images can be projected onto entire buildings. With the right back-end infrastructure in place (i.e. systems that co-ordinate what is being displayed), screens can also be spread across a campus environment and updated, to all intents and purposes, simultaneously. In practice, this means that any surface, pillar or wall can be a location from which information can be shared.
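The co-ordination described above can be sketched in a few lines of Python. The controller below pushes content to a fleet of registered screens, either all at once or per location; the class and method names are illustrative inventions, not drawn from any particular signage product.

```python
from dataclasses import dataclass, field


@dataclass
class Screen:
    location: str
    content: str = ""


@dataclass
class SignageController:
    """Back-end that co-ordinates what a fleet of screens displays."""
    screens: dict = field(default_factory=dict)

    def register(self, screen_id: str, location: str) -> None:
        self.screens[screen_id] = Screen(location)

    def broadcast(self, content: str) -> None:
        # Push the same content to every screen, effectively simultaneously
        for screen in self.screens.values():
            screen.content = content

    def update_location(self, location: str, content: str) -> None:
        # Target only the screens at one place, e.g. a single bus stop
        for screen in self.screens.values():
            if screen.location == location:
                screen.content = content


controller = SignageController()
controller.register("s1", "north-campus")
controller.register("s2", "south-campus")
controller.broadcast("Route 42: next bus 10:05")
controller.update_location("south-campus", "Route 42: delayed 5 min")
```

A real deployment would deliver updates over the network rather than in memory, but the shape is the same: one source of truth, many addressable surfaces.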
It also means that organisations could benefit from thinking differently about digital signage. In general, screen and display acquisitions are considered on a capital or project basis – returning to the bus stop example, the shopping list probably said “screens to display timetables etc.” Given the broadening opportunity in terms of where to display, what to display and, indeed, how to interact, an advisable alternative is to broaden horizons in terms of what such deployments make possible.
In practice this starts with a quite simple thought experiment. Look at the environment around you and think, “What if every surface was able to deliver information, in any form? How would this change how I consumed information, if I were a student, a patient, a customer? What kind of relationship would I be able to have?” While this may seem a step too far (“All we wanted was a display”), it is not that different from the design of a space – architects do not consider the lowly footpath merely as a link between two points, for example, but as an opportunity to create new, unthought-of routes.
There remain numerous practicalities to keep in mind – connectivity, software integration and aspects such as screen brightness in certain locations, as well as cost-benefit analysis of the specific business case. But even if the outcome of a given deployment of digital signage is going to be relatively limited, it would be wise to include an element of up-front brainstorming in any discussion such that the full potential of any investment can be realised.
If you’re interested in learning more about the evolving digital signage industry, watch our first webinar sponsored by Samsung titled “Business on Display: Making a Statement with Digital Signage.”

Why the future of payments may lie deep in the past

It’s that time of year when we take stock, review where we are and try to work out where we are going. Of particular interest is what we term ‘payments’ — a simple term which belies its importance.
Every banking transaction is a ‘payment’ — nearly 400 billion of these were made globally in 2014. Big as that number appears, it is estimated to represent only 15% of all transactions worldwide, with the rest being cash-based.
While the past decade has been overshadowed by the near-death of global banking, the ways we give and receive money have undergone a quiet revolution, driven by so-called ‘Fintech’ companies which use leading edge technology to deliver lower-cost financial services.
As a consequence we have seen, for example, a rapid rise in peer-to-peer services. P2P lending organisations such as Lending Club are already moving beyond their social networking roots and adding stronger governance, linking high-net-worth lenders with lower-risk borrowers.
Meanwhile London-based Transferwise, which enables P2P currency transfers between individuals without using banks as intermediaries, has been valued at a billion dollars.
Payment mechanisms are also becoming more affordable and, therefore, more accessible. As we have seen, Near Field Communication (NFC) payments via cards and smartphones are giving people the confidence to let go of cash and accept mobile payments.
And in many countries, any small business can buy a card reader terminal for tens of dollars or less, from the likes of Paypal or Square, supporting chip card or NFC payments.
A third trend is the continued rise in popularity of cryptocurrencies such as Bitcoin. Thanks to their shared system of record, the blockchain, cryptocurrency payments can take place without involving banks, inviting new models for purchasing goods and services — not least music and the arts.
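For readers curious how a blockchain acts as a tamper-evident system of record, here is a deliberately simplified Python sketch (real cryptocurrencies add digital signatures, consensus and much more). Each block stores the hash of its predecessor, so altering any recorded payment invalidates the chain.

```python
import hashlib
import json


def make_block(payment: dict, prev_hash: str) -> dict:
    """A block records one payment plus the hash of the previous block."""
    body = {"payment": payment, "prev_hash": prev_hash}
    digest = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}


def verify(chain: list) -> bool:
    """Valid only if every link points back correctly and every hash matches."""
    for prev, block in zip(chain, chain[1:]):
        recomputed = hashlib.sha256(json.dumps(
            {"payment": block["payment"], "prev_hash": block["prev_hash"]},
            sort_keys=True).encode()).hexdigest()
        if block["prev_hash"] != prev["hash"] or block["hash"] != recomputed:
            return False
    return True


genesis = make_block({"from": "-", "to": "alice", "amount": 50}, prev_hash="0")
chain = [genesis,
         make_block({"from": "alice", "to": "bob", "amount": 0.001},
                    genesis["hash"])]
assert verify(chain)

chain[1]["payment"]["amount"] = 1000  # tampering breaks verification
assert not verify(chain)
```

The point is structural: because each record is cryptographically chained to the last, no central bank ledger is needed to detect after-the-fact alteration.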
So what does any of this mean? Most importantly, as costs reduce and confidence increases, so the lower limit on transaction size drops. We are unlikely to see the death of cash any time soon, but small change may become a thing of the past.
Acceptance of smaller electronic payments also encourages the rise of minimal-cost services (such as, for example, the individual play of a song). And while traditional cash has a minimum denomination, cryptocurrencies operate at sub-cent levels, further enabling micro-payments.
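The economics behind that lower limit on transaction size can be shown with a little arithmetic. The sketch below uses illustrative card-fee figures (a percentage plus a fixed per-transaction charge — not any provider’s actual rates) to show why a ten-cent purchase is uneconomic under conventional fees, but viable once the fixed fee approaches zero.

```python
def merchant_net(price: float, pct_fee: float = 0.029,
                 fixed_fee: float = 0.30) -> float:
    """What the merchant keeps after a percentage fee plus a fixed fee.

    Default figures are illustrative of typical card pricing only.
    """
    return price - (price * pct_fee + fixed_fee)


# A $0.10 song play actually loses the merchant money under card-style fees...
assert merchant_net(0.10) < 0

# ...but becomes viable once the fixed per-transaction fee approaches zero,
# which is the promise of sub-cent cryptocurrency payments.
assert merchant_net(0.10, pct_fee=0.01, fixed_fee=0.0) > 0
```

It is the fixed fee, not the percentage, that sets the floor: halve it and the smallest economic transaction roughly halves too.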
As well as seeing a growth in such services, we also need to watch out for the potential for micropayment misuse, for example money laundering or fraudulent provision — if a service is offered for a ha’penny and then fails to deliver, few would bother to complain.
We will also, inevitably, see the creation of new intermediaries which offer a layer of governance and trust on top of payment mechanisms. Many are vying for this position, from service providers and handset manufacturers to old and new financial institutions.
Indeed, a new ecosystem is already evolving. Santander, JPMorgan, UBS and Barclays are looking into Blockchain, and Transferwise is in talks with traditional banks about adding a P2P currency exchange feature to their mobile banking apps.
The consequence is an improved flow of money. This is both a blessing and a curse, as it yields market unpredictability, as various trading disasters in recent years serve to illustrate. Technology also brings its own risks — not least that it can be buggy or insecure.
Overall however, less friction makes for more efficiency, meaning that the payment mechanism reduces in importance compared to the products or services we are selling or buying.
Currency was never meant to become more important than the things that we do with it, but over the last couple of thousand years it has become so. While we may never go back to a system of barter, our experiment with currency may well be moving back to its proper place.

Striking a balance between security and productivity — an impossible dream?

Security and personal productivity do not make comfortable bedfellows. To understand why, we need to go to the roots of what business is all about.
Business can be measured in two ways: effectiveness and efficiency. Effectiveness refers to doing the right things, for example creating the right products and services, or closing the necessary number of sales.
Efficiency means doing things right, that is, achieving results without unnecessary overheads. In personal terms this equates to productivity — or, simply put, how much time is spent in a day achieving results, versus doing things with no apparent business value?
Security often appears to be the enemy of such personal productivity, creating what can seem to be unnecessary barriers to getting the job done. And sometimes it looks as if business effectiveness itself has been sacrificed on the altar of overbearing security.
I remember one organization I worked for, back in the day, that banned the use of floppy disks for file transfers. The rationale made sense from a security perspective — such transfers were a major contributor to the transmission of computer viruses. But for an administrative department whose business relied on file transfers, it meant that productivity took a major hit.
Things are so much more complicated these days, of course. Via the Internet, every computer is connected to every other; we have phones that can store large enough volumes of data to run entire companies.
And meanwhile, in infrastructure terms we see fragmentation at all levels. While the buzzword might be convergence, in reality this has led to a nightmare of integration work. We have created for ourselves a leaky bucket.
The security risks of such a complex environment are genuine and need to be addressed. But does this mean that we are doomed to becoming increasingly unproductive? Or is there an alternative answer which enables both security and productivity to be achieved?
There is an answer, but not necessarily where people might first look. This is not about striking some kind of arbitrary balance between making things secure and allowing people to be productive.
Rather, this is about being clear on what you want to secure. Technology, in all its complex and far-reaching glory, is a distraction from the main event — the information that it creates, processes and communicates.
It has been said that information is an organization’s most important asset. But not all information is created equal. Some is business-critical; some incorporates intellectual property; some is subject to compliance criteria.
Understanding what information you need, and why it matters, does not have to be an onerous task. While organizations may struggle to get on top of all of their ‘information assets’ they can nonetheless identify a core of information that is of particular importance to the business.
Such understanding goes a long way to creating a suitable response, as it enables the right trade-offs to be made. A clear example is in healthcare, where patient information absolutely should be subject to far more stringent criteria than, say, the menu in the staff canteen.
In security architecture speak, this is called ‘separation of concerns’. In layman’s terms, this means providing access mechanisms, policies and roles appropriate to the information.
It may be, for example, that customer information can only be accessed on personal mobile devices via a locked-down app. Or that certain systems can only be accessed via a virtual private network (VPN), accessible only to certain people.
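As a sketch of how such separation of concerns might look in code, the policy table below assigns each class of information its own access requirements. All the information classes, roles and rules here are hypothetical examples, chosen to mirror the healthcare illustration above.

```python
# Each information class carries its own controls; "separation of concerns"
# means the protection matches the data, rather than one blanket rule.
POLICIES = {
    "patient-record": {"requires_vpn": True, "allowed_roles": {"clinician"}},
    "customer-data":  {"requires_vpn": True, "allowed_roles": {"sales", "support"}},
    "canteen-menu":   {"requires_vpn": False, "allowed_roles": None},  # open to all
}


def can_access(info_class: str, role: str, on_vpn: bool) -> bool:
    """Apply the access policy for one class of information."""
    policy = POLICIES[info_class]
    if policy["requires_vpn"] and not on_vpn:
        return False
    allowed = policy["allowed_roles"]
    if allowed is not None and role not in allowed:
        return False
    return True


# The canteen menu is open to anyone, anywhere...
assert can_access("canteen-menu", role="visitor", on_vpn=False)
# ...while patient records demand both the right role and the VPN.
assert not can_access("patient-record", role="clinician", on_vpn=False)
assert can_access("patient-record", role="clinician", on_vpn=True)
```

The value of writing policy this way is that adding a new information class is a one-line change to the table, not a redesign of the security model.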
The bottom line is that we cannot expect everything to be accessible everywhere, nor that everything can be locked down to the same level of security. Taking such a stance can only end in failure.
By focusing on information first we can identify what cannot be compromised, before considering where compromises can be made. The old adage “balance in all things” should only be applied once the organization’s confidential assets have been understood.