19.04.15 — Design
Privacy & Anticipatory Design
An article posted recently on Fast Co. Design caught my attention, as it was entitled “The Next Big Thing in Design? Less Choice.” As the title of this post may suggest, it was indeed talking about anticipatory design.
This concept is not wholly new, and for those unacquainted with the term, I shall now attempt to summarise it as succinctly as possible. Anticipatory design, as its rather blatant nomenclature suggests, is design which utilises a knowledge base to anticipate decisions and choices, so that a user merely has to approve the decisions a system has made for them. Sometimes this may even mean that the user is wholly detached from the decision-making process, leaving the system to calculate and decide instead.
As the article says, anticipatory design has been around for some time now, albeit in less automated forms. Think of YouTube’s suggested videos section, which offers up videos based on your viewing history, or Amazon’s recommended items, which curate a selection of products frequently bought alongside things you have recently ordered.
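To make that mechanic a little more concrete, here is a minimal sketch (in Python, with entirely made-up order data) of the kind of co-purchase counting that sits behind a “frequently bought together” list. Real recommendation systems are vastly more sophisticated than this, but the underlying idea of mining past behaviour to anticipate the next choice is the same.

```python
from collections import Counter
from itertools import combinations

# Hypothetical order history: each order is a set of product names.
orders = [
    {"camera", "sd card", "tripod"},
    {"camera", "sd card"},
    {"tripod", "camera bag"},
    {"camera", "camera bag", "sd card"},
]

# Count how often each pair of products appears in the same order.
pair_counts = Counter()
for order in orders:
    for a, b in combinations(sorted(order), 2):
        pair_counts[(a, b)] += 1

def recommend(product, top_n=3):
    """Return the products most frequently bought alongside `product`."""
    scores = Counter()
    for (a, b), count in pair_counts.items():
        if a == product:
            scores[b] += count
        elif b == product:
            scores[a] += count
    return [item for item, _ in scores.most_common(top_n)]

print(recommend("camera"))  # e.g. ['sd card', 'camera bag', 'tripod']
```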
The problem I found with anticipatory design when I first read the article was the one Aaron Shapiro addresses within the article itself: whether the user should allow automated systems to make decisions for them, or whether a sense of more granular control is preferable.
Just today I was editing photos for my previous lifestyle blog post and found myself hitting Apple’s auto-enhance button in the iOS Photos app, which enhanced my photos beautifully, just as I’d have liked. But even then I felt the need to go and adjust each granular colour and brightness setting manually, just so that I could be sure I had the image exactly the way I wanted it.
But I do, of course, see the attraction in having a one-click enhance function. It’s perfect for people like my sister who want to take and upload good-looking photos on the fly.
Maybe, then, anticipatory design just needs to adjust itself to suit different user groups. Light users want quick and simple options; they want the system to sweat the small stuff and do the hard work. Pro users, however, want access to every little option and setting they can get. What does this mean for UX? I could write a whole new post about it (and maybe I will), but for now let’s tackle an issue that came to me as I was clambering into bed the other night and which spurred the writing of this blog post: privacy.
In today’s connected world, privacy is very much at the forefront of public awareness, especially given recent events such as Snowden’s NSA leaks, even if the revelation that such widespread recording and monitoring activities exist came as no surprise to me. People talk a lot about personal data and privacy, and web users are becoming increasingly aware of how their personal details can be used and abused.
So then, when we know that anticipatory design works by analysing personal data, how can this invaluable privacy and the future of convenient design comfortably coexist? Surely, if we allow these systems to amass huge reserves of personal data, we are effectively sacrificing our privacy and exposing ourselves to the risks that come with storing such details?
Take Google, for example, a company whose interconnected network of services (YouTube, Gmail, Google Calendar etc.) is already pushing anticipatory design forward. These services share a user’s data between them to make that user’s life easier (and yes, also to deliver tailored advertising); however, when people realise that their data is being used in this manner, there’s uproar.
So we may be able to anticipate people’s decisions before they make them, but will they want us to? Having an algorithm work out the best adjustments to your image is one thing; having it know what you want to say, and to whom, before you do is definitely another.
The data needed to make such decisions is probably, in most cases, already collected and ready to be put to use. But if we do use it, will we scare users away? Will anticipatory design’s unintentional (and possibly catastrophic) secondary function be to reveal the true extent of data collection and how very little privacy we all really have?
And if it does, then we really must ask: is the world ready for anticipatory design?