Do You Know Me? Can I Trust You? (Part 2)

By Eric Fensterstock

Personalization, Privacy

In an earlier post, we discussed how our lives are about to be subject to even more sophisticated analysis. The quantified self movement promises to help us make more intelligent decisions about how we take care of ourselves. There’s even a quantified baby trend to turn the lil’uns into data. Cars and homes will improve via smart technologies. The value of this data is limited only by our ability to interpret it, and this analysis is where we still need improvement.

More Data, More Problems

For years, recommendation engines have made use of collaborative filtering, a system that looks for patterns in the tastes of large groups to predict individual preferences. Once we have more types of data from more types of people, we may be able to better understand human nature at a more general level. As with other social science research, we will need to keep in mind that what applies to one country may not fit another. What works for men may not always work for women. And I may be very different from people with my demographics.
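Collaborative filtering can be sketched in a few lines. The version below is a minimal user-based variant, not any particular product's algorithm: it scores an unseen item for a user by averaging the ratings of other users, weighted by the cosine similarity of their rating histories. All names and ratings are invented for illustration.

```python
# Minimal user-based collaborative filtering sketch (illustrative data).
from math import sqrt

ratings = {
    "alice": {"jazz_album": 5, "sci_fi_novel": 3, "cookbook": 1},
    "bob":   {"jazz_album": 4, "sci_fi_novel": 2, "thriller": 5},
    "carol": {"cookbook": 5, "thriller": 2, "sci_fi_novel": 1},
}

def cosine(u, v):
    """Cosine similarity of two sparse rating vectors (dicts)."""
    shared = set(u) & set(v)
    if not shared:
        return 0.0
    dot = sum(u[i] * v[i] for i in shared)
    norm_u = sqrt(sum(r * r for r in u.values()))
    norm_v = sqrt(sum(r * r for r in v.values()))
    return dot / (norm_u * norm_v)

def predict(user, item):
    """Predict a rating as a similarity-weighted average of others' ratings."""
    num = den = 0.0
    for other, their in ratings.items():
        if other == user or item not in their:
            continue
        sim = cosine(ratings[user], their)
        num += sim * their[item]
        den += abs(sim)
    return num / den if den else None

prediction = predict("alice", "thriller")
```

The concerns in the paragraph above map directly onto this sketch: if Alice's history mixes her own taste with gifts, work purchases, and her kids' browsing, the similarity weights are computed against the wrong person.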

These systems also need to understand that each of us is a different person in different situations. I don’t need hobby recommendations at the office, and I don’t want to see career advice during a romantic dinner. I want to keep everything G-rated when the kids are around.
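One simple way to model this is context gating: each recommendation carries tags, and the active context suppresses anything that doesn't fit. This is a toy sketch, with context names and tags invented for illustration, not a description of any real system.

```python
# Toy context-gating sketch: the current context filters recommendations.
# Context names, tags, and rules are all hypothetical.

CONTEXT_RULES = {
    "office":       {"allow": {"career", "news"},          "block": set()},
    "dinner_date":  {"allow": {"music", "travel", "food"}, "block": {"career"}},
    "kids_present": {"allow": None,                        "block": {"mature"}},
}

def filter_recs(recs, context):
    """Keep only recommendations whose tags fit the current context."""
    rules = CONTEXT_RULES[context]
    kept = []
    for rec in recs:
        tags = rec["tags"]
        if tags & rules["block"]:
            continue  # explicitly blocked in this context
        if rules["allow"] is not None and not (tags & rules["allow"]):
            continue  # an allow-list is in force and nothing matches
        kept.append(rec["title"])
    return kept

recs = [
    {"title": "Negotiation masterclass", "tags": {"career"}},
    {"title": "Jazz trio tonight",       "tags": {"music"}},
    {"title": "R-rated thriller",        "tags": {"mature"}},
]
```

The hard part, of course, is not the filtering but reliably knowing which context the user is actually in.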

Sometimes, I really am asking about a friend or buying a gift rather than something for personal use. When my father bought me the album “Fear of a Black Planet” on Amazon fifteen years ago, he wasn’t looking for this to be the basis of future product recommendations.

Some things really just need to be off the record. If a topic is embarrassing, let’s not talk about it. If I look something up once, that may be all of the information I need on that topic. Once I’ve bought a refrigerator, I don’t need another one. Once I’ve fixed my leaky pipe, I don’t want to think about plumbing again.

One day, we may resolve all of these issues, technology will know me well, and I will trust technology with this knowledge. Then other issues may arise: the boredom of routine and the ignorance of the cocoon. Designers will need to build in serendipity to allow us to encounter things that are outside of our everyday experience. Often, we won't like these things, but often enough our tastes will expand. We will need to be challenged, either by simple randomness or by algorithms that calculate our appetite for novelty. The best technology will explain why we are seeing odd recommendations. Maybe I've never read a novel from Africa, but there's a Kenyan writer who touches on some of the same themes as that guy from Chicago who's my favorite.
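The "simple randomness" option above can be sketched as an epsilon-greedy pick: most of the time recommend from the user's familiar pool, but with some small probability reach into an unfamiliar pool and attach a one-line explanation of the link. Everything here, including the explanation format, is a hypothetical illustration.

```python
# Toy serendipity sketch: epsilon-greedy novelty injection with an
# attached explanation, as the text suggests. Illustrative only.
import random

def recommend(familiar, novel, epsilon=0.2, rng=random.Random(42)):
    """Return (title, reason); with probability epsilon, pick a novel item."""
    if rng.random() < epsilon:
        pick = rng.choice(novel)
        reason = f"Outside your usual picks, but it shares themes with {pick['similar_to']}"
        return pick["title"], reason
    return rng.choice(familiar)["title"], "Based on your history"
```

An algorithm that "calculates our appetite for novelty" would replace the fixed `epsilon` with a value learned per user, for example from how often past novel picks were accepted.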

A Real Dumb Solution

In the real world, we have a good understanding of physical objects. If I write something in a diary, I can hide that diary in my home and assume some degree of privacy. If I have a lock on the diary, I give myself more assurance and I clearly indicate to others that it would be a violation to read. When I eat in a restaurant, I understand that others eating there can see me. “This person you see is in the restaurant now” is data with a lifespan equal to the duration of the meal. I authorize this information to be shared within the restaurant and perhaps the viewing range of nearby windows. It would be a violation to photograph me and store the photo forever or to show a video feed of me eating on a screen in Times Square.

Priests and attorneys are sworn to keep our confidences. Doctors usually do too, but HIPAA disclosure forms make things seem murkier, and who knows what our health insurers and pharmacists are doing with our information. We trust that the government will not wiretap doctors, clergy, or attorneys, and that it cannot simply demand their records.

We need the virtual world of smart things to be as intuitive as the real world of dumb objects. There is a precedent for this in graphical user interfaces. Windows, folders, trashcans, and desktops all helped users get used to computers. The trend to mimic the real world, skeuomorphism, can be dropped once the technology becomes second nature. Mobile interfaces are already losing some of their photorealistic touches. In other places, these things hang on, even when users know what to do. Consider the horn icon that still appears on nearly every car steering wheel. How often do you use a real horn?

At times, we do have an intuitive sense of how data should be treated. When we buy an app or a song on Google Play, we give Google our billing information. We feel that this is a different Google from the search engine. Our credit card numbers and home address won't suddenly become available to anyone who does a Google search. We don't need to read the lengthy terms and conditions to feel confident that we are OK. It is conceivable that Google might show our friends what we bought, but our expectation is that Google would request our consent first. Some services might bury this sort of consent in a settings menu, but most users would frown on that. In 2007–2009, Facebook got into trouble for automatically sharing our activities via Facebook Beacon.

When we install an Android app, we get a list of the permissions that the app will need. This can be confusing, but it is a step in the right direction. Perhaps someday, lengthy terms of service agreements can be replaced by industry-standard definitions. Then users would be able to consult a table, similar to movie ratings or TV content descriptors, to know what will happen to their data. No one can be expected to read a 60-page End User License Agreement, but a skimmable summary would work.
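If data practices were declared in a standard, machine-readable vocabulary, as the paragraph imagines, a store or browser could render the skimmable summary automatically. The practice codes and labels below are entirely hypothetical, loosely modeled on content-rating descriptors; no such industry standard exists today.

```python
# Hypothetical standard vocabulary of data-practice codes, rendered as a
# skimmable summary in place of a 60-page EULA. Illustrative only.

PRACTICE_LABELS = {
    "location":  "Collects precise location",
    "contacts":  "Reads your contact list",
    "ads":       "Shares data with advertisers",
    "retain_1y": "Keeps data up to one year",
}

def summarize(declared):
    """Turn an app's declared practice codes into human-readable lines."""
    unknown = [code for code in declared if code not in PRACTICE_LABELS]
    if unknown:
        # A fixed vocabulary is the point: undefined codes are rejected,
        # so an app cannot hide behind a term no one has defined.
        raise ValueError(f"undefined practice codes: {unknown}")
    return [PRACTICE_LABELS[code] for code in declared]
```

The value of such a scheme would come less from the code than from the legal reform the next paragraph describes: the definitions only help if they are standardized and binding.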

This would require legal reform. Another area for legal reform would be to designate some virtual services as the equivalents of doctors, lawyers, and priests, eligible for the same privileges of confidentiality. Law enforcement would fight this, but we need these sorts of reassurances to move ahead.

Now the World Don’t Move to the Beat of Just One Drum

Even today, we can help our technology to know us and to act in a trustworthy way. We can make sure we understand what we are agreeing to. We can scrutinize privacy preferences and settings menus.

Ultimately, personal customization may be the solution, rather than a set of global rules. Users may cluster into three groups:

1. “Apple Fans,” those who want the solution to “just work.” They want something that is satisfactory for most people, even if it is not perfect.
2. “Hackers and Tinkerers,” those who want the ability to adjust everything to their liking. These people will tweak settings at a granular level and mash up different components to make better solutions.
3. “Curmudgeons,” those who will never be content, though some may find niche solutions that are good enough.

There will always be room for paranoia. A little paranoia is probably healthy. But if I am financially secure and in good health, riding in my self-driving car, avoiding traffic, and enjoying entertainment that I love, I can afford to make some compromises. The important thing is that the rules for these compromises are transparent and well thought out and that the system is run competently.