DesignTalk Ep. 57: The new rules of designing with data


– [Margaret] I’m Margaret Kelsey, and this is the 57th
episode of DesignTalks, a webinar series brought
to you by InVision. Today, we hear from Nathan Kinch about the new rules of
designing with data. Let’s listen in. – [Nathan] Yeah, absolutely,
thanks, Margaret. And thank everyone for joining us today. Really appreciate the time that
you guys are all investing. Before we kick off and
before I say a brief bit about myself, if you’re
wondering why you can’t see me, it’s because Margaret gave me the choice. I could show my face or I could
choose to remain anonymous. One of the things that we’ll be talking about today is data privacy. So I felt it kind of made sense. Really, really quickly, who am I? Right now, I’m a founding
partner of Greater Than Ex. As I said before, you
can’t actually see this, but I’m still sweating. We deployed a new experience, our website, about 10 minutes ago. We were really cutting it fine. We were pushing just in time a little too close to the limits. It’s still buggy, but if you’re interested in what we do at Greater Than Ex, head over to greaterthanexperience.design. I’m also Head of Ecosystem
Value and Growth at Meeco. Meeco is a really, really
interesting company I was actually following
for a number of years before I joined last year in February. You’re gonna learn a little bit more about what Meeco does quite practically throughout the talk today. As Margaret said, if
you’ve got any questions, you all know the Twitter tag, hashtag, so let’s kick off. I’m gonna start today by setting hopefully a fairly clear expectation. I think this is gonna
last about 40 minutes, which gives us 20 minutes for Q&A. My objective for today is to really try and achieve two things. Number one is to give
everyone on the call, if you don’t already, a
fairly strong foundational understanding of what’s happening
in the personal data economy, so, how is it evolving and why. Then, number two is to
progress beyond that and give you a set of
rules, a set of tools, and a bunch of approaches that you can use to hopefully design valuable, meaningful and engaging experiences for the people that you serve as customers. All of this is obviously gonna be in the context of personal data. Now, the third for me is
a little bit of a bonus. I’m really humbled by
the opportunity today, and I’m humbled by the fact
that everyone listening has invested about an hour
of their time in this. So I’ll do absolutely
everything that I can to ensure that this session is valuable, meaningful and engaging to all of you. I’m gonna start by asking
you to imagine something. I’ll use an example that’s relevant to me. But you can take the same sort of concepts that I’m talking about and apply them to something
that’s meaningful to you. Imagine it’s the year 2023, and your data’s an asset,
or maybe it’s a commodity. But it’s definitely
something that you control. It’s used to make your life easier, it’s used to help you get the right thing at the right time and in the right place. Perhaps you even have something that’s called a net present data value. All of this is completely normal. It’s as normal as my smartphone that’s sitting next to me right now. Given all of this, let’s
pretend I’m in New York. I’m speaking at the
Data Transparency Labs’ annual conference and
I’m scheduled to speak at about 11:30. Now, because I practice
intermittent fasting, I’m not planning to eat until afterwards. I really don’t think
I’m gonna be in a rush because I’ve got about an hour for lunch and then I’ll walk somewhere nearby and have my first meal for the day. But it’s an event. And due to unforeseen technical glitches, which we’ve all experienced, my session’s been pushed
back to about 1:30. So what happens is me, the
digital representation of Nathan, which is powered by the
Meeco platform, is notified. Me knows what my blood sugar levels are, me knows that for me
to perform at my best, pushing out my fasting window
too far isn’t really ideal. Me also knows I’m on a
whole food plant-based diet. So me gets to work. In a matter of seconds,
I’m notified that at 12:30, I’m gonna be receiving a black
bean and sweet potato salad with quinoa and grains, very yummy, and it’s gonna be delivered directly to my location at Columbia. Payment’s also already been taken. And all of this has been achieved without any of my data or
my identity being shared. Basically what’s happened is, simple binary assertions
or yes/no’s were enough to find a plant-based eatery with a delivery service in the area. All of this was possible
because my digital identity, my personal data, and my
consent were under my control. Now, of course, it’s 2023,
we’re a few years out. So at this point, I have the tools to make use of these things, as do the organizations participating in whatever this marketplace is. Now, that’s a really simple
example that I’ve given. Yours could be bigger picture, it could be much more powerful. But the point is, try and imagine a world where your data is used by
you or your digital assistant, privately and compliantly, to make life fundamentally better. Now, if you’re asking,
“How’s a Rubik’s cube “relate to that story that
you just told, Nathan? “That’s a bit odd,” well, a Rubik’s cube has got a really clear outcome. But in order to achieve that outcome, you’ve gotta make the right
moves at the right time. And this, in my view, is an analogy to where the personal data
market’s at right now. Let’s start unpacking
what’s happening today. You know, why do I
believe that personal data is gonna transform aspects of society? And why are we all here? Why should we as designers
actually care about this stuff? I’m gonna start with perhaps
not the sexiest topic. I’ll start with regulation. There’s this thing called the GDPR, or General Data Protection Regulation, and it’s probably the most
significant data regulation globally right now. There’s a bunch of other stuff happening too, like the ePrivacy Directive, eIDAS, PSD2 and other things. Broadly, these regulations
are changing the dynamic of the entire personal data market. And the outcome’s really interesting. It’s that people and
organizations will soon begin exchanging data with each other as equals. This is pretty significant in and of itself, and we’re gonna unpack aspects of that throughout the talk. But there are also consequences. As an example, if a company’s
seen to breach the GDPR, they could be fined up to
4% of global annual revenue, or 20 million Euros, whichever’s greater. And key executives can be
held personally liable. They could be imprisoned. Because of these things, the GDPR and other regulations like it, they now have a seat at the table, a seat at the board, actually. Now, if you’re in the US
and you’re dialing in, this might not seem as relevant, ’cause a lot of those things
I’ve referenced were European. But if you have people using
your product in the EU, that’s actually not the case. So my simple advice here is, try not to leave the GDPR
and other data regulations to legal and compliance. Get a seat at the table if you can because it is going to
fundamentally impact how you design. Now, we’ve also got data breaches. This is a pretty topical thing right now. I’m not gonna dwell on it because it’s really well-documented. It’s also gonna come up a
little bit later in the talk. We then have the IoT, or the Internet of basically Everything. It’s not just people and organizations that have identities and
need to exchange data, it’s Things, too. Things will have identities, Things want information about us. They wanna share it. The question is, will we let
them and under what terms? Now, there are also standards emerging. There are standards for
things like consent receipts, standards that are gonna
support interoperability, and there are really interesting standards like the Classification of Everyday Living (COEL), which is worth checking out. Many of these are emerging. Some of them will be adopted
and made use of broadly. Some of them we won’t hear much about. Then there’s our behavior. Now, this in and of itself probably merits an entire design talk. So I won’t touch on it too much. But as an example of how this is changing, over 400 million people have
now installed ad blockers. And it’s not just because
they don’t like being tracked. It’s that the ad model’s broken. It’s inaccurate, it’s capital-inefficient, and frankly, it annoys a lot of people. Now, it’s not here on the globe, but I think identity is a
really interesting topic. So what is digital identity, and what isn’t a digital identity? There’s a heap of work being done in this market right now, and it’s pretty complex and
nuanced, to say the least. But I think that Kantara, they highlight this really
nicely in this diagram. What it shows is the complex nature of digital identity and personal data. It shows the shift away
from fairly simple, almost binary, digital identities to digital IDs that are attribute-abundant and completely and
utterly context dependent. At Meeco, we see this as a transition, away from what we dub
the attribute economy 1.0 to something we call the
attribute economy 2.0. I was really fortunate
to coauthor a report with some incredible
people on this subject. If you’re interested in all of this stuff that I’m talking about, let us know. I can work with Margaret to make sure that that report’s available. So a lot’s changing. It’s changing now and it’s
changing really, really quickly. But this is a good question, is it a challenge or an opportunity? To be honest, it depends on perspective. It depends on the
information that’s available. It depends on strategic priorities. It depends on how people are remunerated. It depends on things
like values and ethics. And it really depends on an
organization’s propensity to do what is new, unique and valuable. Or in other words, innovate. Now, at the moment, I get
to spend quite a lot of time with leading thinkers and decision makers in the personal data space. So when it comes to all
of these forces of change being a challenge or an
opportunity for businesses, I get to see a lot of
different perspectives. What you see onscreen right now is really the two ends of the
spectrum that I’m seeing. On the left, you have the focus of simply meeting compliance, which is absolutely
really, really important. On the right you have
leaders that are determined to use this changing landscape to their organization’s advantage. Right here, to provide an analogy, I used to be an athlete. I don’t know why, but I always find ways to fit sports analogies into stuff. If you surf, you don’t
compete against the wave, you ride with it. And these forces that
I’ve been speaking to, they’re too great to fight against. You’ve got to ride their momentum, and you’ve gotta try
and use their momentum to your advantage. That’s why the innovation
approach is and will win. Now, in case you’re wondering, compliance is built into
the innovation model. It’s a positive sum game here. But what really matters, at least from my perspective, is value. A question that’s constantly asked is, can value actually be
generated from all of this? It’s a really good question. A few years back, the Boston
Consulting Group estimated, I think it was in 2012, that there’s about a trillion Euros of new economic value that can be realized in the EU alone by 2020. Really interestingly, they forecast that 670 billion of that was gonna flow straight back to citizens. The rest will obviously
go to organizations. All of that value will
apparently be unlocked if people, citizens, simply have control and the ability to use their
digital identities seamlessly. Now, it’s just a forecast so I’m certainly not in a position to say if it’s accurate or not. I think we should dive a little bit deeper and start getting more practical. I’d like to start at the top left. Both consumers and
business decision makers, they associate data privacy
and security with trust. This is really backed up by
consumer purchasing behavior. We buy from whom we trust. But it’s also backed up by
business spending patterns. There are now Chief Privacy Officers and huge data security budgets. So people are really putting their money where their mouth is. If you look to the bottom right, you’ll notice some insights
from Meeco’s research. It turns out that this
trust is really significant when it comes to the
propensity that customers have to share data with the brands they choose to do business with. I’m gonna unpack a
practical example of that a little later. So just keep it in mind for now. Now, if we go to the top right, I kind of think this is
just Amazon boasting. What it does show, though, is that
personalization is working really well for them. It’s quite clearly an activity
that has huge business value. But what if people consented to sharing their personal data? Amazon could take that small data, they could enrich their existing analytics and cognitive intelligence and start actually
personalizing experiences. This might be a long shot, but Bezos, if you’re listening, I am a capitalist, albeit a conscious one, and more than happy to
talk about this prospect. Now, if we move over to the bottom left, we’re on to data breaches. When it comes down to it,
honeypots are ripe for attacks. And distributed data architectures
are by far preferred. I’m not gonna unpack that in detail today ’cause that’s a fairly
technical discussion. But as you can see, data breaches, they really do hurt financially. The other thing that often
isn’t spoken about that much when it comes to data breaches is the impact on the human beings the data actually relates to. And I think that’s far
worse than the four million. Over the last few minutes,
it’s been a lot to take in. Keep in mind that there
will be a recording of this, and you can view it later
if you’ve missed anything that you think is relevant. With that said, let’s move on swiftly to the practical stuff. These are what I used
to call the three rules of data minimization. But you can really think
of them as three rules of designing with personal data. Look, it’s by no means an exhaustive list of every design consideration. But I do believe it’s
simple enough to action and it’s gonna give you a
really strong foundation. Let’s kick off with rule number one. Acquire data progressively. This is really the key
to data minimization. It’s basically saying, don’t
ask for all the data up front. In fact, don’t ask for
all the data, full stop, because it’s unlikely
you will need it anyway. What you really need is
access to the right data at the right time, or better yet, the ability to access or consume the token of a verified attribute
or group of attributes so you can determine if someone
is who they say they are, so you can assess their trustworthiness, which is obviously something
that goes both ways, or so you can assess their eligibility for something that you offer. In many cases, all you’re gonna need is a binary yes/no assertion. So, to give you an example, let’s say you’re selling alcohol online. And you need to know
someone’s over 18 or 21 or whatever the law states. That’s your legitimate
business requirement. So how do you go about this? You know, you’ve got a duty
of care, you have risks. You also care about the health and wellbeing of your customers. So do you ask them to take a selfie and a scanned picture of their driving license? Or would it be better to access a binary assertion, where the relying party asserting the person is over 18 is someone that you can trust, maybe like the government? Now, even though there are
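To make that concrete, here’s a rough sketch of what issuing and consuming such a yes/no assertion could look like. Everything here is illustrative: a real deployment would use public-key signatures and a standard format (a JWT or a verifiable credential) rather than a shared HMAC secret, and the field names are made up.

```python
import hmac, hashlib, json

# Hypothetical sketch: a trusted issuer (say, a government identity service)
# signs a minimal yes/no assertion. The merchant verifies the signature and
# learns ONLY "over 18: yes" -- not a birthdate, not a name. HMAC with a
# shared secret just keeps the example self-contained.

ISSUER_SECRET = b"demo-shared-secret"  # stands in for real key material

def issue_age_assertion(over_18: bool) -> dict:
    payload = json.dumps({"over_18": over_18}, sort_keys=True)
    sig = hmac.new(ISSUER_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "sig": sig}

def verify_age_assertion(assertion: dict) -> bool:
    expected = hmac.new(ISSUER_SECRET, assertion["payload"].encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, assertion["sig"]):
        return False  # tampered with, or not from the trusted issuer
    return json.loads(assertion["payload"])["over_18"]

assertion = issue_age_assertion(True)
print(verify_age_assertion(assertion))  # True -- and no birthdate was shared
```

The point of the sketch is the data flow: the merchant never holds the underlying attributes, only the binary answer it actually needs.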
fairly significant technical or system design considerations
to rule number one, how, how often, and what
data you actually ask for, it directly impacts the
experience people have with your product or service. This is just as much a
human problem to solve as it is a technical one. There are a bunch of benefits
to this type of approach to accessing personal data. Acquiring only what you need to fulfill your value proposition
at the time you need it gives people context. Context increases trust, and trust is something we’re gonna unpack. Context helps give people the opportunity to make informed decisions, importantly, when we talk about consent, about what they are and what they aren’t willing to share. From a UX standpoint, or a design standpoint, asking for less data or at
least finding better ways to gain access to the actual
information that you need, it’s gonna help you
deliver better experiences. If we collectively deliver
better experiences, then our unit economics should be better. Now, the other thing here
is that gaining access to only what you need
at the time you need it, it decreases the liability
of holding personal data. You’ve always gotta keep in mind that having access to personal data, it’s not just potentially an asset. It’s a serious liability. In some cases, less can be more. Small data can, I think in a lot of ways, be just as powerful as big. So let’s have a look at a few examples. The first one’s really, really basic. It’s the contact form on our website. Basically we show and we explicitly state that all someone needs to do to contact us is submit a message and
pass the basic human test. What this means is that
they can send us feedback, they can submit details about a bug, and like I said earlier, (chuckles) the site right now is
probably pretty buggy, or they can do something
else completely anonymously. Now, if they wanna work with us and they want us to get back in touch, then we let them know that we need their most basic personal information so that we can actually
fulfill their request. This is really, really basic
data minimization in practice. Then there’s this example from Meeco. I think at this point it’s
probably worth explaining Meeco a little bit more, ’cause
I haven’t done that yet. I think if I do that, the examples I use will make more sense. Really quickly on the big picture, Meeco’s vision is for a future where people get equity in
exchange for what they share. To make this happen,
we’re working with people and we’re working with organizations to establish a personal
data marketplace of equals. What this means is that
people and organizations are gonna have the
tools to share or access the right data at the right
time and in the right place. When data flows like
this, it’s most valuable. Right now what we’re doing is, we’re building out these tools and we’re working with
leading organizations globally to establish the foundation
of this marketplace. But we’re doing it progressively. We’re not going straight from where we are to this big, massive picture or to that 2023 vision that I gave you. We’re starting with trigger
events or use cases, the types of things that
we experience every day, things that we value highly, things that require our personal data, and to be frank, things that
are pretty high friction. By giving people and organizations tools that enable privacy by design, compliant and innovative
use of personal data, we’re helping people save time. We’re helping them get better outcomes or experience things that
are truly personal to them. And we’re helping brands ask for, we’re helping them gain access to, and we’re helping them make
really, really effective use of the right personal data. The cool thing is, they’re
getting it directly from the source, from their customers. Now, I’ll stop there because again, this isn’t all about pitching you guys. But I think that’s going to help make the examples much more meaningful. In this screen that you see, or this example that I’m using, a person’s using one of
the tools from Meeco. Let’s just say or assume
that they already control lots of their information. They’re sharing it in context and they’re doing it on terms that they’re comfortable with. They can share with either
organizations or people. In this example, someone’s
speaking to a bank about an offer they’ve heard about. They’re chatting privately and securely, and they’re really just
at the “I’m curious” stage for the relationship. After a little banter or chatter, they start becoming interested. They feel like the bank’s
giving them enough information to make a call. And they get to the point
where they explicitly state to the bank, “Hey, I’d like to kick off “the application process.” Now, to do this, they reuse an
existing identity credential that they have access to within Meeco. Now, if you’re in Europe, this could be any eIDAS-compatible
government-issued digital ID. But there are obviously
a bunch of other ways that someone could assert their identity in this kind of a process. What happens is, the
identity token is backed by some form of trusted third party and it’s shared with the bank
once the process is complete. Now, to make this happen,
from an experience standpoint, the person goes through a simple
and pretty familiar-feeling multifactor authentication process. Just think of this as a replacement for filling in forms, scanning your ID, whatever else we’re asked to do pretty much day in and day out
to prove who we are online. Now, once that’s complete, the bank knows that they can rely on
the issuance of the ID. They trust it. So the bank’s really
confident that this person is a real person and they’re
also who they say they are. So there’s a high level of mutual trustworthiness
established at this point. The person’s then free to
continue with the application. After a few more steps,
the process is complete. It was really quick and
Charlotte is really happy. She got the outcome that she wanted, and it was made easier
because she had control of her information. And the bank only asked her
for what they truly needed at the time they needed it. Now, noting that I skipped a bunch of detail there, let’s look at what happened. Charlotte was the person using Meeco. She started off as a persona. Her identity wasn’t
actually known to the bank. Then, as trust was earned and
the relationship fostered, Charlotte verified her identity. When she did this, it was
appropriate to the context. She wanted to open a bank account. She became known to the bank at the right time for both parties. This is what we call
progressive disclosure. It’s the idea that I, or Charlotte, can start a process as a persona and progressively and
securely release access to information about myself or herself when it makes sense to do so. This is really a two-way example of rule number one in action. With that said, let’s keep
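That staged release of information can be pictured as a simple lookup, where each stage of the relationship unlocks only what that stage actually needs. The stage names and attributes below are hypothetical, not Meeco’s actual model:

```python
# Purely illustrative sketch of progressive disclosure: each stage of the
# relationship unlocks only the attributes that stage requires. The stage
# names and attribute labels here are made up for the example.

DISCLOSURE_STAGES = [
    ("curious",    []),                           # anonymous persona, no data
    ("interested", ["verified identity token"]),  # MFA-backed yes/no identity
    ("applying",   ["residential address", "income band"]),
]

def attributes_visible(stage: str) -> list:
    """Everything disclosed up to and including the given stage."""
    visible = []
    for name, attrs in DISCLOSURE_STAGES:
        visible.extend(attrs)
        if name == stage:
            return visible
    raise ValueError(f"unknown stage: {stage}")

print(attributes_visible("curious"))     # []: still just a persona
print(attributes_visible("interested"))  # identity verified, nothing more
```

Notice the asymmetry with a typical signup form: nothing beyond the current stage is ever requested, let alone stored.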
on to rule number two. Clearly state your purpose. Now, I think this is a really interesting design consideration, but it’s actually really tough to enact. We as consumers or citizens, we’re so used to the behavior of opt-out or simply tick the box that we rarely consider as designers how we can meaningfully
expose consent and purpose to our customers. It’s like we’re saying,
“Trust us, we’ve got this,” but we’re not even being
remotely transparent. Think of it like this. As an organization, you’re
just a temporary custodian of the personal data that
you intend to utilize. For the purpose of achieving whatever your business objectives are, it’s pretty critical that
you maximize the likelihood that the person grants
their explicit consent for you to use their information. To do this, a person needs to
take an affirmative action. So you need to make your purpose clear, meaningful and unambiguous, and you need to use plain human language, or better yet, visual references, that clearly state why you want that data, who’s gonna have access to it, and for what duration. You need to give someone,
and this is another thing to consider, an easy
way to revoke your right to access and use the data
that they share with you. This is consent revocation. You might be thinking,
“Oh, there are a bunch “of design risks to that.” But, you know, 4% of
global annual revenues is a fairly significant risk, too. But one of the design risks
that might come to mind is the additional cognitive load. Like, people are so used to a simple tick box and just getting on to the service that they want, and we know they don’t really read the terms and the privacy policy. So how do we limit things like drop-off? How do we limit friction and dissatisfaction if we’re gonna be so explicit and transparent like this? To be honest, there’s no real easy answer. The market’s still trying to figure out the best way to do this. I think in the longer term, the idea that we have privacy assistants
operating on our behalf, not on the behalf of an organization, makes a lot of sense. I’m really looking forward to
participating in that model where our default permissions
combined with some form of scalable utility can
make the process way easier. That’s great, that 2023
vision that I painted, really, really cool. But what can we do right now? I’m gonna kick off with
another really simple example. This is Greater Than Ex’s cookie notice. What you’ll note, probably,
is that our purpose is explicit but it’s still simple enough to understand and take action
on in a couple of seconds. We also make consent
an affirmative action. What that means is, if the
banner’s left to its own devices, as in you don’t click
I accept or I refuse, Google Analytics won’t run. So the key thing here to remember is, explicit, not implicit consent. You probably also notice that
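That explicit-by-default behavior is easy to express in code. This sketch uses hypothetical names, and the real banner is JavaScript rather than Python, but the logic is the point: no affirmative action, no analytics.

```python
# Minimal sketch of explicit (opt-in) consent, with hypothetical names.
# Analytics only runs after an affirmative "accept"; silence and "refuse"
# both mean no tracking -- consent is never implied.

class ConsentBanner:
    def __init__(self):
        self.choice = None  # no action taken yet

    def accept(self):
        self.choice = "accept"

    def refuse(self):
        self.choice = "refuse"

    def analytics_allowed(self) -> bool:
        # Only an explicit affirmative action enables analytics.
        return self.choice == "accept"

banner = ConsentBanner()
print(banner.analytics_allowed())  # False: banner left to its own devices
banner.accept()
print(banner.analytics_allowed())  # True: explicit affirmative action
```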
opt-out of Google Analytics call-out here. This just links directly to
the opt-out browser plugin. What we’re trying to do
is, give people options. Don’t get me wrong. This is not perfect. It needs work and it’s
something that we’re actively working on right now ’cause it’s an important
part of our experience in a lot of ways. At Greater Than Ex, we also
don’t shy away from this on our privacy policy. We don’t want it to be a
secret how we deal with data. The thing is, our privacy
policy as a result, it’s pretty different to most out there. Again, it’s a little bit of a buggy website because we shipped earlier than we wanted to. But feel free to head to our website, scroll down the first page, go to the footer and then click the green
highlighted privacy promise. That’ll take you directly
to the privacy policy. Now again, just like before, I’m not saying that this
experience is perfect. But the thing that our
research has clearly exposed is that it’s comprehensible. We’ve done something interesting. We added a privacy promise and our privacy by design principles. As a result of that, we basically learned that attitudinal trustworthiness or, like, how trustworthy our website
visitors think we are, it increases by about 80%
after someone views this. This is really basic. I’d like to move on to something that’s hopefully a little
bit more compelling here. Now, Meeco, you probably notice a trend. I’m doing a simple example
with Greater Than Ex, and because Meeco is a
leader in this space, I’m sort of showing
some of the sexier stuff as a Meeco example. But one of the things
that we’ve learned is that visual consent notices, and this is really something to consider with your internal design function, they achieve approximately
70% greater comprehension, which is really, really big. Under the GDPR, consent has to be clear, meaningful and unambiguous. Now, in some ways, that’s
largely subjective. But let’s just roll with it for
the purpose of this session. What we’ve learned is that
layered consent notices, particularly those that
have simple visual elements, they increase the likelihood
that a consent notice can actually be deemed clear,
meaningful and unambiguous. Now, these visual notices are supported by a consent summary, so written, and obviously terms of
service and a privacy policy. And those two things,
they’re not going away in the foreseeable future. Now, doing this maximizes the likelihood that someone granting their consent is taking a genuine affirmative action. It means they actually get it, which most of the time right now we don’t. They then make a choice and then move on. And current consent models
just simply don’t support this. There’s a heap of work being
done in this space right now. A specific example is Kantara’s consent receipt specification. I think it’s definitely worth looking at. To summarize rule number two, calling out purpose really
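As a rough illustration of the idea, a consent receipt minimally records who asked, why, for what data, and for how long. The fields below are simplified stand-ins loosely inspired by the Kantara work, not the normative schema:

```python
import json, time

# Illustrative consent receipt. Field names are simplified stand-ins,
# not the normative Kantara schema -- check the spec for the real field set.

def make_consent_receipt(controller, purpose, data_categories, expires_days):
    issued = int(time.time())
    return {
        "consent_timestamp": issued,
        "data_controller": controller,             # who is asking
        "purpose": purpose,                        # why, in plain language
        "data_categories": data_categories,        # exactly what is collected
        "expires": issued + expires_days * 86400,  # for what duration
        "revocable": True,                         # consent can be withdrawn
    }

receipt = make_consent_receipt(
    controller="Example Bank",
    purpose="Verify age eligibility for account opening",
    data_categories=["over_18 assertion"],
    expires_days=30,
)
print(json.dumps(receipt, indent=2))
```

A record like this gives both parties the same evidence of what was agreed, which is exactly what today’s tick boxes fail to do.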
clearly, it’s one thing. But we’ve learned time and time again that showing it visually
is just so much better. And we’ve come to consider
this best practice for consent. So I’d suggest experimenting with different consent visualizations, test a bunch of options,
and focus on making what was formerly a pretty
scary and confusing thing something that’s simple
and something that’s human. Now to the last rule. Give before you get. This is really the most important rule in terms of how you
foster real relationships with the people that
you serve as customers. Now, we’re designers, or
many of us on this call at least are. So we’d likely know a little bit about the reciprocity tactic. I think of give to get as almost a data version of that. Organizations, they so
often ask for our data that we’ve become used to the activity of just giving it to them, whether it’s buying
clothes, booking flights, ordering food, filling in a tax return. All these things require us to give. But what if organizations
started an interaction by saying, “Hey Nathan,
you’ve been giving us data “for a while. “We really appreciate it. “In fact, it’s your data. “We’re just using it to
help offer you our best. “Here’s your data back, “and here are some insights
that we’ve generated. “Here are the ways that you can use them, “and here are the ways that we think “using them will be
really, really valuable.” Now, a number of things
happen when this occurs. Obviously, there’s a bit of the “Whoa! I’ve never seen this before. That’s really odd. Don’t organizations just want my data?” But the first thing is that data liability, from the organizational perspective, starts to go down almost immediately. Number two, the organization
has maximized the likelihood that people are gonna
share more with them, that that person in the
future will share more. Number three, and this is
what’s really interesting is that organization’s
opened up the opportunity of an access rights subscription
to things like life events. That’s a type of customer data
that can actually help you create new and unique value, actually personalized experiences. Let’s look at that a little
bit more specifically. But before we go to the example, and I’ll use a Meeco example again, remember this. If an organization starts
by giving data back, customers are eight times more likely to share data in return. The way that we now think about it is that every interaction
you have with a customer is an opportunity to give back. The more you give, the more you get. So let’s look at how. In this example here,
it’s similar to before, a bank has enabled someone
to use a digital ID to gain access to a new bank account. The person now has the bank account and they’re pretty happy. But there’s an addition to the experience. The bank’s offering to give them all the data they’ve shared
as part of the process back. They’re doing it in context. They’re explaining how taking
control of this information is gonna be really, really, valuable for that particular person. Now, if a person chooses to take control of this information,
they’ll have access to it. They’ll be able to reuse it. And by sharing it on their terms, they start exchanging
data with their bank, with their Telco, with their government, or with your brand, as an equal. But the beauty of it is this. It goes back to the stuff I explained about Meeco earlier. Now people have new tools to control and securely share their data. Organizations also have new tools to ask for customer data and get it directly from their customers. The plumbing in between might be Meeco. Could be something else. But these new tools within
an environment of trust mean that we start working
towards that data marketplace that I referred to. So give to get, it’s more
than just reciprocity. It’s more than just
increasing the likelihood that you’ll get data in the future. I think it’s the start
of fundamentally changing the data relationship. It’s the start of a
different model entirely. Now, regardless of whether
you’re on a trajectory to participate in that model or not, it’s likely that you’re gonna
need some practical tools. These tools, they might help you innovate, they might help you craft consent notices that keep legal and
compliance really happy and off your back. They might help you gain
access to customer data that you never thought you could. They might even help
you genuinely transform your business model. Who knows. But how you use them and the outcome that you’re looking to get
is completely up to you. Let’s just start with the core of it all. Privacy by design. We actually have the Canadians to thank for this little beauty. Now, privacy by design or
PBD used to be thought of as a systems design method. But really it's an approach to projects that promotes privacy and data protection right from the very start. The reason it's really valuable is because all too often, these types of considerations are an afterthought, or sometimes they're ignored or forgotten entirely. In environments where that's the case, PBD gives you the best chance, I think, of designing human experiences,
systems and business models that actually include and
empower your customers. Now, privacy by design is also
a requirement of the GDPR. So if you’re in Europe, if
you have customers in Europe, then this really does matter to you, those seven principles there. But there’s a lot of
literature available on this. My advice here is to
conduct a little research and figure out how privacy
by design can become embedded into your existing experience and systems design processes. There are a couple of things
that you could look at as ancillary benefits. It’s the type of approach
that’s probably likely to increase privacy awareness internally within your organization. It’s definitely gonna decrease the risk that you’re fined up to 4% of
your annual global revenues, which is really good,
everyone cares about that. But I also think it’s
actually gonna give you the best chance to deliver
experiences to market that your customers trust and that they therefore
engage with openly. Now, from these seven principles, there’s this one that I
tend to focus on the most. That's because my view is that privacy and value aren't, and shouldn't be, competing factors. Remember, it's not a zero-sum game that we're playing. If you were to hone in on one principle, it would be positive sum. I believe that we can
create better experiences for real people if we bake
privacy into our offerings from the very start. Interestingly, what we’re
learning right now is that more and more, positive sum privacy enhancing experiences can actually become part
of your value proposition. You don’t have to be an
organization like Meeco to make this happen. At Meeco, it’s inherent. It’s a huge part of our why. You could be a bank, you
could be a travel agent, a grocery store, a coffee shop, an intergalactic space station,
it really doesn’t matter. Right now at Greater Than Ex,
we’re doing work with brands to embed this into their value proposition and their product design process. So be really practical, my advice is to assess your existing
documented design process and look at where you
first start considering things like privacy,
security, access rights and data transparency and bring it forward in
the process, way forward. Introduce it earlier. Once you do that, once you find a way to contextualize privacy by design to your organization or your workflow so that it's actually a documented component of your process, make sure people know about it. This is gonna make a massive difference and it's likely gonna save you a lot of potential headaches down the track. Now, I'm gonna contradict
myself a little here, given that I’m sitting in this room and everyone on this call is listening to my voice right now. But don’t just take my word for it. Get out the room that you’re in and start testing this stuff. Frame some hypotheses,
design and conduct tests. What I’m gonna bet is
that in almost all cases, if privacy is positive sum, people, your customers or your prospects, will respond with overwhelming positivity. At Meeco, we learn about
this type of stuff. We learn about attitudes and behavior through a variety of different means. Number one is that we run
a thing called Meeco Labs. Basically, that gives our organizational partners and clients the ability to utilize Meeco's dual-sided platform and to test the customer or business value of these new approaches to personal data before they go and do some type of large-scale implementation. Number two is that we run
internal design sprints quite regularly. I actually wrote about this on the InVision blog before. We've made a few adaptations to the GV design sprint process. And a mate of mine in New York, Jay, this is a little bit of a shout-out to you, Jay Malone from New Haircut, he's got a couple of great design talks on design sprints. So if you're not familiar with this framework, check those out. Number three is that we're
really active in market. This means that we’re invited all the time really fortunately to speak at events, run workshops, contribute
to industry research and do a bunch of other stuff that hopefully helps move the
market forward positively. It’s a combination of
these three approaches that enables us to learn really quickly and in context. This one's really, really interesting. It's about quantifying the impact of privacy, security, trust, et cetera. At Greater Than Ex, we
often use this adaptation of what many others probably know as a fairly common service design tool, and we use it with our clients. Basically what we do is, we map the stages of an experience, not just as it relates to the
experience with that product but as it relates to the entire
decision process or journey. You could think of this in
terms of jobs-to-be-done. If you’re not really familiar
with jobs-to-be-done, I suggest checking out
Strategyn’s website. If you are familiar with jobs-to-be-done, then you might know a little bit about the switching moment. Basically what we do with this tool is, we look at the entire decision journey, we look specifically
at the switching moment and we try and figure out how
privacy-enhancing approaches to personal data help
us increase the value, the meaning and engagement
of the experience we’re offering the market. What this does is, it helps us quantify how these types of approaches
impact our core metrics, acquisition, time to
value, frequency of use, outcome success, et cetera. Now, we tend to version control these. We typically print them often. We use them as guides to split test a bunch of different scenarios,
always with a control. We then measure how different approaches yield different VME scores,
value, meaning and engagement. What we're doing through this approach is again looking at whether privacy-enhancing approaches produce an uptick or a downtick in the value, meaning and engagement score. Now, if we're doing this
through simple usability tests backed up by maybe some
contextual inquiry, then we tend to use InVision. That gives us the opportunity to do this really cost-effectively. If we're doing this and we're
relying on our existing quant, then we tend to inferentially
analyze the quant and complete one of these VME maps based on that inference. It gives us an inferred state. We then go about validating that inference by running further qualitative sessions or different types of quant methods. Now, if you're interested
in this approach, we actually have a blog
going live on InVision soon that details this in a little bit more depth. It gets really practical. The other thing you could do, most likely if you're in Europe, is check out the Data Transparency Lab's practical workshop series. They're being held in London,
Barcelona and Paris this year. So where do we get to? And what is all of this actually about? Funnily enough, my view is, really it’s not about data. Don’t get me wrong, data
serves a significant purpose. But it shouldn’t be
the primary focus here. My view is that the purpose of technology is really to augment human capability. So it’s really the human outcome
that we’ve gotta focus on. And personal data is just
one of the raw materials that we can use to fulfill the outcomes or help fulfill the outcomes that people care about the most. Now, you’ve probably
heard me reference this a few times today, value,
meaning and engagement, in that exact order. That’s because at Greater Than Ex, our core thesis, kinda
the reason why we exist, is that we believe
experiences should be greater than the sum of their parts. They should be valuable,
they should be meaningful, and they should be engaging. The way to think about value is that it's about efficacy: did the experience we designed fulfill or surpass the expected or desired human outcome? Meaning's all about relative priorities, so how important is it
that this experience that I’m engaged in gives
me this outcome right now? And engagement’s all about fun. Now, we know that our role as designers is to fulfill outcomes. But I really think we can do more. I think we can deliver
value, meaning and engagement to the people consuming
our products and services, and we can absolutely do this when we’re practicing the three rules of designing with data. If we practice these three rules, we are far more likely to
gain access to the right data when we truly need it. This is when data is most valuable, and it’s when we as
organizations are best positioned to deliver personal
experiences to our customers that are valuable,
meaningful and engaging. Now, we’re getting really
close to wrapping up. I’m getting a little bit of a sore throat. I promise to be explicit here. I’m about to hit you with
this DesignTalk CTA. It's not commercial at all, but it's a little bit of a call to arms. So hear me out. Now, I know you can't respond verbally, but shout out in whatever room you're in and disrupt all your colleagues (chuckles) if you've seen Bill Gross's TED Talk on new venture success. I hear heaps of shouting. Look, if you haven't,
let me ruin the surprise and tell you that the research shows that timing trumps all other factors. When it comes to personal
data, the timing is now. But here’s the bigger question. What future do you want to
contribute to the design of? Now, it might be a little opaque, but that image of two eyes is from a scene in Minority Report. It may work out for Tom Cruise in the end, but surely there's a better future than the one articulated in that movie. In that future, people don't have agency. They're controlled by their data. And my belief is that if collectively, as a group of people empowered
to evoke positive change, we design for a future where people have agency over their data, then the future will be
fundamentally better. Right now, there’s a growing
quite active group of people working towards this,
but it is early days. So I’d like to ask you to have a think about what I’ve said over
the last 35, 40 minutes. Conduct your own research,
learn about the market, don’t just take it from me. Then make a decision about what future you’d like to contribute to the design of. – [Margaret] Awesome. Unfortunately, that’s all
the time we have today. I feel like we could have kept talking for another hour or so. But Nathan, thank you so
much for chatting with us and sharing your knowledge. And a big thank you to
everyone who attended today. Have a great rest of your day and keep designing awesome things. – [Nathan] Thanks, Margaret, really appreciate it, guys. Have a good one, bye-bye.
