Jul 30, 2007
Originally posted on Half an Hour, July 30, 2007.
Summary of a talk given by Jonathon Cave to the IFIPTM conference in Moncton.
Jonathon Cave
RAND Europe
What Are You, Who Are You, And How Do You Know?
Interesting to observe differing attitudes toward privacy and security. People have differing motives - some because they think they should, others who think there is something to be gained.
Many of the things we do protect us against risk - but this may be more efficiently managed at the individual level. It's not immediately obvious that the challenges we face today fit into the categories we drew in the past.
Businesses are the people who can most efficiently manage the risks of managing information. But they then become a target, because of the value of that information. They become part of a complex system, and such systems have failure nodes, at the boundary.
Leads us to think that government's role needs to be 'rebalanced'. If we did not have deregulation, it would not be possible to have that conversation.
Different countries have different rules regarding privacy and security. We benefit from that - because of expenses created by compliance in other places.
That said...
1. Tangibles are changing business models
I am a game theorist. Game theory is the idea that you rank things and pick the best. But you can't rank things without knowing how other people look at things. So these rules describe what people do. Then you design systems based on these rules.
Privacy and security run right through economic theory, and this is where game theory comes in. The view of the individual runs right through this analysis. Eg. what people intend to do and what they can do are different things. There are things we cannot predict - risks. There are things we cannot even define - uncertainties.
Eg. CCTV cameras - they can help do things like catch criminals. But also - they push crime indoors. But even more - when I'm being watched, I'm not being trusted. When I'm not trusted, I am less inclined to be trustworthy. So CCTV may contribute to the sort of behaviour they are intended to reduce.
We want to draw out these intangibles, to touch them - this desire makes us very uncomfortable, the way IP did for the content industry.
What does it mean to steal someone's identity? It could mean stealing my stuff. It could mean creating a new identity entirely, without taking anything from me. But that may mean denying me access to my own identity.
A borderless world is very messy (the IT world). We leave traces all the time. There are traces of pretty much everything we did online. One of the things that compromises my identity - prevents me from doing things - is myself in the past. That ability to move off of where we were is an essential part of our identity.
Privacy - we need to have routes to know what is known about us. But it's not just the information - it's the judgments that are made with that information. It's not even just the accuracy of the information - what if judgments are made with only half of the information?
Businesses may be able to deliver better services when they have more information. But if a business has a program that creates a profile of me, the information belongs to the business. So the company will serve me just enough to get that information.
These intangibles become increasingly important to business models. It used to be that transactions were anonymous, but no longer. Now some organizations collect things like names - or even thumbprints - and erase it later. But the important thing is that a business model that used to be about selling fruits and vegetables is now a model about the collection of information.
A lot of this at the moment lives in the realm of corporate social responsibility - part of the halo effect, things that big businesses do because they can afford to do it, that gives them a subtle advantage.
Now we are getting things that we never anticipated. The new regulatory framework (eg. privacy protection) is a result of this. But also - a service is pretending to sell me identity protection. It used to be, I expected my bank to protect my identity. But now it may be more like dread disease insurance, where we pay for our own protection.
2. Privacy, security, trust, etc., are all good things.
But that doesn't mean 'more' is better.
Eg. security cameras on vacant lots - there's nothing to protect.
The fact that my actions are being observed changes responsibility - if you give me too much privacy, I don't worry about responsibility. We see that in the area of anonymity.
It only makes sense to trust some people if (a) they earn that trust, and (b) they can use it to do what they do. So there are only some cases where we trust, say, government.
There is another example of where you can have too much security - but you live closer to that example than I do.
Suppose I trust you - I give you access, etc. - but then if I put cameras on you - then I'm not trusting you. I'm just using you as spare parts. These monitoring things cut at the very heart of trust. If you control me, you're not trusting me.
These things are good things only if we all agree they're good things. If it's better for you than it is for me, then there is a question of whether I give consent. In some cases we are forced to 'give consent' - eg. 'you can refuse to give fingerprints, but can't cross the border'. Some consent. Also, there are cases where we don't know what we are consenting to. Or the conditions may change.
The growth of this public space, and its incursion into our private space, may be an incursion into our right to be left alone. If the state intrudes when you aren't doing anything, then you have become the property of the state (Burke).
3. The Atlantic Perspective
The two sides of the Atlantic have very different views of privacy and security - in a globalized world, this creates a lot of conflict.
There are also differences in the structure of markets. In America, small enterprise is considered the font of innovation. In Europe, they are anything but innovative.
Differences in security issues, regulatory issues, roles, government legislation, etc. etc. etc. and also in public procurement rules. Whoever wins the market for a core technology - eg. biometrics - has won much more than that.
So there are many disagreements - but we can presume we'll get our acts together.
Examples
a. CCTV cameras - monitor everything - they have microphones, to predict fights, face recognition, to catch gang members - but this results in people - all people - wearing hoodies. Automatic number plate recognition - they know where you've gone.
b. What hoodies and hijabs mean - they mean that my identity belongs to me. If people feel threatened, they withdraw. And if they withdraw, there will be a reaction to it. Is the withholding of identity reasonable grounds for denying people a public life - holding jobs, etc.
c. Biometrics - iris samples can be re-issued. But there are cultural barriers to using the iris. But the key point is - there are types of errors: false acceptance, false rejection, and the type 3 error, the right solution to the wrong problem. If we think biometrics protect us against identity theft, well, they don't - they identify only the physical person - but frequently the physical person isn't important.
d. DNA is another example of this. You go into the DNA database if you are drawn to the attention of the police. But it tells much more than just identity - it tells kinship, health issues, etc.
e. Data-mashing. Google maps is a benign example of this. If you mash data, you can violate privacy without even identifying someone.
f. Loyalty cards and commercial profiling.
g. Virtual worlds - to some extent, we are all public figures in virtual worlds. We have certain privacy rights - they may be very limited in some cases (children, criminals, politicians) - but on the internet we're all public. One compartment of our identity may compromise another compartment of our identity. The rules become very different. I don't know what the rules are in Second Life - but I do know they're making a lot of money, collecting a lot of information.
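The error types in example (c) can be made concrete. A minimal sketch - with made-up matcher scores and an illustrative threshold, not figures from the talk - showing how raising the acceptance threshold trades false acceptance against false rejection:

```python
# Hypothetical similarity scores (higher = more alike). Made-up data for illustration.
impostor_scores = [0.10, 0.25, 0.30, 0.45, 0.55, 0.60]  # different people compared
genuine_scores  = [0.40, 0.65, 0.70, 0.80, 0.90, 0.95]  # same person compared

def error_rates(threshold):
    """False acceptance: impostor scores at/above the threshold.
       False rejection: genuine scores below the threshold."""
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return far, frr

for t in (0.3, 0.5, 0.7):
    far, frr = error_rates(t)
    print(f"threshold={t}: FAR={far:.2f}, FRR={frr:.2f}")
    # eg. threshold=0.5 gives FAR=0.33, FRR=0.17 on this toy data
```

Tightening the threshold drives false acceptance down and false rejection up; neither error type touches the type 3 error the talk warns about - verifying the wrong thing perfectly.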
4. Intangibles
There is no necessary contradiction between privacy and security.
Networks amount to the links between people. Game theorists look at links as decisions we have taken. We have:
- people who are careful about privacy
- people who are careless
- people who are opportunistic
We can have
- a high degree of security, because the customer is careful
- but if the business, or the customer, is careless, then there is not a high degree of security
So what state does a network of such states settle down to? It doesn't necessarily settle to the most beneficial state.
Things - like insurance policies - don't force the outcome, but allow it to settle to the optimal state.
But also - we don't have just one system - we have a lot of small worlds. If I join eBay, eg., my behaviour changes.
The way in which I respond depends on the likelihood of an attack, and how much I care about the other people in the network. The system doesn't smoothly adjust - the threat varies depending on how careful people are, and how much the system is being used.
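The settling-to-a-bad-state point can be sketched in game-theoretic terms. This is a toy interdependent-security model - all payoff numbers are assumptions for illustration, not from the talk - in which protection blocks direct attack but not contagion from careless neighbours, so a network of careless players stays careless even though everyone-careful pays better:

```python
# Toy interdependent-security game (assumed numbers, for illustration only).
# Each of N players invests in protection (1) or not (0). Investing blocks
# direct attack (prob P) but not contagion from careless neighbours (prob Q each).
N, COST, LOSS, P, Q = 5, 3.0, 10.0, 0.5, 0.4

def expected_payoff(invests, careless_neighbours):
    p_contagion = 1 - (1 - Q) ** careless_neighbours
    p_direct = 0.0 if invests else P
    p_breach = 1 - (1 - p_direct) * (1 - p_contagion)
    return -(COST if invests else 0.0) - LOSS * p_breach

def settle(state):
    """Best-response dynamics: players keep switching while the other choice pays better."""
    changed = True
    while changed:
        changed = False
        for i in range(N):
            careless = (N - 1) - (sum(state) - state[i])
            better = int(expected_payoff(1, careless) > expected_payoff(0, careless))
            if better != state[i]:
                state[i], changed = better, True
    return state

print(settle([0] * N))  # an all-careless network stays careless
print(settle([1] * N))  # an all-careful network stays careful - a better equilibrium
```

With these numbers, protection isn't worth buying when everyone around you is careless (contagion gets you anyway), so the careless state is self-reinforcing - which is where instruments like insurance come in, nudging the network toward the better equilibrium rather than forcing it.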
5. Markets
Network effects and interoperability - we can get excess inertia - perfectly good ideas might never be adopted, other ideas may be rapidly adopted.
We have a system with a soft centre and hard boundaries around the outside - breaking the boundaries becomes high value - this is a system of brittleness.
What we see in a lot of IT worlds - we know we have to give up information, and incur risks - eg. people who download software - and others are scared of those risks, and create private domains. That might be OK - but the whole point is that we gain from being connected to each other.
This splitting apart - and what it means for things where we have to pay - like education and health care - is concerning.
6. A Warning from History
Business, government and civil society have very different perspectives.
Events have a disproportionate influence because of this. The different agendas are not necessarily consistent.
The challenge to business is to embrace these issues (not to describe them into nothingness).
We can see possibilities of:
- high security - high privacy (the academic publishing world, eg)
- high security - low privacy (the surveillance society)
- low security - high privacy (walled gardens)
- low security - low privacy
Security and privacy - are both states of mind. These choices are ours to make.