Behavioural Economics as a tool for enterprise security

Have you ever wondered why on your electricity bill there is a representation of your household’s usage against the average 2, 3 or 4-person household telling you whether you are over or under? How does it make you feel?

The term behavioural economics has been around for maybe two decades. The marketing profession has been using the techniques it describes for even longer to get you to buy their brand. However, the use of behavioural economics as a tool for enterprise security is just emerging.

It is time for security professionals to start using these techniques to help protect organisations, and not just to influence people to buy a particular soap or car, or to follow a particular sporting code.

What is behavioural economics?

Behavioural economics looks at the relationship between the decisions that we make and the psychological and social factors that influence them. A significant amount of study in this area has been on people’s economic decisions, but the tools and techniques that have been tested can be applied in many other contexts.

Daniel Kahneman and his late research partner Amos Tversky are the two research psychologists most closely associated with behavioural economics. In 2002, Kahneman shared the Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel for this work. Kahneman’s 2011 book “Thinking, Fast and Slow” explains many of the concepts in accessible terms. Kahneman and Tversky built on earlier studies that undermined an idea which now sounds quaint: that humans, at the population or large-group level, act entirely rationally. Even so, this idea was at the heart of much classical economic thinking.

At first glance this might not seem related to enterprise security. However, if you consider that the premise of behavioural economics is that people do not always make entirely rational decisions, the connection becomes clear. In addition, the idea that small (and sometimes even intangible) incentives and disincentives can be used to guide individual actions on a large scale is very important. It is this second aspect which is of greatest use to the enterprise security practitioner.

Behaviour is at the heart of enterprise security, because people are every organisation’s greatest asset and often also their greatest risk. At its simplest, the key aim of good enterprise security is ensuring that individuals are encouraged to make the right decisions that benefit their organisation.

Behavioural economics works by assuming that in many cases, people making the ‘wrong’ decision within an organisation do so because they have imperfect information or lack the right incentives or disincentives.

Psychologists have also found that people often exhibit a strong inclination to conform to social norms, and those norms change with the social groups we participate in. Essentially, we often do things because our friends, colleagues, or those we admire, do them. Our friends and colleagues provide us with informational social influence, or social proof. In plain English, we like to follow our herd and keep up with the Joneses.

Curiously though, we seem to struggle more with changing our minds than with coming to a decision in the first place. The idea that when the facts change, people change their minds is a tricky one for many. Relatedly, researchers from Harvard Business School have claimed that we tend to think we are more moral than we actually are and inhabit an “ethical mirage”. This can mean there is a disconnect between how we describe our decisions and how we actually behave. If we accept this somewhat unflattering portrait of human behaviour, it means that once we have made a decision, we tend to take a position that justifies our actions, whatever they were. And we want more justification to change our minds than we needed to come to the decision in the first place!

But what if we could get people to make the ‘right’ decision in the first place? Then they would not have to justify wrong decisions. This is where the research findings of behavioural economics are being tested at organisational and national scale.

Behavioural economics concepts are being applied at the public policy level by governments wanting to encourage certain behaviour without going to the expense of legislating compliance. It is expensive to make something illegal. Sometimes it is absolutely necessary, murder being the obvious example, but society then has to create enforcement systems, pay the enforcers, and work out who watches the watchers. Some enlightened government agencies are instead experimenting with behavioural economics to achieve high levels of compliance.

In the UK, and latterly also in Australia, the tax authorities have been using behavioural economics techniques. So-called ‘nudge units’ have been set up to coax people into doing their taxes using social proof methods. Informing taxpayers who are late paying that “90% of people pay their taxes on time” increases the rate of taxpayer compliance. This achieves the policy objective of timely tax payments in a way that does not generate negative headlines, and allows the tax agency to focus on individuals who are intentionally breaking the law, rather than those for whom life simply got in the way.

Another recent example has been the introduction of the “No Jab, No Pay” policy by the Australian Government where parents do not get all their family tax benefits unless they are willing to vaccinate their children. Rather than making it illegal for children to remain unvaccinated, the government has incentivised parents to vaccinate. This, added to significant social pressure from almost all the medical community, means that Australia’s childhood vaccination rates are generally very high and we see fewer distressing pictures of children with whooping cough around the country.

One interesting way that companies are using social proof is in encouraging households to save water and electricity. Increasingly, utility bills show householders where they stand in comparison to their suburb in terms of water or electricity use. The householder can then consider whether to moderate their behaviour. Literally keeping up with the Joneses!

Marketing firms use many behavioural economics techniques to encourage us to use particular products. Many of us take advantage of airline frequent flyer programs that reward members for the flights they take. The extremely successful travel website TripAdvisor awards points to its users for the travel reviews they produce. However, TripAdvisor points have absolutely no dollar value. They are valuable only as social proof to that community that a member is a well-seasoned traveller. You may have realised that the majority of social media operates in a similar way.

Why should enterprise security professionals consider using behavioural economics in their organisation?

It is expensive and time-consuming to maintain rules for the increasingly complex environment that organisations operate in. Rules are difficult to write well and often only work in limited circumstances. The more detail, the more exceptions need to be built in. Quite often rules also create a culture where individuals follow only the letter, not the spirit, of the rules. This can contribute to a workplace which is not adaptable and where security is blamed for the problems of the organisation.

This can lead to situations where workers sometimes choose to circumvent organisational rules in order to achieve local goals. A worker might shortcut a process to ensure that their team are able to complete it faster. The individual might rationalise this as being good for their company in that the job is completed faster and good for themselves in that they can go home earlier. However, the decision that they have rationally come to might be the ‘wrong’ decision from the perspective of their organisation. The shortcuts that have been introduced may decrease organisational security.

How do organisations change this? By changing the decision equation the worker faces at the moment of choice. This is very much the place of behavioural economics in enterprise security. Organisational messaging which demonstrates the organisation’s social norms from a security perspective is vital. So too are tools and procedures which endeavour, wherever possible, to make the secure decision the easiest one to make.

In many ways the decision is very much linked to the ‘security culture’ of the organisation. The security culture is effectively the customs and practices of the organisation for whom the individual works.

Organisations are increasingly moving to principles and risk based frameworks in many areas including security because they find the sheer complexity of business overwhelming otherwise. This was one of the main drivers for the creation of the Australian Government’s Protective Security Policy Framework. The PSPF tries to get government agencies to focus on their security outcomes, rather than on process.

 

Brain scan of white matter fibers, brainstem and above.  Laboratory of Neuro Imaging at UCLA and Martinos Center for Biomedical Imaging at MGH / www.humanconnectomeproject.org
Enterprise security professionals should be asking where they can apply these behavioural economics techniques in their organisations. The possibilities are many and varied, but one financial institution has used behavioural economics to give staff nudges regarding personnel security. In one case, it improved the reporting of changes of circumstances by giving staff a simple message that “most people in our organisation report their change of personal circumstances within four weeks”.

In the government space, there has been debate about whether it is possible to create an ‘information classification market’ which balances the need to classify information appropriately against the costs to the organisation of over-classification, in terms of long-term storage and the devaluation of security markings. Such a market could work by incentivising managers to ensure that staff classify information as accurately as possible. As always, the trick would be to ensure that the incentives matched the risk profile of the organisation.

Every organisation is different and so are the opportunities for using these techniques to improve your enterprise security.

For more information:

http://www.nobelprize.org/nobel_prizes/economic-sciences/laureates/2002/advanced-economicsciences2002.pdf

http://theconversation.com/the-potential-of-behavioural-economics-beyond-the-nudge-43535

http://www.immunise.health.gov.au/internet/immunise/publishing.nsf/Content/clinical-updates-and-news/$File/Update-No-Jab-No-Pay-Immunisation-Catch-Up-Arrangements(D15-1126865).pdf

www.hbs.edu/faculty/Publication%20Files/08-012.pdf

https://en.wikipedia.org/wiki/Social_proof

 

An earlier version of this article appeared in the 100th edition of Security Solutions magazine: http://www.securitysolutionsmagazine.biz/

Information Security for health practitioners

Is it possible for health practitioners to achieve information security? Maybe a better question is “How can health professionals balance privacy, information security and accessibility in an online world?” Or even: should the medical profession be bothered with keeping private and sensitive information secure?

Over the last few months, I’ve been working with a number of health practitioners to help them improve their information security. Much of this has been done with a view to the introduction of electronic health records.

I sympathise with hospital administrators, doctors and nurses. They don’t have a lot of time to think about security and privacy. However, the fact is that they have to do better.

[Image: Monash University surgery clinic, 2012]

Criminals follow the money

According to the Australian Institute of Health and Welfare, the health system costs just under 10% of Australia’s GDP (AUD 121.4 billion in 2009-10). In the US, it is around 18% (USD 2.6 trillion in 2010, according to the CDC). With this much money involved, the health system presents a fat series of targets for cyber attack and fraud.

Terrorist vector? Probably not.

The Department of Homeland Security has even gone so far as to suggest that the health system could be targeted by terrorists and activists in the USA. I am not convinced by this or similar suggestions, as the number one aim of terrorists remains to create terror. Terrorists understand this and seek targets and methods accordingly. It matters less how few people a terrorist kills; it is more important for the terrorists that their audience can clearly see a hard link between cause (terrorist attack) and effect (death, destruction and so on). The murder of a single UK soldier in May 2013, by allegedly Al-Qaeda inspired terrorists armed with machetes, created significant community angst, not only in the UK where it occurred but also in Australia, Canada and the USA. Yet it is likely that more people died on that same day on London’s roads. My point is this: if terrorists discovered some way of causing significant death or maiming through medical equipment, I do not doubt that they would use it. However, the effect on the collective public consciousness would likely not be as great as that of the machete attack mentioned above.

However, we must accept that it is possible, if not altogether probable. One identified flaw is the chronic inability of many health systems to patch their software and applications.

One high-consequence scenario involves hackers attacking defibrillators and insulin delivery systems remotely. I think this comes into the unlikely-but-possible category. The Shodan search engine has been used by a hacker to access the controls of a blood glucose monitor connected to the Internet by WiFi.

Whilst we can probably discount the terrorist threat to some extent, I can imagine the attraction of such attacks as assassination vectors or for the installation of ransomware. Thus the high-consequence threat from foreign governments and organised crime can’t be so easily discounted.

Beyond the extreme, privacy compromise and fraud

Beyond these extreme events, there is the possibility that patient or staff privacy can be compromised by weak information security. Dr Avi Rubin, a computer scientist and technical director of the Information Security Institute at Johns Hopkins University, has been quoted as saying of the US health system: “If our financial industry regarded security the way the health-care sector does, I would stuff my cash in a mattress under my bed.” Unfortunately, it is not possible to hide one’s health under the mattress!

I experienced this personally about a week ago, when my daughter’s optometrist sent through the results of her recent eye test. Only it wasn’t her results: the attached data was for somebody I had never met.

We have a tendency to compare the worst-case scenario of e-health privacy with the best-case scenario of the current system. As my example above shows, the current system is far from best case.

Good information security will also help protect healthcare organisations from fraud. Fraud is estimated to be a USD 60 billion impost on US hospitals. Methods being used by fraudsters include:

  • diversion of fee revenue;
  • diversion of controlled items (e.g. drugs);
  • collusion with suppliers; and
  • diversion of accounts receivable.

The same methods are being used in medical practices, albeit on a smaller scale.

What to do

A holistic approach is needed. We have worked with a number of medical practices to implement the key elements of the ISO 27000 family of information security standards. This ensures that the practice has a risk-based approach which mitigates threats based on real-world experience of consequence and likelihood. Working with practice owners and stakeholders, we determine their tolerance for information risk and implement controls which make sense and meet any regulatory requirements.

If you think this is something your organisation needs, please contact us at [email protected]

Information Declassification – A way for governments to save money and improve their information security

In the digital world it is very easy to create data, and very difficult to get rid of it

Like the rest of us, government agencies are creating huge amounts of information. Much of it is classified, either to protect privacy or for national security. This is as it should be: classification is an important aspect of information security.

What is data classification?

It is the process of assigning a business impact level to a piece of data or a system, which then governs how many resources are devoted to its protection. By classifying documents and systems, an organisation makes risk-managed decisions about how information is protected.
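As a concrete sketch of the idea, a classification scheme can be thought of as a mapping from business impact level to minimum handling controls. The levels and control values below are illustrative assumptions only, not drawn from any particular government framework.

```python
# Hypothetical mapping from business impact level to minimum handling
# controls. Levels and control values are illustrative assumptions only.
HANDLING_CONTROLS = {
    "low":    {"encrypt_at_rest": False, "access_review_days": 365, "storage": "standard"},
    "medium": {"encrypt_at_rest": True,  "access_review_days": 90,  "storage": "standard"},
    "high":   {"encrypt_at_rest": True,  "access_review_days": 30,  "storage": "hardened"},
}

def controls_for(impact_level: str) -> dict:
    """Return the minimum controls a classified item must receive."""
    return HANDLING_CONTROLS[impact_level]

# A high-impact document attracts the most expensive protections.
print(controls_for("high")["storage"])  # hardened
```

The point of making the mapping explicit is that every step up in impact level has a visible resource cost attached to it, which is exactly the cost that the rest of this article argues classifiers should be made to feel.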

Graphic by Mark Smiciklas, Flickr.com/photos/intersectionconsulting

Digital data wants to be free and it is expensive to ensure confidentiality if you also want to maintain data integrity and availability.

However, over-classification of information can be as bad for an organisation as under-classification. This is particularly true of large government organisations.

In addition, Government agencies tend to be risk averse places anyway – which on balance is a good thing!

So how could governments shift the classification balance, improve security and improve efficiency in agencies?

The problem is that the person who classifies data or systems does not have to pay for the cost of their classification decisions. In fact, the individual avoids personal risk if a piece of data is over-classified, while their agency has to wear the added expense.

Gentle readers, we have a problem of incentive imbalance!

Suppose it costs $100 to store a Secret document for its lifetime and $10 to store an everyday unclassified document. If governments placed a nominal value on document creation relative to the whole of life costs, it might be possible to stem the tide of increasing amounts of classified data.

If, under this scheme, a government employee wished to create a secret document, they would need to find $100 in their budget to do so. The employee might then consider producing an unclassified document, or one with a lower classification, instead. I argue that this market-based approach to declassification would have far more effect than more rules.

A plan for implementation

So how might the plan be implemented in the tight fiscal environment that government agencies currently face, even though it is likely to save money long term?

  1. Survey government agencies to see how many classified pieces of data they produce each year, by type. For example, there might be 500 top secret data pieces and 1000 secret.
  2. Assign a dollar value to each document according to the level of protection it receives. This step would require some research, or possibly a pilot scheme.
  3. Based on the previous year’s classified information output, give each agency a declassification budget. It might be considered that, as this task is one the agency should have been doing anyway, there is no requirement for central funding.
  4. Require each agency to report the number of classified data pieces produced.
  5. Agencies that produce too many classified documents pay the Treasury a fine equivalent to the cost of storing the extra documents in archives.
  6. Agencies that produce fewer pieces of data than the previous year receive a windfall.
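The incentive arithmetic behind the budget, fine and windfall steps can be sketched in a few lines of code. The storage costs and document counts below are purely illustrative assumptions, not real costings.

```python
# Illustrative sketch of the declassification budget scheme.
# All costs and document counts are hypothetical assumptions.

STORAGE_COST = {  # whole-of-life storage cost per document (assumed)
    "unclassified": 10,
    "secret": 100,
    "top_secret": 500,
}

def classification_budget(last_year_output):
    """Budget = cost of storing last year's classified output."""
    return sum(STORAGE_COST[level] * count
               for level, count in last_year_output.items())

def settle(budget, this_year_output):
    """Positive result = windfall to the agency; negative = fine to Treasury."""
    spent = classification_budget(this_year_output)
    return budget - spent

# An agency that produced 1000 secret and 500 top secret documents last year
# gets a budget based on that output...
budget = classification_budget({"secret": 1000, "top_secret": 500})
# ...and by cutting its secret output to 800 this year, earns a windfall.
outcome = settle(budget, {"secret": 800, "top_secret": 500})
print(budget, outcome)  # 350000 20000
```

The key design choice is that the budget is denominated in the same units as the whole-of-life storage cost, so the classifier finally feels the expense that, today, only the agency’s archives wear.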

That’s it in a nutshell. As governments produce more data, they will need to store it. Balancing the incentives to overclassify and underclassify data will help ensure that information is properly protected.

I’d love to hear your ideas, please make a comment

Alex

 

 

Security is your business 2

I was at the launch last Thursday of ‘Security is your business 2’. If you are interested in, or responsible for, Enterprise Risk Management on a practical level, then this DVD will help your organisation.

The DVD includes interviews with Australian and overseas (mostly UK) security literati talking about a number of issues related to ERM. It builds upon the well-regarded ‘Security is your business’ but stands alone.

Apart from the fact that I know and respect a number of the talking heads on the DVD, I have no association with the enterprise.

More information from

http://www.securityisyourbusiness.com/Security_Is_Your_Business/Home.html

 

Pizza Hut Australia got hacked – what they did right!

Last week Pizza Hut Australia admitted that its cyber-defences had been breached and that the attackers got hold of customers’ names and addresses. Technically, it seems that Pizza Hut itself wasn’t breached; the provider that hosts its website was.

From a privacy perspective, it’s time to use the ‘pub’ test. That is: how unhappy would you be if the world knew that you liked the meatlover’s supreme with extra cheese and lived at 32 Rosegardens Road, Morphett Vale? Not very, I suspect. The important thing is that the company claims credit card details were not stolen.


My sources tell me that the hackers didn’t get credit card details because this information is held in a separate and better-protected database by a specialised payment gateway. I hope they’re right!

There are a couple of important lessons for organisations to learn from this breach. Firstly, by developing granular controls that separate data by its value and what it is used for, organisations like Pizza Hut can develop security protocols that give the best mix of data availability and confidentiality. For example, far more parts of the business benefit from knowing where customers live and who they are than need credit card details. If the data isn’t separated, the organisation can’t make the best use of it and ensure security at the same time; it has to choose one or the other. With granular controls, the marketing department can use addresses and telephone numbers to plan promotions, and the planning department can work out where to open the next store, without either needing to know credit card details.
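One way to picture this kind of granular separation is role-based access to separated data stores. The roles, records and stores below are hypothetical illustrations, not a description of Pizza Hut’s actual systems.

```python
# Hypothetical sketch: customer contact data and payment data live in
# separate stores, and each role can only reach the stores it needs.

customer_db = {  # widely useful, lower-sensitivity data
    101: {"name": "A. Customer", "address": "32 Rosegardens Rd"},
}
payment_gateway_db = {  # high-sensitivity data, held separately
    101: {"card": "tokenised-ref-8f3a"},
}

ROLE_ACCESS = {
    "marketing": [customer_db],
    "planning": [customer_db],
    "billing": [customer_db, payment_gateway_db],
}

def lookup(role, customer_id):
    """Return only the records the role's permitted stores contain."""
    records = {}
    for store in ROLE_ACCESS.get(role, []):
        records.update(store.get(customer_id, {}))
    return records

print(lookup("marketing", 101))  # name and address only, no card details
```

Under this arrangement, a compromise of the widely shared customer store exposes names and addresses but not payment details, which is broadly the outcome Pizza Hut’s breach appears to demonstrate.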

The other point is about risk transfer. Although transferring risk to a third party is an acceptable mitigation according to the risk management standard ISO 31000, organisational reputation can’t be transferred. If your company wants to keep its good name when it gets hacked, it needs to have thought about recovery and restoration. Blaming the web provider won’t cut it with customers if your organisation is anything bigger than the local fish and chippery. Generally, the larger the company, the bigger the reputation at stake; even more so for government.

There has been a gradual, but definite change in the way that cybersecurity professionals talk about breaches. Until around 2001, people talked about the possibility of being breached online. Now this has changed from ‘if you get breached’ to ‘when you get breached’.

Essentially, if information is available on Internet-facing systems, then it is more a matter of time and luck as to when your system gets done over. This is something security professionals need to communicate to the senior management of the organisation.

For Pizza Hut, this recent event will probably contribute to its longevity and improve its resilience. Research is showing that organisations that undergo small shocks are more ready for the black swans of the future.

However, they should not rest on their laurels. In the aftermath of any breach, an organisation needs to examine how to reduce the risk of further breaches. Some of the questions I ask in such situations are (in no particular order):

  • Does the organisation need to think further about the balance between the confidentiality of personal information and its availability to Internet-facing systems?
  • From a marketing and public relations perspective, is the organisation talking to its customers to show that it takes their personal information seriously?
  • What changes does the organisation need to make in terms of digital evidence gathering, and was it adequate to deter future attacks? In the long term, the rule of law is the only way to reduce the power of the attackers.
  • Did the organisation understand how to respond to the breach, and does its response need regular exercising?
  • Was there an agreed direction from senior management in the event of a breach, so that the technical staff could ‘get on with the job’ as quickly as possible?
  • Are the relationships with service providers adequate, and were the levels of service and the measures taken to recover sufficient?

It is important to recognise that the best value gains for the organisation come not from IT changes like forensics, but from business process rearrangement.

In cyberspace, if you don’t share, you don’t survive

This might seem a brave call when talking about cyber-security threat information. But the truth is that the cyber world forces a new security paradigm. Tools familiar in the offline world for providing elements of security, such as obscurity, tend to benefit attackers rather than defenders, because the very advantages of the online world, such as search and constant availability, are also its greatest weaknesses. What matters most online is not what you know, but how fast you know it and make use of it.

I’ve been reading the Cyber Security Task Force: Public-Private Information Sharing report, and I think its worth promoting what it says. It presents a call to action for government and companies in the US to improve information-sharing to prevent the increasing risks from cyber attacks on organisations, both public and private. The work was clearly done with a view to helping the passage of legislation being proposed in the USA, however..

 Most, if not all the findings made could be extrapolated to every advanced democracy around the world. 

 If you are familiar with this field, much of what has been written will not be new, as we have been calling for the sorts of measures that are proposed in the report since at least 2002. That does not mean that the authors haven’t made a valuable contribution, because they have made recommendations about how to solve the problem. Specifically they recommend removing legislative impediments to sharing whilst maintaining protections on personal information.

According to the authors, from October 2011 through February 2012, over 50,000 cyber attacks on private and government networks were reported to the Department of Homeland Security (DHS), with 86 of those attacks taking place on critical infrastructure networks. As they rightly point out, the number reported represents a tiny fraction of the total occurrences.

As is the case in many areas of security, the lack of an evidence base is at the core of the problem, because it creates a cycle of resistance to the change and adaptation needed to fix the problems efficiently and effectively.

Of course, the other thing that happens is that organisations don’t maintain an even level of focus or resourcing on the problem because, most of the time, like an iceberg, the part of the problem that you can ‘see’ is comparatively small.

To make matters worse, research is telling us that we are optimistically biased when making predictions about the future; people generally underestimate the likelihood of negative events. So, without hard data, and given the choice between underestimating the size of a problem and overestimating it, the humans who make decisions in organisations and governments are likely to underestimate the likelihood of bad things happening. You can find out more about the optimism bias in a talk by Tali Sharot on TED.com.

The cost differential for organisations that don’t build in cyber security, cannot mitigate risks, and then need to recover from cyber attacks is significant. This cost is felt most by the organisations affected, but its effects ripple across an economy.

So what can be done to break this cycle of complacency? Government and industry experts have long spoken about the need for better sharing of information about cyberthreats. I was talking in public fora about this ten years ago.

The devil is in the detail of the ‘what’ and the ‘how’. Inside the ‘what’ is also the ‘who’. I’ll explain below.

What should be shared, who should do the sharing – and with whom?

Both government and industry enthusiastically agree that there should be sharing, but each thinks the other party should be doing more of it, and then comes up with any number of excuses as to why it can’t! For fans of iconic 1980s TV, it recalls the Yes Minister episode in which the PM wants a policy of promoting women, and in cabinet each minister enthusiastically agrees that it should be done whilst explaining why it wouldn’t be possible in his department. In government, the spooks will tell you they have ‘concerns’ with sharing; that is, they want to spy on other countries and don’t want to give up any potential tools. It’s no better in industry: companies don’t have an incentive to share specific data, because their competitors might gain some kind of advantage.

The UK has developed perhaps the most mature approach. UK organisations have been subject to a number of significant cyber attacks, and government officials attempt to ‘share what is shareable’. This may be possible because of the close relationship between the UK government and industry, developed initially during the Troubles in Northern Ireland and maintained in one form or another through the terrorism crises of this century. It remains to be seen whether the government can maintain these relationships, and whether UK industry will continue to see value in them, as the UK and Europe struggle with the short-termism brought on by the fiscal situation.

Australia has also attempted to share what is shareable; however, as the government computer emergency response team sits directly within a department of state, this is very difficult. It seems that the CERT does not have a clear mission: is it an arbiter of cyber policy and a disseminator of information, or an operational organisation that facilitates information exchange on cyber issues between government and industry?

This quandary has not been completely solved by any G20 country. Indeed, it will never be solved; it is a journey without end. New Zealand has possibly come closest, but this seems to be because of the country’s small size and the ability to develop individual relationships between key people in industry and government. Another country doing reasonably well is South Korea, mainly because it has to: it has the greatest percentage of broadband users of any country, and North Korea is just a telephone line away. The Korea Internet & Security Agency (KISA) brings together industry development, Internet policy, personal information protection, government security, and incident prevention and response under one umbrella.

For larger countries, I am of the view that a national CERT should be a quasi-government organisation that is controlled by a board comprised of:

  • companies that are subject to attack (including critical infrastructure);
  • network providers;
  • government security agencies; and
  • government policy agencies.

In this way, the CERT would strive to serve the country more fully. Government would have more incentive to share information with industry, and industry more incentive to share information with government. With this template, it is possible to create a national cyber-defence strategy that benefits all parts of society and provides defence-in-depth to those parts of the community on which we are most dependent, ie the critical infrastructure and government.

Ensuring two-way information flow within the broader community and with industry has the potential to provide direct benefits for national cyber-security and for the community more broadly. Firstly, by helping business and the community to protect themselves. Secondly, by allowing government, telecommunications providers and the critical infrastructure to develop sentinel systems in the community, which, like the proverbial canary in the coalmine, signal danger if they are compromised. Thirdly, by improving the evidence base through increased quality and quantity of incident reporting – something that is so often overlooked.

Governments can easily encourage two-way communication by ‘sharing first’. Industry often questions the value of information exchanges: people turn up to these events at their own expense, some government bigwig opens proceedings by declaring ‘let there be sharing’, and then there is silence, because the operatives from the three-letter security agencies don’t have the seniority to share anything and the senior ones don’t understand the technical issues. I am not the first person to say that in many cases (I think 90%), technical details that can assist organisations to protect their networks do not need to include the sensitive ‘sources and methods’ discussion. By that I mean: if a trust relationship exists, or is developed, between organisations in government and industry, and one party passes a piece of information to the other and says “Do x and be protected from y damage”, then the likelihood that the receiving party will undertake the action depends on how much they trust the provider. Sources and methods information is useful for determining trustworthiness, but it is not usually essential to fixing the problem.

As the Public-Private Information Sharing report suggests, many of the complex discussions about information classification and over-classification, and about national security clearances, can be left behind. Don’t get me wrong: having developed the Australian Government’s current protective security policy framework, I think there is a vital place for security clearances and information classification. However, both are vastly over-rated in a speed-of-light environment where the winner is not the side with the most information, but the side that can operationalise it most quickly and effectively. Security clearances and information classification get in the way of this and potentially deliver more benefit to the enemy by stopping the good guys from getting the right information in time. We come back to the question of balancing confidentiality, integrity and availability – sensitive information is more perishable than ever.

How should cyber threat information be shared?

This brings me to the next area of concern. There is also a problem with how information is shared between industry and government, or more importantly the speed with which it is shared. In an era when cyber attacks are automated, defence systems are still primarily manual and, amazingly, in some cases rely on paper-based systems to exchange threat signatures. There is an opportunity for national CERTs to significantly improve the current systems by sharing unclassified threat information automatically. Ideally, these systems would be designed so that threat information flows out from the national CERT to organisations, and information about possible data breaches returns immediately to be analysed.
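As a rough sketch of what machine-speed sharing could look like, the snippet below packages unclassified indicators into a machine-readable feed. The record format, field names and example values are illustrative assumptions of mine, not any CERT’s actual schema; a real exchange would use an agreed standard such as STIX, but the principle is the same: structured data that needs no human re-keying at either end.

```python
import json
from datetime import datetime, timezone

def make_indicator(indicator_type, value, action):
    """Package one unclassified threat indicator for distribution.

    Field names here are hypothetical; a production feed would follow
    a published standard (e.g. STIX) rather than this ad hoc shape.
    """
    return {
        "type": indicator_type,        # e.g. "domain", "ip", "file-hash"
        "value": value,
        "recommended_action": action,  # the "do x, be protected from y" advice
        "issued": datetime.now(timezone.utc).isoformat(),
    }

def serialise_feed(indicators):
    """Bundle indicators into a feed a national CERT could push to subscribers."""
    return json.dumps({"feed": indicators}, indent=2)

feed = serialise_feed([
    make_indicator("domain", "bad.example.com", "block at DNS resolver"),
    make_indicator("file-hash", "d41d8cd98f00b204e9800998ecf8427e",
                   "quarantine on detection"),
])
```

Because the feed is plain structured data, the receiving organisation can load it straight into blocking or detection tooling, and the same channel can carry breach reports back the other way.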

Of course, the other benefit of well-designed automated systems is that they could automatically strip customers’ private information out of any communications. As with the sources and methods information, people’s details are not important (spear phishing being an exception). In most cases, I’d rather have a machine automatically removing my private details than some representative of my ‘friendly’ telecommunications provider or other organisation.
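A minimal sketch of that machine-side scrubbing, using pattern matching, might look like the following. The patterns are simplifying assumptions for illustration (and a real system would need far more robust detection, plus care not to redact attacker infrastructure that is itself the indicator):

```python
import re

# Illustrative patterns only: a rough email matcher and a 10-12 digit
# phone-number matcher allowing spaces or hyphens between digits.
PII_PATTERNS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "[EMAIL REDACTED]"),
    (re.compile(r"\b(?:\d[ -]?){9,11}\d\b"), "[PHONE REDACTED]"),
]

def scrub(text):
    """Remove likely customer identifiers before a report leaves the network."""
    for pattern, replacement in PII_PATTERNS:
        text = pattern.sub(replacement, text)
    return text

report = "Phish received by jane.doe@corp.example, callback number 0412 345 678"
cleaned = scrub(report)
```

The point is not the specific patterns but the placement: redaction happens mechanically, before any human at the provider or the CERT sees the report.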

These things are all technically possible; the impediments are only organisational. Isn’t it funny: people are inherently optimistic, yet they don’t trust each other. It’s surprising civilisation has got this far.

References

1. Sharot, T; Guitart-Masip, M; Korn, C; Chowdhury, R; Dolan, R. “How Dopamine Enhances an Optimism Bias in Humans”. Current Biology, July 2012. www.cell.com

2. Yes Minister, Series 3, Episode 1 – “Equal Opportunities”, 1982