
Tag: Cloud

Cloud computing is a kind of Internet-based computing that provides shared processing resources and data to computers and other devices on demand. It is a model for enabling ubiquitous, on-demand access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications and services), which can be rapidly provisioned and released with minimal management effort. Cloud computing and storage solutions provide users and enterprises with various capabilities to store and process their data in third-party data centers. It relies on sharing of resources to achieve coherence and economy of scale, similar to a utility (like the electricity grid) over a network.

A common understanding of “cloud computing” is continuously evolving, and the terminology and concepts used to define it often need clarifying. Press coverage can be vague or may not fully capture what cloud computing entails or represents, sometimes reporting how companies are making their solutions available in the “cloud” or how “cloud computing” is the way forward, but not examining the characteristics, models, and services involved in understanding what cloud computing is and what it can become.

Advocates claim that cloud computing allows companies to avoid upfront infrastructure costs and focus on projects that differentiate their businesses instead of on infrastructure.[5] Proponents also claim that cloud computing allows enterprises to get their applications up and running faster, with improved manageability and less maintenance, and enables IT to adjust resources more rapidly to meet fluctuating and unpredictable business demand. Cloud providers typically use a “pay as you go” model, which can lead to unexpectedly high charges if administrators do not adapt to the cloud pricing model.
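To make that pricing risk concrete, here is a back-of-the-envelope sketch in Python; the hourly rate and fleet size are assumptions chosen for illustration, not any provider's real prices:

```python
# Assumed figures: $0.10/hour per instance, 100 instances forgotten for a month.
HOURLY_RATE = 0.10   # assumed price per instance-hour, in dollars
INSTANCES = 100      # assumed fleet size
HOURS = 24 * 30      # one month of wall-clock time

bill = HOURLY_RATE * INSTANCES * HOURS
print(f"Bill for idle instances: ${bill:,.2f}")  # Bill for idle instances: $7,200.00
```

A small hourly rate compounds quickly under pay-as-you-go billing, which is why administrators need to track and shut down idle capacity.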

The present availability of high-capacity networks, low-cost computers and storage devices as well as the widespread adoption of hardware virtualization, service-oriented architecture, and autonomic and utility computing have led to a growth in cloud computing. Companies can scale up as computing needs increase and then scale down again as demands decrease.

Cloud computing has become a highly demanded service due to its high computing power, low cost, high performance, scalability, accessibility and availability. Some cloud vendors are experiencing growth rates of 50% per year, but the technology is still in its infancy and has pitfalls that need to be addressed to make cloud computing services more reliable and user-friendly.

Security is not often seen as a driver for innovation. More often than not, it's seen as the uncomfortable problem that needs to be solved in order to meet regulatory requirements, if there is regulation for it at all. It has been seen time and time again as one of the biggest roadblocks to new technologies. Cloud? The biggest concerns are privacy and security. Big data? Again, how our information is scraped and stored. BYOD? Security and data loss prevention.

The list goes on. Both major political parties in Australia have decided that yes, a National Broadband Network (NBN) is necessary, and yes, the future is going to be about the digital economy. But if this virtual economy is meant to be the next big thing for saving our country since digging up rocks, why is no one debating what has historically been a huge roadblock?

When the Labor government released its update to the National Digital Economy Strategy (NDES), I wondered whether it even knew what it was doing when it came to securing the online space and making it a suitable place for business. Most of what was in the strategy failed to inspire me, given that it spoke mostly of education, neglected to include any actual executable plans or strategies for businesses or government, and generally rehashed what we have been doing as a country for the past five years or so.

But if I found Labor's performance lacklustre, the Coalition's plan leaves me wanting to flip some tables. In referring to the NDES, the Coalition's policy attempts to shelve the "aspirations" of its opposition's paper as issues to be dealt with at a state and territory level, specifically using the National Plan to Fight Cybercrime as an example. What part of "national" indicates that this was intended to be approached at the state level? Online crime does not know any boundaries.

The idea that you need to consider security differently because you live in New South Wales instead of Queensland is archaic and misses the point. By contrast, information security companies are calling for international coordination, a harmonisation of legislation to break down the country-level silos that let criminals jump jurisdictions. I wouldn't be so upset with such backward thinking, except that security is only mentioned one other time in the entire policy. The sliver of promise comes as the Coalition notes that what we've been doing as a country for the past five years or so has been about trying, fruitlessly, to digitise everything.

Its policy states that "the traditional focus of public sector effort in this area has been on online enforcement of laws and property rights, cybersafety education, digital literacy, and similar attempts to translate the traditional tasks of government into a digital context". Thank goodness someone has realised that concepts from the offline world don't always work online, especially for security. In the real world, if a robber tries your car door and finds that it's locked, they proceed to try another. Online criminals send out their minions to try all of the cars on the street at the same time, and if the doors are locked, they check in all the right places for weaknesses or a spare set of keys.

The good news in the Coalition's policy is that it states that under its stewardship, it will move resources to where they are most effective. So far, so good. But the proposed solution is to get the private sector involved by encouraging it, of course, all while ensuring that a Coalition government does not "pick winners or lay down inflexible rules". I can't help but feel that just as security has been lumped into a problem for the states and territories, so too has it become the private sector's issue.

A watch-from-afar approach is not what the private sector needs. It is looking to the government to get its act together so that the Australian Federal Police isn't waiting months for overseas law enforcement agencies to come back with information. Businesses are innovative enough to adopt cloud technology and accept the use of BYOD, but when a breach occurs because of a hacker in another country, there's nothing on a national scale for them.

They turn to state-level resources, but those bodies don't see it as their job to hunt down someone in Russia, China, or wherever the hip place to hack from is these days. The current plan of thinking about talking to other countries, or one day getting involved with the United Nations Charter, is slow. This could have been a great point of differentiation between the two biggest political parties, but it hasn't happened.

If anything, the focus by both major parties on making sure we're "innovative" enough as a country is almost a backhanded insult to the current startups and research hubs. Both parties had the opportunity to show how they would really enable the digital economy by removing the largest concerns around security. Sadly, it seems that neither of them actually has a clue.
NEWS ANALYSIS: For an increasing number of enterprises, private clouds and new-generation security are the best ways to handle data and file sharing.

The significance of secure data sharing could not be more evident within the overall IT industry. Security remains the single most important question mark that still causes doubt about cloud computing, especially in regulated verticals such as the financial, government and science sectors. A recent survey by Palmer Research and eWEEK publisher QuinStreet reported that 65 percent of respondents currently use or plan to use a private cloud deployment model for internal purposes or for application inside value chains. Thirty-six percent of respondents say they are now running a private cloud, with 29 percent planning to use a private cloud. Those are big numbers at this early stage, and there are good reasons for them. Private clouds enable businesses to take advantage of the efficiency of cloud computing without exposing their data and applications to those outside the organization—or, if they choose, their value chains of resellers and contractors.

These private systems are the ones being marketed hardest by cloud infrastructure providers such as Hewlett-Packard, IBM, Cisco Systems, Oracle, EMC/VMware, Dell and others. We're seeing private cloud services slowly but steadily replacing former in-office functions, such as employee recruiting management, testing and development of software, travel and expense management, and employee benefit management. Value-chain transactions, retail sales and credit-related business deals are increasing through private clouds because the security quotient is much higher than it is when using conventional means. Meanwhile, the increased consumerization of IT and the popularity of BYOD practices are jeopardizing the security and integrity of enterprise data that is not accessed through private cloud systems. Seeking an easy way to share files across smartphones, tablets and desktops, employees often use free public cloud file-sharing services that lack rigorous security and audit controls.

These services are prone to security outages, and they lack the centralized monitoring and control features that IT and security teams need for keeping data safe and demonstrating compliance.  
Building secure enterprise applications starts at the design phase. But it has taken a long time to create tools that help ferret out code flaws and teach developers how to write better code.

Developers have long struggled with the security conundrum of how to quickly deliver apps that are as inherently secure as they are robust, reliable and efficient. In today's fast-paced world of mobile, social, cloud and often complex enterprise applications, pressure is on developers to produce applications faster than ever. Yet, despite that pressure to deliver more apps faster, there is just as much pressure—brought on by those same mobile, social and cloud factors—to deliver applications that are more secure and reliable than ever before. What's a developer to do?

"Time-to-market pressure results in continually shrinking software delivery windows, while the business risks associated with software defects have never been greater," said Jennifer Johnson, chief marketing officer for Coverity, the maker of the Coverity Development Testing Platform, an integrated suite of software testing technologies for identifying and remediating quality and security issues during development. Coverity's platform automatically tests source code for software defects that could lead to product crashes, unexpected behavior, security breaches or even catastrophic failure.

According to IBM, application security vulnerabilities can be introduced in various phases of the development cycle.

Security can be neglected in the requirements and design process; flaws can be introduced into the code, inadvertently or purposely, during implementation; or they can appear during deployment, when a configuration setting does not match the requirements of the product within the computing environment—for example, when unencrypted communication is allowed over the Internet. To limit such occurrences, IBM has instituted a structured development process for delivering secure applications called the Secure Engineering Framework, which recommends the use of automated security analysis tools and proven, certified security components. These tools include source code security analyzers, bytecode security analyzers, binary security analyzers, dynamic analysis tools and runtime analysis tools. Source code analyzers analyze application source code to locate vulnerabilities and poor coding practices.

These tools can also trace user input through the application (code flow analysis, taint propagation) to uncover various injection-based attacks. Bytecode analyzers analyze application bytecode (relevant for certain languages only) for the same vulnerabilities as source code analyzers. Binary analysis is very similar to source code analysis; however, instead of evaluating the source code, this analysis examines the application binary. Dynamic analysis tools treat the application as a black box, without knowing its internal operation and source code: they automatically map the application, its entry points and exit points, and attempt to inject input that will either break the application or subvert its logic. Runtime analysis is not a specific security analysis technique or tool; it is the software development practice of understanding software behavior during runtime—including system monitoring, memory profiling, performance profiling, thread debugging and code coverage analysis.

All these types of tools play a part in what Caleb Barlow, director of mobile security at IBM, likes to call security by design. "The whole idea is to recognize that if you think about security in the design of your applications in the very beginning, and if you use security tools during the build process and during the development process, at the end of the day you'll actually save money—in addition to having a better protected application," Barlow said. "The reason is fixing a security vulnerability early in the development lifecycle is very inexpensive. Let's say you find a bug and maybe it takes a half hour of a U.S. developer's time. What's that cost you, maybe $50 or $100? But if you find a bug late in the development cycle, you then have to go figure out where it was in the code, then you have to remediate it, then you have to retest, you have to rebuild, you have to repackage and maybe do multiple testing scenarios."
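To illustrate the class of flaw that taint propagation is designed to catch, here is a minimal Python sketch (my example, not Coverity's or IBM's actual tooling): user input flows unchecked into a SQL query in the first function, while the second keeps data and SQL separate.

```python
import sqlite3

def find_user_unsafe(conn: sqlite3.Connection, username: str):
    # Tainted input is interpolated straight into the query string.
    # A taint-propagation analyzer flags this source-to-sink flow:
    # passing username = "' OR '1'='1" returns rows it never should.
    query = "SELECT id FROM users WHERE name = '%s'" % username
    return conn.execute(query).fetchone()

def find_user_safe(conn: sqlite3.Connection, username: str):
    # Parameterized query: the driver treats username strictly as data,
    # so the same malicious input matches nothing.
    return conn.execute(
        "SELECT id FROM users WHERE name = ?", (username,)
    ).fetchone()
```

A static analyzer reports the first pattern by tracing the flow from an untrusted source (the username parameter) to a sensitive sink (the SQL engine); the parameterized version breaks that flow.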
VIDEO: One of the most infamous hackers of all time talks about Website security and what users should do to protect themselves. In the world of computer security hackers, few are as well-known as Kevin Mitnick....
Security researchers have published a research paper on how they bypassed the security features of cloud-based storage service Dropbox and gained access to private user files. Dhiru Kholia of Openwall and Przemysław Wegrzyn of CodePainters said that although the service has more than 100 million users, the platform had previously not been analysed extensively enough from a security standpoint. They said their goal is to get Dropbox to create an open source version, which would mean that anyone could look at its code and verify that the service is secure. The researchers said they were able to gain unauthorised access to files, despite the fact that Dropbox added security features after it was hacked a year ago. Security measures aimed at attracting enterprise users included encryption and two-factor authentication, but both were bypassed by Kholia and Wegrzyn. They were able to reverse engineer the portion of Dropbox that runs on a user's computer, even though it was written in Python using techniques aimed at preventing reverse engineering. This means that many other cloud services that use Python and the same anti-hacking techniques could be at risk, according to Business Insider. The researchers found that two-factor authentication as used by Dropbox protects only against unauthorised access to the Dropbox website. “The Dropbox internal client API does not support or use two-factor authentication.

This implies that it is sufficient to have only the host_id value to gain access to the target’s data stored in Dropbox,” they said. However, Dropbox has issued a statement, saying it does not believe that the research presents a vulnerability in the Dropbox client. “In the case outlined [in the research], the user’s computer would first need to have been compromised in such a way that it would leave the entire computer, not just the user's Dropbox, open to attacks across the board,” the company said. Kholia and Wegrzyn hope that others will help them build a more secure, open source method for using Dropbox that would be available for Dropbox to adopt.
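The weakness the researchers describe is the single-bearer-token pattern. The sketch below is a hypothetical illustration of that pattern in Python, not the real Dropbox protocol: the endpoint, field name and values are invented, and the point is only that one copied secret is the entire proof of identity, with no second factor ever checked.

```python
import requests  # third-party HTTP client: pip install requests

# Hypothetical: in the attack described, this value is simply copied
# from the victim's machine; it is not tied to a device or a password.
STOLEN_HOST_ID = "0123456789abcdef"

def list_victim_files(host_id: str) -> dict:
    """Replay a copied credential against a hypothetical sync API."""
    resp = requests.post(
        "https://sync.example.com/api/list_files",  # invented endpoint
        data={"host_id": host_id},                  # the only credential sent
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()
```

Because the internal API never prompts for a second factor, two-factor authentication on the website does nothing to stop such a replay; binding tokens to a device key or requiring re-authentication for sensitive calls would.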

U.S. cloud and managed-service firms could face a drop in revenues of up to one-quarter due to backlash from other nations, according to two separate analyses. Fueled by protectionism and worries over U.S. surveillance, companies in other nations may spurn American cloud services, shrinking the U.S. industry's revenues by up to one-quarter, according to two worst-case analyses of the impact of recent revelations of rampant data collection by the National Security Agency. The analyses—conducted by the Information Technology and Innovation Foundation (ITIF) and Forrester Research—predicted that U.S. cloud companies could lose as much as $35 billion in sales over three years if the backlash against the NSA materializes as predicted.

Forrester's analysis expanded the number to $80 billion and added another $100 billion from U.S. managed-service providers that will likely face similar discrimination. "Where they have cared, companies have been avoiding using U.S. service providers.

There may be some who have said, how much risk is there? And those are the ones that will now be reconsidering U.S. providers," James Staten, a principal analyst with Forrester Research, told eWEEK. Already, the surveillance revelations have affected a number of companies providing secure messaging services. Encrypted email service Lavabit, which reportedly counted NSA whistleblower Edward Snowden as a subscriber, shut down, leaving behind only a veiled explanation that suggests the U.S. government had requested access to Snowden's email. Secure messaging firm Silent Circle shuttered a similar service, citing the possibility that it might be subpoenaed as well. Meanwhile, companies that provide encryption services for the cloud, such as CipherCloud, have seen interest skyrocket. While U.S. companies may not worry that the NSA has potential access to business secrets, international firms are far more worried about potential U.S. spying.

The NSA's ability to reportedly tap into three-quarters of the data flowing through the U.S. Internet combined with its legal power to subpoena international communications gives weight to previous concerns that companies may be baring their trade secrets by using a U.S. cloud provider, ITIF and Forrester argued. Already, Germany's Minister of the Interior has told companies not to use services that go through U.S. servers, if they are worried about privacy.

The French government had already embarked on a program to create a domestic cloud infrastructure to compete with U.S. firms, investing EUR 150 million in two coalitions to build separate clouds. Yet for companies not prone to protectionism, the solution should be simple: Implement encryption to protect data no matter where it resides, said Forrester's Staten. "You can use a U.S. provider and bring your own encryption and if you do that and the government gets access to your data, they only are getting access to your encrypted data," he said. In the end, Staten stresses that the estimates are worst-case scenarios.

If more misconduct by U.S. intelligence agencies is revealed and European leaders and businesses politicize the issue, damages could extend to 25 percent. However, companies tend to make rational decisions, and, thus, far fewer will likely move their business offshore.

For that reason, 3 to 5 percent is a more reasonable estimate, he said.
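Staten's "bring your own encryption" advice is straightforward to sketch. The example below uses Python's third-party cryptography library (a common choice, though the article names no specific tool): data is encrypted with a locally held key before upload, so a provider or government that obtains the stored object sees only ciphertext.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Generate the key on-premises and keep it there; never send it to the provider.
key = Fernet.generate_key()
cipher = Fernet(key)

plaintext = b"quarterly forecasts and trade secrets"
ciphertext = cipher.encrypt(plaintext)   # upload only this to the cloud

# Later, after downloading the object back from the provider:
restored = cipher.decrypt(ciphertext)
assert restored == plaintext
```

The trade-off is key management: lose the key and the data is gone, and server-side features such as search or deduplication stop working on opaque ciphertext.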
The government's "digital by default" agenda should be accompanied by one of "security by default" if public sector organisations are to truly protect themselves against cyber security threats and avoid a cyber security skills crisis. That's according to Graeme Stewart, director for UK public sector strategy at security software vendor McAfee.

"There is no bigger indicator of a cyber security skills crisis than the world's most prestigious security agencies struggling to compete for staff," he said. "With the UK government driving its own digital transformation agenda, and cyber security being reclassified to a tier-one national security threat, never has there been more pressure for the public sector to rectify a very real cyber security skills gap."

Stewart argued that the government needs to do more to ensure that service providers at every level of the "public sector supply chain" are educated about the need for proper cyber security. "With the UK government opening doors for more and more small and medium-sized businesses to become suppliers to the G-Cloud, and an influx of international players, it is critical that every level of the supply chain, not just the top tier, must be approached with the utmost seriousness.

"Ultimately, governments must take responsibility for the security of the supply chain but, in part, this should be about educating and supporting the full ecosystem of businesses involved," he said. "Security by default should be embedded alongside 'digital by default' as a cornerstone of our public services, rather than the afterthought it has often been in the past," Stewart added.

Last month, Shadow Cabinet Office minister Chi Onwurah MP told Computing that the government isn't spending enough on raising cyber security awareness. "There needs to be a greater profile of cyber security in a positive way and I don't believe the balance of spend right now is right. In terms of priority, it is giving it to national cyber security over the awareness more generally among the UK population," she said.
Many government leaders are not informed about and familiar with technology, according to Scott Borg, director and chief economist at the US Cyber Consequences Unit, an independent research institute. “This leads to wrong decisions, such as investing in technology solutions that are useless, or financing research that will never produce results,” he told FutureGov.

According to Borg, leaders also often confuse the main cyber security roles government has to fulfil by having the same people or organisations perform all the roles at the same time. These roles include helping critical infrastructure industries defend themselves against cyber attacks, protecting citizens from cyber attacks, and protecting government itself to ensure continuity and trust.

Graeme Stewart, director of UK public sector strategy at security firm McAfee, said Borg’s comments highlight a worrying lack of cyber security skills among government leaders. “There is no bigger indicator of a cyber security skills crisis than the world’s most prestigious security agencies struggling to compete for staff,” he said. According to Stewart, there has never been more pressure to address the cyber security skills gap, with the UK government driving its own digital transformation agenda and cyber security being reclassified to a tier-one national security threat.

Borg also pointed out how crucial it is to secure the supply chain for critical national infrastructure. “With the UK government opening doors for more small and medium-sized enterprises to become suppliers to the G-Cloud, and an influx of international players, it is critical that security is applied at every level of the supply chain,” said Stewart. “Ultimately, governments must take responsibility for the security of the supply chain, but in part this should be about educating and supporting the full ecosystem of businesses involved,” he said.

Stewart believes the principle of “security by default” should be embedded alongside “digital by default” as a cornerstone of UK public services, rather than the afterthought it has often been in the past.

Amazon's Virginia data centers experienced four hours of degraded service on Sunday, with a single physical device blamed for the issue. Amazon Web Services (AWS) was hit with a service interruption on Sunday, Aug. 25, that caused four hours of degraded service for customers of its US-EAST data center availability zone and knocked a number of virtual machine instances offline.

The degraded service was the result of an issue with a single networking device that failed. The first public acknowledgment from Amazon that there was some trouble with its cloud infrastructure came at 1:22 p.m. PDT on Sunday afternoon. "We are investigating degraded performance for some volumes in a single AZ in the US-EAST-1 Region," an Amazon AWS status update reported. The US-EAST-1 Region is a set of Amazon data centers located in Northern Virginia.

Amazon refers to its data centers as "Availability Zones" (AZs).

The purpose of the AZ concept is to provide geographically disparate fault tolerance and stability on a global basis.

Amazon currently operates eight AZs in total, including three in the Asia Pacific region, one in Western Europe, one in South America and three AZs in the United States. US-EAST-1 is the only Amazon AZ on the East Coast; the other two AZs are US-WEST-1 located in Northern California and US-WEST-2 located in Oregon. As it turns out, although Amazon did not report any trouble via its status update feeds for US-EAST-1 until 1:22 p.m. PDT on Sunday, the issue actually started approximately 30 minutes earlier.

Amazon did not provide full details on the incident until 3:23 p.m. PDT, at which point an AWS status update noted, "From approximately 12:51 PM PDT to 1:42 PM PDT network packet loss caused elevated EBS-related API error rates in a single AZ." EBS is Amazon's Elastic Block Storage service and provides persistent storage to virtual machines running on the Amazon cloud.

Amazon noted that a "small" number of its cloud customers had virtual machine instances that became unreachable due to the EBS error.

Among the sites that were impacted on Sunday afternoon were Airbnb, Instagram, Flipboard and Vine. "The root cause was a 'grey' partial failure with a networking device that caused a portion of the AZ to experience packet loss," Amazon noted in its status update. Amazon physically removed the failed networking device in order to restore service in US-EAST-1 to normal. It was not until 6:58 p.m. PT that Amazon's status update gave the all clear, indicating that normal performance had been restored. The US-EAST-1 issue on Sunday is not the first time that Amazon has had trouble with that data center. In 2012, storms knocked out power to Amazon's East Coast availability zones, leaving the service unavailable.

There was also an incident in 2011 that hit the Virginia-based East Coast AZs. The whole concept behind the AZs, though, is to help customers mitigate the risk of an outage in any one geographical area. "When you launch an instance, select a region that puts your instances closer to specific customers, or meets the legal or other requirements you have," Amazon's AZ documentation states. "By launching your instances in separate Availability Zones, you can protect your applications from the failure of a single location." Sean Michael Kerner is a senior editor at eWEEK and InternetNews.com. Follow him on Twitter @TechJournalist.
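As a concrete sketch of that guidance, the snippet below uses boto3, today's AWS SDK for Python (a present-day tool, not one referenced in the article), to spread instances across two Availability Zones; the AMI ID is a placeholder:

```python
import boto3  # AWS SDK for Python: pip install boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Launch one instance in each of two Availability Zones, so a failure
# confined to a single zone (like this networking-device incident)
# still leaves a replica running elsewhere in the region.
for zone in ("us-east-1a", "us-east-1b"):
    ec2.run_instances(
        ImageId="ami-0123456789abcdef0",   # placeholder AMI ID
        InstanceType="t2.micro",
        MinCount=1,
        MaxCount=1,
        Placement={"AvailabilityZone": zone},
    )
```

Cross-zone redundancy only helps if the application layer can fail over, so a load balancer or health-check-driven DNS entry usually sits in front of the instances.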
The time has come for service providers and consumers to move to a security model better suited to the cloud computing era, says cloud-based content management and collaboration firm Box.com.

The firm has pursued transparency or openness as a key policy to establish trust with customers concerned about security in the cloud environment. Customers are able to access all activity and transactions related to their content and even download that data to their security information and event management (SIEM) systems. They also have access to SOC1, SSAE16, SOC2, ISO27001 and internal audit reports and quarterly penetration test reports. Box.com even allows customers to perform their own penetration tests. In pursuit of greater transparency, Box.com has also achieved compliance with the US health sector HIPAA standard and is working on compliance with the US government FedRAMP cloud security assessment programme.

Limitations to this approach

However, this approach has its limitations, according to Justin Somaini, chief trust officer at Box.com and former chief information security officer (CISO) at two Fortune 500 companies. He is calling for a new security model that can address the security issues arising from the evolution of computing on the one side and cyber threats on the other. Somaini has begun working on a new model in consultation with the Cloud Security Alliance (CSA), which he hopes will evolve into an industry standard that will benefit cloud service providers and users alike.

He believes that cloud computing is essentially a return to centralised computing, which is an opportunity to achieve the security benefits the industry has been missing out on for 40 years. “There is a lot of security value you can get when you move back into a centralised computing utility,” he told Computer Weekly. “It is only when we bring content back to a centralised model that we have the ability to apply identity, authentication and authorisation capabilities,” he said. Consumers of cloud services have a role to play in keeping an open mind about the possibilities of doing things better in the cloud from a security point of view.

The importance of security

At the same time, cloud providers must strive to make security a differentiator by building products that share the customers’ objective of fending off attackers and ensuring confidentiality, integrity and availability, said Somaini. Transparency around activity and transactions on content is a key component, he said, but many cloud providers still do not allow customers to access logs to see what is going on. Many also still do not have good security certifications or detailed audits to provide a level of transparency around how they are managing content.

Without transparency there can be no trust, said Somaini, which is why he is forging a new security model aimed at enforcing this principle in the cloud services industry. “One of the things I am calling for in the industry is a more detailed and prescriptive audit and certification specifically for cloud providers,” he said. For example, it should require cloud providers to supply all documentation on how they work instead of just a certification letter, and allow customers to view and download all transactions on their data. Other important questions would be around providers’ ability to assist in any e-discovery requirements, how they defend against advanced cyber threats, and how they deal with application security.
Leapfrogging cloud-specific frameworks

Internally, Box.com is seeking to roll out a version by the end of the year to leapfrog the cloud-specific framework within the ISO model, which is expected to take years to develop. Somaini intends to update and mature the CSA’s control compliance matrix as the basis for new controls aimed at giving customers greater visibility and transparency than the SOC, SSAE and ISO regimes can. He hopes that once established, these controls will be rolled into the fledgling ISO cloud framework.

“My intent is to drive something new into the security industry, and I believe the best place for it and the best leadership I can see doing this is the CSA,” said Somaini. He is in discussions with the CSA in the hope of being able to create a new certification that will give users of cloud services better trust in what providers do and how they do it.

Somaini believes security professionals need to learn that there is a better security model than what they do today, and cloud suppliers need to provide better capabilities to enable trust and transparency. “Both sides of the industry have some work to do if we are to solve some of the fundamental problems in the industry,” he said.

Intellectual Property (IP) theft – whether by competitors or states – has been occurring for a long time. Traditional approaches of protecting IP involve patents, copyrights, trademarks, physical security (locking documents away), classifying documents using a labelling scheme and staff education. These traditional approaches are still valid today, and may need to be strengthened.

They should also be supplemented by a range of electronic approaches. These include electronic licensing, encryption, data classification, access control, logically or physically separate networks, and providing "clean" devices to staff travelling to countries where IP theft is likely.
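As a minimal sketch of the data classification idea, assuming a simple text-label scheme (the labels and paths below are illustrative, not an ISF recommendation), a short script can flag documents that carry no protective marking at all:

```python
import re
from pathlib import Path

# Assumed protective-marking scheme; substitute your organisation's labels.
LABELS = re.compile(r"\b(PUBLIC|INTERNAL|CONFIDENTIAL|TRADE SECRET)\b")

def unlabelled_documents(root: str):
    """Yield text documents under root that contain no classification label."""
    for path in Path(root).rglob("*.txt"):
        text = path.read_text(encoding="utf-8", errors="ignore")
        if not LABELS.search(text):
            yield path

if __name__ == "__main__":
    for doc in unlabelled_documents("/shared/documents"):  # illustrative path
        print(f"No label found: {doc}")
```

Commercial data loss prevention tools apply the same pattern-matching idea at network egress points and endpoints, which is where the "technical solutions" step of the process below comes in.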

All approaches are complicated by the demands of international travel, collaborative working, the need to share information (including IP) in the supply chain, consumerisation, and the cloud. Information Security Forum (ISF) research has shown that protecting your IP can follow an information-led, risk-based process similar to that used to protect information in your supply chains, as discussed in the Securing the Supply Chain reports and tools. The process is modified to reflect the greater control over your own organisation and staff, and comprises eight steps:

Understand what you have and what you share
Quantify the effect of losing information: what information, if lost, would hurt us most?
Introduce a physical and electronic labelling scheme
Deploy physical and logical controls: for example, clear desks, lockable cabinets, encryption and access control
Educate your staff in both physical and electronic protection
Investigate and implement technical solutions, such as data loss prevention
Record and manage incidents and breaches: check for relationships and correlation
Think like the thief: identify valuable information and how you would circumvent its protection

Such a process should yield a mix of physical and electronic approaches that provide the required protection for your organisation and your IP.

Adrian Davis is principal research analyst at the Information Security Forum (ISF).

This was first published in August 2013.
US politicians are expected to retreat from their obsession with spying on citizens after it was revealed that the biggest losers were actually corporations.

Since the so-called Land of the Free overthrew its lawful king in a French-backed terrorist coup, most of the country's major decisions have been made to prop up businesses and corporate culture. Snooping on citizens is more of a knee-jerk reaction against terrorism, which was, in itself, a smokescreen for poor economic performance by the last two presidents. Now it seems that the snooping is getting in the way of the US's number one priority of protecting big business from real life.

It turns out that the NSA surveillance programmes are very damaging for the American technology industry. A report by the Information Technology and Innovation Foundation said that companies that provide cloud computing services stand to lose as much as $35 billion over the next three years unless Congress takes action to alleviate the fears of the American people that they are being snooped on. Cloud computing and storage companies are being seen as the saviour of business and the economy. The industry is growing fast and is expected to be a $207 billion business by 2016.

But, to the NSA, putting material in the cloud is a bit like shoving all the personal information in one place where it can be easily collected. While that makes life easier for the spooks, it makes companies less likely to go with cloud offerings. Big business is as unhappy with the idea of being spied on as your Average Joe.

At the moment it is US companies that dominate the international cloud computing market. Normally that would mean piles of foreign cash rolling into the US from foreign parts. However, Daniel Castro of ITIF is now warning that foreign companies do not trust American cloud computing companies, and he thinks that US cloud companies will lose anywhere from 10 percent to 20 percent of the market to international rivals. This would represent a loss of $22 billion to $35 billion. Ten percent of international companies surveyed have already cancelled a project that used a cloud computing service based in the US, and 56 percent of companies surveyed are "less likely" to use a US-based cloud computing service. It also seems that 36 percent of US companies surveyed said they have found it "more difficult" to do business outside of the country because of NSA spying.

When you factor all that in, firms are almost certain to remind their sock puppets that this spying lark is going to have to stop - or at least be toned down a little. It does not matter if occasionally someone blows something up in the name of their terror campaign, so long as US business is not harmed.