Thursday, December 14, 2017

Tag: Cloud

Cloud computing is a kind of Internet-based computing that provides shared processing resources and data to computers and other devices on demand. It is a model for enabling ubiquitous, on-demand access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications and services), which can be rapidly provisioned and released with minimal management effort. Cloud computing and storage solutions provide users and enterprises with various capabilities to store and process their data in third-party data centers. It relies on sharing of resources to achieve coherence and economy of scale, similar to a utility (like the electricity grid) over a network.

A common understanding of “cloud computing” is continuously evolving, and the terminology and concepts used to define it often need clarifying. Press coverage can be vague or may not fully capture what cloud computing entails or represents, sometimes reporting how companies are making their solutions available in the “cloud” or how “cloud computing” is the way forward, but not examining the characteristics, models, and services involved in understanding what cloud computing is and what it can become.

Advocates claim that cloud computing allows companies to avoid upfront infrastructure costs and focus on projects that differentiate their businesses instead of on infrastructure.[5] Proponents also claim that cloud computing allows enterprises to get their applications up and running faster, with improved manageability and less maintenance, and enables IT to adjust resources more rapidly to meet fluctuating and unpredictable business demand. Cloud providers typically use a “pay as you go” model, which can lead to unexpectedly high charges if administrators do not adapt to the cloud pricing model.

The present availability of high-capacity networks, low-cost computers and storage devices as well as the widespread adoption of hardware virtualization, service-oriented architecture, and autonomic and utility computing have led to a growth in cloud computing. Companies can scale up as computing needs increase and then scale down again as demands decrease.

Cloud computing has become a highly demanded service due to its high computing power, cheap cost of services, high performance, scalability, accessibility and availability. Some cloud vendors are experiencing growth rates of 50% per year, but the technology is still in its infancy and has pitfalls that need to be addressed to make cloud computing services more reliable and user-friendly.

One nation asks for new parts on two satellites for fear of U.S. eavesdropping.

Other companies spend money to show that their products do not contain "spycraft."

Last year's revelations about the extent to which the U.S. National Security Agency eavesdropped on and collected data about other nations, foreign nationals and American citizens continue to cause problems for U.S. companies. This week, the United Arab Emirates reportedly refused to accept two intelligence satellites from France because they allegedly contained U.S. parts that would allow the NSA to tap into the satellites' encrypted transmissions to ground-based stations. Delivery of the satellites, along with a ground station, is set for 2018, but may be delayed as the Middle Eastern nation considers its options.

This is not the first time other nations have worried about U.S. technology in sensitive products following the revelations of the extent of the NSA's spying activities. Cloud providers and other IT companies have particularly felt the brunt of the distrust, a trend that will continue, said John Dickson, a principal with the Denim Group, a provider of secure application technology. "If your company handles sensitive information from international clients, you need to be ready to answer questions about your organization's cooperation with U.S. law-enforcement and government organizations and how that may affect their business, especially cloud providers," he wrote in a Jan. 6 blog post. "In fact, I'd suggest that you think through these issues now and reach out to your international clients prior to them asking the question."

In June 2013, Edward Snowden, an NSA contractor, left the country and began leaking documents, leading to a steady flow of classified information on the operations and capabilities of the U.S. intelligence agency. From the NSA's bulk collection of data on U.S. citizens to the tapping of communications with allied heads of state, the documents provided an unprecedented look into the capabilities and intent of NSA programs.
The revelations have led to widespread distrust of U.S. companies, particularly following leaks revealing the NSA's ability to tap the links between cloud providers' data centers and the reported $10 million deal with security firm RSA to use a weaker encryption method in one of its products.

According to a Dec. 20 Reuters article, the NSA paid the security firm to designate an NSA-designed algorithm, the Dual Elliptic Curve random-number generator, which the intelligence agency knew it could break, as the default in its BSAFE encryption toolkit. The situation now resembles the problem Chinese infrastructure companies face when trying to convince other nations and companies to buy their products. Huawei, for example, built a test lab in the United Kingdom to allow that nation's government to inspect its products.
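The danger of a deliberately weakened random-number generator is easy to demonstrate in miniature. The sketch below is not the Dual Elliptic Curve construction; it simply illustrates the general principle that an attacker who knows a generator's internal state can reproduce any "random" key it produces.

```python
# Illustrative only: a demonstration of why a predictable RNG undoes encryption.
# If the attacker knows the generator's internal state (here, the seed), they
# can regenerate the victim's "secret" key exactly.
import random

def generate_key(seed):
    rng = random.Random(seed)           # deterministic generator
    return bytes(rng.randrange(256) for _ in range(16))

victim_key = generate_key(1234)         # victim trusts the generator
attacker_key = generate_key(1234)       # attacker with the same state gets the same key
assert victim_key == attacker_key
```

This is why cryptographic libraries are expected to seed from an unpredictable entropy source; a generator whose state is known, or recoverable, to a third party gives that party every key derived from it.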

Now companies—such as Microsoft, Google and RSA—have to proactively persuade clients of their trustworthiness. Microsoft, for example, is pursuing a strategy of greater software transparency to convince companies in other nations that its code is secure, said John Pescatore, director of emerging trends for the SANS Institute. "The Snowden leaks of NSA activities mean that U.S. IT exporters will need to make investments similar to Huawei's in order to convince overseas customers that their technology has not been compromised," he said in the group's bi-weekly NewsBites newsletter. Because the NSA has compromised the security of many technology products and services, and other intelligence agencies have similar efforts under way, companies worldwide will likely distance themselves further from governments and cooperate only as far as required under the letter of the law, said Dickson of the Denim Group.
EC2 service helps hackers bypass measures designed to protect LinkedIn users.
By Leo Kelion, Technology reporter

Sony has announced plans to roll out its cloud gaming service, PlayStation Now, which the company's computer entertainment chief Andrew House announced at CES. It will allow subscribers to play some of the platform's greatest hits without the need to own a console, and works by streaming data from the company's servers.

The firm's latest smart TVs will be among the first devices to support it, but Sony also intends to offer the facility to other third-party products. One expert said the firm had a rich back catalogue to draw on. "It's a pretty big asset," said Brian Blau from the tech consultancy Gartner. "You can imagine the hundreds of years of manpower that went into building it up, and now they can get value from it for a long time into the future."

Sony said it would launch a restricted test of the service in the US later this month before a wider launch in the summer. It has not yet provided details of plans for other markets, or information about which smartphone and tablet platforms will be the first to get apps to run the facility. The announcement was made at the Consumer Electronics Show in Las Vegas.

Tackling lag

The new facility is based on technology developed by Gaikai, a cloud gaming start-up that the Japanese firm acquired for $380m (£232m) in 2012. Its rival Samsung had previously announced plans to partner with the firm to let its TVs offer games. Sony said the service would also allow its new PlayStation 4 console to run titles from the PS3's library; since the two machines use different types of processors, the PS4 cannot currently run the previous generation's titles. It also said that its handheld, the PS Vita, would use PlayStation Now to gain access to a wider catalogue. Sony added that another benefit of hosting titles in the cloud was that subscribers would always be playing the most up-to-date versions. Demo titles on show at CES included The Last of Us, Beyond: Two Souls, and God of War: Ascension. Tech bloggers who tested the kit have noted that it does have some limitations.
"There's a slightly perceptible lag between button presses and the corresponding action onscreen," wrote Chris Welch for The Verge. Others noted that the visuals were not as crisp as would be the case if the games were running natively on a PS3, but said they were still playable.

Sony is not alone in offering such a service. OnLive has offered a cloud-based gaming platform since 2010, and it is already available in the UK. However, it lacks many of the big-name titles that Sony will be able to offer.

Mr Blau said he expected PlayStation Now would suit some titles better than others. "I imagine that for the games that don't require a very fast frame rate that lag won't make any difference at all," he said. "But for those that run at 50 to 60 frames per second it could be an issue if you're not close to a Sony server.

"However, I imagine the firm has the capability to ensure that most of the connected PlayStations and TVs will be close to at least one of its data centres.

"And as the years go on that will become less of a problem as the internet's infrastructure matures."

PS4 v Xbox One

Sony also announced that it had sold 4.2 million PlayStation 4 consoles as of December 28. Its rival Microsoft, whose Xbox One is more expensive than the PS4 but comes bundled with the Kinect sensor, had previously said that three million Xbox One machines had been sold by the end of 2013.

Jason Kingsley, chief executive of developer Rebellion, suggested the numbers reflected the firms' different strategies. "Microsoft seemed to have a US-focused launch with an emphasis on TV and US sports," he said. "Sony played the hardcore gamer card well in the UK.

"All sales are good for the development scene though and it is still early days.

They are both excellent machines."
We're only three days into 2014, and already two major online service hacks that compromised user data have hit the headlines.

The usernames and passwords for 4.6 million Snapchat users have been compromised and posted online. Skype, the Microsoft-owned VOIP service, was also reportedly breached, though at this point, it's believed that the attacks are unrelated.

The Syrian Electronic Army, a group of hackers that support the regime of Syrian President Bashar al-Assad, has taken responsibility for the Skype hack.

And just last month, hackers broke into the computer databases of Target, one of the world's largest retailers, stealing millions of credit card numbers and encrypted PINs during the holiday shopping rush, forcing cardholders to scramble to replace their cards and guard their credit accounts.

Unfortunately, there's no simple solution for people to protect their identities and credit data on all the online services and with all the retailers they do business with. But there are a number of things people can do to at least limit their chances of getting hit hard by a data breach.

This slide show covers some of the techniques Web users can employ to improve their online security.

Securing Sensitive Personal Data in Cloud Services: 10 Best Practices
By Don Reisinger

Get the Password Manager Going

Nowadays, both mobile and desktop users can use password managers designed to house all-important credentials in one spot and create difficult passwords for sensitive sites.

All a user needs to do is remember the password for actually getting into the application, like 1Password or Password Manager, and he or she has full access to important accounts. Such password managers are important: They act as a repository for credentials, they create strong passwords, and they can populate password fields whenever the user goes to a particular site. Definitely check them out.

Aim for Two-Factor Authentication

Two-factor authentication is a key component in keeping data safe in the event of a breach. Let's say, for example, that hackers have obtained passwords for a site's entire user base.

If that site employs two-factor authentication, like sending a code to the user's mobile phone in addition to requiring a password, the impact might not be as great. It's an extra step, but it's an important one. That's probably why companies like Google and Bank of America use the technology in their online services.

Switch Up Passwords

There's no reason to keep the same passwords for years and years. In fact, critical passwords should be changed on a regular basis. Some security experts say passwords should be changed at least every 90 days on all sites.
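The one-time codes behind this kind of two-factor login can be sketched in a few lines. The following is a minimal, illustrative TOTP (time-based one-time password) implementation in the spirit of RFC 6238; the shared secret, 30-second step and drift window are assumptions for the demo, not any particular service's settings.

```python
# Minimal TOTP sketch: derive a short-lived code from a shared secret and the clock.
import hmac
import hashlib
import struct
import time

def totp(secret, for_time=None, step=30, digits=6):
    """Derive a one-time code from a shared secret and a Unix timestamp."""
    counter = int((for_time if for_time is not None else time.time()) // step)
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

def verify(secret, submitted, window=1):
    """Accept codes from the current step plus/minus `window` steps of clock drift."""
    now = time.time()
    return any(hmac.compare_digest(totp(secret, now + i * 30), submitted)
               for i in range(-window, window + 1))
```

Even if a password database leaks, an attacker without the per-user secret cannot produce a valid code, which is exactly the damage limitation described above.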

If you're not doing that, you could potentially put your many accounts across the Web at risk once hackers get just one credential. Just ask LivingSocial users who had their passwords stolen last year.

Remember the Beauty of VPNs

Virtual private networks deserve more credit for providing users with a higher level of security. VPNs allow users to find their way to critical sites through a secure connection and hopefully improve their security a bit. VPNs aren't the panacea to end all security woes, but it'd be nice to see them put into practice a bit more, both in the enterprise and in the consumer space.

Don't Trust Social Networks

Social networks have become a breeding ground for potential security problems. In fact, a study last year from security firm Sophos revealed that social sites are swimming with major threats. So, while surfing through the sites, be sure not to click on unknown links.

And as Snapchat proved, believing that just because a site or service is big and popular it won't get hacked is pure folly.

Always be on the defensive when using social networks.

Use Only Fraud-Secured Credit Cards

Credit card numbers are being shared across the Web at an increasing rate.

And in many cases, hackers are intentionally going after those services that keep credit card information on file. So those who are going to share credit card information online should only use plastic that has fraud protection. Those cards allow any stolen money to be credited back to an account, safeguarding owners from possible financial issues. It seems obvious, but it's an important consideration before buying a product online.

Don't Link Online Accounts

It's common practice now for Websites and operating systems to ask users to link their accounts with other services. Such a practice is great for the sites, which get some free promotion, but it can unleash a world of hurt on consumers.

If a site is hacked and user credentials are stolen, it's possible that the linked accounts on other sites could become compromised as well. Beware.

Stick to a Few Reliable Retail Sites

Hitting the Web with a credit card in hand can be a risky move. That's why consumers should buy products from as few sites as possible. Granted, saying such a thing could hurt smaller retail sites, but this is about security.

And the best way to be secure on the Internet is to have a small footprint. Saving profiles and credit card information on dozens of retail sites around the Web is a bad idea that only increases the number of places information could be exposed. Stick with a few trustworthy e-commerce sites.

Consider a Site's Security

Not all sites are created equal when it comes to their devotion to the safety and security of your information. Many sites have been overrun with malware and are just waiting for an unsuspecting user to come along and get hit. In other cases, some sites and site categories, especially adult content sites, are naturally dangerous. Staying on safe, reputable sites is always a good idea.

Don't Be Late on the News

When news breaks and a hack has occurred, the last thing you want to be is late to the game. To ensure data is kept safe, users must respond quickly to a breach, change passwords across the Web and keep a close eye on events as they unfold.

There's no excuse for ignorance anymore—the Internet is a dangerous place, and we must all accept that.

Don Reisinger is a freelance technology columnist. He started writing about technology for Ziff-Davis. Since then, he has written columns for Computerworld, InformationWeek, and others, and has appeared numerous times on national television to share his expertise with viewers.
What happens to cloud data after a virtual machine is destroyed? One cloud vendor reassesses its policy. Security is often cited as a top concern for any organization looking to move to the cloud, and it's a con...
A new version of Cryptolocker—dubbed Cryptolocker 2.0—has been discovered by ESET, although researchers believe it to be a copycat of the original Cryptolocker after noting large differences in the program's code and operation. Just last month, antivirus companies discovered a new ransomware known as Cryptolocker. This ransomware is particularly nasty because infected users are in danger of losing their personal files...
NEWS ANALYSIS: In space, just-in-time delivery doesn't exist, which changes the rules of resiliency and backup options. Disasters and equipment failures can happen at any time, anywhere, and enterprise IT admini...
Sign up to Computer Weekly's research library to download the top 10 research reports from 2013.

1. CW buyer's guide: Tablets for business
In this, our most popular buyer's guide, Computer Weekly looks at the use of tablets in business and how the effect of mass adoption of tablets in the enterprise could be far more significant than that of either the desktop or laptop computer. Click here for more buyers' guides.

2. Technology Industry Survey 2014
From CTOs to software developers, enterprise architects to digital entrepreneurs, this extensive report from Computer Weekly and Mortimer Spinks captures the opinions of people at the forefront of shaping what is next for our industry. The survey covers careers, salary and employment trends, innovation and work-life balance, and much more. Click here for our Women in Technology survey.

3. Protecting against modern password cracking
This article in our Royal Holloway Security Thesis series explains just how insecure passwords are, and offers advice on alternative methods of security. Click here for more from our Royal Holloway security series.

4. The human face of big data: Data driven
Software engineers are transforming the daily lives of hundreds of millions of people, for good and for ill, writes Jonathan Harris in this article taken from the book The Human Face of Big Data. Click here for more on big data.

5. World-Class EA: The agile enterprise
Agile is about more than software development; it's a mindset. This report from the Open Group explains how to apply agile techniques and best practices to the enterprise. Click here for more from The Open Group.

6. European IT Law Briefing: Recruitment and social media
Four leading European law firms help you navigate the legal maze when using social media for recruitment in France, Germany, the UK and Italy. Click here for more in the series on European law and social media.

7. Gartner: Best practices for I&O for cloud-readiness
Gartner analysts explain how infrastructure and operations (I&O) teams can best prepare for the cloud. Click here for more Gartner articles.

8. Finance for IT decision-makers: Making business cases
This extract from Michael Blackstaff's book, Finance for IT Decision Makers, teaches you how to make effective business and financial cases for IT projects. Click here for more Computer Weekly book extracts.

9. Understanding the hard ROI of BYOD
BYOD can increase IT costs, rather than decrease them. Companies need to look carefully at a range of variables to calculate the real ROI, says analyst group Nucleus Research. Click here for more reports from Nucleus Research.

10. Musings on datacentres
The potential impact on the datacentre from rising energy costs and new technology architectures is huge. This report pulls together articles on the topic written by analyst group Quocirca for Computer Weekly and TechTarget. Click here for more analyst reports from Quocirca.

Email Alerts Register now to receive IT-related news, guides and more, delivered to your inbox. By submitting you agree to receive email from TechTarget and its partners.

If you reside outside of the United States, you consent to having your personal data transferred to and processed in the United States.
Cloud and mobility have been ubiquitous subjects in the IT industry in recent years, with a spate of associated new jobs and increasing reliance on them to drive business. Despite this, an element of ambiguity in how best to utilise them in business remains. “Mobility is so full of whitewater; we need standards around all aspects.” – Google Across the industry, companies have been struggling to cope with the moving of software data into the cloud and a proliferation of remote devices with varying degrees of access to this.

They require assurances that IT staff are suitably qualified for these new challenges, but don't necessarily know what these staff should be expected to know. With growing calls among major industry players for a set of supplier-neutral industry standards, the need for recognised certification was clear. CompTIA – with a 20-plus-year track record of developing IT certifications – worked closely with the IT industry to identify the key skills that staff responsible for cloud and mobility implementations must have. Security issues, for example, were unanimously agreed to be the primary challenge. Many organisations struggle to find staff who understand the weight of specific security issues and cloud infrastructure needs on networks, what the impact is across the whole organisation, and how they can pre-empt these issues. There was a major need for a set of specific security skills to address the ever-changing threat from cyber attacks.

These included a requirement for staff to know how to set different data access settings for different staff groups. IT businesses want staff to be capable of spotting vulnerabilities and threats long before they cause damage. Can their network administrators note unusual volumes of mobile device traffic going out over FTP to one particular unknown location and take appropriate action, for example? With no concrete staff credentials to fall back on, this is the kind of question businesses have not been able to answer with a resounding "yes".

Employees need a greater understanding not just of software compatibility across a network – remotely hosted or otherwise – but of the full range of hardware in use too. Each device type has its own implications for security, and the administrator should know the best configuration to improve the security of data on each mobile device across the network.

CompTIA's industry analysis was not just related to specific job roles within organisations – we also picked up views of the whole cloud and mobility space and where businesses believe the industry is heading. A strong consensus in the job market is that traditional knowledge and skills in networking and systems are migrating to include aspects of cloud and mobility. Being technically competent with a blend of skills is becoming a necessity, rather than a differentiator. Crucially, this means that the need to understand planning, implementing and managing cloud solutions will extend beyond just the core network administrator roles, across the whole organisation. Even the traditional IT professional needs to become cloud-savvy.
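The FTP example above can be made concrete with a small sketch. This is a hypothetical illustration, not a real monitoring tool or API: it assumes flow records have already been collected as (destination, bytes-sent) pairs and uses a simple mean-based threshold, which a production system would replace with a proper per-destination baseline.

```python
# Hypothetical sketch: flag destinations receiving an unusually large share of
# outbound FTP traffic, the kind of check a cloud-savvy administrator might run.
from collections import defaultdict

def flag_unusual_ftp(flows, factor=3.0):
    """flows: iterable of (dest_ip, bytes_sent) pairs for FTP sessions.
    Returns destinations whose total volume exceeds `factor` times the mean."""
    totals = defaultdict(int)
    for dest, sent in flows:
        totals[dest] += sent
    if not totals:
        return []
    mean = sum(totals.values()) / len(totals)
    return [dest for dest, volume in totals.items() if volume > factor * mean]

# Example: three ordinary internal transfers and one outsized external one.
flows = [("10.0.0.1", 100), ("10.0.0.2", 120), ("10.0.0.3", 90), ("203.0.113.9", 5000)]
suspects = flag_unusual_ftp(flows)   # the external host stands out
```

The point is not the statistics but the skill being certified: knowing which traffic to measure, what a normal baseline looks like, and what action to take when a destination stands out.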

The work environment is changing within these organisations, and the traditional networking or technical sales professional must learn new skills to support the overall objectives of the business. Businesses are clearly struggling to find these skills.

The rapid developments of areas such as the cloud and mobility have created uncertainty among established professionals.

Even where the skills exist, businesses do not always know how to spot them, or even what to look for.

This is why there has been an overwhelming call for an industry certification to validate skills in these new areas.

John McGlinchey is senior vice-president, global business development, of CompTIA, the global IT trade association. CompTIA recently launched the Cloud+ and Mobility+ certifications, which it has developed over the past year in consultation with the IT industry.

This was first published in December 2013.
The need to evolve IT security strategies to match new and emerging threats has come clearly to the fore in the past year, as attackers become increasingly adept at evading traditional security controls and stealing data without being detected. Attackers have quickly adapted to new technologies, exploiting a range of security vulnerabilities in mobile, virtualised and cloud environments to target sensitive data, but social engineering – mainly in the form of email phishing – continues to be a key factor in most targeted attacks.

The need to defend against disruption has also increased in the past year, with a growing number of disruptive cyber attacks by hackers with political agendas and denial-of-service attacks becoming more powerful and more common.

The past year has seen a sharp shift in focus from traditional perimeter defences to more data-centric security controls, intelligence-based security systems, and building a capability to detect, respond to and mitigate the effects of data breaches once they occur. There has also been an increasing emphasis on the need for information security professionals to be aligned with the business, to enable new opportunities and information sharing in secure ways. As a growing number of devices become internet enabled, security experts expect the so-called internet of things to present a whole new order of security challenges.

Read our top 10 IT security stories of 2013 here:

1. White hat Wi-Fi hacking shows vulnerability of business data
White hat hackers have shown that usernames, passwords, contact lists, details of e-commerce accounts and banking details can be sniffed easily from public Wi-Fi hotspots. To illustrate one of the many ways people can have their data compromised, the white hat hackers from First Base Technologies conducted two tests in partnership with security firm Trend Micro.

2. Digitally signed malware a fast-growing threat, say researchers
Digitally signed malware is a fast-growing threat that is aimed at bypassing whitelisting and sandboxing security controls, say security researchers. “We found 1.2 million pieces of new signed malware in the last quarter alone,” said David Marcus, director of advanced research and threat intelligence at McAfee.

This is malware that is signed using legitimate digital certificates that have not been stolen or forged, but acquired from certificate authorities (CAs) or their sub-contractors, he said.

3. FBI warns of increased spear phishing attacks
In July, the FBI issued a warning about an increase in spear-phishing attacks targeting multiple industry sectors. Spear phishing – a highly targeted phishing email – is one of the tools used by attackers to compromise endpoints and gain a foothold in the enterprise network.

According to the FBI, victims are selected because of their involvement in an industry or organisation the attackers wish to compromise.

4. SQLi has long been unsolved, but has that finally changed?
The Open Web Application Security Project (Owasp) continues to rank SQL injection attacks at the top of its 10 most critical web application risks. But what is an SQL injection (SQLi) attack, why are such attacks important, and why have they remained unsolved more than 15 years after they first appeared – and has that changed?

5. RSA says security needs to change, but what does that mean?
RSA executive chairman Art Coviello ended his opening keynote speech at RSA Europe 2013 with a call to the IT security industry to show the same spirit as Europe in setting up a common market after the Second World War. But what exactly does he have in mind?

6. Security finally able to enable business, says Voltage
New security technologies are finally making it easier for security to enable the business and drive value, according to Dave Anderson, senior director at Voltage Security. Many of the largest organisations in the world are beginning to use information security as a strategic advantage and to re-establish the value of data. “Although we have been talking about this for years, it has become much easier to achieve in the past year or two,” he told Computer Weekly.

7. Security incident response below par at most firms, says Guidance Software
Most firms are not as prepared as they should be for responding to cyber attacks, says e-discovery firm Guidance Software. But with sensible reviews of processes and communications strategies, up to 70% of firms could put themselves on a much better footing, said Nick Pollard, the firm’s senior director of professional services.

8. DDoS attacks more than treble in the past year, report reveals
The number of distributed denial-of-service (DDoS) attacks monitored at over 20Gbps this year is more than three times greater than for the whole of 2012, according to Arbor Networks. Despite the business risks of DDoS attacks, a survey by communications firm Neustar, published in July, found that 20% of UK respondents admitted that their companies have no DDoS protection in place.

9. NYT hacktivist attack shows need for registry locking
The Syrian hacktivist attack on the New York Times website highlights the urgent need for registry locking, says communications and analysis firm Neustar.

The site was unavailable after the Syrian Electronic Army (SEA), which supports Syrian president Bashar al-Assad, was able to access the domain name system (DNS) settings for the site.

The SEA breached the NYT’s domain name registrar, Melbourne IT, and changed the DNS record to point to systems in Syria and Russia.

10. Internet of things to pose 'huge security and privacy risks'
The internet of things will pose enormous security and privacy challenges, a CW500 Club meeting heard. By 2020, trillions of sensors will be feeding data across the internet, recording everything from people’s movements to what they have just bought. Such data may prove invaluable for city planning or alerting consumers to special offers on their favourite products in a nearby shop, but it also poses unprecedented risks to individuals' privacy and security, a meeting of senior IT leaders heard.
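The SQL injection risk that Owasp ranks top (story 4 above) can be made concrete with a minimal sketch using Python's built-in sqlite3 module. The table, user and passwords here are invented for the demonstration; the point is the contrast between concatenating user input into a query and passing it as a bound parameter.

```python
# Minimal SQL injection demo using the standard-library sqlite3 module.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def login_unsafe(name, password):
    # String concatenation lets crafted input rewrite the query itself.
    q = f"SELECT COUNT(*) FROM users WHERE name = '{name}' AND password = '{password}'"
    return conn.execute(q).fetchone()[0] > 0

def login_safe(name, password):
    # Parameterised placeholders keep input as data, never as SQL.
    q = "SELECT COUNT(*) FROM users WHERE name = ? AND password = ?"
    return conn.execute(q, (name, password)).fetchone()[0] > 0

# The classic "' OR '1'='1" bypass succeeds against concatenation
# but fails against bound parameters.
assert login_unsafe("alice", "' OR '1'='1") is True
assert login_safe("alice", "' OR '1'='1") is False
```

That the fix has been this simple for so long is precisely why the attack's persistence, more than 15 years on, keeps it at the top of the Owasp list.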

Security firm RSA has strongly denied allegations of a secret contract with the US National Security Agency (NSA). A report from Reuters claimed the NSA arranged a secret $10m contract with RSA, but in a recent blog post, RSA said it “categorically denies” the allegations.

“We have worked with the NSA, both as a vendor and an active member of the security community. We have never kept this relationship a secret and in fact have openly publicised it. Our explicit goal has always been to strengthen commercial and government security,” stated RSA.

Reuters claimed the NSA paid RSA to adopt a random-number formula that created a "back door" in encryption products. RSA said in its blog post that it has never entered into a contract with the “intention of weakening RSA’s products, or introducing potential ‘back doors’ into our products for anyone’s use”.

In September, RSA advised its developer customers to stop using an encryption algorithm that documents leaked by whistleblower Edward Snowden indicated contained a backdoor. Last week, the European Parliament Civil Liberties Committee inquiry into the surveillance of EU citizens by the NSA called for political and technology changes following the NSA revelations. The draft conclusions call for an EU cloud and proper analysis of the use of open source software, as well as political signals from the US that it understands the difference between allies and adversaries.

For far too long, businesses have been at the mercy of software suppliers when it comes to ensuring that critical applications are secure. At best, businesses have to rely on the supplier's assurances that the software has been tested and has passed those tests. But as cyber attackers shift focus from operating systems to exploiting weaknesses in business applications, secure coding has never been more important.

It is widely recognised in the security industry that any commercial organisation that values its security is well advised to perform a security test of any software before deployment. For this reason, many in the security industry believe the time has come for a shift in power from supplier to buyer. For this to happen, security professionals within organisations need to ensure that security testing becomes part of the procurement process for all business software – and therein lies the challenge.

Procurement and security are uneasy bedfellows, mainly because commercial teams want things done quickly, simply and as cheaply as possible, while security introduces delay, complexity and cost. So what is to be done? How can security professionals ensure security testing is part of procurement?

Raise security awareness and control testing

Security should mandate testing and explain why it is necessary, says Robert Newby, an analyst and managing partner at KuppingerCole UK. "Procurement should be seeking to lower risk to the business in their processes, but sadly, a lack of awareness of security may make this a difficult task," he says. It is therefore security's task to mandate testing and raise awareness. "This can be done via policy, but also by campaigns, communication, contact and relationships," says Newby.
He believes security professionals can communicate with the business by talking in terms of risk, which translates simply for commercial teams into cost. "Software is a seemingly benign way to gain absolute access to your network – an open window even when your doors are firmly locked," says Newby. "The cost of not treating this is simple: it could cost you everything you hold electronically within your company network – all IP, customer records, credit card information, designs, processes and developments, not to mention brand and reputation."

Conversely, the cost of addressing this in a procurement cycle can be low, and if companies state it as a requirement, says Newby, many software houses will meet the cost of security checks themselves. Security professionals therefore need to be more business aware, he says, and not assume that their colleagues have the same grasp of the technical details. "Raise awareness with your commercial teams, but talk in business terms, state the risks in realistic terms, give examples, work with them to state the costs at each stage, and talk about realistic measures to mitigate. Above all, create awareness and understanding," says Newby.

Software code must be secure at the procurement stage

Another key strategy for information security professionals is to ensure they are plugged into the procurement process, says Peter Wenham, committee member of the BCS Security Forum strategic panel and director of information assurance consultancy Trusted Management. "Security professionals need to work tirelessly to ensure that when projects are mooted – that is, before they are actually funded – they are well and truly plugged into the purchasing and/or tendering processes," he says.

According to Wenham, this applies equally whether code is created in-house or purchased from an external supplier. "Where code is to be developed externally, the security professional needs to ensure that clauses are included in the contract of supply that call for the code to be developed using current accepted industry good practice for the delivery of secure code, and that outline the acceptance/rejection process for that developed code," he says. For internally developed code, Wenham says the security professional should ensure that internal procedures call for code to be developed with security in mind and to be well documented.

"On the project side, the security professional needs to ensure that all code received, irrespective of it being bespoke, commercial off-the-shelf (COTS) or internally generated, is passed through a 'sheep dip' process to test for viruses and malware," he says. Wenham believes the code should be checked by two different commercial antivirus products during this sheep-dip process, and that the security professional should also ensure the project allows for a test environment where code can be run as if it were in production, to confirm it is "well behaved" and not doing anything that would compromise security.
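A sheep-dip intake step like the one Wenham describes can be automated as a small gate: verify the delivery against the supplier's published checksum, then require a clean verdict from more than one scanner. A sketch under stated assumptions; the scanner commands are placeholders for whichever two antivirus products the organisation actually licenses:

```python
import hashlib
import subprocess

def sha256_of(path: str) -> str:
    """Hash the delivered artefact for comparison with the supplier's checksum."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Placeholder commands; substitute the real CLIs of two licensed AV products.
SCANNERS = [["scanner-a", "--scan"], ["scanner-b", "--check"]]

def sheep_dip(path: str, expected_sha256: str) -> bool:
    """Accept the artefact only if the checksum matches and every scanner passes."""
    if sha256_of(path) != expected_sha256:
        return False  # tampered or corrupted delivery
    return all(
        subprocess.run(cmd + [path]).returncode == 0 for cmd in SCANNERS
    )
```

Requiring two independent scanners, as Wenham suggests, reduces the chance that a single product's blind spot lets malware through.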
While large and complex projects may well have multiple test environments to undertake basic testing, check integration with other project components, and test performance and user acceptance, Wenham points out that small projects may have to rely on good, well-documented back-out procedures so that any received and sheep-dipped code can be run directly in production. "Finally, the security professional needs to ensure that any project has appropriate and effective mechanisms in place to reject faulty code, together with metrics that clearly define the accept/reject parameters, which should tie back to a contract or internal project specification," he says.

To bring about a real shift in the balance of power, customer organisations need to make it clear they will not buy unless their security standards are met, says Vladimir Jirasek, managing director of Jirasek Consulting Services.

Due diligence should include testing

Most organisations rely on software developed by someone else, but Jirasek suggests that, as a standard part of any due diligence process, every application should pass three tests. First, it must fit the overall enterprise application architecture; second, it must meet business continuity requirements; and third, it must meet security requirements. "The problem is that many organisations do not have an enterprise architecture function established, have not done business continuity testing, and certainly do not involve their security team in testing new external applications," says Jirasek.

A review of the due diligence process is a good place to start, says John Colley, European director for (ISC)2. "Within security we lament an inability to be more involved in the software and systems development lifecycle, but with third parties playing a significant role in what is very often a complicated supply chain, a lot can be achieved in reviewing the quality of due diligence that is applied," he says.

Security professionals, says Colley, have the ability to review and understand providers' overall software development process, their approach to security, and testing and requirements analysis in the design. "We can specify expectations in these areas or pose questions that are often missed, around their use of code libraries, other third parties or their appreciation of misuse, as well as use cases, professional credentials for security in software, and the like," he says.

When procuring off-the-shelf packages, Colley says the ability to apply due diligence is more limited, but research can still be done. "Take a look at customer forums and comment, or ask for and speak to references, so you can learn about and assess the complications they may have faced post-implementation," he says.

Security professionals must get involved

Jirasek laments that often the first information security professionals learn of a new application is through internal communication stating the application's launch date. But there are several things a security professional can do to ensure security testing is done before implementation, he says. First, security professionals should update procurement policies and processes to include security team involvement. "Most procurement teams are remunerated based on the discount they negotiate.

A security manager could argue that the company should get a discount based on the number of security issues discovered in the application software. That way, procurement is more likely to involve the security team; indeed, no software is without faults, so some discount is going to be found," he says.

Second, information security professionals should partner with an application testing company that is set up to test applications written in different programming languages, says Jirasek. "If the software supplier has already tested its application, perhaps the testing partner can work with it to normalise the test results.

The outcome from this step is a score that is passed to procurement," he says. Finally, Jirasek recommends that information security professionals update security policy so that applications scoring outside the desired range are not allowed on the enterprise network.

Having a security presence in the boardroom ensures that security becomes an embedded process throughout an entire organisation, says Mike Gillespie, director of cyber research and security at the Security Institute. "Until this happens and security is involved in all relevant purchasing activity, the firefighting and additional expense of playing security catch-up will continue," he says.

Gillespie also believes security professionals need to be in a position to bring pressure to bear on the supply chain, to encourage accountability and insist that software should be secure at purchase. "Security needs to be involved in the procurement process at both ends of the transaction and beyond," he says.
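Jirasek's score-and-policy approach can be reduced to a simple gate over normalised test results. The severity weights and threshold here are illustrative assumptions; each organisation would set its own values in policy:

```python
# Illustrative severity weights; a real policy would define its own values.
WEIGHTS = {"critical": 10, "high": 5, "medium": 2, "low": 1}
MAX_ACCEPTABLE_SCORE = 10  # assumed policy threshold

def risk_score(findings: dict[str, int]) -> int:
    """Weight the count of findings at each severity into a single score."""
    return sum(WEIGHTS[sev] * count for sev, count in findings.items())

def admit_to_network(findings: dict[str, int]) -> bool:
    """Policy gate: applications scoring above the threshold are refused."""
    return risk_score(findings) <= MAX_ACCEPTABLE_SCORE
```

The same score feeds both uses Jirasek mentions: procurement can negotiate a discount against it, and security can refuse network admission when it exceeds the threshold.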

"For instance, the relationship of buyer to seller expands to include the seller's security person talking to the buyer's security person," he says. This approach can cover key risk areas in an appropriate manner, and the buyer knows precisely what they are buying before money changes hands, says Gillespie. "An IT health check should be an integral part of the commissioning and testing process for new software, regardless of its function," he says.

If security on the buyer side goes through the results of testing with security on the supplier side, the way forward starts to look far clearer, more logical and more secure, says Gillespie. "The most economically advantageous approach to procurement may be okay for buying mugs or toilet rolls, but it really doesn't work with software," he says.

Benefits of security-aware procurement

In summary, KuppingerCole UK's Newby says a properly implemented secure development lifecycle can be of real value to a development company, and where a software sales representative can point to a well-known assurance process, a purchasing company would be well advised to take this seriously. (ISC)2's Colley says security professionals must always understand that commercial imperatives may outweigh their own. "But until we take the steps to become better informed about the risks that are being taken, no one can be in a position to make this assessment," he says.

Pointing to the increasingly comprehensive information available on structuring software testing activities from organisations such as Enisa, Nist and Owasp, analyst firm Gartner argues there is no longer any excuse for buying software with poor security. Given all this complexity and variation, it is vital to set the right expectations and create a solid process when acquiring business software, writes Ramon Krikken, research vice-president in Gartner's technical professionals security and risk management team.

The following are important technical and non-technical tasks:

- Commit to making security a part of the acquisition. As with in-house development, business users first have to want security as a required aspect of overall quality. A critical part of creating this commitment is communicating with them about risks and rewards, and establishing mutually agreed goals.

- Find the right process touch points. Mandatory security checkpoints in project management, contract review and sourcing are a must. But since the availability of free and pay-as-you-go software makes it easy for individual business users to acquire software by themselves, additional tools will probably be needed to discover installed software and cloud application use.

- Tailor testing activities to the software and sourcing type. All software should be security-tested at least once before it goes into production, but not all software needs to be tested equally often or in equal depth. Look for some combination of the following: supplier process and testing evidence; third-party testing evidence; third-party certifications and validations (for what they are worth); in-house testing during coding and testing (for some outsourcing); pre-release in-house testing; and testing of software once deployed in production. In all cases, test against documented requirements.

- Put your expectations of the supplier in writing. Except for full off-the-shelf products or SaaS, create specific contract clauses covering: the security requirements; how the supplier will show evidence of software security testing processes and outcomes; the security go/no-go criteria for acceptance; in what way, and how quickly, the supplier will communicate discovery of non-trivial vulnerabilities; the allowable time-to-fix for non-trivial vulnerabilities; and the penalties for non-compliance.

- Gear up for in-house security testing. Organisations should be ready to perform their own application security tests, whether they use a product or a service to do this. Any deployment that isn't completely off-the-shelf is a candidate for in-house testing, even if only to validate the supplier's tests. Some security problems won't show up until deployment into an organisation's staging or production environment, so such testing is very desirable.

- Prepare mitigation tactics. Since most software cannot be guaranteed to be vulnerability-free, and most discovered vulnerabilities cannot be fixed overnight, the ability to isolate vulnerable assets, and to block or at least detect attacks, is essential as a back-up. It can also help to identify previously unknown security vulnerabilities.
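As a sketch of the "block or at least detect" back-up Gartner describes, a request filter can virtually patch a known-vulnerable endpoint until the supplier ships a fix. The endpoint path, parameter and pattern below are hypothetical, not from any real product:

```python
import re
from datetime import datetime, timezone

# Hypothetical rule: a supplier has disclosed an injection flaw in /report;
# block requests carrying SQL-metacharacter payloads until a fix arrives.
BLOCK_RULES = [
    (re.compile(r"^/report$"), re.compile(r"['\";]|--")),
]

def handle(path: str, params: dict[str, str]) -> str:
    """Virtual patch: reject requests matching a known-vulnerability rule."""
    for path_re, payload_re in BLOCK_RULES:
        if path_re.match(path) and any(payload_re.search(v) for v in params.values()):
            # Detection half of the tactic: log the attempt for the security team.
            print(f"{datetime.now(timezone.utc).isoformat()} blocked {path}")
            return "403 Forbidden"
    return "200 OK"
```

The logging matters as much as the blocking: as the text notes, watching what gets blocked can surface previously unknown vulnerabilities and active attackers.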

This was first published in December 2013.