
Tag: Cloud

Cloud computing is a kind of Internet-based computing that provides shared processing resources and data to computers and other devices on demand. It is a model for enabling ubiquitous, on-demand access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications and services), which can be rapidly provisioned and released with minimal management effort. Cloud computing and storage solutions provide users and enterprises with various capabilities to store and process their data in third-party data centers. It relies on sharing of resources to achieve coherence and economy of scale, similar to a utility (like the electricity grid) over a network.

A common understanding of “cloud computing” is continuously evolving, and the terminology and concepts used to define it often need clarifying. Press coverage can be vague and may not fully capture what cloud computing entails or represents, sometimes reporting how companies are making their solutions available in the “cloud” or presenting “cloud computing” as the way forward, without examining the characteristics, models and services involved in understanding what cloud computing is and what it can become.

Advocates claim that cloud computing allows companies to avoid upfront infrastructure costs and focus on projects that differentiate their businesses instead of on infrastructure.[5] Proponents also claim that cloud computing allows enterprises to get their applications up and running faster, with improved manageability and less maintenance, and enables IT to adjust resources more rapidly to meet fluctuating and unpredictable business demand. Cloud providers typically use a “pay as you go” model, which can lead to unexpectedly high charges if administrators do not adapt to the cloud pricing model.
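As a rough illustration of how the pay-as-you-go model can surprise administrators, consider the following sketch. The rates and usage figures are invented for illustration and do not reflect any real provider's price list:

```python
def monthly_bill(instance_hours, rate_per_hour, storage_gb, rate_per_gb_month):
    """Estimate a pay-as-you-go bill: compute is charged per instance-hour,
    storage per GB-month. Rates here are invented for illustration."""
    return instance_hours * rate_per_hour + storage_gb * rate_per_gb_month

# Four instances run only during working hours (~200 h/month each)...
planned = monthly_bill(4 * 200, 0.10, 500, 0.02)

# ...versus the same four instances accidentally left running 24/7 (~730 h/month each)
forgotten = monthly_bill(4 * 730, 0.10, 500, 0.02)

print(planned)    # 90.0
print(forgotten)  # 302.0 - more than three times the planned spend
```

The storage charge is identical in both cases; the entire difference comes from instances billed by the hour whether or not they are doing useful work, which is exactly the adjustment the pricing model demands of administrators.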

The present availability of high-capacity networks, low-cost computers and storage devices as well as the widespread adoption of hardware virtualization, service-oriented architecture, and autonomic and utility computing have led to a growth in cloud computing. Companies can scale up as computing needs increase and then scale down again as demands decrease.

Cloud computing has become a highly demanded service, or utility, due to its high computing power, low cost of services, high performance, scalability, accessibility and availability. Some cloud vendors are experiencing growth rates of 50% per year, but the industry is still in its infancy and has pitfalls that need to be addressed to make cloud computing services more reliable and user-friendly.

Firms tend to block cloud services over productivity concerns rather than security issues, according to data collected about the cloud services blocked by firms. For most companies, blocking a cloud service like Netflix is a no-brainer, as it saps both bandwidth and productivity. Yet IT administrators need to think differently about cloud services to better secure their company and its data, Rajiv Gupta, co-founder and CEO of cloud security firm Skyhigh Networks, told eWEEK.

In a study released this week, the company found that customers are more likely to block popular, safe services than riskier services that could compromise security or cause data leaks. The study, based on real data from customers that use Skyhigh's network monitoring service, found that 46 percent of firms blocked Netflix, 45 percent blocked Foursquare and 39 percent blocked Apple iCloud, but no companies blocked MovShare, myCapture or FileFactory, all considered high-risk services by the firm.

"Companies are taking yesterday's approach to blocking," Gupta said. "IT is still taking a productivity- and bandwidth-based approach rather than risk-based approach to what they need to block." As a greater variety of cloud services appears, companies need a more risk-based strategy to determine which services employees can use, he said. With workers using an average of more than 500 different cloud services, the task is not an easy one.

Moreover, if employees are not educated about why certain services are blocked, or given alternatives, they will simply find workarounds, Gupta said. In one instance, a company blocked backup service Carbonite for fear that data would be leaked or exfiltrated through it. Soon after, an employee started using another service, Elephant Backup, instead. Skyhigh rates Elephant as risky. "What you find is that our IT organization is so in the dark about what is risky that they are making the wrong decisions," Gupta said.
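The distinction Gupta draws between bandwidth-based and risk-based blocking can be sketched in a few lines. The service names echo those in the study, but the bandwidth figures and risk scores below are invented for illustration:

```python
# Hypothetical service catalogue: bandwidth use and risk scores (1 = low, 10 = high)
# are invented for illustration; the names echo those in the Skyhigh study.
services = {
    "Netflix":     {"bandwidth_gb": 120, "risk": 2},
    "Foursquare":  {"bandwidth_gb": 1,   "risk": 3},
    "FileFactory": {"bandwidth_gb": 5,   "risk": 9},
    "MovShare":    {"bandwidth_gb": 8,   "risk": 9},
}

def bandwidth_based_blocklist(catalogue, gb_threshold=50):
    """Yesterday's approach: block whatever eats the most bandwidth."""
    return {name for name, s in catalogue.items() if s["bandwidth_gb"] > gb_threshold}

def risk_based_blocklist(catalogue, risk_threshold=7):
    """Risk-based approach: block the services most likely to leak data."""
    return {name for name, s in catalogue.items() if s["risk"] >= risk_threshold}

print(bandwidth_based_blocklist(services))  # only Netflix - the risky services slip through
print(risk_based_blocklist(services))       # FileFactory and MovShare are caught
```

The two policies produce disjoint blocklists on this toy catalogue, which is the pattern the study observed: the popular, safe services get blocked while the genuinely risky ones pass unnoticed.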
Two big categories that are considered high risk are tracking services and development services, according to the report. Typical Web tracking services, such as KISSmetrics and AddThis, do not deliver any value to a company but can offer attackers enough insight into employee habits to help target waterhole attacks. Such attacks find Websites visited by company employees, attempt to compromise the sites and then deliver malicious code to employees through the sites. Development services can be used as a way of exfiltrating data or as an infection vector.

Even social media sites can be used to communicate data outside the company firewall, according to Gupta. One customer found a user who appeared to send out a million tweets in a day; in reality, the company's compromised systems were exporting data 140 characters at a time via the tweets. "This is all about shedding light on shadow IT," he said.
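The million-tweet anecdote shows how much data fits through a 140-character channel. A rough sketch of the arithmetic, assuming one byte of data per character (an upper bound, since real encodings carry less):

```python
import math

def tweets_needed(payload_bytes, chars_per_tweet=140):
    """Messages needed to smuggle a payload out 140 characters at a time,
    assuming one byte of data per character - an upper bound, since real
    encodings such as base64 carry less than a byte per character."""
    return math.ceil(payload_bytes / chars_per_tweet)

# A million tweets can carry on the order of 140 MB of raw text:
print(1_000_000 * 140 / 1e6, "MB")  # 140.0 MB

# Exfiltrating a 50 MB database dump this way takes hundreds of thousands of messages:
print(tweets_needed(50 * 1024 * 1024))  # 374492
```

The volume alone is the tell: a legitimate user cannot author a million messages in a day, which is why anomalous message counts, rather than message content, exposed the compromise.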
Whenever anyone asks me if the cloud is secure, I typically have a very simple answer: The cloud is no more and no less secure than the controls that enterprises, users and cloud providers put in place. Until this week, I did not consider the Microso...
The IT world is seemingly under constant attack from all corners of the globe, at all times. Reports out in recent weeks provide fresh fuel for the analysis of IT security and what is under attack.

Among the major trends is the fact that mobile malware...
Microsoft switches on security enhancements in a bid to lure enterprise-grade workloads onto its cloud. Microsoft is making Windows Azure tougher to crack, alleviating the security concerns for businesses that a...
Survey respondents believed cyber-security is a serious threat to data and business continuity, with 51 percent rating the threat as very serious. The majority of technology and health care companies view cyber-security as a serious threat to both their data and business continuity, and only one-third are completely confident in the security of their information, according to a survey by Silicon Valley Bank.

The study, based on a survey of 216 executives of technology-based companies, found that nearly all (98 percent) of companies are maintaining or increasing their cyber-security resources, and of those, half are increasing resources devoted to online attacks this year. Resources are most likely to be invested in monitoring, preventative policies, training and staffing rather than in preventative infrastructure, indicating companies are planning for when, not if, they are attacked. Most survey respondents believed cyber-security is a serious threat to both their data and business continuity, with a full 51 percent rating the threat of business interruption as very serious or serious.

The threat to data or intellectual property (IP) was viewed as more significant with 60 percent saying it was a very serious or serious threat. Software companies are more likely than those in health care and cleantech or hardware to consider a cyber-attack a serious threat in terms of business interruption (58 percent versus 47 percent and 41 percent, respectively). Overall, companies expressed confidence in the security of their information, but only 35 percent said they were completely/very confident, while 57 percent were moderately confident. Survey respondents were far less confident about the security measures taken by their critical business partners, including vendors, distributors and customers, with only 16 percent completely/very confident. "The survey shows us the threat of a cyber-attack is not just hype.

A surprising number of technology companies we heard from say the threat to their IP and their business is very serious," Bob Curley, managing director of corporate finance for Silicon Valley Bank, said in a statement. "Companies in the tech sector, particularly software companies, are feeling exposed, and increasingly having to expend resources to manage cyber-attacks, rather than investing in the growth of their business. That's a huge impact on a growing company, and eventually the economy overall." Most respondents said they still store company information either privately or only partially in the cloud, indicating a lower appetite for using the public cloud, even with security measures in place, to cost-effectively store important company information.

While 52 percent said they are storing information privately, nearly half are using the public cloud for at least some of their data. Software companies were the most aggressive users, with 59 percent using the public cloud for at least some of their data. Early-stage companies are also more likely to embrace the cloud, presumably to lower their costs and achieve efficiencies even at a small scale.

For nearly half of executives surveyed, increased media attention to cyber-attacks has changed the tide of opinions on cyber-security and has helped heighten its priority in companies across the board. "In addition to risks to their intellectual property and business interruption, they may be keenly aware of the mounting customer concern that comes from rising media exposure–which in turn creates even more incentive to beef up cyber-security resources," the report noted.
The rise of virtualisation, cloud computing, big data and the Internet of Things within enterprise IT has not really killed the mainframe. 93% of global IT execs cited mainframe as a robust, long-term solution in their enterprise IT strategy. Mainframes are hardware used for large-scale computing purposes that require greater availability and security.

While first-generation mainframes occupied 2,000 to 10,000 square feet within a datacentre, newer mainframes are about as large as a refrigerator. Mainframes were popular in the 1980s, and many predicted the death of the mainframe in the late 90s and at the beginning of the new millennium because of its lack of scalability and the lack of skills in the IT workforce to manage mainframes. But the 8th annual worldwide survey of mainframe users by BMC Software has found that mainframe platforms continue to play a critical role in delivering computing power at a time when users expect access anytime, anywhere, regardless of the data volume and velocity implications. IT professionals in Europe, the US and Asia-Pacific said that the mainframe is not only sticking around within their IT estate, but that it is evolving to play an increasingly important role in enterprise IT environments. Almost all respondents (93%) indicated that the mainframe is a robust solution and part of their long-term IT strategy.

About half (50%) said that the mainframe will grow and continue to attract new IT workloads in their enterprises. None of the large enterprise respondents said they plan to eliminate mainframes from their IT infrastructure, and just 6% of small enterprises indicated plans to do so. Two-thirds of respondents also said that the mainframe will be incorporated into their big data or cloud strategies.

“Mainframe will continue to play a big role in IT transformation, particularly in these times of unprecedented change,” said Jonathan Adams, general manager of data management at BMC Software, which provides mainframe management software. Particularly in finance, banking and insurance environments, long-established IT departments with many years' experience still have corporate databases running on mainframes.

IBM, one of the biggest suppliers of mainframes, continues to invest in developing its mainframe line of products. In 2012, after a 10% to 20% dip in revenue in the first three quarters, IBM saw mainframe sales increase 56% in the fourth quarter. “The unprecedented pace of technology evolution and the consumerisation trend is only solidifying the need for platforms with superior availability, security and performance capabilities,” Adams said. Availability advantages (72%) and security benefits (70%) of mainframe technology were cited as the main reasons for continued investment.

Its suitability for running legacy applications, lower datacentre costs and centralised IT were other reasons IT execs cited for using mainframes. But the study also revealed that the current IT workforce lacks the skills to deploy and operate a mainframe environment. To overcome the mainframe skills shortage, enterprises are investing in internal training programmes (52%) and hiring experienced candidates (39%). Many are also looking to outsource or automate mainframe management. Respondents also indicated that nearly half (46%) of their mainframe budgets are spent on software. About 62% of the IT execs surveyed work in enterprises with over $1bn in revenue; 45% represented the financial and insurance industries, while 20% represented government.

Email Alerts Register now to receive ComputerWeekly.com IT-related news, guides and more, delivered to your inbox. By submitting you agree to receive email from TechTarget and its partners.

If you reside outside of the United States, you consent to having your personal data transferred to and processed in the United States. Privacy Read More Related content from ComputerWeekly.com
While cloud storage has gained significant ground in terms of adoption, the amount of data pushed to the cloud remains relatively light. Widespread acceptance of cloud storage indicates the impact it can have on disaster recovery strategies, according to a survey of 288 IT personnel by cloud-based storage specialist TwinStrata.

Sixty percent of respondents currently use cloud storage, with another 23 percent planning to use it. With more than four out of five (83 percent) organizations planning to use cloud storage in some capacity, survey results indicated that cloud storage has gained significant ground in terms of widespread acceptance and even adoption. Furthermore, when asked about current and future plans, the survey revealed that current users have already implemented cloud storage for two or more use cases, with plans averaging four use cases per organization.

Sixty percent of respondents estimated that they could recover from a full disaster (defined as the need to bring up both apps and data in a secondary location) within 24 hours. However, one in five (20 percent) estimated that it would take them more than three days. Interestingly, while cloud storage has gained significant ground in terms of adoption, the amount of data pushed to the cloud remains relatively light, with 55 percent of cloud storage users having less than 10TB in the cloud. Only 25 percent have more than 50TB.

"Last week’s announcement that Nirvanix is shutting down has caused many organizations to question whether cloud storage is worth the perceived risk," Nicos Vekiarides, CEO of TwinStrata, said in a statement. "Our survey clearly shows that the benefits of cloud storage are far too compelling to ignore, particularly from a disaster recovery perspective. With the right technology and migration plan in place, customers can recover from any disaster–even the loss of their cloud storage provider."
When asked specifically about their current backup and disaster recovery strategy, 40 percent of respondents indicated that offsite tape backup plays a role, and 37 percent indicated that they have either a hot or a cold standby site available to them.

Approximately one-quarter (24 percent) said they rely on cloud storage in whole or in part for their disaster recovery needs. However, despite the availability of affordable data recovery options such as cloud storage, 1 in 3 organizations relies solely on either onsite backups or offsite tapes as their only backup strategy, while 1 in 10 relies only on cloud storage for backup and disaster recovery. "From a disaster recovery perspective, we find clear evidence that a cloud-storage-focused disaster recovery strategy makes a significant difference to an organization’s ability to recover from both data and site disasters," the report concluded. "With the advent of cloud storage, on-demand disaster recovery and other options, organizations that struggled to have comprehensive disaster-recovery plans, due to either costs or resources, now have the ability to maintain a disaster-recovery strategy that’s both affordable and effective."
Information security professionals are making progress, but they are still losing the race against adversaries, according to Hord Tipton, executive director of security professional certification body (ISC)2. One of the biggest challenges is the lack of skilled people to help mitigate the security risks as businesses move into mobile and cloud computing. “Even banks are going into mobile transactions, despite the fact that this is still one of the most dangerous areas in terms of security threats,” he told Computer Weekly.

Despite the lack of skilled people, Tipton said there is room for improvement as those in the field continue to “fight the good fight” to bring the effect of cyber attacks down to an acceptable level. “Typically only around 10% of the easy stuff is being addressed, but we cannot afford to ignore the low-hanging fruit,” he said. Tipton said improvement can come in areas such as reducing the number of days it takes for organisations to detect that they have been breached, which is around 320 days on average.

Security skills must be kept up to date

As head of (ISC)2, Tipton sees ongoing education as important, and the constantly changing threat landscape is why members are required to re-certify every three years. In addition to broad-based skills, Tipton said there is a growing demand for specialised skills in areas such as software assurance and forensics. “I am excited about the new Certified Cyber Forensics Professional (CCFP) certification because that will enable practitioners to learn what they need to feed back into the preventive side to ensure the same weaknesses are never exploited again,” he said. The new Certified Secure Software Lifecycle Professional (CSSLP) certification is aimed at creating skills in building software that is secure from the start of the development process. “If businesses knew the cost of patching bespoke, in-house and even commercial software, the demand for software assurance would be extremely high,” said Tipton.
With the skeptical view that most organisations will be breached at some time or other, Tipton said it is important that organisations do not neglect traditional preventive strategies. “If we are ever to get losses down to acceptable levels, we can’t give up on prevention,” he said. Two commonly overlooked areas are application security updates or patching and proper configuration management.

Although both require manpower, Tipton said that, if done properly, the gains are huge. Continuous monitoring is a newer area that is becoming more prominent, he said, which is a good thing because it is a form of preventive control, but it requires a forensic capability to translate data into action. This is where security practitioners who have forensics training could be invaluable to organisations, being able to analyse and interpret logs to help fine-tune cyber defences.

Investing in future information security professionals

To help ensure more skilled people enter the cyber security profession, Tipton said organisations have to create career pathways in the field to attract talented individuals from a young age. Through around 10,000 volunteers in 105 member chapters around the world, (ISC)2 runs school programmes to help “build the pipeline” of information security professionals. “Organisations need to make it known that they offer challenging and lucrative careers in cyber security,” said Tipton. In another outreach initiative, (ISC)2 is running a pilot training programme for graduates to become associate members with a view to becoming fully fledged members once they have work experience.

Looking ahead, he said businesses need to ensure they have the right people with the right skills and that the software they are using is free of vulnerabilities that can be exploited. With the amount of outsourcing, including cloud, it is also important for businesses to achieve security oversight throughout the supply chain. For its part, (ISC)2 is working with the Cloud Security Alliance (CSA) on a certification for advanced cloud practitioners.

American private, hybrid and public cloud services provider Nirvanix has advised its customers to move data off its services before September 30 when it shuts down.

The closure should serve as a wake-up call for end users contracting for cloud services, and should prompt them to plan exit and migration strategies at the outset, experts have warned. Last week the ailing cloud storage provider advised customers to stop uploading primary or secondary copies of data onto its platforms and urged them to migrate to other public cloud providers such as Amazon S3, Google Storage, Microsoft Azure or IBM SoftLayer. The company’s confidential email to customers warned them about its closure, giving them two weeks to move data off its cloud. The company plans to close the two or three datacentres it has leased in the US, Europe and Asia.

Other Nirvanix datacentres will be used to pull data out of the systems. The company has a programme called Direct Connections to help customers migrate data over a LAN or the internet. But Forrester senior analyst Henry Baltazar has warned that, even over large network links, it could take days or even weeks to retrieve terabytes or petabytes of data from a cloud. “If you used Nirvanix for third or fourth duplicate copies, you need assurance that data will be destroyed.

If you used it for primary data you need that data back, and that is no trivial task right now,” said 451 Research Group analyst Simon Robinson in a Computer Weekly article. “The whole scenario is clearly also a big blow to the cloud storage model, since it apparently validates fears over the risks of handing your data over to a third party,” Robinson said. Launched in 2007, Nirvanix was struggling to compete on price with giant public cloud providers such as Amazon, Google and Microsoft.

Meanwhile, the not-for-profit industry body the Cloud Industry Forum (CIF) said that enterprise customers must ensure they have contractual clarity on service delivery and on how a relationship formally ends at a practical level, regardless of the cause of termination. The cloud company’s closure is not dissimilar to the closure of UK datacentre firm 2e2 earlier this year; 2e2 asked its customers for nearly £1m in funding if they wanted uninterrupted services and access to their data. “Offering a mere two weeks in which to migrate customer data from their servers, Nirvanix have essentially hung their customers out to dry,” said CIF founder Andy Burton. But end users must take steps to mitigate risk and assume ultimate responsibility, Burton warned.

The sudden closure of Nirvanix should serve as a necessary reminder for end users to seek contractual clarity and reassurance from cloud service providers (CSPs): to understand how a service will be delivered, where and how the data is stored, what the minimum notice period to move data to another provider is, in what format the data will be returned, and what costs will be incurred (if any) to retrieve their data, he said. Customers must always seek clarity on how the service will be delivered, who is accountable and liable for which aspects of service continuity, and ultimately what the process and timescale are to disengage and move data in a planned or forced termination, added Burton.
“When it comes to procuring cloud services, caveat emptor still applies today. Whether buying direct online or via a third party, it is essential that an organisation can establish confidence in the service provider and be confident in both its expectations and experience throughout the life of a contract,” he said. The industry body also urged end users to take stock and always maintain overall governance of how IT is delivered, and therefore to assume and maintain ultimate responsibility for the decisions they make, whether on-premise or in adopting cloud services.

But CIF also called for cloud service providers to “act honourably”. “CSPs have an opportunity to achieve competitive advantage by being straightforward and transparent about their business practices and processes. Clarity of obligations, service levels, service options and issue resolution will positively reduce risk for end users in making their supplier choice decisions,” Burton concluded.

Exit strategy: a crucial aspect of the cloud contract

European virtualisation and cloud expert Ruben Spruijt advised enterprise customers that it is crucial to consider the exit strategy at the time of entering into a cloud contract. Forrester’s Baltazar adds that, in addition to planning an exit strategy, customers should plan migration strategies as they formulate their cloud storage deployments. “One of the most significant challenges in cloud storage is related to how difficult it is to move large amounts of data from a cloud,” Baltazar said. Gartner, too, has been advocating the importance of cloud exit strategies to clients for some time and has published a report titled “Devising a Cloud Exit Strategy: Proper Planning Prevents Poor Performance”. “I’m sad to say, however, that compared to many other Gartner research documents, this document has not seen nearly the amount of demand or uptake from our clients,” Gartner research director Kyle Hilgendorf said on his blog. “Why is that?
I suspect it is because cloud exits are not nearly as sexy as cloud deployments – they are an afterthought.” “If you are a Nirvanix customer, it’s too late to build a strategy. Get as much of the data as you can back immediately,” Hilgendorf said. “If you are a customer of any other cloud service – take some time and build a cloud exit strategy/plan for every service you depend upon. Cloud providers will continue to go out of business,” he further warned.
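Baltazar's point that pulling terabytes or petabytes back over a network link can take days or weeks follows from back-of-envelope arithmetic. A small sketch; the link speed and the 70% efficiency factor are illustrative assumptions, not figures from the article:

```python
def transfer_days(data_tb, link_mbps, efficiency=0.7):
    """Back-of-envelope time to pull data out of a cloud over a network link.
    Assumes sustained throughput at `efficiency` of the nominal link rate;
    all figures here are illustrative, not from the article."""
    bits = data_tb * 1e12 * 8                      # decimal terabytes -> bits
    seconds = bits / (link_mbps * 1e6 * efficiency)
    return seconds / 86400                         # seconds -> days

# 100 TB over a 1 Gbps link at 70% efficiency: about 13 days
print(round(transfer_days(100, 1000), 1))

# A petabyte over the same link stretches past four months
print(round(transfer_days(1000, 1000), 1), "days")
```

Against a two-week shutdown notice, even the 100 TB case leaves almost no margin, which is why planning the exit before signing the contract matters so much.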

Security is constantly changing, which means security professionals need to be proactive, says a former US military CIO. “They need to look at security from multiple perspectives all the time,” Maria Horton, chief executive of information assurance firm Emesec, told attendees of the (ISC)2 Security Congress 2013 in Chicago. But many feel overwhelmed by the security implications and enormous potential pitfalls of moving to mobile and cloud environments, she told Computer Weekly. “Our society is becoming more dynamic, which means preconceived notions of security practices need to change too,” she said.

Assessing security priorities

According to Horton, information security professionals need to evolve their approach to security to match technological evolution, and give up the old idea of achieving “perfect” security. “Start by determining the organisation’s risk tolerance and formulate a security strategy accordingly, giving the most protection to what matters most,” she said. Companies that need to grow would typically have a greater risk tolerance than government agencies, such as the US Department of Veterans Affairs, which is one of Emesec’s largest customers. Security has more chance of success, said Horton, if it is concentrated on only the most critical information assets rather than trying to protect everything. The difficulty with a risk-based approach to security, she said, is that many security professionals are unwilling to prioritise out of fear of making the wrong choices. However, making choices is also necessary to ensure that limited security budgets are well spent on protecting what is really important, said Horton.

Expand the security skills base

In recognition of the rapidly changing technology and threat environments, she said businesses should also change their security recruitment practices to seek out people with multiple skills. “Non-traditional backgrounds in such things as combating fraud can provide different perspectives,” said Horton.
Security professionals need to start integrating non-traditional skills into their teams if they want to remain relevant to their organisations in future, she said.

Cloud security challenges

Turning to the specific security challenges of cloud computing, Horton said all contracts should include a get-out clause for cases where service providers fail to meet agreed service levels for security. Before signing up to a service, organisations should also determine how they would migrate to another service provider if necessary, and define a clear exit strategy. “Ask important questions around what happens to data when you leave, about who owns what data, and how intellectual property will be protected,” said Horton. It is also important to ensure upfront that security strategies are flexible enough to keep up with potential changes, such as a move to another cloud services provider. “There is still no clear security standard for cloud, so organisations must accept that they will essentially need to make one on their own,” said Horton. Looking to the future, she said that, over and above baseline security, organisations should expect to pay for the level of security they want. In any event, organisations should ensure that management of cloud service level agreements is a key part of the overall cloud strategy.

The DCS-2136L includes the D-ViewCam management software for expanded surveillance options, allowing users to manage up to 32 network cameras. Network solutions provider D-Link Systems, which specializes in small- and midsize-business (SMB) products, announced the DCS-2136L, which the company is billing as the world’s first IP surveillance camera with wireless 11ac support for longer range and greater bandwidth availability. The mydlink-enabled day and night camera with wireless N connectivity offers small-business owners networking and viewing capabilities, including a white light LED for color night vision viewing in complete darkness.

The technology allows the camera to show images in complete darkness while remaining in color mode, and the visible white light LED serves as a warning sign that the camera is on, and often results in intruders looking directly at the camera. The DCS-2136L includes the company’s D-ViewCam management software for expanded surveillance options, allowing users to manage up to 32 network cameras, set email alert notifications, create recording schedules and use motion detection to record directly to a hard drive.

The camera will be available in late 2013 through D-Link's network of channel partners, including value-added resellers (VARs) and distributors.

Additional features include Wi-Fi Protected Setup (WPS) for three-step installation, a built-in passive infrared (PIR) sensor for enhanced motion detection, two-way audio with a built-in microphone and speaker, a microSD (SDHC) card slot for local storage, 1280x720 (1-megapixel) HD resolution, H.264, MPEG-4 and Motion JPEG compression, and support for the Open Network Video Interface Forum (ONVIF), a global and open industry forum whose goal is to facilitate the development and use of a global open standard for the interface of physical IP-based security products.

"With the demand for residential and SMB wireless surveillance steadily increasing, D-Link is moving the market forward with the first IP camera to support the new Wireless 11ac standard," Vance Kozik, director of marketing for IP surveillance at D-Link, said in a statement. "Whether you are looking for an entry-level, DIY camera or a high-end professional enterprise solution, D-Link is dedicated to providing the best, most advanced IP surveillance solutions to its customers."

Featuring mydlink support, the DCS-2136L offers user-friendly installation and integration into an existing network, with the mydlink portal providing streaming video on a PC, notebook, iPhone, iPad or Android mobile device, as well as enhanced remote capabilities via the mydlink+ and free mydlink Lite apps. Both apps offer access to camera feeds from anywhere, along with updated features for expanded remote control, including motion detection settings, pinch-to-zoom viewing, day/night viewing options, drag-and-drop reordering of devices, two-way audio, remote pan and tilt of live video, and the ability to configure recording schedules and override options. However, the company noted that mydlink application capabilities vary depending on the camera model.
A mydlink-enabled cloud camera or cloud router must be registered with mydlink in order to use the service. Once the device is physically plugged in and connected to the network, it can be configured from the setup wizard on a Windows PC or Mac.

A free mydlink account can be created during the setup process, or the device can be added to an existing account. Once activated, the mydlink account is accessible from a Web browser on any Internet-connected computer.
Nearly four out of 10 (39 percent of) developers working with big data applications say they believe a government agency is keeping track of the data they create, collect or use, according to the findings of a new survey. More than one-third of developers polled believe they have reason to think the government is watching their data, according to the study released by Evans Data, which regularly gathers data on the global developer population.

This is more than simple paranoia, according to Evans Data officials, who based the "Data and Advanced Analytics Survey 2013" on input from more than 440 developers who work with databases and analytics. Among respondents who indicated they are confident they could tell if the government was tracking their data, the proportion convinced that they are being tracked jumps to 59 percent.

Among those who did not think they would personally be able to tell, only 23 percent suspect government tracking.

The survey, completed in August 2013, covers a wide range of topics related to data, analytics and storage, including sections on data security, which is implicated in data tracking. Big data presents new problems in implementing security, as does governmental interference, Evans Data said. For instance, one question in the survey asked: "Have you run into a situation in which your traditional security mechanisms for data don't work with big or unstructured data?" According to Evans Data, 72 percent of developers who suspect governmental tracking answered "yes." The belief that governmental tracking is ongoing spans industries, the survey showed.

"Big data and big government both bring unique challenges," Janel Garvin, CEO of Evans Data, said in a statement. "Security becomes not only a technical issue but can also become a policy issue that developers may not be able to address. In addition, cloud platforms, while providing necessary scalability for big data, may also increase the risk of governmental eavesdropping."

While developers are split over whether their data will be stored on-site or in the cloud, two-thirds agree that the typical data or analytics project must be integrated into an enterprise-wide data warehouse rather than segregated from other data projects, increasing the need for security. The Evans Data survey covers topics such as the environment for big data, advanced analytics tools and services, real-time event processing, database technologies, data storage, shared resources and the cloud, general technology use, and security concerns.

Public skepticism of government snooping has been widespread since former U.S. National Security Agency (NSA) contractor Edward Snowden leaked word of an electronic surveillance program called PRISM, in which the NSA analyzed email and telephone data in an effort to find patterns of activity that the agency claims provided valuable intelligence in the fight against terrorism. Since Snowden's leak, major technology firms, including Apple, Google and Facebook, have been battling allegations that the U.S. government enjoyed direct access to the servers in their cloud data centers and the user data contained within.

Concerns about snooping are widespread. eWEEK Senior Editor Sean Michael Kerner reported that Linus Torvalds, the creator of the Linux operating system, was asked at the recent LinuxCon 2013 event whether U.S. government officials had requested a backdoor into Linux. "Torvalds responded 'no' while nodding his head 'yes,' as the audience broke into spontaneous laughter," Kerner wrote.