
CIOs are ill-prepared for data-driven business

The “I” in CIO has less to do with information than with technology. Gartner’s information and data management event in London focused on how to derive value from information. But the CIO is unlikely to get the call when the business wants to create and foster information assets.

“In successful organisations, someone owns the vision, execution and strategy to make it successful, such as the CFO role, or a VP of operations,” said Gartner analyst Ted Friedman. “Even office rubbish is managed better than information. Organisations need a chief data officer to maximise information assets. There are 300 CDOs today, and this number will double.”

Gartner research director Dan Sommer warned that business intelligence is still focused on running analytics on internal data, while users want greater access to external data streams – especially given that 50% of business data now lives in the cloud. “Many new data sources are emerging that are external to the organisation,” said Sommer.

Debra Logan, vice-president and Gartner fellow, said IT leaders should shift from a technology focus to asking what business outcome needs to be achieved. Logan said that in her experience of speaking to Gartner clients, “over 50% of clients have no answer, 30% have the wrong answer and only 5% have the right answer”. Rather than discuss IT systems, Logan urged IT decision-makers to focus on business outcomes. “Every information conversation is a business conversation,” she said.

Logan also urged IT heads to change the way they measure success. “CIOs care about projects that are delivered on time and on budget,” she added. “But for an information professional, what matters is business outcomes.” Such outcomes include measures of process improvement, finding new revenue and fostering growth.

Democratising information

Among the themes coming out of the Gartner Business Intelligence and Analytics Summit was the fact that IT can no longer keep pace with the demands of the business.
Logan said: “We need to get everyone in the organisation to understand information.” Transforming business to a point where information is collected and harnessed will involve organisations changing the way data is governed, she said. “Monolithic governance does not work. Governance should be democratic, but many organisations are far from democratic.”

Logan urged IT heads to consider a bimodal approach to information governance, which would support the democratisation of data by separating information that must be retained centrally for compliance reasons from data that can be distributed to the business. “Most of what you do is not controlled by a regulator. Agile information governance requires a decentralised organisation,” she said. According to Logan, regulated information and vital records represent only 5% of a company’s data. “The majority of information in an organisation should be governed in an agile way,” she said. In fact, Logan recommended that 90% of governance should be devolved to the business user.

Bupa's IT director for data strategies and applied technologies, Tony Cassin-Scott, admitted that when he joined the healthcare organisation, he started out looking down the wrong end of the proverbial telescope. “I wasn't sure of the problem I needed to solve,” he said. But he has since built a pragmatic, value-based data strategy, which he said underpins the company’s 2020 vision, tying the data strategy to the business strategy. “I wanted to think of the culture and vision of the organisation by creating an insight-driven world where information is democratised,” said Cassin-Scott.

Cassin-Scott has also looked at using operational data to generate value for the business. For instance, he looked at the 300 care homes that Bupa runs in the UK and found that they buy a large quantity of low-value goods – an opportunity to apply economies of scale.
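Logan's bimodal split – a small, regulated core governed centrally, with the bulk devolved to business users – can be sketched as a simple routing rule. The categories and record names below are hypothetical examples for illustration, not Gartner's taxonomy:

```python
# Illustrative sketch of bimodal information governance: records a regulator
# touches are routed to central, controlled governance; everything else (the
# large majority) is devolved to business users for agile governance.
# REGULATED_CATEGORIES is an invented example set, not a real standard.

REGULATED_CATEGORIES = {"financial_filing", "patient_record", "vital_record"}

def governance_mode(record_category: str) -> str:
    """Return 'central' for compliance-bound data, 'devolved' otherwise."""
    return "central" if record_category in REGULATED_CATEGORIES else "devolved"

inventory = [
    "financial_filing", "marketing_asset", "meeting_notes",
    "patient_record", "sales_dashboard", "draft_proposal",
]

modes = {cat: governance_mode(cat) for cat in inventory}
devolved_share = sum(m == "devolved" for m in modes.values()) / len(modes)
print(modes)
print(f"devolved share: {devolved_share:.0%}")
```

In this toy inventory, two-thirds of the records end up under devolved governance – in line with Logan's point that the regulated slice is a small minority of a company's data.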
To encourage skills and best practice, Bupa has built a community of 300 data analysts to share ideas, and this is augmented by webinars for business users. “We educate the masses to know what questions they can ask,” said Cassin-Scott. “There is no shortage of technology. What we lack is imagination.” Cassin-Scott, who also runs data discovery sessions, added: “Bring out your data and we'll find value. Without exception, every session generates value, from £500,000 to £14m.”

‘GCHQ doesn’t do mass surveillance or data collection’, claims Parliamentary security...

The long-awaited Parliamentary Intelligence and Security Committee report into privacy and security, published this morning, has dismissed suggestions that GCHQ and other parts of the UK's intelligence services circumvent the law. The claims followed the leak of US National Security Agency (NSA) material by whistleblower Edward Snowden, which seemed to indicate that the UK's security services were engaged in mass data collection and surveillance.

"The [security] Agencies do not have the legal authority, the resources, the technical capability, or the desire to intercept every communication of British citizens, or of the internet as a whole: GCHQ are not reading the emails of everyone in the UK," says the report. It continues: "GCHQ's bulk interception systems operate on a very small percentage of the bearers that make up the internet. We are satisfied that they apply levels of filtering and selection such that only a certain amount of the material on those bearers is collected. Further targeted searches ensure that only those items believed to be of the highest intelligence value are ever presented for analysts to examine: therefore only a tiny fraction of those collected are ever seen by human eyes."

It added that the current legal framework has led to widespread confusion, yet also asserted: "We have established that bulk interception cannot be used to target the communications of an individual in the UK without a specific authorisation naming that individual, signed by a Secretary of State."

The report has therefore called for the security services to be put under clearer legislation to overcome what it describes as an "unnecessarily complicated" legal framework. "Our key recommendation therefore is that the current legal framework be replaced by a new Act of Parliament governing the intelligence and security Agencies.
This must clearly set out the intrusive powers available to the Agencies, the purposes for which they may use them, and the authorisation required before they may do so," the report says.

Campaign group Privacy International accused the committee of whitewashing the extent of GCHQ's mass surveillance and data-gathering operations. In a statement, it said: "Far from allaying the public's concerns, the Intelligence and Security Committee's report should trouble every single person who uses a computer or mobile phone: it describes in great detail how the security services are intercepting billions of communications each day and interrogating those communications against thousands of selection fields." It called for the new legal framework proposed by the committee to provide genuine restraints on the power of GCHQ and the security services, and for greater judicial oversight of their surveillance and data-gathering activities.

Former MI6 deputy chief: What we’re doing is not mass surveillance

Interviewed on the BBC Radio 4 Today programme this morning, ahead of an ISC announcement expected later today, Jimmy Wales of the Wikimedia Foundation, which yesterday launched legal action against the US National Security Agency, argued that mass surveillance of communications by the intelligence services has to stop. "For me one of the key elements is one of the oldest bits of jurisprudence in free societies, which is probable cause. Get a warrant, go to a judge. Don't surveil everyone," he said. "Bulk collection of data is incredibly dangerous. We've been very lucky that we live in a society where we don't have leaders who are using that kind of data for political assassinations and so forth, but that possibility exists so long as this data is being collected," Wales added.

Interviewed at the same time, Nigel Inkster of the International Institute for Strategic Studies, and former deputy chief of the intelligence service MI6, sought to downplay the activities of the spy agencies. Asked whether mass surveillance should be troubling in a free society, Inkster said: "It would be troubling if it were mass surveillance, but it's not what we're talking about here. It is bulk collection of civilian telecommunications, something which has actually been going on for decades without obvious detriment to civil liberties or human rights, in order for the intelligence agencies to identify very narrow and specific sets of information about threats." This is an interesting statement, given that such activities may well have been against the law.

Inkster also sought to broaden the argument to cover the activities of commercial organisations such as Google, contrasting the legal framework under which the security agencies must operate with the free-for-all of the internet.
"I think the biggest thing to come out of the Snowden revelations is the growing realisation by people around the world [of] the degree to which their personal data has been traded and commoditised without their active consent by [commercial organisations] who are operating with very few of the constraints under which intelligence services in democratic societies do operate," Inkster said.

Wales believes the rise of end-to-end encryption will soon make life very difficult for the intelligence agencies. Citing the example of WhatsApp, he said: "If you use the application, everything you type is encrypted from your computer to your friend's computer. The service itself has no way to read it, GCHQ can't read it, NSA can't read it. That's what consumers are demanding today because of the overall intrusive nature of what's been going on." Wales went on: "People in China are using Tor to browse websites. The genie is out of the bottle and there's nothing to be done about it."

Inkster's response was that this would be a dangerous thing. Arguing that the rise of encryption is more to do with market share than concerns over personal privacy, he said: "The big question to be answered here is about whether we do actually want to live in a world where no communications can be controlled." He went on: "There is no doubt that the Snowden revelations have provided for many malevolent actors on the internet a roadmap that minimises the risk of them getting caught."
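The end-to-end property Wales describes can be illustrated with a toy model: the relay ("the service") forwards bytes it cannot read because only the two endpoints hold the key. This XOR keystream cipher is a teaching sketch only, not how real messengers work – WhatsApp and similar apps use the Signal protocol (Curve25519 key agreement, AES, HMAC):

```python
# Toy end-to-end encryption sketch. The relay sees only ciphertext; only the
# endpoints, which share a key established out-of-band, can read the message.
# Do not use this construction for real secrecy.
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudo-random byte stream from the key (hash counter mode)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR with the same keystream is its own inverse

shared_key = b"known only to the two endpoints"
message = b"meet at noon"

ciphertext = encrypt(shared_key, message)          # all the relay ever sees
assert ciphertext != message                       # relay cannot read it
assert decrypt(shared_key, ciphertext) == message  # recipient can
```

The point of the sketch is architectural, not cryptographic: because the key never touches the relay, "the service itself has no way to read it" holds by construction.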

Facebook security flaw enables attackers to control users’ PCs

Millions of users could be at risk from two vulnerabilities, but Facebook downplays significance

Five tech trends that will change the mobile world in the...

At Mobile World Congress (MWC) in Barcelona, Computer Weekly sat down with Kevin Curran, computer science lecturer at the University of Ulster and a senior member of the Institute of Electrical and Electronics Engineers (IEEE), to explore some of the trends shaping the future of this industry.

With more than 800 published works to his name, Curran is a noted expert in networking technology, as well as group leader for the Ambient Intelligence Research Group, a body that explores the future of the connected world, human-centric technology and the internet of things (IoT). As an independent observer at MWC, free of PR minders and corporate constraints, Curran can talk more candidly than the vast array of exhibitors at the world’s largest mobility trade fair, and there is no doubt some of his views diverge from the mobile party line.

1. Smartphone makers are in deep trouble

MWC was notable this year for a lack of momentous announcements or major advances in smartphone technology, and a feeling that things were just ticking over. “Obviously nobody can see what’s around the corner,” says Curran, “but everything just seems more incremental than the year before. Smartphones are reaching a plateau in terms of size, in terms of sensor and pixel resolution, so the manufacturers are desperately trying to distinguish themselves in smaller ways.”

One trend Curran does pick up on is that the Samsung Galaxy S6 – one of the headline smartphone launches at MWC – did not have as much bloatware as its parents and grandparents, suggesting a move away from the walled garden approach to mobile phone operating systems. “People are becoming savvy to that,” he says.
“Consumers want the phone to be as neat as possible, out of the box, really, without all the add-ons.” This indicates to Curran that the smartphone market has hit saturation point and, faced with the prospect of becoming as much a commodity as the bland, white desktop PC boxes of the turn of the century, manufacturers have to be seen to be doing something about it. The trouble is that nobody yet knows what.

“We’ve seen how even Samsung has struggled with its phones,” he says. “Apple is riding high, the number two smartphone maker is Samsung, but its sales are dropping – the S5 sold 40 million units fewer than the S3. There’s a lot riding on the S6, but just because it’s a flagship phone and gets all the press attention, that doesn’t mean it’s a best-seller.” Curran adds: “It is so hard to distinguish yourself in this market. If you walk onto most of the stands here, they all look similar. When did you last get excited about a phone? For most people, upgrading is just a case of upgrading from a cracked screen to an uncracked screen.”

2. Wearables will be huge but fitness is a gimmick

In the rush to stand out from the bland crowd, the opportunity for smartphone suppliers such as Samsung – and even Apple, which has just launched its first wristwatch product – will be in wearables, Curran believes. “People only really have one phone, but when it comes to wearables, you have more chances to sell a product because in the future, people will have a lot more wearables on them, whether watches or fitness trackers, or even smart clothing,” he says. “There will be a massive opportunity to sell more products to us.”

The big hurdle will be battery life, says Curran, and with early reports of the Apple Watch suggesting that this has not yet been overcome, wearables will remain notification tools rather than phone replacements. But if you think it’s all going to be about fitness products and the quantified self, then think again, says Curran, who describes fitness as a gimmick.
He suggests the manufacturers are missing a lot because they all seem to have concentrated on fitness. “I feel that fitness and premium products don’t really match – there’s a lot of sweat and a lot of rigorous activity,” he says. “As a guy who runs six days a week, if you’re into running, you just kind of know what to do, your distance, your time, and they have all these features, but it’s gimmicky and it never translates into something that’s desirable and reaches a wider market, so I really do doubt the early emphasis on fitness.”

The logical question to ask at this point is where wearables should be pitched. In Curran’s view, it’s around health, physical security and mobile payments, using the smartphone as a hub for connectivity back to the network. With mobile payments a big theme at MWC 2015, Curran says we are rapidly approaching a tipping point where the ability to make payments via a smartphone or wearable, or a combination of the two, will go mainstream. “This is the year that makes or breaks it,” he says. “It’s going to happen – it’s just a case of who’s going to be the main player. The same with crypto-currency – that will happen, whether it’s bitcoin or not. It’s a no-brainer. There has to be an e-currency; there’s an online world that we’re all moving to to make payments, so it makes sense.”

3. Our ideas about what 5G will be are way off the mark

Wearable or smartphone, the mobile device of the future will need network connectivity. With 4G roll-out now well advanced in many mature markets and attention turning to 5G research and standards, what will the next-generation mobile network look like? Actually, not much different, says Curran. He reckons progress towards the next mobile standard will be less of a quantum leap and more a series of steps towards faster speeds. Why should this be?
Curran explains: “I think 5G is really just another term for 4G that’s a bit faster, with MIMO technology and smaller cells. Small cells are key to 5G.”

He theorises a future in which 5G becomes a community resource, much as BT has expanded public Wi-Fi in the UK by slicing unused capacity from consumer customers’ routers. In this scenario, 5G will work by sharing bandwidth over fixed routers and small cells. It will still be more powerful and much faster than 4G, but when you walk down the street in 2025, your phone may connect to something more like a wireless broadband network than the current network of mobile phone masts. Phones will have multiple chips and multiple antennae to handle this, suggests Curran.

At the moment, 5G is just a series of proprietary trials conducted by industry, saturating a small area with connectivity, he says, and a broad consensus is still at least five years off. “There will be consensus, but again, who is to stop someone coming along next year, an operator trying to get ahead of the curve because their sales are dropping, and just branding something as 5G?” he says. “There’s no copyright on the term.”

4. Autonomous cars are here but driverless is further off

Autonomous cars were the stars at MWC 2015, with a multitude of suppliers, from Ericsson to Qualcomm, getting in on the act, and auto maker Ford even having its own stand, which it used to launch two new electric bicycles and a journey-planning app. With the head of Renault-Nissan using a keynote address at the show to announce plans to launch, in 2016, an autonomous car capable of controlling its own movements in traffic jams, the drive towards driverless is now in top gear. “We will see it,” says Curran. “Google is the driver there, having driven millions of miles with no accidents.
The Google car is difficult, and they are cheating a little bit because they have pre-mapped the routes and have control of the test space. The problem is what happens when truly autonomous cars hit the roads and there are still rubbish humans. There’s a lot to think about, but driverless is still a little way down the road.”

Like the march towards 5G, the move towards driverless will take a lot of incremental steps. After dealing with low-speed, stop-start driving in heavy traffic will come motorway driving, and later still urban driving, where a greater variation in speeds and hazards will bring more emphasis on safety and tighter regulation. “Right now, all the car-makers are just hanging in there to see what everybody else is doing,” says Curran. “These are proprietary standards, so regulation will be difficult.” Car-to-car communication is the biggest application at the moment, says Curran, and the technology for cars to tell each other their speed and distance apart is close to commercialisation. “Driver assistance is where it will happen first,” he says. “It’s not a big step, but it will make a big difference.”

5. Drones will take off

Drones will be one of the biggest changes to hit the technology world since mobile phones, says Curran, who describes them as one of the most exciting technologies he has seen in the past 20 years. Up to now, drones have made the news either when someone flies one too close to an airport, or when someone uses one as a gimmick in a publicity stunt. The biggest such stunt in the past 12 months was the Amazon drone delivery system, which first saw the light of day in July 2014. “It really was a gimmick but, oddly enough, I do think this will happen,” says Curran.
“I can see drone deliveries in urban spaces.” People are doubtful of this for now, says Curran, but he reckons the rules of society will change, much as they did when the first automobiles arrived. “People have said that because drone deliveries are out of the ordinary, they will never happen,” says Curran. “But when the tech is rolled out and people see their friends use it, they will be happy to allow drones to land on the lawn, detach the parcel, send it back up, and keep the kids well back. I honestly think it will happen.”

He predicts that the other big application for drones will be in healthcare, where pilot studies have already been conducted with drone ambulances delivering supplies such as defibrillator equipment to the scene of an accident – although drones probably won’t be airlifting crash victims to hospital. “A drone can, in many cases, fly more quickly and cheaply than a helicopter, and with a camera on board, the paramedics on scene can talk to the hospital,” he says.

Regulation will be the biggest challenge for drone technology, says Curran. Aviation authorities will have to take a leading role but, fortunately, no-fly zones around buildings, airports and other sensitive sites are already enforceable through drone firmware updates. “Most people who are into drones adhere to safety and proper radio frequency use,” he adds. “The skies will become busier than we ever imagined in the very near future.”
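The firmware-enforced no-fly zones Curran mentions boil down to a geofencing check: before accepting a flight command, compare the drone's GPS fix against a list of restricted sites. A minimal sketch, in which the zone list, coordinates and radii are illustrative values rather than any vendor's actual database:

```python
# Hedged sketch of a firmware-level no-fly-zone (geofence) check.
# Zone coordinates and radii below are illustrative examples only.
import math

NO_FLY_ZONES = [
    # (name, latitude, longitude, radius in km)
    ("Heathrow", 51.4700, -0.4543, 5.0),
    ("London City Airport", 51.5053, 0.0553, 3.0),
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points on Earth, in kilometres."""
    r = 6371.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def flight_permitted(lat, lon):
    """Refuse to fly if the fix falls inside any restricted radius."""
    return all(haversine_km(lat, lon, zlat, zlon) > radius
               for _, zlat, zlon, radius in NO_FLY_ZONES)

assert not flight_permitted(51.4700, -0.4543)  # on the Heathrow site: blocked
assert flight_permitted(52.4862, -1.8904)      # central Birmingham: clear
```

A firmware update then amounts to shipping a refreshed zone list, which is why authorities can make such zones enforceable without touching the airframe.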
This was first published in March 2015

ENTERTECH SYSTEMS and Reliable Security Products Ltd. Bring Suprema Biometrics to...

Birmingham, UK and Dublin, Ireland - 12 March, 2015 -- ENTERTECH SYSTEMS, the official operating partner for Suprema Inc. in Ireland, the United Kingdom, United States, Canada and Puerto Rico, has announced that Reliable Security Products Ltd., a leadi...

What will office desktop computing look like in 2020?

Over the next five years, IT departments will increasingly support a highly heterogeneous computing environment. It is worth starting with a history lesson on desktop computing, because the situation today is analogous to IT a quarter of a century ago.

In the 1990s, from an IT audit, security and compliance perspective, desktop computing was out of control. The desktop freed users from the controls IT placed on data and application access, and people installed what they wanted, when they wanted it. But then experts worked out that the total cost of ownership of these unmanaged PCs was more than £5,000 per device per year. To save costs, IT locked down the desktop, rolled out standard desktop software images across the organisation and offered the business a common Windows desktop environment. Users could belong to groups that gave them authorisation to use certain applications and data. In some organisations, desktop IT became so commoditised that it could be outsourced. But then Apple came along.

Arguably, the consumerisation of IT has recreated 1990s-style user-led computing, with PCs now replaced by smartphones and tablets as the device of choice.

New ways to work

Desktop IT is no longer about Windows and supporting Windows applications. Tablets, smartphones, cloud computing and applications delivered as software as a service offer compelling new ways to work. The shift from Windows-only to a user computing environment where Windows is just one element will not happen overnight, but by 2016, Gartner expects tablet sales to overtake sales of desktop PCs. The analyst firm’s paper Mobility is having a major impact on IT support, by Terrence Cosgrove, published in February 2015, notes that by 2018, 40% of contact with the IT service desk will relate to smartphones and tablet devices – a leap from less than 20% today. This will put a heavy burden on desktop IT support unless it works in a different way.
Limiting device choice is not the answer. IT can no longer deny users choice by enforcing standardisation. The bring-your-own-device (BYOD) genie is out of the bottle, and the heads of IT that Computer Weekly has spoken to are sympathetic to the changes in user computing.

One example is Peterborough Council, which is turning IT into a department that commissions services rather than buying on-premise systems. The council is undergoing a transformation that is running alongside its IT strategy, and plans to deploy Chromebooks or tablet devices to 50% of its staff. Richard Godfrey, assistant director of Digital:Peterborough, says it needs a “more dynamic IT department” and a “move to tools that fit together”. Godfrey wants to avoid situations in which the council is stuck on certain versions of Microsoft Office or Windows. “This takes away from the day-to-day fire-fighting task, allowing us to work more closely with the council departments,” he says.

In fact, modern cloud-based applications are designed to work well together. “In the old days, we worked on integration, but a lot of products are now designed to work together, such as Box with Salesforce.com,” says Godfrey. He points out that some of the newer tools are also simpler to use, which means users can customise them. “We use Form Assembly,” he says. “Anyone in the council can build their own form, rather than wait a week for a quote from IT.”

Simplifying a multi-device strategy

Dale Vile, research director at Freeform Dynamics, recommends IT heads look at segmenting users into task workers and information workers. “It’s easy to run away with the idea that everyone is using multiple devices,” he says. “Meanwhile, PCs are used as the endpoint for the network, and provide access to hardcore routine, process-centric, back-end and administration systems.” These tend to be used by task workers.
From an IT management and infrastructure perspective, consistency is clearly beneficial, which, says Vile, is why IT departments virtualise desktops to stream desktop applications onto thin-client access devices. A thin-client computing environment will fulfil the requirements of a certain proportion of users, and it is a style of desktop computing that the IT department has fine-tuned since the late 1990s.

But Vile says information workers are “more client-server in the way they work”. In other words, they may work on a central IT system some of the time, but then require the flexibility to perform computing tasks locally. They typically use cloud-type applications, accessed from a web browser on a desktop computer, or through an app front end on a tablet or smartphone.

Medway Council has taken a desktop virtualisation approach, centralising its desktop computing on a Citrix server farm to support flexible working. Moira Bragg, head of ICT at the council, admits that user computing is more complex than it was a few years ago, when everyone ran Windows. “People can have more devices,” she says. “People want to do more with these devices, and we need to support them.” This is achieved through user segmentation, she says. “We identify every worker as a mobile or a desk worker. If they have a laptop, they don’t get a desktop.”

Security and risk mitigation

Two years ago, people would simply go to work and access their work applications. Today, the challenge is not the technology, but making sure data is secure, says Bragg. Mobile access and device management is therefore set to become part of every IT department’s desktop toolkit in the next few years. Gartner estimates that, by 2018, 40% of organisations will use enterprise mobility management tools to manage some Windows PCs – up from less than 1% today. But that is only half the story.
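The segmentation policies Vile and Bragg describe can be encoded as a simple provisioning rule: task workers get a virtualised thin-client desktop, while mobile workers get a laptop instead of a desk PC. The worker categories and device names below are illustrative, not Medway's or any vendor's actual policy:

```python
# Hedged sketch of a user-segmentation provisioning policy.
# "task" workers run process-centric back-end systems, so a virtual desktop
# suits them; information workers get local flexibility, and a laptop
# replaces the desktop ("If they have a laptop, they don't get a desktop").

def provision(worker_type: str, mobile: bool) -> list[str]:
    """Return the device entitlement for a worker segment (illustrative)."""
    if worker_type == "task":
        return ["thin client", "virtual desktop"]
    # information workers: exactly one primary machine, plus mobile management
    primary = "laptop" if mobile else "desktop PC"
    return [primary, "smartphone MDM profile"]

print(provision("task", mobile=False))
print(provision("information", mobile=True))
```

The useful property of writing the policy down like this is that the laptop-or-desktop rule is enforced by construction rather than by audit after the fact.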
IT will not only be expected to manage multiple devices; it must also support cloud-based applications, some of which may not yet exist. Speaking at a recent Computer Weekly CW500 Club event, Richard Gough, group IT operations manager at Punter Southall Group, said it made sense to move certain applications to the cloud, while others require plenty of due diligence. “It is a no-brainer to use Mimecast for email,” he said. “It is an obvious one. And we use Salesforce.com.”

Email, messaging and even complex applications such as customer relationship management (CRM) are good candidates to migrate off-premise. But, says Freeform Dynamics’ Vile, moving a vertical application is much more risky: “You spend a lot of time defining your policy, workflow, rule set. A lot of this is not pure data; it defines how your business process works. Getting data out of the cloud using ETL [extract, transform, load] tools is pretty easy. But trying to transform cloud-enabled workflow is a lot harder.” So there needs to be an element of risk mitigation if the application being migrated to the cloud is business-critical.

“Theoretically, there is nothing we will not put in the cloud,” says Punter Southall’s Gough. But where he struggles is when he encounters new, innovative tools from companies that provide specialist software, such as the tools that actuaries use for risk analysis. “These companies won’t sell the software to you any more,” he says. “They won’t let you bring it on-premise. They only want to deliver it as a service.” Vile adds: “You have to do the due diligence. What happens when you want to switch?” Gough’s biggest concern is that, if the provider’s business folds, the customer has no way to replicate the value-added processes and services on-premise or with another cloud provider. “There is no concept of a cloud escrow,” he says.
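Vile's distinction is worth making concrete: the "easy" half of leaving a cloud service is a flat extract-transform-load pass over exported records, something a few lines of standard-library code can do. What ETL cannot carry is the workflow, policy and rule set configured inside the service – his harder half. A minimal sketch, with invented field names and figures:

```python
# Minimal ETL sketch: extract exported JSON records, transform field names
# and units, load into CSV for a target system. All data is invented.
import csv
import io
import json

# Extract: a cloud app's JSON export (hypothetical schema).
exported = json.loads("""[
    {"account": "Acme Ltd", "mrr_pence": 125000, "region": "uk"},
    {"account": "Globex",   "mrr_pence": 98000,  "region": "ie"}
]""")

# Transform: rename fields, convert pence to pounds, uppercase region codes.
rows = [
    {"Account": r["account"],
     "MRR (GBP)": r["mrr_pence"] / 100,
     "Region": r["region"].upper()}
    for r in exported
]

# Load: serialise to CSV for import elsewhere.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["Account", "MRR (GBP)", "Region"])
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```

Note what is absent: none of the approval chains, validation rules or automation built inside the source application comes out in the export, which is exactly why migrating a business-critical vertical application needs the risk mitigation Vile describes.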
This was first published in March 2015

Hillary Clinton’s Private Email Use and the State of SSL/TLS Security

Though Hillary Clinton's server wasn't using the most advanced cryptographic protections for her email, there's no indication of certificate misuse, Venafi finds.

As the news surrounding former Secretary of State Hillary Clinton and her use of a private email server continues to swirl, security specialist Venafi has offered its take on the situation. Venafi's analysis is accompanied by the 2015 Cost of Failed Trust Report, which provides broader insight into the state of Secure Sockets Layer/Transport Layer Security (SSL/TLS) certificate use on the modern Internet.

To properly secure data transfer on the Internet, cryptography is used, typically in the form of SSL/TLS certificates. Venafi has a new service, called TrustNet, which was used to conduct the analysis on the "clintonemail.com" domain used by the former Secretary of State. TrustNet looks at how digital certificates are used, in an effort to help track them and prevent potential misuse. Venafi TrustNet acquires certificates and metadata from Internet scanning as well as public-domain historical archives, according to Kevin Bocek, vice president of security strategy and threat intelligence at Venafi.

The Venafi scanning effort discovered that "clintonemail.com" did, in fact, have SSL/TLS certificates in place. The analysis found three different certificates issued since 2009: one issued by Network Solutions for "mail.clintonemail.com" in March 2009 that expired in September 2013; another for "mail.clintonemail.com" issued by GoDaddy in September 2013 that is valid until September 2018; and one issued in February 2012 by Network Solutions for the "sslvpn.clintonemail.com" domain that expired in February 2013. "The 2009-issued 'mail.clintonemail.com' and 2012-issued 'sslvpn.clintonemail.com' certificates were found in the historical archives," Bocek told eWEEK.
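The certificate timeline above can be checked mechanically: every X.509 certificate carries notBefore/notAfter validity dates, and Python's `ssl` module parses the textual form directly. This sketch mirrors the September 2013–September 2018 window of the GoDaddy-issued mail.clintonemail.com certificate; the exact day and time strings are assumed for illustration:

```python
# Check whether a certificate was in force on a given date, using the
# notBefore/notAfter string format Python's ssl module understands.
# The validity window mirrors the 2013 GoDaddy certificate described above;
# exact timestamps are illustrative.
import ssl
from datetime import datetime, timezone

not_before = ssl.cert_time_to_seconds("Sep  1 00:00:00 2013 GMT")
not_after = ssl.cert_time_to_seconds("Sep  1 00:00:00 2018 GMT")

def valid_on(epoch_seconds: float) -> bool:
    """True if the certificate's validity window covers the instant."""
    return not_before <= epoch_seconds <= not_after

article_date = datetime(2015, 3, 15, tzinfo=timezone.utc).timestamp()
later_date = datetime(2019, 3, 15, tzinfo=timezone.utc).timestamp()

assert valid_on(article_date)    # in force when the article was written
assert not valid_on(later_date)  # expired by March 2019
```

Services like TrustNet effectively run this comparison at scale, across scanned and archived certificates, to flag expired or overlapping issuance.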
"The TrustNet scanning engine acquired the current 2013-issued 'mail.clintonemail.com' certificates." Venafi's analysis shows the certificates to all be domain-validated, as opposed to the more rigorously audited Extended Validation (EV-SSL) certificates that can also be used to secure servers.

Looking at the underlying technology for the server, Bocek said that clintonemail.com is running Microsoft's Internet Information Services (IIS) 7 Web server for Web services. The server is not leveraging Perfect Forward Secrecy (PFS), an SSL/TLS server deployment option that provides new encryption keys for every connection session. After revelations of U.S. government snooping, multiple large Web properties, including Twitter, began to deploy Perfect Forward Secrecy in 2013 in an effort to harden security.

Though Clinton's server wasn't using the most advanced forms of cryptographic protection for her email, there is at this time no indication of certificate misuse, Bocek said. "During the time Secretary Clinton was using 'clintonemail.com' with certificates and encryption, she traveled to China, Egypt, Israel and South Korea," Bocek said. "The risks of eavesdropping and/or credential theft were and are real for both businesses and government travelers."

2015 Cost of Failed Trust Report

The research for the 2015 Cost of Failed Trust Report, sponsored by Venafi and conducted by the Ponemon Institute, sheds additional light on how organizations manage and use SSL/TLS certificates. The study surveyed 2,300 IT security professionals in the United States, the United Kingdom, Australia, France and Germany. It found that 54 percent of organizations admitted they did not know how all their digital certificates were being used, or even where they were all located.
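The Perfect Forward Secrecy option discussed above comes down to restricting a server to ephemeral key exchanges, so a captured private key cannot decrypt recorded past sessions. A minimal sketch in Python; the cipher string is one common way to express this and is an illustrative choice, not the configuration of any server mentioned in this article:

```python
import ssl

# Build a server-side TLS context and restrict TLS 1.2 key exchange to
# ephemeral ECDHE, so each session derives fresh keys. (TLS 1.3 cipher
# suites, where present, are forward-secret by design.)
ctx = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
ctx.set_ciphers("ECDHE+AESGCM")

# Inspect what the context will actually offer:
enabled = [c["name"] for c in ctx.get_ciphers()]
print(enabled)
```

A real deployment would also load a certificate and key via `ctx.load_cert_chain(...)`; the sketch stops at the cipher policy, which is the part PFS concerns.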
Looking at the types of attacks that occur against cryptographic security, respondents indicated that man-in-the-middle (MITM) attacks were the most common, followed by attacks against weak cryptography. The use of weak cryptography is at the heart of the FREAK SSL/TLS flaw that was disclosed on March 3 and has since been patched by Apple and Microsoft.

Sean Michael Kerner is a senior editor at eWEEK and InternetNews.com. Follow him on Twitter @TechJournalist.

Companies Failing to Remain Compliant With PCI Standards, Verizon Finds

While companies are complying with more of the Payment Card Industry’s security standards, less than a third remain compliant after they complete an assessment.

Companies that handle credit- and debit-card data are complying with more of the requirements of the payment industry’s security standards, but are less likely to maintain their security posture over time, according to data published on March 11 by business-technology services firm Verizon Enterprise Solutions.

In its 2015 PCI Compliance Report, Verizon found that companies typically met nearly 94 percent of the requirements of the Payment Card Industry’s Data Security Standard during an initial assessment conducted in 2014, up from 85 percent in 2013. Yet four out of five companies did not comply with all the necessary requirements to pass the initial PCI assessment. Moreover, after passing a previous assessment, more than 81 percent of companies failed their next compliance test.

“It is a mixed bag,” Franklin Tallah, an executive consultant at Verizon Enterprise Solutions, told eWEEK. “There are those companies that are ahead of the game, but for those in the lagging category, it should be a wake-up call.”

While the PCI Data Security Standard has often been criticized for emphasizing check boxes over security, many security professionals have recommended the standard as a starting point for any company aiming to harden its network against cyber-attacks. Last year, driven by companies’ lack of security awareness, their slow detection of attacks and threats, and the lack of consistency in assessments, the PCI Security Standards Council rolled out Version 3 of the standard.

In its report, Verizon found that, in 2014, more companies had complied with almost all 12 main PCI requirements, with the exception of the requirement to regularly conduct security scans and mitigate security issues.
Only 33 percent of companies complied with those requirements in their initial assessment, down from 40 percent in 2013.

Verizon, which performs both PCI assessments, as a Qualified Security Assessor (QSA), and post-breach investigations, has never encountered a compromised company that was compliant with PCI DSS at the time of the breach, the company stated in the report. “The companies that we visited post-breach as a [PCI forensics investigator] were significantly less … compliant than our control group of QSA customers,” Verizon stated. “Not only were breached companies less likely to be found compliant overall, they were also less likely to be compliant with 10 out of the 12 requirements individually.” In particular, companies suffering a compromise never met the requirements for maintaining secure development practices, identifying vulnerabilities, tracking access to networks and analyzing log files.

The benefits of the PCI standard depend on the approach that a company takes to implementing the requirements, Verizon’s Tallah said. Companies that put a great deal of emphasis on finding the systems and software that handle payment card data, and on securing those systems, will be far more likely to remain compliant with the standards. However, it is not an easy task, he said. “We find that many organizations underappreciate the level of effort,” Tallah said. “We are recommending that people educate themselves and understand that it is a complex project.”
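The apparent tension in the figures above (meeting ~94 percent of requirements yet failing the assessment) follows from PCI DSS being pass/fail across all 12 requirements. A toy sketch with entirely hypothetical data; the requirement labels and the one failing item are invented for illustration:

```python
# Hypothetical per-requirement results for one assessed company.
# True means the requirement was fully met.
requirements_met = {f"Requirement {i}": True for i in range(1, 13)}
requirements_met["Requirement 11"] = False  # e.g. regular security testing not done

share_met = sum(requirements_met.values()) / len(requirements_met)
passed = all(requirements_met.values())  # one miss fails the whole assessment

print(f"{share_met:.0%} of requirements met; assessment passed: {passed}")
# prints: 92% of requirements met; assessment passed: False
```

This is why a high average requirement score across companies can coexist with four out of five companies failing their initial assessment.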

Ellen Pao was “resentful” and dismissive, Kleiner Perkins lawyers say

Pao's discrimination claim gets microscope treatment from former employer.

Congressperson asks DoJ to “intensify enforcement” of online harassment laws

Says only 10 cases out of estimated 2.5 million were prosecuted from 2010-13.