Security News

IDG Contributor Network: Are IT certifications worth it?

“You absolutely need IT certifications in order to advance your career and boost your earning potential,” say some people, certificates clutched firmly in their grasp. “You absolutely do not need IT certifications in order to advance your career or boost your earning potential,” say some other people, hands astonishingly bereft of paper. This is quite the pickle, a true conundrum.

Two camps both claim to have the right answer, so how do you decide who is right? When it comes to the value of certifications in the technology industry, feelings are decidedly mixed. Some see them as validation of their skills and evidence of their ability, while some senior workers hold the opposite view and believe that an IT worker’s true value and ability are demonstrated through experience and the projects they have worked on.

Nvidia’s new TensorRT speeds machine learning predictions

Nvidia has released a new version of TensorRT, a runtime system for serving inferences using deep learning models through Nvidia’s own GPUs. Inferences, or predictions made from a trained model, can be served from either CPUs or GPUs.
Serving inferences from GPUs is part of Nvidia’s strategy to drive greater adoption of its processors, countering what AMD is doing to break Nvidia’s stranglehold on the machine learning GPU market. Nvidia claims the GPU-based TensorRT beats CPU-only approaches across the board for inferencing. One of Nvidia’s proffered benchmarks, the AlexNet image classification test under the Caffe framework, claims TensorRT to be 42 times faster than a CPU-only version of the same test (16,041 images per second vs. 374) when run on Nvidia’s Tesla P40 processor. (Always take industry benchmarks with a grain of salt.)
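As a quick sanity check on those quoted figures, the throughput numbers imply roughly the advertised speedup. A minimal sketch of that arithmetic, using only the numbers cited above:

```python
# Back-of-envelope check of Nvidia's quoted AlexNet/Caffe benchmark figures
# (Tesla P40 running TensorRT vs. a CPU-only baseline).
gpu_images_per_sec = 16_041  # TensorRT on the Tesla P40, per Nvidia
cpu_images_per_sec = 374     # CPU-only version of the same test, per Nvidia

speedup = gpu_images_per_sec / cpu_images_per_sec
print(f"Implied speedup: ~{speedup:.0f}x")  # ~43x, in line with the "42 times" claim
```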

See you in 2023 – Bitcoin exchange Coin.mx bigwig gets 66...

Murgio gets off easy in money laundering case

A kingpin of the ill-fated Coin.mx Bitcoin exchange was today handed a 66-month prison sentence for conspiracy, fraud, and money laundering.…

Cisco announces Kinetic IoT operations platform

Cisco has unveiled its Internet of Things operations platform labelled Kinetic, which it said will enable enterprises to extract trillions of terabytes of data from devices.

Six quick facts to know about the Petya global ransomware attack

This is what you need to know -- right now.

Ohio Gov. Kasich’s website, dozens of others defaced using year-old exploit

"High risk" exploit patch was issued in May of 2016.

Walmart sued after teen steals machete and kills her Uber driver

Lawsuit: Walmart let teen steal weapons, pass security before killing driver.

BrandPost: What’s All the Excitement Over Software Defined Visualization and in...

Software Defined Visualization (SDVis) has become a mainstream idea: visualization on processors (CPUs) has enormous advantages in flexibility, cost, and performance for large visualizations. While that sounds like a great marketing pitch for Intel, the facts actually check out for a number of reasons, including trends in memory and algorithms. These advantages are overwhelming for truly big data (referred to as “exascale” data). In the paper “An Image-based Approach to Extreme Scale In Situ Visualization and Analysis,” the data movement challenge is quantified as 1,000,000,000X more data for extreme-scale simulation data (where CPUs win) than for the image processing workloads where GPUs have dominated.
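To make the data-movement argument concrete, here is a hypothetical back-of-envelope comparison (the grid size, field count, and image resolution below are illustrative assumptions, not figures from the paper): rendering in situ and shipping images instead of raw field data cuts the volume moved by orders of magnitude.

```python
# Illustrative comparison: moving raw simulation output vs. moving rendered images.
# All sizes are hypothetical assumptions for illustration, not figures from the paper.
BYTES_PER_DOUBLE = 8

# Assumed simulation: an 8192^3 grid with 5 double-precision fields per cell.
grid_cells = 8192 ** 3
fields_per_cell = 5
raw_bytes = grid_cells * fields_per_cell * BYTES_PER_DOUBLE

# Assumed in situ output: one 4K RGB image rendered on the compute nodes.
image_bytes = 3840 * 2160 * 3

print(f"Raw field data per timestep: {raw_bytes / 1e12:.1f} TB")       # ~22.0 TB
print(f"Rendered image per timestep: {image_bytes / 1e6:.1f} MB")      # ~24.9 MB
print(f"Data-movement reduction:     ~{raw_bytes / image_bytes:.1e}x")  # ~8.8e+05x
```

Even these modest assumed sizes yield a roughly million-fold gap; the paper's extreme-scale scenario is what stretches the ratio to the 1,000,000,000X figure quoted above.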

Matthew Keys’ guilty verdict and sentence to stand, 9th Circuit rules

"Keys made the CMS far weaker by taking and creating new user accounts."

Meet the RapidE, Aston Martin’s first EV due in 2019

Just the thing for the eco-conscious 00 agent?

BrandPost: Stop Juggling and Increase Productivity With Better IT Infrastructure Management

By Bharath Vasudevan, HPE Product Manager, Software-Defined and Cloud Group

Those in IT are usually quite adept at juggling, keeping lots of balls in the air to ensure the organization’s entire IT infrastructure operates as efficiently as possible. From providing desktop support to provisioning and updating compute, storage, and fabric resources, the job of the IT professional is never done. And it seems the demands on IT become greater every day. That’s because the pace of innovation continues to accelerate, and IT services are becoming ever more complex. Businesses are now managing and consuming IT services across a hybrid infrastructure, and they’re trying to do so with infrastructure that usually was not designed for these demands. In addition, complex manual processes and non-integrated tools fail to provide the speed and simplicity needed to support current tasks, much less new ideas and applications.