Tag: NVIDIA

Asus ROG GX800VH review: A ludicrous liquid-cooled $6,000-plus laptop

Overclocked i7, two GTX 1080s, 64GB RAM, Raid 0 NVMe, and a suitcase to carry it all.

AMD Vega FE reviews disappoint fans with humdrum gaming performance

Vega FE sits between GTX 1070 and 1080 in games, but beats Titan Xp in some pro apps.

Nvidia’s new TensorRT speeds machine learning predictions

Nvidia has released a new version of TensorRT, a runtime system for serving inferences from deep learning models on Nvidia's own GPUs. Inferences, or predictions made from a trained model, can be served from either CPUs or GPUs.
Serving inferences from GPUs is part of Nvidia's strategy to drive greater adoption of its processors, countering AMD's efforts to break Nvidia's stranglehold on the machine learning GPU market. Nvidia claims the GPU-based TensorRT is better across the board for inferencing than CPU-only approaches. In one of Nvidia's proffered benchmarks, the AlexNet image classification test under the Caffe framework, TensorRT was 42 times faster than a CPU-only version of the same test (16,041 images per second vs. 374) when run on Nvidia's Tesla P40 processor. (Always take industry benchmarks with a grain of salt.)
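The speedup figure above can be sanity-checked against the quoted throughput numbers. A minimal Python sketch, using only the two figures cited in the article (the "42 times" claim is Nvidia's benchmark, not an independent measurement):

```python
# Throughput figures quoted from Nvidia's AlexNet/Caffe benchmark on a Tesla P40.
gpu_images_per_sec = 16_041  # TensorRT on GPU
cpu_images_per_sec = 374     # CPU-only baseline

# The ratio of the two throughputs is the implied speedup.
speedup = gpu_images_per_sec / cpu_images_per_sec
print(f"Implied speedup: {speedup:.1f}x")  # roughly 42.9x, consistent with the "42 times" claim
```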

Dealmaster: Get a Dell XPS tower or an Inspiron desktop with...

Plus savings on laptops, SSDs, robot vacuums, smart TVs, and more.

Destiny 2 on PC: 4K, GTX 1080 Ti, all the eye-candy

Sure, the Xbox One X and PS4 Pro support 4K too. But who wants to play at 30FPS?

Logitech finally finds a good use for wireless charging: A mouse...

With a Powerplay mouse pad, never again will your wireless mouse run out of power.

HP grows Omen line with new gaming laptops, VR backpack PC,...

Are you ready for a compact desktop that doubles as a wearable VR machine?

Nvidia Max-Q laptops: Impressively thin, but industrial design needs work

Asus Zephyrus has an innovative cooling system, but the keyboard is hilariously bad.

Got an antenna and a tuner? You can now stream live...

Free broadcasts now live where you watch the rest of your streamed content.

Nvidia Max-Q wants to make gaming laptops thinner, lighter, less fugly

Max-Q is kind of like Intel's Ultrabooks, but for gaming. No word on price, battery life.

Apple iCloud, Android Nvidia driver N-day exploit details revealed

Kernels can be exploited and iCloud account user information leaked due to the security flaws.

AMD’s game plan to become a machine-learning giant

Right now, the market for GPUs for use in machine learning is essentially a market of one: Nvidia. AMD, the only other major discrete GPU vendor of consequence, holds around 30 percent of the market for total GPU sales, compared to Nvidia's 70 percent.

For machine-learning work, though, Nvidia's lead is near-total, not just because all the major clouds with GPU support are overwhelmingly Nvidia-powered, but because the GPU middleware used in machine learning is by and large Nvidia's own CUDA. AMD has long had plans to fight back.
It's been prepping hardware that can compete with Nvidia on performance and price, but it's also ginning up a platform for vendor-neutral GPU programming resources: a way for developers to freely choose AMD when putting together a GPU-powered solution without worrying about software support.