
deathtoscrub • 5 years ago

Buying without testing... that's going to go well.

lehpron • 5 years ago

Why do you assume that? Just because something isn't being mass produced and available any time soon doesn't mean there are no working samples to convince the customer of intentions.

Intel isn't obligated to give the public those samples, especially since that would give competition an idea to compete against before mass production is underway. Keep in mind, consumer graphics wasn't the initial purpose, so that variant may be different than the variant offered to the DoE.

FreckledTrout • 5 years ago

It will. The government is cheap af; I promise you Intel gave them one heck of a deal.

Jysoul • 5 years ago

DoE now serious about data mining in the cryptocurrency market!

lazychocobo • 5 years ago

Aren't they supposed to release a consumer GPU next year? We should be seeing something about them by now I'd have thought.

Zero11s . • 5 years ago

14nm or 10nm?

14

Patrick Proctor • 5 years ago

10nm actually. That was confirmed at Intel's last conference.

Intel is so doomed 😉🤣🤣🤣

New security flaws again, and this one goes all the way back to gen 1.
Intel's consumer dGPU will take a few more years... and Intel is losing a huge chunk of market share.

What is your reaction, let me guess

"Intel and NVIDIA are the best"

Patrick Proctor • 5 years ago

Easily patched and already guaranteed to be gone with Sunny Cove, so no big deal.

Ehhhmm, yeah right, "easy" is how Intel milked the consumer for years. Now that AMD is finally starting to give "some" competition, Intel is starting to lose its nerve.

There is no easy fix for all those flaws.

Anthony Mare • 5 years ago

There must have been some awfully fat brown paper bags handed around to win a deal with as-yet un-taped-out, unproven and unreleased hardware.

Michael • 5 years ago

What process is Intel's GPU on? It seems like a marketing exercise, so Intel can say they're in a powerful supercomputer and so they'd be great for a home PC. It would be a specially designed version of the chips, as there is much on a GPU that isn't relevant to these workloads.

Wirxaw • 5 years ago

Smells like ASIC. Or just ungodly stacking. I remember that AMD presentation during the Zen1/Vega era when it announced some kind of "garage petaflop" server, a totally mobile stack that was wheeled out on stage. They have almost doubled their performance since then, and even so, stacking 500-1000 of these looks like overkill. But... maybe it's not. Since supercomputers don't seem to have any technical limitation on size, maybe this thing will just be the fattest one.
I mean, current top supercomputers running P100s and V100s barely do 100-200 Pflops, and as we know, Turing isn't much of an improvement, more a branching of Volta into RT, and Intel is still stuck on 14nm. For something to be 5 times faster... either it's a very fat stack of their newest 48-core chips, or some Xeon Phis, coupled with an even fatter stack of MI60-grade or V100-grade GPUs. Like, it's something about twice as fast as what most SCs use, and there are about 3x of that. Or... it's either some Optane memory-handling magic or ASICs. 5x-faster architectures don't happen overnight, let alone get introduced into supercomputers.

I call bull until we get more data.
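[Editor's note: the scaling skepticism above can be sanity-checked with simple arithmetic. This sketch uses public spec-sheet peak FP64 figures for the V100 and MI60 and a hypothetical ~1 exaflop system target; the function name and all numbers are illustrative assumptions, not figures from the thread or from Intel.]

```python
import math

# Public spec-sheet peak FP64 throughput, in teraflops (assumed values).
V100_FP64_TFLOPS = 7.8   # NVIDIA Tesla V100
MI60_FP64_TFLOPS = 7.4   # AMD Radeon Instinct MI60

# Hypothetical system target: ~1 exaflop = 1000 petaflops.
TARGET_PFLOPS = 1000

def gpus_needed(target_pflops, per_gpu_tflops):
    """GPUs required to hit the target at theoretical peak.

    Ignores interconnect, memory, and efficiency losses, so this is a
    hard lower bound; real machines need meaningfully more hardware.
    """
    return math.ceil(target_pflops * 1000 / per_gpu_tflops)

print(gpus_needed(TARGET_PFLOPS, V100_FP64_TFLOPS))  # ~128k V100s at peak
print(gpus_needed(TARGET_PFLOPS, MI60_FP64_TFLOPS))  # ~135k MI60s at peak
```

On these assumptions, reaching an exaflop with then-current ~7-8 TFLOPS FP64 accelerators would take well over 100,000 GPUs at theoretical peak, which is why the comment argues a ~5x per-part jump is needed rather than stacking alone.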

de ja ful #Rent Free • 5 years ago

So can it run metro?

No Valid Proof • 5 years ago

Remember, kids: when someone says "I am an AMD fanboy too, but...", that means he is NOT.

Sharpedon • 5 years ago

Well, the same can be claimed about any statement followed by a "but". Is that valid? It's questionable.

Certified_Giraffe • 5 years ago

"i don't hate white dudes BUT"

Youknowitsjimmy • 5 years ago

Poor Mii....