DebtLadenEbolaZombie • 9 years ago

Wired commenting on attempts to remain relevant.

Oh the Hilarity.

NegroSven • 9 years ago

This is a good move by Intel. To proliferate IoT, low-powered ICs will need to become the norm, since solar, tidal undulation, and wind will power IoT in stand-alone and utility-interactive applications.

YeahRight • 9 years ago

I love how you use a meaningless word to make yourself feel more powerful.

anjanson • 9 years ago

He's just uttering the utterly utilitarian unity under united uterus

YeahRight • 9 years ago

:-)

Sarkazein70 • 9 years ago

How is the world's largest chip maker "staying relevant" by snapping up a small competitor? Why isn't Apple struggling to "stay relevant" by acquiring FoundationDB? How is Microsoft "fighting for relevance" when Windows is on over 90% of the world's computers?

I know there's a fruity bias around here. But this is more hyperbolic BS than usual. And the vast majority of personal computers run on Intel-based architecture, be it an Intel chip or an AMD chip.

BtotheT • 9 years ago

Interview from a year ago pretty much outlining this move.
https://www.youtube.com/wat...

What an FPGA is (below). I think the writer may need this as well; their flexibility/versatility is what makes them unique, though yes, they can run "very specific tasks".
https://www.youtube.com/wat...

Correction to article.
"They’s why Microsoft used them to"
*That's why*

YeahRight • 9 years ago

Wow... another completely clueless writer imagining how things may work when they just don't work that way. FPGAs are anything but efficient, they are far from cheap (top-of-the-line chips cost more than a complete file server with the highest-end Intel CPU), and they are very power hungry compared to custom chips. FPGA programming is a nightmare, and with the exception of a very few kinds of problems it won't even result in particularly high-performance designs.

The Senior List • 9 years ago

I assume they have a sound strategy for the future of this architecture in combination with their own. They aren't one of the largest companies in the world because they're hacks, for goodness' sake! :)

disqus_u9WfvbZaF1 • 9 years ago

4th paragraph: "They’s why Microsoft used them to help power Bing."

Mary Vizzaccaro • 9 years ago

Feeding the bottom feeders in the world of shrinking profit margins. Everyone in this space knows that quarter over quarter and year over year, bookings and billings are dropping or stagnating. This makes it really hard for commodity chip producers like Intel, Texas Instruments, Micron, SanDisk, etc., etc., etc.... There is now a chip glut and an oversupply of commodity chips. The same situation is developing in Apple-related silicon due to lack of interest in the overhyped iWatch.

Darkness • 9 years ago

There are a number of issues with the article. First, the idea that Intel buying Altera has anything at all to do with relevance is absurd. FPGAs are a very small market; Intel's existing working agreement with Altera is a solid engineering attempt to enable a more diverse server market. That it would put Altera and Intel at the forefront is so much the better, if it comes to pass. Two years down the road, what has been the result thus far? Not much out there.

FPGAs are fast-turnaround parts. They are efficient only in the sense of cranking out iterative designs quickly, with little additional hardware design. They are not more efficient than the current generation of CPUs in terms of wattage or board space. One can design a very hardware-intensive solution for a very particular problem that might well be more efficient than a full CPU doing the same task in software, given the limited hardware support they provide, although there isn't an FPGA out there that can match any modern CPU on the FPU side. FPGAs are not fine-tuned; they are based on generic building blocks that allow some problems to be addressed. Time to market is the most common reason to use FPGAs, that and prototyping an idea that might or might not ever be made in large numbers.

If the solution can be done in hardware, then prototyping it in an FPGA is good practice. If the solution needs to be cheaper and use less power as well as less physical space, then it is appropriate to move to an ASIC. Welcome to the engineering track that started the entire GPU business as we now know it. GPUs used to be largely bit slicers, so let's not go back that far; although they were power hogs, they were great devices, but nothing compared to the modern solution.

Bing using FPGAs for their search solution is an aberration, not an endorsement of a 'new' way of doing business. It actually fits my description of a prototyping solution for a problem that will never see widespread use. If it did, we would be reading about Bing using ASICs. An ASIC is useful when the solution is well known or established; an FPGA is useful when that will never be the case, as it can be reprogrammed on the fly.

Relevance should never have been mentioned. Finish this thought on your own.

Rick James • 9 years ago

So you seem to know some of the main differences between CPUs, FPGAs, and ASICs, but you lack an understanding of the economics and applications.

FPGAs are used in quite a number of ways beyond prototyping. Many FPGAs nowadays have hardened functions (like FPUs) that allow them to be quite competitive on the performance-per-watt metrics valued by data center customers. Not many customers are willing to spin ASICs given the NRE/SW dev fees. FPGAs are a strong alternative at limited volumes, as in data center acceleration.
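The NRE argument above is just break-even arithmetic: an ASIC carries a large one-time fee but a low unit cost, while an FPGA has no NRE but a higher unit cost. A minimal sketch, with entirely hypothetical dollar figures chosen for illustration:

```python
import math

def asic_breakeven_volume(asic_nre, asic_unit_cost, fpga_unit_cost):
    """Smallest unit volume at which total ASIC cost (NRE + units)
    drops below total FPGA cost (units only)."""
    if fpga_unit_cost <= asic_unit_cost:
        raise ValueError("ASIC never breaks even if its unit cost is not lower")
    # Solve: asic_nre + v * asic_unit_cost < v * fpga_unit_cost
    saving_per_unit = fpga_unit_cost - asic_unit_cost
    return math.ceil(asic_nre / saving_per_unit)

# Hypothetical example: $2M NRE, $5/unit ASIC vs. $200/unit FPGA.
volume = asic_breakeven_volume(2_000_000, 5, 200)
print(volume)  # units needed before the ASIC becomes the cheaper choice
```

Below that volume the FPGA wins on cost despite the pricier part, which is why limited-volume uses like data center acceleration favor FPGAs.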

I agree that the Bing example isn't something that is going to fundamentally change data center layouts, but it's a solid example of MSFT utilizing the reconfigurability and acceleration of an FPGA to speed up search optimization. Since search functions are constantly being improved, this is a good solution.

Darkness • 9 years ago

I understand the economics very well, along with the migration path of applications and the power of being able to reconfigure the hardware on the fly. To that point, there were significant attempts in computing history that proposed using FPGAs (or FPGA-like features) to reconfigure the system several times per minute (even several times per second) to adjust to the data stream and its unique processing requirements.

I also believe that Altera and Intel make a good combination. Intel has fabs, and they have vision others lack in many areas; e.g., look at their presentation of Thunderbolt. Connecting processing racks with memory racks was very exciting. Add a rack of Altera FPGAs to a data center and the possibilities are massive, far beyond what the high-speed traders do, as well as the small number Bing is using.

However, none of this is required by the mega-manufacturing giant that is Intel Corp. to remain relevant. While I see many possibilities from their current efforts and any possible merger, it will do nothing in relation to 'relevance'. They are the most relevant semiconductor company in the business; others trail by multiple fab processes. Adding Altera to their portfolio, or embracing them, would have Altera jump ahead by several fab generations. I hesitate to even guess at the implications.

At least it will prevent the Feds from thinking that FPGAs are PC processors and getting all in a dither to throw monopoly attacks at Intel. Well, the Feds thinking would be a first.

Bigdog • 9 years ago

What a POS article. Intel staying relevant? Really? They're more than relevant; they are dominant.

johnwerneken • 9 years ago

Amen. Some of us like power, efficiency, economy, and reliability in computing far more than mobility or cloudiness.