[Blabber] EC2 instances with FPGAs

Peter S. Shenkin shenkin at gmail.com
Thu Dec 1 03:04:48 UTC 2016


Another thought: one possibility in the number-crunching realm is tight, optimized loops over small arrays, which don't benefit much from GPUs. Quantum applications are an example. But the Azure article would lead me to think that maybe they're not primarily for number-crunching at all.
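To make that concrete, here is a rough, illustrative benchmark sketch (mine, not from either article), assuming Python with PyTorch and a CUDA device; the exact numbers will vary with hardware, but the point is that fixed kernel-launch and synchronization overhead swamps tiny problems:

import time
import torch

def avg_matmul_time(n, device, reps=50):
    # Average time for an n x n matrix multiply; synchronize so GPU timings are honest.
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    if device == "cuda":
        torch.cuda.synchronize()
    start = time.perf_counter()
    for _ in range(reps):
        a @ b
    if device == "cuda":
        torch.cuda.synchronize()
    return (time.perf_counter() - start) / reps

for n in (32, 256, 2048):  # tiny, medium, large working sets
    cpu_t = avg_matmul_time(n, "cpu")
    gpu_t = avg_matmul_time(n, "cuda") if torch.cuda.is_available() else float("nan")
    print("n=%5d  cpu=%8.1f us  gpu=%8.1f us" % (n, cpu_t * 1e6, gpu_t * 1e6))

The crossover point depends on the hardware, but the small end of that range is where I'd expect the GPU to lose.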

I don't think interfacing to special-purpose HW is likely to be the point on a general-purpose cloud instance. I mean, such an instance could use FPGAs at the infrastructure level to optimize its HW interfaces, but what the article discusses is cloud instances that let the user program the FPGA. Encryption and related tasks are a real possibility. "User-defined algorithm support" does not sufficiently differentiate it from a GPU; which algorithms are better suited to FPGAs would then be the question. GPUs excel at heavy number-crunching on large arrays of data, but they've got to be large arrays. That's what led to my thought in the previous paragraph, about a realm where GPUs have not made great inroads.
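As a toy illustration of the kind of workload I have in mind (a sketch only, using CRC32 as a stand-in for encryption-style per-byte processing, in plain Python just to show the shape of the computation): the state is tiny and fixed, every input word goes through the same short chain of operations, and there are no large arrays for a GPU to chew on. On an FPGA, that loop body could become a pipeline stage handling one word per clock.

import zlib

def stream_crc32(chunks):
    # Running CRC over a stream of byte chunks: constant, tiny state and one
    # fixed transformation per input -- pipeline-friendly rather than array-friendly.
    crc = 0
    for chunk in chunks:
        crc = zlib.crc32(chunk, crc)
    return crc & 0xFFFFFFFF

packets = (bytes([i % 256]) * 64 for i in range(1000))  # stand-in for a packet stream
print(hex(stream_crc32(packets)))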

I guess the primary questions are: (a) would they be used more for number-crunching (as in floating point) or more for "other" types of calculation (including integer calculations); (b) to the extent that the answer is "number-crunching", which classes of floating-point algorithm are better suited to GPU and which to FPGA; and (c) to the extent that the answer is "other", what sorts of computational algorithms are best suited to FPGA? In what realm do we get the greatest bang for the buck?

Best,
-P.

> On 30 Nov 2016, at 2:29 PM, John Larson <larsonj.home at gmail.com> wrote:
> 
> Peter,  
> 
> Here are my thoughts on that:
> 
> GPU -
> - good for heavy number-crunching on large amounts of data
> - very fast and wide memory bus, but can require a lot of memory
> - requires a PC architecture and substantial power
> - algorithms run as software on general-purpose processors
> 
> FPGA -
> - can function like a low-power SBA and does not need the host hardware and power that GPUs do
> - supports user-defined algorithms that run directly on the device
> - has customizable I/O that can be interfaced directly to other specialized hardware
> - good at accelerating a broad range of real-time workloads like encryption and IP traffic manipulation
> 
> One example that I can think of is how financial institutions use FPGAs in ultra-low-latency systems to optimize network traffic switching and to accelerate feed handling and market-data workloads.
> 
> I think of filtering real-time, high-volume network data against complex expressions as a general area where they could be useful versus a GPU.
> 
> Here is a recent article in Fortune that discusses how Microsoft is using them in the Azure cloud.
> http://fortune.com/2016/10/17/microsoft-fpga-chips-azure/
> 
> John
> 
> On Wed, Nov 30, 2016 at 1:11 PM, Peter S. Shenkin <shenkin at gmail.com> wrote:
> 
> On 30 Nov 2016, at 1:07 PM, Guan Yang <guan at yang.dk> wrote:
>> https://aws.amazon.com/blogs/aws/developer-preview-ec2-instances-f1-with-programmable-hardware/
> Interesting.
> 
> Does anyone know what kinds of apps work better with FPGA co-processors than with GPUs, which are basically vector processors that can run trigonometric functions very quickly?
> 
> -P.
> 
> 
> _______________________________________________
> Blabber mailing list Blabber at list.hackmanhattan.com
> https://list.hackmanhattan.com/listinfo/blabber


