Kevin Neilson- It sounds like it might be difficult to do (efficient) decimal arithmetic in an FPGA; the carry chains and multipliers are all binary.
Back in the 1830s Charles Babbage attempted to do this with geared systems. Most of us have heard of the Difference and Analytical Engines. These monsters were to have 30 columns of numbers moved by interposers. Much of Babbage's life was spent devising ways to compute what he called the anticipating carry. Some of his mechanical solutions look crazy: half-stepped and elliptical gearing on the diagrams. Today we would call this pipelining and, I think, half-carry.
The solution I like best was from the Analytical Engine, where he used bolts, like those used to lock doors. Each decimal digit had a bolt which would line up in a row when the interposer kicked in. This way the whole stack of 30 digits could be lifted quickly and with the force of gravity.
The "improved" method implemented in the machines built for his bicentenary in the late 1990s uses "clock" improvements and sectional gearing to do this 'pipelining'.
The elliptical gearing shown on his later diagrams is actually a way of splitting the clock so that there are 5 clocks rather than 10 to resolve the carry chain. The ellipses are in reality two gears, one running forward in time, the other running backward from -5 clocks to +5 clocks, with the results tabulated at zero.
As a firmware programmer, I do not know much about silicon (my classes in this were 30 years ago). It does seem that the fundamental theory of decimal engines was pretty well worked out by 1850. The rest is just implementation.
For a fascinating social account of how these high-speed trading machines were expected to be used in the 1850s, read Charles Dickens's Little Dorrit. Even 160 years ago the big players had the advantage, spending large sums on research to get that edge over the other players in the game. Back then the futures were in insurance and actuarial tables: betting on the lives of those who had money and were willing to insure the inheritance.
@Kevin... Decimal floating point arithmetic is essential when we want "natural" decimal fraction accuracy, most typically in a financial/monetary application. For example, when you want perfect decimal fraction accuracy to the 3rd or 6th digit beyond the point, or even further, binary floating point doesn't do (1/2 + 1/4 + 1/8 + ... covers only a small percentage of all possible decimal fractions). DFPA is done both in software and in hardware. Several current processors have adopted the simpler operations within their cores, and kept the more complex ones in software to save space.
When performance (latency) is an absolute requirement, full hardware realization is the solution. While this article's subject matter is highly statistical and hence can be thought of as inherently inaccurate, leading market index publishers, for example, sometimes require 10 decimal digits of fraction accuracy. Likewise, many traders prefer to do the arithmetic in proper DFP to avoid accidental cumulative errors.
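The point about binary fractions is easy to demonstrate in software. A minimal Python sketch, using the standard `decimal` module, shows the kind of rounding error that decimal arithmetic is meant to avoid:

```python
from decimal import Decimal

# 0.1 has no finite binary expansion, so binary floating point
# stores an approximation and the error leaks into sums:
binary_sum = 0.1 + 0.2
print(binary_sum == 0.3)   # False
print(binary_sum)          # 0.30000000000000004

# Decimal arithmetic represents decimal fractions exactly,
# which is why financial code prefers it:
decimal_sum = Decimal("0.1") + Decimal("0.2")
print(decimal_sum == Decimal("0.3"))  # True
```

A hardware DFP unit does the same thing at silicon speed; the software `decimal` module trades that latency for flexibility.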
I'm interested as well. I don't know what the point of decimal cores would be. I know it was useful in the old (ENIAC) days, when binary->decimal conversion was hard. And the HP calculators used BCD processors, which avoided some rounding errors.
@ssidman... I probably wouldn't feel comfortable being an HFT operator either. This was a strong use case for floating point arithmetic, and from that point of view, it is business. From a financial industry perspective, it is happening anyway; yes, some experts have more reservations than others, and the technology enablers of ultra-low latency are driving it. I feel that financial experts are in the best position to judge the impact on the world's financial stability. What I meant to highlight is the opportunities that this growing business is opening for FPGA systems development. The intention is to elaborate in a follow-up article on how and where this relates to, and generates merit for, the decimal floating point IP unit business.
We used to ask ourselves the same question, also from an Islamic economy standpoint, where this may be seen as gambling. However, HFT is only a frequency-dimension extension of a long-existing capital market operation: arbitrage. If trading is ethical manually, it is probably more ethical quantitatively and algorithmically, being more qualified and reasoned. If it is ethical at low and medium frequencies, then it probably is at higher frequencies. It's the regulator who should ensure how it contributes to market stability and to the economy in general. The debate is on in the US, UK, Europe, and S.E. Asia over how high-frequency traders and venues should be allowed to go. Most of the market now is in the US and UK, while the EU is, as usual, more conservative with regulation, so you'd find European traders setting up platforms in NYSE, NASDAQ, Chicago, and London's FTSE.
Here is a link to a WSJ article on subject matter: http://online.wsj.com/news/articles/SB10001424053111903392904576512250007216020
"Independent research shows that high-frequency trading has a positive impact on global capital markets by increasing liquidity and reducing price volatility, findings that could allay some of the popular and political concerns over the strategy."
@mkellett: Does any one ever consider the ethics and morality of this business...
I do, and I'm sure others do also -- I just don't know what the answer is.
I have a somewhat simplistic view of the world that might be summarized as follows: at the end of the day, true wealth is generated by someone picking up a shovel and digging something (e.g. a mineral) out of the ground, or growing something, or taking something and fabricating something else out of it.
When institutions like banks "make money" by buying or selling foreign currencies, or when traders "make money" by buying or selling shares, that money has to come from somewhere (and judging from the state of my bank account, I have a pretty good idea where it's coming from).
This isn't to say that the concept of a stock exchange is bad per se -- the ability to raise money to fund projects is very useful -- and I have no problem with the idea that folks who take the risk with their money should get some reward -- but things like high frequency trading do leave a "bad taste" in my mouth...
As it turns out, I know one of the original founders of Celoxica, from when they were a spin-out of a University hardware compilation group. Their business then was mainly tools, for direct hardware execution of algorithms. Many changes since then. I believe that he shares your view as to the current application focus. Does the world really need faster, deterministic, jitter-free ways of jumping down the economic rabbit hole whose Wonderland world is disconnected from the real one, producing no real or tangible economic benefit? It's a high value application until reality catches up and things crash.
Does anyone ever consider the ethics and morality of this business? Huge amounts of effort and money are poured into making faster and faster trades so that organisations with huge capital resources can extract money from the rest of us by making a trade a few microseconds faster. This isn't useful business - it's totally parasitic.
What's gone wrong with the way we organise ourselves that we put effort into nonsense like this rather than any one of hundreds of useful investments?