Thank you for sharing your background and interests. I love history too, geographical and ethnological history, and you actually brought up the exciting area of the history of computing. I find this dialog enriching, even though you hold strong opinions. I appreciate learning about different tastes and perspectives because it sharpens one's own. I am, unlike yourself (with full respect), not a handy person. I am a systems person who is into logic and modeling (broadly). I strongly believe in the knowledge economy and a services-oriented society, and I deal best with intangibles.
One thing I still wish to comment on. You wrote: "I think the concern with HFT is that it gives an unfair advantage. That if one was say to use the same equipment to play in a casino, it would be frowned upon."
I researched the HFT topic in my former role as SilMinds Head of Technical Business Development, simply because it demands high computational speed and has possible merit for accuracy. Trading has been around for ages. Probably you and I, and Max, don't see much fun in it. What I want to say is that the gambling aspect is associated with securities trading in general, not specifically with HFT. My study revealed that over 80% of current trading is algorithmic, conducted by high-performance servers off the trading floor. There is low-, medium-, and high-frequency trading. From the trader's point of view, this is "highly" qualified gambling. I don't like gambling anyway. From the capital-market perspective, trading is necessary for valuation and stability, and without it the world economy would crash. So somebody (who likes it) has to do it.
However, it is driven by enabling technologies, and once in place it motivated better technology for lower latencies. Isn't this how technology cycles work? Performance merit stimulates bigger and faster hardware, which in turn opens more possibilities requiring further hardware advancement. I saw this as a clear case. And after all, all major exchanges now (something you can publicly view) process buy and sell orders in millisecond time units, and you'd see multiple orders within the same ms 'tick'.
Back to fairness, yes... The trader who has cleverer algorithms running on faster software and hardware platforms has better chances. It is fair because he or she has invested more in "financial engineering" education and in the development of platforms and algorithms.
"Finally, sure ideas not carried all way to maturity are for sharing, publishing, and working on, more than patenting; though some may do patent them."
My frustration with IP is not about research development time. It has more to do with accessibility and ownership of ideas. Applied research is implementation. There is a physical goal or service involved.
I love history. If I knew how to do it, I would have loved to earn a degree in the history of technology and mathematics. This is why I collect old textbooks, some going back to the 18th century. It is also why I tend to write walls of text, as I think in a more Victorian style of prose.
What concerns me is the departure from Yankee engineering toward the locked-down guild system of Europe it was meant to counteract. We lament the lack of young people (and women) in these fields. I love watchmaking as well as pipe organ building. I have a watchmaker's bench, at which I do my engineering, where I can hand-solder down to 0805 parts.
Had I lived before 1880 in Europe, it would have been illegal for me (as a woman) to own these tools. The guilds were restrictive. Masters had the right to destroy inferior work. This was often used to keep the apprentice in check, should the apprentice make the master look incompetent.
That system worked well for hundreds of years, until the printing press happened and revolution came about in the 18th century. Now we have the concept that all are created equal and should have fair access.
We are in a more dynamic, interconnected world. We have gone from a world where something was rare, where someone might see a toy or an idea once in a lifetime, to a world where these ideas are common.
I think the concern with HFT is that it gives an unfair advantage; if one were, say, to use the same equipment to play in a casino, it would be frowned upon.
At the moment I am interested in statistical analysis of recorded music events. The idea of fast, accurate decimal floating point interests me. I got interested in nonlinear mathematics after reading a biography of Johnny von Neumann, where one learns that 30 is not simply 3 times 10. Such things are abstract and hard to explain without walls of text and graphs.
Should silicon go the way of 3D printing and assembly, it would be nice to have access to something that might not work in one field but could be adapted to another.
This is the heart of the maker/designer movement. To open up ideas to those who can best implement them.
As a kid I got to take field trips to the landfill/dump, where we would use refuse like old typewriters to make art projects. Why can we not do the same with old silicon cores and obsolete software that can only run under MS-DOS or OS9?
The DFPA cores mentioned were developed independently of any application. It is financial applications that demand perfect decimal-fraction accuracy. This may explain why mainly IBM (Power) and Oracle (SPARC) processors have adopted DFPA cores.
SilMinds Inc. is the patent holder, not myself, and I included the link to the data sheets in case you are interested in technical/algorithm detail. The cores involve tens of person-years of research work (initiated by Dr. Hossam Fahmy, SilMinds CTO & Prof. of Computer Engineering). The first mature DFPA standard is IEEE 754-2008, which is implementation independent. SilMinds' commercial hardware implementations were novel at their respective times of development, and hence they ought to be protected, of course. (There was no other commercial offering until a year ago, when I ceased to be involved with business development of this IP product, and I don't know of any yet.)
Now, HFT came into the picture because it is very latency sensitive, and hence those wanting DFPA accuracy (rather than binary float or software workarounds) are likely to need it in hardware. There is a product model that isn't published yet. On the other hand, banking and accounting, for example, are the most accuracy-demanding applications, but unfortunately they are not that latency sensitive.
I disagree with you that IP work is cheap. Firstly, these cores are simulated, verified, and tested both in software and in silicon (FPGA and on-chip). And how would you sell them as hardware? They can only be part of a processor or part of an FPGA realization. More generally, we are living in a highly layered and specialized industry at this time. A digital designer may use the (knowledge) services of an electronics engineer to support his work, which the latter may not be familiar with. Coming from a network-services background, I know that the routing expert is not necessarily a DSL or UMTS expert (layer 3 vs. layer 1).
Finally, sure, ideas not carried all the way to maturity are for sharing, publishing, and working on more than for patenting, though some may patent them. Yet sometimes it is sound innovative ideas that generate funds, while patents may not. What matters to business is the potential impact, and what matters to knowledge is the generated progress or synergy with the state of knowledge.
@Kevin: I don't know what the point of decimal cores would be.
Some countries mandate that any financial calculations be performed in such a way that you get exactly the same result as if you'd performed them with pencil and paper. If you work in base-2 (binary) you may introduce errors -- the bottom line is that you have to work with a base-10 decimal representation -- but most CPUs don't perform calculations on decimal representations very efficiently -- you can do much better using custom cores created in FPGAs...
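As a quick software illustration of the base-2 rounding issue described here (a minimal Python sketch, using the standard-library `decimal` module as a stand-in for hardware decimal arithmetic):

```python
from decimal import Decimal

# Binary floating point cannot represent 0.1 exactly, so small
# errors creep into money arithmetic.
print(0.1 + 0.2)         # 0.30000000000000004
print(0.1 + 0.2 == 0.3)  # False

# A base-10 representation gives the pencil-and-paper answer.
print(Decimal("0.1") + Decimal("0.2"))                    # 0.3
print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))  # True
```

The software `decimal` library is correct but slow, which is exactly why dedicated decimal cores in FPGAs or processors are attractive for latency-sensitive work.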
I was also more interested in the comments. In re-reading the article I noticed that it was about "patented" floating-point algorithms for HFT. I must have missed the first paragraph, thinking it was about silicon fab.
I find the idea of patenting something as fundamental as floating-point hardware disturbing, especially given the open history of development 200 years ago in the higher maths and statistics.
Perhaps it is because of my age that I have a problem with protecting non-physical implementations. Ideas are cheap. Physical implementations such as a fab or a data center are where the costs are: power, labor, taxes, etc. These should be rewarded by those who can make a better product, not by gaming the system (which I consider tantamount to cheating).
Given the speed of progress, it seems that patent protection of ideas should be shorter. If one cannot reasonably develop an idea in a few months, let someone else implement and improve it rather than lock it away in a vault to borrow funding against.
Dear rich.pell: The article isn't about evaluating the usefulness of HFT. It is meant to highlight the rising merit of reconfigurable hardware in an environment where such merit used to exist only in exchange ticker plants, involving just a couple of suppliers. Now it is found in almost every component of every single participating entity, with over a dozen serious US & European FPGA hardware development SMEs involved, not to mention the dominant network systems vendors.
Hello Sheepdoll: No I was replying to Kevin Neilson re his "It sounds like it might be difficult to do (efficient) decimal arithmetic in an FPGA; the carry chains and multipliers are all binary.... Now I recall that the 6502 processor had a BCD mode for decimal (integer) arithmetic. ..."
Are you replying to my note regarding decimal arithmetic? I am aware of the difference between floating point and integer. This is important in modern binary computing. I have a textbook on rounding errors in floating point. Decimal computing is done with both fixed and floating point. There are just 10 states rather than two.
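To make the "10 states rather than two" idea concrete, here is a minimal Python sketch of packed-BCD addition in the spirit of the 6502's decimal mode mentioned earlier: each 4-bit nibble holds one decimal digit, and a +6-style correction is applied whenever a nibble overflows past 9 (the function name and structure here are illustrative, not from any particular implementation):

```python
def bcd_add(a: int, b: int) -> int:
    """Add two packed-BCD bytes (two decimal digits each), the way a
    6502 in decimal mode would: add nibble by nibble, and when a
    digit sum exceeds 9, subtract 10 and carry into the next digit."""
    result = 0
    carry = 0
    for shift in (0, 4):  # low nibble first, then high nibble
        digit = ((a >> shift) & 0xF) + ((b >> shift) & 0xF) + carry
        if digit > 9:
            digit -= 10
            carry = 1
        else:
            carry = 0
        result |= digit << shift
    return result  # final carry out is dropped in this sketch

# 0x47 + 0x25 encodes decimal 47 + 25 = 72, i.e. packed BCD 0x72
print(hex(bcd_add(0x47, 0x25)))  # 0x72
```

A plain binary add of 0x47 and 0x25 would give 0x6C, which is meaningless as a pair of decimal digits; the per-digit correction is what keeps the representation decimal.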
In the 19th century, the Difference and Analytical Engines were to deal with floating-point numbers: nonlinear functions that could be linearized through statistical probabilities. That is why Babbage wanted 30 digits of decimal accuracy. His floating point was mechanical.
Few people have actually looked at the abstractions of his work. It is written in a notation that he invented, which no one can read. He statistically mapped out the carry propagations through multiplications and divisions; these could be broken down statistically into summations called differences.
The ultimate goal about any of this, is to find the right statistical model to predict the near future.
I have a text from the 1970s on this sort of modeling, back when binary was new. One of the problems was to see how large a matrix your BASIC interpreter could invert. If you needed a large complex number, you loaded it into a matrix digit by digit.
I personally see no reason why this can not be done directly in silicon.
The discussion of decimal arithmetic relates strictly to floating-point fractions; there is no accuracy issue with integers. All arithmetic operations are done, verified, and commercially available as IP core units (http://silminds.com/ip-products/dfp-units).
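The integer-vs-fraction distinction is easy to demonstrate (a small Python sketch; the standard-library `decimal` module stands in here for hardware decimal floating point):

```python
from decimal import Decimal

# Integers are exact in binary: summing 1 ten times gives exactly 10.
print(sum([1] * 10) == 10)  # True

# Decimal *fractions* are where binary floats drift: summing 0.1
# ten times does not give exactly 1.0.
total = sum([0.1] * 10)
print(total)         # 0.9999999999999999
print(total == 1.0)  # False

# A decimal floating-point representation keeps the fraction exact.
print(sum([Decimal("0.1")] * 10) == Decimal("1.0"))  # True
```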