I am aware of the Panasonic MN101L, which I think was the first embedded ReRAM in the field, though I am not sure which ReRAM material they are using. I have not investigated the Philips product. It is the market and the designers who will decide, and when we see commercial products using these memories we can measure their success.
With respect to TiO2, I think any device that builds up a concentration gradient of material that can be moved relatively easily by electrical means is likely to suffer from thermal degradation. I am afraid it's a case of damned if you do, damned if you don't: if you require a high temperature/high energy to write the memory, that is a problem in many other respects; if you use low energy, then diffusion may become the elevated-temperature data-retention problem.
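To put a rough number on that trade-off: the diffusion processes behind elevated-temperature retention loss are normally treated as thermally activated, i.e. Arrhenius behaviour. A minimal sketch, where the 1.0 eV activation energy and the temperatures are illustrative assumptions on my part, not measured TiO2 figures:

```python
import math

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def acceleration_factor(ea_ev: float, t_use_c: float, t_stress_c: float) -> float:
    """Arrhenius acceleration factor: how much faster a thermally
    activated mechanism (e.g. ion back-diffusion eroding the stored
    state) proceeds at the stress temperature than at the use
    temperature. Lower write energy usually means a lower barrier
    ea_ev, and retention degrades exponentially with it."""
    t_use_k = t_use_c + 273.15
    t_stress_k = t_stress_c + 273.15
    return math.exp((ea_ev / K_BOLTZMANN_EV) * (1.0 / t_use_k - 1.0 / t_stress_k))

# Illustrative only: a 1.0 eV barrier, 25 C use vs an 85 C bake.
af = acceleration_factor(1.0, 25.0, 85.0)
```

The exponential dependence is the whole point: a modest reduction in the barrier that makes the cell easier to write also collapses the retention margin at elevated temperature.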
As for the cost of a ReRAM plant at 10-20nm, Samsung should know; most of it will be for silicon processing anyway.
Shockley 22:- You can never say phase change memory (PCM) is definitely dead; it is alive, but at the moment, to use a medical acronym, more in the DNR phase. There are still a very large number of problems to solve if PCM memory arrays with chip bit densities competitive with Flash (say double-digit Gb) and with cell sizes in the 10nm to 20nm lithographic range are to appear in the marketplace: problems like elevated-temperature data retention, element separation, current density, parameter drift, matrix isolation devices and thermal crosstalk. The latter two are very significant if you want to construct a three-dimensional, or stacked, matrix.
Part of the answer to your question regarding PCM may be found in the reason why there are now so many different types of RRAM/ReRAM memory mechanisms, technologies and devices all competing for the NV memory top spot. That effort tells you those workers believe there is still an opportunity and that PCM has failed. I think there is still a lot of work ahead before something in the form of a competitive product pops out from the wide spectrum of RRAM/ReRAM opportunities. Of the RRAMs, the one that interests me the most at the moment is the CeRAM (correlated electron RAM). If all the performance and fabrication claims for this device can be established by independent third parties, then it has to be the favourite to succeed, and as a bulk effect it should scale. If Micron/Sony can turn their 16Gb RRAM copper-filament-based memory cell into a product in short order (see my EETimes article), they might be onto a winner of sorts. However, I see that device more as a test vehicle for many different types of RRAM technology, and it should receive support for that reason alone, to allow a quick jump from laboratory claims to high-end lithography.
As I said in my recent VLSI report in EETimes, perhaps the ReRAM/RRAM community would be better served if they focused their efforts on on-chip embedded memory and built the bit capacity upwards from success there.
Resistion:- On 16Gb reliability, nothing was provided in the ISSCC 2014 paper. However, in August 2012 at the Flash Memory Summit, Santa Clara, Amigo (Keiichi) Tsutsui of the Sony Corporation, under the heading "Adaptive ReRAM Technology for 2014", presented the results for a 10Gbit memory using the same CuTe copper filament bridge mechanism, with the same 1GB/s read and 200MB/s program performance as the ISSCC device. For that device a program endurance of greater than 10^5 cycles was claimed, with data retention of greater than 10^5 years. I am afraid that rather meagre bit of what might pass as "reliability" data is the best I can offer at the moment, with the caveat that results do not necessarily transfer between different device structures.
On the subject of verify, one important aspect of reliability involves the way in which, with the 16Gb ISSCC device, iterative set is used in relation to the CSP to reduce power. If over the write/erase lifetime the number of tries required increases, then it would appear the power dissipation budget gets compromised.
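The arithmetic behind that concern can be sketched in a few lines. The per-pulse energy and the try counts below are hypothetical illustrations, not figures from the ISSCC paper; the point is only that with an iterative set-verify scheme, each failed verify costs another set pulse, so per-write energy grows roughly linearly with the number of tries needed:

```python
def write_energy_nj(tries: int, set_pulse_nj: float, verify_nj: float) -> float:
    """Energy for one iterative set-verify write: every try spends
    one set pulse plus one verify read. More tries = proportionally
    more energy per written bit."""
    return tries * (set_pulse_nj + verify_nj)

# Hypothetical figures: if cell wear pushes the typical try count
# from 2 to 6 over the write/erase lifetime, the per-write energy
# (and so the power needed at a fixed program bandwidth) triples.
fresh_nj = write_energy_nj(2, 1.0, 0.1)
worn_nj = write_energy_nj(6, 1.0, 0.1)
```

At a fixed program bandwidth the chip's power dissipation scales with this per-write energy, which is why a rising try count late in life eats into the power budget.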
Krisi & Resistion:- To complete my report I requested from Micron the details of the power dissipation when the chip is operating at its maximum claimed read and write performance. So far I have not received any reply. With that information it would be a little easier to judge how near the 16Gb device might be to product status or commercial availability, and to higher bit densities; only Micron/Sony can really answer that question.
I think what is very important is that Micron/Sony, with the 16Gb GeCu-Cu bridge device, now have a test structure and mask set that should allow other bi-directional NV memory technologies to be quickly evaluated at high bit density. In that role it will help solve the problem of getting laboratory claims for the performance of single or a few NV memory devices, from wherever they come, evaluated at high bit density, with any problems in that respect quickly exposed.