Yes, I do. And as long as customers want to reuse their existing software libraries I will develop software for 8-bit MCUs, too. For many, many embedded systems 8 bits are absolutely enough or are even the best choice.
I use a lot of 8-bit HCS08 processors in the embedded automotive applications that I design. Sometimes as a supervisory/real-time processor next to a bigger application processor running Linux, sometimes as standalone processors.
My 8-bit code is done in C, sometimes with a bit of assembly, but no C++.
The funny thing is that we have much more confidence in the longevity of those good old processors than in the newer PPC or ARM processors that we use.
I think ippisl really clarified the market (4/8-bit grew 6% in a market that grew 16%), but I would suspect that the 8-bit market is shrinking more than the 4-bit. 4-bit MCUs are real cheap: I got a quote for a 4-bit 32-pin MCU (bare die) with 24k (mask) ROM for less than $0.07 @ 500k pieces. So for high-volume, low-cost applications, 4-bit fits. I could be wrong, but I don't think any 8-bit MCU can match this price.
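For scale, the quoted price works out as follows. A quick back-of-the-envelope sketch; only the $0.07 unit price and 500k volume come from the post, the per-cent comparison is illustrative:

```python
# Sanity check on the quoted 4-bit MCU pricing.
unit_price = 0.07   # dollars per bare die, at the quoted volume
volume = 500_000    # pieces

total_cost = unit_price * volume
print(f"Total silicon cost at {volume:,} pieces: ${total_cost:,.0f}")

# At this volume, every extra cent an 8-bit part costs adds this much:
cost_per_extra_cent = volume * 0.01
print(f"Each additional cent per unit adds ${cost_per_extra_cent:,.0f}")
```

So the whole 500k run costs about $35k in silicon, and an 8-bit part even a few cents dearer adds thousands of dollars, which is why the low-end niche is so price-sensitive.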
The overall view depends on development time, processor/controller cost, and the cleanliness of the hardware and software structures. A very positive example there is the MSP430 from TI, with a very clean 16-bit RISC/CISC structure in both hardware and software, compared to the old 4-bit structure of the TMS1000 or the 8-bit 80xx. The MSP430 is easier to program in assembler and/or C, so you get your product out faster, with higher reliability of your written code.
The more interesting statistics would be the percentage of new designs by MCU family (design starts) and lines of code written by MCU family.
I haven't yet seen such stats, but I bet they would tell a much worse story for 8-bitters.
Hmmm... this is kinda complex to unravel. Having designed an 8080 system in early 1975, I have a great fondness for 8-bitters. Writing embedded apps in C or C++ doesn't make much sense for an 8-bit MPU (and even less for a 4-bitter) in a size/cost-sensitive application.
Looking at the report, I'd say the shrinkage (dollars) in the 4/8 bit market is no doubt in large part due to price reductions - their designs are well past paid for, and processes are by now ho-hum, so manufacturers can stand to let prices sag to keep their volume (quantity) up. I'd say the 4/8 bit producers are probably raking in pretty good profits.
ARM may win out eventually for power management reasons. Adapting an 8051 for deep ARM-like power-down would be pretty tough, given the small address space and the huge body of legacy code. Still, I'm sure the 4/8 bit community will do design shrinks to stay in the game for a good while.
I'd be interested to know if there's an age difference in the choice of 8- vs 32-bit MCUs. Are engineers new to the business starting out at 32 bits, with the experienced engineers sticking with something that works quite well?
From my perspective, the tool chain, setup, and hardware design can be quite a bit simpler with some of the 8-bitters than with a typical 32-bit MCU. For small projects, the difference in time devoted to setup and managing the project can be enough to justify an 8-bit part over a similarly priced 32-bit MCU. Some of the new low-pin-count ARM chips may start to eat into that advantage, especially on the schematic and layout design side.