Aeroengineer
User Rank
Blogger
Needs Two More Timers
Aeroengineer   2/25/2014 3:30:01 PM
NO RATINGS
I wish that this thing had one more channel for timers; then it could be used as a micro brushless controller. I am excited, though, as there is a 16QFN package that looks tempting to use in a product.

I just wish there were a prototyping service that would do very small boards, with design rules that would let you design for this chip in its BGA form.

http://www.freescale.com/webapp/sps/site/prod_summary.jsp?code=KL03

DrFPGA
User Rank
Blogger
IoT Golf Balls
DrFPGA   2/25/2014 4:18:24 PM
NO RATINGS
I guess they are going after the billions of golf balls that don't yet have IP addresses. Just think if every golf ball could talk to each other and to all the golf clubs knocking them around. Lots of ways to improve your game with access to all that data.

What is the best way to avoid that sand trap? Just ask your golf ball to talk to all its friends. Maybe your club has some advice on your follow-through. I see a big market for the Internet of Golf Things (IoGT).

betajet
User Rank
CEO
Re: IoT Golf Balls
betajet   2/25/2014 4:53:53 PM
NO RATINGS
But... but... but... it would totally spoil that great scene in Goldfinger (1964) when James Bond is standing on Goldfinger's "Slazenger 1" so that he can't find it.

Wilton.Helm
User Rank
Rookie
IoT?
Wilton.Helm   2/25/2014 6:24:18 PM
NO RATINGS
The hoopla failed to mention one thing: does this tiny ARM processor have a MAC and a PHY? For that matter, does it have NV data memory and flash for code storage? If it lacks any of those, then it isn't even close to being ready for IoT. As for 32 bits: there are a lot of really useful IoT products out there running just fine on 16-bit processors (and some even on 8-bit processors). This world doesn't need to migrate everything to 32-bit processors (or worse yet, 64-bit or 128-bit), especially ones that aren't particularly memory efficient. The number of applications that can benefit from a 32-bit CPU that will fit in the dimple of a golf ball with only 16 pins (read: <= 14 I/Os) is fairly small, especially IoT devices that don't need access to the Internet. Let's see, how does one put an IP address into this?

Susan Rambo
User Rank
Blogger
Re: IoT?
Susan Rambo   2/25/2014 6:46:43 PM
NO RATINGS
Interesting. Thanks for your excellent comment. When would you need 32 bits versus 16? What's the general dividing line?

Wilton.Helm
User Rank
Rookie
Re: IoT?
Wilton.Helm   2/25/2014 7:14:37 PM
NO RATINGS
That depends on a number of factors in the application, primarily the nature of the data. A lot of IoT devices involve bit-level control (the more bits in a word, the less efficient it is to control them individually). They often read ADCs, which generally have less than 16 bits of resolution. While some programmers immediately convert the results to floating point, that is generally unnecessary and inefficient, and the article didn't mention whether this tiny wonder has an FPU. If it doesn't, then floating point is going to slow things down considerably whether it is 16 or 32 bits.

32 bits become useful if a lot of the variables are numeric and require more than 32K (i.e., 16-bit) resolution, or are large bit-map fields. It can also be advantageous if there is a large amount of code or data (not likely on such a tiny processor) that could benefit from pointers and address fields that are 32 bits wide.

Another thing that could sway the decision would be a large bit-mapped display (LCD) with a complex GUI, or video, or something similar. But I don't see that connected to a CPU with 14 I/O lines.

Finally, 32 bits become useful if the programmer is writing with a lot of high-level abstractions without knowing how inefficiently they are implemented, or worse yet using one of the numerous popular interpreted languages that are very code-inefficient and memory hogs.

I have a control processor with two Ethernet ports on two LANs (for redundancy). It communicates with up to 100 other Ethernet devices as part of a dedicated system (high-reliability aerospace stuff) and handles complex event scheduling at a rate of 100 to 1,000 events per second. It can process Ethernet packets at a rate of 1,000 per second. The kicker: it is a 16-bit processor with 256K bytes of total storage for code and data, including the protocol stack and Ethernet buffers! There are a few places where I could have doubled or tripled the speed if I had 32-bit registers, but in general, porting it to a 32-bit processor would have resulted in less than a 10% performance improvement, and porting it to an ARM would have doubled the code memory requirements. (BTW, the 16-bit CPU had two MACs and a PHY on the die, as well as four serial ports, four DMAs, and the usual assortment of timers, interrupt controller, etc. It was 184 pins: clearly far more hardware than 14 I/Os could use, and yet still quite comfortably in the 16-bit realm.)

A Metcalfe
User Rank
Rookie
Re: IoT?
A Metcalfe   2/25/2014 7:36:36 PM
NO RATINGS
>64K code size (e.g., most "standard" protocol packages), without awkward compiler kludges.

or

Computationally intensive ops such as encryption, protocol processing, sensor processing, etc.

The "sell" is that using wide instruction/data processors results in fewer instructions per function, reducing power. However, the power-compromised processes on which most 32-bit processors are fabricated (to reduce cost) result in higher consumption than the best-in-class 16-bitters at low-to-medium performance points. At higher performance levels, the 32-bitters are out in front.

As many IoT apps are battery powered, I'd say there is a long, successful life still ahead for 16-bit machines, and for some 8-bitters as well.

betajet
User Rank
CEO
Re: IoT?
betajet   2/25/2014 7:55:54 PM
NO RATINGS
While many applications of a tiny MCU like the KL03 could fit in an 8-bit or 16-bit processor, if your application is doing signal processing with multiply-accumulate, the number of bits in your intermediate results could easily exceed 16. Dealing with this is a pain on a 16-bit processor.

With a 32-bit architecture, you usually don't need to think about computations overflowing and producing strange results, so you can concentrate on algorithms.  With an 8- or 16-bit architecture, you need to keep this in mind all the time.

The KL03 has only 32KB of Flash, so you don't need 32 bits for addressing -- today. However, you may want to start with this chip but some day grow into a chip with far more memory, in which case you have to bank-switch or go to a 32-bit architecture. It's expensive to switch architectures, so you might as well go with a 32-bit architecture now and save the hassle of switching over later, unless there are serious downsides.

How about those downsides?  A 32-bit CPU core may be twice the size of a 16-bit core, but the chip area is probably dominated by Flash and SRAM, so CPU core area doesn't have much impact on price.

How about code density?  ARM Cortex-M0+ uses the Thumb instruction set, where most instructions are 16 bits long instead of 32.  So code density is similar to an 8-bit or 16-bit processor.

How about ecosystem?  Well, IMO ARM has pretty much everyone else beat in that category.

Dividing line?  If you're sure your application isn't doing arithmetic that's going to overflow (e.g., you are mostly moving data around and not doing much arithmetic) and the application is always going to be small, 16 bits is a good candidate.  Otherwise, you might as well go with a tiny ARM and spend your energy worrying about other things.

JMO/YMMV

Sanjib.A
User Rank
CEO
Re: IoT?
Sanjib.A   2/25/2014 9:09:56 PM
NO RATINGS
@Max: Going through the KL03 fact sheet (link below), I saw it has three communication interfaces: (1) one 8-bit SPI module, (2) one low-power UART module, and (3) one 1Mbps I2C module. I could not find anything that would let it connect to the Internet, so how is it meant for IoT? Am I missing anything? Is there a later version that is not published on the Freescale website?

betajet
User Rank
CEO
Re: IoT?
betajet   2/25/2014 9:36:33 PM
NO RATINGS
You can use SPI to talk to an external MAC and PHY. For example, Microchip has the ENC28J60 for 10BASE-T Ethernet.
