In some ways, packages are converging to offer an integrated solution at the hobbyist level, so hobbyists may not even realize that each is its own package. At that level you can get away without some of the fancier tools listed, the ones involved with microwave-frequency signals and high power dissipation. The further you go, though, the more of those tools come into play.
Hi Max, I must say I've never had the opportunity to work for a company that could afford all of those nice add-ons, and it was only recently that I got hold of Altium for some of those features. About 15 years ago I did a PCI 3-channel image acquisition card with a co-processor and two FPGAs, and got rev 1 to work without any issues. The biggest challenge was to get 12-bit resolution on a 1 V signal with all of the digital stuff going on. The FPGA code was done with a text editor and Excel, and the mechanical dimensioning to match up with a daughter board was done with Excel and a vernier caliper.

I guess what I'm saying is that apart from using Bishop Graphics tape (which I've done also), the essentials for 98% of all PCB design tasks really are just schematic and layout. I use the verification part of Altium maybe every 10th design and the FPGA compiler maybe every 20th, so really I think most of the features aren't used for most designs. Take the wiring loom features you mentioned: great stuff to have, but I've never used them because no one had the money to buy them. For thermal simulation I've only ever had a calculator (or Excel), and my designs don't overheat because I stick to a bunch of empirical rules about how much power is allowed into how many square millimetres of PCB space without assistance for removal. Yes, you won't get a design that is right on the physical limits of what is possible, but you'll usually have a more robust design that still works when unobtainium is unavailable.

I don't consider myself a brilliant designer, just one that loves rules of thumb influenced by 40 years of doing stuff. Per someone else's comment here, let's hope the young guys always have access to those guys who always seem to have an answer to all those questions.
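For what it's worth, the kind of calculator-or-Excel thermal check described above can be sketched in a few lines. The 40 °C rise per W/in² coefficient below is my own assumed ballpark for a still-air FR4 board, not a figure from this thread; real boards vary widely with airflow, copper pours, and enclosure.

```python
# Rough PCB thermal budget check -- a rule-of-thumb sketch, not a simulation.
# Assumption: ~40 degC temperature rise per W/in^2 of board spreading area
# for a still-air FR4 board (ballpark only; airflow and copper change this).

C_PER_W_PER_IN2 = 40.0  # assumed coefficient, degC per (W/in^2)

def temp_rise_c(power_w: float, area_in2: float) -> float:
    """Estimated temperature rise for power spread over a given board area."""
    return C_PER_W_PER_IN2 * (power_w / area_in2)

def area_needed_in2(power_w: float, max_rise_c: float = 40.0) -> float:
    """Minimum spreading area to stay under a given temperature rise."""
    return C_PER_W_PER_IN2 * power_w / max_rise_c

print(f"2 W over 1.5 in^2 -> ~{temp_rise_c(2.0, 1.5):.0f} degC rise")
print(f"3 W at 40 degC max rise needs ~{area_needed_in2(3.0):.1f} in^2")
```

This is exactly the sort of "empirical rule" estimate the comment describes: crude, but enough to keep a conservative design out of thermal trouble without a simulator.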
Once upon a time our saying was "Everything below 10 MHz is 'continuous current'" ... I made a great many boards without any simulation and got them right the first time, up to 270 MHz LVDS transfer of video data, with a lot of analog stuff like genlocking, etc. Of course there were working PCI bus interfaces and a great deal of digital data processing for professional TV station equipment. This went up to 8 layers and a size comparable to today's largest PC graphics boards, with plenty of SMD chips on both sides with up to about 500 pins (or balls) ... all that about 15 years ago. All I needed was the schematics, the PCB layout, the PLD design software, and later the FPGA stuff from Xilinx. I also, just once, went up to about 400 MHz for something (I believe a satellite receiver board), and even then it worked first time and without any simulation, be it power or thermal or whatever. So you CAN work with just a schematic and PCB layout tool (most of the time integrated anyway), and of course if you use programmable logic you need to be able to "compile" the VHDL (or Verilog or whatever) for it, but I did my pin-swapping in advance in my head and on paper (and believe me, it didn't take long, and I always used my chips' resources to the limit). Sorry, but I believe that to some extent it's a question of talent (please forgive the bragging).
@Max: I agree with your list, and yes, we would need all of those as boards grow from fairly complex to more and more complex. That is why electronics design has been known as a costly investment (the software tools you mentioned, plus the lab instruments, plus certification costs), and those tools don't come for free (for commercial use). Hence, depending on the complexity of the design involved, we could sacrifice some of those as appropriate... compromising somewhere else -- more bench testing, debugging, and possible reiterations. :)
@Sanjib: That is why electronics design has been known as a costly investment...
I must admit that the ever-increasing complexity is starting to worry me -- how do younger engineers learn all of this stuff? There was much less to learn when I was at university -- the other stuff evolved over the last 35 years (LOL)
I've actually designed very simple PCBs using just the layout tool and instructions from the mechanical engineer (who presumably had some sort of 3D model). Of course, I'm talking about small boards with a couple of sensors and pads to hook up the wires. For anything more complex I'll at least use schematic capture. For most hobbyist stuff you probably don't need much more, although I always seem to have to create at least one new part in the library. As Aeroengineer said, these capabilities are merging at the hobbyist level, so hobbyists see it all as one package.
On the other hand if you want a complex design to work when it's produced by the thousands then you need to start using the analysis tools.
In my day job, I need to do power and thermal analysis, and usually digital/analog simulation, signal integrity, and EMC/EMI on every design. I haven't gotten into the mechanical aspects or FPGA design, so I don't personally use those tools, but I do use the results they generate.
One of these days, I'll actually have time to do one of the "hobbyist" projects I've been contemplating. Probably something with LEDs...
If you really understand PCB engineering, you will eventually locate tools to help clear the "hurdles" of typical problems that arise. A good set of tools and the understanding to apply them will allow you to produce a 100% functional board in ONE revision (assuming fewer than 500 parts). Since the lead time to produce a board at reasonable cost is usually a few weeks, it is a better approach to spend an extra few days looking at crosstalk, parasitic capacitance, trace impedance, loop area, etc. The physics of how a physical version of a circuit behaves is well known, and thus tools have been developed to help us engineers "see" these problems. Until you get to frequencies above 100 MHz, a prototype is useful primarily for optimizing component values, rather than for determining a need to revise traces on the board.
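As one concrete example of "seeing" trace impedance before the board comes back, the well-known IPC-2141 approximation for a surface microstrip fits in a few lines. The dimensions in the usage line are made up for illustration; the formula itself is only a rough estimate that a field solver or the fab house should confirm.

```python
import math

def microstrip_z0_ohm(h_mm: float, w_mm: float, t_mm: float,
                      er: float = 4.5) -> float:
    """IPC-2141 approximation for surface microstrip impedance.

    h_mm: dielectric height under the trace, w_mm: trace width,
    t_mm: trace thickness, er: relative permittivity (FR4 ~ 4.2-4.7).
    Only roughly valid (about 0.1 < w/h < 2.0); treat as an estimate.
    """
    return (87.0 / math.sqrt(er + 1.41)) * \
        math.log(5.98 * h_mm / (0.8 * w_mm + t_mm))

# Illustrative numbers: 0.3 mm trace, 1 oz copper (~0.035 mm), 0.2 mm prepreg.
print(f"Z0 ~ {microstrip_z0_ohm(0.2, 0.3, 0.035):.1f} ohm")
```

A few minutes with a sketch like this (or the equivalent spreadsheet cell) is the "extra few days" investment in miniature: cheap insurance against a respin.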
Yes, I agree. Having the tools and the knowledge to use them really helps in getting it right the first time.
One thing that I've found, though, is that one of the biggest factors in getting it right is having good design reviews. After the last few designs that I did (where we DID get it right the first time), I'm convinced that you can't have too many reviews, and you really need to have the right kind of reviews with the right people (engineers with expertise in the area being reviewed). I had the project manager breathing down my neck because we were late getting the PCB design done, and we ended up finishing the project ahead of schedule because we didn't need the second PCB spin.
VERY good point. Design reviews help catch a lot of stuff, and the earlier, the cheaper :-) Another thing you allude to is often stated as "Measure twice, cut once." I note that some (not many, I hope) larger companies are reluctant to put in the simulation/checking/review time up front, even though the aim of that time is to cut out a prototype spin and therefore finish the overall project faster. At one job I was actually ordered to rush a design (taking just a few days), with few simulations and checks, rather than take a bit longer and get it right the first time, because we "would fix any issues in the respin." I never heard the phrase "first-pass success" there, and attempts to introduce it were quashed.
I had someone say to me yesterday that "clock nets should be routed shortest, as a rule of thumb." I thought about this and it didn't quite ring true for me. I realize clocks need to be routed cleanly, with minimal angles and vias, and terminated. But I'd think address and data should be routed to the same lengths within a +/-0.150" tolerance if possible, and the clock should be equal or slightly longer. The thinking being that the clock edge should arrive just after data/address setup, while still allowing hold time before data/address changes for the next active clock edge. (The timing might also be controlled in the system, so skews should be minimized.) Is there such a rule of thumb? (6" of FR4 ~= 1 ns propagation delay.)
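Taking the questioner's own rule of thumb (6 inches of FR4 per nanosecond, i.e. about 167 ps per inch), the matching tolerance translates directly into skew; this sketch just does that arithmetic:

```python
# Convert trace-length mismatch to timing skew using the rule of thumb
# quoted above: ~6 inches of FR4 per 1 ns of flight time (~167 ps/inch).

PS_PER_INCH = 1000.0 / 6.0  # propagation delay, ps per inch (FR4 rule of thumb)

def skew_ps(mismatch_inches: float) -> float:
    """Timing skew caused by a given length mismatch between two traces."""
    return mismatch_inches * PS_PER_INCH

# A +/-0.150" matching tolerance corresponds to about +/-25 ps of skew.
print(f"{skew_ps(0.150):.1f} ps")  # 25.0 ps
```

Whether 25 ps matters depends entirely on the bus speed: it is negligible at tens of MHz and significant for DDR-class interfaces.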
Then, to make things murkier, I found these app notes; on page 16 of each, they contradict each other:
Based on both Technical Notes stating that the clock signal has "a shorter flight time" and the DDR3 TN being later (2009 vs. 2005, and likely copy-pasted from a more up-to-date document), I would guess that the earlier DDR2 document is in error ("sense inversion" errors like longer vs. shorter are not uncommon and can be difficult for a knowledgeable person to catch because they know what is meant). Presumably even with "the ability to prelaunch the address and control signals", there is a need for the faster signal to be given extra delay, so the clock path should be "slightly longer".
(I am not an EE; I base the above only on context.)
In a perfect world the clock would be rectangular, and if the clock line were routed the same way as the data, the clock edges would land where the data is stable. So the first orientation would be to keep all the traces equal. On the other hand, in the real world data might take a little time to become stable _after_ it has switched from 0 to 1 (or vice versa), and it will also remain stable a little while after it has started to change, so the middle of the valid window is displaced a bit "behind" the half-period point. That would argue for making the clock trace a little slower (longer). On the other hand, the receiving circuit needs a little time to recognize the clock edge and would capture the data a little later than the edge, so if both data and clock go from 0 to 1 in the same instant, what you would latch would normally be the 1. That would now be an argument for making the clock trace shorter than the data traces. So you end up where you started: keep them equal. I usually routed the clock trace parallel to the data but inserted a place where I could cut it, to enable a diversion connectable via two 0 Ω resistors. I never actually needed this delay, but it was a good feeling to have the option. OK, that was only in the realm of about 67 MHz, not really fast stuff ...
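The timing-window reasoning above can be put into a toy margin calculation. This is a simplified common-clock model with made-up numbers (a 15 ns period, i.e. roughly the 67 MHz mentioned, and illustrative clock-to-out, setup, and hold figures, not from any datasheet):

```python
# Toy setup/hold margin check for the timing-window argument above.
# skew = data flight time minus clock flight time (positive: data path longer).
# All numbers are illustrative, not taken from any real device.

def margins_ps(period_ps, t_co_ps, t_setup_ps, t_hold_ps,
               data_flight_ps, clk_flight_ps):
    """Return (setup_margin, hold_margin) in ps; negative means a violation."""
    skew = data_flight_ps - clk_flight_ps
    setup_margin = period_ps - t_co_ps - t_setup_ps - skew
    hold_margin = t_co_ps + skew - t_hold_ps
    return setup_margin, hold_margin

# ~67 MHz period, equal-length clock and data traces (skew = 0):
setup, hold = margins_ps(15000, 5000, 3000, 1000, 500, 500)
print(setup, hold)  # 7000 4000
```

Note that lengthening the clock trace (skew < 0) buys setup margin at the cost of hold margin, and shortening it does the reverse, which is exactly the trade-off the comment walks through before landing on "keep them equal."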
At the end of the day (how trite), we need to consider the manufacturability of the PCB: design constraints for vias, trace widths, trace bends, trace spacing, impedance matching, component placement guidelines, recommended component footprints, implementation of EMC metal covers, etc. Many of these can be implemented in design checking software, but as others have said, a rigorous design review process including manufacturing representation is a must. And don't forget a vibration analysis of the PCB mounting.
I remember in the distant past a fairly interesting package for this called Touchstone, made by a company called EEsof. The company was acquired by Hewlett-Packard, and the program has since morphed into Lisa and then into what's now called Advanced Design System (ADS). This was just the sort of thing to use if you wanted to actually lay out an antenna as part of your PCB. I remember how part of it worked: you could put a cutting pin into a plotter instead of a pen, then you'd lay your Rubylith sheet down on the plotter surface and cut it; then you could plaster the Rubylith down on a metal surface and etch away in an acid bath (well, look, this was pretty hot technology back in about 1986!). No, really, it could also do a complete analysis on a design as well as some synthesis; I just can't recall exactly how you entered the description. It might have been an AutoCAD DXF representation; I believe there were a number of options. I looked at it with envy but never got a chance to work with it, but considering you could get analysis at extremely high frequencies back then with the ease and precision we would now expect at low frequencies with SPICE, it was pretty far ahead of its time. I doubt this would be an affordable package to experiment with, though, even now.
That's a pretty good list of tools that may be involved in a PCB layout. Now, I'm not going to use any 3D CAD tools, that's for the mechanical engineers on my project. But I'll certainly use their output to shape PCBs and place connectors, and feed back a 3D model so they can check for fit.
I'd like to see RF tools on the list. Sure, most advanced PCB design programs let me route in ohms instead of mils, but it's still worth checking against a dedicated program. As well, if you're using drawn components (PCB filters, antennas, etc), that's way beyond the ability of most PCB layout programs to deal with.
What you provide is a good list of tools needed to design the schematic once you know what the requirements are.
Prior to that there is always the upfront system engineering effort which will look at things like
a) customer requirements
b) derived requirements
c) subsystem segmentation - what is implemented by HW, SW, FPGA etc
d) interface control documents etc
This typically requires tools such as
1) Requirements capture
2) Model based engineering tools - Sysml, UML etc
Then there is the modeling of the system performance to ensure it can meet the requirements; typically, tools like Matlab or Octave are used.
Once you have started a design you will need a product life cycle management tool to enable the storage of design and documentation with a proper engineering change request loop.
At the schematic level there is also timing-level simulation, which can be needed for the PCB as well. There are also design rule checkers for PCB layout; these can implement company rules along with design rules from the PCB manufacturer to ensure your design can be manufactured with a good yield.
It is an area close to my heart, as I am currently tasked with creating a new engineering team and capability, and it is neither a small nor an inexpensive task.
Just as in real estate it's location, location, location, in PCB it's layout, layout, layout.
1. Component location. The first objective is to keep signal traces as short as possible, and to keep digital and other noisy components away from your analog components. I have yet to see an auto-location program that seems to have even the smallest clue in this regard. That's because the permutations are computer-boggling, though relatively easy for the human mind to comprehend. It's been my experience that if components are carefully laid out to minimize trace lengths, an auto-routing program will take a second rather than overnight.
2. Signal traces. In sensitive analog circuits (they're all sensitive) keep digital signals and noisy supply rails well away. Don't even cross them on the other side of the board. Auto-routing will kill you here.
3. Power and ground traces. Improper layout of these can destroy everything you've gained in 1 and 2. Power supply traces have to be laid out to keep impedances at a minimum. (Some badly laid out boards can be saved by simply increasing the weight of the board's copper). Again, keep digital supply lines well away from analog circuitry. Grounding layout is crucial. Carefully examine your board and visualize where your heaviest ground currents are flowing, especially noisy ones. You may have to put cuts in your ground plane, if you're using one, to keep these currents away from your analog circuitry. It's amazing what you can see just probing the ground and supply traces with a 'scope on the average PCB.
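The "increase the weight of the board's copper" trick above works because DC resistance scales inversely with copper thickness. A quick estimate, using the standard copper resistivity and the usual ~34.8 µm of thickness per oz/ft² of copper weight (the trace dimensions below are illustrative), looks like this:

```python
# DC resistance of a PCB trace from its dimensions and copper weight.
RHO_CU = 1.68e-8        # ohm*m, copper resistivity at ~20 degC
M_PER_OZ = 34.8e-6      # metres of thickness per oz/ft^2 of copper weight

def trace_resistance_ohm(length_mm: float, width_mm: float,
                         oz_copper: float = 1.0) -> float:
    """R = rho * L / (w * t); doubling copper weight halves the resistance."""
    thickness_m = oz_copper * M_PER_OZ
    return RHO_CU * (length_mm * 1e-3) / ((width_mm * 1e-3) * thickness_m)

# A 100 mm x 1 mm supply trace in 1 oz copper is roughly 48 milliohms;
# going to 2 oz copper halves that to roughly 24 milliohms.
print(f"{trace_resistance_ohm(100, 1.0) * 1000:.1f} mohm")
print(f"{trace_resistance_ohm(100, 1.0, 2.0) * 1000:.1f} mohm")
```

Tens of milliohms sounds small, but at a few amps of noisy digital supply current it is millivolts of ground shift, which is exactly what a scope probe on the ground traces will show you.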
If you do a proper layout, your 24 bit A/D should consistently roll over its LSB at the 1/2 LSB point.
@mpk.nyy1: the "EASY" button. Because after the schematic is complete, you "push" the "autoplace" and "auto-route" buttons and it's DONE, right!!
I once had a digital video camera with enough controls to fly an aircraft -- the best one was actually called "The Easy Button" -- when you pressed that, it turned all of the other controls off and did everything automatically ... I LOVED that button :-)
An engineer knowledgeable in all the areas these tools involve!
It seems most engineers know some of these areas, but most don't know ALL of them (electronics, physics, mechanical, manufacturing, business, electrical standards, international standards, documentation standards, etc.).
We're doing well just to label an electronics engineer as digital, analog, or RF.
IF these conditions are met:
- All the BEST tools (features) are available from a single source
- This (mythical) single source will always be the best source for ALL of these tools.
- plus a few additional features (as mentioned by others here)
You MIGHT be able to have a tool set that can be operated by ONE engineer.
Assuming the supplier does a great job of creating a uniform user interface for all these tools.
(I don't think this is going to happen; technology changes too quickly.)
What is likely to happen:
- We will continue to use multiple vendors for our tools (competition.. it can be a "good" thing")
- There will be too many interfaces for a single person to stay knowledgeable about and productive with, both equating to a team requirement for nearly any level of complexity. This is NOT a good thing.
- and still no common data set for a design that can truly cover ALL the aspects that this large list of tools requires.
We can't even get past the use of the Gerber file format.
I think a new perspective is needed... and some new standards.
Schematics, HDLs, etc. all have serious limitations in describing a PCB design.
Best choice: quality people, with tools integrated with their processes.