Sorry to have to take a break yesterday, but I couldn't find exactly what I was looking for online to illustrate my points. So, to get started I will use the example, as I interpreted it, that Bill asked about in a PM. It is a good place to start and build from.
As I read it, he asked: if the resistance in the primary were reduced, wouldn't that also reduce the output of the secondary (spark)? The answer would be no. In fact, if the resistance in the primary were reduced, the current would increase, which means the power input would increase as well. This assumes that the voltage from the output of the CDI remains constant.
Remember this in all of your thinking about what is happening in a coil: power-in equals power-out is a firm rule to go by, though you must expect minor efficiency losses too.
So, in this case, if the resistance in the primary were reduced, either by removing the ballast resistor or by using a coil with a lower resistance, then the power input will be greater, meaning the power out to the plug will be greater and the spark should be hotter. Don't forget that power is voltage times amperage.
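To make that concrete, here is a minimal sketch of the arithmetic, using I = V/R and P = I*V. The voltage and resistance figures are made-up illustrative numbers, not specs from any real coil or CDI:

```python
# Sketch: effect of lowering primary resistance on input power,
# assuming the CDI output voltage stays constant.
# All numbers are hypothetical, for illustration only.

def primary_power(voltage, resistance):
    """Return (current, power) for a primary winding: I = V/R, P = I*V."""
    current = voltage / resistance
    power = current * voltage
    return current, power

cdi_volts = 200.0   # hypothetical CDI output voltage
stock_ohms = 1.0    # hypothetical primary resistance with ballast resistor
low_ohms = 0.5      # same primary with the ballast resistor removed

i_stock, p_stock = primary_power(cdi_volts, stock_ohms)
i_low, p_low = primary_power(cdi_volts, low_ohms)

print(f"stock: {i_stock:.0f} A, {p_stock:.0f} W")
print(f"low R: {i_low:.0f} A, {p_low:.0f} W")
```

Halving the resistance doubles the current, and since power is current times voltage, the power fed into the coil doubles too.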
The voltage at the spark plug is determined solely by the conditions the spark is trying to jump, or what we call the "soup" (refer to my terminology page for a complete definition of those conditions). Once sufficient voltage has been developed to jump that gap through the soup, it doesn't develop any higher. Under low engine power conditions this spark voltage requirement will be low, and that means that the current will be higher (remember that power remains constant, so if voltage is lower, current is higher). Current is what creates the heat necessary to ignite the soup.
For this next step remember that the input to the primary from the stock CDI basically remains constant no matter what conditions are being asked for at the plug. This is true only for "normal" CDIs that come from the factory. They work fine for most street riding conditions. A lot of aftermarket CDIs have features that monitor system requirement changes and can vary their output to the coil. We're not talking about them right now, though.
Now, when the rpms increase and/or the rider asks for more power, the composition of the soup changes, requiring more voltage to jump the gap. When that voltage increases, the current decreases. With less current, the heat from the spark is less, and eventually there won't be enough heat to ignite the mixture. Misfires happen at this point. If you ride the bike according to the manufacturer's recommendations, this situation will rarely happen, though, so stock ignition systems are fine.
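The constant-power tradeoff at the plug can be sketched the same way. Assuming a roughly fixed power budget from the coil (the wattage figure below is a made-up example), raising the voltage demanded by the soup leaves less current to make heat, since I = P/E:

```python
# Sketch: with coil output power roughly fixed, a higher voltage demand
# at the gap means less current available to heat the soup (I = P / E).
# The 50 W power budget is a hypothetical illustrative number.

def spark_current(power_watts, spark_volts):
    """Current available at the plug for a fixed power budget."""
    return power_watts / spark_volts

power = 50.0  # hypothetical fixed power delivered to the plug

for kv in (5, 10, 20, 40):  # rising voltage demand as the soup gets harder
    amps = spark_current(power, kv * 1000)
    print(f"{kv:>2} kV needed -> {amps * 1000:.2f} mA of spark current")
```

Each doubling of the required voltage halves the available current, and with it the heat in the spark, which is why misfires eventually appear.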
The major manufacturers have engineered their bikes' ignition systems to perform adequately under the conditions the bike was intended for, and tens of thousands of owners out there have been happy with the way they come from the factory. But, how many on this forum are happy to leave well enough alone and ride sanely? NONE! That's why we are always looking for that holy grail of coils that will give us flawless performance at all speeds and conditions forever and ever!
One of the things I hope to get across in all of this is just why that holy grail coil is pretty much a myth.
I presented a hypothetical case above and it needs to be well understood before we go on. The way all of those factors play together should be second nature to your thinking, so please go read, re-read and re-read it all again until it is. I put quite a bit of stuff in such a short space. To help you remember, just think of eating pie (P=IE). Power (watts) equals current (amps) times voltage (volts); then remember how varying any one of the three affects the other two. This is the basis of everything I will discuss from here.
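The PIE rule itself can be captured in a few lines. This is just a generic P=IE solver with arbitrary example numbers, to show how fixing any two of the three quantities pins down the third:

```python
# "Eating PIE": P = I * E. Given any two of power, current, and voltage,
# the third is determined. Example values are arbitrary.

def solve_pie(power=None, current=None, voltage=None):
    """Fill in the one missing value of (power, current, voltage)."""
    if power is None:
        return current * voltage, current, voltage
    if current is None:
        return power, power / voltage, voltage
    if voltage is None:
        return power, current, power / current
    raise ValueError("leave exactly one value as None")

print(solve_pie(current=2.0, voltage=12.0))  # (24.0, 2.0, 12.0)
print(solve_pie(power=24.0, voltage=6.0))    # halving voltage doubles current
print(solve_pie(power=24.0, current=8.0))    # (24.0, 8.0, 3.0)
```

Play with the numbers: hold power fixed and you'll see the voltage/current seesaw from the spark discussion above; hold voltage fixed and lowering resistance-driven current drops the power.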