A fair amount of misinformation, Jeff.
I don't buy the claim that flame sensors measure in microvolts. It's hard enough to measure millivolts in a noisy environment; measuring DC levels below a millivolt there is genuinely difficult. There are several ways to detect a flame electronically: you can use an optical sensor, you can measure the flame's conductivity (flame rectification, which actually senses a small DC current in the microamp range), or you can heat a thermocouple. None of these techniques requires reading sub-millivolt voltages.
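To put numbers on it: a flame-rectification sensor passes a few microamps of DC, which the controller reads as a voltage across a sense resistor. A back-of-the-envelope sketch, with both values assumed purely for illustration:

```python
# Flame rectification senses a small DC current, not a sub-millivolt DC
# voltage. Both values below are illustrative assumptions, not specs
# for any particular furnace.

FLAME_CURRENT_A = 3e-6       # flame rectification current, typically a few uA
SENSE_RESISTOR_OHM = 100e3   # assumed current-to-voltage sense resistor

v_sense = FLAME_CURRENT_A * SENSE_RESISTOR_OHM
print(f"sense voltage: {v_sense * 1e3:.0f} mV")  # 300 mV -- easy to measure
```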
Why should the frequency be so critical? What counts as "on the nose"? Sure, any AC motors in the system (such as the blower) want something close to 60 Hz, but the control electronics shouldn't care: everything is rectified to DC at that point. BTW, a cheap frequency meter is the "Kill-A-Watt"; it should be good enough to get you within a hertz or two. The only generators capable of holding a precise frequency are inverter units; any conventional generator will necessarily suffer some frequency droop as the load changes, because its output frequency is locked to engine RPM.
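If you'd rather roll your own meter than buy a Kill-A-Watt, here's a sketch of one way to estimate line frequency from a sampled waveform by counting zero crossings. The sample rate, noise level, and test frequency are all assumptions for the demo:

```python
import numpy as np

def estimate_frequency(samples: np.ndarray, sample_rate_hz: float) -> float:
    """Estimate the fundamental by averaging the spacing of
    positive-going zero crossings (assumes a reasonably clean signal;
    a real meter would low-pass filter the line waveform first)."""
    samples = samples - samples.mean()               # remove any DC offset
    neg = np.signbit(samples)
    crossings = np.flatnonzero(neg[:-1] & ~neg[1:])  # negative -> positive
    if len(crossings) < 2:
        raise ValueError("need at least two zero crossings")
    return sample_rate_hz / np.mean(np.diff(crossings))

# Synthetic test: a generator sagging to 58.7 Hz, with a little noise.
fs = 10_000.0
t = np.arange(0.0, 1.0, 1.0 / fs)
noise = 0.01 * np.random.default_rng(0).standard_normal(t.size)
wave = np.sin(2 * np.pi * 58.7 * t) + noise
print(f"estimated: {estimate_frequency(wave, fs):.2f} Hz")
```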
All AC generators are by definition "alternators", and most if not all have brushes; the brushes are needed to excite the field. Inverter-style generators go through the additional steps of rectifying the alternator's AC output, then converting it back to a sine- or square-wave output with solid-state switches. The advantage is that this decouples engine speed from output frequency, so an inverter unit can throttle down at light load and will use less gas and make less noise than a conventional generator. The square-wave output of the cheaper inverter units does generate a lot of high-frequency noise that can play havoc with electronics, but the high-end units such as Honda's put out a reasonably clean sine wave.
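To see why the square-wave units are noisy, compare the harmonic content of an ideal sine and an ideal square wave at the same fundamental. A rough sketch (the sample rate and the 19-harmonic cutoff are arbitrary choices for the demo):

```python
import numpy as np

def thd(signal: np.ndarray, sample_rate: float, fundamental_hz: float) -> float:
    """Harmonic amplitude (2nd through 19th) relative to the fundamental."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / sample_rate)
    def bin_at(f):
        return spectrum[np.argmin(np.abs(freqs - f))]
    fundamental = bin_at(fundamental_hz)
    harmonics = [bin_at(n * fundamental_hz) for n in range(2, 20)]
    return np.sqrt(np.sum(np.square(harmonics))) / fundamental

fs, f0 = 100_000.0, 60.0
t = np.arange(0.0, 1.0, 1.0 / fs)
sine = np.sin(2 * np.pi * f0 * t)
square = np.sign(sine)                  # idealized cheap-inverter output
print(f"sine THD:   {thd(sine, fs, f0):.1%}")    # essentially 0%
print(f"square THD: {thd(square, fs, f0):.1%}")  # ~46% from the first 18 harmonics alone
```

All that energy in the odd harmonics is exactly the high-frequency content that upsets sensitive electronics; a clean sine from a high-end unit puts essentially nothing there.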