I have a question: does anyone know why the radio does not give me 100 W on all bands?
The only bands where I can reach 100 W are 6m, 10m, and 15m. The antennas all show a good SWR.
Example (measuring output with an LP100A, tuning with an AT2K, 20-watt tune, 40 meters):
SWR of 1.75: output = 16.9 watts
SWR of 1.50: output = 17.7 watts
SWR of 1.25: output = 18.5 watts
SWR of 1.00: output = 19.5/19.6 watts
So as you can see, SWR can greatly affect output.
Jim, K6QE
Most amateurs' shacks I've been in are not set up to do this, and many commercial power supplies need their voltage tweaked slightly higher to accomplish the task. A Powerpole distribution box, emergency battery cutovers, undersized power cable, extra wire coiled up behind the rig/power supply, etc. all contribute to voltage drop once you start drawing significant current.
Best to go direct from the power supply to the Flex, using wire of the proper AWG, trimmed to the appropriate length.
Or you can tweak the output of your supply up a little.
This question has been asked countless times, in this and other fora for other brands. Here is one thread from this community, including the response from Flex:
https://community.flexradio.com/flexradio/topics/6500power
Gerald K5SDR, Employee
Official Response
Let me first address how we calibrate our radios at the factory. The entire process is automated and cannot be bypassed by the factory operator or test technician. It is not possible for a radio to complete the test process without PA calibration over its linear power range on all bands. After the automated calibration and 24-hour burn-in, we perform a 100% QA using the SmartSDR GUI and manually run the power up to confirm 100 W at the center of each band before shipment.
All power calibration is performed with the voltage set to 13.8 V under full load at the DC terminals on the back of the radio. We use a 40 dB precision power attenuator connected to a MiniCircuits PWR4GHS power sensor to precisely measure the power during both the automated test and the final QA. The measurement accuracy of this test station is significantly better than virtually all ham measurement equipment on the market, including the popular Bird "slug" wattmeters.
MOSFETs used in HF power amplifiers are effectively variable resistors whose resistance increases as they heat. As the resistance goes up with heat, the power decreases (P = E^2/R). If the voltage at the input terminals decreases, the power goes down proportional to the square of the voltage.
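As a quick sketch of that square-law relationship (illustrative numbers only, not Flex's test procedure; the 100 W / 13.8 V baseline is assumed from the calibration description above):

```python
# Output power scales with the square of the supply voltage (P = E^2 / R,
# with the effective load resistance held constant).
def power_at_voltage(nominal_power_w, nominal_volts, actual_volts):
    """Scale nominal output power by the square of the voltage ratio."""
    return nominal_power_w * (actual_volts / nominal_volts) ** 2

# Hypothetical example: a radio calibrated for 100 W at 13.8 V, but the
# cabling drops the voltage at the radio's terminals to 11.8 V.
print(round(power_at_voltage(100, 13.8, 11.8), 1))  # -> 73.1 (watts)
```

Even a 2 V drop at the terminals costs roughly a quarter of the rated output, which is why measuring the voltage at the radio's DC connector under full load matters.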
Not all dummy loads are 50 ohms. Not all ham dummy loads are still (or were ever) good. Cables don't have 0 dB loss. Not all connectors are good. DC connector crimps and connections are not 0 ohms. What is the absolute accuracy of the power meter at the frequency of measurement?
Let's say we have a peak current of 20 A and a series resistance of 0.1 ohm in the DC cable and connector. P = I^2 * R = 20^2 * 0.1 = 40 W. That means the cable and connector will dissipate 40 W before you get to the back of the radio. The voltage drop at 20 A will be 20 * 0.1 = 2 V. It all makes a difference.
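The DC-cable arithmetic above can be checked in a couple of lines (same numbers as the post: 20 A peak through 0.1 ohm of cable and connector resistance):

```python
# Worked example from the post: power dissipated in the DC cabling and the
# resulting voltage drop, for 20 A through 0.1 ohm total series resistance.
current_a = 20.0
resistance_ohm = 0.1

dissipated_w = current_a ** 2 * resistance_ohm  # P = I^2 * R
voltage_drop_v = current_a * resistance_ohm     # V = I * R

print(dissipated_w, voltage_drop_v)  # -> 40.0 2.0
```

So 40 W is burned in the cable and connectors, and the radio sees 2 V less than the supply is putting out.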
Now let's look at the effect of a simple change in the impedance of the RF load. Let's say that the load is 55 ohms instead of 50 ohms. 100 W is 70.7 Vrms into 50 ohms. Since P = E^2/R, then P = 70.7^2/55 = 90.88 W. A 5 ohm (10%) increase in the load drops the power to ~91 W. 55 ohms is an SWR of 1.1:1. You can experience that SWR on a cable, connectors, and ham shack dummy load fairly commonly. Drop the impedance to 45 ohms at the same voltage and you get 111 W. Same 1.1:1 SWR!
Finally, in the 90.88 W example, the power output difference from 100 W is only 0.4 dB, which is totally imperceptible on the other end of the contact.
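The load-impedance numbers above can be reproduced directly from P = E^2/R (same figures as the post; this is just the arithmetic, not a model of the radio):

```python
import math

# 100 W into 50 ohms corresponds to E = sqrt(P * R) = ~70.7 Vrms.
v_rms = math.sqrt(100 * 50)

def power_into_load(v_rms, r_load_ohm):
    """Power delivered at a fixed RMS voltage into a given load (P = E^2 / R)."""
    return v_rms ** 2 / r_load_ohm

p_55 = power_into_load(v_rms, 55)       # ~90.9 W into a 55-ohm load
p_45 = power_into_load(v_rms, 45)       # ~111.1 W into a 45-ohm load
db_drop = 10 * math.log10(100 / p_55)   # shortfall from 100 W, in dB

print(round(p_55, 1), round(p_45, 1), round(db_drop, 2))  # -> 90.9 111.1 0.41
```

Both loads present the same 1.1:1 SWR, yet one reads "low" and the other reads "high" on a wattmeter, and the 0.41 dB difference is far below what the other station can detect.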
73,
Gerald