Lately I have started gathering my own data on the variables that can influence
the charging factor (the factor that accounts for charging inefficiencies).
For example:
a) Beginning of winter, with an indoor temperature of 19.5 °C (an excellent temperature for NiMH testing).
b) The DUT is a brand-new 9 V NiMH PP3 battery (lowest internal resistance), with a tested capacity of 280 mAh.
c) A brand-new battery charger with accurate delivery of the charging current at 35 mA, and an actual delivery of 28 mAh per hour.
Under these three ideal conditions, the lower charging factor of 1.2 seems applicable.
Mathematical estimate: 280 mAh × 1.2 / 35 mA = 9.6 h, i.e. 9 h 36 min.
The actual charging process required 9 h 52 min 42 s.
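The estimate above can be sketched as a small calculation; the capacity, current, and factor are the figures from this test, while the helper names are my own:

```python
def charge_time_hours(capacity_mah, current_ma, charging_factor):
    """Estimated charge time: capacity x factor / charging current."""
    return capacity_mah * charging_factor / current_ma

def fmt(hours):
    """Format fractional hours as 'Hh MMm SSs'."""
    total_s = round(hours * 3600)
    h, rem = divmod(total_s, 3600)
    m, s = divmod(rem, 60)
    return f"{h}h {m:02d}m {s:02d}s"

estimate = charge_time_hours(280, 35, 1.2)   # 9.6 h
print(fmt(estimate))                         # 9h 36m 00s

# The measured charge took 9 h 52 min 42 s, about 17 min longer:
actual = 9 + 52 / 60 + 42 / 3600
print(fmt(actual - estimate))                # 0h 16m 42s
```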
Because it is impossible for us to control all three variables (all of which are known to cause charging inefficiencies), the industry selected
a charging factor of 1.4 as the more fault-tolerant choice.
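A quick check of the margin the industry factor buys, using the same figures as the test above (the variable names are my own):

```python
capacity_mah = 280
current_ma = 35

# Time budget with the industry's fault-tolerant factor of 1.4:
t_budget = capacity_mah * 1.4 / current_ma   # 11.2 h

# Observed charge time in this near-ideal test, as fractional hours:
t_actual = 9 + 52 / 60 + 42 / 3600           # ~9.878 h

margin_min = (t_budget - t_actual) * 60
print(f"{t_budget:.1f} h budget, {margin_min:.0f} min of headroom")
# 11.2 h budget, 79 min of headroom
```

Even in this near-ideal run, the 1.2 estimate was already overrun by about 17 minutes, while the 1.4 budget still leaves over an hour of headroom.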
On the other hand, I do believe that charger designers and NiMH battery makers use the
lower charging factor of 1.2 as a benchmark in their own testing labs.
