To put it simply:
All transformers are designed to work at a specified ambient temperature for maximum efficiency, and all transformers also generate heat. You stated that ambient was 40 deg C, or 104 deg F.
All this means is that to get the maximum rated current out of the transformer, the ambient must be no more than 40 deg C (104 deg F). At 30 deg C, for example, you
might get a little more current out of the transformer and will definitely generate less heat.
In your case, California desert heat. Items to consider:
1. It reached 122 deg F there one day.
2. Type of transformer.
3. Location of the transformer: inside or outside?
4. Will the transformer be shaded by something or in direct sunlight?
5. Will the transformer be subjected to reflected light or heat?
6. Electrical wiring must be sized based on items 1 through 5.
If you look at the specs for your transformer, it should have a chart telling you how much you must derate (take away from) its capacity based on the temperature seen by the transformer.
Wire is rated the same way; look in the NEC (NFPA 70) codebook for ambient-temperature correction factors and you will see how much you must derate the current capacity of the wire.
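To show what "reading the chart" amounts to, here is a minimal sketch of a derating lookup. The breakpoints and factors below are placeholders I made up for illustration; the real numbers must come from your transformer's datasheet and the NEC, not from this table.

```python
# Sketch of ambient-temperature derating for a transformer.
# The breakpoints and factors below are PLACEHOLDERS -- pull the
# real chart from your transformer's datasheet and the NEC (NFPA 70).

DERATING_TABLE = [  # (max ambient deg F, usable fraction of nameplate)
    (104, 1.00),    # at or below the rated 40 deg C ambient: full capacity
    (122, 0.80),    # placeholder factor
    (140, 0.65),    # placeholder factor
    (158, 0.50),    # placeholder factor
]

def derated_kva(nameplate_kva, ambient_f):
    """Return the usable kVA at the given ambient temperature (deg F)."""
    for max_f, factor in DERATING_TABLE:
        if ambient_f <= max_f:
            return nameplate_kva * factor
    raise ValueError(f"ambient {ambient_f} F is beyond the table range")

print(derated_kva(1.0, 147))  # with these placeholder factors: 0.5
```

The same pattern applies to wire ampacity: multiply the table ampacity by the correction factor for your actual ambient.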
Just a guess here, and I do mean guess:
In the California desert, 122 degrees + 25 degrees for radiant heat = 147 degrees F.
At that temperature your 1 kVA transformer might be derated to roughly 1/2 kVA,
so I would go with a 3 kVA just to be safe.
As I said, this is a guess on my part, and someone can easily prove me wrong with the numbers since my design books are at home. This is what I had to do on a project in west Tennessee 7 years ago. The PE questioned me about why I chose such a large transformer, and I showed him my calculations and rules. He was happy.
james