
Can heat acclimation really improve endurance performance?

In a recent CyclingTips article, sport scientist and cycling coach Dr Jason Boynton discussed how to optimize indoor training by manipulating environmental conditions. In that article he left us with a bit of a teaser, saying that heat acclimating for the purpose of improving performance is more complex than it might seem.

In this article, Dr Boynton explores the often-overlooked caveats of heat acclimating endurance athletes for the purpose of improving performance, especially when subsequent riding is going to happen in temperate conditions.


In the beginning

It was the summer of 2011 and I was out on an afternoon ride in the green rolling farmland just outside of Madison, Wisconsin. I remember it being hot, humid, and sunny – as tends to be the case for that time of year in Wisconsin. It was one of those days where wearing a jersey and bibs felt like too much kit to have on. So you can imagine my surprise when I came across a friend (now a former national cyclocross masters champion) dressed head-to-toe in cold weather gear — a balaclava, arm warmers, and full-length tights.

We stopped for a moment to have a chat and I asked him the obvious question: “What’s with all the winter gear in the middle of summer?” His reply was another question: “Haven’t you seen the recent article in Cyclocross Magazine about how heat acclimation can improve cycling performance?”

“No.” I was dumbfounded.

At the time I was finishing my Master’s degree in Exercise Physiology under Professor Stephen McGregor and had been coaching cyclists for 3-4 years. Having missed the article, I was slightly annoyed with myself for letting this scientific finding slip by me before it made its way to a layman’s publication. But at the same time I was excited to dive into research exploring a new ergogenic approach (i.e. something that could enhance physical performance).

However, upon my initial read of the actual research article by Lorenzo et al., and after pondering its claim that heat acclimation increases endurance performance, something didn’t quite sit right with me.

To be very clear, it wasn’t that the authors of the article were being dishonest about their findings or anything else malicious in nature. Their conclusions were supported by their results, and a number of the authors on the paper were trusted veteran researchers in their field, including Christopher Minson from the University of Oregon.

What I questioned about the study was whether heat acclimation conducted by an athlete in a real-world context (i.e. acclimatization) would produce the same increase in performance as the methods utilized in this laboratory experiment (more details on that in a bit). This question perplexed me so much that it actually influenced the path I chose for my PhD research.

Fast forward to now, almost a decade later, and researchers have since argued both for and against this study’s conclusion. But important nuance is now starting to emerge on the question of whether heat acclimation actually improves endurance performance in temperate conditions.

Scenes from an Edith Cowan University study on heat acclimation.

Heat stress and acclimation

In my previous CyclingTips article I gave a brief description of what happens to cyclists exercising in the heat. During time-trial-like efforts in hot conditions, power will decrease while heart rate and perceived exertion will increase. Similarly, during matched power outputs, heart rate (i.e. cardiovascular strain) is higher in hot conditions than in cool conditions. This relationship between temperature, power output, and cardiovascular strain will be important later in this article.

Fortunately, if an athlete is exposed regularly enough to heat stress, at a sufficient intensity and duration (i.e. thermal impulse), they will adapt to handle it better. In other words, at a given hot temperature their time-trial performance will improve, and for a given wattage their heart rate will decrease while the duration they can exercise for will increase (see the graphic below).

So in a sense, heat acclimation is similar to how the body adapts to the stress from exercise. However, unlike adaptations to exercise, the lion’s share of heat acclimation can occur very quickly — in some cases in less than seven days, but with more certainty in 10-14 days. Additionally, adaptations to heat are ‘staggered’, with key cardiovascular (e.g. increased blood plasma volume and reduced heart rate) and body temperature adaptations (e.g. reduced core temperature) reaching their peaks sooner than those of sweat response and performance improvements.

Athletes can achieve heat acclimation either through active (i.e. when exercising) or passive (e.g. in a sauna or hot bath) methods. However, heat exposure during exercise offers the advantage of an increased thermal impulse as exercise induces a more rapid increase in core body temperature than being sedentary. Additionally, it has been argued that the method of heat acclimation should reflect the intended outcome.

For the endurance athlete this would imply heat exposure for the purpose of acclimation is best incorporated with exercise. However, the intensity at which the athlete exercises is also an important consideration for acclimation protocols in order to foster proper adaptations and reduce maladaptation.

A very plausible conclusion

It might initially seem odd that regular exposure to heat, and the subsequent adaptations, could potentially improve endurance performance. But if you think about it, exposing endurance athletes to environmental stress with the intent of improving performance is nothing new, especially when you consider that, prior to Lorenzo et al., we had been subjecting elite athletes to hypoxic conditions (i.e. altitude training) with the same intent for decades.

The utilization of an environmental stress as an ergogenic aid can be categorized by the desired outcome. Either 1) the intent is to acclimate an athlete to an environmental condition they will be exposed to during competition, and therefore decrease the negative effects of that environmental stress; or 2) environmental stress is added to training stress with the aim of increasing performance in “normal” conditions (as we are focusing on in this article).

Of these two desired outcomes, generally speaking, the evidence for efficacy is better for the former than for the latter. Indeed, as alluded to above, it is well established in the scientific literature that acclimating individuals with regular exposures to heat can improve their exercise performance in hot conditions.

One of the initial observations that lends plausibility to heat acclimation as an ergogenic aid in temperate conditions deals with one of the main premises in my last article. This is that optimal performance occurs in temperatures cooler than those we often compete in (i.e. 10-17°C versus the 22°C of a nice spring or summer race). This implies that thermal stress is a cause of fatigue, even when it isn’t necessarily hot out. Therefore, heat acclimation could further help athletes deal with the thermal stress they experience under “normal” conditions and lead to improvements in their performance.

Another justification for the hypothesis that heat acclimation improves endurance performance deals with the adaptations heat exposure promotes. If you are physiologically savvy, you may have noticed that the adaptations to heat exposure mentioned above are also adaptations that result from endurance training (see graphic below). Some of these adaptations, such as increases in blood plasma volume, are potentially stimulated to a greater extent by heat than by exercise alone. This reality is one of the key points emphasized in a scientific review by Corbett et al. in 2014 that promoted the concept of heat acclimation as a method of improving performance in cooler conditions.

More importantly, in addition to the physiological argument, the review by Corbett et al. highlighted a number of experimental papers (eight in total, Lorenzo et al. included) that demonstrated evidence for heat acclimation improving performance in cooler conditions.

Based on these arguments and evidence, to the casual observer heat acclimation as an ergogenic aid for endurance athletes could seem like an open and shut case. However, as with many things, the devil is in the details …

The narrative is countered

In 2016 an experimental paper published by Keiser et al. claimed that, while heat acclimation did indeed improve performance in the heat (which was already well established in the literature), heat acclimation did not improve endurance performance in temperate conditions.

The findings of this paper directly countered the findings of Lorenzo et al. and the argument conveyed in Corbett et al.’s narrative review that heat acclimation increases performance in temperate conditions. But how could this contradiction have occurred? Wasn’t heat acclimation as an ergogenic aid now “established” science published in peer-reviewed journals?

Well, firstly, there is never such a thing as “established” or irrefutable scientific findings. All research conclusions are subject to criticism and all research methods can be improved upon (at least theoretically; there can be technological limitations, of course). But to figure out what is going on in this specific case, we first need to take a closer look at the research that supports heat acclimation as an ergogenic aid in temperate conditions. Luckily, Corbett’s review provides a pretty comprehensive list of the papers supporting this claim, up until circa 2014.

What can be noted upon an initial glance at this literature is that many of the studies supporting heat acclimation as an ergogenic aid do not incorporate a control group into their study design. What’s the issue with this? Well, there are several, but two primary concerns are worth noting.

Firstly, for a study that utilizes a training intervention with both exercise and heat, there is no real way to tell whether performance increased due to the heat acclimation, the exercise, or the combination of both. Ideally you would have a control group that did the same training intervention as the experimental group without the incorporation of heat, and then compare the outcomes of the two groups. Secondly, without a control group there is no way to account for psychological factors, such as the placebo effect, or research participants pushing themselves harder than normal during post-intervention testing because they want to ‘appease’ the researchers.

However, currently there is a substantial cohort of studies that support heat acclimation as an ergogenic aid in temperate conditions and utilize both experimental (hot) and control (cool) groups in their research design. This includes the study performed by Lorenzo et al.

But shouldn’t these multiple studies be more than enough to counter the findings of the (then) one study by Keiser and co? To answer this question we need to take a closer look at Lorenzo et al.’s methods. In particular, how they prescribed exercise intensity during their training intervention compared to Keiser — and real life.

A methods breakdown

During the Lorenzo et al. study, athletes exercised for 10 days in hot or cool conditions, performing 2x 45 minutes at 50% of their power at VO2max. To the lay-cyclist this intensity is around ‘zone 2’ or endurance pace. This low intensity was intentionally chosen as it was less likely to cause a post-intervention increase in performance in trained cyclists. Consequently, this increased the likelihood that any change in performance after the intervention was due primarily to the temperature and not the training.

The kicker for this study was that 50% of power at VO2max was determined during a graded exercise test conducted under the same environmental conditions for both groups. This means the external loads/power outputs during training were ‘matched’ between the temperature groups. However, if you remember from above, and from my previous article, when power outputs are matched between hot and cool conditions, heart rate (i.e. cardiovascular strain) and rating of perceived exertion end up higher in hot conditions.
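To make the ‘matched power’ problem concrete, here is a minimal sketch with made-up numbers (the ~8% reduction in power at VO2max in the heat is an assumption for illustration, not a figure from either study):

```python
# Hypothetical illustration only: these numbers are not from Lorenzo et al.
p_vo2max_cool = 400.0                  # watts at VO2max, measured in cool conditions
training_power = 0.5 * p_vo2max_cool   # 200 W prescribed to BOTH groups

# Assume (for illustration) achievable power at VO2max drops ~8% in the heat:
p_vo2max_hot = 0.92 * p_vo2max_cool    # 368 W

# The same 200 W is a higher relative intensity for the hot group:
print(training_power / p_vo2max_cool)  # 0.50 -> 50% of cool-condition VO2max power
print(training_power / p_vo2max_hot)   # ~0.54 -> ~54% of hot-condition VO2max power
```

The hot group, in other words, is not just hotter; it is also training at a higher fraction of what it can actually produce in its environment.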

There are now two issues with this experimental design. Scientifically speaking, you have introduced another variable/stimulus (i.e. increased cardiovascular strain) to your experimental group beyond just the increase in heat. This is not optimal: ideally you want to reduce the differences between groups down to just the factor you are looking at, in this case environmental temperature.

Additionally, practically speaking, you are now examining an experimental condition that is unlikely to occur in the real world. This is because when athletes self-pace their exercise in the heat they will reduce their power output. And while their heart rate and perceived exertion might still be higher in the hot conditions (especially after a long duration of exercise), the difference in cardiovascular strain between hot and cool conditions would be nowhere near as dramatic as those in the training conditions for Lorenzo et al.’s study.

These are precisely the doubts I had with this study upon reading it on that summer day almost a decade ago. Was it really heat acclimation that increased temperate performance for these athletes, or was it simply a matter of increased cardiovascular stress stimulating superior training adaptations? Because if it were simply a case of the latter, there is one really easy way a cyclist can increase their heart rate and perceived exertion during training without incorporating heat … push harder on those pedals.

Fortunately, this is one of the questions that Keiser et al. aimed to answer in their study. To do this they utilized a very similar study design to Lorenzo et al. (10 days of 2x 45 minutes at 50% of power at VO2max) with one key difference in how training load was prescribed: Keiser et al. conducted their graded exercise testing to determine training intensity (i.e. 50% of power at VO2max) in the conditions the participants would be training in (i.e. either hot or cool).

As you can probably guess, based on the overarching theme I’ve been expressing about exercising in the heat, the power output one can achieve at VO2max is reduced in hot conditions. This change in study design meant training intensity was set relative to the temperature condition each individual was training in, which effectively negated much of the increased cardiovascular strain of training in the heat that was not accounted for in Lorenzo et al.’s study.
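Continuing the same hypothetical numbers from the sketch above (the ~8% drop in the heat is still an assumption, not a study value), Keiser et al.’s approach would look something like this:

```python
# Same hypothetical numbers as before; the ~8% drop in the heat is assumed.
p_vo2max_cool = 400.0   # watts at VO2max, tested in the cool condition
p_vo2max_hot = 368.0    # watts at VO2max, tested in the hot condition

# Each group trains at 50% of the value measured in its OWN condition:
training_power_cool = 0.5 * p_vo2max_cool   # 200 W for the cool group
training_power_hot = 0.5 * p_vo2max_hot     # 184 W for the hot group

# Relative intensity is now 50% for both groups, so any difference in
# post-intervention temperate performance is easier to attribute to the
# heat exposure itself rather than to extra cardiovascular strain.
```

The design choice here is the whole point: by anchoring each group’s wattage to a test in its own condition, the hot group gives up some absolute power but keeps its relative (internal) load comparable to the cool group’s.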

As already stated above, Keiser et al. demonstrated no difference in temperate performance outcomes after training in hot versus cool conditions. However, importantly, they did observe that heat acclimation occurred in the hot group despite that group’s reduced power output and similar cardiovascular load compared to the cooler control group. This essentially demonstrated that improvements in temperate performance after training in the heat observed by Lorenzo et al. were more likely due to the corresponding increase in cardiovascular stress during training rather than actual heat acclimation.

Additional thoughts

One could be inclined to think that, in light of these criticisms I have levelled at Lorenzo et al., it was a poorly designed study and/or the researchers have somehow misled us. However, I will emphatically state that this is not the case at all. At the time, this was a groundbreaking study. It inspired numerous studies in its wake (including my PhD research), and each of these studies has helped us get closer to the truth of how environmental temperature affects humans.

Additionally, it should be noted that Lorenzo et al. did actually try to account for the differences in relative stress between the hot and cool intervention temperatures when calculating the 50% of power at VO2max training intensity. They attempted to do this by increasing the participants’ core temperature in a hot water bath prior to the testing session. Unfortunately, the graded exercise test came directly after a lactate threshold test, and core temperature differences between groups could have normalized during that earlier exercise.

Also, it wasn’t until relatively recently that we gained a better understanding that increases in skin temperature (i.e. what happens when you are in a hot environment), not just rises in core temperature, have effects on heart rate, stroke volume, and potentially subsequent performance outcomes.

Practical applications

There are a number of ways we can apply the findings of Lorenzo et al., Keiser et al., and similar studies.

First, it is very useful to know that repeated exposure to heat during training does not decrease performance in cooler conditions, as prior to Lorenzo et al. this was still considered a plausible outcome. This is valuable to know for the athlete that is heat acclimating in a climate that frequently varies between temperate and hot competitive conditions (i.e. my beloved home state of Wisconsin). Or, if you are training and competing in a temperate climate while you are heat acclimating for upcoming races in hotter climates. Or, if that race you thought was going to be hot suddenly ends up being cool. Indeed, these were some of the practical concerns and questions Lorenzo et al. had when designing their research.

There are a number of interesting uses for heat during training that can potentially trace at least some of their roots back to Lorenzo et al. One of these practical outcomes is for injured athletes and is the focus of an upcoming paper from Dr. Sébastien Racinais and colleagues. Namely, athletes that can only exert limited force on a given limb due to injury can potentially increase the cardiovascular strain of that limited work by training in a hot environment. This would effectively increase training stress for the injured athlete, helping to attenuate decreases in fitness (i.e. detraining).

There is also some research exploring the potential of heat acclimation as an ergogenic aid for athletes competing at high altitudes. But that … that is a topic for another article.

Conclusions

So, does heat acclimation really increase endurance performance? As with many things, the answer is quite nuanced. As we have discussed, the likelihood is very high that repeated exposure to heat during exercise improves endurance performance in subsequent heat exposures. This makes it good practice to heat acclimate prior to hot competitive events.

However, based on the current literature, it is doubtful that heat acclimation, as an ergogenic aid in temperate conditions, is more beneficial for increasing performance than simply riding your bike harder during your training sessions.

About the author

Jason Boynton, Ph.D. is a sport scientist and cycling coach. His PhD research at Edith Cowan University, under the supervision of Associate Professor Chris Abbiss, investigated the effects of environmental temperature on high-intensity interval training in endurance athletes. Wisconsin-raised, he is loving the cycling and expat life in beautiful Western Australia. You can learn more about Jason and his coaching at his website, his Facebook page and on Instagram.
