To Expose or Not to Expose?
One of us, when a trainee, was mildly admonished for returning from vacation with a moderate tan. The implicit message was that sun exposure was harmful and tanned dermatologists bad role models. Much as a respiratory physician who smoked set a bad example, so did a tanned dermatologist. Whatever the ethical or professional foundations for this view, the science underpinning this conceit was always a little shakier than many imagined. In recent years there has been a resurgence of interest in this topic, and the paper in the current issue (p. 362–367) by Jensen et al. on skin cancer epidemiology in Denmark is an important addition to this subject. First, a little bit of background.
There are two main determinants of skin cancer worldwide: skin colour and ultraviolet radiation (UVR) exposure (1). If skin is deeply pigmented – whatever the ambient UVR – skin cancer is not a major public health problem. In those with pale skin, however, variation in UVR exposure is the main determinant of skin cancer epidemiology (2). This relation is most compelling for squamous cell carcinoma and actinic keratoses, but is also evident for basal cell carcinoma and melanoma (2). As a general rule, increases in sun exposure lead to increases in skin cancer, with an accompanying increase in morbidity and mortality. If one were interested only in skin disease, this perspective would perhaps be sufficient to guide clinical behaviour: if cutaneous health is our sole outcome, reduce sun exposure. This framing of the question is now suspected to be inadequate, or at least too narrow, because a number of epidemiological studies have reported associations between a range of non-cutaneous diseases (e.g. non-cutaneous cancer, cardiovascular disease, bone disease) and latitude (3, 4). The suspicion is that latitude is a proxy for UVR exposure and, in turn, vitamin D status.
At the global level, reviews of observational evidence suggest that reductions in UVR exposure may be harmful for overall health. This seems to hold even if we consider only the relation between vitamin D and bone disease, ignoring for the present much of the other epidemiological evidence linking low vitamin D levels with increased cancer incidence (3, 4). The question that currently concerns us is: what is the answer if we restrict attention to the subset of the world population with pale skin? Is there any biological reason to postulate a beneficial effect of UVR on overall health, despite knowing that increases in UVR are associated with increased skin cancer mortality?
Much genetic change in humans over the recent past (say, the last 50,000 years) has been driven by adaptation to environment and diet. As humans migrated out of Africa, lighter skin colour appears to have been selected for. Most authorities believe the driver was the need to synthesise adequate levels of vitamin D in parts of the world where ambient UVR is low in comparison with equatorial Africa. The problem of maintaining vitamin D levels in our ancestors was greatly increased by the move from hunter–gatherer status to settled agricultural communities: although cereal-based diets can support a higher population density, they are poor dietary sources of vitamin D, so vitamin D status becomes ever more dependent on sun exposure. Current studies of the genome all highlight strong recent selective pressure on pigmentation genes and on genes encoding enzymes concerned with processing food (such as the ability to digest milk beyond early childhood). This line of argument strongly suggests to these authors that vitamin D levels are critical for human health. It leads naturally to the work of Jensen and colleagues. So, what have they discovered?
Building on the excellent epidemiological infrastructure available in some of the Northern European countries, Jensen et al. have prospectively compared mortality over a ten-year period in those diagnosed with basal cell carcinoma (BCC) or squamous cell carcinoma (SCC). They find that those with BCC have a lower overall mortality, and those with SCC a higher mortality, than control populations. The bane of much observational epidemiology is confounding – you can only adjust for what you know about – but the authors provide a model example of how to present and interpret such data. That SCC is a marker of other illnesses and of systemic immunosuppression, and hence of overall mortality, is perhaps not too surprising, as studies in man have shown large differences in the immune status of those with and without SCC (5).
Whereas the authors attribute the increased mortality in the SCC group to SCC being a marker of immune status, they suggest that the reduced mortality in the BCC group reflects better vitamin D status. Are the current reviewers convinced the issue is decided? No, not yet. First, we know little about the relation between types of sun exposure and vitamin D status, and the magnitude of the effect appears modest. Second, much of the evidence (though not all) comes from observational rather than experimental studies (6, 7). And of course, in terms of providing clinical advice, even if we accept an increased role for vitamin D in normal health, oral vitamin D supplementation would appear a safer option, one that circumvents the need for harmful UVR exposure.
The present paper, however, highlights a general point that needs more attention. In dermatology, most of our patients are unlikely to die from their primary disease, and we often assume that our treatments, and even our advice, can be given in a vacuum, without consideration of other aspects of health. Dermatology, unlike skin biology, is more than skin deep.
Jonathan Rees and Lisa Naysmith