Patterns of health and disease change over time
Defining terms is always a problem in the social sciences, particularly in the field of health. There is a raft of popular terms concerning health and ill-health. For instance, people talk about being healthy, fit, poorly, ‘one degree under’, low, below par, diseased, even ‘sick as a parrot’. Amazingly, we often seem to know what we are each talking about! Additionally, health professionals may use the same terms in a different way from lay people. To try to minimise confusion, I have used the common medical definition of health as a “state of being well, without disease”.
According to the World Health Organisation (WHO), health is ‘a state of complete physical, mental and social well-being and not merely the absence of disease or infirmity’. In this view we are suffering from ill-health when we fall short of ‘complete well-being’. In comparison, ordinary people do not view health in such absolute terms but rather in relative terms. Hence, as Dubos and Pines (1980) state, “good health may mean different things to an astronaut and to a fashion model, to a lumberjack and to a member of the stock exchange. Their ways of life require different levels of physical activity; their food requirements and stresses vary, and they are not equally vulnerable to all diseases”. Furthermore, Jones (1994) observed that definitions based simply on not feeling well, for example women who feel period pains, do not cover everything and are very crude: ‘If we don’t see a doctor, we don’t get the official stamp.’
People’s own definitions of health and illness are varied. In a famous study, Herzlich (1973) asked 80 middle-class men and women in France for their definitions of health; their answers fell into three categories:
1) Absence of disease
2) Good constitution
3) Condition of equilibrium.
Although it seems impossible to construct a universally applicable definition of health, and therefore to measure health with much validity or reliability, practical demands ensure that attempts are made. In practice, health is measured using two indicators: morbidity (sickness) and mortality (death). Since morbidities are difficult to measure and quantify because of the heavily subjective element in identifying them, mortality is more commonly used in studies of health differences within and between societies.
In humanity’s long social evolution, as some diseases have assumed a greater role, so others have become less important. Sociological historians such as Powles (1973) and McKeown (1979) suggest that human disease history can be chronologically categorised into three periods: 1) the pre-agricultural, in which mortality arose from environmental and safety hazards; 2) the agricultural, dominated by infectious diseases, including airborne, waterborne and foodborne diseases such as dysentery, and vector-borne diseases carried by rats and mosquitoes, such as plague and malaria; and 3) the modern industrial period. Each period is characterised by different ways of life and different disease problems.
1) Pre-agricultural society (hunting-gathering)
For the greater part of humanity’s past, before the invention of agriculture, societies survived on hunting, fishing and gathering food. During those many thousands of years, life is thought to have been short but generally healthy. Adult death was probably often linked to the search for food supplies and competition over them. Relatively common causes of death are therefore thought to include homicide, tribal wars, hunting accidents, death by exposure and malnutrition.
McKeown states that “food deficiency limited numbers and prejudiced health in two ways. It led to attempts to restrict population size through reduction of the number of births and by killing or neglecting unwanted individuals” (McKeown, 1988, p40). However, the deliberate control of numbers was not sufficient to prevent food shortage, and deaths from starvation, malnutrition and parasitic diseases largely associated with poor nutrition were common. Infection was therefore important, although it was not the predominant cause of death that it became in the historical period.
Changes arose with the development of cereal cultivation, and populations became more settled, so that more people could be fed. However, settlement brought people into closer contact, and heavy reliance on cereal left little in the diet to build up resistance, so that infectious disease became a major cause of death.
The transition from agricultural to industrial society is more complicated, and several explanations have been offered for the decline in deaths from infectious disease during this transition:
1) A decline in the virulence of organisms, which became less lethal over time.
2) A reduction in exposure to infectious organisms: changes in domestic living and improved housing meant less contact through which infections could be passed on.
3) Genetically induced resistance of humans to infections (a Darwinian explanation).
4) Improved lifestyles increasing resistance: rising standards of living and better nutrition, rather than reliance on just one food group.
5) Medical intervention: antibiotics, hospitals and drugs.
2) Agricultural society
Some 10,000 years ago, humanity began the cultivation and domestication of plants and animals. This led to settled communities capable of supporting larger groupings, and so created the conditions required for the spread of infectious diseases (i.e. the diseases caused by micro-organisms – ‘germs’ – and spread via air, water, touch or a carrier such as insects).
In general there was a link between disease levels and food supplies. When food supplies and levels of nutrition improved, disease diminished; when they deteriorated, infectious diseases advanced. In Europe, a major infectious killer for centuries was the plague, spread by the fleas carried by black rats. In the particularly virulent phase of the disease, during the so-called Black Death of the late 1340s, it is thought to have killed somewhere between one third and one half of the population of England and Western Europe.
McKeown also emphasises that under agriculture there were two important changes in the types of food. ‘Hunter-gatherers lived on meat, fish, fruit and vegetables’, and although the proportions of the different foods varied from one population to another, on average about two thirds of the diet came from plant sources. ‘They were not able to have cereals and they had almost no dairy products. Under agriculture man’s diet also consisted of cheese, milk and eggs’ (McKeown, 1988, p46-47). Hence the additional food provided by agriculture made it possible for numbers to increase; but the increase was not effectively limited by social restraints (control of fertility and deliberate killing), and populations expanded to the size at which food supplies became again marginal.
3) Modern industrial society
The modern industrial phase has been characterised by a decline in mortality from infectious diseases. Especially important has been the decline (though not necessarily the elimination) of former killers such as tuberculosis, measles, scarlet fever and whooping cough.
By about the mid-twentieth century, infectious diseases had been overtaken as a cause of death by the types of diseases which are now characteristic of all modern Western societies. The main killers now are cancers, heart disease and strokes. It may be that such diseases now appear commonly in Western societies because previously they were seldom seen, as people died prematurely from other causes. However, it is more likely that they are mainly new diseases, caused by conditions of life associated with industrialisation.
In support of this view is the finding that, in the Third World, those closest to a Western lifestyle are most likely to suffer Western diseases. According to the medical historian Thomas McKeown, the main factors contributing to this include changes in diet, especially a reduction in fibre and an increase in fat, sugar and salt; increased use of alcohol, tobacco and illicit drugs; and a reduction in physical exercise (McKeown, 1988, p154). These behaviours often increase, he argues, as Third World populations grow and become more densely distributed and people move from a rural to an urban environment.
Recognising that degenerative diseases are now the major killers, it is often argued that health policy should focus on combating the risk factors associated with chronic illnesses, such as tobacco smoking, excessive alcohol intake, insufficient exercise and a high-fat diet. If such a policy were successful, it is claimed, vigorous good health could be extended until well into old age.
When it is asked why health has improved in the West, many people would see the advance of science and medicine as the main cause. As science came increasingly to understand the human body (so the reasoning would go), it developed more effective medicines, immunisations and anaesthetics. It would also be assumed that science produced better doctors and better surgeons, and led to the construction of more and better hospitals. All these improvements, it would be argued, have led to the decline in the death rate.
However, this view finds little support among medical historians. According to McKeown, the fall in the death rate, which began in the 18th century, was largely due to three non-medical factors: improvements in nutrition, improvements in public hygiene, and the limitation of births.
The earliest and most important factor, in McKeown’s view, was the improvement in people’s diets which came from advances in agriculture spreading across Europe from about the end of the 17th century onwards. The introduction of new crops such as potatoes and maize, and of improved techniques such as new forms of crop rotation and manuring, increased the availability of food and raised the population’s nutritional levels.
A second crucial factor, progressively effective from about 1870 in England and Wales, and somewhat later in other countries, was improvement in public hygiene, such as a cleaner, piped water supply and better sewage disposal. These public health measures reduced risks of exposure to disease, particularly to waterborne diseases such as typhoid and cholera. Improvements in the milk supply, with the introduction of sterilisation and bottling, were also important.
A third reason for the transformation in health, according to McKeown, was the fall in the birth rate, and in family size, beginning among the middle classes in the 1870s. If the birth rate had not fallen, McKeown calculates, the population of England and Wales would now be some 140 million rather than 50 million! Without this limitation on population, he thinks, the other advances would soon have been reversed.
Importantly, however, and contrary to popular belief, McKeown doubts that either medical care or, specifically, immunisation had any significant effect on mortality before the 20th century. Indeed, he concludes that ‘they contributed little to the reduction of deaths from infectious disease before 1935’ (McKeown, 1979, p77). In other words, although in the later part of the twentieth century immunisation, surgery and medical treatment have had a very welcome impact upon suffering, morbidity and mortality, their impact has been relatively late.
The most significant improvements in mortality have had a great deal to do with preventive medicine (in the form of public hygiene measures) and only relatively little, and then only recently, to do with the modern medical profession and curative intervention. McKeown claims that his comparative studies of Sweden, France, Ireland and Hungary show a similar pattern, and comparable conclusions have been drawn in the USA.
In McKeown’s conclusion, then, whilst some forms of medical intervention (such as routine treatments like the repair of fractures) have clearly proven successful, it is mistaken to assume that the undoubted improvements in the morbidity and mortality rates of the last three centuries have been mainly due to advances in medical knowledge and its application. Medicine’s role in contemporary health has thus been significantly overestimated, by lay people and medics alike.
So it is that critics of the medical profession, such as Nicky Hart, question its claims. Nevertheless, the profession has successfully persuaded public and politicians alike that it is its high standards and achievements in medical care (rather than, for instance, public hygiene) which have ushered in longer, healthier lives, and on which our personal health continues to depend (N. Hart, 1985, p1).
Hence, the profession has been able to achieve a largely uncritical acceptance of the view that the high-technology hospitals, surgeries and clinics characteristic of modern medicine are essential for a high level of health in a population. Critics disagree: they claim that, even now, we remain largely unaware of the value of any clinical treatment, because little treatment has ever been carefully evaluated. Because outcomes of treatment in the NHS have not been systematically documented and analysed, we often cannot be sure which techniques and interventions have been successful and which have not. Thus, we cannot be at all sure of medicine’s contribution to the nation’s health even today.
In comparison, Lupton (1994) argues that Western societies in the late twentieth century are characterised by people’s increasing disillusionment with scientific medicine. Yet, paradoxically, there is also an increasing dependence upon biomedicine to provide the answers to social as well as medical problems.