Fig. 2.1 Health advice given to pregnant women by the Health Department of Western Australia, 1995.

derive methods for calculating the times necessary to process canned foods at appropriate temperatures. For commercial sterilisation, the goal of thermal processing was to reduce the probability of survival and growth of microorganisms in a particular canned food to an acceptably low level. The starting point for the rationale of what is now termed ‘an appropriate level of protection’ (ALOP) was the work of Esty and Meyer (1922). They derived process performance criteria for the destruction of spores of proteolytic strains of Clost. botulinum in low-acid canned foods. It was proposed that requirements for sterilisation should be based on (i) the response to heating of the most heat-resistant spores found among strains of Clost. botulinum and (ii) a reduction in the spore population by a factor of 10¹¹–10¹² to ensure the desired level of product safety. For this purpose, heat inactivation trials were carried out on 109 different strains of the test species. The resultant performance criteria, based on the approach outlined above, have been applied over many years and have proved to be sound, with an adequate margin of safety (Pflug and Gould, 2000).
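The reduction factor of 10¹¹–10¹² corresponds to what thermal-processing texts commonly call the ‘12D’ or ‘botulinum cook’ concept. The sketch below shows the underlying arithmetic under two assumptions that are illustrative rather than taken from the chapter: spore inactivation is strictly log-linear, and the decimal reduction time (D-value) for proteolytic Clost. botulinum spores at 121.1 °C is about 0.21 min, a commonly quoted round figure.

```python
# Illustrative 12D calculation for a low-acid canned food.
# Assumptions (not from the chapter): log-linear spore inactivation and
# a D-value at the reference temperature of 121.1 degC of about 0.21 min.

import math

D_121 = 0.21            # min per decimal (90%) reduction, assumed value
target_reductions = 12  # the '12D' or 'botulinum cook' criterion

# Equivalent process time at 121.1 degC needed to achieve 12 decimal reductions
F0 = target_reductions * D_121
print(f"Equivalent lethality F0 = {F0:.2f} min at 121.1 degC")

# Probability that a single spore survives the process
initial_spores_per_can = 1.0   # assume one resistant spore per can
log_survivors = math.log10(initial_spores_per_can) - target_reductions
print(f"Expected survivors per can ~ 10^{log_survivors:.0f}")
# i.e. about one surviving spore per 10^12 cans under these assumptions
```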
Process performance criteria for heat pasteurisation of milk
The work of Enright et al. (1957) led to the development of process standards for controlling Cox. burnetii in milk. The heat treatments used initially for milk were designed to inactivate any tubercle bacilli present, and these were considered to be the most heat-resistant of the non-sporing pathogenic bacteria likely to occur in the product. The treatments were based on information from many studies on the heat resistance of both human and bovine strains (Mycobacterium tuberculosis and Myc. bovis respectively). In the USA, the heating regime adopted in 1924 for the conventional process was 142 °F (61.1 °C) to 145 °F (62.8 °C) for 30 min. In 1933 a heating regime was introduced for the high-temperature, short-time (HTST) process: 161 °F (71.7 °C) for 15 s.

In practice, Cox. burnetii appears to be slightly more heat-resistant than the tubercle bacilli and, following recognition that the organism, which causes Q fever in humans, could be transmitted by raw milk, it was necessary to check on the adequacy of existing pasteurisation processes for inactivating the organism. The work undertaken by Enright and colleagues (1956, 1957) fulfilled this requirement and, although no formal MRA was employed, elements of the MRA approach were implicit in their studies. These aspects are discussed below.

• The organism and the disease it causes: Cox. burnetii is a small, Gram-negative bacterium, originally classified as a rickettsia, that cannot be grown in axenic culture but can now be cultivated in vitro in various cell lines (Maurin and Raoult, 1999). Q fever is characterised by fever, chills and muscle pain, with occasional long-term complications. It was first described by Derrick (1937) and is known to occur worldwide. The organism infects many wild and domestic animals, which often remain asymptomatic. Domestic animals, such as cattle, sheep and goats, are considered the main sources of infection for humans (Maurin and Raoult, 1999) and, when shed in milk, Cox. burnetii is often present in relatively high numbers.

• Hazard identification: contact with infected animals was known to result in transmission of Cox. burnetii to people, with subsequent development of illness, and the likelihood of the organism contaminating raw milk was recognised. Early on, there was a lack of epidemiological evidence for transmission via milk, but this was suspected in several outbreaks and there was strong supporting evidence from a UK outbreak in 1967 (Brown et al., 1968). Thus, the hazard was the presence of Cox. burnetii in milk intended for human consumption.

• Dose response: there was no information on the dose response in humans, since challenge trials had not been carried out and epidemiological data were lacking.

• Exposure assessment: information relevant to this step in MRA was obtained by injecting guinea pigs to determine the presence and titre of Cox. burnetii in milk. The organism was found in 33% of 376 samples of raw milk from California, USA. ‘The maximum number of Cox. burnetii demonstrated in the milk of an infected dairy cow was the number of organisms contained in 10 000 infective guinea pig doses of Cox. burnetii per millilitre’ (Enright et al., 1957). Similar titres were found in milk that had been frozen and thawed. However, the study did not involve testing of all breeds of dairy cattle, and it is possible that even higher levels of shedding may have occurred in some breeds that were not examined. Nevertheless, it was concluded that the maximum level of consumer exposure would be represented by the highest infective dose demonstrated in this study and that the pasteurisation process should bring about thermal inactivation of such a number (Enright et al., 1957).

• Risk characterisation: the risk involved in consuming raw milk could not be estimated because of the absence of dose–response data. The data for the prevalence of contaminated milk, the maximum level of contamination and the fact that milk would have been consumed regularly by the majority of the population were probably implicit factors in an assumption that the risks associated with inadequate heat treatment were high. The studies of Enright et al. (1956, 1957) led to the conclusion that heating at ‘143 °F for 30 min was wholly inadequate to eliminate viable Cox. burnetii from whole, raw milk, while heating at 145 °F ensures elimination of these organisms with a high level of confidence’ (Enright et al., 1957). This led to the adoption of the higher temperature for vat pasteurisation in the USA. The work on the HTST process indicated that the recommended standard of 161 °F for 15 s was sufficient for total elimination.
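As a rough illustration of how the adequacy of a heat process can be checked against a worst-case exposure, the sketch below compares the decimal reductions delivered by the vat and HTST regimes with the highest titre reported above, using a simple D/z (log-linear) inactivation model. The D-value, z-value and 250 ml serving size are arbitrary placeholder figures chosen for the example, not data from Enright et al. (1956, 1957), so the printed numbers should not be read as reproducing their findings.

```python
# Illustrative comparison of vat and HTST pasteurisation using a simple
# D/z (log-linear) inactivation model. All microbial parameters below are
# assumed round numbers for the sketch, not data from Enright et al. (1957).

import math

# Assumed thermal-resistance parameters for Cox. burnetii in whole milk
D_ref = 1.0   # min: assumed decimal reduction time at the reference temperature
T_ref = 62.8  # degC (145 degF), reference temperature for D_ref
z = 4.5       # degC: assumed temperature rise giving a tenfold drop in D

def log_reductions(temp_c: float, time_min: float) -> float:
    """Decimal reductions delivered by holding at temp_c for time_min."""
    D_at_temp = D_ref * 10 ** (-(temp_c - T_ref) / z)
    return time_min / D_at_temp

# Maximum exposure reported in the study: 10^4 guinea pig infective doses
# per ml; assume a 250 ml serving of raw milk as the worst case.
max_dose_per_serving = 1e4 * 250
required_log_reduction = math.log10(max_dose_per_serving)   # about 6.4

vat = log_reductions(62.8, 30.0)         # 145 degF for 30 min
htst = log_reductions(71.7, 15.0 / 60)   # 161 degF for 15 s

print(f"Log10 reductions needed for the worst-case serving: {required_log_reduction:.1f}")
print(f"Vat process (62.8 degC, 30 min):  {vat:.1f} log10 reductions")
print(f"HTST process (71.7 degC, 15 s):   {htst:.1f} log10 reductions")
```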
2.3.4 Microbiological examination of food
Microbiological testing, as a means of assessing whether a food product is hazardous due to the presence of pathogens, is of relatively recent origin. It became the vogue only after Robert Koch developed a method for growing microorganisms in pure culture and foodborne organisms capable of causing spoilage or disease were recognised and could be enumerated (Hartman, 1997). Over the last 80 years or so, many different methods have been devised for detecting pathogenic organisms and/or their toxins. Even from the beginning of that period, statutory microbiological requirements relating to food safety were established in many countries and were based on the testing of prepared foods for the organisms or toxins of concern.

A disadvantage was that routine examination of foods for a multiplicity of pathogens and toxins was impractical in most laboratories and an alternative approach was needed. This led to widespread use of microbial groups or species that were more readily detectable in foods and considered to be indicative of conditions in which the food had been exposed to contamination with pathogens, or been under-processed. Enumeration of the organisms was even used as a measure of the possible growth of pathogens in a food, should these have been present. The bacteria in question were termed ‘indicator organisms’ and they have value for indirect assessment of both microbiological safety and quality of foods. The use of indicator organisms flourished, especially in the period 1960–1980. During that time, numerous procedures for enumerating bacterial indicators were described (e.g. American Public Health Association, 1966; United States Food and Drug Administration, 1972). Clearly, the main objective of their use was to reveal conditions of food handling that implied a potential hazard. Furthermore, some indicators were proposed as a possible index, rather than a mere indication, of faecal contamination in food (Mossel, 1982).
Setting criteria
The traditional approach to controlling food safety has been based on education and training of personnel, inspection of production facilities and operations, and microbiological testing of the finished product. Testing of the product is usually an integral part of the overall control programme, and the perceived risk of foodborne illness from the presence of a particular pathogen is reflected in the limit values that are set for the organism in a specific type of food. Where possible, these criteria are based on epidemiological data and are a reflection of the minimum dose expected to cause illness. Table 2.2 gives some values that are essentially derived from analyses of outbreaks of foodborne disease. The data show a clear parallel between limit values and the minimum dose associated with human disease. In general, infective organisms such as Salmonella should be absent from food because very low numbers are known to be capable of causing illness (D'Aoust, 1989). On the other hand, toxigenic bacteria, such as Staphylococcus aureus, may be acceptable at levels that are well below those causing food to become hazardous. With foodborne intoxications caused by Staph. aureus, the numbers present in the food usually exceed 10⁷ cfu (colony-forming units) per g (Bergdoll, 1989).

Table 2.2 Correlation between minimum dose considered to cause disease and criteria set for end-products

Pathogenic organism             Minimum dose            Probability of infection     General end-product
                                considered to           from exposure to             criteria used (c)
                                cause disease (a)       1 organism (b)

Infectious organisms
Shigella                        1                       1.0 × 10⁻³                   Absence/25 gram
Salmonella                      1                       2.3 × 10⁻³                   Absence/25 gram
Campylobacter                   1–10                    7.0 × 10⁻³                   Absence/25 gram
Listeria monocytogenes          > 10³                                                < 100/gram
Vibrio parahaemolyticus         > 10⁴                                                < 10³/gram

Toxico-infectious organisms
Clostridium perfringens         > 10⁶                                                < 10⁵–10⁶/gram
Bacillus cereus                 > 10⁶                                                < 10⁵–10⁶/gram

Organisms causing intoxication
Staphylococcus aureus           > 10⁶                                                < 10⁵–10⁶/gram

(a) Based on analysis of foodborne disease outbreaks (presented in Doyle, 1989).
(b) Rose and Gerba (1991).
(c) Criteria for pathogenic organisms are not yet well established and they may differ from country to country. The validity of the criteria starts mostly after production and ends at the time of consumption.
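The single-organism infection probabilities in the third column of Table 2.2 can be turned into dose-dependent risk estimates with a dose-response model. The sketch below uses the exponential model, P(d) = 1 − exp(−r·d), a common choice in quantitative MRA; reading r directly from the table, and the model choice itself, are assumptions made for this illustration rather than anything asserted by the chapter or by Rose and Gerba (1991).

```python
# Illustrative use of the single-organism infection probabilities in Table 2.2
# with an exponential dose-response model, P(d) = 1 - exp(-r * d).
# The model choice and the reading of r straight from the table are
# assumptions made for this sketch, not part of the chapter's argument.

import math

# r = probability of infection from exposure to one organism (Table 2.2)
r_values = {
    "Shigella": 1.0e-3,
    "Salmonella": 2.3e-3,
    "Campylobacter": 7.0e-3,
}

def p_infection(r: float, dose: float) -> float:
    """Exponential dose-response: probability of infection at a given dose."""
    return 1.0 - math.exp(-r * dose)

for organism, r in r_values.items():
    for dose in (1, 10, 100, 1000):
        print(f"{organism:13s} dose {dose:>5d}: P(infection) = {p_infection(r, dose):.3f}")
# Even with these low per-organism probabilities, doses of a few hundred
# organisms give an appreciable probability of infection, which is consistent
# with the 'absence in 25 g' criteria applied to such pathogens.
```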
Shortcomings of microbiological testing
Leaving aside questions regarding the accuracy and reproducibility of the methods used, it is clear that microbiological testing of food is of limited value without a sound sampling plan. To overcome the problem, a book on food sampling was produced by the International Commission on Microbiological Specifications for Foods (ICMSF, 1974). The book gives details of statistically based sampling plans for the microbiological examination of different types of food. Although the book gives an excellent account of the various sampling plans, it also reveals the limitation of testing for pathogenic organisms that may be infrequent, low in number and unevenly distributed throughout the test batch, especially when complete absence is the only acceptable result. Thus, testing to ensure that the target pathogen is absent from the batch requires uneconomically large numbers of samples, with no guarantee that absence of the organism can be established.
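The difficulty of demonstrating absence by testing can be quantified with elementary probability. The sketch below assumes a presence/absence (two-class) sampling plan in which sample units are independent and any contaminated unit is detected with certainty; the prevalence figures and the 95% confidence level are illustrative choices, not values taken from ICMSF (1974).

```python
# Illustrative calculation of how many sample units are needed to detect
# low-prevalence contamination in a presence/absence (two-class) plan.
# Assumes independent units and perfect detection within a contaminated
# unit; the prevalence figures are arbitrary, not taken from ICMSF (1974).

import math

def p_accept(prevalence: float, n_units: int) -> float:
    """Probability that all n sampled units test negative (batch accepted)."""
    return (1.0 - prevalence) ** n_units

def units_needed(prevalence: float, confidence: float = 0.95) -> int:
    """Smallest n giving at least the stated probability of detection."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - prevalence))

for prevalence in (0.10, 0.01, 0.001):
    n = units_needed(prevalence)
    print(f"prevalence {prevalence:>6.3f}: {n:>5d} units for 95% detection; "
          f"with 10 units, P(missed) = {p_accept(prevalence, 10):.2f}")
# At a contamination rate of 0.1% of units, roughly 3000 units would be
# required for 95% confidence of detection - the 'uneconomically large
# numbers of samples' referred to above.
```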
2.3.5 Introduction of GMP and HACCP

GMP
One of the first quality assurance systems developed by the food industry was that involving the application of GMP, as a supplement to end-product testing. GMP has been used for many years to ensure the microbiological safety and quality of food, and it provides a framework for hygienic food production. The establishment of GMP is the outcome of long practical experience and it includes attention to environmental conditions in the food plant, e.g. requirements for plant layout, hygienic design of equipment and control of operational procedures. The GMP concept is largely subjective and qualitative in its benefits. It has no direct relationship with the safety status of the product. For these reasons, the concept has been extended by introducing the HACCP system, which seeks, among other things, to avoid reliance on microbiological testing of the end-product as a means of controlling food safety. Such testing may fail to distinguish between safe and unsafe batches of food and is both time-consuming and relatively costly.