This article discusses the public health aspects of wastewater treatment and reuse in Kuwait.

M.Y. Abdulraheem

Regional Organization for the Protection of the Marine Environment

Introduction:

The World Bank estimates that the need for water will continue to increase throughout the 21st century, while the quantity of water available for various uses, as well as its quality, will continue to decrease, reaching even more critical levels than those existing today in the least developed countries (Tables 1 and 2).

Variations

Water Quantity-Quality Nexus

With such a grim forecast, and knowing that irrigation accounts for about 70 per cent of total water demand in most Arab countries, the UN and most other concerned international organizations advocate the reuse of treated sewage waters for irrigation. Such an option is considered to be economically viable and to carry low health risks once quality control measures and public health regimes are applied.

Health risks have been central to the controversy over the reuse of treated sewage effluents for agriculture, especially for crops intended for human consumption in a raw form. The challenge is to ensure that both the concerned authorities and the public are aware of the potential risks associated with the reuse of treated sewage waters for irrigation in developing countries, and of the economically feasible alternatives for reducing this risk.

The decision on whether to use a given water resource for irrigation is usually based on the chemical properties of the water. Water quality parameters for irrigation purposes are normally presented in range form since plants can endure a relatively high degree of variability in these parameters.

Also, the nature of the soil can greatly influence the ultimate effect of water quality on the plants. For example, alkaline soils can tolerate relatively high levels of metals in irrigation waters since the mobility of metals in soil is pH-dependent. Moreover, retention of pathogens in the soil is dependent on the physical properties of the soil.

Sandy soils act as filters for parasite eggs, protozoa, bacteria and viruses, whereas soils with high silt contents can accumulate pathogens much faster, thus increasing the risk of exposure. The method of irrigation can also influence the distribution of pathogens in the treated sewage waters, and thus influence the health risk associated with the use of such effluents for irrigation.

Finally, the type of crops irrigated by treated effluents, and the mode of human consumption of these crops are also important. Vegetables eaten raw represent the greatest health risk to the public when irrigated with reclaimed sewage effluents.

In Kuwait, it is generally accepted that treated sewage effluents represent a valuable water resource, especially when one considers that 90 per cent of the water is distilled water in origin, produced at a cost of over US$10/m3. The three main sewage treatment plants currently produce around 250,000 m3 of treated effluents per day, with a design quality (Biochemical Oxygen Demand, BOD) of 10 mg/l.

These figures would normally convince city planners of the need to fully utilize such a water resource. However, the amount of treated sewage water used for afforestation and alfalfa production has not exceeded 15 per cent in the past few years. In symposia and workshops like this one, the Ministry of Health is often criticized for being reluctant to approve the use of treated sewage effluents for irrigation.

However, in reality there has never been disagreement, in principle, on utilizing this water resource; rather, there has been a debate over where, how and what types of crops are to be irrigated with it.

Public Health Concerns over the Biological Content of Treated Wastewater:

Waterborne infectious agents of particular significance are of intestinal origin and are discharged in human and animal feces. Since different human communities have different prevalence and incidence rates and different distributions of illnesses, the occurrence and concentration of infectious agents in raw wastewater will vary among communities.

Other factors that influence the presence and concentration of infectious agents are the actual source of the wastewater, the level of treatment provided, and the tolerance of the individual organisms to environmental factors. These organisms can be broadly classified into four groups according to size: viruses, bacteria, protozoa, and helminths.

Another method of classifying microbes has been developed for purposes of water reuse management and treatment. Using this method, particular pathogens are associated with paths of transmission and sanitation measures, rather than just pathogenicity or virulence as in the traditional method.

The important epidemiological criteria are latency of the pathogens, length of the infection cycle, persistence outside the host, and median infective dose. This method of classification therefore assigns Salmonella spp. a longer infection cycle than their short latency period for causing illness would suggest, because it takes into account the fact that these bacteria readily multiply outside the host on starchy crops.

In general, health concerns over the use of treated wastewaters are in proportion to the degree of human contact with the water, the quality of the effluent, and the reliability of the treatment processes.

For most of the uses of reclaimed water, biological agents pose the greatest health risks. There is epidemiological evidence indicating that the reuse of municipal wastewater, particularly for irrigation of food crops, has resulted in the transmission of disease. Most epidemiological studies show that direct contact increases the probability of encountering gastrointestinal and urogenital infections.

Transmission of most pathogenic diseases from wastewater requires a fecal-oral route. There is, however, very little evidence indicating a potential health hazard in handling disinfected wastewater for irrigation on agricultural land. The majority of documented disease outbreaks have been the result of bacterial or parasitic contamination. In all cases, either raw sewage or undisinfected effluent was the source of the irrigation water.

The conditions that are necessary to produce infectious disease in a population include:

1. The disease agent must be present;

2. It must be present in sufficient concentration to be infectious; and

3. Susceptible individuals must come into contact with the agent in a manner that causes infection and disease.

Table 3 shows the levels of typical indicator organisms expected in sewage effluents and sludges. A more comprehensive list of pathogens (and associated health effects) present in sewage is shown in Table 4.

Typical Numbers of Microorganisms

Important Viruses, Bacteria, Protozoans

i. Viruses:

Viruses are the smallest of the microbes in sewage, and are the most difficult to assess in terms of health risk. There are over 114 different known human enteric viruses.

The role of water in the overall incidence of viral diseases may be limited. Other modes of transmission, such as personal contact, probably are responsible for the great majority of viral diseases. However, any excreted virus capable of producing infection when ingested could be transmissible by inadequately treated wastewater.

The most important are the enteroviruses (polio, echo, and coxsackie), rotaviruses, Norwalk agent and hepatitis A virus. The hepatitis A virus is the most frequently reported and documented to be transmitted by contaminated waters.

Information concerning the occurrence of viral diseases resulting from the reuse of treated wastewater is still limited. This may be attributed to several factors. Firstly, until recently, virus detection methods have not been sensitive enough to accurately detect low concentrations of viruses in large volumes of water.

Secondly, enteric virus infections are often not readily apparent, thus making it difficult to establish the endemicity of such infections. Thirdly, the apparently mild nature of most enteric virus infections precludes reporting by the patient or the physician. Fourthly, the morbidity of enteroviral infections may not become obvious for several months or years.

And fifthly, once introduced into a population, person-to-person contact becomes a major mode of transmission of an enteric virus, thereby obscuring the role of water in its transmission. However, public health concern over new viral diseases (e.g., AIDS) has led to greater interest in carrying out studies on the health risk resulting from the use of treated effluents in agriculture.

Viruses have other noteworthy properties. Many carry negative surface charges, which either prevent their retention by soils, giving them access to groundwater, or promote their adsorption to the soil, extending their environmental survival. Viruses depend on host cells for growth and reproduction, since they have no cell structure of their own.

Once passed into the environment with the host's excreta, all viruses degenerate, although some stay viable for months, since their protein coats impart some resistance to environmental stress. The number of viruses excreted into the environment is dependent on the socio-economic level of the community. Low hygienic standards result in high viral excretion and high acquired life-long immunity among those who survive such illnesses.

The water and food (including fish) route of viral infection is less prominent in hepatitis A, poliomyelitis, and gastroenteritis transmissions than previously thought. However, viruses account for about twice as many gastroenteritis infections as do bacteria; the main organisms being rotavirus, Norwalk-type virus, and echovirus.

ii. Bacteria:

The bacterial pathogens of primary concern are listed in Table 4. Campylobacter jejuni has recently been added as a common cause of gastroenteritis, particularly in children. Legionella has also been added to the list of waterborne pathogens, although it is not found in excreta; its normal mode of transmission is inhalation of aerosols.

Legionella is, however, found wherever there is water, including showerheads, cooling towers, hot water heaters, and of course rivers and lakes. According to the current state of knowledge, Salmonella are the bacterial pathogens of prime importance for wastewater used for irrigation.

On the other hand, even wastewater treated to a level below 500 fecal coliforms/100 ml may present a risk of gastroenteritis, which can be due to a number of different causes: viral agents, bacteria other than Salmonella or Shigella, endotoxin, or some other agent already on the grass or field being irrigated. The infective dose of enteric bacterial pathogens ranges from about 100 (for Shigella) to 10⁵ (for Salmonella) to 10⁸ (for Escherichia coli or Vibrio cholerae).

Fecal coliforms have been the primary indicator bacteria; however, work has indicated that Escherichia coli and enterococci are better indicators for gastroenteritis than the microbes previously used for that purpose. Thus, the recreational water criterion of 200 fecal coliforms/100 ml has now been changed to an ambient water quality criterion of 126/100 ml for E. coli or 33/100 ml for enterococci in freshwater and 35/100 ml for enterococci in marine waters.

Other studies have expanded knowledge of these indicator organisms and their association with health effects still further. Fecal coliform and fecal streptococci densities of 500/100 ml were able to predict nearly 70 per cent of gastrointestinal cases.

Similar statistically significant increases in gastrointestinal symptoms were found with total coliform densities above 3000/ml. This study was conducted to determine the health effects of reclaimed water used to spray-irrigate public parks used for golfing, picnicking, sports, and other activities.

iii. Protozoans:

The protozoans of major public health importance are Entamoeba histolytica and Giardia lamblia (Table 4). Infections are transmitted through environmentally resistant cysts which develop from the vegetative, environmentally nonresistant form (trophozoite) in the lower parts of the large intestine and are shed in excreta.

The median infective dose is low, on the order of 10 to 100 cysts; however, even single cysts may lead to disease. Infected persons may be either symptomatic or asymptomatic; in either case each gram of feces contains large numbers of cysts. Multiplication of the pathogen is only possible inside the host. Any acquired immunity does not appear to be fully protective or lasting.

iv. Helminths:

Table 4 also lists the helminths of pathogenic importance. Helminthic infection rates are high primarily in developing countries, where they are closely linked with low economic status and high contamination. Those most commonly observed in developed nations include Ascaris lumbricoides, Taenia saginata, Trichuris trichiura, Enterobius vermicularis, and Necator americanus (hookworm).

The helminthic eggs are particularly resistant to environmental stress. Helminth infections confer no host immunity; the only exception is schistosomiasis (bilharzia). Many helminthic infections require an intermediate host, which when eliminated eradicates the infection in communities.

In general, the disease organisms responsible for epidemics in the past are still present in today's sewage. Good sanitary engineering practice results in control rather than total eradication of the disease agents, as can be seen in Table 3. The numbers of pathogens in sewage have markedly declined over recent decades as a result of disease control with antibiotics and improved sanitary conditions and practices.

Finally, disease can be transmitted to humans either directly by contact, ingestion, or inhalation of infectious agents in reclaimed water, or indirectly by contact with objects previously contaminated by the reclaimed water.

Once an infectious agent is present in the community producing the wastewater, and hence in the wastewater from that community, and treatment is not sufficient to reduce its levels to safe levels, then should a person come into contact with the effluent while the agents are present in sufficient numbers, illness may occur.

Whether illness occurs depends on a series of complex interrelationships between the host and the infectious agents. These include the numbers of the invading microorganism (dose), the numbers of organisms necessary to initiate infection (infective dose), the organism’s ability to cause disease (pathogenicity), the degree to which the microorganism can cause disease (virulence), and the relative susceptibility of the host.

Studies have shown that 10 or fewer Giardia lamblia and as few as 10 Shigella dysenteriae can cause illness, whereas it may require as many as 1000 Vibrio cholerae or 10,000 Salmonella typhi to initiate disease. No more than 20 Entamoeba histolytica cysts constitute an infective dose, and very low numbers of viruses may be able to initiate disease in humans.

For most organisms, infections occur at lower doses than are required to cause disease. Infection is defined as an immunological response to pathogens by a host without manifestation of clinical signs of the disease. Enterovirus levels in municipal wastewater fluctuate widely. Viruses shed from an infected individual commonly range from 1000 to 100,000 infective units per gram of feces. The average enteric virus density is about 500 units/100 ml.

An examination of published data on exposure of wastewater treatment plant operators and maintenance personnel has revealed that the occurrence of clinical disease associated with occupational exposure to the water is rarely reported. Thus, it can be assumed that agricultural workers exposed to reclaimed sewage waters run a low risk of contracting infectious diseases.

However, one would anticipate more effort in carrying out in-depth epidemiological studies of workers exposed to treated wastewaters in view of the intense public concern over AIDS. Although HIV may potentially be present in sewage (Table 4), it is known that the virus does not survive in water.

Concern over odours and spread of droplets over a wider area during spray irrigation using treated wastewater has been the subject of several studies. During spray irrigation, the amount of water that is aerosolized can vary from less than 0.1 per cent to almost 2 per cent.

A possible direct means of human infection by aerosols is through inhalation. Infection or disease can be contracted indirectly by aerosols deposited on surfaces such as food, vegetation, and clothing. The infective dose of many pathogens is lower for respiratory tract infections than for infections via the gastrointestinal tract; thus, inhalation may be a more likely route for disease transmission than either contact or ingestion.

Bacteria and viruses in aerosols remain viable and travel farther with increased wind velocity, increased relative humidity, lower temperature, and darkness. Coliforms may be carried 90-130 m with winds of 1.5 m/sec, 300-400 m with winds of 5 m/sec and 1000 m or more with stronger winds.

Sewage treatment plants may be a source of microbial aerosols. Fannin (1980) measured microbial aerosol densities 150 m downwind before and after installation of a sewage treatment plant; the data indicated a significant increase in microbial densities. In general, virus numbers decrease substantially once they are in the environment; spray irrigation in particular shocks viruses into about a one-half log loss of virus particles, and subsequent die-off is about one log every 40 sec.
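
To see how these figures combine, the following sketch estimates the overall viral log reduction for an aerosol travelling downwind, assuming the one-half log aerosolization loss and the roughly one-log-per-40-seconds die-off quoted above; the wind speeds and distances are illustrative values, not measurements from the cited studies.

```python
# Illustrative sketch only: combines the quoted aerosolization shock (~0.5 log)
# with the quoted die-off (~1 log per 40 s of travel) to estimate the reduction
# of viable viruses at a given downwind distance and wind speed.

def aerosol_log_reduction(distance_m: float, wind_m_per_s: float,
                          shock_log: float = 0.5,
                          dieoff_log_per_s: float = 1.0 / 40.0) -> float:
    """Estimated log10 reduction of viable viruses in a travelling aerosol."""
    travel_time_s = distance_m / wind_m_per_s
    return shock_log + dieoff_log_per_s * travel_time_s

if __name__ == "__main__":
    for distance, wind in [(100.0, 1.5), (400.0, 5.0)]:
        reduction = aerosol_log_reduction(distance, wind)
        surviving = 10 ** (-reduction)
        print(f"{distance:.0f} m at {wind} m/s: ~{reduction:.1f} log reduction "
              f"(~{surviving:.1e} of released viruses still viable)")
```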

Because there is a paucity of information concerning the health risks associated with wastewater aerosols, health implications regarding this subject are difficult to assess. Most of the epidemiological studies conducted on residents in communities subjected to aerosols from sewage treatment plants, many using subjective health questionnaires, have not detected any correlation between exposure to aerosols and illness.

Although some studies have indicated higher incidences of respiratory and gastrointestinal illnesses in areas receiving aerosols from sewage treatment plants than in control areas, the elevated illness rates were either suspected to be the result of other factors, such as economic disparities, or were not verified by antibody tests for human viruses and isolations of pathogenic bacteria, parasites, or viruses.

A limited number of reports concern allergic responses among sewage treatment plant workers exposed to spores of Aspergillus fungi in the dust associated with the composting of sewage sludge. However, it is generally accepted that the health risk associated with aerosols from sewage effluent spray irrigation sites is low, particularly for irrigation with wastewater that has been disinfected. Nevertheless, prudence dictates that inhalation of aerosols that may contain viable pathogens should be minimized.

Public Health Concerns About Chemical Composition of Treated Waters:

In potable water reuse, chemical contaminants are of great concern. Chemicals in reclaimed water may be dissolved or associated with particulate matter and may be classified as either inorganic or organic. Health effects may be manifested as acute toxicity from short-term exposure or as a chronic disease resulting from long-term expo­sure to sub-acute doses.

The chronic effects are of particular concern as they may relate to cancer. The degree of impact that reclaimed wastewater has on the chemical content of drinking water supplies cannot be generalized. Although it is normally recognized that greater industrial input to a treatment system may signify greater potential adverse impacts, the methods of treatment, the type of potable reuse (direct or indirect), and the exact nature of the contaminants are the real determinants.

There are no universally accepted standards for the reuse of water. Any potable reuse of reclaimed water must satisfy drinking water source, treatment, quality, and reliability criteria. Various agencies have established drinking water criteria for specific contaminants. Under the Safe Drinking Water Act, the US Environmental Protection Agency (EPA) is required to set national drinking water standards in the US.

This is done through the use of maximum contaminant level goals (MCLGs) and of maximum contaminant levels (MCLs), which are based on the potential toxicological impact on humans. There are also secondary and aesthetic standards for some contaminants which, although not of direct health concern, may affect the acceptability of the water.

These include iron, manganese, copper, chloride, sulphate, and zinc. There are additional guidelines which may be applied for potable reuse. The National Academy of Sciences (NAS) has developed acceptable daily intakes (ADIs) for chemicals that exhibit non-carcinogenic chronic toxicity.

The ADIs are extrapolated from animal toxicity data and are used to indicate an exposure level to a specific chemical which should produce no observable toxic effect in humans. However, in most developing countries, WHO criteria are applied, which are generally less restrictive, since they are designed to cover the highly diverse physical and social conditions in the world.

Unfortunately, quite often these guidelines are taken as standards and are applied with little regard to the location-specific conditions that may mandate more restrictive values for certain parameters.

i. Inorganics:

With the exception of the discovery of the potential health hazards of asbestos, the scope of inorganic chemical contamination related to human health has not changed significantly over the past two decades. Table 5 summarizes water constituents of common concern, recommended levels, and associated health impacts.

Trace metals occur in water under natural conditions, as a consequence of man's activities, or from corrosion and leaching from plumbing materials. This latter source is of particular concern at the moment due to the recent findings of elevated lead levels in drinking water. Leaching from plumbing materials can normally be controlled by chemically limiting the aggressiveness of the water.

Inorganic Contaminants of Health Concern

Although some amounts of trace metals are essential to maintain a proper diet, they are known to exhibit toxic effects at threshold levels of intake, usually at concentrations much greater than the levels normally found in drinking water. Maximum contaminant levels have been developed for eight elements: arsenic, barium, cadmium, chromium, lead, mercury, selenium, and silver, although the EPA has recently proposed removing silver from the regulated list.

Secondary levels have been developed for copper, iron, manganese, and zinc to ensure potability from an aesthetic standpoint (i.e., taste, odour, and appearance) rather than for health reasons. Sodium, which is also a metal, poses a health concern distinct from other metallic substances. Excessive intake of sodium from food and water has been asserted to be a factor in hypertension for susceptible population groups.

From a water reuse point of view, nitrate may be a constituent of particular significance because of the relatively high concentration of combined nitrogen commonly present in wastewater, and the tendency for all nitrogenous materials to be converted to nitrate. The best known human health effect related to the ingestion of nitrates is methemoglobinemia in infants or debilitated individuals.

ii. Organics:

The organics of most common concern, their recommended levels, and their associated health effects are shown in Table 6. The list of individual compounds has been amended several times as health effects data became available. Virtually all of the compounds on the list are man-made, either intentionally (e.g., insecticides) or as the by-products of the growing hydrocarbon-based chemical industry.

Organic Constituents of Health Concern

iii. Carcinogens:

Carcinogens pose the newest chemical concern in drinking water treatment and, hence, for potable water reuse. Carcinogens are a major concern because of the long disease formation time, the lack of undisputed proof of causality, and the lack of an absolute, no-risk level.

Levels of eight trace metals were compared with cancer mortality rates; no significant correlations were found for iron, cobalt, or chromium. Nickel concentrations were correlated with mortality from oral and intestinal cancers, and arsenic levels with mortality from oral, intestinal, laryngeal and optic cancers and from leukemia.

However, such data need to be treated with caution since they are not supported by properly designed epidemiological studies. Thus, available scientific evidence seems to indicate a low risk of cancer from inorganic constituents in treated sewage effluents, whereas animal tests have shown correlations between beryllium, lead, and cadmium and tumours found in various tissues.

As for organic carcinogens, the available toxicological information about individual organic compounds is variable in quality and quantity, and in some instances, has proved insufficient to properly assess whether a compound is of health significance. This is especially true in the case of carcinogenic, mutagenic, and teratogenic outcomes.

The need to quantify health risks associated with exposure to environmental contaminants has generated a new interdisciplinary methodology referred to as risk assessment and risk management. Risk assessment determines the qualitative nature of the risk posed by exposures to contaminants, and quantifies the dimensions of that risk.

The assessment relies on information from epidemiological, clinical, toxicological, and environmental research. Risk management refers to the selection and implementation of the most appropriate regulatory action based on the results of the risk assessment, available control technology, and economic, social, and political factors.

The National Research Council (NRC) has categorized the risk assessment process into four steps: hazard identification, dose response assessment, exposure assessment, and risk characterization. Hazard identification determines whether exposure to an agent causes specific health effects.

Dose response assessment describes the relationship between the magnitude of exposure and the incidence of disease. For carcinogens, this process requires extrapolation from high doses to predict incidence at lower doses and, for assessments based on animal data, conversion of animal doses to human doses.

Exposure assessments involve the determination of the frequency, duration, mode, and intensity of exposure to the agent in question. Risk characterization uses information from the dose response and exposure assessment steps to estimate the expected incidence of the adverse health effect.
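
As a concrete illustration of how the dose-response and exposure steps feed into risk characterization, the sketch below applies the commonly used exponential dose-response model, P(infection) = 1 - exp(-r x dose), to a hypothetical ingestion scenario. The parameter r, the ingested volume, the effluent virus concentration and the exposure frequency are assumed values chosen only to show the arithmetic; they are not figures from this paper.

```python
import math

def infection_probability(dose: float, r: float) -> float:
    """Exponential dose-response model: P = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose)

def annual_risk(per_event_risk: float, events_per_year: int) -> float:
    """Probability of at least one infection over repeated independent exposures."""
    return 1.0 - (1.0 - per_event_risk) ** events_per_year

if __name__ == "__main__":
    # Hypothetical scenario: accidental ingestion of 1 mL of effluent containing
    # 0.05 infectious virus units per mL, 20 times a year, with an assumed r = 0.01.
    dose_per_event = 0.05 * 1.0
    p_event = infection_probability(dose_per_event, r=0.01)
    p_annual = annual_risk(p_event, events_per_year=20)
    print(f"Per-event infection risk: {p_event:.2e}")
    print(f"Annual infection risk:    {p_annual:.2e}")
```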

Most risk assessments involve uncertainties or errors resulting from the experimental methods utilized, misinterpretation of results, misclassification of exposure, confounding factors, or insufficient data. In some cases, uncertainty can be defined qualitatively as under- or overexposure or quantitatively using sensitivity analyses.

Fate of Wastewater Constituents in Soil and Groundwater:

i. Pathogens:

Pathogens that can survive modern wastewater treatment include bacteria, protozoa, helminths and viruses. It takes about 2-3 months for the numbers of most pathogens in untreated wastewater to decline to negligible levels in soil after application. However, the survival of enteric bacteria in soil is dependent on several factors, including the soil moisture, temperature, pH, season of the year, sunlight, organic matter content, and presence of antagonistic organisms.

Increased soil moisture tends to increase the survival rate of enteric bacteria; the survival of Escherichia coli and Salmonella typhosa increases substantially in moist soils. Kibbey (1978) found that the survival rate of Streptococcus faecalis was greatest when the soil was saturated, and lowest when soil was air-dried.

Boyd (1969) reported a decrease in microbial survival rates with a decrease in moisture content from 50 per cent to 10 per cent in a fine sandy loam soil. Survival time is less in sandy soils than in soils with greater water-holding capacities.

Increasing temperature tends to decrease the survival rates of enteric bacteria in soil. Beard (1940) reported that S. typhosa could survive as long as two years under freezing conditions. Reddy (1981) reported that the die-off rates of pathogenic bacteria and indicator organisms approximately double with a 10°C rise in temperature between 5 and 30°C. Weiser and Osterud (1945) reported that freezing and thawing promoted significant mortality of E. coli.
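
The reported doubling of die-off rates with every 10°C rise can be written as a simple temperature-correction rule. The sketch below is a minimal illustration of that relationship; the reference rate is an arbitrary assumed value, not one reported in the studies cited.

```python
def dieoff_rate(temp_c: float, rate_at_ref: float, ref_temp_c: float = 5.0) -> float:
    """Die-off rate assuming it doubles for every 10 C rise (roughly valid for 5-30 C)."""
    return rate_at_ref * 2.0 ** ((temp_c - ref_temp_c) / 10.0)

if __name__ == "__main__":
    k_ref = 0.1  # assumed first-order die-off rate at 5 C, per day (illustrative)
    for t in (5, 15, 25, 30):
        print(f"{t:2d} C: k = {dieoff_rate(t, k_ref):.2f} per day")
```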

Highly acidic and alkaline conditions (pH <6.0 or >8.0) tend to adversely affect most bacteria in soil. Generally, neutrality (pH 7.0) favours the growth and survival of enteric bacteria. Salmonella, E. coli, S. faecalis, and Mycobacterium survive best in a soil pH range of 6-7. Die-off rates were high under acidic soil conditions.

The season of the year influences survival rates through fluctuations in moisture and temperature. Van Donsel (1967) found a 90 per cent reduction in fecal coliforms after 3.3 days in the summer and 13.4 days in the autumn in temperate regions. Fecal coliforms survived slightly longer than fecal streptococci in the summer. The reverse was true in the spring and winter months.

Sunlight exerts a lethal effect on all microorganisms located at the very surface of the soil. This effect may be due to desiccation and/or ultraviolet (UV) irradiation. Tannock and Smith (1971) studied soils exposed to direct sunlight vs. shaded areas. Survival of salmonellae was prolonged in the shaded areas compared to the exposed plots.

In the summer, survival of S. bovismorbificans was up to six weeks after inoculation in the shaded areas and two weeks in direct sunlight. Similar results were reported by Van Donsel (1967). E. coli and S. faecalis die-off rates were much greater when exposed to direct sunlight than when in shaded areas. Die-off rates of Leptospira, Brucella, and Mycobacterium are also influenced by the intensity of sunlight.

Bacterial numbers are generally related to the organic matter content of soil. Since enteric bacteria are heterotrophs, their only carbon supply would be the organic nutrients present in soil and suspended organic material contained in the wastewaters. Both Shigella and Salmonella are capable of utilizing the limited carbonaceous materials in soils.

However, they are not able to compete with the indigenous soil microflora for the low available nutrient supply. Enteric bacteria require simple sugars as a carbon source to synthesize amino acids, nucleic acids, and vitamins, but simple sugars rarely exist in soils in the free form. Instead, carbohydrates are often complexed as polymers.

Antagonistic microflora may also affect the survival of pathogens introduced to soil by means of wastewater. The survival time for enteric bacteria inoculated into sterilized soil is longer than that in non-sterile soil. Predation by protozoa and parasitism by Bdellovibrio may play a role in regulating bacterial populations.

Production of lytic enzymes and antibiotics by soil fungi and actinomycetes could suppress the growth of enteric bacteria. It has been reported that the growth of Salmonella and dysentery bacilli was inhibited by the presence of actinomycetes in soils.

There is very little information available on the survival of helminths and protozoa discharged into soil as a result of irrigation by sewage. Helminth ova can appear in sewage sludge and treated wastewater after sewage treatment practices. The ova of Ascaris lumbricoides may survive as long as seven years in garden soil.

The survival and persistence of protozoa in wastewater and soil is usually attributed to their ability to form cysts, which make possible an inactive metabolic state that can endure extreme environmental conditions such as high and low temperatures, drought, adverse pH, and low oxygen concentrations.

Cysts of the protozoan Entamoeba histolytica have been reported to survive up to eight days in soil and up to three days on plant surfaces. The rate of survival is dependent upon the temperature and moisture content of the soil.

Many viruses survive modern wastewater-treatment practices, including chlorination. When wastewater is applied to soil, viruses may survive a year or more. Most of the studies on virus survival have been carried out with polioviruses and bacteriophages. Virus inactivation appears to be affected by dispersion of viral aggregate clumps, presence of salts, appropriate temperature, pH, virucidal chemical species, and the presence of suspended solids.

Virus inactivation rates tend to be much greater in wastewater and natural waters than in distilled water because of disaggregation of viral clumps. The presence of salts (NaCl) increases virus inactivation relative to non-saline environments. This effect can be partially reduced by the presence of divalent cations such as Ca2+ and Mg2+. These cations have a stabilizing effect on viruses that prevents inactivation, particularly at elevated temperatures (50°C).

Virus inactivation increases with increasing temperature, and permanent inactivation of enteroviruses in solution is favoured by alkaline pH. Antagonistic microflora in soil may produce inactivating agents such as antiviral toxins. Also, sterilization or heat treatment of wastewater may drive off heat-labile virucidal chemical agents present in the water, such as ammonia.

Suspended organic matter and clay have some protective effect on viruses in wastewater. Little work has been done on virus survival in field experiments. Wellings (1975) reported that virus survival in a Florida cypress dome used for sewage effluent discharge was evident for at least 28 days.

Fujioka and Loh (1974) reported that poliovirus survived in field plots for at least 32 days when irrigated with wastewater effluent. Depending on the nature of the soil, and its temperature, pH, and moisture content, enterovirus survival has been reported to vary between 25 and 170 days.

The disappearance of infectious viruses from percolating wastewater occurs mainly by sorption. Sorption, however, does not eliminate the ability of the viruses to infect host cells: Schaub (1974) reported that viruses adsorbed to clay were just as infectious to mice as the free particles, and Lefler and Kott (1974) demonstrated that viruses adsorbed to sand are capable of infecting tissue cultures.

Table 7, which was compiled from several sources by Rose (1986), summarizes the results of field and laboratory survival studies. Feachem (1983) compared the survival rate (in days) of pathogens in wastewater, soil and on crops at 20-30°C (Table 8).

Survival of Selected Pathogens

Typical Pathogen Survival Times at 20-30° C

The migration of wastewater pathogens in soil involves transport by insects, birds, rodents, runoff, windblown soil, and percolation through the soil profile to the groundwater. Retention of pathogenic organisms near the surface could be a potential problem because of contamination of surface runoff waters and windblown soil.

Several studies have shown that as much as 90 – 95 per cent of the fecal organisms are concentrated in the surface layers during the passage of wastewater through soil, and the remainder are concentrated in subsurface layers.

The movement of pathogenic bacteria through soil, as compiled by Frankenberger (1984), is shown in Table 9. The removal of bacteria, ova, and cysts through a given depth is inversely proportional to the soil particle size. Low flow rates favour greater retention. Nutrients may support continuous growth of the microflora, resulting in biological clogging and alteration in pore configuration.

Movement of Bacteria

Virus removal from percolating wastewater is almost totally dependent on adsorption to various soil components. The movement of viruses through various soil systems is shown in Table 10. Drewey and Eliassen (1968), in column experiments, found that most of the viruses were retained in the upper 0.8 inches (2 cm) of the columns.

Robeck (1962) studied poliovirus removal through 0.6 m of California dune sand (0.8-1.6 m/day). They indicated that the sand was capable of removing 99 per cent of the viruses in 98 days. Lance (1976) studied virus movement in 250-cm columns packed with calcareous sand flooded with secondary sewage effluent. Movement through the soil reduced the virus concentration to about 1 per cent after 38 cm of travel. Viruses were not detected in 1-ml extracts from columns below the 160-cm level.
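
One way to read these column results is as an approximately log-linear removal with travel distance. The sketch below back-calculates a removal coefficient from the reported reduction to about 1 per cent after 38 cm and extrapolates it to other depths, purely to illustrate the calculation; real removal depends strongly on soil type and flow rate, as discussed below.

```python
import math

def log_removal_per_cm(remaining_fraction: float, depth_cm: float) -> float:
    """Back-calculate a log10 removal coefficient from a single observation."""
    return -math.log10(remaining_fraction) / depth_cm

def fraction_remaining(depth_cm: float, coeff_log_per_cm: float) -> float:
    """Predicted fraction of viruses remaining after a given travel depth."""
    return 10 ** (-coeff_log_per_cm * depth_cm)

if __name__ == "__main__":
    # Reported observation: roughly 1 per cent of the viruses remained after 38 cm.
    coeff = log_removal_per_cm(0.01, 38.0)   # about 0.05 log10 per cm
    for depth in (38.0, 80.0, 160.0):
        print(f"{depth:5.0f} cm: ~{fraction_remaining(depth, coeff):.1e} remaining")
```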

Movement of Viruses

Many factors affect virus removal by soil. The mobility of viruses is related to the properties of the viral protein coat and to soil properties such as cation exchange capacity, pH, hydraulic conductivity, surface area, organic matter content, and ionic strength. The flow rate of the percolating fluid is also important.

The protein coat is characterized by being amphoteric. Soil organic matter and clay are generally negatively charged and readily adsorb the positively charged reactive groups of the viral protein coat below its isoelectric point.

Virus adsorption is generally enhanced with increasing clay and organic matter contents because of their larger surface area and greater number of active sites for adsorption compared with silt and sand. However, of all the soil properties, only pH was significantly correlated with the rate of virus adsorption.

Adsorption is generally favoured at pH values below 7.0. Gerba and Lance (1980) reported that virus adsorption was highly dependent on the cation exchange capacity (CEC) and exchangeable aluminum content of soils.

Organic compounds compete with viruses for adsorption sites on soil colloids. Consequently, viruses in primary sewage effluent are poorly held by soil compared with those in secondary effluent. The velocity of wastewater movement through soil is an important factor affecting virus distribution with depth.

When flow rates of secondary sewage seeded with poliovirus were increased from 0.6 to 1.2 m/day, more virus movement occurred through the soil. Flow rates of less than 0.6 m/day were the most effective for virus removal.

ii. Trace Elements:

Although a conventional wastewater treatment system is not designed to remove trace elements, these are effectively removed along with the suspended solids. Under normal conditions, trace-element concentrations of primary effluents are reduced by 70 per cent to 90 per cent through secondary treatment.

Trace elements are present in the natural environment in low concentrations. F, Si, V, Cr, Mn, Co, Ni, Zn, and Se are essential to biological growth. At slightly higher concentrations, many elements may become toxic to plants and/or animals. Some elements (e.g., As, Cd, Pb, Hg) have no known physiological functions and are always considered biologically harmful.

Once accumulated, they are in most cases practically impossible to remove and may subsequently lead to toxicity in plants grown on the affected soils; to uptake by crops, resulting in trace-element levels in the plant tissue considered harmful to the health of humans or animals who consume the crops; or to transport through the soil to underground or surface water, thereby rendering the water unfit for its intended use.

Factors that affect the retention of trace elements by soils include soil texture, pH, soil organic matter, and contents of amorphous oxides of Fe, Al, and Mn. Fuller (1977) found that all soils studied showed a high capacity to attenuate the concentration of Cu and Pb in the soil solution.

The retention of other trace elements studied was best correlated with the clay content and free iron oxide content of the soil. Capacities for attenuating the cationic trace elements (i.e., Cu, Pb, Be, Zn, Cd, Ni, and Hg) in soils tended to increase with their clay and iron oxide contents. For the anionic trace elements (i.e., SeO3, VO3, AsO4, CrO4), retention in the soil also increased with increasing clay and iron oxide content of the soils.

In general, the solubility of cationic trace-element species increases as the pH of the soil decreases. The solubility of anionic trace-element species in the soil tends to increase as the pH of the soil increases.

Trace elements are widely used in industrial processing and in the manufacture of consumer goods, or occur as contaminants in products. The deterioration of storage and transmission equipment in the community water supply system and the wear of household plumbing fixtures also contribute to their presence in the water.

For these reasons, small amounts are always found in domestic wastewater. A consistently high concentration of trace elements is an indication of industrial waste input. The concentration, however, may vary considerably with time within a particular treatment plant and among treatment plants in various communities. Most trace elements in wastewater (except boron) are adsorbed onto the organic or inorganic solids, or form sparingly soluble inorganic precipitates.

Certain trace metals, such as B, Cd, Cu, Mo, Ni and Zn, present a potentially serious hazard if they are introduced into the cropland soils in an uncontrolled manner (EPA, 1976). The potential hazards associated with other trace elements in the soil are ruled out by their attenuated chemical activities in the soil, or by their infrequent occurrence and exceptionally low concentration in the wastewater.

Following common crop-producing practices, Mn, Fe, Al, Cr, As, Se, Sb, Pb, and Hg inputs through application of treated wastewater to land should not result in phyto-toxicity or expose consumers to potentially hazardous trace-element levels.

Other elements, namely Cd, Cu, Ni and Zn, have been singled out for their potential phytotoxic effects. Their input into the soil should be calculated on the basis of their concentration in the effluents, the irrigation practice and the nature of the soil and crops.

It has been shown that with typical irrigation practices at 1.2 m/year, annual inputs of Cu, Ni and Zn are about 1.2, 0.24 and 1.8 kg/ha, respectively. For Cu and Zn, these input levels are far below the amounts routinely used to correct trace-element deficiencies in the soil. Even if the concentrations of Cu and Zn in the wastewater are one order of magnitude higher, their inputs to the soil are still in line with recommended fertilization practices.
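
The annual inputs quoted above follow from a simple unit conversion: applying 1 m of water to 1 ha delivers 10,000 m3 (10 million litres), so every 1 mg/L in the effluent adds 10 kg/ha per metre of irrigation. The short sketch below reproduces that arithmetic; the effluent concentrations shown are back-calculated from the quoted loadings and are illustrative, not measured values.

```python
def annual_load_kg_per_ha(conc_mg_per_l: float, irrigation_m_per_yr: float) -> float:
    """Trace-element loading: 1 m of water on 1 ha = 1e7 L, and 1e7 mg = 10 kg."""
    return conc_mg_per_l * irrigation_m_per_yr * 10.0

if __name__ == "__main__":
    irrigation = 1.2  # m/year, as quoted in the text
    # Illustrative effluent concentrations (mg/L) consistent with the quoted loads.
    for element, conc in [("Cu", 0.10), ("Ni", 0.02), ("Zn", 0.15)]:
        load = annual_load_kg_per_ha(conc, irrigation)
        print(f"{element}: {conc:.2f} mg/L x {irrigation} m/yr -> {load:.2f} kg/ha per year")
```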

It is unlikely that short-term use of wastewater for crop irrigation could result in crop injuries due to Cu, Ni, or Zn. However, these elements may accumulate in the soil, and the long-term effects may require further examination.

The amounts of trace elements removed by crops are small compared with the amounts applied (Table 11). Generally, no more than 10 per cent of the added trace elements is expected to appear in the harvested crops. Long-term use of wastewater for irrigation, therefore, may result in their gradual accumulation in the affected soils at a rate which depends on the characteristics of the soil.

Concentrations of Trace Elements

iii. Trace Organics:

Trace organics is a term generally used to characterize a group of recently identified contaminants of water. Their presence, even at low concentrations, is a cause of concern in view of the inherent toxic effects of several of these compounds, including pesticides.

Biochemical oxygen demand (BOD), chemical oxygen demand (COD), and total organic carbon (TOC) are usually used to express organic loads. None of these indicates the composition of the organic matter or identifies any toxic organic substance. A panel of experts examined the effects of chronic exposure to trace quantities of hazardous organic chemicals through direct ingestion of these substances in drinking water.

No evidence that trace amounts of toxic organic substances in drinking water have caused cancer or increased the incidences of cancer was found. However, since a no-response threshold concentration could not be established, the risk of disease as a result of long-term exposure cannot be completely ruled out.

In wastewater irrigation, the route of exposure to waterborne organics is not through direct ingestion. Trace organics are deposited in the soil; subjected to attenuation by physical, chemical, and biological reactions; and translocated into plant tissue before they may enter the human food chain. Through the soil matrix, trace organics introduced by irrigation may also become contaminants of groundwater.

Conventional wastewater treatment is effective in reducing the level of trace organics through biological treatment. Chlorination of wastewater often yields additional halogenated hydrocarbons as by-products of the halogenation of larger molecular weight organic compounds.

While rapid sand filtration has little effect, lime coagulation is more effective in removing organics. However, even with activated carbon adsorption and reverse osmosis, a wastewater effluent is not rendered completely free of trace organics.

The fate of trace organic compounds during the passage of wastewater through the soil matrix is dependent on the properties of the soil and the nature of the chemical compound. Sorption, volatilization, and biodegradation are considered most important. Soil sorption is always measured as the sum of the contributing processes.

The soil sorption constant is specific to the soil and the chemical involved. For non-polar, hydrophobic organic substances, sorption is positively correlated with the organic matter content of the soil.
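
This correlation is commonly expressed by normalizing the sorption coefficient to the soil's organic carbon fraction, Kd = Koc x foc, and the resulting Kd can in turn be used to estimate how strongly the compound is retarded relative to the percolating water. The sketch below shows these standard relationships with hypothetical values for Koc, organic carbon content, bulk density and water content, since the paper gives no specific coefficients.

```python
def sorption_coefficient(koc_l_per_kg: float, organic_carbon_fraction: float) -> float:
    """Soil-water distribution coefficient: Kd = Koc * foc (L/kg)."""
    return koc_l_per_kg * organic_carbon_fraction

def retardation_factor(kd_l_per_kg: float,
                       bulk_density_kg_per_l: float = 1.5,
                       water_content: float = 0.3) -> float:
    """Retardation relative to percolating water: R = 1 + (rho_b / theta) * Kd."""
    return 1.0 + (bulk_density_kg_per_l / water_content) * kd_l_per_kg

if __name__ == "__main__":
    # Hypothetical compound (Koc = 500 L/kg) in a sandy soil with 0.5 % organic carbon.
    kd = sorption_coefficient(500.0, 0.005)
    r = retardation_factor(kd)
    print(f"Kd = {kd:.1f} L/kg, R = {r:.1f} (moves ~{r:.0f} times slower than the water)")
```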

In a moist soil, volatilization from the dilute aqueous solution is determined by the partitioning coefficient between water and air. Trace organics in treated domestic wastewater effluents consist primarily of low-molecular-weight, nonpolar organic substances with high vapor pressures and relatively low solubility in water, which makes them especially conducive to rapid volatilization.

At the concentrations at which they are present in the water, even a slight volatilization could be significant in solute transport. In the soil, the mass transfer through volatilization depends also on the movement of the chemical to the soil surface and its dispersion in the air.
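
The water-air partitioning referred to here is usually described by a Henry's-law constant, which can be estimated from a compound's vapour pressure and aqueous solubility. The sketch below shows this standard estimate for a hypothetical low-molecular-weight trace organic; the numerical values are assumptions for illustration, not data from the paper.

```python
R_ATM_M3_PER_MOL_K = 8.205e-5  # gas constant in atm*m3/(mol*K)

def henry_constant(vapour_pressure_atm: float, solubility_mol_per_m3: float) -> float:
    """Henry's-law constant estimated as vapour pressure / aqueous solubility (atm*m3/mol)."""
    return vapour_pressure_atm / solubility_mol_per_m3

def dimensionless_henry(h_atm_m3_per_mol: float, temp_k: float = 298.15) -> float:
    """Air-water concentration ratio: H' = H / (R * T)."""
    return h_atm_m3_per_mol / (R_ATM_M3_PER_MOL_K * temp_k)

if __name__ == "__main__":
    # Hypothetical volatile trace organic: vapour pressure 0.1 atm and
    # solubility of about 8 mol/m3 (roughly 1 g/L for a 125 g/mol compound).
    h = henry_constant(0.1, 8.0)
    print(f"H = {h:.4f} atm*m3/mol, H' = {dimensionless_henry(h):.2f} (air/water)")
```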

Whereas in sorption or volatilization the chemical structure of the compounds is not altered, biodegradation is an enzyme-activated biochemical reaction whereby organic substances are effectively decomposed. Bacteria, actinomycetes, and fungi are the soil micro-organisms most important in the decomposition of organic substances in the soil.

Biodegradation thus offers the means to detoxify trace organics retained by the soil. The rate of biodegradation in soils is influenced by the type of microorganism, the characteristics of the soil, and the molecular structure of the compound. Callahan (1979) concluded that the biodegradation pathway of organic priority pollutants may be developed on a theoretical basis or extrapolated from experiments using compounds with similar chemical structures.

Few experimental data are available to confirm the speculation. Tabak (1981) studied the biodegradability of 96 organic priority pollutants. Except for chlorodibromomethane and heptachlor, the organic pollutants could be biologically degraded. Because sorbed trace organics are generally not subject to microbial attack, the rate of biodegradation of trace organics in the soil may also be slowed by sorption.

Some degree of plant absorption of trace organic substances will occur, though at a magnitude lower than that of pesticides. Since there are no known physiological pathways by which plants absorb trace organics, the most likely mechanism would be mass flow. Once trace organics are sorbed by the soil, the amounts available for plant uptake become limited. Recent work on natural and constructed wetland and aquatic plant systems has shown a high potential for the removal of contaminants, including organics, from treated wastewater effluents.

Characteristics of Treated Wastewater Effluents in Kuwait:

In Kuwait, and probably in most GCC countries, sewage is unique in that it originates as distilled water produced by thermal desalination of seawater. Sodium hydroxide is added to raise the pH value, before the distillate is mixed with brackish water (8-12%) to improve the mineral content.

The resulting mixture is a potable water with a pH value of about 8, low in bicarbonates and highly aggressive. As sewage, this water leaves homes with a relatively high total dissolved solids value, resulting from the input of brackish water (1200 ppm) used for domestic needs around the home, and a high suspended sediment load, which may be attributed to the highly arid climate and nature of the soil in the region and to corrosion of cement-asbestos pipes, cement culverts and manholes.

The design of the sewage networks (pumping against gravity) and the high temperature lead to quick depletion of oxygen and the development of anoxic conditions that are responsible for the characteristic odors associated with pumping stations in Kuwait, and which often lead to the fluctuations in the performance of the treatment plants.

Monitoring of the quality of treated effluents of the country’s three major treatment plants is carried out under the direct supervision of the Ministry of Public Works. The Environment Protection Department (EPD) of the Ministry of Public Health also carries out monthly monitoring of the tertiary effluents.

Tables 12, 13 and 14 summarize some of the major physical, chemical and biological characteristics of the tertiary effluents from the principal stations in the country (Ardiah, Jahara and Riqqa), monitored during the period 1993-1995.

Levels of Physical and Chemical Parameters

Levels of Trace Metals

Results of Microbiological Indicators

The mean and range of values of these parameters are discussed in comparison with those produced by treatment plants in the USA (Table 15).

Typical Water Quality

The following conclusions may be made:

1. Except for chloride (429 ppm), total dissolved solids (1340 ppm) and sodium (290 ppm), which are relatively high for an already alkaline soil, the values are generally in agreement with those of effluents used for irrigation in the US. Although these values are of no public health concern, they are of concern in terms of agriculture.

However, they may be reduced in the future as the Ministry of Electricity and Water is in the process of switching to the use of carbonation instead of adding sodium hydroxide.

2. As for the metal content, it can be seen that the levels are generally lower than those observed in the US cities' effluents. The values for chromium are attributed almost entirely to trivalent chromium, which is naturally high in Kuwaiti soil. In addition, the pH value of the soil in Kuwait (about 9) would impede the availability of metals after application.

3. The microbial picture is, however, different and is of concern from a public health point of view. The three-year data shown in Table 14 suggest that the treatment plants are capable of achieving acceptable microbial levels for irrigation purposes, but not in a consistent manner.

An extreme case would be that at Riqqa in August, 1995 when the fecal coliform count exceeded 1,500,000. Once discovered, such a case could result in the disruption of the entire scheme, and thus, could have economic implications. However, if such situations are not discovered in time, they could result in disasters from a public health point of view.

It may take only one such incident for the entire wastewater works system to lose its credibility. This problem is not unique to Kuwait, but is common to most developing countries.

Risk Management of Reclaimed Waters:

Risk management herein refers to the selection of the most appropriate, cost-effective, socially and politically acceptable course of action for the sustainable use of treated effluents produced by wastewater plants for irrigation in Kuwait (and probably also applies to most countries in the region).

It involves, as a first step, carrying out a thorough health risk assessment in accordance with the terms outlined earlier. Such an assessment should be based on the location-specific conditions, and the type, efficiency and reliability of the wastewater treatment systems operated in the country as well as the nature and location of the irrigation projects and types of crops to be irrigated.

The climate, nature of the soil and socio-economic structures and habits of the population being served must also be taken into consideration. The experience of other countries is important and should be learned from, but knowledge of the local environment is essential for carrying out a proper health risk assessment.

Models may also be used to provide a quantitative estimate of the health risk associated with public exposure to wastewater pathogens. However, models that use assumptions that do not truly reflect the local conditions may provide outputs that are misleading.

It is also important that the models are accepted by government authorities, treatment plant and irrigation system operators, and the public on whose behalf all of these efforts are made. There will always be residual health risks involved in the use of treated wastewater for irrigation. This risk can be minimized through the development of a management scheme in which all efforts are integrated and complementary.

A scheme in which wastewater treatment and effluent management are not integrated and harmonized is likely to result in greater public exposure to pathogens.

Based on the findings of the health risk assessment, the following issues need to be addressed:

1. Selection of effluent quality criteria that are most suitable to the local conditions and needs:

The most commonly applied criteria are those adopted by WHO and the EPA (Tables 16 and 17). Some communities (e.g., California, USA) have developed and adopted more stringent criteria based on their local conditions.

Microbiological Guidelines for Wastewater Use in Agriculture

EPA Guidelines for Water Reuse

Table 18 compares such criteria with the microbial guidelines of the EPA and WHO. In Kuwait, an Environment Protection Council (EPC)-designated Standards Committee adopted guidelines for treated wastewater from the Shuiaba Industrial Area (combined domestic and industrial). The criteria are shown in Table 19.

Microbial Guidelines Used by Various States in Recycling Wastewater

Kuwait's Tentative Quality Standard of Treated Effluents

Adopting quality criteria, however, is only the first step. A quality assurance and quality control system is equally important, and the most demanding phase in terms of cost and effort. Several factors may limit success of this task, including cost, and limitations of existing monitoring infrastructures and public health supervision.

Past experience has shown that the harsh and highly variable climate of Kuwait (diurnal as well as seasonal) mandates the establishment of an effluent monitoring system and the availability of stabilization tanks (or lagoons) where wastewater treatment plant effluents are first received and stabilized before being used for irrigation.

2. Design considerations:

In the majority of cases, chlorinated secondary treated wastewater is used for irrigation (preceded by filtration when spray irrigation is applied), whereas in Kuwait tertiary effluents are used. This not only means increased cost, but also results in the reduction of the nutrient content needed for the rather nutrient-poor soil of Kuwait.

Alternative designs should be carefully considered. Secondary treatment, followed by rapid sand filtration and then stabilization in artificially constructed wetlands, may represent a viable option. Such lagoons could reduce the need for storage (and thus ensure a constant flow of water of known quality), create much needed wetlands with aesthetic and recreational value, and provide a sanctuary for migratory birds wintering annually in the area.

A visit to the Jahara artificial wetland is evidence of the utility of such a system. Table 20 compares the influent and effluent characteristics in an artificial wetland. Table 21 summarizes the effluent characteristics of four pilot-scale constructed wetlands.

Pilot-Scale Constructed Wetland

Performance of Pilot-Scale Constructed Wetland Systems

3. Selection of crops and irrigation methods:

Afforestation of highways and open fields has received emphasis in the Master Plan for Greenery in Kuwait. In the past, wastewater from septic tanks was used for irrigation of afforestation in remote areas. As the sewage network covers more than 85 per cent of the country now, such a source has been reduced or eliminated.

For lawn irrigation and landscaping inside residential areas, brackish water has been used. The use of treated effluents of high and reliable quality, with drip irrigation, should be considered for both categories. Fodder production (mainly alfalfa) could also be considered, since several kilometres separate the Sulaibiyah fields from residential areas. Irrigation of vegetables in the Wafra and Abdali farms with reclaimed water should only be considered after the reliability of the treatment system is established and has been accepted by the public.
