Techno-Economic Analysis of RO Desalination of Produced Water for Beneficial Reuse in California

Approximately 508.7 million cubic meters (3.2 billion barrels) of oilfield-produced water are generated per year across the oil fields of California. While less than 2% of this produced water receives advanced treatment for beneficial reuse, changing regulations and the increasing scarcity of freshwater resources are expected to increase the demand for beneficial reuse. This paper reviews onshore produced water quality across California, the relevant standards and treatment objectives for beneficial reuse, the contaminants of concern, and treatment process design considerations. Lastly, we evaluate the capital and operating costs of an integrated membrane system for treating produced water based on data from a field pilot conducted in the coastal region of California.


Introduction
"Produced water" and "associated water" are the oilfield terms given to the water that is co-generated when producing crude oil and gas (hereafter "produced water"). Produced water comprises both connate or formation water that is naturally occurring and water that is injected as part of enhanced oil recovery processes (e.g., water flooding, polymer flooding, steam flooding, etc.). According to data from the California Division of Oil, Gas, and Geothermal Resources (DOGGR, which recently changed name to California Geologic Energy Management Division or CalGEM) the State of California produces (approximately 25.7 million cubic meters (161.7 million barrels) of oil each year. Moreover, for every barrel of oil produced in California, on average, about 20 barrels of water is co-produced [1].
Given the geological diversity of oil fields in California, there is considerable variation in the quality of produced water across the state. For example, total dissolved solids (TDS) concentrations range from below 2000 mg/L to over 30,000 mg/L. Also, depending on the geochemical nature of the field, the produced water can contain various levels of carbonate, sulfate, and silicate minerals of calcium (up to 930 mg/L), barium, and strontium (up to 51 and 28 mg/L, respectively), high boron concentrations (up to 72 mg/L), naturally occurring radioactive materials (up to 41 pCi/L of gross alpha particles), free oil and grease (up to 898 mg/L), and emulsified and soluble petroleum hydrocarbons (up to 430 mg/L).
The primary means of produced water management in California has historically been injection in disposal wells [2]; however, tightening regulations along with growing scarcity of fresh water resources in drought-prone portions of California are motivating California oil producers to look at produced water as an unconventional source of fresh water. The proximity of most California oilfields to agricultural and farming lands has prompted a nexus between the two sectors where produced water is already being used as irrigation water for some of the locally common crops such as nuts, fruits, citrus, and avocado.
There are a number of regulatory mandated and use-case water quality requirements for beneficial reuse. While 92% of produced water generated in California receives at least a primary level of treatment (e.g., free water knockout to remove free oil), less than 2% of the total volume receives the advanced treatment needed to bring the concentrations of various contaminants down to an acceptable range for any reuse purpose. An overwhelming majority of produced water in California is brackish, and the most cost-effective means of advanced treatment is reverse osmosis (RO) membrane-based desalination. However, for RO technology to work well, specific and often complex sequences of pretreatment are needed to remove various types of organic and inorganic constituents before the RO membranes. This paper reviews produced water qualities across different regions of California, the treatment objectives needed for beneficial reuse, process design considerations, and treatment costs using real process data.

Produced Water Quality and Volumes Across California
DOGGR (CalGEM) divides oil and gas operations across four districts: northern, inland, coastal, and southern. Figure 1 below graphically shows this division [3,4]. Over 74% of oil production in California takes place in the inland district (mainly in Kern County), while the balance comes in roughly equal proportions from the southern and coastal districts; very little oil is produced in the northern district (0.12%). Given the higher ratio of produced water to oil in the southern region compared to the coastal district, the shares of produced water generation are approximately 61%, 29% and 10% for the inland, southern, and coastal districts, respectively. On an annual basis, approximately 508.7 million cubic meters (3.2 billion barrels) of produced water is generated across California.
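The volume figures above can be cross-checked with a short calculation. The 20:1 water-to-oil ratio and the annual oil production are taken from the text; the barrel-per-cubic-meter conversion factor is standard:

```python
# Sanity check of California produced water volumes (figures from the text).
BBL_PER_M3 = 6.2898  # barrels per cubic meter (standard oilfield conversion)

oil_m3 = 25.7e6           # annual oil production, m^3 (~161.7 million bbl)
water_to_oil_ratio = 20   # average barrels of water per barrel of oil

water_m3 = oil_m3 * water_to_oil_ratio
print(f"Produced water: {water_m3 / 1e6:.0f} million m^3/yr "
      f"({water_m3 * BBL_PER_M3 / 1e9:.1f} billion bbl/yr)")
# ~514 million m^3/yr, consistent with the ~508.7 million m^3 cited above

# District shares of produced water generation, applied to the cited total
shares = {"inland": 0.61, "southern": 0.29, "coastal": 0.10}
for district, share in shares.items():
    print(f"{district:8s}: {share * 508.7e6 / 1e6:.0f} million m^3/yr")
```

The small difference between ~514 and 508.7 million m³ reflects rounding of the average water-to-oil ratio.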
The TDS concentration is used as the key water quality indicator by DOGGR (CalGEM) across various oil-producing regions in California and is also shown in Figure 1. While most produced waters in California have TDS concentrations above 15,000 mg/L, there are some regional variations, especially in the inland region. Typically, the fields east and southeast of the city of Bakersfield generate lower-TDS produced waters (1000-16,000 mg/L) than the ones to the south and southwest (~13,000 to ~23,000 mg/L) and those in the west and northwest (~16,000 to ~25,000 mg/L). More detailed produced water quality data for these regions are provided in Table 1. These data were generated by the authors over the past few years from various produced water treatment projects and studies performed across California. While there is a wide range of concentrations for many parameters, the key indicators of "treatability" appear to be oil and grease, total petroleum hydrocarbons (TPH), hardness, silica, and boron, in addition to TDS. The data shown for Santa Maria County and Ventura County in the Coastal district, as well as the Southern district analysis, are each a single sample. The data for the Inland district are shown as the lowest and highest TDS samples among a series of 15 samples. The variations are attributed to the diversity of the geochemical nature of the oilfields across the state. Standard measurement methods are provided in Appendix A.

California Water Reuse Standards
About 92% of the 508 million cubic meters (3.2 billion barrels) of produced water generated in California per year receives primary treatment (e.g., gravity separation to separate free oil from water) prior to re-injection, reuse or disposal. Approximately 20% of that volume (18.4% of the total volume) receives secondary treatment (typically, induced or dissolved gas flotation for emulsified oil and grease removal), and about 9% of that (1.8% of total) receives a tertiary level of treatment for beneficial reuse, which may include different combinations of nutshell or other media filtration, ion-exchange softening and/or reverse osmosis desalination [2]. Currently, in California, there are no water quality standards developed for produced water reuse, primarily because less than 2% of the volume is beneficially reused, but also because there are a number of different beneficial reuse options. The three most common are: (1) direct onsite reuse for water or steam flooding, (2) agricultural irrigation, and (3) surface water (streamflow) augmentation. Hence, produced water reuse permits are considered on an ad hoc basis by the relevant local Regional Water Quality Control Board. If the water is returned to the same oil-bearing formation from which it came, no treatment is required, but the injection well must be permitted.
Depending on the beneficial reuse purpose and permit requirements, the necessary treatment scheme is defined on a case-by-case basis. For example, reusing produced water in steam flooding operations requires different levels of softening, desilication and/or desalination to avoid premature scaling of the steam generating equipment. Once-through steam generators (OTSGs) can handle fairly high TDS concentrations (up to 10,000 mg/L) but require hardness and silica to be removed to very low levels to minimize scaling [2]. A key indicator of irrigation water quality is the sodium adsorption ratio (SAR); a high concentration of sodium relative to calcium and magnesium can lead to severe soil degradation issues (e.g., loss of permeability for clayey soils) [5]. The formulation for SAR is widely known, and various crops have differing sensitivity to SAR, as shown in Table 2. Fruits, nuts, citrus and avocado crops are predominantly cultivated throughout California's central valley and coastal regions [6,7]. Further, for reuse in agricultural irrigation, specific organic and/or inorganic constituents of concern may have to be removed to different levels depending on the type of crop being irrigated. In particular, boron and chloride, which are present in various concentrations in produced waters across California, may need to be brought down to certain limits to protect crop yields. Boron, while an essential micronutrient for plant metabolism, has an extremely narrow concentration window where it changes from essential to toxic [8]. Excess boron (over 1 mg/L) and chloride (over 140 mg/L) concentrations in water can result in reduced yields of many food crops. Some examples of boron and chloride concentration limits for various crops are provided in Tables 3 and 4 [9].
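The widely known SAR formulation (sodium in meq/L divided by the square root of the mean of calcium and magnesium in meq/L) can be sketched as follows; the equivalent weights are standard, and the example concentrations are hypothetical, chosen only to illustrate a sodium-dominated produced water:

```python
import math

def sar(na_mg_l: float, ca_mg_l: float, mg_mg_l: float) -> float:
    """Sodium adsorption ratio from concentrations in mg/L.

    SAR = [Na+] / sqrt(([Ca2+] + [Mg2+]) / 2), with all ions in meq/L.
    """
    na = na_mg_l / 23.0    # Na+:  23.0 g/mol, charge 1 -> 23.0 mg/meq
    ca = ca_mg_l / 20.0    # Ca2+: 40.1 g/mol, charge 2 -> ~20.0 mg/meq
    mg = mg_mg_l / 12.15   # Mg2+: 24.3 g/mol, charge 2 -> ~12.15 mg/meq
    return na / math.sqrt((ca + mg) / 2.0)

# Hypothetical sodium-dominated produced water (illustrative values only):
print(round(sar(na_mg_l=5000, ca_mg_l=100, mg_mg_l=50), 1))  # SAR ~100
```

A value in this range far exceeds the tolerance of the sensitive crops in Table 2, which is why desalination, rather than softening alone, is usually needed before irrigation reuse.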

Produced Water Treatment Process Design Considerations
Based on the data in Table 1, most California Central and Coastal basin produced waters have a TDS from ~3000 to 30,000 mg/L, with a SAR between ~80 and >200, chloride between ~2000 and ~20,000 mg/L, and boron from ~3 to ~70 ppm. Moreover, most California produced waters have significant amounts of both carbonate and non-carbonate hardness, including calcium, magnesium, barium and strontium; the non-carbonate hardness is predominantly associated with sulfate. In addition, most California produced waters contain high levels of silica and trace levels (single-digit ppm values or lower) of iron and/or aluminum. Reviewing the produced water quality data and treatment objectives as outlined in the previous section, the primary contaminants of concern (CoCs) are SAR, boron and chloride; hence, most California produced water requires desalination for reuse in agricultural irrigation.
It is generally accepted that reverse osmosis (RO) is the most cost-effective, commercially proven technology for brackish water desalination [2,10,11]. The key element for reliable design and cost-effective operation of any RO process is the design and selection of an appropriate pre-treatment scheme to maximize recovery (reducing the amount of concentrated brine that is produced) and minimize the deleterious effects of RO membrane fouling, which include higher energy demand, down-time for cleaning, cleaning chemicals and premature membrane replacement. To avoid rapid fouling (loss of water permeability) of RO membranes, free and emulsified oil and grease (hexane extractable material, or HEM, per US EPA Method 1664B) and non-oil suspended solids (e.g., clay fines from the formation) need to be removed ahead of the RO process. Next, to avoid rapid degradation (loss of TDS rejection) of RO membranes, soluble hydrocarbons (e.g., alcohols, aldehydes, ketones) and volatile organic compounds (VOCs such as the sum of benzene, toluene, ethylbenzene, and xylenes, or "BTEX", and diesel- and gasoline-range hydrocarbons) need to be removed ahead of the RO membrane process. Finally, to enable high recovery, concentrations of sparingly soluble metals and minerals (e.g., iron and aluminum, silica, calcium carbonate, and calcium, barium, and strontium sulfates) may need to be significantly reduced.
With appropriate pre-treatment, the RO membrane process operation is stable and consistent, with little buildup in differential (feed-to-brine) or trans-membrane (feed-to-permeate) hydraulic pressure loss. After primary TDS reduction by RO, boron can still be well above the 1 mg/L maximum contaminant level required for agricultural irrigation. Boron chemistry in water is well established [12]; typically, at pH levels below ~9, boron predominantly exists as boric acid (B(OH)3), which is poorly rejected by RO membranes because it is small, polar and uncharged. There are conflicting interpretations for the origin of the acidity of aqueous boric acid solutions. Raman spectroscopy of strongly alkaline solutions has shown the presence of B(OH)4− ions [13], leading some to conclude that the acidity is exclusively due to the abstraction of OH− from water, according to [13][14][15]:

B(OH)3 + 2H2O ⇌ B(OH)4− + H3O+ (Ka ≈ 7.3 × 10−10; pKa = 9.14) (1)

Hence, above a pH of ~9, boric acid (B(OH)3) forms borate (B(OH)4−), which is rejected very highly (>90%) by most brackish water RO membranes [16]. As a result, depending on the concentration of boron in produced water, a post-treatment polishing step may be required, comprising either: (a) boron-selective ion-exchange resins or (b) a second pass of brackish water RO membranes with pH adjustment to about pH 10.
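The shift from boric acid to borate with pH can be estimated from Equation (1) with a simple Henderson-Hasselbalch relation. This is a single-equilibrium sketch that ignores ionic-strength and temperature effects in real brines, but it illustrates why the second RO pass is operated at about pH 10:

```python
# Fraction of total boron present as the well-rejected borate anion
# B(OH)4-, using pKa = 9.14 from Equation (1) in the text.
PKA_BORIC = 9.14

def borate_fraction(ph: float) -> float:
    """Henderson-Hasselbalch fraction of boron in the borate form at a given pH."""
    return 1.0 / (1.0 + 10 ** (PKA_BORIC - ph))

for ph in (7.0, 9.14, 10.0):
    print(f"pH {ph:5.2f}: {100 * borate_fraction(ph):5.1f}% borate")
```

At pH 7 less than 1% of the boron is in the highly rejected borate form, rising to 50% at the pKa and to nearly 90% at pH 10, which is consistent with the pH adjustment recommended for the second-pass RO.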

Pilot-Scale Case Study
The case study reported herein is a two-month-long field pilot test project performed in the DOGGR (CalGEM) Coastal District of California in 2018. The primary treatment objective was to meet the beneficial reuse standards required by the neighboring agricultural end-users (see Table 5), while the secondary objective was to maximize the RO process recovery and minimize RO brine volumes, because the cost of brine disposal in this region is exceptionally high (~$38 per m³; ~$6 per barrel). The pilot had a throughput of 3.4 m³ per hour. Table 5 outlines the concentrations of the key produced water quality parameters, as well as the treatment objectives for this project. The influent concentrations are single-point measurements, and the method and reporting limit (RL) for each parameter are shown in Appendix A. The treatment objectives were defined by the oil producer. From the information in Table 5, the principal CoCs in the influent were:

Produced Water Quality and Treatment Objectives
Additionally, to protect the RO membranes in operation, pre-treatment would need to be selected to target the removal of the following constituents:
• Oil and grease
• Suspended solids
• TPH and other dissolved organic constituents
• Hardness and alkalinity
• Sparingly soluble metals and minerals

RO Pretreatment Process Design
The raw influent and softened influent water quality from Table 5 were modeled to evaluate scale formation potential using Genesys International's MM4 software. A number of minerals were supersaturated such that catastrophic scaling would be expected at only 10% product water recovery (Table 6). These include the mineral species CaCO3, BaSO4, Fe(OH)3, Mn(OH)2 and SiO2. Since there was both carbonate and non-carbonate hardness, lime-soda softening was evaluated. The lime-soda softened water was well below saturation for all the relevant minerals, and when adjusted to pH 6.5, none of the minerals exceeded their saturation concentrations up to 70% recovery. With proper antiscalant addition, all metals and minerals remain well below saturation levels up to 70% recovery, and with slightly higher antiscalant dosing, up to 84% recovery (not shown). Accordingly, after a review of multiple technologies [2], it was decided that upfront chemical softening followed by TSS removal would sufficiently lower the scaling and fouling potential of the produced water such that it could be treated by RO at sufficiently high recovery (75-80%). The process included an upfront oil-water separator to manage inevitable variations in oil and grease and soluble hydrocarbons, in addition to improving the efficiency of the chemical softening process. An intermediate ultrafiltration (UF) membrane filtration system was used downstream of chemical softening, which significantly lowered the particulate fouling potential of the RO feed, providing an absolute barrier to TSS above 10 nm. Additionally, granular activated carbon (GAC) was employed to remove soluble hydrocarbons that could rapidly damage RO membranes. Using the above process design approach, the treatment process proposed and deployed for the pilot was divided into six operating zones detailed below. A process flow diagram is shown in Figure 2. A picture of the pilot site (approximately 9000 ft² in area) is shown in Figure 3.
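The link between recovery and scaling risk follows from the brine-side concentration factor, CF = 1/(1 − r): at 75% recovery the brine is roughly four times as concentrated as the feed. The sketch below uses hypothetical silica numbers (not the pilot's measured values) to show why an unsoftened feed can exceed saturation even at modest recovery:

```python
# Brine-side concentration factor as a function of RO recovery, assuming
# complete solute rejection (a conservative simplification).
def concentration_factor(recovery: float) -> float:
    """CF = 1 / (1 - r): how much the brine concentrates at recovery r."""
    return 1.0 / (1.0 - recovery)

# A sparingly soluble species scales once feed_conc * CF exceeds its
# (antiscalant-adjusted) saturation limit. The numbers below are
# hypothetical placeholders for illustration only.
feed_silica_mg_l = 120.0   # illustrative feed silica concentration
silica_limit_mg_l = 150.0  # illustrative saturation limit

for r in (0.10, 0.50, 0.75, 0.84):
    brine = feed_silica_mg_l * concentration_factor(r)
    flag = "SCALING RISK" if brine > silica_limit_mg_l else "ok"
    print(f"recovery {r:.0%}: brine silica ~{brine:.0f} mg/L ({flag})")
```

This is why softening (and, where needed, antiscalant dosing) must bring the feed concentrations far enough below saturation that the 4-6x concentration at 75-84% recovery still stays within limits.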

Figure 3. Image of the pilot test site. Zones indicate different water treatment objectives. Zone 1: oil-water separation; zone 2: chemical softening and clarification; zone 3: suspended solids removal; zone 4: dissolved hydrocarbon removal; zone 5: primary desalination and secondary boron removal; zone 6: solids dewatering and solid waste management.

Influent and Treated Water Quality
Throughout the pilot, the water quality parameters listed in Table A1 in Appendix A were monitored at the inlet and outlet of each zone at regular intervals using field instruments onsite. Further, a more detailed offsite analysis of minerals and organic constituents of the influent produced water and treated product water was performed on a weekly basis. See Appendix A (Table A2) for a complete list of onsite and offsite analyses and methods used, and Appendix B for the number of samples by location and water quality parameter. While most influent concentrations remained fairly stable during the course of the pilot, the influent oil and grease concentrations varied significantly, with daily onsite measurements ranging from as low as 8 mg/L to as high as 1013 mg/L. The variations in the oil and grease content can be attributed to the efficacy of the primary separators and/or feed level changes during the day: typically, the lower the feed tank level, the more oil from the skimmed oil layer on top gets into the influent. This is why any advanced produced water treatment system requires a robust oil/water separator as the first step in the treatment process, to absorb these variations and protect the downstream unit operations and their efficiency. The treated water parameters, however, were stable and comfortably met the treatment objectives. Table 7 outlines the changes in concentrations of different water quality parameters across the process zones versus the treatment objective for each parameter. The number of samples taken at each event, as well as the standard error (calculated using Microsoft Excel and divided by the total number of samples for each case), are reported in Appendix B. The concentrations shown are the averages of analyses done over the course of the pilot. Taking these values into account, the removal efficiencies of various categories of contaminants by each zone have been calculated and are shown in Table 8.
Key process parameters for each zone are also presented in Table 8; these are the parameters needed to size the system at larger scales.

Capital and Operating Costs
A cost model was developed that uses the data from the pilot results to estimate capital and operating costs (CAPEX and OPEX, respectively) at larger scales. Figure 4 shows the working diagram of the model. Based on the inputs summarized in Table 9, we estimated CAPEX and OPEX for the following scenarios:

• Scenario 1: Baseline, similar to the pilot process configuration
• Scenario 2: Baseline with a 5% UF filtrate blend to take advantage of the good quality of the permeate and reduce some of the capital and operating costs
• Scenario 3: Blend with membrane-based RO concentrate treatment

The throughput considered in Table 9 is typical for most small to mid-sized producers across California. The resulting CAPEX and OPEX are shown in Table 10. Generally, increasing the overall process recovery reduces the overall operating costs through savings in waste management. From a pure cost perspective (CAPEX plus OPEX), Scenario 2 appears to be the most cost-effective of the three evaluated. That said, there are several life-cycle and intangible benefits associated with managing less brine volume that are not included in this estimate. For example, less brine volume to dispose of means more disposal well capacity and longer well lifetime. It also means smaller downstream infrastructure, such as tanks, pumps, and pipelines, and more available fresh water. The OPEX breakdown per category is provided in Table 11.
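The cost comparison logic behind these scenarios can be sketched generically: amortized CAPEX per cubic meter treated plus per-unit OPEX, where higher recovery shrinks the brine-disposal portion of OPEX. Every number below is a hypothetical placeholder, not the pilot's actual figure:

```python
# Generic levelized-cost sketch for comparing treatment scenarios.
# All numeric inputs are hypothetical placeholders for illustration.

def levelized_cost_per_m3(capex_usd: float, opex_usd_per_m3: float,
                          throughput_m3_per_day: float,
                          amortization_years: float = 20.0) -> float:
    """Straight-line amortized CAPEX (no discounting) plus OPEX, per m^3 treated."""
    lifetime_m3 = throughput_m3_per_day * 365.0 * amortization_years
    return capex_usd / lifetime_m3 + opex_usd_per_m3

def brine_disposal_usd_per_m3_feed(recovery: float,
                                   disposal_usd_per_m3_brine: float) -> float:
    """Brine volume per m^3 of feed is (1 - recovery), charged at the disposal rate."""
    return (1.0 - recovery) * disposal_usd_per_m3_brine

# Illustrative comparison: raising recovery from 75% to 84% at a
# placeholder disposal rate of $10/m^3 of brine.
for r in (0.75, 0.84):
    print(f"recovery {r:.0%}: brine disposal "
          f"${brine_disposal_usd_per_m3_feed(r, 10.0):.2f} per m^3 of feed")

cost = levelized_cost_per_m3(capex_usd=2.0e6, opex_usd_per_m3=0.90,
                             throughput_m3_per_day=500.0)
print(f"illustrative levelized cost: ${cost:.2f}/m^3")
```

A discounted cash flow version of the same calculation would add a capital recovery factor, but the ranking of scenarios by waste-management savings is already visible in this simple form.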

Conclusions
From the 508.7 million cubic meters (3.2 billion barrels) of produced water generated annually in California, less than 2% receives some form of advanced treatment for beneficial reuse. Tightening regulations, California's limited access to freshwater resources (especially in oil-producing regions), and the proximity of most oilfields to farmlands will likely increase the demand for beneficial reuse. The majority of produced waters across California have high TDS levels. Comparing that and other inorganic and organic parameters against the requirements for agricultural water and applicable regulations, advanced treatment in the form of membrane-based desalination (specifically RO) appears to be the most cost-effective and commercially available solution. For membrane-based desalination to operate sustainably, care must be given to the design of the pre-treatment. The main categories of contaminants that need to be addressed are free oil and grease, suspended solids, dissolved organics and hydrocarbons, hardness and inorganic salts, and silica. Boron is another parameter that is universally present at potentially toxic levels in produced waters across California; in order to bring boron down to acceptable levels, consideration needs to be given to boron chemistry and membrane rejection when designing the RO component of the process. Finally, using data from an actual pilot project, our economic model shows that the combined capital and operating cost of advanced treatment of high-hardness, high-TDS brackish produced water is within $1.43 to $1.46 per m³ of produced water treated. This may be higher than the current cost of fresh water (typically in the range of $0.49-0.92 per m³), but it is certainly an affordable option when considering regulatory pressures, limited disposal infrastructure, and the rising costs of water management, such as transportation to other disposal sites.

Table A3. Number of Sampling Events and Standard Errors for Data Presented in Table 8.