This report presents the phase-two groundwater-flow
simulations and predicted effects of groundwater irrigation
on stream base flow in the Elkhorn and Loup River Basins
of central Nebraska.
This report describes the construction
and calibration of the simulations and the methods used to
predict changes to stream base flow that result from changes
to groundwater irrigation. Effects of groundwater irrigation
were evaluated using three distinct approaches: (1) a base-flow
depletion analysis, derived from results of the model simulation, mapped the spatial distribution of the percentage of
pumped water that causes base-flow depletion at the end of a
50-year period; (2) groundwater-flow simulations were used to
predict changes to stream base flow that resulted from changes
to the amount of irrigated area during a 25-year period; and
(3) a simulation-optimization model determined the minimum
reduction of groundwater pumpage that would be necessary
in the Elkhorn River Basin to maintain various hypothetical
levels of base flow in the Elkhorn River. The climate, land
use, water use and management, hydrogeology, and general
description of the conceptual model were described by Peterson and others (2008) and are not presented herein.
Simulation of Groundwater Flow
This section describes the topical background, methods, and results for developing the phase-two simulation of
groundwater flow. The simulation, or model, was developed
to simulate groundwater flow, groundwater withdrawals, and
stream-aquifer interactions for the Elkhorn and Loup River
Basins, Nebraska. To simulate those processes, large amounts
of hydrogeologic data from numerous sources were needed
to describe aquifer properties and hydrologic stresses. These
data were compiled as spatially referenced data layers within a
geographic information system (GIS) and then assigned to the
simulation at discrete intervals in space and time. Simulations
were built for this study using MODFLOW–2005 (Harbaugh,
2005), with assistance from Groundwater Vistas Version 5
software (Environmental Simulations, Inc., 2009).
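As a concrete illustration of that assignment step, the sketch below block-averages a hypothetical fine-resolution GIS raster onto the model grid (the 162-row by 248-column grid is described in the “Spatial and Temporal Discretization” section; the raster resolution and values here are invented for illustration):

```python
import numpy as np

# Hypothetical GIS layer: a property such as hydraulic conductivity (ft/d)
# rasterized at quarter-mile resolution, 4 raster cells per model cell.
rng = np.random.default_rng(0)
fine = rng.uniform(10.0, 80.0, size=(162 * 4, 248 * 4))  # placeholder values

# Assigning a spatially continuous layer to discrete cells necessarily
# averages it: each 1-mi model cell receives the mean of the raster cells
# that fall inside it.
coarse = fine.reshape(162, 4, 248, 4).mean(axis=(1, 3))
print(coarse.shape)  # (162, 248): one value per model cell
```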
The hydrogeologic data (simulation parameters) describing the study area were assigned to the simulation directly and through calibration. For the direct case, characteristics such as recharge, land use, streambed properties, and hydraulic conductivity were introduced into the simulation using the best available information (appendix 1) and used as compiled. Once all available information was compiled and entered into the simulation, the results from the simulation were compared to measured groundwater levels, decadal groundwater-level changes, and estimated groundwater discharge to streams (hereinafter referred to as base flow). Differences between simulated and measured or estimated values were used to guide calibration, which is the process of obtaining parameter values to construct a framework useful for describing the hydrogeologic characteristics of the study area (Reilly and Harbaugh, 2004). Simulations were calibrated by adjusting selected parameters until simulated groundwater levels, decadal groundwater-level changes, and base flow best reproduced measured values (see “Calibration” section of this report).
Calibration proceeded in two stages. In the first stage, manual trial-and-error calibration techniques were used to adjust average recharge from precipitation, additional recharge beneath irrigated and nonirrigated cropland, horizontal hydraulic conductivity (KH), streambed hydraulic conductivity (KS), and maximum evapotranspiration (ET) rates from groundwater to achieve the best match with measured groundwater levels, decadal groundwater-level changes, and base-flow data. During the manual trial-and-error stage, recharge from precipitation was calibrated as a constant, average rate throughout the simulation period rather than as a time-variable rate. The second stage of calibration used automated calibration techniques and incorporated recharge from precipitation as a temporally changing value. The automated, or inverse-modeling, calibration stage used the Parameter Estimation software PEST (Doherty, 2008a, 2008b) (appendix 2). Adjustable parameters for the automated calibration were recharge from precipitation and KH.
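In the manual stage, each trial run is judged by comparing simulated values against the calibration targets. A minimal sketch of that comparison follows, using invented observation values (nothing below is taken from the report itself):

```python
import numpy as np

# Hypothetical calibration targets: measured groundwater levels (ft) at
# observation wells, and simulated levels extracted at the same locations.
measured_heads = np.array([2015.3, 1987.6, 1899.2, 1745.0])   # placeholder values
simulated_heads = np.array([2012.8, 1990.1, 1902.7, 1741.5])  # placeholder values

# Residual statistics guide the trial-and-error adjustment of recharge, KH,
# KS, and ET: after each run, parameters are nudged and these are recomputed.
residuals = simulated_heads - measured_heads
mean_error = residuals.mean()            # bias (systematic over/under prediction)
rmse = np.sqrt(np.mean(residuals ** 2))  # overall misfit

print(f"mean error: {mean_error:.2f} ft, RMSE: {rmse:.2f} ft")
```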
Spatial and Temporal Discretization
To simulate flow using MODFLOW, the study area is
divided into a grid of discrete cells. Hydrogeologic properties, initial conditions, and simulation results are assigned to
each grid cell. The actual hydrogeologic system is continuous
rather than discrete; therefore, groundwater-flow simulations
are always an approximation of the actual system. Simulations
with a smaller grid-cell size generally yield more accurate
approximations of the actual system because less averaging
occurs as spatially variable properties are assigned to grid
cells, especially where large changes take place over small
distances. The study area was simulated using a uniformly
spaced grid of 162 rows and 248 columns of 1-mile (mi)
by 1-mi cells, covering an area of 40,176 mi² (fig. 2). This
is a refinement of the phase-one simulation, which used grid
cells 2 mi by 2 mi in extent. The active simulation area, which
is smaller than the extent of the model domain, encompasses
29,707 mi² and includes areas with an estimated aquifer saturated thickness of at least 10 feet (ft). Similar to the phase-one
simulations, a single unconfined layer was used to simulate the
aquifer.
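A minimal FloPy sketch of this spatial discretization is given below. FloPy is used here only for illustration (the report states the model was built with MODFLOW-2005 and Groundwater Vistas, not FloPy), and the elevations are placeholders:

```python
import flopy

# Sketch of the phase-two grid: one unconfined layer, 162 rows by 248 columns
# of 1-mi (5,280-ft) square cells. Elevations are placeholders; the actual
# model takes them from compiled GIS layers.
mf = flopy.modflow.Modflow(modelname="elm_sketch", version="mf2005")
dis = flopy.modflow.ModflowDis(
    mf,
    nlay=1,          # single unconfined layer, as in the phase-one simulations
    nrow=162,
    ncol=248,
    delr=5280.0,     # cell width along rows: 1 mi in feet
    delc=5280.0,     # cell width along columns: 1 mi in feet
    top=2000.0,      # placeholder land-surface elevation (ft)
    botm=1900.0,     # placeholder aquifer-base elevation (ft)
)
# Cells outside the 29,707-mi² active area would be flagged inactive through
# the basic package's ibound array (not shown here).
```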
If a simulation is used to evaluate the aquifer system as a
function of time, it is referred to as a transient simulation and
is divided into discrete time intervals called stress periods.
Hydrologic stresses, such as recharge and pumping, are held
constant within each stress period. In the Elkhorn-Loup Model (ELM) study area,
major changes in land-use practices occurred from 1895 to
1940 and from 1940 through 2005. Starting in 1895, irrigation canals were constructed, and water was diverted from
streams for agriculture. Simulation of conditions from 1895
through 1939 used two stress periods (1895 to 1929 and 1929
through 1939) to represent the two time periods when new
canal systems became operational and caused a change to
recharge from canal seepage (see “Additional Recharge from
Canal Seepage”). From 1940 through 2005, irrigated agriculture expanded to include wells and additional canals. The 1940
through 2005 period was simulated using 66 stress periods,
one stress period for each year. Simulated hydrologic stresses
were updated during each of those annual stress periods so that
changes to land use and irrigation development with time are
represented in the simulation.
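Written out, the stress-period lengths implied by those date ranges are as follows (a plain-Python sketch; the split of 1895 through 1939 into 34- and 11-year periods follows the dates given above, and the days-per-year constant is a simplification):

```python
DAYS_PER_YEAR = 365.25  # simplifying assumption for period lengths

# Two multiyear stress periods span 1895-1939, bracketing the dates when new
# canal systems began operating and changed recharge from canal seepage.
perlen = [34 * DAYS_PER_YEAR,   # 1895 through 1928
          11 * DAYS_PER_YEAR]   # 1929 through 1939

# Sixty-six annual stress periods span 1940 through 2005, so land use and
# irrigation development can be updated each year.
perlen += [DAYS_PER_YEAR] * 66

print(len(perlen))  # 68 stress periods covering 1895 through 2005
```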
Groundwater levels were needed to represent 1895 conditions at the beginning of the simulation (Reilly and Harbaugh, 2004), but measured groundwater levels were not available for 1895; therefore, a pre-1895 period was simulated to represent the system in long-term equilibrium, or steady-state conditions. When a steady-state simulation is used to define starting
conditions for a transient simulation, the steady-state simulation
uses the same aquifer properties and hydrologic stresses, with
the exception of stresses such as pumping. This period was simulated using a single transient stress period that was 1,000 years
long. It was determined that 1,000 years was a sufficient amount
of time to reproduce long-term equilibrium conditions because
simulated change to groundwater storage was close to zero. This
approach was used in place of a true steady-state stress period, a
single stress period having a single time step and a storage term
set to zero, because it helped prevent numerical instability in the
far northeast corner of the study area and resulted in fewer dry
cells. Dry cells are cells that become inactive when calculated
interim groundwater levels drop below the simulated base of
the aquifer during the iterative approximations of groundwater-flow equations (Harbaugh, 2005). During calibration, simulated results of the pre-1895 period were not compared to calibration targets; however, 1895 simulated groundwater levels were used as starting groundwater levels for the 1895 through 1939 simulation, and 1939 simulation results were compared to measured groundwater levels and estimated base flows (see “Calibration” section of this report). This was considered appropriate because water development from 1895 through 1939 only occurred in a relatively small area along the southern boundary of the simulation.
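The equilibrium test mentioned above, that simulated change in groundwater storage is close to zero by the end of the 1,000-year period, reduces to a simple budget check. The sketch below uses invented budget values and an assumed tolerance; the actual rates would come from the model's volumetric water budget:

```python
# Hypothetical end-of-spin-up budget rates (ft^3/d) from a model's volumetric
# budget; the values below are placeholders, not results from the report.
storage_in = 1.2e4    # water released from aquifer storage
storage_out = 1.1e4   # water taken into aquifer storage
total_flow = 5.0e6    # total inflow through all budget terms

# Long-term equilibrium: the net storage change should be a tiny fraction of
# the overall water budget at the end of the 1,000-year stress period.
net_storage_change = storage_in - storage_out
if abs(net_storage_change) / total_flow < 0.005:  # 0.5 percent, an assumed tolerance
    print("storage change is near zero -> treat as steady state")
else:
    print("still equilibrating -> lengthen the spin-up period")
```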