Learn

This page walks through the basics of what data centers and AI actually are, why they are not inherently harmful, and where the real problems lie, with particular focus on communities in water-stressed regions like El Paso. Every section includes sources you can open and verify.

Data Centers 101

Data centers

What they are, where they came from, and why they are not inherently harmful.

What is a data center?

A data center is a building or campus of buildings filled with computers that store and process digital information for companies, governments, or online services. Think of it as a warehouse for computing: servers run 24/7, which means constant electricity and cooling. Cooling often involves water or large air systems, depending on design and climate.

Every time you send an email, stream a video, check the weather, or process a credit card transaction, a data center somewhere is doing the work behind the scenes.

A brief history

Data centers are not new. The concept dates to the 1940s and 1950s, when room-sized mainframe computers needed dedicated, climate-controlled spaces. Through the 1970s and 1980s, corporate computer rooms grew alongside the minicomputer and early PC era.

The modern data center took shape in the 1990s with the rise of the internet: companies needed reliable, always-on server rooms to host websites and handle traffic. Colocation facilities appeared, letting multiple companies share the same building, power, and cooling.

By the 2010s, cloud computing concentrated workloads into massive hyperscale campuses operated by Amazon, Google, Microsoft, and others. Today these campuses can draw hundreds of megawatts, more power than a small city, and the pace of expansion is accelerating.

Are data centers bad?

No. Data centers enable services that modern society depends on. Hospitals use them for electronic health records and telemedicine. Emergency services rely on them for 911 dispatch routing. Weather modeling, genomics research, financial transaction processing, and government benefits systems all run on data center infrastructure.

Modern life depends on data centers. The civic question is whether a specific facility’s design and location match the local water supply and power grid, with binding commitments rather than vague promises.

A well-sited, efficiently cooled data center in a region with abundant water and renewable energy is a positive economic asset. The same building in a drought-prone desert with a strained grid is a different story entirely.


AI 101

AI & machine learning

What AI actually means, how long it has been around, and why the technology itself is not the problem.

What does AI mean here?

Artificial intelligence, as engineers use the term, refers to software that can recognize patterns, make predictions, or automate decisions under uncertainty, without being explicitly programmed for every scenario. In practice that means statistics and optimization at scale. The popular image of a sentient robot misses how these systems actually work.

Machine learning (ML) is the most common approach: instead of hand-coding rules, you feed a system large amounts of data and let it find patterns. Deep learning is a subset of ML that uses layered neural networks, the technology behind image recognition, language translation, and tools like ChatGPT.
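The "learn from data instead of hand-coding rules" idea can be shown with a toy sketch. This is an illustration only, not a real spam filter; the feature (exclamation-mark counts) and the training data are made up for the example.

```python
# Toy illustration of machine learning: instead of hand-writing a rule
# like "more than 3 exclamation marks means spam", derive the threshold
# from labeled examples.

def learn_threshold(examples):
    """Pick the midpoint between each class's average feature value."""
    spam = [x for x, label in examples if label == "spam"]
    ham = [x for x, label in examples if label == "ham"]
    return (sum(spam) / len(spam) + sum(ham) / len(ham)) / 2

def classify(x, threshold):
    return "spam" if x > threshold else "ham"

# Feature: number of exclamation marks in a message (made-up data)
training = [(9, "spam"), (7, "spam"), (1, "ham"), (0, "ham")]
t = learn_threshold(training)

print(classify(10, t))  # a very shouty message
print(classify(1, t))
```

Real ML systems use far richer features and models, but the shape is the same: the rule comes out of the data rather than out of a programmer's head.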

A brief history of the field

AI is not a product of the 2020s. The field was formally named in 1956 at a Dartmouth College workshop, but the intellectual roots go back further: Alan Turing’s landmark 1950 paper “Computing Machinery and Intelligence” posed the question of whether machines could think.

From the 1960s through the 1980s, researchers pursued symbolic AI built on hand-crafted rules and expert systems. Results were promising but brittle. Starting in the 1990s, statistical machine learning gained traction as computing power grew and data became more available.

The deep learning revolution began around 2012 when neural networks dramatically outperformed older methods in image recognition. Since then, deep learning has expanded to natural language, protein folding, autonomous vehicles, and generative media. The field has been active for over 70 years. The recent hype cycle is just the latest chapter.

AI you already use every day

Long before anyone marketed “AI-powered” products, machine learning was quietly embedded in services most people use without thinking about it:

Credit card fraud detection
Email spam filters
Netflix and Spotify recommendations
GPS route optimization
Voice assistants (Siri, Alexa)
Auto-correct and predictive text
Online search ranking
Photo face grouping
Language translation
Weather forecasting models

Many layers of the internet and the services built on it rely on statistical and ML methods. The internet itself predates modern ML, but today it depends on these systems for routing, security, content delivery, and more. AI is infrastructure you have been using for years. It only sounds novel when vendors put a fresh label on it.


The real issue

What we are actually worried about

Data centers and AI are not the enemy. The problem is irresponsible siting, governance gaps, and corporate hype that inflates demand for rapid expansion.

This is a quick overview. For detailed issues visit the Data Center page.

Wrong place, wrong resources

A data center that uses potable water for cooling in a region under chronic drought stress is a fundamentally different proposition than the same building in the Pacific Northwest. El Paso sits in the Chihuahuan Desert, depends on the Rio Grande and the Hueco Bolson aquifer (both under strain), and faces summer temperatures that push cooling demand to its peak.

Ask whether this specific facility should consume these specific shared resources in this specific location, and under what enforceable terms. Whether data centers exist in general is already settled; the live debate is about this project and this place.

AI hype and artificial demand

The current wave of corporate AI marketing has created a gold-rush atmosphere. Companies are slapping “AI-powered” labels on trivial features such as search bars, email sorting, and photo filters, which either already used ML quietly or barely benefit from it. This marketing inflates the perceived need for new compute capacity.

When every company claims it needs cutting-edge AI infrastructure, the aggregate demand projections balloon. Those inflated projections are then used to justify rapid data center expansion, often in regions where the sales pitch outpaces the actual local benefit or resource availability.

The governance gap

Large infrastructure projects touch shared resources: water, electricity, land, roads, and the community’s quality of life. Environmental justice and critical infrastructure frameworks help communities understand risks and negotiate fair outcomes with accountability at the center.

The gap today is that many municipalities approve data center developments using processes designed for warehouses or light industrial use, without requiring the water, power, and noise commitments that a 24/7 compute facility actually demands.


El Paso and the region

Desert Southwest context

Why the same facility that works fine in a wet climate can be a serious concern here.

Water stress in the region

El Paso’s water comes primarily from the Rio Grande and the Hueco Bolson aquifer. Both face long-term depletion pressure from population growth, agriculture, and climate change. The city has invested heavily in conservation and reclaimed-water programs, but that progress can be undermined by a single large industrial user drawing potable water for cooling.

When reviewing a data center proposal, ask whether the facility will use potable or reclaimed water, what its peak daily draw will be, and what happens during drought restrictions. These questions go beyond narrow technical detail. They are community survival questions.

Grid capacity and cooling

Summer temperatures in El Paso regularly exceed 100 °F. Cooling a building full of heat-generating servers in that climate is inherently more energy-intensive than in cooler regions where outside air can supplement mechanical cooling.

Ask for expected megawatt demand, grid interconnection plans, backup generation fuel type, and how costs or outages could affect residential neighbors. A facility that promises 50 MW today may request 200 MW in phase two, after the initial approvals are locked in.
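To put megawatt figures in perspective, a rough back-of-envelope conversion helps. The sketch below assumes the facility runs at its stated load around the clock and uses a US-average household figure of roughly 10,700 kWh per year; both are illustrative assumptions, not numbers from any specific filing, and local utility data would give a better comparison.

```python
# Rough sense of scale for a data center's continuous electrical draw.
# Assumptions (illustrative only):
#   - the facility runs at its stated load 24/7
#   - an average US household uses ~10,700 kWh per year

def homes_equivalent(megawatts, kwh_per_home_per_year=10_700):
    annual_kwh = megawatts * 1_000 * 24 * 365  # MW -> kW, then kWh per year
    return annual_kwh / kwh_per_home_per_year

print(round(homes_equivalent(50)))   # a phase-one promise
print(round(homes_equivalent(200)))  # a possible build-out
```

Under these assumptions, 50 MW of continuous load is on the order of forty thousand homes' worth of annual electricity, which is why the phase-two question matters so much.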


Regional snapshot

Lessons from other places

Short summaries with sources. For the full harm analysis, charts, and policy ideas, use the Data centers page.

Water: closed-loop language and real makeup

Marketing may stress a closed fluid path next to servers, but heat still leaves the building. Large daily water permits are a signal that evaporation or makeup water is in the plan—not just pipe fill.

Corporate “water positive” programs can help habitats elsewhere; they do not automatically protect the Hueco or Mesilla Bolson water your neighbors drink unless agreements say so.

Energy: filings, gas proposals, and hourly reality

News reporting on El Paso Electric filings describes modular gas generation and questions about who pays during and after any bridge period. Those details belong in open dockets, not slogans.

Annual renewable matching is not the same as clean power every hour. Night-time load can still sit on fossil plants without massive storage.

Air: ozone planning and backup engines

Research in dense data center markets shows generator fleets matter for NOx and particulates. El Paso’s ozone challenges make new combustion worth scrutinizing.

Noise: cities tightening rules after the fact

Phoenix-area debates show retrofitting standards after saturation is painful. Measuring at the source property and capping decibels early prevents the Arizona pattern.

Jobs and incentives

Press reporting places Meta’s El Paso campus near $10 billion in planned spend while citing on the order of 300 permanent roles as a company promise, not a contractual floor. That is still an extreme capital-per-job ratio if you take the headline at face value. Good Jobs First and similar groups document clawback policies so public money tracks outcomes.

What El Paso can require

Reclaimed cooling water, intensity limits, public dashboards, clawbacks, and limits on running backup generators for profit are all on the menu—see the full page for a playbook.

Open policy recommendations

Civic toolkit

How to read a proposal and speak up

You do not need a technical background. Ask for numbers, timelines, and who enforces them.

The five questions to ask

1. How much water will this facility use, and what kind?

Ask for annual gallons, peak daily use, whether it is potable or reclaimed, and what happens during drought restrictions.

2. How much electricity will it draw?

Ask for megawatt capacity at build-out (not just phase one), grid interconnection details, and backup generation plans.

3. What are the binding commitments?

Promises in slide decks are not enforceable. Ask which numbers appear in the zoning conditions, permits, or water-use agreements.

4. Who enforces the terms, and what happens if they are violated?

A commitment without an enforcement mechanism and penalties is a suggestion. Ask which agency monitors and what the consequences are.

5. What is the net benefit to this community, specifically?

Jobs, tax revenue, and infrastructure improvements should be quantified and compared against resource costs and quality-of-life impacts.

Public hearings and comment periods

Most large developments require public hearings before city council or the county commissioners court. You can attend in person, submit written comments during open comment periods, or both. Your voice is strongest when it references specific numbers (water draw, megawatts, jobs) rather than general opposition.

Check the city and county meeting agendas online. If a data center project is on the agenda, the staff report is usually posted several days before the hearing. Read it. Prepare two or three of the questions above and ask them on the record.


Quick reference

Glossary

Common terms you will see in data-center, energy, air-quality, and local incentive discussions.

Chapter 380 agreement

Under Chapter 380 of the Texas Local Government Code, cities may offer economic development incentives such as grants or tax rebates tied to a project. The local contract with a developer is often called the 380 agreement.

Clawback

Contract terms that let a government recover part of an incentive if a company misses promised jobs, investment, or other targets.

Colocation

A facility where multiple companies rent rack space and share power, cooling, and physical security rather than building their own building.

Data heat island (DHI)

A measured pattern where land surface temperatures rise near large AI or hyperscale data centers after they operate, separate from ordinary urban heat. Studies use satellites to compare before and after.

Decibel (dBA)

A-weighted decibels measure how loud sound seems to human hearing. City noise codes often set dBA limits at a property line or fence.

Deep learning

A subset of ML using neural networks with many layers. Powers image recognition, language translation, and generative AI tools.

Environmental justice

The fair treatment of all people regarding environmental laws and siting. It highlights when pollution or industrial burdens fall more heavily on some neighborhoods, often lower-income or historically marginalized.

Evapotranspiration

Water vapor released by plants and soil. It cools the air naturally; paving and sparse tree cover reduce this cooling in cities.

Grid interconnection

The utility process to connect a large new electrical customer to transmission and distribution. It can require studies, upgrades, and public rate cases.

Ground-level ozone

A harmful air pollutant (smog) formed when nitrogen oxides and other compounds react in sunlight and heat. It is not the same as the protective ozone layer high in the atmosphere.

Hyperscale

A data center operated by a cloud giant (Amazon, Google, Microsoft, Meta) with capacity exceeding 5,000 servers and 10,000+ sq ft, often much larger.

Land surface temperature (LST)

Temperature of the ground, roofs, or other surfaces as measured from satellites or sensors. It can differ from air temperature in the shade and is used in heat and data-center impact research.

Machine learning (ML)

A subset of AI where systems learn patterns from data instead of following hand-written rules. Most modern AI applications are ML-based.

Megawatt (MW)

A unit of power equal to one million watts. A single large data center can draw 50–200 MW, comparable to tens of thousands of homes.

Nitrogen oxides (NOx)

Pollutants from fuel combustion (including backup generators and power plants). They help form ground-level ozone and particle pollution.

Non-attainment area

An EPA designation for a region that does not meet a federal air quality standard. The designation remains until pollution-reduction plans bring the region back into compliance.

PM2.5

Fine particulate matter 2.5 microns or smaller. It can penetrate deep into lungs and is tracked in air quality indexes.

Potable vs. reclaimed water

Potable water is treated to drinking standards. Reclaimed (or recycled) water is treated wastewater reused for industrial cooling, which is often the better fit for data centers.

PUE (Power Usage Effectiveness)

The ratio of total facility energy to IT equipment energy. A PUE of 1.0 is perfect efficiency; most facilities land between 1.2 and 1.6. Lower is better.
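The PUE ratio is simple enough to compute directly. The numbers below are hypothetical, chosen only to illustrate the formula:

```python
# PUE = total facility energy / IT equipment energy.
# A PUE of 1.0 would mean every kWh entering the building reaches servers;
# the overhead above 1.0 is cooling, power conversion losses, lighting, etc.

def pue(total_facility_kwh, it_equipment_kwh):
    return total_facility_kwh / it_equipment_kwh

# Hypothetical site: 1.3 kWh consumed overall per 1.0 kWh reaching servers
print(pue(1_300_000, 1_000_000))  # 1.3
```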

Renewable Energy Certificate (REC)

A market instrument tied to renewable generation attributes. Buying RECs for annual matching does not by itself prove clean power every hour unless contracts require hourly alignment.

Training vs. inference

Training is the computationally intense phase where an AI model learns from data. Inference is using the trained model for everyday predictions. A single inference request is far lighter than training, but the aggregate volume of requests is enormous.

Urban heat island (UHI)

The tendency of built areas to stay warmer than nearby open land because of pavement, buildings, waste heat, and less plant cooling.

WUE (Water Usage Effectiveness)

Water use divided by IT energy, often reported in liters per kilowatt-hour (L/kWh). It describes how much cooling water a site needs per unit of computing; lower usually means better efficiency.
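A reported WUE can be turned into a daily water figure you can compare against a permit application. The inputs below (a 50 MW IT load running flat out at a WUE of 1.8 L/kWh) are hypothetical, chosen only to show the arithmetic:

```python
# Convert a WUE claim into gallons of cooling water per day.
# Hypothetical inputs: 50 MW of IT load, WUE of 1.8 L/kWh, running 24/7.

LITERS_PER_GALLON = 3.785

def daily_water_gallons(it_megawatts, wue_l_per_kwh):
    it_kwh_per_day = it_megawatts * 1_000 * 24  # MW -> kW -> kWh per day
    liters = it_kwh_per_day * wue_l_per_kwh
    return liters / LITERS_PER_GALLON

print(round(daily_water_gallons(50, 1.8)))  # gallons per day
```

Under these assumptions the draw lands in the mid hundreds of thousands of gallons per day, which is the kind of number worth checking against the facility's actual water permit.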