Module 4: AI Impact Assessment

Environmental Considerations


Environmental Considerations in AI

Introduction to AI's Environmental Impact

As AI systems become more prevalent and powerful, their environmental footprint has emerged as a critical concern. While AI can contribute to environmental solutions (climate modeling, renewable energy optimization, conservation), AI development and deployment themselves consume significant resources and generate environmental impacts.

Environmental considerations in AI impact assessment address:

  • Energy Consumption: Electricity used for training and inference
  • Carbon Emissions: Greenhouse gases from energy consumption
  • Hardware Lifecycle: Manufacturing, use, and disposal of computing equipment
  • Water Usage: Cooling requirements for data centers
  • Resource Extraction: Raw materials for hardware production
  • E-Waste: Electronic waste from obsolete equipment
  • Indirect Effects: Rebound effects and economic impacts

This lesson provides comprehensive guidance on assessing and mitigating AI's environmental impacts, aligned with ISO 42001's holistic approach to responsible AI.


Carbon Footprint of AI Systems

Understanding AI's Carbon Impact

The carbon footprint of AI systems comprises:

1. Operational Emissions (Direct)

Energy consumed during AI operation:

| Phase | Activity | Energy Intensity | Duration |
|---|---|---|---|
| Training | Initial model training | Very High (100–1,000+ MWh for large models) | Days to months |
| Fine-Tuning | Adapting pre-trained models | Medium (1–100 MWh) | Hours to days |
| Inference | Making predictions | Low per query, but continuous | Ongoing |
| Data Processing | Data collection, cleaning, storage | Medium | Ongoing |
| Infrastructure | Data centers, networking, cooling | Medium | Continuous |

2. Embodied Emissions (Indirect)

Carbon embedded in hardware manufacturing:

  • Silicon wafer production
  • Chip fabrication
  • Server assembly
  • Transportation
  • Infrastructure construction

3. End-of-Life Emissions

Disposal and recycling of equipment:

  • E-waste processing
  • Material recovery
  • Disposal of non-recyclable components

Carbon Intensity Variation

Carbon emissions depend heavily on energy source:

| Energy Source | CO₂ per kWh | Relative Impact |
|---|---|---|
| Coal | 820 g | Highest (100%) |
| Natural Gas | 490 g | High (60%) |
| Grid Average (Global) | 475 g | High (58%) |
| Grid Average (US) | 417 g | Medium-High (51%) |
| Grid Average (EU) | 275 g | Medium (34%) |
| Solar | 48 g | Low (6%) |
| Wind | 11 g | Very Low (1.3%) |
| Hydroelectric | 24 g | Very Low (3%) |
| Nuclear | 12 g | Very Low (1.5%) |

Implication: The same AI workload can have a roughly 75x different carbon impact depending on the energy source (coal vs. wind).

Measuring AI Carbon Footprint

Carbon Footprint Formula:

Total CO₂ = (Energy Consumed × Grid Carbon Intensity) + Embodied Carbon

Where:
- Energy Consumed = Training Energy + Inference Energy + Infrastructure Energy
- Grid Carbon Intensity = gCO₂/kWh of local electricity grid
- Embodied Carbon = Manufacturing emissions of hardware
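The formula above can be sketched as a small helper function. This is an illustrative implementation (the function name and units are our own); the inputs match the training example that follows.

```python
def total_co2_kg(energy_kwh, grid_g_per_kwh, embodied_kg=0.0):
    """Total CO2 footprint in kg: operational emissions plus embodied carbon.

    energy_kwh      -- energy consumed (training + inference + infrastructure)
    grid_g_per_kwh  -- grid carbon intensity in gCO2/kWh
    embodied_kg     -- amortized manufacturing emissions in kg CO2
    """
    operational_kg = energy_kwh * grid_g_per_kwh / 1000  # grams -> kg
    return operational_kg + embodied_kg
```

For example, `total_co2_kg(294_912, 417, 153_600)` reproduces the roughly 277-ton training footprint worked through below.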

Detailed Calculation Example:

Large Language Model Training

Hardware: 1024 GPUs (NVIDIA A100)
Training Duration: 30 days
Power per GPU: 400W
Total Power: 1024 × 400W = 409.6 kW

Training Energy:
409.6 kW × 24 hours × 30 days = 294,912 kWh

Location: US Data Center (417 gCO₂/kWh)

Operational Emissions:
294,912 kWh × 417 gCO₂/kWh = 122,978,304 g CO₂ = 123 metric tons CO₂

Embodied Emissions (GPUs):
1024 GPUs × 150 kg CO₂e per GPU = 153,600 kg = 154 metric tons CO₂

Total Training Carbon Footprint: 277 metric tons CO₂

Equivalents:
- 62 passenger vehicles driven for one year
- 135 round-trip flights NYC to London
- 31,000 gallons of gasoline consumed

Inference Carbon Footprint:

Deployed Model Serving

Requests per day: 10 million
Energy per request: 0.002 kWh
Daily energy: 10M × 0.002 = 20,000 kWh
Annual energy: 20,000 × 365 = 7,300,000 kWh

Location: EU Data Center (275 gCO₂/kWh)

Annual Operational Emissions:
7,300,000 kWh × 275 gCO₂/kWh = 2,007,500,000 g = 2,008 metric tons CO₂/year

Note: Over 5-year model lifetime, inference emissions (10,040 tons)
significantly exceed training emissions (277 tons).
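The lifetime comparison in the note above can be checked with a few lines (figures taken from this lesson's examples; the 5-year lifetime is an assumption):

```python
# Lifetime comparison of training vs. inference emissions
training_tons = 277              # one-time training footprint (metric tons CO2)
inference_tons_per_year = 2008   # annual inference footprint
lifetime_years = 5               # assumed model service life

lifetime_inference_tons = inference_tons_per_year * lifetime_years
inference_to_training_ratio = lifetime_inference_tons / training_tons
```

Over five years, inference emits roughly 36 times more CO₂ than the one-time training run.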

Carbon Impact of Different AI Approaches

Comparative Carbon Footprints:

| Model Type | Training CO₂ (tons) | Use Case | Efficiency |
|---|---|---|---|
| Small BERT | 0.03 | Text classification | Very efficient |
| GPT-3 (175B) | 552 | Large language model | High impact |
| BLOOM (176B) | 25 | Open multilingual LLM | Medium (renewable energy) |
| PaLM (540B) | 4,000+ | Massive language model | Very high impact |
| Computer Vision (ResNet) | 0.1 | Image classification | Efficient |
| AlphaGo | 96 | Game playing | High impact for RL |
| Recommendation System | 1–10 | E-commerce | Varies widely |

Key Insight: Larger models have dramatically higher training costs, but may be more efficient per query if serving many users.


Energy Consumption Analysis

Data Center Energy Breakdown

Typical Data Center Energy Use:

| Component | % of Total Energy | Optimization Potential |
|---|---|---|
| IT Equipment | 40-50% | Medium (hardware efficiency) |
| Cooling | 35-45% | High (cooling optimization) |
| Power Distribution | 5-10% | Low (infrastructure design) |
| Lighting & Other | 2-5% | Low (LED, automation) |

Power Usage Effectiveness (PUE):

PUE = Total Facility Energy / IT Equipment Energy

PUE 3.0 = Poor (typical older facilities)
PUE 2.0 = Average (industry baseline)
PUE 1.5 = Good (well-designed facilities)
PUE 1.2 = Excellent (state-of-the-art)
PUE 1.1 = World-class (best practices + renewable cooling)

Example Impact:
For 1 MW of IT load:
- PUE 2.0 → Total consumption = 2 MW (1 MW wasted)
- PUE 1.2 → Total consumption = 1.2 MW (0.2 MW wasted)
Savings: 0.8 MW = 40% energy reduction
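The PUE formula and the example above translate directly into code. A minimal sketch (helper names are illustrative):

```python
def pue(total_facility_kw, it_kw):
    """Power Usage Effectiveness: total facility power over IT power."""
    return total_facility_kw / it_kw

def overhead_kw(it_kw, pue_value):
    """Non-IT power (cooling, distribution, lighting) implied by a given PUE."""
    return it_kw * (pue_value - 1)
```

For 1 MW of IT load, `overhead_kw(1000, 2.0)` gives 1,000 kW wasted, while `overhead_kw(1000, 1.2)` gives only 200 kW.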

Energy Efficiency Metrics

Training Efficiency:

| Metric | Description | Formula | Target |
|---|---|---|---|
| FLOPs per kWh | Computational work per energy | Total FLOPs / Total kWh | Higher is better |
| Accuracy per kWh | Model quality per energy | Accuracy / Training kWh | Higher is better |
| Time to Accuracy | Speed to target performance | Hours to reach X% accuracy | Lower is better |

Inference Efficiency:

| Metric | Description | Formula | Target |
|---|---|---|---|
| Queries per kWh | Throughput per energy | Total queries / Total kWh | Higher is better |
| Latency per Watt | Response time to power ratio | Latency (ms) / Power (W) | Lower is better |
| Energy per Token | Energy for language model outputs | kWh / tokens generated | Lower is better |

Energy Optimization Strategies

1. Model Architecture Optimization

Efficient Architectures:

| Approach | Description | Energy Savings | Trade-offs |
|---|---|---|---|
| Knowledge Distillation | Train smaller model to mimic larger one | 50-90% | Slight accuracy loss |
| Pruning | Remove unnecessary model parameters | 30-70% | Careful tuning needed |
| Quantization | Reduce numerical precision (e.g., INT8 vs FP32) | 40-75% | Minimal accuracy impact |
| Neural Architecture Search | Automatically find efficient architectures | 20-60% | High upfront search cost |
| Sparse Models | Only activate subset of parameters | 40-80% | Specialized hardware needed |

Example: Model Distillation

Original Model: BERT-Large
- Parameters: 340M
- Inference latency: 45ms
- Power consumption: 25W
- Accuracy: 94.2%

Distilled Model: DistilBERT
- Parameters: 66M (80% reduction)
- Inference latency: 15ms (67% reduction)
- Power consumption: 8W (68% reduction)
- Accuracy: 92.8% (1.4% loss)

Energy Savings for 1M daily queries:
Original: 1M × 25W × 0.045s = 312.5 Wh/day
Distilled: 1M × 8W × 0.015s = 33.3 Wh/day
Reduction: 89% energy savings
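The energy comparison above multiplies power by time-on-task per query. A sketch of that arithmetic (function name is our own):

```python
def daily_energy_wh(queries_per_day, power_w, latency_s):
    """Daily serving energy in Wh: queries x power x compute time per query."""
    joules = queries_per_day * power_w * latency_s  # W x s = J
    return joules / 3600                            # J -> Wh
```

With the figures above, the original model uses 312.5 Wh/day and the distilled model 33.3 Wh/day, an ~89% reduction.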

2. Hardware Selection

Hardware Efficiency Comparison:

| Hardware | Use Case | Performance/Watt | Cost | Best For |
|---|---|---|---|---|
| CPU | General compute | 1x (baseline) | Low | Small models, diverse workloads |
| GPU | Parallel training | 10-50x | Medium | Large model training |
| TPU | Google AI workloads | 30-80x | Medium (cloud) | TensorFlow models at scale |
| AI Accelerators | Specialized inference | 50-100x | High | Production inference |
| FPGAs | Customizable | 20-40x | Very High | Specialized applications |
| Neuromorphic Chips | Brain-inspired | 100-1000x (future) | Experimental | Low-power edge AI |

3. Data Center Efficiency

Cooling Optimization:

  • Free Cooling: Use outside air when temperatures permit (20-40% savings)
  • Liquid Cooling: Direct liquid cooling for high-density racks (30% savings)
  • Hot Aisle/Cold Aisle: Structured airflow management (15-25% savings)
  • Intelligent Temperature: Raise cold aisle temperature from 18°C to 27°C (roughly 4% cooling energy savings per 1°C increase)
  • Economizers: Outside air for cooling (30-70% cooling energy savings)

Renewable Energy Sourcing:

Carbon Reduction through Renewables:

Data Center Annual Consumption: 10,000 MWh

Scenario 1: Grid Mix (400 gCO₂/kWh)
Emissions: 10,000 MWh × 400 kg/MWh = 4,000 tons CO₂

Scenario 2: 50% Renewable PPA
Emissions: 5,000 MWh × 400 kg/MWh = 2,000 tons CO₂
Reduction: 50%

Scenario 3: 100% Renewable
Emissions: 10,000 MWh × 20 kg/MWh = 200 tons CO₂
Reduction: 95%

4. Workload Optimization

Carbon-Aware Computing:

Schedule workloads based on grid carbon intensity:

| Time Period | Grid Carbon Intensity | Optimal Workloads |
|---|---|---|
| Morning (6-10am) | High (peak demand) | Only critical inference |
| Midday (10am-2pm) | Low (solar peak) | Training jobs, batch processing |
| Afternoon (2-6pm) | Medium | Standard operations |
| Evening (6-10pm) | Very High (peak) | Minimal non-critical work |
| Night (10pm-6am) | Low (off-peak) | Training jobs, data processing |

Benefit Example:

Training Job: 1000 kWh required

Evening Peak (600 gCO₂/kWh): 600 kg CO₂
Night Off-Peak (300 gCO₂/kWh): 300 kg CO₂
Savings: 50% carbon reduction by shifting timing
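A carbon-aware scheduler boils down to choosing the lowest-intensity window before dispatching a job. A minimal sketch (window names and helper functions are illustrative):

```python
def pick_window(intensity_g_per_kwh):
    """Choose the lowest-carbon time window from a {name: gCO2/kWh} map."""
    return min(intensity_g_per_kwh, key=intensity_g_per_kwh.get)

def job_carbon_kg(energy_kwh, g_per_kwh):
    """Carbon footprint of a job in kg CO2."""
    return energy_kwh * g_per_kwh / 1000

windows = {"evening_peak": 600, "night_offpeak": 300}
best = pick_window(windows)  # "night_offpeak"
```

Running the 1,000 kWh job above at 300 gCO₂/kWh instead of 600 gCO₂/kWh halves its footprint.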

5. Geographic Optimization

Data Center Location Impact:

| Region | Grid Carbon Intensity | Renewable % | Cooling Climate | Overall Rating |
|---|---|---|---|---|
| Iceland | 25 gCO₂/kWh | 100% (hydro/geothermal) | Excellent | ⭐⭐⭐⭐⭐ |
| Norway | 18 gCO₂/kWh | 98% (hydro) | Excellent | ⭐⭐⭐⭐⭐ |
| Quebec, Canada | 30 gCO₂/kWh | 99% (hydro) | Good | ⭐⭐⭐⭐⭐ |
| France | 85 gCO₂/kWh | 75% (nuclear) | Good | ⭐⭐⭐⭐ |
| US Pacific Northwest | 200 gCO₂/kWh | 65% (hydro) | Good | ⭐⭐⭐⭐ |
| Germany | 350 gCO₂/kWh | 45% (mixed renewable) | Moderate | ⭐⭐⭐ |
| US Midwest | 550 gCO₂/kWh | 20% | Moderate | ⭐⭐ |
| China | 600 gCO₂/kWh | 28% | Varies | ⭐⭐ |
| India | 700 gCO₂/kWh | 20% | Poor (hot) | ⭐ |

Location Impact Example:

Same AI training job (1,000 kWh):

  • India: 700 kg CO₂
  • Iceland: 25 kg CO₂
  • Reduction: 96% by choosing low-carbon location

Hardware Lifecycle and E-Waste

Embodied Carbon in Hardware

Manufacturing Emissions:

| Component | Embodied CO₂ | Useful Life | Annual Amortized CO₂ |
|---|---|---|---|
| Server | 1,200 kg | 5 years | 240 kg/year |
| GPU (High-end) | 150 kg | 4 years | 37.5 kg/year |
| Storage (1TB SSD) | 25 kg | 5 years | 5 kg/year |
| Networking Equipment | 300 kg | 7 years | 43 kg/year |
| Cooling Infrastructure | 5,000 kg | 15 years | 333 kg/year |

Total Embodied Impact:

AI Training Cluster Example:
- 100 Servers: 120,000 kg CO₂
- 400 GPUs: 60,000 kg CO₂
- 500 TB Storage: 12,500 kg CO₂
- Network Infrastructure: 30,000 kg CO₂
- Cooling Systems: 50,000 kg CO₂

Total Embodied: 272,500 kg = 273 tons CO₂

If used for 5 years:
Amortized: 54.5 tons CO₂/year

For comparison:
If operational emissions = 200 tons/year
Total: 254.5 tons/year (21% from embodied carbon)
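Amortizing embodied carbon over the hardware's useful life, as done above, is straight division. A sketch (function name is our own):

```python
def amortized_kg_per_year(embodied_kg, lifespan_years):
    """Spread one-time manufacturing emissions over the hardware lifetime."""
    return embodied_kg / lifespan_years

# Cluster example from above: 272,500 kg embodied, 5-year life
annual_embodied = amortized_kg_per_year(272_500, 5)      # 54,500 kg/year
embodied_share = annual_embodied / (annual_embodied + 200_000)  # vs. operational
```

At 200 tons/year of operational emissions, embodied carbon is about 21% of the annual total.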

E-Waste Challenge

Global E-Waste from AI/Data Centers:

  • Current: ~2-3 million tons/year of server equipment waste
  • Growth Rate: 15-20% annually, outpacing overall e-waste growth
  • Recycling Rate: Only ~17% globally; roughly 80% ends up in landfills or informal processing
  • Toxic Materials: Lead, mercury, cadmium, brominated flame retardants
  • Valuable Materials Lost: Gold, silver, copper, rare earth elements

AI Hardware Lifecycle:

Manufacturing → Deployment → Use → Upgrade → Disposal
   ↑                                             ↓
   └──────────── Recycling/Refurbishment ←──────┘
                    (17% currently)

Improvement Goal: 70%+ circular economy

Sustainable Hardware Practices

1. Extend Hardware Lifespan

| Strategy | Impact | Implementation |
|---|---|---|
| Modular Design | Replace components vs. entire servers | Design for upgradability |
| Proper Maintenance | 20-30% lifespan extension | Regular cleaning, monitoring |
| Software Optimization | Avoid unnecessary upgrades | Efficient algorithms |
| Cascading Deployment | Reuse for less intensive tasks | Training → Inference → Development |

Example Cascading:

Year 1-2: Flagship training cluster
Year 3-4: Production inference serving
Year 5-6: Development and testing environment
Year 7: Donated to educational institutions
Year 8+: Recycling

Effective lifespan: 7 years vs. 4 years
Waste reduction: 43%
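The waste-reduction figure above follows from comparing hardware churn rates. A one-line sketch (function name is illustrative):

```python
def waste_reduction(baseline_years, extended_years):
    """Fractional reduction in hardware churn from extending lifespan."""
    return 1 - baseline_years / extended_years

# Cascading example: 4-year baseline vs. 7-year effective lifespan
reduction = waste_reduction(4, 7)  # ~0.43, i.e. 43% less e-waste
```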

2. Responsible Recycling

Recycling Best Practices:

  • Certified Recyclers: Use e-Stewards or R2 certified facilities
  • Data Sanitization: Secure data destruction before recycling
  • Material Recovery: Extract and reuse valuable materials
  • Toxic Handling: Proper treatment of hazardous components
  • Transparency: Track recycling and material recovery rates

Material Recovery Potential:

| Material | % in Electronics | Recovery Value | Environmental Benefit |
|---|---|---|---|
| Copper | 20% | High | Reduces mining impact |
| Aluminum | 8% | High | 95% energy savings vs. virgin |
| Gold | 0.03% | Very High | Concentrations higher than ore |
| Silver | 0.1% | High | Valuable and recyclable |
| Rare Earths | 1-2% | Very High | Critical material security |
| Plastics | 15-20% | Medium | Reduces petroleum use |

3. Circular Economy Approaches

Hardware-as-a-Service:

Instead of purchasing, lease hardware:

  • Supplier maintains ownership and responsibility
  • Incentivizes durable design and maintenance
  • Ensures proper recycling at end-of-life
  • Reduces waste through professional refurbishment

Refurbishment and Resale:

  • Certified refurbishment extends life 3-5 years
  • Makes technology accessible to smaller organizations
  • Reduces new hardware demand
  • Creates local jobs in refurbishment sector

Water Consumption

Water Use in AI Infrastructure

Data Center Water Consumption:

| Cooling Method | Water Use (L/kWh) | Water Type | Sustainability |
|---|---|---|---|
| Evaporative Cooling | 1.8 - 4.0 | Fresh water | High impact in water-scarce regions |
| Water-Cooled Chillers | 1.0 - 2.5 | Fresh water (recirculated) | Medium impact |
| Adiabatic Cooling | 0.5 - 1.5 | Fresh water | Lower impact |
| Air Cooling | 0 - 0.2 | None (or minimal) | Minimal water impact |
| Liquid Immersion | 0 - 0.5 | Non-water coolant | No freshwater use |

Water Footprint Example:

Large AI Data Center:
- Power consumption: 50 MW
- Annual energy: 438,000 MWh
- Cooling method: Evaporative
- Water use: 2 L/kWh

Annual water consumption:
438,000,000 kWh × 2 L/kWh = 876,000,000 L = 876 million liters

Equivalent to:
- 350 Olympic swimming pools
- Annual consumption of 4,900 US households
- Significant impact in water-stressed regions
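The annual water figure above is power draw times hours times water intensity. A sketch of the calculation (function name and the 8,760-hour year are our own conventions):

```python
def annual_water_liters(power_mw, liters_per_kwh, hours_per_year=8760):
    """Annual cooling water use from average power draw and water intensity."""
    energy_kwh = power_mw * 1000 * hours_per_year  # MW -> kW, then kWh
    return energy_kwh * liters_per_kwh

# 50 MW facility with evaporative cooling at 2 L/kWh
water = annual_water_liters(50, 2)  # 876,000,000 L
```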

Water Scarcity Considerations

Data Center Location vs. Water Stress:

| Location | Water Stress Level | Data Center Density | Conflict Level |
|---|---|---|---|
| Phoenix, Arizona | Extremely High | High | ⚠️ Critical |
| Singapore | High | Very High | ⚠️ High |
| Netherlands | Low | High | ✓ Acceptable |
| Ireland | Low | High | ✓ Acceptable |
| Scandinavia | Very Low | Medium | ✓ Ideal |

Mitigation Strategies:

  1. Use Alternative Cooling: Air cooling in cooler climates
  2. Reclaimed Water: Use treated wastewater for cooling
  3. Closed-Loop Systems: Recirculate cooling water
  4. Location Selection: Avoid water-stressed regions
  5. Seasonal Adaptation: Reduce cooling water in winter

Environmental Impact Assessment Framework

Assessment Methodology

Phase 1: Baseline Measurement

1. Energy Inventory:

| Source | Measurement Method | Data Points |
|---|---|---|
| Training | GPU/TPU power monitoring | kWh per training run |
| Inference | Server power meters | kWh per 1,000 queries |
| Storage | Storage system monitoring | kWh per TB per month |
| Networking | Network equipment meters | kWh per GB transferred |
| Cooling | Facility monitoring (PUE) | Total facility kWh |

2. Carbon Calculation:

Component-by-Component:

Training:
- Hardware: 256 GPUs
- Training time: 72 hours
- Power per GPU: 400W
- Total energy: 256 × 0.4 kW × 72 h = 7,373 kWh
- Grid carbon: 450 gCO₂/kWh
- Training carbon: 3,318 kg CO₂

Inference (Annual):
- Queries: 100M/year
- Energy per query: 0.001 kWh
- Total energy: 100,000 kWh
- Grid carbon: 450 gCO₂/kWh
- Inference carbon: 45,000 kg CO₂/year

Embodied:
- Hardware: 50,000 kg CO₂ amortized over 5 years
- Annual embodied: 10,000 kg CO₂/year

Total Annual Carbon:
Training (annual): 3,318 kg (one-time, amortized: 664 kg/year over 5-year model life)
Inference: 45,000 kg/year
Embodied: 10,000 kg/year
Total: 55,664 kg CO₂/year = 56 tons/year
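The annual total above combines amortized training, ongoing inference, and amortized embodied carbon. A sketch of the aggregation (function name is our own):

```python
def annual_footprint_kg(training_kg, model_life_years,
                        inference_kg_per_year, embodied_kg_per_year):
    """Annualized carbon footprint: amortized training + inference + embodied."""
    amortized_training = training_kg / model_life_years
    return amortized_training + inference_kg_per_year + embodied_kg_per_year
```

With the figures above, `annual_footprint_kg(3318, 5, 45000, 10000)` reproduces the ~55,664 kg/year total.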

3. Resource Assessment:

  • Water consumption (liters/year)
  • Hardware units (servers, GPUs, storage)
  • Expected hardware lifespan
  • E-waste generation rate (tons/year)

Phase 2: Impact Evaluation

Environmental Impact Scoring:

| Factor | Score (1-5) | Weight | Weighted Score |
|---|---|---|---|
| Carbon Emissions | 4 (High) | 40% | 1.6 |
| Energy Efficiency | 2 (Poor) | 25% | 0.5 |
| Renewable Energy % | 3 (Medium) | 15% | 0.45 |
| Hardware Efficiency | 3 (Medium) | 10% | 0.3 |
| E-Waste Management | 2 (Poor) | 5% | 0.1 |
| Water Consumption | 3 (Medium) | 5% | 0.15 |
| **Total** | | 100% | 3.1/5 |
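Computing the weighted score is a simple sum of score-times-weight products. A sketch (the factor list mirrors the scoring example above):

```python
def weighted_score(factors):
    """Overall score from (score, weight) pairs; weights should sum to 1.0."""
    return sum(score * weight for score, weight in factors)

factors = [
    (4, 0.40),  # Carbon Emissions
    (2, 0.25),  # Energy Efficiency
    (3, 0.15),  # Renewable Energy %
    (3, 0.10),  # Hardware Efficiency
    (2, 0.05),  # E-Waste Management
    (3, 0.05),  # Water Consumption
]
overall = weighted_score(factors)  # 3.1
```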

Rating Scale:

  • 4.0-5.0: Excellent environmental performance
  • 3.0-3.9: Good, room for improvement
  • 2.0-2.9: Poor, significant improvements needed
  • 1.0-1.9: Critical, immediate action required

Phase 3: Mitigation Planning

Mitigation Priority Matrix:

High Impact, Quick Wins:
1. Shift to renewable energy (95% carbon reduction)
2. Implement model compression (60% inference energy reduction)
3. Carbon-aware scheduling (30% carbon reduction)

High Impact, Longer Term:
4. Migrate to low-carbon region (80% carbon reduction)
5. Upgrade to efficient hardware (40% energy reduction)
6. Implement hardware reuse program (30% e-waste reduction)

Lower Impact, Quick Wins:
7. Optimize data center cooling (15% energy reduction)
8. Implement responsible recycling (100% e-waste properly handled)

Lower Impact, Longer Term:
9. Develop circular economy partnerships
10. Invest in carbon offset projects (last resort)

Phase 4: Implementation and Monitoring

Environmental KPIs:

| Metric | Baseline | Target (1 year) | Target (3 years) | Monitoring Frequency |
|---|---|---|---|---|
| Carbon Intensity (gCO₂/query) | 450 | 200 | 50 | Monthly |
| Renewable Energy % | 30% | 60% | 100% | Quarterly |
| PUE | 1.8 | 1.4 | 1.2 | Monthly |
| Hardware Lifespan | 4 years | 5 years | 6 years | Annual |
| E-Waste Recycling % | 20% | 60% | 85% | Quarterly |
| Water Intensity (L/kWh) | 2.5 | 1.5 | 0.5 | Quarterly |

Green AI Best Practices

1. Design for Efficiency

Efficient-First Development:

  • Start Small: Begin with smallest model that meets requirements
  • Incremental Scaling: Only scale up if necessary for performance
  • Efficiency Benchmarking: Compare energy/carbon against baselines
  • Transfer Learning: Reuse pre-trained models rather than training from scratch

Example Comparison:

Option A: Train Custom Large Model
- Training time: 30 days
- Training cost: $500,000
- Carbon: 300 tons CO₂
- Accuracy: 94.2%

Option B: Fine-tune Pre-trained Model
- Training time: 2 days
- Training cost: $15,000
- Carbon: 5 tons CO₂
- Accuracy: 93.8%

Decision: Option B delivers 99.6% of performance with 98% less carbon.
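The decision rule above can be made explicit with two ratios (figures from the comparison; the dictionary layout is our own):

```python
# Compare custom training vs. fine-tuning a pre-trained model
custom = {"carbon_tons": 300, "accuracy": 94.2}
finetune = {"carbon_tons": 5, "accuracy": 93.8}

performance_retained = finetune["accuracy"] / custom["accuracy"]   # ~0.996
carbon_avoided = 1 - finetune["carbon_tons"] / custom["carbon_tons"]  # ~0.98
```

Fine-tuning retains 99.6% of the accuracy while avoiding roughly 98% of the carbon.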

2. Measure and Report

Carbon Accounting:

Include carbon metrics in standard reporting:

  • Carbon footprint in model cards
  • Energy consumption in benchmarks
  • Sustainability section in technical papers
  • Public commitments and progress tracking

Model Card Sustainability Section:

## Environmental Impact

**Training**
- Hardware: 64 NVIDIA A100 GPUs
- Training Duration: 5 days
- Energy Consumption: 15,360 kWh
- Carbon Emissions: 0.4 tons CO₂e (Iceland, renewable grid at ~25 gCO₂/kWh)

**Inference**
- Energy per 1000 queries: 0.5 kWh
- Carbon per 1000 queries: 0.2 kg CO₂e
- Estimated annual inference emissions: 20 tons CO₂e (at 100M queries)

**Mitigation Measures**
- Model quantization applied (40% energy reduction)
- Hosted on 100% renewable energy
- Hardware recycling program in place

**Tools Used**
- Carbon tracking: CodeCarbon
- Energy monitoring: ML CO₂ Impact

3. Optimize Continuously

Ongoing Optimization:

| Frequency | Activity | Expected Benefit |
|---|---|---|
| Each Training Run | Hyperparameter tuning for efficiency | 10-30% energy savings |
| Monthly | Review inference efficiency | 5-15% improvements |
| Quarterly | Model compression updates | 20-40% cumulative savings |
| Annually | Hardware refresh planning | 30-50% efficiency gains |
| Continuous | Carbon-aware scheduling | 20-50% carbon reduction |

4. Collaborate and Share

Industry Collaboration:

  • Share energy-efficient model architectures
  • Publish optimization techniques
  • Contribute to open-source efficiency tools
  • Participate in green AI research

Resource Sharing:

  • Publish pre-trained models to avoid redundant training
  • Share datasets to reduce collection overhead
  • Collaborate on infrastructure to improve utilization
  • Pool knowledge on best practices

Carbon Offset and Beyond

When to Consider Carbon Offsets

Hierarchy of Climate Action:

1. Reduce Emissions (Priority)
   ↓
2. Shift to Renewable Energy
   ↓
3. Improve Efficiency
   ↓
4. Remove Residual Emissions
   ↓
5. Offset Remaining Emissions (Last Resort)

Appropriate Use of Offsets:

Good Uses:

  • Residual emissions after all reduction efforts
  • Historical emissions from past AI work
  • Unavoidable emissions during transition period

Poor Uses:

  • Substitute for efficiency improvements
  • Instead of switching to renewable energy
  • To claim "carbon neutral" without reductions
  • To avoid making harder changes

Quality Carbon Offset Criteria

High-Quality Offset Projects:

| Criterion | Description | Verification |
|---|---|---|
| Additionality | Would not happen without offset funding | Independent verification |
| Permanence | Carbon storage is long-term | Monitoring and guarantees |
| Verification | Independently certified | Third-party audits |
| No Leakage | Doesn't increase emissions elsewhere | Life cycle analysis |
| Co-Benefits | Provides social/environmental benefits | Community validation |

Preferred Project Types:

  1. Renewable Energy: Wind, solar installations in developing regions
  2. Reforestation: Native species, community-managed forests
  3. Direct Air Capture: Technology-based CO₂ removal (emerging)
  4. Soil Carbon: Regenerative agriculture, improved farming practices

Avoid: Industrial gas credits, questionable forest preservation projects, unverified programs

Beyond Carbon: Holistic Environmental Responsibility

Broader Environmental Commitments:

  1. Biodiversity: Locate data centers to minimize habitat impact
  2. Circular Economy: Design for reuse, repair, and recycling
  3. Water Stewardship: Responsible water use, especially in water-stressed regions
  4. Toxic Materials: Minimize use of hazardous substances
  5. Community Impact: Support local environmental initiatives
  6. Supply Chain: Work with environmentally responsible suppliers
  7. Transparency: Public reporting on all environmental metrics

Environmental Assessment Template

AI System Environmental Profile

System Information:

  • System Name: _______________
  • Purpose: _______________
  • Deployment Scale: _______________

Energy Consumption:

| Phase | Energy (kWh) | Frequency | Annual Total (kWh) |
|---|---|---|---|
| Training | | One-time | |
| Inference | | Per query × volume | |
| Data Processing | | Continuous | |
| Storage | | Continuous | |
| **Total** | | | |

Carbon Footprint:

| Source | Amount | Unit |
|---|---|---|
| Operational Emissions | | tons CO₂/year |
| Embodied Emissions | | tons CO₂/year |
| Total Carbon Footprint | | tons CO₂/year |

Resource Consumption:

  • Hardware: _____ servers, _____ GPUs, _____ TB storage
  • Expected Lifespan: _____ years
  • Water Consumption: _____ liters/year
  • E-Waste Generation: _____ kg/year

Environmental Impact Rating: _____ / 5

Mitigation Measures:




Monitoring Plan:

  • Carbon tracking tool: _______________
  • Reporting frequency: _______________
  • Review schedule: _______________

Key Takeaways

  1. AI has significant environmental impact through energy consumption, carbon emissions, hardware manufacturing, and e-waste

  2. Carbon footprint varies dramatically based on energy source, with 75x difference between coal and renewable grids

  3. Inference often exceeds training emissions over system lifetime, requiring attention to operational efficiency

  4. Multiple optimization strategies exist: model compression, efficient hardware, renewable energy, carbon-aware scheduling

  5. Hardware lifecycle matters: embodied carbon and e-waste are substantial components of environmental impact

  6. Measurement is essential: track energy consumption, carbon emissions, and resource use systematically

  7. Location choices are critical: grid carbon intensity and cooling climate significantly affect environmental impact

  8. Carbon offsets are last resort: prioritize reduction, renewable energy, and efficiency before offsetting


Next Steps

Proceed to Lesson 4.5: AI Impact Assessment Template for a comprehensive template that integrates environmental considerations with rights and societal impact analysis.


Environmental sustainability is an essential component of responsible AI, requiring systematic assessment and ongoing optimization.
