"""
    ObservationProcessing

Comprehensive observation processing system for GSI data assimilation. This module provides
a complete suite of tools for handling, processing, and optimizing atmospheric observations
used in variational and ensemble data assimilation methods.

# Core Components

## Observation Types and Management
- **ObservationTypes**: Comprehensive type system for all atmospheric observations
- **DataFormats**: Support for various observation data formats (BUFR, NetCDF, HDF5)
- **ProcessingPipeline**: Configurable processing workflows

## Quality Control and Validation
- **QualityControl**: Multi-level quality control algorithms
- **SpatialProcessing**: Spatial thinning and super-observation creation
- **BiasCorrection**: Observation bias detection and correction

## Forward Modeling and Operators
- **ForwardOperators**: Observation operator implementations
- **CRTMInterface**: Community Radiative Transfer Model integration

# Mathematical Framework

The observation processing system implements algorithms for:

## Quality Control
Multi-stage quality control including gross error checks, spatial consistency validation,
background departure analysis, and variational quality control.
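
As an illustrative sketch of the background-departure stage (the module's `background_check`
may differ in detail), an observation is rejected when its departure from the background
equivalent exceeds a multiple of the combined error standard deviation:

```julia
# Reject y when |y - H(x_b)| > k * sqrt(σo^2 + σb^2), where σo and σb are the
# observation and background error standard deviations; k = 3.0 is an assumed default.
function background_departure_ok(y, hx_b, σo, σb; k = 3.0)
    return abs(y - hx_b) <= k * sqrt(σo^2 + σb^2)
end

background_departure_ok(287.5, 287.1, 0.5, 0.3)   # small departure: accepted
background_departure_ok(295.0, 287.1, 0.5, 0.3)   # large departure: rejected
```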

## Spatial Processing
Advanced spatial optimization algorithms including:
- Observation thinning with multiple algorithms
- Super-observation creation with error propagation
- Spatial distribution optimization
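
For example, the error-propagation step of super-observation creation can be sketched as
inverse-variance weighting (illustrative only; the `ERROR_WEIGHTED` method's details live
in `SpatialProcessing`):

```julia
# Inverse-variance weighted merge: ȳ = Σ(yᵢ/σᵢ²) / Σ(1/σᵢ²), with propagated
# error σ̄ = sqrt(1 / Σ(1/σᵢ²)) — never larger than the best member error.
function superob(values::AbstractVector, errors::AbstractVector)
    w = 1 ./ errors .^ 2
    ȳ = sum(w .* values) / sum(w)
    σ̄ = sqrt(1 / sum(w))
    return ȳ, σ̄
end

y, σ = superob([10.0, 12.0, 11.0], [1.0, 2.0, 1.0])
```

The low-error members dominate the merged value, and the propagated error shrinks as
members are added.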

## Forward Modeling
Observation operators that map model state to observation space:
```
y = H(x) + ε
```

Where:
- `y`: Observations
- `H`: Observation operator  
- `x`: Model state
- `ε`: Observation error
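
For a linear operator, `H` reduces to a (typically sparse) matrix that interpolates the
model state to observation locations. A minimal sketch with made-up interpolation weights
(the real operators are provided by `ForwardOperators`):

```julia
using SparseArrays

x = [280.0, 275.0, 268.0, 255.0, 240.0]   # model state on a 5-level column
# Each of the two observations is a weighted average of two model levels.
H = sparse([1, 1, 2, 2], [1, 2, 3, 4], [0.4, 0.6, 0.5, 0.5], 2, 5)
ε = [0.1, -0.2]                           # observation errors
y = H * x + ε                             # y = H(x) + ε
```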

# Usage

## Basic Processing Pipeline
```julia
using GSICoreAnalysis.ObservationProcessing

# Load observations
observations = load_observations("observation_file.bufr")

# Create processing pipeline
pipeline = create_processing_pipeline([
    :quality_control => QCConfig(gross_error_check=true, background_check=true),
    :spatial_thinning => ThinningConfig(method=SPATIAL_GRID, grid_spacing=(50.0, 50.0)),
    :bias_correction => BiasConfig(method=:variational, update_coefficients=true)
])

# Process observations
processed_obs = process_observations(observations, pipeline)
```

## Advanced Spatial Processing
```julia
# Spatial thinning configuration
thinning_config = ThinningConfig{Float64}(
    method = ADAPTIVE_DENSITY,
    grid_spacing = (100.0, 100.0),
    quality_threshold = 0.7
)

# Create super-observations
super_config = SuperObservationConfig{Float64}(
    method = ERROR_WEIGHTED,
    radius = 25.0,
    min_observations = 3
)

# Apply processing
thinned_obs = spatial_thinning(observations, thinning_config)
super_obs = create_super_observations(thinned_obs, super_config)
```

# Integration with GSI Analysis

The observation processing system seamlessly integrates with GSI analysis components:
- Provides processed observations to variational cost functions
- Supports ensemble data assimilation with optimized observation distribution
- Maintains observation error characteristics for proper weighting

# Performance and Scalability

The system is designed for high-performance processing of large observation datasets:
- Parallel processing support
- Memory-efficient algorithms
- Streaming processing for massive datasets
- GPU acceleration (planned)

# Quality Assurance

Comprehensive validation and testing ensure reliable operation:
- Unit tests for all algorithms
- Integration tests with real observation data
- Performance benchmarks
- Compliance with GSI operational standards
"""
module ObservationProcessing

using LinearAlgebra
using SparseArrays
using StaticArrays
using Dates
using Printf
using Random
using Statistics

# Include all submodules
include("ObservationTypes.jl")
include("DataFormats/DataFormats.jl")
include("QualityControl/QualityControl.jl")
include("SpatialProcessing/SpatialProcessing.jl")
include("BiasCorrection/BiasCorrection.jl")
include("ForwardOperators/ForwardOperators.jl")
include("CRTMInterface/CRTMInterface.jl")
include("ProcessingPipeline.jl")

# Re-export from submodules
using .ObservationTypes
using .DataFormats
using .QualityControl
using .SpatialProcessing
using .BiasCorrection
using .ForwardOperators
using .CRTMInterface
using .ProcessingPipeline

# Export submodules
export ObservationTypes, DataFormats, QualityControl, SpatialProcessing
export BiasCorrection, ForwardOperators, CRTMInterface, ProcessingPipeline

# Export key types from ObservationTypes
export AbstractObservation, ConventionalObservation, SatelliteObservation, SpecializedObservation
export SurfaceObservation, RadiosondeObservation, AircraftObservation, MarineObservation
export RadianceObservation, SatelliteWindObservation, RetrievalObservation
export GPSROObservation, RadarObservation, LightningObservation, AerosolObservation
export ObservationLocation, ObservationMetadata, QualityFlags, QualityMetrics

# Export key types from SpatialProcessing
export ThinningConfig, SuperObservationConfig, SpatialIndex, ProcessingPipeline
export ThinningMethod, SuperObservationMethod, SpatialSearchMethod
export SPATIAL_GRID, QUALITY_PRIORITY, RANDOM_STATISTICAL, SYSTEMATIC_PATTERN
export ADAPTIVE_DENSITY, MULTI_SCALE, TEMPORAL_REGULAR
export SIMPLE_AVERAGE, QUALITY_WEIGHTED, ERROR_WEIGHTED, OPTIMAL_INTERPOLATION
export KD_TREE, QUAD_TREE, GRID_HASH, R_TREE

# Export key functions from SpatialProcessing
export spatial_thinning, temporal_thinning, quality_based_thinning
export create_super_observations, optimize_observation_distribution
export build_spatial_index, spatial_search, range_query
export process_observations, validate_spatial_processing

# Export key functions from QualityControl
export gross_error_check, background_check, buddy_check, variational_qc
export quality_control_pipeline, apply_quality_control
export QCConfig, QCResult, QCStatistics

# Export key functions from BiasCorrection
export detect_bias, correct_bias, update_bias_coefficients
export BiasConfig, BiasModel, BiasCorrection

# Export key functions from DataFormats
export load_observations, save_observations, convert_format
export BUFRReader, NetCDFReader, HDF5Reader

# Export key functions from ForwardOperators
export observation_operator, tangent_linear_operator, adjoint_operator
export ForwardOperator, LinearOperator, NonlinearOperator

# High-level processing functions
"""
    create_processing_pipeline(components::Vector{<:Pair{Symbol}})

Create an observation processing pipeline from an ordered list of `stage_name => config` pairs.

# Arguments
- `components`: Vector of processing stage pairs (stage_name => config)

# Returns
- `ProcessingPipeline`: Configured processing pipeline

# Example
```julia
pipeline = create_processing_pipeline([
    :quality_control => QCConfig(gross_error_check=true),
    :spatial_thinning => ThinningConfig(method=SPATIAL_GRID),
    :bias_correction => BiasConfig(method=:variational)
])
```
"""
function create_processing_pipeline(components::Vector{<:Pair{Symbol}})
    stages = Function[]
    configs = Any[]
    
    for (stage_name, config) in components
        if stage_name == :quality_control
            push!(stages, apply_quality_control)
            push!(configs, config)
        elseif stage_name == :spatial_thinning
            push!(stages, spatial_thinning)
            push!(configs, config)
        elseif stage_name == :super_observations
            push!(stages, create_super_observations)
            push!(configs, config)
        elseif stage_name == :bias_correction
            push!(stages, correct_bias)
            push!(configs, config)
        elseif stage_name == :optimization
            push!(stages, optimize_observation_distribution)
            push!(configs, config)
        else
            @warn "Unknown processing stage: $stage_name"
        end
    end
    
    return ProcessingPipeline{Float64}(
        stages, configs, 
        validation=true, 
        parallel=false
    )
end

"""
    process_observations_comprehensive(observations, pipeline::ProcessingPipeline)

Process observations through the configured pipeline, collecting full diagnostics.

# Arguments
- `observations`: Input observation vector
- `pipeline::ProcessingPipeline`: Processing pipeline configuration

# Returns
- `Vector{AbstractObservation}`: Final processed observations
- `Dict{String,Any}`: Comprehensive processing diagnostics
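
# Example
```julia
observations = load_observations("observation_file.bufr")
pipeline = create_processing_pipeline([
    :quality_control => QCConfig(gross_error_check=true)
])
processed, stats = process_observations_comprehensive(observations, pipeline)
stats["final_observation_count"]   # observations surviving all stages
```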
"""
function process_observations_comprehensive(observations::AbstractVector{<:AbstractObservation{T}},
                                            pipeline::ProcessingPipeline{T}) where T
    
    @info "Starting comprehensive observation processing..."
    start_time = time()
    
    # Run processing pipeline
    processed_obs, pipeline_stats = process_observations(observations, pipeline)
    
    # Calculate comprehensive statistics
    processing_time = time() - start_time
    
    comprehensive_stats = Dict{String,Any}(
        "processing_time_seconds" => processing_time,
        "initial_observation_count" => length(observations),
        "final_observation_count" => length(processed_obs),
        # Fraction of observations retained (1.0 = nothing removed)
        "retention_ratio" => isempty(observations) ? 1.0 :
            length(processed_obs) / length(observations),
        "pipeline_statistics" => pipeline_stats
    )
    
    # Calculate quality statistics
    if !isempty(observations) && !isempty(processed_obs)
        initial_quality = mean([obs.quality_metrics.overall_quality for obs in observations])
        final_quality = mean([obs.quality_metrics.overall_quality for obs in processed_obs])
        
        comprehensive_stats["quality_statistics"] = Dict(
            "initial_average_quality" => initial_quality,
            "final_average_quality" => final_quality,
            "quality_improvement" => final_quality - initial_quality
        )
    end
    
    # Calculate spatial coverage statistics
    if !isempty(processed_obs)
        spatial_coverage = calculate_spatial_coverage(processed_obs)
        comprehensive_stats["spatial_coverage"] = spatial_coverage
    end
    
    @info "Observation processing completed in $(round(processing_time, digits=2)) seconds"
    @info "Processed $(length(observations)) → $(length(processed_obs)) observations"
    
    return processed_obs, comprehensive_stats
end

"""
    validate_observation_processing(original_obs, processed_obs)

Validate observation processing results for quality and consistency.

# Arguments
- `original_obs`: Original observation vector
- `processed_obs`: Processed observation vector

# Returns
- `Dict{String,Any}`: Validation results and recommendations
"""
function validate_observation_processing(original_obs::AbstractVector{<:AbstractObservation{T}},
                                         processed_obs::AbstractVector{<:AbstractObservation{T}}) where T
    
    validation_results = Dict{String,Any}(
        "validation_passed" => true,
        "warnings" => String[],
        "errors" => String[],
        "recommendations" => String[]
    )
    
    # Basic count validation
    if isempty(processed_obs)
        push!(validation_results["errors"], "No observations remain after processing")
        validation_results["validation_passed"] = false
        return validation_results
    end
    
    retention_ratio = isempty(original_obs) ? 1.0 :
        length(processed_obs) / length(original_obs)
    validation_results["retention_ratio"] = retention_ratio
    
    if retention_ratio < 0.01
        push!(validation_results["warnings"], "Extreme reduction: fewer than 1% of observations retained")
        push!(validation_results["recommendations"], "Consider relaxing processing criteria")
    elseif retention_ratio > 0.95
        push!(validation_results["warnings"], "Minimal reduction: more than 95% of observations retained")
        push!(validation_results["recommendations"], "Consider more aggressive processing")
    end
    
    # Quality validation
    if !isempty(original_obs)
        original_quality = mean([obs.quality_metrics.overall_quality for obs in original_obs])
        processed_quality = mean([obs.quality_metrics.overall_quality for obs in processed_obs])
        
        validation_results["quality_change"] = processed_quality - original_quality
        
        if processed_quality < original_quality - 0.1
            push!(validation_results["warnings"], "Significant quality degradation detected")
            push!(validation_results["recommendations"], "Review quality control settings")
        end
    end
    
    # Spatial coverage validation
    spatial_validation = validate_spatial_processing(processed_obs, original_obs)
    validation_results["spatial_validation"] = spatial_validation
    
    if !spatial_validation["valid"]
        validation_results["validation_passed"] = false
        append!(validation_results["errors"], spatial_validation["errors"])
    end
    
    append!(validation_results["warnings"], spatial_validation["warnings"])
    
    return validation_results
end

end # module ObservationProcessing