"""
    GSICoreAnalysis.jl

A Julia implementation of the Gridpoint Statistical Interpolation (GSI) core analysis algorithms,
ported from the GSI/EnKF Fortran codebase. This package provides efficient implementations of
variational data assimilation methods including 3D-Var, 4D-Var, ensemble Kalman filter (EnKF), 
and hybrid ensemble-variational approaches with multi-model support.

# Core Components

## Variational Data Assimilation
- `ControlVectors`: Control vector management and operations
- `CostFunctions`: Cost function evaluation and gradient computation  
- `Minimization`: Iterative optimization algorithms (PCG, BiCG, Lanczos)
- `StateVectors`: State vector transformations and memory management
- `BackgroundError`: Background error covariance modeling
- `ObservationOperators`: Observation operator implementations
- `FourDVar`: Four-dimensional variational data assimilation

## Ensemble Data Assimilation
- `EnKFCore`: Core ensemble Kalman filter algorithms
- `LETKF`: Local Ensemble Transform Kalman Filter implementation
- `CovarianceLocalization`: Covariance localization methods
- `ModelInterface`: Multi-model ensemble I/O interface
- `EnKFDiagnostics`: Comprehensive diagnostics and monitoring

## Supporting Infrastructure
- `GridOperations`: Grid manipulations and transformations
- `DataIO`: Data input/output operations
- `MainDriver`: High-level analysis workflow coordination
- `AdvancedSolvers`: Advanced iterative solvers and algorithms

# Mathematical Framework

The package implements both variational and ensemble data assimilation methods:

## Variational Analysis
```
J(x) = ½(x-xᵇ)ᵀB⁻¹(x-xᵇ) + ½(H(x)-y)ᵀR⁻¹(H(x)-y)
```

## Ensemble Kalman Filter Analysis
```
xᵃ = xᵇ + K(y - H(xᵇ))
K = BₑₙₛHᵀ(HBₑₙₛHᵀ + R)⁻¹
```

## Hybrid Ensemble-Variational
```
J(x) = ½(x-xᵇ)ᵀ(αB + βBₑₙₛ)⁻¹(x-xᵇ) + ½(H(x)-y)ᵀR⁻¹(H(x)-y)
```

Where:
- `x`: Analysis state vector
- `xᵇ`: Background (first-guess) state
- `xᵃ`: Analysis state produced by the Kalman update
- `B`: Static background error covariance matrix
- `Bₑₙₛ`: Ensemble-based background error covariance
- `H`: Observation operator
- `y`: Observations
- `R`: Observation error covariance matrix
- `K`: Kalman gain matrix
- `α`, `β`: Hybrid weighting coefficients
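
For intuition, the Kalman update above can be exercised on a toy two-element state. This is a standalone sketch using only `LinearAlgebra`; the numbers are illustrative and none of this is package API:

```julia
using LinearAlgebra

# Toy 2-element state, single direct observation of the first element.
xb = [1.0, 2.0]              # background state xᵇ
B  = Matrix(1.0I, 2, 2)      # background error covariance
H  = [1.0 0.0]               # observation operator (1×2)
R  = fill(0.5, 1, 1)         # observation error covariance
y  = [2.5]                   # observation

K  = B * H' / (H * B * H' + R)   # Kalman gain (2×1)
xa = xb + K * (y - H * xb)       # analysis xᵃ

# With B = I and R = 0.5, the gain on the observed element is 1/(1 + 0.5) = 2/3,
# so the first element moves two thirds of the way toward the observation,
# while the unobserved second element is untouched (B carries no cross-correlation).
```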

# Usage

## Basic Variational Analysis
```julia
using GSICoreAnalysis

# Initialize variational analysis system
config = AnalysisConfig(
    grid_size = (360, 180, 64),
    analysis_method = "3DVar"
)

# Create control and state vectors
cv = ControlVector(config)
state = StateVector(config)

# Run 3D-Var analysis (`data` carries the background state; see `run_3dvar_analysis`)
data = Dict("background_state" => background_state)
result = run_analysis("3DVar", data, observations, config)
```

## Ensemble Kalman Filter Analysis
```julia
using GSICoreAnalysis

# Initialize EnKF system
config = AnalysisConfig(
    grid_size = (360, 180, 64),
    ensemble_size = 40,
    analysis_method = "EnKF",
    model_type = "GFS"
)

# Run EnKF analysis (the model interface is initialized internally from `config`)
result = run_analysis("EnKF", ensemble_data, observations, config)
```

## Hybrid Ensemble-Variational Analysis
```julia
using GSICoreAnalysis

# Initialize hybrid system
config = AnalysisConfig(
    grid_size = (360, 180, 64),
    ensemble_size = 40,
    analysis_method = "Hybrid",
    hybrid_coeff = 0.75,
    model_type = "GFS"
)

# Run hybrid analysis (control vector and ensemble packaged together)
data = Dict("control_vector" => control_vector, "ensemble_data" => ensemble_data)
result = run_analysis("Hybrid", data, observations, config)
```
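
The hybrid blend `αB + βBₑₙₛ` can be formed directly from an ensemble's sample covariance. A standalone sketch with illustrative numbers (not package API):

```julia
using LinearAlgebra, Statistics

# Hypothetical 5-member ensemble of a 3-element state (columns are members)
members = [1.0 1.2 0.9 1.1 0.8;
           2.0 2.1 1.9 2.2 1.8;
           0.5 0.6 0.4 0.5 0.5]
mean_state = mean(members; dims = 2)
perts = members .- mean_state
B_ens = perts * perts' / (size(members, 2) - 1)   # sample covariance Bₑₙₛ

α, β = 0.25, 0.75                                 # static / ensemble weights
B_static = Matrix(1.0I, 3, 3)
B_hybrid = α * B_static + β * B_ens               # hybrid background error covariance
```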
"""
module GSICoreAnalysis

using LinearAlgebra
using SparseArrays
using StaticArrays
using Distributed
using FFTW
using Printf
using Random
using Dates

# Export main types and functions
export AbstractAnalysisConfig, AnalysisConfig
export AbstractControlVector, ControlVector
export AbstractStateVector, StateVector
export AbstractGrid
export GridDimensions, VariableInfo, AnalysisVariables, default_analysis_variables
export CostFunction, minimize_cost_function

# Export DRP-4DVar functionality
export DRP4DVar, EnsembleProjection, ReducedSpaceCostFunction, DRP4DVariationalSolver
export run_drp4dvar, ensemble_perturbations, project_to_observation_space
export reduced_cost_function, reduced_gradient, optimize_reduced_space

# Export analysis functions (EnKF-specific functionality disabled for now)
export run_analysis
# export AbstractModelInterface, ModelConfig
# export GFSInterface, WRFInterface, FV3Interface, NMMBInterface
# export EnKFAnalysis, LETKF, CovarianceLocalization
# export DiagnosticConfig, InnovationStats, EnsembleSpreadStats
# export run_enkf_analysis, run_hybrid_analysis
# export initialize_model_interface, read_ensemble_member, write_ensemble_member

# Include submodules
include("types.jl")
include("ControlVectors/ControlVectors.jl")
include("CostFunctions/CostFunctions.jl") 
include("Minimization/Minimization.jl")
include("StateVectors/StateVectors.jl")
include("BackgroundError/BackgroundError.jl")
include("GridOperations/GridOperations.jl")
include("DataIO/DataIO.jl") 
include("ObservationOperators/ObservationOperators.jl")
include("MainDriver/MainDriver.jl")
include("AdvancedSolvers/AdvancedSolvers.jl")
include("FourDVar/FourDVar.jl")
include("Diagnostics/Diagnostics.jl")
# Temporarily disable Performance module due to compilation issues
# include("Performance/Performance.jl")

include("EnKF/EnKFCore.jl")
# Remaining EnKF modules will be re-enabled as their drivers are completed
# include("EnKF/LETKF.jl")
# include("EnKF/CovarianceLocalization.jl")
# include("EnKF/ModelInterface.jl")
include("EnKF/EnKFDiagnostics.jl")

# Re-export from submodules
using .ControlVectors
using .CostFunctions
using .Minimization
using .StateVectors
using .BackgroundError
using .GridOperations
using .DataIO
using .ObservationOperators
using .MainDriver
using .AdvancedSolvers
using .FourDVar
using .Diagnostics
# using .Performance

using .EnKFCore
# using .LETKF
# using .CovarianceLocalization
# using .ModelInterface
using .EnKFDiagnostics

export ControlVectors, CostFunctions, Minimization, StateVectors
export BackgroundError, GridOperations, DataIO, ObservationOperators, MainDriver
export AdvancedSolvers, FourDVar, Diagnostics
# export Performance
export EnKFCore
# export CovarianceLocalization
export EnKFDiagnostics
# export LETKF

# Provide lightweight fallbacks until the dedicated model interface layer is restored
if !isdefined(@__MODULE__, :initialize_model_interface)
    struct StubModelInterface
        model_type::String
        params::Dict{String,Any}
    end

    function initialize_model_interface(model_type::AbstractString, config_params::Dict{String,Any})
        return StubModelInterface(String(model_type), deepcopy(config_params))
    end

    finalize_model_interface(::StubModelInterface) = nothing
end

# High-level analysis functions
"""
    run_analysis(method::String, data::Dict, observations::Dict, config::AnalysisConfig)

Unified high-level interface for running different analysis methods (3DVar, 4DVar, EnKF, Hybrid).
"""
function run_analysis(method::String, data::Dict, observations::Dict, config::AnalysisConfig)
    if method == "3DVar"
        return run_3dvar_analysis(data, observations, config)
    elseif method == "4DVar"
        return run_4dvar_analysis(data, observations, config)
    elseif method == "EnKF"
        return run_enkf_analysis(data, observations, config)
    elseif method == "Hybrid"
        (haskey(data, "control_vector") && haskey(data, "ensemble_data")) ||
            error("Hybrid analysis expects `data` to contain \"control_vector\" and \"ensemble_data\"")
        return run_hybrid_analysis(data["control_vector"], data["ensemble_data"], observations, config)
    else
        error("Unsupported analysis method: $method")
    end
end

"""
    run_enkf_analysis(ensemble_data::Dict, observations::Dict, config::AnalysisConfig)

Run ensemble Kalman filter analysis with full diagnostics and multi-model support.
"""
function run_enkf_analysis(ensemble_data::Dict, observations::Dict, config::AnalysisConfig)
    println("Starting EnKF Analysis...")
    mkpath(config.output_path)
    start_time = now()

    # Initialize model interface
    model_interface = initialize_model_interface(config.model_type, config.model_params)
    
    # Initialize diagnostics
    diag_config = DiagnosticConfig(
        output_path = config.output_path,
        log_level = get(config.params, "log_level", 2),
        save_statistics = true,
        generate_plots = get(config.params, "generate_plots", false),
        observation_types = get(config.params, "observation_types", ["conv", "sat"]),
        variable_names = get(config.params, "variable_names", ["u", "v", "t", "q", "ps"]),
        regional_analysis = get(config.params, "regional_analysis", false),
        latitude_bands = get(config.params, "latitude_bands", [-90.0, -30.0, 30.0, 90.0]),
        quality_control_thresholds = get(config.params, "qc_thresholds", Dict{String,Float64}()),
        performance_monitoring = true,
        memory_tracking = true
    )
    
    # Run EnKF analysis
    result = EnKFCore.enkf_analysis(ensemble_data, observations, config)
    
    # Calculate diagnostics
    innovation_stats = print_innovation_stats(
        result["innovation_fits"], 
        result["ensemble_spread"], 
        result["observation_errors"], 
        diag_config
    )
    
    spread_stats = calculate_ensemble_spread(result["analysis_ensemble"], diag_config.variable_names)
    performance_stats = monitor_performance(start_time, "ensemble_analysis", diag_config)
    memory_stats = track_memory_usage(diag_config)
    
    # Validate results
    validation_results = validate_ensemble_consistency(result["analysis_ensemble"], diag_config)
    
    # Generate comprehensive report
    all_stats = Dict(
        "innovation" => Dict("combined" => innovation_stats),
        "ensemble_spread" => spread_stats,
        "performance" => performance_stats,
        "memory" => memory_stats,
        "validation" => validation_results
    )
    
    generate_diagnostic_report(all_stats, diag_config)
    
    # Finalize model interface
    finalize_model_interface(model_interface)
    
    println("EnKF Analysis completed successfully!")
    
    return result
end

"""
    run_hybrid_analysis(control_vector::Dict, ensemble_data::Dict, observations::Dict, config::AnalysisConfig)

Run hybrid ensemble-variational analysis combining 3D/4D-Var with ensemble covariances.
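
The static/ensemble weights and inner variational method are read from `config.params` under the keys below (defaults 0.25, 0.75, and "3DVar"); an illustrative `params` Dict:

```julia
params = Dict{String,Any}(
    "static_covar_weight"   => 0.25,    # weight α on the static B
    "ensemble_covar_weight" => 0.75,    # weight β on the ensemble B
    "variational_method"    => "3DVar", # or "4DVar"
)
```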
"""
function run_hybrid_analysis(control_vector::Dict, ensemble_data::Dict, observations::Dict, config::AnalysisConfig)
    println("Starting Hybrid Ensemble-Variational Analysis...")
    mkpath(config.output_path)
    start_time = now()

    # Get hybrid coefficients
    static_weight = get(config.params, "static_covar_weight", 0.25)
    ensemble_weight = get(config.params, "ensemble_covar_weight", 0.75)

    println(@sprintf("Hybrid weights: Static B = %.2f, Ensemble B = %.2f", static_weight, ensemble_weight))

    # Initialize model interface for ensemble components
    model_interface = initialize_model_interface(config.model_type, config.model_params)

    # Produce ensemble-based diagnostics and covariance statistics
    enkf_background = EnKFCore.enkf_analysis(ensemble_data, observations, config)
    ensemble_covariances = EnKFCore.calculate_ensemble_covariances(ensemble_data, config)

    # Set up hybrid background error covariance
    hybrid_covariance = BackgroundError.create_hybrid_covariance(
        static_weight,
        ensemble_weight,
        ensemble_covariances,
        config
    )

    # Ensure a usable background state is present for the variational solve
    hybrid_control = copy(control_vector)
    if !haskey(hybrid_control, "background_state")
        hybrid_control["background_state"] = hybrid_covariance["ensemble_mean"]
    end

    # Prepare 4D-Var friendly observations if needed
    four_dvar_observations = haskey(observations, "time_series") ? observations["time_series"] : observations
    if !all(_key_is_time_index, keys(four_dvar_observations))
        four_dvar_observations = Dict(0 => four_dvar_observations)
    end

    variational_method = get(config.params, "variational_method", "3DVar")
    result = if variational_method == "4DVar"
        hybrid_control["background_error_cov"] = hybrid_covariance["covariance"]
        hybrid_control["background_error_operator"] = nothing
        run_4dvar_analysis(hybrid_control, four_dvar_observations, config)
    else
        run_3dvar_analysis(
            hybrid_control,
            observations,
            config,
            hybrid_covariance["covariance"],
            background_override = hybrid_covariance["ensemble_mean"]
        )
    end

    if variational_method == "4DVar"
        analysis_state = get(result, "analysis", nothing)
        if analysis_state === nothing
            @warn "4D-Var hybrid analysis did not return an \"analysis\" vector; skipping state packaging"
        else
            result["analysis_state"] = analysis_state
            result["background_state"] = hybrid_control["background_state"]
            result["analysis_increment"] = analysis_state - result["background_state"]
        end
    end

    # Add ensemble diagnostics
    diag_config = DiagnosticConfig(
        output_path = config.output_path,
        log_level = get(config.params, "log_level", 2),
        save_statistics = true,
        generate_plots = get(config.params, "generate_plots", false),
        observation_types = get(config.params, "observation_types", ["conv", "sat"]),
        variable_names = get(config.params, "variable_names", ["u", "v", "t", "q", "ps"]),
        regional_analysis = get(config.params, "regional_analysis", false),
        latitude_bands = get(config.params, "latitude_bands", [-90.0, -30.0, 30.0, 90.0]),
        quality_control_thresholds = get(config.params, "qc_thresholds", Dict{String,Float64}()),
        performance_monitoring = true,
        memory_tracking = true
    )

    spread_stats = calculate_ensemble_spread(ensemble_data["members"], diag_config.variable_names)
    innovation_stats = print_innovation_stats(
        enkf_background["innovation_fits"],
        enkf_background["ensemble_spread"],
        enkf_background["observation_errors"],
        diag_config
    )
    performance_stats = monitor_performance(start_time, "hybrid_analysis", diag_config)
    memory_stats = track_memory_usage(diag_config)
    validation_results = validate_ensemble_consistency(enkf_background["analysis_ensemble"], diag_config)

    all_stats = Dict(
        "innovation" => Dict("combined" => innovation_stats),
        "ensemble_spread" => spread_stats,
        "performance" => performance_stats,
        "memory" => memory_stats,
        "validation" => validation_results
    )

    generate_diagnostic_report(all_stats, diag_config)

    result["ensemble_diagnostics"] = spread_stats
    result["hybrid_covariance"] = hybrid_covariance
    result["background_covariance"] = hybrid_covariance["covariance"]
    result["enkf_background"] = enkf_background
    result["hybrid_weights"] = (static = hybrid_covariance["static_weight"], ensemble = hybrid_covariance["ensemble_weight"])

    finalize_model_interface(model_interface)

    println("Hybrid Analysis completed successfully!")
    return result
end

function run_3dvar_analysis(data::Dict, observations::Dict, config::AnalysisConfig, B_matrix=nothing; background_override=nothing)
    println("Running 3D-Var analysis...")

    state = if background_override === nothing
        Vector{Float64}(get(data, "background_state", Float64[]))
    else
        Vector{Float64}(background_override)
    end

    isempty(state) && error("3D-Var requires a background state vector")

    obs_values, obs_errors, H = _extract_linear_observations(observations, length(state))

    B = _materialize_background_covariance(B_matrix, length(state), config, data)
    R = Diagonal(obs_errors .^ 2)

    innovation = obs_values - H * state
    S = Symmetric(H * B * H' + R)
    K = (B * H') / S                     # Kalman gain via factorization, not an explicit inverse
    analysis_state = state + K * innovation
    analysis_residual = obs_values - H * analysis_state

    increment = analysis_state - state
    background_term = 0.5 * dot(increment, B \ increment)
    observation_term = 0.5 * sum((analysis_residual ./ obs_errors) .^ 2)

    return Dict(
        "analysis_state" => analysis_state,
        "background_state" => state,
        "analysis_increment" => increment,
        "innovation" => innovation,
        "analysis_residual" => analysis_residual,
        "kalman_gain" => K,
        "observation_operator" => H,
        "cost_function_value" => background_term + observation_term,
        "background_covariance" => B
    )
end

function _extract_linear_observations(observations::Dict, state_size::Int)
    group = if haskey(observations, "values")
        observations
    else
        found = nothing
        for value in values(observations)
            if isa(value, Dict) && haskey(value, "values")
                found = value
                break
            end
        end
        found === nothing && error("Observations must contain a group with \"values\" and optional operator information")
        found
    end

    obs_values = Vector{Float64}(group["values"])
    obs_errors = haskey(group, "errors") ? Vector{Float64}(group["errors"]) : ones(Float64, length(obs_values))

    H = if haskey(group, "operator")
        Matrix{Float64}(group["operator"])
    elseif haskey(group, "state_indices")
        idx = collect(Int, group["state_indices"])
        mat = zeros(Float64, length(idx), state_size)
        for (row, col) in enumerate(idx)
            1 <= col <= state_size || error("State index $col out of bounds for observation operator")
            mat[row, col] = 1.0
        end
        mat
    else
        n_obs = length(obs_values)
        n_obs <= state_size || error("Cannot build identity observation operator with more observations than state elements")
        mat = zeros(Float64, n_obs, state_size)
        for i in 1:n_obs
            mat[i, i] = 1.0
        end
        mat
    end

    return obs_values, obs_errors, H
end

function _materialize_background_covariance(B_matrix, state_size::Int, config::AnalysisConfig, data::Dict)
    T = config.precision

    if B_matrix === nothing
        variance = get(data, "background_variance", get(config.params, "static_background_variance", 1.0))
        return Matrix{T}(I, state_size, state_size) .* T(variance)
    elseif isa(B_matrix, Dict)
        haskey(B_matrix, "covariance") || error("Hybrid covariance dictionary must include a \"covariance\" matrix")
        return Matrix{T}(B_matrix["covariance"])
    elseif isa(B_matrix, AbstractMatrix)
        return Matrix{T}(B_matrix)
    else
        error("Unsupported background covariance representation: $(typeof(B_matrix))")
    end
end

function _key_is_time_index(key)
    if key isa Integer
        return true
    elseif key isa AbstractString
        return !isempty(key) && all(isdigit, key)
    elseif key isa Symbol
        s = String(key)
        return !isempty(s) && all(isdigit, s)
    else
        return false
    end
end

function _time_key_to_int(key)
    if key isa Integer
        return Int(key)
    elseif key isa AbstractString
        return parse(Int, key)
    elseif key isa Symbol
        return parse(Int, String(key))
    else
        error("Cannot convert key $(key) to time index")
    end
end

function _extract_observation_payload(obs_data)
    if obs_data isa AbstractDict
        values = haskey(obs_data, "values") ? obs_data["values"] : get(obs_data, :values, Float64[])
        values_vec = Vector{Float64}(values)
        if haskey(obs_data, "errors")
            errors_vec = Vector{Float64}(obs_data["errors"])
        elseif haskey(obs_data, :errors)
            errors_vec = Vector{Float64}(obs_data[:errors])
        else
            errors_vec = ones(Float64, length(values_vec))
        end
        return values_vec, errors_vec
    elseif obs_data isa AbstractVector
        values_vec = Vector{Float64}(obs_data)
        return values_vec, ones(Float64, length(values_vec))
    else
        return [Float64(obs_data)], [1.0]
    end
end

function _normalize_4dvar_observations(observations::Dict)
    keys_are_time = all(_key_is_time_index, keys(observations))
    obs_source = keys_are_time ? observations : Dict(0 => observations)

    normalized = Dict{Int, Dict{Symbol, Vector{Float64}}}()
    for (raw_key, obs_data) in obs_source
        time_idx = _time_key_to_int(raw_key)
        values, errors = _extract_observation_payload(obs_data)
        normalized[time_idx] = Dict(:values => values, :errors => errors)
    end

    return normalized
end

"""
    run_4dvar_analysis(data::Dict, observations::Dict, config::AnalysisConfig)

Run DRP-4DVar analysis using dimensionality reduction projection method.
This function integrates the standalone DRP-4DVar implementation with the 
GSI analysis framework for realistic atmospheric data assimilation.
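
Observations may be keyed by time index (an `Int`, an all-digit `String`, or a `Symbol`); a Dict without time keys is treated as a single batch at index 0, and missing `"errors"` default to ones. An illustrative shape:

```julia
observations = Dict(
    0 => Dict("values" => [1.2, 0.8], "errors" => [0.5, 0.5]),
    6 => Dict("values" => [1.0],      "errors" => [0.4]),
)
```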
"""
function run_4dvar_analysis(data::Dict, observations::Dict, config::AnalysisConfig)
    println("Running DRP-4DVar analysis...")
    
    # Extract parameters from config or use defaults
    ensemble_size = get(config.params, "ensemble_size", 40)
    time_window = get(config.params, "time_window", 6)
    max_outer_loops = get(config.params, "max_outer_loops", 3)
    max_inner_loops = get(config.params, "max_inner_loops", 100)
    convergence_tol = get(config.params, "convergence_tolerance", 1e-6)
    optimizer = get(config.params, "optimizer", "lbfgs")
    use_localization = Bool(get(config.params, "use_localization", true))
    localization_radius = Float64(get(config.params, "localization_radius", 1000.0))
    ensemble_inflation = Float64(get(config.params, "ensemble_inflation", 1.02))
    background_variance = Float64(get(config.params, "background_error_variance", 1.0))
    
    # Initialize DRP-4DVar method
    drp4dvar = DRP4DVar(
        ensemble_size = ensemble_size,
        max_outer_loops = max_outer_loops,
        max_inner_loops = max_inner_loops,
        convergence_tolerance = convergence_tol,
        time_window = time_window,
        optimizer = optimizer,
        ensemble_inflation = ensemble_inflation,
        localization_radius = localization_radius,
        use_localization = use_localization,
        background_error_variance = background_variance
    )
    
    # Extract required data from input
    haskey(data, "background_state") || error("4D-Var requires a \"background_state\" vector in `data`")
    background_state = Vector{Float64}(data["background_state"])
    background_error_cov = get(data, "background_error_operator", nothing)
    if background_error_cov === nothing
        background_error_cov = get(data, "background_error_cov", nothing)
    end
    if background_error_cov === nothing
        background_error_cov = Matrix{Float64}(I, length(background_state), length(background_state))
    end
    if background_error_cov isa AbstractDict
        if haskey(background_error_cov, "covariance")
            background_error_cov = background_error_cov["covariance"]
        elseif haskey(background_error_cov, :covariance)
            background_error_cov = background_error_cov[:covariance]
        end
    end
    raw_observation_ops = get(data, "observation_operators", Dict{Int, Any}())
    observation_operators = Dict{Int, Any}()
    for (key, op) in raw_observation_ops
        observation_operators[Int(key)] = op
    end

    raw_model_ops = get(data, "model_operators", Dict{Int, Any}())
    model_operators = Dict{Int, Any}()
    for (key, op) in raw_model_ops
        model_operators[Int(key)] = op
    end
    
    # Convert observations to proper format
    normalized_obs = _normalize_4dvar_observations(observations)
    obs_dict = Dict{Int, Vector{Float64}}(t => data_dict[:values] for (t, data_dict) in normalized_obs)
    obs_errors = Dict{Int, Vector{Float64}}(t => data_dict[:errors] for (t, data_dict) in normalized_obs)
    
    # Run DRP-4DVar algorithm
    analysis_state, statistics = run_drp4dvar(
        drp4dvar,
        background_state,
        background_error_cov,
        obs_dict,
        observation_operators,
        model_operators
    )

    # Attach observation error metadata for downstream diagnostics
    statistics["observation_errors"] = obs_errors

    # Return results in GSI format
    return Dict(
        "analysis" => analysis_state,
        "cost_function_value" => get(statistics, "final_cost", Inf),
        "drp4dvar_statistics" => statistics,
        "analysis_increment" => get(statistics, "analysis_increment", analysis_state - background_state),
        "convergence_history" => get(statistics, "convergence_history", Float64[]),
        "execution_time" => get(statistics, "total_execution_time", 0.0),
        "method" => "DRP-4DVar"
    )
end

end # module GSICoreAnalysis
