`, then open `http://127.0.0.1:8888/<your html filename>` in a local browser
+
+If the `http.server` log shows many `400 Bad request version` entries with garbled bytes, a client is usually speaking HTTPS to the plain-HTTP port; make sure the browser address bar starts with `http://...`.
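For reference, the viewer can be served with Python's standard-library `http.server`. Below is a minimal programmatic equivalent of `python -m http.server` (the port and directory are illustrative; adjust to your checkout):

```python
from functools import partial
from http.server import HTTPServer, SimpleHTTPRequestHandler

# Serve the case-study output directory over plain HTTP.
# Port 0 asks the OS for a free port; pass 8888 to match the URL above.
handler = partial(SimpleHTTPRequestHandler, directory="exp/case_study/out")
server = HTTPServer(("127.0.0.1", 0), handler)
print(f"http://127.0.0.1:{server.server_port}/")
# server.serve_forever()  # uncomment to actually serve
```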
+
+## Optional arguments
+- `--sink_span a b` / `--thinking_span a b`: override the generation-side sink/thinking sentence spans (defaults come from the cached fields).
+- `--attnlrp_neg_handling drop|abs`: per-hop negative-value handling for FT-AttnLRP (drop = clamp to >= 0, abs = take absolute values).
+- `--attnlrp_norm_mode norm|no_norm`: FT-AttnLRP normalization and hop-ratio switch (norm = per-hop global + thinking normalization with ratios enabled; no_norm = all three disabled).
+- `--chunk_tokens` / `--sink_chunk_tokens`: IFR chunking parameters.
+- `--output_dir`: override the output directory.
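As an illustration of the two `--attnlrp_neg_handling` options and the `norm` renormalization, here is a minimal sketch on a toy hop vector (the real FT-AttnLRP postprocessing additionally applies thinking-span normalization and hop ratios):

```python
import torch

hop = torch.tensor([0.5, -0.2, 0.1, -0.4])  # toy per-token relevance for one hop

drop = hop.clamp(min=0.0)   # --attnlrp_neg_handling drop: clamp negatives to 0
absed = hop.abs()           # --attnlrp_neg_handling abs: keep magnitudes only

# --attnlrp_norm_mode norm: rescale the hop to unit mass so hops are comparable
normed = drop / drop.sum().clamp(min=1e-12)
```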
+
+## Files
+- `run_ifr_case.py`: command-line entry point and artifact writer (supports the `ft`/`ft_improve`/`ft_split_hop`/`ifr_in_all_gen`/`ifr`/`ifr_all_positions`/`ifr_all_positions_output_only`/`attnlrp`/`ft_attnlrp` modes).
+- `run_mas_case.py`: MAS (faithfulness / token perturbation) visualization entry point and artifact writer (supports `ifr`/`ifr_all_positions_output_only`/`ft`/`attnlrp`/`ft_attnlrp`).
+- `analysis.py`: hop-wise cleaning and packaging helpers (token-level).
+- `viz.py`: HTML rendering and curve plots.
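The per-hop packaging in `analysis.py` can be sketched as follows (a simplified reimplementation for illustration; the real `package_token_hops` additionally records a `token_stats` summary per hop):

```python
import torch

def package_token_hops(hop_vectors):
    """NaN-clean each hop vector and record simple summary fields."""
    packaged = []
    for hop_idx, vec in enumerate(hop_vectors):
        v = torch.nan_to_num(torch.as_tensor(vec, dtype=torch.float32), nan=0.0)
        packaged.append({
            "hop": hop_idx,
            "token_scores": v.tolist(),
            "token_score_max": float(v.abs().max()) if v.numel() else 0.0,
            "total_mass": float(v.sum()),
        })
    return packaged

hops = package_token_hops([[0.2, float("nan"), -0.1]])
```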
diff --git a/exp/case_study/analysis.py b/exp/case_study/analysis.py
new file mode 100644
index 0000000000000000000000000000000000000000..74287af9ecf30c48b9b5dd903191563028332d95
--- /dev/null
+++ b/exp/case_study/analysis.py
@@ -0,0 +1,74 @@
+"""Helpers for IFR case studies (hop-wise aggregation + sanitization).
+
+All utilities stay local to exp/case_study to avoid touching core eval code.
+"""
+
+from __future__ import annotations
+
+from typing import Any, Dict, Iterable, List, Optional, Sequence
+
+import torch
+
+
+def vector_stats(vec: torch.Tensor) -> Dict[str, float]:
+ if vec.numel() == 0:
+ return {"min": 0.0, "max": 0.0, "abs_max": 0.0, "mean": 0.0, "sum": 0.0}
+ v = vec.detach().to(dtype=torch.float32)
+ return {
+ "min": float(v.min().item()),
+ "max": float(v.max().item()),
+ "abs_max": float(v.abs().max().item()),
+ "mean": float(v.mean().item()),
+ "sum": float(v.sum().item()),
+ }
+
+
+def tensor_to_list(x: Any) -> Any:
+ if torch.is_tensor(x):
+ return x.detach().cpu().tolist()
+ if isinstance(x, list):
+ return [tensor_to_list(v) for v in x]
+ if isinstance(x, dict):
+ return {k: tensor_to_list(v) for k, v in x.items()}
+ return x
+
+
+def sanitize_ifr_meta(meta: Optional[Dict[str, Any]]) -> Optional[Dict[str, Any]]:
+ """Drop bulky raw objects and convert tensors to Python lists for JSON."""
+
+ if meta is None:
+ return None
+
+ cleaned: Dict[str, Any] = {}
+ for key, value in meta.items():
+ if key == "raw":
+ continue
+ cleaned[key] = tensor_to_list(value)
+ return cleaned
+
+
+def package_token_hops(
+ hop_vectors: Iterable[Sequence[float]],
+) -> List[Dict[str, Any]]:
+ """Package per-hop token vectors without sentence aggregation.
+
+ hop_vectors are assumed to already match the experiment's configured
+ postprocessing (e.g., FT-AttnLRP neg_handling/norm_mode).
+ """
+
+ packaged: List[Dict[str, Any]] = []
+ for hop_idx, vec in enumerate(hop_vectors):
+ vec_tensor = torch.nan_to_num(torch.as_tensor(vec, dtype=torch.float32), nan=0.0)
+ token_scores = vec_tensor.tolist()
+ token_max = float(vec_tensor.abs().max().item()) if vec_tensor.numel() > 0 else 0.0
+ total = float(vec_tensor.sum().item())
+ packaged.append(
+ {
+ "hop": hop_idx,
+ "token_scores": token_scores,
+ "token_score_max": token_max,
+ "token_stats": vector_stats(vec_tensor),
+ "total_mass": total,
+ }
+ )
+ return packaged
diff --git a/exp/case_study/faithfulness_trace.py b/exp/case_study/faithfulness_trace.py
new file mode 100644
index 0000000000000000000000000000000000000000..73a50eb426053562cb65b153246a1007b5b81df3
--- /dev/null
+++ b/exp/case_study/faithfulness_trace.py
@@ -0,0 +1,183 @@
+"""Faithfulness (MAS/RISE) trace utilities for exp/case_study.
+
+This module is intentionally aligned with `llm_attr_eval.LLMAttributionEvaluator.faithfulness_test`,
+but additionally returns the full trace arrays needed for visualization and supports providing
+`user_prompt_indices` to avoid fragile subsequence matching.
+"""
+
+from __future__ import annotations
+
+from typing import Any, Dict, Optional, Sequence, List
+
+import numpy as np
+import torch
+
+import llm_attr_eval
+
+
+def _auc(arr: np.ndarray) -> float:
+ return float((arr.sum() - arr[0] / 2 - arr[-1] / 2) / max(1, (arr.shape[0] - 1)))
+
+
+@torch.inference_mode()
+def mas_trace(
+ llm_evaluator: llm_attr_eval.LLMAttributionEvaluator,
+ *,
+ attribution: torch.Tensor,
+ prompt: str,
+ generation: str,
+ user_prompt_indices: Optional[Sequence[int]] = None,
+ k: int = 20,
+) -> Dict[str, Any]:
+ """Return a token-level faithfulness trace (RISE/MAS/RISE+AP) plus per-token deltas.
+
+ attribution: [R, P] token attribution on prompt-side tokens only.
+ prompt: raw prompt string.
+ generation: target generation string; scored as generation + eos (if defined).
+ user_prompt_indices: optional absolute positions of each prompt token inside formatted prompt ids.
+ k: number of perturbation steps; each step perturbs ~1/k of prompt tokens.
+ """
+
+ if attribution.ndim != 2:
+ raise ValueError("Expected 2D prompt-side attribution matrix [R, P].")
+
+ pad_token_id = llm_evaluator._ensure_pad_token_id()
+
+ user_prompt = " " + prompt
+ formatted_prompt = llm_evaluator.format_prompt(user_prompt)
+ formatted_ids = llm_evaluator.tokenizer(formatted_prompt, return_tensors="pt", add_special_tokens=False).input_ids
+
+ prompt_ids = formatted_ids.to(llm_evaluator.device)
+ prompt_ids_perturbed = prompt_ids.clone()
+
+ eos = llm_evaluator.tokenizer.eos_token or ""
+ generation_ids = llm_evaluator.tokenizer(
+ generation + eos,
+ return_tensors="pt",
+ add_special_tokens=False,
+ ).input_ids.to(llm_evaluator.device)
+
+ attr_cpu = attribution.detach().cpu()
+ w = attr_cpu.sum(0)
+ sorted_attr_indices = torch.argsort(w, descending=True)
+ attr_sum = float(w.sum().item())
+
+ P = int(w.numel())
+
+ prompt_positions: List[int]
+ if user_prompt_indices is not None:
+ prompt_positions = [int(x) for x in user_prompt_indices]
+ if len(prompt_positions) != P:
+ raise ValueError(
+ "user_prompt_indices length does not match prompt-side attribution length: "
+ f"indices P={len(prompt_positions)}, attr P={P}."
+ )
+ if P and max(prompt_positions) >= int(prompt_ids_perturbed.shape[1]):
+ raise ValueError("user_prompt_indices contains an out-of-bounds index for formatted prompt ids.")
+ else:
+ user_ids = llm_evaluator.tokenizer(user_prompt, return_tensors="pt", add_special_tokens=False).input_ids
+ user_start = llm_evaluator._find_subsequence_start(formatted_ids[0], user_ids[0])
+ if user_start is None:
+ raise RuntimeError("Failed to locate user prompt token span inside formatted chat prompt.")
+ if int(user_ids.shape[1]) != P:
+ raise ValueError(
+ "Prompt-side attribution length does not match tokenized user prompt length: "
+ f"attr P={P}, user_prompt P={int(user_ids.shape[1])}."
+ )
+ prompt_positions = [int(user_start) + j for j in range(P)]
+
+ if P > 0:
+ steps = int(k) if k is not None else 0
+ if steps <= 0:
+ steps = 1
+ steps = min(steps, P)
+ else:
+ steps = 0
+
+ scores = np.zeros(steps + 1, dtype=np.float64)
+ density = np.zeros(steps + 1, dtype=np.float64)
+
+ scores[0] = (
+ llm_evaluator.compute_logprob_response_given_prompt(prompt_ids_perturbed, generation_ids).sum().cpu().detach().item()
+ )
+ density[0] = 1.0
+
+ if P == 0:
+ return {
+ "num_tokens": 0,
+ "sorted_attr_indices": [],
+ "scores_raw": scores.tolist(),
+ "density": density.tolist(),
+ "normalized_model_response": [1.0],
+ "alignment_penalty": [0.0],
+ "corrected_scores": [1.0],
+ "token_deltas_raw": [],
+ "attr_weights": [],
+ "metrics": {"RISE": 0.0, "MAS": 0.0, "RISE+AP": 0.0},
+ }
+
+ if attr_sum <= 0:
+ density = np.linspace(1.0, 0.0, steps + 1)
+
+ per_token_delta = np.zeros(P, dtype=np.float64)
+
+ base = P // steps
+ remainder = P % steps
+ start = 0
+ for step in range(steps):
+ size = base + (1 if step < remainder else 0)
+ group = sorted_attr_indices[start : start + size]
+ start += size
+
+ for idx_t in group:
+ idx = int(idx_t.item())
+ abs_pos = int(prompt_positions[idx])
+ prompt_ids_perturbed[0, abs_pos] = pad_token_id
+ scores[step + 1] = (
+ llm_evaluator.compute_logprob_response_given_prompt(prompt_ids_perturbed, generation_ids).sum().cpu().detach().item()
+ )
+ if attr_sum > 0:
+ dec = float(w.index_select(0, group).sum().item()) / attr_sum
+ density[step + 1] = density[step] - dec
+
+ delta = scores[step] - scores[step + 1]
+ for idx_t in group:
+ idx = int(idx_t.item())
+ per_token_delta[idx] = delta
+
+ min_normalized_pred = 1.0
+ normalized_model_response = scores.copy()
+    denom = abs(scores[0] - scores[-1])
+    if denom == 0.0:
+        denom = 1e-12  # guard against a flat score curve (avoid division by zero)
+    for i in range(len(scores)):
+        normalized_pred = (normalized_model_response[i] - scores[-1]) / denom
+        normalized_pred = np.clip(normalized_pred, 0.0, 1.0)
+        min_normalized_pred = min(min_normalized_pred, normalized_pred)
+        normalized_model_response[i] = min_normalized_pred
+
+ alignment_penalty = np.abs(normalized_model_response - density)
+ corrected_scores = normalized_model_response + alignment_penalty
+ corrected_scores = corrected_scores.clip(0.0, 1.0)
+ corrected_scores = (corrected_scores - np.min(corrected_scores)) / (np.max(corrected_scores) - np.min(corrected_scores))
+ if np.isnan(corrected_scores).any():
+ corrected_scores = np.linspace(1.0, 0.0, len(scores))
+
+ rise = _auc(normalized_model_response)
+ mas = _auc(corrected_scores)
+ rise_ap = _auc(normalized_model_response + alignment_penalty)
+
+ if attr_sum > 0:
+ attr_weights = (w.numpy() / (attr_sum + 1e-12)).astype(np.float64)
+ else:
+ attr_weights = np.zeros(P, dtype=np.float64)
+
+ return {
+ "num_tokens": P,
+ "sorted_attr_indices": [int(i.item()) for i in sorted_attr_indices],
+ "scores_raw": scores.tolist(),
+ "density": density.tolist(),
+ "normalized_model_response": normalized_model_response.tolist(),
+ "alignment_penalty": alignment_penalty.tolist(),
+ "corrected_scores": corrected_scores.tolist(),
+ "token_deltas_raw": per_token_delta.tolist(),
+ "attr_weights": attr_weights.tolist(),
+ "metrics": {"RISE": rise, "MAS": mas, "RISE+AP": rise_ap},
+ }
diff --git a/exp/case_study/run_ifr_case.py b/exp/case_study/run_ifr_case.py
new file mode 100644
index 0000000000000000000000000000000000000000..92aa5db5f162d8fdf8682c5402625f7c8f44e696
--- /dev/null
+++ b/exp/case_study/run_ifr_case.py
@@ -0,0 +1,1225 @@
+#!/usr/bin/env python3
+"""Case study runner for FlashTrace and attribution baselines.
+
+Modes supported (all emit JSON + HTML under ``exp/case_study/out``):
+
+- ``ft``: FlashTrace (current project implementation; multi-hop IFR)
+- ``ft_improve``: FlashTrace with stop-token soft deletion (experimental)
+- ``ft_split_hop``: FlashTrace with split-hop IFR over the segmented thinking span (experimental)
+- ``ifr_in_all_gen``: Experimental multi-hop IFR variant (hops over CoT+output; scheme B, aligns with exp/exp2)
+- ``ifr``: IFR span-aggregate visualization (single hop; one panel)
+- ``ifr_all_positions``: IFR full matrix + CAGE (Row/Recursive panels)
+- ``ifr_all_positions_output_only``: IFR output-only token matrix + CAGE (Row/Recursive panels)
+- ``attnlrp``: AttnLRP hop0 (reuses the FT-AttnLRP span-aggregate; visualizes the raw hop0 vector)
+- ``ft_attnlrp``: FT-AttnLRP (multi-hop aggregated AttnLRP; matches exp/exp2)
+"""
+
+from __future__ import annotations
+
+import argparse
+import json
+import os
+import sys
+import types
+from pathlib import Path
+from typing import Any, Dict, List, Optional, Sequence, Tuple
+
+# Avoid torchvision dependency when importing transformers (Longformer).
+os.environ.setdefault("TRANSFORMERS_NO_TORCHVISION", "1")
+os.environ.setdefault("DISABLE_TRANSFORMERS_IMAGE_TRANSFORMS", "1")
+
+def _early_set_cuda_visible_devices() -> None:
+ """Set CUDA_VISIBLE_DEVICES before importing torch/transformers.
+
+ Note: CUDA device indices are re-mapped inside the process after applying the mask.
+ """
+
+ parser = argparse.ArgumentParser(add_help=False)
+ parser.add_argument("--cuda", type=str, default=None)
+ args, _ = parser.parse_known_args(sys.argv[1:])
+ cuda = args.cuda.strip() if isinstance(args.cuda, str) else ""
+ if cuda and "," in cuda:
+ os.environ["CUDA_VISIBLE_DEVICES"] = cuda
+
+
+if __name__ == "__main__":
+ _early_set_cuda_visible_devices()
+
+import torch
+
+REPO_ROOT = Path(__file__).resolve().parents[2]
+if str(REPO_ROOT) not in sys.path:
+ sys.path.insert(0, str(REPO_ROOT))
+
+
+def _stub_torchvision() -> None:
+ """Provide minimal torchvision stubs so Longformer imports succeed without the real package."""
+
+ if "torchvision" in sys.modules:
+ return
+
+ from importlib.machinery import ModuleSpec
+
+ def _mk(name: str) -> types.ModuleType:
+ mod = types.ModuleType(name)
+ mod.__spec__ = ModuleSpec(name, loader=None)
+ return mod
+
+ tv = _mk("torchvision")
+ tv.__dict__["__path__"] = []
+ submods = ["transforms", "_meta_registrations", "datasets", "io", "models", "ops", "utils"]
+ for name in submods:
+ mod = _mk(f"torchvision.{name}")
+ sys.modules[f"torchvision.{name}"] = mod
+ setattr(tv, name, mod)
+
+ class _InterpolationMode:
+ NEAREST = 0
+ NEAREST_EXACT = 0
+ BILINEAR = 1
+ BICUBIC = 2
+ LANCZOS = 3
+ BOX = 4
+ HAMMING = 5
+
+ sys.modules["torchvision.transforms"].InterpolationMode = _InterpolationMode
+ sys.modules["torchvision.transforms"].__all__ = ["InterpolationMode"]
+
+ # ops + misc stub for timm/transformers imports
+ ops_mod = sys.modules.get("torchvision.ops") or _mk("torchvision.ops")
+ sys.modules["torchvision.ops"] = ops_mod
+ setattr(tv, "ops", ops_mod)
+ misc_mod = _mk("torchvision.ops.misc")
+ sys.modules["torchvision.ops.misc"] = misc_mod
+ setattr(ops_mod, "misc", misc_mod)
+
+ class _FrozenBatchNorm2d:
+ def __init__(self, *args, **kwargs):
+ pass
+
+ misc_mod.FrozenBatchNorm2d = _FrozenBatchNorm2d
+
+ sys.modules["torchvision"] = tv
+
+
+_stub_torchvision()
+
+
+def _stub_timm() -> None:
+ """Provide minimal timm stubs to avoid optional vision deps."""
+
+ if "timm" in sys.modules:
+ return
+
+ from importlib.machinery import ModuleSpec
+
+ def _mk(name: str) -> types.ModuleType:
+ mod = types.ModuleType(name)
+ mod.__spec__ = ModuleSpec(name, loader=None)
+ return mod
+
+ timm = _mk("timm")
+ timm.__dict__["__path__"] = []
+ sys.modules["timm"] = timm
+
+ data_mod = _mk("timm.data")
+ sys.modules["timm.data"] = data_mod
+ timm.data = data_mod
+
+ class _ImageNetInfo:
+ pass
+
+ def _infer_imagenet_subset(*args, **kwargs):
+ return None
+
+ data_mod.ImageNetInfo = _ImageNetInfo
+ data_mod.infer_imagenet_subset = _infer_imagenet_subset
+
+ layers_mod = _mk("timm.layers")
+ sys.modules["timm.layers"] = layers_mod
+ timm.layers = layers_mod
+
+ create_norm_mod = _mk("timm.layers.create_norm")
+ sys.modules["timm.layers.create_norm"] = create_norm_mod
+ layers_mod.create_norm = create_norm_mod
+
+ def _get_norm_layer(*args, **kwargs):
+ return None
+
+ create_norm_mod.get_norm_layer = _get_norm_layer
+
+ classifier_mod = _mk("timm.layers.classifier")
+ sys.modules["timm.layers.classifier"] = classifier_mod
+ layers_mod.classifier = classifier_mod
+
+
+_stub_timm()
+
+import transformers
+
+# Provide light stubs if Longformer classes are unavailable; IFR case study does not use them.
+if not hasattr(transformers, "LongformerTokenizer"):
+ class _DummyLongformerTokenizer:
+ def __init__(self, *args, **kwargs):
+ raise ImportError("LongformerTokenizer stubbed; install full transformers+torchvision if needed.")
+ transformers.LongformerTokenizer = _DummyLongformerTokenizer
+
+if not hasattr(transformers, "LongformerForMaskedLM"):
+ class _DummyLongformerForMaskedLM:
+ def __init__(self, *args, **kwargs):
+ raise ImportError("LongformerForMaskedLM stubbed; install full transformers+torchvision if needed.")
+ transformers.LongformerForMaskedLM = _DummyLongformerForMaskedLM
+
+if hasattr(transformers, "__all__"):
+ for _name in ["LongformerTokenizer", "LongformerForMaskedLM"]:
+ if _name not in transformers.__all__:
+ transformers.__all__.append(_name)
+
+# Gemma3n stubs (transformers may attempt to import even if unused)
+if "transformers.models.gemma3n.configuration_gemma3n" not in sys.modules:
+ from importlib.machinery import ModuleSpec
+
+ gemma_pkg = types.ModuleType("transformers.models.gemma3n")
+ gemma_pkg.__spec__ = ModuleSpec("transformers.models.gemma3n", loader=None, is_package=True)
+ sys.modules["transformers.models.gemma3n"] = gemma_pkg
+
+ gemma_conf = types.ModuleType("transformers.models.gemma3n.configuration_gemma3n")
+ gemma_conf.__spec__ = ModuleSpec("transformers.models.gemma3n.configuration_gemma3n", loader=None)
+
+ class Gemma3nConfig:
+ def __init__(self, *args, **kwargs):
+ self.model_type = "gemma3n"
+
+ class Gemma3nTextConfig(Gemma3nConfig):
+ pass
+
+ gemma_conf.Gemma3nConfig = Gemma3nConfig
+ gemma_conf.Gemma3nTextConfig = Gemma3nTextConfig
+ gemma_conf.__all__ = ["Gemma3nConfig", "Gemma3nTextConfig"]
+ sys.modules["transformers.models.gemma3n.configuration_gemma3n"] = gemma_conf
+ setattr(gemma_pkg, "configuration_gemma3n", gemma_conf)
+
+ if hasattr(transformers, "__all__"):
+ for _nm in ["Gemma3nConfig", "Gemma3nTextConfig"]:
+ if _nm not in transformers.__all__:
+ transformers.__all__.append(_nm)
+
+import llm_attr
+from exp.exp2 import dataset_utils as ds_utils
+from evaluations.attribution_recovery import load_model
+
+from exp.case_study import analysis, viz
+
+
+def resolve_device(cuda: Optional[str], cuda_num: int) -> str:
+ if cuda and isinstance(cuda, str) and "," in cuda:
+ os.environ["CUDA_VISIBLE_DEVICES"] = cuda
+ return "auto"
+ if cuda and isinstance(cuda, str) and cuda.strip():
+ try:
+ idx = int(cuda)
+ except Exception:
+ idx = 0
+ return f"cuda:{idx}" if torch.cuda.is_available() else "cpu"
+ return f"cuda:{cuda_num}" if torch.cuda.is_available() else "cpu"
+
+
+def load_example(dataset: str, index: int, data_root: Path) -> Tuple[ds_utils.CachedExample, str]:
+ """Load a single example from a cache path or dataset name."""
+
+ ds_path = Path(dataset)
+ if ds_path.exists():
+ examples = ds_utils.read_cached_jsonl(ds_path)
+ dataset_name = ds_path.name
+ else:
+ loader = ds_utils.DatasetLoader(data_root=data_root)
+ examples = loader.load(dataset)
+ dataset_name = dataset
+
+ if not examples:
+ raise ValueError(f"No examples found for dataset={dataset}")
+
+ if index < 0:
+ index = len(examples) + index
+ if not (0 <= index < len(examples)):
+ raise IndexError(f"index {index} out of range for dataset with {len(examples)} examples")
+
+ return examples[index], dataset_name
+
+
+def parse_args() -> argparse.Namespace:
+ parser = argparse.ArgumentParser("IFR multi-hop case study")
+ parser.add_argument("--dataset", type=str, default="exp/exp2/data/morehopqa.jsonl", help="Dataset name or JSONL path.")
+ parser.add_argument("--data_root", type=str, default="exp/exp2/data", help="Cache root for dataset names.")
+ parser.add_argument("--index", type=int, default=0, help="Sample index (supports negative for reverse).")
+ parser.add_argument(
+ "--mode",
+ type=str,
+ choices=[
+ "ft",
+ "ft_improve",
+ "ft_split_hop",
+ "ifr_in_all_gen",
+ "ifr",
+ "ifr_all_positions",
+ "ifr_all_positions_output_only",
+ "attnlrp",
+ "ft_attnlrp",
+ ],
+ default="ft",
+ help=(
+ "ft = FlashTrace (multi-hop IFR); ifr = standard IFR span-aggregate; "
+ "ifr_in_all_gen = multi-hop IFR over CoT+output (scheme B; exp2-aligned); "
+ "ifr_all_positions = full IFR matrix + CAGE row/rec; "
+ "ft_improve = FlashTrace (multi-hop IFR, stop-token soft deletion); "
+ "ft_split_hop = FlashTrace (split-hop IFR over segmented thinking span); "
+ "ifr_all_positions_output_only = output-only IFR matrix + CAGE row/rec; "
+ "attnlrp = AttnLRP hop0 (FT-AttnLRP span-aggregate); "
+ "ft_attnlrp = FT-AttnLRP (multi-hop aggregated; exp2)."
+ ),
+ )
+ parser.add_argument("--model", type=str, default="qwen-8B", help="HF repo id (ignored if --model_path set).")
+ parser.add_argument("--model_path", type=str, default=None, help="Local model path to override --model.")
+ parser.add_argument("--cuda", type=str, default=None, help="CUDA spec (e.g., '0' or '0,1').")
+ parser.add_argument("--cuda_num", type=int, default=0, help="Fallback GPU index when --cuda unset.")
+ parser.add_argument("--n_hops", type=int, default=1, help="Number of hops for IFR multi-hop.")
+ parser.add_argument("--sink_span", type=int, nargs=2, default=None, help="Optional sink span over generation tokens.")
+ parser.add_argument("--thinking_span", type=int, nargs=2, default=None, help="Optional thinking span over generation tokens.")
+ parser.add_argument(
+ "--attnlrp_neg_handling",
+ type=str,
+ choices=["drop", "abs"],
+ default="drop",
+ help="FT-AttnLRP: how to handle negative values after each hop (drop=clamp>=0, abs=absolute value).",
+ )
+ parser.add_argument(
+ "--attnlrp_norm_mode",
+ type=str,
+ choices=["norm", "no_norm"],
+ default="norm",
+ help="FT-AttnLRP: norm enables per-hop global+thinking normalization + ratios; no_norm disables all three.",
+ )
+ parser.add_argument("--chunk_tokens", type=int, default=128, help="IFR chunk size.")
+ parser.add_argument("--sink_chunk_tokens", type=int, default=32, help="IFR sink chunk size.")
+ parser.add_argument("--output_dir", type=str, default="exp/case_study/out", help="Where to write HTML/JSON artifacts.")
+ return parser.parse_args()
+
+
+def run_ft_multihop(
+ example: ds_utils.CachedExample,
+ model: Any,
+ tokenizer: Any,
+ *,
+ n_hops: int,
+ sink_span: Optional[Sequence[int]],
+ thinking_span: Optional[Sequence[int]],
+ chunk_tokens: int,
+ sink_chunk_tokens: int,
+) -> Tuple[Any, Optional[Tuple[int, int]], Optional[Tuple[int, int]], Dict[str, Any]]:
+ """Execute FT (current multi-hop IFR) attribution for the selected example."""
+
+ attr = llm_attr.LLMIFRAttribution(
+ model,
+ tokenizer,
+ chunk_tokens=chunk_tokens,
+ sink_chunk_tokens=sink_chunk_tokens,
+ )
+
+ sink = tuple(sink_span) if sink_span is not None else tuple(example.sink_span) if example.sink_span else None
+ thinking = (
+ tuple(thinking_span)
+ if thinking_span is not None
+ else tuple(example.thinking_span) if example.thinking_span else None
+ )
+
+ result = attr.calculate_ifr_multi_hop(
+ example.prompt,
+ target=example.target,
+ sink_span=sink,
+ thinking_span=thinking,
+ n_hops=n_hops,
+ )
+ debug_info: Dict[str, Any] = {
+ "full_prompt_tokens": list(getattr(attr, "prompt_tokens", []) or []),
+ "generation_tokens": list(getattr(attr, "generation_tokens", []) or []),
+ "user_prompt_indices": list(getattr(attr, "user_prompt_indices", []) or []),
+ "chat_prompt_indices": list(getattr(attr, "chat_prompt_indices", []) or []),
+ "prompt_ids": getattr(attr, "prompt_ids", None).detach().cpu().tolist() if getattr(attr, "prompt_ids", None) is not None else None,
+ "generation_ids": getattr(attr, "generation_ids", None).detach().cpu().tolist() if getattr(attr, "generation_ids", None) is not None else None,
+ }
+
+ raw_vectors = []
+ if result.metadata and "ifr" in result.metadata:
+ raw_ifr = result.metadata["ifr"].get("raw")
+ if raw_ifr is not None and hasattr(raw_ifr, "raw_attributions"):
+ try:
+ raw_vectors = [r.token_importance_total.detach().cpu() for r in raw_ifr.raw_attributions]
+ except Exception:
+ raw_vectors = []
+ debug_info["raw_hop_vectors"] = raw_vectors
+
+ return result, sink, thinking, debug_info
+
+
+def run_ft_multihop_improve(
+ example: ds_utils.CachedExample,
+ model: Any,
+ tokenizer: Any,
+ *,
+ n_hops: int,
+ sink_span: Optional[Sequence[int]],
+ thinking_span: Optional[Sequence[int]],
+ chunk_tokens: int,
+ sink_chunk_tokens: int,
+) -> Tuple[Any, Optional[Tuple[int, int]], Optional[Tuple[int, int]], Dict[str, Any]]:
+ """Execute experimental FT (multi-hop IFR) with stop-token soft deletion."""
+
+ import ft_ifr_improve
+
+ attr = ft_ifr_improve.LLMIFRAttributionImproved(
+ model,
+ tokenizer,
+ chunk_tokens=chunk_tokens,
+ sink_chunk_tokens=sink_chunk_tokens,
+ )
+
+ sink = tuple(sink_span) if sink_span is not None else tuple(example.sink_span) if example.sink_span else None
+ thinking = (
+ tuple(thinking_span)
+ if thinking_span is not None
+ else tuple(example.thinking_span) if example.thinking_span else None
+ )
+
+ result = attr.calculate_ifr_multi_hop_stop_words(
+ example.prompt,
+ target=example.target,
+ sink_span=sink,
+ thinking_span=thinking,
+ n_hops=n_hops,
+ )
+
+ debug_info: Dict[str, Any] = {
+ "full_prompt_tokens": list(getattr(attr, "prompt_tokens", []) or []),
+ "generation_tokens": list(getattr(attr, "generation_tokens", []) or []),
+ "user_prompt_indices": list(getattr(attr, "user_prompt_indices", []) or []),
+ "chat_prompt_indices": list(getattr(attr, "chat_prompt_indices", []) or []),
+ "prompt_ids": getattr(attr, "prompt_ids", None).detach().cpu().tolist() if getattr(attr, "prompt_ids", None) is not None else None,
+ "generation_ids": getattr(attr, "generation_ids", None).detach().cpu().tolist() if getattr(attr, "generation_ids", None) is not None else None,
+ }
+
+ raw_vectors = []
+ if result.metadata and "ifr" in result.metadata:
+ raw_ifr = result.metadata["ifr"].get("raw")
+ if raw_ifr is not None and hasattr(raw_ifr, "raw_attributions"):
+ try:
+ raw_vectors = [r.token_importance_total.detach().cpu() for r in raw_ifr.raw_attributions]
+ except Exception:
+ raw_vectors = []
+ debug_info["raw_hop_vectors"] = raw_vectors
+
+ return result, sink, thinking, debug_info
+
+
+def run_ft_multihop_split_hop(
+ example: ds_utils.CachedExample,
+ model: Any,
+ tokenizer: Any,
+ *,
+ n_hops: int,
+ sink_span: Optional[Sequence[int]],
+ thinking_span: Optional[Sequence[int]],
+ chunk_tokens: int,
+ sink_chunk_tokens: int,
+) -> Tuple[Any, Optional[Tuple[int, int]], Optional[Tuple[int, int]], Dict[str, Any]]:
+ """Execute experimental FT (split-hop IFR over segmented thinking span)."""
+
+ import ft_ifr_improve
+
+ attr = ft_ifr_improve.LLMIFRAttributionSplitHop(
+ model,
+ tokenizer,
+ chunk_tokens=chunk_tokens,
+ sink_chunk_tokens=sink_chunk_tokens,
+ )
+
+ sink = tuple(sink_span) if sink_span is not None else tuple(example.sink_span) if example.sink_span else None
+ thinking = (
+ tuple(thinking_span)
+ if thinking_span is not None
+ else tuple(example.thinking_span) if example.thinking_span else None
+ )
+
+ result = attr.calculate_ifr_multi_hop_split_hop(
+ example.prompt,
+ target=example.target,
+ sink_span=sink,
+ thinking_span=thinking,
+ n_hops=int(n_hops),
+ )
+
+ debug_info: Dict[str, Any] = {
+ "full_prompt_tokens": list(getattr(attr, "prompt_tokens", []) or []),
+ "generation_tokens": list(getattr(attr, "generation_tokens", []) or []),
+ "user_prompt_indices": list(getattr(attr, "user_prompt_indices", []) or []),
+ "chat_prompt_indices": list(getattr(attr, "chat_prompt_indices", []) or []),
+ "prompt_ids": getattr(attr, "prompt_ids", None).detach().cpu().tolist() if getattr(attr, "prompt_ids", None) is not None else None,
+ "generation_ids": getattr(attr, "generation_ids", None).detach().cpu().tolist() if getattr(attr, "generation_ids", None) is not None else None,
+ }
+
+ raw_vectors = []
+ if result.metadata and "ifr" in result.metadata:
+ raw_ifr = result.metadata["ifr"].get("raw")
+ if raw_ifr is not None and hasattr(raw_ifr, "raw_attributions"):
+ try:
+ raw_vectors = [r.token_importance_total.detach().cpu() for r in raw_ifr.raw_attributions]
+ except Exception:
+ raw_vectors = []
+ debug_info["raw_hop_vectors"] = raw_vectors
+
+ return result, sink, thinking, debug_info
+
+
+def run_ifr_in_all_gen(
+ example: ds_utils.CachedExample,
+ model: Any,
+ tokenizer: Any,
+ *,
+ n_hops: int,
+ sink_span: Optional[Sequence[int]],
+ thinking_span: Optional[Sequence[int]],
+ chunk_tokens: int,
+ sink_chunk_tokens: int,
+) -> Tuple[Any, Optional[Tuple[int, int]], Optional[Tuple[int, int]], Dict[str, Any]]:
+ """Execute experimental IFR variant: multi-hop over all generation (CoT + output)."""
+
+ import ft_ifr_improve
+
+ attr = ft_ifr_improve.LLMIFRAttributionInAllGen(
+ model,
+ tokenizer,
+ chunk_tokens=chunk_tokens,
+ sink_chunk_tokens=sink_chunk_tokens,
+ )
+
+ sink = tuple(sink_span) if sink_span is not None else tuple(example.sink_span) if example.sink_span else None
+ thinking = (
+ tuple(thinking_span)
+ if thinking_span is not None
+ else tuple(example.thinking_span) if example.thinking_span else None
+ )
+
+ result = attr.calculate_ifr_in_all_gen(
+ example.prompt,
+ target=example.target,
+ sink_span=sink,
+ thinking_span=thinking,
+ n_hops=int(n_hops),
+ )
+
+ debug_info: Dict[str, Any] = {
+ "full_prompt_tokens": list(getattr(attr, "prompt_tokens", []) or []),
+ "generation_tokens": list(getattr(attr, "generation_tokens", []) or []),
+ "user_prompt_indices": list(getattr(attr, "user_prompt_indices", []) or []),
+ "chat_prompt_indices": list(getattr(attr, "chat_prompt_indices", []) or []),
+ "prompt_ids": getattr(attr, "prompt_ids", None).detach().cpu().tolist() if getattr(attr, "prompt_ids", None) is not None else None,
+ "generation_ids": getattr(attr, "generation_ids", None).detach().cpu().tolist() if getattr(attr, "generation_ids", None) is not None else None,
+ }
+
+ raw_vectors = []
+ if result.metadata and "ifr" in result.metadata:
+ raw_ifr = result.metadata["ifr"].get("raw")
+ if raw_ifr is not None and hasattr(raw_ifr, "raw_attributions"):
+ try:
+ raw_vectors = [r.token_importance_total.detach().cpu() for r in raw_ifr.raw_attributions]
+ except Exception:
+ raw_vectors = []
+ debug_info["raw_hop_vectors"] = raw_vectors
+
+ return result, sink, thinking, debug_info
+
+
+def make_output_stem(dataset_name: str, index: int, mode: str) -> str:
+ safe_name = dataset_name.replace("/", "_").replace(" ", "_")
+ prefix = {
+ "ft": "ft_case_",
+ "ft_improve": "ft_improve_case_",
+ "ifr": "ifr_case_",
+ "ifr_all_positions": "ifr_all_positions_case_",
+ "ifr_all_positions_output_only": "ifr_output_only_case_",
+ "attnlrp": "attnlrp_case_",
+ "ft_attnlrp": "ft_attnlrp_case_",
+ }.get(mode, f"{mode}_case_")
+ return f"{prefix}{safe_name}_idx{index}"
+
+
+def _decode_token_ids(tokenizer: Any, ids: Sequence[int]) -> List[str]:
+ """Decode each token id into a readable text piece (keeps special tokens)."""
+
+ pieces: List[str] = []
+ for tok_id in ids:
+ try:
+ pieces.append(
+ tokenizer.decode([int(tok_id)], skip_special_tokens=False, clean_up_tokenization_spaces=False)
+ )
+ except Exception:
+ pieces.append(str(tok_id))
+ return pieces
+
+
+def build_raw_tokens_from_ids(tokenizer: Any, prompt_ids: Optional[Sequence[int]], generation_ids: Optional[Sequence[int]]) -> List[str]:
+ if not prompt_ids:
+ prompt_ids = []
+ if not generation_ids:
+ generation_ids = []
+ return _decode_token_ids(tokenizer, prompt_ids) + _decode_token_ids(tokenizer, generation_ids)
+
+
+def build_trimmed_roles(tokens: Sequence[str], segments: Dict[str, Any]) -> List[str]:
+ """Assign role labels for trimmed tokens (prompt + generation)."""
+
+ roles = ["prompt" for _ in range(len(tokens))]
+ prompt_len_tokens = segments.get("prompt_len", 0)
+ for idx in range(prompt_len_tokens, len(tokens)):
+ roles[idx] = "gen"
+ thinking_span = segments.get("thinking_span")
+ sink_span = segments.get("sink_span")
+ if thinking_span is not None:
+ start = prompt_len_tokens + int(thinking_span[0])
+ end = prompt_len_tokens + int(thinking_span[1])
+ for i in range(start, min(len(tokens), end + 1)):
+ roles[i] = "think"
+ if sink_span is not None:
+ start = prompt_len_tokens + int(sink_span[0])
+ end = prompt_len_tokens + int(sink_span[1])
+ for i in range(start, min(len(tokens), end + 1)):
+ roles[i] = "output"
+ return roles
+
+
+def build_raw_roles(
+ tokens: Sequence[str],
+ prompt_len_full: int,
+ user_indices: Sequence[int],
+ template_indices: Sequence[int],
+ thinking_span_abs: Optional[Sequence[int]],
+ sink_span_abs: Optional[Sequence[int]],
+) -> List[str]:
+ """Assign role labels for raw tokens (template + user + generation)."""
+
+ roles = ["template" for _ in range(len(tokens))]
+ user_set = set(int(i) for i in user_indices)
+ tmpl_set = set(int(i) for i in template_indices)
+
+ for i in range(min(len(tokens), prompt_len_full)):
+ if i in user_set:
+ roles[i] = "user"
+ elif i in tmpl_set:
+ roles[i] = "template"
+ else:
+ roles[i] = "prompt"
+
+ for i in range(prompt_len_full, len(tokens)):
+ roles[i] = "gen"
+
+ if thinking_span_abs is not None:
+ start, end = int(thinking_span_abs[0]), int(thinking_span_abs[1])
+ for i in range(start, min(len(tokens), end + 1)):
+ roles[i] = "think"
+
+ if sink_span_abs is not None:
+ start, end = int(sink_span_abs[0]), int(sink_span_abs[1])
+ for i in range(start, min(len(tokens), end + 1)):
+ roles[i] = "output"
+
+ return roles
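The precedence in the role painters above is positional: later loops overwrite earlier labels, and spans are inclusive of their end index. A minimal sketch (hypothetical `paint_roles` helper, toy sizes):

```python
def paint_roles(n_tokens, prompt_len, user_idx, span_abs):
    # Later assignments win: prompt-region labels first, then "gen", then the span.
    roles = ["template"] * n_tokens
    for i in user_idx:
        roles[i] = "user"
    for i in range(prompt_len, n_tokens):
        roles[i] = "gen"
    if span_abs is not None:
        start, end = span_abs
        for i in range(start, min(n_tokens, end + 1)):  # end is inclusive
            roles[i] = "output"
    return roles

# 3 prompt tokens (user tokens at 1, 2), 3 generation tokens, sink span [4, 5].
print(paint_roles(6, 3, [1, 2], (4, 5)))
# → ['template', 'user', 'user', 'gen', 'output', 'output']
```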
+
+
+def extract_prompt_only_vectors(hop_vectors: Sequence[torch.Tensor], prompt_len: int) -> List[torch.Tensor]:
+ """Slice hop vectors down to user-prompt tokens only (no generation tokens)."""
+
+ if prompt_len < 0:
+ raise ValueError("prompt_len must be >= 0.")
+
+ out: List[torch.Tensor] = []
+ for vec in hop_vectors:
+ v = torch.as_tensor(vec, dtype=torch.float32).detach().cpu()
+ if int(v.numel()) < int(prompt_len):
+ raise ValueError(f"Hop vector too short for prompt-only slice: len={int(v.numel())} prompt_len={int(prompt_len)}.")
+ out.append(v[:prompt_len])
+ return out
+
+
+def _lift_trimmed_to_full(
+ trimmed: torch.Tensor,
+ *,
+ prompt_len_full: int,
+ gen_len: int,
+ user_prompt_indices: Sequence[int],
+) -> torch.Tensor:
+ """Lift a trimmed (user prompt + generation) vector into full token space with zeros for chat-template tokens."""
+
+ t = torch.as_tensor(trimmed, dtype=torch.float32).detach().cpu()
+ user_len = len(user_prompt_indices)
+ expected = int(user_len + gen_len)
+ if int(t.numel()) != expected:
+ raise ValueError(f"Trimmed vector length mismatch: got {int(t.numel())}, expected {expected}.")
+
+ total_len = int(prompt_len_full + gen_len)
+ full = torch.zeros((total_len,), dtype=torch.float32)
+ for j, abs_pos in enumerate(user_prompt_indices):
+ full[int(abs_pos)] = t[j]
+ full[int(prompt_len_full) : int(prompt_len_full + gen_len)] = t[user_len:]
+ return full
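The scatter performed by `_lift_trimmed_to_full` can be sketched without torch: user-prompt scores go back to their absolute slots, generation scores fill the contiguous tail, and chat-template slots stay zero (toy sizes, hypothetical helper):

```python
def lift_trimmed_to_full(trimmed, prompt_len_full, gen_len, user_prompt_indices):
    user_len = len(user_prompt_indices)
    assert len(trimmed) == user_len + gen_len
    full = [0.0] * (prompt_len_full + gen_len)
    for j, abs_pos in enumerate(user_prompt_indices):
        full[abs_pos] = trimmed[j]               # scatter user tokens to absolute slots
    full[prompt_len_full:] = trimmed[user_len:]  # generation tail is contiguous
    return full

# 5 full prompt tokens (template at 0 and 2), user tokens at 1, 3, 4; 2 generation tokens.
print(lift_trimmed_to_full([0.1, 0.2, 0.3, 0.5, 0.6], 5, 2, [1, 3, 4]))
# → [0.0, 0.1, 0.0, 0.2, 0.3, 0.5, 0.6]
```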
+
+
+def _postprocess_attnlrp_full_vector(
+ raw_full: torch.Tensor,
+ *,
+ prompt_len_full: int,
+ gen_len: int,
+ user_prompt_indices: Sequence[int],
+ neg_handling: str,
+ norm_mode: str,
+) -> torch.Tensor:
+ """Mirror FT-AttnLRP hop postprocessing while preserving stripped-token normalization.
+
+ The underlying AttnLRP implementation postprocesses the *stripped* vector (user prompt + generation):
+ - NaN->0, then neg_handling ('drop' or 'abs')
+ - if norm_mode=='norm': normalize by sum over stripped tokens
+
+ For the pre-trim full view (chat template + generation), we apply the same non-negativity transform
+ to the full vector and normalize using *only the stripped indices*, so overlapping token scores
+ match the trimmed vectors used by the evaluation/case-study hop outputs.
+ """
+
+ v = torch.as_tensor(raw_full, dtype=torch.float32).detach().cpu()
+ v = torch.nan_to_num(v, nan=0.0)
+
+ if neg_handling == "drop":
+ v = v.clamp(min=0.0)
+ elif neg_handling == "abs":
+ v = v.abs()
+ else:
+ raise ValueError(f"Unsupported neg_handling={neg_handling!r} (expected 'drop' or 'abs').")
+
+ ratio_enabled = norm_mode == "norm"
+ if not ratio_enabled:
+ return v
+
+ keep = list(int(i) for i in user_prompt_indices) + list(range(int(prompt_len_full), int(prompt_len_full + gen_len)))
+ if not keep:
+ return torch.zeros_like(v)
+
+ keep_idx = torch.as_tensor(keep, dtype=torch.long)
+ denom = float(v.index_select(0, keep_idx).sum().item())
+ if denom <= 0.0:
+ return torch.zeros_like(v)
+ return v / (denom + 1e-12)
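The stripped-index normalization at the end can be isolated into a small sketch: the denominator sums only the kept (user prompt + generation) positions, and a non-positive denominator zeroes the vector (hypothetical helper, illustration only):

```python
def normalize_over_kept(v, keep):
    # Denominator uses only the kept (stripped) indices, not the full vector.
    denom = sum(v[i] for i in keep)
    if denom <= 0.0:
        return [0.0] * len(v)
    return [x / denom for x in v]

scores = [1.0, 2.0, 3.0, 4.0]
print(normalize_over_kept(scores, [1, 3]))    # denominator is 2.0 + 4.0 = 6.0
print(normalize_over_kept([0.0, -1.0], [0]))  # non-positive denominator → zeros
```

One small difference from the code above: the real implementation also adds `1e-12` to the denominator for numerical safety.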
+
+
+def main() -> None:
+ args = parse_args()
+ device = resolve_device(args.cuda, args.cuda_num)
+ if torch.cuda.is_available():
+ visible = os.environ.get("CUDA_VISIBLE_DEVICES")
+ print(f"[info] CUDA_VISIBLE_DEVICES={visible!r} torch.cuda.device_count()={torch.cuda.device_count()} device={device}")
+
+ model_name = args.model_path if args.model_path is not None else args.model
+ # Align with exp/exp2: always use the shared fp16 loader.
+ model, tokenizer = load_model(model_name, device)
+
+ example, ds_name = load_example(args.dataset, args.index, Path(args.data_root))
+ mode = args.mode
+
+ sink_span: Optional[Tuple[int, int]] = None
+ thinking_span: Optional[Tuple[int, int]] = None
+ thinking_ratios: Optional[Sequence[float]] = None
+
+ prompt_tokens_trimmed: List[str] = []
+ generation_tokens_trimmed: List[str] = []
+ hop_vectors_trimmed: List[torch.Tensor] = []
+ hop_vectors_raw: List[torch.Tensor] = []
+ prompt_len_full: Optional[int] = None
+ user_prompt_indices: List[int] = []
+ chat_prompt_indices: List[int] = []
+ method_meta: Dict[str, Any] = {}
+ raw_prompt_ids: Optional[List[int]] = None
+ raw_generation_ids: Optional[List[int]] = None
+ attnlrp_raw_attributions: Optional[List[Any]] = None
+
+ if mode in ("ft", "ft_improve", "ft_split_hop", "ifr_in_all_gen"):
+ if mode == "ft":
+ attr_result, sink_span, thinking_span, debug_info = run_ft_multihop(
+ example,
+ model,
+ tokenizer,
+ n_hops=args.n_hops,
+ sink_span=args.sink_span,
+ thinking_span=args.thinking_span,
+ chunk_tokens=args.chunk_tokens,
+ sink_chunk_tokens=args.sink_chunk_tokens,
+ )
+ elif mode == "ft_improve":
+ attr_result, sink_span, thinking_span, debug_info = run_ft_multihop_improve(
+ example,
+ model,
+ tokenizer,
+ n_hops=args.n_hops,
+ sink_span=args.sink_span,
+ thinking_span=args.thinking_span,
+ chunk_tokens=args.chunk_tokens,
+ sink_chunk_tokens=args.sink_chunk_tokens,
+ )
+ elif mode == "ft_split_hop":
+ attr_result, sink_span, thinking_span, debug_info = run_ft_multihop_split_hop(
+ example,
+ model,
+ tokenizer,
+ n_hops=args.n_hops,
+ sink_span=args.sink_span,
+ thinking_span=args.thinking_span,
+ chunk_tokens=args.chunk_tokens,
+ sink_chunk_tokens=args.sink_chunk_tokens,
+ )
+ elif mode == "ifr_in_all_gen":
+ attr_result, sink_span, thinking_span, debug_info = run_ifr_in_all_gen(
+ example,
+ model,
+ tokenizer,
+ n_hops=args.n_hops,
+ sink_span=args.sink_span,
+ thinking_span=args.thinking_span,
+ chunk_tokens=args.chunk_tokens,
+ sink_chunk_tokens=args.sink_chunk_tokens,
+ )
+ else:
+ raise ValueError(f"Unsupported mode={mode}")
+ ifr_meta = (attr_result.metadata or {}).get("ifr") or {}
+ hop_vectors_trimmed = list(ifr_meta.get("per_hop_projected") or [])
+ if not hop_vectors_trimmed:
+ raise RuntimeError(f"No per-hop vectors found for {mode} mode.")
+
+ prompt_tokens_trimmed = list(attr_result.prompt_tokens)
+ generation_tokens_trimmed = list(attr_result.generation_tokens)
+ thinking_ratios = ifr_meta.get("thinking_ratios")
+
+ raw_prompt_ids = debug_info.get("prompt_ids")
+ if isinstance(raw_prompt_ids, list) and raw_prompt_ids and isinstance(raw_prompt_ids[0], list):
+ raw_prompt_ids = raw_prompt_ids[0]
+ raw_generation_ids = debug_info.get("generation_ids")
+ if isinstance(raw_generation_ids, list) and raw_generation_ids and isinstance(raw_generation_ids[0], list):
+ raw_generation_ids = raw_generation_ids[0]
+
+ user_prompt_indices = list(debug_info.get("user_prompt_indices") or [])
+ chat_prompt_indices = list(debug_info.get("chat_prompt_indices") or [])
+ prompt_len_full = len(raw_prompt_ids) if isinstance(raw_prompt_ids, list) else None
+
+ raw_vectors = debug_info.get("raw_hop_vectors") or []
+ hop_vectors_raw = [vec.detach().cpu() if hasattr(vec, "detach") else torch.as_tensor(vec) for vec in raw_vectors]
+ method_meta = {"ifr": analysis.sanitize_ifr_meta(ifr_meta)}
+
+ elif mode == "ifr":
+ # Standard IFR (single-hop span aggregate), with pre/post trim views.
+ attr = llm_attr.LLMIFRAttribution(
+ model,
+ tokenizer,
+ chunk_tokens=args.chunk_tokens,
+ sink_chunk_tokens=args.sink_chunk_tokens,
+ )
+        sink_span = (
+            tuple(args.sink_span)
+            if args.sink_span is not None
+            else tuple(example.sink_span) if example.sink_span else None
+        )
+        thinking_span = (
+            tuple(args.thinking_span)
+            if args.thinking_span is not None
+            else tuple(example.thinking_span) if example.thinking_span else sink_span
+        )
+
+ if sink_span is None:
+ raise ValueError("sink_span is required for IFR mode (use dataset sink_span or pass --sink_span).")
+ span_result = attr.calculate_ifr_span(
+ example.prompt,
+ target=example.target,
+ span=tuple(sink_span),
+ )
+ span_meta = span_result.metadata.get("ifr") if span_result.metadata else None
+ aggregate = span_meta.get("aggregate") if isinstance(span_meta, dict) else None
+ if aggregate is None or not hasattr(aggregate, "token_importance_total"):
+ raise RuntimeError("IFR span aggregate missing from metadata; cannot render pre-trim view.")
+
+ raw_vector = aggregate.token_importance_total.detach().cpu()
+ trimmed_vector = attr._project_vector(raw_vector)
+ hop_vectors_raw = [raw_vector]
+ hop_vectors_trimmed = [trimmed_vector]
+
+ prompt_tokens_trimmed = list(attr.user_prompt_tokens)
+ generation_tokens_trimmed = list(attr.generation_tokens)
+
+ raw_prompt_ids = attr.prompt_ids.detach().cpu().tolist()[0]
+ raw_generation_ids = attr.generation_ids.detach().cpu().tolist()[0]
+ user_prompt_indices = list(getattr(attr, "user_prompt_indices", []) or [])
+ chat_prompt_indices = list(getattr(attr, "chat_prompt_indices", []) or [])
+ prompt_len_full = len(raw_prompt_ids)
+
+ sink_abs = (prompt_len_full + sink_span[0], prompt_len_full + sink_span[1])
+ think_abs = (prompt_len_full + thinking_span[0], prompt_len_full + thinking_span[1]) if thinking_span else None
+
+ meta = {
+ "type": "span_aggregate",
+ "ifr_view": "aggregate",
+ "sink_span_generation": sink_span,
+ "sink_span_absolute": sink_abs,
+ "thinking_span_generation": thinking_span,
+ "thinking_span_absolute": think_abs,
+ }
+ method_meta = {"ifr": analysis.tensor_to_list(meta)}
+
+ elif mode == "ifr_all_positions_output_only":
+ # IFR all-positions (output-only) + token-level CAGE (row/recursive) derived from the matrix.
+ attr = llm_attr.LLMIFRAttribution(
+ model,
+ tokenizer,
+ chunk_tokens=args.chunk_tokens,
+ sink_chunk_tokens=args.sink_chunk_tokens,
+ )
+        sink_span = (
+            tuple(args.sink_span)
+            if args.sink_span is not None
+            else tuple(example.sink_span) if example.sink_span else None
+        )
+        thinking_span = (
+            tuple(args.thinking_span)
+            if args.thinking_span is not None
+            else tuple(example.thinking_span) if example.thinking_span else sink_span
+        )
+
+ if sink_span is None:
+ raise ValueError(
+ "sink_span is required for ifr_all_positions_output_only mode "
+ "(use dataset sink_span or pass --sink_span)."
+ )
+
+ attr_result = attr.calculate_ifr_for_all_positions_output_only(
+ example.prompt,
+ target=example.target,
+ sink_span=tuple(sink_span),
+ )
+
+ indices_to_explain = list(sink_span)
+ _, row_attr, rec_attr = attr_result.get_all_token_attrs(indices_to_explain)
+ row_vec = row_attr.squeeze(0).detach().cpu()
+ rec_vec = rec_attr.squeeze(0).detach().cpu()
+
+ hop_vectors_trimmed = [row_vec, rec_vec]
+
+ prompt_tokens_trimmed = list(attr.user_prompt_tokens)
+ generation_tokens_trimmed = list(attr.generation_tokens)
+
+ raw_prompt_ids = attr.prompt_ids.detach().cpu().tolist()[0]
+ raw_generation_ids = attr.generation_ids.detach().cpu().tolist()[0]
+ user_prompt_indices = list(getattr(attr, "user_prompt_indices", []) or [])
+ chat_prompt_indices = list(getattr(attr, "chat_prompt_indices", []) or [])
+ prompt_len_full = len(raw_prompt_ids)
+
+ gen_len = len(raw_generation_ids or [])
+ hop_vectors_raw = [
+ _lift_trimmed_to_full(
+ v,
+ prompt_len_full=int(prompt_len_full or 0),
+ gen_len=gen_len,
+ user_prompt_indices=user_prompt_indices,
+ )
+ for v in hop_vectors_trimmed
+ ]
+
+ ifr_meta = dict((attr_result.metadata or {}).get("ifr") or {})
+ ifr_meta["ifr_view"] = "all_positions_output_only (row+rec)"
+ ifr_meta["panel_titles"] = ["Row attribution", "Recursive attribution (CAGE)"]
+ ifr_meta["indices_to_explain"] = indices_to_explain
+ method_meta = {"ifr": analysis.tensor_to_list(ifr_meta)}
+
+ elif mode == "ifr_all_positions":
+ # IFR all-positions (full generation) + token-level CAGE (row/recursive) derived from the matrix.
+ attr = llm_attr.LLMIFRAttribution(
+ model,
+ tokenizer,
+ chunk_tokens=args.chunk_tokens,
+ sink_chunk_tokens=args.sink_chunk_tokens,
+ )
+        sink_span = (
+            tuple(args.sink_span)
+            if args.sink_span is not None
+            else tuple(example.sink_span) if example.sink_span else None
+        )
+        thinking_span = (
+            tuple(args.thinking_span)
+            if args.thinking_span is not None
+            else tuple(example.thinking_span) if example.thinking_span else sink_span
+        )
+
+ if sink_span is None:
+ raise ValueError(
+ "sink_span is required for ifr_all_positions mode (use dataset sink_span or pass --sink_span)."
+ )
+
+ attr_result = attr.calculate_ifr_for_all_positions(
+ example.prompt,
+ target=example.target,
+ )
+
+ indices_to_explain = list(sink_span)
+ _, row_attr, rec_attr = attr_result.get_all_token_attrs(indices_to_explain)
+ row_vec = row_attr.squeeze(0).detach().cpu()
+ rec_vec = rec_attr.squeeze(0).detach().cpu()
+
+ hop_vectors_trimmed = [row_vec, rec_vec]
+
+ prompt_tokens_trimmed = list(attr.user_prompt_tokens)
+ generation_tokens_trimmed = list(attr.generation_tokens)
+
+ raw_prompt_ids = attr.prompt_ids.detach().cpu().tolist()[0]
+ raw_generation_ids = attr.generation_ids.detach().cpu().tolist()[0]
+ user_prompt_indices = list(getattr(attr, "user_prompt_indices", []) or [])
+ chat_prompt_indices = list(getattr(attr, "chat_prompt_indices", []) or [])
+ prompt_len_full = len(raw_prompt_ids)
+
+ gen_len = len(raw_generation_ids or [])
+ hop_vectors_raw = [
+ _lift_trimmed_to_full(
+ v,
+ prompt_len_full=int(prompt_len_full or 0),
+ gen_len=gen_len,
+ user_prompt_indices=user_prompt_indices,
+ )
+ for v in hop_vectors_trimmed
+ ]
+
+ ifr_meta = dict((attr_result.metadata or {}).get("ifr") or {})
+ ifr_meta["ifr_view"] = "all_positions (row+rec)"
+ ifr_meta["panel_titles"] = ["Row attribution", "Recursive attribution (CAGE)"]
+ ifr_meta["indices_to_explain"] = indices_to_explain
+ method_meta = {"ifr": analysis.tensor_to_list(ifr_meta)}
+
+ elif mode in ("attnlrp", "ft_attnlrp"):
+ # Reuse the shared LLMLRPAttribution implementations (root-level).
+ attributor = llm_attr.LLMLRPAttribution(model, tokenizer)
+
+        sink_span = (
+            tuple(args.sink_span)
+            if args.sink_span is not None
+            else tuple(example.sink_span) if example.sink_span else None
+        )
+ thinking_span = (
+ tuple(args.thinking_span)
+ if args.thinking_span is not None
+ else tuple(example.thinking_span) if example.thinking_span else sink_span
+ )
+
+ if mode == "attnlrp":
+ # Case-study AttnLRP: reuse FT-AttnLRP logic but take hop0 (the first span-aggregate)
+ # for a full, signed attribution vector (no observation masking).
+ attr_result = attributor.calculate_attnlrp_ft_hop0(
+ example.prompt,
+ target=example.target,
+ sink_span=sink_span,
+ thinking_span=thinking_span,
+ neg_handling=args.attnlrp_neg_handling,
+ norm_mode=args.attnlrp_norm_mode,
+ )
+ meta = attr_result.metadata or {}
+ multi_hop = meta.get("multi_hop_result")
+ raw_attributions = getattr(multi_hop, "raw_attributions", None) or []
+ attnlrp_raw_attributions = list(raw_attributions)
+ base_attr = raw_attributions[0] if raw_attributions else None
+ if base_attr is None or not hasattr(base_attr, "token_importance_total"):
+ raise RuntimeError("AttnLRP hop0 missing from multi-hop result.")
+
+ hop0_vec = torch.as_tensor(getattr(base_attr, "token_importance_total"), dtype=torch.float32).detach().cpu()
+ if hop0_vec.numel() <= 0:
+ raise RuntimeError("Empty generation for AttnLRP case study.")
+
+ # Use the actual sink span applied by hop0 (defaults to full generation when unset).
+ sink_span = tuple(getattr(base_attr, "sink_range"))
+ if thinking_span is None:
+ thinking_span = sink_span
+
+ hop_vectors_trimmed = [hop0_vec]
+ thinking_ratios = list(getattr(multi_hop, "thinking_ratios", []) or [])
+
+ method_meta = {
+ "attnlrp": {
+                "type": "calculate_attnlrp_ft_hop0 (hop0 raw_attributions[0])",
+ "sink_span_generation": sink_span,
+ "thinking_span_generation": thinking_span,
+ "thinking_ratios": thinking_ratios,
+ "neg_handling": args.attnlrp_neg_handling,
+ "norm_mode": args.attnlrp_norm_mode,
+ "ratio_enabled": args.attnlrp_norm_mode == "norm",
+ }
+ }
+ else:
+ # exp2 ft_attnlrp: multi-hop aggregated AttnLRP (metadata contains per-hop vectors).
+ attr_result = attributor.calculate_attnlrp_aggregated_multi_hop(
+ example.prompt,
+ target=example.target,
+ sink_span=sink_span,
+ thinking_span=thinking_span,
+ n_hops=int(args.n_hops),
+ neg_handling=args.attnlrp_neg_handling,
+ norm_mode=args.attnlrp_norm_mode,
+ )
+ meta = attr_result.metadata or {}
+ multi_hop = meta.get("multi_hop_result")
+ if multi_hop is None:
+ raise RuntimeError("FT-AttnLRP case study missing metadata.multi_hop_result.")
+
+ raw_attributions = getattr(multi_hop, "raw_attributions", None) or []
+ attnlrp_raw_attributions = list(raw_attributions)
+ hop_vectors_trimmed = [
+ torch.as_tensor(getattr(hop, "token_importance_total"), dtype=torch.float32).detach().cpu()
+ for hop in raw_attributions
+ ]
+ thinking_ratios = list(getattr(multi_hop, "thinking_ratios", []) or [])
+
+ method_meta = {
+ "attnlrp": {
+ "type": "calculate_attnlrp_aggregated_multi_hop (exp2 ft_attnlrp)",
+ "n_hops": int(args.n_hops),
+ "sink_span_generation": sink_span,
+ "thinking_span_generation": thinking_span,
+ "thinking_ratios": thinking_ratios,
+ "neg_handling": args.attnlrp_neg_handling,
+ "norm_mode": args.attnlrp_norm_mode,
+ "ratio_enabled": args.attnlrp_norm_mode == "norm",
+ }
+ }
+
+ prompt_tokens_trimmed = list(attributor.user_prompt_tokens)
+ generation_tokens_trimmed = list(attributor.generation_tokens)
+
+ raw_prompt_ids = attributor.prompt_ids.detach().cpu().tolist()[0]
+ raw_generation_ids = attributor.generation_ids.detach().cpu().tolist()[0]
+ user_prompt_indices = list(getattr(attributor, "user_prompt_indices", []) or [])
+ chat_prompt_indices = list(getattr(attributor, "chat_prompt_indices", []) or [])
+ prompt_len_full = len(raw_prompt_ids)
+
+ else:
+ raise ValueError(f"Unsupported mode={mode}")
+
+ if not hop_vectors_trimmed:
+ raise RuntimeError("No hop vectors to visualize.")
+
+ raw_tokens = build_raw_tokens_from_ids(tokenizer, raw_prompt_ids, raw_generation_ids)
+
+ sink_span_abs = None
+ thinking_span_abs = None
+ if prompt_len_full is not None and sink_span is not None:
+ sink_span_abs = (prompt_len_full + sink_span[0], prompt_len_full + sink_span[1])
+ if prompt_len_full is not None and thinking_span is not None:
+ thinking_span_abs = (prompt_len_full + thinking_span[0], prompt_len_full + thinking_span[1])
+ prompt_len_full_safe = int(prompt_len_full or 0)
+ roles_raw = build_raw_roles(
+ raw_tokens,
+ prompt_len_full_safe,
+ user_prompt_indices,
+ chat_prompt_indices,
+ thinking_span_abs,
+ sink_span_abs,
+ )
+
+ prompt_tokens_only = list(prompt_tokens_trimmed)
+ prompt_only_vectors = extract_prompt_only_vectors(hop_vectors_trimmed, len(prompt_tokens_only))
+
+ # Ensure every method has a pre-trim full vector per panel.
+ if not hop_vectors_raw:
+ if mode in ("attnlrp", "ft_attnlrp") and attnlrp_raw_attributions is not None:
+ gen_len = len(raw_generation_ids or [])
+ expected = int((prompt_len_full_safe + gen_len) if prompt_len_full is not None else 0)
+ full_vectors: List[torch.Tensor] = []
+ for hop in attnlrp_raw_attributions:
+ meta = getattr(hop, "metadata", None) or {}
+ raw_full = meta.get("token_importance_total_with_chat_template")
+ if raw_full is None:
+ full_vectors = []
+ break
+ v = _postprocess_attnlrp_full_vector(
+ torch.as_tensor(raw_full, dtype=torch.float32),
+ prompt_len_full=prompt_len_full_safe,
+ gen_len=gen_len,
+ user_prompt_indices=user_prompt_indices,
+ neg_handling=args.attnlrp_neg_handling,
+ norm_mode=args.attnlrp_norm_mode,
+ )
+ if expected and int(v.numel()) != expected:
+ raise RuntimeError(
+ "AttnLRP full-vector length mismatch for pre-trim view: "
+ f"got {int(v.numel())}, expected {expected}."
+ )
+ full_vectors.append(v)
+ hop_vectors_raw = full_vectors
+
+ if not hop_vectors_raw and prompt_len_full is not None:
+ # Fallback: lift trimmed vectors back to full token space with zeros for template tokens.
+ gen_len = len(raw_generation_ids or [])
+ hop_vectors_raw = [
+ _lift_trimmed_to_full(
+ v,
+ prompt_len_full=prompt_len_full_safe,
+ gen_len=gen_len,
+ user_prompt_indices=user_prompt_indices,
+ )
+ for v in hop_vectors_trimmed
+ ]
+
+ if not hop_vectors_raw:
+ raise RuntimeError("Missing pre-trim vectors; cannot render required full-sequence heatmap.")
+
+ # Lightweight debug stats to catch silent all-zero / NaN cases.
+ hop_stats_raw = [analysis.vector_stats(torch.nan_to_num(v.detach().cpu(), nan=0.0)) for v in hop_vectors_raw]
+ hop_stats_prompt = [analysis.vector_stats(torch.nan_to_num(v.detach().cpu(), nan=0.0)) for v in prompt_only_vectors]
+ for i in range(max(len(hop_stats_raw), len(hop_stats_prompt))):
+ raw_abs = hop_stats_raw[i]["abs_max"] if i < len(hop_stats_raw) else None
+ prompt_abs = hop_stats_prompt[i]["abs_max"] if i < len(hop_stats_prompt) else None
+ print(f"[stats] panel {i}: raw_abs_max={raw_abs} prompt_abs_max={prompt_abs}")
+
+ hop_token_raw = analysis.package_token_hops(hop_vectors_raw)
+ hop_token_prompt = analysis.package_token_hops(prompt_only_vectors)
+
+ case_meta: Dict[str, Any] = {
+ "dataset": ds_name,
+ "index": args.index,
+ "sink_span": sink_span,
+ "thinking_span": thinking_span,
+ "n_hops": args.n_hops,
+ "thinking_ratios": thinking_ratios,
+ "mode": mode,
+ "ifr_view": method_meta.get("ifr", {}).get("ifr_view") if isinstance(method_meta.get("ifr"), dict) else None,
+ "panel_titles": method_meta.get("ifr", {}).get("panel_titles") if isinstance(method_meta.get("ifr"), dict) else None,
+ "attnlrp_neg_handling": args.attnlrp_neg_handling if mode in ("attnlrp", "ft_attnlrp") else None,
+ "attnlrp_norm_mode": args.attnlrp_norm_mode if mode in ("attnlrp", "ft_attnlrp") else None,
+ "attnlrp_ratio_enabled": (args.attnlrp_norm_mode == "norm") if mode in ("attnlrp", "ft_attnlrp") else None,
+ "vector_stats_raw": hop_stats_raw,
+ "vector_stats_prompt": hop_stats_prompt,
+ }
+
+    generation_text = "".join(generation_tokens_trimmed)
+ prompt_text = example.prompt
+ record = {
+ "meta": case_meta,
+ "prompt": prompt_text,
+ "target": example.target,
+ "generation": generation_text,
+ "full_all_tokens": raw_tokens,
+ "raw_token_roles": roles_raw,
+ "prompt_tokens": prompt_tokens_only,
+ "prompt_token_roles": ["user" for _ in range(len(prompt_tokens_only))],
+ "token_hops_raw": hop_token_raw,
+ "token_hops_prompt": hop_token_prompt,
+ "ifr_meta": method_meta.get("ifr"),
+ "attnlrp_meta": method_meta.get("attnlrp"),
+ }
+
+ out_dir = Path(args.output_dir)
+ out_dir.mkdir(parents=True, exist_ok=True)
+ stem = make_output_stem(ds_name, args.index, mode)
+ json_path = out_dir / f"{stem}.json"
+ html_path = out_dir / f"{stem}.html"
+
+ with json_path.open("w", encoding="utf-8") as f:
+ json.dump(record, f, ensure_ascii=False, indent=2)
+
+ html = viz.render_case_html(
+ case_meta,
+ token_view_raw={
+ "label": "Pre-trim token-level heatmap (full sequence with chat template)",
+ "tokens": raw_tokens,
+ "roles": roles_raw,
+ "hops": hop_token_raw,
+ },
+ token_view_prompt={
+ "label": "Prompt-only token-level heatmap (user prompt only)",
+ "tokens": prompt_tokens_only,
+ "roles": ["user" for _ in range(len(prompt_tokens_only))],
+ "hops": hop_token_prompt,
+ },
+ )
+ html_path.write_text(html, encoding="utf-8")
+
+ print(f"[done] wrote {json_path}")
+ print(f"[done] wrote {html_path}")
+
+
+if __name__ == "__main__":
+ main()
diff --git a/exp/case_study/run_mas_case.py b/exp/case_study/run_mas_case.py
new file mode 100644
index 0000000000000000000000000000000000000000..020434830178d612d969cc60caff2a5c15317198
--- /dev/null
+++ b/exp/case_study/run_mas_case.py
@@ -0,0 +1,805 @@
+#!/usr/bin/env python3
+"""MAS case study: visualize token-perturbation faithfulness for attribution methods.
+
+This script matches the faithfulness evaluation logic implemented in:
+ - evaluations/faithfulness.py
+ - llm_attr_eval.LLMAttributionEvaluator.faithfulness_test()
+
+For a single example and a selected attribution method, we:
+ 1) Compute token-level attributions (Seq / Row / Recursive) over prompt tokens.
+ 2) Rank prompt tokens by attribution mass.
+ 3) Iteratively perturb the prompt by replacing one token at a time with PAD tokens.
+ 4) Score the model as sum log p(generation + EOS | prompt) under the chat template.
+ 5) Compute RISE / MAS / RISE+AP (AUCs) and visualize the perturbation impact as token heatmaps.
+
+Outputs JSON + HTML to exp/case_study/out/.
+"""
+
+from __future__ import annotations
+
+import argparse
+import json
+import os
+import sys
+import types
+from importlib.machinery import ModuleSpec
+from pathlib import Path
+from typing import Any, Dict, List, Optional, Sequence, Tuple
+
+import numpy as np
+
+
+def _early_set_cuda_visible_devices() -> None:
+ """Set CUDA_VISIBLE_DEVICES before importing torch/transformers.
+
+ Note: CUDA device indices are re-mapped inside the process after applying the mask.
+ """
+
+ parser = argparse.ArgumentParser(add_help=False)
+ parser.add_argument("--cuda", type=str, default=None)
+ args, _ = parser.parse_known_args(sys.argv[1:])
+ cuda = args.cuda.strip() if isinstance(args.cuda, str) else ""
+ if cuda and "," in cuda:
+ os.environ["CUDA_VISIBLE_DEVICES"] = cuda
+
+
+if __name__ == "__main__":
+ _early_set_cuda_visible_devices()
+
+import torch
+
+REPO_ROOT = Path(__file__).resolve().parents[2]
+if str(REPO_ROOT) not in sys.path:
+ sys.path.insert(0, str(REPO_ROOT))
+
+# Avoid optional vision deps when importing transformers.
+os.environ.setdefault("TRANSFORMERS_NO_TORCHVISION", "1")
+os.environ.setdefault("DISABLE_TRANSFORMERS_IMAGE_TRANSFORMS", "1")
+
+
+def _stub_torchvision() -> None:
+ """Provide minimal torchvision stubs so transformers imports succeed without torchvision."""
+
+ if "torchvision" in sys.modules:
+ return
+
+ def _mk(name: str) -> types.ModuleType:
+ mod = types.ModuleType(name)
+ mod.__spec__ = ModuleSpec(name, loader=None)
+ return mod
+
+ tv = _mk("torchvision")
+ tv.__dict__["__path__"] = []
+ submods = ["transforms", "_meta_registrations", "datasets", "io", "models", "ops", "utils"]
+ for name in submods:
+ mod = _mk(f"torchvision.{name}")
+ sys.modules[f"torchvision.{name}"] = mod
+ setattr(tv, name, mod)
+
+ class _InterpolationMode:
+ NEAREST = 0
+ NEAREST_EXACT = 0
+ BILINEAR = 1
+ BICUBIC = 2
+ LANCZOS = 3
+ BOX = 4
+ HAMMING = 5
+
+ sys.modules["torchvision.transforms"].InterpolationMode = _InterpolationMode
+ sys.modules["torchvision.transforms"].__all__ = ["InterpolationMode"]
+
+ ops_mod = sys.modules.get("torchvision.ops") or _mk("torchvision.ops")
+ sys.modules["torchvision.ops"] = ops_mod
+ setattr(tv, "ops", ops_mod)
+ misc_mod = _mk("torchvision.ops.misc")
+ sys.modules["torchvision.ops.misc"] = misc_mod
+ setattr(ops_mod, "misc", misc_mod)
+
+ class _FrozenBatchNorm2d:
+ def __init__(self, *args, **kwargs):
+ pass
+
+ misc_mod.FrozenBatchNorm2d = _FrozenBatchNorm2d
+ sys.modules["torchvision"] = tv
+
+
+def _stub_timm() -> None:
+ """Provide minimal timm stubs to avoid optional vision deps."""
+
+ if "timm" in sys.modules:
+ return
+
+ def _mk(name: str) -> types.ModuleType:
+ mod = types.ModuleType(name)
+ mod.__spec__ = ModuleSpec(name, loader=None)
+ return mod
+
+ timm = _mk("timm")
+ timm.__dict__["__path__"] = []
+ sys.modules["timm"] = timm
+
+ data_mod = _mk("timm.data")
+ sys.modules["timm.data"] = data_mod
+ timm.data = data_mod
+
+ class _ImageNetInfo:
+ pass
+
+ def _infer_imagenet_subset(*args, **kwargs):
+ return None
+
+ data_mod.ImageNetInfo = _ImageNetInfo
+ data_mod.infer_imagenet_subset = _infer_imagenet_subset
+
+ layers_mod = _mk("timm.layers")
+ sys.modules["timm.layers"] = layers_mod
+ timm.layers = layers_mod
+
+ create_norm_mod = _mk("timm.layers.create_norm")
+ sys.modules["timm.layers.create_norm"] = create_norm_mod
+ layers_mod.create_norm = create_norm_mod
+
+ def _get_norm_layer(*args, **kwargs):
+ return None
+
+ create_norm_mod.get_norm_layer = _get_norm_layer
+
+ classifier_mod = _mk("timm.layers.classifier")
+ sys.modules["timm.layers.classifier"] = classifier_mod
+ layers_mod.classifier = classifier_mod
+
+
+def _stub_gemma3n() -> None:
+ """Stub Gemma3n config module if transformers tries to import it."""
+
+ if "transformers.models.gemma3n.configuration_gemma3n" in sys.modules:
+ return
+
+ gemma_pkg = types.ModuleType("transformers.models.gemma3n")
+ gemma_pkg.__spec__ = ModuleSpec("transformers.models.gemma3n", loader=None, is_package=True)
+ sys.modules["transformers.models.gemma3n"] = gemma_pkg
+
+ gemma_conf = types.ModuleType("transformers.models.gemma3n.configuration_gemma3n")
+ gemma_conf.__spec__ = ModuleSpec("transformers.models.gemma3n.configuration_gemma3n", loader=None)
+
+ class Gemma3nConfig:
+ def __init__(self, *args, **kwargs):
+ self.model_type = "gemma3n"
+
+ class Gemma3nTextConfig(Gemma3nConfig):
+ pass
+
+ gemma_conf.Gemma3nConfig = Gemma3nConfig
+ gemma_conf.Gemma3nTextConfig = Gemma3nTextConfig
+ gemma_conf.__all__ = ["Gemma3nConfig", "Gemma3nTextConfig"]
+ sys.modules["transformers.models.gemma3n.configuration_gemma3n"] = gemma_conf
+ setattr(gemma_pkg, "configuration_gemma3n", gemma_conf)
+
+
+_stub_torchvision()
+_stub_timm()
+_stub_gemma3n()
+
+import transformers # noqa: E402
+
+# Provide light stubs if Longformer classes are unavailable; we don't use them here.
+if not hasattr(transformers, "LongformerTokenizer"):
+ class _DummyLongformerTokenizer:
+ def __init__(self, *args, **kwargs):
+ raise ImportError("LongformerTokenizer stubbed; install full transformers if needed.")
+ transformers.LongformerTokenizer = _DummyLongformerTokenizer
+if not hasattr(transformers, "LongformerForMaskedLM"):
+ class _DummyLongformerForMaskedLM:
+ def __init__(self, *args, **kwargs):
+ raise ImportError("LongformerForMaskedLM stubbed; install full transformers if needed.")
+ transformers.LongformerForMaskedLM = _DummyLongformerForMaskedLM
+
+from exp.case_study import viz # noqa: E402
+from exp.exp2 import dataset_utils as ds_utils # noqa: E402
+from shared_utils import DEFAULT_PROMPT_TEMPLATE # noqa: E402
+
+import llm_attr # noqa: E402
+from evaluations.attribution_recovery import load_model # noqa: E402
+
+
+def resolve_device(cuda: Optional[str], cuda_num: int) -> str:
+ if cuda and isinstance(cuda, str) and "," in cuda:
+ os.environ["CUDA_VISIBLE_DEVICES"] = cuda
+ return "auto"
+ if cuda and isinstance(cuda, str) and cuda.strip():
+ try:
+ idx = int(cuda)
+ except Exception:
+ idx = 0
+ return f"cuda:{idx}" if torch.cuda.is_available() else "cpu"
+ return f"cuda:{cuda_num}" if torch.cuda.is_available() else "cpu"
+
+
+def load_example(dataset: str, index: int, data_root: Path) -> Tuple[ds_utils.CachedExample, str]:
+ ds_path = Path(dataset)
+ if ds_path.exists():
+ examples = ds_utils.read_cached_jsonl(ds_path)
+ dataset_name = ds_path.name
+ else:
+ loader = ds_utils.DatasetLoader(data_root=data_root)
+ examples = loader.load(dataset)
+ dataset_name = dataset
+
+ if not examples:
+ raise ValueError(f"No examples found for dataset={dataset}")
+
+ if index < 0:
+ index = len(examples) + index
+ if not (0 <= index < len(examples)):
+ raise IndexError(f"index {index} out of range for dataset with {len(examples)} examples")
+
+ return examples[index], dataset_name
+
+
+def make_output_stem(dataset_name: str, index: int, method: str) -> str:
+ safe_name = dataset_name.replace("/", "_").replace(" ", "_")
+ return f"mas_case_{method}_{safe_name}_idx{index}"
+
+
+def format_prompt(tokenizer: Any, prompt: str) -> str:
+ modified_prompt = DEFAULT_PROMPT_TEMPLATE.format(context=prompt, query="")
+ formatted_prompt = [{"role": "user", "content": modified_prompt}]
+ return tokenizer.apply_chat_template(
+ formatted_prompt,
+ tokenize=False,
+ add_generation_prompt=True,
+ enable_thinking=False,
+ )
+
+
+@torch.inference_mode()
+def compute_logprob_response_given_prompt(model: Any, prompt_ids: torch.Tensor, response_ids: torch.Tensor) -> torch.Tensor:
+ """Compute log-probabilities of response_ids given prompt_ids.
+
+ Shapes:
+ prompt_ids: [B, N]
+ response_ids: [B, M]
+ returns: [B, M]
+ """
+ input_ids = torch.cat([prompt_ids, response_ids], dim=1)
+ attention_mask = torch.ones_like(input_ids)
+ logits = model(input_ids=input_ids, attention_mask=attention_mask).logits # [B, N+M, V]
+ log_probs = torch.nn.functional.log_softmax(logits, dim=-1)
+
+ response_start = int(prompt_ids.shape[1])
+    log_probs_for_response = log_probs[:, response_start - 1 : -1, :]  # [B, M, V]
+    gathered = log_probs_for_response.gather(2, response_ids.unsqueeze(-1))
+    return gathered.squeeze(-1)
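The off-by-one slicing above (the logit at position `t` predicts token `t+1`) can be sanity-checked on synthetic log-probabilities, with no model involved; the toy sizes `B`, `N`, `M`, `V` below are arbitrary:

```python
import torch

# Toy shapes: batch B=2, prompt length N=5, response length M=3, vocab V=7.
B, N, M, V = 2, 5, 3, 7
log_probs = torch.log_softmax(torch.randn(B, N + M, V), dim=-1)
response_ids = torch.randint(0, V, (B, M))

# Vectorized: log-probs at positions N-1 .. N+M-2 predict response tokens 0 .. M-1.
fast = log_probs[:, N - 1 : -1, :].gather(2, response_ids.unsqueeze(-1)).squeeze(-1)

# Naive reference: look each response token up one position earlier.
slow = torch.stack(
    [
        torch.stack([log_probs[b, N - 1 + t, response_ids[b, t]] for t in range(M)])
        for b in range(B)
    ]
)
assert fast.shape == (B, M)
assert torch.allclose(fast, slow)
```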
+
+
+@torch.inference_mode()
+def score_prompt_ids_with_generation(model: Any, *, prompt_ids: torch.Tensor, generation_ids: torch.Tensor) -> float:
+ return float(compute_logprob_response_given_prompt(model, prompt_ids, generation_ids).sum().detach().cpu().item())
+
+
+def _ensure_pad_token_id(tokenizer: Any) -> int:
+ if tokenizer.pad_token_id is None:
+ if tokenizer.eos_token_id is None:
+ raise RuntimeError("tokenizer has neither pad_token_id nor eos_token_id; cannot define baseline token.")
+ tokenizer.pad_token = tokenizer.eos_token
+ return int(tokenizer.pad_token_id)
+
+
+def _find_subsequence_start(haystack: torch.Tensor, needle: torch.Tensor) -> Optional[int]:
+ if haystack.ndim != 1 or needle.ndim != 1:
+ raise ValueError("Expected 1D tensors for subsequence matching.")
+ if needle.numel() == 0:
+ return 0
+ hay_len = int(haystack.numel())
+ needle_len = int(needle.numel())
+ if needle_len > hay_len:
+ return None
+ for i in range(hay_len - needle_len + 1):
+ if torch.equal(haystack[i : i + needle_len], needle):
+ return i
+ return None
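`_find_subsequence_start` is a plain sliding-window scan; a standalone sketch (logic re-declared here so the snippet runs on its own) shows the edge cases — an empty needle matches at 0, an absent needle returns `None`:

```python
import torch
from typing import Optional

def find_subseq(haystack: torch.Tensor, needle: torch.Tensor) -> Optional[int]:
    # Sliding-window scan, same logic as _find_subsequence_start.
    if needle.numel() == 0:
        return 0
    for i in range(haystack.numel() - needle.numel() + 1):
        if torch.equal(haystack[i : i + needle.numel()], needle):
            return i
    return None

hay = torch.tensor([5, 1, 2, 3, 9])
assert find_subseq(hay, torch.tensor([2, 3])) == 2                  # first match index
assert find_subseq(hay, torch.tensor([3, 1])) is None               # order matters
assert find_subseq(hay, torch.tensor([], dtype=torch.long)) == 0    # empty needle
```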
+
+
+def decode_text_into_tokens(tokenizer: Any, text: str) -> List[str]:
+ encoding = tokenizer(text, return_offsets_mapping=True, add_special_tokens=False)
+ offsets = list(encoding["offset_mapping"])
+ tokens: List[str] = []
+ for start, end in offsets:
+ tokens.append(text[start:end])
+ return tokens
+
+
+def auc(arr: np.ndarray) -> float:
+ return float((arr.sum() - arr[0] / 2 - arr[-1] / 2) / (arr.shape[0] - 1))
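`auc` above is the trapezoidal rule over unit-spaced steps, normalized by the span `n-1` so the result stays in the same range as the scores; a quick numeric check against the explicit pairwise-average form:

```python
import numpy as np

arr = np.array([1.0, 0.8, 0.5, 0.0])

# Closed form used by auc(): subtract half of each endpoint, divide by n-1.
manual = float((arr.sum() - arr[0] / 2 - arr[-1] / 2) / (arr.shape[0] - 1))

# Explicit trapezoid: average adjacent pairs, then normalize by n-1.
pairwise = sum((arr[i] + arr[i + 1]) / 2 for i in range(len(arr) - 1))
assert abs(manual - pairwise / (len(arr) - 1)) < 1e-12
assert abs(manual - 0.6) < 1e-12
```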
+
+
+def mas_trace(
+ model: Any,
+ tokenizer: Any,
+ *,
+ attribution: torch.Tensor,
+ prompt: str,
+ generation: str,
+ user_prompt_indices: Optional[Sequence[int]] = None,
+ keep_prompt_token_indices: Optional[Sequence[int]] = None,
+ k: int = 20,
+) -> Dict[str, Any]:
+ """Return a token-level faithfulness trace (RISE/MAS/RISE+AP) plus per-token deltas."""
+
+ pad_token_id = _ensure_pad_token_id(tokenizer)
+
+ user_prompt = " " + prompt
+ formatted = format_prompt(tokenizer, user_prompt)
+ formatted_ids = tokenizer(formatted, return_tensors="pt", add_special_tokens=False).input_ids
+ user_ids = tokenizer(user_prompt, return_tensors="pt", add_special_tokens=False).input_ids
+
+ prompt_ids = formatted_ids.to(model.device)
+ prompt_ids_perturbed = prompt_ids.clone()
+ gen_ids = tokenizer(
+ generation + (tokenizer.eos_token or ""),
+ return_tensors="pt",
+ add_special_tokens=False,
+ ).input_ids.to(model.device)
+
+ attr_cpu = attribution.detach().cpu()
+ w = attr_cpu.sum(0)
+ P = int(w.numel())
+
+ if keep_prompt_token_indices is None:
+ keep = list(range(P))
+ else:
+ keep = []
+ seen: set[int] = set()
+ for raw in keep_prompt_token_indices:
+ try:
+ idx = int(raw)
+ except Exception:
+ continue
+ if 0 <= idx < P and idx not in seen:
+ keep.append(idx)
+ seen.add(idx)
+ keep.sort()
+
+ K = len(keep)
+ if K:
+ w_keep = w.index_select(0, torch.as_tensor(keep, dtype=torch.long))
+ sorted_local = torch.argsort(w_keep, descending=True)
+ sorted_attr_indices = torch.as_tensor([keep[int(i.item())] for i in sorted_local], dtype=torch.long)
+ attr_sum = float(w_keep.sum().item())
+ else:
+ sorted_attr_indices = torch.zeros((0,), dtype=torch.long)
+ attr_sum = 0.0
+
+ if int(user_ids.shape[1]) != P:
+ raise ValueError(
+ "Prompt-side attribution length does not match tokenized user prompt length: "
+ f"attr P={P}, user_prompt P={int(user_ids.shape[1])}."
+ )
+
+ prompt_positions: List[int]
+ if user_prompt_indices is not None:
+ prompt_positions = [int(x) for x in user_prompt_indices]
+ if len(prompt_positions) != P:
+ raise ValueError(
+ "user_prompt_indices length does not match prompt-side attribution length: "
+ f"indices P={len(prompt_positions)}, attr P={P}."
+ )
+ if P and max(prompt_positions) >= int(prompt_ids_perturbed.shape[1]):
+ raise ValueError("user_prompt_indices contains an out-of-bounds index for formatted prompt ids.")
+ else:
+ user_start = _find_subsequence_start(formatted_ids[0], user_ids[0])
+ if user_start is None:
+ raise RuntimeError("Failed to locate user prompt token span inside formatted chat prompt.")
+ prompt_positions = [int(user_start) + j for j in range(P)]
+
+ if K > 0:
+ steps = int(k) if k is not None else 0
+ if steps <= 0:
+ steps = 1
+ steps = min(steps, K)
+ else:
+ steps = 0
+
+ scores = np.zeros(steps + 1, dtype=np.float64)
+ density = np.zeros(steps + 1, dtype=np.float64)
+
+ scores[0] = score_prompt_ids_with_generation(model, prompt_ids=prompt_ids_perturbed, generation_ids=gen_ids)
+ density[0] = 1.0
+
+ if K == 0:
+ return {
+ "num_tokens": P,
+ "sorted_attr_indices": [],
+ "scores_raw": scores.tolist(),
+ "density": density.tolist(),
+ "normalized_model_response": scores.tolist(),
+ "alignment_penalty": np.zeros_like(scores).tolist(),
+ "corrected_scores": scores.tolist(),
+ "token_deltas_raw": np.zeros(P, dtype=np.float64).tolist(),
+ "attr_weights": np.zeros(P, dtype=np.float64).tolist(),
+ "metrics": {"RISE": 0.0, "MAS": 0.0, "RISE+AP": 0.0},
+ }
+
+ if attr_sum <= 0:
+ density = np.linspace(1.0, 0.0, steps + 1)
+
+ per_token_delta = np.zeros(P, dtype=np.float64)
+
+ base = K // steps
+ remainder = K % steps
+ start = 0
+ for step in range(steps):
+ size = base + (1 if step < remainder else 0)
+ group = sorted_attr_indices[start : start + size]
+ start += size
+
+ for idx_t in group:
+ idx = int(idx_t.item())
+ abs_pos = int(prompt_positions[idx])
+ prompt_ids_perturbed[0, abs_pos] = pad_token_id
+
+ scores[step + 1] = score_prompt_ids_with_generation(model, prompt_ids=prompt_ids_perturbed, generation_ids=gen_ids)
+ if attr_sum > 0:
+ dec = float(w.index_select(0, group).sum().item()) / attr_sum
+ density[step + 1] = density[step] - dec
+
+ delta = scores[step] - scores[step + 1]
+ for idx_t in group:
+ idx = int(idx_t.item())
+ per_token_delta[idx] = delta
+
+ min_normalized_pred = 1.0
+ normalized_model_response = scores.copy()
+ for i in range(len(scores)):
+        normalized_pred = (normalized_model_response[i] - scores[-1]) / (abs(scores[0] - scores[-1]) + 1e-12)
+ normalized_pred = np.clip(normalized_pred, 0.0, 1.0)
+ min_normalized_pred = min(min_normalized_pred, float(normalized_pred))
+ normalized_model_response[i] = min_normalized_pred
+
+ alignment_penalty = np.abs(normalized_model_response - density)
+ corrected_scores = normalized_model_response + alignment_penalty
+ corrected_scores = corrected_scores.clip(0, 1)
+ corrected_scores = (corrected_scores - np.min(corrected_scores)) / (np.max(corrected_scores) - np.min(corrected_scores))
+ if np.isnan(corrected_scores).any():
+ corrected_scores = np.linspace(1, 0, len(scores))
+
+ rise = auc(normalized_model_response)
+ mas = auc(corrected_scores)
+ rise_ap = auc(normalized_model_response + alignment_penalty)
+
+ if attr_sum > 0:
+ attr_weights = np.zeros(P, dtype=np.float64)
+ for idx in keep:
+ attr_weights[idx] = float(w[idx].item()) / (attr_sum + 1e-12)
+ else:
+ attr_weights = np.zeros(P, dtype=np.float64)
+
+ return {
+ "num_tokens": P,
+ "sorted_attr_indices": [int(i.item()) for i in sorted_attr_indices],
+ "scores_raw": scores.tolist(),
+ "density": density.tolist(),
+ "normalized_model_response": normalized_model_response.tolist(),
+ "alignment_penalty": alignment_penalty.tolist(),
+ "corrected_scores": corrected_scores.tolist(),
+ "token_deltas_raw": per_token_delta.tolist(),
+ "attr_weights": attr_weights.tolist(),
+ "metrics": {"RISE": rise, "MAS": mas, "RISE+AP": rise_ap},
+ }
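The running-minimum normalization inside `mas_trace` can be illustrated on a synthetic perturbation curve (the numbers below are made up): raw scores may rebound between steps, but the normalized model response is forced to be non-increasing before the alignment penalty against the density curve is computed:

```python
import numpy as np

scores = np.array([0.0, -3.0, -2.0, -8.0, -10.0])   # raw log-prob sums per perturbation step
density = np.linspace(1.0, 0.0, len(scores))        # remaining attribution mass per step

norm = scores.copy()
running_min = 1.0
for i in range(len(scores)):
    p = (norm[i] - scores[-1]) / abs(scores[0] - scores[-1])
    running_min = min(running_min, float(np.clip(p, 0.0, 1.0)))
    norm[i] = running_min

# Monotone despite the rebound at step 2 (-3.0 -> -2.0).
assert np.all(np.diff(norm) <= 0)
penalty = np.abs(norm - density)
assert penalty[0] == 0.0   # perfectly aligned at the start
```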
+
+
+def compute_method_attribution(
+ method: str,
+ example: ds_utils.CachedExample,
+ model: Any,
+ tokenizer: Any,
+ *,
+ n_hops: int,
+ sink_span: Optional[Tuple[int, int]],
+ thinking_span: Optional[Tuple[int, int]],
+ chunk_tokens: int,
+ sink_chunk_tokens: int,
+ attnlrp_neg_handling: str,
+ attnlrp_norm_mode: str,
+) -> Tuple[str, Any, llm_attr.LLMAttributionResult]:
+ prompt = example.prompt
+ target = example.target
+
+ if method == "ifr":
+ if sink_span is None:
+ raise ValueError("IFR requires sink_span (use dataset sink_span or pass --sink_span).")
+ attributor = llm_attr.LLMIFRAttribution(model, tokenizer, chunk_tokens=chunk_tokens, sink_chunk_tokens=sink_chunk_tokens)
+ result = attributor.calculate_ifr_span(prompt, target=target, span=sink_span)
+ return "IFR (ifr_span)", attributor, result
+
+ if method == "ifr_all_positions_output_only":
+ if sink_span is None:
+ raise ValueError(
+ "ifr_all_positions_output_only requires sink_span (use dataset sink_span or pass --sink_span)."
+ )
+ attributor = llm_attr.LLMIFRAttribution(model, tokenizer, chunk_tokens=chunk_tokens, sink_chunk_tokens=sink_chunk_tokens)
+ result = attributor.calculate_ifr_for_all_positions_output_only(
+ prompt,
+ target=target,
+ sink_span=sink_span,
+ )
+ return "IFR (ifr_all_positions_output_only)", attributor, result
+
+ if method in ("ft", "ft_ifr"):
+ attributor = llm_attr.LLMIFRAttribution(model, tokenizer, chunk_tokens=chunk_tokens, sink_chunk_tokens=sink_chunk_tokens)
+ result = attributor.calculate_ifr_multi_hop(
+ prompt,
+ target=target,
+ sink_span=sink_span,
+ thinking_span=thinking_span,
+ n_hops=int(n_hops),
+ )
+ return "FT-IFR (ifr_multi_hop)", attributor, result
+
+ if method in ("ft_improve", "ft_ifr_improve"):
+ import ft_ifr_improve
+
+ attributor = ft_ifr_improve.LLMIFRAttributionImproved(
+ model,
+ tokenizer,
+ chunk_tokens=chunk_tokens,
+ sink_chunk_tokens=sink_chunk_tokens,
+ )
+ result = attributor.calculate_ifr_multi_hop_stop_words(
+ prompt,
+ target=target,
+ sink_span=sink_span,
+ thinking_span=thinking_span,
+ n_hops=int(n_hops),
+ )
+ return "FT-IFR (ifr_multi_hop_stop_words)", attributor, result
+
+ if method == "ft_split_hop":
+ import ft_ifr_improve
+
+ attributor = ft_ifr_improve.LLMIFRAttributionSplitHop(
+ model,
+ tokenizer,
+ chunk_tokens=chunk_tokens,
+ sink_chunk_tokens=sink_chunk_tokens,
+ )
+ result = attributor.calculate_ifr_multi_hop_split_hop(
+ prompt,
+ target=target,
+ sink_span=sink_span,
+ thinking_span=thinking_span,
+ n_hops=int(n_hops),
+ )
+ return "FT-IFR (ifr_multi_hop_split_hop)", attributor, result
+
+ if method == "attnlrp":
+ attributor = llm_attr.LLMLRPAttribution(model, tokenizer)
+ result = attributor.calculate_attnlrp_ft_hop0(
+ prompt,
+ target=target,
+ sink_span=sink_span,
+ thinking_span=thinking_span,
+ neg_handling=attnlrp_neg_handling,
+ norm_mode=attnlrp_norm_mode,
+ )
+ return "AttnLRP (ft_attnlrp hop0)", attributor, result
+
+ if method == "ft_attnlrp":
+ attributor = llm_attr.LLMLRPAttribution(model, tokenizer)
+ result = attributor.calculate_attnlrp_aggregated_multi_hop(
+ prompt,
+ target=target,
+ sink_span=sink_span,
+ thinking_span=thinking_span,
+ n_hops=int(n_hops),
+ neg_handling=attnlrp_neg_handling,
+ norm_mode=attnlrp_norm_mode,
+ )
+ return "FT-AttnLRP (attnlrp_aggregated_multi_hop)", attributor, result
+
+ raise ValueError(f"Unsupported method={method!r}")
+
+
+def parse_args() -> argparse.Namespace:
+ parser = argparse.ArgumentParser("MAS case study (faithfulness perturbation visualization)")
+ parser.add_argument("--dataset", type=str, default="exp/exp2/data/morehopqa.jsonl", help="Dataset name or JSONL path.")
+ parser.add_argument("--data_root", type=str, default="exp/exp2/data", help="Cache root for dataset names.")
+ parser.add_argument("--index", type=int, default=0, help="Sample index (supports negative for reverse).")
+ parser.add_argument(
+ "--method",
+ type=str,
+ choices=[
+ "ifr",
+ "ifr_all_positions_output_only",
+ "ft",
+ "ft_ifr",
+ "ft_improve",
+ "ft_ifr_improve",
+ "ft_split_hop",
+ "attnlrp",
+ "ft_attnlrp",
+ ],
+ default="ft",
+ )
+ parser.add_argument("--model", type=str, default="qwen-8B", help="HF repo id (ignored if --model_path set).")
+ parser.add_argument("--model_path", type=str, default=None, help="Local model path to override --model.")
+ parser.add_argument("--cuda", type=str, default=None, help="CUDA spec (e.g., '0' or '0,1').")
+ parser.add_argument("--cuda_num", type=int, default=0, help="Fallback GPU index when --cuda unset.")
+ parser.add_argument("--n_hops", type=int, default=1, help="Number of hops for multi-hop methods.")
+ parser.add_argument("--sink_span", type=int, nargs=2, default=None, help="Optional sink span over generation tokens.")
+ parser.add_argument("--thinking_span", type=int, nargs=2, default=None, help="Optional thinking span over generation tokens.")
+ parser.add_argument("--chunk_tokens", type=int, default=128, help="IFR chunk size.")
+ parser.add_argument("--sink_chunk_tokens", type=int, default=32, help="IFR sink chunk size.")
+ parser.add_argument(
+ "--attnlrp_neg_handling",
+ type=str,
+ choices=["drop", "abs"],
+ default="drop",
+ help="FT-AttnLRP: how to handle negative values after each hop (drop=clamp>=0, abs=absolute value).",
+ )
+ parser.add_argument(
+ "--attnlrp_norm_mode",
+ type=str,
+ choices=["norm", "no_norm"],
+ default="norm",
+ help="FT-AttnLRP: norm enables per-hop global+thinking normalization + ratios; no_norm disables all three.",
+ )
+ parser.add_argument("--output_dir", type=str, default="exp/case_study/out", help="Where to write HTML/JSON artifacts.")
+ return parser.parse_args()
+
+
+def main() -> None:
+ args = parse_args()
+ device = resolve_device(args.cuda, args.cuda_num)
+ if torch.cuda.is_available():
+ visible = os.environ.get("CUDA_VISIBLE_DEVICES")
+ print(f"[info] CUDA_VISIBLE_DEVICES={visible!r} torch.cuda.device_count()={torch.cuda.device_count()} device={device}")
+
+ if args.method == "ft_ifr":
+ method_key = "ft"
+ elif args.method == "ft_ifr_improve":
+ method_key = "ft_improve"
+ else:
+ method_key = args.method
+
+ model_name = args.model_path if args.model_path is not None else args.model
+ model, tokenizer = load_model(model_name, device)
+
+ example, ds_name = load_example(args.dataset, args.index, Path(args.data_root))
+
+    sink_span = (
+        tuple(args.sink_span)
+        if args.sink_span is not None
+        else tuple(example.sink_span) if example.sink_span else None
+    )
+ thinking_span = (
+ tuple(args.thinking_span)
+ if args.thinking_span is not None
+ else tuple(example.thinking_span) if example.thinking_span else None
+ )
+
+ method_label, attributor, attr_result = compute_method_attribution(
+ method_key,
+ example,
+ model,
+ tokenizer,
+ n_hops=args.n_hops,
+ sink_span=sink_span,
+ thinking_span=thinking_span,
+ chunk_tokens=args.chunk_tokens,
+ sink_chunk_tokens=args.sink_chunk_tokens,
+ attnlrp_neg_handling=args.attnlrp_neg_handling,
+ attnlrp_norm_mode=args.attnlrp_norm_mode,
+ )
+
+ indices_to_explain = example.indices_to_explain or example.sink_span
+ if not (isinstance(indices_to_explain, list) and len(indices_to_explain) == 2):
+ raise ValueError("MAS case study requires token-span indices_to_explain=[start_tok,end_tok] (e.g. sink_span).")
+ seq_attr, row_attr, rec_attr = attr_result.get_all_token_attrs(indices_to_explain)
+
+ prompt_tokens = decode_text_into_tokens(tokenizer, " " + example.prompt)
+ generation_text = example.target if example.target is not None else (getattr(attributor, "generation", None) or "")
+
+ variant_specs = [
+ ("seq", "Seq attribution", seq_attr),
+ ("row", "Row attribution", row_attr),
+ ("recursive", "Recursive attribution", rec_attr),
+ ]
+
+ formatted = format_prompt(tokenizer, " " + example.prompt)
+ prompt_ids = tokenizer(formatted, return_tensors="pt", add_special_tokens=False).input_ids.to(model.device)
+ gen_ids = tokenizer(
+ generation_text + (tokenizer.eos_token or ""),
+ return_tensors="pt",
+ add_special_tokens=False,
+ ).input_ids.to(model.device)
+ base_score = score_prompt_ids_with_generation(model, prompt_ids=prompt_ids, generation_ids=gen_ids)
+
+ panels_raw: List[Dict[str, Any]] = []
+ panels_display: List[Dict[str, Any]] = []
+
+ for variant_key, variant_label, variant_attr in variant_specs:
+ prompt_len = int(seq_attr.shape[1] - seq_attr.shape[0]) # cols=(P+G), rows=G
+ attr_prompt = variant_attr[:, :prompt_len]
+ keep_prompt_token_indices = None
+ if method_key == "ft_improve":
+ import ft_ifr_improve
+
+ keep_prompt_token_indices = ft_ifr_improve.keep_token_indices(list(getattr(attributor, "user_prompt_tokens", []) or []))
+ trace = mas_trace(
+ model,
+ tokenizer,
+ attribution=attr_prompt.to(device="cpu"),
+ prompt=example.prompt,
+ generation=generation_text,
+ user_prompt_indices=getattr(attributor, "user_prompt_indices", None),
+ keep_prompt_token_indices=keep_prompt_token_indices,
+ )
+ trace["variant"] = variant_key
+ trace["variant_label"] = variant_label
+
+ panel_raw = {
+ "variant": variant_key,
+ "variant_label": variant_label,
+ "metrics": trace.get("metrics"),
+ "sorted_attr_indices": trace.get("sorted_attr_indices"),
+ "attr_weights": trace.get("attr_weights"),
+ "token_deltas_raw": trace.get("token_deltas_raw"),
+ "mas_trace": trace,
+ }
+ panels_raw.append(panel_raw)
+
+ panel_display = {
+ "variant": variant_key,
+ "variant_label": variant_label,
+ "metrics": trace.get("metrics"),
+ "sorted_attr_indices": trace.get("sorted_attr_indices"),
+ "attr_weights": trace.get("attr_weights"),
+ "token_deltas_raw": trace.get("token_deltas_raw"),
+ }
+ panels_display.append(panel_display)
+
+ case_meta: Dict[str, Any] = {
+ "dataset": ds_name,
+ "index": args.index,
+ "mode": "mas",
+ "attr_method": method_key,
+ "attr_method_label": method_label,
+ "sink_span": sink_span,
+ "thinking_span": thinking_span,
+ "n_hops": int(args.n_hops),
+ "attnlrp_neg_handling": args.attnlrp_neg_handling if method_key in ("attnlrp", "ft_attnlrp") else None,
+ "attnlrp_norm_mode": args.attnlrp_norm_mode if method_key in ("attnlrp", "ft_attnlrp") else None,
+ "attnlrp_ratio_enabled": (args.attnlrp_norm_mode == "norm") if method_key in ("attnlrp", "ft_attnlrp") else None,
+ "base_score": float(base_score),
+ }
+
+ record = {
+ "meta": case_meta,
+ "prompt": example.prompt,
+ "target": example.target,
+ "generation": generation_text,
+ "prompt_tokens": prompt_tokens,
+ "panels": panels_raw,
+ }
+
+ out_dir = Path(args.output_dir)
+ out_dir.mkdir(parents=True, exist_ok=True)
+ stem = make_output_stem(ds_name, args.index, method_key)
+ json_path = out_dir / f"{stem}.json"
+ html_path = out_dir / f"{stem}.html"
+
+ with json_path.open("w", encoding="utf-8") as f:
+ json.dump(record, f, ensure_ascii=False, indent=2)
+
+ html = viz.render_mas_token_html(
+ case_meta,
+ prompt_tokens=prompt_tokens,
+ panels=panels_display,
+ generation=generation_text,
+ )
+ html_path.write_text(html, encoding="utf-8")
+
+ print(f"[done] wrote {json_path}")
+ print(f"[done] wrote {html_path}")
+
+
+if __name__ == "__main__":
+ main()
diff --git a/exp/case_study/viz.py b/exp/case_study/viz.py
new file mode 100644
index 0000000000000000000000000000000000000000..704ee260ac417c7cc5ae8918d74b762b0b935947
--- /dev/null
+++ b/exp/case_study/viz.py
@@ -0,0 +1,647 @@
+"""HTML helpers for visualizing hop-wise IFR/AttnLRP attributions."""
+
+from __future__ import annotations
+
+import math
+from typing import Any, Dict, List, Optional, Sequence
+
+from html import escape
+
+
+TOKEN_SCALE_QUANTILE = 0.995
+
+
+def _robust_abs_max(scores: Sequence[float], *, quantile: float = TOKEN_SCALE_QUANTILE) -> float:
+ """Return a robust abs max to avoid a single outlier washing out the colormap.
+
+ Uses a high quantile (default: p99.5) over |scores|. Top outliers saturate.
+ """
+
+ abs_vals: List[float] = []
+ for x in scores:
+ try:
+ v = float(x)
+ except Exception:
+ continue
+ if math.isnan(v):
+ continue
+ abs_vals.append(abs(v))
+
+ if not abs_vals:
+ return 0.0
+
+ abs_vals.sort()
+ q = float(quantile)
+ if q < 0.0:
+ q = 0.0
+ if q > 1.0:
+ q = 1.0
+ idx = int(q * (len(abs_vals) - 1))
+ return float(abs_vals[idx])
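The quantile logic in `_robust_abs_max` can be seen with a simplified standalone version (same indexing; `robust_abs_max` here is a hypothetical helper name for the sketch): with one extreme outlier among 1001 values, the p99.5 scale stays near the bulk of the distribution instead of letting the outlier set the colormap:

```python
import math

def robust_abs_max(scores, quantile=0.995):
    # Simplified re-statement of the quantile indexing used above.
    vals = sorted(abs(float(x)) for x in scores if not math.isnan(float(x)))
    if not vals:
        return 0.0
    return float(vals[int(quantile * (len(vals) - 1))])

scores = list(range(1, 1001)) + [1_000_000]   # bulk 1..1000 plus one huge outlier
assert robust_abs_max(scores) < 1000.0        # outlier saturates; scale stays in the bulk
assert max(abs(s) for s in scores) == 1_000_000
```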
+
+
+def _color_for_score(score: float, max_score: float) -> str:
+ if max_score <= 0:
+ return "background-color: rgba(245,245,245,0.7);"
+ ratio = min(1.0, score / (max_score + 1e-12))
+ r = 255
+ g = int(235 - 90 * ratio)
+ b = int(220 - 160 * ratio)
+ alpha = 0.25 + 0.55 * ratio
+ return f"background-color: rgba({r}, {g}, {b}, {alpha});"
+
+
+def _render_sentence_list(title: str, sentences: Sequence[str], scores: Sequence[float], max_score: float) -> str:
+    rows: List[str] = []
+    for sent, sc in zip(sentences, scores):
+        style = _color_for_score(abs(float(sc)), max_score)
+        rows.append(
+            f'<div class="sent-row"><span class="sent-score">{sc:.4f}</span> '
+            f'<span class="sent-text" style="{style}">{escape(sent)}</span></div>'
+        )
+    return f"""
+    <div class="panel">
+      <div class="panel-title">{escape(title)}</div>
+      {''.join(rows)}
+    </div>
+    """
+
+
+def _render_tokens(
+ tokens: Sequence[str],
+ scores: Sequence[float],
+ max_score: float,
+ roles: Sequence[str],
+) -> str:
+ spans: List[str] = []
+ if max_score <= 0:
+ max_score = 1e-8
+ for idx, tok in enumerate(tokens):
+ score = float(scores[idx]) if idx < len(scores) else 0.0
+ style = _color_for_score(abs(score), max_score)
+ role = roles[idx] if idx < len(roles) else "gen"
+ safe_tok = escape(tok)
+        spans.append(
+            f'<span class="tok tok-{role}" style="{style}">{safe_tok}</span>'
+        )
+ return "".join(spans)
+
+
+def _render_top_table(top_items: List[Dict[str, Any]]) -> str:
+    if not top_items:
+        return '<p class="empty">No attribution mass.</p>'
+
+    header = "<tr><th>Rank</th><th>Idx</th><th>Score</th><th>Sentence</th></tr>"
+    body_rows = []
+    for rank, item in enumerate(top_items, start=1):
+        body_rows.append(
+            f"<tr><td>{rank}</td><td>{item['idx']}</td>"
+            f"<td>{item['score']:.4f}</td><td>{escape(item['sentence'])}</td></tr>"
+        )
+    return f"<table>{header}{''.join(body_rows)}</table>"
+
+
+def render_case_html(
+ case_meta: Dict[str, Any],
+ *,
+ token_view_raw: Dict[str, Any],
+ token_view_prompt: Dict[str, Any],
+ context: Optional[Dict[str, Any]] = None,
+ hops_sent: Optional[Sequence[Dict[str, Any]]] = None,
+) -> str:
+ has_sentence_view = bool(context) and bool(hops_sent)
+ prompt_len = len((context or {}).get("prompt_sentences") or []) if has_sentence_view else 0
+ gen_len = len((context or {}).get("generation_sentences") or []) if has_sentence_view else 0
+
+ prompt_max = 0.0
+ gen_max = 0.0
+ if has_sentence_view:
+ prompt_max = max(
+ (
+ max(h["sentence_scores_raw"][:prompt_len])
+ for h in (hops_sent or [])
+ if h.get("sentence_scores_raw") and h["sentence_scores_raw"][:prompt_len]
+ ),
+ default=0.0,
+ )
+ gen_max = max(
+ (
+ max(h["sentence_scores_raw"][prompt_len:])
+ for h in (hops_sent or [])
+ if h.get("sentence_scores_raw") and h["sentence_scores_raw"][prompt_len:]
+ ),
+ default=0.0,
+ )
+
+ raw_hops = token_view_raw.get("hops", []) or []
+ prompt_hops = token_view_prompt.get("hops", []) or []
+ if len(raw_hops) != len(prompt_hops):
+ raise ValueError(
+ "token_view_raw and token_view_prompt must have the same number of panels: "
+ f"raw={len(raw_hops)} prompt={len(prompt_hops)}"
+ )
+
+ hop_sections: List[str] = []
+ hop_count = len(prompt_hops)
+ mode = case_meta.get("mode", "ft")
+ ifr_view = case_meta.get("ifr_view", "aggregate")
+ sink_span = case_meta.get("sink_span")
+ panel_titles = case_meta.get("panel_titles")
+
+ def _panel_title(panel_idx: int) -> str:
+ if isinstance(panel_titles, list) and panel_idx < len(panel_titles):
+ try:
+ title = panel_titles[panel_idx]
+ except Exception:
+ title = None
+ if title is not None:
+ return str(title)
+ if mode in ("ft", "ft_improve", "ft_split_hop", "ifr_in_all_gen", "ft_attnlrp"):
+ return f"Hop {panel_idx}"
+ if mode == "ifr_all_positions_output_only":
+ return f"IFR output-only panel {panel_idx}"
+ if mode == "ifr_all_positions":
+ return f"IFR all-positions panel {panel_idx}"
+ if mode == "attnlrp":
+ return "AttnLRP (sink-span aggregate)"
+ return "IFR (sink-span aggregate)"
+
+ for hop_idx in range(hop_count):
+ raw_entry = raw_hops[hop_idx]
+ raw_scores = raw_entry.get("token_scores") or []
+ raw_mass = float(raw_entry.get("total_mass", 0.0))
+ raw_scale = _robust_abs_max(raw_scores)
+ if raw_scale <= 0:
+ raw_scale = float(raw_entry.get("token_score_max") or 0.0)
+ if raw_scale <= 0:
+ raw_scale = 1e-8
+
+ prompt_entry = prompt_hops[hop_idx]
+ prompt_scores = prompt_entry.get("token_scores") or []
+ prompt_mass = float(prompt_entry.get("total_mass", 0.0))
+ prompt_scale = _robust_abs_max(prompt_scores)
+ if prompt_scale <= 0:
+ prompt_scale = float(prompt_entry.get("token_score_max") or 0.0)
+ if prompt_scale <= 0:
+ prompt_scale = 1e-8
+
+        tok_raw_html = f"""
+        <div class="token-panel">
+          <div class="panel-title">{escape(token_view_raw.get("label", "Pre-trim token-level heatmap (full)"))}</div>
+          <div class="token-flow">
+            {_render_tokens(token_view_raw.get("tokens", []), raw_scores, raw_scale, token_view_raw.get("roles", []))}
+          </div>
+        </div>
+        """
+
+        tok_prompt_html = f"""
+        <div class="token-panel">
+          <div class="panel-title">{escape(token_view_prompt.get("label", "Prompt-only token-level heatmap"))}</div>
+          <div class="token-flow">
+            {_render_tokens(token_view_prompt.get("tokens", []), prompt_scores, prompt_scale, token_view_prompt.get("roles", []))}
+          </div>
+        </div>
+        """
+
+ sentence_html = ""
+ top_html = ""
+ if has_sentence_view and hop_idx < len(hops_sent or []):
+ hop = (hops_sent or [])[hop_idx]
+ raw_scores = hop.get("sentence_scores_raw") or []
+ prompt_scores = raw_scores[:prompt_len]
+ gen_scores = raw_scores[prompt_len:]
+ # Sentence view is not used by the current case-study runner; keep the path for completeness.
+            sentence_html = f"""
+            <div class="sentence-columns">
+              {_render_sentence_list('Prompt sentences', (context or {}).get('prompt_sentences') or [], prompt_scores, prompt_max)}
+              {_render_sentence_list('Generation sentences', (context or {}).get('generation_sentences') or [], gen_scores, gen_max)}
+            </div>
+            """
+            top_html = f"""
+            <div class="top-table">
+              <div class="panel-title">Top sentences (all)</div>
+              {_render_top_table(hop.get('top_sentences') or [])}
+            </div>
+            """
+
+        hop_sections.append(
+            f"""
+            <section class="hop">
+              <h2>{escape(_panel_title(hop_idx))}</h2>
+              {tok_raw_html}
+              {tok_prompt_html}
+              {sentence_html}
+              {top_html}
+            </section>
+            """
+        )
+
+ thinking_ratios = case_meta.get("thinking_ratios") or []
+ ratios_str = ", ".join(f"{r:.4f}" for r in thinking_ratios) if thinking_ratios else "N/A"
+
+ if mode == "ft":
+ mode_label = "FT Multi-hop (IFR)"
+ elif mode == "ifr_in_all_gen":
+ mode_label = "IFR In-all-gen (multi-hop)"
+ elif mode == "ifr":
+ mode_label = "IFR Standard"
+ elif mode == "ifr_all_positions":
+ mode_label = "IFR All-positions"
+ elif mode == "ifr_all_positions_output_only":
+ mode_label = "IFR Output-only (all positions)"
+ elif mode == "attnlrp":
+ mode_label = "AttnLRP"
+ elif mode == "ft_attnlrp":
+ mode_label = "FT Multi-hop (AttnLRP)"
+ else:
+ mode_label = str(mode)
+
+ if mode in ("ft", "ifr_in_all_gen", "ft_attnlrp"):
+ view_key = "Recursive hops"
+ view_val = case_meta.get("n_hops")
+ elif mode in ("ifr", "ifr_all_positions", "ifr_all_positions_output_only"):
+ view_key = "IFR view"
+ view_val = ifr_view
+ elif mode == "attnlrp":
+ view_key = "AttnLRP view"
+ view_val = "ft_hop0_span_aggregate"
+ else:
+ view_key = "View"
+ view_val = "N/A"
+
+    scale_row = f"<div>Token scale: per-panel per-view p{int(TOKEN_SCALE_QUANTILE*1000)/10:.1f}(|score|)</div>"
+    neg_handling = case_meta.get("attnlrp_neg_handling")
+    norm_mode = case_meta.get("attnlrp_norm_mode")
+    ratio_enabled = case_meta.get("attnlrp_ratio_enabled")
+    attn_rows = []
+    if neg_handling:
+        attn_rows.append(f"<div>FT-AttnLRP neg_handling: {escape(str(neg_handling))}</div>")
+    if norm_mode:
+        attn_rows.append(f"<div>FT-AttnLRP norm_mode: {escape(str(norm_mode))}</div>")
+    if ratio_enabled is not None:
+        attn_rows.append(f"<div>FT-AttnLRP ratio_enabled: {escape(str(bool(ratio_enabled)))}</div>")
+
+ header = f"""
+
+ """
+
+ style = """
+
+ """
+
+ title = f"{mode_label} Case Study"
+    html = f"""
+    <!DOCTYPE html>
+    <html>
+    <head>
+      <meta charset="utf-8">
+      <title>{escape(title)}</title>
+      {style}
+    </head>
+    <body>
+      {header}
+      {''.join(hop_sections)}
+    </body>
+    </html>
+    """
+ return html
+
+
+def _render_sentence_spans(title: str, sentences: Sequence[str], scores: Sequence[float]) -> str:
+ max_abs = max((abs(float(x)) for x in scores), default=0.0)
+ spans: List[str] = []
+ for idx, sentence in enumerate(sentences):
+ score = float(scores[idx]) if idx < len(scores) else 0.0
+ style = _color_for_score(abs(score), max_abs)
+        spans.append(
+            f'<span class="sent" style="{style}">{escape(sentence)}</span> '
+        )
+    return f"""
+    <div class="panel">
+      <div class="panel-title">{escape(title)}</div>
+      <div class="sentence-flow">{''.join(spans)}</div>
+    </div>
+    """
+
+
+def _render_token_spans(title: str, tokens: Sequence[str], scores: Sequence[float]) -> str:
+ max_abs = max((abs(float(x)) for x in scores), default=0.0)
+ spans: List[str] = []
+ for idx, tok in enumerate(tokens):
+ score = float(scores[idx]) if idx < len(scores) else 0.0
+ style = _color_for_score(abs(score), max_abs)
+        spans.append(
+            f'<span class="tok" style="{style}">{escape(tok)}</span>'
+        )
+    return f"""
+    <div class="panel">
+      <div class="panel-title">{escape(title)}</div>
+      <div class="token-flow">{''.join(spans)}</div>
+    </div>
+    """
+
+
+def render_mas_sentence_html(
+ case_meta: Dict[str, Any],
+ *,
+ prompt_sentences: Sequence[str],
+ panels: Sequence[Dict[str, Any]],
+ generation: Optional[str] = None,
+) -> str:
+ """Render MAS sentence-level diagnostics (attribution / pure ablation / guided marginal)."""
+
+ method_label = case_meta.get("attr_method_label") or case_meta.get("attr_method") or "Unknown method"
+ title = f"MAS Sentence Study ({method_label})"
+
+ neg_handling = case_meta.get("attnlrp_neg_handling")
+ norm_mode = case_meta.get("attnlrp_norm_mode")
+ ratio_enabled = case_meta.get("attnlrp_ratio_enabled")
+    attn_rows = []
+    if neg_handling:
+        attn_rows.append(f"<div>FT-AttnLRP neg_handling: {escape(str(neg_handling))}</div>")
+    if norm_mode:
+        attn_rows.append(f"<div>FT-AttnLRP norm_mode: {escape(str(norm_mode))}</div>")
+    if ratio_enabled is not None:
+        attn_rows.append(f"<div>FT-AttnLRP ratio_enabled: {escape(str(bool(ratio_enabled)))}</div>")
+
+    base_score = case_meta.get("base_score")
+    base_score_row = f"<div>Base score: {float(base_score):.6f}</div>" if isinstance(base_score, (int, float)) else ""
+
+    gen_block = ""
+    if isinstance(generation, str) and generation:
+        gen_block = f"""
+        <div class="panel">
+          <div class="panel-title">Generation (scored)</div>
+          <div class="gen-text">{escape(generation)}</div>
+        </div>
+        """
+
+ header = f"""
+
+ """
+
+ panel_sections: List[str] = []
+ for panel in panels:
+ label = panel.get("variant_label") or panel.get("panel_label") or panel.get("variant") or "Panel"
+ metrics = panel.get("metrics") or {}
+ metrics_str = " | ".join(
+ f"{k}: {float(metrics[k]):.4f}" if isinstance(metrics.get(k), (int, float)) else f"{k}: {metrics.get(k)}"
+ for k in ("RISE", "MAS", "RISE+AP")
+ if k in metrics
+ )
+
+ attr_weights = panel.get("attr_weights") or []
+ pure_deltas = panel.get("pure_sentence_deltas_raw") or []
+ guided_deltas = panel.get("guided_sentence_deltas_raw") or panel.get("sentence_deltas_raw") or []
+ rank_order = panel.get("sorted_attr_indices") or []
+ rank_str = ", ".join(str(int(x)) for x in rank_order) if rank_order else "N/A"
+
+        panel_sections.append(
+            f"""
+            <section class="panel-group">
+              <h2>{escape(str(label))}</h2>
+              <div class="metrics">{escape(metrics_str)}</div>
+              <div class="rank">Perturbation order: {escape(rank_str)}</div>
+              {_render_sentence_spans("Method attribution (sentence weights)", prompt_sentences, attr_weights)}
+              {_render_sentence_spans("Pure sentence ablation (base → score)", prompt_sentences, pure_deltas)}
+              {_render_sentence_spans("Attribution-guided MAS marginal (path deltas)", prompt_sentences, guided_deltas)}
+            </section>
+            """
+        )
+
+ style = """
+
+ """
+
+    html = f"""
+    <!DOCTYPE html>
+    <html>
+    <head>
+      <meta charset="utf-8">
+      <title>{escape(title)}</title>
+      {style}
+    </head>
+    <body>
+      {header}
+      {gen_block}
+      {''.join(panel_sections)}
+    </body>
+    </html>
+    """
+ return html
+
+
+def render_mas_token_html(
+ case_meta: Dict[str, Any],
+ *,
+ prompt_tokens: Sequence[str],
+ panels: Sequence[Dict[str, Any]],
+ generation: Optional[str] = None,
+) -> str:
+ """Render MAS token-level diagnostics (attribution weights + guided marginal deltas)."""
+
+ method_label = case_meta.get("attr_method_label") or case_meta.get("attr_method") or "Unknown method"
+ title = f"MAS Token Study ({method_label})"
+
+ neg_handling = case_meta.get("attnlrp_neg_handling")
+ norm_mode = case_meta.get("attnlrp_norm_mode")
+ ratio_enabled = case_meta.get("attnlrp_ratio_enabled")
+    attn_rows = []
+    if neg_handling:
+        attn_rows.append(f"<div>FT-AttnLRP neg_handling: {escape(str(neg_handling))}</div>")
+    if norm_mode:
+        attn_rows.append(f"<div>FT-AttnLRP norm_mode: {escape(str(norm_mode))}</div>")
+    if ratio_enabled is not None:
+        attn_rows.append(f"<div>FT-AttnLRP ratio_enabled: {escape(str(bool(ratio_enabled)))}</div>")
+
+    base_score = case_meta.get("base_score")
+    base_score_row = f"<div>Base score: {float(base_score):.6f}</div>" if isinstance(base_score, (int, float)) else ""
+
+    gen_block = ""
+    if isinstance(generation, str) and generation:
+        gen_block = f"""
+        <div class="panel">
+          <div class="panel-title">Generation (scored)</div>
+          <div class="gen-text">{escape(generation)}</div>
+        </div>
+        """
+
+ header = f"""
+
+ """
+
+ panel_sections: List[str] = []
+ for panel in panels:
+ label = panel.get("variant_label") or panel.get("panel_label") or panel.get("variant") or "Panel"
+ metrics = panel.get("metrics") or {}
+ metrics_str = " | ".join(
+ f"{k}: {float(metrics[k]):.4f}" if isinstance(metrics.get(k), (int, float)) else f"{k}: {metrics.get(k)}"
+ for k in ("RISE", "MAS", "RISE+AP")
+ if k in metrics
+ )
+
+ attr_weights = panel.get("attr_weights") or []
+ guided_deltas = panel.get("token_deltas_raw") or []
+ rank_order = panel.get("sorted_attr_indices") or []
+ rank_str = ", ".join(str(int(x)) for x in rank_order) if rank_order else "N/A"
+
+ panel_sections.append(
+ f"""
+
+
+
+ {_render_token_spans("Method attribution (token weights)", prompt_tokens, attr_weights)}
+ {_render_token_spans("Attribution-guided MAS marginal (path deltas)", prompt_tokens, guided_deltas)}
+
+
+
+ """
+ )
+
+ style = """
+
+ """
+
+ html = f"""
+
+
+
+ {escape(title)}
+ {style}
+
+
+ {header}
+ {gen_block}
+ {''.join(panel_sections)}
+
+ """
+ return html
diff --git a/exp/exp1/README.md b/exp/exp1/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..067ab472b08693436aafd683d999396f06b1d8fb
--- /dev/null
+++ b/exp/exp1/README.md
@@ -0,0 +1,46 @@
+# FlashTrace long-context timing experiment (exp1)
+
+Self-contained script: `exp/exp1/run_time_curve.py`
+Purpose: on a single RULER sample, measure each attribution method's wall-clock time and peak GPU memory across different context lengths, to populate the linear-growth table in the paper.
+
+## Method overview
+- `IG` (20 steps)
+- `attention_I_G` (attention * IG)
+- `attnlrp` (single-backward-pass LRP variant)
+- `perturbation_all` (log-loss ablation)
+- `perturbation_CLP` (KL variant)
+- `perturbation_REAGENT` (MLM replacement; LED/4096 cap, may fail beyond that)
+- `ifr_all_positions` (IFR one-by-one baseline, fixed `sink_chunk_tokens=1`)
+- `ifr_multi_hop` (FlashTrace, multi-hop + chunk support)
+- `ifr_multi_hop_both` (FT-IFR both: stop_words + in_all_gen, multi-hop + chunk support)
+
+## Run example
+```bash
+# Defaults: input lengths 1024,4096,8192; output lengths 32,256,512; 3 repeats per cell
+python exp/exp1/run_time_curve.py \
+ --model qwen-8B \
+ --model_path /opt/share/models/Qwen/Qwen3-8B/ \
+ --cuda 2,3,4,5,6,7 \
+ --attr_funcs perturbation_all,perturbation_REAGENT,ifr_all_positions,perturbation_CLP,ifr_multi_hop,ifr_multi_hop_both,attnlrp \
+ --input_lengths 10 \
+ --output_lengths 2000,5000,10000 \
+ --repeats 1 \
+ --chunk_tokens 128 \
+ --sink_chunk_tokens 32 \
+ --catch_oom \
+ --ruler_file data/ruler_multihop/8192/vt_h10_c1/validation.jsonl
+```
+
+Outputs:
+- `exp/exp1/out/time_curve_runs.jsonl`: raw per-run records (attr, target input/output/total, actual lengths, time, peak_mem, status).
+- `exp/exp1/out/time_curve_summary.csv`: mean/std aggregated per method + target input/output (also lists total = input + output).
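The per-run JSONL can be re-aggregated offline without rerunning the experiment; a minimal sketch using the field names listed above (`summarize_runs` is an illustrative helper, not part of the repo):

```python
import json
from collections import defaultdict
from statistics import mean


def summarize_runs(jsonl_path):
    """Average time_sec per (attr_func, target input, target output) cell,
    skipping rows whose run did not finish (time_sec is null)."""
    groups = defaultdict(list)
    with open(jsonl_path) as f:
        for line in f:
            row = json.loads(line)
            if row.get("time_sec") is not None:
                key = (row["attr_func"], row["target_input_tokens"], row["target_output_tokens"])
                groups[key].append(row["time_sec"])
    return {key: mean(times) for key, times in groups.items()}
```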
+
+## Notes
+- `--input_lengths` controls the prompt (user prompt) length; `--output_lengths` controls the output (sink) length; each cell's total = input + output.
+- Compatibility: `--total_lengths/--lengths` (deprecated) is still supported and denotes the prompt+output total length; the prompt length is derived as the difference of the two.
+- `--target_text` is tiled repeatedly as the output to reach the target output length; it only controls length and carries no semantics.
+- `--catch_oom/--no-catch-oom` selects between recording OOM as a status and continuing, or raising immediately and aborting.
+- Multi-GPU: `--cuda 0,1` sets `CUDA_VISIBLE_DEVICES` before the script starts and loads the model sharded with `device_map=balanced`; for a single GPU pass `--cuda 0`.
+- Cells exceeding the model context (`config.max_position_embeddings`) are marked `skipped_model_ctx` (checked against the actual token count of the formatted prompt + output(+eos) fed to the model).
+- The Longformer used by `perturbation_REAGENT` only supports 4096 tokens; beyond that it may return OOM or runtime_error.
+- IFR multi-hop exposes `--chunk_tokens/--sink_chunk_tokens` to force chunking in very long contexts (lower memory at a slight time cost); `ifr_all_positions` always uses a fixed `sink_chunk_tokens=1`.
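The OOM-as-status behavior described above reduces to a small wrapper; a simplified sketch of the error handling in the script's `measure()` (the name `run_with_oom_guard` is illustrative):

```python
def run_with_oom_guard(fn, catch_oom=True):
    """Run fn(); on CUDA OOM either record status='oom' and continue
    (catch_oom=True) or re-raise immediately (catch_oom=False)."""
    try:
        fn()
        return "ok"
    except RuntimeError as e:
        if "out of memory" in str(e).lower():
            if not catch_oom:
                raise
            return "oom"
        if not catch_oom:
            raise
        return f"runtime_error: {e}"
```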
diff --git a/exp/exp1/run_time_curve.py b/exp/exp1/run_time_curve.py
new file mode 100644
index 0000000000000000000000000000000000000000..2543b68e4acac3162885eefdde9f6b504ef9b6d1
--- /dev/null
+++ b/exp/exp1/run_time_curve.py
@@ -0,0 +1,757 @@
+#!/usr/bin/env python3
+"""
+Measure wall-clock time and GPU memory for attribution methods across
+different context lengths using a single synthetic RULER-style example.
+
+This script stays self-contained under exp/exp1 and reuses the attribution
+implementations in the repo (IG, perturbation, attention*IG, IFR/FlashTrace).
+The goal is to populate the time-vs-length table; correctness of the task
+content is not important, only matching token lengths and running 3 repeats.
+"""
+
+from __future__ import annotations
+
+import argparse
+import json
+import math
+import os
+import random
+import sys
+import time
+from collections import defaultdict
+from pathlib import Path
+from typing import Any, Dict, Iterable, List, Optional, Tuple
+
+import numpy as np
+
+
+def _early_set_cuda_visible_devices() -> None:
+ """Parse --cuda early to set CUDA_VISIBLE_DEVICES before torch import."""
+ parser = argparse.ArgumentParser(add_help=False)
+ parser.add_argument("--cuda", type=str, default=None)
+ args, _ = parser.parse_known_args(sys.argv[1:])
+ if args.cuda and "," in args.cuda:
+ os.environ["CUDA_VISIBLE_DEVICES"] = args.cuda
+
+
+_early_set_cuda_visible_devices()
+
+import torch
+from transformers import AutoModelForCausalLM, AutoTokenizer
+
+REPO_ROOT = Path(__file__).resolve().parents[2]
+if str(REPO_ROOT) not in sys.path:
+ sys.path.insert(0, str(REPO_ROOT))
+
+import llm_attr
+
+DEFAULT_INPUT_LENGTHS = [1024, 4096, 8192]
+DEFAULT_OUTPUT_LENGTHS = [32, 256, 512]
+DEFAULT_ATTRS = [
+ "IG",
+ "perturbation_all",
+ "attention_I_G",
+ "perturbation_REAGENT",
+ "ifr_all_positions",
+ "perturbation_CLP",
+ "ifr_multi_hop",
+ "attnlrp",
+]
+DEFAULT_RULER_FILE = REPO_ROOT / "data" / "ruler_multihop" / "8192" / "vt_h10_c1" / "validation.jsonl"
+
+
+def parse_args() -> argparse.Namespace:
+ parser = argparse.ArgumentParser("FlashTrace time/memory curve.")
+ parser.add_argument("--model", type=str, required=True, help="Model name or HF repo id.")
+ parser.add_argument("--model_path", type=str, default=None, help="Optional local model path.")
+ parser.add_argument("--cuda", type=str, default=None, help='CUDA devices, e.g. "0,1" or "0".')
+ parser.add_argument("--cuda_num", type=int, default=0, help="Single GPU index if --cuda is not set.")
+ parser.add_argument(
+ "--attr_funcs",
+ type=str,
+ default=",".join(DEFAULT_ATTRS),
+ help="Comma-separated attribution methods.",
+ )
+
+ length_group = parser.add_mutually_exclusive_group()
+ parser.add_argument(
+ "--output_lengths",
+ type=str,
+ default=",".join(str(x) for x in DEFAULT_OUTPUT_LENGTHS),
+ help="Comma-separated target output token lengths (sink/output segment).",
+ )
+ length_group.add_argument(
+ "--input_lengths",
+ type=str,
+ default=",".join(str(x) for x in DEFAULT_INPUT_LENGTHS),
+ help="Comma-separated target input/prompt token lengths (user prompt only; excludes chat template).",
+ )
+ length_group.add_argument(
+ "--total_lengths",
+ "--lengths",
+ dest="total_lengths",
+ type=str,
+ default=None,
+ help="Deprecated. Target total token lengths (prompt + output). Use --input_lengths instead.",
+ )
+ parser.add_argument("--repeats", type=int, default=3, help="Number of runs per cell.")
+ parser.add_argument("--output_dir", type=str, default="exp/exp1/out", help="Output directory.")
+ parser.add_argument(
+ "--ruler_file",
+ type=str,
+ default=str(DEFAULT_RULER_FILE),
+ help="RULER jsonl file providing a long base passage.",
+ )
+ parser.add_argument(
+ "--chunk_tokens",
+ type=int,
+ default=128,
+ help="IFR chunk_tokens override when context is long.",
+ )
+ parser.add_argument(
+ "--sink_chunk_tokens",
+ type=int,
+ default=32,
+ help="IFR sink_chunk_tokens override when context is long.",
+ )
+ parser.add_argument(
+ "--catch_oom",
+ action=argparse.BooleanOptionalAction,
+ default=True,
+ help="If true, treat CUDA OOM as status=oom and continue; if false, let OOM raise.",
+ )
+ parser.add_argument(
+ "--target_text",
+ type=str,
+ default=" The answer is 42.",
+ help="Base text to tile when constructing outputs of a given length.",
+ )
+ return parser.parse_args()
+
+
+def parse_csv_ints(value: str) -> List[int]:
+ return [int(x) for x in value.split(",") if x.strip()]
+
+
+def resolve_device(cuda: Optional[str], cuda_num: int) -> str:
+ if cuda is not None and "," in cuda:
+ os.environ["CUDA_VISIBLE_DEVICES"] = cuda
+ return "auto"
+ if cuda is not None and cuda.strip():
+ try:
+ idx = int(cuda)
+ except Exception:
+ idx = 0
+ return f"cuda:{idx}" if torch.cuda.is_available() else "cpu"
+ return f"cuda:{cuda_num}" if torch.cuda.is_available() else "cpu"
+
+
+def load_ruler_base(path: Path, fallback: str) -> str:
+ if not path.exists():
+ return fallback
+ with path.open() as f:
+ for line in f:
+ try:
+ record = json.loads(line)
+ if "input" in record:
+ return record["input"]
+ except json.JSONDecodeError:
+ continue
+ return fallback
+
+
+def build_prompt_to_length(tokenizer, base_text: str, target_tokens: int) -> Tuple[str, int]:
+ """
+ Build a prompt whose tokenized length (without special tokens) is ~target_tokens.
+ If base_text is shorter, we repeat it; if longer, we truncate.
+ """
+ if target_tokens <= 0:
+ return "", 0
+
+ base_ids = tokenizer(base_text, add_special_tokens=False).input_ids
+ if not base_ids:
+ base_ids = [tokenizer.eos_token_id]
+
+ tiled: List[int] = []
+ while len(tiled) < target_tokens:
+ tiled.extend(base_ids)
+ tiled = tiled[:target_tokens]
+ prompt = tokenizer.decode(tiled, clean_up_tokenization_spaces=False)
+ return prompt, len(tiled)
+
+
+def build_output_to_length(tokenizer, base_text: str, target_tokens: int) -> Tuple[str, int]:
+ """
+ Build a target/output string of ~target_tokens using a base snippet.
+ """
+ if target_tokens <= 0:
+ return "", 0
+
+ base_ids = tokenizer(base_text, add_special_tokens=False).input_ids
+ if not base_ids:
+ base_ids = [tokenizer.eos_token_id]
+
+ tiled: List[int] = []
+ while len(tiled) < target_tokens:
+ tiled.extend(base_ids)
+ tiled = tiled[:target_tokens]
+ text = tokenizer.decode(tiled, clean_up_tokenization_spaces=False)
+ return text, len(tiled)
+
+
+def build_formatted_prompt(tokenizer, prompt: str) -> str:
+ user_prompt = " " + prompt
+ modified_prompt = llm_attr.DEFAULT_PROMPT_TEMPLATE.format(context=user_prompt, query="")
+ formatted_prompt = [{"role": "user", "content": modified_prompt}]
+ return tokenizer.apply_chat_template(
+ formatted_prompt,
+ tokenize=False,
+ add_generation_prompt=True,
+ enable_thinking=False,
+ )
+
+
+def estimate_model_lengths(tokenizer, prompt: str, target: str) -> Dict[str, int]:
+ user_prompt = " " + prompt
+ formatted_prompt = build_formatted_prompt(tokenizer, prompt)
+
+ user_prompt_len = len(tokenizer(user_prompt, add_special_tokens=False).input_ids)
+ formatted_prompt_len = len(tokenizer(formatted_prompt, add_special_tokens=False).input_ids)
+ generation_len = len(tokenizer(target + tokenizer.eos_token, add_special_tokens=False).input_ids)
+
+ return {
+ "user_prompt_tokens": user_prompt_len,
+ "formatted_prompt_tokens": formatted_prompt_len,
+ "generation_tokens": generation_len,
+ "total_tokens": formatted_prompt_len + generation_len,
+ }
+
+
+def exceeds_model_ctx(tokenizer, prompt: str, target: str, max_ctx: Optional[int]) -> bool:
+ if max_ctx is None:
+ return False
+ return estimate_model_lengths(tokenizer, prompt, target)["total_tokens"] > max_ctx
+
+
+def load_model_balanced(model_name: str, device: str):
+ """Load model with an explicit balanced device_map when multi-GPU is requested."""
+ if device == "auto":
+ model = AutoModelForCausalLM.from_pretrained(
+ model_name,
+ device_map="balanced",
+ torch_dtype=torch.float16,
+ attn_implementation="eager",
+ )
+ elif isinstance(device, str) and device.startswith("cuda:"):
+ try:
+ gpu_idx = int(device.split(":")[1])
+ except Exception:
+ gpu_idx = 0
+ model = AutoModelForCausalLM.from_pretrained(
+ model_name,
+ device_map={"": gpu_idx},
+ torch_dtype=torch.float16,
+ attn_implementation="eager",
+ )
+ else:
+ model = AutoModelForCausalLM.from_pretrained(
+ model_name,
+ torch_dtype=torch.float16,
+ attn_implementation="eager",
+ )
+
+ tokenizer = AutoTokenizer.from_pretrained(model_name)
+ tokenizer.pad_token = tokenizer.eos_token
+ model.eval()
+ return model, tokenizer
+
+
+def collect_device_indices(device_str: str, model: Any) -> List[int]:
+ """
+ Infer the CUDA device indices that should be tracked for memory stats.
+ Prefers the model's device map; otherwise falls back to all visible devices
+ or the single requested device.
+ """
+ if not torch.cuda.is_available():
+ return []
+
+ devices: set[int] = set()
+ device_map = getattr(model, "hf_device_map", None)
+ if isinstance(device_map, dict):
+ for dev in device_map.values():
+ if dev is None:
+ continue
+ idx: Optional[int] = None
+ if isinstance(dev, torch.device):
+ idx = dev.index if dev.index is not None else (0 if dev.type == "cuda" else None)
+ elif isinstance(dev, str):
+ try:
+ d = torch.device(dev)
+ idx = d.index if d.index is not None else (0 if d.type == "cuda" else None)
+ except Exception:
+ idx = None
+ elif isinstance(dev, int):
+ idx = dev
+ if idx is not None:
+ devices.add(idx)
+
+ if not devices:
+ if device_str == "auto":
+ devices.update(range(torch.cuda.device_count()))
+ elif isinstance(device_str, str) and device_str.startswith("cuda:"):
+ try:
+ devices.add(int(device_str.split(":")[1]))
+ except Exception:
+ pass
+ else:
+ devices.update(range(torch.cuda.device_count()))
+
+ return sorted(devices)
+
+
+def maybe_reset_cuda(device_indices: List[int]) -> None:
+ if not torch.cuda.is_available() or not device_indices:
+ return
+ for idx in device_indices:
+ try:
+ torch.cuda.reset_peak_memory_stats(device=idx)
+ except Exception:
+ pass
+ try:
+ torch.cuda.empty_cache()
+ except Exception:
+ pass
+
+
+def measure(
+ method_fn,
+ device_indices: List[int],
+ *,
+ catch_oom: bool,
+) -> Tuple[str, Optional[float], Optional[float], Optional[float], Dict[int, Dict[str, float]]]:
+ status = "ok"
+ wall: Optional[float] = None
+ mem_alloc: Optional[float] = None
+ mem_reserved: Optional[float] = None
+ mem_by_device: Dict[int, Dict[str, float]] = {}
+ try:
+ if torch.cuda.is_available() and device_indices:
+ for idx in device_indices:
+ torch.cuda.synchronize(device=idx)
+ t0 = time.time()
+ method_fn()
+ if torch.cuda.is_available() and device_indices:
+ for idx in device_indices:
+ torch.cuda.synchronize(device=idx)
+ wall = time.time() - t0
+ except RuntimeError as e:
+ if "out of memory" in str(e).lower():
+ status = "oom"
+ if not catch_oom:
+ raise
+ else:
+ status = f"runtime_error: {e}"
+ if not catch_oom:
+ raise
+ except Exception as e:
+ status = f"error: {e}"
+ if not catch_oom:
+ raise
+ finally:
+ if torch.cuda.is_available() and device_indices:
+ try:
+ total_alloc = 0.0
+ total_reserved = 0.0
+ for idx in device_indices:
+ alloc_bytes = torch.cuda.max_memory_allocated(device=idx)
+ reserved_bytes = torch.cuda.max_memory_reserved(device=idx)
+ total_alloc += alloc_bytes
+ total_reserved += reserved_bytes
+ mem_by_device[idx] = {
+ "allocated_gb": alloc_bytes / 1e9,
+ "reserved_gb": reserved_bytes / 1e9,
+ }
+ mem_alloc = total_alloc / 1e9
+ mem_reserved = total_reserved / 1e9
+ except Exception:
+ pass
+ return status, wall, mem_alloc, mem_reserved, mem_by_device
+
+
+def make_attr_runner(
+ attr_func: str,
+ model: Any,
+ tokenizer: Any,
+ chunk_tokens: int,
+ sink_chunk_tokens: int,
+ batch_size: int,
+ prompt: str,
+ target: str,
+):
+ lf = attr_func.lower()
+ if lf == "ig":
+ llm_attributor = llm_attr.LLMGradientAttribtion(model, tokenizer)
+
+ def fn():
+ return llm_attributor.calculate_IG_per_generation(
+ prompt, steps=20, baseline=tokenizer.eos_token_id, batch_size=batch_size, target=target
+ )
+
+ return fn
+
+ if lf == "attention_i_g":
+ llm_attn = llm_attr.LLMAttentionAttribution(model, tokenizer)
+ llm_ig = llm_attr.LLMGradientAttribtion(model, tokenizer)
+
+ def fn():
+ attn = llm_attn.calculate_attention_attribution(prompt, target=target)
+ ig = llm_ig.calculate_IG_per_generation(
+ prompt, steps=20, baseline=tokenizer.eos_token_id, batch_size=batch_size, target=target
+ )
+ attn.attribution_matrix = attn.attribution_matrix * ig.attribution_matrix
+ return attn
+
+ return fn
+
+ if lf == "perturbation_all":
+ llm_attrtor = llm_attr.LLMPerturbationAttribution(model, tokenizer)
+
+ def fn():
+ return llm_attrtor.calculate_feature_ablation_sentences(
+ prompt, baseline=tokenizer.eos_token_id, measure="log_loss", target=target
+ )
+
+ return fn
+
+ if lf == "perturbation_clp":
+ llm_attrtor = llm_attr.LLMPerturbationAttribution(model, tokenizer)
+
+ def fn():
+ return llm_attrtor.calculate_feature_ablation_sentences(
+ prompt, baseline=tokenizer.eos_token_id, measure="KL", target=target
+ )
+
+ return fn
+
+ if lf == "perturbation_reagent":
+ llm_attrtor = llm_attr.LLMPerturbationAttribution(model, tokenizer)
+
+ def fn():
+ return llm_attrtor.calculate_feature_ablation_sentences_mlm(prompt, target=target)
+
+ return fn
+
+ if lf == "ifr_all_positions":
+ llm_attrtor = llm_attr.LLMIFRAttribution(
+ model, tokenizer, chunk_tokens=chunk_tokens, sink_chunk_tokens=1
+ )
+
+ def fn():
+ return llm_attrtor.calculate_ifr_for_all_positions(prompt, target=target)
+
+ return fn
+
+ if lf == "ifr_multi_hop":
+ llm_attrtor = llm_attr.LLMIFRAttribution(
+ model, tokenizer, chunk_tokens=chunk_tokens, sink_chunk_tokens=sink_chunk_tokens
+ )
+
+ def fn():
+ return llm_attrtor.calculate_ifr_multi_hop(prompt, target=target)
+
+ return fn
+
+ if lf == "ifr_multi_hop_both":
+ import ft_ifr_improve
+
+ llm_attrtor = ft_ifr_improve.LLMIFRAttributionBoth(
+ model, tokenizer, chunk_tokens=chunk_tokens, sink_chunk_tokens=sink_chunk_tokens
+ )
+
+ def fn():
+ return llm_attrtor.calculate_ifr_multi_hop_both(prompt, target=target)
+
+ return fn
+
+ if lf == "attnlrp":
+ llm_attrtor = llm_attr.LLMLRPAttribution(model, tokenizer)
+
+ def fn():
+ return llm_attrtor.calculate_attnlrp(prompt, target=target)
+
+ return fn
+
+ raise ValueError(f"Unsupported attr_func {attr_func}")
+
+
+def compute_batch_size(sequence_length: int, max_input_len: int) -> int:
+ denom = int(sequence_length)
+ return max(1, math.floor((max_input_len - 100) / max(1, denom)))
+
+
+def aggregate_results(rows: List[Dict[str, Any]]) -> List[Dict[str, Any]]:
+ grouped: Dict[Tuple[str, int, int], Dict[str, List[float]]] = defaultdict(lambda: {"time": [], "mem": []})
+ statuses: Dict[Tuple[str, int, int], List[str]] = defaultdict(list)
+ for row in rows:
+ key = (row["attr_func"], row["target_input_tokens"], row["target_output_tokens"])
+ statuses[key].append(row["status"])
+ if row.get("time_sec") is not None:
+ grouped[key]["time"].append(row["time_sec"])
+ if row.get("peak_mem_gb") is not None:
+ grouped[key]["mem"].append(row["peak_mem_gb"])
+
+ summary = []
+ for key, vals in grouped.items():
+ attr_func, input_tokens, output_tokens = key
+ total_tokens = input_tokens + output_tokens
+ times = vals["time"]
+ mems = vals["mem"]
+ summary.append(
+ {
+ "attr_func": attr_func,
+ "target_input_tokens": input_tokens,
+ "target_total_tokens": total_tokens,
+ "target_output_tokens": output_tokens,
+ "time_mean": np.mean(times) if times else None,
+ "time_std": np.std(times) if times else None,
+ "mem_mean": np.mean(mems) if mems else None,
+ "mem_std": np.std(mems) if mems else None,
+ "statuses": statuses[key],
+ }
+ )
+ return summary
+
+
+def append_jsonl_row(f, row: Dict[str, Any]) -> None:
+ f.write(json.dumps(row) + "\n")
+ f.flush()
+ try:
+ os.fsync(f.fileno())
+ except OSError:
+ pass
+
+
+def write_summary_csv(rows: List[Dict[str, Any]], out_dir: Path) -> Path:
+ summary = aggregate_results(rows)
+ summary_path = out_dir / "time_curve_summary.csv"
+ tmp_path = out_dir / "time_curve_summary.csv.tmp"
+
+ with tmp_path.open("w") as f:
+ f.write(
+ "attr_func,target_input_tokens,target_output_tokens,target_total_tokens,time_mean,time_std,peak_mem_mean,peak_mem_std,statuses\n"
+ )
+ for row in summary:
+ f.write(
+ "{},{},{},{},{},{},{},{},{}\n".format(
+ row["attr_func"],
+ row["target_input_tokens"],
+ row["target_output_tokens"],
+ row["target_total_tokens"],
+ "" if row["time_mean"] is None else f"{row['time_mean']:.4f}",
+ "" if row["time_std"] is None else f"{row['time_std']:.4f}",
+ "" if row["mem_mean"] is None else f"{row['mem_mean']:.4f}",
+ "" if row["mem_std"] is None else f"{row['mem_std']:.4f}",
+ "|".join(row["statuses"]),
+ )
+ )
+ f.flush()
+ try:
+ os.fsync(f.fileno())
+ except OSError:
+ pass
+
+ tmp_path.replace(summary_path)
+ return summary_path
+
+
+def main() -> None:
+ args = parse_args()
+ device = resolve_device(args.cuda, args.cuda_num)
+ attr_funcs = [a.strip() for a in args.attr_funcs.split(",") if a.strip()]
+ target_output_lengths = parse_csv_ints(args.output_lengths)
+ out_dir = Path(args.output_dir)
+ out_dir.mkdir(parents=True, exist_ok=True)
+
+ random.seed(42)
+ np.random.seed(42)
+ torch.manual_seed(42)
+
+ model_name = args.model if args.model_path is None else args.model_path
+ model, tokenizer = load_model_balanced(model_name, device)
+ device_indices = collect_device_indices(device, model)
+ max_ctx = getattr(getattr(model, "config", None), "max_position_embeddings", None)
+
+ base_text = load_ruler_base(Path(args.ruler_file), fallback="RULER fallback text. ")
+ target_base = args.target_text
+ all_rows: List[Dict[str, Any]] = []
+ runner = None
+ raised: Optional[BaseException] = None
+ jsonl_f = None
+ jsonl_path = out_dir / "time_curve_runs.jsonl"
+ summary_path = out_dir / "time_curve_summary.csv"
+
+ def record_row(row: Dict[str, Any]) -> None:
+ all_rows.append(row)
+ if jsonl_f is not None:
+ append_jsonl_row(jsonl_f, row)
+ write_summary_csv(all_rows, out_dir)
+
+ using_deprecated_total = args.total_lengths is not None
+ if using_deprecated_total:
+ target_total_lengths = parse_csv_ints(args.total_lengths)
+ length_grid: List[Tuple[int, int, int]] = []
+ for total_tokens in target_total_lengths:
+ for output_tokens in target_output_lengths:
+ length_grid.append((total_tokens - output_tokens, output_tokens, total_tokens))
+ else:
+ target_input_lengths = parse_csv_ints(args.input_lengths)
+ length_grid = []
+ for input_tokens in target_input_lengths:
+ for output_tokens in target_output_lengths:
+ length_grid.append((input_tokens, output_tokens, input_tokens + output_tokens))
+
+ try:
+ jsonl_f = jsonl_path.open("w")
+ write_summary_csv([], out_dir)
+
+ for input_tokens, output_tokens, total_tokens in length_grid:
+ if input_tokens <= 0:
+ for attr in attr_funcs:
+ for rep in range(args.repeats):
+ record_row(
+ {
+ "attr_func": attr,
+ "target_input_tokens": input_tokens,
+ "target_output_tokens": output_tokens,
+ "target_total_tokens": total_tokens,
+ "actual_input_tokens": None,
+ "actual_output_tokens": None,
+ "actual_total_tokens_raw": None,
+ "actual_user_prompt_tokens": None,
+ "actual_formatted_prompt_tokens": None,
+ "actual_generation_tokens": None,
+ "actual_total_tokens": None,
+ "status": "skipped_nonpositive_input",
+ "time_sec": None,
+ "peak_mem_gb": None,
+ "peak_mem_reserved_gb": None,
+ "repeat": rep,
+ "used_deprecated_total_lengths": using_deprecated_total,
+ }
+ )
+ continue
+
+ prompt, actual_input_len = build_prompt_to_length(tokenizer, base_text, input_tokens)
+ target, actual_output_len = build_output_to_length(tokenizer, target_base, output_tokens)
+ actual_total_tokens_raw = len(tokenizer(prompt + target, add_special_tokens=False).input_ids)
+ model_lens = estimate_model_lengths(tokenizer, prompt, target)
+
+ if max_ctx is not None and model_lens["total_tokens"] > max_ctx:
+ for attr in attr_funcs:
+ for rep in range(args.repeats):
+ record_row(
+ {
+ "attr_func": attr,
+ "target_input_tokens": input_tokens,
+ "target_output_tokens": output_tokens,
+ "target_total_tokens": total_tokens,
+ "actual_input_tokens": actual_input_len,
+ "actual_output_tokens": actual_output_len,
+ "actual_total_tokens_raw": actual_total_tokens_raw,
+ "actual_user_prompt_tokens": model_lens["user_prompt_tokens"],
+ "actual_formatted_prompt_tokens": model_lens["formatted_prompt_tokens"],
+ "actual_generation_tokens": model_lens["generation_tokens"],
+ "actual_total_tokens": model_lens["total_tokens"],
+ "status": "skipped_model_ctx",
+ "time_sec": None,
+ "peak_mem_gb": None,
+ "peak_mem_reserved_gb": None,
+ "repeat": rep,
+ "used_deprecated_total_lengths": using_deprecated_total,
+ }
+ )
+ continue
+
+ batch_size = compute_batch_size(model_lens["total_tokens"], max_input_len=max_ctx or 200000)
+
+ for attr in attr_funcs:
+ for rep in range(args.repeats):
+ runner = None
+ maybe_reset_cuda(device_indices)
+ try:
+ runner = make_attr_runner(
+ attr,
+ model=model,
+ tokenizer=tokenizer,
+ chunk_tokens=args.chunk_tokens,
+ sink_chunk_tokens=args.sink_chunk_tokens,
+ batch_size=batch_size,
+ prompt=prompt,
+ target=target,
+ )
+ except RuntimeError as e:
+ if "out of memory" in str(e).lower():
+ status = "oom"
+ if not args.catch_oom:
+ raise
+ else:
+ status = f"init_runtime_error: {e}"
+ if not args.catch_oom:
+ raise
+ wall = None
+ mem_alloc = None
+ mem_reserved = None
+ mem_by_device = {}
+ except Exception as e:
+ status = f"init_error: {e}"
+ if not args.catch_oom:
+ raise
+ wall = None
+ mem_alloc = None
+ mem_reserved = None
+ mem_by_device = {}
+ else:
+ status, wall, mem_alloc, mem_reserved, mem_by_device = measure(
+ runner, device_indices=device_indices, catch_oom=args.catch_oom
+ )
+ finally:
+ runner = None
+
+ record_row(
+ {
+ "attr_func": attr,
+ "target_input_tokens": input_tokens,
+ "target_output_tokens": output_tokens,
+ "target_total_tokens": total_tokens,
+ "actual_input_tokens": actual_input_len,
+ "actual_output_tokens": actual_output_len,
+ "actual_total_tokens_raw": actual_total_tokens_raw,
+ "actual_user_prompt_tokens": model_lens["user_prompt_tokens"],
+ "actual_formatted_prompt_tokens": model_lens["formatted_prompt_tokens"],
+ "actual_generation_tokens": model_lens["generation_tokens"],
+ "actual_total_tokens": model_lens["total_tokens"],
+ "status": status,
+ "time_sec": wall,
+ "peak_mem_gb": mem_reserved if mem_reserved is not None else mem_alloc,
+ "peak_mem_reserved_gb": mem_reserved,
+ "peak_mem_by_device_gb": mem_by_device if mem_by_device else None,
+ "repeat": rep,
+ "used_deprecated_total_lengths": using_deprecated_total,
+ }
+ )
+ except BaseException as e:
+ raised = e
+ finally:
+ runner = None
+ if jsonl_f is not None:
+ jsonl_f.close()
+ write_summary_csv(all_rows, out_dir)
+ print(f"Wrote per-run records to {jsonl_path}")
+ print(f"Wrote summary to {summary_path}")
+
+ if raised is not None:
+ raise raised
+
+
+if __name__ == "__main__":
+ main()
diff --git a/exp/exp2/DATASETS.md b/exp/exp2/DATASETS.md
new file mode 100644
index 0000000000000000000000000000000000000000..673b05b30ddb4d2a7aff868749a17722c06d8499
--- /dev/null
+++ b/exp/exp2/DATASETS.md
@@ -0,0 +1,231 @@
+# exp/exp2 dataset and sample-flow notes
+
+This file documents the datasets supported by Experiment 2, the sample structure, and how samples are handled in the "sampling" phase versus the "attribution" phase.
+
+## Supported datasets
+- `morehopqa` (`data/with_human_verification.json`)
+- RULER-family JSONL: `hotpotqa_long`, `niah_*`, `vt_*` (searched automatically under `data/ruler_multihop//.../validation.jsonl`), or any RULER JSONL path passed in directly
+- Other datasets (e.g. math) are explicitly skipped
+- The attribution phase prefers the cache file `exp/exp2/data/.jsonl`, falling back to the rules above; an existing JSONL path passed in is also loaded with the RULER structure
+
+### Common sample field definition
+```json
+{
+  "prompt": "<context + question>",
+  "target": "<answer or generation>",
+  "indices_to_explain": [start_tok, end_tok] | null,  // token-level: generation token span to explain (closed interval)
+  "attr_mask_indices": [...],  // legacy: needle sentence indices for coverage (unused by current exp2); may be null
+  "sink_span": [start, end] | null,  // answer segment within the generated tokens
+  "thinking_span": [start, end] | null,  // CoT segment within the generated tokens
+  "metadata": { ... }  // dataset-specific extra info
+}
+```
+- **`CachedExample`**: the unified in-memory structure in `dataset_utils.py`; its fields match the JSON above exactly. It is used in the sampling phase (loading raw data) and the attribution phase (loading the cache or raw data).
+- **Cache rows (JSONL)**: `sample_and_filter.py` writes one JSON object per line, matching `CachedExample` field for field.
+- **Sampling-phase pipeline (shared)**:
+  1. Load a raw dataset sample (`prompt`/`indices_to_explain` etc. are kept unchanged).
+  2. Call the generation model with a template requiring "thinking text + a trailing \\box{} answer".
+  3. If the generation does not match the format "thinking + a single \\box{} with no trailing text", the sample is discarded.
+  4. Extract the thinking segment and the text inside `\\box{}`; only the `\\box{}` content is passed to the judge model.
+  5. If the judge returns True, re-concatenate "thinking segment + answer text with the box wrapper removed" as `target`, and record `sink_span`/`thinking_span` from it.
+  6. Write to the cache, keeping only `reference_answer` and `judge_response` (optionally `boxed_answer`); `candidate_answer` is no longer stored.
+
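The format gate in step 3 can be sketched with a regex (illustrative only; the real check lives in `split_boxed_generation`, described below):

```python
import re

# Hypothetical re-implementation of the "thinking + single trailing \box{}" check.
BOXED_RE = re.compile(r"^(?P<thinking>.+?)\\box\{(?P<answer>[^{}]*)\}\s*$", re.DOTALL)


def split_generation(text):
    """Return (thinking, boxed answer) or None when the format is invalid:
    empty thinking, trailing text after the box, or more than one \box{}."""
    m = BOXED_RE.match(text)
    if m is None or not m.group("thinking").strip():
        return None
    if "\\box{" in m.group("thinking"):  # a second \box{} appeared earlier
        return None
    return m.group("thinking").rstrip(), m.group("answer")
```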
+### Generation splitting and span parsing
+- `split_boxed_generation` (`dataset_utils.py`) validates the format: the generation must be "non-empty thinking text + a single trailing \\box{}" with no characters after the box, otherwise the sample is skipped.
+- `target` is assembled as "thinking segment + newline + final answer text (box removed)".
+- `attach_spans_from_answer` uses the tokenizer's offset mapping to map the character span of the final answer within `target` to token-level indices, yielding `sink_span`; `thinking_span` is the closed interval from the beginning up to the token before `sink_span`. Both are token-level spans, satisfying the calling convention of the downstream multi-hop IFR.
+- At sampling/cache time, `indices_to_explain` is uniformly set to `sink_span` (the generation token span of the boxed content within `target`).
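The character-to-token mapping can be sketched as follows, assuming an HF-fast-tokenizer-style list of per-token `(start, end)` character offsets (the helper name is illustrative; the real implementation may differ):

```python
def char_span_to_token_span(offsets, char_start, char_end):
    """Map a character span [char_start, char_end) onto token indices using an
    offset mapping (one (start, end) character pair per token). Returns a
    closed token interval [first_tok, last_tok], or None if no token overlaps.
    Zero-width offsets (special tokens) are skipped."""
    toks = [i for i, (s, e) in enumerate(offsets)
            if e > char_start and s < char_end and e > s]
    if not toks:
        return None
    return [toks[0], toks[-1]]
```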
+
+---
+
+## MoreHopQA
+- **Raw sample structure (`MoreHopQAAttributionDataset` → `CachedExample`)**
+  ```json
+  {
+    "prompt": "\\n",
+    "target": null,
+    "indices_to_explain": null,
+    "attr_mask_indices": null,
+    "sink_span": null,
+    "thinking_span": null,
+    "metadata": {
+      "answer": "",
+      "_id": "",
+      "original_context": <original context structure>
+    }
+  }
+  ```
+  - Load paths: `DatasetLoader.load_raw("morehopqa")` produces `CachedExample`s in both the sampling phase and the attribution phase (when no cache exists).
+  - Note: exp2's token-level rows/recs require `target` plus a locatable answer token span, so it is recommended to run `sample_and_filter.py` to produce the cache before running attribution.
+
+- **Sampling phase (cache after generation & filtering)**
+  ```json
+  {
+    "prompt": "<unchanged>",
+    "target": "<generated CoT + final answer text, box wrapper removed>",
+    "indices_to_explain": [start_tok, end_tok],
+    "attr_mask_indices": null,
+    "sink_span": [start_tok, end_tok] | null,
+    "thinking_span": [start_tok, end_tok] | null,
+    "metadata": {
+      "answer": "",
+      "_id": "",
+      "original_context": <original context structure>,
+      "reference_answer": "",
+      "judge_response": "",
+      "boxed_answer": "<optional: boxed parse result>"
+    }
+  }
+  ```
+  - `sink_span`/`thinking_span` are only filled when `\\box{}` parses successfully; `target` is the trimmed "thinking + final answer text".
+  - Written to: `exp/exp2/data/morehopqa.jsonl`.
+
+- **Attribution phase (cache preferred)**
+  - Loading: `run_exp.py` first tries `load_cached` (JSONL → `CachedExample`), otherwise falls back to the raw structure and generates `target` online.
+  - Usage: faithfulness (token-level RISE/MAS) uses the cached `target` directly; `ifr_multi_hop` restricts to the answer/CoT when `sink_span`/`thinking_span` are present, otherwise treats the whole generation as the sink.
+
+---
+
+## RULER HotpotQA (`hotpotqa_long`)
+- **Raw sample structure (`RulerAttributionDataset` → `CachedExample`)**
+  ```json
+  {
+    "prompt": " + ",
+    "target": "",
+    "indices_to_explain": [0],
+    "attr_mask_indices": [<needle sentence indices>...] | null,
+    "sink_span": null,
+    "thinking_span": null,
+    "metadata": {
+      "dataset": "ruler",
+      "length": ,
+      "length_w_model_temp": ,
+      "outputs": [...],
+      "answer_prefix": "",
+      "token_position_answer": ,
+      "needle_spans": [
+        {
+          "title": "",
+          "doc_index": ,
+          "document_number": ,
+          "sentence_index": ,
+          "sentence": "",
+          "context_span": [start, end],
+          "span": [start, end],
+          "snippet": ""
+        },
+        ...
+      ],
+      "prompt_sentence_count": ,
+      "reference_answer": "<filled in the loader, from outputs or target>"
+    }
+  }
+  ```
+  - Load paths: `DatasetLoader.load_raw("hotpotqa_long")` produces `CachedExample`s in both the sampling phase and the attribution phase (when no cache exists).
+
+- **Sampling phase (cache after generation & filtering)**
+  ```json
+  {
+    "prompt": "<unchanged>",
+    "target": "<generated CoT + final answer text, box wrapper removed>",
+    "indices_to_explain": [-2],
+    "attr_mask_indices": [<needle sentence indices>...] | null,
+    "sink_span": [start_tok, end_tok] | null,
+    "thinking_span": [start_tok, end_tok] | null,
+    "metadata": {
+      "dataset": "ruler",
+      "length": ,
+      "length_w_model_temp": ,
+      "outputs": [...],
+      "answer_prefix": "",
+      "token_position_answer": ,
+      "needle_spans": [...],
+      "prompt_sentence_count": ,
+      "reference_answer": "",
+      "judge_response": "",
+      "boxed_answer": "<optional>"
+    }
+  }
+  ```
+  - `attr_mask_indices` keeps its original value; `indices_to_explain` is uniformly the trailing-sentence marker `[-2]` (the last non-EOS generated sentence); `sink_span`/`thinking_span` are only filled when `\\box{}` parses successfully; `target` is the trimmed "thinking + final answer text".
+  - Written to: `exp/exp2/data/hotpotqa_long.jsonl`.
+
+- **Attribution stage (cache preferred at load time)**
+  - Loading: prefers `load_cached` (JSONL → `CachedExample`); otherwise falls back to the raw parse.
+  - Usage: the coverage metric uses `attr_mask_indices`; faithfulness and `ifr_multi_hop` use the cached `sink_span`/`thinking_span` to locate the answer/CoT, falling back to the whole generation as the sink when these are missing.
+
+---
+
+## RULER NIAH / Variable Tracking (`niah_*`, `vt_*`)
+- **Raw sample structure (shared with RULER)**
+ ```json
+ {
+ "prompt": " + ",
+ "target": "",
+ "indices_to_explain": [0],
+    "attr_mask_indices": [<sentence indices>...] | null,
+ "sink_span": null,
+ "thinking_span": null,
+ "metadata": {
+ "dataset": "ruler",
+ "length": ,
+ "length_w_model_temp": ,
+ "outputs": [...],
+ "answer_prefix": "",
+ "token_position_answer": ,
+ "needle_spans": [...],
+ "prompt_sentence_count": ,
+      "reference_answer": "<filled in by the loader>"
+ }
+ }
+ ```
+  - When loaded: `DatasetLoader.load_raw("")` is used at the sampling stage and at the attribution stage (when no cache exists).
+
+- **Sampling stage (generate & filter, then write the cache)**
+ ```json
+ {
+ "prompt": "<ๅไธ>",
+    "target": "<thinking + final-answer text (with box), no other trailing content>",
+ "indices_to_explain": [start_tok, end_tok],
+    "attr_mask_indices": [<sentence indices>...] | null,
+ "sink_span": [start_tok, end_tok] | null,
+ "thinking_span": [start_tok, end_tok] | null,
+ "metadata": {
+ "dataset": "ruler",
+ "length": ,
+ "length_w_model_temp": ,
+ "outputs": [...],
+ "answer_prefix": "",
+ "token_position_answer": ,
+ "needle_spans": [...],
+ "prompt_sentence_count": ,
+ "reference_answer": "",
+ "judge_response": "",
+ "boxed_answer": "<ๅฏ้>"
+ }
+ }
+ ```
+  - The generation/judging flow is identical to `hotpotqa_long`; `target` is the trimmed "thinking + final-answer text".
+  - Written to: `exp/exp2/data/.jsonl` (e.g. `niah_mq_q2.jsonl`, `vt_h6_c1.jsonl`).
+
+- **Attribution stage (cache preferred at load time)**
+  - Same as `hotpotqa_long`: cache first, raw parse otherwise; the recovery metric (`recovery_ruler`) uses `metadata.needle_spans` (mapped onto prompt tokens); multi-hop IFR acts on the answer/CoT when `sink_span`/`thinking_span` are present.
+
+---
+
+## `indices_to_explain` convention
+- Token level: `indices_to_explain = [start_tok, end_tok]` (closed interval), in the coordinate system of the generation token indices of `tokenizer(target, add_special_tokens=False)`.
+- Recommended for exp2: `indices_to_explain == sink_span`, i.e. the token span of the boxed content (the final answer) within `target`.
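
The convention above can be made concrete with a small sketch (a hypothetical helper, not part of the repo; any Hugging Face fast tokenizer that supports `return_offsets_mapping=True` fits the `tokenizer` argument):

```python
from typing import List, Optional


def answer_token_span(target: str, answer: str, tokenizer) -> Optional[List[int]]:
    """Closed-interval [start_tok, end_tok] of the last occurrence of
    `answer` inside `target`, over tokenizer(target, add_special_tokens=False)."""
    start_char = target.rfind(answer)
    if start_char == -1:
        return None
    end_char = start_char + len(answer)
    enc = tokenizer(target, add_special_tokens=False, return_offsets_mapping=True)
    # Keep every token whose character range overlaps the answer span.
    toks = [i for i, (s, e) in enumerate(enc["offset_mapping"])
            if s < end_char and e > start_char]
    return [min(toks), max(toks)] if toks else None
```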
+
+---
+
+## Custom RULER JSONL paths
+- If `--dataset` is given an existing JSONL path, `dataset_from_name` parses it as a RULER file; fields and flow match the RULER series.
+- Sampling and attribution behave exactly as described for RULER above; only the file name comes from the explicit path.
+
+---
+
+## Attribution-stage loading precedence and metrics
+- `run_exp.py` load order: `exp/exp2/data/.jsonl` cache > explicitly given JSONL path > raw parse (MoreHopQA or RULER).
+- Recovery (`mode=recovery_ruler`) only supports RULER (requires `metadata.needle_spans`); anything else is rejected.
+- Faithfulness (`mode=faithfulness_gen`) uses the generated text; when `sink_span`/`thinking_span` are present, `ifr_multi_hop` targets the answer/CoT and does multi-hop, otherwise it degrades to the whole generation.
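
The load order above can be sketched as a small resolver (an illustrative helper with a hypothetical name, not the repo's actual code):

```python
from pathlib import Path
from typing import Optional, Tuple


def resolve_dataset(name: str, data_root: Path = Path("exp/exp2/data")) -> Tuple[str, Optional[Path]]:
    """Resolve a dataset argument following the load order:
    prepared cache > explicitly given JSONL path > raw parse."""
    cached = data_root / f"{name}.jsonl"
    if cached.exists():
        return "cache", cached
    p = Path(name)
    if p.suffix == ".jsonl" and p.exists():
        return "jsonl", p
    return "raw", None  # fall back to MoreHopQA / RULER raw parsing
```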
diff --git a/exp/exp2/README.md b/exp/exp2/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..e79945839761315fefb72c17f97b156b29e1d5e6
--- /dev/null
+++ b/exp/exp2/README.md
@@ -0,0 +1,106 @@
+# FlashTrace Experiment 2 (multi-step reasoning and generation faithfulness)
+
+This directory provides the tooling for the "11 datasets × 9 methods × 3 metrics" experiment (**AT2 skipped**, **math skipped**). The flow has two steps: first sample and filter high-quality CoT+boxed generations, then run attribution evaluation on the filtered results.
+
+Supported datasets: MoreHopQA, HotpotQA (RULER hotpotqa_long), RULER niah (`niah_*`), RULER variable tracking (`vt_*`). RULER paths are searched automatically under `data/ruler_multihop//.../validation.jsonl`.
+
+Main files:
+- `sample_and_filter.py`: sampling + consistency judging; writes to `exp/exp2/data/`
+- `run_exp.py`: attribution evaluation; writes to `exp/exp2/output/`
+- `dataset_utils.py`: data loading and answer-span parsing
+
+Datasets supported by the sampling script:
+- `morehopqa` (local `data/with_human_verification.json`)
+- `hotpotqa_long` (auto-searched at `data/ruler_multihop//hotpotqa_long/validation.jsonl`)
+- `niah_*` (RULER niah variants, auto-searched as above)
+- `vt_*` (RULER variable-tracking variants, auto-searched as above)
+- a RULER JSONL path passed directly (treated as the dataset name); other types are unsupported
+
+Attribution evaluation supports:
+- Datasets: prefers the `exp/exp2/data/.jsonl` cache; if absent, loads with the same parsing rules as sampling; math is explicitly rejected.
+- Metrics:
+  - `faithfulness_gen` (generation side): runs on any loaded sample (except math).
+  - `recovery_ruler` (recovery, RULER only): Recall@10%; scoring runs on prompt tokens only (gold comes from `needle_spans`).
+- Methods (`--attr_funcs`): `IG`, `perturbation_all`, `perturbation_CLP`, `perturbation_REAGENT`, `attention` (aggregates IG internally), `ifr_all_positions`, `ifr_multi_hop`, `attnlrp`, `ft_attnlrp`, `basic`. AT2 is not provided.
+
+---
+
+## Dataset sampling
+
+Implementation logic:
+- Unified data loading: `DatasetLoader` reads MoreHopQA / HotpotQA / RULER niah / RULER vt, and accepts a custom RULER JSONL path directly.
+- Generator model: `qwen3-235b-a22b-2507` (English system prompt), asked to "think briefly first, then wrap the final answer in `\box{}` with nothing appended afterwards"; the user prompt is the question itself, with no extra template.
+- Judge model: `deepseek-v3-1-terminus` (English system prompt); outputs only True/False on whether the `\box{}` content matches the reference answer.
+- Filtering: only samples of the form "thinking + trailing boxed answer" that are judged True are kept; `target` is rebuilt from the thinking segment plus the **final answer with the box wrapper stripped**, together with token-level `sink_span`/`thinking_span`, `reference_answer`, and `judge_response` (`candidate_answer` is no longer stored); `indices_to_explain` is unified to `sink_span` (the generation token span `[start_tok, end_tok]` of the boxed content within `target`).
+- Sampling walks the examples in original order; a failed judgment skips the sample immediately, and sampling stops early once `--max_examples` accepted samples have accumulated (fewer if the source runs out); tqdm shows attempted and accepted counts separately.
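
The sampling loop described above can be sketched as follows (an illustrative skeleton; `generate` and `judge` stand in for the API calls):

```python
def sample_until(examples, generate, judge, max_examples):
    """Walk examples in order, keep only judge-approved generations,
    and stop once max_examples have been accepted."""
    kept = []
    for ex in examples:
        gen = generate(ex)
        if gen is None or not judge(ex, gen):
            continue  # failed generation/judgment: skip immediately
        kept.append((ex, gen))
        if len(kept) >= max_examples:
            break  # reached the quota of accepted samples
    return kept
```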
+
+Usage:
+```bash
+export FLASHTRACE_API_KEY=<your-api-key>  # or OPENAI_API_KEY
+
+# Example: sample and keep at most 100 samples judged True
+python exp/exp2/sample_and_filter.py \
+  --dataset data/with_human_verification.json \
+  --max_examples 100 \
+  --api_key <your-api-key> \
+  --tokenizer_model /opt/share/models/Qwen/Qwen3-8B > exp/exp2/out.log
+```
+Common parameters:
+- `--dataset`: morehopqa | hotpotqa_long | niah_* | vt_*, or a JSONL path directly
+- `--max_examples`: number of accepted samples to keep (stops when reached; fewer if the source runs out)
+- `--tokenizer_model`: tokenizer used for span detection (defaults to reusing the generator model)
+- `--api_base`/`--api_key`: API endpoint and key (defaults to local http://localhost:4000/v1)
+- `--request_interval` / `--judge_interval`: throttling between generation/judge calls (default 1s)
+- `--rate_limit_delay`: seconds to wait on HTTP 429 (default 5s); sleeps automatically before retrying
+Output: `exp/exp2/data/.jsonl`
+
+---
+
+## Attribution evaluation
+
+Implementation logic:
+- Input: prefers `exp/exp2/data/.jsonl` (the filtered cache); falls back to raw dataset parsing if it does not exist.
+- Methods: faithfulness (token-level RISE/MAS) follows the logic of `evaluations/faithfulness.py` (AT2 not implemented); math is rejected automatically.
+- Multi-hop FlashTrace: if the cache carries `sink_span`/`thinking_span`, they drive multi-hop IFR; otherwise the whole answer is the default sink.
+- One run can evaluate several metrics at once: `--mode` accepts comma-separated values (e.g. `--mode faithfulness_gen,recovery_ruler` or `--mode faithfulness_gen, recovery_ruler`), attributing each batch of samples only once.
+- Optional per-sample traces: with `--save_hop_traces`, the attribution vectors and per-sample metrics are saved for **every method and every sample** under `exp/exp2/output/traces/...`; multi-hop methods additionally save the per-hop token-level vector `V_h` (field `vh`, i.e. the vector actually propagated across hops), and the manifest records settings such as `attnlrp_neg_handling`/`attnlrp_norm_mode`.
+- Known compatibility note: some tokenizers merge tokens at chat-template boundaries, which breaks locating the user prompt via token-id subsequences on the evaluation side; exp2 now reuses the `user_prompt_indices` computed at the attribution stage instead.
+- Batch-size estimate: keeps the reference script's conservative estimate `(max_input_len-100)/len(tokenizer(format_prompt(prompt)+target))` (at least 1). `max_input_len` comes from a built-in mapping keyed on the `--model` string (unmatched, or `--model_path` only, defaults to 2000); if you need a mapped value while running from a local path, also pass the matching `--model` name.
+- Timing: the attribution computation (recovery/faithfulness) is timed per sample; `Avg Sample Time (s)` is appended at the end of the CSV and the average is printed to the console.
+- Output: `exp/exp2/output/faithfulness/...`, `exp/exp2/output/recovery/...`, plus (optionally) `exp/exp2/output/traces/...`, organized by dataset and model.
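
The conservative batch-size estimate above can be sketched as follows (a simplified illustration with `format_prompt` folded into `prompt`; any tokenizer returning `input_ids` works):

```python
def estimate_batch_size(tokenizer, prompt: str, target: str, max_input_len: int = 2000) -> int:
    """How many sequences of this length fit under max_input_len,
    with a 100-token safety margin; never below 1."""
    seq_len = len(tokenizer(prompt + target)["input_ids"])
    return max(1, (max_input_len - 100) // seq_len)
```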
+
+Usage:
+```bash
+# Generation-side RISE/MAS faithfulness; full method list: perturbation_all_fast,perturbation_CLP_fast,perturbation_REAGENT_fast,ifr_multi_hop_stop_words,ifr_multi_hop_both,ifr_multi_hop_split_hop,ft_attnlrp,ifr_multi_hop,attnlrp,ifr_all_positions,perturbation_all,perturbation_REAGENT,perturbation_CLP,IG,attention
+python exp/exp2/run_exp.py \
+ --datasets exp/exp2/data/math.jsonl \
+ --attr_funcs IG,attention \
+ --model qwen-8B \
+ --model_path /opt/share/models/Qwen/Qwen3-8B/ \
+ --cuda 2,3,4,5,6,7 \
+ --num_examples 100 \
+ --mode faithfulness_gen \
+ --n_hops 1 \
+ --save_hop_traces \
+&& python exp/exp2/run_exp.py \
+ --datasets exp/exp2/data/morehopqa.jsonl \
+ --attr_funcs IG,attention \
+ --model qwen-8B \
+ --model_path /opt/share/models/Qwen/Qwen3-8B/ \
+ --cuda 2,3,4,5,6,7 \
+ --num_examples 100 \
+ --mode faithfulness_gen \
+ --n_hops 1 \
+ --save_hop_traces
+
+ # --attnlrp_neg_handling drop \
+ # --attnlrp_norm_mode norm
+```
+Common parameters:
+- `--datasets`: comma-separated dataset names; if `exp/exp2/data/.jsonl` already exists it is used directly.
+- `--attr_funcs`: comma-separated methods (no AT2); `ifr_multi_hop` and `ft_attnlrp` support multi-hop, controlled by `--n_hops`.
+- `--attnlrp_neg_handling`: per-hop negative-value handling for FT-AttnLRP (`drop`/`abs`).
+- `--attnlrp_norm_mode`: FT-AttnLRP normalization and hop-ratio switch (`norm`/`no_norm`).
+- `--data_root`/`--output_root`: cache and result directories (default `exp/exp2/data` / `exp/exp2/output`).
+- `--mode`: `faithfulness_gen`, `recovery_ruler` (multiple comma-separated values allowed; one attribution pass emits several metrics); `--num_examples` controls how many samples are evaluated. math is rejected.
+- `--save_hop_traces`: save per-sample traces to `exp/exp2/output/traces////` (per sample: `ex_*.npz` + `manifest.jsonl`).
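
For reference, the saved `ex_*.npz` traces are ordinary NumPy archives; a minimal round-trip sketch (field names such as `vh` follow the description above, but the array shapes here are made up):

```python
import io

import numpy as np

# Illustrative round-trip of a trace file: multi-hop methods store the
# per-hop token-level vector under "vh"; an in-memory buffer stands in
# for a real exp/exp2/output/traces/.../ex_*.npz file.
buf = io.BytesIO()
np.savez(buf, vh=np.zeros((2, 5)), score=np.arange(5.0))
buf.seek(0)
trace = np.load(buf)
print(sorted(trace.files))  # ['score', 'vh']
```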
diff --git a/exp/exp2/dataset_utils.py b/exp/exp2/dataset_utils.py
new file mode 100644
index 0000000000000000000000000000000000000000..3590c8a50b981dde4a653b62cb56d59c7bb197a1
--- /dev/null
+++ b/exp/exp2/dataset_utils.py
@@ -0,0 +1,386 @@
+"""Dataset helpers for Experiment 2 (CoT / multi-hop faithfulness).
+
+Named dataset_utils to avoid collision with the HF `datasets` package.
+"""
+
+from __future__ import annotations
+
+import json
+import random
+import re
+from dataclasses import dataclass
+from pathlib import Path
+from typing import Any, Dict, Iterable, List, Optional
+
+from attribution_datasets import (
+ AttributionExample,
+ MoreHopQAAttributionDataset,
+ RulerAttributionDataset,
+)
+
+
+@dataclass
+class CachedExample:
+ prompt: str
+ target: Optional[str]
+ indices_to_explain: Optional[List[int]]
+ attr_mask_indices: Optional[List[int]]
+ sink_span: Optional[List[int]]
+ thinking_span: Optional[List[int]]
+ metadata: Dict[str, Any]
+
+
+def read_cached_jsonl(path: Path) -> List[CachedExample]:
+ examples: List[CachedExample] = []
+ with path.open("r", encoding="utf-8") as f:
+ for line in f:
+ if not line.strip():
+ continue
+ obj = json.loads(line)
+ examples.append(
+ CachedExample(
+ prompt=obj["prompt"],
+ target=obj.get("target"),
+ indices_to_explain=obj.get("indices_to_explain"),
+ attr_mask_indices=obj.get("attr_mask_indices"),
+ sink_span=obj.get("sink_span"),
+ thinking_span=obj.get("thinking_span"),
+ metadata=obj.get("metadata", {}),
+ )
+ )
+ return examples
+
+
+def load_cached(path: Path, sample: Optional[int] = None, seed: int = 42) -> List[CachedExample]:
+ ex = read_cached_jsonl(path)
+ if sample is not None and sample < len(ex):
+ random.Random(seed).shuffle(ex)
+ ex = ex[:sample]
+ return ex
+
+
+def load_ruler(path: Path, sample: Optional[int] = None, seed: int = 42) -> List[CachedExample]:
+ ds = RulerAttributionDataset(path)
+ examples: List[CachedExample] = []
+ ex_iter: Iterable[AttributionExample] = ds
+ if sample is not None and sample < len(ds):
+ ex_iter = list(ds)
+ random.Random(seed).shuffle(ex_iter)
+ ex_iter = ex_iter[:sample]
+ for ex in ex_iter:
+ examples.append(
+ CachedExample(
+ prompt=ex.prompt,
+ target=ex.target,
+ indices_to_explain=ex.indices_to_explain,
+ attr_mask_indices=ex.attr_mask_indices,
+ sink_span=None,
+ thinking_span=None,
+ metadata=ex.metadata,
+ )
+ )
+ return examples
+
+
+def load_morehopqa(
+ path: str | Path = "./data/with_human_verification.json", sample: Optional[int] = None, seed: int = 42
+) -> List[CachedExample]:
+ ds = MoreHopQAAttributionDataset(path)
+ ex_iter: Iterable[AttributionExample] = ds
+ if sample is not None and sample < len(ds):
+ ex_iter = list(ds)
+ random.Random(seed).shuffle(ex_iter)
+ ex_iter = ex_iter[:sample]
+ examples: List[CachedExample] = []
+ for ex in ex_iter:
+ examples.append(
+ CachedExample(
+ prompt=ex.prompt,
+ target=None,
+ indices_to_explain=ex.indices_to_explain,
+ attr_mask_indices=ex.attr_mask_indices,
+ sink_span=None,
+ thinking_span=None,
+ metadata=ex.metadata,
+ )
+ )
+ return examples
+
+
+def auto_find_ruler(task: str) -> Optional[Path]:
+ length_dirs = ["4096", "8192", "16384", "32768", "65536", "131072"]
+ base = Path("data/ruler_multihop")
+ for ld in length_dirs:
+ cand = base / ld / task / "validation.jsonl"
+ if cand.exists():
+ return cand
+ return None
+
+
+def dataset_from_name(name: str) -> Optional[Path]:
+ if name == "hotpotqa_long":
+ return auto_find_ruler("hotpotqa_long")
+ if name.startswith("vt_"):
+ return auto_find_ruler(name)
+ if name.startswith("niah"):
+ return auto_find_ruler(name)
+ p = Path(name)
+ if p.exists():
+ return p
+ return None
+
+
+# Accept both ASCII and fullwidth braces around the boxed answer.
+_BOX_PATTERN = re.compile(r"\\box(?:ed)?\s*[\{｛](.*?)[\}｝]", flags=re.DOTALL)
+
+
+def _find_box_span(text: str) -> Optional[tuple[int, int, str]]:
+ """Return (start_char, end_char, answer_text) for the last \\boxed block."""
+ matches = list(_BOX_PATTERN.finditer(text))
+ if not matches:
+ return None
+ m = matches[-1]
+ return m.start(0), m.end(0), m.group(1).strip()
+
+
+def extract_boxed_answer(text: str) -> Optional[str]:
+ """Extract the answer string inside the last \\boxed{} block."""
+ match = _find_box_span(text)
+ return match[2] if match else None
+
+
+def _find_answer_span(text: str, answer: str) -> Optional[tuple[int, int]]:
+ """Return (start_char, end_char) for the last occurrence of `answer` in text."""
+ if not answer or not text:
+ return None
+ start = text.rfind(answer)
+ if start == -1:
+ return None
+ return start, start + len(answer)
+
+
+def split_boxed_generation(text: str) -> Optional[tuple[str, str, str]]:
+ """Return (thinking_text, boxed_segment, boxed_answer) if format matches."""
+ if not text:
+ return None
+ match = _find_box_span(text)
+ if not match:
+ return None
+
+ start_char, end_char, boxed_inner = match
+ boxed_segment = text[start_char:end_char].strip()
+ thinking_text = text[:start_char].strip()
+ trailing = text[end_char:].strip()
+
+ if not boxed_inner or not boxed_segment:
+ return None
+ if trailing:
+ return None
+ if not thinking_text:
+ return None
+
+ return thinking_text, boxed_segment, boxed_inner
+
+
+def attach_spans_from_answer(
+ example: CachedExample, tokenizer, answer_text: Optional[str] = None
+) -> CachedExample:
+ """Attach sink/thinking spans by locating the (plain) answer in `target`.
+
+ `answer_text` should be the extracted boxed answer; falls back to metadata or
+ parsing the target when omitted. Works even when the target no longer keeps
+ the \\box{} wrapper.
+ """
+ tgt = example.target or ""
+ answer = (answer_text or "").strip()
+ if not answer:
+ answer = (example.metadata.get("boxed_answer") or extract_boxed_answer(tgt) or "").strip()
+
+ metadata = dict(example.metadata)
+ if answer:
+ metadata.setdefault("boxed_answer", answer)
+
+ if tokenizer is None or not tgt or not answer:
+ return CachedExample(
+ prompt=example.prompt,
+ target=example.target,
+ indices_to_explain=example.indices_to_explain,
+ attr_mask_indices=example.attr_mask_indices,
+ sink_span=example.sink_span,
+ thinking_span=example.thinking_span,
+ metadata=metadata,
+ )
+
+ span = _find_answer_span(tgt, answer)
+ if span is None:
+ return CachedExample(
+ prompt=example.prompt,
+ target=example.target,
+ indices_to_explain=example.indices_to_explain,
+ attr_mask_indices=example.attr_mask_indices,
+ sink_span=example.sink_span,
+ thinking_span=example.thinking_span,
+ metadata=metadata,
+ )
+
+ span_start_char, span_end_char = span
+ gen_ids = tokenizer(tgt, add_special_tokens=False, return_offsets_mapping=True)
+ sink_tokens: List[int] = []
+ for idx, (s, e) in enumerate(gen_ids["offset_mapping"]):
+ # include tokens that overlap the answer span
+ if s < span_end_char and e > span_start_char:
+ sink_tokens.append(idx)
+ if not sink_tokens:
+ return CachedExample(
+ prompt=example.prompt,
+ target=example.target,
+ indices_to_explain=example.indices_to_explain,
+ attr_mask_indices=example.attr_mask_indices,
+ sink_span=example.sink_span,
+ thinking_span=example.thinking_span,
+ metadata=metadata,
+ )
+
+    sink_span = [min(sink_tokens), max(sink_tokens)]
+    # The thinking span covers everything before the sink; when the sink
+    # starts at token 0 this degenerates to [0, 0], which callers may drop.
+    thinking_end = max(0, sink_span[0] - 1)
+    thinking_span = [0, thinking_end]
+
+ return CachedExample(
+ prompt=example.prompt,
+ target=example.target,
+ indices_to_explain=example.indices_to_explain,
+ attr_mask_indices=example.attr_mask_indices,
+ sink_span=example.sink_span or sink_span,
+ thinking_span=example.thinking_span or thinking_span,
+ metadata=metadata,
+ )
+
+
+def attach_spans_from_boxed(example: CachedExample, tokenizer) -> CachedExample:
+ """Backward-compatible wrapper that first looks for \\box{} then falls back to answer text."""
+ tgt = example.target
+ match = _find_box_span(tgt) if tgt else None
+ boxed_answer = match[2] if match else None
+ return attach_spans_from_answer(example, tokenizer, boxed_answer)
+
+
+def ruler_gold_prompt_token_indices(example: CachedExample, tokenizer) -> List[int]:
+ """Return token indices (prompt-side) that overlap RULER `needle_spans` in metadata.
+
+ The returned indices are with respect to `tokenizer(" " + example.prompt, add_special_tokens=False)`,
+ matching the attribution pipeline's leading-space convention.
+ """
+ needle_spans = (example.metadata or {}).get("needle_spans") or []
+ if not isinstance(needle_spans, list) or not needle_spans:
+ return []
+
+ prompt_text = " " + (example.prompt or "")
+ enc = tokenizer(prompt_text, add_special_tokens=False, return_offsets_mapping=True)
+ offsets = enc.get("offset_mapping")
+ if offsets is None:
+ raise ValueError("Tokenizer does not provide offset_mapping; cannot map needle_spans to tokens.")
+
+ spans: List[tuple[int, int]] = []
+ for item in needle_spans:
+ if not isinstance(item, dict):
+ continue
+ raw = item.get("span")
+ if not (isinstance(raw, list) and len(raw) == 2):
+ continue
+ try:
+ start = int(raw[0]) + 1 # shift for leading space in prompt_text
+ end = int(raw[1]) + 1
+ except Exception:
+ continue
+ if end > start:
+ spans.append((start, end))
+
+ if not spans:
+ return []
+
+ gold: set[int] = set()
+ for tok_idx, off in enumerate(offsets):
+ if off is None:
+ continue
+ try:
+ s, e = int(off[0]), int(off[1])
+ except Exception:
+ continue
+ if e <= s:
+ continue
+ for span_start, span_end in spans:
+ if s < span_end and e > span_start:
+ gold.add(tok_idx)
+ break
+
+ return sorted(gold)
+
+
+class DatasetLoader:
+ """Thin loader that resolves and samples datasets for exp2."""
+
+ def __init__(self, seed: int = 42, data_root: Path | str = Path("exp/exp2/data")) -> None:
+ self.seed = seed
+ self.data_root = Path(data_root)
+
+ def _sample(self, items: List[CachedExample], sample: Optional[int]) -> List[CachedExample]:
+ if sample is not None and sample < len(items):
+ rnd = random.Random(self.seed)
+ rnd.shuffle(items)
+ items = items[:sample]
+ return items
+
+ def _cached_path(self, name: str) -> Optional[Path]:
+ path = self.data_root / f"{name}.jsonl"
+ return path if path.exists() else None
+
+ def load(self, name: str, sample: Optional[int] = None) -> List[CachedExample]:
+ # 1) Prefer prepared cache under exp/exp2/data
+ cached_path = self._cached_path(name)
+ if cached_path:
+ return self._sample(load_cached(cached_path), sample)
+
+ return self.load_raw(name, sample=sample)
+
+ def load_raw(self, name: str, sample: Optional[int] = None) -> List[CachedExample]:
+ def _looks_like_json_array(path: Path) -> bool:
+ try:
+ with path.open("r", encoding="utf-8") as f:
+ while True:
+ ch = f.read(1)
+ if not ch:
+ return False
+ if ch.isspace():
+ continue
+ return ch == "["
+ except OSError:
+ return False
+
+ # MoreHopQA
+ if name == "morehopqa":
+ ex = load_morehopqa()
+ for item in ex:
+ if "answer" in item.metadata:
+ item.metadata.setdefault("reference_answer", item.metadata["answer"])
+ return self._sample(ex, sample)
+
+ # Allow passing the raw MoreHopQA JSON path directly.
+ p = Path(name)
+ if p.exists() and _looks_like_json_array(p):
+ ex = load_morehopqa(p)
+ for item in ex:
+ if "answer" in item.metadata:
+ item.metadata.setdefault("reference_answer", item.metadata["answer"])
+ return self._sample(ex, sample)
+
+ # RULER / HotpotQA / niah / vt (all go through RulerAttributionDataset)
+ resolved = dataset_from_name(name)
+ if resolved is None:
+ raise FileNotFoundError(f"Could not resolve dataset {name}")
+ ex = load_ruler(resolved)
+ for item in ex:
+ outputs = item.metadata.get("outputs") or []
+ if outputs:
+ item.metadata.setdefault("reference_answer", ", ".join(outputs))
+ if item.target and "reference_answer" not in item.metadata:
+ item.metadata["reference_answer"] = item.target
+ return self._sample(ex, sample)
diff --git a/exp/exp2/map_math_mine_to_exp2_cache.py b/exp/exp2/map_math_mine_to_exp2_cache.py
new file mode 100644
index 0000000000000000000000000000000000000000..7266708c1007149d2194d52f2ff420321a020ec8
--- /dev/null
+++ b/exp/exp2/map_math_mine_to_exp2_cache.py
@@ -0,0 +1,584 @@
+#!/usr/bin/env python3
+"""Prepare data/math_mine.json into an exp2 cached JSONL dataset.
+
+This script supports two modes:
+
+- map (offline): convert GSM8K-style math examples:
+
+ {"question": "...", "answer": "... #### 18"}
+
+ into exp2's cached JSONL format (one JSON object per line).
+
+- resample (online): resample targets like exp/exp2/sample_and_filter.py:
+ call a chat completion API to generate " + final \\box{} answer",
+ judge the boxed answer against the reference answer extracted from the raw
+ GSM8K-style entry, and write only judge=True samples.
+
+In both modes, exp2 expects token-level spans (NOT character spans):
+
+ - indices_to_explain: [start_tok, end_tok] (generation-token indices, closed interval)
+ - sink_span/thinking_span: token spans over tokenizer(target, add_special_tokens=False)
+"""
+
+from __future__ import annotations
+
+import argparse
+import json
+import os
+import sys
+import time
+import urllib.error
+import urllib.request
+from dataclasses import asdict
+from pathlib import Path
+from typing import Any, Dict, List, Optional, Tuple
+
+from transformers import AutoTokenizer
+from tqdm import tqdm
+
+REPO_ROOT = Path(__file__).resolve().parents[2]
+if str(REPO_ROOT) not in sys.path:
+ sys.path.insert(0, str(REPO_ROOT))
+
+from exp.exp2.dataset_utils import CachedExample, attach_spans_from_answer, split_boxed_generation # noqa: E402
+
+
+class RateLimitError(RuntimeError):
+ """Raised when API returns 429; carries a suggested wait time."""
+
+ def __init__(self, wait_seconds: float, detail: str) -> None:
+ super().__init__(detail)
+ self.wait_seconds = wait_seconds
+
+
+GEN_SYSTEM_PROMPT = (
+ "You are a reasoning assistant. "
+ "Before answering, engage in an chain of thought. "
+ "Process this freely and naturally without using specific headers or strict formatting. "
+ "When you reach the conclusion, wrap the entire final sentence containing the answer inside \\box{}. "
+ "Ensure the box wraps the **sentence** that naturally delivers the answer. DO NOT rewrite the answer word for the box separately."
+)
+
+JUDGE_SYSTEM_PROMPT = (
+ "You verify whether the model's boxed answer matches the reference answer. "
+ "Reply strictly with True or False and nothing else."
+)
+
+
+def call_chat_api(
+ api_base: str,
+ api_key: str,
+ model: str,
+ messages: List[Dict[str, str]],
+ *,
+ timeout: int,
+ max_tokens: int,
+ temperature: float,
+ cache_ttl: int,
+ cache_namespace: Optional[str],
+ rate_limit_delay: Optional[float] = None,
+) -> str:
+ """Minimal OpenAI-compatible chat.completions client (no external deps)."""
+ url = api_base.rstrip("/") + "/chat/completions"
+ payload: Dict[str, Any] = {
+ "model": model,
+ "messages": messages,
+ "max_tokens": max_tokens,
+ "temperature": temperature,
+ }
+ if cache_ttl > 0:
+ cache_obj: Dict[str, Any] = {"ttl": cache_ttl}
+ if cache_namespace:
+ cache_obj["namespace"] = cache_namespace
+ payload["cache"] = cache_obj
+
+ data = json.dumps(payload).encode("utf-8")
+ headers = {"Content-Type": "application/json"}
+ if api_key:
+ headers["Authorization"] = f"Bearer {api_key}"
+
+ req = urllib.request.Request(url, data=data, headers=headers, method="POST")
+ opener = urllib.request.build_opener(urllib.request.ProxyHandler({}))
+ try:
+ with opener.open(req, timeout=timeout) as resp:
+ resp_bytes = resp.read()
+ except urllib.error.HTTPError as e:
+ detail = e.read().decode("utf-8", errors="ignore") if hasattr(e, "read") else ""
+ if e.code == 429:
+ retry_after = None
+ if hasattr(e, "headers") and e.headers:
+ retry_after_header = e.headers.get("Retry-After")
+ if retry_after_header:
+ try:
+ retry_after = float(retry_after_header)
+ except ValueError:
+ retry_after = None
+ wait = retry_after or rate_limit_delay or 5.0
+ raise RateLimitError(wait, f"API HTTP 429: {detail}") from e
+ raise RuntimeError(f"API HTTP error {e.code}: {detail}") from e
+ except urllib.error.URLError as e:
+ raise RuntimeError(f"API request failed: {e}") from e
+
+ try:
+ response = json.loads(resp_bytes.decode("utf-8"))
+ except json.JSONDecodeError as e:
+ raise RuntimeError(f"Failed to decode API response: {resp_bytes!r}") from e
+
+ choices = response.get("choices", [])
+ if not choices:
+ raise RuntimeError(f"Empty choices from API: {response}")
+ content = choices[0].get("message", {}).get("content", "")
+ if not content:
+ raise RuntimeError(f"Empty content from API: {response}")
+ return content.strip()
+
+
+def build_gen_messages(prompt: str) -> List[Dict[str, str]]:
+ return [
+ {"role": "system", "content": GEN_SYSTEM_PROMPT},
+ {"role": "user", "content": prompt},
+ ]
+
+
+def build_judge_messages(reference_answer: str, candidate_answer: str) -> List[Dict[str, str]]:
+ user = (
+ "Decide if the model's boxed answer matches the reference answer.\n"
+ f"Reference answer: {reference_answer}\n"
+ f"Model boxed answer (only the content inside \\box{{}}): {candidate_answer}\n"
+ "Output only True if they are semantically consistent; otherwise output False."
+ )
+ return [
+ {"role": "system", "content": JUDGE_SYSTEM_PROMPT},
+ {"role": "user", "content": user},
+ ]
+
+
+def parse_bool(text: str) -> bool:
+ first = text.strip().splitlines()[0].strip().lower()
+ if first in {"true", "yes"}:
+ return True
+ if first in {"false", "no"}:
+ return False
+ # fallback: check substring
+ if "true" in first and "false" not in first:
+ return True
+ if "false" in first:
+ return False
+ raise ValueError(f"Cannot parse boolean from: {text!r}")
+
+
+def _load_tokenizer(tokenizer_model: str):
+ tok_path = Path(tokenizer_model)
+ if tok_path.exists():
+ tokenizer = AutoTokenizer.from_pretrained(tok_path.as_posix(), local_files_only=True)
+ else:
+ tokenizer = AutoTokenizer.from_pretrained(tokenizer_model)
+ if tokenizer.pad_token is None and tokenizer.eos_token is not None:
+ tokenizer.pad_token = tokenizer.eos_token
+ return tokenizer
+
+
+def _split_gsm8k_answer(answer: str) -> Optional[Tuple[str, str]]:
+ """Return (thinking_text, final_answer) parsed from GSM8K `answer`."""
+ text = (answer or "").strip()
+ if not text:
+ return None
+ if "####" not in text:
+ return None
+ thinking, final = text.rsplit("####", 1)
+ thinking = thinking.strip()
+ final = final.strip()
+ if not final:
+ return None
+ return thinking, final
+
+
+def _is_token_span(span: Any) -> bool:
+ return isinstance(span, list) and len(span) == 2 and all(isinstance(x, int) for x in span)
+
+
+def _build_cached_example(
+ *,
+ question: str,
+ answer: str,
+ tokenizer,
+ example_idx: int,
+ source_path: str,
+) -> Optional[CachedExample]:
+ parsed = _split_gsm8k_answer(answer)
+ if parsed is None:
+ return None
+ thinking_text, final_answer = parsed
+
+ prompt = question.strip()
+ target = f"{thinking_text}\n{final_answer}" if thinking_text else final_answer
+
+ example = CachedExample(
+ prompt=prompt,
+ target=target,
+ indices_to_explain=None,
+ attr_mask_indices=None,
+ sink_span=None,
+ thinking_span=None,
+ metadata={
+ "dataset": "math_mine",
+ "source_path": source_path,
+ "example_idx": int(example_idx),
+ "raw_question": question,
+ "raw_answer": answer,
+ "reference_answer": final_answer,
+ "boxed_answer": final_answer,
+ },
+ )
+ example = attach_spans_from_answer(example, tokenizer, final_answer)
+ if not _is_token_span(example.sink_span):
+ return None
+
+ # exp2 requires token-level indices_to_explain=[start_tok,end_tok] (closed interval).
+ indices_to_explain = list(example.sink_span)
+ thinking_span = example.thinking_span
+ if thinking_span is not None and _is_token_span(thinking_span) and indices_to_explain[0] == 0:
+ # No room for "thinking" tokens; avoid overlapping spans.
+ thinking_span = None
+
+ return CachedExample(
+ prompt=example.prompt,
+ target=example.target,
+ indices_to_explain=indices_to_explain,
+ attr_mask_indices=example.attr_mask_indices,
+ sink_span=indices_to_explain,
+ thinking_span=thinking_span,
+ metadata=example.metadata,
+ )
+
+
+def _build_resampled_example(
+ *,
+ question: str,
+ raw_answer: str,
+ reference_answer: str,
+ generation: str,
+ tokenizer,
+ example_idx: int,
+ source_path: str,
+ judge_response: str,
+ generator_model: str,
+ judge_model: str,
+) -> Optional[CachedExample]:
+ parsed = split_boxed_generation(generation)
+ if not parsed:
+ return None
+
+ thinking_text, _boxed_segment, boxed_answer = parsed
+ target_text = f"{thinking_text}\n{boxed_answer}" if thinking_text else boxed_answer
+
+ example = CachedExample(
+ prompt=question.strip(),
+ target=target_text,
+ indices_to_explain=None,
+ attr_mask_indices=None,
+ sink_span=None,
+ thinking_span=None,
+ metadata={
+ "dataset": "math_mine",
+ "source_path": source_path,
+ "example_idx": int(example_idx),
+ "raw_question": question,
+ "raw_answer": raw_answer,
+ "reference_answer": reference_answer,
+ "judge_response": judge_response,
+ "generator_model": generator_model,
+ "judge_model": judge_model,
+ },
+ )
+ example = attach_spans_from_answer(example, tokenizer, boxed_answer)
+ if not _is_token_span(example.sink_span):
+ return None
+
+ indices_to_explain = list(example.sink_span)
+ return CachedExample(
+ prompt=example.prompt,
+ target=example.target,
+ indices_to_explain=indices_to_explain,
+ attr_mask_indices=example.attr_mask_indices,
+ sink_span=indices_to_explain,
+ thinking_span=example.thinking_span,
+ metadata=example.metadata,
+ )
+
+
+def _write_jsonl(path: Path, *, examples) -> int:
+ path.parent.mkdir(parents=True, exist_ok=True)
+ count = 0
+ with path.open("w", encoding="utf-8") as f:
+ for ex in examples:
+ f.write(json.dumps(asdict(ex), ensure_ascii=False) + "\n")
+ count += 1
+ return count
+
+
+def main() -> None:
+ ap = argparse.ArgumentParser("Prepare data/math_mine.json for exp2 cached JSONL.")
+ ap.add_argument("--in_json", type=str, default="data/math_mine.json")
+ ap.add_argument("--out_jsonl", type=str, default="exp/exp2/data/math.jsonl")
+ ap.add_argument(
+ "--tokenizer_model",
+ type=str,
+ required=True,
+ help="Tokenizer name or local path; must match the tokenizer used in exp2 attribution.",
+ )
+ ap.add_argument(
+ "--mode",
+ type=str,
+ choices=["map", "resample"],
+ default="map",
+ help="map=offline mapping from GSM8K answers; resample=generate+judge like exp/exp2/sample_and_filter.py.",
+ )
+
+ # Resample (online) options (kept compatible with exp/exp2/sample_and_filter.py).
+ ap.add_argument("--max_examples", type=int, default=100, help="Number of judge=True examples to keep (resample mode).")
+ ap.add_argument("--seed", type=int, default=42, help="Shuffle seed (only used with --shuffle).")
+ ap.add_argument("--shuffle", action="store_true", help="Shuffle examples before attempting (resample mode).")
+ ap.add_argument("--api_base", type=str, default="http://localhost:4000/v1", help="Chat API base URL.")
+ ap.add_argument("--api_key", type=str, default=None, help="API key; defaults to FLASHTRACE_API_KEY/OPENAI_API_KEY.")
+ ap.add_argument("--generator_model", type=str, default="qwen3-235b-a22b-2507")
+ ap.add_argument("--judge_model", type=str, default="deepseek-v3-1-terminus")
+ ap.add_argument("--api_timeout", type=int, default=300)
+ ap.add_argument("--api_max_tokens", type=int, default=8192)
+ ap.add_argument("--api_temperature", type=float, default=0.0)
+ ap.add_argument("--api_cache_ttl", type=int, default=600)
+ ap.add_argument("--api_cache_namespace", type=str, default="flashtrace-exp2")
+ ap.add_argument("--retry_delay", type=float, default=2.0)
+ ap.add_argument("--retries", type=int, default=2, help="Additional retries on API failure.")
+ ap.add_argument("--request_interval", type=float, default=1.0, help="Sleep seconds between generation calls.")
+ ap.add_argument("--judge_interval", type=float, default=1.0, help="Sleep seconds between judge calls.")
+ ap.add_argument("--rate_limit_delay", type=float, default=5.0, help="Seconds to wait on HTTP 429 before retrying.")
+ args = ap.parse_args()
+
+ in_path = Path(args.in_json)
+ out_path = Path(args.out_jsonl)
+ tokenizer = _load_tokenizer(args.tokenizer_model)
+
+ raw = json.loads(in_path.read_text(encoding="utf-8"))
+ if not isinstance(raw, list):
+ raise SystemExit(f"Expected a JSON array in {in_path}, got {type(raw).__name__}.")
+
+ source_total = len(raw)
+ total = 0
+ kept = 0
+ skipped_empty_q = 0
+ skipped_empty_a = 0
+ skipped_parse = 0
+ skipped_span = 0
+
+ examples = []
+ if args.mode == "map":
+ attempted = None
+ skipped_format = None
+ judged_false = None
+ for idx, item in enumerate(raw):
+ total += 1
+ if not isinstance(item, dict):
+ skipped_parse += 1
+ continue
+
+ question = str(item.get("question") or "")
+ answer = str(item.get("answer") or "")
+ if not question.strip():
+ skipped_empty_q += 1
+ continue
+ if not answer.strip():
+ skipped_empty_a += 1
+ continue
+
+ ex = _build_cached_example(
+ question=question,
+ answer=answer,
+ tokenizer=tokenizer,
+ example_idx=idx,
+ source_path=str(in_path),
+ )
+ if ex is None:
+ # distinguish parse-vs-span failure
+ parsed = _split_gsm8k_answer(answer)
+ if parsed is None:
+ skipped_parse += 1
+ else:
+ skipped_span += 1
+ continue
+
+ examples.append(ex)
+ kept += 1
+ else:
+ api_key = args.api_key or os.environ.get("FLASHTRACE_API_KEY") or os.environ.get("OPENAI_API_KEY")
+ if not api_key:
+ raise SystemExit("resample mode requires --api_key or FLASHTRACE_API_KEY/OPENAI_API_KEY.")
+
+ attempted = 0
+ skipped_format = 0
+ judged_false = 0
+
+ indices = list(range(len(raw)))
+ if bool(args.shuffle):
+ import random
+
+ rnd = random.Random(int(args.seed))
+ rnd.shuffle(indices)
+
+ kept_bar = tqdm(total=int(args.max_examples), desc="Kept (judge=True)", position=1, leave=False)
+ for loop_idx in tqdm(indices, total=len(indices), desc="Resampling"):
+ if kept >= int(args.max_examples):
+ break
+
+ total += 1
+ item = raw[loop_idx]
+ if not isinstance(item, dict):
+ skipped_parse += 1
+ continue
+
+ question = str(item.get("question") or "")
+ answer = str(item.get("answer") or "")
+ if not question.strip():
+ skipped_empty_q += 1
+ continue
+ if not answer.strip():
+ skipped_empty_a += 1
+ continue
+
+ parsed = _split_gsm8k_answer(answer)
+ if parsed is None:
+ skipped_parse += 1
+ continue
+ _ref_thinking, reference_answer = parsed
+
+ attempted += 1
+ gen_messages = build_gen_messages(question.strip())
+
+ # Step 1: generation
+ for attempt in range(int(args.retries) + 1):
+ try:
+ generation = call_chat_api(
+ str(args.api_base),
+ str(api_key),
+ str(args.generator_model),
+ gen_messages,
+ timeout=int(args.api_timeout),
+ max_tokens=int(args.api_max_tokens),
+ temperature=float(args.api_temperature),
+ cache_ttl=int(args.api_cache_ttl),
+ cache_namespace=str(args.api_cache_namespace) if args.api_cache_namespace else None,
+ rate_limit_delay=float(args.rate_limit_delay) if args.rate_limit_delay is not None else None,
+ )
+ break
+ except RateLimitError as e:
+ if attempt >= int(args.retries):
+ raise
+ time.sleep(float(e.wait_seconds))
+ except Exception: # noqa: BLE001
+ if attempt >= int(args.retries):
+ raise
+ time.sleep(float(args.retry_delay))
+ if float(args.request_interval) > 0:
+ time.sleep(float(args.request_interval))
+
+ parsed_gen = split_boxed_generation(generation)
+ if not parsed_gen:
+ skipped_format += 1
+ print(f"[attempt={attempted}] skipped=format")
+ continue
+
+ thinking_text, _boxed_segment, boxed_answer = parsed_gen
+ judge_messages = build_judge_messages(reference_answer, boxed_answer)
+
+ ok = False
+ judge_resp = ""
+ for attempt in range(int(args.retries) + 1):
+ try:
+ judge_resp = call_chat_api(
+ str(args.api_base),
+ str(api_key),
+ str(args.judge_model),
+ judge_messages,
+ timeout=int(args.api_timeout),
+ max_tokens=64,
+ temperature=0.0,
+ cache_ttl=int(args.api_cache_ttl),
+ cache_namespace=str(args.api_cache_namespace) if args.api_cache_namespace else None,
+ rate_limit_delay=float(args.rate_limit_delay) if args.rate_limit_delay is not None else None,
+ )
+ ok = parse_bool(judge_resp)
+ break
+ except RateLimitError as e:
+ if attempt >= int(args.retries):
+ raise
+ time.sleep(float(e.wait_seconds))
+ except Exception: # noqa: BLE001
+ if attempt >= int(args.retries):
+ raise
+ time.sleep(float(args.retry_delay))
+ if float(args.judge_interval) > 0:
+ time.sleep(float(args.judge_interval))
+
+ if not ok:
+ judged_false += 1
+ print(f"[attempt={attempted}] judge=filtered")
+ continue
+
+ ex = _build_resampled_example(
+ question=question,
+ raw_answer=answer,
+ reference_answer=reference_answer,
+ generation=generation,
+ tokenizer=tokenizer,
+ example_idx=int(loop_idx),
+ source_path=str(in_path),
+ judge_response=judge_resp,
+ generator_model=str(args.generator_model),
+ judge_model=str(args.judge_model),
+ )
+ if ex is None:
+ skipped_span += 1
+ print(f"[attempt={attempted}] skipped=span")
+ continue
+
+ examples.append(ex)
+ kept += 1
+ kept_bar.update(1)
+ print(f"[attempt={attempted}] judge=kept")
+
+ kept_bar.close()
+
+ written = _write_jsonl(out_path, examples=examples)
+ if written != kept:
+ raise SystemExit(f"Internal error: written={written} != kept={kept}")
+
+ print(
+ json.dumps(
+ {
+ "in_json": str(in_path),
+ "out_jsonl": str(out_path),
+ "tokenizer_model": args.tokenizer_model,
+ "mode": str(args.mode),
+ "source_total": int(source_total),
+ "visited": total,
+ "kept": kept,
+ "skipped_empty_question": skipped_empty_q,
+ "skipped_empty_answer": skipped_empty_a,
+ "skipped_parse": skipped_parse,
+ "skipped_span": skipped_span,
+ "attempted": attempted,
+ "skipped_format": skipped_format,
+ "judged_false": judged_false,
+ "max_examples": int(args.max_examples) if str(args.mode) == "resample" else None,
+ "api_base": str(args.api_base) if str(args.mode) == "resample" else None,
+ "generator_model": str(args.generator_model) if str(args.mode) == "resample" else None,
+ "judge_model": str(args.judge_model) if str(args.mode) == "resample" else None,
+ },
+ ensure_ascii=False,
+ indent=2,
+ )
+ )
+
+
+if __name__ == "__main__":
+ main()
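Both the generation call and the judge call above wrap `call_chat_api` in the same retry shape: a `RateLimitError` sleeps for the server-suggested `wait_seconds`, any other transient failure sleeps for a fixed `retry_delay`, and the final attempt re-raises. A minimal standalone sketch of that policy (the local `RateLimitError` below is a stand-in for the one the script imports, and `call_with_retries` is a hypothetical helper, not part of the repo):

```python
import time


class RateLimitError(Exception):
    """Stand-in for the script's RateLimitError; carries a suggested backoff."""

    def __init__(self, wait_seconds: float = 1.0):
        super().__init__("rate limited")
        self.wait_seconds = wait_seconds


def call_with_retries(fn, *, retries: int = 2, retry_delay: float = 0.0):
    """Return fn(), retrying up to `retries` extra times.

    Rate limits sleep for the error's wait_seconds; other exceptions sleep
    for retry_delay. The last attempt re-raises instead of sleeping.
    """
    for attempt in range(retries + 1):
        try:
            return fn()
        except RateLimitError as e:
            if attempt >= retries:
                raise
            time.sleep(e.wait_seconds)
        except Exception:
            if attempt >= retries:
                raise
            time.sleep(retry_delay)


if __name__ == "__main__":
    calls = {"n": 0}

    def flaky():
        calls["n"] += 1
        if calls["n"] < 3:
            raise RateLimitError(wait_seconds=0.0)
        return "ok"

    print(call_with_retries(flaky, retries=2))  # succeeds on the 3rd call
```

The same helper would serve both the generation and judge loops, which currently duplicate the try/except ladder inline.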
diff --git a/exp/exp2/migrate_indices_to_explain_token_span.py b/exp/exp2/migrate_indices_to_explain_token_span.py
new file mode 100644
index 0000000000000000000000000000000000000000..2c7ba14fe8b71a3cad3959519e31ad2b54e507ee
--- /dev/null
+++ b/exp/exp2/migrate_indices_to_explain_token_span.py
@@ -0,0 +1,129 @@
+#!/usr/bin/env python3
+"""Migrate exp2 cached JSONL to token-span `indices_to_explain`.
+
+This converts legacy caches that used sentence indices (e.g. `[-2]`) into the
+token-span format:
+
+ indices_to_explain = [start_tok, end_tok]
+
+Where the span points to the boxed-inner (final answer) token span in `target`
+under `tokenizer(target, add_special_tokens=False)`.
+
+Rule:
+1) If `sink_span` exists and looks valid -> copy it to `indices_to_explain`
+2) Else try to recompute spans from `target` + `metadata.boxed_answer` using
+ `exp/exp2/dataset_utils.attach_spans_from_answer`
+"""
+
+from __future__ import annotations
+
+import argparse
+import json
+import sys
+from pathlib import Path
+from typing import Any, Dict
+
+from transformers import AutoTokenizer
+
+
+def _ensure_repo_root_on_path() -> None:
+ repo_root = Path(__file__).resolve().parents[2]
+ if str(repo_root) not in sys.path:
+ sys.path.insert(0, str(repo_root))
+
+
+def _is_token_span(span: Any) -> bool:
+ return isinstance(span, list) and len(span) == 2 and all(isinstance(x, int) for x in span)
+
+
+def _load_tokenizer(tokenizer_model: str):
+ tok_path = Path(tokenizer_model)
+ if tok_path.exists():
+ return AutoTokenizer.from_pretrained(tok_path.as_posix(), local_files_only=True)
+ return AutoTokenizer.from_pretrained(tokenizer_model)
+
+
+def _migrate_obj(obj: Dict[str, Any], tokenizer) -> tuple[Dict[str, Any], bool]:
+ sink_span = obj.get("sink_span")
+ if _is_token_span(sink_span):
+ obj["indices_to_explain"] = sink_span
+ return obj, True
+
+ _ensure_repo_root_on_path()
+ from exp.exp2.dataset_utils import CachedExample, attach_spans_from_answer # noqa: E402
+
+ example = CachedExample(
+ prompt=obj.get("prompt") or "",
+ target=obj.get("target"),
+ indices_to_explain=obj.get("indices_to_explain"),
+ attr_mask_indices=obj.get("attr_mask_indices"),
+ sink_span=obj.get("sink_span"),
+ thinking_span=obj.get("thinking_span"),
+ metadata=obj.get("metadata") or {},
+ )
+ answer_text = (example.metadata.get("boxed_answer") or "").strip() or None
+ migrated = attach_spans_from_answer(example, tokenizer, answer_text)
+ if not _is_token_span(migrated.sink_span):
+ return obj, False
+
+ obj["sink_span"] = migrated.sink_span
+ obj["thinking_span"] = migrated.thinking_span
+ obj["indices_to_explain"] = migrated.sink_span
+ obj["metadata"] = migrated.metadata
+ return obj, True
+
+
+def main() -> None:
+ ap = argparse.ArgumentParser()
+ ap.add_argument("--in_jsonl", type=str, required=True)
+ ap.add_argument("--out_jsonl", type=str, required=True)
+ ap.add_argument("--tokenizer_model", type=str, required=True)
+ ap.add_argument("--strict", action="store_true", help="Fail on any line that cannot be migrated.")
+ args = ap.parse_args()
+
+ tokenizer = _load_tokenizer(args.tokenizer_model)
+
+ in_path = Path(args.in_jsonl)
+ out_path = Path(args.out_jsonl)
+
+ try:
+ same_path = in_path.resolve() == out_path.resolve()
+ except FileNotFoundError:
+ same_path = False
+
+ tmp_out_path = out_path
+ if same_path:
+ tmp_out_path = out_path.with_name(out_path.name + ".tmp")
+ if tmp_out_path.exists():
+ tmp_out_path.unlink()
+
+ tmp_out_path.parent.mkdir(parents=True, exist_ok=True)
+
+ total = 0
+ migrated_ok = 0
+ bad = 0
+
+ with in_path.open("r", encoding="utf-8") as fin, tmp_out_path.open("w", encoding="utf-8") as fout:
+ for line_no, line in enumerate(fin, start=1):
+ if not line.strip():
+ continue
+ total += 1
+ obj: Dict[str, Any] = json.loads(line)
+ new_obj, ok = _migrate_obj(obj, tokenizer)
+ if ok:
+ migrated_ok += 1
+ else:
+ bad += 1
+ if args.strict:
+                    raise RuntimeError(f"cannot migrate line {line_no}: unable to resolve sink_span token span")
+ fout.write(json.dumps(new_obj, ensure_ascii=False) + "\n")
+
+ if same_path:
+ tmp_out_path.replace(out_path)
+ print(f"[done] total={total} migrated_ok={migrated_ok} bad={bad} wrote={out_path} (in-place)")
+ else:
+ print(f"[done] total={total} migrated_ok={migrated_ok} bad={bad} wrote={out_path}")
+
+
+if __name__ == "__main__":
+ main()
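The migration above delegates span resolution to `attach_spans_from_answer`. Stripped of the real tokenizer, the core idea is: locate the answer's character span inside `target`, then collect every token whose character offsets overlap it. A toy sketch under that assumption (whitespace tokens stand in for HF tokenizer offset mappings; `answer_token_span` is illustrative, not the repo's implementation):

```python
def whitespace_offsets(text):
    """Return (tokens, [(char_start, char_end), ...]) for whitespace tokens."""
    tokens, offsets, pos = [], [], 0
    for tok in text.split():
        start = text.index(tok, pos)  # advance past previous match
        end = start + len(tok)
        tokens.append(tok)
        offsets.append((start, end))
        pos = end
    return tokens, offsets


def answer_token_span(target, answer):
    """Map the LAST occurrence of `answer` to an inclusive [start, end] token span."""
    char_start = target.rfind(answer)
    if char_start < 0:
        return None
    char_end = char_start + len(answer)
    _, offsets = whitespace_offsets(target)
    # A token belongs to the span iff its character range overlaps the answer's.
    covered = [i for i, (s, e) in enumerate(offsets) if s < char_end and e > char_start]
    if not covered:
        return None
    return [covered[0], covered[-1]]
```

With a real tokenizer the same overlap test runs over `return_offsets_mapping=True` offsets of `tokenizer(target, add_special_tokens=False)`, matching the span convention the docstring describes.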
diff --git a/exp/exp2/out.log b/exp/exp2/out.log
new file mode 100644
index 0000000000000000000000000000000000000000..9bd68c485c6547f61c0295e97db643f88e99cb2c
--- /dev/null
+++ b/exp/exp2/out.log
@@ -0,0 +1,102 @@
+[1/500] judge=kept
+[2/500] judge=kept
+[3/500] judge=kept
+[4/500] judge=kept
+[5/500] judge=kept
+[6/500] judge=kept
+[7/500] judge=kept
+[8/500] judge=kept
+[9/500] judge=kept
+[10/500] judge=kept
+[11/500] judge=kept
+[12/500] judge=kept
+[13/500] judge=kept
+[14/500] judge=kept
+[15/500] judge=kept
+[16/500] judge=kept
+[17/500] judge=kept
+[18/500] judge=kept
+[19/500] judge=kept
+[20/500] judge=kept
+[21/500] judge=kept
+[22/500] judge=kept
+[23/500] judge=kept
+[24/500] judge=kept
+[25/500] judge=kept
+[26/500] judge=kept
+[27/500] judge=kept
+[28/500] judge=kept
+[29/500] judge=kept
+[30/500] judge=kept
+[31/500] judge=kept
+[32/500] judge=kept
+[33/500] judge=kept
+[34/500] judge=kept
+[35/500] judge=kept
+[36/500] judge=kept
+[37/500] judge=kept
+[38/500] judge=kept
+[39/500] judge=kept
+[40/500] judge=kept
+[41/500] judge=kept
+[42/500] judge=kept
+[43/500] judge=kept
+[44/500] judge=kept
+[45/500] judge=kept
+[46/500] judge=kept
+[47/500] judge=kept
+[48/500] judge=kept
+[49/500] judge=kept
+[50/500] judge=kept
+[51/500] judge=kept
+[52/500] judge=kept
+[53/500] judge=kept
+[54/500] judge=kept
+[55/500] judge=kept
+[56/500] judge=kept
+[57/500] judge=kept
+[58/500] judge=kept
+[59/500] judge=kept
+[60/500] judge=kept
+[61/500] judge=kept
+[62/500] judge=kept
+[63/500] skipped=format
+[64/500] judge=kept
+[65/500] judge=kept
+[66/500] judge=kept
+[67/500] judge=kept
+[68/500] judge=kept
+[69/500] judge=kept
+[70/500] judge=kept
+[71/500] judge=kept
+[72/500] judge=kept
+[73/500] judge=kept
+[74/500] judge=kept
+[75/500] judge=kept
+[76/500] judge=kept
+[77/500] judge=kept
+[78/500] judge=kept
+[79/500] judge=kept
+[80/500] judge=kept
+[81/500] judge=kept
+[82/500] judge=kept
+[83/500] judge=kept
+[84/500] judge=kept
+[85/500] judge=kept
+[86/500] judge=kept
+[87/500] judge=kept
+[88/500] judge=kept
+[89/500] judge=kept
+[90/500] judge=kept
+[91/500] judge=kept
+[92/500] judge=kept
+[93/500] judge=kept
+[94/500] judge=kept
+[95/500] judge=kept
+[96/500] judge=kept
+[97/500] judge=kept
+[98/500] judge=kept
+[99/500] judge=kept
+[100/500] judge=kept
+[101/500] judge=kept
+Kept 100 / target 100 (attempted 101 / 500) -> exp/exp2/data/data/ruler_multihop/1024/vt_h10_c1/validation.jsonl.jsonl
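Each attempt in the log above is recorded as one `[i/N] key=value` line. Assuming that format, a small hypothetical helper can tally outcomes when eyeballing longer runs:

```python
import re
from collections import Counter

# Assumed line shape: "[63/500] skipped=format" / "[64/500] judge=kept".
LINE_RE = re.compile(r"^\[(\d+)/(\d+)\]\s+(\S+)=(\S+)")


def summarize_judge_log(lines):
    """Tally kept/filtered/format-skip counts from sample_and_filter-style logs."""
    counts = Counter()
    for line in lines:
        m = LINE_RE.match(line.strip())
        if m:
            counts[f"{m.group(3)}={m.group(4)}"] += 1
    return dict(counts)
```

For the log above this yields `{"judge=kept": 100, "skipped=format": 1}`; the trailing `Kept 100 / target 100 ...` summary line does not match the pattern and is ignored.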
diff --git a/exp/exp2/run_exp.py b/exp/exp2/run_exp.py
new file mode 100644
index 0000000000000000000000000000000000000000..927d83488d0a11953f4c7024f8c44f5111859355
--- /dev/null
+++ b/exp/exp2/run_exp.py
@@ -0,0 +1,1296 @@
+#!/usr/bin/env python3
+"""
+Experiment 2 runner: token-level faithfulness (generation perturbation).
+
+AT2 is omitted.
+"""
+
+from __future__ import annotations
+
+import argparse
+import hashlib
+import json
+import os
+import sys
+from itertools import islice
+import math
+import time
+from pathlib import Path
+from typing import Any, Dict, List, Optional, Tuple
+
+# Early CUDA mask handling: set CUDA_VISIBLE_DEVICES before importing torch.
+def _early_set_cuda_visible_devices():
+ parser = argparse.ArgumentParser(add_help=False)
+ parser.add_argument("--cuda", type=str, default=None)
+ # parse_known_args keeps the full argv for later parsing by the main parser
+ args, _ = parser.parse_known_args(sys.argv[1:])
+ if args.cuda and "," in args.cuda:
+ os.environ["CUDA_VISIBLE_DEVICES"] = args.cuda
+
+
+_early_set_cuda_visible_devices()
+
+import numpy as np
+import torch
+from transformers import AutoModelForCausalLM, AutoTokenizer, utils
+
+sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
+
+from pathlib import Path
+
+# ensure repo root on path
+REPO_ROOT = Path(__file__).resolve().parents[2]
+if str(REPO_ROOT) not in sys.path:
+ sys.path.insert(0, str(REPO_ROOT))
+
+import llm_attr
+import llm_attr_eval
+from attribution_datasets import AttributionExample
+from exp.exp2 import dataset_utils as ds_utils
+
+utils.logging.set_verbosity_error()
+
+
+def _sha1_text(text: str) -> str:
+ return hashlib.sha1(text.encode("utf-8")).hexdigest()
+
+
+def _infer_attnlrp_spans_from_hops(
+ raw_attributions: Any,
+ *,
+ gen_len: int,
+) -> Tuple[Tuple[int, int], Tuple[int, int]]:
+ if not raw_attributions:
+ return (0, max(0, gen_len - 1)), (0, max(0, gen_len - 1))
+ sink_span = tuple(int(x) for x in raw_attributions[0].sink_range)
+ if len(raw_attributions) >= 2:
+ thinking_span = tuple(int(x) for x in raw_attributions[1].sink_range)
+ else:
+ thinking_span = sink_span
+ return sink_span, thinking_span
+
+
+def _build_hop_trace_payload(
+ attr_func: str,
+ attr: Any,
+ *,
+ indices_to_explain: List[int],
+) -> Optional[Dict[str, np.ndarray]]:
+ """Extract per-hop vectors (postprocessed) and minimal span metadata."""
+ prompt_len = int(len(getattr(attr, "prompt_tokens", []) or []))
+ gen_len = int(len(getattr(attr, "generation_tokens", []) or []))
+ total_len = prompt_len + gen_len
+ if total_len <= 0:
+ return None
+
+ hop_vectors: List[torch.Tensor] = []
+ sink_span_gen: Optional[Tuple[int, int]] = None
+ thinking_span_gen: Optional[Tuple[int, int]] = None
+ attnlrp_neg_handling: str = ""
+ attnlrp_norm_mode: str = ""
+ attnlrp_ratio_enabled: int = -1
+
+ # IFR multi-hop variants expose projected hop vectors via metadata["ifr"]["per_hop_projected"].
+ ifr_meta = (getattr(attr, "metadata", None) or {}).get("ifr") or {}
+ ifr_per_hop = ifr_meta.get("per_hop_projected") or []
+
+ if ifr_per_hop:
+ hop_vectors = [torch.as_tensor(v, dtype=torch.float32) for v in ifr_per_hop]
+ sink_span_gen = ifr_meta.get("sink_span_generation")
+ thinking_span_gen = ifr_meta.get("thinking_span_generation")
+ if sink_span_gen is not None:
+ sink_span_gen = tuple(int(x) for x in sink_span_gen)
+ if thinking_span_gen is not None:
+ thinking_span_gen = tuple(int(x) for x in thinking_span_gen)
+
+ elif attr_func in ("ft_attnlrp", "attnlrp_aggregated_multi_hop"):
+ meta = getattr(attr, "metadata", None) or {}
+ attnlrp_neg_handling = str(meta.get("neg_handling") or "")
+ attnlrp_norm_mode = str(meta.get("norm_mode") or "")
+ if meta.get("ratio_enabled") is not None:
+ attnlrp_ratio_enabled = int(bool(meta.get("ratio_enabled")))
+ multi_hop = meta.get("multi_hop_result")
+ if multi_hop is None:
+ return None
+ raw_attributions = getattr(multi_hop, "raw_attributions", None) or []
+ if not raw_attributions:
+ return None
+ hop_vectors = [
+ torch.as_tensor(getattr(hop, "token_importance_total"), dtype=torch.float32)
+ for hop in raw_attributions
+ ]
+ sink_span_gen, thinking_span_gen = _infer_attnlrp_spans_from_hops(raw_attributions, gen_len=gen_len)
+ sink_override = meta.get("sink_span")
+ thinking_override = meta.get("thinking_span")
+ if sink_override is not None:
+ sink_span_gen = tuple(int(x) for x in sink_override)
+ if thinking_override is not None:
+ thinking_span_gen = tuple(int(x) for x in thinking_override)
+
+ else:
+ return None
+
+ if sink_span_gen is None:
+ sink_span_gen = (0, max(0, gen_len - 1))
+ if thinking_span_gen is None:
+ thinking_span_gen = sink_span_gen
+
+ stacked = torch.stack([v.reshape(-1) for v in hop_vectors], dim=0)
+ if stacked.shape[1] != total_len:
+ raise ValueError(
+ f"Hop vector length mismatch for {attr_func}: expected T={total_len}, got {stacked.shape[1]}."
+ )
+
+ return {
+ "vh": stacked.detach().cpu().numpy().astype(np.float32, copy=False),
+ "prompt_len": np.asarray(prompt_len, dtype=np.int64),
+ "gen_len": np.asarray(gen_len, dtype=np.int64),
+ "sink_span_gen": np.asarray(sink_span_gen, dtype=np.int64),
+ "thinking_span_gen": np.asarray(thinking_span_gen, dtype=np.int64),
+ "indices_to_explain_gen": np.asarray(indices_to_explain, dtype=np.int64),
+ "attnlrp_neg_handling": np.asarray(attnlrp_neg_handling, dtype="U16"),
+ "attnlrp_norm_mode": np.asarray(attnlrp_norm_mode, dtype="U16"),
+ "attnlrp_ratio_enabled": np.asarray(attnlrp_ratio_enabled, dtype=np.int64),
+ }
+
+
+def _write_hop_trace(
+ trace_dir: Path,
+ *,
+ example_idx: int,
+ attr_func: str,
+ prompt: str,
+ target: Optional[str],
+ payload: Dict[str, np.ndarray],
+ manifest_handle,
+) -> None:
+ trace_dir.mkdir(parents=True, exist_ok=True)
+ npz_name = f"ex_{example_idx:06d}.npz"
+ npz_path = trace_dir / npz_name
+ np.savez_compressed(npz_path, **payload)
+
+ record = {
+ "example_idx": int(example_idx),
+ "attr_func": attr_func,
+ "file": npz_name,
+ "prompt_sha1": _sha1_text(prompt),
+ "target_sha1": _sha1_text(target) if target is not None else None,
+ "prompt_len": int(payload["prompt_len"].item()),
+ "gen_len": int(payload["gen_len"].item()),
+ "n_hops_plus_one": int(payload["vh"].shape[0]),
+ "total_len": int(payload["vh"].shape[1]),
+ "sink_span_gen": payload["sink_span_gen"].tolist(),
+ "thinking_span_gen": payload["thinking_span_gen"].tolist(),
+ "indices_to_explain_gen": payload["indices_to_explain_gen"].tolist(),
+ "attnlrp_neg_handling": str(payload["attnlrp_neg_handling"].item()),
+ "attnlrp_norm_mode": str(payload["attnlrp_norm_mode"].item()),
+ "attnlrp_ratio_enabled": int(payload["attnlrp_ratio_enabled"].item()),
+ }
+ manifest_handle.write(json.dumps(record, ensure_ascii=False) + "\n")
+ manifest_handle.flush()
+
+
+def _parse_modes(mode_args: Any) -> List[str]:
+ """Parse --mode which may be provided as multiple args and/or comma-separated."""
+ if mode_args is None:
+ raw_parts: List[str] = []
+ elif isinstance(mode_args, str):
+ raw_parts = [mode_args]
+ else:
+ raw_parts = [str(x) for x in mode_args]
+
+ modes: List[str] = []
+ for chunk in raw_parts:
+ for part in str(chunk).split(","):
+ m = part.strip()
+ if m:
+ modes.append(m)
+
+ # Default to faithfulness_gen for backward compatibility.
+ if not modes:
+ modes = ["faithfulness_gen"]
+
+ allowed = {"faithfulness_gen", "recovery_ruler"}
+ seen: set[str] = set()
+ unique: List[str] = []
+ for m in modes:
+ if m not in seen:
+ unique.append(m)
+ seen.add(m)
+
+ unknown = [m for m in unique if m not in allowed]
+ if unknown:
+ raise SystemExit(f"Unsupported --mode value(s): {unknown}. Allowed: {sorted(allowed)}.")
+
+ return unique
+
+
+def _trace_run_tag(
+ testing_dict: Dict[str, Any],
+ *,
+ modes: List[str],
+ total: int,
+) -> str:
+ attr_func = str(testing_dict.get("attr_func") or "attr")
+ parts = [attr_func]
+
+ if attr_func in (
+ "ifr_multi_hop",
+ "ifr_in_all_gen",
+ "ifr_multi_hop_stop_words",
+ "ifr_multi_hop_both",
+ "ifr_multi_hop_split_hop",
+ "ft_attnlrp",
+ "attnlrp_aggregated_multi_hop",
+ ):
+ parts.append(f"n{int(testing_dict.get('n_hops', 0))}")
+
+ if attr_func in ("attnlrp", "ft_attnlrp", "attnlrp_aggregated_multi_hop"):
+ parts.append(f"neg{str(testing_dict.get('attnlrp_neg_handling', ''))}")
+ parts.append(f"norm{str(testing_dict.get('attnlrp_norm_mode', ''))}")
+
+ if modes:
+ parts.append("m" + "+".join(modes))
+
+ parts.append(f"{int(total)}ex")
+ return "_".join(parts)
+
+
+def _token_importance_vector(attr: torch.Tensor) -> np.ndarray:
+    """Return per-token importance w = attr.sum(dim=0), NaNs zeroed and negatives clamped to 0, shape [P+G]."""
+ w = torch.nan_to_num(attr.sum(0).to(dtype=torch.float32), nan=0.0).clamp(min=0.0)
+ return w.detach().cpu().numpy().astype(np.float32, copy=False)
+
+
+def _build_sample_trace_payload(
+ example: ds_utils.CachedExample,
+ *,
+ attr_list: List[torch.Tensor],
+ prompt_len: int,
+ user_prompt_indices: Optional[List[int]],
+ keep_prompt_token_indices: Optional[List[int]],
+ gold_prompt_token_indices: Optional[List[int]],
+ hop_payload: Optional[Dict[str, np.ndarray]],
+ faithfulness_scores: Optional[np.ndarray],
+ recovery_scores: Optional[np.ndarray],
+ time_attr_s: Optional[float],
+ time_faith_s: Optional[float],
+ time_recovery_s: Optional[float],
+) -> Dict[str, np.ndarray]:
+ seq_attr, row_attr, rec_attr = attr_list
+ gen_len = int(seq_attr.shape[0])
+
+ v_seq_all = _token_importance_vector(seq_attr)
+ v_row_all = _token_importance_vector(row_attr)
+ v_rec_all = _token_importance_vector(rec_attr)
+
+ payload: Dict[str, np.ndarray] = {
+ "v_seq_all": v_seq_all,
+ "v_row_all": v_row_all,
+ "v_rec_all": v_rec_all,
+ "v_seq_prompt": v_seq_all[:prompt_len],
+ "v_row_prompt": v_row_all[:prompt_len],
+ "v_rec_prompt": v_rec_all[:prompt_len],
+ "prompt_len": np.asarray(int(prompt_len), dtype=np.int64),
+ "gen_len": np.asarray(int(gen_len), dtype=np.int64),
+ "indices_to_explain_gen": np.asarray(list(example.indices_to_explain or []), dtype=np.int64),
+ }
+
+ if example.sink_span is not None:
+ payload["sink_span_gen"] = np.asarray(list(example.sink_span), dtype=np.int64)
+ if example.thinking_span is not None:
+ payload["thinking_span_gen"] = np.asarray(list(example.thinking_span), dtype=np.int64)
+
+ if user_prompt_indices is not None:
+ payload["user_prompt_indices"] = np.asarray(list(user_prompt_indices), dtype=np.int64)
+ if keep_prompt_token_indices is not None:
+ payload["keep_prompt_token_indices"] = np.asarray(list(keep_prompt_token_indices), dtype=np.int64)
+ if gold_prompt_token_indices is not None:
+ payload["gold_prompt_token_indices"] = np.asarray(list(gold_prompt_token_indices), dtype=np.int64)
+
+ if faithfulness_scores is not None:
+ payload["faithfulness_scores"] = np.asarray(faithfulness_scores, dtype=np.float64)
+ if recovery_scores is not None:
+ payload["recovery_scores"] = np.asarray(recovery_scores, dtype=np.float64)
+
+ if time_attr_s is not None:
+ payload["time_attr_s"] = np.asarray(float(time_attr_s), dtype=np.float64)
+ if time_faith_s is not None:
+ payload["time_faith_s"] = np.asarray(float(time_faith_s), dtype=np.float64)
+ if time_recovery_s is not None:
+ payload["time_recovery_s"] = np.asarray(float(time_recovery_s), dtype=np.float64)
+
+ if hop_payload is not None:
+ for k, v in hop_payload.items():
+ if k in payload:
+ continue
+ payload[k] = v
+
+ return payload
+
+
+def _write_sample_trace(
+ trace_dir: Path,
+ *,
+ example_idx: int,
+ attr_func: str,
+ prompt: str,
+ target: Optional[str],
+ payload: Dict[str, np.ndarray],
+ manifest_handle,
+ recovery_skipped_reason: Optional[str],
+) -> None:
+ trace_dir.mkdir(parents=True, exist_ok=True)
+ npz_name = f"ex_{example_idx:06d}.npz"
+ npz_path = trace_dir / npz_name
+ np.savez_compressed(npz_path, **payload)
+
+ prompt_len = int(np.asarray(payload.get("prompt_len", 0)).item())
+ gen_len = int(np.asarray(payload.get("gen_len", 0)).item())
+ record: Dict[str, Any] = {
+ "example_idx": int(example_idx),
+ "attr_func": attr_func,
+ "file": npz_name,
+ "prompt_sha1": _sha1_text(prompt),
+ "target_sha1": _sha1_text(target) if target is not None else None,
+ "prompt_len": prompt_len,
+ "gen_len": gen_len,
+ "indices_to_explain_gen": payload.get("indices_to_explain_gen").tolist()
+ if payload.get("indices_to_explain_gen") is not None
+ else None,
+ "sink_span_gen": payload.get("sink_span_gen").tolist() if payload.get("sink_span_gen") is not None else None,
+ "thinking_span_gen": payload.get("thinking_span_gen").tolist()
+ if payload.get("thinking_span_gen") is not None
+ else None,
+ "faithfulness_scores": payload.get("faithfulness_scores").tolist()
+ if payload.get("faithfulness_scores") is not None
+ else None,
+ "recovery_scores": payload.get("recovery_scores").tolist() if payload.get("recovery_scores") is not None else None,
+ "recovery_skipped_reason": recovery_skipped_reason,
+ "time_attr_s": float(np.asarray(payload.get("time_attr_s")).item()) if payload.get("time_attr_s") is not None else None,
+ "time_faith_s": float(np.asarray(payload.get("time_faith_s")).item()) if payload.get("time_faith_s") is not None else None,
+ "time_recovery_s": float(np.asarray(payload.get("time_recovery_s")).item())
+ if payload.get("time_recovery_s") is not None
+ else None,
+ }
+
+ # Derived, sample-level bookkeeping (token lengths and per-sample MAS/RISE).
+ record["input_len"] = int(prompt_len)
+
+ sink_span = record.get("sink_span_gen")
+ if isinstance(sink_span, list) and len(sink_span) == 2:
+ try:
+ start = int(sink_span[0])
+ end = int(sink_span[1])
+ record["output_len"] = (end - start + 1) if end >= start else None
+ except Exception:
+ record["output_len"] = None
+ else:
+ record["output_len"] = None
+
+ thinking_span = record.get("thinking_span_gen")
+ if isinstance(thinking_span, list) and len(thinking_span) == 2:
+ try:
+ start = int(thinking_span[0])
+ end = int(thinking_span[1])
+ record["cot_len"] = (end - start + 1) if end >= start else None
+ except Exception:
+ record["cot_len"] = None
+ else:
+ record["cot_len"] = None
+
+ record["rise_seq"] = None
+ record["mas_seq"] = None
+ record["rise_row"] = None
+ record["mas_row"] = None
+ record["rise_rec"] = None
+ record["mas_rec"] = None
+ faith = record.get("faithfulness_scores")
+ if isinstance(faith, list) and len(faith) == 3:
+ try:
+ record["rise_seq"] = float(faith[0][0])
+ record["mas_seq"] = float(faith[0][1])
+ record["rise_row"] = float(faith[1][0])
+ record["mas_row"] = float(faith[1][1])
+ record["rise_rec"] = float(faith[2][0])
+ record["mas_rec"] = float(faith[2][1])
+ except Exception:
+ pass
+
+ if payload.get("vh") is not None:
+ vh = payload["vh"]
+ record["n_hops_plus_one"] = int(vh.shape[0])
+ record["total_len"] = int(vh.shape[1])
+ record["attnlrp_neg_handling"] = str(payload.get("attnlrp_neg_handling").item()) if payload.get("attnlrp_neg_handling") is not None else ""
+ record["attnlrp_norm_mode"] = str(payload.get("attnlrp_norm_mode").item()) if payload.get("attnlrp_norm_mode") is not None else ""
+ record["attnlrp_ratio_enabled"] = int(payload.get("attnlrp_ratio_enabled").item()) if payload.get("attnlrp_ratio_enabled") is not None else -1
+
+ manifest_handle.write(json.dumps(record, ensure_ascii=False) + "\n")
+ manifest_handle.flush()
+
+
+def _compute_faithfulness_scores(
+ testing_dict: Dict[str, Any],
+ *,
+ attr_list: List[torch.Tensor],
+ prompt_len: int,
+ prompt: str,
+ generation: str,
+ llm_evaluator: llm_attr_eval.LLMAttributionEvaluator,
+ user_prompt_indices: Optional[List[int]],
+ keep_prompt_token_indices: Optional[List[int]],
+) -> np.ndarray:
+ attr_func = str(testing_dict.get("attr_func") or "")
+ results: List[Tuple[float, float, float]] = []
+ for attr in attr_list:
+ attr_prompt = attr[:, :prompt_len]
+ if attr_func in ("ifr_multi_hop_stop_words", "ifr_multi_hop_both") and keep_prompt_token_indices is not None:
+ import ft_ifr_improve
+
+ scores = ft_ifr_improve.faithfulness_test_skip_tokens(
+ llm_evaluator,
+ attr_prompt,
+ prompt,
+ generation,
+ keep_prompt_token_indices=keep_prompt_token_indices,
+ user_prompt_indices=user_prompt_indices,
+ )
+ elif user_prompt_indices is not None:
+ scores = _faithfulness_test_with_user_prompt_indices(
+ llm_evaluator,
+ attr_prompt,
+ prompt,
+ generation,
+ user_prompt_indices=user_prompt_indices,
+ )
+ else:
+ scores = llm_evaluator.faithfulness_test(attr_prompt, prompt, generation)
+ results.append(scores)
+ return np.asarray(results, dtype=np.float64)
+
+
+def _compute_recovery_scores(
+ testing_dict: Dict[str, Any],
+ *,
+ attr_list: List[torch.Tensor],
+ prompt_len: int,
+ gold_prompt_token_indices: List[int],
+ llm_evaluator: llm_attr_eval.LLMAttributionEvaluator,
+ keep_prompt_token_indices: Optional[List[int]],
+) -> Tuple[Optional[np.ndarray], Optional[str]]:
+ attr_func = str(testing_dict.get("attr_func") or "")
+
+ if prompt_len <= 0:
+ return None, "empty_prompt_len"
+
+ gold_prompt = [int(x) for x in (gold_prompt_token_indices or [])]
+ if not gold_prompt:
+ return None, "empty_gold_prompt"
+
+ if attr_func in ("ifr_multi_hop_stop_words", "ifr_multi_hop_both") and keep_prompt_token_indices is not None:
+ import ft_ifr_improve
+
+ keep_set = {int(x) for x in keep_prompt_token_indices}
+ gold_filtered = [idx for idx in gold_prompt if int(idx) in keep_set]
+ if not gold_filtered:
+ return None, "empty_gold_after_keep_filter"
+
+ scores = [
+ ft_ifr_improve.evaluate_attr_recovery_skip_tokens(
+ attr[:, :prompt_len],
+ keep_prompt_token_indices=keep_prompt_token_indices,
+ gold_prompt_token_indices=gold_prompt,
+ top_fraction=0.1,
+ )
+ for attr in attr_list
+ ]
+ else:
+ scores = [
+ llm_evaluator.evaluate_attr_recovery(
+ attr,
+ prompt_len=prompt_len,
+ gold_prompt_token_indices=gold_prompt,
+ top_fraction=0.1,
+ )
+ for attr in attr_list
+ ]
+
+ return np.asarray(scores, dtype=np.float64), None
+
+
+def evaluate_dataset_multi(
+ args,
+ dataset_name: str,
+ examples: List[ds_utils.CachedExample],
+ testing_dict: Dict[str, Any],
+ *,
+ modes: List[str],
+) -> Dict[str, Any]:
+ tokenizer = testing_dict["tokenizer"]
+ llm_evaluator = llm_attr_eval.LLMAttributionEvaluator(testing_dict["model"], tokenizer)
+
+ want_faith = "faithfulness_gen" in modes
+ want_recovery = "recovery_ruler" in modes
+
+ faith_results: List[np.ndarray] = []
+ faith_durations: List[float] = []
+
+ recovery_results: List[np.ndarray] = []
+ recovery_attr_durations: List[float] = []
+ recovery_skipped = 0
+
+ total = min(len(examples), args.num_examples)
+ iterator = islice(examples, total)
+
+ save_traces = bool(getattr(args, "save_hop_traces", False))
+ manifest_handle = None
+ trace_dir: Optional[Path] = None
+ if save_traces:
+ model_tag = str(testing_dict.get("model_tag", "model"))
+ run_tag = _trace_run_tag(testing_dict, modes=modes, total=total)
+ trace_dir = Path(args.output_root) / "traces" / dataset_name / model_tag / run_tag
+ trace_dir.mkdir(parents=True, exist_ok=True)
+ manifest_handle = open(trace_dir / "manifest.jsonl", "w", encoding="utf-8")
+
+ try:
+ for example_idx, ex in enumerate(iterator):
+ if want_recovery:
+ needle_spans = (ex.metadata or {}).get("needle_spans")
+ if not isinstance(needle_spans, list) or not needle_spans:
+ raise SystemExit(
+ "recovery_ruler requires RULER samples with metadata.needle_spans; "
+ f"dataset={dataset_name} has missing/empty needle_spans."
+ )
+ if ex.target is None:
+ raise SystemExit(
+ "recovery_ruler requires cached targets (CoT+answer) so row/rec attribution is well-defined. "
+ f"dataset={dataset_name} has target=None; run exp/exp2/sample_and_filter.py first."
+ )
+
+ # Determine generation/target once.
+ target = ex.target
+ if target is None:
+ generation, full_output = llm_evaluator.response(ex.prompt)
+ target = generation
+ response_len = len(tokenizer(full_output).input_ids)
+ else:
+ response_len = len(tokenizer(llm_evaluator.format_prompt(" " + ex.prompt) + target).input_ids)
+
+ testing_dict["batch_size"] = max(1, math.floor((testing_dict["max_input_len"] - 100) / max(1, response_len)))
+
+ gold_prompt: Optional[List[int]] = None
+ if want_recovery:
+ gold_prompt = ds_utils.ruler_gold_prompt_token_indices(ex, tokenizer)
+
+ if want_recovery and not want_faith and not save_traces:
+ # Preserve recovery-only fast path when not saving traces: skip samples with empty gold.
+ if not gold_prompt:
+ recovery_skipped += 1
+ continue
+
+ time_attr_s = None
+ time_faith_s = None
+ time_recovery_s = None
+
+ t0 = time.perf_counter()
+ attr_list, hop_payload, user_prompt_indices, keep_prompt_token_indices = run_attribution(testing_dict, ex, target)
+ time_attr_s = time.perf_counter() - t0
+
+ seq_attr = attr_list[0]
+ prompt_len = int(seq_attr.shape[1] - seq_attr.shape[0]) # cols=(P+G), rows=G
+
+ if want_recovery and gold_prompt:
+ recovery_attr_durations.append(float(time_attr_s))
+
+ faith_scores = None
+ if want_faith:
+ t1 = time.perf_counter()
+ faith_scores = _compute_faithfulness_scores(
+ testing_dict,
+ attr_list=attr_list,
+ prompt_len=prompt_len,
+ prompt=ex.prompt,
+ generation=target,
+ llm_evaluator=llm_evaluator,
+ user_prompt_indices=user_prompt_indices,
+ keep_prompt_token_indices=keep_prompt_token_indices,
+ )
+ time_faith_s = time.perf_counter() - t1
+ faith_results.append(faith_scores)
+ faith_durations.append(float(time_attr_s))
+
+ recovery_scores = None
+ recovery_skip_reason = None
+ if want_recovery:
+ if not gold_prompt:
+ recovery_skip_reason = "empty_gold_prompt"
+ recovery_skipped += 1
+ else:
+ t2 = time.perf_counter()
+ recovery_scores, recovery_skip_reason = _compute_recovery_scores(
+ testing_dict,
+ attr_list=attr_list,
+ prompt_len=prompt_len,
+ gold_prompt_token_indices=gold_prompt,
+ llm_evaluator=llm_evaluator,
+ keep_prompt_token_indices=keep_prompt_token_indices,
+ )
+ time_recovery_s = time.perf_counter() - t2
+ if recovery_scores is None:
+ recovery_skipped += 1
+ else:
+ recovery_results.append(recovery_scores)
+
+ if manifest_handle is not None and trace_dir is not None:
+ try:
+ payload = _build_sample_trace_payload(
+ ex,
+ attr_list=attr_list,
+ prompt_len=prompt_len,
+ user_prompt_indices=user_prompt_indices,
+ keep_prompt_token_indices=keep_prompt_token_indices,
+ gold_prompt_token_indices=gold_prompt,
+ hop_payload=hop_payload,
+ faithfulness_scores=faith_scores,
+ recovery_scores=recovery_scores,
+ time_attr_s=time_attr_s,
+ time_faith_s=time_faith_s,
+ time_recovery_s=time_recovery_s,
+ )
+ _write_sample_trace(
+ trace_dir,
+ example_idx=example_idx,
+ attr_func=str(testing_dict.get("attr_func") or ""),
+ prompt=ex.prompt,
+ target=target,
+ payload=payload,
+ manifest_handle=manifest_handle,
+ recovery_skipped_reason=recovery_skip_reason,
+ )
+ except Exception as exc:
+ print(f"[warn] sample trace save failed for {testing_dict.get('attr_func')} ex={example_idx}: {exc}")
+ finally:
+ if manifest_handle is not None:
+ try:
+ manifest_handle.close()
+ except Exception:
+ pass
+
+ out: Dict[str, Any] = {}
+ if want_faith:
+ if not faith_results:
+ out["faithfulness"] = None
+ else:
+ scores = np.stack(faith_results, axis=0) # [N, 3, 3]
+ out["faithfulness"] = {
+ "mean": scores.mean(0),
+ "std": scores.std(0),
+ "avg_time": float(np.mean(faith_durations)) if faith_durations else 0.0,
+ }
+ if want_recovery:
+ if not recovery_results:
+ out["recovery"] = None
+ else:
+ scores = np.stack(recovery_results, axis=0) # [N, 3]
+ out["recovery"] = {
+ "mean": scores.mean(0),
+ "std": scores.std(0),
+ "avg_time": float(np.mean(recovery_attr_durations)) if recovery_attr_durations else 0.0,
+ "used": int(scores.shape[0]),
+ "skipped": int(recovery_skipped),
+ }
+
+ return out
+
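The aggregation at the end of `evaluate_dataset_multi` can be sketched with dummy data (shapes only; the score semantics come from the evaluator). Per the CSV written later, rows index seq/row/recursive attribution and columns index RISE/MAS/RISE+AP:

```python
import numpy as np

# Pretend N=4 samples, each yielding a [3, 3] score matrix.
per_sample = [np.full((3, 3), float(i)) for i in range(4)]
scores = np.stack(per_sample, axis=0)   # shape (4, 3, 3)
mean, std = scores.mean(0), scores.std(0)  # reduce over the sample axis
print(scores.shape, mean[0, 0])         # (4, 3, 3) 1.5
```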
+
+def _faithfulness_test_with_user_prompt_indices(
+ llm_evaluator: llm_attr_eval.LLMAttributionEvaluator,
+ attribution: torch.Tensor,
+ prompt: str,
+ generation: str,
+ *,
+ user_prompt_indices: List[int],
+    k: int = 20,  # number of MAS perturbation steps per sample
+) -> Tuple[float, float, float]:
+ """Token-level MAS/RISE faithfulness via guided deletion in k perturbation steps using provided prompt indices.
+
+ This mirrors llm_attr_eval.LLMAttributionEvaluator.faithfulness_test, but avoids
+ locating the user prompt span via token-id subsequence matching (which may fail
+ for some tokenizers due to non-compositional BPE merges at template boundaries).
+ """
+
+ def auc(arr: np.ndarray) -> float:
+ return (arr.sum() - arr[0] / 2 - arr[-1] / 2) / max(1, (arr.shape[0] - 1))
+
+ pad_token_id = llm_evaluator._ensure_pad_token_id()
+
+ user_prompt = " " + prompt
+ formatted_prompt = llm_evaluator.format_prompt(user_prompt)
+ formatted_ids = llm_evaluator.tokenizer(formatted_prompt, return_tensors="pt", add_special_tokens=False).input_ids
+
+ prompt_ids = formatted_ids.to(llm_evaluator.device)
+ prompt_ids_perturbed = prompt_ids.clone()
+ generation_ids = llm_evaluator.tokenizer(
+ generation + llm_evaluator.tokenizer.eos_token,
+ return_tensors="pt",
+ add_special_tokens=False,
+ ).input_ids.to(llm_evaluator.device)
+
+ attr_cpu = attribution.detach().cpu()
+ w = attr_cpu.sum(0)
+ sorted_attr_indices = torch.argsort(w, descending=True)
+ attr_sum = float(w.sum().item())
+
+ P = int(w.numel())
+ if len(user_prompt_indices) != P:
+ raise ValueError(
+ "user_prompt_indices length does not match prompt-side attribution length: "
+ f"indices P={len(user_prompt_indices)}, attr P={P}."
+ )
+ if P == 0:
+ return 0.0, 0.0, 0.0
+
+ if max(user_prompt_indices) >= int(prompt_ids_perturbed.shape[1]):
+ raise ValueError("user_prompt_indices contains an out-of-bounds index for formatted prompt ids.")
+
+    # P > 0 is guaranteed here (the P == 0 case returned above).
+    steps = int(k) if k is not None else 0
+    if steps <= 0:
+        steps = 1
+    steps = min(steps, P)
+
+ scores = np.zeros(steps + 1, dtype=np.float64)
+ density = np.zeros(steps + 1, dtype=np.float64)
+
+ scores[0] = (
+ llm_evaluator.compute_logprob_response_given_prompt(prompt_ids_perturbed, generation_ids).sum().cpu().detach().item()
+ )
+ density[0] = 1.0
+
+ if attr_sum <= 0:
+ density = np.linspace(1.0, 0.0, steps + 1)
+
+ base = P // steps
+ remainder = P % steps
+ start = 0
+ for step in range(steps):
+ size = base + (1 if step < remainder else 0)
+ group = sorted_attr_indices[start : start + size]
+ start += size
+
+ for idx in group:
+ j = int(idx.item())
+ abs_pos = int(user_prompt_indices[j])
+ prompt_ids_perturbed[0, abs_pos] = pad_token_id
+ scores[step + 1] = (
+ llm_evaluator.compute_logprob_response_given_prompt(prompt_ids_perturbed, generation_ids).sum().cpu().detach().item()
+ )
+ if attr_sum > 0:
+ dec = float(w.index_select(0, group).sum().item()) / attr_sum
+ density[step + 1] = density[step] - dec
+
+ min_normalized_pred = 1.0
+ normalized_model_response = scores.copy()
+ for i in range(len(scores)):
+        normalized_pred = (normalized_model_response[i] - scores[-1]) / max(abs(scores[0] - scores[-1]), 1e-12)
+ normalized_pred = np.clip(normalized_pred, 0.0, 1.0)
+ min_normalized_pred = min(min_normalized_pred, normalized_pred)
+ normalized_model_response[i] = min_normalized_pred
+
+ alignment_penalty = np.abs(normalized_model_response - density)
+ corrected_scores = normalized_model_response + alignment_penalty
+ corrected_scores = corrected_scores.clip(0.0, 1.0)
+ corrected_scores = (corrected_scores - np.min(corrected_scores)) / (np.max(corrected_scores) - np.min(corrected_scores))
+
+ if np.isnan(corrected_scores).any():
+ corrected_scores = np.linspace(1.0, 0.0, len(scores))
+
+ return auc(normalized_model_response), auc(corrected_scores), auc(normalized_model_response + alignment_penalty)
+
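The deletion-curve bookkeeping above (an even partition of P prompt tokens into k groups, plus a trapezoidal AUC over evenly spaced steps) can be checked in isolation. This is a minimal sketch; `partition_sizes` is an illustrative name for the inline base/remainder loop:

```python
import numpy as np

def auc(arr: np.ndarray) -> float:
    # Trapezoidal area under a curve sampled at evenly spaced steps.
    return (arr.sum() - arr[0] / 2 - arr[-1] / 2) / max(1, arr.shape[0] - 1)

def partition_sizes(P: int, steps: int) -> list[int]:
    # Split P token indices into `steps` near-equal groups; early groups absorb the remainder.
    base, remainder = P // steps, P % steps
    return [base + (1 if s < remainder else 0) for s in range(steps)]

print(auc(np.linspace(1.0, 0.0, 5)))  # straight line from 1 to 0 -> area 0.5
print(partition_sizes(10, 4))         # [3, 3, 2, 2]
```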
+
+def load_model(model_name: str, device: str):
+    if device == "auto":
+        device_map = "auto"
+    elif device.startswith("cuda:"):
+        device_map = {"": int(device.split(":")[1])}
+    else:
+        device_map = None
+    model = AutoModelForCausalLM.from_pretrained(
+        model_name,
+        device_map=device_map,
+        torch_dtype=torch.float16,
+        attn_implementation="eager",
+    )
+ tokenizer = AutoTokenizer.from_pretrained(model_name)
+ tokenizer.pad_token = tokenizer.eos_token
+ model.eval()
+ return model, tokenizer
+
+
+def resolve_device(args) -> str:
+ if args.cuda is not None and "," in args.cuda:
+ os.environ["CUDA_VISIBLE_DEVICES"] = args.cuda
+ return "auto"
+ if args.cuda is not None and args.cuda.strip():
+ return f"cuda:{args.cuda}" if torch.cuda.is_available() else "cpu"
+ return f"cuda:{args.cuda_num}" if torch.cuda.is_available() else "cpu"
+
+
+def run_attribution(
+ testing_dict, example: ds_utils.CachedExample, target: Optional[str]
+) -> Tuple[List[torch.Tensor], Optional[Dict[str, np.ndarray]], Optional[List[int]], Optional[List[int]]]:
+ model = testing_dict["model"]
+ tokenizer = testing_dict["tokenizer"]
+ attr_func = testing_dict["attr_func"]
+
+ indices_to_explain = example.indices_to_explain
+ if not (isinstance(indices_to_explain, list) and len(indices_to_explain) == 2):
+ raise ValueError(
+ "exp2 requires token-span indices_to_explain=[start_tok,end_tok]. "
+ "Please re-sample or run exp/exp2/migrate_indices_to_explain_token_span.py on your cache."
+ )
+
+ llm_attributor = None
+ if "IG" in attr_func:
+ llm_attributor = llm_attr.LLMGradientAttribtion(model, tokenizer)
+ attr = llm_attributor.calculate_IG_per_generation(
+ example.prompt,
+ 20,
+ tokenizer.eos_token_id,
+ batch_size=testing_dict["batch_size"],
+ target=target,
+ )
+ elif "perturbation" in attr_func:
+ if attr_func in ("perturbation_all_fast", "perturbation_CLP_fast", "perturbation_REAGENT_fast"):
+ import perturbation_fast
+
+ llm_attributor = perturbation_fast.LLMPerturbationFastAttribution(model, tokenizer)
+ if attr_func == "perturbation_all_fast":
+ attr = llm_attributor.calculate_feature_ablation_segments(
+ example.prompt,
+ baseline=tokenizer.eos_token_id,
+ measure="log_loss",
+ target=target,
+ source_k=20,
+ )
+ elif attr_func == "perturbation_CLP_fast":
+ attr = llm_attributor.calculate_feature_ablation_segments(
+ example.prompt,
+ baseline=tokenizer.eos_token_id,
+ measure="KL",
+ target=target,
+ source_k=20,
+ )
+ else:
+ attr = llm_attributor.calculate_feature_ablation_segments_mlm(
+ example.prompt,
+ target=target,
+ source_k=20,
+ )
+ else:
+ llm_attributor = llm_attr.LLMPerturbationAttribution(model, tokenizer)
+ if attr_func == "perturbation_all":
+ attr = llm_attributor.calculate_feature_ablation_sentences(
+ example.prompt, baseline=tokenizer.eos_token_id, measure="log_loss", target=target
+ )
+ elif attr_func == "perturbation_CLP":
+ attr = llm_attributor.calculate_feature_ablation_sentences(
+ example.prompt, baseline=tokenizer.eos_token_id, measure="KL", target=target
+ )
+ elif attr_func == "perturbation_REAGENT":
+ attr = llm_attributor.calculate_feature_ablation_sentences_mlm(example.prompt, target=target)
+ else:
+ raise ValueError(f"Unsupported perturbation attr_func {attr_func}")
+ elif "attention" in attr_func:
+ llm_attributor = llm_attr.LLMAttentionAttribution(model, tokenizer)
+ llm_attributor_ig = llm_attr.LLMGradientAttribtion(model, tokenizer)
+ attr = llm_attributor.calculate_attention_attribution(example.prompt, target=target)
+ attr_b = llm_attributor_ig.calculate_IG_per_generation(
+ example.prompt, 20, tokenizer.eos_token_id, batch_size=testing_dict["batch_size"], target=target
+ )
+ attr.attribution_matrix = attr.attribution_matrix * attr_b.attribution_matrix
+ elif attr_func == "ifr_all_positions":
+ llm_attributor = llm_attr.LLMIFRAttribution(
+ model,
+ tokenizer,
+ chunk_tokens=testing_dict["chunk_tokens"],
+ sink_chunk_tokens=testing_dict["sink_chunk_tokens"],
+ )
+ attr = llm_attributor.calculate_ifr_for_all_positions(example.prompt, target=target)
+ elif attr_func == "ifr_all_positions_output_only":
+ llm_attributor = llm_attr.LLMIFRAttribution(
+ model,
+ tokenizer,
+ chunk_tokens=testing_dict["chunk_tokens"],
+ sink_chunk_tokens=testing_dict["sink_chunk_tokens"],
+ )
+ sink_span = tuple(example.sink_span) if example.sink_span else tuple(indices_to_explain)
+ attr = llm_attributor.calculate_ifr_for_all_positions_output_only(
+ example.prompt,
+ target=target,
+ sink_span=sink_span,
+ )
+ elif attr_func == "ifr_multi_hop":
+ llm_attributor = llm_attr.LLMIFRAttribution(
+ model,
+ tokenizer,
+ chunk_tokens=testing_dict["chunk_tokens"],
+ sink_chunk_tokens=testing_dict["sink_chunk_tokens"],
+ )
+ attr = llm_attributor.calculate_ifr_multi_hop(
+ example.prompt,
+ target=target,
+ sink_span=tuple(example.sink_span) if example.sink_span else None,
+ thinking_span=tuple(example.thinking_span) if example.thinking_span else None,
+ n_hops=testing_dict["n_hops"],
+ )
+ elif attr_func == "ifr_in_all_gen":
+ import ft_ifr_improve
+
+ llm_attributor = ft_ifr_improve.LLMIFRAttributionInAllGen(
+ model,
+ tokenizer,
+ chunk_tokens=testing_dict["chunk_tokens"],
+ sink_chunk_tokens=testing_dict["sink_chunk_tokens"],
+ )
+ attr = llm_attributor.calculate_ifr_in_all_gen(
+ example.prompt,
+ target=target,
+ sink_span=tuple(example.sink_span) if example.sink_span else None,
+ thinking_span=tuple(example.thinking_span) if example.thinking_span else None,
+ n_hops=testing_dict["n_hops"],
+ )
+ elif attr_func == "ifr_multi_hop_stop_words":
+ import ft_ifr_improve
+
+ llm_attributor = ft_ifr_improve.LLMIFRAttributionImproved(
+ model,
+ tokenizer,
+ chunk_tokens=testing_dict["chunk_tokens"],
+ sink_chunk_tokens=testing_dict["sink_chunk_tokens"],
+ )
+ attr = llm_attributor.calculate_ifr_multi_hop_stop_words(
+ example.prompt,
+ target=target,
+ sink_span=tuple(example.sink_span) if example.sink_span else None,
+ thinking_span=tuple(example.thinking_span) if example.thinking_span else None,
+ n_hops=testing_dict["n_hops"],
+ )
+ elif attr_func == "ifr_multi_hop_both":
+ import ft_ifr_improve
+
+ llm_attributor = ft_ifr_improve.LLMIFRAttributionBoth(
+ model,
+ tokenizer,
+ chunk_tokens=testing_dict["chunk_tokens"],
+ sink_chunk_tokens=testing_dict["sink_chunk_tokens"],
+ )
+ attr = llm_attributor.calculate_ifr_multi_hop_both(
+ example.prompt,
+ target=target,
+ sink_span=tuple(example.sink_span) if example.sink_span else None,
+ thinking_span=tuple(example.thinking_span) if example.thinking_span else None,
+ n_hops=testing_dict["n_hops"],
+ )
+ elif attr_func == "ifr_multi_hop_split_hop":
+ import ft_ifr_improve
+
+ llm_attributor = ft_ifr_improve.LLMIFRAttributionSplitHop(
+ model,
+ tokenizer,
+ chunk_tokens=testing_dict["chunk_tokens"],
+ sink_chunk_tokens=testing_dict["sink_chunk_tokens"],
+ )
+ attr = llm_attributor.calculate_ifr_multi_hop_split_hop(
+ example.prompt,
+ target=target,
+ sink_span=tuple(example.sink_span) if example.sink_span else None,
+ thinking_span=tuple(example.thinking_span) if example.thinking_span else None,
+ n_hops=testing_dict["n_hops"],
+ )
+ elif attr_func == "attnlrp":
+ llm_attributor = llm_attr.LLMLRPAttribution(model, tokenizer)
+ attr = llm_attributor.calculate_attnlrp_ft_hop0(
+ example.prompt,
+ target=target,
+ sink_span=tuple(example.sink_span) if example.sink_span else None,
+ thinking_span=tuple(example.thinking_span) if example.thinking_span else None,
+ neg_handling=str(testing_dict.get("attnlrp_neg_handling", "drop")),
+ norm_mode=str(testing_dict.get("attnlrp_norm_mode", "norm")),
+ )
+ elif attr_func in ("ft_attnlrp", "attnlrp_aggregated_multi_hop"):
+ llm_attributor = llm_attr.LLMLRPAttribution(model, tokenizer)
+ attr = llm_attributor.calculate_attnlrp_aggregated_multi_hop(
+ example.prompt,
+ target=target,
+ sink_span=tuple(example.sink_span) if example.sink_span else None,
+ thinking_span=tuple(example.thinking_span) if example.thinking_span else None,
+ n_hops=testing_dict["n_hops"],
+ neg_handling=str(testing_dict.get("attnlrp_neg_handling", "drop")),
+ norm_mode=str(testing_dict.get("attnlrp_norm_mode", "norm")),
+ )
+ elif attr_func == "basic":
+ llm_attributor = llm_attr.LLMBasicAttribution(model, tokenizer)
+ attr = llm_attributor.calculate_basic_attribution(example.prompt, target=target)
+ else:
+ raise ValueError(f"Unsupported attr_func {attr_func}")
+
+ seq_attr, row_attr, rec_attr = attr.get_all_token_attrs(indices_to_explain)
+ hop_payload = None
+ if bool(testing_dict.get("save_hop_traces", False)):
+ try:
+ hop_payload = _build_hop_trace_payload(attr_func, attr, indices_to_explain=indices_to_explain)
+ except Exception as exc:
+ print(f"[warn] hop trace extraction failed for {attr_func}: {exc}")
+ hop_payload = None
+
+ user_prompt_indices = getattr(llm_attributor, "user_prompt_indices", None)
+ if isinstance(user_prompt_indices, list):
+ user_prompt_indices = [int(x) for x in user_prompt_indices]
+ else:
+ user_prompt_indices = None
+
+ keep_prompt_token_indices = None
+ if attr_func in ("ifr_multi_hop_stop_words", "ifr_multi_hop_both"):
+ try:
+ import ft_ifr_improve
+
+ keep_prompt_token_indices = ft_ifr_improve.keep_token_indices(list(attr.prompt_tokens))
+ except Exception:
+ keep_prompt_token_indices = None
+
+ return [seq_attr, row_attr, rec_attr], hop_payload, user_prompt_indices, keep_prompt_token_indices
+
+
+def faithfulness_generation(
+ testing_dict, example: ds_utils.CachedExample, target: str, llm_evaluator
+) -> Tuple[np.ndarray, Optional[Dict[str, np.ndarray]]]:
+ prompt = example.prompt
+ generation = target
+
+ attr_func = str(testing_dict.get("attr_func") or "")
+ attr_list, hop_payload, user_prompt_indices, keep_prompt_token_indices = run_attribution(
+ testing_dict, example, target
+ )
+ seq_attr = attr_list[0]
+ prompt_len = int(seq_attr.shape[1] - seq_attr.shape[0]) # cols=(P+G), rows=G
+
+ results = []
+ for attr in attr_list:
+ # Only use prompt-side attribution, matching evaluations/faithfulness.py
+ attr_prompt = attr[:, :prompt_len]
+ if attr_func in ("ifr_multi_hop_stop_words", "ifr_multi_hop_both") and keep_prompt_token_indices is not None:
+ import ft_ifr_improve
+
+ scores = ft_ifr_improve.faithfulness_test_skip_tokens(
+ llm_evaluator,
+ attr_prompt,
+ prompt,
+ generation,
+ keep_prompt_token_indices=keep_prompt_token_indices,
+ user_prompt_indices=user_prompt_indices,
+ )
+ elif user_prompt_indices is not None:
+ scores = _faithfulness_test_with_user_prompt_indices(
+ llm_evaluator,
+ attr_prompt,
+ prompt,
+ generation,
+ user_prompt_indices=user_prompt_indices,
+ )
+ else:
+ scores = llm_evaluator.faithfulness_test(attr_prompt, prompt, generation)
+ results.append(scores)
+
+ return np.array(results), hop_payload
+
+
+def evaluate_dataset(args, dataset_name: str, examples: List[ds_utils.CachedExample], testing_dict):
+ out = evaluate_dataset_multi(args, dataset_name, examples, testing_dict, modes=["faithfulness_gen"])
+ faith = out.get("faithfulness")
+ if not faith:
+ return None
+ return faith["mean"], faith["std"], faith["avg_time"]
+
+
+def evaluate_dataset_recovery_ruler(args, dataset_name: str, examples: List[ds_utils.CachedExample], testing_dict):
+ out = evaluate_dataset_multi(args, dataset_name, examples, testing_dict, modes=["recovery_ruler"])
+ rec = out.get("recovery")
+ if not rec:
+ return None
+ return rec["mean"], rec["std"], rec["avg_time"], rec["used"], rec["skipped"]
+
+
+def main():
+ parser = argparse.ArgumentParser("Experiment 2 runner (math skipped, AT2 skipped).")
+ parser.add_argument("--datasets", type=str, required=True, help="Comma-separated names or paths.")
+ parser.add_argument("--attr_funcs", type=str, required=True, help="Comma-separated attr funcs (no AT2).")
+ parser.add_argument("--model", type=str, default=None, help="HF repo id (required unless --model_path set).")
+ parser.add_argument("--model_path", type=str, default=None, help="Local path; overrides --model for loading.")
+ parser.add_argument("--cuda", type=str, default=None)
+ parser.add_argument("--cuda_num", type=int, default=0)
+ parser.add_argument("--num_examples", type=int, default=100)
+ parser.add_argument(
+ "--mode",
+ type=str,
+ nargs="+",
+ default=["faithfulness_gen"],
+ help=(
+ "One or more of: faithfulness_gen, recovery_ruler. "
+ "Accepts comma-separated values, e.g. '--mode faithfulness_gen,recovery_ruler' "
+ "or '--mode faithfulness_gen, recovery_ruler'."
+ ),
+ )
+ parser.add_argument("--sample", type=int, default=None, help="Optional subsample before num_examples.")
+ parser.add_argument("--seed", type=int, default=42)
+ parser.add_argument("--chunk_tokens", type=int, default=128)
+ parser.add_argument("--sink_chunk_tokens", type=int, default=32)
+ parser.add_argument("--n_hops", type=int, default=3)
+ parser.add_argument(
+ "--attnlrp_neg_handling",
+ type=str,
+ choices=["drop", "abs"],
+ default="drop",
+ help="FT-AttnLRP: how to handle negative values after each hop (drop=clamp>=0, abs=absolute value).",
+ )
+ parser.add_argument(
+ "--attnlrp_norm_mode",
+ type=str,
+ choices=["norm", "no_norm"],
+ default="norm",
+ help="FT-AttnLRP: norm enables per-hop global+thinking normalization + ratios; no_norm disables all three.",
+ )
+ parser.add_argument("--data_root", type=str, default="exp/exp2/data", help="Filtered dataset cache directory.")
+ parser.add_argument("--output_root", type=str, default="exp/exp2/output", help="Directory to store evaluation outputs.")
+ parser.add_argument(
+ "--save_hop_traces",
+ action="store_true",
+ help=(
+ "Save per-sample trace artifacts (attribution vectors + per-sample metrics) under output_root/traces/. "
+ "For multi-hop methods, also saves per-hop token vectors (vh)."
+ ),
+ )
+ args = parser.parse_args()
+ modes = _parse_modes(args.mode)
+
+ if args.model_path:
+ model_name = args.model_path
+ elif args.model:
+ model_name = args.model
+ else:
+ raise SystemExit("Please set --model or --model_path.")
+ model_tag = args.model if args.model else Path(args.model_path).name
+
+ datasets = [d.strip() for d in args.datasets.split(",") if d.strip()]
+ attr_funcs = [a.strip() for a in args.attr_funcs.split(",") if a.strip()]
+
+ device = resolve_device(args)
+ model, tokenizer = load_model(model_name, device)
+
+ max_input_len = {
+ "llama-1B": 5500,
+ "llama-3B": 4800,
+ "llama-8B": 3500,
+ "qwen-1.7B": 5500,
+ "qwen-4B": 3500,
+ "qwen-8B": 5000,
+ "qwen-32B": 1500,
+ "gemma-12B": 1500,
+ "gemma-27B": 2000,
+ }.get(args.model, 2000)
+
+ for ds_name in datasets:
+ if "recovery_ruler" in modes and ds_name == "morehopqa":
+ raise SystemExit("recovery_ruler only supports RULER datasets (with needle_spans), not morehopqa.")
+ if "recovery_ruler" in modes and ds_name.startswith("math"):
+ raise SystemExit("recovery_ruler only supports RULER datasets (with needle_spans), not math.")
+
+ # Resolve dataset (prefer prepared cache under data_root)
+ cached_path = Path(args.data_root) / f"{ds_name}.jsonl"
+ if cached_path.exists():
+ examples = ds_utils.load_cached(cached_path, sample=args.sample, seed=args.seed)
+ else:
+ # allow direct cached path or raw loader
+ p = Path(ds_name)
+ if p.exists():
+ examples = ds_utils.load_cached(p, sample=args.sample, seed=args.seed)
+ else:
+ hint = "please run exp/exp2/sample_and_filter.py first (or pass an explicit cached JSONL path)."
+ if ds_name.startswith("math"):
+ hint = "please run exp/exp2/map_math_mine_to_exp2_cache.py first (or pass an explicit cached JSONL path)."
+ raise SystemExit(f"Missing exp2 cache for '{ds_name}'. Expected {cached_path}; {hint}")
+
+ for attr_func in attr_funcs:
+ if attr_func.lower() == "at2":
+ print("Skipping AT2 as requested.")
+ continue
+
+            testing_dict: Dict[str, Any] = {
+ "model": model,
+ "model_tag": model_tag,
+ "tokenizer": tokenizer,
+ "attr_func": attr_func,
+ "max_input_len": max_input_len,
+ "chunk_tokens": args.chunk_tokens,
+ "sink_chunk_tokens": args.sink_chunk_tokens,
+ "n_hops": args.n_hops,
+ "attnlrp_neg_handling": args.attnlrp_neg_handling,
+ "attnlrp_norm_mode": args.attnlrp_norm_mode,
+ "device": device,
+ "batch_size": 1,
+ "save_hop_traces": bool(args.save_hop_traces),
+ }
+ result = evaluate_dataset_multi(args, ds_name, examples, testing_dict, modes=modes)
+
+ if "faithfulness_gen" in modes:
+ faith = result.get("faithfulness")
+ if not faith:
+ print(f"No faithfulness results for {ds_name} with {attr_func}.")
+ else:
+ mean = faith["mean"]
+ std = faith["std"]
+ avg_time = float(faith["avg_time"])
+
+ out_dir = Path(args.output_root) / "faithfulness" / ds_name / model_tag
+ out_dir.mkdir(parents=True, exist_ok=True)
+ filename = f"{attr_func}_{args.num_examples}_examples.csv"
+ with open(out_dir / filename, "w") as f:
+ f.write("Method,RISE,MAS,RISE+AP\n")
+ f.write(",".join(["Seq Attr Scores Mean"] + [str(x) for x in mean[0].tolist()]) + "\n")
+ f.write(",".join(["Row Attr Scores Mean"] + [str(x) for x in mean[1].tolist()]) + "\n")
+ f.write(",".join(["Recursive Attr Scores Mean"] + [str(x) for x in mean[2].tolist()]) + "\n")
+ f.write(",".join(["Seq Attr Scores Var"] + [str(x) for x in std[0].tolist()]) + "\n")
+ f.write(",".join(["Row Attr Scores Var"] + [str(x) for x in std[1].tolist()]) + "\n")
+ f.write(",".join(["Recursive Attr Scores Var"] + [str(x) for x in std[2].tolist()]) + "\n")
+ f.write(f"Avg Sample Time (s),{avg_time}\n")
+ print(f"[{ds_name}] {attr_func} -> {out_dir/filename} (avg sample time: {avg_time:.2f}s)")
+
+ if "recovery_ruler" in modes:
+ rec = result.get("recovery")
+ if not rec:
+ print(f"No recovery results for {ds_name} with {attr_func}.")
+ else:
+ mean = rec["mean"]
+ std = rec["std"]
+ avg_time = float(rec["avg_time"])
+ used = int(rec["used"])
+ skipped = int(rec["skipped"])
+
+ out_dir = Path(args.output_root) / "recovery" / ds_name / model_tag
+ out_dir.mkdir(parents=True, exist_ok=True)
+ filename = f"{attr_func}_{args.num_examples}_examples.csv"
+ with open(out_dir / filename, "w") as f:
+ f.write("Method,Recovery@10%\n")
+ f.write(f"Seq Attr Recovery Mean,{mean[0]}\n")
+ f.write(f"Row Attr Recovery Mean,{mean[1]}\n")
+ f.write(f"Recursive Attr Recovery Mean,{mean[2]}\n")
+ f.write(f"Seq Attr Recovery Std,{std[0]}\n")
+ f.write(f"Row Attr Recovery Std,{std[1]}\n")
+ f.write(f"Recursive Attr Recovery Std,{std[2]}\n")
+ f.write(f"Examples Used,{used}\n")
+ f.write(f"Examples Skipped,{skipped}\n")
+ f.write(f"Avg Sample Time (s),{avg_time}\n")
+ print(
+ f"[{ds_name}] {attr_func} -> {out_dir/filename} "
+ f"(used={used} skipped={skipped} avg sample time: {avg_time:.2f}s)"
+ )
+
+
+if __name__ == "__main__":
+ main()
diff --git a/exp/exp2/sample_and_filter.py b/exp/exp2/sample_and_filter.py
new file mode 100644
index 0000000000000000000000000000000000000000..b03cadba062916618deb70a8d918037cfd5b961a
--- /dev/null
+++ b/exp/exp2/sample_and_filter.py
@@ -0,0 +1,363 @@
+#!/usr/bin/env python3
+"""
+Dataset sampler for Experiment 2.
+
+Steps:
+- Load a dataset item (MoreHopQA / HotpotQA / RULER niah / RULER vt).
+- Call the generation model (qwen3-235b-a22b-2507) with a system prompt that
+ asks for brief reasoning and a final answer wrapped in \\box{}.
+- Enforce the output format: keep only generations that look like
+ " + final \\box{} answer" with nothing after the box.
+- Call the judge model (deepseek-v3-1-terminus) to check whether the boxed
+ answer matches the dataset reference answer; keep only judged True samples.
+- Rebuild `target` as "\\n" and store filtered
+ samples to exp/exp2/data/.jsonl (or a custom path) with inferred spans.
+"""
+
+from __future__ import annotations
+
+import argparse
+import json
+import os
+import sys
+import time
+import urllib.error
+import urllib.request
+from pathlib import Path
+from typing import Any, Dict, Iterable, List, Optional
+
+from transformers import AutoTokenizer
+from tqdm import tqdm
+
+REPO_ROOT = Path(__file__).resolve().parents[2]
+if str(REPO_ROOT) not in sys.path:
+ sys.path.insert(0, str(REPO_ROOT))
+
+from exp.exp2.dataset_utils import (
+ CachedExample,
+ DatasetLoader,
+ attach_spans_from_answer,
+ split_boxed_generation,
+)
+
+
+class RateLimitError(RuntimeError):
+ """Raised when API returns 429; carries a suggested wait time."""
+
+ def __init__(self, wait_seconds: float, detail: str) -> None:
+ super().__init__(detail)
+ self.wait_seconds = wait_seconds
+
+# GEN_SYSTEM_PROMPT = (
+# "You are a careful reasoning assistant. "
+# "Before answering, engage in an extremely detailed and exhaustive chain of thought. **No fewer than 2k tokens.** "
+# "Do not skip any logical steps, even if they seem obvious. "
+# "Process this freely and naturally without using specific headers or strict formatting. "
+# "When you reach the conclusion, wrap the entire final sentence containing the answer inside \\box{}. "
+# "Ensure the box wraps the **sentence** that naturally delivers the answer. DO NOT rewrite the answer word for the box separately."
+# )
+
+GEN_SYSTEM_PROMPT = (
+ "You are a reasoning assistant. "
+    "Before answering, engage in a chain of thought. "
+ "Process this freely and naturally without using specific headers or strict formatting. "
+ "When you reach the conclusion, wrap the entire final sentence containing the answer inside \\box{}. "
+ "Ensure the box wraps the **sentence** that naturally delivers the answer. DO NOT rewrite the answer word for the box separately."
+)
+
+JUDGE_SYSTEM_PROMPT = (
+ "You verify whether the model's boxed answer matches the reference answer. "
+ "Reply strictly with True or False and nothing else."
+)
+
+
+def call_chat_api(
+ api_base: str,
+ api_key: str,
+ model: str,
+ messages: List[Dict[str, str]],
+ *,
+ timeout: int,
+ max_tokens: int,
+ temperature: float,
+ cache_ttl: int,
+ cache_namespace: Optional[str],
+ rate_limit_delay: Optional[float] = None,
+) -> str:
+ url = api_base.rstrip("/") + "/chat/completions"
+ payload: Dict[str, Any] = {
+ "model": model,
+ "messages": messages,
+ "max_tokens": max_tokens,
+ "temperature": temperature,
+ }
+ if cache_ttl > 0:
+ cache_obj: Dict[str, Any] = {"ttl": cache_ttl}
+ if cache_namespace:
+ cache_obj["namespace"] = cache_namespace
+ payload["cache"] = cache_obj
+
+ data = json.dumps(payload).encode("utf-8")
+ headers = {"Content-Type": "application/json"}
+ if api_key:
+ headers["Authorization"] = f"Bearer {api_key}"
+
+ req = urllib.request.Request(url, data=data, headers=headers, method="POST")
+ opener = urllib.request.build_opener(urllib.request.ProxyHandler({}))
+ try:
+ with opener.open(req, timeout=timeout) as resp:
+ resp_bytes = resp.read()
+ except urllib.error.HTTPError as e:
+ detail = e.read().decode("utf-8", errors="ignore") if hasattr(e, "read") else ""
+ if e.code == 429:
+ retry_after = None
+ if hasattr(e, "headers") and e.headers:
+ retry_after_header = e.headers.get("Retry-After")
+ if retry_after_header:
+ try:
+ retry_after = float(retry_after_header)
+ except ValueError:
+ retry_after = None
+ wait = retry_after or rate_limit_delay or 5.0
+ raise RateLimitError(wait, f"API HTTP 429: {detail}") from e
+ raise RuntimeError(f"API HTTP error {e.code}: {detail}") from e
+ except urllib.error.URLError as e:
+ raise RuntimeError(f"API request failed: {e}") from e
+
+ try:
+ response = json.loads(resp_bytes.decode("utf-8"))
+ except json.JSONDecodeError as e:
+ raise RuntimeError(f"Failed to decode API response: {resp_bytes!r}") from e
+
+ choices = response.get("choices", [])
+ if not choices:
+ raise RuntimeError(f"Empty choices from API: {response}")
+ content = choices[0].get("message", {}).get("content", "")
+ if not content:
+ raise RuntimeError(f"Empty content from API: {response}")
+ return content.strip()
+
+
+def build_gen_messages(prompt: str) -> List[Dict[str, str]]:
+ return [
+ {"role": "system", "content": GEN_SYSTEM_PROMPT},
+ {"role": "user", "content": prompt},
+ ]
+
+
+def build_judge_messages(reference_answer: str, candidate_answer: str) -> List[Dict[str, str]]:
+ user = (
+ "Decide if the model's boxed answer matches the reference answer.\n"
+ f"Reference answer: {reference_answer}\n"
+ f"Model boxed answer (only the content inside \\box{{}}): {candidate_answer}\n"
+ "Output only True if they are semantically consistent; otherwise output False."
+ )
+ return [
+ {"role": "system", "content": JUDGE_SYSTEM_PROMPT},
+ {"role": "user", "content": user},
+ ]
+
+
+def parse_bool(text: str) -> bool:
+    lines = text.strip().splitlines()
+    first = lines[0].strip().lower() if lines else ""
+    if first in {"true", "yes"}:
+        return True
+    if first in {"false", "no"}:
+        return False
+ # fallback: check substring
+ if "true" in first and "false" not in first:
+ return True
+ if "false" in first:
+ return False
+ raise ValueError(f"Cannot parse boolean from: {text!r}")
+
+
+def write_cache(out_path: Path, examples: Iterable[CachedExample]) -> int:
+ out_path.parent.mkdir(parents=True, exist_ok=True)
+ count = 0
+ with out_path.open("w", encoding="utf-8") as f:
+ for ex in examples:
+ obj: Dict[str, Any] = {
+ "prompt": ex.prompt,
+ "target": ex.target,
+ "indices_to_explain": ex.indices_to_explain,
+ "attr_mask_indices": ex.attr_mask_indices,
+ "sink_span": ex.sink_span,
+ "thinking_span": ex.thinking_span,
+ "metadata": ex.metadata,
+ }
+ f.write(json.dumps(obj, ensure_ascii=False) + "\n")
+ count += 1
+ return count
+
+
+def main():
+ parser = argparse.ArgumentParser("Sample and filter dataset examples for exp2.")
+ parser.add_argument(
+ "--dataset",
+ type=str,
+ required=True,
+ help="morehopqa | hotpotqa_long | niah_* | vt_* | | ",
+ )
+ parser.add_argument("--max_examples", type=int, default=100, help="Number of raw examples to sample before filtering.")
+ parser.add_argument("--seed", type=int, default=42)
+ parser.add_argument("--api_base", type=str, default="http://localhost:4000/v1", help="Chat API base URL.")
+ parser.add_argument("--api_key", type=str, default=None, help="API key; defaults to FLASHTRACE_API_KEY/OPENAI_API_KEY.")
+ parser.add_argument("--generator_model", type=str, default="qwen3-235b-a22b-2507")
+ parser.add_argument("--judge_model", type=str, default="deepseek-v3-1-terminus")
+ parser.add_argument("--api_timeout", type=int, default=300)
+ parser.add_argument("--api_max_tokens", type=int, default=8192)
+ parser.add_argument("--api_temperature", type=float, default=0.0)
+ parser.add_argument("--api_cache_ttl", type=int, default=600)
+ parser.add_argument("--api_cache_namespace", type=str, default="flashtrace-exp2")
+ parser.add_argument("--retry_delay", type=float, default=2.0)
+ parser.add_argument("--retries", type=int, default=2, help="Additional retries on API failure.")
+ parser.add_argument("--request_interval", type=float, default=1.0, help="Sleep seconds between generation calls.")
+ parser.add_argument("--judge_interval", type=float, default=1.0, help="Sleep seconds between judge calls.")
+ parser.add_argument("--tokenizer_model", type=str, default=None, help="Tokenizer path for span extraction (default: generator model).")
+ parser.add_argument("--data_root", type=str, default="exp/exp2/data", help="Output directory for filtered caches.")
+ parser.add_argument("--out", type=str, default=None, help="Optional explicit output path (JSONL).")
+ parser.add_argument("--rate_limit_delay", type=float, default=5.0, help="Seconds to wait on HTTP 429 before retrying.")
+ args = parser.parse_args()
+
+ api_key = args.api_key or os.environ.get("FLASHTRACE_API_KEY") or os.environ.get("OPENAI_API_KEY")
+ if not api_key:
+ raise SystemExit("Set --api_key or FLASHTRACE_API_KEY/OPENAI_API_KEY for API access.")
+
+ loader = DatasetLoader(seed=args.seed, data_root=args.data_root)
+ # Load full dataset; we will stop early once enough kept examples are collected.
+ raw_examples = loader.load_raw(args.dataset, sample=None)
+ if not raw_examples:
+ raise SystemExit("No examples loaded.")
+
+ tok_name = args.tokenizer_model or args.generator_model
+ tok_path = Path(tok_name)
+ if tok_path.exists():
+ tokenizer = AutoTokenizer.from_pretrained(tok_path.as_posix(), local_files_only=True)
+ else:
+ tokenizer = AutoTokenizer.from_pretrained(tok_name)
+ tokenizer.pad_token = tokenizer.eos_token
+
+ kept: List[CachedExample] = []
+ total = len(raw_examples)
+ kept_bar = tqdm(total=args.max_examples, desc="Kept (judge=True)", position=1, leave=False)
+ attempted = 0
+
+ for idx, ex in enumerate(tqdm(raw_examples, total=total, desc="Sampling"), 1):
+ if len(kept) >= args.max_examples:
+ break
+ reference_answer = ex.metadata.get("reference_answer") or ex.target or ""
+ gen_messages = build_gen_messages(ex.prompt)
+ attempted = idx
+
+ # Step 1: generation
+ for attempt in range(args.retries + 1):
+ try:
+ generation = call_chat_api(
+ args.api_base,
+ api_key,
+ args.generator_model,
+ gen_messages,
+ timeout=args.api_timeout,
+ max_tokens=args.api_max_tokens,
+ temperature=args.api_temperature,
+ cache_ttl=args.api_cache_ttl,
+ cache_namespace=args.api_cache_namespace,
+ rate_limit_delay=args.rate_limit_delay,
+ )
+ break
+ except RateLimitError as e:
+ if attempt >= args.retries:
+ raise
+ time.sleep(e.wait_seconds)
+ except Exception: # noqa: BLE001
+ if attempt >= args.retries:
+ raise
+ time.sleep(args.retry_delay)
+ if args.request_interval > 0:
+ time.sleep(args.request_interval)
+
+ parsed = split_boxed_generation(generation)
+ if not parsed:
+ print(f"[{idx}/{total}] skipped=format")
+ continue
+
+ thinking_text, boxed_segment, boxed_answer = parsed
+ target_text = f"{thinking_text}\n{boxed_answer}" if thinking_text else boxed_answer
+ judge_messages = build_judge_messages(reference_answer, boxed_answer)
+
+ ok = False
+ judge_resp = ""
+ for attempt in range(args.retries + 1):
+ try:
+ judge_resp = call_chat_api(
+ args.api_base,
+ api_key,
+ args.judge_model,
+ judge_messages,
+ timeout=args.api_timeout,
+ max_tokens=64,
+ temperature=0.0,
+ cache_ttl=args.api_cache_ttl,
+ cache_namespace=args.api_cache_namespace,
+ rate_limit_delay=args.rate_limit_delay,
+ )
+ ok = parse_bool(judge_resp)
+ break
+ except RateLimitError as e:
+ if attempt >= args.retries:
+ raise
+ time.sleep(e.wait_seconds)
+ except Exception: # noqa: BLE001
+ if attempt >= args.retries:
+ raise
+ time.sleep(args.retry_delay)
+ if args.judge_interval > 0:
+ time.sleep(args.judge_interval)
+
+ status = "kept" if ok else "filtered"
+ print(f"[{idx}/{total}] judge={status}")
+ if not ok:
+ continue
+
+ new_meta = dict(ex.metadata)
+ new_meta["reference_answer"] = reference_answer
+ new_meta["judge_response"] = judge_resp
+
+ new_ex = CachedExample(
+ prompt=ex.prompt,
+ target=target_text,
+ indices_to_explain=None,
+ attr_mask_indices=ex.attr_mask_indices,
+ sink_span=None,
+ thinking_span=None,
+ metadata=new_meta,
+ )
+ new_ex = attach_spans_from_answer(new_ex, tokenizer, boxed_answer)
+ if not (isinstance(new_ex.sink_span, list) and len(new_ex.sink_span) == 2):
+ print(f"[{idx}/{total}] skipped=span")
+ continue
+
+ # Token-level indices_to_explain: boxed-inner answer token span in target (closed interval).
+ new_ex = CachedExample(
+ prompt=new_ex.prompt,
+ target=new_ex.target,
+ indices_to_explain=new_ex.sink_span,
+ attr_mask_indices=new_ex.attr_mask_indices,
+ sink_span=new_ex.sink_span,
+ thinking_span=new_ex.thinking_span,
+ metadata=new_ex.metadata,
+ )
+ kept.append(new_ex)
+ kept_bar.update(1)
+
+ kept_bar.close()
+
+ out_path = Path(args.out) if args.out else Path(args.data_root) / f"{args.dataset}.jsonl"
+ written = write_cache(out_path, kept)
+ attempted_total = attempted or 0
+ print(f"Kept {written} / target {args.max_examples} (attempted {attempted_total} / {total}) -> {out_path}")
+
+
+if __name__ == "__main__":
+ main()
diff --git a/exp/exp3/README.md b/exp/exp3/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..d17cce6bceaf6886a28c209b14ec3921ea009229
--- /dev/null
+++ b/exp/exp3/README.md
@@ -0,0 +1,50 @@
+# FlashTrace Experiment 3: long vs. short CoT comparison (case study)
+
+This directory provides a minimal, reproducible long-vs-short CoT experiment:
+- From RULER `niah_mq_q2 (1024)`, filter out two subsets:
+  - short-CoT: short reasoning + final answer in `\box{}`
+  - long-CoT: long reasoning + final answer in `\box{}`
+- Run only `attnlrp` (hop0) and compute only token-level `recovery@10%` (gold labels come from `needle_spans`).
+- Always save traces (npz + manifest) to `exp/exp3/output/`, in a format aligned with the traces written by `exp/exp2/run_exp.py`.
+
+## 1) Sampling and filtering (generation + judge)
+
+Reads by default:
+`data/ruler_multihop/1024/niah_mq_q2/validation.jsonl`
+
+Requires an OpenAI-compatible chat API (default `http://localhost:4000/v1`) and an API key.
+
+```bash
+export FLASHTRACE_API_KEY=... # or OPENAI_API_KEY
+
+python exp/exp3/sample_and_filter.py \
+ --tokenizer_model /opt/share/models/Qwen/Qwen3-8B/ \
+ --min_long_thinking_tokens 512 \
+ --max_short_thinking_tokens 256
+```
+
+Outputs (by default):
+- `exp/exp3/data/niah_mq_q2_short_cot.jsonl`
+- `exp/exp3/data/niah_mq_q2_long_cot.jsonl`
+
+Notes:
+- One example is kept per side by default; use `--max_short` / `--max_long` to set the counts separately (`--max_pairs` is a backward-compatible alias that sets both).
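
The sampler keeps only generations that follow the `\box{}` format. A minimal sketch of such a splitter (illustrative only; the repo's actual `split_boxed_generation` helper may behave differently):

```python
import re
from typing import Optional, Tuple


def split_boxed(generation: str) -> Optional[Tuple[str, str]]:
    """Split a generation into (thinking_text, boxed_answer).

    Returns None when no \\box{...} segment is found, mirroring how the
    sampler skips format failures.
    """
    m = re.search(r"\\box\{(.*?)\}", generation, flags=re.DOTALL)
    if not m:
        return None
    thinking = generation[: m.start()].strip()  # everything before the box
    boxed = m.group(1).strip()  # content inside the box
    return thinking, boxed
```

Generations whose box cannot be located this way are dropped with `skipped=format`.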
+
+## 2) Attribution and recovery (AttnLRP hop0)
+
+```bash
+python exp/exp3/run_exp.py \
+ --model qwen-8B \
+ --model_path /opt/share/models/Qwen/Qwen3-8B/ \
+ --cuda 3,4,5,7
+```
+
+Outputs:
+- recovery CSV: `exp/exp3/output/recovery/<dataset>/<model>/attnlrp_1_examples.csv`
+- traces: `exp/exp3/output/traces/<dataset>/<model>/<run_tag>/ex_*.npz` + `manifest.jsonl`
+- summary JSON: `exp/exp3/output/recovery/summary_<run_tag>.json`
+
+Common flags:
+- `--top_fraction`: top fraction for recovery (default 0.1)
+- `--attnlrp_neg_handling drop|abs`
+- `--attnlrp_norm_mode norm|no_norm`
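
For reference, `recovery@10%` ranks tokens by attributed importance and measures what fraction of the gold needle tokens land in the top 10%. A minimal sketch under that reading (the function name here is hypothetical; the metric actually computed by the runner lives in `llm_attr_eval` and may differ in details):

```python
import numpy as np


def recovery_at_fraction(scores: np.ndarray, gold_indices, top_fraction: float = 0.1) -> float:
    """Recall of gold token indices within the top-scoring fraction of tokens."""
    if len(gold_indices) == 0:
        return float("nan")
    # Number of tokens in the top fraction (at least one).
    k = max(1, int(round(top_fraction * scores.size)))
    # Indices of the k highest-scoring tokens.
    top = set(np.argsort(scores)[::-1][:k].tolist())
    return sum(1 for i in gold_indices if i in top) / len(gold_indices)
```

With `--top_fraction 0.1` this recovers the default Recall@10% reported in the CSVs.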
diff --git a/exp/exp3/extract_segment_weights.py b/exp/exp3/extract_segment_weights.py
new file mode 100644
index 0000000000000000000000000000000000000000..695af1588e102927635a01a2553ca381ec035f46
--- /dev/null
+++ b/exp/exp3/extract_segment_weights.py
@@ -0,0 +1,250 @@
+#!/usr/bin/env python3
+"""
+Extract CoT/output segment attribution weights from exp3 trace artifacts.
+
+Background
+----------
+exp/exp3/run_exp.py saves per-sample trace npz files that contain token-level
+importance vectors over the FULL (prompt + generation) token sequence:
+ - v_seq_all: sum over rows of seq attribution matrix (shape [P+G])
+ - v_row_all: row attribution vector for indices_to_explain (shape [P+G])
+ - v_rec_all: recursive attribution vector for indices_to_explain (shape [P+G])
+
+For exp3 cached samples, we also have generation-token spans:
+ - thinking_span_gen: CoT span [start,end] in generation-token coordinates
+ - sink_span_gen: output span [start,end] in generation-token coordinates
+
+This script slices v_*_all into:
+ - cot: tokens in thinking_span_gen (offset by prompt_len)
+ - output: tokens in sink_span_gen (offset by prompt_len)
+
+and reports segment sums/fractions (and optionally writes a JSON summary).
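+
+Example (hypothetical numbers): with prompt_len=100, thinking_span_gen=[0, 49]
+and sink_span_gen=[50, 59], the cot slice is v_*_all[100:150] and the output
+slice is v_*_all[150:160] (spans are closed intervals, hence the +1).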
+"""
+
+from __future__ import annotations
+
+import argparse
+import json
+from dataclasses import dataclass
+from pathlib import Path
+from typing import Any, Dict, List, Optional, Tuple
+
+import numpy as np
+
+
+@dataclass(frozen=True)
+class TracePaths:
+ dataset: str
+ model_tag: str
+ run_tag: str
+ npz_path: Path
+
+
+def _pick_latest_subdir(path: Path) -> Optional[Path]:
+ if not path.exists():
+ return None
+ subs = [p for p in path.iterdir() if p.is_dir()]
+ if not subs:
+ return None
+ subs.sort(key=lambda p: p.stat().st_mtime, reverse=True)
+ return subs[0]
+
+
+def _resolve_trace_paths(
+ *,
+ output_root: Path,
+ dataset: str,
+ model_tag: Optional[str],
+ run_tag: Optional[str],
+ example_idx: int,
+) -> TracePaths:
+ base = output_root / "traces" / dataset
+ if not base.exists():
+ raise FileNotFoundError(f"Trace dataset dir not found: {base}")
+
+ if model_tag is None:
+ model_dirs = [p for p in base.iterdir() if p.is_dir()]
+ if not model_dirs:
+ raise FileNotFoundError(f"No model subdir under: {base}")
+ if len(model_dirs) != 1:
+ raise SystemExit(f"Multiple model dirs under {base}; pass --model_tag. Found: {[p.name for p in model_dirs]}")
+ model_dir = model_dirs[0]
+ model_tag = model_dir.name
+ else:
+ model_dir = base / model_tag
+ if not model_dir.exists():
+ raise FileNotFoundError(f"Trace model dir not found: {model_dir}")
+
+ if run_tag is None:
+ run_dir = _pick_latest_subdir(model_dir)
+ if run_dir is None:
+ raise FileNotFoundError(f"No run subdir under: {model_dir}")
+ run_tag = run_dir.name
+ else:
+ run_dir = model_dir / run_tag
+ if not run_dir.exists():
+ raise FileNotFoundError(f"Trace run dir not found: {run_dir}")
+
+ npz_name = f"ex_{int(example_idx):06d}.npz"
+ npz_path = run_dir / npz_name
+ if not npz_path.exists():
+ raise FileNotFoundError(f"Trace npz not found: {npz_path}")
+
+ return TracePaths(dataset=dataset, model_tag=model_tag, run_tag=run_tag, npz_path=npz_path)
+
+
+def _as_span(arr: Any) -> Optional[Tuple[int, int]]:
+ if arr is None:
+ return None
+ try:
+ a = np.asarray(arr).reshape(-1).tolist()
+ except Exception:
+ return None
+ if len(a) != 2:
+ return None
+ try:
+ start = int(a[0])
+ end = int(a[1])
+ except Exception:
+ return None
+ if start < 0 or end < start:
+ return None
+ return start, end
+
+
+def _segment_stats(v: np.ndarray, start: int, end: int) -> Dict[str, float]:
+ if end < start:
+ return {"sum": 0.0, "mean": 0.0, "max": 0.0}
+ seg = v[start : end + 1]
+ if seg.size == 0:
+ return {"sum": 0.0, "mean": 0.0, "max": 0.0}
+ return {
+ "sum": float(seg.sum()),
+ "mean": float(seg.mean()),
+ "max": float(seg.max()),
+ }
+
+
+def _slice_segment(v: np.ndarray, start: int, end: int) -> List[float]:
+ if end < start:
+ return []
+ seg = v[start : end + 1]
+ return [float(x) for x in seg.tolist()]
+
+
+def extract_one(npz_path: Path) -> Dict[str, Any]:
+ d = np.load(npz_path)
+ required = ["prompt_len", "gen_len", "v_seq_all", "v_row_all", "v_rec_all"]
+ for k in required:
+ if k not in d:
+ raise KeyError(f"Missing key in trace npz {npz_path}: {k}")
+
+ prompt_len = int(np.asarray(d["prompt_len"]).item())
+ gen_len = int(np.asarray(d["gen_len"]).item())
+ total_len = prompt_len + gen_len
+
+ v_seq_all = np.asarray(d["v_seq_all"], dtype=np.float64).reshape(-1)
+ v_row_all = np.asarray(d["v_row_all"], dtype=np.float64).reshape(-1)
+ v_rec_all = np.asarray(d["v_rec_all"], dtype=np.float64).reshape(-1)
+ for name, v in [("v_seq_all", v_seq_all), ("v_row_all", v_row_all), ("v_rec_all", v_rec_all)]:
+ if int(v.size) != int(total_len):
+ raise ValueError(f"{name} length mismatch: expected {total_len}, got {int(v.size)}")
+
+ sink_span_gen = _as_span(d.get("sink_span_gen"))
+ thinking_span_gen = _as_span(d.get("thinking_span_gen"))
+ if sink_span_gen is None:
+ raise KeyError("Trace missing sink_span_gen; cannot define output span.")
+ if thinking_span_gen is None:
+ # Best-effort: infer thinking span as [0, sink_start-1].
+ sink_start, _ = sink_span_gen
+ thinking_span_gen = (0, max(0, sink_start - 1))
+
+ think_start_g, think_end_g = thinking_span_gen
+ sink_start_g, sink_end_g = sink_span_gen
+
+ cot_start = prompt_len + think_start_g
+ cot_end = min(prompt_len + think_end_g, total_len - 1)
+ out_start = prompt_len + sink_start_g
+ out_end = min(prompt_len + sink_end_g, total_len - 1)
+
+ def pack(v: np.ndarray) -> Dict[str, Any]:
+ total = float(v.sum())
+ cot = _segment_stats(v, cot_start, cot_end)
+ out = _segment_stats(v, out_start, out_end)
+ denom = cot["sum"] + out["sum"]
+ return {
+ "total_sum": total,
+ "cot": {
+ "start_abs": int(cot_start),
+ "end_abs": int(cot_end),
+ "len": int(max(0, cot_end - cot_start + 1)),
+ **cot,
+ "fraction_of_total": float(cot["sum"] / total) if total > 0 else float("nan"),
+ "fraction_of_cot_plus_output": float(cot["sum"] / denom) if denom > 0 else float("nan"),
+ },
+ "output": {
+ "start_abs": int(out_start),
+ "end_abs": int(out_end),
+ "len": int(max(0, out_end - out_start + 1)),
+ **out,
+ "fraction_of_total": float(out["sum"] / total) if total > 0 else float("nan"),
+ "fraction_of_cot_plus_output": float(out["sum"] / denom) if denom > 0 else float("nan"),
+ },
+ "cot_weights": _slice_segment(v, cot_start, cot_end),
+ "output_weights": _slice_segment(v, out_start, out_end),
+ }
+
+ return {
+ "prompt_len": int(prompt_len),
+ "gen_len": int(gen_len),
+ "total_len": int(total_len),
+ "thinking_span_gen": [int(think_start_g), int(think_end_g)],
+ "sink_span_gen": [int(sink_start_g), int(sink_end_g)],
+ "seq": pack(v_seq_all),
+ "row": pack(v_row_all),
+ "rec": pack(v_rec_all),
+ }
+
+
+def main() -> None:
+ parser = argparse.ArgumentParser("Extract CoT/output weights from exp3 traces.")
+ parser.add_argument("--output_root", type=str, default="exp/exp3/output")
+ parser.add_argument("--dataset_tag", type=str, default="niah_mq_q2")
+ parser.add_argument("--model_tag", type=str, default=None, help="If omitted, auto-detect when unique.")
+ parser.add_argument("--run_tag", type=str, default=None, help="If omitted, picks the latest run subdir.")
+ parser.add_argument("--example_idx", type=int, default=0)
+ parser.add_argument("--out", type=str, default=None, help="Optional JSON output path.")
+ args = parser.parse_args()
+
+ output_root = Path(args.output_root)
+ datasets = [f"{args.dataset_tag}_short_cot", f"{args.dataset_tag}_long_cot"]
+
+ results: List[Dict[str, Any]] = []
+ for ds_name in datasets:
+ paths = _resolve_trace_paths(
+ output_root=output_root,
+ dataset=ds_name,
+ model_tag=args.model_tag,
+ run_tag=args.run_tag,
+ example_idx=args.example_idx,
+ )
+ out = extract_one(paths.npz_path)
+ out["dataset"] = paths.dataset
+ out["model_tag"] = paths.model_tag
+ out["run_tag"] = paths.run_tag
+ out["npz_path"] = str(paths.npz_path)
+ results.append(out)
+
+ text = json.dumps(results, ensure_ascii=False, indent=2)
+ if args.out:
+ out_path = Path(args.out)
+ out_path.parent.mkdir(parents=True, exist_ok=True)
+ out_path.write_text(text + "\n", encoding="utf-8")
+ print(f"Wrote -> {out_path}")
+ else:
+ print(text)
+
+
+if __name__ == "__main__":
+ main()
+
diff --git a/exp/exp3/part_weights.py b/exp/exp3/part_weights.py
new file mode 100644
index 0000000000000000000000000000000000000000..331190ae903dbad6758cd8d1959fa0626a36116e
--- /dev/null
+++ b/exp/exp3/part_weights.py
@@ -0,0 +1,228 @@
+#!/usr/bin/env python3
+"""
+Compute attribution mass on (input, cot, output) segments from exp3 trace npz files.
+
+Definitions (token-level, aligned with exp2/exp3 runners):
+- input : prompt-side tokens (user prompt), indices [0, prompt_len)
+- cot : generation tokens in thinking span, indices [prompt_len + t0, prompt_len + t1]
+- output : generation tokens in sink span (answer), indices [prompt_len + s0, prompt_len + s1]
+
+The trace stores token-importance vectors:
+ - v_seq_all, v_row_all, v_rec_all (length = prompt_len + gen_len)
+
+This script sums those vectors over each segment and reports both absolute sums
+and fractions of the total sum.
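+
+Example (hypothetical numbers): with prompt_len=8, gen_len=12,
+thinking_span_gen=[0, 5] and sink_span_gen=[8, 11], the input segment covers
+absolute indices 0..7, cot covers 8..13, output covers 16..19, and the
+leftover generation tokens 14..15 are reported as "other".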
+"""
+
+from __future__ import annotations
+
+import argparse
+import json
+from dataclasses import dataclass
+from pathlib import Path
+from typing import Dict, Iterable, List, Optional, Tuple
+
+import numpy as np
+
+
+@dataclass(frozen=True)
+class TraceRun:
+ dataset: str
+ model: str
+ run_dir: Path
+
+
+def _pick_single_subdir(parent: Path) -> Path:
+ subdirs = [p for p in parent.iterdir() if p.is_dir()]
+ if not subdirs:
+ raise FileNotFoundError(f"No subdirectories found under {parent}")
+ if len(subdirs) == 1:
+ return subdirs[0]
+ subdirs.sort(key=lambda p: p.stat().st_mtime, reverse=True)
+ return subdirs[0]
+
+
+def _resolve_run(
+ trace_root: Path,
+ *,
+ dataset: str,
+ model: Optional[str],
+ run_tag: Optional[str],
+) -> TraceRun:
+ ds_dir = trace_root / dataset
+ if not ds_dir.exists():
+ raise FileNotFoundError(f"Dataset trace directory not found: {ds_dir}")
+
+ if model is None:
+ model_dir = _pick_single_subdir(ds_dir)
+ else:
+ model_dir = ds_dir / model
+ if not model_dir.exists():
+ raise FileNotFoundError(f"Model trace directory not found: {model_dir}")
+
+ if run_tag is None:
+ run_dir = _pick_single_subdir(model_dir)
+ else:
+ run_dir = model_dir / run_tag
+ if not run_dir.exists():
+ raise FileNotFoundError(f"Run directory not found: {run_dir}")
+
+ return TraceRun(dataset=dataset, model=model_dir.name, run_dir=run_dir)
+
+
+def _iter_manifest(run_dir: Path) -> Iterable[dict]:
+ manifest = run_dir / "manifest.jsonl"
+ if not manifest.exists():
+ raise FileNotFoundError(f"Missing manifest: {manifest}")
+ with manifest.open("r", encoding="utf-8") as f:
+ for line in f:
+ line = line.strip()
+ if line:
+ yield json.loads(line)
+
+
+def _as_span(arr: np.ndarray, *, name: str) -> Tuple[int, int]:
+ if arr is None:
+ raise ValueError(f"Missing {name} in trace npz.")
+ a = np.asarray(arr).reshape(-1)
+ if a.size != 2:
+ raise ValueError(f"Expected {name} to have 2 ints, got shape {a.shape}.")
+ return int(a[0]), int(a[1])
+
+
+def _segment_sums(
+ v: np.ndarray,
+ *,
+ prompt_len: int,
+ gen_len: int,
+ thinking_span_gen: Optional[Tuple[int, int]],
+ sink_span_gen: Optional[Tuple[int, int]],
+) -> Dict[str, float]:
+ total_len = int(prompt_len) + int(gen_len)
+ if int(v.shape[0]) != total_len:
+ raise ValueError(f"Vector length mismatch: len(v)={int(v.shape[0])} vs prompt_len+gen_len={total_len}.")
+
+ v = np.asarray(v, dtype=np.float64).reshape(-1)
+ prompt_len = int(prompt_len)
+ gen_len = int(gen_len)
+
+ # Default: no cot/output when spans missing (should not happen in exp3).
+ think_start, think_end = (0, -1) if thinking_span_gen is None else thinking_span_gen
+ sink_start, sink_end = (0, -1) if sink_span_gen is None else sink_span_gen
+
+ # Clamp spans into [0, gen_len-1].
+ def _clamp_span(a: int, b: int) -> Tuple[int, int]:
+ a = max(0, min(int(a), gen_len - 1))
+ b = max(0, min(int(b), gen_len - 1))
+ if b < a:
+ return 0, -1
+ return a, b
+
+ think_start, think_end = _clamp_span(think_start, think_end)
+ sink_start, sink_end = _clamp_span(sink_start, sink_end)
+
+ mask = np.zeros((total_len,), dtype=bool)
+ # input = all prompt tokens
+ input_slice = slice(0, prompt_len)
+ mask[input_slice] = True
+
+ cot_slice = slice(prompt_len + think_start, prompt_len + think_end + 1) if think_end >= think_start else slice(0, 0)
+ output_slice = slice(prompt_len + sink_start, prompt_len + sink_end + 1) if sink_end >= sink_start else slice(0, 0)
+ mask[cot_slice] = True
+ mask[output_slice] = True
+
+ input_sum = float(v[input_slice].sum())
+ cot_sum = float(v[cot_slice].sum()) if think_end >= think_start else 0.0
+ output_sum = float(v[output_slice].sum()) if sink_end >= sink_start else 0.0
+ other_sum = float(v[~mask].sum())
+ total_sum = float(v.sum())
+
+ return {
+ "total": total_sum,
+ "input": input_sum,
+ "cot": cot_sum,
+ "output": output_sum,
+ "other": other_sum,
+ }
+
+
+def _with_fracs(sums: Dict[str, float]) -> Dict[str, float]:
+ total = float(sums.get("total") or 0.0)
+ if total <= 0.0:
+ return {**sums, "input_frac": float("nan"), "cot_frac": float("nan"), "output_frac": float("nan"), "other_frac": float("nan")}
+ return {
+ **sums,
+ "input_frac": float(sums["input"]) / total,
+ "cot_frac": float(sums["cot"]) / total,
+ "output_frac": float(sums["output"]) / total,
+ "other_frac": float(sums["other"]) / total,
+ }
+
+
+def _analyze_npz(npz_path: Path) -> Dict[str, dict]:
+ d = np.load(npz_path)
+ prompt_len = int(np.asarray(d["prompt_len"]).item())
+ gen_len = int(np.asarray(d["gen_len"]).item())
+ thinking_span_gen = _as_span(d["thinking_span_gen"], name="thinking_span_gen") if "thinking_span_gen" in d.files else None
+ sink_span_gen = _as_span(d["sink_span_gen"], name="sink_span_gen") if "sink_span_gen" in d.files else None
+
+ out: Dict[str, dict] = {"prompt_len": prompt_len, "gen_len": gen_len}
+ for key in ("v_seq_all", "v_row_all", "v_rec_all"):
+ if key not in d.files:
+ raise ValueError(f"Missing {key} in trace npz: {npz_path}")
+ sums = _segment_sums(
+ d[key],
+ prompt_len=prompt_len,
+ gen_len=gen_len,
+ thinking_span_gen=thinking_span_gen,
+ sink_span_gen=sink_span_gen,
+ )
+ out[key] = _with_fracs(sums)
+ out["thinking_span_gen"] = list(thinking_span_gen) if thinking_span_gen is not None else None
+ out["sink_span_gen"] = list(sink_span_gen) if sink_span_gen is not None else None
+ return out
+
+
+def main() -> None:
+ parser = argparse.ArgumentParser("Summarize input/cot/output attribution mass from exp3 traces.")
+ parser.add_argument("--trace_root", type=str, default="exp/exp3/output/traces")
+ parser.add_argument("--dataset_tag", type=str, default="niah_mq_q2", help="Base tag; expands to _short_cot and _long_cot.")
+ parser.add_argument("--datasets", type=str, default=None, help="Comma-separated dataset names (overrides --dataset_tag expansion).")
+ parser.add_argument("--model", type=str, default=None, help="Model directory name under traces (default: auto if single).")
+ parser.add_argument("--run_tag", type=str, default=None, help="Run tag directory (default: auto pick newest/single).")
+ args = parser.parse_args()
+
+ trace_root = Path(args.trace_root)
+ if not trace_root.exists():
+ raise SystemExit(f"trace_root not found: {trace_root}")
+
+ if args.datasets:
+ datasets = [x.strip() for x in str(args.datasets).split(",") if x.strip()]
+ else:
+ datasets = [f"{args.dataset_tag}_short_cot", f"{args.dataset_tag}_long_cot"]
+
+ for ds in datasets:
+ run = _resolve_run(trace_root, dataset=ds, model=args.model, run_tag=args.run_tag)
+ records = list(_iter_manifest(run.run_dir))
+ if not records:
+ raise SystemExit(f"Empty manifest: {run.run_dir/'manifest.jsonl'}")
+ for rec in records:
+ npz_path = run.run_dir / str(rec["file"])
+ analysis = _analyze_npz(npz_path)
+ print(
+ json.dumps(
+ {
+ "dataset": run.dataset,
+ "model": run.model,
+ "run_dir": str(run.run_dir),
+ "example_idx": int(rec.get("example_idx", -1)),
+ **analysis,
+ },
+ ensure_ascii=False,
+ )
+ )
+
+
+if __name__ == "__main__":
+ main()
+
diff --git a/exp/exp3/run_exp.py b/exp/exp3/run_exp.py
new file mode 100644
index 0000000000000000000000000000000000000000..f3592dd9c3d7d397a8a57ce959081d21e7d3b63f
--- /dev/null
+++ b/exp/exp3/run_exp.py
@@ -0,0 +1,430 @@
+#!/usr/bin/env python3
+"""
+Experiment 3 runner: long-vs-short CoT case study (AttnLRP hop0, Recovery@10%).
+
+This runner is intentionally minimal:
+  - Only reads two cached samples produced by exp/exp3/sample_and_filter.py:
+      <dataset_tag>_short_cot.jsonl
+      <dataset_tag>_long_cot.jsonl
+ - Only runs attribution method: attnlrp (hop0 path, aligned with exp2).
+ - Only computes token-level recovery (Recall@10%) using RULER needle_spans.
+ - Always saves per-sample trace artifacts under exp/exp3/output/traces/.
+
+All outputs are written under exp/exp3/output/ (configurable via --output_root).
+"""
+
+from __future__ import annotations
+
+import argparse
+import hashlib
+import json
+import os
+import sys
+import time
+from itertools import islice
+from pathlib import Path
+from typing import Any, Dict, List, Optional, Tuple
+
+
+def _early_set_cuda_visible_devices() -> None:
+ parser = argparse.ArgumentParser(add_help=False)
+ parser.add_argument("--cuda", type=str, default=None)
+ args, _ = parser.parse_known_args(sys.argv[1:])
+ if args.cuda and "," in args.cuda:
+ os.environ["CUDA_VISIBLE_DEVICES"] = args.cuda
+
+
+_early_set_cuda_visible_devices()
+
+import numpy as np
+import torch
+from transformers import AutoModelForCausalLM, AutoTokenizer, utils
+
+REPO_ROOT = Path(__file__).resolve().parents[2]
+if str(REPO_ROOT) not in sys.path:
+ sys.path.insert(0, str(REPO_ROOT))
+
+import llm_attr
+import llm_attr_eval
+from exp.exp2 import dataset_utils as ds_utils
+
+utils.logging.set_verbosity_error()
+
+
+def _sha1_text(text: str) -> str:
+ return hashlib.sha1(text.encode("utf-8")).hexdigest()
+
+
+def _token_importance_vector(attr: torch.Tensor) -> np.ndarray:
+ w = torch.nan_to_num(attr.sum(0).to(dtype=torch.float32), nan=0.0).clamp(min=0.0)
+ return w.detach().cpu().numpy().astype(np.float32, copy=False)
+
+
+def _trace_run_tag(*, neg_handling: str, norm_mode: str, total: int) -> str:
+ return f"attnlrp_neg{neg_handling}_norm{norm_mode}_recovery_{int(total)}ex"
+
+
+def _build_sample_trace_payload(
+ example: ds_utils.CachedExample,
+ *,
+ seq_attr: torch.Tensor,
+ row_attr: torch.Tensor,
+ rec_attr: torch.Tensor,
+ prompt_len: int,
+ user_prompt_indices: Optional[List[int]],
+ gold_prompt_token_indices: Optional[List[int]],
+ recovery_scores: Optional[np.ndarray],
+ time_attr_s: Optional[float],
+ time_recovery_s: Optional[float],
+) -> Dict[str, np.ndarray]:
+ gen_len = int(seq_attr.shape[0])
+
+ v_seq_all = _token_importance_vector(seq_attr)
+ v_row_all = _token_importance_vector(row_attr)
+ v_rec_all = _token_importance_vector(rec_attr)
+
+ payload: Dict[str, np.ndarray] = {
+ "v_seq_all": v_seq_all,
+ "v_row_all": v_row_all,
+ "v_rec_all": v_rec_all,
+ "v_seq_prompt": v_seq_all[:prompt_len],
+ "v_row_prompt": v_row_all[:prompt_len],
+ "v_rec_prompt": v_rec_all[:prompt_len],
+ "prompt_len": np.asarray(int(prompt_len), dtype=np.int64),
+ "gen_len": np.asarray(int(gen_len), dtype=np.int64),
+ "indices_to_explain_gen": np.asarray(list(example.indices_to_explain or []), dtype=np.int64),
+ }
+
+ if example.sink_span is not None:
+ payload["sink_span_gen"] = np.asarray(list(example.sink_span), dtype=np.int64)
+ if example.thinking_span is not None:
+ payload["thinking_span_gen"] = np.asarray(list(example.thinking_span), dtype=np.int64)
+
+ if user_prompt_indices is not None:
+ payload["user_prompt_indices"] = np.asarray(list(user_prompt_indices), dtype=np.int64)
+ if gold_prompt_token_indices is not None:
+ payload["gold_prompt_token_indices"] = np.asarray(list(gold_prompt_token_indices), dtype=np.int64)
+
+ if recovery_scores is not None:
+ payload["recovery_scores"] = np.asarray(recovery_scores, dtype=np.float64)
+
+ if time_attr_s is not None:
+ payload["time_attr_s"] = np.asarray(float(time_attr_s), dtype=np.float64)
+ if time_recovery_s is not None:
+ payload["time_recovery_s"] = np.asarray(float(time_recovery_s), dtype=np.float64)
+
+ return payload
+
+
+def _write_sample_trace(
+ trace_dir: Path,
+ *,
+ example_idx: int,
+ prompt: str,
+ target: str,
+ payload: Dict[str, np.ndarray],
+ manifest_handle,
+ neg_handling: str,
+ norm_mode: str,
+ recovery_skipped_reason: Optional[str],
+) -> None:
+ trace_dir.mkdir(parents=True, exist_ok=True)
+ npz_name = f"ex_{example_idx:06d}.npz"
+ npz_path = trace_dir / npz_name
+ np.savez_compressed(npz_path, **payload)
+
+ prompt_len = int(np.asarray(payload.get("prompt_len", 0)).item())
+ gen_len = int(np.asarray(payload.get("gen_len", 0)).item())
+ record: Dict[str, Any] = {
+ "example_idx": int(example_idx),
+ "attr_func": "attnlrp",
+ "file": npz_name,
+ "prompt_sha1": _sha1_text(prompt),
+ "target_sha1": _sha1_text(target),
+ "prompt_len": prompt_len,
+ "gen_len": gen_len,
+ "indices_to_explain_gen": payload.get("indices_to_explain_gen").tolist()
+ if payload.get("indices_to_explain_gen") is not None
+ else None,
+ "sink_span_gen": payload.get("sink_span_gen").tolist() if payload.get("sink_span_gen") is not None else None,
+ "thinking_span_gen": payload.get("thinking_span_gen").tolist()
+ if payload.get("thinking_span_gen") is not None
+ else None,
+ "gold_prompt_token_indices": payload.get("gold_prompt_token_indices").tolist()
+ if payload.get("gold_prompt_token_indices") is not None
+ else None,
+ "recovery_scores": payload.get("recovery_scores").tolist() if payload.get("recovery_scores") is not None else None,
+ "recovery_skipped_reason": recovery_skipped_reason,
+ "time_attr_s": float(np.asarray(payload.get("time_attr_s")).item()) if payload.get("time_attr_s") is not None else None,
+ "time_recovery_s": float(np.asarray(payload.get("time_recovery_s")).item())
+ if payload.get("time_recovery_s") is not None
+ else None,
+ "attnlrp_neg_handling": str(neg_handling),
+ "attnlrp_norm_mode": str(norm_mode),
+ }
+ manifest_handle.write(json.dumps(record, ensure_ascii=False) + "\n")
+ manifest_handle.flush()
+
+
+def resolve_device(args) -> str:
+ if args.cuda is not None and "," in args.cuda:
+ os.environ["CUDA_VISIBLE_DEVICES"] = args.cuda
+ return "auto"
+ if args.cuda is not None and str(args.cuda).strip():
+ return f"cuda:{args.cuda}" if torch.cuda.is_available() else "cpu"
+ return f"cuda:{args.cuda_num}" if torch.cuda.is_available() else "cpu"
+
+
+def load_model(model_name: str, device: str):
+ model = AutoModelForCausalLM.from_pretrained(
+ model_name,
+ device_map="auto" if device == "auto" else {"": int(device.split(":")[1])} if device.startswith("cuda:") else None,
+ torch_dtype=torch.float16,
+ attn_implementation="eager",
+ )
+ tokenizer = AutoTokenizer.from_pretrained(model_name)
+ tokenizer.pad_token = tokenizer.eos_token
+ model.eval()
+ return model, tokenizer
+
+
+def _evaluate_one_dataset(
+ *,
+ dataset_name: str,
+ examples: List[ds_utils.CachedExample],
+ model,
+ tokenizer,
+ output_root: Path,
+ model_tag: str,
+ neg_handling: str,
+ norm_mode: str,
+ top_fraction: float,
+ num_examples: int,
+) -> Tuple[np.ndarray, np.ndarray, float, int, int]:
+ llm_evaluator = llm_attr_eval.LLMAttributionEvaluator(model, tokenizer)
+
+ results: List[np.ndarray] = []
+ durations: List[float] = []
+ skipped = 0
+
+ total = min(len(examples), int(num_examples))
+ iterator = islice(examples, total)
+
+ run_tag = _trace_run_tag(neg_handling=neg_handling, norm_mode=norm_mode, total=total)
+ trace_dir = output_root / "traces" / dataset_name / model_tag / run_tag
+ trace_dir.mkdir(parents=True, exist_ok=True)
+ manifest_handle = open(trace_dir / "manifest.jsonl", "w", encoding="utf-8")
+
+ try:
+ for example_idx, ex in enumerate(iterator):
+ time_recovery_s: Optional[float] = None
+ recovery_scores: Optional[np.ndarray] = None
+
+ needle_spans = (ex.metadata or {}).get("needle_spans")
+ if not isinstance(needle_spans, list) or not needle_spans:
+ raise SystemExit(
+ "exp3 recovery requires RULER samples with metadata.needle_spans; "
+ f"dataset={dataset_name} has missing/empty needle_spans."
+ )
+ if ex.target is None:
+ raise SystemExit(
+ "exp3 recovery requires cached targets (CoT+answer) so row/rec attribution is well-defined. "
+ f"dataset={dataset_name} has target=None; run exp/exp3/sample_and_filter.py first."
+ )
+ if not (isinstance(ex.indices_to_explain, list) and len(ex.indices_to_explain) == 2):
+ raise SystemExit(
+ "exp3 expects indices_to_explain=[start_tok,end_tok] in generation-token coordinates; "
+ f"dataset={dataset_name} has indices_to_explain={ex.indices_to_explain!r}; "
+ "run exp/exp3/sample_and_filter.py first."
+ )
+
+ gold_prompt = ds_utils.ruler_gold_prompt_token_indices(ex, tokenizer)
+ recovery_skip_reason: Optional[str] = None
+
+ sample_start = time.perf_counter()
+ llm_attributor = llm_attr.LLMLRPAttribution(model, tokenizer)
+ attr_result = llm_attributor.calculate_attnlrp_ft_hop0(
+ ex.prompt,
+ target=ex.target,
+ sink_span=tuple(ex.sink_span) if ex.sink_span else None,
+ thinking_span=tuple(ex.thinking_span) if ex.thinking_span else None,
+ neg_handling=str(neg_handling),
+ norm_mode=str(norm_mode),
+ )
+ seq_attr, row_attr, rec_attr = attr_result.get_all_token_attrs(list(ex.indices_to_explain))
+ time_attr_s = time.perf_counter() - sample_start
+ durations.append(float(time_attr_s))
+
+ prompt_len = int(seq_attr.shape[1] - seq_attr.shape[0])
+ if prompt_len <= 0:
+ recovery_skip_reason = "empty_prompt_len"
+ elif not gold_prompt:
+ recovery_skip_reason = "empty_gold_prompt"
+ else:
+ t2 = time.perf_counter()
+ recovery_scores = np.asarray(
+ [
+ llm_evaluator.evaluate_attr_recovery(
+ a,
+ prompt_len=prompt_len,
+ gold_prompt_token_indices=gold_prompt,
+ top_fraction=top_fraction,
+ )
+ for a in (seq_attr, row_attr, rec_attr)
+ ],
+ dtype=np.float64,
+ )
+ time_recovery_s = time.perf_counter() - t2
+ if np.isnan(recovery_scores).any():
+ recovery_scores = None
+ recovery_skip_reason = "nan_recovery"
+
+ if recovery_scores is None and recovery_skip_reason is not None:
+ skipped += 1
+ elif recovery_scores is not None:
+ results.append(recovery_scores)
+
+ payload = _build_sample_trace_payload(
+ ex,
+ seq_attr=seq_attr,
+ row_attr=row_attr,
+ rec_attr=rec_attr,
+ prompt_len=prompt_len,
+ user_prompt_indices=getattr(llm_attributor, "user_prompt_indices", None),
+ gold_prompt_token_indices=gold_prompt,
+ recovery_scores=recovery_scores,
+ time_attr_s=time_attr_s,
+ time_recovery_s=time_recovery_s,
+ )
+ _write_sample_trace(
+ trace_dir,
+ example_idx=example_idx,
+ prompt=ex.prompt,
+ target=str(ex.target),
+ payload=payload,
+ manifest_handle=manifest_handle,
+ neg_handling=str(neg_handling),
+ norm_mode=str(norm_mode),
+ recovery_skipped_reason=recovery_skip_reason,
+ )
+ finally:
+ try:
+ manifest_handle.close()
+ except Exception:
+ pass
+
+ scores = np.stack(results, axis=0) if results else np.zeros((0, 3), dtype=np.float64)
+ used = int(scores.shape[0])
+ mean = scores.mean(0) if used else np.full((3,), np.nan, dtype=np.float64)
+ std = scores.std(0) if used else np.full((3,), np.nan, dtype=np.float64)
+ avg_time = float(np.mean(durations)) if durations else 0.0
+ return mean, std, avg_time, used, int(skipped)
+
+
+def main() -> None:
+ parser = argparse.ArgumentParser("Experiment 3 runner (attnlrp hop0, recovery only).")
+ parser.add_argument("--dataset_tag", type=str, default="niah_mq_q2", help="Base tag for exp3 caches.")
+ parser.add_argument("--data_root", type=str, default="exp/exp3/data")
+ parser.add_argument("--output_root", type=str, default="exp/exp3/output")
+ parser.add_argument("--num_examples", type=int, default=1, help="How many examples to evaluate per dataset (default 1).")
+ parser.add_argument("--seed", type=int, default=42)
+ parser.add_argument("--model", type=str, default=None, help="HF repo id (required unless --model_path set).")
+ parser.add_argument("--model_path", type=str, default=None, help="Local path; overrides --model for loading.")
+ parser.add_argument("--cuda_num", type=int, default=0)
+ parser.add_argument("--cuda", type=str, default=None)
+ parser.add_argument("--top_fraction", type=float, default=0.1, help="Top fraction of prompt tokens used for recovery.")
+ parser.add_argument(
+ "--attnlrp_neg_handling",
+ type=str,
+ choices=["drop", "abs"],
+ default="drop",
+ help="AttnLRP hop0: how to handle negative values (drop=clamp>=0, abs=absolute value).",
+ )
+ parser.add_argument(
+ "--attnlrp_norm_mode",
+ type=str,
+ choices=["norm", "no_norm"],
+ default="norm",
+ help="AttnLRP hop0: norm enables internal normalization; no_norm disables it.",
+ )
+ args = parser.parse_args()
+
+ if args.model_path:
+ model_name = args.model_path
+ elif args.model:
+ model_name = args.model
+ else:
+ raise SystemExit("Please set --model or --model_path.")
+ model_tag = args.model if args.model else Path(args.model_path).name
+
+ device = resolve_device(args)
+ model, tokenizer = load_model(model_name, device)
+
+ data_root = Path(args.data_root)
+ output_root = Path(args.output_root)
+ output_root.mkdir(parents=True, exist_ok=True)
+
+ short_name = f"{args.dataset_tag}_short_cot"
+ long_name = f"{args.dataset_tag}_long_cot"
+ dataset_names = [short_name, long_name]
+
+ summary_rows: List[Dict[str, Any]] = []
+
+ for ds_name in dataset_names:
+ cache_path = data_root / f"{ds_name}.jsonl"
+ if not cache_path.exists():
+ raise SystemExit(f"Missing exp3 cache: {cache_path}. Run exp/exp3/sample_and_filter.py first.")
+ examples = ds_utils.load_cached(cache_path, sample=None, seed=args.seed)
+
+ mean, std, avg_time, used, skipped = _evaluate_one_dataset(
+ dataset_name=ds_name,
+ examples=examples,
+ model=model,
+ tokenizer=tokenizer,
+ output_root=output_root,
+ model_tag=model_tag,
+ neg_handling=args.attnlrp_neg_handling,
+ norm_mode=args.attnlrp_norm_mode,
+ top_fraction=float(args.top_fraction),
+ num_examples=int(args.num_examples),
+ )
+
+ out_dir = output_root / "recovery" / ds_name / model_tag
+ out_dir.mkdir(parents=True, exist_ok=True)
+ filename = f"attnlrp_{int(args.num_examples)}_examples.csv"
+ with (out_dir / filename).open("w", encoding="utf-8") as f:
+ f.write("Method,Recovery@10%\n")
+ f.write(f"Seq Attr Recovery Mean,{mean[0]}\n")
+ f.write(f"Row Attr Recovery Mean,{mean[1]}\n")
+ f.write(f"Recursive Attr Recovery Mean,{mean[2]}\n")
+ f.write(f"Seq Attr Recovery Std,{std[0]}\n")
+ f.write(f"Row Attr Recovery Std,{std[1]}\n")
+ f.write(f"Recursive Attr Recovery Std,{std[2]}\n")
+ f.write(f"Examples Used,{used}\n")
+ f.write(f"Examples Skipped,{skipped}\n")
+ f.write(f"Avg Sample Time (s),{avg_time}\n")
+
+ print(f"[{ds_name}] attnlrp -> {out_dir/filename} (used={used} skipped={skipped} avg {avg_time:.2f}s)")
+ summary_rows.append(
+ {
+ "dataset": ds_name,
+ "model": model_tag,
+ "neg_handling": args.attnlrp_neg_handling,
+ "norm_mode": args.attnlrp_norm_mode,
+ "seq_recovery@10%": float(mean[0]) if used else float("nan"),
+ "row_recovery@10%": float(mean[1]) if used else float("nan"),
+ "rec_recovery@10%": float(mean[2]) if used else float("nan"),
+ "used": int(used),
+ "skipped": int(skipped),
+ }
+ )
+
+ # Lightweight combined summary for quick comparison.
+ summary_path = output_root / "recovery" / f"summary_{model_tag}.json"
+ summary_path.parent.mkdir(parents=True, exist_ok=True)
+ summary_path.write_text(json.dumps(summary_rows, ensure_ascii=False, indent=2) + "\n", encoding="utf-8")
+ print(f"Wrote summary -> {summary_path}")
+
+
+if __name__ == "__main__":
+ main()
diff --git a/exp/exp3/sample_and_filter.py b/exp/exp3/sample_and_filter.py
new file mode 100644
index 0000000000000000000000000000000000000000..216024b39ed952a624ca015d2bbe7025dcfa3180
--- /dev/null
+++ b/exp/exp3/sample_and_filter.py
@@ -0,0 +1,628 @@
+#!/usr/bin/env python3
+"""
+Experiment 3 sampler: long-vs-short CoT case study (RULER niah_mq_q2, 1024).
+
+This script searches the raw RULER JSONL for a *single* prompt where BOTH:
+ - a short-CoT generation and a long-CoT generation
+  - follow the strict format: "<thinking> + final \\box{...} answer" with
+ nothing after the box
+ - pass a judge model verifying the boxed answer matches the reference answer
+ - satisfy length constraints (short <= max_short_thinking_tokens,
+ long >= min_long_thinking_tokens)
+
+It writes two exp2-compatible cache JSONLs to exp/exp3/data/:
+  - <dataset_tag>_short_cot.jsonl
+  - <dataset_tag>_long_cot.jsonl
+
+Each JSONL line matches exp/exp2/dataset_utils.CachedExample schema and keeps
+RULER metadata. The output caches are intended to be consumed by exp/exp3/run_exp.py.
+"""
+
+from __future__ import annotations
+
+import argparse
+import hashlib
+import json
+import os
+import sys
+import time
+import urllib.error
+import urllib.request
+from dataclasses import dataclass
+from pathlib import Path
+from typing import Any, Dict, Iterable, List, Optional
+
+from tqdm import tqdm
+from transformers import AutoTokenizer
+
+REPO_ROOT = Path(__file__).resolve().parents[2]
+if str(REPO_ROOT) not in sys.path:
+ sys.path.insert(0, str(REPO_ROOT))
+
+from exp.exp2 import dataset_utils as ds_utils
+from exp.exp2.dataset_utils import CachedExample, attach_spans_from_answer, split_boxed_generation
+
+
+class RateLimitError(RuntimeError):
+ """Raised when API returns 429; carries a suggested wait time."""
+
+ def __init__(self, wait_seconds: float, detail: str) -> None:
+ super().__init__(detail)
+ self.wait_seconds = wait_seconds
+
+
+SHORT_COT_SYSTEM_PROMPT = (
+ "You are a reasoning assistant. "
+ "Before answering, engage in a brief chain of thought. "
+ "Process this freely and naturally without using specific headers or strict formatting. "
+ "When you reach the conclusion, wrap the entire final sentence containing the answer inside \\box{}. "
+ "Ensure the box wraps the **sentence** that naturally delivers the answer. "
+ "Do not add anything after the box."
+)
+
+LONG_COT_SYSTEM_PROMPT = (
+ "You are a careful reasoning assistant. "
+ "Before answering, engage in an extremely detailed and exhaustive chain of thought. "
+ "Do not skip any logical steps, even if they seem obvious. "
+ "Process this freely and naturally without using specific headers or strict formatting. "
+ "When you reach the conclusion, wrap the entire final sentence containing the answer inside \\box{}. "
+ "Ensure the box wraps the **sentence** that naturally delivers the answer. "
+ "Do not add anything after the box."
+)
+
+JUDGE_SYSTEM_PROMPT = (
+ "You verify whether the model's boxed answer matches the reference answer. "
+ "Reply strictly with True or False and nothing else."
+)
+
+
+def _sha1_text(text: str) -> str:
+ return hashlib.sha1(text.encode("utf-8")).hexdigest()
+
+
+def call_chat_api(
+ api_base: str,
+ api_key: str,
+ model: str,
+ messages: List[Dict[str, str]],
+ *,
+ timeout: int,
+ max_tokens: int,
+ temperature: float,
+ cache_ttl: int,
+ cache_namespace: Optional[str],
+ rate_limit_delay: Optional[float] = None,
+) -> str:
+ url = api_base.rstrip("/") + "/chat/completions"
+ payload: Dict[str, Any] = {
+ "model": model,
+ "messages": messages,
+ "max_tokens": max_tokens,
+ "temperature": temperature,
+ }
+ if cache_ttl > 0:
+ cache_obj: Dict[str, Any] = {"ttl": cache_ttl}
+ if cache_namespace:
+ cache_obj["namespace"] = cache_namespace
+ payload["cache"] = cache_obj
+
+ data = json.dumps(payload).encode("utf-8")
+ headers = {"Content-Type": "application/json"}
+ if api_key:
+ headers["Authorization"] = f"Bearer {api_key}"
+
+ req = urllib.request.Request(url, data=data, headers=headers, method="POST")
+ opener = urllib.request.build_opener(urllib.request.ProxyHandler({}))
+ try:
+ with opener.open(req, timeout=timeout) as resp:
+ resp_bytes = resp.read()
+ except urllib.error.HTTPError as e:
+ detail = e.read().decode("utf-8", errors="ignore") if hasattr(e, "read") else ""
+ if e.code == 429:
+ retry_after = None
+ if hasattr(e, "headers") and e.headers:
+ retry_after_header = e.headers.get("Retry-After")
+ if retry_after_header:
+ try:
+ retry_after = float(retry_after_header)
+ except ValueError:
+ retry_after = None
+ wait = retry_after or rate_limit_delay or 5.0
+ raise RateLimitError(wait, f"API HTTP 429: {detail}") from e
+ raise RuntimeError(f"API HTTP error {e.code}: {detail}") from e
+ except urllib.error.URLError as e:
+ raise RuntimeError(f"API request failed: {e}") from e
+
+ try:
+ response = json.loads(resp_bytes.decode("utf-8"))
+ except json.JSONDecodeError as e:
+ raise RuntimeError(f"Failed to decode API response: {resp_bytes!r}") from e
+
+ choices = response.get("choices", [])
+ if not choices:
+ raise RuntimeError(f"Empty choices from API: {response}")
+ content = choices[0].get("message", {}).get("content", "")
+ if not content:
+ raise RuntimeError(f"Empty content from API: {response}")
+ return content.strip()
+
+
+def _call_with_retries(
+ *,
+ api_base: str,
+ api_key: str,
+ model: str,
+ messages: List[Dict[str, str]],
+ timeout: int,
+ max_tokens: int,
+ temperature: float,
+ cache_ttl: int,
+ cache_namespace: Optional[str],
+ rate_limit_delay: float,
+ retries: int,
+ retry_delay: float,
+) -> str:
+ for attempt in range(retries + 1):
+ try:
+ return call_chat_api(
+ api_base,
+ api_key,
+ model,
+ messages,
+ timeout=timeout,
+ max_tokens=max_tokens,
+ temperature=temperature,
+ cache_ttl=cache_ttl,
+ cache_namespace=cache_namespace,
+ rate_limit_delay=rate_limit_delay,
+ )
+ except RateLimitError as e:
+ if attempt >= retries:
+ raise
+ time.sleep(e.wait_seconds)
+ except Exception: # noqa: BLE001
+ if attempt >= retries:
+ raise
+ time.sleep(retry_delay)
+ raise RuntimeError("Unreachable")
+
+
+def build_gen_messages(prompt: str, system_prompt: str) -> List[Dict[str, str]]:
+ return [
+ {"role": "system", "content": system_prompt},
+ {"role": "user", "content": prompt},
+ ]
+
+
+def build_judge_messages(reference_answer: str, candidate_answer: str) -> List[Dict[str, str]]:
+ user = (
+ "Decide if the model's boxed answer matches the reference answer.\n"
+ f"Reference answer: {reference_answer}\n"
+ f"Model boxed answer (only the content inside \\box{{}}): {candidate_answer}\n"
+ "Output only True if they are semantically consistent; otherwise output False."
+ )
+ return [
+ {"role": "system", "content": JUDGE_SYSTEM_PROMPT},
+ {"role": "user", "content": user},
+ ]
+
+
+def parse_bool(text: str) -> bool:
+    lines = text.strip().splitlines()
+    if not lines:
+        raise ValueError(f"Cannot parse boolean from: {text!r}")
+    first = lines[0].strip().lower()
+    if first in {"true", "yes"}:
+        return True
+ return True
+ if first in {"false", "no"}:
+ return False
+ if "true" in first and "false" not in first:
+ return True
+ if "false" in first:
+ return False
+ raise ValueError(f"Cannot parse boolean from: {text!r}")
+
+
+def write_cache(out_path: Path, examples: Iterable[CachedExample]) -> int:
+ out_path.parent.mkdir(parents=True, exist_ok=True)
+ count = 0
+ with out_path.open("w", encoding="utf-8") as f:
+ for ex in examples:
+ obj: Dict[str, Any] = {
+ "prompt": ex.prompt,
+ "target": ex.target,
+ "indices_to_explain": ex.indices_to_explain,
+ "attr_mask_indices": ex.attr_mask_indices,
+ "sink_span": ex.sink_span,
+ "thinking_span": ex.thinking_span,
+ "metadata": ex.metadata,
+ }
+ f.write(json.dumps(obj, ensure_ascii=False) + "\n")
+ count += 1
+ return count
+
+
+@dataclass(frozen=True)
+class AcceptedGeneration:
+ thinking_text: str
+ boxed_answer: str
+ target_text: str
+ thinking_tokens: int
+ generation_text: str
+ judge_response: str
+
+
+def _infer_reference_answer(example: CachedExample) -> str:
+ meta = example.metadata or {}
+ ref = str(meta.get("reference_answer") or "").strip()
+ if ref:
+ return ref
+ outputs = meta.get("outputs") or []
+ if isinstance(outputs, list) and outputs:
+ return ", ".join(str(x) for x in outputs)
+ tgt = str(example.target or "").strip()
+ return tgt
+
+
+def _infer_dataset_tag(dataset_path: Path) -> str:
+ if dataset_path.name.endswith(".jsonl") and dataset_path.name != "validation.jsonl":
+ return dataset_path.stem
+ if dataset_path.name == "validation.jsonl":
+ return dataset_path.parent.name
+ return dataset_path.stem
+
+
+def _count_tokens(tokenizer, text: str) -> int:
+ return int(len(tokenizer(text, add_special_tokens=False).input_ids))
+
+
+def _generate_one_style(
+ *,
+ prompt: str,
+ reference_answer: str,
+ tokenizer,
+ style: str,
+ system_prompt: str,
+ api_base: str,
+ api_key: str,
+ generator_model: str,
+ judge_model: str,
+ timeout: int,
+ max_tokens: int,
+ temperature: float,
+ cache_ttl: int,
+ cache_namespace: Optional[str],
+ rate_limit_delay: float,
+ retries: int,
+ retry_delay: float,
+ request_interval: float,
+ judge_interval: float,
+ min_long_thinking_tokens: int,
+ max_short_thinking_tokens: int,
+) -> Optional[AcceptedGeneration]:
+ gen_messages = build_gen_messages(prompt, system_prompt)
+ generation = _call_with_retries(
+ api_base=api_base,
+ api_key=api_key,
+ model=generator_model,
+ messages=gen_messages,
+ timeout=timeout,
+ max_tokens=max_tokens,
+ temperature=temperature,
+ cache_ttl=cache_ttl,
+ cache_namespace=cache_namespace,
+ rate_limit_delay=rate_limit_delay,
+ retries=retries,
+ retry_delay=retry_delay,
+ )
+ if request_interval > 0:
+ time.sleep(request_interval)
+
+ parsed = split_boxed_generation(generation)
+ if not parsed:
+ return None
+ thinking_text, _boxed_segment, boxed_answer = parsed
+ thinking_tokens = _count_tokens(tokenizer, thinking_text)
+
+ if style == "short":
+ if max_short_thinking_tokens > 0 and thinking_tokens > max_short_thinking_tokens:
+ return None
+ elif style == "long":
+ if min_long_thinking_tokens > 0 and thinking_tokens < min_long_thinking_tokens:
+ return None
+ else:
+ raise ValueError(f"Unsupported style: {style}")
+
+ judge_messages = build_judge_messages(reference_answer, boxed_answer)
+ judge_resp = _call_with_retries(
+ api_base=api_base,
+ api_key=api_key,
+ model=judge_model,
+ messages=judge_messages,
+ timeout=timeout,
+ max_tokens=64,
+ temperature=0.0,
+ cache_ttl=cache_ttl,
+ cache_namespace=cache_namespace,
+ rate_limit_delay=rate_limit_delay,
+ retries=retries,
+ retry_delay=retry_delay,
+ )
+ if judge_interval > 0:
+ time.sleep(judge_interval)
+ ok = parse_bool(judge_resp)
+ if not ok:
+ return None
+
+ target_text = f"{thinking_text}\n{boxed_answer}" if thinking_text else boxed_answer
+ return AcceptedGeneration(
+ thinking_text=thinking_text,
+ boxed_answer=boxed_answer,
+ target_text=target_text,
+ thinking_tokens=thinking_tokens,
+ generation_text=generation,
+ judge_response=judge_resp,
+ )
+
+
+def main() -> None:
+ parser = argparse.ArgumentParser("Sample short-CoT and long-CoT cases for exp3 (independently).")
+ parser.add_argument(
+ "--dataset_path",
+ type=str,
+ default="data/ruler_multihop/1024/niah_mq_q2/validation.jsonl",
+ help="Raw RULER JSONL path (default: niah_mq_q2 1024 validation).",
+ )
+ parser.add_argument("--dataset_tag", type=str, default=None, help="Output tag; default inferred from dataset_path.")
+ parser.add_argument(
+ "--max_pairs",
+ type=int,
+ default=1,
+ help="Deprecated alias for --max_short and --max_long (kept for convenience).",
+ )
+ parser.add_argument("--max_short", type=int, default=None, help="How many short-CoT samples to keep (default: --max_pairs).")
+ parser.add_argument("--max_long", type=int, default=None, help="How many long-CoT samples to keep (default: --max_pairs).")
+ parser.add_argument("--max_raw_examples", type=int, default=None, help="Optional cap on raw examples to try.")
+ parser.add_argument("--seed", type=int, default=42)
+ parser.add_argument("--api_base", type=str, default="http://localhost:4000/v1", help="Chat API base URL.")
+ parser.add_argument("--api_key", type=str, default=None, help="API key; defaults to FLASHTRACE_API_KEY/OPENAI_API_KEY.")
+ parser.add_argument("--generator_model", type=str, default="qwen3-235b-a22b-2507")
+ parser.add_argument("--judge_model", type=str, default="deepseek-v3-1-terminus")
+ parser.add_argument("--api_timeout", type=int, default=300)
+ parser.add_argument("--api_temperature", type=float, default=0.0)
+ parser.add_argument("--api_cache_ttl", type=int, default=600)
+ parser.add_argument("--api_cache_namespace", type=str, default="flashtrace-exp3")
+ parser.add_argument("--retry_delay", type=float, default=2.0)
+ parser.add_argument("--retries", type=int, default=2, help="Additional retries on API failure.")
+ parser.add_argument("--request_interval", type=float, default=1.0, help="Sleep seconds between generation calls.")
+ parser.add_argument("--judge_interval", type=float, default=1.0, help="Sleep seconds between judge calls.")
+ parser.add_argument("--rate_limit_delay", type=float, default=5.0, help="Seconds to wait on HTTP 429 before retrying.")
+ parser.add_argument(
+ "--api_max_tokens_short",
+ type=int,
+ default=2048,
+ help="Max tokens for the short-CoT generation call.",
+ )
+ parser.add_argument(
+ "--api_max_tokens_long",
+ type=int,
+ default=8192,
+ help="Max tokens for the long-CoT generation call.",
+ )
+ parser.add_argument(
+ "--min_long_thinking_tokens",
+ type=int,
+ default=512,
+ help="Minimum tokenizer tokens required in the long-CoT thinking segment.",
+ )
+ parser.add_argument(
+ "--max_short_thinking_tokens",
+ type=int,
+ default=256,
+ help="Maximum tokenizer tokens allowed in the short-CoT thinking segment.",
+ )
+ parser.add_argument(
+ "--tokenizer_model",
+ type=str,
+ default=None,
+ help="Tokenizer path for span extraction & length constraints (default: generator_model).",
+ )
+ parser.add_argument("--data_root", type=str, default="exp/exp3/data", help="Output directory for exp3 caches.")
+ parser.add_argument("--out_short", type=str, default=None, help="Optional explicit output path (short JSONL).")
+ parser.add_argument("--out_long", type=str, default=None, help="Optional explicit output path (long JSONL).")
+ args = parser.parse_args()
+
+ api_key = args.api_key or os.environ.get("FLASHTRACE_API_KEY") or os.environ.get("OPENAI_API_KEY")
+ if not api_key:
+ raise SystemExit("Set --api_key or FLASHTRACE_API_KEY/OPENAI_API_KEY for API access.")
+
+ dataset_path = Path(args.dataset_path)
+ if not dataset_path.exists():
+ raise SystemExit(f"Dataset file not found: {dataset_path}")
+ dataset_tag = str(args.dataset_tag or _infer_dataset_tag(dataset_path))
+
+ tok_name = args.tokenizer_model or args.generator_model
+ tok_path = Path(tok_name)
+ if tok_path.exists():
+ tokenizer = AutoTokenizer.from_pretrained(tok_path.as_posix(), local_files_only=True)
+ else:
+ tokenizer = AutoTokenizer.from_pretrained(tok_name)
+ tokenizer.pad_token = tokenizer.eos_token
+
+ raw_examples = ds_utils.load_ruler(dataset_path, sample=None, seed=args.seed)
+ if not raw_examples:
+ raise SystemExit("No examples loaded from the RULER JSONL.")
+
+ max_short = int(args.max_short) if args.max_short is not None else int(args.max_pairs)
+ max_long = int(args.max_long) if args.max_long is not None else int(args.max_pairs)
+ if max_short < 0 or max_long < 0:
+ raise SystemExit("--max_short/--max_long must be >= 0.")
+
+ kept_short: List[CachedExample] = []
+ kept_long: List[CachedExample] = []
+
+ total = len(raw_examples)
+ attempted = 0
+
+ for idx, ex in enumerate(tqdm(raw_examples, total=total, desc="Scanning raw RULER"), 1):
+ attempted = idx
+ if args.max_raw_examples is not None and idx > int(args.max_raw_examples):
+ break
+ if len(kept_short) >= max_short and len(kept_long) >= max_long:
+ break
+
+ reference_answer = _infer_reference_answer(ex)
+ prompt = ex.prompt
+
+ sample_id = _sha1_text(prompt)
+ base_meta = dict(ex.metadata or {})
+ base_meta["reference_answer"] = reference_answer
+ base_meta["sample_id"] = sample_id
+ base_meta["pair_id"] = sample_id # backward-compatible name (may not be paired)
+ base_meta["source_dataset_path"] = str(dataset_path)
+ base_meta["prompt_sha1"] = sample_id
+
+ if len(kept_short) < max_short:
+ short_gen = _generate_one_style(
+ prompt=prompt,
+ reference_answer=reference_answer,
+ tokenizer=tokenizer,
+ style="short",
+ system_prompt=SHORT_COT_SYSTEM_PROMPT,
+ api_base=args.api_base,
+ api_key=api_key,
+ generator_model=args.generator_model,
+ judge_model=args.judge_model,
+ timeout=args.api_timeout,
+ max_tokens=args.api_max_tokens_short,
+ temperature=args.api_temperature,
+ cache_ttl=args.api_cache_ttl,
+ cache_namespace=args.api_cache_namespace,
+ rate_limit_delay=args.rate_limit_delay,
+ retries=args.retries,
+ retry_delay=args.retry_delay,
+ request_interval=args.request_interval,
+ judge_interval=args.judge_interval,
+ min_long_thinking_tokens=args.min_long_thinking_tokens,
+ max_short_thinking_tokens=args.max_short_thinking_tokens,
+ )
+ if short_gen is not None:
+ short_meta = dict(base_meta)
+ short_meta.update(
+ {
+ "cot_style": "short",
+ "generator_model": args.generator_model,
+ "judge_model": args.judge_model,
+ "judge_response": short_gen.judge_response,
+ "boxed_answer": short_gen.boxed_answer,
+ "thinking_tokens": int(short_gen.thinking_tokens),
+ }
+ )
+ short_ex = CachedExample(
+ prompt=prompt,
+ target=short_gen.target_text,
+ indices_to_explain=None,
+ attr_mask_indices=ex.attr_mask_indices,
+ sink_span=None,
+ thinking_span=None,
+ metadata=short_meta,
+ )
+ short_ex = attach_spans_from_answer(short_ex, tokenizer, short_gen.boxed_answer)
+ if isinstance(short_ex.sink_span, list) and len(short_ex.sink_span) == 2:
+ short_ex = CachedExample(
+ prompt=short_ex.prompt,
+ target=short_ex.target,
+ indices_to_explain=short_ex.sink_span,
+ attr_mask_indices=short_ex.attr_mask_indices,
+ sink_span=short_ex.sink_span,
+ thinking_span=short_ex.thinking_span,
+ metadata=short_ex.metadata,
+ )
+ kept_short.append(short_ex)
+ print(
+ f"[kept short] raw_idx={idx}/{total} thinking_tokens={short_gen.thinking_tokens} "
+ f"sample_id={sample_id[:8]} kept={len(kept_short)}/{max_short}"
+ )
+
+ if len(kept_long) < max_long:
+ long_gen = _generate_one_style(
+ prompt=prompt,
+ reference_answer=reference_answer,
+ tokenizer=tokenizer,
+ style="long",
+ system_prompt=LONG_COT_SYSTEM_PROMPT,
+ api_base=args.api_base,
+ api_key=api_key,
+ generator_model=args.generator_model,
+ judge_model=args.judge_model,
+ timeout=args.api_timeout,
+ max_tokens=args.api_max_tokens_long,
+ temperature=args.api_temperature,
+ cache_ttl=args.api_cache_ttl,
+ cache_namespace=args.api_cache_namespace,
+ rate_limit_delay=args.rate_limit_delay,
+ retries=args.retries,
+ retry_delay=args.retry_delay,
+ request_interval=args.request_interval,
+ judge_interval=args.judge_interval,
+ min_long_thinking_tokens=args.min_long_thinking_tokens,
+ max_short_thinking_tokens=args.max_short_thinking_tokens,
+ )
+ if long_gen is not None:
+ long_meta = dict(base_meta)
+ long_meta.update(
+ {
+ "cot_style": "long",
+ "generator_model": args.generator_model,
+ "judge_model": args.judge_model,
+ "judge_response": long_gen.judge_response,
+ "boxed_answer": long_gen.boxed_answer,
+ "thinking_tokens": int(long_gen.thinking_tokens),
+ }
+ )
+ long_ex = CachedExample(
+ prompt=prompt,
+ target=long_gen.target_text,
+ indices_to_explain=None,
+ attr_mask_indices=ex.attr_mask_indices,
+ sink_span=None,
+ thinking_span=None,
+ metadata=long_meta,
+ )
+ long_ex = attach_spans_from_answer(long_ex, tokenizer, long_gen.boxed_answer)
+ if isinstance(long_ex.sink_span, list) and len(long_ex.sink_span) == 2:
+ long_ex = CachedExample(
+ prompt=long_ex.prompt,
+ target=long_ex.target,
+ indices_to_explain=long_ex.sink_span,
+ attr_mask_indices=long_ex.attr_mask_indices,
+ sink_span=long_ex.sink_span,
+ thinking_span=long_ex.thinking_span,
+ metadata=long_ex.metadata,
+ )
+ kept_long.append(long_ex)
+ print(
+ f"[kept long] raw_idx={idx}/{total} thinking_tokens={long_gen.thinking_tokens} "
+ f"sample_id={sample_id[:8]} kept={len(kept_long)}/{max_long}"
+ )
+
+ data_root = Path(args.data_root)
+ out_short = Path(args.out_short) if args.out_short else data_root / f"{dataset_tag}_short_cot.jsonl"
+ out_long = Path(args.out_long) if args.out_long else data_root / f"{dataset_tag}_long_cot.jsonl"
+
+ n_short = write_cache(out_short, kept_short)
+ n_long = write_cache(out_long, kept_long)
+ print(
+ f"Wrote short={n_short} -> {out_short}\n"
+ f"Wrote long ={n_long} -> {out_long}\n"
+ f"Attempted {attempted} / {total}"
+ )
+
+ missing: List[str] = []
+ if len(kept_short) < max_short:
+ missing.append(f"short({len(kept_short)}/{max_short})")
+ if len(kept_long) < max_long:
+ missing.append(f"long({len(kept_long)}/{max_long})")
+ if missing:
+ raise SystemExit(f"Could not find enough samples: {', '.join(missing)} (attempted {attempted} / {total}).")
+
+
+if __name__ == "__main__":
+ main()
diff --git a/exp/exp4/README.md b/exp/exp4/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..a1ff2959206e77a16dc3aff00c53fbe3fa5c9594
--- /dev/null
+++ b/exp/exp4/README.md
@@ -0,0 +1,85 @@
+# FlashTrace Experiment 4 (Aider attribution faithfulness / row-only)
+
+This directory provides token-level attribution faithfulness evaluation tooling for the Aider dataset (**only the row-attribution RISE/MAS results are reported**; per-sample traces are not saved).
+
+Evaluation scope (fixed):
+- Dataset: `exp/exp4/data/aider.jsonl`
+- Methods:
+  - `ifr_all_positions`
+  - `ifr_multi_hop_both` (FlashTrace)
+- Metrics: `RISE`, `MAS` (row attribution only)
+
+Main file:
+- `run_exp.py`: attribution + faithfulness evaluation; writes results to `exp/exp4/output/`
+
+---
+
+## Data format
+
+`exp/exp4/data/aider.jsonl` contains one JSON object per line, each describing one sample:
+- `input`: the prompt (used verbatim as the user-prompt content)
+- `output`: the target (used verbatim as the model generation; the script appends EOS internally)
+- `length`: a field carried by the data; the script does not depend on it and only passes it through to metadata
+
+Note: an Aider `output` has the shape:
+1) first line: `xxx.py`
+2) second line: the opening fence ```
+3) middle: the code body
+4) last line: the closing fence ```
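Parsing that shape can be sketched with a small helper (`last_code_line` below is hypothetical, not part of the repo; it mirrors the line-scanning idea used for sink selection later in this README):

```python
from typing import Optional

def last_code_line(output: str) -> Optional[str]:
    """Return the last non-empty line that is neither a fence nor the filename header."""
    lines = output.splitlines()
    for i in range(len(lines) - 1, -1, -1):
        stripped = lines[i].strip()
        if not stripped or stripped.startswith("```"):
            continue  # skip blanks and the opening/closing fences
        if i == 0 and stripped.endswith(".py"):
            return None  # only the filename header is left
        return stripped
    return None

sample = "xxx.py\n```\nprint('hi')\n```\n"
print(last_code_line(sample))  # -> print('hi')
```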
+
+---
+
+## Attribution and sink selection
+
+For every sample the script uses `input` as `prompt` and `output` as `target` (no regeneration), and selects different sinks on the attribution result (`indices_to_explain=[start_tok,end_tok]`, a token span over `tokenizer(target, add_special_tokens=False)`, excluding EOS).
+
+### `ifr_all_positions` (two sinks reported)
+
+- `last_line`: the last **non-empty, non-fence** line before the closing fence in `output`, with that line's character span mapped to a token span; falls back to `full_output` if it cannot be resolved.
+- `last_token`: the last token of `last_line` (a single-point span `[end,end]`).
+
+Note: the script computes the `ifr_all_positions` attribution matrix only once per sample, then takes the row attribution under each of the two sinks and computes faithfulness separately.
+
+### `ifr_multi_hop_both` (FlashTrace; one sink reported)
+
+- `full_output`: the complete `output` as the sink (token span `[0, n_tok-1]`).
+- The faithfulness perturbation follows the exp2 protocol: stop tokens on the prompt side are skipped (as determined by the stop-token configuration in `ft_ifr_improve.py`).
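The char-span-to-token-span mapping behind the `last_line` sink relies on the tokenizer's offset mapping; the overlap rule can be sketched in isolation (offsets are hand-written here instead of coming from a real tokenizer):

```python
from typing import List, Optional, Sequence, Tuple

def char_span_to_token_span(
    offsets: Sequence[Tuple[int, int]], char_start: int, char_end: int
) -> Optional[List[int]]:
    """Tokens whose half-open [s, e) range overlaps [char_start, char_end) form the span."""
    hit = [i for i, (s, e) in enumerate(offsets) if s < char_end and e > char_start]
    return [min(hit), max(hit)] if hit else None

# Hypothetical offsets for a tokenization of "return x + 1":
offsets = [(0, 6), (6, 8), (8, 10), (10, 12)]
print(char_span_to_token_span(offsets, 7, 12))  # -> [1, 3]
```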
+
+---
+
+## Metric output (row-only)
+
+The output CSV contains only aggregated `RISE/MAS` statistics over the row attribution:
+- `Method,Sink,Row_RISE_Mean,Row_RISE_Std,Row_MAS_Mean,Row_MAS_Std,Used,Skipped,Avg_Sample_Time_s`
+
+Output path:
+- `exp/exp4/output/faithfulness/aider/<model_tag>/row_only_<N>_examples.csv`
+
+where `<model_tag>` prefers `--model` and otherwise falls back to the directory name of `--model_path`, and `<N>` is the number of evaluated examples.
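The aggregation feeding those columns is a plain mean/std over per-sample (RISE, MAS) pairs; a toy sketch with made-up values (the real script uses the same population std, i.e. `ddof=0`):

```python
import numpy as np

# Per-sample (RISE, MAS) pairs for one (Method, Sink) cell -- toy values.
vals = np.asarray([(0.8, 0.6), (0.6, 0.4)], dtype=np.float64)
row_rise_mean, row_mas_mean = vals.mean(axis=0)
row_rise_std, row_mas_std = vals.std(axis=0)  # population std (ddof=0)
print(round(float(row_rise_mean), 6), round(float(row_rise_std), 6))  # -> 0.7 0.1
```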
+
+---
+
+## Usage
+
+Running from the repo root is recommended (so the relative paths resolve):
+
+```bash
+python exp/exp4/run_exp.py \
+ --data_path exp/exp4/data/aider.jsonl \
+ --output_root exp/exp4/output \
+ --model qwen-8B \
+ --model_path /opt/share/models/Qwen/Qwen3-8B/ \
+ --cuda 2,3,4,5,6,7 \
+ --num_examples 100 \
+ --n_hops 1 \
+ --k 20
+```
+
+Common arguments:
+- `--model_path` / `--model`: local model path or HF repo id (provide at least one)
+- `--tokenizer_path`: optional; defaults to reusing the model path/id
+- `--cuda`: accepts `0` (single GPU) or `0,1,2` (multi-GPU); the script sets `CUDA_VISIBLE_DEVICES` internally and uses `device_map=auto`
+- `--num_examples`: evaluate the first N rows in file order (`--seed` is reserved; no random sampling is done yet)
+- `--n_hops`: hop count for FlashTrace (`ifr_multi_hop_both`)
+- `--k`: number of perturbation steps for MAS/RISE
+- `--chunk_tokens` / `--sink_chunk_tokens`: chunking parameters for the IFR computation (usually keep the defaults)
diff --git a/exp/exp4/run_exp.py b/exp/exp4/run_exp.py
new file mode 100644
index 0000000000000000000000000000000000000000..9c50b93d5f3fbcdf081b6595bca0778cde265c8b
--- /dev/null
+++ b/exp/exp4/run_exp.py
@@ -0,0 +1,487 @@
+#!/usr/bin/env python3
+"""
+Experiment 4 runner: Aider token-level attribution faithfulness.
+
+Evaluates only:
+- IFR: ifr_all_positions
+ - sink = last meaningful code line (excluding fences)
+ - sink = last token of that code line
+- FlashTrace: ifr_multi_hop_both
+ - sink = full output (excluding appended EOS)
+
+Outputs only row-level faithfulness scores (RISE, MAS). No sample-level traces.
+"""
+
+from __future__ import annotations
+
+import argparse
+import json
+import os
+import sys
+import time
+from dataclasses import dataclass
+from itertools import islice
+from pathlib import Path
+from typing import Any, Dict, List, Optional, Sequence, Tuple
+
+
+def _early_set_cuda_visible_devices() -> None:
+ parser = argparse.ArgumentParser(add_help=False)
+ parser.add_argument("--cuda", type=str, default=None)
+ args, _ = parser.parse_known_args(sys.argv[1:])
+ if args.cuda and "," in args.cuda:
+ os.environ["CUDA_VISIBLE_DEVICES"] = args.cuda
+
+
+_early_set_cuda_visible_devices()
+
+import numpy as np
+import torch
+from transformers import AutoModelForCausalLM, AutoTokenizer, utils
+
+# Ensure repo root on path for `import llm_attr`, `import ft_ifr_improve`, etc.
+REPO_ROOT = Path(__file__).resolve().parents[2]
+if str(REPO_ROOT) not in sys.path:
+ sys.path.insert(0, str(REPO_ROOT))
+
+import ft_ifr_improve
+import llm_attr
+import llm_attr_eval
+
+utils.logging.set_verbosity_error()
+
+
+@dataclass(frozen=True)
+class AiderExample:
+ prompt: str
+ target: str
+ metadata: Dict[str, Any]
+
+
+def _read_jsonl(path: Path) -> List[Dict[str, Any]]:
+ rows: List[Dict[str, Any]] = []
+ with path.open("r", encoding="utf-8") as f:
+ for line in f:
+ if not line.strip():
+ continue
+ rows.append(json.loads(line))
+ return rows
+
+
+def load_aider(path: Path) -> List[AiderExample]:
+ rows = _read_jsonl(path)
+ examples: List[AiderExample] = []
+ for row in rows:
+ prompt = str(row.get("input") or "")
+ target = str(row.get("output") or "")
+ examples.append(AiderExample(prompt=prompt, target=target, metadata={"length": row.get("length")}))
+ return examples
+
+
+def _token_span_full_output(tokenizer, target: str) -> List[int]:
+ ids = tokenizer(target, add_special_tokens=False).input_ids
+ if not ids:
+ return [0, 0]
+ return [0, int(len(ids) - 1)]
+
+
+def _last_meaningful_code_line_char_span(target: str) -> Optional[Tuple[int, int]]:
+ lines = target.splitlines(keepends=True)
+ pos = 0
+ spans: List[Tuple[int, int, str]] = []
+ for line in lines:
+ start = pos
+ pos += len(line)
+ spans.append((start, pos, line))
+
+ for start, end, line in reversed(spans):
+ stripped = line.strip()
+ if not stripped:
+ continue
+ if stripped.startswith("```"):
+ continue
+ if start == 0 and stripped.endswith(".py"):
+ return None
+
+ line_no_nl = line.rstrip("\r\n")
+ end_no_nl = start + len(line_no_nl)
+ if end_no_nl <= start:
+ continue
+ return start, end_no_nl
+
+ return None
+
+
+def _char_span_to_token_span(tokenizer, text: str, span: Tuple[int, int]) -> Optional[List[int]]:
+ start_char, end_char = int(span[0]), int(span[1])
+ if end_char <= start_char:
+ return None
+
+ enc = tokenizer(text, add_special_tokens=False, return_offsets_mapping=True)
+ offsets = enc.get("offset_mapping")
+ if offsets is None:
+ raise ValueError("Tokenizer does not provide offset_mapping; cannot map char spans to tokens.")
+
+ tok_indices: List[int] = []
+ for idx, off in enumerate(offsets):
+ if off is None:
+ continue
+ s, e = int(off[0]), int(off[1])
+ if s < end_char and e > start_char:
+ tok_indices.append(int(idx))
+ if not tok_indices:
+ return None
+ return [min(tok_indices), max(tok_indices)]
+
+
+def _last_meaningful_code_line_token_span(tokenizer, target: str) -> List[int]:
+ full_span = _token_span_full_output(tokenizer, target)
+ span_chars = _last_meaningful_code_line_char_span(target)
+ if span_chars is None:
+ return full_span
+
+ span_toks = _char_span_to_token_span(tokenizer, target, span_chars)
+ if span_toks is None:
+ return full_span
+
+ span_toks[0] = max(int(span_toks[0]), int(full_span[0]))
+ span_toks[1] = min(int(span_toks[1]), int(full_span[1]))
+ if span_toks[1] < span_toks[0]:
+ return full_span
+ return span_toks
+
+
+def _last_token_span(token_span: Sequence[int]) -> List[int]:
+ if not (isinstance(token_span, Sequence) and len(token_span) == 2):
+ return [0, 0]
+ end = int(token_span[1])
+ return [end, end]
+
+
+def resolve_device(args) -> str:
+ if args.cuda is not None and "," in args.cuda:
+ os.environ["CUDA_VISIBLE_DEVICES"] = args.cuda
+ return "auto"
+ if args.cuda is not None and args.cuda.strip():
+ return f"cuda:{args.cuda}" if torch.cuda.is_available() else "cpu"
+ return f"cuda:{args.cuda_num}" if torch.cuda.is_available() else "cpu"
+
+
+def load_model_and_tokenizer(args) -> tuple[Any, Any]:
+ model_id = args.model_path or args.model
+ if not model_id:
+ raise SystemExit("Provide --model_path (local) or --model (HF repo id).")
+
+ tokenizer_id = args.tokenizer_path or model_id
+ device = resolve_device(args)
+
+ model = AutoModelForCausalLM.from_pretrained(
+ model_id,
+ device_map="auto" if device == "auto" else {"": int(device.split(":")[1])} if device.startswith("cuda:") else None,
+ torch_dtype=torch.float16,
+ attn_implementation="eager",
+ )
+ tokenizer = AutoTokenizer.from_pretrained(tokenizer_id)
+ if tokenizer.pad_token_id is None and tokenizer.eos_token_id is not None:
+ tokenizer.pad_token = tokenizer.eos_token
+ model.eval()
+ return model, tokenizer
+
+
+def _faithfulness_test_with_user_prompt_indices(
+ llm_evaluator: llm_attr_eval.LLMAttributionEvaluator,
+ attribution: torch.Tensor,
+ prompt: str,
+ generation: str,
+ *,
+ user_prompt_indices: List[int],
+ k: int = 20,
+) -> Tuple[float, float, float]:
+ def auc(arr: np.ndarray) -> float:
+ return (arr.sum() - arr[0] / 2 - arr[-1] / 2) / max(1, (arr.shape[0] - 1))
+
+ pad_token_id = llm_evaluator._ensure_pad_token_id()
+
+ user_prompt = " " + prompt
+ formatted_prompt = llm_evaluator.format_prompt(user_prompt)
+ formatted_ids = llm_evaluator.tokenizer(formatted_prompt, return_tensors="pt", add_special_tokens=False).input_ids
+
+ prompt_ids = formatted_ids.to(llm_evaluator.device)
+ prompt_ids_perturbed = prompt_ids.clone()
+ generation_ids = llm_evaluator.tokenizer(
+ generation + llm_evaluator.tokenizer.eos_token,
+ return_tensors="pt",
+ add_special_tokens=False,
+ ).input_ids.to(llm_evaluator.device)
+
+ attr_cpu = attribution.detach().cpu()
+ w = attr_cpu.sum(0)
+ sorted_attr_indices = torch.argsort(w, descending=True)
+ attr_sum = float(w.sum().item())
+
+ P = int(w.numel())
+ if len(user_prompt_indices) != P:
+ raise ValueError(
+ "user_prompt_indices length does not match prompt-side attribution length: "
+ f"indices P={len(user_prompt_indices)}, attr P={P}."
+ )
+ if P == 0:
+ return 0.0, 0.0, 0.0
+
+ if max(user_prompt_indices) >= int(prompt_ids_perturbed.shape[1]):
+ raise ValueError("user_prompt_indices contains an out-of-bounds index for formatted prompt ids.")
+
+ steps = int(k) if k is not None else 0
+ if steps <= 0:
+ steps = 1
+ steps = min(steps, P)
+
+ scores = np.zeros(steps + 1, dtype=np.float64)
+ density = np.zeros(steps + 1, dtype=np.float64)
+
+ scores[0] = (
+ llm_evaluator.compute_logprob_response_given_prompt(prompt_ids_perturbed, generation_ids).sum().cpu().detach().item()
+ )
+ density[0] = 1.0
+
+ if attr_sum <= 0:
+ density = np.linspace(1.0, 0.0, steps + 1)
+
+ base = P // steps
+ remainder = P % steps
+ start = 0
+ for step in range(steps):
+ size = base + (1 if step < remainder else 0)
+ group = sorted_attr_indices[start : start + size]
+ start += size
+
+ for idx in group:
+ j = int(idx.item())
+ abs_pos = int(user_prompt_indices[j])
+ prompt_ids_perturbed[0, abs_pos] = pad_token_id
+ scores[step + 1] = (
+ llm_evaluator.compute_logprob_response_given_prompt(prompt_ids_perturbed, generation_ids).sum().cpu().detach().item()
+ )
+ if attr_sum > 0:
+ dec = float(w.index_select(0, group).sum().item()) / attr_sum
+ density[step + 1] = density[step] - dec
+
+ min_normalized_pred = 1.0
+ normalized_model_response = scores.copy()
+ for i in range(len(scores)):
+        # Epsilon guards against a flat score curve (scores[0] == scores[-1]).
+        normalized_pred = (normalized_model_response[i] - scores[-1]) / max(abs(scores[0] - scores[-1]), 1e-12)
+ normalized_pred = np.clip(normalized_pred, 0.0, 1.0)
+ min_normalized_pred = min(min_normalized_pred, normalized_pred)
+ normalized_model_response[i] = min_normalized_pred
+
+ alignment_penalty = np.abs(normalized_model_response - density)
+ corrected_scores = normalized_model_response + alignment_penalty
+ corrected_scores = corrected_scores.clip(0.0, 1.0)
+ corrected_scores = (corrected_scores - np.min(corrected_scores)) / (np.max(corrected_scores) - np.min(corrected_scores))
+
+ if np.isnan(corrected_scores).any():
+ corrected_scores = np.linspace(1.0, 0.0, len(scores))
+
+ return auc(normalized_model_response), auc(corrected_scores), auc(normalized_model_response + alignment_penalty)
+
+
+def _row_faithfulness_scores(
+ *,
+ llm_evaluator: llm_attr_eval.LLMAttributionEvaluator,
+ attribution_prompt: torch.Tensor,
+ prompt: str,
+ generation: str,
+ user_prompt_indices: Optional[List[int]],
+ keep_prompt_token_indices: Optional[Sequence[int]] = None,
+ k: int = 20,
+) -> Tuple[float, float]:
+ if keep_prompt_token_indices is not None:
+ rise, mas, _ = ft_ifr_improve.faithfulness_test_skip_tokens(
+ llm_evaluator,
+ attribution_prompt,
+ prompt,
+ generation,
+ keep_prompt_token_indices=keep_prompt_token_indices,
+ user_prompt_indices=user_prompt_indices,
+ k=int(k),
+ )
+ return float(rise), float(mas)
+ if user_prompt_indices is not None:
+ rise, mas, _ = _faithfulness_test_with_user_prompt_indices(
+ llm_evaluator,
+ attribution_prompt,
+ prompt,
+ generation,
+ user_prompt_indices=user_prompt_indices,
+ k=int(k),
+ )
+ return float(rise), float(mas)
+
+ rise, mas, _ = llm_evaluator.faithfulness_test(attribution_prompt, prompt, generation, k=int(k))
+ return float(rise), float(mas)
+
+
+def _model_tag(args) -> str:
+ if args.model:
+ return str(args.model)
+ if args.model_path:
+ return Path(args.model_path).name
+ return "model"
+
+
+def main() -> None:
+ parser = argparse.ArgumentParser("Experiment 4 runner: aider faithfulness (row-only).")
+ parser.add_argument("--data_path", type=str, default="exp/exp4/data/aider.jsonl")
+ parser.add_argument("--output_root", type=str, default="exp/exp4/output")
+ parser.add_argument("--model", type=str, default=None, help="HF repo id (required unless --model_path set).")
+ parser.add_argument("--model_path", type=str, default=None, help="Local path; overrides --model for loading.")
+ parser.add_argument("--tokenizer_path", type=str, default=None, help="Optional tokenizer path/id (defaults to model).")
+ parser.add_argument("--cuda", type=str, default=None)
+ parser.add_argument("--cuda_num", type=int, default=0)
+ parser.add_argument("--num_examples", type=int, default=100)
+ parser.add_argument("--seed", type=int, default=42, help="Reserved for future use; exp4 runs in file order.")
+ parser.add_argument("--chunk_tokens", type=int, default=128)
+ parser.add_argument("--sink_chunk_tokens", type=int, default=32)
+ parser.add_argument("--n_hops", type=int, default=3)
+ parser.add_argument("--k", type=int, default=20, help="Perturbation steps for MAS/RISE.")
+ args = parser.parse_args()
+
+ data_path = Path(args.data_path)
+ if not data_path.exists():
+ raise SystemExit(f"Missing Aider JSONL: {data_path}")
+
+ model, tokenizer = load_model_and_tokenizer(args)
+ llm_evaluator = llm_attr_eval.LLMAttributionEvaluator(model, tokenizer)
+
+ examples = load_aider(data_path)
+ total = min(len(examples), int(args.num_examples))
+ iterator = islice(examples, total)
+
+ ifr = llm_attr.LLMIFRAttribution(
+ model,
+ tokenizer,
+ chunk_tokens=int(args.chunk_tokens),
+ sink_chunk_tokens=int(args.sink_chunk_tokens),
+ )
+ flashtrace = ft_ifr_improve.LLMIFRAttributionBoth(
+ model,
+ tokenizer,
+ chunk_tokens=int(args.chunk_tokens),
+ sink_chunk_tokens=int(args.sink_chunk_tokens),
+ )
+
+ results: Dict[Tuple[str, str], List[Tuple[float, float]]] = {
+ ("ifr_all_positions", "last_line"): [],
+ ("ifr_all_positions", "last_token"): [],
+ ("ifr_multi_hop_both", "full_output"): [],
+ }
+ skipped: Dict[Tuple[str, str], int] = {k: 0 for k in results}
+ sample_times: Dict[Tuple[str, str], List[float]] = {k: [] for k in results}
+
+ for example_idx, ex in enumerate(iterator):
+ prompt = ex.prompt
+ target = ex.target
+
+ full_span = _token_span_full_output(tokenizer, target)
+ last_line_span = _last_meaningful_code_line_token_span(tokenizer, target)
+ last_token_span = _last_token_span(last_line_span)
+
+ attr_all = None
+ attr_all_time_s = 0.0
+ user_prompt_indices_all: Optional[List[int]] = None
+ prompt_len_all = 0
+ try:
+ t_attr = time.perf_counter()
+ attr_all = ifr.calculate_ifr_for_all_positions(prompt, target=target)
+ attr_all_time_s = float(time.perf_counter() - t_attr)
+ user_prompt_indices_all = list(getattr(ifr, "user_prompt_indices", []) or [])
+ prompt_len_all = int(len(attr_all.prompt_tokens))
+ except Exception as exc:
+ skipped[("ifr_all_positions", "last_line")] += 1
+ skipped[("ifr_all_positions", "last_token")] += 1
+ print(f"[warn] ifr_all_positions attribution failed ex={example_idx}: {exc}")
+
+ if attr_all is not None and user_prompt_indices_all is not None and prompt_len_all >= 0:
+ for sink_name, span in (("last_line", last_line_span), ("last_token", last_token_span)):
+ key = ("ifr_all_positions", sink_name)
+ try:
+ t_faith = time.perf_counter()
+ row = attr_all.get_all_token_attrs(list(span))[1]
+ rise, mas = _row_faithfulness_scores(
+ llm_evaluator=llm_evaluator,
+ attribution_prompt=row[:, :prompt_len_all],
+ prompt=prompt,
+ generation=target,
+ user_prompt_indices=user_prompt_indices_all,
+ k=int(args.k),
+ )
+ faith_time_s = float(time.perf_counter() - t_faith)
+ results[key].append((rise, mas))
+ sample_times[key].append(attr_all_time_s + faith_time_s)
+ except Exception as exc:
+ skipped[key] += 1
+ print(f"[warn] ifr_all_positions {sink_name} failed ex={example_idx}: {exc}")
+
+ try:
+ t_attr = time.perf_counter()
+ attr_ft = flashtrace.calculate_ifr_multi_hop_both(
+ prompt,
+ target=target,
+ sink_span=None,
+ thinking_span=None,
+ n_hops=int(args.n_hops),
+ )
+ attr_ft_time_s = float(time.perf_counter() - t_attr)
+ user_prompt_indices_ft = list(getattr(flashtrace, "user_prompt_indices", []) or [])
+ prompt_len_ft = int(len(attr_ft.prompt_tokens))
+ keep_prompt_token_indices = ft_ifr_improve.keep_token_indices(list(attr_ft.prompt_tokens))
+
+ t_faith = time.perf_counter()
+ row_full = attr_ft.get_all_token_attrs(full_span)[1]
+ rise, mas = _row_faithfulness_scores(
+ llm_evaluator=llm_evaluator,
+ attribution_prompt=row_full[:, :prompt_len_ft],
+ prompt=prompt,
+ generation=target,
+ user_prompt_indices=user_prompt_indices_ft,
+ keep_prompt_token_indices=keep_prompt_token_indices,
+ k=int(args.k),
+ )
+ faith_time_s = float(time.perf_counter() - t_faith)
+ results[("ifr_multi_hop_both", "full_output")].append((rise, mas))
+ sample_times[("ifr_multi_hop_both", "full_output")].append(attr_ft_time_s + faith_time_s)
+ except Exception as exc:
+ skipped[("ifr_multi_hop_both", "full_output")] += 1
+ print(f"[warn] ifr_multi_hop_both failed ex={example_idx}: {exc}")
+
+ model_tag = _model_tag(args)
+ out_dir = Path(args.output_root) / "faithfulness" / "aider" / model_tag
+ out_dir.mkdir(parents=True, exist_ok=True)
+ out_path = out_dir / f"row_only_{total}_examples.csv"
+
+ with out_path.open("w", encoding="utf-8") as f:
+ f.write("Method,Sink,Row_RISE_Mean,Row_RISE_Std,Row_MAS_Mean,Row_MAS_Std,Used,Skipped,Avg_Sample_Time_s\n")
+ for (method, sink), vals in results.items():
+ arr = np.asarray(vals, dtype=np.float64)
+ used = int(arr.shape[0])
+ if used == 0:
+ rise_mean = float("nan")
+ rise_std = float("nan")
+ mas_mean = float("nan")
+ mas_std = float("nan")
+ else:
+ rise_mean = float(arr[:, 0].mean())
+ rise_std = float(arr[:, 0].std())
+ mas_mean = float(arr[:, 1].mean())
+ mas_std = float(arr[:, 1].std())
+ times = sample_times.get((method, sink)) or []
+ avg_time = float(np.mean(times)) if times else 0.0
+ f.write(
+ f"{method},{sink},{rise_mean},{rise_std},{mas_mean},{mas_std},{used},{int(skipped[(method, sink)])},{avg_time}\n"
+ )
+
+ print(f"[done] wrote {out_path}")
+
+
+if __name__ == "__main__":
+ main()
diff --git a/exp/exp5/README.md b/exp/exp5/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..dacdda687d20e45aa4a0dc3feca84963d603bf2b
--- /dev/null
+++ b/exp/exp5/README.md
@@ -0,0 +1,119 @@
+# FlashTrace Experiment 5: cross-model (Qwen → Llama) token-span mapping
+
+## Background: why mapping is needed
+
+Attribution and evaluation in `exp/exp2/run_exp.py` are strictly **token-level** and rely on token-span fields stored in the cached data:
+
+- `indices_to_explain = [start_tok, end_tok]` (generation-token indices; closed interval)
+- `sink_span` / `thinking_span`: likewise generation token spans
+
+These spans were computed with a specific tokenizer (usually the `Qwen3-8B` tokenizer) when the caches were generated (`exp/exp2/sample_and_filter.py`, `exp/exp2/map_math_mine_to_exp2_cache.py`) and then frozen.
+
+When you switch to a new model (e.g. `Llama-3.1-8B-Instruct`), the **tokenizer differs**: the tokenization length and boundaries of `target` change, so the stored spans frequently go out of range under the new tokenizer and make exp2 crash during attribution (`IndexError: end_tok out of range`).
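The out-of-range condition is simple to state: a cached closed-interval span only stays valid if its end index fits inside the new tokenization of `target` (plus EOS). A hypothetical check, with token counts stubbed in for the two tokenizers:

```python
def span_in_range(span, gen_len):
    """A closed-interval [start, end] token span must fit within gen_len tokens."""
    start, end = span
    return 0 <= start <= end < gen_len

cached_span = [118, 131]                # computed under the old (Qwen) tokenizer
print(span_in_range(cached_span, 140))  # old tokenization length: fits
print(span_in_range(cached_span, 120))  # Llama tokenizes shorter: out of range
```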
+
+## Solution: the exp5 mapping script
+
+`exp/exp5/map_exp2_cache_token_spans.py` remaps the old token spans in an exp2 cache from the old tokenizer (default `Qwen3-8B`) to a new tokenizer (default `Llama-3.1-8B-Instruct`) and writes the result to:
+
+`exp/exp5/data/<original dataset name>.jsonl`
+
+Mapping strategy (default):
+1) Tokenize `target` with the old tokenizer using `return_offsets_mapping=True`
+2) Convert the old token span into a character span in `target`
+3) Tokenize the same `target` with the new tokenizer, take its offsets, and map the character span back to a new token span
+
+In edge cases (e.g. a cache that was not produced by the expected old tokenizer), enable `--allow_fallback_answer` to relocate the span under the new tokenizer from `metadata.boxed_answer` (or `reference_answer`) as a fallback.
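The three-step strategy can be sketched end to end with hand-written offset mappings standing in for the two tokenizers (helper names here are illustrative, not the script's):

```python
from typing import List, Optional, Sequence, Tuple

Offsets = Sequence[Tuple[int, int]]

def token_span_to_char_span(offsets: Offsets, span: List[int]) -> Tuple[int, int]:
    # Step 2: closed token interval -> half-open character interval.
    start_tok, end_tok = span
    return offsets[start_tok][0], offsets[end_tok][1]

def char_span_to_token_span(offsets: Offsets, cs: int, ce: int) -> Optional[List[int]]:
    # Step 3: every new token overlapping the character interval joins the span.
    hit = [i for i, (s, e) in enumerate(offsets) if s < ce and e > cs]
    return [min(hit), max(hit)] if hit else None

old = [(0, 4), (4, 9), (9, 12)]                  # old tokenizer: 3 tokens
new = [(0, 2), (2, 4), (4, 6), (6, 9), (9, 12)]  # new tokenizer: 5 tokens
cs, ce = token_span_to_char_span(old, [1, 2])    # old span [1, 2] -> chars [4, 12)
print(char_span_to_token_span(new, cs, ce))      # -> [2, 4]
```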
+
+---
+
+## Step 1: map the exp2 dataset caches into exp5/data
+
+Using the repo venv is recommended:
+
+```bash
+.venv/bin/python exp/exp5/map_exp2_cache_token_spans.py \
+ --in_jsonl exp/exp2/data/niah_mq_q2.jsonl \
+ --out_dir exp/exp5/data \
+ --old_tokenizer_model /opt/share/models/Qwen/Qwen3-8B \
+ --new_tokenizer_model /opt/share/models/meta-llama/Llama-3.1-8B-Instruct
+```
+
+Map several datasets in one run (example: RULER + math):
+
+```bash
+.venv/bin/python exp/exp5/map_exp2_cache_token_spans.py \
+ --in_jsonl exp/exp2/data/niah_mq_q2.jsonl exp/exp2/data/math.jsonl \
+ --out_dir exp/exp5/data \
+ --old_tokenizer_model /opt/share/models/Qwen/Qwen3-8B \
+ --new_tokenizer_model /opt/share/models/meta-llama/Llama-3.1-8B-Instruct
+```
+
+If an output file already exists, add `--overwrite`.
+
+Default behavior: samples that cannot be mapped are **dropped** and reported in the output statistics (add `--strict` if you need strict consistency; the script then exits on the first failing sample). If the source cache was not produced by `--old_tokenizer_model`, add `--allow_fallback_answer` to enable fallback localization based on `metadata.boxed_answer`.
+
+---
+
+## Step 2: run the Llama attribution evaluation directly with exp2 (data/output pointed at exp5)
+
+Key points:
+- **Data loading**: pass `--data_root exp/exp5/data` so exp2 reads the mapped caches
+- **Result output**: pass `--output_root exp/exp5/output` to avoid writing into `exp/exp2/output`
+- **Do not pass** `--save_hop_traces` (avoids writing traces)
+
+### RULER (supports both recovery + faithfulness)
+
+```bash
+CUDA_VISIBLE_DEVICES=0 .venv/bin/python exp/exp2/run_exp.py \
+ --datasets niah_mq_q2 \
+ --data_root exp/exp5/data \
+ --output_root exp/exp5/output \
+ --attr_funcs ifr_all_positions,attnlrp,ifr_multi_hop_both \
+ --model_path /opt/share/models/meta-llama/Llama-3.1-8B-Instruct \
+ --cuda 0 \
+ --num_examples 100 \
+ --mode faithfulness_gen,recovery_ruler
+```
+
+### math (faithfulness only; recovery is declined by exp2)
+
+```bash
+CUDA_VISIBLE_DEVICES=0 .venv/bin/python exp/exp2/run_exp.py \
+ --datasets math \
+ --data_root exp/exp5/data \
+ --output_root exp/exp5/output \
+ --attr_funcs ifr_all_positions,attnlrp,ifr_multi_hop_both \
+ --model_path /opt/share/models/meta-llama/Llama-3.1-8B-Instruct \
+ --cuda 0 \
+ --num_examples 100 \
+ --mode faithfulness_gen
+```
+
+## On whether this pollutes the exp2 folder
+
+- **`exp/exp2/data/` is not touched**: exp2's caches are left alone; output goes to `exp/exp5/data/`.
+- **Without `--save_hop_traces`, no traces are written**.
+- Note, however, that `exp/exp2/run_exp.py` itself **always writes CSV metric files** to `--output_root` (that is the code's behavior; exp5 does not modify exp2), so to keep new files out of the exp2 folder, point `--output_root` at `exp/exp5/output` (or another directory).
+
+```bash
+python exp/exp2/run_exp.py \
+ --datasets niah_mq_q2 \
+ --data_root exp/exp5/data \
+ --output_root exp/exp5/output \
+ --attr_funcs ifr_all_positions,attnlrp,ifr_multi_hop_both \
+ --model_path /opt/share/models/meta-llama/Llama-3.1-8B-Instruct \
+ --cuda 2,3,4,5,6,7 \
+ --num_examples 100 \
+ --mode faithfulness_gen \
+  --n_hops 1 \
+&& python exp/exp2/run_exp.py \
+ --datasets math \
+ --data_root exp/exp5/data \
+ --output_root exp/exp5/output \
+ --attr_funcs ifr_all_positions,attnlrp,ifr_multi_hop_both \
+ --model_path /opt/share/models/meta-llama/Llama-3.1-8B-Instruct \
+ --cuda 2,3,4,5,6,7 \
+ --num_examples 100 \
+ --mode faithfulness_gen \
+ --n_hops 1
+```
\ No newline at end of file
diff --git a/exp/exp5/map_exp2_cache_token_spans.py b/exp/exp5/map_exp2_cache_token_spans.py
new file mode 100644
index 0000000000000000000000000000000000000000..995a3b3aae818159d0f7d6e409943cea4c897e87
--- /dev/null
+++ b/exp/exp5/map_exp2_cache_token_spans.py
@@ -0,0 +1,407 @@
+#!/usr/bin/env python3
+"""Map exp2 cached JSONL token spans across tokenizers (Qwen -> Llama).
+
+Background
+----------
+`exp/exp2/run_exp.py` expects cached datasets to provide token-level generation spans:
+
+ - indices_to_explain: [start_tok, end_tok] (generation-token indices; closed interval)
+ - sink_span / thinking_span: same tokenizer convention as indices_to_explain
+
+These spans are computed under a specific tokenizer (often Qwen3-8B). When switching
+to a different model/tokenizer (e.g., Llama-3.1-8B-Instruct), the stored spans can
+become out-of-range and crash exp2 attribution (IndexError in token-span checks).
+
+This script remaps spans by:
+ 1) Tokenizing `target` with the OLD tokenizer to obtain offset_mapping
+ 2) Converting the OLD token span into a character span in `target`
+ 3) Tokenizing `target` with the NEW tokenizer and mapping the character span back
+ into NEW token indices
+
+Outputs are written under `exp/exp5/data/` by default, keeping `exp/exp2/` untouched.
+"""
+
+from __future__ import annotations
+
+import argparse
+import json
+import sys
+from pathlib import Path
+from typing import Any, Dict, Iterable, List, Optional, Tuple
+
+from transformers import AutoTokenizer
+
+
+REPO_ROOT = Path(__file__).resolve().parents[2]
+if str(REPO_ROOT) not in sys.path:
+ sys.path.insert(0, str(REPO_ROOT))
+
+
+def _split_args(values: Iterable[str]) -> List[str]:
+ out: List[str] = []
+ for v in values:
+ for part in str(v).split(","):
+ part = part.strip()
+ if part:
+ out.append(part)
+ return out
+
+
+def _load_tokenizer(tokenizer_model: str):
+ path = Path(tokenizer_model)
+ if path.exists():
+ return AutoTokenizer.from_pretrained(path.as_posix(), local_files_only=True)
+ # May require network access; keep as fallback for environments that allow it.
+ return AutoTokenizer.from_pretrained(tokenizer_model)
+
+
+def _is_token_span(span: Any) -> bool:
+ return (
+ isinstance(span, list)
+ and len(span) == 2
+ and all(isinstance(x, int) for x in span)
+ and span[0] >= 0
+ and span[1] >= span[0]
+ )
+
+
+def _pick_old_span(obj: Dict[str, Any]) -> Optional[List[int]]:
+ span = obj.get("indices_to_explain")
+ if _is_token_span(span):
+ return list(span)
+ span = obj.get("sink_span")
+ if _is_token_span(span):
+ return list(span)
+ return None
+
+
+def _offsets_to_char_span(offsets: Any, token_span: List[int]) -> Optional[Tuple[int, int]]:
+ """Convert a token span [start,end] to a character span [char_start,char_end) using offsets."""
+ if offsets is None:
+ return None
+ if not isinstance(offsets, list):
+ return None
+ start_tok, end_tok = token_span
+ if end_tok >= len(offsets):
+ return None
+
+ char_starts: List[int] = []
+ char_ends: List[int] = []
+ for idx in range(start_tok, end_tok + 1):
+ off = offsets[idx]
+ if off is None:
+ continue
+ if not (isinstance(off, (list, tuple)) and len(off) == 2):
+ continue
+ try:
+ s, e = int(off[0]), int(off[1])
+ except Exception:
+ continue
+ if e <= s:
+ continue
+ char_starts.append(s)
+ char_ends.append(e)
+
+ if not char_starts or not char_ends:
+ return None
+ return min(char_starts), max(char_ends)
+
+
+def _char_span_to_token_span(offsets: Any, char_span: Tuple[int, int]) -> Optional[List[int]]:
+ """Convert a character span [char_start,char_end) to a token span [start,end] by overlap."""
+ if offsets is None:
+ return None
+ if not isinstance(offsets, list):
+ return None
+ char_start, char_end = int(char_span[0]), int(char_span[1])
+ if char_end <= char_start:
+ return None
+
+ hit: List[int] = []
+ for tok_idx, off in enumerate(offsets):
+ if off is None:
+ continue
+ if not (isinstance(off, (list, tuple)) and len(off) == 2):
+ continue
+ try:
+ s, e = int(off[0]), int(off[1])
+ except Exception:
+ continue
+ if e <= s:
+ continue
+ if s < char_end and e > char_start:
+ hit.append(int(tok_idx))
+
+ if not hit:
+ return None
+ return [min(hit), max(hit)]
+
+
+def _validate_span_with_eos(tokenizer, target: str, token_span: List[int]) -> bool:
+ eos = tokenizer.eos_token or ""
+ gen_ids = tokenizer(target + eos, add_special_tokens=False).input_ids
+ gen_len = int(len(gen_ids))
+ return 0 <= token_span[0] <= token_span[1] < gen_len
+
+
+def _guess_answer_text(obj: Dict[str, Any]) -> Optional[str]:
+ meta = obj.get("metadata") or {}
+ if isinstance(meta, dict):
+ boxed = (meta.get("boxed_answer") or "").strip()
+ if boxed:
+ return boxed
+ ref = (meta.get("reference_answer") or "").strip()
+ if ref:
+ return ref
+ tgt = obj.get("target")
+ if isinstance(tgt, str) and tgt.strip():
+ # Common exp2 cache convention: last line is the final answer.
+ last_line = tgt.strip().splitlines()[-1].strip()
+ return last_line or None
+ return None
+
+
+def _fallback_map_via_answer_text(
+ obj: Dict[str, Any],
+ *,
+ new_tokenizer,
+) -> Optional[List[int]]:
+ tgt = obj.get("target")
+ if not isinstance(tgt, str) or not tgt:
+ return None
+
+ from exp.exp2.dataset_utils import CachedExample, attach_spans_from_answer # lazy import
+
+ answer_text = _guess_answer_text(obj)
+ ex = CachedExample(
+ prompt=str(obj.get("prompt") or ""),
+ target=tgt,
+ indices_to_explain=None,
+ attr_mask_indices=obj.get("attr_mask_indices"),
+ sink_span=None,
+ thinking_span=None,
+ metadata=obj.get("metadata") or {},
+ )
+ out = attach_spans_from_answer(ex, new_tokenizer, answer_text)
+ if out.sink_span is None:
+ return None
+ if not _is_token_span(out.sink_span):
+ return None
+ return list(out.sink_span)
+
+
+def _map_one_obj(
+ obj: Dict[str, Any],
+ *,
+ old_tokenizer,
+ new_tokenizer,
+ allow_fallback_answer: bool,
+) -> Tuple[Optional[Dict[str, Any]], Optional[str]]:
+ target = obj.get("target")
+ if not isinstance(target, str) or not target:
+ return None, "missing_target"
+
+ old_span = _pick_old_span(obj)
+ if old_span is None:
+ return None, "missing_old_span"
+
+ # 1) Old token span -> char span in target.
+ old_enc = old_tokenizer(target, add_special_tokens=False, return_offsets_mapping=True)
+ old_offsets = old_enc.get("offset_mapping")
+ char_span = _offsets_to_char_span(old_offsets, old_span)
+ if char_span is None:
+ if not allow_fallback_answer:
+ return None, "old_span_to_char_failed"
+ new_span = _fallback_map_via_answer_text(obj, new_tokenizer=new_tokenizer)
+ if new_span is None:
+ return None, "fallback_answer_failed"
+ if not _validate_span_with_eos(new_tokenizer, target, new_span):
+ return None, "fallback_answer_span_invalid"
+ mapped = dict(obj)
+ mapped["indices_to_explain"] = new_span
+ mapped["sink_span"] = new_span
+ mapped["thinking_span"] = [0, new_span[0] - 1] if new_span[0] > 0 else None
+ meta = mapped.get("metadata")
+ if not isinstance(meta, dict):
+ meta = {}
+ meta = dict(meta)
+ meta["exp5_span_map_method"] = "answer_text"
+ mapped["metadata"] = meta
+ return mapped, None
+
+ # 2) Char span -> new token span.
+ new_enc = new_tokenizer(target, add_special_tokens=False, return_offsets_mapping=True)
+ new_offsets = new_enc.get("offset_mapping")
+ new_span = _char_span_to_token_span(new_offsets, char_span)
+ if new_span is None:
+ if not allow_fallback_answer:
+ return None, "char_to_new_span_failed"
+ new_span = _fallback_map_via_answer_text(obj, new_tokenizer=new_tokenizer)
+ if new_span is None:
+ return None, "fallback_answer_failed"
+
+ if not _validate_span_with_eos(new_tokenizer, target, new_span):
+ if not allow_fallback_answer:
+ return None, "new_span_invalid"
+ fb = _fallback_map_via_answer_text(obj, new_tokenizer=new_tokenizer)
+ if fb is None or not _validate_span_with_eos(new_tokenizer, target, fb):
+ return None, "fallback_answer_span_invalid"
+ new_span = fb
+
+ mapped = dict(obj)
+ mapped["indices_to_explain"] = new_span
+ mapped["sink_span"] = new_span
+ mapped["thinking_span"] = [0, new_span[0] - 1] if new_span[0] > 0 else None
+
+ meta = mapped.get("metadata")
+ if not isinstance(meta, dict):
+ meta = {}
+ meta = dict(meta)
+ meta["exp5_span_map_method"] = "token_span_char_align"
+ mapped["metadata"] = meta
+ return mapped, None
+
+
+def _read_jsonl(path: Path) -> Iterable[Dict[str, Any]]:
+ with path.open("r", encoding="utf-8") as f:
+ for line_no, line in enumerate(f, start=1):
+ if not line.strip():
+ continue
+ try:
+ obj = json.loads(line)
+ except json.JSONDecodeError as exc: # pragma: no cover
+ raise RuntimeError(f"Invalid JSON at {path}:{line_no}: {exc}") from exc
+ if not isinstance(obj, dict):
+ raise RuntimeError(f"Expected JSON object per line at {path}:{line_no}.")
+ yield obj
+
+
+def _write_jsonl(path: Path, rows: Iterable[Dict[str, Any]]) -> int:
+ path.parent.mkdir(parents=True, exist_ok=True)
+ count = 0
+ with path.open("w", encoding="utf-8") as f:
+ for obj in rows:
+ f.write(json.dumps(obj, ensure_ascii=False) + "\n")
+ count += 1
+ return count
+
+
+def _default_old_tokenizer() -> str:
+ # Repo defaults used in exp2 README examples for span extraction.
+ return "/opt/share/models/Qwen/Qwen3-8B"
+
+
+def _default_new_tokenizer() -> str:
+ return "/opt/share/models/meta-llama/Llama-3.1-8B-Instruct"
+
+
+def main() -> None:
+ ap = argparse.ArgumentParser("Map exp2 cache token spans from an old tokenizer to a new tokenizer.")
+ ap.add_argument(
+ "--in_jsonl",
+ type=str,
+ nargs="+",
+ required=True,
+ help="One or more exp2 cached JSONL files (comma-separated also accepted).",
+ )
+ ap.add_argument(
+ "--out_dir",
+ type=str,
+ default="exp/exp5/data",
+ help="Output directory for mapped JSONL files.",
+ )
+ ap.add_argument(
+ "--old_tokenizer_model",
+ type=str,
+ default=_default_old_tokenizer(),
+ help="Tokenizer used to produce the original token spans (default: Qwen3-8B local path).",
+ )
+ ap.add_argument(
+ "--new_tokenizer_model",
+ type=str,
+ default=_default_new_tokenizer(),
+ help="Tokenizer to map spans into (default: Llama-3.1-8B-Instruct local path).",
+ )
+ ap.add_argument("--strict", action="store_true", help="Fail on the first example that cannot be mapped.")
+ ap.add_argument(
+ "--allow_fallback_answer",
+ action="store_true",
+ help=(
+ "If span alignment fails, try to recompute spans by locating metadata.boxed_answer in target "
+ "(useful when caches were not built with the assumed old tokenizer)."
+ ),
+ )
+ ap.add_argument(
+ "--overwrite",
+ action="store_true",
+ help="Overwrite output files if they already exist.",
+ )
+ args = ap.parse_args()
+
+ in_paths = [Path(p) for p in _split_args(args.in_jsonl)]
+ out_dir = Path(args.out_dir)
+
+ old_tok = _load_tokenizer(str(args.old_tokenizer_model))
+ new_tok = _load_tokenizer(str(args.new_tokenizer_model))
+
+ # exp2 convention: ensure a pad token exists for downstream perturbation.
+ if new_tok.pad_token is None and new_tok.eos_token is not None:
+ new_tok.pad_token = new_tok.eos_token
+
+ summary: Dict[str, Any] = {
+ "old_tokenizer_model": str(args.old_tokenizer_model),
+ "new_tokenizer_model": str(args.new_tokenizer_model),
+ "datasets": [],
+ }
+
+ for in_path in in_paths:
+ if not in_path.exists():
+ raise SystemExit(f"Missing input JSONL: {in_path}")
+ out_path = out_dir / in_path.name
+ if out_path.exists() and not bool(args.overwrite):
+ raise SystemExit(f"Refusing to overwrite existing output: {out_path} (use --overwrite)")
+
+ total = 0
+ mapped_ok = 0
+ dropped = 0
+ errors: Dict[str, int] = {}
+
+ mapped_rows: List[Dict[str, Any]] = []
+ for obj in _read_jsonl(in_path):
+ total += 1
+ mapped, err = _map_one_obj(
+ obj,
+ old_tokenizer=old_tok,
+ new_tokenizer=new_tok,
+ allow_fallback_answer=bool(args.allow_fallback_answer),
+ )
+ if err is not None or mapped is None:
+ errors[err or "unknown_error"] = errors.get(err or "unknown_error", 0) + 1
+ if bool(args.strict):
+ raise SystemExit(f"Failed to map {in_path} example #{total}: {err}")
+ dropped += 1
+ continue
+ mapped_ok += 1
+ mapped_rows.append(mapped)
+
+ written = _write_jsonl(out_path, mapped_rows)
+ if written != mapped_ok: # pragma: no cover
+ raise SystemExit(f"Internal error: written={written} != mapped_ok={mapped_ok}")
+
+ record = {
+ "in_jsonl": str(in_path),
+ "out_jsonl": str(out_path),
+ "total": int(total),
+ "mapped_ok": int(mapped_ok),
+ "dropped": int(dropped),
+ "errors": errors,
+ }
+ summary["datasets"].append(record)
+ print(json.dumps(record, ensure_ascii=False))
+
+ # Human-readable compact summary at end.
+ print(json.dumps(summary, ensure_ascii=False, indent=2))
+
+
+if __name__ == "__main__":
+ main()
diff --git a/exp/proc/README.md b/exp/proc/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..a657df27786334a48080c9af86655895db6e6b1f
--- /dev/null
+++ b/exp/proc/README.md
@@ -0,0 +1,98 @@
+# exp/proc (exp2 trace mapping / export for collaborators)
+
+This directory provides tooling that takes the trace artifacts produced by `exp/exp2/run_exp.py --save_hop_traces` and repackages them into compact, per-sample `.npz` files intended for collaborators.
+
+Main files:
+- `exp/proc/map_exp2_traces_to_proc.py`: reads an exp2 trace run folder (`manifest.jsonl` + `ex_*.npz`) and writes the compact format to `exp/proc/output/`.
+
+---
+
+## Input requirements
+
+You need to provide (some can be inferred automatically):
+- `--trace_dir`: an exp2 trace run folder, e.g.:
+  - `exp/exp2/output/traces/exp/exp2/data/morehopqa.jsonl/qwen-8B/ifr_all_positions_mfaithfulness_gen_95ex/`
+- `--dataset_jsonl`: the exp2 cached dataset corresponding to that trace run (must contain `prompt` + `target`), e.g.:
+  - `exp/exp2/data/morehopqa.jsonl`
+- `--tokenizer_model`: the tokenizer used for the exp2 attribution run (local path or model name), e.g.:
+  - `/opt/share/models/Qwen/Qwen3-8B/`
+
+Notes:
+- This script strictly replays exp2's token-alignment logic (a leading space is prepended to the prompt; the generation is tokenized as `target + eos_token`, then decoded and re-sliced via offsets), so the tokenizer must exactly match the one used for exp2 attribution; otherwise it fails fast with a length mismatch.
+- Sample matching uses `prompt_sha1`/`target_sha1` from `manifest.jsonl` to align against `--dataset_jsonl`, so `--dataset_jsonl` must be the exact cache used for that trace run.
+
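The offset-slice token reconstruction described above can be illustrated with a minimal sketch. Note this uses a toy regex "tokenizer" purely as a stand-in; the real script relies on a Hugging Face tokenizer's `offset_mapping` instead:

```python
import re
from typing import List, Tuple

def toy_offsets(text: str) -> List[Tuple[int, int]]:
    # Stand-in for a HF tokenizer's offset_mapping: one (start, end)
    # character span per whitespace-delimited chunk of `text`.
    return [(m.start(), m.end()) for m in re.finditer(r"\S+", text)]

def decode_text_into_tokens(text: str) -> List[str]:
    # Mirrors the script's offset-slice logic: each token is the exact
    # substring of `text` covered by its offset span, so token pieces
    # reproduce the original text modulo the gaps between spans.
    return [text[s:e] for s, e in toy_offsets(text)]

tokens = decode_text_into_tokens(" What is 2+2? 4")
print(tokens)  # ['What', 'is', '2+2?', '4']
```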
+---
+
+## Output location and naming
+
+By default, outputs go to:
+- `exp/proc/output/<path under traces/>/`
+
+For example, for the input:
+- `.../output/traces/exp/exp2/data/morehopqa.jsonl/qwen-8B/<run_name>/`
+
+the default output is:
+- `exp/proc/output/exp/exp2/data/morehopqa.jsonl/qwen-8B/<run_name>/`
+
+You can also set the output directory explicitly with `--out_dir`.
+
+Inside the output directory there is one file per sample: `ex_000000.npz`, `ex_000001.npz`, ...
+
+---
+
+## Output `.npz` fields (compact; necessary information only)
+
+Each output sample `.npz` contains **only** the following keys:
+- `attr`: `float32[L]`, the row attribution vector (chat template stripped, EOS removed), covering all tokens of `input+cot+output`.
+- `hop`: `float32[H, L]` (optional; FT-IFR methods only), per-hop vectors, likewise with EOS removed and length-aligned with `attr`.
+- `tok`: `U[L]`, token text pieces strictly aligned with `attr`/`hop` (likewise without chat template or EOS).
+- `span_in`: `int64[2]`, inclusive range of the input within the vectors.
+- `span_cot`: `int64[2]`, inclusive range of the CoT within the vectors (`[-1, -1]` when there is no CoT).
+- `span_out`: `int64[2]`, inclusive range of the output within the vectors.
+- `rise`: `float64`, the row RISE (faithfulness) score.
+- `mas`: `float64`, the row MAS (faithfulness) score.
+- `recovery`: `float64`, the row Recovery@10% score (NaN when unavailable).
+
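To give a feel for the format, the following sketch writes a minimal stand-in sample with the documented keys (the values are synthetic, not from a real run) and reads back the output span. Note that all `span_*` ranges are inclusive:

```python
import os
import tempfile

import numpy as np

# Write a minimal stand-in sample with the documented keys (synthetic values).
path = os.path.join(tempfile.mkdtemp(), "ex_000000.npz")
attr = np.arange(6, dtype=np.float32)  # L = 6 tokens total
tok = np.asarray(["Q", ":", " think", " A", ":", " 4"], dtype=np.str_)
np.savez_compressed(
    path,
    attr=attr,
    tok=tok,
    span_in=np.asarray([0, 1], dtype=np.int64),   # input  = tokens 0..1
    span_cot=np.asarray([2, 2], dtype=np.int64),  # cot    = token  2
    span_out=np.asarray([3, 5], dtype=np.int64),  # output = tokens 3..5
    rise=np.float64(0.7),
    mas=np.float64(0.5),
    recovery=np.float64("nan"),
)

# Read it back and slice the output span (inclusive range, hence e + 1).
with np.load(path) as f:
    s, e = f["span_out"]
    out_attr = f["attr"][s : e + 1]
    out_tok = f["tok"][s : e + 1]
print(out_tok.tolist(), out_attr.tolist())
```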
+---
+
+## Usage examples
+
+Most common (passing the dataset and tokenizer explicitly is recommended):
+```bash
+python exp/proc/map_exp2_traces_to_proc.py \
+ --trace_dir exp/exp2/output/traces/exp/exp2/data/morehopqa.jsonl/qwen-8B/ifr_all_positions_mfaithfulness_gen_95ex \
+ --dataset_jsonl exp/exp2/data/morehopqa.jsonl \
+ --tokenizer_model /opt/share/models/Qwen/Qwen3-8B/
+```
+
+Specify the output directory explicitly (avoiding the default mirrored path):
+```bash
+python exp/proc/map_exp2_traces_to_proc.py \
+ --trace_dir exp/exp2/output/traces/exp/exp2/data/math.jsonl/qwen-8B/ifr_multi_hop_both_n1_mfaithfulness_gen_100ex/ \
+ --dataset_jsonl exp/exp2/data/math.jsonl \
+ --tokenizer_model /opt/share/models/Qwen/Qwen3-8B/ \
+ --out_dir exp/proc/output/math_ifr_multi_hop_both
+```
+
+Debugging (process only the first 5 samples; allow overwriting existing output files):
+```bash
+python exp/proc/map_exp2_traces_to_proc.py \
+ --trace_dir ... \
+ --dataset_jsonl ... \
+ --tokenizer_model ... \
+ --limit 5 \
+ --overwrite
+```
+
+---
+
+## Troubleshooting
+
+- Error "Prompt/Generation token length mismatch"
+  - Almost always a tokenizer mismatch; make sure `--tokenizer_model` is exactly the tokenizer used for the exp2 attribution run (ideally reuse the same `--model_path`).
+- Error "Failed to match manifest sha1 to dataset_jsonl"
+  - `--dataset_jsonl` is not the cache used for that trace run, or the cache has no `target`.
+- FT-IFR method output is missing `hop`
+  - For `ifr_multi_hop_stop_words/ifr_multi_hop_both/ifr_multi_hop_split_hop/ifr_in_all_gen`, the exp2 trace must contain `vh`; if the trace is old, re-run exp2 with `--save_hop_traces`.
+  - If you really need the output anyway, enable `--allow_missing_ft_hops` to force it (not recommended).
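Before re-running exp2, a quick way to check whether an existing trace npz already carries per-hop vectors is to inspect its keys for `vh` (the files below are synthetic stand-ins built just for the check):

```python
import os
import tempfile

import numpy as np

# Synthetic stand-ins: one trace npz with `vh`, one without.
d = tempfile.mkdtemp()
with_vh = os.path.join(d, "ex_with_vh.npz")
without_vh = os.path.join(d, "ex_without_vh.npz")
np.savez(with_vh, v_row_all=np.zeros(4, np.float32), vh=np.zeros((2, 4), np.float32))
np.savez(without_vh, v_row_all=np.zeros(4, np.float32))

def has_hop_vectors(npz_path: str) -> bool:
    # FT-IFR per-hop export requires the `vh` key in the trace npz.
    with np.load(npz_path) as f:
        return "vh" in f.files

print(has_hop_vectors(with_vh), has_hop_vectors(without_vh))  # True False
```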
+
diff --git a/exp/proc/map_exp2_traces_to_proc.py b/exp/proc/map_exp2_traces_to_proc.py
new file mode 100644
index 0000000000000000000000000000000000000000..17cc7a5b726971a8e40a21cd3ff2eb6bec784ded
--- /dev/null
+++ b/exp/proc/map_exp2_traces_to_proc.py
@@ -0,0 +1,411 @@
+#!/usr/bin/env python3
+"""Map exp2 trace artifacts into a collaborator-friendly per-sample NPZ format.
+
+Input: an exp2 trace run directory produced by `exp/exp2/run_exp.py --save_hop_traces`,
+e.g.:
+
+ exp/exp2/output/traces/exp/exp2/data/morehopqa.jsonl/qwen-8B/ifr_all_positions_mfaithfulness_gen_95ex/
+
+This directory contains:
+ - manifest.jsonl (one JSON object per sample)
+ - ex_*.npz (per-sample vectors and scores)
+
+Output: per-sample NPZ files under `exp/proc/output/` (or a user-provided output path),
+each containing only:
+ - attr: row attribution vector over [input + CoT + output] tokens, with chat template and EOS removed
+ - hop: per-hop vectors (FT-IFR only), aligned to attr (optional)
+ - tok: tokenized text pieces aligned to attr/hop (no chat template, no EOS)
+ - span_in/span_cot/span_out: inclusive ranges for input/CoT/output in the above vectors
+ - rise/mas: row faithfulness scores (RISE, MAS)
+ - recovery: row Recovery@10% score (NaN when unavailable)
+
+This script is intentionally self-contained under exp/proc/ and does not modify exp2.
+"""
+
+from __future__ import annotations
+
+import argparse
+import hashlib
+import json
+from dataclasses import dataclass
+from pathlib import Path
+from typing import Dict, List, Optional, Tuple
+
+import numpy as np
+from transformers import AutoTokenizer
+
+
+FT_IFR_ATTR_FUNCS: set[str] = {
+ "ifr_in_all_gen",
+ "ifr_multi_hop_stop_words",
+ "ifr_multi_hop_both",
+ "ifr_multi_hop_split_hop",
+}
+
+
+def _sha1_text(text: str) -> str:
+ return hashlib.sha1(text.encode("utf-8")).hexdigest()
+
+
+def _load_tokenizer(tokenizer_model: str):
+ tok_path = Path(tokenizer_model)
+ if tok_path.exists():
+ tokenizer = AutoTokenizer.from_pretrained(tok_path.as_posix(), local_files_only=True)
+ else:
+ tokenizer = AutoTokenizer.from_pretrained(tokenizer_model)
+ if tokenizer.eos_token is None:
+ raise SystemExit("Tokenizer is missing eos_token; cannot match exp2 generation tokenization.")
+ if tokenizer.pad_token is None and tokenizer.eos_token is not None:
+ tokenizer.pad_token = tokenizer.eos_token
+ return tokenizer
+
+
+def _decode_text_into_tokens(tokenizer, text: str) -> List[str]:
+ """Mirror llm_attr.LLMAttribution.decode_text_into_tokens (offset-slice tokens)."""
+ enc = tokenizer(text, return_offsets_mapping=True, add_special_tokens=False)
+ ids = enc.get("input_ids")
+ offsets = enc.get("offset_mapping")
+ if ids is None or offsets is None:
+ raise ValueError("Tokenizer must provide input_ids and offset_mapping for exact exp2 token alignment.")
+ if len(ids) != len(offsets):
+ raise ValueError("Tokenizer returned mismatched input_ids vs offset_mapping lengths.")
+ tokens: List[str] = []
+ for start, end in offsets:
+ tokens.append(text[int(start) : int(end)])
+ return tokens
+
+
+@dataclass(frozen=True)
+class DatasetEntry:
+ prompt: str
+ target: str
+
+
+def _index_dataset_by_sha1(dataset_jsonl: Path) -> Dict[Tuple[str, str], DatasetEntry]:
+ """Build (prompt_sha1, target_sha1) -> (prompt, target) for cache lookup."""
+ index: Dict[Tuple[str, str], DatasetEntry] = {}
+ collisions: Dict[Tuple[str, str], int] = {}
+
+ with dataset_jsonl.open("r", encoding="utf-8") as f:
+ for line_num, line in enumerate(f, start=1):
+ if not line.strip():
+ continue
+ obj = json.loads(line)
+ prompt = str(obj.get("prompt") or "")
+ target = obj.get("target")
+ if target is None:
+ # exp2 trace matching requires cached targets.
+ continue
+ target = str(target)
+
+ key = (_sha1_text(prompt), _sha1_text(target))
+ if key in index:
+ collisions[key] = collisions.get(key, 1) + 1
+ continue
+ index[key] = DatasetEntry(prompt=prompt, target=target)
+
+ if collisions:
+ raise SystemExit(
+ "Dataset cache contains duplicate (prompt,target) pairs; cannot uniquely match by sha1. "
+ f"Example collision count={next(iter(collisions.values()))}. "
+ f"dataset_jsonl={dataset_jsonl}"
+ )
+
+ if not index:
+ raise SystemExit(
+ "No usable (prompt,target) pairs found in dataset cache. "
+ "Ensure you pass the exp2 cached JSONL used for attribution (with target filled)."
+ )
+
+ return index
+
+
+def _infer_trace_suffix(trace_dir: Path) -> Optional[Path]:
+ parts = list(trace_dir.parts)
+ if "traces" not in parts:
+ return None
+ idx = parts.index("traces")
+ suffix_parts = parts[idx + 1 :]
+ if not suffix_parts:
+ return None
+ return Path(*suffix_parts)
+
+
+def _parse_manifest(manifest_path: Path) -> List[dict]:
+ records: List[dict] = []
+ with manifest_path.open("r", encoding="utf-8") as f:
+ for line in f:
+ if not line.strip():
+ continue
+ records.append(json.loads(line))
+ if not records:
+ raise SystemExit(f"Empty manifest.jsonl: {manifest_path}")
+ return records
+
+
+def _read_span(npz: np.lib.npyio.NpzFile, key: str) -> Optional[Tuple[int, int]]:
+ if key not in npz.files:
+ return None
+ arr = npz[key]
+ if arr.shape != (2,):
+ raise ValueError(f"Expected {key} to have shape (2,), got {arr.shape}.")
+ return int(arr[0]), int(arr[1])
+
+
+def _span_or_empty(span: Optional[Tuple[int, int]]) -> Tuple[int, int]:
+ if span is None:
+ return -1, -1
+ return int(span[0]), int(span[1])
+
+
+def _tokenize_for_exp2_alignment(
+ tokenizer,
+ *,
+ prompt: str,
+ target: str,
+ expected_prompt_len: int,
+ expected_gen_len: int,
+) -> List[str]:
+ prompt_text = " " + (prompt or "")
+ prompt_tokens = _decode_text_into_tokens(tokenizer, prompt_text)
+ if len(prompt_tokens) != int(expected_prompt_len):
+ raise ValueError(f"Prompt token length mismatch: expected {expected_prompt_len}, got {len(prompt_tokens)}.")
+
+ gen_ids = tokenizer(target + tokenizer.eos_token, add_special_tokens=False).input_ids
+ gen_text = tokenizer.decode(gen_ids, skip_special_tokens=False, clean_up_tokenization_spaces=False)
+ gen_tokens = _decode_text_into_tokens(tokenizer, gen_text)
+ if len(gen_tokens) != int(expected_gen_len):
+ raise ValueError(f"Generation token length mismatch: expected {expected_gen_len}, got {len(gen_tokens)}.")
+
+ gen_tokens_no_eos = gen_tokens[:-1] if gen_tokens else []
+ return prompt_tokens + gen_tokens_no_eos
+
+
+def _clamp_span(span: Optional[Tuple[int, int]], *, max_index: int) -> Optional[Tuple[int, int]]:
+ if span is None:
+ return None
+ start, end = int(span[0]), int(span[1])
+ if max_index < 0:
+ return None
+ if end < 0 or start > max_index:
+ return None
+ start = max(0, start)
+ end = min(max_index, end)
+ if end < start:
+ return None
+ return start, end
+
+
+def _proc_one(
+ *,
+ trace_npz_path: Path,
+ record: dict,
+ dataset_index: Dict[Tuple[str, str], DatasetEntry],
+ tokenizer,
+ out_path: Path,
+ overwrite: bool,
+ allow_missing_ft_hops: bool,
+) -> None:
+ prompt_sha1 = str(record.get("prompt_sha1") or "")
+ target_sha1 = str(record.get("target_sha1") or "")
+ if not prompt_sha1 or not target_sha1:
+ raise ValueError("manifest record missing prompt_sha1/target_sha1; cannot match dataset.")
+
+ entry = dataset_index.get((prompt_sha1, target_sha1))
+ if entry is None:
+ raise ValueError(
+ "Failed to match manifest sha1 to dataset_jsonl. "
+ "Ensure --dataset_jsonl points to the exact cached JSONL used for this trace run."
+ )
+
+ if out_path.exists() and not overwrite:
+ raise FileExistsError(f"Refusing to overwrite existing file: {out_path} (use --overwrite).")
+ out_path.parent.mkdir(parents=True, exist_ok=True)
+
+ with np.load(trace_npz_path, allow_pickle=False) as f:
+ prompt_len = int(np.asarray(f.get("prompt_len")).item())
+ gen_len = int(np.asarray(f.get("gen_len")).item())
+ total_len = prompt_len + gen_len
+ gen_no_eos = max(0, gen_len - 1)
+ L = prompt_len + gen_no_eos
+
+ v_row_all = f.get("v_row_all")
+ if v_row_all is None:
+ raise ValueError("Missing v_row_all in trace npz; cannot build row attribution vector.")
+ v_row_all = np.asarray(v_row_all, dtype=np.float32)
+ if v_row_all.ndim != 1 or int(v_row_all.shape[0]) != int(total_len):
+ raise ValueError(f"v_row_all shape mismatch: expected ({total_len},), got {tuple(v_row_all.shape)}.")
+ attr = v_row_all[:L]
+
+ indices_to_explain = _read_span(f, "indices_to_explain_gen")
+ sink_span_gen = _read_span(f, "sink_span_gen") or indices_to_explain
+ if sink_span_gen is None:
+ raise ValueError("Missing sink_span_gen/indices_to_explain_gen; cannot define output span.")
+ thinking_span_gen = _read_span(f, "thinking_span_gen")
+ if thinking_span_gen is None:
+ sink_start = int(sink_span_gen[0])
+ think_end = sink_start - 1
+ thinking_span_gen = (0, think_end) if think_end >= 0 else None
+
+ sink_span_gen = _clamp_span(sink_span_gen, max_index=gen_no_eos - 1)
+ thinking_span_gen = _clamp_span(thinking_span_gen, max_index=gen_no_eos - 1)
+
+ span_in = (0, prompt_len - 1) if prompt_len > 0 else (-1, -1)
+ span_cot = (
+ (prompt_len + thinking_span_gen[0], prompt_len + thinking_span_gen[1])
+ if thinking_span_gen is not None
+ else (-1, -1)
+ )
+ span_out = (
+ (prompt_len + sink_span_gen[0], prompt_len + sink_span_gen[1]) if sink_span_gen is not None else (-1, -1)
+ )
+
+ tokens = _tokenize_for_exp2_alignment(
+ tokenizer,
+ prompt=entry.prompt,
+ target=entry.target,
+ expected_prompt_len=prompt_len,
+ expected_gen_len=gen_len,
+ )
+ if len(tokens) != int(L):
+ raise ValueError(f"Token length mismatch after EOS drop: expected {L}, got {len(tokens)}.")
+
+ # Scores: row = index 1.
+ rise = float("nan")
+ mas = float("nan")
+ faith = f.get("faithfulness_scores")
+ if faith is not None:
+ faith = np.asarray(faith, dtype=np.float64)
+ if faith.shape != (3, 3):
+ raise ValueError(f"faithfulness_scores shape mismatch: expected (3,3), got {tuple(faith.shape)}.")
+ rise = float(faith[1, 0])
+ mas = float(faith[1, 1])
+
+ recovery = float("nan")
+ rec = f.get("recovery_scores")
+ if rec is not None:
+ rec = np.asarray(rec, dtype=np.float64)
+ if rec.shape != (3,):
+ raise ValueError(f"recovery_scores shape mismatch: expected (3,), got {tuple(rec.shape)}.")
+ recovery = float(rec[1])
+
+ out_payload = {
+ "attr": np.asarray(attr, dtype=np.float32),
+ "tok": np.asarray(tokens, dtype=np.str_),
+ "span_in": np.asarray(span_in, dtype=np.int64),
+ "span_cot": np.asarray(span_cot, dtype=np.int64),
+ "span_out": np.asarray(span_out, dtype=np.int64),
+ "rise": np.asarray(rise, dtype=np.float64),
+ "mas": np.asarray(mas, dtype=np.float64),
+ "recovery": np.asarray(recovery, dtype=np.float64),
+ }
+
+ attr_func = str(record.get("attr_func") or "")
+ want_hops = attr_func in FT_IFR_ATTR_FUNCS
+ if want_hops:
+ vh = f.get("vh")
+ if vh is None:
+ if not allow_missing_ft_hops:
+ raise ValueError(
+ f"FT-IFR method '{attr_func}' requires per-hop vectors but trace npz is missing 'vh'. "
+ "Re-run exp2 with --save_hop_traces using the updated code."
+ )
+ else:
+ vh = np.asarray(vh, dtype=np.float32)
+ if vh.ndim != 2 or int(vh.shape[1]) != int(total_len):
+ raise ValueError(
+ f"vh shape mismatch: expected (H,{total_len}), got {tuple(vh.shape)} for {trace_npz_path}."
+ )
+ out_payload["hop"] = vh[:, :L]
+
+ np.savez_compressed(out_path, **out_payload)
+
+
+def main() -> None:
+ ap = argparse.ArgumentParser("Map exp2 trace folder -> exp/proc/output per-sample npz files.")
+ ap.add_argument("--trace_dir", type=str, required=True, help="Path to an exp2 trace run directory (contains manifest.jsonl).")
+ ap.add_argument("--dataset_jsonl", type=str, default=None, help="Path to the exp2 cached dataset JSONL used for this trace.")
+ ap.add_argument(
+ "--tokenizer_model",
+ type=str,
+ required=True,
+ help="Tokenizer model name or local path (must match exp2 attribution tokenizer).",
+ )
+ ap.add_argument("--out_root", type=str, default="exp/proc/output", help="Root directory for proc outputs.")
+ ap.add_argument("--out_dir", type=str, default=None, help="Optional explicit output directory (overrides --out_root).")
+ ap.add_argument("--overwrite", action="store_true", help="Overwrite existing output files if present.")
+ ap.add_argument("--limit", type=int, default=None, help="Optional limit on number of samples to process (debug).")
+ ap.add_argument(
+ "--allow_missing_ft_hops",
+ action="store_true",
+ help="Allow producing FT-IFR outputs even when per-hop vectors (vh) are missing (not recommended).",
+ )
+ args = ap.parse_args()
+
+ trace_dir = Path(args.trace_dir)
+ if not trace_dir.exists() or not trace_dir.is_dir():
+ raise SystemExit(f"Missing trace_dir: {trace_dir}")
+ manifest_path = trace_dir / "manifest.jsonl"
+ if not manifest_path.exists():
+ raise SystemExit(f"Missing manifest.jsonl: {manifest_path}")
+
+ dataset_jsonl: Optional[Path] = Path(args.dataset_jsonl) if args.dataset_jsonl else None
+ if dataset_jsonl is None:
+ suffix = _infer_trace_suffix(trace_dir)
+ if suffix is not None and len(suffix.parts) >= 3:
+            # suffix = <dataset_jsonl_path>/<model_name>/<run_name>
+ inferred_dataset = Path(*suffix.parts[:-2])
+ if inferred_dataset.exists() and inferred_dataset.is_file():
+ dataset_jsonl = inferred_dataset
+ if dataset_jsonl is None:
+ raise SystemExit("Please pass --dataset_jsonl (could not infer it from --trace_dir).")
+ if not dataset_jsonl.exists():
+ raise SystemExit(f"Missing --dataset_jsonl: {dataset_jsonl}")
+
+ tokenizer = _load_tokenizer(str(args.tokenizer_model))
+ dataset_index = _index_dataset_by_sha1(dataset_jsonl)
+ records = _parse_manifest(manifest_path)
+
+ if args.out_dir:
+ out_dir = Path(args.out_dir)
+ else:
+ suffix = _infer_trace_suffix(trace_dir)
+ out_dir = Path(args.out_root) / suffix if suffix is not None else Path(args.out_root) / trace_dir.name
+ out_dir.mkdir(parents=True, exist_ok=True)
+
+ total = len(records)
+ limit = args.limit
+ if limit is not None:
+ if limit <= 0:
+ raise SystemExit("--limit must be a positive integer.")
+ total = min(total, int(limit))
+
+ processed = 0
+ for record in records[:total]:
+ file_name = str(record.get("file") or "")
+ if not file_name:
+ raise SystemExit("manifest record missing 'file' field.")
+ trace_npz_path = trace_dir / file_name
+ if not trace_npz_path.exists():
+ raise SystemExit(f"Missing trace npz referenced by manifest: {trace_npz_path}")
+
+ out_path = out_dir / file_name
+ try:
+ _proc_one(
+ trace_npz_path=trace_npz_path,
+ record=record,
+ dataset_index=dataset_index,
+ tokenizer=tokenizer,
+ out_path=out_path,
+ overwrite=bool(args.overwrite),
+ allow_missing_ft_hops=bool(args.allow_missing_ft_hops),
+ )
+ except Exception as exc:
+ raise SystemExit(f"Failed processing {trace_npz_path}: {exc}") from exc
+ processed += 1
+
+ print(f"Wrote {processed} proc samples -> {out_dir}")
+
+
+if __name__ == "__main__":
+ main()
diff --git a/exp/proc_1/README.md b/exp/proc_1/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..365214657b1aaeacae01721f67057fcffe277479
--- /dev/null
+++ b/exp/proc_1/README.md
@@ -0,0 +1,72 @@
+# exp/proc_1 (exp2 trace mapping / export for collaborators, v1)
+
+This directory provides tooling (v1) that takes the trace artifacts produced by `exp/exp2/run_exp.py --save_hop_traces` and repackages them into compact, per-sample `.npz` files intended for collaborators.
+
+Differences from `exp/proc/`:
+- Drops `tok` (per-token text pieces).
+- Adds `length` (per-segment token lengths, `[in, cot, out]`), guaranteed to stay aligned with `span_in/span_cot/span_out`.
+- The `hop` field follows a "default strategy": it is written only when the trace sample contains `vh`; otherwise it is omitted without raising an error.
+- Supports processing all run directories under `exp/exp2/output/traces/` (all dataset-method combinations) in one pass.
+
+---
+
+## Input structure (exp2 traces)
+
+An `exp2` trace run directory looks like:
+- `exp/exp2/output/traces/<dataset_jsonl_path>/<model_name>/<run_name>/`
+
+Each run directory contains:
+- `manifest.jsonl` (one sample record per line, including `file=ex_*.npz`)
+- `ex_*.npz` (one npz per sample)
+
+---
+
+## Output location and naming
+
+By default, outputs go to:
+- `exp/proc_1/output/<path under traces/>/`
+
+For example, for the input:
+- `.../output/traces/exp/exp2/data/math.jsonl/qwen-8B/<run_name>/`
+
+the default output is:
+- `exp/proc_1/output/exp/exp2/data/math.jsonl/qwen-8B/<run_name>/`
+
+---
+
+## Output `.npz` fields
+
+Each output sample `.npz` contains only the following keys:
+- `attr`: `float32[L]`, the row attribution vector covering all tokens of `input+cot+output` (trailing generation EOS removed).
+- `hop`: `float32[H, L]` (optional), written when the trace npz contains `vh` (likewise with EOS removed and length-aligned with `attr`).
+- `span_in`: `int64[2]`, inclusive range of the input within the vectors.
+- `span_cot`: `int64[2]`, inclusive range of the CoT within the vectors (`[-1, -1]` when there is no CoT).
+- `span_out`: `int64[2]`, inclusive range of the output within the vectors.
+- `length`: `int64[3]`, in the order `[in, cot, out]`; lengths correspond strictly to `span_*` (inclusive-range length `end-start+1`; an empty span has length 0).
+- `rise`: `float64`, the row RISE (faithfulness) score.
+- `mas`: `float64`, the row MAS (faithfulness) score.
+- `recovery`: `float64`, the row Recovery@10% score (NaN when unavailable).
+
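The invariant tying `length` to the spans (inclusive length `end-start+1`, with `[-1, -1]` marking an empty span of length 0) can be sketched as:

```python
import numpy as np

def span_len(span) -> int:
    # Inclusive-range length; [-1, -1] marks an empty span (length 0).
    start, end = int(span[0]), int(span[1])
    if start < 0 or end < 0 or end < start:
        return 0
    return end - start + 1

span_in = np.asarray([0, 9], dtype=np.int64)     # 10 input tokens
span_cot = np.asarray([-1, -1], dtype=np.int64)  # no CoT
span_out = np.asarray([10, 12], dtype=np.int64)  # 3 output tokens
length = np.asarray(
    [span_len(span_in), span_len(span_cot), span_len(span_out)], dtype=np.int64
)
print(length.tolist())  # [10, 0, 3]
```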
+---
+
+## Usage examples
+
+Process all runs under the traces root (recommended):
+```bash
+python exp/proc_1/map_exp2_traces_to_proc_1.py \
+ --traces_root exp/exp2/output/traces
+```
+
+Process only a single run directory:
+```bash
+python exp/proc_1/map_exp2_traces_to_proc_1.py \
+ --trace_dir exp/exp2/output/traces/exp/exp2/data/math.jsonl/qwen-8B/ifr_multi_hop_both_n1_mfaithfulness_gen_100ex
+```
+
+Debugging (process only the first 5 samples per run; allow overwriting outputs):
+```bash
+python exp/proc_1/map_exp2_traces_to_proc_1.py \
+ --traces_root exp/exp2/output/traces \
+ --limit 5 \
+ --overwrite
+```
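The run discovery used by the batch mode treats any folder directly containing a `manifest.jsonl` as a run directory; a self-contained sketch of that logic (building a throwaway directory tree for illustration):

```python
import tempfile
from pathlib import Path

def iter_run_dirs(traces_root: Path):
    # A run directory is any folder that directly contains manifest.jsonl.
    runs = {p.parent for p in traces_root.rglob("manifest.jsonl") if p.is_file()}
    return sorted(runs)

# Throwaway tree with two run directories under one dataset/model prefix.
root = Path(tempfile.mkdtemp())
(root / "a/run1").mkdir(parents=True)
(root / "a/run2").mkdir(parents=True)
(root / "a/run1/manifest.jsonl").write_text("{}\n")
(root / "a/run2/manifest.jsonl").write_text("{}\n")
print([p.name for p in iter_run_dirs(root)])  # ['run1', 'run2']
```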
diff --git a/exp/proc_1/map_exp2_traces_to_proc_1.py b/exp/proc_1/map_exp2_traces_to_proc_1.py
new file mode 100644
index 0000000000000000000000000000000000000000..54be366773bdf30fd32ec650e64ba469d9f2c98a
--- /dev/null
+++ b/exp/proc_1/map_exp2_traces_to_proc_1.py
@@ -0,0 +1,338 @@
+#!/usr/bin/env python3
+"""Map exp2 trace artifacts into a collaborator-friendly per-sample NPZ format (proc_1).
+
+This is a lightweight variant of `exp/proc/map_exp2_traces_to_proc.py`:
+- Removes `tok` (per-token text pieces).
+- Adds `length` with three components [in, cot, out], aligned to span_in/span_cot/span_out.
+- Saves `hop` only when the trace sample contains `vh` (default strategy).
+- Can process a single exp2 trace run directory or all run directories under a traces root.
+
+Input: an exp2 trace run directory produced by `exp/exp2/run_exp.py --save_hop_traces`, e.g.:
+
+ exp/exp2/output/traces/exp/exp2/data/math.jsonl/qwen-8B/ifr_multi_hop_both_n1_mfaithfulness_gen_100ex/
+
+This directory contains:
+ - manifest.jsonl (one JSON object per sample)
+ - ex_*.npz (per-sample vectors and scores)
+
+Output: per-sample NPZ files under `exp/proc_1/output/` (or a user-provided output path),
+each containing only:
+ - attr: row attribution vector over [input + CoT + output] tokens, with EOS removed
+ - hop: per-hop vectors (optional; only if `vh` exists in the trace npz), aligned to attr
+ - span_in/span_cot/span_out: inclusive ranges for input/CoT/output in the above vectors
+ - length: int64[3] = [in, cot, out], derived strictly from spans
+ - rise/mas: row faithfulness scores (RISE, MAS)
+ - recovery: row Recovery@10% score (NaN when unavailable)
+
+This script is intentionally self-contained under exp/proc_1/ and does not modify exp2.
+"""
+
+from __future__ import annotations
+
+import argparse
+import json
+from dataclasses import dataclass
+from pathlib import Path
+from typing import Iterable, List, Optional, Tuple
+
+import numpy as np
+
+
+def _infer_trace_suffix(trace_dir: Path) -> Optional[Path]:
+ parts = list(trace_dir.parts)
+ if "traces" not in parts:
+ return None
+ idx = parts.index("traces")
+ suffix_parts = parts[idx + 1 :]
+ if not suffix_parts:
+ return None
+ return Path(*suffix_parts)
+
+
+def _iter_run_dirs(traces_root: Path) -> List[Path]:
+ runs = {p.parent for p in traces_root.rglob("manifest.jsonl") if p.is_file()}
+ return sorted(runs)
+
+
+def _parse_manifest(manifest_path: Path) -> List[dict]:
+ records: List[dict] = []
+ with manifest_path.open("r", encoding="utf-8") as f:
+ for line in f:
+ if not line.strip():
+ continue
+ records.append(json.loads(line))
+ return records
+
+
+def _read_span(npz: np.lib.npyio.NpzFile, key: str) -> Optional[Tuple[int, int]]:
+ if key not in npz.files:
+ return None
+ arr = npz[key]
+ if arr.shape != (2,):
+ raise ValueError(f"Expected {key} to have shape (2,), got {arr.shape}.")
+ return int(arr[0]), int(arr[1])
+
+
+def _clamp_span(span: Optional[Tuple[int, int]], *, max_index: int) -> Optional[Tuple[int, int]]:
+ if span is None:
+ return None
+ start, end = int(span[0]), int(span[1])
+ if max_index < 0:
+ return None
+ if end < 0 or start > max_index:
+ return None
+ start = max(0, start)
+ end = min(max_index, end)
+ if end < start:
+ return None
+ return start, end
+
+
+def _span_len(span: Tuple[int, int]) -> int:
+ start, end = int(span[0]), int(span[1])
+ if start < 0 or end < 0 or end < start:
+ return 0
+ return int(end - start + 1)
+
+
+@dataclass(frozen=True)
+class ProcOneResult:
+ wrote: bool
+ has_hop: bool
+
+
+def _proc_one(
+ *,
+ trace_npz_path: Path,
+ record: dict,
+ out_path: Path,
+ overwrite: bool,
+) -> ProcOneResult:
+ if out_path.exists() and not overwrite:
+ raise FileExistsError(f"Refusing to overwrite existing file: {out_path} (use --overwrite).")
+ out_path.parent.mkdir(parents=True, exist_ok=True)
+
+ with np.load(trace_npz_path, allow_pickle=False) as f:
+ prompt_len = int(np.asarray(f.get("prompt_len")).item())
+ gen_len = int(np.asarray(f.get("gen_len")).item())
+ total_len = prompt_len + gen_len
+ gen_no_eos = max(0, gen_len - 1)
+ L = prompt_len + gen_no_eos
+
+ v_row_all = f.get("v_row_all")
+ if v_row_all is None:
+ raise ValueError("Missing v_row_all in trace npz; cannot build row attribution vector.")
+ v_row_all = np.asarray(v_row_all, dtype=np.float32)
+ if v_row_all.ndim != 1 or int(v_row_all.shape[0]) != int(total_len):
+ raise ValueError(f"v_row_all shape mismatch: expected ({total_len},), got {tuple(v_row_all.shape)}.")
+ attr = v_row_all[:L]
+
+ indices_to_explain = _read_span(f, "indices_to_explain_gen")
+ sink_span_gen = _read_span(f, "sink_span_gen") or indices_to_explain
+ if sink_span_gen is None:
+ raise ValueError("Missing sink_span_gen/indices_to_explain_gen; cannot define output span.")
+ thinking_span_gen = _read_span(f, "thinking_span_gen")
+ if thinking_span_gen is None:
+ sink_start = int(sink_span_gen[0])
+ think_end = sink_start - 1
+ thinking_span_gen = (0, think_end) if think_end >= 0 else None
+
+ sink_span_gen = _clamp_span(sink_span_gen, max_index=gen_no_eos - 1)
+ thinking_span_gen = _clamp_span(thinking_span_gen, max_index=gen_no_eos - 1)
+
+ span_in = (0, prompt_len - 1) if prompt_len > 0 else (-1, -1)
+ span_cot = (
+ (prompt_len + thinking_span_gen[0], prompt_len + thinking_span_gen[1])
+ if thinking_span_gen is not None
+ else (-1, -1)
+ )
+ span_out = (
+ (prompt_len + sink_span_gen[0], prompt_len + sink_span_gen[1]) if sink_span_gen is not None else (-1, -1)
+ )
+
+ length = np.asarray([_span_len(span_in), _span_len(span_cot), _span_len(span_out)], dtype=np.int64)
+
+ rise = float("nan")
+ mas = float("nan")
+ faith = f.get("faithfulness_scores")
+ if faith is not None:
+ faith = np.asarray(faith, dtype=np.float64)
+ if faith.shape != (3, 3):
+ raise ValueError(f"faithfulness_scores shape mismatch: expected (3,3), got {tuple(faith.shape)}.")
+ rise = float(faith[1, 0])
+ mas = float(faith[1, 1])
+
+ recovery = float("nan")
+ rec = f.get("recovery_scores")
+ if rec is not None:
+ rec = np.asarray(rec, dtype=np.float64)
+ if rec.shape != (3,):
+ raise ValueError(f"recovery_scores shape mismatch: expected (3,), got {tuple(rec.shape)}.")
+ recovery = float(rec[1])
+
+ out_payload = {
+ "attr": np.asarray(attr, dtype=np.float32),
+ "span_in": np.asarray(span_in, dtype=np.int64),
+ "span_cot": np.asarray(span_cot, dtype=np.int64),
+ "span_out": np.asarray(span_out, dtype=np.int64),
+ "length": np.asarray(length, dtype=np.int64),
+ "rise": np.asarray(rise, dtype=np.float64),
+ "mas": np.asarray(mas, dtype=np.float64),
+ "recovery": np.asarray(recovery, dtype=np.float64),
+ }
+
+ has_hop = False
+ vh = f.get("vh")
+ if vh is not None:
+ vh = np.asarray(vh, dtype=np.float32)
+ if vh.ndim != 2 or int(vh.shape[1]) != int(total_len):
+ raise ValueError(f"vh shape mismatch: expected (H,{total_len}), got {tuple(vh.shape)} for {trace_npz_path}.")
+ out_payload["hop"] = vh[:, :L]
+ has_hop = True
+
+ np.savez_compressed(out_path, **out_payload)
+ _ = record
+ return ProcOneResult(wrote=True, has_hop=has_hop)
+
+
+def _resolve_out_dir_for_trace_dir(*, trace_dir: Path, out_root: Path, out_dir: Optional[Path]) -> Path:
+ if out_dir is not None:
+ return out_dir
+ suffix = _infer_trace_suffix(trace_dir)
+ return (out_root / suffix) if suffix is not None else (out_root / trace_dir.name)
+
+
+def _process_trace_dir(
+ *,
+ trace_dir: Path,
+ out_root: Path,
+ out_dir: Optional[Path],
+ overwrite: bool,
+ limit: Optional[int],
+ skip_empty_manifest: bool,
+) -> Tuple[int, int]:
+ manifest_path = trace_dir / "manifest.jsonl"
+ if not manifest_path.exists():
+ raise SystemExit(f"Missing manifest.jsonl: {manifest_path}")
+
+ records = _parse_manifest(manifest_path)
+ if not records:
+ if skip_empty_manifest:
+ print(f"[skip] empty manifest: {manifest_path}")
+ return 0, 0
+ raise SystemExit(f"Empty manifest.jsonl: {manifest_path}")
+
+ total = len(records)
+ if limit is not None:
+ if limit <= 0:
+ raise SystemExit("--limit must be a positive integer.")
+ total = min(total, int(limit))
+
+ resolved_out_dir = _resolve_out_dir_for_trace_dir(trace_dir=trace_dir, out_root=out_root, out_dir=out_dir)
+ resolved_out_dir.mkdir(parents=True, exist_ok=True)
+
+ wrote = 0
+ wrote_with_hop = 0
+ for record in records[:total]:
+ file_name = str(record.get("file") or "")
+ if not file_name:
+ raise SystemExit("manifest record missing 'file' field.")
+ trace_npz_path = trace_dir / file_name
+ if not trace_npz_path.exists():
+ raise SystemExit(f"Missing trace npz referenced by manifest: {trace_npz_path}")
+
+ out_path = resolved_out_dir / file_name
+ try:
+ res = _proc_one(trace_npz_path=trace_npz_path, record=record, out_path=out_path, overwrite=overwrite)
+ except Exception as exc:
+ raise SystemExit(f"Failed processing {trace_npz_path}: {exc}") from exc
+ wrote += int(res.wrote)
+ wrote_with_hop += int(res.has_hop)
+
+ print(f"[ok] wrote {wrote} samples ({wrote_with_hop} with hop) -> {resolved_out_dir}")
+ return wrote, wrote_with_hop
+
+
+def main() -> None:
+ ap = argparse.ArgumentParser("Map exp2 trace folder(s) -> exp/proc_1/output per-sample npz files.")
+ ap.add_argument(
+ "--trace_dir",
+ type=str,
+ default=None,
+ help="Path to a single exp2 trace run directory (contains manifest.jsonl).",
+ )
+ ap.add_argument(
+ "--traces_root",
+ type=str,
+ default=None,
+ help="Path to traces root; processes all run dirs under it (each with a manifest.jsonl).",
+ )
+ ap.add_argument("--out_root", type=str, default="exp/proc_1/output", help="Root directory for proc_1 outputs.")
+ ap.add_argument(
+ "--out_dir",
+ type=str,
+ default=None,
+ help="Optional explicit output directory (only valid with --trace_dir; overrides --out_root).",
+ )
+ ap.add_argument("--overwrite", action="store_true", help="Overwrite existing output files if present.")
+ ap.add_argument("--limit", type=int, default=None, help="Optional limit on number of samples per run (debug).")
+ ap.add_argument(
+ "--fail_on_empty_manifest",
+ action="store_true",
+ help="Fail (instead of skipping) when encountering an empty manifest.jsonl.",
+ )
+ args = ap.parse_args()
+
+ trace_dir = Path(args.trace_dir) if args.trace_dir else None
+ traces_root = Path(args.traces_root) if args.traces_root else None
+ if (trace_dir is None) == (traces_root is None):
+ raise SystemExit("Please pass exactly one of --trace_dir or --traces_root.")
+
+ out_root = Path(args.out_root)
+ out_dir = Path(args.out_dir) if args.out_dir else None
+ if out_dir is not None and trace_dir is None:
+ raise SystemExit("--out_dir is only valid with --trace_dir (for --traces_root use --out_root).")
+
+ skip_empty_manifest = not bool(args.fail_on_empty_manifest)
+
+ if trace_dir is not None:
+ if not trace_dir.exists() or not trace_dir.is_dir():
+ raise SystemExit(f"Missing trace_dir: {trace_dir}")
+ _process_trace_dir(
+ trace_dir=trace_dir,
+ out_root=out_root,
+ out_dir=out_dir,
+ overwrite=bool(args.overwrite),
+ limit=args.limit,
+ skip_empty_manifest=skip_empty_manifest,
+ )
+ return
+
+ assert traces_root is not None
+ if not traces_root.exists() or not traces_root.is_dir():
+ raise SystemExit(f"Missing traces_root: {traces_root}")
+
+ run_dirs = _iter_run_dirs(traces_root)
+ if not run_dirs:
+ raise SystemExit(f"No run directories found under traces_root={traces_root} (expected manifest.jsonl).")
+
+ total_written = 0
+ total_with_hop = 0
+ for run_dir in run_dirs:
+ wrote, wrote_with_hop = _process_trace_dir(
+ trace_dir=run_dir,
+ out_root=out_root,
+ out_dir=None,
+ overwrite=bool(args.overwrite),
+ limit=args.limit,
+ skip_empty_manifest=skip_empty_manifest,
+ )
+ total_written += wrote
+ total_with_hop += wrote_with_hop
+
+ print(f"[done] total wrote {total_written} samples ({total_with_hop} with hop) under out_root={out_root}")
+
+
+if __name__ == "__main__":
+ main()
+
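The span bookkeeping in `_proc_one` (generation-local closed spans shifted by the prompt length, with `(-1, -1)` as the empty-span sentinel) can be sketched in isolation. A minimal sketch; the helper name `spans_to_full_sequence` is hypothetical, not part of the patch:

```python
def spans_to_full_sequence(prompt_len, thinking_span, sink_span):
    """Map generation-local [start, end] spans (closed intervals) to
    full-sequence indices; (-1, -1) marks an empty/absent span."""
    def shift(span):
        if span is None:
            return (-1, -1)
        return (prompt_len + span[0], prompt_len + span[1])

    span_in = (0, prompt_len - 1) if prompt_len > 0 else (-1, -1)
    return span_in, shift(thinking_span), shift(sink_span)

# 5 prompt tokens, thinking tokens 0..2, sink tokens 3..4 (generation-local)
spans = spans_to_full_sequence(5, (0, 2), (3, 4))
```

With an empty prompt and no thinking span, the sentinel convention from the script is preserved: `spans_to_full_sequence(0, None, (0, 1))` yields `((-1, -1), (-1, -1), (0, 1))`.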
diff --git a/flashtrace/__init__.py b/flashtrace/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..965b43ea5854afd0cb94b273883c996ab16b4383
--- /dev/null
+++ b/flashtrace/__init__.py
@@ -0,0 +1,7 @@
+"""FlashTrace: efficient multi-token attribution for reasoning LLMs."""
+
+from .model_io import load_model_and_tokenizer
+from .result import TokenScore, TraceResult
+from .tracer import FlashTrace
+
+__all__ = ["FlashTrace", "TraceResult", "TokenScore", "load_model_and_tokenizer"]
diff --git a/flashtrace/attribution.py b/flashtrace/attribution.py
new file mode 100644
index 0000000000000000000000000000000000000000..dcdca4dc2e1ab111d8c8c220d77e458b1ddabf18
--- /dev/null
+++ b/flashtrace/attribution.py
@@ -0,0 +1,2975 @@
+import matplotlib
+import matplotlib.cm as mpl_cm
+from matplotlib import pyplot as plt
+import numpy as np
+import torch
+
+if not hasattr(mpl_cm, "register_cmap"):
+ from matplotlib import colors as _mpl_colors
+
+ def _register_cmap(name=None, cmap=None, data=None, lut=None, *, force=False):
+ """Compatibility wrapper for Matplotlib >=3.10 where register_cmap moved."""
+ if cmap is not None and data is not None:
+ raise ValueError("Cannot specify both `cmap` and `data` when registering a colormap.")
+ if data is not None:
+ if name is None:
+ raise ValueError("Must supply a name when registering colormap data.")
+ cmap = _mpl_colors.LinearSegmentedColormap(name, data, lut=lut)
+ elif cmap is None:
+ raise ValueError("Must supply `cmap` or `data` when registering a colormap.")
+
+ if isinstance(cmap, str):
+ cmap = matplotlib.colormaps[cmap]
+
+ name = name or cmap.name
+ copied = cmap.copy()
+ copied.name = name
+ mpl_cm._colormaps.register(copied, name=name, force=force)
+
+ def _unregister_cmap(name):
+ mpl_cm._colormaps.unregister(name)
+
+ mpl_cm.register_cmap = _register_cmap
+ mpl_cm.unregister_cmap = _unregister_cmap
+
+import seaborn as sns
+import torch.nn as nn
+import torch.nn.functional as F
+from tqdm import tqdm
+from typing import Dict, Any, List, Optional, Tuple, Sequence
+import textwrap
+from transformers import LongformerTokenizer, LongformerForMaskedLM
+import networkx as nx
+import matplotlib as mpl
+from matplotlib.patches import FancyArrowPatch
+from wordfreq import zipf_frequency
+
+from dataclasses import dataclass
+from typing import Literal
+
+from .core import (
+ IFRParameters,
+ ModelMetadata,
+ attach_hooks,
+ build_weight_pack,
+ compute_ifr_for_all_positions,
+ compute_ifr_sentence_aggregate,
+ compute_multi_hop_ifr,
+ extract_model_metadata,
+)
+
+
+@dataclass
+class AttnLRPSpanAggregate:
+ """Span-wise aggregated AttnLRP result (single vector), analogous to IFRAggregate.
+
+ This dataclass stores the result of span-wise AttnLRP aggregation computed
+ in O(N) time using a single forward + backward pass.
+
+ Attributes
+ ----------
+ token_importance_total : torch.Tensor
+ 1D float32 CPU tensor of length (user_prompt_len + gen_len) after
+ chat-template stripping, aligned with `all_tokens`.
+ all_tokens : List[str]
+ All tokens (user prompt + generation)
+ user_prompt_tokens : List[str]
+ User prompt tokens only
+ generation_tokens : List[str]
+ Generation tokens only
+ sink_range : Tuple[int, int]
+ [sink_start, sink_end] in generation-token indices
+ sink_weights : Optional[torch.Tensor]
+ Weights used for aggregation (if any)
+ metadata : Dict[str, Any]
+ Additional metadata about the computation
+ """
+ token_importance_total: torch.Tensor
+ all_tokens: List[str]
+ user_prompt_tokens: List[str]
+ generation_tokens: List[str]
+ sink_range: Tuple[int, int]
+ sink_weights: Optional[torch.Tensor]
+ metadata: Dict[str, Any]
+
+
+@dataclass
+class MultiHopAttnLRPResult:
+ """Multi-hop AttnLRP attribution result, analogous to MultiHopIFRResult.
+
+ This dataclass stores the result of multi-hop AttnLRP computation where
+    attribution is recursively propagated from output → thinking → input.
+
+ Attributes
+ ----------
+ raw_attributions : List[AttnLRPSpanAggregate]
+ List of per-hop attribution results. Index 0 is the base (outputโall),
+ subsequent indices are hop 1, 2, etc. (thinkingโall with weights).
+ thinking_ratios : List[float]
+ Fraction of attribution mass on the thinking span at each hop.
+ Useful for understanding how much attribution "stays" in reasoning.
+ observation : Dict[str, torch.Tensor]
+ Dictionary containing:
+ - "mask": observation mask (1 for observable tokens, 0 for thinking/sink)
+ - "base": base attribution masked to observable tokens
+ - "per_hop": list of per-hop attributions masked to observable tokens
+ - "sum": cumulative sum of all per-hop observations
+ - "avg": average of per-hop observations
+ """
+ raw_attributions: List[AttnLRPSpanAggregate]
+ thinking_ratios: List[float]
+ observation: Dict[str, torch.Tensor]
+
+
+from .shared_utils import (
+ DEFAULT_GENERATE_KWARGS,
+ DEFAULT_PROMPT_TEMPLATE,
+ create_sentences,
+ create_sentence_masks,
+)
+
+from .lrp_patches import lrp_context, detect_model_type
+
+matplotlib.rcParams['text.usetex'] = False
+matplotlib.rcParams['mathtext.default'] = 'regular'
+
+class LLMAttribution():
+ def __init__(
+ self,
+ model: Any,
+ tokenizer: Any,
+ generate_kwargs: Optional[Dict[str, Any]] = None,
+ *,
+ use_chat_template: bool = False,
+ ) -> None:
+
+ self.model = model
+ self.tokenizer = tokenizer
+ self.device = model.device
+
+ self.generate_kwargs = generate_kwargs or DEFAULT_GENERATE_KWARGS
+ self.use_chat_template = bool(use_chat_template)
+
+ self.prompt = None
+ self.prompt_ids = None
+ self.prompt_tokens = None
+ self.chat_prompt_indices = None
+
+ self.user_prompt = None
+ self.user_prompt_ids = None
+ self.user_prompt_tokens = None
+ self.user_prompt_indices = None
+
+ self.generation = None
+ self.generation_ids = None
+ self.generation_tokens = None
+
+ self.model.eval()
+
+ def decode_text_into_tokens(self, text) -> list[str]:
+ encoding = self.tokenizer(text, return_offsets_mapping=True, add_special_tokens=False)
+
+ offsets = encoding["offset_mapping"]
+
+ # recover each token's surface text from its character offsets
+ text_tokens = [text[start:end] for start, end in offsets]
+
+ return text_tokens
+
+ def extract_user_prompt_attributions(self, input, attribution) -> torch.Tensor:
+ # Extract all attributions to be kept (gen -> user prompt and gen -> gen attributions)
+ user_prompt_attr_idx = torch.tensor(self.user_prompt_indices)
+ gen_attr_idx = torch.arange(len(input), attribution.shape[1])
+ all_keep_idx = torch.cat((user_prompt_attr_idx, gen_attr_idx), dim = 0)
+
+ return attribution[:, all_keep_idx]
+
+ # Takes a torch tensor of size [N, M] and extends it to [N, target_length] with a padding value
+ def pad_vector(self, vector, target_length, padding_value = 0) -> torch.Tensor:
+ current_length = vector.shape[1]
+ if current_length >= target_length:
+ return vector
+ padding_size = target_length - current_length
+ padded_vector = F.pad(vector, (0, padding_size), value=padding_value)
+ return padded_vector
+
+ def format_prompt(self, prompt) -> str:
+ if not self.use_chat_template:
+ return prompt
+
+ modified_prompt = DEFAULT_PROMPT_TEMPLATE.format(context = prompt, query = "")
+ formatted_prompt = [{"role": "user", "content": modified_prompt}]
+ formatted_prompt = self.tokenizer.apply_chat_template(
+ formatted_prompt,
+ tokenize=False,
+ add_generation_prompt=True,
+ enable_thinking=False
+ )
+
+ return formatted_prompt
+
+ # Query the model for its generation
+ # This internally saves the input and generated token ids for attribution target
+ def response(self, prompt) -> str:
+ self.user_prompt = " " + prompt if self.use_chat_template else prompt
+ self.prompt = self.format_prompt(self.user_prompt)
+
+ # these are the ids for the user supplied prompt
+ self.user_prompt_ids = self.tokenizer(self.user_prompt, return_tensors="pt", add_special_tokens = False).to(self.device).input_ids
+ # this is the tokenization of the chat prompt
+ self.prompt_ids = self.tokenizer(self.prompt, return_tensors="pt", add_special_tokens = False).to(self.device).input_ids
+
+ with torch.no_grad():
+ outputs = self.model.generate(self.prompt_ids, **self.generate_kwargs) # [1, num_prompt_tokens + num_generations]
+
+ # Get only the generated tokens (excluding the prompt)
+ self.generation_ids = outputs[:, self.prompt_ids.shape[1]:] # [1, num_generations]
+ self.generation = self.tokenizer.decode(self.generation_ids[0], skip_special_tokens = True)
+ gen_with_eos = self.tokenizer.decode(self.generation_ids[0], skip_special_tokens = False, clean_up_tokenization_spaces = False)
+
+ # we want to find the indices of the formatted prompt that the user prompt occupies
+ # we only want to attribute the user prompt, so we track this for later
+ n, m = len(self.user_prompt_ids[0]), len(self.prompt_ids[0])
+ for i, input_id in enumerate(self.prompt_ids[0]):
+ if input_id == self.user_prompt_ids[0, 0]:
+ self.user_prompt_indices = list(range(i, i + n))
+ break
+ else:
+ raise ValueError("Could not locate the user prompt inside the chat-formatted prompt.")
+
+ # make a list of indices which are all prompt tokens
+ # (chat prompt formatting) that are not the user prompt tokens
+ self.chat_prompt_indices = [idx for idx in range(0, m) if idx < self.user_prompt_indices[0] or idx > self.user_prompt_indices[-1]]
+
+ # get the full prompt, user prompt, and generation as tokenized words
+ self.prompt_tokens = self.decode_text_into_tokens(self.prompt)
+ # print(self.prompt_tokens)
+ self.user_prompt_tokens = self.decode_text_into_tokens(self.user_prompt)
+ # print(self.user_prompt_tokens)
+ self.generation_tokens = self.decode_text_into_tokens(gen_with_eos)
+ # print(self.generation_tokens)
+
+ return self.generation
+
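The prompt-locating loop above matches only the first user-prompt token and assumes the remaining tokens follow contiguously. A full subsequence match would be more robust; this is a standalone sketch with a hypothetical helper name, not the method used by the class:

```python
def find_subsequence_indices(haystack, needle):
    """Return the index list of the first full occurrence of `needle`
    inside `haystack`, or None if it never appears."""
    n = len(needle)
    for i in range(len(haystack) - n + 1):
        # compare the whole window, not just its first element
        if list(haystack[i:i + n]) == list(needle):
            return list(range(i, i + n))
    return None

idx = find_subsequence_indices([9, 8, 1, 2, 3, 7], [1, 2, 3])
```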
+ # nearly identical to response(), but we do not actually query the model
+ # we assume generation = target, and generate all the class variables as done in response()
+ def target_response(self, prompt, target) -> str:
+ self.user_prompt = " " + prompt if self.use_chat_template else prompt
+ self.prompt = self.format_prompt(self.user_prompt)
+
+ # these are the ids for the user supplied prompt
+ self.user_prompt_ids = self.tokenizer(self.user_prompt, return_tensors="pt", add_special_tokens = False).to(self.device).input_ids
+ # this is the tokenization of the chat prompt
+ self.prompt_ids = self.tokenizer(self.prompt, return_tensors="pt", add_special_tokens = False).to(self.device).input_ids # [1, num_prompt_tokens]
+ # Tokenize the target generation
+ target_text = target + (self.tokenizer.eos_token or "")
+ self.generation_ids = self.tokenizer(target_text, return_tensors="pt", add_special_tokens = False).to(self.device).input_ids # [1, num_generations]
+ self.generation = target
+ gen_with_eos = self.tokenizer.decode(self.generation_ids[0], skip_special_tokens = False, clean_up_tokenization_spaces = False)
+
+ # we want to find which indices of the formatted prompt that the user prompt occupies
+ # we will only want to attribute the user prompt, so we track this for later
+ n, m = len(self.user_prompt_ids[0]), len(self.prompt_ids[0])
+ for i, input_id in enumerate(self.prompt_ids[0]):
+ if input_id == self.user_prompt_ids[0, 0]:
+ self.user_prompt_indices = list(range(i, i + n))
+ break
+ else:
+ raise ValueError("Could not locate the user prompt inside the chat-formatted prompt.")
+
+ # make a list of indices which are all prompt tokens
+ # (chat prompt formatting) that are not the user prompt tokens
+ self.chat_prompt_indices = [idx for idx in range(0, m) if idx < self.user_prompt_indices[0] or idx > self.user_prompt_indices[-1]]
+
+ # get the full prompt, user prompt, and generation as tokenized words
+ self.prompt_tokens = self.decode_text_into_tokens(self.prompt)
+ self.user_prompt_tokens = self.decode_text_into_tokens(self.user_prompt)
+ self.generation_tokens = self.decode_text_into_tokens(gen_with_eos)
+
+ return self.generation
+
+class LLMAttributionResult():
+ def __init__(
+ self,
+ tokenizer: Any,
+ attribution_matrix: torch.Tensor,
+ prompt_tokens: list[str],
+ generation_tokens: list[str],
+ all_tokens: Optional[list[str]] = None,
+ metadata: Optional[Dict[str, Any]] = None,
+ ) -> None:
+
+ self.tokenizer = tokenizer
+
+ self.prompt_tokens = prompt_tokens
+ self.generation_tokens = generation_tokens
+ self.all_tokens = all_tokens
+
+ self.attribution_matrix = attribution_matrix.detach().cpu()
+ self.metadata = metadata
+
+ # normalize rows of a matrix to sum to 1
+ def normalize_sum_to_one(self, attribution_matrix) -> torch.Tensor:
+ # we use nans for visualization, but they must be removed (set to 0) for this function
+ attribution_no_nan = torch.nan_to_num(attribution_matrix, nan=0.0)
+ # we do not want to include negative attributions, clamp all to 0
+ attribution_no_nan = attribution_no_nan.clamp(0)
+ # first, normalize the rows of the attribution matrix to sum to one
+ attribution_row_sums = attribution_no_nan.sum(1, keepdim=True) + 1e-8
+ # perform normalization
+ return attribution_no_nan / attribution_row_sums
+
+ def remove_nan(self, attribution_matrix) -> torch.Tensor:
+ # we use nans for visualization, but they must be removed (set to 0) for this function
+ attribution_no_nan = torch.nan_to_num(attribution_matrix, nan=0.0)
+ # we do not want to include negative attributions, clamp all to 0
+ attribution_no_nan = attribution_no_nan.clamp(0)
+ return attribution_no_nan
+
+ # normalize the max of a vector to 1
+ def normalize_max(self, attribution_vector) -> torch.Tensor:
+ if attribution_vector.max() > 0:
+ attribution_vector = attribution_vector / attribution_vector.max()
+ elif attribution_vector.min() < 0:
+ attribution_vector = - attribution_vector / attribution_vector.min()
+
+ return attribution_vector
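The scaling rule above maps the largest positive value to 1, and otherwise maps the most negative value to -1 (note `-v / min == v / |min|` when `min < 0`). A pure-Python sketch of the same rule, with an explicit guard for the all-zero case:

```python
def normalize_max(values):
    """Scale so the largest positive value maps to 1; if all values are
    non-positive, scale so the most negative value maps to -1."""
    hi = max(values)
    if hi > 0:
        return [v / hi for v in values]
    lo = min(values)
    if lo < 0:
        # equivalent to -v / lo, since lo is negative
        return [v / abs(lo) for v in values]
    return list(values)  # all zeros: nothing to scale

a = normalize_max([1.0, 2.0, 4.0])
b = normalize_max([-2.0, -1.0, 0.0])
```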
+
+ ########################################## token attr ##########################################
+
+ def compute_CAGE_token_attr(
+ self,
+ token_to_explain: int,
+ *,
+ clear_values: bool = True,
+ token_attr_matrix_norm: Optional[torch.Tensor] = None,
+ ) -> torch.Tensor:
+ """Token-level CAGE-style recursive attribution over a token attribution matrix.
+
+ token_to_explain is a generation-token index in [0, G).
+ """
+ attr_norm = (
+ token_attr_matrix_norm
+ if token_attr_matrix_norm is not None
+ else self.normalize_sum_to_one(self.attribution_matrix)
+ )
+
+ prompt_len = len(self.prompt_tokens)
+ gen_len = len(self.generation_tokens)
+ expected_cols = prompt_len + gen_len
+ if attr_norm.ndim != 2 or attr_norm.shape[0] != gen_len or attr_norm.shape[1] != expected_cols:
+ raise TypeError(
+ "Expected token attribution matrix of shape [G, P+G] where "
+ f"G={gen_len} and P={prompt_len}, got {tuple(attr_norm.shape)}."
+ )
+ if token_to_explain < 0 or token_to_explain >= gen_len:
+ raise IndexError(f"token_to_explain out of range: {token_to_explain} not in [0, {gen_len}).")
+
+ r = attr_norm[token_to_explain, :].clone()
+ for k in range(token_to_explain - 1, -1, -1):
+ alpha = r[prompt_len + k]
+ if alpha != 0:
+ r += attr_norm[k, :] * alpha
+ if clear_values:
+ r[prompt_len + k] = 0
+ return r
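The recursion above can be illustrated on a tiny, row-normalized matrix in pure Python. A minimal sketch (the name `cage_propagate` is a hypothetical stand-in for the tensor version): whenever the running attribution `r` places mass on an earlier generation token `k`, that mass is redistributed along row `k`, so the total stays constant while mass drains out of intermediate generation tokens:

```python
def cage_propagate(attr, prompt_len, token_to_explain):
    """attr: list of G rows, each of length prompt_len + G, rows summing
    to 1. Fold generation-token mass back onto earlier tokens."""
    r = list(attr[token_to_explain])
    for k in range(token_to_explain - 1, -1, -1):
        alpha = r[prompt_len + k]
        if alpha:
            r = [ri + ak * alpha for ri, ak in zip(r, attr[k])]
            r[prompt_len + k] = 0.0  # clear the redistributed mass
    return r

# prompt_len = 2, gen_len = 2; rows are row-normalized [G, P+G]
attr = [
    [0.5, 0.5, 0.0, 0.0],  # gen token 0 attends to the prompt only
    [0.2, 0.3, 0.5, 0.0],  # gen token 1 attends to prompt + gen token 0
]
r = cage_propagate(attr, prompt_len=2, token_to_explain=1)
```

Here the 0.5 of mass on gen token 0 is redistributed along row 0, so the final vector concentrates entirely on the prompt while still summing to 1.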
+
+ def get_all_token_attrs(self, indices_to_explain: List[int]) -> Tuple[torch.Tensor, torch.Tensor, torch.Tensor]:
+ """Return token-level (seq, row, rec) attributions.
+
+ indices_to_explain must be a generation-token span [start_tok, end_tok]
+ (closed interval), typically pointing to the \\boxed{...} inner answer span.
+ """
+ if not isinstance(indices_to_explain, list) or len(indices_to_explain) != 2:
+ raise ValueError(
+ "indices_to_explain must be a token span [start_tok, end_tok] "
+ f"(got {indices_to_explain!r})."
+ )
+ start_tok, end_tok = indices_to_explain
+ if not isinstance(start_tok, int) or not isinstance(end_tok, int):
+ raise TypeError(f"indices_to_explain elements must be ints (got {indices_to_explain!r}).")
+ if start_tok < 0 or end_tok < 0 or start_tok > end_tok:
+ raise ValueError(f"Invalid token span: {indices_to_explain!r}.")
+
+ attr_norm = self.normalize_sum_to_one(self.attribution_matrix)
+ gen_len = int(attr_norm.shape[0])
+ if end_tok >= gen_len:
+ raise IndexError(f"end_tok out of range: {end_tok} >= G={gen_len}.")
+
+ seq_attr = attr_norm
+ row_attr = attr_norm[start_tok : end_tok + 1, :].sum(dim=0, keepdim=True)
+
+ rec_sum = torch.zeros_like(row_attr)
+ for t in range(start_tok, end_tok + 1):
+ rec_sum += self.compute_CAGE_token_attr(
+ t,
+ clear_values=True,
+ token_attr_matrix_norm=attr_norm,
+ ).reshape(1, -1)
+ rec_attr = rec_sum
+
+ return seq_attr, row_attr, rec_attr
+
+ ########################################## sentence attr ##########################################
+
+ # This converts any token attribution to a sentence attribution
+ def compute_sentence_attr(self, norm = True) -> None:
+ raise RuntimeError("Sentence-level aggregation has been removed; use token-level get_all_token_attrs().")
+
+ # Legacy implementation (kept for reference)
+ # create the prompt and generation sentences
+ self.prompt_sentences = create_sentences("".join(self.prompt_tokens), self.tokenizer)
+ self.generation_sentences = create_sentences("".join(self.generation_tokens), self.tokenizer)
+ self.all_sentences = self.prompt_sentences + self.generation_sentences
+
+ # create a mask that tracks the tokens used in each sentence of the generation
+ sentence_masks_generation = create_sentence_masks(self.generation_tokens, self.generation_sentences)
+ # create a mask that tracks the tokens used in each sentence of the prompt and the generation
+ sentence_masks_all = create_sentence_masks(self.prompt_tokens + self.generation_tokens, self.all_sentences)
+
+ num_inp_sent = len(self.prompt_sentences)
+ num_gen_sent = len(self.generation_sentences)
+ num_all_sent = len(self.all_sentences)
+
+ # Now we want to turn our attribution which is over tokens into an attribution over sentences
+ # attribution rows = gen sentences
+ # attribution columns = prompt sentences + gen sentences
+ self.sentence_attr = torch.full((num_gen_sent, num_all_sent), torch.nan)
+ for i in range(num_gen_sent):
+ # Select the rows (sentence) of the matrix which are attributed to the inputs (cols)
+ # A whole sentence is selected at once
+ row_indices = torch.where(sentence_masks_generation[i] == 1)[0]
+ attr_rows = self.attribution_matrix[row_indices, :]
+
+ for j in range(num_all_sent):
+ # we do not attribute a generation to itself or any
+ # future generations so we can skip those here
+ if j > i + num_inp_sent - 1:
+ continue
+
+ # now we select the columns
+ col_indices = torch.where(sentence_masks_all[j] == 1)[0]
+
+ # now select a whole sentence of cols from these rows
+ attr = attr_rows[:, col_indices]
+
+ # Find which of these indices are NaN
+ nan_mask = torch.isnan(attr)
+ # Replace NaNs with 0
+ attr[nan_mask] = 0.0
+
+ # take sum of this 2d attr and place it in the correct
+ # spot of the sentence attribution
+ self.sentence_attr[i, j] = torch.sum(attr)
+
+ if norm:
+ self.sentence_attr = self.normalize_sum_to_one(self.sentence_attr)
+ else:
+ self.sentence_attr = self.remove_nan(self.sentence_attr)
+
+ return
+
+ def plot_attr_table_sentence(self, height = None) -> None:
+ if self.sentence_attr is None:
+ print(
+ '''The sentence attribution has not been computed.
+ Call LLMAttributionResult.compute_sentence_attr() first.
+ '''
+ )
+ return
+
+ width = 15
+ wrapped_sentences_x = []
+ for sentence in self.all_sentences:
+ wrapped_sentences_x.append(textwrap.fill(sentence, width=width))
+ wrapped_sentences_y = []
+ max_num_lines = 0
+ for sentence in self.generation_sentences:
+ sentence = textwrap.fill(sentence, width=width)
+ num_lines = len(sentence.split("\n"))
+ max_num_lines = max(max_num_lines, num_lines)
+ wrapped_sentences_y.append(sentence)
+
+
+ fig_width = (len(self.all_sentences) * width / 10) if (len(self.all_sentences) * width / 10) > 10 else 10
+ if height is None:
+ fig_height = (len(self.generation_sentences) * max_num_lines / 8) if (len(self.generation_sentences) * max_num_lines / 8) > 8 else 8
+ else:
+ fig_height = height
+
+ fig, axs = plt.subplots(1, 1, figsize = (fig_width, fig_height))
+
+ # use a positive only heatmap cmap
+ if np.nanmin(self.sentence_attr) >= 0:
+ sns.heatmap(self.sentence_attr, annot=False, xticklabels=wrapped_sentences_x, yticklabels=wrapped_sentences_y, cmap="Blues", ax = axs)
+ # use a positive and negative heatmap cmap
+ else:
+ # set vmax vmin such that 0 is center value of color map
+ max_abs_attr_val = np.nanmax(self.sentence_attr.abs())
+ sns.heatmap(self.sentence_attr, annot=False, xticklabels=wrapped_sentences_x, yticklabels=wrapped_sentences_y, vmax=max_abs_attr_val, vmin=-max_abs_attr_val, cmap="Blues", ax = axs)
+
+ axs.tick_params(top=True, bottom=False, labeltop=True, labelbottom=False, labelsize=200 / len(self.all_sentences))
+ plt.yticks(rotation=45)
+ plt.xticks(rotation=45)
+ plt.show()
+
+ def plot_context_attr_sentence(self, title) -> None:
+ if self.sentence_attr is None:
+ print(
+ '''The sentence attribution has not been computed.
+ Call LLMAttributionResult.compute_sentence_attr() first.
+ '''
+ )
+ return
+
+ wrapped_sentences = []
+ width = 20
+ for sentence in self.prompt_sentences:
+ wrapped_sentences.append(textwrap.fill(sentence, width=width))
+
+ fig_width = len(wrapped_sentences) * (width / 10)
+ fig_height = len(wrapped_sentences) / 2 if len(wrapped_sentences) / 2 > 3 else 3
+
+ plt.figure(figsize=(fig_width, fig_height))
+ plt.bar(np.arange(len(wrapped_sentences)), self.normalize_max(torch.nansum(self.sentence_attr[:, :len(self.prompt_sentences)].cpu().detach(), dim = 0)))
+ plt.xticks(range(len(wrapped_sentences)), wrapped_sentences, rotation=0)
+ plt.ylabel("Influence")
+ plt.title(title)
+ plt.tight_layout()
+ plt.show()
+
+
+ def save_context_attr_sentence(self, prompt_sentences, path) -> None:
+ if self.sentence_attr is None:
+ print(
+ '''The sentence attribution has not been computed.
+ Call LLMAttributionResult.compute_sentence_attr() first.
+ '''
+ )
+ return
+
+ wrapped_sentences = []
+ width = 20
+ for sentence in prompt_sentences:
+ wrapped_sentences.append(textwrap.fill(sentence, width=width))
+
+ fig_width = len(wrapped_sentences) * (width / 10)
+ fig_height = len(wrapped_sentences) / 2 if len(wrapped_sentences) / 2 > 3 else 3
+
+ fig, axs = plt.subplots(1, 1, figsize = (fig_width, fig_height))
+ plt.bar(np.arange(len(wrapped_sentences)), self.normalize_max(torch.nansum(self.sentence_attr[:, :len(prompt_sentences)].cpu().detach(), dim = 0)))
+ plt.xticks(range(len(wrapped_sentences)), wrapped_sentences, rotation=0)
+ plt.ylabel("Influence")
+ plt.tight_layout()
+ plt.savefig(path + ".png", bbox_inches='tight', transparent = "False")
+ fig.clear()
+ plt.close(fig)
+
+
+ def draw_graph(self, cmap = plt.cm.Blues, wrap_width=20, thresh = 0.10, spacing = 4, arrow_mod = 1, rad = 0.3):
+ """
+ Simplified one-row attribution graph:
+ - All sentences (prompt + generation) drawn in one horizontal row
+ - Directed weighted edges: generation -> input
+ """
+
+ grad_array = self.sentence_attr
+ outputs = self.all_sentences
+ generated = self.generation_sentences
+
+ grad_array = grad_array.permute((1, 0)) # -> [outputs, generated]
+ attr_np = grad_array.cpu().numpy() if hasattr(grad_array, "cpu") else grad_array
+ attr_np = np.nan_to_num(attr_np, nan=0.0)
+
+ G = nx.DiGraph()
+ prompt_len = len(outputs) - len(generated)
+ n_gen = len(generated)
+
+ # Node ids
+ prompt_ids = [f"p_{i}" for i in range(prompt_len)]
+ gen_ids = [f"g_{j}" for j in range(n_gen)]
+
+ # Add nodes
+ def add_node(node_id, label, ntype):
+ wrapped = textwrap.fill(label, width=wrap_width)
+ wrap_height = len(wrapped.split('\n'))
+ G.add_node(node_id, label=wrapped, type=ntype)
+ return wrap_height
+
+ max_wrap_height = 0
+ for i in range(prompt_len):
+ wrap_height = add_node(prompt_ids[i], outputs[i], "prompt")
+ if wrap_height > max_wrap_height:
+ max_wrap_height = wrap_height
+ for j in range(n_gen):
+ wrap_height = add_node(gen_ids[j], generated[j], "generated")
+ if wrap_height > max_wrap_height:
+ max_wrap_height = wrap_height
+
+ def out_i_to_node(i):
+ return prompt_ids[i] if i < prompt_len else gen_ids[i - prompt_len]
+
+ # Add edges gen -> output
+ for j in range(n_gen):
+ src = gen_ids[j]
+ for i in range(len(outputs)):
+ w = attr_np[i, j] if (i < attr_np.shape[0] and j < attr_np.shape[1]) else 0.0
+ if w != 0.0:
+ G.add_edge(src, out_i_to_node(i), weight=w)
+
+
+ # --- layout: single row ---
+ y_row = 0.0
+ pos = {}
+ all_nodes = prompt_ids + gen_ids
+ for idx, nid in enumerate(all_nodes):
+ pos[nid] = (idx * spacing, y_row)
+
+ # --- figure ---
+ ncols = len(all_nodes)
+ fig_width = max(10, ncols * (spacing * 0.6))
+ fig, ax = plt.subplots(figsize=(fig_width, 4), dpi = 300)
+
+ # prune edges
+ edges = list(G.edges(data=True))
+ weights = np.array([edata["weight"] for _, _, edata in edges])
+ if weights.size > 0:
+ threshold = thresh * np.max(np.abs(weights)) # keep edges with |weight| >= thresh * max |weight|
+ for (u, v, edata) in list(edges): # iterate over a copy
+ if abs(edata["weight"]) < threshold:
+ G.remove_edge(u, v)
+
+ # visualization
+ edges = G.edges(data=True) # refresh edges after pruning
+ weights = np.array([edata["weight"] for _, _, edata in edges])
+ if weights.size == 0:
+ weights = np.array([1]) # fallback if everything pruned
+ max_w = np.max(np.abs(weights))
+ norm = mpl.colors.TwoSlopeNorm(vmin=-max_w, vcenter=0, vmax=max_w) \
+ if np.min(weights) < 0 else mpl.colors.Normalize(vmin=0, vmax=max_w)
+
+ # Draw nodes (larger font + padding)
+ for nid, (x, y) in pos.items():
+ lbl = G.nodes[nid]["label"]
+ ntype = G.nodes[nid]["type"]
+ box_color = "#d4c1ffc8" if ntype == "prompt" else "#cfffcc" #cfe8ff
+ ax.annotate(
+ lbl, xy=(x, y), xytext=(x, y),
+ ha="center", va="center", fontsize=12, zorder=3,
+ bbox=dict(boxstyle="round,pad=0.6", facecolor=box_color,
+ edgecolor="gray", linewidth=1.2, alpha=1),
+ )
+
+ box_height = max_wrap_height / 4
+ # Draw edges with curved arrows
+ for (u, v, edata) in edges:
+ x1, y1 = pos[u]
+ x2, y2 = pos[v]
+
+ start = (x1, y1 - box_height)
+ end = (x2, y2 - box_height)
+
+ w = edata["weight"]
+ color = cmap(norm(w))
+ width = (1.5 * arrow_mod) + 5.0 * (abs(w) / max_w)
+
+ arrow_rad = rad if x1 <= x2 else -rad
+ arrow = FancyArrowPatch(
+ (start), (end),
+ connectionstyle=f"arc3,rad={arrow_rad}",
+ # arrowstyle=f"-|>,head_length={2*arrow_mod},head_width={arrow_mod}",
+ arrowstyle=f"<|-,head_length={2*arrow_mod},head_width={arrow_mod}",
+ linewidth=width, color=color, alpha=1, zorder=2,
+ shrinkA=16, shrinkB=16, mutation_scale=12,
+ clip_on=False
+ )
+ ax.add_patch(arrow)
+
+ ax.set_xlim(-spacing, (ncols - 1) * spacing + spacing)
+ ax.set_ylim(-3, 3)
+ ax.axis("off")
+
+ plt.tight_layout()
+ plt.show()
+ plt.close(fig)
+
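The edge-pruning step in `draw_graph` keeps only edges whose absolute weight reaches a fraction `thresh` of the maximum. A standalone sketch of the same rule on plain `(u, v, weight)` triples (hypothetical helper, shown for illustration):

```python
def prune_edges(edges, thresh):
    """edges: list of (u, v, weight); drop any edge whose |weight| falls
    below thresh * max |weight| over all edges."""
    if not edges:
        return []
    cutoff = thresh * max(abs(w) for _, _, w in edges)
    return [(u, v, w) for (u, v, w) in edges if abs(w) >= cutoff]

kept = prune_edges([("g_0", "p_0", 1.0), ("g_0", "p_1", 0.05)], thresh=0.10)
```

With `thresh=0.10` the 0.05 edge falls below 10% of the maximum weight and is dropped, matching the in-place pruning done on the `networkx` graph above.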
+
+ def save_graph(self, all_sentences, generation_sentences, path, cmap = plt.cm.Blues, wrap_width=20, thresh = 0.10, spacing = 4, arrow_mod = 1, rad = 0.3):
+ """
+ Simplified one-row attribution graph:
+ - All tokens (prompts + generations) drawn in one horizontal row
+ - Directed weighted edges: generation -> input
+ """
+
+ grad_array = self.sentence_attr
+ outputs = all_sentences
+ generated = generation_sentences
+
+ grad_array = grad_array.permute((1, 0)) # -> [outputs, generated]
+ attr_np = grad_array.detach().cpu().numpy() if hasattr(grad_array, "cpu") else grad_array
+ attr_np = np.nan_to_num(attr_np, nan=0.0)
+
+ G = nx.DiGraph()
+ prompt_len = len(outputs) - len(generated)
+ n_gen = len(generated)
+
+ # Node ids
+ prompt_ids = [f"p_{i}" for i in range(prompt_len)]
+ gen_ids = [f"g_{j}" for j in range(n_gen)]
+
+ # Add nodes
+ def add_node(node_id, label, ntype):
+ wrapped = textwrap.fill(label, width=wrap_width)
+ wrap_height = len(wrapped.split('\n'))
+ G.add_node(node_id, label=wrapped, type=ntype)
+ return wrap_height
+
+ max_wrap_height = 0
+ for i in range(prompt_len):
+ wrap_height = add_node(prompt_ids[i], outputs[i], "prompt")
+ if wrap_height > max_wrap_height:
+ max_wrap_height = wrap_height
+ for j in range(n_gen):
+ wrap_height = add_node(gen_ids[j], generated[j], "generated")
+ if wrap_height > max_wrap_height:
+ max_wrap_height = wrap_height
+
+ def out_i_to_node(i):
+ return prompt_ids[i] if i < prompt_len else gen_ids[i - prompt_len]
+
+ # Add edges gen -> output
+ for j in range(n_gen):
+ src = gen_ids[j]
+ for i in range(len(outputs)):
+ w = attr_np[i, j] if (i < attr_np.shape[0] and j < attr_np.shape[1]) else 0.0
+ if w != 0.0:
+ G.add_edge(src, out_i_to_node(i), weight=w)
+
+
+ # --- layout: single row ---
+ y_row = 0.0
+ pos = {}
+ all_nodes = prompt_ids + gen_ids
+ for idx, nid in enumerate(all_nodes):
+ pos[nid] = (idx * spacing, y_row)
+
+ # --- figure ---
+ ncols = len(all_nodes)
+ fig_width = max(10, ncols * (spacing * 0.6))
+ fig, ax = plt.subplots(figsize=(fig_width, 4), dpi = 300)
+
+ # prune edges
+ edges = list(G.edges(data=True))
+ weights = np.array([edata["weight"] for _, _, edata in edges])
+ if weights.size > 0:
+ threshold = thresh * np.max(np.abs(weights)) # keep edges >= thresh * max |weight|
+ for (u, v, edata) in list(edges): # iterate over a copy
+ if abs(edata["weight"]) < threshold:
+ G.remove_edge(u, v)
+
+ # visualization
+ edges = G.edges(data=True) # refresh edges after pruning
+ weights = np.array([edata["weight"] for _, _, edata in edges])
+ if weights.size == 0:
+ weights = np.array([1]) # fallback if everything pruned
+ max_w = np.max(np.abs(weights))
+ norm = mpl.colors.TwoSlopeNorm(vmin=-max_w, vcenter=0, vmax=max_w) \
+ if np.min(weights) < 0 else mpl.colors.Normalize(vmin=0, vmax=max_w)
+
+ # Draw nodes (larger font + padding)
+ for nid, (x, y) in pos.items():
+ lbl = G.nodes[nid]["label"]
+ ntype = G.nodes[nid]["type"]
+ box_color = "#d4c1ffc8" if ntype == "prompt" else "#cfffcc"
+ ax.annotate(
+ lbl, xy=(x, y), xytext=(x, y),
+ ha="center", va="center", fontsize=12, zorder=3,
+ bbox=dict(boxstyle="round,pad=0.6", facecolor=box_color,
+ edgecolor="gray", linewidth=1.2, alpha=1),
+ )
+
+ box_height = max_wrap_height / 4
+ # Draw edges with curved arrows
+ for (u, v, edata) in edges:
+ x1, y1 = pos[u]
+ x2, y2 = pos[v]
+
+ start = (x1, y1 - box_height)
+ end = (x2, y2 - box_height)
+
+ w = edata["weight"]
+ color = cmap(norm(w))
+ width = (1.5 * arrow_mod) + 5.0 * (abs(w) / max_w)
+
+ arrow_rad = rad if x1 <= x2 else -rad
+ arrow = FancyArrowPatch(
+ (start), (end),
+ connectionstyle=f"arc3,rad={arrow_rad}",
+ # arrowstyle=f"-|>,head_length={2*arrow_mod},head_width={arrow_mod}",
+ arrowstyle=f"<|-,head_length={2*arrow_mod},head_width={arrow_mod}",
+ linewidth=width, color=color, alpha=1, zorder=2,
+ shrinkA=16, shrinkB=16, mutation_scale=12,
+ clip_on=False
+ )
+ ax.add_patch(arrow)
+
+ ax.set_xlim(-spacing, (ncols - 1) * spacing + spacing)
+ ax.set_ylim(-3, 3)
+ ax.axis("off")
+
+ plt.tight_layout()
+ plt.savefig(path + ".png", dpi=500, transparent=False)
+ fig.clear()
+ plt.close(fig)
+
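Both renderers prune edges whose magnitude falls below a fraction `thresh` of the largest edge weight. A standalone sketch of that rule:

```python
# Keep only edges whose |weight| is at least `thresh` times the largest |weight|.
weights = [0.9, -0.5, 0.04, 0.02]
thresh = 0.10
cutoff = thresh * max(abs(w) for w in weights)  # 0.09 here
kept = [w for w in weights if abs(w) >= cutoff]
print(kept)  # [0.9, -0.5]
```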
+
+
+ ########################################## recursive sentence attr ##########################################
+
+ # this function is identical to compute_recursive_attr except for var names
+ # see that function for details
+ def compute_CAGE_sentence_attr(self, sentence_to_explain = -1, clear_values = True) -> None:
+ raise RuntimeError("Sentence-level CAGE has been removed; use token-level compute_CAGE_token_attr().")
+
+ # Legacy implementation (kept for reference)
+ if self.sentence_attr is None:
+ print(
+ '''The sentence attribution has not been computed.
+ Call LLMAttributionResult.compute_sentence_attr() first.
+ '''
+ )
+ return
+
+ if self.sentence_attr.shape[1] != len(self.all_sentences):
+ raise TypeError(
+ """This attribution object is of shape [generations, prompt].
+ This function only operates on attributions of shape
+ [generations, prompt + generations]"""
+ )
+
+ self.CAGE_sentence_attr = self.sentence_attr[sentence_to_explain].clone()
+ gen_row_indices_to_collapse = list(range(0, len(self.generation_sentences[:sentence_to_explain])))[::-1]
+ prompt_sentences_length = len(self.prompt_sentences)
+ for index in gen_row_indices_to_collapse:
+ biased_row = self.sentence_attr[index] * self.CAGE_sentence_attr[prompt_sentences_length + index]
+ if clear_values:
+ self.CAGE_sentence_attr[prompt_sentences_length + index] = 0
+ self.CAGE_sentence_attr += biased_row
+
+ return
+
+
+ ########################################## Multi Sentence Attr ##########################################
+
+ # this function returns a tuple containing the sentence attribution matrix,
+ # the sum of the indices_to_explain rows of that matrix, and a CAGE attribution over the indices_to_explain
+ def get_all_sentence_attrs(self, indices_to_explain) -> tuple:
+ raise RuntimeError("Sentence-level attribution outputs have been removed; use get_all_token_attrs([start_tok, end_tok]).")
+
+ # Legacy implementation (kept for reference)
+ self.compute_sentence_attr(norm = True)
+
+ attr = self.sentence_attr
+
+ row_attr = 0
+ for index in indices_to_explain:
+ row_attr += attr[index, :]
+ row_attr = row_attr.reshape(1, -1)
+
+ rec_attr = 0
+ for index in indices_to_explain:
+ self.compute_CAGE_sentence_attr(index)
+ rec_attr += self.CAGE_sentence_attr
+ rec_attr = rec_attr.reshape(1, -1)
+
+ return attr, row_attr, rec_attr
+
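The legacy CAGE collapse above folds an earlier generated sentence's attribution row back into the row being explained, weighted by how much the explained row relied on that sentence, then zeroes the direct reliance. A toy sketch with two prompt sentences and two generated sentences (the numbers are made up):

```python
# attr rows are generated sentences; columns are [p0, p1, g0]
attr = [
    [0.7, 0.3, 0.0],  # g0 attributes to the prompt only
    [0.2, 0.3, 0.5],  # g1 relies on g0 with weight 0.5
]
n_prompt = 2
cage = list(attr[1])                       # explain g1
reliance = cage[n_prompt + 0]              # g1's reliance on g0
bias = [w * reliance for w in attr[0]]     # redistribute g0's own row
cage[n_prompt + 0] = 0.0                   # clear the direct reliance
cage = [c + b for c, b in zip(cage, bias)]
print([round(v, 2) for v in cage])  # [0.55, 0.45, 0.0]
```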
+class LLMBasicAttribution(LLMAttribution):
+ def __init__(self, model, tokenizer, language: str = "en") -> None:
+ super().__init__(model, tokenizer)
+ self.zipf_language = language
+
+ def calculate_basic_attribution(self, prompt: str, target: Optional[str] = None) -> LLMAttributionResult:
+ if target is None:
+ self.response(prompt)
+ else:
+ self.target_response(prompt, target)
+
+ prompt_length = len(self.user_prompt_tokens)
+ generation_length = len(self.generation_tokens)
+ total_length = prompt_length + generation_length
+
+ score_array = torch.zeros((generation_length, total_length), dtype=torch.float32)
+
+ if generation_length == 0:
+ all_tokens = self.user_prompt_tokens + self.generation_tokens
+ return LLMAttributionResult(
+ self.tokenizer,
+ score_array,
+ self.user_prompt_tokens,
+ self.generation_tokens,
+ all_tokens=all_tokens,
+ )
+
+ if generation_length > 0 and prompt_length > 0:
+ normalized_prompt_tokens = [token.strip() for token in self.user_prompt_tokens]
+
+ for gen_idx, gen_token in enumerate(self.generation_tokens):
+ normalized_gen_token = gen_token.strip()
+
+ if not normalized_gen_token:
+ continue
+
+ weight = float(zipf_frequency(normalized_gen_token, self.zipf_language))
+ if weight <= 0.0:
+ continue
+
+ for prompt_idx, prompt_token in enumerate(normalized_prompt_tokens):
+ if prompt_token == normalized_gen_token:
+ score_array[gen_idx, prompt_idx] = weight
+
+ row_sums = score_array.sum(dim=1, keepdim=True)
+ nonzero_rows = row_sums.squeeze(1) > 0
+ if torch.any(nonzero_rows):
+ score_array[nonzero_rows] = score_array[nonzero_rows] / row_sums[nonzero_rows]
+
+ all_tokens = self.user_prompt_tokens + self.generation_tokens
+
+ return LLMAttributionResult(
+ self.tokenizer,
+ score_array,
+ self.user_prompt_tokens,
+ self.generation_tokens,
+ all_tokens=all_tokens,
+ )
+
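`calculate_basic_attribution` scores exact string matches between generated and prompt tokens by word frequency, then normalizes each nonzero row to sum to 1. A dependency-free sketch with hypothetical zipf-like weights standing in for `wordfreq.zipf_frequency`:

```python
prompt_toks = ["the", "cat", "sat"]
gen_toks = ["cat", "sat"]
zipf = {"cat": 4.5, "sat": 4.0}  # assumed zipf-like frequencies, for illustration

# Score each (generated, prompt) token pair by frequency on exact match
scores = [[zipf.get(g, 0.0) if g == p else 0.0 for p in prompt_toks] for g in gen_toks]
# Row-normalize nonzero rows so each generated token's matches sum to 1
scores = [[v / s for v in row] if (s := sum(row)) > 0 else row for row in scores]
print(scores)  # [[0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
```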
+
+class LLMIFRAttribution(LLMAttribution):
+ """Information Flow Routes attribution integrated with the LLMAttribution API."""
+
+ def __init__(
+ self,
+ model,
+ tokenizer,
+ generate_kwargs: Optional[Dict[str, Any]] = None,
+ *,
+ chunk_tokens: int = 128,
+ sink_chunk_tokens: int = 32,
+ renorm_threshold_default: float = 0.0,
+ show_progress: bool = True,
+ recompute_attention: bool = False,
+ use_chat_template: bool = False,
+ ) -> None:
+ super().__init__(model, tokenizer, generate_kwargs, use_chat_template=use_chat_template)
+ self.chunk_tokens = int(chunk_tokens)
+ self.sink_chunk_tokens = int(sink_chunk_tokens)
+ self.renorm_threshold_default = float(renorm_threshold_default)
+ self.show_progress = bool(show_progress)
+ self.recompute_attention = bool(recompute_attention)
+
+ @property
+ def _model_dtype(self) -> torch.dtype:
+ return next(self.model.parameters()).dtype
+
+ def _ensure_generation(self, prompt: str, target: Optional[str]) -> Tuple[torch.Tensor, torch.Tensor, int, int]:
+ if target is None:
+ self.response(prompt)
+ else:
+ self.target_response(prompt, target)
+
+ prompt_len = int(self.prompt_ids.shape[1])
+ gen_len = int(self.generation_ids.shape[1])
+ input_ids_all = torch.cat([self.prompt_ids, self.generation_ids], dim=1).to(self.device)
+ attention_mask = torch.ones_like(input_ids_all)
+ return input_ids_all, attention_mask, prompt_len, gen_len
+
+ def _capture_model_state(
+ self,
+ input_ids: torch.Tensor,
+ attention_mask: torch.Tensor,
+ recompute_attention: bool = False,
+ ) -> Tuple[Dict[str, List[Optional[torch.Tensor]]], Optional[Sequence[torch.Tensor]], ModelMetadata, List[Dict[str, torch.Tensor | nn.Module]]]:
+ metadata = extract_model_metadata(self.model)
+ cache, hooks = attach_hooks(metadata.layers, self._model_dtype)
+
+ try:
+ outputs = self.model(
+ input_ids=input_ids,
+ attention_mask=attention_mask,
+ use_cache=False,
+ output_attentions=not recompute_attention,
+ output_hidden_states=False,
+ return_dict=True,
+ )
+ finally:
+ for handle in hooks:
+ try:
+ handle.remove()
+ except Exception:
+ pass
+
+ attentions = None if recompute_attention else outputs.attentions
+ weight_pack = build_weight_pack(metadata, self._model_dtype)
+ return cache, attentions, metadata, weight_pack
+
+ def _build_ifr_params(self, metadata: ModelMetadata, sequence_length: int) -> IFRParameters:
+ return IFRParameters(
+ n_layers=metadata.n_layers,
+ n_heads_q=metadata.n_heads_q,
+ n_kv_heads=metadata.n_kv_heads,
+ head_dim=metadata.head_dim,
+ group_size=metadata.group_size,
+ d_model=metadata.d_model,
+ sequence_length=sequence_length,
+ model_dtype=self._model_dtype,
+ chunk_tokens=self.chunk_tokens,
+ sink_chunk_tokens=self.sink_chunk_tokens,
+ )
+
+ def _finalize_result(self, score_array: torch.Tensor, metadata: Optional[Dict[str, Any]] = None) -> LLMAttributionResult:
+ if score_array.ndim == 1:
+ score_array = score_array.unsqueeze(0)
+ score_array = score_array.detach().cpu()
+
+ score_array = self.extract_user_prompt_attributions(self.prompt_tokens, score_array)
+ all_tokens = self.user_prompt_tokens + self.generation_tokens
+
+ return LLMAttributionResult(
+ self.tokenizer,
+ score_array,
+ self.user_prompt_tokens,
+ self.generation_tokens,
+ all_tokens=all_tokens,
+ metadata=metadata,
+ )
+
+ def _project_vector(self, vector: torch.Tensor) -> torch.Tensor:
+ matrix = vector.detach().cpu().view(1, -1)
+ projected = self.extract_user_prompt_attributions(self.prompt_tokens, matrix)
+ return projected[0]
+
+ @torch.no_grad()
+ def calculate_ifr_for_all_positions(
+ self,
+ prompt: str,
+ target: Optional[str] = None,
+ *,
+ renorm_threshold: Optional[float] = None,
+ ) -> LLMAttributionResult:
+ input_ids_all, attn_mask, prompt_len, gen_len = self._ensure_generation(prompt, target)
+ total_len = int(input_ids_all.shape[1])
+ if gen_len == 0:
+ empty = torch.zeros((0, total_len), dtype=torch.float32)
+ metadata = {
+ "ifr": {
+ "type": "all_positions",
+ "sink_indices": [],
+ "renorm_threshold": renorm_threshold,
+ "note": "No generation tokens; returning empty attribution matrix.",
+ }
+ }
+ return self._finalize_result(empty, metadata=metadata)
+
+ cache, attentions, metadata, weight_pack = self._capture_model_state(
+ input_ids_all, attn_mask, recompute_attention=self.recompute_attention,
+ )
+ params = self._build_ifr_params(metadata, total_len)
+ renorm = self.renorm_threshold_default if renorm_threshold is None else float(renorm_threshold)
+
+ sink_range = (prompt_len, prompt_len + gen_len - 1)
+ all_positions = compute_ifr_for_all_positions(
+ cache=cache,
+ attentions=attentions,
+ weight_pack=weight_pack,
+ params=params,
+ renorm_threshold=renorm,
+ sink_range=sink_range,
+ return_layerwise=False,
+ rotary_emb=metadata.rotary_emb,
+ )
+
+ meta = {
+ "ifr": {
+ "type": "all_positions",
+ "sink_indices": all_positions.sink_indices,
+ "renorm_threshold": renorm,
+ }
+ }
+ return self._finalize_result(all_positions.token_importance_matrix, metadata=meta)
+
+ @torch.no_grad()
+ def calculate_ifr_for_all_positions_output_only(
+ self,
+ prompt: str,
+ target: Optional[str] = None,
+ *,
+ sink_span: Optional[Tuple[int, int]] = None,
+ renorm_threshold: Optional[float] = None,
+ ) -> LLMAttributionResult:
+ """Compute IFR for sink positions restricted to an output span.
+
+ This mirrors calculate_ifr_for_all_positions but only computes per-token IFR
+ rows for sink positions in sink_span (generation-token indices). All other
+ generation rows are left as NaN (becoming 0 after normalization).
+ """
+ input_ids_all, attn_mask, prompt_len, gen_len = self._ensure_generation(prompt, target)
+ total_len = int(input_ids_all.shape[1])
+ if gen_len == 0:
+ empty = torch.zeros((0, total_len), dtype=torch.float32)
+ metadata = {
+ "ifr": {
+ "type": "all_positions_output_only",
+ "sink_span_generation": None,
+ "sink_indices": [],
+ "renorm_threshold": renorm_threshold,
+ "note": "No generation tokens; returning empty attribution matrix.",
+ }
+ }
+ return self._finalize_result(empty, metadata=metadata)
+
+ note = ""
+ if sink_span is None:
+ sink_span = (0, gen_len - 1)
+ note = "sink_span not provided; fell back to full generation."
+ span_start, span_end = sink_span
+ if span_start < 0 or span_end < span_start or span_end >= gen_len:
+ raise ValueError(f"Invalid sink_span ({span_start}, {span_end}) for generation length {gen_len}.")
+
+ sink_start_abs = prompt_len + span_start
+ sink_end_abs = prompt_len + span_end
+
+ cache, attentions, metadata, weight_pack = self._capture_model_state(
+ input_ids_all, attn_mask, recompute_attention=self.recompute_attention,
+ )
+ params = self._build_ifr_params(metadata, total_len)
+ renorm = self.renorm_threshold_default if renorm_threshold is None else float(renorm_threshold)
+
+ sink_range = (sink_start_abs, sink_end_abs)
+ all_positions = compute_ifr_for_all_positions(
+ cache=cache,
+ attentions=attentions,
+ weight_pack=weight_pack,
+ params=params,
+ renorm_threshold=renorm,
+ sink_range=sink_range,
+ return_layerwise=False,
+ rotary_emb=metadata.rotary_emb,
+ )
+
+ score_array = torch.full((gen_len, total_len), torch.nan, dtype=torch.float32)
+ score_array[span_start : span_end + 1, :] = all_positions.token_importance_matrix
+
+ meta = {
+ "ifr": {
+ "type": "all_positions_output_only",
+ "sink_span_generation": (span_start, span_end),
+ "sink_span_absolute": (sink_start_abs, sink_end_abs),
+ "sink_indices": all_positions.sink_indices,
+ "renorm_threshold": renorm,
+ "note": note,
+ }
+ }
+ return self._finalize_result(score_array, metadata=meta)
+
+ @torch.no_grad()
+ def calculate_ifr_span(
+ self,
+ prompt: str,
+ target: Optional[str] = None,
+ *,
+ span: Optional[Tuple[int, int]] = None,
+ renorm_threshold: Optional[float] = None,
+ ) -> LLMAttributionResult:
+ input_ids_all, attn_mask, prompt_len, gen_len = self._ensure_generation(prompt, target)
+ total_len = int(input_ids_all.shape[1])
+
+ if gen_len == 0:
+ empty = torch.zeros((0, total_len), dtype=torch.float32)
+ metadata = {
+ "ifr": {
+ "type": "sentence_aggregate",
+ "sink_span_generation": None,
+ "renorm_threshold": renorm_threshold,
+ "note": "No generation tokens; returning empty attribution matrix.",
+ }
+ }
+ return self._finalize_result(empty, metadata=metadata)
+
+ span_start, span_end = span if span is not None else (0, gen_len - 1)
+ if span_start < 0 or span_end < span_start or span_end >= gen_len:
+ raise ValueError(
+ f"Invalid span ({span_start}, {span_end}) for generation length {gen_len}."
+ )
+
+ sink_start_abs = prompt_len + span_start
+ sink_end_abs = prompt_len + span_end
+
+ cache, attentions, metadata, weight_pack = self._capture_model_state(
+ input_ids_all, attn_mask, recompute_attention=self.recompute_attention,
+ )
+ params = self._build_ifr_params(metadata, total_len)
+ renorm = self.renorm_threshold_default if renorm_threshold is None else float(renorm_threshold)
+
+ aggregate = compute_ifr_sentence_aggregate(
+ sink_start=sink_start_abs,
+ sink_end=sink_end_abs,
+ cache=cache,
+ attentions=attentions,
+ weight_pack=weight_pack,
+ params=params,
+ renorm_threshold=renorm,
+ rotary_emb=metadata.rotary_emb,
+ )
+
+ score_array = torch.full((gen_len, total_len), torch.nan, dtype=torch.float32)
+ for offset in range(span_start, span_end + 1):
+ score_array[offset] = aggregate.token_importance_total
+
+ meta = {
+ "ifr": {
+ "type": "sentence_aggregate",
+ "sink_span_generation": (span_start, span_end),
+ "sink_span_absolute": (sink_start_abs, sink_end_abs),
+ "renorm_threshold": renorm,
+ "aggregate": aggregate,
+ }
+ }
+ return self._finalize_result(score_array, metadata=meta)
+
+ @torch.no_grad()
+ def calculate_ifr_multi_hop(
+ self,
+ prompt: str,
+ target: Optional[str] = None,
+ *,
+ sink_span: Optional[Tuple[int, int]] = None,
+ thinking_span: Optional[Tuple[int, int]] = None,
+ n_hops: int = 1,
+ renorm_threshold: Optional[float] = None,
+ observation_mask: Optional[torch.Tensor | Sequence[float]] = None,
+ ) -> LLMAttributionResult:
+ input_ids_all, attn_mask, prompt_len, gen_len = self._ensure_generation(prompt, target)
+ total_len = int(input_ids_all.shape[1])
+
+ if gen_len == 0:
+ empty = torch.zeros((0, total_len), dtype=torch.float32)
+ metadata = {
+ "ifr": {
+ "type": "multi_hop",
+ "sink_span_generation": sink_span,
+ "thinking_span_generation": thinking_span,
+ "renorm_threshold": renorm_threshold,
+ "note": "No generation tokens; returning empty attribution matrix.",
+ }
+ }
+ return self._finalize_result(empty, metadata=metadata)
+
+ if sink_span is None:
+ sink_span = (0, gen_len - 1)
+ span_start, span_end = sink_span
+ if span_start < 0 or span_end < span_start or span_end >= gen_len:
+ raise ValueError(
+ f"Invalid sink_span ({span_start}, {span_end}) for generation length {gen_len}."
+ )
+ if thinking_span is None:
+ thinking_span = sink_span
+ think_start, think_end = thinking_span
+ if think_start < 0 or think_end < think_start or think_end >= gen_len:
+ raise ValueError(
+ f"Invalid thinking_span ({think_start}, {think_end}) for generation length {gen_len}."
+ )
+
+ sink_start_abs = prompt_len + span_start
+ sink_end_abs = prompt_len + span_end
+ think_start_abs = prompt_len + think_start
+ think_end_abs = prompt_len + think_end
+
+ obs_mask_tensor: Optional[torch.Tensor] = None
+ if observation_mask is not None:
+ obs_mask_tensor = torch.as_tensor(observation_mask, dtype=torch.float32)
+ if obs_mask_tensor.ndim != 1:
+ raise ValueError("observation_mask must be a 1D tensor or sequence.")
+ if obs_mask_tensor.numel() == gen_len:
+ mask_full = torch.zeros(total_len, dtype=torch.float32)
+ mask_full[prompt_len : prompt_len + gen_len] = obs_mask_tensor
+ obs_mask_tensor = mask_full
+ elif obs_mask_tensor.numel() != total_len:
+ raise ValueError(
+ f"observation_mask must have length {gen_len} (generation) or {total_len} (full sequence)."
+ )
+
+ cache, attentions, metadata, weight_pack = self._capture_model_state(
+ input_ids_all, attn_mask, recompute_attention=self.recompute_attention,
+ )
+ params = self._build_ifr_params(metadata, total_len)
+ renorm = self.renorm_threshold_default if renorm_threshold is None else float(renorm_threshold)
+
+ multi_hop = compute_multi_hop_ifr(
+ sink_start=sink_start_abs,
+ sink_end=sink_end_abs,
+ thinking_span=(think_start_abs, think_end_abs),
+ n_hops=int(n_hops),
+ cache=cache,
+ attentions=attentions,
+ weight_pack=weight_pack,
+ params=params,
+ renorm_threshold=renorm,
+ observation_mask=obs_mask_tensor,
+ rotary_emb=metadata.rotary_emb,
+ )
+
+ eval_vector = multi_hop.observation["sum"]
+ score_array = torch.full((gen_len, total_len), torch.nan, dtype=torch.float32)
+ for offset in range(span_start, span_end + 1):
+ score_array[offset] = eval_vector
+
+ projected_per_hop = [
+ self._project_vector(result.token_importance_total) for result in multi_hop.raw_attributions
+ ]
+ obs = multi_hop.observation
+ observation_projected = {
+ "mask": self.extract_user_prompt_attributions(
+ self.prompt_tokens, obs["mask"].view(1, -1)
+ )[0],
+ "base": self._project_vector(obs["base"]),
+ "sum": self._project_vector(obs["sum"]),
+ "avg": self._project_vector(obs["avg"]),
+ "per_hop": [self._project_vector(vec) for vec in obs["per_hop"]],
+ }
+
+ meta: Dict[str, Any] = {
+ "ifr": {
+ "type": "multi_hop",
+ "sink_span_generation": (span_start, span_end),
+ "sink_span_absolute": (sink_start_abs, sink_end_abs),
+ "thinking_span_generation": (think_start, think_end),
+ "thinking_span_absolute": (think_start_abs, think_end_abs),
+ "renorm_threshold": renorm,
+ "n_hops": int(n_hops),
+ "thinking_ratios": multi_hop.thinking_ratios,
+ "per_hop_projected": projected_per_hop,
+ "observation_projected": observation_projected,
+ "raw": multi_hop,
+ }
+ }
+
+ return self._finalize_result(score_array, metadata=meta)
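`calculate_ifr_multi_hop` accepts an `observation_mask` of either generation length or full-sequence length; a generation-length mask is zero-padded over the prompt. A minimal sketch of that expansion:

```python
prompt_len, gen_len = 3, 2
obs = [1.0, 0.0]                       # generation-length mask
full = [0.0] * (prompt_len + gen_len)  # full-sequence mask, prompt positions zeroed
full[prompt_len : prompt_len + gen_len] = obs
print(full)  # [0.0, 0.0, 0.0, 1.0, 0.0]
```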
+
+class LLMGradientAttribtion(LLMAttribution):
+ def __init__(self, model, tokenizer):
+ super().__init__(model, tokenizer)
+
+ # if captum version = True, interpolation only performed over prompt tokens
+ # else interpolation over prompt tokens and all intermediate generations
+ def calculate_IG_per_generation(self, prompt, steps, baseline, batch_size = 1, captum_version = False, target = None) -> LLMAttributionResult:
+ # run the model so we can access the input ids and generated token ids
+ if target is None:
+ self.response(prompt)
+ else:
+ self.target_response(prompt, target)
+
+ # Make a copy of the input ids
+ # We will expand the original prompt by each generated token
+ input_ids_all = self.prompt_ids.clone()
+
+ # we want to know how many input, generated, and total tokens there are
+ input_length = self.prompt_ids.shape[1]
+ generation_length = self.generation_ids.shape[1]
+ total_length = input_length + generation_length
+
+ # instantiate a matrix which will track the attribution of every generated token to the input
+ # cols = total_length because we will capture generation -> previous generation attributions
+ score_array = torch.empty((generation_length, total_length))
+
+ # grads must be measured to the embedding layer
+ embedding_layer = self.model.get_input_embeddings()
+
+ # check batch size
+ batch_size = min(batch_size, steps)
+
+ # create alphas and trapezoidal-rule step sizes for the path integral estimate
+ # (with `steps` points on [0, 1], the spacing is 1 / (steps - 1); endpoints are halved)
+ alphas = torch.linspace(0, 1, steps, dtype=torch.float32).to(self.device)
+ step_sizes = torch.full_like(alphas, 1 / max(steps - 1, 1)).to(self.device)
+ step_sizes[0] /= 2
+ step_sizes[-1] /= 2
+
+ # this is used for precision casting in case the model is not loaded in fp32
+ model_dtype = next(self.model.parameters()).dtype
+
+ # we calculate the gradients of predicting self.generation_ids[step]
+ # by updating the input to be prompt + self.generation_ids[:step]
+ # for step in tqdm(range(generation_length)):
+ for step in range(generation_length):
+ # take inputs off of the graph to avoid gradient accumulation across steps
+ input_ids_all = input_ids_all.detach()
+
+ # Capture the input embeddings and force require grad
+ input_embeds_orig = embedding_layer(input_ids_all).float()
+ # The baseline value is a token id. Commonly employed as 0 (for llama that is the token '!')
+ # also used is tokenizer.eos_token_id or tokenizer.pad_token_id
+ baseline_embeds = embedding_layer(torch.full_like(input_ids_all, baseline)).float()
+
+ # set target as next known generated token
+ target_token = self.generation_ids[0, step].item()
+
+ # # Make a tensor to store the gradients over all IG steps
+ # # each individual gradient will be [batch_size, seq_len, embedding_dim]
+ # IG_grads = torch.zeros((steps, input_embeds_orig.shape[1], input_embeds_orig.shape[2])).to(self.device)
+
+ # Make a tensor to store the sum of the gradients across the IG steps
+ IG_sum = torch.zeros(input_embeds_orig.shape[1], input_embeds_orig.shape[2], device=self.device)
+
+ # perform IG (gradients of interpolated inputs)
+ for batch_start in range(0, steps, batch_size):
+ # grab a batch of alphas and step sizes
+ batch_end = min(batch_start + batch_size, steps)
+ alphas_batch = alphas[batch_start : batch_end].view(-1, 1, 1).float()
+ step_sizes_batch = step_sizes[batch_start : batch_end].view(-1, 1, 1)
+
+ # interpolate the batch of embeddings
+ # captum does not interpolate over the current generated tokens
+ # as a result, the generation -> generation gradients are mostly ignored
+ if captum_version:
+ scaled_embeds_batch = baseline_embeds[:, :input_length] + alphas_batch * (input_embeds_orig[:, :input_length] - baseline_embeds[:, :input_length])
+ input_embeds_batch = input_embeds_orig.detach().clone().repeat(batch_end - batch_start, 1, 1)
+ input_embeds_batch[:, :input_length] = scaled_embeds_batch
+ # We do interpolate over the prompt and current generation
+ # This allows generation -> generation attributions to be captured
+ else:
+ input_embeds_batch = baseline_embeds + alphas_batch * (input_embeds_orig - baseline_embeds) # [batch_size, seq_len, embedding_dim]
+
+ # set requires grad on input embeds
+ input_embeds_batch = input_embeds_batch.to(model_dtype).detach().clone().requires_grad_(True)
+ # perform inference
+ logits = self.model(inputs_embeds=input_embeds_batch).logits # [batch_size, seq_len, vocab_size]
+ # evaluate the log-probability of the target token's generation
+ probs = torch.nn.functional.log_softmax(logits[:, -1, :], dim=-1) # [batch_size, vocab_size]
+ losses = probs[:, target_token] # [batch_size]
+
+ # clear grads
+ self.model.zero_grad(set_to_none=True)
+ if input_embeds_batch.grad is not None:
+ input_embeds_batch.grad.zero_()
+
+ # gather the gradients wrt these probabilities across batch
+ losses.sum().backward()
+
+ # perform (x - x') * grad * step_size
+ baseline_diff = (input_embeds_orig - baseline_embeds)
+ grads_batch = baseline_diff * input_embeds_batch.grad.detach().clone() * step_sizes_batch # [batch_size, seq_len, embedding_dim]
+ # Sum over batch
+ IG_sum += (grads_batch).sum(dim=0) # [seq_len, embedding_dim]
+
+ # Free memory
+ del input_embeds_batch, logits, probs, grads_batch
+ torch.cuda.empty_cache()
+
+ # del input_embeds_batch, logits, probs, losses
+
+ # # This is a sum over the number of IG steps. Finishes IG result
+ # IG_grads = IG_grads.sum(0) # From [steps, seq_len, embed_dim] to [seq_len, embed_dim]
+ # # take the sum over the embedding_dim
+ # IG_grads = IG_grads.sum(-1) # [seq_len]
+
+ # Sum across embedding dimension
+ IG_grads = IG_sum.sum(-1).detach().cpu()
+
+ # pad these grads with nan since they must fit into score_array with all other token attributions
+ score_array[step] = self.pad_vector(IG_grads.view(1, -1), total_length, torch.nan) # [1, total_length]
+
+ # clean up before the next loop
+ # del input_embeds_batch, logits, probs, losses
+ # torch.cuda.empty_cache()
+
+ # Append next token to input for next step generation and attribution
+ input_ids_all = torch.cat([input_ids_all, torch.tensor([[target_token]]).to(self.device)], dim=1)
+ input_ids_all = input_ids_all.detach().clone()
+
+ # remove from the attribution all values associated with the chat prompt
+ score_array = self.extract_user_prompt_attributions(self.prompt_tokens, score_array)
+
+ all_tokens = self.user_prompt_tokens + self.generation_tokens
+
+ return LLMAttributionResult(self.tokenizer, score_array, self.user_prompt_tokens, self.generation_tokens, all_tokens = all_tokens)
+
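As a sanity check on the integrated-gradients loop above, a toy one-dimensional version (our own helper, not part of the class) illustrates the completeness property: the attribution recovers f(x) - f(baseline):

```python
import torch

def ig_1d(f, x, baseline=0.0, steps=64):
    # Trapezoidal-rule weights for `steps` interpolation points on [0, 1]
    alphas = torch.linspace(0.0, 1.0, steps)
    w = torch.full((steps,), 1.0 / (steps - 1))
    w[0] /= 2
    w[-1] /= 2
    total = 0.0
    for a, wi in zip(alphas.tolist(), w.tolist()):
        xi = torch.tensor(baseline + a * (x - baseline), requires_grad=True)
        f(xi).backward()
        total += wi * xi.grad.item()
    # IG attribution: (x - baseline) times the path integral of the gradient
    return (x - baseline) * total

print(round(ig_1d(lambda t: t ** 2, 3.0), 4))  # 9.0, i.e. f(3) - f(0)
```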
+class LLMPerturbationAttribution(LLMAttribution):
+
+ def __init__(self, model, tokenizer) -> None:
+ super().__init__(model, tokenizer)
+
+ self.mlm_tokenizer = LongformerTokenizer.from_pretrained("allenai/longformer-base-4096")
+ self.mlm_model = LongformerForMaskedLM.from_pretrained("allenai/longformer-base-4096").to(self.device)
+
+
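Scoring a response given a prompt relies on the usual one-position shift: logits at position t-1 predict token t. A toy sketch of the shift-and-gather (shapes follow the [B, N] prompt / [B, M] response convention used below):

```python
import torch

torch.manual_seed(0)
B, N, M, V = 1, 3, 2, 5
# Stand-in for model log-probs over the concatenated prompt + response
log_probs = torch.log_softmax(torch.randn(B, N + M, V), dim=-1)
response_ids = torch.tensor([[2, 4]])
# Positions [N-1, N+M-1) predict the M response tokens
shifted = log_probs[:, N - 1 : N + M - 1, :]                           # [B, M, V]
token_lp = shifted.gather(-1, response_ids.unsqueeze(-1)).squeeze(-1)  # [B, M]
print(token_lp.shape)  # torch.Size([1, 2])
```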
+
+ # we want to evaluate the probability of producing a response given a prompt
+ def compute_logprob_response_given_prompt(self, prompt_ids, response_ids) -> torch.Tensor:
+ """
+ Compute log-probabilities of `response_ids` given `prompt_ids`.
+
+ prompt_ids: [B, N]
+ response_ids: [B, M]
+ Returns: [B, M]
+ """
+ # concat prompt and response
+ input_ids = torch.cat([prompt_ids, response_ids], dim=1) # [B, N+M]
+ attention_mask = torch.ones_like(input_ids)
+
+ # Get model outputs
+ logits = self.model(input_ids=input_ids, attention_mask=attention_mask).logits # [B, seq_len, vocab_size]
+
+ # Compute log-probs
+ log_probs = torch.nn.functional.log_softmax(logits, dim=-1) # [B, seq_len, vocab_size]
+
+ # Only consider response tokens
+ response_start = prompt_ids.shape[1]
+
+ # Align logits to predict each y_t from y_{<t}:
+ # logits at position t-1 predict token t, so response-token log-probs
+ # live at positions [response_start - 1, response_start + M - 1)
+ shifted_log_probs = log_probs[:, response_start - 1 : -1, :] # [B, M, V]
+ token_log_probs = shifted_log_probs.gather(-1, response_ids.unsqueeze(-1)).squeeze(-1) # [B, M]
+ return token_log_probs
+
+ def compute_kl_response_given_prompt(self, prompt_ids, response_ids) -> torch.Tensor:
+ """
+ Compute KL divergence scores for each token in `response_ids` given `prompt_ids`.
+ Mimics run_probing(metrics="kl_div") but only for the full sequence.
+
+ Args:
+ model: HuggingFace autoregressive model.
+ prompt_ids: [B, N] tensor of prompt token IDs.
+ response_ids: [B, M] tensor of response token IDs.
+
+ Returns:
+ KL-divergence scores: [B, M] tensor.
+ """
+ device = prompt_ids.device
+ prompt_ids = prompt_ids.to(device)
+ response_ids = response_ids.to(device)
+
+ # Concatenate prompt + response
+ input_ids = torch.cat([prompt_ids, response_ids], dim=1) # [B, N+M]
+ attention_mask = torch.ones_like(input_ids, device=device)
+
+ # Compute logits
+ logits = self.model(input_ids=input_ids, attention_mask=attention_mask).logits # [B, N+M, V]
+ logits = logits.to(torch.float32) # avoid float16 overflow
+ log_probs = F.log_softmax(logits, dim=-1) # [B, N+M, V]
+
+ # Align: y_t predicted from x_{<t}
+ response_start = prompt_ids.shape[1]
+ shifted_log_probs = log_probs[:, response_start - 1 : -1, :] # [B, M, V]
+ token_log_probs = shifted_log_probs.gather(-1, response_ids.unsqueeze(-1)).squeeze(-1) # [B, M]
+ # KL(one-hot(y_t) || p) reduces to the negative log-likelihood of y_t
+ kl_scores = -token_log_probs
+ return kl_scores
+
+ def calculate_perturbation_attribution(self, prompt, target = None, measure = "log_loss") -> LLMAttributionResult:
+ # run the model so we can access the prompt ids and generated token ids
+ if target is None:
+ self.response(prompt)
+ else:
+ self.target_response(prompt, target)
+
+ # Make a copy of the prompt ids
+ # We will expand the original prompt by each generated token
+ input_ids_all = self.prompt_ids.clone()
+
+ # we want to know how many input tokens and generated tokens there are
+ input_length = self.prompt_ids.shape[1]
+ generation_length = self.generation_ids.shape[1]
+ total_length = input_length + generation_length
+
+
+ # given the text user prompt, create a mask over the tokens of each sentence
+ user_prompt_sentences = create_sentences("".join(self.user_prompt_tokens), self.tokenizer, show=True)
+ sentence_masks_prompt = create_sentence_masks(self.user_prompt_tokens, user_prompt_sentences, show=True)
+
+ # mask prompt sentences and generated sentences
+ # given the generation, create a mask over the tokens of each sentence
+ generation_sentences = create_sentences("".join(self.generation_tokens), self.tokenizer)
+ sentence_masks_generation = create_sentence_masks(self.generation_tokens, generation_sentences)
+
+ # find the total sizes of the masks we need
+        l = len(self.chat_prompt_indices)  # input formatting tokens
+ n, m = sentence_masks_prompt.shape # (user prompt sentences, user prompt tokens)
+ o, p = sentence_masks_generation.shape # (generation sentences + EOS, generation tokens + EOS)
+
+ # Create a tensor that can fit all masks diagonally
+ masks = torch.zeros((l + n + o, l + m + p))
+
+ # we never mask the chat_prompt_indices, leave as 0
+ # prompt indices cover:
+ # 0 : start of sentence_masks_prompt
+ # end of sentence_masks_prompt : start of sentence_masks_generation
+
+ # input sentence masks only cover the user prompt
+ user_prompt_start_idx = self.user_prompt_indices[0]
+ masks[user_prompt_start_idx : user_prompt_start_idx + n, user_prompt_start_idx : user_prompt_start_idx + m] = sentence_masks_prompt
+
+ # gen sentence masks only cover the generations
+ masks[l + n:, l + m:] = sentence_masks_generation
+
+ num_input_masks = masks.shape[0]
+
+ # instantiate a matrix which will track the attribution of every generated token to intermediate generations
+ # cols = total_length because we will capture generation -> previous generation attributions
+ score_array = torch.full((generation_length, total_length), torch.nan)
+
+        for step in range(len(sentence_masks_generation)):
+            input_ids_all = input_ids_all.detach()
+
+            # assume that we are generating a sentence of the generation_ids and find the
+            # prob of generating this sentence from the current input_ids (prompt + any current generations)
+ gen_token_indices = torch.where(sentence_masks_generation[step] == 1)[0] # [len(target_tokens)]
+ gen_tokens = self.generation_ids[:, gen_token_indices] # [1, len(target_tokens)]
+
+            if measure == "log_loss":
+                original_probs = self.compute_logprob_response_given_prompt(input_ids_all, gen_tokens).detach().cpu()  # [1, len(target_tokens)]
+            elif measure == "KL":
+                original_probs = self.compute_kl_response_given_prompt(input_ids_all, gen_tokens).detach().cpu()  # [1, len(target_tokens)]
+            else:
+                raise ValueError(f"Unsupported measure={measure!r}; expected 'log_loss' or 'KL'.")
+
+            # perturb each sentence of the input and current generation
+            # and measure how the probs of predicting gen_tokens changes
+ for i in range(num_input_masks - len(sentence_masks_generation) + step):
+ # find the input tokens to be masked
+ tokens_to_mask = torch.where(masks[i] == 1)[0]
+
+ # if we don't want to mask anything just continue
+ if len(tokens_to_mask) == 0:
+ continue
+
+ # save the original token values for unmasking
+ original_token_value = input_ids_all[:, tokens_to_mask].clone()
+ # mask the values
+ input_ids_all[:, tokens_to_mask] = baseline
+
+                if measure == "log_loss":
+                    # prob of generating a token from a perturbation of the input_ids (prompt + current generations)
+                    perturbed_probs = self.compute_logprob_response_given_prompt(input_ids_all, gen_tokens).detach().cpu()  # [1, len(target_tokens)]
+                elif measure == "KL":
+                    perturbed_probs = self.compute_kl_response_given_prompt(input_ids_all, gen_tokens).detach().cpu()  # [1, len(target_tokens)]
+                else:
+                    raise ValueError(f"Unsupported measure={measure!r}; expected 'log_loss' or 'KL'.")
+
+ # change from original generation prob
+ score_delta = original_probs - perturbed_probs # [1, len(target_tokens)]
+
+ # since scores are for each output token over the set of input tokens [tokens_to_mask],
+ # we expand them to be over all these tokens
+ rows, cols = torch.meshgrid(gen_token_indices, tokens_to_mask, indexing = "ij")
+ score_array[rows, cols] = score_delta.reshape(-1, 1).repeat((1, len(tokens_to_mask))).to(score_array.dtype) # [len(target_tokens), len(tokens_to_mask)]
+
+ # un-ablate the input
+ input_ids_all[:, tokens_to_mask] = original_token_value
+
+ # Append generated tokens to input for next step
+ input_ids_all = torch.cat([input_ids_all, gen_tokens], dim = 1)
+
+ # remove from the attribution all values associated with the chat prompt
+ score_array = self.extract_user_prompt_attributions(self.prompt_tokens, score_array)
+
+ all_tokens = self.user_prompt_tokens + self.generation_tokens
+
+ return LLMAttributionResult(self.tokenizer, score_array, self.user_prompt_tokens, self.generation_tokens, all_tokens = all_tokens)
+
+ def mlm_mask_indices(self, input_ids, tokens_to_mask):
+ """
+        Replace masked positions in a LLaMA token sequence using a Longformer MLM.
+ """
+
+ # 1. Convert input_ids to tokens
+ new_text_tokens = self.tokenizer.convert_ids_to_tokens(input_ids[0])
+
+ # 2. Mask only selected tokens
+ for idx in tokens_to_mask:
+ new_text_tokens[idx] = self.mlm_tokenizer.mask_token
+
+ # 3. Convert tokens back to string
+ new_text = self.tokenizer.convert_tokens_to_string(new_text_tokens)
+
+ # 4. Encode for Longformer MLM
+ inputs = self.mlm_tokenizer(new_text, return_tensors="pt", max_length=4096, truncation=True)
+ inputs = {k: v.to(self.device) for k, v in inputs.items()}
+
+ # 5. Find masked positions
+ masked_positions = (inputs["input_ids"] == self.mlm_tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
+
+ # 6. Add global attention on masked positions
+ global_attention_mask = torch.zeros_like(inputs["input_ids"])
+ global_attention_mask[0, masked_positions] = 1
+ inputs["global_attention_mask"] = global_attention_mask
+
+ # 7. Predict all masked positions in one forward pass
+ with torch.no_grad():
+ logits = self.mlm_model(**inputs).logits # [batch, seq_len, vocab_size]
+ predicted_ids = logits[0, masked_positions, :].argmax(dim=-1)
+ replacement_ids = inputs["input_ids"].clone()
+ replacement_ids[0, masked_positions] = predicted_ids
+
+        # 8. Decode predicted tokens to a string
+        regenerated_text = self.mlm_tokenizer.decode(predicted_ids, skip_special_tokens=True)
+        if regenerated_text and regenerated_text[0] != ' ':
+            regenerated_text = ' ' + regenerated_text
+
+        # 9. Re-tokenize with LLaMA tokenizer
+        replacement_input_ids = self.tokenizer(regenerated_text, return_tensors='pt').input_ids
+
+        # 10. Pad/truncate to match original masked length
+        original_len = len(tokens_to_mask)
+ new_len = replacement_input_ids.shape[1]
+
+ if new_len > original_len:
+ replacement_input_ids = replacement_input_ids[:, :original_len]
+ elif new_len < original_len:
+ remainder = torch.full((1, original_len - new_len), self.tokenizer.eos_token_id, dtype=torch.long)
+ replacement_input_ids = torch.cat((replacement_input_ids, remainder), dim=1)
+
+ if replacement_input_ids.dtype != torch.int64:
+ replacement_input_ids = replacement_input_ids.to(torch.int64)
+
+ return replacement_input_ids.to(self.device)
+
+ def calculate_feature_ablation_sentences_mlm(self, prompt, target = None) -> LLMAttributionResult:
+ # run the model so we can access the prompt ids and generated token ids
+ if target is None:
+ self.response(prompt)
+ else:
+ self.target_response(prompt, target)
+
+ # Make a copy of the prompt ids
+ # We will expand the original prompt by each generated token
+ input_ids_all = self.prompt_ids.clone()
+
+ # we want to know how many input tokens and generated tokens there are
+ input_length = self.prompt_ids.shape[1]
+ generation_length = self.generation_ids.shape[1]
+ total_length = input_length + generation_length
+
+
+ # given the text user prompt, create a mask over the tokens of each sentence
+ user_prompt_sentences = create_sentences("".join(self.user_prompt_tokens), self.tokenizer, show=True)
+ sentence_masks_prompt = create_sentence_masks(self.user_prompt_tokens, user_prompt_sentences, show=True)
+
+ # mask prompt sentences and generated sentences
+ # given the generation, create a mask over the tokens of each sentence
+ generation_sentences = create_sentences("".join(self.generation_tokens), self.tokenizer)
+ sentence_masks_generation = create_sentence_masks(self.generation_tokens, generation_sentences)
+
+ # find the total sizes of the masks we need
+        l = len(self.chat_prompt_indices)  # input formatting tokens
+ n, m = sentence_masks_prompt.shape # (user prompt sentences, user prompt tokens)
+ o, p = sentence_masks_generation.shape # (generation sentences + EOS, generation tokens + EOS)
+
+ # Create a tensor that can fit all masks diagonally
+ masks = torch.zeros((l + n + o, l + m + p))
+
+ # we never mask the chat_prompt_indices, leave as 0
+ # prompt indices cover:
+ # 0 : start of sentence_masks_prompt
+ # end of sentence_masks_prompt : start of sentence_masks_generation
+
+ # input sentence masks only cover the user prompt
+ user_prompt_start_idx = self.user_prompt_indices[0]
+ masks[user_prompt_start_idx : user_prompt_start_idx + n, user_prompt_start_idx : user_prompt_start_idx + m] = sentence_masks_prompt
+
+ # gen sentence masks only cover the generations
+ masks[l + n:, l + m:] = sentence_masks_generation
+
+ num_input_masks = masks.shape[0]
+
+ # instantiate a matrix which will track the attribution of every generated token to intermediate generations
+ # cols = total_length because we will capture generation -> previous generation attributions
+ score_array = torch.full((generation_length, total_length), torch.nan)
+
+        for step in range(len(sentence_masks_generation)):
+            input_ids_all = input_ids_all.detach()
+
+            # assume that we are generating a sentence of the generation_ids and find the
+            # prob of generating this sentence from the current input_ids (prompt + any current generations)
+ gen_token_indices = torch.where(sentence_masks_generation[step] == 1)[0] # [len(target_tokens)]
+ gen_tokens = self.generation_ids[:, gen_token_indices] # [1, len(target_tokens)]
+
+ original_probs = self.compute_logprob_response_given_prompt(input_ids_all, gen_tokens).detach().cpu() # [1, len(target_tokens)]
+
+            # perturb each sentence of the input and current generation
+            # and measure how the probs of predicting gen_tokens changes
+ for i in range(num_input_masks - len(sentence_masks_generation) + step):
+ # find the input tokens to be masked
+ tokens_to_mask = torch.where(masks[i] == 1)[0]
+
+ # if we don't want to mask anything just continue
+ if len(tokens_to_mask) == 0:
+ continue
+
+                # save the original token values for unmasking
+                original_token_value = input_ids_all[:, tokens_to_mask].clone()
+
+                # replace the tokens_to_mask with MLM-predicted tokens converted back to LLaMA ids
+                new_ids = self.mlm_mask_indices(input_ids_all, tokens_to_mask)
+                try:
+                    input_ids_all[:, tokens_to_mask] = new_ids
+                except RuntimeError as exc:
+                    # shape mismatch between MLM replacements and the masked span; leave the span unperturbed
+                    print(f"MLM replacement failed for {len(tokens_to_mask)} masked tokens: {exc}")
+
+ # prob of generating a token from a perturbation of the input_ids (prompt + current generations)
+ perturbed_probs = self.compute_logprob_response_given_prompt(input_ids_all, gen_tokens).detach().cpu() # [1, len(target_tokens)]
+
+ # change from original generation prob
+ score_delta = original_probs - perturbed_probs # [1, len(target_tokens)]
+
+ # since scores are for each output token over the set of input tokens [tokens_to_mask],
+ # we expand them to be over all these tokens
+ rows, cols = torch.meshgrid(gen_token_indices, tokens_to_mask, indexing = "ij")
+ score_array[rows, cols] = score_delta.reshape(-1, 1).repeat((1, len(tokens_to_mask))).to(score_array.dtype) # [len(target_tokens), len(tokens_to_mask)]
+
+                # un-ablate the input
+                input_ids_all[:, tokens_to_mask] = original_token_value
+
+ # Append generated tokens to input for next step
+ input_ids_all = torch.cat([input_ids_all, gen_tokens], dim = 1)
+
+ # remove from the attribution all values associated with the chat prompt
+ score_array = self.extract_user_prompt_attributions(self.prompt_tokens, score_array)
+
+ all_tokens = self.user_prompt_tokens + self.generation_tokens
+
+ return LLMAttributionResult(self.tokenizer, score_array, self.user_prompt_tokens, self.generation_tokens, all_tokens = all_tokens)
+
+
+class LLMAttentionAttribution(LLMAttribution):
+ def __init__(self, model, tokenizer, generate_kwargs = None):
+ super().__init__(model, tokenizer, generate_kwargs)
+
+ def calculate_attention_attribution(self, prompt, target = None) -> LLMAttributionResult:
+ # run the model so we can access the prompt ids and generated token ids
+ if target is None:
+ self.response(prompt)
+ else:
+ self.target_response(prompt, target)
+
+ # Make a copy of the input ids
+ # We will expand the original prompt by each generated token
+ input_ids_all = self.prompt_ids.clone()
+
+ # we want to know how many input and generated tokens there are
+ input_length = self.prompt_ids.shape[1]
+ generation_length = self.generation_ids.shape[1]
+ total_length = input_length + generation_length
+
+ # instantiate a matrix which will track the attribution of every generated token to the input
+ # cols = total_length because we will capture generation -> previous generation attributions
+ score_array = torch.empty((generation_length, total_length))
+
+        with torch.no_grad():
+            for step in range(generation_length):
+ input_ids_all = input_ids_all.detach()
+
+ target_token = self.generation_ids[0, step]
+
+ # perform inference
+ outputs = self.model(input_ids_all, output_attentions = True)
+
+                # get attention weights from the last layer (mean over heads and query positions)
+                attentions = torch.stack(outputs.attentions, 0)[-1].mean(1).mean(1)  # [batch, seq_length]
+ # pad the scores with nan since they must fit into score_array with all other token attributions
+ score_array[step] = self.pad_vector(attentions.detach().cpu(), total_length, torch.nan)
+
+ # Append generated token to input for next step
+ input_ids_all = torch.cat([input_ids_all, torch.tensor([[target_token]]).to(self.device)], dim=1)
+
+        # remove from the attribution all values associated with the chat prompt
+ score_array = self.extract_user_prompt_attributions(self.prompt_tokens, score_array)
+
+ all_tokens = self.user_prompt_tokens + self.generation_tokens
+
+ return LLMAttributionResult(self.tokenizer, score_array, self.user_prompt_tokens, self.generation_tokens, all_tokens = all_tokens)
+
+ def rollout(self, attentions):
+ num_blocks = attentions.shape[0]
+ batch_size = attentions.shape[1]
+ num_tokens = attentions.shape[2]
+ eye = torch.eye(num_tokens).expand(num_blocks, batch_size, num_tokens, num_tokens).to(attentions[0].device)
+
+ matrices_aug = attentions + eye
+
+ # normalize all the matrices, making residual connection addition equal to 0.5*A + 0.5*I
+ matrices_aug = matrices_aug / matrices_aug.sum(dim=-1, keepdim=True)
+
+ # perform rollout
+ joint_attention = matrices_aug[0]
+ for i in range(0 + 1, num_blocks):
+ joint_attention = matrices_aug[i].bmm(joint_attention)
+
+ return joint_attention
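+
+    # Illustrative usage of rollout (hypothetical variable names; assumes HF-style
+    # attention outputs): rollout expects head-averaged maps of shape
+    # [num_layers, batch, T, T], so average over the heads dimension first:
+    #
+    #   attns = torch.stack(outputs.attentions, 0)   # [num_layers, batch, heads, T, T]
+    #   joint = self.rollout(attns.mean(2))          # [batch, T, T] rolled-out attention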
+
+
+class LLMLRPAttribution(LLMAttribution):
+ """AttnLRP: Attention-Aware Layer-wise Relevance Propagation for Transformers.
+
+ This class implements AttnLRP, a gradient-based attribution method that
+ leverages Layer-wise Relevance Propagation (LRP) rules adapted for
+ transformer architectures.
+
+ AttnLRP achieves O(1) time complexity (single backward pass) while
+ providing theoretically grounded attributions with proven faithfulness.
+
+ Reference:
+ AttnLRP: Attention-Aware Layer-wise Relevance Propagation for Transformers
+ ICML 2024. https://arxiv.org/abs/2402.05602
+
+ Parameters
+ ----------
+ model : transformers model
+ The language model to compute attributions for
+ tokenizer : transformers tokenizer
+ The tokenizer for the model
+ model_type : str, optional
+ The model architecture type. If None, will be auto-detected.
+ Supported: 'qwen3', 'qwen2', 'llama'
+ generate_kwargs : dict, optional
+ Keyword arguments for model.generate()
+
+ Example
+ -------
+ >>> attr = LLMLRPAttribution(model, tokenizer)
+ >>> result = attr.calculate_attnlrp(
+ ... prompt="Context: Mount Everest is 8848m. Question: How high?",
+ ... target="8848 meters"
+ ... )
+ >>> seq_attr, row_attr, rec_attr = result.get_all_token_attrs([0, len(result.generation_tokens) - 1])
+ """
+
+ def __init__(
+ self,
+ model,
+ tokenizer,
+ model_type: Optional[str] = None,
+ generate_kwargs: Optional[Dict[str, Any]] = None,
+ ) -> None:
+ super().__init__(model, tokenizer, generate_kwargs)
+
+ # Auto-detect or validate model type
+ if model_type is None:
+ self.model_type = detect_model_type(model)
+ else:
+ self.model_type = model_type
+
+ def _resolve_score_mode(
+ self,
+ score_mode: Optional[Literal["max", "generated"]],
+ target: Optional[str],
+ ) -> Literal["max", "generated"]:
+ if score_mode is None:
+ return "generated" if target is not None else "max"
+ return score_mode
+
+ def calculate_attnlrp(
+ self,
+ prompt: str,
+ target: Optional[str] = None,
+ *,
+ score_mode: Optional[Literal["max", "generated"]] = None,
+ ) -> LLMAttributionResult:
+ """Calculate AttnLRP attribution for a prompt-response pair.
+
+ Parameters
+ ----------
+ prompt : str
+ The input prompt text
+ target : str, optional
+ The target response text. If None, the model generates a response.
+ score_mode : Literal["max", "generated"], optional
+ "max": use max logit at each position (original behavior).
+ "generated": use the logit of the generated/target token at each position.
+ Default: auto ("generated" if target is provided, else "max").
+
+ Returns
+ -------
+ LLMAttributionResult
+ Attribution result with score matrix of shape [gen_len, prompt_len + gen_len]
+ """
+ # Get the generation (either from model or from target)
+ if target is None:
+ self.response(prompt)
+ else:
+ self.target_response(prompt, target)
+
+ score_mode = self._resolve_score_mode(score_mode, target)
+
+ # Get lengths
+ prompt_len = int(self.prompt_ids.shape[1])
+ gen_len = int(self.generation_ids.shape[1])
+ total_len = prompt_len + gen_len
+
+ # Handle empty generation
+ if gen_len == 0:
+ empty_scores = torch.zeros((0, total_len), dtype=torch.float32)
+ return self._finalize_result(empty_scores)
+
+ # Concatenate prompt and generation ids
+ input_ids = torch.cat([self.prompt_ids, self.generation_ids], dim=1)
+
+ # Get the embedding layer
+ embedding_layer = self.model.get_input_embeddings()
+
+ # Get model dtype for proper precision handling
+ model_dtype = next(self.model.parameters()).dtype
+
+ # Initialize score array
+ score_array = torch.full((gen_len, total_len), torch.nan, dtype=torch.float32)
+
+ # Apply LRP patches and compute attributions
+ with lrp_context(self.model, self.model_type):
+ # Get input embeddings with gradient tracking
+ input_embeds = embedding_layer(input_ids).float()
+ input_embeds = input_embeds.detach().clone().requires_grad_(True)
+
+ # Forward pass with LRP-patched model
+ output_logits = self.model(
+ inputs_embeds=input_embeds.to(model_dtype),
+ use_cache=False,
+ ).logits
+
+ # Compute attribution for each generation position
+ for step in range(gen_len):
+ gen_pos = prompt_len + step
+
+ if score_mode == "max":
+ score_logit = output_logits[0, gen_pos - 1, :].max()
+ elif score_mode == "generated":
+ token_id = self.generation_ids[0, step]
+ score_logit = output_logits[0, gen_pos - 1, token_id]
+ else:
+ raise ValueError(f"Unsupported score_mode={score_mode}")
+
+ # Backward pass - this computes LRP through the patched layers
+ if input_embeds.grad is not None:
+ input_embeds.grad.zero_()
+
+ score_logit.backward(retain_graph=(step < gen_len - 1))
+
+ # Compute relevance: Input * Gradient, summed over embedding dimension
+ # Cast to float32 for numerical stability before summing
+ relevance = (input_embeds * input_embeds.grad).float().sum(-1).detach().cpu()[0]
+
+ # Store in score array, padded appropriately
+ score_array[step, :gen_pos] = relevance[:gen_pos]
+
+ return self._finalize_result(score_array)
+
+ def calculate_attnlrp_batched(
+ self,
+ prompt: str,
+ target: Optional[str] = None,
+ *,
+ score_mode: Optional[Literal["max", "generated"]] = None,
+ ) -> LLMAttributionResult:
+ """Calculate AttnLRP attribution using batched computation.
+
+ This is a memory-efficient version that computes attribution for
+ all generation positions in a single forward pass, but requires
+ more careful handling of gradients.
+
+ Parameters
+ ----------
+ prompt : str
+ The input prompt text
+ target : str, optional
+ The target response text. If None, the model generates a response.
+ score_mode : Literal["max", "generated"], optional
+ "max": use max logit at each position (original behavior).
+ "generated": use the logit of the generated/target token at each position.
+ Default: auto ("generated" if target is provided, else "max").
+
+ Returns
+ -------
+ LLMAttributionResult
+ Attribution result with score matrix
+ """
+ # Get the generation
+ if target is None:
+ self.response(prompt)
+ else:
+ self.target_response(prompt, target)
+
+ score_mode = self._resolve_score_mode(score_mode, target)
+
+ # Get lengths
+ prompt_len = int(self.prompt_ids.shape[1])
+ gen_len = int(self.generation_ids.shape[1])
+ total_len = prompt_len + gen_len
+
+ if gen_len == 0:
+ empty_scores = torch.zeros((0, total_len), dtype=torch.float32)
+ return self._finalize_result(empty_scores)
+
+ # Concatenate prompt and generation ids
+ input_ids = torch.cat([self.prompt_ids, self.generation_ids], dim=1)
+
+ # Get embedding layer and model dtype
+ embedding_layer = self.model.get_input_embeddings()
+ model_dtype = next(self.model.parameters()).dtype
+
+ # Initialize score array
+ score_array = torch.full((gen_len, total_len), torch.nan, dtype=torch.float32)
+
+ with lrp_context(self.model, self.model_type):
+ # Get input embeddings
+ input_embeds = embedding_layer(input_ids).float()
+ input_embeds = input_embeds.detach().clone().requires_grad_(True)
+
+ # Single forward pass
+ output_logits = self.model(
+ inputs_embeds=input_embeds.to(model_dtype),
+ use_cache=False,
+ ).logits
+
+ # Get scoring logits for all generation positions
+ gen_positions = list(range(prompt_len - 1, prompt_len + gen_len - 1))
+ if score_mode == "max":
+ score_logits = torch.stack([output_logits[0, pos, :].max() for pos in gen_positions])
+ elif score_mode == "generated":
+ positions = torch.as_tensor(gen_positions, device=output_logits.device)
+ token_ids = self.generation_ids[0].to(device=output_logits.device)
+ score_logits = output_logits[0, positions, :].gather(-1, token_ids.unsqueeze(-1)).squeeze(-1)
+ else:
+ raise ValueError(f"Unsupported score_mode={score_mode}")
+
+ # Backward from sum of all scoring logits
+ # This gives us the total relevance across all positions
+ if input_embeds.grad is not None:
+ input_embeds.grad.zero_()
+
+ score_logits.sum().backward()
+
+ # Compute aggregated relevance
+ relevance = (input_embeds * input_embeds.grad).float().sum(-1).detach().cpu()[0]
+
+ # For batched version, we use the same relevance for all generation positions
+ # This is an approximation but much faster
+ for step in range(gen_len):
+ gen_pos = prompt_len + step
+ score_array[step, :gen_pos] = relevance[:gen_pos]
+
+ return self._finalize_result(score_array)
+
+ def _finalize_result(
+ self,
+ score_array: torch.Tensor,
+ metadata: Optional[Dict[str, Any]] = None,
+ ) -> LLMAttributionResult:
+ """Finalize the attribution result.
+
+ Extracts user prompt attributions and creates the result object.
+
+ Parameters
+ ----------
+ score_array : torch.Tensor
+ Raw score array of shape [gen_len, total_len]
+ metadata : dict, optional
+ Additional metadata to include
+
+ Returns
+ -------
+ LLMAttributionResult
+ The finalized attribution result
+ """
+ if score_array.ndim == 1:
+ score_array = score_array.unsqueeze(0)
+ score_array = score_array.detach().cpu()
+
+ # Extract only user prompt attributions (remove chat template tokens)
+ score_array = self.extract_user_prompt_attributions(self.prompt_tokens, score_array)
+
+ all_tokens = self.user_prompt_tokens + self.generation_tokens
+
+ if metadata is None:
+ metadata = {}
+ metadata["method"] = "attnlrp"
+ metadata["model_type"] = self.model_type
+
+ return LLMAttributionResult(
+ self.tokenizer,
+ score_array,
+ self.user_prompt_tokens,
+ self.generation_tokens,
+ all_tokens=all_tokens,
+ metadata=metadata,
+ )
+
+ def calculate_attnlrp_span_aggregate(
+ self,
+ prompt: str,
+ target: Optional[str] = None,
+ *,
+ sink_start: int = 0,
+ sink_end: Optional[int] = None,
+ sink_weights: Optional[torch.Tensor] = None,
+ normalize_weights: bool = True,
+ score_mode: Optional[Literal["max", "generated"]] = None,
+ ) -> AttnLRPSpanAggregate:
+ """Compute span-wise (multi-token) aggregated AttnLRP in ONE forward + ONE backward.
+
+ This returns a single attribution vector over the whole context (prompt + generation),
+ equal to the weighted sum/avg of per-token AttnLRP attributions over the sink span.
+
+ The key insight is that backward propagation is linear with respect to the objective,
+ and the LRP patches (divide_gradient, stop_gradient, identity_rule_implicit) are all
+ linear transformations on the incoming gradient. Therefore:
+
+        R_F = x ⊙ ∂F/∂x = x ⊙ Σ_g w_g ∂f_g/∂x = Σ_g w_g (x ⊙ ∂f_g/∂x) = Σ_g w_g R_{f_g}
+
+    This means computing attribution for the aggregated objective F = Σ_g w_g f_g in one
+ backward pass is mathematically equivalent to computing per-token attributions and
+ summing them with weights.
+
+    Complexity: O(N) instead of O(M×N) for the naive per-token approach.
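+
+    A toy autograd check of this linearity (illustrative, independent of the model):
+
+        >>> import torch
+        >>> x = torch.randn(3, requires_grad=True)
+        >>> w = torch.tensor([0.2, 0.3, 0.5])
+        >>> (w * x ** 2).sum().backward()        # F = sum_g w_g * x_g**2
+        >>> torch.allclose(x.grad, 2 * w * x)    # equals the weighted sum of per-term grads
+        True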
+
+ Parameters
+ ----------
+ prompt : str
+ The input prompt text
+ target : str, optional
+ The target response text. If None, the model generates a response.
+ sink_start : int
+ Start of sink span in generation token indices (inclusive). Default: 0
+ sink_end : int, optional
+ End of sink span in generation token indices (inclusive).
+ Default: None (uses gen_len - 1, i.e., full generation)
+ sink_weights : torch.Tensor, optional
+ Optional tensor of shape [span_len], weighting each sink position.
+ Default: None (uniform weights)
+ normalize_weights : bool
+ If True, weights are normalized to sum to 1 (weighted average).
+ If False, computes weighted sum. Default: True
+ score_mode : Literal["max", "generated"], optional
+ "max": use max logit at each sink position (matches existing calculate_attnlrp)
+ "generated": use the logit of the actually generated token id at each position
+ Default: auto ("generated" if target is provided, else "max")
+
+ Returns
+ -------
+ AttnLRPSpanAggregate
+ Aggregated attribution result with token_importance_total vector
+ """
+ # 1) Get generation (either from model or from target)
+ if target is None:
+ self.response(prompt)
+ else:
+ self.target_response(prompt, target)
+
+ score_mode = self._resolve_score_mode(score_mode, target)
+
+ prompt_len = int(self.prompt_ids.shape[1])
+ gen_len = int(self.generation_ids.shape[1])
+ total_len = prompt_len + gen_len
+
+ # Handle empty generation
+ if gen_len == 0:
+ empty = torch.zeros((0,), dtype=torch.float32)
+ return AttnLRPSpanAggregate(
+ token_importance_total=empty,
+ all_tokens=[],
+ user_prompt_tokens=[],
+ generation_tokens=[],
+ sink_range=(0, -1),
+ sink_weights=None,
+ metadata={"method": "attnlrp_span_aggregate", "note": "empty_generation"},
+ )
+
+ if prompt_len <= 0:
+ raise ValueError("prompt_len must be > 0 for causal LM attribution.")
+
+ # Set default sink_end to full generation
+ if sink_end is None:
+ sink_end = gen_len - 1
+
+ sink_start = int(sink_start)
+ sink_end = int(sink_end)
+
+ if not (0 <= sink_start <= sink_end < gen_len):
+ raise ValueError(f"Invalid sink span [{sink_start}, {sink_end}] for gen_len={gen_len}.")
+
+ span_len = sink_end - sink_start + 1
+
+ # 2) Build input ids and embeddings
+ input_ids = torch.cat([self.prompt_ids, self.generation_ids], dim=1)
+ embedding_layer = self.model.get_input_embeddings()
+ model_dtype = next(self.model.parameters()).dtype
+
+ # 3) Forward with LRP patches, then single backward from aggregated scalar objective
+ with lrp_context(self.model, self.model_type):
+ input_embeds = embedding_layer(input_ids).float()
+ input_embeds = input_embeds.detach().clone().requires_grad_(True)
+
+ output_logits = self.model(
+ inputs_embeds=input_embeds.to(model_dtype),
+ use_cache=False,
+ ).logits # [1, total_len, vocab]
+
+ device = output_logits.device
+ logits_dtype = output_logits.dtype
+
+ # Positions in logits corresponding to generation indices g:
+ # g=0 -> pos = prompt_len - 1 (logits at position i predict token i+1)
+ # g=k -> pos = prompt_len + k - 1
+ pos_start = prompt_len + sink_start - 1
+ pos_end = prompt_len + sink_end - 1
+ positions = torch.arange(pos_start, pos_end + 1, device=device)
+
+ # Build weights tensor
+ if sink_weights is None:
+ w = torch.ones((span_len,), device=device, dtype=logits_dtype)
+ if normalize_weights:
+ w = w / float(span_len)
+ else:
+ w = sink_weights.to(device=device, dtype=logits_dtype)
+ if w.numel() != span_len:
+ raise ValueError("sink_weights length must equal (sink_end - sink_start + 1).")
+ if normalize_weights:
+ w = w / (w.sum() + 1e-12)
+
+ # Per-position scalar targets f_g
+ if score_mode == "max":
+ # Vectorized max over vocab for each selected position
+ per_pos = output_logits[0, positions, :].max(dim=-1).values # [span_len]
+ elif score_mode == "generated":
+ # Logit of actually generated token id at each position
+ token_ids = self.generation_ids[0, sink_start:sink_end + 1].to(device=device) # [span_len]
+ per_pos = output_logits[0, positions, :].gather(-1, token_ids.unsqueeze(-1)).squeeze(-1)
+ else:
+ raise ValueError(f"Unsupported score_mode={score_mode}")
+
+            # Aggregated scalar objective: F = Σ w_g * f_g
+ objective = (w * per_pos).sum()
+
+ if input_embeds.grad is not None:
+ input_embeds.grad.zero_()
+
+ objective.backward()
+
+ # 4) Gradient*Input relevance over embedding dim -> per-token relevance
+ relevance_full = (input_embeds * input_embeds.grad).float().sum(-1).detach().cpu()[0] # [total_len]
+ relevance_with_chat_template = relevance_full.to(torch.float32).clone()
+
+ # 5) Strip chat template tokens (extract only user prompt + full generation tokens)
+ score_array = relevance_full.unsqueeze(0) # [1, total_len]
+ score_array = self.extract_user_prompt_attributions(self.prompt_tokens, score_array)
+ token_importance_total = score_array[0].to(torch.float32).cpu()
+
+ all_tokens = self.user_prompt_tokens + self.generation_tokens
+
+ metadata = {
+ "method": "attnlrp_span_aggregate",
+ "base_method": "attnlrp",
+ "model_type": self.model_type,
+ "sink_range_gen": (sink_start, sink_end),
+ "normalize_weights": normalize_weights,
+ "score_mode": score_mode,
+ # Debug/analysis: token-level relevance aligned to the FULL tokenization
+ # (chat template prompt tokens + generation tokens). This does not affect
+ # the returned token_importance_total (which is trimmed for evaluation).
+ "token_importance_total_with_chat_template": relevance_with_chat_template,
+ "prompt_tokens_with_chat_template": list(self.prompt_tokens or []),
+ "user_prompt_indices": list(self.user_prompt_indices or []),
+ "chat_prompt_indices": list(self.chat_prompt_indices or []),
+ }
+
+ return AttnLRPSpanAggregate(
+ token_importance_total=token_importance_total,
+ all_tokens=all_tokens,
+ user_prompt_tokens=self.user_prompt_tokens,
+ generation_tokens=self.generation_tokens,
+ sink_range=(sink_start, sink_end),
+ sink_weights=(sink_weights.detach().cpu() if sink_weights is not None else None),
+ metadata=metadata,
+ )
+
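+    # Example call (hypothetical prompt/target; a sketch of the span-aggregate API above):
+    #
+    #   attr = LLMLRPAttribution(model, tokenizer)
+    #   agg = attr.calculate_attnlrp_span_aggregate(
+    #       "Context: Mount Everest is 8848m. Question: How high?",
+    #       target="8848 meters", sink_start=0, sink_end=3, score_mode="generated",
+    #   )
+    #   top = torch.topk(agg.token_importance_total, k=5)  # most influential tokens
+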
+ def calculate_attnlrp_aggregated(
+ self,
+ prompt: str,
+ target: Optional[str] = None,
+ *,
+ score_mode: Optional[Literal["max", "generated"]] = None,
+ ) -> LLMAttributionResult:
+ """Calculate aggregated AttnLRP attribution using span aggregation.
+
+        This method provides an O(N) alternative to the naive O(M×N) per-token
+ AttnLRP computation. It computes attribution over the full generation span
+ in a single forward + backward pass.
+
+ The resulting attribution matrix uses the same aggregated attribution
+ vector for all generation rows (since we're computing the combined
+ importance of all generation tokens at once).
+
+ Parameters
+ ----------
+ prompt : str
+ The input prompt text
+ target : str, optional
+ The target response text. If None, the model generates a response.
+ score_mode : Literal["max", "generated"], optional
+ "max": use max logit at each position (original behavior).
+ "generated": use the logit of the generated/target token at each position.
+ Default: auto ("generated" if target is provided, else "max").
+
+ Returns
+ -------
+ LLMAttributionResult
+ Attribution result compatible with the standard evaluation pipeline
+ """
+ # Get the generation
+ if target is None:
+ self.response(prompt)
+ else:
+ self.target_response(prompt, target)
+
+ score_mode = self._resolve_score_mode(score_mode, target)
+
+ prompt_len = int(self.prompt_ids.shape[1])
+ gen_len = int(self.generation_ids.shape[1])
+ total_len = prompt_len + gen_len
+
+ # Handle empty generation
+ if gen_len == 0:
+ empty_scores = torch.zeros((0, total_len), dtype=torch.float32)
+ return self._finalize_result(empty_scores, metadata={
+ "method": "attnlrp_aggregated",
+ "note": "empty_generation"
+ })
+
+ # Compute span aggregate over full generation
+ aggregate = self.calculate_attnlrp_span_aggregate(
+ prompt,
+ target=target,
+ sink_start=0,
+ sink_end=gen_len - 1,
+ normalize_weights=True,
+ score_mode=score_mode,
+ )
+
+ # Build score array: replicate the aggregated vector for each generation row
+ # We need to reconstruct the full-length vector before extraction
+ relevance_vector = aggregate.token_importance_total
+
+        # The aggregate already has the chat-template tokens stripped, so the
+        # score matrix is laid out directly over user_prompt_tokens + generation_tokens
+ user_prompt_len = len(self.user_prompt_tokens)
+ gen_token_len = len(self.generation_tokens)
+ expected_len = user_prompt_len + gen_token_len
+
+ # Build score matrix
+ score_array = torch.full((gen_len, expected_len), torch.nan, dtype=torch.float32)
+
+ # For each generation position, set the attribution up to that position
+ for step in range(gen_len):
+ gen_pos = user_prompt_len + step
+ score_array[step, :gen_pos] = relevance_vector[:gen_pos]
+
+ metadata = {
+ "method": "attnlrp_aggregated",
+ "model_type": self.model_type,
+ "aggregate": aggregate,
+ }
+
+ all_tokens = self.user_prompt_tokens + self.generation_tokens
+
+ return LLMAttributionResult(
+ self.tokenizer,
+ score_array,
+ self.user_prompt_tokens,
+ self.generation_tokens,
+ all_tokens=all_tokens,
+ metadata=metadata,
+ )
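The replication loop above yields a lower-triangular-style layout: row `step` carries the shared aggregated vector up to that generation position and NaN beyond it. A plain-Python sketch of the same layout (`build_score_matrix` is a hypothetical name, with `None` standing in for NaN):

```python
def build_score_matrix(relevance, user_prompt_len, gen_len):
    """Row `step` holds attribution for tokens preceding generation position
    `step`; later positions stay None (NaN in the torch version)."""
    rows = []
    for step in range(gen_len):
        gen_pos = user_prompt_len + step
        rows.append(relevance[:gen_pos] + [None] * (len(relevance) - gen_pos))
    return rows

print(build_score_matrix([0.1, 0.2, 0.3, 0.4], 2, 2))
# → [[0.1, 0.2, None, None], [0.1, 0.2, 0.3, None]]
```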
+
+ def calculate_attnlrp_ft_hop0(
+ self,
+ prompt: str,
+ target: Optional[str] = None,
+ *,
+ sink_span: Optional[Tuple[int, int]] = None,
+ thinking_span: Optional[Tuple[int, int]] = None,
+ neg_handling: Literal["drop", "abs"] = "drop",
+ norm_mode: Literal["norm", "no_norm"] = "norm",
+ score_mode: Optional[Literal["max", "generated"]] = None,
+ ) -> LLMAttributionResult:
+ """Return AttnLRP hop0 from the FT multi-hop path as a token-level matrix."""
+ multi_hop = self.calculate_attnlrp_multi_hop(
+ prompt,
+ target=target,
+ sink_span=sink_span,
+ thinking_span=thinking_span,
+ n_hops=0,
+ neg_handling=neg_handling,
+ norm_mode=norm_mode,
+ score_mode=score_mode,
+ )
+ raw_attributions = getattr(multi_hop, "raw_attributions", None) or []
+ base_attr = raw_attributions[0] if raw_attributions else None
+ if base_attr is None or not hasattr(base_attr, "token_importance_total"):
+ raise RuntimeError("AttnLRP hop0 missing from multi-hop result.")
+
+ hop0_vec = torch.as_tensor(getattr(base_attr, "token_importance_total"), dtype=torch.float32).detach().cpu()
+ if hop0_vec.numel() <= 0:
+ raise RuntimeError("Empty generation for AttnLRP (hop0).")
+
+ user_prompt_len = len(self.user_prompt_tokens)
+ gen_len = len(self.generation_tokens)
+ gen_len_ids = int(self.generation_ids.shape[1]) if self.generation_ids is not None else gen_len
+ if gen_len != gen_len_ids:
+ raise RuntimeError(
+ "AttnLRP generation length mismatch between decoded tokens and token ids: "
+ f"len(generation_tokens)={gen_len} vs generation_ids.shape[1]={gen_len_ids}."
+ )
+ expected_len = user_prompt_len + gen_len
+ if int(hop0_vec.numel()) != expected_len:
+ raise RuntimeError("Unexpected AttnLRP hop0 vector length; cannot package into attribution matrix.")
+
+ score_array = torch.full((gen_len, expected_len), torch.nan, dtype=torch.float32)
+ for step in range(gen_len):
+ gen_pos = user_prompt_len + step
+ score_array[step, :gen_pos] = hop0_vec[:gen_pos]
+
+ metadata = {
+ "method": "attnlrp_ft_hop0",
+ "sink_span": tuple(getattr(base_attr, "sink_range", (0, max(0, gen_len - 1)))),
+ "thinking_span": thinking_span,
+ "n_hops": 0,
+ "neg_handling": neg_handling,
+ "norm_mode": norm_mode,
+ "ratio_enabled": norm_mode == "norm",
+ "multi_hop_result": multi_hop,
+ }
+
+ all_tokens = self.user_prompt_tokens + self.generation_tokens
+
+ return LLMAttributionResult(
+ self.tokenizer,
+ score_array,
+ self.user_prompt_tokens,
+ self.generation_tokens,
+ all_tokens=all_tokens,
+ metadata=metadata,
+ )
+
+ def calculate_attnlrp_multi_hop(
+ self,
+ prompt: str,
+ target: Optional[str] = None,
+ *,
+ sink_span: Optional[Tuple[int, int]] = None,
+ thinking_span: Optional[Tuple[int, int]] = None,
+ n_hops: int = 1,
+ neg_handling: Literal["drop", "abs"] = "drop",
+ norm_mode: Literal["norm", "no_norm"] = "norm",
+ score_mode: Optional[Literal["max", "generated"]] = None,
+ observation_mask: Optional[torch.Tensor | List[float]] = None,
+ ) -> MultiHopAttnLRPResult:
+ """Compute multi-hop AttnLRP attribution recursively through thinking span.
+
+ This method implements recursive attribution propagation analogous to
+ compute_multi_hop_ifr:
+
+ 1. Base hop (hop 0): Compute attribution from sink_span (output) to all tokens
+ 2. For each subsequent hop:
+ - Use attribution scores on thinking_span as weights
+ - Compute weighted attribution from thinking_span to all tokens
+ - Track "observation" (attribution to input tokens, excluding thinking/sink)
+ - Update weights for next hop
+
+ The key insight is that attribution mass flowing through the thinking span
+ can be "unrolled" by recursively attributing from that span back to earlier
+ tokens, weighted by how much each thinking token contributed.
+
+ Parameters
+ ----------
+ prompt : str
+ The input prompt text
+ target : str, optional
+ The target response text. If None, the model generates a response.
+ sink_span : Tuple[int, int], optional
+ (start, end) indices in generation tokens for the output span.
+ Default: full generation (0, gen_len-1)
+ thinking_span : Tuple[int, int], optional
+ (start, end) indices in generation tokens for the reasoning span.
+ Default: same as sink_span
+ n_hops : int
+ Number of recursive hops. Default: 1
+ neg_handling : Literal["drop", "abs"]
+ How to enforce non-negativity after each hop output.
+ "drop": clamp negative values to 0; "abs": take absolute value.
+ norm_mode : Literal["norm", "no_norm"]
+ "norm": per-hop global normalize + thinking-span normalize + enable hop ratios.
+ "no_norm": disable global normalize, thinking normalize, and hop ratios.
+ score_mode : Literal["max", "generated"], optional
+ "max": use max logit at each position (original behavior).
+ "generated": use the logit of the generated/target token at each position.
+ Default: auto ("generated" if target is provided, else "max").
+ observation_mask : torch.Tensor or List[float], optional
+ Custom mask for observable tokens. Shape: (gen_len,) or (total_len,).
+ 1 = observable (input), 0 = not observable (thinking/output).
+ Default: auto-generated based on spans.
+
+ Returns
+ -------
+ MultiHopAttnLRPResult
+ Contains raw_attributions, thinking_ratios, and observation dict.
+ """
+ # Get the generation
+ if target is None:
+ self.response(prompt)
+ else:
+ self.target_response(prompt, target)
+
+ score_mode = self._resolve_score_mode(score_mode, target)
+
+ prompt_len = int(self.prompt_ids.shape[1])
+ gen_len = int(self.generation_ids.shape[1])
+ total_len = prompt_len + gen_len
+
+ # Handle empty generation
+ if gen_len == 0:
+ empty_aggregate = AttnLRPSpanAggregate(
+ token_importance_total=torch.zeros((0,), dtype=torch.float32),
+ all_tokens=[],
+ user_prompt_tokens=[],
+ generation_tokens=[],
+ sink_range=(0, -1),
+ sink_weights=None,
+ metadata={"method": "attnlrp_multi_hop", "note": "empty_generation"},
+ )
+ return MultiHopAttnLRPResult(
+ raw_attributions=[empty_aggregate],
+ thinking_ratios=[0.0],
+ observation={"mask": torch.tensor([]), "base": torch.tensor([]),
+ "per_hop": [], "sum": torch.tensor([]), "avg": torch.tensor([])},
+ )
+
+ # Validate and set default spans
+ if sink_span is None:
+ sink_span = (0, gen_len - 1)
+ sink_start, sink_end = sink_span
+ if sink_start < 0 or sink_end < sink_start or sink_end >= gen_len:
+ raise ValueError(f"Invalid sink_span ({sink_start}, {sink_end}) for gen_len={gen_len}.")
+
+ if thinking_span is None:
+ thinking_span = sink_span
+ think_start, think_end = thinking_span
+ if think_start < 0 or think_end < think_start or think_end >= gen_len:
+ raise ValueError(f"Invalid thinking_span ({think_start}, {think_end}) for gen_len={gen_len}.")
+
+ hop_count = max(0, int(n_hops))
+ ratio_enabled = norm_mode == "norm"
+ if neg_handling not in ("drop", "abs"):
+ raise ValueError("neg_handling must be 'drop' or 'abs'.")
+ if norm_mode not in ("norm", "no_norm"):
+ raise ValueError("norm_mode must be 'norm' or 'no_norm'.")
+
+ # Compute base attribution from sink_span
+ base_attr = self.calculate_attnlrp_span_aggregate(
+ prompt,
+ target=target,
+ sink_start=sink_start,
+ sink_end=sink_end,
+ sink_weights=None,
+ normalize_weights=ratio_enabled,
+ score_mode=score_mode,
+ )
+
+ def _postprocess_hop_vector(v: torch.Tensor) -> torch.Tensor:
+ v = torch.nan_to_num(v.to(dtype=torch.float32), nan=0.0)
+ if neg_handling == "drop":
+ v = v.clamp(min=0.0)
+ else:
+ v = v.abs()
+ if ratio_enabled:
+ denom = float(v.sum().item())
+ if denom > 0.0:
+ v = v / (denom + 1e-12)
+ else:
+ v = torch.zeros_like(v)
+ return v
+
+ token_total = _postprocess_hop_vector(base_attr.token_importance_total)
+ base_attr.token_importance_total = token_total
+ base_attr.metadata = dict(base_attr.metadata or {})
+ base_attr.metadata.update(
+ {
+ "neg_handling": neg_handling,
+ "norm_mode": norm_mode,
+ "ratio_enabled": ratio_enabled,
+ }
+ )
+
+ raw_attributions: List[AttnLRPSpanAggregate] = [base_attr]
+
+ # Get the stripped token importance vector (user_prompt + generation tokens)
+ T = token_total.shape[0] # This is user_prompt_len + gen_len after stripping
+ user_prompt_len = len(self.user_prompt_tokens)
+
+ # Build observation mask (in stripped token space)
+ # think_start/think_end are in generation-token indices
+ # In stripped space: thinking is at user_prompt_len + think_start : user_prompt_len + think_end + 1
+ # sink is at user_prompt_len + sink_start : user_prompt_len + sink_end + 1
+ if observation_mask is None:
+ obs_mask = torch.ones((T,), dtype=torch.float32)
+ # Mask out thinking span
+ think_start_stripped = user_prompt_len + think_start
+ think_end_stripped = user_prompt_len + think_end
+ obs_mask[think_start_stripped:min(think_end_stripped + 1, T)] = 0.0
+ # Mask out sink span
+ sink_start_stripped = user_prompt_len + sink_start
+ sink_end_stripped = user_prompt_len + sink_end
+ obs_mask[sink_start_stripped:min(sink_end_stripped + 1, T)] = 0.0
+ # Mask out anything after thinking span (future tokens)
+ if think_end_stripped + 1 < T:
+ obs_mask[think_end_stripped + 1:] = 0.0
+ else:
+ obs_mask_input = torch.as_tensor(observation_mask, dtype=torch.float32)
+ if obs_mask_input.numel() == gen_len:
+ # Expand to full stripped length
+ obs_mask = torch.ones((T,), dtype=torch.float32)
+ obs_mask[user_prompt_len:user_prompt_len + gen_len] = obs_mask_input
+ # Keep input tokens as 1 by default
+ elif obs_mask_input.numel() == T:
+ obs_mask = obs_mask_input.clone()
+ else:
+ raise ValueError(f"observation_mask must have length {gen_len} or {T}.")
+
+ # Compute base observation
+ base_obs = token_total.clone() * obs_mask
+ obs_accum = base_obs.clone()
+ per_hop_obs: List[torch.Tensor] = []
+
+ # Extract thinking slice weights for next hop
+ think_start_stripped = user_prompt_len + think_start
+ think_end_stripped = user_prompt_len + think_end
+ thinking_slice = token_total[think_start_stripped:think_end_stripped + 1].detach().clone()
+ if ratio_enabled:
+ thinking_mass = float(thinking_slice.sum().item())
+ if thinking_mass > 0.0:
+ w_thinking = thinking_slice / (thinking_mass + 1e-12)
+ else:
+ w_thinking = torch.zeros_like(thinking_slice)
+ total_mass = float(token_total.sum().item())
+ current_ratio = thinking_mass / (total_mass + 1e-12) if total_mass > 0 else 0.0
+ ratios: List[float] = [current_ratio]
+ else:
+ w_thinking = thinking_slice
+ current_ratio = 1.0
+ ratios = []
+
+ # Multi-hop iterations
+ for hop in range(1, hop_count + 1):
+ # Compute attribution from thinking span with weights from previous hop
+ hop_attr = self.calculate_attnlrp_span_aggregate(
+ prompt,
+ target=target,
+ sink_start=think_start,
+ sink_end=think_end,
+ sink_weights=w_thinking,
+ normalize_weights=False,
+ score_mode=score_mode,
+ )
+
+ hop_total = _postprocess_hop_vector(hop_attr.token_importance_total)
+ hop_attr.token_importance_total = hop_total
+ hop_attr.metadata = dict(hop_attr.metadata or {})
+ hop_attr.metadata.update(
+ {
+ "neg_handling": neg_handling,
+ "norm_mode": norm_mode,
+ "ratio_enabled": ratio_enabled,
+ }
+ )
+ raw_attributions.append(hop_attr)
+
+ # Compute observation for this hop (masked and weighted by current_ratio)
+ obs_only = hop_total * obs_mask * (current_ratio if ratio_enabled else 1.0)
+ obs_accum += obs_only
+ per_hop_obs.append(obs_only)
+
+ # Update weights for next hop
+ thinking_slice = hop_total[think_start_stripped:think_end_stripped + 1].detach().clone()
+ if ratio_enabled:
+ thinking_mass = float(thinking_slice.sum().item())
+ if thinking_mass > 0.0:
+ w_thinking = thinking_slice / (thinking_mass + 1e-12)
+ else:
+ w_thinking = torch.zeros_like(thinking_slice)
+ hop_total_mass = float(hop_total.sum().item())
+ if hop_total_mass <= 0.0:
+ current_ratio = 0.0
+ else:
+ current_ratio *= thinking_mass / (hop_total_mass + 1e-12)
+ ratios.append(current_ratio)
+ else:
+ w_thinking = thinking_slice
+
+ # Compute average observation
+        obs_avg = obs_accum / float(hop_count) if hop_count > 0 else obs_accum
+
+ observation = {
+ "mask": obs_mask,
+ "base": base_obs,
+ "per_hop": per_hop_obs,
+ "sum": obs_accum,
+ "avg": obs_avg,
+ }
+
+ return MultiHopAttnLRPResult(
+ raw_attributions=raw_attributions,
+ thinking_ratios=ratios,
+ observation=observation,
+ )
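The ratio bookkeeping in the `norm_mode="norm"` path above reduces to a few lines of plain Python: normalize each hop vector, credit hop h's observable mass discounted by the product of thinking-mass fractions from earlier hops, then update the running ratio. A simplified sketch with hypothetical names (`run_hops` takes precomputed hop vectors as lists; it does not perform the AttnLRP backward pass):

```python
def run_hops(hop_vectors, think_span, obs_mask):
    """Accumulate observed relevance across hops, discounting hop h by the
    product of thinking-mass fractions from hops 0..h-1."""
    s, e = think_span
    obs_sum = [0.0] * len(hop_vectors[0])
    ratio = 1.0  # fraction of the original relevance that reached this hop
    for h, vec in enumerate(hop_vectors):
        total = sum(vec) or 1.0
        vec = [v / total for v in vec]      # per-hop global normalization
        weight = 1.0 if h == 0 else ratio   # hop 0 is the base attribution
        for i, v in enumerate(vec):
            obs_sum[i] += v * obs_mask[i] * weight
        ratio *= sum(vec[s:e + 1])          # mass that flows into the next hop
    return obs_sum

# Base hop puts half its mass on observable token 0 and half on the thinking
# span; hop 1 routes everything back to token 0.
print(run_hops([[0.5, 0.25, 0.25], [1.0, 0.0, 0.0]], (1, 2), [1, 0, 0]))
# → [1.0, 0.0, 0.0]
```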
+
+ def calculate_attnlrp_aggregated_multi_hop(
+ self,
+ prompt: str,
+ target: Optional[str] = None,
+ *,
+ sink_span: Optional[Tuple[int, int]] = None,
+ thinking_span: Optional[Tuple[int, int]] = None,
+ n_hops: int = 1,
+ neg_handling: Literal["drop", "abs"] = "drop",
+ norm_mode: Literal["norm", "no_norm"] = "norm",
+ score_mode: Optional[Literal["max", "generated"]] = None,
+ ) -> LLMAttributionResult:
+ """Calculate multi-hop aggregated AttnLRP attribution.
+
+ This is a convenience wrapper around calculate_attnlrp_multi_hop that
+ returns an LLMAttributionResult compatible with the evaluation pipeline.
+
+ The returned attribution uses the observation["sum"] vector which
+ accumulates attribution to input tokens across all hops.
+
+ Parameters
+ ----------
+ prompt : str
+ The input prompt text
+ target : str, optional
+ The target response text. If None, the model generates a response.
+ sink_span : Tuple[int, int], optional
+ (start, end) indices in generation tokens for the output span.
+ thinking_span : Tuple[int, int], optional
+ (start, end) indices in generation tokens for the reasoning span.
+ n_hops : int
+ Number of recursive hops. Default: 1
+ neg_handling : Literal["drop", "abs"]
+ How to enforce non-negativity after each hop output.
+ norm_mode : Literal["norm", "no_norm"]
+ "norm": per-hop global normalize + thinking-span normalize + enable hop ratios.
+ "no_norm": disable global normalize, thinking normalize, and hop ratios.
+ score_mode : Literal["max", "generated"], optional
+ "max": use max logit at each position (original behavior).
+ "generated": use the logit of the generated/target token at each position.
+ Default: auto ("generated" if target is provided, else "max").
+
+ Returns
+ -------
+ LLMAttributionResult
+ Attribution result compatible with the standard evaluation pipeline
+ """
+ # Get the generation first to set up tokens
+ if target is None:
+ self.response(prompt)
+ else:
+ self.target_response(prompt, target)
+
+ gen_len = int(self.generation_ids.shape[1])
+
+ # Handle empty generation
+ if gen_len == 0:
+ empty_scores = torch.zeros((0, len(self.user_prompt_tokens)), dtype=torch.float32)
+ return LLMAttributionResult(
+ self.tokenizer,
+ empty_scores,
+ self.user_prompt_tokens,
+ self.generation_tokens,
+ all_tokens=self.user_prompt_tokens + self.generation_tokens,
+ metadata={"method": "attnlrp_aggregated_multi_hop", "note": "empty_generation"},
+ )
+
+ # Compute multi-hop attribution
+ multi_hop = self.calculate_attnlrp_multi_hop(
+ prompt,
+ target=target,
+ sink_span=sink_span,
+ thinking_span=thinking_span,
+ n_hops=n_hops,
+ neg_handling=neg_handling,
+ norm_mode=norm_mode,
+ score_mode=score_mode,
+ )
+
+ # Use the accumulated observation as the relevance vector
+ # This gives attribution to input tokens, accumulated across hops
+ relevance_vector = multi_hop.observation["sum"]
+
+ user_prompt_len = len(self.user_prompt_tokens)
+ gen_token_len = len(self.generation_tokens)
+ expected_len = user_prompt_len + gen_token_len
+
+ # Build score matrix
+ score_array = torch.full((gen_len, expected_len), torch.nan, dtype=torch.float32)
+
+ # For each generation position, set the attribution
+ for step in range(gen_len):
+ gen_pos = user_prompt_len + step
+ score_array[step, :gen_pos] = relevance_vector[:gen_pos]
+
+ metadata = {
+ "method": "attnlrp_aggregated_multi_hop",
+ "model_type": self.model_type,
+ "n_hops": n_hops,
+ "sink_span": sink_span,
+ "thinking_span": thinking_span,
+ "neg_handling": neg_handling,
+ "norm_mode": norm_mode,
+ "ratio_enabled": norm_mode == "norm",
+ "thinking_ratios": multi_hop.thinking_ratios,
+ "multi_hop_result": multi_hop,
+ }
+
+ all_tokens = self.user_prompt_tokens + self.generation_tokens
+
+ return LLMAttributionResult(
+ self.tokenizer,
+ score_array,
+ self.user_prompt_tokens,
+ self.generation_tokens,
+ all_tokens=all_tokens,
+ metadata=metadata,
+ )
diff --git a/flashtrace/baselines/__init__.py b/flashtrace/baselines/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..9b7b42ec78321f210c7f43d73a7995d40d992b20
--- /dev/null
+++ b/flashtrace/baselines/__init__.py
@@ -0,0 +1,5 @@
+"""Baseline attribution methods for FlashTrace."""
+
+from .attnlrp import LLMLRPAttribution
+
+__all__ = ["LLMLRPAttribution"]
diff --git a/flashtrace/baselines/attnlrp.py b/flashtrace/baselines/attnlrp.py
new file mode 100644
index 0000000000000000000000000000000000000000..b439d705f3551a658a3f0261c74b6aff9370c8cc
--- /dev/null
+++ b/flashtrace/baselines/attnlrp.py
@@ -0,0 +1,12 @@
+"""AttnLRP baseline API."""
+
+from flashtrace.attribution import AttnLRPSpanAggregate, LLMLRPAttribution, MultiHopAttnLRPResult
+from flashtrace.lrp_patches import detect_model_type, lrp_context
+
+__all__ = [
+ "AttnLRPSpanAggregate",
+ "LLMLRPAttribution",
+ "MultiHopAttnLRPResult",
+ "detect_model_type",
+ "lrp_context",
+]
diff --git a/flashtrace/cli.py b/flashtrace/cli.py
new file mode 100644
index 0000000000000000000000000000000000000000..e0a4298f65fd9d875dc538880b67b89b75c5f093
--- /dev/null
+++ b/flashtrace/cli.py
@@ -0,0 +1,89 @@
+from __future__ import annotations
+
+import argparse
+from pathlib import Path
+from typing import Sequence
+
+from .model_io import load_model_and_tokenizer
+from .tracer import FlashTrace
+
+
+def parse_span(value: str | None) -> tuple[int, int] | None:
+ if value is None:
+ return None
+ parts = str(value).split(":")
+ if len(parts) != 2:
+ raise ValueError("Span must use START:END format.")
+ try:
+ start = int(parts[0])
+ end = int(parts[1])
+ except ValueError as exc:
+ raise ValueError("Span bounds must be integers.") from exc
+ if start < 0 or end < start:
+ raise ValueError("Span must satisfy 0 <= START <= END.")
+ return start, end
+
+
+def build_parser() -> argparse.ArgumentParser:
+ parser = argparse.ArgumentParser(prog="flashtrace", description="Trace language model outputs with FlashTrace.")
+ sub = parser.add_subparsers(dest="command")
+
+ trace = sub.add_parser("trace", help="Run attribution for a prompt and target.")
+ trace.add_argument("--model", required=True, help="Hugging Face model id or local path.")
+ trace.add_argument("--prompt", required=True, help="UTF-8 text file containing the prompt.")
+ trace.add_argument("--target", help="UTF-8 text file containing the target response.")
+ trace.add_argument("--output-span", help="Inclusive generation-token span START:END.")
+ trace.add_argument("--reasoning-span", help="Inclusive generation-token span START:END.")
+ trace.add_argument("--hops", type=int, default=1)
+ trace.add_argument("--method", default="flashtrace", choices=["flashtrace", "ifr-span", "ifr-matrix"])
+ trace.add_argument("--html", help="Write standalone HTML heatmap.")
+ trace.add_argument("--json", help="Write JSON trace.")
+ trace.add_argument("--device-map", default="auto")
+ trace.add_argument("--dtype", default="auto", choices=["auto", "float16", "bfloat16", "float32"])
+ trace.add_argument("--chunk-tokens", type=int, default=128)
+ trace.add_argument("--sink-chunk-tokens", type=int, default=32)
+ trace.add_argument("--recompute-attention", action="store_true")
+ trace.add_argument("--use-chat-template", action="store_true", help="Format prompts with the tokenizer chat template.")
+ return parser
+
+
+def _read_text(path: str | None) -> str | None:
+ if path is None:
+ return None
+ return Path(path).read_text(encoding="utf-8")
+
+
+def _run_trace(args: argparse.Namespace) -> int:
+ model, tokenizer = load_model_and_tokenizer(args.model, device_map=args.device_map, dtype=args.dtype)
+ tracer = FlashTrace(
+ model,
+ tokenizer,
+ chunk_tokens=args.chunk_tokens,
+ sink_chunk_tokens=args.sink_chunk_tokens,
+ recompute_attention=args.recompute_attention,
+ use_chat_template=args.use_chat_template,
+ )
+ result = tracer.trace(
+ prompt=_read_text(args.prompt) or "",
+ target=_read_text(args.target),
+ output_span=parse_span(args.output_span),
+ reasoning_span=parse_span(args.reasoning_span),
+ hops=args.hops,
+ method=args.method,
+ )
+ for item in result.topk_inputs(20):
+ print(f"{item.index}\t{item.score:.6f}\t{item.token!r}")
+ if args.json:
+ result.to_json(args.json)
+ if args.html:
+ result.to_html(args.html)
+ return 0
+
+
+def main(argv: Sequence[str] | None = None) -> int:
+ parser = build_parser()
+ args = parser.parse_args(argv)
+ if args.command == "trace":
+ return _run_trace(args)
+ parser.print_help()
+ return 0
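For a quick sanity check of the CLI's span grammar, here is a condensed standalone copy of `parse_span` (the error message for non-integer bounds differs slightly from the CLI version):

```python
def parse_span(value):
    """Inclusive START:END span parser (same logic as flashtrace.cli.parse_span)."""
    if value is None:
        return None
    parts = str(value).split(":")
    if len(parts) != 2:
        raise ValueError("Span must use START:END format.")
    start, end = int(parts[0]), int(parts[1])
    if start < 0 or end < start:
        raise ValueError("Span must satisfy 0 <= START <= END.")
    return start, end

print(parse_span("3:7"))  # → (3, 7)
print(parse_span(None))   # → None
```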
diff --git a/flashtrace/core.py b/flashtrace/core.py
new file mode 100644
index 0000000000000000000000000000000000000000..74c664167dde234e0558a7539313ebc9b8a7ddf5
--- /dev/null
+++ b/flashtrace/core.py
@@ -0,0 +1,849 @@
+"""Information Flow Routes (IFR) utilities integrated for CAGE.
+
+This module is adapted from the original agenttrace implementation and provides the
+core tensor utilities required to compute IFR token attributions. It exposes
+model-agnostic helpers that assume a Llama/Qwen style stack with the attributes
+used below. The code is intentionally self-contained so it can be imported
+directly by the attribution pipeline without depending on the agenttrace repo.
+"""
+
+from __future__ import annotations
+
+import math
+from dataclasses import dataclass
+from typing import Dict, List, NamedTuple, Optional, Sequence, Tuple
+
+import torch
+import torch.nn as nn
+from torch.utils.hooks import RemovableHandle
+from tqdm import tqdm
+
+
+@dataclass
+class ModelMetadata:
+ """Structural details extracted from the transformer decoder stack."""
+
+ decoder: nn.Module
+ layers: Sequence[nn.Module]
+ n_layers: int
+ d_model: int
+ n_heads_q: int
+ n_kv_heads: int
+ head_dim: int
+ group_size: int
+ rotary_emb: Optional[nn.Module] = None
+
+
+def extract_model_metadata(model: nn.Module) -> ModelMetadata:
+ """Derive metadata for models with Llama/Qwen style decoder blocks."""
+
+ if not hasattr(model, "model"):
+ raise AttributeError(
+ "Expected a causal LM with `model` attribute exposing the decoder stack."
+ )
+
+ decoder = model.model
+ if not hasattr(decoder, "layers"):
+ raise AttributeError("Decoder does not expose `layers`; IFR assumes a layer list.")
+
+ layers: Sequence[nn.Module] = decoder.layers
+ n_layers = len(layers)
+ if n_layers == 0:
+ raise ValueError("Decoder contains no layers; cannot run IFR.")
+
+ d_model = getattr(model.config, "hidden_size", None)
+ if d_model is None:
+ raise AttributeError("Model config is missing `hidden_size`, required for IFR.")
+
+ try:
+ n_heads_q = model.config.num_attention_heads
+ n_kv_heads = model.config.num_key_value_heads
+ except AttributeError:
+ first_attn = layers[0].self_attn
+ n_heads_q = getattr(first_attn, "num_heads")
+        # `num_key_value_groups` on HF attention modules is the heads-per-KV-group
+        # ratio, not the KV head count, so read `num_key_value_heads` here.
+        n_kv_heads = getattr(first_attn, "num_key_value_heads", n_heads_q)
+
+    if n_heads_q % n_kv_heads != 0:
+        raise ValueError("IFR assumes grouped-query attention with integer group size.")
+    group_size = n_heads_q // n_kv_heads
+
+ head_dim = getattr(model.config, "head_dim", None)
+ if head_dim is None:
+ first_attn = layers[0].self_attn
+ head_dim = getattr(first_attn, "head_dim", None)
+ if head_dim is None:
+ # Fallback: infer from V projection rows.
+ v_rows = layers[0].self_attn.v_proj.weight.shape[0]
+ head_dim = v_rows // n_kv_heads
+
+ rotary_emb = getattr(decoder, "rotary_emb", None)
+ if rotary_emb is None:
+ rotary_emb = getattr(layers[0].self_attn, "rotary_emb", None)
+
+ return ModelMetadata(
+ decoder=decoder,
+ layers=layers,
+ n_layers=n_layers,
+ d_model=d_model,
+ n_heads_q=n_heads_q,
+ n_kv_heads=n_kv_heads,
+ head_dim=head_dim,
+ group_size=group_size,
+ rotary_emb=rotary_emb,
+ )
+
+
+def build_weight_pack(metadata: ModelMetadata, model_dtype: torch.dtype) -> List[Dict[str, torch.Tensor | nn.Module]]:
+ """Collect per-layer tensors/modules required for IFR."""
+
+ weight_pack: List[Dict[str, torch.Tensor | nn.Module]] = []
+ for layer in metadata.layers:
+ attn = layer.self_attn
+ pack: Dict[str, torch.Tensor | nn.Module] = {
+ "v_w": attn.v_proj.weight.detach().to(dtype=model_dtype),
+ "o_w": attn.o_proj.weight.detach().to(dtype=model_dtype),
+ "q_w": attn.q_proj.weight.detach().to(dtype=model_dtype),
+ "k_w": attn.k_proj.weight.detach().to(dtype=model_dtype),
+ "in_ln": layer.input_layernorm,
+ "post_attn_ln": layer.post_attention_layernorm,
+ "mlp": layer.mlp,
+ }
+ q_bias = getattr(attn.q_proj, "bias", None)
+ k_bias = getattr(attn.k_proj, "bias", None)
+ if q_bias is not None:
+ pack["q_bias"] = q_bias.detach().to(dtype=model_dtype)
+ if k_bias is not None:
+ pack["k_bias"] = k_bias.detach().to(dtype=model_dtype)
+ weight_pack.append(pack)
+ return weight_pack
+
+
+# ---------------------------------------------------------------------------
+# Attention recomputation utilities
+# ---------------------------------------------------------------------------
+
+def _rotate_half(x: torch.Tensor) -> torch.Tensor:
+    """Rotate the last dimension by half (standard RoPE helper)."""
+ x1 = x[..., : x.shape[-1] // 2]
+ x2 = x[..., x.shape[-1] // 2 :]
+ return torch.cat((-x2, x1), dim=-1)
+
+
+def _apply_rotary_pos_emb(
+ q: torch.Tensor, k: torch.Tensor, cos: torch.Tensor, sin: torch.Tensor,
+) -> Tuple[torch.Tensor, torch.Tensor]:
+ """Apply rotary position embeddings to Q and K tensors."""
+    # cos/sin shape from HF: [1, S, head_dim] or [S, head_dim] -> broadcast to [1, 1, S, head_dim]
+ if cos.dim() == 2:
+ cos = cos.unsqueeze(0).unsqueeze(0)
+ elif cos.dim() == 3:
+ cos = cos.unsqueeze(1)
+ if sin.dim() == 2:
+ sin = sin.unsqueeze(0).unsqueeze(0)
+ elif sin.dim() == 3:
+ sin = sin.unsqueeze(1)
+ q_embed = (q * cos) + (_rotate_half(q) * sin)
+ k_embed = (k * cos) + (_rotate_half(k) * sin)
+ return q_embed, k_embed
+
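As a sanity check of the RoPE formula above: for a 2-dimensional head, `q*cos + rotate_half(q)*sin` is an ordinary 2-D rotation. A pure-Python illustration (hypothetical helper names; the module applies the batched torch version):

```python
import math

def rotate_half(x):
    # [-x2, x1] where x = [x1 | x2] is split at the midpoint (as in _rotate_half)
    half = len(x) // 2
    return [-v for v in x[half:]] + x[:half]

def apply_rope(x, theta):
    # q*cos + rotate_half(q)*sin, the same formula as _apply_rotary_pos_emb
    c, s = math.cos(theta), math.sin(theta)
    return [xi * c + ri * s for xi, ri in zip(x, rotate_half(x))]

out = apply_rope([1.0, 0.0], math.pi / 2)  # rotate (1, 0) by 90 degrees
# out is approximately [0.0, 1.0], and the vector norm is preserved
```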
+
+@torch.no_grad()
+def recompute_layer_attention(
+ x_prev: torch.Tensor,
+ layer_weights: Dict[str, torch.Tensor | nn.Module],
+ rotary_emb: nn.Module,
+ params: IFRParameters,
+) -> torch.Tensor:
+ """Recompute attention weights for a single layer from cached activations.
+
+ Returns attention weights of shape ``[n_heads_q, S, S]`` (post-softmax, causal masked).
+ This avoids the need to store all layers' attention maps simultaneously.
+ """
+ device = x_prev.device
+ model_dtype = params.model_dtype
+ S = x_prev.shape[0]
+ n_heads_q = params.n_heads_q
+ n_kv_heads = params.n_kv_heads
+ head_dim = params.head_dim
+ group_size = params.group_size
+
+ in_ln_mod = layer_weights["in_ln"]
+ q_w = layer_weights["q_w"].to(device, non_blocking=True)
+ k_w = layer_weights["k_w"].to(device, non_blocking=True)
+
+ # Apply layernorm (actual, not linearized) to get the true normed input
+ x_normed = in_ln_mod(x_prev.unsqueeze(0)).squeeze(0).to(model_dtype)
+
+ # Project Q and K
+ Q = torch.matmul(x_normed, q_w.T) # [S, n_heads_q * head_dim]
+ K = torch.matmul(x_normed, k_w.T) # [S, n_kv_heads * head_dim]
+
+ q_bias = layer_weights.get("q_bias")
+ k_bias = layer_weights.get("k_bias")
+ if q_bias is not None:
+ Q = Q + q_bias.to(device, non_blocking=True)
+ if k_bias is not None:
+ K = K + k_bias.to(device, non_blocking=True)
+
+ # Reshape to [1, n_heads, S, head_dim]
+ Q = Q.view(S, n_heads_q, head_dim).transpose(0, 1).unsqueeze(0)
+ K = K.view(S, n_kv_heads, head_dim).transpose(0, 1).unsqueeze(0)
+
+ # Apply rotary position embeddings
+ position_ids = torch.arange(S, device=device).unsqueeze(0)
+ cos, sin = rotary_emb(K, position_ids)
+ Q, K = _apply_rotary_pos_emb(Q, K, cos, sin)
+
+ # GQA: repeat K for grouped-query attention
+ K = K.repeat_interleave(group_size, dim=1) # [1, n_heads_q, S, head_dim]
+
+ # Compute attention scores
+ attn_weights = torch.matmul(Q, K.transpose(2, 3)) / math.sqrt(head_dim)
+
+ # Apply causal mask
+ causal_mask = torch.triu(
+ torch.full((S, S), float("-inf"), device=device, dtype=attn_weights.dtype),
+ diagonal=1,
+ )
+ attn_weights = attn_weights + causal_mask.unsqueeze(0).unsqueeze(0)
+
+ # Softmax
+ attn_weights = torch.nn.functional.softmax(attn_weights, dim=-1, dtype=torch.float32)
+ attn_weights = attn_weights.to(model_dtype)
+
+ return attn_weights[0] # [n_heads_q, S, S]
+
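The mask-then-softmax step in `recompute_layer_attention` guarantees each query attends only to itself and earlier keys. A single-row illustration without torch (`causal_softmax_row` is a hypothetical helper):

```python
import math

def causal_softmax_row(scores, i):
    # positions j > i receive -inf before softmax, as in recompute_layer_attention
    masked = [s if j <= i else float("-inf") for j, s in enumerate(scores)]
    m = max(masked)
    exps = [math.exp(v - m) for v in masked]
    total = sum(exps)
    return [e / total for e in exps]

row = causal_softmax_row([0.5, 1.0, 2.0, 3.0], 1)
# future positions get exactly zero weight; the visible ones renormalize to sum 1
```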
+
+@dataclass
+class IFRParameters:
+ """Static configuration describing model geometry and chunk sizes."""
+
+ n_layers: int
+ n_heads_q: int
+ n_kv_heads: int
+ head_dim: int
+ group_size: int
+ d_model: int
+ sequence_length: int
+ model_dtype: torch.dtype
+ chunk_tokens: int
+ sink_chunk_tokens: int
+
+
+@dataclass
+class IFRLayerResult:
+ """Layer-level contributions for a single sink position."""
+
+ e_attn_tokens: torch.Tensor
+ e_resid_attn: float
+ head_importance: torch.Tensor
+ e_ffn: float
+ e_resid_ffn: float
+
+
+@dataclass
+class IFRAggregate:
+ """Aggregate IFR statistics for one or more sink positions."""
+
+ per_layer: List[IFRLayerResult]
+ token_importance_total: torch.Tensor
+ head_importance_total: torch.Tensor
+ ffn_importance_per_layer: torch.Tensor
+ resid_ffn_importance_per_layer: torch.Tensor
+
+
+@dataclass
+class IFRAllPositions:
+ """Batch of IFR outputs across a contiguous range of sink positions."""
+
+ token_importance_matrix: torch.Tensor
+ head_importance_matrix: torch.Tensor
+ resid_attn_fraction_total: torch.Tensor
+ sink_indices: List[int]
+ per_layer_results: Optional[List[List[IFRLayerResult]]]
+ note: str = ""
+
+
+class MultiHopIFRResult(NamedTuple):
+ """Container returned by ``compute_multi_hop_ifr``."""
+
+ raw_attributions: List[IFRAggregate]
+ thinking_ratios: List[float]
+ observation: Dict[str, torch.Tensor | List[torch.Tensor]]
+
+
+@torch.no_grad()
+def attach_hooks(
+ layers: Sequence[nn.Module],
+ model_dtype: torch.dtype,
+) -> Tuple[Dict[str, List[Optional[torch.Tensor]]], List[RemovableHandle]]:
+ """Attach forward hooks to capture residual streams and MLP activations."""
+
+ cache: Dict[str, List[Optional[torch.Tensor]]] = {
+ "pre_attn_resid": [None for _ in range(len(layers))],
+ "mid_resid": [None for _ in range(len(layers))],
+ "post_resid": [None for _ in range(len(layers))],
+ "mlp_out": [None for _ in range(len(layers))],
+ }
+ hooks: List[RemovableHandle] = []
+
+ def make_pre_ln_hook(li: int):
+ def hook(module: nn.Module, inputs: Tuple[torch.Tensor, ...], output: torch.Tensor) -> None:
+ x_in = inputs[0]
+ cache["pre_attn_resid"][li] = x_in.detach().to(model_dtype)
+
+ return hook
+
+ def make_post_attn_ln_pre_hook(li: int):
+ def hook(module: nn.Module, inputs: Tuple[torch.Tensor, ...]) -> None:
+ x_mid = inputs[0]
+ cache["mid_resid"][li] = x_mid.detach().to(model_dtype)
+
+ return hook
+
+ def make_mlp_hook(li: int):
+ def hook(module: nn.Module, inputs: Tuple[torch.Tensor, ...], output: torch.Tensor) -> None:
+ cache["mlp_out"][li] = output.detach().to(model_dtype)
+
+ return hook
+
+ def make_block_output_hook(li: int):
+ def hook(module: nn.Module, inputs: Tuple[torch.Tensor, ...], output) -> None:
+ x_out = output[0] if isinstance(output, (tuple, list)) else output
+ cache["post_resid"][li] = x_out.detach().to(model_dtype)
+
+ return hook
+
+ for li, layer in enumerate(layers):
+ hooks.append(layer.input_layernorm.register_forward_hook(make_pre_ln_hook(li)))
+ hooks.append(
+ layer.post_attention_layernorm.register_forward_pre_hook(make_post_attn_ln_pre_hook(li))
+ )
+ hooks.append(layer.mlp.register_forward_hook(make_mlp_hook(li)))
+ hooks.append(layer.register_forward_hook(make_block_output_hook(li)))
+
+ return cache, hooks
+
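+# Illustrative usage (HF Llama-style module layout assumed; adapt attribute
+# names to the actual architecture):
+#   cache, hooks = attach_hooks(model.model.layers, model_dtype=torch.bfloat16)
+#   with torch.no_grad():
+#       model(input_ids, output_attentions=True)
+#   # cache["pre_attn_resid"][li], cache["mid_resid"][li], ... now hold the
+#   # captured tensors for every layer li.
+#   for h in hooks:
+#       h.remove()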
+
+def linearize_norm(module: nn.Module, x: torch.Tensor) -> torch.Tensor:
+ """Linearize LayerNorm/RMSNorm to obtain per-token scaling vectors."""
+
+ if x.dtype != torch.float32:
+ x = x.float()
+
+ if hasattr(module, "weight") and module.weight is not None:
+ w = module.weight.detach().to(device=x.device, dtype=torch.float32).view(1, 1, -1)
+ else:
+ w = torch.ones(1, 1, x.shape[-1], dtype=torch.float32, device=x.device)
+
+ name = module.__class__.__name__.lower()
+ if name.endswith("rmsnorm"):
+ eps = getattr(module, "eps", 1e-6)
+ rms = (x.pow(2).mean(dim=-1, keepdim=True) + eps).sqrt()
+ return w / rms
+
+ eps = getattr(module, "eps", 1e-5)
+ mu = x.mean(dim=-1, keepdim=True)
+ sigma = ((x - mu).pow(2).mean(dim=-1, keepdim=True) + eps).sqrt()
+ return w / sigma
+
+
+def l1_norm(x: torch.Tensor) -> torch.Tensor:
+ """Return the L1 norm reduced over the last dimension."""
+
+ return x.abs().sum(dim=-1)
+
+
+def proximity(a: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
+ """Compute proximity contributions used in IFR attribution."""
+
+ return torch.clamp(-l1_norm(a - x) + l1_norm(x), min=0.0)
+
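+# Illustrative values: proximity(a, x) = max(0, ||x||_1 - ||a - x||_1) is
+# maximal when a == x and falls to zero once a drifts far from x in L1 distance:
+#   proximity(torch.tensor([1.0, 2.0]), torch.tensor([1.0, 2.0]))  # -> tensor(3.)
+#   proximity(torch.tensor([0.0, 0.0]), torch.tensor([1.0, 2.0]))  # -> tensor(0.)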
+
+@torch.no_grad()
+def compute_ifr_for_position(
+ focus_idx: int,
+ cache: Dict[str, List[Optional[torch.Tensor]]],
+ attentions: Optional[Sequence[torch.Tensor]],
+ weight_pack: Sequence[Dict[str, torch.Tensor | nn.Module]],
+ params: IFRParameters,
+ renorm_threshold: float = 0.0,
+ rotary_emb: Optional[nn.Module] = None,
+) -> IFRAggregate:
+ """Convenience wrapper computing IFR for a single sink position."""
+
+ all_ifr = compute_ifr_for_all_positions(
+ cache=cache,
+ attentions=attentions,
+ weight_pack=weight_pack,
+ params=params,
+ renorm_threshold=renorm_threshold,
+ sink_range=(focus_idx, focus_idx),
+ return_layerwise=True,
+ rotary_emb=rotary_emb,
+ )
+
+ token_total_cpu = all_ifr.token_importance_matrix[0]
+ head_total_cpu = all_ifr.head_importance_matrix[0]
+ per_layer = all_ifr.per_layer_results[0] if all_ifr.per_layer_results is not None else []
+ ffn_per_layer = torch.tensor([layer.e_ffn for layer in per_layer], dtype=torch.float32)
+ resid_ffn_per_layer = torch.tensor(
+ [layer.e_resid_ffn for layer in per_layer], dtype=torch.float32
+ )
+
+ return IFRAggregate(
+ per_layer=per_layer,
+ token_importance_total=token_total_cpu,
+ head_importance_total=head_total_cpu,
+ ffn_importance_per_layer=ffn_per_layer,
+ resid_ffn_importance_per_layer=resid_ffn_per_layer,
+ )
+
+
+@torch.no_grad()
+def compute_ifr_sentence_aggregate(
+ sink_start: int,
+ sink_end: int,
+ cache: Dict[str, List[Optional[torch.Tensor]]],
+ attentions: Optional[Sequence[torch.Tensor]],
+ weight_pack: Sequence[Dict[str, torch.Tensor | nn.Module]],
+ params: IFRParameters,
+ renorm_threshold: float = 0.0,
+ sink_weights: Optional[torch.Tensor] = None,
+ rotary_emb: Optional[nn.Module] = None,
+) -> IFRAggregate:
+ """Aggregate IFR contributions over an inclusive sink span [sink_start, sink_end]."""
+
+ assert 0 <= sink_start <= sink_end < params.sequence_length, "Invalid sink span."
+ sink_end_exclusive = sink_end + 1
+
+ n_layers = params.n_layers
+ n_heads_q = params.n_heads_q
+ n_kv_heads = params.n_kv_heads
+ group_size = params.group_size
+ head_dim = params.head_dim
+ T = params.sequence_length
+ model_dtype = params.model_dtype
+
+ per_layer: List[IFRLayerResult] = []
+ head_total_cpu = torch.zeros(n_heads_q, dtype=torch.float32)
+ token_total_cpu = torch.zeros(T, dtype=torch.float32)
+ ffn_per_layer = torch.zeros(n_layers, dtype=torch.float32)
+ resid_ffn_per_layer = torch.zeros(n_layers, dtype=torch.float32)
+
+ J_max = sink_end_exclusive
+
+ for li in range(n_layers):
+ x_prev_full = cache["pre_attn_resid"][li]
+ x_mid_full = cache["mid_resid"][li]
+ x_out_full = cache["post_resid"][li]
+ mlp_out_full = cache["mlp_out"][li]
+
+ assert x_prev_full is not None
+ assert x_mid_full is not None
+ assert x_out_full is not None
+ assert mlp_out_full is not None
+
+ x_prev = x_prev_full[0]
+ x_mid = x_mid_full[0]
+ x_out = x_out_full[0]
+ mlp_out = mlp_out_full[0]
+ layer_device = x_prev.device
+
+ if x_mid.device != layer_device:
+ x_mid = x_mid.to(layer_device, non_blocking=True)
+ if x_out.device != layer_device:
+ x_out = x_out.to(layer_device, non_blocking=True)
+ if mlp_out.device != layer_device:
+ mlp_out = mlp_out.to(layer_device, non_blocking=True)
+
+ if attentions is not None:
+ attn_li = attentions[li][0]
+ if attn_li.device != layer_device or attn_li.dtype != model_dtype:
+ attn_li = attn_li.to(device=layer_device, dtype=model_dtype, non_blocking=True)
+ else:
+ assert rotary_emb is not None, "rotary_emb is required when attentions is None"
+ attn_li = recompute_layer_attention(x_prev, weight_pack[li], rotary_emb, params)
+
+ v_w = weight_pack[li]["v_w"].to(device=layer_device, non_blocking=True)
+ o_w = weight_pack[li]["o_w"].to(device=layer_device, non_blocking=True)
+ in_ln_mod = weight_pack[li]["in_ln"]
+
+ if sink_weights is not None:
+ w = sink_weights.to(layer_device).to(model_dtype)
+ if w.numel() != (sink_end_exclusive - sink_start):
+ raise ValueError("sink_weights length must equal number of sink positions.")
+ w = w / (w.sum() + 1e-12)
+ w_f32 = w.to(torch.float32)
+ xS = (
+ x_mid[sink_start:sink_end_exclusive]
+ .to(torch.float32)
+ .mul(w_f32.view(-1, 1))
+ .sum(dim=0)
+ )
+ y_resid_S = (
+ x_prev[sink_start:sink_end_exclusive]
+ .to(torch.float32)
+ .mul(w_f32.view(-1, 1))
+ .sum(dim=0)
+ )
+ else:
+ xS = x_mid[sink_start:sink_end_exclusive].to(torch.float32).sum(dim=0)
+ y_resid_S = x_prev[sink_start:sink_end_exclusive].to(torch.float32).sum(dim=0)
+ xS_l1 = xS.abs().sum()
+ resid_attn_prox_S = torch.clamp(xS_l1 - (y_resid_S - xS).abs().sum(), min=0.0)
+
+ s_prev = linearize_norm(in_ln_mod, x_prev.unsqueeze(0)).squeeze(0)
+ x_prev_lin = x_prev.float() * s_prev
+ V_all = torch.matmul(x_prev_lin.to(model_dtype), v_w.T)
+ V_kv = V_all.view(T, n_kv_heads, head_dim).contiguous()
+ V_q = V_kv.repeat_interleave(group_size, dim=1)
+ O_blocks = o_w.view(params.d_model, n_heads_q, head_dim).permute(1, 2, 0).contiguous()
+
+ P = sink_end_exclusive - sink_start
+ alpha_slice = attn_li[:, sink_start:sink_end_exclusive, :J_max]
+ i_abs = torch.arange(sink_start, sink_end_exclusive, device=layer_device).view(P, 1)
+ j_abs = torch.arange(0, J_max, device=layer_device).view(1, J_max)
+ mask = (j_abs <= i_abs).to(alpha_slice.dtype)
+ if sink_weights is not None:
+ w = sink_weights.to(layer_device).to(alpha_slice.dtype)
+ w = w / (w.sum() + 1e-12)
+ alpha_weight = alpha_slice * w.view(1, -1, 1)
+ alpha_sum = (alpha_weight * mask.unsqueeze(0)).sum(dim=1).contiguous()
+ else:
+ alpha_sum = (alpha_slice * mask.unsqueeze(0)).sum(dim=1).contiguous()
+
+ numer_tok_sum = torch.zeros((J_max,), device=layer_device, dtype=model_dtype)
+ numer_head_sum = torch.zeros((n_heads_q,), device=layer_device, dtype=model_dtype)
+
+ for j0 in range(0, J_max, params.chunk_tokens):
+ j1 = min(J_max, j0 + params.chunk_tokens)
+ V_chunk = V_q[j0:j1]
+ F_chunk = torch.einsum("jhd,hdk->jhk", V_chunk, O_blocks)
+ A_chunk = alpha_sum[:, j0:j1].permute(1, 0).unsqueeze(-1)
+ W_chunk = F_chunk * A_chunk
+ dist = (W_chunk.float() - xS).abs().sum(dim=-1)
+ prox = torch.clamp(xS_l1 - dist, min=0.0)
+ if renorm_threshold > 0.0:
+ prox = prox * (prox >= renorm_threshold)
+ numer_tok_sum[j0:j1] += prox.sum(dim=1).to(model_dtype)
+ numer_head_sum += prox.sum(dim=0).to(model_dtype)
+
+ denom_S = numer_tok_sum.float().sum() + resid_attn_prox_S + 1e-12
+ e_attn_tokens_full = torch.zeros((T,), dtype=torch.float32)
+ e_attn_tokens_full[:J_max] = (numer_tok_sum.float() / denom_S).to(torch.float32).cpu()
+ e_resid_attn_S = float((resid_attn_prox_S / denom_S).item())
+ head_importance_S = (numer_head_sum.float() / denom_S).to(torch.float32).cpu()
+
+ x_out_sum = x_out[sink_start:sink_end_exclusive].to(torch.float32).sum(dim=0)
+ y_ffn_sum = mlp_out[sink_start:sink_end_exclusive].to(torch.float32).sum(dim=0)
+ x_mid_sum = x_mid[sink_start:sink_end_exclusive].to(torch.float32).sum(dim=0)
+ prox_ffn_S = proximity(y_ffn_sum, x_out_sum)
+ prox_resid_ffn_S = proximity(x_mid_sum, x_out_sum)
+ if renorm_threshold > 0.0:
+ if prox_ffn_S < renorm_threshold:
+ prox_ffn_S = torch.zeros((), dtype=torch.float32, device=layer_device)
+ if prox_resid_ffn_S < renorm_threshold:
+ prox_resid_ffn_S = torch.zeros((), dtype=torch.float32, device=layer_device)
+ denom_ffn_S = prox_ffn_S + prox_resid_ffn_S + 1e-12
+ e_ffn_S = float((prox_ffn_S / denom_ffn_S).item())
+ e_resid_ffn_S = float((prox_resid_ffn_S / denom_ffn_S).item())
+
+ per_layer.append(
+ IFRLayerResult(
+ e_attn_tokens=e_attn_tokens_full,
+ e_resid_attn=e_resid_attn_S,
+ head_importance=head_importance_S,
+ e_ffn=e_ffn_S,
+ e_resid_ffn=e_resid_ffn_S,
+ )
+ )
+ token_total_cpu += e_attn_tokens_full
+ head_total_cpu += head_importance_S
+ ffn_per_layer[li] = e_ffn_S
+ resid_ffn_per_layer[li] = e_resid_ffn_S
+
+ return IFRAggregate(
+ per_layer=per_layer,
+ token_importance_total=token_total_cpu,
+ head_importance_total=head_total_cpu,
+ ffn_importance_per_layer=ffn_per_layer,
+ resid_ffn_importance_per_layer=resid_ffn_per_layer,
+ )
+
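+# Illustrative call (variable names hypothetical; cache/weight_pack/params built
+# as elsewhere in this module, after one forward pass with output_attentions):
+#   agg = compute_ifr_sentence_aggregate(
+#       sink_start=prompt_len, sink_end=prompt_len + 5,
+#       cache=cache, attentions=outputs.attentions,
+#       weight_pack=weight_pack, params=params,
+#   )
+#   agg.token_importance_total  # [T] float32, summed over layers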
+
+@torch.no_grad()
+def compute_multi_hop_ifr(
+ sink_start: int,
+ sink_end: int,
+ thinking_span: Tuple[int, int],
+ n_hops: int,
+ cache: Dict[str, List[Optional[torch.Tensor]]],
+ attentions: Optional[Sequence[torch.Tensor]],
+ weight_pack: Sequence[Dict[str, torch.Tensor | nn.Module]],
+ params: IFRParameters,
+ renorm_threshold: float = 0.0,
+ observation_mask: Optional[torch.Tensor] = None,
+ rotary_emb: Optional[nn.Module] = None,
+) -> MultiHopIFRResult:
+ """Compute the base and multi-hop IFR distribution for a sink span."""
+
+ hop_count = max(0, int(n_hops))
+ sink_start = int(sink_start)
+ sink_end = int(sink_end)
+ think_start = int(thinking_span[0])
+ think_end = int(thinking_span[1])
+
+ if think_start > think_end:
+ raise ValueError("thinking_span start must be <= end")
+
+ base_ifr = compute_ifr_sentence_aggregate(
+ sink_start=sink_start,
+ sink_end=sink_end,
+ cache=cache,
+ attentions=attentions,
+ weight_pack=weight_pack,
+ params=params,
+ renorm_threshold=renorm_threshold,
+ rotary_emb=rotary_emb,
+ )
+
+ raw_attributions: List[IFRAggregate] = [base_ifr]
+
+ token_total = base_ifr.token_importance_total
+ T = token_total.shape[0]
+    if observation_mask is None:
+        # Default observation mask: keep only tokens strictly before the thinking
+        # span (i.e. the prompt side); thinking, sink, and all post-thinking
+        # positions are excluded from the observation-side accumulation.
+        obs_mask = torch.ones((T,), dtype=torch.float32)
+        obs_mask[think_start : min(think_end + 1, T)] = 0.0
+        obs_mask[sink_start : min(sink_end + 1, T)] = 0.0
+        if think_end + 1 < T:
+            obs_mask[think_end + 1 :] = 0.0
+ else:
+ obs_mask = observation_mask.clone().to(dtype=torch.float32)
+ if obs_mask.shape[0] != T:
+ raise ValueError("observation_mask must match sequence length.")
+
+ base_obs = token_total.clone().to(torch.float32) * obs_mask
+ obs_accum = base_obs.clone()
+ per_hop_obs: List[torch.Tensor] = []
+
+ thinking_slice = token_total[think_start : think_end + 1]
+ w_thinking = thinking_slice.detach().clone().to(params.model_dtype)
+ denom_base = float(token_total.sum().item())
+ current_ratio = float(w_thinking.sum().item()) / (denom_base + 1e-12) if denom_base > 0 else 0.0
+ ratios: List[float] = [current_ratio]
+
+ for hop in range(1, hop_count + 1):
+ hop_ifr = compute_ifr_sentence_aggregate(
+ sink_start=think_start,
+ sink_end=think_end,
+ cache=cache,
+ attentions=attentions,
+ weight_pack=weight_pack,
+ params=params,
+ renorm_threshold=renorm_threshold,
+ sink_weights=w_thinking,
+ rotary_emb=rotary_emb,
+ )
+
+ raw_attributions.append(hop_ifr)
+ hop_total = hop_ifr.token_importance_total.clone().to(torch.float32)
+ obs_only = hop_total * obs_mask * current_ratio
+ obs_accum += obs_only
+ per_hop_obs.append(obs_only)
+
+ thinking_slice = hop_total[think_start : think_end + 1]
+ w_thinking = thinking_slice.detach().clone().to(params.model_dtype)
+ hop_denom = float(hop_total.sum().item())
+ if hop_denom <= 0.0:
+ current_ratio = 0.0
+ else:
+ current_ratio *= float(w_thinking.sum().item()) / (hop_denom + 1e-12)
+ ratios.append(current_ratio)
+
+ obs_avg = obs_accum / float(max(1, hop_count))
+ observation = {
+ "mask": obs_mask,
+ "base": base_obs,
+ "per_hop": per_hop_obs,
+ "sum": obs_accum,
+ "avg": obs_avg,
+ }
+
+ return MultiHopIFRResult(raw_attributions=raw_attributions, thinking_ratios=ratios, observation=observation)
+
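+# Hop-ratio recursion implemented above (illustrative): with
+#   r_0 = mass(thinking) / mass(all)
+# from the base pass, each hop h rescales by its own thinking fraction,
+#   r_h = r_{h-1} * mass_h(thinking) / mass_h(all),
+# and the observation-side mass contributed by hop h is weighted by r_{h-1}.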
+
+@torch.no_grad()
+def compute_ifr_for_all_positions(
+ cache: Dict[str, List[Optional[torch.Tensor]]],
+ attentions: Optional[Sequence[torch.Tensor]],
+ weight_pack: Sequence[Dict[str, torch.Tensor | nn.Module]],
+ params: IFRParameters,
+ renorm_threshold: float = 0.0,
+ sink_range: Optional[Tuple[int, int]] = None,
+ return_layerwise: bool = False,
+ rotary_emb: Optional[nn.Module] = None,
+) -> IFRAllPositions:
+ """Compute IFR importances for every sink position in the selected range."""
+
+ n_layers = params.n_layers
+ n_heads_q = params.n_heads_q
+ n_kv_heads = params.n_kv_heads
+ group_size = params.group_size
+ head_dim = params.head_dim
+ T = params.sequence_length
+ model_dtype = params.model_dtype
+ chunk_tokens = params.chunk_tokens
+ sink_chunk_tokens = params.sink_chunk_tokens
+
+ attn_start = 0 if sink_range is None else sink_range[0]
+ attn_end = (T - 1) if sink_range is None else sink_range[1]
+ assert 0 <= attn_start <= attn_end < T, "Invalid sink_range."
+ S = attn_end - attn_start + 1
+
+ token_total = torch.zeros((S, T), dtype=torch.float32)
+ head_total = torch.zeros((S, n_heads_q), dtype=torch.float32)
+ resid_attn_total = torch.zeros((S,), dtype=torch.float32)
+ per_layer_results: Optional[List[List[IFRLayerResult]]] = [list() for _ in range(S)] if return_layerwise else None
+
+ for li in tqdm(range(n_layers), desc="IFR-all"):
+ x_prev_full = cache["pre_attn_resid"][li]
+ x_mid_full = cache["mid_resid"][li]
+ x_out_full = cache["post_resid"][li]
+ mlp_out_full = cache["mlp_out"][li]
+
+ assert x_prev_full is not None
+ assert x_mid_full is not None
+ assert x_out_full is not None
+ assert mlp_out_full is not None
+
+ x_prev = x_prev_full[0]
+ x_mid = x_mid_full[0]
+ x_out = x_out_full[0]
+ mlp_out = mlp_out_full[0]
+ layer_device = x_prev.device
+
+ if x_mid.device != layer_device:
+ x_mid = x_mid.to(layer_device, non_blocking=True)
+ if x_out.device != layer_device:
+ x_out = x_out.to(layer_device, non_blocking=True)
+ if mlp_out.device != layer_device:
+ mlp_out = mlp_out.to(layer_device, non_blocking=True)
+
+ if attentions is not None:
+ attn_li = attentions[li][0]
+ if attn_li.device != layer_device or attn_li.dtype != model_dtype:
+ attn_li = attn_li.to(device=layer_device, dtype=model_dtype, non_blocking=True)
+ else:
+ assert rotary_emb is not None, "rotary_emb is required when attentions is None"
+ attn_li = recompute_layer_attention(x_prev, weight_pack[li], rotary_emb, params)
+
+ v_w = weight_pack[li]["v_w"].to(layer_device, non_blocking=True)
+ o_w = weight_pack[li]["o_w"].to(layer_device, non_blocking=True)
+ in_ln_mod = weight_pack[li]["in_ln"]
+
+ s_prev = linearize_norm(in_ln_mod, x_prev.unsqueeze(0)).squeeze(0)
+ x_prev_lin = x_prev.float() * s_prev
+ V_all = torch.matmul(x_prev_lin.to(model_dtype), v_w.T)
+ V_kv = V_all.view(T, n_kv_heads, head_dim).contiguous()
+ V_q = V_kv.repeat_interleave(group_size, dim=1)
+ O_blocks = o_w.view(params.d_model, n_heads_q, head_dim).permute(1, 2, 0).contiguous()
+
+ xA_l1_vec = x_mid.float().abs().sum(dim=-1)
+ resid_diff_l1_vec = (x_prev.float() - x_mid.float()).abs().sum(dim=-1)
+ resid_prox_vec = torch.clamp(xA_l1_vec - resid_diff_l1_vec, min=0.0)
+
+ for i0 in range(attn_start, attn_end + 1, sink_chunk_tokens):
+ i1 = min(attn_end + 1, i0 + sink_chunk_tokens)
+ P = i1 - i0
+
+ numer_tok_sum = torch.zeros((P, T), device=layer_device, dtype=model_dtype)
+ numer_head_sum = torch.zeros((P, n_heads_q), device=layer_device, dtype=model_dtype)
+
+ for j0 in range(0, i1, chunk_tokens):
+ j1 = min(i1, j0 + chunk_tokens)
+ V_chunk = V_q[j0:j1]
+ alpha_block = attn_li[:, i0:i1, j0:j1].permute(1, 2, 0).contiguous()
+
+ i_abs = torch.arange(i0, i1, device=layer_device).view(P, 1, 1)
+ j_abs = torch.arange(j0, j1, device=layer_device).view(1, j1 - j0, 1)
+ mask = j_abs <= i_abs
+
+ F_block = torch.einsum("jhd,hdk->jhk", V_chunk, O_blocks)
+ diff = alpha_block.unsqueeze(-1).float() * F_block.unsqueeze(0).float()
+ diff -= x_mid[i0:i1].float().unsqueeze(1).unsqueeze(2)
+ dist_accum = diff.abs().sum(dim=-1)
+
+ xA_l1_chunk = xA_l1_vec[i0:i1].view(P, 1, 1)
+ prox = torch.clamp(xA_l1_chunk - dist_accum, min=0.0)
+ prox = prox * mask
+ if renorm_threshold > 0.0:
+ prox = prox * (prox >= renorm_threshold)
+
+ numer_tok_sum[:, j0:j1] += prox.sum(dim=2).to(model_dtype)
+ numer_head_sum += prox.sum(dim=1).to(model_dtype)
+
+ numer_total_i = numer_tok_sum.float().sum(dim=1)
+ denom = resid_prox_vec[i0:i1] + numer_total_i + 1e-12
+
+ e_tokens_chunk = (numer_tok_sum.float() / denom[:, None]).to(torch.float32).cpu()
+ e_heads_chunk = (numer_head_sum.float() / denom[:, None]).to(torch.float32).cpu()
+
+ s0 = i0 - attn_start
+ s1 = i1 - attn_start
+ token_total[s0:s1, :] += e_tokens_chunk
+ head_total[s0:s1, :] += e_heads_chunk
+ resid_attn_total[s0:s1] += (resid_prox_vec[i0:i1] / denom).to(torch.float32).cpu()
+
+ if return_layerwise and per_layer_results is not None:
+ for p in range(P):
+ pos_abs = i0 + p
+ x_out_i = x_out[pos_abs].to(torch.float32)
+ y_ffn_i = mlp_out[pos_abs].to(torch.float32)
+ x_mid_i = x_mid[pos_abs].to(torch.float32)
+ prox_ffn_t = proximity(y_ffn_i, x_out_i)
+ prox_resid_ffn_t = proximity(x_mid_i, x_out_i)
+ if renorm_threshold > 0.0:
+ if prox_ffn_t < renorm_threshold:
+ prox_ffn_t = torch.zeros((), dtype=torch.float32, device=layer_device)
+ if prox_resid_ffn_t < renorm_threshold:
+ prox_resid_ffn_t = torch.zeros((), dtype=torch.float32, device=layer_device)
+ denom_ffn_t = prox_ffn_t + prox_resid_ffn_t + 1e-12
+ e_ffn = float((prox_ffn_t / denom_ffn_t).item())
+ e_resid_ffn = float((prox_resid_ffn_t / denom_ffn_t).item())
+ e_resid_attn = float((resid_prox_vec[pos_abs] / denom[p]).item())
+ layer_result = IFRLayerResult(
+ e_attn_tokens=e_tokens_chunk[p],
+ e_resid_attn=e_resid_attn,
+ head_importance=e_heads_chunk[p],
+ e_ffn=e_ffn,
+ e_resid_ffn=e_resid_ffn,
+ )
+ per_layer_results[s0 + p].append(layer_result)
+
+ return IFRAllPositions(
+ token_importance_matrix=token_total,
+ head_importance_matrix=head_total,
+ resid_attn_fraction_total=resid_attn_total,
+ sink_indices=list(range(attn_start, attn_end + 1)),
+ per_layer_results=per_layer_results,
+ note="Sum over layers of layer-normalized importances per sink.",
+ )
+
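+# Illustrative access pattern (variable names hypothetical): rows are indexed
+# by sink position offset by the range start:
+#   all_ifr = compute_ifr_for_all_positions(
+#       cache=cache, attentions=outputs.attentions,
+#       weight_pack=weight_pack, params=params, sink_range=(s0, s1),
+#   )
+#   all_ifr.token_importance_matrix[i - s0]  # [T] importances for sink position i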
+
+__all__ = [
+ "ModelMetadata",
+ "extract_model_metadata",
+ "build_weight_pack",
+ "recompute_layer_attention",
+ "IFRParameters",
+ "IFRLayerResult",
+ "IFRAggregate",
+ "IFRAllPositions",
+ "MultiHopIFRResult",
+ "attach_hooks",
+ "compute_ifr_for_position",
+ "compute_ifr_sentence_aggregate",
+ "compute_multi_hop_ifr",
+ "compute_ifr_for_all_positions",
+]
diff --git a/flashtrace/improved.py b/flashtrace/improved.py
new file mode 100644
index 0000000000000000000000000000000000000000..e8869764154b4d737b3dff576d63a347e42e9e93
--- /dev/null
+++ b/flashtrace/improved.py
@@ -0,0 +1,1453 @@
+from __future__ import annotations
+
+from dataclasses import dataclass
+import math
+from typing import Any, Dict, List, Optional, Sequence, Tuple
+
+import numpy as np
+import torch
+
+from . import attribution as llm_attr
+from .core import IFRAggregate, MultiHopIFRResult, compute_ifr_sentence_aggregate
+
+##########################################
+# Stop-token configuration (edit here)
+##########################################
+
+# Tokens to be treated as "stop tokens" and skipped (soft-deleted) from the full attribution flow.
+# You can modify this list for different experiments.
+STOP_TOKENS: List[str] = [",", "."]
+
+# Treat pure-whitespace tokens as stop tokens.
+SKIP_WHITESPACE: bool = True
+
+# Match stop tokens after stripping leading/trailing whitespace.
+STRIP_BEFORE_MATCH: bool = True
+
+
+def is_stop_token(token: str) -> bool:
+ if token is None:
+ return False
+ t = str(token)
+ if STRIP_BEFORE_MATCH:
+ t = t.strip()
+ if SKIP_WHITESPACE and t == "":
+ return True
+ return t in STOP_TOKENS
+
+
+def keep_token_indices(tokens: Sequence[str]) -> List[int]:
+ return [i for i, tok in enumerate(tokens) if not is_stop_token(tok)]
+
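+# Illustrative behaviour with the defaults above (",", ".", whitespace skipped):
+#   is_stop_token(" , ")                          # -> True (stripped before matching)
+#   is_stop_token("  ")                           # -> True (pure whitespace)
+#   keep_token_indices(["The", ",", "cat", "."])  # -> [0, 2]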
+
+def _last_attributable_generation_index(
+ tokenizer: Any,
+ generation_ids: torch.Tensor,
+ generation_tokens: Sequence[str],
+) -> int:
+ end = int(generation_ids.shape[1]) - 1
+ if end < 0:
+ return -1
+
+ last_id = int(generation_ids[0, end].item())
+ eos_token_id = getattr(tokenizer, "eos_token_id", None)
+ if eos_token_id is not None:
+ eos_ids = eos_token_id if isinstance(eos_token_id, (list, tuple, set)) else [eos_token_id]
+ if any(last_id == int(eos_id) for eos_id in eos_ids if eos_id is not None):
+ return end - 1
+
+ if end < len(generation_tokens) and is_stop_token(generation_tokens[end]):
+ return end - 1
+
+ return end
+
+
+def _stop_keep_mask(tokens: Sequence[str]) -> torch.Tensor:
+ keep = [0.0 if is_stop_token(tok) else 1.0 for tok in tokens]
+ return torch.as_tensor(keep, dtype=torch.float32)
+
+
+def _build_stop_keep_mask_full(
+ *,
+ prompt_len_full: int,
+ gen_len: int,
+ user_prompt_indices: Sequence[int],
+ user_prompt_tokens: Sequence[str],
+ generation_tokens: Sequence[str],
+) -> torch.Tensor:
+ """Return a float32 mask over the full sequence (chat template + user prompt + generation)."""
+ total_len = int(prompt_len_full) + int(gen_len)
+ mask = torch.ones((total_len,), dtype=torch.float32)
+
+ prompt_keep = _stop_keep_mask(user_prompt_tokens)
+ for j, abs_idx in enumerate(user_prompt_indices):
+ if 0 <= int(abs_idx) < int(prompt_len_full) and j < int(prompt_keep.numel()):
+ mask[int(abs_idx)] = prompt_keep[j]
+
+ gen_keep = _stop_keep_mask(generation_tokens)
+ for g in range(int(gen_keep.numel())):
+ abs_idx = int(prompt_len_full) + g
+ if 0 <= abs_idx < total_len:
+ mask[abs_idx] = gen_keep[g]
+
+ return mask
+
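+# Illustrative result: with prompt_len_full=3, gen_len=2, user prompt tokens
+# ["Hi", ","] at absolute positions [1, 2], and generation tokens ["ok", "."],
+# the returned mask is [1, 1, 0, 1, 0]: chat-template positions default to
+# keep (1) and stop tokens (",", ".") are zeroed.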
+
+def faithfulness_test_skip_tokens(
+ llm_evaluator: Any,
+ attribution: torch.Tensor,
+ prompt: str,
+ generation: str,
+ *,
+ keep_prompt_token_indices: Sequence[int],
+ user_prompt_indices: Optional[Sequence[int]] = None,
+ k: int = 20,
+) -> Tuple[float, float, float]:
+ """Token-level MAS/RISE faithfulness via guided deletion in k perturbation steps, skipping specified prompt tokens.
+
+ This is a drop-in replacement for llm_attr_eval.LLMAttributionEvaluator.faithfulness_test
+ when an experimental protocol wants to soft-delete some prompt tokens (e.g., stop tokens)
+ from the perturbation path.
+ """
+
+ def auc(arr: np.ndarray) -> float:
+ return (arr.sum() - arr[0] / 2 - arr[-1] / 2) / max(1, (arr.shape[0] - 1))
+
+ pad_token_id = llm_evaluator._ensure_pad_token_id()
+
+ user_prompt = " " + prompt
+ formatted_prompt = llm_evaluator.format_prompt(user_prompt)
+
+ formatted_ids = llm_evaluator.tokenizer(formatted_prompt, return_tensors="pt", add_special_tokens=False).input_ids
+ prompt_ids = formatted_ids.to(llm_evaluator.device)
+ prompt_ids_perturbed = prompt_ids.clone()
+
+ generation_ids = llm_evaluator.tokenizer(
+ generation + llm_evaluator.tokenizer.eos_token,
+ return_tensors="pt",
+ add_special_tokens=False,
+ ).input_ids.to(llm_evaluator.device)
+
+ attr_cpu = attribution.detach().cpu()
+ w = attr_cpu.sum(0)
+ P = int(w.numel())
+
+ keep: List[int] = []
+ seen: set[int] = set()
+ for raw in keep_prompt_token_indices:
+ try:
+ idx = int(raw)
+ except Exception:
+ continue
+ if 0 <= idx < P and idx not in seen:
+ keep.append(idx)
+ seen.add(idx)
+ keep.sort()
+
+ K = len(keep)
+ if K > 0:
+ steps = int(k) if k is not None else 0
+ if steps <= 0:
+ steps = 1
+ steps = min(steps, K)
+ else:
+ steps = 0
+
+ scores = np.zeros(steps + 1, dtype=np.float64)
+ density = np.zeros(steps + 1, dtype=np.float64)
+
+ if user_prompt_indices is not None:
+ prompt_positions = [int(x) for x in user_prompt_indices]
+ if len(prompt_positions) != P:
+ raise ValueError(
+ "user_prompt_indices length does not match prompt-side attribution length: "
+ f"indices P={len(prompt_positions)}, attr P={P}."
+ )
+ if P and max(prompt_positions) >= int(prompt_ids_perturbed.shape[1]):
+ raise ValueError("user_prompt_indices contains an out-of-bounds index for formatted prompt ids.")
+ else:
+ user_ids = llm_evaluator.tokenizer(user_prompt, return_tensors="pt", add_special_tokens=False).input_ids
+ user_start = llm_evaluator._find_subsequence_start(formatted_ids[0], user_ids[0])
+ if user_start is None:
+ raise RuntimeError("Failed to locate user prompt token span inside formatted chat prompt.")
+ prompt_positions = [int(user_start) + j for j in range(P)]
+
+ scores[0] = (
+ llm_evaluator.compute_logprob_response_given_prompt(prompt_ids_perturbed, generation_ids).sum().cpu().detach().item()
+ )
+ density[0] = 1.0
+
+    if K == 0:
+        # Nothing to perturb once stop tokens are skipped: degenerate flat curve.
+        return auc(scores), auc(scores), auc(scores)
+
+ w_keep = w.index_select(0, torch.as_tensor(keep, dtype=torch.long))
+ sorted_local = torch.argsort(w_keep, descending=True)
+ sorted_keep = [keep[int(i.item())] for i in sorted_local]
+ attr_sum = float(w_keep.sum().item())
+
+ if attr_sum <= 0:
+ density = np.linspace(1.0, 0.0, steps + 1)
+
+ base = K // steps
+ remainder = K % steps
+ start = 0
+ for step in range(steps):
+ size = base + (1 if step < remainder else 0)
+ group = sorted_keep[start : start + size]
+ start += size
+
+ for idx in group:
+ abs_pos = int(prompt_positions[int(idx)])
+ prompt_ids_perturbed[0, abs_pos] = pad_token_id
+ scores[step + 1] = (
+ llm_evaluator.compute_logprob_response_given_prompt(prompt_ids_perturbed, generation_ids).sum().cpu().detach().item()
+ )
+ if attr_sum > 0:
+ dec = float(w.index_select(0, torch.as_tensor(group, dtype=torch.long)).sum().item()) / attr_sum
+ density[step + 1] = density[step] - dec
+
+ min_normalized_pred = 1.0
+ normalized_model_response = scores.copy()
+ for i in range(len(scores)):
+        # Guard against a flat perturbation curve (scores[0] == scores[-1]).
+        normalized_pred = (normalized_model_response[i] - scores[-1]) / (abs(scores[0] - scores[-1]) + 1e-12)
+ normalized_pred = np.clip(normalized_pred, 0.0, 1.0)
+ min_normalized_pred = min(min_normalized_pred, normalized_pred)
+ normalized_model_response[i] = min_normalized_pred
+
+ alignment_penalty = np.abs(normalized_model_response - density)
+ corrected_scores = normalized_model_response + alignment_penalty
+ corrected_scores = corrected_scores.clip(0.0, 1.0)
+ corrected_scores = (corrected_scores - np.min(corrected_scores)) / (np.max(corrected_scores) - np.min(corrected_scores))
+
+ if np.isnan(corrected_scores).any():
+ corrected_scores = np.linspace(1.0, 0.0, len(scores))
+
+ return auc(normalized_model_response), auc(corrected_scores), auc(normalized_model_response + alignment_penalty)
+
+
+def evaluate_attr_recovery_skip_tokens(
+ attribution_prompt: torch.Tensor,
+ *,
+ keep_prompt_token_indices: Sequence[int],
+ gold_prompt_token_indices: Sequence[int],
+ top_fraction: float = 0.1,
+) -> float:
+ """Recall of gold prompt tokens among top-attributed prompt tokens, skipping specified tokens."""
+ if attribution_prompt.ndim != 2:
+ raise ValueError("Expected 2D prompt-side attribution matrix [R, P].")
+
+ P = int(attribution_prompt.shape[1])
+ keep: List[int] = []
+ seen: set[int] = set()
+ for raw in keep_prompt_token_indices or []:
+ try:
+ idx = int(raw)
+ except Exception:
+ continue
+ if 0 <= idx < P and idx not in seen:
+ keep.append(idx)
+ seen.add(idx)
+ keep.sort()
+ if not keep:
+ return float("nan")
+
+ gold: set[int] = set()
+ for raw in gold_prompt_token_indices or []:
+ try:
+ idx = int(raw)
+ except Exception:
+ continue
+ if idx in seen:
+ gold.add(idx)
+ if not gold:
+ return float("nan")
+
+ w = torch.nan_to_num(attribution_prompt.sum(0).to(dtype=torch.float32), nan=0.0).clamp(min=0.0)
+ w_keep = w.index_select(0, torch.as_tensor(keep, dtype=torch.long))
+
+ frac = float(top_fraction)
+ if frac < 0.0:
+ frac = 0.0
+ if frac > 1.0:
+ frac = 1.0
+ k = max(1, int(math.ceil(float(len(keep)) * frac)))
+ k = min(k, int(len(keep)))
+ topk_local = torch.topk(w_keep, k, largest=True).indices.tolist()
+ topk = {keep[int(i)] for i in topk_local}
+ hit = len(topk.intersection(gold))
+ return float(hit) / float(len(gold))
+
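+# Illustrative call (hypothetical tensors): attribution [R=2, P=4], tokens 1 and
+# 3 treated as stop tokens, gold token 0 ranked in the top half of kept tokens:
+#   attr = torch.tensor([[0.9, 0.0, 0.1, 0.0],
+#                        [0.8, 0.0, 0.2, 0.0]])
+#   evaluate_attr_recovery_skip_tokens(
+#       attr, keep_prompt_token_indices=[0, 2],
+#       gold_prompt_token_indices=[0], top_fraction=0.5,
+#   )  # -> 1.0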
+
+@dataclass
+class StopTokenConfig:
+ stop_tokens: List[str]
+ skip_whitespace: bool
+ strip_before_match: bool
+
+
+class LLMIFRAttributionImproved(llm_attr.LLMIFRAttribution):
+ """Experimental FT-IFR (ifr_multi_hop_stop_words) variant with stop-token soft deletion."""
+
+ def _stop_config(self) -> StopTokenConfig:
+ return StopTokenConfig(
+ stop_tokens=list(STOP_TOKENS),
+ skip_whitespace=bool(SKIP_WHITESPACE),
+ strip_before_match=bool(STRIP_BEFORE_MATCH),
+ )
+
+ @torch.no_grad()
+ def calculate_ifr_multi_hop_stop_words(
+ self,
+ prompt: str,
+ target: Optional[str] = None,
+ *,
+ sink_span: Optional[Tuple[int, int]] = None,
+ thinking_span: Optional[Tuple[int, int]] = None,
+ n_hops: int = 1,
+ renorm_threshold: Optional[float] = None,
+ observation_mask: Optional[torch.Tensor | Sequence[float]] = None,
+ ) -> llm_attr.LLMAttributionResult:
+ input_ids_all, attn_mask, prompt_len_full, gen_len = self._ensure_generation(prompt, target)
+ total_len = int(input_ids_all.shape[1])
+
+ if gen_len == 0:
+ empty = torch.zeros((0, total_len), dtype=torch.float32)
+ metadata = {
+ "ifr": {
+ "type": "multi_hop_stop_words",
+ "sink_span_generation": sink_span,
+ "thinking_span_generation": thinking_span,
+ "renorm_threshold": renorm_threshold,
+ "stop_config": self._stop_config().__dict__,
+ "note": "No generation tokens; returning empty attribution matrix.",
+ }
+ }
+ return self._finalize_result(empty, metadata=metadata)
+
+ if sink_span is None:
+ sink_span = (0, gen_len - 1)
+ span_start, span_end = sink_span
+ if span_start < 0 or span_end < span_start or span_end >= gen_len:
+ raise ValueError(f"Invalid sink_span ({span_start}, {span_end}) for generation length {gen_len}.")
+
+ if thinking_span is None:
+ thinking_span = sink_span
+ think_start, think_end = thinking_span
+ if think_start < 0 or think_end < think_start or think_end >= gen_len:
+ raise ValueError(f"Invalid thinking_span ({think_start}, {think_end}) for generation length {gen_len}.")
+
+ sink_start_abs = int(prompt_len_full) + int(span_start)
+ sink_end_abs = int(prompt_len_full) + int(span_end)
+ think_start_abs = int(prompt_len_full) + int(think_start)
+ think_end_abs = int(prompt_len_full) + int(think_end)
+
+ obs_mask_tensor: Optional[torch.Tensor] = None
+ if observation_mask is not None:
+ obs_mask_tensor = torch.as_tensor(observation_mask, dtype=torch.float32)
+ if obs_mask_tensor.ndim != 1:
+ raise ValueError("observation_mask must be a 1D tensor or sequence.")
+ if obs_mask_tensor.numel() == gen_len:
+ mask_full = torch.zeros(total_len, dtype=torch.float32)
+ mask_full[int(prompt_len_full) : int(prompt_len_full) + int(gen_len)] = obs_mask_tensor
+ obs_mask_tensor = mask_full
+ elif obs_mask_tensor.numel() != total_len:
+ raise ValueError(
+ f"observation_mask must have length {gen_len} (generation) or {total_len} (full sequence)."
+ )
+
+ stop_keep_mask_full = _build_stop_keep_mask_full(
+ prompt_len_full=int(prompt_len_full),
+ gen_len=int(gen_len),
+ user_prompt_indices=list(getattr(self, "user_prompt_indices", []) or []),
+ user_prompt_tokens=list(getattr(self, "user_prompt_tokens", []) or []),
+ generation_tokens=list(getattr(self, "generation_tokens", []) or []),
+ )
+
+ cache, attentions, metadata, weight_pack = self._capture_model_state(
+ input_ids_all, attn_mask, recompute_attention=self.recompute_attention,
+ )
+ params = self._build_ifr_params(metadata, total_len)
+ renorm = self.renorm_threshold_default if renorm_threshold is None else float(renorm_threshold)
+
+ hop_count = max(0, int(n_hops))
+
+ # Base: sink-span aggregate, with sink positions masked if they are stop tokens.
+ sink_gen_tokens = list(getattr(self, "generation_tokens", []) or [])
+ sink_stops = []
+ for gi in range(int(span_start), int(span_end) + 1):
+ tok = sink_gen_tokens[gi] if 0 <= gi < len(sink_gen_tokens) else ""
+ sink_stops.append(is_stop_token(tok))
+ base_weights = None
+ if any(sink_stops):
+ base_weights = torch.as_tensor(
+ [0.0 if st else 1.0 for st in sink_stops],
+ dtype=params.model_dtype,
+ )
+
+ base_ifr_raw = compute_ifr_sentence_aggregate(
+ sink_start=sink_start_abs,
+ sink_end=sink_end_abs,
+ cache=cache,
+ attentions=attentions,
+ weight_pack=weight_pack,
+ params=params,
+ renorm_threshold=renorm,
+ sink_weights=base_weights,
+ rotary_emb=metadata.rotary_emb,
+ )
+
+ base_total = base_ifr_raw.token_importance_total.to(dtype=torch.float32) * stop_keep_mask_full
+ base_ifr = IFRAggregate(
+ per_layer=base_ifr_raw.per_layer,
+ token_importance_total=base_total,
+ head_importance_total=base_ifr_raw.head_importance_total,
+ ffn_importance_per_layer=base_ifr_raw.ffn_importance_per_layer,
+ resid_ffn_importance_per_layer=base_ifr_raw.resid_ffn_importance_per_layer,
+ )
+
+ raw_attributions: List[IFRAggregate] = [base_ifr]
+
+ # Observation mask: respect provided observation_mask, otherwise follow the core FT behavior,
+ # and then apply stop-token masking so stop tokens are fully removed from the observation path.
+ if obs_mask_tensor is None:
+ obs_mask = torch.ones((total_len,), dtype=torch.float32)
+ obs_mask[int(think_start_abs) : min(int(think_end_abs) + 1, total_len)] = 0.0
+ obs_mask[int(sink_start_abs) : min(int(sink_end_abs) + 1, total_len)] = 0.0
+ if int(think_end_abs) + 1 < total_len:
+ obs_mask[int(think_end_abs) + 1 :] = 0.0
+ else:
+ obs_mask = obs_mask_tensor.clone().to(dtype=torch.float32)
+ if int(obs_mask.shape[0]) != int(total_len):
+ raise ValueError("observation_mask must match sequence length.")
+
+ obs_mask = obs_mask * stop_keep_mask_full
+
+ base_obs = base_total.clone() * obs_mask
+ obs_accum = base_obs.clone()
+ per_hop_obs: List[torch.Tensor] = []
+
+ denom_base = float(base_total.sum().item())
+ thinking_slice = base_total[int(think_start_abs) : int(think_end_abs) + 1]
+ w_thinking = thinking_slice.detach().clone().to(params.model_dtype)
+ current_ratio = float(w_thinking.sum().item()) / (denom_base + 1e-12) if denom_base > 0 else 0.0
+ ratios: List[float] = [current_ratio]
+
+ # Multi-hop: thinking-span re-aggregation with masked weights.
+ for hop in range(1, hop_count + 1):
+ if float(w_thinking.sum().item()) <= 0.0 or float(current_ratio) <= 0.0:
+ # Terminate remaining hops with zero vectors to keep shapes stable.
+ zeros = torch.zeros_like(base_total)
+ for _ in range(hop, hop_count + 1):
+ raw_attributions.append(
+ IFRAggregate(
+ per_layer=[],
+ token_importance_total=zeros,
+ head_importance_total=torch.zeros_like(base_ifr.head_importance_total),
+ ffn_importance_per_layer=torch.zeros_like(base_ifr.ffn_importance_per_layer),
+ resid_ffn_importance_per_layer=torch.zeros_like(base_ifr.resid_ffn_importance_per_layer),
+ )
+ )
+ per_hop_obs.append(torch.zeros_like(base_total))
+ ratios.append(0.0)
+ break
+
+ hop_ifr_raw = compute_ifr_sentence_aggregate(
+ sink_start=think_start_abs,
+ sink_end=think_end_abs,
+ cache=cache,
+ attentions=attentions,
+ weight_pack=weight_pack,
+ params=params,
+ renorm_threshold=renorm,
+ sink_weights=w_thinking,
+ rotary_emb=metadata.rotary_emb,
+ )
+
+ hop_total = hop_ifr_raw.token_importance_total.to(dtype=torch.float32) * stop_keep_mask_full
+ hop_ifr = IFRAggregate(
+ per_layer=hop_ifr_raw.per_layer,
+ token_importance_total=hop_total,
+ head_importance_total=hop_ifr_raw.head_importance_total,
+ ffn_importance_per_layer=hop_ifr_raw.ffn_importance_per_layer,
+ resid_ffn_importance_per_layer=hop_ifr_raw.resid_ffn_importance_per_layer,
+ )
+ raw_attributions.append(hop_ifr)
+
+ obs_only = hop_total * obs_mask * float(current_ratio)
+ obs_accum += obs_only
+ per_hop_obs.append(obs_only)
+
+ thinking_slice = hop_total[int(think_start_abs) : int(think_end_abs) + 1]
+ w_thinking = thinking_slice.detach().clone().to(params.model_dtype)
+
+ hop_denom = float(hop_total.sum().item())
+ if hop_denom <= 0.0:
+ current_ratio = 0.0
+ else:
+ current_ratio *= float(w_thinking.sum().item()) / (hop_denom + 1e-12)
+ ratios.append(float(current_ratio))
+
+ obs_avg = obs_accum / float(max(1, hop_count))
+ observation: Dict[str, torch.Tensor | List[torch.Tensor]] = {
+ "mask": obs_mask,
+ "base": base_obs,
+ "per_hop": per_hop_obs,
+ "sum": obs_accum,
+ "avg": obs_avg,
+ }
+
+ multi_hop = MultiHopIFRResult(raw_attributions=raw_attributions, thinking_ratios=ratios, observation=observation)
+
+ eval_vector = multi_hop.observation["sum"]
+ score_array = torch.full((gen_len, total_len), torch.nan, dtype=torch.float32)
+ for offset in range(int(span_start), int(span_end) + 1):
+ tok = sink_gen_tokens[offset] if 0 <= offset < len(sink_gen_tokens) else ""
+ if is_stop_token(tok):
+ continue
+ score_array[offset] = eval_vector
+
+ projected_per_hop = [self._project_vector(result.token_importance_total) for result in multi_hop.raw_attributions]
+ obs = multi_hop.observation
+ observation_projected = {
+ "mask": self.extract_user_prompt_attributions(self.prompt_tokens, obs["mask"].view(1, -1))[0],
+ "base": self._project_vector(obs["base"]),
+ "sum": self._project_vector(obs["sum"]),
+ "avg": self._project_vector(obs["avg"]),
+ "per_hop": [self._project_vector(vec) for vec in obs["per_hop"]],
+ }
+
+ meta: Dict[str, Any] = {
+ "ifr": {
+ "type": "multi_hop_stop_words",
+ "sink_span_generation": (int(span_start), int(span_end)),
+ "sink_span_absolute": (int(sink_start_abs), int(sink_end_abs)),
+ "thinking_span_generation": (int(think_start), int(think_end)),
+ "thinking_span_absolute": (int(think_start_abs), int(think_end_abs)),
+ "renorm_threshold": float(renorm),
+ "n_hops": int(n_hops),
+ "thinking_ratios": ratios,
+ "per_hop_projected": projected_per_hop,
+ "observation_projected": observation_projected,
+ "stop_config": self._stop_config().__dict__,
+ "raw": multi_hop,
+ }
+ }
+
+ return self._finalize_result(score_array, metadata=meta)
+
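The hop loop in `calculate_ifr_multi_hop_stop_words` reduces to a small recursion: one aggregation step maps sink weights to a token-importance vector, prompt-side mass is accumulated per hop, and each hop is damped by the cumulative thinking ratio. A toy sketch under stated assumptions: the row-stochastic matrix `A` stands in for `compute_ifr_sentence_aggregate`, and the 6-token layout (prompt 0-2, thinking 3-4, sink 5) is hypothetical:

```python
import torch

torch.manual_seed(0)
T = 6                                  # toy sequence: prompt 0..2, thinking 3..4, sink 5
A = torch.rand(T, T).tril()            # causal stand-in: row i attributes token i to tokens <= i
A = A / A.sum(dim=1, keepdim=True)     # row-normalize so each step conserves mass

obs_mask = torch.tensor([1.0, 1.0, 1.0, 0.0, 0.0, 0.0])  # prompt-only observation

def aggregate(span: slice, weights: torch.Tensor) -> torch.Tensor:
    # stand-in for one compute_ifr_sentence_aggregate call over `span`
    return weights @ A[span]

base = aggregate(slice(5, 6), torch.ones(1))       # hop0: sink-span aggregate
obs_accum = base * obs_mask
ratio = float(base[3:5].sum() / base.sum())        # thinking mass / total mass
w = base[3:5].clone()                              # next hop's sink weights

for hop in range(2):                               # two thinking hops, as in the method above
    if float(w.sum()) <= 0 or ratio <= 0:
        break
    hop_total = aggregate(slice(3, 5), w)          # re-aggregate from the thinking span
    obs_accum = obs_accum + hop_total * obs_mask * ratio
    w = hop_total[3:5].clone()
    ratio *= float(w.sum() / hop_total.sum())      # cumulative ratio update

print(obs_accum)                                   # prompt-side attribution after base + 2 hops
```

The ratio update mirrors the `current_ratio *= w_thinking.sum() / hop_denom` line above; because `A` is row-stochastic here, every hop redistributes exactly the mass it received.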
+
+##########################################
+# Split-hop configuration (edit here)
+##########################################
+
+# Default number of equal-length segments to split the thinking span into (token-based).
+# Used only when n_hops is not provided.
+SPLIT_HOP_NUM_SEGMENTS: int = 5
+
+
+def _split_span_equal(start: int, end: int, num_segments: int) -> List[Tuple[int, int]]:
+ """Split an inclusive span [start, end] into up to num_segments equal-size segments."""
+ start_i = int(start)
+ end_i = int(end)
+ if end_i < start_i:
+ return []
+
+ length = end_i - start_i + 1
+ n = max(1, int(num_segments))
+ base = length // n
+ rem = length % n
+
+ segments: List[Tuple[int, int]] = []
+ cur = start_i
+ for i in range(n):
+ seg_len = base + (1 if i < rem else 0)
+ if seg_len <= 0:
+ continue
+ seg_start = cur
+ seg_end = cur + seg_len - 1
+ segments.append((seg_start, seg_end))
+ cur = seg_end + 1
+ return segments
+
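The splitter above hands the remainder tokens to the leading segments. A standalone sketch of the same arithmetic (`split_span_equal` is a local copy so the snippet runs on its own, not an import of `_split_span_equal`):

```python
from typing import List, Tuple

def split_span_equal(start: int, end: int, num_segments: int) -> List[Tuple[int, int]]:
    """Same arithmetic as _split_span_equal: inclusive span, remainder to leading segments."""
    if end < start:
        return []
    length = end - start + 1
    n = max(1, num_segments)
    base, rem = divmod(length, n)
    segments, cur = [], start
    for i in range(n):
        seg_len = base + (1 if i < rem else 0)
        if seg_len <= 0:
            continue  # more segments requested than tokens: emit fewer segments
        segments.append((cur, cur + seg_len - 1))
        cur += seg_len
    return segments

print(split_span_equal(0, 9, 3))   # [(0, 3), (4, 6), (7, 9)]
print(split_span_equal(5, 7, 5))   # [(5, 5), (6, 6), (7, 7)]
```

Note the segment boundaries are inclusive on both ends, matching how the class slices `base_total[s_abs : e_abs + 1]`.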
+
+@dataclass
+class SplitHopConfig:
+ num_segments: int
+
+
+class LLMIFRAttributionSplitHop(llm_attr.LLMIFRAttribution):
+ """Experimental FT-IFR variant that split-hops over a segmented thinking span.
+
+ This implementation follows "scheme B":
+ - The model forward pass is unchanged (no token deletion).
+ - Attribution remains token-level and prompt-only (via the same observation mask as multi-hop FT-IFR).
+ - The thinking span is split into equal-length token segments; we propagate attribution mass backward
+ segment-by-segment, redistributing each segment's pending mass to earlier segments and the prompt.
+ """
+
+ def _split_hop_config(self, *, num_segments: int) -> SplitHopConfig:
+ return SplitHopConfig(num_segments=int(num_segments))
+
+ @torch.no_grad()
+ def calculate_ifr_multi_hop_split_hop(
+ self,
+ prompt: str,
+ target: Optional[str] = None,
+ *,
+ sink_span: Optional[Tuple[int, int]] = None,
+ thinking_span: Optional[Tuple[int, int]] = None,
+        n_hops: Optional[int] = None,
+ renorm_threshold: Optional[float] = None,
+ observation_mask: Optional[torch.Tensor | Sequence[float]] = None,
+ ) -> llm_attr.LLMAttributionResult:
+ input_ids_all, attn_mask, prompt_len_full, gen_len = self._ensure_generation(prompt, target)
+ total_len = int(input_ids_all.shape[1])
+
+ num_segments = int(n_hops) if n_hops is not None else int(SPLIT_HOP_NUM_SEGMENTS)
+ if num_segments < 0:
+ num_segments = 0
+
+ if gen_len == 0:
+ empty = torch.zeros((0, total_len), dtype=torch.float32)
+ metadata = {
+ "ifr": {
+ "type": "multi_hop_split_hop",
+ "sink_span_generation": sink_span,
+ "thinking_span_generation": thinking_span,
+ "renorm_threshold": renorm_threshold,
+ "split_hop_config": self._split_hop_config(num_segments=num_segments).__dict__,
+ "note": "No generation tokens; returning empty attribution matrix.",
+ }
+ }
+ return self._finalize_result(empty, metadata=metadata)
+
+ if sink_span is None:
+ sink_span = (0, gen_len - 1)
+ span_start, span_end = sink_span
+ if span_start < 0 or span_end < span_start or span_end >= gen_len:
+ raise ValueError(f"Invalid sink_span ({span_start}, {span_end}) for generation length {gen_len}.")
+
+ if thinking_span is None:
+ thinking_span = sink_span
+ think_start, think_end = thinking_span
+ if think_start < 0 or think_end < think_start or think_end >= gen_len:
+ raise ValueError(f"Invalid thinking_span ({think_start}, {think_end}) for generation length {gen_len}.")
+
+ sink_start_abs = int(prompt_len_full) + int(span_start)
+ sink_end_abs = int(prompt_len_full) + int(span_end)
+ think_start_abs = int(prompt_len_full) + int(think_start)
+ think_end_abs = int(prompt_len_full) + int(think_end)
+
+ obs_mask_tensor: Optional[torch.Tensor] = None
+ if observation_mask is not None:
+ obs_mask_tensor = torch.as_tensor(observation_mask, dtype=torch.float32)
+ if obs_mask_tensor.ndim != 1:
+ raise ValueError("observation_mask must be a 1D tensor or sequence.")
+ if obs_mask_tensor.numel() == gen_len:
+ mask_full = torch.zeros(total_len, dtype=torch.float32)
+ mask_full[int(prompt_len_full) : int(prompt_len_full) + int(gen_len)] = obs_mask_tensor
+ obs_mask_tensor = mask_full
+ elif obs_mask_tensor.numel() != total_len:
+ raise ValueError(
+ f"observation_mask must have length {gen_len} (generation) or {total_len} (full sequence)."
+ )
+
+ cache, attentions, metadata, weight_pack = self._capture_model_state(
+ input_ids_all, attn_mask, recompute_attention=self.recompute_attention,
+ )
+ params = self._build_ifr_params(metadata, total_len)
+ renorm = self.renorm_threshold_default if renorm_threshold is None else float(renorm_threshold)
+
+ base_ifr_raw = compute_ifr_sentence_aggregate(
+ sink_start=sink_start_abs,
+ sink_end=sink_end_abs,
+ cache=cache,
+ attentions=attentions,
+ weight_pack=weight_pack,
+ params=params,
+ renorm_threshold=renorm,
+ rotary_emb=metadata.rotary_emb,
+ )
+ base_total = base_ifr_raw.token_importance_total.to(dtype=torch.float32)
+ base_ifr = IFRAggregate(
+ per_layer=base_ifr_raw.per_layer,
+ token_importance_total=base_total,
+ head_importance_total=base_ifr_raw.head_importance_total,
+ ffn_importance_per_layer=base_ifr_raw.ffn_importance_per_layer,
+ resid_ffn_importance_per_layer=base_ifr_raw.resid_ffn_importance_per_layer,
+ )
+
+ if obs_mask_tensor is None:
+ obs_mask = torch.ones((total_len,), dtype=torch.float32)
+ obs_mask[int(think_start_abs) : min(int(think_end_abs) + 1, total_len)] = 0.0
+ obs_mask[int(sink_start_abs) : min(int(sink_end_abs) + 1, total_len)] = 0.0
+ if int(think_end_abs) + 1 < total_len:
+ obs_mask[int(think_end_abs) + 1 :] = 0.0
+ else:
+ obs_mask = obs_mask_tensor.clone().to(dtype=torch.float32)
+ if int(obs_mask.shape[0]) != int(total_len):
+ raise ValueError("observation_mask must match sequence length.")
+
+ base_obs = base_total.clone() * obs_mask
+ obs_accum = base_obs.clone()
+ per_hop_obs: List[torch.Tensor] = []
+
+ if num_segments <= 0:
+ segments_gen = []
+ else:
+ segments_gen = _split_span_equal(int(think_start), int(think_end), num_segments)
+ segments_abs: List[Tuple[int, int]] = [
+ (int(prompt_len_full) + int(s), int(prompt_len_full) + int(e)) for s, e in segments_gen
+ ]
+
+ # Pending mass (scheme B): start from hop0 mass on each segment and redistribute backward.
+ pending_weights: List[torch.Tensor] = [
+ base_total[s_abs : e_abs + 1].clone().to(dtype=torch.float32) for (s_abs, e_abs) in segments_abs
+ ]
+
+ raw_attributions: List[IFRAggregate] = [base_ifr]
+ hop_details: List[Dict[str, Any]] = []
+
+ # Process segments from last to first: cot[K-1] -> cot[K-2] -> ... -> cot[0] -> prompt
+ for seg_idx in range(len(segments_abs) - 1, -1, -1):
+ seg_start_abs, seg_end_abs = segments_abs[seg_idx]
+ w_pending = pending_weights[seg_idx]
+ pending_mass = float(w_pending.sum().item())
+
+ if pending_mass <= 0.0:
+ raw_attributions.append(
+ IFRAggregate(
+ per_layer=[],
+ token_importance_total=torch.zeros_like(base_total),
+ head_importance_total=torch.zeros_like(base_ifr.head_importance_total),
+ ffn_importance_per_layer=torch.zeros_like(base_ifr.ffn_importance_per_layer),
+ resid_ffn_importance_per_layer=torch.zeros_like(base_ifr.resid_ffn_importance_per_layer),
+ )
+ )
+ per_hop_obs.append(torch.zeros_like(base_total))
+ hop_details.append(
+ {
+ "segment_index": int(seg_idx),
+ "segment_span_generation": segments_gen[seg_idx],
+ "segment_span_absolute": (int(seg_start_abs), int(seg_end_abs)),
+ "pending_mass": float(pending_mass),
+ "masked_denom": 0.0,
+ "alpha": 0.0,
+ }
+ )
+ continue
+
+ hop_ifr_raw = compute_ifr_sentence_aggregate(
+ sink_start=int(seg_start_abs),
+ sink_end=int(seg_end_abs),
+ cache=cache,
+ attentions=attentions,
+ weight_pack=weight_pack,
+ params=params,
+ renorm_threshold=renorm,
+ sink_weights=w_pending.to(dtype=params.model_dtype),
+ rotary_emb=metadata.rotary_emb,
+ )
+
+ hop_total = hop_ifr_raw.token_importance_total.to(dtype=torch.float32)
+ hop_total_masked = hop_total.clone()
+ hop_total_masked[int(seg_start_abs) : int(seg_end_abs) + 1] = 0.0
+
+ masked_denom = float(hop_total_masked.sum().item())
+ alpha = float(pending_mass) / (masked_denom + 1e-12) if masked_denom > 0.0 else 0.0
+ distributed = hop_total_masked * alpha
+
+ # Prompt-only observation accumulation (generation+thinking excluded by obs_mask).
+ obs_only = distributed * obs_mask
+ obs_accum += obs_only
+ per_hop_obs.append(obs_only)
+
+ # Redistribute mass to earlier segments for further propagation.
+ if alpha > 0.0:
+ for j in range(seg_idx):
+ js_abs, je_abs = segments_abs[j]
+ pending_weights[j] = pending_weights[j] + distributed[js_abs : je_abs + 1]
+ pending_weights[seg_idx] = torch.zeros_like(pending_weights[seg_idx])
+
+ hop_ifr = IFRAggregate(
+ per_layer=hop_ifr_raw.per_layer,
+ token_importance_total=hop_total_masked,
+ head_importance_total=hop_ifr_raw.head_importance_total,
+ ffn_importance_per_layer=hop_ifr_raw.ffn_importance_per_layer,
+ resid_ffn_importance_per_layer=hop_ifr_raw.resid_ffn_importance_per_layer,
+ )
+ raw_attributions.append(hop_ifr)
+ hop_details.append(
+ {
+ "segment_index": int(seg_idx),
+ "segment_span_generation": segments_gen[seg_idx],
+ "segment_span_absolute": (int(seg_start_abs), int(seg_end_abs)),
+ "pending_mass": float(pending_mass),
+ "masked_denom": float(masked_denom),
+ "alpha": float(alpha),
+ }
+ )
+
+ obs_avg = obs_accum / float(max(1, len(per_hop_obs)))
+ observation: Dict[str, torch.Tensor | List[torch.Tensor]] = {
+ "mask": obs_mask,
+ "base": base_obs,
+ "per_hop": per_hop_obs,
+ "sum": obs_accum,
+ "avg": obs_avg,
+ }
+
+ multi_hop = MultiHopIFRResult(raw_attributions=raw_attributions, thinking_ratios=[], observation=observation)
+ eval_vector = multi_hop.observation["sum"]
+
+ score_array = torch.full((gen_len, total_len), torch.nan, dtype=torch.float32)
+ for offset in range(int(span_start), int(span_end) + 1):
+ score_array[offset] = eval_vector
+
+ projected_per_hop = [self._project_vector(result.token_importance_total) for result in multi_hop.raw_attributions]
+ obs = multi_hop.observation
+ observation_projected = {
+ "mask": self.extract_user_prompt_attributions(self.prompt_tokens, obs["mask"].view(1, -1))[0],
+ "base": self._project_vector(obs["base"]),
+ "sum": self._project_vector(obs["sum"]),
+ "avg": self._project_vector(obs["avg"]),
+ "per_hop": [self._project_vector(vec) for vec in obs["per_hop"]],
+ }
+
+ meta: Dict[str, Any] = {
+ "ifr": {
+ "type": "multi_hop_split_hop",
+ "sink_span_generation": (int(span_start), int(span_end)),
+ "sink_span_absolute": (int(sink_start_abs), int(sink_end_abs)),
+ "thinking_span_generation": (int(think_start), int(think_end)),
+ "thinking_span_absolute": (int(think_start_abs), int(think_end_abs)),
+ "renorm_threshold": float(renorm),
+ "split_hop_config": self._split_hop_config(num_segments=num_segments).__dict__,
+ "segments_generation": [(int(s), int(e)) for (s, e) in segments_gen],
+ "segments_absolute": [(int(s), int(e)) for (s, e) in segments_abs],
+ "hop_details": hop_details,
+ "per_hop_projected": projected_per_hop,
+ "observation_projected": observation_projected,
+ "raw": multi_hop,
+ }
+ }
+
+ return self._finalize_result(score_array, metadata=meta)
+
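The split-hop backward pass above can be sketched with plain vectors: walk segments last to first, zero each segment's own columns, rescale by `alpha` so the pending mass is conserved, then push it onto earlier segments and the prompt. Here `redistribute` and the lower-triangular `A` are hypothetical stand-ins for one masked `compute_ifr_sentence_aggregate` call, not the real kernel:

```python
import torch

torch.manual_seed(1)
T = 8                                   # toy sequence: prompt 0..3, thinking 4..7
segments = [(4, 5), (6, 7)]             # two thinking segments
A = torch.rand(T, T).tril()             # causal stand-in attribution matrix

def redistribute(seg, w):
    # stand-in: importance over all tokens for the weighted segment
    s, e = seg
    return w @ A[s : e + 1]

base = A[4:8].sum(dim=0)                # hop0 mass (stand-in for base_total)
pending = [base[s : e + 1].clone() for s, e in segments]
prompt_mass = base[:4].clone()          # observation starts from hop0 prompt mass

for i in range(len(segments) - 1, -1, -1):
    w = pending[i]
    mass = float(w.sum())
    if mass <= 0:
        continue
    out = redistribute(segments[i], w)
    s, e = segments[i]
    out[s : e + 1] = 0.0                # a segment cannot keep its own mass
    alpha = mass / (float(out.sum()) + 1e-12)
    out = out * alpha                   # conserve the pending mass exactly
    prompt_mass += out[:4]
    for j in range(i):                  # push the rest onto earlier segments
        js, je = segments[j]
        pending[j] += out[js : je + 1]
    pending[i] = torch.zeros_like(pending[i])

print(prompt_mass)                      # prompt attribution after the full backward pass
```

Because each `alpha` rescaling conserves the pending mass and the last pass can only reach the prompt, the total mass that started on the thinking span ends up entirely on prompt positions.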
+##########################################
+# both: stop_words + in_all_gen
+##########################################
+
+class LLMIFRAttributionBoth(llm_attr.LLMIFRAttribution):
+ """Experimental FT-IFR variant combining stop-token soft deletion with in_all_gen multi-hop.
+
+ Notes
+ -----
+ - Internally, hop0 and subsequent hops aggregate over all_gen_span (= CoT + output).
+ - The EOS token (assumed to be the last generation token) is excluded from all spans.
+ - Stop tokens are soft-deleted from hop aggregation weights and observation vectors.
+ - Returned attribution matrix follows scheme B (rows filled for sink_span_generation).
+ """
+
+ def _stop_config(self) -> StopTokenConfig:
+ return StopTokenConfig(
+ stop_tokens=list(STOP_TOKENS),
+ skip_whitespace=bool(SKIP_WHITESPACE),
+ strip_before_match=bool(STRIP_BEFORE_MATCH),
+ )
+
+ @torch.no_grad()
+ def calculate_ifr_multi_hop_both(
+ self,
+ prompt: str,
+ target: Optional[str] = None,
+ *,
+ sink_span: Optional[Tuple[int, int]] = None,
+ thinking_span: Optional[Tuple[int, int]] = None,
+ n_hops: int = 1,
+ renorm_threshold: Optional[float] = None,
+ observation_mask: Optional[torch.Tensor | Sequence[float]] = None,
+ ) -> llm_attr.LLMAttributionResult:
+ input_ids_all, attn_mask, prompt_len_full, gen_len = self._ensure_generation(prompt, target)
+ total_len = int(input_ids_all.shape[1])
+
+ if gen_len == 0:
+ empty = torch.zeros((0, total_len), dtype=torch.float32)
+ metadata = {
+ "ifr": {
+ "type": "multi_hop_both",
+ "sink_span_generation": sink_span,
+ "thinking_span_generation": thinking_span,
+ "renorm_threshold": renorm_threshold,
+ "stop_config": self._stop_config().__dict__,
+ "note": "No generation tokens; returning empty attribution matrix.",
+ }
+ }
+ return self._finalize_result(empty, metadata=metadata)
+
+ end_no_eos = _last_attributable_generation_index(
+ self.tokenizer,
+ self.generation_ids,
+ list(getattr(self, "generation_tokens", []) or []),
+ )
+ if end_no_eos < 0:
+ score_array = torch.full((int(gen_len), total_len), torch.nan, dtype=torch.float32)
+ metadata = {
+ "ifr": {
+ "type": "multi_hop_both",
+ "sink_span_generation": sink_span,
+ "thinking_span_generation": thinking_span,
+ "renorm_threshold": renorm_threshold,
+ "stop_config": self._stop_config().__dict__,
+ "note": "No non-EOS generation tokens; returning NaN attribution matrix.",
+ }
+ }
+ return self._finalize_result(score_array, metadata=metadata)
+
+ # Scheme B: fill only sink_span rows; default to all_gen_span.
+ if sink_span is None:
+ span_start, span_end = (0, end_no_eos)
+ else:
+ span_start, span_end = sink_span
+ if span_start < 0 or span_end < span_start or span_end > end_no_eos:
+ raise ValueError(
+ f"Invalid sink_span ({span_start}, {span_end}) for generation length {gen_len} with EOS excluded."
+ )
+
+ think_start_abs: Optional[int] = None
+ think_end_abs: Optional[int] = None
+ if thinking_span is not None:
+ think_start, think_end = thinking_span
+ if think_start < 0 or think_end < think_start or think_end > end_no_eos:
+ raise ValueError(
+ f"Invalid thinking_span ({think_start}, {think_end}) for generation length {gen_len} with EOS excluded."
+ )
+ think_start_abs = int(prompt_len_full) + int(think_start)
+ think_end_abs = int(prompt_len_full) + int(think_end)
+
+ # Internal hop span: CoT+output when both spans are provided; otherwise default all_gen.
+ if sink_span is not None and thinking_span is not None:
+ all_gen_start = min(int(span_start), int(thinking_span[0]))
+ all_gen_end = max(int(span_end), int(thinking_span[1]))
+ else:
+ all_gen_start = 0
+ all_gen_end = int(end_no_eos)
+
+ all_gen_start = max(0, int(all_gen_start))
+ all_gen_end = min(int(end_no_eos), int(all_gen_end))
+ if all_gen_end < all_gen_start:
+ raise ValueError("Derived all_gen_span is empty after EOS exclusion and bounds checking.")
+
+ sink_start_abs = int(prompt_len_full) + int(span_start)
+ sink_end_abs = int(prompt_len_full) + int(span_end)
+ all_gen_start_abs = int(prompt_len_full) + int(all_gen_start)
+ all_gen_end_abs = int(prompt_len_full) + int(all_gen_end)
+
+ obs_mask_tensor: Optional[torch.Tensor] = None
+ if observation_mask is not None:
+ obs_mask_tensor = torch.as_tensor(observation_mask, dtype=torch.float32)
+ if obs_mask_tensor.ndim != 1:
+ raise ValueError("observation_mask must be a 1D tensor or sequence.")
+ if obs_mask_tensor.numel() == gen_len:
+ mask_full = torch.zeros(total_len, dtype=torch.float32)
+ mask_full[int(prompt_len_full) : int(prompt_len_full) + int(gen_len)] = obs_mask_tensor
+ obs_mask_tensor = mask_full
+ elif obs_mask_tensor.numel() != total_len:
+ raise ValueError(
+ f"observation_mask must have length {gen_len} (generation) or {total_len} (full sequence)."
+ )
+
+ stop_keep_mask_full = _build_stop_keep_mask_full(
+ prompt_len_full=int(prompt_len_full),
+ gen_len=int(gen_len),
+ user_prompt_indices=list(getattr(self, "user_prompt_indices", []) or []),
+ user_prompt_tokens=list(getattr(self, "user_prompt_tokens", []) or []),
+ generation_tokens=list(getattr(self, "generation_tokens", []) or []),
+ )
+
+ cache, attentions, metadata, weight_pack = self._capture_model_state(
+ input_ids_all, attn_mask, recompute_attention=self.recompute_attention,
+ )
+ params = self._build_ifr_params(metadata, total_len)
+ renorm = self.renorm_threshold_default if renorm_threshold is None else float(renorm_threshold)
+
+ hop_count = max(0, int(n_hops))
+
+ # Base hop: aggregate over all_gen_span, excluding stop tokens from the sink weights.
+ gen_tokens = list(getattr(self, "generation_tokens", []) or [])
+ span_stops = []
+ for gi in range(int(all_gen_start), int(all_gen_end) + 1):
+ tok = gen_tokens[gi] if 0 <= gi < len(gen_tokens) else ""
+ span_stops.append(is_stop_token(tok))
+ base_weights = None
+ if any(span_stops):
+ base_weights = torch.as_tensor(
+ [0.0 if st else 1.0 for st in span_stops],
+ dtype=params.model_dtype,
+ )
+
+ base_ifr_raw = compute_ifr_sentence_aggregate(
+ sink_start=all_gen_start_abs,
+ sink_end=all_gen_end_abs,
+ cache=cache,
+ attentions=attentions,
+ weight_pack=weight_pack,
+ params=params,
+ renorm_threshold=renorm,
+ sink_weights=base_weights,
+ rotary_emb=metadata.rotary_emb,
+ )
+ base_total = torch.nan_to_num(
+ base_ifr_raw.token_importance_total.to(dtype=torch.float32),
+ nan=0.0,
+ posinf=0.0,
+ neginf=0.0,
+ )
+ base_total = base_total * stop_keep_mask_full
+ base_ifr = IFRAggregate(
+ per_layer=base_ifr_raw.per_layer,
+ token_importance_total=base_total,
+ head_importance_total=base_ifr_raw.head_importance_total,
+ ffn_importance_per_layer=base_ifr_raw.ffn_importance_per_layer,
+ resid_ffn_importance_per_layer=base_ifr_raw.resid_ffn_importance_per_layer,
+ )
+
+ # Observation mask: respect provided observation_mask, otherwise follow in_all_gen scheme B,
+ # and then apply stop-token masking so stop tokens are removed from the observation path.
+ if obs_mask_tensor is None:
+ obs_mask = torch.ones((total_len,), dtype=torch.float32)
+ obs_mask[int(all_gen_start_abs) : min(int(all_gen_end_abs) + 1, total_len)] = 0.0
+ if int(all_gen_end_abs) + 1 < total_len:
+ obs_mask[int(all_gen_end_abs) + 1 :] = 0.0
+ else:
+ obs_mask = obs_mask_tensor.clone().to(dtype=torch.float32)
+ if int(obs_mask.shape[0]) != int(total_len):
+ raise ValueError("observation_mask must match sequence length.")
+
+ obs_mask = obs_mask * stop_keep_mask_full
+
+ base_obs = base_total.clone() * obs_mask
+ obs_accum = base_obs.clone()
+ per_hop_obs: List[torch.Tensor] = []
+
+ raw_attributions: List[IFRAggregate] = [base_ifr]
+
+ denom_base = float(base_total.sum().item())
+ w_span_raw = base_total[int(all_gen_start_abs) : int(all_gen_end_abs) + 1].detach().clone()
+ w_span_raw = torch.nan_to_num(w_span_raw, nan=0.0, posinf=0.0, neginf=0.0)
+ w_span_sum = float(w_span_raw.sum().item())
+ w_span_weights = (
+ (w_span_raw / (w_span_sum + 1e-12))
+ if w_span_sum > 0.0
+ else torch.zeros_like(w_span_raw, dtype=torch.float32)
+ )
+ current_ratio = float(w_span_sum) / (denom_base + 1e-12) if denom_base > 0.0 else 0.0
+ ratios: List[float] = [float(current_ratio)]
+
+ for hop in range(1, hop_count + 1):
+ if float(w_span_sum) <= 0.0 or float(current_ratio) <= 0.0:
+ zeros = torch.zeros_like(base_total)
+ for _ in range(hop, hop_count + 1):
+ raw_attributions.append(
+ IFRAggregate(
+ per_layer=[],
+ token_importance_total=zeros,
+ head_importance_total=torch.zeros_like(base_ifr.head_importance_total),
+ ffn_importance_per_layer=torch.zeros_like(base_ifr.ffn_importance_per_layer),
+ resid_ffn_importance_per_layer=torch.zeros_like(base_ifr.resid_ffn_importance_per_layer),
+ )
+ )
+ per_hop_obs.append(torch.zeros_like(base_total))
+ ratios.append(0.0)
+ break
+
+ hop_ifr_raw = compute_ifr_sentence_aggregate(
+ sink_start=all_gen_start_abs,
+ sink_end=all_gen_end_abs,
+ cache=cache,
+ attentions=attentions,
+ weight_pack=weight_pack,
+ params=params,
+ renorm_threshold=renorm,
+                sink_weights=w_span_weights.to(dtype=params.model_dtype),
+ rotary_emb=metadata.rotary_emb,
+ )
+
+ hop_total = torch.nan_to_num(
+ hop_ifr_raw.token_importance_total.to(dtype=torch.float32),
+ nan=0.0,
+ posinf=0.0,
+ neginf=0.0,
+ )
+ hop_total = hop_total * stop_keep_mask_full
+ hop_ifr = IFRAggregate(
+ per_layer=hop_ifr_raw.per_layer,
+ token_importance_total=hop_total,
+ head_importance_total=hop_ifr_raw.head_importance_total,
+ ffn_importance_per_layer=hop_ifr_raw.ffn_importance_per_layer,
+ resid_ffn_importance_per_layer=hop_ifr_raw.resid_ffn_importance_per_layer,
+ )
+ raw_attributions.append(hop_ifr)
+
+ obs_only = hop_total * obs_mask * float(current_ratio)
+ obs_accum += obs_only
+ per_hop_obs.append(obs_only)
+
+ w_span_raw = hop_total[int(all_gen_start_abs) : int(all_gen_end_abs) + 1].detach().clone()
+ w_span_raw = torch.nan_to_num(w_span_raw, nan=0.0, posinf=0.0, neginf=0.0)
+ w_span_sum = float(w_span_raw.sum().item())
+ w_span_weights = (
+ (w_span_raw / (w_span_sum + 1e-12))
+ if w_span_sum > 0.0
+ else torch.zeros_like(w_span_raw, dtype=torch.float32)
+ )
+
+ hop_denom = float(hop_total.sum().item())
+ if hop_denom <= 0.0:
+ current_ratio = 0.0
+ else:
+ current_ratio *= float(w_span_sum) / (hop_denom + 1e-12)
+ ratios.append(float(current_ratio))
+
+ obs_avg = obs_accum / float(max(1, hop_count))
+ observation: Dict[str, torch.Tensor | List[torch.Tensor]] = {
+ "mask": obs_mask,
+ "base": base_obs,
+ "per_hop": per_hop_obs,
+ "sum": obs_accum,
+ "avg": obs_avg,
+ }
+
+ multi_hop = MultiHopIFRResult(raw_attributions=raw_attributions, thinking_ratios=ratios, observation=observation)
+ eval_vector = multi_hop.observation["sum"]
+
+ score_array = torch.full((int(gen_len), total_len), torch.nan, dtype=torch.float32)
+ for offset in range(int(span_start), int(span_end) + 1):
+ tok = gen_tokens[offset] if 0 <= offset < len(gen_tokens) else ""
+ if is_stop_token(tok):
+ continue
+ score_array[offset] = eval_vector
+
+ projected_per_hop = [self._project_vector(result.token_importance_total) for result in multi_hop.raw_attributions]
+ obs = multi_hop.observation
+ observation_projected = {
+ "mask": self.extract_user_prompt_attributions(self.prompt_tokens, obs["mask"].view(1, -1))[0],
+ "base": self._project_vector(obs["base"]),
+ "sum": self._project_vector(obs["sum"]),
+ "avg": self._project_vector(obs["avg"]),
+ "per_hop": [self._project_vector(vec) for vec in obs["per_hop"]],
+ }
+
+ meta: Dict[str, Any] = {
+ "ifr": {
+ "type": "multi_hop_both",
+ "sink_span_generation": (int(span_start), int(span_end)),
+ "sink_span_absolute": (int(sink_start_abs), int(sink_end_abs)),
+ "thinking_span_generation": (
+ (int(thinking_span[0]), int(thinking_span[1])) if thinking_span is not None else None
+ ),
+ "thinking_span_absolute": (
+ (int(think_start_abs), int(think_end_abs)) if think_start_abs is not None else None
+ ),
+ "all_gen_span_generation": (int(all_gen_start), int(all_gen_end)),
+ "all_gen_span_absolute": (int(all_gen_start_abs), int(all_gen_end_abs)),
+ "renorm_threshold": float(renorm),
+ "n_hops": int(n_hops),
+ "thinking_ratios": multi_hop.thinking_ratios,
+ "per_hop_projected": projected_per_hop,
+ "observation_projected": observation_projected,
+ "stop_config": self._stop_config().__dict__,
+ "raw": multi_hop,
+ "note": "both = stop_words + in_all_gen (scheme B; hops over all_gen_span with EOS excluded).",
+ }
+ }
+
+ return self._finalize_result(score_array, metadata=meta)
+
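The stop-token soft deletion shared by these variants amounts to a keep mask over the full sequence: stop tokens get weight zero both as sink weights and in the observation path. A minimal sketch; the stop list and `is_stop` here are illustrative, not the module's real `STOP_TOKENS` / `is_stop_token`:

```python
import torch

STOPS = {".", ",", "the"}               # illustrative stop list, not the real STOP_TOKENS

def is_stop(tok: str) -> bool:
    # toy matcher standing in for is_stop_token (which also handles whitespace skipping)
    return tok.strip().lower() in STOPS

prompt_toks = ["The", "cat", "sat", "."]
gen_toks = ["It", "was", "happy", "."]

# full-sequence keep mask, analogous to _build_stop_keep_mask_full
keep = torch.tensor([0.0 if is_stop(t) else 1.0 for t in prompt_toks + gen_toks])
print(keep)                             # tensor([0., 1., 1., 0., 1., 1., 1., 0.])

# sink weights over the generation span, with stop tokens soft-deleted
sink_w = keep[len(prompt_toks):]
# any importance vector is masked the same way before entering the observation path
importance = torch.rand(len(keep)) * keep
```

The same mask is applied twice in the methods above: once to the hop aggregation weights (so stop tokens never act as sinks) and once to `obs_mask` (so they never receive attribution).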
+##########################################
+# in_all_gen
+##########################################
+
+class LLMIFRAttributionInAllGen(llm_attr.LLMIFRAttribution):
+ """Experimental FT-IFR variant that runs all hops over the full generation span (CoT + output).
+
+ Notes
+ -----
+ This method follows scheme B for compatibility:
+ - Internally, hop0 and subsequent hops aggregate over all_gen_span (= CoT + output).
+ - The returned attribution matrix is still written only for `sink_span` rows (answer tokens),
+ falling back to all_gen_span when `sink_span` is not provided.
+ - The EOS token (assumed to be the last generation token) is excluded from all spans.
+ """
+
+ @torch.no_grad()
+ def calculate_ifr_in_all_gen(
+ self,
+ prompt: str,
+ target: Optional[str] = None,
+ *,
+ sink_span: Optional[Tuple[int, int]] = None,
+ thinking_span: Optional[Tuple[int, int]] = None,
+ n_hops: int = 1,
+ renorm_threshold: Optional[float] = None,
+ observation_mask: Optional[torch.Tensor | Sequence[float]] = None,
+ ) -> llm_attr.LLMAttributionResult:
+ input_ids_all, attn_mask, prompt_len_full, gen_len = self._ensure_generation(prompt, target)
+ total_len = int(input_ids_all.shape[1])
+
+ if gen_len == 0:
+ empty = torch.zeros((0, total_len), dtype=torch.float32)
+ metadata = {
+ "ifr": {
+ "type": "in_all_gen",
+ "sink_span_generation": sink_span,
+ "thinking_span_generation": thinking_span,
+ "renorm_threshold": renorm_threshold,
+ "note": "No generation tokens; returning empty attribution matrix.",
+ }
+ }
+ return self._finalize_result(empty, metadata=metadata)
+
+ end_no_eos = _last_attributable_generation_index(
+ self.tokenizer,
+ self.generation_ids,
+ list(getattr(self, "generation_tokens", []) or []),
+ )
+ if end_no_eos < 0:
+ score_array = torch.full((int(gen_len), total_len), torch.nan, dtype=torch.float32)
+ metadata = {
+ "ifr": {
+ "type": "in_all_gen",
+ "sink_span_generation": sink_span,
+ "thinking_span_generation": thinking_span,
+ "renorm_threshold": renorm_threshold,
+ "note": "No non-EOS generation tokens; returning NaN attribution matrix.",
+ }
+ }
+ return self._finalize_result(score_array, metadata=metadata)
+
+ # Scheme B: fill only sink_span rows; default to all_gen_span.
+ if sink_span is None:
+ span_start, span_end = (0, end_no_eos)
+ else:
+ span_start, span_end = sink_span
+ if span_start < 0 or span_end < span_start or span_end > end_no_eos:
+ raise ValueError(
+ f"Invalid sink_span ({span_start}, {span_end}) for generation length {gen_len} with EOS excluded."
+ )
+
+ think_start_abs: Optional[int] = None
+ think_end_abs: Optional[int] = None
+ if thinking_span is not None:
+ think_start, think_end = thinking_span
+ if think_start < 0 or think_end < think_start or think_end > end_no_eos:
+ raise ValueError(
+ f"Invalid thinking_span ({think_start}, {think_end}) for generation length {gen_len} with EOS excluded."
+ )
+ think_start_abs = int(prompt_len_full) + int(think_start)
+ think_end_abs = int(prompt_len_full) + int(think_end)
+
+ # Internal hop span: CoT+output when both spans are provided; otherwise default all_gen.
+ if sink_span is not None and thinking_span is not None:
+ all_gen_start = min(int(span_start), int(thinking_span[0]))
+ all_gen_end = max(int(span_end), int(thinking_span[1]))
+ else:
+ all_gen_start = 0
+ all_gen_end = int(end_no_eos)
+
+ all_gen_start = max(0, int(all_gen_start))
+ all_gen_end = min(int(end_no_eos), int(all_gen_end))
+ if all_gen_end < all_gen_start:
+ raise ValueError("Derived all_gen_span is empty after EOS exclusion and bounds checking.")
+
+ sink_start_abs = int(prompt_len_full) + int(span_start)
+ sink_end_abs = int(prompt_len_full) + int(span_end)
+ all_gen_start_abs = int(prompt_len_full) + int(all_gen_start)
+ all_gen_end_abs = int(prompt_len_full) + int(all_gen_end)
+
+ obs_mask_tensor: Optional[torch.Tensor] = None
+ if observation_mask is not None:
+ obs_mask_tensor = torch.as_tensor(observation_mask, dtype=torch.float32)
+ if obs_mask_tensor.ndim != 1:
+ raise ValueError("observation_mask must be a 1D tensor or sequence.")
+ if obs_mask_tensor.numel() == gen_len:
+ mask_full = torch.zeros(total_len, dtype=torch.float32)
+ mask_full[int(prompt_len_full) : int(prompt_len_full) + int(gen_len)] = obs_mask_tensor
+ obs_mask_tensor = mask_full
+ elif obs_mask_tensor.numel() != total_len:
+ raise ValueError(
+ f"observation_mask must have length {gen_len} (generation) or {total_len} (full sequence)."
+ )
+
+ cache, attentions, metadata, weight_pack = self._capture_model_state(
+ input_ids_all, attn_mask, recompute_attention=self.recompute_attention,
+ )
+ params = self._build_ifr_params(metadata, total_len)
+ renorm = self.renorm_threshold_default if renorm_threshold is None else float(renorm_threshold)
+
+ # Multi-hop IFR over all_gen_span (scheme B). We implement the hop loop inline
+ # to robustly handle rare NaNs in intermediate IFR vectors on long sequences.
+ hop_count = max(0, int(n_hops))
+
+ base_ifr_raw = compute_ifr_sentence_aggregate(
+ sink_start=all_gen_start_abs,
+ sink_end=all_gen_end_abs,
+ cache=cache,
+ attentions=attentions,
+ weight_pack=weight_pack,
+ params=params,
+ renorm_threshold=renorm,
+ rotary_emb=metadata.rotary_emb,
+ )
+ base_total = torch.nan_to_num(
+ base_ifr_raw.token_importance_total.to(dtype=torch.float32),
+ nan=0.0,
+ posinf=0.0,
+ neginf=0.0,
+ )
+ base_ifr = IFRAggregate(
+ per_layer=base_ifr_raw.per_layer,
+ token_importance_total=base_total,
+ head_importance_total=base_ifr_raw.head_importance_total,
+ ffn_importance_per_layer=base_ifr_raw.ffn_importance_per_layer,
+ resid_ffn_importance_per_layer=base_ifr_raw.resid_ffn_importance_per_layer,
+ )
+
+ if obs_mask_tensor is None:
+ obs_mask = torch.ones((total_len,), dtype=torch.float32)
+ obs_mask[int(all_gen_start_abs) : min(int(all_gen_end_abs) + 1, total_len)] = 0.0
+ if int(all_gen_end_abs) + 1 < total_len:
+ obs_mask[int(all_gen_end_abs) + 1 :] = 0.0
+ else:
+ obs_mask = obs_mask_tensor.clone().to(dtype=torch.float32)
+ if int(obs_mask.shape[0]) != int(total_len):
+ raise ValueError("observation_mask must match sequence length.")
+
+ base_obs = base_total.clone() * obs_mask
+ obs_accum = base_obs.clone()
+ per_hop_obs: List[torch.Tensor] = []
+
+ raw_attributions: List[IFRAggregate] = [base_ifr]
+
+ denom_base = float(base_total.sum().item())
+ w_span_raw = base_total[int(all_gen_start_abs) : int(all_gen_end_abs) + 1].detach().clone()
+ w_span_raw = torch.nan_to_num(w_span_raw, nan=0.0, posinf=0.0, neginf=0.0)
+ w_span_sum = float(w_span_raw.sum().item())
+ w_span_weights = (
+ (w_span_raw / (w_span_sum + 1e-12))
+ if w_span_sum > 0.0
+ else torch.zeros_like(w_span_raw, dtype=torch.float32)
+ )
+ current_ratio = float(w_span_sum) / (denom_base + 1e-12) if denom_base > 0.0 else 0.0
+ ratios: List[float] = [float(current_ratio)]
+
+ for hop in range(1, hop_count + 1):
+ if float(w_span_sum) <= 0.0 or float(current_ratio) <= 0.0:
+ zeros = torch.zeros_like(base_total)
+            for _ in range(hop, hop_count + 1):
+                raw_attributions.append(
+                    IFRAggregate(
+                        per_layer=[],
+                        token_importance_total=zeros,
+                        head_importance_total=torch.zeros_like(base_ifr.head_importance_total),
+                        ffn_importance_per_layer=torch.zeros_like(base_ifr.ffn_importance_per_layer),
+                        resid_ffn_importance_per_layer=torch.zeros_like(base_ifr.resid_ffn_importance_per_layer),
+                    )
+                )
+                # Pad per-hop observations and ratios for every skipped hop so
+                # their lengths stay aligned with raw_attributions.
+                per_hop_obs.append(torch.zeros_like(base_total))
+                ratios.append(0.0)
+            break
+
+ hop_ifr_raw = compute_ifr_sentence_aggregate(
+ sink_start=all_gen_start_abs,
+ sink_end=all_gen_end_abs,
+ cache=cache,
+ attentions=attentions,
+ weight_pack=weight_pack,
+ params=params,
+ renorm_threshold=renorm,
+ sink_weights=w_span_weights,
+ rotary_emb=metadata.rotary_emb,
+ )
+
+ hop_total = torch.nan_to_num(
+ hop_ifr_raw.token_importance_total.to(dtype=torch.float32),
+ nan=0.0,
+ posinf=0.0,
+ neginf=0.0,
+ )
+ hop_ifr = IFRAggregate(
+ per_layer=hop_ifr_raw.per_layer,
+ token_importance_total=hop_total,
+ head_importance_total=hop_ifr_raw.head_importance_total,
+ ffn_importance_per_layer=hop_ifr_raw.ffn_importance_per_layer,
+ resid_ffn_importance_per_layer=hop_ifr_raw.resid_ffn_importance_per_layer,
+ )
+ raw_attributions.append(hop_ifr)
+
+ obs_only = hop_total * obs_mask * float(current_ratio)
+ obs_accum += obs_only
+ per_hop_obs.append(obs_only)
+
+ w_span_raw = hop_total[int(all_gen_start_abs) : int(all_gen_end_abs) + 1].detach().clone()
+ w_span_raw = torch.nan_to_num(w_span_raw, nan=0.0, posinf=0.0, neginf=0.0)
+ w_span_sum = float(w_span_raw.sum().item())
+ w_span_weights = (
+ (w_span_raw / (w_span_sum + 1e-12))
+ if w_span_sum > 0.0
+ else torch.zeros_like(w_span_raw, dtype=torch.float32)
+ )
+
+ hop_denom = float(hop_total.sum().item())
+ if hop_denom <= 0.0:
+ current_ratio = 0.0
+ else:
+ current_ratio *= float(w_span_sum) / (hop_denom + 1e-12)
+ ratios.append(float(current_ratio))
+
+ obs_avg = obs_accum / float(max(1, hop_count))
+ observation: Dict[str, torch.Tensor | List[torch.Tensor]] = {
+ "mask": obs_mask,
+ "base": base_obs,
+ "per_hop": per_hop_obs,
+ "sum": obs_accum,
+ "avg": obs_avg,
+ }
+
+ multi_hop = MultiHopIFRResult(raw_attributions=raw_attributions, thinking_ratios=ratios, observation=observation)
+
+ eval_vector = multi_hop.observation["sum"]
+ score_array = torch.full((int(gen_len), total_len), torch.nan, dtype=torch.float32)
+ for offset in range(int(span_start), int(span_end) + 1):
+ score_array[offset] = eval_vector
+
+ projected_per_hop = [self._project_vector(result.token_importance_total) for result in multi_hop.raw_attributions]
+ obs = multi_hop.observation
+ observation_projected = {
+ "mask": self.extract_user_prompt_attributions(self.prompt_tokens, obs["mask"].view(1, -1))[0],
+ "base": self._project_vector(obs["base"]),
+ "sum": self._project_vector(obs["sum"]),
+ "avg": self._project_vector(obs["avg"]),
+ "per_hop": [self._project_vector(vec) for vec in obs["per_hop"]],
+ }
+
+ meta: Dict[str, Any] = {
+ "ifr": {
+ "type": "in_all_gen",
+ "sink_span_generation": (int(span_start), int(span_end)),
+ "sink_span_absolute": (int(sink_start_abs), int(sink_end_abs)),
+ "thinking_span_generation": (
+ (int(thinking_span[0]), int(thinking_span[1])) if thinking_span is not None else None
+ ),
+ "thinking_span_absolute": (
+ (int(think_start_abs), int(think_end_abs)) if think_start_abs is not None else None
+ ),
+ "all_gen_span_generation": (int(all_gen_start), int(all_gen_end)),
+ "all_gen_span_absolute": (int(all_gen_start_abs), int(all_gen_end_abs)),
+ "renorm_threshold": float(renorm),
+ "n_hops": int(n_hops),
+ "thinking_ratios": multi_hop.thinking_ratios,
+ "per_hop_projected": projected_per_hop,
+ "observation_projected": observation_projected,
+ "raw": multi_hop,
+ "note": "scheme B: rows filled for sink_span_generation; hops over all_gen_span (EOS excluded).",
+ }
+ }
+
+ return self._finalize_result(score_array, metadata=meta)
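The hop loop above keeps a running `current_ratio` that discounts each hop by the fraction of importance mass remaining inside `all_gen_span`. A minimal standalone sketch of that recursion (names are illustrative, not from the codebase):

```python
# Illustrative sketch of the hop-ratio recursion used in the hop loop:
# each hop's ratio is the previous ratio times the fraction of the hop's
# importance mass that landed inside the generation span.

def hop_ratios(span_sums, total_sums):
    """span_sums[k] / total_sums[k] is hop k's in-span / overall mass."""
    ratios = []
    current = 1.0
    for span_sum, total in zip(span_sums, total_sums):
        if total <= 0.0 or span_sum <= 0.0:
            # Degenerate hop: ratio collapses to zero and stays there.
            ratios.append(0.0)
            current = 0.0
            continue
        current *= span_sum / total
        ratios.append(current)
    return ratios

print(hop_ratios([0.5, 0.25], [1.0, 0.5]))  # [0.5, 0.25]
```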
diff --git a/flashtrace/lrp_patches.py b/flashtrace/lrp_patches.py
new file mode 100644
index 0000000000000000000000000000000000000000..8867592aafb5c7d0ac81dc018ee6fee9e6af4a27
--- /dev/null
+++ b/flashtrace/lrp_patches.py
@@ -0,0 +1,319 @@
+"""LRP forward patches for AttnLRP implementation.
+
+This module provides patched forward functions for transformer layers
+that implement AttnLRP rules during the backward pass:
+
+- RMSNorm: Identity rule via stopping gradient on variance
+- Gated MLP: Identity rule on activation + Uniform rule on gate*up
+- Attention: Uniform rule on Q@K^T and attention@V
+
+Reference:
+ AttnLRP: Attention-Aware Layer-wise Relevance Propagation for Transformers (ICML 2024)
+ https://arxiv.org/abs/2402.05602
+"""
+
+import torch
+from contextlib import contextmanager
+from typing import Dict, Any, Optional, Callable
+from functools import partial
+
+from .lrp_rules import stop_gradient, divide_gradient, identity_rule_implicit
+
+
+# Model configuration for different architectures
+MODEL_CONFIGS = {
+ "qwen3": {
+ "modeling_module": "transformers.models.qwen3.modeling_qwen3",
+ "rms_norm_class": "Qwen3RMSNorm",
+ "mlp_class": "Qwen3MLP",
+ },
+ "qwen2": {
+ "modeling_module": "transformers.models.qwen2.modeling_qwen2",
+ "rms_norm_class": "Qwen2RMSNorm",
+ "mlp_class": "Qwen2MLP",
+ },
+ "llama": {
+ "modeling_module": "transformers.models.llama.modeling_llama",
+ "rms_norm_class": "LlamaRMSNorm",
+ "mlp_class": "LlamaMLP",
+ },
+}
+
+
+def rms_norm_forward_lrp(self, hidden_states: torch.Tensor) -> torch.Tensor:
+ """RMSNorm forward with LRP identity rule.
+
+ On normalization operations, we apply the identity rule by stopping
+ the gradient flow through the variance calculation.
+
+ This is equivalent to Equation 9 in the AttnLRP paper.
+ """
+ input_dtype = hidden_states.dtype
+ hidden_states = hidden_states.to(torch.float32)
+ variance = hidden_states.pow(2).mean(-1, keepdim=True)
+ # Stop gradient through the normalization factor
+ hidden_states = hidden_states * stop_gradient(torch.rsqrt(variance + self.variance_epsilon))
+ return self.weight * hidden_states.to(input_dtype)
+
+
+def gated_mlp_forward_lrp(self, x: torch.Tensor) -> torch.Tensor:
+ """Gated MLP forward with LRP rules.
+
+ On the element-wise non-linear activation, we apply the identity rule.
+ On the element-wise multiplication (gate * up), we apply the uniform rule.
+
+ Both rules are implemented via the Gradient*Input framework.
+ """
+ gate_out = self.gate_proj(x)
+ # Apply identity rule to the non-linear activation
+ gate_out = identity_rule_implicit(self.act_fn, gate_out)
+
+ # Element-wise multiplication of gate and up projections
+ weighted = gate_out * self.up_proj(x)
+ # Apply uniform rule (divide gradient by 2)
+ weighted = divide_gradient(weighted, 2)
+
+ return self.down_proj(weighted)
+
+
+def wrap_attention_forward(forward_fn: Callable) -> Callable:
+ """Wrap an attention forward function with LRP gradient rules.
+
+    Applies the uniform rule to Q, K, V tensors:
+    - Q and K: divide by 4 (accounts for the Q@K^T matmul plus the softmax/attention stage)
+    - V: divide by 2 (accounts for the single attention@V matmul)
+
+ Parameters
+ ----------
+ forward_fn : Callable
+ The original attention forward function
+
+ Returns
+ -------
+ Callable
+ Wrapped attention forward function with LRP rules
+ """
+ def attention_forward_lrp(module, query, key, value, *args, **kwargs):
+        # Apply uniform rule to Q, K, V.
+        # Factor of 4 for Q/K accounts for two downstream steps (the Q@K^T
+        # matmul, then the softmax/attention stage).
+        # Factor of 2 for V accounts for a single matmul with the attention weights.
+ query = divide_gradient(query, 4)
+ key = divide_gradient(key, 4)
+ value = divide_gradient(value, 2)
+
+ # Disable dropout during LRP computation
+ if 'dropout' in kwargs:
+ kwargs['dropout'] = 0.0
+
+ return forward_fn(module, query, key, value, *args, **kwargs)
+
+ return attention_forward_lrp
+
+
+def dropout_forward_lrp(self, x: torch.Tensor) -> torch.Tensor:
+ """Dropout forward that acts as identity.
+
+ During LRP computation, we need gradient checkpointing which requires
+ train mode, but we don't want actual dropout. This patches dropout
+ to be identity.
+ """
+ return x
+
+
+class LRPPatchState:
+ """Stores original module states for restoration after LRP computation."""
+
+ def __init__(self):
+ self.original_forwards: Dict[str, Any] = {}
+ self.original_attention_functions: Dict[str, Dict[str, Callable]] = {}
+ self.original_eager_attention: Dict[str, Callable] = {}
+ self.patched = False
+
+
+def _get_modeling_module(model_type: str):
+ """Dynamically import the modeling module for a given model type."""
+ import importlib
+ config = MODEL_CONFIGS.get(model_type)
+ if config is None:
+ raise ValueError(f"Unsupported model type: {model_type}. Supported: {list(MODEL_CONFIGS.keys())}")
+ return importlib.import_module(config["modeling_module"])
+
+
+def apply_lrp_patches(model, model_type: str = "qwen3") -> LRPPatchState:
+ """Apply LRP patches to a model.
+
+ This function patches the forward methods of various layers to implement
+ AttnLRP rules during backward propagation.
+
+ Parameters
+ ----------
+ model : transformers model
+ The model to patch
+ model_type : str
+ Type of model architecture (qwen3, qwen2, llama)
+
+ Returns
+ -------
+ LRPPatchState
+ State object containing original methods for restoration
+ """
+ state = LRPPatchState()
+ config = MODEL_CONFIGS.get(model_type)
+
+ if config is None:
+ raise ValueError(f"Unsupported model type: {model_type}")
+
+ modeling_module = _get_modeling_module(model_type)
+
+ # Patch RMSNorm layers
+ rms_norm_cls = getattr(modeling_module, config["rms_norm_class"])
+ state.original_forwards[config["rms_norm_class"]] = rms_norm_cls.forward
+ rms_norm_cls.forward = rms_norm_forward_lrp
+
+ # Patch MLP layers
+ mlp_cls = getattr(modeling_module, config["mlp_class"])
+ state.original_forwards[config["mlp_class"]] = mlp_cls.forward
+ mlp_cls.forward = gated_mlp_forward_lrp
+
+ # Patch Dropout layers
+ from torch.nn import Dropout
+ state.original_forwards["Dropout"] = Dropout.forward
+ Dropout.forward = dropout_forward_lrp
+
+ # Patch attention functions
+ if hasattr(modeling_module, 'ALL_ATTENTION_FUNCTIONS'):
+ state.original_attention_functions[model_type] = dict(modeling_module.ALL_ATTENTION_FUNCTIONS)
+ new_attention_functions = {}
+ for key, fn in modeling_module.ALL_ATTENTION_FUNCTIONS.items():
+ new_attention_functions[key] = wrap_attention_forward(fn)
+ modeling_module.ALL_ATTENTION_FUNCTIONS = new_attention_functions
+
+ if hasattr(modeling_module, 'eager_attention_forward'):
+ state.original_eager_attention[model_type] = modeling_module.eager_attention_forward
+ modeling_module.eager_attention_forward = wrap_attention_forward(
+ modeling_module.eager_attention_forward
+ )
+
+ state.patched = True
+ return state
+
+
+def remove_lrp_patches(state: LRPPatchState, model_type: str = "qwen3"):
+ """Remove LRP patches and restore original forward methods.
+
+ Parameters
+ ----------
+ state : LRPPatchState
+ State object from apply_lrp_patches
+ model_type : str
+ Type of model architecture
+ """
+ if not state.patched:
+ return
+
+ config = MODEL_CONFIGS.get(model_type)
+ if config is None:
+ return
+
+ modeling_module = _get_modeling_module(model_type)
+
+ # Restore RMSNorm
+ if config["rms_norm_class"] in state.original_forwards:
+ rms_norm_cls = getattr(modeling_module, config["rms_norm_class"])
+ rms_norm_cls.forward = state.original_forwards[config["rms_norm_class"]]
+
+ # Restore MLP
+ if config["mlp_class"] in state.original_forwards:
+ mlp_cls = getattr(modeling_module, config["mlp_class"])
+ mlp_cls.forward = state.original_forwards[config["mlp_class"]]
+
+ # Restore Dropout
+ if "Dropout" in state.original_forwards:
+ from torch.nn import Dropout
+ Dropout.forward = state.original_forwards["Dropout"]
+
+ # Restore attention functions
+ if model_type in state.original_attention_functions:
+ modeling_module.ALL_ATTENTION_FUNCTIONS = state.original_attention_functions[model_type]
+
+ if model_type in state.original_eager_attention:
+ modeling_module.eager_attention_forward = state.original_eager_attention[model_type]
+
+ state.patched = False
+
+
+@contextmanager
+def lrp_context(model, model_type: str = "qwen3"):
+ """Context manager for applying LRP patches temporarily.
+
+ This is the recommended way to use LRP patches as it ensures
+ proper cleanup even if an exception occurs.
+
+ Example
+ -------
+ >>> with lrp_context(model, model_type="qwen3"):
+ ... # Compute forward pass and backward for LRP
+ ... output = model(inputs_embeds=embeds)
+ ... output.logits[0, -1, :].max().backward()
+ ... relevance = (embeds * embeds.grad).sum(-1)
+
+ Parameters
+ ----------
+ model : transformers model
+ The model to patch
+ model_type : str
+ Type of model architecture (qwen3, qwen2, llama)
+
+ Yields
+ ------
+ LRPPatchState
+ The patch state (usually not needed by caller)
+ """
+ state = apply_lrp_patches(model, model_type)
+ try:
+ yield state
+ finally:
+ remove_lrp_patches(state, model_type)
+
+
+def detect_model_type(model) -> str:
+ """Attempt to detect the model type from a model instance.
+
+ Parameters
+ ----------
+ model : transformers model
+ The model to detect type for
+
+ Returns
+ -------
+ str
+ Detected model type (qwen3, qwen2, llama)
+
+ Raises
+ ------
+ ValueError
+ If model type cannot be detected
+ """
+ model_class_name = model.__class__.__name__.lower()
+
+ if 'qwen3' in model_class_name:
+ return 'qwen3'
+ elif 'qwen2' in model_class_name:
+ return 'qwen2'
+ elif 'llama' in model_class_name:
+ return 'llama'
+
+ # Check config if available
+ if hasattr(model, 'config'):
+ config_name = getattr(model.config, 'model_type', '').lower()
+ if 'qwen3' in config_name:
+ return 'qwen3'
+ elif 'qwen2' in config_name:
+ return 'qwen2'
+ elif 'llama' in config_name:
+ return 'llama'
+
+ raise ValueError(
+ f"Could not detect model type from {model.__class__.__name__}. "
+ f"Please specify model_type explicitly. Supported: {list(MODEL_CONFIGS.keys())}"
+ )
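`detect_model_type` relies only on case-insensitive substring matches (class name first, then `config.model_type`), so the matching order can be sanity-checked without loading any weights. The helper below is a hypothetical standalone mirror of that check, not part of the module:

```python
# Standalone mirror of detect_model_type's substring matching order.
# "qwen3" is checked before "qwen2" so the more specific name wins.

SUPPORTED = ("qwen3", "qwen2", "llama")

def detect(name):
    lowered = name.lower()
    for candidate in SUPPORTED:
        if candidate in lowered:
            return candidate
    return None  # detect_model_type raises ValueError in this case

print(detect("Qwen3ForCausalLM"))    # qwen3
print(detect("LlamaForCausalLM"))    # llama
print(detect("MistralForCausalLM"))  # None
```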
diff --git a/flashtrace/lrp_rules.py b/flashtrace/lrp_rules.py
new file mode 100644
index 0000000000000000000000000000000000000000..ba54f622f693d5c97138dcb85aa743636fd67c37
--- /dev/null
+++ b/flashtrace/lrp_rules.py
@@ -0,0 +1,129 @@
+"""LRP (Layer-wise Relevance Propagation) autograd rules for AttnLRP.
+
+This module implements the core autograd functions needed for AttnLRP:
+- stop_gradient: Stop gradient flow completely
+- divide_gradient: Divide gradient by a factor (Uniform Rule from Eq. 7)
+- identity_rule_implicit: Handle non-linear activations (Identity Rule from Eq. 9)
+
+Reference:
+ AttnLRP: Attention-Aware Layer-wise Relevance Propagation for Transformers (ICML 2024)
+ https://arxiv.org/abs/2402.05602
+"""
+
+import torch
+from torch.autograd import Function
+
+
+def stop_gradient(input: torch.Tensor) -> torch.Tensor:
+ """Stop the gradient from flowing through the input tensor.
+
+ This is used in RMSNorm/LayerNorm to stop gradient flow through
+ the variance calculation, implementing the identity rule.
+
+ Parameters
+ ----------
+ input : torch.Tensor
+ The input tensor
+
+ Returns
+ -------
+ torch.Tensor
+ The detached tensor (same values, no gradient)
+ """
+ return input.detach()
+
+
+def divide_gradient(input: torch.Tensor, factor: int = 2) -> torch.Tensor:
+ """Divide the gradient by a factor during backpropagation.
+
+ Implements the Uniform Rule (Equation 7 from the AttnLRP paper).
+ Used after matmul or element-wise multiplication operations.
+
+ Parameters
+ ----------
+ input : torch.Tensor
+ The input tensor
+ factor : int
+ The factor to divide the gradient by. Default is 2.
+
+ Returns
+ -------
+ torch.Tensor
+ The same tensor with modified backward gradient
+ """
+ return DivideGradientFn.apply(input, factor)
+
+
+def identity_rule_implicit(fn, input: torch.Tensor, epsilon: float = 1e-10) -> torch.Tensor:
+ """Apply the identity rule implicitly through the Gradient*Input framework.
+
+ Implements the Identity Rule (Equation 9 from the AttnLRP paper).
+ Used for element-wise non-linear activation functions.
+
+    The backward pass computes: gradient = (output / input) * out_gradient,
+    so that input * gradient = output * out_gradient. Under the Gradient*Input
+    convention the input relevance therefore equals the output relevance,
+    which is exactly the identity rule.
+
+ Parameters
+ ----------
+ fn : callable
+ The non-linear function to apply (e.g., SiLU, GELU)
+ input : torch.Tensor
+ The input tensor
+ epsilon : float
+ Small constant for numerical stability in division
+
+ Returns
+ -------
+ torch.Tensor
+ The output of fn(input) with modified backward pass
+ """
+ return IdentityRuleImplicitFn.apply(fn, input, epsilon)
+
+
+class DivideGradientFn(Function):
+ """Autograd Function that divides gradient by a constant factor.
+
+ Forward pass: identity (returns input unchanged)
+ Backward pass: divides gradient by factor
+
+ This implements the Uniform Rule for element-wise multiplication
+ and part of the handling for matrix multiplication.
+ """
+
+ @staticmethod
+ def forward(ctx, input: torch.Tensor, factor: int = 2) -> torch.Tensor:
+ ctx.factor = factor
+ return input
+
+ @staticmethod
+ def backward(ctx, grad_output: torch.Tensor):
+ return grad_output / ctx.factor, None
+
+
+class IdentityRuleImplicitFn(Function):
+ """Autograd Function implementing the identity rule for non-linear activations.
+
+ Forward pass: computes fn(input) and saves output/input ratio
+ Backward pass: multiplies gradient by saved ratio
+
+ This is more efficient than explicit LRP computation because it
+ leverages the Gradient*Input framework.
+ """
+
+ @staticmethod
+ def forward(ctx, fn, input: torch.Tensor, epsilon: float = 1e-10) -> torch.Tensor:
+ output = fn(input)
+ if input.requires_grad:
+ # Save output/input for backward
+ # Adding epsilon prevents division by zero
+ ctx.save_for_backward(output / (input + epsilon))
+ return output
+
+ @staticmethod
+ def backward(ctx, grad_output: torch.Tensor):
+ # Gradient is scaled by output/input ratio
+ # When multiplied by input later, this gives the identity rule
+ ratio = ctx.saved_tensors[0]
+ gradient = ratio * grad_output
+ return None, gradient, None
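The uniform rule is cheap to verify numerically. The sketch below re-implements `DivideGradientFn` standalone (so it runs without this package installed) and checks that the forward pass is the identity while the backward gradient is scaled by `1/factor`:

```python
import torch
from torch.autograd import Function

class DivideGradientDemo(Function):
    """Standalone copy of the uniform-rule function: identity forward,
    gradient divided by a constant factor on the way back."""

    @staticmethod
    def forward(ctx, input, factor=2):
        ctx.factor = factor
        return input

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output / ctx.factor, None

x = torch.tensor([1.0, 2.0], requires_grad=True)
y = DivideGradientDemo.apply(x, 4)
assert torch.equal(y, x)  # forward leaves values unchanged
y.sum().backward()
assert torch.allclose(x.grad, torch.tensor([0.25, 0.25]))  # grad / 4
```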
diff --git a/flashtrace/model_io.py b/flashtrace/model_io.py
new file mode 100644
index 0000000000000000000000000000000000000000..3cf40509962ad6aef09ee059adf11a2c62b7497d
--- /dev/null
+++ b/flashtrace/model_io.py
@@ -0,0 +1,49 @@
+from __future__ import annotations
+
+from typing import Any
+
+import torch
+from transformers import AutoModelForCausalLM, AutoTokenizer
+
+
+def _resolve_dtype(dtype: str | torch.dtype = "auto") -> str | torch.dtype:
+ if isinstance(dtype, torch.dtype):
+ return dtype
+ value = str(dtype).lower()
+ if value == "auto":
+ return "auto"
+ mapping = {
+ "float16": torch.float16,
+ "fp16": torch.float16,
+ "bfloat16": torch.bfloat16,
+ "bf16": torch.bfloat16,
+ "float32": torch.float32,
+ "fp32": torch.float32,
+ }
+ if value not in mapping:
+ raise ValueError(f"Unsupported dtype: {dtype}")
+ return mapping[value]
+
+
+def load_model_and_tokenizer(
+ model_name_or_path: str,
+ *,
+ device_map: str | dict[str, Any] | None = "auto",
+ dtype: str | torch.dtype = "auto",
+ trust_remote_code: bool = True,
+ **model_kwargs: Any,
+):
+ """Load a Hugging Face causal LM and matching tokenizer."""
+
+ tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, trust_remote_code=trust_remote_code)
+ model = AutoModelForCausalLM.from_pretrained(
+ model_name_or_path,
+ torch_dtype=_resolve_dtype(dtype),
+ device_map=device_map,
+ trust_remote_code=trust_remote_code,
+ **model_kwargs,
+ )
+ model.eval()
+ if tokenizer.pad_token_id is None and tokenizer.eos_token_id is not None:
+ tokenizer.pad_token = tokenizer.eos_token
+ return model, tokenizer
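The dtype aliases accepted by `load_model_and_tokenizer` can be illustrated without touching the Hub. The snippet below mirrors `_resolve_dtype`'s alias table as a standalone sketch (it does not import the module):

```python
import torch

# Mirror of _resolve_dtype's alias table, for illustration only.
ALIASES = {
    "float16": torch.float16, "fp16": torch.float16,
    "bfloat16": torch.bfloat16, "bf16": torch.bfloat16,
    "float32": torch.float32, "fp32": torch.float32,
}

def resolve(dtype="auto"):
    if isinstance(dtype, torch.dtype):
        return dtype            # already a torch dtype: pass through
    value = str(dtype).lower()  # aliases are case-insensitive
    if value == "auto":
        return "auto"           # let transformers pick the checkpoint dtype
    if value not in ALIASES:
        raise ValueError(f"Unsupported dtype: {dtype}")
    return ALIASES[value]

print(resolve("BF16"))  # torch.bfloat16
```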
diff --git a/flashtrace/result.py b/flashtrace/result.py
new file mode 100644
index 0000000000000000000000000000000000000000..2857b7888cb48366c74b2cb1aa1b67ed721d4676
--- /dev/null
+++ b/flashtrace/result.py
@@ -0,0 +1,78 @@
+from __future__ import annotations
+
+import json
+from dataclasses import asdict, dataclass, field, is_dataclass
+from pathlib import Path
+from typing import Any
+
+
+@dataclass(frozen=True)
+class TokenScore:
+ index: int
+ token: str
+ score: float
+
+
+@dataclass(frozen=True)
+class TraceResult:
+ """Public attribution result returned by FlashTrace."""
+
+ prompt_tokens: list[str]
+ generation_tokens: list[str]
+ scores: list[float]
+ per_hop_scores: list[list[float]] = field(default_factory=list)
+ thinking_ratios: list[float] = field(default_factory=list)
+ output_span: tuple[int, int] | None = None
+ reasoning_span: tuple[int, int] | None = None
+ method: str = "flashtrace"
+ metadata: dict[str, Any] = field(default_factory=dict)
+
+ def topk_inputs(self, k: int = 20) -> list[TokenScore]:
+ limit = max(0, int(k))
+ items = [
+ TokenScore(index=i, token=tok, score=float(score))
+ for i, (tok, score) in enumerate(zip(self.prompt_tokens, self.scores))
+ ]
+ items.sort(key=lambda item: item.score, reverse=True)
+ return items[:limit]
+
+ def to_dict(self) -> dict[str, Any]:
+ return {
+ "method": self.method,
+ "prompt_tokens": list(self.prompt_tokens),
+ "generation_tokens": list(self.generation_tokens),
+ "scores": [float(x) for x in self.scores],
+ "per_hop_scores": [[float(x) for x in row] for row in self.per_hop_scores],
+ "thinking_ratios": [float(x) for x in self.thinking_ratios],
+ "output_span": list(self.output_span) if self.output_span is not None else None,
+ "reasoning_span": list(self.reasoning_span) if self.reasoning_span is not None else None,
+ "top_inputs": [asdict(item) for item in self.topk_inputs()],
+ "metadata": _jsonable(self.metadata),
+ }
+
+ def to_json(self, path: str | Path) -> None:
+ target = Path(path)
+ target.write_text(json.dumps(self.to_dict(), indent=2, ensure_ascii=False), encoding="utf-8")
+
+ def to_html(self, path: str | Path) -> None:
+ from .viz import render_trace_html
+
+ target = Path(path)
+ target.write_text(render_trace_html(self), encoding="utf-8")
+
+
+def _jsonable(value: Any) -> Any:
+ if value is None or isinstance(value, (str, int, float, bool)):
+ return value
+ if hasattr(value, "detach") and hasattr(value, "cpu"):
+ try:
+ return value.detach().cpu().tolist()
+ except Exception:
+ return repr(value)
+    if is_dataclass(value) and not isinstance(value, type):
+        # instances only: asdict() raises TypeError on dataclass *classes*
+        return _jsonable(asdict(value))
+ if isinstance(value, dict):
+ return {str(k): _jsonable(v) for k, v in value.items()}
+ if isinstance(value, (list, tuple)):
+ return [_jsonable(v) for v in value]
+ return repr(value)
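`TraceResult.to_dict` relies on `_jsonable` to make arbitrary metadata JSON-safe: dataclass instances become dicts, containers recurse, and anything unknown falls back to `repr`. A standalone sketch of those conversion rules (omitting the tensor branch):

```python
import json
from dataclasses import asdict, dataclass, is_dataclass

def jsonable(value):
    """Minimal mirror of _jsonable for plain Python values."""
    if value is None or isinstance(value, (str, int, float, bool)):
        return value
    if is_dataclass(value) and not isinstance(value, type):
        return jsonable(asdict(value))       # dataclass instance -> dict
    if isinstance(value, dict):
        return {str(k): jsonable(v) for k, v in value.items()}
    if isinstance(value, (list, tuple)):
        return [jsonable(v) for v in value]  # tuples collapse to lists
    return repr(value)                       # last-resort fallback

@dataclass
class Span:
    start: int
    end: int

meta = {"span": Span(3, 7), "tags": ("a", "b")}
print(json.dumps(jsonable(meta)))  # {"span": {"start": 3, "end": 7}, "tags": ["a", "b"]}
```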
diff --git a/flashtrace/shared_utils.py b/flashtrace/shared_utils.py
new file mode 100644
index 0000000000000000000000000000000000000000..590342c4bbdc3be3c2423772ec06bd7cf0990926
--- /dev/null
+++ b/flashtrace/shared_utils.py
@@ -0,0 +1,147 @@
+"""Shared utilities for the flashtrace project.
+
+This module contains common constants, NLP pipeline initialization,
+and sentence processing functions used across multiple modules.
+"""
+
+import re
+import spacy
+from spacy.language import Language
+from spacy.tokens import Doc
+import torch
+
+# Common constants
+DEFAULT_GENERATE_KWARGS = {"max_new_tokens": 512, "do_sample": False}
+DEFAULT_PROMPT_TEMPLATE = "Context:{context}\n\n\nQuery: {query}"
+
+# Sentence detector - NLP pipeline initialization
+try:
+ nlp = spacy.load("en_core_web_sm")
+ _newline_pipe_position = {"before": "parser"}
+except OSError:
+ nlp = spacy.blank("en")
+ if "sentencizer" not in nlp.pipe_names:
+ nlp.add_pipe("sentencizer")
+    _newline_pipe_position = {"after": "sentencizer"}  # sentencizer was just ensured above
+
+
+@Language.component("newline_cap_split")
+def newline_cap_split(doc: Doc) -> Doc:
+ """Custom component to split on capitalized words after newline."""
+ for i, token in enumerate(doc):
+ if token.is_title and i > 0:
+ prev_token = doc[i - 1]
+ if "\n" in prev_token.text or (prev_token.is_space and "\n" in prev_token.text):
+ token.is_sent_start = True
+ return doc
+
+
+# Add to pipeline
+nlp.add_pipe("newline_cap_split", **_newline_pipe_position)
+
+
+def create_sentences(text, tokenizer, return_indices=False, show=False) -> "list[str] | tuple[list[str], list[int]]":
+ """Split text into sentences and return the sentences.
+
+ Args:
+ text: The text to split into sentences.
+ tokenizer: The tokenizer to use for EOS token handling.
+ return_indices: If True, return both sentences and their indices.
+ show: Unused, kept for backward compatibility.
+
+ Returns:
+ A list of sentences, or a tuple of (sentences, indices) if return_indices is True.
+ """
+    separators = []
+    indices = []
+
+ # Process the text with spacy
+ doc = nlp(text)
+
+ # Extract sentences
+ sentences = []
+ for sent in doc.sents:
+ sentences.append(sent.text)
+
+ # extract separators
+ cur_start = 0
+ for sentence in sentences:
+ indices.append(cur_start)
+ cur_end = text.find(sentence, cur_start)
+ separator = text[cur_start:cur_end]
+ separators.append(separator)
+ cur_start = cur_end + len(sentence)
+
+ # combine the separators with the sentences properly
+ for i in range(len(sentences)):
+ if separators[i] == "\n":
+ sentences[i] = sentences[i] + separators[i]
+ else:
+ sentences[i] = separators[i] + sentences[i]
+
+ # if the text had an eos token (generated text) it will be missed
+ # and attached on the last sentence, so we manually handle it
+ if tokenizer is not None:
+ eos = tokenizer.eos_token
+ if eos and sentences and eos in sentences[-1]:
+ sentences[-1] = sentences[-1].replace(eos, "")
+ indices.append(len("".join(sentences)))
+ sentences.append(eos)
+
+ indices.append(len(text))
+
+ if return_indices:
+ return sentences, indices
+ else:
+ return sentences
+
+
+def create_sentences_fallback(text, tokenizer=None) -> list[str]:
+ """Very naive fallback sentence splitter for when spacy is unavailable.
+
+ Split by newline first, then by simple punctuation boundaries.
+ """
+ parts = []
+ for block in text.split("\n"):
+ xs = re.split(r"(?<=[.!?])\s+", block.strip()) if block.strip() else []
+ parts.extend([x for x in xs if x])
+ return parts or ([text] if text else [])
+
+
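The regex in `create_sentences_fallback` splits on whitespace that follows sentence-final punctuation, after first splitting on newlines. A standalone copy of the same logic shows the behaviour:

```python
import re

def split_fallback(text):
    """Standalone mirror of create_sentences_fallback."""
    parts = []
    for block in text.split("\n"):
        # Lookbehind keeps the punctuation attached to its sentence.
        xs = re.split(r"(?<=[.!?])\s+", block.strip()) if block.strip() else []
        parts.extend([x for x in xs if x])
    return parts or ([text] if text else [])

out = split_fallback("First one. Second one!\nThird line")
print(out)  # ['First one.', 'Second one!', 'Third line']
```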
+def create_sentence_masks(tokens, sentences, show=False) -> torch.Tensor:
+ """Create a binary mask of shape [sentences, tokens].
+
+ Each row has a 1 where a token is in the represented sentence.
+
+ Args:
+ tokens: List of tokens.
+ sentences: List of sentences.
+ show: Unused, kept for backward compatibility.
+
+ Returns:
+ A tensor mask of shape [len(sentences), len(tokens)].
+ """
+ mask = torch.zeros((len(sentences), len(tokens)))
+
+ sentence_idx = 0
+ sent_pointer = 0
+
+ for token_idx, token in enumerate(tokens):
+ current_sentence = sentences[sentence_idx]
+
+ mask[sentence_idx, token_idx] = 1
+
+ if '\n' in token:
+ sent_pointer += len(token) + 1
+ else:
+ sent_pointer += len(token)
+
+ if sent_pointer >= len(current_sentence):
+ sentence_idx += 1
+ sent_pointer = 0
+
+ if sentence_idx >= len(sentences):
+ break
+
+ return mask
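The pointer-walk in `create_sentence_masks` can be sketched on a toy example. The snippet below is a standalone mirror of the function above (the helper name `sentence_masks` and the sample tokens are illustrative, not part of the repo):

```python
import torch

def sentence_masks(tokens, sentences):
    # Mirror of create_sentence_masks: walk tokens, advancing a character
    # pointer through the current sentence; emit a [S, T] binary mask.
    mask = torch.zeros((len(sentences), len(tokens)))
    sentence_idx, sent_pointer = 0, 0
    for token_idx, token in enumerate(tokens):
        mask[sentence_idx, token_idx] = 1
        sent_pointer += len(token) + 1 if "\n" in token else len(token)
        if sent_pointer >= len(sentences[sentence_idx]):
            sentence_idx += 1
            sent_pointer = 0
            if sentence_idx >= len(sentences):
                break
    return mask

tokens = ["Hello", " world", ".", " Bye", "."]
sents = ["Hello world.", " Bye."]
m = sentence_masks(tokens, sents)
print(m.tolist())
```

Each row covers exactly the tokens whose characters fall inside that sentence, which is what the attribution code relies on when pooling token scores per sentence.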
diff --git a/flashtrace/tracer.py b/flashtrace/tracer.py
new file mode 100644
index 0000000000000000000000000000000000000000..c1c5f663363d226f51eff827e33faaa223c59ebe
--- /dev/null
+++ b/flashtrace/tracer.py
@@ -0,0 +1,155 @@
+from __future__ import annotations
+
+from typing import Any, Literal
+
+import torch
+
+from .attribution import LLMAttributionResult, LLMIFRAttribution
+from .improved import LLMIFRAttributionBoth
+from .result import TraceResult
+
+TraceMethod = Literal["flashtrace", "ifr-span", "ifr-matrix"]
+
+
+def _to_float_list(values: Any) -> list[float]:
+ if torch.is_tensor(values):
+ values = values.detach().cpu().to(dtype=torch.float32).tolist()
+ return [float(x) for x in (values or [])]
+
+
+class FlashTrace:
+ """Public facade for FlashTrace attribution."""
+
+ def __init__(
+ self,
+ model,
+ tokenizer,
+ *,
+ chunk_tokens: int = 128,
+ sink_chunk_tokens: int = 32,
+ recompute_attention: bool = False,
+ generate_kwargs: dict[str, Any] | None = None,
+ use_chat_template: bool = False,
+ ) -> None:
+ self.model = model
+ self.tokenizer = tokenizer
+ self.chunk_tokens = int(chunk_tokens)
+ self.sink_chunk_tokens = int(sink_chunk_tokens)
+ self.recompute_attention = bool(recompute_attention)
+ self.generate_kwargs = generate_kwargs
+ self.use_chat_template = bool(use_chat_template)
+
+ def trace(
+ self,
+ *,
+ prompt: str,
+ target: str | None = None,
+ output_span: tuple[int, int] | None = None,
+ reasoning_span: tuple[int, int] | None = None,
+ hops: int = 1,
+ method: TraceMethod = "flashtrace",
+ renorm_threshold: float | None = None,
+ ) -> TraceResult:
+ if method == "flashtrace":
+ engine = LLMIFRAttributionBoth(
+ self.model,
+ self.tokenizer,
+ generate_kwargs=self.generate_kwargs,
+ chunk_tokens=self.chunk_tokens,
+ sink_chunk_tokens=self.sink_chunk_tokens,
+ recompute_attention=self.recompute_attention,
+ use_chat_template=self.use_chat_template,
+ )
+ raw = engine.calculate_ifr_multi_hop_both(
+ prompt,
+ target=target,
+ sink_span=output_span,
+ thinking_span=reasoning_span,
+ n_hops=int(hops),
+ renorm_threshold=renorm_threshold,
+ )
+ elif method == "ifr-span":
+ engine = LLMIFRAttribution(
+ self.model,
+ self.tokenizer,
+ generate_kwargs=self.generate_kwargs,
+ chunk_tokens=self.chunk_tokens,
+ sink_chunk_tokens=self.sink_chunk_tokens,
+ recompute_attention=self.recompute_attention,
+ use_chat_template=self.use_chat_template,
+ )
+ raw = engine.calculate_ifr_span(
+ prompt,
+ target=target,
+ span=output_span,
+ renorm_threshold=renorm_threshold,
+ )
+ elif method == "ifr-matrix":
+ engine = LLMIFRAttribution(
+ self.model,
+ self.tokenizer,
+ generate_kwargs=self.generate_kwargs,
+ chunk_tokens=self.chunk_tokens,
+ sink_chunk_tokens=self.sink_chunk_tokens,
+ recompute_attention=self.recompute_attention,
+ use_chat_template=self.use_chat_template,
+ )
+ raw = engine.calculate_ifr_for_all_positions_output_only(
+ prompt,
+ target=target,
+ sink_span=output_span,
+ renorm_threshold=renorm_threshold,
+ )
+ else:
+ raise ValueError(f"Unsupported method: {method}")
+
+ return self._build_result(raw, method=method, output_span=output_span, reasoning_span=reasoning_span)
+
+ def _build_result(
+ self,
+ raw: LLMAttributionResult,
+ *,
+ method: str,
+ output_span: tuple[int, int] | None,
+ reasoning_span: tuple[int, int] | None,
+ ) -> TraceResult:
+ prompt_tokens = list(raw.prompt_tokens)
+ generation_tokens = list(raw.generation_tokens)
+ prompt_len = len(prompt_tokens)
+ metadata = dict(raw.metadata or {})
+ if "method" not in metadata:
+ metadata["method"] = method
+
+        ifr_meta = metadata.get("ifr") if isinstance(metadata.get("ifr"), dict) else {}
+        observation = ifr_meta.get("observation_projected")
+        per_hop_projected = ifr_meta.get("per_hop_projected")
+
+ if isinstance(observation, dict) and "sum" in observation:
+ vector = _to_float_list(observation["sum"])
+ scores = vector[:prompt_len]
+ else:
+ matrix = torch.nan_to_num(raw.attribution_matrix.detach().cpu().to(dtype=torch.float32), nan=0.0)
+ if output_span is not None:
+ start, end = output_span
+ selected = matrix[int(start) : int(end) + 1, :prompt_len]
+ else:
+ selected = matrix[:, :prompt_len]
+ scores = selected.mean(dim=0).tolist() if selected.numel() else [0.0 for _ in prompt_tokens]
+
+ per_hop_scores: list[list[float]] = []
+ if per_hop_projected:
+ for hop_vector in per_hop_projected:
+ per_hop_scores.append(_to_float_list(hop_vector)[:prompt_len])
+
+        ratios = ifr_meta.get("thinking_ratios", []) if isinstance(ifr_meta, dict) else []
+
+ return TraceResult(
+ prompt_tokens=prompt_tokens,
+ generation_tokens=generation_tokens,
+ scores=[float(x) for x in scores],
+ per_hop_scores=per_hop_scores,
+ thinking_ratios=_to_float_list(ratios),
+ output_span=output_span,
+ reasoning_span=reasoning_span,
+ method=method,
+ metadata=metadata,
+ )
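The span-mean fallback in `FlashTrace._build_result` (used when no projected observation vector is present in the metadata) can be illustrated in isolation; the matrix values below are made up for the example:

```python
import torch

# Rows are generation tokens, columns are all tokens; prompt scores are the
# mean of the rows inside the (inclusive) output span, restricted to the
# prompt-side columns.
matrix = torch.tensor([
    [0.1, 0.2, 0.3],  # generation token 0
    [0.3, 0.4, 0.5],  # generation token 1
    [0.5, 0.6, 0.7],  # generation token 2
])
prompt_len = 2
start, end = 1, 2  # inclusive output span over generation rows
selected = matrix[start:end + 1, :prompt_len]
scores = selected.mean(dim=0).tolist()
print(scores)  # approximately [0.4, 0.5]
```

Note the `end + 1`: the output span is inclusive on both ends, matching the slicing in `trace`/`_build_result` above.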
diff --git a/flashtrace/viz.py b/flashtrace/viz.py
new file mode 100644
index 0000000000000000000000000000000000000000..f0b56c8b1b61890bbfda97d80865701a7ccc9f10
--- /dev/null
+++ b/flashtrace/viz.py
@@ -0,0 +1,81 @@
+from __future__ import annotations
+
+from html import escape
+from typing import TYPE_CHECKING
+
+if TYPE_CHECKING:
+ from .result import TraceResult
+
+
+def _score_color(score: float, max_score: float) -> str:
+ if max_score <= 0.0:
+ return "rgba(245,245,245,0.75)"
+ ratio = min(1.0, abs(float(score)) / (max_score + 1e-12))
+ red = 255
+ green = int(246 - 105 * ratio)
+ blue = int(226 - 170 * ratio)
+ alpha = 0.22 + 0.58 * ratio
+ return f"rgba({red},{green},{blue},{alpha:.3f})"
+
+
+def _render_token_row(tokens: list[str], scores: list[float]) -> str:
+    max_score = max((abs(float(x)) for x in scores), default=0.0)
+    spans = []
+    for index, token in enumerate(tokens):
+        score = float(scores[index]) if index < len(scores) else 0.0
+        color = _score_color(score, max_score)
+        spans.append(
+            f'<span style="background:{color}" title="{score:.6f}">{escape(token)}</span> '
+        )
+    return "".join(spans)
+
+
+def render_trace_html(result: "TraceResult") -> str:
+    top_rows = "\n".join(
+        f"<tr><td>{item.index}</td><td><code>{escape(item.token)}</code></td><td>{item.score:.6f}</td></tr>"
+        for item in result.topk_inputs(20)
+    )
+    hop_sections = []
+    for hop_index, hop_scores in enumerate(result.per_hop_scores):
+        hop_sections.append(
+            f"<h3>Hop {hop_index}</h3><div>{_render_token_row(result.prompt_tokens, hop_scores)}</div>"
+        )
+    hop_html = "\n".join(hop_sections)
+    metadata = escape(str(result.metadata))
+    return f"""<!DOCTYPE html>
+<html>
+<head>
+<meta charset="utf-8">
+<title>FlashTrace</title>
+</head>
+<body>
+<h1>FlashTrace</h1>
+<p>method={escape(result.method)} output_span={escape(str(result.output_span))} reasoning_span={escape(str(result.reasoning_span))}</p>
+<h2>Prompt Attribution</h2>
+<div>{_render_token_row(result.prompt_tokens, result.scores)}</div>
+{hop_html}
+<h2>Top Input Tokens</h2>
+<table><tr><th>Index</th><th>Token</th><th>Score</th></tr>
+{top_rows}
+</table>
+<h2>Metadata</h2>
+<pre>{metadata}</pre>
+</body>
+</html>
+"""
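The heat-map color ramp in `_score_color` maps each score to a red tint whose saturation and opacity grow with the score's share of the row maximum. A standalone mirror of the function (same constants as above):

```python
# Mirror of _score_color: scores are scaled by the row's max |score|;
# higher ratios give a more saturated, more opaque red. A non-positive
# max falls back to a neutral grey.
def score_color(score: float, max_score: float) -> str:
    if max_score <= 0.0:
        return "rgba(245,245,245,0.75)"
    ratio = min(1.0, abs(float(score)) / (max_score + 1e-12))
    red = 255
    green = int(246 - 105 * ratio)
    blue = int(226 - 170 * ratio)
    alpha = 0.22 + 0.58 * ratio
    return f"rgba({red},{green},{blue},{alpha:.3f})"

print(score_color(0.0, 1.0))  # faint warm background
print(score_color(1.0, 1.0))  # full-strength red
```

Because the ramp is normalized per token row, the strongest token in any row always renders at full intensity regardless of the absolute score scale.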
diff --git a/ft_ifr_improve.py b/ft_ifr_improve.py
new file mode 100644
index 0000000000000000000000000000000000000000..01169175ddc05210dbf7eabd3d99e7c7859d7276
--- /dev/null
+++ b/ft_ifr_improve.py
@@ -0,0 +1,3 @@
+"""Compatibility wrapper for package-era imports."""
+
+from flashtrace.improved import * # noqa: F401,F403
diff --git a/ifr_core.py b/ifr_core.py
new file mode 100644
index 0000000000000000000000000000000000000000..065f506ed456cc6555f50e5d2e2af6ab59980eed
--- /dev/null
+++ b/ifr_core.py
@@ -0,0 +1,3 @@
+"""Compatibility wrapper for package-era imports."""
+
+from flashtrace.core import * # noqa: F401,F403
diff --git a/llm_attr.py b/llm_attr.py
new file mode 100644
index 0000000000000000000000000000000000000000..37a2403b55a46d9816d022981a58bee98cb988c6
--- /dev/null
+++ b/llm_attr.py
@@ -0,0 +1,3 @@
+"""Compatibility wrapper for package-era imports."""
+
+from flashtrace.attribution import * # noqa: F401,F403
diff --git a/llm_attr_eval.py b/llm_attr_eval.py
new file mode 100755
index 0000000000000000000000000000000000000000..b65d5e6d3d7771e376f11282b298659f79121192
--- /dev/null
+++ b/llm_attr_eval.py
@@ -0,0 +1,277 @@
+import math
+import torch
+import numpy as np
+from typing import Dict, Any, Optional, Tuple, List
+
+from shared_utils import (
+ DEFAULT_GENERATE_KWARGS,
+ DEFAULT_PROMPT_TEMPLATE,
+)
+
+
+class LLMAttributionEvaluator():
+ def __init__(
+ self,
+ model: Any,
+ tokenizer: Any,
+ generate_kwargs: Optional[Dict[str, Any]] = None
+ ) -> None:
+
+ self.model = model
+ self.tokenizer = tokenizer
+ self.device = model.device
+ self.generate_kwargs = generate_kwargs or DEFAULT_GENERATE_KWARGS
+ self.generated_ids = None
+ self.prompt_ids = None
+
+ self.model.eval()
+
+ def format_prompt(self, prompt) -> str:
+        modified_prompt = DEFAULT_PROMPT_TEMPLATE.format(context=prompt, query="")
+ formatted_prompt = [{"role": "user", "content": modified_prompt}]
+ formatted_prompt = self.tokenizer.apply_chat_template(
+ formatted_prompt,
+ tokenize=False,
+ add_generation_prompt=True,
+ enable_thinking=False
+ )
+
+ return formatted_prompt
+
+ # Query the model for its generation
+ # This internally saves the input and generated token ids
+ def response(self, prompt) -> Tuple[str, str]:
+ formatted_prompt = self.format_prompt(" " + prompt)
+
+        model_input = self.tokenizer(formatted_prompt, return_tensors="pt", add_special_tokens=False).to(self.device)
+
+ with torch.no_grad():
+ outputs = self.model.generate(model_input.input_ids, **self.generate_kwargs) # [1, num_prompt_tokens + num_generations]
+            # Get only the prompt tokens (excluding the generation)
+ self.prompt_ids = outputs[:, :model_input.input_ids.shape[1]] # [1, num_prompt_tokens]
+ # Get only the generated tokens (excluding the prompt)
+ self.generated_ids = outputs[:, model_input.input_ids.shape[1]:] # [1, num_generations]
+
+ return self.tokenizer.decode(self.generated_ids[0], skip_special_tokens=True), self.tokenizer.decode(outputs[0], skip_special_tokens=False)
+
+    # We want to evaluate the probability of producing a response given a prompt.
+ def compute_logprob_response_given_prompt(self, prompt_ids, response_ids) -> torch.Tensor:
+ """
+ Compute log-probabilities of `response_ids` given `prompt_ids`.
+
+ prompt_ids: [B, N]
+ response_ids: [B, M]
+ Returns: [B, M]
+ """
+ # concat prompt and response
+ input_ids = torch.cat([prompt_ids, response_ids], dim=1) # [B, N+M]
+ attention_mask = torch.ones_like(input_ids)
+
+ # Get model outputs
+ logits = self.model(input_ids=input_ids, attention_mask=attention_mask).logits # [B, seq_len, vocab_size]
+
+ # Compute log-probs
+ log_probs = torch.nn.functional.log_softmax(logits, dim=-1) # [B, seq_len, vocab_size]
+
+ # Only consider response tokens
+ response_start = prompt_ids.shape[1]
+
+        # Align logits to predict each y_t from y_{<t}
+        logits_for_response = log_probs[:, response_start - 1 : -1, :]
+        gathered = logits_for_response.gather(2, response_ids.unsqueeze(-1))
+        return gathered.squeeze(-1)
+
+    def _ensure_pad_token_id(self) -> int:
+ if self.tokenizer.pad_token_id is None:
+ if self.tokenizer.eos_token_id is None:
+ raise RuntimeError("tokenizer has neither pad_token_id nor eos_token_id; cannot define baseline token.")
+ self.tokenizer.pad_token = self.tokenizer.eos_token
+ return int(self.tokenizer.pad_token_id)
+
+ def _find_subsequence_start(self, haystack: torch.Tensor, needle: torch.Tensor) -> Optional[int]:
+ if haystack.ndim != 1 or needle.ndim != 1:
+ raise ValueError("Expected 1D tensors for subsequence matching.")
+ if needle.numel() == 0:
+ return 0
+ hay_len = int(haystack.numel())
+ needle_len = int(needle.numel())
+ if needle_len > hay_len:
+ return None
+ for i in range(hay_len - needle_len + 1):
+ if torch.equal(haystack[i : i + needle_len], needle):
+ return i
+ return None
+
+ def get_topk_tokens(self, attr_matrix, text_list, topk = 10) -> torch.Tensor:
+ input_len = len(text_list)
+ input_col_sums = attr_matrix.sum(0).clamp(0)[0 : input_len]
+ topk_cols = torch.topk(input_col_sums, topk)[1]
+
+ return torch.sort(topk_cols)[0]
+
+ def add_dummy_facts_to_prompt(self, text_sentences) -> List[str]:
+ # create dummy fact sentences
+ dummy_sentences = []
+ for i in range(len(text_sentences)):
+ dummy_sentences.append(" Unrelated Sentence.")
+
+ # Interleave the dummy facts
+ result = []
+ for x, y in zip(text_sentences, dummy_sentences):
+ result.append(x)
+ result.append(y)
+
+        # return the interleaved list
+        return result
+
+ def faithfulness_test(
+ self,
+ attribution: torch.Tensor,
+ prompt: str,
+ generation: str,
+ *,
+ k: int = 20,
+ ) -> Tuple[float, float, float]:
+ """Token-level MAS/RISE faithfulness via guided deletion in k perturbation steps (no optimization).
+
+ attribution: [R, P] token attribution on *prompt-side tokens* only.
+ prompt: raw prompt string (NOT sentence-segmented).
+ generation: target generation string (think + output); scored as generation + eos.
+ k: number of perturbation steps; each step perturbs ~1/k of prompt tokens.
+ """
+
+ def auc(arr: np.ndarray) -> float:
+ return (arr.sum() - arr[0] / 2 - arr[-1] / 2) / max(1, (arr.shape[0] - 1))
+
+ pad_token_id = self._ensure_pad_token_id()
+
+ # Leading-space convention must match attribution path (" " + prompt).
+ user_prompt = " " + prompt
+ formatted_prompt = self.format_prompt(user_prompt)
+
+ # Tokenize (CPU for span finding, then move to device).
+ formatted_ids = self.tokenizer(formatted_prompt, return_tensors="pt", add_special_tokens=False).input_ids
+ user_ids = self.tokenizer(user_prompt, return_tensors="pt", add_special_tokens=False).input_ids
+ user_start = self._find_subsequence_start(formatted_ids[0], user_ids[0])
+ if user_start is None:
+ raise RuntimeError("Failed to locate user prompt token span inside formatted chat prompt.")
+
+ prompt_ids = formatted_ids.to(self.device)
+ prompt_ids_perturbed = prompt_ids.clone()
+ generation_ids = self.tokenizer(
+ generation + self.tokenizer.eos_token,
+ return_tensors="pt",
+ add_special_tokens=False,
+ ).input_ids.to(self.device)
+
+ # Compute guided deletion ordering over prompt-side tokens.
+ attr_cpu = attribution.detach().cpu()
+ w = attr_cpu.sum(0)
+ sorted_attr_indices = torch.argsort(w, descending=True)
+ attr_sum = float(w.sum().item())
+
+ P = int(w.numel())
+ if int(user_ids.shape[1]) != P:
+ raise ValueError(
+ "Prompt-side attribution length does not match tokenized user prompt length: "
+ f"attr P={P}, user_prompt P={int(user_ids.shape[1])}."
+ )
+ if P > 0:
+ steps = int(k) if k is not None else 0
+ if steps <= 0:
+ steps = 1
+ steps = min(steps, P)
+ else:
+ steps = 0
+
+ scores = np.zeros(steps + 1, dtype=np.float64)
+ density = np.zeros(steps + 1, dtype=np.float64)
+
+ scores[0] = self.compute_logprob_response_given_prompt(prompt_ids_perturbed, generation_ids).sum().cpu().detach().item()
+ density[0] = 1.0
+
+ if P == 0:
+ return auc(scores), auc(scores), auc(scores)
+
+ if attr_sum <= 0:
+ density = np.linspace(1.0, 0.0, steps + 1)
+
+ base = P // steps
+ remainder = P % steps
+ start = 0
+ for step in range(steps):
+ size = base + (1 if step < remainder else 0)
+ group = sorted_attr_indices[start : start + size]
+ start += size
+
+ for idx in group:
+ j = int(idx.item())
+ prompt_ids_perturbed[0, user_start + j] = pad_token_id
+ scores[step + 1] = (
+ self.compute_logprob_response_given_prompt(prompt_ids_perturbed, generation_ids).sum().cpu().detach().item()
+ )
+ if attr_sum > 0:
+ dec = float(w.index_select(0, group).sum().item()) / attr_sum
+ density[step + 1] = density[step] - dec
+
+ min_normalized_pred = 1.0
+ normalized_model_response = scores.copy()
+ for i in range(len(scores)):
+            # +1e-12 guards against division by zero when the score curve is flat
+            normalized_pred = (normalized_model_response[i] - scores[-1]) / (abs(scores[0] - scores[-1]) + 1e-12)
+ normalized_pred = np.clip(normalized_pred, 0.0, 1.0)
+ min_normalized_pred = min(min_normalized_pred, normalized_pred)
+ normalized_model_response[i] = min_normalized_pred
+
+ alignment_penalty = np.abs(normalized_model_response - density)
+ corrected_scores = normalized_model_response + alignment_penalty
+ corrected_scores = corrected_scores.clip(0.0, 1.0)
+ corrected_scores = (corrected_scores - np.min(corrected_scores)) / (np.max(corrected_scores) - np.min(corrected_scores))
+
+ if np.isnan(corrected_scores).any():
+ corrected_scores = np.linspace(1.0, 0.0, len(scores))
+
+ return auc(normalized_model_response), auc(corrected_scores), auc(normalized_model_response + alignment_penalty)
+
+ def evaluate_attr_recovery(
+ self,
+ attribution: torch.Tensor,
+ *,
+ prompt_len: int,
+ gold_prompt_token_indices: List[int],
+ top_fraction: float = 0.1,
+ ) -> float:
+ """Recall of gold prompt tokens among top-attributed prompt tokens.
+
+ Ranking excludes model-generated tokens by restricting to prompt-side tokens [0, prompt_len).
+ """
+ if attribution.ndim != 2:
+ raise ValueError("Expected 2D token-level attribution matrix [G, P+G].")
+ if prompt_len <= 0:
+ return float("nan")
+ if int(attribution.shape[1]) < int(prompt_len):
+ raise ValueError(
+ "prompt_len exceeds attribution width: "
+ f"prompt_len={int(prompt_len)} attribution_cols={int(attribution.shape[1])}."
+ )
+
+ gold: set[int] = set()
+ for raw in gold_prompt_token_indices or []:
+ try:
+ idx = int(raw)
+ except Exception:
+ continue
+ if 0 <= idx < int(prompt_len):
+ gold.add(idx)
+ if not gold:
+ return float("nan")
+
+ w = torch.nan_to_num(attribution[:, :prompt_len].sum(0).to(dtype=torch.float32), nan=0.0).clamp(min=0.0)
+ k = max(1, int(math.ceil(float(prompt_len) * float(top_fraction))))
+ k = min(k, int(prompt_len))
+ topk = torch.topk(w, k, largest=True).indices.tolist()
+ hit = len(set(topk).intersection(gold))
+ return float(hit) / float(len(gold))
+
+
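The AUC used throughout `faithfulness_test` is the trapezoid rule with endpoints half-weighted, normalized by the number of intervals. A standalone mirror (the function body is identical to the inner `auc` above):

```python
import numpy as np

# Trapezoid-rule AUC: endpoints get half weight, normalized by the number
# of intervals; a length-1 curve degenerates to 0 via the max(1, ...) guard.
def auc(arr: np.ndarray) -> float:
    return float((arr.sum() - arr[0] / 2 - arr[-1] / 2) / max(1, arr.shape[0] - 1))

curve = np.array([1.0, 0.5, 0.0])
print(auc(curve))  # straight line from 1 to 0 -> area 0.5
```

Lower AUC under guided deletion means the attribution ranked the truly load-bearing prompt tokens first, which is exactly what the faithfulness metric rewards.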
diff --git a/lrp_patches.py b/lrp_patches.py
new file mode 100644
index 0000000000000000000000000000000000000000..0ca7f3da0bd1b59ad195c85a3a2ab1a9a1d05b06
--- /dev/null
+++ b/lrp_patches.py
@@ -0,0 +1,3 @@
+"""Compatibility wrapper for package-era imports."""
+
+from flashtrace.lrp_patches import * # noqa: F401,F403
diff --git a/lrp_rules.py b/lrp_rules.py
new file mode 100644
index 0000000000000000000000000000000000000000..675559719a3791cb7a92160e28b3aae4062a09da
--- /dev/null
+++ b/lrp_rules.py
@@ -0,0 +1,3 @@
+"""Compatibility wrapper for package-era imports."""
+
+from flashtrace.lrp_rules import * # noqa: F401,F403
diff --git a/perturbation_fast.py b/perturbation_fast.py
new file mode 100644
index 0000000000000000000000000000000000000000..6b8db6bbcaad40d42ba666cc50f01ddcd268f05f
--- /dev/null
+++ b/perturbation_fast.py
@@ -0,0 +1,378 @@
+"""Fast (approximate) perturbation-based attribution baselines.
+
+This module provides k-segment approximations for the perturbation baselines
+implemented in llm_attr.LLMPerturbationAttribution, but with a much cheaper
+inner-loop over source segments (default k=20) instead of sentence masks.
+
+Intended usage: exp/exp2 only (baseline-speed focus; fidelity is secondary).
+"""
+
+from __future__ import annotations
+
+from typing import Any, List, Optional, Sequence
+
+import torch
+import torch.nn.functional as F
+
+from shared_utils import create_sentence_masks, create_sentences
+from llm_attr import LLMAttribution, LLMAttributionResult
+
+
+def _split_indices_into_k_groups(indices: Sequence[int], k: int) -> List[List[int]]:
+ if not indices:
+ return []
+ steps = int(k) if k is not None else 0
+ if steps <= 0:
+ steps = 1
+ steps = min(steps, len(indices))
+ base = len(indices) // steps
+ remainder = len(indices) % steps
+ groups: List[List[int]] = []
+ start = 0
+ for i in range(steps):
+ size = base + (1 if i < remainder else 0)
+ groups.append(list(indices[start : start + size]))
+ start += size
+ return groups
+
+
+def _is_valid_token_span(span: object) -> bool:
+ if not isinstance(span, (list, tuple)) or len(span) != 2:
+ return False
+ a, b = span
+ return isinstance(a, int) and isinstance(b, int) and a >= 0 and b >= a
+
+
+def _resolve_indices_to_explain_from_stack() -> Optional[tuple[int, int]]:
+ """Best-effort: pull generation-token span from exp/exp2 caller without changing its API.
+
+ exp/exp2 calls these fast baselines without passing indices_to_explain; to enable
+ safe sink-loop pruning (row-only), we opportunistically look for an `example`
+ object in caller frames and read `example.indices_to_explain`.
+
+ If not found, returns None and the full sink loop is computed.
+ """
+ try:
+ import inspect
+ except Exception:
+ return None
+
+ frame = inspect.currentframe()
+ try:
+ cur = frame.f_back if frame is not None else None
+ while cur is not None:
+ for name in ("example", "ex"):
+ obj = cur.f_locals.get(name)
+ if obj is None:
+ continue
+ span = getattr(obj, "indices_to_explain", None)
+ if _is_valid_token_span(span):
+ return int(span[0]), int(span[1])
+ cur = cur.f_back
+ return None
+ finally:
+ # Avoid reference cycles (inspect.currentframe keeps frames alive).
+ try:
+ del frame
+ del cur # type: ignore[name-defined]
+ except Exception:
+ pass
+
+
+class LLMPerturbationFastAttribution(LLMAttribution):
+ """K-segment approximations of perturbation baselines (Perturbation / CLP / REAGENT)."""
+
+ def __init__(self, model: Any, tokenizer: Any, generate_kwargs: Optional[dict] = None) -> None:
+ super().__init__(model, tokenizer, generate_kwargs)
+ self._mlm_tokenizer: Optional[Any] = None
+ self._mlm_model: Optional[Any] = None
+
+ def _ensure_mlm(self) -> None:
+ if self._mlm_tokenizer is not None and self._mlm_model is not None:
+ return
+ from transformers import LongformerForMaskedLM, LongformerTokenizer
+
+ self._mlm_tokenizer = LongformerTokenizer.from_pretrained("allenai/longformer-base-4096")
+ self._mlm_model = LongformerForMaskedLM.from_pretrained("allenai/longformer-base-4096").to(self.device)
+ self._mlm_model.eval()
+
+ @torch.no_grad()
+ def compute_logprob_response_given_prompt(self, prompt_ids: torch.Tensor, response_ids: torch.Tensor) -> torch.Tensor:
+ """Compute per-token log-probabilities of response_ids given prompt_ids.
+
+ prompt_ids: [B, N]
+ response_ids: [B, M]
+ Returns: [B, M]
+ """
+ input_ids = torch.cat([prompt_ids, response_ids], dim=1)
+ attention_mask = torch.ones_like(input_ids)
+ logits = self.model(input_ids=input_ids, attention_mask=attention_mask).logits
+ log_probs = F.log_softmax(logits, dim=-1)
+ response_start = prompt_ids.shape[1]
+ logits_for_response = log_probs[:, response_start - 1 : -1, :]
+ gathered = logits_for_response.gather(2, response_ids.unsqueeze(-1))
+ return gathered.squeeze(-1)
+
+ @torch.no_grad()
+ def compute_kl_response_given_prompt(self, prompt_ids: torch.Tensor, response_ids: torch.Tensor) -> torch.Tensor:
+ """Compute a KL-like per-token score for response_ids given prompt_ids.
+
+ This mirrors llm_attr.LLMPerturbationAttribution.compute_kl_response_given_prompt.
+ """
+ device = prompt_ids.device
+ prompt_ids = prompt_ids.to(device)
+ response_ids = response_ids.to(device)
+
+ input_ids = torch.cat([prompt_ids, response_ids], dim=1)
+ attention_mask = torch.ones_like(input_ids, device=device)
+ logits = self.model(input_ids=input_ids, attention_mask=attention_mask).logits
+ logits = logits.to(torch.float32)
+ log_probs = F.log_softmax(logits, dim=-1)
+
+ _, N = prompt_ids.shape
+ M = response_ids.shape[1]
+ response_positions = torch.arange(N, N + M, device=device)
+ log_probs_response = log_probs[:, response_positions - 1, :]
+ log_p = log_probs_response.gather(2, response_ids.unsqueeze(-1)).squeeze(-1)
+
+ log_p_minus_log_q = -log_probs_response + log_p.unsqueeze(-1)
+ p = log_p.exp()
+ kl_scores = (log_p_minus_log_q * p.unsqueeze(-1)).sum(dim=-1)
+ return kl_scores
+
+ def _build_source_groups_full(self, *, source_k: int) -> List[torch.Tensor]:
+ input_length = int(self.prompt_ids.shape[1])
+ generation_length = int(self.generation_ids.shape[1])
+ total_length = input_length + generation_length
+
+ source_positions_full: List[int] = list(self.user_prompt_indices or [])
+ source_positions_full.extend(range(input_length, total_length))
+
+ groups = _split_indices_into_k_groups(source_positions_full, source_k)
+ return [torch.tensor(g, dtype=torch.long) for g in groups if g]
+
+ def calculate_feature_ablation_segments(
+ self,
+ prompt: str,
+ *,
+ baseline: int,
+ measure: str = "log_loss",
+ target: Optional[str] = None,
+ source_k: int = 20,
+ ) -> LLMAttributionResult:
+ """Approximate sentence-loop perturbation via fixed k source segments per step.
+
+ - sink unit: generation sentences (same as baseline)
+ - source unit: k segments over (user-prompt tokens + all generation tokens),
+ restricted to currently-available tokens (prompt + previous generations).
+ """
+ sink_span = _resolve_indices_to_explain_from_stack()
+
+ if target is None:
+ self.response(prompt)
+ else:
+ self.target_response(prompt, target)
+
+ input_ids_all = self.prompt_ids.clone()
+ input_length = int(self.prompt_ids.shape[1])
+ generation_length = int(self.generation_ids.shape[1])
+ total_length = input_length + generation_length
+
+ generation_sentences = create_sentences("".join(self.generation_tokens), self.tokenizer)
+ sentence_masks_generation = create_sentence_masks(self.generation_tokens, generation_sentences)
+
+ score_array = torch.full((generation_length, total_length), torch.nan)
+ source_groups_full = self._build_source_groups_full(source_k=source_k)
+
+ for step in range(int(sentence_masks_generation.shape[0])):
+ input_ids_all = input_ids_all.detach()
+
+ gen_token_indices = torch.where(sentence_masks_generation[step] == 1)[0]
+ if gen_token_indices.numel() == 0:
+ continue
+ gen_tokens = self.generation_ids[:, gen_token_indices]
+
+ if sink_span is not None:
+ span_start, span_end = sink_span
+ min_tok = int(gen_token_indices.min().item())
+ max_tok = int(gen_token_indices.max().item())
+ if max_tok < span_start:
+ input_ids_all = torch.cat([input_ids_all, gen_tokens], dim=1)
+ continue
+ if min_tok > span_end:
+ break
+
+ if measure == "log_loss":
+ original_scores = self.compute_logprob_response_given_prompt(input_ids_all, gen_tokens).detach().cpu()
+ elif measure == "KL":
+ original_scores = self.compute_kl_response_given_prompt(input_ids_all, gen_tokens).detach().cpu()
+ else:
+ raise ValueError(f"Unsupported measure: {measure!r}")
+
+ available_max = int(input_ids_all.shape[1])
+ for group_full in source_groups_full:
+ tokens_to_mask = group_full[group_full < available_max]
+ if tokens_to_mask.numel() == 0:
+ continue
+
+ original_token_value = input_ids_all[:, tokens_to_mask].clone()
+ input_ids_all[:, tokens_to_mask] = int(baseline)
+
+ if measure == "log_loss":
+ perturbed_scores = self.compute_logprob_response_given_prompt(input_ids_all, gen_tokens).detach().cpu()
+ else:
+ perturbed_scores = self.compute_kl_response_given_prompt(input_ids_all, gen_tokens).detach().cpu()
+
+ score_delta = original_scores - perturbed_scores
+ rows, cols = torch.meshgrid(gen_token_indices, tokens_to_mask, indexing="ij")
+ score_array[rows, cols] = (
+ score_delta.reshape(-1, 1).repeat((1, int(tokens_to_mask.numel()))).to(score_array.dtype)
+ )
+
+ input_ids_all[:, tokens_to_mask] = original_token_value
+
+ input_ids_all = torch.cat([input_ids_all, gen_tokens], dim=1)
+
+ score_array = self.extract_user_prompt_attributions(self.prompt_tokens, score_array)
+ all_tokens = self.user_prompt_tokens + self.generation_tokens
+ return LLMAttributionResult(
+ self.tokenizer,
+ score_array,
+ self.user_prompt_tokens,
+ self.generation_tokens,
+ all_tokens=all_tokens,
+ metadata={
+ "perturbation_fast": {
+ "source_k": int(source_k),
+ "source_unit": "segments",
+ "measure": str(measure),
+ "baseline": int(baseline),
+ }
+ },
+ )
+
+ @torch.no_grad()
+ def _mlm_mask_indices(self, input_ids: torch.Tensor, tokens_to_mask: torch.Tensor) -> torch.Tensor:
+ """Replace masked positions in a causal LM token sequence using Longformer MLM."""
+ self._ensure_mlm()
+ assert self._mlm_tokenizer is not None
+ assert self._mlm_model is not None
+
+ new_text_tokens = self.tokenizer.convert_ids_to_tokens(input_ids[0])
+ for idx in tokens_to_mask.tolist():
+ new_text_tokens[int(idx)] = self._mlm_tokenizer.mask_token
+ new_text = self.tokenizer.convert_tokens_to_string(new_text_tokens)
+
+ inputs = self._mlm_tokenizer(new_text, return_tensors="pt", max_length=4096, truncation=True)
+ inputs = {k: v.to(self.device) for k, v in inputs.items()}
+ masked_positions = (inputs["input_ids"] == self._mlm_tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
+
+ global_attention_mask = torch.zeros_like(inputs["input_ids"])
+ global_attention_mask[0, masked_positions] = 1
+ inputs["global_attention_mask"] = global_attention_mask
+
+ logits = self._mlm_model(**inputs).logits
+ predicted_ids = logits[0, masked_positions, :].argmax(dim=-1)
+
+ regenerated_text = self._mlm_tokenizer.decode(predicted_ids, skip_special_tokens=True)
+ if regenerated_text and regenerated_text[0] != " ":
+ regenerated_text = " " + regenerated_text
+
+ replacement_input_ids = self.tokenizer(regenerated_text, return_tensors="pt").input_ids
+
+ original_len = int(tokens_to_mask.numel())
+ new_len = int(replacement_input_ids.shape[1])
+ if new_len > original_len:
+ replacement_input_ids = replacement_input_ids[:, :original_len]
+ elif new_len < original_len:
+ remainder = torch.full((1, original_len - new_len), self.tokenizer.eos_token_id, dtype=torch.long)
+ replacement_input_ids = torch.cat((replacement_input_ids, remainder), dim=1)
+
+ replacement_input_ids = replacement_input_ids.to(torch.int64)
+ return replacement_input_ids.to(self.device)
+
+ def calculate_feature_ablation_segments_mlm(
+ self,
+ prompt: str,
+ *,
+ target: Optional[str] = None,
+ source_k: int = 20,
+ ) -> LLMAttributionResult:
+ """Approximate REAGENT attribution: source segments masked via MLM replacement."""
+ sink_span = _resolve_indices_to_explain_from_stack()
+
+ if target is None:
+ self.response(prompt)
+ else:
+ self.target_response(prompt, target)
+
+ input_ids_all = self.prompt_ids.clone()
+ input_length = int(self.prompt_ids.shape[1])
+ generation_length = int(self.generation_ids.shape[1])
+ total_length = input_length + generation_length
+
+ generation_sentences = create_sentences("".join(self.generation_tokens), self.tokenizer)
+ sentence_masks_generation = create_sentence_masks(self.generation_tokens, generation_sentences)
+
+ score_array = torch.full((generation_length, total_length), torch.nan)
+ source_groups_full = self._build_source_groups_full(source_k=source_k)
+
+ for step in range(int(sentence_masks_generation.shape[0])):
+ input_ids_all = input_ids_all.detach()
+
+ gen_token_indices = torch.where(sentence_masks_generation[step] == 1)[0]
+ if gen_token_indices.numel() == 0:
+ continue
+ gen_tokens = self.generation_ids[:, gen_token_indices]
+
+ if sink_span is not None:
+ span_start, span_end = sink_span
+ min_tok = int(gen_token_indices.min().item())
+ max_tok = int(gen_token_indices.max().item())
+ if max_tok < span_start:
+ input_ids_all = torch.cat([input_ids_all, gen_tokens], dim=1)
+ continue
+ if min_tok > span_end:
+ break
+
+ original_scores = self.compute_logprob_response_given_prompt(input_ids_all, gen_tokens).detach().cpu()
+
+ available_max = int(input_ids_all.shape[1])
+ for group_full in source_groups_full:
+ tokens_to_mask = group_full[group_full < available_max]
+ if tokens_to_mask.numel() == 0:
+ continue
+
+ original_token_value = input_ids_all[:, tokens_to_mask].clone()
+ new_ids = self._mlm_mask_indices(input_ids_all, tokens_to_mask)
+ input_ids_all[:, tokens_to_mask] = new_ids
+
+ perturbed_scores = self.compute_logprob_response_given_prompt(input_ids_all, gen_tokens).detach().cpu()
+ score_delta = original_scores - perturbed_scores
+
+ rows, cols = torch.meshgrid(gen_token_indices, tokens_to_mask, indexing="ij")
+ score_array[rows, cols] = (
+ score_delta.reshape(-1, 1).repeat((1, int(tokens_to_mask.numel()))).to(score_array.dtype)
+ )
+
+ input_ids_all[:, tokens_to_mask] = original_token_value
+
+ input_ids_all = torch.cat([input_ids_all, gen_tokens], dim=1)
+
+ score_array = self.extract_user_prompt_attributions(self.prompt_tokens, score_array)
+ all_tokens = self.user_prompt_tokens + self.generation_tokens
+ return LLMAttributionResult(
+ self.tokenizer,
+ score_array,
+ self.user_prompt_tokens,
+ self.generation_tokens,
+ all_tokens=all_tokens,
+ metadata={
+ "perturbation_fast": {
+ "source_k": int(source_k),
+ "source_unit": "segments",
+ "measure": "log_loss",
+ "baseline": "mlm_replacement",
+ }
+ },
+ )
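The loop above instantiates a generic occlusion pattern: score the continuation once on the intact input, re-score it with one source group masked, and credit the log-prob drop to that group. A minimal, self-contained sketch of the pattern with a toy scoring function (all names here are illustrative, not part of `llm_attr`):

```python
from typing import Callable, List, Sequence


def occlusion_scores(
    tokens: List[str],
    groups: Sequence[Sequence[int]],
    score_fn: Callable[[List[str]], float],
    mask_token: str = "[MASK]",
) -> List[float]:
    """Credit each token group with the score drop caused by masking it."""
    original = score_fn(tokens)        # score once on the intact input
    deltas = []
    for group in groups:
        perturbed = list(tokens)       # restore the input for every group
        for i in group:
            perturbed[i] = mask_token  # occlude just this group
        deltas.append(original - score_fn(perturbed))
    return deltas


# Toy score: how many "evidence" tokens survive the masking.
score = lambda toks: float(toks.count("evidence"))
deltas = occlusion_scores(["evidence", "filler", "evidence"], [[0], [1], [2]], score)
# Masking an evidence token costs 1.0; masking filler costs nothing.
```

The real method differs in two ways: masked positions are filled with MLM-sampled replacements rather than a fixed mask token, and the score is the target span's log-probability under the model rather than a token count.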
diff --git a/pyproject.toml b/pyproject.toml
new file mode 100644
index 0000000000000000000000000000000000000000..13ce5050522bdc80e6bbd9ad1b48d6f808a64a2f
--- /dev/null
+++ b/pyproject.toml
@@ -0,0 +1,69 @@
+[build-system]
+requires = ["setuptools>=77"]
+build-backend = "setuptools.build_meta"
+
+[project]
+name = "flashtrace"
+version = "0.1.1"
+description = "Efficient multi-token attribution for reasoning language models."
+readme = "README.md"
+requires-python = ">=3.10"
+authors = [
+ { name = "Wenbo Pan" },
+]
+license = "MIT"
+classifiers = [
+ "Development Status :: 3 - Alpha",
+ "Intended Audience :: Science/Research",
+ "Operating System :: OS Independent",
+ "Programming Language :: Python :: 3",
+ "Programming Language :: Python :: 3.10",
+ "Programming Language :: Python :: 3.11",
+ "Programming Language :: Python :: 3.12",
+ "Topic :: Scientific/Engineering :: Artificial Intelligence",
+]
+dependencies = [
+ "accelerate>=1.11.0",
+ "matplotlib>=3.6",
+ "networkx>=3.3",
+ "numpy>=2.0",
+ "seaborn>=0.13.2",
+ "spacy>=3.8",
+ "torch>=2.5",
+ "tqdm>=4.67",
+ "transformers>=4.53",
+ "wordfreq>=3.1.1",
+]
+
+[project.optional-dependencies]
+baselines = [
+ "bert-score>=0.3.13",
+ "evaluate>=0.4.6",
+ "sentence-transformers>=4.1.0",
+]
+eval = [
+ "datasets>=2.21",
+ "evaluate>=0.4.6",
+]
+dev = [
+ "pytest>=8.0",
+]
+
+[dependency-groups]
+dev = [
+ "pytest>=8.0",
+]
+
+[project.scripts]
+flashtrace = "flashtrace.cli:main"
+
+[project.urls]
+Homepage = "https://github.com/wbopan/flashtrace"
+Repository = "https://github.com/wbopan/flashtrace"
+Issues = "https://github.com/wbopan/flashtrace/issues"
+
+[tool.setuptools.packages.find]
+include = ["flashtrace*"]
+
+[tool.pytest.ini_options]
+pythonpath = ["."]
diff --git a/requirements.txt b/requirements.txt
new file mode 100644
index 0000000000000000000000000000000000000000..ea48a7066b42d2d24891402fcdf7c9cd854939be
--- /dev/null
+++ b/requirements.txt
@@ -0,0 +1,13 @@
+datasets==2.21.0
+evaluate==0.4.6
+huggingface_hub==0.31.2
+matplotlib==3.6.3
+networkx==3.3
+numpy==2.3.3
+seaborn==0.13.2
+sentence_transformers==4.1.0
+spacy==3.8.7
+torch==2.7.1
+tqdm==4.67.0
+transformers==4.53.1
+wordfreq==3.1.1
diff --git a/shared_utils.py b/shared_utils.py
new file mode 100644
index 0000000000000000000000000000000000000000..f546b5bcb78651290e8b7b8b119794d608bf03e2
--- /dev/null
+++ b/shared_utils.py
@@ -0,0 +1,3 @@
+"""Compatibility wrapper for package-era imports."""
+
+from flashtrace.shared_utils import * # noqa: F401,F403
diff --git a/test_recompute.py b/test_recompute.py
new file mode 100644
index 0000000000000000000000000000000000000000..71119b2eb6a0c465ee6f2d8b3baa46dc4cc75b20
--- /dev/null
+++ b/test_recompute.py
@@ -0,0 +1,134 @@
+"""End-to-end test: verify recompute_attention mode produces identical IFR results
+through the full LLMIFRAttribution pipeline, and benchmark time/memory."""
+
+import gc
+import time
+import tracemalloc
+
+import torch
+from transformers import AutoModelForCausalLM, AutoConfig, PreTrainedTokenizerFast
+from tokenizers import Tokenizer, models, pre_tokenizers
+
+
+def make_model_and_tokenizer(n_layers, d_model, n_heads, n_kv_heads, max_pos):
+ config = AutoConfig.for_model(
+ "qwen2",
+ vocab_size=500,
+ hidden_size=d_model,
+ intermediate_size=d_model * 2,
+ num_hidden_layers=n_layers,
+ num_attention_heads=n_heads,
+ num_key_value_heads=n_kv_heads,
+ max_position_embeddings=max_pos,
+ use_sliding_window=False,
+ attn_implementation="eager",
+ )
+ model = AutoModelForCausalLM.from_config(config, attn_implementation="eager")
+ model.eval()
+
+ tok_backend = Tokenizer(models.WordLevel(
+ vocab={f"t{i}": i for i in range(500)}, unk_token="t0",
+ ))
+ tok_backend.pre_tokenizer = pre_tokenizers.Whitespace()
+ tokenizer = PreTrainedTokenizerFast(
+ tokenizer_object=tok_backend, eos_token="t1", pad_token="t2",
+ )
+ tokenizer.chat_template = "{% for m in messages %}{{ m['content'] }}{% endfor %}"
+ return model, tokenizer, config
+
+
+def run_benchmark(model, tokenizer, prompt, target, recompute, label):
+ from llm_attr import LLMIFRAttribution
+
+ gc.collect()
+ tracemalloc.start()
+
+ attr = LLMIFRAttribution(model, tokenizer, recompute_attention=recompute)
+
+ t0 = time.perf_counter()
+ result = attr.calculate_ifr_for_all_positions(prompt, target)
+ elapsed = time.perf_counter() - t0
+
+ _, peak_mem = tracemalloc.get_traced_memory()
+ tracemalloc.stop()
+
+ print(f" {label:20s} time={elapsed:.4f}s peak_mem={peak_mem / 1024:.1f} KB "
+ f"score_shape={result.attribution_matrix.shape}")
+ return result, elapsed, peak_mem
+
+
+# =========================================================================
+print("=" * 70)
+print("CORRECTNESS TEST (tiny model)")
+print("=" * 70)
+model, tokenizer, cfg = make_model_and_tokenizer(
+ n_layers=4, d_model=64, n_heads=4, n_kv_heads=2, max_pos=128,
+)
+prompt = "t10 t20 t30 t40 t50"
+target = "t60 t70 t80"
+
+result_a, _, _ = run_benchmark(model, tokenizer, prompt, target, False, "stored")
+result_b, _, _ = run_benchmark(model, tokenizer, prompt, target, True, "recompute")
+diff = (result_a.attribution_matrix - result_b.attribution_matrix).abs().max().item()
+print(f" max_diff={diff:.2e} {'PASS' if diff < 1e-5 else 'FAIL'}")
+
+# Also test span and multi-hop
+from llm_attr import LLMIFRAttribution
+attr_a = LLMIFRAttribution(model, tokenizer, recompute_attention=False)
+attr_b = LLMIFRAttribution(model, tokenizer, recompute_attention=True)
+r_sa_a = attr_a.calculate_ifr_span(prompt, target)
+r_sa_b = attr_b.calculate_ifr_span(prompt, target)
+print(f"  span      max_diff={(d_sa := (r_sa_a.attribution_matrix - r_sa_b.attribution_matrix).abs().max().item()):.2e} {'PASS' if d_sa < 1e-5 else 'FAIL'}")
+r_mh_a = attr_a.calculate_ifr_multi_hop(prompt, target, n_hops=2)
+r_mh_b = attr_b.calculate_ifr_multi_hop(prompt, target, n_hops=2)
+print(f"  multi_hop max_diff={(d_mh := (r_mh_a.attribution_matrix - r_mh_b.attribution_matrix).abs().max().item()):.2e} {'PASS' if d_mh < 1e-5 else 'FAIL'}")
+
+del model, tokenizer, attr_a, attr_b
+gc.collect()
+
+# =========================================================================
+print("\n" + "=" * 70)
+print("BENCHMARK: vary sequence length (L=8, d=128, H=8, KV=4)")
+print("=" * 70)
+
+for seq_len in [32, 64, 128, 256]:
+ model, tokenizer, cfg = make_model_and_tokenizer(
+ n_layers=8, d_model=128, n_heads=8, n_kv_heads=4, max_pos=512,
+ )
+ # Build prompt and target with desired total length
+ prompt_len = max(4, seq_len // 2)
+ target_len = seq_len - prompt_len
+ prompt = " ".join(f"t{10 + i}" for i in range(prompt_len))
+ target = " ".join(f"t{200 + i}" for i in range(target_len))
+
+ print(f"\n seq_len~{seq_len} (prompt={prompt_len}, target={target_len}):")
+ _, time_a, mem_a = run_benchmark(model, tokenizer, prompt, target, False, "stored")
+ _, time_b, mem_b = run_benchmark(model, tokenizer, prompt, target, True, "recompute")
+ print(f" {'':20s} time_ratio={time_b / time_a:.2f}x "
+ f"mem_ratio={mem_b / mem_a:.2f}x mem_saved={1 - mem_b / mem_a:.0%}")
+
+ del model, tokenizer
+ gc.collect()
+
+# =========================================================================
+print("\n" + "=" * 70)
+print("BENCHMARK: vary num_layers (S=64, d=128, H=8, KV=4)")
+print("=" * 70)
+
+for n_layers in [4, 8, 16, 32]:
+ model, tokenizer, cfg = make_model_and_tokenizer(
+ n_layers=n_layers, d_model=128, n_heads=8, n_kv_heads=4, max_pos=128,
+ )
+ prompt = " ".join(f"t{10 + i}" for i in range(32))
+ target = " ".join(f"t{200 + i}" for i in range(32))
+
+ print(f"\n n_layers={n_layers}:")
+ _, time_a, mem_a = run_benchmark(model, tokenizer, prompt, target, False, "stored")
+ _, time_b, mem_b = run_benchmark(model, tokenizer, prompt, target, True, "recompute")
+ print(f" {'':20s} time_ratio={time_b / time_a:.2f}x "
+ f"mem_ratio={mem_b / mem_a:.2f}x mem_saved={1 - mem_b / mem_a:.0%}")
+
+ del model, tokenizer
+ gc.collect()
+
+print("\nAll benchmarks complete.")
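`run_benchmark` pairs `perf_counter` timing with `tracemalloc` peak tracking; detached from the attribution code, the harness pattern is roughly the following (a generic sketch, not part of the library). Note that `tracemalloc` only sees Python-heap allocations, so native buffers such as CUDA tensors are not counted in the peak:

```python
import gc
import time
import tracemalloc
from typing import Any, Callable, Tuple


def benchmark(fn: Callable[..., Any], *args: Any) -> Tuple[Any, float, int]:
    """Return (result, elapsed_seconds, peak_python_heap_bytes) for one call."""
    gc.collect()                # drop garbage left over from earlier work
    tracemalloc.start()         # begin tracking Python allocations
    t0 = time.perf_counter()
    result = fn(*args)
    elapsed = time.perf_counter() - t0
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return result, elapsed, peak


result, elapsed, peak = benchmark(lambda n: list(range(n)), 10_000)
```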
diff --git a/tests/helpers.py b/tests/helpers.py
new file mode 100644
index 0000000000000000000000000000000000000000..56975d7ecc9042848b3d67efcbaaa2edc138f300
--- /dev/null
+++ b/tests/helpers.py
@@ -0,0 +1,34 @@
+from __future__ import annotations
+
+from tokenizers import Tokenizer, models, pre_tokenizers
+from transformers import AutoConfig, AutoModelForCausalLM, PreTrainedTokenizerFast
+
+
+def make_tiny_qwen2_model_and_tokenizer(
+ *,
+ n_layers: int = 3,
+ d_model: int = 48,
+ n_heads: int = 4,
+ n_kv_heads: int = 2,
+ max_pos: int = 128,
+):
+ config = AutoConfig.for_model(
+ "qwen2",
+ vocab_size=500,
+ hidden_size=d_model,
+ intermediate_size=d_model * 2,
+ num_hidden_layers=n_layers,
+ num_attention_heads=n_heads,
+ num_key_value_heads=n_kv_heads,
+ max_position_embeddings=max_pos,
+ use_sliding_window=False,
+ attn_implementation="eager",
+ )
+ model = AutoModelForCausalLM.from_config(config, attn_implementation="eager")
+ model.eval()
+
+ backend = Tokenizer(models.WordLevel(vocab={f"t{i}": i for i in range(500)}, unk_token="t0"))
+ backend.pre_tokenizer = pre_tokenizers.Whitespace()
+ tokenizer = PreTrainedTokenizerFast(tokenizer_object=backend, eos_token="t1", pad_token="t2")
+ tokenizer.chat_template = "{% for m in messages %}{{ m['content'] }}{% endfor %}"
+ return model, tokenizer
diff --git a/tests/test_cli.py b/tests/test_cli.py
new file mode 100644
index 0000000000000000000000000000000000000000..b7dec8f32ad06b1435ef88f329e6ebaa8fcffdc9
--- /dev/null
+++ b/tests/test_cli.py
@@ -0,0 +1,22 @@
+import pytest
+
+from flashtrace.cli import main, parse_span
+
+
+def test_parse_span():
+ assert parse_span("3:8") == (3, 8)
+ assert parse_span(None) is None
+
+
+@pytest.mark.parametrize("value", ["3", "8:3", "a:b"])
+def test_parse_span_rejects_invalid_values(value):
+ with pytest.raises(ValueError):
+ parse_span(value)
+
+
+def test_cli_help_exits_successfully(capsys):
+ with pytest.raises(SystemExit) as exc:
+ main(["--help"])
+
+ assert exc.value.code == 0
+ assert "trace" in capsys.readouterr().out
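These tests pin down `parse_span`'s contract: `None` passes through, `"start:end"` with integer `start <= end` yields a tuple, and everything else raises `ValueError`. A minimal implementation consistent with the tests might look like this (the actual `flashtrace.cli` version may differ):

```python
from typing import Optional, Tuple


def parse_span(value: Optional[str]) -> Optional[Tuple[int, int]]:
    """Parse a 'start:end' span string; None is passed through unchanged."""
    if value is None:
        return None
    parts = value.split(":")
    if len(parts) != 2:
        raise ValueError(f"expected 'start:end', got {value!r}")
    try:
        start, end = int(parts[0]), int(parts[1])
    except ValueError:
        raise ValueError(f"span bounds must be integers: {value!r}") from None
    if start > end:
        raise ValueError(f"span start must not exceed end: {value!r}")
    return start, end
```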
diff --git a/tests/test_core_recompute.py b/tests/test_core_recompute.py
new file mode 100644
index 0000000000000000000000000000000000000000..c3c86b59d5315c085850ce6e72f43aa3eeec3806
--- /dev/null
+++ b/tests/test_core_recompute.py
@@ -0,0 +1,30 @@
+import torch
+
+from flashtrace import core
+from flashtrace.attribution import LLMIFRAttribution
+from tests.helpers import make_tiny_qwen2_model_and_tokenizer
+
+
+def test_core_metadata_and_weight_pack():
+    model, _ = make_tiny_qwen2_model_and_tokenizer()
+
+    metadata = core.extract_model_metadata(model)
+    weight_pack = core.build_weight_pack(metadata, next(model.parameters()).dtype)
+
+    assert metadata.n_layers == 3
+    assert metadata.n_heads_q == 4
+    assert metadata.n_kv_heads == 2
+    assert len(weight_pack) == 3
+    assert torch.is_tensor(weight_pack[0]["v_w"])
+
+
+def test_package_attribution_recompute_matches_stored_attention():
+ model, tokenizer = make_tiny_qwen2_model_and_tokenizer(n_layers=2, d_model=32, n_heads=4, n_kv_heads=2)
+ prompt = "t10 t20 t30 t40"
+ target = "t60 t70"
+
+ stored = LLMIFRAttribution(model, tokenizer, recompute_attention=False).calculate_ifr_span(prompt, target)
+ recomputed = LLMIFRAttribution(model, tokenizer, recompute_attention=True).calculate_ifr_span(prompt, target)
+
+ diff = (stored.attribution_matrix - recomputed.attribution_matrix).abs().max().item()
+ assert diff < 1e-5
diff --git a/tests/test_imports.py b/tests/test_imports.py
new file mode 100644
index 0000000000000000000000000000000000000000..d3125ef2b713b21fcbc2f0ec488cd6677cf71e2d
--- /dev/null
+++ b/tests/test_imports.py
@@ -0,0 +1,6 @@
+def test_public_imports():
+ import flashtrace
+
+ assert flashtrace.FlashTrace.__name__ == "FlashTrace"
+ assert flashtrace.TraceResult.__name__ == "TraceResult"
+ assert callable(flashtrace.load_model_and_tokenizer)
diff --git a/tests/test_result.py b/tests/test_result.py
new file mode 100644
index 0000000000000000000000000000000000000000..292a638dd0537b7f6653da420ad7a6fab0675c8f
--- /dev/null
+++ b/tests/test_result.py
@@ -0,0 +1,69 @@
+import json
+
+from flashtrace.result import TokenScore, TraceResult
+
+
+def make_result():
+ return TraceResult(
+ prompt_tokens=[" alpha", " beta", " gamma"],
+ generation_tokens=[" answer"],
+ scores=[0.2, 0.7, 0.1],
+ per_hop_scores=[[0.1, 0.4, 0.0], [0.1, 0.3, 0.1]],
+ thinking_ratios=[0.5, 0.2],
+ output_span=(0, 0),
+ reasoning_span=(0, 0),
+ method="flashtrace",
+ metadata={"model": "tiny"},
+ )
+
+
+def test_topk_inputs_sorted():
+ result = make_result()
+
+ top = result.topk_inputs(2)
+
+ assert top == [
+ TokenScore(index=1, token=" beta", score=0.7),
+ TokenScore(index=0, token=" alpha", score=0.2),
+ ]
+
+
+def test_to_dict_is_json_serializable():
+ result = make_result()
+
+ payload = result.to_dict()
+
+ assert payload["method"] == "flashtrace"
+ assert payload["top_inputs"][0]["token"] == " beta"
+ json.dumps(payload)
+
+
+def test_to_dict_sanitizes_tensor_metadata():
+ import torch
+
+ result = TraceResult(
+ prompt_tokens=[" alpha"],
+ generation_tokens=[" answer"],
+ scores=[1.0],
+ metadata={"tensor": torch.tensor([1.0, 2.0]), "object": object()},
+ )
+
+ payload = result.to_dict()
+
+ assert payload["metadata"]["tensor"] == [1.0, 2.0]
+ assert isinstance(payload["metadata"]["object"], str)
+ json.dumps(payload)
+
+
+def test_json_and_html_export(tmp_path):
+ result = make_result()
+ json_path = tmp_path / "trace.json"
+ html_path = tmp_path / "trace.html"
+
+ result.to_json(json_path)
+ result.to_html(html_path)
+
+ assert json_path.read_text(encoding="utf-8").startswith("{")
+ html = html_path.read_text(encoding="utf-8")
+    assert "<" in html
+    assert len(html) > 0
+ assert len(result.scores) == len(result.prompt_tokens)
+ assert result.output_span == (1, 2)
+ assert result.reasoning_span == (0, 1)
+
+
+def test_ifr_span_method_returns_public_result():
+ model, tokenizer = make_tiny_qwen2_model_and_tokenizer(n_layers=2, d_model=32, n_heads=4, n_kv_heads=2)
+ tracer = FlashTrace(model, tokenizer, chunk_tokens=16, sink_chunk_tokens=4, recompute_attention=True)
+
+ result = tracer.trace(
+ prompt="t10 t20 t30 t40",
+ target="t60 t70",
+ output_span=(0, 1),
+ method="ifr-span",
+ )
+
+ assert result.method == "ifr-span"
+ assert len(result.scores) == len(result.prompt_tokens)
+
+
+def test_flashtrace_default_raw_prompt_does_not_call_chat_template():
+ model, tokenizer = make_tiny_qwen2_model_and_tokenizer(n_layers=2, d_model=32, n_heads=4, n_kv_heads=2)
+
+ def fail_apply_chat_template(*args, **kwargs):
+ raise AssertionError("apply_chat_template should be opt-in")
+
+ tokenizer.apply_chat_template = fail_apply_chat_template
+ tracer = FlashTrace(model, tokenizer, chunk_tokens=16, sink_chunk_tokens=4, recompute_attention=True)
+
+ result = tracer.trace(
+ prompt="t3 t4 t5",
+ target="t6 t7",
+ output_span=(0, 1),
+ method="ifr-span",
+ )
+
+ assert result.method == "ifr-span"
+ assert result.prompt_tokens == ["t3", "t4", "t5"]
+
+
+def test_flashtrace_target_without_eos_token():
+ model, tokenizer = make_tiny_qwen2_model_and_tokenizer(n_layers=2, d_model=32, n_heads=4, n_kv_heads=2)
+ tokenizer.eos_token = None
+ tokenizer.eos_token_id = None
+ tracer = FlashTrace(model, tokenizer, chunk_tokens=16, sink_chunk_tokens=4, recompute_attention=True)
+
+ result = tracer.trace(
+ prompt="t10 t20 t30 t40",
+ target="t60 t70",
+ output_span=(0, 1),
+ method="ifr-span",
+ )
+
+ assert result.method == "ifr-span"
+ assert result.generation_tokens == ["t60", "t70"]
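`test_topk_inputs_sorted` above fixes the ranking contract for `TraceResult.topk_inputs`: pair each prompt token with its score, sort descending, truncate to `k`. A standalone sketch of that behavior (illustrative only; the real `flashtrace.result` types carry more fields):

```python
from dataclasses import dataclass
from typing import List


@dataclass
class TokenScore:
    index: int
    token: str
    score: float


def topk_inputs(tokens: List[str], scores: List[float], k: int) -> List[TokenScore]:
    """Rank prompt tokens by attribution score, highest first, keep the top k."""
    ranked = [TokenScore(i, tok, s) for i, (tok, s) in enumerate(zip(tokens, scores))]
    ranked.sort(key=lambda ts: ts.score, reverse=True)
    return ranked[:k]


top = topk_inputs([" alpha", " beta", " gamma"], [0.2, 0.7, 0.1], 2)
# top[0] is " beta" (0.7), top[1] is " alpha" (0.2)
```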
diff --git a/uv.lock b/uv.lock
new file mode 100644
index 0000000000000000000000000000000000000000..3e2c965766318129d1ddc4d7a348dcd60f0a0944
--- /dev/null
+++ b/uv.lock
@@ -0,0 +1,3910 @@
+version = 1
+revision = 3
+requires-python = ">=3.10"
+resolution-markers = [
+ "python_full_version >= '3.12'",
+ "python_full_version == '3.11.*'",
+ "python_full_version < '3.11'",
+]
+
+[[package]]
+name = "accelerate"
+version = "1.11.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "huggingface-hub" },
+ { name = "numpy", version = "2.2.6", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.11'" },
+ { name = "numpy", version = "2.3.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.11'" },
+ { name = "packaging" },
+ { name = "psutil" },
+ { name = "pyyaml" },
+ { name = "safetensors" },
+ { name = "torch" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/23/60/2757c4f03a8705dbf80b1268b03881927878dca5ed07d74f733fb6c219e0/accelerate-1.11.0.tar.gz", hash = "sha256:bb1caf2597b4cd632b917b5000c591d10730bb024a79746f1ee205bba80bd229", size = 393715, upload-time = "2025-10-20T14:42:25.025Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/77/85/85951bc0f9843e2c10baaa1b6657227056095de08f4d1eea7d8b423a6832/accelerate-1.11.0-py3-none-any.whl", hash = "sha256:a628fa6beb069b8e549460fc449135d5bd8d73e7a11fd09f0bc9fc4ace7f06f1", size = 375777, upload-time = "2025-10-20T14:42:23.256Z" },
+]
+
+[[package]]
+name = "aiohappyeyeballs"
+version = "2.6.1"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/26/30/f84a107a9c4331c14b2b586036f40965c128aa4fee4dda5d3d51cb14ad54/aiohappyeyeballs-2.6.1.tar.gz", hash = "sha256:c3f9d0113123803ccadfdf3f0faa505bc78e6a72d1cc4806cbd719826e943558", size = 22760, upload-time = "2025-03-12T01:42:48.764Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/0f/15/5bf3b99495fb160b63f95972b81750f18f7f4e02ad051373b669d17d44f2/aiohappyeyeballs-2.6.1-py3-none-any.whl", hash = "sha256:f349ba8f4b75cb25c99c5c2d84e997e485204d2902a9597802b0371f09331fb8", size = 15265, upload-time = "2025-03-12T01:42:47.083Z" },
+]
+
+[[package]]
+name = "aiohttp"
+version = "3.13.2"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "aiohappyeyeballs" },
+ { name = "aiosignal" },
+ { name = "async-timeout", marker = "python_full_version < '3.11'" },
+ { name = "attrs" },
+ { name = "frozenlist" },
+ { name = "multidict" },
+ { name = "propcache" },
+ { name = "yarl" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/1c/ce/3b83ebba6b3207a7135e5fcaba49706f8a4b6008153b4e30540c982fae26/aiohttp-3.13.2.tar.gz", hash = "sha256:40176a52c186aefef6eb3cad2cdd30cd06e3afbe88fe8ab2af9c0b90f228daca", size = 7837994, upload-time = "2025-10-28T20:59:39.937Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/6d/34/939730e66b716b76046dedfe0842995842fa906ccc4964bba414ff69e429/aiohttp-3.13.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:2372b15a5f62ed37789a6b383ff7344fc5b9f243999b0cd9b629d8bc5f5b4155", size = 736471, upload-time = "2025-10-28T20:55:27.924Z" },
+ { url = "https://files.pythonhosted.org/packages/fd/cf/dcbdf2df7f6ca72b0bb4c0b4509701f2d8942cf54e29ca197389c214c07f/aiohttp-3.13.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e7f8659a48995edee7229522984bd1009c1213929c769c2daa80b40fe49a180c", size = 493985, upload-time = "2025-10-28T20:55:29.456Z" },
+ { url = "https://files.pythonhosted.org/packages/9d/87/71c8867e0a1d0882dcbc94af767784c3cb381c1c4db0943ab4aae4fed65e/aiohttp-3.13.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:939ced4a7add92296b0ad38892ce62b98c619288a081170695c6babe4f50e636", size = 489274, upload-time = "2025-10-28T20:55:31.134Z" },
+ { url = "https://files.pythonhosted.org/packages/38/0f/46c24e8dae237295eaadd113edd56dee96ef6462adf19b88592d44891dc5/aiohttp-3.13.2-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6315fb6977f1d0dd41a107c527fee2ed5ab0550b7d885bc15fee20ccb17891da", size = 1668171, upload-time = "2025-10-28T20:55:36.065Z" },
+ { url = "https://files.pythonhosted.org/packages/eb/c6/4cdfb4440d0e28483681a48f69841fa5e39366347d66ef808cbdadddb20e/aiohttp-3.13.2-cp310-cp310-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:6e7352512f763f760baaed2637055c49134fd1d35b37c2dedfac35bfe5cf8725", size = 1636036, upload-time = "2025-10-28T20:55:37.576Z" },
+ { url = "https://files.pythonhosted.org/packages/84/37/8708cf678628216fb678ab327a4e1711c576d6673998f4f43e86e9ae90dd/aiohttp-3.13.2-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:e09a0a06348a2dd73e7213353c90d709502d9786219f69b731f6caa0efeb46f5", size = 1727975, upload-time = "2025-10-28T20:55:39.457Z" },
+ { url = "https://files.pythonhosted.org/packages/e6/2e/3ebfe12fdcb9b5f66e8a0a42dffcd7636844c8a018f261efb2419f68220b/aiohttp-3.13.2-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:a09a6d073fb5789456545bdee2474d14395792faa0527887f2f4ec1a486a59d3", size = 1815823, upload-time = "2025-10-28T20:55:40.958Z" },
+ { url = "https://files.pythonhosted.org/packages/a1/4f/ca2ef819488cbb41844c6cf92ca6dd15b9441e6207c58e5ae0e0fc8d70ad/aiohttp-3.13.2-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b59d13c443f8e049d9e94099c7e412e34610f1f49be0f230ec656a10692a5802", size = 1669374, upload-time = "2025-10-28T20:55:42.745Z" },
+ { url = "https://files.pythonhosted.org/packages/f8/fe/1fe2e1179a0d91ce09c99069684aab619bf2ccde9b20bd6ca44f8837203e/aiohttp-3.13.2-cp310-cp310-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:20db2d67985d71ca033443a1ba2001c4b5693fe09b0e29f6d9358a99d4d62a8a", size = 1555315, upload-time = "2025-10-28T20:55:44.264Z" },
+ { url = "https://files.pythonhosted.org/packages/5a/2b/f3781899b81c45d7cbc7140cddb8a3481c195e7cbff8e36374759d2ab5a5/aiohttp-3.13.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:960c2fc686ba27b535f9fd2b52d87ecd7e4fd1cf877f6a5cba8afb5b4a8bd204", size = 1639140, upload-time = "2025-10-28T20:55:46.626Z" },
+ { url = "https://files.pythonhosted.org/packages/72/27/c37e85cd3ece6f6c772e549bd5a253d0c122557b25855fb274224811e4f2/aiohttp-3.13.2-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:6c00dbcf5f0d88796151e264a8eab23de2997c9303dd7c0bf622e23b24d3ce22", size = 1645496, upload-time = "2025-10-28T20:55:48.933Z" },
+ { url = "https://files.pythonhosted.org/packages/66/20/3af1ab663151bd3780b123e907761cdb86ec2c4e44b2d9b195ebc91fbe37/aiohttp-3.13.2-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:fed38a5edb7945f4d1bcabe2fcd05db4f6ec7e0e82560088b754f7e08d93772d", size = 1697625, upload-time = "2025-10-28T20:55:50.377Z" },
+ { url = "https://files.pythonhosted.org/packages/95/eb/ae5cab15efa365e13d56b31b0d085a62600298bf398a7986f8388f73b598/aiohttp-3.13.2-cp310-cp310-musllinux_1_2_riscv64.whl", hash = "sha256:b395bbca716c38bef3c764f187860e88c724b342c26275bc03e906142fc5964f", size = 1542025, upload-time = "2025-10-28T20:55:51.861Z" },
+ { url = "https://files.pythonhosted.org/packages/e9/2d/1683e8d67ec72d911397fe4e575688d2a9b8f6a6e03c8fdc9f3fd3d4c03f/aiohttp-3.13.2-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:204ffff2426c25dfda401ba08da85f9c59525cdc42bda26660463dd1cbcfec6f", size = 1714918, upload-time = "2025-10-28T20:55:53.515Z" },
+ { url = "https://files.pythonhosted.org/packages/99/a2/ffe8e0e1c57c5e542d47ffa1fcf95ef2b3ea573bf7c4d2ee877252431efc/aiohttp-3.13.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:05c4dd3c48fb5f15db31f57eb35374cb0c09afdde532e7fb70a75aede0ed30f6", size = 1656113, upload-time = "2025-10-28T20:55:55.438Z" },
+ { url = "https://files.pythonhosted.org/packages/0d/42/d511aff5c3a2b06c09d7d214f508a4ad8ac7799817f7c3d23e7336b5e896/aiohttp-3.13.2-cp310-cp310-win32.whl", hash = "sha256:e574a7d61cf10351d734bcddabbe15ede0eaa8a02070d85446875dc11189a251", size = 432290, upload-time = "2025-10-28T20:55:56.96Z" },
+ { url = "https://files.pythonhosted.org/packages/8b/ea/1c2eb7098b5bad4532994f2b7a8228d27674035c9b3234fe02c37469ef14/aiohttp-3.13.2-cp310-cp310-win_amd64.whl", hash = "sha256:364f55663085d658b8462a1c3f17b2b84a5c2e1ba858e1b79bff7b2e24ad1514", size = 455075, upload-time = "2025-10-28T20:55:58.373Z" },
+ { url = "https://files.pythonhosted.org/packages/35/74/b321e7d7ca762638cdf8cdeceb39755d9c745aff7a64c8789be96ddf6e96/aiohttp-3.13.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:4647d02df098f6434bafd7f32ad14942f05a9caa06c7016fdcc816f343997dd0", size = 743409, upload-time = "2025-10-28T20:56:00.354Z" },
+ { url = "https://files.pythonhosted.org/packages/99/3d/91524b905ec473beaf35158d17f82ef5a38033e5809fe8742e3657cdbb97/aiohttp-3.13.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:e3403f24bcb9c3b29113611c3c16a2a447c3953ecf86b79775e7be06f7ae7ccb", size = 497006, upload-time = "2025-10-28T20:56:01.85Z" },
+ { url = "https://files.pythonhosted.org/packages/eb/d3/7f68bc02a67716fe80f063e19adbd80a642e30682ce74071269e17d2dba1/aiohttp-3.13.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:43dff14e35aba17e3d6d5ba628858fb8cb51e30f44724a2d2f0c75be492c55e9", size = 493195, upload-time = "2025-10-28T20:56:03.314Z" },
+ { url = "https://files.pythonhosted.org/packages/98/31/913f774a4708775433b7375c4f867d58ba58ead833af96c8af3621a0d243/aiohttp-3.13.2-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e2a9ea08e8c58bb17655630198833109227dea914cd20be660f52215f6de5613", size = 1747759, upload-time = "2025-10-28T20:56:04.904Z" },
+ { url = "https://files.pythonhosted.org/packages/e8/63/04efe156f4326f31c7c4a97144f82132c3bb21859b7bb84748d452ccc17c/aiohttp-3.13.2-cp311-cp311-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:53b07472f235eb80e826ad038c9d106c2f653584753f3ddab907c83f49eedead", size = 1704456, upload-time = "2025-10-28T20:56:06.986Z" },
+ { url = "https://files.pythonhosted.org/packages/8e/02/4e16154d8e0a9cf4ae76f692941fd52543bbb148f02f098ca73cab9b1c1b/aiohttp-3.13.2-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:e736c93e9c274fce6419af4aac199984d866e55f8a4cec9114671d0ea9688780", size = 1807572, upload-time = "2025-10-28T20:56:08.558Z" },
+ { url = "https://files.pythonhosted.org/packages/34/58/b0583defb38689e7f06798f0285b1ffb3a6fb371f38363ce5fd772112724/aiohttp-3.13.2-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:ff5e771f5dcbc81c64898c597a434f7682f2259e0cd666932a913d53d1341d1a", size = 1895954, upload-time = "2025-10-28T20:56:10.545Z" },
+ { url = "https://files.pythonhosted.org/packages/6b/f3/083907ee3437425b4e376aa58b2c915eb1a33703ec0dc30040f7ae3368c6/aiohttp-3.13.2-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a3b6fb0c207cc661fa0bf8c66d8d9b657331ccc814f4719468af61034b478592", size = 1747092, upload-time = "2025-10-28T20:56:12.118Z" },
+ { url = "https://files.pythonhosted.org/packages/ac/61/98a47319b4e425cc134e05e5f3fc512bf9a04bf65aafd9fdcda5d57ec693/aiohttp-3.13.2-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:97a0895a8e840ab3520e2288db7cace3a1981300d48babeb50e7425609e2e0ab", size = 1606815, upload-time = "2025-10-28T20:56:14.191Z" },
+ { url = "https://files.pythonhosted.org/packages/97/4b/e78b854d82f66bb974189135d31fce265dee0f5344f64dd0d345158a5973/aiohttp-3.13.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:9e8f8afb552297aca127c90cb840e9a1d4bfd6a10d7d8f2d9176e1acc69bad30", size = 1723789, upload-time = "2025-10-28T20:56:16.101Z" },
+ { url = "https://files.pythonhosted.org/packages/ed/fc/9d2ccc794fc9b9acd1379d625c3a8c64a45508b5091c546dea273a41929e/aiohttp-3.13.2-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:ed2f9c7216e53c3df02264f25d824b079cc5914f9e2deba94155190ef648ee40", size = 1718104, upload-time = "2025-10-28T20:56:17.655Z" },
+ { url = "https://files.pythonhosted.org/packages/66/65/34564b8765ea5c7d79d23c9113135d1dd3609173da13084830f1507d56cf/aiohttp-3.13.2-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:99c5280a329d5fa18ef30fd10c793a190d996567667908bef8a7f81f8202b948", size = 1785584, upload-time = "2025-10-28T20:56:19.238Z" },
+ { url = "https://files.pythonhosted.org/packages/30/be/f6a7a426e02fc82781afd62016417b3948e2207426d90a0e478790d1c8a4/aiohttp-3.13.2-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:2ca6ffef405fc9c09a746cb5d019c1672cd7f402542e379afc66b370833170cf", size = 1595126, upload-time = "2025-10-28T20:56:20.836Z" },
+ { url = "https://files.pythonhosted.org/packages/e5/c7/8e22d5d28f94f67d2af496f14a83b3c155d915d1fe53d94b66d425ec5b42/aiohttp-3.13.2-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:47f438b1a28e926c37632bff3c44df7d27c9b57aaf4e34b1def3c07111fdb782", size = 1800665, upload-time = "2025-10-28T20:56:22.922Z" },
+ { url = "https://files.pythonhosted.org/packages/d1/11/91133c8b68b1da9fc16555706aa7276fdf781ae2bb0876c838dd86b8116e/aiohttp-3.13.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:9acda8604a57bb60544e4646a4615c1866ee6c04a8edef9b8ee6fd1d8fa2ddc8", size = 1739532, upload-time = "2025-10-28T20:56:25.924Z" },
+ { url = "https://files.pythonhosted.org/packages/17/6b/3747644d26a998774b21a616016620293ddefa4d63af6286f389aedac844/aiohttp-3.13.2-cp311-cp311-win32.whl", hash = "sha256:868e195e39b24aaa930b063c08bb0c17924899c16c672a28a65afded9c46c6ec", size = 431876, upload-time = "2025-10-28T20:56:27.524Z" },
+ { url = "https://files.pythonhosted.org/packages/c3/63/688462108c1a00eb9f05765331c107f95ae86f6b197b865d29e930b7e462/aiohttp-3.13.2-cp311-cp311-win_amd64.whl", hash = "sha256:7fd19df530c292542636c2a9a85854fab93474396a52f1695e799186bbd7f24c", size = 456205, upload-time = "2025-10-28T20:56:29.062Z" },
+ { url = "https://files.pythonhosted.org/packages/29/9b/01f00e9856d0a73260e86dd8ed0c2234a466c5c1712ce1c281548df39777/aiohttp-3.13.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:b1e56bab2e12b2b9ed300218c351ee2a3d8c8fdab5b1ec6193e11a817767e47b", size = 737623, upload-time = "2025-10-28T20:56:30.797Z" },
+ { url = "https://files.pythonhosted.org/packages/5a/1b/4be39c445e2b2bd0aab4ba736deb649fabf14f6757f405f0c9685019b9e9/aiohttp-3.13.2-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:364e25edaabd3d37b1db1f0cbcee8c73c9a3727bfa262b83e5e4cf3489a2a9dc", size = 492664, upload-time = "2025-10-28T20:56:32.708Z" },
+ { url = "https://files.pythonhosted.org/packages/28/66/d35dcfea8050e131cdd731dff36434390479b4045a8d0b9d7111b0a968f1/aiohttp-3.13.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:c5c94825f744694c4b8db20b71dba9a257cd2ba8e010a803042123f3a25d50d7", size = 491808, upload-time = "2025-10-28T20:56:34.57Z" },
+ { url = "https://files.pythonhosted.org/packages/00/29/8e4609b93e10a853b65f8291e64985de66d4f5848c5637cddc70e98f01f8/aiohttp-3.13.2-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ba2715d842ffa787be87cbfce150d5e88c87a98e0b62e0f5aa489169a393dbbb", size = 1738863, upload-time = "2025-10-28T20:56:36.377Z" },
+ { url = "https://files.pythonhosted.org/packages/9d/fa/4ebdf4adcc0def75ced1a0d2d227577cd7b1b85beb7edad85fcc87693c75/aiohttp-3.13.2-cp312-cp312-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:585542825c4bc662221fb257889e011a5aa00f1ae4d75d1d246a5225289183e3", size = 1700586, upload-time = "2025-10-28T20:56:38.034Z" },
+ { url = "https://files.pythonhosted.org/packages/da/04/73f5f02ff348a3558763ff6abe99c223381b0bace05cd4530a0258e52597/aiohttp-3.13.2-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:39d02cb6025fe1aabca329c5632f48c9532a3dabccd859e7e2f110668972331f", size = 1768625, upload-time = "2025-10-28T20:56:39.75Z" },
+ { url = "https://files.pythonhosted.org/packages/f8/49/a825b79ffec124317265ca7d2344a86bcffeb960743487cb11988ffb3494/aiohttp-3.13.2-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:e67446b19e014d37342f7195f592a2a948141d15a312fe0e700c2fd2f03124f6", size = 1867281, upload-time = "2025-10-28T20:56:41.471Z" },
+ { url = "https://files.pythonhosted.org/packages/b9/48/adf56e05f81eac31edcfae45c90928f4ad50ef2e3ea72cb8376162a368f8/aiohttp-3.13.2-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4356474ad6333e41ccefd39eae869ba15a6c5299c9c01dfdcfdd5c107be4363e", size = 1752431, upload-time = "2025-10-28T20:56:43.162Z" },
+ { url = "https://files.pythonhosted.org/packages/30/ab/593855356eead019a74e862f21523db09c27f12fd24af72dbc3555b9bfd9/aiohttp-3.13.2-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:eeacf451c99b4525f700f078becff32c32ec327b10dcf31306a8a52d78166de7", size = 1562846, upload-time = "2025-10-28T20:56:44.85Z" },
+ { url = "https://files.pythonhosted.org/packages/39/0f/9f3d32271aa8dc35036e9668e31870a9d3b9542dd6b3e2c8a30931cb27ae/aiohttp-3.13.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:d8a9b889aeabd7a4e9af0b7f4ab5ad94d42e7ff679aaec6d0db21e3b639ad58d", size = 1699606, upload-time = "2025-10-28T20:56:46.519Z" },
+ { url = "https://files.pythonhosted.org/packages/2c/3c/52d2658c5699b6ef7692a3f7128b2d2d4d9775f2a68093f74bca06cf01e1/aiohttp-3.13.2-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:fa89cb11bc71a63b69568d5b8a25c3ca25b6d54c15f907ca1c130d72f320b76b", size = 1720663, upload-time = "2025-10-28T20:56:48.528Z" },
+ { url = "https://files.pythonhosted.org/packages/9b/d4/8f8f3ff1fb7fb9e3f04fcad4e89d8a1cd8fc7d05de67e3de5b15b33008ff/aiohttp-3.13.2-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:8aa7c807df234f693fed0ecd507192fc97692e61fee5702cdc11155d2e5cadc8", size = 1737939, upload-time = "2025-10-28T20:56:50.77Z" },
+ { url = "https://files.pythonhosted.org/packages/03/d3/ddd348f8a27a634daae39a1b8e291ff19c77867af438af844bf8b7e3231b/aiohttp-3.13.2-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:9eb3e33fdbe43f88c3c75fa608c25e7c47bbd80f48d012763cb67c47f39a7e16", size = 1555132, upload-time = "2025-10-28T20:56:52.568Z" },
+ { url = "https://files.pythonhosted.org/packages/39/b8/46790692dc46218406f94374903ba47552f2f9f90dad554eed61bfb7b64c/aiohttp-3.13.2-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:9434bc0d80076138ea986833156c5a48c9c7a8abb0c96039ddbb4afc93184169", size = 1764802, upload-time = "2025-10-28T20:56:54.292Z" },
+ { url = "https://files.pythonhosted.org/packages/ba/e4/19ce547b58ab2a385e5f0b8aa3db38674785085abcf79b6e0edd1632b12f/aiohttp-3.13.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ff15c147b2ad66da1f2cbb0622313f2242d8e6e8f9b79b5206c84523a4473248", size = 1719512, upload-time = "2025-10-28T20:56:56.428Z" },
+ { url = "https://files.pythonhosted.org/packages/70/30/6355a737fed29dcb6dfdd48682d5790cb5eab050f7b4e01f49b121d3acad/aiohttp-3.13.2-cp312-cp312-win32.whl", hash = "sha256:27e569eb9d9e95dbd55c0fc3ec3a9335defbf1d8bc1d20171a49f3c4c607b93e", size = 426690, upload-time = "2025-10-28T20:56:58.736Z" },
+ { url = "https://files.pythonhosted.org/packages/0a/0d/b10ac09069973d112de6ef980c1f6bb31cb7dcd0bc363acbdad58f927873/aiohttp-3.13.2-cp312-cp312-win_amd64.whl", hash = "sha256:8709a0f05d59a71f33fd05c17fc11fcb8c30140506e13c2f5e8ee1b8964e1b45", size = 453465, upload-time = "2025-10-28T20:57:00.795Z" },
+ { url = "https://files.pythonhosted.org/packages/bf/78/7e90ca79e5aa39f9694dcfd74f4720782d3c6828113bb1f3197f7e7c4a56/aiohttp-3.13.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:7519bdc7dfc1940d201651b52bf5e03f5503bda45ad6eacf64dda98be5b2b6be", size = 732139, upload-time = "2025-10-28T20:57:02.455Z" },
+ { url = "https://files.pythonhosted.org/packages/db/ed/1f59215ab6853fbaa5c8495fa6cbc39edfc93553426152b75d82a5f32b76/aiohttp-3.13.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:088912a78b4d4f547a1f19c099d5a506df17eacec3c6f4375e2831ec1d995742", size = 490082, upload-time = "2025-10-28T20:57:04.784Z" },
+ { url = "https://files.pythonhosted.org/packages/68/7b/fe0fe0f5e05e13629d893c760465173a15ad0039c0a5b0d0040995c8075e/aiohttp-3.13.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:5276807b9de9092af38ed23ce120539ab0ac955547b38563a9ba4f5b07b95293", size = 489035, upload-time = "2025-10-28T20:57:06.894Z" },
+ { url = "https://files.pythonhosted.org/packages/d2/04/db5279e38471b7ac801d7d36a57d1230feeee130bbe2a74f72731b23c2b1/aiohttp-3.13.2-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1237c1375eaef0db4dcd7c2559f42e8af7b87ea7d295b118c60c36a6e61cb811", size = 1720387, upload-time = "2025-10-28T20:57:08.685Z" },
+ { url = "https://files.pythonhosted.org/packages/31/07/8ea4326bd7dae2bd59828f69d7fdc6e04523caa55e4a70f4a8725a7e4ed2/aiohttp-3.13.2-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:96581619c57419c3d7d78703d5b78c1e5e5fc0172d60f555bdebaced82ded19a", size = 1688314, upload-time = "2025-10-28T20:57:10.693Z" },
+ { url = "https://files.pythonhosted.org/packages/48/ab/3d98007b5b87ffd519d065225438cc3b668b2f245572a8cb53da5dd2b1bc/aiohttp-3.13.2-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a2713a95b47374169409d18103366de1050fe0ea73db358fc7a7acb2880422d4", size = 1756317, upload-time = "2025-10-28T20:57:12.563Z" },
+ { url = "https://files.pythonhosted.org/packages/97/3d/801ca172b3d857fafb7b50c7c03f91b72b867a13abca982ed6b3081774ef/aiohttp-3.13.2-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:228a1cd556b3caca590e9511a89444925da87d35219a49ab5da0c36d2d943a6a", size = 1858539, upload-time = "2025-10-28T20:57:14.623Z" },
+ { url = "https://files.pythonhosted.org/packages/f7/0d/4764669bdf47bd472899b3d3db91fffbe925c8e3038ec591a2fd2ad6a14d/aiohttp-3.13.2-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ac6cde5fba8d7d8c6ac963dbb0256a9854e9fafff52fbcc58fdf819357892c3e", size = 1739597, upload-time = "2025-10-28T20:57:16.399Z" },
+ { url = "https://files.pythonhosted.org/packages/c4/52/7bd3c6693da58ba16e657eb904a5b6decfc48ecd06e9ac098591653b1566/aiohttp-3.13.2-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:f2bef8237544f4e42878c61cef4e2839fee6346dc60f5739f876a9c50be7fcdb", size = 1555006, upload-time = "2025-10-28T20:57:18.288Z" },
+ { url = "https://files.pythonhosted.org/packages/48/30/9586667acec5993b6f41d2ebcf96e97a1255a85f62f3c653110a5de4d346/aiohttp-3.13.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:16f15a4eac3bc2d76c45f7ebdd48a65d41b242eb6c31c2245463b40b34584ded", size = 1683220, upload-time = "2025-10-28T20:57:20.241Z" },
+ { url = "https://files.pythonhosted.org/packages/71/01/3afe4c96854cfd7b30d78333852e8e851dceaec1c40fd00fec90c6402dd2/aiohttp-3.13.2-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:bb7fb776645af5cc58ab804c58d7eba545a97e047254a52ce89c157b5af6cd0b", size = 1712570, upload-time = "2025-10-28T20:57:22.253Z" },
+ { url = "https://files.pythonhosted.org/packages/11/2c/22799d8e720f4697a9e66fd9c02479e40a49de3de2f0bbe7f9f78a987808/aiohttp-3.13.2-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:e1b4951125ec10c70802f2cb09736c895861cd39fd9dcb35107b4dc8ae6220b8", size = 1733407, upload-time = "2025-10-28T20:57:24.37Z" },
+ { url = "https://files.pythonhosted.org/packages/34/cb/90f15dd029f07cebbd91f8238a8b363978b530cd128488085b5703683594/aiohttp-3.13.2-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:550bf765101ae721ee1d37d8095f47b1f220650f85fe1af37a90ce75bab89d04", size = 1550093, upload-time = "2025-10-28T20:57:26.257Z" },
+ { url = "https://files.pythonhosted.org/packages/69/46/12dce9be9d3303ecbf4d30ad45a7683dc63d90733c2d9fe512be6716cd40/aiohttp-3.13.2-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:fe91b87fc295973096251e2d25a811388e7d8adf3bd2b97ef6ae78bc4ac6c476", size = 1758084, upload-time = "2025-10-28T20:57:28.349Z" },
+ { url = "https://files.pythonhosted.org/packages/f9/c8/0932b558da0c302ffd639fc6362a313b98fdf235dc417bc2493da8394df7/aiohttp-3.13.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:e0c8e31cfcc4592cb200160344b2fb6ae0f9e4effe06c644b5a125d4ae5ebe23", size = 1716987, upload-time = "2025-10-28T20:57:30.233Z" },
+ { url = "https://files.pythonhosted.org/packages/5d/8b/f5bd1a75003daed099baec373aed678f2e9b34f2ad40d85baa1368556396/aiohttp-3.13.2-cp313-cp313-win32.whl", hash = "sha256:0740f31a60848d6edb296a0df827473eede90c689b8f9f2a4cdde74889eb2254", size = 425859, upload-time = "2025-10-28T20:57:32.105Z" },
+ { url = "https://files.pythonhosted.org/packages/5d/28/a8a9fc6957b2cee8902414e41816b5ab5536ecf43c3b1843c10e82c559b2/aiohttp-3.13.2-cp313-cp313-win_amd64.whl", hash = "sha256:a88d13e7ca367394908f8a276b89d04a3652044612b9a408a0bb22a5ed976a1a", size = 452192, upload-time = "2025-10-28T20:57:34.166Z" },
+ { url = "https://files.pythonhosted.org/packages/9b/36/e2abae1bd815f01c957cbf7be817b3043304e1c87bad526292a0410fdcf9/aiohttp-3.13.2-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:2475391c29230e063ef53a66669b7b691c9bfc3f1426a0f7bcdf1216bdbac38b", size = 735234, upload-time = "2025-10-28T20:57:36.415Z" },
+ { url = "https://files.pythonhosted.org/packages/ca/e3/1ee62dde9b335e4ed41db6bba02613295a0d5b41f74a783c142745a12763/aiohttp-3.13.2-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:f33c8748abef4d8717bb20e8fb1b3e07c6adacb7fd6beaae971a764cf5f30d61", size = 490733, upload-time = "2025-10-28T20:57:38.205Z" },
+ { url = "https://files.pythonhosted.org/packages/1a/aa/7a451b1d6a04e8d15a362af3e9b897de71d86feac3babf8894545d08d537/aiohttp-3.13.2-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:ae32f24bbfb7dbb485a24b30b1149e2f200be94777232aeadba3eecece4d0aa4", size = 491303, upload-time = "2025-10-28T20:57:40.122Z" },
+ { url = "https://files.pythonhosted.org/packages/57/1e/209958dbb9b01174870f6a7538cd1f3f28274fdbc88a750c238e2c456295/aiohttp-3.13.2-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5d7f02042c1f009ffb70067326ef183a047425bb2ff3bc434ead4dd4a4a66a2b", size = 1717965, upload-time = "2025-10-28T20:57:42.28Z" },
+ { url = "https://files.pythonhosted.org/packages/08/aa/6a01848d6432f241416bc4866cae8dc03f05a5a884d2311280f6a09c73d6/aiohttp-3.13.2-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:93655083005d71cd6c072cdab54c886e6570ad2c4592139c3fb967bfc19e4694", size = 1667221, upload-time = "2025-10-28T20:57:44.869Z" },
+ { url = "https://files.pythonhosted.org/packages/87/4f/36c1992432d31bbc789fa0b93c768d2e9047ec8c7177e5cd84ea85155f36/aiohttp-3.13.2-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:0db1e24b852f5f664cd728db140cf11ea0e82450471232a394b3d1a540b0f906", size = 1757178, upload-time = "2025-10-28T20:57:47.216Z" },
+ { url = "https://files.pythonhosted.org/packages/ac/b4/8e940dfb03b7e0f68a82b88fd182b9be0a65cb3f35612fe38c038c3112cf/aiohttp-3.13.2-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:b009194665bcd128e23eaddef362e745601afa4641930848af4c8559e88f18f9", size = 1838001, upload-time = "2025-10-28T20:57:49.337Z" },
+ { url = "https://files.pythonhosted.org/packages/d7/ef/39f3448795499c440ab66084a9db7d20ca7662e94305f175a80f5b7e0072/aiohttp-3.13.2-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c038a8fdc8103cd51dbd986ecdce141473ffd9775a7a8057a6ed9c3653478011", size = 1716325, upload-time = "2025-10-28T20:57:51.327Z" },
+ { url = "https://files.pythonhosted.org/packages/d7/51/b311500ffc860b181c05d91c59a1313bdd05c82960fdd4035a15740d431e/aiohttp-3.13.2-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:66bac29b95a00db411cd758fea0e4b9bdba6d549dfe333f9a945430f5f2cc5a6", size = 1547978, upload-time = "2025-10-28T20:57:53.554Z" },
+ { url = "https://files.pythonhosted.org/packages/31/64/b9d733296ef79815226dab8c586ff9e3df41c6aff2e16c06697b2d2e6775/aiohttp-3.13.2-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:4ebf9cfc9ba24a74cf0718f04aac2a3bbe745902cc7c5ebc55c0f3b5777ef213", size = 1682042, upload-time = "2025-10-28T20:57:55.617Z" },
+ { url = "https://files.pythonhosted.org/packages/3f/30/43d3e0f9d6473a6db7d472104c4eff4417b1e9df01774cb930338806d36b/aiohttp-3.13.2-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:a4b88ebe35ce54205c7074f7302bd08a4cb83256a3e0870c72d6f68a3aaf8e49", size = 1680085, upload-time = "2025-10-28T20:57:57.59Z" },
+ { url = "https://files.pythonhosted.org/packages/16/51/c709f352c911b1864cfd1087577760ced64b3e5bee2aa88b8c0c8e2e4972/aiohttp-3.13.2-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:98c4fb90bb82b70a4ed79ca35f656f4281885be076f3f970ce315402b53099ae", size = 1728238, upload-time = "2025-10-28T20:57:59.525Z" },
+ { url = "https://files.pythonhosted.org/packages/19/e2/19bd4c547092b773caeb48ff5ae4b1ae86756a0ee76c16727fcfd281404b/aiohttp-3.13.2-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:ec7534e63ae0f3759df3a1ed4fa6bc8f75082a924b590619c0dd2f76d7043caa", size = 1544395, upload-time = "2025-10-28T20:58:01.914Z" },
+ { url = "https://files.pythonhosted.org/packages/cf/87/860f2803b27dfc5ed7be532832a3498e4919da61299b4a1f8eb89b8ff44d/aiohttp-3.13.2-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:5b927cf9b935a13e33644cbed6c8c4b2d0f25b713d838743f8fe7191b33829c4", size = 1742965, upload-time = "2025-10-28T20:58:03.972Z" },
+ { url = "https://files.pythonhosted.org/packages/67/7f/db2fc7618925e8c7a601094d5cbe539f732df4fb570740be88ed9e40e99a/aiohttp-3.13.2-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:88d6c017966a78c5265d996c19cdb79235be5e6412268d7e2ce7dee339471b7a", size = 1697585, upload-time = "2025-10-28T20:58:06.189Z" },
+ { url = "https://files.pythonhosted.org/packages/0c/07/9127916cb09bb38284db5036036042b7b2c514c8ebaeee79da550c43a6d6/aiohttp-3.13.2-cp314-cp314-win32.whl", hash = "sha256:f7c183e786e299b5d6c49fb43a769f8eb8e04a2726a2bd5887b98b5cc2d67940", size = 431621, upload-time = "2025-10-28T20:58:08.636Z" },
+ { url = "https://files.pythonhosted.org/packages/fb/41/554a8a380df6d3a2bba8a7726429a23f4ac62aaf38de43bb6d6cde7b4d4d/aiohttp-3.13.2-cp314-cp314-win_amd64.whl", hash = "sha256:fe242cd381e0fb65758faf5ad96c2e460df6ee5b2de1072fe97e4127927e00b4", size = 457627, upload-time = "2025-10-28T20:58:11Z" },
+ { url = "https://files.pythonhosted.org/packages/c7/8e/3824ef98c039d3951cb65b9205a96dd2b20f22241ee17d89c5701557c826/aiohttp-3.13.2-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:f10d9c0b0188fe85398c61147bbd2a657d616c876863bfeff43376e0e3134673", size = 767360, upload-time = "2025-10-28T20:58:13.358Z" },
+ { url = "https://files.pythonhosted.org/packages/a4/0f/6a03e3fc7595421274fa34122c973bde2d89344f8a881b728fa8c774e4f1/aiohttp-3.13.2-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:e7c952aefdf2460f4ae55c5e9c3e80aa72f706a6317e06020f80e96253b1accd", size = 504616, upload-time = "2025-10-28T20:58:15.339Z" },
+ { url = "https://files.pythonhosted.org/packages/c6/aa/ed341b670f1bc8a6f2c6a718353d13b9546e2cef3544f573c6a1ff0da711/aiohttp-3.13.2-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:c20423ce14771d98353d2e25e83591fa75dfa90a3c1848f3d7c68243b4fbded3", size = 509131, upload-time = "2025-10-28T20:58:17.693Z" },
+ { url = "https://files.pythonhosted.org/packages/7f/f0/c68dac234189dae5c4bbccc0f96ce0cc16b76632cfc3a08fff180045cfa4/aiohttp-3.13.2-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e96eb1a34396e9430c19d8338d2ec33015e4a87ef2b4449db94c22412e25ccdf", size = 1864168, upload-time = "2025-10-28T20:58:20.113Z" },
+ { url = "https://files.pythonhosted.org/packages/8f/65/75a9a76db8364b5d0e52a0c20eabc5d52297385d9af9c35335b924fafdee/aiohttp-3.13.2-cp314-cp314t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:23fb0783bc1a33640036465019d3bba069942616a6a2353c6907d7fe1ccdaf4e", size = 1719200, upload-time = "2025-10-28T20:58:22.583Z" },
+ { url = "https://files.pythonhosted.org/packages/f5/55/8df2ed78d7f41d232f6bd3ff866b6f617026551aa1d07e2f03458f964575/aiohttp-3.13.2-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:2e1a9bea6244a1d05a4e57c295d69e159a5c50d8ef16aa390948ee873478d9a5", size = 1843497, upload-time = "2025-10-28T20:58:24.672Z" },
+ { url = "https://files.pythonhosted.org/packages/e9/e0/94d7215e405c5a02ccb6a35c7a3a6cfff242f457a00196496935f700cde5/aiohttp-3.13.2-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:0a3d54e822688b56e9f6b5816fb3de3a3a64660efac64e4c2dc435230ad23bad", size = 1935703, upload-time = "2025-10-28T20:58:26.758Z" },
+ { url = "https://files.pythonhosted.org/packages/0b/78/1eeb63c3f9b2d1015a4c02788fb543141aad0a03ae3f7a7b669b2483f8d4/aiohttp-3.13.2-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:7a653d872afe9f33497215745da7a943d1dc15b728a9c8da1c3ac423af35178e", size = 1792738, upload-time = "2025-10-28T20:58:29.787Z" },
+ { url = "https://files.pythonhosted.org/packages/41/75/aaf1eea4c188e51538c04cc568040e3082db263a57086ea74a7d38c39e42/aiohttp-3.13.2-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:56d36e80d2003fa3fc0207fac644216d8532e9504a785ef9a8fd013f84a42c61", size = 1624061, upload-time = "2025-10-28T20:58:32.529Z" },
+ { url = "https://files.pythonhosted.org/packages/9b/c2/3b6034de81fbcc43de8aeb209073a2286dfb50b86e927b4efd81cf848197/aiohttp-3.13.2-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:78cd586d8331fb8e241c2dd6b2f4061778cc69e150514b39a9e28dd050475661", size = 1789201, upload-time = "2025-10-28T20:58:34.618Z" },
+ { url = "https://files.pythonhosted.org/packages/c9/38/c15dcf6d4d890217dae79d7213988f4e5fe6183d43893a9cf2fe9e84ca8d/aiohttp-3.13.2-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:20b10bbfbff766294fe99987f7bb3b74fdd2f1a2905f2562132641ad434dcf98", size = 1776868, upload-time = "2025-10-28T20:58:38.835Z" },
+ { url = "https://files.pythonhosted.org/packages/04/75/f74fd178ac81adf4f283a74847807ade5150e48feda6aef024403716c30c/aiohttp-3.13.2-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:9ec49dff7e2b3c85cdeaa412e9d438f0ecd71676fde61ec57027dd392f00c693", size = 1790660, upload-time = "2025-10-28T20:58:41.507Z" },
+ { url = "https://files.pythonhosted.org/packages/e7/80/7368bd0d06b16b3aba358c16b919e9c46cf11587dc572091031b0e9e3ef0/aiohttp-3.13.2-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:94f05348c4406450f9d73d38efb41d669ad6cd90c7ee194810d0eefbfa875a7a", size = 1617548, upload-time = "2025-10-28T20:58:43.674Z" },
+ { url = "https://files.pythonhosted.org/packages/7d/4b/a6212790c50483cb3212e507378fbe26b5086d73941e1ec4b56a30439688/aiohttp-3.13.2-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:fa4dcb605c6f82a80c7f95713c2b11c3b8e9893b3ebd2bc9bde93165ed6107be", size = 1817240, upload-time = "2025-10-28T20:58:45.787Z" },
+ { url = "https://files.pythonhosted.org/packages/ff/f7/ba5f0ba4ea8d8f3c32850912944532b933acbf0f3a75546b89269b9b7dde/aiohttp-3.13.2-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:cf00e5db968c3f67eccd2778574cf64d8b27d95b237770aa32400bd7a1ca4f6c", size = 1762334, upload-time = "2025-10-28T20:58:47.936Z" },
+ { url = "https://files.pythonhosted.org/packages/7e/83/1a5a1856574588b1cad63609ea9ad75b32a8353ac995d830bf5da9357364/aiohttp-3.13.2-cp314-cp314t-win32.whl", hash = "sha256:d23b5fe492b0805a50d3371e8a728a9134d8de5447dce4c885f5587294750734", size = 464685, upload-time = "2025-10-28T20:58:50.642Z" },
+ { url = "https://files.pythonhosted.org/packages/9f/4d/d22668674122c08f4d56972297c51a624e64b3ed1efaa40187607a7cb66e/aiohttp-3.13.2-cp314-cp314t-win_amd64.whl", hash = "sha256:ff0a7b0a82a7ab905cbda74006318d1b12e37c797eb1b0d4eb3e316cf47f658f", size = 498093, upload-time = "2025-10-28T20:58:52.782Z" },
+]
+
+[[package]]
+name = "aiosignal"
+version = "1.4.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "frozenlist" },
+ { name = "typing-extensions", marker = "python_full_version < '3.13'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/61/62/06741b579156360248d1ec624842ad0edf697050bbaf7c3e46394e106ad1/aiosignal-1.4.0.tar.gz", hash = "sha256:f47eecd9468083c2029cc99945502cb7708b082c232f9aca65da147157b251c7", size = 25007, upload-time = "2025-07-03T22:54:43.528Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/fb/76/641ae371508676492379f16e2fa48f4e2c11741bd63c48be4b12a6b09cba/aiosignal-1.4.0-py3-none-any.whl", hash = "sha256:053243f8b92b990551949e63930a839ff0cf0b0ebbe0597b0f3fb19e1a0fe82e", size = 7490, upload-time = "2025-07-03T22:54:42.156Z" },
+]
+
+[[package]]
+name = "annotated-types"
+version = "0.7.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/ee/67/531ea369ba64dcff5ec9c3402f9f51bf748cec26dde048a2f973a4eea7f5/annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89", size = 16081, upload-time = "2024-05-20T21:33:25.928Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/78/b6/6307fbef88d9b5ee7421e68d78a9f162e0da4900bc5f5793f6d3d0e34fb8/annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53", size = 13643, upload-time = "2024-05-20T21:33:24.1Z" },
+]
+
+[[package]]
+name = "anyio"
+version = "4.11.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "exceptiongroup", marker = "python_full_version < '3.11'" },
+ { name = "idna" },
+ { name = "sniffio" },
+ { name = "typing-extensions", marker = "python_full_version < '3.13'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/c6/78/7d432127c41b50bccba979505f272c16cbcadcc33645d5fa3a738110ae75/anyio-4.11.0.tar.gz", hash = "sha256:82a8d0b81e318cc5ce71a5f1f8b5c4e63619620b63141ef8c995fa0db95a57c4", size = 219094, upload-time = "2025-09-23T09:19:12.58Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/15/b3/9b1a8074496371342ec1e796a96f99c82c945a339cd81a8e73de28b4cf9e/anyio-4.11.0-py3-none-any.whl", hash = "sha256:0287e96f4d26d4149305414d4e3bc32f0dcd0862365a4bddea19d7a1ec38c4fc", size = 109097, upload-time = "2025-09-23T09:19:10.601Z" },
+]
+
+[[package]]
+name = "async-timeout"
+version = "5.0.1"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/a5/ae/136395dfbfe00dfc94da3f3e136d0b13f394cba8f4841120e34226265780/async_timeout-5.0.1.tar.gz", hash = "sha256:d9321a7a3d5a6a5e187e824d2fa0793ce379a202935782d555d6e9d2735677d3", size = 9274, upload-time = "2024-11-06T16:41:39.6Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/fe/ba/e2081de779ca30d473f21f5b30e0e737c438205440784c7dfc81efc2b029/async_timeout-5.0.1-py3-none-any.whl", hash = "sha256:39e3809566ff85354557ec2398b55e096c8364bacac9405a7a1fa429e77fe76c", size = 6233, upload-time = "2024-11-06T16:41:37.9Z" },
+]
+
+[[package]]
+name = "attrs"
+version = "25.4.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/6b/5c/685e6633917e101e5dcb62b9dd76946cbb57c26e133bae9e0cd36033c0a9/attrs-25.4.0.tar.gz", hash = "sha256:16d5969b87f0859ef33a48b35d55ac1be6e42ae49d5e853b597db70c35c57e11", size = 934251, upload-time = "2025-10-06T13:54:44.725Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/3a/2a/7cc015f5b9f5db42b7d48157e23356022889fc354a2813c15934b7cb5c0e/attrs-25.4.0-py3-none-any.whl", hash = "sha256:adcf7e2a1fb3b36ac48d97835bb6d8ade15b8dcce26aba8bf1d14847b57a3373", size = 67615, upload-time = "2025-10-06T13:54:43.17Z" },
+]
+
+[[package]]
+name = "bert-score"
+version = "0.3.13"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "matplotlib" },
+ { name = "numpy", version = "2.2.6", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.11'" },
+ { name = "numpy", version = "2.3.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.11'" },
+ { name = "packaging" },
+ { name = "pandas" },
+ { name = "requests" },
+ { name = "torch" },
+ { name = "tqdm" },
+ { name = "transformers" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/1c/93/2c97a85cbb66a8a256a13176e11c9c4508074e2341299fe75ee955c81eff/bert_score-0.3.13.tar.gz", hash = "sha256:8ffe5838eac8cdd988b8b1a896af7f49071188c8c011a1ed160d71a9899a2ba4", size = 48621, upload-time = "2023-02-20T21:07:29.477Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/c6/8c/bc5457de4c004b1a623b31f7bc8d0375fb699b7d67df11879098b4b7b7c8/bert_score-0.3.13-py3-none-any.whl", hash = "sha256:bbbb4c7fcdaa46d7681aff49f37f96faa09ed74e1b150e659bdc6b58a66989b9", size = 61135, upload-time = "2023-02-20T21:07:27.226Z" },
+]
+
+[[package]]
+name = "blis"
+version = "1.3.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "numpy", version = "2.2.6", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.11'" },
+ { name = "numpy", version = "2.3.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.11'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/96/f3/7c5a47a0d5ec0362bab29fd4f497b4b1975473bf30b7a02bc9c0b0e84f7a/blis-1.3.0.tar.gz", hash = "sha256:1695a87e3fc4c20d9b9140f5238cac0514c411b750e8cdcec5d8320c71f62e99", size = 2510328, upload-time = "2025-04-03T15:09:47.767Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/fc/95/9221d2e7b2940ff7de87c84c6ac7a8dedfc24f703f0fb9c71b049a6e414f/blis-1.3.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:03c5d2d59415c58ec60e16a0d35d6516a50dae8f17963445845fd961530fcfb0", size = 6973671, upload-time = "2025-04-03T15:08:36.838Z" },
+ { url = "https://files.pythonhosted.org/packages/17/96/51608bc2ef3bf7ebcb81905626ab2d08c620fd02b70cecb14174b6e64c98/blis-1.3.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d1b5c7e7b337e4b0b4887d4837c25e787a940c38d691c6b2936baebf1d008f1b", size = 1280540, upload-time = "2025-04-03T15:08:38.749Z" },
+ { url = "https://files.pythonhosted.org/packages/b2/f1/70ef665581e672be4678237598bc281098e90c45c2659e447007a5964b13/blis-1.3.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f446f853e755e71e7abb9b23ad25fe36f7e3dc6a88ba3e071a06dedd029fb5dc", size = 2983851, upload-time = "2025-04-03T15:08:40.281Z" },
+ { url = "https://files.pythonhosted.org/packages/13/63/86e04159482d6b42692d95ac545e2dddff6d6c263a82dfc5358c1a712800/blis-1.3.0-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7c9448cd77af47afbecaf0267168016b76298553cc46e51c1c00c22256df21c7", size = 3187729, upload-time = "2025-04-03T15:08:41.849Z" },
+ { url = "https://files.pythonhosted.org/packages/52/b1/be8346c859967d09a8d5bc61c06131885e0124eb84c8cec599c509beb5c4/blis-1.3.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:eb2571616da1dfa4a927f2952ae90afc7b061f287da47a0a1bd8318c3a53e178", size = 11531202, upload-time = "2025-04-03T15:08:44.045Z" },
+ { url = "https://files.pythonhosted.org/packages/a2/be/6da6e1ae7562cf53852cc05ff938468dc03a96ef9e753a48b0bce01a372d/blis-1.3.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:9995848456a3684a81585e1d19e7315023614cff9e52ae292129ad600117d7d9", size = 2989619, upload-time = "2025-04-03T15:08:46.076Z" },
+ { url = "https://files.pythonhosted.org/packages/dd/54/9ae34552e894765e05d8508b37575f0e26cb70d07a67971258869ae6dbf4/blis-1.3.0-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:520a21fea2355bce4a103893b13c581ecb7034547d4d71d22f7033419c6ace75", size = 4226545, upload-time = "2025-04-03T15:08:47.532Z" },
+ { url = "https://files.pythonhosted.org/packages/60/9e/bfbf3c6b68ae9dbbc49164aa49da8421afa223390f461f7fbf528740757d/blis-1.3.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:5cb979397cb69ecffe7a67614dd044de0c43486348e1591d1cf77f425c1eb7bd", size = 14690321, upload-time = "2025-04-03T15:08:49.649Z" },
+ { url = "https://files.pythonhosted.org/packages/d1/a3/f4f3327d0b3b11e8a6f5ad0d522c9c9275db59038ec605f5e6bccf3d3817/blis-1.3.0-cp310-cp310-win_amd64.whl", hash = "sha256:2cbc7b6997be35d94e004587eaf211ca187e4013f9a2df0bb949f3dfba18c68c", size = 6248962, upload-time = "2025-04-03T15:08:51.94Z" },
+ { url = "https://files.pythonhosted.org/packages/64/a1/ea38adca95fbea0835fd09fd7e1a5fd4d15e723645108360fce8e860e961/blis-1.3.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:456833a6006dce2165d68e1ab0aa7678608a9a99a18aa37af7aa0437c972f7f6", size = 6976242, upload-time = "2025-04-03T15:08:53.473Z" },
+ { url = "https://files.pythonhosted.org/packages/c1/13/a3b66fd57c75343a5b2e6323cd8f73bdd2e9b328deba7cf676ec334ec754/blis-1.3.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:8072fbb03505444c818810536ad77616a18d97bbde06e8ec69755d917abb7f31", size = 1281504, upload-time = "2025-04-03T15:08:54.934Z" },
+ { url = "https://files.pythonhosted.org/packages/3b/a1/22d728aac953c1293d9d9ba119f467233c8991cb4ecb00689970bf6c2449/blis-1.3.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:594c2332bcb1a0fdacb5e857a1afaf338d52c05ba24710515cddbf25862787ac", size = 3101280, upload-time = "2025-04-03T15:08:56.35Z" },
+ { url = "https://files.pythonhosted.org/packages/e0/8b/40301bfa2dab268c4a52735d830939a26ef2e1d6d5ce5add4d3c4a9ba276/blis-1.3.0-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2cf336a810bd0e6ab52e8ba5455c42ff02f6216acb196ffc831cd30ab084127e", size = 3316521, upload-time = "2025-04-03T15:08:59.852Z" },
+ { url = "https://files.pythonhosted.org/packages/da/77/6fbd4d9b923f3914c589d38a19dfc8fd45f54296aef75aba908a7d176871/blis-1.3.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cad91ae2c8a11286b32e80ac7e579d7028f8c0a22afa1e817edddc18051f05b2", size = 11650028, upload-time = "2025-04-03T15:09:02.009Z" },
+ { url = "https://files.pythonhosted.org/packages/d7/24/336d40ed5b4ca33f098eb6e753814526279837069b7770db7bd25fcba9a7/blis-1.3.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:1bf4267616fb97a3b869cc8d278383faa86882dc8330067421f9bf9c06e6b80c", size = 3115887, upload-time = "2025-04-03T15:09:03.987Z" },
+ { url = "https://files.pythonhosted.org/packages/f8/ee/a69b3322b0659705c5e2aeec3bbbd474eb37d028fd58fd32795cfc5cbf84/blis-1.3.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:45c6f6e801c712592f487f4021c9a85079d6ff8fc487f3d8202212edd4900f8e", size = 4348881, upload-time = "2025-04-03T15:09:05.976Z" },
+ { url = "https://files.pythonhosted.org/packages/95/c9/774812eac52a11be854f0d41afdade2ac1ce1be0b749aec63c3816b57b7d/blis-1.3.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:570113bc81bce8890fa2c067a30f6e6caa82bb3be7de0926d659e986e40f5509", size = 14840892, upload-time = "2025-04-03T15:09:08.439Z" },
+ { url = "https://files.pythonhosted.org/packages/35/3a/f9414cf9b2c43aad87e8687ad2cdb0e66e996c20288584621a12725e83dd/blis-1.3.0-cp311-cp311-win_amd64.whl", hash = "sha256:75ecaa548589cba2ba75e621e2a8b89888e3f326ef1a27e7a9b1713114467ff2", size = 6232289, upload-time = "2025-04-03T15:09:11.029Z" },
+ { url = "https://files.pythonhosted.org/packages/cb/3f/67140d6588e600577f92d2c938e9492a8cd0706bab770978ee84ecb86e70/blis-1.3.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:ef188f1f914d52acbbd75993ba25554e381ec9099758b340cd0da41af94ae8ae", size = 6988854, upload-time = "2025-04-03T15:09:13.203Z" },
+ { url = "https://files.pythonhosted.org/packages/d1/05/30587d1b168fa27d1bf6869a1be4bcb3f10493f836381a033aa9c7a10ab8/blis-1.3.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:626f84522faa51d5a52f9820551a84a5e02490bf6d1abdfc8d27934a0ff939de", size = 1282465, upload-time = "2025-04-03T15:09:15.081Z" },
+ { url = "https://files.pythonhosted.org/packages/35/13/60d2dd0443a7a56a0a160d873444e4b9189bb2939d93457864432ee18c90/blis-1.3.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f56e0454ce44bc08797383ce427ee5e2b044aab1eafb450eab82e86f8bfac853", size = 3061088, upload-time = "2025-04-03T15:09:16.535Z" },
+ { url = "https://files.pythonhosted.org/packages/2f/30/4909baf57c3cd48414c284e4fced42157c4768f83bf6c95b0bb446192b45/blis-1.3.0-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c9bb5770efe233374d73a567af5cdef24f48bead83d118bdb9bd5c2187b0f010", size = 3259127, upload-time = "2025-04-03T15:09:18.528Z" },
+ { url = "https://files.pythonhosted.org/packages/bb/bf/625121119107d3beafe96eb776b00a472f0210c07d07b1ed160ab7db292a/blis-1.3.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d52ce33a1895d82f2f39f7689d5e70b06ebba6bc6f610046ecd81db88d650aac", size = 11619003, upload-time = "2025-04-03T15:09:20.139Z" },
+ { url = "https://files.pythonhosted.org/packages/81/92/0bad7a4c29c7a1ab10db27b04babec7ca4a3f504543ef2d1f985fb84c41a/blis-1.3.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:6c78e8dd420e0e695df0ceecf950f3cf823e0a1b8c2871a7e35117c744d45861", size = 3062135, upload-time = "2025-04-03T15:09:22.142Z" },
+ { url = "https://files.pythonhosted.org/packages/35/b5/ea9b4f6b75c9dce24ce0d6fa15d5eaab54b115a57967d504e460db901c59/blis-1.3.0-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:7a060700ee98ea44a1b9833b16d3dd1375aaa9d3230222bfc5f13c4664e5710e", size = 4298755, upload-time = "2025-04-03T15:09:24.064Z" },
+ { url = "https://files.pythonhosted.org/packages/e5/c5/9b7383752cdc4ca92359c161b1086bd158b4f3cda5813a390ff9c8c1b892/blis-1.3.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:250f0b0aeca0fdde7117751a54ae6d6b6818a446a619f3c0c63f3deb77f700a8", size = 14785385, upload-time = "2025-04-03T15:09:25.74Z" },
+ { url = "https://files.pythonhosted.org/packages/0c/92/6bb1940a491ce9d3ec52372bc35988bec779b16ace7e87287d981df31eeb/blis-1.3.0-cp312-cp312-win_amd64.whl", hash = "sha256:2e6f468467a18a7c2ac2e411643f5cfa45a435701e2c04ad4aa46bb02fc3aa5c", size = 6260208, upload-time = "2025-04-03T15:09:28.207Z" },
+ { url = "https://files.pythonhosted.org/packages/91/ec/2b1e366e7b4e3cdb052a4eeba33cc6a3e25fe20566f3062dbe59a8dd7f78/blis-1.3.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:4d6a91c8726d0bc3345a8e0c8b7b8e800bee0b9acc4c2a0dbeb782b8b651f824", size = 6985730, upload-time = "2025-04-03T15:09:29.884Z" },
+ { url = "https://files.pythonhosted.org/packages/f1/8b/a3374a970e1ae6138b2ec6bffeb1018068c5f0dbf2b12dd8ab16a47ae4a0/blis-1.3.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:e3c20bc3d7143383195cc472373fb301d3bafbacd8ab8f3bffc27c68bef45d81", size = 1280751, upload-time = "2025-04-03T15:09:32.007Z" },
+ { url = "https://files.pythonhosted.org/packages/53/97/83cc91c451709c85650714df3464024bf37ef791be1e0fae0d2a0f945da6/blis-1.3.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:778c4b84c6eccab223d8afe20727820f6c7dd7a010c3bfb262104cc83b0a8e4c", size = 3047726, upload-time = "2025-04-03T15:09:33.521Z" },
+ { url = "https://files.pythonhosted.org/packages/ae/21/fbf9b45d6af91c5ce32df4007886c0332b977558cba34b0bc00b98ebc188/blis-1.3.0-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:69584589977366366cd99cc7cb23a76a814df8bcae8b777fde4a94e8684c1fb8", size = 3249935, upload-time = "2025-04-03T15:09:36.264Z" },
+ { url = "https://files.pythonhosted.org/packages/ee/b1/5716b8cd784c0a0d08f9b3773c8eb4c37f5f9ed3a9f6ef961373e123b1cf/blis-1.3.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3b2adc4549e610b59e8db5a57ab7206e4ac1502ac5b261ed0e6de42d3fb311d5", size = 11614296, upload-time = "2025-04-03T15:09:38.342Z" },
+ { url = "https://files.pythonhosted.org/packages/36/0f/e2ed2642cf41dcae3431cfbcd94543646adba46eaa2736ac27647216e4f7/blis-1.3.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:9aaa84df638e0bb7909a35e3c220168df2b90f267967b3004a88f57b49fbe4ec", size = 3063082, upload-time = "2025-04-03T15:09:40.329Z" },
+ { url = "https://files.pythonhosted.org/packages/cb/f0/627a36b99a9cd9af73be7bb451d6884d5b4aece297eb29b9fc13e70c1f2b/blis-1.3.0-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:0da7b54331bed31aa55839da2d0e5451447e1f5e8a9367cce7ff1fb27498a22a", size = 4290919, upload-time = "2025-04-03T15:09:41.845Z" },
+ { url = "https://files.pythonhosted.org/packages/5b/f9/a415707185a82082b96ab857e5c3b7a59b0ad73ed04ace1cbb64835c3432/blis-1.3.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:682175bf2d047129b3715e3f1305c6b23a45e2ce24c4b1d0fa2eb03eb877edd4", size = 14795975, upload-time = "2025-04-03T15:09:43.611Z" },
+ { url = "https://files.pythonhosted.org/packages/16/f1/8cc8118946dbb9cbd74f406d30d31ee8d2f723f6fb4c8245e2bc67175fd4/blis-1.3.0-cp313-cp313-win_amd64.whl", hash = "sha256:91de2baf03da3a173cf62771f1d6b9236a27a8cbd0e0033be198f06ef6224986", size = 6258624, upload-time = "2025-04-03T15:09:46.056Z" },
+]
+
+[[package]]
+name = "catalogue"
+version = "2.0.10"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/38/b4/244d58127e1cdf04cf2dc7d9566f0d24ef01d5ce21811bab088ecc62b5ea/catalogue-2.0.10.tar.gz", hash = "sha256:4f56daa940913d3f09d589c191c74e5a6d51762b3a9e37dd53b7437afd6cda15", size = 19561, upload-time = "2023-09-25T06:29:24.962Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/9e/96/d32b941a501ab566a16358d68b6eb4e4acc373fab3c3c4d7d9e649f7b4bb/catalogue-2.0.10-py3-none-any.whl", hash = "sha256:58c2de0020aa90f4a2da7dfad161bf7b3b054c86a5f09fcedc0b2b740c109a9f", size = 17325, upload-time = "2023-09-25T06:29:23.337Z" },
+]
+
+[[package]]
+name = "certifi"
+version = "2025.10.5"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/4c/5b/b6ce21586237c77ce67d01dc5507039d444b630dd76611bbca2d8e5dcd91/certifi-2025.10.5.tar.gz", hash = "sha256:47c09d31ccf2acf0be3f701ea53595ee7e0b8fa08801c6624be771df09ae7b43", size = 164519, upload-time = "2025-10-05T04:12:15.808Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/e4/37/af0d2ef3967ac0d6113837b44a4f0bfe1328c2b9763bd5b1744520e5cfed/certifi-2025.10.5-py3-none-any.whl", hash = "sha256:0f212c2744a9bb6de0c56639a6f68afe01ecd92d91f14ae897c4fe7bbeeef0de", size = 163286, upload-time = "2025-10-05T04:12:14.03Z" },
+]
+
+[[package]]
+name = "charset-normalizer"
+version = "3.4.4"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/13/69/33ddede1939fdd074bce5434295f38fae7136463422fe4fd3e0e89b98062/charset_normalizer-3.4.4.tar.gz", hash = "sha256:94537985111c35f28720e43603b8e7b43a6ecfb2ce1d3058bbe955b73404e21a", size = 129418, upload-time = "2025-10-14T04:42:32.879Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/1f/b8/6d51fc1d52cbd52cd4ccedd5b5b2f0f6a11bbf6765c782298b0f3e808541/charset_normalizer-3.4.4-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:e824f1492727fa856dd6eda4f7cee25f8518a12f3c4a56a74e8095695089cf6d", size = 209709, upload-time = "2025-10-14T04:40:11.385Z" },
+ { url = "https://files.pythonhosted.org/packages/5c/af/1f9d7f7faafe2ddfb6f72a2e07a548a629c61ad510fe60f9630309908fef/charset_normalizer-3.4.4-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4bd5d4137d500351a30687c2d3971758aac9a19208fc110ccb9d7188fbe709e8", size = 148814, upload-time = "2025-10-14T04:40:13.135Z" },
+ { url = "https://files.pythonhosted.org/packages/79/3d/f2e3ac2bbc056ca0c204298ea4e3d9db9b4afe437812638759db2c976b5f/charset_normalizer-3.4.4-cp310-cp310-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:027f6de494925c0ab2a55eab46ae5129951638a49a34d87f4c3eda90f696b4ad", size = 144467, upload-time = "2025-10-14T04:40:14.728Z" },
+ { url = "https://files.pythonhosted.org/packages/ec/85/1bf997003815e60d57de7bd972c57dc6950446a3e4ccac43bc3070721856/charset_normalizer-3.4.4-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:f820802628d2694cb7e56db99213f930856014862f3fd943d290ea8438d07ca8", size = 162280, upload-time = "2025-10-14T04:40:16.14Z" },
+ { url = "https://files.pythonhosted.org/packages/3e/8e/6aa1952f56b192f54921c436b87f2aaf7c7a7c3d0d1a765547d64fd83c13/charset_normalizer-3.4.4-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:798d75d81754988d2565bff1b97ba5a44411867c0cf32b77a7e8f8d84796b10d", size = 159454, upload-time = "2025-10-14T04:40:17.567Z" },
+ { url = "https://files.pythonhosted.org/packages/36/3b/60cbd1f8e93aa25d1c669c649b7a655b0b5fb4c571858910ea9332678558/charset_normalizer-3.4.4-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9d1bb833febdff5c8927f922386db610b49db6e0d4f4ee29601d71e7c2694313", size = 153609, upload-time = "2025-10-14T04:40:19.08Z" },
+ { url = "https://files.pythonhosted.org/packages/64/91/6a13396948b8fd3c4b4fd5bc74d045f5637d78c9675585e8e9fbe5636554/charset_normalizer-3.4.4-cp310-cp310-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:9cd98cdc06614a2f768d2b7286d66805f94c48cde050acdbbb7db2600ab3197e", size = 151849, upload-time = "2025-10-14T04:40:20.607Z" },
+ { url = "https://files.pythonhosted.org/packages/b7/7a/59482e28b9981d105691e968c544cc0df3b7d6133152fb3dcdc8f135da7a/charset_normalizer-3.4.4-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:077fbb858e903c73f6c9db43374fd213b0b6a778106bc7032446a8e8b5b38b93", size = 151586, upload-time = "2025-10-14T04:40:21.719Z" },
+ { url = "https://files.pythonhosted.org/packages/92/59/f64ef6a1c4bdd2baf892b04cd78792ed8684fbc48d4c2afe467d96b4df57/charset_normalizer-3.4.4-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:244bfb999c71b35de57821b8ea746b24e863398194a4014e4c76adc2bbdfeff0", size = 145290, upload-time = "2025-10-14T04:40:23.069Z" },
+ { url = "https://files.pythonhosted.org/packages/6b/63/3bf9f279ddfa641ffa1962b0db6a57a9c294361cc2f5fcac997049a00e9c/charset_normalizer-3.4.4-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:64b55f9dce520635f018f907ff1b0df1fdc31f2795a922fb49dd14fbcdf48c84", size = 163663, upload-time = "2025-10-14T04:40:24.17Z" },
+ { url = "https://files.pythonhosted.org/packages/ed/09/c9e38fc8fa9e0849b172b581fd9803bdf6e694041127933934184e19f8c3/charset_normalizer-3.4.4-cp310-cp310-musllinux_1_2_riscv64.whl", hash = "sha256:faa3a41b2b66b6e50f84ae4a68c64fcd0c44355741c6374813a800cd6695db9e", size = 151964, upload-time = "2025-10-14T04:40:25.368Z" },
+ { url = "https://files.pythonhosted.org/packages/d2/d1/d28b747e512d0da79d8b6a1ac18b7ab2ecfd81b2944c4c710e166d8dd09c/charset_normalizer-3.4.4-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:6515f3182dbe4ea06ced2d9e8666d97b46ef4c75e326b79bb624110f122551db", size = 161064, upload-time = "2025-10-14T04:40:26.806Z" },
+ { url = "https://files.pythonhosted.org/packages/bb/9a/31d62b611d901c3b9e5500c36aab0ff5eb442043fb3a1c254200d3d397d9/charset_normalizer-3.4.4-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:cc00f04ed596e9dc0da42ed17ac5e596c6ccba999ba6bd92b0e0aef2f170f2d6", size = 155015, upload-time = "2025-10-14T04:40:28.284Z" },
+ { url = "https://files.pythonhosted.org/packages/1f/f3/107e008fa2bff0c8b9319584174418e5e5285fef32f79d8ee6a430d0039c/charset_normalizer-3.4.4-cp310-cp310-win32.whl", hash = "sha256:f34be2938726fc13801220747472850852fe6b1ea75869a048d6f896838c896f", size = 99792, upload-time = "2025-10-14T04:40:29.613Z" },
+ { url = "https://files.pythonhosted.org/packages/eb/66/e396e8a408843337d7315bab30dbf106c38966f1819f123257f5520f8a96/charset_normalizer-3.4.4-cp310-cp310-win_amd64.whl", hash = "sha256:a61900df84c667873b292c3de315a786dd8dac506704dea57bc957bd31e22c7d", size = 107198, upload-time = "2025-10-14T04:40:30.644Z" },
+ { url = "https://files.pythonhosted.org/packages/b5/58/01b4f815bf0312704c267f2ccb6e5d42bcc7752340cd487bc9f8c3710597/charset_normalizer-3.4.4-cp310-cp310-win_arm64.whl", hash = "sha256:cead0978fc57397645f12578bfd2d5ea9138ea0fac82b2f63f7f7c6877986a69", size = 100262, upload-time = "2025-10-14T04:40:32.108Z" },
+ { url = "https://files.pythonhosted.org/packages/ed/27/c6491ff4954e58a10f69ad90aca8a1b6fe9c5d3c6f380907af3c37435b59/charset_normalizer-3.4.4-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:6e1fcf0720908f200cd21aa4e6750a48ff6ce4afe7ff5a79a90d5ed8a08296f8", size = 206988, upload-time = "2025-10-14T04:40:33.79Z" },
+ { url = "https://files.pythonhosted.org/packages/94/59/2e87300fe67ab820b5428580a53cad894272dbb97f38a7a814a2a1ac1011/charset_normalizer-3.4.4-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5f819d5fe9234f9f82d75bdfa9aef3a3d72c4d24a6e57aeaebba32a704553aa0", size = 147324, upload-time = "2025-10-14T04:40:34.961Z" },
+ { url = "https://files.pythonhosted.org/packages/07/fb/0cf61dc84b2b088391830f6274cb57c82e4da8bbc2efeac8c025edb88772/charset_normalizer-3.4.4-cp311-cp311-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:a59cb51917aa591b1c4e6a43c132f0cdc3c76dbad6155df4e28ee626cc77a0a3", size = 142742, upload-time = "2025-10-14T04:40:36.105Z" },
+ { url = "https://files.pythonhosted.org/packages/62/8b/171935adf2312cd745d290ed93cf16cf0dfe320863ab7cbeeae1dcd6535f/charset_normalizer-3.4.4-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:8ef3c867360f88ac904fd3f5e1f902f13307af9052646963ee08ff4f131adafc", size = 160863, upload-time = "2025-10-14T04:40:37.188Z" },
+ { url = "https://files.pythonhosted.org/packages/09/73/ad875b192bda14f2173bfc1bc9a55e009808484a4b256748d931b6948442/charset_normalizer-3.4.4-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:d9e45d7faa48ee908174d8fe84854479ef838fc6a705c9315372eacbc2f02897", size = 157837, upload-time = "2025-10-14T04:40:38.435Z" },
+ { url = "https://files.pythonhosted.org/packages/6d/fc/de9cce525b2c5b94b47c70a4b4fb19f871b24995c728e957ee68ab1671ea/charset_normalizer-3.4.4-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:840c25fb618a231545cbab0564a799f101b63b9901f2569faecd6b222ac72381", size = 151550, upload-time = "2025-10-14T04:40:40.053Z" },
+ { url = "https://files.pythonhosted.org/packages/55/c2/43edd615fdfba8c6f2dfbd459b25a6b3b551f24ea21981e23fb768503ce1/charset_normalizer-3.4.4-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:ca5862d5b3928c4940729dacc329aa9102900382fea192fc5e52eb69d6093815", size = 149162, upload-time = "2025-10-14T04:40:41.163Z" },
+ { url = "https://files.pythonhosted.org/packages/03/86/bde4ad8b4d0e9429a4e82c1e8f5c659993a9a863ad62c7df05cf7b678d75/charset_normalizer-3.4.4-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:d9c7f57c3d666a53421049053eaacdd14bbd0a528e2186fcb2e672effd053bb0", size = 150019, upload-time = "2025-10-14T04:40:42.276Z" },
+ { url = "https://files.pythonhosted.org/packages/1f/86/a151eb2af293a7e7bac3a739b81072585ce36ccfb4493039f49f1d3cae8c/charset_normalizer-3.4.4-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:277e970e750505ed74c832b4bf75dac7476262ee2a013f5574dd49075879e161", size = 143310, upload-time = "2025-10-14T04:40:43.439Z" },
+ { url = "https://files.pythonhosted.org/packages/b5/fe/43dae6144a7e07b87478fdfc4dbe9efd5defb0e7ec29f5f58a55aeef7bf7/charset_normalizer-3.4.4-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:31fd66405eaf47bb62e8cd575dc621c56c668f27d46a61d975a249930dd5e2a4", size = 162022, upload-time = "2025-10-14T04:40:44.547Z" },
+ { url = "https://files.pythonhosted.org/packages/80/e6/7aab83774f5d2bca81f42ac58d04caf44f0cc2b65fc6db2b3b2e8a05f3b3/charset_normalizer-3.4.4-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:0d3d8f15c07f86e9ff82319b3d9ef6f4bf907608f53fe9d92b28ea9ae3d1fd89", size = 149383, upload-time = "2025-10-14T04:40:46.018Z" },
+ { url = "https://files.pythonhosted.org/packages/4f/e8/b289173b4edae05c0dde07f69f8db476a0b511eac556dfe0d6bda3c43384/charset_normalizer-3.4.4-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:9f7fcd74d410a36883701fafa2482a6af2ff5ba96b9a620e9e0721e28ead5569", size = 159098, upload-time = "2025-10-14T04:40:47.081Z" },
+ { url = "https://files.pythonhosted.org/packages/d8/df/fe699727754cae3f8478493c7f45f777b17c3ef0600e28abfec8619eb49c/charset_normalizer-3.4.4-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:ebf3e58c7ec8a8bed6d66a75d7fb37b55e5015b03ceae72a8e7c74495551e224", size = 152991, upload-time = "2025-10-14T04:40:48.246Z" },
+ { url = "https://files.pythonhosted.org/packages/1a/86/584869fe4ddb6ffa3bd9f491b87a01568797fb9bd8933f557dba9771beaf/charset_normalizer-3.4.4-cp311-cp311-win32.whl", hash = "sha256:eecbc200c7fd5ddb9a7f16c7decb07b566c29fa2161a16cf67b8d068bd21690a", size = 99456, upload-time = "2025-10-14T04:40:49.376Z" },
+ { url = "https://files.pythonhosted.org/packages/65/f6/62fdd5feb60530f50f7e38b4f6a1d5203f4d16ff4f9f0952962c044e919a/charset_normalizer-3.4.4-cp311-cp311-win_amd64.whl", hash = "sha256:5ae497466c7901d54b639cf42d5b8c1b6a4fead55215500d2f486d34db48d016", size = 106978, upload-time = "2025-10-14T04:40:50.844Z" },
+ { url = "https://files.pythonhosted.org/packages/7a/9d/0710916e6c82948b3be62d9d398cb4fcf4e97b56d6a6aeccd66c4b2f2bd5/charset_normalizer-3.4.4-cp311-cp311-win_arm64.whl", hash = "sha256:65e2befcd84bc6f37095f5961e68a6f077bf44946771354a28ad434c2cce0ae1", size = 99969, upload-time = "2025-10-14T04:40:52.272Z" },
+ { url = "https://files.pythonhosted.org/packages/f3/85/1637cd4af66fa687396e757dec650f28025f2a2f5a5531a3208dc0ec43f2/charset_normalizer-3.4.4-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:0a98e6759f854bd25a58a73fa88833fba3b7c491169f86ce1180c948ab3fd394", size = 208425, upload-time = "2025-10-14T04:40:53.353Z" },
+ { url = "https://files.pythonhosted.org/packages/9d/6a/04130023fef2a0d9c62d0bae2649b69f7b7d8d24ea5536feef50551029df/charset_normalizer-3.4.4-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b5b290ccc2a263e8d185130284f8501e3e36c5e02750fc6b6bdeb2e9e96f1e25", size = 148162, upload-time = "2025-10-14T04:40:54.558Z" },
+ { url = "https://files.pythonhosted.org/packages/78/29/62328d79aa60da22c9e0b9a66539feae06ca0f5a4171ac4f7dc285b83688/charset_normalizer-3.4.4-cp312-cp312-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:74bb723680f9f7a6234dcf67aea57e708ec1fbdf5699fb91dfd6f511b0a320ef", size = 144558, upload-time = "2025-10-14T04:40:55.677Z" },
+ { url = "https://files.pythonhosted.org/packages/86/bb/b32194a4bf15b88403537c2e120b817c61cd4ecffa9b6876e941c3ee38fe/charset_normalizer-3.4.4-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:f1e34719c6ed0b92f418c7c780480b26b5d9c50349e9a9af7d76bf757530350d", size = 161497, upload-time = "2025-10-14T04:40:57.217Z" },
+ { url = "https://files.pythonhosted.org/packages/19/89/a54c82b253d5b9b111dc74aca196ba5ccfcca8242d0fb64146d4d3183ff1/charset_normalizer-3.4.4-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:2437418e20515acec67d86e12bf70056a33abdacb5cb1655042f6538d6b085a8", size = 159240, upload-time = "2025-10-14T04:40:58.358Z" },
+ { url = "https://files.pythonhosted.org/packages/c0/10/d20b513afe03acc89ec33948320a5544d31f21b05368436d580dec4e234d/charset_normalizer-3.4.4-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:11d694519d7f29d6cd09f6ac70028dba10f92f6cdd059096db198c283794ac86", size = 153471, upload-time = "2025-10-14T04:40:59.468Z" },
+ { url = "https://files.pythonhosted.org/packages/61/fa/fbf177b55bdd727010f9c0a3c49eefa1d10f960e5f09d1d887bf93c2e698/charset_normalizer-3.4.4-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:ac1c4a689edcc530fc9d9aa11f5774b9e2f33f9a0c6a57864e90908f5208d30a", size = 150864, upload-time = "2025-10-14T04:41:00.623Z" },
+ { url = "https://files.pythonhosted.org/packages/05/12/9fbc6a4d39c0198adeebbde20b619790e9236557ca59fc40e0e3cebe6f40/charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:21d142cc6c0ec30d2efee5068ca36c128a30b0f2c53c1c07bd78cb6bc1d3be5f", size = 150647, upload-time = "2025-10-14T04:41:01.754Z" },
+ { url = "https://files.pythonhosted.org/packages/ad/1f/6a9a593d52e3e8c5d2b167daf8c6b968808efb57ef4c210acb907c365bc4/charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:5dbe56a36425d26d6cfb40ce79c314a2e4dd6211d51d6d2191c00bed34f354cc", size = 145110, upload-time = "2025-10-14T04:41:03.231Z" },
+ { url = "https://files.pythonhosted.org/packages/30/42/9a52c609e72471b0fc54386dc63c3781a387bb4fe61c20231a4ebcd58bdd/charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:5bfbb1b9acf3334612667b61bd3002196fe2a1eb4dd74d247e0f2a4d50ec9bbf", size = 162839, upload-time = "2025-10-14T04:41:04.715Z" },
+ { url = "https://files.pythonhosted.org/packages/c4/5b/c0682bbf9f11597073052628ddd38344a3d673fda35a36773f7d19344b23/charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:d055ec1e26e441f6187acf818b73564e6e6282709e9bcb5b63f5b23068356a15", size = 150667, upload-time = "2025-10-14T04:41:05.827Z" },
+ { url = "https://files.pythonhosted.org/packages/e4/24/a41afeab6f990cf2daf6cb8c67419b63b48cf518e4f56022230840c9bfb2/charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:af2d8c67d8e573d6de5bc30cdb27e9b95e49115cd9baad5ddbd1a6207aaa82a9", size = 160535, upload-time = "2025-10-14T04:41:06.938Z" },
+ { url = "https://files.pythonhosted.org/packages/2a/e5/6a4ce77ed243c4a50a1fecca6aaaab419628c818a49434be428fe24c9957/charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:780236ac706e66881f3b7f2f32dfe90507a09e67d1d454c762cf642e6e1586e0", size = 154816, upload-time = "2025-10-14T04:41:08.101Z" },
+ { url = "https://files.pythonhosted.org/packages/a8/ef/89297262b8092b312d29cdb2517cb1237e51db8ecef2e9af5edbe7b683b1/charset_normalizer-3.4.4-cp312-cp312-win32.whl", hash = "sha256:5833d2c39d8896e4e19b689ffc198f08ea58116bee26dea51e362ecc7cd3ed26", size = 99694, upload-time = "2025-10-14T04:41:09.23Z" },
+ { url = "https://files.pythonhosted.org/packages/3d/2d/1e5ed9dd3b3803994c155cd9aacb60c82c331bad84daf75bcb9c91b3295e/charset_normalizer-3.4.4-cp312-cp312-win_amd64.whl", hash = "sha256:a79cfe37875f822425b89a82333404539ae63dbdddf97f84dcbc3d339aae9525", size = 107131, upload-time = "2025-10-14T04:41:10.467Z" },
+ { url = "https://files.pythonhosted.org/packages/d0/d9/0ed4c7098a861482a7b6a95603edce4c0d9db2311af23da1fb2b75ec26fc/charset_normalizer-3.4.4-cp312-cp312-win_arm64.whl", hash = "sha256:376bec83a63b8021bb5c8ea75e21c4ccb86e7e45ca4eb81146091b56599b80c3", size = 100390, upload-time = "2025-10-14T04:41:11.915Z" },
+ { url = "https://files.pythonhosted.org/packages/97/45/4b3a1239bbacd321068ea6e7ac28875b03ab8bc0aa0966452db17cd36714/charset_normalizer-3.4.4-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:e1f185f86a6f3403aa2420e815904c67b2f9ebc443f045edd0de921108345794", size = 208091, upload-time = "2025-10-14T04:41:13.346Z" },
+ { url = "https://files.pythonhosted.org/packages/7d/62/73a6d7450829655a35bb88a88fca7d736f9882a27eacdca2c6d505b57e2e/charset_normalizer-3.4.4-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6b39f987ae8ccdf0d2642338faf2abb1862340facc796048b604ef14919e55ed", size = 147936, upload-time = "2025-10-14T04:41:14.461Z" },
+ { url = "https://files.pythonhosted.org/packages/89/c5/adb8c8b3d6625bef6d88b251bbb0d95f8205831b987631ab0c8bb5d937c2/charset_normalizer-3.4.4-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:3162d5d8ce1bb98dd51af660f2121c55d0fa541b46dff7bb9b9f86ea1d87de72", size = 144180, upload-time = "2025-10-14T04:41:15.588Z" },
+ { url = "https://files.pythonhosted.org/packages/91/ed/9706e4070682d1cc219050b6048bfd293ccf67b3d4f5a4f39207453d4b99/charset_normalizer-3.4.4-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:81d5eb2a312700f4ecaa977a8235b634ce853200e828fbadf3a9c50bab278328", size = 161346, upload-time = "2025-10-14T04:41:16.738Z" },
+ { url = "https://files.pythonhosted.org/packages/d5/0d/031f0d95e4972901a2f6f09ef055751805ff541511dc1252ba3ca1f80cf5/charset_normalizer-3.4.4-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5bd2293095d766545ec1a8f612559f6b40abc0eb18bb2f5d1171872d34036ede", size = 158874, upload-time = "2025-10-14T04:41:17.923Z" },
+ { url = "https://files.pythonhosted.org/packages/f5/83/6ab5883f57c9c801ce5e5677242328aa45592be8a00644310a008d04f922/charset_normalizer-3.4.4-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a8a8b89589086a25749f471e6a900d3f662d1d3b6e2e59dcecf787b1cc3a1894", size = 153076, upload-time = "2025-10-14T04:41:19.106Z" },
+ { url = "https://files.pythonhosted.org/packages/75/1e/5ff781ddf5260e387d6419959ee89ef13878229732732ee73cdae01800f2/charset_normalizer-3.4.4-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:bc7637e2f80d8530ee4a78e878bce464f70087ce73cf7c1caf142416923b98f1", size = 150601, upload-time = "2025-10-14T04:41:20.245Z" },
+ { url = "https://files.pythonhosted.org/packages/d7/57/71be810965493d3510a6ca79b90c19e48696fb1ff964da319334b12677f0/charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:f8bf04158c6b607d747e93949aa60618b61312fe647a6369f88ce2ff16043490", size = 150376, upload-time = "2025-10-14T04:41:21.398Z" },
+ { url = "https://files.pythonhosted.org/packages/e5/d5/c3d057a78c181d007014feb7e9f2e65905a6c4ef182c0ddf0de2924edd65/charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:554af85e960429cf30784dd47447d5125aaa3b99a6f0683589dbd27e2f45da44", size = 144825, upload-time = "2025-10-14T04:41:22.583Z" },
+ { url = "https://files.pythonhosted.org/packages/e6/8c/d0406294828d4976f275ffbe66f00266c4b3136b7506941d87c00cab5272/charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:74018750915ee7ad843a774364e13a3db91682f26142baddf775342c3f5b1133", size = 162583, upload-time = "2025-10-14T04:41:23.754Z" },
+ { url = "https://files.pythonhosted.org/packages/d7/24/e2aa1f18c8f15c4c0e932d9287b8609dd30ad56dbe41d926bd846e22fb8d/charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:c0463276121fdee9c49b98908b3a89c39be45d86d1dbaa22957e38f6321d4ce3", size = 150366, upload-time = "2025-10-14T04:41:25.27Z" },
+ { url = "https://files.pythonhosted.org/packages/e4/5b/1e6160c7739aad1e2df054300cc618b06bf784a7a164b0f238360721ab86/charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:362d61fd13843997c1c446760ef36f240cf81d3ebf74ac62652aebaf7838561e", size = 160300, upload-time = "2025-10-14T04:41:26.725Z" },
+ { url = "https://files.pythonhosted.org/packages/7a/10/f882167cd207fbdd743e55534d5d9620e095089d176d55cb22d5322f2afd/charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:9a26f18905b8dd5d685d6d07b0cdf98a79f3c7a918906af7cc143ea2e164c8bc", size = 154465, upload-time = "2025-10-14T04:41:28.322Z" },
+ { url = "https://files.pythonhosted.org/packages/89/66/c7a9e1b7429be72123441bfdbaf2bc13faab3f90b933f664db506dea5915/charset_normalizer-3.4.4-cp313-cp313-win32.whl", hash = "sha256:9b35f4c90079ff2e2edc5b26c0c77925e5d2d255c42c74fdb70fb49b172726ac", size = 99404, upload-time = "2025-10-14T04:41:29.95Z" },
+ { url = "https://files.pythonhosted.org/packages/c4/26/b9924fa27db384bdcd97ab83b4f0a8058d96ad9626ead570674d5e737d90/charset_normalizer-3.4.4-cp313-cp313-win_amd64.whl", hash = "sha256:b435cba5f4f750aa6c0a0d92c541fb79f69a387c91e61f1795227e4ed9cece14", size = 107092, upload-time = "2025-10-14T04:41:31.188Z" },
+ { url = "https://files.pythonhosted.org/packages/af/8f/3ed4bfa0c0c72a7ca17f0380cd9e4dd842b09f664e780c13cff1dcf2ef1b/charset_normalizer-3.4.4-cp313-cp313-win_arm64.whl", hash = "sha256:542d2cee80be6f80247095cc36c418f7bddd14f4a6de45af91dfad36d817bba2", size = 100408, upload-time = "2025-10-14T04:41:32.624Z" },
+ { url = "https://files.pythonhosted.org/packages/2a/35/7051599bd493e62411d6ede36fd5af83a38f37c4767b92884df7301db25d/charset_normalizer-3.4.4-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:da3326d9e65ef63a817ecbcc0df6e94463713b754fe293eaa03da99befb9a5bd", size = 207746, upload-time = "2025-10-14T04:41:33.773Z" },
+ { url = "https://files.pythonhosted.org/packages/10/9a/97c8d48ef10d6cd4fcead2415523221624bf58bcf68a802721a6bc807c8f/charset_normalizer-3.4.4-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8af65f14dc14a79b924524b1e7fffe304517b2bff5a58bf64f30b98bbc5079eb", size = 147889, upload-time = "2025-10-14T04:41:34.897Z" },
+ { url = "https://files.pythonhosted.org/packages/10/bf/979224a919a1b606c82bd2c5fa49b5c6d5727aa47b4312bb27b1734f53cd/charset_normalizer-3.4.4-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:74664978bb272435107de04e36db5a9735e78232b85b77d45cfb38f758efd33e", size = 143641, upload-time = "2025-10-14T04:41:36.116Z" },
+ { url = "https://files.pythonhosted.org/packages/ba/33/0ad65587441fc730dc7bd90e9716b30b4702dc7b617e6ba4997dc8651495/charset_normalizer-3.4.4-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:752944c7ffbfdd10c074dc58ec2d5a8a4cd9493b314d367c14d24c17684ddd14", size = 160779, upload-time = "2025-10-14T04:41:37.229Z" },
+ { url = "https://files.pythonhosted.org/packages/67/ed/331d6b249259ee71ddea93f6f2f0a56cfebd46938bde6fcc6f7b9a3d0e09/charset_normalizer-3.4.4-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:d1f13550535ad8cff21b8d757a3257963e951d96e20ec82ab44bc64aeb62a191", size = 159035, upload-time = "2025-10-14T04:41:38.368Z" },
+ { url = "https://files.pythonhosted.org/packages/67/ff/f6b948ca32e4f2a4576aa129d8bed61f2e0543bf9f5f2b7fc3758ed005c9/charset_normalizer-3.4.4-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ecaae4149d99b1c9e7b88bb03e3221956f68fd6d50be2ef061b2381b61d20838", size = 152542, upload-time = "2025-10-14T04:41:39.862Z" },
+ { url = "https://files.pythonhosted.org/packages/16/85/276033dcbcc369eb176594de22728541a925b2632f9716428c851b149e83/charset_normalizer-3.4.4-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:cb6254dc36b47a990e59e1068afacdcd02958bdcce30bb50cc1700a8b9d624a6", size = 149524, upload-time = "2025-10-14T04:41:41.319Z" },
+ { url = "https://files.pythonhosted.org/packages/9e/f2/6a2a1f722b6aba37050e626530a46a68f74e63683947a8acff92569f979a/charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:c8ae8a0f02f57a6e61203a31428fa1d677cbe50c93622b4149d5c0f319c1d19e", size = 150395, upload-time = "2025-10-14T04:41:42.539Z" },
+ { url = "https://files.pythonhosted.org/packages/60/bb/2186cb2f2bbaea6338cad15ce23a67f9b0672929744381e28b0592676824/charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:47cc91b2f4dd2833fddaedd2893006b0106129d4b94fdb6af1f4ce5a9965577c", size = 143680, upload-time = "2025-10-14T04:41:43.661Z" },
+ { url = "https://files.pythonhosted.org/packages/7d/a5/bf6f13b772fbb2a90360eb620d52ed8f796f3c5caee8398c3b2eb7b1c60d/charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:82004af6c302b5d3ab2cfc4cc5f29db16123b1a8417f2e25f9066f91d4411090", size = 162045, upload-time = "2025-10-14T04:41:44.821Z" },
+ { url = "https://files.pythonhosted.org/packages/df/c5/d1be898bf0dc3ef9030c3825e5d3b83f2c528d207d246cbabe245966808d/charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:2b7d8f6c26245217bd2ad053761201e9f9680f8ce52f0fcd8d0755aeae5b2152", size = 149687, upload-time = "2025-10-14T04:41:46.442Z" },
+ { url = "https://files.pythonhosted.org/packages/a5/42/90c1f7b9341eef50c8a1cb3f098ac43b0508413f33affd762855f67a410e/charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:799a7a5e4fb2d5898c60b640fd4981d6a25f1c11790935a44ce38c54e985f828", size = 160014, upload-time = "2025-10-14T04:41:47.631Z" },
+ { url = "https://files.pythonhosted.org/packages/76/be/4d3ee471e8145d12795ab655ece37baed0929462a86e72372fd25859047c/charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:99ae2cffebb06e6c22bdc25801d7b30f503cc87dbd283479e7b606f70aff57ec", size = 154044, upload-time = "2025-10-14T04:41:48.81Z" },
+ { url = "https://files.pythonhosted.org/packages/b0/6f/8f7af07237c34a1defe7defc565a9bc1807762f672c0fde711a4b22bf9c0/charset_normalizer-3.4.4-cp314-cp314-win32.whl", hash = "sha256:f9d332f8c2a2fcbffe1378594431458ddbef721c1769d78e2cbc06280d8155f9", size = 99940, upload-time = "2025-10-14T04:41:49.946Z" },
+ { url = "https://files.pythonhosted.org/packages/4b/51/8ade005e5ca5b0d80fb4aff72a3775b325bdc3d27408c8113811a7cbe640/charset_normalizer-3.4.4-cp314-cp314-win_amd64.whl", hash = "sha256:8a6562c3700cce886c5be75ade4a5db4214fda19fede41d9792d100288d8f94c", size = 107104, upload-time = "2025-10-14T04:41:51.051Z" },
+ { url = "https://files.pythonhosted.org/packages/da/5f/6b8f83a55bb8278772c5ae54a577f3099025f9ade59d0136ac24a0df4bde/charset_normalizer-3.4.4-cp314-cp314-win_arm64.whl", hash = "sha256:de00632ca48df9daf77a2c65a484531649261ec9f25489917f09e455cb09ddb2", size = 100743, upload-time = "2025-10-14T04:41:52.122Z" },
+ { url = "https://files.pythonhosted.org/packages/0a/4c/925909008ed5a988ccbb72dcc897407e5d6d3bd72410d69e051fc0c14647/charset_normalizer-3.4.4-py3-none-any.whl", hash = "sha256:7a32c560861a02ff789ad905a2fe94e3f840803362c84fecf1851cb4cf3dc37f", size = 53402, upload-time = "2025-10-14T04:42:31.76Z" },
+]
+
+[[package]]
+name = "click"
+version = "8.3.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "colorama", marker = "sys_platform == 'win32'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/46/61/de6cd827efad202d7057d93e0fed9294b96952e188f7384832791c7b2254/click-8.3.0.tar.gz", hash = "sha256:e7b8232224eba16f4ebe410c25ced9f7875cb5f3263ffc93cc3e8da705e229c4", size = 276943, upload-time = "2025-09-18T17:32:23.696Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/db/d3/9dcc0f5797f070ec8edf30fbadfb200e71d9db6b84d211e3b2085a7589a0/click-8.3.0-py3-none-any.whl", hash = "sha256:9b9f285302c6e3064f4330c05f05b81945b2a39544279343e6e7c5f27a9baddc", size = 107295, upload-time = "2025-09-18T17:32:22.42Z" },
+]
+
+[[package]]
+name = "cloudpathlib"
+version = "0.23.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "typing-extensions", marker = "python_full_version < '3.11'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/f4/18/2ac35d6b3015a0c74e923d94fc69baf8307f7c3233de015d69f99e17afa8/cloudpathlib-0.23.0.tar.gz", hash = "sha256:eb38a34c6b8a048ecfd2b2f60917f7cbad4a105b7c979196450c2f541f4d6b4b", size = 53126, upload-time = "2025-10-07T22:47:56.278Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/ae/8a/c4bb04426d608be4a3171efa2e233d2c59a5c8937850c10d098e126df18e/cloudpathlib-0.23.0-py3-none-any.whl", hash = "sha256:8520b3b01468fee77de37ab5d50b1b524ea6b4a8731c35d1b7407ac0cd716002", size = 62755, upload-time = "2025-10-07T22:47:54.905Z" },
+]
+
+[[package]]
+name = "colorama"
+version = "0.4.6"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697, upload-time = "2022-10-25T02:36:22.414Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335, upload-time = "2022-10-25T02:36:20.889Z" },
+]
+
+[[package]]
+name = "confection"
+version = "0.1.5"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "pydantic" },
+ { name = "srsly" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/51/d3/57c6631159a1b48d273b40865c315cf51f89df7a9d1101094ef12e3a37c2/confection-0.1.5.tar.gz", hash = "sha256:8e72dd3ca6bd4f48913cd220f10b8275978e740411654b6e8ca6d7008c590f0e", size = 38924, upload-time = "2024-05-31T16:17:01.559Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/0c/00/3106b1854b45bd0474ced037dfe6b73b90fe68a68968cef47c23de3d43d2/confection-0.1.5-py3-none-any.whl", hash = "sha256:e29d3c3f8eac06b3f77eb9dfb4bf2fc6bcc9622a98ca00a698e3d019c6430b14", size = 35451, upload-time = "2024-05-31T16:16:59.075Z" },
+]
+
+[[package]]
+name = "contourpy"
+version = "1.3.2"
+source = { registry = "https://pypi.org/simple" }
+resolution-markers = [
+ "python_full_version < '3.11'",
+]
+dependencies = [
+ { name = "numpy", version = "2.2.6", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.11'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/66/54/eb9bfc647b19f2009dd5c7f5ec51c4e6ca831725f1aea7a993034f483147/contourpy-1.3.2.tar.gz", hash = "sha256:b6945942715a034c671b7fc54f9588126b0b8bf23db2696e3ca8328f3ff0ab54", size = 13466130, upload-time = "2025-04-15T17:47:53.79Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/12/a3/da4153ec8fe25d263aa48c1a4cbde7f49b59af86f0b6f7862788c60da737/contourpy-1.3.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:ba38e3f9f330af820c4b27ceb4b9c7feee5fe0493ea53a8720f4792667465934", size = 268551, upload-time = "2025-04-15T17:34:46.581Z" },
+ { url = "https://files.pythonhosted.org/packages/2f/6c/330de89ae1087eb622bfca0177d32a7ece50c3ef07b28002de4757d9d875/contourpy-1.3.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:dc41ba0714aa2968d1f8674ec97504a8f7e334f48eeacebcaa6256213acb0989", size = 253399, upload-time = "2025-04-15T17:34:51.427Z" },
+ { url = "https://files.pythonhosted.org/packages/c1/bd/20c6726b1b7f81a8bee5271bed5c165f0a8e1f572578a9d27e2ccb763cb2/contourpy-1.3.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9be002b31c558d1ddf1b9b415b162c603405414bacd6932d031c5b5a8b757f0d", size = 312061, upload-time = "2025-04-15T17:34:55.961Z" },
+ { url = "https://files.pythonhosted.org/packages/22/fc/a9665c88f8a2473f823cf1ec601de9e5375050f1958cbb356cdf06ef1ab6/contourpy-1.3.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8d2e74acbcba3bfdb6d9d8384cdc4f9260cae86ed9beee8bd5f54fee49a430b9", size = 351956, upload-time = "2025-04-15T17:35:00.992Z" },
+ { url = "https://files.pythonhosted.org/packages/25/eb/9f0a0238f305ad8fb7ef42481020d6e20cf15e46be99a1fcf939546a177e/contourpy-1.3.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e259bced5549ac64410162adc973c5e2fb77f04df4a439d00b478e57a0e65512", size = 320872, upload-time = "2025-04-15T17:35:06.177Z" },
+ { url = "https://files.pythonhosted.org/packages/32/5c/1ee32d1c7956923202f00cf8d2a14a62ed7517bdc0ee1e55301227fc273c/contourpy-1.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ad687a04bc802cbe8b9c399c07162a3c35e227e2daccf1668eb1f278cb698631", size = 325027, upload-time = "2025-04-15T17:35:11.244Z" },
+ { url = "https://files.pythonhosted.org/packages/83/bf/9baed89785ba743ef329c2b07fd0611d12bfecbedbdd3eeecf929d8d3b52/contourpy-1.3.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:cdd22595308f53ef2f891040ab2b93d79192513ffccbd7fe19be7aa773a5e09f", size = 1306641, upload-time = "2025-04-15T17:35:26.701Z" },
+ { url = "https://files.pythonhosted.org/packages/d4/cc/74e5e83d1e35de2d28bd97033426b450bc4fd96e092a1f7a63dc7369b55d/contourpy-1.3.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:b4f54d6a2defe9f257327b0f243612dd051cc43825587520b1bf74a31e2f6ef2", size = 1374075, upload-time = "2025-04-15T17:35:43.204Z" },
+ { url = "https://files.pythonhosted.org/packages/0c/42/17f3b798fd5e033b46a16f8d9fcb39f1aba051307f5ebf441bad1ecf78f8/contourpy-1.3.2-cp310-cp310-win32.whl", hash = "sha256:f939a054192ddc596e031e50bb13b657ce318cf13d264f095ce9db7dc6ae81c0", size = 177534, upload-time = "2025-04-15T17:35:46.554Z" },
+ { url = "https://files.pythonhosted.org/packages/54/ec/5162b8582f2c994721018d0c9ece9dc6ff769d298a8ac6b6a652c307e7df/contourpy-1.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:c440093bbc8fc21c637c03bafcbef95ccd963bc6e0514ad887932c18ca2a759a", size = 221188, upload-time = "2025-04-15T17:35:50.064Z" },
+ { url = "https://files.pythonhosted.org/packages/b3/b9/ede788a0b56fc5b071639d06c33cb893f68b1178938f3425debebe2dab78/contourpy-1.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6a37a2fb93d4df3fc4c0e363ea4d16f83195fc09c891bc8ce072b9d084853445", size = 269636, upload-time = "2025-04-15T17:35:54.473Z" },
+ { url = "https://files.pythonhosted.org/packages/e6/75/3469f011d64b8bbfa04f709bfc23e1dd71be54d05b1b083be9f5b22750d1/contourpy-1.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:b7cd50c38f500bbcc9b6a46643a40e0913673f869315d8e70de0438817cb7773", size = 254636, upload-time = "2025-04-15T17:35:58.283Z" },
+ { url = "https://files.pythonhosted.org/packages/8d/2f/95adb8dae08ce0ebca4fd8e7ad653159565d9739128b2d5977806656fcd2/contourpy-1.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d6658ccc7251a4433eebd89ed2672c2ed96fba367fd25ca9512aa92a4b46c4f1", size = 313053, upload-time = "2025-04-15T17:36:03.235Z" },
+ { url = "https://files.pythonhosted.org/packages/c3/a6/8ccf97a50f31adfa36917707fe39c9a0cbc24b3bbb58185577f119736cc9/contourpy-1.3.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:70771a461aaeb335df14deb6c97439973d253ae70660ca085eec25241137ef43", size = 352985, upload-time = "2025-04-15T17:36:08.275Z" },
+ { url = "https://files.pythonhosted.org/packages/1d/b6/7925ab9b77386143f39d9c3243fdd101621b4532eb126743201160ffa7e6/contourpy-1.3.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:65a887a6e8c4cd0897507d814b14c54a8c2e2aa4ac9f7686292f9769fcf9a6ab", size = 323750, upload-time = "2025-04-15T17:36:13.29Z" },
+ { url = "https://files.pythonhosted.org/packages/c2/f3/20c5d1ef4f4748e52d60771b8560cf00b69d5c6368b5c2e9311bcfa2a08b/contourpy-1.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3859783aefa2b8355697f16642695a5b9792e7a46ab86da1118a4a23a51a33d7", size = 326246, upload-time = "2025-04-15T17:36:18.329Z" },
+ { url = "https://files.pythonhosted.org/packages/8c/e5/9dae809e7e0b2d9d70c52b3d24cba134dd3dad979eb3e5e71f5df22ed1f5/contourpy-1.3.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:eab0f6db315fa4d70f1d8ab514e527f0366ec021ff853d7ed6a2d33605cf4b83", size = 1308728, upload-time = "2025-04-15T17:36:33.878Z" },
+ { url = "https://files.pythonhosted.org/packages/e2/4a/0058ba34aeea35c0b442ae61a4f4d4ca84d6df8f91309bc2d43bb8dd248f/contourpy-1.3.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:d91a3ccc7fea94ca0acab82ceb77f396d50a1f67412efe4c526f5d20264e6ecd", size = 1375762, upload-time = "2025-04-15T17:36:51.295Z" },
+ { url = "https://files.pythonhosted.org/packages/09/33/7174bdfc8b7767ef2c08ed81244762d93d5c579336fc0b51ca57b33d1b80/contourpy-1.3.2-cp311-cp311-win32.whl", hash = "sha256:1c48188778d4d2f3d48e4643fb15d8608b1d01e4b4d6b0548d9b336c28fc9b6f", size = 178196, upload-time = "2025-04-15T17:36:55.002Z" },
+ { url = "https://files.pythonhosted.org/packages/5e/fe/4029038b4e1c4485cef18e480b0e2cd2d755448bb071eb9977caac80b77b/contourpy-1.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:5ebac872ba09cb8f2131c46b8739a7ff71de28a24c869bcad554477eb089a878", size = 222017, upload-time = "2025-04-15T17:36:58.576Z" },
+ { url = "https://files.pythonhosted.org/packages/34/f7/44785876384eff370c251d58fd65f6ad7f39adce4a093c934d4a67a7c6b6/contourpy-1.3.2-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:4caf2bcd2969402bf77edc4cb6034c7dd7c0803213b3523f111eb7460a51b8d2", size = 271580, upload-time = "2025-04-15T17:37:03.105Z" },
+ { url = "https://files.pythonhosted.org/packages/93/3b/0004767622a9826ea3d95f0e9d98cd8729015768075d61f9fea8eeca42a8/contourpy-1.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:82199cb78276249796419fe36b7386bd8d2cc3f28b3bc19fe2454fe2e26c4c15", size = 255530, upload-time = "2025-04-15T17:37:07.026Z" },
+ { url = "https://files.pythonhosted.org/packages/e7/bb/7bd49e1f4fa805772d9fd130e0d375554ebc771ed7172f48dfcd4ca61549/contourpy-1.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:106fab697af11456fcba3e352ad50effe493a90f893fca6c2ca5c033820cea92", size = 307688, upload-time = "2025-04-15T17:37:11.481Z" },
+ { url = "https://files.pythonhosted.org/packages/fc/97/e1d5dbbfa170725ef78357a9a0edc996b09ae4af170927ba8ce977e60a5f/contourpy-1.3.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d14f12932a8d620e307f715857107b1d1845cc44fdb5da2bc8e850f5ceba9f87", size = 347331, upload-time = "2025-04-15T17:37:18.212Z" },
+ { url = "https://files.pythonhosted.org/packages/6f/66/e69e6e904f5ecf6901be3dd16e7e54d41b6ec6ae3405a535286d4418ffb4/contourpy-1.3.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:532fd26e715560721bb0d5fc7610fce279b3699b018600ab999d1be895b09415", size = 318963, upload-time = "2025-04-15T17:37:22.76Z" },
+ { url = "https://files.pythonhosted.org/packages/a8/32/b8a1c8965e4f72482ff2d1ac2cd670ce0b542f203c8e1d34e7c3e6925da7/contourpy-1.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f26b383144cf2d2c29f01a1e8170f50dacf0eac02d64139dcd709a8ac4eb3cfe", size = 323681, upload-time = "2025-04-15T17:37:33.001Z" },
+ { url = "https://files.pythonhosted.org/packages/30/c6/12a7e6811d08757c7162a541ca4c5c6a34c0f4e98ef2b338791093518e40/contourpy-1.3.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:c49f73e61f1f774650a55d221803b101d966ca0c5a2d6d5e4320ec3997489441", size = 1308674, upload-time = "2025-04-15T17:37:48.64Z" },
+ { url = "https://files.pythonhosted.org/packages/2a/8a/bebe5a3f68b484d3a2b8ffaf84704b3e343ef1addea528132ef148e22b3b/contourpy-1.3.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:3d80b2c0300583228ac98d0a927a1ba6a2ba6b8a742463c564f1d419ee5b211e", size = 1380480, upload-time = "2025-04-15T17:38:06.7Z" },
+ { url = "https://files.pythonhosted.org/packages/34/db/fcd325f19b5978fb509a7d55e06d99f5f856294c1991097534360b307cf1/contourpy-1.3.2-cp312-cp312-win32.whl", hash = "sha256:90df94c89a91b7362e1142cbee7568f86514412ab8a2c0d0fca72d7e91b62912", size = 178489, upload-time = "2025-04-15T17:38:10.338Z" },
+ { url = "https://files.pythonhosted.org/packages/01/c8/fadd0b92ffa7b5eb5949bf340a63a4a496a6930a6c37a7ba0f12acb076d6/contourpy-1.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:8c942a01d9163e2e5cfb05cb66110121b8d07ad438a17f9e766317bcb62abf73", size = 223042, upload-time = "2025-04-15T17:38:14.239Z" },
+ { url = "https://files.pythonhosted.org/packages/2e/61/5673f7e364b31e4e7ef6f61a4b5121c5f170f941895912f773d95270f3a2/contourpy-1.3.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:de39db2604ae755316cb5967728f4bea92685884b1e767b7c24e983ef5f771cb", size = 271630, upload-time = "2025-04-15T17:38:19.142Z" },
+ { url = "https://files.pythonhosted.org/packages/ff/66/a40badddd1223822c95798c55292844b7e871e50f6bfd9f158cb25e0bd39/contourpy-1.3.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:3f9e896f447c5c8618f1edb2bafa9a4030f22a575ec418ad70611450720b5b08", size = 255670, upload-time = "2025-04-15T17:38:23.688Z" },
+ { url = "https://files.pythonhosted.org/packages/1e/c7/cf9fdee8200805c9bc3b148f49cb9482a4e3ea2719e772602a425c9b09f8/contourpy-1.3.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:71e2bd4a1c4188f5c2b8d274da78faab884b59df20df63c34f74aa1813c4427c", size = 306694, upload-time = "2025-04-15T17:38:28.238Z" },
+ { url = "https://files.pythonhosted.org/packages/dd/e7/ccb9bec80e1ba121efbffad7f38021021cda5be87532ec16fd96533bb2e0/contourpy-1.3.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:de425af81b6cea33101ae95ece1f696af39446db9682a0b56daaa48cfc29f38f", size = 345986, upload-time = "2025-04-15T17:38:33.502Z" },
+ { url = "https://files.pythonhosted.org/packages/dc/49/ca13bb2da90391fa4219fdb23b078d6065ada886658ac7818e5441448b78/contourpy-1.3.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:977e98a0e0480d3fe292246417239d2d45435904afd6d7332d8455981c408b85", size = 318060, upload-time = "2025-04-15T17:38:38.672Z" },
+ { url = "https://files.pythonhosted.org/packages/c8/65/5245ce8c548a8422236c13ffcdcdada6a2a812c361e9e0c70548bb40b661/contourpy-1.3.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:434f0adf84911c924519d2b08fc10491dd282b20bdd3fa8f60fd816ea0b48841", size = 322747, upload-time = "2025-04-15T17:38:43.712Z" },
+ { url = "https://files.pythonhosted.org/packages/72/30/669b8eb48e0a01c660ead3752a25b44fdb2e5ebc13a55782f639170772f9/contourpy-1.3.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:c66c4906cdbc50e9cba65978823e6e00b45682eb09adbb78c9775b74eb222422", size = 1308895, upload-time = "2025-04-15T17:39:00.224Z" },
+ { url = "https://files.pythonhosted.org/packages/05/5a/b569f4250decee6e8d54498be7bdf29021a4c256e77fe8138c8319ef8eb3/contourpy-1.3.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:8b7fc0cd78ba2f4695fd0a6ad81a19e7e3ab825c31b577f384aa9d7817dc3bef", size = 1379098, upload-time = "2025-04-15T17:43:29.649Z" },
+ { url = "https://files.pythonhosted.org/packages/19/ba/b227c3886d120e60e41b28740ac3617b2f2b971b9f601c835661194579f1/contourpy-1.3.2-cp313-cp313-win32.whl", hash = "sha256:15ce6ab60957ca74cff444fe66d9045c1fd3e92c8936894ebd1f3eef2fff075f", size = 178535, upload-time = "2025-04-15T17:44:44.532Z" },
+ { url = "https://files.pythonhosted.org/packages/12/6e/2fed56cd47ca739b43e892707ae9a13790a486a3173be063681ca67d2262/contourpy-1.3.2-cp313-cp313-win_amd64.whl", hash = "sha256:e1578f7eafce927b168752ed7e22646dad6cd9bca673c60bff55889fa236ebf9", size = 223096, upload-time = "2025-04-15T17:44:48.194Z" },
+ { url = "https://files.pythonhosted.org/packages/54/4c/e76fe2a03014a7c767d79ea35c86a747e9325537a8b7627e0e5b3ba266b4/contourpy-1.3.2-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:0475b1f6604896bc7c53bb070e355e9321e1bc0d381735421a2d2068ec56531f", size = 285090, upload-time = "2025-04-15T17:43:34.084Z" },
+ { url = "https://files.pythonhosted.org/packages/7b/e2/5aba47debd55d668e00baf9651b721e7733975dc9fc27264a62b0dd26eb8/contourpy-1.3.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:c85bb486e9be652314bb5b9e2e3b0d1b2e643d5eec4992c0fbe8ac71775da739", size = 268643, upload-time = "2025-04-15T17:43:38.626Z" },
+ { url = "https://files.pythonhosted.org/packages/a1/37/cd45f1f051fe6230f751cc5cdd2728bb3a203f5619510ef11e732109593c/contourpy-1.3.2-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:745b57db7758f3ffc05a10254edd3182a2a83402a89c00957a8e8a22f5582823", size = 310443, upload-time = "2025-04-15T17:43:44.522Z" },
+ { url = "https://files.pythonhosted.org/packages/8b/a2/36ea6140c306c9ff6dd38e3bcec80b3b018474ef4d17eb68ceecd26675f4/contourpy-1.3.2-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:970e9173dbd7eba9b4e01aab19215a48ee5dd3f43cef736eebde064a171f89a5", size = 349865, upload-time = "2025-04-15T17:43:49.545Z" },
+ { url = "https://files.pythonhosted.org/packages/95/b7/2fc76bc539693180488f7b6cc518da7acbbb9e3b931fd9280504128bf956/contourpy-1.3.2-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c6c4639a9c22230276b7bffb6a850dfc8258a2521305e1faefe804d006b2e532", size = 321162, upload-time = "2025-04-15T17:43:54.203Z" },
+ { url = "https://files.pythonhosted.org/packages/f4/10/76d4f778458b0aa83f96e59d65ece72a060bacb20cfbee46cf6cd5ceba41/contourpy-1.3.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cc829960f34ba36aad4302e78eabf3ef16a3a100863f0d4eeddf30e8a485a03b", size = 327355, upload-time = "2025-04-15T17:44:01.025Z" },
+ { url = "https://files.pythonhosted.org/packages/43/a3/10cf483ea683f9f8ab096c24bad3cce20e0d1dd9a4baa0e2093c1c962d9d/contourpy-1.3.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:d32530b534e986374fc19eaa77fcb87e8a99e5431499949b828312bdcd20ac52", size = 1307935, upload-time = "2025-04-15T17:44:17.322Z" },
+ { url = "https://files.pythonhosted.org/packages/78/73/69dd9a024444489e22d86108e7b913f3528f56cfc312b5c5727a44188471/contourpy-1.3.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:e298e7e70cf4eb179cc1077be1c725b5fd131ebc81181bf0c03525c8abc297fd", size = 1372168, upload-time = "2025-04-15T17:44:33.43Z" },
+ { url = "https://files.pythonhosted.org/packages/0f/1b/96d586ccf1b1a9d2004dd519b25fbf104a11589abfd05484ff12199cca21/contourpy-1.3.2-cp313-cp313t-win32.whl", hash = "sha256:d0e589ae0d55204991450bb5c23f571c64fe43adaa53f93fc902a84c96f52fe1", size = 189550, upload-time = "2025-04-15T17:44:37.092Z" },
+ { url = "https://files.pythonhosted.org/packages/b0/e6/6000d0094e8a5e32ad62591c8609e269febb6e4db83a1c75ff8868b42731/contourpy-1.3.2-cp313-cp313t-win_amd64.whl", hash = "sha256:78e9253c3de756b3f6a5174d024c4835acd59eb3f8e2ca13e775dbffe1558f69", size = 238214, upload-time = "2025-04-15T17:44:40.827Z" },
+ { url = "https://files.pythonhosted.org/packages/33/05/b26e3c6ecc05f349ee0013f0bb850a761016d89cec528a98193a48c34033/contourpy-1.3.2-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:fd93cc7f3139b6dd7aab2f26a90dde0aa9fc264dbf70f6740d498a70b860b82c", size = 265681, upload-time = "2025-04-15T17:44:59.314Z" },
+ { url = "https://files.pythonhosted.org/packages/2b/25/ac07d6ad12affa7d1ffed11b77417d0a6308170f44ff20fa1d5aa6333f03/contourpy-1.3.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:107ba8a6a7eec58bb475329e6d3b95deba9440667c4d62b9b6063942b61d7f16", size = 315101, upload-time = "2025-04-15T17:45:04.165Z" },
+ { url = "https://files.pythonhosted.org/packages/8f/4d/5bb3192bbe9d3f27e3061a6a8e7733c9120e203cb8515767d30973f71030/contourpy-1.3.2-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:ded1706ed0c1049224531b81128efbd5084598f18d8a2d9efae833edbd2b40ad", size = 220599, upload-time = "2025-04-15T17:45:08.456Z" },
+ { url = "https://files.pythonhosted.org/packages/ff/c0/91f1215d0d9f9f343e4773ba6c9b89e8c0cc7a64a6263f21139da639d848/contourpy-1.3.2-pp311-pypy311_pp73-macosx_10_15_x86_64.whl", hash = "sha256:5f5964cdad279256c084b69c3f412b7801e15356b16efa9d78aa974041903da0", size = 266807, upload-time = "2025-04-15T17:45:15.535Z" },
+ { url = "https://files.pythonhosted.org/packages/d4/79/6be7e90c955c0487e7712660d6cead01fa17bff98e0ea275737cc2bc8e71/contourpy-1.3.2-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:49b65a95d642d4efa8f64ba12558fcb83407e58a2dfba9d796d77b63ccfcaff5", size = 318729, upload-time = "2025-04-15T17:45:20.166Z" },
+ { url = "https://files.pythonhosted.org/packages/87/68/7f46fb537958e87427d98a4074bcde4b67a70b04900cfc5ce29bc2f556c1/contourpy-1.3.2-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:8c5acb8dddb0752bf252e01a3035b21443158910ac16a3b0d20e7fed7d534ce5", size = 221791, upload-time = "2025-04-15T17:45:24.794Z" },
+]
+
+[[package]]
+name = "contourpy"
+version = "1.3.3"
+source = { registry = "https://pypi.org/simple" }
+resolution-markers = [
+ "python_full_version >= '3.12'",
+ "python_full_version == '3.11.*'",
+]
+dependencies = [
+ { name = "numpy", version = "2.3.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.11'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/58/01/1253e6698a07380cd31a736d248a3f2a50a7c88779a1813da27503cadc2a/contourpy-1.3.3.tar.gz", hash = "sha256:083e12155b210502d0bca491432bb04d56dc3432f95a979b429f2848c3dbe880", size = 13466174, upload-time = "2025-07-26T12:03:12.549Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/91/2e/c4390a31919d8a78b90e8ecf87cd4b4c4f05a5b48d05ec17db8e5404c6f4/contourpy-1.3.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:709a48ef9a690e1343202916450bc48b9e51c049b089c7f79a267b46cffcdaa1", size = 288773, upload-time = "2025-07-26T12:01:02.277Z" },
+ { url = "https://files.pythonhosted.org/packages/0d/44/c4b0b6095fef4dc9c420e041799591e3b63e9619e3044f7f4f6c21c0ab24/contourpy-1.3.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:23416f38bfd74d5d28ab8429cc4d63fa67d5068bd711a85edb1c3fb0c3e2f381", size = 270149, upload-time = "2025-07-26T12:01:04.072Z" },
+ { url = "https://files.pythonhosted.org/packages/30/2e/dd4ced42fefac8470661d7cb7e264808425e6c5d56d175291e93890cce09/contourpy-1.3.3-cp311-cp311-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:929ddf8c4c7f348e4c0a5a3a714b5c8542ffaa8c22954862a46ca1813b667ee7", size = 329222, upload-time = "2025-07-26T12:01:05.688Z" },
+ { url = "https://files.pythonhosted.org/packages/f2/74/cc6ec2548e3d276c71389ea4802a774b7aa3558223b7bade3f25787fafc2/contourpy-1.3.3-cp311-cp311-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:9e999574eddae35f1312c2b4b717b7885d4edd6cb46700e04f7f02db454e67c1", size = 377234, upload-time = "2025-07-26T12:01:07.054Z" },
+ { url = "https://files.pythonhosted.org/packages/03/b3/64ef723029f917410f75c09da54254c5f9ea90ef89b143ccadb09df14c15/contourpy-1.3.3-cp311-cp311-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:0bf67e0e3f482cb69779dd3061b534eb35ac9b17f163d851e2a547d56dba0a3a", size = 380555, upload-time = "2025-07-26T12:01:08.801Z" },
+ { url = "https://files.pythonhosted.org/packages/5f/4b/6157f24ca425b89fe2eb7e7be642375711ab671135be21e6faa100f7448c/contourpy-1.3.3-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:51e79c1f7470158e838808d4a996fa9bac72c498e93d8ebe5119bc1e6becb0db", size = 355238, upload-time = "2025-07-26T12:01:10.319Z" },
+ { url = "https://files.pythonhosted.org/packages/98/56/f914f0dd678480708a04cfd2206e7c382533249bc5001eb9f58aa693e200/contourpy-1.3.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:598c3aaece21c503615fd59c92a3598b428b2f01bfb4b8ca9c4edeecc2438620", size = 1326218, upload-time = "2025-07-26T12:01:12.659Z" },
+ { url = "https://files.pythonhosted.org/packages/fb/d7/4a972334a0c971acd5172389671113ae82aa7527073980c38d5868ff1161/contourpy-1.3.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:322ab1c99b008dad206d406bb61d014cf0174df491ae9d9d0fac6a6fda4f977f", size = 1392867, upload-time = "2025-07-26T12:01:15.533Z" },
+ { url = "https://files.pythonhosted.org/packages/75/3e/f2cc6cd56dc8cff46b1a56232eabc6feea52720083ea71ab15523daab796/contourpy-1.3.3-cp311-cp311-win32.whl", hash = "sha256:fd907ae12cd483cd83e414b12941c632a969171bf90fc937d0c9f268a31cafff", size = 183677, upload-time = "2025-07-26T12:01:17.088Z" },
+ { url = "https://files.pythonhosted.org/packages/98/4b/9bd370b004b5c9d8045c6c33cf65bae018b27aca550a3f657cdc99acdbd8/contourpy-1.3.3-cp311-cp311-win_amd64.whl", hash = "sha256:3519428f6be58431c56581f1694ba8e50626f2dd550af225f82fb5f5814d2a42", size = 225234, upload-time = "2025-07-26T12:01:18.256Z" },
+ { url = "https://files.pythonhosted.org/packages/d9/b6/71771e02c2e004450c12b1120a5f488cad2e4d5b590b1af8bad060360fe4/contourpy-1.3.3-cp311-cp311-win_arm64.whl", hash = "sha256:15ff10bfada4bf92ec8b31c62bf7c1834c244019b4a33095a68000d7075df470", size = 193123, upload-time = "2025-07-26T12:01:19.848Z" },
+ { url = "https://files.pythonhosted.org/packages/be/45/adfee365d9ea3d853550b2e735f9d66366701c65db7855cd07621732ccfc/contourpy-1.3.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:b08a32ea2f8e42cf1d4be3169a98dd4be32bafe4f22b6c4cb4ba810fa9e5d2cb", size = 293419, upload-time = "2025-07-26T12:01:21.16Z" },
+ { url = "https://files.pythonhosted.org/packages/53/3e/405b59cfa13021a56bba395a6b3aca8cec012b45bf177b0eaf7a202cde2c/contourpy-1.3.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:556dba8fb6f5d8742f2923fe9457dbdd51e1049c4a43fd3986a0b14a1d815fc6", size = 273979, upload-time = "2025-07-26T12:01:22.448Z" },
+ { url = "https://files.pythonhosted.org/packages/d4/1c/a12359b9b2ca3a845e8f7f9ac08bdf776114eb931392fcad91743e2ea17b/contourpy-1.3.3-cp312-cp312-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:92d9abc807cf7d0e047b95ca5d957cf4792fcd04e920ca70d48add15c1a90ea7", size = 332653, upload-time = "2025-07-26T12:01:24.155Z" },
+ { url = "https://files.pythonhosted.org/packages/63/12/897aeebfb475b7748ea67b61e045accdfcf0d971f8a588b67108ed7f5512/contourpy-1.3.3-cp312-cp312-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:b2e8faa0ed68cb29af51edd8e24798bb661eac3bd9f65420c1887b6ca89987c8", size = 379536, upload-time = "2025-07-26T12:01:25.91Z" },
+ { url = "https://files.pythonhosted.org/packages/43/8a/a8c584b82deb248930ce069e71576fc09bd7174bbd35183b7943fb1064fd/contourpy-1.3.3-cp312-cp312-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:626d60935cf668e70a5ce6ff184fd713e9683fb458898e4249b63be9e28286ea", size = 384397, upload-time = "2025-07-26T12:01:27.152Z" },
+ { url = "https://files.pythonhosted.org/packages/cc/8f/ec6289987824b29529d0dfda0d74a07cec60e54b9c92f3c9da4c0ac732de/contourpy-1.3.3-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4d00e655fcef08aba35ec9610536bfe90267d7ab5ba944f7032549c55a146da1", size = 362601, upload-time = "2025-07-26T12:01:28.808Z" },
+ { url = "https://files.pythonhosted.org/packages/05/0a/a3fe3be3ee2dceb3e615ebb4df97ae6f3828aa915d3e10549ce016302bd1/contourpy-1.3.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:451e71b5a7d597379ef572de31eeb909a87246974d960049a9848c3bc6c41bf7", size = 1331288, upload-time = "2025-07-26T12:01:31.198Z" },
+ { url = "https://files.pythonhosted.org/packages/33/1d/acad9bd4e97f13f3e2b18a3977fe1b4a37ecf3d38d815333980c6c72e963/contourpy-1.3.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:459c1f020cd59fcfe6650180678a9993932d80d44ccde1fa1868977438f0b411", size = 1403386, upload-time = "2025-07-26T12:01:33.947Z" },
+ { url = "https://files.pythonhosted.org/packages/cf/8f/5847f44a7fddf859704217a99a23a4f6417b10e5ab1256a179264561540e/contourpy-1.3.3-cp312-cp312-win32.whl", hash = "sha256:023b44101dfe49d7d53932be418477dba359649246075c996866106da069af69", size = 185018, upload-time = "2025-07-26T12:01:35.64Z" },
+ { url = "https://files.pythonhosted.org/packages/19/e8/6026ed58a64563186a9ee3f29f41261fd1828f527dd93d33b60feca63352/contourpy-1.3.3-cp312-cp312-win_amd64.whl", hash = "sha256:8153b8bfc11e1e4d75bcb0bff1db232f9e10b274e0929de9d608027e0d34ff8b", size = 226567, upload-time = "2025-07-26T12:01:36.804Z" },
+ { url = "https://files.pythonhosted.org/packages/d1/e2/f05240d2c39a1ed228d8328a78b6f44cd695f7ef47beb3e684cf93604f86/contourpy-1.3.3-cp312-cp312-win_arm64.whl", hash = "sha256:07ce5ed73ecdc4a03ffe3e1b3e3c1166db35ae7584be76f65dbbe28a7791b0cc", size = 193655, upload-time = "2025-07-26T12:01:37.999Z" },
+ { url = "https://files.pythonhosted.org/packages/68/35/0167aad910bbdb9599272bd96d01a9ec6852f36b9455cf2ca67bd4cc2d23/contourpy-1.3.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:177fb367556747a686509d6fef71d221a4b198a3905fe824430e5ea0fda54eb5", size = 293257, upload-time = "2025-07-26T12:01:39.367Z" },
+ { url = "https://files.pythonhosted.org/packages/96/e4/7adcd9c8362745b2210728f209bfbcf7d91ba868a2c5f40d8b58f54c509b/contourpy-1.3.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:d002b6f00d73d69333dac9d0b8d5e84d9724ff9ef044fd63c5986e62b7c9e1b1", size = 274034, upload-time = "2025-07-26T12:01:40.645Z" },
+ { url = "https://files.pythonhosted.org/packages/73/23/90e31ceeed1de63058a02cb04b12f2de4b40e3bef5e082a7c18d9c8ae281/contourpy-1.3.3-cp313-cp313-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:348ac1f5d4f1d66d3322420f01d42e43122f43616e0f194fc1c9f5d830c5b286", size = 334672, upload-time = "2025-07-26T12:01:41.942Z" },
+ { url = "https://files.pythonhosted.org/packages/ed/93/b43d8acbe67392e659e1d984700e79eb67e2acb2bd7f62012b583a7f1b55/contourpy-1.3.3-cp313-cp313-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:655456777ff65c2c548b7c454af9c6f33f16c8884f11083244b5819cc214f1b5", size = 381234, upload-time = "2025-07-26T12:01:43.499Z" },
+ { url = "https://files.pythonhosted.org/packages/46/3b/bec82a3ea06f66711520f75a40c8fc0b113b2a75edb36aa633eb11c4f50f/contourpy-1.3.3-cp313-cp313-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:644a6853d15b2512d67881586bd03f462c7ab755db95f16f14d7e238f2852c67", size = 385169, upload-time = "2025-07-26T12:01:45.219Z" },
+ { url = "https://files.pythonhosted.org/packages/4b/32/e0f13a1c5b0f8572d0ec6ae2f6c677b7991fafd95da523159c19eff0696a/contourpy-1.3.3-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4debd64f124ca62069f313a9cb86656ff087786016d76927ae2cf37846b006c9", size = 362859, upload-time = "2025-07-26T12:01:46.519Z" },
+ { url = "https://files.pythonhosted.org/packages/33/71/e2a7945b7de4e58af42d708a219f3b2f4cff7386e6b6ab0a0fa0033c49a9/contourpy-1.3.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:a15459b0f4615b00bbd1e91f1b9e19b7e63aea7483d03d804186f278c0af2659", size = 1332062, upload-time = "2025-07-26T12:01:48.964Z" },
+ { url = "https://files.pythonhosted.org/packages/12/fc/4e87ac754220ccc0e807284f88e943d6d43b43843614f0a8afa469801db0/contourpy-1.3.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:ca0fdcd73925568ca027e0b17ab07aad764be4706d0a925b89227e447d9737b7", size = 1403932, upload-time = "2025-07-26T12:01:51.979Z" },
+ { url = "https://files.pythonhosted.org/packages/a6/2e/adc197a37443f934594112222ac1aa7dc9a98faf9c3842884df9a9d8751d/contourpy-1.3.3-cp313-cp313-win32.whl", hash = "sha256:b20c7c9a3bf701366556e1b1984ed2d0cedf999903c51311417cf5f591d8c78d", size = 185024, upload-time = "2025-07-26T12:01:53.245Z" },
+ { url = "https://files.pythonhosted.org/packages/18/0b/0098c214843213759692cc638fce7de5c289200a830e5035d1791d7a2338/contourpy-1.3.3-cp313-cp313-win_amd64.whl", hash = "sha256:1cadd8b8969f060ba45ed7c1b714fe69185812ab43bd6b86a9123fe8f99c3263", size = 226578, upload-time = "2025-07-26T12:01:54.422Z" },
+ { url = "https://files.pythonhosted.org/packages/8a/9a/2f6024a0c5995243cd63afdeb3651c984f0d2bc727fd98066d40e141ad73/contourpy-1.3.3-cp313-cp313-win_arm64.whl", hash = "sha256:fd914713266421b7536de2bfa8181aa8c699432b6763a0ea64195ebe28bff6a9", size = 193524, upload-time = "2025-07-26T12:01:55.73Z" },
+ { url = "https://files.pythonhosted.org/packages/c0/b3/f8a1a86bd3298513f500e5b1f5fd92b69896449f6cab6a146a5d52715479/contourpy-1.3.3-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:88df9880d507169449d434c293467418b9f6cbe82edd19284aa0409e7fdb933d", size = 306730, upload-time = "2025-07-26T12:01:57.051Z" },
+ { url = "https://files.pythonhosted.org/packages/3f/11/4780db94ae62fc0c2053909b65dc3246bd7cecfc4f8a20d957ad43aa4ad8/contourpy-1.3.3-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:d06bb1f751ba5d417047db62bca3c8fde202b8c11fb50742ab3ab962c81e8216", size = 287897, upload-time = "2025-07-26T12:01:58.663Z" },
+ { url = "https://files.pythonhosted.org/packages/ae/15/e59f5f3ffdd6f3d4daa3e47114c53daabcb18574a26c21f03dc9e4e42ff0/contourpy-1.3.3-cp313-cp313t-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e4e6b05a45525357e382909a4c1600444e2a45b4795163d3b22669285591c1ae", size = 326751, upload-time = "2025-07-26T12:02:00.343Z" },
+ { url = "https://files.pythonhosted.org/packages/0f/81/03b45cfad088e4770b1dcf72ea78d3802d04200009fb364d18a493857210/contourpy-1.3.3-cp313-cp313t-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:ab3074b48c4e2cf1a960e6bbeb7f04566bf36b1861d5c9d4d8ac04b82e38ba20", size = 375486, upload-time = "2025-07-26T12:02:02.128Z" },
+ { url = "https://files.pythonhosted.org/packages/0c/ba/49923366492ffbdd4486e970d421b289a670ae8cf539c1ea9a09822b371a/contourpy-1.3.3-cp313-cp313t-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:6c3d53c796f8647d6deb1abe867daeb66dcc8a97e8455efa729516b997b8ed99", size = 388106, upload-time = "2025-07-26T12:02:03.615Z" },
+ { url = "https://files.pythonhosted.org/packages/9f/52/5b00ea89525f8f143651f9f03a0df371d3cbd2fccd21ca9b768c7a6500c2/contourpy-1.3.3-cp313-cp313t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:50ed930df7289ff2a8d7afeb9603f8289e5704755c7e5c3bbd929c90c817164b", size = 352548, upload-time = "2025-07-26T12:02:05.165Z" },
+ { url = "https://files.pythonhosted.org/packages/32/1d/a209ec1a3a3452d490f6b14dd92e72280c99ae3d1e73da74f8277d4ee08f/contourpy-1.3.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:4feffb6537d64b84877da813a5c30f1422ea5739566abf0bd18065ac040e120a", size = 1322297, upload-time = "2025-07-26T12:02:07.379Z" },
+ { url = "https://files.pythonhosted.org/packages/bc/9e/46f0e8ebdd884ca0e8877e46a3f4e633f6c9c8c4f3f6e72be3fe075994aa/contourpy-1.3.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:2b7e9480ffe2b0cd2e787e4df64270e3a0440d9db8dc823312e2c940c167df7e", size = 1391023, upload-time = "2025-07-26T12:02:10.171Z" },
+ { url = "https://files.pythonhosted.org/packages/b9/70/f308384a3ae9cd2209e0849f33c913f658d3326900d0ff5d378d6a1422d2/contourpy-1.3.3-cp313-cp313t-win32.whl", hash = "sha256:283edd842a01e3dcd435b1c5116798d661378d83d36d337b8dde1d16a5fc9ba3", size = 196157, upload-time = "2025-07-26T12:02:11.488Z" },
+ { url = "https://files.pythonhosted.org/packages/b2/dd/880f890a6663b84d9e34a6f88cded89d78f0091e0045a284427cb6b18521/contourpy-1.3.3-cp313-cp313t-win_amd64.whl", hash = "sha256:87acf5963fc2b34825e5b6b048f40e3635dd547f590b04d2ab317c2619ef7ae8", size = 240570, upload-time = "2025-07-26T12:02:12.754Z" },
+ { url = "https://files.pythonhosted.org/packages/80/99/2adc7d8ffead633234817ef8e9a87115c8a11927a94478f6bb3d3f4d4f7d/contourpy-1.3.3-cp313-cp313t-win_arm64.whl", hash = "sha256:3c30273eb2a55024ff31ba7d052dde990d7d8e5450f4bbb6e913558b3d6c2301", size = 199713, upload-time = "2025-07-26T12:02:14.4Z" },
+ { url = "https://files.pythonhosted.org/packages/72/8b/4546f3ab60f78c514ffb7d01a0bd743f90de36f0019d1be84d0a708a580a/contourpy-1.3.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:fde6c716d51c04b1c25d0b90364d0be954624a0ee9d60e23e850e8d48353d07a", size = 292189, upload-time = "2025-07-26T12:02:16.095Z" },
+ { url = "https://files.pythonhosted.org/packages/fd/e1/3542a9cb596cadd76fcef413f19c79216e002623158befe6daa03dbfa88c/contourpy-1.3.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:cbedb772ed74ff5be440fa8eee9bd49f64f6e3fc09436d9c7d8f1c287b121d77", size = 273251, upload-time = "2025-07-26T12:02:17.524Z" },
+ { url = "https://files.pythonhosted.org/packages/b1/71/f93e1e9471d189f79d0ce2497007731c1e6bf9ef6d1d61b911430c3db4e5/contourpy-1.3.3-cp314-cp314-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:22e9b1bd7a9b1d652cd77388465dc358dafcd2e217d35552424aa4f996f524f5", size = 335810, upload-time = "2025-07-26T12:02:18.9Z" },
+ { url = "https://files.pythonhosted.org/packages/91/f9/e35f4c1c93f9275d4e38681a80506b5510e9327350c51f8d4a5a724d178c/contourpy-1.3.3-cp314-cp314-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a22738912262aa3e254e4f3cb079a95a67132fc5a063890e224393596902f5a4", size = 382871, upload-time = "2025-07-26T12:02:20.418Z" },
+ { url = "https://files.pythonhosted.org/packages/b5/71/47b512f936f66a0a900d81c396a7e60d73419868fba959c61efed7a8ab46/contourpy-1.3.3-cp314-cp314-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:afe5a512f31ee6bd7d0dda52ec9864c984ca3d66664444f2d72e0dc4eb832e36", size = 386264, upload-time = "2025-07-26T12:02:21.916Z" },
+ { url = "https://files.pythonhosted.org/packages/04/5f/9ff93450ba96b09c7c2b3f81c94de31c89f92292f1380261bd7195bea4ea/contourpy-1.3.3-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f64836de09927cba6f79dcd00fdd7d5329f3fccc633468507079c829ca4db4e3", size = 363819, upload-time = "2025-07-26T12:02:23.759Z" },
+ { url = "https://files.pythonhosted.org/packages/3e/a6/0b185d4cc480ee494945cde102cb0149ae830b5fa17bf855b95f2e70ad13/contourpy-1.3.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:1fd43c3be4c8e5fd6e4f2baeae35ae18176cf2e5cced681cca908addf1cdd53b", size = 1333650, upload-time = "2025-07-26T12:02:26.181Z" },
+ { url = "https://files.pythonhosted.org/packages/43/d7/afdc95580ca56f30fbcd3060250f66cedbde69b4547028863abd8aa3b47e/contourpy-1.3.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:6afc576f7b33cf00996e5c1102dc2a8f7cc89e39c0b55df93a0b78c1bd992b36", size = 1404833, upload-time = "2025-07-26T12:02:28.782Z" },
+ { url = "https://files.pythonhosted.org/packages/e2/e2/366af18a6d386f41132a48f033cbd2102e9b0cf6345d35ff0826cd984566/contourpy-1.3.3-cp314-cp314-win32.whl", hash = "sha256:66c8a43a4f7b8df8b71ee1840e4211a3c8d93b214b213f590e18a1beca458f7d", size = 189692, upload-time = "2025-07-26T12:02:30.128Z" },
+ { url = "https://files.pythonhosted.org/packages/7d/c2/57f54b03d0f22d4044b8afb9ca0e184f8b1afd57b4f735c2fa70883dc601/contourpy-1.3.3-cp314-cp314-win_amd64.whl", hash = "sha256:cf9022ef053f2694e31d630feaacb21ea24224be1c3ad0520b13d844274614fd", size = 232424, upload-time = "2025-07-26T12:02:31.395Z" },
+ { url = "https://files.pythonhosted.org/packages/18/79/a9416650df9b525737ab521aa181ccc42d56016d2123ddcb7b58e926a42c/contourpy-1.3.3-cp314-cp314-win_arm64.whl", hash = "sha256:95b181891b4c71de4bb404c6621e7e2390745f887f2a026b2d99e92c17892339", size = 198300, upload-time = "2025-07-26T12:02:32.956Z" },
+ { url = "https://files.pythonhosted.org/packages/1f/42/38c159a7d0f2b7b9c04c64ab317042bb6952b713ba875c1681529a2932fe/contourpy-1.3.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:33c82d0138c0a062380332c861387650c82e4cf1747aaa6938b9b6516762e772", size = 306769, upload-time = "2025-07-26T12:02:34.2Z" },
+ { url = "https://files.pythonhosted.org/packages/c3/6c/26a8205f24bca10974e77460de68d3d7c63e282e23782f1239f226fcae6f/contourpy-1.3.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:ea37e7b45949df430fe649e5de8351c423430046a2af20b1c1961cae3afcda77", size = 287892, upload-time = "2025-07-26T12:02:35.807Z" },
+ { url = "https://files.pythonhosted.org/packages/66/06/8a475c8ab718ebfd7925661747dbb3c3ee9c82ac834ccb3570be49d129f4/contourpy-1.3.3-cp314-cp314t-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d304906ecc71672e9c89e87c4675dc5c2645e1f4269a5063b99b0bb29f232d13", size = 326748, upload-time = "2025-07-26T12:02:37.193Z" },
+ { url = "https://files.pythonhosted.org/packages/b4/a3/c5ca9f010a44c223f098fccd8b158bb1cb287378a31ac141f04730dc49be/contourpy-1.3.3-cp314-cp314t-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:ca658cd1a680a5c9ea96dc61cdbae1e85c8f25849843aa799dfd3cb370ad4fbe", size = 375554, upload-time = "2025-07-26T12:02:38.894Z" },
+ { url = "https://files.pythonhosted.org/packages/80/5b/68bd33ae63fac658a4145088c1e894405e07584a316738710b636c6d0333/contourpy-1.3.3-cp314-cp314t-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:ab2fd90904c503739a75b7c8c5c01160130ba67944a7b77bbf36ef8054576e7f", size = 388118, upload-time = "2025-07-26T12:02:40.642Z" },
+ { url = "https://files.pythonhosted.org/packages/40/52/4c285a6435940ae25d7410a6c36bda5145839bc3f0beb20c707cda18b9d2/contourpy-1.3.3-cp314-cp314t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b7301b89040075c30e5768810bc96a8e8d78085b47d8be6e4c3f5a0b4ed478a0", size = 352555, upload-time = "2025-07-26T12:02:42.25Z" },
+ { url = "https://files.pythonhosted.org/packages/24/ee/3e81e1dd174f5c7fefe50e85d0892de05ca4e26ef1c9a59c2a57e43b865a/contourpy-1.3.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:2a2a8b627d5cc6b7c41a4beff6c5ad5eb848c88255fda4a8745f7e901b32d8e4", size = 1322295, upload-time = "2025-07-26T12:02:44.668Z" },
+ { url = "https://files.pythonhosted.org/packages/3c/b2/6d913d4d04e14379de429057cd169e5e00f6c2af3bb13e1710bcbdb5da12/contourpy-1.3.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:fd6ec6be509c787f1caf6b247f0b1ca598bef13f4ddeaa126b7658215529ba0f", size = 1391027, upload-time = "2025-07-26T12:02:47.09Z" },
+ { url = "https://files.pythonhosted.org/packages/93/8a/68a4ec5c55a2971213d29a9374913f7e9f18581945a7a31d1a39b5d2dfe5/contourpy-1.3.3-cp314-cp314t-win32.whl", hash = "sha256:e74a9a0f5e3fff48fb5a7f2fd2b9b70a3fe014a67522f79b7cca4c0c7e43c9ae", size = 202428, upload-time = "2025-07-26T12:02:48.691Z" },
+ { url = "https://files.pythonhosted.org/packages/fa/96/fd9f641ffedc4fa3ace923af73b9d07e869496c9cc7a459103e6e978992f/contourpy-1.3.3-cp314-cp314t-win_amd64.whl", hash = "sha256:13b68d6a62db8eafaebb8039218921399baf6e47bf85006fd8529f2a08ef33fc", size = 250331, upload-time = "2025-07-26T12:02:50.137Z" },
+ { url = "https://files.pythonhosted.org/packages/ae/8c/469afb6465b853afff216f9528ffda78a915ff880ed58813ba4faf4ba0b6/contourpy-1.3.3-cp314-cp314t-win_arm64.whl", hash = "sha256:b7448cb5a725bb1e35ce88771b86fba35ef418952474492cf7c764059933ff8b", size = 203831, upload-time = "2025-07-26T12:02:51.449Z" },
+ { url = "https://files.pythonhosted.org/packages/a5/29/8dcfe16f0107943fa92388c23f6e05cff0ba58058c4c95b00280d4c75a14/contourpy-1.3.3-pp311-pypy311_pp73-macosx_10_15_x86_64.whl", hash = "sha256:cd5dfcaeb10f7b7f9dc8941717c6c2ade08f587be2226222c12b25f0483ed497", size = 278809, upload-time = "2025-07-26T12:02:52.74Z" },
+ { url = "https://files.pythonhosted.org/packages/85/a9/8b37ef4f7dafeb335daee3c8254645ef5725be4d9c6aa70b50ec46ef2f7e/contourpy-1.3.3-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:0c1fc238306b35f246d61a1d416a627348b5cf0648648a031e14bb8705fcdfe8", size = 261593, upload-time = "2025-07-26T12:02:54.037Z" },
+ { url = "https://files.pythonhosted.org/packages/0a/59/ebfb8c677c75605cc27f7122c90313fd2f375ff3c8d19a1694bda74aaa63/contourpy-1.3.3-pp311-pypy311_pp73-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:70f9aad7de812d6541d29d2bbf8feb22ff7e1c299523db288004e3157ff4674e", size = 302202, upload-time = "2025-07-26T12:02:55.947Z" },
+ { url = "https://files.pythonhosted.org/packages/3c/37/21972a15834d90bfbfb009b9d004779bd5a07a0ec0234e5ba8f64d5736f4/contourpy-1.3.3-pp311-pypy311_pp73-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:5ed3657edf08512fc3fe81b510e35c2012fbd3081d2e26160f27ca28affec989", size = 329207, upload-time = "2025-07-26T12:02:57.468Z" },
+ { url = "https://files.pythonhosted.org/packages/0c/58/bd257695f39d05594ca4ad60df5bcb7e32247f9951fd09a9b8edb82d1daa/contourpy-1.3.3-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:3d1a3799d62d45c18bafd41c5fa05120b96a28079f2393af559b843d1a966a77", size = 225315, upload-time = "2025-07-26T12:02:58.801Z" },
+]
+
+[[package]]
+name = "cycler"
+version = "0.12.1"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/a9/95/a3dbbb5028f35eafb79008e7522a75244477d2838f38cbb722248dabc2a8/cycler-0.12.1.tar.gz", hash = "sha256:88bb128f02ba341da8ef447245a9e138fae777f6a23943da4540077d3601eb1c", size = 7615, upload-time = "2023-10-07T05:32:18.335Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/e7/05/c19819d5e3d95294a6f5947fb9b9629efb316b96de511b418c53d245aae6/cycler-0.12.1-py3-none-any.whl", hash = "sha256:85cef7cff222d8644161529808465972e51340599459b8ac3ccbac5a854e0d30", size = 8321, upload-time = "2023-10-07T05:32:16.783Z" },
+]
+
+[[package]]
+name = "cymem"
+version = "2.0.11"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/f2/4a/1acd761fb6ac4c560e823ce40536a62f886f2d59b2763b5c3fc7e9d92101/cymem-2.0.11.tar.gz", hash = "sha256:efe49a349d4a518be6b6c6b255d4a80f740a341544bde1a807707c058b88d0bd", size = 10346, upload-time = "2025-01-16T21:50:41.045Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/6d/55/f453f2b2f560e057f20eb2acdaafbf6488d72a6e8a36a4aef30f6053a51c/cymem-2.0.11-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:1b4dd8f8c2475c7c9948eefa89c790d83134600858d8d43b90276efd8df3882e", size = 41886, upload-time = "2025-01-16T21:49:17.183Z" },
+ { url = "https://files.pythonhosted.org/packages/a6/9d/03299eff35bd4fd80db33e4fd516661b82bb7b898cb677829acf22391ede/cymem-2.0.11-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d46ba0d2e0f749195297d16f2286b55af7d7c084db2b853fdfccece2c000c5dc", size = 41696, upload-time = "2025-01-16T21:49:18.788Z" },
+ { url = "https://files.pythonhosted.org/packages/d3/0c/90aa41f258a67ea210886c5c73f88dc9f120b7a20e6b5d92c5ce73a68276/cymem-2.0.11-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:739c4336b9d04ce9761851e9260ef77508d4a86ee3060e41302bfb6fa82c37de", size = 203719, upload-time = "2025-01-16T21:49:23.13Z" },
+ { url = "https://files.pythonhosted.org/packages/52/d1/dc4a72aa2049c34a53a220290b1a59fadae61929dff3a6e1a830a22971fe/cymem-2.0.11-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a69c470c2fb118161f49761f9137384f46723c77078b659bba33858e19e46b49", size = 204763, upload-time = "2025-01-16T21:49:26.164Z" },
+ { url = "https://files.pythonhosted.org/packages/69/51/86ed323585530558bcdda1324c570abe032db2c1d5afd1c5e8e3e8fde63a/cymem-2.0.11-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:40159f6c92627438de970fd761916e745d70dfd84a7dcc28c1627eb49cee00d8", size = 193964, upload-time = "2025-01-16T21:49:28.057Z" },
+ { url = "https://files.pythonhosted.org/packages/ed/0c/aee4ad2996a4e24342228ccf44d7835c7784042f0ee0c47ad33be1443f18/cymem-2.0.11-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:f503f98e6aa333fffbe657a6854f13a9c3de68860795ae21171284213b9c5c09", size = 195002, upload-time = "2025-01-16T21:49:31.329Z" },
+ { url = "https://files.pythonhosted.org/packages/eb/d5/eda823d639258d2ed1db83403c991a9a57d5a4ddea3bf08e59060809a9aa/cymem-2.0.11-cp310-cp310-win_amd64.whl", hash = "sha256:7f05ed5920cc92d6b958ec5da55bd820d326fe9332b90660e6fa67e3b476ceb1", size = 39079, upload-time = "2025-01-16T21:49:33.777Z" },
+ { url = "https://files.pythonhosted.org/packages/03/e3/d98e3976f4ffa99cddebc1ce379d4d62e3eb1da22285267f902c99cc3395/cymem-2.0.11-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3ee54039aad3ef65de82d66c40516bf54586287b46d32c91ea0530c34e8a2745", size = 42005, upload-time = "2025-01-16T21:49:34.977Z" },
+ { url = "https://files.pythonhosted.org/packages/41/b4/7546faf2ab63e59befc95972316d62276cec153f7d4d60e7b0d5e08f0602/cymem-2.0.11-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:4c05ef75b5db217be820604e43a47ccbbafea98ab6659d07cea92fa3c864ea58", size = 41747, upload-time = "2025-01-16T21:49:36.108Z" },
+ { url = "https://files.pythonhosted.org/packages/7d/4e/042f372e5b3eb7f5f3dd7677161771d301de2b6fa3f7c74e1cebcd502552/cymem-2.0.11-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a8d5381e5793ce531bac0dbc00829c8381f18605bb67e4b61d34f8850463da40", size = 217647, upload-time = "2025-01-16T21:49:37.433Z" },
+ { url = "https://files.pythonhosted.org/packages/48/cb/2207679e4b92701f78cf141e1ab4f81f55247dbe154eb426b842a0a993de/cymem-2.0.11-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f2b9d3f42d7249ac81802135cad51d707def058001a32f73fc7fbf3de7045ac7", size = 218857, upload-time = "2025-01-16T21:49:40.09Z" },
+ { url = "https://files.pythonhosted.org/packages/31/7a/76ae3b7a39ab2531029d281e43fcfcaad728c2341b150a81a3a1f5587cf3/cymem-2.0.11-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:39b78f2195d20b75c2d465732f6b8e8721c5d4eb012777c2cb89bdb45a043185", size = 206148, upload-time = "2025-01-16T21:49:41.383Z" },
+ { url = "https://files.pythonhosted.org/packages/25/f9/d0fc0191ac79f15638ddb59237aa76f234691374d7d7950e10f384bd8a25/cymem-2.0.11-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:2203bd6525a80d8fd0c94654a263af21c0387ae1d5062cceaebb652bf9bad7bc", size = 207112, upload-time = "2025-01-16T21:49:43.986Z" },
+ { url = "https://files.pythonhosted.org/packages/56/c8/75f75889401b20f4c3a7c5965dda09df42913e904ddc2ffe7ef3bdf25061/cymem-2.0.11-cp311-cp311-win_amd64.whl", hash = "sha256:aa54af7314de400634448da1f935b61323da80a49484074688d344fb2036681b", size = 39360, upload-time = "2025-01-16T21:49:45.479Z" },
+ { url = "https://files.pythonhosted.org/packages/71/67/0d74f7e9d79f934368a78fb1d1466b94bebdbff14f8ae94dd3e4ea8738bb/cymem-2.0.11-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:a0fbe19ce653cd688842d81e5819dc63f911a26e192ef30b0b89f0ab2b192ff2", size = 42621, upload-time = "2025-01-16T21:49:46.585Z" },
+ { url = "https://files.pythonhosted.org/packages/4a/d6/f7a19c63b48efc3f00a3ee8d69070ac90202e1e378f6cf81b8671f0cf762/cymem-2.0.11-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:de72101dc0e6326f6a2f73e05a438d1f3c6110d41044236d0fbe62925091267d", size = 42249, upload-time = "2025-01-16T21:49:48.973Z" },
+ { url = "https://files.pythonhosted.org/packages/d7/60/cdc434239813eef547fb99b6d0bafe31178501702df9b77c4108c9a216f6/cymem-2.0.11-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bee4395917f6588b8ac1699499128842768b391fe8896e8626950b4da5f9a406", size = 224758, upload-time = "2025-01-16T21:49:51.382Z" },
+ { url = "https://files.pythonhosted.org/packages/1d/68/8fa6efae17cd3b2ba9a2f83b824867c5b65b06f7aec3f8a0d0cabdeffb9b/cymem-2.0.11-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5b02f2b17d760dc3fe5812737b1ce4f684641cdd751d67761d333a3b5ea97b83", size = 227995, upload-time = "2025-01-16T21:49:54.538Z" },
+ { url = "https://files.pythonhosted.org/packages/e4/f3/ceda70bf6447880140602285b7c6fa171cb7c78b623d35345cc32505cd06/cymem-2.0.11-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:04ee6b4041ddec24512d6e969ed6445e57917f01e73b9dabbe17b7e6b27fef05", size = 215325, upload-time = "2025-01-16T21:49:57.229Z" },
+ { url = "https://files.pythonhosted.org/packages/d3/47/6915eaa521e1ce7a0ba480eecb6870cb4f681bcd64ced88c2f0ed7a744b4/cymem-2.0.11-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:e1048dae7e627ee25f22c87bb670b13e06bc0aecc114b89b959a798d487d1bf4", size = 216447, upload-time = "2025-01-16T21:50:00.432Z" },
+ { url = "https://files.pythonhosted.org/packages/7b/be/8e02bdd31e557f642741a06c8e886782ef78f0b00daffd681922dc9bbc88/cymem-2.0.11-cp312-cp312-win_amd64.whl", hash = "sha256:0c269c7a867d74adeb9db65fa1d226342aacf44d64b7931282f0b0eb22eb6275", size = 39283, upload-time = "2025-01-16T21:50:03.384Z" },
+ { url = "https://files.pythonhosted.org/packages/bd/90/b064e2677e27a35cf3605146abc3285d4f599cc1b6c18fc445ae876dd1e3/cymem-2.0.11-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f4a311c82f743275c84f708df89ac5bf60ddefe4713d532000c887931e22941f", size = 42389, upload-time = "2025-01-16T21:50:05.925Z" },
+ { url = "https://files.pythonhosted.org/packages/fd/60/7aa0561a6c1f0d42643b02c4fdeb2a16181b0ff4e85d73d2d80c6689e92a/cymem-2.0.11-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:02ed92bead896cca36abad00502b14fa651bdf5d8319461126a2d5ac8c9674c5", size = 41948, upload-time = "2025-01-16T21:50:08.375Z" },
+ { url = "https://files.pythonhosted.org/packages/5f/4e/88a29cc5575374982e527b4ebcab3781bdc826ce693c6418a0f836544246/cymem-2.0.11-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:44ddd3588379f8f376116384af99e3fb5f90091d90f520c341942618bf22f05e", size = 219382, upload-time = "2025-01-16T21:50:13.089Z" },
+ { url = "https://files.pythonhosted.org/packages/9b/3a/8f96e167e93b7f7ec105ed7b25c77bbf215d15bcbf4a24082cdc12234cd6/cymem-2.0.11-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:87ec985623624bbd298762d8163fc194a096cb13282731a017e09ff8a60bb8b1", size = 222974, upload-time = "2025-01-16T21:50:17.969Z" },
+ { url = "https://files.pythonhosted.org/packages/6a/fc/ce016bb0c66a4776345fac7508fddec3b739b9dd4363094ac89cce048832/cymem-2.0.11-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:e3385a47285435848e0ed66cfd29b35f3ed8703218e2b17bd7a0c053822f26bf", size = 213426, upload-time = "2025-01-16T21:50:19.349Z" },
+ { url = "https://files.pythonhosted.org/packages/5c/c8/accf7cc768f751447a5050b14a195af46798bc22767ac25f49b02861b1eb/cymem-2.0.11-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:5461e65340d6572eb64deadce79242a446a1d39cb7bf70fe7b7e007eb0d799b0", size = 219195, upload-time = "2025-01-16T21:50:21.407Z" },
+ { url = "https://files.pythonhosted.org/packages/74/65/c162fbac63e867a055240b6600b92ef96c0eb7a1895312ac53c4be93d056/cymem-2.0.11-cp313-cp313-win_amd64.whl", hash = "sha256:25da111adf425c29af0cfd9fecfec1c71c8d82e2244a85166830a0817a66ada7", size = 39090, upload-time = "2025-01-16T21:50:24.239Z" },
+]
+
+[[package]]
+name = "datasets"
+version = "4.3.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "dill" },
+ { name = "filelock" },
+ { name = "fsspec", extra = ["http"] },
+ { name = "httpx" },
+ { name = "huggingface-hub" },
+ { name = "multiprocess" },
+ { name = "numpy", version = "2.2.6", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.11'" },
+ { name = "numpy", version = "2.3.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.11'" },
+ { name = "packaging" },
+ { name = "pandas" },
+ { name = "pyarrow" },
+ { name = "pyyaml" },
+ { name = "requests" },
+ { name = "tqdm" },
+ { name = "xxhash" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/2a/47/325206ac160f7699ed9f1798afa8f8f8d5189b03bf3815654859ac1d5cba/datasets-4.3.0.tar.gz", hash = "sha256:bc9118ed9afd92346c5be7ed3aaa00177eb907c25467f9d072a0d22777efbd2b", size = 582801, upload-time = "2025-10-23T16:31:51.547Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/ca/51/409a8184ed35453d9cbb3d6b20d524b1115c2c2d117b85d5e9b06cd70b45/datasets-4.3.0-py3-none-any.whl", hash = "sha256:0ea157e72138b3ca6c7d2415f19a164ecf7d4c4fa72da2a570da286882e96903", size = 506846, upload-time = "2025-10-23T16:31:49.965Z" },
+]
+
+[[package]]
+name = "dill"
+version = "0.4.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/12/80/630b4b88364e9a8c8c5797f4602d0f76ef820909ee32f0bacb9f90654042/dill-0.4.0.tar.gz", hash = "sha256:0633f1d2df477324f53a895b02c901fb961bdbf65a17122586ea7019292cbcf0", size = 186976, upload-time = "2025-04-16T00:41:48.867Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/50/3d/9373ad9c56321fdab5b41197068e1d8c25883b3fea29dd361f9b55116869/dill-0.4.0-py3-none-any.whl", hash = "sha256:44f54bf6412c2c8464c14e8243eb163690a9800dbe2c367330883b19c7561049", size = 119668, upload-time = "2025-04-16T00:41:47.671Z" },
+]
+
+[[package]]
+name = "evaluate"
+version = "0.4.6"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "datasets" },
+ { name = "dill" },
+ { name = "fsspec", extra = ["http"] },
+ { name = "huggingface-hub" },
+ { name = "multiprocess" },
+ { name = "numpy", version = "2.2.6", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.11'" },
+ { name = "numpy", version = "2.3.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.11'" },
+ { name = "packaging" },
+ { name = "pandas" },
+ { name = "requests" },
+ { name = "tqdm" },
+ { name = "xxhash" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/ad/d0/0c17a8e6e8dc7245f22dea860557c32bae50fc4d287ae030cb0e8ab8720f/evaluate-0.4.6.tar.gz", hash = "sha256:e07036ca12b3c24331f83ab787f21cc2dbf3631813a1631e63e40897c69a3f21", size = 65716, upload-time = "2025-09-18T13:06:30.581Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/3e/af/3e990d8d4002bbc9342adb4facd59506e653da93b2417de0fa6027cb86b1/evaluate-0.4.6-py3-none-any.whl", hash = "sha256:bca85bc294f338377b7ac2f861e21c308b11b2a285f510d7d5394d5df437db29", size = 84069, upload-time = "2025-09-18T13:06:29.265Z" },
+]
+
+[[package]]
+name = "exceptiongroup"
+version = "1.3.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "typing-extensions", marker = "python_full_version < '3.11'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/50/79/66800aadf48771f6b62f7eb014e352e5d06856655206165d775e675a02c9/exceptiongroup-1.3.1.tar.gz", hash = "sha256:8b412432c6055b0b7d14c310000ae93352ed6754f70fa8f7c34141f91c4e3219", size = 30371, upload-time = "2025-11-21T23:01:54.787Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/8a/0e/97c33bf5009bdbac74fd2beace167cab3f978feb69cc36f1ef79360d6c4e/exceptiongroup-1.3.1-py3-none-any.whl", hash = "sha256:a7a39a3bd276781e98394987d3a5701d0c4edffb633bb7a5144577f82c773598", size = 16740, upload-time = "2025-11-21T23:01:53.443Z" },
+]
+
+[[package]]
+name = "filelock"
+version = "3.20.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/58/46/0028a82567109b5ef6e4d2a1f04a583fb513e6cf9527fcdd09afd817deeb/filelock-3.20.0.tar.gz", hash = "sha256:711e943b4ec6be42e1d4e6690b48dc175c822967466bb31c0c293f34334c13f4", size = 18922, upload-time = "2025-10-08T18:03:50.056Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/76/91/7216b27286936c16f5b4d0c530087e4a54eead683e6b0b73dd0c64844af6/filelock-3.20.0-py3-none-any.whl", hash = "sha256:339b4732ffda5cd79b13f4e2711a31b0365ce445d95d243bb996273d072546a2", size = 16054, upload-time = "2025-10-08T18:03:48.35Z" },
+]
+
+[[package]]
+name = "flashtrace"
+version = "0.1.1"
+source = { editable = "." }
+dependencies = [
+ { name = "accelerate" },
+ { name = "matplotlib" },
+ { name = "networkx" },
+ { name = "numpy", version = "2.2.6", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.11'" },
+ { name = "numpy", version = "2.3.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.11'" },
+ { name = "seaborn" },
+ { name = "spacy" },
+ { name = "torch" },
+ { name = "tqdm" },
+ { name = "transformers" },
+ { name = "wordfreq" },
+]
+
+[package.optional-dependencies]
+baselines = [
+ { name = "bert-score" },
+ { name = "evaluate" },
+ { name = "sentence-transformers" },
+]
+dev = [
+ { name = "pytest" },
+]
+eval = [
+ { name = "datasets" },
+ { name = "evaluate" },
+]
+
+[package.dev-dependencies]
+dev = [
+ { name = "pytest" },
+]
+
+[package.metadata]
+requires-dist = [
+ { name = "accelerate", specifier = ">=1.11.0" },
+ { name = "bert-score", marker = "extra == 'baselines'", specifier = ">=0.3.13" },
+ { name = "datasets", marker = "extra == 'eval'", specifier = ">=2.21" },
+ { name = "evaluate", marker = "extra == 'baselines'", specifier = ">=0.4.6" },
+ { name = "evaluate", marker = "extra == 'eval'", specifier = ">=0.4.6" },
+ { name = "matplotlib", specifier = ">=3.6" },
+ { name = "networkx", specifier = ">=3.3" },
+ { name = "numpy", specifier = ">=2.0" },
+ { name = "pytest", marker = "extra == 'dev'", specifier = ">=8.0" },
+ { name = "seaborn", specifier = ">=0.13.2" },
+ { name = "sentence-transformers", marker = "extra == 'baselines'", specifier = ">=4.1.0" },
+ { name = "spacy", specifier = ">=3.8" },
+ { name = "torch", specifier = ">=2.5" },
+ { name = "tqdm", specifier = ">=4.67" },
+ { name = "transformers", specifier = ">=4.53" },
+ { name = "wordfreq", specifier = ">=3.1.1" },
+]
+provides-extras = ["baselines", "eval", "dev"]
+
+[package.metadata.requires-dev]
+dev = [{ name = "pytest", specifier = ">=8.0" }]
+
+[[package]]
+name = "fonttools"
+version = "4.60.1"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/4b/42/97a13e47a1e51a5a7142475bbcf5107fe3a68fc34aef331c897d5fb98ad0/fonttools-4.60.1.tar.gz", hash = "sha256:ef00af0439ebfee806b25f24c8f92109157ff3fac5731dc7867957812e87b8d9", size = 3559823, upload-time = "2025-09-29T21:13:27.129Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/26/70/03e9d89a053caff6ae46053890eba8e4a5665a7c5638279ed4492e6d4b8b/fonttools-4.60.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:9a52f254ce051e196b8fe2af4634c2d2f02c981756c6464dc192f1b6050b4e28", size = 2810747, upload-time = "2025-09-29T21:10:59.653Z" },
+ { url = "https://files.pythonhosted.org/packages/6f/41/449ad5aff9670ab0df0f61ee593906b67a36d7e0b4d0cd7fa41ac0325bf5/fonttools-4.60.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:c7420a2696a44650120cdd269a5d2e56a477e2bfa9d95e86229059beb1c19e15", size = 2346909, upload-time = "2025-09-29T21:11:02.882Z" },
+ { url = "https://files.pythonhosted.org/packages/9a/18/e5970aa96c8fad1cb19a9479cc3b7602c0c98d250fcdc06a5da994309c50/fonttools-4.60.1-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ee0c0b3b35b34f782afc673d503167157094a16f442ace7c6c5e0ca80b08f50c", size = 4864572, upload-time = "2025-09-29T21:11:05.096Z" },
+ { url = "https://files.pythonhosted.org/packages/ce/20/9b2b4051b6ec6689480787d506b5003f72648f50972a92d04527a456192c/fonttools-4.60.1-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:282dafa55f9659e8999110bd8ed422ebe1c8aecd0dc396550b038e6c9a08b8ea", size = 4794635, upload-time = "2025-09-29T21:11:08.651Z" },
+ { url = "https://files.pythonhosted.org/packages/10/52/c791f57347c1be98f8345e3dca4ac483eb97666dd7c47f3059aeffab8b59/fonttools-4.60.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:4ba4bd646e86de16160f0fb72e31c3b9b7d0721c3e5b26b9fa2fc931dfdb2652", size = 4843878, upload-time = "2025-09-29T21:11:10.893Z" },
+ { url = "https://files.pythonhosted.org/packages/69/e9/35c24a8d01644cee8c090a22fad34d5b61d1e0a8ecbc9945ad785ebf2e9e/fonttools-4.60.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:0b0835ed15dd5b40d726bb61c846a688f5b4ce2208ec68779bc81860adb5851a", size = 4954555, upload-time = "2025-09-29T21:11:13.24Z" },
+ { url = "https://files.pythonhosted.org/packages/f7/86/fb1e994971be4bdfe3a307de6373ef69a9df83fb66e3faa9c8114893d4cc/fonttools-4.60.1-cp310-cp310-win32.whl", hash = "sha256:1525796c3ffe27bb6268ed2a1bb0dcf214d561dfaf04728abf01489eb5339dce", size = 2232019, upload-time = "2025-09-29T21:11:15.73Z" },
+ { url = "https://files.pythonhosted.org/packages/40/84/62a19e2bd56f0e9fb347486a5b26376bade4bf6bbba64dda2c103bd08c94/fonttools-4.60.1-cp310-cp310-win_amd64.whl", hash = "sha256:268ecda8ca6cb5c4f044b1fb9b3b376e8cd1b361cef275082429dc4174907038", size = 2276803, upload-time = "2025-09-29T21:11:18.152Z" },
+ { url = "https://files.pythonhosted.org/packages/ea/85/639aa9bface1537e0fb0f643690672dde0695a5bbbc90736bc571b0b1941/fonttools-4.60.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:7b4c32e232a71f63a5d00259ca3d88345ce2a43295bb049d21061f338124246f", size = 2831872, upload-time = "2025-09-29T21:11:20.329Z" },
+ { url = "https://files.pythonhosted.org/packages/6b/47/3c63158459c95093be9618794acb1067b3f4d30dcc5c3e8114b70e67a092/fonttools-4.60.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3630e86c484263eaac71d117085d509cbcf7b18f677906824e4bace598fb70d2", size = 2356990, upload-time = "2025-09-29T21:11:22.754Z" },
+ { url = "https://files.pythonhosted.org/packages/94/dd/1934b537c86fcf99f9761823f1fc37a98fbd54568e8e613f29a90fed95a9/fonttools-4.60.1-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5c1015318e4fec75dd4943ad5f6a206d9727adf97410d58b7e32ab644a807914", size = 5042189, upload-time = "2025-09-29T21:11:25.061Z" },
+ { url = "https://files.pythonhosted.org/packages/d2/d2/9f4e4c4374dd1daa8367784e1bd910f18ba886db1d6b825b12edf6db3edc/fonttools-4.60.1-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:e6c58beb17380f7c2ea181ea11e7db8c0ceb474c9dd45f48e71e2cb577d146a1", size = 4978683, upload-time = "2025-09-29T21:11:27.693Z" },
+ { url = "https://files.pythonhosted.org/packages/cc/c4/0fb2dfd1ecbe9a07954cc13414713ed1eab17b1c0214ef07fc93df234a47/fonttools-4.60.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:ec3681a0cb34c255d76dd9d865a55f260164adb9fa02628415cdc2d43ee2c05d", size = 5021372, upload-time = "2025-09-29T21:11:30.257Z" },
+ { url = "https://files.pythonhosted.org/packages/0c/d5/495fc7ae2fab20223cc87179a8f50f40f9a6f821f271ba8301ae12bb580f/fonttools-4.60.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:f4b5c37a5f40e4d733d3bbaaef082149bee5a5ea3156a785ff64d949bd1353fa", size = 5132562, upload-time = "2025-09-29T21:11:32.737Z" },
+ { url = "https://files.pythonhosted.org/packages/bc/fa/021dab618526323c744e0206b3f5c8596a2e7ae9aa38db5948a131123e83/fonttools-4.60.1-cp311-cp311-win32.whl", hash = "sha256:398447f3d8c0c786cbf1209711e79080a40761eb44b27cdafffb48f52bcec258", size = 2230288, upload-time = "2025-09-29T21:11:35.015Z" },
+ { url = "https://files.pythonhosted.org/packages/bb/78/0e1a6d22b427579ea5c8273e1c07def2f325b977faaf60bb7ddc01456cb1/fonttools-4.60.1-cp311-cp311-win_amd64.whl", hash = "sha256:d066ea419f719ed87bc2c99a4a4bfd77c2e5949cb724588b9dd58f3fd90b92bf", size = 2278184, upload-time = "2025-09-29T21:11:37.434Z" },
+ { url = "https://files.pythonhosted.org/packages/e3/f7/a10b101b7a6f8836a5adb47f2791f2075d044a6ca123f35985c42edc82d8/fonttools-4.60.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:7b0c6d57ab00dae9529f3faf187f2254ea0aa1e04215cf2f1a8ec277c96661bc", size = 2832953, upload-time = "2025-09-29T21:11:39.616Z" },
+ { url = "https://files.pythonhosted.org/packages/ed/fe/7bd094b59c926acf2304d2151354ddbeb74b94812f3dc943c231db09cb41/fonttools-4.60.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:839565cbf14645952d933853e8ade66a463684ed6ed6c9345d0faf1f0e868877", size = 2352706, upload-time = "2025-09-29T21:11:41.826Z" },
+ { url = "https://files.pythonhosted.org/packages/c0/ca/4bb48a26ed95a1e7eba175535fe5805887682140ee0a0d10a88e1de84208/fonttools-4.60.1-cp312-cp312-manylinux1_x86_64.manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:8177ec9676ea6e1793c8a084a90b65a9f778771998eb919d05db6d4b1c0b114c", size = 4923716, upload-time = "2025-09-29T21:11:43.893Z" },
+ { url = "https://files.pythonhosted.org/packages/b8/9f/2cb82999f686c1d1ddf06f6ae1a9117a880adbec113611cc9d22b2fdd465/fonttools-4.60.1-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:996a4d1834524adbb423385d5a629b868ef9d774670856c63c9a0408a3063401", size = 4968175, upload-time = "2025-09-29T21:11:46.439Z" },
+ { url = "https://files.pythonhosted.org/packages/18/79/be569699e37d166b78e6218f2cde8c550204f2505038cdd83b42edc469b9/fonttools-4.60.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:a46b2f450bc79e06ef3b6394f0c68660529ed51692606ad7f953fc2e448bc903", size = 4911031, upload-time = "2025-09-29T21:11:48.977Z" },
+ { url = "https://files.pythonhosted.org/packages/cc/9f/89411cc116effaec5260ad519162f64f9c150e5522a27cbb05eb62d0c05b/fonttools-4.60.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:6ec722ee589e89a89f5b7574f5c45604030aa6ae24cb2c751e2707193b466fed", size = 5062966, upload-time = "2025-09-29T21:11:54.344Z" },
+ { url = "https://files.pythonhosted.org/packages/62/a1/f888221934b5731d46cb9991c7a71f30cb1f97c0ef5fcf37f8da8fce6c8e/fonttools-4.60.1-cp312-cp312-win32.whl", hash = "sha256:b2cf105cee600d2de04ca3cfa1f74f1127f8455b71dbad02b9da6ec266e116d6", size = 2218750, upload-time = "2025-09-29T21:11:56.601Z" },
+ { url = "https://files.pythonhosted.org/packages/88/8f/a55b5550cd33cd1028601df41acd057d4be20efa5c958f417b0c0613924d/fonttools-4.60.1-cp312-cp312-win_amd64.whl", hash = "sha256:992775c9fbe2cf794786fa0ffca7f09f564ba3499b8fe9f2f80bd7197db60383", size = 2267026, upload-time = "2025-09-29T21:11:58.852Z" },
+ { url = "https://files.pythonhosted.org/packages/7c/5b/cdd2c612277b7ac7ec8c0c9bc41812c43dc7b2d5f2b0897e15fdf5a1f915/fonttools-4.60.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:6f68576bb4bbf6060c7ab047b1574a1ebe5c50a17de62830079967b211059ebb", size = 2825777, upload-time = "2025-09-29T21:12:01.22Z" },
+ { url = "https://files.pythonhosted.org/packages/d6/8a/de9cc0540f542963ba5e8f3a1f6ad48fa211badc3177783b9d5cadf79b5d/fonttools-4.60.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:eedacb5c5d22b7097482fa834bda0dafa3d914a4e829ec83cdea2a01f8c813c4", size = 2348080, upload-time = "2025-09-29T21:12:03.785Z" },
+ { url = "https://files.pythonhosted.org/packages/2d/8b/371ab3cec97ee3fe1126b3406b7abd60c8fec8975fd79a3c75cdea0c3d83/fonttools-4.60.1-cp313-cp313-manylinux1_x86_64.manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:b33a7884fabd72bdf5f910d0cf46be50dce86a0362a65cfc746a4168c67eb96c", size = 4903082, upload-time = "2025-09-29T21:12:06.382Z" },
+ { url = "https://files.pythonhosted.org/packages/04/05/06b1455e4bc653fcb2117ac3ef5fa3a8a14919b93c60742d04440605d058/fonttools-4.60.1-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:2409d5fb7b55fd70f715e6d34e7a6e4f7511b8ad29a49d6df225ee76da76dd77", size = 4960125, upload-time = "2025-09-29T21:12:09.314Z" },
+ { url = "https://files.pythonhosted.org/packages/8e/37/f3b840fcb2666f6cb97038793606bdd83488dca2d0b0fc542ccc20afa668/fonttools-4.60.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:c8651e0d4b3bdeda6602b85fdc2abbefc1b41e573ecb37b6779c4ca50753a199", size = 4901454, upload-time = "2025-09-29T21:12:11.931Z" },
+ { url = "https://files.pythonhosted.org/packages/fd/9e/eb76f77e82f8d4a46420aadff12cec6237751b0fb9ef1de373186dcffb5f/fonttools-4.60.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:145daa14bf24824b677b9357c5e44fd8895c2a8f53596e1b9ea3496081dc692c", size = 5044495, upload-time = "2025-09-29T21:12:15.241Z" },
+ { url = "https://files.pythonhosted.org/packages/f8/b3/cede8f8235d42ff7ae891bae8d619d02c8ac9fd0cfc450c5927a6200c70d/fonttools-4.60.1-cp313-cp313-win32.whl", hash = "sha256:2299df884c11162617a66b7c316957d74a18e3758c0274762d2cc87df7bc0272", size = 2217028, upload-time = "2025-09-29T21:12:17.96Z" },
+ { url = "https://files.pythonhosted.org/packages/75/4d/b022c1577807ce8b31ffe055306ec13a866f2337ecee96e75b24b9b753ea/fonttools-4.60.1-cp313-cp313-win_amd64.whl", hash = "sha256:a3db56f153bd4c5c2b619ab02c5db5192e222150ce5a1bc10f16164714bc39ac", size = 2266200, upload-time = "2025-09-29T21:12:20.14Z" },
+ { url = "https://files.pythonhosted.org/packages/9a/83/752ca11c1aa9a899b793a130f2e466b79ea0cf7279c8d79c178fc954a07b/fonttools-4.60.1-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:a884aef09d45ba1206712c7dbda5829562d3fea7726935d3289d343232ecb0d3", size = 2822830, upload-time = "2025-09-29T21:12:24.406Z" },
+ { url = "https://files.pythonhosted.org/packages/57/17/bbeab391100331950a96ce55cfbbff27d781c1b85ebafb4167eae50d9fe3/fonttools-4.60.1-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:8a44788d9d91df72d1a5eac49b31aeb887a5f4aab761b4cffc4196c74907ea85", size = 2345524, upload-time = "2025-09-29T21:12:26.819Z" },
+ { url = "https://files.pythonhosted.org/packages/3d/2e/d4831caa96d85a84dd0da1d9f90d81cec081f551e0ea216df684092c6c97/fonttools-4.60.1-cp314-cp314-manylinux1_x86_64.manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:e852d9dda9f93ad3651ae1e3bb770eac544ec93c3807888798eccddf84596537", size = 4843490, upload-time = "2025-09-29T21:12:29.123Z" },
+ { url = "https://files.pythonhosted.org/packages/49/13/5e2ea7c7a101b6fc3941be65307ef8df92cbbfa6ec4804032baf1893b434/fonttools-4.60.1-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:154cb6ee417e417bf5f7c42fe25858c9140c26f647c7347c06f0cc2d47eff003", size = 4944184, upload-time = "2025-09-29T21:12:31.414Z" },
+ { url = "https://files.pythonhosted.org/packages/0c/2b/cf9603551c525b73fc47c52ee0b82a891579a93d9651ed694e4e2cd08bb8/fonttools-4.60.1-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:5664fd1a9ea7f244487ac8f10340c4e37664675e8667d6fee420766e0fb3cf08", size = 4890218, upload-time = "2025-09-29T21:12:33.936Z" },
+ { url = "https://files.pythonhosted.org/packages/fd/2f/933d2352422e25f2376aae74f79eaa882a50fb3bfef3c0d4f50501267101/fonttools-4.60.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:583b7f8e3c49486e4d489ad1deacfb8d5be54a8ef34d6df824f6a171f8511d99", size = 4999324, upload-time = "2025-09-29T21:12:36.637Z" },
+ { url = "https://files.pythonhosted.org/packages/38/99/234594c0391221f66216bc2c886923513b3399a148defaccf81dc3be6560/fonttools-4.60.1-cp314-cp314-win32.whl", hash = "sha256:66929e2ea2810c6533a5184f938502cfdaea4bc3efb7130d8cc02e1c1b4108d6", size = 2220861, upload-time = "2025-09-29T21:12:39.108Z" },
+ { url = "https://files.pythonhosted.org/packages/3e/1d/edb5b23726dde50fc4068e1493e4fc7658eeefcaf75d4c5ffce067d07ae5/fonttools-4.60.1-cp314-cp314-win_amd64.whl", hash = "sha256:f3d5be054c461d6a2268831f04091dc82753176f6ea06dc6047a5e168265a987", size = 2270934, upload-time = "2025-09-29T21:12:41.339Z" },
+ { url = "https://files.pythonhosted.org/packages/fb/da/1392aaa2170adc7071fe7f9cfd181a5684a7afcde605aebddf1fb4d76df5/fonttools-4.60.1-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:b6379e7546ba4ae4b18f8ae2b9bc5960936007a1c0e30b342f662577e8bc3299", size = 2894340, upload-time = "2025-09-29T21:12:43.774Z" },
+ { url = "https://files.pythonhosted.org/packages/bf/a7/3b9f16e010d536ce567058b931a20b590d8f3177b2eda09edd92e392375d/fonttools-4.60.1-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:9d0ced62b59e0430b3690dbc5373df1c2aa7585e9a8ce38eff87f0fd993c5b01", size = 2375073, upload-time = "2025-09-29T21:12:46.437Z" },
+ { url = "https://files.pythonhosted.org/packages/9b/b5/e9bcf51980f98e59bb5bb7c382a63c6f6cac0eec5f67de6d8f2322382065/fonttools-4.60.1-cp314-cp314t-manylinux1_x86_64.manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:875cb7764708b3132637f6c5fb385b16eeba0f7ac9fa45a69d35e09b47045801", size = 4849758, upload-time = "2025-09-29T21:12:48.694Z" },
+ { url = "https://files.pythonhosted.org/packages/e3/dc/1d2cf7d1cba82264b2f8385db3f5960e3d8ce756b4dc65b700d2c496f7e9/fonttools-4.60.1-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:a184b2ea57b13680ab6d5fbde99ccef152c95c06746cb7718c583abd8f945ccc", size = 5085598, upload-time = "2025-09-29T21:12:51.081Z" },
+ { url = "https://files.pythonhosted.org/packages/5d/4d/279e28ba87fb20e0c69baf72b60bbf1c4d873af1476806a7b5f2b7fac1ff/fonttools-4.60.1-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:026290e4ec76583881763fac284aca67365e0be9f13a7fb137257096114cb3bc", size = 4957603, upload-time = "2025-09-29T21:12:53.423Z" },
+ { url = "https://files.pythonhosted.org/packages/78/d4/ff19976305e0c05aa3340c805475abb00224c954d3c65e82c0a69633d55d/fonttools-4.60.1-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:f0e8817c7d1a0c2eedebf57ef9a9896f3ea23324769a9a2061a80fe8852705ed", size = 4974184, upload-time = "2025-09-29T21:12:55.962Z" },
+ { url = "https://files.pythonhosted.org/packages/63/22/8553ff6166f5cd21cfaa115aaacaa0dc73b91c079a8cfd54a482cbc0f4f5/fonttools-4.60.1-cp314-cp314t-win32.whl", hash = "sha256:1410155d0e764a4615774e5c2c6fc516259fe3eca5882f034eb9bfdbee056259", size = 2282241, upload-time = "2025-09-29T21:12:58.179Z" },
+ { url = "https://files.pythonhosted.org/packages/8a/cb/fa7b4d148e11d5a72761a22e595344133e83a9507a4c231df972e657579b/fonttools-4.60.1-cp314-cp314t-win_amd64.whl", hash = "sha256:022beaea4b73a70295b688f817ddc24ed3e3418b5036ffcd5658141184ef0d0c", size = 2345760, upload-time = "2025-09-29T21:13:00.375Z" },
+ { url = "https://files.pythonhosted.org/packages/c7/93/0dd45cd283c32dea1545151d8c3637b4b8c53cdb3a625aeb2885b184d74d/fonttools-4.60.1-py3-none-any.whl", hash = "sha256:906306ac7afe2156fcf0042173d6ebbb05416af70f6b370967b47f8f00103bbb", size = 1143175, upload-time = "2025-09-29T21:13:24.134Z" },
+]
+
+[[package]]
+name = "frozenlist"
+version = "1.8.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/2d/f5/c831fac6cc817d26fd54c7eaccd04ef7e0288806943f7cc5bbf69f3ac1f0/frozenlist-1.8.0.tar.gz", hash = "sha256:3ede829ed8d842f6cd48fc7081d7a41001a56f1f38603f9d49bf3020d59a31ad", size = 45875, upload-time = "2025-10-06T05:38:17.865Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/83/4a/557715d5047da48d54e659203b9335be7bfaafda2c3f627b7c47e0b3aaf3/frozenlist-1.8.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:b37f6d31b3dcea7deb5e9696e529a6aa4a898adc33db82da12e4c60a7c4d2011", size = 86230, upload-time = "2025-10-06T05:35:23.699Z" },
+ { url = "https://files.pythonhosted.org/packages/a2/fb/c85f9fed3ea8fe8740e5b46a59cc141c23b842eca617da8876cfce5f760e/frozenlist-1.8.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:ef2b7b394f208233e471abc541cc6991f907ffd47dc72584acee3147899d6565", size = 49621, upload-time = "2025-10-06T05:35:25.341Z" },
+ { url = "https://files.pythonhosted.org/packages/63/70/26ca3f06aace16f2352796b08704338d74b6d1a24ca38f2771afbb7ed915/frozenlist-1.8.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:a88f062f072d1589b7b46e951698950e7da00442fc1cacbe17e19e025dc327ad", size = 49889, upload-time = "2025-10-06T05:35:26.797Z" },
+ { url = "https://files.pythonhosted.org/packages/5d/ed/c7895fd2fde7f3ee70d248175f9b6cdf792fb741ab92dc59cd9ef3bd241b/frozenlist-1.8.0-cp310-cp310-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:f57fb59d9f385710aa7060e89410aeb5058b99e62f4d16b08b91986b9a2140c2", size = 219464, upload-time = "2025-10-06T05:35:28.254Z" },
+ { url = "https://files.pythonhosted.org/packages/6b/83/4d587dccbfca74cb8b810472392ad62bfa100bf8108c7223eb4c4fa2f7b3/frozenlist-1.8.0-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:799345ab092bee59f01a915620b5d014698547afd011e691a208637312db9186", size = 221649, upload-time = "2025-10-06T05:35:29.454Z" },
+ { url = "https://files.pythonhosted.org/packages/6a/c6/fd3b9cd046ec5fff9dab66831083bc2077006a874a2d3d9247dea93ddf7e/frozenlist-1.8.0-cp310-cp310-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:c23c3ff005322a6e16f71bf8692fcf4d5a304aaafe1e262c98c6d4adc7be863e", size = 219188, upload-time = "2025-10-06T05:35:30.951Z" },
+ { url = "https://files.pythonhosted.org/packages/ce/80/6693f55eb2e085fc8afb28cf611448fb5b90e98e068fa1d1b8d8e66e5c7d/frozenlist-1.8.0-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:8a76ea0f0b9dfa06f254ee06053d93a600865b3274358ca48a352ce4f0798450", size = 231748, upload-time = "2025-10-06T05:35:32.101Z" },
+ { url = "https://files.pythonhosted.org/packages/97/d6/e9459f7c5183854abd989ba384fe0cc1a0fb795a83c033f0571ec5933ca4/frozenlist-1.8.0-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:c7366fe1418a6133d5aa824ee53d406550110984de7637d65a178010f759c6ef", size = 236351, upload-time = "2025-10-06T05:35:33.834Z" },
+ { url = "https://files.pythonhosted.org/packages/97/92/24e97474b65c0262e9ecd076e826bfd1d3074adcc165a256e42e7b8a7249/frozenlist-1.8.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:13d23a45c4cebade99340c4165bd90eeb4a56c6d8a9d8aa49568cac19a6d0dc4", size = 218767, upload-time = "2025-10-06T05:35:35.205Z" },
+ { url = "https://files.pythonhosted.org/packages/ee/bf/dc394a097508f15abff383c5108cb8ad880d1f64a725ed3b90d5c2fbf0bb/frozenlist-1.8.0-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:e4a3408834f65da56c83528fb52ce7911484f0d1eaf7b761fc66001db1646eff", size = 235887, upload-time = "2025-10-06T05:35:36.354Z" },
+ { url = "https://files.pythonhosted.org/packages/40/90/25b201b9c015dbc999a5baf475a257010471a1fa8c200c843fd4abbee725/frozenlist-1.8.0-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:42145cd2748ca39f32801dad54aeea10039da6f86e303659db90db1c4b614c8c", size = 228785, upload-time = "2025-10-06T05:35:37.949Z" },
+ { url = "https://files.pythonhosted.org/packages/84/f4/b5bc148df03082f05d2dd30c089e269acdbe251ac9a9cf4e727b2dbb8a3d/frozenlist-1.8.0-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:e2de870d16a7a53901e41b64ffdf26f2fbb8917b3e6ebf398098d72c5b20bd7f", size = 230312, upload-time = "2025-10-06T05:35:39.178Z" },
+ { url = "https://files.pythonhosted.org/packages/db/4b/87e95b5d15097c302430e647136b7d7ab2398a702390cf4c8601975709e7/frozenlist-1.8.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:20e63c9493d33ee48536600d1a5c95eefc870cd71e7ab037763d1fbb89cc51e7", size = 217650, upload-time = "2025-10-06T05:35:40.377Z" },
+ { url = "https://files.pythonhosted.org/packages/e5/70/78a0315d1fea97120591a83e0acd644da638c872f142fd72a6cebee825f3/frozenlist-1.8.0-cp310-cp310-win32.whl", hash = "sha256:adbeebaebae3526afc3c96fad434367cafbfd1b25d72369a9e5858453b1bb71a", size = 39659, upload-time = "2025-10-06T05:35:41.863Z" },
+ { url = "https://files.pythonhosted.org/packages/66/aa/3f04523fb189a00e147e60c5b2205126118f216b0aa908035c45336e27e4/frozenlist-1.8.0-cp310-cp310-win_amd64.whl", hash = "sha256:667c3777ca571e5dbeb76f331562ff98b957431df140b54c85fd4d52eea8d8f6", size = 43837, upload-time = "2025-10-06T05:35:43.205Z" },
+ { url = "https://files.pythonhosted.org/packages/39/75/1135feecdd7c336938bd55b4dc3b0dfc46d85b9be12ef2628574b28de776/frozenlist-1.8.0-cp310-cp310-win_arm64.whl", hash = "sha256:80f85f0a7cc86e7a54c46d99c9e1318ff01f4687c172ede30fd52d19d1da1c8e", size = 39989, upload-time = "2025-10-06T05:35:44.596Z" },
+ { url = "https://files.pythonhosted.org/packages/bc/03/077f869d540370db12165c0aa51640a873fb661d8b315d1d4d67b284d7ac/frozenlist-1.8.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:09474e9831bc2b2199fad6da3c14c7b0fbdd377cce9d3d77131be28906cb7d84", size = 86912, upload-time = "2025-10-06T05:35:45.98Z" },
+ { url = "https://files.pythonhosted.org/packages/df/b5/7610b6bd13e4ae77b96ba85abea1c8cb249683217ef09ac9e0ae93f25a91/frozenlist-1.8.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:17c883ab0ab67200b5f964d2b9ed6b00971917d5d8a92df149dc2c9779208ee9", size = 50046, upload-time = "2025-10-06T05:35:47.009Z" },
+ { url = "https://files.pythonhosted.org/packages/6e/ef/0e8f1fe32f8a53dd26bdd1f9347efe0778b0fddf62789ea683f4cc7d787d/frozenlist-1.8.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:fa47e444b8ba08fffd1c18e8cdb9a75db1b6a27f17507522834ad13ed5922b93", size = 50119, upload-time = "2025-10-06T05:35:48.38Z" },
+ { url = "https://files.pythonhosted.org/packages/11/b1/71a477adc7c36e5fb628245dfbdea2166feae310757dea848d02bd0689fd/frozenlist-1.8.0-cp311-cp311-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:2552f44204b744fba866e573be4c1f9048d6a324dfe14475103fd51613eb1d1f", size = 231067, upload-time = "2025-10-06T05:35:49.97Z" },
+ { url = "https://files.pythonhosted.org/packages/45/7e/afe40eca3a2dc19b9904c0f5d7edfe82b5304cb831391edec0ac04af94c2/frozenlist-1.8.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:957e7c38f250991e48a9a73e6423db1bb9dd14e722a10f6b8bb8e16a0f55f695", size = 233160, upload-time = "2025-10-06T05:35:51.729Z" },
+ { url = "https://files.pythonhosted.org/packages/a6/aa/7416eac95603ce428679d273255ffc7c998d4132cfae200103f164b108aa/frozenlist-1.8.0-cp311-cp311-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:8585e3bb2cdea02fc88ffa245069c36555557ad3609e83be0ec71f54fd4abb52", size = 228544, upload-time = "2025-10-06T05:35:53.246Z" },
+ { url = "https://files.pythonhosted.org/packages/8b/3d/2a2d1f683d55ac7e3875e4263d28410063e738384d3adc294f5ff3d7105e/frozenlist-1.8.0-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:edee74874ce20a373d62dc28b0b18b93f645633c2943fd90ee9d898550770581", size = 243797, upload-time = "2025-10-06T05:35:54.497Z" },
+ { url = "https://files.pythonhosted.org/packages/78/1e/2d5565b589e580c296d3bb54da08d206e797d941a83a6fdea42af23be79c/frozenlist-1.8.0-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:c9a63152fe95756b85f31186bddf42e4c02c6321207fd6601a1c89ebac4fe567", size = 247923, upload-time = "2025-10-06T05:35:55.861Z" },
+ { url = "https://files.pythonhosted.org/packages/aa/c3/65872fcf1d326a7f101ad4d86285c403c87be7d832b7470b77f6d2ed5ddc/frozenlist-1.8.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:b6db2185db9be0a04fecf2f241c70b63b1a242e2805be291855078f2b404dd6b", size = 230886, upload-time = "2025-10-06T05:35:57.399Z" },
+ { url = "https://files.pythonhosted.org/packages/a0/76/ac9ced601d62f6956f03cc794f9e04c81719509f85255abf96e2510f4265/frozenlist-1.8.0-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:f4be2e3d8bc8aabd566f8d5b8ba7ecc09249d74ba3c9ed52e54dc23a293f0b92", size = 245731, upload-time = "2025-10-06T05:35:58.563Z" },
+ { url = "https://files.pythonhosted.org/packages/b9/49/ecccb5f2598daf0b4a1415497eba4c33c1e8ce07495eb07d2860c731b8d5/frozenlist-1.8.0-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:c8d1634419f39ea6f5c427ea2f90ca85126b54b50837f31497f3bf38266e853d", size = 241544, upload-time = "2025-10-06T05:35:59.719Z" },
+ { url = "https://files.pythonhosted.org/packages/53/4b/ddf24113323c0bbcc54cb38c8b8916f1da7165e07b8e24a717b4a12cbf10/frozenlist-1.8.0-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:1a7fa382a4a223773ed64242dbe1c9c326ec09457e6b8428efb4118c685c3dfd", size = 241806, upload-time = "2025-10-06T05:36:00.959Z" },
+ { url = "https://files.pythonhosted.org/packages/a7/fb/9b9a084d73c67175484ba2789a59f8eebebd0827d186a8102005ce41e1ba/frozenlist-1.8.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:11847b53d722050808926e785df837353bd4d75f1d494377e59b23594d834967", size = 229382, upload-time = "2025-10-06T05:36:02.22Z" },
+ { url = "https://files.pythonhosted.org/packages/95/a3/c8fb25aac55bf5e12dae5c5aa6a98f85d436c1dc658f21c3ac73f9fa95e5/frozenlist-1.8.0-cp311-cp311-win32.whl", hash = "sha256:27c6e8077956cf73eadd514be8fb04d77fc946a7fe9f7fe167648b0b9085cc25", size = 39647, upload-time = "2025-10-06T05:36:03.409Z" },
+ { url = "https://files.pythonhosted.org/packages/0a/f5/603d0d6a02cfd4c8f2a095a54672b3cf967ad688a60fb9faf04fc4887f65/frozenlist-1.8.0-cp311-cp311-win_amd64.whl", hash = "sha256:ac913f8403b36a2c8610bbfd25b8013488533e71e62b4b4adce9c86c8cea905b", size = 44064, upload-time = "2025-10-06T05:36:04.368Z" },
+ { url = "https://files.pythonhosted.org/packages/5d/16/c2c9ab44e181f043a86f9a8f84d5124b62dbcb3a02c0977ec72b9ac1d3e0/frozenlist-1.8.0-cp311-cp311-win_arm64.whl", hash = "sha256:d4d3214a0f8394edfa3e303136d0575eece0745ff2b47bd2cb2e66dd92d4351a", size = 39937, upload-time = "2025-10-06T05:36:05.669Z" },
+ { url = "https://files.pythonhosted.org/packages/69/29/948b9aa87e75820a38650af445d2ef2b6b8a6fab1a23b6bb9e4ef0be2d59/frozenlist-1.8.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:78f7b9e5d6f2fdb88cdde9440dc147259b62b9d3b019924def9f6478be254ac1", size = 87782, upload-time = "2025-10-06T05:36:06.649Z" },
+ { url = "https://files.pythonhosted.org/packages/64/80/4f6e318ee2a7c0750ed724fa33a4bdf1eacdc5a39a7a24e818a773cd91af/frozenlist-1.8.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:229bf37d2e4acdaf808fd3f06e854a4a7a3661e871b10dc1f8f1896a3b05f18b", size = 50594, upload-time = "2025-10-06T05:36:07.69Z" },
+ { url = "https://files.pythonhosted.org/packages/2b/94/5c8a2b50a496b11dd519f4a24cb5496cf125681dd99e94c604ccdea9419a/frozenlist-1.8.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:f833670942247a14eafbb675458b4e61c82e002a148f49e68257b79296e865c4", size = 50448, upload-time = "2025-10-06T05:36:08.78Z" },
+ { url = "https://files.pythonhosted.org/packages/6a/bd/d91c5e39f490a49df14320f4e8c80161cfcce09f1e2cde1edd16a551abb3/frozenlist-1.8.0-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:494a5952b1c597ba44e0e78113a7266e656b9794eec897b19ead706bd7074383", size = 242411, upload-time = "2025-10-06T05:36:09.801Z" },
+ { url = "https://files.pythonhosted.org/packages/8f/83/f61505a05109ef3293dfb1ff594d13d64a2324ac3482be2cedc2be818256/frozenlist-1.8.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:96f423a119f4777a4a056b66ce11527366a8bb92f54e541ade21f2374433f6d4", size = 243014, upload-time = "2025-10-06T05:36:11.394Z" },
+ { url = "https://files.pythonhosted.org/packages/d8/cb/cb6c7b0f7d4023ddda30cf56b8b17494eb3a79e3fda666bf735f63118b35/frozenlist-1.8.0-cp312-cp312-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:3462dd9475af2025c31cc61be6652dfa25cbfb56cbbf52f4ccfe029f38decaf8", size = 234909, upload-time = "2025-10-06T05:36:12.598Z" },
+ { url = "https://files.pythonhosted.org/packages/31/c5/cd7a1f3b8b34af009fb17d4123c5a778b44ae2804e3ad6b86204255f9ec5/frozenlist-1.8.0-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c4c800524c9cd9bac5166cd6f55285957fcfc907db323e193f2afcd4d9abd69b", size = 250049, upload-time = "2025-10-06T05:36:14.065Z" },
+ { url = "https://files.pythonhosted.org/packages/c0/01/2f95d3b416c584a1e7f0e1d6d31998c4a795f7544069ee2e0962a4b60740/frozenlist-1.8.0-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:d6a5df73acd3399d893dafc71663ad22534b5aa4f94e8a2fabfe856c3c1b6a52", size = 256485, upload-time = "2025-10-06T05:36:15.39Z" },
+ { url = "https://files.pythonhosted.org/packages/ce/03/024bf7720b3abaebcff6d0793d73c154237b85bdf67b7ed55e5e9596dc9a/frozenlist-1.8.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:405e8fe955c2280ce66428b3ca55e12b3c4e9c336fb2103a4937e891c69a4a29", size = 237619, upload-time = "2025-10-06T05:36:16.558Z" },
+ { url = "https://files.pythonhosted.org/packages/69/fa/f8abdfe7d76b731f5d8bd217827cf6764d4f1d9763407e42717b4bed50a0/frozenlist-1.8.0-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:908bd3f6439f2fef9e85031b59fd4f1297af54415fb60e4254a95f75b3cab3f3", size = 250320, upload-time = "2025-10-06T05:36:17.821Z" },
+ { url = "https://files.pythonhosted.org/packages/f5/3c/b051329f718b463b22613e269ad72138cc256c540f78a6de89452803a47d/frozenlist-1.8.0-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:294e487f9ec720bd8ffcebc99d575f7eff3568a08a253d1ee1a0378754b74143", size = 246820, upload-time = "2025-10-06T05:36:19.046Z" },
+ { url = "https://files.pythonhosted.org/packages/0f/ae/58282e8f98e444b3f4dd42448ff36fa38bef29e40d40f330b22e7108f565/frozenlist-1.8.0-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:74c51543498289c0c43656701be6b077f4b265868fa7f8a8859c197006efb608", size = 250518, upload-time = "2025-10-06T05:36:20.763Z" },
+ { url = "https://files.pythonhosted.org/packages/8f/96/007e5944694d66123183845a106547a15944fbbb7154788cbf7272789536/frozenlist-1.8.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:776f352e8329135506a1d6bf16ac3f87bc25b28e765949282dcc627af36123aa", size = 239096, upload-time = "2025-10-06T05:36:22.129Z" },
+ { url = "https://files.pythonhosted.org/packages/66/bb/852b9d6db2fa40be96f29c0d1205c306288f0684df8fd26ca1951d461a56/frozenlist-1.8.0-cp312-cp312-win32.whl", hash = "sha256:433403ae80709741ce34038da08511d4a77062aa924baf411ef73d1146e74faf", size = 39985, upload-time = "2025-10-06T05:36:23.661Z" },
+ { url = "https://files.pythonhosted.org/packages/b8/af/38e51a553dd66eb064cdf193841f16f077585d4d28394c2fa6235cb41765/frozenlist-1.8.0-cp312-cp312-win_amd64.whl", hash = "sha256:34187385b08f866104f0c0617404c8eb08165ab1272e884abc89c112e9c00746", size = 44591, upload-time = "2025-10-06T05:36:24.958Z" },
+ { url = "https://files.pythonhosted.org/packages/a7/06/1dc65480ab147339fecc70797e9c2f69d9cea9cf38934ce08df070fdb9cb/frozenlist-1.8.0-cp312-cp312-win_arm64.whl", hash = "sha256:fe3c58d2f5db5fbd18c2987cba06d51b0529f52bc3a6cdc33d3f4eab725104bd", size = 40102, upload-time = "2025-10-06T05:36:26.333Z" },
+ { url = "https://files.pythonhosted.org/packages/2d/40/0832c31a37d60f60ed79e9dfb5a92e1e2af4f40a16a29abcc7992af9edff/frozenlist-1.8.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:8d92f1a84bb12d9e56f818b3a746f3efba93c1b63c8387a73dde655e1e42282a", size = 85717, upload-time = "2025-10-06T05:36:27.341Z" },
+ { url = "https://files.pythonhosted.org/packages/30/ba/b0b3de23f40bc55a7057bd38434e25c34fa48e17f20ee273bbde5e0650f3/frozenlist-1.8.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:96153e77a591c8adc2ee805756c61f59fef4cf4073a9275ee86fe8cba41241f7", size = 49651, upload-time = "2025-10-06T05:36:28.855Z" },
+ { url = "https://files.pythonhosted.org/packages/0c/ab/6e5080ee374f875296c4243c381bbdef97a9ac39c6e3ce1d5f7d42cb78d6/frozenlist-1.8.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f21f00a91358803399890ab167098c131ec2ddd5f8f5fd5fe9c9f2c6fcd91e40", size = 49417, upload-time = "2025-10-06T05:36:29.877Z" },
+ { url = "https://files.pythonhosted.org/packages/d5/4e/e4691508f9477ce67da2015d8c00acd751e6287739123113a9fca6f1604e/frozenlist-1.8.0-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:fb30f9626572a76dfe4293c7194a09fb1fe93ba94c7d4f720dfae3b646b45027", size = 234391, upload-time = "2025-10-06T05:36:31.301Z" },
+ { url = "https://files.pythonhosted.org/packages/40/76/c202df58e3acdf12969a7895fd6f3bc016c642e6726aa63bd3025e0fc71c/frozenlist-1.8.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:eaa352d7047a31d87dafcacbabe89df0aa506abb5b1b85a2fb91bc3faa02d822", size = 233048, upload-time = "2025-10-06T05:36:32.531Z" },
+ { url = "https://files.pythonhosted.org/packages/f9/c0/8746afb90f17b73ca5979c7a3958116e105ff796e718575175319b5bb4ce/frozenlist-1.8.0-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:03ae967b4e297f58f8c774c7eabcce57fe3c2434817d4385c50661845a058121", size = 226549, upload-time = "2025-10-06T05:36:33.706Z" },
+ { url = "https://files.pythonhosted.org/packages/7e/eb/4c7eefc718ff72f9b6c4893291abaae5fbc0c82226a32dcd8ef4f7a5dbef/frozenlist-1.8.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:f6292f1de555ffcc675941d65fffffb0a5bcd992905015f85d0592201793e0e5", size = 239833, upload-time = "2025-10-06T05:36:34.947Z" },
+ { url = "https://files.pythonhosted.org/packages/c2/4e/e5c02187cf704224f8b21bee886f3d713ca379535f16893233b9d672ea71/frozenlist-1.8.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:29548f9b5b5e3460ce7378144c3010363d8035cea44bc0bf02d57f5a685e084e", size = 245363, upload-time = "2025-10-06T05:36:36.534Z" },
+ { url = "https://files.pythonhosted.org/packages/1f/96/cb85ec608464472e82ad37a17f844889c36100eed57bea094518bf270692/frozenlist-1.8.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:ec3cc8c5d4084591b4237c0a272cc4f50a5b03396a47d9caaf76f5d7b38a4f11", size = 229314, upload-time = "2025-10-06T05:36:38.582Z" },
+ { url = "https://files.pythonhosted.org/packages/5d/6f/4ae69c550e4cee66b57887daeebe006fe985917c01d0fff9caab9883f6d0/frozenlist-1.8.0-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:517279f58009d0b1f2e7c1b130b377a349405da3f7621ed6bfae50b10adf20c1", size = 243365, upload-time = "2025-10-06T05:36:40.152Z" },
+ { url = "https://files.pythonhosted.org/packages/7a/58/afd56de246cf11780a40a2c28dc7cbabbf06337cc8ddb1c780a2d97e88d8/frozenlist-1.8.0-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:db1e72ede2d0d7ccb213f218df6a078a9c09a7de257c2fe8fcef16d5925230b1", size = 237763, upload-time = "2025-10-06T05:36:41.355Z" },
+ { url = "https://files.pythonhosted.org/packages/cb/36/cdfaf6ed42e2644740d4a10452d8e97fa1c062e2a8006e4b09f1b5fd7d63/frozenlist-1.8.0-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:b4dec9482a65c54a5044486847b8a66bf10c9cb4926d42927ec4e8fd5db7fed8", size = 240110, upload-time = "2025-10-06T05:36:42.716Z" },
+ { url = "https://files.pythonhosted.org/packages/03/a8/9ea226fbefad669f11b52e864c55f0bd57d3c8d7eb07e9f2e9a0b39502e1/frozenlist-1.8.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:21900c48ae04d13d416f0e1e0c4d81f7931f73a9dfa0b7a8746fb2fe7dd970ed", size = 233717, upload-time = "2025-10-06T05:36:44.251Z" },
+ { url = "https://files.pythonhosted.org/packages/1e/0b/1b5531611e83ba7d13ccc9988967ea1b51186af64c42b7a7af465dcc9568/frozenlist-1.8.0-cp313-cp313-win32.whl", hash = "sha256:8b7b94a067d1c504ee0b16def57ad5738701e4ba10cec90529f13fa03c833496", size = 39628, upload-time = "2025-10-06T05:36:45.423Z" },
+ { url = "https://files.pythonhosted.org/packages/d8/cf/174c91dbc9cc49bc7b7aab74d8b734e974d1faa8f191c74af9b7e80848e6/frozenlist-1.8.0-cp313-cp313-win_amd64.whl", hash = "sha256:878be833caa6a3821caf85eb39c5ba92d28e85df26d57afb06b35b2efd937231", size = 43882, upload-time = "2025-10-06T05:36:46.796Z" },
+ { url = "https://files.pythonhosted.org/packages/c1/17/502cd212cbfa96eb1388614fe39a3fc9ab87dbbe042b66f97acb57474834/frozenlist-1.8.0-cp313-cp313-win_arm64.whl", hash = "sha256:44389d135b3ff43ba8cc89ff7f51f5a0bb6b63d829c8300f79a2fe4fe61bcc62", size = 39676, upload-time = "2025-10-06T05:36:47.8Z" },
+ { url = "https://files.pythonhosted.org/packages/d2/5c/3bbfaa920dfab09e76946a5d2833a7cbdf7b9b4a91c714666ac4855b88b4/frozenlist-1.8.0-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:e25ac20a2ef37e91c1b39938b591457666a0fa835c7783c3a8f33ea42870db94", size = 89235, upload-time = "2025-10-06T05:36:48.78Z" },
+ { url = "https://files.pythonhosted.org/packages/d2/d6/f03961ef72166cec1687e84e8925838442b615bd0b8854b54923ce5b7b8a/frozenlist-1.8.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:07cdca25a91a4386d2e76ad992916a85038a9b97561bf7a3fd12d5d9ce31870c", size = 50742, upload-time = "2025-10-06T05:36:49.837Z" },
+ { url = "https://files.pythonhosted.org/packages/1e/bb/a6d12b7ba4c3337667d0e421f7181c82dda448ce4e7ad7ecd249a16fa806/frozenlist-1.8.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:4e0c11f2cc6717e0a741f84a527c52616140741cd812a50422f83dc31749fb52", size = 51725, upload-time = "2025-10-06T05:36:50.851Z" },
+ { url = "https://files.pythonhosted.org/packages/bc/71/d1fed0ffe2c2ccd70b43714c6cab0f4188f09f8a67a7914a6b46ee30f274/frozenlist-1.8.0-cp313-cp313t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:b3210649ee28062ea6099cfda39e147fa1bc039583c8ee4481cb7811e2448c51", size = 284533, upload-time = "2025-10-06T05:36:51.898Z" },
+ { url = "https://files.pythonhosted.org/packages/c9/1f/fb1685a7b009d89f9bf78a42d94461bc06581f6e718c39344754a5d9bada/frozenlist-1.8.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:581ef5194c48035a7de2aefc72ac6539823bb71508189e5de01d60c9dcd5fa65", size = 292506, upload-time = "2025-10-06T05:36:53.101Z" },
+ { url = "https://files.pythonhosted.org/packages/e6/3b/b991fe1612703f7e0d05c0cf734c1b77aaf7c7d321df4572e8d36e7048c8/frozenlist-1.8.0-cp313-cp313t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:3ef2d026f16a2b1866e1d86fc4e1291e1ed8a387b2c333809419a2f8b3a77b82", size = 274161, upload-time = "2025-10-06T05:36:54.309Z" },
+ { url = "https://files.pythonhosted.org/packages/ca/ec/c5c618767bcdf66e88945ec0157d7f6c4a1322f1473392319b7a2501ded7/frozenlist-1.8.0-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:5500ef82073f599ac84d888e3a8c1f77ac831183244bfd7f11eaa0289fb30714", size = 294676, upload-time = "2025-10-06T05:36:55.566Z" },
+ { url = "https://files.pythonhosted.org/packages/7c/ce/3934758637d8f8a88d11f0585d6495ef54b2044ed6ec84492a91fa3b27aa/frozenlist-1.8.0-cp313-cp313t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:50066c3997d0091c411a66e710f4e11752251e6d2d73d70d8d5d4c76442a199d", size = 300638, upload-time = "2025-10-06T05:36:56.758Z" },
+ { url = "https://files.pythonhosted.org/packages/fc/4f/a7e4d0d467298f42de4b41cbc7ddaf19d3cfeabaf9ff97c20c6c7ee409f9/frozenlist-1.8.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:5c1c8e78426e59b3f8005e9b19f6ff46e5845895adbde20ece9218319eca6506", size = 283067, upload-time = "2025-10-06T05:36:57.965Z" },
+ { url = "https://files.pythonhosted.org/packages/dc/48/c7b163063d55a83772b268e6d1affb960771b0e203b632cfe09522d67ea5/frozenlist-1.8.0-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:eefdba20de0d938cec6a89bd4d70f346a03108a19b9df4248d3cf0d88f1b0f51", size = 292101, upload-time = "2025-10-06T05:36:59.237Z" },
+ { url = "https://files.pythonhosted.org/packages/9f/d0/2366d3c4ecdc2fd391e0afa6e11500bfba0ea772764d631bbf82f0136c9d/frozenlist-1.8.0-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:cf253e0e1c3ceb4aaff6df637ce033ff6535fb8c70a764a8f46aafd3d6ab798e", size = 289901, upload-time = "2025-10-06T05:37:00.811Z" },
+ { url = "https://files.pythonhosted.org/packages/b8/94/daff920e82c1b70e3618a2ac39fbc01ae3e2ff6124e80739ce5d71c9b920/frozenlist-1.8.0-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:032efa2674356903cd0261c4317a561a6850f3ac864a63fc1583147fb05a79b0", size = 289395, upload-time = "2025-10-06T05:37:02.115Z" },
+ { url = "https://files.pythonhosted.org/packages/e3/20/bba307ab4235a09fdcd3cc5508dbabd17c4634a1af4b96e0f69bfe551ebd/frozenlist-1.8.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:6da155091429aeba16851ecb10a9104a108bcd32f6c1642867eadaee401c1c41", size = 283659, upload-time = "2025-10-06T05:37:03.711Z" },
+ { url = "https://files.pythonhosted.org/packages/fd/00/04ca1c3a7a124b6de4f8a9a17cc2fcad138b4608e7a3fc5877804b8715d7/frozenlist-1.8.0-cp313-cp313t-win32.whl", hash = "sha256:0f96534f8bfebc1a394209427d0f8a63d343c9779cda6fc25e8e121b5fd8555b", size = 43492, upload-time = "2025-10-06T05:37:04.915Z" },
+ { url = "https://files.pythonhosted.org/packages/59/5e/c69f733a86a94ab10f68e496dc6b7e8bc078ebb415281d5698313e3af3a1/frozenlist-1.8.0-cp313-cp313t-win_amd64.whl", hash = "sha256:5d63a068f978fc69421fb0e6eb91a9603187527c86b7cd3f534a5b77a592b888", size = 48034, upload-time = "2025-10-06T05:37:06.343Z" },
+ { url = "https://files.pythonhosted.org/packages/16/6c/be9d79775d8abe79b05fa6d23da99ad6e7763a1d080fbae7290b286093fd/frozenlist-1.8.0-cp313-cp313t-win_arm64.whl", hash = "sha256:bf0a7e10b077bf5fb9380ad3ae8ce20ef919a6ad93b4552896419ac7e1d8e042", size = 41749, upload-time = "2025-10-06T05:37:07.431Z" },
+ { url = "https://files.pythonhosted.org/packages/f1/c8/85da824b7e7b9b6e7f7705b2ecaf9591ba6f79c1177f324c2735e41d36a2/frozenlist-1.8.0-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:cee686f1f4cadeb2136007ddedd0aaf928ab95216e7691c63e50a8ec066336d0", size = 86127, upload-time = "2025-10-06T05:37:08.438Z" },
+ { url = "https://files.pythonhosted.org/packages/8e/e8/a1185e236ec66c20afd72399522f142c3724c785789255202d27ae992818/frozenlist-1.8.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:119fb2a1bd47307e899c2fac7f28e85b9a543864df47aa7ec9d3c1b4545f096f", size = 49698, upload-time = "2025-10-06T05:37:09.48Z" },
+ { url = "https://files.pythonhosted.org/packages/a1/93/72b1736d68f03fda5fdf0f2180fb6caaae3894f1b854d006ac61ecc727ee/frozenlist-1.8.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:4970ece02dbc8c3a92fcc5228e36a3e933a01a999f7094ff7c23fbd2beeaa67c", size = 49749, upload-time = "2025-10-06T05:37:10.569Z" },
+ { url = "https://files.pythonhosted.org/packages/a7/b2/fabede9fafd976b991e9f1b9c8c873ed86f202889b864756f240ce6dd855/frozenlist-1.8.0-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:cba69cb73723c3f329622e34bdbf5ce1f80c21c290ff04256cff1cd3c2036ed2", size = 231298, upload-time = "2025-10-06T05:37:11.993Z" },
+ { url = "https://files.pythonhosted.org/packages/3a/3b/d9b1e0b0eed36e70477ffb8360c49c85c8ca8ef9700a4e6711f39a6e8b45/frozenlist-1.8.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:778a11b15673f6f1df23d9586f83c4846c471a8af693a22e066508b77d201ec8", size = 232015, upload-time = "2025-10-06T05:37:13.194Z" },
+ { url = "https://files.pythonhosted.org/packages/dc/94/be719d2766c1138148564a3960fc2c06eb688da592bdc25adcf856101be7/frozenlist-1.8.0-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:0325024fe97f94c41c08872db482cf8ac4800d80e79222c6b0b7b162d5b13686", size = 225038, upload-time = "2025-10-06T05:37:14.577Z" },
+ { url = "https://files.pythonhosted.org/packages/e4/09/6712b6c5465f083f52f50cf74167b92d4ea2f50e46a9eea0523d658454ae/frozenlist-1.8.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:97260ff46b207a82a7567b581ab4190bd4dfa09f4db8a8b49d1a958f6aa4940e", size = 240130, upload-time = "2025-10-06T05:37:15.781Z" },
+ { url = "https://files.pythonhosted.org/packages/f8/d4/cd065cdcf21550b54f3ce6a22e143ac9e4836ca42a0de1022da8498eac89/frozenlist-1.8.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:54b2077180eb7f83dd52c40b2750d0a9f175e06a42e3213ce047219de902717a", size = 242845, upload-time = "2025-10-06T05:37:17.037Z" },
+ { url = "https://files.pythonhosted.org/packages/62/c3/f57a5c8c70cd1ead3d5d5f776f89d33110b1addae0ab010ad774d9a44fb9/frozenlist-1.8.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:2f05983daecab868a31e1da44462873306d3cbfd76d1f0b5b69c473d21dbb128", size = 229131, upload-time = "2025-10-06T05:37:18.221Z" },
+ { url = "https://files.pythonhosted.org/packages/6c/52/232476fe9cb64f0742f3fde2b7d26c1dac18b6d62071c74d4ded55e0ef94/frozenlist-1.8.0-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:33f48f51a446114bc5d251fb2954ab0164d5be02ad3382abcbfe07e2531d650f", size = 240542, upload-time = "2025-10-06T05:37:19.771Z" },
+ { url = "https://files.pythonhosted.org/packages/5f/85/07bf3f5d0fb5414aee5f47d33c6f5c77bfe49aac680bfece33d4fdf6a246/frozenlist-1.8.0-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:154e55ec0655291b5dd1b8731c637ecdb50975a2ae70c606d100750a540082f7", size = 237308, upload-time = "2025-10-06T05:37:20.969Z" },
+ { url = "https://files.pythonhosted.org/packages/11/99/ae3a33d5befd41ac0ca2cc7fd3aa707c9c324de2e89db0e0f45db9a64c26/frozenlist-1.8.0-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:4314debad13beb564b708b4a496020e5306c7333fa9a3ab90374169a20ffab30", size = 238210, upload-time = "2025-10-06T05:37:22.252Z" },
+ { url = "https://files.pythonhosted.org/packages/b2/60/b1d2da22f4970e7a155f0adde9b1435712ece01b3cd45ba63702aea33938/frozenlist-1.8.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:073f8bf8becba60aa931eb3bc420b217bb7d5b8f4750e6f8b3be7f3da85d38b7", size = 231972, upload-time = "2025-10-06T05:37:23.5Z" },
+ { url = "https://files.pythonhosted.org/packages/3f/ab/945b2f32de889993b9c9133216c068b7fcf257d8595a0ac420ac8677cab0/frozenlist-1.8.0-cp314-cp314-win32.whl", hash = "sha256:bac9c42ba2ac65ddc115d930c78d24ab8d4f465fd3fc473cdedfccadb9429806", size = 40536, upload-time = "2025-10-06T05:37:25.581Z" },
+ { url = "https://files.pythonhosted.org/packages/59/ad/9caa9b9c836d9ad6f067157a531ac48b7d36499f5036d4141ce78c230b1b/frozenlist-1.8.0-cp314-cp314-win_amd64.whl", hash = "sha256:3e0761f4d1a44f1d1a47996511752cf3dcec5bbdd9cc2b4fe595caf97754b7a0", size = 44330, upload-time = "2025-10-06T05:37:26.928Z" },
+ { url = "https://files.pythonhosted.org/packages/82/13/e6950121764f2676f43534c555249f57030150260aee9dcf7d64efda11dd/frozenlist-1.8.0-cp314-cp314-win_arm64.whl", hash = "sha256:d1eaff1d00c7751b7c6662e9c5ba6eb2c17a2306ba5e2a37f24ddf3cc953402b", size = 40627, upload-time = "2025-10-06T05:37:28.075Z" },
+ { url = "https://files.pythonhosted.org/packages/c0/c7/43200656ecc4e02d3f8bc248df68256cd9572b3f0017f0a0c4e93440ae23/frozenlist-1.8.0-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:d3bb933317c52d7ea5004a1c442eef86f426886fba134ef8cf4226ea6ee1821d", size = 89238, upload-time = "2025-10-06T05:37:29.373Z" },
+ { url = "https://files.pythonhosted.org/packages/d1/29/55c5f0689b9c0fb765055629f472c0de484dcaf0acee2f7707266ae3583c/frozenlist-1.8.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:8009897cdef112072f93a0efdce29cd819e717fd2f649ee3016efd3cd885a7ed", size = 50738, upload-time = "2025-10-06T05:37:30.792Z" },
+ { url = "https://files.pythonhosted.org/packages/ba/7d/b7282a445956506fa11da8c2db7d276adcbf2b17d8bb8407a47685263f90/frozenlist-1.8.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:2c5dcbbc55383e5883246d11fd179782a9d07a986c40f49abe89ddf865913930", size = 51739, upload-time = "2025-10-06T05:37:32.127Z" },
+ { url = "https://files.pythonhosted.org/packages/62/1c/3d8622e60d0b767a5510d1d3cf21065b9db874696a51ea6d7a43180a259c/frozenlist-1.8.0-cp314-cp314t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:39ecbc32f1390387d2aa4f5a995e465e9e2f79ba3adcac92d68e3e0afae6657c", size = 284186, upload-time = "2025-10-06T05:37:33.21Z" },
+ { url = "https://files.pythonhosted.org/packages/2d/14/aa36d5f85a89679a85a1d44cd7a6657e0b1c75f61e7cad987b203d2daca8/frozenlist-1.8.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:92db2bf818d5cc8d9c1f1fc56b897662e24ea5adb36ad1f1d82875bd64e03c24", size = 292196, upload-time = "2025-10-06T05:37:36.107Z" },
+ { url = "https://files.pythonhosted.org/packages/05/23/6bde59eb55abd407d34f77d39a5126fb7b4f109a3f611d3929f14b700c66/frozenlist-1.8.0-cp314-cp314t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:2dc43a022e555de94c3b68a4ef0b11c4f747d12c024a520c7101709a2144fb37", size = 273830, upload-time = "2025-10-06T05:37:37.663Z" },
+ { url = "https://files.pythonhosted.org/packages/d2/3f/22cff331bfad7a8afa616289000ba793347fcd7bc275f3b28ecea2a27909/frozenlist-1.8.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:cb89a7f2de3602cfed448095bab3f178399646ab7c61454315089787df07733a", size = 294289, upload-time = "2025-10-06T05:37:39.261Z" },
+ { url = "https://files.pythonhosted.org/packages/a4/89/5b057c799de4838b6c69aa82b79705f2027615e01be996d2486a69ca99c4/frozenlist-1.8.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:33139dc858c580ea50e7e60a1b0ea003efa1fd42e6ec7fdbad78fff65fad2fd2", size = 300318, upload-time = "2025-10-06T05:37:43.213Z" },
+ { url = "https://files.pythonhosted.org/packages/30/de/2c22ab3eb2a8af6d69dc799e48455813bab3690c760de58e1bf43b36da3e/frozenlist-1.8.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:168c0969a329b416119507ba30b9ea13688fafffac1b7822802537569a1cb0ef", size = 282814, upload-time = "2025-10-06T05:37:45.337Z" },
+ { url = "https://files.pythonhosted.org/packages/59/f7/970141a6a8dbd7f556d94977858cfb36fa9b66e0892c6dd780d2219d8cd8/frozenlist-1.8.0-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:28bd570e8e189d7f7b001966435f9dac6718324b5be2990ac496cf1ea9ddb7fe", size = 291762, upload-time = "2025-10-06T05:37:46.657Z" },
+ { url = "https://files.pythonhosted.org/packages/c1/15/ca1adae83a719f82df9116d66f5bb28bb95557b3951903d39135620ef157/frozenlist-1.8.0-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:b2a095d45c5d46e5e79ba1e5b9cb787f541a8dee0433836cea4b96a2c439dcd8", size = 289470, upload-time = "2025-10-06T05:37:47.946Z" },
+ { url = "https://files.pythonhosted.org/packages/ac/83/dca6dc53bf657d371fbc88ddeb21b79891e747189c5de990b9dfff2ccba1/frozenlist-1.8.0-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:eab8145831a0d56ec9c4139b6c3e594c7a83c2c8be25d5bcf2d86136a532287a", size = 289042, upload-time = "2025-10-06T05:37:49.499Z" },
+ { url = "https://files.pythonhosted.org/packages/96/52/abddd34ca99be142f354398700536c5bd315880ed0a213812bc491cff5e4/frozenlist-1.8.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:974b28cf63cc99dfb2188d8d222bc6843656188164848c4f679e63dae4b0708e", size = 283148, upload-time = "2025-10-06T05:37:50.745Z" },
+ { url = "https://files.pythonhosted.org/packages/af/d3/76bd4ed4317e7119c2b7f57c3f6934aba26d277acc6309f873341640e21f/frozenlist-1.8.0-cp314-cp314t-win32.whl", hash = "sha256:342c97bf697ac5480c0a7ec73cd700ecfa5a8a40ac923bd035484616efecc2df", size = 44676, upload-time = "2025-10-06T05:37:52.222Z" },
+ { url = "https://files.pythonhosted.org/packages/89/76/c615883b7b521ead2944bb3480398cbb07e12b7b4e4d073d3752eb721558/frozenlist-1.8.0-cp314-cp314t-win_amd64.whl", hash = "sha256:06be8f67f39c8b1dc671f5d83aaefd3358ae5cdcf8314552c57e7ed3e6475bdd", size = 49451, upload-time = "2025-10-06T05:37:53.425Z" },
+ { url = "https://files.pythonhosted.org/packages/e0/a3/5982da14e113d07b325230f95060e2169f5311b1017ea8af2a29b374c289/frozenlist-1.8.0-cp314-cp314t-win_arm64.whl", hash = "sha256:102e6314ca4da683dca92e3b1355490fed5f313b768500084fbe6371fddfdb79", size = 42507, upload-time = "2025-10-06T05:37:54.513Z" },
+ { url = "https://files.pythonhosted.org/packages/9a/9a/e35b4a917281c0b8419d4207f4334c8e8c5dbf4f3f5f9ada73958d937dcc/frozenlist-1.8.0-py3-none-any.whl", hash = "sha256:0c18a16eab41e82c295618a77502e17b195883241c563b00f0aa5106fc4eaa0d", size = 13409, upload-time = "2025-10-06T05:38:16.721Z" },
+]
+
+[[package]]
+name = "fsspec"
+version = "2025.9.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/de/e0/bab50af11c2d75c9c4a2a26a5254573c0bd97cea152254401510950486fa/fsspec-2025.9.0.tar.gz", hash = "sha256:19fd429483d25d28b65ec68f9f4adc16c17ea2c7c7bf54ec61360d478fb19c19", size = 304847, upload-time = "2025-09-02T19:10:49.215Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/47/71/70db47e4f6ce3e5c37a607355f80da8860a33226be640226ac52cb05ef2e/fsspec-2025.9.0-py3-none-any.whl", hash = "sha256:530dc2a2af60a414a832059574df4a6e10cce927f6f4a78209390fe38955cfb7", size = 199289, upload-time = "2025-09-02T19:10:47.708Z" },
+]
+
+[package.optional-dependencies]
+http = [
+ { name = "aiohttp" },
+]
+
+[[package]]
+name = "ftfy"
+version = "6.3.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "wcwidth" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/a5/d3/8650919bc3c7c6e90ee3fa7fd618bf373cbbe55dff043bd67353dbb20cd8/ftfy-6.3.1.tar.gz", hash = "sha256:9b3c3d90f84fb267fe64d375a07b7f8912d817cf86009ae134aa03e1819506ec", size = 308927, upload-time = "2024-10-26T00:50:35.149Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/ab/6e/81d47999aebc1b155f81eca4477a616a70f238a2549848c38983f3c22a82/ftfy-6.3.1-py3-none-any.whl", hash = "sha256:7c70eb532015cd2f9adb53f101fb6c7945988d023a085d127d1573dc49dd0083", size = 44821, upload-time = "2024-10-26T00:50:33.425Z" },
+]
+
+[[package]]
+name = "h11"
+version = "0.16.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/01/ee/02a2c011bdab74c6fb3c75474d40b3052059d95df7e73351460c8588d963/h11-0.16.0.tar.gz", hash = "sha256:4e35b956cf45792e4caa5885e69fba00bdbc6ffafbfa020300e549b208ee5ff1", size = 101250, upload-time = "2025-04-24T03:35:25.427Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/04/4b/29cac41a4d98d144bf5f6d33995617b185d14b22401f75ca86f384e87ff1/h11-0.16.0-py3-none-any.whl", hash = "sha256:63cf8bbe7522de3bf65932fda1d9c2772064ffb3dae62d55932da54b31cb6c86", size = 37515, upload-time = "2025-04-24T03:35:24.344Z" },
+]
+
+[[package]]
+name = "httpcore"
+version = "1.0.9"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "certifi" },
+ { name = "h11" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/06/94/82699a10bca87a5556c9c59b5963f2d039dbd239f25bc2a63907a05a14cb/httpcore-1.0.9.tar.gz", hash = "sha256:6e34463af53fd2ab5d807f399a9b45ea31c3dfa2276f15a2c3f00afff6e176e8", size = 85484, upload-time = "2025-04-24T22:06:22.219Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/7e/f5/f66802a942d491edb555dd61e3a9961140fd64c90bce1eafd741609d334d/httpcore-1.0.9-py3-none-any.whl", hash = "sha256:2d400746a40668fc9dec9810239072b40b4484b640a8c38fd654a024c7a1bf55", size = 78784, upload-time = "2025-04-24T22:06:20.566Z" },
+]
+
+[[package]]
+name = "httpx"
+version = "0.28.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "anyio" },
+ { name = "certifi" },
+ { name = "httpcore" },
+ { name = "idna" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/b1/df/48c586a5fe32a0f01324ee087459e112ebb7224f646c0b5023f5e79e9956/httpx-0.28.1.tar.gz", hash = "sha256:75e98c5f16b0f35b567856f597f06ff2270a374470a5c2392242528e3e3e42fc", size = 141406, upload-time = "2024-12-06T15:37:23.222Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/2a/39/e50c7c3a983047577ee07d2a9e53faf5a69493943ec3f6a384bdc792deb2/httpx-0.28.1-py3-none-any.whl", hash = "sha256:d909fcccc110f8c7faf814ca82a9a4d816bc5a6dbfea25d6591d6985b8ba59ad", size = 73517, upload-time = "2024-12-06T15:37:21.509Z" },
+]
+
+[[package]]
+name = "huggingface-hub"
+version = "0.31.2"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "filelock" },
+ { name = "fsspec" },
+ { name = "packaging" },
+ { name = "pyyaml" },
+ { name = "requests" },
+ { name = "tqdm" },
+ { name = "typing-extensions" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/3b/7b/09ab792c463975fcd0a81f459b5e900057dabbbc274ff253bb28d58ebfce/huggingface_hub-0.31.2.tar.gz", hash = "sha256:7053561376ed7f6ffdaecf09cc54d70dc784ac6315fa4bb9b93e19662b029675", size = 403025, upload-time = "2025-05-13T09:45:43.617Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/83/81/a8fd9c226f7e3bc8918f1e456131717cb38e93f18ccc109bf3c8471e464f/huggingface_hub-0.31.2-py3-none-any.whl", hash = "sha256:8138cd52aa2326b4429bb00a4a1ba8538346b7b8a808cdce30acb6f1f1bdaeec", size = 484230, upload-time = "2025-05-13T09:45:41.977Z" },
+]
+
+[[package]]
+name = "idna"
+version = "3.11"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/6f/6d/0703ccc57f3a7233505399edb88de3cbd678da106337b9fcde432b65ed60/idna-3.11.tar.gz", hash = "sha256:795dafcc9c04ed0c1fb032c2aa73654d8e8c5023a7df64a53f39190ada629902", size = 194582, upload-time = "2025-10-12T14:55:20.501Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/0e/61/66938bbb5fc52dbdf84594873d5b51fb1f7c7794e9c0f5bd885f30bc507b/idna-3.11-py3-none-any.whl", hash = "sha256:771a87f49d9defaf64091e6e6fe9c18d4833f140bd19464795bc32d966ca37ea", size = 71008, upload-time = "2025-10-12T14:55:18.883Z" },
+]
+
+[[package]]
+name = "iniconfig"
+version = "2.3.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/72/34/14ca021ce8e5dfedc35312d08ba8bf51fdd999c576889fc2c24cb97f4f10/iniconfig-2.3.0.tar.gz", hash = "sha256:c76315c77db068650d49c5b56314774a7804df16fee4402c1f19d6d15d8c4730", size = 20503, upload-time = "2025-10-18T21:55:43.219Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/cb/b1/3846dd7f199d53cb17f49cba7e651e9ce294d8497c8c150530ed11865bb8/iniconfig-2.3.0-py3-none-any.whl", hash = "sha256:f631c04d2c48c52b84d0d0549c99ff3859c98df65b3101406327ecc7d53fbf12", size = 7484, upload-time = "2025-10-18T21:55:41.639Z" },
+]
+
+[[package]]
+name = "jinja2"
+version = "3.1.6"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "markupsafe" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/df/bf/f7da0350254c0ed7c72f3e33cef02e048281fec7ecec5f032d4aac52226b/jinja2-3.1.6.tar.gz", hash = "sha256:0137fb05990d35f1275a587e9aee6d56da821fc83491a0fb838183be43f66d6d", size = 245115, upload-time = "2025-03-05T20:05:02.478Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/62/a1/3d680cbfd5f4b8f15abc1d571870c5fc3e594bb582bc3b64ea099db13e56/jinja2-3.1.6-py3-none-any.whl", hash = "sha256:85ece4451f492d0c13c5dd7c13a64681a86afae63a5f347908daf103ce6d2f67", size = 134899, upload-time = "2025-03-05T20:05:00.369Z" },
+]
+
+[[package]]
+name = "joblib"
+version = "1.5.2"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/e8/5d/447af5ea094b9e4c4054f82e223ada074c552335b9b4b2d14bd9b35a67c4/joblib-1.5.2.tar.gz", hash = "sha256:3faa5c39054b2f03ca547da9b2f52fde67c06240c31853f306aea97f13647b55", size = 331077, upload-time = "2025-08-27T12:15:46.575Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/1e/e8/685f47e0d754320684db4425a0967f7d3fa70126bffd76110b7009a0090f/joblib-1.5.2-py3-none-any.whl", hash = "sha256:4e1f0bdbb987e6d843c70cf43714cb276623def372df3c22fe5266b2670bc241", size = 308396, upload-time = "2025-08-27T12:15:45.188Z" },
+]
+
+[[package]]
+name = "kiwisolver"
+version = "1.4.9"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/5c/3c/85844f1b0feb11ee581ac23fe5fce65cd049a200c1446708cc1b7f922875/kiwisolver-1.4.9.tar.gz", hash = "sha256:c3b22c26c6fd6811b0ae8363b95ca8ce4ea3c202d3d0975b2914310ceb1bcc4d", size = 97564, upload-time = "2025-08-10T21:27:49.279Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/c6/5d/8ce64e36d4e3aac5ca96996457dcf33e34e6051492399a3f1fec5657f30b/kiwisolver-1.4.9-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:b4b4d74bda2b8ebf4da5bd42af11d02d04428b2c32846e4c2c93219df8a7987b", size = 124159, upload-time = "2025-08-10T21:25:35.472Z" },
+ { url = "https://files.pythonhosted.org/packages/96/1e/22f63ec454874378175a5f435d6ea1363dd33fb2af832c6643e4ccea0dc8/kiwisolver-1.4.9-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:fb3b8132019ea572f4611d770991000d7f58127560c4889729248eb5852a102f", size = 66578, upload-time = "2025-08-10T21:25:36.73Z" },
+ { url = "https://files.pythonhosted.org/packages/41/4c/1925dcfff47a02d465121967b95151c82d11027d5ec5242771e580e731bd/kiwisolver-1.4.9-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:84fd60810829c27ae375114cd379da1fa65e6918e1da405f356a775d49a62bcf", size = 65312, upload-time = "2025-08-10T21:25:37.658Z" },
+ { url = "https://files.pythonhosted.org/packages/d4/42/0f333164e6307a0687d1eb9ad256215aae2f4bd5d28f4653d6cd319a3ba3/kiwisolver-1.4.9-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:b78efa4c6e804ecdf727e580dbb9cba85624d2e1c6b5cb059c66290063bd99a9", size = 1628458, upload-time = "2025-08-10T21:25:39.067Z" },
+ { url = "https://files.pythonhosted.org/packages/86/b6/2dccb977d651943995a90bfe3495c2ab2ba5cd77093d9f2318a20c9a6f59/kiwisolver-1.4.9-cp310-cp310-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d4efec7bcf21671db6a3294ff301d2fc861c31faa3c8740d1a94689234d1b415", size = 1225640, upload-time = "2025-08-10T21:25:40.489Z" },
+ { url = "https://files.pythonhosted.org/packages/50/2b/362ebd3eec46c850ccf2bfe3e30f2fc4c008750011f38a850f088c56a1c6/kiwisolver-1.4.9-cp310-cp310-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:90f47e70293fc3688b71271100a1a5453aa9944a81d27ff779c108372cf5567b", size = 1244074, upload-time = "2025-08-10T21:25:42.221Z" },
+ { url = "https://files.pythonhosted.org/packages/6f/bb/f09a1e66dab8984773d13184a10a29fe67125337649d26bdef547024ed6b/kiwisolver-1.4.9-cp310-cp310-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:8fdca1def57a2e88ef339de1737a1449d6dbf5fab184c54a1fca01d541317154", size = 1293036, upload-time = "2025-08-10T21:25:43.801Z" },
+ { url = "https://files.pythonhosted.org/packages/ea/01/11ecf892f201cafda0f68fa59212edaea93e96c37884b747c181303fccd1/kiwisolver-1.4.9-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:9cf554f21be770f5111a1690d42313e140355e687e05cf82cb23d0a721a64a48", size = 2175310, upload-time = "2025-08-10T21:25:45.045Z" },
+ { url = "https://files.pythonhosted.org/packages/7f/5f/bfe11d5b934f500cc004314819ea92427e6e5462706a498c1d4fc052e08f/kiwisolver-1.4.9-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:fc1795ac5cd0510207482c3d1d3ed781143383b8cfd36f5c645f3897ce066220", size = 2270943, upload-time = "2025-08-10T21:25:46.393Z" },
+ { url = "https://files.pythonhosted.org/packages/3d/de/259f786bf71f1e03e73d87e2db1a9a3bcab64d7b4fd780167123161630ad/kiwisolver-1.4.9-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:ccd09f20ccdbbd341b21a67ab50a119b64a403b09288c27481575105283c1586", size = 2440488, upload-time = "2025-08-10T21:25:48.074Z" },
+ { url = "https://files.pythonhosted.org/packages/1b/76/c989c278faf037c4d3421ec07a5c452cd3e09545d6dae7f87c15f54e4edf/kiwisolver-1.4.9-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:540c7c72324d864406a009d72f5d6856f49693db95d1fbb46cf86febef873634", size = 2246787, upload-time = "2025-08-10T21:25:49.442Z" },
+ { url = "https://files.pythonhosted.org/packages/a2/55/c2898d84ca440852e560ca9f2a0d28e6e931ac0849b896d77231929900e7/kiwisolver-1.4.9-cp310-cp310-win_amd64.whl", hash = "sha256:ede8c6d533bc6601a47ad4046080d36b8fc99f81e6f1c17b0ac3c2dc91ac7611", size = 73730, upload-time = "2025-08-10T21:25:51.102Z" },
+ { url = "https://files.pythonhosted.org/packages/e8/09/486d6ac523dd33b80b368247f238125d027964cfacb45c654841e88fb2ae/kiwisolver-1.4.9-cp310-cp310-win_arm64.whl", hash = "sha256:7b4da0d01ac866a57dd61ac258c5607b4cd677f63abaec7b148354d2b2cdd536", size = 65036, upload-time = "2025-08-10T21:25:52.063Z" },
+ { url = "https://files.pythonhosted.org/packages/6f/ab/c80b0d5a9d8a1a65f4f815f2afff9798b12c3b9f31f1d304dd233dd920e2/kiwisolver-1.4.9-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:eb14a5da6dc7642b0f3a18f13654847cd8b7a2550e2645a5bda677862b03ba16", size = 124167, upload-time = "2025-08-10T21:25:53.403Z" },
+ { url = "https://files.pythonhosted.org/packages/a0/c0/27fe1a68a39cf62472a300e2879ffc13c0538546c359b86f149cc19f6ac3/kiwisolver-1.4.9-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:39a219e1c81ae3b103643d2aedb90f1ef22650deb266ff12a19e7773f3e5f089", size = 66579, upload-time = "2025-08-10T21:25:54.79Z" },
+ { url = "https://files.pythonhosted.org/packages/31/a2/a12a503ac1fd4943c50f9822678e8015a790a13b5490354c68afb8489814/kiwisolver-1.4.9-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:2405a7d98604b87f3fc28b1716783534b1b4b8510d8142adca34ee0bc3c87543", size = 65309, upload-time = "2025-08-10T21:25:55.76Z" },
+ { url = "https://files.pythonhosted.org/packages/66/e1/e533435c0be77c3f64040d68d7a657771194a63c279f55573188161e81ca/kiwisolver-1.4.9-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:dc1ae486f9abcef254b5618dfb4113dd49f94c68e3e027d03cf0143f3f772b61", size = 1435596, upload-time = "2025-08-10T21:25:56.861Z" },
+ { url = "https://files.pythonhosted.org/packages/67/1e/51b73c7347f9aabdc7215aa79e8b15299097dc2f8e67dee2b095faca9cb0/kiwisolver-1.4.9-cp311-cp311-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8a1f570ce4d62d718dce3f179ee78dac3b545ac16c0c04bb363b7607a949c0d1", size = 1246548, upload-time = "2025-08-10T21:25:58.246Z" },
+ { url = "https://files.pythonhosted.org/packages/21/aa/72a1c5d1e430294f2d32adb9542719cfb441b5da368d09d268c7757af46c/kiwisolver-1.4.9-cp311-cp311-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:cb27e7b78d716c591e88e0a09a2139c6577865d7f2e152488c2cc6257f460872", size = 1263618, upload-time = "2025-08-10T21:25:59.857Z" },
+ { url = "https://files.pythonhosted.org/packages/a3/af/db1509a9e79dbf4c260ce0cfa3903ea8945f6240e9e59d1e4deb731b1a40/kiwisolver-1.4.9-cp311-cp311-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:15163165efc2f627eb9687ea5f3a28137217d217ac4024893d753f46bce9de26", size = 1317437, upload-time = "2025-08-10T21:26:01.105Z" },
+ { url = "https://files.pythonhosted.org/packages/e0/f2/3ea5ee5d52abacdd12013a94130436e19969fa183faa1e7c7fbc89e9a42f/kiwisolver-1.4.9-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:bdee92c56a71d2b24c33a7d4c2856bd6419d017e08caa7802d2963870e315028", size = 2195742, upload-time = "2025-08-10T21:26:02.675Z" },
+ { url = "https://files.pythonhosted.org/packages/6f/9b/1efdd3013c2d9a2566aa6a337e9923a00590c516add9a1e89a768a3eb2fc/kiwisolver-1.4.9-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:412f287c55a6f54b0650bd9b6dce5aceddb95864a1a90c87af16979d37c89771", size = 2290810, upload-time = "2025-08-10T21:26:04.009Z" },
+ { url = "https://files.pythonhosted.org/packages/fb/e5/cfdc36109ae4e67361f9bc5b41323648cb24a01b9ade18784657e022e65f/kiwisolver-1.4.9-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:2c93f00dcba2eea70af2be5f11a830a742fe6b579a1d4e00f47760ef13be247a", size = 2461579, upload-time = "2025-08-10T21:26:05.317Z" },
+ { url = "https://files.pythonhosted.org/packages/62/86/b589e5e86c7610842213994cdea5add00960076bef4ae290c5fa68589cac/kiwisolver-1.4.9-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:f117e1a089d9411663a3207ba874f31be9ac8eaa5b533787024dc07aeb74f464", size = 2268071, upload-time = "2025-08-10T21:26:06.686Z" },
+ { url = "https://files.pythonhosted.org/packages/3b/c6/f8df8509fd1eee6c622febe54384a96cfaf4d43bf2ccec7a0cc17e4715c9/kiwisolver-1.4.9-cp311-cp311-win_amd64.whl", hash = "sha256:be6a04e6c79819c9a8c2373317d19a96048e5a3f90bec587787e86a1153883c2", size = 73840, upload-time = "2025-08-10T21:26:07.94Z" },
+ { url = "https://files.pythonhosted.org/packages/e2/2d/16e0581daafd147bc11ac53f032a2b45eabac897f42a338d0a13c1e5c436/kiwisolver-1.4.9-cp311-cp311-win_arm64.whl", hash = "sha256:0ae37737256ba2de764ddc12aed4956460277f00c4996d51a197e72f62f5eec7", size = 65159, upload-time = "2025-08-10T21:26:09.048Z" },
+ { url = "https://files.pythonhosted.org/packages/86/c9/13573a747838aeb1c76e3267620daa054f4152444d1f3d1a2324b78255b5/kiwisolver-1.4.9-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:ac5a486ac389dddcc5bef4f365b6ae3ffff2c433324fb38dd35e3fab7c957999", size = 123686, upload-time = "2025-08-10T21:26:10.034Z" },
+ { url = "https://files.pythonhosted.org/packages/51/ea/2ecf727927f103ffd1739271ca19c424d0e65ea473fbaeea1c014aea93f6/kiwisolver-1.4.9-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:f2ba92255faa7309d06fe44c3a4a97efe1c8d640c2a79a5ef728b685762a6fd2", size = 66460, upload-time = "2025-08-10T21:26:11.083Z" },
+ { url = "https://files.pythonhosted.org/packages/5b/5a/51f5464373ce2aeb5194508298a508b6f21d3867f499556263c64c621914/kiwisolver-1.4.9-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:4a2899935e724dd1074cb568ce7ac0dce28b2cd6ab539c8e001a8578eb106d14", size = 64952, upload-time = "2025-08-10T21:26:12.058Z" },
+ { url = "https://files.pythonhosted.org/packages/70/90/6d240beb0f24b74371762873e9b7f499f1e02166a2d9c5801f4dbf8fa12e/kiwisolver-1.4.9-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:f6008a4919fdbc0b0097089f67a1eb55d950ed7e90ce2cc3e640abadd2757a04", size = 1474756, upload-time = "2025-08-10T21:26:13.096Z" },
+ { url = "https://files.pythonhosted.org/packages/12/42/f36816eaf465220f683fb711efdd1bbf7a7005a2473d0e4ed421389bd26c/kiwisolver-1.4.9-cp312-cp312-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:67bb8b474b4181770f926f7b7d2f8c0248cbcb78b660fdd41a47054b28d2a752", size = 1276404, upload-time = "2025-08-10T21:26:14.457Z" },
+ { url = "https://files.pythonhosted.org/packages/2e/64/bc2de94800adc830c476dce44e9b40fd0809cddeef1fde9fcf0f73da301f/kiwisolver-1.4.9-cp312-cp312-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:2327a4a30d3ee07d2fbe2e7933e8a37c591663b96ce42a00bc67461a87d7df77", size = 1294410, upload-time = "2025-08-10T21:26:15.73Z" },
+ { url = "https://files.pythonhosted.org/packages/5f/42/2dc82330a70aa8e55b6d395b11018045e58d0bb00834502bf11509f79091/kiwisolver-1.4.9-cp312-cp312-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:7a08b491ec91b1d5053ac177afe5290adacf1f0f6307d771ccac5de30592d198", size = 1343631, upload-time = "2025-08-10T21:26:17.045Z" },
+ { url = "https://files.pythonhosted.org/packages/22/fd/f4c67a6ed1aab149ec5a8a401c323cee7a1cbe364381bb6c9c0d564e0e20/kiwisolver-1.4.9-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:d8fc5c867c22b828001b6a38d2eaeb88160bf5783c6cb4a5e440efc981ce286d", size = 2224963, upload-time = "2025-08-10T21:26:18.737Z" },
+ { url = "https://files.pythonhosted.org/packages/45/aa/76720bd4cb3713314677d9ec94dcc21ced3f1baf4830adde5bb9b2430a5f/kiwisolver-1.4.9-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:3b3115b2581ea35bb6d1f24a4c90af37e5d9b49dcff267eeed14c3893c5b86ab", size = 2321295, upload-time = "2025-08-10T21:26:20.11Z" },
+ { url = "https://files.pythonhosted.org/packages/80/19/d3ec0d9ab711242f56ae0dc2fc5d70e298bb4a1f9dfab44c027668c673a1/kiwisolver-1.4.9-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:858e4c22fb075920b96a291928cb7dea5644e94c0ee4fcd5af7e865655e4ccf2", size = 2487987, upload-time = "2025-08-10T21:26:21.49Z" },
+ { url = "https://files.pythonhosted.org/packages/39/e9/61e4813b2c97e86b6fdbd4dd824bf72d28bcd8d4849b8084a357bc0dd64d/kiwisolver-1.4.9-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ed0fecd28cc62c54b262e3736f8bb2512d8dcfdc2bcf08be5f47f96bf405b145", size = 2291817, upload-time = "2025-08-10T21:26:22.812Z" },
+ { url = "https://files.pythonhosted.org/packages/a0/41/85d82b0291db7504da3c2defe35c9a8a5c9803a730f297bd823d11d5fb77/kiwisolver-1.4.9-cp312-cp312-win_amd64.whl", hash = "sha256:f68208a520c3d86ea51acf688a3e3002615a7f0238002cccc17affecc86a8a54", size = 73895, upload-time = "2025-08-10T21:26:24.37Z" },
+ { url = "https://files.pythonhosted.org/packages/e2/92/5f3068cf15ee5cb624a0c7596e67e2a0bb2adee33f71c379054a491d07da/kiwisolver-1.4.9-cp312-cp312-win_arm64.whl", hash = "sha256:2c1a4f57df73965f3f14df20b80ee29e6a7930a57d2d9e8491a25f676e197c60", size = 64992, upload-time = "2025-08-10T21:26:25.732Z" },
+ { url = "https://files.pythonhosted.org/packages/31/c1/c2686cda909742ab66c7388e9a1a8521a59eb89f8bcfbee28fc980d07e24/kiwisolver-1.4.9-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:a5d0432ccf1c7ab14f9949eec60c5d1f924f17c037e9f8b33352fa05799359b8", size = 123681, upload-time = "2025-08-10T21:26:26.725Z" },
+ { url = "https://files.pythonhosted.org/packages/ca/f0/f44f50c9f5b1a1860261092e3bc91ecdc9acda848a8b8c6abfda4a24dd5c/kiwisolver-1.4.9-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:efb3a45b35622bb6c16dbfab491a8f5a391fe0e9d45ef32f4df85658232ca0e2", size = 66464, upload-time = "2025-08-10T21:26:27.733Z" },
+ { url = "https://files.pythonhosted.org/packages/2d/7a/9d90a151f558e29c3936b8a47ac770235f436f2120aca41a6d5f3d62ae8d/kiwisolver-1.4.9-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:1a12cf6398e8a0a001a059747a1cbf24705e18fe413bc22de7b3d15c67cffe3f", size = 64961, upload-time = "2025-08-10T21:26:28.729Z" },
+ { url = "https://files.pythonhosted.org/packages/e9/e9/f218a2cb3a9ffbe324ca29a9e399fa2d2866d7f348ec3a88df87fc248fc5/kiwisolver-1.4.9-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:b67e6efbf68e077dd71d1a6b37e43e1a99d0bff1a3d51867d45ee8908b931098", size = 1474607, upload-time = "2025-08-10T21:26:29.798Z" },
+ { url = "https://files.pythonhosted.org/packages/d9/28/aac26d4c882f14de59041636292bc838db8961373825df23b8eeb807e198/kiwisolver-1.4.9-cp313-cp313-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5656aa670507437af0207645273ccdfee4f14bacd7f7c67a4306d0dcaeaf6eed", size = 1276546, upload-time = "2025-08-10T21:26:31.401Z" },
+ { url = "https://files.pythonhosted.org/packages/8b/ad/8bfc1c93d4cc565e5069162f610ba2f48ff39b7de4b5b8d93f69f30c4bed/kiwisolver-1.4.9-cp313-cp313-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:bfc08add558155345129c7803b3671cf195e6a56e7a12f3dde7c57d9b417f525", size = 1294482, upload-time = "2025-08-10T21:26:32.721Z" },
+ { url = "https://files.pythonhosted.org/packages/da/f1/6aca55ff798901d8ce403206d00e033191f63d82dd708a186e0ed2067e9c/kiwisolver-1.4.9-cp313-cp313-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:40092754720b174e6ccf9e845d0d8c7d8e12c3d71e7fc35f55f3813e96376f78", size = 1343720, upload-time = "2025-08-10T21:26:34.032Z" },
+ { url = "https://files.pythonhosted.org/packages/d1/91/eed031876c595c81d90d0f6fc681ece250e14bf6998c3d7c419466b523b7/kiwisolver-1.4.9-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:497d05f29a1300d14e02e6441cf0f5ee81c1ff5a304b0d9fb77423974684e08b", size = 2224907, upload-time = "2025-08-10T21:26:35.824Z" },
+ { url = "https://files.pythonhosted.org/packages/e9/ec/4d1925f2e49617b9cca9c34bfa11adefad49d00db038e692a559454dfb2e/kiwisolver-1.4.9-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:bdd1a81a1860476eb41ac4bc1e07b3f07259e6d55bbf739b79c8aaedcf512799", size = 2321334, upload-time = "2025-08-10T21:26:37.534Z" },
+ { url = "https://files.pythonhosted.org/packages/43/cb/450cd4499356f68802750c6ddc18647b8ea01ffa28f50d20598e0befe6e9/kiwisolver-1.4.9-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:e6b93f13371d341afee3be9f7c5964e3fe61d5fa30f6a30eb49856935dfe4fc3", size = 2488313, upload-time = "2025-08-10T21:26:39.191Z" },
+ { url = "https://files.pythonhosted.org/packages/71/67/fc76242bd99f885651128a5d4fa6083e5524694b7c88b489b1b55fdc491d/kiwisolver-1.4.9-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:d75aa530ccfaa593da12834b86a0724f58bff12706659baa9227c2ccaa06264c", size = 2291970, upload-time = "2025-08-10T21:26:40.828Z" },
+ { url = "https://files.pythonhosted.org/packages/75/bd/f1a5d894000941739f2ae1b65a32892349423ad49c2e6d0771d0bad3fae4/kiwisolver-1.4.9-cp313-cp313-win_amd64.whl", hash = "sha256:dd0a578400839256df88c16abddf9ba14813ec5f21362e1fe65022e00c883d4d", size = 73894, upload-time = "2025-08-10T21:26:42.33Z" },
+ { url = "https://files.pythonhosted.org/packages/95/38/dce480814d25b99a391abbddadc78f7c117c6da34be68ca8b02d5848b424/kiwisolver-1.4.9-cp313-cp313-win_arm64.whl", hash = "sha256:d4188e73af84ca82468f09cadc5ac4db578109e52acb4518d8154698d3a87ca2", size = 64995, upload-time = "2025-08-10T21:26:43.889Z" },
+ { url = "https://files.pythonhosted.org/packages/e2/37/7d218ce5d92dadc5ebdd9070d903e0c7cf7edfe03f179433ac4d13ce659c/kiwisolver-1.4.9-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:5a0f2724dfd4e3b3ac5a82436a8e6fd16baa7d507117e4279b660fe8ca38a3a1", size = 126510, upload-time = "2025-08-10T21:26:44.915Z" },
+ { url = "https://files.pythonhosted.org/packages/23/b0/e85a2b48233daef4b648fb657ebbb6f8367696a2d9548a00b4ee0eb67803/kiwisolver-1.4.9-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:1b11d6a633e4ed84fc0ddafd4ebfd8ea49b3f25082c04ad12b8315c11d504dc1", size = 67903, upload-time = "2025-08-10T21:26:45.934Z" },
+ { url = "https://files.pythonhosted.org/packages/44/98/f2425bc0113ad7de24da6bb4dae1343476e95e1d738be7c04d31a5d037fd/kiwisolver-1.4.9-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:61874cdb0a36016354853593cffc38e56fc9ca5aa97d2c05d3dcf6922cd55a11", size = 66402, upload-time = "2025-08-10T21:26:47.101Z" },
+ { url = "https://files.pythonhosted.org/packages/98/d8/594657886df9f34c4177cc353cc28ca7e6e5eb562d37ccc233bff43bbe2a/kiwisolver-1.4.9-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:60c439763a969a6af93b4881db0eed8fadf93ee98e18cbc35bc8da868d0c4f0c", size = 1582135, upload-time = "2025-08-10T21:26:48.665Z" },
+ { url = "https://files.pythonhosted.org/packages/5c/c6/38a115b7170f8b306fc929e166340c24958347308ea3012c2b44e7e295db/kiwisolver-1.4.9-cp313-cp313t-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:92a2f997387a1b79a75e7803aa7ded2cfbe2823852ccf1ba3bcf613b62ae3197", size = 1389409, upload-time = "2025-08-10T21:26:50.335Z" },
+ { url = "https://files.pythonhosted.org/packages/bf/3b/e04883dace81f24a568bcee6eb3001da4ba05114afa622ec9b6fafdc1f5e/kiwisolver-1.4.9-cp313-cp313t-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a31d512c812daea6d8b3be3b2bfcbeb091dbb09177706569bcfc6240dcf8b41c", size = 1401763, upload-time = "2025-08-10T21:26:51.867Z" },
+ { url = "https://files.pythonhosted.org/packages/9f/80/20ace48e33408947af49d7d15c341eaee69e4e0304aab4b7660e234d6288/kiwisolver-1.4.9-cp313-cp313t-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:52a15b0f35dad39862d376df10c5230155243a2c1a436e39eb55623ccbd68185", size = 1453643, upload-time = "2025-08-10T21:26:53.592Z" },
+ { url = "https://files.pythonhosted.org/packages/64/31/6ce4380a4cd1f515bdda976a1e90e547ccd47b67a1546d63884463c92ca9/kiwisolver-1.4.9-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:a30fd6fdef1430fd9e1ba7b3398b5ee4e2887783917a687d86ba69985fb08748", size = 2330818, upload-time = "2025-08-10T21:26:55.051Z" },
+ { url = "https://files.pythonhosted.org/packages/fa/e9/3f3fcba3bcc7432c795b82646306e822f3fd74df0ee81f0fa067a1f95668/kiwisolver-1.4.9-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:cc9617b46837c6468197b5945e196ee9ca43057bb7d9d1ae688101e4e1dddf64", size = 2419963, upload-time = "2025-08-10T21:26:56.421Z" },
+ { url = "https://files.pythonhosted.org/packages/99/43/7320c50e4133575c66e9f7dadead35ab22d7c012a3b09bb35647792b2a6d/kiwisolver-1.4.9-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:0ab74e19f6a2b027ea4f845a78827969af45ce790e6cb3e1ebab71bdf9f215ff", size = 2594639, upload-time = "2025-08-10T21:26:57.882Z" },
+ { url = "https://files.pythonhosted.org/packages/65/d6/17ae4a270d4a987ef8a385b906d2bdfc9fce502d6dc0d3aea865b47f548c/kiwisolver-1.4.9-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:dba5ee5d3981160c28d5490f0d1b7ed730c22470ff7f6cc26cfcfaacb9896a07", size = 2391741, upload-time = "2025-08-10T21:26:59.237Z" },
+ { url = "https://files.pythonhosted.org/packages/2a/8f/8f6f491d595a9e5912971f3f863d81baddccc8a4d0c3749d6a0dd9ffc9df/kiwisolver-1.4.9-cp313-cp313t-win_arm64.whl", hash = "sha256:0749fd8f4218ad2e851e11cc4dc05c7cbc0cbc4267bdfdb31782e65aace4ee9c", size = 68646, upload-time = "2025-08-10T21:27:00.52Z" },
+ { url = "https://files.pythonhosted.org/packages/6b/32/6cc0fbc9c54d06c2969faa9c1d29f5751a2e51809dd55c69055e62d9b426/kiwisolver-1.4.9-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:9928fe1eb816d11ae170885a74d074f57af3a0d65777ca47e9aeb854a1fba386", size = 123806, upload-time = "2025-08-10T21:27:01.537Z" },
+ { url = "https://files.pythonhosted.org/packages/b2/dd/2bfb1d4a4823d92e8cbb420fe024b8d2167f72079b3bb941207c42570bdf/kiwisolver-1.4.9-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:d0005b053977e7b43388ddec89fa567f43d4f6d5c2c0affe57de5ebf290dc552", size = 66605, upload-time = "2025-08-10T21:27:03.335Z" },
+ { url = "https://files.pythonhosted.org/packages/f7/69/00aafdb4e4509c2ca6064646cba9cd4b37933898f426756adb2cb92ebbed/kiwisolver-1.4.9-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:2635d352d67458b66fd0667c14cb1d4145e9560d503219034a18a87e971ce4f3", size = 64925, upload-time = "2025-08-10T21:27:04.339Z" },
+ { url = "https://files.pythonhosted.org/packages/43/dc/51acc6791aa14e5cb6d8a2e28cefb0dc2886d8862795449d021334c0df20/kiwisolver-1.4.9-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:767c23ad1c58c9e827b649a9ab7809fd5fd9db266a9cf02b0e926ddc2c680d58", size = 1472414, upload-time = "2025-08-10T21:27:05.437Z" },
+ { url = "https://files.pythonhosted.org/packages/3d/bb/93fa64a81db304ac8a246f834d5094fae4b13baf53c839d6bb6e81177129/kiwisolver-1.4.9-cp314-cp314-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:72d0eb9fba308b8311685c2268cf7d0a0639a6cd027d8128659f72bdd8a024b4", size = 1281272, upload-time = "2025-08-10T21:27:07.063Z" },
+ { url = "https://files.pythonhosted.org/packages/70/e6/6df102916960fb8d05069d4bd92d6d9a8202d5a3e2444494e7cd50f65b7a/kiwisolver-1.4.9-cp314-cp314-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:f68e4f3eeca8fb22cc3d731f9715a13b652795ef657a13df1ad0c7dc0e9731df", size = 1298578, upload-time = "2025-08-10T21:27:08.452Z" },
+ { url = "https://files.pythonhosted.org/packages/7c/47/e142aaa612f5343736b087864dbaebc53ea8831453fb47e7521fa8658f30/kiwisolver-1.4.9-cp314-cp314-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:d84cd4061ae292d8ac367b2c3fa3aad11cb8625a95d135fe93f286f914f3f5a6", size = 1345607, upload-time = "2025-08-10T21:27:10.125Z" },
+ { url = "https://files.pythonhosted.org/packages/54/89/d641a746194a0f4d1a3670fb900d0dbaa786fb98341056814bc3f058fa52/kiwisolver-1.4.9-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:a60ea74330b91bd22a29638940d115df9dc00af5035a9a2a6ad9399ffb4ceca5", size = 2230150, upload-time = "2025-08-10T21:27:11.484Z" },
+ { url = "https://files.pythonhosted.org/packages/aa/6b/5ee1207198febdf16ac11f78c5ae40861b809cbe0e6d2a8d5b0b3044b199/kiwisolver-1.4.9-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:ce6a3a4e106cf35c2d9c4fa17c05ce0b180db622736845d4315519397a77beaf", size = 2325979, upload-time = "2025-08-10T21:27:12.917Z" },
+ { url = "https://files.pythonhosted.org/packages/fc/ff/b269eefd90f4ae14dcc74973d5a0f6d28d3b9bb1afd8c0340513afe6b39a/kiwisolver-1.4.9-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:77937e5e2a38a7b48eef0585114fe7930346993a88060d0bf886086d2aa49ef5", size = 2491456, upload-time = "2025-08-10T21:27:14.353Z" },
+ { url = "https://files.pythonhosted.org/packages/fc/d4/10303190bd4d30de547534601e259a4fbf014eed94aae3e5521129215086/kiwisolver-1.4.9-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:24c175051354f4a28c5d6a31c93906dc653e2bf234e8a4bbfb964892078898ce", size = 2294621, upload-time = "2025-08-10T21:27:15.808Z" },
+ { url = "https://files.pythonhosted.org/packages/28/e0/a9a90416fce5c0be25742729c2ea52105d62eda6c4be4d803c2a7be1fa50/kiwisolver-1.4.9-cp314-cp314-win_amd64.whl", hash = "sha256:0763515d4df10edf6d06a3c19734e2566368980d21ebec439f33f9eb936c07b7", size = 75417, upload-time = "2025-08-10T21:27:17.436Z" },
+ { url = "https://files.pythonhosted.org/packages/1f/10/6949958215b7a9a264299a7db195564e87900f709db9245e4ebdd3c70779/kiwisolver-1.4.9-cp314-cp314-win_arm64.whl", hash = "sha256:0e4e2bf29574a6a7b7f6cb5fa69293b9f96c928949ac4a53ba3f525dffb87f9c", size = 66582, upload-time = "2025-08-10T21:27:18.436Z" },
+ { url = "https://files.pythonhosted.org/packages/ec/79/60e53067903d3bc5469b369fe0dfc6b3482e2133e85dae9daa9527535991/kiwisolver-1.4.9-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:d976bbb382b202f71c67f77b0ac11244021cfa3f7dfd9e562eefcea2df711548", size = 126514, upload-time = "2025-08-10T21:27:19.465Z" },
+ { url = "https://files.pythonhosted.org/packages/25/d1/4843d3e8d46b072c12a38c97c57fab4608d36e13fe47d47ee96b4d61ba6f/kiwisolver-1.4.9-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:2489e4e5d7ef9a1c300a5e0196e43d9c739f066ef23270607d45aba368b91f2d", size = 67905, upload-time = "2025-08-10T21:27:20.51Z" },
+ { url = "https://files.pythonhosted.org/packages/8c/ae/29ffcbd239aea8b93108de1278271ae764dfc0d803a5693914975f200596/kiwisolver-1.4.9-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:e2ea9f7ab7fbf18fffb1b5434ce7c69a07582f7acc7717720f1d69f3e806f90c", size = 66399, upload-time = "2025-08-10T21:27:21.496Z" },
+ { url = "https://files.pythonhosted.org/packages/a1/ae/d7ba902aa604152c2ceba5d352d7b62106bedbccc8e95c3934d94472bfa3/kiwisolver-1.4.9-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:b34e51affded8faee0dfdb705416153819d8ea9250bbbf7ea1b249bdeb5f1122", size = 1582197, upload-time = "2025-08-10T21:27:22.604Z" },
+ { url = "https://files.pythonhosted.org/packages/f2/41/27c70d427eddb8bc7e4f16420a20fefc6f480312122a59a959fdfe0445ad/kiwisolver-1.4.9-cp314-cp314t-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d8aacd3d4b33b772542b2e01beb50187536967b514b00003bdda7589722d2a64", size = 1390125, upload-time = "2025-08-10T21:27:24.036Z" },
+ { url = "https://files.pythonhosted.org/packages/41/42/b3799a12bafc76d962ad69083f8b43b12bf4fe78b097b12e105d75c9b8f1/kiwisolver-1.4.9-cp314-cp314t-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:7cf974dd4e35fa315563ac99d6287a1024e4dc2077b8a7d7cd3d2fb65d283134", size = 1402612, upload-time = "2025-08-10T21:27:25.773Z" },
+ { url = "https://files.pythonhosted.org/packages/d2/b5/a210ea073ea1cfaca1bb5c55a62307d8252f531beb364e18aa1e0888b5a0/kiwisolver-1.4.9-cp314-cp314t-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:85bd218b5ecfbee8c8a82e121802dcb519a86044c9c3b2e4aef02fa05c6da370", size = 1453990, upload-time = "2025-08-10T21:27:27.089Z" },
+ { url = "https://files.pythonhosted.org/packages/5f/ce/a829eb8c033e977d7ea03ed32fb3c1781b4fa0433fbadfff29e39c676f32/kiwisolver-1.4.9-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:0856e241c2d3df4efef7c04a1e46b1936b6120c9bcf36dd216e3acd84bc4fb21", size = 2331601, upload-time = "2025-08-10T21:27:29.343Z" },
+ { url = "https://files.pythonhosted.org/packages/e0/4b/b5e97eb142eb9cd0072dacfcdcd31b1c66dc7352b0f7c7255d339c0edf00/kiwisolver-1.4.9-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:9af39d6551f97d31a4deebeac6f45b156f9755ddc59c07b402c148f5dbb6482a", size = 2422041, upload-time = "2025-08-10T21:27:30.754Z" },
+ { url = "https://files.pythonhosted.org/packages/40/be/8eb4cd53e1b85ba4edc3a9321666f12b83113a178845593307a3e7891f44/kiwisolver-1.4.9-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:bb4ae2b57fc1d8cbd1cf7b1d9913803681ffa903e7488012be5b76dedf49297f", size = 2594897, upload-time = "2025-08-10T21:27:32.803Z" },
+ { url = "https://files.pythonhosted.org/packages/99/dd/841e9a66c4715477ea0abc78da039832fbb09dac5c35c58dc4c41a407b8a/kiwisolver-1.4.9-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:aedff62918805fb62d43a4aa2ecd4482c380dc76cd31bd7c8878588a61bd0369", size = 2391835, upload-time = "2025-08-10T21:27:34.23Z" },
+ { url = "https://files.pythonhosted.org/packages/0c/28/4b2e5c47a0da96896fdfdb006340ade064afa1e63675d01ea5ac222b6d52/kiwisolver-1.4.9-cp314-cp314t-win_amd64.whl", hash = "sha256:1fa333e8b2ce4d9660f2cda9c0e1b6bafcfb2457a9d259faa82289e73ec24891", size = 79988, upload-time = "2025-08-10T21:27:35.587Z" },
+ { url = "https://files.pythonhosted.org/packages/80/be/3578e8afd18c88cdf9cb4cffde75a96d2be38c5a903f1ed0ceec061bd09e/kiwisolver-1.4.9-cp314-cp314t-win_arm64.whl", hash = "sha256:4a48a2ce79d65d363597ef7b567ce3d14d68783d2b2263d98db3d9477805ba32", size = 70260, upload-time = "2025-08-10T21:27:36.606Z" },
+ { url = "https://files.pythonhosted.org/packages/a2/63/fde392691690f55b38d5dd7b3710f5353bf7a8e52de93a22968801ab8978/kiwisolver-1.4.9-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:4d1d9e582ad4d63062d34077a9a1e9f3c34088a2ec5135b1f7190c07cf366527", size = 60183, upload-time = "2025-08-10T21:27:37.669Z" },
+ { url = "https://files.pythonhosted.org/packages/27/b1/6aad34edfdb7cced27f371866f211332bba215bfd918ad3322a58f480d8b/kiwisolver-1.4.9-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:deed0c7258ceb4c44ad5ec7d9918f9f14fd05b2be86378d86cf50e63d1e7b771", size = 58675, upload-time = "2025-08-10T21:27:39.031Z" },
+ { url = "https://files.pythonhosted.org/packages/9d/1a/23d855a702bb35a76faed5ae2ba3de57d323f48b1f6b17ee2176c4849463/kiwisolver-1.4.9-pp310-pypy310_pp73-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:0a590506f303f512dff6b7f75fd2fd18e16943efee932008fe7140e5fa91d80e", size = 80277, upload-time = "2025-08-10T21:27:40.129Z" },
+ { url = "https://files.pythonhosted.org/packages/5a/5b/5239e3c2b8fb5afa1e8508f721bb77325f740ab6994d963e61b2b7abcc1e/kiwisolver-1.4.9-pp310-pypy310_pp73-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e09c2279a4d01f099f52d5c4b3d9e208e91edcbd1a175c9662a8b16e000fece9", size = 77994, upload-time = "2025-08-10T21:27:41.181Z" },
+ { url = "https://files.pythonhosted.org/packages/f9/1c/5d4d468fb16f8410e596ed0eac02d2c68752aa7dc92997fe9d60a7147665/kiwisolver-1.4.9-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:c9e7cdf45d594ee04d5be1b24dd9d49f3d1590959b2271fb30b5ca2b262c00fb", size = 73744, upload-time = "2025-08-10T21:27:42.254Z" },
+ { url = "https://files.pythonhosted.org/packages/a3/0f/36d89194b5a32c054ce93e586d4049b6c2c22887b0eb229c61c68afd3078/kiwisolver-1.4.9-pp311-pypy311_pp73-macosx_10_15_x86_64.whl", hash = "sha256:720e05574713db64c356e86732c0f3c5252818d05f9df320f0ad8380641acea5", size = 60104, upload-time = "2025-08-10T21:27:43.287Z" },
+ { url = "https://files.pythonhosted.org/packages/52/ba/4ed75f59e4658fd21fe7dde1fee0ac397c678ec3befba3fe6482d987af87/kiwisolver-1.4.9-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:17680d737d5335b552994a2008fab4c851bcd7de33094a82067ef3a576ff02fa", size = 58592, upload-time = "2025-08-10T21:27:44.314Z" },
+ { url = "https://files.pythonhosted.org/packages/33/01/a8ea7c5ea32a9b45ceeaee051a04c8ed4320f5add3c51bfa20879b765b70/kiwisolver-1.4.9-pp311-pypy311_pp73-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:85b5352f94e490c028926ea567fc569c52ec79ce131dadb968d3853e809518c2", size = 80281, upload-time = "2025-08-10T21:27:45.369Z" },
+ { url = "https://files.pythonhosted.org/packages/da/e3/dbd2ecdce306f1d07a1aaf324817ee993aab7aee9db47ceac757deabafbe/kiwisolver-1.4.9-pp311-pypy311_pp73-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:464415881e4801295659462c49461a24fb107c140de781d55518c4b80cb6790f", size = 78009, upload-time = "2025-08-10T21:27:46.376Z" },
+ { url = "https://files.pythonhosted.org/packages/da/e9/0d4add7873a73e462aeb45c036a2dead2562b825aa46ba326727b3f31016/kiwisolver-1.4.9-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:fb940820c63a9590d31d88b815e7a3aa5915cad3ce735ab45f0c730b39547de1", size = 73929, upload-time = "2025-08-10T21:27:48.236Z" },
+]
+
+[[package]]
+name = "langcodes"
+version = "3.5.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "language-data" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/3a/7a/5a97e327063409a5caa21541e6d08ae4a0f2da328447e9f2c7b39e179226/langcodes-3.5.0.tar.gz", hash = "sha256:1eef8168d07e51e131a2497ffecad4b663f6208e7c3ae3b8dc15c51734a6f801", size = 191030, upload-time = "2024-11-19T10:23:45.546Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/c3/6b/068c2ea7a712bf805c62445bd9e9c06d7340358ef2824150eceac027444b/langcodes-3.5.0-py3-none-any.whl", hash = "sha256:853c69d1a35e0e13da2f427bb68fb2fa4a8f4fb899e0c62ad8df8d073dcfed33", size = 182974, upload-time = "2024-11-19T10:23:42.824Z" },
+]
+
+[[package]]
+name = "language-data"
+version = "1.3.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "marisa-trie" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/dd/ce/3f144716a9f2cbf42aa86ebc8b085a184be25c80aa453eea17c294d239c1/language_data-1.3.0.tar.gz", hash = "sha256:7600ef8aa39555145d06c89f0c324bf7dab834ea0b0a439d8243762e3ebad7ec", size = 5129310, upload-time = "2024-11-19T10:21:37.912Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/5d/e9/5a5ffd9b286db82be70d677d0a91e4d58f7912bb8dd026ddeeb4abe70679/language_data-1.3.0-py3-none-any.whl", hash = "sha256:e2ee943551b5ae5f89cd0e801d1fc3835bb0ef5b7e9c3a4e8e17b2b214548fbf", size = 5385760, upload-time = "2024-11-19T10:21:36.005Z" },
+]
+
+[[package]]
+name = "locate"
+version = "1.1.1"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/26/b0/6b303d4a2a20046dc396de914a6c1840253ff874630f00864ffe623acb68/locate-1.1.1.tar.gz", hash = "sha256:432750f5b7e89f8c99942ca7d8722ccd1e7954b20e6a973027fccb6cc00af857", size = 7831, upload-time = "2022-12-15T07:01:30.602Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/ab/0d/9b6f11382b2f9080b5a366d20e90b4c08e547b6cd08c2a206729e6bad47a/locate-1.1.1-py3-none-any.whl", hash = "sha256:9e5e2f3516639240f4d975c08e95ae6a24ff4dd63d228f927541cdec30105755", size = 5364, upload-time = "2022-12-15T07:01:29.526Z" },
+]
+
+[[package]]
+name = "marisa-trie"
+version = "1.3.1"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/c5/e3/c9066e74076b90f9701ccd23d6a0b8c1d583feefdec576dc3e1bb093c50d/marisa_trie-1.3.1.tar.gz", hash = "sha256:97107fd12f30e4f8fea97790343a2d2d9a79d93697fe14e1b6f6363c984ff85b", size = 212454, upload-time = "2025-08-26T15:13:18.401Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/36/eb/c18113555950ea25c421a5e8f7f280a9d7e9198a072f89d33ae9a5725ead/marisa_trie-1.3.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:7e957aa4251a8e70b9fe02a16b2d190f18787902da563cb7ba865508b8e8fb04", size = 172432, upload-time = "2025-08-26T15:11:51.329Z" },
+ { url = "https://files.pythonhosted.org/packages/b5/98/6d3f507a7340697d25d53839e68b516d3d01a3714edf33d484896250189b/marisa_trie-1.3.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:e5888b269e790356ce4525f3e8df1fe866d1497b7d7fb7548cfec883cb985288", size = 156327, upload-time = "2025-08-26T15:11:52.646Z" },
+ { url = "https://files.pythonhosted.org/packages/be/39/78d6def87a6effec6480ef1474d4cc81ef9845c78281ac5a6c07a6440744/marisa_trie-1.3.1-cp310-cp310-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8f81344d212cb41992340b0b8a67e375f44da90590b884204fd3fa5e02107df2", size = 1219155, upload-time = "2025-08-26T15:11:53.915Z" },
+ { url = "https://files.pythonhosted.org/packages/a6/b4/3b60c26cb9a2c623f47eeed84cfa6ebd3f71c5bd95ef32ed526e4ac689dc/marisa_trie-1.3.1-cp310-cp310-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:3715d779561699471edde70975e07b1de7dddb2816735d40ed16be4b32054188", size = 1239413, upload-time = "2025-08-26T15:11:55.655Z" },
+ { url = "https://files.pythonhosted.org/packages/21/ef/9c7fca5bf133bdb144317843881c8b0c74d2acb7fa209f793c29422e7669/marisa_trie-1.3.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:47631614c5243ed7d15ae0af8245fcc0599f5b7921fae2a4ae992afb27c9afbb", size = 2161737, upload-time = "2025-08-26T15:11:56.832Z" },
+ { url = "https://files.pythonhosted.org/packages/1c/03/d5f630498bf4b8baf2d6484651255f601e9fdc6d42a83288e8b2420ebc9b/marisa_trie-1.3.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:ad82ab8a58562cf69e6b786debcc7638b28df12f9f1c7bcffb07efb5c1f09cbd", size = 2250038, upload-time = "2025-08-26T15:11:58.165Z" },
+ { url = "https://files.pythonhosted.org/packages/00/6b/c12f055dbb13d22b0f8e1f3da9cb734f581b516cc0e3c909e3f39368f676/marisa_trie-1.3.1-cp310-cp310-win32.whl", hash = "sha256:9f92d3577c72d5a97af5c8e3d98247b79c8ccfb64ebf611311dcf631b11e5604", size = 117232, upload-time = "2025-08-26T15:11:59.616Z" },
+ { url = "https://files.pythonhosted.org/packages/f2/fd/988a19587c7bb8f03fb80e17335f75ca2d5538df4909727012b4bdff8f99/marisa_trie-1.3.1-cp310-cp310-win_amd64.whl", hash = "sha256:a5a0a58ffe2a7eb3f870214c6df8f9a43ce768bd8fed883e6ba8c77645666b63", size = 143231, upload-time = "2025-08-26T15:12:00.52Z" },
+ { url = "https://files.pythonhosted.org/packages/a7/bf/2f1fe6c9fcd2b509c6dfaaf26e35128947d6d3718d0b39510903c55b7bed/marisa_trie-1.3.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:5ef045f694ef66079b4e00c4c9063a00183d6af7d1ff643de6ea5c3b0d9af01b", size = 174027, upload-time = "2025-08-26T15:12:01.434Z" },
+ { url = "https://files.pythonhosted.org/packages/a9/5a/de7936d58ed0de847180cee2b95143d420223c5ade0c093d55113f628237/marisa_trie-1.3.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:cbd28f95d5f30d9a7af6130869568e75bfd7ef2e0adfb1480f1f44480f5d3603", size = 158478, upload-time = "2025-08-26T15:12:02.429Z" },
+ { url = "https://files.pythonhosted.org/packages/48/cc/80611aadefcd0bcf8cd1795cb4643bb27213319a221ba04fe071da0b75cd/marisa_trie-1.3.1-cp311-cp311-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b173ec46d521308f7c97d96d6e05cf2088e0548f82544ec9a8656af65593304d", size = 1257535, upload-time = "2025-08-26T15:12:04.271Z" },
+ { url = "https://files.pythonhosted.org/packages/36/89/c4eeefb956318047036e6bdc572b6112b2059d595e85961267a90aa40458/marisa_trie-1.3.1-cp311-cp311-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:954fef9185f8a79441b4e433695116636bf66402945cfee404f8983bafa59788", size = 1275566, upload-time = "2025-08-26T15:12:05.874Z" },
+ { url = "https://files.pythonhosted.org/packages/c4/63/d775a2fdfc4b555120381cd2aa6dff1845576bc14fb13796ae1b1e8dbaf7/marisa_trie-1.3.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:ca644534f15f85bba14c412afc17de07531e79a766ce85b8dbf3f8b6e7758f20", size = 2199831, upload-time = "2025-08-26T15:12:07.175Z" },
+ { url = "https://files.pythonhosted.org/packages/50/aa/e5053927dc3cac77acc9b27f6f87e75c880f5d3d5eac9111fe13b1d8bf6f/marisa_trie-1.3.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:3834304fdeaa1c9b73596ad5a6c01a44fc19c13c115194704b85f7fbdf0a7b8e", size = 2283830, upload-time = "2025-08-26T15:12:08.319Z" },
+ { url = "https://files.pythonhosted.org/packages/71/3e/e314906d0de5b1a44780a23c79bb62a9aafd876e2a4e80fb34f58c721da4/marisa_trie-1.3.1-cp311-cp311-win32.whl", hash = "sha256:70b4c96f9119cfeb4dc6a0cf4afc9f92f0b002cde225bcd910915d976c78e66a", size = 117335, upload-time = "2025-08-26T15:12:09.776Z" },
+ { url = "https://files.pythonhosted.org/packages/b0/2b/85623566621135de3d57497811f94679b4fb2a8f16148ef67133c2abab7a/marisa_trie-1.3.1-cp311-cp311-win_amd64.whl", hash = "sha256:986eaf35a7f63c878280609ecd37edf8a074f7601c199acfec81d03f1ee9a39a", size = 143985, upload-time = "2025-08-26T15:12:10.988Z" },
+ { url = "https://files.pythonhosted.org/packages/3f/40/ee7ea61b88d62d2189b5c4a27bc0fc8d9c32f8b8dc6daf1c93a7b7ad34ac/marisa_trie-1.3.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:5b7c1e7fa6c3b855e8cfbabf38454d7decbaba1c567d0cd58880d033c6b363bd", size = 173454, upload-time = "2025-08-26T15:12:12.13Z" },
+ { url = "https://files.pythonhosted.org/packages/9c/fc/58635811586898041004b2197a085253706ede211324a53ec01612a50e20/marisa_trie-1.3.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:c12b44c190deb0d67655021da1f2d0a7d61a257bf844101cf982e68ed344f28d", size = 155305, upload-time = "2025-08-26T15:12:13.374Z" },
+ { url = "https://files.pythonhosted.org/packages/fe/98/88ca0c98d37034a3237acaf461d210cbcfeb6687929e5ba0e354971fa3ed/marisa_trie-1.3.1-cp312-cp312-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:9688c7b45f744366a4ef661e399f24636ebe440d315ab35d768676c59c613186", size = 1244834, upload-time = "2025-08-26T15:12:14.795Z" },
+ { url = "https://files.pythonhosted.org/packages/f3/5f/93b3e3607ccd693a768eafee60829cd14ea1810b75aa48e8b20e27b332c4/marisa_trie-1.3.1-cp312-cp312-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:99a00cab4cf9643a87977c87a5c8961aa44fff8d5dd46e00250135f686e7dedf", size = 1265148, upload-time = "2025-08-26T15:12:16.229Z" },
+ { url = "https://files.pythonhosted.org/packages/db/6e/051d7d25c7fb2b3df605c8bd782513ebbb33fddf3bae6cf46cf268cca89f/marisa_trie-1.3.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:83efc045fc58ca04c91a96c9b894d8a19ac6553677a76f96df01ff9f0405f53d", size = 2172726, upload-time = "2025-08-26T15:12:18.467Z" },
+ { url = "https://files.pythonhosted.org/packages/58/da/244d9d4e414ce6c73124cba4cc293dd140bf3b04ca18dec64c2775cca951/marisa_trie-1.3.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:0b9816ab993001a7854b02a7daec228892f35bd5ab0ac493bacbd1b80baec9f1", size = 2256104, upload-time = "2025-08-26T15:12:20.168Z" },
+ { url = "https://files.pythonhosted.org/packages/c4/f1/1a36ecd7da6668685a7753522af89a19928ffc80f1cc1dbc301af216f011/marisa_trie-1.3.1-cp312-cp312-win32.whl", hash = "sha256:c785fd6dae9daa6825734b7b494cdac972f958be1f9cb3fb1f32be8598d2b936", size = 115624, upload-time = "2025-08-26T15:12:21.233Z" },
+ { url = "https://files.pythonhosted.org/packages/35/b2/aabd1c9f1c102aa31d66633ed5328c447be166e0a703f9723e682478fd83/marisa_trie-1.3.1-cp312-cp312-win_amd64.whl", hash = "sha256:9868b7a8e0f648d09ffe25ac29511e6e208cc5fb0d156c295385f9d5dc2a138e", size = 138562, upload-time = "2025-08-26T15:12:22.632Z" },
+ { url = "https://files.pythonhosted.org/packages/46/a2/8331b995c1b3eee83aa745f4a6502d737ec523d5955a48f167d4177db105/marisa_trie-1.3.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:9de573d933db4753a50af891bcb3ffbfe14e200406214c223aa5dfe2163f316d", size = 172272, upload-time = "2025-08-26T15:12:24.016Z" },
+ { url = "https://files.pythonhosted.org/packages/97/b8/7b9681b5c0ea1bb950f907a4e3919eb7f7b7b3febafaae346f3b3f199f6f/marisa_trie-1.3.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f4bae4f920f2a1082eaf766c1883df7da84abdf333bafa15b8717c10416a615e", size = 154671, upload-time = "2025-08-26T15:12:25.013Z" },
+ { url = "https://files.pythonhosted.org/packages/ca/16/929c1f83fdcff13f8d08500f434aaa18c21c8168d16cf81585d69085e980/marisa_trie-1.3.1-cp313-cp313-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:bf9f2b97fcfd5e2dbb0090d0664023872dcde990df0b545eca8d0ce95795a409", size = 1238754, upload-time = "2025-08-26T15:12:26.217Z" },
+ { url = "https://files.pythonhosted.org/packages/0f/0a/b0e04d3ef91a87d4c7ea0b66c004fdfc6e65c9ed83edaebecfb482dfe0ed/marisa_trie-1.3.1-cp313-cp313-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ecdb19d33b26738a32602ef432b06cc6deeca4b498ce67ba8e5e39c8a7c19745", size = 1262653, upload-time = "2025-08-26T15:12:27.422Z" },
+ { url = "https://files.pythonhosted.org/packages/de/1f/0ecf610ddc9a209ee63116baabb47584d5b8ecd01610091a593d9429537e/marisa_trie-1.3.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:a7416f1a084eb889c5792c57317875aeaa86abfe0bdc6f167712cebcec1d36ee", size = 2172399, upload-time = "2025-08-26T15:12:28.926Z" },
+ { url = "https://files.pythonhosted.org/packages/ac/74/6b47deff3b3920449c135b9187c80f0d656adcdc5d41463745a61b012ea1/marisa_trie-1.3.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:ee428575377e29c636f2b4b3b0488875dcea310c6c5b3412ec4ef997f7bb37cc", size = 2255138, upload-time = "2025-08-26T15:12:30.271Z" },
+ { url = "https://files.pythonhosted.org/packages/bd/fa/3dbcbe93dfaa626a5b3e741e7bcf3d7389aa5777175213bd8d9a9d3c992d/marisa_trie-1.3.1-cp313-cp313-win32.whl", hash = "sha256:d0f87bdf660f01e88ab3a507955697b2e3284065afa0b94fc9e77d6ad153ed5e", size = 115391, upload-time = "2025-08-26T15:12:31.465Z" },
+ { url = "https://files.pythonhosted.org/packages/3b/ce/ddfab303646b21aef07ff9dbc83fba92e5d493f49d3bc03d899ffd45c86f/marisa_trie-1.3.1-cp313-cp313-win_amd64.whl", hash = "sha256:a83f5f7ae3494e0cc25211296252b1b86901c788ed82c83adda19d0c98f828d6", size = 139130, upload-time = "2025-08-26T15:12:32.4Z" },
+ { url = "https://files.pythonhosted.org/packages/5a/1e/734b618048ad05c50cb1673ce2c6e836dc38ddeeeb011ed1804af07327a4/marisa_trie-1.3.1-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:a850b151bd1e3a5d9afef113adc22727d696603659d575d7e84f994bd8d04bf1", size = 175131, upload-time = "2025-08-26T15:12:33.728Z" },
+ { url = "https://files.pythonhosted.org/packages/d3/78/c7051147cc918cb8ff4a2920e11a9b17d9dcb4d8fc122122694b486e2bfe/marisa_trie-1.3.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:9dc61fb8f8993589544f6df268229c6cf0a56ad4ed3e8585a9cd23c5ad79527b", size = 163094, upload-time = "2025-08-26T15:12:35.312Z" },
+ { url = "https://files.pythonhosted.org/packages/ee/b8/3b904178d7878319aacaabae5131c1f281519aaac0f8c68c8ed312912ccf/marisa_trie-1.3.1-cp313-cp313t-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d4bd41a6e73c0d0adafe4de449b6d35530a4ce6a836a6ee839baf117785ecfd7", size = 1279812, upload-time = "2025-08-26T15:12:36.831Z" },
+ { url = "https://files.pythonhosted.org/packages/fb/bf/e77a1284247b980560b4104bbdd5d06ed2c2ae3d56ab954f97293b6dbbcd/marisa_trie-1.3.1-cp313-cp313t-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8c8b2386d2d22c57880ed20a913ceca86363765623175671137484a7d223f07a", size = 1285690, upload-time = "2025-08-26T15:12:38.754Z" },
+ { url = "https://files.pythonhosted.org/packages/48/82/f6f10db5ec72de2642499f3a6e4e8607bbd2cfb28269ea08d0d8ddac3313/marisa_trie-1.3.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:9c56001badaf1779afae5c24b7ab85938644ab8ef3c5fd438ab5d49621b84482", size = 2197943, upload-time = "2025-08-26T15:12:40.584Z" },
+ { url = "https://files.pythonhosted.org/packages/2a/d0/74b6c3011b1ebf4a8131430156b14c3af694082cf34c392fff766096fd4b/marisa_trie-1.3.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:83a3748088d117a9b15d8981c947df9e4f56eb2e4b5456ae34fe1f83666c9185", size = 2280132, upload-time = "2025-08-26T15:12:42.059Z" },
+ { url = "https://files.pythonhosted.org/packages/28/b2/b8b0cb738fa3ab07309ed92025c6e1b278f84c7255e976921a52b30d8d1b/marisa_trie-1.3.1-cp313-cp313t-win32.whl", hash = "sha256:137010598d8cebc53dbfb7caf59bde96c33a6af555e3e1bdbf30269b6a157e1e", size = 126446, upload-time = "2025-08-26T15:12:43.339Z" },
+ { url = "https://files.pythonhosted.org/packages/b6/c6/2381648d0c946556ef51c673397cea40712d945444ceed0a0a0b51a174d2/marisa_trie-1.3.1-cp313-cp313t-win_amd64.whl", hash = "sha256:ec633e108f277f2b7f4671d933a909f39bba549910bf103e2940b87a14da2783", size = 153885, upload-time = "2025-08-26T15:12:44.309Z" },
+ { url = "https://files.pythonhosted.org/packages/40/8a/590f25a281e08879791aabec7b8584c7934ff3d5f9d52859197d587246ec/marisa_trie-1.3.1-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:389721481c14a92fa042e4b91ae065bff13e2bc567c85a10aa9d9de80aaa8622", size = 172803, upload-time = "2025-08-26T15:12:45.342Z" },
+ { url = "https://files.pythonhosted.org/packages/20/7f/fd19a4aa57ad169d08e518a6ee2438e7e77bfba7786c59f65891db69d202/marisa_trie-1.3.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:0e6f3b45def6ff23e254eeaa9079267004f0069d0a34eba30a620780caa4f2cb", size = 155506, upload-time = "2025-08-26T15:12:46.701Z" },
+ { url = "https://files.pythonhosted.org/packages/e3/05/857832b8fe6b2ec441de1154eadc66dee067ce5fb6673c3ee0b8616108ee/marisa_trie-1.3.1-cp314-cp314-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3a96ef3e461ecc85ec7d2233ddc449ff5a3fbdc520caea752bc5bc8faa975231", size = 1239979, upload-time = "2025-08-26T15:12:47.943Z" },
+ { url = "https://files.pythonhosted.org/packages/4c/08/f9ea8b720a627d54e8e19f19a0ec1cc2011e01aa2b4f40d078e7f5e9e21f/marisa_trie-1.3.1-cp314-cp314-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:5370f9ef6c008e502537cc1ff518c80ddf749367ce90179efa0e7f6275903a76", size = 1255705, upload-time = "2025-08-26T15:12:49.24Z" },
+ { url = "https://files.pythonhosted.org/packages/e9/c3/42360fb38cdfde5db1783e2d7cfeb8b91eea837f89ef678f308ee026d794/marisa_trie-1.3.1-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:0dcd42774e367ceb423c211a4fc8e7ce586acfaf0929c9c06d98002112075239", size = 2175092, upload-time = "2025-08-26T15:12:50.602Z" },
+ { url = "https://files.pythonhosted.org/packages/09/ba/215b0d821fd37cdc600e834a75708aa2e117124dcf495c9a6c6dc7fdcb6b/marisa_trie-1.3.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:3e2a0e1be95237981bd375a388f44b33d69ea5669a2f79fea038e45fff326595", size = 2250454, upload-time = "2025-08-26T15:12:52.435Z" },
+ { url = "https://files.pythonhosted.org/packages/f5/a3/292ab31a12ec1cb356e6bc8b9cc8aaec920aa892a805757c011d77e8cd93/marisa_trie-1.3.1-cp314-cp314-win32.whl", hash = "sha256:c7a33506d0451112911c69f38d55da3e0e050f2be0ea4e5176865cf03baf26a9", size = 119101, upload-time = "2025-08-26T15:12:53.615Z" },
+ { url = "https://files.pythonhosted.org/packages/95/83/0ea5de53209993cf301dd9d18d4cb22c20c84c753b4357b66660a8b9eb48/marisa_trie-1.3.1-cp314-cp314-win_amd64.whl", hash = "sha256:68678816818efcd4a1787b557af81f215b989ec88680a86c85c34c914d413690", size = 142886, upload-time = "2025-08-26T15:12:54.835Z" },
+ { url = "https://files.pythonhosted.org/packages/37/00/c7e063867988067992a9d9d2aceaede0be7787ca6d77ef34f2eca9d2708e/marisa_trie-1.3.1-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:9e467e13971c64db6aed8afe4c2a131c3f73f048bec3f788a6141216acda598d", size = 175163, upload-time = "2025-08-26T15:12:55.908Z" },
+ { url = "https://files.pythonhosted.org/packages/5f/64/eaf49d10c8506ecd717bbbeda907e474842c298354a444b875741ef4a0d9/marisa_trie-1.3.1-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:076731f79f8603cb3216cb6e5bbbc56536c89f63f175ad47014219ecb01e5996", size = 163119, upload-time = "2025-08-26T15:12:58.054Z" },
+ { url = "https://files.pythonhosted.org/packages/b4/26/f24dd9c98ce6fc8c8d554b556e1c43f326c5df414b79aba33bd7d2d2fbfd/marisa_trie-1.3.1-cp314-cp314t-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:82de2de90488d0fbbf74cf9f20e1afd62e320693b88f5e9565fc80b28f5bbad3", size = 1277783, upload-time = "2025-08-26T15:12:59.225Z" },
+ { url = "https://files.pythonhosted.org/packages/b2/1a/efd63e75d1374e08f8ebe2e15ff1b1ed5f6d5cf57614a5b0884bd9c882ee/marisa_trie-1.3.1-cp314-cp314t-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0c2bc6bee737f4d47fce48c5b03a7bd3214ef2d83eb5c9f84210091370a5f195", size = 1282309, upload-time = "2025-08-26T15:13:00.797Z" },
+ { url = "https://files.pythonhosted.org/packages/33/4c/0cefa1eceec7858766af5939979857ac079c6c5251e00c6991c1a26bb1b7/marisa_trie-1.3.1-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:56043cf908ddf3d7364498085dbc2855d4ea8969aff3bf2439a79482a79e68e2", size = 2196594, upload-time = "2025-08-26T15:13:02.158Z" },
+ { url = "https://files.pythonhosted.org/packages/bb/64/900f4132fc345be4b40073e66284707afa4cc203d8d0f1fe78c6b111cd47/marisa_trie-1.3.1-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:9651daa1fdc471df5a5fa6a4833d3b01e76ac512eea141a5995681aebac5555f", size = 2277730, upload-time = "2025-08-26T15:13:03.528Z" },
+ { url = "https://files.pythonhosted.org/packages/62/ab/6d6cf25a5c8835589a601a9a916ec5cdee740e277fed8ee620df546834bb/marisa_trie-1.3.1-cp314-cp314t-win32.whl", hash = "sha256:c6571462417cda2239b1ade86ceaf3852da9b52c6286046e87d404afc6da20a7", size = 131409, upload-time = "2025-08-26T15:13:05.106Z" },
+ { url = "https://files.pythonhosted.org/packages/9a/61/c4efc044141429e67e8fd5536be86d76303f250179c7f92b2cc0c72e8d0b/marisa_trie-1.3.1-cp314-cp314t-win_amd64.whl", hash = "sha256:9e6496bbad3068e3bbbb934b1e1307bf1a9cb4609f9ec47b57e8ea37f1b5ee40", size = 162564, upload-time = "2025-08-26T15:13:06.112Z" },
+]
+
+[[package]]
+name = "markdown-it-py"
+version = "4.0.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "mdurl" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/5b/f5/4ec618ed16cc4f8fb3b701563655a69816155e79e24a17b651541804721d/markdown_it_py-4.0.0.tar.gz", hash = "sha256:cb0a2b4aa34f932c007117b194e945bd74e0ec24133ceb5bac59009cda1cb9f3", size = 73070, upload-time = "2025-08-11T12:57:52.854Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/94/54/e7d793b573f298e1c9013b8c4dade17d481164aa517d1d7148619c2cedbf/markdown_it_py-4.0.0-py3-none-any.whl", hash = "sha256:87327c59b172c5011896038353a81343b6754500a08cd7a4973bb48c6d578147", size = 87321, upload-time = "2025-08-11T12:57:51.923Z" },
+]
+
+[[package]]
+name = "markupsafe"
+version = "3.0.3"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/7e/99/7690b6d4034fffd95959cbe0c02de8deb3098cc577c67bb6a24fe5d7caa7/markupsafe-3.0.3.tar.gz", hash = "sha256:722695808f4b6457b320fdc131280796bdceb04ab50fe1795cd540799ebe1698", size = 80313, upload-time = "2025-09-27T18:37:40.426Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/e8/4b/3541d44f3937ba468b75da9eebcae497dcf67adb65caa16760b0a6807ebb/markupsafe-3.0.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:2f981d352f04553a7171b8e44369f2af4055f888dfb147d55e42d29e29e74559", size = 11631, upload-time = "2025-09-27T18:36:05.558Z" },
+ { url = "https://files.pythonhosted.org/packages/98/1b/fbd8eed11021cabd9226c37342fa6ca4e8a98d8188a8d9b66740494960e4/markupsafe-3.0.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:e1c1493fb6e50ab01d20a22826e57520f1284df32f2d8601fdd90b6304601419", size = 12057, upload-time = "2025-09-27T18:36:07.165Z" },
+ { url = "https://files.pythonhosted.org/packages/40/01/e560d658dc0bb8ab762670ece35281dec7b6c1b33f5fbc09ebb57a185519/markupsafe-3.0.3-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1ba88449deb3de88bd40044603fafffb7bc2b055d626a330323a9ed736661695", size = 22050, upload-time = "2025-09-27T18:36:08.005Z" },
+ { url = "https://files.pythonhosted.org/packages/af/cd/ce6e848bbf2c32314c9b237839119c5a564a59725b53157c856e90937b7a/markupsafe-3.0.3-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f42d0984e947b8adf7dd6dde396e720934d12c506ce84eea8476409563607591", size = 20681, upload-time = "2025-09-27T18:36:08.881Z" },
+ { url = "https://files.pythonhosted.org/packages/c9/2a/b5c12c809f1c3045c4d580b035a743d12fcde53cf685dbc44660826308da/markupsafe-3.0.3-cp310-cp310-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:c0c0b3ade1c0b13b936d7970b1d37a57acde9199dc2aecc4c336773e1d86049c", size = 20705, upload-time = "2025-09-27T18:36:10.131Z" },
+ { url = "https://files.pythonhosted.org/packages/cf/e3/9427a68c82728d0a88c50f890d0fc072a1484de2f3ac1ad0bfc1a7214fd5/markupsafe-3.0.3-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:0303439a41979d9e74d18ff5e2dd8c43ed6c6001fd40e5bf2e43f7bd9bbc523f", size = 21524, upload-time = "2025-09-27T18:36:11.324Z" },
+ { url = "https://files.pythonhosted.org/packages/bc/36/23578f29e9e582a4d0278e009b38081dbe363c5e7165113fad546918a232/markupsafe-3.0.3-cp310-cp310-musllinux_1_2_riscv64.whl", hash = "sha256:d2ee202e79d8ed691ceebae8e0486bd9a2cd4794cec4824e1c99b6f5009502f6", size = 20282, upload-time = "2025-09-27T18:36:12.573Z" },
+ { url = "https://files.pythonhosted.org/packages/56/21/dca11354e756ebd03e036bd8ad58d6d7168c80ce1fe5e75218e4945cbab7/markupsafe-3.0.3-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:177b5253b2834fe3678cb4a5f0059808258584c559193998be2601324fdeafb1", size = 20745, upload-time = "2025-09-27T18:36:13.504Z" },
+ { url = "https://files.pythonhosted.org/packages/87/99/faba9369a7ad6e4d10b6a5fbf71fa2a188fe4a593b15f0963b73859a1bbd/markupsafe-3.0.3-cp310-cp310-win32.whl", hash = "sha256:2a15a08b17dd94c53a1da0438822d70ebcd13f8c3a95abe3a9ef9f11a94830aa", size = 14571, upload-time = "2025-09-27T18:36:14.779Z" },
+ { url = "https://files.pythonhosted.org/packages/d6/25/55dc3ab959917602c96985cb1253efaa4ff42f71194bddeb61eb7278b8be/markupsafe-3.0.3-cp310-cp310-win_amd64.whl", hash = "sha256:c4ffb7ebf07cfe8931028e3e4c85f0357459a3f9f9490886198848f4fa002ec8", size = 15056, upload-time = "2025-09-27T18:36:16.125Z" },
+ { url = "https://files.pythonhosted.org/packages/d0/9e/0a02226640c255d1da0b8d12e24ac2aa6734da68bff14c05dd53b94a0fc3/markupsafe-3.0.3-cp310-cp310-win_arm64.whl", hash = "sha256:e2103a929dfa2fcaf9bb4e7c091983a49c9ac3b19c9061b6d5427dd7d14d81a1", size = 13932, upload-time = "2025-09-27T18:36:17.311Z" },
+ { url = "https://files.pythonhosted.org/packages/08/db/fefacb2136439fc8dd20e797950e749aa1f4997ed584c62cfb8ef7c2be0e/markupsafe-3.0.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:1cc7ea17a6824959616c525620e387f6dd30fec8cb44f649e31712db02123dad", size = 11631, upload-time = "2025-09-27T18:36:18.185Z" },
+ { url = "https://files.pythonhosted.org/packages/e1/2e/5898933336b61975ce9dc04decbc0a7f2fee78c30353c5efba7f2d6ff27a/markupsafe-3.0.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:4bd4cd07944443f5a265608cc6aab442e4f74dff8088b0dfc8238647b8f6ae9a", size = 12058, upload-time = "2025-09-27T18:36:19.444Z" },
+ { url = "https://files.pythonhosted.org/packages/1d/09/adf2df3699d87d1d8184038df46a9c80d78c0148492323f4693df54e17bb/markupsafe-3.0.3-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6b5420a1d9450023228968e7e6a9ce57f65d148ab56d2313fcd589eee96a7a50", size = 24287, upload-time = "2025-09-27T18:36:20.768Z" },
+ { url = "https://files.pythonhosted.org/packages/30/ac/0273f6fcb5f42e314c6d8cd99effae6a5354604d461b8d392b5ec9530a54/markupsafe-3.0.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0bf2a864d67e76e5c9a34dc26ec616a66b9888e25e7b9460e1c76d3293bd9dbf", size = 22940, upload-time = "2025-09-27T18:36:22.249Z" },
+ { url = "https://files.pythonhosted.org/packages/19/ae/31c1be199ef767124c042c6c3e904da327a2f7f0cd63a0337e1eca2967a8/markupsafe-3.0.3-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:bc51efed119bc9cfdf792cdeaa4d67e8f6fcccab66ed4bfdd6bde3e59bfcbb2f", size = 21887, upload-time = "2025-09-27T18:36:23.535Z" },
+ { url = "https://files.pythonhosted.org/packages/b2/76/7edcab99d5349a4532a459e1fe64f0b0467a3365056ae550d3bcf3f79e1e/markupsafe-3.0.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:068f375c472b3e7acbe2d5318dea141359e6900156b5b2ba06a30b169086b91a", size = 23692, upload-time = "2025-09-27T18:36:24.823Z" },
+ { url = "https://files.pythonhosted.org/packages/a4/28/6e74cdd26d7514849143d69f0bf2399f929c37dc2b31e6829fd2045b2765/markupsafe-3.0.3-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:7be7b61bb172e1ed687f1754f8e7484f1c8019780f6f6b0786e76bb01c2ae115", size = 21471, upload-time = "2025-09-27T18:36:25.95Z" },
+ { url = "https://files.pythonhosted.org/packages/62/7e/a145f36a5c2945673e590850a6f8014318d5577ed7e5920a4b3448e0865d/markupsafe-3.0.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:f9e130248f4462aaa8e2552d547f36ddadbeaa573879158d721bbd33dfe4743a", size = 22923, upload-time = "2025-09-27T18:36:27.109Z" },
+ { url = "https://files.pythonhosted.org/packages/0f/62/d9c46a7f5c9adbeeeda52f5b8d802e1094e9717705a645efc71b0913a0a8/markupsafe-3.0.3-cp311-cp311-win32.whl", hash = "sha256:0db14f5dafddbb6d9208827849fad01f1a2609380add406671a26386cdf15a19", size = 14572, upload-time = "2025-09-27T18:36:28.045Z" },
+ { url = "https://files.pythonhosted.org/packages/83/8a/4414c03d3f891739326e1783338e48fb49781cc915b2e0ee052aa490d586/markupsafe-3.0.3-cp311-cp311-win_amd64.whl", hash = "sha256:de8a88e63464af587c950061a5e6a67d3632e36df62b986892331d4620a35c01", size = 15077, upload-time = "2025-09-27T18:36:29.025Z" },
+ { url = "https://files.pythonhosted.org/packages/35/73/893072b42e6862f319b5207adc9ae06070f095b358655f077f69a35601f0/markupsafe-3.0.3-cp311-cp311-win_arm64.whl", hash = "sha256:3b562dd9e9ea93f13d53989d23a7e775fdfd1066c33494ff43f5418bc8c58a5c", size = 13876, upload-time = "2025-09-27T18:36:29.954Z" },
+ { url = "https://files.pythonhosted.org/packages/5a/72/147da192e38635ada20e0a2e1a51cf8823d2119ce8883f7053879c2199b5/markupsafe-3.0.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:d53197da72cc091b024dd97249dfc7794d6a56530370992a5e1a08983ad9230e", size = 11615, upload-time = "2025-09-27T18:36:30.854Z" },
+ { url = "https://files.pythonhosted.org/packages/9a/81/7e4e08678a1f98521201c3079f77db69fb552acd56067661f8c2f534a718/markupsafe-3.0.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:1872df69a4de6aead3491198eaf13810b565bdbeec3ae2dc8780f14458ec73ce", size = 12020, upload-time = "2025-09-27T18:36:31.971Z" },
+ { url = "https://files.pythonhosted.org/packages/1e/2c/799f4742efc39633a1b54a92eec4082e4f815314869865d876824c257c1e/markupsafe-3.0.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3a7e8ae81ae39e62a41ec302f972ba6ae23a5c5396c8e60113e9066ef893da0d", size = 24332, upload-time = "2025-09-27T18:36:32.813Z" },
+ { url = "https://files.pythonhosted.org/packages/3c/2e/8d0c2ab90a8c1d9a24f0399058ab8519a3279d1bd4289511d74e909f060e/markupsafe-3.0.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d6dd0be5b5b189d31db7cda48b91d7e0a9795f31430b7f271219ab30f1d3ac9d", size = 22947, upload-time = "2025-09-27T18:36:33.86Z" },
+ { url = "https://files.pythonhosted.org/packages/2c/54/887f3092a85238093a0b2154bd629c89444f395618842e8b0c41783898ea/markupsafe-3.0.3-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:94c6f0bb423f739146aec64595853541634bde58b2135f27f61c1ffd1cd4d16a", size = 21962, upload-time = "2025-09-27T18:36:35.099Z" },
+ { url = "https://files.pythonhosted.org/packages/c9/2f/336b8c7b6f4a4d95e91119dc8521402461b74a485558d8f238a68312f11c/markupsafe-3.0.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:be8813b57049a7dc738189df53d69395eba14fb99345e0a5994914a3864c8a4b", size = 23760, upload-time = "2025-09-27T18:36:36.001Z" },
+ { url = "https://files.pythonhosted.org/packages/32/43/67935f2b7e4982ffb50a4d169b724d74b62a3964bc1a9a527f5ac4f1ee2b/markupsafe-3.0.3-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:83891d0e9fb81a825d9a6d61e3f07550ca70a076484292a70fde82c4b807286f", size = 21529, upload-time = "2025-09-27T18:36:36.906Z" },
+ { url = "https://files.pythonhosted.org/packages/89/e0/4486f11e51bbba8b0c041098859e869e304d1c261e59244baa3d295d47b7/markupsafe-3.0.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:77f0643abe7495da77fb436f50f8dab76dbc6e5fd25d39589a0f1fe6548bfa2b", size = 23015, upload-time = "2025-09-27T18:36:37.868Z" },
+ { url = "https://files.pythonhosted.org/packages/2f/e1/78ee7a023dac597a5825441ebd17170785a9dab23de95d2c7508ade94e0e/markupsafe-3.0.3-cp312-cp312-win32.whl", hash = "sha256:d88b440e37a16e651bda4c7c2b930eb586fd15ca7406cb39e211fcff3bf3017d", size = 14540, upload-time = "2025-09-27T18:36:38.761Z" },
+ { url = "https://files.pythonhosted.org/packages/aa/5b/bec5aa9bbbb2c946ca2733ef9c4ca91c91b6a24580193e891b5f7dbe8e1e/markupsafe-3.0.3-cp312-cp312-win_amd64.whl", hash = "sha256:26a5784ded40c9e318cfc2bdb30fe164bdb8665ded9cd64d500a34fb42067b1c", size = 15105, upload-time = "2025-09-27T18:36:39.701Z" },
+ { url = "https://files.pythonhosted.org/packages/e5/f1/216fc1bbfd74011693a4fd837e7026152e89c4bcf3e77b6692fba9923123/markupsafe-3.0.3-cp312-cp312-win_arm64.whl", hash = "sha256:35add3b638a5d900e807944a078b51922212fb3dedb01633a8defc4b01a3c85f", size = 13906, upload-time = "2025-09-27T18:36:40.689Z" },
+ { url = "https://files.pythonhosted.org/packages/38/2f/907b9c7bbba283e68f20259574b13d005c121a0fa4c175f9bed27c4597ff/markupsafe-3.0.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:e1cf1972137e83c5d4c136c43ced9ac51d0e124706ee1c8aa8532c1287fa8795", size = 11622, upload-time = "2025-09-27T18:36:41.777Z" },
+ { url = "https://files.pythonhosted.org/packages/9c/d9/5f7756922cdd676869eca1c4e3c0cd0df60ed30199ffd775e319089cb3ed/markupsafe-3.0.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:116bb52f642a37c115f517494ea5feb03889e04df47eeff5b130b1808ce7c219", size = 12029, upload-time = "2025-09-27T18:36:43.257Z" },
+ { url = "https://files.pythonhosted.org/packages/00/07/575a68c754943058c78f30db02ee03a64b3c638586fba6a6dd56830b30a3/markupsafe-3.0.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:133a43e73a802c5562be9bbcd03d090aa5a1fe899db609c29e8c8d815c5f6de6", size = 24374, upload-time = "2025-09-27T18:36:44.508Z" },
+ { url = "https://files.pythonhosted.org/packages/a9/21/9b05698b46f218fc0e118e1f8168395c65c8a2c750ae2bab54fc4bd4e0e8/markupsafe-3.0.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ccfcd093f13f0f0b7fdd0f198b90053bf7b2f02a3927a30e63f3ccc9df56b676", size = 22980, upload-time = "2025-09-27T18:36:45.385Z" },
+ { url = "https://files.pythonhosted.org/packages/7f/71/544260864f893f18b6827315b988c146b559391e6e7e8f7252839b1b846a/markupsafe-3.0.3-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:509fa21c6deb7a7a273d629cf5ec029bc209d1a51178615ddf718f5918992ab9", size = 21990, upload-time = "2025-09-27T18:36:46.916Z" },
+ { url = "https://files.pythonhosted.org/packages/c2/28/b50fc2f74d1ad761af2f5dcce7492648b983d00a65b8c0e0cb457c82ebbe/markupsafe-3.0.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:a4afe79fb3de0b7097d81da19090f4df4f8d3a2b3adaa8764138aac2e44f3af1", size = 23784, upload-time = "2025-09-27T18:36:47.884Z" },
+ { url = "https://files.pythonhosted.org/packages/ed/76/104b2aa106a208da8b17a2fb72e033a5a9d7073c68f7e508b94916ed47a9/markupsafe-3.0.3-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:795e7751525cae078558e679d646ae45574b47ed6e7771863fcc079a6171a0fc", size = 21588, upload-time = "2025-09-27T18:36:48.82Z" },
+ { url = "https://files.pythonhosted.org/packages/b5/99/16a5eb2d140087ebd97180d95249b00a03aa87e29cc224056274f2e45fd6/markupsafe-3.0.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:8485f406a96febb5140bfeca44a73e3ce5116b2501ac54fe953e488fb1d03b12", size = 23041, upload-time = "2025-09-27T18:36:49.797Z" },
+ { url = "https://files.pythonhosted.org/packages/19/bc/e7140ed90c5d61d77cea142eed9f9c303f4c4806f60a1044c13e3f1471d0/markupsafe-3.0.3-cp313-cp313-win32.whl", hash = "sha256:bdd37121970bfd8be76c5fb069c7751683bdf373db1ed6c010162b2a130248ed", size = 14543, upload-time = "2025-09-27T18:36:51.584Z" },
+ { url = "https://files.pythonhosted.org/packages/05/73/c4abe620b841b6b791f2edc248f556900667a5a1cf023a6646967ae98335/markupsafe-3.0.3-cp313-cp313-win_amd64.whl", hash = "sha256:9a1abfdc021a164803f4d485104931fb8f8c1efd55bc6b748d2f5774e78b62c5", size = 15113, upload-time = "2025-09-27T18:36:52.537Z" },
+ { url = "https://files.pythonhosted.org/packages/f0/3a/fa34a0f7cfef23cf9500d68cb7c32dd64ffd58a12b09225fb03dd37d5b80/markupsafe-3.0.3-cp313-cp313-win_arm64.whl", hash = "sha256:7e68f88e5b8799aa49c85cd116c932a1ac15caaa3f5db09087854d218359e485", size = 13911, upload-time = "2025-09-27T18:36:53.513Z" },
+ { url = "https://files.pythonhosted.org/packages/e4/d7/e05cd7efe43a88a17a37b3ae96e79a19e846f3f456fe79c57ca61356ef01/markupsafe-3.0.3-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:218551f6df4868a8d527e3062d0fb968682fe92054e89978594c28e642c43a73", size = 11658, upload-time = "2025-09-27T18:36:54.819Z" },
+ { url = "https://files.pythonhosted.org/packages/99/9e/e412117548182ce2148bdeacdda3bb494260c0b0184360fe0d56389b523b/markupsafe-3.0.3-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:3524b778fe5cfb3452a09d31e7b5adefeea8c5be1d43c4f810ba09f2ceb29d37", size = 12066, upload-time = "2025-09-27T18:36:55.714Z" },
+ { url = "https://files.pythonhosted.org/packages/bc/e6/fa0ffcda717ef64a5108eaa7b4f5ed28d56122c9a6d70ab8b72f9f715c80/markupsafe-3.0.3-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4e885a3d1efa2eadc93c894a21770e4bc67899e3543680313b09f139e149ab19", size = 25639, upload-time = "2025-09-27T18:36:56.908Z" },
+ { url = "https://files.pythonhosted.org/packages/96/ec/2102e881fe9d25fc16cb4b25d5f5cde50970967ffa5dddafdb771237062d/markupsafe-3.0.3-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8709b08f4a89aa7586de0aadc8da56180242ee0ada3999749b183aa23df95025", size = 23569, upload-time = "2025-09-27T18:36:57.913Z" },
+ { url = "https://files.pythonhosted.org/packages/4b/30/6f2fce1f1f205fc9323255b216ca8a235b15860c34b6798f810f05828e32/markupsafe-3.0.3-cp313-cp313t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:b8512a91625c9b3da6f127803b166b629725e68af71f8184ae7e7d54686a56d6", size = 23284, upload-time = "2025-09-27T18:36:58.833Z" },
+ { url = "https://files.pythonhosted.org/packages/58/47/4a0ccea4ab9f5dcb6f79c0236d954acb382202721e704223a8aafa38b5c8/markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:9b79b7a16f7fedff2495d684f2b59b0457c3b493778c9eed31111be64d58279f", size = 24801, upload-time = "2025-09-27T18:36:59.739Z" },
+ { url = "https://files.pythonhosted.org/packages/6a/70/3780e9b72180b6fecb83a4814d84c3bf4b4ae4bf0b19c27196104149734c/markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = "sha256:12c63dfb4a98206f045aa9563db46507995f7ef6d83b2f68eda65c307c6829eb", size = 22769, upload-time = "2025-09-27T18:37:00.719Z" },
+ { url = "https://files.pythonhosted.org/packages/98/c5/c03c7f4125180fc215220c035beac6b9cb684bc7a067c84fc69414d315f5/markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:8f71bc33915be5186016f675cd83a1e08523649b0e33efdb898db577ef5bb009", size = 23642, upload-time = "2025-09-27T18:37:01.673Z" },
+ { url = "https://files.pythonhosted.org/packages/80/d6/2d1b89f6ca4bff1036499b1e29a1d02d282259f3681540e16563f27ebc23/markupsafe-3.0.3-cp313-cp313t-win32.whl", hash = "sha256:69c0b73548bc525c8cb9a251cddf1931d1db4d2258e9599c28c07ef3580ef354", size = 14612, upload-time = "2025-09-27T18:37:02.639Z" },
+ { url = "https://files.pythonhosted.org/packages/2b/98/e48a4bfba0a0ffcf9925fe2d69240bfaa19c6f7507b8cd09c70684a53c1e/markupsafe-3.0.3-cp313-cp313t-win_amd64.whl", hash = "sha256:1b4b79e8ebf6b55351f0d91fe80f893b4743f104bff22e90697db1590e47a218", size = 15200, upload-time = "2025-09-27T18:37:03.582Z" },
+ { url = "https://files.pythonhosted.org/packages/0e/72/e3cc540f351f316e9ed0f092757459afbc595824ca724cbc5a5d4263713f/markupsafe-3.0.3-cp313-cp313t-win_arm64.whl", hash = "sha256:ad2cf8aa28b8c020ab2fc8287b0f823d0a7d8630784c31e9ee5edea20f406287", size = 13973, upload-time = "2025-09-27T18:37:04.929Z" },
+ { url = "https://files.pythonhosted.org/packages/33/8a/8e42d4838cd89b7dde187011e97fe6c3af66d8c044997d2183fbd6d31352/markupsafe-3.0.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:eaa9599de571d72e2daf60164784109f19978b327a3910d3e9de8c97b5b70cfe", size = 11619, upload-time = "2025-09-27T18:37:06.342Z" },
+ { url = "https://files.pythonhosted.org/packages/b5/64/7660f8a4a8e53c924d0fa05dc3a55c9cee10bbd82b11c5afb27d44b096ce/markupsafe-3.0.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c47a551199eb8eb2121d4f0f15ae0f923d31350ab9280078d1e5f12b249e0026", size = 12029, upload-time = "2025-09-27T18:37:07.213Z" },
+ { url = "https://files.pythonhosted.org/packages/da/ef/e648bfd021127bef5fa12e1720ffed0c6cbb8310c8d9bea7266337ff06de/markupsafe-3.0.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f34c41761022dd093b4b6896d4810782ffbabe30f2d443ff5f083e0cbbb8c737", size = 24408, upload-time = "2025-09-27T18:37:09.572Z" },
+ { url = "https://files.pythonhosted.org/packages/41/3c/a36c2450754618e62008bf7435ccb0f88053e07592e6028a34776213d877/markupsafe-3.0.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:457a69a9577064c05a97c41f4e65148652db078a3a509039e64d3467b9e7ef97", size = 23005, upload-time = "2025-09-27T18:37:10.58Z" },
+ { url = "https://files.pythonhosted.org/packages/bc/20/b7fdf89a8456b099837cd1dc21974632a02a999ec9bf7ca3e490aacd98e7/markupsafe-3.0.3-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:e8afc3f2ccfa24215f8cb28dcf43f0113ac3c37c2f0f0806d8c70e4228c5cf4d", size = 22048, upload-time = "2025-09-27T18:37:11.547Z" },
+ { url = "https://files.pythonhosted.org/packages/9a/a7/591f592afdc734f47db08a75793a55d7fbcc6902a723ae4cfbab61010cc5/markupsafe-3.0.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:ec15a59cf5af7be74194f7ab02d0f59a62bdcf1a537677ce67a2537c9b87fcda", size = 23821, upload-time = "2025-09-27T18:37:12.48Z" },
+ { url = "https://files.pythonhosted.org/packages/7d/33/45b24e4f44195b26521bc6f1a82197118f74df348556594bd2262bda1038/markupsafe-3.0.3-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:0eb9ff8191e8498cca014656ae6b8d61f39da5f95b488805da4bb029cccbfbaf", size = 21606, upload-time = "2025-09-27T18:37:13.485Z" },
+ { url = "https://files.pythonhosted.org/packages/ff/0e/53dfaca23a69fbfbbf17a4b64072090e70717344c52eaaaa9c5ddff1e5f0/markupsafe-3.0.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:2713baf880df847f2bece4230d4d094280f4e67b1e813eec43b4c0e144a34ffe", size = 23043, upload-time = "2025-09-27T18:37:14.408Z" },
+ { url = "https://files.pythonhosted.org/packages/46/11/f333a06fc16236d5238bfe74daccbca41459dcd8d1fa952e8fbd5dccfb70/markupsafe-3.0.3-cp314-cp314-win32.whl", hash = "sha256:729586769a26dbceff69f7a7dbbf59ab6572b99d94576a5592625d5b411576b9", size = 14747, upload-time = "2025-09-27T18:37:15.36Z" },
+ { url = "https://files.pythonhosted.org/packages/28/52/182836104b33b444e400b14f797212f720cbc9ed6ba34c800639d154e821/markupsafe-3.0.3-cp314-cp314-win_amd64.whl", hash = "sha256:bdc919ead48f234740ad807933cdf545180bfbe9342c2bb451556db2ed958581", size = 15341, upload-time = "2025-09-27T18:37:16.496Z" },
+ { url = "https://files.pythonhosted.org/packages/6f/18/acf23e91bd94fd7b3031558b1f013adfa21a8e407a3fdb32745538730382/markupsafe-3.0.3-cp314-cp314-win_arm64.whl", hash = "sha256:5a7d5dc5140555cf21a6fefbdbf8723f06fcd2f63ef108f2854de715e4422cb4", size = 14073, upload-time = "2025-09-27T18:37:17.476Z" },
+ { url = "https://files.pythonhosted.org/packages/3c/f0/57689aa4076e1b43b15fdfa646b04653969d50cf30c32a102762be2485da/markupsafe-3.0.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:1353ef0c1b138e1907ae78e2f6c63ff67501122006b0f9abad68fda5f4ffc6ab", size = 11661, upload-time = "2025-09-27T18:37:18.453Z" },
+ { url = "https://files.pythonhosted.org/packages/89/c3/2e67a7ca217c6912985ec766c6393b636fb0c2344443ff9d91404dc4c79f/markupsafe-3.0.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:1085e7fbddd3be5f89cc898938f42c0b3c711fdcb37d75221de2666af647c175", size = 12069, upload-time = "2025-09-27T18:37:19.332Z" },
+ { url = "https://files.pythonhosted.org/packages/f0/00/be561dce4e6ca66b15276e184ce4b8aec61fe83662cce2f7d72bd3249d28/markupsafe-3.0.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1b52b4fb9df4eb9ae465f8d0c228a00624de2334f216f178a995ccdcf82c4634", size = 25670, upload-time = "2025-09-27T18:37:20.245Z" },
+ { url = "https://files.pythonhosted.org/packages/50/09/c419f6f5a92e5fadde27efd190eca90f05e1261b10dbd8cbcb39cd8ea1dc/markupsafe-3.0.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:fed51ac40f757d41b7c48425901843666a6677e3e8eb0abcff09e4ba6e664f50", size = 23598, upload-time = "2025-09-27T18:37:21.177Z" },
+ { url = "https://files.pythonhosted.org/packages/22/44/a0681611106e0b2921b3033fc19bc53323e0b50bc70cffdd19f7d679bb66/markupsafe-3.0.3-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:f190daf01f13c72eac4efd5c430a8de82489d9cff23c364c3ea822545032993e", size = 23261, upload-time = "2025-09-27T18:37:22.167Z" },
+ { url = "https://files.pythonhosted.org/packages/5f/57/1b0b3f100259dc9fffe780cfb60d4be71375510e435efec3d116b6436d43/markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:e56b7d45a839a697b5eb268c82a71bd8c7f6c94d6fd50c3d577fa39a9f1409f5", size = 24835, upload-time = "2025-09-27T18:37:23.296Z" },
+ { url = "https://files.pythonhosted.org/packages/26/6a/4bf6d0c97c4920f1597cc14dd720705eca0bf7c787aebc6bb4d1bead5388/markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:f3e98bb3798ead92273dc0e5fd0f31ade220f59a266ffd8a4f6065e0a3ce0523", size = 22733, upload-time = "2025-09-27T18:37:24.237Z" },
+ { url = "https://files.pythonhosted.org/packages/14/c7/ca723101509b518797fedc2fdf79ba57f886b4aca8a7d31857ba3ee8281f/markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:5678211cb9333a6468fb8d8be0305520aa073f50d17f089b5b4b477ea6e67fdc", size = 23672, upload-time = "2025-09-27T18:37:25.271Z" },
+ { url = "https://files.pythonhosted.org/packages/fb/df/5bd7a48c256faecd1d36edc13133e51397e41b73bb77e1a69deab746ebac/markupsafe-3.0.3-cp314-cp314t-win32.whl", hash = "sha256:915c04ba3851909ce68ccc2b8e2cd691618c4dc4c4232fb7982bca3f41fd8c3d", size = 14819, upload-time = "2025-09-27T18:37:26.285Z" },
+ { url = "https://files.pythonhosted.org/packages/1a/8a/0402ba61a2f16038b48b39bccca271134be00c5c9f0f623208399333c448/markupsafe-3.0.3-cp314-cp314t-win_amd64.whl", hash = "sha256:4faffd047e07c38848ce017e8725090413cd80cbc23d86e55c587bf979e579c9", size = 15426, upload-time = "2025-09-27T18:37:27.316Z" },
+ { url = "https://files.pythonhosted.org/packages/70/bc/6f1c2f612465f5fa89b95bead1f44dcb607670fd42891d8fdcd5d039f4f4/markupsafe-3.0.3-cp314-cp314t-win_arm64.whl", hash = "sha256:32001d6a8fc98c8cb5c947787c5d08b0a50663d139f1305bac5885d98d9b40fa", size = 14146, upload-time = "2025-09-27T18:37:28.327Z" },
+]
+
+[[package]]
+name = "matplotlib"
+version = "3.10.7"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "contourpy", version = "1.3.2", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.11'" },
+ { name = "contourpy", version = "1.3.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.11'" },
+ { name = "cycler" },
+ { name = "fonttools" },
+ { name = "kiwisolver" },
+ { name = "numpy", version = "2.2.6", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.11'" },
+ { name = "numpy", version = "2.3.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.11'" },
+ { name = "packaging" },
+ { name = "pillow" },
+ { name = "pyparsing" },
+ { name = "python-dateutil" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/ae/e2/d2d5295be2f44c678ebaf3544ba32d20c1f9ef08c49fe47f496180e1db15/matplotlib-3.10.7.tar.gz", hash = "sha256:a06ba7e2a2ef9131c79c49e63dad355d2d878413a0376c1727c8b9335ff731c7", size = 34804865, upload-time = "2025-10-09T00:28:00.669Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/6c/87/3932d5778ab4c025db22710b61f49ccaed3956c5cf46ffb2ffa7492b06d9/matplotlib-3.10.7-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:7ac81eee3b7c266dd92cee1cd658407b16c57eed08c7421fa354ed68234de380", size = 8247141, upload-time = "2025-10-09T00:26:06.023Z" },
+ { url = "https://files.pythonhosted.org/packages/45/a8/bfed45339160102bce21a44e38a358a1134a5f84c26166de03fb4a53208f/matplotlib-3.10.7-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:667ecd5d8d37813a845053d8f5bf110b534c3c9f30e69ebd25d4701385935a6d", size = 8107995, upload-time = "2025-10-09T00:26:08.669Z" },
+ { url = "https://files.pythonhosted.org/packages/e2/3c/5692a2d9a5ba848fda3f48d2b607037df96460b941a59ef236404b39776b/matplotlib-3.10.7-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:cc1c51b846aca49a5a8b44fbba6a92d583a35c64590ad9e1e950dc88940a4297", size = 8680503, upload-time = "2025-10-09T00:26:10.607Z" },
+ { url = "https://files.pythonhosted.org/packages/ab/a0/86ace53c48b05d0e6e9c127b2ace097434901f3e7b93f050791c8243201a/matplotlib-3.10.7-cp310-cp310-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4a11c2e9e72e7de09b7b72e62f3df23317c888299c875e2b778abf1eda8c0a42", size = 9514982, upload-time = "2025-10-09T00:26:12.594Z" },
+ { url = "https://files.pythonhosted.org/packages/a6/81/ead71e2824da8f72640a64166d10e62300df4ae4db01a0bac56c5b39fa51/matplotlib-3.10.7-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:f19410b486fdd139885ace124e57f938c1e6a3210ea13dd29cab58f5d4bc12c7", size = 9566429, upload-time = "2025-10-09T00:26:14.758Z" },
+ { url = "https://files.pythonhosted.org/packages/65/7d/954b3067120456f472cce8fdcacaf4a5fcd522478db0c37bb243c7cb59dd/matplotlib-3.10.7-cp310-cp310-win_amd64.whl", hash = "sha256:b498e9e4022f93de2d5a37615200ca01297ceebbb56fe4c833f46862a490f9e3", size = 8108174, upload-time = "2025-10-09T00:26:17.015Z" },
+ { url = "https://files.pythonhosted.org/packages/fc/bc/0fb489005669127ec13f51be0c6adc074d7cf191075dab1da9fe3b7a3cfc/matplotlib-3.10.7-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:53b492410a6cd66c7a471de6c924f6ede976e963c0f3097a3b7abfadddc67d0a", size = 8257507, upload-time = "2025-10-09T00:26:19.073Z" },
+ { url = "https://files.pythonhosted.org/packages/e2/6a/d42588ad895279ff6708924645b5d2ed54a7fb2dc045c8a804e955aeace1/matplotlib-3.10.7-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:d9749313deb729f08207718d29c86246beb2ea3fdba753595b55901dee5d2fd6", size = 8119565, upload-time = "2025-10-09T00:26:21.023Z" },
+ { url = "https://files.pythonhosted.org/packages/10/b7/4aa196155b4d846bd749cf82aa5a4c300cf55a8b5e0dfa5b722a63c0f8a0/matplotlib-3.10.7-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:2222c7ba2cbde7fe63032769f6eb7e83ab3227f47d997a8453377709b7fe3a5a", size = 8692668, upload-time = "2025-10-09T00:26:22.967Z" },
+ { url = "https://files.pythonhosted.org/packages/e6/e7/664d2b97016f46683a02d854d730cfcf54ff92c1dafa424beebef50f831d/matplotlib-3.10.7-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e91f61a064c92c307c5a9dc8c05dc9f8a68f0a3be199d9a002a0622e13f874a1", size = 9521051, upload-time = "2025-10-09T00:26:25.041Z" },
+ { url = "https://files.pythonhosted.org/packages/a8/a3/37aef1404efa615f49b5758a5e0261c16dd88f389bc1861e722620e4a754/matplotlib-3.10.7-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:6f1851eab59ca082c95df5a500106bad73672645625e04538b3ad0f69471ffcc", size = 9576878, upload-time = "2025-10-09T00:26:27.478Z" },
+ { url = "https://files.pythonhosted.org/packages/33/cd/b145f9797126f3f809d177ca378de57c45413c5099c5990de2658760594a/matplotlib-3.10.7-cp311-cp311-win_amd64.whl", hash = "sha256:6516ce375109c60ceec579e699524e9d504cd7578506f01150f7a6bc174a775e", size = 8115142, upload-time = "2025-10-09T00:26:29.774Z" },
+ { url = "https://files.pythonhosted.org/packages/2e/39/63bca9d2b78455ed497fcf51a9c71df200a11048f48249038f06447fa947/matplotlib-3.10.7-cp311-cp311-win_arm64.whl", hash = "sha256:b172db79759f5f9bc13ef1c3ef8b9ee7b37b0247f987fbbbdaa15e4f87fd46a9", size = 7992439, upload-time = "2025-10-09T00:26:40.32Z" },
+ { url = "https://files.pythonhosted.org/packages/be/b3/09eb0f7796932826ec20c25b517d568627754f6c6462fca19e12c02f2e12/matplotlib-3.10.7-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:7a0edb7209e21840e8361e91ea84ea676658aa93edd5f8762793dec77a4a6748", size = 8272389, upload-time = "2025-10-09T00:26:42.474Z" },
+ { url = "https://files.pythonhosted.org/packages/11/0b/1ae80ddafb8652fd8046cb5c8460ecc8d4afccb89e2c6d6bec61e04e1eaf/matplotlib-3.10.7-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:c380371d3c23e0eadf8ebff114445b9f970aff2010198d498d4ab4c3b41eea4f", size = 8128247, upload-time = "2025-10-09T00:26:44.77Z" },
+ { url = "https://files.pythonhosted.org/packages/7d/18/95ae2e242d4a5c98bd6e90e36e128d71cf1c7e39b0874feaed3ef782e789/matplotlib-3.10.7-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:d5f256d49fea31f40f166a5e3131235a5d2f4b7f44520b1cf0baf1ce568ccff0", size = 8696996, upload-time = "2025-10-09T00:26:46.792Z" },
+ { url = "https://files.pythonhosted.org/packages/7e/3d/5b559efc800bd05cb2033aa85f7e13af51958136a48327f7c261801ff90a/matplotlib-3.10.7-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:11ae579ac83cdf3fb72573bb89f70e0534de05266728740d478f0f818983c695", size = 9530153, upload-time = "2025-10-09T00:26:49.07Z" },
+ { url = "https://files.pythonhosted.org/packages/88/57/eab4a719fd110312d3c220595d63a3c85ec2a39723f0f4e7fa7e6e3f74ba/matplotlib-3.10.7-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:4c14b6acd16cddc3569a2d515cfdd81c7a68ac5639b76548cfc1a9e48b20eb65", size = 9593093, upload-time = "2025-10-09T00:26:51.067Z" },
+ { url = "https://files.pythonhosted.org/packages/31/3c/80816f027b3a4a28cd2a0a6ef7f89a2db22310e945cd886ec25bfb399221/matplotlib-3.10.7-cp312-cp312-win_amd64.whl", hash = "sha256:0d8c32b7ea6fb80b1aeff5a2ceb3fb9778e2759e899d9beff75584714afcc5ee", size = 8122771, upload-time = "2025-10-09T00:26:53.296Z" },
+ { url = "https://files.pythonhosted.org/packages/de/77/ef1fc78bfe99999b2675435cc52120887191c566b25017d78beaabef7f2d/matplotlib-3.10.7-cp312-cp312-win_arm64.whl", hash = "sha256:5f3f6d315dcc176ba7ca6e74c7768fb7e4cf566c49cb143f6bc257b62e634ed8", size = 7992812, upload-time = "2025-10-09T00:26:54.882Z" },
+ { url = "https://files.pythonhosted.org/packages/02/9c/207547916a02c78f6bdd83448d9b21afbc42f6379ed887ecf610984f3b4e/matplotlib-3.10.7-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:1d9d3713a237970569156cfb4de7533b7c4eacdd61789726f444f96a0d28f57f", size = 8273212, upload-time = "2025-10-09T00:26:56.752Z" },
+ { url = "https://files.pythonhosted.org/packages/bc/d0/b3d3338d467d3fc937f0bb7f256711395cae6f78e22cef0656159950adf0/matplotlib-3.10.7-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:37a1fea41153dd6ee061d21ab69c9cf2cf543160b1b85d89cd3d2e2a7902ca4c", size = 8128713, upload-time = "2025-10-09T00:26:59.001Z" },
+ { url = "https://files.pythonhosted.org/packages/22/ff/6425bf5c20d79aa5b959d1ce9e65f599632345391381c9a104133fe0b171/matplotlib-3.10.7-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:b3c4ea4948d93c9c29dc01c0c23eef66f2101bf75158c291b88de6525c55c3d1", size = 8698527, upload-time = "2025-10-09T00:27:00.69Z" },
+ { url = "https://files.pythonhosted.org/packages/d0/7f/ccdca06f4c2e6c7989270ed7829b8679466682f4cfc0f8c9986241c023b6/matplotlib-3.10.7-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:22df30ffaa89f6643206cf13877191c63a50e8f800b038bc39bee9d2d4957632", size = 9529690, upload-time = "2025-10-09T00:27:02.664Z" },
+ { url = "https://files.pythonhosted.org/packages/b8/95/b80fc2c1f269f21ff3d193ca697358e24408c33ce2b106a7438a45407b63/matplotlib-3.10.7-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:b69676845a0a66f9da30e87f48be36734d6748024b525ec4710be40194282c84", size = 9593732, upload-time = "2025-10-09T00:27:04.653Z" },
+ { url = "https://files.pythonhosted.org/packages/e1/b6/23064a96308b9aeceeffa65e96bcde459a2ea4934d311dee20afde7407a0/matplotlib-3.10.7-cp313-cp313-win_amd64.whl", hash = "sha256:744991e0cc863dd669c8dc9136ca4e6e0082be2070b9d793cbd64bec872a6815", size = 8122727, upload-time = "2025-10-09T00:27:06.814Z" },
+ { url = "https://files.pythonhosted.org/packages/b3/a6/2faaf48133b82cf3607759027f82b5c702aa99cdfcefb7f93d6ccf26a424/matplotlib-3.10.7-cp313-cp313-win_arm64.whl", hash = "sha256:fba2974df0bf8ce3c995fa84b79cde38326e0f7b5409e7a3a481c1141340bcf7", size = 7992958, upload-time = "2025-10-09T00:27:08.567Z" },
+ { url = "https://files.pythonhosted.org/packages/4a/f0/b018fed0b599bd48d84c08794cb242227fe3341952da102ee9d9682db574/matplotlib-3.10.7-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:932c55d1fa7af4423422cb6a492a31cbcbdbe68fd1a9a3f545aa5e7a143b5355", size = 8316849, upload-time = "2025-10-09T00:27:10.254Z" },
+ { url = "https://files.pythonhosted.org/packages/b0/b7/bb4f23856197659f275e11a2a164e36e65e9b48ea3e93c4ec25b4f163198/matplotlib-3.10.7-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:5e38c2d581d62ee729a6e144c47a71b3f42fb4187508dbbf4fe71d5612c3433b", size = 8178225, upload-time = "2025-10-09T00:27:12.241Z" },
+ { url = "https://files.pythonhosted.org/packages/62/56/0600609893ff277e6f3ab3c0cef4eafa6e61006c058e84286c467223d4d5/matplotlib-3.10.7-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:786656bb13c237bbcebcd402f65f44dd61ead60ee3deb045af429d889c8dbc67", size = 8711708, upload-time = "2025-10-09T00:27:13.879Z" },
+ { url = "https://files.pythonhosted.org/packages/d8/1a/6bfecb0cafe94d6658f2f1af22c43b76cf7a1c2f0dc34ef84cbb6809617e/matplotlib-3.10.7-cp313-cp313t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:09d7945a70ea43bf9248f4b6582734c2fe726723204a76eca233f24cffc7ef67", size = 9541409, upload-time = "2025-10-09T00:27:15.684Z" },
+ { url = "https://files.pythonhosted.org/packages/08/50/95122a407d7f2e446fd865e2388a232a23f2b81934960ea802f3171518e4/matplotlib-3.10.7-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:d0b181e9fa8daf1d9f2d4c547527b167cb8838fc587deabca7b5c01f97199e84", size = 9594054, upload-time = "2025-10-09T00:27:17.547Z" },
+ { url = "https://files.pythonhosted.org/packages/13/76/75b194a43b81583478a81e78a07da8d9ca6ddf50dd0a2ccabf258059481d/matplotlib-3.10.7-cp313-cp313t-win_amd64.whl", hash = "sha256:31963603041634ce1a96053047b40961f7a29eb8f9a62e80cc2c0427aa1d22a2", size = 8200100, upload-time = "2025-10-09T00:27:20.039Z" },
+ { url = "https://files.pythonhosted.org/packages/f5/9e/6aefebdc9f8235c12bdeeda44cc0383d89c1e41da2c400caf3ee2073a3ce/matplotlib-3.10.7-cp313-cp313t-win_arm64.whl", hash = "sha256:aebed7b50aa6ac698c90f60f854b47e48cd2252b30510e7a1feddaf5a3f72cbf", size = 8042131, upload-time = "2025-10-09T00:27:21.608Z" },
+ { url = "https://files.pythonhosted.org/packages/0d/4b/e5bc2c321b6a7e3a75638d937d19ea267c34bd5a90e12bee76c4d7c7a0d9/matplotlib-3.10.7-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:d883460c43e8c6b173fef244a2341f7f7c0e9725c7fe68306e8e44ed9c8fb100", size = 8273787, upload-time = "2025-10-09T00:27:23.27Z" },
+ { url = "https://files.pythonhosted.org/packages/86/ad/6efae459c56c2fbc404da154e13e3a6039129f3c942b0152624f1c621f05/matplotlib-3.10.7-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:07124afcf7a6504eafcb8ce94091c5898bbdd351519a1beb5c45f7a38c67e77f", size = 8131348, upload-time = "2025-10-09T00:27:24.926Z" },
+ { url = "https://files.pythonhosted.org/packages/a6/5a/a4284d2958dee4116359cc05d7e19c057e64ece1b4ac986ab0f2f4d52d5a/matplotlib-3.10.7-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c17398b709a6cce3d9fdb1595c33e356d91c098cd9486cb2cc21ea2ea418e715", size = 9533949, upload-time = "2025-10-09T00:27:26.704Z" },
+ { url = "https://files.pythonhosted.org/packages/de/ff/f3781b5057fa3786623ad8976fc9f7b0d02b2f28534751fd5a44240de4cf/matplotlib-3.10.7-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:7146d64f561498764561e9cd0ed64fcf582e570fc519e6f521e2d0cfd43365e1", size = 9804247, upload-time = "2025-10-09T00:27:28.514Z" },
+ { url = "https://files.pythonhosted.org/packages/47/5a/993a59facb8444efb0e197bf55f545ee449902dcee86a4dfc580c3b61314/matplotlib-3.10.7-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:90ad854c0a435da3104c01e2c6f0028d7e719b690998a2333d7218db80950722", size = 9595497, upload-time = "2025-10-09T00:27:30.418Z" },
+ { url = "https://files.pythonhosted.org/packages/0d/a5/77c95aaa9bb32c345cbb49626ad8eb15550cba2e6d4c88081a6c2ac7b08d/matplotlib-3.10.7-cp314-cp314-win_amd64.whl", hash = "sha256:4645fc5d9d20ffa3a39361fcdbcec731382763b623b72627806bf251b6388866", size = 8252732, upload-time = "2025-10-09T00:27:32.332Z" },
+ { url = "https://files.pythonhosted.org/packages/74/04/45d269b4268d222390d7817dae77b159651909669a34ee9fdee336db5883/matplotlib-3.10.7-cp314-cp314-win_arm64.whl", hash = "sha256:9257be2f2a03415f9105c486d304a321168e61ad450f6153d77c69504ad764bb", size = 8124240, upload-time = "2025-10-09T00:27:33.94Z" },
+ { url = "https://files.pythonhosted.org/packages/4b/c7/ca01c607bb827158b439208c153d6f14ddb9fb640768f06f7ca3488ae67b/matplotlib-3.10.7-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:1e4bbad66c177a8fdfa53972e5ef8be72a5f27e6a607cec0d8579abd0f3102b1", size = 8316938, upload-time = "2025-10-09T00:27:35.534Z" },
+ { url = "https://files.pythonhosted.org/packages/84/d2/5539e66e9f56d2fdec94bb8436f5e449683b4e199bcc897c44fbe3c99e28/matplotlib-3.10.7-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:d8eb7194b084b12feb19142262165832fc6ee879b945491d1c3d4660748020c4", size = 8178245, upload-time = "2025-10-09T00:27:37.334Z" },
+ { url = "https://files.pythonhosted.org/packages/77/b5/e6ca22901fd3e4fe433a82e583436dd872f6c966fca7e63cf806b40356f8/matplotlib-3.10.7-cp314-cp314t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b4d41379b05528091f00e1728004f9a8d7191260f3862178b88e8fd770206318", size = 9541411, upload-time = "2025-10-09T00:27:39.387Z" },
+ { url = "https://files.pythonhosted.org/packages/9e/99/a4524db57cad8fee54b7237239a8f8360bfcfa3170d37c9e71c090c0f409/matplotlib-3.10.7-cp314-cp314t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4a74f79fafb2e177f240579bc83f0b60f82cc47d2f1d260f422a0627207008ca", size = 9803664, upload-time = "2025-10-09T00:27:41.492Z" },
+ { url = "https://files.pythonhosted.org/packages/e6/a5/85e2edf76ea0ad4288d174926d9454ea85f3ce5390cc4e6fab196cbf250b/matplotlib-3.10.7-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:702590829c30aada1e8cef0568ddbffa77ca747b4d6e36c6d173f66e301f89cc", size = 9594066, upload-time = "2025-10-09T00:27:43.694Z" },
+ { url = "https://files.pythonhosted.org/packages/39/69/9684368a314f6d83fe5c5ad2a4121a3a8e03723d2e5c8ea17b66c1bad0e7/matplotlib-3.10.7-cp314-cp314t-win_amd64.whl", hash = "sha256:f79d5de970fc90cd5591f60053aecfce1fcd736e0303d9f0bf86be649fa68fb8", size = 8342832, upload-time = "2025-10-09T00:27:45.543Z" },
+ { url = "https://files.pythonhosted.org/packages/04/5f/e22e08da14bc1a0894184640d47819d2338b792732e20d292bf86e5ab785/matplotlib-3.10.7-cp314-cp314t-win_arm64.whl", hash = "sha256:cb783436e47fcf82064baca52ce748af71725d0352e1d31564cbe9c95df92b9c", size = 8172585, upload-time = "2025-10-09T00:27:47.185Z" },
+ { url = "https://files.pythonhosted.org/packages/1e/6c/a9bcf03e9afb2a873e0a5855f79bce476d1023f26f8212969f2b7504756c/matplotlib-3.10.7-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:5c09cf8f2793f81368f49f118b6f9f937456362bee282eac575cca7f84cda537", size = 8241204, upload-time = "2025-10-09T00:27:48.806Z" },
+ { url = "https://files.pythonhosted.org/packages/5b/fd/0e6f5aa762ed689d9fa8750b08f1932628ffa7ed30e76423c399d19407d2/matplotlib-3.10.7-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:de66744b2bb88d5cd27e80dfc2ec9f0517d0a46d204ff98fe9e5f2864eb67657", size = 8104607, upload-time = "2025-10-09T00:27:50.876Z" },
+ { url = "https://files.pythonhosted.org/packages/b9/a9/21c9439d698fac5f0de8fc68b2405b738ed1f00e1279c76f2d9aa5521ead/matplotlib-3.10.7-pp310-pypy310_pp73-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:53cc80662dd197ece414dd5b66e07370201515a3eaf52e7c518c68c16814773b", size = 8682257, upload-time = "2025-10-09T00:27:52.597Z" },
+ { url = "https://files.pythonhosted.org/packages/58/8f/76d5dc21ac64a49e5498d7f0472c0781dae442dd266a67458baec38288ec/matplotlib-3.10.7-pp311-pypy311_pp73-macosx_10_15_x86_64.whl", hash = "sha256:15112bcbaef211bd663fa935ec33313b948e214454d949b723998a43357b17b0", size = 8252283, upload-time = "2025-10-09T00:27:54.739Z" },
+ { url = "https://files.pythonhosted.org/packages/27/0d/9c5d4c2317feb31d819e38c9f947c942f42ebd4eb935fc6fd3518a11eaa7/matplotlib-3.10.7-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:d2a959c640cdeecdd2ec3136e8ea0441da59bcaf58d67e9c590740addba2cb68", size = 8116733, upload-time = "2025-10-09T00:27:56.406Z" },
+ { url = "https://files.pythonhosted.org/packages/9a/cc/3fe688ff1355010937713164caacf9ed443675ac48a997bab6ed23b3f7c0/matplotlib-3.10.7-pp311-pypy311_pp73-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:3886e47f64611046bc1db523a09dd0a0a6bed6081e6f90e13806dd1d1d1b5e91", size = 8693919, upload-time = "2025-10-09T00:27:58.41Z" },
+]
+
+[[package]]
+name = "mdurl"
+version = "0.1.2"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/d6/54/cfe61301667036ec958cb99bd3efefba235e65cdeb9c84d24a8293ba1d90/mdurl-0.1.2.tar.gz", hash = "sha256:bb413d29f5eea38f31dd4754dd7377d4465116fb207585f97bf925588687c1ba", size = 8729, upload-time = "2022-08-14T12:40:10.846Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/b3/38/89ba8ad64ae25be8de66a6d463314cf1eb366222074cfda9ee839c56a4b4/mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8", size = 9979, upload-time = "2022-08-14T12:40:09.779Z" },
+]
+
+[[package]]
+name = "mpmath"
+version = "1.3.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/e0/47/dd32fa426cc72114383ac549964eecb20ecfd886d1e5ccf5340b55b02f57/mpmath-1.3.0.tar.gz", hash = "sha256:7a28eb2a9774d00c7bc92411c19a89209d5da7c4c9a9e227be8330a23a25b91f", size = 508106, upload-time = "2023-03-07T16:47:11.061Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/43/e3/7d92a15f894aa0c9c4b49b8ee9ac9850d6e63b03c9c32c0367a13ae62209/mpmath-1.3.0-py3-none-any.whl", hash = "sha256:a0b2b9fe80bbcd81a6647ff13108738cfb482d481d826cc0e02f5b35e5c88d2c", size = 536198, upload-time = "2023-03-07T16:47:09.197Z" },
+]
+
+[[package]]
+name = "msgpack"
+version = "1.1.2"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/4d/f2/bfb55a6236ed8725a96b0aa3acbd0ec17588e6a2c3b62a93eb513ed8783f/msgpack-1.1.2.tar.gz", hash = "sha256:3b60763c1373dd60f398488069bcdc703cd08a711477b5d480eecc9f9626f47e", size = 173581, upload-time = "2025-10-08T09:15:56.596Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/f5/a2/3b68a9e769db68668b25c6108444a35f9bd163bb848c0650d516761a59c0/msgpack-1.1.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:0051fffef5a37ca2cd16978ae4f0aef92f164df86823871b5162812bebecd8e2", size = 81318, upload-time = "2025-10-08T09:14:38.722Z" },
+ { url = "https://files.pythonhosted.org/packages/5b/e1/2b720cc341325c00be44e1ed59e7cfeae2678329fbf5aa68f5bda57fe728/msgpack-1.1.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:a605409040f2da88676e9c9e5853b3449ba8011973616189ea5ee55ddbc5bc87", size = 83786, upload-time = "2025-10-08T09:14:40.082Z" },
+ { url = "https://files.pythonhosted.org/packages/71/e5/c2241de64bfceac456b140737812a2ab310b10538a7b34a1d393b748e095/msgpack-1.1.2-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8b696e83c9f1532b4af884045ba7f3aa741a63b2bc22617293a2c6a7c645f251", size = 398240, upload-time = "2025-10-08T09:14:41.151Z" },
+ { url = "https://files.pythonhosted.org/packages/b7/09/2a06956383c0fdebaef5aa9246e2356776f12ea6f2a44bd1368abf0e46c4/msgpack-1.1.2-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:365c0bbe981a27d8932da71af63ef86acc59ed5c01ad929e09a0b88c6294e28a", size = 406070, upload-time = "2025-10-08T09:14:42.821Z" },
+ { url = "https://files.pythonhosted.org/packages/0e/74/2957703f0e1ef20637d6aead4fbb314330c26f39aa046b348c7edcf6ca6b/msgpack-1.1.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:41d1a5d875680166d3ac5c38573896453bbbea7092936d2e107214daf43b1d4f", size = 393403, upload-time = "2025-10-08T09:14:44.38Z" },
+ { url = "https://files.pythonhosted.org/packages/a5/09/3bfc12aa90f77b37322fc33e7a8a7c29ba7c8edeadfa27664451801b9860/msgpack-1.1.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:354e81bcdebaab427c3df4281187edc765d5d76bfb3a7c125af9da7a27e8458f", size = 398947, upload-time = "2025-10-08T09:14:45.56Z" },
+ { url = "https://files.pythonhosted.org/packages/4b/4f/05fcebd3b4977cb3d840f7ef6b77c51f8582086de5e642f3fefee35c86fc/msgpack-1.1.2-cp310-cp310-win32.whl", hash = "sha256:e64c8d2f5e5d5fda7b842f55dec6133260ea8f53c4257d64494c534f306bf7a9", size = 64769, upload-time = "2025-10-08T09:14:47.334Z" },
+ { url = "https://files.pythonhosted.org/packages/d0/3e/b4547e3a34210956382eed1c85935fff7e0f9b98be3106b3745d7dec9c5e/msgpack-1.1.2-cp310-cp310-win_amd64.whl", hash = "sha256:db6192777d943bdaaafb6ba66d44bf65aa0e9c5616fa1d2da9bb08828c6b39aa", size = 71293, upload-time = "2025-10-08T09:14:48.665Z" },
+ { url = "https://files.pythonhosted.org/packages/2c/97/560d11202bcd537abca693fd85d81cebe2107ba17301de42b01ac1677b69/msgpack-1.1.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:2e86a607e558d22985d856948c12a3fa7b42efad264dca8a3ebbcfa2735d786c", size = 82271, upload-time = "2025-10-08T09:14:49.967Z" },
+ { url = "https://files.pythonhosted.org/packages/83/04/28a41024ccbd67467380b6fb440ae916c1e4f25e2cd4c63abe6835ac566e/msgpack-1.1.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:283ae72fc89da59aa004ba147e8fc2f766647b1251500182fac0350d8af299c0", size = 84914, upload-time = "2025-10-08T09:14:50.958Z" },
+ { url = "https://files.pythonhosted.org/packages/71/46/b817349db6886d79e57a966346cf0902a426375aadc1e8e7a86a75e22f19/msgpack-1.1.2-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:61c8aa3bd513d87c72ed0b37b53dd5c5a0f58f2ff9f26e1555d3bd7948fb7296", size = 416962, upload-time = "2025-10-08T09:14:51.997Z" },
+ { url = "https://files.pythonhosted.org/packages/da/e0/6cc2e852837cd6086fe7d8406af4294e66827a60a4cf60b86575a4a65ca8/msgpack-1.1.2-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:454e29e186285d2ebe65be34629fa0e8605202c60fbc7c4c650ccd41870896ef", size = 426183, upload-time = "2025-10-08T09:14:53.477Z" },
+ { url = "https://files.pythonhosted.org/packages/25/98/6a19f030b3d2ea906696cedd1eb251708e50a5891d0978b012cb6107234c/msgpack-1.1.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:7bc8813f88417599564fafa59fd6f95be417179f76b40325b500b3c98409757c", size = 411454, upload-time = "2025-10-08T09:14:54.648Z" },
+ { url = "https://files.pythonhosted.org/packages/b7/cd/9098fcb6adb32187a70b7ecaabf6339da50553351558f37600e53a4a2a23/msgpack-1.1.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:bafca952dc13907bdfdedfc6a5f579bf4f292bdd506fadb38389afa3ac5b208e", size = 422341, upload-time = "2025-10-08T09:14:56.328Z" },
+ { url = "https://files.pythonhosted.org/packages/e6/ae/270cecbcf36c1dc85ec086b33a51a4d7d08fc4f404bdbc15b582255d05ff/msgpack-1.1.2-cp311-cp311-win32.whl", hash = "sha256:602b6740e95ffc55bfb078172d279de3773d7b7db1f703b2f1323566b878b90e", size = 64747, upload-time = "2025-10-08T09:14:57.882Z" },
+ { url = "https://files.pythonhosted.org/packages/2a/79/309d0e637f6f37e83c711f547308b91af02b72d2326ddd860b966080ef29/msgpack-1.1.2-cp311-cp311-win_amd64.whl", hash = "sha256:d198d275222dc54244bf3327eb8cbe00307d220241d9cec4d306d49a44e85f68", size = 71633, upload-time = "2025-10-08T09:14:59.177Z" },
+ { url = "https://files.pythonhosted.org/packages/73/4d/7c4e2b3d9b1106cd0aa6cb56cc57c6267f59fa8bfab7d91df5adc802c847/msgpack-1.1.2-cp311-cp311-win_arm64.whl", hash = "sha256:86f8136dfa5c116365a8a651a7d7484b65b13339731dd6faebb9a0242151c406", size = 64755, upload-time = "2025-10-08T09:15:00.48Z" },
+ { url = "https://files.pythonhosted.org/packages/ad/bd/8b0d01c756203fbab65d265859749860682ccd2a59594609aeec3a144efa/msgpack-1.1.2-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:70a0dff9d1f8da25179ffcf880e10cf1aad55fdb63cd59c9a49a1b82290062aa", size = 81939, upload-time = "2025-10-08T09:15:01.472Z" },
+ { url = "https://files.pythonhosted.org/packages/34/68/ba4f155f793a74c1483d4bdef136e1023f7bcba557f0db4ef3db3c665cf1/msgpack-1.1.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:446abdd8b94b55c800ac34b102dffd2f6aa0ce643c55dfc017ad89347db3dbdb", size = 85064, upload-time = "2025-10-08T09:15:03.764Z" },
+ { url = "https://files.pythonhosted.org/packages/f2/60/a064b0345fc36c4c3d2c743c82d9100c40388d77f0b48b2f04d6041dbec1/msgpack-1.1.2-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c63eea553c69ab05b6747901b97d620bb2a690633c77f23feb0c6a947a8a7b8f", size = 417131, upload-time = "2025-10-08T09:15:05.136Z" },
+ { url = "https://files.pythonhosted.org/packages/65/92/a5100f7185a800a5d29f8d14041f61475b9de465ffcc0f3b9fba606e4505/msgpack-1.1.2-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:372839311ccf6bdaf39b00b61288e0557916c3729529b301c52c2d88842add42", size = 427556, upload-time = "2025-10-08T09:15:06.837Z" },
+ { url = "https://files.pythonhosted.org/packages/f5/87/ffe21d1bf7d9991354ad93949286f643b2bb6ddbeab66373922b44c3b8cc/msgpack-1.1.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:2929af52106ca73fcb28576218476ffbb531a036c2adbcf54a3664de124303e9", size = 404920, upload-time = "2025-10-08T09:15:08.179Z" },
+ { url = "https://files.pythonhosted.org/packages/ff/41/8543ed2b8604f7c0d89ce066f42007faac1eaa7d79a81555f206a5cdb889/msgpack-1.1.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:be52a8fc79e45b0364210eef5234a7cf8d330836d0a64dfbb878efa903d84620", size = 415013, upload-time = "2025-10-08T09:15:09.83Z" },
+ { url = "https://files.pythonhosted.org/packages/41/0d/2ddfaa8b7e1cee6c490d46cb0a39742b19e2481600a7a0e96537e9c22f43/msgpack-1.1.2-cp312-cp312-win32.whl", hash = "sha256:1fff3d825d7859ac888b0fbda39a42d59193543920eda9d9bea44d958a878029", size = 65096, upload-time = "2025-10-08T09:15:11.11Z" },
+ { url = "https://files.pythonhosted.org/packages/8c/ec/d431eb7941fb55a31dd6ca3404d41fbb52d99172df2e7707754488390910/msgpack-1.1.2-cp312-cp312-win_amd64.whl", hash = "sha256:1de460f0403172cff81169a30b9a92b260cb809c4cb7e2fc79ae8d0510c78b6b", size = 72708, upload-time = "2025-10-08T09:15:12.554Z" },
+ { url = "https://files.pythonhosted.org/packages/c5/31/5b1a1f70eb0e87d1678e9624908f86317787b536060641d6798e3cf70ace/msgpack-1.1.2-cp312-cp312-win_arm64.whl", hash = "sha256:be5980f3ee0e6bd44f3a9e9dea01054f175b50c3e6cdb692bc9424c0bbb8bf69", size = 64119, upload-time = "2025-10-08T09:15:13.589Z" },
+ { url = "https://files.pythonhosted.org/packages/6b/31/b46518ecc604d7edf3a4f94cb3bf021fc62aa301f0cb849936968164ef23/msgpack-1.1.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:4efd7b5979ccb539c221a4c4e16aac1a533efc97f3b759bb5a5ac9f6d10383bf", size = 81212, upload-time = "2025-10-08T09:15:14.552Z" },
+ { url = "https://files.pythonhosted.org/packages/92/dc/c385f38f2c2433333345a82926c6bfa5ecfff3ef787201614317b58dd8be/msgpack-1.1.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:42eefe2c3e2af97ed470eec850facbe1b5ad1d6eacdbadc42ec98e7dcf68b4b7", size = 84315, upload-time = "2025-10-08T09:15:15.543Z" },
+ { url = "https://files.pythonhosted.org/packages/d3/68/93180dce57f684a61a88a45ed13047558ded2be46f03acb8dec6d7c513af/msgpack-1.1.2-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1fdf7d83102bf09e7ce3357de96c59b627395352a4024f6e2458501f158bf999", size = 412721, upload-time = "2025-10-08T09:15:16.567Z" },
+ { url = "https://files.pythonhosted.org/packages/5d/ba/459f18c16f2b3fc1a1ca871f72f07d70c07bf768ad0a507a698b8052ac58/msgpack-1.1.2-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:fac4be746328f90caa3cd4bc67e6fe36ca2bf61d5c6eb6d895b6527e3f05071e", size = 424657, upload-time = "2025-10-08T09:15:17.825Z" },
+ { url = "https://files.pythonhosted.org/packages/38/f8/4398c46863b093252fe67368b44edc6c13b17f4e6b0e4929dbf0bdb13f23/msgpack-1.1.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:fffee09044073e69f2bad787071aeec727183e7580443dfeb8556cbf1978d162", size = 402668, upload-time = "2025-10-08T09:15:19.003Z" },
+ { url = "https://files.pythonhosted.org/packages/28/ce/698c1eff75626e4124b4d78e21cca0b4cc90043afb80a507626ea354ab52/msgpack-1.1.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:5928604de9b032bc17f5099496417f113c45bc6bc21b5c6920caf34b3c428794", size = 419040, upload-time = "2025-10-08T09:15:20.183Z" },
+ { url = "https://files.pythonhosted.org/packages/67/32/f3cd1667028424fa7001d82e10ee35386eea1408b93d399b09fb0aa7875f/msgpack-1.1.2-cp313-cp313-win32.whl", hash = "sha256:a7787d353595c7c7e145e2331abf8b7ff1e6673a6b974ded96e6d4ec09f00c8c", size = 65037, upload-time = "2025-10-08T09:15:21.416Z" },
+ { url = "https://files.pythonhosted.org/packages/74/07/1ed8277f8653c40ebc65985180b007879f6a836c525b3885dcc6448ae6cb/msgpack-1.1.2-cp313-cp313-win_amd64.whl", hash = "sha256:a465f0dceb8e13a487e54c07d04ae3ba131c7c5b95e2612596eafde1dccf64a9", size = 72631, upload-time = "2025-10-08T09:15:22.431Z" },
+ { url = "https://files.pythonhosted.org/packages/e5/db/0314e4e2db56ebcf450f277904ffd84a7988b9e5da8d0d61ab2d057df2b6/msgpack-1.1.2-cp313-cp313-win_arm64.whl", hash = "sha256:e69b39f8c0aa5ec24b57737ebee40be647035158f14ed4b40e6f150077e21a84", size = 64118, upload-time = "2025-10-08T09:15:23.402Z" },
+ { url = "https://files.pythonhosted.org/packages/22/71/201105712d0a2ff07b7873ed3c220292fb2ea5120603c00c4b634bcdafb3/msgpack-1.1.2-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:e23ce8d5f7aa6ea6d2a2b326b4ba46c985dbb204523759984430db7114f8aa00", size = 81127, upload-time = "2025-10-08T09:15:24.408Z" },
+ { url = "https://files.pythonhosted.org/packages/1b/9f/38ff9e57a2eade7bf9dfee5eae17f39fc0e998658050279cbb14d97d36d9/msgpack-1.1.2-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:6c15b7d74c939ebe620dd8e559384be806204d73b4f9356320632d783d1f7939", size = 84981, upload-time = "2025-10-08T09:15:25.812Z" },
+ { url = "https://files.pythonhosted.org/packages/8e/a9/3536e385167b88c2cc8f4424c49e28d49a6fc35206d4a8060f136e71f94c/msgpack-1.1.2-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:99e2cb7b9031568a2a5c73aa077180f93dd2e95b4f8d3b8e14a73ae94a9e667e", size = 411885, upload-time = "2025-10-08T09:15:27.22Z" },
+ { url = "https://files.pythonhosted.org/packages/2f/40/dc34d1a8d5f1e51fc64640b62b191684da52ca469da9cd74e84936ffa4a6/msgpack-1.1.2-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:180759d89a057eab503cf62eeec0aa61c4ea1200dee709f3a8e9397dbb3b6931", size = 419658, upload-time = "2025-10-08T09:15:28.4Z" },
+ { url = "https://files.pythonhosted.org/packages/3b/ef/2b92e286366500a09a67e03496ee8b8ba00562797a52f3c117aa2b29514b/msgpack-1.1.2-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:04fb995247a6e83830b62f0b07bf36540c213f6eac8e851166d8d86d83cbd014", size = 403290, upload-time = "2025-10-08T09:15:29.764Z" },
+ { url = "https://files.pythonhosted.org/packages/78/90/e0ea7990abea5764e4655b8177aa7c63cdfa89945b6e7641055800f6c16b/msgpack-1.1.2-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:8e22ab046fa7ede9e36eeb4cfad44d46450f37bb05d5ec482b02868f451c95e2", size = 415234, upload-time = "2025-10-08T09:15:31.022Z" },
+ { url = "https://files.pythonhosted.org/packages/72/4e/9390aed5db983a2310818cd7d3ec0aecad45e1f7007e0cda79c79507bb0d/msgpack-1.1.2-cp314-cp314-win32.whl", hash = "sha256:80a0ff7d4abf5fecb995fcf235d4064b9a9a8a40a3ab80999e6ac1e30b702717", size = 66391, upload-time = "2025-10-08T09:15:32.265Z" },
+ { url = "https://files.pythonhosted.org/packages/6e/f1/abd09c2ae91228c5f3998dbd7f41353def9eac64253de3c8105efa2082f7/msgpack-1.1.2-cp314-cp314-win_amd64.whl", hash = "sha256:9ade919fac6a3e7260b7f64cea89df6bec59104987cbea34d34a2fa15d74310b", size = 73787, upload-time = "2025-10-08T09:15:33.219Z" },
+ { url = "https://files.pythonhosted.org/packages/6a/b0/9d9f667ab48b16ad4115c1935d94023b82b3198064cb84a123e97f7466c1/msgpack-1.1.2-cp314-cp314-win_arm64.whl", hash = "sha256:59415c6076b1e30e563eb732e23b994a61c159cec44deaf584e5cc1dd662f2af", size = 66453, upload-time = "2025-10-08T09:15:34.225Z" },
+ { url = "https://files.pythonhosted.org/packages/16/67/93f80545eb1792b61a217fa7f06d5e5cb9e0055bed867f43e2b8e012e137/msgpack-1.1.2-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:897c478140877e5307760b0ea66e0932738879e7aa68144d9b78ea4c8302a84a", size = 85264, upload-time = "2025-10-08T09:15:35.61Z" },
+ { url = "https://files.pythonhosted.org/packages/87/1c/33c8a24959cf193966ef11a6f6a2995a65eb066bd681fd085afd519a57ce/msgpack-1.1.2-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:a668204fa43e6d02f89dbe79a30b0d67238d9ec4c5bd8a940fc3a004a47b721b", size = 89076, upload-time = "2025-10-08T09:15:36.619Z" },
+ { url = "https://files.pythonhosted.org/packages/fc/6b/62e85ff7193663fbea5c0254ef32f0c77134b4059f8da89b958beb7696f3/msgpack-1.1.2-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5559d03930d3aa0f3aacb4c42c776af1a2ace2611871c84a75afe436695e6245", size = 435242, upload-time = "2025-10-08T09:15:37.647Z" },
+ { url = "https://files.pythonhosted.org/packages/c1/47/5c74ecb4cc277cf09f64e913947871682ffa82b3b93c8dad68083112f412/msgpack-1.1.2-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:70c5a7a9fea7f036b716191c29047374c10721c389c21e9ffafad04df8c52c90", size = 432509, upload-time = "2025-10-08T09:15:38.794Z" },
+ { url = "https://files.pythonhosted.org/packages/24/a4/e98ccdb56dc4e98c929a3f150de1799831c0a800583cde9fa022fa90602d/msgpack-1.1.2-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:f2cb069d8b981abc72b41aea1c580ce92d57c673ec61af4c500153a626cb9e20", size = 415957, upload-time = "2025-10-08T09:15:40.238Z" },
+ { url = "https://files.pythonhosted.org/packages/da/28/6951f7fb67bc0a4e184a6b38ab71a92d9ba58080b27a77d3e2fb0be5998f/msgpack-1.1.2-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:d62ce1f483f355f61adb5433ebfd8868c5f078d1a52d042b0a998682b4fa8c27", size = 422910, upload-time = "2025-10-08T09:15:41.505Z" },
+ { url = "https://files.pythonhosted.org/packages/f0/03/42106dcded51f0a0b5284d3ce30a671e7bd3f7318d122b2ead66ad289fed/msgpack-1.1.2-cp314-cp314t-win32.whl", hash = "sha256:1d1418482b1ee984625d88aa9585db570180c286d942da463533b238b98b812b", size = 75197, upload-time = "2025-10-08T09:15:42.954Z" },
+ { url = "https://files.pythonhosted.org/packages/15/86/d0071e94987f8db59d4eeb386ddc64d0bb9b10820a8d82bcd3e53eeb2da6/msgpack-1.1.2-cp314-cp314t-win_amd64.whl", hash = "sha256:5a46bf7e831d09470ad92dff02b8b1ac92175ca36b087f904a0519857c6be3ff", size = 85772, upload-time = "2025-10-08T09:15:43.954Z" },
+ { url = "https://files.pythonhosted.org/packages/81/f2/08ace4142eb281c12701fc3b93a10795e4d4dc7f753911d836675050f886/msgpack-1.1.2-cp314-cp314t-win_arm64.whl", hash = "sha256:d99ef64f349d5ec3293688e91486c5fdb925ed03807f64d98d205d2713c60b46", size = 70868, upload-time = "2025-10-08T09:15:44.959Z" },
+]
+
+[[package]]
+name = "multidict"
+version = "6.7.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "typing-extensions", marker = "python_full_version < '3.11'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/80/1e/5492c365f222f907de1039b91f922b93fa4f764c713ee858d235495d8f50/multidict-6.7.0.tar.gz", hash = "sha256:c6e99d9a65ca282e578dfea819cfa9c0a62b2499d8677392e09feaf305e9e6f5", size = 101834, upload-time = "2025-10-06T14:52:30.657Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/a9/63/7bdd4adc330abcca54c85728db2327130e49e52e8c3ce685cec44e0f2e9f/multidict-6.7.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:9f474ad5acda359c8758c8accc22032c6abe6dc87a8be2440d097785e27a9349", size = 77153, upload-time = "2025-10-06T14:48:26.409Z" },
+ { url = "https://files.pythonhosted.org/packages/3f/bb/b6c35ff175ed1a3142222b78455ee31be71a8396ed3ab5280fbe3ebe4e85/multidict-6.7.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:4b7a9db5a870f780220e931d0002bbfd88fb53aceb6293251e2c839415c1b20e", size = 44993, upload-time = "2025-10-06T14:48:28.4Z" },
+ { url = "https://files.pythonhosted.org/packages/e0/1f/064c77877c5fa6df6d346e68075c0f6998547afe952d6471b4c5f6a7345d/multidict-6.7.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:03ca744319864e92721195fa28c7a3b2bc7b686246b35e4078c1e4d0eb5466d3", size = 44607, upload-time = "2025-10-06T14:48:29.581Z" },
+ { url = "https://files.pythonhosted.org/packages/04/7a/bf6aa92065dd47f287690000b3d7d332edfccb2277634cadf6a810463c6a/multidict-6.7.0-cp310-cp310-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:f0e77e3c0008bc9316e662624535b88d360c3a5d3f81e15cf12c139a75250046", size = 241847, upload-time = "2025-10-06T14:48:32.107Z" },
+ { url = "https://files.pythonhosted.org/packages/94/39/297a8de920f76eda343e4ce05f3b489f0ab3f9504f2576dfb37b7c08ca08/multidict-6.7.0-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:08325c9e5367aa379a3496aa9a022fe8837ff22e00b94db256d3a1378c76ab32", size = 242616, upload-time = "2025-10-06T14:48:34.054Z" },
+ { url = "https://files.pythonhosted.org/packages/39/3a/d0eee2898cfd9d654aea6cb8c4addc2f9756e9a7e09391cfe55541f917f7/multidict-6.7.0-cp310-cp310-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:e2862408c99f84aa571ab462d25236ef9cb12a602ea959ba9c9009a54902fc73", size = 222333, upload-time = "2025-10-06T14:48:35.9Z" },
+ { url = "https://files.pythonhosted.org/packages/05/48/3b328851193c7a4240815b71eea165b49248867bbb6153a0aee227a0bb47/multidict-6.7.0-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:4d72a9a2d885f5c208b0cb91ff2ed43636bb7e345ec839ff64708e04f69a13cc", size = 253239, upload-time = "2025-10-06T14:48:37.302Z" },
+ { url = "https://files.pythonhosted.org/packages/b1/ca/0706a98c8d126a89245413225ca4a3fefc8435014de309cf8b30acb68841/multidict-6.7.0-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:478cc36476687bac1514d651cbbaa94b86b0732fb6855c60c673794c7dd2da62", size = 251618, upload-time = "2025-10-06T14:48:38.963Z" },
+ { url = "https://files.pythonhosted.org/packages/5e/4f/9c7992f245554d8b173f6f0a048ad24b3e645d883f096857ec2c0822b8bd/multidict-6.7.0-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:6843b28b0364dc605f21481c90fadb5f60d9123b442eb8a726bb74feef588a84", size = 241655, upload-time = "2025-10-06T14:48:40.312Z" },
+ { url = "https://files.pythonhosted.org/packages/31/79/26a85991ae67efd1c0b1fc2e0c275b8a6aceeb155a68861f63f87a798f16/multidict-6.7.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:23bfeee5316266e5ee2d625df2d2c602b829435fc3a235c2ba2131495706e4a0", size = 239245, upload-time = "2025-10-06T14:48:41.848Z" },
+ { url = "https://files.pythonhosted.org/packages/14/1e/75fa96394478930b79d0302eaf9a6c69f34005a1a5251ac8b9c336486ec9/multidict-6.7.0-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:680878b9f3d45c31e1f730eef731f9b0bc1da456155688c6745ee84eb818e90e", size = 233523, upload-time = "2025-10-06T14:48:43.749Z" },
+ { url = "https://files.pythonhosted.org/packages/b2/5e/085544cb9f9c4ad2b5d97467c15f856df8d9bac410cffd5c43991a5d878b/multidict-6.7.0-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:eb866162ef2f45063acc7a53a88ef6fe8bf121d45c30ea3c9cd87ce7e191a8d4", size = 243129, upload-time = "2025-10-06T14:48:45.225Z" },
+ { url = "https://files.pythonhosted.org/packages/b9/c3/e9d9e2f20c9474e7a8fcef28f863c5cbd29bb5adce6b70cebe8bdad0039d/multidict-6.7.0-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:df0e3bf7993bdbeca5ac25aa859cf40d39019e015c9c91809ba7093967f7a648", size = 248999, upload-time = "2025-10-06T14:48:46.703Z" },
+ { url = "https://files.pythonhosted.org/packages/b5/3f/df171b6efa3239ae33b97b887e42671cd1d94d460614bfb2c30ffdab3b95/multidict-6.7.0-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:661709cdcd919a2ece2234f9bae7174e5220c80b034585d7d8a755632d3e2111", size = 243711, upload-time = "2025-10-06T14:48:48.146Z" },
+ { url = "https://files.pythonhosted.org/packages/3c/2f/9b5564888c4e14b9af64c54acf149263721a283aaf4aa0ae89b091d5d8c1/multidict-6.7.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:096f52730c3fb8ed419db2d44391932b63891b2c5ed14850a7e215c0ba9ade36", size = 237504, upload-time = "2025-10-06T14:48:49.447Z" },
+ { url = "https://files.pythonhosted.org/packages/6c/3a/0bd6ca0f7d96d790542d591c8c3354c1e1b6bfd2024d4d92dc3d87485ec7/multidict-6.7.0-cp310-cp310-win32.whl", hash = "sha256:afa8a2978ec65d2336305550535c9c4ff50ee527914328c8677b3973ade52b85", size = 41422, upload-time = "2025-10-06T14:48:50.789Z" },
+ { url = "https://files.pythonhosted.org/packages/00/35/f6a637ea2c75f0d3b7c7d41b1189189acff0d9deeb8b8f35536bb30f5e33/multidict-6.7.0-cp310-cp310-win_amd64.whl", hash = "sha256:b15b3afff74f707b9275d5ba6a91ae8f6429c3ffb29bbfd216b0b375a56f13d7", size = 46050, upload-time = "2025-10-06T14:48:51.938Z" },
+ { url = "https://files.pythonhosted.org/packages/e7/b8/f7bf8329b39893d02d9d95cf610c75885d12fc0f402b1c894e1c8e01c916/multidict-6.7.0-cp310-cp310-win_arm64.whl", hash = "sha256:4b73189894398d59131a66ff157837b1fafea9974be486d036bb3d32331fdbf0", size = 43153, upload-time = "2025-10-06T14:48:53.146Z" },
+ { url = "https://files.pythonhosted.org/packages/34/9e/5c727587644d67b2ed479041e4b1c58e30afc011e3d45d25bbe35781217c/multidict-6.7.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:4d409aa42a94c0b3fa617708ef5276dfe81012ba6753a0370fcc9d0195d0a1fc", size = 76604, upload-time = "2025-10-06T14:48:54.277Z" },
+ { url = "https://files.pythonhosted.org/packages/17/e4/67b5c27bd17c085a5ea8f1ec05b8a3e5cba0ca734bfcad5560fb129e70ca/multidict-6.7.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:14c9e076eede3b54c636f8ce1c9c252b5f057c62131211f0ceeec273810c9721", size = 44715, upload-time = "2025-10-06T14:48:55.445Z" },
+ { url = "https://files.pythonhosted.org/packages/4d/e1/866a5d77be6ea435711bef2a4291eed11032679b6b28b56b4776ab06ba3e/multidict-6.7.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:4c09703000a9d0fa3c3404b27041e574cc7f4df4c6563873246d0e11812a94b6", size = 44332, upload-time = "2025-10-06T14:48:56.706Z" },
+ { url = "https://files.pythonhosted.org/packages/31/61/0c2d50241ada71ff61a79518db85ada85fdabfcf395d5968dae1cbda04e5/multidict-6.7.0-cp311-cp311-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:a265acbb7bb33a3a2d626afbe756371dce0279e7b17f4f4eda406459c2b5ff1c", size = 245212, upload-time = "2025-10-06T14:48:58.042Z" },
+ { url = "https://files.pythonhosted.org/packages/ac/e0/919666a4e4b57fff1b57f279be1c9316e6cdc5de8a8b525d76f6598fefc7/multidict-6.7.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:51cb455de290ae462593e5b1cb1118c5c22ea7f0d3620d9940bf695cea5a4bd7", size = 246671, upload-time = "2025-10-06T14:49:00.004Z" },
+ { url = "https://files.pythonhosted.org/packages/a1/cc/d027d9c5a520f3321b65adea289b965e7bcbd2c34402663f482648c716ce/multidict-6.7.0-cp311-cp311-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:db99677b4457c7a5c5a949353e125ba72d62b35f74e26da141530fbb012218a7", size = 225491, upload-time = "2025-10-06T14:49:01.393Z" },
+ { url = "https://files.pythonhosted.org/packages/75/c4/bbd633980ce6155a28ff04e6a6492dd3335858394d7bb752d8b108708558/multidict-6.7.0-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:f470f68adc395e0183b92a2f4689264d1ea4b40504a24d9882c27375e6662bb9", size = 257322, upload-time = "2025-10-06T14:49:02.745Z" },
+ { url = "https://files.pythonhosted.org/packages/4c/6d/d622322d344f1f053eae47e033b0b3f965af01212de21b10bcf91be991fb/multidict-6.7.0-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:0db4956f82723cc1c270de9c6e799b4c341d327762ec78ef82bb962f79cc07d8", size = 254694, upload-time = "2025-10-06T14:49:04.15Z" },
+ { url = "https://files.pythonhosted.org/packages/a8/9f/78f8761c2705d4c6d7516faed63c0ebdac569f6db1bef95e0d5218fdc146/multidict-6.7.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:3e56d780c238f9e1ae66a22d2adf8d16f485381878250db8d496623cd38b22bd", size = 246715, upload-time = "2025-10-06T14:49:05.967Z" },
+ { url = "https://files.pythonhosted.org/packages/78/59/950818e04f91b9c2b95aab3d923d9eabd01689d0dcd889563988e9ea0fd8/multidict-6.7.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:9d14baca2ee12c1a64740d4531356ba50b82543017f3ad6de0deb943c5979abb", size = 243189, upload-time = "2025-10-06T14:49:07.37Z" },
+ { url = "https://files.pythonhosted.org/packages/7a/3d/77c79e1934cad2ee74991840f8a0110966d9599b3af95964c0cd79bb905b/multidict-6.7.0-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:295a92a76188917c7f99cda95858c822f9e4aae5824246bba9b6b44004ddd0a6", size = 237845, upload-time = "2025-10-06T14:49:08.759Z" },
+ { url = "https://files.pythonhosted.org/packages/63/1b/834ce32a0a97a3b70f86437f685f880136677ac00d8bce0027e9fd9c2db7/multidict-6.7.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:39f1719f57adbb767ef592a50ae5ebb794220d1188f9ca93de471336401c34d2", size = 246374, upload-time = "2025-10-06T14:49:10.574Z" },
+ { url = "https://files.pythonhosted.org/packages/23/ef/43d1c3ba205b5dec93dc97f3fba179dfa47910fc73aaaea4f7ceb41cec2a/multidict-6.7.0-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:0a13fb8e748dfc94749f622de065dd5c1def7e0d2216dba72b1d8069a389c6ff", size = 253345, upload-time = "2025-10-06T14:49:12.331Z" },
+ { url = "https://files.pythonhosted.org/packages/6b/03/eaf95bcc2d19ead522001f6a650ef32811aa9e3624ff0ad37c445c7a588c/multidict-6.7.0-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:e3aa16de190d29a0ea1b48253c57d99a68492c8dd8948638073ab9e74dc9410b", size = 246940, upload-time = "2025-10-06T14:49:13.821Z" },
+ { url = "https://files.pythonhosted.org/packages/e8/df/ec8a5fd66ea6cd6f525b1fcbb23511b033c3e9bc42b81384834ffa484a62/multidict-6.7.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:a048ce45dcdaaf1defb76b2e684f997fb5abf74437b6cb7b22ddad934a964e34", size = 242229, upload-time = "2025-10-06T14:49:15.603Z" },
+ { url = "https://files.pythonhosted.org/packages/8a/a2/59b405d59fd39ec86d1142630e9049243015a5f5291ba49cadf3c090c541/multidict-6.7.0-cp311-cp311-win32.whl", hash = "sha256:a90af66facec4cebe4181b9e62a68be65e45ac9b52b67de9eec118701856e7ff", size = 41308, upload-time = "2025-10-06T14:49:16.871Z" },
+ { url = "https://files.pythonhosted.org/packages/32/0f/13228f26f8b882c34da36efa776c3b7348455ec383bab4a66390e42963ae/multidict-6.7.0-cp311-cp311-win_amd64.whl", hash = "sha256:95b5ffa4349df2887518bb839409bcf22caa72d82beec453216802f475b23c81", size = 46037, upload-time = "2025-10-06T14:49:18.457Z" },
+ { url = "https://files.pythonhosted.org/packages/84/1f/68588e31b000535a3207fd3c909ebeec4fb36b52c442107499c18a896a2a/multidict-6.7.0-cp311-cp311-win_arm64.whl", hash = "sha256:329aa225b085b6f004a4955271a7ba9f1087e39dcb7e65f6284a988264a63912", size = 43023, upload-time = "2025-10-06T14:49:19.648Z" },
+ { url = "https://files.pythonhosted.org/packages/c2/9e/9f61ac18d9c8b475889f32ccfa91c9f59363480613fc807b6e3023d6f60b/multidict-6.7.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:8a3862568a36d26e650a19bb5cbbba14b71789032aebc0423f8cc5f150730184", size = 76877, upload-time = "2025-10-06T14:49:20.884Z" },
+ { url = "https://files.pythonhosted.org/packages/38/6f/614f09a04e6184f8824268fce4bc925e9849edfa654ddd59f0b64508c595/multidict-6.7.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:960c60b5849b9b4f9dcc9bea6e3626143c252c74113df2c1540aebce70209b45", size = 45467, upload-time = "2025-10-06T14:49:22.054Z" },
+ { url = "https://files.pythonhosted.org/packages/b3/93/c4f67a436dd026f2e780c433277fff72be79152894d9fc36f44569cab1a6/multidict-6.7.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:2049be98fb57a31b4ccf870bf377af2504d4ae35646a19037ec271e4c07998aa", size = 43834, upload-time = "2025-10-06T14:49:23.566Z" },
+ { url = "https://files.pythonhosted.org/packages/7f/f5/013798161ca665e4a422afbc5e2d9e4070142a9ff8905e482139cd09e4d0/multidict-6.7.0-cp312-cp312-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:0934f3843a1860dd465d38895c17fce1f1cb37295149ab05cd1b9a03afacb2a7", size = 250545, upload-time = "2025-10-06T14:49:24.882Z" },
+ { url = "https://files.pythonhosted.org/packages/71/2f/91dbac13e0ba94669ea5119ba267c9a832f0cb65419aca75549fcf09a3dc/multidict-6.7.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b3e34f3a1b8131ba06f1a73adab24f30934d148afcd5f5de9a73565a4404384e", size = 258305, upload-time = "2025-10-06T14:49:26.778Z" },
+ { url = "https://files.pythonhosted.org/packages/ef/b0/754038b26f6e04488b48ac621f779c341338d78503fb45403755af2df477/multidict-6.7.0-cp312-cp312-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:efbb54e98446892590dc2458c19c10344ee9a883a79b5cec4bc34d6656e8d546", size = 242363, upload-time = "2025-10-06T14:49:28.562Z" },
+ { url = "https://files.pythonhosted.org/packages/87/15/9da40b9336a7c9fa606c4cf2ed80a649dffeb42b905d4f63a1d7eb17d746/multidict-6.7.0-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a35c5fc61d4f51eb045061e7967cfe3123d622cd500e8868e7c0c592a09fedc4", size = 268375, upload-time = "2025-10-06T14:49:29.96Z" },
+ { url = "https://files.pythonhosted.org/packages/82/72/c53fcade0cc94dfaad583105fd92b3a783af2091eddcb41a6d5a52474000/multidict-6.7.0-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:29fe6740ebccba4175af1b9b87bf553e9c15cd5868ee967e010efcf94e4fd0f1", size = 269346, upload-time = "2025-10-06T14:49:31.404Z" },
+ { url = "https://files.pythonhosted.org/packages/0d/e2/9baffdae21a76f77ef8447f1a05a96ec4bc0a24dae08767abc0a2fe680b8/multidict-6.7.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:123e2a72e20537add2f33a79e605f6191fba2afda4cbb876e35c1a7074298a7d", size = 256107, upload-time = "2025-10-06T14:49:32.974Z" },
+ { url = "https://files.pythonhosted.org/packages/3c/06/3f06f611087dc60d65ef775f1fb5aca7c6d61c6db4990e7cda0cef9b1651/multidict-6.7.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:b284e319754366c1aee2267a2036248b24eeb17ecd5dc16022095e747f2f4304", size = 253592, upload-time = "2025-10-06T14:49:34.52Z" },
+ { url = "https://files.pythonhosted.org/packages/20/24/54e804ec7945b6023b340c412ce9c3f81e91b3bf5fa5ce65558740141bee/multidict-6.7.0-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:803d685de7be4303b5a657b76e2f6d1240e7e0a8aa2968ad5811fa2285553a12", size = 251024, upload-time = "2025-10-06T14:49:35.956Z" },
+ { url = "https://files.pythonhosted.org/packages/14/48/011cba467ea0b17ceb938315d219391d3e421dfd35928e5dbdc3f4ae76ef/multidict-6.7.0-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:c04a328260dfd5db8c39538f999f02779012268f54614902d0afc775d44e0a62", size = 251484, upload-time = "2025-10-06T14:49:37.631Z" },
+ { url = "https://files.pythonhosted.org/packages/0d/2f/919258b43bb35b99fa127435cfb2d91798eb3a943396631ef43e3720dcf4/multidict-6.7.0-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:8a19cdb57cd3df4cd865849d93ee14920fb97224300c88501f16ecfa2604b4e0", size = 263579, upload-time = "2025-10-06T14:49:39.502Z" },
+ { url = "https://files.pythonhosted.org/packages/31/22/a0e884d86b5242b5a74cf08e876bdf299e413016b66e55511f7a804a366e/multidict-6.7.0-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:9b2fd74c52accced7e75de26023b7dccee62511a600e62311b918ec5c168fc2a", size = 259654, upload-time = "2025-10-06T14:49:41.32Z" },
+ { url = "https://files.pythonhosted.org/packages/b2/e5/17e10e1b5c5f5a40f2fcbb45953c9b215f8a4098003915e46a93f5fcaa8f/multidict-6.7.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:3e8bfdd0e487acf992407a140d2589fe598238eaeffa3da8448d63a63cd363f8", size = 251511, upload-time = "2025-10-06T14:49:46.021Z" },
+ { url = "https://files.pythonhosted.org/packages/e3/9a/201bb1e17e7af53139597069c375e7b0dcbd47594604f65c2d5359508566/multidict-6.7.0-cp312-cp312-win32.whl", hash = "sha256:dd32a49400a2c3d52088e120ee00c1e3576cbff7e10b98467962c74fdb762ed4", size = 41895, upload-time = "2025-10-06T14:49:48.718Z" },
+ { url = "https://files.pythonhosted.org/packages/46/e2/348cd32faad84eaf1d20cce80e2bb0ef8d312c55bca1f7fa9865e7770aaf/multidict-6.7.0-cp312-cp312-win_amd64.whl", hash = "sha256:92abb658ef2d7ef22ac9f8bb88e8b6c3e571671534e029359b6d9e845923eb1b", size = 46073, upload-time = "2025-10-06T14:49:50.28Z" },
+ { url = "https://files.pythonhosted.org/packages/25/ec/aad2613c1910dce907480e0c3aa306905830f25df2e54ccc9dea450cb5aa/multidict-6.7.0-cp312-cp312-win_arm64.whl", hash = "sha256:490dab541a6a642ce1a9d61a4781656b346a55c13038f0b1244653828e3a83ec", size = 43226, upload-time = "2025-10-06T14:49:52.304Z" },
+ { url = "https://files.pythonhosted.org/packages/d2/86/33272a544eeb36d66e4d9a920602d1a2f57d4ebea4ef3cdfe5a912574c95/multidict-6.7.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:bee7c0588aa0076ce77c0ea5d19a68d76ad81fcd9fe8501003b9a24f9d4000f6", size = 76135, upload-time = "2025-10-06T14:49:54.26Z" },
+ { url = "https://files.pythonhosted.org/packages/91/1c/eb97db117a1ebe46d457a3d235a7b9d2e6dcab174f42d1b67663dd9e5371/multidict-6.7.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:7ef6b61cad77091056ce0e7ce69814ef72afacb150b7ac6a3e9470def2198159", size = 45117, upload-time = "2025-10-06T14:49:55.82Z" },
+ { url = "https://files.pythonhosted.org/packages/f1/d8/6c3442322e41fb1dd4de8bd67bfd11cd72352ac131f6368315617de752f1/multidict-6.7.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:9c0359b1ec12b1d6849c59f9d319610b7f20ef990a6d454ab151aa0e3b9f78ca", size = 43472, upload-time = "2025-10-06T14:49:57.048Z" },
+ { url = "https://files.pythonhosted.org/packages/75/3f/e2639e80325af0b6c6febdf8e57cc07043ff15f57fa1ef808f4ccb5ac4cd/multidict-6.7.0-cp313-cp313-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:cd240939f71c64bd658f186330603aac1a9a81bf6273f523fca63673cb7378a8", size = 249342, upload-time = "2025-10-06T14:49:58.368Z" },
+ { url = "https://files.pythonhosted.org/packages/5d/cc/84e0585f805cbeaa9cbdaa95f9a3d6aed745b9d25700623ac89a6ecff400/multidict-6.7.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:a60a4d75718a5efa473ebd5ab685786ba0c67b8381f781d1be14da49f1a2dc60", size = 257082, upload-time = "2025-10-06T14:49:59.89Z" },
+ { url = "https://files.pythonhosted.org/packages/b0/9c/ac851c107c92289acbbf5cfb485694084690c1b17e555f44952c26ddc5bd/multidict-6.7.0-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:53a42d364f323275126aff81fb67c5ca1b7a04fda0546245730a55c8c5f24bc4", size = 240704, upload-time = "2025-10-06T14:50:01.485Z" },
+ { url = "https://files.pythonhosted.org/packages/50/cc/5f93e99427248c09da95b62d64b25748a5f5c98c7c2ab09825a1d6af0e15/multidict-6.7.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:3b29b980d0ddbecb736735ee5bef69bb2ddca56eff603c86f3f29a1128299b4f", size = 266355, upload-time = "2025-10-06T14:50:02.955Z" },
+ { url = "https://files.pythonhosted.org/packages/ec/0c/2ec1d883ceb79c6f7f6d7ad90c919c898f5d1c6ea96d322751420211e072/multidict-6.7.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:f8a93b1c0ed2d04b97a5e9336fd2d33371b9a6e29ab7dd6503d63407c20ffbaf", size = 267259, upload-time = "2025-10-06T14:50:04.446Z" },
+ { url = "https://files.pythonhosted.org/packages/c6/2d/f0b184fa88d6630aa267680bdb8623fb69cb0d024b8c6f0d23f9a0f406d3/multidict-6.7.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9ff96e8815eecacc6645da76c413eb3b3d34cfca256c70b16b286a687d013c32", size = 254903, upload-time = "2025-10-06T14:50:05.98Z" },
+ { url = "https://files.pythonhosted.org/packages/06/c9/11ea263ad0df7dfabcad404feb3c0dd40b131bc7f232d5537f2fb1356951/multidict-6.7.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:7516c579652f6a6be0e266aec0acd0db80829ca305c3d771ed898538804c2036", size = 252365, upload-time = "2025-10-06T14:50:07.511Z" },
+ { url = "https://files.pythonhosted.org/packages/41/88/d714b86ee2c17d6e09850c70c9d310abac3d808ab49dfa16b43aba9d53fd/multidict-6.7.0-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:040f393368e63fb0f3330e70c26bfd336656bed925e5cbe17c9da839a6ab13ec", size = 250062, upload-time = "2025-10-06T14:50:09.074Z" },
+ { url = "https://files.pythonhosted.org/packages/15/fe/ad407bb9e818c2b31383f6131ca19ea7e35ce93cf1310fce69f12e89de75/multidict-6.7.0-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:b3bc26a951007b1057a1c543af845f1c7e3e71cc240ed1ace7bf4484aa99196e", size = 249683, upload-time = "2025-10-06T14:50:10.714Z" },
+ { url = "https://files.pythonhosted.org/packages/8c/a4/a89abdb0229e533fb925e7c6e5c40201c2873efebc9abaf14046a4536ee6/multidict-6.7.0-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:7b022717c748dd1992a83e219587aabe45980d88969f01b316e78683e6285f64", size = 261254, upload-time = "2025-10-06T14:50:12.28Z" },
+ { url = "https://files.pythonhosted.org/packages/8d/aa/0e2b27bd88b40a4fb8dc53dd74eecac70edaa4c1dd0707eb2164da3675b3/multidict-6.7.0-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:9600082733859f00d79dee64effc7aef1beb26adb297416a4ad2116fd61374bd", size = 257967, upload-time = "2025-10-06T14:50:14.16Z" },
+ { url = "https://files.pythonhosted.org/packages/d0/8e/0c67b7120d5d5f6d874ed85a085f9dc770a7f9d8813e80f44a9fec820bb7/multidict-6.7.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:94218fcec4d72bc61df51c198d098ce2b378e0ccbac41ddbed5ef44092913288", size = 250085, upload-time = "2025-10-06T14:50:15.639Z" },
+ { url = "https://files.pythonhosted.org/packages/ba/55/b73e1d624ea4b8fd4dd07a3bb70f6e4c7c6c5d9d640a41c6ffe5cdbd2a55/multidict-6.7.0-cp313-cp313-win32.whl", hash = "sha256:a37bd74c3fa9d00be2d7b8eca074dc56bd8077ddd2917a839bd989612671ed17", size = 41713, upload-time = "2025-10-06T14:50:17.066Z" },
+ { url = "https://files.pythonhosted.org/packages/32/31/75c59e7d3b4205075b4c183fa4ca398a2daf2303ddf616b04ae6ef55cffe/multidict-6.7.0-cp313-cp313-win_amd64.whl", hash = "sha256:30d193c6cc6d559db42b6bcec8a5d395d34d60c9877a0b71ecd7c204fcf15390", size = 45915, upload-time = "2025-10-06T14:50:18.264Z" },
+ { url = "https://files.pythonhosted.org/packages/31/2a/8987831e811f1184c22bc2e45844934385363ee61c0a2dcfa8f71b87e608/multidict-6.7.0-cp313-cp313-win_arm64.whl", hash = "sha256:ea3334cabe4d41b7ccd01e4d349828678794edbc2d3ae97fc162a3312095092e", size = 43077, upload-time = "2025-10-06T14:50:19.853Z" },
+ { url = "https://files.pythonhosted.org/packages/e8/68/7b3a5170a382a340147337b300b9eb25a9ddb573bcdfff19c0fa3f31ffba/multidict-6.7.0-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:ad9ce259f50abd98a1ca0aa6e490b58c316a0fce0617f609723e40804add2c00", size = 83114, upload-time = "2025-10-06T14:50:21.223Z" },
+ { url = "https://files.pythonhosted.org/packages/55/5c/3fa2d07c84df4e302060f555bbf539310980362236ad49f50eeb0a1c1eb9/multidict-6.7.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:07f5594ac6d084cbb5de2df218d78baf55ef150b91f0ff8a21cc7a2e3a5a58eb", size = 48442, upload-time = "2025-10-06T14:50:22.871Z" },
+ { url = "https://files.pythonhosted.org/packages/fc/56/67212d33239797f9bd91962bb899d72bb0f4c35a8652dcdb8ed049bef878/multidict-6.7.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:0591b48acf279821a579282444814a2d8d0af624ae0bc600aa4d1b920b6e924b", size = 46885, upload-time = "2025-10-06T14:50:24.258Z" },
+ { url = "https://files.pythonhosted.org/packages/46/d1/908f896224290350721597a61a69cd19b89ad8ee0ae1f38b3f5cd12ea2ac/multidict-6.7.0-cp313-cp313t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:749a72584761531d2b9467cfbdfd29487ee21124c304c4b6cb760d8777b27f9c", size = 242588, upload-time = "2025-10-06T14:50:25.716Z" },
+ { url = "https://files.pythonhosted.org/packages/ab/67/8604288bbd68680eee0ab568fdcb56171d8b23a01bcd5cb0c8fedf6e5d99/multidict-6.7.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6b4c3d199f953acd5b446bf7c0de1fe25d94e09e79086f8dc2f48a11a129cdf1", size = 249966, upload-time = "2025-10-06T14:50:28.192Z" },
+ { url = "https://files.pythonhosted.org/packages/20/33/9228d76339f1ba51e3efef7da3ebd91964d3006217aae13211653193c3ff/multidict-6.7.0-cp313-cp313t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:9fb0211dfc3b51efea2f349ec92c114d7754dd62c01f81c3e32b765b70c45c9b", size = 228618, upload-time = "2025-10-06T14:50:29.82Z" },
+ { url = "https://files.pythonhosted.org/packages/f8/2d/25d9b566d10cab1c42b3b9e5b11ef79c9111eaf4463b8c257a3bd89e0ead/multidict-6.7.0-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a027ec240fe73a8d6281872690b988eed307cd7d91b23998ff35ff577ca688b5", size = 257539, upload-time = "2025-10-06T14:50:31.731Z" },
+ { url = "https://files.pythonhosted.org/packages/b6/b1/8d1a965e6637fc33de3c0d8f414485c2b7e4af00f42cab3d84e7b955c222/multidict-6.7.0-cp313-cp313t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:d1d964afecdf3a8288789df2f5751dc0a8261138c3768d9af117ed384e538fad", size = 256345, upload-time = "2025-10-06T14:50:33.26Z" },
+ { url = "https://files.pythonhosted.org/packages/ba/0c/06b5a8adbdeedada6f4fb8d8f193d44a347223b11939b42953eeb6530b6b/multidict-6.7.0-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:caf53b15b1b7df9fbd0709aa01409000a2b4dd03a5f6f5cc548183c7c8f8b63c", size = 247934, upload-time = "2025-10-06T14:50:34.808Z" },
+ { url = "https://files.pythonhosted.org/packages/8f/31/b2491b5fe167ca044c6eb4b8f2c9f3b8a00b24c432c365358eadac5d7625/multidict-6.7.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:654030da3197d927f05a536a66186070e98765aa5142794c9904555d3a9d8fb5", size = 245243, upload-time = "2025-10-06T14:50:36.436Z" },
+ { url = "https://files.pythonhosted.org/packages/61/1a/982913957cb90406c8c94f53001abd9eafc271cb3e70ff6371590bec478e/multidict-6.7.0-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:2090d3718829d1e484706a2f525e50c892237b2bf9b17a79b059cb98cddc2f10", size = 235878, upload-time = "2025-10-06T14:50:37.953Z" },
+ { url = "https://files.pythonhosted.org/packages/be/c0/21435d804c1a1cf7a2608593f4d19bca5bcbd7a81a70b253fdd1c12af9c0/multidict-6.7.0-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:2d2cfeec3f6f45651b3d408c4acec0ebf3daa9bc8a112a084206f5db5d05b754", size = 243452, upload-time = "2025-10-06T14:50:39.574Z" },
+ { url = "https://files.pythonhosted.org/packages/54/0a/4349d540d4a883863191be6eb9a928846d4ec0ea007d3dcd36323bb058ac/multidict-6.7.0-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:4ef089f985b8c194d341eb2c24ae6e7408c9a0e2e5658699c92f497437d88c3c", size = 252312, upload-time = "2025-10-06T14:50:41.612Z" },
+ { url = "https://files.pythonhosted.org/packages/26/64/d5416038dbda1488daf16b676e4dbfd9674dde10a0cc8f4fc2b502d8125d/multidict-6.7.0-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:e93a0617cd16998784bf4414c7e40f17a35d2350e5c6f0bd900d3a8e02bd3762", size = 246935, upload-time = "2025-10-06T14:50:43.972Z" },
+ { url = "https://files.pythonhosted.org/packages/9f/8c/8290c50d14e49f35e0bd4abc25e1bc7711149ca9588ab7d04f886cdf03d9/multidict-6.7.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:f0feece2ef8ebc42ed9e2e8c78fc4aa3cf455733b507c09ef7406364c94376c6", size = 243385, upload-time = "2025-10-06T14:50:45.648Z" },
+ { url = "https://files.pythonhosted.org/packages/ef/a0/f83ae75e42d694b3fbad3e047670e511c138be747bc713cf1b10d5096416/multidict-6.7.0-cp313-cp313t-win32.whl", hash = "sha256:19a1d55338ec1be74ef62440ca9e04a2f001a04d0cc49a4983dc320ff0f3212d", size = 47777, upload-time = "2025-10-06T14:50:47.154Z" },
+ { url = "https://files.pythonhosted.org/packages/dc/80/9b174a92814a3830b7357307a792300f42c9e94664b01dee8e457551fa66/multidict-6.7.0-cp313-cp313t-win_amd64.whl", hash = "sha256:3da4fb467498df97e986af166b12d01f05d2e04f978a9c1c680ea1988e0bc4b6", size = 53104, upload-time = "2025-10-06T14:50:48.851Z" },
+ { url = "https://files.pythonhosted.org/packages/cc/28/04baeaf0428d95bb7a7bea0e691ba2f31394338ba424fb0679a9ed0f4c09/multidict-6.7.0-cp313-cp313t-win_arm64.whl", hash = "sha256:b4121773c49a0776461f4a904cdf6264c88e42218aaa8407e803ca8025872792", size = 45503, upload-time = "2025-10-06T14:50:50.16Z" },
+ { url = "https://files.pythonhosted.org/packages/e2/b1/3da6934455dd4b261d4c72f897e3a5728eba81db59959f3a639245891baa/multidict-6.7.0-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:3bab1e4aff7adaa34410f93b1f8e57c4b36b9af0426a76003f441ee1d3c7e842", size = 75128, upload-time = "2025-10-06T14:50:51.92Z" },
+ { url = "https://files.pythonhosted.org/packages/14/2c/f069cab5b51d175a1a2cb4ccdf7a2c2dabd58aa5bd933fa036a8d15e2404/multidict-6.7.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:b8512bac933afc3e45fb2b18da8e59b78d4f408399a960339598374d4ae3b56b", size = 44410, upload-time = "2025-10-06T14:50:53.275Z" },
+ { url = "https://files.pythonhosted.org/packages/42/e2/64bb41266427af6642b6b128e8774ed84c11b80a90702c13ac0a86bb10cc/multidict-6.7.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:79dcf9e477bc65414ebfea98ffd013cb39552b5ecd62908752e0e413d6d06e38", size = 43205, upload-time = "2025-10-06T14:50:54.911Z" },
+ { url = "https://files.pythonhosted.org/packages/02/68/6b086fef8a3f1a8541b9236c594f0c9245617c29841f2e0395d979485cde/multidict-6.7.0-cp314-cp314-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:31bae522710064b5cbeddaf2e9f32b1abab70ac6ac91d42572502299e9953128", size = 245084, upload-time = "2025-10-06T14:50:56.369Z" },
+ { url = "https://files.pythonhosted.org/packages/15/ee/f524093232007cd7a75c1d132df70f235cfd590a7c9eaccd7ff422ef4ae8/multidict-6.7.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4a0df7ff02397bb63e2fd22af2c87dfa39e8c7f12947bc524dbdc528282c7e34", size = 252667, upload-time = "2025-10-06T14:50:57.991Z" },
+ { url = "https://files.pythonhosted.org/packages/02/a5/eeb3f43ab45878f1895118c3ef157a480db58ede3f248e29b5354139c2c9/multidict-6.7.0-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:7a0222514e8e4c514660e182d5156a415c13ef0aabbd71682fc714e327b95e99", size = 233590, upload-time = "2025-10-06T14:50:59.589Z" },
+ { url = "https://files.pythonhosted.org/packages/6a/1e/76d02f8270b97269d7e3dbd45644b1785bda457b474315f8cf999525a193/multidict-6.7.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:2397ab4daaf2698eb51a76721e98db21ce4f52339e535725de03ea962b5a3202", size = 264112, upload-time = "2025-10-06T14:51:01.183Z" },
+ { url = "https://files.pythonhosted.org/packages/76/0b/c28a70ecb58963847c2a8efe334904cd254812b10e535aefb3bcce513918/multidict-6.7.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:8891681594162635948a636c9fe0ff21746aeb3dd5463f6e25d9bea3a8a39ca1", size = 261194, upload-time = "2025-10-06T14:51:02.794Z" },
+ { url = "https://files.pythonhosted.org/packages/b4/63/2ab26e4209773223159b83aa32721b4021ffb08102f8ac7d689c943fded1/multidict-6.7.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:18706cc31dbf402a7945916dd5cddf160251b6dab8a2c5f3d6d5a55949f676b3", size = 248510, upload-time = "2025-10-06T14:51:04.724Z" },
+ { url = "https://files.pythonhosted.org/packages/93/cd/06c1fa8282af1d1c46fd55c10a7930af652afdce43999501d4d68664170c/multidict-6.7.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:f844a1bbf1d207dd311a56f383f7eda2d0e134921d45751842d8235e7778965d", size = 248395, upload-time = "2025-10-06T14:51:06.306Z" },
+ { url = "https://files.pythonhosted.org/packages/99/ac/82cb419dd6b04ccf9e7e61befc00c77614fc8134362488b553402ecd55ce/multidict-6.7.0-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:d4393e3581e84e5645506923816b9cc81f5609a778c7e7534054091acc64d1c6", size = 239520, upload-time = "2025-10-06T14:51:08.091Z" },
+ { url = "https://files.pythonhosted.org/packages/fa/f3/a0f9bf09493421bd8716a362e0cd1d244f5a6550f5beffdd6b47e885b331/multidict-6.7.0-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:fbd18dc82d7bf274b37aa48d664534330af744e03bccf696d6f4c6042e7d19e7", size = 245479, upload-time = "2025-10-06T14:51:10.365Z" },
+ { url = "https://files.pythonhosted.org/packages/8d/01/476d38fc73a212843f43c852b0eee266b6971f0e28329c2184a8df90c376/multidict-6.7.0-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:b6234e14f9314731ec45c42fc4554b88133ad53a09092cc48a88e771c125dadb", size = 258903, upload-time = "2025-10-06T14:51:12.466Z" },
+ { url = "https://files.pythonhosted.org/packages/49/6d/23faeb0868adba613b817d0e69c5f15531b24d462af8012c4f6de4fa8dc3/multidict-6.7.0-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:08d4379f9744d8f78d98c8673c06e202ffa88296f009c71bbafe8a6bf847d01f", size = 252333, upload-time = "2025-10-06T14:51:14.48Z" },
+ { url = "https://files.pythonhosted.org/packages/1e/cc/48d02ac22b30fa247f7dad82866e4b1015431092f4ba6ebc7e77596e0b18/multidict-6.7.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:9fe04da3f79387f450fd0061d4dd2e45a72749d31bf634aecc9e27f24fdc4b3f", size = 243411, upload-time = "2025-10-06T14:51:16.072Z" },
+ { url = "https://files.pythonhosted.org/packages/4a/03/29a8bf5a18abf1fe34535c88adbdfa88c9fb869b5a3b120692c64abe8284/multidict-6.7.0-cp314-cp314-win32.whl", hash = "sha256:fbafe31d191dfa7c4c51f7a6149c9fb7e914dcf9ffead27dcfd9f1ae382b3885", size = 40940, upload-time = "2025-10-06T14:51:17.544Z" },
+ { url = "https://files.pythonhosted.org/packages/82/16/7ed27b680791b939de138f906d5cf2b4657b0d45ca6f5dd6236fdddafb1a/multidict-6.7.0-cp314-cp314-win_amd64.whl", hash = "sha256:2f67396ec0310764b9222a1728ced1ab638f61aadc6226f17a71dd9324f9a99c", size = 45087, upload-time = "2025-10-06T14:51:18.875Z" },
+ { url = "https://files.pythonhosted.org/packages/cd/3c/e3e62eb35a1950292fe39315d3c89941e30a9d07d5d2df42965ab041da43/multidict-6.7.0-cp314-cp314-win_arm64.whl", hash = "sha256:ba672b26069957ee369cfa7fc180dde1fc6f176eaf1e6beaf61fbebbd3d9c000", size = 42368, upload-time = "2025-10-06T14:51:20.225Z" },
+ { url = "https://files.pythonhosted.org/packages/8b/40/cd499bd0dbc5f1136726db3153042a735fffd0d77268e2ee20d5f33c010f/multidict-6.7.0-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:c1dcc7524066fa918c6a27d61444d4ee7900ec635779058571f70d042d86ed63", size = 82326, upload-time = "2025-10-06T14:51:21.588Z" },
+ { url = "https://files.pythonhosted.org/packages/13/8a/18e031eca251c8df76daf0288e6790561806e439f5ce99a170b4af30676b/multidict-6.7.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:27e0b36c2d388dc7b6ced3406671b401e84ad7eb0656b8f3a2f46ed0ce483718", size = 48065, upload-time = "2025-10-06T14:51:22.93Z" },
+ { url = "https://files.pythonhosted.org/packages/40/71/5e6701277470a87d234e433fb0a3a7deaf3bcd92566e421e7ae9776319de/multidict-6.7.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:2a7baa46a22e77f0988e3b23d4ede5513ebec1929e34ee9495be535662c0dfe2", size = 46475, upload-time = "2025-10-06T14:51:24.352Z" },
+ { url = "https://files.pythonhosted.org/packages/fe/6a/bab00cbab6d9cfb57afe1663318f72ec28289ea03fd4e8236bb78429893a/multidict-6.7.0-cp314-cp314t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:7bf77f54997a9166a2f5675d1201520586439424c2511723a7312bdb4bcc034e", size = 239324, upload-time = "2025-10-06T14:51:25.822Z" },
+ { url = "https://files.pythonhosted.org/packages/2a/5f/8de95f629fc22a7769ade8b41028e3e5a822c1f8904f618d175945a81ad3/multidict-6.7.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e011555abada53f1578d63389610ac8a5400fc70ce71156b0aa30d326f1a5064", size = 246877, upload-time = "2025-10-06T14:51:27.604Z" },
+ { url = "https://files.pythonhosted.org/packages/23/b4/38881a960458f25b89e9f4a4fdcb02ac101cfa710190db6e5528841e67de/multidict-6.7.0-cp314-cp314t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:28b37063541b897fd6a318007373930a75ca6d6ac7c940dbe14731ffdd8d498e", size = 225824, upload-time = "2025-10-06T14:51:29.664Z" },
+ { url = "https://files.pythonhosted.org/packages/1e/39/6566210c83f8a261575f18e7144736059f0c460b362e96e9cf797a24b8e7/multidict-6.7.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:05047ada7a2fde2631a0ed706f1fd68b169a681dfe5e4cf0f8e4cb6618bbc2cd", size = 253558, upload-time = "2025-10-06T14:51:31.684Z" },
+ { url = "https://files.pythonhosted.org/packages/00/a3/67f18315100f64c269f46e6c0319fa87ba68f0f64f2b8e7fd7c72b913a0b/multidict-6.7.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:716133f7d1d946a4e1b91b1756b23c088881e70ff180c24e864c26192ad7534a", size = 252339, upload-time = "2025-10-06T14:51:33.699Z" },
+ { url = "https://files.pythonhosted.org/packages/c8/2a/1cb77266afee2458d82f50da41beba02159b1d6b1f7973afc9a1cad1499b/multidict-6.7.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d1bed1b467ef657f2a0ae62844a607909ef1c6889562de5e1d505f74457d0b96", size = 244895, upload-time = "2025-10-06T14:51:36.189Z" },
+ { url = "https://files.pythonhosted.org/packages/dd/72/09fa7dd487f119b2eb9524946ddd36e2067c08510576d43ff68469563b3b/multidict-6.7.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:ca43bdfa5d37bd6aee89d85e1d0831fb86e25541be7e9d376ead1b28974f8e5e", size = 241862, upload-time = "2025-10-06T14:51:41.291Z" },
+ { url = "https://files.pythonhosted.org/packages/65/92/bc1f8bd0853d8669300f732c801974dfc3702c3eeadae2f60cef54dc69d7/multidict-6.7.0-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:44b546bd3eb645fd26fb949e43c02a25a2e632e2ca21a35e2e132c8105dc8599", size = 232376, upload-time = "2025-10-06T14:51:43.55Z" },
+ { url = "https://files.pythonhosted.org/packages/09/86/ac39399e5cb9d0c2ac8ef6e10a768e4d3bc933ac808d49c41f9dc23337eb/multidict-6.7.0-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:a6ef16328011d3f468e7ebc326f24c1445f001ca1dec335b2f8e66bed3006394", size = 240272, upload-time = "2025-10-06T14:51:45.265Z" },
+ { url = "https://files.pythonhosted.org/packages/3d/b6/fed5ac6b8563ec72df6cb1ea8dac6d17f0a4a1f65045f66b6d3bf1497c02/multidict-6.7.0-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:5aa873cbc8e593d361ae65c68f85faadd755c3295ea2c12040ee146802f23b38", size = 248774, upload-time = "2025-10-06T14:51:46.836Z" },
+ { url = "https://files.pythonhosted.org/packages/6b/8d/b954d8c0dc132b68f760aefd45870978deec6818897389dace00fcde32ff/multidict-6.7.0-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:3d7b6ccce016e29df4b7ca819659f516f0bc7a4b3efa3bb2012ba06431b044f9", size = 242731, upload-time = "2025-10-06T14:51:48.541Z" },
+ { url = "https://files.pythonhosted.org/packages/16/9d/a2dac7009125d3540c2f54e194829ea18ac53716c61b655d8ed300120b0f/multidict-6.7.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:171b73bd4ee683d307599b66793ac80981b06f069b62eea1c9e29c9241aa66b0", size = 240193, upload-time = "2025-10-06T14:51:50.355Z" },
+ { url = "https://files.pythonhosted.org/packages/39/ca/c05f144128ea232ae2178b008d5011d4e2cea86e4ee8c85c2631b1b94802/multidict-6.7.0-cp314-cp314t-win32.whl", hash = "sha256:b2d7f80c4e1fd010b07cb26820aae86b7e73b681ee4889684fb8d2d4537aab13", size = 48023, upload-time = "2025-10-06T14:51:51.883Z" },
+ { url = "https://files.pythonhosted.org/packages/ba/8f/0a60e501584145588be1af5cc829265701ba3c35a64aec8e07cbb71d39bb/multidict-6.7.0-cp314-cp314t-win_amd64.whl", hash = "sha256:09929cab6fcb68122776d575e03c6cc64ee0b8fca48d17e135474b042ce515cd", size = 53507, upload-time = "2025-10-06T14:51:53.672Z" },
+ { url = "https://files.pythonhosted.org/packages/7f/ae/3148b988a9c6239903e786eac19c889fab607c31d6efa7fb2147e5680f23/multidict-6.7.0-cp314-cp314t-win_arm64.whl", hash = "sha256:cc41db090ed742f32bd2d2c721861725e6109681eddf835d0a82bd3a5c382827", size = 44804, upload-time = "2025-10-06T14:51:55.415Z" },
+ { url = "https://files.pythonhosted.org/packages/b7/da/7d22601b625e241d4f23ef1ebff8acfc60da633c9e7e7922e24d10f592b3/multidict-6.7.0-py3-none-any.whl", hash = "sha256:394fc5c42a333c9ffc3e421a4c85e08580d990e08b99f6bf35b4132114c5dcb3", size = 12317, upload-time = "2025-10-06T14:52:29.272Z" },
+]
+
+[[package]]
+name = "multiprocess"
+version = "0.70.16"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "dill" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/b5/ae/04f39c5d0d0def03247c2893d6f2b83c136bf3320a2154d7b8858f2ba72d/multiprocess-0.70.16.tar.gz", hash = "sha256:161af703d4652a0e1410be6abccecde4a7ddffd19341be0a7011b94aeb171ac1", size = 1772603, upload-time = "2024-01-28T18:52:34.85Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/ef/76/6e712a2623d146d314f17598df5de7224c85c0060ef63fd95cc15a25b3fa/multiprocess-0.70.16-pp310-pypy310_pp73-macosx_10_13_x86_64.whl", hash = "sha256:476887be10e2f59ff183c006af746cb6f1fd0eadcfd4ef49e605cbe2659920ee", size = 134980, upload-time = "2024-01-28T18:52:15.731Z" },
+ { url = "https://files.pythonhosted.org/packages/0f/ab/1e6e8009e380e22254ff539ebe117861e5bdb3bff1fc977920972237c6c7/multiprocess-0.70.16-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:d951bed82c8f73929ac82c61f01a7b5ce8f3e5ef40f5b52553b4f547ce2b08ec", size = 134982, upload-time = "2024-01-28T18:52:17.783Z" },
+ { url = "https://files.pythonhosted.org/packages/bc/f7/7ec7fddc92e50714ea3745631f79bd9c96424cb2702632521028e57d3a36/multiprocess-0.70.16-py310-none-any.whl", hash = "sha256:c4a9944c67bd49f823687463660a2d6daae94c289adff97e0f9d696ba6371d02", size = 134824, upload-time = "2024-01-28T18:52:26.062Z" },
+ { url = "https://files.pythonhosted.org/packages/50/15/b56e50e8debaf439f44befec5b2af11db85f6e0f344c3113ae0be0593a91/multiprocess-0.70.16-py311-none-any.whl", hash = "sha256:af4cabb0dac72abfb1e794fa7855c325fd2b55a10a44628a3c1ad3311c04127a", size = 143519, upload-time = "2024-01-28T18:52:28.115Z" },
+ { url = "https://files.pythonhosted.org/packages/0a/7d/a988f258104dcd2ccf1ed40fdc97e26c4ac351eeaf81d76e266c52d84e2f/multiprocess-0.70.16-py312-none-any.whl", hash = "sha256:fc0544c531920dde3b00c29863377f87e1632601092ea2daca74e4beb40faa2e", size = 146741, upload-time = "2024-01-28T18:52:29.395Z" },
+ { url = "https://files.pythonhosted.org/packages/ea/89/38df130f2c799090c978b366cfdf5b96d08de5b29a4a293df7f7429fa50b/multiprocess-0.70.16-py38-none-any.whl", hash = "sha256:a71d82033454891091a226dfc319d0cfa8019a4e888ef9ca910372a446de4435", size = 132628, upload-time = "2024-01-28T18:52:30.853Z" },
+ { url = "https://files.pythonhosted.org/packages/da/d9/f7f9379981e39b8c2511c9e0326d212accacb82f12fbfdc1aa2ce2a7b2b6/multiprocess-0.70.16-py39-none-any.whl", hash = "sha256:a0bafd3ae1b732eac64be2e72038231c1ba97724b60b09400d68f229fcc2fbf3", size = 133351, upload-time = "2024-01-28T18:52:31.981Z" },
+]
+
+[[package]]
+name = "murmurhash"
+version = "1.0.13"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/54/e9/02efbc6dfc2dd2085da3daacf9a8c17e8356019eceaedbfa21555e32d2af/murmurhash-1.0.13.tar.gz", hash = "sha256:737246d41ee00ff74b07b0bd1f0888be304d203ce668e642c86aa64ede30f8b7", size = 13258, upload-time = "2025-05-22T12:35:57.019Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/32/c3/ac14ed2aff4f18eadccf7d4e80c2361cf6e9a6a350442db9987919c4a747/murmurhash-1.0.13-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:136c7017e7d59ef16f065c2285bf5d30557ad8260adf47714c3c2802725e3e07", size = 26278, upload-time = "2025-05-22T12:35:10.16Z" },
+ { url = "https://files.pythonhosted.org/packages/62/38/87e5f72aa96a0a816b90cd66209cda713e168d4d23b52af62fdba3c8b33c/murmurhash-1.0.13-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d0292f6fcd99361157fafad5c86d508f367931b7699cce1e14747364596950cb", size = 26528, upload-time = "2025-05-22T12:35:12.181Z" },
+ { url = "https://files.pythonhosted.org/packages/6a/df/f74b22acf2ebf04ea24b858667836c9490e677ef29c1fe7bc993ecf4bc12/murmurhash-1.0.13-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:12265dc748257966c62041b677201b8fa74334a2548dc27f1c7a9e78dab7c2c1", size = 120045, upload-time = "2025-05-22T12:35:13.657Z" },
+ { url = "https://files.pythonhosted.org/packages/f8/ed/19c48d4c5ad475e144fba5b1adf45d8a189eabde503168660e1ec5d081e8/murmurhash-1.0.13-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0e411d5be64d37f2ce10a5d4d74c50bb35bd06205745b9631c4d8b1cb193e540", size = 117103, upload-time = "2025-05-22T12:35:14.899Z" },
+ { url = "https://files.pythonhosted.org/packages/48/0e/3d6e009c539709f0cf643679977e2dfbd5d50e1ef49928f9a92941839482/murmurhash-1.0.13-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:da3500ad3dbf75ac9c6bc8c5fbc677d56dfc34aec0a289269939d059f194f61d", size = 118191, upload-time = "2025-05-22T12:35:16.098Z" },
+ { url = "https://files.pythonhosted.org/packages/7c/8c/fab9d11bde62783d2aa7919e1ecbbf12dea7100ea61f63f55c9e0f199a6a/murmurhash-1.0.13-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:b23278c5428fc14f3101f8794f38ec937da042198930073e8c86d00add0fa2f0", size = 118663, upload-time = "2025-05-22T12:35:17.847Z" },
+ { url = "https://files.pythonhosted.org/packages/cf/23/322d87ab935782f2676a836ea88d92f87e58db40fb49112ba03b03d335a1/murmurhash-1.0.13-cp310-cp310-win_amd64.whl", hash = "sha256:7bc27226c0e8d9927f8e59af0dfefc93f5009e4ec3dde8da4ba7751ba19edd47", size = 24504, upload-time = "2025-05-22T12:35:19.36Z" },
+ { url = "https://files.pythonhosted.org/packages/2c/d1/9d13a02d9c8bfff10b1f68d19df206eaf2a8011defeccf7eb05ea0b8c54e/murmurhash-1.0.13-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:b20d168370bc3ce82920121b78ab35ae244070a9b18798f4a2e8678fa03bd7e0", size = 26410, upload-time = "2025-05-22T12:35:20.786Z" },
+ { url = "https://files.pythonhosted.org/packages/14/b0/3ee762e98cf9a8c2df9c8b377c326f3dd4495066d4eace9066fca46eba7a/murmurhash-1.0.13-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:cef667d2e83bdceea3bc20c586c491fa442662ace1aea66ff5e3a18bb38268d8", size = 26679, upload-time = "2025-05-22T12:35:21.808Z" },
+ { url = "https://files.pythonhosted.org/packages/39/06/24618f79cd5aac48490932e50263bddfd1ea90f7123d49bfe806a5982675/murmurhash-1.0.13-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:507148e50929ba1fce36898808573b9f81c763d5676f3fc6e4e832ff56b66992", size = 125970, upload-time = "2025-05-22T12:35:23.222Z" },
+ { url = "https://files.pythonhosted.org/packages/e8/09/0e7afce0a422692506c85474a26fb3a03c1971b2b5f7e7745276c4b3de7f/murmurhash-1.0.13-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:64d50f6173d266ad165beb8bca6101d824217fc9279f9e9981f4c0245c1e7ee6", size = 123390, upload-time = "2025-05-22T12:35:24.303Z" },
+ { url = "https://files.pythonhosted.org/packages/22/4c/c98f579b1a951b2bcc722a35270a2eec105c1e21585c9b314a02079e3c4d/murmurhash-1.0.13-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:0f272e15a84a8ae5f8b4bc0a68f9f47be38518ddffc72405791178058e9d019a", size = 124007, upload-time = "2025-05-22T12:35:25.446Z" },
+ { url = "https://files.pythonhosted.org/packages/df/f8/1b0dcebc8df8e091341617102b5b3b97deb6435f345b84f75382c290ec2c/murmurhash-1.0.13-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:f9423e0b0964ed1013a06c970199538c7ef9ca28c0be54798c0f1473a6591761", size = 123705, upload-time = "2025-05-22T12:35:26.709Z" },
+ { url = "https://files.pythonhosted.org/packages/79/17/f2a38558e150a0669d843f75e128afb83c1a67af41885ea2acb940e18e2a/murmurhash-1.0.13-cp311-cp311-win_amd64.whl", hash = "sha256:83b81e7084b696df3d853f2c78e0c9bda6b285d643f923f1a6fa9ab145d705c5", size = 24572, upload-time = "2025-05-22T12:35:30.38Z" },
+ { url = "https://files.pythonhosted.org/packages/e1/53/56ce2d8d4b9ab89557cb1d00ffce346b80a2eb2d8c7944015e5c83eacdec/murmurhash-1.0.13-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:bbe882e46cb3f86e092d8a1dd7a5a1c992da1ae3b39f7dd4507b6ce33dae7f92", size = 26859, upload-time = "2025-05-22T12:35:31.815Z" },
+ { url = "https://files.pythonhosted.org/packages/f8/85/3a0ad54a61257c31496545ae6861515d640316f93681d1dd917e7be06634/murmurhash-1.0.13-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:52a33a12ecedc432493692c207c784b06b6427ffaa897fc90b7a76e65846478d", size = 26900, upload-time = "2025-05-22T12:35:34.267Z" },
+ { url = "https://files.pythonhosted.org/packages/d0/cd/6651de26744b50ff11c79f0c0d41244db039625de53c0467a7a52876b2d8/murmurhash-1.0.13-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:950403a7f0dc2d9c8d0710f07c296f2daab66299d9677d6c65d6b6fa2cb30aaa", size = 131367, upload-time = "2025-05-22T12:35:35.258Z" },
+ { url = "https://files.pythonhosted.org/packages/50/6c/01ded95ddce33811c9766cae4ce32e0a54288da1d909ee2bcaa6ed13b9f1/murmurhash-1.0.13-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fde9fb5d2c106d86ff3ef2e4a9a69c2a8d23ba46e28c6b30034dc58421bc107b", size = 128943, upload-time = "2025-05-22T12:35:36.358Z" },
+ { url = "https://files.pythonhosted.org/packages/ab/27/e539a9622d7bea3ae22706c1eb80d4af80f9dddd93b54d151955c2ae4011/murmurhash-1.0.13-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3aa55d62773745616e1ab19345dece122f6e6d09224f7be939cc5b4c513c8473", size = 129108, upload-time = "2025-05-22T12:35:37.864Z" },
+ { url = "https://files.pythonhosted.org/packages/7a/84/18af5662e07d06839ad4db18ce026e6f8ef850d7b0ba92817b28dad28ba6/murmurhash-1.0.13-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:060dfef1b405cf02c450f182fb629f76ebe7f79657cced2db5054bc29b34938b", size = 129175, upload-time = "2025-05-22T12:35:38.928Z" },
+ { url = "https://files.pythonhosted.org/packages/fe/8d/b01d3ee1f1cf3957250223b7c6ce35454f38fbf4abe236bf04a3f769341d/murmurhash-1.0.13-cp312-cp312-win_amd64.whl", hash = "sha256:a8e79627d44a6e20a6487effc30bfe1c74754c13d179106e68cc6d07941b022c", size = 24869, upload-time = "2025-05-22T12:35:40.035Z" },
+ { url = "https://files.pythonhosted.org/packages/00/b4/8919dfdc4a131ad38a57b2c5de69f4bd74538bf546637ee59ebaebe6e5a4/murmurhash-1.0.13-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:b8a7f8befd901379b6dc57a9e49c5188454113747ad6aa8cdd951a6048e10790", size = 26852, upload-time = "2025-05-22T12:35:41.061Z" },
+ { url = "https://files.pythonhosted.org/packages/b4/32/ce78bef5d6101568bcb12f5bb5103fabcbe23723ec52e76ff66132d5dbb7/murmurhash-1.0.13-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f741aab86007510199193eee4f87c5ece92bc5a6ca7d0fe0d27335c1203dface", size = 26900, upload-time = "2025-05-22T12:35:42.097Z" },
+ { url = "https://files.pythonhosted.org/packages/0c/4c/0f47c0b4f6b31a1de84d65f9573832c78cd47b4b8ce25ab5596a8238d150/murmurhash-1.0.13-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:82614f18fa6d9d83da6bb0918f3789a3e1555d0ce12c2548153e97f79b29cfc9", size = 130033, upload-time = "2025-05-22T12:35:43.113Z" },
+ { url = "https://files.pythonhosted.org/packages/e0/cb/e47233e32fb792dcc9fb18a2cf65f795d47179b29c2b4a2034689f14c707/murmurhash-1.0.13-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:91f22a48b9454712e0690aa0b76cf0156a5d5a083d23ec7e209cfaeef28f56ff", size = 130619, upload-time = "2025-05-22T12:35:44.229Z" },
+ { url = "https://files.pythonhosted.org/packages/8f/f1/f89911bf304ba5d385ccd346cc7fbb1c1450a24f093b592c3bfe87768467/murmurhash-1.0.13-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:c4bc7938627b8fcb3d598fe6657cc96d1e31f4eba6a871b523c1512ab6dacb3e", size = 127643, upload-time = "2025-05-22T12:35:45.369Z" },
+ { url = "https://files.pythonhosted.org/packages/a4/24/262229221f6840c1a04a46051075e99675e591571abcca6b9a8b6aa1602b/murmurhash-1.0.13-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:58a61f1fc840f9ef704e638c39b8517bab1d21f1a9dbb6ba3ec53e41360e44ec", size = 127981, upload-time = "2025-05-22T12:35:46.503Z" },
+ { url = "https://files.pythonhosted.org/packages/18/25/addbc1d28f83252732ac3e57334d42f093890b4c2cce483ba01a42bc607c/murmurhash-1.0.13-cp313-cp313-win_amd64.whl", hash = "sha256:c451a22f14c2f40e7abaea521ee24fa0e46fbec480c4304c25c946cdb6e81883", size = 24880, upload-time = "2025-05-22T12:35:47.625Z" },
+]
+
+[[package]]
+name = "networkx"
+version = "3.3"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/04/e6/b164f94c869d6b2c605b5128b7b0cfe912795a87fc90e78533920001f3ec/networkx-3.3.tar.gz", hash = "sha256:0c127d8b2f4865f59ae9cb8aafcd60b5c70f3241ebd66f7defad7c4ab90126c9", size = 2126579, upload-time = "2024-04-06T12:59:47.137Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/38/e9/5f72929373e1a0e8d142a130f3f97e6ff920070f87f91c4e13e40e0fba5a/networkx-3.3-py3-none-any.whl", hash = "sha256:28575580c6ebdaf4505b22c6256a2b9de86b316dc63ba9e93abde3d78dfdbcf2", size = 1702396, upload-time = "2024-04-06T12:59:44.283Z" },
+]
+
+[[package]]
+name = "numpy"
+version = "2.2.6"
+source = { registry = "https://pypi.org/simple" }
+resolution-markers = [
+ "python_full_version < '3.11'",
+]
+sdist = { url = "https://files.pythonhosted.org/packages/76/21/7d2a95e4bba9dc13d043ee156a356c0a8f0c6309dff6b21b4d71a073b8a8/numpy-2.2.6.tar.gz", hash = "sha256:e29554e2bef54a90aa5cc07da6ce955accb83f21ab5de01a62c8478897b264fd", size = 20276440, upload-time = "2025-05-17T22:38:04.611Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/9a/3e/ed6db5be21ce87955c0cbd3009f2803f59fa08df21b5df06862e2d8e2bdd/numpy-2.2.6-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:b412caa66f72040e6d268491a59f2c43bf03eb6c96dd8f0307829feb7fa2b6fb", size = 21165245, upload-time = "2025-05-17T21:27:58.555Z" },
+ { url = "https://files.pythonhosted.org/packages/22/c2/4b9221495b2a132cc9d2eb862e21d42a009f5a60e45fc44b00118c174bff/numpy-2.2.6-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:8e41fd67c52b86603a91c1a505ebaef50b3314de0213461c7a6e99c9a3beff90", size = 14360048, upload-time = "2025-05-17T21:28:21.406Z" },
+ { url = "https://files.pythonhosted.org/packages/fd/77/dc2fcfc66943c6410e2bf598062f5959372735ffda175b39906d54f02349/numpy-2.2.6-cp310-cp310-macosx_14_0_arm64.whl", hash = "sha256:37e990a01ae6ec7fe7fa1c26c55ecb672dd98b19c3d0e1d1f326fa13cb38d163", size = 5340542, upload-time = "2025-05-17T21:28:30.931Z" },
+ { url = "https://files.pythonhosted.org/packages/7a/4f/1cb5fdc353a5f5cc7feb692db9b8ec2c3d6405453f982435efc52561df58/numpy-2.2.6-cp310-cp310-macosx_14_0_x86_64.whl", hash = "sha256:5a6429d4be8ca66d889b7cf70f536a397dc45ba6faeb5f8c5427935d9592e9cf", size = 6878301, upload-time = "2025-05-17T21:28:41.613Z" },
+ { url = "https://files.pythonhosted.org/packages/eb/17/96a3acd228cec142fcb8723bd3cc39c2a474f7dcf0a5d16731980bcafa95/numpy-2.2.6-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:efd28d4e9cd7d7a8d39074a4d44c63eda73401580c5c76acda2ce969e0a38e83", size = 14297320, upload-time = "2025-05-17T21:29:02.78Z" },
+ { url = "https://files.pythonhosted.org/packages/b4/63/3de6a34ad7ad6646ac7d2f55ebc6ad439dbbf9c4370017c50cf403fb19b5/numpy-2.2.6-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fc7b73d02efb0e18c000e9ad8b83480dfcd5dfd11065997ed4c6747470ae8915", size = 16801050, upload-time = "2025-05-17T21:29:27.675Z" },
+ { url = "https://files.pythonhosted.org/packages/07/b6/89d837eddef52b3d0cec5c6ba0456c1bf1b9ef6a6672fc2b7873c3ec4e2e/numpy-2.2.6-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:74d4531beb257d2c3f4b261bfb0fc09e0f9ebb8842d82a7b4209415896adc680", size = 15807034, upload-time = "2025-05-17T21:29:51.102Z" },
+ { url = "https://files.pythonhosted.org/packages/01/c8/dc6ae86e3c61cfec1f178e5c9f7858584049b6093f843bca541f94120920/numpy-2.2.6-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:8fc377d995680230e83241d8a96def29f204b5782f371c532579b4f20607a289", size = 18614185, upload-time = "2025-05-17T21:30:18.703Z" },
+ { url = "https://files.pythonhosted.org/packages/5b/c5/0064b1b7e7c89137b471ccec1fd2282fceaae0ab3a9550f2568782d80357/numpy-2.2.6-cp310-cp310-win32.whl", hash = "sha256:b093dd74e50a8cba3e873868d9e93a85b78e0daf2e98c6797566ad8044e8363d", size = 6527149, upload-time = "2025-05-17T21:30:29.788Z" },
+ { url = "https://files.pythonhosted.org/packages/a3/dd/4b822569d6b96c39d1215dbae0582fd99954dcbcf0c1a13c61783feaca3f/numpy-2.2.6-cp310-cp310-win_amd64.whl", hash = "sha256:f0fd6321b839904e15c46e0d257fdd101dd7f530fe03fd6359c1ea63738703f3", size = 12904620, upload-time = "2025-05-17T21:30:48.994Z" },
+ { url = "https://files.pythonhosted.org/packages/da/a8/4f83e2aa666a9fbf56d6118faaaf5f1974d456b1823fda0a176eff722839/numpy-2.2.6-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:f9f1adb22318e121c5c69a09142811a201ef17ab257a1e66ca3025065b7f53ae", size = 21176963, upload-time = "2025-05-17T21:31:19.36Z" },
+ { url = "https://files.pythonhosted.org/packages/b3/2b/64e1affc7972decb74c9e29e5649fac940514910960ba25cd9af4488b66c/numpy-2.2.6-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:c820a93b0255bc360f53eca31a0e676fd1101f673dda8da93454a12e23fc5f7a", size = 14406743, upload-time = "2025-05-17T21:31:41.087Z" },
+ { url = "https://files.pythonhosted.org/packages/4a/9f/0121e375000b5e50ffdd8b25bf78d8e1a5aa4cca3f185d41265198c7b834/numpy-2.2.6-cp311-cp311-macosx_14_0_arm64.whl", hash = "sha256:3d70692235e759f260c3d837193090014aebdf026dfd167834bcba43e30c2a42", size = 5352616, upload-time = "2025-05-17T21:31:50.072Z" },
+ { url = "https://files.pythonhosted.org/packages/31/0d/b48c405c91693635fbe2dcd7bc84a33a602add5f63286e024d3b6741411c/numpy-2.2.6-cp311-cp311-macosx_14_0_x86_64.whl", hash = "sha256:481b49095335f8eed42e39e8041327c05b0f6f4780488f61286ed3c01368d491", size = 6889579, upload-time = "2025-05-17T21:32:01.712Z" },
+ { url = "https://files.pythonhosted.org/packages/52/b8/7f0554d49b565d0171eab6e99001846882000883998e7b7d9f0d98b1f934/numpy-2.2.6-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b64d8d4d17135e00c8e346e0a738deb17e754230d7e0810ac5012750bbd85a5a", size = 14312005, upload-time = "2025-05-17T21:32:23.332Z" },
+ { url = "https://files.pythonhosted.org/packages/b3/dd/2238b898e51bd6d389b7389ffb20d7f4c10066d80351187ec8e303a5a475/numpy-2.2.6-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba10f8411898fc418a521833e014a77d3ca01c15b0c6cdcce6a0d2897e6dbbdf", size = 16821570, upload-time = "2025-05-17T21:32:47.991Z" },
+ { url = "https://files.pythonhosted.org/packages/83/6c/44d0325722cf644f191042bf47eedad61c1e6df2432ed65cbe28509d404e/numpy-2.2.6-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:bd48227a919f1bafbdda0583705e547892342c26fb127219d60a5c36882609d1", size = 15818548, upload-time = "2025-05-17T21:33:11.728Z" },
+ { url = "https://files.pythonhosted.org/packages/ae/9d/81e8216030ce66be25279098789b665d49ff19eef08bfa8cb96d4957f422/numpy-2.2.6-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:9551a499bf125c1d4f9e250377c1ee2eddd02e01eac6644c080162c0c51778ab", size = 18620521, upload-time = "2025-05-17T21:33:39.139Z" },
+ { url = "https://files.pythonhosted.org/packages/6a/fd/e19617b9530b031db51b0926eed5345ce8ddc669bb3bc0044b23e275ebe8/numpy-2.2.6-cp311-cp311-win32.whl", hash = "sha256:0678000bb9ac1475cd454c6b8c799206af8107e310843532b04d49649c717a47", size = 6525866, upload-time = "2025-05-17T21:33:50.273Z" },
+ { url = "https://files.pythonhosted.org/packages/31/0a/f354fb7176b81747d870f7991dc763e157a934c717b67b58456bc63da3df/numpy-2.2.6-cp311-cp311-win_amd64.whl", hash = "sha256:e8213002e427c69c45a52bbd94163084025f533a55a59d6f9c5b820774ef3303", size = 12907455, upload-time = "2025-05-17T21:34:09.135Z" },
+ { url = "https://files.pythonhosted.org/packages/82/5d/c00588b6cf18e1da539b45d3598d3557084990dcc4331960c15ee776ee41/numpy-2.2.6-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:41c5a21f4a04fa86436124d388f6ed60a9343a6f767fced1a8a71c3fbca038ff", size = 20875348, upload-time = "2025-05-17T21:34:39.648Z" },
+ { url = "https://files.pythonhosted.org/packages/66/ee/560deadcdde6c2f90200450d5938f63a34b37e27ebff162810f716f6a230/numpy-2.2.6-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:de749064336d37e340f640b05f24e9e3dd678c57318c7289d222a8a2f543e90c", size = 14119362, upload-time = "2025-05-17T21:35:01.241Z" },
+ { url = "https://files.pythonhosted.org/packages/3c/65/4baa99f1c53b30adf0acd9a5519078871ddde8d2339dc5a7fde80d9d87da/numpy-2.2.6-cp312-cp312-macosx_14_0_arm64.whl", hash = "sha256:894b3a42502226a1cac872f840030665f33326fc3dac8e57c607905773cdcde3", size = 5084103, upload-time = "2025-05-17T21:35:10.622Z" },
+ { url = "https://files.pythonhosted.org/packages/cc/89/e5a34c071a0570cc40c9a54eb472d113eea6d002e9ae12bb3a8407fb912e/numpy-2.2.6-cp312-cp312-macosx_14_0_x86_64.whl", hash = "sha256:71594f7c51a18e728451bb50cc60a3ce4e6538822731b2933209a1f3614e9282", size = 6625382, upload-time = "2025-05-17T21:35:21.414Z" },
+ { url = "https://files.pythonhosted.org/packages/f8/35/8c80729f1ff76b3921d5c9487c7ac3de9b2a103b1cd05e905b3090513510/numpy-2.2.6-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f2618db89be1b4e05f7a1a847a9c1c0abd63e63a1607d892dd54668dd92faf87", size = 14018462, upload-time = "2025-05-17T21:35:42.174Z" },
+ { url = "https://files.pythonhosted.org/packages/8c/3d/1e1db36cfd41f895d266b103df00ca5b3cbe965184df824dec5c08c6b803/numpy-2.2.6-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fd83c01228a688733f1ded5201c678f0c53ecc1006ffbc404db9f7a899ac6249", size = 16527618, upload-time = "2025-05-17T21:36:06.711Z" },
+ { url = "https://files.pythonhosted.org/packages/61/c6/03ed30992602c85aa3cd95b9070a514f8b3c33e31124694438d88809ae36/numpy-2.2.6-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:37c0ca431f82cd5fa716eca9506aefcabc247fb27ba69c5062a6d3ade8cf8f49", size = 15505511, upload-time = "2025-05-17T21:36:29.965Z" },
+ { url = "https://files.pythonhosted.org/packages/b7/25/5761d832a81df431e260719ec45de696414266613c9ee268394dd5ad8236/numpy-2.2.6-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:fe27749d33bb772c80dcd84ae7e8df2adc920ae8297400dabec45f0dedb3f6de", size = 18313783, upload-time = "2025-05-17T21:36:56.883Z" },
+ { url = "https://files.pythonhosted.org/packages/57/0a/72d5a3527c5ebffcd47bde9162c39fae1f90138c961e5296491ce778e682/numpy-2.2.6-cp312-cp312-win32.whl", hash = "sha256:4eeaae00d789f66c7a25ac5f34b71a7035bb474e679f410e5e1a94deb24cf2d4", size = 6246506, upload-time = "2025-05-17T21:37:07.368Z" },
+ { url = "https://files.pythonhosted.org/packages/36/fa/8c9210162ca1b88529ab76b41ba02d433fd54fecaf6feb70ef9f124683f1/numpy-2.2.6-cp312-cp312-win_amd64.whl", hash = "sha256:c1f9540be57940698ed329904db803cf7a402f3fc200bfe599334c9bd84a40b2", size = 12614190, upload-time = "2025-05-17T21:37:26.213Z" },
+ { url = "https://files.pythonhosted.org/packages/f9/5c/6657823f4f594f72b5471f1db1ab12e26e890bb2e41897522d134d2a3e81/numpy-2.2.6-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:0811bb762109d9708cca4d0b13c4f67146e3c3b7cf8d34018c722adb2d957c84", size = 20867828, upload-time = "2025-05-17T21:37:56.699Z" },
+ { url = "https://files.pythonhosted.org/packages/dc/9e/14520dc3dadf3c803473bd07e9b2bd1b69bc583cb2497b47000fed2fa92f/numpy-2.2.6-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:287cc3162b6f01463ccd86be154f284d0893d2b3ed7292439ea97eafa8170e0b", size = 14143006, upload-time = "2025-05-17T21:38:18.291Z" },
+ { url = "https://files.pythonhosted.org/packages/4f/06/7e96c57d90bebdce9918412087fc22ca9851cceaf5567a45c1f404480e9e/numpy-2.2.6-cp313-cp313-macosx_14_0_arm64.whl", hash = "sha256:f1372f041402e37e5e633e586f62aa53de2eac8d98cbfb822806ce4bbefcb74d", size = 5076765, upload-time = "2025-05-17T21:38:27.319Z" },
+ { url = "https://files.pythonhosted.org/packages/73/ed/63d920c23b4289fdac96ddbdd6132e9427790977d5457cd132f18e76eae0/numpy-2.2.6-cp313-cp313-macosx_14_0_x86_64.whl", hash = "sha256:55a4d33fa519660d69614a9fad433be87e5252f4b03850642f88993f7b2ca566", size = 6617736, upload-time = "2025-05-17T21:38:38.141Z" },
+ { url = "https://files.pythonhosted.org/packages/85/c5/e19c8f99d83fd377ec8c7e0cf627a8049746da54afc24ef0a0cb73d5dfb5/numpy-2.2.6-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f92729c95468a2f4f15e9bb94c432a9229d0d50de67304399627a943201baa2f", size = 14010719, upload-time = "2025-05-17T21:38:58.433Z" },
+ { url = "https://files.pythonhosted.org/packages/19/49/4df9123aafa7b539317bf6d342cb6d227e49f7a35b99c287a6109b13dd93/numpy-2.2.6-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1bc23a79bfabc5d056d106f9befb8d50c31ced2fbc70eedb8155aec74a45798f", size = 16526072, upload-time = "2025-05-17T21:39:22.638Z" },
+ { url = "https://files.pythonhosted.org/packages/b2/6c/04b5f47f4f32f7c2b0e7260442a8cbcf8168b0e1a41ff1495da42f42a14f/numpy-2.2.6-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:e3143e4451880bed956e706a3220b4e5cf6172ef05fcc397f6f36a550b1dd868", size = 15503213, upload-time = "2025-05-17T21:39:45.865Z" },
+ { url = "https://files.pythonhosted.org/packages/17/0a/5cd92e352c1307640d5b6fec1b2ffb06cd0dabe7d7b8227f97933d378422/numpy-2.2.6-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:b4f13750ce79751586ae2eb824ba7e1e8dba64784086c98cdbbcc6a42112ce0d", size = 18316632, upload-time = "2025-05-17T21:40:13.331Z" },
+ { url = "https://files.pythonhosted.org/packages/f0/3b/5cba2b1d88760ef86596ad0f3d484b1cbff7c115ae2429678465057c5155/numpy-2.2.6-cp313-cp313-win32.whl", hash = "sha256:5beb72339d9d4fa36522fc63802f469b13cdbe4fdab4a288f0c441b74272ebfd", size = 6244532, upload-time = "2025-05-17T21:43:46.099Z" },
+ { url = "https://files.pythonhosted.org/packages/cb/3b/d58c12eafcb298d4e6d0d40216866ab15f59e55d148a5658bb3132311fcf/numpy-2.2.6-cp313-cp313-win_amd64.whl", hash = "sha256:b0544343a702fa80c95ad5d3d608ea3599dd54d4632df855e4c8d24eb6ecfa1c", size = 12610885, upload-time = "2025-05-17T21:44:05.145Z" },
+ { url = "https://files.pythonhosted.org/packages/6b/9e/4bf918b818e516322db999ac25d00c75788ddfd2d2ade4fa66f1f38097e1/numpy-2.2.6-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:0bca768cd85ae743b2affdc762d617eddf3bcf8724435498a1e80132d04879e6", size = 20963467, upload-time = "2025-05-17T21:40:44Z" },
+ { url = "https://files.pythonhosted.org/packages/61/66/d2de6b291507517ff2e438e13ff7b1e2cdbdb7cb40b3ed475377aece69f9/numpy-2.2.6-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:fc0c5673685c508a142ca65209b4e79ed6740a4ed6b2267dbba90f34b0b3cfda", size = 14225144, upload-time = "2025-05-17T21:41:05.695Z" },
+ { url = "https://files.pythonhosted.org/packages/e4/25/480387655407ead912e28ba3a820bc69af9adf13bcbe40b299d454ec011f/numpy-2.2.6-cp313-cp313t-macosx_14_0_arm64.whl", hash = "sha256:5bd4fc3ac8926b3819797a7c0e2631eb889b4118a9898c84f585a54d475b7e40", size = 5200217, upload-time = "2025-05-17T21:41:15.903Z" },
+ { url = "https://files.pythonhosted.org/packages/aa/4a/6e313b5108f53dcbf3aca0c0f3e9c92f4c10ce57a0a721851f9785872895/numpy-2.2.6-cp313-cp313t-macosx_14_0_x86_64.whl", hash = "sha256:fee4236c876c4e8369388054d02d0e9bb84821feb1a64dd59e137e6511a551f8", size = 6712014, upload-time = "2025-05-17T21:41:27.321Z" },
+ { url = "https://files.pythonhosted.org/packages/b7/30/172c2d5c4be71fdf476e9de553443cf8e25feddbe185e0bd88b096915bcc/numpy-2.2.6-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e1dda9c7e08dc141e0247a5b8f49cf05984955246a327d4c48bda16821947b2f", size = 14077935, upload-time = "2025-05-17T21:41:49.738Z" },
+ { url = "https://files.pythonhosted.org/packages/12/fb/9e743f8d4e4d3c710902cf87af3512082ae3d43b945d5d16563f26ec251d/numpy-2.2.6-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f447e6acb680fd307f40d3da4852208af94afdfab89cf850986c3ca00562f4fa", size = 16600122, upload-time = "2025-05-17T21:42:14.046Z" },
+ { url = "https://files.pythonhosted.org/packages/12/75/ee20da0e58d3a66f204f38916757e01e33a9737d0b22373b3eb5a27358f9/numpy-2.2.6-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:389d771b1623ec92636b0786bc4ae56abafad4a4c513d36a55dce14bd9ce8571", size = 15586143, upload-time = "2025-05-17T21:42:37.464Z" },
+ { url = "https://files.pythonhosted.org/packages/76/95/bef5b37f29fc5e739947e9ce5179ad402875633308504a52d188302319c8/numpy-2.2.6-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:8e9ace4a37db23421249ed236fdcdd457d671e25146786dfc96835cd951aa7c1", size = 18385260, upload-time = "2025-05-17T21:43:05.189Z" },
+ { url = "https://files.pythonhosted.org/packages/09/04/f2f83279d287407cf36a7a8053a5abe7be3622a4363337338f2585e4afda/numpy-2.2.6-cp313-cp313t-win32.whl", hash = "sha256:038613e9fb8c72b0a41f025a7e4c3f0b7a1b5d768ece4796b674c8f3fe13efff", size = 6377225, upload-time = "2025-05-17T21:43:16.254Z" },
+ { url = "https://files.pythonhosted.org/packages/67/0e/35082d13c09c02c011cf21570543d202ad929d961c02a147493cb0c2bdf5/numpy-2.2.6-cp313-cp313t-win_amd64.whl", hash = "sha256:6031dd6dfecc0cf9f668681a37648373bddd6421fff6c66ec1624eed0180ee06", size = 12771374, upload-time = "2025-05-17T21:43:35.479Z" },
+ { url = "https://files.pythonhosted.org/packages/9e/3b/d94a75f4dbf1ef5d321523ecac21ef23a3cd2ac8b78ae2aac40873590229/numpy-2.2.6-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:0b605b275d7bd0c640cad4e5d30fa701a8d59302e127e5f79138ad62762c3e3d", size = 21040391, upload-time = "2025-05-17T21:44:35.948Z" },
+ { url = "https://files.pythonhosted.org/packages/17/f4/09b2fa1b58f0fb4f7c7963a1649c64c4d315752240377ed74d9cd878f7b5/numpy-2.2.6-pp310-pypy310_pp73-macosx_14_0_x86_64.whl", hash = "sha256:7befc596a7dc9da8a337f79802ee8adb30a552a94f792b9c9d18c840055907db", size = 6786754, upload-time = "2025-05-17T21:44:47.446Z" },
+ { url = "https://files.pythonhosted.org/packages/af/30/feba75f143bdc868a1cc3f44ccfa6c4b9ec522b36458e738cd00f67b573f/numpy-2.2.6-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ce47521a4754c8f4593837384bd3424880629f718d87c5d44f8ed763edd63543", size = 16643476, upload-time = "2025-05-17T21:45:11.871Z" },
+ { url = "https://files.pythonhosted.org/packages/37/48/ac2a9584402fb6c0cd5b5d1a91dcf176b15760130dd386bbafdbfe3640bf/numpy-2.2.6-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:d042d24c90c41b54fd506da306759e06e568864df8ec17ccc17e9e884634fd00", size = 12812666, upload-time = "2025-05-17T21:45:31.426Z" },
+]
+
+[[package]]
+name = "numpy"
+version = "2.3.3"
+source = { registry = "https://pypi.org/simple" }
+resolution-markers = [
+ "python_full_version >= '3.12'",
+ "python_full_version == '3.11.*'",
+]
+sdist = { url = "https://files.pythonhosted.org/packages/d0/19/95b3d357407220ed24c139018d2518fab0a61a948e68286a25f1a4d049ff/numpy-2.3.3.tar.gz", hash = "sha256:ddc7c39727ba62b80dfdbedf400d1c10ddfa8eefbd7ec8dcb118be8b56d31029", size = 20576648, upload-time = "2025-09-09T16:54:12.543Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/7a/45/e80d203ef6b267aa29b22714fb558930b27960a0c5ce3c19c999232bb3eb/numpy-2.3.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:0ffc4f5caba7dfcbe944ed674b7eef683c7e94874046454bb79ed7ee0236f59d", size = 21259253, upload-time = "2025-09-09T15:56:02.094Z" },
+ { url = "https://files.pythonhosted.org/packages/52/18/cf2c648fccf339e59302e00e5f2bc87725a3ce1992f30f3f78c9044d7c43/numpy-2.3.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:e7e946c7170858a0295f79a60214424caac2ffdb0063d4d79cb681f9aa0aa569", size = 14450980, upload-time = "2025-09-09T15:56:05.926Z" },
+ { url = "https://files.pythonhosted.org/packages/93/fb/9af1082bec870188c42a1c239839915b74a5099c392389ff04215dcee812/numpy-2.3.3-cp311-cp311-macosx_14_0_arm64.whl", hash = "sha256:cd4260f64bc794c3390a63bf0728220dd1a68170c169088a1e0dfa2fde1be12f", size = 5379709, upload-time = "2025-09-09T15:56:07.95Z" },
+ { url = "https://files.pythonhosted.org/packages/75/0f/bfd7abca52bcbf9a4a65abc83fe18ef01ccdeb37bfb28bbd6ad613447c79/numpy-2.3.3-cp311-cp311-macosx_14_0_x86_64.whl", hash = "sha256:f0ddb4b96a87b6728df9362135e764eac3cfa674499943ebc44ce96c478ab125", size = 6913923, upload-time = "2025-09-09T15:56:09.443Z" },
+ { url = "https://files.pythonhosted.org/packages/79/55/d69adad255e87ab7afda1caf93ca997859092afeb697703e2f010f7c2e55/numpy-2.3.3-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:afd07d377f478344ec6ca2b8d4ca08ae8bd44706763d1efb56397de606393f48", size = 14589591, upload-time = "2025-09-09T15:56:11.234Z" },
+ { url = "https://files.pythonhosted.org/packages/10/a2/010b0e27ddeacab7839957d7a8f00e91206e0c2c47abbb5f35a2630e5387/numpy-2.3.3-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bc92a5dedcc53857249ca51ef29f5e5f2f8c513e22cfb90faeb20343b8c6f7a6", size = 16938714, upload-time = "2025-09-09T15:56:14.637Z" },
+ { url = "https://files.pythonhosted.org/packages/1c/6b/12ce8ede632c7126eb2762b9e15e18e204b81725b81f35176eac14dc5b82/numpy-2.3.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:7af05ed4dc19f308e1d9fc759f36f21921eb7bbfc82843eeec6b2a2863a0aefa", size = 16370592, upload-time = "2025-09-09T15:56:17.285Z" },
+ { url = "https://files.pythonhosted.org/packages/b4/35/aba8568b2593067bb6a8fe4c52babb23b4c3b9c80e1b49dff03a09925e4a/numpy-2.3.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:433bf137e338677cebdd5beac0199ac84712ad9d630b74eceeb759eaa45ddf30", size = 18884474, upload-time = "2025-09-09T15:56:20.943Z" },
+ { url = "https://files.pythonhosted.org/packages/45/fa/7f43ba10c77575e8be7b0138d107e4f44ca4a1ef322cd16980ea3e8b8222/numpy-2.3.3-cp311-cp311-win32.whl", hash = "sha256:eb63d443d7b4ffd1e873f8155260d7f58e7e4b095961b01c91062935c2491e57", size = 6599794, upload-time = "2025-09-09T15:56:23.258Z" },
+ { url = "https://files.pythonhosted.org/packages/0a/a2/a4f78cb2241fe5664a22a10332f2be886dcdea8784c9f6a01c272da9b426/numpy-2.3.3-cp311-cp311-win_amd64.whl", hash = "sha256:ec9d249840f6a565f58d8f913bccac2444235025bbb13e9a4681783572ee3caa", size = 13088104, upload-time = "2025-09-09T15:56:25.476Z" },
+ { url = "https://files.pythonhosted.org/packages/79/64/e424e975adbd38282ebcd4891661965b78783de893b381cbc4832fb9beb2/numpy-2.3.3-cp311-cp311-win_arm64.whl", hash = "sha256:74c2a948d02f88c11a3c075d9733f1ae67d97c6bdb97f2bb542f980458b257e7", size = 10460772, upload-time = "2025-09-09T15:56:27.679Z" },
+ { url = "https://files.pythonhosted.org/packages/51/5d/bb7fc075b762c96329147799e1bcc9176ab07ca6375ea976c475482ad5b3/numpy-2.3.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:cfdd09f9c84a1a934cde1eec2267f0a43a7cd44b2cca4ff95b7c0d14d144b0bf", size = 20957014, upload-time = "2025-09-09T15:56:29.966Z" },
+ { url = "https://files.pythonhosted.org/packages/6b/0e/c6211bb92af26517acd52125a237a92afe9c3124c6a68d3b9f81b62a0568/numpy-2.3.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:cb32e3cf0f762aee47ad1ddc6672988f7f27045b0783c887190545baba73aa25", size = 14185220, upload-time = "2025-09-09T15:56:32.175Z" },
+ { url = "https://files.pythonhosted.org/packages/22/f2/07bb754eb2ede9073f4054f7c0286b0d9d2e23982e090a80d478b26d35ca/numpy-2.3.3-cp312-cp312-macosx_14_0_arm64.whl", hash = "sha256:396b254daeb0a57b1fe0ecb5e3cff6fa79a380fa97c8f7781a6d08cd429418fe", size = 5113918, upload-time = "2025-09-09T15:56:34.175Z" },
+ { url = "https://files.pythonhosted.org/packages/81/0a/afa51697e9fb74642f231ea36aca80fa17c8fb89f7a82abd5174023c3960/numpy-2.3.3-cp312-cp312-macosx_14_0_x86_64.whl", hash = "sha256:067e3d7159a5d8f8a0b46ee11148fc35ca9b21f61e3c49fbd0a027450e65a33b", size = 6647922, upload-time = "2025-09-09T15:56:36.149Z" },
+ { url = "https://files.pythonhosted.org/packages/5d/f5/122d9cdb3f51c520d150fef6e87df9279e33d19a9611a87c0d2cf78a89f4/numpy-2.3.3-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1c02d0629d25d426585fb2e45a66154081b9fa677bc92a881ff1d216bc9919a8", size = 14281991, upload-time = "2025-09-09T15:56:40.548Z" },
+ { url = "https://files.pythonhosted.org/packages/51/64/7de3c91e821a2debf77c92962ea3fe6ac2bc45d0778c1cbe15d4fce2fd94/numpy-2.3.3-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d9192da52b9745f7f0766531dcfa978b7763916f158bb63bdb8a1eca0068ab20", size = 16641643, upload-time = "2025-09-09T15:56:43.343Z" },
+ { url = "https://files.pythonhosted.org/packages/30/e4/961a5fa681502cd0d68907818b69f67542695b74e3ceaa513918103b7e80/numpy-2.3.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:cd7de500a5b66319db419dc3c345244404a164beae0d0937283b907d8152e6ea", size = 16056787, upload-time = "2025-09-09T15:56:46.141Z" },
+ { url = "https://files.pythonhosted.org/packages/99/26/92c912b966e47fbbdf2ad556cb17e3a3088e2e1292b9833be1dfa5361a1a/numpy-2.3.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:93d4962d8f82af58f0b2eb85daaf1b3ca23fe0a85d0be8f1f2b7bb46034e56d7", size = 18579598, upload-time = "2025-09-09T15:56:49.844Z" },
+ { url = "https://files.pythonhosted.org/packages/17/b6/fc8f82cb3520768718834f310c37d96380d9dc61bfdaf05fe5c0b7653e01/numpy-2.3.3-cp312-cp312-win32.whl", hash = "sha256:5534ed6b92f9b7dca6c0a19d6df12d41c68b991cef051d108f6dbff3babc4ebf", size = 6320800, upload-time = "2025-09-09T15:56:52.499Z" },
+ { url = "https://files.pythonhosted.org/packages/32/ee/de999f2625b80d043d6d2d628c07d0d5555a677a3cf78fdf868d409b8766/numpy-2.3.3-cp312-cp312-win_amd64.whl", hash = "sha256:497d7cad08e7092dba36e3d296fe4c97708c93daf26643a1ae4b03f6294d30eb", size = 12786615, upload-time = "2025-09-09T15:56:54.422Z" },
+ { url = "https://files.pythonhosted.org/packages/49/6e/b479032f8a43559c383acb20816644f5f91c88f633d9271ee84f3b3a996c/numpy-2.3.3-cp312-cp312-win_arm64.whl", hash = "sha256:ca0309a18d4dfea6fc6262a66d06c26cfe4640c3926ceec90e57791a82b6eee5", size = 10195936, upload-time = "2025-09-09T15:56:56.541Z" },
+ { url = "https://files.pythonhosted.org/packages/7d/b9/984c2b1ee61a8b803bf63582b4ac4242cf76e2dbd663efeafcb620cc0ccb/numpy-2.3.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f5415fb78995644253370985342cd03572ef8620b934da27d77377a2285955bf", size = 20949588, upload-time = "2025-09-09T15:56:59.087Z" },
+ { url = "https://files.pythonhosted.org/packages/a6/e4/07970e3bed0b1384d22af1e9912527ecbeb47d3b26e9b6a3bced068b3bea/numpy-2.3.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:d00de139a3324e26ed5b95870ce63be7ec7352171bc69a4cf1f157a48e3eb6b7", size = 14177802, upload-time = "2025-09-09T15:57:01.73Z" },
+ { url = "https://files.pythonhosted.org/packages/35/c7/477a83887f9de61f1203bad89cf208b7c19cc9fef0cebef65d5a1a0619f2/numpy-2.3.3-cp313-cp313-macosx_14_0_arm64.whl", hash = "sha256:9dc13c6a5829610cc07422bc74d3ac083bd8323f14e2827d992f9e52e22cd6a6", size = 5106537, upload-time = "2025-09-09T15:57:03.765Z" },
+ { url = "https://files.pythonhosted.org/packages/52/47/93b953bd5866a6f6986344d045a207d3f1cfbad99db29f534ea9cee5108c/numpy-2.3.3-cp313-cp313-macosx_14_0_x86_64.whl", hash = "sha256:d79715d95f1894771eb4e60fb23f065663b2298f7d22945d66877aadf33d00c7", size = 6640743, upload-time = "2025-09-09T15:57:07.921Z" },
+ { url = "https://files.pythonhosted.org/packages/23/83/377f84aaeb800b64c0ef4de58b08769e782edcefa4fea712910b6f0afd3c/numpy-2.3.3-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:952cfd0748514ea7c3afc729a0fc639e61655ce4c55ab9acfab14bda4f402b4c", size = 14278881, upload-time = "2025-09-09T15:57:11.349Z" },
+ { url = "https://files.pythonhosted.org/packages/9a/a5/bf3db6e66c4b160d6ea10b534c381a1955dfab34cb1017ea93aa33c70ed3/numpy-2.3.3-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:5b83648633d46f77039c29078751f80da65aa64d5622a3cd62aaef9d835b6c93", size = 16636301, upload-time = "2025-09-09T15:57:14.245Z" },
+ { url = "https://files.pythonhosted.org/packages/a2/59/1287924242eb4fa3f9b3a2c30400f2e17eb2707020d1c5e3086fe7330717/numpy-2.3.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:b001bae8cea1c7dfdb2ae2b017ed0a6f2102d7a70059df1e338e307a4c78a8ae", size = 16053645, upload-time = "2025-09-09T15:57:16.534Z" },
+ { url = "https://files.pythonhosted.org/packages/e6/93/b3d47ed882027c35e94ac2320c37e452a549f582a5e801f2d34b56973c97/numpy-2.3.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:8e9aced64054739037d42fb84c54dd38b81ee238816c948c8f3ed134665dcd86", size = 18578179, upload-time = "2025-09-09T15:57:18.883Z" },
+ { url = "https://files.pythonhosted.org/packages/20/d9/487a2bccbf7cc9d4bfc5f0f197761a5ef27ba870f1e3bbb9afc4bbe3fcc2/numpy-2.3.3-cp313-cp313-win32.whl", hash = "sha256:9591e1221db3f37751e6442850429b3aabf7026d3b05542d102944ca7f00c8a8", size = 6312250, upload-time = "2025-09-09T15:57:21.296Z" },
+ { url = "https://files.pythonhosted.org/packages/1b/b5/263ebbbbcede85028f30047eab3d58028d7ebe389d6493fc95ae66c636ab/numpy-2.3.3-cp313-cp313-win_amd64.whl", hash = "sha256:f0dadeb302887f07431910f67a14d57209ed91130be0adea2f9793f1a4f817cf", size = 12783269, upload-time = "2025-09-09T15:57:23.034Z" },
+ { url = "https://files.pythonhosted.org/packages/fa/75/67b8ca554bbeaaeb3fac2e8bce46967a5a06544c9108ec0cf5cece559b6c/numpy-2.3.3-cp313-cp313-win_arm64.whl", hash = "sha256:3c7cf302ac6e0b76a64c4aecf1a09e51abd9b01fc7feee80f6c43e3ab1b1dbc5", size = 10195314, upload-time = "2025-09-09T15:57:25.045Z" },
+ { url = "https://files.pythonhosted.org/packages/11/d0/0d1ddec56b162042ddfafeeb293bac672de9b0cfd688383590090963720a/numpy-2.3.3-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:eda59e44957d272846bb407aad19f89dc6f58fecf3504bd144f4c5cf81a7eacc", size = 21048025, upload-time = "2025-09-09T15:57:27.257Z" },
+ { url = "https://files.pythonhosted.org/packages/36/9e/1996ca6b6d00415b6acbdd3c42f7f03ea256e2c3f158f80bd7436a8a19f3/numpy-2.3.3-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:823d04112bc85ef5c4fda73ba24e6096c8f869931405a80aa8b0e604510a26bc", size = 14301053, upload-time = "2025-09-09T15:57:30.077Z" },
+ { url = "https://files.pythonhosted.org/packages/05/24/43da09aa764c68694b76e84b3d3f0c44cb7c18cdc1ba80e48b0ac1d2cd39/numpy-2.3.3-cp313-cp313t-macosx_14_0_arm64.whl", hash = "sha256:40051003e03db4041aa325da2a0971ba41cf65714e65d296397cc0e32de6018b", size = 5229444, upload-time = "2025-09-09T15:57:32.733Z" },
+ { url = "https://files.pythonhosted.org/packages/bc/14/50ffb0f22f7218ef8af28dd089f79f68289a7a05a208db9a2c5dcbe123c1/numpy-2.3.3-cp313-cp313t-macosx_14_0_x86_64.whl", hash = "sha256:6ee9086235dd6ab7ae75aba5662f582a81ced49f0f1c6de4260a78d8f2d91a19", size = 6738039, upload-time = "2025-09-09T15:57:34.328Z" },
+ { url = "https://files.pythonhosted.org/packages/55/52/af46ac0795e09657d45a7f4db961917314377edecf66db0e39fa7ab5c3d3/numpy-2.3.3-cp313-cp313t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:94fcaa68757c3e2e668ddadeaa86ab05499a70725811e582b6a9858dd472fb30", size = 14352314, upload-time = "2025-09-09T15:57:36.255Z" },
+ { url = "https://files.pythonhosted.org/packages/a7/b1/dc226b4c90eb9f07a3fff95c2f0db3268e2e54e5cce97c4ac91518aee71b/numpy-2.3.3-cp313-cp313t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:da1a74b90e7483d6ce5244053399a614b1d6b7bc30a60d2f570e5071f8959d3e", size = 16701722, upload-time = "2025-09-09T15:57:38.622Z" },
+ { url = "https://files.pythonhosted.org/packages/9d/9d/9d8d358f2eb5eced14dba99f110d83b5cd9a4460895230f3b396ad19a323/numpy-2.3.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:2990adf06d1ecee3b3dcbb4977dfab6e9f09807598d647f04d385d29e7a3c3d3", size = 16132755, upload-time = "2025-09-09T15:57:41.16Z" },
+ { url = "https://files.pythonhosted.org/packages/b6/27/b3922660c45513f9377b3fb42240bec63f203c71416093476ec9aa0719dc/numpy-2.3.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:ed635ff692483b8e3f0fcaa8e7eb8a75ee71aa6d975388224f70821421800cea", size = 18651560, upload-time = "2025-09-09T15:57:43.459Z" },
+ { url = "https://files.pythonhosted.org/packages/5b/8e/3ab61a730bdbbc201bb245a71102aa609f0008b9ed15255500a99cd7f780/numpy-2.3.3-cp313-cp313t-win32.whl", hash = "sha256:a333b4ed33d8dc2b373cc955ca57babc00cd6f9009991d9edc5ddbc1bac36bcd", size = 6442776, upload-time = "2025-09-09T15:57:45.793Z" },
+ { url = "https://files.pythonhosted.org/packages/1c/3a/e22b766b11f6030dc2decdeff5c2fb1610768055603f9f3be88b6d192fb2/numpy-2.3.3-cp313-cp313t-win_amd64.whl", hash = "sha256:4384a169c4d8f97195980815d6fcad04933a7e1ab3b530921c3fef7a1c63426d", size = 12927281, upload-time = "2025-09-09T15:57:47.492Z" },
+ { url = "https://files.pythonhosted.org/packages/7b/42/c2e2bc48c5e9b2a83423f99733950fbefd86f165b468a3d85d52b30bf782/numpy-2.3.3-cp313-cp313t-win_arm64.whl", hash = "sha256:75370986cc0bc66f4ce5110ad35aae6d182cc4ce6433c40ad151f53690130bf1", size = 10265275, upload-time = "2025-09-09T15:57:49.647Z" },
+ { url = "https://files.pythonhosted.org/packages/6b/01/342ad585ad82419b99bcf7cebe99e61da6bedb89e213c5fd71acc467faee/numpy-2.3.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:cd052f1fa6a78dee696b58a914b7229ecfa41f0a6d96dc663c1220a55e137593", size = 20951527, upload-time = "2025-09-09T15:57:52.006Z" },
+ { url = "https://files.pythonhosted.org/packages/ef/d8/204e0d73fc1b7a9ee80ab1fe1983dd33a4d64a4e30a05364b0208e9a241a/numpy-2.3.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:414a97499480067d305fcac9716c29cf4d0d76db6ebf0bf3cbce666677f12652", size = 14186159, upload-time = "2025-09-09T15:57:54.407Z" },
+ { url = "https://files.pythonhosted.org/packages/22/af/f11c916d08f3a18fb8ba81ab72b5b74a6e42ead4c2846d270eb19845bf74/numpy-2.3.3-cp314-cp314-macosx_14_0_arm64.whl", hash = "sha256:50a5fe69f135f88a2be9b6ca0481a68a136f6febe1916e4920e12f1a34e708a7", size = 5114624, upload-time = "2025-09-09T15:57:56.5Z" },
+ { url = "https://files.pythonhosted.org/packages/fb/11/0ed919c8381ac9d2ffacd63fd1f0c34d27e99cab650f0eb6f110e6ae4858/numpy-2.3.3-cp314-cp314-macosx_14_0_x86_64.whl", hash = "sha256:b912f2ed2b67a129e6a601e9d93d4fa37bef67e54cac442a2f588a54afe5c67a", size = 6642627, upload-time = "2025-09-09T15:57:58.206Z" },
+ { url = "https://files.pythonhosted.org/packages/ee/83/deb5f77cb0f7ba6cb52b91ed388b47f8f3c2e9930d4665c600408d9b90b9/numpy-2.3.3-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:9e318ee0596d76d4cb3d78535dc005fa60e5ea348cd131a51e99d0bdbe0b54fe", size = 14296926, upload-time = "2025-09-09T15:58:00.035Z" },
+ { url = "https://files.pythonhosted.org/packages/77/cc/70e59dcb84f2b005d4f306310ff0a892518cc0c8000a33d0e6faf7ca8d80/numpy-2.3.3-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ce020080e4a52426202bdb6f7691c65bb55e49f261f31a8f506c9f6bc7450421", size = 16638958, upload-time = "2025-09-09T15:58:02.738Z" },
+ { url = "https://files.pythonhosted.org/packages/b6/5a/b2ab6c18b4257e099587d5b7f903317bd7115333ad8d4ec4874278eafa61/numpy-2.3.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:e6687dc183aa55dae4a705b35f9c0f8cb178bcaa2f029b241ac5356221d5c021", size = 16071920, upload-time = "2025-09-09T15:58:05.029Z" },
+ { url = "https://files.pythonhosted.org/packages/b8/f1/8b3fdc44324a259298520dd82147ff648979bed085feeacc1250ef1656c0/numpy-2.3.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:d8f3b1080782469fdc1718c4ed1d22549b5fb12af0d57d35e992158a772a37cf", size = 18577076, upload-time = "2025-09-09T15:58:07.745Z" },
+ { url = "https://files.pythonhosted.org/packages/f0/a1/b87a284fb15a42e9274e7fcea0dad259d12ddbf07c1595b26883151ca3b4/numpy-2.3.3-cp314-cp314-win32.whl", hash = "sha256:cb248499b0bc3be66ebd6578b83e5acacf1d6cb2a77f2248ce0e40fbec5a76d0", size = 6366952, upload-time = "2025-09-09T15:58:10.096Z" },
+ { url = "https://files.pythonhosted.org/packages/70/5f/1816f4d08f3b8f66576d8433a66f8fa35a5acfb3bbd0bf6c31183b003f3d/numpy-2.3.3-cp314-cp314-win_amd64.whl", hash = "sha256:691808c2b26b0f002a032c73255d0bd89751425f379f7bcd22d140db593a96e8", size = 12919322, upload-time = "2025-09-09T15:58:12.138Z" },
+ { url = "https://files.pythonhosted.org/packages/8c/de/072420342e46a8ea41c324a555fa90fcc11637583fb8df722936aed1736d/numpy-2.3.3-cp314-cp314-win_arm64.whl", hash = "sha256:9ad12e976ca7b10f1774b03615a2a4bab8addce37ecc77394d8e986927dc0dfe", size = 10478630, upload-time = "2025-09-09T15:58:14.64Z" },
+ { url = "https://files.pythonhosted.org/packages/d5/df/ee2f1c0a9de7347f14da5dd3cd3c3b034d1b8607ccb6883d7dd5c035d631/numpy-2.3.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:9cc48e09feb11e1db00b320e9d30a4151f7369afb96bd0e48d942d09da3a0d00", size = 21047987, upload-time = "2025-09-09T15:58:16.889Z" },
+ { url = "https://files.pythonhosted.org/packages/d6/92/9453bdc5a4e9e69cf4358463f25e8260e2ffc126d52e10038b9077815989/numpy-2.3.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:901bf6123879b7f251d3631967fd574690734236075082078e0571977c6a8e6a", size = 14301076, upload-time = "2025-09-09T15:58:20.343Z" },
+ { url = "https://files.pythonhosted.org/packages/13/77/1447b9eb500f028bb44253105bd67534af60499588a5149a94f18f2ca917/numpy-2.3.3-cp314-cp314t-macosx_14_0_arm64.whl", hash = "sha256:7f025652034199c301049296b59fa7d52c7e625017cae4c75d8662e377bf487d", size = 5229491, upload-time = "2025-09-09T15:58:22.481Z" },
+ { url = "https://files.pythonhosted.org/packages/3d/f9/d72221b6ca205f9736cb4b2ce3b002f6e45cd67cd6a6d1c8af11a2f0b649/numpy-2.3.3-cp314-cp314t-macosx_14_0_x86_64.whl", hash = "sha256:533ca5f6d325c80b6007d4d7fb1984c303553534191024ec6a524a4c92a5935a", size = 6737913, upload-time = "2025-09-09T15:58:24.569Z" },
+ { url = "https://files.pythonhosted.org/packages/3c/5f/d12834711962ad9c46af72f79bb31e73e416ee49d17f4c797f72c96b6ca5/numpy-2.3.3-cp314-cp314t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0edd58682a399824633b66885d699d7de982800053acf20be1eaa46d92009c54", size = 14352811, upload-time = "2025-09-09T15:58:26.416Z" },
+ { url = "https://files.pythonhosted.org/packages/a1/0d/fdbec6629d97fd1bebed56cd742884e4eead593611bbe1abc3eb40d304b2/numpy-2.3.3-cp314-cp314t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:367ad5d8fbec5d9296d18478804a530f1191e24ab4d75ab408346ae88045d25e", size = 16702689, upload-time = "2025-09-09T15:58:28.831Z" },
+ { url = "https://files.pythonhosted.org/packages/9b/09/0a35196dc5575adde1eb97ddfbc3e1687a814f905377621d18ca9bc2b7dd/numpy-2.3.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:8f6ac61a217437946a1fa48d24c47c91a0c4f725237871117dea264982128097", size = 16133855, upload-time = "2025-09-09T15:58:31.349Z" },
+ { url = "https://files.pythonhosted.org/packages/7a/ca/c9de3ea397d576f1b6753eaa906d4cdef1bf97589a6d9825a349b4729cc2/numpy-2.3.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:179a42101b845a816d464b6fe9a845dfaf308fdfc7925387195570789bb2c970", size = 18652520, upload-time = "2025-09-09T15:58:33.762Z" },
+ { url = "https://files.pythonhosted.org/packages/fd/c2/e5ed830e08cd0196351db55db82f65bc0ab05da6ef2b72a836dcf1936d2f/numpy-2.3.3-cp314-cp314t-win32.whl", hash = "sha256:1250c5d3d2562ec4174bce2e3a1523041595f9b651065e4a4473f5f48a6bc8a5", size = 6515371, upload-time = "2025-09-09T15:58:36.04Z" },
+ { url = "https://files.pythonhosted.org/packages/47/c7/b0f6b5b67f6788a0725f744496badbb604d226bf233ba716683ebb47b570/numpy-2.3.3-cp314-cp314t-win_amd64.whl", hash = "sha256:b37a0b2e5935409daebe82c1e42274d30d9dd355852529eab91dab8dcca7419f", size = 13112576, upload-time = "2025-09-09T15:58:37.927Z" },
+ { url = "https://files.pythonhosted.org/packages/06/b9/33bba5ff6fb679aa0b1f8a07e853f002a6b04b9394db3069a1270a7784ca/numpy-2.3.3-cp314-cp314t-win_arm64.whl", hash = "sha256:78c9f6560dc7e6b3990e32df7ea1a50bbd0e2a111e05209963f5ddcab7073b0b", size = 10545953, upload-time = "2025-09-09T15:58:40.576Z" },
+ { url = "https://files.pythonhosted.org/packages/b8/f2/7e0a37cfced2644c9563c529f29fa28acbd0960dde32ece683aafa6f4949/numpy-2.3.3-pp311-pypy311_pp73-macosx_10_15_x86_64.whl", hash = "sha256:1e02c7159791cd481e1e6d5ddd766b62a4d5acf8df4d4d1afe35ee9c5c33a41e", size = 21131019, upload-time = "2025-09-09T15:58:42.838Z" },
+ { url = "https://files.pythonhosted.org/packages/1a/7e/3291f505297ed63831135a6cc0f474da0c868a1f31b0dd9a9f03a7a0d2ed/numpy-2.3.3-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:dca2d0fc80b3893ae72197b39f69d55a3cd8b17ea1b50aa4c62de82419936150", size = 14376288, upload-time = "2025-09-09T15:58:45.425Z" },
+ { url = "https://files.pythonhosted.org/packages/bf/4b/ae02e985bdeee73d7b5abdefeb98aef1207e96d4c0621ee0cf228ddfac3c/numpy-2.3.3-pp311-pypy311_pp73-macosx_14_0_arm64.whl", hash = "sha256:99683cbe0658f8271b333a1b1b4bb3173750ad59c0c61f5bbdc5b318918fffe3", size = 5305425, upload-time = "2025-09-09T15:58:48.6Z" },
+ { url = "https://files.pythonhosted.org/packages/8b/eb/9df215d6d7250db32007941500dc51c48190be25f2401d5b2b564e467247/numpy-2.3.3-pp311-pypy311_pp73-macosx_14_0_x86_64.whl", hash = "sha256:d9d537a39cc9de668e5cd0e25affb17aec17b577c6b3ae8a3d866b479fbe88d0", size = 6819053, upload-time = "2025-09-09T15:58:50.401Z" },
+ { url = "https://files.pythonhosted.org/packages/57/62/208293d7d6b2a8998a4a1f23ac758648c3c32182d4ce4346062018362e29/numpy-2.3.3-pp311-pypy311_pp73-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8596ba2f8af5f93b01d97563832686d20206d303024777f6dfc2e7c7c3f1850e", size = 14420354, upload-time = "2025-09-09T15:58:52.704Z" },
+ { url = "https://files.pythonhosted.org/packages/ed/0c/8e86e0ff7072e14a71b4c6af63175e40d1e7e933ce9b9e9f765a95b4e0c3/numpy-2.3.3-pp311-pypy311_pp73-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e1ec5615b05369925bd1125f27df33f3b6c8bc10d788d5999ecd8769a1fa04db", size = 16760413, upload-time = "2025-09-09T15:58:55.027Z" },
+ { url = "https://files.pythonhosted.org/packages/af/11/0cc63f9f321ccf63886ac203336777140011fb669e739da36d8db3c53b98/numpy-2.3.3-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:2e267c7da5bf7309670523896df97f93f6e469fb931161f483cd6882b3b1a5dc", size = 12971844, upload-time = "2025-09-09T15:58:57.359Z" },
+]
+
+[[package]]
+name = "nvidia-cublas-cu12"
+version = "12.6.4.1"
+source = { registry = "https://pypi.org/simple" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/af/eb/ff4b8c503fa1f1796679dce648854d58751982426e4e4b37d6fce49d259c/nvidia_cublas_cu12-12.6.4.1-py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:08ed2686e9875d01b58e3cb379c6896df8e76c75e0d4a7f7dace3d7b6d9ef8eb", size = 393138322, upload-time = "2024-11-20T17:40:25.65Z" },
+]
+
+[[package]]
+name = "nvidia-cuda-cupti-cu12"
+version = "12.6.80"
+source = { registry = "https://pypi.org/simple" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/49/60/7b6497946d74bcf1de852a21824d63baad12cd417db4195fc1bfe59db953/nvidia_cuda_cupti_cu12-12.6.80-py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:6768bad6cab4f19e8292125e5f1ac8aa7d1718704012a0e3272a6f61c4bce132", size = 8917980, upload-time = "2024-11-20T17:36:04.019Z" },
+ { url = "https://files.pythonhosted.org/packages/a5/24/120ee57b218d9952c379d1e026c4479c9ece9997a4fb46303611ee48f038/nvidia_cuda_cupti_cu12-12.6.80-py3-none-manylinux2014_x86_64.whl", hash = "sha256:a3eff6cdfcc6a4c35db968a06fcadb061cbc7d6dde548609a941ff8701b98b73", size = 8917972, upload-time = "2024-10-01T16:58:06.036Z" },
+]
+
+[[package]]
+name = "nvidia-cuda-nvrtc-cu12"
+version = "12.6.77"
+source = { registry = "https://pypi.org/simple" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/75/2e/46030320b5a80661e88039f59060d1790298b4718944a65a7f2aeda3d9e9/nvidia_cuda_nvrtc_cu12-12.6.77-py3-none-manylinux2014_x86_64.whl", hash = "sha256:35b0cc6ee3a9636d5409133e79273ce1f3fd087abb0532d2d2e8fff1fe9efc53", size = 23650380, upload-time = "2024-10-01T17:00:14.643Z" },
+]
+
+[[package]]
+name = "nvidia-cuda-runtime-cu12"
+version = "12.6.77"
+source = { registry = "https://pypi.org/simple" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/e1/23/e717c5ac26d26cf39a27fbc076240fad2e3b817e5889d671b67f4f9f49c5/nvidia_cuda_runtime_cu12-12.6.77-py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:ba3b56a4f896141e25e19ab287cd71e52a6a0f4b29d0d31609f60e3b4d5219b7", size = 897690, upload-time = "2024-11-20T17:35:30.697Z" },
+ { url = "https://files.pythonhosted.org/packages/f0/62/65c05e161eeddbafeca24dc461f47de550d9fa8a7e04eb213e32b55cfd99/nvidia_cuda_runtime_cu12-12.6.77-py3-none-manylinux2014_x86_64.whl", hash = "sha256:a84d15d5e1da416dd4774cb42edf5e954a3e60cc945698dc1d5be02321c44dc8", size = 897678, upload-time = "2024-10-01T16:57:33.821Z" },
+]
+
+[[package]]
+name = "nvidia-cudnn-cu12"
+version = "9.5.1.17"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "nvidia-cublas-cu12" },
+]
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/2a/78/4535c9c7f859a64781e43c969a3a7e84c54634e319a996d43ef32ce46f83/nvidia_cudnn_cu12-9.5.1.17-py3-none-manylinux_2_28_x86_64.whl", hash = "sha256:30ac3869f6db17d170e0e556dd6cc5eee02647abc31ca856634d5a40f82c15b2", size = 570988386, upload-time = "2024-10-25T19:54:26.39Z" },
+]
+
+[[package]]
+name = "nvidia-cufft-cu12"
+version = "11.3.0.4"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "nvidia-nvjitlink-cu12" },
+]
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/8f/16/73727675941ab8e6ffd86ca3a4b7b47065edcca7a997920b831f8147c99d/nvidia_cufft_cu12-11.3.0.4-py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:ccba62eb9cef5559abd5e0d54ceed2d9934030f51163df018532142a8ec533e5", size = 200221632, upload-time = "2024-11-20T17:41:32.357Z" },
+ { url = "https://files.pythonhosted.org/packages/60/de/99ec247a07ea40c969d904fc14f3a356b3e2a704121675b75c366b694ee1/nvidia_cufft_cu12-11.3.0.4-py3-none-manylinux2014_x86_64.whl", hash = "sha256:768160ac89f6f7b459bee747e8d175dbf53619cfe74b2a5636264163138013ca", size = 200221622, upload-time = "2024-10-01T17:03:58.79Z" },
+]
+
+[[package]]
+name = "nvidia-cufile-cu12"
+version = "1.11.1.6"
+source = { registry = "https://pypi.org/simple" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/b2/66/cc9876340ac68ae71b15c743ddb13f8b30d5244af344ec8322b449e35426/nvidia_cufile_cu12-1.11.1.6-py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:cc23469d1c7e52ce6c1d55253273d32c565dd22068647f3aa59b3c6b005bf159", size = 1142103, upload-time = "2024-11-20T17:42:11.83Z" },
+]
+
+[[package]]
+name = "nvidia-curand-cu12"
+version = "10.3.7.77"
+source = { registry = "https://pypi.org/simple" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/73/1b/44a01c4e70933637c93e6e1a8063d1e998b50213a6b65ac5a9169c47e98e/nvidia_curand_cu12-10.3.7.77-py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:a42cd1344297f70b9e39a1e4f467a4e1c10f1da54ff7a85c12197f6c652c8bdf", size = 56279010, upload-time = "2024-11-20T17:42:50.958Z" },
+ { url = "https://files.pythonhosted.org/packages/4a/aa/2c7ff0b5ee02eaef890c0ce7d4f74bc30901871c5e45dee1ae6d0083cd80/nvidia_curand_cu12-10.3.7.77-py3-none-manylinux2014_x86_64.whl", hash = "sha256:99f1a32f1ac2bd134897fc7a203f779303261268a65762a623bf30cc9fe79117", size = 56279000, upload-time = "2024-10-01T17:04:45.274Z" },
+]
+
+[[package]]
+name = "nvidia-cusolver-cu12"
+version = "11.7.1.2"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "nvidia-cublas-cu12" },
+ { name = "nvidia-cusparse-cu12" },
+ { name = "nvidia-nvjitlink-cu12" },
+]
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/f0/6e/c2cf12c9ff8b872e92b4a5740701e51ff17689c4d726fca91875b07f655d/nvidia_cusolver_cu12-11.7.1.2-py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:e9e49843a7707e42022babb9bcfa33c29857a93b88020c4e4434656a655b698c", size = 158229790, upload-time = "2024-11-20T17:43:43.211Z" },
+ { url = "https://files.pythonhosted.org/packages/9f/81/baba53585da791d043c10084cf9553e074548408e04ae884cfe9193bd484/nvidia_cusolver_cu12-11.7.1.2-py3-none-manylinux2014_x86_64.whl", hash = "sha256:6cf28f17f64107a0c4d7802be5ff5537b2130bfc112f25d5a30df227058ca0e6", size = 158229780, upload-time = "2024-10-01T17:05:39.875Z" },
+]
+
+[[package]]
+name = "nvidia-cusparse-cu12"
+version = "12.5.4.2"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "nvidia-nvjitlink-cu12" },
+]
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/06/1e/b8b7c2f4099a37b96af5c9bb158632ea9e5d9d27d7391d7eb8fc45236674/nvidia_cusparse_cu12-12.5.4.2-py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:7556d9eca156e18184b94947ade0fba5bb47d69cec46bf8660fd2c71a4b48b73", size = 216561367, upload-time = "2024-11-20T17:44:54.824Z" },
+ { url = "https://files.pythonhosted.org/packages/43/ac/64c4316ba163e8217a99680c7605f779accffc6a4bcd0c778c12948d3707/nvidia_cusparse_cu12-12.5.4.2-py3-none-manylinux2014_x86_64.whl", hash = "sha256:23749a6571191a215cb74d1cdbff4a86e7b19f1200c071b3fcf844a5bea23a2f", size = 216561357, upload-time = "2024-10-01T17:06:29.861Z" },
+]
+
+[[package]]
+name = "nvidia-cusparselt-cu12"
+version = "0.6.3"
+source = { registry = "https://pypi.org/simple" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/3b/9a/72ef35b399b0e183bc2e8f6f558036922d453c4d8237dab26c666a04244b/nvidia_cusparselt_cu12-0.6.3-py3-none-manylinux2014_x86_64.whl", hash = "sha256:e5c8a26c36445dd2e6812f1177978a24e2d37cacce7e090f297a688d1ec44f46", size = 156785796, upload-time = "2024-10-15T21:29:17.709Z" },
+]
+
+[[package]]
+name = "nvidia-nccl-cu12"
+version = "2.26.2"
+source = { registry = "https://pypi.org/simple" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/67/ca/f42388aed0fddd64ade7493dbba36e1f534d4e6fdbdd355c6a90030ae028/nvidia_nccl_cu12-2.26.2-py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:694cf3879a206553cc9d7dbda76b13efaf610fdb70a50cba303de1b0d1530ac6", size = 201319755, upload-time = "2025-03-13T00:29:55.296Z" },
+]
+
+[[package]]
+name = "nvidia-nvjitlink-cu12"
+version = "12.6.85"
+source = { registry = "https://pypi.org/simple" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/9d/d7/c5383e47c7e9bf1c99d5bd2a8c935af2b6d705ad831a7ec5c97db4d82f4f/nvidia_nvjitlink_cu12-12.6.85-py3-none-manylinux2010_x86_64.manylinux_2_12_x86_64.whl", hash = "sha256:eedc36df9e88b682efe4309aa16b5b4e78c2407eac59e8c10a6a47535164369a", size = 19744971, upload-time = "2024-11-20T17:46:53.366Z" },
+]
+
+[[package]]
+name = "nvidia-nvtx-cu12"
+version = "12.6.77"
+source = { registry = "https://pypi.org/simple" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/56/9a/fff8376f8e3d084cd1530e1ef7b879bb7d6d265620c95c1b322725c694f4/nvidia_nvtx_cu12-12.6.77-py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:b90bed3df379fa79afbd21be8e04a0314336b8ae16768b58f2d34cb1d04cd7d2", size = 89276, upload-time = "2024-11-20T17:38:27.621Z" },
+ { url = "https://files.pythonhosted.org/packages/9e/4e/0d0c945463719429b7bd21dece907ad0bde437a2ff12b9b12fee94722ab0/nvidia_nvtx_cu12-12.6.77-py3-none-manylinux2014_x86_64.whl", hash = "sha256:6574241a3ec5fdc9334353ab8c479fe75841dbe8f4532a8fc97ce63503330ba1", size = 89265, upload-time = "2024-10-01T17:00:38.172Z" },
+]
+
+[[package]]
+name = "packaging"
+version = "25.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/a1/d4/1fc4078c65507b51b96ca8f8c3ba19e6a61c8253c72794544580a7b6c24d/packaging-25.0.tar.gz", hash = "sha256:d443872c98d677bf60f6a1f2f8c1cb748e8fe762d2bf9d3148b5599295b0fc4f", size = 165727, upload-time = "2025-04-19T11:48:59.673Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/20/12/38679034af332785aac8774540895e234f4d07f7545804097de4b666afd8/packaging-25.0-py3-none-any.whl", hash = "sha256:29572ef2b1f17581046b3a2227d5c611fb25ec70ca1ba8554b24b0e69331a484", size = 66469, upload-time = "2025-04-19T11:48:57.875Z" },
+]
+
+[[package]]
+name = "pandas"
+version = "2.3.3"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "numpy", version = "2.2.6", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.11'" },
+ { name = "numpy", version = "2.3.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.11'" },
+ { name = "python-dateutil" },
+ { name = "pytz" },
+ { name = "tzdata" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/33/01/d40b85317f86cf08d853a4f495195c73815fdf205eef3993821720274518/pandas-2.3.3.tar.gz", hash = "sha256:e05e1af93b977f7eafa636d043f9f94c7ee3ac81af99c13508215942e64c993b", size = 4495223, upload-time = "2025-09-29T23:34:51.853Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/3d/f7/f425a00df4fcc22b292c6895c6831c0c8ae1d9fac1e024d16f98a9ce8749/pandas-2.3.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:376c6446ae31770764215a6c937f72d917f214b43560603cd60da6408f183b6c", size = 11555763, upload-time = "2025-09-29T23:16:53.287Z" },
+ { url = "https://files.pythonhosted.org/packages/13/4f/66d99628ff8ce7857aca52fed8f0066ce209f96be2fede6cef9f84e8d04f/pandas-2.3.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:e19d192383eab2f4ceb30b412b22ea30690c9e618f78870357ae1d682912015a", size = 10801217, upload-time = "2025-09-29T23:17:04.522Z" },
+ { url = "https://files.pythonhosted.org/packages/1d/03/3fc4a529a7710f890a239cc496fc6d50ad4a0995657dccc1d64695adb9f4/pandas-2.3.3-cp310-cp310-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5caf26f64126b6c7aec964f74266f435afef1c1b13da3b0636c7518a1fa3e2b1", size = 12148791, upload-time = "2025-09-29T23:17:18.444Z" },
+ { url = "https://files.pythonhosted.org/packages/40/a8/4dac1f8f8235e5d25b9955d02ff6f29396191d4e665d71122c3722ca83c5/pandas-2.3.3-cp310-cp310-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:dd7478f1463441ae4ca7308a70e90b33470fa593429f9d4c578dd00d1fa78838", size = 12769373, upload-time = "2025-09-29T23:17:35.846Z" },
+ { url = "https://files.pythonhosted.org/packages/df/91/82cc5169b6b25440a7fc0ef3a694582418d875c8e3ebf796a6d6470aa578/pandas-2.3.3-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:4793891684806ae50d1288c9bae9330293ab4e083ccd1c5e383c34549c6e4250", size = 13200444, upload-time = "2025-09-29T23:17:49.341Z" },
+ { url = "https://files.pythonhosted.org/packages/10/ae/89b3283800ab58f7af2952704078555fa60c807fff764395bb57ea0b0dbd/pandas-2.3.3-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:28083c648d9a99a5dd035ec125d42439c6c1c525098c58af0fc38dd1a7a1b3d4", size = 13858459, upload-time = "2025-09-29T23:18:03.722Z" },
+ { url = "https://files.pythonhosted.org/packages/85/72/530900610650f54a35a19476eca5104f38555afccda1aa11a92ee14cb21d/pandas-2.3.3-cp310-cp310-win_amd64.whl", hash = "sha256:503cf027cf9940d2ceaa1a93cfb5f8c8c7e6e90720a2850378f0b3f3b1e06826", size = 11346086, upload-time = "2025-09-29T23:18:18.505Z" },
+ { url = "https://files.pythonhosted.org/packages/c1/fa/7ac648108144a095b4fb6aa3de1954689f7af60a14cf25583f4960ecb878/pandas-2.3.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:602b8615ebcc4a0c1751e71840428ddebeb142ec02c786e8ad6b1ce3c8dec523", size = 11578790, upload-time = "2025-09-29T23:18:30.065Z" },
+ { url = "https://files.pythonhosted.org/packages/9b/35/74442388c6cf008882d4d4bdfc4109be87e9b8b7ccd097ad1e7f006e2e95/pandas-2.3.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:8fe25fc7b623b0ef6b5009149627e34d2a4657e880948ec3c840e9402e5c1b45", size = 10833831, upload-time = "2025-09-29T23:38:56.071Z" },
+ { url = "https://files.pythonhosted.org/packages/fe/e4/de154cbfeee13383ad58d23017da99390b91d73f8c11856f2095e813201b/pandas-2.3.3-cp311-cp311-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b468d3dad6ff947df92dcb32ede5b7bd41a9b3cceef0a30ed925f6d01fb8fa66", size = 12199267, upload-time = "2025-09-29T23:18:41.627Z" },
+ { url = "https://files.pythonhosted.org/packages/bf/c9/63f8d545568d9ab91476b1818b4741f521646cbdd151c6efebf40d6de6f7/pandas-2.3.3-cp311-cp311-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b98560e98cb334799c0b07ca7967ac361a47326e9b4e5a7dfb5ab2b1c9d35a1b", size = 12789281, upload-time = "2025-09-29T23:18:56.834Z" },
+ { url = "https://files.pythonhosted.org/packages/f2/00/a5ac8c7a0e67fd1a6059e40aa08fa1c52cc00709077d2300e210c3ce0322/pandas-2.3.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:1d37b5848ba49824e5c30bedb9c830ab9b7751fd049bc7914533e01c65f79791", size = 13240453, upload-time = "2025-09-29T23:19:09.247Z" },
+ { url = "https://files.pythonhosted.org/packages/27/4d/5c23a5bc7bd209231618dd9e606ce076272c9bc4f12023a70e03a86b4067/pandas-2.3.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:db4301b2d1f926ae677a751eb2bd0e8c5f5319c9cb3f88b0becbbb0b07b34151", size = 13890361, upload-time = "2025-09-29T23:19:25.342Z" },
+ { url = "https://files.pythonhosted.org/packages/8e/59/712db1d7040520de7a4965df15b774348980e6df45c129b8c64d0dbe74ef/pandas-2.3.3-cp311-cp311-win_amd64.whl", hash = "sha256:f086f6fe114e19d92014a1966f43a3e62285109afe874f067f5abbdcbb10e59c", size = 11348702, upload-time = "2025-09-29T23:19:38.296Z" },
+ { url = "https://files.pythonhosted.org/packages/9c/fb/231d89e8637c808b997d172b18e9d4a4bc7bf31296196c260526055d1ea0/pandas-2.3.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:6d21f6d74eb1725c2efaa71a2bfc661a0689579b58e9c0ca58a739ff0b002b53", size = 11597846, upload-time = "2025-09-29T23:19:48.856Z" },
+ { url = "https://files.pythonhosted.org/packages/5c/bd/bf8064d9cfa214294356c2d6702b716d3cf3bb24be59287a6a21e24cae6b/pandas-2.3.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:3fd2f887589c7aa868e02632612ba39acb0b8948faf5cc58f0850e165bd46f35", size = 10729618, upload-time = "2025-09-29T23:39:08.659Z" },
+ { url = "https://files.pythonhosted.org/packages/57/56/cf2dbe1a3f5271370669475ead12ce77c61726ffd19a35546e31aa8edf4e/pandas-2.3.3-cp312-cp312-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ecaf1e12bdc03c86ad4a7ea848d66c685cb6851d807a26aa245ca3d2017a1908", size = 11737212, upload-time = "2025-09-29T23:19:59.765Z" },
+ { url = "https://files.pythonhosted.org/packages/e5/63/cd7d615331b328e287d8233ba9fdf191a9c2d11b6af0c7a59cfcec23de68/pandas-2.3.3-cp312-cp312-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b3d11d2fda7eb164ef27ffc14b4fcab16a80e1ce67e9f57e19ec0afaf715ba89", size = 12362693, upload-time = "2025-09-29T23:20:14.098Z" },
+ { url = "https://files.pythonhosted.org/packages/a6/de/8b1895b107277d52f2b42d3a6806e69cfef0d5cf1d0ba343470b9d8e0a04/pandas-2.3.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:a68e15f780eddf2b07d242e17a04aa187a7ee12b40b930bfdd78070556550e98", size = 12771002, upload-time = "2025-09-29T23:20:26.76Z" },
+ { url = "https://files.pythonhosted.org/packages/87/21/84072af3187a677c5893b170ba2c8fbe450a6ff911234916da889b698220/pandas-2.3.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:371a4ab48e950033bcf52b6527eccb564f52dc826c02afd9a1bc0ab731bba084", size = 13450971, upload-time = "2025-09-29T23:20:41.344Z" },
+ { url = "https://files.pythonhosted.org/packages/86/41/585a168330ff063014880a80d744219dbf1dd7a1c706e75ab3425a987384/pandas-2.3.3-cp312-cp312-win_amd64.whl", hash = "sha256:a16dcec078a01eeef8ee61bf64074b4e524a2a3f4b3be9326420cabe59c4778b", size = 10992722, upload-time = "2025-09-29T23:20:54.139Z" },
+ { url = "https://files.pythonhosted.org/packages/cd/4b/18b035ee18f97c1040d94debd8f2e737000ad70ccc8f5513f4eefad75f4b/pandas-2.3.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:56851a737e3470de7fa88e6131f41281ed440d29a9268dcbf0002da5ac366713", size = 11544671, upload-time = "2025-09-29T23:21:05.024Z" },
+ { url = "https://files.pythonhosted.org/packages/31/94/72fac03573102779920099bcac1c3b05975c2cb5f01eac609faf34bed1ca/pandas-2.3.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:bdcd9d1167f4885211e401b3036c0c8d9e274eee67ea8d0758a256d60704cfe8", size = 10680807, upload-time = "2025-09-29T23:21:15.979Z" },
+ { url = "https://files.pythonhosted.org/packages/16/87/9472cf4a487d848476865321de18cc8c920b8cab98453ab79dbbc98db63a/pandas-2.3.3-cp313-cp313-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e32e7cc9af0f1cc15548288a51a3b681cc2a219faa838e995f7dc53dbab1062d", size = 11709872, upload-time = "2025-09-29T23:21:27.165Z" },
+ { url = "https://files.pythonhosted.org/packages/15/07/284f757f63f8a8d69ed4472bfd85122bd086e637bf4ed09de572d575a693/pandas-2.3.3-cp313-cp313-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:318d77e0e42a628c04dc56bcef4b40de67918f7041c2b061af1da41dcff670ac", size = 12306371, upload-time = "2025-09-29T23:21:40.532Z" },
+ { url = "https://files.pythonhosted.org/packages/33/81/a3afc88fca4aa925804a27d2676d22dcd2031c2ebe08aabd0ae55b9ff282/pandas-2.3.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:4e0a175408804d566144e170d0476b15d78458795bb18f1304fb94160cabf40c", size = 12765333, upload-time = "2025-09-29T23:21:55.77Z" },
+ { url = "https://files.pythonhosted.org/packages/8d/0f/b4d4ae743a83742f1153464cf1a8ecfafc3ac59722a0b5c8602310cb7158/pandas-2.3.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:93c2d9ab0fc11822b5eece72ec9587e172f63cff87c00b062f6e37448ced4493", size = 13418120, upload-time = "2025-09-29T23:22:10.109Z" },
+ { url = "https://files.pythonhosted.org/packages/4f/c7/e54682c96a895d0c808453269e0b5928a07a127a15704fedb643e9b0a4c8/pandas-2.3.3-cp313-cp313-win_amd64.whl", hash = "sha256:f8bfc0e12dc78f777f323f55c58649591b2cd0c43534e8355c51d3fede5f4dee", size = 10993991, upload-time = "2025-09-29T23:25:04.889Z" },
+ { url = "https://files.pythonhosted.org/packages/f9/ca/3f8d4f49740799189e1395812f3bf23b5e8fc7c190827d55a610da72ce55/pandas-2.3.3-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:75ea25f9529fdec2d2e93a42c523962261e567d250b0013b16210e1d40d7c2e5", size = 12048227, upload-time = "2025-09-29T23:22:24.343Z" },
+ { url = "https://files.pythonhosted.org/packages/0e/5a/f43efec3e8c0cc92c4663ccad372dbdff72b60bdb56b2749f04aa1d07d7e/pandas-2.3.3-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:74ecdf1d301e812db96a465a525952f4dde225fdb6d8e5a521d47e1f42041e21", size = 11411056, upload-time = "2025-09-29T23:22:37.762Z" },
+ { url = "https://files.pythonhosted.org/packages/46/b1/85331edfc591208c9d1a63a06baa67b21d332e63b7a591a5ba42a10bb507/pandas-2.3.3-cp313-cp313t-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6435cb949cb34ec11cc9860246ccb2fdc9ecd742c12d3304989017d53f039a78", size = 11645189, upload-time = "2025-09-29T23:22:51.688Z" },
+ { url = "https://files.pythonhosted.org/packages/44/23/78d645adc35d94d1ac4f2a3c4112ab6f5b8999f4898b8cdf01252f8df4a9/pandas-2.3.3-cp313-cp313t-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:900f47d8f20860de523a1ac881c4c36d65efcb2eb850e6948140fa781736e110", size = 12121912, upload-time = "2025-09-29T23:23:05.042Z" },
+ { url = "https://files.pythonhosted.org/packages/53/da/d10013df5e6aaef6b425aa0c32e1fc1f3e431e4bcabd420517dceadce354/pandas-2.3.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:a45c765238e2ed7d7c608fc5bc4a6f88b642f2f01e70c0c23d2224dd21829d86", size = 12712160, upload-time = "2025-09-29T23:23:28.57Z" },
+ { url = "https://files.pythonhosted.org/packages/bd/17/e756653095a083d8a37cbd816cb87148debcfcd920129b25f99dd8d04271/pandas-2.3.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:c4fc4c21971a1a9f4bdb4c73978c7f7256caa3e62b323f70d6cb80db583350bc", size = 13199233, upload-time = "2025-09-29T23:24:24.876Z" },
+ { url = "https://files.pythonhosted.org/packages/04/fd/74903979833db8390b73b3a8a7d30d146d710bd32703724dd9083950386f/pandas-2.3.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:ee15f284898e7b246df8087fc82b87b01686f98ee67d85a17b7ab44143a3a9a0", size = 11540635, upload-time = "2025-09-29T23:25:52.486Z" },
+ { url = "https://files.pythonhosted.org/packages/21/00/266d6b357ad5e6d3ad55093a7e8efc7dd245f5a842b584db9f30b0f0a287/pandas-2.3.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:1611aedd912e1ff81ff41c745822980c49ce4a7907537be8692c8dbc31924593", size = 10759079, upload-time = "2025-09-29T23:26:33.204Z" },
+ { url = "https://files.pythonhosted.org/packages/ca/05/d01ef80a7a3a12b2f8bbf16daba1e17c98a2f039cbc8e2f77a2c5a63d382/pandas-2.3.3-cp314-cp314-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6d2cefc361461662ac48810cb14365a365ce864afe85ef1f447ff5a1e99ea81c", size = 11814049, upload-time = "2025-09-29T23:27:15.384Z" },
+ { url = "https://files.pythonhosted.org/packages/15/b2/0e62f78c0c5ba7e3d2c5945a82456f4fac76c480940f805e0b97fcbc2f65/pandas-2.3.3-cp314-cp314-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ee67acbbf05014ea6c763beb097e03cd629961c8a632075eeb34247120abcb4b", size = 12332638, upload-time = "2025-09-29T23:27:51.625Z" },
+ { url = "https://files.pythonhosted.org/packages/c5/33/dd70400631b62b9b29c3c93d2feee1d0964dc2bae2e5ad7a6c73a7f25325/pandas-2.3.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:c46467899aaa4da076d5abc11084634e2d197e9460643dd455ac3db5856b24d6", size = 12886834, upload-time = "2025-09-29T23:28:21.289Z" },
+ { url = "https://files.pythonhosted.org/packages/d3/18/b5d48f55821228d0d2692b34fd5034bb185e854bdb592e9c640f6290e012/pandas-2.3.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:6253c72c6a1d990a410bc7de641d34053364ef8bcd3126f7e7450125887dffe3", size = 13409925, upload-time = "2025-09-29T23:28:58.261Z" },
+ { url = "https://files.pythonhosted.org/packages/a6/3d/124ac75fcd0ecc09b8fdccb0246ef65e35b012030defb0e0eba2cbbbe948/pandas-2.3.3-cp314-cp314-win_amd64.whl", hash = "sha256:1b07204a219b3b7350abaae088f451860223a52cfb8a6c53358e7948735158e5", size = 11109071, upload-time = "2025-09-29T23:32:27.484Z" },
+ { url = "https://files.pythonhosted.org/packages/89/9c/0e21c895c38a157e0faa1fb64587a9226d6dd46452cac4532d80c3c4a244/pandas-2.3.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:2462b1a365b6109d275250baaae7b760fd25c726aaca0054649286bcfbb3e8ec", size = 12048504, upload-time = "2025-09-29T23:29:31.47Z" },
+ { url = "https://files.pythonhosted.org/packages/d7/82/b69a1c95df796858777b68fbe6a81d37443a33319761d7c652ce77797475/pandas-2.3.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:0242fe9a49aa8b4d78a4fa03acb397a58833ef6199e9aa40a95f027bb3a1b6e7", size = 11410702, upload-time = "2025-09-29T23:29:54.591Z" },
+ { url = "https://files.pythonhosted.org/packages/f9/88/702bde3ba0a94b8c73a0181e05144b10f13f29ebfc2150c3a79062a8195d/pandas-2.3.3-cp314-cp314t-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:a21d830e78df0a515db2b3d2f5570610f5e6bd2e27749770e8bb7b524b89b450", size = 11634535, upload-time = "2025-09-29T23:30:21.003Z" },
+ { url = "https://files.pythonhosted.org/packages/a4/1e/1bac1a839d12e6a82ec6cb40cda2edde64a2013a66963293696bbf31fbbb/pandas-2.3.3-cp314-cp314t-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:2e3ebdb170b5ef78f19bfb71b0dc5dc58775032361fa188e814959b74d726dd5", size = 12121582, upload-time = "2025-09-29T23:30:43.391Z" },
+ { url = "https://files.pythonhosted.org/packages/44/91/483de934193e12a3b1d6ae7c8645d083ff88dec75f46e827562f1e4b4da6/pandas-2.3.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:d051c0e065b94b7a3cea50eb1ec32e912cd96dba41647eb24104b6c6c14c5788", size = 12699963, upload-time = "2025-09-29T23:31:10.009Z" },
+ { url = "https://files.pythonhosted.org/packages/70/44/5191d2e4026f86a2a109053e194d3ba7a31a2d10a9c2348368c63ed4e85a/pandas-2.3.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:3869faf4bd07b3b66a9f462417d0ca3a9df29a9f6abd5d0d0dbab15dac7abe87", size = 13202175, upload-time = "2025-09-29T23:31:59.173Z" },
+]
+
+[[package]]
+name = "pillow"
+version = "12.0.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/5a/b0/cace85a1b0c9775a9f8f5d5423c8261c858760e2466c79b2dd184638b056/pillow-12.0.0.tar.gz", hash = "sha256:87d4f8125c9988bfbed67af47dd7a953e2fc7b0cc1e7800ec6d2080d490bb353", size = 47008828, upload-time = "2025-10-15T18:24:14.008Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/5d/08/26e68b6b5da219c2a2cb7b563af008b53bb8e6b6fcb3fa40715fcdb2523a/pillow-12.0.0-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:3adfb466bbc544b926d50fe8f4a4e6abd8c6bffd28a26177594e6e9b2b76572b", size = 5289809, upload-time = "2025-10-15T18:21:27.791Z" },
+ { url = "https://files.pythonhosted.org/packages/cb/e9/4e58fb097fb74c7b4758a680aacd558810a417d1edaa7000142976ef9d2f/pillow-12.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:1ac11e8ea4f611c3c0147424eae514028b5e9077dd99ab91e1bd7bc33ff145e1", size = 4650606, upload-time = "2025-10-15T18:21:29.823Z" },
+ { url = "https://files.pythonhosted.org/packages/4b/e0/1fa492aa9f77b3bc6d471c468e62bfea1823056bf7e5e4f1914d7ab2565e/pillow-12.0.0-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:d49e2314c373f4c2b39446fb1a45ed333c850e09d0c59ac79b72eb3b95397363", size = 6221023, upload-time = "2025-10-15T18:21:31.415Z" },
+ { url = "https://files.pythonhosted.org/packages/c1/09/4de7cd03e33734ccd0c876f0251401f1314e819cbfd89a0fcb6e77927cc6/pillow-12.0.0-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:c7b2a63fd6d5246349f3d3f37b14430d73ee7e8173154461785e43036ffa96ca", size = 8024937, upload-time = "2025-10-15T18:21:33.453Z" },
+ { url = "https://files.pythonhosted.org/packages/2e/69/0688e7c1390666592876d9d474f5e135abb4acb39dcb583c4dc5490f1aff/pillow-12.0.0-cp310-cp310-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d64317d2587c70324b79861babb9c09f71fbb780bad212018874b2c013d8600e", size = 6334139, upload-time = "2025-10-15T18:21:35.395Z" },
+ { url = "https://files.pythonhosted.org/packages/ed/1c/880921e98f525b9b44ce747ad1ea8f73fd7e992bafe3ca5e5644bf433dea/pillow-12.0.0-cp310-cp310-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d77153e14b709fd8b8af6f66a3afbb9ed6e9fc5ccf0b6b7e1ced7b036a228782", size = 7026074, upload-time = "2025-10-15T18:21:37.219Z" },
+ { url = "https://files.pythonhosted.org/packages/28/03/96f718331b19b355610ef4ebdbbde3557c726513030665071fd025745671/pillow-12.0.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:32ed80ea8a90ee3e6fa08c21e2e091bba6eda8eccc83dbc34c95169507a91f10", size = 6448852, upload-time = "2025-10-15T18:21:39.168Z" },
+ { url = "https://files.pythonhosted.org/packages/3a/a0/6a193b3f0cc9437b122978d2c5cbce59510ccf9a5b48825096ed7472da2f/pillow-12.0.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:c828a1ae702fc712978bda0320ba1b9893d99be0badf2647f693cc01cf0f04fa", size = 7117058, upload-time = "2025-10-15T18:21:40.997Z" },
+ { url = "https://files.pythonhosted.org/packages/a7/c4/043192375eaa4463254e8e61f0e2ec9a846b983929a8d0a7122e0a6d6fff/pillow-12.0.0-cp310-cp310-win32.whl", hash = "sha256:bd87e140e45399c818fac4247880b9ce719e4783d767e030a883a970be632275", size = 6295431, upload-time = "2025-10-15T18:21:42.518Z" },
+ { url = "https://files.pythonhosted.org/packages/92/c6/c2f2fc7e56301c21827e689bb8b0b465f1b52878b57471a070678c0c33cd/pillow-12.0.0-cp310-cp310-win_amd64.whl", hash = "sha256:455247ac8a4cfb7b9bc45b7e432d10421aea9fc2e74d285ba4072688a74c2e9d", size = 7000412, upload-time = "2025-10-15T18:21:44.404Z" },
+ { url = "https://files.pythonhosted.org/packages/b2/d2/5f675067ba82da7a1c238a73b32e3fd78d67f9d9f80fbadd33a40b9c0481/pillow-12.0.0-cp310-cp310-win_arm64.whl", hash = "sha256:6ace95230bfb7cd79ef66caa064bbe2f2a1e63d93471c3a2e1f1348d9f22d6b7", size = 2435903, upload-time = "2025-10-15T18:21:46.29Z" },
+ { url = "https://files.pythonhosted.org/packages/0e/5a/a2f6773b64edb921a756eb0729068acad9fc5208a53f4a349396e9436721/pillow-12.0.0-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:0fd00cac9c03256c8b2ff58f162ebcd2587ad3e1f2e397eab718c47e24d231cc", size = 5289798, upload-time = "2025-10-15T18:21:47.763Z" },
+ { url = "https://files.pythonhosted.org/packages/2e/05/069b1f8a2e4b5a37493da6c5868531c3f77b85e716ad7a590ef87d58730d/pillow-12.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:a3475b96f5908b3b16c47533daaa87380c491357d197564e0ba34ae75c0f3257", size = 4650589, upload-time = "2025-10-15T18:21:49.515Z" },
+ { url = "https://files.pythonhosted.org/packages/61/e3/2c820d6e9a36432503ead175ae294f96861b07600a7156154a086ba7111a/pillow-12.0.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:110486b79f2d112cf6add83b28b627e369219388f64ef2f960fef9ebaf54c642", size = 6230472, upload-time = "2025-10-15T18:21:51.052Z" },
+ { url = "https://files.pythonhosted.org/packages/4f/89/63427f51c64209c5e23d4d52071c8d0f21024d3a8a487737caaf614a5795/pillow-12.0.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:5269cc1caeedb67e6f7269a42014f381f45e2e7cd42d834ede3c703a1d915fe3", size = 8033887, upload-time = "2025-10-15T18:21:52.604Z" },
+ { url = "https://files.pythonhosted.org/packages/f6/1b/c9711318d4901093c15840f268ad649459cd81984c9ec9887756cca049a5/pillow-12.0.0-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:aa5129de4e174daccbc59d0a3b6d20eaf24417d59851c07ebb37aeb02947987c", size = 6343964, upload-time = "2025-10-15T18:21:54.619Z" },
+ { url = "https://files.pythonhosted.org/packages/41/1e/db9470f2d030b4995083044cd8738cdd1bf773106819f6d8ba12597d5352/pillow-12.0.0-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bee2a6db3a7242ea309aa7ee8e2780726fed67ff4e5b40169f2c940e7eb09227", size = 7034756, upload-time = "2025-10-15T18:21:56.151Z" },
+ { url = "https://files.pythonhosted.org/packages/cc/b0/6177a8bdd5ee4ed87cba2de5a3cc1db55ffbbec6176784ce5bb75aa96798/pillow-12.0.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:90387104ee8400a7b4598253b4c406f8958f59fcf983a6cea2b50d59f7d63d0b", size = 6458075, upload-time = "2025-10-15T18:21:57.759Z" },
+ { url = "https://files.pythonhosted.org/packages/bc/5e/61537aa6fa977922c6a03253a0e727e6e4a72381a80d63ad8eec350684f2/pillow-12.0.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:bc91a56697869546d1b8f0a3ff35224557ae7f881050e99f615e0119bf934b4e", size = 7125955, upload-time = "2025-10-15T18:21:59.372Z" },
+ { url = "https://files.pythonhosted.org/packages/1f/3d/d5033539344ee3cbd9a4d69e12e63ca3a44a739eb2d4c8da350a3d38edd7/pillow-12.0.0-cp311-cp311-win32.whl", hash = "sha256:27f95b12453d165099c84f8a8bfdfd46b9e4bda9e0e4b65f0635430027f55739", size = 6298440, upload-time = "2025-10-15T18:22:00.982Z" },
+ { url = "https://files.pythonhosted.org/packages/4d/42/aaca386de5cc8bd8a0254516957c1f265e3521c91515b16e286c662854c4/pillow-12.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:b583dc9070312190192631373c6c8ed277254aa6e6084b74bdd0a6d3b221608e", size = 6999256, upload-time = "2025-10-15T18:22:02.617Z" },
+ { url = "https://files.pythonhosted.org/packages/ba/f1/9197c9c2d5708b785f631a6dfbfa8eb3fb9672837cb92ae9af812c13b4ed/pillow-12.0.0-cp311-cp311-win_arm64.whl", hash = "sha256:759de84a33be3b178a64c8ba28ad5c135900359e85fb662bc6e403ad4407791d", size = 2436025, upload-time = "2025-10-15T18:22:04.598Z" },
+ { url = "https://files.pythonhosted.org/packages/2c/90/4fcce2c22caf044e660a198d740e7fbc14395619e3cb1abad12192c0826c/pillow-12.0.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:53561a4ddc36facb432fae7a9d8afbfaf94795414f5cdc5fc52f28c1dca90371", size = 5249377, upload-time = "2025-10-15T18:22:05.993Z" },
+ { url = "https://files.pythonhosted.org/packages/fd/e0/ed960067543d080691d47d6938ebccbf3976a931c9567ab2fbfab983a5dd/pillow-12.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:71db6b4c1653045dacc1585c1b0d184004f0d7e694c7b34ac165ca70c0838082", size = 4650343, upload-time = "2025-10-15T18:22:07.718Z" },
+ { url = "https://files.pythonhosted.org/packages/e7/a1/f81fdeddcb99c044bf7d6faa47e12850f13cee0849537a7d27eeab5534d4/pillow-12.0.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:2fa5f0b6716fc88f11380b88b31fe591a06c6315e955c096c35715788b339e3f", size = 6232981, upload-time = "2025-10-15T18:22:09.287Z" },
+ { url = "https://files.pythonhosted.org/packages/88/e1/9098d3ce341a8750b55b0e00c03f1630d6178f38ac191c81c97a3b047b44/pillow-12.0.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:82240051c6ca513c616f7f9da06e871f61bfd7805f566275841af15015b8f98d", size = 8041399, upload-time = "2025-10-15T18:22:10.872Z" },
+ { url = "https://files.pythonhosted.org/packages/a7/62/a22e8d3b602ae8cc01446d0c57a54e982737f44b6f2e1e019a925143771d/pillow-12.0.0-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:55f818bd74fe2f11d4d7cbc65880a843c4075e0ac7226bc1a23261dbea531953", size = 6347740, upload-time = "2025-10-15T18:22:12.769Z" },
+ { url = "https://files.pythonhosted.org/packages/4f/87/424511bdcd02c8d7acf9f65caa09f291a519b16bd83c3fb3374b3d4ae951/pillow-12.0.0-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b87843e225e74576437fd5b6a4c2205d422754f84a06942cfaf1dc32243e45a8", size = 7040201, upload-time = "2025-10-15T18:22:14.813Z" },
+ { url = "https://files.pythonhosted.org/packages/dc/4d/435c8ac688c54d11755aedfdd9f29c9eeddf68d150fe42d1d3dbd2365149/pillow-12.0.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:c607c90ba67533e1b2355b821fef6764d1dd2cbe26b8c1005ae84f7aea25ff79", size = 6462334, upload-time = "2025-10-15T18:22:16.375Z" },
+ { url = "https://files.pythonhosted.org/packages/2b/f2/ad34167a8059a59b8ad10bc5c72d4d9b35acc6b7c0877af8ac885b5f2044/pillow-12.0.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:21f241bdd5080a15bc86d3466a9f6074a9c2c2b314100dd896ac81ee6db2f1ba", size = 7134162, upload-time = "2025-10-15T18:22:17.996Z" },
+ { url = "https://files.pythonhosted.org/packages/0c/b1/a7391df6adacf0a5c2cf6ac1cf1fcc1369e7d439d28f637a847f8803beb3/pillow-12.0.0-cp312-cp312-win32.whl", hash = "sha256:dd333073e0cacdc3089525c7df7d39b211bcdf31fc2824e49d01c6b6187b07d0", size = 6298769, upload-time = "2025-10-15T18:22:19.923Z" },
+ { url = "https://files.pythonhosted.org/packages/a2/0b/d87733741526541c909bbf159e338dcace4f982daac6e5a8d6be225ca32d/pillow-12.0.0-cp312-cp312-win_amd64.whl", hash = "sha256:9fe611163f6303d1619bbcb653540a4d60f9e55e622d60a3108be0d5b441017a", size = 7001107, upload-time = "2025-10-15T18:22:21.644Z" },
+ { url = "https://files.pythonhosted.org/packages/bc/96/aaa61ce33cc98421fb6088af2a03be4157b1e7e0e87087c888e2370a7f45/pillow-12.0.0-cp312-cp312-win_arm64.whl", hash = "sha256:7dfb439562f234f7d57b1ac6bc8fe7f838a4bd49c79230e0f6a1da93e82f1fad", size = 2436012, upload-time = "2025-10-15T18:22:23.621Z" },
+ { url = "https://files.pythonhosted.org/packages/62/f2/de993bb2d21b33a98d031ecf6a978e4b61da207bef02f7b43093774c480d/pillow-12.0.0-cp313-cp313-ios_13_0_arm64_iphoneos.whl", hash = "sha256:0869154a2d0546545cde61d1789a6524319fc1897d9ee31218eae7a60ccc5643", size = 4045493, upload-time = "2025-10-15T18:22:25.758Z" },
+ { url = "https://files.pythonhosted.org/packages/0e/b6/bc8d0c4c9f6f111a783d045310945deb769b806d7574764234ffd50bc5ea/pillow-12.0.0-cp313-cp313-ios_13_0_arm64_iphonesimulator.whl", hash = "sha256:a7921c5a6d31b3d756ec980f2f47c0cfdbce0fc48c22a39347a895f41f4a6ea4", size = 4120461, upload-time = "2025-10-15T18:22:27.286Z" },
+ { url = "https://files.pythonhosted.org/packages/5d/57/d60d343709366a353dc56adb4ee1e7d8a2cc34e3fbc22905f4167cfec119/pillow-12.0.0-cp313-cp313-ios_13_0_x86_64_iphonesimulator.whl", hash = "sha256:1ee80a59f6ce048ae13cda1abf7fbd2a34ab9ee7d401c46be3ca685d1999a399", size = 3576912, upload-time = "2025-10-15T18:22:28.751Z" },
+ { url = "https://files.pythonhosted.org/packages/a4/a4/a0a31467e3f83b94d37568294b01d22b43ae3c5d85f2811769b9c66389dd/pillow-12.0.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:c50f36a62a22d350c96e49ad02d0da41dbd17ddc2e29750dbdba4323f85eb4a5", size = 5249132, upload-time = "2025-10-15T18:22:30.641Z" },
+ { url = "https://files.pythonhosted.org/packages/83/06/48eab21dd561de2914242711434c0c0eb992ed08ff3f6107a5f44527f5e9/pillow-12.0.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:5193fde9a5f23c331ea26d0cf171fbf67e3f247585f50c08b3e205c7aeb4589b", size = 4650099, upload-time = "2025-10-15T18:22:32.73Z" },
+ { url = "https://files.pythonhosted.org/packages/fc/bd/69ed99fd46a8dba7c1887156d3572fe4484e3f031405fcc5a92e31c04035/pillow-12.0.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:bde737cff1a975b70652b62d626f7785e0480918dece11e8fef3c0cf057351c3", size = 6230808, upload-time = "2025-10-15T18:22:34.337Z" },
+ { url = "https://files.pythonhosted.org/packages/ea/94/8fad659bcdbf86ed70099cb60ae40be6acca434bbc8c4c0d4ef356d7e0de/pillow-12.0.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:a6597ff2b61d121172f5844b53f21467f7082f5fb385a9a29c01414463f93b07", size = 8037804, upload-time = "2025-10-15T18:22:36.402Z" },
+ { url = "https://files.pythonhosted.org/packages/20/39/c685d05c06deecfd4e2d1950e9a908aa2ca8bc4e6c3b12d93b9cafbd7837/pillow-12.0.0-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0b817e7035ea7f6b942c13aa03bb554fc44fea70838ea21f8eb31c638326584e", size = 6345553, upload-time = "2025-10-15T18:22:38.066Z" },
+ { url = "https://files.pythonhosted.org/packages/38/57/755dbd06530a27a5ed74f8cb0a7a44a21722ebf318edbe67ddbd7fb28f88/pillow-12.0.0-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f4f1231b7dec408e8670264ce63e9c71409d9583dd21d32c163e25213ee2a344", size = 7037729, upload-time = "2025-10-15T18:22:39.769Z" },
+ { url = "https://files.pythonhosted.org/packages/ca/b6/7e94f4c41d238615674d06ed677c14883103dce1c52e4af16f000338cfd7/pillow-12.0.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:6e51b71417049ad6ab14c49608b4a24d8fb3fe605e5dfabfe523b58064dc3d27", size = 6459789, upload-time = "2025-10-15T18:22:41.437Z" },
+ { url = "https://files.pythonhosted.org/packages/9c/14/4448bb0b5e0f22dd865290536d20ec8a23b64e2d04280b89139f09a36bb6/pillow-12.0.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:d120c38a42c234dc9a8c5de7ceaaf899cf33561956acb4941653f8bdc657aa79", size = 7130917, upload-time = "2025-10-15T18:22:43.152Z" },
+ { url = "https://files.pythonhosted.org/packages/dd/ca/16c6926cc1c015845745d5c16c9358e24282f1e588237a4c36d2b30f182f/pillow-12.0.0-cp313-cp313-win32.whl", hash = "sha256:4cc6b3b2efff105c6a1656cfe59da4fdde2cda9af1c5e0b58529b24525d0a098", size = 6302391, upload-time = "2025-10-15T18:22:44.753Z" },
+ { url = "https://files.pythonhosted.org/packages/6d/2a/dd43dcfd6dae9b6a49ee28a8eedb98c7d5ff2de94a5d834565164667b97b/pillow-12.0.0-cp313-cp313-win_amd64.whl", hash = "sha256:4cf7fed4b4580601c4345ceb5d4cbf5a980d030fd5ad07c4d2ec589f95f09905", size = 7007477, upload-time = "2025-10-15T18:22:46.838Z" },
+ { url = "https://files.pythonhosted.org/packages/77/f0/72ea067f4b5ae5ead653053212af05ce3705807906ba3f3e8f58ddf617e6/pillow-12.0.0-cp313-cp313-win_arm64.whl", hash = "sha256:9f0b04c6b8584c2c193babcccc908b38ed29524b29dd464bc8801bf10d746a3a", size = 2435918, upload-time = "2025-10-15T18:22:48.399Z" },
+ { url = "https://files.pythonhosted.org/packages/f5/5e/9046b423735c21f0487ea6cb5b10f89ea8f8dfbe32576fe052b5ba9d4e5b/pillow-12.0.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:7fa22993bac7b77b78cae22bad1e2a987ddf0d9015c63358032f84a53f23cdc3", size = 5251406, upload-time = "2025-10-15T18:22:49.905Z" },
+ { url = "https://files.pythonhosted.org/packages/12/66/982ceebcdb13c97270ef7a56c3969635b4ee7cd45227fa707c94719229c5/pillow-12.0.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:f135c702ac42262573fe9714dfe99c944b4ba307af5eb507abef1667e2cbbced", size = 4653218, upload-time = "2025-10-15T18:22:51.587Z" },
+ { url = "https://files.pythonhosted.org/packages/16/b3/81e625524688c31859450119bf12674619429cab3119eec0e30a7a1029cb/pillow-12.0.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:c85de1136429c524e55cfa4e033b4a7940ac5c8ee4d9401cc2d1bf48154bbc7b", size = 6266564, upload-time = "2025-10-15T18:22:53.215Z" },
+ { url = "https://files.pythonhosted.org/packages/98/59/dfb38f2a41240d2408096e1a76c671d0a105a4a8471b1871c6902719450c/pillow-12.0.0-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:38df9b4bfd3db902c9c2bd369bcacaf9d935b2fff73709429d95cc41554f7b3d", size = 8069260, upload-time = "2025-10-15T18:22:54.933Z" },
+ { url = "https://files.pythonhosted.org/packages/dc/3d/378dbea5cd1874b94c312425ca77b0f47776c78e0df2df751b820c8c1d6c/pillow-12.0.0-cp313-cp313t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:7d87ef5795da03d742bf49439f9ca4d027cde49c82c5371ba52464aee266699a", size = 6379248, upload-time = "2025-10-15T18:22:56.605Z" },
+ { url = "https://files.pythonhosted.org/packages/84/b0/d525ef47d71590f1621510327acec75ae58c721dc071b17d8d652ca494d8/pillow-12.0.0-cp313-cp313t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:aff9e4d82d082ff9513bdd6acd4f5bd359f5b2c870907d2b0a9c5e10d40c88fe", size = 7066043, upload-time = "2025-10-15T18:22:58.53Z" },
+ { url = "https://files.pythonhosted.org/packages/61/2c/aced60e9cf9d0cde341d54bf7932c9ffc33ddb4a1595798b3a5150c7ec4e/pillow-12.0.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:8d8ca2b210ada074d57fcee40c30446c9562e542fc46aedc19baf758a93532ee", size = 6490915, upload-time = "2025-10-15T18:23:00.582Z" },
+ { url = "https://files.pythonhosted.org/packages/ef/26/69dcb9b91f4e59f8f34b2332a4a0a951b44f547c4ed39d3e4dcfcff48f89/pillow-12.0.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:99a7f72fb6249302aa62245680754862a44179b545ded638cf1fef59befb57ef", size = 7157998, upload-time = "2025-10-15T18:23:02.627Z" },
+ { url = "https://files.pythonhosted.org/packages/61/2b/726235842220ca95fa441ddf55dd2382b52ab5b8d9c0596fe6b3f23dafe8/pillow-12.0.0-cp313-cp313t-win32.whl", hash = "sha256:4078242472387600b2ce8d93ade8899c12bf33fa89e55ec89fe126e9d6d5d9e9", size = 6306201, upload-time = "2025-10-15T18:23:04.709Z" },
+ { url = "https://files.pythonhosted.org/packages/c0/3d/2afaf4e840b2df71344ababf2f8edd75a705ce500e5dc1e7227808312ae1/pillow-12.0.0-cp313-cp313t-win_amd64.whl", hash = "sha256:2c54c1a783d6d60595d3514f0efe9b37c8808746a66920315bfd34a938d7994b", size = 7013165, upload-time = "2025-10-15T18:23:06.46Z" },
+ { url = "https://files.pythonhosted.org/packages/6f/75/3fa09aa5cf6ed04bee3fa575798ddf1ce0bace8edb47249c798077a81f7f/pillow-12.0.0-cp313-cp313t-win_arm64.whl", hash = "sha256:26d9f7d2b604cd23aba3e9faf795787456ac25634d82cd060556998e39c6fa47", size = 2437834, upload-time = "2025-10-15T18:23:08.194Z" },
+ { url = "https://files.pythonhosted.org/packages/54/2a/9a8c6ba2c2c07b71bec92cf63e03370ca5e5f5c5b119b742bcc0cde3f9c5/pillow-12.0.0-cp314-cp314-ios_13_0_arm64_iphoneos.whl", hash = "sha256:beeae3f27f62308f1ddbcfb0690bf44b10732f2ef43758f169d5e9303165d3f9", size = 4045531, upload-time = "2025-10-15T18:23:10.121Z" },
+ { url = "https://files.pythonhosted.org/packages/84/54/836fdbf1bfb3d66a59f0189ff0b9f5f666cee09c6188309300df04ad71fa/pillow-12.0.0-cp314-cp314-ios_13_0_arm64_iphonesimulator.whl", hash = "sha256:d4827615da15cd59784ce39d3388275ec093ae3ee8d7f0c089b76fa87af756c2", size = 4120554, upload-time = "2025-10-15T18:23:12.14Z" },
+ { url = "https://files.pythonhosted.org/packages/0d/cd/16aec9f0da4793e98e6b54778a5fbce4f375c6646fe662e80600b8797379/pillow-12.0.0-cp314-cp314-ios_13_0_x86_64_iphonesimulator.whl", hash = "sha256:3e42edad50b6909089750e65c91aa09aaf1e0a71310d383f11321b27c224ed8a", size = 3576812, upload-time = "2025-10-15T18:23:13.962Z" },
+ { url = "https://files.pythonhosted.org/packages/f6/b7/13957fda356dc46339298b351cae0d327704986337c3c69bb54628c88155/pillow-12.0.0-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:e5d8efac84c9afcb40914ab49ba063d94f5dbdf5066db4482c66a992f47a3a3b", size = 5252689, upload-time = "2025-10-15T18:23:15.562Z" },
+ { url = "https://files.pythonhosted.org/packages/fc/f5/eae31a306341d8f331f43edb2e9122c7661b975433de5e447939ae61c5da/pillow-12.0.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:266cd5f2b63ff316d5a1bba46268e603c9caf5606d44f38c2873c380950576ad", size = 4650186, upload-time = "2025-10-15T18:23:17.379Z" },
+ { url = "https://files.pythonhosted.org/packages/86/62/2a88339aa40c4c77e79108facbd307d6091e2c0eb5b8d3cf4977cfca2fe6/pillow-12.0.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:58eea5ebe51504057dd95c5b77d21700b77615ab0243d8152793dc00eb4faf01", size = 6230308, upload-time = "2025-10-15T18:23:18.971Z" },
+ { url = "https://files.pythonhosted.org/packages/c7/33/5425a8992bcb32d1cb9fa3dd39a89e613d09a22f2c8083b7bf43c455f760/pillow-12.0.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:f13711b1a5ba512d647a0e4ba79280d3a9a045aaf7e0cc6fbe96b91d4cdf6b0c", size = 8039222, upload-time = "2025-10-15T18:23:20.909Z" },
+ { url = "https://files.pythonhosted.org/packages/d8/61/3f5d3b35c5728f37953d3eec5b5f3e77111949523bd2dd7f31a851e50690/pillow-12.0.0-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6846bd2d116ff42cba6b646edf5bf61d37e5cbd256425fa089fee4ff5c07a99e", size = 6346657, upload-time = "2025-10-15T18:23:23.077Z" },
+ { url = "https://files.pythonhosted.org/packages/3a/be/ee90a3d79271227e0f0a33c453531efd6ed14b2e708596ba5dd9be948da3/pillow-12.0.0-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c98fa880d695de164b4135a52fd2e9cd7b7c90a9d8ac5e9e443a24a95ef9248e", size = 7038482, upload-time = "2025-10-15T18:23:25.005Z" },
+ { url = "https://files.pythonhosted.org/packages/44/34/a16b6a4d1ad727de390e9bd9f19f5f669e079e5826ec0f329010ddea492f/pillow-12.0.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:fa3ed2a29a9e9d2d488b4da81dcb54720ac3104a20bf0bd273f1e4648aff5af9", size = 6461416, upload-time = "2025-10-15T18:23:27.009Z" },
+ { url = "https://files.pythonhosted.org/packages/b6/39/1aa5850d2ade7d7ba9f54e4e4c17077244ff7a2d9e25998c38a29749eb3f/pillow-12.0.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:d034140032870024e6b9892c692fe2968493790dd57208b2c37e3fb35f6df3ab", size = 7131584, upload-time = "2025-10-15T18:23:29.752Z" },
+ { url = "https://files.pythonhosted.org/packages/bf/db/4fae862f8fad0167073a7733973bfa955f47e2cac3dc3e3e6257d10fab4a/pillow-12.0.0-cp314-cp314-win32.whl", hash = "sha256:1b1b133e6e16105f524a8dec491e0586d072948ce15c9b914e41cdadd209052b", size = 6400621, upload-time = "2025-10-15T18:23:32.06Z" },
+ { url = "https://files.pythonhosted.org/packages/2b/24/b350c31543fb0107ab2599464d7e28e6f856027aadda995022e695313d94/pillow-12.0.0-cp314-cp314-win_amd64.whl", hash = "sha256:8dc232e39d409036af549c86f24aed8273a40ffa459981146829a324e0848b4b", size = 7142916, upload-time = "2025-10-15T18:23:34.71Z" },
+ { url = "https://files.pythonhosted.org/packages/0f/9b/0ba5a6fd9351793996ef7487c4fdbde8d3f5f75dbedc093bb598648fddf0/pillow-12.0.0-cp314-cp314-win_arm64.whl", hash = "sha256:d52610d51e265a51518692045e372a4c363056130d922a7351429ac9f27e70b0", size = 2523836, upload-time = "2025-10-15T18:23:36.967Z" },
+ { url = "https://files.pythonhosted.org/packages/f5/7a/ceee0840aebc579af529b523d530840338ecf63992395842e54edc805987/pillow-12.0.0-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:1979f4566bb96c1e50a62d9831e2ea2d1211761e5662afc545fa766f996632f6", size = 5255092, upload-time = "2025-10-15T18:23:38.573Z" },
+ { url = "https://files.pythonhosted.org/packages/44/76/20776057b4bfd1aef4eeca992ebde0f53a4dce874f3ae693d0ec90a4f79b/pillow-12.0.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:b2e4b27a6e15b04832fe9bf292b94b5ca156016bbc1ea9c2c20098a0320d6cf6", size = 4653158, upload-time = "2025-10-15T18:23:40.238Z" },
+ { url = "https://files.pythonhosted.org/packages/82/3f/d9ff92ace07be8836b4e7e87e6a4c7a8318d47c2f1463ffcf121fc57d9cb/pillow-12.0.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:fb3096c30df99fd01c7bf8e544f392103d0795b9f98ba71a8054bcbf56b255f1", size = 6267882, upload-time = "2025-10-15T18:23:42.434Z" },
+ { url = "https://files.pythonhosted.org/packages/9f/7a/4f7ff87f00d3ad33ba21af78bfcd2f032107710baf8280e3722ceec28cda/pillow-12.0.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:7438839e9e053ef79f7112c881cef684013855016f928b168b81ed5835f3e75e", size = 8071001, upload-time = "2025-10-15T18:23:44.29Z" },
+ { url = "https://files.pythonhosted.org/packages/75/87/fcea108944a52dad8cca0715ae6247e271eb80459364a98518f1e4f480c1/pillow-12.0.0-cp314-cp314t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5d5c411a8eaa2299322b647cd932586b1427367fd3184ffbb8f7a219ea2041ca", size = 6380146, upload-time = "2025-10-15T18:23:46.065Z" },
+ { url = "https://files.pythonhosted.org/packages/91/52/0d31b5e571ef5fd111d2978b84603fce26aba1b6092f28e941cb46570745/pillow-12.0.0-cp314-cp314t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d7e091d464ac59d2c7ad8e7e08105eaf9dafbc3883fd7265ffccc2baad6ac925", size = 7067344, upload-time = "2025-10-15T18:23:47.898Z" },
+ { url = "https://files.pythonhosted.org/packages/7b/f4/2dd3d721f875f928d48e83bb30a434dee75a2531bca839bb996bb0aa5a91/pillow-12.0.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:792a2c0be4dcc18af9d4a2dfd8a11a17d5e25274a1062b0ec1c2d79c76f3e7f8", size = 6491864, upload-time = "2025-10-15T18:23:49.607Z" },
+ { url = "https://files.pythonhosted.org/packages/30/4b/667dfcf3d61fc309ba5a15b141845cece5915e39b99c1ceab0f34bf1d124/pillow-12.0.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:afbefa430092f71a9593a99ab6a4e7538bc9eabbf7bf94f91510d3503943edc4", size = 7158911, upload-time = "2025-10-15T18:23:51.351Z" },
+ { url = "https://files.pythonhosted.org/packages/a2/2f/16cabcc6426c32218ace36bf0d55955e813f2958afddbf1d391849fee9d1/pillow-12.0.0-cp314-cp314t-win32.whl", hash = "sha256:3830c769decf88f1289680a59d4f4c46c72573446352e2befec9a8512104fa52", size = 6408045, upload-time = "2025-10-15T18:23:53.177Z" },
+ { url = "https://files.pythonhosted.org/packages/35/73/e29aa0c9c666cf787628d3f0dcf379f4791fba79f4936d02f8b37165bdf8/pillow-12.0.0-cp314-cp314t-win_amd64.whl", hash = "sha256:905b0365b210c73afb0ebe9101a32572152dfd1c144c7e28968a331b9217b94a", size = 7148282, upload-time = "2025-10-15T18:23:55.316Z" },
+ { url = "https://files.pythonhosted.org/packages/c1/70/6b41bdcddf541b437bbb9f47f94d2db5d9ddef6c37ccab8c9107743748a4/pillow-12.0.0-cp314-cp314t-win_arm64.whl", hash = "sha256:99353a06902c2e43b43e8ff74ee65a7d90307d82370604746738a1e0661ccca7", size = 2525630, upload-time = "2025-10-15T18:23:57.149Z" },
+ { url = "https://files.pythonhosted.org/packages/1d/b3/582327e6c9f86d037b63beebe981425d6811104cb443e8193824ef1a2f27/pillow-12.0.0-pp311-pypy311_pp73-macosx_10_15_x86_64.whl", hash = "sha256:b22bd8c974942477156be55a768f7aa37c46904c175be4e158b6a86e3a6b7ca8", size = 5215068, upload-time = "2025-10-15T18:23:59.594Z" },
+ { url = "https://files.pythonhosted.org/packages/fd/d6/67748211d119f3b6540baf90f92fae73ae51d5217b171b0e8b5f7e5d558f/pillow-12.0.0-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:805ebf596939e48dbb2e4922a1d3852cfc25c38160751ce02da93058b48d252a", size = 4614994, upload-time = "2025-10-15T18:24:01.669Z" },
+ { url = "https://files.pythonhosted.org/packages/2d/e1/f8281e5d844c41872b273b9f2c34a4bf64ca08905668c8ae730eedc7c9fa/pillow-12.0.0-pp311-pypy311_pp73-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:cae81479f77420d217def5f54b5b9d279804d17e982e0f2fa19b1d1e14ab5197", size = 5246639, upload-time = "2025-10-15T18:24:03.403Z" },
+ { url = "https://files.pythonhosted.org/packages/94/5a/0d8ab8ffe8a102ff5df60d0de5af309015163bf710c7bb3e8311dd3b3ad0/pillow-12.0.0-pp311-pypy311_pp73-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:aeaefa96c768fc66818730b952a862235d68825c178f1b3ffd4efd7ad2edcb7c", size = 6986839, upload-time = "2025-10-15T18:24:05.344Z" },
+ { url = "https://files.pythonhosted.org/packages/20/2e/3434380e8110b76cd9eb00a363c484b050f949b4bbe84ba770bb8508a02c/pillow-12.0.0-pp311-pypy311_pp73-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:09f2d0abef9e4e2f349305a4f8cc784a8a6c2f58a8c4892eea13b10a943bd26e", size = 5313505, upload-time = "2025-10-15T18:24:07.137Z" },
+ { url = "https://files.pythonhosted.org/packages/57/ca/5a9d38900d9d74785141d6580950fe705de68af735ff6e727cb911b64740/pillow-12.0.0-pp311-pypy311_pp73-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bdee52571a343d721fb2eb3b090a82d959ff37fc631e3f70422e0c2e029f3e76", size = 5963654, upload-time = "2025-10-15T18:24:09.579Z" },
+ { url = "https://files.pythonhosted.org/packages/95/7e/f896623c3c635a90537ac093c6a618ebe1a90d87206e42309cb5d98a1b9e/pillow-12.0.0-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:b290fd8aa38422444d4b50d579de197557f182ef1068b75f5aa8558638b8d0a5", size = 6997850, upload-time = "2025-10-15T18:24:11.495Z" },
+]
+
+[[package]]
+name = "pluggy"
+version = "1.6.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/f9/e2/3e91f31a7d2b083fe6ef3fa267035b518369d9511ffab804f839851d2779/pluggy-1.6.0.tar.gz", hash = "sha256:7dcc130b76258d33b90f61b658791dede3486c3e6bfb003ee5c9bfb396dd22f3", size = 69412, upload-time = "2025-05-15T12:30:07.975Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/54/20/4d324d65cc6d9205fabedc306948156824eb9f0ee1633355a8f7ec5c66bf/pluggy-1.6.0-py3-none-any.whl", hash = "sha256:e920276dd6813095e9377c0bc5566d94c932c33b27a3e3945d8389c374dd4746", size = 20538, upload-time = "2025-05-15T12:30:06.134Z" },
+]
+
+[[package]]
+name = "preshed"
+version = "3.0.10"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "cymem" },
+ { name = "murmurhash" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/4d/3a/db814f67a05b6d7f9c15d38edef5ec9b21415710705b393883de92aee5ef/preshed-3.0.10.tar.gz", hash = "sha256:5a5c8e685e941f4ffec97f1fbf32694b8107858891a4bc34107fac981d8296ff", size = 15039, upload-time = "2025-05-26T15:18:33.612Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/66/12/3bfd7790481513d71a281a3a7194a6d7aa9a59289a109253e78d9bcedcec/preshed-3.0.10-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:14593c32e6705fda0fd54684293ca079530418bb1fb036dcbaa6c0ef0f144b7d", size = 131102, upload-time = "2025-05-26T15:17:41.762Z" },
+ { url = "https://files.pythonhosted.org/packages/e4/bf/54635387524315fe40b1f3d1688a5ad369f59a4e3a377b0da6e8a3ecba30/preshed-3.0.10-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:ba1960a3996678aded882260133853e19e3a251d9f35a19c9d7d830c4238c4eb", size = 127302, upload-time = "2025-05-26T15:17:43.263Z" },
+ { url = "https://files.pythonhosted.org/packages/fe/df/d057705c9c6aff877ee687f612f242006750f165c0e557f6075fe913a8e3/preshed-3.0.10-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0830c0a262015be743a01455a1da5963750afed1bde2395590b01af3b7da2741", size = 793737, upload-time = "2025-05-26T15:17:44.736Z" },
+ { url = "https://files.pythonhosted.org/packages/c4/73/9206a60e59e81a259d49273f95307821f5e88c84c400533ed0cb9a8093af/preshed-3.0.10-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:165dda5862c28e77ee1f3feabad98d4ebb65345f458b5626596b92fd20a65275", size = 795131, upload-time = "2025-05-26T15:17:46.382Z" },
+ { url = "https://files.pythonhosted.org/packages/25/18/02a40bcb13ae6c1ca3a859a709354621b45c83857994943c9c409f85f183/preshed-3.0.10-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:e88e4c7fbbfa7c23a90d7d0cbe27e4c5fa2fd742ef1be09c153f9ccd2c600098", size = 777924, upload-time = "2025-05-26T15:17:48.184Z" },
+ { url = "https://files.pythonhosted.org/packages/11/13/bb2db0f037fc659494fbe964255f80fbca7e5e4154137e9855619e3543d9/preshed-3.0.10-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:87780ae00def0c97130c9d1652295ec8362c2e4ca553673b64fe0dc7b321a382", size = 796024, upload-time = "2025-05-26T15:17:49.568Z" },
+ { url = "https://files.pythonhosted.org/packages/99/ab/7187df84a32f02d987b689f4bbb1ad77304bdc8129d8fed483b8ebde113d/preshed-3.0.10-cp310-cp310-win_amd64.whl", hash = "sha256:32496f216255a6cbdd60965dde29ff42ed8fc2d77968c28ae875e3856c6fa01a", size = 117429, upload-time = "2025-05-26T15:17:51.091Z" },
+ { url = "https://files.pythonhosted.org/packages/08/99/c3709638f687da339504d1daeca48604cadb338bf3556a1484d1f0cd95e6/preshed-3.0.10-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:d96c4fe2b41c1cdcc8c4fc1fdb10f922a6095c0430a3ebe361fe62c78902d068", size = 131486, upload-time = "2025-05-26T15:17:52.231Z" },
+ { url = "https://files.pythonhosted.org/packages/e0/27/0fd36b63caa8bbf57b31a121d9565d385bbd7521771d4eb93e17d326873d/preshed-3.0.10-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:cb01ea930b96f3301526a2ab26f41347d07555e4378c4144c6b7645074f2ebb0", size = 127938, upload-time = "2025-05-26T15:17:54.19Z" },
+ { url = "https://files.pythonhosted.org/packages/90/54/6a876d9cc8d401a9c1fb6bb8ca5a31b3664d0bcb888a9016258a1ae17344/preshed-3.0.10-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9dd1f0a7b7d150e229d073fd4fe94f72610cae992e907cee74687c4695873a98", size = 842263, upload-time = "2025-05-26T15:17:55.398Z" },
+ { url = "https://files.pythonhosted.org/packages/1c/7d/ff19f74d15ee587905bafa3582883cfe2f72b574e6d691ee64dc690dc276/preshed-3.0.10-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9fd7b350c280137f324cd447afbf6ba9a849af0e8898850046ac6f34010e08bd", size = 842913, upload-time = "2025-05-26T15:17:56.687Z" },
+ { url = "https://files.pythonhosted.org/packages/f1/3a/1c345a26463345557705b61965e1e0a732cc0e9c6dfd4787845dbfa50b4a/preshed-3.0.10-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:cf6a5fdc89ad06079aa6ee63621e417d4f4cf2a3d8b63c72728baad35a9ff641", size = 820548, upload-time = "2025-05-26T15:17:58.057Z" },
+ { url = "https://files.pythonhosted.org/packages/7f/6b/71f25e2b7a23dba168f43edfae0bb508552dbef89114ce65c73f2ea7172f/preshed-3.0.10-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:b4c29a7bd66985808ad181c9ad05205a6aa7400cd0f98426acd7bc86588b93f8", size = 840379, upload-time = "2025-05-26T15:17:59.565Z" },
+ { url = "https://files.pythonhosted.org/packages/3a/86/d8f32b0b31a36ee8770a9b1a95321430e364cd0ba4bfebb7348aed2f198d/preshed-3.0.10-cp311-cp311-win_amd64.whl", hash = "sha256:1367c1fd6f44296305315d4e1c3fe3171787d4d01c1008a76bc9466bd79c3249", size = 117655, upload-time = "2025-05-26T15:18:00.836Z" },
+ { url = "https://files.pythonhosted.org/packages/c3/14/322a4f58bc25991a87f216acb1351800739b0794185d27508ee86c35f382/preshed-3.0.10-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:6e9c46933d55c8898c8f7a6019a8062cd87ef257b075ada2dd5d1e57810189ea", size = 131367, upload-time = "2025-05-26T15:18:02.408Z" },
+ { url = "https://files.pythonhosted.org/packages/38/80/67507653c35620cace913f617df6d6f658b87e8da83087b851557d65dd86/preshed-3.0.10-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:5c4ebc4f8ef0114d55f2ffdce4965378129c7453d0203664aeeb03055572d9e4", size = 126535, upload-time = "2025-05-26T15:18:03.589Z" },
+ { url = "https://files.pythonhosted.org/packages/db/b1/ab4f811aeaf20af0fa47148c1c54b62d7e8120d59025bd0a3f773bb67725/preshed-3.0.10-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6ab5ab4c6dfd3746fb4328e7fbeb2a0544416b872db02903bfac18e6f5cd412f", size = 864907, upload-time = "2025-05-26T15:18:04.794Z" },
+ { url = "https://files.pythonhosted.org/packages/fb/db/fe37c1f99cfb26805dd89381ddd54901307feceb267332eaaca228e9f9c1/preshed-3.0.10-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:40586fd96ae3974c552a7cd78781b6844ecb1559ee7556586f487058cf13dd96", size = 869329, upload-time = "2025-05-26T15:18:06.353Z" },
+ { url = "https://files.pythonhosted.org/packages/a7/fd/efb6a6233d1cd969966f3f65bdd8e662579c3d83114e5c356cec1927b1f7/preshed-3.0.10-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:a606c24cda931306b98e0edfafed3309bffcf8d6ecfe07804db26024c4f03cd6", size = 846829, upload-time = "2025-05-26T15:18:07.716Z" },
+ { url = "https://files.pythonhosted.org/packages/14/49/0e4ce5db3bf86b081abb08a404fb37b7c2dbfd7a73ec6c0bc71b650307eb/preshed-3.0.10-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:394015566f9354738be903447039e8dbc6d93ba5adf091af694eb03c4e726b1e", size = 874008, upload-time = "2025-05-26T15:18:09.364Z" },
+ { url = "https://files.pythonhosted.org/packages/6f/17/76d6593fc2d055d4e413b68a8c87b70aa9b7697d4972cb8062559edcf6e9/preshed-3.0.10-cp312-cp312-win_amd64.whl", hash = "sha256:fd7e38225937e580420c84d1996dde9b4f726aacd9405093455c3a2fa60fede5", size = 116701, upload-time = "2025-05-26T15:18:11.905Z" },
+ { url = "https://files.pythonhosted.org/packages/bf/5e/87671bc58c4f6c8cf0a5601ccd74b8bb50281ff28aa4ab3e3cad5cd9d06a/preshed-3.0.10-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:23e6e0581a517597f3f76bc24a4cdb0ba5509933d4f61c34fca49649dd71edf9", size = 129184, upload-time = "2025-05-26T15:18:13.331Z" },
+ { url = "https://files.pythonhosted.org/packages/92/69/b3969a3c95778def5bf5126484a1f7d2ad324d1040077f55f56e027d8ea4/preshed-3.0.10-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:574e6d6056981540310ff181b47a2912f4bddc91bcace3c7a9c6726eafda24ca", size = 124258, upload-time = "2025-05-26T15:18:14.497Z" },
+ { url = "https://files.pythonhosted.org/packages/32/df/6e828ec4565bf33bd4803a3eb3b1102830b739143e5d6c132bf7181a58ec/preshed-3.0.10-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2bd658dd73e853d1bb5597976a407feafa681b9d6155bc9bc7b4c2acc2a6ee96", size = 825445, upload-time = "2025-05-26T15:18:15.71Z" },
+ { url = "https://files.pythonhosted.org/packages/05/3d/478b585f304920e51f328c9231e22f30dc64baa68e079e08a46ab72be738/preshed-3.0.10-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5b95396046328ffb461a68859ce2141aca4815b8624167832d28ced70d541626", size = 831690, upload-time = "2025-05-26T15:18:17.08Z" },
+ { url = "https://files.pythonhosted.org/packages/c3/65/938f21f77227e8d398d46fb10b9d1b3467be859468ce8db138fc3d50589c/preshed-3.0.10-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:3e6728b2028bbe79565eb6cf676b5bae5ce1f9cc56e4bf99bb28ce576f88054d", size = 808593, upload-time = "2025-05-26T15:18:18.535Z" },
+ { url = "https://files.pythonhosted.org/packages/6c/1c/2a3961fc88bc72300ff7e4ca54689bda90d2d77cc994167cc09a310480b6/preshed-3.0.10-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:c4ef96cb28bf5f08de9c070143113e168efccbb68fd4961e7d445f734c051a97", size = 837333, upload-time = "2025-05-26T15:18:19.937Z" },
+ { url = "https://files.pythonhosted.org/packages/fa/8c/d3e30f80b2ef21f267f09f0b7d18995adccc928ede5b73ea3fe54e1303f4/preshed-3.0.10-cp313-cp313-win_amd64.whl", hash = "sha256:97e0e2edfd25a7dfba799b49b3c5cc248ad0318a76edd9d5fd2c82aa3d5c64ed", size = 115769, upload-time = "2025-05-26T15:18:21.842Z" },
+]
+
+[[package]]
+name = "propcache"
+version = "0.4.1"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/9e/da/e9fc233cf63743258bff22b3dfa7ea5baef7b5bc324af47a0ad89b8ffc6f/propcache-0.4.1.tar.gz", hash = "sha256:f48107a8c637e80362555f37ecf49abe20370e557cc4ab374f04ec4423c97c3d", size = 46442, upload-time = "2025-10-08T19:49:02.291Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/3c/0e/934b541323035566a9af292dba85a195f7b78179114f2c6ebb24551118a9/propcache-0.4.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:7c2d1fa3201efaf55d730400d945b5b3ab6e672e100ba0f9a409d950ab25d7db", size = 79534, upload-time = "2025-10-08T19:46:02.083Z" },
+ { url = "https://files.pythonhosted.org/packages/a1/6b/db0d03d96726d995dc7171286c6ba9d8d14251f37433890f88368951a44e/propcache-0.4.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:1eb2994229cc8ce7fe9b3db88f5465f5fd8651672840b2e426b88cdb1a30aac8", size = 45526, upload-time = "2025-10-08T19:46:03.884Z" },
+ { url = "https://files.pythonhosted.org/packages/e4/c3/82728404aea669e1600f304f2609cde9e665c18df5a11cdd57ed73c1dceb/propcache-0.4.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:66c1f011f45a3b33d7bcb22daed4b29c0c9e2224758b6be00686731e1b46f925", size = 47263, upload-time = "2025-10-08T19:46:05.405Z" },
+ { url = "https://files.pythonhosted.org/packages/df/1b/39313ddad2bf9187a1432654c38249bab4562ef535ef07f5eb6eb04d0b1b/propcache-0.4.1-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:9a52009f2adffe195d0b605c25ec929d26b36ef986ba85244891dee3b294df21", size = 201012, upload-time = "2025-10-08T19:46:07.165Z" },
+ { url = "https://files.pythonhosted.org/packages/5b/01/f1d0b57d136f294a142acf97f4ed58c8e5b974c21e543000968357115011/propcache-0.4.1-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:5d4e2366a9c7b837555cf02fb9be2e3167d333aff716332ef1b7c3a142ec40c5", size = 209491, upload-time = "2025-10-08T19:46:08.909Z" },
+ { url = "https://files.pythonhosted.org/packages/a1/c8/038d909c61c5bb039070b3fb02ad5cccdb1dde0d714792e251cdb17c9c05/propcache-0.4.1-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:9d2b6caef873b4f09e26ea7e33d65f42b944837563a47a94719cc3544319a0db", size = 215319, upload-time = "2025-10-08T19:46:10.7Z" },
+ { url = "https://files.pythonhosted.org/packages/08/57/8c87e93142b2c1fa2408e45695205a7ba05fb5db458c0bf5c06ba0e09ea6/propcache-0.4.1-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:2b16ec437a8c8a965ecf95739448dd938b5c7f56e67ea009f4300d8df05f32b7", size = 196856, upload-time = "2025-10-08T19:46:12.003Z" },
+ { url = "https://files.pythonhosted.org/packages/42/df/5615fec76aa561987a534759b3686008a288e73107faa49a8ae5795a9f7a/propcache-0.4.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:296f4c8ed03ca7476813fe666c9ea97869a8d7aec972618671b33a38a5182ef4", size = 193241, upload-time = "2025-10-08T19:46:13.495Z" },
+ { url = "https://files.pythonhosted.org/packages/d5/21/62949eb3a7a54afe8327011c90aca7e03547787a88fb8bd9726806482fea/propcache-0.4.1-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:1f0978529a418ebd1f49dad413a2b68af33f85d5c5ca5c6ca2a3bed375a7ac60", size = 190552, upload-time = "2025-10-08T19:46:14.938Z" },
+ { url = "https://files.pythonhosted.org/packages/30/ee/ab4d727dd70806e5b4de96a798ae7ac6e4d42516f030ee60522474b6b332/propcache-0.4.1-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:fd138803047fb4c062b1c1dd95462f5209456bfab55c734458f15d11da288f8f", size = 200113, upload-time = "2025-10-08T19:46:16.695Z" },
+ { url = "https://files.pythonhosted.org/packages/8a/0b/38b46208e6711b016aa8966a3ac793eee0d05c7159d8342aa27fc0bc365e/propcache-0.4.1-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:8c9b3cbe4584636d72ff556d9036e0c9317fa27b3ac1f0f558e7e84d1c9c5900", size = 200778, upload-time = "2025-10-08T19:46:18.023Z" },
+ { url = "https://files.pythonhosted.org/packages/cf/81/5abec54355ed344476bee711e9f04815d4b00a311ab0535599204eecc257/propcache-0.4.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:f93243fdc5657247533273ac4f86ae106cc6445a0efacb9a1bfe982fcfefd90c", size = 193047, upload-time = "2025-10-08T19:46:19.449Z" },
+ { url = "https://files.pythonhosted.org/packages/ec/b6/1f237c04e32063cb034acd5f6ef34ef3a394f75502e72703545631ab1ef6/propcache-0.4.1-cp310-cp310-win32.whl", hash = "sha256:a0ee98db9c5f80785b266eb805016e36058ac72c51a064040f2bc43b61101cdb", size = 38093, upload-time = "2025-10-08T19:46:20.643Z" },
+ { url = "https://files.pythonhosted.org/packages/a6/67/354aac4e0603a15f76439caf0427781bcd6797f370377f75a642133bc954/propcache-0.4.1-cp310-cp310-win_amd64.whl", hash = "sha256:1cdb7988c4e5ac7f6d175a28a9aa0c94cb6f2ebe52756a3c0cda98d2809a9e37", size = 41638, upload-time = "2025-10-08T19:46:21.935Z" },
+ { url = "https://files.pythonhosted.org/packages/e0/e1/74e55b9fd1a4c209ff1a9a824bf6c8b3d1fc5a1ac3eabe23462637466785/propcache-0.4.1-cp310-cp310-win_arm64.whl", hash = "sha256:d82ad62b19645419fe79dd63b3f9253e15b30e955c0170e5cebc350c1844e581", size = 38229, upload-time = "2025-10-08T19:46:23.368Z" },
+ { url = "https://files.pythonhosted.org/packages/8c/d4/4e2c9aaf7ac2242b9358f98dccd8f90f2605402f5afeff6c578682c2c491/propcache-0.4.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:60a8fda9644b7dfd5dece8c61d8a85e271cb958075bfc4e01083c148b61a7caf", size = 80208, upload-time = "2025-10-08T19:46:24.597Z" },
+ { url = "https://files.pythonhosted.org/packages/c2/21/d7b68e911f9c8e18e4ae43bdbc1e1e9bbd971f8866eb81608947b6f585ff/propcache-0.4.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:c30b53e7e6bda1d547cabb47c825f3843a0a1a42b0496087bb58d8fedf9f41b5", size = 45777, upload-time = "2025-10-08T19:46:25.733Z" },
+ { url = "https://files.pythonhosted.org/packages/d3/1d/11605e99ac8ea9435651ee71ab4cb4bf03f0949586246476a25aadfec54a/propcache-0.4.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:6918ecbd897443087a3b7cd978d56546a812517dcaaca51b49526720571fa93e", size = 47647, upload-time = "2025-10-08T19:46:27.304Z" },
+ { url = "https://files.pythonhosted.org/packages/58/1a/3c62c127a8466c9c843bccb503d40a273e5cc69838805f322e2826509e0d/propcache-0.4.1-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3d902a36df4e5989763425a8ab9e98cd8ad5c52c823b34ee7ef307fd50582566", size = 214929, upload-time = "2025-10-08T19:46:28.62Z" },
+ { url = "https://files.pythonhosted.org/packages/56/b9/8fa98f850960b367c4b8fe0592e7fc341daa7a9462e925228f10a60cf74f/propcache-0.4.1-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a9695397f85973bb40427dedddf70d8dc4a44b22f1650dd4af9eedf443d45165", size = 221778, upload-time = "2025-10-08T19:46:30.358Z" },
+ { url = "https://files.pythonhosted.org/packages/46/a6/0ab4f660eb59649d14b3d3d65c439421cf2f87fe5dd68591cbe3c1e78a89/propcache-0.4.1-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:2bb07ffd7eaad486576430c89f9b215f9e4be68c4866a96e97db9e97fead85dc", size = 228144, upload-time = "2025-10-08T19:46:32.607Z" },
+ { url = "https://files.pythonhosted.org/packages/52/6a/57f43e054fb3d3a56ac9fc532bc684fc6169a26c75c353e65425b3e56eef/propcache-0.4.1-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:fd6f30fdcf9ae2a70abd34da54f18da086160e4d7d9251f81f3da0ff84fc5a48", size = 210030, upload-time = "2025-10-08T19:46:33.969Z" },
+ { url = "https://files.pythonhosted.org/packages/40/e2/27e6feebb5f6b8408fa29f5efbb765cd54c153ac77314d27e457a3e993b7/propcache-0.4.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:fc38cba02d1acba4e2869eef1a57a43dfbd3d49a59bf90dda7444ec2be6a5570", size = 208252, upload-time = "2025-10-08T19:46:35.309Z" },
+ { url = "https://files.pythonhosted.org/packages/9e/f8/91c27b22ccda1dbc7967f921c42825564fa5336a01ecd72eb78a9f4f53c2/propcache-0.4.1-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:67fad6162281e80e882fb3ec355398cf72864a54069d060321f6cd0ade95fe85", size = 202064, upload-time = "2025-10-08T19:46:36.993Z" },
+ { url = "https://files.pythonhosted.org/packages/f2/26/7f00bd6bd1adba5aafe5f4a66390f243acab58eab24ff1a08bebb2ef9d40/propcache-0.4.1-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:f10207adf04d08bec185bae14d9606a1444715bc99180f9331c9c02093e1959e", size = 212429, upload-time = "2025-10-08T19:46:38.398Z" },
+ { url = "https://files.pythonhosted.org/packages/84/89/fd108ba7815c1117ddca79c228f3f8a15fc82a73bca8b142eb5de13b2785/propcache-0.4.1-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:e9b0d8d0845bbc4cfcdcbcdbf5086886bc8157aa963c31c777ceff7846c77757", size = 216727, upload-time = "2025-10-08T19:46:39.732Z" },
+ { url = "https://files.pythonhosted.org/packages/79/37/3ec3f7e3173e73f1d600495d8b545b53802cbf35506e5732dd8578db3724/propcache-0.4.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:981333cb2f4c1896a12f4ab92a9cc8f09ea664e9b7dbdc4eff74627af3a11c0f", size = 205097, upload-time = "2025-10-08T19:46:41.025Z" },
+ { url = "https://files.pythonhosted.org/packages/61/b0/b2631c19793f869d35f47d5a3a56fb19e9160d3c119f15ac7344fc3ccae7/propcache-0.4.1-cp311-cp311-win32.whl", hash = "sha256:f1d2f90aeec838a52f1c1a32fe9a619fefd5e411721a9117fbf82aea638fe8a1", size = 38084, upload-time = "2025-10-08T19:46:42.693Z" },
+ { url = "https://files.pythonhosted.org/packages/f4/78/6cce448e2098e9f3bfc91bb877f06aa24b6ccace872e39c53b2f707c4648/propcache-0.4.1-cp311-cp311-win_amd64.whl", hash = "sha256:364426a62660f3f699949ac8c621aad6977be7126c5807ce48c0aeb8e7333ea6", size = 41637, upload-time = "2025-10-08T19:46:43.778Z" },
+ { url = "https://files.pythonhosted.org/packages/9c/e9/754f180cccd7f51a39913782c74717c581b9cc8177ad0e949f4d51812383/propcache-0.4.1-cp311-cp311-win_arm64.whl", hash = "sha256:e53f3a38d3510c11953f3e6a33f205c6d1b001129f972805ca9b42fc308bc239", size = 38064, upload-time = "2025-10-08T19:46:44.872Z" },
+ { url = "https://files.pythonhosted.org/packages/a2/0f/f17b1b2b221d5ca28b4b876e8bb046ac40466513960646bda8e1853cdfa2/propcache-0.4.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:e153e9cd40cc8945138822807139367f256f89c6810c2634a4f6902b52d3b4e2", size = 80061, upload-time = "2025-10-08T19:46:46.075Z" },
+ { url = "https://files.pythonhosted.org/packages/76/47/8ccf75935f51448ba9a16a71b783eb7ef6b9ee60f5d14c7f8a8a79fbeed7/propcache-0.4.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:cd547953428f7abb73c5ad82cbb32109566204260d98e41e5dfdc682eb7f8403", size = 46037, upload-time = "2025-10-08T19:46:47.23Z" },
+ { url = "https://files.pythonhosted.org/packages/0a/b6/5c9a0e42df4d00bfb4a3cbbe5cf9f54260300c88a0e9af1f47ca5ce17ac0/propcache-0.4.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:f048da1b4f243fc44f205dfd320933a951b8d89e0afd4c7cacc762a8b9165207", size = 47324, upload-time = "2025-10-08T19:46:48.384Z" },
+ { url = "https://files.pythonhosted.org/packages/9e/d3/6c7ee328b39a81ee877c962469f1e795f9db87f925251efeb0545e0020d0/propcache-0.4.1-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ec17c65562a827bba85e3872ead335f95405ea1674860d96483a02f5c698fa72", size = 225505, upload-time = "2025-10-08T19:46:50.055Z" },
+ { url = "https://files.pythonhosted.org/packages/01/5d/1c53f4563490b1d06a684742cc6076ef944bc6457df6051b7d1a877c057b/propcache-0.4.1-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:405aac25c6394ef275dee4c709be43745d36674b223ba4eb7144bf4d691b7367", size = 230242, upload-time = "2025-10-08T19:46:51.815Z" },
+ { url = "https://files.pythonhosted.org/packages/20/e1/ce4620633b0e2422207c3cb774a0ee61cac13abc6217763a7b9e2e3f4a12/propcache-0.4.1-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:0013cb6f8dde4b2a2f66903b8ba740bdfe378c943c4377a200551ceb27f379e4", size = 238474, upload-time = "2025-10-08T19:46:53.208Z" },
+ { url = "https://files.pythonhosted.org/packages/46/4b/3aae6835b8e5f44ea6a68348ad90f78134047b503765087be2f9912140ea/propcache-0.4.1-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:15932ab57837c3368b024473a525e25d316d8353016e7cc0e5ba9eb343fbb1cf", size = 221575, upload-time = "2025-10-08T19:46:54.511Z" },
+ { url = "https://files.pythonhosted.org/packages/6e/a5/8a5e8678bcc9d3a1a15b9a29165640d64762d424a16af543f00629c87338/propcache-0.4.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:031dce78b9dc099f4c29785d9cf5577a3faf9ebf74ecbd3c856a7b92768c3df3", size = 216736, upload-time = "2025-10-08T19:46:56.212Z" },
+ { url = "https://files.pythonhosted.org/packages/f1/63/b7b215eddeac83ca1c6b934f89d09a625aa9ee4ba158338854c87210cc36/propcache-0.4.1-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:ab08df6c9a035bee56e31af99be621526bd237bea9f32def431c656b29e41778", size = 213019, upload-time = "2025-10-08T19:46:57.595Z" },
+ { url = "https://files.pythonhosted.org/packages/57/74/f580099a58c8af587cac7ba19ee7cb418506342fbbe2d4a4401661cca886/propcache-0.4.1-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:4d7af63f9f93fe593afbf104c21b3b15868efb2c21d07d8732c0c4287e66b6a6", size = 220376, upload-time = "2025-10-08T19:46:59.067Z" },
+ { url = "https://files.pythonhosted.org/packages/c4/ee/542f1313aff7eaf19c2bb758c5d0560d2683dac001a1c96d0774af799843/propcache-0.4.1-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:cfc27c945f422e8b5071b6e93169679e4eb5bf73bbcbf1ba3ae3a83d2f78ebd9", size = 226988, upload-time = "2025-10-08T19:47:00.544Z" },
+ { url = "https://files.pythonhosted.org/packages/8f/18/9c6b015dd9c6930f6ce2229e1f02fb35298b847f2087ea2b436a5bfa7287/propcache-0.4.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:35c3277624a080cc6ec6f847cbbbb5b49affa3598c4535a0a4682a697aaa5c75", size = 215615, upload-time = "2025-10-08T19:47:01.968Z" },
+ { url = "https://files.pythonhosted.org/packages/80/9e/e7b85720b98c45a45e1fca6a177024934dc9bc5f4d5dd04207f216fc33ed/propcache-0.4.1-cp312-cp312-win32.whl", hash = "sha256:671538c2262dadb5ba6395e26c1731e1d52534bfe9ae56d0b5573ce539266aa8", size = 38066, upload-time = "2025-10-08T19:47:03.503Z" },
+ { url = "https://files.pythonhosted.org/packages/54/09/d19cff2a5aaac632ec8fc03737b223597b1e347416934c1b3a7df079784c/propcache-0.4.1-cp312-cp312-win_amd64.whl", hash = "sha256:cb2d222e72399fcf5890d1d5cc1060857b9b236adff2792ff48ca2dfd46c81db", size = 41655, upload-time = "2025-10-08T19:47:04.973Z" },
+ { url = "https://files.pythonhosted.org/packages/68/ab/6b5c191bb5de08036a8c697b265d4ca76148efb10fa162f14af14fb5f076/propcache-0.4.1-cp312-cp312-win_arm64.whl", hash = "sha256:204483131fb222bdaaeeea9f9e6c6ed0cac32731f75dfc1d4a567fc1926477c1", size = 37789, upload-time = "2025-10-08T19:47:06.077Z" },
+ { url = "https://files.pythonhosted.org/packages/bf/df/6d9c1b6ac12b003837dde8a10231a7344512186e87b36e855bef32241942/propcache-0.4.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:43eedf29202c08550aac1d14e0ee619b0430aaef78f85864c1a892294fbc28cf", size = 77750, upload-time = "2025-10-08T19:47:07.648Z" },
+ { url = "https://files.pythonhosted.org/packages/8b/e8/677a0025e8a2acf07d3418a2e7ba529c9c33caf09d3c1f25513023c1db56/propcache-0.4.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:d62cdfcfd89ccb8de04e0eda998535c406bf5e060ffd56be6c586cbcc05b3311", size = 44780, upload-time = "2025-10-08T19:47:08.851Z" },
+ { url = "https://files.pythonhosted.org/packages/89/a4/92380f7ca60f99ebae761936bc48a72a639e8a47b29050615eef757cb2a7/propcache-0.4.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:cae65ad55793da34db5f54e4029b89d3b9b9490d8abe1b4c7ab5d4b8ec7ebf74", size = 46308, upload-time = "2025-10-08T19:47:09.982Z" },
+ { url = "https://files.pythonhosted.org/packages/2d/48/c5ac64dee5262044348d1d78a5f85dd1a57464a60d30daee946699963eb3/propcache-0.4.1-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:333ddb9031d2704a301ee3e506dc46b1fe5f294ec198ed6435ad5b6a085facfe", size = 208182, upload-time = "2025-10-08T19:47:11.319Z" },
+ { url = "https://files.pythonhosted.org/packages/c6/0c/cd762dd011a9287389a6a3eb43aa30207bde253610cca06824aeabfe9653/propcache-0.4.1-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:fd0858c20f078a32cf55f7e81473d96dcf3b93fd2ccdb3d40fdf54b8573df3af", size = 211215, upload-time = "2025-10-08T19:47:13.146Z" },
+ { url = "https://files.pythonhosted.org/packages/30/3e/49861e90233ba36890ae0ca4c660e95df565b2cd15d4a68556ab5865974e/propcache-0.4.1-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:678ae89ebc632c5c204c794f8dab2837c5f159aeb59e6ed0539500400577298c", size = 218112, upload-time = "2025-10-08T19:47:14.913Z" },
+ { url = "https://files.pythonhosted.org/packages/f1/8b/544bc867e24e1bd48f3118cecd3b05c694e160a168478fa28770f22fd094/propcache-0.4.1-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d472aeb4fbf9865e0c6d622d7f4d54a4e101a89715d8904282bb5f9a2f476c3f", size = 204442, upload-time = "2025-10-08T19:47:16.277Z" },
+ { url = "https://files.pythonhosted.org/packages/50/a6/4282772fd016a76d3e5c0df58380a5ea64900afd836cec2c2f662d1b9bb3/propcache-0.4.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:4d3df5fa7e36b3225954fba85589da77a0fe6a53e3976de39caf04a0db4c36f1", size = 199398, upload-time = "2025-10-08T19:47:17.962Z" },
+ { url = "https://files.pythonhosted.org/packages/3e/ec/d8a7cd406ee1ddb705db2139f8a10a8a427100347bd698e7014351c7af09/propcache-0.4.1-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:ee17f18d2498f2673e432faaa71698032b0127ebf23ae5974eeaf806c279df24", size = 196920, upload-time = "2025-10-08T19:47:19.355Z" },
+ { url = "https://files.pythonhosted.org/packages/f6/6c/f38ab64af3764f431e359f8baf9e0a21013e24329e8b85d2da32e8ed07ca/propcache-0.4.1-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:580e97762b950f993ae618e167e7be9256b8353c2dcd8b99ec100eb50f5286aa", size = 203748, upload-time = "2025-10-08T19:47:21.338Z" },
+ { url = "https://files.pythonhosted.org/packages/d6/e3/fa846bd70f6534d647886621388f0a265254d30e3ce47e5c8e6e27dbf153/propcache-0.4.1-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:501d20b891688eb8e7aa903021f0b72d5a55db40ffaab27edefd1027caaafa61", size = 205877, upload-time = "2025-10-08T19:47:23.059Z" },
+ { url = "https://files.pythonhosted.org/packages/e2/39/8163fc6f3133fea7b5f2827e8eba2029a0277ab2c5beee6c1db7b10fc23d/propcache-0.4.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:9a0bd56e5b100aef69bd8562b74b46254e7c8812918d3baa700c8a8009b0af66", size = 199437, upload-time = "2025-10-08T19:47:24.445Z" },
+ { url = "https://files.pythonhosted.org/packages/93/89/caa9089970ca49c7c01662bd0eeedfe85494e863e8043565aeb6472ce8fe/propcache-0.4.1-cp313-cp313-win32.whl", hash = "sha256:bcc9aaa5d80322bc2fb24bb7accb4a30f81e90ab8d6ba187aec0744bc302ad81", size = 37586, upload-time = "2025-10-08T19:47:25.736Z" },
+ { url = "https://files.pythonhosted.org/packages/f5/ab/f76ec3c3627c883215b5c8080debb4394ef5a7a29be811f786415fc1e6fd/propcache-0.4.1-cp313-cp313-win_amd64.whl", hash = "sha256:381914df18634f5494334d201e98245c0596067504b9372d8cf93f4bb23e025e", size = 40790, upload-time = "2025-10-08T19:47:26.847Z" },
+ { url = "https://files.pythonhosted.org/packages/59/1b/e71ae98235f8e2ba5004d8cb19765a74877abf189bc53fc0c80d799e56c3/propcache-0.4.1-cp313-cp313-win_arm64.whl", hash = "sha256:8873eb4460fd55333ea49b7d189749ecf6e55bf85080f11b1c4530ed3034cba1", size = 37158, upload-time = "2025-10-08T19:47:27.961Z" },
+ { url = "https://files.pythonhosted.org/packages/83/ce/a31bbdfc24ee0dcbba458c8175ed26089cf109a55bbe7b7640ed2470cfe9/propcache-0.4.1-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:92d1935ee1f8d7442da9c0c4fa7ac20d07e94064184811b685f5c4fada64553b", size = 81451, upload-time = "2025-10-08T19:47:29.445Z" },
+ { url = "https://files.pythonhosted.org/packages/25/9c/442a45a470a68456e710d96cacd3573ef26a1d0a60067e6a7d5e655621ed/propcache-0.4.1-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:473c61b39e1460d386479b9b2f337da492042447c9b685f28be4f74d3529e566", size = 46374, upload-time = "2025-10-08T19:47:30.579Z" },
+ { url = "https://files.pythonhosted.org/packages/f4/bf/b1d5e21dbc3b2e889ea4327044fb16312a736d97640fb8b6aa3f9c7b3b65/propcache-0.4.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:c0ef0aaafc66fbd87842a3fe3902fd889825646bc21149eafe47be6072725835", size = 48396, upload-time = "2025-10-08T19:47:31.79Z" },
+ { url = "https://files.pythonhosted.org/packages/f4/04/5b4c54a103d480e978d3c8a76073502b18db0c4bc17ab91b3cb5092ad949/propcache-0.4.1-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f95393b4d66bfae908c3ca8d169d5f79cd65636ae15b5e7a4f6e67af675adb0e", size = 275950, upload-time = "2025-10-08T19:47:33.481Z" },
+ { url = "https://files.pythonhosted.org/packages/b4/c1/86f846827fb969c4b78b0af79bba1d1ea2156492e1b83dea8b8a6ae27395/propcache-0.4.1-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c07fda85708bc48578467e85099645167a955ba093be0a2dcba962195676e859", size = 273856, upload-time = "2025-10-08T19:47:34.906Z" },
+ { url = "https://files.pythonhosted.org/packages/36/1d/fc272a63c8d3bbad6878c336c7a7dea15e8f2d23a544bda43205dfa83ada/propcache-0.4.1-cp313-cp313t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:af223b406d6d000830c6f65f1e6431783fc3f713ba3e6cc8c024d5ee96170a4b", size = 280420, upload-time = "2025-10-08T19:47:36.338Z" },
+ { url = "https://files.pythonhosted.org/packages/07/0c/01f2219d39f7e53d52e5173bcb09c976609ba30209912a0680adfb8c593a/propcache-0.4.1-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a78372c932c90ee474559c5ddfffd718238e8673c340dc21fe45c5b8b54559a0", size = 263254, upload-time = "2025-10-08T19:47:37.692Z" },
+ { url = "https://files.pythonhosted.org/packages/2d/18/cd28081658ce597898f0c4d174d4d0f3c5b6d4dc27ffafeef835c95eb359/propcache-0.4.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:564d9f0d4d9509e1a870c920a89b2fec951b44bf5ba7d537a9e7c1ccec2c18af", size = 261205, upload-time = "2025-10-08T19:47:39.659Z" },
+ { url = "https://files.pythonhosted.org/packages/7a/71/1f9e22eb8b8316701c2a19fa1f388c8a3185082607da8e406a803c9b954e/propcache-0.4.1-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:17612831fda0138059cc5546f4d12a2aacfb9e47068c06af35c400ba58ba7393", size = 247873, upload-time = "2025-10-08T19:47:41.084Z" },
+ { url = "https://files.pythonhosted.org/packages/4a/65/3d4b61f36af2b4eddba9def857959f1016a51066b4f1ce348e0cf7881f58/propcache-0.4.1-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:41a89040cb10bd345b3c1a873b2bf36413d48da1def52f268a055f7398514874", size = 262739, upload-time = "2025-10-08T19:47:42.51Z" },
+ { url = "https://files.pythonhosted.org/packages/2a/42/26746ab087faa77c1c68079b228810436ccd9a5ce9ac85e2b7307195fd06/propcache-0.4.1-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:e35b88984e7fa64aacecea39236cee32dd9bd8c55f57ba8a75cf2399553f9bd7", size = 263514, upload-time = "2025-10-08T19:47:43.927Z" },
+ { url = "https://files.pythonhosted.org/packages/94/13/630690fe201f5502d2403dd3cfd451ed8858fe3c738ee88d095ad2ff407b/propcache-0.4.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:6f8b465489f927b0df505cbe26ffbeed4d6d8a2bbc61ce90eb074ff129ef0ab1", size = 257781, upload-time = "2025-10-08T19:47:45.448Z" },
+ { url = "https://files.pythonhosted.org/packages/92/f7/1d4ec5841505f423469efbfc381d64b7b467438cd5a4bbcbb063f3b73d27/propcache-0.4.1-cp313-cp313t-win32.whl", hash = "sha256:2ad890caa1d928c7c2965b48f3a3815c853180831d0e5503d35cf00c472f4717", size = 41396, upload-time = "2025-10-08T19:47:47.202Z" },
+ { url = "https://files.pythonhosted.org/packages/48/f0/615c30622316496d2cbbc29f5985f7777d3ada70f23370608c1d3e081c1f/propcache-0.4.1-cp313-cp313t-win_amd64.whl", hash = "sha256:f7ee0e597f495cf415bcbd3da3caa3bd7e816b74d0d52b8145954c5e6fd3ff37", size = 44897, upload-time = "2025-10-08T19:47:48.336Z" },
+ { url = "https://files.pythonhosted.org/packages/fd/ca/6002e46eccbe0e33dcd4069ef32f7f1c9e243736e07adca37ae8c4830ec3/propcache-0.4.1-cp313-cp313t-win_arm64.whl", hash = "sha256:929d7cbe1f01bb7baffb33dc14eb5691c95831450a26354cd210a8155170c93a", size = 39789, upload-time = "2025-10-08T19:47:49.876Z" },
+ { url = "https://files.pythonhosted.org/packages/8e/5c/bca52d654a896f831b8256683457ceddd490ec18d9ec50e97dfd8fc726a8/propcache-0.4.1-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:3f7124c9d820ba5548d431afb4632301acf965db49e666aa21c305cbe8c6de12", size = 78152, upload-time = "2025-10-08T19:47:51.051Z" },
+ { url = "https://files.pythonhosted.org/packages/65/9b/03b04e7d82a5f54fb16113d839f5ea1ede58a61e90edf515f6577c66fa8f/propcache-0.4.1-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:c0d4b719b7da33599dfe3b22d3db1ef789210a0597bc650b7cee9c77c2be8c5c", size = 44869, upload-time = "2025-10-08T19:47:52.594Z" },
+ { url = "https://files.pythonhosted.org/packages/b2/fa/89a8ef0468d5833a23fff277b143d0573897cf75bd56670a6d28126c7d68/propcache-0.4.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:9f302f4783709a78240ebc311b793f123328716a60911d667e0c036bc5dcbded", size = 46596, upload-time = "2025-10-08T19:47:54.073Z" },
+ { url = "https://files.pythonhosted.org/packages/86/bd/47816020d337f4a746edc42fe8d53669965138f39ee117414c7d7a340cfe/propcache-0.4.1-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c80ee5802e3fb9ea37938e7eecc307fb984837091d5fd262bb37238b1ae97641", size = 206981, upload-time = "2025-10-08T19:47:55.715Z" },
+ { url = "https://files.pythonhosted.org/packages/df/f6/c5fa1357cc9748510ee55f37173eb31bfde6d94e98ccd9e6f033f2fc06e1/propcache-0.4.1-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:ed5a841e8bb29a55fb8159ed526b26adc5bdd7e8bd7bf793ce647cb08656cdf4", size = 211490, upload-time = "2025-10-08T19:47:57.499Z" },
+ { url = "https://files.pythonhosted.org/packages/80/1e/e5889652a7c4a3846683401a48f0f2e5083ce0ec1a8a5221d8058fbd1adf/propcache-0.4.1-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:55c72fd6ea2da4c318e74ffdf93c4fe4e926051133657459131a95c846d16d44", size = 215371, upload-time = "2025-10-08T19:47:59.317Z" },
+ { url = "https://files.pythonhosted.org/packages/b2/f2/889ad4b2408f72fe1a4f6a19491177b30ea7bf1a0fd5f17050ca08cfc882/propcache-0.4.1-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8326e144341460402713f91df60ade3c999d601e7eb5ff8f6f7862d54de0610d", size = 201424, upload-time = "2025-10-08T19:48:00.67Z" },
+ { url = "https://files.pythonhosted.org/packages/27/73/033d63069b57b0812c8bd19f311faebeceb6ba31b8f32b73432d12a0b826/propcache-0.4.1-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:060b16ae65bc098da7f6d25bf359f1f31f688384858204fe5d652979e0015e5b", size = 197566, upload-time = "2025-10-08T19:48:02.604Z" },
+ { url = "https://files.pythonhosted.org/packages/dc/89/ce24f3dc182630b4e07aa6d15f0ff4b14ed4b9955fae95a0b54c58d66c05/propcache-0.4.1-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:89eb3fa9524f7bec9de6e83cf3faed9d79bffa560672c118a96a171a6f55831e", size = 193130, upload-time = "2025-10-08T19:48:04.499Z" },
+ { url = "https://files.pythonhosted.org/packages/a9/24/ef0d5fd1a811fb5c609278d0209c9f10c35f20581fcc16f818da959fc5b4/propcache-0.4.1-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:dee69d7015dc235f526fe80a9c90d65eb0039103fe565776250881731f06349f", size = 202625, upload-time = "2025-10-08T19:48:06.213Z" },
+ { url = "https://files.pythonhosted.org/packages/f5/02/98ec20ff5546f68d673df2f7a69e8c0d076b5abd05ca882dc7ee3a83653d/propcache-0.4.1-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:5558992a00dfd54ccbc64a32726a3357ec93825a418a401f5cc67df0ac5d9e49", size = 204209, upload-time = "2025-10-08T19:48:08.432Z" },
+ { url = "https://files.pythonhosted.org/packages/a0/87/492694f76759b15f0467a2a93ab68d32859672b646aa8a04ce4864e7932d/propcache-0.4.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:c9b822a577f560fbd9554812526831712c1436d2c046cedee4c3796d3543b144", size = 197797, upload-time = "2025-10-08T19:48:09.968Z" },
+ { url = "https://files.pythonhosted.org/packages/ee/36/66367de3575db1d2d3f3d177432bd14ee577a39d3f5d1b3d5df8afe3b6e2/propcache-0.4.1-cp314-cp314-win32.whl", hash = "sha256:ab4c29b49d560fe48b696cdcb127dd36e0bc2472548f3bf56cc5cb3da2b2984f", size = 38140, upload-time = "2025-10-08T19:48:11.232Z" },
+ { url = "https://files.pythonhosted.org/packages/0c/2a/a758b47de253636e1b8aef181c0b4f4f204bf0dd964914fb2af90a95b49b/propcache-0.4.1-cp314-cp314-win_amd64.whl", hash = "sha256:5a103c3eb905fcea0ab98be99c3a9a5ab2de60228aa5aceedc614c0281cf6153", size = 41257, upload-time = "2025-10-08T19:48:12.707Z" },
+ { url = "https://files.pythonhosted.org/packages/34/5e/63bd5896c3fec12edcbd6f12508d4890d23c265df28c74b175e1ef9f4f3b/propcache-0.4.1-cp314-cp314-win_arm64.whl", hash = "sha256:74c1fb26515153e482e00177a1ad654721bf9207da8a494a0c05e797ad27b992", size = 38097, upload-time = "2025-10-08T19:48:13.923Z" },
+ { url = "https://files.pythonhosted.org/packages/99/85/9ff785d787ccf9bbb3f3106f79884a130951436f58392000231b4c737c80/propcache-0.4.1-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:824e908bce90fb2743bd6b59db36eb4f45cd350a39637c9f73b1c1ea66f5b75f", size = 81455, upload-time = "2025-10-08T19:48:15.16Z" },
+ { url = "https://files.pythonhosted.org/packages/90/85/2431c10c8e7ddb1445c1f7c4b54d886e8ad20e3c6307e7218f05922cad67/propcache-0.4.1-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:c2b5e7db5328427c57c8e8831abda175421b709672f6cfc3d630c3b7e2146393", size = 46372, upload-time = "2025-10-08T19:48:16.424Z" },
+ { url = "https://files.pythonhosted.org/packages/01/20/b0972d902472da9bcb683fa595099911f4d2e86e5683bcc45de60dd05dc3/propcache-0.4.1-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:6f6ff873ed40292cd4969ef5310179afd5db59fdf055897e282485043fc80ad0", size = 48411, upload-time = "2025-10-08T19:48:17.577Z" },
+ { url = "https://files.pythonhosted.org/packages/e2/e3/7dc89f4f21e8f99bad3d5ddb3a3389afcf9da4ac69e3deb2dcdc96e74169/propcache-0.4.1-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:49a2dc67c154db2c1463013594c458881a069fcf98940e61a0569016a583020a", size = 275712, upload-time = "2025-10-08T19:48:18.901Z" },
+ { url = "https://files.pythonhosted.org/packages/20/67/89800c8352489b21a8047c773067644e3897f02ecbbd610f4d46b7f08612/propcache-0.4.1-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:005f08e6a0529984491e37d8dbc3dd86f84bd78a8ceb5fa9a021f4c48d4984be", size = 273557, upload-time = "2025-10-08T19:48:20.762Z" },
+ { url = "https://files.pythonhosted.org/packages/e2/a1/b52b055c766a54ce6d9c16d9aca0cad8059acd9637cdf8aa0222f4a026ef/propcache-0.4.1-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5c3310452e0d31390da9035c348633b43d7e7feb2e37be252be6da45abd1abcc", size = 280015, upload-time = "2025-10-08T19:48:22.592Z" },
+ { url = "https://files.pythonhosted.org/packages/48/c8/33cee30bd890672c63743049f3c9e4be087e6780906bfc3ec58528be59c1/propcache-0.4.1-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4c3c70630930447f9ef1caac7728c8ad1c56bc5015338b20fed0d08ea2480b3a", size = 262880, upload-time = "2025-10-08T19:48:23.947Z" },
+ { url = "https://files.pythonhosted.org/packages/0c/b1/8f08a143b204b418285c88b83d00edbd61afbc2c6415ffafc8905da7038b/propcache-0.4.1-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:8e57061305815dfc910a3634dcf584f08168a8836e6999983569f51a8544cd89", size = 260938, upload-time = "2025-10-08T19:48:25.656Z" },
+ { url = "https://files.pythonhosted.org/packages/cf/12/96e4664c82ca2f31e1c8dff86afb867348979eb78d3cb8546a680287a1e9/propcache-0.4.1-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:521a463429ef54143092c11a77e04056dd00636f72e8c45b70aaa3140d639726", size = 247641, upload-time = "2025-10-08T19:48:27.207Z" },
+ { url = "https://files.pythonhosted.org/packages/18/ed/e7a9cfca28133386ba52278136d42209d3125db08d0a6395f0cba0c0285c/propcache-0.4.1-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:120c964da3fdc75e3731aa392527136d4ad35868cc556fd09bb6d09172d9a367", size = 262510, upload-time = "2025-10-08T19:48:28.65Z" },
+ { url = "https://files.pythonhosted.org/packages/f5/76/16d8bf65e8845dd62b4e2b57444ab81f07f40caa5652b8969b87ddcf2ef6/propcache-0.4.1-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:d8f353eb14ee3441ee844ade4277d560cdd68288838673273b978e3d6d2c8f36", size = 263161, upload-time = "2025-10-08T19:48:30.133Z" },
+ { url = "https://files.pythonhosted.org/packages/e7/70/c99e9edb5d91d5ad8a49fa3c1e8285ba64f1476782fed10ab251ff413ba1/propcache-0.4.1-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:ab2943be7c652f09638800905ee1bab2c544e537edb57d527997a24c13dc1455", size = 257393, upload-time = "2025-10-08T19:48:31.567Z" },
+ { url = "https://files.pythonhosted.org/packages/08/02/87b25304249a35c0915d236575bc3574a323f60b47939a2262b77632a3ee/propcache-0.4.1-cp314-cp314t-win32.whl", hash = "sha256:05674a162469f31358c30bcaa8883cb7829fa3110bf9c0991fe27d7896c42d85", size = 42546, upload-time = "2025-10-08T19:48:32.872Z" },
+ { url = "https://files.pythonhosted.org/packages/cb/ef/3c6ecf8b317aa982f309835e8f96987466123c6e596646d4e6a1dfcd080f/propcache-0.4.1-cp314-cp314t-win_amd64.whl", hash = "sha256:990f6b3e2a27d683cb7602ed6c86f15ee6b43b1194736f9baaeb93d0016633b1", size = 46259, upload-time = "2025-10-08T19:48:34.226Z" },
+ { url = "https://files.pythonhosted.org/packages/c4/2d/346e946d4951f37eca1e4f55be0f0174c52cd70720f84029b02f296f4a38/propcache-0.4.1-cp314-cp314t-win_arm64.whl", hash = "sha256:ecef2343af4cc68e05131e45024ba34f6095821988a9d0a02aa7c73fcc448aa9", size = 40428, upload-time = "2025-10-08T19:48:35.441Z" },
+ { url = "https://files.pythonhosted.org/packages/5b/5a/bc7b4a4ef808fa59a816c17b20c4bef6884daebbdf627ff2a161da67da19/propcache-0.4.1-py3-none-any.whl", hash = "sha256:af2a6052aeb6cf17d3e46ee169099044fd8224cbaf75c76a2ef596e8163e2237", size = 13305, upload-time = "2025-10-08T19:49:00.792Z" },
+]
+
+[[package]]
+name = "psutil"
+version = "7.1.2"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/cd/ec/7b8e6b9b1d22708138630ef34c53ab2b61032c04f16adfdbb96791c8c70c/psutil-7.1.2.tar.gz", hash = "sha256:aa225cdde1335ff9684708ee8c72650f6598d5ed2114b9a7c5802030b1785018", size = 487424, upload-time = "2025-10-25T10:46:34.931Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/b8/d9/b56cc9f883140ac10021a8c9b0f4e16eed1ba675c22513cdcbce3ba64014/psutil-7.1.2-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:0cc5c6889b9871f231ed5455a9a02149e388fffcb30b607fb7a8896a6d95f22e", size = 238575, upload-time = "2025-10-25T10:46:38.728Z" },
+ { url = "https://files.pythonhosted.org/packages/36/eb/28d22de383888deb252c818622196e709da98816e296ef95afda33f1c0a2/psutil-7.1.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:8e9e77a977208d84aa363a4a12e0f72189d58bbf4e46b49aae29a2c6e93ef206", size = 239297, upload-time = "2025-10-25T10:46:41.347Z" },
+ { url = "https://files.pythonhosted.org/packages/89/5d/220039e2f28cc129626e54d63892ab05c0d56a29818bfe7268dcb5008932/psutil-7.1.2-cp313-cp313t-manylinux2010_x86_64.manylinux_2_12_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:7d9623a5e4164d2220ecceb071f4b333b3c78866141e8887c072129185f41278", size = 280420, upload-time = "2025-10-25T10:46:44.122Z" },
+ { url = "https://files.pythonhosted.org/packages/ba/7a/286f0e1c167445b2ef4a6cbdfc8c59fdb45a5a493788950cf8467201dc73/psutil-7.1.2-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:364b1c10fe4ed59c89ec49e5f1a70da353b27986fa8233b4b999df4742a5ee2f", size = 283049, upload-time = "2025-10-25T10:46:47.095Z" },
+ { url = "https://files.pythonhosted.org/packages/aa/cc/7eb93260794a42e39b976f3a4dde89725800b9f573b014fac142002a5c98/psutil-7.1.2-cp313-cp313t-win_amd64.whl", hash = "sha256:f101ef84de7e05d41310e3ccbdd65a6dd1d9eed85e8aaf0758405d022308e204", size = 248713, upload-time = "2025-10-25T10:46:49.573Z" },
+ { url = "https://files.pythonhosted.org/packages/ab/1a/0681a92b53366e01f0a099f5237d0c8a2f79d322ac589cccde5e30c8a4e2/psutil-7.1.2-cp313-cp313t-win_arm64.whl", hash = "sha256:20c00824048a95de67f00afedc7b08b282aa08638585b0206a9fb51f28f1a165", size = 244644, upload-time = "2025-10-25T10:46:51.924Z" },
+ { url = "https://files.pythonhosted.org/packages/56/9e/f1c5c746b4ed5320952acd3002d3962fe36f30524c00ea79fdf954cc6779/psutil-7.1.2-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:e09cfe92aa8e22b1ec5e2d394820cf86c5dff6367ac3242366485dfa874d43bc", size = 238640, upload-time = "2025-10-25T10:46:54.089Z" },
+ { url = "https://files.pythonhosted.org/packages/32/ee/fd26216a735395cc25c3899634e34aeb41fb1f3dbb44acc67d9e594be562/psutil-7.1.2-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:fa6342cf859c48b19df3e4aa170e4cfb64aadc50b11e06bb569c6c777b089c9e", size = 239303, upload-time = "2025-10-25T10:46:56.932Z" },
+ { url = "https://files.pythonhosted.org/packages/3c/cd/7d96eaec4ef7742b845a9ce2759a2769ecce4ab7a99133da24abacbc9e41/psutil-7.1.2-cp314-cp314t-manylinux2010_x86_64.manylinux_2_12_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:625977443498ee7d6c1e63e93bacca893fd759a66c5f635d05e05811d23fb5ee", size = 281717, upload-time = "2025-10-25T10:46:59.116Z" },
+ { url = "https://files.pythonhosted.org/packages/bc/1a/7f0b84bdb067d35fe7fade5fff888408688caf989806ce2d6dae08c72dd5/psutil-7.1.2-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4a24bcd7b7f2918d934af0fb91859f621b873d6aa81267575e3655cd387572a7", size = 284575, upload-time = "2025-10-25T10:47:00.944Z" },
+ { url = "https://files.pythonhosted.org/packages/de/05/7820ef8f7b275268917e0c750eada5834581206d9024ca88edce93c4b762/psutil-7.1.2-cp314-cp314t-win_amd64.whl", hash = "sha256:329f05610da6380982e6078b9d0881d9ab1e9a7eb7c02d833bfb7340aa634e31", size = 249491, upload-time = "2025-10-25T10:47:03.174Z" },
+ { url = "https://files.pythonhosted.org/packages/db/9a/58de399c7cb58489f08498459ff096cd76b3f1ddc4f224ec2c5ef729c7d0/psutil-7.1.2-cp314-cp314t-win_arm64.whl", hash = "sha256:7b04c29e3c0c888e83ed4762b70f31e65c42673ea956cefa8ced0e31e185f582", size = 244880, upload-time = "2025-10-25T10:47:05.228Z" },
+ { url = "https://files.pythonhosted.org/packages/ae/89/b9f8d47ddbc52d7301fc868e8224e5f44ed3c7f55e6d0f54ecaf5dd9ff5e/psutil-7.1.2-cp36-abi3-macosx_10_9_x86_64.whl", hash = "sha256:c9ba5c19f2d46203ee8c152c7b01df6eec87d883cfd8ee1af2ef2727f6b0f814", size = 237244, upload-time = "2025-10-25T10:47:07.086Z" },
+ { url = "https://files.pythonhosted.org/packages/c8/7a/8628c2f6b240680a67d73d8742bb9ff39b1820a693740e43096d5dcb01e5/psutil-7.1.2-cp36-abi3-macosx_11_0_arm64.whl", hash = "sha256:2a486030d2fe81bec023f703d3d155f4823a10a47c36784c84f1cc7f8d39bedb", size = 238101, upload-time = "2025-10-25T10:47:09.523Z" },
+ { url = "https://files.pythonhosted.org/packages/30/28/5e27f4d5a0e347f8e3cc16cd7d35533dbce086c95807f1f0e9cd77e26c10/psutil-7.1.2-cp36-abi3-manylinux2010_x86_64.manylinux_2_12_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:3efd8fc791492e7808a51cb2b94889db7578bfaea22df931424f874468e389e3", size = 258675, upload-time = "2025-10-25T10:47:11.082Z" },
+ { url = "https://files.pythonhosted.org/packages/e5/5c/79cf60c9acf36d087f0db0f82066fca4a780e97e5b3a2e4c38209c03d170/psutil-7.1.2-cp36-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e2aeb9b64f481b8eabfc633bd39e0016d4d8bbcd590d984af764d80bf0851b8a", size = 260203, upload-time = "2025-10-25T10:47:13.226Z" },
+ { url = "https://files.pythonhosted.org/packages/f7/03/0a464404c51685dcb9329fdd660b1721e076ccd7b3d97dee066bcc9ffb15/psutil-7.1.2-cp37-abi3-win_amd64.whl", hash = "sha256:8e17852114c4e7996fe9da4745c2bdef001ebbf2f260dec406290e66628bdb91", size = 246714, upload-time = "2025-10-25T10:47:15.093Z" },
+ { url = "https://files.pythonhosted.org/packages/6a/32/97ca2090f2f1b45b01b6aa7ae161cfe50671de097311975ca6eea3e7aabc/psutil-7.1.2-cp37-abi3-win_arm64.whl", hash = "sha256:3e988455e61c240cc879cb62a008c2699231bf3e3d061d7fce4234463fd2abb4", size = 243742, upload-time = "2025-10-25T10:47:17.302Z" },
+]
+
+[[package]]
+name = "pyarrow"
+version = "22.0.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/30/53/04a7fdc63e6056116c9ddc8b43bc28c12cdd181b85cbeadb79278475f3ae/pyarrow-22.0.0.tar.gz", hash = "sha256:3d600dc583260d845c7d8a6db540339dd883081925da2bd1c5cb808f720b3cd9", size = 1151151, upload-time = "2025-10-24T12:30:00.762Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/d9/9b/cb3f7e0a345353def531ca879053e9ef6b9f38ed91aebcf68b09ba54dec0/pyarrow-22.0.0-cp310-cp310-macosx_12_0_arm64.whl", hash = "sha256:77718810bd3066158db1e95a63c160ad7ce08c6b0710bc656055033e39cdad88", size = 34223968, upload-time = "2025-10-24T10:03:31.21Z" },
+ { url = "https://files.pythonhosted.org/packages/6c/41/3184b8192a120306270c5307f105b70320fdaa592c99843c5ef78aaefdcf/pyarrow-22.0.0-cp310-cp310-macosx_12_0_x86_64.whl", hash = "sha256:44d2d26cda26d18f7af7db71453b7b783788322d756e81730acb98f24eb90ace", size = 35942085, upload-time = "2025-10-24T10:03:38.146Z" },
+ { url = "https://files.pythonhosted.org/packages/d9/3d/a1eab2f6f08001f9fb714b8ed5cfb045e2fe3e3e3c0c221f2c9ed1e6d67d/pyarrow-22.0.0-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:b9d71701ce97c95480fecb0039ec5bb889e75f110da72005743451339262f4ce", size = 44964613, upload-time = "2025-10-24T10:03:46.516Z" },
+ { url = "https://files.pythonhosted.org/packages/46/46/a1d9c24baf21cfd9ce994ac820a24608decf2710521b29223d4334985127/pyarrow-22.0.0-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:710624ab925dc2b05a6229d47f6f0dac1c1155e6ed559be7109f684eba048a48", size = 47627059, upload-time = "2025-10-24T10:03:55.353Z" },
+ { url = "https://files.pythonhosted.org/packages/3a/4c/f711acb13075c1391fd54bc17e078587672c575f8de2a6e62509af026dcf/pyarrow-22.0.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:f963ba8c3b0199f9d6b794c90ec77545e05eadc83973897a4523c9e8d84e9340", size = 47947043, upload-time = "2025-10-24T10:04:05.408Z" },
+ { url = "https://files.pythonhosted.org/packages/4e/70/1f3180dd7c2eab35c2aca2b29ace6c519f827dcd4cfeb8e0dca41612cf7a/pyarrow-22.0.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:bd0d42297ace400d8febe55f13fdf46e86754842b860c978dfec16f081e5c653", size = 50206505, upload-time = "2025-10-24T10:04:15.786Z" },
+ { url = "https://files.pythonhosted.org/packages/80/07/fea6578112c8c60ffde55883a571e4c4c6bc7049f119d6b09333b5cc6f73/pyarrow-22.0.0-cp310-cp310-win_amd64.whl", hash = "sha256:00626d9dc0f5ef3a75fe63fd68b9c7c8302d2b5bbc7f74ecaedba83447a24f84", size = 28101641, upload-time = "2025-10-24T10:04:22.57Z" },
+ { url = "https://files.pythonhosted.org/packages/2e/b7/18f611a8cdc43417f9394a3ccd3eace2f32183c08b9eddc3d17681819f37/pyarrow-22.0.0-cp311-cp311-macosx_12_0_arm64.whl", hash = "sha256:3e294c5eadfb93d78b0763e859a0c16d4051fc1c5231ae8956d61cb0b5666f5a", size = 34272022, upload-time = "2025-10-24T10:04:28.973Z" },
+ { url = "https://files.pythonhosted.org/packages/26/5c/f259e2526c67eb4b9e511741b19870a02363a47a35edbebc55c3178db22d/pyarrow-22.0.0-cp311-cp311-macosx_12_0_x86_64.whl", hash = "sha256:69763ab2445f632d90b504a815a2a033f74332997052b721002298ed6de40f2e", size = 35995834, upload-time = "2025-10-24T10:04:35.467Z" },
+ { url = "https://files.pythonhosted.org/packages/50/8d/281f0f9b9376d4b7f146913b26fac0aa2829cd1ee7e997f53a27411bbb92/pyarrow-22.0.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:b41f37cabfe2463232684de44bad753d6be08a7a072f6a83447eeaf0e4d2a215", size = 45030348, upload-time = "2025-10-24T10:04:43.366Z" },
+ { url = "https://files.pythonhosted.org/packages/f5/e5/53c0a1c428f0976bf22f513d79c73000926cb00b9c138d8e02daf2102e18/pyarrow-22.0.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:35ad0f0378c9359b3f297299c3309778bb03b8612f987399a0333a560b43862d", size = 47699480, upload-time = "2025-10-24T10:04:51.486Z" },
+ { url = "https://files.pythonhosted.org/packages/95/e1/9dbe4c465c3365959d183e6345d0a8d1dc5b02ca3f8db4760b3bc834cf25/pyarrow-22.0.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8382ad21458075c2e66a82a29d650f963ce51c7708c7c0ff313a8c206c4fd5e8", size = 48011148, upload-time = "2025-10-24T10:04:59.585Z" },
+ { url = "https://files.pythonhosted.org/packages/c5/b4/7caf5d21930061444c3cf4fa7535c82faf5263e22ce43af7c2759ceb5b8b/pyarrow-22.0.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:1a812a5b727bc09c3d7ea072c4eebf657c2f7066155506ba31ebf4792f88f016", size = 50276964, upload-time = "2025-10-24T10:05:08.175Z" },
+ { url = "https://files.pythonhosted.org/packages/ae/f3/cec89bd99fa3abf826f14d4e53d3d11340ce6f6af4d14bdcd54cd83b6576/pyarrow-22.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:ec5d40dd494882704fb876c16fa7261a69791e784ae34e6b5992e977bd2e238c", size = 28106517, upload-time = "2025-10-24T10:05:14.314Z" },
+ { url = "https://files.pythonhosted.org/packages/af/63/ba23862d69652f85b615ca14ad14f3bcfc5bf1b99ef3f0cd04ff93fdad5a/pyarrow-22.0.0-cp312-cp312-macosx_12_0_arm64.whl", hash = "sha256:bea79263d55c24a32b0d79c00a1c58bb2ee5f0757ed95656b01c0fb310c5af3d", size = 34211578, upload-time = "2025-10-24T10:05:21.583Z" },
+ { url = "https://files.pythonhosted.org/packages/b1/d0/f9ad86fe809efd2bcc8be32032fa72e8b0d112b01ae56a053006376c5930/pyarrow-22.0.0-cp312-cp312-macosx_12_0_x86_64.whl", hash = "sha256:12fe549c9b10ac98c91cf791d2945e878875d95508e1a5d14091a7aaa66d9cf8", size = 35989906, upload-time = "2025-10-24T10:05:29.485Z" },
+ { url = "https://files.pythonhosted.org/packages/b4/a8/f910afcb14630e64d673f15904ec27dd31f1e009b77033c365c84e8c1e1d/pyarrow-22.0.0-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:334f900ff08ce0423407af97e6c26ad5d4e3b0763645559ece6fbf3747d6a8f5", size = 45021677, upload-time = "2025-10-24T10:05:38.274Z" },
+ { url = "https://files.pythonhosted.org/packages/13/95/aec81f781c75cd10554dc17a25849c720d54feafb6f7847690478dcf5ef8/pyarrow-22.0.0-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:c6c791b09c57ed76a18b03f2631753a4960eefbbca80f846da8baefc6491fcfe", size = 47726315, upload-time = "2025-10-24T10:05:47.314Z" },
+ { url = "https://files.pythonhosted.org/packages/bb/d4/74ac9f7a54cfde12ee42734ea25d5a3c9a45db78f9def949307a92720d37/pyarrow-22.0.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:c3200cb41cdbc65156e5f8c908d739b0dfed57e890329413da2748d1a2cd1a4e", size = 47990906, upload-time = "2025-10-24T10:05:58.254Z" },
+ { url = "https://files.pythonhosted.org/packages/2e/71/fedf2499bf7a95062eafc989ace56572f3343432570e1c54e6599d5b88da/pyarrow-22.0.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ac93252226cf288753d8b46280f4edf3433bf9508b6977f8dd8526b521a1bbb9", size = 50306783, upload-time = "2025-10-24T10:06:08.08Z" },
+ { url = "https://files.pythonhosted.org/packages/68/ed/b202abd5a5b78f519722f3d29063dda03c114711093c1995a33b8e2e0f4b/pyarrow-22.0.0-cp312-cp312-win_amd64.whl", hash = "sha256:44729980b6c50a5f2bfcc2668d36c569ce17f8b17bccaf470c4313dcbbf13c9d", size = 27972883, upload-time = "2025-10-24T10:06:14.204Z" },
+ { url = "https://files.pythonhosted.org/packages/a6/d6/d0fac16a2963002fc22c8fa75180a838737203d558f0ed3b564c4a54eef5/pyarrow-22.0.0-cp313-cp313-macosx_12_0_arm64.whl", hash = "sha256:e6e95176209257803a8b3d0394f21604e796dadb643d2f7ca21b66c9c0b30c9a", size = 34204629, upload-time = "2025-10-24T10:06:20.274Z" },
+ { url = "https://files.pythonhosted.org/packages/c6/9c/1d6357347fbae062ad3f17082f9ebc29cc733321e892c0d2085f42a2212b/pyarrow-22.0.0-cp313-cp313-macosx_12_0_x86_64.whl", hash = "sha256:001ea83a58024818826a9e3f89bf9310a114f7e26dfe404a4c32686f97bd7901", size = 35985783, upload-time = "2025-10-24T10:06:27.301Z" },
+ { url = "https://files.pythonhosted.org/packages/ff/c0/782344c2ce58afbea010150df07e3a2f5fdad299cd631697ae7bd3bac6e3/pyarrow-22.0.0-cp313-cp313-manylinux_2_28_aarch64.whl", hash = "sha256:ce20fe000754f477c8a9125543f1936ea5b8867c5406757c224d745ed033e691", size = 45020999, upload-time = "2025-10-24T10:06:35.387Z" },
+ { url = "https://files.pythonhosted.org/packages/1b/8b/5362443737a5307a7b67c1017c42cd104213189b4970bf607e05faf9c525/pyarrow-22.0.0-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:e0a15757fccb38c410947df156f9749ae4a3c89b2393741a50521f39a8cf202a", size = 47724601, upload-time = "2025-10-24T10:06:43.551Z" },
+ { url = "https://files.pythonhosted.org/packages/69/4d/76e567a4fc2e190ee6072967cb4672b7d9249ac59ae65af2d7e3047afa3b/pyarrow-22.0.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:cedb9dd9358e4ea1d9bce3665ce0797f6adf97ff142c8e25b46ba9cdd508e9b6", size = 48001050, upload-time = "2025-10-24T10:06:52.284Z" },
+ { url = "https://files.pythonhosted.org/packages/01/5e/5653f0535d2a1aef8223cee9d92944cb6bccfee5cf1cd3f462d7cb022790/pyarrow-22.0.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:252be4a05f9d9185bb8c18e83764ebcfea7185076c07a7a662253af3a8c07941", size = 50307877, upload-time = "2025-10-24T10:07:02.405Z" },
+ { url = "https://files.pythonhosted.org/packages/2d/f8/1d0bd75bf9328a3b826e24a16e5517cd7f9fbf8d34a3184a4566ef5a7f29/pyarrow-22.0.0-cp313-cp313-win_amd64.whl", hash = "sha256:a4893d31e5ef780b6edcaf63122df0f8d321088bb0dee4c8c06eccb1ca28d145", size = 27977099, upload-time = "2025-10-24T10:08:07.259Z" },
+ { url = "https://files.pythonhosted.org/packages/90/81/db56870c997805bf2b0f6eeeb2d68458bf4654652dccdcf1bf7a42d80903/pyarrow-22.0.0-cp313-cp313t-macosx_12_0_arm64.whl", hash = "sha256:f7fe3dbe871294ba70d789be16b6e7e52b418311e166e0e3cba9522f0f437fb1", size = 34336685, upload-time = "2025-10-24T10:07:11.47Z" },
+ { url = "https://files.pythonhosted.org/packages/1c/98/0727947f199aba8a120f47dfc229eeb05df15bcd7a6f1b669e9f882afc58/pyarrow-22.0.0-cp313-cp313t-macosx_12_0_x86_64.whl", hash = "sha256:ba95112d15fd4f1105fb2402c4eab9068f0554435e9b7085924bcfaac2cc306f", size = 36032158, upload-time = "2025-10-24T10:07:18.626Z" },
+ { url = "https://files.pythonhosted.org/packages/96/b4/9babdef9c01720a0785945c7cf550e4acd0ebcd7bdd2e6f0aa7981fa85e2/pyarrow-22.0.0-cp313-cp313t-manylinux_2_28_aarch64.whl", hash = "sha256:c064e28361c05d72eed8e744c9605cbd6d2bb7481a511c74071fd9b24bc65d7d", size = 44892060, upload-time = "2025-10-24T10:07:26.002Z" },
+ { url = "https://files.pythonhosted.org/packages/f8/ca/2f8804edd6279f78a37062d813de3f16f29183874447ef6d1aadbb4efa0f/pyarrow-22.0.0-cp313-cp313t-manylinux_2_28_x86_64.whl", hash = "sha256:6f9762274496c244d951c819348afbcf212714902742225f649cf02823a6a10f", size = 47504395, upload-time = "2025-10-24T10:07:34.09Z" },
+ { url = "https://files.pythonhosted.org/packages/b9/f0/77aa5198fd3943682b2e4faaf179a674f0edea0d55d326d83cb2277d9363/pyarrow-22.0.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:a9d9ffdc2ab696f6b15b4d1f7cec6658e1d788124418cb30030afbae31c64746", size = 48066216, upload-time = "2025-10-24T10:07:43.528Z" },
+ { url = "https://files.pythonhosted.org/packages/79/87/a1937b6e78b2aff18b706d738c9e46ade5bfcf11b294e39c87706a0089ac/pyarrow-22.0.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:ec1a15968a9d80da01e1d30349b2b0d7cc91e96588ee324ce1b5228175043e95", size = 50288552, upload-time = "2025-10-24T10:07:53.519Z" },
+ { url = "https://files.pythonhosted.org/packages/60/ae/b5a5811e11f25788ccfdaa8f26b6791c9807119dffcf80514505527c384c/pyarrow-22.0.0-cp313-cp313t-win_amd64.whl", hash = "sha256:bba208d9c7decf9961998edf5c65e3ea4355d5818dd6cd0f6809bec1afb951cc", size = 28262504, upload-time = "2025-10-24T10:08:00.932Z" },
+ { url = "https://files.pythonhosted.org/packages/bd/b0/0fa4d28a8edb42b0a7144edd20befd04173ac79819547216f8a9f36f9e50/pyarrow-22.0.0-cp314-cp314-macosx_12_0_arm64.whl", hash = "sha256:9bddc2cade6561f6820d4cd73f99a0243532ad506bc510a75a5a65a522b2d74d", size = 34224062, upload-time = "2025-10-24T10:08:14.101Z" },
+ { url = "https://files.pythonhosted.org/packages/0f/a8/7a719076b3c1be0acef56a07220c586f25cd24de0e3f3102b438d18ae5df/pyarrow-22.0.0-cp314-cp314-macosx_12_0_x86_64.whl", hash = "sha256:e70ff90c64419709d38c8932ea9fe1cc98415c4f87ea8da81719e43f02534bc9", size = 35990057, upload-time = "2025-10-24T10:08:21.842Z" },
+ { url = "https://files.pythonhosted.org/packages/89/3c/359ed54c93b47fb6fe30ed16cdf50e3f0e8b9ccfb11b86218c3619ae50a8/pyarrow-22.0.0-cp314-cp314-manylinux_2_28_aarch64.whl", hash = "sha256:92843c305330aa94a36e706c16209cd4df274693e777ca47112617db7d0ef3d7", size = 45068002, upload-time = "2025-10-24T10:08:29.034Z" },
+ { url = "https://files.pythonhosted.org/packages/55/fc/4945896cc8638536ee787a3bd6ce7cec8ec9acf452d78ec39ab328efa0a1/pyarrow-22.0.0-cp314-cp314-manylinux_2_28_x86_64.whl", hash = "sha256:6dda1ddac033d27421c20d7a7943eec60be44e0db4e079f33cc5af3b8280ccde", size = 47737765, upload-time = "2025-10-24T10:08:38.559Z" },
+ { url = "https://files.pythonhosted.org/packages/cd/5e/7cb7edeb2abfaa1f79b5d5eb89432356155c8426f75d3753cbcb9592c0fd/pyarrow-22.0.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:84378110dd9a6c06323b41b56e129c504d157d1a983ce8f5443761eb5256bafc", size = 48048139, upload-time = "2025-10-24T10:08:46.784Z" },
+ { url = "https://files.pythonhosted.org/packages/88/c6/546baa7c48185f5e9d6e59277c4b19f30f48c94d9dd938c2a80d4d6b067c/pyarrow-22.0.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:854794239111d2b88b40b6ef92aa478024d1e5074f364033e73e21e3f76b25e0", size = 50314244, upload-time = "2025-10-24T10:08:55.771Z" },
+ { url = "https://files.pythonhosted.org/packages/3c/79/755ff2d145aafec8d347bf18f95e4e81c00127f06d080135dfc86aea417c/pyarrow-22.0.0-cp314-cp314-win_amd64.whl", hash = "sha256:b883fe6fd85adad7932b3271c38ac289c65b7337c2c132e9569f9d3940620730", size = 28757501, upload-time = "2025-10-24T10:09:59.891Z" },
+ { url = "https://files.pythonhosted.org/packages/0e/d2/237d75ac28ced3147912954e3c1a174df43a95f4f88e467809118a8165e0/pyarrow-22.0.0-cp314-cp314t-macosx_12_0_arm64.whl", hash = "sha256:7a820d8ae11facf32585507c11f04e3f38343c1e784c9b5a8b1da5c930547fe2", size = 34355506, upload-time = "2025-10-24T10:09:02.953Z" },
+ { url = "https://files.pythonhosted.org/packages/1e/2c/733dfffe6d3069740f98e57ff81007809067d68626c5faef293434d11bd6/pyarrow-22.0.0-cp314-cp314t-macosx_12_0_x86_64.whl", hash = "sha256:c6ec3675d98915bf1ec8b3c7986422682f7232ea76cad276f4c8abd5b7319b70", size = 36047312, upload-time = "2025-10-24T10:09:10.334Z" },
+ { url = "https://files.pythonhosted.org/packages/7c/2b/29d6e3782dc1f299727462c1543af357a0f2c1d3c160ce199950d9ca51eb/pyarrow-22.0.0-cp314-cp314t-manylinux_2_28_aarch64.whl", hash = "sha256:3e739edd001b04f654b166204fc7a9de896cf6007eaff33409ee9e50ceaff754", size = 45081609, upload-time = "2025-10-24T10:09:18.61Z" },
+ { url = "https://files.pythonhosted.org/packages/8d/42/aa9355ecc05997915af1b7b947a7f66c02dcaa927f3203b87871c114ba10/pyarrow-22.0.0-cp314-cp314t-manylinux_2_28_x86_64.whl", hash = "sha256:7388ac685cab5b279a41dfe0a6ccd99e4dbf322edfb63e02fc0443bf24134e91", size = 47703663, upload-time = "2025-10-24T10:09:27.369Z" },
+ { url = "https://files.pythonhosted.org/packages/ee/62/45abedde480168e83a1de005b7b7043fd553321c1e8c5a9a114425f64842/pyarrow-22.0.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:f633074f36dbc33d5c05b5dc75371e5660f1dbf9c8b1d95669def05e5425989c", size = 48066543, upload-time = "2025-10-24T10:09:34.908Z" },
+ { url = "https://files.pythonhosted.org/packages/84/e9/7878940a5b072e4f3bf998770acafeae13b267f9893af5f6d4ab3904b67e/pyarrow-22.0.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:4c19236ae2402a8663a2c8f21f1870a03cc57f0bef7e4b6eb3238cc82944de80", size = 50288838, upload-time = "2025-10-24T10:09:44.394Z" },
+ { url = "https://files.pythonhosted.org/packages/7b/03/f335d6c52b4a4761bcc83499789a1e2e16d9d201a58c327a9b5cc9a41bd9/pyarrow-22.0.0-cp314-cp314t-win_amd64.whl", hash = "sha256:0c34fe18094686194f204a3b1787a27456897d8a2d62caf84b61e8dfbc0252ae", size = 29185594, upload-time = "2025-10-24T10:09:53.111Z" },
+]
+
+[[package]]
+name = "pydantic"
+version = "2.12.3"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "annotated-types" },
+ { name = "pydantic-core" },
+ { name = "typing-extensions" },
+ { name = "typing-inspection" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/f3/1e/4f0a3233767010308f2fd6bd0814597e3f63f1dc98304a9112b8759df4ff/pydantic-2.12.3.tar.gz", hash = "sha256:1da1c82b0fc140bb0103bc1441ffe062154c8d38491189751ee00fd8ca65ce74", size = 819383, upload-time = "2025-10-17T15:04:21.222Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/a1/6b/83661fa77dcefa195ad5f8cd9af3d1a7450fd57cc883ad04d65446ac2029/pydantic-2.12.3-py3-none-any.whl", hash = "sha256:6986454a854bc3bc6e5443e1369e06a3a456af9d339eda45510f517d9ea5c6bf", size = 462431, upload-time = "2025-10-17T15:04:19.346Z" },
+]
+
+[[package]]
+name = "pydantic-core"
+version = "2.41.4"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "typing-extensions" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/df/18/d0944e8eaaa3efd0a91b0f1fc537d3be55ad35091b6a87638211ba691964/pydantic_core-2.41.4.tar.gz", hash = "sha256:70e47929a9d4a1905a67e4b687d5946026390568a8e952b92824118063cee4d5", size = 457557, upload-time = "2025-10-14T10:23:47.909Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/a7/3d/9b8ca77b0f76fcdbf8bc6b72474e264283f461284ca84ac3fde570c6c49a/pydantic_core-2.41.4-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:2442d9a4d38f3411f22eb9dd0912b7cbf4b7d5b6c92c4173b75d3e1ccd84e36e", size = 2111197, upload-time = "2025-10-14T10:19:43.303Z" },
+ { url = "https://files.pythonhosted.org/packages/59/92/b7b0fe6ed4781642232755cb7e56a86e2041e1292f16d9ae410a0ccee5ac/pydantic_core-2.41.4-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:30a9876226dda131a741afeab2702e2d127209bde3c65a2b8133f428bc5d006b", size = 1917909, upload-time = "2025-10-14T10:19:45.194Z" },
+ { url = "https://files.pythonhosted.org/packages/52/8c/3eb872009274ffa4fb6a9585114e161aa1a0915af2896e2d441642929fe4/pydantic_core-2.41.4-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d55bbac04711e2980645af68b97d445cdbcce70e5216de444a6c4b6943ebcccd", size = 1969905, upload-time = "2025-10-14T10:19:46.567Z" },
+ { url = "https://files.pythonhosted.org/packages/f4/21/35adf4a753bcfaea22d925214a0c5b880792e3244731b3f3e6fec0d124f7/pydantic_core-2.41.4-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:e1d778fb7849a42d0ee5927ab0f7453bf9f85eef8887a546ec87db5ddb178945", size = 2051938, upload-time = "2025-10-14T10:19:48.237Z" },
+ { url = "https://files.pythonhosted.org/packages/7d/d0/cdf7d126825e36d6e3f1eccf257da8954452934ede275a8f390eac775e89/pydantic_core-2.41.4-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1b65077a4693a98b90ec5ad8f203ad65802a1b9b6d4a7e48066925a7e1606706", size = 2250710, upload-time = "2025-10-14T10:19:49.619Z" },
+ { url = "https://files.pythonhosted.org/packages/2e/1c/af1e6fd5ea596327308f9c8d1654e1285cc3d8de0d584a3c9d7705bf8a7c/pydantic_core-2.41.4-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:62637c769dee16eddb7686bf421be48dfc2fae93832c25e25bc7242e698361ba", size = 2367445, upload-time = "2025-10-14T10:19:51.269Z" },
+ { url = "https://files.pythonhosted.org/packages/d3/81/8cece29a6ef1b3a92f956ea6da6250d5b2d2e7e4d513dd3b4f0c7a83dfea/pydantic_core-2.41.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2dfe3aa529c8f501babf6e502936b9e8d4698502b2cfab41e17a028d91b1ac7b", size = 2072875, upload-time = "2025-10-14T10:19:52.671Z" },
+ { url = "https://files.pythonhosted.org/packages/e3/37/a6a579f5fc2cd4d5521284a0ab6a426cc6463a7b3897aeb95b12f1ba607b/pydantic_core-2.41.4-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:ca2322da745bf2eeb581fc9ea3bbb31147702163ccbcbf12a3bb630e4bf05e1d", size = 2191329, upload-time = "2025-10-14T10:19:54.214Z" },
+ { url = "https://files.pythonhosted.org/packages/ae/03/505020dc5c54ec75ecba9f41119fd1e48f9e41e4629942494c4a8734ded1/pydantic_core-2.41.4-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:e8cd3577c796be7231dcf80badcf2e0835a46665eaafd8ace124d886bab4d700", size = 2151658, upload-time = "2025-10-14T10:19:55.843Z" },
+ { url = "https://files.pythonhosted.org/packages/cb/5d/2c0d09fb53aa03bbd2a214d89ebfa6304be7df9ed86ee3dc7770257f41ee/pydantic_core-2.41.4-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:1cae8851e174c83633f0833e90636832857297900133705ee158cf79d40f03e6", size = 2316777, upload-time = "2025-10-14T10:19:57.607Z" },
+ { url = "https://files.pythonhosted.org/packages/ea/4b/c2c9c8f5e1f9c864b57d08539d9d3db160e00491c9f5ee90e1bfd905e644/pydantic_core-2.41.4-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:a26d950449aae348afe1ac8be5525a00ae4235309b729ad4d3399623125b43c9", size = 2320705, upload-time = "2025-10-14T10:19:59.016Z" },
+ { url = "https://files.pythonhosted.org/packages/28/c3/a74c1c37f49c0a02c89c7340fafc0ba816b29bd495d1a31ce1bdeacc6085/pydantic_core-2.41.4-cp310-cp310-win32.whl", hash = "sha256:0cf2a1f599efe57fa0051312774280ee0f650e11152325e41dfd3018ef2c1b57", size = 1975464, upload-time = "2025-10-14T10:20:00.581Z" },
+ { url = "https://files.pythonhosted.org/packages/d6/23/5dd5c1324ba80303368f7569e2e2e1a721c7d9eb16acb7eb7b7f85cb1be2/pydantic_core-2.41.4-cp310-cp310-win_amd64.whl", hash = "sha256:a8c2e340d7e454dc3340d3d2e8f23558ebe78c98aa8f68851b04dcb7bc37abdc", size = 2024497, upload-time = "2025-10-14T10:20:03.018Z" },
+ { url = "https://files.pythonhosted.org/packages/62/4c/f6cbfa1e8efacd00b846764e8484fe173d25b8dab881e277a619177f3384/pydantic_core-2.41.4-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:28ff11666443a1a8cf2a044d6a545ebffa8382b5f7973f22c36109205e65dc80", size = 2109062, upload-time = "2025-10-14T10:20:04.486Z" },
+ { url = "https://files.pythonhosted.org/packages/21/f8/40b72d3868896bfcd410e1bd7e516e762d326201c48e5b4a06446f6cf9e8/pydantic_core-2.41.4-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:61760c3925d4633290292bad462e0f737b840508b4f722247d8729684f6539ae", size = 1916301, upload-time = "2025-10-14T10:20:06.857Z" },
+ { url = "https://files.pythonhosted.org/packages/94/4d/d203dce8bee7faeca791671c88519969d98d3b4e8f225da5b96dad226fc8/pydantic_core-2.41.4-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:eae547b7315d055b0de2ec3965643b0ab82ad0106a7ffd29615ee9f266a02827", size = 1968728, upload-time = "2025-10-14T10:20:08.353Z" },
+ { url = "https://files.pythonhosted.org/packages/65/f5/6a66187775df87c24d526985b3a5d78d861580ca466fbd9d4d0e792fcf6c/pydantic_core-2.41.4-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ef9ee5471edd58d1fcce1c80ffc8783a650e3e3a193fe90d52e43bb4d87bff1f", size = 2050238, upload-time = "2025-10-14T10:20:09.766Z" },
+ { url = "https://files.pythonhosted.org/packages/5e/b9/78336345de97298cf53236b2f271912ce11f32c1e59de25a374ce12f9cce/pydantic_core-2.41.4-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:15dd504af121caaf2c95cb90c0ebf71603c53de98305621b94da0f967e572def", size = 2249424, upload-time = "2025-10-14T10:20:11.732Z" },
+ { url = "https://files.pythonhosted.org/packages/99/bb/a4584888b70ee594c3d374a71af5075a68654d6c780369df269118af7402/pydantic_core-2.41.4-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3a926768ea49a8af4d36abd6a8968b8790f7f76dd7cbd5a4c180db2b4ac9a3a2", size = 2366047, upload-time = "2025-10-14T10:20:13.647Z" },
+ { url = "https://files.pythonhosted.org/packages/5f/8d/17fc5de9d6418e4d2ae8c675f905cdafdc59d3bf3bf9c946b7ab796a992a/pydantic_core-2.41.4-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6916b9b7d134bff5440098a4deb80e4cb623e68974a87883299de9124126c2a8", size = 2071163, upload-time = "2025-10-14T10:20:15.307Z" },
+ { url = "https://files.pythonhosted.org/packages/54/e7/03d2c5c0b8ed37a4617430db68ec5e7dbba66358b629cd69e11b4d564367/pydantic_core-2.41.4-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:5cf90535979089df02e6f17ffd076f07237efa55b7343d98760bde8743c4b265", size = 2190585, upload-time = "2025-10-14T10:20:17.3Z" },
+ { url = "https://files.pythonhosted.org/packages/be/fc/15d1c9fe5ad9266a5897d9b932b7f53d7e5cfc800573917a2c5d6eea56ec/pydantic_core-2.41.4-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:7533c76fa647fade2d7ec75ac5cc079ab3f34879626dae5689b27790a6cf5a5c", size = 2150109, upload-time = "2025-10-14T10:20:19.143Z" },
+ { url = "https://files.pythonhosted.org/packages/26/ef/e735dd008808226c83ba56972566138665b71477ad580fa5a21f0851df48/pydantic_core-2.41.4-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:37e516bca9264cbf29612539801ca3cd5d1be465f940417b002905e6ed79d38a", size = 2315078, upload-time = "2025-10-14T10:20:20.742Z" },
+ { url = "https://files.pythonhosted.org/packages/90/00/806efdcf35ff2ac0f938362350cd9827b8afb116cc814b6b75cf23738c7c/pydantic_core-2.41.4-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:0c19cb355224037c83642429b8ce261ae108e1c5fbf5c028bac63c77b0f8646e", size = 2318737, upload-time = "2025-10-14T10:20:22.306Z" },
+ { url = "https://files.pythonhosted.org/packages/41/7e/6ac90673fe6cb36621a2283552897838c020db343fa86e513d3f563b196f/pydantic_core-2.41.4-cp311-cp311-win32.whl", hash = "sha256:09c2a60e55b357284b5f31f5ab275ba9f7f70b7525e18a132ec1f9160b4f1f03", size = 1974160, upload-time = "2025-10-14T10:20:23.817Z" },
+ { url = "https://files.pythonhosted.org/packages/e0/9d/7c5e24ee585c1f8b6356e1d11d40ab807ffde44d2db3b7dfd6d20b09720e/pydantic_core-2.41.4-cp311-cp311-win_amd64.whl", hash = "sha256:711156b6afb5cb1cb7c14a2cc2c4a8b4c717b69046f13c6b332d8a0a8f41ca3e", size = 2021883, upload-time = "2025-10-14T10:20:25.48Z" },
+ { url = "https://files.pythonhosted.org/packages/33/90/5c172357460fc28b2871eb4a0fb3843b136b429c6fa827e4b588877bf115/pydantic_core-2.41.4-cp311-cp311-win_arm64.whl", hash = "sha256:6cb9cf7e761f4f8a8589a45e49ed3c0d92d1d696a45a6feaee8c904b26efc2db", size = 1968026, upload-time = "2025-10-14T10:20:27.039Z" },
+ { url = "https://files.pythonhosted.org/packages/e9/81/d3b3e95929c4369d30b2a66a91db63c8ed0a98381ae55a45da2cd1cc1288/pydantic_core-2.41.4-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:ab06d77e053d660a6faaf04894446df7b0a7e7aba70c2797465a0a1af00fc887", size = 2099043, upload-time = "2025-10-14T10:20:28.561Z" },
+ { url = "https://files.pythonhosted.org/packages/58/da/46fdac49e6717e3a94fc9201403e08d9d61aa7a770fab6190b8740749047/pydantic_core-2.41.4-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:c53ff33e603a9c1179a9364b0a24694f183717b2e0da2b5ad43c316c956901b2", size = 1910699, upload-time = "2025-10-14T10:20:30.217Z" },
+ { url = "https://files.pythonhosted.org/packages/1e/63/4d948f1b9dd8e991a5a98b77dd66c74641f5f2e5225fee37994b2e07d391/pydantic_core-2.41.4-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:304c54176af2c143bd181d82e77c15c41cbacea8872a2225dd37e6544dce9999", size = 1952121, upload-time = "2025-10-14T10:20:32.246Z" },
+ { url = "https://files.pythonhosted.org/packages/b2/a7/e5fc60a6f781fc634ecaa9ecc3c20171d238794cef69ae0af79ac11b89d7/pydantic_core-2.41.4-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:025ba34a4cf4fb32f917d5d188ab5e702223d3ba603be4d8aca2f82bede432a4", size = 2041590, upload-time = "2025-10-14T10:20:34.332Z" },
+ { url = "https://files.pythonhosted.org/packages/70/69/dce747b1d21d59e85af433428978a1893c6f8a7068fa2bb4a927fba7a5ff/pydantic_core-2.41.4-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b9f5f30c402ed58f90c70e12eff65547d3ab74685ffe8283c719e6bead8ef53f", size = 2219869, upload-time = "2025-10-14T10:20:35.965Z" },
+ { url = "https://files.pythonhosted.org/packages/83/6a/c070e30e295403bf29c4df1cb781317b6a9bac7cd07b8d3acc94d501a63c/pydantic_core-2.41.4-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:dd96e5d15385d301733113bcaa324c8bcf111275b7675a9c6e88bfb19fc05e3b", size = 2345169, upload-time = "2025-10-14T10:20:37.627Z" },
+ { url = "https://files.pythonhosted.org/packages/f0/83/06d001f8043c336baea7fd202a9ac7ad71f87e1c55d8112c50b745c40324/pydantic_core-2.41.4-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:98f348cbb44fae6e9653c1055db7e29de67ea6a9ca03a5fa2c2e11a47cff0e47", size = 2070165, upload-time = "2025-10-14T10:20:39.246Z" },
+ { url = "https://files.pythonhosted.org/packages/14/0a/e567c2883588dd12bcbc110232d892cf385356f7c8a9910311ac997ab715/pydantic_core-2.41.4-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:ec22626a2d14620a83ca583c6f5a4080fa3155282718b6055c2ea48d3ef35970", size = 2189067, upload-time = "2025-10-14T10:20:41.015Z" },
+ { url = "https://files.pythonhosted.org/packages/f4/1d/3d9fca34273ba03c9b1c5289f7618bc4bd09c3ad2289b5420481aa051a99/pydantic_core-2.41.4-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:3a95d4590b1f1a43bf33ca6d647b990a88f4a3824a8c4572c708f0b45a5290ed", size = 2132997, upload-time = "2025-10-14T10:20:43.106Z" },
+ { url = "https://files.pythonhosted.org/packages/52/70/d702ef7a6cd41a8afc61f3554922b3ed8d19dd54c3bd4bdbfe332e610827/pydantic_core-2.41.4-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:f9672ab4d398e1b602feadcffcdd3af44d5f5e6ddc15bc7d15d376d47e8e19f8", size = 2307187, upload-time = "2025-10-14T10:20:44.849Z" },
+ { url = "https://files.pythonhosted.org/packages/68/4c/c06be6e27545d08b802127914156f38d10ca287a9e8489342793de8aae3c/pydantic_core-2.41.4-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:84d8854db5f55fead3b579f04bda9a36461dab0730c5d570e1526483e7bb8431", size = 2305204, upload-time = "2025-10-14T10:20:46.781Z" },
+ { url = "https://files.pythonhosted.org/packages/b0/e5/35ae4919bcd9f18603419e23c5eaf32750224a89d41a8df1a3704b69f77e/pydantic_core-2.41.4-cp312-cp312-win32.whl", hash = "sha256:9be1c01adb2ecc4e464392c36d17f97e9110fbbc906bcbe1c943b5b87a74aabd", size = 1972536, upload-time = "2025-10-14T10:20:48.39Z" },
+ { url = "https://files.pythonhosted.org/packages/1e/c2/49c5bb6d2a49eb2ee3647a93e3dae7080c6409a8a7558b075027644e879c/pydantic_core-2.41.4-cp312-cp312-win_amd64.whl", hash = "sha256:d682cf1d22bab22a5be08539dca3d1593488a99998f9f412137bc323179067ff", size = 2031132, upload-time = "2025-10-14T10:20:50.421Z" },
+ { url = "https://files.pythonhosted.org/packages/06/23/936343dbcba6eec93f73e95eb346810fc732f71ba27967b287b66f7b7097/pydantic_core-2.41.4-cp312-cp312-win_arm64.whl", hash = "sha256:833eebfd75a26d17470b58768c1834dfc90141b7afc6eb0429c21fc5a21dcfb8", size = 1969483, upload-time = "2025-10-14T10:20:52.35Z" },
+ { url = "https://files.pythonhosted.org/packages/13/d0/c20adabd181a029a970738dfe23710b52a31f1258f591874fcdec7359845/pydantic_core-2.41.4-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:85e050ad9e5f6fe1004eec65c914332e52f429bc0ae12d6fa2092407a462c746", size = 2105688, upload-time = "2025-10-14T10:20:54.448Z" },
+ { url = "https://files.pythonhosted.org/packages/00/b6/0ce5c03cec5ae94cca220dfecddc453c077d71363b98a4bbdb3c0b22c783/pydantic_core-2.41.4-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:e7393f1d64792763a48924ba31d1e44c2cfbc05e3b1c2c9abb4ceeadd912cced", size = 1910807, upload-time = "2025-10-14T10:20:56.115Z" },
+ { url = "https://files.pythonhosted.org/packages/68/3e/800d3d02c8beb0b5c069c870cbb83799d085debf43499c897bb4b4aaff0d/pydantic_core-2.41.4-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:94dab0940b0d1fb28bcab847adf887c66a27a40291eedf0b473be58761c9799a", size = 1956669, upload-time = "2025-10-14T10:20:57.874Z" },
+ { url = "https://files.pythonhosted.org/packages/60/a4/24271cc71a17f64589be49ab8bd0751f6a0a03046c690df60989f2f95c2c/pydantic_core-2.41.4-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:de7c42f897e689ee6f9e93c4bec72b99ae3b32a2ade1c7e4798e690ff5246e02", size = 2051629, upload-time = "2025-10-14T10:21:00.006Z" },
+ { url = "https://files.pythonhosted.org/packages/68/de/45af3ca2f175d91b96bfb62e1f2d2f1f9f3b14a734afe0bfeff079f78181/pydantic_core-2.41.4-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:664b3199193262277b8b3cd1e754fb07f2c6023289c815a1e1e8fb415cb247b1", size = 2224049, upload-time = "2025-10-14T10:21:01.801Z" },
+ { url = "https://files.pythonhosted.org/packages/af/8f/ae4e1ff84672bf869d0a77af24fd78387850e9497753c432875066b5d622/pydantic_core-2.41.4-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d95b253b88f7d308b1c0b417c4624f44553ba4762816f94e6986819b9c273fb2", size = 2342409, upload-time = "2025-10-14T10:21:03.556Z" },
+ { url = "https://files.pythonhosted.org/packages/18/62/273dd70b0026a085c7b74b000394e1ef95719ea579c76ea2f0cc8893736d/pydantic_core-2.41.4-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a1351f5bbdbbabc689727cb91649a00cb9ee7203e0a6e54e9f5ba9e22e384b84", size = 2069635, upload-time = "2025-10-14T10:21:05.385Z" },
+ { url = "https://files.pythonhosted.org/packages/30/03/cf485fff699b4cdaea469bc481719d3e49f023241b4abb656f8d422189fc/pydantic_core-2.41.4-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:1affa4798520b148d7182da0615d648e752de4ab1a9566b7471bc803d88a062d", size = 2194284, upload-time = "2025-10-14T10:21:07.122Z" },
+ { url = "https://files.pythonhosted.org/packages/f9/7e/c8e713db32405dfd97211f2fc0a15d6bf8adb7640f3d18544c1f39526619/pydantic_core-2.41.4-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:7b74e18052fea4aa8dea2fb7dbc23d15439695da6cbe6cfc1b694af1115df09d", size = 2137566, upload-time = "2025-10-14T10:21:08.981Z" },
+ { url = "https://files.pythonhosted.org/packages/04/f7/db71fd4cdccc8b75990f79ccafbbd66757e19f6d5ee724a6252414483fb4/pydantic_core-2.41.4-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:285b643d75c0e30abda9dc1077395624f314a37e3c09ca402d4015ef5979f1a2", size = 2316809, upload-time = "2025-10-14T10:21:10.805Z" },
+ { url = "https://files.pythonhosted.org/packages/76/63/a54973ddb945f1bca56742b48b144d85c9fc22f819ddeb9f861c249d5464/pydantic_core-2.41.4-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:f52679ff4218d713b3b33f88c89ccbf3a5c2c12ba665fb80ccc4192b4608dbab", size = 2311119, upload-time = "2025-10-14T10:21:12.583Z" },
+ { url = "https://files.pythonhosted.org/packages/f8/03/5d12891e93c19218af74843a27e32b94922195ded2386f7b55382f904d2f/pydantic_core-2.41.4-cp313-cp313-win32.whl", hash = "sha256:ecde6dedd6fff127c273c76821bb754d793be1024bc33314a120f83a3c69460c", size = 1981398, upload-time = "2025-10-14T10:21:14.584Z" },
+ { url = "https://files.pythonhosted.org/packages/be/d8/fd0de71f39db91135b7a26996160de71c073d8635edfce8b3c3681be0d6d/pydantic_core-2.41.4-cp313-cp313-win_amd64.whl", hash = "sha256:d081a1f3800f05409ed868ebb2d74ac39dd0c1ff6c035b5162356d76030736d4", size = 2030735, upload-time = "2025-10-14T10:21:16.432Z" },
+ { url = "https://files.pythonhosted.org/packages/72/86/c99921c1cf6650023c08bfab6fe2d7057a5142628ef7ccfa9921f2dda1d5/pydantic_core-2.41.4-cp313-cp313-win_arm64.whl", hash = "sha256:f8e49c9c364a7edcbe2a310f12733aad95b022495ef2a8d653f645e5d20c1564", size = 1973209, upload-time = "2025-10-14T10:21:18.213Z" },
+ { url = "https://files.pythonhosted.org/packages/36/0d/b5706cacb70a8414396efdda3d72ae0542e050b591119e458e2490baf035/pydantic_core-2.41.4-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:ed97fd56a561f5eb5706cebe94f1ad7c13b84d98312a05546f2ad036bafe87f4", size = 1877324, upload-time = "2025-10-14T10:21:20.363Z" },
+ { url = "https://files.pythonhosted.org/packages/de/2d/cba1fa02cfdea72dfb3a9babb067c83b9dff0bbcb198368e000a6b756ea7/pydantic_core-2.41.4-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a870c307bf1ee91fc58a9a61338ff780d01bfae45922624816878dce784095d2", size = 1884515, upload-time = "2025-10-14T10:21:22.339Z" },
+ { url = "https://files.pythonhosted.org/packages/07/ea/3df927c4384ed9b503c9cc2d076cf983b4f2adb0c754578dfb1245c51e46/pydantic_core-2.41.4-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d25e97bc1f5f8f7985bdc2335ef9e73843bb561eb1fa6831fdfc295c1c2061cf", size = 2042819, upload-time = "2025-10-14T10:21:26.683Z" },
+ { url = "https://files.pythonhosted.org/packages/6a/ee/df8e871f07074250270a3b1b82aad4cd0026b588acd5d7d3eb2fcb1471a3/pydantic_core-2.41.4-cp313-cp313t-win_amd64.whl", hash = "sha256:d405d14bea042f166512add3091c1af40437c2e7f86988f3915fabd27b1e9cd2", size = 1995866, upload-time = "2025-10-14T10:21:28.951Z" },
+ { url = "https://files.pythonhosted.org/packages/fc/de/b20f4ab954d6d399499c33ec4fafc46d9551e11dc1858fb7f5dca0748ceb/pydantic_core-2.41.4-cp313-cp313t-win_arm64.whl", hash = "sha256:19f3684868309db5263a11bace3c45d93f6f24afa2ffe75a647583df22a2ff89", size = 1970034, upload-time = "2025-10-14T10:21:30.869Z" },
+ { url = "https://files.pythonhosted.org/packages/54/28/d3325da57d413b9819365546eb9a6e8b7cbd9373d9380efd5f74326143e6/pydantic_core-2.41.4-cp314-cp314-macosx_10_12_x86_64.whl", hash = "sha256:e9205d97ed08a82ebb9a307e92914bb30e18cdf6f6b12ca4bedadb1588a0bfe1", size = 2102022, upload-time = "2025-10-14T10:21:32.809Z" },
+ { url = "https://files.pythonhosted.org/packages/9e/24/b58a1bc0d834bf1acc4361e61233ee217169a42efbdc15a60296e13ce438/pydantic_core-2.41.4-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:82df1f432b37d832709fbcc0e24394bba04a01b6ecf1ee87578145c19cde12ac", size = 1905495, upload-time = "2025-10-14T10:21:34.812Z" },
+ { url = "https://files.pythonhosted.org/packages/fb/a4/71f759cc41b7043e8ecdaab81b985a9b6cad7cec077e0b92cff8b71ecf6b/pydantic_core-2.41.4-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fc3b4cc4539e055cfa39a3763c939f9d409eb40e85813257dcd761985a108554", size = 1956131, upload-time = "2025-10-14T10:21:36.924Z" },
+ { url = "https://files.pythonhosted.org/packages/b0/64/1e79ac7aa51f1eec7c4cda8cbe456d5d09f05fdd68b32776d72168d54275/pydantic_core-2.41.4-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:b1eb1754fce47c63d2ff57fdb88c351a6c0150995890088b33767a10218eaa4e", size = 2052236, upload-time = "2025-10-14T10:21:38.927Z" },
+ { url = "https://files.pythonhosted.org/packages/e9/e3/a3ffc363bd4287b80f1d43dc1c28ba64831f8dfc237d6fec8f2661138d48/pydantic_core-2.41.4-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e6ab5ab30ef325b443f379ddb575a34969c333004fca5a1daa0133a6ffaad616", size = 2223573, upload-time = "2025-10-14T10:21:41.574Z" },
+ { url = "https://files.pythonhosted.org/packages/28/27/78814089b4d2e684a9088ede3790763c64693c3d1408ddc0a248bc789126/pydantic_core-2.41.4-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:31a41030b1d9ca497634092b46481b937ff9397a86f9f51bd41c4767b6fc04af", size = 2342467, upload-time = "2025-10-14T10:21:44.018Z" },
+ { url = "https://files.pythonhosted.org/packages/92/97/4de0e2a1159cb85ad737e03306717637842c88c7fd6d97973172fb183149/pydantic_core-2.41.4-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a44ac1738591472c3d020f61c6df1e4015180d6262ebd39bf2aeb52571b60f12", size = 2063754, upload-time = "2025-10-14T10:21:46.466Z" },
+ { url = "https://files.pythonhosted.org/packages/0f/50/8cb90ce4b9efcf7ae78130afeb99fd1c86125ccdf9906ef64b9d42f37c25/pydantic_core-2.41.4-cp314-cp314-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:d72f2b5e6e82ab8f94ea7d0d42f83c487dc159c5240d8f83beae684472864e2d", size = 2196754, upload-time = "2025-10-14T10:21:48.486Z" },
+ { url = "https://files.pythonhosted.org/packages/34/3b/ccdc77af9cd5082723574a1cc1bcae7a6acacc829d7c0a06201f7886a109/pydantic_core-2.41.4-cp314-cp314-musllinux_1_1_aarch64.whl", hash = "sha256:c4d1e854aaf044487d31143f541f7aafe7b482ae72a022c664b2de2e466ed0ad", size = 2137115, upload-time = "2025-10-14T10:21:50.63Z" },
+ { url = "https://files.pythonhosted.org/packages/ca/ba/e7c7a02651a8f7c52dc2cff2b64a30c313e3b57c7d93703cecea76c09b71/pydantic_core-2.41.4-cp314-cp314-musllinux_1_1_armv7l.whl", hash = "sha256:b568af94267729d76e6ee5ececda4e283d07bbb28e8148bb17adad93d025d25a", size = 2317400, upload-time = "2025-10-14T10:21:52.959Z" },
+ { url = "https://files.pythonhosted.org/packages/2c/ba/6c533a4ee8aec6b812c643c49bb3bd88d3f01e3cebe451bb85512d37f00f/pydantic_core-2.41.4-cp314-cp314-musllinux_1_1_x86_64.whl", hash = "sha256:6d55fb8b1e8929b341cc313a81a26e0d48aa3b519c1dbaadec3a6a2b4fcad025", size = 2312070, upload-time = "2025-10-14T10:21:55.419Z" },
+ { url = "https://files.pythonhosted.org/packages/22/ae/f10524fcc0ab8d7f96cf9a74c880243576fd3e72bd8ce4f81e43d22bcab7/pydantic_core-2.41.4-cp314-cp314-win32.whl", hash = "sha256:5b66584e549e2e32a1398df11da2e0a7eff45d5c2d9db9d5667c5e6ac764d77e", size = 1982277, upload-time = "2025-10-14T10:21:57.474Z" },
+ { url = "https://files.pythonhosted.org/packages/b4/dc/e5aa27aea1ad4638f0c3fb41132f7eb583bd7420ee63204e2d4333a3bbf9/pydantic_core-2.41.4-cp314-cp314-win_amd64.whl", hash = "sha256:557a0aab88664cc552285316809cab897716a372afaf8efdbef756f8b890e894", size = 2024608, upload-time = "2025-10-14T10:21:59.557Z" },
+ { url = "https://files.pythonhosted.org/packages/3e/61/51d89cc2612bd147198e120a13f150afbf0bcb4615cddb049ab10b81b79e/pydantic_core-2.41.4-cp314-cp314-win_arm64.whl", hash = "sha256:3f1ea6f48a045745d0d9f325989d8abd3f1eaf47dd00485912d1a3a63c623a8d", size = 1967614, upload-time = "2025-10-14T10:22:01.847Z" },
+ { url = "https://files.pythonhosted.org/packages/0d/c2/472f2e31b95eff099961fa050c376ab7156a81da194f9edb9f710f68787b/pydantic_core-2.41.4-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:6c1fe4c5404c448b13188dd8bd2ebc2bdd7e6727fa61ff481bcc2cca894018da", size = 1876904, upload-time = "2025-10-14T10:22:04.062Z" },
+ { url = "https://files.pythonhosted.org/packages/4a/07/ea8eeb91173807ecdae4f4a5f4b150a520085b35454350fc219ba79e66a3/pydantic_core-2.41.4-cp314-cp314t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:523e7da4d43b113bf8e7b49fa4ec0c35bf4fe66b2230bfc5c13cc498f12c6c3e", size = 1882538, upload-time = "2025-10-14T10:22:06.39Z" },
+ { url = "https://files.pythonhosted.org/packages/1e/29/b53a9ca6cd366bfc928823679c6a76c7a4c69f8201c0ba7903ad18ebae2f/pydantic_core-2.41.4-cp314-cp314t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5729225de81fb65b70fdb1907fcf08c75d498f4a6f15af005aabb1fdadc19dfa", size = 2041183, upload-time = "2025-10-14T10:22:08.812Z" },
+ { url = "https://files.pythonhosted.org/packages/c7/3d/f8c1a371ceebcaf94d6dd2d77c6cf4b1c078e13a5837aee83f760b4f7cfd/pydantic_core-2.41.4-cp314-cp314t-win_amd64.whl", hash = "sha256:de2cfbb09e88f0f795fd90cf955858fc2c691df65b1f21f0aa00b99f3fbc661d", size = 1993542, upload-time = "2025-10-14T10:22:11.332Z" },
+ { url = "https://files.pythonhosted.org/packages/8a/ac/9fc61b4f9d079482a290afe8d206b8f490e9fd32d4fc03ed4fc698214e01/pydantic_core-2.41.4-cp314-cp314t-win_arm64.whl", hash = "sha256:d34f950ae05a83e0ede899c595f312ca976023ea1db100cd5aa188f7005e3ab0", size = 1973897, upload-time = "2025-10-14T10:22:13.444Z" },
+ { url = "https://files.pythonhosted.org/packages/b0/12/5ba58daa7f453454464f92b3ca7b9d7c657d8641c48e370c3ebc9a82dd78/pydantic_core-2.41.4-graalpy311-graalpy242_311_native-macosx_10_12_x86_64.whl", hash = "sha256:a1b2cfec3879afb742a7b0bcfa53e4f22ba96571c9e54d6a3afe1052d17d843b", size = 2122139, upload-time = "2025-10-14T10:22:47.288Z" },
+ { url = "https://files.pythonhosted.org/packages/21/fb/6860126a77725c3108baecd10fd3d75fec25191d6381b6eb2ac660228eac/pydantic_core-2.41.4-graalpy311-graalpy242_311_native-macosx_11_0_arm64.whl", hash = "sha256:d175600d975b7c244af6eb9c9041f10059f20b8bbffec9e33fdd5ee3f67cdc42", size = 1936674, upload-time = "2025-10-14T10:22:49.555Z" },
+ { url = "https://files.pythonhosted.org/packages/de/be/57dcaa3ed595d81f8757e2b44a38240ac5d37628bce25fb20d02c7018776/pydantic_core-2.41.4-graalpy311-graalpy242_311_native-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0f184d657fa4947ae5ec9c47bd7e917730fa1cbb78195037e32dcbab50aca5ee", size = 1956398, upload-time = "2025-10-14T10:22:52.19Z" },
+ { url = "https://files.pythonhosted.org/packages/2f/1d/679a344fadb9695f1a6a294d739fbd21d71fa023286daeea8c0ed49e7c2b/pydantic_core-2.41.4-graalpy311-graalpy242_311_native-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1ed810568aeffed3edc78910af32af911c835cc39ebbfacd1f0ab5dd53028e5c", size = 2138674, upload-time = "2025-10-14T10:22:54.499Z" },
+ { url = "https://files.pythonhosted.org/packages/c4/48/ae937e5a831b7c0dc646b2ef788c27cd003894882415300ed21927c21efa/pydantic_core-2.41.4-graalpy312-graalpy250_312_native-macosx_10_12_x86_64.whl", hash = "sha256:4f5d640aeebb438517150fdeec097739614421900e4a08db4a3ef38898798537", size = 2112087, upload-time = "2025-10-14T10:22:56.818Z" },
+ { url = "https://files.pythonhosted.org/packages/5e/db/6db8073e3d32dae017da7e0d16a9ecb897d0a4d92e00634916e486097961/pydantic_core-2.41.4-graalpy312-graalpy250_312_native-macosx_11_0_arm64.whl", hash = "sha256:4a9ab037b71927babc6d9e7fc01aea9e66dc2a4a34dff06ef0724a4049629f94", size = 1920387, upload-time = "2025-10-14T10:22:59.342Z" },
+ { url = "https://files.pythonhosted.org/packages/0d/c1/dd3542d072fcc336030d66834872f0328727e3b8de289c662faa04aa270e/pydantic_core-2.41.4-graalpy312-graalpy250_312_native-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e4dab9484ec605c3016df9ad4fd4f9a390bc5d816a3b10c6550f8424bb80b18c", size = 1951495, upload-time = "2025-10-14T10:23:02.089Z" },
+ { url = "https://files.pythonhosted.org/packages/2b/c6/db8d13a1f8ab3f1eb08c88bd00fd62d44311e3456d1e85c0e59e0a0376e7/pydantic_core-2.41.4-graalpy312-graalpy250_312_native-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bd8a5028425820731d8c6c098ab642d7b8b999758e24acae03ed38a66eca8335", size = 2139008, upload-time = "2025-10-14T10:23:04.539Z" },
+ { url = "https://files.pythonhosted.org/packages/5d/d4/912e976a2dd0b49f31c98a060ca90b353f3b73ee3ea2fd0030412f6ac5ec/pydantic_core-2.41.4-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:1e5ab4fc177dd41536b3c32b2ea11380dd3d4619a385860621478ac2d25ceb00", size = 2106739, upload-time = "2025-10-14T10:23:06.934Z" },
+ { url = "https://files.pythonhosted.org/packages/71/f0/66ec5a626c81eba326072d6ee2b127f8c139543f1bf609b4842978d37833/pydantic_core-2.41.4-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:3d88d0054d3fa11ce936184896bed3c1c5441d6fa483b498fac6a5d0dd6f64a9", size = 1932549, upload-time = "2025-10-14T10:23:09.24Z" },
+ { url = "https://files.pythonhosted.org/packages/c4/af/625626278ca801ea0a658c2dcf290dc9f21bb383098e99e7c6a029fccfc0/pydantic_core-2.41.4-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7b2a054a8725f05b4b6503357e0ac1c4e8234ad3b0c2ac130d6ffc66f0e170e2", size = 2135093, upload-time = "2025-10-14T10:23:11.626Z" },
+ { url = "https://files.pythonhosted.org/packages/20/f6/2fba049f54e0f4975fef66be654c597a1d005320fa141863699180c7697d/pydantic_core-2.41.4-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:b0d9db5a161c99375a0c68c058e227bee1d89303300802601d76a3d01f74e258", size = 2187971, upload-time = "2025-10-14T10:23:14.437Z" },
+ { url = "https://files.pythonhosted.org/packages/0e/80/65ab839a2dfcd3b949202f9d920c34f9de5a537c3646662bdf2f7d999680/pydantic_core-2.41.4-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:6273ea2c8ffdac7b7fda2653c49682db815aebf4a89243a6feccf5e36c18c347", size = 2147939, upload-time = "2025-10-14T10:23:16.831Z" },
+ { url = "https://files.pythonhosted.org/packages/44/58/627565d3d182ce6dfda18b8e1c841eede3629d59c9d7cbc1e12a03aeb328/pydantic_core-2.41.4-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:4c973add636efc61de22530b2ef83a65f39b6d6f656df97f678720e20de26caa", size = 2311400, upload-time = "2025-10-14T10:23:19.234Z" },
+ { url = "https://files.pythonhosted.org/packages/24/06/8a84711162ad5a5f19a88cead37cca81b4b1f294f46260ef7334ae4f24d3/pydantic_core-2.41.4-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:b69d1973354758007f46cf2d44a4f3d0933f10b6dc9bf15cf1356e037f6f731a", size = 2316840, upload-time = "2025-10-14T10:23:21.738Z" },
+ { url = "https://files.pythonhosted.org/packages/aa/8b/b7bb512a4682a2f7fbfae152a755d37351743900226d29bd953aaf870eaa/pydantic_core-2.41.4-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:3619320641fd212aaf5997b6ca505e97540b7e16418f4a241f44cdf108ffb50d", size = 2149135, upload-time = "2025-10-14T10:23:24.379Z" },
+ { url = "https://files.pythonhosted.org/packages/7e/7d/138e902ed6399b866f7cfe4435d22445e16fff888a1c00560d9dc79a780f/pydantic_core-2.41.4-pp311-pypy311_pp73-macosx_10_12_x86_64.whl", hash = "sha256:491535d45cd7ad7e4a2af4a5169b0d07bebf1adfd164b0368da8aa41e19907a5", size = 2104721, upload-time = "2025-10-14T10:23:26.906Z" },
+ { url = "https://files.pythonhosted.org/packages/47/13/0525623cf94627f7b53b4c2034c81edc8491cbfc7c28d5447fa318791479/pydantic_core-2.41.4-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:54d86c0cada6aba4ec4c047d0e348cbad7063b87ae0f005d9f8c9ad04d4a92a2", size = 1931608, upload-time = "2025-10-14T10:23:29.306Z" },
+ { url = "https://files.pythonhosted.org/packages/d6/f9/744bc98137d6ef0a233f808bfc9b18cf94624bf30836a18d3b05d08bf418/pydantic_core-2.41.4-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:eca1124aced216b2500dc2609eade086d718e8249cb9696660ab447d50a758bd", size = 2132986, upload-time = "2025-10-14T10:23:32.057Z" },
+ { url = "https://files.pythonhosted.org/packages/17/c8/629e88920171173f6049386cc71f893dff03209a9ef32b4d2f7e7c264bcf/pydantic_core-2.41.4-pp311-pypy311_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:6c9024169becccf0cb470ada03ee578d7348c119a0d42af3dcf9eda96e3a247c", size = 2187516, upload-time = "2025-10-14T10:23:34.871Z" },
+ { url = "https://files.pythonhosted.org/packages/2e/0f/4f2734688d98488782218ca61bcc118329bf5de05bb7fe3adc7dd79b0b86/pydantic_core-2.41.4-pp311-pypy311_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:26895a4268ae5a2849269f4991cdc97236e4b9c010e51137becf25182daac405", size = 2146146, upload-time = "2025-10-14T10:23:37.342Z" },
+ { url = "https://files.pythonhosted.org/packages/ed/f2/ab385dbd94a052c62224b99cf99002eee99dbec40e10006c78575aead256/pydantic_core-2.41.4-pp311-pypy311_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:ca4df25762cf71308c446e33c9b1fdca2923a3f13de616e2a949f38bf21ff5a8", size = 2311296, upload-time = "2025-10-14T10:23:40.145Z" },
+ { url = "https://files.pythonhosted.org/packages/fc/8e/e4f12afe1beeb9823bba5375f8f258df0cc61b056b0195fb1cf9f62a1a58/pydantic_core-2.41.4-pp311-pypy311_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:5a28fcedd762349519276c36634e71853b4541079cab4acaaac60c4421827308", size = 2315386, upload-time = "2025-10-14T10:23:42.624Z" },
+ { url = "https://files.pythonhosted.org/packages/48/f7/925f65d930802e3ea2eb4d5afa4cb8730c8dc0d2cb89a59dc4ed2fcb2d74/pydantic_core-2.41.4-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:c173ddcd86afd2535e2b695217e82191580663a1d1928239f877f5a1649ef39f", size = 2147775, upload-time = "2025-10-14T10:23:45.406Z" },
+]
+
+[[package]]
+name = "pygments"
+version = "2.19.2"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/b0/77/a5b8c569bf593b0140bde72ea885a803b82086995367bf2037de0159d924/pygments-2.19.2.tar.gz", hash = "sha256:636cb2477cec7f8952536970bc533bc43743542f70392ae026374600add5b887", size = 4968631, upload-time = "2025-06-21T13:39:12.283Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/c7/21/705964c7812476f378728bdf590ca4b771ec72385c533964653c68e86bdc/pygments-2.19.2-py3-none-any.whl", hash = "sha256:86540386c03d588bb81d44bc3928634ff26449851e99741617ecb9037ee5ec0b", size = 1225217, upload-time = "2025-06-21T13:39:07.939Z" },
+]
+
+[[package]]
+name = "pyparsing"
+version = "3.2.5"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/f2/a5/181488fc2b9d093e3972d2a472855aae8a03f000592dbfce716a512b3359/pyparsing-3.2.5.tar.gz", hash = "sha256:2df8d5b7b2802ef88e8d016a2eb9c7aeaa923529cd251ed0fe4608275d4105b6", size = 1099274, upload-time = "2025-09-21T04:11:06.277Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/10/5e/1aa9a93198c6b64513c9d7752de7422c06402de6600a8767da1524f9570b/pyparsing-3.2.5-py3-none-any.whl", hash = "sha256:e38a4f02064cf41fe6593d328d0512495ad1f3d8a91c4f73fc401b3079a59a5e", size = 113890, upload-time = "2025-09-21T04:11:04.117Z" },
+]
+
+[[package]]
+name = "pytest"
+version = "9.0.3"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "colorama", marker = "sys_platform == 'win32'" },
+ { name = "exceptiongroup", marker = "python_full_version < '3.11'" },
+ { name = "iniconfig" },
+ { name = "packaging" },
+ { name = "pluggy" },
+ { name = "pygments" },
+ { name = "tomli", marker = "python_full_version < '3.11'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/7d/0d/549bd94f1a0a402dc8cf64563a117c0f3765662e2e668477624baeec44d5/pytest-9.0.3.tar.gz", hash = "sha256:b86ada508af81d19edeb213c681b1d48246c1a91d304c6c81a427674c17eb91c", size = 1572165, upload-time = "2026-04-07T17:16:18.027Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/d4/24/a372aaf5c9b7208e7112038812994107bc65a84cd00e0354a88c2c77a617/pytest-9.0.3-py3-none-any.whl", hash = "sha256:2c5efc453d45394fdd706ade797c0a81091eccd1d6e4bccfcd476e2b8e0ab5d9", size = 375249, upload-time = "2026-04-07T17:16:16.13Z" },
+]
+
+[[package]]
+name = "python-dateutil"
+version = "2.9.0.post0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "six" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/66/c0/0c8b6ad9f17a802ee498c46e004a0eb49bc148f2fd230864601a86dcf6db/python-dateutil-2.9.0.post0.tar.gz", hash = "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3", size = 342432, upload-time = "2024-03-01T18:36:20.211Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/ec/57/56b9bcc3c9c6a792fcbaf139543cee77261f3651ca9da0c93f5c1221264b/python_dateutil-2.9.0.post0-py2.py3-none-any.whl", hash = "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427", size = 229892, upload-time = "2024-03-01T18:36:18.57Z" },
+]
+
+[[package]]
+name = "pytz"
+version = "2025.2"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/f8/bf/abbd3cdfb8fbc7fb3d4d38d320f2441b1e7cbe29be4f23797b4a2b5d8aac/pytz-2025.2.tar.gz", hash = "sha256:360b9e3dbb49a209c21ad61809c7fb453643e048b38924c765813546746e81c3", size = 320884, upload-time = "2025-03-25T02:25:00.538Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/81/c4/34e93fe5f5429d7570ec1fa436f1986fb1f00c3e0f43a589fe2bbcd22c3f/pytz-2025.2-py2.py3-none-any.whl", hash = "sha256:5ddf76296dd8c44c26eb8f4b6f35488f3ccbf6fbbd7adee0b7262d43f0ec2f00", size = 509225, upload-time = "2025-03-25T02:24:58.468Z" },
+]
+
+[[package]]
+name = "pyyaml"
+version = "6.0.3"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/05/8e/961c0007c59b8dd7729d542c61a4d537767a59645b82a0b521206e1e25c2/pyyaml-6.0.3.tar.gz", hash = "sha256:d76623373421df22fb4cf8817020cbb7ef15c725b9d5e45f17e189bfc384190f", size = 130960, upload-time = "2025-09-25T21:33:16.546Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/f4/a0/39350dd17dd6d6c6507025c0e53aef67a9293a6d37d3511f23ea510d5800/pyyaml-6.0.3-cp310-cp310-macosx_10_13_x86_64.whl", hash = "sha256:214ed4befebe12df36bcc8bc2b64b396ca31be9304b8f59e25c11cf94a4c033b", size = 184227, upload-time = "2025-09-25T21:31:46.04Z" },
+ { url = "https://files.pythonhosted.org/packages/05/14/52d505b5c59ce73244f59c7a50ecf47093ce4765f116cdb98286a71eeca2/pyyaml-6.0.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:02ea2dfa234451bbb8772601d7b8e426c2bfa197136796224e50e35a78777956", size = 174019, upload-time = "2025-09-25T21:31:47.706Z" },
+ { url = "https://files.pythonhosted.org/packages/43/f7/0e6a5ae5599c838c696adb4e6330a59f463265bfa1e116cfd1fbb0abaaae/pyyaml-6.0.3-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b30236e45cf30d2b8e7b3e85881719e98507abed1011bf463a8fa23e9c3e98a8", size = 740646, upload-time = "2025-09-25T21:31:49.21Z" },
+ { url = "https://files.pythonhosted.org/packages/2f/3a/61b9db1d28f00f8fd0ae760459a5c4bf1b941baf714e207b6eb0657d2578/pyyaml-6.0.3-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:66291b10affd76d76f54fad28e22e51719ef9ba22b29e1d7d03d6777a9174198", size = 840793, upload-time = "2025-09-25T21:31:50.735Z" },
+ { url = "https://files.pythonhosted.org/packages/7a/1e/7acc4f0e74c4b3d9531e24739e0ab832a5edf40e64fbae1a9c01941cabd7/pyyaml-6.0.3-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9c7708761fccb9397fe64bbc0395abcae8c4bf7b0eac081e12b809bf47700d0b", size = 770293, upload-time = "2025-09-25T21:31:51.828Z" },
+ { url = "https://files.pythonhosted.org/packages/8b/ef/abd085f06853af0cd59fa5f913d61a8eab65d7639ff2a658d18a25d6a89d/pyyaml-6.0.3-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:418cf3f2111bc80e0933b2cd8cd04f286338bb88bdc7bc8e6dd775ebde60b5e0", size = 732872, upload-time = "2025-09-25T21:31:53.282Z" },
+ { url = "https://files.pythonhosted.org/packages/1f/15/2bc9c8faf6450a8b3c9fc5448ed869c599c0a74ba2669772b1f3a0040180/pyyaml-6.0.3-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:5e0b74767e5f8c593e8c9b5912019159ed0533c70051e9cce3e8b6aa699fcd69", size = 758828, upload-time = "2025-09-25T21:31:54.807Z" },
+ { url = "https://files.pythonhosted.org/packages/a3/00/531e92e88c00f4333ce359e50c19b8d1de9fe8d581b1534e35ccfbc5f393/pyyaml-6.0.3-cp310-cp310-win32.whl", hash = "sha256:28c8d926f98f432f88adc23edf2e6d4921ac26fb084b028c733d01868d19007e", size = 142415, upload-time = "2025-09-25T21:31:55.885Z" },
+ { url = "https://files.pythonhosted.org/packages/2a/fa/926c003379b19fca39dd4634818b00dec6c62d87faf628d1394e137354d4/pyyaml-6.0.3-cp310-cp310-win_amd64.whl", hash = "sha256:bdb2c67c6c1390b63c6ff89f210c8fd09d9a1217a465701eac7316313c915e4c", size = 158561, upload-time = "2025-09-25T21:31:57.406Z" },
+ { url = "https://files.pythonhosted.org/packages/6d/16/a95b6757765b7b031c9374925bb718d55e0a9ba8a1b6a12d25962ea44347/pyyaml-6.0.3-cp311-cp311-macosx_10_13_x86_64.whl", hash = "sha256:44edc647873928551a01e7a563d7452ccdebee747728c1080d881d68af7b997e", size = 185826, upload-time = "2025-09-25T21:31:58.655Z" },
+ { url = "https://files.pythonhosted.org/packages/16/19/13de8e4377ed53079ee996e1ab0a9c33ec2faf808a4647b7b4c0d46dd239/pyyaml-6.0.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:652cb6edd41e718550aad172851962662ff2681490a8a711af6a4d288dd96824", size = 175577, upload-time = "2025-09-25T21:32:00.088Z" },
+ { url = "https://files.pythonhosted.org/packages/0c/62/d2eb46264d4b157dae1275b573017abec435397aa59cbcdab6fc978a8af4/pyyaml-6.0.3-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:10892704fc220243f5305762e276552a0395f7beb4dbf9b14ec8fd43b57f126c", size = 775556, upload-time = "2025-09-25T21:32:01.31Z" },
+ { url = "https://files.pythonhosted.org/packages/10/cb/16c3f2cf3266edd25aaa00d6c4350381c8b012ed6f5276675b9eba8d9ff4/pyyaml-6.0.3-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:850774a7879607d3a6f50d36d04f00ee69e7fc816450e5f7e58d7f17f1ae5c00", size = 882114, upload-time = "2025-09-25T21:32:03.376Z" },
+ { url = "https://files.pythonhosted.org/packages/71/60/917329f640924b18ff085ab889a11c763e0b573da888e8404ff486657602/pyyaml-6.0.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b8bb0864c5a28024fac8a632c443c87c5aa6f215c0b126c449ae1a150412f31d", size = 806638, upload-time = "2025-09-25T21:32:04.553Z" },
+ { url = "https://files.pythonhosted.org/packages/dd/6f/529b0f316a9fd167281a6c3826b5583e6192dba792dd55e3203d3f8e655a/pyyaml-6.0.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:1d37d57ad971609cf3c53ba6a7e365e40660e3be0e5175fa9f2365a379d6095a", size = 767463, upload-time = "2025-09-25T21:32:06.152Z" },
+ { url = "https://files.pythonhosted.org/packages/f2/6a/b627b4e0c1dd03718543519ffb2f1deea4a1e6d42fbab8021936a4d22589/pyyaml-6.0.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:37503bfbfc9d2c40b344d06b2199cf0e96e97957ab1c1b546fd4f87e53e5d3e4", size = 794986, upload-time = "2025-09-25T21:32:07.367Z" },
+ { url = "https://files.pythonhosted.org/packages/45/91/47a6e1c42d9ee337c4839208f30d9f09caa9f720ec7582917b264defc875/pyyaml-6.0.3-cp311-cp311-win32.whl", hash = "sha256:8098f252adfa6c80ab48096053f512f2321f0b998f98150cea9bd23d83e1467b", size = 142543, upload-time = "2025-09-25T21:32:08.95Z" },
+ { url = "https://files.pythonhosted.org/packages/da/e3/ea007450a105ae919a72393cb06f122f288ef60bba2dc64b26e2646fa315/pyyaml-6.0.3-cp311-cp311-win_amd64.whl", hash = "sha256:9f3bfb4965eb874431221a3ff3fdcddc7e74e3b07799e0e84ca4a0f867d449bf", size = 158763, upload-time = "2025-09-25T21:32:09.96Z" },
+ { url = "https://files.pythonhosted.org/packages/d1/33/422b98d2195232ca1826284a76852ad5a86fe23e31b009c9886b2d0fb8b2/pyyaml-6.0.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:7f047e29dcae44602496db43be01ad42fc6f1cc0d8cd6c83d342306c32270196", size = 182063, upload-time = "2025-09-25T21:32:11.445Z" },
+ { url = "https://files.pythonhosted.org/packages/89/a0/6cf41a19a1f2f3feab0e9c0b74134aa2ce6849093d5517a0c550fe37a648/pyyaml-6.0.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:fc09d0aa354569bc501d4e787133afc08552722d3ab34836a80547331bb5d4a0", size = 173973, upload-time = "2025-09-25T21:32:12.492Z" },
+ { url = "https://files.pythonhosted.org/packages/ed/23/7a778b6bd0b9a8039df8b1b1d80e2e2ad78aa04171592c8a5c43a56a6af4/pyyaml-6.0.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:9149cad251584d5fb4981be1ecde53a1ca46c891a79788c0df828d2f166bda28", size = 775116, upload-time = "2025-09-25T21:32:13.652Z" },
+ { url = "https://files.pythonhosted.org/packages/65/30/d7353c338e12baef4ecc1b09e877c1970bd3382789c159b4f89d6a70dc09/pyyaml-6.0.3-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5fdec68f91a0c6739b380c83b951e2c72ac0197ace422360e6d5a959d8d97b2c", size = 844011, upload-time = "2025-09-25T21:32:15.21Z" },
+ { url = "https://files.pythonhosted.org/packages/8b/9d/b3589d3877982d4f2329302ef98a8026e7f4443c765c46cfecc8858c6b4b/pyyaml-6.0.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ba1cc08a7ccde2d2ec775841541641e4548226580ab850948cbfda66a1befcdc", size = 807870, upload-time = "2025-09-25T21:32:16.431Z" },
+ { url = "https://files.pythonhosted.org/packages/05/c0/b3be26a015601b822b97d9149ff8cb5ead58c66f981e04fedf4e762f4bd4/pyyaml-6.0.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:8dc52c23056b9ddd46818a57b78404882310fb473d63f17b07d5c40421e47f8e", size = 761089, upload-time = "2025-09-25T21:32:17.56Z" },
+ { url = "https://files.pythonhosted.org/packages/be/8e/98435a21d1d4b46590d5459a22d88128103f8da4c2d4cb8f14f2a96504e1/pyyaml-6.0.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:41715c910c881bc081f1e8872880d3c650acf13dfa8214bad49ed4cede7c34ea", size = 790181, upload-time = "2025-09-25T21:32:18.834Z" },
+ { url = "https://files.pythonhosted.org/packages/74/93/7baea19427dcfbe1e5a372d81473250b379f04b1bd3c4c5ff825e2327202/pyyaml-6.0.3-cp312-cp312-win32.whl", hash = "sha256:96b533f0e99f6579b3d4d4995707cf36df9100d67e0c8303a0c55b27b5f99bc5", size = 137658, upload-time = "2025-09-25T21:32:20.209Z" },
+ { url = "https://files.pythonhosted.org/packages/86/bf/899e81e4cce32febab4fb42bb97dcdf66bc135272882d1987881a4b519e9/pyyaml-6.0.3-cp312-cp312-win_amd64.whl", hash = "sha256:5fcd34e47f6e0b794d17de1b4ff496c00986e1c83f7ab2fb8fcfe9616ff7477b", size = 154003, upload-time = "2025-09-25T21:32:21.167Z" },
+ { url = "https://files.pythonhosted.org/packages/1a/08/67bd04656199bbb51dbed1439b7f27601dfb576fb864099c7ef0c3e55531/pyyaml-6.0.3-cp312-cp312-win_arm64.whl", hash = "sha256:64386e5e707d03a7e172c0701abfb7e10f0fb753ee1d773128192742712a98fd", size = 140344, upload-time = "2025-09-25T21:32:22.617Z" },
+ { url = "https://files.pythonhosted.org/packages/d1/11/0fd08f8192109f7169db964b5707a2f1e8b745d4e239b784a5a1dd80d1db/pyyaml-6.0.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:8da9669d359f02c0b91ccc01cac4a67f16afec0dac22c2ad09f46bee0697eba8", size = 181669, upload-time = "2025-09-25T21:32:23.673Z" },
+ { url = "https://files.pythonhosted.org/packages/b1/16/95309993f1d3748cd644e02e38b75d50cbc0d9561d21f390a76242ce073f/pyyaml-6.0.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:2283a07e2c21a2aa78d9c4442724ec1eb15f5e42a723b99cb3d822d48f5f7ad1", size = 173252, upload-time = "2025-09-25T21:32:25.149Z" },
+ { url = "https://files.pythonhosted.org/packages/50/31/b20f376d3f810b9b2371e72ef5adb33879b25edb7a6d072cb7ca0c486398/pyyaml-6.0.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ee2922902c45ae8ccada2c5b501ab86c36525b883eff4255313a253a3160861c", size = 767081, upload-time = "2025-09-25T21:32:26.575Z" },
+ { url = "https://files.pythonhosted.org/packages/49/1e/a55ca81e949270d5d4432fbbd19dfea5321eda7c41a849d443dc92fd1ff7/pyyaml-6.0.3-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:a33284e20b78bd4a18c8c2282d549d10bc8408a2a7ff57653c0cf0b9be0afce5", size = 841159, upload-time = "2025-09-25T21:32:27.727Z" },
+ { url = "https://files.pythonhosted.org/packages/74/27/e5b8f34d02d9995b80abcef563ea1f8b56d20134d8f4e5e81733b1feceb2/pyyaml-6.0.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0f29edc409a6392443abf94b9cf89ce99889a1dd5376d94316ae5145dfedd5d6", size = 801626, upload-time = "2025-09-25T21:32:28.878Z" },
+ { url = "https://files.pythonhosted.org/packages/f9/11/ba845c23988798f40e52ba45f34849aa8a1f2d4af4b798588010792ebad6/pyyaml-6.0.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:f7057c9a337546edc7973c0d3ba84ddcdf0daa14533c2065749c9075001090e6", size = 753613, upload-time = "2025-09-25T21:32:30.178Z" },
+ { url = "https://files.pythonhosted.org/packages/3d/e0/7966e1a7bfc0a45bf0a7fb6b98ea03fc9b8d84fa7f2229e9659680b69ee3/pyyaml-6.0.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:eda16858a3cab07b80edaf74336ece1f986ba330fdb8ee0d6c0d68fe82bc96be", size = 794115, upload-time = "2025-09-25T21:32:31.353Z" },
+ { url = "https://files.pythonhosted.org/packages/de/94/980b50a6531b3019e45ddeada0626d45fa85cbe22300844a7983285bed3b/pyyaml-6.0.3-cp313-cp313-win32.whl", hash = "sha256:d0eae10f8159e8fdad514efdc92d74fd8d682c933a6dd088030f3834bc8e6b26", size = 137427, upload-time = "2025-09-25T21:32:32.58Z" },
+ { url = "https://files.pythonhosted.org/packages/97/c9/39d5b874e8b28845e4ec2202b5da735d0199dbe5b8fb85f91398814a9a46/pyyaml-6.0.3-cp313-cp313-win_amd64.whl", hash = "sha256:79005a0d97d5ddabfeeea4cf676af11e647e41d81c9a7722a193022accdb6b7c", size = 154090, upload-time = "2025-09-25T21:32:33.659Z" },
+ { url = "https://files.pythonhosted.org/packages/73/e8/2bdf3ca2090f68bb3d75b44da7bbc71843b19c9f2b9cb9b0f4ab7a5a4329/pyyaml-6.0.3-cp313-cp313-win_arm64.whl", hash = "sha256:5498cd1645aa724a7c71c8f378eb29ebe23da2fc0d7a08071d89469bf1d2defb", size = 140246, upload-time = "2025-09-25T21:32:34.663Z" },
+ { url = "https://files.pythonhosted.org/packages/9d/8c/f4bd7f6465179953d3ac9bc44ac1a8a3e6122cf8ada906b4f96c60172d43/pyyaml-6.0.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:8d1fab6bb153a416f9aeb4b8763bc0f22a5586065f86f7664fc23339fc1c1fac", size = 181814, upload-time = "2025-09-25T21:32:35.712Z" },
+ { url = "https://files.pythonhosted.org/packages/bd/9c/4d95bb87eb2063d20db7b60faa3840c1b18025517ae857371c4dd55a6b3a/pyyaml-6.0.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:34d5fcd24b8445fadc33f9cf348c1047101756fd760b4dacb5c3e99755703310", size = 173809, upload-time = "2025-09-25T21:32:36.789Z" },
+ { url = "https://files.pythonhosted.org/packages/92/b5/47e807c2623074914e29dabd16cbbdd4bf5e9b2db9f8090fa64411fc5382/pyyaml-6.0.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:501a031947e3a9025ed4405a168e6ef5ae3126c59f90ce0cd6f2bfc477be31b7", size = 766454, upload-time = "2025-09-25T21:32:37.966Z" },
+ { url = "https://files.pythonhosted.org/packages/02/9e/e5e9b168be58564121efb3de6859c452fccde0ab093d8438905899a3a483/pyyaml-6.0.3-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:b3bc83488de33889877a0f2543ade9f70c67d66d9ebb4ac959502e12de895788", size = 836355, upload-time = "2025-09-25T21:32:39.178Z" },
+ { url = "https://files.pythonhosted.org/packages/88/f9/16491d7ed2a919954993e48aa941b200f38040928474c9e85ea9e64222c3/pyyaml-6.0.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c458b6d084f9b935061bc36216e8a69a7e293a2f1e68bf956dcd9e6cbcd143f5", size = 794175, upload-time = "2025-09-25T21:32:40.865Z" },
+ { url = "https://files.pythonhosted.org/packages/dd/3f/5989debef34dc6397317802b527dbbafb2b4760878a53d4166579111411e/pyyaml-6.0.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:7c6610def4f163542a622a73fb39f534f8c101d690126992300bf3207eab9764", size = 755228, upload-time = "2025-09-25T21:32:42.084Z" },
+ { url = "https://files.pythonhosted.org/packages/d7/ce/af88a49043cd2e265be63d083fc75b27b6ed062f5f9fd6cdc223ad62f03e/pyyaml-6.0.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:5190d403f121660ce8d1d2c1bb2ef1bd05b5f68533fc5c2ea899bd15f4399b35", size = 789194, upload-time = "2025-09-25T21:32:43.362Z" },
+ { url = "https://files.pythonhosted.org/packages/23/20/bb6982b26a40bb43951265ba29d4c246ef0ff59c9fdcdf0ed04e0687de4d/pyyaml-6.0.3-cp314-cp314-win_amd64.whl", hash = "sha256:4a2e8cebe2ff6ab7d1050ecd59c25d4c8bd7e6f400f5f82b96557ac0abafd0ac", size = 156429, upload-time = "2025-09-25T21:32:57.844Z" },
+ { url = "https://files.pythonhosted.org/packages/f4/f4/a4541072bb9422c8a883ab55255f918fa378ecf083f5b85e87fc2b4eda1b/pyyaml-6.0.3-cp314-cp314-win_arm64.whl", hash = "sha256:93dda82c9c22deb0a405ea4dc5f2d0cda384168e466364dec6255b293923b2f3", size = 143912, upload-time = "2025-09-25T21:32:59.247Z" },
+ { url = "https://files.pythonhosted.org/packages/7c/f9/07dd09ae774e4616edf6cda684ee78f97777bdd15847253637a6f052a62f/pyyaml-6.0.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:02893d100e99e03eda1c8fd5c441d8c60103fd175728e23e431db1b589cf5ab3", size = 189108, upload-time = "2025-09-25T21:32:44.377Z" },
+ { url = "https://files.pythonhosted.org/packages/4e/78/8d08c9fb7ce09ad8c38ad533c1191cf27f7ae1effe5bb9400a46d9437fcf/pyyaml-6.0.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:c1ff362665ae507275af2853520967820d9124984e0f7466736aea23d8611fba", size = 183641, upload-time = "2025-09-25T21:32:45.407Z" },
+ { url = "https://files.pythonhosted.org/packages/7b/5b/3babb19104a46945cf816d047db2788bcaf8c94527a805610b0289a01c6b/pyyaml-6.0.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6adc77889b628398debc7b65c073bcb99c4a0237b248cacaf3fe8a557563ef6c", size = 831901, upload-time = "2025-09-25T21:32:48.83Z" },
+ { url = "https://files.pythonhosted.org/packages/8b/cc/dff0684d8dc44da4d22a13f35f073d558c268780ce3c6ba1b87055bb0b87/pyyaml-6.0.3-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:a80cb027f6b349846a3bf6d73b5e95e782175e52f22108cfa17876aaeff93702", size = 861132, upload-time = "2025-09-25T21:32:50.149Z" },
+ { url = "https://files.pythonhosted.org/packages/b1/5e/f77dc6b9036943e285ba76b49e118d9ea929885becb0a29ba8a7c75e29fe/pyyaml-6.0.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:00c4bdeba853cc34e7dd471f16b4114f4162dc03e6b7afcc2128711f0eca823c", size = 839261, upload-time = "2025-09-25T21:32:51.808Z" },
+ { url = "https://files.pythonhosted.org/packages/ce/88/a9db1376aa2a228197c58b37302f284b5617f56a5d959fd1763fb1675ce6/pyyaml-6.0.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:66e1674c3ef6f541c35191caae2d429b967b99e02040f5ba928632d9a7f0f065", size = 805272, upload-time = "2025-09-25T21:32:52.941Z" },
+ { url = "https://files.pythonhosted.org/packages/da/92/1446574745d74df0c92e6aa4a7b0b3130706a4142b2d1a5869f2eaa423c6/pyyaml-6.0.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:16249ee61e95f858e83976573de0f5b2893b3677ba71c9dd36b9cf8be9ac6d65", size = 829923, upload-time = "2025-09-25T21:32:54.537Z" },
+ { url = "https://files.pythonhosted.org/packages/f0/7a/1c7270340330e575b92f397352af856a8c06f230aa3e76f86b39d01b416a/pyyaml-6.0.3-cp314-cp314t-win_amd64.whl", hash = "sha256:4ad1906908f2f5ae4e5a8ddfce73c320c2a1429ec52eafd27138b7f1cbe341c9", size = 174062, upload-time = "2025-09-25T21:32:55.767Z" },
+ { url = "https://files.pythonhosted.org/packages/f1/12/de94a39c2ef588c7e6455cfbe7343d3b2dc9d6b6b2f40c4c6565744c873d/pyyaml-6.0.3-cp314-cp314t-win_arm64.whl", hash = "sha256:ebc55a14a21cb14062aa4162f906cd962b28e2e9ea38f9b4391244cd8de4ae0b", size = 149341, upload-time = "2025-09-25T21:32:56.828Z" },
+]
+
+[[package]]
+name = "regex"
+version = "2025.10.23"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/f8/c8/1d2160d36b11fbe0a61acb7c3c81ab032d9ec8ad888ac9e0a61b85ab99dd/regex-2025.10.23.tar.gz", hash = "sha256:8cbaf8ceb88f96ae2356d01b9adf5e6306fa42fa6f7eab6b97794e37c959ac26", size = 401266, upload-time = "2025-10-21T15:58:20.23Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/88/11/849d5d23633a77047465eaae4cc0cbf24ded7aa496c02e8b9710e28b1687/regex-2025.10.23-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:17bbcde374bef1c5fad9b131f0e28a6a24856dd90368d8c0201e2b5a69533daa", size = 487957, upload-time = "2025-10-21T15:54:26.151Z" },
+ { url = "https://files.pythonhosted.org/packages/87/12/5985386e7e3200a0d6a6417026d2c758d783a932428a5efc0a42ca1ddf74/regex-2025.10.23-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:b4e10434279cc8567f99ca6e018e9025d14f2fded2a603380b6be2090f476426", size = 290419, upload-time = "2025-10-21T15:54:28.804Z" },
+ { url = "https://files.pythonhosted.org/packages/67/cf/a8615923f962f8fdc41a3a6093a48726955e8b1993f4614b26a41d249f9b/regex-2025.10.23-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9c9bb421cbe7012c744a5a56cf4d6c80829c72edb1a2991677299c988d6339c8", size = 288285, upload-time = "2025-10-21T15:54:30.47Z" },
+ { url = "https://files.pythonhosted.org/packages/4e/3d/6a3a1e12c86354cd0b3cbf8c3dd6acbe853609ee3b39d47ecd3ce95caf84/regex-2025.10.23-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:275cd1c2ed8c4a78ebfa489618d7aee762e8b4732da73573c3e38236ec5f65de", size = 781458, upload-time = "2025-10-21T15:54:31.978Z" },
+ { url = "https://files.pythonhosted.org/packages/46/47/76a8da004489f2700361754859e373b87a53d043de8c47f4d1583fd39d78/regex-2025.10.23-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:7b426ae7952f3dc1e73a86056d520bd4e5f021397484a6835902fc5648bcacce", size = 850605, upload-time = "2025-10-21T15:54:33.753Z" },
+ { url = "https://files.pythonhosted.org/packages/67/05/fa886461f97d45a6f4b209699cb994dc6d6212d6e219d29444dac5005775/regex-2025.10.23-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:c5cdaf5b6d37c7da1967dbe729d819461aab6a98a072feef65bbcff0a6e60649", size = 898563, upload-time = "2025-10-21T15:54:35.431Z" },
+ { url = "https://files.pythonhosted.org/packages/2d/db/3ddd8d01455f23cabad7499f4199de0df92f5e96d39633203ff9d0b592dc/regex-2025.10.23-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:3bfeff0b08f296ab28b4332a7e03ca31c437ee78b541ebc874bbf540e5932f8d", size = 791535, upload-time = "2025-10-21T15:54:37.269Z" },
+ { url = "https://files.pythonhosted.org/packages/7c/ae/0fa5cbf41ca92b6ec3370222fcb6c68b240d68ab10e803d086c03a19fd9e/regex-2025.10.23-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:5f97236a67307b775f30a74ef722b64b38b7ab7ba3bb4a2508518a5de545459c", size = 782461, upload-time = "2025-10-21T15:54:39.187Z" },
+ { url = "https://files.pythonhosted.org/packages/d4/23/70af22a016df11af4def27870eb175c2c7235b72d411ecf75a4b4a422cb6/regex-2025.10.23-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:be19e7de499940cd72475fb8e46ab2ecb1cf5906bebdd18a89f9329afb1df82f", size = 774583, upload-time = "2025-10-21T15:54:41.018Z" },
+ { url = "https://files.pythonhosted.org/packages/7a/ee/a54a6851f6905f33d3c4ed64e8737b1d85ed01b5724712530ddc0f9abdb1/regex-2025.10.23-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:883df76ee42d9ecb82b37ff8d01caea5895b3f49630a64d21111078bbf8ef64c", size = 845649, upload-time = "2025-10-21T15:54:42.615Z" },
+ { url = "https://files.pythonhosted.org/packages/80/7d/c3ec1cae14e01fab00e38c41ed35f47a853359e95e9c023e9a4381bb122c/regex-2025.10.23-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:2e9117d1d35fc2addae6281019ecc70dc21c30014b0004f657558b91c6a8f1a7", size = 836037, upload-time = "2025-10-21T15:54:44.63Z" },
+ { url = "https://files.pythonhosted.org/packages/15/ae/45771140dd43c4d67c87b54d3728078ed6a96599d9fc7ba6825086236782/regex-2025.10.23-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:0ff1307f531a5d8cf5c20ea517254551ff0a8dc722193aab66c656c5a900ea68", size = 779705, upload-time = "2025-10-21T15:54:46.08Z" },
+ { url = "https://files.pythonhosted.org/packages/b8/95/074e2581760eafce7c816a352b7d3a322536e5b68c346d1a8bacd895545c/regex-2025.10.23-cp310-cp310-win32.whl", hash = "sha256:7888475787cbfee4a7cd32998eeffe9a28129fa44ae0f691b96cb3939183ef41", size = 265663, upload-time = "2025-10-21T15:54:47.854Z" },
+ { url = "https://files.pythonhosted.org/packages/f7/c7/a25f56a718847e34d3f1608c72eadeb67653bff1a0411da023dd8f4c647b/regex-2025.10.23-cp310-cp310-win_amd64.whl", hash = "sha256:ec41a905908496ce4906dab20fb103c814558db1d69afc12c2f384549c17936a", size = 277587, upload-time = "2025-10-21T15:54:49.571Z" },
+ { url = "https://files.pythonhosted.org/packages/d3/e5/63eb17c6b5deaefd93c2bbb1feae7c0a8d2157da25883a6ca2569cf7a663/regex-2025.10.23-cp310-cp310-win_arm64.whl", hash = "sha256:b2b7f19a764d5e966d5a62bf2c28a8b4093cc864c6734510bdb4aeb840aec5e6", size = 269979, upload-time = "2025-10-21T15:54:51.375Z" },
+ { url = "https://files.pythonhosted.org/packages/82/e5/74b7cd5cd76b4171f9793042045bb1726f7856dd56e582fc3e058a7a8a5e/regex-2025.10.23-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:6c531155bf9179345e85032052a1e5fe1a696a6abf9cea54b97e8baefff970fd", size = 487960, upload-time = "2025-10-21T15:54:53.253Z" },
+ { url = "https://files.pythonhosted.org/packages/b9/08/854fa4b3b20471d1df1c71e831b6a1aa480281e37791e52a2df9641ec5c6/regex-2025.10.23-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:912e9df4e89d383681268d38ad8f5780d7cccd94ba0e9aa09ca7ab7ab4f8e7eb", size = 290425, upload-time = "2025-10-21T15:54:55.21Z" },
+ { url = "https://files.pythonhosted.org/packages/ab/d3/6272b1dd3ca1271661e168762b234ad3e00dbdf4ef0c7b9b72d2d159efa7/regex-2025.10.23-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:4f375c61bfc3138b13e762fe0ae76e3bdca92497816936534a0177201666f44f", size = 288278, upload-time = "2025-10-21T15:54:56.862Z" },
+ { url = "https://files.pythonhosted.org/packages/14/8f/c7b365dd9d9bc0a36e018cb96f2ffb60d2ba8deb589a712b437f67de2920/regex-2025.10.23-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e248cc9446081119128ed002a3801f8031e0c219b5d3c64d3cc627da29ac0a33", size = 793289, upload-time = "2025-10-21T15:54:58.352Z" },
+ { url = "https://files.pythonhosted.org/packages/d4/fb/b8fbe9aa16cf0c21f45ec5a6c74b4cecbf1a1c0deb7089d4a6f83a9c1caa/regex-2025.10.23-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:b52bf9282fdf401e4f4e721f0f61fc4b159b1307244517789702407dd74e38ca", size = 860321, upload-time = "2025-10-21T15:54:59.813Z" },
+ { url = "https://files.pythonhosted.org/packages/b0/81/bf41405c772324926a9bd8a640dedaa42da0e929241834dfce0733070437/regex-2025.10.23-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5c084889ab2c59765a0d5ac602fd1c3c244f9b3fcc9a65fdc7ba6b74c5287490", size = 907011, upload-time = "2025-10-21T15:55:01.968Z" },
+ { url = "https://files.pythonhosted.org/packages/a4/fb/5ad6a8b92d3f88f3797b51bb4ef47499acc2d0b53d2fbe4487a892f37a73/regex-2025.10.23-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d80e8eb79009bdb0936658c44ca06e2fbbca67792013e3818eea3f5f228971c2", size = 800312, upload-time = "2025-10-21T15:55:04.15Z" },
+ { url = "https://files.pythonhosted.org/packages/42/48/b4efba0168a2b57f944205d823f8e8a3a1ae6211a34508f014ec2c712f4f/regex-2025.10.23-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:b6f259118ba87b814a8ec475380aee5f5ae97a75852a3507cf31d055b01b5b40", size = 782839, upload-time = "2025-10-21T15:55:05.641Z" },
+ { url = "https://files.pythonhosted.org/packages/13/2a/c9efb4c6c535b0559c1fa8e431e0574d229707c9ca718600366fcfef6801/regex-2025.10.23-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:9b8c72a242683dcc72d37595c4f1278dfd7642b769e46700a8df11eab19dfd82", size = 854270, upload-time = "2025-10-21T15:55:07.27Z" },
+ { url = "https://files.pythonhosted.org/packages/34/2d/68eecc1bdaee020e8ba549502291c9450d90d8590d0552247c9b543ebf7b/regex-2025.10.23-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:a8d7b7a0a3df9952f9965342159e0c1f05384c0f056a47ce8b61034f8cecbe83", size = 845771, upload-time = "2025-10-21T15:55:09.477Z" },
+ { url = "https://files.pythonhosted.org/packages/a5/cd/a1ae499cf9b87afb47a67316bbf1037a7c681ffe447c510ed98c0aa2c01c/regex-2025.10.23-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:413bfea20a484c524858125e92b9ce6ffdd0a4b97d4ff96b5859aa119b0f1bdd", size = 788778, upload-time = "2025-10-21T15:55:11.396Z" },
+ { url = "https://files.pythonhosted.org/packages/38/f9/70765e63f5ea7d43b2b6cd4ee9d3323f16267e530fb2a420d92d991cf0fc/regex-2025.10.23-cp311-cp311-win32.whl", hash = "sha256:f76deef1f1019a17dad98f408b8f7afc4bd007cbe835ae77b737e8c7f19ae575", size = 265666, upload-time = "2025-10-21T15:55:13.306Z" },
+ { url = "https://files.pythonhosted.org/packages/9c/1a/18e9476ee1b63aaec3844d8e1cb21842dc19272c7e86d879bfc0dcc60db3/regex-2025.10.23-cp311-cp311-win_amd64.whl", hash = "sha256:59bba9f7125536f23fdab5deeea08da0c287a64c1d3acc1c7e99515809824de8", size = 277600, upload-time = "2025-10-21T15:55:15.087Z" },
+ { url = "https://files.pythonhosted.org/packages/1d/1b/c019167b1f7a8ec77251457e3ff0339ed74ca8bce1ea13138dc98309c923/regex-2025.10.23-cp311-cp311-win_arm64.whl", hash = "sha256:b103a752b6f1632ca420225718d6ed83f6a6ced3016dd0a4ab9a6825312de566", size = 269974, upload-time = "2025-10-21T15:55:16.841Z" },
+ { url = "https://files.pythonhosted.org/packages/f6/57/eeb274d83ab189d02d778851b1ac478477522a92b52edfa6e2ae9ff84679/regex-2025.10.23-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:7a44d9c00f7a0a02d3b777429281376370f3d13d2c75ae74eb94e11ebcf4a7fc", size = 489187, upload-time = "2025-10-21T15:55:18.322Z" },
+ { url = "https://files.pythonhosted.org/packages/55/5c/7dad43a9b6ea88bf77e0b8b7729a4c36978e1043165034212fd2702880c6/regex-2025.10.23-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:b83601f84fde939ae3478bb32a3aef36f61b58c3208d825c7e8ce1a735f143f2", size = 291122, upload-time = "2025-10-21T15:55:20.2Z" },
+ { url = "https://files.pythonhosted.org/packages/66/21/38b71e6f2818f0f4b281c8fba8d9d57cfca7b032a648fa59696e0a54376a/regex-2025.10.23-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ec13647907bb9d15fd192bbfe89ff06612e098a5709e7d6ecabbdd8f7908fc45", size = 288797, upload-time = "2025-10-21T15:55:21.932Z" },
+ { url = "https://files.pythonhosted.org/packages/be/95/888f069c89e7729732a6d7cca37f76b44bfb53a1e35dda8a2c7b65c1b992/regex-2025.10.23-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:78d76dd2957d62501084e7012ddafc5fcd406dd982b7a9ca1ea76e8eaaf73e7e", size = 798442, upload-time = "2025-10-21T15:55:23.747Z" },
+ { url = "https://files.pythonhosted.org/packages/76/70/4f903c608faf786627a8ee17c06e0067b5acade473678b69c8094b248705/regex-2025.10.23-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:8668e5f067e31a47699ebb354f43aeb9c0ef136f915bd864243098524482ac43", size = 864039, upload-time = "2025-10-21T15:55:25.656Z" },
+ { url = "https://files.pythonhosted.org/packages/62/19/2df67b526bf25756c7f447dde554fc10a220fd839cc642f50857d01e4a7b/regex-2025.10.23-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:a32433fe3deb4b2d8eda88790d2808fed0dc097e84f5e683b4cd4f42edef6cca", size = 912057, upload-time = "2025-10-21T15:55:27.309Z" },
+ { url = "https://files.pythonhosted.org/packages/99/14/9a39b7c9e007968411bc3c843cc14cf15437510c0a9991f080cab654fd16/regex-2025.10.23-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d97d73818c642c938db14c0668167f8d39520ca9d983604575ade3fda193afcc", size = 803374, upload-time = "2025-10-21T15:55:28.9Z" },
+ { url = "https://files.pythonhosted.org/packages/d4/f7/3495151dd3ca79949599b6d069b72a61a2c5e24fc441dccc79dcaf708fe6/regex-2025.10.23-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:bca7feecc72ee33579e9f6ddf8babbe473045717a0e7dbc347099530f96e8b9a", size = 787714, upload-time = "2025-10-21T15:55:30.628Z" },
+ { url = "https://files.pythonhosted.org/packages/28/65/ee882455e051131869957ee8597faea45188c9a98c0dad724cfb302d4580/regex-2025.10.23-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:7e24af51e907d7457cc4a72691ec458320b9ae67dc492f63209f01eecb09de32", size = 858392, upload-time = "2025-10-21T15:55:32.322Z" },
+ { url = "https://files.pythonhosted.org/packages/53/25/9287fef5be97529ebd3ac79d256159cb709a07eb58d4be780d1ca3885da8/regex-2025.10.23-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:d10bcde58bbdf18146f3a69ec46dd03233b94a4a5632af97aa5378da3a47d288", size = 850484, upload-time = "2025-10-21T15:55:34.037Z" },
+ { url = "https://files.pythonhosted.org/packages/f3/b4/b49b88b4fea2f14dc73e5b5842755e782fc2e52f74423d6f4adc130d5880/regex-2025.10.23-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:44383bc0c933388516c2692c9a7503e1f4a67e982f20b9a29d2fb70c6494f147", size = 789634, upload-time = "2025-10-21T15:55:35.958Z" },
+ { url = "https://files.pythonhosted.org/packages/b6/3c/2f8d199d0e84e78bcd6bdc2be9b62410624f6b796e2893d1837ae738b160/regex-2025.10.23-cp312-cp312-win32.whl", hash = "sha256:6040a86f95438a0114bba16e51dfe27f1bc004fd29fe725f54a586f6d522b079", size = 266060, upload-time = "2025-10-21T15:55:37.902Z" },
+ { url = "https://files.pythonhosted.org/packages/d7/67/c35e80969f6ded306ad70b0698863310bdf36aca57ad792f45ddc0e2271f/regex-2025.10.23-cp312-cp312-win_amd64.whl", hash = "sha256:436b4c4352fe0762e3bfa34a5567079baa2ef22aa9c37cf4d128979ccfcad842", size = 276931, upload-time = "2025-10-21T15:55:39.502Z" },
+ { url = "https://files.pythonhosted.org/packages/f5/a1/4ed147de7d2b60174f758412c87fa51ada15cd3296a0ff047f4280aaa7ca/regex-2025.10.23-cp312-cp312-win_arm64.whl", hash = "sha256:f4b1b1991617055b46aff6f6db24888c1f05f4db9801349d23f09ed0714a9335", size = 270103, upload-time = "2025-10-21T15:55:41.24Z" },
+ { url = "https://files.pythonhosted.org/packages/28/c6/195a6217a43719d5a6a12cc192a22d12c40290cecfa577f00f4fb822f07d/regex-2025.10.23-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:b7690f95404a1293923a296981fd943cca12c31a41af9c21ba3edd06398fc193", size = 488956, upload-time = "2025-10-21T15:55:42.887Z" },
+ { url = "https://files.pythonhosted.org/packages/4c/93/181070cd1aa2fa541ff2d3afcf763ceecd4937b34c615fa92765020a6c90/regex-2025.10.23-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:1a32d77aeaea58a13230100dd8797ac1a84c457f3af2fdf0d81ea689d5a9105b", size = 290997, upload-time = "2025-10-21T15:55:44.53Z" },
+ { url = "https://files.pythonhosted.org/packages/b6/c5/9d37fbe3a40ed8dda78c23e1263002497540c0d1522ed75482ef6c2000f0/regex-2025.10.23-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:b24b29402f264f70a3c81f45974323b41764ff7159655360543b7cabb73e7d2f", size = 288686, upload-time = "2025-10-21T15:55:46.186Z" },
+ { url = "https://files.pythonhosted.org/packages/5f/e7/db610ff9f10c2921f9b6ac0c8d8be4681b28ddd40fc0549429366967e61f/regex-2025.10.23-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:563824a08c7c03d96856d84b46fdb3bbb7cfbdf79da7ef68725cda2ce169c72a", size = 798466, upload-time = "2025-10-21T15:55:48.24Z" },
+ { url = "https://files.pythonhosted.org/packages/90/10/aab883e1fa7fe2feb15ac663026e70ca0ae1411efa0c7a4a0342d9545015/regex-2025.10.23-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a0ec8bdd88d2e2659c3518087ee34b37e20bd169419ffead4240a7004e8ed03b", size = 863996, upload-time = "2025-10-21T15:55:50.478Z" },
+ { url = "https://files.pythonhosted.org/packages/a2/b0/8f686dd97a51f3b37d0238cd00a6d0f9ccabe701f05b56de1918571d0d61/regex-2025.10.23-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:b577601bfe1d33913fcd9276d7607bbac827c4798d9e14d04bf37d417a6c41cb", size = 912145, upload-time = "2025-10-21T15:55:52.215Z" },
+ { url = "https://files.pythonhosted.org/packages/a3/ca/639f8cd5b08797bca38fc5e7e07f76641a428cf8c7fca05894caf045aa32/regex-2025.10.23-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:7c9f2c68ac6cb3de94eea08a437a75eaa2bd33f9e97c84836ca0b610a5804368", size = 803370, upload-time = "2025-10-21T15:55:53.944Z" },
+ { url = "https://files.pythonhosted.org/packages/0d/1e/a40725bb76959eddf8abc42a967bed6f4851b39f5ac4f20e9794d7832aa5/regex-2025.10.23-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:89f8b9ea3830c79468e26b0e21c3585f69f105157c2154a36f6b7839f8afb351", size = 787767, upload-time = "2025-10-21T15:55:56.004Z" },
+ { url = "https://files.pythonhosted.org/packages/3d/d8/8ee9858062936b0f99656dce390aa667c6e7fb0c357b1b9bf76fb5e2e708/regex-2025.10.23-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:98fd84c4e4ea185b3bb5bf065261ab45867d8875032f358a435647285c722673", size = 858335, upload-time = "2025-10-21T15:55:58.185Z" },
+ { url = "https://files.pythonhosted.org/packages/d8/0a/ed5faaa63fa8e3064ab670e08061fbf09e3a10235b19630cf0cbb9e48c0a/regex-2025.10.23-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:1e11d3e5887b8b096f96b4154dfb902f29c723a9556639586cd140e77e28b313", size = 850402, upload-time = "2025-10-21T15:56:00.023Z" },
+ { url = "https://files.pythonhosted.org/packages/79/14/d05f617342f4b2b4a23561da500ca2beab062bfcc408d60680e77ecaf04d/regex-2025.10.23-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:4f13450328a6634348d47a88367e06b64c9d84980ef6a748f717b13f8ce64e87", size = 789739, upload-time = "2025-10-21T15:56:01.967Z" },
+ { url = "https://files.pythonhosted.org/packages/f9/7b/e8ce8eef42a15f2c3461f8b3e6e924bbc86e9605cb534a393aadc8d3aff8/regex-2025.10.23-cp313-cp313-win32.whl", hash = "sha256:37be9296598a30c6a20236248cb8b2c07ffd54d095b75d3a2a2ee5babdc51df1", size = 266054, upload-time = "2025-10-21T15:56:05.291Z" },
+ { url = "https://files.pythonhosted.org/packages/71/2d/55184ed6be6473187868d2f2e6a0708195fc58270e62a22cbf26028f2570/regex-2025.10.23-cp313-cp313-win_amd64.whl", hash = "sha256:ea7a3c283ce0f06fe789365841e9174ba05f8db16e2fd6ae00a02df9572c04c0", size = 276917, upload-time = "2025-10-21T15:56:07.303Z" },
+ { url = "https://files.pythonhosted.org/packages/9c/d4/927eced0e2bd45c45839e556f987f8c8f8683268dd3c00ad327deb3b0172/regex-2025.10.23-cp313-cp313-win_arm64.whl", hash = "sha256:d9a4953575f300a7bab71afa4cd4ac061c7697c89590a2902b536783eeb49a4f", size = 270105, upload-time = "2025-10-21T15:56:09.857Z" },
+ { url = "https://files.pythonhosted.org/packages/3e/b3/95b310605285573341fc062d1d30b19a54f857530e86c805f942c4ff7941/regex-2025.10.23-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:7d6606524fa77b3912c9ef52a42ef63c6cfbfc1077e9dc6296cd5da0da286044", size = 491850, upload-time = "2025-10-21T15:56:11.685Z" },
+ { url = "https://files.pythonhosted.org/packages/a4/8f/207c2cec01e34e56db1eff606eef46644a60cf1739ecd474627db90ad90b/regex-2025.10.23-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:c037aadf4d64bdc38af7db3dbd34877a057ce6524eefcb2914d6d41c56f968cc", size = 292537, upload-time = "2025-10-21T15:56:13.963Z" },
+ { url = "https://files.pythonhosted.org/packages/98/3b/025240af4ada1dc0b5f10d73f3e5122d04ce7f8908ab8881e5d82b9d61b6/regex-2025.10.23-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:99018c331fb2529084a0c9b4c713dfa49fafb47c7712422e49467c13a636c656", size = 290904, upload-time = "2025-10-21T15:56:16.016Z" },
+ { url = "https://files.pythonhosted.org/packages/81/8e/104ac14e2d3450c43db18ec03e1b96b445a94ae510b60138f00ce2cb7ca1/regex-2025.10.23-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:fd8aba965604d70306eb90a35528f776e59112a7114a5162824d43b76fa27f58", size = 807311, upload-time = "2025-10-21T15:56:17.818Z" },
+ { url = "https://files.pythonhosted.org/packages/19/63/78aef90141b7ce0be8a18e1782f764f6997ad09de0e05251f0d2503a914a/regex-2025.10.23-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:238e67264b4013e74136c49f883734f68656adf8257bfa13b515626b31b20f8e", size = 873241, upload-time = "2025-10-21T15:56:19.941Z" },
+ { url = "https://files.pythonhosted.org/packages/b3/a8/80eb1201bb49ae4dba68a1b284b4211ed9daa8e74dc600018a10a90399fb/regex-2025.10.23-cp313-cp313t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:b2eb48bd9848d66fd04826382f5e8491ae633de3233a3d64d58ceb4ecfa2113a", size = 914794, upload-time = "2025-10-21T15:56:22.488Z" },
+ { url = "https://files.pythonhosted.org/packages/f0/d5/1984b6ee93281f360a119a5ca1af6a8ca7d8417861671388bf750becc29b/regex-2025.10.23-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d36591ce06d047d0c0fe2fc5f14bfbd5b4525d08a7b6a279379085e13f0e3d0e", size = 812581, upload-time = "2025-10-21T15:56:24.319Z" },
+ { url = "https://files.pythonhosted.org/packages/c4/39/11ebdc6d9927172a64ae237d16763145db6bd45ebb4055c17b88edab72a7/regex-2025.10.23-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:b5d4ece8628d6e364302006366cea3ee887db397faebacc5dacf8ef19e064cf8", size = 795346, upload-time = "2025-10-21T15:56:26.232Z" },
+ { url = "https://files.pythonhosted.org/packages/3b/b4/89a591bcc08b5e436af43315284bd233ba77daf0cf20e098d7af12f006c1/regex-2025.10.23-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:39a7e8083959cb1c4ff74e483eecb5a65d3b3e1d821b256e54baf61782c906c6", size = 868214, upload-time = "2025-10-21T15:56:28.597Z" },
+ { url = "https://files.pythonhosted.org/packages/3d/ff/58ba98409c1dbc8316cdb20dafbc63ed267380a07780cafecaf5012dabc9/regex-2025.10.23-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:842d449a8fefe546f311656cf8c0d6729b08c09a185f1cad94c756210286d6a8", size = 854540, upload-time = "2025-10-21T15:56:30.875Z" },
+ { url = "https://files.pythonhosted.org/packages/9a/f2/4a9e9338d67626e2071b643f828a482712ad15889d7268e11e9a63d6f7e9/regex-2025.10.23-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:d614986dc68506be8f00474f4f6960e03e4ca9883f7df47744800e7d7c08a494", size = 799346, upload-time = "2025-10-21T15:56:32.725Z" },
+ { url = "https://files.pythonhosted.org/packages/63/be/543d35c46bebf6f7bf2be538cca74d6585f25714700c36f37f01b92df551/regex-2025.10.23-cp313-cp313t-win32.whl", hash = "sha256:a5b7a26b51a9df473ec16a1934d117443a775ceb7b39b78670b2e21893c330c9", size = 268657, upload-time = "2025-10-21T15:56:34.577Z" },
+ { url = "https://files.pythonhosted.org/packages/14/9f/4dd6b7b612037158bb2c9bcaa710e6fb3c40ad54af441b9c53b3a137a9f1/regex-2025.10.23-cp313-cp313t-win_amd64.whl", hash = "sha256:ce81c5544a5453f61cb6f548ed358cfb111e3b23f3cd42d250a4077a6be2a7b6", size = 280075, upload-time = "2025-10-21T15:56:36.767Z" },
+ { url = "https://files.pythonhosted.org/packages/81/7a/5bd0672aa65d38c8da6747c17c8b441bdb53d816c569e3261013af8e83cf/regex-2025.10.23-cp313-cp313t-win_arm64.whl", hash = "sha256:e9bf7f6699f490e4e43c44757aa179dab24d1960999c84ab5c3d5377714ed473", size = 271219, upload-time = "2025-10-21T15:56:39.033Z" },
+ { url = "https://files.pythonhosted.org/packages/73/f6/0caf29fec943f201fbc8822879c99d31e59c1d51a983d9843ee5cf398539/regex-2025.10.23-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:5b5cb5b6344c4c4c24b2dc87b0bfee78202b07ef7633385df70da7fcf6f7cec6", size = 488960, upload-time = "2025-10-21T15:56:40.849Z" },
+ { url = "https://files.pythonhosted.org/packages/8e/7d/ebb7085b8fa31c24ce0355107cea2b92229d9050552a01c5d291c42aecea/regex-2025.10.23-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:a6ce7973384c37bdf0f371a843f95a6e6f4e1489e10e0cf57330198df72959c5", size = 290932, upload-time = "2025-10-21T15:56:42.875Z" },
+ { url = "https://files.pythonhosted.org/packages/27/41/43906867287cbb5ca4cee671c3cc8081e15deef86a8189c3aad9ac9f6b4d/regex-2025.10.23-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:2ee3663f2c334959016b56e3bd0dd187cbc73f948e3a3af14c3caaa0c3035d10", size = 288766, upload-time = "2025-10-21T15:56:44.894Z" },
+ { url = "https://files.pythonhosted.org/packages/ab/9e/ea66132776700fc77a39b1056e7a5f1308032fead94507e208dc6716b7cd/regex-2025.10.23-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:2003cc82a579107e70d013482acce8ba773293f2db534fb532738395c557ff34", size = 798884, upload-time = "2025-10-21T15:56:47.178Z" },
+ { url = "https://files.pythonhosted.org/packages/d5/99/aed1453687ab63819a443930770db972c5c8064421f0d9f5da9ad029f26b/regex-2025.10.23-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:182c452279365a93a9f45874f7f191ec1c51e1f1eb41bf2b16563f1a40c1da3a", size = 864768, upload-time = "2025-10-21T15:56:49.793Z" },
+ { url = "https://files.pythonhosted.org/packages/99/5d/732fe747a1304805eb3853ce6337eea16b169f7105a0d0dd9c6a5ffa9948/regex-2025.10.23-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:b1249e9ff581c5b658c8f0437f883b01f1edcf424a16388591e7c05e5e9e8b0c", size = 911394, upload-time = "2025-10-21T15:56:52.186Z" },
+ { url = "https://files.pythonhosted.org/packages/5e/48/58a1f6623466522352a6efa153b9a3714fc559d9f930e9bc947b4a88a2c3/regex-2025.10.23-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:2b841698f93db3ccc36caa1900d2a3be281d9539b822dc012f08fc80b46a3224", size = 803145, upload-time = "2025-10-21T15:56:55.142Z" },
+ { url = "https://files.pythonhosted.org/packages/ea/f6/7dea79be2681a5574ab3fc237aa53b2c1dfd6bd2b44d4640b6c76f33f4c1/regex-2025.10.23-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:956d89e0c92d471e8f7eee73f73fdff5ed345886378c45a43175a77538a1ffe4", size = 787831, upload-time = "2025-10-21T15:56:57.203Z" },
+ { url = "https://files.pythonhosted.org/packages/3a/ad/07b76950fbbe65f88120ca2d8d845047c401450f607c99ed38862904671d/regex-2025.10.23-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:5c259cb363299a0d90d63b5c0d7568ee98419861618a95ee9d91a41cb9954462", size = 859162, upload-time = "2025-10-21T15:56:59.195Z" },
+ { url = "https://files.pythonhosted.org/packages/41/87/374f3b2021b22aa6a4fc0b750d63f9721e53d1631a238f7a1c343c1cd288/regex-2025.10.23-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:185d2b18c062820b3a40d8fefa223a83f10b20a674bf6e8c4a432e8dfd844627", size = 849899, upload-time = "2025-10-21T15:57:01.747Z" },
+ { url = "https://files.pythonhosted.org/packages/12/4a/7f7bb17c5a5a9747249807210e348450dab9212a46ae6d23ebce86ba6a2b/regex-2025.10.23-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:281d87fa790049c2b7c1b4253121edd80b392b19b5a3d28dc2a77579cb2a58ec", size = 789372, upload-time = "2025-10-21T15:57:04.018Z" },
+ { url = "https://files.pythonhosted.org/packages/c9/dd/9c7728ff544fea09bbc8635e4c9e7c423b11c24f1a7a14e6ac4831466709/regex-2025.10.23-cp314-cp314-win32.whl", hash = "sha256:63b81eef3656072e4ca87c58084c7a9c2b81d41a300b157be635a8a675aacfb8", size = 271451, upload-time = "2025-10-21T15:57:06.266Z" },
+ { url = "https://files.pythonhosted.org/packages/48/f8/ef7837ff858eb74079c4804c10b0403c0b740762e6eedba41062225f7117/regex-2025.10.23-cp314-cp314-win_amd64.whl", hash = "sha256:0967c5b86f274800a34a4ed862dfab56928144d03cb18821c5153f8777947796", size = 280173, upload-time = "2025-10-21T15:57:08.206Z" },
+ { url = "https://files.pythonhosted.org/packages/8e/d0/d576e1dbd9885bfcd83d0e90762beea48d9373a6f7ed39170f44ed22e336/regex-2025.10.23-cp314-cp314-win_arm64.whl", hash = "sha256:c70dfe58b0a00b36aa04cdb0f798bf3e0adc31747641f69e191109fd8572c9a9", size = 273206, upload-time = "2025-10-21T15:57:10.367Z" },
+ { url = "https://files.pythonhosted.org/packages/a6/d0/2025268315e8b2b7b660039824cb7765a41623e97d4cd421510925400487/regex-2025.10.23-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:1f5799ea1787aa6de6c150377d11afad39a38afd033f0c5247aecb997978c422", size = 491854, upload-time = "2025-10-21T15:57:12.526Z" },
+ { url = "https://files.pythonhosted.org/packages/44/35/5681c2fec5e8b33454390af209c4353dfc44606bf06d714b0b8bd0454ffe/regex-2025.10.23-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:a9639ab7540cfea45ef57d16dcbea2e22de351998d614c3ad2f9778fa3bdd788", size = 292542, upload-time = "2025-10-21T15:57:15.158Z" },
+ { url = "https://files.pythonhosted.org/packages/5d/17/184eed05543b724132e4a18149e900f5189001fcfe2d64edaae4fbaf36b4/regex-2025.10.23-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:08f52122c352eb44c3421dab78b9b73a8a77a282cc8314ae576fcaa92b780d10", size = 290903, upload-time = "2025-10-21T15:57:17.108Z" },
+ { url = "https://files.pythonhosted.org/packages/25/d0/5e3347aa0db0de382dddfa133a7b0ae72f24b4344f3989398980b44a3924/regex-2025.10.23-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ebf1baebef1c4088ad5a5623decec6b52950f0e4d7a0ae4d48f0a99f8c9cb7d7", size = 807546, upload-time = "2025-10-21T15:57:19.179Z" },
+ { url = "https://files.pythonhosted.org/packages/d2/bb/40c589bbdce1be0c55e9f8159789d58d47a22014f2f820cf2b517a5cd193/regex-2025.10.23-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:16b0f1c2e2d566c562d5c384c2b492646be0a19798532fdc1fdedacc66e3223f", size = 873322, upload-time = "2025-10-21T15:57:21.36Z" },
+ { url = "https://files.pythonhosted.org/packages/fe/56/a7e40c01575ac93360e606278d359f91829781a9f7fb6e5aa435039edbda/regex-2025.10.23-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:f7ada5d9dceafaab92646aa00c10a9efd9b09942dd9b0d7c5a4b73db92cc7e61", size = 914855, upload-time = "2025-10-21T15:57:24.044Z" },
+ { url = "https://files.pythonhosted.org/packages/5c/4b/d55587b192763db3163c3f508b3b67b31bb6f5e7a0e08b83013d0a59500a/regex-2025.10.23-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:3a36b4005770044bf08edecc798f0e41a75795b9e7c9c12fe29da8d792ef870c", size = 812724, upload-time = "2025-10-21T15:57:26.123Z" },
+ { url = "https://files.pythonhosted.org/packages/33/20/18bac334955fbe99d17229f4f8e98d05e4a501ac03a442be8facbb37c304/regex-2025.10.23-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:af7b2661dcc032da1fae82069b5ebf2ac1dfcd5359ef8b35e1367bfc92181432", size = 795439, upload-time = "2025-10-21T15:57:28.497Z" },
+ { url = "https://files.pythonhosted.org/packages/67/46/c57266be9df8549c7d85deb4cb82280cb0019e46fff677534c5fa1badfa4/regex-2025.10.23-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:1cb976810ac1416a67562c2e5ba0accf6f928932320fef302e08100ed681b38e", size = 868336, upload-time = "2025-10-21T15:57:30.867Z" },
+ { url = "https://files.pythonhosted.org/packages/b8/f3/bd5879e41ef8187fec5e678e94b526a93f99e7bbe0437b0f2b47f9101694/regex-2025.10.23-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:1a56a54be3897d62f54290190fbcd754bff6932934529fbf5b29933da28fcd43", size = 854567, upload-time = "2025-10-21T15:57:33.062Z" },
+ { url = "https://files.pythonhosted.org/packages/e6/57/2b6bbdbd2f24dfed5b028033aa17ad8f7d86bb28f1a892cac8b3bc89d059/regex-2025.10.23-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:8f3e6d202fb52c2153f532043bbcf618fd177df47b0b306741eb9b60ba96edc3", size = 799565, upload-time = "2025-10-21T15:57:35.153Z" },
+ { url = "https://files.pythonhosted.org/packages/c7/ba/a6168f542ba73b151ed81237adf6b869c7b2f7f8d51618111296674e20ee/regex-2025.10.23-cp314-cp314t-win32.whl", hash = "sha256:1fa1186966b2621b1769fd467c7b22e317e6ba2d2cdcecc42ea3089ef04a8521", size = 274428, upload-time = "2025-10-21T15:57:37.996Z" },
+ { url = "https://files.pythonhosted.org/packages/ef/a0/c84475e14a2829e9b0864ebf77c3f7da909df9d8acfe2bb540ff0072047c/regex-2025.10.23-cp314-cp314t-win_amd64.whl", hash = "sha256:08a15d40ce28362eac3e78e83d75475147869c1ff86bc93285f43b4f4431a741", size = 284140, upload-time = "2025-10-21T15:57:40.027Z" },
+ { url = "https://files.pythonhosted.org/packages/51/33/6a08ade0eee5b8ba79386869fa6f77afeb835b60510f3525db987e2fffc4/regex-2025.10.23-cp314-cp314t-win_arm64.whl", hash = "sha256:a93e97338e1c8ea2649e130dcfbe8cd69bba5e1e163834752ab64dcb4de6d5ed", size = 274497, upload-time = "2025-10-21T15:57:42.389Z" },
+]
+
+[[package]]
+name = "requests"
+version = "2.32.5"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "certifi" },
+ { name = "charset-normalizer" },
+ { name = "idna" },
+ { name = "urllib3" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/c9/74/b3ff8e6c8446842c3f5c837e9c3dfcfe2018ea6ecef224c710c85ef728f4/requests-2.32.5.tar.gz", hash = "sha256:dbba0bac56e100853db0ea71b82b4dfd5fe2bf6d3754a8893c3af500cec7d7cf", size = 134517, upload-time = "2025-08-18T20:46:02.573Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/1e/db/4254e3eabe8020b458f1a747140d32277ec7a271daf1d235b70dc0b4e6e3/requests-2.32.5-py3-none-any.whl", hash = "sha256:2462f94637a34fd532264295e186976db0f5d453d1cdd31473c85a6a161affb6", size = 64738, upload-time = "2025-08-18T20:46:00.542Z" },
+]
+
+[[package]]
+name = "rich"
+version = "14.2.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "markdown-it-py" },
+ { name = "pygments" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/fb/d2/8920e102050a0de7bfabeb4c4614a49248cf8d5d7a8d01885fbb24dc767a/rich-14.2.0.tar.gz", hash = "sha256:73ff50c7c0c1c77c8243079283f4edb376f0f6442433aecb8ce7e6d0b92d1fe4", size = 219990, upload-time = "2025-10-09T14:16:53.064Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/25/7a/b0178788f8dc6cafce37a212c99565fa1fe7872c70c6c9c1e1a372d9d88f/rich-14.2.0-py3-none-any.whl", hash = "sha256:76bc51fe2e57d2b1be1f96c524b890b816e334ab4c1e45888799bfaab0021edd", size = 243393, upload-time = "2025-10-09T14:16:51.245Z" },
+]
+
+[[package]]
+name = "safetensors"
+version = "0.6.2"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/ac/cc/738f3011628920e027a11754d9cae9abec1aed00f7ae860abbf843755233/safetensors-0.6.2.tar.gz", hash = "sha256:43ff2aa0e6fa2dc3ea5524ac7ad93a9839256b8703761e76e2d0b2a3fa4f15d9", size = 197968, upload-time = "2025-08-08T13:13:58.654Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/4d/b1/3f5fd73c039fc87dba3ff8b5d528bfc5a32b597fea8e7a6a4800343a17c7/safetensors-0.6.2-cp38-abi3-macosx_10_12_x86_64.whl", hash = "sha256:9c85ede8ec58f120bad982ec47746981e210492a6db876882aa021446af8ffba", size = 454797, upload-time = "2025-08-08T13:13:52.066Z" },
+ { url = "https://files.pythonhosted.org/packages/8c/c9/bb114c158540ee17907ec470d01980957fdaf87b4aa07914c24eba87b9c6/safetensors-0.6.2-cp38-abi3-macosx_11_0_arm64.whl", hash = "sha256:d6675cf4b39c98dbd7d940598028f3742e0375a6b4d4277e76beb0c35f4b843b", size = 432206, upload-time = "2025-08-08T13:13:50.931Z" },
+ { url = "https://files.pythonhosted.org/packages/d3/8e/f70c34e47df3110e8e0bb268d90db8d4be8958a54ab0336c9be4fe86dac8/safetensors-0.6.2-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1d2d2b3ce1e2509c68932ca03ab8f20570920cd9754b05063d4368ee52833ecd", size = 473261, upload-time = "2025-08-08T13:13:41.259Z" },
+ { url = "https://files.pythonhosted.org/packages/2a/f5/be9c6a7c7ef773e1996dc214e73485286df1836dbd063e8085ee1976f9cb/safetensors-0.6.2-cp38-abi3-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:93de35a18f46b0f5a6a1f9e26d91b442094f2df02e9fd7acf224cfec4238821a", size = 485117, upload-time = "2025-08-08T13:13:43.506Z" },
+ { url = "https://files.pythonhosted.org/packages/c9/55/23f2d0a2c96ed8665bf17a30ab4ce5270413f4d74b6d87dd663258b9af31/safetensors-0.6.2-cp38-abi3-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:89a89b505f335640f9120fac65ddeb83e40f1fd081cb8ed88b505bdccec8d0a1", size = 616154, upload-time = "2025-08-08T13:13:45.096Z" },
+ { url = "https://files.pythonhosted.org/packages/98/c6/affb0bd9ce02aa46e7acddbe087912a04d953d7a4d74b708c91b5806ef3f/safetensors-0.6.2-cp38-abi3-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:fc4d0d0b937e04bdf2ae6f70cd3ad51328635fe0e6214aa1fc811f3b576b3bda", size = 520713, upload-time = "2025-08-08T13:13:46.25Z" },
+ { url = "https://files.pythonhosted.org/packages/fe/5d/5a514d7b88e310c8b146e2404e0dc161282e78634d9358975fd56dfd14be/safetensors-0.6.2-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8045db2c872db8f4cbe3faa0495932d89c38c899c603f21e9b6486951a5ecb8f", size = 485835, upload-time = "2025-08-08T13:13:49.373Z" },
+ { url = "https://files.pythonhosted.org/packages/7a/7b/4fc3b2ba62c352b2071bea9cfbad330fadda70579f617506ae1a2f129cab/safetensors-0.6.2-cp38-abi3-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:81e67e8bab9878bb568cffbc5f5e655adb38d2418351dc0859ccac158f753e19", size = 521503, upload-time = "2025-08-08T13:13:47.651Z" },
+ { url = "https://files.pythonhosted.org/packages/5a/50/0057e11fe1f3cead9254315a6c106a16dd4b1a19cd247f7cc6414f6b7866/safetensors-0.6.2-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:b0e4d029ab0a0e0e4fdf142b194514695b1d7d3735503ba700cf36d0fc7136ce", size = 652256, upload-time = "2025-08-08T13:13:53.167Z" },
+ { url = "https://files.pythonhosted.org/packages/e9/29/473f789e4ac242593ac1656fbece6e1ecd860bb289e635e963667807afe3/safetensors-0.6.2-cp38-abi3-musllinux_1_2_armv7l.whl", hash = "sha256:fa48268185c52bfe8771e46325a1e21d317207bcabcb72e65c6e28e9ffeb29c7", size = 747281, upload-time = "2025-08-08T13:13:54.656Z" },
+ { url = "https://files.pythonhosted.org/packages/68/52/f7324aad7f2df99e05525c84d352dc217e0fa637a4f603e9f2eedfbe2c67/safetensors-0.6.2-cp38-abi3-musllinux_1_2_i686.whl", hash = "sha256:d83c20c12c2d2f465997c51b7ecb00e407e5f94d7dec3ea0cc11d86f60d3fde5", size = 692286, upload-time = "2025-08-08T13:13:55.884Z" },
+ { url = "https://files.pythonhosted.org/packages/ad/fe/cad1d9762868c7c5dc70c8620074df28ebb1a8e4c17d4c0cb031889c457e/safetensors-0.6.2-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:d944cea65fad0ead848b6ec2c37cc0b197194bec228f8020054742190e9312ac", size = 655957, upload-time = "2025-08-08T13:13:57.029Z" },
+ { url = "https://files.pythonhosted.org/packages/59/a7/e2158e17bbe57d104f0abbd95dff60dda916cf277c9f9663b4bf9bad8b6e/safetensors-0.6.2-cp38-abi3-win32.whl", hash = "sha256:cab75ca7c064d3911411461151cb69380c9225798a20e712b102edda2542ddb1", size = 308926, upload-time = "2025-08-08T13:14:01.095Z" },
+ { url = "https://files.pythonhosted.org/packages/2c/c3/c0be1135726618dc1e28d181b8c442403d8dbb9e273fd791de2d4384bcdd/safetensors-0.6.2-cp38-abi3-win_amd64.whl", hash = "sha256:c7b214870df923cbc1593c3faee16bec59ea462758699bd3fee399d00aac072c", size = 320192, upload-time = "2025-08-08T13:13:59.467Z" },
+]
+
+[[package]]
+name = "scikit-learn"
+version = "1.7.2"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "joblib" },
+ { name = "numpy", version = "2.2.6", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.11'" },
+ { name = "numpy", version = "2.3.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.11'" },
+ { name = "scipy", version = "1.15.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.11'" },
+ { name = "scipy", version = "1.16.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.11'" },
+ { name = "threadpoolctl" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/98/c2/a7855e41c9d285dfe86dc50b250978105dce513d6e459ea66a6aeb0e1e0c/scikit_learn-1.7.2.tar.gz", hash = "sha256:20e9e49ecd130598f1ca38a1d85090e1a600147b9c02fa6f15d69cb53d968fda", size = 7193136, upload-time = "2025-09-09T08:21:29.075Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/ba/3e/daed796fd69cce768b8788401cc464ea90b306fb196ae1ffed0b98182859/scikit_learn-1.7.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:6b33579c10a3081d076ab403df4a4190da4f4432d443521674637677dc91e61f", size = 9336221, upload-time = "2025-09-09T08:20:19.328Z" },
+ { url = "https://files.pythonhosted.org/packages/1c/ce/af9d99533b24c55ff4e18d9b7b4d9919bbc6cd8f22fe7a7be01519a347d5/scikit_learn-1.7.2-cp310-cp310-macosx_12_0_arm64.whl", hash = "sha256:36749fb62b3d961b1ce4fedf08fa57a1986cd409eff2d783bca5d4b9b5fce51c", size = 8653834, upload-time = "2025-09-09T08:20:22.073Z" },
+ { url = "https://files.pythonhosted.org/packages/58/0e/8c2a03d518fb6bd0b6b0d4b114c63d5f1db01ff0f9925d8eb10960d01c01/scikit_learn-1.7.2-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:7a58814265dfc52b3295b1900cfb5701589d30a8bb026c7540f1e9d3499d5ec8", size = 9660938, upload-time = "2025-09-09T08:20:24.327Z" },
+ { url = "https://files.pythonhosted.org/packages/2b/75/4311605069b5d220e7cf5adabb38535bd96f0079313cdbb04b291479b22a/scikit_learn-1.7.2-cp310-cp310-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4a847fea807e278f821a0406ca01e387f97653e284ecbd9750e3ee7c90347f18", size = 9477818, upload-time = "2025-09-09T08:20:26.845Z" },
+ { url = "https://files.pythonhosted.org/packages/7f/9b/87961813c34adbca21a6b3f6b2bea344c43b30217a6d24cc437c6147f3e8/scikit_learn-1.7.2-cp310-cp310-win_amd64.whl", hash = "sha256:ca250e6836d10e6f402436d6463d6c0e4d8e0234cfb6a9a47835bd392b852ce5", size = 8886969, upload-time = "2025-09-09T08:20:29.329Z" },
+ { url = "https://files.pythonhosted.org/packages/43/83/564e141eef908a5863a54da8ca342a137f45a0bfb71d1d79704c9894c9d1/scikit_learn-1.7.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:c7509693451651cd7361d30ce4e86a1347493554f172b1c72a39300fa2aea79e", size = 9331967, upload-time = "2025-09-09T08:20:32.421Z" },
+ { url = "https://files.pythonhosted.org/packages/18/d6/ba863a4171ac9d7314c4d3fc251f015704a2caeee41ced89f321c049ed83/scikit_learn-1.7.2-cp311-cp311-macosx_12_0_arm64.whl", hash = "sha256:0486c8f827c2e7b64837c731c8feff72c0bd2b998067a8a9cbc10643c31f0fe1", size = 8648645, upload-time = "2025-09-09T08:20:34.436Z" },
+ { url = "https://files.pythonhosted.org/packages/ef/0e/97dbca66347b8cf0ea8b529e6bb9367e337ba2e8be0ef5c1a545232abfde/scikit_learn-1.7.2-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:89877e19a80c7b11a2891a27c21c4894fb18e2c2e077815bcade10d34287b20d", size = 9715424, upload-time = "2025-09-09T08:20:36.776Z" },
+ { url = "https://files.pythonhosted.org/packages/f7/32/1f3b22e3207e1d2c883a7e09abb956362e7d1bd2f14458c7de258a26ac15/scikit_learn-1.7.2-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8da8bf89d4d79aaec192d2bda62f9b56ae4e5b4ef93b6a56b5de4977e375c1f1", size = 9509234, upload-time = "2025-09-09T08:20:38.957Z" },
+ { url = "https://files.pythonhosted.org/packages/9f/71/34ddbd21f1da67c7a768146968b4d0220ee6831e4bcbad3e03dd3eae88b6/scikit_learn-1.7.2-cp311-cp311-win_amd64.whl", hash = "sha256:9b7ed8d58725030568523e937c43e56bc01cadb478fc43c042a9aca1dacb3ba1", size = 8894244, upload-time = "2025-09-09T08:20:41.166Z" },
+ { url = "https://files.pythonhosted.org/packages/a7/aa/3996e2196075689afb9fce0410ebdb4a09099d7964d061d7213700204409/scikit_learn-1.7.2-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:8d91a97fa2b706943822398ab943cde71858a50245e31bc71dba62aab1d60a96", size = 9259818, upload-time = "2025-09-09T08:20:43.19Z" },
+ { url = "https://files.pythonhosted.org/packages/43/5d/779320063e88af9c4a7c2cf463ff11c21ac9c8bd730c4a294b0000b666c9/scikit_learn-1.7.2-cp312-cp312-macosx_12_0_arm64.whl", hash = "sha256:acbc0f5fd2edd3432a22c69bed78e837c70cf896cd7993d71d51ba6708507476", size = 8636997, upload-time = "2025-09-09T08:20:45.468Z" },
+ { url = "https://files.pythonhosted.org/packages/5c/d0/0c577d9325b05594fdd33aa970bf53fb673f051a45496842caee13cfd7fe/scikit_learn-1.7.2-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:e5bf3d930aee75a65478df91ac1225ff89cd28e9ac7bd1196853a9229b6adb0b", size = 9478381, upload-time = "2025-09-09T08:20:47.982Z" },
+ { url = "https://files.pythonhosted.org/packages/82/70/8bf44b933837ba8494ca0fc9a9ab60f1c13b062ad0197f60a56e2fc4c43e/scikit_learn-1.7.2-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b4d6e9deed1a47aca9fe2f267ab8e8fe82ee20b4526b2c0cd9e135cea10feb44", size = 9300296, upload-time = "2025-09-09T08:20:50.366Z" },
+ { url = "https://files.pythonhosted.org/packages/c6/99/ed35197a158f1fdc2fe7c3680e9c70d0128f662e1fee4ed495f4b5e13db0/scikit_learn-1.7.2-cp312-cp312-win_amd64.whl", hash = "sha256:6088aa475f0785e01bcf8529f55280a3d7d298679f50c0bb70a2364a82d0b290", size = 8731256, upload-time = "2025-09-09T08:20:52.627Z" },
+ { url = "https://files.pythonhosted.org/packages/ae/93/a3038cb0293037fd335f77f31fe053b89c72f17b1c8908c576c29d953e84/scikit_learn-1.7.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:0b7dacaa05e5d76759fb071558a8b5130f4845166d88654a0f9bdf3eb57851b7", size = 9212382, upload-time = "2025-09-09T08:20:54.731Z" },
+ { url = "https://files.pythonhosted.org/packages/40/dd/9a88879b0c1104259136146e4742026b52df8540c39fec21a6383f8292c7/scikit_learn-1.7.2-cp313-cp313-macosx_12_0_arm64.whl", hash = "sha256:abebbd61ad9e1deed54cca45caea8ad5f79e1b93173dece40bb8e0c658dbe6fe", size = 8592042, upload-time = "2025-09-09T08:20:57.313Z" },
+ { url = "https://files.pythonhosted.org/packages/46/af/c5e286471b7d10871b811b72ae794ac5fe2989c0a2df07f0ec723030f5f5/scikit_learn-1.7.2-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:502c18e39849c0ea1a5d681af1dbcf15f6cce601aebb657aabbfe84133c1907f", size = 9434180, upload-time = "2025-09-09T08:20:59.671Z" },
+ { url = "https://files.pythonhosted.org/packages/f1/fd/df59faa53312d585023b2da27e866524ffb8faf87a68516c23896c718320/scikit_learn-1.7.2-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:7a4c328a71785382fe3fe676a9ecf2c86189249beff90bf85e22bdb7efaf9ae0", size = 9283660, upload-time = "2025-09-09T08:21:01.71Z" },
+ { url = "https://files.pythonhosted.org/packages/a7/c7/03000262759d7b6f38c836ff9d512f438a70d8a8ddae68ee80de72dcfb63/scikit_learn-1.7.2-cp313-cp313-win_amd64.whl", hash = "sha256:63a9afd6f7b229aad94618c01c252ce9e6fa97918c5ca19c9a17a087d819440c", size = 8702057, upload-time = "2025-09-09T08:21:04.234Z" },
+ { url = "https://files.pythonhosted.org/packages/55/87/ef5eb1f267084532c8e4aef98a28b6ffe7425acbfd64b5e2f2e066bc29b3/scikit_learn-1.7.2-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:9acb6c5e867447b4e1390930e3944a005e2cb115922e693c08a323421a6966e8", size = 9558731, upload-time = "2025-09-09T08:21:06.381Z" },
+ { url = "https://files.pythonhosted.org/packages/93/f8/6c1e3fc14b10118068d7938878a9f3f4e6d7b74a8ddb1e5bed65159ccda8/scikit_learn-1.7.2-cp313-cp313t-macosx_12_0_arm64.whl", hash = "sha256:2a41e2a0ef45063e654152ec9d8bcfc39f7afce35b08902bfe290c2498a67a6a", size = 9038852, upload-time = "2025-09-09T08:21:08.628Z" },
+ { url = "https://files.pythonhosted.org/packages/83/87/066cafc896ee540c34becf95d30375fe5cbe93c3b75a0ee9aa852cd60021/scikit_learn-1.7.2-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:98335fb98509b73385b3ab2bd0639b1f610541d3988ee675c670371d6a87aa7c", size = 9527094, upload-time = "2025-09-09T08:21:11.486Z" },
+ { url = "https://files.pythonhosted.org/packages/9c/2b/4903e1ccafa1f6453b1ab78413938c8800633988c838aa0be386cbb33072/scikit_learn-1.7.2-cp313-cp313t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:191e5550980d45449126e23ed1d5e9e24b2c68329ee1f691a3987476e115e09c", size = 9367436, upload-time = "2025-09-09T08:21:13.602Z" },
+ { url = "https://files.pythonhosted.org/packages/b5/aa/8444be3cfb10451617ff9d177b3c190288f4563e6c50ff02728be67ad094/scikit_learn-1.7.2-cp313-cp313t-win_amd64.whl", hash = "sha256:57dc4deb1d3762c75d685507fbd0bc17160144b2f2ba4ccea5dc285ab0d0e973", size = 9275749, upload-time = "2025-09-09T08:21:15.96Z" },
+ { url = "https://files.pythonhosted.org/packages/d9/82/dee5acf66837852e8e68df6d8d3a6cb22d3df997b733b032f513d95205b7/scikit_learn-1.7.2-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:fa8f63940e29c82d1e67a45d5297bdebbcb585f5a5a50c4914cc2e852ab77f33", size = 9208906, upload-time = "2025-09-09T08:21:18.557Z" },
+ { url = "https://files.pythonhosted.org/packages/3c/30/9029e54e17b87cb7d50d51a5926429c683d5b4c1732f0507a6c3bed9bf65/scikit_learn-1.7.2-cp314-cp314-macosx_12_0_arm64.whl", hash = "sha256:f95dc55b7902b91331fa4e5845dd5bde0580c9cd9612b1b2791b7e80c3d32615", size = 8627836, upload-time = "2025-09-09T08:21:20.695Z" },
+ { url = "https://files.pythonhosted.org/packages/60/18/4a52c635c71b536879f4b971c2cedf32c35ee78f48367885ed8025d1f7ee/scikit_learn-1.7.2-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:9656e4a53e54578ad10a434dc1f993330568cfee176dff07112b8785fb413106", size = 9426236, upload-time = "2025-09-09T08:21:22.645Z" },
+ { url = "https://files.pythonhosted.org/packages/99/7e/290362f6ab582128c53445458a5befd471ed1ea37953d5bcf80604619250/scikit_learn-1.7.2-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:96dc05a854add0e50d3f47a1ef21a10a595016da5b007c7d9cd9d0bffd1fcc61", size = 9312593, upload-time = "2025-09-09T08:21:24.65Z" },
+ { url = "https://files.pythonhosted.org/packages/8e/87/24f541b6d62b1794939ae6422f8023703bbf6900378b2b34e0b4384dfefd/scikit_learn-1.7.2-cp314-cp314-win_amd64.whl", hash = "sha256:bb24510ed3f9f61476181e4db51ce801e2ba37541def12dc9333b946fc7a9cf8", size = 8820007, upload-time = "2025-09-09T08:21:26.713Z" },
+]
+
+[[package]]
+name = "scipy"
+version = "1.15.3"
+source = { registry = "https://pypi.org/simple" }
+resolution-markers = [
+ "python_full_version < '3.11'",
+]
+dependencies = [
+ { name = "numpy", version = "2.2.6", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.11'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/0f/37/6964b830433e654ec7485e45a00fc9a27cf868d622838f6b6d9c5ec0d532/scipy-1.15.3.tar.gz", hash = "sha256:eae3cf522bc7df64b42cad3925c876e1b0b6c35c1337c93e12c0f366f55b0eaf", size = 59419214, upload-time = "2025-05-08T16:13:05.955Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/78/2f/4966032c5f8cc7e6a60f1b2e0ad686293b9474b65246b0c642e3ef3badd0/scipy-1.15.3-cp310-cp310-macosx_10_13_x86_64.whl", hash = "sha256:a345928c86d535060c9c2b25e71e87c39ab2f22fc96e9636bd74d1dbf9de448c", size = 38702770, upload-time = "2025-05-08T16:04:20.849Z" },
+ { url = "https://files.pythonhosted.org/packages/a0/6e/0c3bf90fae0e910c274db43304ebe25a6b391327f3f10b5dcc638c090795/scipy-1.15.3-cp310-cp310-macosx_12_0_arm64.whl", hash = "sha256:ad3432cb0f9ed87477a8d97f03b763fd1d57709f1bbde3c9369b1dff5503b253", size = 30094511, upload-time = "2025-05-08T16:04:27.103Z" },
+ { url = "https://files.pythonhosted.org/packages/ea/b1/4deb37252311c1acff7f101f6453f0440794f51b6eacb1aad4459a134081/scipy-1.15.3-cp310-cp310-macosx_14_0_arm64.whl", hash = "sha256:aef683a9ae6eb00728a542b796f52a5477b78252edede72b8327a886ab63293f", size = 22368151, upload-time = "2025-05-08T16:04:31.731Z" },
+ { url = "https://files.pythonhosted.org/packages/38/7d/f457626e3cd3c29b3a49ca115a304cebb8cc6f31b04678f03b216899d3c6/scipy-1.15.3-cp310-cp310-macosx_14_0_x86_64.whl", hash = "sha256:1c832e1bd78dea67d5c16f786681b28dd695a8cb1fb90af2e27580d3d0967e92", size = 25121732, upload-time = "2025-05-08T16:04:36.596Z" },
+ { url = "https://files.pythonhosted.org/packages/db/0a/92b1de4a7adc7a15dcf5bddc6e191f6f29ee663b30511ce20467ef9b82e4/scipy-1.15.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:263961f658ce2165bbd7b99fa5135195c3a12d9bef045345016b8b50c315cb82", size = 35547617, upload-time = "2025-05-08T16:04:43.546Z" },
+ { url = "https://files.pythonhosted.org/packages/8e/6d/41991e503e51fc1134502694c5fa7a1671501a17ffa12716a4a9151af3df/scipy-1.15.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9e2abc762b0811e09a0d3258abee2d98e0c703eee49464ce0069590846f31d40", size = 37662964, upload-time = "2025-05-08T16:04:49.431Z" },
+ { url = "https://files.pythonhosted.org/packages/25/e1/3df8f83cb15f3500478c889be8fb18700813b95e9e087328230b98d547ff/scipy-1.15.3-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:ed7284b21a7a0c8f1b6e5977ac05396c0d008b89e05498c8b7e8f4a1423bba0e", size = 37238749, upload-time = "2025-05-08T16:04:55.215Z" },
+ { url = "https://files.pythonhosted.org/packages/93/3e/b3257cf446f2a3533ed7809757039016b74cd6f38271de91682aa844cfc5/scipy-1.15.3-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:5380741e53df2c566f4d234b100a484b420af85deb39ea35a1cc1be84ff53a5c", size = 40022383, upload-time = "2025-05-08T16:05:01.914Z" },
+ { url = "https://files.pythonhosted.org/packages/d1/84/55bc4881973d3f79b479a5a2e2df61c8c9a04fcb986a213ac9c02cfb659b/scipy-1.15.3-cp310-cp310-win_amd64.whl", hash = "sha256:9d61e97b186a57350f6d6fd72640f9e99d5a4a2b8fbf4b9ee9a841eab327dc13", size = 41259201, upload-time = "2025-05-08T16:05:08.166Z" },
+ { url = "https://files.pythonhosted.org/packages/96/ab/5cc9f80f28f6a7dff646c5756e559823614a42b1939d86dd0ed550470210/scipy-1.15.3-cp311-cp311-macosx_10_13_x86_64.whl", hash = "sha256:993439ce220d25e3696d1b23b233dd010169b62f6456488567e830654ee37a6b", size = 38714255, upload-time = "2025-05-08T16:05:14.596Z" },
+ { url = "https://files.pythonhosted.org/packages/4a/4a/66ba30abe5ad1a3ad15bfb0b59d22174012e8056ff448cb1644deccbfed2/scipy-1.15.3-cp311-cp311-macosx_12_0_arm64.whl", hash = "sha256:34716e281f181a02341ddeaad584205bd2fd3c242063bd3423d61ac259ca7eba", size = 30111035, upload-time = "2025-05-08T16:05:20.152Z" },
+ { url = "https://files.pythonhosted.org/packages/4b/fa/a7e5b95afd80d24313307f03624acc65801846fa75599034f8ceb9e2cbf6/scipy-1.15.3-cp311-cp311-macosx_14_0_arm64.whl", hash = "sha256:3b0334816afb8b91dab859281b1b9786934392aa3d527cd847e41bb6f45bee65", size = 22384499, upload-time = "2025-05-08T16:05:24.494Z" },
+ { url = "https://files.pythonhosted.org/packages/17/99/f3aaddccf3588bb4aea70ba35328c204cadd89517a1612ecfda5b2dd9d7a/scipy-1.15.3-cp311-cp311-macosx_14_0_x86_64.whl", hash = "sha256:6db907c7368e3092e24919b5e31c76998b0ce1684d51a90943cb0ed1b4ffd6c1", size = 25152602, upload-time = "2025-05-08T16:05:29.313Z" },
+ { url = "https://files.pythonhosted.org/packages/56/c5/1032cdb565f146109212153339f9cb8b993701e9fe56b1c97699eee12586/scipy-1.15.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:721d6b4ef5dc82ca8968c25b111e307083d7ca9091bc38163fb89243e85e3889", size = 35503415, upload-time = "2025-05-08T16:05:34.699Z" },
+ { url = "https://files.pythonhosted.org/packages/bd/37/89f19c8c05505d0601ed5650156e50eb881ae3918786c8fd7262b4ee66d3/scipy-1.15.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:39cb9c62e471b1bb3750066ecc3a3f3052b37751c7c3dfd0fd7e48900ed52982", size = 37652622, upload-time = "2025-05-08T16:05:40.762Z" },
+ { url = "https://files.pythonhosted.org/packages/7e/31/be59513aa9695519b18e1851bb9e487de66f2d31f835201f1b42f5d4d475/scipy-1.15.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:795c46999bae845966368a3c013e0e00947932d68e235702b5c3f6ea799aa8c9", size = 37244796, upload-time = "2025-05-08T16:05:48.119Z" },
+ { url = "https://files.pythonhosted.org/packages/10/c0/4f5f3eeccc235632aab79b27a74a9130c6c35df358129f7ac8b29f562ac7/scipy-1.15.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:18aaacb735ab38b38db42cb01f6b92a2d0d4b6aabefeb07f02849e47f8fb3594", size = 40047684, upload-time = "2025-05-08T16:05:54.22Z" },
+ { url = "https://files.pythonhosted.org/packages/ab/a7/0ddaf514ce8a8714f6ed243a2b391b41dbb65251affe21ee3077ec45ea9a/scipy-1.15.3-cp311-cp311-win_amd64.whl", hash = "sha256:ae48a786a28412d744c62fd7816a4118ef97e5be0bee968ce8f0a2fba7acf3bb", size = 41246504, upload-time = "2025-05-08T16:06:00.437Z" },
+ { url = "https://files.pythonhosted.org/packages/37/4b/683aa044c4162e10ed7a7ea30527f2cbd92e6999c10a8ed8edb253836e9c/scipy-1.15.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:6ac6310fdbfb7aa6612408bd2f07295bcbd3fda00d2d702178434751fe48e019", size = 38766735, upload-time = "2025-05-08T16:06:06.471Z" },
+ { url = "https://files.pythonhosted.org/packages/7b/7e/f30be3d03de07f25dc0ec926d1681fed5c732d759ac8f51079708c79e680/scipy-1.15.3-cp312-cp312-macosx_12_0_arm64.whl", hash = "sha256:185cd3d6d05ca4b44a8f1595af87f9c372bb6acf9c808e99aa3e9aa03bd98cf6", size = 30173284, upload-time = "2025-05-08T16:06:11.686Z" },
+ { url = "https://files.pythonhosted.org/packages/07/9c/0ddb0d0abdabe0d181c1793db51f02cd59e4901da6f9f7848e1f96759f0d/scipy-1.15.3-cp312-cp312-macosx_14_0_arm64.whl", hash = "sha256:05dc6abcd105e1a29f95eada46d4a3f251743cfd7d3ae8ddb4088047f24ea477", size = 22446958, upload-time = "2025-05-08T16:06:15.97Z" },
+ { url = "https://files.pythonhosted.org/packages/af/43/0bce905a965f36c58ff80d8bea33f1f9351b05fad4beaad4eae34699b7a1/scipy-1.15.3-cp312-cp312-macosx_14_0_x86_64.whl", hash = "sha256:06efcba926324df1696931a57a176c80848ccd67ce6ad020c810736bfd58eb1c", size = 25242454, upload-time = "2025-05-08T16:06:20.394Z" },
+ { url = "https://files.pythonhosted.org/packages/56/30/a6f08f84ee5b7b28b4c597aca4cbe545535c39fe911845a96414700b64ba/scipy-1.15.3-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c05045d8b9bfd807ee1b9f38761993297b10b245f012b11b13b91ba8945f7e45", size = 35210199, upload-time = "2025-05-08T16:06:26.159Z" },
+ { url = "https://files.pythonhosted.org/packages/0b/1f/03f52c282437a168ee2c7c14a1a0d0781a9a4a8962d84ac05c06b4c5b555/scipy-1.15.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:271e3713e645149ea5ea3e97b57fdab61ce61333f97cfae392c28ba786f9bb49", size = 37309455, upload-time = "2025-05-08T16:06:32.778Z" },
+ { url = "https://files.pythonhosted.org/packages/89/b1/fbb53137f42c4bf630b1ffdfc2151a62d1d1b903b249f030d2b1c0280af8/scipy-1.15.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:6cfd56fc1a8e53f6e89ba3a7a7251f7396412d655bca2aa5611c8ec9a6784a1e", size = 36885140, upload-time = "2025-05-08T16:06:39.249Z" },
+ { url = "https://files.pythonhosted.org/packages/2e/2e/025e39e339f5090df1ff266d021892694dbb7e63568edcfe43f892fa381d/scipy-1.15.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:0ff17c0bb1cb32952c09217d8d1eed9b53d1463e5f1dd6052c7857f83127d539", size = 39710549, upload-time = "2025-05-08T16:06:45.729Z" },
+ { url = "https://files.pythonhosted.org/packages/e6/eb/3bf6ea8ab7f1503dca3a10df2e4b9c3f6b3316df07f6c0ded94b281c7101/scipy-1.15.3-cp312-cp312-win_amd64.whl", hash = "sha256:52092bc0472cfd17df49ff17e70624345efece4e1a12b23783a1ac59a1b728ed", size = 40966184, upload-time = "2025-05-08T16:06:52.623Z" },
+ { url = "https://files.pythonhosted.org/packages/73/18/ec27848c9baae6e0d6573eda6e01a602e5649ee72c27c3a8aad673ebecfd/scipy-1.15.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:2c620736bcc334782e24d173c0fdbb7590a0a436d2fdf39310a8902505008759", size = 38728256, upload-time = "2025-05-08T16:06:58.696Z" },
+ { url = "https://files.pythonhosted.org/packages/74/cd/1aef2184948728b4b6e21267d53b3339762c285a46a274ebb7863c9e4742/scipy-1.15.3-cp313-cp313-macosx_12_0_arm64.whl", hash = "sha256:7e11270a000969409d37ed399585ee530b9ef6aa99d50c019de4cb01e8e54e62", size = 30109540, upload-time = "2025-05-08T16:07:04.209Z" },
+ { url = "https://files.pythonhosted.org/packages/5b/d8/59e452c0a255ec352bd0a833537a3bc1bfb679944c4938ab375b0a6b3a3e/scipy-1.15.3-cp313-cp313-macosx_14_0_arm64.whl", hash = "sha256:8c9ed3ba2c8a2ce098163a9bdb26f891746d02136995df25227a20e71c396ebb", size = 22383115, upload-time = "2025-05-08T16:07:08.998Z" },
+ { url = "https://files.pythonhosted.org/packages/08/f5/456f56bbbfccf696263b47095291040655e3cbaf05d063bdc7c7517f32ac/scipy-1.15.3-cp313-cp313-macosx_14_0_x86_64.whl", hash = "sha256:0bdd905264c0c9cfa74a4772cdb2070171790381a5c4d312c973382fc6eaf730", size = 25163884, upload-time = "2025-05-08T16:07:14.091Z" },
+ { url = "https://files.pythonhosted.org/packages/a2/66/a9618b6a435a0f0c0b8a6d0a2efb32d4ec5a85f023c2b79d39512040355b/scipy-1.15.3-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:79167bba085c31f38603e11a267d862957cbb3ce018d8b38f79ac043bc92d825", size = 35174018, upload-time = "2025-05-08T16:07:19.427Z" },
+ { url = "https://files.pythonhosted.org/packages/b5/09/c5b6734a50ad4882432b6bb7c02baf757f5b2f256041da5df242e2d7e6b6/scipy-1.15.3-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c9deabd6d547aee2c9a81dee6cc96c6d7e9a9b1953f74850c179f91fdc729cb7", size = 37269716, upload-time = "2025-05-08T16:07:25.712Z" },
+ { url = "https://files.pythonhosted.org/packages/77/0a/eac00ff741f23bcabd352731ed9b8995a0a60ef57f5fd788d611d43d69a1/scipy-1.15.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:dde4fc32993071ac0c7dd2d82569e544f0bdaff66269cb475e0f369adad13f11", size = 36872342, upload-time = "2025-05-08T16:07:31.468Z" },
+ { url = "https://files.pythonhosted.org/packages/fe/54/4379be86dd74b6ad81551689107360d9a3e18f24d20767a2d5b9253a3f0a/scipy-1.15.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:f77f853d584e72e874d87357ad70f44b437331507d1c311457bed8ed2b956126", size = 39670869, upload-time = "2025-05-08T16:07:38.002Z" },
+ { url = "https://files.pythonhosted.org/packages/87/2e/892ad2862ba54f084ffe8cc4a22667eaf9c2bcec6d2bff1d15713c6c0703/scipy-1.15.3-cp313-cp313-win_amd64.whl", hash = "sha256:b90ab29d0c37ec9bf55424c064312930ca5f4bde15ee8619ee44e69319aab163", size = 40988851, upload-time = "2025-05-08T16:08:33.671Z" },
+ { url = "https://files.pythonhosted.org/packages/1b/e9/7a879c137f7e55b30d75d90ce3eb468197646bc7b443ac036ae3fe109055/scipy-1.15.3-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:3ac07623267feb3ae308487c260ac684b32ea35fd81e12845039952f558047b8", size = 38863011, upload-time = "2025-05-08T16:07:44.039Z" },
+ { url = "https://files.pythonhosted.org/packages/51/d1/226a806bbd69f62ce5ef5f3ffadc35286e9fbc802f606a07eb83bf2359de/scipy-1.15.3-cp313-cp313t-macosx_12_0_arm64.whl", hash = "sha256:6487aa99c2a3d509a5227d9a5e889ff05830a06b2ce08ec30df6d79db5fcd5c5", size = 30266407, upload-time = "2025-05-08T16:07:49.891Z" },
+ { url = "https://files.pythonhosted.org/packages/e5/9b/f32d1d6093ab9eeabbd839b0f7619c62e46cc4b7b6dbf05b6e615bbd4400/scipy-1.15.3-cp313-cp313t-macosx_14_0_arm64.whl", hash = "sha256:50f9e62461c95d933d5c5ef4a1f2ebf9a2b4e83b0db374cb3f1de104d935922e", size = 22540030, upload-time = "2025-05-08T16:07:54.121Z" },
+ { url = "https://files.pythonhosted.org/packages/e7/29/c278f699b095c1a884f29fda126340fcc201461ee8bfea5c8bdb1c7c958b/scipy-1.15.3-cp313-cp313t-macosx_14_0_x86_64.whl", hash = "sha256:14ed70039d182f411ffc74789a16df3835e05dc469b898233a245cdfd7f162cb", size = 25218709, upload-time = "2025-05-08T16:07:58.506Z" },
+ { url = "https://files.pythonhosted.org/packages/24/18/9e5374b617aba742a990581373cd6b68a2945d65cc588482749ef2e64467/scipy-1.15.3-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0a769105537aa07a69468a0eefcd121be52006db61cdd8cac8a0e68980bbb723", size = 34809045, upload-time = "2025-05-08T16:08:03.929Z" },
+ { url = "https://files.pythonhosted.org/packages/e1/fe/9c4361e7ba2927074360856db6135ef4904d505e9b3afbbcb073c4008328/scipy-1.15.3-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9db984639887e3dffb3928d118145ffe40eff2fa40cb241a306ec57c219ebbbb", size = 36703062, upload-time = "2025-05-08T16:08:09.558Z" },
+ { url = "https://files.pythonhosted.org/packages/b7/8e/038ccfe29d272b30086b25a4960f757f97122cb2ec42e62b460d02fe98e9/scipy-1.15.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:40e54d5c7e7ebf1aa596c374c49fa3135f04648a0caabcb66c52884b943f02b4", size = 36393132, upload-time = "2025-05-08T16:08:15.34Z" },
+ { url = "https://files.pythonhosted.org/packages/10/7e/5c12285452970be5bdbe8352c619250b97ebf7917d7a9a9e96b8a8140f17/scipy-1.15.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:5e721fed53187e71d0ccf382b6bf977644c533e506c4d33c3fb24de89f5c3ed5", size = 38979503, upload-time = "2025-05-08T16:08:21.513Z" },
+ { url = "https://files.pythonhosted.org/packages/81/06/0a5e5349474e1cbc5757975b21bd4fad0e72ebf138c5592f191646154e06/scipy-1.15.3-cp313-cp313t-win_amd64.whl", hash = "sha256:76ad1fb5f8752eabf0fa02e4cc0336b4e8f021e2d5f061ed37d6d264db35e3ca", size = 40308097, upload-time = "2025-05-08T16:08:27.627Z" },
+]
+
+[[package]]
+name = "scipy"
+version = "1.16.3"
+source = { registry = "https://pypi.org/simple" }
+resolution-markers = [
+ "python_full_version >= '3.12'",
+ "python_full_version == '3.11.*'",
+]
+dependencies = [
+ { name = "numpy", version = "2.3.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.11'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/0a/ca/d8ace4f98322d01abcd52d381134344bf7b431eba7ed8b42bdea5a3c2ac9/scipy-1.16.3.tar.gz", hash = "sha256:01e87659402762f43bd2fee13370553a17ada367d42e7487800bf2916535aecb", size = 30597883, upload-time = "2025-10-28T17:38:54.068Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/9b/5f/6f37d7439de1455ce9c5a556b8d1db0979f03a796c030bafdf08d35b7bf9/scipy-1.16.3-cp311-cp311-macosx_10_14_x86_64.whl", hash = "sha256:40be6cf99e68b6c4321e9f8782e7d5ff8265af28ef2cd56e9c9b2638fa08ad97", size = 36630881, upload-time = "2025-10-28T17:31:47.104Z" },
+ { url = "https://files.pythonhosted.org/packages/7c/89/d70e9f628749b7e4db2aa4cd89735502ff3f08f7b9b27d2e799485987cd9/scipy-1.16.3-cp311-cp311-macosx_12_0_arm64.whl", hash = "sha256:8be1ca9170fcb6223cc7c27f4305d680ded114a1567c0bd2bfcbf947d1b17511", size = 28941012, upload-time = "2025-10-28T17:31:53.411Z" },
+ { url = "https://files.pythonhosted.org/packages/a8/a8/0e7a9a6872a923505dbdf6bb93451edcac120363131c19013044a1e7cb0c/scipy-1.16.3-cp311-cp311-macosx_14_0_arm64.whl", hash = "sha256:bea0a62734d20d67608660f69dcda23e7f90fb4ca20974ab80b6ed40df87a005", size = 20931935, upload-time = "2025-10-28T17:31:57.361Z" },
+ { url = "https://files.pythonhosted.org/packages/bd/c7/020fb72bd79ad798e4dbe53938543ecb96b3a9ac3fe274b7189e23e27353/scipy-1.16.3-cp311-cp311-macosx_14_0_x86_64.whl", hash = "sha256:2a207a6ce9c24f1951241f4693ede2d393f59c07abc159b2cb2be980820e01fb", size = 23534466, upload-time = "2025-10-28T17:32:01.875Z" },
+ { url = "https://files.pythonhosted.org/packages/be/a0/668c4609ce6dbf2f948e167836ccaf897f95fb63fa231c87da7558a374cd/scipy-1.16.3-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:532fb5ad6a87e9e9cd9c959b106b73145a03f04c7d57ea3e6f6bb60b86ab0876", size = 33593618, upload-time = "2025-10-28T17:32:06.902Z" },
+ { url = "https://files.pythonhosted.org/packages/ca/6e/8942461cf2636cdae083e3eb72622a7fbbfa5cf559c7d13ab250a5dbdc01/scipy-1.16.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:0151a0749efeaaab78711c78422d413c583b8cdd2011a3c1d6c794938ee9fdb2", size = 35899798, upload-time = "2025-10-28T17:32:12.665Z" },
+ { url = "https://files.pythonhosted.org/packages/79/e8/d0f33590364cdbd67f28ce79368b373889faa4ee959588beddf6daef9abe/scipy-1.16.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:b7180967113560cca57418a7bc719e30366b47959dd845a93206fbed693c867e", size = 36226154, upload-time = "2025-10-28T17:32:17.961Z" },
+ { url = "https://files.pythonhosted.org/packages/39/c1/1903de608c0c924a1749c590064e65810f8046e437aba6be365abc4f7557/scipy-1.16.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:deb3841c925eeddb6afc1e4e4a45e418d19ec7b87c5df177695224078e8ec733", size = 38878540, upload-time = "2025-10-28T17:32:23.907Z" },
+ { url = "https://files.pythonhosted.org/packages/f1/d0/22ec7036ba0b0a35bccb7f25ab407382ed34af0b111475eb301c16f8a2e5/scipy-1.16.3-cp311-cp311-win_amd64.whl", hash = "sha256:53c3844d527213631e886621df5695d35e4f6a75f620dca412bcd292f6b87d78", size = 38722107, upload-time = "2025-10-28T17:32:29.921Z" },
+ { url = "https://files.pythonhosted.org/packages/7b/60/8a00e5a524bb3bf8898db1650d350f50e6cffb9d7a491c561dc9826c7515/scipy-1.16.3-cp311-cp311-win_arm64.whl", hash = "sha256:9452781bd879b14b6f055b26643703551320aa8d79ae064a71df55c00286a184", size = 25506272, upload-time = "2025-10-28T17:32:34.577Z" },
+ { url = "https://files.pythonhosted.org/packages/40/41/5bf55c3f386b1643812f3a5674edf74b26184378ef0f3e7c7a09a7e2ca7f/scipy-1.16.3-cp312-cp312-macosx_10_14_x86_64.whl", hash = "sha256:81fc5827606858cf71446a5e98715ba0e11f0dbc83d71c7409d05486592a45d6", size = 36659043, upload-time = "2025-10-28T17:32:40.285Z" },
+ { url = "https://files.pythonhosted.org/packages/1e/0f/65582071948cfc45d43e9870bf7ca5f0e0684e165d7c9ef4e50d783073eb/scipy-1.16.3-cp312-cp312-macosx_12_0_arm64.whl", hash = "sha256:c97176013d404c7346bf57874eaac5187d969293bf40497140b0a2b2b7482e07", size = 28898986, upload-time = "2025-10-28T17:32:45.325Z" },
+ { url = "https://files.pythonhosted.org/packages/96/5e/36bf3f0ac298187d1ceadde9051177d6a4fe4d507e8f59067dc9dd39e650/scipy-1.16.3-cp312-cp312-macosx_14_0_arm64.whl", hash = "sha256:2b71d93c8a9936046866acebc915e2af2e292b883ed6e2cbe5c34beb094b82d9", size = 20889814, upload-time = "2025-10-28T17:32:49.277Z" },
+ { url = "https://files.pythonhosted.org/packages/80/35/178d9d0c35394d5d5211bbff7ac4f2986c5488b59506fef9e1de13ea28d3/scipy-1.16.3-cp312-cp312-macosx_14_0_x86_64.whl", hash = "sha256:3d4a07a8e785d80289dfe66b7c27d8634a773020742ec7187b85ccc4b0e7b686", size = 23565795, upload-time = "2025-10-28T17:32:53.337Z" },
+ { url = "https://files.pythonhosted.org/packages/fa/46/d1146ff536d034d02f83c8afc3c4bab2eddb634624d6529a8512f3afc9da/scipy-1.16.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:0553371015692a898e1aa858fed67a3576c34edefa6b7ebdb4e9dde49ce5c203", size = 33349476, upload-time = "2025-10-28T17:32:58.353Z" },
+ { url = "https://files.pythonhosted.org/packages/79/2e/415119c9ab3e62249e18c2b082c07aff907a273741b3f8160414b0e9193c/scipy-1.16.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:72d1717fd3b5e6ec747327ce9bda32d5463f472c9dce9f54499e81fbd50245a1", size = 35676692, upload-time = "2025-10-28T17:33:03.88Z" },
+ { url = "https://files.pythonhosted.org/packages/27/82/df26e44da78bf8d2aeaf7566082260cfa15955a5a6e96e6a29935b64132f/scipy-1.16.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:1fb2472e72e24d1530debe6ae078db70fb1605350c88a3d14bc401d6306dbffe", size = 36019345, upload-time = "2025-10-28T17:33:09.773Z" },
+ { url = "https://files.pythonhosted.org/packages/82/31/006cbb4b648ba379a95c87262c2855cd0d09453e500937f78b30f02fa1cd/scipy-1.16.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:c5192722cffe15f9329a3948c4b1db789fbb1f05c97899187dcf009b283aea70", size = 38678975, upload-time = "2025-10-28T17:33:15.809Z" },
+ { url = "https://files.pythonhosted.org/packages/c2/7f/acbd28c97e990b421af7d6d6cd416358c9c293fc958b8529e0bd5d2a2a19/scipy-1.16.3-cp312-cp312-win_amd64.whl", hash = "sha256:56edc65510d1331dae01ef9b658d428e33ed48b4f77b1d51caf479a0253f96dc", size = 38555926, upload-time = "2025-10-28T17:33:21.388Z" },
+ { url = "https://files.pythonhosted.org/packages/ce/69/c5c7807fd007dad4f48e0a5f2153038dc96e8725d3345b9ee31b2b7bed46/scipy-1.16.3-cp312-cp312-win_arm64.whl", hash = "sha256:a8a26c78ef223d3e30920ef759e25625a0ecdd0d60e5a8818b7513c3e5384cf2", size = 25463014, upload-time = "2025-10-28T17:33:25.975Z" },
+ { url = "https://files.pythonhosted.org/packages/72/f1/57e8327ab1508272029e27eeef34f2302ffc156b69e7e233e906c2a5c379/scipy-1.16.3-cp313-cp313-macosx_10_14_x86_64.whl", hash = "sha256:d2ec56337675e61b312179a1ad124f5f570c00f920cc75e1000025451b88241c", size = 36617856, upload-time = "2025-10-28T17:33:31.375Z" },
+ { url = "https://files.pythonhosted.org/packages/44/13/7e63cfba8a7452eb756306aa2fd9b37a29a323b672b964b4fdeded9a3f21/scipy-1.16.3-cp313-cp313-macosx_12_0_arm64.whl", hash = "sha256:16b8bc35a4cc24db80a0ec836a9286d0e31b2503cb2fd7ff7fb0e0374a97081d", size = 28874306, upload-time = "2025-10-28T17:33:36.516Z" },
+ { url = "https://files.pythonhosted.org/packages/15/65/3a9400efd0228a176e6ec3454b1fa998fbbb5a8defa1672c3f65706987db/scipy-1.16.3-cp313-cp313-macosx_14_0_arm64.whl", hash = "sha256:5803c5fadd29de0cf27fa08ccbfe7a9e5d741bf63e4ab1085437266f12460ff9", size = 20865371, upload-time = "2025-10-28T17:33:42.094Z" },
+ { url = "https://files.pythonhosted.org/packages/33/d7/eda09adf009a9fb81827194d4dd02d2e4bc752cef16737cc4ef065234031/scipy-1.16.3-cp313-cp313-macosx_14_0_x86_64.whl", hash = "sha256:b81c27fc41954319a943d43b20e07c40bdcd3ff7cf013f4fb86286faefe546c4", size = 23524877, upload-time = "2025-10-28T17:33:48.483Z" },
+ { url = "https://files.pythonhosted.org/packages/7d/6b/3f911e1ebc364cb81320223a3422aab7d26c9c7973109a9cd0f27c64c6c0/scipy-1.16.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:0c3b4dd3d9b08dbce0f3440032c52e9e2ab9f96ade2d3943313dfe51a7056959", size = 33342103, upload-time = "2025-10-28T17:33:56.495Z" },
+ { url = "https://files.pythonhosted.org/packages/21/f6/4bfb5695d8941e5c570a04d9fcd0d36bce7511b7d78e6e75c8f9791f82d0/scipy-1.16.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:7dc1360c06535ea6116a2220f760ae572db9f661aba2d88074fe30ec2aa1ff88", size = 35697297, upload-time = "2025-10-28T17:34:04.722Z" },
+ { url = "https://files.pythonhosted.org/packages/04/e1/6496dadbc80d8d896ff72511ecfe2316b50313bfc3ebf07a3f580f08bd8c/scipy-1.16.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:663b8d66a8748051c3ee9c96465fb417509315b99c71550fda2591d7dd634234", size = 36021756, upload-time = "2025-10-28T17:34:13.482Z" },
+ { url = "https://files.pythonhosted.org/packages/fe/bd/a8c7799e0136b987bda3e1b23d155bcb31aec68a4a472554df5f0937eef7/scipy-1.16.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:eab43fae33a0c39006a88096cd7b4f4ef545ea0447d250d5ac18202d40b6611d", size = 38696566, upload-time = "2025-10-28T17:34:22.384Z" },
+ { url = "https://files.pythonhosted.org/packages/cd/01/1204382461fcbfeb05b6161b594f4007e78b6eba9b375382f79153172b4d/scipy-1.16.3-cp313-cp313-win_amd64.whl", hash = "sha256:062246acacbe9f8210de8e751b16fc37458213f124bef161a5a02c7a39284304", size = 38529877, upload-time = "2025-10-28T17:35:51.076Z" },
+ { url = "https://files.pythonhosted.org/packages/7f/14/9d9fbcaa1260a94f4bb5b64ba9213ceb5d03cd88841fe9fd1ffd47a45b73/scipy-1.16.3-cp313-cp313-win_arm64.whl", hash = "sha256:50a3dbf286dbc7d84f176f9a1574c705f277cb6565069f88f60db9eafdbe3ee2", size = 25455366, upload-time = "2025-10-28T17:35:59.014Z" },
+ { url = "https://files.pythonhosted.org/packages/e2/a3/9ec205bd49f42d45d77f1730dbad9ccf146244c1647605cf834b3a8c4f36/scipy-1.16.3-cp313-cp313t-macosx_10_14_x86_64.whl", hash = "sha256:fb4b29f4cf8cc5a8d628bc8d8e26d12d7278cd1f219f22698a378c3d67db5e4b", size = 37027931, upload-time = "2025-10-28T17:34:31.451Z" },
+ { url = "https://files.pythonhosted.org/packages/25/06/ca9fd1f3a4589cbd825b1447e5db3a8ebb969c1eaf22c8579bd286f51b6d/scipy-1.16.3-cp313-cp313t-macosx_12_0_arm64.whl", hash = "sha256:8d09d72dc92742988b0e7750bddb8060b0c7079606c0d24a8cc8e9c9c11f9079", size = 29400081, upload-time = "2025-10-28T17:34:39.087Z" },
+ { url = "https://files.pythonhosted.org/packages/6a/56/933e68210d92657d93fb0e381683bc0e53a965048d7358ff5fbf9e6a1b17/scipy-1.16.3-cp313-cp313t-macosx_14_0_arm64.whl", hash = "sha256:03192a35e661470197556de24e7cb1330d84b35b94ead65c46ad6f16f6b28f2a", size = 21391244, upload-time = "2025-10-28T17:34:45.234Z" },
+ { url = "https://files.pythonhosted.org/packages/a8/7e/779845db03dc1418e215726329674b40576879b91814568757ff0014ad65/scipy-1.16.3-cp313-cp313t-macosx_14_0_x86_64.whl", hash = "sha256:57d01cb6f85e34f0946b33caa66e892aae072b64b034183f3d87c4025802a119", size = 23929753, upload-time = "2025-10-28T17:34:51.793Z" },
+ { url = "https://files.pythonhosted.org/packages/4c/4b/f756cf8161d5365dcdef9e5f460ab226c068211030a175d2fc7f3f41ca64/scipy-1.16.3-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:96491a6a54e995f00a28a3c3badfff58fd093bf26cd5fb34a2188c8c756a3a2c", size = 33496912, upload-time = "2025-10-28T17:34:59.8Z" },
+ { url = "https://files.pythonhosted.org/packages/09/b5/222b1e49a58668f23839ca1542a6322bb095ab8d6590d4f71723869a6c2c/scipy-1.16.3-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:cd13e354df9938598af2be05822c323e97132d5e6306b83a3b4ee6724c6e522e", size = 35802371, upload-time = "2025-10-28T17:35:08.173Z" },
+ { url = "https://files.pythonhosted.org/packages/c1/8d/5964ef68bb31829bde27611f8c9deeac13764589fe74a75390242b64ca44/scipy-1.16.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:63d3cdacb8a824a295191a723ee5e4ea7768ca5ca5f2838532d9f2e2b3ce2135", size = 36190477, upload-time = "2025-10-28T17:35:16.7Z" },
+ { url = "https://files.pythonhosted.org/packages/ab/f2/b31d75cb9b5fa4dd39a0a931ee9b33e7f6f36f23be5ef560bf72e0f92f32/scipy-1.16.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:e7efa2681ea410b10dde31a52b18b0154d66f2485328830e45fdf183af5aefc6", size = 38796678, upload-time = "2025-10-28T17:35:26.354Z" },
+ { url = "https://files.pythonhosted.org/packages/b4/1e/b3723d8ff64ab548c38d87055483714fefe6ee20e0189b62352b5e015bb1/scipy-1.16.3-cp313-cp313t-win_amd64.whl", hash = "sha256:2d1ae2cf0c350e7705168ff2429962a89ad90c2d49d1dd300686d8b2a5af22fc", size = 38640178, upload-time = "2025-10-28T17:35:35.304Z" },
+ { url = "https://files.pythonhosted.org/packages/8e/f3/d854ff38789aca9b0cc23008d607ced9de4f7ab14fa1ca4329f86b3758ca/scipy-1.16.3-cp313-cp313t-win_arm64.whl", hash = "sha256:0c623a54f7b79dd88ef56da19bc2873afec9673a48f3b85b18e4d402bdd29a5a", size = 25803246, upload-time = "2025-10-28T17:35:42.155Z" },
+ { url = "https://files.pythonhosted.org/packages/99/f6/99b10fd70f2d864c1e29a28bbcaa0c6340f9d8518396542d9ea3b4aaae15/scipy-1.16.3-cp314-cp314-macosx_10_14_x86_64.whl", hash = "sha256:875555ce62743e1d54f06cdf22c1e0bc47b91130ac40fe5d783b6dfa114beeb6", size = 36606469, upload-time = "2025-10-28T17:36:08.741Z" },
+ { url = "https://files.pythonhosted.org/packages/4d/74/043b54f2319f48ea940dd025779fa28ee360e6b95acb7cd188fad4391c6b/scipy-1.16.3-cp314-cp314-macosx_12_0_arm64.whl", hash = "sha256:bb61878c18a470021fb515a843dc7a76961a8daceaaaa8bad1332f1bf4b54657", size = 28872043, upload-time = "2025-10-28T17:36:16.599Z" },
+ { url = "https://files.pythonhosted.org/packages/4d/e1/24b7e50cc1c4ee6ffbcb1f27fe9f4c8b40e7911675f6d2d20955f41c6348/scipy-1.16.3-cp314-cp314-macosx_14_0_arm64.whl", hash = "sha256:f2622206f5559784fa5c4b53a950c3c7c1cf3e84ca1b9c4b6c03f062f289ca26", size = 20862952, upload-time = "2025-10-28T17:36:22.966Z" },
+ { url = "https://files.pythonhosted.org/packages/dd/3a/3e8c01a4d742b730df368e063787c6808597ccb38636ed821d10b39ca51b/scipy-1.16.3-cp314-cp314-macosx_14_0_x86_64.whl", hash = "sha256:7f68154688c515cdb541a31ef8eb66d8cd1050605be9dcd74199cbd22ac739bc", size = 23508512, upload-time = "2025-10-28T17:36:29.731Z" },
+ { url = "https://files.pythonhosted.org/packages/1f/60/c45a12b98ad591536bfe5330cb3cfe1850d7570259303563b1721564d458/scipy-1.16.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:8b3c820ddb80029fe9f43d61b81d8b488d3ef8ca010d15122b152db77dc94c22", size = 33413639, upload-time = "2025-10-28T17:36:37.982Z" },
+ { url = "https://files.pythonhosted.org/packages/71/bc/35957d88645476307e4839712642896689df442f3e53b0fa016ecf8a3357/scipy-1.16.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:d3837938ae715fc0fe3c39c0202de3a8853aff22ca66781ddc2ade7554b7e2cc", size = 35704729, upload-time = "2025-10-28T17:36:46.547Z" },
+ { url = "https://files.pythonhosted.org/packages/3b/15/89105e659041b1ca11c386e9995aefacd513a78493656e57789f9d9eab61/scipy-1.16.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:aadd23f98f9cb069b3bd64ddc900c4d277778242e961751f77a8cb5c4b946fb0", size = 36086251, upload-time = "2025-10-28T17:36:55.161Z" },
+ { url = "https://files.pythonhosted.org/packages/1a/87/c0ea673ac9c6cc50b3da2196d860273bc7389aa69b64efa8493bdd25b093/scipy-1.16.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:b7c5f1bda1354d6a19bc6af73a649f8285ca63ac6b52e64e658a5a11d4d69800", size = 38716681, upload-time = "2025-10-28T17:37:04.1Z" },
+ { url = "https://files.pythonhosted.org/packages/91/06/837893227b043fb9b0d13e4bd7586982d8136cb249ffb3492930dab905b8/scipy-1.16.3-cp314-cp314-win_amd64.whl", hash = "sha256:e5d42a9472e7579e473879a1990327830493a7047506d58d73fc429b84c1d49d", size = 39358423, upload-time = "2025-10-28T17:38:20.005Z" },
+ { url = "https://files.pythonhosted.org/packages/95/03/28bce0355e4d34a7c034727505a02d19548549e190bedd13a721e35380b7/scipy-1.16.3-cp314-cp314-win_arm64.whl", hash = "sha256:6020470b9d00245926f2d5bb93b119ca0340f0d564eb6fbaad843eaebf9d690f", size = 26135027, upload-time = "2025-10-28T17:38:24.966Z" },
+ { url = "https://files.pythonhosted.org/packages/b2/6f/69f1e2b682efe9de8fe9f91040f0cd32f13cfccba690512ba4c582b0bc29/scipy-1.16.3-cp314-cp314t-macosx_10_14_x86_64.whl", hash = "sha256:e1d27cbcb4602680a49d787d90664fa4974063ac9d4134813332a8c53dbe667c", size = 37028379, upload-time = "2025-10-28T17:37:14.061Z" },
+ { url = "https://files.pythonhosted.org/packages/7c/2d/e826f31624a5ebbab1cd93d30fd74349914753076ed0593e1d56a98c4fb4/scipy-1.16.3-cp314-cp314t-macosx_12_0_arm64.whl", hash = "sha256:9b9c9c07b6d56a35777a1b4cc8966118fb16cfd8daf6743867d17d36cfad2d40", size = 29400052, upload-time = "2025-10-28T17:37:21.709Z" },
+ { url = "https://files.pythonhosted.org/packages/69/27/d24feb80155f41fd1f156bf144e7e049b4e2b9dd06261a242905e3bc7a03/scipy-1.16.3-cp314-cp314t-macosx_14_0_arm64.whl", hash = "sha256:3a4c460301fb2cffb7f88528f30b3127742cff583603aa7dc964a52c463b385d", size = 21391183, upload-time = "2025-10-28T17:37:29.559Z" },
+ { url = "https://files.pythonhosted.org/packages/f8/d3/1b229e433074c5738a24277eca520a2319aac7465eea7310ea6ae0e98ae2/scipy-1.16.3-cp314-cp314t-macosx_14_0_x86_64.whl", hash = "sha256:f667a4542cc8917af1db06366d3f78a5c8e83badd56409f94d1eac8d8d9133fa", size = 23930174, upload-time = "2025-10-28T17:37:36.306Z" },
+ { url = "https://files.pythonhosted.org/packages/16/9d/d9e148b0ec680c0f042581a2be79a28a7ab66c0c4946697f9e7553ead337/scipy-1.16.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:f379b54b77a597aa7ee5e697df0d66903e41b9c85a6dd7946159e356319158e8", size = 33497852, upload-time = "2025-10-28T17:37:42.228Z" },
+ { url = "https://files.pythonhosted.org/packages/2f/22/4e5f7561e4f98b7bea63cf3fd7934bff1e3182e9f1626b089a679914d5c8/scipy-1.16.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:4aff59800a3b7f786b70bfd6ab551001cb553244988d7d6b8299cb1ea653b353", size = 35798595, upload-time = "2025-10-28T17:37:48.102Z" },
+ { url = "https://files.pythonhosted.org/packages/83/42/6644d714c179429fc7196857866f219fef25238319b650bb32dde7bf7a48/scipy-1.16.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:da7763f55885045036fabcebd80144b757d3db06ab0861415d1c3b7c69042146", size = 36186269, upload-time = "2025-10-28T17:37:53.72Z" },
+ { url = "https://files.pythonhosted.org/packages/ac/70/64b4d7ca92f9cf2e6fc6aaa2eecf80bb9b6b985043a9583f32f8177ea122/scipy-1.16.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:ffa6eea95283b2b8079b821dc11f50a17d0571c92b43e2b5b12764dc5f9b285d", size = 38802779, upload-time = "2025-10-28T17:37:59.393Z" },
+ { url = "https://files.pythonhosted.org/packages/61/82/8d0e39f62764cce5ffd5284131e109f07cf8955aef9ab8ed4e3aa5e30539/scipy-1.16.3-cp314-cp314t-win_amd64.whl", hash = "sha256:d9f48cafc7ce94cf9b15c6bffdc443a81a27bf7075cf2dcd5c8b40f85d10c4e7", size = 39471128, upload-time = "2025-10-28T17:38:05.259Z" },
+ { url = "https://files.pythonhosted.org/packages/64/47/a494741db7280eae6dc033510c319e34d42dd41b7ac0c7ead39354d1a2b5/scipy-1.16.3-cp314-cp314t-win_arm64.whl", hash = "sha256:21d9d6b197227a12dcbf9633320a4e34c6b0e51c57268df255a0942983bac562", size = 26464127, upload-time = "2025-10-28T17:38:11.34Z" },
+]
+
+[[package]]
+name = "seaborn"
+version = "0.13.2"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "matplotlib" },
+ { name = "numpy", version = "2.2.6", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.11'" },
+ { name = "numpy", version = "2.3.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.11'" },
+ { name = "pandas" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/86/59/a451d7420a77ab0b98f7affa3a1d78a313d2f7281a57afb1a34bae8ab412/seaborn-0.13.2.tar.gz", hash = "sha256:93e60a40988f4d65e9f4885df477e2fdaff6b73a9ded434c1ab356dd57eefff7", size = 1457696, upload-time = "2024-01-25T13:21:52.551Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/83/11/00d3c3dfc25ad54e731d91449895a79e4bf2384dc3ac01809010ba88f6d5/seaborn-0.13.2-py3-none-any.whl", hash = "sha256:636f8336facf092165e27924f223d3c62ca560b1f2bb5dff7ab7fad265361987", size = 294914, upload-time = "2024-01-25T13:21:49.598Z" },
+]
+
+[[package]]
+name = "sentence-transformers"
+version = "4.1.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "huggingface-hub" },
+ { name = "pillow" },
+ { name = "scikit-learn" },
+ { name = "scipy", version = "1.15.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.11'" },
+ { name = "scipy", version = "1.16.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.11'" },
+ { name = "torch" },
+ { name = "tqdm" },
+ { name = "transformers" },
+ { name = "typing-extensions" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/73/84/b30d1b29ff58cfdff423e36a50efd622c8e31d7039b1a0d5e72066620da1/sentence_transformers-4.1.0.tar.gz", hash = "sha256:f125ffd1c727533e0eca5d4567de72f84728de8f7482834de442fd90c2c3d50b", size = 272420, upload-time = "2025-04-15T13:46:13.732Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/45/2d/1151b371f28caae565ad384fdc38198f1165571870217aedda230b9d7497/sentence_transformers-4.1.0-py3-none-any.whl", hash = "sha256:382a7f6be1244a100ce40495fb7523dbe8d71b3c10b299f81e6b735092b3b8ca", size = 345695, upload-time = "2025-04-15T13:46:12.44Z" },
+]
+
+[[package]]
+name = "setuptools"
+version = "80.9.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/18/5d/3bf57dcd21979b887f014ea83c24ae194cfcd12b9e0fda66b957c69d1fca/setuptools-80.9.0.tar.gz", hash = "sha256:f36b47402ecde768dbfafc46e8e4207b4360c654f1f3bb84475f0a28628fb19c", size = 1319958, upload-time = "2025-05-27T00:56:51.443Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/a3/dc/17031897dae0efacfea57dfd3a82fdd2a2aeb58e0ff71b77b87e44edc772/setuptools-80.9.0-py3-none-any.whl", hash = "sha256:062d34222ad13e0cc312a4c02d73f059e86a4acbfbdea8f8f76b28c99f306922", size = 1201486, upload-time = "2025-05-27T00:56:49.664Z" },
+]
+
+[[package]]
+name = "shellingham"
+version = "1.5.4"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/58/15/8b3609fd3830ef7b27b655beb4b4e9c62313a4e8da8c676e142cc210d58e/shellingham-1.5.4.tar.gz", hash = "sha256:8dbca0739d487e5bd35ab3ca4b36e11c4078f3a234bfce294b0a0291363404de", size = 10310, upload-time = "2023-10-24T04:13:40.426Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/e0/f9/0595336914c5619e5f28a1fb793285925a8cd4b432c9da0a987836c7f822/shellingham-1.5.4-py2.py3-none-any.whl", hash = "sha256:7ecfff8f2fd72616f7481040475a65b2bf8af90a56c89140852d1120324e8686", size = 9755, upload-time = "2023-10-24T04:13:38.866Z" },
+]
+
+[[package]]
+name = "six"
+version = "1.17.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/94/e7/b2c673351809dca68a0e064b6af791aa332cf192da575fd474ed7d6f16a2/six-1.17.0.tar.gz", hash = "sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81", size = 34031, upload-time = "2024-12-04T17:35:28.174Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/b7/ce/149a00dd41f10bc29e5921b496af8b574d8413afcd5e30dfa0ed46c2cc5e/six-1.17.0-py2.py3-none-any.whl", hash = "sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274", size = 11050, upload-time = "2024-12-04T17:35:26.475Z" },
+]
+
+[[package]]
+name = "smart-open"
+version = "7.4.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "wrapt" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/ed/1a/8de851644371c2c88ee5ff006d58355979247a02370494d63d091ef0cd01/smart_open-7.4.1.tar.gz", hash = "sha256:5c20f09026875e6dec708e9610e0cd13d24d91f0a2c12e6511b9e478a566b4a0", size = 53203, upload-time = "2025-10-21T16:28:40.075Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/04/ba/6dc8aea11f667b2a1455c466a42cdcd7a7efda93a1510dd085fa988c5144/smart_open-7.4.1-py3-none-any.whl", hash = "sha256:f52cb9bc897c7676dfc6996735332bd2465dfb048c73bfa9dfcdc829f48018cc", size = 63220, upload-time = "2025-10-21T16:28:38.494Z" },
+]
+
+[[package]]
+name = "sniffio"
+version = "1.3.1"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/a2/87/a6771e1546d97e7e041b6ae58d80074f81b7d5121207425c964ddf5cfdbd/sniffio-1.3.1.tar.gz", hash = "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc", size = 20372, upload-time = "2024-02-25T23:20:04.057Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/e9/44/75a9c9421471a6c4805dbf2356f7c181a29c1879239abab1ea2cc8f38b40/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2", size = 10235, upload-time = "2024-02-25T23:20:01.196Z" },
+]
+
+[[package]]
+name = "spacy"
+version = "3.8.7"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "catalogue" },
+ { name = "cymem" },
+ { name = "jinja2" },
+ { name = "langcodes" },
+ { name = "murmurhash" },
+ { name = "numpy", version = "2.2.6", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.11'" },
+ { name = "numpy", version = "2.3.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.11'" },
+ { name = "packaging" },
+ { name = "preshed" },
+ { name = "pydantic" },
+ { name = "requests" },
+ { name = "setuptools" },
+ { name = "spacy-legacy" },
+ { name = "spacy-loggers" },
+ { name = "srsly" },
+ { name = "thinc" },
+ { name = "tqdm" },
+ { name = "typer" },
+ { name = "wasabi" },
+ { name = "weasel" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/1e/9e/fb4e1cefe3fbd51ea6a243e5a3d2bc629baa9a28930bf4be6fe5672fa1ca/spacy-3.8.7.tar.gz", hash = "sha256:700fd174c6c552276be142c48e70bb53cae24c4dd86003c4432af9cb93e4c908", size = 1316143, upload-time = "2025-05-23T08:55:39.538Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/29/2c/bbba614290492c169ee50777e44d3e4325a1e646272379988de8749b9dd4/spacy-3.8.7-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:6ec0368ce96cd775fb14906f04b771c912ea8393ba30f8b35f9c4dc47a420b8e", size = 6613435, upload-time = "2025-05-23T08:54:03.964Z" },
+ { url = "https://files.pythonhosted.org/packages/39/a9/c1fdecc11d8855b3df601bbfb5fc4cdb98d79b6a5d166af974354ea658eb/spacy-3.8.7-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:5672f8a0fe7a3847e925544890be60015fbf48a60a838803425f82e849dd4f18", size = 6261550, upload-time = "2025-05-23T08:54:06.984Z" },
+ { url = "https://files.pythonhosted.org/packages/39/fe/e8b5a374f2517716f510f0dd6a0b68e88637e66db7c315d4002ba80b2bfe/spacy-3.8.7-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:60cde9fe8b15be04eb1e634c353d9c160187115d825b368cc1975452dd54f264", size = 31215973, upload-time = "2025-05-23T08:54:09.46Z" },
+ { url = "https://files.pythonhosted.org/packages/bb/e7/bd1df17add98a5ec3e0d2dd73d4e5884683ffd2e34d3c0e5828f48933787/spacy-3.8.7-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d9cac8e58fb92fb1c5e06328039595fa6589a9d1403681266f8f5e454d15319c", size = 31504596, upload-time = "2025-05-23T08:54:12.684Z" },
+ { url = "https://files.pythonhosted.org/packages/b2/fa/5fd95749f390478a31a806500e829c5a8d97312ea18129494d255e231c00/spacy-3.8.7-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:1456245a4ed04bc882db2d89a27ca1b6dc0b947b643bedaeaa5da11d9f7e22ec", size = 30527369, upload-time = "2025-05-23T08:54:15.467Z" },
+ { url = "https://files.pythonhosted.org/packages/7a/74/f4708260fc135f8de15eb1d0ecfe00fd7b53f4b1d4927f90a33d48dff637/spacy-3.8.7-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:bb98f85d467963d17c7c660884069ba948bde71c07280c91ee3235e554375308", size = 31357330, upload-time = "2025-05-23T08:54:18.342Z" },
+ { url = "https://files.pythonhosted.org/packages/53/a6/3086859d2bfb5b6f97b17e19f51da0983eb11b07f63c24dced6506cdb370/spacy-3.8.7-cp310-cp310-win_amd64.whl", hash = "sha256:b0df50d69e6691e97eae228733b321971607dbbb799e59d8470f2e70b8b27a8e", size = 14929267, upload-time = "2025-05-23T08:54:21.365Z" },
+ { url = "https://files.pythonhosted.org/packages/29/c5/5fbb3a4e694d4855a5bab87af9664377c48b89691f180ad3cde4faeaf35c/spacy-3.8.7-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:bdff8b9b556468a6dd527af17f0ddf9fb0b0bee92ee7703339ddf542361cff98", size = 6746140, upload-time = "2025-05-23T08:54:23.483Z" },
+ { url = "https://files.pythonhosted.org/packages/03/2a/43afac516eb82409ca47d7206f982beaf265d2ba06a72ca07cf06b290c20/spacy-3.8.7-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:9194b7cf015ed9b4450ffb162da49c8a9305e76b468de036b0948abdfc748a37", size = 6392440, upload-time = "2025-05-23T08:54:25.12Z" },
+ { url = "https://files.pythonhosted.org/packages/6f/83/2ea68c18e2b1b9a6f6b30ef63eb9d07e979626b9595acfdb5394f18923c4/spacy-3.8.7-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7dc38b78d48b9c2a80a3eea95f776304993f63fc307f07cdd104441442f92f1e", size = 32699126, upload-time = "2025-05-23T08:54:27.385Z" },
+ { url = "https://files.pythonhosted.org/packages/0a/0a/bb90e9aa0b3c527876627567d82517aabab08006ccf63796c33b0242254d/spacy-3.8.7-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2e43bd70772751b8fc7a14f338d087a3d297195d43d171832923ef66204b23ab", size = 33008865, upload-time = "2025-05-23T08:54:30.248Z" },
+ { url = "https://files.pythonhosted.org/packages/39/dd/8e906ba378457107ab0394976ea9f7b12fdb2cad682ef1a2ccf473d61e5f/spacy-3.8.7-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:c402bf5dcf345fd96d202378c54bc345219681e3531f911d99567d569328c45f", size = 31933169, upload-time = "2025-05-23T08:54:33.199Z" },
+ { url = "https://files.pythonhosted.org/packages/c9/b5/42df07eb837a923fbb42509864d5c7c2072d010de933dccdfb3c655b3a76/spacy-3.8.7-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:4234189861e486d86f1269e50542d87e8a6391a1ee190652479cf1a793db115f", size = 32776322, upload-time = "2025-05-23T08:54:36.891Z" },
+ { url = "https://files.pythonhosted.org/packages/92/e7/8176484801c67dcd814f141991fe0a3c9b5b4a3583ea30c2062e93d1aa6b/spacy-3.8.7-cp311-cp311-win_amd64.whl", hash = "sha256:e9d12e2eb7f36bc11dd9edae011032fe49ea100d63e83177290d3cbd80eaa650", size = 14938936, upload-time = "2025-05-23T08:54:40.322Z" },
+ { url = "https://files.pythonhosted.org/packages/a5/10/89852f40f926e0902c11c34454493ba0d15530b322711e754b89a6d7dfe6/spacy-3.8.7-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:88b397e37793cea51df298e6c651a763e49877a25bead5ba349761531a456687", size = 6265335, upload-time = "2025-05-23T08:54:42.876Z" },
+ { url = "https://files.pythonhosted.org/packages/16/fb/b5d54522969a632c06f4af354763467553b66d5bf0671ac39f3cceb3fd54/spacy-3.8.7-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:f70b676955fa6959347ca86ed6edd8ff0d6eb2ba20561fdfec76924bd3e540f9", size = 5906035, upload-time = "2025-05-23T08:54:44.824Z" },
+ { url = "https://files.pythonhosted.org/packages/3a/03/70f06753fd65081404ade30408535eb69f627a36ffce2107116d1aa16239/spacy-3.8.7-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6c4b5a624797ade30c25b5b69daa35a93ee24bcc56bd79b0884b2565f76f35d6", size = 33420084, upload-time = "2025-05-23T08:54:46.889Z" },
+ { url = "https://files.pythonhosted.org/packages/f9/19/b60e1ebf4985ee2b33d85705b89a5024942b65dad04dbdc3fb46f168b410/spacy-3.8.7-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d9d83e006df66decccefa3872fa958b3756228fb216d83783595444cf42ca10c", size = 33922188, upload-time = "2025-05-23T08:54:49.781Z" },
+ { url = "https://files.pythonhosted.org/packages/8f/a3/1fb1a49dc6d982d96fffc30c3a31bb431526008eea72ac3773f6518720a6/spacy-3.8.7-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:0dca25deba54f3eb5dcfbf63bf16e613e6c601da56f91c4a902d38533c098941", size = 31939285, upload-time = "2025-05-23T08:54:53.162Z" },
+ { url = "https://files.pythonhosted.org/packages/2d/55/6cf1aff8e5c01ee683e828f3ccd9282d2aff7ca1143a9349ee3d0c1291ff/spacy-3.8.7-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:5eef3f805a1c118d9b709a23e2d378f5f20da5a0d6258c9cfdc87c4cb234b4fc", size = 32988845, upload-time = "2025-05-23T08:54:57.776Z" },
+ { url = "https://files.pythonhosted.org/packages/8c/47/c17ee61b51aa8497d8af0999224b4b62485111a55ec105a06886685b2c68/spacy-3.8.7-cp312-cp312-win_amd64.whl", hash = "sha256:25d7a68e445200c9e9dc0044f8b7278ec0ef01ccc7cb5a95d1de2bd8e3ed6be2", size = 13918682, upload-time = "2025-05-23T08:55:00.387Z" },
+ { url = "https://files.pythonhosted.org/packages/2a/95/7125bea6d432c601478bf922f7a568762c8be425bbde5b66698260ab0358/spacy-3.8.7-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:dda7d57f42ec57c19fbef348095a9c82504e4777bca7b8db4b0d8318ba280fc7", size = 6235950, upload-time = "2025-05-23T08:55:02.92Z" },
+ { url = "https://files.pythonhosted.org/packages/96/c3/d2362846154d4d341136774831605df02d61f49ac637524a15f4f2794874/spacy-3.8.7-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:de0e0bddb810ed05bce44bcb91460eabe52bc56323da398d2ca74288a906da35", size = 5878106, upload-time = "2025-05-23T08:55:04.496Z" },
+ { url = "https://files.pythonhosted.org/packages/50/b6/b2943acfbfc4fc12642dac9feb571e712dd1569ab481db8f3daedee045fe/spacy-3.8.7-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5a2e58f92b684465777a7c1a65d5578b1dc36fe55c48d9964fb6d46cc9449768", size = 33085866, upload-time = "2025-05-23T08:55:06.65Z" },
+ { url = "https://files.pythonhosted.org/packages/65/98/c4415cbb217ac0b502dbb3372136015c699dd16a0c47cd6d338cd15f4bed/spacy-3.8.7-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:46330da2eb357d6979f40ea8fc16ee5776ee75cd0c70aac2a4ea10c80364b8f3", size = 33398424, upload-time = "2025-05-23T08:55:10.477Z" },
+ { url = "https://files.pythonhosted.org/packages/12/45/12a198858f1f11c21844876e039ba90df59d550527c72996d418c1faf78d/spacy-3.8.7-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:86b6a6ad23ca5440ef9d29c2b1e3125e28722c927db612ae99e564d49202861c", size = 31530066, upload-time = "2025-05-23T08:55:13.329Z" },
+ { url = "https://files.pythonhosted.org/packages/9c/df/80524f99822eb96c9649200042ec5912357eec100cf0cd678a2e9ef0ecb3/spacy-3.8.7-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:ccfe468cbb370888153df145ce3693af8e54dae551940df49057258081b2112f", size = 32613343, upload-time = "2025-05-23T08:55:16.711Z" },
+ { url = "https://files.pythonhosted.org/packages/02/99/881f6f24c279a5a70b8d69aaf8266fd411a0a58fd1c8848112aaa348f6f6/spacy-3.8.7-cp313-cp313-win_amd64.whl", hash = "sha256:ca81e416ff35209769e8b5dd5d13acc52e4f57dd9d028364bccbbe157c2ae86b", size = 13911250, upload-time = "2025-05-23T08:55:19.606Z" },
+]
+
+[[package]]
+name = "spacy-legacy"
+version = "3.0.12"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/d9/79/91f9d7cc8db5642acad830dcc4b49ba65a7790152832c4eceb305e46d681/spacy-legacy-3.0.12.tar.gz", hash = "sha256:b37d6e0c9b6e1d7ca1cf5bc7152ab64a4c4671f59c85adaf7a3fcb870357a774", size = 23806, upload-time = "2023-01-23T09:04:15.104Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/c3/55/12e842c70ff8828e34e543a2c7176dac4da006ca6901c9e8b43efab8bc6b/spacy_legacy-3.0.12-py2.py3-none-any.whl", hash = "sha256:476e3bd0d05f8c339ed60f40986c07387c0a71479245d6d0f4298dbd52cda55f", size = 29971, upload-time = "2023-01-23T09:04:13.45Z" },
+]
+
+[[package]]
+name = "spacy-loggers"
+version = "1.0.5"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/67/3d/926db774c9c98acf66cb4ed7faf6c377746f3e00b84b700d0868b95d0712/spacy-loggers-1.0.5.tar.gz", hash = "sha256:d60b0bdbf915a60e516cc2e653baeff946f0cfc461b452d11a4d5458c6fe5f24", size = 20811, upload-time = "2023-09-11T12:26:52.323Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/33/78/d1a1a026ef3af911159398c939b1509d5c36fe524c7b644f34a5146c4e16/spacy_loggers-1.0.5-py3-none-any.whl", hash = "sha256:196284c9c446cc0cdb944005384270d775fdeaf4f494d8e269466cfa497ef645", size = 22343, upload-time = "2023-09-11T12:26:50.586Z" },
+]
+
+[[package]]
+name = "srsly"
+version = "2.5.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "catalogue" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/b7/e8/eb51b1349f50bac0222398af0942613fdc9d1453ae67cbe4bf9936a1a54b/srsly-2.5.1.tar.gz", hash = "sha256:ab1b4bf6cf3e29da23dae0493dd1517fb787075206512351421b89b4fc27c77e", size = 466464, upload-time = "2025-01-17T09:26:26.919Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/37/08/448bcc87bb93bc19fccf70c2f0f993ac42aa41d5f44a19c60d00186aea09/srsly-2.5.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d0cda6f65cc0dd1daf47e856b0d6c5d51db8a9343c5007723ca06903dcfe367d", size = 636045, upload-time = "2025-01-17T09:25:04.605Z" },
+ { url = "https://files.pythonhosted.org/packages/03/8a/379dd9014e56460e71346cf512632fb8cbc89aa6dfebe31dff21c9eb37ba/srsly-2.5.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:cf643e6f45c266cfacea54997a1f9cfe0113fadac1ac21a1ec5b200cfe477ba0", size = 634425, upload-time = "2025-01-17T09:25:07.957Z" },
+ { url = "https://files.pythonhosted.org/packages/95/69/46e672941b5f4403b0e2b14918d8e1393ca48e3338e2c01e549113261cdf/srsly-2.5.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:467ed25ddab09ca9404fda92519a317c803b5ea0849f846e74ba8b7843557df5", size = 1085032, upload-time = "2025-01-17T09:25:11.291Z" },
+ { url = "https://files.pythonhosted.org/packages/ce/d8/1039e663b87a06d2450148ebadc07eaf6f8b7dd7f7d5e2f4221050ce6702/srsly-2.5.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5f8113d202664b7d31025bdbe40b9d3536e8d7154d09520b6a1955818fa6d622", size = 1089469, upload-time = "2025-01-17T09:25:15.913Z" },
+ { url = "https://files.pythonhosted.org/packages/e9/62/f819ac665ecca2659343a6c79174c582fe292829f481899f05e7a7301988/srsly-2.5.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:794d39fccd2b333d24f1b445acc78daf90f3f37d3c0f6f0167f25c56961804e7", size = 1052673, upload-time = "2025-01-17T09:25:17.658Z" },
+ { url = "https://files.pythonhosted.org/packages/a8/69/321a41fe4d549b96dd010b6a77657e84eb181034f9d125e2feebcd8f2e5c/srsly-2.5.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:df7fd77457c4d6c630f700b1019a8ad173e411e7cf7cfdea70e5ed86b608083b", size = 1062650, upload-time = "2025-01-17T09:25:20.704Z" },
+ { url = "https://files.pythonhosted.org/packages/d5/b8/3dfed2db5c7ecf275aaddb775e2ae17c576b09c848873188fce91e410129/srsly-2.5.1-cp310-cp310-win_amd64.whl", hash = "sha256:1a4dddb2edb8f7974c9aa5ec46dc687a75215b3bbdc815ce3fc9ea68fe1e94b5", size = 632267, upload-time = "2025-01-17T09:25:23.713Z" },
+ { url = "https://files.pythonhosted.org/packages/df/9c/a248bb49de499fe0990e3cb0fb341c2373d8863ef9a8b5799353cade5731/srsly-2.5.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:58f0736794ce00a71d62a39cbba1d62ea8d5be4751df956e802d147da20ecad7", size = 635917, upload-time = "2025-01-17T09:25:25.109Z" },
+ { url = "https://files.pythonhosted.org/packages/41/47/1bdaad84502df973ecb8ca658117234cf7fb20e1dec60da71dce82de993f/srsly-2.5.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:7a8269c40859806d71920396d185f4f38dc985cdb6a28d3a326a701e29a5f629", size = 634374, upload-time = "2025-01-17T09:25:26.609Z" },
+ { url = "https://files.pythonhosted.org/packages/e5/2a/d73c71989fcf2a6d1fa518d75322aff4db01a8763f167f8c5e00aac11097/srsly-2.5.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:889905900401fefc1032e22b73aecbed8b4251aa363f632b2d1f86fc16f1ad8e", size = 1108390, upload-time = "2025-01-17T09:25:29.32Z" },
+ { url = "https://files.pythonhosted.org/packages/35/a3/9eda9997a8bd011caed18fdaa5ce606714eb06d8dab587ed0522b3e92ab1/srsly-2.5.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bf454755f22589df49c25dc799d8af7b47dce3d861dded35baf0f0b6ceab4422", size = 1110712, upload-time = "2025-01-17T09:25:31.051Z" },
+ { url = "https://files.pythonhosted.org/packages/8a/ef/4b50bc05d06349f905b27f824cc23b652098efd4be19aead3af4981df647/srsly-2.5.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:cc0607c8a59013a51dde5c1b4e465558728e9e0a35dcfa73c7cbefa91a0aad50", size = 1081244, upload-time = "2025-01-17T09:25:32.611Z" },
+ { url = "https://files.pythonhosted.org/packages/90/af/d4a2512d9a5048d2b18efead39d4c4404bddd4972935bbc68211292a736c/srsly-2.5.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:d5421ba3ab3c790e8b41939c51a1d0f44326bfc052d7a0508860fb79a47aee7f", size = 1091692, upload-time = "2025-01-17T09:25:34.15Z" },
+ { url = "https://files.pythonhosted.org/packages/bb/da/657a685f63028dcb00ccdc4ac125ed347c8bff6fa0dab6a9eb3dc45f3223/srsly-2.5.1-cp311-cp311-win_amd64.whl", hash = "sha256:b96ea5a9a0d0379a79c46d255464a372fb14c30f59a8bc113e4316d131a530ab", size = 632627, upload-time = "2025-01-17T09:25:37.36Z" },
+ { url = "https://files.pythonhosted.org/packages/fb/f6/bebc20d75bd02121fc0f65ad8c92a5dd2570e870005e940faa55a263e61a/srsly-2.5.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:683b54ed63d7dfee03bc2abc4b4a5f2152f81ec217bbadbac01ef1aaf2a75790", size = 636717, upload-time = "2025-01-17T09:25:40.236Z" },
+ { url = "https://files.pythonhosted.org/packages/b6/e8/9372317a4742c70b87b413335adfcdfb2bee4f88f3faba89fabb9e6abf21/srsly-2.5.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:459d987130e57e83ce9e160899afbeb871d975f811e6958158763dd9a8a20f23", size = 634697, upload-time = "2025-01-17T09:25:43.605Z" },
+ { url = "https://files.pythonhosted.org/packages/d5/00/c6a7b99ab27b051a27bd26fe1a8c1885225bb8980282bf9cb99f70610368/srsly-2.5.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:184e3c98389aab68ff04aab9095bd5f1a8e5a72cc5edcba9d733bac928f5cf9f", size = 1134655, upload-time = "2025-01-17T09:25:45.238Z" },
+ { url = "https://files.pythonhosted.org/packages/c2/e6/861459e8241ec3b78c111081bd5efa414ef85867e17c45b6882954468d6e/srsly-2.5.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:00c2a3e4856e63b7efd47591d049aaee8e5a250e098917f50d93ea68853fab78", size = 1143544, upload-time = "2025-01-17T09:25:47.485Z" },
+ { url = "https://files.pythonhosted.org/packages/2d/85/8448fe874dd2042a4eceea5315cfff3af03ac77ff5073812071852c4e7e2/srsly-2.5.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:366b4708933cd8d6025c13c2cea3331f079c7bb5c25ec76fca392b6fc09818a0", size = 1098330, upload-time = "2025-01-17T09:25:52.55Z" },
+ { url = "https://files.pythonhosted.org/packages/ef/7e/04d0e1417da140b2ac4053a3d4fcfc86cd59bf4829f69d370bb899f74d5d/srsly-2.5.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:c8a0b03c64eb6e150d772c5149befbadd981cc734ab13184b0561c17c8cef9b1", size = 1110670, upload-time = "2025-01-17T09:25:54.02Z" },
+ { url = "https://files.pythonhosted.org/packages/96/1a/a8cd627eaa81a91feb6ceab50155f4ceff3eef6107916cb87ef796958427/srsly-2.5.1-cp312-cp312-win_amd64.whl", hash = "sha256:7952538f6bba91b9d8bf31a642ac9e8b9ccc0ccbb309feb88518bfb84bb0dc0d", size = 632598, upload-time = "2025-01-17T09:25:55.499Z" },
+ { url = "https://files.pythonhosted.org/packages/42/94/cab36845aad6e2c22ecee1178accaa365657296ff87305b805648fd41118/srsly-2.5.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:84b372f7ef1604b4a5b3cee1571993931f845a5b58652ac01bcb32c52586d2a8", size = 634883, upload-time = "2025-01-17T09:25:58.363Z" },
+ { url = "https://files.pythonhosted.org/packages/67/8b/501f51f4eaee7e1fd7327764799cb0a42f5d0de042a97916d30dbff770fc/srsly-2.5.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:6ac3944c112acb3347a39bfdc2ebfc9e2d4bace20fe1c0b764374ac5b83519f2", size = 632842, upload-time = "2025-01-17T09:25:59.777Z" },
+ { url = "https://files.pythonhosted.org/packages/07/be/5b8fce4829661e070a7d3e262d2e533f0e297b11b8993d57240da67d7330/srsly-2.5.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6118f9c4b221cde0a990d06a42c8a4845218d55b425d8550746fe790acf267e9", size = 1118516, upload-time = "2025-01-17T09:26:01.234Z" },
+ { url = "https://files.pythonhosted.org/packages/91/60/a34e97564eac352c0e916c98f44b6f566b7eb6a9fb60bcd60ffa98530762/srsly-2.5.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7481460110d9986781d9e4ac0f5f991f1d6839284a80ad268625f9a23f686950", size = 1127974, upload-time = "2025-01-17T09:26:04.007Z" },
+ { url = "https://files.pythonhosted.org/packages/70/a2/f642334db0cabd187fa86b8773257ee6993c6009338a6831d4804e2c5b3c/srsly-2.5.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:6e57b8138082f09e35db60f99757e16652489e9e3692471d8e0c39aa95180688", size = 1086098, upload-time = "2025-01-17T09:26:05.612Z" },
+ { url = "https://files.pythonhosted.org/packages/0d/9b/be48e185c5a010e71b5135e4cdf317ff56b8ac4bc08f394bbf882ac13b05/srsly-2.5.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:bab90b85a63a1fe0bbc74d373c8bb9bb0499ddfa89075e0ebe8d670f12d04691", size = 1100354, upload-time = "2025-01-17T09:26:07.215Z" },
+ { url = "https://files.pythonhosted.org/packages/3a/e2/745aeba88a8513017fbac2fd2f9f07b8a36065e51695f818541eb795ec0c/srsly-2.5.1-cp313-cp313-win_amd64.whl", hash = "sha256:e73712be1634b5e1de6f81c273a7d47fe091ad3c79dc779c03d3416a5c117cee", size = 630634, upload-time = "2025-01-17T09:26:10.018Z" },
+]
+
+[[package]]
+name = "sympy"
+version = "1.14.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "mpmath" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/83/d3/803453b36afefb7c2bb238361cd4ae6125a569b4db67cd9e79846ba2d68c/sympy-1.14.0.tar.gz", hash = "sha256:d3d3fe8df1e5a0b42f0e7bdf50541697dbe7d23746e894990c030e2b05e72517", size = 7793921, upload-time = "2025-04-27T18:05:01.611Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/a2/09/77d55d46fd61b4a135c444fc97158ef34a095e5681d0a6c10b75bf356191/sympy-1.14.0-py3-none-any.whl", hash = "sha256:e091cc3e99d2141a0ba2847328f5479b05d94a6635cb96148ccb3f34671bd8f5", size = 6299353, upload-time = "2025-04-27T18:04:59.103Z" },
+]
+
+[[package]]
+name = "thinc"
+version = "8.3.6"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "blis" },
+ { name = "catalogue" },
+ { name = "confection" },
+ { name = "cymem" },
+ { name = "murmurhash" },
+ { name = "numpy", version = "2.2.6", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.11'" },
+ { name = "numpy", version = "2.3.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.11'" },
+ { name = "packaging" },
+ { name = "preshed" },
+ { name = "pydantic" },
+ { name = "setuptools" },
+ { name = "srsly" },
+ { name = "wasabi" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/01/f4/7607f76c2e156a34b1961a941eb8407b84da4f515cc0903b44d44edf4f45/thinc-8.3.6.tar.gz", hash = "sha256:49983f9b7ddc4343a9532694a9118dd216d7a600520a21849a43b6c268ec6cad", size = 194218, upload-time = "2025-04-04T11:50:45.751Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/6c/36/92233344b30caab56c2d8b0fc92472ec37402a5088f4f89ced96821e1638/thinc-8.3.6-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:f4abec5a35e5945a6573b62bf0f423709467ba321fea9d00770b4c5282a8257d", size = 894690, upload-time = "2025-04-04T11:49:38.682Z" },
+ { url = "https://files.pythonhosted.org/packages/e9/7e/ce6acadb3ae92836ed9eef2359fd2bd5f9257d22c422b44dd9c1cbc2b5c8/thinc-8.3.6-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:ba7ced4bfc5890dd8f4be2978f8d491a07e80c9d9a7fffae9f57970b55db01bd", size = 844882, upload-time = "2025-04-04T11:49:45.947Z" },
+ { url = "https://files.pythonhosted.org/packages/9d/88/26ca6b5d097d7958451fb4f6e67bcce6dfe31e73f21c16fd905315806f2a/thinc-8.3.6-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1e645517d87f71e92137a1aef028094d134223885e15b8472bfcdc09665973ed", size = 4072032, upload-time = "2025-04-04T11:49:47.636Z" },
+ { url = "https://files.pythonhosted.org/packages/c0/eb/12681892dbcaa45eedeb37dcf40251ea47e22e97413e6d4209fa4db2b9f2/thinc-8.3.6-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:10d8451dd08386d6bbde8160fd0e5e057e04a330c168837d3e0f278fa8738eea", size = 4136573, upload-time = "2025-04-04T11:49:49.106Z" },
+ { url = "https://files.pythonhosted.org/packages/5f/76/98dd4983efbd7eebdd5d5964c566c8e5c10782db1599fc694610ba49ba75/thinc-8.3.6-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:0e913f120fde25aea9f052e8cd45dd9cd36553ff1903e312b7302dd91000125a", size = 4949592, upload-time = "2025-04-04T11:49:50.64Z" },
+ { url = "https://files.pythonhosted.org/packages/cd/7f/d867c7bcb22cd6022d9931f625340b26e86fd9cf25905adc408bc7393799/thinc-8.3.6-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:03706680bc0ea92036ac2e00f46bc86116ac6dccb6212b0c632e835176f666b2", size = 5165689, upload-time = "2025-04-04T11:49:52.385Z" },
+ { url = "https://files.pythonhosted.org/packages/ed/8c/d6000d6d19cc1fc800ceb073b44403ce27a3414248cede691b5c49cbfb22/thinc-8.3.6-cp310-cp310-win_amd64.whl", hash = "sha256:0902314ecb83a225f41ab6121ceaf139b5da8bb6ada9e58031bad6c46134b8d4", size = 1773832, upload-time = "2025-04-04T11:49:54.259Z" },
+ { url = "https://files.pythonhosted.org/packages/23/b4/b4ed217679327849ad796dc8ced307447a05e9f607bb12f290b2ec99fb35/thinc-8.3.6-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:7c7c44f8736f27d1cced216246c00e219fb5734e6bc3b8a78c09157c011aae59", size = 895694, upload-time = "2025-04-04T11:49:56.102Z" },
+ { url = "https://files.pythonhosted.org/packages/f0/53/5f9eeb725c2ca94adef76a2cd0289bc530728b0a035eed815c766a9291ef/thinc-8.3.6-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:92b3c38bdfdf81d0485685a6261b8a6ea40e03120b08ced418c8400f5e186b2d", size = 845256, upload-time = "2025-04-04T11:49:57.401Z" },
+ { url = "https://files.pythonhosted.org/packages/89/8e/fae78ba63b1b0fb9017fd51d0aeffdc22e837807a93d55100c92e6d30570/thinc-8.3.6-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:853eb187b1f77057adada1a72e7f6ea3f38643930363681cfd5de285dab4b09b", size = 4402142, upload-time = "2025-04-04T11:49:59.144Z" },
+ { url = "https://files.pythonhosted.org/packages/ae/7e/6b1bb6eba9c25a5911e1624c0da33ca007d7697b41ee11c059f3d23d8bbc/thinc-8.3.6-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1c12bf75a375b3b1f7c32a26cbd69255b177daa693c986a27faaf2027439c7ef", size = 4463797, upload-time = "2025-04-04T11:50:00.647Z" },
+ { url = "https://files.pythonhosted.org/packages/42/39/c5f48785f76cd97a79e83b1acd3dab7d0835e4b560c9d7add846b47ea984/thinc-8.3.6-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:5bf1708c22fb54e7846e8e743a9e6a43a22cbe24cab0081ba4e6362b4437a53f", size = 5313270, upload-time = "2025-04-04T11:50:02.241Z" },
+ { url = "https://files.pythonhosted.org/packages/a3/32/ce8e7827ed0a1f86eab864e6b0b8ae79118e4e5c40ffd8b21938f8961e23/thinc-8.3.6-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:169d7c5779f6f1a78fa91b2bc3a6485f7bbe4341bd8064576f8e067b67b6a0b5", size = 5508338, upload-time = "2025-04-04T11:50:04.18Z" },
+ { url = "https://files.pythonhosted.org/packages/94/40/7e5e840ac2e835fbf5d87e3ab94df7d678d846aaf28b12d46538ed36bf7f/thinc-8.3.6-cp311-cp311-win_amd64.whl", hash = "sha256:59c244ce11a3359b9a33b4c3bbc9ba94f7174214356ed88c16a41e39f31fe372", size = 1775998, upload-time = "2025-04-04T11:50:05.744Z" },
+ { url = "https://files.pythonhosted.org/packages/2b/c8/a9250944fb9a0a4c65b5d456f3a87ee6c249b53962757d77c28df8fadb46/thinc-8.3.6-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:c54705e45a710e49758192592a3e0a80482edfdf5c61fc99f5d27ae822f652c5", size = 890177, upload-time = "2025-04-04T11:50:07.543Z" },
+ { url = "https://files.pythonhosted.org/packages/3b/89/1ac54b18d4de79872c633302a10825695a36cd2e552cb8d4fea820b7a357/thinc-8.3.6-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:91acdbf3041c0ac1775ede570535a779cdf1312c317cd054d7b9d200da685c23", size = 839410, upload-time = "2025-04-04T11:50:09.26Z" },
+ { url = "https://files.pythonhosted.org/packages/37/76/e1a76ab42e4637c4b8988d59784cdc1169a532d3043c36d2faf1a8d95228/thinc-8.3.6-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f5a1db861614f91ff127feecce681c2213777b2d3d1ee6644bcc8a886acf0595", size = 4195748, upload-time = "2025-04-04T11:50:10.92Z" },
+ { url = "https://files.pythonhosted.org/packages/00/a9/c59ac3260e7aff6b9dc80f495f1846a80b490595db06d040b05205d1f7f8/thinc-8.3.6-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:512e461989df8a30558367061d63ae6f1a6b4abe3c016a3360ee827e824254e0", size = 4261270, upload-time = "2025-04-04T11:50:12.953Z" },
+ { url = "https://files.pythonhosted.org/packages/e0/8e/e86c5cbc6ebe238aa747ef9e20a969f6faba9ebbe1cbce059119f9614dd6/thinc-8.3.6-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:a087aea2a63e6b9ccde61163d5922553b58908e96f8ad49cd0fd2edeb43e063f", size = 5067567, upload-time = "2025-04-04T11:50:18.317Z" },
+ { url = "https://files.pythonhosted.org/packages/fe/8a/16670e4de36231aab5b052c734ad716be29aab2c0d2f3d8dd9c8dd27fafc/thinc-8.3.6-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b1d85dd5d94bb75006864c7d99fd5b75d05b1602d571e7fcdb42d4521f962048", size = 5309405, upload-time = "2025-04-04T11:50:20.075Z" },
+ { url = "https://files.pythonhosted.org/packages/58/08/5439dd15b661610d8a3b919f18065ebf0d664b6a54a3794206622a74c910/thinc-8.3.6-cp312-cp312-win_amd64.whl", hash = "sha256:1170d85294366127d97a27dd5896f4abe90e2a5ea2b7988de9a5bb8e1128d222", size = 1749275, upload-time = "2025-04-04T11:50:21.769Z" },
+ { url = "https://files.pythonhosted.org/packages/a6/03/0ba9bec3057f4a9c0b7ba53839aebcbbbc28de3b91330cb8de74a885b8f6/thinc-8.3.6-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:d8743ee8ad2d59fda018b57e5da102d6098bbeb0f70476f3fd8ceb9d215d88b9", size = 883375, upload-time = "2025-04-04T11:50:23.273Z" },
+ { url = "https://files.pythonhosted.org/packages/ae/79/ac31cd25d1d973b824de10ebbc56788688aecdd8f56800daf8edfff45097/thinc-8.3.6-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:89dbeb2ca94f1033e90999a70e2bc9dd5390d5341dc1a3a4b8793d03855265c3", size = 832654, upload-time = "2025-04-04T11:50:24.871Z" },
+ { url = "https://files.pythonhosted.org/packages/f8/0d/fb5e8e49dfb53cc02ce907f81002031c6f4fe7e7aa44b1004ea695630017/thinc-8.3.6-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:89a5460695067aa6e4182515cfd2018263db77cc17b7031d50ed696e990797a8", size = 4158592, upload-time = "2025-04-04T11:50:26.403Z" },
+ { url = "https://files.pythonhosted.org/packages/e5/42/c87990ca214b9910f33b110d3b1ac213407388d35376bc955ad45e5de764/thinc-8.3.6-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0aa8e32f49234569fd10c35b562ee2f9c0d51225365a6e604a5a67396a49f2c1", size = 4236211, upload-time = "2025-04-04T11:50:27.943Z" },
+ { url = "https://files.pythonhosted.org/packages/fa/10/9975bcee4dd4634bfb87df0447d7fa86d6c9b2d9228e56d4adb98cc19cbc/thinc-8.3.6-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:f432158b80cf75a096980470b790b51d81daf9c2822598adebfc3cb58588fd6c", size = 5049197, upload-time = "2025-04-04T11:50:29.583Z" },
+ { url = "https://files.pythonhosted.org/packages/9b/34/e1b384009eb8ad2192770157961cd0c2e2712fedf49e1dfd902e3d9b9973/thinc-8.3.6-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:61fb33a22aba40366fa9018ab34580f74fc40be821ab8af77ac1fdbeac17243b", size = 5278543, upload-time = "2025-04-04T11:50:31.524Z" },
+ { url = "https://files.pythonhosted.org/packages/f0/26/f77ef4bd174bfeac491237a4ca3f74ba2ee2f672004f76cff90f8407a489/thinc-8.3.6-cp313-cp313-win_amd64.whl", hash = "sha256:ddd7041946a427f6a9b0b49419353d02ad7eb43fe16724bfcc3bdeb9562040b1", size = 1746883, upload-time = "2025-04-04T11:50:33.038Z" },
+]
+
+[[package]]
+name = "threadpoolctl"
+version = "3.6.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/b7/4d/08c89e34946fce2aec4fbb45c9016efd5f4d7f24af8e5d93296e935631d8/threadpoolctl-3.6.0.tar.gz", hash = "sha256:8ab8b4aa3491d812b623328249fab5302a68d2d71745c8a4c719a2fcaba9f44e", size = 21274, upload-time = "2025-03-13T13:49:23.031Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/32/d5/f9a850d79b0851d1d4ef6456097579a9005b31fea68726a4ae5f2d82ddd9/threadpoolctl-3.6.0-py3-none-any.whl", hash = "sha256:43a0b8fd5a2928500110039e43a5eed8480b918967083ea48dc3ab9f13c4a7fb", size = 18638, upload-time = "2025-03-13T13:49:21.846Z" },
+]
+
+[[package]]
+name = "tokenizers"
+version = "0.21.4"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "huggingface-hub" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/c2/2f/402986d0823f8d7ca139d969af2917fefaa9b947d1fb32f6168c509f2492/tokenizers-0.21.4.tar.gz", hash = "sha256:fa23f85fbc9a02ec5c6978da172cdcbac23498c3ca9f3645c5c68740ac007880", size = 351253, upload-time = "2025-07-28T15:48:54.325Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/98/c6/fdb6f72bf6454f52eb4a2510be7fb0f614e541a2554d6210e370d85efff4/tokenizers-0.21.4-cp39-abi3-macosx_10_12_x86_64.whl", hash = "sha256:2ccc10a7c3bcefe0f242867dc914fc1226ee44321eb618cfe3019b5df3400133", size = 2863987, upload-time = "2025-07-28T15:48:44.877Z" },
+ { url = "https://files.pythonhosted.org/packages/8d/a6/28975479e35ddc751dc1ddc97b9b69bf7fcf074db31548aab37f8116674c/tokenizers-0.21.4-cp39-abi3-macosx_11_0_arm64.whl", hash = "sha256:5e2f601a8e0cd5be5cc7506b20a79112370b9b3e9cb5f13f68ab11acd6ca7d60", size = 2732457, upload-time = "2025-07-28T15:48:43.265Z" },
+ { url = "https://files.pythonhosted.org/packages/aa/8f/24f39d7b5c726b7b0be95dca04f344df278a3fe3a4deb15a975d194cbb32/tokenizers-0.21.4-cp39-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:39b376f5a1aee67b4d29032ee85511bbd1b99007ec735f7f35c8a2eb104eade5", size = 3012624, upload-time = "2025-07-28T13:22:43.895Z" },
+ { url = "https://files.pythonhosted.org/packages/58/47/26358925717687a58cb74d7a508de96649544fad5778f0cd9827398dc499/tokenizers-0.21.4-cp39-abi3-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:2107ad649e2cda4488d41dfd031469e9da3fcbfd6183e74e4958fa729ffbf9c6", size = 2939681, upload-time = "2025-07-28T13:22:47.499Z" },
+ { url = "https://files.pythonhosted.org/packages/99/6f/cc300fea5db2ab5ddc2c8aea5757a27b89c84469899710c3aeddc1d39801/tokenizers-0.21.4-cp39-abi3-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3c73012da95afafdf235ba80047699df4384fdc481527448a078ffd00e45a7d9", size = 3247445, upload-time = "2025-07-28T15:48:39.711Z" },
+ { url = "https://files.pythonhosted.org/packages/be/bf/98cb4b9c3c4afd8be89cfa6423704337dc20b73eb4180397a6e0d456c334/tokenizers-0.21.4-cp39-abi3-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f23186c40395fc390d27f519679a58023f368a0aad234af145e0f39ad1212732", size = 3428014, upload-time = "2025-07-28T13:22:49.569Z" },
+ { url = "https://files.pythonhosted.org/packages/75/c7/96c1cc780e6ca7f01a57c13235dd05b7bc1c0f3588512ebe9d1331b5f5ae/tokenizers-0.21.4-cp39-abi3-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cc88bb34e23a54cc42713d6d98af5f1bf79c07653d24fe984d2d695ba2c922a2", size = 3193197, upload-time = "2025-07-28T13:22:51.471Z" },
+ { url = "https://files.pythonhosted.org/packages/f2/90/273b6c7ec78af547694eddeea9e05de771278bd20476525ab930cecaf7d8/tokenizers-0.21.4-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:51b7eabb104f46c1c50b486520555715457ae833d5aee9ff6ae853d1130506ff", size = 3115426, upload-time = "2025-07-28T15:48:41.439Z" },
+ { url = "https://files.pythonhosted.org/packages/91/43/c640d5a07e95f1cf9d2c92501f20a25f179ac53a4f71e1489a3dcfcc67ee/tokenizers-0.21.4-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:714b05b2e1af1288bd1bc56ce496c4cebb64a20d158ee802887757791191e6e2", size = 9089127, upload-time = "2025-07-28T15:48:46.472Z" },
+ { url = "https://files.pythonhosted.org/packages/44/a1/dd23edd6271d4dca788e5200a807b49ec3e6987815cd9d0a07ad9c96c7c2/tokenizers-0.21.4-cp39-abi3-musllinux_1_2_armv7l.whl", hash = "sha256:1340ff877ceedfa937544b7d79f5b7becf33a4cfb58f89b3b49927004ef66f78", size = 9055243, upload-time = "2025-07-28T15:48:48.539Z" },
+ { url = "https://files.pythonhosted.org/packages/21/2b/b410d6e9021c4b7ddb57248304dc817c4d4970b73b6ee343674914701197/tokenizers-0.21.4-cp39-abi3-musllinux_1_2_i686.whl", hash = "sha256:3c1f4317576e465ac9ef0d165b247825a2a4078bcd01cba6b54b867bdf9fdd8b", size = 9298237, upload-time = "2025-07-28T15:48:50.443Z" },
+ { url = "https://files.pythonhosted.org/packages/b7/0a/42348c995c67e2e6e5c89ffb9cfd68507cbaeb84ff39c49ee6e0a6dd0fd2/tokenizers-0.21.4-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:c212aa4e45ec0bb5274b16b6f31dd3f1c41944025c2358faaa5782c754e84c24", size = 9461980, upload-time = "2025-07-28T15:48:52.325Z" },
+ { url = "https://files.pythonhosted.org/packages/3d/d3/dacccd834404cd71b5c334882f3ba40331ad2120e69ded32cf5fda9a7436/tokenizers-0.21.4-cp39-abi3-win32.whl", hash = "sha256:6c42a930bc5f4c47f4ea775c91de47d27910881902b0f20e4990ebe045a415d0", size = 2329871, upload-time = "2025-07-28T15:48:56.841Z" },
+ { url = "https://files.pythonhosted.org/packages/41/f2/fd673d979185f5dcbac4be7d09461cbb99751554ffb6718d0013af8604cb/tokenizers-0.21.4-cp39-abi3-win_amd64.whl", hash = "sha256:475d807a5c3eb72c59ad9b5fcdb254f6e17f53dfcbb9903233b0dfa9c943b597", size = 2507568, upload-time = "2025-07-28T15:48:55.456Z" },
+]
+
+[[package]]
+name = "tomli"
+version = "2.4.1"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/22/de/48c59722572767841493b26183a0d1cc411d54fd759c5607c4590b6563a6/tomli-2.4.1.tar.gz", hash = "sha256:7c7e1a961a0b2f2472c1ac5b69affa0ae1132c39adcb67aba98568702b9cc23f", size = 17543, upload-time = "2026-03-25T20:22:03.828Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/f4/11/db3d5885d8528263d8adc260bb2d28ebf1270b96e98f0e0268d32b8d9900/tomli-2.4.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:f8f0fc26ec2cc2b965b7a3b87cd19c5c6b8c5e5f436b984e85f486d652285c30", size = 154704, upload-time = "2026-03-25T20:21:10.473Z" },
+ { url = "https://files.pythonhosted.org/packages/6d/f7/675db52c7e46064a9aa928885a9b20f4124ecb9bc2e1ce74c9106648d202/tomli-2.4.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:4ab97e64ccda8756376892c53a72bd1f964e519c77236368527f758fbc36a53a", size = 149454, upload-time = "2026-03-25T20:21:12.036Z" },
+ { url = "https://files.pythonhosted.org/packages/61/71/81c50943cf953efa35bce7646caab3cf457a7d8c030b27cfb40d7235f9ee/tomli-2.4.1-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:96481a5786729fd470164b47cdb3e0e58062a496f455ee41b4403be77cb5a076", size = 237561, upload-time = "2026-03-25T20:21:13.098Z" },
+ { url = "https://files.pythonhosted.org/packages/48/c1/f41d9cb618acccca7df82aaf682f9b49013c9397212cb9f53219e3abac37/tomli-2.4.1-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:5a881ab208c0baf688221f8cecc5401bd291d67e38a1ac884d6736cbcd8247e9", size = 243824, upload-time = "2026-03-25T20:21:14.569Z" },
+ { url = "https://files.pythonhosted.org/packages/22/e4/5a816ecdd1f8ca51fb756ef684b90f2780afc52fc67f987e3c61d800a46d/tomli-2.4.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:47149d5bd38761ac8be13a84864bf0b7b70bc051806bc3669ab1cbc56216b23c", size = 242227, upload-time = "2026-03-25T20:21:15.712Z" },
+ { url = "https://files.pythonhosted.org/packages/6b/49/2b2a0ef529aa6eec245d25f0c703e020a73955ad7edf73e7f54ddc608aa5/tomli-2.4.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:ec9bfaf3ad2df51ace80688143a6a4ebc09a248f6ff781a9945e51937008fcbc", size = 247859, upload-time = "2026-03-25T20:21:17.001Z" },
+ { url = "https://files.pythonhosted.org/packages/83/bd/6c1a630eaca337e1e78c5903104f831bda934c426f9231429396ce3c3467/tomli-2.4.1-cp311-cp311-win32.whl", hash = "sha256:ff2983983d34813c1aeb0fa89091e76c3a22889ee83ab27c5eeb45100560c049", size = 97204, upload-time = "2026-03-25T20:21:18.079Z" },
+ { url = "https://files.pythonhosted.org/packages/42/59/71461df1a885647e10b6bb7802d0b8e66480c61f3f43079e0dcd315b3954/tomli-2.4.1-cp311-cp311-win_amd64.whl", hash = "sha256:5ee18d9ebdb417e384b58fe414e8d6af9f4e7a0ae761519fb50f721de398dd4e", size = 108084, upload-time = "2026-03-25T20:21:18.978Z" },
+ { url = "https://files.pythonhosted.org/packages/b8/83/dceca96142499c069475b790e7913b1044c1a4337e700751f48ed723f883/tomli-2.4.1-cp311-cp311-win_arm64.whl", hash = "sha256:c2541745709bad0264b7d4705ad453b76ccd191e64aa6f0fc66b69a293a45ece", size = 95285, upload-time = "2026-03-25T20:21:20.309Z" },
+ { url = "https://files.pythonhosted.org/packages/c1/ba/42f134a3fe2b370f555f44b1d72feebb94debcab01676bf918d0cb70e9aa/tomli-2.4.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:c742f741d58a28940ce01d58f0ab2ea3ced8b12402f162f4d534dfe18ba1cd6a", size = 155924, upload-time = "2026-03-25T20:21:21.626Z" },
+ { url = "https://files.pythonhosted.org/packages/dc/c7/62d7a17c26487ade21c5422b646110f2162f1fcc95980ef7f63e73c68f14/tomli-2.4.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:7f86fd587c4ed9dd76f318225e7d9b29cfc5a9d43de44e5754db8d1128487085", size = 150018, upload-time = "2026-03-25T20:21:23.002Z" },
+ { url = "https://files.pythonhosted.org/packages/5c/05/79d13d7c15f13bdef410bdd49a6485b1c37d28968314eabee452c22a7fda/tomli-2.4.1-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ff18e6a727ee0ab0388507b89d1bc6a22b138d1e2fa56d1ad494586d61d2eae9", size = 244948, upload-time = "2026-03-25T20:21:24.04Z" },
+ { url = "https://files.pythonhosted.org/packages/10/90/d62ce007a1c80d0b2c93e02cab211224756240884751b94ca72df8a875ca/tomli-2.4.1-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:136443dbd7e1dee43c68ac2694fde36b2849865fa258d39bf822c10e8068eac5", size = 253341, upload-time = "2026-03-25T20:21:25.177Z" },
+ { url = "https://files.pythonhosted.org/packages/1a/7e/caf6496d60152ad4ed09282c1885cca4eea150bfd007da84aea07bcc0a3e/tomli-2.4.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:5e262d41726bc187e69af7825504c933b6794dc3fbd5945e41a79bb14c31f585", size = 248159, upload-time = "2026-03-25T20:21:26.364Z" },
+ { url = "https://files.pythonhosted.org/packages/99/e7/c6f69c3120de34bbd882c6fba7975f3d7a746e9218e56ab46a1bc4b42552/tomli-2.4.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:5cb41aa38891e073ee49d55fbc7839cfdb2bc0e600add13874d048c94aadddd1", size = 253290, upload-time = "2026-03-25T20:21:27.46Z" },
+ { url = "https://files.pythonhosted.org/packages/d6/2f/4a3c322f22c5c66c4b836ec58211641a4067364f5dcdd7b974b4c5da300c/tomli-2.4.1-cp312-cp312-win32.whl", hash = "sha256:da25dc3563bff5965356133435b757a795a17b17d01dbc0f42fb32447ddfd917", size = 98141, upload-time = "2026-03-25T20:21:28.492Z" },
+ { url = "https://files.pythonhosted.org/packages/24/22/4daacd05391b92c55759d55eaee21e1dfaea86ce5c571f10083360adf534/tomli-2.4.1-cp312-cp312-win_amd64.whl", hash = "sha256:52c8ef851d9a240f11a88c003eacb03c31fc1c9c4ec64a99a0f922b93874fda9", size = 108847, upload-time = "2026-03-25T20:21:29.386Z" },
+ { url = "https://files.pythonhosted.org/packages/68/fd/70e768887666ddd9e9f5d85129e84910f2db2796f9096aa02b721a53098d/tomli-2.4.1-cp312-cp312-win_arm64.whl", hash = "sha256:f758f1b9299d059cc3f6546ae2af89670cb1c4d48ea29c3cacc4fe7de3058257", size = 95088, upload-time = "2026-03-25T20:21:30.677Z" },
+ { url = "https://files.pythonhosted.org/packages/07/06/b823a7e818c756d9a7123ba2cda7d07bc2dd32835648d1a7b7b7a05d848d/tomli-2.4.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:36d2bd2ad5fb9eaddba5226aa02c8ec3fa4f192631e347b3ed28186d43be6b54", size = 155866, upload-time = "2026-03-25T20:21:31.65Z" },
+ { url = "https://files.pythonhosted.org/packages/14/6f/12645cf7f08e1a20c7eb8c297c6f11d31c1b50f316a7e7e1e1de6e2e7b7e/tomli-2.4.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:eb0dc4e38e6a1fd579e5d50369aa2e10acfc9cace504579b2faabb478e76941a", size = 149887, upload-time = "2026-03-25T20:21:33.028Z" },
+ { url = "https://files.pythonhosted.org/packages/5c/e0/90637574e5e7212c09099c67ad349b04ec4d6020324539297b634a0192b0/tomli-2.4.1-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c7f2c7f2b9ca6bdeef8f0fa897f8e05085923eb091721675170254cbc5b02897", size = 243704, upload-time = "2026-03-25T20:21:34.51Z" },
+ { url = "https://files.pythonhosted.org/packages/10/8f/d3ddb16c5a4befdf31a23307f72828686ab2096f068eaf56631e136c1fdd/tomli-2.4.1-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f3c6818a1a86dd6dca7ddcaaf76947d5ba31aecc28cb1b67009a5877c9a64f3f", size = 251628, upload-time = "2026-03-25T20:21:36.012Z" },
+ { url = "https://files.pythonhosted.org/packages/e3/f1/dbeeb9116715abee2485bf0a12d07a8f31af94d71608c171c45f64c0469d/tomli-2.4.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:d312ef37c91508b0ab2cee7da26ec0b3ed2f03ce12bd87a588d771ae15dcf82d", size = 247180, upload-time = "2026-03-25T20:21:37.136Z" },
+ { url = "https://files.pythonhosted.org/packages/d3/74/16336ffd19ed4da28a70959f92f506233bd7cfc2332b20bdb01591e8b1d1/tomli-2.4.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:51529d40e3ca50046d7606fa99ce3956a617f9b36380da3b7f0dd3dd28e68cb5", size = 251674, upload-time = "2026-03-25T20:21:38.298Z" },
+ { url = "https://files.pythonhosted.org/packages/16/f9/229fa3434c590ddf6c0aa9af64d3af4b752540686cace29e6281e3458469/tomli-2.4.1-cp313-cp313-win32.whl", hash = "sha256:2190f2e9dd7508d2a90ded5ed369255980a1bcdd58e52f7fe24b8162bf9fedbd", size = 97976, upload-time = "2026-03-25T20:21:39.316Z" },
+ { url = "https://files.pythonhosted.org/packages/6a/1e/71dfd96bcc1c775420cb8befe7a9d35f2e5b1309798f009dca17b7708c1e/tomli-2.4.1-cp313-cp313-win_amd64.whl", hash = "sha256:8d65a2fbf9d2f8352685bc1364177ee3923d6baf5e7f43ea4959d7d8bc326a36", size = 108755, upload-time = "2026-03-25T20:21:40.248Z" },
+ { url = "https://files.pythonhosted.org/packages/83/7a/d34f422a021d62420b78f5c538e5b102f62bea616d1d75a13f0a88acb04a/tomli-2.4.1-cp313-cp313-win_arm64.whl", hash = "sha256:4b605484e43cdc43f0954ddae319fb75f04cc10dd80d830540060ee7cd0243cd", size = 95265, upload-time = "2026-03-25T20:21:41.219Z" },
+ { url = "https://files.pythonhosted.org/packages/3c/fb/9a5c8d27dbab540869f7c1f8eb0abb3244189ce780ba9cd73f3770662072/tomli-2.4.1-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:fd0409a3653af6c147209d267a0e4243f0ae46b011aa978b1080359fddc9b6cf", size = 155726, upload-time = "2026-03-25T20:21:42.23Z" },
+ { url = "https://files.pythonhosted.org/packages/62/05/d2f816630cc771ad836af54f5001f47a6f611d2d39535364f148b6a92d6b/tomli-2.4.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:a120733b01c45e9a0c34aeef92bf0cf1d56cfe81ed9d47d562f9ed591a9828ac", size = 149859, upload-time = "2026-03-25T20:21:43.386Z" },
+ { url = "https://files.pythonhosted.org/packages/ce/48/66341bdb858ad9bd0ceab5a86f90eddab127cf8b046418009f2125630ecb/tomli-2.4.1-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:559db847dc486944896521f68d8190be1c9e719fced785720d2216fe7022b662", size = 244713, upload-time = "2026-03-25T20:21:44.474Z" },
+ { url = "https://files.pythonhosted.org/packages/df/6d/c5fad00d82b3c7a3ab6189bd4b10e60466f22cfe8a08a9394185c8a8111c/tomli-2.4.1-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:01f520d4f53ef97964a240a035ec2a869fe1a37dde002b57ebc4417a27ccd853", size = 252084, upload-time = "2026-03-25T20:21:45.62Z" },
+ { url = "https://files.pythonhosted.org/packages/00/71/3a69e86f3eafe8c7a59d008d245888051005bd657760e96d5fbfb0b740c2/tomli-2.4.1-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:7f94b27a62cfad8496c8d2513e1a222dd446f095fca8987fceef261225538a15", size = 247973, upload-time = "2026-03-25T20:21:46.937Z" },
+ { url = "https://files.pythonhosted.org/packages/67/50/361e986652847fec4bd5e4a0208752fbe64689c603c7ae5ea7cb16b1c0ca/tomli-2.4.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:ede3e6487c5ef5d28634ba3f31f989030ad6af71edfb0055cbbd14189ff240ba", size = 256223, upload-time = "2026-03-25T20:21:48.467Z" },
+ { url = "https://files.pythonhosted.org/packages/8c/9a/b4173689a9203472e5467217e0154b00e260621caa227b6fa01feab16998/tomli-2.4.1-cp314-cp314-win32.whl", hash = "sha256:3d48a93ee1c9b79c04bb38772ee1b64dcf18ff43085896ea460ca8dec96f35f6", size = 98973, upload-time = "2026-03-25T20:21:49.526Z" },
+ { url = "https://files.pythonhosted.org/packages/14/58/640ac93bf230cd27d002462c9af0d837779f8773bc03dee06b5835208214/tomli-2.4.1-cp314-cp314-win_amd64.whl", hash = "sha256:88dceee75c2c63af144e456745e10101eb67361050196b0b6af5d717254dddf7", size = 109082, upload-time = "2026-03-25T20:21:50.506Z" },
+ { url = "https://files.pythonhosted.org/packages/d5/2f/702d5e05b227401c1068f0d386d79a589bb12bf64c3d2c72ce0631e3bc49/tomli-2.4.1-cp314-cp314-win_arm64.whl", hash = "sha256:b8c198f8c1805dc42708689ed6864951fd2494f924149d3e4bce7710f8eb5232", size = 96490, upload-time = "2026-03-25T20:21:51.474Z" },
+ { url = "https://files.pythonhosted.org/packages/45/4b/b877b05c8ba62927d9865dd980e34a755de541eb65fffba52b4cc495d4d2/tomli-2.4.1-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:d4d8fe59808a54658fcc0160ecfb1b30f9089906c50b23bcb4c69eddc19ec2b4", size = 164263, upload-time = "2026-03-25T20:21:52.543Z" },
+ { url = "https://files.pythonhosted.org/packages/24/79/6ab420d37a270b89f7195dec5448f79400d9e9c1826df982f3f8e97b24fd/tomli-2.4.1-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:7008df2e7655c495dd12d2a4ad038ff878d4ca4b81fccaf82b714e07eae4402c", size = 160736, upload-time = "2026-03-25T20:21:53.674Z" },
+ { url = "https://files.pythonhosted.org/packages/02/e0/3630057d8eb170310785723ed5adcdfb7d50cb7e6455f85ba8a3deed642b/tomli-2.4.1-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1d8591993e228b0c930c4bb0db464bdad97b3289fb981255d6c9a41aedc84b2d", size = 270717, upload-time = "2026-03-25T20:21:55.129Z" },
+ { url = "https://files.pythonhosted.org/packages/7a/b4/1613716072e544d1a7891f548d8f9ec6ce2faf42ca65acae01d76ea06bb0/tomli-2.4.1-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:734e20b57ba95624ecf1841e72b53f6e186355e216e5412de414e3c51e5e3c41", size = 278461, upload-time = "2026-03-25T20:21:56.228Z" },
+ { url = "https://files.pythonhosted.org/packages/05/38/30f541baf6a3f6df77b3df16b01ba319221389e2da59427e221ef417ac0c/tomli-2.4.1-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:8a650c2dbafa08d42e51ba0b62740dae4ecb9338eefa093aa5c78ceb546fcd5c", size = 274855, upload-time = "2026-03-25T20:21:57.653Z" },
+ { url = "https://files.pythonhosted.org/packages/77/a3/ec9dd4fd2c38e98de34223b995a3b34813e6bdadf86c75314c928350ed14/tomli-2.4.1-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:504aa796fe0569bb43171066009ead363de03675276d2d121ac1a4572397870f", size = 283144, upload-time = "2026-03-25T20:21:59.089Z" },
+ { url = "https://files.pythonhosted.org/packages/ef/be/605a6261cac79fba2ec0c9827e986e00323a1945700969b8ee0b30d85453/tomli-2.4.1-cp314-cp314t-win32.whl", hash = "sha256:b1d22e6e9387bf4739fbe23bfa80e93f6b0373a7f1b96c6227c32bef95a4d7a8", size = 108683, upload-time = "2026-03-25T20:22:00.214Z" },
+ { url = "https://files.pythonhosted.org/packages/12/64/da524626d3b9cc40c168a13da8335fe1c51be12c0a63685cc6db7308daae/tomli-2.4.1-cp314-cp314t-win_amd64.whl", hash = "sha256:2c1c351919aca02858f740c6d33adea0c5deea37f9ecca1cc1ef9e884a619d26", size = 121196, upload-time = "2026-03-25T20:22:01.169Z" },
+ { url = "https://files.pythonhosted.org/packages/5a/cd/e80b62269fc78fc36c9af5a6b89c835baa8af28ff5ad28c7028d60860320/tomli-2.4.1-cp314-cp314t-win_arm64.whl", hash = "sha256:eab21f45c7f66c13f2a9e0e1535309cee140182a9cdae1e041d02e47291e8396", size = 100393, upload-time = "2026-03-25T20:22:02.137Z" },
+ { url = "https://files.pythonhosted.org/packages/7b/61/cceae43728b7de99d9b847560c262873a1f6c98202171fd5ed62640b494b/tomli-2.4.1-py3-none-any.whl", hash = "sha256:0d85819802132122da43cb86656f8d1f8c6587d54ae7dcaf30e90533028b49fe", size = 14583, upload-time = "2026-03-25T20:22:03.012Z" },
+]
+
+[[package]]
+name = "torch"
+version = "2.7.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "filelock" },
+ { name = "fsspec" },
+ { name = "jinja2" },
+ { name = "networkx" },
+ { name = "nvidia-cublas-cu12", marker = "platform_machine == 'x86_64' and sys_platform == 'linux'" },
+ { name = "nvidia-cuda-cupti-cu12", marker = "platform_machine == 'x86_64' and sys_platform == 'linux'" },
+ { name = "nvidia-cuda-nvrtc-cu12", marker = "platform_machine == 'x86_64' and sys_platform == 'linux'" },
+ { name = "nvidia-cuda-runtime-cu12", marker = "platform_machine == 'x86_64' and sys_platform == 'linux'" },
+ { name = "nvidia-cudnn-cu12", marker = "platform_machine == 'x86_64' and sys_platform == 'linux'" },
+ { name = "nvidia-cufft-cu12", marker = "platform_machine == 'x86_64' and sys_platform == 'linux'" },
+ { name = "nvidia-cufile-cu12", marker = "platform_machine == 'x86_64' and sys_platform == 'linux'" },
+ { name = "nvidia-curand-cu12", marker = "platform_machine == 'x86_64' and sys_platform == 'linux'" },
+ { name = "nvidia-cusolver-cu12", marker = "platform_machine == 'x86_64' and sys_platform == 'linux'" },
+ { name = "nvidia-cusparse-cu12", marker = "platform_machine == 'x86_64' and sys_platform == 'linux'" },
+ { name = "nvidia-cusparselt-cu12", marker = "platform_machine == 'x86_64' and sys_platform == 'linux'" },
+ { name = "nvidia-nccl-cu12", marker = "platform_machine == 'x86_64' and sys_platform == 'linux'" },
+ { name = "nvidia-nvjitlink-cu12", marker = "platform_machine == 'x86_64' and sys_platform == 'linux'" },
+ { name = "nvidia-nvtx-cu12", marker = "platform_machine == 'x86_64' and sys_platform == 'linux'" },
+ { name = "setuptools", marker = "python_full_version >= '3.12'" },
+ { name = "sympy" },
+ { name = "triton", marker = "platform_machine == 'x86_64' and sys_platform == 'linux'" },
+ { name = "typing-extensions" },
+]
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/6a/27/2e06cb52adf89fe6e020963529d17ed51532fc73c1e6d1b18420ef03338c/torch-2.7.1-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:a103b5d782af5bd119b81dbcc7ffc6fa09904c423ff8db397a1e6ea8fd71508f", size = 99089441, upload-time = "2025-06-04T17:38:48.268Z" },
+ { url = "https://files.pythonhosted.org/packages/0a/7c/0a5b3aee977596459ec45be2220370fde8e017f651fecc40522fd478cb1e/torch-2.7.1-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:fe955951bdf32d182ee8ead6c3186ad54781492bf03d547d31771a01b3d6fb7d", size = 821154516, upload-time = "2025-06-04T17:36:28.556Z" },
+ { url = "https://files.pythonhosted.org/packages/f9/91/3d709cfc5e15995fb3fe7a6b564ce42280d3a55676dad672205e94f34ac9/torch-2.7.1-cp310-cp310-win_amd64.whl", hash = "sha256:885453d6fba67d9991132143bf7fa06b79b24352f4506fd4d10b309f53454162", size = 216093147, upload-time = "2025-06-04T17:39:38.132Z" },
+ { url = "https://files.pythonhosted.org/packages/92/f6/5da3918414e07da9866ecb9330fe6ffdebe15cb9a4c5ada7d4b6e0a6654d/torch-2.7.1-cp310-none-macosx_11_0_arm64.whl", hash = "sha256:d72acfdb86cee2a32c0ce0101606f3758f0d8bb5f8f31e7920dc2809e963aa7c", size = 68630914, upload-time = "2025-06-04T17:39:31.162Z" },
+ { url = "https://files.pythonhosted.org/packages/11/56/2eae3494e3d375533034a8e8cf0ba163363e996d85f0629441fa9d9843fe/torch-2.7.1-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:236f501f2e383f1cb861337bdf057712182f910f10aeaf509065d54d339e49b2", size = 99093039, upload-time = "2025-06-04T17:39:06.963Z" },
+ { url = "https://files.pythonhosted.org/packages/e5/94/34b80bd172d0072c9979708ccd279c2da2f55c3ef318eceec276ab9544a4/torch-2.7.1-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:06eea61f859436622e78dd0cdd51dbc8f8c6d76917a9cf0555a333f9eac31ec1", size = 821174704, upload-time = "2025-06-04T17:37:03.799Z" },
+ { url = "https://files.pythonhosted.org/packages/50/9e/acf04ff375b0b49a45511c55d188bcea5c942da2aaf293096676110086d1/torch-2.7.1-cp311-cp311-win_amd64.whl", hash = "sha256:8273145a2e0a3c6f9fd2ac36762d6ee89c26d430e612b95a99885df083b04e52", size = 216095937, upload-time = "2025-06-04T17:39:24.83Z" },
+ { url = "https://files.pythonhosted.org/packages/5b/2b/d36d57c66ff031f93b4fa432e86802f84991477e522adcdffd314454326b/torch-2.7.1-cp311-none-macosx_11_0_arm64.whl", hash = "sha256:aea4fc1bf433d12843eb2c6b2204861f43d8364597697074c8d38ae2507f8730", size = 68640034, upload-time = "2025-06-04T17:39:17.989Z" },
+ { url = "https://files.pythonhosted.org/packages/87/93/fb505a5022a2e908d81fe9a5e0aa84c86c0d5f408173be71c6018836f34e/torch-2.7.1-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:27ea1e518df4c9de73af7e8a720770f3628e7f667280bce2be7a16292697e3fa", size = 98948276, upload-time = "2025-06-04T17:39:12.852Z" },
+ { url = "https://files.pythonhosted.org/packages/56/7e/67c3fe2b8c33f40af06326a3d6ae7776b3e3a01daa8f71d125d78594d874/torch-2.7.1-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:c33360cfc2edd976c2633b3b66c769bdcbbf0e0b6550606d188431c81e7dd1fc", size = 821025792, upload-time = "2025-06-04T17:34:58.747Z" },
+ { url = "https://files.pythonhosted.org/packages/a1/37/a37495502bc7a23bf34f89584fa5a78e25bae7b8da513bc1b8f97afb7009/torch-2.7.1-cp312-cp312-win_amd64.whl", hash = "sha256:d8bf6e1856ddd1807e79dc57e54d3335f2b62e6f316ed13ed3ecfe1fc1df3d8b", size = 216050349, upload-time = "2025-06-04T17:38:59.709Z" },
+ { url = "https://files.pythonhosted.org/packages/3a/60/04b77281c730bb13460628e518c52721257814ac6c298acd25757f6a175c/torch-2.7.1-cp312-none-macosx_11_0_arm64.whl", hash = "sha256:787687087412c4bd68d315e39bc1223f08aae1d16a9e9771d95eabbb04ae98fb", size = 68645146, upload-time = "2025-06-04T17:38:52.97Z" },
+ { url = "https://files.pythonhosted.org/packages/66/81/e48c9edb655ee8eb8c2a6026abdb6f8d2146abd1f150979ede807bb75dcb/torch-2.7.1-cp313-cp313-manylinux_2_28_aarch64.whl", hash = "sha256:03563603d931e70722dce0e11999d53aa80a375a3d78e6b39b9f6805ea0a8d28", size = 98946649, upload-time = "2025-06-04T17:38:43.031Z" },
+ { url = "https://files.pythonhosted.org/packages/3a/24/efe2f520d75274fc06b695c616415a1e8a1021d87a13c68ff9dce733d088/torch-2.7.1-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:d632f5417b6980f61404a125b999ca6ebd0b8b4bbdbb5fbbba44374ab619a412", size = 821033192, upload-time = "2025-06-04T17:38:09.146Z" },
+ { url = "https://files.pythonhosted.org/packages/dd/d9/9c24d230333ff4e9b6807274f6f8d52a864210b52ec794c5def7925f4495/torch-2.7.1-cp313-cp313-win_amd64.whl", hash = "sha256:23660443e13995ee93e3d844786701ea4ca69f337027b05182f5ba053ce43b38", size = 216055668, upload-time = "2025-06-04T17:38:36.253Z" },
+ { url = "https://files.pythonhosted.org/packages/95/bf/e086ee36ddcef9299f6e708d3b6c8487c1651787bb9ee2939eb2a7f74911/torch-2.7.1-cp313-cp313t-macosx_14_0_arm64.whl", hash = "sha256:0da4f4dba9f65d0d203794e619fe7ca3247a55ffdcbd17ae8fb83c8b2dc9b585", size = 68925988, upload-time = "2025-06-04T17:38:29.273Z" },
+ { url = "https://files.pythonhosted.org/packages/69/6a/67090dcfe1cf9048448b31555af6efb149f7afa0a310a366adbdada32105/torch-2.7.1-cp313-cp313t-manylinux_2_28_aarch64.whl", hash = "sha256:e08d7e6f21a617fe38eeb46dd2213ded43f27c072e9165dc27300c9ef9570934", size = 99028857, upload-time = "2025-06-04T17:37:50.956Z" },
+ { url = "https://files.pythonhosted.org/packages/90/1c/48b988870823d1cc381f15ec4e70ed3d65e043f43f919329b0045ae83529/torch-2.7.1-cp313-cp313t-manylinux_2_28_x86_64.whl", hash = "sha256:30207f672328a42df4f2174b8f426f354b2baa0b7cca3a0adb3d6ab5daf00dc8", size = 821098066, upload-time = "2025-06-04T17:37:33.939Z" },
+ { url = "https://files.pythonhosted.org/packages/7b/eb/10050d61c9d5140c5dc04a89ed3257ef1a6b93e49dd91b95363d757071e0/torch-2.7.1-cp313-cp313t-win_amd64.whl", hash = "sha256:79042feca1c634aaf6603fe6feea8c6b30dfa140a6bbc0b973e2260c7e79a22e", size = 216336310, upload-time = "2025-06-04T17:36:09.862Z" },
+ { url = "https://files.pythonhosted.org/packages/b1/29/beb45cdf5c4fc3ebe282bf5eafc8dfd925ead7299b3c97491900fe5ed844/torch-2.7.1-cp313-none-macosx_11_0_arm64.whl", hash = "sha256:988b0cbc4333618a1056d2ebad9eb10089637b659eb645434d0809d8d937b946", size = 68645708, upload-time = "2025-06-04T17:34:39.852Z" },
+]
+
+[[package]]
+name = "tqdm"
+version = "4.67.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "colorama", marker = "sys_platform == 'win32'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/e8/4f/0153c21dc5779a49a0598c445b1978126b1344bab9ee71e53e44877e14e0/tqdm-4.67.0.tar.gz", hash = "sha256:fe5a6f95e6fe0b9755e9469b77b9c3cf850048224ecaa8293d7d2d31f97d869a", size = 169739, upload-time = "2024-11-06T16:35:37.677Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/2b/78/57043611a16c655c8350b4c01b8d6abfb38cc2acb475238b62c2146186d7/tqdm-4.67.0-py3-none-any.whl", hash = "sha256:0cd8af9d56911acab92182e88d763100d4788bdf421d251616040cc4d44863be", size = 78590, upload-time = "2024-11-06T16:35:35.023Z" },
+]
+
+[[package]]
+name = "transformers"
+version = "4.53.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "filelock" },
+ { name = "huggingface-hub" },
+ { name = "numpy", version = "2.2.6", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.11'" },
+ { name = "numpy", version = "2.3.3", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.11'" },
+ { name = "packaging" },
+ { name = "pyyaml" },
+ { name = "regex" },
+ { name = "requests" },
+ { name = "safetensors" },
+ { name = "tokenizers" },
+ { name = "tqdm" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/9f/2c/68a0024c311db41bb92d4ec17d22e90b7406a4d28aa18d87662f2bbebcd9/transformers-4.53.1.tar.gz", hash = "sha256:da5a9f66ad480bc2a7f75bc32eaf735fd20ac56af4325ca4ce994021ceb37710", size = 9192189, upload-time = "2025-07-04T08:28:40.571Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/8d/10/8cef2288810a3210659eb3a20711e8387cc35a881a7762ae387806e2d651/transformers-4.53.1-py3-none-any.whl", hash = "sha256:c84f3c3e41c71fdf2c60c8a893e1cd31191b0cb463385f4c276302d2052d837b", size = 10825681, upload-time = "2025-07-04T08:28:37.318Z" },
+]
+
+[[package]]
+name = "triton"
+version = "3.3.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "setuptools" },
+]
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/8d/a9/549e51e9b1b2c9b854fd761a1d23df0ba2fbc60bd0c13b489ffa518cfcb7/triton-3.3.1-cp310-cp310-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b74db445b1c562844d3cfad6e9679c72e93fdfb1a90a24052b03bb5c49d1242e", size = 155600257, upload-time = "2025-05-29T23:39:36.085Z" },
+ { url = "https://files.pythonhosted.org/packages/21/2f/3e56ea7b58f80ff68899b1dbe810ff257c9d177d288c6b0f55bf2fe4eb50/triton-3.3.1-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b31e3aa26f8cb3cc5bf4e187bf737cbacf17311e1112b781d4a059353dfd731b", size = 155689937, upload-time = "2025-05-29T23:39:44.182Z" },
+ { url = "https://files.pythonhosted.org/packages/24/5f/950fb373bf9c01ad4eb5a8cd5eaf32cdf9e238c02f9293557a2129b9c4ac/triton-3.3.1-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9999e83aba21e1a78c1f36f21bce621b77bcaa530277a50484a7cb4a822f6e43", size = 155669138, upload-time = "2025-05-29T23:39:51.771Z" },
+ { url = "https://files.pythonhosted.org/packages/74/1f/dfb531f90a2d367d914adfee771babbd3f1a5b26c3f5fbc458dee21daa78/triton-3.3.1-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b89d846b5a4198317fec27a5d3a609ea96b6d557ff44b56c23176546023c4240", size = 155673035, upload-time = "2025-05-29T23:40:02.468Z" },
+ { url = "https://files.pythonhosted.org/packages/28/71/bd20ffcb7a64c753dc2463489a61bf69d531f308e390ad06390268c4ea04/triton-3.3.1-cp313-cp313t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a3198adb9d78b77818a5388bff89fa72ff36f9da0bc689db2f0a651a67ce6a42", size = 155735832, upload-time = "2025-05-29T23:40:10.522Z" },
+]
+
+[[package]]
+name = "typer"
+version = "0.20.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "click" },
+ { name = "rich" },
+ { name = "shellingham" },
+ { name = "typing-extensions" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/8f/28/7c85c8032b91dbe79725b6f17d2fffc595dff06a35c7a30a37bef73a1ab4/typer-0.20.0.tar.gz", hash = "sha256:1aaf6494031793e4876fb0bacfa6a912b551cf43c1e63c800df8b1a866720c37", size = 106492, upload-time = "2025-10-20T17:03:49.445Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/78/64/7713ffe4b5983314e9d436a90d5bd4f63b6054e2aca783a3cfc44cb95bbf/typer-0.20.0-py3-none-any.whl", hash = "sha256:5b463df6793ec1dca6213a3cf4c0f03bc6e322ac5e16e13ddd622a889489784a", size = 47028, upload-time = "2025-10-20T17:03:47.617Z" },
+]
+
+[[package]]
+name = "typing-extensions"
+version = "4.15.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/72/94/1a15dd82efb362ac84269196e94cf00f187f7ed21c242792a923cdb1c61f/typing_extensions-4.15.0.tar.gz", hash = "sha256:0cea48d173cc12fa28ecabc3b837ea3cf6f38c6d1136f85cbaaf598984861466", size = 109391, upload-time = "2025-08-25T13:49:26.313Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/18/67/36e9267722cc04a6b9f15c7f3441c2363321a3ea07da7ae0c0707beb2a9c/typing_extensions-4.15.0-py3-none-any.whl", hash = "sha256:f0fa19c6845758ab08074a0cfa8b7aecb71c999ca73d62883bc25cc018c4e548", size = 44614, upload-time = "2025-08-25T13:49:24.86Z" },
+]
+
+[[package]]
+name = "typing-inspection"
+version = "0.4.2"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "typing-extensions" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/55/e3/70399cb7dd41c10ac53367ae42139cf4b1ca5f36bb3dc6c9d33acdb43655/typing_inspection-0.4.2.tar.gz", hash = "sha256:ba561c48a67c5958007083d386c3295464928b01faa735ab8547c5692e87f464", size = 75949, upload-time = "2025-10-01T02:14:41.687Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/dc/9b/47798a6c91d8bdb567fe2698fe81e0c6b7cb7ef4d13da4114b41d239f65d/typing_inspection-0.4.2-py3-none-any.whl", hash = "sha256:4ed1cacbdc298c220f1bd249ed5287caa16f34d44ef4e9c3d0cbad5b521545e7", size = 14611, upload-time = "2025-10-01T02:14:40.154Z" },
+]
+
+[[package]]
+name = "tzdata"
+version = "2025.2"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/95/32/1a225d6164441be760d75c2c42e2780dc0873fe382da3e98a2e1e48361e5/tzdata-2025.2.tar.gz", hash = "sha256:b60a638fcc0daffadf82fe0f57e53d06bdec2f36c4df66280ae79bce6bd6f2b9", size = 196380, upload-time = "2025-03-23T13:54:43.652Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/5c/23/c7abc0ca0a1526a0774eca151daeb8de62ec457e77262b66b359c3c7679e/tzdata-2025.2-py2.py3-none-any.whl", hash = "sha256:1a403fada01ff9221ca8044d701868fa132215d84beb92242d9acd2147f667a8", size = 347839, upload-time = "2025-03-23T13:54:41.845Z" },
+]
+
+[[package]]
+name = "urllib3"
+version = "2.5.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/15/22/9ee70a2574a4f4599c47dd506532914ce044817c7752a79b6a51286319bc/urllib3-2.5.0.tar.gz", hash = "sha256:3fc47733c7e419d4bc3f6b3dc2b4f890bb743906a30d56ba4a5bfa4bbff92760", size = 393185, upload-time = "2025-06-18T14:07:41.644Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/a7/c2/fe1e52489ae3122415c51f387e221dd0773709bad6c6cdaa599e8a2c5185/urllib3-2.5.0-py3-none-any.whl", hash = "sha256:e6b01673c0fa6a13e374b50871808eb3bf7046c4b125b216f6bf1cc604cff0dc", size = 129795, upload-time = "2025-06-18T14:07:40.39Z" },
+]
+
+[[package]]
+name = "wasabi"
+version = "1.1.3"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "colorama", marker = "sys_platform == 'win32'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/ac/f9/054e6e2f1071e963b5e746b48d1e3727470b2a490834d18ad92364929db3/wasabi-1.1.3.tar.gz", hash = "sha256:4bb3008f003809db0c3e28b4daf20906ea871a2bb43f9914197d540f4f2e0878", size = 30391, upload-time = "2024-05-31T16:56:18.99Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/06/7c/34330a89da55610daa5f245ddce5aab81244321101614751e7537f125133/wasabi-1.1.3-py3-none-any.whl", hash = "sha256:f76e16e8f7e79f8c4c8be49b4024ac725713ab10cd7f19350ad18a8e3f71728c", size = 27880, upload-time = "2024-05-31T16:56:16.699Z" },
+]
+
+[[package]]
+name = "wcwidth"
+version = "0.2.14"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/24/30/6b0809f4510673dc723187aeaf24c7f5459922d01e2f794277a3dfb90345/wcwidth-0.2.14.tar.gz", hash = "sha256:4d478375d31bc5395a3c55c40ccdf3354688364cd61c4f6adacaa9215d0b3605", size = 102293, upload-time = "2025-09-22T16:29:53.023Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/af/b5/123f13c975e9f27ab9c0770f514345bd406d0e8d3b7a0723af9d43f710af/wcwidth-0.2.14-py2.py3-none-any.whl", hash = "sha256:a7bb560c8aee30f9957e5f9895805edd20602f2d7f720186dfd906e82b4982e1", size = 37286, upload-time = "2025-09-22T16:29:51.641Z" },
+]
+
+[[package]]
+name = "weasel"
+version = "0.4.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "cloudpathlib" },
+ { name = "confection" },
+ { name = "packaging" },
+ { name = "pydantic" },
+ { name = "requests" },
+ { name = "smart-open" },
+ { name = "srsly" },
+ { name = "typer" },
+ { name = "wasabi" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/a7/1a/9c522dd61b52939c217925d3e55c95f9348b73a66a956f52608e1e59a2c0/weasel-0.4.1.tar.gz", hash = "sha256:aabc210f072e13f6744e5c3a28037f93702433405cd35673f7c6279147085aa9", size = 38417, upload-time = "2024-05-15T08:52:54.765Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/2a/87/abd57374044e1f627f0a905ac33c1a7daab35a3a815abfea4e1bafd3fdb1/weasel-0.4.1-py3-none-any.whl", hash = "sha256:24140a090ea1ac512a2b2f479cc64192fd1d527a7f3627671268d08ed5ac418c", size = 50270, upload-time = "2024-05-15T08:52:52.977Z" },
+]
+
+[[package]]
+name = "wordfreq"
+version = "3.1.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "ftfy" },
+ { name = "langcodes" },
+ { name = "locate" },
+ { name = "msgpack" },
+ { name = "regex" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/4c/cd/9581ff0ea2c581012d0caae4bba024f3ff6b46e030a55ddec1ce545e2caf/wordfreq-3.1.1.tar.gz", hash = "sha256:7943098975f25c2a70e1151ee5a62083b14a5f86f6cc5703cc9526f716ceb408", size = 56846562, upload-time = "2023-11-21T23:09:12.106Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/24/61/62835c475d69872d30689f284497853fe33fe1d6dd18f57346d13305861d/wordfreq-3.1.1-py3-none-any.whl", hash = "sha256:4b1c6ecffc6198be3396d5cf871c4423ca71c907c231348d352dd54d62b97473", size = 56834549, upload-time = "2023-11-21T23:09:07.183Z" },
+]
+
+[[package]]
+name = "wrapt"
+version = "2.0.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/49/19/5e5bcd855d808892fe02d49219f97a50f64cd6d8313d75df3494ee97b1a3/wrapt-2.0.0.tar.gz", hash = "sha256:35a542cc7a962331d0279735c30995b024e852cf40481e384fd63caaa391cbb9", size = 81722, upload-time = "2025-10-19T23:47:54.07Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/ee/db/ac9546e89b645e525686727f8749847485e3b45ffc4507b61c4669358638/wrapt-2.0.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:a7cebcee61f21b1e46aa32db8d9d93826d0fbf1ad85defc2ccfb93b4adef1435", size = 77431, upload-time = "2025-10-19T23:45:25.177Z" },
+ { url = "https://files.pythonhosted.org/packages/74/bc/3b57c8012bbd0d02eec5ae838681c1a819df6c5e765ebc897f52623b5eb1/wrapt-2.0.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:827e6e3a3a560f6ec1f5ee92d4319c21a0549384f896ec692f3201eda31ebd11", size = 60644, upload-time = "2025-10-19T23:45:27.511Z" },
+ { url = "https://files.pythonhosted.org/packages/b8/6e/b5e7d47713e3d46c30ec6ae83fafd369bc34de8148668c6e3168d9301863/wrapt-2.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:1a91075a5383a7cbfe46aed1845ef7c3f027e8e20e7d9a8a75e36ebc9b0dd15e", size = 61526, upload-time = "2025-10-19T23:45:28.789Z" },
+ { url = "https://files.pythonhosted.org/packages/28/8d/d5df2af58ae479785473607a3b25726c295640cdcaee830847cee339eff9/wrapt-2.0.0-cp310-cp310-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:b6a18c813196e18146b8d041e20875bdb0cb09b94ac1d1e1146e0fa87b2deb0d", size = 113638, upload-time = "2025-10-19T23:45:31.977Z" },
+ { url = "https://files.pythonhosted.org/packages/f9/b7/9501c45ab93b4d6ba396ef02fcfb55867866bc8579fff045bb54cae58423/wrapt-2.0.0-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ec5028d26011a53c76bd91bb6198b30b438c6e0f7adb45f2ad84fe2655b6a104", size = 115651, upload-time = "2025-10-19T23:45:33.257Z" },
+ { url = "https://files.pythonhosted.org/packages/5e/3a/bfebe2ba51cf98ae80c5dbb6fa5892ae75d1acf1a4c404eda88e28f5ab06/wrapt-2.0.0-cp310-cp310-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:bed9b04900204721a24bcefc652ca267b01c1e8ad8bc8c0cff81558a45a3aadc", size = 112060, upload-time = "2025-10-19T23:45:30.298Z" },
+ { url = "https://files.pythonhosted.org/packages/00/e7/cd50a32bed022d98f61a90e57faf782aa063f7930f57eb67eb105d3189be/wrapt-2.0.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:03442f2b45fa3f2b98a94a1917f52fb34670de8f96c0a009c02dbd512d855a3d", size = 114829, upload-time = "2025-10-19T23:45:34.23Z" },
+ { url = "https://files.pythonhosted.org/packages/9d/2c/c709578271df0c70a27ab8f797c44c258650f24a32b452f03d7afedc070d/wrapt-2.0.0-cp310-cp310-musllinux_1_2_riscv64.whl", hash = "sha256:17d0b5c42495ba142a1cee52b76414f9210591c84aae94dffda70240753bfb3c", size = 111249, upload-time = "2025-10-19T23:45:35.554Z" },
+ { url = "https://files.pythonhosted.org/packages/60/ef/cb58f6eea41f129600bda68d1ae4c80b14d4e0663eec1d5220cbffe50be5/wrapt-2.0.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:ee44215e7d13e112a8fc74e12ed1a1f41cab2bc07b11cc703f2398cd114b261c", size = 113312, upload-time = "2025-10-19T23:45:36.66Z" },
+ { url = "https://files.pythonhosted.org/packages/59/55/97e6c4e1c175fb27f8dec717a3e36493ff0c4e50173a95f439496556910f/wrapt-2.0.0-cp310-cp310-win32.whl", hash = "sha256:fe6eafac3bc3c957ab6597a0c0654a0a308868458d00d218743e5b5fae51951c", size = 57961, upload-time = "2025-10-19T23:45:40.958Z" },
+ { url = "https://files.pythonhosted.org/packages/3b/0a/898b1d81ae1f3dd9a79fd2e0330a7c8dd793982f815a318548777cb21ee5/wrapt-2.0.0-cp310-cp310-win_amd64.whl", hash = "sha256:9e070c3491397fba0445b8977900271eca9656570cca7c900d9b9352186703a0", size = 60311, upload-time = "2025-10-19T23:45:38.033Z" },
+ { url = "https://files.pythonhosted.org/packages/44/f1/e7e92f9535f5624ee22879f09456df9d1f1ae9bb338eef711077b48e456a/wrapt-2.0.0-cp310-cp310-win_arm64.whl", hash = "sha256:806e2e73186eb5e3546f39fb5d0405040e0088db0fc8b2f667fd1863de2b3c99", size = 58822, upload-time = "2025-10-19T23:45:39.785Z" },
+ { url = "https://files.pythonhosted.org/packages/12/8f/8e4c8b6da60b4205191d588cbac448fb9ff4f5ed89f4e555dc4813ab30cf/wrapt-2.0.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:b7e221abb6c5387819db9323dac3c875b459695057449634f1111955d753c621", size = 77433, upload-time = "2025-10-19T23:45:42.543Z" },
+ { url = "https://files.pythonhosted.org/packages/22/9a/01a29ccb029aa8e78241f8b53cb89ae8826c240129abbbb6ebba3416eff9/wrapt-2.0.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:1147a84c8fc852426580af8b6e33138461ddbc65aa459a25ea539374d32069fa", size = 60641, upload-time = "2025-10-19T23:45:43.866Z" },
+ { url = "https://files.pythonhosted.org/packages/3d/ec/e058997971428b7665b5c3665a55b18bb251ea7e08d002925e3ca017c020/wrapt-2.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:5d6691d4a711504a0bc10de789842ad6ac627bed22937b10f37a1211a8ab7bb3", size = 61526, upload-time = "2025-10-19T23:45:44.839Z" },
+ { url = "https://files.pythonhosted.org/packages/70/c3/c82263503f554715aa1847e85dc75a69631a54e9d7ab0f1a55e34a22d44a/wrapt-2.0.0-cp311-cp311-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:f460e1eb8e75a17c3918c8e35ba57625721eef2439ef0bcf05304ac278a65e1d", size = 114069, upload-time = "2025-10-19T23:45:47.223Z" },
+ { url = "https://files.pythonhosted.org/packages/dc/97/d95e88a3a1bc2890a1aa47880c2762cf0eb6d231b5a64048e351cec6f071/wrapt-2.0.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:12c37784b77bf043bf65cc96c7195a5db474b8e54173208af076bdbb61df7b3e", size = 116109, upload-time = "2025-10-19T23:45:48.252Z" },
+ { url = "https://files.pythonhosted.org/packages/dc/36/cba0bf954f2303897b80fa5342499b43f8c5201110dddf0d578d6841b149/wrapt-2.0.0-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:75e5c049eb583835f7a0e0e311d9dde9bfbaac723a6dd89d052540f9b2809977", size = 112500, upload-time = "2025-10-19T23:45:45.838Z" },
+ { url = "https://files.pythonhosted.org/packages/d7/2b/8cb88e63bec989f641d208acb3fd198bfdbbb4ef7dfb71f0cac3c90b07a9/wrapt-2.0.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:e50bcbd5b65dac21b82319fcf18486e6ac439947e9305034b00704eb7405f553", size = 115356, upload-time = "2025-10-19T23:45:49.249Z" },
+ { url = "https://files.pythonhosted.org/packages/bb/60/a6d5fb94648cd430648705bef9f4241bd22ead123ead552b6d2873ad5240/wrapt-2.0.0-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:06b78cb6b9320f57737a52fede882640d93cface98332d1a3df0c5696ec9ae9f", size = 111754, upload-time = "2025-10-19T23:45:51.21Z" },
+ { url = "https://files.pythonhosted.org/packages/d0/44/1963854edf0592ae806307899dc7bf891e76cec19e598f55845c94603a65/wrapt-2.0.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:8c8349ebfc3cd98bc9105e0112dd8c8ac1f3c7cb5601f9d02248cae83a63f748", size = 113789, upload-time = "2025-10-19T23:45:52.473Z" },
+ { url = "https://files.pythonhosted.org/packages/62/ec/4b1d76cb6d96ac511aaaa92efc57f528e57f06082a595b8b2663fcdb0f20/wrapt-2.0.0-cp311-cp311-win32.whl", hash = "sha256:028f19ec29e204fe725139d4a8b09f77ecfb64f8f02b7ab5ee822c85e330b68b", size = 57954, upload-time = "2025-10-19T23:45:57.03Z" },
+ { url = "https://files.pythonhosted.org/packages/d4/cf/df8ff9bd64d4a75f9a9f6c1c93480a51904d0c9bd71c11994301c47d8a33/wrapt-2.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:c6961f05e58d919153ba311b397b7b904b907132b7b8344dde47865d4bb5ec89", size = 60308, upload-time = "2025-10-19T23:45:54.314Z" },
+ { url = "https://files.pythonhosted.org/packages/69/d8/61e245fe387d58d84b3f913d5da9d909c4f239b887db692a05105aaf2a1b/wrapt-2.0.0-cp311-cp311-win_arm64.whl", hash = "sha256:be7e316c2accd5a31dbcc230de19e2a846a325f8967fdea72704d00e38e6af06", size = 58822, upload-time = "2025-10-19T23:45:55.772Z" },
+ { url = "https://files.pythonhosted.org/packages/3c/28/7f266b5bf50c3ad0c99c524d99faa0f7d6eecb045d950e7d2c9e1f0e1338/wrapt-2.0.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:73c6f734aecb1a030d9a265c13a425897e1ea821b73249bb14471445467ca71c", size = 78078, upload-time = "2025-10-19T23:45:58.855Z" },
+ { url = "https://files.pythonhosted.org/packages/06/0c/bbdcad7eb535fae9d6b0fcfa3995c364797cd8e2b423bba5559ab2d88dcf/wrapt-2.0.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:b4a7f8023b8ce8a36370154733c747f8d65c8697cb977d8b6efeb89291fff23e", size = 61158, upload-time = "2025-10-19T23:46:00.096Z" },
+ { url = "https://files.pythonhosted.org/packages/d3/8a/bba3e7a4ebf4d1624103ee59d97b78a1fbb08fb5753ff5d1b69f5ef5e863/wrapt-2.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:a1cb62f686c50e9dab5983c68f6c8e9cbf14a6007935e683662898a7d892fa69", size = 61646, upload-time = "2025-10-19T23:46:01.279Z" },
+ { url = "https://files.pythonhosted.org/packages/ff/0c/0f565294897a72493dbafe7b46229b5f09f3776795a894d6b737e98387de/wrapt-2.0.0-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:43dc0550ae15e33e6bb45a82a5e1b5495be2587fbaa996244b509921810ee49f", size = 121442, upload-time = "2025-10-19T23:46:04.287Z" },
+ { url = "https://files.pythonhosted.org/packages/da/80/7f03501a8a078ad79b19b1a888f9192a9494e62ddf8985267902766a4f30/wrapt-2.0.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:39c5b45b056d630545e40674d1f5e1b51864b3546f25ab6a4a331943de96262e", size = 123018, upload-time = "2025-10-19T23:46:06.052Z" },
+ { url = "https://files.pythonhosted.org/packages/37/6b/ad0e1ff98359f13b4b0c2c52848e792841146fe79ac5f56899b9a028fc0d/wrapt-2.0.0-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:804e88f824b76240a1b670330637ccfd2d18b9efa3bb4f02eb20b2f64880b324", size = 117369, upload-time = "2025-10-19T23:46:02.53Z" },
+ { url = "https://files.pythonhosted.org/packages/ac/6c/a90437bba8cb1ce2ed639af979515e09784678c2a7f4ffc79f2cf7de809e/wrapt-2.0.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:c2c476aa3fc2b9899c3f7b20963fac4f952e7edb74a31fc92f7745389a2e3618", size = 121453, upload-time = "2025-10-19T23:46:07.747Z" },
+ { url = "https://files.pythonhosted.org/packages/2c/a9/b3982f9bd15bd45857a23c48b7c36e47d05db4a4dcc5061c31f169238845/wrapt-2.0.0-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:8d851e526891216f89fcb7a1820dad9bd503ba3468fb9635ee28e93c781aa98e", size = 116250, upload-time = "2025-10-19T23:46:09.385Z" },
+ { url = "https://files.pythonhosted.org/packages/73/e2/b7a8b1afac9f791d8f5eac0d9726559f1d7ec4a2b5a6b4e67ac145b007a5/wrapt-2.0.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b95733c2360c4a8656ee93c7af78e84c0bd617da04a236d7a456c8faa34e7a2d", size = 120575, upload-time = "2025-10-19T23:46:11.882Z" },
+ { url = "https://files.pythonhosted.org/packages/a2/0f/37920eeea96094f450ae35505d39f1135df951a2cdee0d4e01d4f843396a/wrapt-2.0.0-cp312-cp312-win32.whl", hash = "sha256:ea56817176834edf143df1109ae8fdaa087be82fdad3492648de0baa8ae82bf2", size = 58175, upload-time = "2025-10-19T23:46:15.678Z" },
+ { url = "https://files.pythonhosted.org/packages/f0/db/b395f3b0c7f2c60d9219afacc54ceb699801ccf2d3d969ba556dc6d3af20/wrapt-2.0.0-cp312-cp312-win_amd64.whl", hash = "sha256:3c7d3bee7be7a2665286103f4d1f15405c8074e6e1f89dac5774f9357c9a3809", size = 60415, upload-time = "2025-10-19T23:46:12.913Z" },
+ { url = "https://files.pythonhosted.org/packages/86/22/33d660214548af47fc59d9eec8c0e0693bcedc5b3a0b52e8cbdd61f3b646/wrapt-2.0.0-cp312-cp312-win_arm64.whl", hash = "sha256:680f707e1d26acbc60926659799b15659f077df5897a6791c7c598a5d4a211c4", size = 58911, upload-time = "2025-10-19T23:46:13.889Z" },
+ { url = "https://files.pythonhosted.org/packages/18/0a/dd88abfe756b1aa79f0777e5ee4ce9e4b5dc4999bd805e9b04b52efc7b18/wrapt-2.0.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:e2ea096db28d5eb64d381af0e93464621ace38a7003a364b6b5ffb7dd713aabe", size = 78083, upload-time = "2025-10-19T23:46:16.937Z" },
+ { url = "https://files.pythonhosted.org/packages/7f/b9/8afebc1655a863bb2178b23c2d699b8743f3a7dab466904adc6155f3c858/wrapt-2.0.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:c92b5a82d28491e3f14f037e1aae99a27a5e6e0bb161e65f52c0445a3fa7c940", size = 61156, upload-time = "2025-10-19T23:46:17.927Z" },
+ { url = "https://files.pythonhosted.org/packages/bb/8b/f710a6528ccc52e21943f42c8cf64814cde90f9adbd3bcd58c7c274b4f75/wrapt-2.0.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:81d234718aabe632d179fac52c7f69f0f99fbaac4d4bcd670e62462bbcbfcad7", size = 61641, upload-time = "2025-10-19T23:46:19.229Z" },
+ { url = "https://files.pythonhosted.org/packages/e4/5f/e4eabd0cc6684c5b208c2abc5c3459449c4d15be1694a9bbcf51e0e135fd/wrapt-2.0.0-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:db2eea83c43f84e4e41dbbb4c1de371a53166e55f900a6b130c3ef51c6345c1a", size = 121454, upload-time = "2025-10-19T23:46:21.808Z" },
+ { url = "https://files.pythonhosted.org/packages/6f/c4/ec31ee17cc7866960d323609ba7402be786d211a6d713a59f776c4270bb3/wrapt-2.0.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:65f50e356c425c061e1e17fe687ff30e294fed9bf3441dc1f13ef73859c2a817", size = 123063, upload-time = "2025-10-19T23:46:23.545Z" },
+ { url = "https://files.pythonhosted.org/packages/b0/2b/a4b10c3c0022e40aeae9bec009bafb049f440493f0575ebb27ecf61c32f8/wrapt-2.0.0-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:887f2a667e3cbfb19e204032d42ad7dedaa43972e4861dc7a3d51ae951d9b578", size = 117401, upload-time = "2025-10-19T23:46:20.433Z" },
+ { url = "https://files.pythonhosted.org/packages/2a/4a/ade23a76967e1f148e461076a4d0e24a7950a5f18b394c9107fe60224ae2/wrapt-2.0.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:9054829da4be461e3ad3192e4b6bbf1fc18af64c9975ce613aec191924e004dc", size = 121485, upload-time = "2025-10-19T23:46:24.85Z" },
+ { url = "https://files.pythonhosted.org/packages/cb/ba/33b5f3e2edede4e1cfd259f0d9c203cf370f259bb9b215dd58fc6cbb94e9/wrapt-2.0.0-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:b952ffd77133a5a2798ee3feb18e51b0a299d2f440961e5bb7737dbb02e57289", size = 116276, upload-time = "2025-10-19T23:46:27.006Z" },
+ { url = "https://files.pythonhosted.org/packages/eb/bf/b7f95bb4529a35ca11eb95d48f9d1a563b495471f7cf404c644566fb4293/wrapt-2.0.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:e25fde03c480061b8234d8ee4863eb5f40a9be4fb258ce105b364de38fc6bcf9", size = 120578, upload-time = "2025-10-19T23:46:28.679Z" },
+ { url = "https://files.pythonhosted.org/packages/f8/71/984849df6f052592474a44aafd6b847e1cffad39b0debc5390a04aa46331/wrapt-2.0.0-cp313-cp313-win32.whl", hash = "sha256:49e982b7860d325094978292a49e0418833fc7fc42c0dc7cd0b7524d7d06ee74", size = 58178, upload-time = "2025-10-19T23:46:32.372Z" },
+ { url = "https://files.pythonhosted.org/packages/f9/3b/4e1fc0f2e1355fbc55ab248311bf4c958dbbd96bd9183b9e96882cc16213/wrapt-2.0.0-cp313-cp313-win_amd64.whl", hash = "sha256:6e5c86389d9964050ce50babe247d172a5e3911d59a64023b90db2b4fa00ae7c", size = 60423, upload-time = "2025-10-19T23:46:30.041Z" },
+ { url = "https://files.pythonhosted.org/packages/20/0a/9384e0551f56fe361f41bb8f209a13bb9ef689c3a18264225b249849b12c/wrapt-2.0.0-cp313-cp313-win_arm64.whl", hash = "sha256:b96fdaa4611e05c7231937930567d3c16782be9dbcf03eb9f60d83e57dd2f129", size = 58918, upload-time = "2025-10-19T23:46:31.056Z" },
+ { url = "https://files.pythonhosted.org/packages/68/70/37b90d3ee5bf0d0dc4859306383da08b685c9a51abff6fd6b0a7c052e117/wrapt-2.0.0-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:f2c7b7fead096dbf1dcc455b7f59facb05de3f5bfb04f60a69f98cdfe6049e5f", size = 81980, upload-time = "2025-10-19T23:46:33.368Z" },
+ { url = "https://files.pythonhosted.org/packages/95/23/0ce69cc90806b90b3ee4cfd9ad8d2ee9becc3a1aab7df3c3bfc7d0904cb6/wrapt-2.0.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:04c7c8393f25b11c0faa5d907dd9eb462e87e4e7ba55e308a046d7ed37f4bbe2", size = 62900, upload-time = "2025-10-19T23:46:34.415Z" },
+ { url = "https://files.pythonhosted.org/packages/54/76/03ec08170c02f38f3be3646977920976b968e0b704a0693a98f95d02f4d2/wrapt-2.0.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:a93e0f8b376c0735b2f4daf58018b4823614d2b896cb72b6641c4d3dbdca1d75", size = 63636, upload-time = "2025-10-19T23:46:35.643Z" },
+ { url = "https://files.pythonhosted.org/packages/75/c1/04ce0511e504cdcd84cdb6980bc7d4efa38ac358e8103d6dd0cd278bfc6d/wrapt-2.0.0-cp313-cp313t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:b42d13603da4416c43c430dbc6313c8d7ff745c40942f146ed4f6dd02c7d2547", size = 152650, upload-time = "2025-10-19T23:46:38.717Z" },
+ { url = "https://files.pythonhosted.org/packages/17/06/cd2e32b5f744701189c954f9ab5eee449c86695b13f414bb8ea7a83f6d48/wrapt-2.0.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c8bbd2472abf8c33480ad2314b1f8fac45d592aba6cc093e8839a7b2045660e6", size = 158811, upload-time = "2025-10-19T23:46:40.875Z" },
+ { url = "https://files.pythonhosted.org/packages/7d/a2/a6d920695cca62563c1b969064e5cd2051344a6e330c184b6f80383d87e4/wrapt-2.0.0-cp313-cp313t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:e64a3a1fd9a308ab9b815a2ad7a65b679730629dbf85f8fc3f7f970d634ee5df", size = 146033, upload-time = "2025-10-19T23:46:37.351Z" },
+ { url = "https://files.pythonhosted.org/packages/c6/90/7fd2abe4ec646bc43cb6b0d05086be6fcf15e64f06f51fc4198804396d68/wrapt-2.0.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:d61214525eaf88e0d0edf3d1ad5b5889863c6f88e588c6cdc6aa4ee5d1f10a4a", size = 155673, upload-time = "2025-10-19T23:46:42.582Z" },
+ { url = "https://files.pythonhosted.org/packages/5f/8d/6cce7f8c41633e677ac8aa34e84b53a22a645ec2a680deb991785ca2798d/wrapt-2.0.0-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = "sha256:04f7a5f92c5f7324a1735043cc467b1295a1c5b4e0c1395472b7c44706e3dc61", size = 144364, upload-time = "2025-10-19T23:46:44.381Z" },
+ { url = "https://files.pythonhosted.org/packages/72/42/9570349e03afa9d83daf7f33ffb17e8cdc62d7e84c0d09005d0f51912efa/wrapt-2.0.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:2356f76cb99b3de5b4e5b8210367fbbb81c7309fe39b622f5d199dd88eb7f765", size = 150275, upload-time = "2025-10-19T23:46:45.662Z" },
+ { url = "https://files.pythonhosted.org/packages/f2/d8/448728e6fe030e5c4f1022c82cd3af1de1c672fa53d2d5b36b32a55ce7bf/wrapt-2.0.0-cp313-cp313t-win32.whl", hash = "sha256:0a921b657a224e40e4bc161b5d33934583b34f0c9c5bdda4e6ac66f9d2fcb849", size = 59867, upload-time = "2025-10-19T23:46:49.593Z" },
+ { url = "https://files.pythonhosted.org/packages/8f/b1/ad812b1fe1cd85f6498dc3a3c9809a1e880d6108283b1735119bec217041/wrapt-2.0.0-cp313-cp313t-win_amd64.whl", hash = "sha256:c16f6d4eea98080f6659a8a7fc559d4a0a337ee66960659265cad2c8a40f7c0f", size = 63170, upload-time = "2025-10-19T23:46:46.87Z" },
+ { url = "https://files.pythonhosted.org/packages/7f/29/c105b1e76650c82823c491952a7a8eafe09b78944f7a43f22d37ed860229/wrapt-2.0.0-cp313-cp313t-win_arm64.whl", hash = "sha256:52878edc13dc151c58a9966621d67163a80654bc6cff4b2e1c79fa62d0352b26", size = 60339, upload-time = "2025-10-19T23:46:47.862Z" },
+ { url = "https://files.pythonhosted.org/packages/f8/38/0dd39f83163fd28326afba84e3e416656938df07e60a924ac4d992b30220/wrapt-2.0.0-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:79a53d86c2aff7b32cc77267e3a308365d1fcb881e74bc9cbe26f63ee90e37f0", size = 78242, upload-time = "2025-10-19T23:46:51.096Z" },
+ { url = "https://files.pythonhosted.org/packages/08/ef/fa7a5c1d73f8690c712f9d2e4615700c6809942536dd3f441b9ba650a310/wrapt-2.0.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:d731a4f22ed6ffa4cb551b4d2b0c24ff940c27a88edaf8e3490a5ee3a05aef71", size = 61207, upload-time = "2025-10-19T23:46:52.558Z" },
+ { url = "https://files.pythonhosted.org/packages/23/d9/67cb93da492eb0a1cb17b7ed18220d059e58f00467ce6728b674d3441b3d/wrapt-2.0.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:3e02ab8c0ac766a5a6e81cd3b6cc39200c69051826243182175555872522bd5a", size = 61748, upload-time = "2025-10-19T23:46:54.468Z" },
+ { url = "https://files.pythonhosted.org/packages/e5/be/912bbd70cc614f491b526a1d7fe85695b283deed19287b9f32460178c54d/wrapt-2.0.0-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:895870602d65d7338edb3b6a717d856632ad9f14f7ff566214e4fb11f0816649", size = 120424, upload-time = "2025-10-19T23:46:57.575Z" },
+ { url = "https://files.pythonhosted.org/packages/b2/e1/10df8937e7da2aa9bc3662a4b623e51a323c68f42cad7b13f0e61a700ce2/wrapt-2.0.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0b9ad4fab76a0086dc364c4f17f39ad289600e73ef5c6e9ab529aff22cac1ac3", size = 122804, upload-time = "2025-10-19T23:46:59.308Z" },
+ { url = "https://files.pythonhosted.org/packages/f3/60/576751b1919adab9f63168e3b5fd46c0d1565871b1cc4c2569503ccf4be6/wrapt-2.0.0-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:e7ca0562606d7bad2736b2c18f61295d61f50cd3f4bfc51753df13614dbcce1b", size = 117398, upload-time = "2025-10-19T23:46:55.814Z" },
+ { url = "https://files.pythonhosted.org/packages/ec/55/243411f360cc27bae5f8e21c16f1a8d87674c5534f4558e8a97c1e0d1c6f/wrapt-2.0.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:fe089d9f5a4a3dea0108a8ae34bced114d0c4cca417bada1c5e8f42d98af9050", size = 121230, upload-time = "2025-10-19T23:47:01.347Z" },
+ { url = "https://files.pythonhosted.org/packages/d6/23/2f21f692c3b3f0857cb82708ce0c341fbac55a489d4025ae4e3fd5d5de8c/wrapt-2.0.0-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:e761f2d2f8dbc80384af3d547b522a80e67db3e319c7b02e7fd97aded0a8a678", size = 116296, upload-time = "2025-10-19T23:47:02.659Z" },
+ { url = "https://files.pythonhosted.org/packages/bd/ed/678957fad212cfb1b65b2359d62f5619f5087d1d1cf296c6a996be45171c/wrapt-2.0.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:17ba1bdc52d0c783481850996aa26cea5237720769197335abea2ae6b4c23bc0", size = 119602, upload-time = "2025-10-19T23:47:03.775Z" },
+ { url = "https://files.pythonhosted.org/packages/dc/e3/aeb4c3b052d3eed95e61babc20dcb1a512651e098cca4b84a6896585c06a/wrapt-2.0.0-cp314-cp314-win32.whl", hash = "sha256:f73318741b141223a4674ba96992aa2291b1b3f7a5e85cb3c2c964f86171eb45", size = 58649, upload-time = "2025-10-19T23:47:07.382Z" },
+ { url = "https://files.pythonhosted.org/packages/aa/2a/a71c51cb211798405b59172c7df5789a5b934b18317223cf22e0c6f852de/wrapt-2.0.0-cp314-cp314-win_amd64.whl", hash = "sha256:8e08d4edb13cafe7b3260f31d4de033f73d3205774540cf583bffaa4bec97db9", size = 60897, upload-time = "2025-10-19T23:47:04.862Z" },
+ { url = "https://files.pythonhosted.org/packages/f8/a5/acc5628035d06f69e9144cca543ca54c33b42a5a23b6f1e8fa131026db89/wrapt-2.0.0-cp314-cp314-win_arm64.whl", hash = "sha256:af01695c2b7bbd8d67b869d8e3de2b123a7bfbee0185bdd138c2775f75373b83", size = 59306, upload-time = "2025-10-19T23:47:05.883Z" },
+ { url = "https://files.pythonhosted.org/packages/a7/e6/1318ca07d7fcee57e4592a78dacd9d5493b8ddd971c553a62904fb2c0cf2/wrapt-2.0.0-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:057f02c13cce7b26c79624c06a3e1c2353e6dc9708525232232f6768118042ca", size = 81987, upload-time = "2025-10-19T23:47:08.7Z" },
+ { url = "https://files.pythonhosted.org/packages/e7/bf/ffac358ddf61c3923d94a8b0e7620f2af1cd1b637a0fe4963a3919aa62b7/wrapt-2.0.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:79bdd84570267f3f43d609c892ae2d30b91ee4b8614c2cbfd311a2965f1c9bdb", size = 62902, upload-time = "2025-10-19T23:47:10.248Z" },
+ { url = "https://files.pythonhosted.org/packages/b5/af/387c51f9e7b544fe95d852fc94f9f3866e3f7d7d39c2ee65041752f90bc2/wrapt-2.0.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:93c8b4f4d54fd401a817abbfc9bf482aa72fd447f8adf19ce81d035b3f5c762c", size = 63635, upload-time = "2025-10-19T23:47:11.746Z" },
+ { url = "https://files.pythonhosted.org/packages/7c/99/d38d8c80b9cc352531d4d539a17e3674169a5cc25a7e6e5e3c27bc29893e/wrapt-2.0.0-cp314-cp314t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:5e09ffd31001dce71c2c2a4fc201bdba9a2f9f62b23700cf24af42266e784741", size = 152659, upload-time = "2025-10-19T23:47:15.344Z" },
+ { url = "https://files.pythonhosted.org/packages/5a/2a/e154432f274e22ecf2465583386c5ceffa5e0bab3947c1c5b26cc8e7b275/wrapt-2.0.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d87c285ff04e26083c4b03546e7b74df7ba4f1f32f1dcb92e9ac13c2dbb4c379", size = 158818, upload-time = "2025-10-19T23:47:17.569Z" },
+ { url = "https://files.pythonhosted.org/packages/c5/7a/3a40c453300e2898e99c27495b8109ff7cd526997d12cfb8ebd1843199a4/wrapt-2.0.0-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:e52e50ea0a72ea48d1291cf8b8aaedcc99072d9dc5baba6b820486dcf4c67da8", size = 146113, upload-time = "2025-10-19T23:47:13.026Z" },
+ { url = "https://files.pythonhosted.org/packages/9e/e2/3116a9eade8bea2bf5eedba3fa420e3c7d193d4b047440330d8eaf1098de/wrapt-2.0.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:1fd4c95536975895f32571073446e614d5e2810b666b64955586dcddfd438fd3", size = 155689, upload-time = "2025-10-19T23:47:19.397Z" },
+ { url = "https://files.pythonhosted.org/packages/43/1c/277d3fbe9d177830ab9e54fe9253f38455b75a22d639a4bd9fa092d55ae5/wrapt-2.0.0-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:d6ebfe9283209220ed9de80a3e9442aab8fc2be5a9bbf8491b99e02ca9349a89", size = 144403, upload-time = "2025-10-19T23:47:20.779Z" },
+ { url = "https://files.pythonhosted.org/packages/d8/37/ab6ddaf182248aac5ed925725ef4c69a510594764665ecbd95bdd4481f16/wrapt-2.0.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:5d3ebd784804f146b7ea55359beb138e23cc18e5a5cc2cf26ad438723c00ce3a", size = 150307, upload-time = "2025-10-19T23:47:22.604Z" },
+ { url = "https://files.pythonhosted.org/packages/f6/d7/df9e2d8040a3af618ff9496261cf90ca4f886fd226af0f4a69ac0c020c3b/wrapt-2.0.0-cp314-cp314t-win32.whl", hash = "sha256:9b15940ae9debc8b40b15dc57e1ce4433f7fb9d3f8761c7fab1ddd94cb999d99", size = 60557, upload-time = "2025-10-19T23:47:26.73Z" },
+ { url = "https://files.pythonhosted.org/packages/b4/c2/502bd4557a3a9199ea73cc5932cf83354bd362682162f0b14164d2e90216/wrapt-2.0.0-cp314-cp314t-win_amd64.whl", hash = "sha256:7a0efbbc06d3e2077476a04f55859819d23206600b4c33f791359a8e6fa3c362", size = 63988, upload-time = "2025-10-19T23:47:23.826Z" },
+ { url = "https://files.pythonhosted.org/packages/1f/f2/632b13942f45db7af709f346ff38b8992c8c21b004e61ab320b0dec525fe/wrapt-2.0.0-cp314-cp314t-win_arm64.whl", hash = "sha256:7fec8a9455c029c8cf4ff143a53b6e7c463268d42be6c17efa847ebd2f809965", size = 60584, upload-time = "2025-10-19T23:47:25.396Z" },
+ { url = "https://files.pythonhosted.org/packages/00/5c/c34575f96a0a038579683c7f10fca943c15c7946037d1d254ab9db1536ec/wrapt-2.0.0-py3-none-any.whl", hash = "sha256:02482fb0df89857e35427dfb844319417e14fae05878f295ee43fa3bf3b15502", size = 43998, upload-time = "2025-10-19T23:47:52.858Z" },
+]
+
+[[package]]
+name = "xxhash"
+version = "3.6.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/02/84/30869e01909fb37a6cc7e18688ee8bf1e42d57e7e0777636bd47524c43c7/xxhash-3.6.0.tar.gz", hash = "sha256:f0162a78b13a0d7617b2845b90c763339d1f1d82bb04a4b07f4ab535cc5e05d6", size = 85160, upload-time = "2025-10-02T14:37:08.097Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/34/ee/f9f1d656ad168681bb0f6b092372c1e533c4416b8069b1896a175c46e484/xxhash-3.6.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:87ff03d7e35c61435976554477a7f4cd1704c3596a89a8300d5ce7fc83874a71", size = 32845, upload-time = "2025-10-02T14:33:51.573Z" },
+ { url = "https://files.pythonhosted.org/packages/a3/b1/93508d9460b292c74a09b83d16750c52a0ead89c51eea9951cb97a60d959/xxhash-3.6.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:f572dfd3d0e2eb1a57511831cf6341242f5a9f8298a45862d085f5b93394a27d", size = 30807, upload-time = "2025-10-02T14:33:52.964Z" },
+ { url = "https://files.pythonhosted.org/packages/07/55/28c93a3662f2d200c70704efe74aab9640e824f8ce330d8d3943bf7c9b3c/xxhash-3.6.0-cp310-cp310-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:89952ea539566b9fed2bbd94e589672794b4286f342254fad28b149f9615fef8", size = 193786, upload-time = "2025-10-02T14:33:54.272Z" },
+ { url = "https://files.pythonhosted.org/packages/c1/96/fec0be9bb4b8f5d9c57d76380a366f31a1781fb802f76fc7cda6c84893c7/xxhash-3.6.0-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:48e6f2ffb07a50b52465a1032c3cf1f4a5683f944acaca8a134a2f23674c2058", size = 212830, upload-time = "2025-10-02T14:33:55.706Z" },
+ { url = "https://files.pythonhosted.org/packages/c4/a0/c706845ba77b9611f81fd2e93fad9859346b026e8445e76f8c6fd057cc6d/xxhash-3.6.0-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:b5b848ad6c16d308c3ac7ad4ba6bede80ed5df2ba8ed382f8932df63158dd4b2", size = 211606, upload-time = "2025-10-02T14:33:57.133Z" },
+ { url = "https://files.pythonhosted.org/packages/67/1e/164126a2999e5045f04a69257eea946c0dc3e86541b400d4385d646b53d7/xxhash-3.6.0-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:a034590a727b44dd8ac5914236a7b8504144447a9682586c3327e935f33ec8cc", size = 444872, upload-time = "2025-10-02T14:33:58.446Z" },
+ { url = "https://files.pythonhosted.org/packages/2d/4b/55ab404c56cd70a2cf5ecfe484838865d0fea5627365c6c8ca156bd09c8f/xxhash-3.6.0-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8a8f1972e75ebdd161d7896743122834fe87378160c20e97f8b09166213bf8cc", size = 193217, upload-time = "2025-10-02T14:33:59.724Z" },
+ { url = "https://files.pythonhosted.org/packages/45/e6/52abf06bac316db33aa269091ae7311bd53cfc6f4b120ae77bac1b348091/xxhash-3.6.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:ee34327b187f002a596d7b167ebc59a1b729e963ce645964bbc050d2f1b73d07", size = 210139, upload-time = "2025-10-02T14:34:02.041Z" },
+ { url = "https://files.pythonhosted.org/packages/34/37/db94d490b8691236d356bc249c08819cbcef9273a1a30acf1254ff9ce157/xxhash-3.6.0-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:339f518c3c7a850dd033ab416ea25a692759dc7478a71131fe8869010d2b75e4", size = 197669, upload-time = "2025-10-02T14:34:03.664Z" },
+ { url = "https://files.pythonhosted.org/packages/b7/36/c4f219ef4a17a4f7a64ed3569bc2b5a9c8311abdb22249ac96093625b1a4/xxhash-3.6.0-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:bf48889c9630542d4709192578aebbd836177c9f7a4a2778a7d6340107c65f06", size = 210018, upload-time = "2025-10-02T14:34:05.325Z" },
+ { url = "https://files.pythonhosted.org/packages/fd/06/bfac889a374fc2fc439a69223d1750eed2e18a7db8514737ab630534fa08/xxhash-3.6.0-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:5576b002a56207f640636056b4160a378fe36a58db73ae5c27a7ec8db35f71d4", size = 413058, upload-time = "2025-10-02T14:34:06.925Z" },
+ { url = "https://files.pythonhosted.org/packages/c9/d1/555d8447e0dd32ad0930a249a522bb2e289f0d08b6b16204cfa42c1f5a0c/xxhash-3.6.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:af1f3278bd02814d6dedc5dec397993b549d6f16c19379721e5a1d31e132c49b", size = 190628, upload-time = "2025-10-02T14:34:08.669Z" },
+ { url = "https://files.pythonhosted.org/packages/d1/15/8751330b5186cedc4ed4b597989882ea05e0408b53fa47bcb46a6125bfc6/xxhash-3.6.0-cp310-cp310-win32.whl", hash = "sha256:aed058764db109dc9052720da65fafe84873b05eb8b07e5e653597951af57c3b", size = 30577, upload-time = "2025-10-02T14:34:10.234Z" },
+ { url = "https://files.pythonhosted.org/packages/bb/cc/53f87e8b5871a6eb2ff7e89c48c66093bda2be52315a8161ddc54ea550c4/xxhash-3.6.0-cp310-cp310-win_amd64.whl", hash = "sha256:e82da5670f2d0d98950317f82a0e4a0197150ff19a6df2ba40399c2a3b9ae5fb", size = 31487, upload-time = "2025-10-02T14:34:11.618Z" },
+ { url = "https://files.pythonhosted.org/packages/9f/00/60f9ea3bb697667a14314d7269956f58bf56bb73864f8f8d52a3c2535e9a/xxhash-3.6.0-cp310-cp310-win_arm64.whl", hash = "sha256:4a082ffff8c6ac07707fb6b671caf7c6e020c75226c561830b73d862060f281d", size = 27863, upload-time = "2025-10-02T14:34:12.619Z" },
+ { url = "https://files.pythonhosted.org/packages/17/d4/cc2f0400e9154df4b9964249da78ebd72f318e35ccc425e9f403c392f22a/xxhash-3.6.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:b47bbd8cf2d72797f3c2772eaaac0ded3d3af26481a26d7d7d41dc2d3c46b04a", size = 32844, upload-time = "2025-10-02T14:34:14.037Z" },
+ { url = "https://files.pythonhosted.org/packages/5e/ec/1cc11cd13e26ea8bc3cb4af4eaadd8d46d5014aebb67be3f71fb0b68802a/xxhash-3.6.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:2b6821e94346f96db75abaa6e255706fb06ebd530899ed76d32cd99f20dc52fa", size = 30809, upload-time = "2025-10-02T14:34:15.484Z" },
+ { url = "https://files.pythonhosted.org/packages/04/5f/19fe357ea348d98ca22f456f75a30ac0916b51c753e1f8b2e0e6fb884cce/xxhash-3.6.0-cp311-cp311-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:d0a9751f71a1a65ce3584e9cae4467651c7e70c9d31017fa57574583a4540248", size = 194665, upload-time = "2025-10-02T14:34:16.541Z" },
+ { url = "https://files.pythonhosted.org/packages/90/3b/d1f1a8f5442a5fd8beedae110c5af7604dc37349a8e16519c13c19a9a2de/xxhash-3.6.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8b29ee68625ab37b04c0b40c3fafdf24d2f75ccd778333cfb698f65f6c463f62", size = 213550, upload-time = "2025-10-02T14:34:17.878Z" },
+ { url = "https://files.pythonhosted.org/packages/c4/ef/3a9b05eb527457d5db13a135a2ae1a26c80fecd624d20f3e8dcc4cb170f3/xxhash-3.6.0-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:6812c25fe0d6c36a46ccb002f40f27ac903bf18af9f6dd8f9669cb4d176ab18f", size = 212384, upload-time = "2025-10-02T14:34:19.182Z" },
+ { url = "https://files.pythonhosted.org/packages/0f/18/ccc194ee698c6c623acbf0f8c2969811a8a4b6185af5e824cd27b9e4fd3e/xxhash-3.6.0-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:4ccbff013972390b51a18ef1255ef5ac125c92dc9143b2d1909f59abc765540e", size = 445749, upload-time = "2025-10-02T14:34:20.659Z" },
+ { url = "https://files.pythonhosted.org/packages/a5/86/cf2c0321dc3940a7aa73076f4fd677a0fb3e405cb297ead7d864fd90847e/xxhash-3.6.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:297b7fbf86c82c550e12e8fb71968b3f033d27b874276ba3624ea868c11165a8", size = 193880, upload-time = "2025-10-02T14:34:22.431Z" },
+ { url = "https://files.pythonhosted.org/packages/82/fb/96213c8560e6f948a1ecc9a7613f8032b19ee45f747f4fca4eb31bb6d6ed/xxhash-3.6.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:dea26ae1eb293db089798d3973a5fc928a18fdd97cc8801226fae705b02b14b0", size = 210912, upload-time = "2025-10-02T14:34:23.937Z" },
+ { url = "https://files.pythonhosted.org/packages/40/aa/4395e669b0606a096d6788f40dbdf2b819d6773aa290c19e6e83cbfc312f/xxhash-3.6.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:7a0b169aafb98f4284f73635a8e93f0735f9cbde17bd5ec332480484241aaa77", size = 198654, upload-time = "2025-10-02T14:34:25.644Z" },
+ { url = "https://files.pythonhosted.org/packages/67/74/b044fcd6b3d89e9b1b665924d85d3f400636c23590226feb1eb09e1176ce/xxhash-3.6.0-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:08d45aef063a4531b785cd72de4887766d01dc8f362a515693df349fdb825e0c", size = 210867, upload-time = "2025-10-02T14:34:27.203Z" },
+ { url = "https://files.pythonhosted.org/packages/bc/fd/3ce73bf753b08cb19daee1eb14aa0d7fe331f8da9c02dd95316ddfe5275e/xxhash-3.6.0-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:929142361a48ee07f09121fe9e96a84950e8d4df3bb298ca5d88061969f34d7b", size = 414012, upload-time = "2025-10-02T14:34:28.409Z" },
+ { url = "https://files.pythonhosted.org/packages/ba/b3/5a4241309217c5c876f156b10778f3ab3af7ba7e3259e6d5f5c7d0129eb2/xxhash-3.6.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:51312c768403d8540487dbbfb557454cfc55589bbde6424456951f7fcd4facb3", size = 191409, upload-time = "2025-10-02T14:34:29.696Z" },
+ { url = "https://files.pythonhosted.org/packages/c0/01/99bfbc15fb9abb9a72b088c1d95219fc4782b7d01fc835bd5744d66dd0b8/xxhash-3.6.0-cp311-cp311-win32.whl", hash = "sha256:d1927a69feddc24c987b337ce81ac15c4720955b667fe9b588e02254b80446fd", size = 30574, upload-time = "2025-10-02T14:34:31.028Z" },
+ { url = "https://files.pythonhosted.org/packages/65/79/9d24d7f53819fe301b231044ea362ce64e86c74f6e8c8e51320de248b3e5/xxhash-3.6.0-cp311-cp311-win_amd64.whl", hash = "sha256:26734cdc2d4ffe449b41d186bbeac416f704a482ed835d375a5c0cb02bc63fef", size = 31481, upload-time = "2025-10-02T14:34:32.062Z" },
+ { url = "https://files.pythonhosted.org/packages/30/4e/15cd0e3e8772071344eab2961ce83f6e485111fed8beb491a3f1ce100270/xxhash-3.6.0-cp311-cp311-win_arm64.whl", hash = "sha256:d72f67ef8bf36e05f5b6c65e8524f265bd61071471cd4cf1d36743ebeeeb06b7", size = 27861, upload-time = "2025-10-02T14:34:33.555Z" },
+ { url = "https://files.pythonhosted.org/packages/9a/07/d9412f3d7d462347e4511181dea65e47e0d0e16e26fbee2ea86a2aefb657/xxhash-3.6.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:01362c4331775398e7bb34e3ab403bc9ee9f7c497bc7dee6272114055277dd3c", size = 32744, upload-time = "2025-10-02T14:34:34.622Z" },
+ { url = "https://files.pythonhosted.org/packages/79/35/0429ee11d035fc33abe32dca1b2b69e8c18d236547b9a9b72c1929189b9a/xxhash-3.6.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:b7b2df81a23f8cb99656378e72501b2cb41b1827c0f5a86f87d6b06b69f9f204", size = 30816, upload-time = "2025-10-02T14:34:36.043Z" },
+ { url = "https://files.pythonhosted.org/packages/b7/f2/57eb99aa0f7d98624c0932c5b9a170e1806406cdbcdb510546634a1359e0/xxhash-3.6.0-cp312-cp312-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:dc94790144e66b14f67b10ac8ed75b39ca47536bf8800eb7c24b50271ea0c490", size = 194035, upload-time = "2025-10-02T14:34:37.354Z" },
+ { url = "https://files.pythonhosted.org/packages/4c/ed/6224ba353690d73af7a3f1c7cdb1fc1b002e38f783cb991ae338e1eb3d79/xxhash-3.6.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:93f107c673bccf0d592cdba077dedaf52fe7f42dcd7676eba1f6d6f0c3efffd2", size = 212914, upload-time = "2025-10-02T14:34:38.6Z" },
+ { url = "https://files.pythonhosted.org/packages/38/86/fb6b6130d8dd6b8942cc17ab4d90e223653a89aa32ad2776f8af7064ed13/xxhash-3.6.0-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:2aa5ee3444c25b69813663c9f8067dcfaa2e126dc55e8dddf40f4d1c25d7effa", size = 212163, upload-time = "2025-10-02T14:34:39.872Z" },
+ { url = "https://files.pythonhosted.org/packages/ee/dc/e84875682b0593e884ad73b2d40767b5790d417bde603cceb6878901d647/xxhash-3.6.0-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:f7f99123f0e1194fa59cc69ad46dbae2e07becec5df50a0509a808f90a0f03f0", size = 445411, upload-time = "2025-10-02T14:34:41.569Z" },
+ { url = "https://files.pythonhosted.org/packages/11/4f/426f91b96701ec2f37bb2b8cec664eff4f658a11f3fa9d94f0a887ea6d2b/xxhash-3.6.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:49e03e6fe2cac4a1bc64952dd250cf0dbc5ef4ebb7b8d96bce82e2de163c82a2", size = 193883, upload-time = "2025-10-02T14:34:43.249Z" },
+ { url = "https://files.pythonhosted.org/packages/53/5a/ddbb83eee8e28b778eacfc5a85c969673e4023cdeedcfcef61f36731610b/xxhash-3.6.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:bd17fede52a17a4f9a7bc4472a5867cb0b160deeb431795c0e4abe158bc784e9", size = 210392, upload-time = "2025-10-02T14:34:45.042Z" },
+ { url = "https://files.pythonhosted.org/packages/1e/c2/ff69efd07c8c074ccdf0a4f36fcdd3d27363665bcdf4ba399abebe643465/xxhash-3.6.0-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:6fb5f5476bef678f69db04f2bd1efbed3030d2aba305b0fc1773645f187d6a4e", size = 197898, upload-time = "2025-10-02T14:34:46.302Z" },
+ { url = "https://files.pythonhosted.org/packages/58/ca/faa05ac19b3b622c7c9317ac3e23954187516298a091eb02c976d0d3dd45/xxhash-3.6.0-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:843b52f6d88071f87eba1631b684fcb4b2068cd2180a0224122fe4ef011a9374", size = 210655, upload-time = "2025-10-02T14:34:47.571Z" },
+ { url = "https://files.pythonhosted.org/packages/d4/7a/06aa7482345480cc0cb597f5c875b11a82c3953f534394f620b0be2f700c/xxhash-3.6.0-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:7d14a6cfaf03b1b6f5f9790f76880601ccc7896aff7ab9cd8978a939c1eb7e0d", size = 414001, upload-time = "2025-10-02T14:34:49.273Z" },
+ { url = "https://files.pythonhosted.org/packages/23/07/63ffb386cd47029aa2916b3d2f454e6cc5b9f5c5ada3790377d5430084e7/xxhash-3.6.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:418daf3db71e1413cfe211c2f9a528456936645c17f46b5204705581a45390ae", size = 191431, upload-time = "2025-10-02T14:34:50.798Z" },
+ { url = "https://files.pythonhosted.org/packages/0f/93/14fde614cadb4ddf5e7cebf8918b7e8fac5ae7861c1875964f17e678205c/xxhash-3.6.0-cp312-cp312-win32.whl", hash = "sha256:50fc255f39428a27299c20e280d6193d8b63b8ef8028995323bf834a026b4fbb", size = 30617, upload-time = "2025-10-02T14:34:51.954Z" },
+ { url = "https://files.pythonhosted.org/packages/13/5d/0d125536cbe7565a83d06e43783389ecae0c0f2ed037b48ede185de477c0/xxhash-3.6.0-cp312-cp312-win_amd64.whl", hash = "sha256:c0f2ab8c715630565ab8991b536ecded9416d615538be8ecddce43ccf26cbc7c", size = 31534, upload-time = "2025-10-02T14:34:53.276Z" },
+ { url = "https://files.pythonhosted.org/packages/54/85/6ec269b0952ec7e36ba019125982cf11d91256a778c7c3f98a4c5043d283/xxhash-3.6.0-cp312-cp312-win_arm64.whl", hash = "sha256:eae5c13f3bc455a3bbb68bdc513912dc7356de7e2280363ea235f71f54064829", size = 27876, upload-time = "2025-10-02T14:34:54.371Z" },
+ { url = "https://files.pythonhosted.org/packages/33/76/35d05267ac82f53ae9b0e554da7c5e281ee61f3cad44c743f0fcd354f211/xxhash-3.6.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:599e64ba7f67472481ceb6ee80fa3bd828fd61ba59fb11475572cc5ee52b89ec", size = 32738, upload-time = "2025-10-02T14:34:55.839Z" },
+ { url = "https://files.pythonhosted.org/packages/31/a8/3fbce1cd96534a95e35d5120637bf29b0d7f5d8fa2f6374e31b4156dd419/xxhash-3.6.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:7d8b8aaa30fca4f16f0c84a5c8d7ddee0e25250ec2796c973775373257dde8f1", size = 30821, upload-time = "2025-10-02T14:34:57.219Z" },
+ { url = "https://files.pythonhosted.org/packages/0c/ea/d387530ca7ecfa183cb358027f1833297c6ac6098223fd14f9782cd0015c/xxhash-3.6.0-cp313-cp313-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:d597acf8506d6e7101a4a44a5e428977a51c0fadbbfd3c39650cca9253f6e5a6", size = 194127, upload-time = "2025-10-02T14:34:59.21Z" },
+ { url = "https://files.pythonhosted.org/packages/ba/0c/71435dcb99874b09a43b8d7c54071e600a7481e42b3e3ce1eb5226a5711a/xxhash-3.6.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:858dc935963a33bc33490128edc1c12b0c14d9c7ebaa4e387a7869ecc4f3e263", size = 212975, upload-time = "2025-10-02T14:35:00.816Z" },
+ { url = "https://files.pythonhosted.org/packages/84/7a/c2b3d071e4bb4a90b7057228a99b10d51744878f4a8a6dd643c8bd897620/xxhash-3.6.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:ba284920194615cb8edf73bf52236ce2e1664ccd4a38fdb543506413529cc546", size = 212241, upload-time = "2025-10-02T14:35:02.207Z" },
+ { url = "https://files.pythonhosted.org/packages/81/5f/640b6eac0128e215f177df99eadcd0f1b7c42c274ab6a394a05059694c5a/xxhash-3.6.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:4b54219177f6c6674d5378bd862c6aedf64725f70dd29c472eaae154df1a2e89", size = 445471, upload-time = "2025-10-02T14:35:03.61Z" },
+ { url = "https://files.pythonhosted.org/packages/5e/1e/3c3d3ef071b051cc3abbe3721ffb8365033a172613c04af2da89d5548a87/xxhash-3.6.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:42c36dd7dbad2f5238950c377fcbf6811b1cdb1c444fab447960030cea60504d", size = 193936, upload-time = "2025-10-02T14:35:05.013Z" },
+ { url = "https://files.pythonhosted.org/packages/2c/bd/4a5f68381939219abfe1c22a9e3a5854a4f6f6f3c4983a87d255f21f2e5d/xxhash-3.6.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:f22927652cba98c44639ffdc7aaf35828dccf679b10b31c4ad72a5b530a18eb7", size = 210440, upload-time = "2025-10-02T14:35:06.239Z" },
+ { url = "https://files.pythonhosted.org/packages/eb/37/b80fe3d5cfb9faff01a02121a0f4d565eb7237e9e5fc66e73017e74dcd36/xxhash-3.6.0-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:b45fad44d9c5c119e9c6fbf2e1c656a46dc68e280275007bbfd3d572b21426db", size = 197990, upload-time = "2025-10-02T14:35:07.735Z" },
+ { url = "https://files.pythonhosted.org/packages/d7/fd/2c0a00c97b9e18f72e1f240ad4e8f8a90fd9d408289ba9c7c495ed7dc05c/xxhash-3.6.0-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:6f2580ffab1a8b68ef2b901cde7e55fa8da5e4be0977c68f78fc80f3c143de42", size = 210689, upload-time = "2025-10-02T14:35:09.438Z" },
+ { url = "https://files.pythonhosted.org/packages/93/86/5dd8076a926b9a95db3206aba20d89a7fc14dd5aac16e5c4de4b56033140/xxhash-3.6.0-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:40c391dd3cd041ebc3ffe6f2c862f402e306eb571422e0aa918d8070ba31da11", size = 414068, upload-time = "2025-10-02T14:35:11.162Z" },
+ { url = "https://files.pythonhosted.org/packages/af/3c/0bb129170ee8f3650f08e993baee550a09593462a5cddd8e44d0011102b1/xxhash-3.6.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:f205badabde7aafd1a31e8ca2a3e5a763107a71c397c4481d6a804eb5063d8bd", size = 191495, upload-time = "2025-10-02T14:35:12.971Z" },
+ { url = "https://files.pythonhosted.org/packages/e9/3a/6797e0114c21d1725e2577508e24006fd7ff1d8c0c502d3b52e45c1771d8/xxhash-3.6.0-cp313-cp313-win32.whl", hash = "sha256:2577b276e060b73b73a53042ea5bd5203d3e6347ce0d09f98500f418a9fcf799", size = 30620, upload-time = "2025-10-02T14:35:14.129Z" },
+ { url = "https://files.pythonhosted.org/packages/86/15/9bc32671e9a38b413a76d24722a2bf8784a132c043063a8f5152d390b0f9/xxhash-3.6.0-cp313-cp313-win_amd64.whl", hash = "sha256:757320d45d2fbcce8f30c42a6b2f47862967aea7bf458b9625b4bbe7ee390392", size = 31542, upload-time = "2025-10-02T14:35:15.21Z" },
+ { url = "https://files.pythonhosted.org/packages/39/c5/cc01e4f6188656e56112d6a8e0dfe298a16934b8c47a247236549a3f7695/xxhash-3.6.0-cp313-cp313-win_arm64.whl", hash = "sha256:457b8f85dec5825eed7b69c11ae86834a018b8e3df5e77783c999663da2f96d6", size = 27880, upload-time = "2025-10-02T14:35:16.315Z" },
+ { url = "https://files.pythonhosted.org/packages/f3/30/25e5321c8732759e930c555176d37e24ab84365482d257c3b16362235212/xxhash-3.6.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:a42e633d75cdad6d625434e3468126c73f13f7584545a9cf34e883aa1710e702", size = 32956, upload-time = "2025-10-02T14:35:17.413Z" },
+ { url = "https://files.pythonhosted.org/packages/9f/3c/0573299560d7d9f8ab1838f1efc021a280b5ae5ae2e849034ef3dee18810/xxhash-3.6.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:568a6d743219e717b07b4e03b0a828ce593833e498c3b64752e0f5df6bfe84db", size = 31072, upload-time = "2025-10-02T14:35:18.844Z" },
+ { url = "https://files.pythonhosted.org/packages/7a/1c/52d83a06e417cd9d4137722693424885cc9878249beb3a7c829e74bf7ce9/xxhash-3.6.0-cp313-cp313t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:bec91b562d8012dae276af8025a55811b875baace6af510412a5e58e3121bc54", size = 196409, upload-time = "2025-10-02T14:35:20.31Z" },
+ { url = "https://files.pythonhosted.org/packages/e3/8e/c6d158d12a79bbd0b878f8355432075fc82759e356ab5a111463422a239b/xxhash-3.6.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:78e7f2f4c521c30ad5e786fdd6bae89d47a32672a80195467b5de0480aa97b1f", size = 215736, upload-time = "2025-10-02T14:35:21.616Z" },
+ { url = "https://files.pythonhosted.org/packages/bc/68/c4c80614716345d55071a396cf03d06e34b5f4917a467faf43083c995155/xxhash-3.6.0-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:3ed0df1b11a79856df5ffcab572cbd6b9627034c1c748c5566fa79df9048a7c5", size = 214833, upload-time = "2025-10-02T14:35:23.32Z" },
+ { url = "https://files.pythonhosted.org/packages/7e/e9/ae27c8ffec8b953efa84c7c4a6c6802c263d587b9fc0d6e7cea64e08c3af/xxhash-3.6.0-cp313-cp313t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:0e4edbfc7d420925b0dd5e792478ed393d6e75ff8fc219a6546fb446b6a417b1", size = 448348, upload-time = "2025-10-02T14:35:25.111Z" },
+ { url = "https://files.pythonhosted.org/packages/d7/6b/33e21afb1b5b3f46b74b6bd1913639066af218d704cc0941404ca717fc57/xxhash-3.6.0-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:fba27a198363a7ef87f8c0f6b171ec36b674fe9053742c58dd7e3201c1ab30ee", size = 196070, upload-time = "2025-10-02T14:35:26.586Z" },
+ { url = "https://files.pythonhosted.org/packages/96/b6/fcabd337bc5fa624e7203aa0fa7d0c49eed22f72e93229431752bddc83d9/xxhash-3.6.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:794fe9145fe60191c6532fa95063765529770edcdd67b3d537793e8004cabbfd", size = 212907, upload-time = "2025-10-02T14:35:28.087Z" },
+ { url = "https://files.pythonhosted.org/packages/4b/d3/9ee6160e644d660fcf176c5825e61411c7f62648728f69c79ba237250143/xxhash-3.6.0-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:6105ef7e62b5ac73a837778efc331a591d8442f8ef5c7e102376506cb4ae2729", size = 200839, upload-time = "2025-10-02T14:35:29.857Z" },
+ { url = "https://files.pythonhosted.org/packages/0d/98/e8de5baa5109394baf5118f5e72ab21a86387c4f89b0e77ef3e2f6b0327b/xxhash-3.6.0-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:f01375c0e55395b814a679b3eea205db7919ac2af213f4a6682e01220e5fe292", size = 213304, upload-time = "2025-10-02T14:35:31.222Z" },
+ { url = "https://files.pythonhosted.org/packages/7b/1d/71056535dec5c3177eeb53e38e3d367dd1d16e024e63b1cee208d572a033/xxhash-3.6.0-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:d706dca2d24d834a4661619dcacf51a75c16d65985718d6a7d73c1eeeb903ddf", size = 416930, upload-time = "2025-10-02T14:35:32.517Z" },
+ { url = "https://files.pythonhosted.org/packages/dc/6c/5cbde9de2cd967c322e651c65c543700b19e7ae3e0aae8ece3469bf9683d/xxhash-3.6.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:5f059d9faeacd49c0215d66f4056e1326c80503f51a1532ca336a385edadd033", size = 193787, upload-time = "2025-10-02T14:35:33.827Z" },
+ { url = "https://files.pythonhosted.org/packages/19/fa/0172e350361d61febcea941b0cc541d6e6c8d65d153e85f850a7b256ff8a/xxhash-3.6.0-cp313-cp313t-win32.whl", hash = "sha256:1244460adc3a9be84731d72b8e80625788e5815b68da3da8b83f78115a40a7ec", size = 30916, upload-time = "2025-10-02T14:35:35.107Z" },
+ { url = "https://files.pythonhosted.org/packages/ad/e6/e8cf858a2b19d6d45820f072eff1bea413910592ff17157cabc5f1227a16/xxhash-3.6.0-cp313-cp313t-win_amd64.whl", hash = "sha256:b1e420ef35c503869c4064f4a2f2b08ad6431ab7b229a05cce39d74268bca6b8", size = 31799, upload-time = "2025-10-02T14:35:36.165Z" },
+ { url = "https://files.pythonhosted.org/packages/56/15/064b197e855bfb7b343210e82490ae672f8bc7cdf3ddb02e92f64304ee8a/xxhash-3.6.0-cp313-cp313t-win_arm64.whl", hash = "sha256:ec44b73a4220623235f67a996c862049f375df3b1052d9899f40a6382c32d746", size = 28044, upload-time = "2025-10-02T14:35:37.195Z" },
+ { url = "https://files.pythonhosted.org/packages/7e/5e/0138bc4484ea9b897864d59fce9be9086030825bc778b76cb5a33a906d37/xxhash-3.6.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:a40a3d35b204b7cc7643cbcf8c9976d818cb47befcfac8bbefec8038ac363f3e", size = 32754, upload-time = "2025-10-02T14:35:38.245Z" },
+ { url = "https://files.pythonhosted.org/packages/18/d7/5dac2eb2ec75fd771957a13e5dda560efb2176d5203f39502a5fc571f899/xxhash-3.6.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:a54844be970d3fc22630b32d515e79a90d0a3ddb2644d8d7402e3c4c8da61405", size = 30846, upload-time = "2025-10-02T14:35:39.6Z" },
+ { url = "https://files.pythonhosted.org/packages/fe/71/8bc5be2bb00deb5682e92e8da955ebe5fa982da13a69da5a40a4c8db12fb/xxhash-3.6.0-cp314-cp314-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:016e9190af8f0a4e3741343777710e3d5717427f175adfdc3e72508f59e2a7f3", size = 194343, upload-time = "2025-10-02T14:35:40.69Z" },
+ { url = "https://files.pythonhosted.org/packages/e7/3b/52badfb2aecec2c377ddf1ae75f55db3ba2d321c5e164f14461c90837ef3/xxhash-3.6.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4f6f72232f849eb9d0141e2ebe2677ece15adfd0fa599bc058aad83c714bb2c6", size = 213074, upload-time = "2025-10-02T14:35:42.29Z" },
+ { url = "https://files.pythonhosted.org/packages/a2/2b/ae46b4e9b92e537fa30d03dbc19cdae57ed407e9c26d163895e968e3de85/xxhash-3.6.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:63275a8aba7865e44b1813d2177e0f5ea7eadad3dd063a21f7cf9afdc7054063", size = 212388, upload-time = "2025-10-02T14:35:43.929Z" },
+ { url = "https://files.pythonhosted.org/packages/f5/80/49f88d3afc724b4ac7fbd664c8452d6db51b49915be48c6982659e0e7942/xxhash-3.6.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:3cd01fa2aa00d8b017c97eb46b9a794fbdca53fc14f845f5a328c71254b0abb7", size = 445614, upload-time = "2025-10-02T14:35:45.216Z" },
+ { url = "https://files.pythonhosted.org/packages/ed/ba/603ce3961e339413543d8cd44f21f2c80e2a7c5cfe692a7b1f2cccf58f3c/xxhash-3.6.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0226aa89035b62b6a86d3c68df4d7c1f47a342b8683da2b60cedcddb46c4d95b", size = 194024, upload-time = "2025-10-02T14:35:46.959Z" },
+ { url = "https://files.pythonhosted.org/packages/78/d1/8e225ff7113bf81545cfdcd79eef124a7b7064a0bba53605ff39590b95c2/xxhash-3.6.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:c6e193e9f56e4ca4923c61238cdaced324f0feac782544eb4c6d55ad5cc99ddd", size = 210541, upload-time = "2025-10-02T14:35:48.301Z" },
+ { url = "https://files.pythonhosted.org/packages/6f/58/0f89d149f0bad89def1a8dd38feb50ccdeb643d9797ec84707091d4cb494/xxhash-3.6.0-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:9176dcaddf4ca963d4deb93866d739a343c01c969231dbe21680e13a5d1a5bf0", size = 198305, upload-time = "2025-10-02T14:35:49.584Z" },
+ { url = "https://files.pythonhosted.org/packages/11/38/5eab81580703c4df93feb5f32ff8fa7fe1e2c51c1f183ee4e48d4bb9d3d7/xxhash-3.6.0-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:c1ce4009c97a752e682b897aa99aef84191077a9433eb237774689f14f8ec152", size = 210848, upload-time = "2025-10-02T14:35:50.877Z" },
+ { url = "https://files.pythonhosted.org/packages/5e/6b/953dc4b05c3ce678abca756416e4c130d2382f877a9c30a20d08ee6a77c0/xxhash-3.6.0-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:8cb2f4f679b01513b7adbb9b1b2f0f9cdc31b70007eaf9d59d0878809f385b11", size = 414142, upload-time = "2025-10-02T14:35:52.15Z" },
+ { url = "https://files.pythonhosted.org/packages/08/a9/238ec0d4e81a10eb5026d4a6972677cbc898ba6c8b9dbaec12ae001b1b35/xxhash-3.6.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:653a91d7c2ab54a92c19ccf43508b6a555440b9be1bc8be553376778be7f20b5", size = 191547, upload-time = "2025-10-02T14:35:53.547Z" },
+ { url = "https://files.pythonhosted.org/packages/f1/ee/3cf8589e06c2164ac77c3bf0aa127012801128f1feebf2a079272da5737c/xxhash-3.6.0-cp314-cp314-win32.whl", hash = "sha256:a756fe893389483ee8c394d06b5ab765d96e68fbbfe6fde7aa17e11f5720559f", size = 31214, upload-time = "2025-10-02T14:35:54.746Z" },
+ { url = "https://files.pythonhosted.org/packages/02/5d/a19552fbc6ad4cb54ff953c3908bbc095f4a921bc569433d791f755186f1/xxhash-3.6.0-cp314-cp314-win_amd64.whl", hash = "sha256:39be8e4e142550ef69629c9cd71b88c90e9a5db703fecbcf265546d9536ca4ad", size = 32290, upload-time = "2025-10-02T14:35:55.791Z" },
+ { url = "https://files.pythonhosted.org/packages/b1/11/dafa0643bc30442c887b55baf8e73353a344ee89c1901b5a5c54a6c17d39/xxhash-3.6.0-cp314-cp314-win_arm64.whl", hash = "sha256:25915e6000338999236f1eb68a02a32c3275ac338628a7eaa5a269c401995679", size = 28795, upload-time = "2025-10-02T14:35:57.162Z" },
+ { url = "https://files.pythonhosted.org/packages/2c/db/0e99732ed7f64182aef4a6fb145e1a295558deec2a746265dcdec12d191e/xxhash-3.6.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:c5294f596a9017ca5a3e3f8884c00b91ab2ad2933cf288f4923c3fd4346cf3d4", size = 32955, upload-time = "2025-10-02T14:35:58.267Z" },
+ { url = "https://files.pythonhosted.org/packages/55/f4/2a7c3c68e564a099becfa44bb3d398810cc0ff6749b0d3cb8ccb93f23c14/xxhash-3.6.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:1cf9dcc4ab9cff01dfbba78544297a3a01dafd60f3bde4e2bfd016cf7e4ddc67", size = 31072, upload-time = "2025-10-02T14:35:59.382Z" },
+ { url = "https://files.pythonhosted.org/packages/c6/d9/72a29cddc7250e8a5819dad5d466facb5dc4c802ce120645630149127e73/xxhash-3.6.0-cp314-cp314t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:01262da8798422d0685f7cef03b2bd3f4f46511b02830861df548d7def4402ad", size = 196579, upload-time = "2025-10-02T14:36:00.838Z" },
+ { url = "https://files.pythonhosted.org/packages/63/93/b21590e1e381040e2ca305a884d89e1c345b347404f7780f07f2cdd47ef4/xxhash-3.6.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:51a73fb7cb3a3ead9f7a8b583ffd9b8038e277cdb8cb87cf890e88b3456afa0b", size = 215854, upload-time = "2025-10-02T14:36:02.207Z" },
+ { url = "https://files.pythonhosted.org/packages/ce/b8/edab8a7d4fa14e924b29be877d54155dcbd8b80be85ea00d2be3413a9ed4/xxhash-3.6.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:b9c6df83594f7df8f7f708ce5ebeacfc69f72c9fbaaababf6cf4758eaada0c9b", size = 214965, upload-time = "2025-10-02T14:36:03.507Z" },
+ { url = "https://files.pythonhosted.org/packages/27/67/dfa980ac7f0d509d54ea0d5a486d2bb4b80c3f1bb22b66e6a05d3efaf6c0/xxhash-3.6.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:627f0af069b0ea56f312fd5189001c24578868643203bca1abbc2c52d3a6f3ca", size = 448484, upload-time = "2025-10-02T14:36:04.828Z" },
+ { url = "https://files.pythonhosted.org/packages/8c/63/8ffc2cc97e811c0ca5d00ab36604b3ea6f4254f20b7bc658ca825ce6c954/xxhash-3.6.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:aa912c62f842dfd013c5f21a642c9c10cd9f4c4e943e0af83618b4a404d9091a", size = 196162, upload-time = "2025-10-02T14:36:06.182Z" },
+ { url = "https://files.pythonhosted.org/packages/4b/77/07f0e7a3edd11a6097e990f6e5b815b6592459cb16dae990d967693e6ea9/xxhash-3.6.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:b465afd7909db30168ab62afe40b2fcf79eedc0b89a6c0ab3123515dc0df8b99", size = 213007, upload-time = "2025-10-02T14:36:07.733Z" },
+ { url = "https://files.pythonhosted.org/packages/ae/d8/bc5fa0d152837117eb0bef6f83f956c509332ce133c91c63ce07ee7c4873/xxhash-3.6.0-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:a881851cf38b0a70e7c4d3ce81fc7afd86fbc2a024f4cfb2a97cf49ce04b75d3", size = 200956, upload-time = "2025-10-02T14:36:09.106Z" },
+ { url = "https://files.pythonhosted.org/packages/26/a5/d749334130de9411783873e9b98ecc46688dad5db64ca6e04b02acc8b473/xxhash-3.6.0-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:9b3222c686a919a0f3253cfc12bb118b8b103506612253b5baeaac10d8027cf6", size = 213401, upload-time = "2025-10-02T14:36:10.585Z" },
+ { url = "https://files.pythonhosted.org/packages/89/72/abed959c956a4bfc72b58c0384bb7940663c678127538634d896b1195c10/xxhash-3.6.0-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:c5aa639bc113e9286137cec8fadc20e9cd732b2cc385c0b7fa673b84fc1f2a93", size = 417083, upload-time = "2025-10-02T14:36:12.276Z" },
+ { url = "https://files.pythonhosted.org/packages/0c/b3/62fd2b586283b7d7d665fb98e266decadf31f058f1cf6c478741f68af0cb/xxhash-3.6.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:5c1343d49ac102799905e115aee590183c3921d475356cb24b4de29a4bc56518", size = 193913, upload-time = "2025-10-02T14:36:14.025Z" },
+ { url = "https://files.pythonhosted.org/packages/9a/9a/c19c42c5b3f5a4aad748a6d5b4f23df3bed7ee5445accc65a0fb3ff03953/xxhash-3.6.0-cp314-cp314t-win32.whl", hash = "sha256:5851f033c3030dd95c086b4a36a2683c2ff4a799b23af60977188b057e467119", size = 31586, upload-time = "2025-10-02T14:36:15.603Z" },
+ { url = "https://files.pythonhosted.org/packages/03/d6/4cc450345be9924fd5dc8c590ceda1db5b43a0a889587b0ae81a95511360/xxhash-3.6.0-cp314-cp314t-win_amd64.whl", hash = "sha256:0444e7967dac37569052d2409b00a8860c2135cff05502df4da80267d384849f", size = 32526, upload-time = "2025-10-02T14:36:16.708Z" },
+ { url = "https://files.pythonhosted.org/packages/0f/c9/7243eb3f9eaabd1a88a5a5acadf06df2d83b100c62684b7425c6a11bcaa8/xxhash-3.6.0-cp314-cp314t-win_arm64.whl", hash = "sha256:bb79b1e63f6fd84ec778a4b1916dfe0a7c3fdb986c06addd5db3a0d413819d95", size = 28898, upload-time = "2025-10-02T14:36:17.843Z" },
+ { url = "https://files.pythonhosted.org/packages/93/1e/8aec23647a34a249f62e2398c42955acd9b4c6ed5cf08cbea94dc46f78d2/xxhash-3.6.0-pp311-pypy311_pp73-macosx_10_15_x86_64.whl", hash = "sha256:0f7b7e2ec26c1666ad5fc9dbfa426a6a3367ceaf79db5dd76264659d509d73b0", size = 30662, upload-time = "2025-10-02T14:37:01.743Z" },
+ { url = "https://files.pythonhosted.org/packages/b8/0b/b14510b38ba91caf43006209db846a696ceea6a847a0c9ba0a5b1adc53d6/xxhash-3.6.0-pp311-pypy311_pp73-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:5dc1e14d14fa0f5789ec29a7062004b5933964bb9b02aae6622b8f530dc40296", size = 41056, upload-time = "2025-10-02T14:37:02.879Z" },
+ { url = "https://files.pythonhosted.org/packages/50/55/15a7b8a56590e66ccd374bbfa3f9ffc45b810886c8c3b614e3f90bd2367c/xxhash-3.6.0-pp311-pypy311_pp73-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:881b47fc47e051b37d94d13e7455131054b56749b91b508b0907eb07900d1c13", size = 36251, upload-time = "2025-10-02T14:37:04.44Z" },
+ { url = "https://files.pythonhosted.org/packages/62/b2/5ac99a041a29e58e95f907876b04f7067a0242cb85b5f39e726153981503/xxhash-3.6.0-pp311-pypy311_pp73-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c6dc31591899f5e5666f04cc2e529e69b4072827085c1ef15294d91a004bc1bd", size = 32481, upload-time = "2025-10-02T14:37:05.869Z" },
+ { url = "https://files.pythonhosted.org/packages/7b/d9/8d95e906764a386a3d3b596f3c68bb63687dfca806373509f51ce8eea81f/xxhash-3.6.0-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:15e0dac10eb9309508bfc41f7f9deaa7755c69e35af835db9cb10751adebc35d", size = 31565, upload-time = "2025-10-02T14:37:06.966Z" },
+]
+
+[[package]]
+name = "yarl"
+version = "1.22.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+ { name = "idna" },
+ { name = "multidict" },
+ { name = "propcache" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/57/63/0c6ebca57330cd313f6102b16dd57ffaf3ec4c83403dcb45dbd15c6f3ea1/yarl-1.22.0.tar.gz", hash = "sha256:bebf8557577d4401ba8bd9ff33906f1376c877aa78d1fe216ad01b4d6745af71", size = 187169, upload-time = "2025-10-06T14:12:55.963Z" }
+wheels = [
+ { url = "https://files.pythonhosted.org/packages/d1/43/a2204825342f37c337f5edb6637040fa14e365b2fcc2346960201d457579/yarl-1.22.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:c7bd6683587567e5a49ee6e336e0612bec8329be1b7d4c8af5687dcdeb67ee1e", size = 140517, upload-time = "2025-10-06T14:08:42.494Z" },
+ { url = "https://files.pythonhosted.org/packages/44/6f/674f3e6f02266428c56f704cd2501c22f78e8b2eeb23f153117cc86fb28a/yarl-1.22.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:5cdac20da754f3a723cceea5b3448e1a2074866406adeb4ef35b469d089adb8f", size = 93495, upload-time = "2025-10-06T14:08:46.2Z" },
+ { url = "https://files.pythonhosted.org/packages/b8/12/5b274d8a0f30c07b91b2f02cba69152600b47830fcfb465c108880fcee9c/yarl-1.22.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:07a524d84df0c10f41e3ee918846e1974aba4ec017f990dc735aad487a0bdfdf", size = 94400, upload-time = "2025-10-06T14:08:47.855Z" },
+ { url = "https://files.pythonhosted.org/packages/e2/7f/df1b6949b1fa1aa9ff6de6e2631876ad4b73c4437822026e85d8acb56bb1/yarl-1.22.0-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e1b329cb8146d7b736677a2440e422eadd775d1806a81db2d4cded80a48efc1a", size = 347545, upload-time = "2025-10-06T14:08:49.683Z" },
+ { url = "https://files.pythonhosted.org/packages/84/09/f92ed93bd6cd77872ab6c3462df45ca45cd058d8f1d0c9b4f54c1704429f/yarl-1.22.0-cp310-cp310-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:75976c6945d85dbb9ee6308cd7ff7b1fb9409380c82d6119bd778d8fcfe2931c", size = 319598, upload-time = "2025-10-06T14:08:51.215Z" },
+ { url = "https://files.pythonhosted.org/packages/c3/97/ac3f3feae7d522cf7ccec3d340bb0b2b61c56cb9767923df62a135092c6b/yarl-1.22.0-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:80ddf7a5f8c86cb3eb4bc9028b07bbbf1f08a96c5c0bc1244be5e8fefcb94147", size = 363893, upload-time = "2025-10-06T14:08:53.144Z" },
+ { url = "https://files.pythonhosted.org/packages/06/49/f3219097403b9c84a4d079b1d7bda62dd9b86d0d6e4428c02d46ab2c77fc/yarl-1.22.0-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:d332fc2e3c94dad927f2112395772a4e4fedbcf8f80efc21ed7cdfae4d574fdb", size = 371240, upload-time = "2025-10-06T14:08:55.036Z" },
+ { url = "https://files.pythonhosted.org/packages/35/9f/06b765d45c0e44e8ecf0fe15c9eacbbde342bb5b7561c46944f107bfb6c3/yarl-1.22.0-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0cf71bf877efeac18b38d3930594c0948c82b64547c1cf420ba48722fe5509f6", size = 346965, upload-time = "2025-10-06T14:08:56.722Z" },
+ { url = "https://files.pythonhosted.org/packages/c5/69/599e7cea8d0fcb1694323b0db0dda317fa3162f7b90166faddecf532166f/yarl-1.22.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:663e1cadaddae26be034a6ab6072449a8426ddb03d500f43daf952b74553bba0", size = 342026, upload-time = "2025-10-06T14:08:58.563Z" },
+ { url = "https://files.pythonhosted.org/packages/95/6f/9dfd12c8bc90fea9eab39832ee32ea48f8e53d1256252a77b710c065c89f/yarl-1.22.0-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:6dcbb0829c671f305be48a7227918cfcd11276c2d637a8033a99a02b67bf9eda", size = 335637, upload-time = "2025-10-06T14:09:00.506Z" },
+ { url = "https://files.pythonhosted.org/packages/57/2e/34c5b4eb9b07e16e873db5b182c71e5f06f9b5af388cdaa97736d79dd9a6/yarl-1.22.0-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:f0d97c18dfd9a9af4490631905a3f131a8e4c9e80a39353919e2cfed8f00aedc", size = 359082, upload-time = "2025-10-06T14:09:01.936Z" },
+ { url = "https://files.pythonhosted.org/packages/31/71/fa7e10fb772d273aa1f096ecb8ab8594117822f683bab7d2c5a89914c92a/yarl-1.22.0-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:437840083abe022c978470b942ff832c3940b2ad3734d424b7eaffcd07f76737", size = 357811, upload-time = "2025-10-06T14:09:03.445Z" },
+ { url = "https://files.pythonhosted.org/packages/26/da/11374c04e8e1184a6a03cf9c8f5688d3e5cec83ed6f31ad3481b3207f709/yarl-1.22.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:a899cbd98dce6f5d8de1aad31cb712ec0a530abc0a86bd6edaa47c1090138467", size = 351223, upload-time = "2025-10-06T14:09:05.401Z" },
+ { url = "https://files.pythonhosted.org/packages/82/8f/e2d01f161b0c034a30410e375e191a5d27608c1f8693bab1a08b089ca096/yarl-1.22.0-cp310-cp310-win32.whl", hash = "sha256:595697f68bd1f0c1c159fcb97b661fc9c3f5db46498043555d04805430e79bea", size = 82118, upload-time = "2025-10-06T14:09:11.148Z" },
+ { url = "https://files.pythonhosted.org/packages/62/46/94c76196642dbeae634c7a61ba3da88cd77bed875bf6e4a8bed037505aa6/yarl-1.22.0-cp310-cp310-win_amd64.whl", hash = "sha256:cb95a9b1adaa48e41815a55ae740cfda005758104049a640a398120bf02515ca", size = 86852, upload-time = "2025-10-06T14:09:12.958Z" },
+ { url = "https://files.pythonhosted.org/packages/af/af/7df4f179d3b1a6dcb9a4bd2ffbc67642746fcafdb62580e66876ce83fff4/yarl-1.22.0-cp310-cp310-win_arm64.whl", hash = "sha256:b85b982afde6df99ecc996990d4ad7ccbdbb70e2a4ba4de0aecde5922ba98a0b", size = 82012, upload-time = "2025-10-06T14:09:14.664Z" },
+ { url = "https://files.pythonhosted.org/packages/4d/27/5ab13fc84c76a0250afd3d26d5936349a35be56ce5785447d6c423b26d92/yarl-1.22.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:1ab72135b1f2db3fed3997d7e7dc1b80573c67138023852b6efb336a5eae6511", size = 141607, upload-time = "2025-10-06T14:09:16.298Z" },
+ { url = "https://files.pythonhosted.org/packages/6a/a1/d065d51d02dc02ce81501d476b9ed2229d9a990818332242a882d5d60340/yarl-1.22.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:669930400e375570189492dc8d8341301578e8493aec04aebc20d4717f899dd6", size = 94027, upload-time = "2025-10-06T14:09:17.786Z" },
+ { url = "https://files.pythonhosted.org/packages/c1/da/8da9f6a53f67b5106ffe902c6fa0164e10398d4e150d85838b82f424072a/yarl-1.22.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:792a2af6d58177ef7c19cbf0097aba92ca1b9cb3ffdd9c7470e156c8f9b5e028", size = 94963, upload-time = "2025-10-06T14:09:19.662Z" },
+ { url = "https://files.pythonhosted.org/packages/68/fe/2c1f674960c376e29cb0bec1249b117d11738db92a6ccc4a530b972648db/yarl-1.22.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3ea66b1c11c9150f1372f69afb6b8116f2dd7286f38e14ea71a44eee9ec51b9d", size = 368406, upload-time = "2025-10-06T14:09:21.402Z" },
+ { url = "https://files.pythonhosted.org/packages/95/26/812a540e1c3c6418fec60e9bbd38e871eaba9545e94fa5eff8f4a8e28e1e/yarl-1.22.0-cp311-cp311-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:3e2daa88dc91870215961e96a039ec73e4937da13cf77ce17f9cad0c18df3503", size = 336581, upload-time = "2025-10-06T14:09:22.98Z" },
+ { url = "https://files.pythonhosted.org/packages/0b/f5/5777b19e26fdf98563985e481f8be3d8a39f8734147a6ebf459d0dab5a6b/yarl-1.22.0-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:ba440ae430c00eee41509353628600212112cd5018d5def7e9b05ea7ac34eb65", size = 388924, upload-time = "2025-10-06T14:09:24.655Z" },
+ { url = "https://files.pythonhosted.org/packages/86/08/24bd2477bd59c0bbd994fe1d93b126e0472e4e3df5a96a277b0a55309e89/yarl-1.22.0-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:e6438cc8f23a9c1478633d216b16104a586b9761db62bfacb6425bac0a36679e", size = 392890, upload-time = "2025-10-06T14:09:26.617Z" },
+ { url = "https://files.pythonhosted.org/packages/46/00/71b90ed48e895667ecfb1eaab27c1523ee2fa217433ed77a73b13205ca4b/yarl-1.22.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4c52a6e78aef5cf47a98ef8e934755abf53953379b7d53e68b15ff4420e6683d", size = 365819, upload-time = "2025-10-06T14:09:28.544Z" },
+ { url = "https://files.pythonhosted.org/packages/30/2d/f715501cae832651d3282387c6a9236cd26bd00d0ff1e404b3dc52447884/yarl-1.22.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:3b06bcadaac49c70f4c88af4ffcfbe3dc155aab3163e75777818092478bcbbe7", size = 363601, upload-time = "2025-10-06T14:09:30.568Z" },
+ { url = "https://files.pythonhosted.org/packages/f8/f9/a678c992d78e394e7126ee0b0e4e71bd2775e4334d00a9278c06a6cce96a/yarl-1.22.0-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:6944b2dc72c4d7f7052683487e3677456050ff77fcf5e6204e98caf785ad1967", size = 358072, upload-time = "2025-10-06T14:09:32.528Z" },
+ { url = "https://files.pythonhosted.org/packages/2c/d1/b49454411a60edb6fefdcad4f8e6dbba7d8019e3a508a1c5836cba6d0781/yarl-1.22.0-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:d5372ca1df0f91a86b047d1277c2aaf1edb32d78bbcefffc81b40ffd18f027ed", size = 385311, upload-time = "2025-10-06T14:09:34.634Z" },
+ { url = "https://files.pythonhosted.org/packages/87/e5/40d7a94debb8448c7771a916d1861d6609dddf7958dc381117e7ba36d9e8/yarl-1.22.0-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:51af598701f5299012b8416486b40fceef8c26fc87dc6d7d1f6fc30609ea0aa6", size = 381094, upload-time = "2025-10-06T14:09:36.268Z" },
+ { url = "https://files.pythonhosted.org/packages/35/d8/611cc282502381ad855448643e1ad0538957fc82ae83dfe7762c14069e14/yarl-1.22.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:b266bd01fedeffeeac01a79ae181719ff848a5a13ce10075adbefc8f1daee70e", size = 370944, upload-time = "2025-10-06T14:09:37.872Z" },
+ { url = "https://files.pythonhosted.org/packages/2d/df/fadd00fb1c90e1a5a8bd731fa3d3de2e165e5a3666a095b04e31b04d9cb6/yarl-1.22.0-cp311-cp311-win32.whl", hash = "sha256:a9b1ba5610a4e20f655258d5a1fdc7ebe3d837bb0e45b581398b99eb98b1f5ca", size = 81804, upload-time = "2025-10-06T14:09:39.359Z" },
+ { url = "https://files.pythonhosted.org/packages/b5/f7/149bb6f45f267cb5c074ac40c01c6b3ea6d8a620d34b337f6321928a1b4d/yarl-1.22.0-cp311-cp311-win_amd64.whl", hash = "sha256:078278b9b0b11568937d9509b589ee83ef98ed6d561dfe2020e24a9fd08eaa2b", size = 86858, upload-time = "2025-10-06T14:09:41.068Z" },
+ { url = "https://files.pythonhosted.org/packages/2b/13/88b78b93ad3f2f0b78e13bfaaa24d11cbc746e93fe76d8c06bf139615646/yarl-1.22.0-cp311-cp311-win_arm64.whl", hash = "sha256:b6a6f620cfe13ccec221fa312139135166e47ae169f8253f72a0abc0dae94376", size = 81637, upload-time = "2025-10-06T14:09:42.712Z" },
+ { url = "https://files.pythonhosted.org/packages/75/ff/46736024fee3429b80a165a732e38e5d5a238721e634ab41b040d49f8738/yarl-1.22.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:e340382d1afa5d32b892b3ff062436d592ec3d692aeea3bef3a5cfe11bbf8c6f", size = 142000, upload-time = "2025-10-06T14:09:44.631Z" },
+ { url = "https://files.pythonhosted.org/packages/5a/9a/b312ed670df903145598914770eb12de1bac44599549b3360acc96878df8/yarl-1.22.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:f1e09112a2c31ffe8d80be1b0988fa6a18c5d5cad92a9ffbb1c04c91bfe52ad2", size = 94338, upload-time = "2025-10-06T14:09:46.372Z" },
+ { url = "https://files.pythonhosted.org/packages/ba/f5/0601483296f09c3c65e303d60c070a5c19fcdbc72daa061e96170785bc7d/yarl-1.22.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:939fe60db294c786f6b7c2d2e121576628468f65453d86b0fe36cb52f987bd74", size = 94909, upload-time = "2025-10-06T14:09:48.648Z" },
+ { url = "https://files.pythonhosted.org/packages/60/41/9a1fe0b73dbcefce72e46cf149b0e0a67612d60bfc90fb59c2b2efdfbd86/yarl-1.22.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e1651bf8e0398574646744c1885a41198eba53dc8a9312b954073f845c90a8df", size = 372940, upload-time = "2025-10-06T14:09:50.089Z" },
+ { url = "https://files.pythonhosted.org/packages/17/7a/795cb6dfee561961c30b800f0ed616b923a2ec6258b5def2a00bf8231334/yarl-1.22.0-cp312-cp312-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:b8a0588521a26bf92a57a1705b77b8b59044cdceccac7151bd8d229e66b8dedb", size = 345825, upload-time = "2025-10-06T14:09:52.142Z" },
+ { url = "https://files.pythonhosted.org/packages/d7/93/a58f4d596d2be2ae7bab1a5846c4d270b894958845753b2c606d666744d3/yarl-1.22.0-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:42188e6a615c1a75bcaa6e150c3fe8f3e8680471a6b10150c5f7e83f47cc34d2", size = 386705, upload-time = "2025-10-06T14:09:54.128Z" },
+ { url = "https://files.pythonhosted.org/packages/61/92/682279d0e099d0e14d7fd2e176bd04f48de1484f56546a3e1313cd6c8e7c/yarl-1.22.0-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:f6d2cb59377d99718913ad9a151030d6f83ef420a2b8f521d94609ecc106ee82", size = 396518, upload-time = "2025-10-06T14:09:55.762Z" },
+ { url = "https://files.pythonhosted.org/packages/db/0f/0d52c98b8a885aeda831224b78f3be7ec2e1aa4a62091f9f9188c3c65b56/yarl-1.22.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:50678a3b71c751d58d7908edc96d332af328839eea883bb554a43f539101277a", size = 377267, upload-time = "2025-10-06T14:09:57.958Z" },
+ { url = "https://files.pythonhosted.org/packages/22/42/d2685e35908cbeaa6532c1fc73e89e7f2efb5d8a7df3959ea8e37177c5a3/yarl-1.22.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:1e8fbaa7cec507aa24ea27a01456e8dd4b6fab829059b69844bd348f2d467124", size = 365797, upload-time = "2025-10-06T14:09:59.527Z" },
+ { url = "https://files.pythonhosted.org/packages/a2/83/cf8c7bcc6355631762f7d8bdab920ad09b82efa6b722999dfb05afa6cfac/yarl-1.22.0-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:433885ab5431bc3d3d4f2f9bd15bfa1614c522b0f1405d62c4f926ccd69d04fa", size = 365535, upload-time = "2025-10-06T14:10:01.139Z" },
+ { url = "https://files.pythonhosted.org/packages/25/e1/5302ff9b28f0c59cac913b91fe3f16c59a033887e57ce9ca5d41a3a94737/yarl-1.22.0-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:b790b39c7e9a4192dc2e201a282109ed2985a1ddbd5ac08dc56d0e121400a8f7", size = 382324, upload-time = "2025-10-06T14:10:02.756Z" },
+ { url = "https://files.pythonhosted.org/packages/bf/cd/4617eb60f032f19ae3a688dc990d8f0d89ee0ea378b61cac81ede3e52fae/yarl-1.22.0-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:31f0b53913220599446872d757257be5898019c85e7971599065bc55065dc99d", size = 383803, upload-time = "2025-10-06T14:10:04.552Z" },
+ { url = "https://files.pythonhosted.org/packages/59/65/afc6e62bb506a319ea67b694551dab4a7e6fb7bf604e9bd9f3e11d575fec/yarl-1.22.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:a49370e8f711daec68d09b821a34e1167792ee2d24d405cbc2387be4f158b520", size = 374220, upload-time = "2025-10-06T14:10:06.489Z" },
+ { url = "https://files.pythonhosted.org/packages/e7/3d/68bf18d50dc674b942daec86a9ba922d3113d8399b0e52b9897530442da2/yarl-1.22.0-cp312-cp312-win32.whl", hash = "sha256:70dfd4f241c04bd9239d53b17f11e6ab672b9f1420364af63e8531198e3f5fe8", size = 81589, upload-time = "2025-10-06T14:10:09.254Z" },
+ { url = "https://files.pythonhosted.org/packages/c8/9a/6ad1a9b37c2f72874f93e691b2e7ecb6137fb2b899983125db4204e47575/yarl-1.22.0-cp312-cp312-win_amd64.whl", hash = "sha256:8884d8b332a5e9b88e23f60bb166890009429391864c685e17bd73a9eda9105c", size = 87213, upload-time = "2025-10-06T14:10:11.369Z" },
+ { url = "https://files.pythonhosted.org/packages/44/c5/c21b562d1680a77634d748e30c653c3ca918beb35555cff24986fff54598/yarl-1.22.0-cp312-cp312-win_arm64.whl", hash = "sha256:ea70f61a47f3cc93bdf8b2f368ed359ef02a01ca6393916bc8ff877427181e74", size = 81330, upload-time = "2025-10-06T14:10:13.112Z" },
+ { url = "https://files.pythonhosted.org/packages/ea/f3/d67de7260456ee105dc1d162d43a019ecad6b91e2f51809d6cddaa56690e/yarl-1.22.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:8dee9c25c74997f6a750cd317b8ca63545169c098faee42c84aa5e506c819b53", size = 139980, upload-time = "2025-10-06T14:10:14.601Z" },
+ { url = "https://files.pythonhosted.org/packages/01/88/04d98af0b47e0ef42597b9b28863b9060bb515524da0a65d5f4db160b2d5/yarl-1.22.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:01e73b85a5434f89fc4fe27dcda2aff08ddf35e4d47bbbea3bdcd25321af538a", size = 93424, upload-time = "2025-10-06T14:10:16.115Z" },
+ { url = "https://files.pythonhosted.org/packages/18/91/3274b215fd8442a03975ce6bee5fe6aa57a8326b29b9d3d56234a1dca244/yarl-1.22.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:22965c2af250d20c873cdbee8ff958fb809940aeb2e74ba5f20aaf6b7ac8c70c", size = 93821, upload-time = "2025-10-06T14:10:17.993Z" },
+ { url = "https://files.pythonhosted.org/packages/61/3a/caf4e25036db0f2da4ca22a353dfeb3c9d3c95d2761ebe9b14df8fc16eb0/yarl-1.22.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b4f15793aa49793ec8d1c708ab7f9eded1aa72edc5174cae703651555ed1b601", size = 373243, upload-time = "2025-10-06T14:10:19.44Z" },
+ { url = "https://files.pythonhosted.org/packages/6e/9e/51a77ac7516e8e7803b06e01f74e78649c24ee1021eca3d6a739cb6ea49c/yarl-1.22.0-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:e5542339dcf2747135c5c85f68680353d5cb9ffd741c0f2e8d832d054d41f35a", size = 342361, upload-time = "2025-10-06T14:10:21.124Z" },
+ { url = "https://files.pythonhosted.org/packages/d4/f8/33b92454789dde8407f156c00303e9a891f1f51a0330b0fad7c909f87692/yarl-1.22.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:5c401e05ad47a75869c3ab3e35137f8468b846770587e70d71e11de797d113df", size = 387036, upload-time = "2025-10-06T14:10:22.902Z" },
+ { url = "https://files.pythonhosted.org/packages/d9/9a/c5db84ea024f76838220280f732970aa4ee154015d7f5c1bfb60a267af6f/yarl-1.22.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:243dda95d901c733f5b59214d28b0120893d91777cb8aa043e6ef059d3cddfe2", size = 397671, upload-time = "2025-10-06T14:10:24.523Z" },
+ { url = "https://files.pythonhosted.org/packages/11/c9/cd8538dc2e7727095e0c1d867bad1e40c98f37763e6d995c1939f5fdc7b1/yarl-1.22.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bec03d0d388060058f5d291a813f21c011041938a441c593374da6077fe21b1b", size = 377059, upload-time = "2025-10-06T14:10:26.406Z" },
+ { url = "https://files.pythonhosted.org/packages/a1/b9/ab437b261702ced75122ed78a876a6dec0a1b0f5e17a4ac7a9a2482d8abe/yarl-1.22.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:b0748275abb8c1e1e09301ee3cf90c8a99678a4e92e4373705f2a2570d581273", size = 365356, upload-time = "2025-10-06T14:10:28.461Z" },
+ { url = "https://files.pythonhosted.org/packages/b2/9d/8e1ae6d1d008a9567877b08f0ce4077a29974c04c062dabdb923ed98e6fe/yarl-1.22.0-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:47fdb18187e2a4e18fda2c25c05d8251a9e4a521edaed757fef033e7d8498d9a", size = 361331, upload-time = "2025-10-06T14:10:30.541Z" },
+ { url = "https://files.pythonhosted.org/packages/ca/5a/09b7be3905962f145b73beb468cdd53db8aa171cf18c80400a54c5b82846/yarl-1.22.0-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:c7044802eec4524fde550afc28edda0dd5784c4c45f0be151a2d3ba017daca7d", size = 382590, upload-time = "2025-10-06T14:10:33.352Z" },
+ { url = "https://files.pythonhosted.org/packages/aa/7f/59ec509abf90eda5048b0bc3e2d7b5099dffdb3e6b127019895ab9d5ef44/yarl-1.22.0-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:139718f35149ff544caba20fce6e8a2f71f1e39b92c700d8438a0b1d2a631a02", size = 385316, upload-time = "2025-10-06T14:10:35.034Z" },
+ { url = "https://files.pythonhosted.org/packages/e5/84/891158426bc8036bfdfd862fabd0e0fa25df4176ec793e447f4b85cf1be4/yarl-1.22.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:e1b51bebd221006d3d2f95fbe124b22b247136647ae5dcc8c7acafba66e5ee67", size = 374431, upload-time = "2025-10-06T14:10:37.76Z" },
+ { url = "https://files.pythonhosted.org/packages/bb/49/03da1580665baa8bef5e8ed34c6df2c2aca0a2f28bf397ed238cc1bbc6f2/yarl-1.22.0-cp313-cp313-win32.whl", hash = "sha256:d3e32536234a95f513bd374e93d717cf6b2231a791758de6c509e3653f234c95", size = 81555, upload-time = "2025-10-06T14:10:39.649Z" },
+ { url = "https://files.pythonhosted.org/packages/9a/ee/450914ae11b419eadd067c6183ae08381cfdfcb9798b90b2b713bbebddda/yarl-1.22.0-cp313-cp313-win_amd64.whl", hash = "sha256:47743b82b76d89a1d20b83e60d5c20314cbd5ba2befc9cda8f28300c4a08ed4d", size = 86965, upload-time = "2025-10-06T14:10:41.313Z" },
+ { url = "https://files.pythonhosted.org/packages/98/4d/264a01eae03b6cf629ad69bae94e3b0e5344741e929073678e84bf7a3e3b/yarl-1.22.0-cp313-cp313-win_arm64.whl", hash = "sha256:5d0fcda9608875f7d052eff120c7a5da474a6796fe4d83e152e0e4d42f6d1a9b", size = 81205, upload-time = "2025-10-06T14:10:43.167Z" },
+ { url = "https://files.pythonhosted.org/packages/88/fc/6908f062a2f77b5f9f6d69cecb1747260831ff206adcbc5b510aff88df91/yarl-1.22.0-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:719ae08b6972befcba4310e49edb1161a88cdd331e3a694b84466bd938a6ab10", size = 146209, upload-time = "2025-10-06T14:10:44.643Z" },
+ { url = "https://files.pythonhosted.org/packages/65/47/76594ae8eab26210b4867be6f49129861ad33da1f1ebdf7051e98492bf62/yarl-1.22.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:47d8a5c446df1c4db9d21b49619ffdba90e77c89ec6e283f453856c74b50b9e3", size = 95966, upload-time = "2025-10-06T14:10:46.554Z" },
+ { url = "https://files.pythonhosted.org/packages/ab/ce/05e9828a49271ba6b5b038b15b3934e996980dd78abdfeb52a04cfb9467e/yarl-1.22.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:cfebc0ac8333520d2d0423cbbe43ae43c8838862ddb898f5ca68565e395516e9", size = 97312, upload-time = "2025-10-06T14:10:48.007Z" },
+ { url = "https://files.pythonhosted.org/packages/d1/c5/7dffad5e4f2265b29c9d7ec869c369e4223166e4f9206fc2243ee9eea727/yarl-1.22.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4398557cbf484207df000309235979c79c4356518fd5c99158c7d38203c4da4f", size = 361967, upload-time = "2025-10-06T14:10:49.997Z" },
+ { url = "https://files.pythonhosted.org/packages/50/b2/375b933c93a54bff7fc041e1a6ad2c0f6f733ffb0c6e642ce56ee3b39970/yarl-1.22.0-cp313-cp313t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:2ca6fd72a8cd803be290d42f2dec5cdcd5299eeb93c2d929bf060ad9efaf5de0", size = 323949, upload-time = "2025-10-06T14:10:52.004Z" },
+ { url = "https://files.pythonhosted.org/packages/66/50/bfc2a29a1d78644c5a7220ce2f304f38248dc94124a326794e677634b6cf/yarl-1.22.0-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:ca1f59c4e1ab6e72f0a23c13fca5430f889634166be85dbf1013683e49e3278e", size = 361818, upload-time = "2025-10-06T14:10:54.078Z" },
+ { url = "https://files.pythonhosted.org/packages/46/96/f3941a46af7d5d0f0498f86d71275696800ddcdd20426298e572b19b91ff/yarl-1.22.0-cp313-cp313t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:6c5010a52015e7c70f86eb967db0f37f3c8bd503a695a49f8d45700144667708", size = 372626, upload-time = "2025-10-06T14:10:55.767Z" },
+ { url = "https://files.pythonhosted.org/packages/c1/42/8b27c83bb875cd89448e42cd627e0fb971fa1675c9ec546393d18826cb50/yarl-1.22.0-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9d7672ecf7557476642c88497c2f8d8542f8e36596e928e9bcba0e42e1e7d71f", size = 341129, upload-time = "2025-10-06T14:10:57.985Z" },
+ { url = "https://files.pythonhosted.org/packages/49/36/99ca3122201b382a3cf7cc937b95235b0ac944f7e9f2d5331d50821ed352/yarl-1.22.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:3b7c88eeef021579d600e50363e0b6ee4f7f6f728cd3486b9d0f3ee7b946398d", size = 346776, upload-time = "2025-10-06T14:10:59.633Z" },
+ { url = "https://files.pythonhosted.org/packages/85/b4/47328bf996acd01a4c16ef9dcd2f59c969f495073616586f78cd5f2efb99/yarl-1.22.0-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:f4afb5c34f2c6fecdcc182dfcfc6af6cccf1aa923eed4d6a12e9d96904e1a0d8", size = 334879, upload-time = "2025-10-06T14:11:01.454Z" },
+ { url = "https://files.pythonhosted.org/packages/c2/ad/b77d7b3f14a4283bffb8e92c6026496f6de49751c2f97d4352242bba3990/yarl-1.22.0-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:59c189e3e99a59cf8d83cbb31d4db02d66cda5a1a4374e8a012b51255341abf5", size = 350996, upload-time = "2025-10-06T14:11:03.452Z" },
+ { url = "https://files.pythonhosted.org/packages/81/c8/06e1d69295792ba54d556f06686cbd6a7ce39c22307100e3fb4a2c0b0a1d/yarl-1.22.0-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:5a3bf7f62a289fa90f1990422dc8dff5a458469ea71d1624585ec3a4c8d6960f", size = 356047, upload-time = "2025-10-06T14:11:05.115Z" },
+ { url = "https://files.pythonhosted.org/packages/4b/b8/4c0e9e9f597074b208d18cef227d83aac36184bfbc6eab204ea55783dbc5/yarl-1.22.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:de6b9a04c606978fdfe72666fa216ffcf2d1a9f6a381058d4378f8d7b1e5de62", size = 342947, upload-time = "2025-10-06T14:11:08.137Z" },
+ { url = "https://files.pythonhosted.org/packages/e0/e5/11f140a58bf4c6ad7aca69a892bff0ee638c31bea4206748fc0df4ebcb3a/yarl-1.22.0-cp313-cp313t-win32.whl", hash = "sha256:1834bb90991cc2999f10f97f5f01317f99b143284766d197e43cd5b45eb18d03", size = 86943, upload-time = "2025-10-06T14:11:10.284Z" },
+ { url = "https://files.pythonhosted.org/packages/31/74/8b74bae38ed7fe6793d0c15a0c8207bbb819cf287788459e5ed230996cdd/yarl-1.22.0-cp313-cp313t-win_amd64.whl", hash = "sha256:ff86011bd159a9d2dfc89c34cfd8aff12875980e3bd6a39ff097887520e60249", size = 93715, upload-time = "2025-10-06T14:11:11.739Z" },
+ { url = "https://files.pythonhosted.org/packages/69/66/991858aa4b5892d57aef7ee1ba6b4d01ec3b7eb3060795d34090a3ca3278/yarl-1.22.0-cp313-cp313t-win_arm64.whl", hash = "sha256:7861058d0582b847bc4e3a4a4c46828a410bca738673f35a29ba3ca5db0b473b", size = 83857, upload-time = "2025-10-06T14:11:13.586Z" },
+ { url = "https://files.pythonhosted.org/packages/46/b3/e20ef504049f1a1c54a814b4b9bed96d1ac0e0610c3b4da178f87209db05/yarl-1.22.0-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:34b36c2c57124530884d89d50ed2c1478697ad7473efd59cfd479945c95650e4", size = 140520, upload-time = "2025-10-06T14:11:15.465Z" },
+ { url = "https://files.pythonhosted.org/packages/e4/04/3532d990fdbab02e5ede063676b5c4260e7f3abea2151099c2aa745acc4c/yarl-1.22.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:0dd9a702591ca2e543631c2a017e4a547e38a5c0f29eece37d9097e04a7ac683", size = 93504, upload-time = "2025-10-06T14:11:17.106Z" },
+ { url = "https://files.pythonhosted.org/packages/11/63/ff458113c5c2dac9a9719ac68ee7c947cb621432bcf28c9972b1c0e83938/yarl-1.22.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:594fcab1032e2d2cc3321bb2e51271e7cd2b516c7d9aee780ece81b07ff8244b", size = 94282, upload-time = "2025-10-06T14:11:19.064Z" },
+ { url = "https://files.pythonhosted.org/packages/a7/bc/315a56aca762d44a6aaaf7ad253f04d996cb6b27bad34410f82d76ea8038/yarl-1.22.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f3d7a87a78d46a2e3d5b72587ac14b4c16952dd0887dbb051451eceac774411e", size = 372080, upload-time = "2025-10-06T14:11:20.996Z" },
+ { url = "https://files.pythonhosted.org/packages/3f/3f/08e9b826ec2e099ea6e7c69a61272f4f6da62cb5b1b63590bb80ca2e4a40/yarl-1.22.0-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:852863707010316c973162e703bddabec35e8757e67fcb8ad58829de1ebc8590", size = 338696, upload-time = "2025-10-06T14:11:22.847Z" },
+ { url = "https://files.pythonhosted.org/packages/e3/9f/90360108e3b32bd76789088e99538febfea24a102380ae73827f62073543/yarl-1.22.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:131a085a53bfe839a477c0845acf21efc77457ba2bcf5899618136d64f3303a2", size = 387121, upload-time = "2025-10-06T14:11:24.889Z" },
+ { url = "https://files.pythonhosted.org/packages/98/92/ab8d4657bd5b46a38094cfaea498f18bb70ce6b63508fd7e909bd1f93066/yarl-1.22.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:078a8aefd263f4d4f923a9677b942b445a2be970ca24548a8102689a3a8ab8da", size = 394080, upload-time = "2025-10-06T14:11:27.307Z" },
+ { url = "https://files.pythonhosted.org/packages/f5/e7/d8c5a7752fef68205296201f8ec2bf718f5c805a7a7e9880576c67600658/yarl-1.22.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bca03b91c323036913993ff5c738d0842fc9c60c4648e5c8d98331526df89784", size = 372661, upload-time = "2025-10-06T14:11:29.387Z" },
+ { url = "https://files.pythonhosted.org/packages/b6/2e/f4d26183c8db0bb82d491b072f3127fb8c381a6206a3a56332714b79b751/yarl-1.22.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:68986a61557d37bb90d3051a45b91fa3d5c516d177dfc6dd6f2f436a07ff2b6b", size = 364645, upload-time = "2025-10-06T14:11:31.423Z" },
+ { url = "https://files.pythonhosted.org/packages/80/7c/428e5812e6b87cd00ee8e898328a62c95825bf37c7fa87f0b6bb2ad31304/yarl-1.22.0-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:4792b262d585ff0dff6bcb787f8492e40698443ec982a3568c2096433660c694", size = 355361, upload-time = "2025-10-06T14:11:33.055Z" },
+ { url = "https://files.pythonhosted.org/packages/ec/2a/249405fd26776f8b13c067378ef4d7dd49c9098d1b6457cdd152a99e96a9/yarl-1.22.0-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:ebd4549b108d732dba1d4ace67614b9545b21ece30937a63a65dd34efa19732d", size = 381451, upload-time = "2025-10-06T14:11:35.136Z" },
+ { url = "https://files.pythonhosted.org/packages/67/a8/fb6b1adbe98cf1e2dd9fad71003d3a63a1bc22459c6e15f5714eb9323b93/yarl-1.22.0-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:f87ac53513d22240c7d59203f25cc3beac1e574c6cd681bbfd321987b69f95fd", size = 383814, upload-time = "2025-10-06T14:11:37.094Z" },
+ { url = "https://files.pythonhosted.org/packages/d9/f9/3aa2c0e480fb73e872ae2814c43bc1e734740bb0d54e8cb2a95925f98131/yarl-1.22.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:22b029f2881599e2f1b06f8f1db2ee63bd309e2293ba2d566e008ba12778b8da", size = 370799, upload-time = "2025-10-06T14:11:38.83Z" },
+ { url = "https://files.pythonhosted.org/packages/50/3c/af9dba3b8b5eeb302f36f16f92791f3ea62e3f47763406abf6d5a4a3333b/yarl-1.22.0-cp314-cp314-win32.whl", hash = "sha256:6a635ea45ba4ea8238463b4f7d0e721bad669f80878b7bfd1f89266e2ae63da2", size = 82990, upload-time = "2025-10-06T14:11:40.624Z" },
+ { url = "https://files.pythonhosted.org/packages/ac/30/ac3a0c5bdc1d6efd1b41fa24d4897a4329b3b1e98de9449679dd327af4f0/yarl-1.22.0-cp314-cp314-win_amd64.whl", hash = "sha256:0d6e6885777af0f110b0e5d7e5dda8b704efed3894da26220b7f3d887b839a79", size = 88292, upload-time = "2025-10-06T14:11:42.578Z" },
+ { url = "https://files.pythonhosted.org/packages/df/0a/227ab4ff5b998a1b7410abc7b46c9b7a26b0ca9e86c34ba4b8d8bc7c63d5/yarl-1.22.0-cp314-cp314-win_arm64.whl", hash = "sha256:8218f4e98d3c10d683584cb40f0424f4b9fd6e95610232dd75e13743b070ee33", size = 82888, upload-time = "2025-10-06T14:11:44.863Z" },
+ { url = "https://files.pythonhosted.org/packages/06/5e/a15eb13db90abd87dfbefb9760c0f3f257ac42a5cac7e75dbc23bed97a9f/yarl-1.22.0-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:45c2842ff0e0d1b35a6bf1cd6c690939dacb617a70827f715232b2e0494d55d1", size = 146223, upload-time = "2025-10-06T14:11:46.796Z" },
+ { url = "https://files.pythonhosted.org/packages/18/82/9665c61910d4d84f41a5bf6837597c89e665fa88aa4941080704645932a9/yarl-1.22.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:d947071e6ebcf2e2bee8fce76e10faca8f7a14808ca36a910263acaacef08eca", size = 95981, upload-time = "2025-10-06T14:11:48.845Z" },
+ { url = "https://files.pythonhosted.org/packages/5d/9a/2f65743589809af4d0a6d3aa749343c4b5f4c380cc24a8e94a3c6625a808/yarl-1.22.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:334b8721303e61b00019474cc103bdac3d7b1f65e91f0bfedeec2d56dfe74b53", size = 97303, upload-time = "2025-10-06T14:11:50.897Z" },
+ { url = "https://files.pythonhosted.org/packages/b0/ab/5b13d3e157505c43c3b43b5a776cbf7b24a02bc4cccc40314771197e3508/yarl-1.22.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1e7ce67c34138a058fd092f67d07a72b8e31ff0c9236e751957465a24b28910c", size = 361820, upload-time = "2025-10-06T14:11:52.549Z" },
+ { url = "https://files.pythonhosted.org/packages/fb/76/242a5ef4677615cf95330cfc1b4610e78184400699bdda0acb897ef5e49a/yarl-1.22.0-cp314-cp314t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:d77e1b2c6d04711478cb1c4ab90db07f1609ccf06a287d5607fcd90dc9863acf", size = 323203, upload-time = "2025-10-06T14:11:54.225Z" },
+ { url = "https://files.pythonhosted.org/packages/8c/96/475509110d3f0153b43d06164cf4195c64d16999e0c7e2d8a099adcd6907/yarl-1.22.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c4647674b6150d2cae088fc07de2738a84b8bcedebef29802cf0b0a82ab6face", size = 363173, upload-time = "2025-10-06T14:11:56.069Z" },
+ { url = "https://files.pythonhosted.org/packages/c9/66/59db471aecfbd559a1fd48aedd954435558cd98c7d0da8b03cc6c140a32c/yarl-1.22.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:efb07073be061c8f79d03d04139a80ba33cbd390ca8f0297aae9cce6411e4c6b", size = 373562, upload-time = "2025-10-06T14:11:58.783Z" },
+ { url = "https://files.pythonhosted.org/packages/03/1f/c5d94abc91557384719da10ff166b916107c1b45e4d0423a88457071dd88/yarl-1.22.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e51ac5435758ba97ad69617e13233da53908beccc6cfcd6c34bbed8dcbede486", size = 339828, upload-time = "2025-10-06T14:12:00.686Z" },
+ { url = "https://files.pythonhosted.org/packages/5f/97/aa6a143d3afba17b6465733681c70cf175af89f76ec8d9286e08437a7454/yarl-1.22.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:33e32a0dd0c8205efa8e83d04fc9f19313772b78522d1bdc7d9aed706bfd6138", size = 347551, upload-time = "2025-10-06T14:12:02.628Z" },
+ { url = "https://files.pythonhosted.org/packages/43/3c/45a2b6d80195959239a7b2a8810506d4eea5487dce61c2a3393e7fc3c52e/yarl-1.22.0-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:bf4a21e58b9cde0e401e683ebd00f6ed30a06d14e93f7c8fd059f8b6e8f87b6a", size = 334512, upload-time = "2025-10-06T14:12:04.871Z" },
+ { url = "https://files.pythonhosted.org/packages/86/a0/c2ab48d74599c7c84cb104ebd799c5813de252bea0f360ffc29d270c2caa/yarl-1.22.0-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:e4b582bab49ac33c8deb97e058cd67c2c50dac0dd134874106d9c774fd272529", size = 352400, upload-time = "2025-10-06T14:12:06.624Z" },
+ { url = "https://files.pythonhosted.org/packages/32/75/f8919b2eafc929567d3d8411f72bdb1a2109c01caaab4ebfa5f8ffadc15b/yarl-1.22.0-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:0b5bcc1a9c4839e7e30b7b30dd47fe5e7e44fb7054ec29b5bb8d526aa1041093", size = 357140, upload-time = "2025-10-06T14:12:08.362Z" },
+ { url = "https://files.pythonhosted.org/packages/cf/72/6a85bba382f22cf78add705d8c3731748397d986e197e53ecc7835e76de7/yarl-1.22.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:c0232bce2170103ec23c454e54a57008a9a72b5d1c3105dc2496750da8cfa47c", size = 341473, upload-time = "2025-10-06T14:12:10.994Z" },
+ { url = "https://files.pythonhosted.org/packages/35/18/55e6011f7c044dc80b98893060773cefcfdbf60dfefb8cb2f58b9bacbd83/yarl-1.22.0-cp314-cp314t-win32.whl", hash = "sha256:8009b3173bcd637be650922ac455946197d858b3630b6d8787aa9e5c4564533e", size = 89056, upload-time = "2025-10-06T14:12:13.317Z" },
+ { url = "https://files.pythonhosted.org/packages/f9/86/0f0dccb6e59a9e7f122c5afd43568b1d31b8ab7dda5f1b01fb5c7025c9a9/yarl-1.22.0-cp314-cp314t-win_amd64.whl", hash = "sha256:9fb17ea16e972c63d25d4a97f016d235c78dd2344820eb35bc034bc32012ee27", size = 96292, upload-time = "2025-10-06T14:12:15.398Z" },
+ { url = "https://files.pythonhosted.org/packages/48/b7/503c98092fb3b344a179579f55814b613c1fbb1c23b3ec14a7b008a66a6e/yarl-1.22.0-cp314-cp314t-win_arm64.whl", hash = "sha256:9f6d73c1436b934e3f01df1e1b21ff765cd1d28c77dfb9ace207f746d4610ee1", size = 85171, upload-time = "2025-10-06T14:12:16.935Z" },
+ { url = "https://files.pythonhosted.org/packages/73/ae/b48f95715333080afb75a4504487cbe142cae1268afc482d06692d605ae6/yarl-1.22.0-py3-none-any.whl", hash = "sha256:1380560bdba02b6b6c90de54133c81c9f2a453dee9912fe58c1dcced1edb7cff", size = 46814, upload-time = "2025-10-06T14:12:53.872Z" },
+]