API Reference¶
CLI¶
cal_disp.cli ¶
cli_app ¶
cli_app(ctx: Context, debug: bool) -> None
Run a DISP calibration workflow.
Source code in src/cal_disp/cli/__init__.py
download ¶
burst_bounds ¶
burst_bounds(input_file: Path, output_dir: Path) -> None
Download S1 CSLC boundary tiles for a DISP-S1 file.
Extracts frame ID and sensing times from the DISP-S1 filename, then downloads corresponding Sentinel-1 burst boundary geometries. Output directory is created if it doesn't exist.
Only supports DISP-S1 products (sensor must be S1).
Examples:
Basic usage:
$ cal-disp download burst-bounds -i disp_product.nc -o ./burst_data
Full path example:
$ cal-disp download burst-bounds \
    -i OPERA_L3_DISP-S1_IW_F08882_VV_20220111T002651Z_20220722T002657Z_v1.0.nc \
    -o ./burst_bounds
Source code in src/cal_disp/cli/download.py
disp_s1 ¶
disp_s1(frame_id: int, output_dir: Path, start: datetime | None, end: datetime | None, num_workers: int) -> None
Download OPERA DISP-S1 products for a frame.
Downloads displacement products from the OPERA DISP-S1 archive for the specified frame and date range. Products are filtered based on the secondary date of each interferogram.
Examples:
Download all products for a frame:
$ cal-disp download disp-s1 --frame-id 8882 -o ./disp_data
Download products for a specific date range:
$ cal-disp download disp-s1 --frame-id 8882 -o ./disp_data \
    --start 2022-07-01 --end 2022-07-31
Use more workers for faster downloads:
$ cal-disp download disp-s1 --frame-id 8882 -o ./disp_data -n 8
Source code in src/cal_disp/cli/download.py
download_group ¶
download_group()
Sub-commands for downloading prerequisite data.
Source code in src/cal_disp/cli/download.py
tropo ¶
tropo(input_file: Path, output_dir: Path, num_workers: int, interp: bool) -> None
Download OPERA TROPO for a DISP-S1 file.
Extracts sensing times from the input DISP-S1 product filename and downloads corresponding OPERA TROPO data. Output directory is created if it doesn't exist.
The input file must follow the OPERA naming convention: OPERA_L3_DISP-S1_IW_F{frame}_VV_{ref_date}_{sec_date}_v1.0_{proc_date}.nc
Examples:
Basic usage:
$ cal-disp download tropo -i disp_product.nc -o ./tropo_data
With temporal interpolation:
$ cal-disp download tropo -i disp_product.nc -o ./tropo_data --interp
Using more workers:
$ cal-disp download tropo -i disp_product.nc -o ./tropo_data -n 8
Source code in src/cal_disp/cli/download.py
unr ¶
unr(frame_id: int, output_dir: Path, start: datetime | None, end: datetime | None, margin: float) -> None
Download UNR GPS timeseries data for a DISP-S1 frame.
Downloads GPS timeseries grid from the Nevada Geodetic Laboratory (UNR) within the frame's bounding box (expanded by margin). Data is saved as a parquet file for efficient loading. Output directory is created if it doesn't exist.
Examples:
Download UNR data for a frame:
$ cal-disp download unr --frame-id 8882 -o ./unr_data
With date range:
$ cal-disp download unr --frame-id 8882 -o ./unr_data \
    --start 2022-01-01 --end 2023-12-31
Expand bounding box by 1 degree:
$ cal-disp download unr --frame-id 8882 -o ./unr_data -m 1.0
Source code in src/cal_disp/cli/download.py
Download¶
cal_disp.download ¶
download_disp ¶
download_disp(frame_id: int, output_dir: Path, start: datetime | None = None, end: datetime | None = None, num_workers: int = DEFAULT_NUM_WORKERS) -> None
Download DISP-S1 products for a frame.
Downloads displacement products from the OPERA DISP-S1 archive for the specified frame and date range. Products are filtered based on the secondary date of each interferogram.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| frame_id | int | OPERA frame identifier. | required |
| output_dir | Path | Directory where products will be saved. | required |
| start | datetime or None | Start date for query (based on secondary date). Default is None (no start limit). | None |
| end | datetime or None | End date for query (based on secondary date). Default is None (no end limit). | None |
| num_workers | int | Number of parallel download workers. Default is 2. | DEFAULT_NUM_WORKERS |
Raises:
| Type | Description |
|---|---|
| ValueError | If frame ID is not in the database or no products are found. |
Notes
Date queries are based on the secondary (later) date of each interferometric pair. If start and end dates are identical, the range is automatically expanded by ±1 day and num_workers is set to 1 to ensure the specific product is captured.
Examples:
>>> download_disp(
... frame_id=8887,
... output_dir=Path("./data"),
... start=datetime(2024, 1, 1),
... end=datetime(2024, 12, 31)
... )
Source code in src/cal_disp/download/_stage_disp.py
download_tropo ¶
download_tropo(disp_times: list[datetime | Timestamp], output_dir: Path | str, num_workers: int = 4, interp: bool = True) -> None
Download tropospheric correction data for displacement times.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| disp_times | list of datetime or pd.Timestamp | Displacement measurement times. | required |
| output_dir | Path or str | Output directory for downloads. | required |
| num_workers | int | Parallel download workers. | 4 |
| interp | bool | If True, get 2 scenes per time (for interpolation). If False, get single nearest scene. | True |
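Examples:

A minimal usage sketch; the sensing times and output path below are illustrative, not taken from a real product:

>>> from datetime import datetime
>>> from pathlib import Path
>>> from cal_disp.download import download_tropo
>>> # Two acquisition times, e.g. extracted from a DISP-S1 filename
>>> times = [datetime(2022, 1, 11, 0, 26, 51), datetime(2022, 7, 22, 0, 26, 57)]
>>> download_tropo(times, output_dir=Path("./tropo_data"), num_workers=4, interp=True)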
Source code in src/cal_disp/download/_stage_tropo.py
download_unr_grid ¶
download_unr_grid(frame_id: int, output_dir: Path, start: datetime | None = None, end: datetime | None = None, margin_deg: float = 0.5, plate: Literal['NA', 'PA', 'IGS14', 'IGS20'] = 'IGS20', version: Literal['0.1', '0.2'] = '0.2') -> None
Download UNR gridded GNSS timeseries for a given frame.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| frame_id | int | OPERA frame identifier. | required |
| output_dir | Path | Output directory for downloaded data. | required |
| start | datetime or None | Start date for timeseries. If None, downloads from beginning. | None |
| end | datetime or None | End date for timeseries. If None, downloads until present. | None |
| margin_deg | float | Margin in degrees to expand frame bounding box. | 0.5 |
| plate | {'NA', 'PA', 'IGS14', 'IGS20'} | Reference plate for velocity computation. | 'IGS20' |
| version | {'0.1', '0.2'} | UNR grid version to download. | '0.2' |
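Examples:

A minimal usage sketch; the frame ID and output path are illustrative:

>>> from pathlib import Path
>>> from cal_disp.download import download_unr_grid
>>> download_unr_grid(
...     frame_id=8882,
...     output_dir=Path("./unr_data"),
...     margin_deg=0.5,
...     plate="IGS20",
...     version="0.2",
... )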
Source code in src/cal_disp/download/_stage_unr.py
generate_s1_burst_tiles ¶
generate_s1_burst_tiles(frame_id: int, sensing_time: datetime, output_dir: Path, time_window_hours: float = 2.0, n_download_processes: int = 5) -> Path
Generate non-overlapping burst tiles for a frame and sensing time.
Downloads CSLC data, processes bursts to create non-overlapping polygons, and saves to GeoJSON. Priority: IW1 > IW2 > IW3, lower burst_id first.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| frame_id | int | OPERA frame identifier. | required |
| sensing_time | datetime | Sensing time to search for CSLC products. | required |
| output_dir | Path | Directory to save output GeoJSON and temporary files. | required |
| time_window_hours | float | Time window in hours for searching CSLC products. Default is 2.0. | 2.0 |
| n_download_processes | int | Number of parallel download processes. Default is 5. | 5 |

Returns:

| Type | Description |
|---|---|
| Path | Path to the generated GeoJSON file. |

Raises:

| Type | Description |
|---|---|
| ValueError | If no bursts are found or download fails. |
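Examples:

A minimal usage sketch; the frame ID and sensing time are illustrative, and the call needs network access to download CSLC data:

>>> from datetime import datetime
>>> from pathlib import Path
>>> from cal_disp.download import generate_s1_burst_tiles
>>> tiles_geojson = generate_s1_burst_tiles(
...     frame_id=8882,
...     sensing_time=datetime(2022, 1, 11, 0, 26, 51),
...     output_dir=Path("./burst_bounds"),
... )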
Source code in src/cal_disp/download/_stage_burst_bounds.py
utils ¶
extract_sensing_times_from_file ¶
extract_sensing_times_from_file(disp_file: Path) -> list[datetime]
Extract sensing times from DISP-S1 NetCDF filename.
Parses the DISP-S1 filename to extract reference and secondary dates. Expected format: OPERA_L3_DISP-S1_IW_F{frame}_VV_{ref_date}_{sec_date}_v{version}_{prod_date}.nc
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| disp_file | Path | Path to DISP-S1 NetCDF file. | required |

Returns:

| Type | Description |
|---|---|
| list[datetime] | List of unique sensing times from reference and secondary dates. |

Raises:

| Type | Description |
|---|---|
| FileNotFoundError | If DISP file does not exist. |
| ValueError | If dates cannot be parsed from filename. |
Examples:
>>> from pathlib import Path
>>> filename = Path("OPERA_L3_*.nc")
>>> times = extract_sensing_times_from_file(filename)
>>> len(times)
2
Source code in src/cal_disp/download/utils.py
Product¶
cal_disp.product ¶
CalProduct dataclass ¶
Calibrated OPERA DISP displacement product.
Represents a calibration correction product that should be subtracted from OPERA DISP products. Main group contains calibration at full resolution. Optional model_3d group contains 3D displacement components at coarser resolution.
Groups
Main group:
- calibration: Correction to subtract from DISP (full resolution)
- calibration_std: Calibration uncertainty (full resolution)

model_3d group (optional):
- north_south: North-south displacement component (coarse resolution)
- east_west: East-west displacement component (coarse resolution)
- up_down: Up-down displacement component (coarse resolution)
- north_south_std: Uncertainty in north-south
- east_west_std: Uncertainty in east-west
- up_down_std: Uncertainty in up-down
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| path | Path | Path to the calibration product NetCDF file. | required |
| frame_id | int | OPERA frame identifier. | required |
| primary_date | datetime | Earlier acquisition date (reference). | required |
| secondary_date | datetime | Later acquisition date. | required |
| polarization | str | Radar polarization (e.g., "VV", "VH"). | required |
| sensor | str | Sensor type: "S1" (Sentinel-1) or "NI" (NISAR). | required |
| version | str | Product version string. | required |
| production_date | datetime | Date when product was generated. | required |
| mode | str | Acquisition mode (e.g., "IW" for S1, "LSAR" for NI). | 'IW' |
Examples:
>>> # Create product with both groups
>>> cal = CalProduct.create(
... calibration=cal_correction,
... disp_product=disp,
... output_dir="output/",
... model_3d={"north_south": vel_ns, "east_west": vel_ew, "up_down": vel_ud},
... )
>>> # Access main calibration
>>> ds_main = cal.open_dataset()
>>> calibration = ds_main["calibration"]
>>> # Access 3D model (coarse resolution)
>>> ds_model = cal.open_model_3d()
>>> model_up = ds_model["up_down"]
Source code in src/cal_disp/product/_cal.py
create classmethod ¶
create(calibration: DataArray, disp_product: DispProduct, output_dir: Path | str, sensor: str = 'S1', calibration_std: DataArray | None = None, model_3d: dict[str, DataArray] | None = None, model_3d_std: dict[str, DataArray] | None = None, metadata: dict[str, str] | None = None, version: str = '1.0') -> CalProduct
Create calibration product with optional model_3d group.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| calibration | DataArray | Calibration correction at full DISP resolution. | required |
| disp_product | DispProduct | Original DISP product (for metadata). | required |
| output_dir | Path or str | Output directory for NetCDF file. | required |
| sensor | str | Sensor type: "S1" or "NI". Default is "S1". | 'S1' |
| calibration_std | DataArray or None | Calibration uncertainty at full resolution. Default is None. | None |
| model_3d | dict[str, DataArray] or None | 3D displacement components (coarse resolution) with keys: "north_south", "east_west", "up_down". Default is None. | None |
| model_3d_std | dict[str, DataArray] or None | 3D displacement uncertainties (coarse resolution). Default is None. | None |
| metadata | dict[str, str] or None | Additional metadata (e.g., GNSS reference epoch). Default is None. | None |
| version | str | Product version. Default is "1.0". | '1.0' |

Returns:

| Type | Description |
|---|---|
| CalProduct | Created calibration product. |
Examples:
>>> from cal_disp.product import DispProduct, CalProduct
>>>
>>> disp = DispProduct.from_path(
...     "OPERA_L3_DISP-S1_IW_F08882_VV_20220111T002651Z_20220722T002657Z_v1.0_20251027T005420Z.nc")
>>>
>>> # Full resolution calibration
>>> cal_full = calibration_at_30m_resolution
>>>
>>> # Coarse resolution 3D model (e.g., 90m from GNSS interpolation)
>>> model_coarse = {
... "north_south": disp_ns_90m,
... "east_west": disp_ew_90m,
... "up_down": disp_ud_90m,
... }
>>>
>>> cal = CalProduct.create(
... calibration=cal_full,
... disp_product=disp,
... output_dir="output/",
... calibration_std=cal_std,
... model_3d=model_coarse,
... model_3d_std={
... "north_south_std": disp_ns_std_90m,
... "east_west_std": disp_ew_std_90m,
... "up_down_std": disp_ud_std_90m,
... },
... metadata={
... "gnss_reference_epoch": "2020-01-01T00:00:00Z",
... "model_3d_resolution": "90m",
... },
... )
Source code in src/cal_disp/product/_cal.py
from_path classmethod ¶
from_path(path: Path | str) -> CalProduct
Parse product metadata from filename.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| path | Path or str | Path to calibration product NetCDF file. | required |

Returns:

| Type | Description |
|---|---|
| CalProduct | Parsed calibration product instance. |

Raises:

| Type | Description |
|---|---|
| ValueError | If filename doesn't match OPERA CAL-DISP pattern. |
Examples:
>>> cal = CalProduct.from_path(
...     "OPERA_L4_CAL-DISP-S1_IW_F08882_VV_20220111T002651Z_20220722T002657Z_v1.0_20251227T123456Z.nc")
>>> cal.sensor
'S1'
Source code in src/cal_disp/product/_cal.py
get_bounds ¶
get_bounds() -> dict[str, float]
Get bounds in native projection.
Source code in src/cal_disp/product/_cal.py
get_bounds_wgs84 ¶
get_bounds_wgs84() -> dict[str, float]
Get bounds transformed to WGS84.
Source code in src/cal_disp/product/_cal.py
get_calibration_summary ¶
get_calibration_summary() -> dict[str, dict[str, float]]
Get summary statistics of all layers.
Returns:
| Type | Description |
|---|---|
| dict[str, dict[str, float]] | Statistics for main and model_3d groups. |
Examples:
>>> summary = cal.get_calibration_summary()
>>> summary["main"]["calibration"]
{'mean': 0.023, 'std': 0.015, 'min': -0.05, 'max': 0.08}
>>> summary["model_3d"]["up_down"]
{'mean': 0.001, 'std': 0.003, 'min': -0.01, 'max': 0.02}
Source code in src/cal_disp/product/_cal.py
get_epsg ¶
get_epsg() -> int | None
Get EPSG code from spatial reference.
Source code in src/cal_disp/product/_cal.py
has_model_3d ¶
has_model_3d() -> bool
Check if product contains model_3d group.
Returns:
| Type | Description |
|---|---|
| bool | True if model_3d group exists. |
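Examples:

A short sketch, assuming `cal` is a CalProduct (as in the class-level example); checking has_model_3d first avoids the ValueError raised by open_model_3d when the group is absent:

>>> if cal.has_model_3d():
...     ds_model = cal.open_model_3d()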
Source code in src/cal_disp/product/_cal.py
open_dataset ¶
open_dataset(group: str | None = None) -> xr.Dataset
Open calibration dataset.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| group | str or None | Group to open: None for main, "model_3d" for 3D model. Default is None (main group). | None |

Returns:

| Type | Description |
|---|---|
| Dataset | Dataset containing requested group. |

Raises:

| Type | Description |
|---|---|
| FileNotFoundError | If product file does not exist. |
Examples:
>>> # Open main calibration (full resolution)
>>> ds_main = cal.open_dataset()
>>> calibration = ds_main["calibration"]
>>> # Open model_3d group (coarse resolution)
>>> ds_model = cal.open_dataset(group="model_3d")
>>> model_up = ds_model["up_down"]
Source code in src/cal_disp/product/_cal.py
open_model_3d ¶
open_model_3d() -> xr.Dataset
Open model_3d group dataset.
Returns:
| Type | Description |
|---|---|
| Dataset | Dataset containing 3D displacement model at coarse resolution. |

Raises:

| Type | Description |
|---|---|
| FileNotFoundError | If product file does not exist. |
| ValueError | If model_3d group does not exist. |
Examples:
>>> ds_model = cal.open_model_3d()
>>> disp_ns = ds_model["north_south"]
>>> disp_ew = ds_model["east_west"]
>>> disp_up = ds_model["up_down"]
Source code in src/cal_disp/product/_cal.py
to_geotiff ¶
to_geotiff(layer: str, output_path: Path | str, group: str | None = None, compress: str = 'DEFLATE', **kwargs) -> Path
Export layer to GeoTIFF.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| layer | str | Name of layer to export. | required |
| output_path | Path or str | Output GeoTIFF path. | required |
| group | str or None | Group containing layer. Default is None (main group). | None |
| compress | str | Compression method. Default is "DEFLATE". | 'DEFLATE' |
| **kwargs | | Additional rasterio creation options. | {} |

Returns:

| Type | Description |
|---|---|
| Path | Path to created GeoTIFF. |
Examples:
>>> # Export main calibration
>>> cal.to_geotiff("calibration", "calibration.tif")
>>> # Export 3D model component
>>> cal.to_geotiff("up_down", "model_up.tif", group="model_3d")
Source code in src/cal_disp/product/_cal.py
DispProduct dataclass ¶
OPERA DISP-S1 displacement product.
Represents a Level-3 interferometric displacement product from the OPERA DISP-S1 archive. Products contain displacement measurements between two Sentinel-1 acquisition dates.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| path | Path | Path to the NetCDF product file. | required |
| frame_id | int | OPERA frame identifier (e.g., 8882). | required |
| primary_date | datetime | Earlier acquisition date (reference). | required |
| secondary_date | datetime | Later acquisition date. | required |
| polarization | str | Radar polarization (e.g., "VV", "VH"). | required |
| version | str | Product version string (e.g., "1.0"). | required |
| production_date | datetime | Date when product was generated. | required |
| mode | str | Acquisition mode (e.g., "IW"). Default is "IW". | 'IW' |
Examples:
>>> path = Path(
...     "OPERA_L3_DISP-S1_IW_F08882_VV_20220111T002651Z_20220722T002657Z_v1.0_20251027T005420Z.nc")
>>> product = DispProduct.from_path(path)
>>> product.frame_id
8882
Get reference point¶
>>> row, col = product.get_reference_point_index()
>>> lat, lon = product.get_reference_point_latlon()
>>> print(f"Reference at ({row}, {col}): {lat:.4f}°N, {lon:.4f}°E")
Source code in src/cal_disp/product/_disp.py
from_path classmethod ¶
from_path(path: Path | str) -> DispProduct
Parse product metadata from filename.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| path | Path or str | Path to OPERA DISP-S1 NetCDF file. | required |

Returns:

| Type | Description |
|---|---|
| DispProduct | Parsed product instance. |

Raises:

| Type | Description |
|---|---|
| ValueError | If filename doesn't match expected OPERA DISP-S1 format. |
Source code in src/cal_disp/product/_disp.py
get_bounds ¶
get_bounds() -> dict[str, float]
Get bounds in native projection coordinates.
Returns:
| Type | Description |
|---|---|
| dict[str, float] | Dictionary with keys: left, bottom, right, top. |
Examples:
>>> bounds = product.get_bounds()
>>> bounds
{'left': 71970.0, 'bottom': 3153930.0, 'right': 355890.0, 'top': 3385920.0}
Source code in src/cal_disp/product/_disp.py
get_bounds_wgs84 ¶
get_bounds_wgs84() -> dict[str, float]
Get bounds transformed to WGS84 (EPSG:4326).
Returns:
| Type | Description |
|---|---|
| dict[str, float] | Dictionary with keys: west, south, east, north in decimal degrees. |
Examples:
>>> bounds = product.get_bounds_wgs84()
>>> bounds
{'west': -95.567, 'south': 28.486, 'east': -93.212, 'north': 30.845}
Source code in src/cal_disp/product/_disp.py
get_epsg ¶
get_epsg() -> int | None
Get EPSG code from spatial reference.
Returns:
| Type | Description |
|---|---|
| int or None | EPSG code if found, None otherwise. |
Examples:
>>> product.get_epsg()
32615
Source code in src/cal_disp/product/_disp.py
get_reference_point_index ¶
get_reference_point_index() -> tuple[int, int]
Get reference point pixel indices.
The reference point is where the phase was set to zero during processing. This is stored in the corrections group.
Returns:
| Type | Description |
|---|---|
| tuple[int, int] | Row and column indices (row, col) of reference point. |

Raises:

| Type | Description |
|---|---|
| ValueError | If reference_point variable not found or missing attributes. |
Examples:
>>> row, col = product.get_reference_point_index()
>>> print(f"Reference point at pixel ({row}, {col})")
Source code in src/cal_disp/product/_disp.py
get_reference_point_latlon ¶
get_reference_point_latlon() -> tuple[float, float]
Get reference point geographic coordinates.
Returns latitude and longitude of the reference point in WGS84.
Returns:
| Type | Description |
|---|---|
| tuple[float, float] | Latitude and longitude in decimal degrees (lat, lon). |

Raises:

| Type | Description |
|---|---|
| ValueError | If reference_point variable not found or missing attributes. |
Examples:
>>> lat, lon = product.get_reference_point_latlon()
>>> print(f"Reference point: {lat:.6f}°N, {lon:.6f}°E")
Source code in src/cal_disp/product/_disp.py
open_corrections ¶
open_corrections() -> xr.Dataset
Open corrections group dataset.
Returns:
| Type | Description |
|---|---|
| Dataset | Corrections dataset containing ionospheric delay, solid earth tide, etc. |

Raises:

| Type | Description |
|---|---|
| FileNotFoundError | If product file does not exist. |
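Examples:

A short sketch, assuming `product` is a DispProduct; the ionospheric_delay layer name follows the to_geotiff example for the corrections group:

>>> ds_corr = product.open_corrections()
>>> iono = ds_corr["ionospheric_delay"]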
Source code in src/cal_disp/product/_disp.py
open_dataset ¶
open_dataset(group: Literal['main', 'corrections'] | None = None) -> xr.Dataset
Open dataset.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| group | {'main', 'corrections'} or None | Which group to open. If None, opens main group. Default is None. | None |

Returns:

| Type | Description |
|---|---|
| Dataset | Dataset containing displacement and quality layers. |

Raises:

| Type | Description |
|---|---|
| FileNotFoundError | If product file does not exist. |
Source code in src/cal_disp/product/_disp.py
to_geotiff ¶
to_geotiff(layer: str, output_path: Path | str, group: Literal['main', 'corrections'] = 'main', compress: str = 'DEFLATE', **kwargs) -> Path
Export layer to optimized GeoTIFF.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| layer | str | Name of layer to export (e.g., "displacement", "ionospheric_delay"). | required |
| output_path | Path or str | Output GeoTIFF path. | required |
| group | {'main', 'corrections'} | Which group to read from. Default is "main". | 'main' |
| compress | str | Compression method. Default is "DEFLATE". | 'DEFLATE' |
| **kwargs | | Additional rasterio creation options. | {} |

Returns:

| Type | Description |
|---|---|
| Path | Path to created GeoTIFF. |

Raises:

| Type | Description |
|---|---|
| ValueError | If layer not found in specified group. |
Examples:
>>> product.to_geotiff("displacement", "disp.tif")
>>> product.to_geotiff("ionospheric_delay", "iono.tif", group="corrections")
Source code in src/cal_disp/product/_disp.py
StaticLayer dataclass ¶
OPERA DISP-S1-STATIC layer.
Represents a single static layer (DEM, incidence angle, LOS vectors, etc.) used as input for DISP-S1 processing. These are frame-specific GeoTIFF files that don't change over time.
Examples:
>>> path = Path("OPERA_L3_DISP-S1-STATIC_F08882_20140403_S1A_v1.0_dem.tif")
>>> layer = StaticLayer.from_path(path)
>>> layer.frame_id
8882
Read LOS components¶
>>> los_layer = StaticLayer.from_path("..._line_of_sight_enu.tif")
>>> bands = los_layer.read_bands()
>>> east, north, up = bands[0], bands[1], bands[2]
Source code in src/cal_disp/product/_static.py
compute_incidence_angle ¶
compute_incidence_angle(fill_value: float = 0.0, dtype: dtype = np.float32) -> np.ndarray
Compute incidence angle from LOS up component.
Only valid for line_of_sight_enu layers.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| fill_value | float | Value to use for masked/invalid pixels. Default is 0.0. | 0.0 |
| dtype | dtype | Output data type. Default is np.float32. | np.float32 |

Returns:

| Type | Description |
|---|---|
| ndarray | Incidence angle in degrees (0-90°). |

Raises:

| Type | Description |
|---|---|
| ValueError | If not a line_of_sight_enu layer or wrong number of bands. |
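Examples:

A short sketch, assuming a local line_of_sight_enu static layer (filename elided as in the class-level example):

>>> los_layer = StaticLayer.from_path("..._line_of_sight_enu.tif")
>>> inc_deg = los_layer.compute_incidence_angle(fill_value=0.0)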
Source code in src/cal_disp/product/_static.py
export_incidence_angle ¶
export_incidence_angle(output_path: Path | str, fill_value: float = 0.0, nodata: float | None = 0.0, compress: str = 'DEFLATE', **kwargs) -> Path
Compute and export incidence angle to GeoTIFF.
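Examples:

A short sketch, assuming `los_layer` is a line_of_sight_enu StaticLayer; the output filename is illustrative:

>>> los_layer.export_incidence_angle("incidence_angle.tif")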
Source code in src/cal_disp/product/_static.py
from_path classmethod ¶
from_path(path: Path | str) -> StaticLayer
Parse layer metadata from filename.
Source code in src/cal_disp/product/_static.py
get_bounds ¶
get_bounds() -> dict[str, float]
Get bounds in native projection.
Source code in src/cal_disp/product/_static.py
get_bounds_wgs84 ¶
get_bounds_wgs84() -> dict[str, float]
Get bounds transformed to WGS84.
Source code in src/cal_disp/product/_static.py
get_crs ¶
get_crs() -> CRS
Get coordinate reference system.
Source code in src/cal_disp/product/_static.py
get_epsg ¶
get_epsg() -> int | None
Get EPSG code.
Source code in src/cal_disp/product/_static.py
get_nodata ¶
get_nodata() -> float | None
Get nodata value.
Source code in src/cal_disp/product/_static.py
get_profile ¶
get_profile() -> dict
Get rasterio profile.
Source code in src/cal_disp/product/_static.py
get_shape ¶
get_shape() -> tuple[int, int]
Get array shape.
Source code in src/cal_disp/product/_static.py
get_transform ¶
get_transform() -> Affine
Get affine transform.
Source code in src/cal_disp/product/_static.py
read ¶
read(band: int = 1, masked: bool = True) -> np.ndarray
Read single band data.
Source code in src/cal_disp/product/_static.py
read_bands ¶
read_bands(masked: bool = True) -> list[np.ndarray]
Read all bands.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| masked | bool | If True, return masked arrays with nodata values masked. Default is True. | True |

Returns:

| Type | Description |
|---|---|
| list[ndarray] | List of arrays, one per band. |
Examples:
>>> # Read DEM (single band)
>>> dem_layer = StaticLayer.from_path("..._dem.tif")
>>> bands = dem_layer.read_bands()
>>> dem = bands[0]
>>> # Read LOS vectors (three bands)
>>> los_layer = StaticLayer.from_path("..._line_of_sight_enu.tif")
>>> bands = los_layer.read_bands()
>>> east, north, up = bands[0], bands[1], bands[2]
Source code in src/cal_disp/product/_static.py
to_dataset ¶
to_dataset() -> xr.Dataset
Convert raster to xarray Dataset.
Source code in src/cal_disp/product/_static.py
TropoProduct dataclass ¶
OPERA TROPO-ZENITH tropospheric delay product.
Minimal class for managing OPERA tropospheric products. Processing functions are standalone for composability.
Examples:
>>> product = TropoProduct.from_path("tropo.nc")
>>> ds = product.open_dataset()
>>> total = product.get_total_delay()
Source code in src/cal_disp/product/_tropo.py
from_path classmethod ¶
from_path(path: Path | str) -> TropoProduct
Parse product metadata from filename.
Source code in src/cal_disp/product/_tropo.py
get_total_delay ¶
get_total_delay(time_idx: int = 0, bounds: tuple[float, float, float, float] | None = None, max_height: float | None = None, bounds_crs: str = 'EPSG:4326', bounds_buffer: float = 0.0) -> xr.DataArray
Get total zenith delay (wet + hydrostatic).
Computes total delay as sum of wet and hydrostatic components.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| time_idx | int | Time index to extract. Default is 0. | 0 |
| bounds | tuple[float, float, float, float] or None | Spatial bounds for subsetting. Default is None. | None |
| max_height | float or None | Maximum height for subsetting. Default is None. | None |
| bounds_crs | str | CRS of bounds. Default is "EPSG:4326". | 'EPSG:4326' |
| bounds_buffer | float | Buffer to add to bounds. Default is 0.0. | 0.0 |

Returns:

| Type | Description |
|---|---|
| DataArray | Total zenith delay with dimensions (height, latitude, longitude). |
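Examples:

A short sketch with spatial subsetting; the bounds are illustrative:

>>> tropo = TropoProduct.from_path("tropo.nc")
>>> total = tropo.get_total_delay(
...     time_idx=0,
...     bounds=(-96, 29, -94, 31),
...     max_height=11000.0,
...     bounds_buffer=0.2,
... )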
Source code in src/cal_disp/product/_tropo.py
matches_date ¶
matches_date(target_date: datetime, hours: float = 6.0) -> bool
Check if product date is within time window of target date.
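Examples:

A short sketch, assuming `tropo` is a TropoProduct; the target datetime is illustrative:

>>> from datetime import datetime
>>> is_close = tropo.matches_date(datetime(2022, 1, 11, 0, 26), hours=6.0)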
Source code in src/cal_disp/product/_tropo.py
open_dataset ¶
open_dataset(bounds: tuple[float, float, float, float] | None = None, max_height: float | None = None, bounds_crs: str = 'EPSG:4326', bounds_buffer: float = 0.0) -> xr.Dataset
Open tropospheric delay dataset with optional subsetting.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| bounds | tuple[float, float, float, float] or None | Spatial bounds as (west, south, east, north). Default is None. | None |
| max_height | float or None | Maximum height in meters. Default is None. | None |
| bounds_crs | str | CRS of bounds. Default is "EPSG:4326". | 'EPSG:4326' |
| bounds_buffer | float | Buffer to add to bounds in degrees (for lat/lon) or meters (for projected CRS). Default is 0.0. Useful value: 0.2 for lat/lon. | 0.0 |

Returns:

| Type | Description |
|---|---|
| Dataset | Tropospheric delay dataset. |
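Examples:

A short sketch, assuming `tropo` is a TropoProduct; the bounds and buffer values are illustrative:

>>> ds = tropo.open_dataset(
...     bounds=(-96, 29, -94, 31),
...     max_height=11000.0,
...     bounds_buffer=0.2,
... )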
Source code in src/cal_disp/product/_tropo.py
UnrGrid dataclass ¶
UNR GNSS grid data.
Represents gridded GNSS velocity data from University of Nevada Reno. Data is stored as parquet with point geometries and metadata.
Examples:
>>> # Load from path (frame_id parsed if in filename)
>>> grid = UnrGrid.from_path("unr_grid_frame8882.parquet")
>>> grid.frame_id
8882
>>> # Load GeoDataFrame
>>> gdf = grid.load()
>>> gdf.columns
['lon', 'lat', 'east', 'north', 'up', 'geometry', ...]
>>> # Get metadata
>>> meta = grid.get_metadata()
>>> meta['source']
'UNR'
Source code in src/cal_disp/product/_unr.py
from_path classmethod ¶
from_path(path: Path | str, frame_id: int | None = None) -> UnrGrid
Create UnrGrid from parquet file path.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| path | Path or str | Path to UNR parquet file. | required |
| frame_id | int or None | Frame ID. If None, attempts to parse from filename. Default is None. | None |

Returns:

| Type | Description |
|---|---|
| UnrGrid | Grid instance. |
Examples:
>>> # Frame ID from filename
>>> grid = UnrGrid.from_path("unr_grid_frame8882.parquet")
>>> grid.frame_id
8882
>>> # Explicit frame ID
>>> grid = UnrGrid.from_path("custom_unr_data.parquet", frame_id=8882)
>>> grid.frame_id
8882
>>> # No frame ID
>>> grid = UnrGrid.from_path("unr_data.parquet")
>>> grid.frame_id is None
True
Source code in src/cal_disp/product/_unr.py
get_bounds ¶
get_bounds() -> dict[str, float]
Get spatial bounds of grid.
Returns:
| Type | Description |
|---|---|
| dict[str, float] | Dictionary with keys: west, south, east, north. |
Source code in src/cal_disp/product/_unr.py
get_grid_count ¶
get_grid_count() -> int
Get number of GNSS points in grid.
Returns:
| Type | Description |
|---|---|
| int | Number of stations. |
Source code in src/cal_disp/product/_unr.py
get_metadata ¶
get_metadata() -> dict[str, str]
Extract metadata from parquet file.
Returns:
| Type | Description |
|---|---|
| dict[str, str] | Metadata dictionary. |
Examples:
>>> grid = UnrGrid.from_path("unr_grid_frame8882.parquet")
>>> meta = grid.get_metadata()
>>> meta.keys()
dict_keys(['source', 'date_created', 'frame_id', ...])
Source code in src/cal_disp/product/_unr.py
load ¶
load() -> gpd.GeoDataFrame
Load UNR grid as GeoDataFrame.
Returns:
| Type | Description |
|---|---|
| GeoDataFrame | GeoDataFrame with point geometries and velocity data. |

Raises:

| Type | Description |
|---|---|
| FileNotFoundError | If parquet file does not exist. |
Examples:
>>> grid = UnrGrid.from_path("unr_grid_frame8882.parquet")
>>> gdf = grid.load()
>>> gdf.crs
'EPSG:4326'
>>> gdf[['lon', 'lat', 'east', 'north', 'up']].head()
Source code in src/cal_disp/product/_unr.py
to_dataframe ¶
to_dataframe() -> pd.DataFrame
Load as regular DataFrame without geometry.
Returns:
| Type | Description |
|---|---|
| DataFrame | DataFrame with lon, lat, and velocity columns. |
Source code in src/cal_disp/product/_unr.py
bounds_contains ¶
bounds_contains(outer_bounds: tuple[float, float, float, float] | dict[str, float], inner_bounds: tuple[float, float, float, float] | dict[str, float], buffer: float = 0.0) -> bool
Check if outer bounds completely contain inner bounds.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| outer_bounds | tuple or dict | Outer bounds as (west, south, east, north) or dict with those keys. | required |
| inner_bounds | tuple or dict | Inner bounds as (west, south, east, north) or dict with those keys. | required |
| buffer | float | Buffer distance to require around inner bounds (in same units). Default is 0.0 (exact containment). | 0.0 |

Returns:

| Type | Description |
|---|---|
| bool | True if outer bounds completely contain inner bounds (with buffer). |
Examples:
>>> # Check if UNR grid covers DISP frame
>>> unr_bounds = (-97, 28, -93, 32)
>>> disp_bounds = (-96, 29, -94, 31)
>>> bounds_contains(unr_bounds, disp_bounds)
True
>>> # With buffer requirement
>>> bounds_contains(unr_bounds, disp_bounds, buffer=0.5)
True
>>> # Dict format
>>> unr_bounds = {"west": -97, "south": 28, "east": -93, "north": 32}
>>> disp_bounds = {"west": -96, "south": 29, "east": -94, "north": 31}
>>> bounds_contains(unr_bounds, disp_bounds)
True
>>> # Does not contain
>>> small_bounds = (-95, 29, -94, 30)
>>> large_bounds = (-96, 28, -93, 31)
>>> bounds_contains(small_bounds, large_bounds)
False
Source code in src/cal_disp/product/_utils.py
check_bounds_coverage ¶
check_bounds_coverage(outer_bounds: tuple[float, float, float, float] | dict[str, float], inner_bounds: tuple[float, float, float, float] | dict[str, float], buffer: float = 0.0) -> dict[str, bool | dict[str, float]]
Check bounds coverage with detailed gap information.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| outer_bounds | tuple or dict | Outer bounds as (west, south, east, north) or dict. | required |
| inner_bounds | tuple or dict | Inner bounds as (west, south, east, north) or dict. | required |
| buffer | float | Buffer distance required. Default is 0.0. | 0.0 |

Returns:

| Type | Description |
|---|---|
| dict | Dictionary with "contains" (bool): True if outer contains inner (with buffer), and "gaps" (dict): gap distances by direction (negative = covered). |
Examples:
>>> unr_bounds = (-97, 28, -93, 32)
>>> disp_bounds = (-96, 29, -94, 31)
>>> result = check_bounds_coverage(unr_bounds, disp_bounds, buffer=0.5)
>>> result["contains"]
True
>>> result["gaps"]
{'west': -0.5, 'south': -0.5, 'east': -0.5, 'north': -0.5}
>>> # Insufficient coverage
>>> small_bounds = (-95.5, 29, -94, 30)
>>> result = check_bounds_coverage(small_bounds, disp_bounds)
>>> result["contains"]
False
>>> result["gaps"]
{'west': 0.5, 'south': 0.0, 'east': 0.0, 'north': 1.0}
Source code in src/cal_disp/product/_utils.py
compute_los_correction ¶
compute_los_correction(zenith_delay_2d: DataArray, los_up: DataArray, reference_correction: DataArray | None = None, target_crs: str | None = None, output_path: Path | str | None = None, output_format: str = 'geotiff') -> xr.DataArray
Convert zenith delay to line-of-sight correction.
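Examples:

A short sketch, assuming `delay_2d` is a zenith delay already interpolated to the DEM surface and `los_up` is the LOS up component as an xarray DataArray on the same grid:

>>> from cal_disp.product._tropo import compute_los_correction
>>> los_corr = compute_los_correction(
...     zenith_delay_2d=delay_2d,
...     los_up=los_up,
...     output_path="tropo_los.tif",
...     output_format="geotiff",
... )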
Source code in src/cal_disp/product/_tropo.py
interpolate_in_time ¶
interpolate_in_time(tropo_early: TropoProduct, tropo_late: TropoProduct, target_datetime: datetime, bounds: tuple[float, float, float, float] | None = None, max_height: float = 11000.0, bounds_buffer: float = 0.2, output_path: Path | str | None = None) -> xr.DataArray
Interpolate tropospheric delay between two products in time.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| tropo_early | TropoProduct | Earlier tropospheric product. | required |
| tropo_late | TropoProduct | Later tropospheric product. | required |
| target_datetime | datetime | Target datetime for interpolation. | required |
| bounds | tuple[float, float, float, float] or None | Spatial bounds as (west, south, east, north). Default is None. | None |
| max_height | float | Maximum height in meters. Default is 11000 m. | 11000.0 |
| bounds_buffer | float | Buffer to add to bounds in degrees. Default is 0.2. | 0.2 |
| output_path | Path, str, or None | If provided, save result to NetCDF. Default is None. | None |

Returns:

| Type | Description |
|---|---|
| DataArray | Interpolated tropospheric delay at target datetime. |
Examples:
>>> from datetime import datetime
>>> from cal_disp.product import TropoProduct
>>> from cal_disp.product._tropo import interpolate_in_time
>>>
>>> early = TropoProduct.from_path("tropo_00Z.nc")
>>> late = TropoProduct.from_path("tropo_06Z.nc")
>>> target = datetime(2022, 1, 11, 3, 0)
>>>
>>> # Basic interpolation
>>> delay = interpolate_in_time(early, late, target)
>>>
>>> # With spatial subsetting
>>> bounds = (-96, 29, -94, 31)
>>> delay = interpolate_in_time(
... early, late, target,
... bounds=bounds,
... max_height=11000,
... bounds_buffer=0.2,
... )
Source code in src/cal_disp/product/_tropo.py
interpolate_to_dem_surface ¶
interpolate_to_dem_surface(da_tropo_cube: DataArray, dem: DataArray, method: str = 'linear', output_path: Path | str | None = None, output_format: str = 'netcdf') -> xr.DataArray
Interpolate 3D tropospheric delay to DEM surface heights.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| da_tropo_cube | DataArray | 3D tropospheric delay with dims (height, y, x). Assumed to be in EPSG:4326 (WGS84) if CRS not specified. | required |
| dem | DataArray | DEM with surface heights in meters. Must have CRS information. | required |
| method | str | Interpolation method ("linear" or "nearest"). Default is "linear". | 'linear' |
| output_path | Path, str, or None | If provided, save result. Default is None. | None |
| output_format | str | Output format ("netcdf" or "geotiff"). Default is "netcdf". | 'netcdf' |

Returns:

| Type | Description |
|---|---|
| DataArray | 2D tropospheric delay at DEM surface. |

Raises:

| Type | Description |
|---|---|
| ValueError | If DEM is missing CRS information. |
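Examples:

A short sketch, assuming `total_delay` is a 3D delay cube (for example from get_total_delay) and `dem` is a DataArray with CRS information:

>>> from cal_disp.product._tropo import interpolate_to_dem_surface
>>> delay_at_surface = interpolate_to_dem_surface(
...     da_tropo_cube=total_delay,
...     dem=dem,
...     method="linear",
... )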
Source code in src/cal_disp/product/_tropo.py
Config¶
cal_disp.config ¶
DynamicAncillaryFileGroup ¶
Bases: YamlModel
Dynamic ancillary files for the SAS.
Attributes:
| Name | Type | Description |
|---|---|---|
| algorithm_parameters_file | Path | Path to file containing SAS algorithm parameters. |
| los_file | Path | Path to the DISP static LOS layer file (line-of-sight unit vectors). Alias: static_los_file. |
| dem_file | Path | Path to the DISP static DEM layer file (digital elevation model). Alias: static_dem_file. |
| mask_file | Path or None, optional | Optional byte mask file to ignore low correlation/bad data (e.g., water mask). Convention: 0 = invalid/no data, 1 = good data. Dtype must be uint8. Default is None. |
| reference_tropo_files | list[Path] or None, optional | Paths to TROPO files for the reference (primary) date. If not provided, tropospheric correction for reference is skipped. Alias: ref_tropo_files. Default is None. |
| secondary_tropo_files | list[Path] or None, optional | Paths to TROPO files for the secondary date. If not provided, tropospheric correction for secondary is skipped. Alias: sec_tropo_files. Default is None. |
| iono_files | list[Path] or None, optional | Paths to ionospheric correction files. If not provided, ionospheric correction is skipped. Default is None. |
| tiles_files | list[Path] or None, optional | Paths to calibration tile bounds files (e.g., S1 burst bounds) covering the full frame. If not provided, per-tile calibration is skipped. Default is None. |
Source code in src/cal_disp/config/_base.py
44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 73 74 75 76 77 78 79 80 81 82 83 84 85 86 87 88 89 90 91 92 93 94 95 96 97 98 99 100 101 102 103 104 105 106 107 108 109 110 111 112 113 114 115 116 117 118 119 120 121 122 123 124 125 126 127 128 129 130 131 132 133 134 135 136 137 138 139 140 141 142 143 144 145 146 147 148 149 150 151 152 153 154 155 156 157 158 159 160 161 162 163 | |
InputFileGroup ¶
Bases: YamlModel
Input file group for the SAS.
Attributes:
| Name | Type | Description |
|---|---|---|
| `disp_file` | `Path` | Path to the DISP file. |
| `calibration_reference_grid` | `Path` | Path to the UNR calibration reference file (parquet format). |

Source code in src/cal_disp/config/_base.py (lines 12-41)
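A brief construction sketch; the file names are placeholders, and string values are assumed to be coerced to `Path` by pydantic.

>>> from cal_disp.config import InputFileGroup
>>> inputs = InputFileGroup(
...     disp_file="disp_product.nc",
...     calibration_reference_grid="unr_reference_grid.parquet",
... )
>>> inputs.disp_file.suffix
'.nc'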
StaticAncillaryFileGroup ¶
Bases: YamlModel
Static ancillary files for the SAS.
These files contain configuration and reference data that don't change between processing runs for a given frame.
Attributes:
| Name | Type | Description |
|---|---|---|
| `algorithm_parameters_overrides_json` | `Optional[Path]` | JSON file with frame-specific algorithm parameter overrides. |
| `deformation_area_database_json` | `Optional[Path]` | GeoJSON file with deforming areas to exclude from calibration. |
| `event_database_json` | `Optional[Path]` | GeoJSON file with earthquake/volcanic activity events for each frame. |

Source code in src/cal_disp/config/_base.py (lines 166-224)
has_algorithm_overrides ¶
has_algorithm_overrides() -> bool
Check if algorithm parameter overrides are provided.
Source code in src/cal_disp/config/_base.py (lines 209-211)
has_deformation_database ¶
has_deformation_database() -> bool
Check if deformation area database is provided.
Source code in src/cal_disp/config/_base.py (lines 213-215)
has_event_database ¶
has_event_database() -> bool
Check if event database is provided.
Source code in src/cal_disp/config/_base.py (lines 217-219)
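A short sketch of the convenience checks; every field is optional, so the group below is constructed with a single placeholder path.

>>> from cal_disp.config import StaticAncillaryFileGroup
>>> static = StaticAncillaryFileGroup(event_database_json="events.geojson")
>>> static.has_event_database()
True
>>> static.has_deformation_database()
False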
WorkerSettings ¶
Bases: YamlModel
Settings for controlling CPU/threading and parallelism.
This class configures Dask distributed computing settings including worker count, threads per worker, and data block sizes for chunked processing.
Attributes:
| Name | Type | Description |
|---|---|---|
| `n_workers` | `int` | Number of Dask workers to spawn. Default is 4. |
| `threads_per_worker` | `int` | Number of threads per worker. Default is 2. |
| `block_shape` | `Tuple[int, int]` | Block size (rows, columns) for chunked data loading. |
Examples:
>>> # Default settings
>>> settings = WorkerSettings()
>>>
>>> # Custom configuration
>>> settings = WorkerSettings(
... n_workers=8,
... threads_per_worker=4,
... block_shape=(256, 256)
... )
>>> print(f"Total threads: {settings.total_threads}")
Source code in src/cal_disp/config/_workers.py (lines 9-340)
block_size
property
¶
block_size: int
Total number of elements per block.
Returns:
| Type | Description |
|---|---|
| `int` | Product of the block_shape dimensions (rows * columns). |
total_threads
property
¶
total_threads: int
Total number of threads across all workers.
Returns:
| Type | Description |
|---|---|
| `int` | `n_workers * threads_per_worker` |
create_from_cpu_count
classmethod
¶
create_from_cpu_count(use_fraction: float = 0.75, threads_per_worker: int = 2, block_shape: Tuple[int, int] = (128, 128)) -> WorkerSettings
Create settings based on available CPU count.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| `use_fraction` | `float` | Fraction of available CPUs to use (0.0 to 1.0). | `0.75` |
| `threads_per_worker` | `int` | Threads per worker. | `2` |
| `block_shape` | `Tuple[int, int]` | Block shape for data loading. | `(128, 128)` |

Returns:

| Type | Description |
|---|---|
| `WorkerSettings` | Configuration tuned to the system CPU count. |
Examples:
>>> # Use 75% of available CPUs
>>> settings = WorkerSettings.create_from_cpu_count(use_fraction=0.75)
Source code in src/cal_disp/config/_workers.py (lines 244-280)
create_heavy
classmethod
¶
create_heavy() -> WorkerSettings
Create heavy-duty settings for large datasets.
Returns:
| Type | Description |
|---|---|
| `WorkerSettings` | Configuration with 8 workers, 4 threads each, large blocks. |

Source code in src/cal_disp/config/_workers.py (lines 232-242)
create_lightweight
classmethod
¶
create_lightweight() -> WorkerSettings
Create lightweight settings for small datasets or testing.
Returns:
| Type | Description |
|---|---|
| `WorkerSettings` | Configuration with 2 workers, 1 thread each, small blocks. |

Source code in src/cal_disp/config/_workers.py (lines 208-218)
create_standard
classmethod
¶
create_standard() -> WorkerSettings
Create standard settings for typical workloads.
Returns:
| Type | Description |
|---|---|
| `WorkerSettings` | Configuration with 4 workers, 2 threads each, medium blocks. |

Source code in src/cal_disp/config/_workers.py (lines 220-230)
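A quick comparison of the factory presets, sketched below; the thread counts follow from the worker/thread combinations described above.

>>> from cal_disp.config import WorkerSettings
>>> light = WorkerSettings.create_lightweight()  # 2 workers x 1 thread
>>> heavy = WorkerSettings.create_heavy()        # 8 workers x 4 threads
>>> light.total_threads, heavy.total_threads
(2, 32)
>>> print(WorkerSettings.create_standard().summary())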
estimate_memory_per_block ¶
estimate_memory_per_block(dtype_size: int = 8, n_bands: int = 1) -> float
Estimate memory usage per block in MB.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| `dtype_size` | `int` | Size of the data type in bytes (e.g., 8 for float64, 4 for float32). | `8` |
| `n_bands` | `int` | Number of bands/layers in the data. | `1` |

Returns:

| Type | Description |
|---|---|
| `float` | Estimated memory in megabytes. |
Examples:
>>> settings = WorkerSettings(block_shape=(256, 256))
>>> mem_mb = settings.estimate_memory_per_block(dtype_size=8, n_bands=2)
>>> print(f"Estimated memory: {mem_mb:.2f} MB")
Source code in src/cal_disp/config/_workers.py (lines 135-158)
estimate_total_memory ¶
estimate_total_memory(dtype_size: int = 8, n_bands: int = 1, overhead_factor: float = 1.5) -> float
Estimate total memory usage across all workers in GB.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| `dtype_size` | `int` | Size of the data type in bytes. | `8` |
| `n_bands` | `int` | Number of bands/layers in the data. | `1` |
| `overhead_factor` | `float` | Multiplier for overhead (copies, intermediate results). | `1.5` |

Returns:

| Type | Description |
|---|---|
| `float` | Estimated total memory in gigabytes. |

Source code in src/cal_disp/config/_workers.py (lines 160-183)
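A sketch of sizing a configuration before launching workers; the block shape and dtype below are arbitrary, and the printed numbers depend on the class's internal formula.

>>> from cal_disp.config import WorkerSettings
>>> settings = WorkerSettings(n_workers=4, threads_per_worker=2, block_shape=(512, 512))
>>> per_block_mb = settings.estimate_memory_per_block(dtype_size=4, n_bands=3)
>>> total_gb = settings.estimate_total_memory(dtype_size=4, n_bands=3, overhead_factor=1.5)
>>> print(f"{per_block_mb:.1f} MB per block, ~{total_gb:.2f} GB total")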
summary ¶
summary() -> str
Generate a human-readable summary of settings.
Returns:
| Type | Description |
|---|---|
| `str` | Multi-line summary string. |

Source code in src/cal_disp/config/_workers.py (lines 185-206)
validate_against_system ¶
validate_against_system() -> Dict[str, Any]
Validate settings against system resources.
Returns:
| Type | Description |
|---|---|
| `dict` | Dictionary with validation results and warnings. |
Examples:
>>> settings = WorkerSettings(n_workers=16, threads_per_worker=8)
>>> validation = settings.validate_against_system()
>>> if validation['warnings']:
... for warning in validation['warnings']:
... print(f"Warning: {warning}")
Source code in src/cal_disp/config/_workers.py (lines 282-329)
YamlModel ¶
Bases: BaseModel
Pydantic model that can be exported to yaml.
Source code in src/cal_disp/config/_yaml.py (lines 39-311)
all_files_exist ¶
all_files_exist() -> bool
Check if all files exist.
Source code in src/cal_disp/config/_yaml.py (lines 309-311)
from_yaml
classmethod
¶
from_yaml(yaml_path: Filename)
Load a configuration from a yaml file.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| `yaml_path` | `Pathlike` | Path to the yaml file to load. | *required* |

Returns:

| Type | Description |
|---|---|
| `Config` | Workflow configuration. |

Source code in src/cal_disp/config/_yaml.py (lines 93-112)
get_all_file_paths ¶
get_all_file_paths(include_none: bool = False, flatten_lists: bool = True) -> Dict[str, Union[Path, List[Path]]]
Get all Path fields from the model.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| `include_none` | `bool` | Include fields with `None` values. | `False` |
| `flatten_lists` | `bool` | Flatten list fields to individual entries with indices. | `True` |

Returns:

| Type | Description |
|---|---|
| `dict` | Mapping of field names to Path objects. |

Source code in src/cal_disp/config/_yaml.py (lines 146-187)
get_missing_files ¶
get_missing_files() -> List[str]
Get list of missing file paths.
Source code in src/cal_disp/config/_yaml.py (lines 301-307)
print_yaml_schema
classmethod
¶
print_yaml_schema(output_path: Union[Filename, TextIO] = sys.stdout, indent_per_level: int = 2)
Print/save an empty configuration with defaults filled in.
Ignores the required input_file_list input, so a user can inspect all fields.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| `output_path` | `Pathlike` or `TextIO` | Path or stream to save the yaml file to. By default, prints to stdout. | `sys.stdout` |
| `indent_per_level` | `int` | Number of spaces to indent per level. | `2` |

Source code in src/cal_disp/config/_yaml.py (lines 114-136)
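A small sketch of dumping the annotated default schema; `CalibrationWorkflow` (from the workflow submodule documented below) is used here only as an example subclass, and the output file name is a placeholder.

>>> from cal_disp.config.workflow import CalibrationWorkflow
>>> CalibrationWorkflow.print_yaml_schema()                           # print to stdout
>>> CalibrationWorkflow.print_yaml_schema(output_path="schema.yaml")  # or save for editing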
to_yaml ¶
to_yaml(output_path: Union[Filename, TextIO], with_comments: bool = True, by_alias: bool = True, indent_per_level: int = 2)
Save configuration as a yaml file.
Used to record the default-filled version of a supplied yaml.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| `output_path` | `Pathlike` or `TextIO` | Path or stream to save the yaml file to. | *required* |
| `with_comments` | `bool` | Whether to add comments containing the type/descriptions to all fields. | `True` |
| `by_alias` | `bool` | Whether to use the alias names for the fields (passed through to pydantic's serialization). | `True` |
| `indent_per_level` | `int` | Number of spaces to indent per level. | `2` |

Source code in src/cal_disp/config/_yaml.py (lines 44-91)
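A round-trip sketch using `to_yaml` and `from_yaml`; `WorkerSettings` stands in for any YamlModel subclass, and the file name is a placeholder.

>>> from cal_disp.config import WorkerSettings
>>> cfg = WorkerSettings(n_workers=2, threads_per_worker=1)
>>> cfg.to_yaml("workers.yaml", with_comments=True)
>>> WorkerSettings.from_yaml("workers.yaml").n_workers
2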
validate_files_exist ¶
validate_files_exist(raise_on_missing: bool = False) -> Dict[str, Dict[str, Any]]
Validate all file paths exist on disk.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| `raise_on_missing` | `bool` | If True, raise FileNotFoundError on the first missing file. | `False` |

Returns:

| Type | Description |
|---|---|
| `dict` | Detailed status for each file, including existence, size, etc. |

Raises:

| Type | Description |
|---|---|
| `FileNotFoundError` | If `raise_on_missing=True` and any file is missing. |

Source code in src/cal_disp/config/_yaml.py (lines 251-295)
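A sketch of checking inputs before a run; `CalibrationWorkflow.create_example()` is used as a stand-in configuration, and its placeholder paths will typically be reported as missing.

>>> from cal_disp.config.workflow import CalibrationWorkflow
>>> cfg = CalibrationWorkflow.create_example()
>>> status = cfg.validate_files_exist(raise_on_missing=False)
>>> missing = cfg.get_missing_files()
>>> print(f"{len(missing)} missing file(s)")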
validate_ready_to_run ¶
validate_ready_to_run() -> ValidationResult
Check if configuration is ready to run.
Source code in src/cal_disp/config/_yaml.py (lines 297-299)
pge_runconfig ¶
OutputOptions ¶
Bases: YamlModel
Output configuration options.
Attributes:
| Name | Type | Description |
|---|---|---|
| `product_version` | `str` | Version of the product. |
| `output_format` | `str` | Format for output files (e.g., 'netcdf', 'hdf5'). |
| `compression` | `bool` | Whether to compress output files. |

Source code in src/cal_disp/config/pge_runconfig.py (lines 38-67)
PrimaryExecutable ¶
Bases: YamlModel
Group describing the primary executable.
Attributes:
| Name | Type | Description |
|---|---|---|
| `product_type` | `str` | Product type identifier for the PGE. |

Source code in src/cal_disp/config/pge_runconfig.py (lines 20-35)
ProductPathGroup ¶
Bases: YamlModel
Group describing the product paths.
Attributes:
| Name | Type | Description |
|---|---|---|
| `product_path` | `Path` | Directory where the PGE will place results. |
| `scratch_path` | `Path` | Path to the scratch directory for intermediate files. |
| `output_path` | `Path` | Path to the SAS output directory. |

Source code in src/cal_disp/config/pge_runconfig.py (lines 70-100)
RunConfig ¶
Bases: YamlModel
A PGE (Product Generation Executive) run configuration.
This class represents the top-level configuration for running the calibration workflow as a PGE. It includes input files, output options, paths, and worker settings.
Attributes:
| Name | Type | Description |
|---|---|---|
| `input_file_group` | `InputFileGroup` | Configuration for input files. |
| `dynamic_ancillary_file_group` | `Optional[DynamicAncillaryFileGroup]` | Dynamic ancillary files configuration. |
| `static_ancillary_file_group` | `Optional[StaticAncillaryFileGroup]` | Static ancillary files configuration. |
| `output_options` | `OutputOptions` | Output configuration options. |
| `primary_executable` | `PrimaryExecutable` | Primary executable configuration. |
| `product_path_group` | `ProductPathGroup` | Product path configuration. |
| `worker_settings` | `WorkerSettings` | Dask worker and parallelism configuration. |
| `log_file` | `Optional[Path]` | Path to the output log file. |

Source code in src/cal_disp/config/pge_runconfig.py (lines 103-479)
create_directories ¶
create_directories(exist_ok: bool = True) -> None
Create all necessary directories for the PGE run.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| `exist_ok` | `bool` | If True, don't raise an error if directories already exist. | `True` |

Source code in src/cal_disp/config/pge_runconfig.py (lines 228-243)
create_example
classmethod
¶
create_example() -> 'RunConfig'
Create an example PGE run configuration.
Returns:
| Type | Description |
|---|---|
| `RunConfig` | Example configuration with placeholder values. |

Source code in src/cal_disp/config/pge_runconfig.py (lines 352-376)
from_yaml_file
classmethod
¶
from_yaml_file(yaml_path: Path) -> 'RunConfig'
Load run configuration from YAML file.
Handles the optional top-level name wrapper key (cal_disp_workflow).
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| `yaml_path` | `Path` | Path to the YAML configuration file. | *required* |

Returns:

| Type | Description |
|---|---|
| `RunConfig` | Loaded and validated run configuration. |

Source code in src/cal_disp/config/pge_runconfig.py (lines 378-401)
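A sketch tying the classmethods together; the YAML file name is a placeholder and the example configuration contains placeholder values.

>>> from cal_disp.config.pge_runconfig import RunConfig
>>> rc = RunConfig.create_example()
>>> rc.to_yaml("runconfig.yaml")        # output is wrapped in the cal_disp_workflow key
>>> reloaded = RunConfig.from_yaml_file("runconfig.yaml")
>>> isinstance(reloaded, RunConfig)
True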
summary ¶
summary() -> str
Generate a human-readable summary of the run configuration.
Returns:
| Type | Description |
|---|---|
| `str` | Multi-line summary string. |

Source code in src/cal_disp/config/pge_runconfig.py (lines 270-350)
to_workflow ¶
to_workflow() -> CalibrationWorkflow
Convert PGE RunConfig to a CalibrationWorkflow object.
This method translates the PGE-style configuration into the format expected by CalibrationWorkflow.
Returns:
| Type | Description |
|---|---|
| `CalibrationWorkflow` | Converted workflow configuration. |
Examples:
>>> run_config = RunConfig.from_yaml_file("pge_config.yaml")
>>> workflow = run_config.to_workflow()
>>> workflow.create_directories()
>>> workflow.run()
Source code in src/cal_disp/config/pge_runconfig.py (lines 181-226)
to_yaml ¶
to_yaml(output_path: Union[str, PathLike, TextIO], with_comments: bool = True, by_alias: bool = True, indent_per_level: int = 2) -> None
Save configuration to YAML file with name wrapper.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| `output_path` | `str`, `PathLike`, or `TextIO` | Path where the YAML should be saved. | *required* |
| `with_comments` | `bool` | Whether to include field descriptions as comments. | `True` |
| `by_alias` | `bool` | Whether to use field aliases in the output. | `True` |
| `indent_per_level` | `int` | Indentation spacing. | `2` |
Notes
This method always wraps output in {cal_disp_workflow: ...} structure. The with_comments parameter is accepted for signature compatibility but not currently used in the wrapper output.
Source code in src/cal_disp/config/pge_runconfig.py (lines 424-477)
validate_ready_to_run ¶
validate_ready_to_run() -> ValidationResult
Check if run configuration is ready for execution.
Source code in src/cal_disp/config/pge_runconfig.py (lines 245-268)
workflow ¶
CalibrationWorkflow ¶
Bases: YamlModel
Calibration workflow configuration.
This class manages the complete configuration for the displacement calibration workflow, including input files, output directories, worker settings, and logging configuration.
Attributes:
| Name | Type | Description |
|---|---|---|
| `input_options` | `Optional[InputFileGroup]` | Configuration for required input files. |
| `work_directory` | `Path` | Directory for intermediate processing files. |
| `output_directory` | `Path` | Directory for final output files. |
| `keep_paths_relative` | `bool` | If False, resolve all paths to absolute paths. |
| `dynamic_ancillary_file_options` | `Optional[DynamicAncillaryFileGroup]` | Optional dynamic ancillary files for processing. |
| `static_ancillary_file_options` | `Optional[StaticAncillaryFileGroup]` | Optional static ancillary files for processing. |
| `worker_settings` | `WorkerSettings` | Dask worker and threading configuration. |
| `log_file` | `Optional[Path]` | Custom log file path. |

Source code in src/cal_disp/config/workflow.py (lines 14-383)
create_directories ¶
create_directories(exist_ok: bool = True) -> None
Create work and output directories if they don't exist.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| `exist_ok` | `bool` | If True, don't raise an error if directories already exist. | `True` |

Source code in src/cal_disp/config/workflow.py (lines 175-189)
create_example
classmethod
¶
create_example() -> 'CalibrationWorkflow'
Create an example workflow configuration.
Returns:
| Type | Description |
|---|---|
| `CalibrationWorkflow` | Example configuration with placeholder values. |

Source code in src/cal_disp/config/workflow.py (lines 341-366)
create_minimal
classmethod
¶
create_minimal() -> 'CalibrationWorkflow'
Create a minimal workflow configuration without input files.
Returns:
| Type | Description |
|---|---|
| `CalibrationWorkflow` | Minimal configuration. Input files must be added before running. |

Source code in src/cal_disp/config/workflow.py (lines 368-381)
get_missing_files ¶
get_missing_files() -> List[str]
Get list of missing required input files.
Source code in src/cal_disp/config/workflow.py (lines 170-173)
setup_logging ¶
setup_logging(level: int = 20, format_string: Optional[str] = None)
Set up logging configuration for the workflow.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| `level` | `int` | Logging level (DEBUG=10, INFO=20, WARNING=30, ERROR=40, CRITICAL=50). | `20` (INFO) |
| `format_string` | `str` | Custom format string for log messages. | `None` |

Returns:

| Type | Description |
|---|---|
| `Logger` | Configured logger instance. |

Source code in src/cal_disp/config/workflow.py (lines 191-232)
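A sketch of preparing and inspecting a workflow; it uses the placeholder-valued example configuration and imports from the workflow submodule shown above.

>>> import logging
>>> from cal_disp.config.workflow import CalibrationWorkflow
>>> wf = CalibrationWorkflow.create_example()
>>> wf.create_directories()                    # create work and output directories
>>> logger = wf.setup_logging(level=logging.DEBUG)
>>> result = wf.validate_ready_to_run()
>>> print(wf.summary())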
summary ¶
summary() -> str
Generate a human-readable summary of the workflow configuration.
Returns:
| Type | Description |
|---|---|
| `str` | Multi-line summary string. |

Source code in src/cal_disp/config/workflow.py (lines 234-339)
validate_input_files_exist ¶
validate_input_files_exist() -> Dict[str, Dict[str, Any]]
Check if all input files exist.
Source code in src/cal_disp/config/workflow.py (lines 155-168)
validate_ready_to_run ¶
validate_ready_to_run() -> ValidationResult
Check if workflow is ready to run.
Source code in src/cal_disp/config/workflow.py (lines 124-153)
Browse Image¶
cal_disp.browse_image ¶
Module for creating browse images for the output product.
make_browse_image_from_arr ¶
make_browse_image_from_arr(output_filename: PathOrStr, arr: ArrayLike, mask: ArrayLike, max_dim_allowed: int = 2048, cmap: str = DEFAULT_CMAP, vmin: float = -0.1, vmax: float = 0.1) -> None
Create a PNG browse image for the output product from given array.
Source code in src/cal_disp/browse_image.py (lines 35-47)
make_browse_image_from_nc ¶
make_browse_image_from_nc(output_filename: PathOrStr, input_filename: PathOrStr, max_dim_allowed: int = 2048, cmap: str = DEFAULT_CMAP, vmin: float = -0.1, vmax: float = 0.1) -> None
Create a PNG browse image for the output product from product in NetCDF file.
Source code in src/cal_disp/browse_image.py (lines 50-64)
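A minimal sketch of generating a browse PNG from a finished product; both file names are placeholders.

>>> from cal_disp.browse_image import make_browse_image_from_nc
>>> make_browse_image_from_nc(
...     output_filename="calibrated_disp_browse.png",
...     input_filename="calibrated_disp.nc",
...     max_dim_allowed=2048,
...     vmin=-0.1,
...     vmax=0.1,
... )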