licenses
sequencelengths
1
3
version
stringclasses
677 values
tree_hash
stringlengths
40
40
path
stringclasses
1 value
type
stringclasses
2 values
size
stringlengths
2
8
text
stringlengths
25
67.1M
package_name
stringlengths
2
41
repo
stringlengths
33
86
[ "MPL-2.0" ]
0.12.1
c68e031e469f1898bb779cbc3c401e1640ecc0c4
docs
4020
# JSOSolvers.jl [![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.3991143.svg)](https://doi.org/10.5281/zenodo.3991143) [![GitHub release](https://img.shields.io/github/release/JuliaSmoothOptimizers/JSOSolvers.jl.svg)](https://github.com/JuliaSmoothOptimizers/JSOSolvers.jl/releases/latest) [![](https://img.shields.io/badge/docs-stable-3f51b5.svg)](https://jso.dev/JSOSolvers.jl/stable) [![](https://img.shields.io/badge/docs-latest-3f51b5.svg)](https://jso.dev/JSOSolvers.jl/latest) [![codecov](https://codecov.io/gh/JuliaSmoothOptimizers/JSOSolvers.jl/branch/main/graph/badge.svg?token=eyiGsilbZx)](https://codecov.io/gh/JuliaSmoothOptimizers/JSOSolvers.jl) ![CI](https://github.com/JuliaSmoothOptimizers/JSOSolvers.jl/workflows/CI/badge.svg?branch=main) [![Cirrus CI - Base Branch Build Status](https://img.shields.io/cirrus/github/JuliaSmoothOptimizers/JSOSolvers.jl?logo=Cirrus%20CI)](https://cirrus-ci.com/github/JuliaSmoothOptimizers/JSOSolvers.jl) This package provides optimization solvers curated by the JuliaSmoothOptimizers organization for unconstrained optimization min f(x) and bound-constrained optimization min f(x) s.t. ℓ ≤ x ≤ u This package provides an implementation of four classic algorithms for unconstrained/bound-constrained nonlinear optimization: - `lbfgs`: an implementation of a limited-memory BFGS line-search method for unconstrained minimization; > D. C. Liu, J. Nocedal. (1989). On the limited memory BFGS method for > large scale optimization. *Mathematical Programming*, 45(1), 503-528. > DOI: [10.1007/BF01589116](https://doi.org/10.1007/BF01589116) - `R2`: a first-order quadratic regularization method for unconstrained optimization; > E. G. Birgin, J. L. Gardenghi, J. M. Martínez, S. A. Santos, Ph. L. Toint. (2017). > Worst-case evaluation complexity for unconstrained nonlinear optimization using > high-order regularized models. *Mathematical Programming*, 163(1), 359-368. > DOI: [10.1007/s10107-016-1065-8](https://doi.org/10.1007/s10107-016-1065-8) - `fomo`: a first-order method with momentum for unconstrained optimization; - `tron`: a pure Julia implementation of TRON, a trust-region solver for bound-constrained optimization described in > Chih-Jen Lin and Jorge J. Moré, *Newton's Method for Large Bound-Constrained > Optimization Problems*, SIAM J. Optim., 9(4), 1100–1127, 1999. > DOI: [10.1137/S1052623498345075](https://www.doi.org/10.1137/S1052623498345075) as well as a variant for nonlinear least-squares; - `trunk`: a trust-region solver for unconstrained optimization using exact second derivatives. Our implementation follows the description given in > A. R. Conn, N. I. M. Gould, and Ph. L. Toint, > Trust-Region Methods, volume 1 of MPS/SIAM Series on Optimization. > SIAM, Philadelphia, USA, 2000. > DOI: [10.1137/1.9780898719857](https://www.doi.org/10.1137/1.9780898719857) The package also contains a variant for nonlinear least-squares. ## Installation `pkg> add JSOSolvers` ## Example ```julia using JSOSolvers, ADNLPModels # Rosenbrock nlp = ADNLPModel(x -> 100 * (x[2] - x[1]^2)^2 + (x[1] - 1)^2, [-1.2; 1.0]) stats = lbfgs(nlp) # or trunk, tron, R2 ``` ## How to cite If you use JSOSolvers.jl in your work, please cite using the format given in [CITATION.cff](CITATION.cff). # Bug reports and discussions If you think you found a bug, feel free to open an [issue](https://github.com/JuliaSmoothOptimizers/JSOSolvers.jl/issues). Focused suggestions and requests can also be opened as issues. Before opening a pull request, start an issue or a discussion on the topic, please. If you want to ask a question not suited for a bug report, feel free to start a discussion [here](https://github.com/JuliaSmoothOptimizers/Organization/discussions). This forum is for general discussion about this repository and the [JuliaSmoothOptimizers](https://github.com/JuliaSmoothOptimizers), so questions about any of our packages are welcome.
JSOSolvers
https://github.com/JuliaSmoothOptimizers/JSOSolvers.jl.git
[ "MPL-2.0" ]
0.12.1
c68e031e469f1898bb779cbc3c401e1640ecc0c4
docs
1133
# [JSOSolvers.jl documentation](@id Home) This package provides a few optimization solvers curated by the [JuliaSmoothOptimizers](https://jso.dev) organization. ## Basic usage All solvers here are _JSO-Compliant_, in the sense that they accept NLPModels and return GenericExecutionStats. This allows [benchmark them easily](https://jso.dev/tutorials/introduction-to-solverbenchmark/). All solvers can be called like the following: ```julia stats = solver_name(nlp; kwargs...) ``` where `nlp` is an AbstractNLPModel or some specialization, such as an `AbstractNLSModel`, and the following keyword arguments are supported: - `x` is the starting default (default: `nlp.meta.x0`); - `atol` is the absolute stopping tolerance (default: `atol = √ϵ`); - `rtol` is the relative stopping tolerance (default: `rtol = √ϵ`); - `max_eval` is the maximum number of objective and constraints function evaluations (default: `-1`, which means no limit); - `max_time` is the maximum allowed elapsed time (default: `30.0`); - `stats` is a `SolverTools.GenericExecutionStats` with the output of the solver. See the full list of [Solvers](@ref).
JSOSolvers
https://github.com/JuliaSmoothOptimizers/JSOSolvers.jl.git
[ "MPL-2.0" ]
0.12.1
c68e031e469f1898bb779cbc3c401e1640ecc0c4
docs
211
# Internal functions ```@docs JSOSolvers.projected_newton! JSOSolvers.projected_line_search! JSOSolvers.cauchy! JSOSolvers.projected_gauss_newton! JSOSolvers.projected_line_search_ls! JSOSolvers.cauchy_ls! ```
JSOSolvers
https://github.com/JuliaSmoothOptimizers/JSOSolvers.jl.git
[ "MPL-2.0" ]
0.12.1
c68e031e469f1898bb779cbc3c401e1640ecc0c4
docs
174
# Reference ​ ## Contents ​ ```@contents Pages = ["reference.md"] ``` ​ ## Index ​ ```@index Pages = ["reference.md"] ``` ​ ```@autodocs Modules = [JSOSolvers] ```
JSOSolvers
https://github.com/JuliaSmoothOptimizers/JSOSolvers.jl.git
[ "MPL-2.0" ]
0.12.1
c68e031e469f1898bb779cbc3c401e1640ecc0c4
docs
495
# Solvers **Solver list** - [`lbfgs`](@ref) - [`tron`](@ref) - [`trunk`](@ref) - [`R2`](@ref) - [`fomo`](@ref) | Problem type | Solvers | | --------------------- | -------- | | Unconstrained NLP | [`lbfgs`](@ref), [`tron`](@ref), [`trunk`](@ref), [`R2`](@ref), [`fomo`](@ref)| | Unconstrained NLS | [`trunk`](@ref), [`tron`](@ref) | | Bound-constrained NLP | [`tron`](@ref) | | Bound-constrained NLS | [`tron`](@ref) | ## Solver list ```@docs lbfgs tron trunk R2 fomo ```
JSOSolvers
https://github.com/JuliaSmoothOptimizers/JSOSolvers.jl.git
[ "MIT" ]
0.6.0
31aa360731ce7e75f3a6fc9159d8ab4948a92493
code
408
module AbstractCosmologicalEmulators using Base: @kwdef using Adapt using ChainRulesCore using Lux using SimpleChains export AbstractTrainedEmulators, LuxEmulator, SimpleChainsEmulator export maximin, inv_maximin, run_emulator, get_emulator_description, init_emulator include("core.jl") include("initialization.jl") include("utils.jl") include("chainrules.jl") end # module AbstractCosmologicalEmulators
AbstractCosmologicalEmulators
https://github.com/CosmologicalEmulators/AbstractCosmologicalEmulators.jl.git
[ "MIT" ]
0.6.0
31aa360731ce7e75f3a6fc9159d8ab4948a92493
code
591
function ChainRulesCore.rrule(::typeof(maximin), input, minmax) Y = maximin(input, minmax) function maximin_pullback(Ȳ) ∂input = @thunk(@views @. Ȳ / (minmax[:,2] - minmax[:,1])) return NoTangent(), ∂input, NoTangent() end return Y, maximin_pullback end function ChainRulesCore.rrule(::typeof(inv_maximin), input, minmax) Y = inv_maximin(input, minmax) function inv_maximin_pullback(Ȳ) ∂input = @thunk(@views @. Ȳ * (minmax[:,2] - minmax[:,1])) return NoTangent(), ∂input, NoTangent() end return Y, inv_maximin_pullback end
AbstractCosmologicalEmulators
https://github.com/CosmologicalEmulators/AbstractCosmologicalEmulators.jl.git
[ "MIT" ]
0.6.0
31aa360731ce7e75f3a6fc9159d8ab4948a92493
code
646
abstract type AbstractTrainedEmulators end @kwdef mutable struct SimpleChainsEmulator <: AbstractTrainedEmulators Architecture Weights Description::Dict = Dict() end function run_emulator(input, emulator::SimpleChainsEmulator) return emulator.Architecture(input, emulator.Weights) end @kwdef mutable struct LuxEmulator <: AbstractTrainedEmulators Model Parameters States Description::Dict = Dict() end Adapt.@adapt_structure LuxEmulator function run_emulator(input, emulator::LuxEmulator) return (Lux.apply(emulator.Model, (input), emulator.Parameters, emulator.States)[1]) end
AbstractCosmologicalEmulators
https://github.com/CosmologicalEmulators/AbstractCosmologicalEmulators.jl.git
[ "MIT" ]
0.6.0
31aa360731ce7e75f3a6fc9159d8ab4948a92493
code
4836
function _get_layer_simplechains(input_dict::Dict) if input_dict["activation_function"] == "tanh" act_func = SimpleChains.tanh elseif input_dict["activation_function"] == "relu" act_func = SimpleChains.relu else error("Error in the Activation Function. You choose "* string(input_dict["activation_function"])*" which we do not support.") end return TurboDense(act_func, Int(input_dict["n_neurons"])) end function _get_hidden_layers_simplechains(input_dict::Dict) n_hidden_layers = input_dict["n_hidden_layers"] return (_get_layer_simplechains(input_dict["layers"]["layer_"*string(i)]) for i in 1:n_hidden_layers) end function _get_layer_lux(activation_function, n_in::Int, n_out::Int) if activation_function == "tanh" act_func = Lux.tanh elseif activation_function == "relu" act_func = Lux.relu else error("Error in the Activation Function. You choose "* string(activation_function)*" which we do not support.") end return Dense(n_in => n_out, act_func) end function _get_layers_lux(input_dict::Dict) n_hidden_layers = input_dict["n_hidden_layers"] in_array, out_array = _get_in_out_arrays(input_dict) intermediate = (_get_layer_lux( input_dict["layers"]["layer_"*string(j)]["activation_function"], in_array[j], out_array[j]) for j in 1:n_hidden_layers) return (intermediate..., Dense(in_array[end],out_array[end])) end function _get_nn_simplechains(input_dict::Dict) hidden_layer_tuple = _get_hidden_layers_simplechains(input_dict) return SimpleChain(static(input_dict["n_input_features"]), hidden_layer_tuple..., TurboDense(identity, input_dict["n_output_features"])) end function _get_nn_lux(input_dict::Dict) hidden_layer_tuple = _get_layers_lux(input_dict) return Chain(hidden_layer_tuple...) end function _get_weight_bias(i::Int, n_in::Int, n_out::Int, weight_bias, NN_dict::Dict) weight = reshape(weight_bias[i:i+n_out*n_in-1], n_out, n_in) bias = weight_bias[i+n_out*n_in:i+n_out*n_in+n_out-1] i += n_out*n_in+n_out-1+1 return (weight = weight, bias = bias) end function _get_in_out_arrays(NN_dict::Dict) n = NN_dict["n_hidden_layers"] in_array = zeros(Int64, n+1) out_array = zeros(Int64, n+1) in_array[1] = NN_dict["n_input_features"] out_array[end] = NN_dict["n_output_features"] for i in 1:n in_array[i+1] = NN_dict["layers"]["layer_"*string(i)]["n_neurons"] out_array[i] = NN_dict["layers"]["layer_"*string(i)]["n_neurons"] end return in_array, out_array end function _get_i_array(in_array::Vector, out_array::Vector) i_array = similar(in_array) i_array[1] = 1 for i in 1:length(i_array)-1 i_array[i+1] = i_array[i]+in_array[i]*out_array[i]+out_array[i] end return i_array end function _get_lux_params(NN_dict::Dict, weights) in_array, out_array = _get_in_out_arrays(NN_dict) i_array = _get_i_array(in_array, out_array) params = [_get_weight_bias(i_array[j], in_array[j], out_array[j], weights, NN_dict) for j in 1:NN_dict["n_hidden_layers"]+1] layer = [Symbol("layer_"*string(j)) for j in 1:NN_dict["n_hidden_layers"]+1] return (; zip(layer, params)...) end function _get_lux_states(NN_dict::Dict) params = [NamedTuple() for j in 1:NN_dict["n_hidden_layers"]+1] layer = [Symbol("layer_"*string(j)) for j in 1:NN_dict["n_hidden_layers"]+1] return (; zip(layer, params)...) end function _get_lux_params_states(NN_dict::Dict, weights) return _get_lux_params(NN_dict, weights), _get_lux_states(NN_dict) end function _get_emulator_description_dict(input_dict::Dict) if haskey(input_dict, "emulator_description") nn_descript = input_dict["emulator_description"] else nn_descript = Dict() @warn "No emulator description found!" end return nn_descript end function _init_luxemulator(NN_dict::Dict, weight) params, states = _get_lux_params_states(NN_dict, weight) model = _get_nn_lux(NN_dict) nn_descript = Dict("emulator_description"=>_get_emulator_description_dict(NN_dict)) return LuxEmulator(Model = model, Parameters = params, States = states, Description= nn_descript) end function init_emulator(NN_dict::Dict, weight, ::Type{LuxEmulator}) return _init_luxemulator(NN_dict, weight) end function _init_simplechainsemulator(NN_dict::Dict, weight) architecture = _get_nn_simplechains(NN_dict) nn_descript = Dict("emulator_description"=>_get_emulator_description_dict(NN_dict)) return SimpleChainsEmulator(Architecture = architecture, Weights = weight, Description= nn_descript) end function init_emulator(NN_dict::Dict, weight, ::Type{SimpleChainsEmulator}) return _init_simplechainsemulator(NN_dict, weight) end
AbstractCosmologicalEmulators
https://github.com/CosmologicalEmulators/AbstractCosmologicalEmulators.jl.git
[ "MIT" ]
0.6.0
31aa360731ce7e75f3a6fc9159d8ab4948a92493
code
1217
function maximin(input, minmax) result = @views @. (input - minmax[:,1]) ./ (minmax[:,2] - minmax[:,1]) return result end function inv_maximin(input, minmax) result = @views @. input * (minmax[:,2] - minmax[:,1]) + minmax[:,1] return result end function get_emulator_description(input_dict::Dict) if haskey(input_dict, "parameters") println("The parameters the model has been trained are, in the following order: "*input_dict["parameters"]*".") else @warn "We do not know which parameters were included in the emulators training space. Use this trained emulator with caution!" end if haskey(input_dict, "author") println("The emulator has been trained by "*input_dict["author"]*".") end if haskey(input_dict, "author_email") println(input_dict["author"]*" email is "*input_dict["author_email"]*".") end if haskey(input_dict, "miscellanea") println(input_dict["miscellanea"]) end return nothing end function get_emulator_description(emu::AbstractTrainedEmulators) try get_emulator_description(emu.Description["emulator_description"]) catch @warn "No emulator description present!" end end
AbstractCosmologicalEmulators
https://github.com/CosmologicalEmulators/AbstractCosmologicalEmulators.jl.git
[ "MIT" ]
0.6.0
31aa360731ce7e75f3a6fc9159d8ab4948a92493
code
2883
using AbstractCosmologicalEmulators using JSON using SimpleChains using Test using ForwardDiff using Zygote m = 100 n = 300 InMinMax = hcat(zeros(m), ones(m)) mlpd = SimpleChain( static(6), TurboDense(tanh, 64), TurboDense(tanh, 64), TurboDense(relu, 64), TurboDense(tanh, 64), TurboDense(tanh, 64), TurboDense(identity, 40) ) NN_dict = JSON.parsefile(pwd()*"/testNN.json") weights = SimpleChains.init_params(mlpd) sc_emu = SimpleChainsEmulator(Architecture = mlpd, Weights = weights, Description = Dict("emulator_description"=> NN_dict["emulator_description"])) n = 1024 A = randn(n) B = ones(n, 2) B[:,1] .*= 0. test_sum(A) = sum(abs2, maximin(A, B)) test_suminv(A) = sum(abs2, inv_maximin(A, B)) @testset "AbstractEmulators test" begin x = rand(m) y = rand(m, n) X = deepcopy(x) Y = deepcopy(y) norm_x = maximin(x, InMinMax) norm_y = maximin(y, InMinMax) @test any(norm_x .>=0 .&& norm_x .<=1) @test any(norm_y .>=0 .&& norm_y .<=1) x = inv_maximin(norm_x, InMinMax) y = inv_maximin(norm_y, InMinMax) @test any(x .== X) @test any(y .== Y) input = randn(6) stack_input = hcat(input, input) @test isapprox(run_emulator(input, sc_emu), run_emulator(stack_input, sc_emu)[:,1]) @test AbstractCosmologicalEmulators._get_nn_simplechains(NN_dict) == mlpd lux_emu = init_emulator(NN_dict, weights, LuxEmulator) sc_emu_check = init_emulator(NN_dict, weights, SimpleChainsEmulator) @test sc_emu_check.Architecture == sc_emu.Architecture @test sc_emu_check.Weights == sc_emu.Weights @test sc_emu_check.Description == sc_emu.Description NN_dict["layers"]["layer_1"]["activation_function"]= "adremxud" @test_throws ErrorException AbstractCosmologicalEmulators._get_nn_simplechains(NN_dict) @test_throws ErrorException AbstractCosmologicalEmulators._get_nn_lux(NN_dict) @test isapprox(run_emulator(input, sc_emu), run_emulator(input, lux_emu)) @test isapprox(run_emulator(input, lux_emu), run_emulator(stack_input, lux_emu)[:,1]) get_emulator_description(NN_dict["emulator_description"]) @test_logs (:warn, "We do not know which parameters were included in the emulators training space. Use this trained emulator with caution!") AbstractCosmologicalEmulators.get_emulator_description(Dict("pippo" => "franco")) @test_logs (:warn, "No emulator description found!") AbstractCosmologicalEmulators._get_emulator_description_dict(Dict("pippo" => "franco")) @test isapprox(run_emulator(input, sc_emu), run_emulator(input, lux_emu)) @test get_emulator_description(sc_emu) == get_emulator_description(NN_dict["emulator_description"]) @test ForwardDiff.gradient(test_sum, A) ≈ Zygote.gradient(test_sum, A)[1] @test ForwardDiff.gradient(test_suminv, A) ≈ Zygote.gradient(test_suminv, A)[1] end
AbstractCosmologicalEmulators
https://github.com/CosmologicalEmulators/AbstractCosmologicalEmulators.jl.git
[ "MIT" ]
0.6.0
31aa360731ce7e75f3a6fc9159d8ab4948a92493
docs
2339
# AbstractCosmologicalEmulators.jl [![Build status (Github Actions)](https://github.com/CosmologicalEmulators/AbstractCosmologicalEmulators.jl/workflows/CI/badge.svg)](https://github.com/CosmologicalEmulators/AbstractCosmologicalEmulators.jl/actions) [![codecov](https://codecov.io/gh/CosmologicalEmulators/AbstractCosmologicalEmulators.jl/branch/main/graph/badge.svg?token=0PYHCWVL67)](https://codecov.io/gh/CosmologicalEmulators/AbstractCosmologicalEmulators.jl) ![size](https://img.shields.io/github/repo-size/CosmologicalEmulators/AbstractCosmologicalEmulators.jl) [![Code Style: Blue](https://img.shields.io/badge/code%20style-blue-4495d1.svg)](https://github.com/invenia/BlueStyle) [![ColPrac: Contributor's Guide on Collaborative Practices for Community Packages](https://img.shields.io/badge/ColPrac-Contributor's%20Guide-blueviolet)](https://github.com/SciML/ColPrac) `AbstractCosmologicalEmulators.jl` is the central `Julia` package within the the [CosmologicalEmulators](https://github.com/CosmologicalEmulators) Github organization, which defines methods and structs used by the other packages hosted by the organization. In this moment the emulators here used are based only on the [`SimpleChains.jl`](https://github.com/PumasAI/SimpleChains.jl) library, whose performance is excellent on the CPU for the kind of small neural networks (NN) that we employ. We plan to include other frameworks, such as [`Lux.jl`](https://github.com/LuxDL/Lux.jl), in order to support models running on the GPU. If you want include a new NN/GP framework, feel free to open a PR or get in touch with us. ## Roadmap to v1.0.0 Step | Status| Comment :------------ | :-------------| :------------- Interface with `SimpleChains.jl` | :heavy_check_mark: | Implemented Support for vectorization | :heavy_check_mark: | Implemented Interface with `Lux.jl` | :heavy_check_mark: | Implemented AD Rules | :heavy_check_mark: | Implemented Robust emulators initialization | :heavy_check_mark: | Implemented, needs some polishing GPU support | :construction: | Work in progress Stable API | :construction: | Work in progress ## Authors - [Marco Bonici](https://www.marcobonici.com), PostDoctoral Researcher at Waterloo Centre for Astrophysics - [Marius Millea](https://cosmicmar.com), Researcher at UC Davis and Berkeley Center for Cosmological Physics
AbstractCosmologicalEmulators
https://github.com/CosmologicalEmulators/AbstractCosmologicalEmulators.jl.git
[ "MIT" ]
1.1.4
218d61a78bc3f3dc01ea89ad75417566ac61f238
code
656
using Documenter, CancerImagingArchive makedocs(; modules=[CancerImagingArchive], format = Documenter.HTML( prettyurls = get(ENV, "CI", nothing) == "true" ), pages=[ "Introduction" => "index.md", "Guide" => Any[ "Formats" => "guide/formats.md", "Queries" => "guide/queries.md", "Data Download" => "guide/downloading.md" ], "Functions" => "functions.md" ], repo="https://github.com/notZaki/CancerImagingArchive.jl/blob/{commit}{path}#L{line}", sitename="CancerImagingArchive.jl" ) deploydocs(; repo="github.com/notZaki/CancerImagingArchive.jl", )
CancerImagingArchive
https://github.com/notZaki/CancerImagingArchive.jl.git
[ "MIT" ]
1.1.4
218d61a78bc3f3dc01ea89ad75417566ac61f238
code
8941
module CancerImagingArchive using HTTP, CSV, DataFrames, JSON include("download_series.jl") export download_series export tcia_collections, tcia_modalities, tcia_bodyparts, tcia_manufacturers, tcia_studies, tcia_series, tcia_series_size export tcia_patients, tcia_patients_by_modality, tcia_newpatients, tcia_newstudies, tcia_sop export tcia_single_image, tcia_images export dataframe_to_csv, dictionary_to_json const _host = "services.cancerimagingarchive.net/services/v4/TCIA/query" const _format = "csv" const _q = Dict( :collection => "Collection", :bodypart => "BodyPartExamined", :modality => "Modality", :patient => "PatientID", :study => "StudyInstanceUID", :series => "SeriesInstanceUID", :manufacturer => "Manufacturer", :model => "ManufacturerModelName", :date => "Date", :sop => "SOPInstanceUID", :format => "format" ) """ remove_empty!(dictionary::Dict) Removes dictionary keys with empty values. Used internally to remove empty queries. """ function remove_empty!(dictionary::Dict) for (key, value) in dictionary if isempty(value) delete!(dictionary, key) end end end """ tcia_collections(; format = "csv") Provides names of all the collections on TCIA. """ function tcia_collections(; format = _format) endpoint = "/getCollectionValues" query = Dict( _q[:format] => format ) return request(endpoint, query) end """ tcia_modalities(; collection, bodypart, format = "csv") Returns the modalities used in a given `collection` and/or for a given `bodypart`. """ function tcia_modalities(; collection = "", bodypart = "", format = _format) endpoint = "/getModalityValues" query = Dict( _q[:collection] => collection, _q[:bodypart] => bodypart, _q[:format] => format) return request(endpoint, query) end """ tcia_bodyparts(; collection, modality, format = "csv") Returns the body parts examined in a given `collection` and/or by a given `modality`. """ function tcia_bodyparts(; collection = "", modality = "", format = _format) endpoint = "/getBodyPartValues" query = Dict( _q[:collection] => collection, _q[:modality] => uppercase(modality), _q[:format] => format ) return request(endpoint, query) end """ tcia_manufacturers(; collection, modality, bodypart, format = "csv") Returns the hardware manufacturers for a given `collection` and/or `modality` and/or `bodypart`. """ function tcia_manufacturers(; collection = "", modality = "", bodypart = "", format = _format) endpoint = "/getManufacturerValues" query = Dict( _q[:collection] => collection, _q[:bodypart] => bodypart, _q[:modality] => modality, _q[:format] => format ) return request(endpoint, query) end """ tcia_patients(; collection, format = "csv") Returns the patients in a given `collection`. """ function tcia_patients(; collection = "", format = _format) endpoint = "/getPatient" query = Dict( _q[:collection] => collection, _q[:format] => format ) return request(endpoint, query) end """ tcia_patients(; collection, modality, format = "csv") Returns the patients in a given `collection` and `modality` (both inputs required). """ function tcia_patients_by_modality(; collection::AbstractString, modality::AbstractString, format = _format) endpoint = "/PatientsByModality" query = Dict( _q[:collection] => collection, _q[:modality] => modality, _q[:format] => format ) return request(endpoint, query) end """ tcia_studies(; collection, patient, study, format = "csv") Returns the patient studies for a given `collection` and/or `patient` and/or `study`. """ function tcia_studies(; collection = "", patient = "", study = "", format = _format) endpoint = "/getPatientStudy" query = Dict( _q[:collection] => collection, _q[:patient] => patient, _q[:study] => study, _q[:format] => format ) return request(endpoint, query) end """ tcia_series(; collection, bodypart, manufacturer, modality, model, patient, series, study, format = "csv") Returns series information for a given `collection`, `bodypart, `manufactuer`, `modality`, manufacturer `model, `patient`, SeriesInstanceUID `series`, or StudyInstanceUID `study`. """ function tcia_series(; collection = "", bodypart = "", manufacturer = "", modality = "", model = "", patient = "", series = "", study = "", format = _format) endpoint = "/getSeries" query = Dict( _q[:collection] => collection, _q[:bodypart] => bodypart, _q[:manufacturer] => manufacturer, _q[:modality] => modality, _q[:model] => model, _q[:patient] => patient, _q[:series] => series, _q[:study] => study, _q[:format] => format ) return request(endpoint, query) end """ tcia_series_size(; series, format = "csv") Returns the total byte size and the number of objects in the given SeriesInstanceUID `series`. """ function tcia_series_size(; series::AbstractString, format = _format) endpoint = "/getSeriesSize" query = Dict( _q[:series] => series, _q[:format] => format ) return request(endpoint, query) end """ tcia_sop(; series, format = "csv") Returns the SOPInstanceUIDs for the given SeriesInstanceUID `series`. """ function tcia_sop(; series::AbstractString, format = _format) endpoint = "/getSOPInstanceUIDs" query = Dict( _q[:series] => series, _q[:format] => format ) return request(endpoint, query) end """ tcia_images(; series, file) Downloads the images for the given SeriesInstanceUID `series` as the given zip-file `file`. """ function tcia_images(; series::AbstractString, file::AbstractString) endpoint = "/getImage" query = Dict( _q[:series] => series, ) return request(endpoint, query, file=file) end """ tcia_single_image(; series, sop, file) Downloads a single DICOM image for the given SeriesInstanceUID `series` and SOPInstanceUID `sop`. The DICOM file is saved as `file`. """ function tcia_single_image(; series::AbstractString, sop::AbstractString, file::AbstractString) endpoint = "/getSingleImage" query = Dict( _q[:series] => series, _q[:sop] => sop ) return request(endpoint, query, file = file) end function tcia_newpatients(; collection::AbstractString, date::AbstractString, format = _format) endpoint = "/NewPatientsInCollection" query = Dict( _q[:collection] => collection, _q[:date] => date, _q[:format] => format ) return request(endpoint, query) end """ tcia_newstudies(; date, collection, patient, format = "csv") Returns new studies for a given `collection` that were added after a given `date` formatted as `YYYY-MM-DD`. The `patient` ID can be optionally given. """ function tcia_newstudies(; date::AbstractString, collection::AbstractString, patient = "", format = _format) endpoint = "/NewStudiesInPatientCollection" query = Dict( _q[:collection] => collection, _q[:date] => date, _q[:patient] => patient, _q[:format] => format ) return request(endpoint, query) end function request(endpoint, query; file="", host = _host) remove_empty!(query) uri = HTTP.URI(scheme="https", host=host, path=endpoint, query=query) @assert HTTP.isvalid(uri) "Invalid URI: $(uri)" url = string(uri) if has_format(query, "csv") return _request_csv(url) elseif has_format(query, "json") return _request_json(url) elseif !isempty(file) return _request_image(url, file) else error("Not supported") end end has_format(query, format) = haskey(query, "format") && query["format"]==format function _request_csv(url) r = HTTP.request("GET", url) return DataFrame(CSV.File(r.body)) end function _request_json(url) r = HTTP.request("GET", url) json = JSON.Parser.parse(String(r.body)) return filter(element -> !isempty(element), json) end function _request_image(url, file) HTTP.open("GET", url) do http open(file, "w") do out write(out, http) end end return file end """ dataframe_to_csv(; dataframe, file) Writes the information in a DataFrame object (`dataframe`) into a csv file (`file`). """ function dataframe_to_csv(; dataframe::DataFrame, file::AbstractString) CSV.write(file, dataframe) return end """ dictionary_to_json(; dictionary, file) Writes the information in a Dictionary Array (`dictionary`) into a json file (`file`). """ function dictionary_to_json(; dictionary, file::AbstractString) open(file, "w") do io JSON.print(io, dictionary, 4) end return end end # module
CancerImagingArchive
https://github.com/notZaki/CancerImagingArchive.jl.git
[ "MIT" ]
1.1.4
218d61a78bc3f3dc01ea89ad75417566ac61f238
code
3245
function _initialize_destination(destination, overwrite) if overwrite rm(destination; force = true, recursive = true) end if !isdir(destination) mkpath(destination) end return destination end function _append_to_path(path, thing_to_append) thing_to_append = replace(thing_to_append, r"[^0-9a-zA-Z]" => "") return joinpath(path, thing_to_append) end """ download_series(series_id::AbstractString, destination::AbstractString = "./", overwrite::Boolean = true) Downloads images belonging to series with `series_id` and extracts them to `destination` folder. If the destination folder already exists, then it will be overwritten by default unless `overwrite = false`. """ function download_series(series_id::AbstractString, destination = "./", overwrite = true) _initialize_destination(destination, overwrite) zip_file = joinpath(destination, "downloaded.zip") tcia_images(series = series_id, file = zip_file) unzip_command = `unzip -o $zip_file -d $destination` run(unzip_command) rm(zip_file) return destination end """ download_series(df::DataFrame, destination::AbstractString = "./"; append_desc::Boolean = true, overwrite::Boolean = true) Downloads all images from the series in the dataframe `df` and then extracts them to `destination` folder. The `df` can be obtained through the `tcia_series()` function. By default, the series description will be appended to the path unless `append_desc = false`. If the destination folder already exists, then it will be overwritten by default unless `overwrite = false`. """ function download_series(series_df::DataFrames.DataFrame, destination = "./"; append_desc = true, overwrite = true) return [download_series(row, destination; append_desc=append_desc, overwrite=overwrite) for row in eachrow(series_df)] end function download_series(series::DataFrames.DataFrameRow, destination = "./"; append_desc = true, overwrite = true) series_id = series.SeriesInstanceUID if append_desc destination = _append_to_path(destination, series.SeriesDescription) end return download_series(series_id, destination, overwrite) end """ download_series(arr::Array, destination::AbstractString = "./"; append_desc::Boolean = true, overwrite::Boolean = true) Downloads all images from the series in the array `arr` and then extracts them to `destination` folder. The `arr` can be obtained through the `tcia_series(..., format = "json")` command. By default, the series description will be appended to the path unless `append_desc = false`. If the destination folder already exists, then it will be overwritten by default unless `overwrite = false`. """ function download_series(series_array::Array, destination = "./"; append_desc = true, overwrite = true) return [download_series(series, destination; append_desc=append_desc, overwrite=overwrite) for series in series_array] end function download_series(series::Dict, destination = "./"; append_desc = true, overwrite = true) series_id = series["SeriesInstanceUID"] if append_desc destination = _append_to_path(destination, series["SeriesDescription"]) end return download_series(series_id, destination, overwrite) end
CancerImagingArchive
https://github.com/notZaki/CancerImagingArchive.jl.git
[ "MIT" ]
1.1.4
218d61a78bc3f3dc01ea89ad75417566ac61f238
code
8367
using CancerImagingArchive, DataFrames using Test ####### # SETUP ####### # Use global variable for filenames because we want to delete them if they already exist zip_file = "test.zip" dicom_file = "test.dcm" csv_file = "test.csv" json_file = "test.json" for file in [zip_file, dicom_file, csv_file, json_file] rm(file, force = true) end # Helper function for comparing CSV/DataFrames vs JSON/DictionaryArrays function compare_csv_vs_json(csv, json; max_names = Inf) names_in_csv = find_names_in_csv(csv) names_in_json = find_names_in_json(json) @test length(json) == length(csv[!, names_in_csv[1]]) @test length(names_in_json) == length(names_in_csv) if length(names_in_json) > max_names # Some cases have too many rows and they are not directly comparable because of types names_in_json = names_in_json[1:max_names] names_in_csv = names_in_csv[1:max_names] end for (idx, json_element) in enumerate(json) for name in names_in_csv if ismissing(csv[!, name][idx]) # Missing entries lead to this messy situation. Just check if both json/csv are empty/missing @test isempty(json_element) || isempty(json_element[string(name)]) @test ismissing(csv[!, name][idx]) else @test csv[!, name][idx] == json_element[string(name)] end end end return nothing end function find_names_in_csv(csv) names_in_csv = names(csv) # AnnotationsFlag field exists in csv but not json, so remove it for comparisons filter!(name -> name != "AnnotationsFlag", names_in_csv) return names_in_csv end function find_names_in_json(json_array) # JSON names get ignored if the entry is missing so we can't just do collect(keys(json_array[1])) found_names = [] num_names = 0 for json in json_array cur_names = keys(json) if length(cur_names) > num_names found_names = collect(cur_names) num_names = length(found_names) end end return found_names end ############################################################################### @testset "Queries - Collection" begin @test_throws ErrorException tcia_collections(format = "unknown") collections_csv = tcia_collections() collections_json = tcia_collections(format = "json") @test length(collections_json) > 90 compare_csv_vs_json(collections_csv, collections_json) end @testset "Queries - Modalities" begin @test length( tcia_modalities(collection = "TCGA-GBM", format = "json") ) > 2 @test length( tcia_modalities(bodypart = "BREAST", format = "json") ) > 5 compare_csv_vs_json( tcia_modalities(collection = "TCGA-GBM", bodypart = "BRAIN"), tcia_modalities(collection = "TCGA-GBM", bodypart = "BRAIN", format = "json")) end @testset "Queries - BodyParts" begin @test "BRAIN" in tcia_bodyparts(modality = "MR").BodyPartExamined compare_csv_vs_json( tcia_bodyparts(collection = "CPTAC-HNSCC"), tcia_bodyparts(collection = "CPTAC-HNSCC", format = "json")) end @testset "Queries - Manufacturers" begin compare_csv_vs_json( tcia_manufacturers(collection = "TCGA-KICH", modality = "MR"), tcia_manufacturers(collection = "TCGA-KICH", modality = "MR", format = "json")) compare_csv_vs_json( tcia_manufacturers(bodypart = "BREAST"), tcia_manufacturers(bodypart = "BREAST", format = "json")) end @testset "Queries - Patients" begin compare_csv_vs_json( tcia_patients(collection = "TCGA-THCA"), tcia_patients(collection = "TCGA-THCA", format = "json")) # Following criteria should only find one patient found_patient = tcia_patients_by_modality(collection = "ACRIN-FLT-Breast", modality = "OT") @test length(found_patient.PatientID) == 1 @test found_patient.PatientID[1] == "ACRIN-FLT-Breast_066" # Following criteria should find at least two patients new_gbm_patients = tcia_newpatients(collection = "TCGA-GBM", date = "2015-01-01", format = "json") @test length(new_gbm_patients) > 1 end @testset "Queries - Studies" begin # The CSV version requires a few manual changes, so we do them first studies_csv = tcia_studies(collection = "TCGA-SARC") # 1. Convert the date to plain strings so that they can be compared with the json version studies_csv.StudyDate = string.(studies_csv.StudyDate) # 2. Remove the escape characters in the string. These occur in the study description for (idx, description) in enumerate(studies_csv.StudyDescription) studies_csv.StudyDescription[idx] = replace(description, "\\" => "") end compare_csv_vs_json( studies_csv, tcia_studies(collection = "TCGA-SARC", format = "json")) # Following criteria should find at least three series @test length(tcia_newstudies(collection="TCGA-GBM", date="2015-01-01", format="json")) > 2 end @testset "Queries - Series" begin compare_csv_vs_json( tcia_series(collection = "TCGA-THCA"), tcia_series(collection = "TCGA-THCA", format = "json"), max_names = 3) compare_csv_vs_json( tcia_series(study = "1.3.6.1.4.1.14519.5.2.1.3023.4024.298690116465423805879206377806"), tcia_series(study = "1.3.6.1.4.1.14519.5.2.1.3023.4024.298690116465423805879206377806", format = "json"), max_names = 3) compare_csv_vs_json( tcia_series(bodypart = "CHEST", modality = "CT", manufacturer = "TOSHIBA"), tcia_series(bodypart = "CHEST", modality = "CT", manufacturer = "TOSHIBA", format = "json"), max_names = 3) # Can not use compare_csv_vs_json() on tcia_series_size() because TotalSizeInBytes has different types dce_series_json = tcia_series_size(series = "1.3.6.1.4.1.14519.5.2.1.4591.4001.241972527061347495484079664948", format="json")[1] @test dce_series_json["TotalSizeInBytes"] == "149149266.000000" dce_series_csv = tcia_series_size(series = "1.3.6.1.4.1.14519.5.2.1.4591.4001.241972527061347495484079664948") @test dce_series_csv.TotalSizeInBytes[1] ≈ 149149266 @test dce_series_csv.ObjectCount[1] == dce_series_json["ObjectCount"] == 1120 end @testset "Queries - SOP" begin compare_csv_vs_json( tcia_sop(series = "1.3.6.1.4.1.14519.5.2.1.4591.4001.241972527061347495484079664948"), tcia_sop(series = "1.3.6.1.4.1.14519.5.2.1.4591.4001.241972527061347495484079664948", format = "json")) end @testset "Data Download" begin patient_studies = tcia_studies(collection = "TCGA-THCA") chosen_study = patient_studies.StudyInstanceUID[1] imaging_series = tcia_series(study = chosen_study) chosen_series = imaging_series.SeriesInstanceUID[1] series_sops = tcia_sop(series = chosen_series) chosen_sop = series_sops.SOPInstanceUID[1] tcia_images(series = chosen_series, file = zip_file) @test isfile(zip_file) @test filesize(zip_file) == 945849 tcia_single_image(series = chosen_series, sop = chosen_sop, file = dicom_file) @test isfile(dicom_file) @test filesize(dicom_file) == 980794 end @testset "Download series" begin series = tcia_series(collection = "AAPM-RT-MAC", patient = "RTMAC-LIVE-001") seriesjs = tcia_series(collection = "AAPM-RT-MAC", patient = "RTMAC-LIVE-001", format="json") download_series(series, "./testdf") download_series(seriesjs, "./testjs") download_series(series, "./testdf"; overwrite = false) end @testset "Utilities - remove_empty!()" begin dict_potentialy_empty_values = Dict(1 => "", 2 => "hello", 3 => "b", 4 => "", 5 => "ye") CancerImagingArchive.remove_empty!(dict_potentialy_empty_values) nonempty_keys = [] for (key, value) in dict_potentialy_empty_values @test !isempty(value) push!(nonempty_keys, key) end @test sort(nonempty_keys) == [2, 3, 5] end @testset "Utilities - Data writer" begin tabular_data = tcia_collections() dataframe_to_csv(dataframe = tabular_data, file = csv_file) @test isfile(csv_file) println("Size of csv file: $(filesize(csv_file))") @test filesize(csv_file) >= 1346 dict_array = tcia_collections(format = "json") dictionary_to_json(dictionary = dict_array, file = json_file) @test isfile(json_file) println("Size of json file: $(filesize(json_file))") @test filesize(json_file) >= 4816 end
CancerImagingArchive
https://github.com/notZaki/CancerImagingArchive.jl.git
[ "MIT" ]
1.1.4
218d61a78bc3f3dc01ea89ad75417566ac61f238
docs
2127
<a href="#"><img src="./docs/src/assets/logo.svg" width="350" height="200"></img></a> # CancerImagingArchive.jl [![Stable](https://img.shields.io/badge/docs-stable-blue.svg)](https://notZaki.github.io/CancerImagingArchive.jl/stable) [![Dev](https://img.shields.io/badge/docs-dev-blue.svg)](https://notZaki.github.io/CancerImagingArchive.jl/dev) [![Build Status](https://github.com/notZaki/CancerImagingArchive.jl/workflows/CI/badge.svg)](https://github.com/notZaki/CancerImagingArchive.jl/actions) [![Codecov](https://codecov.io/gh/notZaki/CancerImagingArchive.jl/branch/master/graph/badge.svg)](https://codecov.io/gh/notZaki/CancerImagingArchive.jl) A Julia interface for [The Cancer Imaging Archive (TCIA) REST-API](https://wiki.cancerimagingarchive.net/display/Public/TCIA+Programmatic+Interface+%28REST+API%29+Usage+Guide) ## Installation The package can be installed by ```julia julia> ]add CancerImagingArchive ``` ## Usage The [documentation pages](https://notZaki.github.io/CancerImagingArchive.jl/stable) provide details/examples of how to use the package. ## Notes This is **not** an official project of The Cancer Imaging Archive. If any problems are experienced with this package, then please open a [new issue here](https://github.com/notZaki/CancerImagingArchive.jl/issues). Please follow the [Data Usage Policies and Restrictions](https://wiki.cancerimagingarchive.net/display/Public/Data+Usage+Policies+and+Restrictions) outlined in the TCIA wiki. If this package is helpful, then great! There is no need to cite the package itself. However, any manuscript produced using data from TCIA should cite: 1. TCIA [[link]](https://www.ncbi.nlm.nih.gov/pubmed/23884657) ``` Clark K, Vendt B, Smith K, et al. The Cancer Imaging Archive (TCIA): Maintaining and Operating a Public Information Repository. Journal of Digital Imaging. 2013; 26(6): 1045-1057. doi: 10.1007/s10278-013-9622-7. ``` 2. All citations specific to the datasets that were used. These details are mentioned on the [TCIA wiki](https://wiki.cancerimagingarchive.net/display/Public/Data+Usage+Policies+and+Restrictions).
CancerImagingArchive
https://github.com/notZaki/CancerImagingArchive.jl.git
[ "MIT" ]
1.1.4
218d61a78bc3f3dc01ea89ad75417566ac61f238
docs
451
# Functions This page briefly describes the functions exported by the package. The function names in this module are in lowercase letters and sometimes underscores are used for longer function names. Most functions require keyword arguments, i.e. `tcia_bodyparts(collection = collection_name)` will work but `tcia_bodyparts(collection_name)` will not. ## Index ```@index ``` ## Public functions ```@autodocs Modules = [CancerImagingArchive] ```
CancerImagingArchive
https://github.com/notZaki/CancerImagingArchive.jl.git
[ "MIT" ]
1.1.4
218d61a78bc3f3dc01ea89ad75417566ac61f238
docs
652
## Introduction The `CancerImagingArchive.jl` module provides a Julia interface for exploring and downloading imaging data from [The Cancer Imaging Archive (TCIA)](https://www.cancerimagingarchive.net/) via their [REST API](https://wiki.cancerimagingarchive.net/display/Public/TCIA+Programmatic+Interface+%28REST+API%29+Usage+Guide). ## Installation This module can be installed by: ```julia julia> ]add CancerImagingArchive ``` Once installed, the package can be loaded via ```julia julia> using CancerImagingArchive ``` ## Usage The module contains about a dozen [Functions](@ref) and a guide with some examples is included in the next section.
CancerImagingArchive
https://github.com/notZaki/CancerImagingArchive.jl.git
[ "MIT" ]
1.1.4
218d61a78bc3f3dc01ea89ad75417566ac61f238
docs
2688
```@setup ex using CancerImagingArchive ``` # Data Download Imaging data can be downloaded either as a `.zip` file containing an imaging series or as a DICOM `.dcm` file containing a single acquisition within an imaging series. ## Imaging series ### Selecting the imaging series The SeriesInstanceUID is needed to download an imaging series. The below example selects one series from the [TCGA-THCA collection](https://wiki.cancerimagingarchive.net/display/Public/TCGA-THCA). ```@repl ex patient_studies = tcia_studies(collection = "TCGA-THCA") chosen_study = patient_studies.StudyInstanceUID[1] imaging_series = tcia_series(study = chosen_study) chosen_series = imaging_series.SeriesInstanceUID[1] ``` ### Downloading the imaging series Once the SeriesInstanceUID is known, the imaging data can be downloaded as a zip file by: ```@repl ex zip_file = "output_file.zip"; # Can also be a path tcia_images(series = chosen_series, file = zip_file) ``` ### Convenience wrapper The above steps will only download a zip file which then has to be extracted. This can be cumbersome when downloading multipled series, so the `download_series()` function is provided for convenience. !!! note The `download_series()` assumes that the `unzip` utility is installed on the system. This can be verified by typing `unzip` in a terminal or `;unzip` in julia. ``` **Downloading a single series** The following will download and extract the `chosen_series` (selected above) and extract the images in the current directory `./`. ```julia julia> download_series(chosen_series, "./") ``` **Downloading multiple series** The wrapper function can download multiple series from a Dataframe by ```julia julia> series = tcia_series(collection = "AAPM-RT-MAC", patient = "RTMAC-LIVE-001") julia> download_series(series, "./testdf") ``` or from an array of dictionaries by ```julia julia> seriesjs = tcia_series(collection = "AAPM-RT-MAC", patient = "RTMAC-LIVE-001", format="json") julia> download_series(seriesjs, "./testjs") ``` ## Single image ### Selecting the single image To download a single image, both its SeriesInstanceUID and SOPInstanceUID must be known. Continuing from the previous example, if we only wanted to download the first image in `chosen_series`, then: ```@repl ex series_sops = tcia_sop(series = chosen_series) chosen_sop = series_sops.SOPInstanceUID[1] ``` ### Downloading the single image Once the SeriesInstanceUID and SOPInstanceUID are known, the dicom file can be downloaded by: ```@repl ex dicom_file = "output_file.dcm"; tcia_single_image(series = chosen_series, sop = chosen_sop, file = dicom_file) ``` ```@setup ex rm(zip_file) rm(dicom_file) ```
CancerImagingArchive
https://github.com/notZaki/CancerImagingArchive.jl.git
[ "MIT" ]
1.1.4
218d61a78bc3f3dc01ea89ad75417566ac61f238
docs
3749
```@setup ex using CancerImagingArchive ``` # Formats Most functions in this module return either a DataFrame (default) or a Dictionary Array. Both formats have their respective advantages and disadvantages. ## DataFrames/CSV DataFrames display the results in a table and this is the default behaviour. The table is useful for visually analysing the results. For example, the studies in the [TCGA-SARC collection](https://wiki.cancerimagingarchive.net/display/Public/TCGA-SARC) can be obtained by: ```@example ex studies_df = tcia_studies(collection = "TCGA-SARC") ``` ### Manipulating the DataFrame object The table can be manipulated using tools from the [DataFrames.jl](https://juliadata.github.io/DataFrames.jl/stable/) and [CSV.jl](https://juliadata.github.io/CSV.jl/stable/) packages. Individual columns in a DataFrame object---suppose it is named `data_frame`---can be accessed by `data_frame.column_name` where the available column names are in `names(data_frame)`. ```@repl ex names(studies_df) studies_df.StudyDate ``` The table can also be filtered and sorted. As an example, the following lines will sort the previous table by the number of series and then remove the `StudyInstanceUID` and `PatientName` columns: ```@example ex using DataFrames studies_sorted_by_count = sort(studies_df, :SeriesCount) select!(studies_sorted_by_count, Not([:StudyInstanceUID, :PatientName])) ``` ### Saving DataFrame as CSV The contents of the table can be written to a csv file by: ```@repl ex dataframe_to_csv(dataframe = studies_df, file = "output_file.csv") ``` ## DictionaryArray/JSON Instead of a table, an array of dictionaries can be obtained by passing `format = "json"` as an argument when calling the query function. For example, the DataFrame from the previous example could have been obtained as an array by: ```@example ex studies_array = tcia_studies(collection = "TCGA-SARC", format = "json") ``` ### Manipulating the Dictionary Array The array can be manipulated by iterating over the elements. As an example, the following lines will collect patients that are less than 60 years old: ```@example ex patients_below_60Y = [] for patient in studies_array if patient["PatientAge"] < "060Y" push!(patients_below_60Y, patient) end end # Print the new array: patients_below_60Y ``` The available keys for each dictionary in the array are listed by: ```@repl ex keys(studies_array[1]) ``` ### Saving Dictionary Array as JSON The array can be written to a JSON file by ```@repl ex dictionary_to_json(dictionary = studies_array, file = "output_file.json") ``` ```@setup ex rm("output_file.csv") rm("output_file.json") ``` ## Note on types The DataFrames object tries to figure out the types from the input while the DictionaryArray just accepts whatever the API returns. For a practical example of this, suppose we want to know the size of an imaging series; the DataFrame version will be ```@example ex tcia_series_size(series = "1.3.6.1.4.1.14519.5.2.1.4591.4001.241972527061347495484079664948") ``` while the JSON version will be ```@repl ex tcia_series_size(series = "1.3.6.1.4.1.14519.5.2.1.4591.4001.241972527061347495484079664948", format="json")[1] ``` The difference between the two is that the DataFrames version recognizes that `TotalSizeInBytes` is a number whereas the DictionaryArray displays it as a string (because the API returns it as a string). DataFrames' ability to recognize types is usually helpful, but sometimes it can fail. For example, in an anonymized dataset where patient names are replaced by numbers, the DataFrames object will incorrectly treat the names as numbers. These differences are unlikely to cause problems in practice so it isn't something to be actively concerned about.
CancerImagingArchive
https://github.com/notZaki/CancerImagingArchive.jl.git
[ "MIT" ]
1.1.4
218d61a78bc3f3dc01ea89ad75417566ac61f238
docs
6184
```@setup ex using CancerImagingArchive ``` # Queries Queries are useful for exploring the available imaging data. The general hierarchy of the cancer imaging archive (TCIA) is: ``` Collection -> PatientID -> StudyInstanceUID -> SeriesInstanceUID -> SOPInstanceUID ``` To download images, the `SeriesInstanceUID` and/or `SOPInstanceUID` must be known. The query functions are meant to help identify the relevant unique identifiers (UIDs) Detailed information is available in the TCIA's user guide which includes a [list of available query endpoints](https://wiki.cancerimagingarchive.net/display/Public/TCIA+Programmatic+Interface+%28REST+API%29+Usage+Guide) and the [type of information returned](https://wiki.cancerimagingarchive.net/display/Public/TCIA+API+Return+Values) by each query. !!! note As mentioned in the [Formats](@ref) section, each query returns either a DataFrame or a Dictionary Array. The current section will exclusively use the DataFrame output. That being said, a dictionary array can always be obtained by any of these functions by passing `format = "json"` as an input argument. ## All collections The names of all available collections on TCIA is obtained by: ```@repl ex tcia_collections() ``` ## Imaging modalities The imaging modalities in a specific collection and/or anatomy are listed by: ```@repl ex tcia_modalities(collection = "TCGA-KIRP") tcia_modalities(bodypart = "BRAIN") tcia_modalities(collection = "CPTAC-HNSCC", bodypart = "HEAD") ``` !!! note Capitalization matters when passing in arguments, i.e. `bodypart = "BRAIN"` works but passing `bodypart = "brain"` will return an empty object. However, there are some cases where different versions are valid. As an example, passing `bodypart = Kidney` or `bodypart = "KIDNEY"` will both return valid (but different!) results. So although fully-capitalized body part names will work most of the time, do double-check if alternative spellings exist when using the `bodypart` argument (see next section for names) ## Anatomy/body parts The anatomy scanned in a specific collection and/or modality are listed by: ```@repl ex tcia_bodyparts(collection = "CPTAC-HNSCC") tcia_bodyparts(modality = "CT") tcia_bodyparts(collection = "CPTAC-SAR", modality = "MR") tcia_bodyparts(collection = "CPTAC-SAR", modality = "CT") ``` ## Manufacturers A list of scanner manufacturers for a specific collection/modality/anatomy is obtained by ```@repl ex tcia_manufacturers(collection = "TCGA-KICH") tcia_manufacturers(modality = "CT") tcia_manufacturers(bodypart = "BREAST") ``` The same manufacturer can have different names, e.g. `Philips`/`Philips Medical Systems` and `SIEMENS`/`Siemens`. ## Patients The patients in a given collection are listed by: ```@repl ex tcia_patients(collection = "TCGA-SARC") ``` ### Patients for specific modality To get a patients for which a specific modality was used, a slightly different function is used: ```@repl ex tcia_patients_by_modality(collection = "TCGA-SARC", modality = "CT") tcia_patients_by_modality(collection = "TCGA-SARC", modality = "MR") ``` !!! note Although the functionality of `tcia_patients_by_modality()` could be combined into the `tcia_patients()` function, they use a different query endpoint so the two functions were given different names to keep that difference explicit. ### Patients added after specific date In large collections, it can be useful to query patients that were added after a date specified as YYYY-MM-DD. This is accomplished by: ```@repl ex tcia_newpatients(collection = "TCGA-GBM", date = "2015-01-01") ``` ## Patient studies A list of visits/studies for a given collection/patient is obtained by: ```@repl ex tcia_studies(collection = "TCGA-THCA") tcia_studies(patient = "TCGA-QQ-A8VF") ``` If the unique identifier (UID) for a study is known (a.k.a. StudyInstanceUID), then that can also be used ```@repl ex tcia_studies(study = "1.3.6.1.4.1.14519.5.2.1.3023.4024.298690116465423805879206377806") ``` ### Patient studies added after specific data A list of visits/studies that were added after some date, formatted by YYYY-MM-DD, can be obtained by: ```@repl ex tcia_newstudies(collection="TCGA-GBM", date="2015-01-01") ``` ## Imaging series Each patient study consists of one or more imaging series which can be obtained by: ```@repl ex tcia_series(collection = "TCGA-THCA") tcia_series(patient = "TCGA-QQ-A8VF") tcia_series(study = "1.3.6.1.4.1.14519.5.2.1.3023.4024.298690116465423805879206377806") tcia_series(modality = "CT", manufacturer = "TOSHIBA") tcia_series(bodypart = "EXTREMITY") ``` This query's importance is hinted by the smorgasbord of parameters it accepts. That's because this query returns the `SeriesInstanceUID` which is needed to download images. Although the above examples only show `PatientID`, the query actually returns more information which is not shown because of limited screen space. The complete list of columns are: ```@repl ex series_dataframe = tcia_series(patient = "TCGA-QQ-A8VF"); names(series_dataframe) ``` !!! note The entire table could have been printed by: ```julia show(series_dataframe, allrows = true, allcols = true) ``` !!! warning Passing `format = "json"` will result in one fewer column. This is because the `AnnotationsFlag` field is returned for CSV output but not for JSON. ### Imaging series size The size (in bytes) and number of images for a given imaging series is given by ```@repl ex tcia_series_size(series = "1.3.6.1.4.1.14519.5.2.1.4591.4001.241972527061347495484079664948") ``` !!! warning It is recommended that `tcia_series_size()` should **not** be used with `format = json`. This is because the json version interprets the `TotalSizeInBytes` as string/text rather than a number. ## Service-Object Pairs (SOP) Each imaging series consists of one or more images, each of which have a service-object-pair unique identifier (SOPInstanceUID). These can be listed by ```@repl ex tcia_sop(series = "1.3.6.1.4.1.14519.5.2.1.4591.4001.241972527061347495484079664948") ``` These identifiers are useful for accessing a specific image without having to download the entire imaging series.
CancerImagingArchive
https://github.com/notZaki/CancerImagingArchive.jl.git
[ "MIT" ]
0.9.24
00e206895eb8d0350149dff15ba1ea9a8f306c63
code
292
using Documenter import BusinessDays makedocs( sitename = "BusinessDays.jl", modules = [ BusinessDays ], pages = [ "Home" => "index.md", "API Reference" => "api.md" ] ) deploydocs( repo = "github.com/JuliaFinance/BusinessDays.jl.git", target = "build", )
BusinessDays
https://github.com/JuliaFinance/BusinessDays.jl.git
[ "MIT" ]
0.9.24
00e206895eb8d0350149dff15ba1ea9a8f306c63
code
785
""" A highly optimized Business Days calculator written in Julia language. Also known as Working Days calculator. Website: https://github.com/JuliaFinance/BusinessDays.jl """ module BusinessDays import Dates # exported types export HolidayCalendar, CompositeHolidayCalendar, GenericHolidayCalendar # exported functions export isholiday, isweekday, isweekend, isbday, tobday, advancebdays, bdayscount, bdays, firstbdayofmonth, lastbdayofmonth, listholidays, listbdays include("dateutils.jl") include("holidaycalendar.jl") include("bdayscache.jl") include("bdays.jl") include("bdaysvecfun.jl") include("composite.jl") include("query.jl") include("calendars/calendars.jl") include("generic.jl") end # module BusinessDays
BusinessDays
https://github.com/JuliaFinance/BusinessDays.jl.git
[ "MIT" ]
0.9.24
00e206895eb8d0350149dff15ba1ea9a8f306c63
code
5213
""" isweekend(dt) :: Bool Returns `true` for Saturdays or Sundays. Returns `false` otherwise. """ @inline isweekend(dt::Dates.Date) :: Bool = signbit(5 - Dates.dayofweek(dt)) """ isweekday(dt) :: Bool Returns `true` for Monday to Friday. Returns `false` otherwise. """ @inline isweekday(dt::Dates.Date) :: Bool = signbit(Dates.dayofweek(dt) - 6) """ isbday(calendar, dt) :: Bool Returns `false` for weekends or holidays. Returns `true` otherwise. """ function isbday(hc::HolidayCalendar, dt::Dates.Date) :: Bool if _getcachestate(hc) return isbday(_getholidaycalendarcache(hc), dt) else return !(isweekend(dt) || isholiday(hc, dt)) end end @inline isbday(calendar, dt) :: Bool = isbday(convert(HolidayCalendar, calendar), dt) """ tobday(calendar, dt; [forward=true]) :: Dates.Date Adjusts `dt` to next Business Day if it's not a Business Day. If `isbday(dt)`, returns `dt`. """ function tobday(hc::HolidayCalendar, dt::Dates.Date; forward::Bool = true) :: Dates.Date if isbday(hc, dt) return dt else increment = forward ? 1 : -1 next_date = dt + Dates.Day(increment) while !isbday(hc, next_date) next_date += Dates.Day(increment) end end return next_date end tobday(calendar, dt; forward::Bool = true) = tobday(convert(HolidayCalendar, calendar), dt; forward=forward) """ advancebdays(calendar, dt, bdays_count) :: Dates.Date Increments given date `dt` by `bdays_count`. Decrements it if `bdays_count` is negative. `bdays_count` can be a `Int`, `Dates.Day`, `Vector{Int}`, `Vector{Dates.Day}` or a `UnitRange`. Computation starts by next Business Day if `dt` is not a Business Day. """ function advancebdays(hc::HolidayCalendar, dt::Dates.Date, bdays_count::Int) :: Dates.Date result = tobday(hc, dt) # does nothing if bdays_count == 0 return result end # if bdays_count is positive, goes forward. Otherwise, goes backwards. increment = bdays_count > 0 ? +1 : -1 num_iterations = abs(bdays_count) while num_iterations > 0 result += Dates.Day(increment) # Looks for previous / next Business Day while !isbday(hc, result) result += Dates.Day(increment) end num_iterations += -1 end return result end advancebdays(hc::HolidayCalendar, dt::Dates.Date, bdays_count::Dates.Day) = advancebdays(hc, dt, Dates.value(bdays_count)) const BDaysCountType = Union{Int, Dates.Day} function advancebdays(calendar, dt, bdays_count::Union{T, Vector{T}, AbstractRange}) where {T<:BDaysCountType} advancebdays(convert(HolidayCalendar, calendar), convert(Dates.Date, dt), bdays_count) end """ bdayscount(calendar, dt0, dt1) :: Int Counts the number of Business Days between `dt0` and `dt1`. Returns `Int`. Computation is always based on next Business Day if given dates are not Business Days. """ function bdayscount(hc::HolidayCalendar, dt0::Dates.Date, dt1::Dates.Date) :: Int if _getcachestate(hc) return bdayscount(_getholidaycalendarcache(hc), dt0, dt1) else dt0 = tobday(hc, dt0) dt1 = tobday(hc, dt1) inc = dt0 <= dt1 ? +1 : -1 result = 0 while dt0 != dt1 dt0 = advancebdays(hc, dt0, inc) result += inc end return result end end bdayscount(calendar, dt0::Dates.Date, dt1::T) where {T<:Union{Dates.Date, Vector{Dates.Date}}} = bdayscount(convert(HolidayCalendar, calendar), dt0, dt1) bdayscount(calendar, dt0::Vector{Dates.Date}, dt1::Vector{Dates.Date}) = bdayscount(convert(HolidayCalendar, calendar), dt0, dt1) """ bdays(calendar, dt0, dt1) :: Dates.Day Counts the number of Business Days between `dt0` and `dt1`. Returns instances of `Dates.Day`. Computation is always based on next Business Day if given dates are not Business Days. """ bdays(hc::HolidayCalendar, dt0::Dates.Date, dt1::Dates.Date) = Dates.Day(bdayscount(hc, dt0, dt1)) bdays(calendar, dt0::Dates.Date, dt1::T) where {T<:Union{Dates.Date, Vector{Dates.Date}}} = bdays(convert(HolidayCalendar, calendar), dt0, dt1) bdays(calendar, dt0::Vector{Dates.Date}, dt1::Vector{Dates.Date}) = bdays(convert(HolidayCalendar, calendar), dt0, dt1) """ firstbdayofmonth(calendar, dt) :: Dates.Date firstbdayofmonth(calendar, yy, mm) :: Dates.Date Returns the first business day of month. """ firstbdayofmonth(calendar, dt::Dates.Date) = tobday(calendar, Dates.firstdayofmonth(dt)) """ lastbdayofmonth(calendar, dt) :: Dates.Date lastbdayofmonth(calendar, yy, mm) :: Dates.Date Returns the last business day of month. """ lastbdayofmonth(calendar, dt::Dates.Date) = tobday(calendar, Dates.lastdayofmonth(dt), forward=false) firstbdayofmonth(calendar, yy::T, mm::T) where {T<:Integer} = firstbdayofmonth(calendar, Dates.Date(yy, mm, 1)) firstbdayofmonth(calendar, yy::Dates.Year, mm::Dates.Month) = firstbdayofmonth(calendar, Dates.Date(yy, mm, Dates.Day(1))) lastbdayofmonth(calendar, yy::T, mm::T) where {T<:Integer} = lastbdayofmonth(calendar, Dates.Date(yy, mm, 1)) lastbdayofmonth(calendar, yy::Dates.Year, mm::Dates.Month) = lastbdayofmonth(calendar, Dates.Date(yy, mm, Dates.Day(1)))
BusinessDays
https://github.com/JuliaFinance/BusinessDays.jl.git
[ "MIT" ]
0.9.24
00e206895eb8d0350149dff15ba1ea9a8f306c63
code
6595
# # Cache routines for Business Days precalculated days # """ Data structure for calendar cache. """ mutable struct HolidayCalendarCache hc::HolidayCalendar isbday_array::Vector{Bool} bdayscounter_array::Vector{UInt32} dtmin::Dates.Date dtmax::Dates.Date is_initialized::Bool # indicated wether isbday_array and bdayscounter_array is empty for this cache end """ HolidayCalendarCache() creates an empty instance of HolidayCalendarCache """ HolidayCalendarCache() = HolidayCalendarCache(NullHolidayCalendar(), Vector{Bool}(), Vector{UInt32}(), Dates.Date(1900,1,1), Dates.Date(1900,1,1), false) """ Holds caches for Holiday Calendars. * Key = `HolidayCalendar` instance * Value = instance of `HolidayCalendarCache` """ const CACHE_DICT = Dict{HolidayCalendar, HolidayCalendarCache}() const DEFAULT_CACHE_D0 = Dates.Date(1980, 01, 01) const DEFAULT_CACHE_D1 = Dates.Date(2150, 12, 20) @inline _getcachestate(hcc::HolidayCalendarCache) = hcc.is_initialized @inline _getcachestate(hc::HolidayCalendar) = haskey(CACHE_DICT, hc) && _getcachestate(CACHE_DICT[hc]) @inline _getholidaycalendarcache(hc::HolidayCalendar) = CACHE_DICT[hc] @inline checkbounds(hcc::HolidayCalendarCache, dt::Dates.Date) = @assert (hcc.dtmin <= dt) && (dt <= hcc.dtmax) "Date out of cache bounds. Use initcache function with a wider time spread. Provided date: $(dt)." @inline _linenumber(hcc::HolidayCalendarCache, dt::Dates.Date) = Dates.days(dt) - Dates.days(hcc.dtmin) + 1 @inline function isbday(hcc::HolidayCalendarCache, dt::Dates.Date) :: Bool checkbounds(hcc, dt) return hcc.isbday_array[ _linenumber(hcc, dt) ] end function bdayscount(hcc::HolidayCalendarCache, dt0::Dates.Date, dt1::Dates.Date) :: Int # Computation is always based on next Business Days if given dates are not Business Days, inspired by Banking Account convention. dt0_tobday = tobday(hcc.hc, dt0) # cache bounds are checked inside tobday -> isbday dt1_tobday = tobday(hcc.hc, dt1) # cache bounds are checked inside tobday -> isbday return Int(hcc.bdayscounter_array[_linenumber(hcc, dt1_tobday)]) - Int(hcc.bdayscounter_array[_linenumber(hcc, dt0_tobday)]) end @inline bdays(hcc::HolidayCalendarCache, dt0::Dates.Date, dt1::Dates.Date) :: Dates.Day = Dates.Day(bdayscount(hcc, dt0, dt1)) # Returns tuple # tuple[1] = Array of Bool (isBday) , tuple[2] = Array of UInt32 (bdaycounter) function _create_bdays_cache_arrays(hc::HolidayCalendar, d0::Dates.Date, d1::Dates.Date) d0_rata = Dates.days(d0) d1_rata = Dates.days(d1) # length of the cache arrays len::Int = d1_rata - d0_rata + 1 # This function uses UInt32 to store bdayscounter array # We need to check if we'll exceed typemax(UInt32) @assert len <= typemax(UInt32) "Maximum size allowed for bdays cache array is $(typemax(UInt32)). The required lenght was $(len)." isbday_array = Vector{Bool}(undef, len) bdayscounter_array = Vector{UInt32}(undef, len) @inbounds isbday_array[1] = isbday(hc, d0) @inbounds bdayscounter_array[1] = 0 for i in 2:len @inbounds isbday_array[i] = isbday(hc, d0 + Dates.Day(i-1)) @inbounds bdayscounter_array[i] = bdayscounter_array[i-1] + isbday_array[i] end return isbday_array, bdayscounter_array end @inline needs_cache_update(cache::HolidayCalendarCache, d0::Dates.Date, d1::Dates.Date) :: Bool = _getcachestate(cache) && cache.dtmin == d0 && cache.dtmax == d1 @inline needs_cache_update(hc::HolidayCalendar, d0::Dates.Date, d1::Dates.Date) :: Bool = _getcachestate(hc) && CACHE_DICT[hc].dtmin == d0 && CACHE_DICT[hc].dtmax == d1 # Be sure to use this function on a syncronized code (not multithreaded). """ initcache(calendar, [d0], [d1]) Creates cache for a given Holiday Calendar. After calling this function, any call to `isbday` function, or any function that uses `isbday`, will be optimized to use this cache. You can pass `calendar` as an instance of `HolidayCalendar`, `Symbol` or `AbstractString`. You can also pass `calendar` as an `AbstractArray` of those types. """ function initcache(hc::HolidayCalendar, d0::Dates.Date=DEFAULT_CACHE_D0, d1::Dates.Date=DEFAULT_CACHE_D1) @assert d0 <= d1 "d1 < d0 not allowed." if needs_cache_update(hc, d0, d1) # will not repeat initcache for this already initialized cache return else isbday_array , bdayscounter_array = _create_bdays_cache_arrays(hc, d0, d1) CACHE_DICT[hc] = HolidayCalendarCache(hc, isbday_array, bdayscounter_array, d0, d1, true) end nothing end function initcache(hc_vec::Vector{HolidayCalendar}, d0::Dates.Date=DEFAULT_CACHE_D0, d1::Dates.Date=DEFAULT_CACHE_D1) for hc in hc_vec initcache(hc, d0, d1) end end initcache(calendars::A, d0::Dates.Date=DEFAULT_CACHE_D0, d1::Dates.Date=DEFAULT_CACHE_D1) where {A<:AbstractArray} = initcache(convert(Vector{HolidayCalendar}, calendars), d0, d1) initcache(calendar, d0::Dates.Date=DEFAULT_CACHE_D0, d1::Dates.Date=DEFAULT_CACHE_D1) = initcache(convert(HolidayCalendar, calendar), d0, d1) function initcache!(cache::HolidayCalendarCache, hc::HolidayCalendar, d0::Dates.Date=DEFAULT_CACHE_D0, d1::Dates.Date=DEFAULT_CACHE_D1) if needs_cache_update(cache, d0, d1) # will not repeat initcache for this already initialized cache return else cache.dtmin = d0 cache.dtmax = d1 isbday_array , bdayscounter_array = _create_bdays_cache_arrays(hc, d0, d1) cache.isbday_array = isbday_array cache.bdayscounter_array = bdayscounter_array cache.is_initialized = true end nothing end function cleancache!(cache::HolidayCalendarCache) if cache.is_initialized empty!(cache.isbday_array) empty!(cache.bdayscounter_array) cache.is_initialized = false end nothing end # remove all elements from cache function cleancache() for k in keys(CACHE_DICT) delete!(CACHE_DICT, k) end nothing end """ cleancache([calendar]) Cleans cache for a given instance or list of `HolidayCalendar`, `Symbol` or `AbstractString`. """ function cleancache(hc::HolidayCalendar) if haskey(CACHE_DICT, hc) delete!(CACHE_DICT, hc) end nothing end function cleancache(hc_vec::Vector{HolidayCalendar}) for k in hc_vec if k in keys(CACHE_DICT) delete!(CACHE_DICT, k) end end nothing end cleancache(calendar) = cleancache(convert(HolidayCalendar, calendar)) cleancache(calendars::A) where {A<:AbstractArray} = cleancache(convert(Vector{HolidayCalendar}, calendars))
BusinessDays
https://github.com/JuliaFinance/BusinessDays.jl.git
[ "MIT" ]
0.9.24
00e206895eb8d0350149dff15ba1ea9a8f306c63
code
4841
# helper functions for vector inputs @inline isweekend(dt::Vector{Dates.Date}) = isweekend.(dt) function isbday(hc::HolidayCalendar, dt::Vector{Dates.Date}) result = Vector{Bool}(undef, length(dt)) for i in eachindex(dt) @inbounds result[i] = isbday(hc, dt[i]) end return result end function isbday(hc::Vector{HolidayCalendar}, dt::Vector{Dates.Date}) l_hc = length(hc) l_dt = length(dt) @assert l_hc == l_dt "Input vectors must have the same size. $(l_hc) != $(l_dt)" result = Vector{Bool}(undef, length(dt)) for i in 1:l_hc @inbounds result[i] = isbday(hc[i], dt[i]) end return result end isbday(calendar, dt::Vector{Dates.Date}) = isbday(convert(HolidayCalendar, calendar), dt) isbday(calendars::A, dt::Vector{Dates.Date}) where {A<:AbstractArray} = isbday(convert(Vector{HolidayCalendar}, calendars), dt) function tobday(hc::HolidayCalendar, dt::Vector{Dates.Date}; forward::Bool = true) result = Vector{Dates.Date}(undef, length(dt)) for i in eachindex(dt) @inbounds result[i] = tobday(hc, dt[i]; forward=forward) end return result end function tobday(hc::Vector{HolidayCalendar}, dt::Vector{Dates.Date}; forward::Bool = true) l_hc = length(hc) l_dt = length(dt) @assert l_hc == l_dt "Input vectors must have the same size. $(l_hc) != $(l_dt)" result = Vector{Dates.Date}(undef, l_hc) for i in 1:l_hc @inbounds result[i] = tobday(hc[i], dt[i]; forward=forward) end return result end tobday(calendar, dt::Vector{Dates.Date}; forward::Bool = true) = tobday(convert(HolidayCalendar, calendar), dt; forward=forward) tobday(calendars::A, dt::Vector{Dates.Date}; forward::Bool = true) where {A<:AbstractArray} = tobday(convert(Vector{HolidayCalendar}, calendars), dt; forward=forward) function bdays(hc::HolidayCalendar, base_date::Dates.Date, dt_vec::Vector{Dates.Date}) len = length(dt_vec) result = Vector{Dates.Day}(undef, len) for i in 1:len @inbounds result[i] = bdays(hc, base_date, dt_vec[i]) end return result end function bdayscount(hc::HolidayCalendar, base_date::Dates.Date, dt_vec::Vector{Dates.Date}) len = length(dt_vec) result = Vector{Int}(undef, len) for i in 1:len @inbounds result[i] = bdayscount(hc, base_date, dt_vec[i]) end return result end function bdays(hc::HolidayCalendar, dt0::Vector{Dates.Date}, dt1::Vector{Dates.Date}) l0 = length(dt0) l1 = length(dt1) @assert l0 == l1 "Input vectors must have the same size. $(l0) != $(l1)" result = Vector{Dates.Day}(undef, l0) for i in 1:l0 @inbounds result[i] = bdays(hc, dt0[i], dt1[i]) end return result end function bdayscount(hc::HolidayCalendar, dt0::Vector{Dates.Date}, dt1::Vector{Dates.Date}) l0 = length(dt0) l1 = length(dt1) @assert l0 == l1 "Input vectors must have the same size. $(l0) != $(l1)" result = Vector{Int}(undef, l0) for i in 1:l0 @inbounds result[i] = bdayscount(hc, dt0[i], dt1[i]) end return result end function bdays(hc::Vector{HolidayCalendar}, dt0::Vector{Dates.Date}, dt1::Vector{Dates.Date}) l_hc = length(hc) l0 = length(dt0) l1 = length(dt1) @assert l_hc == l0 && l0 == l1 "Input vectors must have the same size. $(l_hc), $(l0), $(l1)" result = Vector{Dates.Day}(undef, l0) for i in 1:l0 @inbounds result[i] = bdays(hc[i], dt0[i], dt1[i]) end return result end function bdayscount(hc::Vector{HolidayCalendar}, dt0::Vector{Dates.Date}, dt1::Vector{Dates.Date}) l_hc = length(hc) l0 = length(dt0) l1 = length(dt1) @assert l_hc == l0 && l0 == l1 "Input vectors must have the same size. $(l_hc), $(l0), $(l1)" result = Vector{Int}(undef, l0) for i in 1:l0 @inbounds result[i] = bdayscount(hc[i], dt0[i], dt1[i]) end return result end bdays(calendars::A, dt0::Vector{Dates.Date}, dt1::Vector{Dates.Date}) where {A<:AbstractArray} = bdays(convert(Vector{HolidayCalendar}, calendars), dt0, dt1) bdayscount(calendars::A, dt0::Vector{Dates.Date}, dt1::Vector{Dates.Date}) where {A<:AbstractArray} = bdayscount(convert(Vector{HolidayCalendar}, calendars), dt0, dt1) function advancebdays(hc::HolidayCalendar, dt::Dates.Date, bdays_count_vec::Vector{T}) where {T<:Union{Int, Dates.Day}} l = length(bdays_count_vec) result = Vector{Dates.Date}(undef, l) for i in 1:l @inbounds result[i] = advancebdays(hc, dt, bdays_count_vec[i]) end return result end function advancebdays(calendar, dt::Dates.Date, bdays_count_vec::Vector{T}) where {T<:Union{Int, Dates.Day}} advancebdays(convert(HolidayCalendar, calendar), dt, bdays_count_vec) end advancebdays(hc, dt::Dates.Date, bdays_range::AbstractRange) = advancebdays(hc, dt, collect(bdays_range))
BusinessDays
https://github.com/JuliaFinance/BusinessDays.jl.git
[ "MIT" ]
0.9.24
00e206895eb8d0350149dff15ba1ea9a8f306c63
code
342
""" Allows for combination of several Holiday Calendars. """ struct CompositeHolidayCalendar <: HolidayCalendar calendars::Vector{HolidayCalendar} end function isholiday(hc::CompositeHolidayCalendar, dt::Dates.Date) for c in hc.calendars if isholiday(c, dt) return true end end return false end
BusinessDays
https://github.com/JuliaFinance/BusinessDays.jl.git
[ "MIT" ]
0.9.24
00e206895eb8d0350149dff15ba1ea9a8f306c63
code
3443
""" easter_rata(y::Dates.Year) → Int Returns Easter date as a *[Rata Die](https://en.wikipedia.org/wiki/Rata_Die)* number. Based on *[Algo R](http://www.linuxtopia.org/online_books/programming_books/python_programming/python_ch38.html)*. """ function easter_rata(y::Dates.Year) # Algo R only works after 1582 if y.value < 1582 # Are you using this? Send me a postcard! error("Year cannot be less than 1582. Provided: $(y.value).") end # Century c = div(y.value , 100) + 1 # Shifted Epact local se::Int = mod(14 + 11*(mod(y.value, 19)) - div(3*c, 4) + div(5+8*c, 25), 30) # Adjust Epact if (se == 0) || ((se == 1) && ( 10 < mod(y.value, 19) )) se += 1 end # Paschal Moon p = Dates.Date(y.value, 4, 19).instant.periods.value - se # Easter: locate the Sunday after the Paschal Moon return p + 7 - mod(p, 7) end """ easter_date(y::Dates.Year) → Dates.Date Returns result of `easter_rata` as a `Dates.Date` instance. """ @inline easter_date(y::Dates.Year) = Dates.Date(Dates.rata2datetime(easter_rata(y))) """ findweekday(weekday_target::Integer, yy::Integer, mm::Integer, occurrence::Integer, ascending::Bool) → Date Given a year `yy` and month `mm`, finds a date where a choosen weekday occurs. `weekday_target` values are declared in module `Base.Dates`: `Monday,Tuesday,Wednesday,Thursday,Friday,Saturday,Sunday = 1,2,3,4,5,6,7`. If `ascending` is true, searches from the beginning of the month. If false, searches from the end of the month. If `occurrence` is `2` and `weekday_target` is `Monday`, searches the 2nd Monday of the given month, and so on. """ function findweekday(weekday_target::Integer, yy::Integer, mm::Integer, occurrence::Integer, ascending::Bool) :: Dates.Date dt = Dates.Date(yy, mm, 1) local dt_dayofweek::Integer local offset::Integer @assert occurrence > 0 "occurrence must be > 0. Provided $(occurrence)." if ascending dt_dayofweek = Dates.dayofweek(dt) offset = mod(weekday_target + 7 - dt_dayofweek, 7) else dt = Dates.lastdayofmonth(dt) dt_dayofweek = Dates.dayofweek(dt) offset = mod(dt_dayofweek + 7 - weekday_target, 7) end if occurrence > 1 offset += 7 * (occurrence - 1) end if ascending return dt + Dates.Day(offset) else return dt - Dates.Day(offset) end end """ adjustweekendholidayPost(dt, [adjust_saturdays]) → Date In the UK and Canada, if a holiday falls on Saturday or Sunday, it's observed on the next business day. This function will adjust to the next Monday. `adjust_saturdays` kwarg defaults to `true`. """ function adjustweekendholidayPost(dt::Dates.Date; adjust_saturdays::Bool = true) :: Dates.Date if adjust_saturdays && (Dates.dayofweek(dt) == Dates.Saturday) return dt + Dates.Day(2) end if Dates.dayofweek(dt) == Dates.Sunday return dt + Dates.Day(1) end return dt end """ adjustweekendholidayUS(dt) → Date In the United States, if a holiday falls on Saturday, it's observed on the preceding Friday. If it falls on Sunday, it's observed on the next Monday. """ function adjustweekendholidayUS(dt::Dates.Date) :: Dates.Date if Dates.dayofweek(dt) == Dates.Saturday return dt - Dates.Day(1) end if Dates.dayofweek(dt) == Dates.Sunday return dt + Dates.Day(1) end return dt end
BusinessDays
https://github.com/JuliaFinance/BusinessDays.jl.git
[ "MIT" ]
0.9.24
00e206895eb8d0350149dff15ba1ea9a8f306c63
code
2665
""" GenericHolidayCalendar * `holidays`: a set of holiday dates * `dtmin`: minimum date allowed to check for holidays in holidays set. Defaults to `min(holidays...)`. * `dtmax`: maximum date allowed to check for holidays in holidays set. Defaults to `max(holidays...)`. * `cache`: instance of HolidayCalendarCache. """ mutable struct GenericHolidayCalendar <: HolidayCalendar holidays::Set{Dates.Date} dtmin::Dates.Date dtmax::Dates.Date cache::HolidayCalendarCache end Base.:(==)(g1::GenericHolidayCalendar, g2::GenericHolidayCalendar) = g1.holidays == g2.holidays && g1.dtmin == g2.dtmin && g1.dtmax == g2.dtmax Base.hash(g::GenericHolidayCalendar) = hash(g.holidays) + hash(g.dtmin) + hash(g.dtmax) """ GenericHolidayCalendar(holidays, [dtmin], [dtmax], [_initcache_]) * `holidays`: a set of holiday dates * `dtmin`: minimum date allowed to check for holidays in holidays set. Defaults to `min(holidays...)`. * `dtmax`: maximum date allowed to check for holidays in holidays set. Defaults to `max(holidays...)`. * `_initcache_`: initializes the cache for this calendar. Defaults to `true`. """ function GenericHolidayCalendar(holidays::Set{Dates.Date}, dtmin::Dates.Date=min(holidays...), dtmax::Dates.Date=max(holidays...), _initcache_::Bool=true) generic_calendar = GenericHolidayCalendar(holidays, dtmin, dtmax, HolidayCalendarCache()) generic_calendar.cache.hc = generic_calendar if _initcache_ initcache!(generic_calendar.cache, generic_calendar, dtmin, dtmax) end return generic_calendar end GenericHolidayCalendar(holidays::Vector{Dates.Date}, d0::Dates.Date=min(holidays...), d1::Dates.Date=max(holidays...), _initcache_::Bool=true) = GenericHolidayCalendar(Set(holidays), d0, d1, _initcache_) @inline checkbounds(cal::GenericHolidayCalendar, dt::Dates.Date) = @assert cal.dtmin <= dt && dt <= cal.dtmax "Date out of calendar bounds: $dt. Allowed dates interval is from $(cal.dtmin) to $(cal.dtmax)." function isholiday(cal::GenericHolidayCalendar, dt::Dates.Date) checkbounds(cal, dt) return in(dt, cal.holidays) end @inline _getcachestate(hc::GenericHolidayCalendar) = _getcachestate(hc.cache) @inline _getholidaycalendarcache(hc::GenericHolidayCalendar) = hc.cache @inline cleancache(cal::GenericHolidayCalendar) = cleancache!(cal.cache) @inline needs_cache_update(hc::GenericHolidayCalendar, d0::Dates.Date, d1::Dates.Date) = _getcachestate(hc) && hc.cache.dtmin == d0 && hc.cache.dtmax == d1 function initcache(hc::GenericHolidayCalendar, d0::Dates.Date=hc.dtmin, d1::Dates.Date=hc.dtmax) checkbounds(hc, d0) checkbounds(hc, d1) initcache!(hc.cache, hc, d0, d1) end
BusinessDays
https://github.com/JuliaFinance/BusinessDays.jl.git
[ "MIT" ]
0.9.24
00e206895eb8d0350149dff15ba1ea9a8f306c63
code
1313
""" *Abstract* type for Holiday Calendars. """ abstract type HolidayCalendar end Base.string(hc::HolidayCalendar) = string(typeof(hc)) Base.broadcastable(hc::HolidayCalendar) = Ref(hc) function symtocalendar(sym::Symbol) :: HolidayCalendar if isdefined(BusinessDays, sym) && Core.eval(BusinessDays, sym) <: HolidayCalendar return Core.eval(BusinessDays, sym)() elseif isdefined(@__MODULE__, sym) && Core.eval(@__MODULE__, sym) <: HolidayCalendar return Core.eval(@__MODULE__, sym)() elseif isdefined(Main, sym) && Core.eval(Main, sym) <: HolidayCalendar return Core.eval(Main, sym)() else error("$sym is not a valid HolidayCalendar.") end end @inline strtocalendar(str::AbstractString) = symtocalendar(Symbol(str)) Base.convert(::Type{HolidayCalendar}, sym::Symbol) = symtocalendar(sym) Base.convert(::Type{HolidayCalendar}, str::AbstractString) = strtocalendar(str) """ isholiday(calendar, dt) :: Bool Checks if `dt` is a holiday based on a given `calendar` of holidays. `calendar` can be an instance of `HolidayCalendar`, a `Symbol` or an `AbstractString`. """ isholiday(hc::HolidayCalendar, dt::Dates.Date) = error("isholiday for $(hc) not implemented.") isholiday(calendar, dt::Dates.Date) = isholiday(convert(HolidayCalendar, calendar), dt)
BusinessDays
https://github.com/JuliaFinance/BusinessDays.jl.git
[ "MIT" ]
0.9.24
00e206895eb8d0350149dff15ba1ea9a8f306c63
code
1257
""" listholidays(calendar, dt0::Dates.Date, dt1::Dates.Date) → Vector{Dates.Date} Returns the list of holidays between `dt0` and `dt1`. """ function listholidays(hc::HolidayCalendar, dt0::Dates.Date, dt1::Dates.Date) d0 = min(dt0, dt1) d1 = max(dt0, dt1) dt_range = d0:Dates.Day(1):d1 isholiday_vec = [ isholiday(hc, i) for i in dt_range ] return dt_range[isholiday_vec] end listholidays(calendar, dt0::Dates.Date, dt1::Dates.Date) = listholidays(convert(HolidayCalendar, calendar), dt0, dt1) """ listbdays(calendar, dt0::Dates.Date, dt1::Dates.Date) → Vector{Dates.Date} Returns the list of business days between `dt0` and `dt1`. """ function listbdays(hc::HolidayCalendar, dt0::Dates.Date, dt1::Dates.Date) d = tobday(hc, min(dt0, dt1)) d1 = max(dt0, dt1) # empty result if d > d1 return Vector{Dates.Date}() end n = Dates.value(d1 - d) + 1 raw_vec = Vector{Dates.Date}(undef, n) raw_vec[1] = d d = advancebdays(hc, d, 1) i = 2 while d <= d1 raw_vec[i] = d i += 1 d = advancebdays(hc, d, 1) end return raw_vec[1:(i-1)] end listbdays(calendar, dt0::Dates.Date, dt1::Dates.Date) = listbdays(convert(HolidayCalendar,calendar), dt0, dt1)
BusinessDays
https://github.com/JuliaFinance/BusinessDays.jl.git
[ "MIT" ]
0.9.24
00e206895eb8d0350149dff15ba1ea9a8f306c63
code
9876
""" Public holidays for the Australian Stock Exchange (ASX). """ struct AustraliaASX <: HolidayCalendar end """ Public holidays for the Australian states and territories. Although some holidays are common to all states and territories, such as Christmas Day, each state and territory also has its own additional holidays. Therefore the set of relevant holidays depends on which state/territory you are concerned with. The Australian states and territories are: - Australian Capital Territory (ACT) - New South Wales (NSW) - Northern Territory (NT) - Queensland (QLD) - South Australia (SA) - Tasmania (TAS) - Western Australia (WA) - Victoria (VIC) For example: cal = Australia(:QLD) """ struct Australia <: HolidayCalendar state::Symbol function Australia(state::Symbol) states = Set([:ACT, :NSW, :NT, :QLD, :SA, :TAS, :WA, :VIC]) @assert state ∈ states "$(state) is not a valid Australian state or territory. Must be one of :ACT, :NSW, :NT, :QLD, :SA, :TAS, :WA, :VIC." new(state) end end Base.:(==)(a1::Australia, a2::Australia) = a1.state == a2.state Base.hash(a::Australia) = hash(a.state) function isholiday(::AustraliaASX, dt::Dates.Date) yy = Dates.year(dt) mm = Dates.month(dt) dd = Dates.day(dt) easter_sunday = BusinessDays.easter_date(Dates.Year(yy)) is_australian_national_holiday(dt, yy, mm, dd, easter_sunday) && return true mm == 6 && Dates.dayofweek(dt) == Dates.Mon && Dates.dayofweekofmonth(dt) == 2 && return true # Queen's Birthday holiday (2nd Monday of June) false end function isholiday(cal::Australia, dt::Dates.Date) yy = Dates.year(dt) mm = Dates.month(dt) dd = Dates.day(dt) easter_sunday = BusinessDays.easter_date(Dates.Year(yy)) is_australian_national_holiday(dt, yy, mm, dd, easter_sunday) && return true is_australian_state_holiday(Val{cal.state}, dt, yy, mm, dd, easter_sunday) end ################################################################################ function is_australian_national_holiday(dt::Dates.Date, yy::Int, mm::Int, dd::Int, easter_sunday::Dates.Date) mm == 1 && dd == 1 && return true # New year's day mm == 1 && dd == 26 && return true # Australia Day mm == 4 && dd == 25 && return true # ANZAC Day mm == 12 && dd == 25 && return true # Christmas Day mm == 12 && dd == 26 && return true # Boxing Day dt == easter_sunday - Dates.Day(2) && return true # Good Friday dt == easter_sunday + Dates.Day(1) && return true # Easter Monday false end function is_australian_state_holiday(::Type{Val{:ACT}}, dt::Dates.Date, yy::Int, mm::Int, dd::Int, easter_sunday::Dates.Date) dt == easter_sunday - Dates.Day(1) && return true # Easter Saturday dt == easter_sunday && return true # Easter Sunday mm == 3 && Dates.dayofweek(dt) == Dates.Mon && Dates.dayofweekofmonth(dt) == 2 && return true # Canberra Day (2nd Monday of March) mm == 6 && Dates.dayofweek(dt) == Dates.Mon && Dates.dayofweekofmonth(dt) == 2 && return true # Queen's Birthday holiday (2nd Monday of June) mm == 10 && Dates.dayofweek(dt) == Dates.Mon && Dates.dayofweekofmonth(dt) == 1 && return true # Labour Day (1st Monday of October) if yy >= 2018 && mm == 5 # Reconciliation Day (Last Monday of May) first_june = Dates.Date(yy, 6, 1) last_mon_may = Dates.toprev(first_june, Dates.Mon) dt == last_mon_may && return true end false end function is_australian_state_holiday(::Type{Val{:NSW}}, dt::Dates.Date, yy::Int, mm::Int, dd::Int, easter_sunday::Dates.Date) dt == easter_sunday - Dates.Day(1) && return true # Easter Saturday dt == easter_sunday && return true # Easter Sunday mm == 6 && Dates.dayofweek(dt) == Dates.Mon && Dates.dayofweekofmonth(dt) == 2 && return true # Queen's Birthday holiday (2nd Monday of June) mm == 8 && Dates.dayofweek(dt) == Dates.Mon && Dates.dayofweekofmonth(dt) == 1 && return true # Bank holiday (1st Monday of August) mm == 10 && Dates.dayofweek(dt) == Dates.Mon && Dates.dayofweekofmonth(dt) == 1 && return true # Labour Day (1st Monday of October) false end function is_australian_state_holiday(::Type{Val{:NT}}, dt::Dates.Date, yy::Int, mm::Int, dd::Int, easter_sunday::Dates.Date) dt == easter_sunday - Dates.Day(1) && return true # Easter Saturday (Easter Sunday is not a public holiday in the Northern Territory) mm == 5 && Dates.dayofweek(dt) == Dates.Mon && Dates.dayofweekofmonth(dt) == 1 && return true # May Day (1st Monday of May) mm == 6 && Dates.dayofweek(dt) == Dates.Mon && Dates.dayofweekofmonth(dt) == 2 && return true # Queen's Birthday holiday (2nd Monday of June) mm == 8 && Dates.dayofweek(dt) == Dates.Mon && Dates.dayofweekofmonth(dt) == 1 && return true # Picnic Day (1st Monday of August) false end function is_australian_state_holiday(::Type{Val{:QLD}}, dt::Dates.Date, yy::Int, mm::Int, dd::Int, easter_sunday::Dates.Date) dt == easter_sunday - Dates.Day(1) && return true # Easter Saturday dt == easter_sunday && return true # Easter Sunday mm == 5 && Dates.dayofweek(dt) == Dates.Mon && Dates.dayofweekofmonth(dt) == 1 && return true # Labour Day (1st Monday of May) # Royal Brisbane Show (Brisbane area only). if mm == 8 && Dates.dayofweek(dt) == Dates.Wed end_aug = Dates.Date(yy, 8, 31) last_wed_aug = Dates.dayofweek(end_aug) == Dates.Wed ? end_aug : Dates.toprev(end_aug, Dates.Wed) n_wed_in_aug = Dates.day(last_wed_aug) >= 29 ? 5 : 4 n_wed_in_aug == 4 && Dates.dayofweekofmonth(dt) == 2 && return true # 2nd Wednesday of August if there are 4 Wednesdays in August n_wed_in_aug == 5 && Dates.dayofweekofmonth(dt) == 3 && return true # 3rd Wednesday of August if there are 5 Wednesdays in August end # Queen's Birthday holiday if yy == 2012 || yy >= 2016 mm == 10 && Dates.dayofweek(dt) == Dates.Mon && Dates.dayofweekofmonth(dt) == 1 && return true # 1st Monday of October else mm == 6 && Dates.dayofweek(dt) == Dates.Mon && Dates.dayofweekofmonth(dt) == 2 && return true # 2nd Monday of June end false end function is_australian_state_holiday(::Type{Val{:SA}}, dt::Dates.Date, yy::Int, mm::Int, dd::Int, easter_sunday::Dates.Date) dt == easter_sunday - Dates.Day(1) && return true # Easter Saturday (Easter Sunday is not a public holiday in South Australia) mm == 3 && Dates.dayofweek(dt) == Dates.Mon && Dates.dayofweekofmonth(dt) == 2 && return true # March public holiday (2nd Monday of March) mm == 6 && Dates.dayofweek(dt) == Dates.Mon && Dates.dayofweekofmonth(dt) == 2 && return true # Queen's Birthday holiday (2nd Monday of June) mm == 10 && Dates.dayofweek(dt) == Dates.Mon && Dates.dayofweekofmonth(dt) == 1 && return true # Labour Day (1st Monday of October) false end function is_australian_state_holiday(::Type{Val{:TAS}}, dt::Dates.Date, yy::Int, mm::Int, dd::Int, easter_sunday::Dates.Date) # Neither Easter Saturday nor Easter Sunday are public holidays in Tasmania mm == 2 && Dates.dayofweek(dt) == Dates.Mon && Dates.dayofweekofmonth(dt) == 2 && return true # Royal Hobart Regatta (2nd Monday of February) mm == 3 && Dates.dayofweek(dt) == Dates.Mon && Dates.dayofweekofmonth(dt) == 2 && return true # Labour Day (2nd Monday of March) mm == 6 && Dates.dayofweek(dt) == Dates.Mon && Dates.dayofweekofmonth(dt) == 2 && return true # Queen's Birthday holiday (2nd Monday of June) mm == 11 && Dates.dayofweek(dt) == Dates.Mon && Dates.dayofweekofmonth(dt) == 1 && return true # Recreation Day (1st Monday of November) false end function is_australian_state_holiday(::Type{Val{:WA}}, dt::Dates.Date, yy::Int, mm::Int, dd::Int, easter_sunday::Dates.Date) # Neither Easter Saturday nor Easter Sunday are public holidays in Western Australia mm == 3 && Dates.dayofweek(dt) == Dates.Mon && Dates.dayofweekofmonth(dt) == 1 && return true # Labour Day (1st Monday of March) mm == 6 && Dates.dayofweek(dt) == Dates.Mon && Dates.dayofweekofmonth(dt) == 1 && return true # Western Australia Day (1st Monday of June) # Queen's Birthday holiday (proclaimed each year by the Governor, usually the last Monday of September) if yy == 2011 mm == 10 && dd == 28 && return true elseif yy == 2012 mm == 10 && dd == 1 && return true else end_sep = Dates.Date(yy, 9, 30) last_mon_sep = Dates.dayofweek(end_sep) == Dates.Mon ? end_sep : Dates.toprev(end_sep, Dates.Mon) dt == last_mon_sep && return true end false end function is_australian_state_holiday(::Type{Val{:VIC}}, dt::Dates.Date, yy::Int, mm::Int, dd::Int, easter_sunday::Dates.Date) dt == easter_sunday - Dates.Day(1) && return true # Easter Saturday dt == easter_sunday && return true # Easter Sunday mm == 3 && Dates.dayofweek(dt) == Dates.Mon && Dates.dayofweekofmonth(dt) == 2 && return true # Labour Day (2nd Monday of March) mm == 6 && Dates.dayofweek(dt) == Dates.Mon && Dates.dayofweekofmonth(dt) == 2 && return true # Queen's Birthday holiday (2nd Monday of June) mm == 11 && Dates.dayofweek(dt) == Dates.Tue && Dates.dayofweekofmonth(dt) == 1 && return true # Melbourne Cup (1st Tuesday of November) if yy >= 2015 && Dates.dayofweek(dt) == Dates.Fri && (mm == 9 || mm == 10) # Friday before AFL Grand Final (Friday closest to 30th September) first_oct = Dates.Date(yy, 10, 1) # 1st October first_sat_oct = Dates.dayofweek(first_oct) == Dates.Sat ? first_oct : Dates.tonext(first_oct, Dates.Sat) # 1st Saturday in October afl_final = Dates.day(first_sat_oct) <= 4 ? first_sat_oct : first_sat_oct - Dates.Day(7) dt == afl_final - Dates.Day(1) && return true end false end
BusinessDays
https://github.com/JuliaFinance/BusinessDays.jl.git
[ "MIT" ]
0.9.24
00e206895eb8d0350149dff15ba1ea9a8f306c63
code
3071
"Banking holidays for Brazil (federal holidays plus Carnival)." struct BRSettlement <: HolidayCalendar end const Brazil = BRSettlement "B3 Exchange holidays (https://www.b3.com.br)." struct BrazilExchange <: HolidayCalendar end const BrazilBMF = BrazilExchange const BrazilB3 = BrazilExchange # Brazilian Banking Holidays function isholiday(::BRSettlement, dt::Dates.Date) yy = Dates.year(dt) mm = Dates.month(dt) dd = Dates.day(dt) # Bisection if mm >= 8 # Fixed holidays if ( # Independencia do Brasil ((mm == 9) && (dd == 7)) || # Nossa Senhora Aparecida ((mm == 10) && (dd == 12)) || # Finados ((mm == 11) && (dd == 2)) || # Proclamacao da Republica ((mm == 11) && (dd == 15)) || # Dia Nacional de Zumbi e da Consciência Negra ((mm == 11) && (dd == 20) && (yy >= 2024)) || # Natal ((mm == 12) && (dd == 25)) ) return true end else # mm < 8 # Fixed holidays if ( # Confraternizacao Universal ((mm == 1) && (dd == 1)) || # Tiradentes ((mm == 4) && (dd == 21)) || # Dia do Trabalho ((mm == 5) && (dd == 1)) ) return true end # Easter occurs up to April, so Corpus Christi will be up to July in the worst case, which is before August (mm < 8). See `test/easter-min-max.jl`. # Holidays based on easter date. dt_rata::Int = Dates.days(dt) e_rata::Int = easter_rata(Dates.Year(yy)) if ( # Segunda de Carnaval ( dt_rata == ( e_rata - 48 ) ) || # Terca de Carnaval ( dt_rata == ( e_rata - 47 ) ) || # Sexta-feira Santa ( dt_rata == ( e_rata - 2 ) ) || # Corpus Christi ( dt_rata == ( e_rata + 60 ) ) ) return true end end return false end function isholiday(::BrazilExchange, dt::Dates.Date) yy = Dates.year(dt) mm = Dates.month(dt) dd = Dates.day(dt) if ( # Aniversário de São Paulo ( mm == 1 && dd == 25 && yy < 2022 ) || # Revolucão ( mm == 7 && dd == 9 && yy != 2020 && yy < 2022 ) || # Consciência Negra (since 2007) ( yy >= 2007 && mm == 11 && dd == 20 && yy != 2020 && yy < 2022 ) # Christmas Eve || ( mm == 12 && dd == 24) || # Último dia útil do ano ( mm == 12 && (dd == 31 || (dd>=29 && Dates.dayofweek(dt) == Dates.Friday) )) || # National holidays isholiday(Brazil(), dt) ) return true end return false end
BusinessDays
https://github.com/JuliaFinance/BusinessDays.jl.git
[ "MIT" ]
0.9.24
00e206895eb8d0350149dff15ba1ea9a8f306c63
code
1658
""" A calendar with no holidays. But account for weekends. `isholiday` always returns false for this calendar. Remember that `isbday` considers that Saturdays and Sundars are not business days. """ struct WeekendsOnly <: HolidayCalendar end isholiday(::WeekendsOnly, dt::Dates.Date) = false function bdayscount(::WeekendsOnly, dt0::Dates.Date, dt1::Dates.Date) swapped = false if dt0 == dt1 return 0 elseif dt0 > dt1 dt1, dt0 = dt0, dt1 swapped = true end result = 0 days = Dates.value(dt1 - dt0) whole_weeks = div(days, 7) result += whole_weeks * 5 dt0 += Dates.Day(whole_weeks * 7) if dt0 < dt1 day_of_week = Dates.dayofweek(dt0) while dt0 < dt1 if day_of_week < 6 result += 1 end dt0 += Dates.Day(1) day_of_week += 1 if day_of_week == 8 day_of_week = 1 end end end if swapped result = -result end return result end """ A calendar with no holidays and no weekends. `bdays` returns the actual days between dates (`dt1 - d10`). """ struct NullHolidayCalendar <: HolidayCalendar end isholiday(::NullHolidayCalendar, dt::Dates.Date) = false isbday(::NullHolidayCalendar, dt::Dates.Date) = true bdayscount(::NullHolidayCalendar, dt0::Dates.Date, dt1::Dates.Date) = Dates.value(dt1 - dt0) listbdays(::NullHolidayCalendar, dt0::Dates.Date, dt1::Dates.Date) = collect(dt0:Dates.Day(1):dt1) include("australia.jl") include("brazil.jl") include("canada.jl") include("target.jl") include("uk.jl") include("us.jl") include("germany.jl")
BusinessDays
https://github.com/JuliaFinance/BusinessDays.jl.git
[ "MIT" ]
0.9.24
00e206895eb8d0350149dff15ba1ea9a8f306c63
code
4290
""" Holidays for Canada """ struct CanadaSettlement <: HolidayCalendar end const Canada = CanadaSettlement """ Holidays for Toronto Stock Exchange """ struct CanadaTSX <: HolidayCalendar end function isholiday(::CanadaSettlement, dt::Dates.Date) yy = Dates.year(dt) mm = Dates.month(dt) dd = Dates.day(dt) ww = Dates.dayofweek(dt) # Bisection if mm >= 8 # Fixed holidays if ( # first Monday of August (Provincial Holiday) findweekday(Dates.Monday, yy, 8, 1, true) == dt # first Monday of September (Labor Day) || findweekday(Dates.Monday, yy, 9, 1, true) == dt # second Monday of October (Thanksgiving Day) || findweekday(Dates.Monday, yy, 10, 2, true) == dt # November 11th (possibly moved to Monday) || adjustweekendholidayPost(Dates.Date(yy, 11, 11)) == dt # Christmas || adjustweekendholidayPost( Dates.Date(yy, 12, 25) ) == dt # Boxing || adjustweekendholidayPost(adjustweekendholidayPost( Dates.Date(yy, 12, 25) ) + Dates.Day(1)) == dt ) return true end else # mm < 8 # Fixed holidays if ( # New Year's Day adjustweekendholidayPost(Dates.Date(yy, 01, 01); adjust_saturdays=false) == dt # Family Day (third Monday in February, since 2008) || ( yy >= 2008 && findweekday(Dates.Monday, yy, 2, 3, true) == dt ) # The Monday on or preceding 24 May (Victoria Day) || (dd > 17 && dd <= 24 && ww == Dates.Monday && mm == Dates.May) # July 1st, possibly moved to Monday (Canada Day) || adjustweekendholidayPost(Dates.Date(yy, 07, 01)) == dt ) return true end # Easter occurs up to April, which is before August (mm < 8). See test/easter-min-max.jl . # Holidays based on easter date dt_rata::Int = Dates.days(dt) e_rata::Int = easter_rata(Dates.Year(yy)) if ( # Good Friday ( dt_rata == ( e_rata - 2 ) ) ) return true end end return false end function isholiday(::CanadaTSX, dt::Dates.Date) yy = Dates.year(dt) mm = Dates.month(dt) dd = Dates.day(dt) ww = Dates.dayofweek(dt) # Bisection if mm >= 8 # Fixed holidays if ( # first Monday of August (Provincial Holiday) findweekday(Dates.Monday, yy, 8, 1, true) == dt # first Monday of September (Labor Day) || findweekday(Dates.Monday, yy, 9, 1, true) == dt # second Monday of October (Thanksgiving Day) || findweekday(Dates.Monday, yy, 10, 2, true) == dt # Christmas || adjustweekendholidayPost( Dates.Date(yy, 12, 25) ) == dt # Boxing || adjustweekendholidayPost(adjustweekendholidayPost( Dates.Date(yy, 12, 25) ) + Dates.Day(1)) == dt ) return true end else # mm < 8 # Fixed holidays if ( # New Year's Day adjustweekendholidayPost(Dates.Date(yy, 01, 01); adjust_saturdays=false) == dt # Family Day (third Monday in February, since 2008) || ( yy >= 2008 && findweekday(Dates.Monday, yy, 2, 3, true) == dt ) # The Monday on or preceding 24 May (Victoria Day) || (dd > 17 && dd <= 24 && ww == Dates.Monday && mm == Dates.May) # July 1st, possibly moved to Monday (Canada Day) || adjustweekendholidayPost(Dates.Date(yy, 07, 01)) == dt ) return true end # Easter occurs up to April, which is before August (mm < 8). See test/easter-min-max.jl . # Holidays based on easter date dt_rata::Int = Dates.days(dt) e_rata::Int = easter_rata(Dates.Year(yy)) if ( # Good Friday ( dt_rata == ( e_rata - 2 ) ) ) return true end end return false end
BusinessDays
https://github.com/JuliaFinance/BusinessDays.jl.git
[ "MIT" ]
0.9.24
00e206895eb8d0350149dff15ba1ea9a8f306c63
code
10523
""" Public holidays for the German states. Although some holidays are common to all states, such as Easter and Christmas, each state also has its own additional holidays. The only national holiday is the Day of German Unity. The set of relevant holidays depends about which state you are concerned. The German states are: - Baden-Württemberg (BW) - Bavaria (BY) - Berlin (BE) - Brandenburg (BB) - Bremen (HB) - Hamburg (HH) - Hessen (HE) - Mecklenburg-Vorpommern (MV) - Lowwer Saxony (NI) - North Rhine-Westphalia (NW) - Rhineland-Palatinate (RP) - Saarland (SL) - Saxony (SN) - Saxony-Anhalt (ST) - Schleswig-Holstein (SH) - Thuringia (TH) For example: `cal = Germany(:BW)` or `cal = DE(:BW)` """ struct Germany <: HolidayCalendar state::Symbol function Germany(state::Symbol) states = Set([:BW, :BY, :BYP, :BE, :BB, :HB, :HH, :HE, :MV, :NI, :NW, :RP, :SL, :SN, :ST, :SH, :TH]) @assert state ∈ states "$(state) is not a valid German state. Choose from: :BW, :BY, :BYP, :BE, :BB, :HB, :HH, :HE, :MV, :NI, :NW, :RP, :SL, :SN, :ST, :SH, :TH." new(state) end end const DE = Germany """ isholiday(cal::DE, dt::Dates.Date)::Bool Return true if `dt` is a is a holiday in the German calender (`cal`), otherwise false. """ function isholiday(cal::DE, dt::Dates.Date)::Bool dt ≤ Dates.Date(1990, 10, 3) && return false # Only count holidays after German Reunification yy = Dates.year(dt) mm = Dates.month(dt) dd = Dates.day(dt) easter_sunday = BusinessDays.easter_date(Dates.Year(yy)) is_common_german_holiday(dt, yy, mm, dd, easter_sunday) && return true is_german_state_holiday(Val{cal.state}, dt, yy, mm, dd, easter_sunday) end ################################################################################ """ is_common_german_holiday(dt::Dates.Date, yy::Int, mm::Int, dd::Int, easter_sunday::Dates.Date)::Bool Return true if `dt` (with its parts `yy` - year, `mm` - month, `dd` -day) is a is a holiday in the calender of all of Germany (`cal`), otherwise false. """ function is_common_german_holiday(dt::Dates.Date, yy::Int, mm::Int, dd::Int, easter_sunday::Dates.Date)::Bool mm == 1 && dd == 1 && return true # New year's day mm == 5 && dd == 1 && return true # International Workers' Day dt == easter_sunday - Dates.Day(2) && return true # Good Friday dt == easter_sunday + Dates.Day(1) && return true # Easter Monday dt == easter_sunday + Dates.Day(39) && return true # Ascension Day dt == easter_sunday + Dates.Day(50) && return true # Pentecost Monday mm == 10 && dd == 3 && return true # German Unity Day mm == 10 && dd == 31 && yy == 2017 && return true # 500 years Reformation mm == 12 && dd == 25 && return true # Christmas Day mm == 12 && dd == 26 && return true # Boxing Day mm == 11 && dd == day_of_repentance_and_prayer(yy) && yy ≤ 1994 && return true # Day of Repentance and Prayer return false end """ function is_german_state_holiday( ::Union{Type{Val{:HB}},Type{Val{:HH}},Type{Val{:NI}},Type{Val{:SH}}}, dt::Dates.Date, yy::Int, mm::Int, dd::Int, easter_sunday::Dates.Date )::Bool Return true if `dt` (with its parts `yy` - year, `mm` - month, `dd` -day) is a is a holiday in the calender (`cal`) of the German states Bremen, Hamburg, Lower Saxony or Schleswig-Holstein, otherwise false. """ function is_german_state_holiday( ::Union{Type{Val{:HB}},Type{Val{:HH}},Type{Val{:NI}},Type{Val{:SH}}}, dt::Dates.Date, yy::Int, mm::Int, dd::Int, easter_sunday::Dates.Date )::Bool mm == 10 && dd == 31 && yy ≥ 2017 && return true # Reformation Day return false end """ function is_german_state_holiday( ::Union{Type{Val{:NW}},Type{Val{:RP}},Type{Val{:SL}}}, dt::Dates.Date, yy::Int, mm::Int, dd::Int, easter_sunday::Dates.Date )::Bool Return true if `dt` (with its parts `yy` - year, `mm` - month, `dd` -day) is a is a holiday in the calender (`cal`) of the German states North Rhine-Westphalia, Rhineland-Palatinate or Saarland, otherwise false. """ function is_german_state_holiday( cal::T, dt::Dates.Date, yy::Int, mm::Int, dd::Int, easter_sunday::Dates.Date )::Bool where T <: Union{Type{Val{:NW}},Type{Val{:RP}},Type{Val{:SL}}} mm == 11 && dd == 1 && return true # All Saints' Day dt == easter_sunday + Dates.Day(60) && return true # Corpus Christi mm == 8 && dd == 15 && cal <: Val{:SL} && return true # Assumption of Mary return false end """ function is_german_state_holiday( ::Union{Type{Val{:NW}},Type{Val{:RP}},Type{Val{:SL}}}, dt::Dates.Date, yy::Int, mm::Int, dd::Int, easter_sunday::Dates.Date )::Bool Return true if `dt` (with its parts `yy` - year, `mm` - month, `dd` -day) is a is a holiday in the calender (`cal`) of the German states Baden-Württemberg or Bavaria (with or without Assumption of Mary), otherwise false. """ function is_german_state_holiday( cal::T, dt::Dates.Date, yy::Int, mm::Int, dd::Int, easter_sunday::Dates.Date )::Bool where T <: Union{Type{Val{:BW}},Type{Val{:BY}},Type{Val{:BYP}}} mm == 1 && dd == 6 && return true # Epiphany mm == 11 && dd == 1 && return true # All Saints' Day dt == easter_sunday + Dates.Day(60) && return true # Corpus Christi mm == 8 && dd == 15 && cal <: Val{:BY} && return true # Assumption of Mary # (only catholic communities in Bavaria) return false end """ function is_german_state_holiday( ::Type{Val{:BE}}, dt::Dates.Date, yy::Int, mm::Int, dd::Int, easter_sunday::Dates.Date )::Bool Return true if `dt` (with its parts `yy` - year, `mm` - month, `dd` -day) is a is a holiday in the calender (`cal`) of the German state Berlin, otherwise false. """ function is_german_state_holiday( ::Type{Val{:BE}}, dt::Dates.Date, yy::Int, mm::Int, dd::Int, easter_sunday::Dates.Date )::Bool mm == 3 && dd == 8 && yy ≥ 2019 && return true # Womens' Day # 75. Jahrestag zur Befreiung des Nationalsozialismus: mm == 5 && dd == 8 && yy == 2020 && return true return false end """ function is_german_state_holiday( ::Type{Val{:BB}}, dt::Dates.Date, yy::Int, mm::Int, dd::Int, easter_sunday::Dates.Date )::Bool Return true if `dt` (with its parts `yy` - year, `mm` - month, `dd` -day) is a is a holiday in the calender (`cal`) of the German state Brandenburg, otherwise false. """ function is_german_state_holiday( ::Type{Val{:BB}}, dt::Dates.Date, yy::Int, mm::Int, dd::Int, easter_sunday::Dates.Date )::Bool mm == 10 && dd == 31 && yy ≥ 2019 && return true # Reformation Day dt == easter_sunday && return true # Easter Sunday dt == easter_sunday + Dates.Day(49) && return true # Pentecost Sunday return false end """ function is_german_state_holiday( ::Type{Val{:HE}}, dt::Dates.Date, yy::Int, mm::Int, dd::Int, easter_sunday::Dates.Date )::Bool Return true if `dt` (with its parts `yy` - year, `mm` - month, `dd` -day) is a is a holiday in the calender (`cal`) of the German state Hessen, otherwise false. """ function is_german_state_holiday( ::Type{Val{:HE}}, dt::Dates.Date, yy::Int, mm::Int, dd::Int, easter_sunday::Dates.Date )::Bool dt == easter_sunday + Dates.Day(60) && return true # Corpus Christi return false end """ function is_german_state_holiday( ::Type{Val{:MV}}, dt::Dates.Date, yy::Int, mm::Int, dd::Int, easter_sunday::Dates.Date )::Bool Return true if `dt` (with its parts `yy` - year, `mm` - month, `dd` -day) is a is a holiday in the calender (`cal`) of the German state Mecklenburg-Vorpommern, otherwise false. """ function is_german_state_holiday( ::Type{Val{:MV}}, dt::Dates.Date, yy::Int, mm::Int, dd::Int, easter_sunday::Dates.Date )::Bool mm == 10 && dd == 31 && return true # Reformation Day mm == 3 && dd == 08 && yy ≥ 2023 && return true # Womens' Day return false end """ function is_german_state_holiday( ::Type{Val{:SN}}, dt::Dates.Date, yy::Int, mm::Int, dd::Int, easter_sunday::Dates.Date )::Bool Return true if `dt` (with its parts `yy` - year, `mm` - month, `dd` -day) is a is a holiday in the calender (`cal`) of the German state Saxony, otherwise false. """ function is_german_state_holiday( ::Type{Val{:SN}}, dt::Dates.Date, yy::Int, mm::Int, dd::Int, easter_sunday::Dates.Date )::Bool mm == 10 && dd == 31 && return true # Reformation Day # Day of Repentance and Prayer: mm == 11 && dd == day_of_repentance_and_prayer(yy) && return true return false end """ function is_german_state_holiday( ::Type{Val{:ST}}, dt::Dates.Date, yy::Int, mm::Int, dd::Int, easter_sunday::Dates.Date )::Bool Return true if `dt` (with its parts `yy` - year, `mm` - month, `dd` -day) is a is a holiday in the calender (`cal`) of the German state Saxony-Anhalt, otherwise false. """ function is_german_state_holiday( ::Type{Val{:ST}}, dt::Dates.Date, yy::Int, mm::Int, dd::Int, easter_sunday::Dates.Date )::Bool mm == 1 && dd == 6 && return true # Epiphany mm == 10 && dd == 31 && return true # Reformation Day return false end """ function is_german_state_holiday( ::Type{Val{:TH}}, dt::Dates.Date, yy::Int, mm::Int, dd::Int, easter_sunday::Dates.Date )::Bool Return true if `dt` (with its parts `yy` - year, `mm` - month, `dd` -day) is a is a holiday in the calender (`cal`) of the German state Thuringia, otherwise false. """ function is_german_state_holiday( ::Type{Val{:TH}}, dt::Dates.Date, yy::Int, mm::Int, dd::Int, easter_sunday::Dates.Date )::Bool mm == 9 && dd == 20 && yy ≥ 2019 && return true # Children's Day mm == 10 && dd == 31 && return true # Reformation Day return false end """ day_of_repentance_and_prayer(yy::Int)::Int Calculate day of rependance and prayer als last Wednesday before 23. November. """ function day_of_repentance_and_prayer(yy::Int)::Int drange = Dates.Date(yy,11,16):Dates.Day(1):Dates.Date(yy,11,22) return Dates.day(drange[findfirst(isequal.(Dates.dayofweek.(drange),3))]) end
BusinessDays
https://github.com/JuliaFinance/BusinessDays.jl.git
[ "MIT" ]
0.9.24
00e206895eb8d0350149dff15ba1ea9a8f306c63
code
1386
""" Holidays for TARGET Eurozone (Trans-European Automated Real-time Gross Settlement Express Transfer System) """ struct TARGET <: HolidayCalendar end const TARGET2 = TARGET const EuroZone = TARGET function isholiday(::TARGET, dt::Dates.Date) yy = Dates.year(dt) mm = Dates.month(dt) dd = Dates.day(dt) # Bisection if mm < 8 # Fixed Holidays if ( # New Year's Day ((mm == 1) && (dd == 1)) || # Labour Day ((mm == 5) && (dd == 1) && (yy >= 2000)) ) return true end if yy >= 2000 dt_rata::Int = Dates.days(dt) e_rata::Int = easter_rata(Dates.Year(yy)) if ( # Good Friday ( dt_rata == ( e_rata - 2 )) || # Easter Monday ( dt_rata == ( e_rata + 1)) ) return true end end else # mm >= 8 if ( # Christmas ((mm == 12) && (dd == 25)) || # Day of Goodwill ((mm == 12) && (dd == 26) && (yy >= 2000)) || # End of year ((mm == 12) && (dd == 31) && (yy == 1998 || yy == 1999 || yy == 2001)) ) return true end end return false end
BusinessDays
https://github.com/JuliaFinance/BusinessDays.jl.git
[ "MIT" ]
0.9.24
00e206895eb8d0350149dff15ba1ea9a8f306c63
code
3516
""" Banking holidays for England and Wales. """ struct UKSettlement <: HolidayCalendar end const UnitedKingdom = UKSettlement # England and Wales Banking Holidays function isholiday(::UKSettlement, dt::Dates.Date) yy = Dates.year(dt) mm = Dates.month(dt) dd = Dates.day(dt) # Bisection if mm >= 8 # Fixed holidays if ( # Late Summer Bank Holiday, August Bank Holiday adjustweekendholidayPost( findweekday(Dates.Monday, yy, 8, 1, false) ) == dt || # Christmas adjustweekendholidayPost( Dates.Date(yy, 12, 25) ) == dt || # Boxing adjustweekendholidayPost(adjustweekendholidayPost( Dates.Date(yy, 12, 25) ) + Dates.Day(1)) == dt ) return true end # Fixed date holidays with mm >= 8 if ( dt == Dates.Date(1999, 12, 31) || # Funeral of Queen Elizabeth II dt == Dates.Date(2022, 9, 19) ) return true end else # mm < 8 # Fixed holidays if ( # New Year's Day adjustweekendholidayPost( Dates.Date(yy, 01, 01) ) == dt || # May Day, Early May Bank Holiday (adjustweekendholidayPost(findweekday(Dates.Monday, yy, 5, 1, true)) == dt && yy != 1995 && yy != 2020) ) return true end # Spring Bank Holiday if yy == 2022 && dt == Dates.Date(2022, 6, 2) return true elseif adjustweekendholidayPost(findweekday(Dates.Monday, yy, 5, 1, false)) == dt && yy != 2012 && yy != 2002 && yy != 2022 return true end # Easter occurs up to April, which is before August (mm < 8). See test/easter-min-max.jl . # Holidays based on easter date dt_rata::Int = Dates.days(dt) e_rata::Int = easter_rata(Dates.Year(yy)) if ( # Good Friday ( dt_rata == ( e_rata - 2 ) ) || # Easter Monday ( dt_rata == ( e_rata + 1 ) ) ) return true end # Fixed date holidays with mm < 8 if ( # Substitute date for Spring Bank Holiday (dt == Dates.Date(2012, 6, 4)) || # Diamond Jubilee of Queen Elizabeth II. (dt == Dates.Date(2012, 6, 5)) || # Golden Jubilee of Queen Elizabeth II. (dt == Dates.Date(2002, 6, 3)) || # Platinum Jubilee of Queen Elizabeth II. (dt == Dates.Date(2022, 6, 3)) || # Substitute date for Spring Bank Holiday (dt == Dates.Date(2002, 6, 4)) || # Wedding of Prince William and Catherine Middleton (dt == Dates.Date(2011, 4, 29)) || # Substitute date for Early May Bank Holiday in 1995 (dt == Dates.Date(1995, 5, 8)) || # Substitute date for Early May Bank Holiday in 2020 (dt == Dates.Date(2020, 5, 8)) || # Coronation of King Charles III (dt == Dates.Date(2023, 5, 8)) ) return true end end return false end
BusinessDays
https://github.com/JuliaFinance/BusinessDays.jl.git
[ "MIT" ]
0.9.24
00e206895eb8d0350149dff15ba1ea9a8f306c63
code
7461
# United States calendars """ United States federal holidays. """ struct USSettlement <: HolidayCalendar end const UnitedStates = USSettlement """ United States NYSE holidays. """ struct USNYSE <: HolidayCalendar end """ United States Government Bond calendar. See <https://www.sifma.org/resources/general/holiday-schedule/>. """ struct USGovernmentBond <: HolidayCalendar end function isholiday(::USSettlement, dt::Dates.Date) yy = Dates.year(dt) mm = Dates.month(dt) dd = Dates.day(dt) if ( # New Year's Day adjustweekendholidayUS(Dates.Date(yy, 1, 1)) == dt || # New Year's Day on the previous year when 1st Jan is Saturday (mm == 12 && dd == 31 && Dates.dayofweek(dt) == Dates.Friday) || # Birthday of Martin Luther King, Jr. (yy >= 1983 && adjustweekendholidayUS(findweekday(Dates.Monday, yy, 1, 3, true)) == dt) || # Washington's Birthday adjustweekendholidayUS(findweekday(Dates.Monday, yy, 2, 3, true)) == dt || # Memorial Day adjustweekendholidayUS(findweekday(Dates.Monday, yy, 5, 1, false)) == dt || # Juneteenth (yy >= 2021 && adjustweekendholidayUS(Dates.Date(yy, 6, 19)) == dt) || # Independence Day adjustweekendholidayUS(Dates.Date(yy, 7, 4)) == dt || # Labor Day adjustweekendholidayUS(findweekday(Dates.Monday, yy, 9, 1, true)) == dt || # Columbus Day adjustweekendholidayUS(findweekday(Dates.Monday, yy, 10, 2, true)) == dt || # Veterans Day adjustweekendholidayUS(Dates.Date(yy, 11, 11)) == dt || # Thanksgiving Day adjustweekendholidayUS(findweekday(Dates.Thursday, yy, 11, 4, true)) == dt || # Christmas adjustweekendholidayUS(Dates.Date(yy, 12, 25)) == dt ) return true end return false end function isholiday(::USNYSE, dt::Dates.Date) yy = Dates.year(dt) mm = Dates.month(dt) dd = Dates.day(dt) dt_rata::Int = Dates.days(dt) e_rata::Int = easter_rata(Dates.Year(yy)) if ( # New Year's Day adjustweekendholidayUS(Dates.Date(yy, 1, 1)) == dt || # Birthday of Martin Luther King, Jr. (yy >= 1998 && adjustweekendholidayUS(findweekday(Dates.Monday, yy, 1, 3, true)) == dt) || # Washington's Birthday adjustweekendholidayUS(findweekday(Dates.Monday, yy, 2, 3, true)) == dt || # Good Friday dt_rata == ( e_rata - 2 ) || # Memorial Day adjustweekendholidayUS(findweekday(Dates.Monday, yy, 5, 1, false)) == dt || # Juneteenth (yy >= 2022 && adjustweekendholidayUS(Dates.Date(yy, 6, 19)) == dt) || # Independence Day adjustweekendholidayUS(Dates.Date(yy, 7, 4)) == dt || # Labor Day adjustweekendholidayUS(findweekday(Dates.Monday, yy, 9, 1, true)) == dt || # Thanksgiving Day adjustweekendholidayUS(findweekday(Dates.Thursday, yy, 11, 4, true)) == dt || # Christmas adjustweekendholidayUS(Dates.Date(yy, 12, 25)) == dt ) return true end # Presidential election days if (yy <= 1968 || (yy <= 1980 && yy % 4 == 0)) && mm == 11 && dd <= 7 && Dates.istuesday(dt) return true end # Special Closings if ( # President George H.W. Bush's funeral dt == Dates.Date(2018,12,5) || # Hurricane Sandy yy == 2012 && mm == 10 && (dd == 29 || dd == 30) || # Predient Ford's funeral dt == Dates.Date(2007,1,2) || # President Reagan's funeral dt == Dates.Date(2004,6,11) || # Sep 11th yy == 2001 && mm == 9 && (11 <= dd && dd <= 14) || # President Nixon's funeral dt == Dates.Date(1994,4,27) || # Hurricane Gloria dt == Dates.Date(1985,9,27) || # 1977 Blackout dt == Dates.Date(1977,7,14) || # Funeral of former President Lyndon B. Johnson dt == Dates.Date(1973,1,25) || # Funeral of former President Harry S. Truman dt == Dates.Date(1972,12,28) || # National Day of Participation for the lunar exploration dt == Dates.Date(1969,7,21) || # Eisenhower's funeral dt == Dates.Date(1969,3,31) || # Heavy snow dt == Dates.Date(1969,2,10) || # Day after Independence Day dt == Dates.Date(1968,7,5) || # Paperwork Crisis yy == 1968 && Dates.dayofyear(dt) >= 163 && Dates.iswednesday(dt) || # Mourning for Martin Luther King Jr dt == Dates.Date(1968,4,9) || # President Kennedy's funeral dt == Dates.Date(1963,11,25) || # Day before Decoration Day dt == Dates.Date(1961,5,29) || # Day after Christmas dt == Dates.Date(1958,12,26) || # Christmas Eve dt in [Dates.Date(1954,12,24), Dates.Date(1956,12,24), Dates.Date(1965,12,24)] ) return true end return false end function isholiday(::USGovernmentBond, dt::Dates.Date) yy = Dates.year(dt) mm = Dates.month(dt) dd = Dates.day(dt) dt_rata::Int = Dates.days(dt) e_rata::Int = easter_rata(Dates.Year(yy)) if ( # New Year's Day adjustweekendholidayUS(Dates.Date(yy, 1, 1)) == dt || # Birthday of Martin Luther King, Jr. yy >= 1983 && adjustweekendholidayUS(findweekday(Dates.Monday, yy, 1, 3, true)) == dt || # Washington's Birthday adjustweekendholidayUS(findweekday(Dates.Monday, yy, 2, 3, true)) == dt || # Good Friday dt_rata == ( e_rata - 2 ) || # Memorial Day adjustweekendholidayUS(findweekday(Dates.Monday, yy, 5, 1, false)) == dt || # Juneteenth (yy >= 2022 && adjustweekendholidayUS(Dates.Date(yy, 6, 19)) == dt) || # Independence Day adjustweekendholidayUS(Dates.Date(yy, 7, 4)) == dt || # Labor Day adjustweekendholidayUS(findweekday(Dates.Monday, yy, 9, 1, true)) == dt || # Columbus Day adjustweekendholidayUS(findweekday(Dates.Monday, yy, 10, 2, true)) == dt || # Veterans Day (yy != 2023 && adjustweekendholidayUS(Dates.Date(yy, 11, 11)) == dt) || # Thanksgiving Day adjustweekendholidayUS(findweekday(Dates.Thursday, yy, 11, 4, true)) == dt || # Christmas adjustweekendholidayUS(Dates.Date(yy, 12, 25)) == dt ) return true end # Special Closings # President George H.W. Bush's funeral <https://www.newyorkfed.org/markets/opolicy/operating_policy_181204> if dt == Dates.Date(2018, 12, 5) return true end return false end
BusinessDays
https://github.com/JuliaFinance/BusinessDays.jl.git
[ "MIT" ]
0.9.24
00e206895eb8d0350149dff15ba1ea9a8f306c63
code
62759
print("##########################\n") print(" Using cache: $(usecache)\n") print("##########################\n") if usecache BusinessDays.initcache(all_calendars_vec) end ################ # bdays.jl ################ #export # isweekend, isbday, tobday, advancebdays, bdays dt_tuesday = Dates.Date(2015,06,23) dt_wednesday = Dates.Date(2015, 06, 24) dt_thursday = Dates.Date(2015, 06, 25) dt_friday = Dates.Date(2015, 06, 26) dt_saturday = Dates.Date(2015, 06, 27) dt_sunday = Dates.Date(2015, 06, 28) dt_monday = Dates.Date(2015, 06, 29) dt_newyears = Dates.Date(2016,1,1) # Bounds tests if !usecache # this should work isbday(hc_brazil, Dates.Date(1600,2,1)) isbday(hc_brazil, Dates.Date(3000,2,1)) bdays(hc_brazil, Dates.Date(1600,2,1), Dates.Date(1600, 2, 5)) bdays(hc_brazil, Dates.Date(3000,2,1), Dates.Date(3000, 2, 5)) bdayscount(hc_brazil, Dates.Date(1600,2,1), Dates.Date(1600, 2, 5)) bdayscount(hc_brazil, Dates.Date(3000,2,1), Dates.Date(3000, 2, 5)) else #this should not work @test_throws AssertionError isbday(hc_brazil, Dates.Date(1600,2,1)) @test_throws AssertionError isbday(hc_brazil, Dates.Date(3000,2,1)) @test_throws AssertionError bdays(hc_brazil, Dates.Date(1600,2,1), Dates.Date(1600, 2, 5)) @test_throws AssertionError bdays(hc_brazil, Dates.Date(3000,2,1), Dates.Date(3000, 2, 5)) @test_throws AssertionError bdayscount(hc_brazil, Dates.Date(1600,2,1), Dates.Date(1600, 2, 5)) @test_throws AssertionError bdayscount(hc_brazil, Dates.Date(3000,2,1), Dates.Date(3000, 2, 5)) end @test_throws ErrorException isbday(:UnknownCalendar, Dates.Date(2016,1,1)) @test_throws ErrorException isbday("UnknownCalendar", Dates.Date(2016,1,1)) @test isweekend(dt_tuesday) == false @test isweekend(dt_wednesday) == false @test isweekend(dt_thursday) == false @test isweekend(dt_friday) == false @test isweekend(dt_saturday) == true @test isweekend(dt_sunday) == true @test isweekend(dt_monday) == false @test isweekday(dt_tuesday) == true @test isweekday(dt_wednesday) == true @test isweekday(dt_thursday) == true @test isweekday(dt_friday) == true @test isweekday(dt_saturday) == false @test isweekday(dt_sunday) == false @test isweekday(dt_monday) == true @test isbday(hc_brazil, dt_friday) == true @test isbday(hc_brazil, dt_saturday) == false @test isbday(hc_brazil, dt_sunday) == false @test isbday(hc_brazil, dt_monday) == true @test isholiday(hc_brazil, dt_friday) == false @test isholiday(hc_brazil, dt_saturday) == false @test isholiday(hc_brazil, dt_sunday) == false @test isholiday(hc_brazil, dt_monday) == false @test isholiday(hc_brazil, dt_newyears) == true # Symbol @test isholiday(:Brazil, dt_friday) == false @test isholiday(:Brazil, dt_saturday) == false @test isholiday(:Brazil, dt_sunday) == false @test isholiday(:Brazil, dt_monday) == false @test isholiday(:Brazil, dt_newyears) == true # String @test isholiday("Brazil", dt_friday) == false @test isholiday("Brazil", dt_saturday) == false @test isholiday("Brazil", dt_sunday) == false @test isholiday("Brazil", dt_monday) == false @test isholiday("Brazil", dt_newyears) == true @test isholiday(BusinessDays.NullHolidayCalendar(), Dates.Date(2016,9,25)) == false @test isholiday(:NullHolidayCalendar, Dates.Date(2016,9,25)) == false @test isholiday("NullHolidayCalendar", Dates.Date(2016,9,25)) == false @test isbday(BusinessDays.NullHolidayCalendar(), Dates.Date(2016,9,25)) == true @test isbday(:NullHolidayCalendar, Dates.Date(2016,9,25)) == true @test isbday("NullHolidayCalendar", Dates.Date(2016,9,25)) == true @test bdayscount(:NullHolidayCalendar, Dates.Date(2016,9,25), Dates.Date(2016,9,28)) == 3 @test bdays(:NullHolidayCalendar, Dates.Date(2016,9,25), Dates.Date(2016,9,28)) == Dates.Day(3) @test isholiday(BusinessDays.WeekendsOnly(), Dates.Date(2016,9,25)) == false @test isholiday(:WeekendsOnly, Dates.Date(2016,9,25)) == false @test isholiday("WeekendsOnly", Dates.Date(2016,9,25)) == false @test isbday(BusinessDays.WeekendsOnly(), Dates.Date(2016,9,25)) == false @test isbday(:WeekendsOnly, Dates.Date(2016,9,25)) == false @test isbday("WeekendsOnly", Dates.Date(2016,9,25)) == false @test bdayscount(:WeekendsOnly, Dates.Date(2016,9,25), Dates.Date(2016,9,28)) == 2 function test_bdays(cal, d0::Dates.Date, d1::Dates.Date, expected_result::Integer) @test bdays(cal, d0, d1) == Dates.Day(expected_result) if d0 != d1 @test bdays(cal, d1, d0) == Dates.Day(-expected_result) end nothing end function test_bdays(cal, d0::Tuple{Int, Int, Int}, d1::Tuple{Int, Int, Int}, expected_result::Integer) test_bdays(cal, Dates.Date(d0[1], d0[2], d0[3]), Dates.Date(d1[1], d1[2], d1[3]), expected_result) end test_bdays(:WeekendsOnly, (2016, 9, 25), (2016, 9, 28), 2) test_bdays(:WeekendsOnly, (2019, 8, 26), (2019, 9, 2), 5) test_bdays(:WeekendsOnly, (2019, 8, 26), (2019, 9, 3), 6) test_bdays(:WeekendsOnly, (2019, 8, 26), (2019, 9, 9), 10) test_bdays(:WeekendsOnly, (2019, 8, 26), (2019, 9, 10), 11) test_bdays(:WeekendsOnly, (2019, 8, 26), (2019, 8, 30), 4) test_bdays(:WeekendsOnly, (2019, 8, 26), (2019, 8, 27), 1) test_bdays(:WeekendsOnly, (2019, 8, 26), (2019, 8, 26), 0) test_bdays(:WeekendsOnly, (2019, 8, 19), (2019, 8, 26), 5) test_bdays(:WeekendsOnly, (2019, 8, 25), (2019, 8, 26), 0) test_bdays(:WeekendsOnly, (2019, 8, 24), (2019, 8, 25), 0) test_bdays(:WeekendsOnly, (2019, 8, 24), (2019, 8, 26), 0) test_bdays(:WeekendsOnly, (2019, 8, 23), (2019, 8, 24), 1) # Brazil HolidayCalendar tests @test isbday(hc_brazil, Dates.Date(2014, 12, 31)) == true # wednesday @test isbday(hc_brazil, Dates.Date(2015, 01, 01)) == false # new year @test isbday(hc_brazil, Dates.Date(2015, 01, 02)) == true # friday @test isbday(hc_brazil, Dates.Date(2015, 04, 20)) == true # monday @test isbday(hc_brazil, Dates.Date(2015, 04, 21)) == false # tiradentes @test isbday(hc_brazil, Dates.Date(2015, 04, 22)) == true # wednesday @test isbday(hc_brazil, Dates.Date(2015, 04, 30)) == true # thursday @test isbday(hc_brazil, Dates.Date(2015, 05, 01)) == false # labor day @test isbday(hc_brazil, Dates.Date(2015, 05, 02)) == false # saturday @test isbday(hc_brazil, Dates.Date(2015, 09, 06)) == false # sunday @test isbday(hc_brazil, Dates.Date(2015, 09, 07)) == false # independence day @test isbday(hc_brazil, Dates.Date(2015, 09, 08)) == true # tuesday @test isbday(hc_brazil, Dates.Date(2015, 10, 11)) == false # sunday @test isbday(hc_brazil, Dates.Date(2015, 10, 12)) == false # Nossa Senhora Aparecida @test isbday(hc_brazil, Dates.Date(2015, 10, 13)) == true # tuesday @test isbday(hc_brazil, Dates.Date(2015, 11, 01)) == false # sunday @test isbday(hc_brazil, Dates.Date(2015, 11, 02)) == false # Finados @test isbday(hc_brazil, Dates.Date(2015, 11, 03)) == true # tuesday @test isbday(hc_brazil, Dates.Date(2013, 11, 14)) == true # thursday @test isbday(hc_brazil, Dates.Date(2013, 11, 15)) == false # Republic @test isbday(hc_brazil, Dates.Date(2013, 11, 16)) == false # saturday @test isbday(hc_brazil, Dates.Date(2013, 12, 24)) == true # tuesday @test isbday(hc_brazil, Dates.Date(2013, 12, 25)) == false # Christmas @test isbday(hc_brazil, Dates.Date(2013, 12, 26)) == true # thursday @test isbday(hc_brazil, Dates.Date(2013, 02, 08)) == true # friday @test isbday(hc_brazil, Dates.Date(2013, 02, 09)) == false # saturday @test isbday(hc_brazil, Dates.Date(2013, 02, 10)) == false # sunday @test isbday(hc_brazil, Dates.Date(2013, 02, 11)) == false # segunda carnaval @test isbday(hc_brazil, Dates.Date(2013, 02, 12)) == false # terca carnaval @test isbday(hc_brazil, Dates.Date(2013, 02, 13)) == true # wednesday @test isbday(hc_brazil, Dates.Date(2013, 03, 28)) == true # thursday @test isbday(hc_brazil, Dates.Date(2013, 03, 29)) == false # sexta-feira santa @test isbday(hc_brazil, Dates.Date(2013, 03, 30)) == false # saturday @test isbday(hc_brazil, Dates.Date(2013, 05, 29)) == true # wednesday @test isbday(hc_brazil, Dates.Date(2013, 05, 30)) == false # Corpus Christi @test isbday(hc_brazil, Dates.Date(2013, 05, 31)) == true # friday @test isbday(hc_brazil, Dates.Date(2023, 11, 20)) == true # zumbi 2023 @test isbday(hc_brazil, Dates.Date(2024, 11, 20)) == false # zumbi 2024 # Symbol @test isbday(:Brazil, Dates.Date(2013, 05, 29)) == true # wednesday @test isbday(:Brazil, Dates.Date(2013, 05, 30)) == false # Corpus Christi @test isbday(:Brazil, Dates.Date(2013, 05, 31)) == true # friday # String @test isbday("Brazil", Dates.Date(2013, 05, 29)) == true # wednesday @test isbday("Brazil", Dates.Date(2013, 05, 30)) == false # Corpus Christi @test isbday("Brazil", Dates.Date(2013, 05, 31)) == true # friday # BrazilExchange holiday calendar tests @test isbday(hc_brazil_exc, Dates.Date(2017, 11, 19)) == false # sunday @test isbday(hc_brazil_exc, Dates.Date(2017, 11, 20)) == false # Consciência Negra (segunda) @test isbday(hc_brazil_exc, Dates.Date(2017, 11, 21)) == true # Terca @test isbday(:BrazilExchange, Dates.Date(2013, 05, 29)) == true # wednesday @test isbday(:BrazilExchange, Dates.Date(2013, 05, 30)) == false # Corpus Christi (National Holiday) @test isbday(:BrazilExchange, Dates.Date(2013, 05, 31)) == true # friday # BrazilB3 as alias of BrazilExchange @test isbday(:BrazilB3, Dates.Date(2013, 05, 29)) == true # wednesday @test isbday(:BrazilB3, Dates.Date(2013, 05, 30)) == false # Corpus Christi (National Holiday) @test isbday(:BrazilB3, Dates.Date(2013, 05, 31)) == true # friday # BrazilExchange 2019 calendar @test isbday(hc_brazil_exc, Dates.Date(2019, 01, 01)) == false # Confraternização Universal @test isbday(hc_brazil_exc, Dates.Date(2019, 03, 04)) == false # Carnaval @test isbday(hc_brazil_exc, Dates.Date(2019, 03, 05)) == false # Carnaval @test isbday(hc_brazil_exc, Dates.Date(2019, 04, 19)) == false # Paixão de Cristo @test isbday(hc_brazil_exc, Dates.Date(2019, 05, 01)) == false # Dia do Trabalho @test isbday(hc_brazil_exc, Dates.Date(2019, 06, 20)) == false # Corpus Christi @test isbday(hc_brazil_exc, Dates.Date(2019, 11, 15)) == false # Proclamação da República @test isbday(hc_brazil_exc, Dates.Date(2019, 12, 24)) == false # Véspera de Natal @test isbday(hc_brazil_exc, Dates.Date(2019, 12, 25)) == false # Natal @test isbday(hc_brazil_exc, Dates.Date(2019, 12, 31)) == false # bank holiday @test isbday(hc_brazil_exc, Dates.Date(2019, 01, 25)) == false # Aniversário de São Paulo @test isbday(hc_brazil_exc, Dates.Date(2019, 07, 09)) == false # Revolução Constitucionalista @test isbday(hc_brazil_exc, Dates.Date(2019, 11, 20)) == false # Dia da Consciência Negra # BrazilExchange 2020 calendar @test isbday(hc_brazil_exc, Dates.Date(2020, 05, 20)) == true @test isbday(hc_brazil_exc, Dates.Date(2020, 05, 21)) == true @test isbday(hc_brazil_exc, Dates.Date(2020, 05, 22)) == true @test isbday(hc_brazil_exc, Dates.Date(2020, 05, 25)) == true @test isbday(hc_brazil_exc, Dates.Date(2020, 06, 11)) == false @test isbday(hc_brazil_exc, Dates.Date(2020, 07, 09)) == true # 2020 update by Ofício Circular 072/2020-PRE @test isbday(hc_brazil_exc, Dates.Date(2020, 11, 20)) == true # 2020 update by Ofício Circular 072/2020-PRE # BrazilExchange 2021 calendar @test isholiday(hc_brazil_exc, Dates.Date(2021, 1, 25)) == true @test isholiday(hc_brazil_exc, Dates.Date(2021, 7, 9)) == true # BrazilExchange 2022 calendar @test isholiday(hc_brazil_exc, Dates.Date(2022, 1, 25)) == false # updated by Ofício Circular 150/2020-PRE @test isholiday(hc_brazil_exc, Dates.Date(2022, 7, 9)) == false @test isholiday(hc_brazil_exc, Dates.Date(2022, 11, 20)) == false @test isholiday(hc_brazil_exc, Dates.Date(2022, 2, 28)) == true @test isholiday(hc_brazil_exc, Dates.Date(2022, 3, 1)) == true @test isholiday(hc_brazil_exc, Dates.Date(2022, 4, 15)) == true @test isholiday(hc_brazil_exc, Dates.Date(2022, 4, 21)) == true @test isholiday(hc_brazil_exc, Dates.Date(2022, 6, 16)) == true @test isholiday(hc_brazil_exc, Dates.Date(2022, 9, 7)) == true @test isholiday(hc_brazil_exc, Dates.Date(2022, 10, 12)) == true @test isholiday(hc_brazil_exc, Dates.Date(2022, 11, 2)) == true @test isholiday(hc_brazil_exc, Dates.Date(2022, 11, 15)) == true @test BusinessDays.listholidays(hc_brazil_exc, Dates.Date(2022, 1, 1), Dates.Date(2022, 12, 31)) == [ Dates.Date("2022-01-01"), # Confraternização Universal (Sábado) Dates.Date("2022-02-28"), # Carnaval (segunda) Dates.Date("2022-03-01"), # Carnaval (Terça) Dates.Date("2022-04-15"), # Paixão de Cristo Dates.Date("2022-04-21"), # Tiradentes Dates.Date("2022-05-01"), # Dia do Trabalho (Domingo) Dates.Date("2022-06-16"), # Corpus Christi Dates.Date("2022-09-07"), # Independência do Brasil Dates.Date("2022-10-12"), # Nossa Senhora Aparecida Dates.Date("2022-11-02"), # Finados Dates.Date("2022-11-15"), # Proclamação da República Dates.Date("2022-12-24"), # Véspera de Natal (sábado) Dates.Date("2022-12-25"), # Natal (Domingo) Dates.Date("2022-12-30"), # Véspera do ano novo Dates.Date("2022-12-31") # Ano novo (Sábado) ] # USSettlement HolidayCaledar tests # Federal Holidays listed on https://www.opm.gov/policy-data-oversight/snow-dismissal-procedures/federal-holidays/#url=2015 @test isbday(hc_usa, Dates.Date(2014, 12, 31)) == true @test isbday(hc_usa, Dates.Date(2015, 01, 01)) == false # New Year's Day - Thursday @test isbday(hc_usa, Dates.Date(2015, 01, 02)) == true @test isbday(hc_usa, Dates.Date(2015, 01, 18)) == false @test isbday(hc_usa, Dates.Date(2015, 01, 19)) == false # Birthday of Martin Luther King, Jr. - Monday @test isbday(hc_usa, Dates.Date(2015, 01, 20)) == true @test isbday(hc_usa, Dates.Date(1982,01,18)) == true # not a holiday for Martin Luther King @test isbday(hc_usa, Dates.Date(2015, 02, 15)) == false @test isbday(hc_usa, Dates.Date(2015, 02, 16)) == false # Washington’s Birthday - Monday @test isbday(hc_usa, Dates.Date(2015, 02, 17)) == true @test isbday(hc_usa, Dates.Date(2015, 05, 24)) == false @test isbday(hc_usa, Dates.Date(2015, 05, 25)) == false # Memorial Day - Monday @test isbday(hc_usa, Dates.Date(2015, 05, 26)) == true @test isbday(hc_usa, Dates.Date(2020, 06, 19)) == true @test isbday(hc_usa, Dates.Date(2021, 06, 17)) == true @test isbday(hc_usa, Dates.Date(2021, 06, 18)) == false # Juneteenth starting 2021 @test isbday(hc_usa, Dates.Date(2022, 06, 20)) == false # Juneteenth 2022 @test isbday(hc_usa, Dates.Date(2015, 07, 02)) == true @test isbday(hc_usa, Dates.Date(2015, 07, 03)) == false # Independence Day - Friday @test isbday(hc_usa, Dates.Date(2015, 07, 04)) == false @test isbday(hc_usa, Dates.Date(2015, 09, 06)) == false @test isbday(hc_usa, Dates.Date(2015, 09, 07)) == false # Labor Day - Monday @test isbday(hc_usa, Dates.Date(2015, 09, 08)) == true @test isbday(hc_usa, Dates.Date(2015, 10, 11)) == false @test isbday(hc_usa, Dates.Date(2015, 10, 12)) == false # Columbus Day - Monday @test isbday(hc_usa, Dates.Date(2015, 10, 13)) == true @test isbday(hc_usa, Dates.Date(2015, 11, 10)) == true @test isbday(hc_usa, Dates.Date(2015, 11, 11)) == false # Veterans Day - Wednesday @test isbday(hc_usa, Dates.Date(2015, 11, 12)) == true @test isbday(hc_usa, Dates.Date(2015, 11, 25)) == true @test isbday(hc_usa, Dates.Date(2015, 11, 26)) == false # Thanksgiving Day - Thursday @test isbday(hc_usa, Dates.Date(2015, 11, 27)) == true @test isbday(hc_usa, Dates.Date(2015, 12, 24)) == true @test isbday(hc_usa, Dates.Date(2015, 12, 25)) == false # Christmas - Friday @test isbday(hc_usa, Dates.Date(2015, 12, 26)) == false @test isbday(hc_usa, Dates.Date(2010, 12, 31)) == false # new years day observed @test isbday(hc_usa, Dates.Date(2004, 12, 31)) == false # new years day observed @test isbday(hc_usa, Dates.Date(2013, 03, 28)) == true # thursday @test isbday(hc_usa, Dates.Date(2013, 03, 29)) == true # good friday @test isbday(hc_usa, Dates.Date(2013, 03, 30)) == false # saturday # Symbol @test isbday(:USSettlement, Dates.Date(2015, 12, 24)) == true @test isbday(:USSettlement, Dates.Date(2015, 12, 25)) == false # Christmas - Friday @test isbday(:USSettlement, Dates.Date(2015, 12, 26)) == false # String @test isbday("USSettlement", Dates.Date(2015, 12, 24)) == true @test isbday("USSettlement", Dates.Date(2015, 12, 25)) == false # Christmas - Friday @test isbday("USSettlement", Dates.Date(2015, 12, 26)) == false ## USNYSE HolidayCalendar tests @test isbday(hc_usnyse, Dates.Date(2014, 12, 31)) == true @test isbday(hc_usnyse, Dates.Date(2015, 01, 01)) == false # New Year's Day - Thursday @test isbday(hc_usnyse, Dates.Date(2015, 01, 02)) == true @test isbday(hc_usnyse, Dates.Date(2015, 01, 18)) == false @test isbday(hc_usnyse, Dates.Date(2015, 01, 19)) == false # Birthday of Martin Luther King, Jr. - Monday @test isbday(hc_usnyse, Dates.Date(2015, 01, 20)) == true @test isbday(hc_usnyse, Dates.Date(2015, 02, 15)) == false @test isbday(hc_usnyse, Dates.Date(2015, 02, 16)) == false # Washington’s Birthday - Monday @test isbday(hc_usnyse, Dates.Date(2015, 02, 17)) == true @test isbday(hc_usnyse, Dates.Date(2015, 05, 24)) == false @test isbday(hc_usnyse, Dates.Date(2015, 05, 25)) == false # Memorial Day - Monday @test isbday(hc_usnyse, Dates.Date(2015, 05, 26)) == true @test isbday(hc_usnyse, Dates.Date(2021, 06, 18)) == true @test isbday(hc_usnyse, Dates.Date(2022, 06, 20)) == false # Juneteenth starting 2022 @test isbday(hc_usnyse, Dates.Date(2023, 06, 19)) == false # Juneteenth 2023 @test isbday(hc_usnyse, Dates.Date(2015, 07, 02)) == true @test isbday(hc_usnyse, Dates.Date(2015, 07, 03)) == false # Independence Day - Friday @test isbday(hc_usnyse, Dates.Date(2015, 07, 04)) == false @test isbday(hc_usnyse, Dates.Date(2015, 09, 06)) == false @test isbday(hc_usnyse, Dates.Date(2015, 09, 07)) == false # Labor Day - Monday @test isbday(hc_usnyse, Dates.Date(2015, 09, 08)) == true @test isbday(hc_usnyse, Dates.Date(2015, 10, 11)) == false @test isbday(hc_usnyse, Dates.Date(2015, 10, 12)) == true # Columbus Day - Monday @test isbday(hc_usnyse, Dates.Date(2015, 10, 13)) == true @test isbday(hc_usnyse, Dates.Date(2015, 11, 10)) == true @test isbday(hc_usnyse, Dates.Date(2015, 11, 11)) == true # Veterans Day - Wednesday @test isbday(hc_usnyse, Dates.Date(2015, 11, 12)) == true @test isbday(hc_usnyse, Dates.Date(2015, 11, 25)) == true @test isbday(hc_usnyse, Dates.Date(2015, 11, 26)) == false # Thanksgiving Day - Thursday @test isbday(hc_usnyse, Dates.Date(2015, 11, 27)) == true @test isbday(hc_usnyse, Dates.Date(2015, 12, 24)) == true @test isbday(hc_usnyse, Dates.Date(2015, 12, 25)) == false # Christmas - Friday @test isbday(hc_usnyse, Dates.Date(2015, 12, 26)) == false @test isbday(hc_usnyse, Dates.Date(2010, 12, 31)) == true # Friday before new years @test isbday(hc_usnyse, Dates.Date(2004, 12, 31)) == true # Friday before new years @test isbday(hc_usnyse, Dates.Date(2013, 03, 28)) == true # thursday @test isbday(hc_usnyse, Dates.Date(2013, 03, 29)) == false # good friday @test isbday(hc_usnyse, Dates.Date(2013, 03, 30)) == false # saturday # Symbol @test isbday(:USNYSE, Dates.Date(2015, 12, 24)) == true @test isbday(:USNYSE, Dates.Date(2015, 12, 25)) == false # Christmas - Friday @test isbday(:USNYSE, Dates.Date(2015, 12, 26)) == false # String @test isbday("USNYSE", Dates.Date(2015, 12, 24)) == true @test isbday("USNYSE", Dates.Date(2015, 12, 25)) == false # Christmas - Friday @test isbday("USNYSE", Dates.Date(2015, 12, 26)) == false ## USGovernmentBond HolidayCalendar tests @test isbday(hc_usgovbond, Dates.Date(2014, 12, 31)) == true @test isbday(hc_usgovbond, Dates.Date(2015, 01, 01)) == false # New Year's Day - Thursday @test isbday(hc_usgovbond, Dates.Date(2015, 01, 02)) == true @test isbday(hc_usgovbond, Dates.Date(2015, 01, 18)) == false @test isbday(hc_usgovbond, Dates.Date(2015, 01, 19)) == false # Birthday of Martin Luther King, Jr. - Monday @test isbday(hc_usgovbond, Dates.Date(2015, 01, 20)) == true @test isbday(hc_usgovbond, Dates.Date(2015, 02, 15)) == false @test isbday(hc_usgovbond, Dates.Date(2015, 02, 16)) == false # Washington’s Birthday - Monday @test isbday(hc_usgovbond, Dates.Date(2015, 02, 17)) == true @test isbday(hc_usgovbond, Dates.Date(2015, 05, 24)) == false @test isbday(hc_usgovbond, Dates.Date(2015, 05, 25)) == false # Memorial Day - Monday @test isbday(hc_usgovbond, Dates.Date(2015, 05, 26)) == true @test isbday(hc_usgovbond, Dates.Date(2015, 07, 02)) == true @test isbday(hc_usgovbond, Dates.Date(2015, 07, 03)) == false # Independence Day - Friday @test isbday(hc_usgovbond, Dates.Date(2015, 07, 04)) == false @test isbday(hc_usgovbond, Dates.Date(2015, 09, 06)) == false @test isbday(hc_usgovbond, Dates.Date(2015, 09, 07)) == false # Labor Day - Monday @test isbday(hc_usgovbond, Dates.Date(2015, 09, 08)) == true @test isbday(hc_usgovbond, Dates.Date(2015, 10, 11)) == false @test isbday(hc_usgovbond, Dates.Date(2015, 10, 12)) == false # Columbus Day - Monday @test isbday(hc_usgovbond, Dates.Date(2015, 10, 13)) == true @test isbday(hc_usgovbond, Dates.Date(2015, 11, 10)) == true @test isbday(hc_usgovbond, Dates.Date(2015, 11, 11)) == false # Veterans Day - Wednesday @test isbday(hc_usgovbond, Dates.Date(2015, 11, 12)) == true @test isbday(hc_usgovbond, Dates.Date(2015, 11, 25)) == true @test isbday(hc_usgovbond, Dates.Date(2015, 11, 26)) == false # Thanksgiving Day - Thursday @test isbday(hc_usgovbond, Dates.Date(2015, 11, 27)) == true @test isbday(hc_usgovbond, Dates.Date(2015, 12, 24)) == true @test isbday(hc_usgovbond, Dates.Date(2015, 12, 25)) == false # Christmas - Friday @test isbday(hc_usgovbond, Dates.Date(2015, 12, 26)) == false @test isbday(hc_usgovbond, Dates.Date(2010, 12, 31)) == true # Friday before new years @test isbday(hc_usgovbond, Dates.Date(2004, 12, 31)) == true # Friday before new years @test isbday(hc_usgovbond, Dates.Date(2013, 03, 28)) == true # thursday @test isbday(hc_usgovbond, Dates.Date(2013, 03, 29)) == false # good friday @test isbday(hc_usgovbond, Dates.Date(2013, 03, 30)) == false # saturday # 2022 @test BusinessDays.listholidays(BusinessDays.USGovernmentBond(), Dates.Date(2022, 1, 1), Dates.Date(2023, 1, 1)) == Dates.Date.([ "2022-01-17", # Martin Luther King Day "2022-02-21", # Presidents Day "2022-04-15", # Good Friday "2022-05-30", # Memorial Day "2022-06-20", # Juneteenth "2022-07-04", # U.S. Independence Day "2022-09-05", # Labor Day "2022-10-10", # Columbus Day "2022-11-11", # Veterans Day "2022-11-24", # Thanksgiving Day "2022-12-26", # Christmas Day ]) # 2023 @test BusinessDays.listholidays(BusinessDays.USGovernmentBond(), Dates.Date(2023, 1, 1), Dates.Date(2023, 12, 31)) == Dates.Date.([ "2023-01-02", # New Year's Day 2022/2023 "2023-01-16", # Martin Luther King Day "2023-02-20", # Presidents Day "2023-04-07", # Good Friday "2023-05-29", # Memorial Day "2023-06-19", # Juneteenth "2023-07-04", # U.S. Independence Day "2023-09-04", # Labor Day "2023-10-09", # Columbus Day "2023-11-23", # Thanksgiving Day "2023-12-25", # Christmas Day ]) # 2024 @test BusinessDays.listholidays(BusinessDays.USGovernmentBond(), Dates.Date(2024, 1, 1), Dates.Date(2024, 12, 31)) == Dates.Date.([ "2024-01-01", # New Year’s "2024-01-15", # Martin Luther King Day "2024-02-19", # Presidents Day "2024-03-29", # Good Friday "2024-05-27", # Memorial Day "2024-06-19", # Juneteenth "2024-07-04", # U.S. Independence Day "2024-09-02", # Labor Day "2024-10-14", # Columbus Day "2024-11-11", # Veterans Day "2024-11-28", # Thanksgiving Day "2024-12-25", # Christmas Day ]) # 2025 @test !isbday(hc_usgovbond, Dates.Date(2025, 01, 1)) # New Year's Day # Symbol @test isbday(:USGovernmentBond, Dates.Date(2015, 12, 24)) == true @test isbday(:USGovernmentBond, Dates.Date(2015, 12, 25)) == false # Christmas - Friday @test isbday(:USGovernmentBond, Dates.Date(2015, 12, 26)) == false # String @test isbday("USGovernmentBond", Dates.Date(2015, 12, 24)) == true @test isbday("USGovernmentBond", Dates.Date(2015, 12, 25)) == false # Christmas - Friday @test isbday("USGovernmentBond", Dates.Date(2015, 12, 26)) == false # TARGET HolidayCalendar tests @test isbday(targethc, Dates.Date(2017, 12, 24)) == false # Sunday @test isbday(targethc, Dates.Date(2017, 12, 25)) == false # Christmas - Monday @test isbday(targethc, Dates.Date(2017, 12, 26)) == false # Day of Goodwill - Tuesday @test isbday(targethc, Dates.Date(2017, 12, 27)) == true # Wednesday @test isbday(targethc, Dates.Date(2017, 04, 16)) == false # Easter Sunday @test isbday(targethc, Dates.Date(2017, 04, 17)) == false # Easter Monday @test isbday(targethc, Dates.Date(2017, 04, 18)) == true # Tuesday @test isbday(targethc, Dates.Date(2001, 12, 31)) == false # End of year @test isbday(targethc, Dates.Date(2002, 12, 31)) == true # End of year @test isbday(targethc, Dates.Date(2016, 1, 1)) == false # New Year's Day @test isbday(targethc, Dates.Date(2017, 5, 1)) == false # Labour Day @test isbday(targethc, Dates.Date(1998, 5, 1)) == true # Labour Day before 2000 # Symbol @test isbday(:TARGET, Dates.Date(2017, 04, 16)) == false # Easter Sunday @test isbday(:TARGET, Dates.Date(2017, 04, 17)) == false # Easter Monday @test isbday(:TARGET, Dates.Date(2017, 04, 18)) == true # Tuesday # String @test isbday("TARGET", Dates.Date(2017, 04, 16)) == false # Easter Sunday @test isbday("TARGET", Dates.Date(2017, 04, 17)) == false # Easter Monday @test isbday("TARGET", Dates.Date(2017, 04, 18)) == true # Tuesday # TARGET synonyms @test isbday("TARGET2", Dates.Date(2017, 04, 18)) == isbday("TARGET", Dates.Date(2017, 04, 18)) @test isbday("EuroZone", Dates.Date(2017, 04, 18)) == isbday("TARGET", Dates.Date(2017, 04, 18)) ## UKSettlement HolidayCalendar tests @test isbday(hc_uk, Dates.Date(2014, 12, 31)) == true @test isbday(hc_uk, Dates.Date(2015, 01, 01)) == false # New Year's Day Thursday @test isbday(hc_uk, Dates.Date(2015, 01, 02)) == true @test isbday(hc_uk, Dates.Date(2015, 08, 30)) == false @test isbday(hc_uk, Dates.Date(2015, 08, 31)) == false # Monday Summer bank holiday @test isbday(hc_uk, Dates.Date(2015, 09, 01)) == true @test isbday(hc_uk, Dates.Date(2015, 12, 24)) == true @test isbday(hc_uk, Dates.Date(2015, 12, 25)) == false # 25 December Friday Christmas Day @test isbday(hc_uk, Dates.Date(2015, 12, 26)) == false @test isbday(hc_uk, Dates.Date(2015, 12, 27)) == false @test isbday(hc_uk, Dates.Date(2015, 12, 28)) == false # Monday Boxing Day (substitute day) @test isbday(hc_uk, Dates.Date(2015, 12, 29)) == true @test isbday(hc_uk, Dates.Date(2016, 03, 24)) == true @test isbday(hc_uk, Dates.Date(2016, 03, 25)) == false # 25 March Friday Good Friday @test isbday(hc_uk, Dates.Date(2016, 03, 26)) == false @test isbday(hc_uk, Dates.Date(2016, 03, 27)) == false @test isbday(hc_uk, Dates.Date(2016, 03, 28)) == false # 28 March Monday Easter Monday @test isbday(hc_uk, Dates.Date(2016, 03, 29)) == true @test isbday(hc_uk, Dates.Date(2016, 05, 01)) == false @test isbday(hc_uk, Dates.Date(2016, 05, 02)) == false # 2 May Monday Early May bank holiday @test isbday(hc_uk, Dates.Date(2016, 05, 03)) == true @test isbday(hc_uk, Dates.Date(2016, 05, 29)) == false @test isbday(hc_uk, Dates.Date(2016, 05, 30)) == false # 30 May Monday Spring bank holiday @test isbday(hc_uk, Dates.Date(2016, 05, 31)) == true @test isbday(hc_uk, Dates.Date(2016, 08, 28)) == false @test isbday(hc_uk, Dates.Date(2016, 08, 29)) == false # 29 August Monday Summer bank holiday @test isbday(hc_uk, Dates.Date(2016, 08, 30)) == true @test isbday(hc_uk, Dates.Date(2016, 12, 23)) == true @test isbday(hc_uk, Dates.Date(2016, 12, 24)) == false @test isbday(hc_uk, Dates.Date(2016, 12, 25)) == false @test isbday(hc_uk, Dates.Date(2016, 12, 26)) == false # 26 December Monday Boxing Day @test isbday(hc_uk, Dates.Date(2016, 12, 27)) == false # 27 December Tuesday Christmas Day (substitute day) @test isbday(hc_uk, Dates.Date(2016, 12, 28)) == true # 2012 UK Holidays @test isbday(hc_uk, Dates.Date(2011, 12, 30)) == true @test isbday(hc_uk, Dates.Date(2011, 12, 31)) == false @test isbday(hc_uk, Dates.Date(2012, 01, 01)) == false @test isbday(hc_uk, Dates.Date(2012, 01, 02)) == false # 2 January Monday New Year’s Day (substitute day) @test isbday(hc_uk, Dates.Date(2012, 01, 03)) == true @test isbday(hc_uk, Dates.Date(2012, 04, 05)) == true @test isbday(hc_uk, Dates.Date(2012, 04, 06)) == false # 6 April Friday Good Friday @test isbday(hc_uk, Dates.Date(2012, 04, 07)) == false @test isbday(hc_uk, Dates.Date(2012, 04, 08)) == false @test isbday(hc_uk, Dates.Date(2012, 04, 09)) == false # 9 April Monday Easter Monday @test isbday(hc_uk, Dates.Date(2012, 04, 10)) == true @test isbday(hc_uk, Dates.Date(2012, 05, 06)) == false @test isbday(hc_uk, Dates.Date(2012, 05, 07)) == false # 7 May Monday Early May bank holiday @test isbday(hc_uk, Dates.Date(2012, 05, 08)) == true @test isbday(hc_uk, Dates.Date(2012, 06, 01)) == true @test isbday(hc_uk, Dates.Date(2012, 06, 02)) == false @test isbday(hc_uk, Dates.Date(2012, 06, 03)) == false @test isbday(hc_uk, Dates.Date(2012, 06, 04)) == false # 4 June Monday Spring bank holiday (substitute day) @test isbday(hc_uk, Dates.Date(2012, 06, 05)) == false # 5 June Tuesday Queen’s Diamond Jubilee (extra bank holiday) @test isbday(hc_uk, Dates.Date(2012, 06, 06)) == true @test isbday(hc_uk, Dates.Date(2011, 04, 29)) == false # wedding @test isbday(hc_uk, Dates.Date(2012, 08, 26)) == false @test isbday(hc_uk, Dates.Date(2012, 08, 27)) == false # 27 August Monday Summer bank holiday @test isbday(hc_uk, Dates.Date(2012, 08, 28)) == true @test isbday(hc_uk, Dates.Date(2012, 12, 24)) == true @test isbday(hc_uk, Dates.Date(2012, 12, 25)) == false # 25 December Tuesday Christmas Day @test isbday(hc_uk, Dates.Date(2012, 12, 26)) == false # 26 December Wednesday Boxing Day @test isbday(hc_uk, Dates.Date(2012, 12, 27)) == true # 1999 UK holidays @test isbday(hc_uk, Dates.Date(1999, 12, 26)) == false # Sunday @test isbday(hc_uk, Dates.Date(1999, 12, 27)) == false # Christmas observed @test isbday(hc_uk, Dates.Date(1999, 12, 28)) == false # Boxing observed @test isbday(hc_uk, Dates.Date(1999, 12, 29)) == true @test isbday(hc_uk, Dates.Date(1999, 12, 30)) == true @test isbday(hc_uk, Dates.Date(1999, 12, 31)) == false # 1995 UK holidays @test isbday(hc_uk, Dates.Date(1995, 5, 1)) == true # Early May Bank Holiday was moved to May 8th in 1995 @test isbday(hc_uk, Dates.Date(1995, 5, 8)) == false # Early May Bank Holiday was moved to May 8th in 1995 # 2020 UK holidays @test isbday(hc_uk, Dates.Date(2020, 5, 4)) == true # Early May Bank Holiday was moved to May 8th in 2020 @test isbday(hc_uk, Dates.Date(2020, 5, 8)) == false # Early May Bank Holiday was moved to May 8th in 2020 # 2021 UK Holidays @test isbday(hc_uk, Dates.Date(2021, 5, 3)) == false # May Day @test isbday(hc_uk, Dates.Date(2021, 5, 4)) == true @test isbday(hc_uk, Dates.Date(2021, 5, 31)) == false # spring bank holiday @test isbday(hc_uk, Dates.Date(2021, 6, 1)) == true # spring bank holiday # 2022 UK holidays slightly move due to Platinum Jubilee (#57) @test isbday(hc_uk, Dates.Date(2022, 6, 3)) == false # Platinum Jubilee of Queen Elizabeth II. @test isbday(hc_uk, Dates.Date(2022, 5, 2)) == false # May Day @test isbday(hc_uk, Dates.Date(2022, 5, 30)) == true @test isbday(hc_uk, Dates.Date(2022, 6, 2)) == false # Spring Bank Holiday @test isbday(hc_uk, Dates.Date(2022, 8, 29)) == false # Summer Bank Holiday @test isbday(hc_uk, Dates.Date(2022, 9, 19)) == false # Funeral of Queen Elizabeth II @test isbday(hc_uk, Dates.Date(2022, 12, 26)) == false # Boxing @test isbday(hc_uk, Dates.Date(2022, 12, 27)) == false # Christmas # 2023 UK holidays @test BusinessDays.listholidays(hc_uk, Dates.Date(2023, 1, 1), Dates.Date(2023, 12, 31)) == Dates.Date.([ "2023-01-02", # New Year’s Day (substitute day) "2023-04-07", # Good Friday "2023-04-10", # Easter Monday "2023-05-01", # Early May bank holiday "2023-05-08", # Coronation of King Charles III "2023-05-29", # Spring bank holiday "2023-08-28", # Summer bank holiday "2023-12-25", # Christmas Day "2023-12-26", # Boxing Day ]) @test tobday(hc_brazil, Dates.Date(2013, 02, 08)) == Dates.Date(2013, 02, 08) # regular friday @test tobday(hc_brazil, Dates.Date(2013, 02, 09)) == Dates.Date(2013, 02, 13) # after carnaval @test tobday(hc_brazil, Dates.Date(2013, 02, 09); forward = true) == Dates.Date(2013, 02, 13) # after carnaval @test tobday(hc_brazil, Dates.Date(2013, 02, 13); forward = false) == Dates.Date(2013, 02, 13) # after carnaval @test tobday(hc_brazil, Dates.Date(2013, 02, 12); forward = false) == Dates.Date(2013, 02, 08) # before carnaval @test tobday(:Brazil, Dates.Date(2013, 02, 08)) == Dates.Date(2013, 02, 08) # regular friday @test tobday(:Brazil, Dates.Date(2013, 02, 09)) == Dates.Date(2013, 02, 13) # after carnaval @test tobday(:Brazil, Dates.Date(2013, 02, 09); forward = true) == Dates.Date(2013, 02, 13) # after carnaval @test tobday(:Brazil, Dates.Date(2013, 02, 13); forward = false) == Dates.Date(2013, 02, 13) # after carnaval @test tobday(:Brazil, Dates.Date(2013, 02, 12); forward = false) == Dates.Date(2013, 02, 08) # before carnaval @test tobday("Brazil", Dates.Date(2013, 02, 08)) == Dates.Date(2013, 02, 08) # regular friday @test tobday("Brazil", Dates.Date(2013, 02, 09)) == Dates.Date(2013, 02, 13) # after carnaval @test tobday("Brazil", Dates.Date(2013, 02, 09); forward = true) == Dates.Date(2013, 02, 13) # after carnaval @test tobday("Brazil", Dates.Date(2013, 02, 13); forward = false) == Dates.Date(2013, 02, 13) # after carnaval @test tobday("Brazil", Dates.Date(2013, 02, 12); forward = false) == Dates.Date(2013, 02, 08) # before carnaval @test advancebdays(hc_brazil, Dates.Date(2013, 02, 06), 0) == Dates.Date(2013, 02, 06) # regular wednesday @test advancebdays(hc_brazil, Dates.Date(2013, 02, 06), 1) == Dates.Date(2013, 02, 07) # regular thursday @test advancebdays(hc_brazil, Dates.Date(2013, 02, 07), -1) == Dates.Date(2013, 02, 06) # regular thursday @test advancebdays(hc_brazil, Dates.Date(2013, 02, 06), 2) == Dates.Date(2013, 02, 08) # regular friday @test advancebdays(hc_brazil, Dates.Date(2013, 02, 06), 3) == Dates.Date(2013, 02, 13) # after carnaval wednesday @test advancebdays(hc_brazil, Dates.Date(2013, 02, 06), 4) == Dates.Date(2013, 02, 14) # after carnaval thursday @test advancebdays(hc_brazil, Dates.Date(2013, 02, 14), -4) == Dates.Date(2013, 02, 06) # after carnaval thursday @test advancebdays(:Brazil, Dates.Date(2013, 02, 06), 0) == Dates.Date(2013, 02, 06) # regular wednesday @test advancebdays(:Brazil, Dates.Date(2013, 02, 06), 1) == Dates.Date(2013, 02, 07) # regular thursday @test advancebdays(:Brazil, Dates.Date(2013, 02, 06), 2) == Dates.Date(2013, 02, 08) # regular friday @test advancebdays(:Brazil, Dates.Date(2013, 02, 06), 3) == Dates.Date(2013, 02, 13) # after carnaval wednesday @test advancebdays(:Brazil, Dates.Date(2013, 02, 06), 4) == Dates.Date(2013, 02, 14) # after carnaval thursday @test advancebdays("Brazil", Dates.Date(2013, 02, 06), 0) == Dates.Date(2013, 02, 06) # regular wednesday @test advancebdays("Brazil", Dates.Date(2013, 02, 06), 1) == Dates.Date(2013, 02, 07) # regular thursday @test advancebdays("Brazil", Dates.Date(2013, 02, 06), 2) == Dates.Date(2013, 02, 08) # regular friday @test advancebdays("Brazil", Dates.Date(2013, 02, 06), 3) == Dates.Date(2013, 02, 13) # after carnaval wednesday @test advancebdays("Brazil", Dates.Date(2013, 02, 06), 4) == Dates.Date(2013, 02, 14) # after carnaval thursday @test bdays(hc_brazil, Dates.Date(2013, 02, 06), Dates.Date(2013, 02, 06)) == Dates.Day(0) @test bdays(hc_brazil, Dates.Date(2013, 02, 06), Dates.Date(2013, 02, 07)) == Dates.Day(1) @test bdays(hc_brazil, Dates.Date(2013, 02, 07), Dates.Date(2013, 02, 06)).value == -1 @test bdays(hc_brazil, Dates.Date(2013, 02, 06), Dates.Date(2013, 02, 08)).value == 2 @test bdays(hc_brazil, Dates.Date(2013, 02, 08), Dates.Date(2013, 02, 06)).value == -2 @test bdays(hc_brazil, Dates.Date(2013, 02, 06), Dates.Date(2013, 02, 09)).value == 3 @test bdays(hc_brazil, Dates.Date(2013, 02, 06), Dates.Date(2013, 02, 10)).value == 3 @test bdays(hc_brazil, Dates.Date(2013, 02, 06), Dates.Date(2013, 02, 11)).value == 3 @test bdays(hc_brazil, Dates.Date(2013, 02, 06), Dates.Date(2013, 02, 12)).value == 3 @test bdays(hc_brazil, Dates.Date(2013, 02, 06), Dates.Date(2013, 02, 13)).value == 3 @test bdays(hc_brazil, Dates.Date(2013, 02, 06), Dates.Date(2013, 02, 14)).value == 4 @test bdays(hc_brazil, Dates.Date(2013, 02, 14), Dates.Date(2013, 02, 06)).value == -4 @test bdayscount(hc_brazil, Dates.Date(2013, 02, 06), Dates.Date(2013, 02, 06)) == 0 @test bdayscount(hc_brazil, Dates.Date(2013, 02, 06), Dates.Date(2013, 02, 07)) == 1 @test bdayscount(hc_brazil, Dates.Date(2013, 02, 07), Dates.Date(2013, 02, 06)) == -1 @test bdayscount(hc_brazil, Dates.Date(2013, 02, 06), Dates.Date(2013, 02, 08)) == 2 @test bdayscount(hc_brazil, Dates.Date(2013, 02, 08), Dates.Date(2013, 02, 06)) == -2 @test bdayscount(hc_brazil, Dates.Date(2013, 02, 06), Dates.Date(2013, 02, 09)) == 3 @test bdayscount(hc_brazil, Dates.Date(2013, 02, 06), Dates.Date(2013, 02, 10)) == 3 @test bdayscount(hc_brazil, Dates.Date(2013, 02, 06), Dates.Date(2013, 02, 11)) == 3 @test bdayscount(hc_brazil, Dates.Date(2013, 02, 06), Dates.Date(2013, 02, 12)) == 3 @test bdayscount(hc_brazil, Dates.Date(2013, 02, 06), Dates.Date(2013, 02, 13)) == 3 @test bdayscount(hc_brazil, Dates.Date(2013, 02, 06), Dates.Date(2013, 02, 14)) == 4 @test bdayscount(hc_brazil, Dates.Date(2013, 02, 14), Dates.Date(2013, 02, 06)) == -4 # Canada tsxdates16 = [ Dates.Date(2016, 1, 1), Dates.Date(2016, 2, 15), Dates.Date(2016, 3, 25), Dates.Date(2016, 5, 23), Dates.Date(2016, 7, 1), Dates.Date(2016, 8, 1), Dates.Date(2016, 9, 5), Dates.Date(2016, 10, 10), Dates.Date(2016, 12, 26), Dates.Date(2016, 12, 27) ] alldates16itr = Dates.Date(2016,1,1):Dates.Day(1):Dates.Date(2016, 12,31) for i in alldates16itr if i ∈ tsxdates16 @test isholiday(hc_canadatsx, i) == true else @test isholiday(hc_canadatsx, i) == false end end tsxdates17 = [ Dates.Date(2017, 1, 2), Dates.Date(2017, 2, 20), Dates.Date(2017, 4, 14), Dates.Date(2017, 5, 22), Dates.Date(2017, 7, 3), Dates.Date(2017, 8, 7), Dates.Date(2017, 9, 4), Dates.Date(2017, 10, 9), Dates.Date(2017, 12, 25), Dates.Date(2017, 12, 26) ] alldates17itr = Dates.Date(2017,1,1):Dates.Day(1):Dates.Date(2017, 12,31) for i in alldates17itr if i ∈ tsxdates17 @test isholiday(hc_canadatsx, i) == true else @test isholiday(hc_canadatsx, i) == false end end canadaDates16 = [ Dates.Date(2016, 1, 1), Dates.Date(2016, 2, 15), Dates.Date(2016, 3, 25), Dates.Date(2016, 5, 23), Dates.Date(2016, 7, 1), Dates.Date(2016, 8, 1), Dates.Date(2016, 9, 5), Dates.Date(2016, 10, 10), Dates.Date(2016, 11, 11), Dates.Date(2016, 12, 26), Dates.Date(2016, 12, 27) ] for i in alldates16itr if i ∈ canadaDates16 @test isholiday(hc_canada, i) == true else @test isholiday(hc_canada, i) == false end end # Issue #8 @test isbday(hc_canada, Dates.Date(2011,1,3)) == true @test isbday(hc_canadatsx, Dates.Date(2008, 2, 18)) == false # Australian Stock Exchange (ASX) christmasday = Dates.Date(2018,12, 25) boxingday = Dates.Date(2018,12, 26) asxdates18 = Set([ Dates.Date(2018, 1, 1), # New year's day Dates.Date(2018, 1, 26), # Australia Day Dates.Date(2018, 3, 30), # Good Friday Dates.Date(2018, 4, 2), # Easter Monday Dates.Date(2018, 4, 25), # ANZAC Day Dates.Date(2018, 6, 11), # Queen's Birthday holiday christmasday, boxingday ]) for dt in Dates.Date(2018,1,1):Dates.Day(1):Dates.Date(2018,12,31) if in(dt, asxdates18) dt != boxingday && @test isholiday(hc_australiaasx, dt-Dates.Day(1)) == false @test isholiday(hc_australiaasx, dt) == true dt != christmasday && @test isholiday(hc_australiaasx, dt+Dates.Day(1)) == false else @test isholiday(hc_australiaasx, dt) == false end end # Australian states and territories @test_throws MethodError BusinessDays.Australia() # State/territory not specified @test_throws AssertionError BusinessDays.Australia(:XXX) # Invalid state/territory # The Australian Capital Territory (ACT) actdates18 = Set([ Dates.Date(2018, 1, 1), # New year's day Dates.Date(2018, 1, 26), # Australia Day Dates.Date(2018, 3, 12), # Canberra Day Dates.Date(2018, 3, 30), # Good Friday Dates.Date(2018, 3, 31), # Easter Saturday Dates.Date(2018, 4, 1), # Easter Sunday Dates.Date(2018, 4, 2), # Easter Monday Dates.Date(2018, 4, 25), # ANZAC Day Dates.Date(2018, 5, 28), # Reconciliation Day Dates.Date(2018, 6, 11), # Queen's Birthday holiday Dates.Date(2018, 10, 1), # Labour Day christmasday, boxingday ]) for dt in Dates.Date(2018,1,1):Dates.Day(1):Dates.Date(2018,12,31) if in(dt, actdates18) @test isholiday(hc_australiaact, dt) == true else @test isholiday(hc_australiaact, dt) == false end end # The state of New South Wales (NSW), Australia nswdates18 = Set([ Dates.Date(2018, 1, 1), # New year's day Dates.Date(2018, 1, 26), # Australia Day Dates.Date(2018, 3, 30), # Good Friday Dates.Date(2018, 3, 31), # Easter Saturday Dates.Date(2018, 4, 1), # Easter Sunday Dates.Date(2018, 4, 2), # Easter Monday Dates.Date(2018, 4, 25), # ANZAC Day Dates.Date(2018, 6, 11), # Queen's Birthday holiday Dates.Date(2018, 8, 6), # Bank Holiday Dates.Date(2018, 10, 1), # Labour Day christmasday, boxingday ]) for dt in Dates.Date(2018,1,1):Dates.Day(1):Dates.Date(2018,12,31) if in(dt, nswdates18) @test isholiday(hc_australiansw, dt) == true else @test isholiday(hc_australiansw, dt) == false end end # The Northern Territory (NT), Australia ntdates18 = Set([ Dates.Date(2018, 1, 1), # New year's day Dates.Date(2018, 1, 26), # Australia Day Dates.Date(2018, 3, 30), # Good Friday Dates.Date(2018, 3, 31), # Easter Saturday Dates.Date(2018, 4, 2), # Easter Monday Dates.Date(2018, 4, 25), # ANZAC Day Dates.Date(2018, 5, 7), # May Day Dates.Date(2018, 6, 11), # Queen's Birthday holiday Dates.Date(2018, 8, 6), # Picnic Day christmasday, boxingday ]) for dt in Dates.Date(2018,1,1):Dates.Day(1):Dates.Date(2018,12,31) if in(dt, ntdates18) @test isholiday(hc_australiant, dt) == true else @test isholiday(hc_australiant, dt) == false end end # The state of Queensland (QLD), Australia qlddates18 = Set([ Dates.Date(2018, 1, 1), # New year's day Dates.Date(2018, 1, 26), # Australia Day Dates.Date(2018, 3, 30), # Good Friday Dates.Date(2018, 3, 31), # Easter Saturday Dates.Date(2018, 4, 1), # Easter Sunday Dates.Date(2018, 4, 2), # Easter Monday Dates.Date(2018, 4, 25), # ANZAC Day Dates.Date(2018, 5, 7), # Labour Day Dates.Date(2018, 8, 15), # Royal Brisbane Show (brisbane area only) Dates.Date(2018, 10, 1), # Queen's Birthday holiday christmasday, boxingday ]) for dt in Dates.Date(2018,1,1):Dates.Day(1):Dates.Date(2018,12,31) if in(dt, qlddates18) @test isholiday(hc_australiaqld, dt) == true else @test isholiday(hc_australiaqld, dt) == false end end @test isholiday(hc_australiaqld, Dates.Date(2013, 6, 10)) == true # The state of South Australia (SA) sadates18 = Set([ Dates.Date(2018, 1, 1), # New year's day Dates.Date(2018, 1, 26), # Australia Day Dates.Date(2018, 3, 12), # March public Holiday Dates.Date(2018, 3, 30), # Good Friday Dates.Date(2018, 3, 31), # Easter Saturday Dates.Date(2018, 4, 2), # Easter Monday Dates.Date(2018, 4, 25), # ANZAC Day Dates.Date(2018, 6, 11), # Queen's Birthday holiday Dates.Date(2018, 10, 1), # Labour Day christmasday, boxingday ]) for dt in Dates.Date(2018,1,1):Dates.Day(1):Dates.Date(2018,12,31) if in(dt, sadates18) @test isholiday(hc_australiasa, dt) == true else @test isholiday(hc_australiasa, dt) == false end end # The state of Tasmania (TAS), Australia tasdates18 = Set([ Dates.Date(2018, 1, 1), # New year's day Dates.Date(2018, 1, 26), # Australia Day Dates.Date(2018, 2, 12), # Royal Hobart Regatta Dates.Date(2018, 3, 12), # Labour Day Dates.Date(2018, 3, 30), # Good Friday Dates.Date(2018, 4, 2), # Easter Monday Dates.Date(2018, 4, 25), # ANZAC Day Dates.Date(2018, 6, 11), # Queen's Birthday holiday Dates.Date(2018, 11, 5), # Recreation Day christmasday, boxingday ]) for dt in Dates.Date(2018,1,1):Dates.Day(1):Dates.Date(2018,12,31) if in(dt, tasdates18) @test isholiday(hc_australiatas, dt) == true else @test isholiday(hc_australiatas, dt) == false end end # The state of Western Australia (WA) wadates18 = Set([ Dates.Date(2018, 1, 1), # New year's day Dates.Date(2018, 1, 26), # Australia Day Dates.Date(2018, 3, 5), # Labour Day Dates.Date(2018, 3, 30), # Good Friday Dates.Date(2018, 4, 2), # Easter Monday Dates.Date(2018, 4, 25), # ANZAC Day Dates.Date(2018, 6, 4), # Western Australia Day Dates.Date(2018, 9, 24), # Queen's Birthday holiday christmasday, boxingday ]) for dt in Dates.Date(2018,1,1):Dates.Day(1):Dates.Date(2018,12,31) if in(dt, wadates18) @test isholiday(hc_australiawa, dt) == true else @test isholiday(hc_australiawa, dt) == false end end @test isholiday(hc_australiawa, Dates.Date(2011, 10, 28)) == true @test isholiday(hc_australiawa, Dates.Date(2012, 10, 1)) == true # The state of Victoria (VIC), Australia vicdates18 = Set([ Dates.Date(2018, 1, 1), # New year's day Dates.Date(2018, 1, 26), # Australia Day Dates.Date(2018, 3, 12), # Labour Day Dates.Date(2018, 3, 30), # Good Friday Dates.Date(2018, 3, 31), # Easter Saturday Dates.Date(2018, 4, 1), # Easter Sunday Dates.Date(2018, 4, 2), # Easter Monday Dates.Date(2018, 4, 25), # ANZAC Day Dates.Date(2018, 6, 11), # Queen's Birthday holiday Dates.Date(2018, 9, 28), # AFL Grand Final Eve holiday Dates.Date(2018, 11, 6), # Melbourne Cup holiday christmasday, boxingday ]) for dt in Dates.Date(2018,1,1):Dates.Day(1):Dates.Date(2018,12,31) if in(dt, vicdates18) @test isholiday(hc_australiavic, dt) == true else @test isholiday(hc_australiavic, dt) == false end end # German holidays gerdates2020 = Set([ Dates.Date(2020, 1, 1), # New year's day Dates.Date(2020, 4, 10), # Good Friday Dates.Date(2020, 4, 13), # Easter Monday Dates.Date(2020, 5, 1), # International Workers' Day Dates.Date(2020, 5, 21), # Ascension Day Dates.Date(2020, 6, 1), # Pentecost Dates.Date(2020, 10, 3), # Day of German Unity Dates.Date(2020, 12, 25), # Christmas Day Dates.Date(2020, 12, 26) # Boxing Day ]) # German state holidays in different regions northdates2020 = Set([ Dates.Date(2020, 10, 31) # Reformation Day ]) westdates2020 = Set([ Dates.Date(2020, 6, 11), # Corpus Christi Dates.Date(2020, 11, 1) # All Saints Day ]) southdates2020 = Set([ Dates.Date(2020, 1, 6), # Epiphany Dates.Date(2020, 6, 11), # Corpus Christi Dates.Date(2020, 11, 1) # All Saints Day ]) assumption2020 = Set([ Dates.Date(2020, 8, 15) # Assumption of Mary ]) debedates2020 = Set([ Dates.Date(2020, 3, 8), # International Womens' Day Dates.Date(2020, 5, 8) # Tag der Befreiung ]) debbdates2020 = Set([ Dates.Date(2020, 4, 12), # Easter Sunday Dates.Date(2020, 5, 31), # Pentecost Sunday Dates.Date(2020, 10, 31) # Reformation Day ]) dehedates2020 = Set([ Dates.Date(2020, 6, 11) # Corpus Christi ]) desndates2020 = Set([ Dates.Date(2020, 10, 31), # Reformation Day Dates.Date(2020, 11, 18) # Day of rependance and prayer ]) destdates2020 = Set([ Dates.Date(2020, 1, 6), # Epiphany Dates.Date(2020, 10, 31) # Reformation Day ]) dethdates2020 = Set([ Dates.Date(2020, 9, 20), # World Children's Day Dates.Date(2020, 10, 31) # Reformation Day ]) # Test holidays in northern states of Bremen, Hamburg, # Mecklenburg-Vorpommern, Lower Saxony, and Schleswig-Holstein for dt in Dates.Date(2020):Dates.Day(1):Dates.Date(2020,12,31) if dt ∈ union(gerdates2020, northdates2020) @test isholiday(hc_de_hb, dt) == true @test isholiday(hc_de_hh, dt) == true @test isholiday(hc_de_mv, dt) == true @test isholiday(hc_de_ni, dt) == true @test isholiday(hc_de_sh, dt) == true else @test isholiday(hc_de_hb, dt) == false @test isholiday(hc_de_hh, dt) == false @test isholiday(hc_de_mv, dt) == false @test isholiday(hc_de_ni, dt) == false @test isholiday(hc_de_sh, dt) == false end end # Test holidays in western states of # North Rhine-Westphalia, Rhineland Palatinate, and Saarland for dt in Dates.Date(2020):Dates.Day(1):Dates.Date(2020,12,31) if dt ∈ union(gerdates2020, westdates2020) @test isholiday(hc_de_nw, dt) == true @test isholiday(hc_de_rp, dt) == true @test isholiday(hc_de_sl, dt) == true elseif dt ∈ assumption2020 @test isholiday(hc_de_nw, dt) == false @test isholiday(hc_de_rp, dt) == false @test isholiday(hc_de_sl, dt) == true else @test isholiday(hc_de_nw, dt) == false @test isholiday(hc_de_rp, dt) == false @test isholiday(hc_de_sl, dt) == false end end # Test holidays in southern states of Baden-Württemberg and Bavaria for dt in Dates.Date(2020):Dates.Day(1):Dates.Date(2020,12,31) if dt ∈ union(gerdates2020, southdates2020) @test isholiday(hc_de_bw, dt) == true @test isholiday(hc_de_by, dt) == true @test isholiday(hc_de_byp, dt) == true elseif dt ∈ assumption2020 @test isholiday(hc_de_bw, dt) == false @test isholiday(hc_de_by, dt) == true @test isholiday(hc_de_byp, dt) == false else @test isholiday(hc_de_bw, dt) == false @test isholiday(hc_de_by, dt) == false @test isholiday(hc_de_byp, dt) == false end end # Test holidays of Berlin for dt in Dates.Date(2020):Dates.Day(1):Dates.Date(2020,12,31) if dt ∈ union(gerdates2020, debedates2020) @test isholiday(hc_de_be, dt) == true else @test isholiday(hc_de_be, dt) == false end end # Test holidays of Brandenburg for dt in Dates.Date(2020):Dates.Day(1):Dates.Date(2020,12,31) if dt ∈ union(gerdates2020, debbdates2020) @test isholiday(hc_de_bb, dt) == true else @test isholiday(hc_de_bb, dt) == false end end # Test holidays of Hessen for dt in Dates.Date(2020):Dates.Day(1):Dates.Date(2020,12,31) if dt ∈ union(gerdates2020, dehedates2020) @test isholiday(hc_de_he, dt) == true else @test isholiday(hc_de_he, dt) == false end end # Test holidays of Saxony for dt in Dates.Date(2020):Dates.Day(1):Dates.Date(2020,12,31) if dt ∈ union(gerdates2020, desndates2020) @test isholiday(hc_de_sn, dt) == true else @test isholiday(hc_de_sn, dt) == false end end # Test holidays of Saxony-Anhalt for dt in Dates.Date(2020):Dates.Day(1):Dates.Date(2020,12,31) if dt ∈ union(gerdates2020, destdates2020) @test isholiday(hc_de_st, dt) == true else @test isholiday(hc_de_st, dt) == false end end # Test holidays of Thuringia for dt in Dates.Date(2020):Dates.Day(1):Dates.Date(2020,12,31) if dt ∈ union(gerdates2020, dethdates2020) @test isholiday(hc_de_th, dt) == true else @test isholiday(hc_de_th, dt) == false end end # dates are treated per value d0 = Dates.Date(2013, 02, 06) d1 = Dates.Date(2013, 02, 14) @test bdays(hc_brazil, d0, d1).value == 4 @test bdayscount(hc_brazil, d0, d1) == 4 @test d0 == Dates.Date(2013, 02, 06) # d0 is changed inside bdays function, but outer-scope value remains the same @test d1 == Dates.Date(2013, 02, 14) d0 = Dates.Date(2015, 06, 29) ; d2 = Dates.Date(2100, 12, 20) @test bdays(hc_brazil, d0, d2).value == 21416 @test bdayscount(hc_brazil, d0, d2) == 21416 # Tests for Composite Calendar @test isholiday(hc_composite_BR_USA, Dates.Date(2012,9,3)) # US Labor Day @test isholiday(hc_composite_BR_USA, Dates.Date(2012,9,7)) # BR Independence Day @test bdayscount(hc_composite_BR_USA, Dates.Date(2012,8,31), Dates.Date(2012,9,10)) == 4 # 3/sep labor day US, 7/sep Indep day BR @test bdays(hc_composite_BR_USA, Dates.Date(2012,8,31), Dates.Date(2012,9,10)) == Dates.Day(4) # 3/sep labor day US, 7/sep Indep day BR println("Timing composite calendar bdays calculation") @time bdays(hc_composite_BR_USA, Dates.Date(2012,8,31), Dates.Date(2012,9,10)) println("Timing single bdays calculation") @time bdays(hc_brazil, d0, d2) println("Timing 100 bdays calculations") @time for i in 1:100 bdays(hc_brazil, d0, d2) end dInicio = Dates.Date(1950, 01, 01) ; dFim = Dates.Date(2100, 12, 20) if !usecache println("Timing cache creation") @time x = BusinessDays._create_bdays_cache_arrays(hc_brazil, dInicio, dFim) end if usecache println("a million...") @time for i in 1:1000000 bdays(hc_brazil, d0, d2) end end # Vector functions d0 = Dates.Date(2000,01,04) d1 = Dates.Date(2020,01,04) d1vec = collect(d0:Dates.Day(1):d1) d0vec = fill(d0, length(d1vec)) r = bdays(hc_brazil, d0vec, d1vec) b = isbday(hc_brazil, d0vec) b2 = isbday(:Brazil, d0vec) b3 = isbday("Brazil", d0vec) @test tobday([hc_brazil, hc_usa], [Dates.Date(2015,11,11), Dates.Date(2015,11,11)]) == [Dates.Date(2015,11,11), Dates.Date(2015,11,12)] @test tobday([:Brazil, :USSettlement], [Dates.Date(2015,11,11), Dates.Date(2015,11,11)]) == [Dates.Date(2015,11,11), Dates.Date(2015,11,12)] @test tobday(["Brazil", "USSettlement"], [Dates.Date(2015,11,11), Dates.Date(2015,11,11)]) == [Dates.Date(2015,11,11), Dates.Date(2015,11,12)] # Vector with different sizes @test_throws AssertionError bdays(hc_brazil, fill(d0, length(d1vec)+1), d1vec) @test_throws AssertionError bdays([hc_brazil, hc_usa], [Dates.Date(2015,11,11)], [Dates.Date(2015,11,11),Dates.Date(2015,11,11)]) @test_throws AssertionError tobday([hc_brazil, hc_usa], fill(d0, 3)) @test_throws AssertionError isbday( [hc_brazil, hc_usa, hc_uk], [Dates.Date(2015,01,01), Dates.Date(2015,01,01)]) println("Timing vectorized functions (vector length $(length(d0vec)))") @time bdays(hc_brazil, d0vec, d1vec) @time bdays(:Brazil, d0vec, d1vec) @time bdays("Brazil", d0vec, d1vec) @time isbday(hc_brazil, d0vec) @time isbday(:Brazil, d0vec) @time isbday("Brazil", d0vec) d2001 = collect(Dates.Date(2001,01,01):Dates.Day(1):Dates.Date(2001,01,15)) @test isweekend(d2001) == [ false, false, false, false, false, true, true, false, false, false, false, false, true, true, false] @test tobday(hc_brazil, d2001; forward=true) == [ Dates.Date(2001,01,02), Dates.Date(2001,01,02), Dates.Date(2001,01,03), Dates.Date(2001,01,04), Dates.Date(2001,01,05), Dates.Date(2001,01,08), Dates.Date(2001,01,08), Dates.Date(2001,01,08), Dates.Date(2001,01,09), Dates.Date(2001,01,10), Dates.Date(2001,01,11), Dates.Date(2001,01,12), Dates.Date(2001,01,15), Dates.Date(2001,01,15), Dates.Date(2001,01,15)] @test tobday(hc_brazil, d2001; forward=false) == [ Dates.Date(2000,12,29), Dates.Date(2001,01,02), Dates.Date(2001,01,03), Dates.Date(2001,01,04), Dates.Date(2001,01,05), Dates.Date(2001,01,05), Dates.Date(2001,01,05), Dates.Date(2001,01,08), Dates.Date(2001,01,09), Dates.Date(2001,01,10), Dates.Date(2001,01,11), Dates.Date(2001,01,12), Dates.Date(2001,01,12), Dates.Date(2001,01,12), Dates.Date(2001,01,15)] @test tobday(:Brazil, d2001; forward=true) == [ Dates.Date(2001,01,02), Dates.Date(2001,01,02), Dates.Date(2001,01,03), Dates.Date(2001,01,04), Dates.Date(2001,01,05), Dates.Date(2001,01,08), Dates.Date(2001,01,08), Dates.Date(2001,01,08), Dates.Date(2001,01,09), Dates.Date(2001,01,10), Dates.Date(2001,01,11), Dates.Date(2001,01,12), Dates.Date(2001,01,15), Dates.Date(2001,01,15), Dates.Date(2001,01,15)] @test tobday("Brazil", d2001; forward=false) == [ Dates.Date(2000,12,29), Dates.Date(2001,01,02), Dates.Date(2001,01,03), Dates.Date(2001,01,04), Dates.Date(2001,01,05), Dates.Date(2001,01,05), Dates.Date(2001,01,05), Dates.Date(2001,01,08), Dates.Date(2001,01,09), Dates.Date(2001,01,10), Dates.Date(2001,01,11), Dates.Date(2001,01,12), Dates.Date(2001,01,12), Dates.Date(2001,01,12), Dates.Date(2001,01,15)] @test bdays([hc_brazil, hc_usa], [Dates.Date(2012,8,31), Dates.Date(2012,8,31)], [Dates.Date(2012,9,10), Dates.Date(2012,9,10)]) == [Dates.Day(5), Dates.Day(5)] # 1/sep labor day US, 7/sep Indep day BR @test bdayscount([hc_brazil, hc_usa], [Dates.Date(2012,8,31), Dates.Date(2012,8,31)], [Dates.Date(2012,9,10), Dates.Date(2012,9,10)]) == [5, 5] # 1/sep labor day US, 7/sep Indep day BR @test isbday([hc_brazil, hc_usa], [Dates.Date(2012, 09, 07), Dates.Date(2012, 09, 03)]) == [false, false] # 1/sep labor day US, 7/sep Indep day BR @test advancebdays(hc_brazil, Dates.Date(2015,9,1), [0, 1, 3, 4, 5]) == [Dates.Date(2015,9,1),Dates.Date(2015,9,2),Dates.Date(2015,9,4),Dates.Date(2015,9,8),Dates.Date(2015,9,9)] @test advancebdays(hc_brazil, Dates.Date(2015,9,1), 0:1:5) == [Dates.Date(2015,9,1),Dates.Date(2015,9,2),Dates.Date(2015,9,3),Dates.Date(2015,9,4),Dates.Date(2015,9,8),Dates.Date(2015,9,9)] @test listholidays(hc_brazil, Dates.Date(2016,1,1), Dates.Date(2016,5,30)) == [Dates.Date(2016,1,1),Dates.Date(2016,2,8),Dates.Date(2016,2,9),Dates.Date(2016,3,25),Dates.Date(2016,4,21),Dates.Date(2016,5,1),Dates.Date(2016,5,26)] @test bdays([:Brazil, :USSettlement], [Dates.Date(2012,8,31), Dates.Date(2012,8,31)], [Dates.Date(2012,9,10), Dates.Date(2012,9,10)]) == [Dates.Day(5), Dates.Day(5)] # 1/sep labor day US, 7/sep Indep day BR @test bdayscount([:Brazil, :USSettlement], [Dates.Date(2012,8,31), Dates.Date(2012,8,31)], [Dates.Date(2012,9,10), Dates.Date(2012,9,10)]) == [5, 5] # 1/sep labor day US, 7/sep Indep day BR @test isbday([:Brazil, :USSettlement], [Dates.Date(2012, 09, 07), Dates.Date(2012, 09, 03)]) == [false, false] # 1/sep labor day US, 7/sep Indep day BR @test advancebdays(:Brazil, Dates.Date(2015,9,1), [0, 1, 3, 4, 5]) == [Dates.Date(2015,9,1),Dates.Date(2015,9,2),Dates.Date(2015,9,4),Dates.Date(2015,9,8),Dates.Date(2015,9,9)] @test advancebdays(:Brazil, Dates.Date(2015,9,1), 0:5) == [Dates.Date(2015,9,1),Dates.Date(2015,9,2),Dates.Date(2015,9,3),Dates.Date(2015,9,4),Dates.Date(2015,9,8),Dates.Date(2015,9,9)] @test listholidays(:Brazil, Dates.Date(2016,1,1), Dates.Date(2016,5,30)) == [Dates.Date(2016,1,1),Dates.Date(2016,2,8),Dates.Date(2016,2,9),Dates.Date(2016,3,25),Dates.Date(2016,4,21),Dates.Date(2016,5,1),Dates.Date(2016,5,26)] @test bdays(["Brazil", "USSettlement"], [Dates.Date(2012,8,31), Dates.Date(2012,8,31)], [Dates.Date(2012,9,10), Dates.Date(2012,9,10)]) == [Dates.Day(5), Dates.Day(5)] # 1/sep labor day US, 7/sep Indep day BR @test bdayscount(["Brazil", "USSettlement"], [Dates.Date(2012,8,31), Dates.Date(2012,8,31)], [Dates.Date(2012,9,10), Dates.Date(2012,9,10)]) == [5, 5] # 1/sep labor day US, 7/sep Indep day BR @test isbday(["Brazil", "USSettlement"], [Dates.Date(2012, 09, 07), Dates.Date(2012, 09, 03)]) == [false, false] # 1/sep labor day US, 7/sep Indep day BR @test advancebdays("Brazil", Dates.Date(2015,9,1), [0, 1, 3, 4, 5]) == [Dates.Date(2015,9,1),Dates.Date(2015,9,2),Dates.Date(2015,9,4),Dates.Date(2015,9,8),Dates.Date(2015,9,9)] @test advancebdays("Brazil", Dates.Date(2015,9,1), 0:5) == [Dates.Date(2015,9,1),Dates.Date(2015,9,2),Dates.Date(2015,9,3),Dates.Date(2015,9,4),Dates.Date(2015,9,8),Dates.Date(2015,9,9)] @test listholidays("Brazil", Dates.Date(2016,1,1), Dates.Date(2016,5,30)) == [Dates.Date(2016,1,1),Dates.Date(2016,2,8),Dates.Date(2016,2,9),Dates.Date(2016,3,25),Dates.Date(2016,4,21),Dates.Date(2016,5,1),Dates.Date(2016,5,26)] @test listbdays("Brazil", Dates.Date(2016,5,18), Dates.Date(2016,5,23)) == [Dates.Date(2016,5,18),Dates.Date(2016,5,19),Dates.Date(2016,5,20),Dates.Date(2016,5,23)] @test listbdays("Brazil", Dates.Date(2016,5,18), Dates.Date(2016,5,18)) == [Dates.Date(2016,5,18)] @test isempty(listbdays("Brazil", Dates.Date(2016,5,21), Dates.Date(2016,5,21))) @test isempty(listbdays("Brazil", Dates.Date(2016,5,21), Dates.Date(2016,5,22))) @test listbdays("Brazil", Dates.Date(2016,5,21), Dates.Date(2016,5,23)) == [Dates.Date(2016,5,23)] @test bdays(:Brazil, Dates.Date(2016,11,1), [Dates.Date(2016,11,3), Dates.Date(2016,11,7), Dates.Date(2016,11,4)]) == [Dates.Day(1), Dates.Day(3), Dates.Day(2)] @test bdayscount(:Brazil, Dates.Date(2016,11,1), [Dates.Date(2016,11,3), Dates.Date(2016,11,7), Dates.Date(2016,11,4)]) == [1, 3, 2] @test bdays(:Brazil, [Dates.Date(2018,3,28), Dates.Date(2018,3,28), Dates.Date(2018,3,28), Dates.Date(2018,3,28)], [Dates.Date(2018,3,28), Dates.Date(2018,3,29), Dates.Date(2018,3,30), Dates.Date(2018,4,2)]) == [Dates.Day(0), Dates.Day(1), Dates.Day(2), Dates.Day(2)] @test bdayscount(:Brazil, [Dates.Date(2018,3,28), Dates.Date(2018,3,28), Dates.Date(2018,3,28), Dates.Date(2018,3,28)], [Dates.Date(2018,3,28), Dates.Date(2018,3,29), Dates.Date(2018,3,30), Dates.Date(2018,4,2)]) == [0, 1, 2, 2] # first/last businessdayofmonth @test firstbdayofmonth(:Brazil, Dates.Date(2017,12,10)) == Dates.Date(2017,12,1) @test firstbdayofmonth(:Brazil, Dates.Date(2018,1,10)) == Dates.Date(2018,1,2) @test lastbdayofmonth(:Brazil, Dates.Date(2017,12,10)) == Dates.Date(2017,12,29) @test firstbdayofmonth(:Brazil, 2017, 12) == Dates.Date(2017,12,1) @test firstbdayofmonth(:Brazil, 2018, 1) == Dates.Date(2018,1,2) @test lastbdayofmonth(:Brazil, 2017, 12) == Dates.Date(2017,12,29) @test firstbdayofmonth(:Brazil, Dates.Year(2017), Dates.Month(12)) == Dates.Date(2017,12,1) @test firstbdayofmonth(:Brazil, Dates.Year(2018), Dates.Month(1)) == Dates.Date(2018,1,2) @test lastbdayofmonth(:Brazil, Dates.Year(2017), Dates.Month(12)) == Dates.Date(2017,12,29) # list holidays for all available calendars for c in [:BRSettlement, :BrazilExchange, :USNYSE, :USGovernmentBond, :USSettlement, :CanadaTSX, :CanadaSettlement, :EuroZone, :UKSettlement, :AustraliaASX] listholidays(c, Dates.Date(1990,1,1), Dates.Date(2100,1,1)) end
BusinessDays
https://github.com/JuliaFinance/BusinessDays.jl.git
[ "MIT" ]
0.9.24
00e206895eb8d0350149dff15ba1ea9a8f306c63
code
308
using BusinessDays struct CustomCalendar <: HolidayCalendar end BusinessDays.isholiday(::CustomCalendar, dt::Dates.Date) = dt == Dates.Date(2015,8,27) cc = CustomCalendar() println("Returns false: $(isholiday(cc, Dates.Date(2015,8,26)))") println("Returns true: $(isholiday(cc, Dates.Date(2015,8,27)))")
BusinessDays
https://github.com/JuliaFinance/BusinessDays.jl.git
[ "MIT" ]
0.9.24
00e206895eb8d0350149dff15ba1ea9a8f306c63
code
837
using BusinessDays using Test # lets find out minimum and maximum month for easter function find_easter_min_max() y = 1582 easter_1582 = BusinessDays.easter_date(Dates.Year(1582)) month_min = Dates.month(easter_1582) date_min = easter_1582 month_max = Dates.month(easter_1582) date_max = easter_1582 while y <= 2100 e_date = BusinessDays.easter_date(Dates.Year(y)) m_e_date = Dates.month(e_date) month_min = min(month_min, m_e_date) month_max = max(month_max, m_e_date) if month_min == m_e_date date_min = e_date end if month_max == m_e_date date_max = e_date end y += 1 end return (month_min, month_max) end month_min, month_max = find_easter_min_max() @test month_min >= 3 @test month_max <= 4
BusinessDays
https://github.com/JuliaFinance/BusinessDays.jl.git
[ "MIT" ]
0.9.24
00e206895eb8d0350149dff15ba1ea9a8f306c63
code
13899
################ ## Easter Dates ################ # easter dates from http://www.maa.mhn.de/StarDate/publ_holidays.html @test_throws ErrorException BusinessDays.easter_date(Dates.Year(1100)) @test BusinessDays.easter_date(Dates.Year(1901)) == Dates.Date(1901, 04, 07) @test BusinessDays.easter_date(Dates.Year(1902)) == Dates.Date(1902, 03, 30) @test BusinessDays.easter_date(Dates.Year(1903)) == Dates.Date(1903, 04, 12) @test BusinessDays.easter_date(Dates.Year(1904)) == Dates.Date(1904, 04, 03) @test BusinessDays.easter_date(Dates.Year(1905)) == Dates.Date(1905, 04, 23) @test BusinessDays.easter_date(Dates.Year(1906)) == Dates.Date(1906, 04, 15) @test BusinessDays.easter_date(Dates.Year(1907)) == Dates.Date(1907, 03, 31) @test BusinessDays.easter_date(Dates.Year(1908)) == Dates.Date(1908, 04, 19) @test BusinessDays.easter_date(Dates.Year(1909)) == Dates.Date(1909, 04, 11) @test BusinessDays.easter_date(Dates.Year(1910)) == Dates.Date(1910, 03, 27) @test BusinessDays.easter_date(Dates.Year(1911)) == Dates.Date(1911, 04, 16) @test BusinessDays.easter_date(Dates.Year(1912)) == Dates.Date(1912, 04, 07) @test BusinessDays.easter_date(Dates.Year(1913)) == Dates.Date(1913, 03, 23) @test BusinessDays.easter_date(Dates.Year(1914)) == Dates.Date(1914, 04, 12) @test BusinessDays.easter_date(Dates.Year(1915)) == Dates.Date(1915, 04, 04) @test BusinessDays.easter_date(Dates.Year(1916)) == Dates.Date(1916, 04, 23) @test BusinessDays.easter_date(Dates.Year(1917)) == Dates.Date(1917, 04, 08) @test BusinessDays.easter_date(Dates.Year(1918)) == Dates.Date(1918, 03, 31) @test BusinessDays.easter_date(Dates.Year(1919)) == Dates.Date(1919, 04, 20) @test BusinessDays.easter_date(Dates.Year(1920)) == Dates.Date(1920, 04, 04) @test BusinessDays.easter_date(Dates.Year(1921)) == Dates.Date(1921, 03, 27) @test BusinessDays.easter_date(Dates.Year(1922)) == Dates.Date(1922, 04, 16) @test BusinessDays.easter_date(Dates.Year(1923)) == Dates.Date(1923, 04, 01) @test BusinessDays.easter_date(Dates.Year(1924)) == Dates.Date(1924, 04, 20) @test BusinessDays.easter_date(Dates.Year(1925)) == Dates.Date(1925, 04, 12) @test BusinessDays.easter_date(Dates.Year(1926)) == Dates.Date(1926, 04, 04) @test BusinessDays.easter_date(Dates.Year(1927)) == Dates.Date(1927, 04, 17) @test BusinessDays.easter_date(Dates.Year(1928)) == Dates.Date(1928, 04, 08) @test BusinessDays.easter_date(Dates.Year(1929)) == Dates.Date(1929, 03, 31) @test BusinessDays.easter_date(Dates.Year(1930)) == Dates.Date(1930, 04, 20) @test BusinessDays.easter_date(Dates.Year(1931)) == Dates.Date(1931, 04, 05) @test BusinessDays.easter_date(Dates.Year(1932)) == Dates.Date(1932, 03, 27) @test BusinessDays.easter_date(Dates.Year(1933)) == Dates.Date(1933, 04, 16) @test BusinessDays.easter_date(Dates.Year(1934)) == Dates.Date(1934, 04, 01) @test BusinessDays.easter_date(Dates.Year(1935)) == Dates.Date(1935, 04, 21) @test BusinessDays.easter_date(Dates.Year(1936)) == Dates.Date(1936, 04, 12) @test BusinessDays.easter_date(Dates.Year(1937)) == Dates.Date(1937, 03, 28) @test BusinessDays.easter_date(Dates.Year(1938)) == Dates.Date(1938, 04, 17) @test BusinessDays.easter_date(Dates.Year(1939)) == Dates.Date(1939, 04, 09) @test BusinessDays.easter_date(Dates.Year(1940)) == Dates.Date(1940, 03, 24) @test BusinessDays.easter_date(Dates.Year(1941)) == Dates.Date(1941, 04, 13) @test BusinessDays.easter_date(Dates.Year(1942)) == Dates.Date(1942, 04, 05) @test BusinessDays.easter_date(Dates.Year(1943)) == Dates.Date(1943, 04, 25) @test BusinessDays.easter_date(Dates.Year(1944)) == Dates.Date(1944, 04, 09) @test BusinessDays.easter_date(Dates.Year(1945)) == Dates.Date(1945, 04, 01) @test BusinessDays.easter_date(Dates.Year(1946)) == Dates.Date(1946, 04, 21) @test BusinessDays.easter_date(Dates.Year(1947)) == Dates.Date(1947, 04, 06) @test BusinessDays.easter_date(Dates.Year(1948)) == Dates.Date(1948, 03, 28) @test BusinessDays.easter_date(Dates.Year(1949)) == Dates.Date(1949, 04, 17) @test BusinessDays.easter_date(Dates.Year(1950)) == Dates.Date(1950, 04, 09) @test BusinessDays.easter_date(Dates.Year(1951)) == Dates.Date(1951, 03, 25) @test BusinessDays.easter_date(Dates.Year(1952)) == Dates.Date(1952, 04, 13) @test BusinessDays.easter_date(Dates.Year(1953)) == Dates.Date(1953, 04, 05) @test BusinessDays.easter_date(Dates.Year(1954)) == Dates.Date(1954, 04, 18) @test BusinessDays.easter_date(Dates.Year(1955)) == Dates.Date(1955, 04, 10) @test BusinessDays.easter_date(Dates.Year(1956)) == Dates.Date(1956, 04, 01) @test BusinessDays.easter_date(Dates.Year(1957)) == Dates.Date(1957, 04, 21) @test BusinessDays.easter_date(Dates.Year(1958)) == Dates.Date(1958, 04, 06) @test BusinessDays.easter_date(Dates.Year(1959)) == Dates.Date(1959, 03, 29) @test BusinessDays.easter_date(Dates.Year(1960)) == Dates.Date(1960, 04, 17) @test BusinessDays.easter_date(Dates.Year(1961)) == Dates.Date(1961, 04, 02) @test BusinessDays.easter_date(Dates.Year(1962)) == Dates.Date(1962, 04, 22) @test BusinessDays.easter_date(Dates.Year(1963)) == Dates.Date(1963, 04, 14) @test BusinessDays.easter_date(Dates.Year(1964)) == Dates.Date(1964, 03, 29) @test BusinessDays.easter_date(Dates.Year(1965)) == Dates.Date(1965, 04, 18) @test BusinessDays.easter_date(Dates.Year(1966)) == Dates.Date(1966, 04, 10) @test BusinessDays.easter_date(Dates.Year(1967)) == Dates.Date(1967, 03, 26) @test BusinessDays.easter_date(Dates.Year(1968)) == Dates.Date(1968, 04, 14) @test BusinessDays.easter_date(Dates.Year(1969)) == Dates.Date(1969, 04, 06) @test BusinessDays.easter_date(Dates.Year(1970)) == Dates.Date(1970, 03, 29) @test BusinessDays.easter_date(Dates.Year(1971)) == Dates.Date(1971, 04, 11) @test BusinessDays.easter_date(Dates.Year(1972)) == Dates.Date(1972, 04, 02) @test BusinessDays.easter_date(Dates.Year(1973)) == Dates.Date(1973, 04, 22) @test BusinessDays.easter_date(Dates.Year(1974)) == Dates.Date(1974, 04, 14) @test BusinessDays.easter_date(Dates.Year(1975)) == Dates.Date(1975, 03, 30) @test BusinessDays.easter_date(Dates.Year(1976)) == Dates.Date(1976, 04, 18) @test BusinessDays.easter_date(Dates.Year(1977)) == Dates.Date(1977, 04, 10) @test BusinessDays.easter_date(Dates.Year(1978)) == Dates.Date(1978, 03, 26) @test BusinessDays.easter_date(Dates.Year(1979)) == Dates.Date(1979, 04, 15) @test BusinessDays.easter_date(Dates.Year(1980)) == Dates.Date(1980, 04, 06) @test BusinessDays.easter_date(Dates.Year(1981)) == Dates.Date(1981, 04, 19) @test BusinessDays.easter_date(Dates.Year(1982)) == Dates.Date(1982, 04, 11) @test BusinessDays.easter_date(Dates.Year(1983)) == Dates.Date(1983, 04, 03) @test BusinessDays.easter_date(Dates.Year(1984)) == Dates.Date(1984, 04, 22) @test BusinessDays.easter_date(Dates.Year(1985)) == Dates.Date(1985, 04, 07) @test BusinessDays.easter_date(Dates.Year(1986)) == Dates.Date(1986, 03, 30) @test BusinessDays.easter_date(Dates.Year(1987)) == Dates.Date(1987, 04, 19) @test BusinessDays.easter_date(Dates.Year(1988)) == Dates.Date(1988, 04, 03) @test BusinessDays.easter_date(Dates.Year(1989)) == Dates.Date(1989, 03, 26) @test BusinessDays.easter_date(Dates.Year(1990)) == Dates.Date(1990, 04, 15) @test BusinessDays.easter_date(Dates.Year(1991)) == Dates.Date(1991, 03, 31) @test BusinessDays.easter_date(Dates.Year(1992)) == Dates.Date(1992, 04, 19) @test BusinessDays.easter_date(Dates.Year(1993)) == Dates.Date(1993, 04, 11) @test BusinessDays.easter_date(Dates.Year(1994)) == Dates.Date(1994, 04, 03) @test BusinessDays.easter_date(Dates.Year(1995)) == Dates.Date(1995, 04, 16) @test BusinessDays.easter_date(Dates.Year(1996)) == Dates.Date(1996, 04, 07) @test BusinessDays.easter_date(Dates.Year(1997)) == Dates.Date(1997, 03, 30) @test BusinessDays.easter_date(Dates.Year(1998)) == Dates.Date(1998, 04, 12) @test BusinessDays.easter_date(Dates.Year(1999)) == Dates.Date(1999, 04, 04) @test BusinessDays.easter_date(Dates.Year(2000)) == Dates.Date(2000, 04, 23) @test BusinessDays.easter_date(Dates.Year(2001)) == Dates.Date(2001, 04, 15) @test BusinessDays.easter_date(Dates.Year(2002)) == Dates.Date(2002, 03, 31) @test BusinessDays.easter_date(Dates.Year(2003)) == Dates.Date(2003, 04, 20) @test BusinessDays.easter_date(Dates.Year(2004)) == Dates.Date(2004, 04, 11) @test BusinessDays.easter_date(Dates.Year(2005)) == Dates.Date(2005, 03, 27) @test BusinessDays.easter_date(Dates.Year(2006)) == Dates.Date(2006, 04, 16) @test BusinessDays.easter_date(Dates.Year(2007)) == Dates.Date(2007, 04, 08) @test BusinessDays.easter_date(Dates.Year(2008)) == Dates.Date(2008, 03, 23) @test BusinessDays.easter_date(Dates.Year(2009)) == Dates.Date(2009, 04, 12) @test BusinessDays.easter_date(Dates.Year(2010)) == Dates.Date(2010, 04, 04) @test BusinessDays.easter_date(Dates.Year(2011)) == Dates.Date(2011, 04, 24) @test BusinessDays.easter_date(Dates.Year(2012)) == Dates.Date(2012, 04, 08) @test BusinessDays.easter_date(Dates.Year(2013)) == Dates.Date(2013, 03, 31) @test BusinessDays.easter_date(Dates.Year(2014)) == Dates.Date(2014, 04, 20) @test BusinessDays.easter_date(Dates.Year(2015)) == Dates.Date(2015, 04, 05) @test BusinessDays.easter_date(Dates.Year(2016)) == Dates.Date(2016, 03, 27) @test BusinessDays.easter_date(Dates.Year(2017)) == Dates.Date(2017, 04, 16) @test BusinessDays.easter_date(Dates.Year(2018)) == Dates.Date(2018, 04, 01) @test BusinessDays.easter_date(Dates.Year(2019)) == Dates.Date(2019, 04, 21) @test BusinessDays.easter_date(Dates.Year(2020)) == Dates.Date(2020, 04, 12) @test BusinessDays.easter_date(Dates.Year(2021)) == Dates.Date(2021, 04, 04) @test BusinessDays.easter_date(Dates.Year(2022)) == Dates.Date(2022, 04, 17) @test BusinessDays.easter_date(Dates.Year(2023)) == Dates.Date(2023, 04, 09) @test BusinessDays.easter_date(Dates.Year(2024)) == Dates.Date(2024, 03, 31) @test BusinessDays.easter_date(Dates.Year(2025)) == Dates.Date(2025, 04, 20) @test BusinessDays.easter_date(Dates.Year(2026)) == Dates.Date(2026, 04, 05) @test BusinessDays.easter_date(Dates.Year(2027)) == Dates.Date(2027, 03, 28) @test BusinessDays.easter_date(Dates.Year(2028)) == Dates.Date(2028, 04, 16) @test BusinessDays.easter_date(Dates.Year(2029)) == Dates.Date(2029, 04, 01) @test BusinessDays.easter_date(Dates.Year(2030)) == Dates.Date(2030, 04, 21) @test BusinessDays.easter_date(Dates.Year(2031)) == Dates.Date(2031, 04, 13) @test BusinessDays.easter_date(Dates.Year(2032)) == Dates.Date(2032, 03, 28) @test BusinessDays.easter_date(Dates.Year(2033)) == Dates.Date(2033, 04, 17) @test BusinessDays.easter_date(Dates.Year(2034)) == Dates.Date(2034, 04, 09) @test BusinessDays.easter_date(Dates.Year(2035)) == Dates.Date(2035, 03, 25) @test BusinessDays.easter_date(Dates.Year(2036)) == Dates.Date(2036, 04, 13) @test BusinessDays.easter_date(Dates.Year(2037)) == Dates.Date(2037, 04, 05) @test BusinessDays.easter_date(Dates.Year(2038)) == Dates.Date(2038, 04, 25) @test BusinessDays.easter_date(Dates.Year(2039)) == Dates.Date(2039, 04, 10) @test BusinessDays.easter_date(Dates.Year(2040)) == Dates.Date(2040, 04, 01) @test BusinessDays.easter_date(Dates.Year(2041)) == Dates.Date(2041, 04, 21) @test BusinessDays.easter_date(Dates.Year(2042)) == Dates.Date(2042, 04, 06) @test BusinessDays.easter_date(Dates.Year(2043)) == Dates.Date(2043, 03, 29) @test BusinessDays.easter_date(Dates.Year(2044)) == Dates.Date(2044, 04, 17) @test BusinessDays.easter_date(Dates.Year(2045)) == Dates.Date(2045, 04, 09) @test BusinessDays.easter_date(Dates.Year(2046)) == Dates.Date(2046, 03, 25) @test BusinessDays.easter_date(Dates.Year(2047)) == Dates.Date(2047, 04, 14) @test BusinessDays.easter_date(Dates.Year(2048)) == Dates.Date(2048, 04, 05) @test BusinessDays.easter_date(Dates.Year(2049)) == Dates.Date(2049, 04, 18) @test BusinessDays.easter_date(Dates.Year(2050)) == Dates.Date(2050, 04, 10) @test BusinessDays.easter_date(Dates.Year(2051)) == Dates.Date(2051, 04, 02) @test BusinessDays.easter_date(Dates.Year(2052)) == Dates.Date(2052, 04, 21) @test BusinessDays.easter_date(Dates.Year(2053)) == Dates.Date(2053, 04, 06) @test BusinessDays.easter_date(Dates.Year(2054)) == Dates.Date(2054, 03, 29) @test BusinessDays.easter_date(Dates.Year(2055)) == Dates.Date(2055, 04, 18) @test BusinessDays.easter_date(Dates.Year(2056)) == Dates.Date(2056, 04, 02) @test BusinessDays.easter_date(Dates.Year(2057)) == Dates.Date(2057, 04, 22) @test BusinessDays.easter_date(Dates.Year(2058)) == Dates.Date(2058, 04, 14) @test BusinessDays.easter_date(Dates.Year(2059)) == Dates.Date(2059, 03, 30) @test BusinessDays.easter_date(Dates.Year(2060)) == Dates.Date(2060, 04, 18) @test BusinessDays.easter_date(Dates.Year(2061)) == Dates.Date(2061, 04, 10) @test BusinessDays.easter_date(Dates.Year(2062)) == Dates.Date(2062, 03, 26) @test BusinessDays.easter_date(Dates.Year(2063)) == Dates.Date(2063, 04, 15) @test BusinessDays.easter_date(Dates.Year(2064)) == Dates.Date(2064, 04, 06) @test BusinessDays.easter_date(Dates.Year(2065)) == Dates.Date(2065, 03, 29) @test BusinessDays.easter_date(Dates.Year(2066)) == Dates.Date(2066, 04, 11) @test BusinessDays.easter_date(Dates.Year(2067)) == Dates.Date(2067, 04, 03) @test BusinessDays.easter_date(Dates.Year(2068)) == Dates.Date(2068, 04, 22) @test BusinessDays.easter_date(Dates.Year(2069)) == Dates.Date(2069, 04, 14) @test BusinessDays.easter_date(Dates.Year(2070)) == Dates.Date(2070, 03, 30) @test BusinessDays.easter_date(Dates.Year(2071)) == Dates.Date(2071, 04, 19) @test BusinessDays.easter_date(Dates.Year(2072)) == Dates.Date(2072, 04, 10) @test BusinessDays.easter_date(Dates.Year(2073)) == Dates.Date(2073, 03, 26) @test BusinessDays.easter_date(Dates.Year(2074)) == Dates.Date(2074, 04, 15) @test BusinessDays.easter_date(Dates.Year(2075)) == Dates.Date(2075, 04, 07) @test BusinessDays.easter_date(Dates.Year(2076)) == Dates.Date(2076, 04, 19) @test BusinessDays.easter_date(Dates.Year(2077)) == Dates.Date(2077, 04, 11) @test BusinessDays.easter_date(Dates.Year(2078)) == Dates.Date(2078, 04, 03)
BusinessDays
https://github.com/JuliaFinance/BusinessDays.jl.git
[ "MIT" ]
0.9.24
00e206895eb8d0350149dff15ba1ea9a8f306c63
code
257
using BusinessDays, Dates d0 = Date(2015, 06, 29) ; d1 = Date(2100, 12, 20) cal = BusinessDays.Brazil() @time BusinessDays.initcache(cal) bdays(cal, d0, d1) # force JIT compilation @time bdays(cal, d0, d1) @time for i in 1:1000000 bdays(cal, d0, d1) end
BusinessDays
https://github.com/JuliaFinance/BusinessDays.jl.git
[ "MIT" ]
0.9.24
00e206895eb8d0350149dff15ba1ea9a8f306c63
code
2330
using BusinessDays using Test import Dates println("Perftests") d0 = Dates.Date(2015, 06, 29) ; d1 = Dates.Date(2100, 12, 20) cal_type = BusinessDays.Brazil() cal_sym = :Brazil cal_str = "Brazil" @time BusinessDays.initcache(cal_type) bdays(cal_type, d0, d1) # force JIT compilation @time bdays(cal_type, d0, d1) @time for i in 1:1000 bdays(cal_type, d0, d1) end bdays(cal_sym, d0, d1) # force JIT compilation bdays(cal_str, d0, d1) # force JIT compilation @time for i in 1:1000 bdays(cal_sym, d0, d1) end @time for i in 1:1000 bdays(cal_str, d0, d1) end N = 1000 d0vec = fill(d0, N) d1vec = fill(d1, N) cal_type_vec = fill(cal_type, N) cal_sym_vec = fill(cal_sym, N) cal_str_vec = fill(cal_str, N) # Warmup BusinessDays.initcache(BusinessDays.Brazil()) BusinessDays.initcache("USSettlement") BusinessDays.initcache(:UKSettlement) BusinessDays.bdays(cal_type, d0, d1) BusinessDays.bdays(cal_sym, d0, d1) BusinessDays.bdays(cal_str, d0, d1) BusinessDays.bdays(cal_type, d0vec, d1vec) BusinessDays.bdays(cal_sym, d0vec, d1vec) BusinessDays.bdays(cal_str, d0vec, d1vec) BusinessDays.bdays(cal_type_vec, d0vec, d1vec) BusinessDays.bdays(cal_sym_vec, d0vec, d1vec) BusinessDays.bdays(cal_str_vec, d0vec, d1vec) println("type") @time for i in 1:1000 BusinessDays.bdays(cal_type, d0, d1) end @time BusinessDays.bdays(cal_type, d0vec, d1vec) @time BusinessDays.bdays(cal_type_vec, d0vec, d1vec) println("sym") @time for i in 1:1000 BusinessDays.bdays(cal_sym, d0, d1) end @time BusinessDays.bdays(cal_sym, d0vec, d1vec) @time BusinessDays.bdays(cal_sym_vec, d0vec, d1vec) println("str") @time for i in 1:1000 BusinessDays.bdays(cal_str, d0, d1) end @time BusinessDays.bdays(cal_str, d0vec, d1vec) @time BusinessDays.bdays(cal_str_vec, d0vec, d1vec) BusinessDays.cleancache() println("initcache type") @time BusinessDays.initcache(cal_type) BusinessDays.cleancache() println("initcache sym") @time BusinessDays.initcache(cal_sym) BusinessDays.cleancache() println("initcache str") @time BusinessDays.initcache(cal_str) BusinessDays.cleancache() let cal = BusinessDays.WeekendsOnly() println("WeekendsOnly no cache") @time for i in 1:1000 BusinessDays.bdays(cal, d0, d1) end BusinessDays.initcache(cal) println("WeekendsOnly cache enabled") @time for i in 1:1000 BusinessDays.bdays(cal, d0, d1) end end
BusinessDays
https://github.com/JuliaFinance/BusinessDays.jl.git
[ "MIT" ]
0.9.24
00e206895eb8d0350149dff15ba1ea9a8f306c63
code
13506
using BusinessDays using Test import Dates # Issue #18 Dict(Any[]) # Issue #30 # list holidays for all available calendars for c in [:BRSettlement, :BrazilExchange, :USNYSE, :USGovernmentBond, :USSettlement, :CanadaTSX, :CanadaSettlement, :EuroZone, :UKSettlement, :AustraliaASX] listholidays(c, Dates.Date(1900,1,1), Dates.Date(2100,1,1)) end # Types bhc = BusinessDays.Brazil() ushc = BusinessDays.USSettlement() ukhc = BusinessDays.UKSettlement() usnysehc = BusinessDays.USNYSE() usgovbondhc = BusinessDays.USGovernmentBond() targethc = BusinessDays.TARGET() hc_composite_BR_USA = CompositeHolidayCalendar([BusinessDays.Brazil(), BusinessDays.USSettlement()]) all_calendars_vec = [bhc, ushc, ukhc, hc_composite_BR_USA, usnysehc, usgovbondhc] # two different instances of the same HolidayCalendar subtype should be equal @test bhc == BusinessDays.Brazil() @test ushc == BusinessDays.USSettlement() @test bhc != ushc @test string(bhc) == "BusinessDays.BRSettlement" # Check typing system @test isa(bhc,HolidayCalendar) @test isa(ushc,HolidayCalendar) @test isa(ukhc,HolidayCalendar) include("easter_dates.jl") include("easter-min-max.jl") ############### # findweekday ############### # weekday values: # const Monday,Tuesday,Wednesday,Thursday,Friday,Saturday,Sunday = 1,2,3,4,5,6,7 # see query.jl on Dates module # See also dayofweek(dt) function. # function findweekday(weekday_target :: Int, yy :: Int, mm:: Int, occurrence :: Int, ascending :: Bool ) @test BusinessDays.findweekday(Dates.Monday, 2015, 07, 1, true) == Dates.Date(2015, 07, 06) @test BusinessDays.findweekday(Dates.Monday, 2015, 07, 2, true) == Dates.Date(2015, 07, 13) @test BusinessDays.findweekday(Dates.Monday, 2015, 07, 3, true) == Dates.Date(2015, 07, 20) @test BusinessDays.findweekday(Dates.Monday, 2015, 07, 4, true) == Dates.Date(2015, 07, 27) @test BusinessDays.findweekday(Dates.Monday, 2015, 07, 5, true) == Dates.Date(2015, 08, 03) @test BusinessDays.findweekday(Dates.Monday, 2015, 07, 1, false) == Dates.Date(2015, 07, 27) @test BusinessDays.findweekday(Dates.Monday, 2015, 07, 2, false) == Dates.Date(2015, 07, 20) @test BusinessDays.findweekday(Dates.Monday, 2015, 07, 3, false) == Dates.Date(2015, 07, 13) @test BusinessDays.findweekday(Dates.Monday, 2015, 07, 4, false) == Dates.Date(2015, 07, 06) @test BusinessDays.findweekday(Dates.Monday, 2015, 07, 5, false) == Dates.Date(2015, 06, 29) @test BusinessDays.findweekday(Dates.Friday, 2015, 07, 1, true) == Dates.Date(2015, 07, 03) @test BusinessDays.findweekday(Dates.Friday, 2015, 07, 2, true) == Dates.Date(2015, 07, 10) @test BusinessDays.findweekday(Dates.Friday, 2015, 07, 3, true) == Dates.Date(2015, 07, 17) @test BusinessDays.findweekday(Dates.Friday, 2015, 07, 4, true) == Dates.Date(2015, 07, 24) @test BusinessDays.findweekday(Dates.Friday, 2015, 07, 5, true) == Dates.Date(2015, 07, 31) @test BusinessDays.findweekday(Dates.Friday, 2015, 07, 6, true) == Dates.Date(2015, 08, 07) @test BusinessDays.findweekday(Dates.Friday, 2015, 07, 1, false) == Dates.Date(2015, 07, 31) @test BusinessDays.findweekday(Dates.Friday, 2015, 07, 2, false) == Dates.Date(2015, 07, 24) @test BusinessDays.findweekday(Dates.Friday, 2015, 07, 3, false) == Dates.Date(2015, 07, 17) @test BusinessDays.findweekday(Dates.Friday, 2015, 07, 4, false) == Dates.Date(2015, 07, 10) @test BusinessDays.findweekday(Dates.Friday, 2015, 07, 5, false) == Dates.Date(2015, 07, 03) @test BusinessDays.findweekday(Dates.Friday, 2015, 07, 6, false) == Dates.Date(2015, 06, 26) @test BusinessDays.findweekday(Dates.Wednesday, 2015, 07, 1, true) == Dates.Date(2015, 07, 01) @test BusinessDays.findweekday(Dates.Wednesday, 2015, 07, 2, true) == Dates.Date(2015, 07, 08) @test BusinessDays.findweekday(Dates.Wednesday, 2015, 07, 3, true) == Dates.Date(2015, 07, 15) @test BusinessDays.findweekday(Dates.Wednesday, 2015, 07, 4, true) == Dates.Date(2015, 07, 22) @test BusinessDays.findweekday(Dates.Wednesday, 2015, 07, 5, true) == Dates.Date(2015, 07, 29) @test BusinessDays.findweekday(Dates.Wednesday, 2015, 07, 6, true) == Dates.Date(2015, 08, 05) @test BusinessDays.findweekday(Dates.Wednesday, 2015, 07, 1, false) == Dates.Date(2015, 07, 29) @test BusinessDays.findweekday(Dates.Wednesday, 2015, 07, 2, false) == Dates.Date(2015, 07, 22) @test BusinessDays.findweekday(Dates.Wednesday, 2015, 07, 3, false) == Dates.Date(2015, 07, 15) @test BusinessDays.findweekday(Dates.Wednesday, 2015, 07, 4, false) == Dates.Date(2015, 07, 08) @test BusinessDays.findweekday(Dates.Wednesday, 2015, 07, 5, false) == Dates.Date(2015, 07, 01) @test BusinessDays.findweekday(Dates.Wednesday, 2015, 07, 6, false) == Dates.Date(2015, 06, 24) @test_throws AssertionError BusinessDays.findweekday(Dates.Wednesday, 2015, 07, -1, false) @test_throws AssertionError BusinessDays.findweekday(Dates.Wednesday, 2015, 07, -1, true) @test advancebdays(:BRSettlement, Dates.Date(2013, 02, 06), 3) == Dates.Date(2013, 02, 13) # after carnaval wednesday @test advancebdays(:BRSettlement, Dates.Date(2013, 02, 14), -4) == Dates.Date(2013, 02, 06) # after carnaval thursday @test advancebdays(:BRSettlement, Dates.Date(2013, 02, 06), Dates.Day(3)) == Dates.Date(2013, 02, 13) # after carnaval wednesday @test advancebdays(:BRSettlement, Dates.Date(2013, 02, 14), Dates.Day(-4)) == Dates.Date(2013, 02, 06) # after carnaval thursday @test advancebdays(:BRSettlement, Dates.Date(2013, 02, 06), [Dates.Day(3), Dates.Day(4)]) == [Dates.Date(2013, 02, 13), Dates.Date(2013, 02, 14)] # after carnaval wednesday # Test this only on 64bit or higher systems len = typemax(UInt32) + 1 if len > typemax(UInt32) d0 = Dates.Date(1950, 2, 1) d1 = d0 + Dates.Day(len) @test_throws AssertionError BusinessDays._create_bdays_cache_arrays(bhc, d0, d1) end # Create HolidayCalendar instances hc_australiaasx = BusinessDays.AustraliaASX() hc_australiaact = BusinessDays.Australia(:ACT) hc_australiansw = BusinessDays.Australia(:NSW) hc_australiant = BusinessDays.Australia(:NT) hc_australiaqld = BusinessDays.Australia(:QLD) hc_australiasa = BusinessDays.Australia(:SA) hc_australiatas = BusinessDays.Australia(:TAS) hc_australiawa = BusinessDays.Australia(:WA) hc_australiavic = BusinessDays.Australia(:VIC) hc_brazil = BusinessDays.BRSettlement() hc_brazil_exc = BusinessDays.BrazilExchange() hc_de_bw = BusinessDays.DE(:BW) hc_de_by = BusinessDays.DE(:BY) hc_de_byp = BusinessDays.DE(:BYP) hc_de_be = BusinessDays.DE(:BE) hc_de_bb = BusinessDays.DE(:BB) hc_de_hb = BusinessDays.DE(:HB) hc_de_hh = BusinessDays.DE(:HH) hc_de_he = BusinessDays.DE(:HE) hc_de_mv = BusinessDays.DE(:MV) hc_de_ni = BusinessDays.DE(:NI) hc_de_nw = BusinessDays.DE(:NW) hc_de_rp = BusinessDays.DE(:RP) hc_de_sl = BusinessDays.DE(:SL) hc_de_sn = BusinessDays.DE(:SN) hc_de_st = BusinessDays.DE(:ST) hc_de_sh = BusinessDays.DE(:SH) hc_de_th = BusinessDays.DE(:TH) hc_usa = BusinessDays.USSettlement() hc_uk = BusinessDays.UKSettlement() hc_usnyse = BusinessDays.USNYSE() hc_usgovbond = BusinessDays.USGovernmentBond() hc_canadatsx = BusinessDays.CanadaTSX() hc_canada = BusinessDays.CanadaSettlement() #################### # Calendar Tests #################### usecache = false include("calendar_tests.jl") usecache = true include("calendar_tests.jl") #################### # Performance Tests #################### include("perftests.jl") BusinessDays.cleancache(hc_brazil) BusinessDays.cleancache() struct TestCalendar <: HolidayCalendar end cc = TestCalendar() @test_throws ErrorException isholiday(cc, Dates.Date(2015,1,1)) BusinessDays.isholiday(::TestCalendar, dt::Dates.Date) = dt == Dates.Date(2015,8,27) @test isholiday(cc, Dates.Date(2015,8,26)) == false @test isholiday(cc, Dates.Date(2015,8,27)) == true BusinessDays.initcache(cc) @test isholiday(cc, Dates.Date(2015,8,26)) == false @test isholiday(cc, Dates.Date(2015,8,27)) == true BusinessDays.cleancache(cc) isholiday(:Brazil, Dates.Date(2016,2,1)) isholiday(:TestCalendar, Dates.Date(2016,2,1)) isholiday("TestCalendar", Dates.Date(2016,2,1)) isbday(:Brazil, Dates.Date(2016,2,1)) isbday(:TestCalendar, Dates.Date(2016,2,1)) isbday("TestCalendar", Dates.Date(2016,2,1)) sym_vec = [:Brazil, :UKSettlement] BusinessDays.initcache(sym_vec) BusinessDays.cleancache(sym_vec) str_vec = ["Brazil", "UKSettlement", "Canada", "UnitedStates", "USNYSE", "USGovernmentBond", "CanadaTSX", "WeekendsOnly"] BusinessDays.initcache(str_vec) BusinessDays.cleancache(str_vec) BusinessDays.initcache("UKSettlement", Dates.Date(2000,1,1), Dates.Date(2000,5,2)) BusinessDays.initcache("UKSettlement", Dates.Date(2000,1,1), Dates.Date(2000,5,2)) # repeating initcache should work BusinessDays.initcache("UKSettlement", Dates.Date(2000,1,1), Dates.Date(2000,1,1)) # single date cache should work BusinessDays.cleancache("UKSettlement") # equality and hash function for Australia @test BusinessDays.Australia(:ACT) == BusinessDays.Australia(:ACT) @test BusinessDays.Australia(:ACT) != BusinessDays.Australia(:NSW) @test hash(BusinessDays.Australia(:ACT)) == hash(BusinessDays.Australia(:ACT)) @test hash(BusinessDays.Australia(:ACT)) != hash(BusinessDays.Australia(:NSW)) BusinessDays.initcache(BusinessDays.Australia(:ACT)) @test haskey(BusinessDays.CACHE_DICT, BusinessDays.Australia(:ACT)) @test !haskey(BusinessDays.CACHE_DICT, BusinessDays.Australia(:NSW)) BusinessDays.cleancache(BusinessDays.Australia(:ACT)) include("customcalendar-example.jl") ######################### # GenericHolidayCalendar ######################### d0 = Dates.Date(1980,1,1) d1 = Dates.Date(2050,1,1) gen_brazil = GenericHolidayCalendar(listholidays(hc_brazil, d0, d1)) @test isbday(gen_brazil, Dates.Date(2014, 12, 31)) == true # wednesday @test isbday(gen_brazil, Dates.Date(2015, 01, 01)) == false # new year @test isbday(gen_brazil, Dates.Date(2015, 01, 02)) == true # friday @test advancebdays(gen_brazil, Dates.Date(2015,9,1), [0, 1, 3, 4, 5]) == advancebdays(hc_brazil, Dates.Date(2015,9,1), [0, 1, 3, 4, 5]) @test advancebdays(gen_brazil, Dates.Date(2015,9,1), 0:5) == advancebdays(hc_brazil, Dates.Date(2015,9,1), 0:5) @test listholidays(gen_brazil, Dates.Date(2016,1,1), Dates.Date(2016,5,30)) == listholidays(hc_brazil, Dates.Date(2016,1,1), Dates.Date(2016,5,30)) @test tobday(gen_brazil, Dates.Date(2013, 02, 08)) == tobday(hc_brazil, Dates.Date(2013, 02, 08)) @test tobday(gen_brazil, Dates.Date(2013, 02, 09)) == tobday(hc_brazil, Dates.Date(2013, 02, 09)) @test_throws AssertionError bdays(gen_brazil, Dates.Date(1900,2,1), Dates.Date(2000,2,1)) @test_throws AssertionError bdays(gen_brazil, Dates.Date(2000,2,1), Dates.Date(2100,2,1)) d0_test = Dates.Date(1980,1,2) d1_test = Dates.Date(2049,12,28) @test bdays(hc_brazil, d0_test, d1_test) == bdays(gen_brazil, d0_test, d1_test) @test bdayscount(hc_brazil, d0_test, d1_test) == bdayscount(gen_brazil, d0_test, d1_test) println("a million with GenericHolidayCalendar...") @time for i in 1:1000000 bdays(gen_brazil, d0_test, d1_test) end BusinessDays.cleancache(gen_brazil) # Start all over, but without cache gen_brazil = GenericHolidayCalendar(listholidays(hc_brazil, d0, d1), d0, d1, false) @test isbday(gen_brazil, Dates.Date(2014, 12, 31)) == true # wednesday @test isbday(gen_brazil, Dates.Date(2015, 01, 01)) == false # new year @test isbday(gen_brazil, Dates.Date(2015, 01, 02)) == true # friday @test advancebdays(gen_brazil, Dates.Date(2015,9,1), [0, 1, 3, 4, 5]) == advancebdays(hc_brazil, Dates.Date(2015,9,1), [0, 1, 3, 4, 5]) @test advancebdays(gen_brazil, Dates.Date(2015,9,1), 0:5) == advancebdays(hc_brazil, Dates.Date(2015,9,1), 0:5) @test listholidays(gen_brazil, Dates.Date(2016,1,1), Dates.Date(2016,5,30)) == listholidays(hc_brazil, Dates.Date(2016,1,1), Dates.Date(2016,5,30)) @test tobday(gen_brazil, Dates.Date(2013, 02, 08)) == tobday(hc_brazil, Dates.Date(2013, 02, 08)) @test tobday(gen_brazil, Dates.Date(2013, 02, 09)) == tobday(hc_brazil, Dates.Date(2013, 02, 09)) @test_throws AssertionError bdays(gen_brazil, Dates.Date(1900,2,1), Dates.Date(2000,2,1)) @test_throws AssertionError bdays(gen_brazil, Dates.Date(2000,2,1), Dates.Date(2100,2,1)) @test BusinessDays.needs_cache_update(gen_brazil, d0, d1) == false BusinessDays.initcache(gen_brazil) @test BusinessDays.needs_cache_update(gen_brazil, d0, d1) == true @test isbday(gen_brazil, Dates.Date(2014, 12, 31)) == true # wednesday @test isbday(gen_brazil, Dates.Date(2015, 01, 01)) == false # new year @test isbday(gen_brazil, Dates.Date(2015, 01, 02)) == true # friday # does nothing, because cache is already there BusinessDays.initcache(gen_brazil) # A GenericHolidayCalendar is defined by its set of holidays, dtmin, dtmax dtmin = Dates.Date(2018,1,15) dtmax = Dates.Date(2018,1,19) gen_1 = GenericHolidayCalendar(Set([Dates.Date(2018,1,16), Dates.Date(2018,1,18)]), dtmin, dtmax) gen_2 = GenericHolidayCalendar(Set([Dates.Date(2018,1,16), Dates.Date(2018,1,18)]), dtmin, dtmax) @test gen_1 == gen_2 # On a set, they are the same element cal_set = Set([gen_1, gen_2]) @test length(cal_set) == 1 # On a dict, they represent the same key cal_dict = Dict(gen_1 => "hey") @test cal_dict[gen_2] == "hey" # Broadcast @test BusinessDays.tobday.(BusinessDays.WeekendsOnly(), [Dates.Date(2019, 8, 30), Dates.Date(2019, 8, 31), Dates.Date(2019, 9, 1), Dates.Date(2019, 9, 2)]) == [Dates.Date(2019, 8, 30), Dates.Date(2019, 9, 2), Dates.Date(2019, 9, 2), Dates.Date(2019, 9, 2)]
BusinessDays
https://github.com/JuliaFinance/BusinessDays.jl.git
[ "MIT" ]
0.9.24
00e206895eb8d0350149dff15ba1ea9a8f306c63
code
545
using Base.Dates using BusinessDays using DataFrames bd = BusinessDays hc_vec = [:BRSettlement, :USSettlement, :USNYSE, :USGovernmentBond, :UKSettlement, :CanadaSettlement, :CanadaTSX, :TARGET] d0 = Date(1960,01,04) d1 = Date(2100,01,04) bd.initcache(hc_vec, d0, d1) d1vec = collect(d0:d1) d0vec = fill(d0, length(d1vec)) !isdir("csv") && mkdir("csv") for hc in hc_vec b = convert(Array{Int, 1}, isbday(hc, d1vec)) writetable(string("csv/julia-isbday-", string(hc), ".csv"), DataFrame(D = d1vec, ISBDAY = b), quotemark = ' ') end
BusinessDays
https://github.com/JuliaFinance/BusinessDays.jl.git
[ "MIT" ]
0.9.24
00e206895eb8d0350149dff15ba1ea9a8f306c63
docs
1301
# BusinessDays.jl [![License][license-img]](LICENSE) [![CI][ci-img]][ci-url] [![codecov][codecov-img]][codecov-url] [![dev][docs-dev-img]][docs-dev-url] [![stable][docs-stable-img]][docs-stable-url] [license-img]: http://img.shields.io/badge/license-MIT-brightgreen.svg?style=flat-square [ci-img]: https://github.com/felipenoris/BusinessDays.jl/workflows/CI/badge.svg [ci-url]: https://github.com/felipenoris/BusinessDays.jl/actions?query=workflow%3ACI [codecov-img]: https://img.shields.io/codecov/c/github/JuliaFinance/BusinessDays.jl/master.svg?label=codecov&style=flat-square [codecov-url]: http://codecov.io/github/JuliaFinance/BusinessDays.jl?branch=master [docs-dev-img]: https://img.shields.io/badge/docs-dev-blue.svg?style=flat-square [docs-dev-url]: https://juliafinance.github.io/BusinessDays.jl/dev [docs-stable-img]: https://img.shields.io/badge/docs-stable-blue.svg?style=flat-square [docs-stable-url]: https://juliafinance.github.io/BusinessDays.jl/stable A highly optimized *Business Days* calculator written in Julia language. Also known as *Working Days* calculator. ## Installation ```julia julia> Pkg.add("BusinessDays") ``` ## Requirements * Julia v1.0 or newer. ## Documentation Package documentation is hosted at https://juliafinance.github.io/BusinessDays.jl/stable.
BusinessDays
https://github.com/JuliaFinance/BusinessDays.jl.git
[ "MIT" ]
0.9.24
00e206895eb8d0350149dff15ba1ea9a8f306c63
docs
468
# API Reference ```@docs BusinessDays.HolidayCalendar BusinessDays.easter_rata BusinessDays.easter_date BusinessDays.findweekday BusinessDays.isholiday BusinessDays.isweekend BusinessDays.isweekday BusinessDays.isbday BusinessDays.tobday BusinessDays.advancebdays BusinessDays.bdays BusinessDays.bdayscount BusinessDays.firstbdayofmonth BusinessDays.lastbdayofmonth BusinessDays.listholidays BusinessDays.listbdays BusinessDays.initcache BusinessDays.cleancache ```
BusinessDays
https://github.com/JuliaFinance/BusinessDays.jl.git
[ "MIT" ]
0.9.24
00e206895eb8d0350149dff15ba1ea9a8f306c63
docs
9979
# BusinessDays.jl A highly optimized *Business Days* calculator written in Julia language. Also known as *Working Days* calculator. ## Requirements * Julia v1.0 or newer. ## Installation From a Julia session, run: ```julia julia> using Pkg julia> Pkg.add("BusinessDays") ``` ## Motivation This code was developed with a mindset of a Financial Institution that has a big *Fixed Income* portfolio. Many financial contracts, specially *Fixed Income instruments*, depend on a particular calendar of holidays to determine how many days exist between the valuation date and the maturity of the contract. A *Business Days* calculator is a small piece of software used to perform this important step of the valuation process. While there are many implementations of *Business Days* calculators out there, the usual implementation is based on this kind of algorithm: ```r dt0 = initial_date dt1 = final_date holidays = vector_of_holidays bdays = 0 while d0 <= d1 if d0 not in holidays bdays = bdays + 1 end d0 = d0 + 1 end while ``` This works fine for general use. But the performance becomes an issue if one must repeat this calculation many times. Say you have 50 000 contracts, each contract with 20 cash flows. If you need to apply this algorithm to each cash flow, you will need to perform it 1 000 000 times. For instance, let's try out this code using *R* and [QuantLib](https://github.com/lballabio/QuantLib) ([RQuantLib](https://github.com/eddelbuettel/rquantlib)): ```r library(RQuantLib) library(microbenchmark) from <- as.Date("2015-06-29") to <- as.Date("2100-12-20") microbenchmark(businessDaysBetween("Brazil", from, to)) from_vect <- rep(from, 1000000) to_vect <- rep(to, 1000000) microbenchmark(businessDaysBetween("Brazil", from_vect, to_vect), times=1) ``` Running this code, we get the following: *(only the fastest execution is shown)* ``` Unit: milliseconds expr min businessDaysBetween("Brazil", from, to) 1.63803 Unit: seconds expr min businessDaysBetween("Brazil", from_vect, to_vect) 1837.476 ``` While one computation takes up to 2 milliseconds, we're in trouble if we have to repeat it for the whole portfolio: it takes about **half an hour** to complete. This is not due to R's performance, because [RQuantLib](https://github.com/eddelbuettel/rquantlib) is a simple wrapper to [QuantLib](https://github.com/lballabio/QuantLib) C++ library. **BusinessDays.jl** uses a *tailor-made* cache to store Business Days results, reducing the time spent to the order of a few *microseconds* for a single computation. Also, the time spent to process the whole portfolio is reduced to **under a second**. It's also important to point out that the initialization of the memory cache, which is done only once for each Julia runtime session, takes less than *half a second*, including JIT compilation time. Also, the *memory footprint* required for each cached calendar should take around 0.7 MB. **Benchmark Code** ```julia julia> using BusinessDays, Dates julia> d0 = Date(2015, 06, 29) ; d1 = Date(2100, 12, 20) ; julia> cal = BusinessDays.BRSettlement() BusinessDays.BRSettlement() julia> @time BusinessDays.initcache(cal) 0.161972 seconds (598.85 k allocations: 30.258 MiB, 2.29% gc time) julia> bdays(cal, d0, d1) # force JIT compilation 21471 days julia> @time bdays(cal, d0, d1) 0.000012 seconds (9 allocations: 240 bytes) 21471 days julia> @time for i in 1:1000000 bdays(cal, d0, d1) end 0.221275 seconds (5.00 M allocations: 76.294 MiB, 2.93% gc time) ``` **There's no magic** If we disable BusinessDays's cache, however, the performance is slightly worse than QuantLib's implementation. It takes around 38 minutes to process the same benchmark test. ```julia julia> BusinessDays.cleancache() # cleans existing cache, if any julia> @time for i in 1:1000000 bdays(cal, d0, d1) end # 2288.906548 seconds (5.00 M allocations: 76.294 MB, 0.00% gc time) ``` It's important to point out that **cache is disabled by default**. So, in order to take advantage of high speed computation provided by this package, one must call `BusinessDays.initcache` function. ## Tutorial ```julia julia> using BusinessDays, Dates # creates cache for US Federal holidays, allowing fast computations julia> BusinessDays.initcache(:USSettlement) # Calendars can be referenced using symbols julia> isbday(:USSettlement, Date(2015, 1, 1)) false # ... and also strings julia> isbday("USSettlement", Date(2015, 1, 1)) false # but for the best performance, use a singleton instance julia> isbday(BusinessDays.USSettlement(), Date(2015, 1, 1)) false # Adjust to next business day julia> tobday(:USSettlement, Date(2015, 1, 1)) 2015-01-02 # Adjust to last business day julia> tobday(:USSettlement, Date(2015, 1, 1); forward = false) 2014-12-31 # advances 1 business day julia> advancebdays(:USSettlement, Date(2015, 1, 2), 1) 2015-01-05 # goes back 1 business day julia> advancebdays(:USSettlement, Date(2015, 1, 2), -1) 2014-12-31 # counts the number of business days between dates julia> bdays(:USSettlement, Date(2014, 12, 31), Date(2015, 1, 5)) 2 days # same as above, but returns integer julia> bdayscount(:USSettlement, Date(2014, 12, 31), Date(2015, 1, 5)) 2 julia> isbday(:USSettlement, [Date(2014,12,31),Date(2015,1,1),Date(2015,1,2),Date(2015,1,3),Date(2015,1,5)]) 5-element Array{Bool,1}: true false true false true julia> bdays(:USSettlement, [Date(2014,12,31),Date(2015,1,2)], [Date(2015,1,5),Date(2015,1,5)]) 2-element Array{Base.Dates.Day,1}: 2 days 1 day ``` See *runtests.jl* for more examples. ## Available Business Days Calendars - **AustraliaASX** : Public holidays for the Australian Stock Exchange (ASX). - **Australia(state)** : Public holidays for the Australian states and territories. Available for each state: `Australia(:ACT)`, `Australia(:NSW)`, `Australia(:NT)`, `Australia(:QLD)`, `Australia(:SA)`, `Australia(:TAS)`, `Australia(:WA)`, `Australia(:VIC)`. - **BRSettlement** or **Brazil** : banking holidays for Brazil (federal holidays plus Carnival). - **BrazilExchange** or **BrazilB3** : holidays for B3 Stock Exchange. - **CanadaSettlement** or **Canada**: holidays for Canada. - **CanadaTSX**: holidays for Toronto Stock Exchange - **CompositeHolidayCalendar** : supports combination of Holiday Calendars. - **Germany(state)** or **DE(state)** : State-wide (except BY/BYP) public holidays for the German federal states. - Available for each state: `Germany(:BW)`, `Germany(:BY)` (including Assumption of Mary for Catholic communities), `Germany(:BYP)` (only Protestant communities without Assumption of Mary), `Germany(:BE)`, `Germany(:BB)`, `Germany(:HB)`, `Germany(:HH)`, `Germany(:HE)`, `Germany(:MV)`, `Germany(:NI)`, `Germany(:NW)`, `Germany(:RP)`, `Germany(:SL)`, `Germany(:SN)`, `Germany(:ST)`, `Germany(:SH)`, `Germany(:TH)`. - **NullHolidayCalendar** : `isholiday` returns `false` and `isbday` returns `true` for any date. `bdays` returns the actual days between dates. - **TARGET** or **TARGET2** or **EuroZone** : [TARGET / TARGET2 Euro Zone](https://en.wikipedia.org/wiki/TARGET2) holiday calendar. - **USSettlement** or **UnitedStates**: United States federal holidays. - **USNYSE** : United States NYSE holidays. - **USGovernmentBond** : Broadly accepted holidays for United States Government Bond market. [SOFR Rate](https://www.newyorkfed.org/markets/reference-rates/sofr) calendar. See <https://www.sifma.org/resources/general/holiday-schedule/>. - **UKSettlement** or **UnitedKingdom**: banking holidays for England and Wales. See <https://www.gov.uk/bank-holidays>. - **WeekendsOnly** : for this calendar, `isholiday` returns `false`, but `isbday` returns `false` for Saturdays and Sundays. ## Adding new Holiday Calendars You can add your custom Holiday Calendar by doing the following: 1. Define a subtype of `HolidayCalendar`. 2. Implement a new method for `isholiday` for your calendar. **Example Code** ```julia julia> using BusinessDays, Dates julia> struct CustomCalendar <: HolidayCalendar end julia> BusinessDays.isholiday(::CustomCalendar, dt::Date) = dt == Date(2015,8,27) julia> cc = CustomCalendar() CustomCalendar() julia> isholiday(cc, Date(2015,8,26)) false julia> isholiday(cc, Date(2015,8,27)) true julia> isholiday(:CustomCalendar, Date(2015,8,27)) true julia> isholiday("CustomCalendar", Date(2015,8,27)) true ``` ## Generic Holiday Calendar You can use a fixed set of holidays to define a new Holiday Calendar using `GenericHolidayCalendar` type. ```julia julia> using BusinessDays, Dates julia> holidays = Set([Date(2018,1,16), Date(2018,1,18)]) julia> dtmin = Date(2018,1,15); dtmax = Date(2018,1,19) julia> gen_calendar = GenericHolidayCalendar(holidays, dtmin, dtmax) julia> bdayscount(gen_calendar, Date(2018,1,15), Date(2018,1,17)) 1 ``` The constructor is given by: `GenericHolidayCalendar(holidays, [dtmin], [dtmax], [_initcache_])`, where * `holidays`: a set of holiday dates * `dtmin`: minimum date allowed to check for holidays in holidays set. Defaults to `min(holidays...)`. * `dtmax`: maximum date allowed to check for holidays in holidays set. Defaults to `max(holidays...)`. * `_initcache_`: initializes the cache for this calendar. Defaults to `true`. ## Source Code The source code for this package is hosted at [https://github.com/JuliaFinance/BusinessDays.jl](https://github.com/JuliaFinance/BusinessDays.jl). ## License The source code for the package **BusinessDays.jl** is licensed under the [MIT License](https://raw.githubusercontent.com/JuliaFinance/BusinessDays.jl/master/LICENSE). ## Alternative Packages * [Ito.jl](http://aviks.github.io/Ito.jl/time.html) * [FinancialMarkets.jl](https://github.com/imanuelcostigan/FinancialMarkets.jl) * [QuantLib.jl](https://github.com/pazzo83/QuantLib.jl) * [QuantLib C++ Library](https://github.com/lballabio/QuantLib)
BusinessDays
https://github.com/JuliaFinance/BusinessDays.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
780
push!(LOAD_PATH, "@stdlib") import Pkg; Pkg.add("Conda"); using Conda try run(`which nvcc`) ENV["GPU"] = 1 Pkg.build("ADCME") catch end using ADCME @info "Install Boost" CONDA = get_conda() run(`$CONDA install libboost==1.73.0=h3ff78a5_11`) # run(`$CONDA install boost==1.73.0`) @info "Install AMGCL" UNZIP = joinpath(ADCME.BINDIR, "unzip") if !isdir("$(@__DIR__)/amgcl") download("https://github.com/ddemidov/amgcl/archive/master.zip", "$(@__DIR__)/amgcl.zip") run(`$UNZIP -o $(@__DIR__)/amgcl.zip -d $(@__DIR__)`) mv("$(@__DIR__)/amgcl-master","$(@__DIR__)/amgcl", force=true) rm("$(@__DIR__)/amgcl.zip") end @info "Build Custom Operators" change_directory("CustomOps/build") require_file("build.ninja") do ADCME.cmake() end ADCME.make()
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
2612
using ADCME using PyCall using LinearAlgebra using PyPlot using Random Random.seed!(233) py""" import tensorflow as tf libEikonal = tf.load_op_library('./build/libEikonal.so') @tf.custom_gradient def eikonal_lekj(f, srcx, srcy, m, n, h): u = libEikonal.eikonal(f, srcx, srcy, m, n, h) def grad(du): return libEikonal.eikonal_grad(du, u, f, srcx, srcy, m, n, h) return u, grad """ eikonal_ = py"eikonal_lekj" function eikonal(f::Union{Array{Float64}, PyObject}, srcx::Int64,srcy::Int64,h::Float64) n_, m_ = size(f) # m width, n depth n = n_-1 m = m_-1 # eikonal_ = load_op_and_grad("$(@__DIR__)/build/libEikonal","eikonal") # f,srcx,srcy,m,n,h = convert_to_tensor([f,srcx,srcy,m,n,h], [Float64,Int64,Int64,Int64,Int64,Float64]) f = tf.cast(f, dtype=tf.float64) srcx = tf.cast(srcx, dtype=tf.int64) srcy = tf.cast(srcy, dtype=tf.int64) m = tf.cast(m, dtype=tf.int64) n = tf.cast(n, dtype=tf.int64) h = tf.cast(h, dtype=tf.float64) f = tf.reshape(f, (-1,)) u = eikonal_(f,srcx,srcy,m,n,h) u.set_shape((length(f),)) tf.reshape(u, (n_, m_)) end # TODO: specify your input parameters m = 60 n = 30 h = 0.1 f = ones(n+1, m+1) for i = 1:m+1 f[12:18, i] .= 10. end srcx = 30 srcy = 3 u = eikonal(f,srcx,srcy,h) sess = Session(); init(sess) @show run(sess, u) pcolormesh(run(sess, u)|>Array) axis("scaled") colorbar() gca().invert_yaxis() # uncomment it for testing gradients # error() # TODO: change your test parameter to `m` # in the case of `multiple=true`, you also need to specify which component you are testings # gradient check -- v function scalar_function(m_) return sum(eikonal(m_,srcx,srcy,h)^2) end # TODO: change `m_` and `v_` to appropriate values m_ = constant(rand((m+1),(n+1))) v_ = rand((m+1),(n+1)) y_ = scalar_function(m_) dy_ = gradients(y_, m_) ms_ = Array{Any}(undef, 5) ys_ = Array{Any}(undef, 5) s_ = Array{Any}(undef, 5) w_ = Array{Any}(undef, 5) gs_ = @. 0.1 / 10^(1:5) for i = 1:5 g_ = gs_[i] ms_[i] = m_ + g_*v_ ys_[i] = scalar_function(ms_[i]) s_[i] = ys_[i] - y_ w_[i] = s_[i] - g_*sum(v_.*dy_) end sess = Session(); init(sess) sval_ = run(sess, s_) wval_ = run(sess, w_) close("all") loglog(gs_, abs.(sval_), "*-", label="finite difference") loglog(gs_, abs.(wval_), "+-", label="automatic differentiation") loglog(gs_, gs_.^2 * 0.5*abs(wval_[1])/gs_[1]^2, "--",label="\$\\mathcal{O}(\\gamma^2)\$") loglog(gs_, gs_ * 0.5*abs(sval_[1])/gs_[1], "--",label="\$\\mathcal{O}(\\gamma)\$") plt.gca().invert_xaxis() legend() xlabel("\$\\gamma\$") ylabel("Error")
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
1441
using ADCME using PyCall using LinearAlgebra using PyPlot using Random using ADCMEKit Random.seed!(233) reset_default_graph() include("eikonal_op.jl") m = 60 n = 30 h = 0.1 f = ones(n+1, m+1) f[12:18, :] .= 10. srcx = 5 srcy = 15 u = PyObject[] for (k,(x,y)) in enumerate(zip(5*ones(Int64, length(1:n)), 1:n)) push!(u,eikonal(f,x,y,h)) end for (k,(x,y)) in enumerate( zip(1:m,5*ones(Int64, length(1:m)))) push!(u,eikonal(f,x,y,h)) end for (k,(x,y)) in enumerate( zip(1:m,25*ones(Int64, length(1:m)))) push!(u,eikonal(f,x,y,h)) end sess = Session() uobs = run(sess, u) F = Variable(ones(n+1, m+1)) # pl = placeholder(F0'[:]) # F = reshape(pl, n+1, m+1) # F = Variable(ones(n+1, m+1)) u = PyObject[] for (k,(x,y)) in enumerate(zip(5*ones(Int64, length(1:n)), 1:n)) push!(u,eikonal(F,x,y,h)) end for (k,(x,y)) in enumerate(zip(1:m,5*ones(Int64, length(1:m)))) push!(u,eikonal(F,x,y,h)) end for (k,(x,y)) in enumerate(zip(1:m,25*ones(Int64, length(1:m)))) push!(u,eikonal(F,x,y,h)) end # loss = sum([sum((uobs[i][5:5:end,55] - u[i][5:5:end,55])^2) for i = 1:length(u)]) loss = sum([sum((uobs[i][1:end,55] - u[i][1:end,55])^2) for i = 1:length(u)]) init(sess) @show run(sess, loss) # lineview(sess, F, loss, f, ones(n+1, m+1)) # gradview(sess, pl, loss, ones((n+1)* (m+1))) # meshview(sess, pl, loss, F0'[:]) BFGS!(sess, loss, 1000,var_to_bounds=Dict(F=>(0.5,100.0))) pcolormesh(run(sess, F)) colorbar()
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
1379
# void cufd(double *res, double *grad_Cp, double *grad_Cs, double *grad_Den, # double *grad_stf, const double *Cp, const double *Cs, # const double *Den, const double *stf, int calc_id, const int gpu_id, # int group_size, const int *shot_ids, const string para_fname); function obscalc(cp,cs,den,stf,shot_ids,para_fname) m, n = size(cp) res = zeros(1) grad_Cp = zeros(m, n) grad_Cs = zeros(m, n) grad_Den = zeros(m, n) grad_stf = zeros(size(stf)...) calc_id = Int32(2) gpu_id = Int32(0) group_size = length(shot_ids) ccall((:cufd, "./Src/build/libCUFD.so"), Cvoid, (Ref{Cdouble}, Ref{Cdouble}, Ref{Cdouble}, Ref{Cdouble}, Ref{Cdouble}, Ref{Cdouble}, Ref{Cdouble}, Ref{Cdouble}, Ref{Cdouble}, Cint, Cint, Cint, Ref{Cint}, Cstring), res, grad_Cp, grad_Cs, grad_Den, grad_stf, cp, cs, den, stf, calc_id, gpu_id, group_size, shot_ids, para_fname) end nz = 134 nx = 384 cp = 2500ones(nz, nx) cs = zeros(nz, nx) den = 1000ones(nz, nx) shot_ids = Int32[0 1] para_fname = "/home/lidongzh/TwoPhaseFlowFWI/Ops/FWI/Src/params/Par_file_obs_data.json" src = Matrix{Float64}(undef, 1, 2001) src[1,:] = Float64.(reinterpret(Float32, read("/home/lidongzh/TwoPhaseFlowFWI/Ops/FWI/Src/params/ricker_10Hz.bin"))) stf = repeat(src, outer=30) obscalc(cp,cs,den,stf,shot_ids,para_fname)
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
4366
# input: nz, nx, dz, dx, nSteps, nPoints_pml, nPad, dt, f0, survey_fname, data_dir_name, scratch_dir_name, isAc using JSON using DataStructures using Dierckx function paraGen(nz, nx, dz, dx, nSteps, dt, f0, nPml, nPad, para_fname, survey_fname, data_dir_name; if_win=false, filter_para=nothing, if_src_update=false, scratch_dir_name::String="") para = OrderedDict() para["nz"] = nz para["nx"] = nx para["dz"] = dz para["dx"] = dx para["nSteps"] = nSteps para["dt"] = dt para["f0"] = f0 para["nPoints_pml"] = nPml para["nPad"] = nPad if if_win != false para["if_win"] = true end if filter_para != nothing para["filter"] = filter_para end if if_src_update != false para["if_src_update"] = true end para["survey_fname"] = survey_fname para["data_dir_name"] = data_dir_name if !isdir(data_dir_name) mkdir(data_dir_name) end # if nStepsWrap != nothing # para["nStepsWrap"] = nStepsWrap # end if(scratch_dir_name != "") para["scratch_dir_name"] = scratch_dir_name if !isdir(scratch_dir_name) mkdir(scratch_dir_name) end end para_string = JSON.json(para) open(para_fname,"w") do f write(f, para_string) end end # all shots share the same number of receivers function surveyGen(z_src, x_src, z_rec, x_rec, survey_fname; Windows=nothing, Weights=nothing) nsrc = length(x_src) nrec = length(x_rec) survey = OrderedDict() survey["nShots"] = nsrc for i = 1:nsrc shot = OrderedDict() shot["z_src"] = z_src[i] shot["x_src"] = x_src[i] shot["nrec"] = nrec shot["z_rec"] = z_rec shot["x_rec"] = x_rec if Windows != nothing shot["win_start"] = Windows["shot$(i-1)"][:start] shot["win_end"] = Windows["shot$(i-1)"][:end] end if Weights != nothing # shot["weights"] = Int64.(Weights["shot$(i-1)"][:weights]) shot["weights"] = Weights["shot$(i-1)"][:weights] end survey["shot$(i-1)"] = shot end survey_string = JSON.json(survey) open(survey_fname,"w") do f write(f, survey_string) end end function sourceGene(f, nStep, delta_t) # Ricker wavelet generation and integration for source # Dongzhuo Li @ Stanford # May, 2015 e = pi*pi*f*f; t_delay = 1.2/f; source = Matrix{Float64}(undef, 1, nStep) for it = 1:nStep source[it] = (1-2*e*(delta_t*(it-1)-t_delay)^2)*exp(-e*(delta_t*(it-1)-t_delay)^2); end for it = 2:nStep source[it] = source[it] + source[it-1]; end source = source * delta_t; end """ cs_bounds_cloud(cpImg, Bounds) Get `vs` high and low bounds from log point cloud 1st row of Bounds: vp ref line 2nd row of Bounds: vs high ref line 3rd row of Bounds: vs low ref line """ function cs_bounds_cloud(cpImg, Bounds) cs_high_itp = Spline1D(Bounds[1,:], Bounds[2,:]; k=1) cs_low_itp = Spline1D(Bounds[1,:], Bounds[3,:]; k=1) csHigh = zeros(size(cpImg)) csLow = zeros(size(cpImg)) for i = 1:size(cpImg, 1) for j = 1:size(cpImg, 2) csHigh[i,j] = cs_high_itp(cpImg[i,j]) csLow[i,j] = cs_low_itp(cpImg[i,j]) end end return csHigh, csLow end """ klauderWave(fmin, fmax, t_sweep, nStepTotal, nStepDelay, delta_t) Generates Klauder wavelet. """ function klauderWave(fmin, fmax, t_sweep, nStepTotal, nStepDelay, delta_t) nStep = nStepTotal - nStepDelay source = Matrix{Float64}(undef, 1, nStep+nStep-1) source_half = Matrix{Float64}(undef, 1, nStep-1) K = (fmax - fmin) / t_sweep f0 = (fmin + fmax) / 2.0 t_axis = delta_t:delta_t:(nStep-1)*delta_t source_half = sin.(pi * K .* t_axis .* (t_sweep .- t_axis)) .* cos.(2.0 * pi * f0 .* t_axis) ./ (pi*K.*t_axis*t_sweep) for i = 1:nStep-1 source[i] = source_half[end-i+1] end for i = nStep+1:2*nStep-1 source[i] = source_half[i-nStep] end source[nStep] = 1.0 source_crop = source[:,nStep-nStepDelay:end] return source_crop end # function klauderWave(fmin, fmax, t_sweep, nStep, delta_t) # # Klauder wavelet # # Dongzhuo Li @ Stanford # # August, 2019 # source = Matrix{Float64}(undef, 1, nStep) # K = (fmax - fmin) / t_sweep # f0 = (fmin + fmax) / 2.0 # t_axis = delta_t:delta_t:(nStep-1)*delta_t # source_part = sin.(pi * K .* t_axis .* (t_sweep .- t_axis)) .* cos.(2.0 * pi * f0 .* t_axis) ./ (pi*K.*t_axis*t_sweep) # for i = 2:nStep # source[i] = source_part[i-1] # end # source[1] = 1.0 # return source # end
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
2004
if Sys.islinux() py""" import tensorflow as tf import socket if socket.gethostname() != "Dolores": libFwiOp = tf.load_op_library('./build/libFwiOp.so') else: libFwiOp = tf.load_op_library('./build_dolores/libFwiOp.so') @tf.custom_gradient def fwi_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname): misfit = libFwiOp.fwi_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname) def grad(dy): return libFwiOp.fwi_op_grad(dy, tf.constant(1.0,dtype=tf.float64),λ,μ,ρ,stf,gpu_id,shot_ids,para_fname) return misfit, grad def fwi_obs_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname): misfit = libFwiOp.fwi_obs_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname) return misfit """ elseif Sys.isapple() py""" import tensorflow as tf libFwiOp = tf.load_op_library('./build/libFwiOp.dylib') @tf.custom_gradient def fwi_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname): misfit = libFwiOp.fwi_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname) def grad(dy): return libFwiOp.fwi_op_grad(dy, tf.constant(1.0,dtype=tf.float64),λ,μ,ρ,stf,gpu_id,shot_ids,para_fname) return misfit, grad def fwi_obs_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname): misfit = libFwiOp.fwi_obs_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname) return misfit """ elseif Sys.iswindows() py""" import tensorflow as tf libFwiOp = tf.load_op_library('./build/libFwiOp.dll') @tf.custom_gradient def fwi_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname): misfit = libFwiOp.fwi_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname) def grad(dy): return libFwiOp.fwi_op_grad(dy, tf.constant(1.0,dtype=tf.float64),λ,μ,ρ,stf,gpu_id,shot_ids,para_fname) return misfit, grad def fwi_obs_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname): misfit = libFwiOp.fwi_obs_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname) return misfit """ end fwi_op = py"fwi_op" fwi_obs_op = py"fwi_obs_op"
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
5005
using ADCME using PyCall using LinearAlgebra using PyPlot using Random Random.seed!(233) include("fwi_util.jl") include("fwi_util_op.jl") np = pyimport("numpy") # argsparse.jl # ENV["CUDA_VISIBLE_DEVICES"] = 1 # ENV["PARAMDIR"] = "Src/params/" # config = tf.ConfigProto(device_count = Dict("GPU"=>0)) nz = 400 nx = 400 dz = 20 dx = 20 nSteps = 2001 dt = 0.0025 f0 = 4.5 filter_para = [0, 0.1, 100.0, 200.0] nPml = 32 isAc = false nPad = 0 # x_src = collect(5:10:nx-2nPml-5) # z_src = 2ones(Int64, size(x_src)) # x_rec = collect(5:100-nPml) # z_rec = 2ones(Int64, size(x_rec)) x_src = [200-nPml] z_src = [200-nPml] z = (5:10:nz-2nPml-5)|>collect x = (5:10:nx-2nPml-5)|>collect x_rec, z_rec = np.meshgrid(x, z) x_rec = x_rec[:] z_rec = z_rec[:] # x_src = 5 # z_src = [300-nPml] # z_rec = collect(5:1:nz-2nPml-5) # x_rec = (nx-2nPml-100) .* ones(Int64, size(z_rec)) para_fname = "./para_file.json" survey_fname = "./survey_file.json" data_dir_name = "./Data" scratch_dir_name="./Scratch" # paraGen(nz, nx, dz, dx, nSteps, dt, f0, nPml, nPad, filter_para, isAc, para_fname, survey_fname, data_dir_name, scratch_dir_name=scratch_dir_name) # surveyGen(z_src, x_src, z_rec, x_rec, survey_fname) paraGen(nz, nx, dz, dx, nSteps, dt, f0, nPml, nPad, para_fname, survey_fname, data_dir_name, scratch_dir_name=scratch_dir_name) surveyGen(z_src, x_src, z_rec, x_rec, survey_fname) cp = 3000ones(nz, nx) # cp = (1. .+ 0.1*rand(nz, nx)) .* 3000. cs = 3000.0/sqrt(3.0) .* ones(nz,nx) # cs = zeros(nz,nx) den = 2000.0 .* ones(nz, nx) function vel2moduli(cp,cs,den) lambda = (cp.^2 - 2.0 .* cs.^2) .* den ./ 1e6 mu = cs.^2 .* den ./ 1e6 return lambda, mu end lambda, mu = vel2moduli(cp,cs,den) tf_lambda = constant(lambda) tf_mu = constant(mu) tf_den = constant(den) # # src = Matrix{Float64}(undef, 1, 2001) # src[1,:] = Float64.(reinterpret(Float32, read("./Src/params/ricker_10Hz.bin"))) src = sourceGene(f0, nSteps, dt) tf_stf = constant(repeat(src, outer=length(z_src))) tf_para_fname = tf.strings.join([para_fname]) tf_gpu_id0 = constant(0, dtype=Int32) tf_gpu_id1 = constant(1, dtype=Int32) tf_shot_ids0 = constant(collect(Int32, 0:length(x_src)-1), dtype=Int32) tf_shot_ids1 = constant(collect(Int32, 13:25), dtype=Int32) res1 = fwi_obs_op(tf_lambda, tf_mu, tf_den, tf_stf, tf_gpu_id0, tf_shot_ids0, tf_para_fname) # res2 = fwi_obs_op(tf_cp2, tf_cs2, tf_den2, tf_stf, tf_gpu_id1, tf_shot_ids0, tf_para_fname) sess=Session();init(sess); @time run(sess, res1) # error("") # gradient check -- v function scalar_function(m) # return fwi_op(m, tf_mu, tf_den, tf_stf, tf_gpu_id0, tf_shot_ids0, tf_para_fname) return fwi_op(tf_lambda, m, tf_den, tf_stf, tf_gpu_id0, tf_shot_ids0, tf_para_fname) # return fwi_op(tf_lambda, tf_mu, m, tf_stf, tf_gpu_id0, tf_shot_ids0, tf_para_fname) # return fwi_op(tf_lambda, tf_mu, tf_den, m, tf_gpu_id0, tf_shot_ids0, tf_para_fname) end # lambda2,_ = vel2moduli(3200.0, 3200.0/sqrt(3.0), den) # m_ = constant(lambda2) _,mu2 = vel2moduli(3200.0, 3200.0/sqrt(3.0), den) m_ = constant(mu2) # m_ = constant(2200ones(nz,nx)) # # src2 = circshift(src, (0,30)) # src2 = sourceGene(f0,nSteps,dt) .*1.5 # m_ = constant(repeat(src2, outer=length(z_src))) # for forward_backward wavefield comparison # yy = scalar_function(m_) # gradm = gradients(yy, m_) # sess=Session();init(sess); # @time G = run(sess, gradm) # imshow(G);colorbar(); # A=read("SnapGPU.bin");A=reshape(reinterpret(Float32,A),(200,200)); # B=read("SnapGPU_back.bin");B=reshape(reinterpret(Float32,B),(200,200)); # imshow(A[33:end-32,33:end-32]-B[33:end-32,33:end-32]);colorbar(); # error("") v0 = zeros(nz, nx) # PLEASE!!!!!!!!!!!!!! Don't perturb in the CPML region!!!!!!!!!!!!!!!!!!!!!!! v0[nPml+5:nz-nPml-5, nPml+5:nx-nPml-5] .= 1.0 # # perturb moduli v_ = constant(Float64.((1. .+ 0.1*rand(nz, nx)) .* 1e3 .* v0)) # # perturb density # v_ = constant(Float64.((1. .+ 0.1*rand(nz, nx)) .* 500 .* v0)) # perturb sft # s0 = zeros(1, nSteps) # s0[1, 20:end-20] .= 1.0 # src_perturb = rand(1,nSteps) .* s0 *0.1 # v_ = src_perturb y_ = scalar_function(m_) dy_ = gradients(y_, m_) ms_ = Array{Any}(undef, 5) ys_ = Array{Any}(undef, 5) s_ = Array{Any}(undef, 5) w_ = Array{Any}(undef, 5) gs_ = @. 1 / 10^(1:5) for i = 1:5 g_ = gs_[i] ms_[i] = m_ + g_ * v_ ys_[i] = scalar_function(ms_[i]) s_[i] = ys_[i] - y_ w_[i] = s_[i] - g_*sum(v_.*dy_) end sess = Session() init(sess) sval_ = run(sess, s_) wval_ = run(sess, w_) # error("") sval_ = [x[1] for x in sval_] wval_ = [x[1] for x in wval_] close("all") loglog(gs_, abs.(sval_), "*-", label="finite difference") loglog(gs_, abs.(wval_), "+-", label="FWI gradient") loglog(gs_, gs_.^2 * 0.5*abs(wval_[1])/gs_[1]^2, "--",label="\$\\mathcal{O}(\\gamma^2)\$") loglog(gs_, gs_ * 0.5*abs(sval_[1])/gs_[1], "--",label="\$\\mathcal{O}(\\gamma)\$") plt.gca().invert_xaxis() legend() xlabel("\$\\gamma\$") ylabel("Error") savefig("Convergence_test_mu.pdf", bbox_inches="tight",pad_inches = 0, dpi = 300);
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
2786
using ADCME using PyCall using LinearAlgebra using PyPlot using Random Random.seed!(233) if Sys.islinux() py""" import tensorflow as tf libLaplacian = tf.load_op_library('build/libLaplacian.so') @tf.custom_gradient def laplacian_op(coef,func,h,rhograv): p = libLaplacian.laplacian(coef,func,h,rhograv) def grad(dy): return libLaplacian.laplacian_grad(dy, coef, func, h, rhograv) return p, grad """ elseif Sys.isapple() py""" import tensorflow as tf libPoissonOp = tf.load_op_library('build/libPoissonOp.dylib') @tf.custom_gradient def laplacian_op(coef,func,h,rhograv): p = libLaplacian.laplacian(coef,func,h,rhograv) def grad(dy): return libLaplacian.laplacian_grad(dy, coef, func, h, rhograv) return p, grad """ elseif Sys.iswindows() py""" import tensorflow as tf libPoissonOp = tf.load_op_library('build/libPoissonOp.dll') @tf.custom_gradient def laplacian_op(coef,func,h,rhograv): p = libLaplacian.laplacian(coef,func,h,rhograv) def grad(dy): return libLaplacian.laplacian_grad(dy, coef, func, h, rhograv) return p, grad """ end laplacian = py"laplacian_op" h = 1.0 rho = 1000.0 G = 9.8 len_z = 16 len_x = 32 nz = Int(len_z/h + 1) nx = Int(len_x/h + 1) tf_h=constant(1.0) # coef = zeros(nz, nx) # rhs = zeros(nz, nx) # for i = 1:nz # for j = 1:nx # rhs[i,j] = -sin(2*pi/len_z*(i-1)*h) * sin(2*pi/len_x*(j-1)*h) # coef[i,j] = 1.0 - cos(2*pi/len_z*(i-1)*h) * sin(2*pi/len_x*(j-1)*h) * len_z / (2*pi*rho*G) # # rhs[i,j] = 2.0*(i-1)*h*exp(-(((i-1)*h)^2) -(((j-1)*h)^2)) * rho * G # # coef[i,j] = 1.0 + exp(-(((i-1)*h)^2) -(((j-1)*h)^2)) # end # end coef = rand(nz, nx) func = rand(nz, nx) tf_coef = constant(coef) tf_func = constant(func) # gradient check -- v function scalar_function(m) # return sum(tanh(laplacian(m, tf_func, tf_h, constant(rho*G)))) return sum(tanh(laplacian(tf_coef, m, tf_h, constant(rho*G)))) end # m_ = tf_coef m_ = tf_func v_ = 0.01*rand(nz, nx) y_ = scalar_function(m_) dy_ = gradients(y_, m_) ms_ = Array{Any}(undef, 5) ys_ = Array{Any}(undef, 5) s_ = Array{Any}(undef, 5) w_ = Array{Any}(undef, 5) gs_ = @. 1 / 10^(1:5) for i = 1:5 g_ = gs_[i] ms_[i] = m_ + g_*v_ ys_[i] = scalar_function(ms_[i]) s_[i] = ys_[i] - y_ w_[i] = s_[i] - g_*sum(v_.*dy_) end sess = Session() init(sess) sval_ = run(sess, s_) wval_ = run(sess, w_) close("all") loglog(gs_, abs.(sval_), "*-", label="finite difference") loglog(gs_, abs.(wval_), "+-", label="automatic differentiation") loglog(gs_, gs_.^2 * 0.5*abs(wval_[1])/gs_[1]^2, "--",label="\$\\mathcal{O}(\\gamma^2)\$") loglog(gs_, gs_ * 0.5*abs(sval_[1])/gs_[1], "--",label="\$\\mathcal{O}(\\gamma)\$") plt[:gca]()[:invert_xaxis]() legend() xlabel("\$\\gamma\$") ylabel("Error")
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
3359
using PyTensorFlow using PyCall using LinearAlgebra using PyPlot using Random Random.seed!(233) if Sys.islinux() py""" import tensorflow as tf libPoissonOp = tf.load_op_library('build/libPoissonOp.so') @tf.custom_gradient def poisson_op(coef,g,h,rhograv,index): p = libPoissonOp.poisson_op(coef,g,h,rhograv,index) def grad(dy): return libPoissonOp.poisson_op_grad(dy, p, coef, g, h, rhograv, index) return p, grad """ elseif Sys.isapple() py""" import tensorflow as tf libPoissonOp = tf.load_op_library('build/libPoissonOp.dylib') @tf.custom_gradient def poisson_op(coef,g,h,rhograv,index): p = libPoissonOp.poisson_op(coef,g,h,rhograv,index) def grad(dy): return libPoissonOp.poisson_op_grad(dy, p, coef, g, h, rhograv, index) return p, grad """ elseif Sys.iswindows() py""" import tensorflow as tf libPoissonOp = tf.load_op_library('build/libPoissonOp.dll') @tf.custom_gradient def poisson_op(coef,g,h,rhograv,index): p = libPoissonOp.poisson_op(coef,g,h,rhograv,index) def grad(dy): return libPoissonOp.poisson_op_grad(dy, p, coef, g, h, rhograv, index) return p, grad """ end poisson_op = py"poisson_op" len_z = 4 len_x = 4 rho = 1000.0 G = 9.8 nScale = 5 tf_g = Array{Any}(undef, nScale) tf_coef = Array{Any}(undef, nScale) tf_h = Array{Any}(undef, nScale) tf_p = Array{Any}(undef, nScale) p_true_array = Array{Any}(undef, nScale) p_inv_array = Array{Any}(undef, nScale) h_array = @. 1 / 2^(1:nScale) for iScale = 1:nScale h = h_array[iScale] nz = Int(len_z/h + 1) nx = Int(len_x/h + 1) g = zeros(nz, nx) coef = zeros(nz, nx) p_true = zeros(nz, nx) for i = 1:nz for j = 1:nx g[i,j] = -sin(2*pi/len_z*(i-1)*h) * sin(2*pi/len_x*(j-1)*h) coef[i,j] = 1.0 - cos(2*pi/len_z*(i-1)*h) * sin(2*pi/len_x*(j-1)*h) * len_z / (2*pi*rho*G) # g[i,j] = 2.0*(i-1)*h*exp(-(((i-1)*h)^2) -(((j-1)*h)^2)) * rho * G # coef[i,j] = 1.0 + exp(-(((i-1)*h)^2) -(((j-1)*h)^2)) p_true[i,j] = rho*G*(i-1)*h end end p_true_array[iScale] = p_true .- mean(p_true) # p_true_array[iScale] = p_true tf_g[iScale] = constant(g) tf_coef[iScale] = constant(coef) tf_h[iScale] = constant(h) tf_p[iScale] = poisson_op(tf_coef[iScale], tf_g[iScale], tf_h[iScale], constant(rho*G), constant(1)) end sess = Session() init(sess) p_inv_array = run(sess, tf_p) for iScale = 1:nScale p_inv_array[iScale] = p_inv_array[iScale] .- mean(p_inv_array[iScale]) end function l2_error(p_true, p_inv, iScale) l2_error = 0.0 l2_norm = 0.0 h = h_array[iScale] nz = size(p_true)[1] nx = size(p_true)[2] for i = 1:nz for j = 1:nx l2_error += (p_true[i,j]-p_inv[i,j])^2 * h^2 l2_norm += (p_true[i,j])^2 * h^2 end end return sqrt(l2_error)/sqrt(l2_norm) end Error_array = Array{Any}(undef, nScale) for iScale = 1:nScale Error_array[iScale] = l2_error(p_true_array[iScale], p_inv_array[iScale], iScale) end loglog(h_array, Error_array, "*-", label="MMS convergence") loglog(h_array, h_array.^2 * 0.5*Error_array[1]/h_array[1]^2, "--",label="\$\\mathcal{O}(h^2)\$") loglog(h_array, h_array * 0.5*Error_array[1]/h_array[1], "-",label="\$\\mathcal{O}(h)\$") plt.gca().invert_xaxis() legend() xlabel("\$h\$") ylabel("Error") # imshow(p)
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
2776
using ADCME using PyCall using LinearAlgebra using PyPlot using Random Random.seed!(233) if Sys.islinux() py""" import tensorflow as tf libPoissonOp = tf.load_op_library('build/libPoissonOp.so') @tf.custom_gradient def poisson_op(coef,g,h,rhograv,index): p = libPoissonOp.poisson_op(coef,g,h,rhograv,index) def grad(dy): return libPoissonOp.poisson_op_grad(dy, p, coef, g, h, rhograv, index) return p, grad """ elseif Sys.isapple() py""" import tensorflow as tf libPoissonOp = tf.load_op_library('build/libPoissonOp.dylib') @tf.custom_gradient def poisson_op(coef,g,h,rhograv,index): p = libPoissonOp.poisson_op(coef,g,h,rhograv,index) def grad(dy): return libPoissonOp.poisson_op_grad(dy, p, coef, g, h, rhograv, index) return p, grad """ elseif Sys.iswindows() py""" import tensorflow as tf libPoissonOp = tf.load_op_library('build/libPoissonOp.dll') @tf.custom_gradient def poisson_op(coef,g,h,rhograv,index): p = libPoissonOp.poisson_op(coef,g,h,rhograv,index) def grad(dy): return libPoissonOp.poisson_op_grad(dy, p, coef, g, h, rhograv, index) return p, grad """ end poisson_op = py"poisson_op" # gradient check -- v h = 1.0 rho = 1000.0 G = 9.8 len_z = 16 len_x = 32 nz = Int(len_z/h + 1) nx = Int(len_x/h + 1) tf_h=constant(1.0) coef = zeros(nz, nx) rhs = zeros(nz, nx) for i = 1:nz for j = 1:nx rhs[i,j] = -sin(2*pi/len_z*(i-1)*h) * sin(2*pi/len_x*(j-1)*h) coef[i,j] = 1.0 - cos(2*pi/len_z*(i-1)*h) * sin(2*pi/len_x*(j-1)*h) * len_z / (2*pi*rho*G) # rhs[i,j] = 2.0*(i-1)*h*exp(-(((i-1)*h)^2) -(((j-1)*h)^2)) * rho * G # coef[i,j] = 1.0 + exp(-(((i-1)*h)^2) -(((j-1)*h)^2)) end end tf_coef = constant(coef) tf_rhs = constant(rhs) function scalar_function(m) return sum(tanh(poisson_op(tf_coef,m,tf_h,constant(rho*G), constant(0)))) # return sum(tanh(poisson_op(m,tf_rhs,tf_h, constant(rho*G), constant(0)))) end m_ = tf_rhs # m_ = tf_coef v_ = 0.01*rand(nz, nx) y_ = scalar_function(m_) dy_ = gradients(y_, m_) ms_ = Array{Any}(undef, 5) ys_ = Array{Any}(undef, 5) s_ = Array{Any}(undef, 5) w_ = Array{Any}(undef, 5) gs_ = @. 1 / 20^(1:5) for i = 1:5 g_ = gs_[i] ms_[i] = m_ + g_*v_ ys_[i] = scalar_function(ms_[i]) s_[i] = ys_[i] - y_ w_[i] = s_[i] - g_*sum(v_.*dy_) end sess = Session() init(sess) sval_ = run(sess, s_) wval_ = run(sess, w_) close("all") loglog(gs_, abs.(sval_), "*-", label="finite difference") loglog(gs_, abs.(wval_), "+-", label="automatic differentiation") loglog(gs_, gs_.^2 * 0.5*abs(wval_[1])/gs_[1]^2, "--",label="\$\\mathcal{O}(\\gamma^2)\$") loglog(gs_, gs_ * 0.5*abs(sval_[1])/gs_[1], "--",label="\$\\mathcal{O}(\\gamma)\$") plt.gca().invert_xaxis() legend() xlabel("\$\\gamma\$") ylabel("Error")
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
2703
using PyTensorFlow using PyCall using LinearAlgebra using PyPlot using Random Random.seed!(233) py""" import tensorflow as tf libSatOp = tf.load_op_library('./build/libSatOp.so') @tf.custom_gradient def sat_op(s0,pt,permi,poro,qw,qo,muw,muo,sref,dt,h): sat = libSatOp.sat_op(s0,pt,permi,poro,qw,qo,muw,muo,sref,dt,h) def grad(dy): return libSatOp.sat_op_grad(dy, sat, s0,pt,permi,poro,qw,qo,muw,muo,sref,dt,h) return sat, grad """ sat_op = py"sat_op" len_z = 1.0 * 2pi len_x = 1.0 * 2pi nScale = 5 tf_permi = Array{Any}(undef, nScale) tf_poro = Array{Any}(undef, nScale) tf_h = Array{Any}(undef, nScale) tf_s = Array{Any}(undef, nScale) s_true_array = Array{Any}(undef, nScale) s_inv_array = Array{Any}(undef, nScale) h_array = @. 1 / (2pi)^(1:nScale) dt = 0.00001 for iScale = 1:nScale h = h_array[iScale] nz = Int64(trunc(len_z/h + 1)) nx = Int64(trunc(len_x/h + 1)) pt = zeros(nz,nx) s_true = zeros(nz, nx) q = zeros(nz,nx) poro = zeros(nz,nx) for i = 1:nz for j = 1:nx x1 = len_z*(i-1)*h x2 = len_x*(j-1)*h s_true[i,j] = dt * sin(x1)*sin(x2) pt[i,j] = cos(x1)*cos(x2) poro[i,j] = sin(x1)^2 * sin(x2)^2 q[i,j] = sin(x1)*sin(x2) - (-3.0 * sin(x1)*cos(x1)*sin(x2)^2*cos(x2) - 3.0 * sin(x1)^2*cos(x1)*sin(x2)*cos(x2)) end end s_true_array[iScale] = s_true tf_permi[iScale] = constant(ones(nz,nx)) tf_poro[iScale] = constant(poro) tf_pt = constant(pt) tf_q = constant(q) tf_s0 = constant(zeros(nz,nx)) tf_h[iScale] = constant(h) tf_dt = constant(dt) tf_ones = constant(ones(nz, nx)) tf_s[iScale] = sat_op(tf_s0, tf_pt, tf_permi[iScale], tf_poro[iScale], tf_q, tf_q, constant(1.0),constant(1.0),tf_s0,tf_dt,tf_h[iScale]) end sess = Session() init(sess) s_inv_array = run(sess, tf_s) function l2_error(s_true, s_inv, iScale) l2_error = 0.0 l2_norm = 0.0 h = h_array[iScale] nz = size(s_true)[1] nx = size(s_true)[2] for i = 1:nz for j = 1:nx l2_error += (s_true[i,j]-s_inv[i,j])^2 * h^2 l2_norm += (s_true[i,j])^2 * h^2 end end return sqrt(l2_error)/sqrt(l2_norm) end Error_array = Array{Any}(undef, nScale) for iScale = 1:nScale Error_array[iScale] = l2_error(s_true_array[iScale], s_inv_array[iScale], iScale) end loglog(h_array, Error_array, "*-", label="MMS convergence") loglog(h_array, h_array.^2 * 0.5*Error_array[1]/h_array[1]^2, "--",label="\$\\mathcal{O}(h^2)\$") loglog(h_array, h_array * 0.5*Error_array[1]/h_array[1], "-",label="\$\\mathcal{O}(h)\$") plt.gca().invert_xaxis() legend() xlabel("\$h\$") ylabel("Error") # imshow(p)
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
7564
using ADCME using PyCall using LinearAlgebra using PyPlot using Random Random.seed!(233) if Sys.islinux() py""" import tensorflow as tf libSatOp = tf.load_op_library('./build/libSatOp.so') @tf.custom_gradient def sat_op(s0,pt,permi,poro,qw,qo,muw,muo,sref,dt,h): sat = libSatOp.sat_op(s0,pt,permi,poro,qw,qo,muw,muo,sref,dt,h) def grad(dy): return libSatOp.sat_op_grad(dy, sat, s0,pt,permi,poro,qw,qo,muw,muo,sref,dt,h) return sat, grad """ elseif Sys.isapple() py""" import tensorflow as tf libSatOp = tf.load_op_library('./build/libSatOp.dylib') @tf.custom_gradient def sat_op(s0,pt,permi,poro,qw,qo,muw,muo,sref,dt,h): sat = libSatOp.sat_op(s0,pt,permi,poro,qw,qo,muw,muo,sref,dt,h) def grad(dy): return libSatOp.sat_op_grad(dy, sat, s0,pt,permi,poro,qw,qo,muw,muo,sref,dt,h) return sat, grad """ elseif Sys.iswindows() py""" import tensorflow as tf libSatOp = tf.load_op_library('./build/libSatOp.dll') @tf.custom_gradient def sat_op(s0,pt,permi,poro,qw,qo,muw,muo,sref,dt,h): sat = libSatOp.sat_op(s0,pt,permi,poro,qw,qo,muw,muo,sref,dt,h) def grad(dy): return libSatOp.sat_op_grad(dy, sat, s0,pt,permi,poro,qw,qo,muw,muo,sref,dt,h) return sat, grad """ end sat_op = py"sat_op" if Sys.islinux() py""" import tensorflow as tf libUpwpsOp = tf.load_op_library('../Upwps/build/libUpwpsOp.so') @tf.custom_gradient def upwps_op(permi,mobi,src,funcref,h,rhograv,index): pres = libUpwpsOp.upwps_op(permi,mobi,src,funcref,h,rhograv,index) def grad(dy): return libUpwpsOp.upwps_op_grad(dy, pres, permi,mobi,src,funcref,h,rhograv,index) return pres, grad """ elseif Sys.isapple() py""" import tensorflow as tf libUpwpsOp = tf.load_op_library('../Upwps/build/libUpwpsOp.dylib') @tf.custom_gradient def upwps_op(permi,mobi,src,funcref,h,rhograv,index): pres = libUpwpsOp.upwps_op(permi,mobi,src,funcref,h,rhograv,index) def grad(dy): return libUpwpsOp.upwps_op_grad(dy, pres, permi,mobi,src,funcref,h,rhograv,index) return pres, grad """ end upwps_op = py"upwps_op" if Sys.islinux() py""" import tensorflow as tf libUpwlapOp = tf.load_op_library('../Upwlap/build/libUpwlapOp.so') @tf.custom_gradient def upwlap_op(perm,mobi,func,h,rhograv): out = libUpwlapOp.upwlap_op(perm,mobi,func,h,rhograv) def grad(dy): return libUpwlapOp.upwlap_op_grad(dy, out, perm,mobi,func,h,rhograv) return out, grad """ elseif Sys.isapple() py""" import tensorflow as tf libUpwlapOp = tf.load_op_library('../Upwlap/build/libUpwlapOp.dylib') @tf.custom_gradient def upwlap_op(perm,mobi,func,h,rhograv): out = libUpwlapOp.upwlap_op(perm,mobi,func,h,rhograv) def grad(dy): return libUpwlapOp.upwlap_op_grad(dy, out, perm,mobi,func,h,rhograv) return out, grad """ end upwlap_op = py"upwlap_op" if Sys.islinux() py""" import tensorflow as tf libPoissonOp = tf.load_op_library('../Poisson/build/libPoissonOp.so') @tf.custom_gradient def poisson_op(coef,g,h,rhograv,index): p = libPoissonOp.poisson_op(coef,g,h,rhograv,index) def grad(dy): return libPoissonOp.poisson_op_grad(dy, p, coef, g, h, rhograv, index) return p, grad """ elseif Sys.isapple() py""" import tensorflow as tf libPoissonOp = tf.load_op_library('../Poisson/build/libPoissonOp.dylib') @tf.custom_gradient def poisson_op(coef,g,h,rhograv,index): p = libPoissonOp.poisson_op(coef,g,h,rhograv,index) def grad(dy): return libPoissonOp.poisson_op_grad(dy, p, coef, g, h, rhograv, index) return p, grad """ end poisson_op = py"poisson_op" function ave_normal(quantity, m, n) aa = sum(quantity) return aa/(m*n) end # TODO: # const ALPHA = 0.006323996017182 const ALPHA = 1.0 const SRC_CONST = 86400.0 const K_CONST = 9.869232667160130e-16 * 86400 nz=20 nx=30 sw = constant(zeros(nz, nx)) swref = constant(zeros(nz,nx)) μw = constant(0.001) μo = constant(0.003) K = constant(100.0 .* ones(nz, nx)) ϕ = constant(0.25 .* ones(nz, nx)) dt = constant(30.0) h = constant(100.0 * 0.3048) q1 = zeros(nz,nx) q2 = zeros(nz,nx) q1[10,5] = 0.002 * (1/(100.0 * 0.3048)^2)/20.0/0.3048 * SRC_CONST q2[10,25] = -0.002 * (1/(100.0 * 0.3048)^2)/20.0/0.3048 * SRC_CONST qw = constant(q1) qo = constant(q2) λw = sw.*sw/μw λo = (1-sw).*(1-sw)/μo λ = λw + λo f = λw/λ q = qw + qo + λw/(λo+1e-16).*qo # Θ = laplacian_op(K.*λo, potential_c, h, constant(0.0)) Θ = upwlap_op(K*K_CONST, λo, constant(zeros(nz,nx)), h, constant(0.0)) load_normal = (Θ+q/ALPHA) - ave_normal(Θ+q/ALPHA, nz, nx) tf_comp_p0 = upwps_op(K*K_CONST, λ, load_normal, constant(zeros(nz,nx)), h, constant(0.0), constant(2)) sess = Session() init(sess) p0 = run(sess, tf_comp_p0) tf_p0 = constant(p0) # s = sat_op(sw,p0,K,ϕ,qw,qo,sw,dt,h) # function step(sw) # λw = sw.*sw # λo = (1-sw).*(1-sw) # λ = λw + λo # f = λw/λ # q = qw + qo + λw/(λo+1e-16).*qo # # Θ = laplacian_op(K.*λo, constant(zeros(nz,nx)), h, constant(0.0)) # Θ = upwlap_op(K, λo, constant(zeros(nz,nx)), h, constant(0.0)) # # Θ = constant(zeros(nz,nx)) # load_normal = (Θ+q/ALPHA) - ave_normal(Θ+q/ALPHA, nz, nx) # p = poisson_op(λ.*K, load_normal, h, constant(0.0), constant(0)) # potential p = pw - ρw*g*h # # p = upwps_op(K, λ, load_normal, constant(zeros(nz,nx)), h, constant(0.0), constant(0)) # sw = sat_op(sw,p,K,ϕ,qw,qo,μw,μo,sw,dt,h) # return sw # end # NT=100 # function evolve(sw, NT, qw, qo) # # qw_arr = constant(qw) # qw: NT x m x n array # # qo_arr = constant(qo) # tf_sw = TensorArray(NT+1) # function condition(i, ta) # tf.less(i, NT+1) # end # function body(i, tf_sw) # sw_local = step(read(tf_sw, i)) # i+1, write(tf_sw, i+1, sw_local) # end # tf_sw = write(tf_sw, 1, sw) # i = constant(1, dtype=Int32) # _, out = while_loop(condition, body, [i;tf_sw]) # read(out, NT+1) # end # s = evolve(sw, NT, qw, qo) # J = tf.nn.l2_loss(s) # tf_grad_K = gradients(J, K) # sess = Session() # init(sess) # # P = run(sess,p0) # # error("") # S=run(sess, s) # imshow(S);colorbar(); # error("") # grad_K = run(sess, tf_grad_K) # imshow(grad_K);colorbar(); # error("") # TODO: # gradient check -- v function scalar_function(m) # return sum(tanh(sat_op(m,tf_p0,K*K_CONST,ϕ,qw,qo,μw,μo,constant(zeros(nz,nx)),dt,h))) return sum(tanh(sat_op(sw,m,K*K_CONST,ϕ,qw,qo,μw,μo,constant(zeros(nz,nx)),dt,h))) # return sum(tanh(sat_op(sw,tf_p0,m,ϕ,qw,qo,μw,μo,constant(zeros(nz,nx)),dt,h))) # return sum(tanh(sat_op(sw,tf_p0,K*K_CONST,m,qw,qo,μw,μo,constant(zeros(nz,nx)),dt,h))) end # m_ = sw # v_ = 0.1 * rand(nz,nx) m_ = tf_p0 v_ = 5e5 .* rand(nz,nx) # m_ = K*K_CONST # v_ = 10 .* rand(nz,nx) *K_CONST # m_ = ϕ # v_ = 0.1 * rand(nz,nx) y_ = scalar_function(m_) dy_ = gradients(y_, m_) ms_ = Array{Any}(undef, 5) ys_ = Array{Any}(undef, 5) s_ = Array{Any}(undef, 5) w_ = Array{Any}(undef, 5) gs_ = @. 1 / 10^(1:5) for i = 1:5 g_ = gs_[i] ms_[i] = m_ + g_*v_ ys_[i] = scalar_function(ms_[i]) s_[i] = ys_[i] - y_ w_[i] = s_[i] - g_*sum(v_.*dy_) end sess = Session() init(sess) sval_ = run(sess, s_) wval_ = run(sess, w_) close("all") loglog(gs_, abs.(sval_), "*-", label="finite difference") loglog(gs_, abs.(wval_), "+-", label="automatic differentiation") loglog(gs_, gs_.^2 * 0.5*abs(wval_[1])/gs_[1]^2, "--",label="\$\\mathcal{O}(\\gamma^2)\$") loglog(gs_, gs_ * 0.5*abs(sval_[1])/gs_[1], "--",label="\$\\mathcal{O}(\\gamma)\$") plt.gca().invert_xaxis() legend() xlabel("\$\\gamma\$") ylabel("Error")
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
2629
using ADCME using PyCall using LinearAlgebra using PyPlot using Random using FwiFlow # include("../ops.jl") # Random.seed!(233) function saturation(s0,dporodt,pt,perm,poro,qw,qo,muw,muo,sref,dt,h) saturation_ = load_op_and_grad("./build/libSaturation","saturation") s0,dporodt,pt,perm,poro,qw,qo,muw,muo,sref,dt,h = convert_to_tensor([s0,dporodt,pt,perm,poro,qw,qo,muw,muo,sref,dt,h], [Float64,Float64,Float64,Float64,Float64,Float64,Float64,Float64,Float64,Float64,Float64,Float64]) saturation_(s0,dporodt,pt,perm,poro,qw,qo,muw,muo,sref,dt,h) end # TODO: specify your input parameters m = 100 n = 10 h = 0.01 x = zeros(n, m) y = zeros(n, m) for i = 1:m for j = 1:n x[j, i] = h*i y[j, i] = h*j end end t = 3.0 s0 = @. (x^2 + y^2)/(1+x^2+y^2) * exp(-t) dporodt = exp(t) * zeros(n, m) pt = @. (x^2+y^2) perm = rand(n, m) poro = exp(t) * ones(n, m) qw = ones(n, m) qo = ones(n, m) muw = 2.0 muo = 3.0 sref = s0 dt = 0.01 h = 0.1 u = sat_op2(s0,dporodt,pt,perm,poro,qw,qo,muw,muo,sref,dt,h) u3 = sat_op(s0,pt,perm,poro,qw,qo,muw, muo,sref,dt,h) # u3 = sat_op(s0,dporodt,pt,perm,poro,qw,qo,muw,muo,sref,dt,h) sess = Session(); init(sess) # @show run(sess, u) @show run(sess, u3-u) # uncomment it for testing gradients error() # TODO: change your test parameter to `m` # in the case of `multiple=true`, you also need to specify which component you are testings # gradient check -- v # s0 qw qo function scalar_function(m) # return sum(saturation(s0,dporodt,pt,perm,poro,qw,qo,muw,muo,sref,dt,h)^2) return sum(sat_op2(m,dporodt,pt,perm,poro,qw,qo,muw,muo,sref,dt,h)^2) # return sum(sat_op(m,pt,perm,poro,qw,qo,muw,muo,sref,dt,h)^2) end # TODO: change `m_` and `v_` to appropriate values m_ = constant(0.7*rand(n, m)) v_ = rand(n, m) # m_ = constant(s0) # v_ = rand(n, m) y_ = scalar_function(m_) dy_ = gradients(y_, m_) ms_ = Array{Any}(undef, 5) ys_ = Array{Any}(undef, 5) s_ = Array{Any}(undef, 5) w_ = Array{Any}(undef, 5) gs_ = @. 1 / 10^(1:5) for i = 1:5 g_ = gs_[i] ms_[i] = m_ + g_*v_ ys_[i] = scalar_function(ms_[i]) s_[i] = ys_[i] - y_ w_[i] = s_[i] - g_*sum(v_.*dy_) end sess = Session(); init(sess) sval_ = run(sess, s_) wval_ = run(sess, w_) close("all") loglog(gs_, abs.(sval_), "*-", label="finite difference") loglog(gs_, abs.(wval_), "+-", label="automatic differentiation") loglog(gs_, gs_.^2 * 0.5*abs(wval_[1])/gs_[1]^2, "--",label="\$\\mathcal{O}(\\gamma^2)\$") loglog(gs_, gs_ * 0.5*abs(sval_[1])/gs_[1], "--",label="\$\\mathcal{O}(\\gamma)\$") plt.gca().invert_xaxis() legend() xlabel("\$\\gamma\$") ylabel("Error")
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
2721
using ADCME using PyCall using LinearAlgebra using PyPlot using Random using FwiFlow # Random.seed!(233) function saturation_nn(s0,dporodt,pt,perm,poro,qw,qo,muw,muo,sref,thetaw,configw,thetao,configo,dt,h) saturation_nn_ = load_op_and_grad("./build/libSaturationNn","saturation_nn") s0,dporodt,pt,perm,poro,qw,qo,muw,muo,sref,thetaw,configw,thetao,configo,dt,h = convert_to_tensor(Any[s0,dporodt,pt,perm,poro,qw,qo,muw,muo,sref,thetaw,configw,thetao,configo,dt,h], [Float64,Float64,Float64,Float64,Float64,Float64,Float64,Float64,Float64,Float64,Float64,Int64,Float64,Int64,Float64,Float64]) saturation_nn_(s0,dporodt,pt,perm,poro,qw,qo,muw,muo,sref,thetaw,configw,thetao,configo,dt,h) end # TODO: specify your input parameters m = 100 n = 10 h = 0.01 x = zeros(n, m) y = zeros(n, m) for i = 1:m for j = 1:n x[j, i] = h*i y[j, i] = h*j end end t = 3.0 s0 = @. (x^2 + y^2)/(1+x^2+y^2) * exp(-t) dporodt = exp(t) * ones(n, m) pt = @. (x^2+y^2) perm = rand(n, m) poro = exp(t) * ones(n, m) qw = ones(n, m) qo = ones(n, m) muw = 2.0 muo = 3.0 sref = s0 dt = 0.01 h = 0.1 u3 = sat_op(s0,pt,perm,poro,qw,qo,muw, muo,sref,dt,h) configw = [1,20,20,20,1] configo = [1,20,20,20,1] thetaw = ae_init(configw) thetao = ae_init(configo) u = saturation_nn(s0,dporodt,pt,perm,poro,qw,qo,muw,muo,sref,thetaw,configw,thetao,configo,dt,h) sess = Session(); init(sess) @show run(sess, u3-u) # uncomment it for testing gradients # error() # TODO: change your test parameter to `m` # in the case of `multiple=true`, you also need to specify which component you are testings # gradient check -- v function scalar_function(m) return sum(saturation_nn(s0,m,pt,perm,poro,qw,qo,muw,muo,sref,thetaw,configw,thetao,configo,dt,h)^3) # sum(sat_op(m,pt,perm,poro,qw,qo,muw, muo,sref,dt,h)^2) end # TODO: change `m_` and `v_` to appropriate values m_ = constant(dporodt) v_ = rand(n, m) y_ = scalar_function(m_) dy_ = gradients(y_, m_) ms_ = Array{Any}(undef, 5) ys_ = Array{Any}(undef, 5) s_ = Array{Any}(undef, 5) w_ = Array{Any}(undef, 5) gs_ = @. 10000 / 10^(1:5) for i = 1:5 g_ = gs_[i] ms_[i] = m_ + g_*v_ ys_[i] = scalar_function(ms_[i]) s_[i] = ys_[i] - y_ w_[i] = s_[i] - g_*sum(v_.*dy_) end sess = Session(); init(sess) sval_ = run(sess, s_) wval_ = run(sess, w_) close("all") loglog(gs_, abs.(sval_), "*-", label="finite difference") loglog(gs_, abs.(wval_), "+-", label="automatic differentiation") loglog(gs_, gs_.^2 * 0.5*abs(wval_[1])/gs_[1]^2, "--",label="\$\\mathcal{O}(\\gamma^2)\$") loglog(gs_, gs_ * 0.5*abs(sval_[1])/gs_[1], "--",label="\$\\mathcal{O}(\\gamma)\$") plt.gca().invert_xaxis() legend() xlabel("\$\\gamma\$") ylabel("Error")
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
2695
using ADCME using PyCall using LinearAlgebra using PyPlot using Random Random.seed!(233) if Sys.islinux() py""" import tensorflow as tf libUpwlapOp = tf.load_op_library('build/libUpwlapOp.so') @tf.custom_gradient def upwlap_op(perm,mobi,func,h,rhograv): out = libUpwlapOp.upwlap_op(perm,mobi,func,h,rhograv) def grad(dy): return libUpwlapOp.upwlap_op_grad(dy, out, perm,mobi,func,h,rhograv) return out, grad """ elseif Sys.isapple() py""" import tensorflow as tf libUpwlapOp = tf.load_op_library('build/libUpwlapOp.dylib') @tf.custom_gradient def upwlap_op(perm,mobi,func,h,rhograv): out = libUpwlapOp.upwlap_op(perm,mobi,func,h,rhograv) def grad(dy): return libUpwlapOp.upwlap_op_grad(dy, out, perm,mobi,func,h,rhograv) return out, grad """ elseif Sys.iswindows() py""" import tensorflow as tf libUpwlapOp = tf.load_op_library('build/libUpwlapOp.dll') @tf.custom_gradient def upwlap_op(perm,mobi,func,h,rhograv): out = libUpwlapOp.upwlap_op(perm,mobi,func,h,rhograv) def grad(dy): return libUpwlapOp.upwlap_op_grad(dy, out, perm,mobi,func,h,rhograv) return out, grad """ end upwlap_op = py"upwlap_op" # TODO: # u = upwlap_op(perm,mobi,func,h,rhograv) # sess = Session() # init(sess) # run(sess, u) # TODO: h = 1.0 rho = 1000.0 G = 9.8 len_z = 16 len_x = 32 nz = Int(len_z/h + 1) nx = Int(len_x/h + 1) tf_h=constant(1.0) perm = rand(nz, nx) mobi = rand(nz, nx) func = rand(nz, nx) tf_perm = constant(perm) tf_mobi = constant(mobi) tf_func = constant(func) # gradient check -- v function scalar_function(m) # return sum(tanh(upwlap_op(m, tf_mobi, tf_func, tf_h, constant(rho*G)))) # return sum(tanh(upwlap_op(tf_perm, m, tf_func, tf_h, constant(rho*G)))) return sum(tanh(upwlap_op(tf_perm, tf_mobi, m, tf_h, constant(rho*G)))) end # m_ = constant(rand(10,20)) # m_ = tf_perm # m_ = tf_mobi m_ = tf_func v_ = rand(nz, nx) y_ = scalar_function(m_) dy_ = gradients(y_, m_) ms_ = Array{Any}(undef, 5) ys_ = Array{Any}(undef, 5) s_ = Array{Any}(undef, 5) w_ = Array{Any}(undef, 5) gs_ = @. 1 / 20^(1:5) for i = 1:5 g_ = gs_[i] ms_[i] = m_ + g_*v_ ys_[i] = scalar_function(ms_[i]) s_[i] = ys_[i] - y_ w_[i] = s_[i] - g_*sum(v_.*dy_) end sess = Session() init(sess) sval_ = run(sess, s_) wval_ = run(sess, w_) close("all") loglog(gs_, abs.(sval_), "*-", label="finite difference") loglog(gs_, abs.(wval_), "+-", label="automatic differentiation") loglog(gs_, gs_.^2 * 0.5*abs(wval_[1])/gs_[1]^2, "--",label="\$\\mathcal{O}(\\gamma^2)\$") loglog(gs_, gs_ * 0.5*abs(sval_[1])/gs_[1], "--",label="\$\\mathcal{O}(\\gamma)\$") plt.gca().invert_xaxis() legend() xlabel("\$\\gamma\$") ylabel("Error")
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
3525
using PyTensorFlow using PyCall using LinearAlgebra using PyPlot using Random Random.seed!(233) if Sys.islinux() py""" import tensorflow as tf libUpwpsOp = tf.load_op_library('build/libUpwpsOp.so') @tf.custom_gradient def upwps_op(permi,mobi,src,funcref,h,rhograv,index): pres = libUpwpsOp.upwps_op(permi,mobi,src,funcref,h,rhograv,index) def grad(dy): return libUpwpsOp.upwps_op_grad(dy, pres, permi,mobi,src,funcref,h,rhograv,index) return pres, grad """ elseif Sys.isapple() py""" import tensorflow as tf libUpwpsOp = tf.load_op_library('build/libUpwpsOp.dylib') @tf.custom_gradient def upwps_op(permi,mobi,src,funcref,h,rhograv,index): pres = libUpwpsOp.upwps_op(permi,mobi,src,funcref,h,rhograv,index) def grad(dy): return libUpwpsOp.upwps_op_grad(dy, pres, permi,mobi,src,funcref,h,rhograv,index) return pres, grad """ elseif Sys.iswindows() py""" import tensorflow as tf libUpwpsOp = tf.load_op_library('build/libUpwpsOp.dll') @tf.custom_gradient def upwps_op(permi,mobi,src,funcref,h,rhograv,index): pres = libUpwpsOp.upwps_op(permi,mobi,src,funcref,h,rhograv,index) def grad(dy): return libUpwpsOp.upwps_op_grad(dy, pres, permi,mobi,src,funcref,h,rhograv,index) return pres, grad """ end upwps_op = py"upwps_op" len_z = 4 len_x = 4 rho = 1000.0 G = 9.8 nScale = 5 tf_g = Array{Any}(undef, nScale) tf_coef = Array{Any}(undef, nScale) tf_h = Array{Any}(undef, nScale) tf_p = Array{Any}(undef, nScale) p_true_array = Array{Any}(undef, nScale) p_inv_array = Array{Any}(undef, nScale) h_array = @. 1 / 2^(1:nScale) for iScale = 1:nScale h = h_array[iScale] nz = Int(len_z/h + 1) nx = Int(len_x/h + 1) g = zeros(nz, nx) coef = zeros(nz, nx) p_true = zeros(nz, nx) for i = 1:nz for j = 1:nx g[i,j] = -sin(2*pi/len_z*(i-1)*h) * sin(2*pi/len_x*(j-1)*h) coef[i,j] = 1.0 - cos(2*pi/len_z*(i-1)*h) * sin(2*pi/len_x*(j-1)*h) * len_z / (2*pi*rho*G) # g[i,j] = 2.0*(i-1)*h*exp(-(((i-1)*h)^2) -(((j-1)*h)^2)) * rho * G # coef[i,j] = 1.0 + exp(-(((i-1)*h)^2) -(((j-1)*h)^2)) p_true[i,j] = rho*G*(i-1)*h end end p_true_array[iScale] = p_true .- mean(p_true) # p_true_array[iScale] = p_true tf_g[iScale] = constant(g) tf_coef[iScale] = constant(coef) tf_h[iScale] = constant(h) tf_ones = constant(ones(nz, nx)) tf_p[iScale] = upwps_op(tf_coef[iScale], tf_ones, tf_g[iScale], tf_ones, tf_h[iScale], constant(rho*G), constant(0)) end sess = Session() init(sess) p_inv_array = run(sess, tf_p) for iScale = 1:nScale p_inv_array[iScale] = p_inv_array[iScale] .- mean(p_inv_array[iScale]) end function l2_error(p_true, p_inv, iScale) l2_error = 0.0 l2_norm = 0.0 h = h_array[iScale] nz = size(p_true)[1] nx = size(p_true)[2] for i = 1:nz for j = 1:nx l2_error += (p_true[i,j]-p_inv[i,j])^2 * h^2 l2_norm += (p_true[i,j])^2 * h^2 end end return sqrt(l2_error)/sqrt(l2_norm) end Error_array = Array{Any}(undef, nScale) for iScale = 1:nScale Error_array[iScale] = l2_error(p_true_array[iScale], p_inv_array[iScale], iScale) end loglog(h_array, Error_array, "*-", label="MMS convergence") loglog(h_array, h_array.^2 * 0.5*Error_array[1]/h_array[1]^2, "--",label="\$\\mathcal{O}(h^2)\$") loglog(h_array, h_array * 0.5*Error_array[1]/h_array[1], "-",label="\$\\mathcal{O}(h)\$") plt.gca().invert_xaxis() legend() xlabel("\$h\$") ylabel("Error") # imshow(p)
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
3175
using ADCME using PyCall using LinearAlgebra using PyPlot using Random Random.seed!(233) if Sys.islinux() py""" import tensorflow as tf libUpwpsOp = tf.load_op_library('build/libUpwpsOp.so') @tf.custom_gradient def upwps_op(permi,mobi,src,funcref,h,rhograv,index): pres = libUpwpsOp.upwps_op(permi,mobi,src,funcref,h,rhograv,index) def grad(dy): return libUpwpsOp.upwps_op_grad(dy, pres, permi,mobi,src,funcref,h,rhograv,index) return pres, grad """ elseif Sys.isapple() py""" import tensorflow as tf libUpwpsOp = tf.load_op_library('build/libUpwpsOp.dylib') @tf.custom_gradient def upwps_op(permi,mobi,src,funcref,h,rhograv,index): pres = libUpwpsOp.upwps_op(permi,mobi,src,funcref,h,rhograv,index) def grad(dy): return libUpwpsOp.upwps_op_grad(dy, pres, permi,mobi,src,funcref,h,rhograv,index) return pres, grad """ elseif Sys.iswindows() py""" import tensorflow as tf libUpwpsOp = tf.load_op_library('build/libUpwpsOp.dll') @tf.custom_gradient def upwps_op(permi,mobi,src,funcref,h,rhograv,index): pres = libUpwpsOp.upwps_op(permi,mobi,src,funcref,h,rhograv,index) def grad(dy): return libUpwpsOp.upwps_op_grad(dy, pres, permi,mobi,src,funcref,h,rhograv,index) return pres, grad """ end upwps_op = py"upwps_op" # gradient check -- v h = 20.0 rho = 1000.0 G = 9.8 len_z = 16*20 len_x = 32*20 nz = Int(len_z/h + 1) nx = Int(len_x/h + 1) tf_h=constant(1.0) coef = zeros(nz, nx) ones_cons = ones(nz, nx) rhs = zeros(nz, nx) for i = 1:nz for j = 1:nx rhs[i,j] = -sin(2*pi/len_z*(i-1)*h) * sin(2*pi/len_x*(j-1)*h) coef[i,j] = 1.0 - cos(2*pi/len_z*(i-1)*h) * sin(2*pi/len_x*(j-1)*h) * len_z / (2*pi*rho*G) # rhs[i,j] = 2.0*(i-1)*h*exp(-(((i-1)*h)^2) -(((j-1)*h)^2)) * rho * G # coef[i,j] = 1.0 + exp(-(((i-1)*h)^2) -(((j-1)*h)^2)) end end # coef = 1.0 .+ rand(nz, nx) # rhs = rand(nz, nx) tf_coef = constant(coef) tf_rhs = constant(rhs) tf_funcref = constant(rand(nz, nx)) tf_ones = constant(ones_cons) function scalar_function(m) # return sum(tanh(upwps_op(tf_coef, tf_ones, m, tf_funcref, tf_h, constant(rho*G), constant(0)))) # return sum(tanh(upwps_op(m, tf_ones, tf_rhs, tf_ones, tf_h, constant(rho*G), constant(0)))) return sum(tanh(upwps_op(tf_ones, m, tf_rhs, tf_ones, tf_h, constant(rho*G), constant(0)))) end # m_ = tf_rhs m_ = tf_coef v_ = 0.01*rand(nz, nx) y_ = scalar_function(m_) dy_ = gradients(y_, m_) ms_ = Array{Any}(undef, 5) ys_ = Array{Any}(undef, 5) s_ = Array{Any}(undef, 5) w_ = Array{Any}(undef, 5) gs_ = @. 1 / 10^(1:5) for i = 1:5 g_ = gs_[i] ms_[i] = m_ + g_*v_ ys_[i] = scalar_function(ms_[i]) s_[i] = ys_[i] - y_ w_[i] = s_[i] - g_*sum(v_.*dy_) end sess = Session() init(sess) sval_ = run(sess, s_) wval_ = run(sess, w_) close("all") loglog(gs_, abs.(sval_), "*-", label="finite difference") loglog(gs_, abs.(wval_), "+-", label="automatic differentiation") loglog(gs_, gs_.^2 * 0.5*abs(wval_[1])/gs_[1]^2, "--",label="\$\\mathcal{O}(\\gamma^2)\$") loglog(gs_, gs_ * 0.5*abs(sval_[1])/gs_[1], "--",label="\$\\mathcal{O}(\\gamma)\$") plt.gca().invert_xaxis() legend() xlabel("\$\\gamma\$") ylabel("Error")
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
307
using Documenter, FwiFlow makedocs(sitename="FwiFlow", modules=[FwiFlow], pages = Any[ "index.md", "api.md", "Tutorial" => ["tutorials/fwi.md","tutorials/flow.md", "tutorials/timefrac.md"] ], authors = "Dongzhuo Li and Kailai Xu") deploydocs( repo = "github.com/lidongzh/FwiFlow.jl.git", )
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
4469
using PyPlot using LinearAlgebra using PyTensorFlow using PyCall np = pyimport("numpy") include("poisson_op.jl") # Solver parameters m = 20 n = 20 h = 1.0 # 50 meters T = 0.01 # 100 days NT = 1000 Δt = T/NT x = (1:m)*h|>collect z = (1:n)*h|>collect X, Z = np.meshgrid(x, z) # todo #= Krw, Kro -- function mxn tensor to mxn tensor μw, μo -- known mxn ρw, ρo -- known mxn K -- scalar g -- constant ≈ 9.8? ϕ -- known mxn =# function Krw(Sw) return Sw ^ 2 end function Kro(So) return So ^2 end ρw = constant(1.) ρo = constant(1.) μw = constant(1.) μo = constant(1.) K = Variable(ones(m,n)) g = constant(1.0) ϕ = Variable(0.25 .* ones(m,n)) function geto(o::Union{Array,PyObject}, i::Int64, j::Int64) if i==-1 ii = 1:m-2 elseif i==0 ii = 2:m-1 else ii = 3:m end if j==-1 jj = 1:n-2 elseif j==0 jj = 2:n-1 else jj = 3:n end return o[ii,jj] end function G(f, p) f1 = (geto(f, 0, 0) + geto(f, 1, 0))/2 f2 = (geto(f, -1, 0) + geto(f, 0, 0))/2 f3 = (geto(f,0,1) + geto(f,0,0))/2 f4 = (geto(f,0,-1) + geto(f,0,0))/2 rhs = -f1.*(geto(p,1,0)-geto(p,0,0)) + f2.*(geto(p,0,0)-geto(p,-1,0)) - f3.*(geto(p,0,1)-geto(p,0,0)) + f4.*(geto(p,0,0)-geto(p,0,-1)) local q if isa(rhs, Array) q = zeros(m, n) q[2:m-1, 2:n-1] = rhs/h^2 else q = constant(zeros(m, n)) q = scatter_add(q, 2:m-1, 2:n-1, rhs/h^2) # q[2:m-1, 2:n-1] += rhs/h^2 end q end # variables : sw, u, v, p # (time dependent) parameters: qw, qo, ϕ function onestep(sw, qw, qo) # step 1: update p λw = Krw(sw)/μw λo = Kro(1-sw)/μo λ = λw + λo f = λw/λ q = qw + qo Θ = G((λw*ρw+λo*ρo)*g, Z) p = poisson_op(λ.*K, Θ+q, constant(h)) # step 2: update u, v rhs_u = -(geto(K, 0, 0)+geto(K,1,0))/2.0 .* (geto(λ, 0, 0) + get(λ, 1, 0))/2h .* (geto(p, 1, 0) - geto(p, 0, 0)) rhs_v = -(geto(K, 0, 0)+geto(K,0,1))/2.0 .* (geto(λ, 0, 0) + get(λ, 0, 1))/2h .* (geto(p, 0, 1) - geto(p, 0, 0)) + (geto(K, 0, 0)+geto(K,0,1))/2.0 .* (geto(λw*ρw+λo*ρo, 0, 0)+geto(λw*ρw+λo*ρo, 0, 1))/2 * g u = constant(zeros(m, n)) v = constant(zeros(m, n)) u = scatter_add(u, 2:m-1, 2:n-1, rhs_u) v = scatter_add(v, 2:m-1, 2:n-1, rhs_v) # step 3: update sw rhs = geto(qw, 0, 0) - (geto(f, 1, 0)-geto(f, 0, 0))/h.*geto(u, 0, 0) - (geto(f, 0, 1)-geto(f, 0, 0))/h.*geto(v, 0, 0) - geto(f, 0, 0) .* ( (geto(u, 0, 0)-geto(u, -1, 0))/h + (geto(v, 0, 0)-geto(v, 0, -1))/h ) - geto(G(K.*f.*λo*(ρw-ρo)*g, Z), 0, 0) rhs = Δt*rhs/geto(ϕ, 0, 0) sw = scatter_add(sw, 2:m-1, 2:n-1, rhs) return sw, p end """ solve(qw, qo, sw0, p0) Solve the two phase flow equation. `qw` and `qo` -- `NT x m x n` numerical array, `qw[i,:,:]` the corresponding value of qw at i*Δt `sw0` and `p0` -- initial value for `sw` and `p`. `m x n` numerical array. """ function solve(qw, qo, sw0) qw_arr = constant(qw) # qw: NT x m x n array qo_arr = constant(qo) function condition(i, tas...) i <= NT end function body(i, tas...) ta_sw, ta_p = tas sw, p = onestep(read(ta_sw, i), qw_arr[i], qo_arr[i]) ta_sw = write(ta_sw, i+1, sw) ta_p = write(ta_p, i+1, p) i+1, ta_sw, ta_p end ta_sw, ta_p = TensorArray(NT+1), TensorArray(NT+1) ta_sw = write(ta_sw, 1, constant(sw0)) i = constant(1, dtype=Int32) _, ta_sw, ta_p = while_loop(condition, body, [i; ta_sw; ta_p]) out_sw, out_p = stack(ta_sw), stack(ta_p) end function vis(val, args...;kwargs...) close("all") ns = Int64.(round.(LinRange(1,size(val,1),9))) for i = 1:9 subplot(330+i) imshow(val[ns[i],:,:], args...;kwargs...) colorbar() end end qw = zeros(NT, m, n) qw[:,15,5] .= 1.0 qo = zeros(NT, m, n) sw0 = zeros(m, n) out_sw, out_p = solve(qw, qo, sw0) sess = Session(); init(sess) S, P = run(sess, [out_sw, out_p]) vis(S) # figure() # vis(P) error("stop") #= # Step 1: Assign numerical values to qw, qo, sw0, p0 # qw = # qo = # sw0 = # p0 = qw = zeros(NT, m, n) qw[:,15,5] .= 0.0018 qo = zeros(NT, m, n) sw0 = zeros(m, n) p0 = 3.0337e+07*ones(m,n) # # Step 2: Construct Graph out_sw, out_p = solve(qw, qo, sw0, p0) # # Step 3: Run sess = Session() init(sess) sw, p = run(sess, [out_sw, out_p]) # # Step 4: Visualize # vis(sw) =#
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
1379
# void cufd(double *res, double *grad_Cp, double *grad_Cs, double *grad_Den, # double *grad_stf, const double *Cp, const double *Cs, # const double *Den, const double *stf, int calc_id, const int gpu_id, # int group_size, const int *shot_ids, const string para_fname); function obscalc(cp,cs,den,stf,shot_ids,para_fname) m, n = size(cp) res = zeros(1) grad_Cp = zeros(m, n) grad_Cs = zeros(m, n) grad_Den = zeros(m, n) grad_stf = zeros(size(stf)...) calc_id = Int32(2) gpu_id = Int32(0) group_size = length(shot_ids) ccall((:cufd, "./Src/build/libCUFD.so"), Cvoid, (Ref{Cdouble}, Ref{Cdouble}, Ref{Cdouble}, Ref{Cdouble}, Ref{Cdouble}, Ref{Cdouble}, Ref{Cdouble}, Ref{Cdouble}, Ref{Cdouble}, Cint, Cint, Cint, Ref{Cint}, Cstring), res, grad_Cp, grad_Cs, grad_Den, grad_stf, cp, cs, den, stf, calc_id, gpu_id, group_size, shot_ids, para_fname) end nz = 134 nx = 384 cp = 2500ones(nz, nx) cs = zeros(nz, nx) den = 1000ones(nz, nx) shot_ids = Int32[0 1] para_fname = "/home/lidongzh/TwoPhaseFlowFWI/Ops/FWI/Src/params/Par_file_obs_data.json" src = Matrix{Float64}(undef, 1, 2001) src[1,:] = Float64.(reinterpret(Float32, read("/home/lidongzh/TwoPhaseFlowFWI/Ops/FWI/Src/params/ricker_10Hz.bin"))) stf = repeat(src, outer=30) obscalc(cp,cs,den,stf,shot_ids,para_fname)
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
1818
# input: nz, nx, dz, dx, nSteps, nPoints_pml, nPad, dt, f0, survey_fname, data_dir_name, scratch_dir_name, isAc using JSON using DataStructures function paraGen(nz, nx, dz, dx, nSteps, dt, f0, nPml, nPad, filter_para, isAc, para_fname, survey_fname, data_dir_name, scratch_dir_name="") para = OrderedDict() para["nz"] = nz para["nx"] = nx para["dz"] = dz para["dx"] = dx para["nSteps"] = nSteps para["dt"] = dt para["f0"] = f0 para["nPoints_pml"] = nPml para["nPad"] = nPad para["isAc"] = isAc para["if_win"] = false para["filter"] = filter_para para["if_src_update"] = false para["survey_fname"] = survey_fname para["data_dir_name"] = data_dir_name if(scratch_dir_name != "") para["scratch_dir_name"] = scratch_dir_name end para_string = JSON.json(para) open(para_fname,"w") do f write(f, para_string) end end # all shots share the same number of receivers function surveyGen(z_src, x_src, z_rec, x_rec, survey_fname) nsrc = length(x_src) nrec = length(x_rec) survey = OrderedDict() survey["nShots"] = nsrc for i = 1:nsrc shot = OrderedDict() shot["z_src"] = z_src[i] shot["x_src"] = x_src[i] shot["nrec"] = nrec shot["z_rec"] = z_rec shot["x_rec"] = x_rec survey["shot$(i-1)"] = shot end survey_string = JSON.json(survey) open(survey_fname,"w") do f write(f, survey_string) end end function sourceGene(f, nStep, delta_t) # Ricker wavelet generation and integration for source # Dongzhuo Li @ Stanford # May, 2015 e = pi*pi*f*f; t_delay = 1.2/f; source = Matrix{Float64}(undef, 1, nStep) for it = 1:nStep source[it] = (1-2*e*(delta_t*(it-1)-t_delay)^2)*exp(-e*(delta_t*(it-1)-t_delay)^2); end for it = 2:nStep source[it] = source[it] + source[it-1]; end source = source * delta_t; end
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
1821
if Sys.islinux() py""" import tensorflow as tf libFwiOp = tf.load_op_library('./build/libFwiOp.so') @tf.custom_gradient def fwi_op(cp,cs,den,stf,gpu_id,shot_ids,para_fname): res = libFwiOp.fwi_op(cp,cs,den,stf,gpu_id,shot_ids,para_fname) def grad(dy): return libFwiOp.fwi_op_grad(dy, tf.constant(1.0,dtype=tf.float64),cp,cs,den,stf,gpu_id,shot_ids,para_fname) return res, grad def fwi_obs_op(cp,cs,den,stf,gpu_id,shot_ids,para_fname): res = libFwiOp.fwi_obs_op(cp,cs,den,stf,gpu_id,shot_ids,para_fname) return res """ elseif Sys.isapple() py""" import tensorflow as tf libFwiOp = tf.load_op_library('./build/libFwiOp.dylib') @tf.custom_gradient def fwi_op(cp,cs,den): res = libFwiOp.fwi_op(cp,cs,den,stf,gpu_id,shot_ids,para_fname) def grad(dy): return libFwiOp.fwi_op_grad(dy,tf.constant(1.0,dtype=tf.float64),cp,cs,den,stf,gpu_id,shot_ids,para_fname) return res, grad def fwi_obs_op(cp,cs,den,stf,gpu_id,shot_ids,para_fname): res = libFwiOp.fwi_obs_op(cp,cs,den,stf,gpu_id,shot_ids,para_fname) return res """ elseif Sys.iswindows() py""" import tensorflow as tf libFwiOp = tf.load_op_library('./build/libFwiOp.dll') @tf.custom_gradient def fwi_op(cp,cs,den): res = libFwiOp.fwi_op(cp,cs,den,stf,gpu_id,shot_ids,para_fname) def grad(dy): return libFwiOp.fwi_op_grad(dy,tf.constant(1.0,dtype=tf.float64),cp,cs,den,stf,gpu_id,shot_ids,para_fname) return res, grad def fwi_obs_op(cp,cs,den,stf,gpu_id,shot_ids,para_fname): res = libFwiOp.fwi_obs_op(cp,cs,den,stf,gpu_id,shot_ids,para_fname) return res """ end fwi_op = py"fwi_op" fwi_obs_op = py"fwi_obs_op"
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
4059
using PyTensorFlow using PyCall using LinearAlgebra using PyPlot using Random Random.seed!(233) include("fwi_util.jl") include("fwi_util_op.jl") np = pyimport("numpy") # argsparse.jl # ENV["CUDA_VISIBLE_DEVICES"] = 1 # ENV["PARAMDIR"] = "Src/params/" # config = tf.ConfigProto(device_count = Dict("GPU"=>0)) nz = 200 nx = 200 dz = 20 dx = 20 nSteps = 2001 dt = 0.0025 f0 = 4.5 filter_para = [0, 0.1, 100.0, 200.0] nPml = 32 isAc = true nPad = 0 # x_src = collect(5:10:nx-2nPml-5) # z_src = 2ones(Int64, size(x_src)) # x_rec = collect(5:100-nPml) # z_rec = 2ones(Int64, size(x_rec)) x_src = [100-nPml] z_src = [100-nPml] z = (5:10:nz-2nPml-5)|>collect x = (5:10:nx-2nPml-5)|>collect x_rec, z_rec = np.meshgrid(x, z) x_rec = x_rec[:] z_rec = z_rec[:] # x_rec = collect(5:1:nx-2nPml-5) # z_rec = 60ones(Int64, size(x_rec)) para_fname = "./para_file.json" survey_fname = "./survey_file.json" data_dir_name = "./Data" paraGen(nz, nx, dz, dx, nSteps, dt, f0, nPml, nPad, filter_para, isAc, para_fname, survey_fname, data_dir_name) surveyGen(z_src, x_src, z_rec, x_rec, survey_fname) cp = 3000ones(nz, nx) # cp = (1. .+ 0.1*rand(nz, nx)) .* 3000. cs = zeros(nz, nx) den = 1000.0 .* ones(nz, nx) tf_cp = constant(cp) tf_cs = constant(cs) tf_den = constant(den) src = Matrix{Float64}(undef, 1, 2001) src[1,:] = Float64.(reinterpret(Float32, read("./Src/params/ricker_10Hz.bin"))) tf_stf = constant(repeat(src, outer=length(z_src))) tf_para_fname = tf.strings.join([para_fname]) tf_gpu_id0 = constant(0, dtype=Int32) tf_gpu_id1 = constant(1, dtype=Int32) tf_shot_ids0 = constant(collect(Int32, 0:length(x_src)-1), dtype=Int32) tf_shot_ids1 = constant(collect(Int32, 13:25), dtype=Int32) res1 = fwi_obs_op(tf_cp, tf_cs, tf_den, tf_stf, tf_gpu_id0, tf_shot_ids0, tf_para_fname) # res2 = fwi_obs_op(tf_cp2, tf_cs2, tf_den2, tf_stf, tf_gpu_id1, tf_shot_ids0, tf_para_fname) sess=Session();init(sess); @time run(sess, res1) error("") # error("") # function obj() # res = 0.0 # for i = 1:29 # gpu_id = mod(i, 2) # res += fwi_op(tf_cp, tf_cs, tf_den, tf_stf, constant(gpu_id, dtype=Int32), constant([i], dtype=Int32), tf_para_fname) # end # return res # end # J = obj() # J1 = fwi_op(tf_cp, tf_cs, tf_den, tf_stf, tf_gpu_id0, tf_shot_ids0, tf_para_fname) # J2 = fwi_op(tf_cp, tf_cs, tf_den, tf_stf, tf_gpu_id1, tf_shot_ids0, tf_para_fname) # J = J1 + J2 # # config = tf.ConfigProto() # # config.allow_growth # # config.intra_op_parallelism_threads = 2 # # config.inter_op_parallelism_threads = 2 # sess=Session();init(sess); # # @time run(sess, J) # gg = gradients(J1, tf_cp) # grad_cp = run(sess, gg) # imshow(grad_cp);colorbar(); # error("") # gradient check -- v function scalar_function(m) return fwi_op(m, tf_cs, tf_den, tf_stf, tf_gpu_id0, tf_shot_ids0, tf_para_fname) end # open("./Data/Shot1.bin","w") do f # write(f, zeros(nz*nx,1)) # end m_ = constant(3100ones(nz, nx)) # m_ = constant(cp) # v_ = 100. .* (1. .+ rand(Float32, 384, 134)) v0 = zeros(nz, nx) # PLEASE!!!!!!!!!!!!!! Don't perturb in the CPML region!!!!!!!!!!!!!!!!!!!!!!! v0[nPml+5:nz-nPml-5, nPml+5:nx-nPml-5] .= 1.0 v_ = constant(Float64.((1. .+ 0.1*rand(nz, nx)) .* 500 .* v0)) y_ = scalar_function(m_) dy_ = gradients(y_, m_) ms_ = Array{Any}(undef, 5) ys_ = Array{Any}(undef, 5) s_ = Array{Any}(undef, 5) w_ = Array{Any}(undef, 5) gs_ = @. 1 / 10^(1:5) for i = 1:5 g_ = gs_[i] ms_[i] = m_ + g_ * v_ ys_[i] = scalar_function(ms_[i]) s_[i] = ys_[i] - y_ w_[i] = s_[i] - g_*sum(v_.*dy_) end sess = Session() init(sess) sval_ = run(sess, s_) wval_ = run(sess, w_) # error("") sval_ = [x[1] for x in sval_] wval_ = [x[1] for x in wval_] close("all") loglog(gs_, abs.(sval_), "*-", label="finite difference") loglog(gs_, abs.(wval_), "+-", label="automatic differentiation") loglog(gs_, gs_.^2 * 0.5*abs(wval_[1])/gs_[1]^2, "--",label="\$\\mathcal{O}(\\gamma^2)\$") loglog(gs_, gs_ * 0.5*abs(sval_[1])/gs_[1], "--",label="\$\\mathcal{O}(\\gamma)\$") plt.gca().invert_xaxis() legend() xlabel("\$\\gamma\$") ylabel("Error")
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
7078
using ArgParse function parse_commandline() s = ArgParseSettings() @add_arg_table s begin "--generate_data" arg_type = Bool default = false "--version" arg_type = String default = "0000" "--gpuIds" arg_type = String default = "0" "--indStage" arg_type = Int64 default = 2 "--verbose" arg_type = Bool default = false end return parse_args(s) end args = parse_commandline() if !isdir("./$(args["version"])") mkdir("./$(args["version"])") end if !isdir("./$(args["version"])/Stage$(args["indStage"])") mkdir("./$(args["version"])/Stage$(args["indStage"])") end using PyTensorFlow using PyCall using LinearAlgebra using PyPlot using Random Random.seed!(233) include("ops_imseq.jl") include("../Ops/FWI/fwi_util.jl") include("fwi_util_op.jl") np = pyimport("numpy") # NOTE Parameters # const ALPHA = 0.006323996017182 # const SRC_CONST = 5.6146 # const GRAV_CONST = 1.0/144.0 const ALPHA = 1.0 const SRC_CONST = 86400.0 const GRAV_CONST = 1.0 # NOTE Hyperparameter for flow simulation NT = 50 dt_survey = 5 Δt = 20.0 # day if args["generate_data"] m = 90 n = 180 h = 5.0 # meter qw = zeros(NT, m, n) qw[:,54,18] .= 0.005 * (1/h^2)/10.0 * SRC_CONST qo = zeros(NT, m, n) qo[:,54,168] .= -0.005 * (1/h^2)/10.0 * SRC_CONST else m = 45 n = 90 h = 10.0 # meter qw = zeros(NT, m, n) qw[:,27,9] .= 0.005 * (1/h^2)/10.0 * SRC_CONST qo = zeros(NT, m, n) qo[:,27,84] .= -0.005 * (1/h^2)/10.0 * SRC_CONST end z = (1:m)*h|>collect x = (1:n)*h|>collect X, Z = np.meshgrid(x, z) # ρw = 996.9571 # ρo = 640.7385 # μw = 1.0 # μo = 3.0 ρw = 501.9 ρo = 1053.0 μw = 0.1 μo = 1.0 # K_init = 20.0 .* ones(m,n) g = 9.8*GRAV_CONST ϕ = 0.25 .* ones(m,n) # qw = zeros(NT, m, n) # qw[:,54,18] .= 0.005 * (1/h^2)/10.0 * SRC_CONST # qo = zeros(NT, m, n) # qo[:,54,168] .= -0.005 * (1/h^2)/10.0 * SRC_CONST sw0 = zeros(m, n) survey_indices = collect(1:dt_survey:NT+1) # 10 stages n_survey = length(survey_indices) # NOTE Hyperparameter for fwi_op # argsparse.jl # ENV["CUDA_VISIBLE_DEVICES"] = 1 # ENV["PARAMDIR"] = "Src/params/" # config = tf.ConfigProto(device_count = Dict("GPU"=>0)) dz = 3 # meters dx = 3 nz = Int64(round((m * h) / dz)) + 1 nx = Int64(round((n * h) / dx)) + 1 nPml = 64 nSteps = 3001 dt = 0.00025 f0 = 50.0 nPad = 32 - mod((nz+2*nPml), 32) nz_pad = nz + 2*nPml + nPad nx_pad = nx + 2*nPml # reflection # x_src = collect(5:20:nx-5) # z_src = 5ones(Int64, size(x_src)) # x_rec = collect(5:1:nx-5) # z_rec = 5 .* ones(Int64, size(x_rec)) # xwell # # z_src = collect(5:10:nz-5) #14->11srcs 10->15srcs # # z_src = collect(5:10:nz-5) z_src = collect(5:10:nz-5) x_src = 5ones(Int64, size(z_src)) z_rec = collect(5:1:nz-5) x_rec = (nx-5) .* ones(Int64, size(z_rec)) # para_fname = "./$(args["version"])/para_file.json" # survey_fname = "./$(args["version"])/survey_file.json" # data_dir_name = "./$(args["version"])/Data" # paraGen(nz, nx, dz, dx, nSteps, dt, f0, nPml, nPad, filter_para, isAc, para_fname, survey_fname, data_dir_name) # surveyGen(z_src, x_src, z_rec, x_rec, survey_fname) cp_nopad = 3500.0 .* ones(nz, nx) # initial cp cs = cp_nopad ./ sqrt(3.0) den = 2200.0 .* ones(nz, nx) cp_pad = 3500.0 .* ones(nz_pad, nx_pad) # initial cp cs_pad = cp_pad ./ sqrt(3.0) den_pad = 2200.0 .* ones(nz_pad, nx_pad) cp_pad_value = 3500.0 # tf_cp = constant(cp) tf_cs = constant(cs_pad) tf_den = constant(den_pad) # src = Matrix{Float64}(undef, 1, 2001) # # src[1,:] = Float64.(reinterpret(Float32, read("../Ops/FWI/Src/params/ricker_10Hz.bin"))) # src[1,:] = Float64.(reinterpret(Float32, read("../Ops/FWI/Src/params/Mar_source_2001.bin"))) src = sourceGene(f0, nSteps, dt) tf_stf = constant(repeat(src, outer=length(z_src))) # tf_para_fname = tf.strings.join([para_fname]) tf_gpu_id0 = constant(0, dtype=Int32) tf_gpu_id1 = constant(1, dtype=Int32) gpu_id_array = [parse(Int, ss) for ss in split(args["gpuIds"],"_")] nGpus = length(gpu_id_array) tf_gpu_id_array = constant(gpu_id_array, dtype=Int32) tf_shot_ids0 = constant(collect(Int32, 0:length(x_src)-1), dtype=Int32) tf_shot_ids1 = constant(collect(Int32, 13:25), dtype=Int32) # NOTE Hyperparameter for rock physics tf_bulk_fl1 = constant(2.735e9) tf_bulk_fl2 = constant(0.125e9) # to displace fl1 tf_bulk_sat1 = constant(den .* (cp_nopad.^2 .- 4.0/3.0 .* cp_nopad.^2 ./3.0)) # vp/vs ratio as sqrt(3) tf_bulk_min = constant(36.6e9) tf_shear_sat1 = constant(den .* cp_nopad.^2 ./3.0) tf_ϕ_pad = tf.image.resize_bilinear(tf.reshape(constant(ϕ), (1, m, n, 1)), (nz, nx)) # upsample the porosity tf_ϕ_pad = cast(tf_ϕ_pad, Float64) tf_ϕ_pad = squeeze(tf_ϕ_pad) tf_shear_pad = tf.pad(tf_shear_sat1, [nPml (nPml+nPad); nPml nPml], constant_values=den[1,1] * cp_nopad[1,1]^2 /3.0) / 1e6 function Gassman(sw) tf_bulk_fl_mix = 1.0/( (1-sw)/tf_bulk_fl1 + sw/tf_bulk_fl2 ) temp = tf_bulk_sat1/(tf_bulk_min - tf_bulk_sat1) - tf_bulk_fl1/tf_ϕ_pad /(tf_bulk_min - tf_bulk_fl1) + tf_bulk_fl_mix/tf_ϕ_pad /(tf_bulk_min - tf_bulk_fl_mix) tf_bulk_new = tf_bulk_min / (1.0/temp + 1.0) # tf_den_new = constant(den) + tf_ϕ_pad .* sw * (ρw - ρo) *16.018463373960138; tf_den_new = constant(den) + tf_ϕ_pad .* sw * (ρw - ρo) # tf_cp_new = sqrt((tf_bulk_new + 4.0/3.0 * tf_shear_sat1)/tf_den_new) tf_lambda_new = tf_bulk_new - 2.0/3.0 * tf_shear_sat1 return tf_lambda_new, tf_den_new end tf_brie_coef = Variable(2.0*30.0) # tf_brie_coef = constant(3.0) # tf_brie_coef = constant(2.0) function Brie(sw) tf_bulk_fl_mix = (tf_bulk_fl1-tf_bulk_fl2)*(1-sw)^(tf_brie_coef/30.0) + tf_bulk_fl2 temp = tf_bulk_sat1/(tf_bulk_min - tf_bulk_sat1) - tf_bulk_fl1/tf_ϕ_pad /(tf_bulk_min - tf_bulk_fl1) + tf_bulk_fl_mix/tf_ϕ_pad /(tf_bulk_min - tf_bulk_fl_mix) tf_bulk_new = tf_bulk_min / (1.0/temp + 1.0) # tf_den_new = constant(den) + tf_ϕ_pad .* sw * (ρw - ρo) *16.018463373960138; tf_den_new = constant(den) + tf_ϕ_pad .* sw * (ρw - ρo) # tf_cp_new = sqrt((tf_bulk_new + 4.0/3.0 * tf_shear_sat1)/tf_den_new) tf_lambda_new = tf_bulk_new - 2.0/3.0 * tf_shear_sat1 return tf_lambda_new, tf_den_new end function RockLinear(sw) # tf_lambda_new = constant(7500.0*1e6 .* ones(nz,nx)) + (17400.0-7500.0)*1e6 * sw tf_lambda_new = constant(7500.0*1e6 .* ones(nz,nx)) + (9200.0-7500.0)*1e6 * sw tf_den_new = constant(den) + tf_ϕ_pad .* sw * (ρw - ρo) return tf_lambda_new, tf_den_new end tf_patch_temp = tf_bulk_sat1/(tf_bulk_min - tf_bulk_sat1) - tf_bulk_fl1/tf_ϕ_pad /(tf_bulk_min - tf_bulk_fl1) + tf_bulk_fl2/tf_ϕ_pad /(tf_bulk_min - tf_bulk_fl2) tf_bulk_sat2 = tf_bulk_min/(1.0/tf_patch_temp + 1.0) function Patchy(sw) tf_bulk_new = 1/( (1-sw)/(tf_bulk_sat1+4.0/3.0*tf_shear_sat1) + sw/(tf_bulk_sat2+4.0/3.0*tf_shear_sat1) ) - 4.0/3.0*tf_shear_sat1 tf_lambda_new = tf_bulk_new - 2.0/3.0 * tf_shear_sat1 tf_den_new = constant(den) + tf_ϕ_pad .* sw * (ρw - ρo) return tf_lambda_new, tf_den_new end
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
2036
if Sys.islinux() py""" import tensorflow as tf import socket if socket.gethostname() != "Dolores": libFwiOp = tf.load_op_library('../Ops/FWI/build/libFwiOp.so') else: libFwiOp = tf.load_op_library('../Ops/FWI/build_dolores/libFwiOp.so') @tf.custom_gradient def fwi_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname): misfit = libFwiOp.fwi_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname) def grad(dy): return libFwiOp.fwi_op_grad(dy, tf.constant(1.0,dtype=tf.float64),λ,μ,ρ,stf,gpu_id,shot_ids,para_fname) return misfit, grad def fwi_obs_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname): misfit = libFwiOp.fwi_obs_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname) return misfit """ elseif Sys.isapple() py""" import tensorflow as tf libFwiOp = tf.load_op_library('../Ops/FWI/build/libFwiOp.so') @tf.custom_gradient def fwi_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname): misfit = libFwiOp.fwi_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname) def grad(dy): return libFwiOp.fwi_op_grad(dy, tf.constant(1.0,dtype=tf.float64),λ,μ,ρ,stf,gpu_id,shot_ids,para_fname) return misfit, grad def fwi_obs_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname): misfit = libFwiOp.fwi_obs_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname) return misfit """ elseif Sys.iswindows() py""" import tensorflow as tf libFwiOp = tf.load_op_library('../Ops/FWI/build/libFwiOp.so') @tf.custom_gradient def fwi_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname): misfit = libFwiOp.fwi_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname) def grad(dy): return libFwiOp.fwi_op_grad(dy, tf.constant(1.0,dtype=tf.float64),λ,μ,ρ,stf,gpu_id,shot_ids,para_fname) return misfit, grad def fwi_obs_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname): misfit = libFwiOp.fwi_obs_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname) return misfit """ end fwi_op = py"fwi_op" fwi_obs_op = py"fwi_obs_op"
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
2301
using PyTensorFlow using PyCall using LinearAlgebra using PyPlot using Random Random.seed!(233) if Sys.islinux() py""" import tensorflow as tf libLaplacian = tf.load_op_library('../Ops/Laplacian/build/libLaplacian.so') @tf.custom_gradient def laplacian_op(coef,func,h,rhograv): p = libLaplacian.laplacian(coef,func,h,rhograv) def grad(dy): return libLaplacian.laplacian_grad(dy, coef, func, h, rhograv) return p, grad """ elseif Sys.isapple() py""" import tensorflow as tf libPoissonOp = tf.load_op_library('../Ops/Laplacian/build/libLaplacian.dylib') @tf.custom_gradient def laplacian_op(coef,func,h,rhograv): p = libLaplacian.laplacian(coef,func,h,rhograv) def grad(dy): return libLaplacian.laplacian_grad(dy, coef, func, h, rhograv) return p, grad """ elseif Sys.iswindows() py""" import tensorflow as tf libPoissonOp = tf.load_op_library('../Ops/Laplacian/build/libLaplacian.dll') @tf.custom_gradient def laplacian_op(coef,func,h,rhograv): p = libLaplacian.laplacian(coef,func,h,rhograv) def grad(dy): return libLaplacian.laplacian_grad(dy, coef, func, h, rhograv) return p, grad """ end laplacian_op = py"laplacian_op" if Sys.islinux() py""" import tensorflow as tf libUpwlapOp = tf.load_op_library('../Ops/Upwlap/build/libUpwlapOp.so') @tf.custom_gradient def upwlap_op(perm,mobi,func,h,rhograv): out = libUpwlapOp.upwlap_op(perm,mobi,func,h,rhograv) def grad(dy): return libUpwlapOp.upwlap_op_grad(dy, out, perm,mobi,func,h,rhograv) return out, grad """ elseif Sys.isapple() py""" import tensorflow as tf libUpwlapOp = tf.load_op_library('../Ops/Upwlap/build/libUpwlapOp.dylib') @tf.custom_gradient def upwlap_op(perm,mobi,func,h,rhograv): out = libUpwlapOp.upwlap_op(perm,mobi,func,h,rhograv) def grad(dy): return libUpwlapOp.upwlap_op_grad(dy, out, perm,mobi,func,h,rhograv) return out, grad """ elseif Sys.iswindows() py""" import tensorflow as tf libUpwlapOp = tf.load_op_library('./Ops/Upwlap/build/libUpwlapOp.dll') @tf.custom_gradient def upwlap_op(perm,mobi,func,h,rhograv): out = libUpwlapOp.upwlap_op(perm,mobi,func,h,rhograv) def grad(dy): return libUpwlapOp.upwlap_op_grad(dy, out, perm,mobi,func,h,rhograv) return out, grad """ end upwlap_op = py"upwlap_op"
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
5967
#= Main program for FWI =# using ArgParse function parse_commandline() s = ArgParseSettings() @add_arg_table s begin "--generate_data" arg_type = Bool default = false "--version" arg_type = String default = "0000" "--verbose" arg_type = Bool default = false end return parse_args(s) end args = parse_commandline() if !isdir("./$(args["version"])") mkdir("./$(args["version"])") end using PyTensorFlow using PyCall using LinearAlgebra using PyPlot using Random Random.seed!(233) include("ops_imseq.jl") include("../Ops/FWI/fwi_util.jl") include("fwi_util_op.jl") np = pyimport("numpy") nz = 134 nx = 384 dz = 24. # meters dx = 24. nSteps = 2001 dt = 0.0025 f0 = 4.5 filter_para = [0, 0.1, 100.0, 200.0] isAc = true nPml = 32 nPad = 32 - mod((nz+2*nPml), 32) nz_pad = nz + 2*nPml + nPad nx_pad = nx + 2*nPml # reflection x_src = collect(4:8:384) z_src = 2ones(Int64, size(x_src)) x_rec = collect(3:381) z_rec = 2ones(Int64, size(x_rec)) # xwell # z_src = collect(5:10:nz-5) #14->11srcs 10->15srcs # x_src = 5ones(Int64, size(z_src)) # z_rec = collect(5:1:nz-5) # x_rec = (nx-5) .* ones(Int64, size(z_rec)) para_fname = "./$(args["version"])/para_file.json" survey_fname = "./$(args["version"])/survey_file.json" data_dir_name = "./$(args["version"])/Data" paraGen(nz_pad, nx_pad, dz, dx, nSteps, dt, f0, nPml, nPad, filter_para, isAc, para_fname, survey_fname, data_dir_name) surveyGen(z_src, x_src, z_rec, x_rec, survey_fname) tf_cp = constant(reshape(reinterpret(Float32,read("Mar_models/Model_Cp_true.bin")),(nz_pad, nx_pad)), dtype=Float64) cs = zeros(nz_pad, nx_pad) den = 1000.0 .* ones(nz_pad, nx_pad) cp_pad_value = 3000.0 # tf_cp = constant(cp) tf_cs = constant(cs) tf_den = constant(den) src = Matrix{Float64}(undef, 1, 2001) # # src[1,:] = Float64.(reinterpret(Float32, read("../Ops/FWI/Src/params/ricker_10Hz.bin"))) src[1,:] = Float64.(reinterpret(Float32, read("../Ops/FWI/Src/params/Mar_source_2001.bin"))) # src = sourceGene(f0, nSteps, dt) tf_stf = constant(repeat(src, outer=length(z_src))) # tf_para_fname = tf.strings.join([para_fname]) tf_gpu_id0 = constant(0, dtype=Int32) tf_gpu_id1 = constant(1, dtype=Int32) nGpus = 2 tf_gpu_id_array = constant(collect(0:nGpus-1), dtype=Int32) tf_shot_ids0 = constant(collect(Int32, 0:length(x_src)-1), dtype=Int32) shot_id_points = Int32.(trunc.(collect(LinRange(0, length(z_src)-1, nGpus+1)))) function pad_cp(cp) tran_cp = cast(cp, Float64) return tf.pad(tran_cp, [nPml (nPml+nPad); nPml nPml], constant_values=3000.0) end # NOTE Generate Data if args["generate_data"] println("Generate Test Data...") if !isdir("./$(args["version"])/Data") mkdir("./$(args["version"])/Data") end res = fwi_obs_op(tf_cp, tf_cs, tf_den, tf_stf, tf_gpu_id0, tf_shot_ids0, para_fname) config = tf.ConfigProto() config.intra_op_parallelism_threads = 24 config.inter_op_parallelism_threads = 24 sess = Session(config=config); init(sess); run(sess, res) error("Generate Data: Stop") end cp_init = reshape(reinterpret(Float32,read("Mar_models/Model_Cp_init_1D.bin")),(nz_pad, nx_pad)) tf_cp_inv = Variable(cp_init, dtype=Float64) Mask = ones(nz_pad, nx_pad) Mask[nPml+1:nPml+10,:] .= 0.0 tf_cp_inv_msk = tf_cp_inv .* constant(Mask) + constant(cp_init[1,1] .* (1. .- Mask)) # NOTE Compute FWI loss # loss = constant(0.0) # for i = 1:nGpus # global loss # tf_shot_ids = constant(collect(shot_id_points[i] : shot_id_points[i+1]), dtype=Int32) # loss += fwi_op(tf_cp_inv_msk, tf_cs, tf_den, tf_stf, tf_gpu_id_array[i], tf_shot_ids, para_fname) # end loss = fwi_op(tf_cp_inv_msk, tf_cs, tf_den, tf_stf, tf_gpu_id_array[1], tf_shot_ids0, para_fname) gradCp = gradients(loss, tf_cp_inv) if args["verbose"] sess = Session(); init(sess) println("Initial loss = ", run(sess, loss)) g = gradients(loss, tfCtxInit.K) G = run(sess, g) pcolormesh(G); savefig("test.png"); close("all") end # Optimization __cnt = 0 # invK = zeros(m,n) function print_loss(l, Cp, gradCp) global __cnt, __l, __Cp, __gradCp if mod(__cnt,1)==0 println("\niter=$__iter, eval=$__cnt, current loss=",l) # println("a=$a, b1=$b1, b2=$b2") end __cnt += 1 __l = l __Cp = Cp __gradCp = gradCp end __iter = 0 function print_iter(rk) global __iter, __l if mod(__iter,1)==0 println("\n************* ITER=$__iter *************\n") end __iter += 1 open("./$(args["version"])/loss.txt", "a") do io writedlm(io, Any[__iter __l]) end open("./$(args["version"])/Cp$__iter.txt", "w") do io writedlm(io, __Cp) end open("./$(args["version"])/gradCp$__iter.txt", "w") do io writedlm(io, __gradCp) end end config = tf.ConfigProto() config.intra_op_parallelism_threads = 24 config.inter_op_parallelism_threads = 24 sess = Session(config=config); init(sess); # cp_low_bd = 1500. .* ones(nz_pad, nx_pad) # cp_high_bd = 5500. .* ones(nz_pad, nx_pad) # cp_high_bd[nPml+1:nPml+10,:] .= 1500.0 opt = ScipyOptimizerInterface(loss, var_list=[tf_cp_inv], var_to_bounds=Dict(tf_cp_inv=> (1500.0, 5500.0)), method="L-BFGS-B", options=Dict("maxiter"=> 100, "ftol"=>1e-6, "gtol"=>1e-6)) @info "Optimization Starts..." ScipyOptimizerMinimize(sess, opt, loss_callback=print_loss, step_callback=print_iter, fetches=[loss,tf_cp_inv,gradCp]) # adam = AdamOptimizer(learning_rate=50.0) # op = minimize(adam, loss) # sess = Session(); init(sess); # for iter = 1:1000 # _, misfit, cp, cpgrad = run(sess, [op, loss, tf_cp_inv, gradCp]) # open("./$(args["version"])/Cp$iter.txt", "w") do io # writedlm(io, cp) # end # open("./$(args["version"])/loss.txt", "a") do io # writedlm(io, Any[iter misfit]) # end # open("./$(args["version"])/gradCp$iter.txt", "w") do io # writedlm(io, cpgrad) # end # end
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
6098
#= Main program for two phase flow inversion =# include("args.jl") function sw_p_to_lambda_den(sw, p) sw = tf.reshape(sw, (1, m, n, 1)) p = tf.reshape(p, (1, m, n, 1)) sw = tf.image.resize_bilinear(sw, (nz, nx)) p = tf.image.resize_bilinear(p, (nz, nx)) sw = cast(sw, Float64) p = cast(p, Float64) sw = squeeze(sw) p = squeeze(p) # tran_lambda, tran_den = Gassman(sw) # tran_lambda, tran_den = RockLinear(sw) # test linear relationship tran_lambda, tran_den = Patchy(sw) # tran_lambda, tran_den = Brie(sw) tran_lambda_pad = tf.pad(tran_lambda, [nPml (nPml+nPad); nPml nPml], constant_values=3500.0^2*2200.0/3.0) /1e6 tran_den_pad = tf.pad(tran_den, [nPml (nPml+nPad); nPml nPml], constant_values=2200.0) return tran_lambda_pad, tran_den_pad end # NOTE Generate Data if args["generate_data"] println("Generate Test Data...") K = 20.0 .* ones(m,n) # millidarcy ix = 1:n y1 = 45. .+ 10. .* sin.(ix./120.0 .* 2.0 .* pi) y2 = 55. .+ 10. .* sin.(ix./120.0 .* 2.0 .* pi) for j = 1:n for i = 1:m if (i > y1[j] && i < y2[j]) K[i, j] = 120; end end end # imshow(K) tfCtxTrue = tfCtxGen(m,n,h,NT,Δt,Z,X,ρw,ρo,μw,μo,K,g,ϕ,qw,qo, sw0, true) out_sw_true, out_p_true = imseq(tfCtxTrue) lambdas = Array{PyObject}(undef, n_survey) dens = Array{PyObject}(undef, n_survey) for i = 1:n_survey sw = out_sw_true[survey_indices[i]] p = out_p_true[survey_indices[i]] lambdas[i], dens[i] = sw_p_to_lambda_den(sw, p) end misfit = Array{PyObject}(undef, n_survey) for i = 1:n_survey if !isdir("./$(args["version"])/Data$i") mkdir("./$(args["version"])/Data$i") end para_fname = "./$(args["version"])/para_file$i.json" survey_fname = "./$(args["version"])/survey_file$i.json" paraGen(nz_pad, nx_pad, dz, dx, nSteps, dt, f0, nPml, nPad, para_fname, survey_fname, "./$(args["version"])/Data$i/") # shot_inds = collect(1:3:length(z_src)) .+ mod(i-1,3) # 5src rotation # shot_inds = i # 1src rotation shot_inds = collect(1:length(z_src)) # all sources surveyGen(z_src[shot_inds], x_src[shot_inds], z_rec, x_rec, survey_fname) tf_shot_ids0 = constant(collect(0:length(shot_inds)-1), dtype=Int32) misfit[i] = fwi_obs_op(lambdas[i], tf_shear_pad, dens[i], tf_stf, tf_gpu_id0, tf_shot_ids0, para_fname) end config = tf.ConfigProto() config.intra_op_parallelism_threads = 24 config.inter_op_parallelism_threads = 24 sess = Session(config=config); init(sess); run(sess, misfit) error("Generate Data: Stop") end if args["indStage"] == 2 K_init = 20.0 .* ones(m,n) elseif args["indStage"] == 1 error("indStage == 1") else ls = readdlm("./$(args["version"])/Stage$(args["indStage"]-1)/loss.txt") Ls = Int64((ls[end,1])) K_init = readdlm("./$(args["version"])/Stage$(args["indStage"]-1)/K$Ls.txt") end tfCtxInit = tfCtxGen(m,n,h,NT,Δt,Z,X,ρw,ρo,μw,μo,K_init,g,ϕ,qw,qo, sw0, false) out_sw_init, out_p_init = imseq(tfCtxInit) lambdas = Array{PyObject}(undef, n_survey) dens = Array{PyObject}(undef, n_survey) for i = 1:n_survey sw = out_sw_init[survey_indices[i]] p = out_p_init[survey_indices[i]] lambdas[i], dens[i] = sw_p_to_lambda_den(sw, p) end # NOTE Compute FWI loss loss = constant(0.0) for i = 1:args["indStage"] global loss para_fname = "./$(args["version"])/para_file$i.json" survey_fname = "./$(args["version"])/survey_file$i.json" # shot_inds = collect(1:3:length(z_src)) .+ mod(i-1,3) # shot_inds = i shot_inds = collect(1:length(z_src)) # all sources tf_shot_ids0 = constant(collect(0:length(shot_inds)-1), dtype=Int32) loss += fwi_op(lambdas[i], tf_shear_pad, dens[i], tf_stf, tf_gpu_id_array[mod(i,nGpus)], tf_shot_ids0, para_fname) # mod(i,2) end gradK = gradients(loss, tfCtxInit.K) if args["verbose"] sess = Session(); init(sess) println("Initial loss = ", run(sess, loss)) g = gradients(loss, tfCtxInit.K) G = run(sess, g) pcolormesh(G); savefig("test.png"); close("all") end # Optimization __cnt = 0 # invK = zeros(m,n) function print_loss(l, K, gradK, brie_coef) global __cnt, __l, __K, __gradK, __brie_coef if mod(__cnt,1)==0 println("\niter=$__iter, eval=$__cnt, current loss=",l) # println("a=$a, b1=$b1, b2=$b2") end __cnt += 1 __l = l __K = K __gradK = gradK __brie_coef = brie_coef end __iter = 0 function print_iter(rk) global __iter, __l if mod(__iter,1)==0 println("\n************* ITER=$__iter *************\n") end __iter += 1 open("./$(args["version"])/Stage$(args["indStage"])/loss.txt", "a") do io writedlm(io, Any[__iter __l]) end open("./$(args["version"])/Stage$(args["indStage"])/K$__iter.txt", "w") do io writedlm(io, __K) end open("./$(args["version"])/Stage$(args["indStage"])/gradK$__iter.txt", "w") do io writedlm(io, __gradK) end open("./$(args["version"])/Stage$(args["indStage"])/brie_coef.txt", "a") do io writedlm(io, Any[__iter __brie_coef]) end end config = tf.ConfigProto() config.intra_op_parallelism_threads = 24 config.inter_op_parallelism_threads = 24 sess = Session(config=config); init(sess); opt = ScipyOptimizerInterface(loss, var_list=[tfCtxInit.K], var_to_bounds=Dict(tfCtxInit.K=> (10.0, 130.0)), method="L-BFGS-B", options=Dict("maxiter"=> 100, "ftol"=>1e-6, "gtol"=>1e-6)) # opt = ScipyOptimizerInterface(loss, var_list=[tfCtxInit.K, tf_brie_coef], var_to_bounds=Dict(tfCtxInit.K=> (10.0, 130.0), tf_brie_coef=>(1.0,100.0)), method="L-BFGS-B", # options=Dict("maxiter"=> 100, "ftol"=>1e-6, "gtol"=>1e-6)) @info "Optimization Starts..." # ScipyOptimizerMinimize(sess, opt, loss_callback=print_loss, step_callback=print_iter, fetches=[loss,tfCtxInit.K,gradK]) ScipyOptimizerMinimize(sess, opt, loss_callback=print_loss, step_callback=print_iter, fetches=[loss,tfCtxInit.K,gradK, tf_brie_coef])
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
5682
#= Main program for two phase flow inversion =# include("args.jl") function sw_p_to_lambda_den(sw, p) sw = tf.reshape(sw, (1, m, n, 1)) p = tf.reshape(p, (1, m, n, 1)) sw = tf.image.resize_bilinear(sw, (nz, nx)) p = tf.image.resize_bilinear(p, (nz, nx)) sw = cast(sw, Float64) p = cast(p, Float64) sw = squeeze(sw) p = squeeze(p) # tran_lambda, tran_den = Gassman(sw) # tran_lambda, tran_den = RockLinear(sw) # test linear relationship tran_lambda, tran_den = Patchy(sw) # tran_lambda, tran_den = Brie(sw) tran_lambda_pad = tf.pad(tran_lambda, [nPml (nPml+nPad); nPml nPml], constant_values=3500.0^2*2200.0/3.0) /1e6 tran_den_pad = tf.pad(tran_den, [nPml (nPml+nPad); nPml nPml], constant_values=2200.0) return tran_lambda_pad, tran_den_pad end # NOTE Generate Data if args["generate_data"] println("Generate Test Data...") K = 20.0 .* ones(m,n) # millidarcy ix = 1:n y1 = 45. .+ 10. .* sin.(ix./120.0 .* 2.0 .* pi) y2 = 55. .+ 10. .* sin.(ix./120.0 .* 2.0 .* pi) for j = 1:n for i = 1:m if (i > y1[j] && i < y2[j]) K[i, j] = 120; end end end # imshow(K) tfCtxTrue = tfCtxGen(m,n,h,NT,Δt,Z,X,ρw,ρo,μw,μo,K,g,ϕ,qw,qo, sw0, true) out_sw_true, out_p_true = imseq(tfCtxTrue) lambdas = Array{PyObject}(undef, n_survey) dens = Array{PyObject}(undef, n_survey) for i = 1:n_survey sw = out_sw_true[survey_indices[i]] p = out_p_true[survey_indices[i]] lambdas[i], dens[i] = sw_p_to_lambda_den(sw, p) end misfit = Array{PyObject}(undef, n_survey) for i = 1:n_survey if !isdir("./$(args["version"])/Data$i") mkdir("./$(args["version"])/Data$i") end para_fname = "./$(args["version"])/para_file$i.json" survey_fname = "./$(args["version"])/survey_file$i.json" paraGen(nz_pad, nx_pad, dz, dx, nSteps, dt, f0, nPml, nPad, para_fname, survey_fname, "./$(args["version"])/Data$i/") # shot_inds = collect(1:3:length(z_src)) .+ mod(i-1,3) # 5src rotation # shot_inds = i # 1src rotation shot_inds = collect(1:length(z_src)) # all sources surveyGen(z_src[shot_inds], x_src[shot_inds], z_rec, x_rec, survey_fname) tf_shot_ids0 = constant(collect(0:length(shot_inds)-1), dtype=Int32) misfit[i] = fwi_obs_op(lambdas[i], tf_shear_pad, dens[i], tf_stf, tf_gpu_id0, tf_shot_ids0, para_fname) end config = tf.ConfigProto() config.intra_op_parallelism_threads = 24 config.inter_op_parallelism_threads = 24 sess = Session(config=config); init(sess); run(sess, misfit) error("Generate Data: Stop") end tfCtxInit = tfCtxGen(m,n,h,NT,Δt,Z,X,ρw,ρo,μw,μo,K_init,g,ϕ,qw,qo, sw0, false) out_sw_init, out_p_init = imseq(tfCtxInit) lambdas = Array{PyObject}(undef, n_survey) dens = Array{PyObject}(undef, n_survey) for i = 1:n_survey sw = out_sw_init[survey_indices[i]] p = out_p_init[survey_indices[i]] lambdas[i], dens[i] = sw_p_to_lambda_den(sw, p) end # NOTE Compute FWI loss loss = constant(0.0) for i = 1:n_survey global loss para_fname = "./$(args["version"])/para_file$i.json" survey_fname = "./$(args["version"])/survey_file$i.json" # shot_inds = collect(1:3:length(z_src)) .+ mod(i-1,3) # shot_inds = i shot_inds = collect(1:length(z_src)) # all sources tf_shot_ids0 = constant(collect(0:length(shot_inds)-1), dtype=Int32) loss += fwi_op(lambdas[i], tf_shear_pad, dens[i], tf_stf, tf_gpu_id_array[mod(i,nGpus)], tf_shot_ids0, para_fname) # mod(i,2) end gradK = gradients(loss, tfCtxInit.K) if args["verbose"] sess = Session(); init(sess) println("Initial loss = ", run(sess, loss)) g = gradients(loss, tfCtxInit.K) G = run(sess, g) pcolormesh(G); savefig("test.png"); close("all") end # Optimization __cnt = 0 # invK = zeros(m,n) function print_loss(l, K, gradK, brie_coef) global __cnt, __l, __K, __gradK, __brie_coef if mod(__cnt,1)==0 println("\niter=$__iter, eval=$__cnt, current loss=",l) # println("a=$a, b1=$b1, b2=$b2") end __cnt += 1 __l = l __K = K __gradK = gradK __brie_coef = brie_coef end __iter = 0 function print_iter(rk) global __iter, __l if mod(__iter,1)==0 println("\n************* ITER=$__iter *************\n") end __iter += 1 open("./$(args["version"])/loss.txt", "a") do io writedlm(io, Any[__iter __l]) end open("./$(args["version"])/K$__iter.txt", "w") do io writedlm(io, __K) end open("./$(args["version"])/gradK$__iter.txt", "w") do io writedlm(io, __gradK) end open("./$(args["version"])/brie_coef.txt", "a") do io writedlm(io, Any[__iter __brie_coef]) end end config = tf.ConfigProto() config.intra_op_parallelism_threads = 24 config.inter_op_parallelism_threads = 24 sess = Session(config=config); init(sess); # opt = ScipyOptimizerInterface(loss, var_list=[tfCtxInit.K], var_to_bounds=Dict(tfCtxInit.K=> (10.0, 130.0)), method="L-BFGS-B", # options=Dict("maxiter"=> 100, "ftol"=>1e-6, "gtol"=>1e-6)) opt = ScipyOptimizerInterface(loss, var_list=[tfCtxInit.K, tf_brie_coef], var_to_bounds=Dict(tfCtxInit.K=> (10.0, 130.0), tf_brie_coef=>(1.0,100.0)), method="L-BFGS-B", options=Dict("maxiter"=> 100, "ftol"=>1e-6, "gtol"=>1e-6)) @info "Optimization Starts..." # ScipyOptimizerMinimize(sess, opt, loss_callback=print_loss, step_callback=print_iter, fetches=[loss,tfCtxInit.K,gradK]) ScipyOptimizerMinimize(sess, opt, loss_callback=print_loss, step_callback=print_iter, fetches=[loss,tfCtxInit.K,gradK, tf_brie_coef])
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
3208
using PyTensorFlow using PyCall using LinearAlgebra using PyPlot using DelimitedFiles using Random Random.seed!(233) np = pyimport("numpy") include("poisson_op.jl") include("laplacian_op.jl") include("sat_op.jl") const K_CONST = 9.869232667160130e-16 * 86400 * 1e3 mutable struct Ctx m; n; h; NT; Δt; Z; X; ρw; ρo; μw; μo; K; g; ϕ; qw; qo; sw0 end function tfCtxGen(m,n,h,NT,Δt,Z,X,ρw,ρo,μw,μo,K,g,ϕ,qw,qo,sw0,ifTrue) tf_h = constant(h) # tf_NT = constant(NT) tf_Δt = constant(Δt) tf_Z = constant(Z) tf_X= constant(X) tf_ρw = constant(ρw) tf_ρo = constant(ρo) tf_μw = constant(μw) tf_μo = constant(μo) # tf_K = isa(K,Array) ? Variable(K) : K if ifTrue tf_K = constant(K) else tf_K = Variable(K) end tf_g = constant(g) # tf_ϕ = Variable(ϕ) tf_ϕ = constant(ϕ) tf_qw = constant(qw) tf_qo = constant(qo) tf_sw0 = constant(sw0) return Ctx(m,n,tf_h,NT,tf_Δt,tf_Z,tf_X,tf_ρw,tf_ρo,tf_μw,tf_μo,tf_K,tf_g,tf_ϕ,tf_qw,tf_qo,tf_sw0) end function Krw(Sw) return Sw ^ 1.5 end function Kro(So) return So ^1.5 end function ave_normal(quantity, m, n) aa = sum(quantity) return aa/(m*n) end # variables : sw, u, v, p # (time dependent) parameters: qw, qo, ϕ function onestep(sw, p, m, n, h, Δt, Z, ρw, ρo, μw, μo, K, g, ϕ, qw, qo) # step 1: update p # λw = Krw(sw)/μw # λo = Kro(1-sw)/μo λw = sw.*sw/μw λo = (1-sw).*(1-sw)/μo λ = λw + λo q = qw + qo + λw/(λo+1e-16).*qo # q = qw + qo potential_c = (ρw - ρo)*g .* Z # Step 1: implicit potential Θ = upwlap_op(K * K_CONST, λo, potential_c, h, constant(0.0)) load_normal = (Θ+q/ALPHA) - ave_normal(Θ+q/ALPHA, m, n) # p = poisson_op(λ.*K* K_CONST, load_normal, h, constant(0.0), constant(1)) p = upwps_op(K * K_CONST, λ, load_normal, p, h, constant(0.0), constant(0)) # potential p = pw - ρw*g*h # step 2: implicit transport sw = sat_op(sw, p, K * K_CONST, ϕ, qw, qo, μw, μo, sw, Δt, h) return sw, p end """ impes(tf_ctx) Solve the two phase flow equation. `qw` and `qo` -- `NT x m x n` numerical array, `qw[i,:,:]` the corresponding value of qw at i*Δt `sw0` and `p0` -- initial value for `sw` and `p`. `m x n` numerical array. """ function imseq(tf_ctx) ta_sw, ta_p = TensorArray(NT+1), TensorArray(NT+1) ta_sw = write(ta_sw, 1, tf_ctx.sw0) ta_p = write(ta_p, 1, constant(zeros(tf_ctx.m, tf_ctx.n))) i = constant(1, dtype=Int32) function condition(i, tas...) i <= tf_ctx.NT end function body(i, tas...) ta_sw, ta_p = tas sw, p = onestep(read(ta_sw, i), read(ta_p, i), tf_ctx.m, tf_ctx.n, tf_ctx.h, tf_ctx.Δt, tf_ctx.Z, tf_ctx.ρw, tf_ctx.ρo, tf_ctx.μw, tf_ctx.μo, tf_ctx.K, tf_ctx.g, tf_ctx.ϕ, tf_ctx.qw[i], tf_ctx.qo[i]) ta_sw = write(ta_sw, i+1, sw) ta_p = write(ta_p, i+1, p) i+1, ta_sw, ta_p end _, ta_sw, ta_p = while_loop(condition, body, [i; ta_sw; ta_p;]) out_sw, out_p = stack(ta_sw), stack(ta_p) end function vis(val, args...;kwargs...) close("all") ns = Int64.(round.(LinRange(1,size(val,1),9))) for i = 1:9 subplot(330+i) imshow(val[ns[i],:,:], args...;kwargs...) colorbar() end end
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
3664
using PyPlot using DelimitedFiles if !isdir("figures_summary_channel") mkdir("figures_summary_channel") end m = 90 n = 180 h = 5.0 # meter dz = 3.0 # meters dx = 3.0 nz = Int64(round((m * h) / dz)) + 1 nx = Int64(round((n * h) / dx)) + 1 z_src = (collect(5:10:nz-5) .- 1 ) .* dz .+ dz/2.0 x_src = (5-1)ones(Int64, size(z_src)) .* dx .+ dx/2.0 z_rec = (collect(5:1:nz-5) .- 1) .* dz .+ dz/2.0 x_rec = (nx-5-1) .* ones(Int64, size(z_rec)) .*dx .+ dx/2.0 z_inj = (54-1)*h + h/2.0 x_inj = (18-1)*h + h/2.0 z_prod = (54-1)*h + h/2.0 x_prod = (168-1)*h + h/2.0 rc("axes", titlesize=20) rc("axes", labelsize=18) rc("xtick", labelsize=18) rc("ytick", labelsize=18) rc("legend", fontsize=20) # true model figure() K = 20.0 .* ones(m,n) # millidarcy ix = 1:n y1 = 45. .+ 10. .* sin.(ix./120.0 .* 2.0 .* pi) y2 = 55. .+ 10. .* sin.(ix./120.0 .* 2.0 .* pi) for j = 1:n for i = 1:m if (i > y1[j] && i < y2[j]) K[i, j] = 120; end end end imshow(K, extent=[0,n*h,m*h,0]); xlabel("Distance (m)") ylabel("Depth (m)") cb = colorbar() clim([20, 120]) cb.set_label("Permeability (md)") shot_inds = collect(1:length(z_src)) scatter(x_src[shot_inds], z_src[shot_inds], c="w", marker="*") scatter(x_rec, z_rec, s=16.0, c="r", marker="v") scatter(x_inj, z_inj, c="r", marker=">") scatter(x_prod, z_prod, c="r", marker="<") savefig("figures_summary_channel/K_true.pdf", bbox_inches="tight",pad_inches = 0, dpi = 300); # # init model # figure() # K = 20.0 .* ones(m,n) # imshow(K, extent=[0,n*h,m*h,0]); # xlabel("Distance (m)") # ylabel("Depth (m)") # cb = colorbar() # clim([20, 120]) # cb.set_label("Permeability (md)") # shot_inds = collect(1:length(z_src)) # scatter(x_src[shot_inds], z_src[shot_inds], c="w", marker="*") # scatter(x_rec, z_rec, s=16.0, c="r", marker="v") # scatter(x_inj, z_inj, c="r", marker=">") # scatter(x_prod, z_prod, c="r", marker="<") # savefig("figures_summary/K_init.pdf", bbox_inches="tight",pad_inches = 0, dpi = 300); m = 45 n = 90 h = 10.0 # meter dz = 3.0 # meters dx = 3.0 nz = Int64(round((m * h) / dz)) + 1 nx = Int64(round((n * h) / dx)) + 1 z_src = (collect(5:10:nz-5) .- 1 ) .* dz .+ dz/2.0 x_src = (5-1)ones(Int64, size(z_src)) .* dx .+ dx/2.0 z_rec = (collect(5:1:nz-5) .- 1) .* dz .+ dz/2.0 x_rec = (nx-5-1) .* ones(Int64, size(z_rec)) .*dx .+ dx/2.0 z_inj = (27-1)*h + h/2.0 x_inj = (9-1)*h + h/2.0 z_prod = (27-1)*h + h/2.0 x_prod = (84-1)*h + h/2.0 iter = 100 Prj_names = ["CO2_channel_4590_pgs", "CO2_channel_45_90"] K_name = "/Stage6/K$iter.txt" figure() iPrj = 1 K = readdlm(Prj_names[iPrj] * K_name) imshow(K, extent=[0,n*h,m*h,0]); xlabel("Distance (m)") ylabel("Depth (m)") cb = colorbar() clim([20, 120]) cb.set_label("Permeability (md)") shot_inds = collect(1:length(z_src)) scatter(x_src[shot_inds], z_src[shot_inds], c="w", marker="*") scatter(x_rec, z_rec, s=16.0, c="r", marker="v") scatter(x_inj, z_inj, c="r", marker=">") scatter(x_prod, z_prod, c="r", marker="<") savefig("figures_summary_channel/K_$(Prj_names[iPrj]).pdf", bbox_inches="tight",pad_inches = 0, dpi = 300); K_name = "/K$iter.txt" figure() iPrj = 2 K = readdlm(Prj_names[iPrj] * K_name) imshow(K, extent=[0,n*h,m*h,0]); xlabel("Distance (m)") ylabel("Depth (m)") cb = colorbar() clim([20, 120]) cb.set_label("Permeability (md)") shot_inds = collect(1:length(z_src)) scatter(x_src[shot_inds], z_src[shot_inds], c="w", marker="*") scatter(x_rec, z_rec, s=16.0, c="r", marker="v") scatter(x_inj, z_inj, c="r", marker=">") scatter(x_prod, z_prod, c="r", marker="<") savefig("figures_summary_channel/K_$(Prj_names[iPrj]).pdf", bbox_inches="tight",pad_inches = 0, dpi = 300);
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
1236
using DelimitedFiles using PyPlot close("all") if !isdir("figures_summary") mkdir("figures_summary") end Prj_names = ["CO2", "CO2_1src", "CO2_2surveys", "Brie_3_nocoefupdate", "Brie_tune_coef_true3_start2"] rc("axes", titlesize=14) rc("axes", labelsize=14) rc("xtick", labelsize=14) rc("ytick", labelsize=14) rc("legend", fontsize=14) figure() L1 = readdlm("$(Prj_names[1])/loss.txt") l1=semilogy(L1[:,1], L1[:,2]/L1[1,2], label="Baseline") legend() L2 = readdlm("$(Prj_names[2])/loss.txt") l2=semilogy(L2[:,1], L2[:,2]/L2[1,2], label="One source") legend() L3 = readdlm("$(Prj_names[3])/loss.txt") l3=semilogy(L3[:,1], L3[:,2]/L3[1,2], label="Two surveys") legend() grid(ls="--") xlabel("Iteration Number") ylabel("Normalized misfit") savefig("figures_summary/loss.pdf", bbox_inches="tight",pad_inches = 0, dpi = 300); figure() L4 = readdlm("$(Prj_names[4])/loss.txt") l4=semilogy(L4[:,1], L4[:,2]/L4[1,2], label="Exact coefficient") legend() L5 = readdlm("$(Prj_names[5])/loss.txt") l5=semilogy(L5[:,1], L5[:,2]/L5[1,2], label="Inexact coefficient") legend() grid(ls="--") xlabel("Iteration Number") ylabel("Normalized misfit") savefig("figures_summary/loss_brie.pdf", bbox_inches="tight",pad_inches = 0, dpi = 300);
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
92
using PyPlot include("args.jl") Sw = constant(collect(0:0.001:1)) lambda_brie_3 = Brie(Sw)
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
6344
include("args.jl") function sw_p_to_lambda_den(sw, p) sw = tf.reshape(sw, (1, m, n, 1)) p = tf.reshape(p, (1, m, n, 1)) sw = tf.image.resize_bilinear(sw, (nz, nx)) p = tf.image.resize_bilinear(p, (nz, nx)) sw = cast(sw, Float64) p = cast(p, Float64) sw = squeeze(sw) p = squeeze(p) # tran_lambda, tran_den = Gassman(sw) # tran_lambda, tran_den = RockLinear(sw) # test linear relationship tran_lambda, tran_den = Patchy(sw) return tran_lambda, tran_den end if !isdir("figures_summary_channel") mkdir("figures_summary_channel") end K = 20.0 .* ones(m,n) # millidarcy ix = 1:n y1 = 45. .+ 10. .* sin.(ix./120.0 .* 2.0 .* pi) y2 = 55. .+ 10. .* sin.(ix./120.0 .* 2.0 .* pi) for j = 1:n for i = 1:m if (i > y1[j] && i < y2[j]) K[i, j] = 120; end end end tfCtxTrue = tfCtxGen(m,n,h,NT,Δt,Z,X,ρw,ρo,μw,μo,K,g,ϕ,qw,qo, sw0, true) out_sw_true, out_p_true = imseq(tfCtxTrue) lambdas = Array{PyObject}(undef, n_survey) dens = Array{PyObject}(undef, n_survey) for i = 1:n_survey sw = out_sw_true[survey_indices[i]] p = out_p_true[survey_indices[i]] lambdas[i], dens[i] = sw_p_to_lambda_den(sw, p) end sess = Session();init(sess); vps = Array{PyObject}(undef, n_survey) for i=1:n_survey vps[i] = sqrt((lambdas[i] + 2.0 * tf_shear_sat1[i])/dens[i]) end V = run(sess, vps); S = run(sess, out_sw_true); P = run(sess, out_p_true); z_inj = (54-1)*h + h/2.0 x_inj = (18-1)*h + h/2.0 z_prod = (54-1)*h + h/2.0 x_prod = (168-1)*h + h/2.0 rc("axes", titlesize=30) rc("axes", labelsize=30) rc("xtick", labelsize=28) rc("ytick", labelsize=28) rc("legend", fontsize=30) fig1,axs = subplots(3,3, figsize=[30,15], sharex=true, sharey=true) ims = Array{Any}(undef, 9) for iPrj = 1:3 for jPrj = 1:3 ims[(iPrj-1)*3+jPrj] = axs[iPrj,jPrj].imshow(V[(iPrj-1)*3+jPrj], extent=[0,n*h,m*h,0], vmin=3350, vmax=3500); axs[iPrj,jPrj].title.set_text("Snapshot $((iPrj-1)*3+jPrj)") if jPrj == 1 || jPrj == 1 axs[iPrj,jPrj].set_ylabel("Depth (m)") end if iPrj == 3 || iPrj == 3 axs[iPrj,jPrj].set_xlabel("Distance (m)") end # cb = fig1.colorbar(ims[(iPrj-1)*3+jPrj], ax=axs[iPrj,jPrj]) # cb.set_label("Vp") axs[iPrj,jPrj].scatter(x_inj, z_inj, c="r", marker=">", s=128) axs[iPrj,jPrj].scatter(x_prod, z_prod, c="r", marker="<", s=128) end end fig1.subplots_adjust(wspace=0.02, hspace=0.18) cbar_ax = fig1.add_axes([0.91, 0.08, 0.01, 0.82]) cb1 = fig1.colorbar(ims[1], cax=cbar_ax) cb1.set_label("Vp (m/s)") savefig("figures_summary_channel/Vp_evo_patchy_true.pdf",bbox_inches="tight",pad_inches = 0); fig2,axs = subplots(3,3, figsize=[30,15], sharex=true, sharey=true) ims = Array{Any}(undef, 9) for iPrj = 1:3 for jPrj = 1:3 ims[(iPrj-1)*3+jPrj] = axs[iPrj,jPrj].imshow(S[survey_indices[(iPrj-1)*3+jPrj], :, :], extent=[0,n*h,m*h,0], vmin=0.0, vmax=0.6); axs[iPrj,jPrj].title.set_text("Snapshot $((iPrj-1)*3+jPrj)") if jPrj == 1 || jPrj == 1 axs[iPrj,jPrj].set_ylabel("Depth (m)") end if iPrj == 3 || iPrj == 3 axs[iPrj,jPrj].set_xlabel("Distance (m)") end # if iPrj ==2 && jPrj == 3 # cb = fig2.colorbar(ims[(iPrj-1)*3+jPrj], ax=axs[iPrj,jPrj]) # cb.set_label("Saturation") axs[iPrj,jPrj].scatter(x_inj, z_inj, c="r", marker=">", s=128) axs[iPrj,jPrj].scatter(x_prod, z_prod, c="r", marker="<", s=128) end end # fig2.subplots_adjust(wspace=0.04, hspace=0.042) fig2.subplots_adjust(wspace=0.02, hspace=0.18) cbar_ax = fig2.add_axes([0.91, 0.08, 0.01, 0.82]) cb2 = fig2.colorbar(ims[1], cax=cbar_ax) cb2.set_label("Saturation") savefig("figures_summary_channel/Saturation_evo_patchy_true.pdf",bbox_inches="tight",pad_inches = 0); fig3,axs = subplots(3,3, figsize=[30,15], sharex=true, sharey=true) ims = Array{Any}(undef, 9) for iPrj = 1:3 for jPrj = 1:3 ims[(iPrj-1)*3+jPrj] = axs[iPrj,jPrj].imshow(P[survey_indices[(iPrj-1)*3+jPrj], :, :]*1.4504e-04, extent=[0,n*h,m*h,0], vmin=-2500.0, vmax=500); axs[iPrj,jPrj].title.set_text("Snapshot $((iPrj-1)*3+jPrj)") if jPrj == 1 || jPrj == 1 axs[iPrj,jPrj].set_ylabel("Depth (m)") end if iPrj == 3 || iPrj == 3 axs[iPrj,jPrj].set_xlabel("Distance (m)") end # if iPrj ==2 && jPrj == 3 # cb = fig2.colorbar(ims[(iPrj-1)*3+jPrj], ax=axs[iPrj,jPrj]) # cb.set_label("Saturation") axs[iPrj,jPrj].scatter(x_inj, z_inj, c="r", marker=">", s=128) axs[iPrj,jPrj].scatter(x_prod, z_prod, c="r", marker="<", s=128) end end # fig2.subplots_adjust(wspace=0.04, hspace=0.042) fig3.subplots_adjust(wspace=0.02, hspace=0.18) cbar_ax = fig3.add_axes([0.91, 0.08, 0.01, 0.82]) cb3 = fig3.colorbar(ims[1], cax=cbar_ax) cb3.set_label("Potential (psi)") savefig("figures_summary_channel/Potential_evo_patchy_true.pdf",bbox_inches="tight",pad_inches = 0); # iter = 100 # Prj_names = ["CO2", "CO2_1src", "CO2_2surveys", "CO2_6surveys"] # K_name = "/K$iter.txt" # fig,axs = subplots(2,2, figsize=[18,8], sharex=true, sharey=true) # for iPrj = 1:2 # for jPrj = 1:2 # # println(ax) # A = readdlm(Prj_names[(iPrj-1)*2 + jPrj] * K_name) # im = axs[iPrj,jPrj].imshow(A, extent=[0,n*h,m*h,0]); # if jPrj == 1 || jPrj == 1 # axs[iPrj,jPrj].set_ylabel("Depth (m)") # end # if iPrj == 2 || iPrj == 2 # axs[iPrj,jPrj].set_xlabel("Distance (m)") # end # axs[iPrj,jPrj].text(-0.1,1.1,string("(" * Char((iPrj-1)*2 + jPrj+'a'-1) * ")"),transform=axs[iPrj,jPrj].transAxes,size=12,weight="bold") # end # end # fig.subplots_adjust(bottom=0.1, top=0.9, left=0.1, right=0.9, # wspace=0.1, hspace=0.2) # cb_ax = fig.add_axes([0.93, 0.1, 0.02, 0.8]) # cbar = fig.colorbar(im, cax=cb_ax) # cb = fig.colorbar() # clim([20, 120]) # cb.set_label("Permeability (md)") # fig = figure() # ax = fig.add_subplot(111) # The big subplot # ax1 = fig.add_subplot(211) # ax2 = fig.add_subplot(212) # # Turn off axis lines and ticks of the big subplot # ax.spines["top"].set_color("none") # ax.spines["bottom"].set_color("none") # ax.spines["left"].set_color("none") # ax.spines["right"].set_color("none") # ax.tick_params(labelcolor="w", top="off", bottom="off", left="off", right="off") # # Set common labels # ax.set_xlabel("common xlabel") # ax.set_ylabel("common ylabel") # ax1.set_title('ax1 title') # ax2.set_title('ax2 title')
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
2446
using PyTensorFlow using PyCall using LinearAlgebra using PyPlot using Random Random.seed!(233) if Sys.islinux() py""" import tensorflow as tf libPoissonOp = tf.load_op_library('../Ops/Poisson/build/libPoissonOp.so') @tf.custom_gradient def poisson_op(coef,g,h,rhograv,index): p = libPoissonOp.poisson_op(coef,g,h,rhograv,index) def grad(dy): return libPoissonOp.poisson_op_grad(dy, p, coef, g, h, rhograv, index) return p, grad """ elseif Sys.isapple() py""" import tensorflow as tf libPoissonOp = tf.load_op_library('../Ops/Poisson/build/libPoissonOp.dylib') @tf.custom_gradient def poisson_op(coef,g,h,rhograv,index): p = libPoissonOp.poisson_op(coef,g,h,rhograv,index) def grad(dy): return libPoissonOp.poisson_op_grad(dy, p, coef, g, h, rhograv, index) return p, grad """ elseif Sys.iswindows() py""" import tensorflow as tf libPoissonOp = tf.load_op_library('../Ops/Poisson/build/libPoissonOp.dll') @tf.custom_gradient def poisson_op(coef,g,h,rhograv,index): p = libPoissonOp.poisson_op(coef,g,h,rhograv,index) def grad(dy): return libPoissonOp.poisson_op_grad(dy, p, coef, g, h, rhograv, index) return p, grad """ end poisson_op = py"poisson_op" if Sys.islinux() py""" import tensorflow as tf libUpwpsOp = tf.load_op_library('../Ops/Upwps/build/libUpwpsOp.so') @tf.custom_gradient def upwps_op(permi,mobi,src,funcref,h,rhograv,index): pres = libUpwpsOp.upwps_op(permi,mobi,src,funcref,h,rhograv,index) def grad(dy): return libUpwpsOp.upwps_op_grad(dy, pres, permi,mobi,src,funcref,h,rhograv,index) return pres, grad """ elseif Sys.isapple() py""" import tensorflow as tf libUpwpsOp = tf.load_op_library('../Ops/Upwps/build/libUpwpsOp.dylib') @tf.custom_gradient def upwps_op(permi,mobi,src,funcref,h,rhograv,index): pres = libUpwpsOp.upwps_op(permi,mobi,src,funcref,h,rhograv,index) def grad(dy): return libUpwpsOp.upwps_op_grad(dy, pres, permi,mobi,src,funcref,h,rhograv,index) return pres, grad """ elseif Sys.iswindows() py""" import tensorflow as tf libUpwpsOp = tf.load_op_library('../Ops/Upwps/build/libUpwpsOp.dll') @tf.custom_gradient def upwps_op(permi,mobi,src,funcref,h,rhograv,index): pres = libUpwpsOp.upwps_op(permi,mobi,src,funcref,h,rhograv,index) def grad(dy): return libUpwpsOp.upwps_op_grad(dy, pres, permi,mobi,src,funcref,h,rhograv,index) return pres, grad """ end upwps_op = py"upwps_op"
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
6175
include("args.jl") using DelimitedFiles function sw_p_to_lambda_den(sw, p) sw = tf.reshape(sw, (1, m, n, 1)) p = tf.reshape(p, (1, m, n, 1)) sw = tf.image.resize_bilinear(sw, (nz, nx)) p = tf.image.resize_bilinear(p, (nz, nx)) sw = cast(sw, Float64) p = cast(p, Float64) sw = squeeze(sw) p = squeeze(p) # tran_lambda, tran_den = Gassman(sw) # tran_lambda, tran_den = RockLinear(sw) # test linear relationship tran_lambda, tran_den = Patchy(sw) return tran_lambda, tran_den end if !isdir("figures_summary") mkdir("figures_summary") end iter = 100 Prj_names = "Brie_true3_set2_noupdate"; K_name = "/K$iter.txt" K = readdlm(Prj_names*K_name) tfCtxTrue = tfCtxGen(m,n,h,NT,Δt,Z,X,ρw,ρo,μw,μo,K,g,ϕ,qw,qo, sw0, true) out_sw_true, out_p_true = imseq(tfCtxTrue) lambdas = Array{PyObject}(undef, n_survey) dens = Array{PyObject}(undef, n_survey) for i = 1:n_survey sw = out_sw_true[survey_indices[i]] p = out_p_true[survey_indices[i]] lambdas[i], dens[i] = sw_p_to_lambda_den(sw, p) end sess = Session();init(sess); vps = Array{PyObject}(undef, n_survey) for i=1:n_survey vps[i] = sqrt((lambdas[i] + 2.0 * tf_shear_sat1[i])/dens[i]) end V = run(sess, vps); S = run(sess, out_sw_true); P = run(sess, out_p_true); z_inj = (9-1)*h + h/2.0 x_inj = (3-1)*h + h/2.0 z_prod = (9-1)*h + h/2.0 x_prod = (28-1)*h + h/2.0 rc("axes", titlesize=30) rc("axes", labelsize=30) rc("xtick", labelsize=28) rc("ytick", labelsize=28) rc("legend", fontsize=30) fig1,axs = subplots(3,3, figsize=[30,15], sharex=true, sharey=true) ims = Array{Any}(undef, 9) for iPrj = 1:3 for jPrj = 1:3 ims[(iPrj-1)*3+jPrj] = axs[iPrj,jPrj].imshow(V[(iPrj-1)*3+jPrj], extent=[0,n*h,m*h,0], vmin=3350, vmax=3500); axs[iPrj,jPrj].title.set_text("Snapshot $((iPrj-1)*3+jPrj)") if jPrj == 1 || jPrj == 1 axs[iPrj,jPrj].set_ylabel("Depth (m)") end if iPrj == 3 || iPrj == 3 axs[iPrj,jPrj].set_xlabel("Distance (m)") end # cb = fig1.colorbar(ims[(iPrj-1)*3+jPrj], ax=axs[iPrj,jPrj]) # cb.set_label("Vp") axs[iPrj,jPrj].scatter(x_inj, z_inj, c="r", marker=">") axs[iPrj,jPrj].scatter(x_prod, z_prod, c="r", marker="<") end end fig1.subplots_adjust(wspace=0.02, hspace=0.18) cbar_ax = fig1.add_axes([0.91, 0.08, 0.01, 0.82]) cb1 = fig1.colorbar(ims[1], cax=cbar_ax) cb1.set_label("Vp (m/s)") savefig("figures_summary/predicted_Vp_evo.pdf",bbox_inches="tight",pad_inches = 0); fig2,axs = subplots(3,3, figsize=[30,15], sharex=true, sharey=true) ims = Array{Any}(undef, 9) for iPrj = 1:3 for jPrj = 1:3 ims[(iPrj-1)*3+jPrj] = axs[iPrj,jPrj].imshow(S[survey_indices[(iPrj-1)*3+jPrj], :, :], extent=[0,n*h,m*h,0], vmin=0.0, vmax=0.6); axs[iPrj,jPrj].title.set_text("Snapshot $((iPrj-1)*3+jPrj)") if jPrj == 1 || jPrj == 1 axs[iPrj,jPrj].set_ylabel("Depth (m)") end if iPrj == 3 || iPrj == 3 axs[iPrj,jPrj].set_xlabel("Distance (m)") end # if iPrj ==2 && jPrj == 3 # cb = fig2.colorbar(ims[(iPrj-1)*3+jPrj], ax=axs[iPrj,jPrj]) # cb.set_label("Saturation") axs[iPrj,jPrj].scatter(x_inj, z_inj, c="r", marker=">") axs[iPrj,jPrj].scatter(x_prod, z_prod, c="r", marker="<") end end # fig2.subplots_adjust(wspace=0.04, hspace=0.042) fig2.subplots_adjust(wspace=0.02, hspace=0.18) cbar_ax = fig2.add_axes([0.91, 0.08, 0.01, 0.82]) cb2 = fig2.colorbar(ims[1], cax=cbar_ax) cb2.set_label("Saturation") savefig("figures_summary/predicted_Saturation_evo.pdf",bbox_inches="tight",pad_inches = 0); # fig3,axs = subplots(3,3, figsize=[30,15], sharex=true, sharey=true) # ims = Array{Any}(undef, 9) # for iPrj = 1:3 # for jPrj = 1:3 # ims[(iPrj-1)*3+jPrj] = axs[iPrj,jPrj].imshow(P[survey_indices[(iPrj-1)*3+jPrj], :, :]*1.4504e-04, extent=[0,n*h,m*h,0], vmin=-2500.0, vmax=500); # axs[iPrj,jPrj].title.set_text("Snapshot $((iPrj-1)*3+jPrj)") # if jPrj == 1 || jPrj == 1 # axs[iPrj,jPrj].set_ylabel("Depth (m)") # end # if iPrj == 3 || iPrj == 3 # axs[iPrj,jPrj].set_xlabel("Distance (m)") # end # # if iPrj ==2 && jPrj == 3 # # cb = fig2.colorbar(ims[(iPrj-1)*3+jPrj], ax=axs[iPrj,jPrj]) # # cb.set_label("Saturation") # axs[iPrj,jPrj].scatter(x_inj, z_inj, c="r", marker=">") # axs[iPrj,jPrj].scatter(x_prod, z_prod, c="r", marker="<") # end # end # # fig2.subplots_adjust(wspace=0.04, hspace=0.042) # fig3.subplots_adjust(wspace=0.02, hspace=0.18) # cbar_ax = fig3.add_axes([0.91, 0.08, 0.01, 0.82]) # cb3 = fig3.colorbar(ims[1], cax=cbar_ax) # cb3.set_label("Potential (psi)") # savefig("figures_summary/Potential_evo_patchy_true.pdf",bbox_inches="tight",pad_inches = 0); # iter = 100 # Prj_names = ["CO2", "CO2_1src", "CO2_2surveys", "CO2_6surveys"] # K_name = "/K$iter.txt" # fig,axs = subplots(2,2, figsize=[18,8], sharex=true, sharey=true) # for iPrj = 1:2 # for jPrj = 1:2 # # println(ax) # A = readdlm(Prj_names[(iPrj-1)*2 + jPrj] * K_name) # im = axs[iPrj,jPrj].imshow(A, extent=[0,n*h,m*h,0]); # if jPrj == 1 || jPrj == 1 # axs[iPrj,jPrj].set_ylabel("Depth (m)") # end # if iPrj == 2 || iPrj == 2 # axs[iPrj,jPrj].set_xlabel("Distance (m)") # end # axs[iPrj,jPrj].text(-0.1,1.1,string("(" * Char((iPrj-1)*2 + jPrj+'a'-1) * ")"),transform=axs[iPrj,jPrj].transAxes,size=12,weight="bold") # end # end # fig.subplots_adjust(bottom=0.1, top=0.9, left=0.1, right=0.9, # wspace=0.1, hspace=0.2) # cb_ax = fig.add_axes([0.93, 0.1, 0.02, 0.8]) # cbar = fig.colorbar(im, cax=cb_ax) # cb = fig.colorbar() # clim([20, 120]) # cb.set_label("Permeability (md)") # fig = figure() # ax = fig.add_subplot(111) # The big subplot # ax1 = fig.add_subplot(211) # ax2 = fig.add_subplot(212) # # Turn off axis lines and ticks of the big subplot # ax.spines["top"].set_color("none") # ax.spines["bottom"].set_color("none") # ax.spines["left"].set_color("none") # ax.spines["right"].set_color("none") # ax.tick_params(labelcolor="w", top="off", bottom="off", left="off", right="off") # # Set common labels # ax.set_xlabel("common xlabel") # ax.set_ylabel("common ylabel") # ax1.set_title('ax1 title') # ax2.set_title('ax2 title')
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
1299
using PyTensorFlow using PyCall using LinearAlgebra using PyPlot using Random Random.seed!(233) if Sys.islinux() py""" import tensorflow as tf libSatOp = tf.load_op_library('../Ops/Saturation/build/libSatOp.so') @tf.custom_gradient def sat_op(s0,pt,permi,poro,qw,qo,muw,muo,sref,dt,h): sat = libSatOp.sat_op(s0,pt,permi,poro,qw,qo,muw,muo,sref,dt,h) def grad(dy): return libSatOp.sat_op_grad(dy, sat, s0,pt,permi,poro,qw,qo,muw,muo,sref,dt,h) return sat, grad """ elseif Sys.isapple() py""" import tensorflow as tf libSatOp = tf.load_op_library('../Ops/Saturation/build/libSatOp.dylib') @tf.custom_gradient def sat_op(s0,pt,permi,poro,qw,qo,muw,muo,sref,dt,h): sat = libSatOp.sat_op(s0,pt,permi,poro,qw,qo,muw,muo,sref,dt,h) def grad(dy): return libSatOp.sat_op_grad(dy, sat, s0,pt,permi,poro,qw,qo,muw,muo,sref,dt,h) return sat, grad """ elseif Sys.iswindows() py""" import tensorflow as tf libSatOp = tf.load_op_library('../Ops/Saturation/build/libSatOp.dll') @tf.custom_gradient def sat_op(s0,pt,permi,poro,qw,qo,muw,muo,sref,dt,h): sat = libSatOp.sat_op(s0,pt,permi,poro,qw,qo,muw,muo,sref,dt,h) def grad(dy): return libSatOp.sat_op_grad(dy, sat, s0,pt,permi,poro,qw,qo,muw,muo,sref,dt,h) return sat, grad """ end sat_op = py"sat_op"
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
6362
using ArgParse function parse_commandline() s = ArgParseSettings() @add_arg_table s begin "--generate_data" arg_type = Bool default = false "--version" arg_type = String default = "0000" "--verbose" arg_type = Bool default = false end return parse_args(s) end args = parse_commandline() if !isdir("./$(args["version"])") mkdir("./$(args["version"])") end using PyTensorFlow using PyCall using LinearAlgebra using PyPlot using Random Random.seed!(233) include("ops_imseq.jl") include("../Ops/FWI/fwi_util.jl") include("fwi_util_op.jl") np = pyimport("numpy") # NOTE Parameters # const ALPHA = 0.006323996017182 # const SRC_CONST = 5.6146 # const GRAV_CONST = 1.0/144.0 const ALPHA = 1.0 const SRC_CONST = 86400.0 const GRAV_CONST = 1.0 # NOTE Hyperparameter for flow simulation m = 15 n = 30 h = 30.0 # meter NT = 50 dt_survey = 5 Δt = 20.0 # day z = (1:m)*h|>collect x = (1:n)*h|>collect X, Z = np.meshgrid(x, z) # ρw = 996.9571 # ρo = 640.7385 # μw = 1.0 # μo = 3.0 ρw = 501.9 ρo = 1053.0 μw = 0.1 μo = 1.0 K_init = 20.0 .* ones(m,n) g = 9.8*GRAV_CONST ϕ = 0.25 .* ones(m,n) qw = zeros(NT, m, n) qw[:,9,3] .= 0.005 * (1/h^2)/10.0 * SRC_CONST qo = zeros(NT, m, n) qo[:,9,28] .= -0.005 * (1/h^2)/10.0 * SRC_CONST sw0 = zeros(m, n) survey_indices = collect(1:dt_survey:NT+1) # 10 stages n_survey = length(survey_indices) # NOTE Hyperparameter for fwi_op # argsparse.jl # ENV["CUDA_VISIBLE_DEVICES"] = 1 # ENV["PARAMDIR"] = "Src/params/" # config = tf.ConfigProto(device_count = Dict("GPU"=>0)) dz = 3 # meters dx = 3 nz = Int64(round((m * h) / dz)) + 1 nx = Int64(round((n * h) / dx)) + 1 nPml = 64 nSteps = 3001 dt = 0.00025 f0 = 50.0 nPad = 32 - mod((nz+2*nPml), 32) nz_pad = nz + 2*nPml + nPad nx_pad = nx + 2*nPml # reflection # x_src = collect(5:20:nx-5) # z_src = 5ones(Int64, size(x_src)) # x_rec = collect(5:1:nx-5) # z_rec = 5 .* ones(Int64, size(x_rec)) # xwell # # z_src = collect(5:10:nz-5) #14->11srcs 10->15srcs # # z_src = collect(5:10:nz-5) z_src = collect(5:10:nz-5) x_src = 5ones(Int64, size(z_src)) z_rec = collect(5:1:nz-5) x_rec = (nx-5) .* ones(Int64, size(z_rec)) # para_fname = "./$(args["version"])/para_file.json" # survey_fname = "./$(args["version"])/survey_file.json" # data_dir_name = "./$(args["version"])/Data" # paraGen(nz, nx, dz, dx, nSteps, dt, f0, nPml, nPad, filter_para, isAc, para_fname, survey_fname, data_dir_name) # surveyGen(z_src, x_src, z_rec, x_rec, survey_fname) cp_nopad = 3500.0 .* ones(nz, nx) # initial cp cs = cp_nopad ./ sqrt(3.0) den = 2200.0 .* ones(nz, nx) cp_pad = 3500.0 .* ones(nz_pad, nx_pad) # initial cp cs_pad = cp_pad ./ sqrt(3.0) den_pad = 2200.0 .* ones(nz_pad, nx_pad) cp_pad_value = 3500.0 # tf_cp = constant(cp) tf_cs = constant(cs_pad) tf_den = constant(den_pad) # src = Matrix{Float64}(undef, 1, 2001) # # src[1,:] = Float64.(reinterpret(Float32, read("../Ops/FWI/Src/params/ricker_10Hz.bin"))) # src[1,:] = Float64.(reinterpret(Float32, read("../Ops/FWI/Src/params/Mar_source_2001.bin"))) src = sourceGene(f0, nSteps, dt) tf_stf = constant(repeat(src, outer=length(z_src))) # tf_para_fname = tf.strings.join([para_fname]) tf_gpu_id0 = constant(0, dtype=Int32) tf_gpu_id1 = constant(1, dtype=Int32) nGpus = 1 # tf_gpu_id_array = constant(collect(0:nGpus-1), dtype=Int32) tf_gpu_id_array = constant([0], dtype=Int32) tf_shot_ids0 = constant(collect(Int32, 0:length(x_src)-1), dtype=Int32) tf_shot_ids1 = constant(collect(Int32, 13:25), dtype=Int32) # NOTE Hyperparameter for rock physics tf_bulk_fl1 = constant(2.735e9) tf_bulk_fl2 = constant(0.125e9) # to displace fl1 tf_bulk_sat1 = constant(den .* (cp_nopad.^2 .- 4.0/3.0 .* cp_nopad.^2 ./3.0)) # vp/vs ratio as sqrt(3) tf_bulk_min = constant(36.6e9) tf_shear_sat1 = constant(den .* cp_nopad.^2 ./3.0) tf_ϕ_pad = tf.image.resize_bilinear(tf.reshape(constant(ϕ), (1, m, n, 1)), (nz, nx)) # upsample the porosity tf_ϕ_pad = cast(tf_ϕ_pad, Float64) tf_ϕ_pad = squeeze(tf_ϕ_pad) tf_shear_pad = tf.pad(tf_shear_sat1, [nPml (nPml+nPad); nPml nPml], constant_values=den[1,1] * cp_nopad[1,1]^2 /3.0) / 1e6 function Gassman(sw) tf_bulk_fl_mix = 1.0/( (1-sw)/tf_bulk_fl1 + sw/tf_bulk_fl2 ) temp = tf_bulk_sat1/(tf_bulk_min - tf_bulk_sat1) - tf_bulk_fl1/tf_ϕ_pad /(tf_bulk_min - tf_bulk_fl1) + tf_bulk_fl_mix/tf_ϕ_pad /(tf_bulk_min - tf_bulk_fl_mix) tf_bulk_new = tf_bulk_min / (1.0/temp + 1.0) # tf_den_new = constant(den) + tf_ϕ_pad .* sw * (ρw - ρo) *16.018463373960138; tf_den_new = constant(den) + tf_ϕ_pad .* sw * (ρw - ρo) # tf_cp_new = sqrt((tf_bulk_new + 4.0/3.0 * tf_shear_sat1)/tf_den_new) tf_lambda_new = tf_bulk_new - 2.0/3.0 * tf_shear_sat1 return tf_lambda_new, tf_den_new end tf_brie_coef = Variable(2.0*30.0) # tf_brie_coef = constant(3.0) # tf_brie_coef = constant(2.0) function Brie(sw) tf_bulk_fl_mix = (tf_bulk_fl1-tf_bulk_fl2)*(1-sw)^(tf_brie_coef/30.0) + tf_bulk_fl2 temp = tf_bulk_sat1/(tf_bulk_min - tf_bulk_sat1) - tf_bulk_fl1/tf_ϕ_pad /(tf_bulk_min - tf_bulk_fl1) + tf_bulk_fl_mix/tf_ϕ_pad /(tf_bulk_min - tf_bulk_fl_mix) tf_bulk_new = tf_bulk_min / (1.0/temp + 1.0) # tf_den_new = constant(den) + tf_ϕ_pad .* sw * (ρw - ρo) *16.018463373960138; tf_den_new = constant(den) + tf_ϕ_pad .* sw * (ρw - ρo) # tf_cp_new = sqrt((tf_bulk_new + 4.0/3.0 * tf_shear_sat1)/tf_den_new) tf_lambda_new = tf_bulk_new - 2.0/3.0 * tf_shear_sat1 return tf_lambda_new, tf_den_new end function RockLinear(sw) # tf_lambda_new = constant(7500.0*1e6 .* ones(nz,nx)) + (17400.0-7500.0)*1e6 * sw tf_lambda_new = constant(7500.0*1e6 .* ones(nz,nx)) + (9200.0-7500.0)*1e6 * sw tf_den_new = constant(den) + tf_ϕ_pad .* sw * (ρw - ρo) return tf_lambda_new, tf_den_new end tf_patch_temp = tf_bulk_sat1/(tf_bulk_min - tf_bulk_sat1) - tf_bulk_fl1/tf_ϕ_pad /(tf_bulk_min - tf_bulk_fl1) + tf_bulk_fl2/tf_ϕ_pad /(tf_bulk_min - tf_bulk_fl2) tf_bulk_sat2 = tf_bulk_min/(1.0/tf_patch_temp + 1.0) function Patchy(sw) tf_bulk_new = 1/( (1-sw)/(tf_bulk_sat1+4.0/3.0*tf_shear_sat1) + sw/(tf_bulk_sat2+4.0/3.0*tf_shear_sat1) ) - 4.0/3.0*tf_shear_sat1 tf_lambda_new = tf_bulk_new - 2.0/3.0 * tf_shear_sat1 tf_den_new = constant(den) + tf_ϕ_pad .* sw * (ρw - ρo) return tf_lambda_new, tf_den_new end
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
2036
if Sys.islinux() py""" import tensorflow as tf import socket if socket.gethostname() != "Dolores": libFwiOp = tf.load_op_library('../Ops/FWI/build/libFwiOp.so') else: libFwiOp = tf.load_op_library('../Ops/FWI/build_dolores/libFwiOp.so') @tf.custom_gradient def fwi_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname): misfit = libFwiOp.fwi_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname) def grad(dy): return libFwiOp.fwi_op_grad(dy, tf.constant(1.0,dtype=tf.float64),λ,μ,ρ,stf,gpu_id,shot_ids,para_fname) return misfit, grad def fwi_obs_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname): misfit = libFwiOp.fwi_obs_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname) return misfit """ elseif Sys.isapple() py""" import tensorflow as tf libFwiOp = tf.load_op_library('../Ops/FWI/build/libFwiOp.so') @tf.custom_gradient def fwi_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname): misfit = libFwiOp.fwi_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname) def grad(dy): return libFwiOp.fwi_op_grad(dy, tf.constant(1.0,dtype=tf.float64),λ,μ,ρ,stf,gpu_id,shot_ids,para_fname) return misfit, grad def fwi_obs_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname): misfit = libFwiOp.fwi_obs_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname) return misfit """ elseif Sys.iswindows() py""" import tensorflow as tf libFwiOp = tf.load_op_library('../Ops/FWI/build/libFwiOp.so') @tf.custom_gradient def fwi_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname): misfit = libFwiOp.fwi_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname) def grad(dy): return libFwiOp.fwi_op_grad(dy, tf.constant(1.0,dtype=tf.float64),λ,μ,ρ,stf,gpu_id,shot_ids,para_fname) return misfit, grad def fwi_obs_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname): misfit = libFwiOp.fwi_obs_op(λ,μ,ρ,stf,gpu_id,shot_ids,para_fname) return misfit """ end fwi_op = py"fwi_op" fwi_obs_op = py"fwi_obs_op"
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
2301
using PyTensorFlow using PyCall using LinearAlgebra using PyPlot using Random Random.seed!(233) if Sys.islinux() py""" import tensorflow as tf libLaplacian = tf.load_op_library('../Ops/Laplacian/build/libLaplacian.so') @tf.custom_gradient def laplacian_op(coef,func,h,rhograv): p = libLaplacian.laplacian(coef,func,h,rhograv) def grad(dy): return libLaplacian.laplacian_grad(dy, coef, func, h, rhograv) return p, grad """ elseif Sys.isapple() py""" import tensorflow as tf libPoissonOp = tf.load_op_library('../Ops/Laplacian/build/libLaplacian.dylib') @tf.custom_gradient def laplacian_op(coef,func,h,rhograv): p = libLaplacian.laplacian(coef,func,h,rhograv) def grad(dy): return libLaplacian.laplacian_grad(dy, coef, func, h, rhograv) return p, grad """ elseif Sys.iswindows() py""" import tensorflow as tf libPoissonOp = tf.load_op_library('../Ops/Laplacian/build/libLaplacian.dll') @tf.custom_gradient def laplacian_op(coef,func,h,rhograv): p = libLaplacian.laplacian(coef,func,h,rhograv) def grad(dy): return libLaplacian.laplacian_grad(dy, coef, func, h, rhograv) return p, grad """ end laplacian_op = py"laplacian_op" if Sys.islinux() py""" import tensorflow as tf libUpwlapOp = tf.load_op_library('../Ops/Upwlap/build/libUpwlapOp.so') @tf.custom_gradient def upwlap_op(perm,mobi,func,h,rhograv): out = libUpwlapOp.upwlap_op(perm,mobi,func,h,rhograv) def grad(dy): return libUpwlapOp.upwlap_op_grad(dy, out, perm,mobi,func,h,rhograv) return out, grad """ elseif Sys.isapple() py""" import tensorflow as tf libUpwlapOp = tf.load_op_library('../Ops/Upwlap/build/libUpwlapOp.dylib') @tf.custom_gradient def upwlap_op(perm,mobi,func,h,rhograv): out = libUpwlapOp.upwlap_op(perm,mobi,func,h,rhograv) def grad(dy): return libUpwlapOp.upwlap_op_grad(dy, out, perm,mobi,func,h,rhograv) return out, grad """ elseif Sys.iswindows() py""" import tensorflow as tf libUpwlapOp = tf.load_op_library('./Ops/Upwlap/build/libUpwlapOp.dll') @tf.custom_gradient def upwlap_op(perm,mobi,func,h,rhograv): out = libUpwlapOp.upwlap_op(perm,mobi,func,h,rhograv) def grad(dy): return libUpwlapOp.upwlap_op_grad(dy, out, perm,mobi,func,h,rhograv) return out, grad """ end upwlap_op = py"upwlap_op"
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
5967
#= Main program for FWI =# using ArgParse function parse_commandline() s = ArgParseSettings() @add_arg_table s begin "--generate_data" arg_type = Bool default = false "--version" arg_type = String default = "0000" "--verbose" arg_type = Bool default = false end return parse_args(s) end args = parse_commandline() if !isdir("./$(args["version"])") mkdir("./$(args["version"])") end using PyTensorFlow using PyCall using LinearAlgebra using PyPlot using Random Random.seed!(233) include("ops_imseq.jl") include("../Ops/FWI/fwi_util.jl") include("fwi_util_op.jl") np = pyimport("numpy") nz = 134 nx = 384 dz = 24. # meters dx = 24. nSteps = 2001 dt = 0.0025 f0 = 4.5 filter_para = [0, 0.1, 100.0, 200.0] isAc = true nPml = 32 nPad = 32 - mod((nz+2*nPml), 32) nz_pad = nz + 2*nPml + nPad nx_pad = nx + 2*nPml # reflection x_src = collect(4:8:384) z_src = 2ones(Int64, size(x_src)) x_rec = collect(3:381) z_rec = 2ones(Int64, size(x_rec)) # xwell # z_src = collect(5:10:nz-5) #14->11srcs 10->15srcs # x_src = 5ones(Int64, size(z_src)) # z_rec = collect(5:1:nz-5) # x_rec = (nx-5) .* ones(Int64, size(z_rec)) para_fname = "./$(args["version"])/para_file.json" survey_fname = "./$(args["version"])/survey_file.json" data_dir_name = "./$(args["version"])/Data" paraGen(nz_pad, nx_pad, dz, dx, nSteps, dt, f0, nPml, nPad, filter_para, isAc, para_fname, survey_fname, data_dir_name) surveyGen(z_src, x_src, z_rec, x_rec, survey_fname) tf_cp = constant(reshape(reinterpret(Float32,read("Mar_models/Model_Cp_true.bin")),(nz_pad, nx_pad)), dtype=Float64) cs = zeros(nz_pad, nx_pad) den = 1000.0 .* ones(nz_pad, nx_pad) cp_pad_value = 3000.0 # tf_cp = constant(cp) tf_cs = constant(cs) tf_den = constant(den) src = Matrix{Float64}(undef, 1, 2001) # # src[1,:] = Float64.(reinterpret(Float32, read("../Ops/FWI/Src/params/ricker_10Hz.bin"))) src[1,:] = Float64.(reinterpret(Float32, read("../Ops/FWI/Src/params/Mar_source_2001.bin"))) # src = sourceGene(f0, nSteps, dt) tf_stf = constant(repeat(src, outer=length(z_src))) # tf_para_fname = tf.strings.join([para_fname]) tf_gpu_id0 = constant(0, dtype=Int32) tf_gpu_id1 = constant(1, dtype=Int32) nGpus = 2 tf_gpu_id_array = constant(collect(0:nGpus-1), dtype=Int32) tf_shot_ids0 = constant(collect(Int32, 0:length(x_src)-1), dtype=Int32) shot_id_points = Int32.(trunc.(collect(LinRange(0, length(z_src)-1, nGpus+1)))) function pad_cp(cp) tran_cp = cast(cp, Float64) return tf.pad(tran_cp, [nPml (nPml+nPad); nPml nPml], constant_values=3000.0) end # NOTE Generate Data if args["generate_data"] println("Generate Test Data...") if !isdir("./$(args["version"])/Data") mkdir("./$(args["version"])/Data") end res = fwi_obs_op(tf_cp, tf_cs, tf_den, tf_stf, tf_gpu_id0, tf_shot_ids0, para_fname) config = tf.ConfigProto() config.intra_op_parallelism_threads = 24 config.inter_op_parallelism_threads = 24 sess = Session(config=config); init(sess); run(sess, res) error("Generate Data: Stop") end cp_init = reshape(reinterpret(Float32,read("Mar_models/Model_Cp_init_1D.bin")),(nz_pad, nx_pad)) tf_cp_inv = Variable(cp_init, dtype=Float64) Mask = ones(nz_pad, nx_pad) Mask[nPml+1:nPml+10,:] .= 0.0 tf_cp_inv_msk = tf_cp_inv .* constant(Mask) + constant(cp_init[1,1] .* (1. .- Mask)) # NOTE Compute FWI loss # loss = constant(0.0) # for i = 1:nGpus # global loss # tf_shot_ids = constant(collect(shot_id_points[i] : shot_id_points[i+1]), dtype=Int32) # loss += fwi_op(tf_cp_inv_msk, tf_cs, tf_den, tf_stf, tf_gpu_id_array[i], tf_shot_ids, para_fname) # end loss = fwi_op(tf_cp_inv_msk, tf_cs, tf_den, tf_stf, tf_gpu_id_array[1], tf_shot_ids0, para_fname) gradCp = gradients(loss, tf_cp_inv) if args["verbose"] sess = Session(); init(sess) println("Initial loss = ", run(sess, loss)) g = gradients(loss, tfCtxInit.K) G = run(sess, g) pcolormesh(G); savefig("test.png"); close("all") end # Optimization __cnt = 0 # invK = zeros(m,n) function print_loss(l, Cp, gradCp) global __cnt, __l, __Cp, __gradCp if mod(__cnt,1)==0 println("\niter=$__iter, eval=$__cnt, current loss=",l) # println("a=$a, b1=$b1, b2=$b2") end __cnt += 1 __l = l __Cp = Cp __gradCp = gradCp end __iter = 0 function print_iter(rk) global __iter, __l if mod(__iter,1)==0 println("\n************* ITER=$__iter *************\n") end __iter += 1 open("./$(args["version"])/loss.txt", "a") do io writedlm(io, Any[__iter __l]) end open("./$(args["version"])/Cp$__iter.txt", "w") do io writedlm(io, __Cp) end open("./$(args["version"])/gradCp$__iter.txt", "w") do io writedlm(io, __gradCp) end end config = tf.ConfigProto() config.intra_op_parallelism_threads = 24 config.inter_op_parallelism_threads = 24 sess = Session(config=config); init(sess); # cp_low_bd = 1500. .* ones(nz_pad, nx_pad) # cp_high_bd = 5500. .* ones(nz_pad, nx_pad) # cp_high_bd[nPml+1:nPml+10,:] .= 1500.0 opt = ScipyOptimizerInterface(loss, var_list=[tf_cp_inv], var_to_bounds=Dict(tf_cp_inv=> (1500.0, 5500.0)), method="L-BFGS-B", options=Dict("maxiter"=> 100, "ftol"=>1e-6, "gtol"=>1e-6)) @info "Optimization Starts..." ScipyOptimizerMinimize(sess, opt, loss_callback=print_loss, step_callback=print_iter, fetches=[loss,tf_cp_inv,gradCp]) # adam = AdamOptimizer(learning_rate=50.0) # op = minimize(adam, loss) # sess = Session(); init(sess); # for iter = 1:1000 # _, misfit, cp, cpgrad = run(sess, [op, loss, tf_cp_inv, gradCp]) # open("./$(args["version"])/Cp$iter.txt", "w") do io # writedlm(io, cp) # end # open("./$(args["version"])/loss.txt", "a") do io # writedlm(io, Any[iter misfit]) # end # open("./$(args["version"])/gradCp$iter.txt", "w") do io # writedlm(io, cpgrad) # end # end
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
5659
#= Main program for two phase flow inversion =# include("args.jl") function sw_p_to_lambda_den(sw, p) sw = tf.reshape(sw, (1, m, n, 1)) p = tf.reshape(p, (1, m, n, 1)) sw = tf.image.resize_bilinear(sw, (nz, nx)) p = tf.image.resize_bilinear(p, (nz, nx)) sw = cast(sw, Float64) p = cast(p, Float64) sw = squeeze(sw) p = squeeze(p) # tran_lambda, tran_den = Gassman(sw) # tran_lambda, tran_den = RockLinear(sw) # test linear relationship # tran_lambda, tran_den = Patchy(sw) tran_lambda, tran_den = Brie(sw) tran_lambda_pad = tf.pad(tran_lambda, [nPml (nPml+nPad); nPml nPml], constant_values=3500.0^2*2200.0/3.0) /1e6 tran_den_pad = tf.pad(tran_den, [nPml (nPml+nPad); nPml nPml], constant_values=2200.0) return tran_lambda_pad, tran_den_pad end # NOTE Generate Data if args["generate_data"] println("Generate Test Data...") K = 20.0 .* ones(m,n) # millidarcy K[8:10,:] .= 120.0 # K[17:21,:] .= 100.0 # for i = 1:m # for j = 1:n # if i <= (14 - 24)/(30 - 1)*(j-1) + 24 && i >= (12 - 18)/(30 - 1)*(j-1) + 18 # K[i,j] = 100.0 # end # end # end tfCtxTrue = tfCtxGen(m,n,h,NT,Δt,Z,X,ρw,ρo,μw,μo,K,g,ϕ,qw,qo, sw0, true) out_sw_true, out_p_true = imseq(tfCtxTrue) lambdas = Array{PyObject}(undef, n_survey) dens = Array{PyObject}(undef, n_survey) for i = 1:n_survey sw = out_sw_true[survey_indices[i]] p = out_p_true[survey_indices[i]] lambdas[i], dens[i] = sw_p_to_lambda_den(sw, p) end misfit = Array{PyObject}(undef, n_survey) for i = 1:n_survey if !isdir("./$(args["version"])/Data$i") mkdir("./$(args["version"])/Data$i") end para_fname = "./$(args["version"])/para_file$i.json" survey_fname = "./$(args["version"])/survey_file$i.json" paraGen(nz_pad, nx_pad, dz, dx, nSteps, dt, f0, nPml, nPad, para_fname, survey_fname, "./$(args["version"])/Data$i/") # shot_inds = collect(1:3:length(z_src)) .+ mod(i-1,3) # 5src rotation # shot_inds = i # 1src rotation shot_inds = collect(1:length(z_src)) # all sources surveyGen(z_src[shot_inds], x_src[shot_inds], z_rec, x_rec, survey_fname) tf_shot_ids0 = constant(collect(0:length(shot_inds)-1), dtype=Int32) misfit[i] = fwi_obs_op(lambdas[i], tf_shear_pad, dens[i], tf_stf, tf_gpu_id0, tf_shot_ids0, para_fname) end config = tf.ConfigProto() config.intra_op_parallelism_threads = 24 config.inter_op_parallelism_threads = 24 sess = Session(config=config); init(sess); run(sess, misfit) error("Generate Data: Stop") end tfCtxInit = tfCtxGen(m,n,h,NT,Δt,Z,X,ρw,ρo,μw,μo,K_init,g,ϕ,qw,qo, sw0, false) out_sw_init, out_p_init = imseq(tfCtxInit) lambdas = Array{PyObject}(undef, n_survey) dens = Array{PyObject}(undef, n_survey) for i = 1:n_survey sw = out_sw_init[survey_indices[i]] p = out_p_init[survey_indices[i]] lambdas[i], dens[i] = sw_p_to_lambda_den(sw, p) end # NOTE Compute FWI loss loss = constant(0.0) for i = 1:n_survey global loss para_fname = "./$(args["version"])/para_file$i.json" survey_fname = "./$(args["version"])/survey_file$i.json" # shot_inds = collect(1:3:length(z_src)) .+ mod(i-1,3) # shot_inds = i shot_inds = collect(1:length(z_src)) # all sources tf_shot_ids0 = constant(collect(0:length(shot_inds)-1), dtype=Int32) loss += fwi_op(lambdas[i], tf_shear_pad, dens[i], tf_stf, tf_gpu_id_array[mod(i,nGpus)], tf_shot_ids0, para_fname) # mod(i,2) end gradK = gradients(loss, tfCtxInit.K) if args["verbose"] sess = Session(); init(sess) println("Initial loss = ", run(sess, loss)) g = gradients(loss, tfCtxInit.K) G = run(sess, g) pcolormesh(G); savefig("test.png"); close("all") end # Optimization __cnt = 0 # invK = zeros(m,n) function print_loss(l, K, gradK, brie_coef) global __cnt, __l, __K, __gradK, __brie_coef if mod(__cnt,1)==0 println("\niter=$__iter, eval=$__cnt, current loss=",l) # println("a=$a, b1=$b1, b2=$b2") end __cnt += 1 __l = l __K = K __gradK = gradK __brie_coef = brie_coef end __iter = 0 function print_iter(rk) global __iter, __l if mod(__iter,1)==0 println("\n************* ITER=$__iter *************\n") end __iter += 1 open("./$(args["version"])/loss.txt", "a") do io writedlm(io, Any[__iter __l]) end open("./$(args["version"])/K$__iter.txt", "w") do io writedlm(io, __K) end open("./$(args["version"])/gradK$__iter.txt", "w") do io writedlm(io, __gradK) end open("./$(args["version"])/brie_coef.txt", "a") do io writedlm(io, Any[__iter __brie_coef]) end end config = tf.ConfigProto() config.intra_op_parallelism_threads = 24 config.inter_op_parallelism_threads = 24 sess = Session(config=config); init(sess); # opt = ScipyOptimizerInterface(loss, var_list=[tfCtxInit.K], var_to_bounds=Dict(tfCtxInit.K=> (10.0, 130.0)), method="L-BFGS-B", # options=Dict("maxiter"=> 100, "ftol"=>1e-6, "gtol"=>1e-6)) opt = ScipyOptimizerInterface(loss, var_list=[tfCtxInit.K, tf_brie_coef], var_to_bounds=Dict(tfCtxInit.K=> (10.0, 130.0), tf_brie_coef=>(1.0,100.0)), method="L-BFGS-B", options=Dict("maxiter"=> 100, "ftol"=>1e-6, "gtol"=>1e-6)) @info "Optimization Starts..." # ScipyOptimizerMinimize(sess, opt, loss_callback=print_loss, step_callback=print_iter, fetches=[loss,tfCtxInit.K,gradK]) ScipyOptimizerMinimize(sess, opt, loss_callback=print_loss, step_callback=print_iter, fetches=[loss,tfCtxInit.K,gradK, tf_brie_coef])
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
3208
using PyTensorFlow using PyCall using LinearAlgebra using PyPlot using DelimitedFiles using Random Random.seed!(233) np = pyimport("numpy") include("poisson_op.jl") include("laplacian_op.jl") include("sat_op.jl") const K_CONST = 9.869232667160130e-16 * 86400 * 1e3 mutable struct Ctx m; n; h; NT; Δt; Z; X; ρw; ρo; μw; μo; K; g; ϕ; qw; qo; sw0 end function tfCtxGen(m,n,h,NT,Δt,Z,X,ρw,ρo,μw,μo,K,g,ϕ,qw,qo,sw0,ifTrue) tf_h = constant(h) # tf_NT = constant(NT) tf_Δt = constant(Δt) tf_Z = constant(Z) tf_X= constant(X) tf_ρw = constant(ρw) tf_ρo = constant(ρo) tf_μw = constant(μw) tf_μo = constant(μo) # tf_K = isa(K,Array) ? Variable(K) : K if ifTrue tf_K = constant(K) else tf_K = Variable(K) end tf_g = constant(g) # tf_ϕ = Variable(ϕ) tf_ϕ = constant(ϕ) tf_qw = constant(qw) tf_qo = constant(qo) tf_sw0 = constant(sw0) return Ctx(m,n,tf_h,NT,tf_Δt,tf_Z,tf_X,tf_ρw,tf_ρo,tf_μw,tf_μo,tf_K,tf_g,tf_ϕ,tf_qw,tf_qo,tf_sw0) end function Krw(Sw) return Sw ^ 1.5 end function Kro(So) return So ^1.5 end function ave_normal(quantity, m, n) aa = sum(quantity) return aa/(m*n) end # variables : sw, u, v, p # (time dependent) parameters: qw, qo, ϕ function onestep(sw, p, m, n, h, Δt, Z, ρw, ρo, μw, μo, K, g, ϕ, qw, qo) # step 1: update p # λw = Krw(sw)/μw # λo = Kro(1-sw)/μo λw = sw.*sw/μw λo = (1-sw).*(1-sw)/μo λ = λw + λo q = qw + qo + λw/(λo+1e-16).*qo # q = qw + qo potential_c = (ρw - ρo)*g .* Z # Step 1: implicit potential Θ = upwlap_op(K * K_CONST, λo, potential_c, h, constant(0.0)) load_normal = (Θ+q/ALPHA) - ave_normal(Θ+q/ALPHA, m, n) # p = poisson_op(λ.*K* K_CONST, load_normal, h, constant(0.0), constant(1)) p = upwps_op(K * K_CONST, λ, load_normal, p, h, constant(0.0), constant(0)) # potential p = pw - ρw*g*h # step 2: implicit transport sw = sat_op(sw, p, K * K_CONST, ϕ, qw, qo, μw, μo, sw, Δt, h) return sw, p end """ impes(tf_ctx) Solve the two phase flow equation. `qw` and `qo` -- `NT x m x n` numerical array, `qw[i,:,:]` the corresponding value of qw at i*Δt `sw0` and `p0` -- initial value for `sw` and `p`. `m x n` numerical array. """ function imseq(tf_ctx) ta_sw, ta_p = TensorArray(NT+1), TensorArray(NT+1) ta_sw = write(ta_sw, 1, tf_ctx.sw0) ta_p = write(ta_p, 1, constant(zeros(tf_ctx.m, tf_ctx.n))) i = constant(1, dtype=Int32) function condition(i, tas...) i <= tf_ctx.NT end function body(i, tas...) ta_sw, ta_p = tas sw, p = onestep(read(ta_sw, i), read(ta_p, i), tf_ctx.m, tf_ctx.n, tf_ctx.h, tf_ctx.Δt, tf_ctx.Z, tf_ctx.ρw, tf_ctx.ρo, tf_ctx.μw, tf_ctx.μo, tf_ctx.K, tf_ctx.g, tf_ctx.ϕ, tf_ctx.qw[i], tf_ctx.qo[i]) ta_sw = write(ta_sw, i+1, sw) ta_p = write(ta_p, i+1, p) i+1, ta_sw, ta_p end _, ta_sw, ta_p = while_loop(condition, body, [i; ta_sw; ta_p;]) out_sw, out_p = stack(ta_sw), stack(ta_p) end function vis(val, args...;kwargs...) close("all") ns = Int64.(round.(LinRange(1,size(val,1),9))) for i = 1:9 subplot(330+i) imshow(val[ns[i],:,:], args...;kwargs...) colorbar() end end
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
1438
using PyPlot using DelimitedFiles if !isdir("figures_summary") mkdir("figures_summary") end m = 15 n = 30 h = 30.0 # meter dt = 0.00025 nt = 3001 rc("axes", titlesize=16) rc("axes", labelsize=12) rc("xtick", labelsize=12) rc("ytick", labelsize=12) rc("legend", fontsize=16) shot1=read("CO2/Data1/Shot8.bin"); shot1 = reshape(reinterpret(Float32,shot1),(nt,142)) fig,ax = subplots() imshow(shot1, extent=[0,m*h,(nt-1)*dt,0], cmap="gray", aspect=1.5*(m*h)/((nt-1)*dt)); # imshow(shot1', extent=[0,(nt-1)*dt,m*h,0], cmap="gray", aspect=0.8*(nt-1)*dt/(m*h)); xlabel("Depth (m)") ylabel("Time (s)") ax.xaxis.tick_top() ax.xaxis.set_label_position("top") savefig("figures_summary/CO2_Data1_Shot8.pdf", bbox_inches="tight",pad_inches = 0, dpi = 300); shot2=read("CO2/Data11/Shot8.bin"); shot2 = reshape(reinterpret(Float32,shot2),(nt,142)) fig,ax = subplots() imshow(shot2, extent=[0,m*h,(nt-1)*dt,0], cmap="gray", aspect=1.5*(m*h)/((nt-1)*dt)); xlabel("Depth (m)") ylabel("Time (s)") ax.xaxis.tick_top() ax.xaxis.set_label_position("top") savefig("figures_summary/CO2_Data11_Shot8.pdf", bbox_inches="tight",pad_inches = 0, dpi = 300); fig,ax = subplots() imshow(shot2-shot1, extent=[0,m*h,(nt-1)*dt,0], cmap="gray", aspect=1.5*(m*h)/((nt-1)*dt)); xlabel("Depth (m)") ylabel("Time (s)") ax.xaxis.tick_top() ax.xaxis.set_label_position("top") savefig("figures_summary/CO2_Data_diff.pdf", bbox_inches="tight",pad_inches = 0, dpi = 300);
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
6258
using PyPlot using DelimitedFiles if !isdir("figures_summary") mkdir("figures_summary") end m = 15 n = 30 h = 30.0 dz = 3.0 # meters dx = 3.0 nz = Int64(round((m * h) / dz)) + 1 nx = Int64(round((n * h) / dx)) + 1 z_src = (collect(5:10:nz-5) .- 1 ) .* dz .+ dz/2.0 x_src = (5-1)ones(Int64, size(z_src)) .* dx .+ dx/2.0 z_rec = (collect(5:1:nz-5) .- 1) .* dz .+ dz/2.0 x_rec = (nx-5-1) .* ones(Int64, size(z_rec)) .*dx .+ dx/2.0 z_inj = (9-1)*h + h/2.0 x_inj = (3-1)*h + h/2.0 z_prod = (9-1)*h + h/2.0 x_prod = (28-1)*h + h/2.0 iter = 100 Prj_names = ["CO2", "CO2_1src", "CO2_2surveys", "Brie_3_nocoefupdate", "Brie_tune_coef_true3_start2", "Brie_true3_set2_noupdate", "Brie_tune_coef_true3_start2_scale30"] K_name = "/K$iter.txt" rc("axes", titlesize=20) rc("axes", labelsize=18) rc("xtick", labelsize=18) rc("ytick", labelsize=18) rc("legend", fontsize=20) # true model figure() K = 20.0 .* ones(m,n) K[8:10,:] .= 120.0 imshow(K, extent=[0,n*h,m*h,0]); xlabel("Distance (m)") ylabel("Depth (m)") cb = colorbar() clim([20, 120]) cb.set_label("Permeability (md)") shot_inds = collect(1:length(z_src)) scatter(x_src[shot_inds], z_src[shot_inds], c="w", marker="*") scatter(x_rec, z_rec, s=16.0, c="r", marker="v") scatter(x_inj, z_inj, c="r", marker=">", s=64) scatter(x_prod, z_prod, c="r", marker="<", s=64) savefig("figures_summary/K_true.pdf", bbox_inches="tight",pad_inches = 0, dpi = 300); # init model figure() K = 20.0 .* ones(m,n) imshow(K, extent=[0,n*h,m*h,0]); xlabel("Distance (m)") ylabel("Depth (m)") cb = colorbar() clim([20, 120]) cb.set_label("Permeability (md)") shot_inds = collect(1:length(z_src)) scatter(x_src[shot_inds], z_src[shot_inds], c="w", marker="*") scatter(x_rec, z_rec, s=16.0, c="r", marker="v") scatter(x_inj, z_inj, c="r", marker=">", s=64) scatter(x_prod, z_prod, c="r", marker="<", s=64) savefig("figures_summary/K_init.pdf", bbox_inches="tight",pad_inches = 0, dpi = 300); figure() iPrj = 1 K = readdlm(Prj_names[iPrj] * K_name) imshow(K, extent=[0,n*h,m*h,0]); xlabel("Distance (m)") ylabel("Depth (m)") cb = colorbar() clim([20, 120]) cb.set_label("Permeability (md)") shot_inds = collect(1:length(z_src)) scatter(x_src[shot_inds], z_src[shot_inds], c="w", marker="*") scatter(x_rec, z_rec, s=16.0, c="r", marker="v") scatter(x_inj, z_inj, c="r", marker=">", s=64) scatter(x_prod, z_prod, c="r", marker="<", s=64) savefig("figures_summary/K_$(Prj_names[iPrj]).pdf", bbox_inches="tight",pad_inches = 0, dpi = 300); figure() iPrj = 3 K = readdlm(Prj_names[iPrj] * K_name) imshow(K, extent=[0,n*h,m*h,0]); xlabel("Distance (m)") ylabel("Depth (m)") cb = colorbar() clim([20, 120]) cb.set_label("Permeability (md)") shot_inds = collect(1:length(z_src)) scatter(x_src[shot_inds], z_src[shot_inds], c="w", marker="*") scatter(x_rec, z_rec, s=16.0, c="r", marker="v") scatter(x_inj, z_inj, c="r", marker=">", s=64) scatter(x_prod, z_prod, c="r", marker="<", s=64) savefig("figures_summary/K_$(Prj_names[iPrj]).pdf", bbox_inches="tight",pad_inches = 0, dpi = 300); figure() iPrj = 4 K = readdlm(Prj_names[iPrj] * K_name) imshow(K, extent=[0,n*h,m*h,0]); xlabel("Distance (m)") ylabel("Depth (m)") cb = colorbar() clim([20, 120]) cb.set_label("Permeability (md)") shot_inds = collect(1:length(z_src)) scatter(x_src[shot_inds], z_src[shot_inds], c="w", marker="*") scatter(x_rec, z_rec, s=16.0, c="r", marker="v") scatter(x_inj, z_inj, c="r", marker=">", s=64) scatter(x_prod, z_prod, c="r", marker="<", s=64) savefig("figures_summary/K_$(Prj_names[iPrj]).pdf", bbox_inches="tight",pad_inches = 0, dpi = 300); figure() iPrj = 5 K = readdlm(Prj_names[iPrj] * K_name) imshow(K, extent=[0,n*h,m*h,0]); xlabel("Distance (m)") ylabel("Depth (m)") cb = colorbar() clim([20, 120]) cb.set_label("Permeability (md)") shot_inds = collect(1:length(z_src)) scatter(x_src[shot_inds], z_src[shot_inds], c="w", marker="*") scatter(x_rec, z_rec, s=16.0, c="r", marker="v") scatter(x_inj, z_inj, c="r", marker=">", s=64) scatter(x_prod, z_prod, c="r", marker="<", s=64) savefig("figures_summary/K_$(Prj_names[iPrj]).pdf", bbox_inches="tight",pad_inches = 0, dpi = 300); figure() iPrj = 6 K = readdlm(Prj_names[iPrj] * K_name) imshow(K, extent=[0,n*h,m*h,0]); xlabel("Distance (m)") ylabel("Depth (m)") cb = colorbar() clim([20, 120]) cb.set_label("Permeability (md)") shot_inds = collect(1:length(z_src)) scatter(x_src[shot_inds], z_src[shot_inds], c="w", marker="*") scatter(x_rec, z_rec, s=16.0, c="r", marker="v") scatter(x_inj, z_inj, c="r", marker=">", s=64) scatter(x_prod, z_prod, c="r", marker="<", s=64) savefig("figures_summary/K_$(Prj_names[iPrj]).pdf", bbox_inches="tight",pad_inches = 0, dpi = 300); figure() iPrj = 7 K = readdlm(Prj_names[iPrj] * K_name) imshow(K, extent=[0,n*h,m*h,0]); xlabel("Distance (m)") ylabel("Depth (m)") cb = colorbar() clim([20, 120]) cb.set_label("Permeability (md)") shot_inds = collect(1:length(z_src)) scatter(x_src[shot_inds], z_src[shot_inds], c="w", marker="*") scatter(x_rec, z_rec, s=16.0, c="r", marker="v") scatter(x_inj, z_inj, c="r", marker=">", s=64) scatter(x_prod, z_prod, c="r", marker="<", s=64) savefig("figures_summary/K_$(Prj_names[iPrj]).pdf", bbox_inches="tight",pad_inches = 0, dpi = 300); figure() z_src = [75].* dz .+ dz/2.0 # single source x_src = (5-1)ones(Int64, size(z_src)) .* dx .+ dx/2.0 iPrj = 2 K = readdlm(Prj_names[iPrj] * K_name) imshow(K, extent=[0,n*h,m*h,0]); xlabel("Distance (m)") ylabel("Depth (m)") cb = colorbar() clim([20, 120]) cb.set_label("Permeability (md)") shot_inds = collect(1:length(z_src)) scatter(x_src[shot_inds], z_src[shot_inds], c="w", marker="*") scatter(x_rec, z_rec, s=16.0, c="r", marker="v") scatter(x_inj, z_inj, c="r", marker=">", s=64) scatter(x_prod, z_prod, c="r", marker="<", s=64) savefig("figures_summary/K_$(Prj_names[iPrj]).pdf", bbox_inches="tight",pad_inches = 0, dpi = 300); rc("axes", titlesize=14) rc("axes", labelsize=14) rc("xtick", labelsize=14) rc("ytick", labelsize=14) rc("legend", fontsize=14) figure() iPrj = 7 brie_coef = readdlm(Prj_names[iPrj] * "/brie_coef.txt")[:,2]./30.0 plot(0:100,[2;brie_coef], "k");grid(ls="--") # plot(1:100, 3ones(100)) xlabel("Iterations") ylabel("Brie model coefficient") savefig("figures_summary/brie_coef_curve.pdf", bbox_inches="tight",pad_inches = 0, dpi = 300);
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
1244
using DelimitedFiles using PyPlot close("all") if !isdir("figures_summary") mkdir("figures_summary") end Prj_names = ["CO2", "CO2_1src", "CO2_2surveys", "Brie_3_nocoefupdate", "Brie_tune_coef_true3_start2_scale30"] rc("axes", titlesize=14) rc("axes", labelsize=14) rc("xtick", labelsize=14) rc("ytick", labelsize=14) rc("legend", fontsize=14) figure() L1 = readdlm("$(Prj_names[1])/loss.txt") l1=semilogy(L1[:,1], L1[:,2]/L1[1,2], label="Baseline") legend() L2 = readdlm("$(Prj_names[2])/loss.txt") l2=semilogy(L2[:,1], L2[:,2]/L2[1,2], label="One source") legend() L3 = readdlm("$(Prj_names[3])/loss.txt") l3=semilogy(L3[:,1], L3[:,2]/L3[1,2], label="Two surveys") legend() grid(ls="--") xlabel("Iteration Number") ylabel("Normalized misfit") savefig("figures_summary/loss.pdf", bbox_inches="tight",pad_inches = 0, dpi = 300); figure() L4 = readdlm("$(Prj_names[4])/loss.txt") l4=semilogy(L4[:,1], L4[:,2]/L4[1,2], label="Exact coefficient") legend() L5 = readdlm("$(Prj_names[5])/loss.txt") l5=semilogy(L5[:,1], L5[:,2]/L5[1,2], label="Inexact coefficient") legend() grid(ls="--") xlabel("Iteration Number") ylabel("Normalized misfit") savefig("figures_summary/loss_brie.pdf", bbox_inches="tight",pad_inches = 0, dpi = 300);
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
92
using PyPlot include("args.jl") Sw = constant(collect(0:0.001:1)) lambda_brie_3 = Brie(Sw)
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
6300
include("args.jl") function sw_p_to_lambda_den(sw, p) sw = tf.reshape(sw, (1, m, n, 1)) p = tf.reshape(p, (1, m, n, 1)) sw = tf.image.resize_bilinear(sw, (nz, nx)) p = tf.image.resize_bilinear(p, (nz, nx)) sw = cast(sw, Float64) p = cast(p, Float64) sw = squeeze(sw) p = squeeze(p) # tran_lambda, tran_den = Gassman(sw) # tran_lambda, tran_den = RockLinear(sw) # test linear relationship tran_lambda, tran_den = Patchy(sw) return tran_lambda, tran_den end if !isdir("figures_summary") mkdir("figures_summary") end K = 20.0 .* ones(m,n) # millidarcy # K[8:10,:] .= 120.0 # K[17:21,:] .= 100.0 # for i = 1:m # for j = 1:n # if i <= (14 - 24)/(30 - 1)*(j-1) + 24 && i >= (12 - 18)/(30 - 1)*(j-1) + 18 # K[i,j] = 100.0 # end # end # end tfCtxTrue = tfCtxGen(m,n,h,NT,Δt,Z,X,ρw,ρo,μw,μo,K,g,ϕ,qw,qo, sw0, true) out_sw_true, out_p_true = imseq(tfCtxTrue) lambdas = Array{PyObject}(undef, n_survey) dens = Array{PyObject}(undef, n_survey) for i = 1:n_survey sw = out_sw_true[survey_indices[i]] p = out_p_true[survey_indices[i]] lambdas[i], dens[i] = sw_p_to_lambda_den(sw, p) end sess = Session();init(sess); vps = Array{PyObject}(undef, n_survey) for i=1:n_survey vps[i] = sqrt((lambdas[i] + 2.0 * tf_shear_sat1[i])/dens[i]) end V = run(sess, vps); S = run(sess, out_sw_true); P = run(sess, out_p_true); z_inj = (9-1)*h + h/2.0 x_inj = (3-1)*h + h/2.0 z_prod = (9-1)*h + h/2.0 x_prod = (28-1)*h + h/2.0 rc("axes", titlesize=30) rc("axes", labelsize=30) rc("xtick", labelsize=28) rc("ytick", labelsize=28) rc("legend", fontsize=30) fig1,axs = subplots(3,3, figsize=[30,15], sharex=true, sharey=true) ims = Array{Any}(undef, 9) for iPrj = 1:3 for jPrj = 1:3 ims[(iPrj-1)*3+jPrj] = axs[iPrj,jPrj].imshow(V[(iPrj-1)*3+jPrj], extent=[0,n*h,m*h,0], vmin=3350, vmax=3500); axs[iPrj,jPrj].title.set_text("Snapshot $((iPrj-1)*3+jPrj)") if jPrj == 1 || jPrj == 1 axs[iPrj,jPrj].set_ylabel("Depth (m)") end if iPrj == 3 || iPrj == 3 axs[iPrj,jPrj].set_xlabel("Distance (m)") end # cb = fig1.colorbar(ims[(iPrj-1)*3+jPrj], ax=axs[iPrj,jPrj]) # cb.set_label("Vp") axs[iPrj,jPrj].scatter(x_inj, z_inj, c="r", marker=">", s=128) axs[iPrj,jPrj].scatter(x_prod, z_prod, c="r", marker="<", s=128) end end fig1.subplots_adjust(wspace=0.02, hspace=0.18) cbar_ax = fig1.add_axes([0.91, 0.08, 0.01, 0.82]) cb1 = fig1.colorbar(ims[1], cax=cbar_ax) cb1.set_label("Vp (m/s)") savefig("figures_summary/Vp_evo_patchy_init.pdf",bbox_inches="tight",pad_inches = 0); fig2,axs = subplots(3,3, figsize=[30,15], sharex=true, sharey=true) ims = Array{Any}(undef, 9) for iPrj = 1:3 for jPrj = 1:3 ims[(iPrj-1)*3+jPrj] = axs[iPrj,jPrj].imshow(S[survey_indices[(iPrj-1)*3+jPrj], :, :], extent=[0,n*h,m*h,0], vmin=0.0, vmax=0.6); axs[iPrj,jPrj].title.set_text("Snapshot $((iPrj-1)*3+jPrj)") if jPrj == 1 || jPrj == 1 axs[iPrj,jPrj].set_ylabel("Depth (m)") end if iPrj == 3 || iPrj == 3 axs[iPrj,jPrj].set_xlabel("Distance (m)") end # if iPrj ==2 && jPrj == 3 # cb = fig2.colorbar(ims[(iPrj-1)*3+jPrj], ax=axs[iPrj,jPrj]) # cb.set_label("Saturation") axs[iPrj,jPrj].scatter(x_inj, z_inj, c="r", marker=">", s=128) axs[iPrj,jPrj].scatter(x_prod, z_prod, c="r", marker="<", s=128) end end # fig2.subplots_adjust(wspace=0.04, hspace=0.042) fig2.subplots_adjust(wspace=0.02, hspace=0.18) cbar_ax = fig2.add_axes([0.91, 0.08, 0.01, 0.82]) cb2 = fig2.colorbar(ims[1], cax=cbar_ax) cb2.set_label("Saturation") savefig("figures_summary/Saturation_evo_patchy_init.pdf",bbox_inches="tight",pad_inches = 0); fig3,axs = subplots(3,3, figsize=[30,15], sharex=true, sharey=true) ims = Array{Any}(undef, 9) for iPrj = 1:3 for jPrj = 1:3 ims[(iPrj-1)*3+jPrj] = axs[iPrj,jPrj].imshow(P[survey_indices[(iPrj-1)*3+jPrj], :, :]*1.4504e-04, extent=[0,n*h,m*h,0], vmin=-2500.0, vmax=500); axs[iPrj,jPrj].title.set_text("Snapshot $((iPrj-1)*3+jPrj)") if jPrj == 1 || jPrj == 1 axs[iPrj,jPrj].set_ylabel("Depth (m)") end if iPrj == 3 || iPrj == 3 axs[iPrj,jPrj].set_xlabel("Distance (m)") end # if iPrj ==2 && jPrj == 3 # cb = fig2.colorbar(ims[(iPrj-1)*3+jPrj], ax=axs[iPrj,jPrj]) # cb.set_label("Saturation") axs[iPrj,jPrj].scatter(x_inj, z_inj, c="r", marker=">", s=128) axs[iPrj,jPrj].scatter(x_prod, z_prod, c="r", marker="<", s=128) end end # fig2.subplots_adjust(wspace=0.04, hspace=0.042) fig3.subplots_adjust(wspace=0.02, hspace=0.18) cbar_ax = fig3.add_axes([0.91, 0.08, 0.01, 0.82]) cb3 = fig3.colorbar(ims[1], cax=cbar_ax) cb3.set_label("Potential (psi)") savefig("figures_summary/Potential_evo_patchy_init.pdf",bbox_inches="tight",pad_inches = 0); # iter = 100 # Prj_names = ["CO2", "CO2_1src", "CO2_2surveys", "CO2_6surveys"] # K_name = "/K$iter.txt" # fig,axs = subplots(2,2, figsize=[18,8], sharex=true, sharey=true) # for iPrj = 1:2 # for jPrj = 1:2 # # println(ax) # A = readdlm(Prj_names[(iPrj-1)*2 + jPrj] * K_name) # im = axs[iPrj,jPrj].imshow(A, extent=[0,n*h,m*h,0]); # if jPrj == 1 || jPrj == 1 # axs[iPrj,jPrj].set_ylabel("Depth (m)") # end # if iPrj == 2 || iPrj == 2 # axs[iPrj,jPrj].set_xlabel("Distance (m)") # end # axs[iPrj,jPrj].text(-0.1,1.1,string("(" * Char((iPrj-1)*2 + jPrj+'a'-1) * ")"),transform=axs[iPrj,jPrj].transAxes,size=12,weight="bold") # end # end # fig.subplots_adjust(bottom=0.1, top=0.9, left=0.1, right=0.9, # wspace=0.1, hspace=0.2) # cb_ax = fig.add_axes([0.93, 0.1, 0.02, 0.8]) # cbar = fig.colorbar(im, cax=cb_ax) # cb = fig.colorbar() # clim([20, 120]) # cb.set_label("Permeability (md)") # fig = figure() # ax = fig.add_subplot(111) # The big subplot # ax1 = fig.add_subplot(211) # ax2 = fig.add_subplot(212) # # Turn off axis lines and ticks of the big subplot # ax.spines["top"].set_color("none") # ax.spines["bottom"].set_color("none") # ax.spines["left"].set_color("none") # ax.spines["right"].set_color("none") # ax.tick_params(labelcolor="w", top="off", bottom="off", left="off", right="off") # # Set common labels # ax.set_xlabel("common xlabel") # ax.set_ylabel("common ylabel") # ax1.set_title('ax1 title') # ax2.set_title('ax2 title')
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git
[ "MIT" ]
0.3.1
32fa60d65971f3e409a959dedccb0b5c4e29f76e
code
2446
using PyTensorFlow using PyCall using LinearAlgebra using PyPlot using Random Random.seed!(233) if Sys.islinux() py""" import tensorflow as tf libPoissonOp = tf.load_op_library('../Ops/Poisson/build/libPoissonOp.so') @tf.custom_gradient def poisson_op(coef,g,h,rhograv,index): p = libPoissonOp.poisson_op(coef,g,h,rhograv,index) def grad(dy): return libPoissonOp.poisson_op_grad(dy, p, coef, g, h, rhograv, index) return p, grad """ elseif Sys.isapple() py""" import tensorflow as tf libPoissonOp = tf.load_op_library('../Ops/Poisson/build/libPoissonOp.dylib') @tf.custom_gradient def poisson_op(coef,g,h,rhograv,index): p = libPoissonOp.poisson_op(coef,g,h,rhograv,index) def grad(dy): return libPoissonOp.poisson_op_grad(dy, p, coef, g, h, rhograv, index) return p, grad """ elseif Sys.iswindows() py""" import tensorflow as tf libPoissonOp = tf.load_op_library('../Ops/Poisson/build/libPoissonOp.dll') @tf.custom_gradient def poisson_op(coef,g,h,rhograv,index): p = libPoissonOp.poisson_op(coef,g,h,rhograv,index) def grad(dy): return libPoissonOp.poisson_op_grad(dy, p, coef, g, h, rhograv, index) return p, grad """ end poisson_op = py"poisson_op" if Sys.islinux() py""" import tensorflow as tf libUpwpsOp = tf.load_op_library('../Ops/Upwps/build/libUpwpsOp.so') @tf.custom_gradient def upwps_op(permi,mobi,src,funcref,h,rhograv,index): pres = libUpwpsOp.upwps_op(permi,mobi,src,funcref,h,rhograv,index) def grad(dy): return libUpwpsOp.upwps_op_grad(dy, pres, permi,mobi,src,funcref,h,rhograv,index) return pres, grad """ elseif Sys.isapple() py""" import tensorflow as tf libUpwpsOp = tf.load_op_library('../Ops/Upwps/build/libUpwpsOp.dylib') @tf.custom_gradient def upwps_op(permi,mobi,src,funcref,h,rhograv,index): pres = libUpwpsOp.upwps_op(permi,mobi,src,funcref,h,rhograv,index) def grad(dy): return libUpwpsOp.upwps_op_grad(dy, pres, permi,mobi,src,funcref,h,rhograv,index) return pres, grad """ elseif Sys.iswindows() py""" import tensorflow as tf libUpwpsOp = tf.load_op_library('../Ops/Upwps/build/libUpwpsOp.dll') @tf.custom_gradient def upwps_op(permi,mobi,src,funcref,h,rhograv,index): pres = libUpwpsOp.upwps_op(permi,mobi,src,funcref,h,rhograv,index) def grad(dy): return libUpwpsOp.upwps_op_grad(dy, pres, permi,mobi,src,funcref,h,rhograv,index) return pres, grad """ end upwps_op = py"upwps_op"
FwiFlow
https://github.com/lidongzh/FwiFlow.jl.git