Columns: url (string, 23 to 7.17k characters); text (string, 0 to 1.65M characters)
https://huggingface.co/leix1
Lei Xia leix1 Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/kevinintel
test kevinintel Research interests None yet Organizations Papers 1 arxiv:2211.07715 models None public yet datasets None public yet
https://huggingface.co/drock577
Dylan Lang drock577 drock577 Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/bbhattar
Bibek Bhattarai bbhattar Research interests None yet Organizations models 2 bbhattar/flan_t5_xl_cnn_dailymail Text2Text Generation β€’ Updated Apr 3 bbhattar/flan-t5-samsum Text2Text Generation β€’ Updated Mar 14 datasets None public yet
https://huggingface.co/joshivm
Vinay Joshi joshivm joshivm22 Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/helenai
helenai/bert-base-uncased-sst2-jpqd-ov-int8 Updated 21 days ago β€’ 3 helenai/bert-base-uncased-squad-v1 Question Answering β€’ Updated 21 days ago β€’ 3 helenai/anton-l-wav2vec2-base-superb-sd-ov Updated Aug 1 helenai/anton-l-wav2vec2-base-superb-sv-ov Updated Aug 1 helenai/facebook-hubert-large-ls960-ft-ov Updated Aug 1 helenai/Salesforce-codegen2-1B-ov Text Generation β€’ Updated Jul 26 β€’ 9 helenai/deepset-xlm-roberta-large-squad2-ov Question Answering β€’ Updated Jul 21 β€’ 261 helenai/deepset-xlm-roberta-base-squad2-ov Question Answering β€’ Updated Jul 21 β€’ 239 helenai/xlm-roberta-large-finetuned-conll03-english-ov Token Classification β€’ Updated Jul 21 β€’ 6 helenai/philschmid-roberta-large-sst2-ov Text Classification β€’ Updated Jul 21 β€’ 243
https://huggingface.co/vandanapadala27
vandana padala vandanapadala27 Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/microsoft/speecht5_vc
SpeechT5 (voice conversion task) SpeechT5 model fine-tuned for voice conversion (speech-to-speech) on CMU ARCTIC. This model was introduced in SpeechT5: Unified-Modal Encoder-Decoder Pre-Training for Spoken Language Processing by Junyi Ao, Rui Wang, Long Zhou, Chengyi Wang, Shuo Ren, Yu Wu, Shujie Liu, Tom Ko, Qing Li, Yu Zhang, Zhihua Wei, Yao Qian, Jinyu Li, Furu Wei. SpeechT5 was first released in this repository, original weights. The license used is MIT. Disclaimer: The team releasing SpeechT5 did not write a model card for this model so this model card has been written by the Hugging Face team. Model Description Motivated by the success of T5 (Text-To-Text Transfer Transformer) in pre-trained natural language processing models, we propose a unified-modal SpeechT5 framework that explores the encoder-decoder pre-training for self-supervised speech/text representation learning. The SpeechT5 framework consists of a shared encoder-decoder network and six modal-specific (speech/text) pre/post-nets. After preprocessing the input speech/text through the pre-nets, the shared encoder-decoder network models the sequence-to-sequence transformation, and then the post-nets generate the output in the speech/text modality based on the output of the decoder. Leveraging large-scale unlabeled speech and text data, we pre-train SpeechT5 to learn a unified-modal representation, hoping to improve the modeling capability for both speech and text. To align the textual and speech information into this unified semantic space, we propose a cross-modal vector quantization approach that randomly mixes up speech/text states with latent units as the interface between encoder and decoder. Extensive evaluations show the superiority of the proposed SpeechT5 framework on a wide variety of spoken language processing tasks, including automatic speech recognition, speech synthesis, speech translation, voice conversion, speech enhancement, and speaker identification. Intended Uses & Limitations You can use this model for speech conversion. See the model hub to look for fine-tuned versions on a task that interests you. Currently, both the feature extractor and model support PyTorch. Citation BibTeX: @inproceedings{ao-etal-2022-speecht5, title = {{S}peech{T}5: Unified-Modal Encoder-Decoder Pre-Training for Spoken Language Processing}, author = {Ao, Junyi and Wang, Rui and Zhou, Long and Wang, Chengyi and Ren, Shuo and Wu, Yu and Liu, Shujie and Ko, Tom and Li, Qing and Zhang, Yu and Wei, Zhihua and Qian, Yao and Li, Jinyu and Wei, Furu}, booktitle = {Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)}, month = {May}, year = {2022}, pages={5723--5738}, } How to Get Started With the Model Use the code below to convert a mono 16 kHz speech waveform into another. 
from transformers import SpeechT5Processor, SpeechT5ForSpeechToSpeech, SpeechT5HifiGan
from datasets import load_dataset

dataset = load_dataset("hf-internal-testing/librispeech_asr_demo", "clean", split="validation")
dataset = dataset.sort("id")
sampling_rate = dataset.features["audio"].sampling_rate
example_speech = dataset[0]["audio"]["array"]

processor = SpeechT5Processor.from_pretrained("microsoft/speecht5_vc")
model = SpeechT5ForSpeechToSpeech.from_pretrained("microsoft/speecht5_vc")
vocoder = SpeechT5HifiGan.from_pretrained("microsoft/speecht5_hifigan")

inputs = processor(audio=example_speech, sampling_rate=sampling_rate, return_tensors="pt")

# load an x-vector containing the speaker's voice characteristics from a file
import numpy as np
import torch

speaker_embeddings = np.load("xvector_speaker_embedding.npy")
speaker_embeddings = torch.tensor(speaker_embeddings).unsqueeze(0)

speech = model.generate_speech(inputs["input_values"], speaker_embeddings, vocoder=vocoder)

import soundfile as sf

sf.write("speech.wav", speech.numpy(), samplerate=16000)
https://huggingface.co/spaces/microsoft/HuggingGPT
App Files Files Community 57 Building...
https://huggingface.co/spaces/Matthijs/speecht5-tts-demo
App Files Files Community 1
https://huggingface.co/MatrixYao
25 1 Matrix Yao MatrixYao yao-matrix Research interests None yet Organizations spaces 1 Stopped 1 🦀 How Many Data Points models None public yet datasets None public yet
https://huggingface.co/Kaixuanliu
Liu,Kaixuan Kaixuanliu Research interests None yet Organizations spaces 1 Build error 1 📉 Textual Inversion Training models None public yet datasets None public yet
https://huggingface.co/microsoft/speecht5_tts
SpeechT5 (TTS task) SpeechT5 model fine-tuned for speech synthesis (text-to-speech) on LibriTTS. This model was introduced in SpeechT5: Unified-Modal Encoder-Decoder Pre-Training for Spoken Language Processing by Junyi Ao, Rui Wang, Long Zhou, Chengyi Wang, Shuo Ren, Yu Wu, Shujie Liu, Tom Ko, Qing Li, Yu Zhang, Zhihua Wei, Yao Qian, Jinyu Li, Furu Wei. SpeechT5 was first released in this repository, original weights. The license used is MIT. Model Description Motivated by the success of T5 (Text-To-Text Transfer Transformer) in pre-trained natural language processing models, we propose a unified-modal SpeechT5 framework that explores the encoder-decoder pre-training for self-supervised speech/text representation learning. The SpeechT5 framework consists of a shared encoder-decoder network and six modal-specific (speech/text) pre/post-nets. After preprocessing the input speech/text through the pre-nets, the shared encoder-decoder network models the sequence-to-sequence transformation, and then the post-nets generate the output in the speech/text modality based on the output of the decoder. Leveraging large-scale unlabeled speech and text data, we pre-train SpeechT5 to learn a unified-modal representation, hoping to improve the modeling capability for both speech and text. To align the textual and speech information into this unified semantic space, we propose a cross-modal vector quantization approach that randomly mixes up speech/text states with latent units as the interface between encoder and decoder. Extensive evaluations show the superiority of the proposed SpeechT5 framework on a wide variety of spoken language processing tasks, including automatic speech recognition, speech synthesis, speech translation, voice conversion, speech enhancement, and speaker identification. Developed by: Junyi Ao, Rui Wang, Long Zhou, Chengyi Wang, Shuo Ren, Yu Wu, Shujie Liu, Tom Ko, Qing Li, Yu Zhang, Zhihua Wei, Yao Qian, Jinyu Li, Furu Wei. Shared by [optional]: Matthijs Hollemans Model type: text-to-speech Language(s) (NLP): [More Information Needed] License: MIT Finetuned from model [optional]: [More Information Needed] Model Sources [optional] Repository: [https://github.com/microsoft/SpeechT5/] Paper: [https://arxiv.org/pdf/2110.07205.pdf] Blog Post: [https://huggingface.co/blog/speecht5] Demo: [https://huggingface.co/spaces/Matthijs/speecht5-tts-demo] Uses Direct Use You can use this model for speech synthesis. See the model hub to look for fine-tuned versions on a task that interests you. Downstream Use [optional] [More Information Needed] Out-of-Scope Use [More Information Needed] Bias, Risks, and Limitations [More Information Needed] Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. How to Get Started With the Model Use the code below to convert text into a mono 16 kHz speech waveform. 
# Following pip packages need to be installed:
# !pip install git+https://github.com/huggingface/transformers sentencepiece datasets

from transformers import SpeechT5Processor, SpeechT5ForTextToSpeech, SpeechT5HifiGan
from datasets import load_dataset
import torch
import soundfile as sf

processor = SpeechT5Processor.from_pretrained("microsoft/speecht5_tts")
model = SpeechT5ForTextToSpeech.from_pretrained("microsoft/speecht5_tts")
vocoder = SpeechT5HifiGan.from_pretrained("microsoft/speecht5_hifigan")

inputs = processor(text="Hello, my dog is cute", return_tensors="pt")

# load an x-vector containing the speaker's voice characteristics from a dataset
embeddings_dataset = load_dataset("Matthijs/cmu-arctic-xvectors", split="validation")
speaker_embeddings = torch.tensor(embeddings_dataset[7306]["xvector"]).unsqueeze(0)

speech = model.generate_speech(inputs["input_ids"], speaker_embeddings, vocoder=vocoder)

sf.write("speech.wav", speech.numpy(), samplerate=16000)

Fine-tuning the Model Refer to this Colab notebook for an example of how to fine-tune SpeechT5 for TTS on a different dataset or a new language. Training Details Training Data LibriTTS Training Procedure Preprocessing [optional] Leveraging large-scale unlabeled speech and text data, we pre-train SpeechT5 to learn a unified-modal representation, hoping to improve the modeling capability for both speech and text. Training hyperparameters Precision: [More Information Needed] Regime: [More Information Needed] Speeds, Sizes, Times [optional] [More Information Needed] Evaluation Testing Data, Factors & Metrics Testing Data [More Information Needed] Factors [More Information Needed] Metrics [More Information Needed] Results [More Information Needed] Summary Model Examination [optional] Extensive evaluations show the superiority of the proposed SpeechT5 framework on a wide variety of spoken language processing tasks, including automatic speech recognition, speech synthesis, speech translation, voice conversion, speech enhancement, and speaker identification. Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). Hardware Type: [More Information Needed] Hours used: [More Information Needed] Cloud Provider: [More Information Needed] Compute Region: [More Information Needed] Carbon Emitted: [More Information Needed] Technical Specifications [optional] Model Architecture and Objective The SpeechT5 framework consists of a shared encoder-decoder network and six modal-specific (speech/text) pre/post-nets. After preprocessing the input speech/text through the pre-nets, the shared encoder-decoder network models the sequence-to-sequence transformation, and then the post-nets generate the output in the speech/text modality based on the output of the decoder.
Compute Infrastructure [More Information Needed] Hardware [More Information Needed] Software [More Information Needed] Citation [optional] BibTeX:

@inproceedings{ao-etal-2022-speecht5,
  title = {{S}peech{T}5: Unified-Modal Encoder-Decoder Pre-Training for Spoken Language Processing},
  author = {Ao, Junyi and Wang, Rui and Zhou, Long and Wang, Chengyi and Ren, Shuo and Wu, Yu and Liu, Shujie and Ko, Tom and Li, Qing and Zhang, Yu and Wei, Zhihua and Qian, Yao and Li, Jinyu and Wei, Furu},
  booktitle = {Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)},
  month = {May},
  year = {2022},
  pages = {5723--5738},
}

Glossary [optional] text-to-speech: to synthesize audio More Information [optional] [More Information Needed] Model Card Authors [optional] Disclaimer: The team releasing SpeechT5 did not write a model card for this model so this model card has been written by the Hugging Face team. Model Card Contact [More Information Needed]
https://huggingface.co/spaces/microsoft/Promptist
App Files Files Community 5
https://huggingface.co/spaces/microsoft/GODEL-Demo
Spaces microsoft / GODEL-Demo Build error App Files Files Community 2 build error Unexpected build error Build logs: Fetching error logs...
https://huggingface.co/spaces/microsoft/ChatGPT-Robotics
Spaces microsoft / ChatGPT-Robotics Build error App Files Files Community 1 build error Unexpected build error Build logs: Fetching error logs...
https://huggingface.co/spaces/microsoft/unicl-img-recog-demo
runtime error 68 [00:00<?, ?B/s] Downloading (…)okenizer_config.json: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 568/568 [00:00<00:00, 367kB/s] Downloading (…)lve/main/config.json: 0%| | 0.00/4.19k [00:00<?, ?B/s] Downloading (…)lve/main/config.json: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 4.19k/4.19k [00:00<00:00, 2.80MB/s] Creating model: swin /home/user/.local/lib/python3.8/site-packages/torch/functional.py:445: UserWarning: torch.meshgrid: in an upcoming release, it will be required to pass the indexing argument. (Triggered internally at ../aten/src/ATen/native/TensorShape.cpp:2157.) return _VF.meshgrid(tensors, **kwargs) # type: ignore[attr-defined] /home/user/.local/lib/python3.8/site-packages/gradio/deprecation.py:40: UserWarning: `optional` parameter is deprecated, and it has no effect warnings.warn(value) /home/user/.local/lib/python3.8/site-packages/gradio/deprecation.py:40: UserWarning: The 'type' parameter has been deprecated. Use the Number component instead. warnings.warn(value) /home/user/.local/lib/python3.8/site-packages/gradio/interface.py:286: UserWarning: Currently, only the 'default' theme is supported. warnings.warn("Currently, only the 'default' theme is supported.") IMPORTANT: You are using gradio version 3.0.13, however version 3.14.0 is available, please upgrade. -------- Cache at /home/user/app/gradio_cached_examples/log.csv not found. Caching now in 'gradio_cached_examples/' directory. Traceback (most recent call last): File "app.py", line 131, in <module> gr.Interface( File "/home/user/.local/lib/python3.8/site-packages/gradio/blocks.py", line 758, in launch server_port, path_to_local_server, app, server = networking.start_server( File "/home/user/.local/lib/python3.8/site-packages/gradio/networking.py", line 114, in start_server port = get_first_available_port( File "/home/user/.local/lib/python3.8/site-packages/gradio/networking.py", line 65, in get_first_available_port raise OSError( OSError: All ports from 7860 to 7861 are in use. Please close a port. Container logs:
https://huggingface.co/microsoft/prophetnet-large-uncased-squad-qg
prophetnet-large-uncased-squad-qg Fine-tuned weights (converted from the original fairseq version repo) for ProphetNet on question generation for SQuAD 1.1. ProphetNet is a new pre-trained language model for sequence-to-sequence learning with a novel self-supervised objective called future n-gram prediction. ProphetNet is able to predict more future tokens with an n-stream decoder. The original implementation is the fairseq version at the github repo. Usage

from transformers import ProphetNetTokenizer, ProphetNetForConditionalGeneration, ProphetNetConfig

model = ProphetNetForConditionalGeneration.from_pretrained('microsoft/prophetnet-large-uncased-squad-qg')
tokenizer = ProphetNetTokenizer.from_pretrained('microsoft/prophetnet-large-uncased-squad-qg')

FACT_TO_GENERATE_QUESTION_FROM = "Bill Gates [SEP] Microsoft was founded by Bill Gates and Paul Allen on April 4, 1975."

inputs = tokenizer([FACT_TO_GENERATE_QUESTION_FROM], return_tensors='pt')

# generate the question
question_ids = model.generate(inputs['input_ids'], num_beams=5, early_stopping=True)
tokenizer.batch_decode(question_ids, skip_special_tokens=True)
# should give: 'along with paul allen, who founded microsoft?'

Citation

@article{yan2020prophetnet,
  title={Prophetnet: Predicting future n-gram for sequence-to-sequence pre-training},
  author={Yan, Yu and Qi, Weizhen and Gong, Yeyun and Liu, Dayiheng and Duan, Nan and Chen, Jiusheng and Zhang, Ruofei and Zhou, Ming},
  journal={arXiv preprint arXiv:2001.04063},
  year={2020}
}
https://huggingface.co/Tanujay
Saha Tanujay Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/microsoft/xclip-base-patch16-zero-shot
X-CLIP (base-sized model) X-CLIP model (base-sized, patch resolution of 16) trained on Kinetics-400. It was introduced in the paper Expanding Language-Image Pretrained Models for General Video Recognition by Ni et al. and first released in this repository. This model was trained using 32 frames per video, at a resolution of 224x224. Disclaimer: The team releasing X-CLIP did not write a model card for this model so this model card has been written by the Hugging Face team. Model description X-CLIP is a minimal extension of CLIP for general video-language understanding. The model is trained in a contrastive way on (video, text) pairs. This allows the model to be used for tasks like zero-shot, few-shot or fully supervised video classification and video-text retrieval. Intended uses & limitations You can use the raw model for determining how well text goes with a given video. See the model hub to look for fine-tuned versions on a task that interests you. How to use For code examples, we refer to the documentation. Training data This model was trained on Kinetics 400. Preprocessing The exact details of preprocessing during training can be found here. The exact details of preprocessing during validation can be found here. During validation, one resizes the shorter edge of each frame, after which center cropping is performed to a fixed-size resolution (like 224x224). Next, frames are normalized across the RGB channels with the ImageNet mean and standard deviation. Evaluation results This model achieves a zero-shot top-1 accuracy of 44.6% on HMDB-51, 72.0% on UCF-101 and 65.2% on Kinetics-600.
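The card defers code examples to the documentation; as a quick orientation, the following is a minimal sketch of zero-shot video classification with this checkpoint. It uses 32 random frames as a stand-in for a real sampled clip (the checkpoint was trained with 32 frames at 224x224, per the card); the candidate labels are arbitrary examples, not taken from the card.

import numpy as np
import torch
from transformers import XCLIPProcessor, XCLIPModel

processor = XCLIPProcessor.from_pretrained("microsoft/xclip-base-patch16-zero-shot")
model = XCLIPModel.from_pretrained("microsoft/xclip-base-patch16-zero-shot")

# 32 random 224x224 RGB frames stand in for a real video clip
video = list(np.random.randint(0, 255, (32, 224, 224, 3), dtype=np.uint8))

inputs = processor(
    text=["playing guitar", "dancing", "cooking"],  # arbitrary candidate labels
    videos=video,
    return_tensors="pt",
    padding=True,
)

with torch.no_grad():
    outputs = model(**inputs)

# probability that each candidate label matches the video
probs = outputs.logits_per_video.softmax(dim=1)
print(probs)

In practice the random frames would be replaced by frames sampled from an actual video, e.g. decoded with PyAV or decord.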
https://huggingface.co/microsoft/cvt-13
Convolutional Vision Transformer (CvT) CvT-13 model pre-trained on ImageNet-1k at resolution 224x224. It was introduced in the paper CvT: Introducing Convolutions to Vision Transformers by Wu et al. and first released in this repository. Disclaimer: The team releasing CvT did not write a model card for this model so this model card has been written by the Hugging Face team. Usage Here is how to use this model to classify an image of the COCO 2017 dataset into one of the 1,000 ImageNet classes:

from transformers import AutoFeatureExtractor, CvtForImageClassification
from PIL import Image
import requests

url = 'http://images.cocodataset.org/val2017/000000039769.jpg'
image = Image.open(requests.get(url, stream=True).raw)

feature_extractor = AutoFeatureExtractor.from_pretrained('microsoft/cvt-13')
model = CvtForImageClassification.from_pretrained('microsoft/cvt-13')

inputs = feature_extractor(images=image, return_tensors="pt")
outputs = model(**inputs)
logits = outputs.logits

# model predicts one of the 1000 ImageNet classes
predicted_class_idx = logits.argmax(-1).item()
print("Predicted class:", model.config.id2label[predicted_class_idx])

Downloads last month: 6,991 Dataset used to train microsoft/cvt-13 Space using microsoft/cvt-13: 1
https://huggingface.co/microsoft/tapex-large-sql-execution
TAPEX (large-sized model) TAPEX was proposed in TAPEX: Table Pre-training via Learning a Neural SQL Executor by Qian Liu, Bei Chen, Jiaqi Guo, Morteza Ziyadi, Zeqi Lin, Weizhu Chen, Jian-Guang Lou. The original repo can be found here. Model description TAPEX (Table Pre-training via Execution) is a conceptually simple and empirically powerful pre-training approach to empower existing models with table reasoning skills. TAPEX realizes table pre-training by learning a neural SQL executor over a synthetic corpus, which is obtained by automatically synthesizing executable SQL queries. TAPEX is based on the BART architecture, a transformer encoder-decoder (seq2seq) model with a bidirectional (BERT-like) encoder and an autoregressive (GPT-like) decoder. Intended Uses You can use the raw model for simulating neural SQL execution, i.e., employing TAPEX to execute a SQL query on a given table. However, the model is mostly meant to be fine-tuned on a supervised dataset. Currently TAPEX can be fine-tuned to tackle table question answering tasks and table fact verification tasks. See the model hub to look for fine-tuned versions on a task that interests you. How to Use Here is how to use this model in transformers:

from transformers import TapexTokenizer, BartForConditionalGeneration
import pandas as pd

tokenizer = TapexTokenizer.from_pretrained("microsoft/tapex-large-sql-execution")
model = BartForConditionalGeneration.from_pretrained("microsoft/tapex-large-sql-execution")

data = {
    "year": [1896, 1900, 1904, 2004, 2008, 2012],
    "city": ["athens", "paris", "st. louis", "athens", "beijing", "london"]
}
table = pd.DataFrame.from_dict(data)

# tapex accepts uncased input since it is pre-trained on an uncased corpus
query = "select year where city = beijing"
encoding = tokenizer(table=table, query=query, return_tensors="pt")

outputs = model.generate(**encoding)

print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
# ['2008']

How to Fine-tune ⚠️ This model checkpoint is ONLY used for simulating neural SQL execution (i.e., employing TAPEX to execute a SQL query on a given table), and you CANNOT use this model for fine-tuning on downstream tasks. The checkpoint that can be used for fine-tuning is here. This separation into two models for the two kinds of usage is due to a known issue in BART large, and we recommend readers see this comment for more details. BibTeX entry and citation info

@inproceedings{
  liu2022tapex,
  title={{TAPEX}: Table Pre-training via Learning a Neural {SQL} Executor},
  author={Qian Liu and Bei Chen and Jiaqi Guo and Morteza Ziyadi and Zeqi Lin and Weizhu Chen and Jian-Guang Lou},
  booktitle={International Conference on Learning Representations},
  year={2022},
  url={https://openreview.net/forum?id=O50443AsCP}
}
https://huggingface.co/microsoft/phi-1
Model Summary The language model phi-1 is a Transformer with 1.3 billion parameters, specialized for basic Python coding. Its training involved a variety of data sources, including subsets of Python code from The Stack v1.2, Q&A content from StackOverflow, competition code from code_contests, and synthetic Python textbooks and exercises generated by gpt-3.5-turbo-0301. Even though the model and the datasets are relatively small compared to contemporary Large Language Models (LLMs), phi-1 has demonstrated an impressive accuracy exceeding 50% on the simple Python coding benchmark HumanEval. Intended Uses Given the nature of the training data, phi-1 is best suited for prompts using the code format:

def print_prime(n):
    """
    Print all primes between 1 and n
    """
    for num in range(2, n+1):
        for i in range(2, num):
            if num % i == 0:
                break
        else:
            print(num)

where the model generates the code after the comments. (Note: this is a legitimate and correct use of the else clause on a Python for loop.) Notes phi-1 is intended for research purposes. The model-generated code should be treated as a starting point rather than a definitive solution for potential use cases. Users should be cautious when employing this model in their applications. Direct adoption for production coding tasks is out of the scope of this research project. As a result, phi-1 has not been tested to ensure that it performs adequately for production-level code. Please refer to the limitation sections of this document for more details. Limitations of phi-1 Limited Scope: 99.8% of the Python scripts in our fine-tuning dataset use only the packages "typing, math, random, collections, datetime, itertools". If the model generates Python scripts that utilize other packages, we strongly recommend users manually verify all API uses. Replicate Scripts Online: As our model is trained on Python scripts found online, there is a small chance it may replicate such scripts, especially if they appear repetitively across different online sources. Generate Inaccurate Code: The model frequently generates incorrect code. We suggest that users view these outputs as a source of inspiration rather than definitive solutions. Unreliable Responses to Alternate Formats: Despite appearing to comprehend instructions in formats like Q&A or chat, our models often respond with inaccurate answers, even when seeming confident. Their capabilities with non-code formats are significantly more limited. Limitations on Natural Language Comprehension: As a coding bot, phi-1's main focus is to help with coding-related questions. While it may have some natural language comprehension capabilities, its primary function is not to engage in general conversations or demonstrate common sense like a general AI assistant. Its strength lies in providing assistance and guidance in the context of programming and software development. Potential Biases: phi-1, like other AI models, is trained on web and synthetic data. This data can contain biases and errors that might affect the AI's performance. Biases could stem from various sources like unbalanced representation, stereotypes, or controversial opinions present in the training data. As a result, the model might sometimes generate responses that reflect these biases or errors. Warning about Security Risks When leveraging phi-1, it's paramount to be vigilant. The model, though powerful, can inadvertently introduce security vulnerabilities in the generated code.
Examples include, but are not limited to: Directory Traversal: The code might fail to implement safe checks against directory traversal attacks, potentially allowing unauthorized access to sensitive files on your system. Injection Attacks: There could be lapses in escaping strings properly, making the application susceptible to SQL, OS command, or other injection attacks. Misunderstanding Requirements: The model might sometimes misunderstand or oversimplify user requirements, leading to incomplete or insecure solutions. Lack of Input Validation: In some cases, the model might neglect to incorporate input validation or sanitize user inputs, opening doors to attacks like Cross-Site Scripting (XSS). Insecure Defaults: The model might recommend or generate code with insecure default settings, such as weak password requirements or unencrypted data transmissions. Failure in Error Handling: Improper error handling can inadvertently reveal sensitive information about the system or the application's internal workings. Given these potential pitfalls, and others not explicitly mentioned, it's essential to thoroughly review, test, and verify the generated code before deploying it in any application, especially those that are security-sensitive. Always consult with security experts or perform rigorous penetration testing when in doubt. Training Model Architecture: a Transformer-based model with next-word prediction objective Training tokens: 54B tokens (7B unique tokens) Precision: fp16 GPUs: 8 A100 Training time: 6 days Software PyTorch DeepSpeed flash-attention License The model is licensed under the Research License. Sample Code

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

torch.set_default_device("cuda")

model = AutoModelForCausalLM.from_pretrained("microsoft/phi-1", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("microsoft/phi-1", trust_remote_code=True)

inputs = tokenizer('''def print_prime(n):
    """
    Print all primes between 1 and n
    """''', return_tensors="pt", return_attention_mask=False)

outputs = model.generate(**inputs, max_length=200)
text = tokenizer.batch_decode(outputs)[0]
print(text)

If you need to use the model in a lower precision (e.g., FP16), please wrap the model's forward pass with torch.autocast(), as follows:

with torch.autocast(model.device.type, dtype=torch.float16, enabled=True):
    outputs = model.generate(**inputs, max_length=200)

Remark. In the generation function, our model currently does not support beam search (num_beams > 1). Furthermore, in the forward pass of the model, we currently do not support attention masks during training, outputting hidden states or attention values, or using custom input embeddings (instead of the model's). Citation

@article{gunasekar2023textbooks,
  title={Textbooks Are All You Need},
  author={Gunasekar, Suriya and Zhang, Yi and Aneja, Jyoti and Mendes, Caio C{\'e}sar Teodoro and Del Giorno, Allie and Gopi, Sivakanth and Javaheripi, Mojan and Kauffmann, Piero and de Rosa, Gustavo and Saarikivi, Olli and others},
  journal={arXiv preprint arXiv:2306.11644},
  year={2023}
}

Downloads last month: 7,067 Inference API does not yet support model repos that contain custom code. Spaces using microsoft/phi-1: 2
https://huggingface.co/spaces/microsoft/visual_chatgpt
runtime error 403, in __get_result raise self._exception File "/home/user/.pyenv/versions/3.10.12/lib/python3.10/concurrent/futures/thread.py", line 58, in run result = self.fn(*self.args, **self.kwargs) File "/home/user/.pyenv/versions/3.10.12/lib/python3.10/site-packages/huggingface_hub/_snapshot_download.py", line 211, in _inner_hf_hub_download return hf_hub_download( File "/home/user/.pyenv/versions/3.10.12/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn return fn(*args, **kwargs) File "/home/user/.pyenv/versions/3.10.12/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1364, in hf_hub_download http_get( File "/home/user/.pyenv/versions/3.10.12/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 505, in http_get r = _request_wrapper( File "/home/user/.pyenv/versions/3.10.12/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 442, in _request_wrapper return http_backoff( File "/home/user/.pyenv/versions/3.10.12/lib/python3.10/site-packages/huggingface_hub/utils/_http.py", line 258, in http_backoff response = session.request(method=method, url=url, **kwargs) File "/home/user/.pyenv/versions/3.10.12/lib/python3.10/site-packages/requests/sessions.py", line 589, in request resp = self.send(prep, **send_kwargs) File "/home/user/.pyenv/versions/3.10.12/lib/python3.10/site-packages/requests/sessions.py", line 703, in send r = adapter.send(request, **kwargs) File "/home/user/.pyenv/versions/3.10.12/lib/python3.10/site-packages/huggingface_hub/utils/_http.py", line 63, in send return super().send(request, *args, **kwargs) File "/home/user/.pyenv/versions/3.10.12/lib/python3.10/site-packages/requests/adapters.py", line 501, in send raise ConnectionError(err, request=request) requests.exceptions.ConnectionError: (ProtocolError('Connection aborted.', RemoteDisconnected('Remote end closed connection without response')), '(Request ID: 46dd140a-fd1d-4946-a500-9f814f8e6ff5)') Container logs:
https://huggingface.co/microsoft/git-base-vatex
GIT (GenerativeImage2Text), base-sized, fine-tuned on VATEX GIT (short for GenerativeImage2Text) model, base-sized version, fine-tuned on VATEX. It was introduced in the paper GIT: A Generative Image-to-text Transformer for Vision and Language by Wang et al. and first released in this repository. Disclaimer: The team releasing GIT did not write a model card for this model so this model card has been written by the Hugging Face team. Model description GIT is a Transformer decoder conditioned on both CLIP image tokens and text tokens. The model is trained using "teacher forcing" on many (image, text) pairs. The goal for the model is simply to predict the next text token, given the image tokens and previous text tokens. The model has full access to (i.e. a bidirectional attention mask is used for) the image patch tokens, but only has access to the previous text tokens (i.e. a causal attention mask is used for the text tokens) when predicting the next text token. This allows the model to be used for tasks like: image and video captioning visual question answering (VQA) on images and videos even image classification (by simply conditioning the model on the image and asking it to generate a class for it in text). Intended uses & limitations You can use the raw model for video captioning. See the model hub to look for fine-tuned versions on a task that interests you. How to use For code examples, we refer to the documentation; a minimal sketch is given below. Training data From the paper: We collect 0.8B image-text pairs for pre-training, which include COCO (Lin et al., 2014), Conceptual Captions (CC3M) (Sharma et al., 2018), SBU (Ordonez et al., 2011), Visual Genome (VG) (Krishna et al., 2016), Conceptual Captions (CC12M) (Changpinyo et al., 2021), ALT200M (Hu et al., 2021a), and an extra 0.6B data following a similar collection procedure in Hu et al. (2021a). => however this is for the model referred to as "GIT" in the paper, which is not open-sourced. This checkpoint is "GIT-base", which is a smaller variant of GIT trained on 10 million image-text pairs. Next, the model was fine-tuned on VATEX. See table 11 in the paper for more details. Preprocessing We refer to the original repo regarding details for preprocessing during training. During validation, one resizes the shorter edge of each image, after which center cropping is performed to a fixed-size resolution. Next, frames are normalized across the RGB channels with the ImageNet mean and standard deviation. Evaluation results For evaluation results, we refer readers to the paper.
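The sketch below illustrates video captioning with this checkpoint; the official documentation remains the canonical reference. Random frames stand in for frames sampled from a real video, and the choice of 6 frames is an assumption for illustration, not a value stated on the card.

import numpy as np
import torch
from transformers import AutoProcessor, AutoModelForCausalLM

processor = AutoProcessor.from_pretrained("microsoft/git-base-vatex")
model = AutoModelForCausalLM.from_pretrained("microsoft/git-base-vatex")

# 6 random 224x224 RGB frames stand in for frames sampled from a real video
frames = list(np.random.randint(0, 255, (6, 224, 224, 3), dtype=np.uint8))

# GIT takes video inputs of shape (batch_size, num_frames, channels, height, width)
pixel_values = processor(images=frames, return_tensors="pt").pixel_values.unsqueeze(0)

with torch.no_grad():
    generated_ids = model.generate(pixel_values=pixel_values, max_length=50)

print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])

For real use, the frames would be decoded and evenly sampled from a video file (e.g. with PyAV or decord) before being passed to the processor.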
https://huggingface.co/microsoft/mpnet-base
No model card. Downloads last month: 139,956 Safetensors Model size: 133M params Tensor type: F32 Hosted inference API: Fill-Mask Spaces using microsoft/mpnet-base: 3
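Since the repo ships no model card, the following is a minimal, hedged sketch of querying the checkpoint through the transformers fill-mask pipeline. The example sentence is arbitrary, and because this is the pretrained base model (not fine-tuned for any downstream task), the predictions are illustrative only.

from transformers import pipeline

# read the mask token from the tokenizer rather than hard-coding it
fill_mask = pipeline("fill-mask", model="microsoft/mpnet-base")
masked_sentence = f"Paris is the capital of {fill_mask.tokenizer.mask_token}."

for prediction in fill_mask(masked_sentence):
    print(prediction["token_str"], prediction["score"])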
https://huggingface.co/microsoft/swin-tiny-patch4-window7-224
Swin Transformer (tiny-sized model) Swin Transformer model trained on ImageNet-1k at resolution 224x224. It was introduced in the paper Swin Transformer: Hierarchical Vision Transformer using Shifted Windows by Liu et al. and first released in this repository. Disclaimer: The team releasing Swin Transformer did not write a model card for this model so this model card has been written by the Hugging Face team. Model description The Swin Transformer is a type of Vision Transformer. It builds hierarchical feature maps by merging image patches (shown in gray) in deeper layers and has linear computational complexity with respect to input image size because self-attention is computed only within each local window (shown in red). It can thus serve as a general-purpose backbone for both image classification and dense recognition tasks. In contrast, previous vision Transformers produce feature maps of a single low resolution and have quadratic computational complexity with respect to input image size because self-attention is computed globally. Source Intended uses & limitations You can use the raw model for image classification. See the model hub to look for fine-tuned versions on a task that interests you. How to use Here is how to use this model to classify an image of the COCO 2017 dataset into one of the 1,000 ImageNet classes:

from transformers import AutoImageProcessor, AutoModelForImageClassification
from PIL import Image
import requests

url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

processor = AutoImageProcessor.from_pretrained("microsoft/swin-tiny-patch4-window7-224")
model = AutoModelForImageClassification.from_pretrained("microsoft/swin-tiny-patch4-window7-224")

inputs = processor(images=image, return_tensors="pt")
outputs = model(**inputs)
logits = outputs.logits

# model predicts one of the 1000 ImageNet classes
predicted_class_idx = logits.argmax(-1).item()
print("Predicted class:", model.config.id2label[predicted_class_idx])

For more code examples, we refer to the documentation. BibTeX entry and citation info

@article{DBLP:journals/corr/abs-2103-14030,
  author     = {Ze Liu and Yutong Lin and Yue Cao and Han Hu and Yixuan Wei and Zheng Zhang and Stephen Lin and Baining Guo},
  title      = {Swin Transformer: Hierarchical Vision Transformer using Shifted Windows},
  journal    = {CoRR},
  volume     = {abs/2103.14030},
  year       = {2021},
  url        = {https://arxiv.org/abs/2103.14030},
  eprinttype = {arXiv},
  eprint     = {2103.14030},
  timestamp  = {Thu, 08 Apr 2021 07:53:26 +0200},
  biburl     = {https://dblp.org/rec/journals/corr/abs-2103-14030.bib},
  bibsource  = {dblp computer science bibliography, https://dblp.org}
}
https://huggingface.co/chaitanya-ms
Chaitanya Gupta chaitanya-ms Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/owendeng
Deng Deng owendeng Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/funmikesa
Oluwafunmilola Kesa funmikesa funmikesa funmikesa Research interests Computer Vision, NLP Organizations models None public yet datasets None public yet
https://huggingface.co/datasets/microsoft/LCC_python
Subset Split gt string context string """ Model definition functions and weight loading. """ from __future__ import print_function, division from keras.models import Model, Sequential from keras.layers.merge import concatenate from keras.layers import Input, Bidirectional, Embedding, Dense, Dropout, SpatialDropout1D, LSTM, Activation from keras.regularizers import L1L2 from attlayer import AttentionWeightedAverage from global_variables import NB_TOKENS, NB_EMOJI_CLASSES import numpy as np from copy import deepcopy from os.path import exists import h5py def deepmoji_feature_encoding(maxlen, weight_path, return_attention=False): """ Loads the pretrained DeepMoji model for extracting features from the penultimate feature layer. In this way, it transforms the text into its emotional encoding. # Arguments: maxlen: Maximum length of a sentence (given in tokens). weight_path: Path to model weights to be loaded. return_attention: If true, output will be weight of each input token used for the prediction # Returns: Pretrained model for encoding text into feature vectors. """ model = deepmoji_architecture(nb_classes=None, nb_tokens=NB_TOKENS, maxlen=maxlen, feature_output=True, return_attention=return_attention) load_specific_weights(model, weight_path, exclude_names=['softmax']) return model def deepmoji_emojis(maxlen, weight_path, return_attention=False): """ Loads the pretrained DeepMoji model for extracting features from the penultimate feature layer. In this way, it transforms the text into its emotional encoding. # Arguments: maxlen: Maximum length of a sentence (given in tokens). weight_path: Path to model weights to be loaded. return_attention: If true, output will be weight of each input token used for the prediction # Returns: Pretrained model for encoding text into feature vectors. """ model = deepmoji_architecture(nb_classes=NB_EMOJI_CLASSES, nb_tokens=NB_TOKENS, maxlen=maxlen, return_attention=return_attention) model.load_weights(weight_path, by_name=False) return model def deepmoji_transfer(nb_classes, maxlen, weight_path=None, extend_embedding=0, embed_dropout_rate=0.25, final_dropout_rate=0.5, embed_l2=1E-6): """ Loads the pretrained DeepMoji model for finetuning/transfer learning. Does not load weights for the softmax layer. Note that if you are planning to use class average F1 for evaluation, nb_classes should be set to 2 instead of the actual number of classes in the dataset, since binary classification will be performed on each class individually. Note that for the 'new' method, weight_path should be left as None. # Arguments: nb_classes: Number of classes in the dataset. maxlen: Maximum length of a sentence (given in tokens). weight_path: Path to model weights to be loaded. extend_embedding: Number of tokens that have been added to the vocabulary on top of NB_TOKENS. If this number is larger than 0, the embedding layer's dimensions are adjusted accordingly, with the additional weights being set to random values. embed_dropout_rate: Dropout rate for the embedding layer. final_dropout_rate: Dropout rate for the final Softmax layer. embed_l2: L2 regularization for the embedding layerl. # Returns: Model with the given parameters. 
""" model = deepmoji_architecture(nb_classes=nb_classes, nb_tokens=NB_TOKENS + extend_embedding, maxlen=maxlen, embed_dropout_rate=embed_dropout_rate, final_dropout_rate=final_dropout_rate, embed_l2=embed_l2) if weight_path is not None: load_specific_weights(model, weight_path, exclude_names=['softmax'], extend_embedding=extend_embedding) return model def deepmoji_architecture(nb_classes, nb_tokens, maxlen, feature_output=False, embed_dropout_rate=0, final_dropout_rate=0, embed_l2=1E-6, return_attention=False): """ Returns the DeepMoji architecture uninitialized and without using the pretrained model weights. # Arguments: nb_classes: Number of classes in the dataset. nb_tokens: Number of tokens in the dataset (i.e. vocabulary size). maxlen: Maximum length of a token. feature_output: If True the model returns the penultimate feature vector rather than Softmax probabilities (defaults to False). embed_dropout_rate: Dropout rate for the embedding layer. final_dropout_rate: Dropout rate for the final Softmax layer. embed_l2: L2 regularization for the embedding layerl. # Returns: Model with the given parameters. """ # define embedding layer that turns word tokens into vectors # an activation function is used to bound the values of the embedding model_input = Input(shape=(maxlen,), dtype='int32') embed_reg = L1L2(l2=embed_l2) if embed_l2 != 0 else None embed = Embedding(input_dim=nb_tokens, output_dim=256, mask_zero=True, input_length=maxlen, embeddings_regularizer=embed_reg, name='embedding') x = embed(model_input) x = Activation('tanh')(x) # entire embedding channels are dropped out instead of the # normal Keras embedding dropout, which drops all channels for entire words # many of the datasets contain so few words that losing one or more words can alter the emotions completely if embed_dropout_rate != 0: embed_drop = SpatialDropout1D(embed_dropout_rate, name='embed_drop') x = embed_drop(x) # skip-connection from embedding to output eases gradient-flow and allows access to lower-level features # ordering of the way the merge is done is important for consistency with the pretrained model lstm_0_output = Bidirectional(LSTM(512, return_sequences=True), name="bi_lstm_0")(x) lstm_1_output = Bidirectional(LSTM(512, return_sequences=True), name="bi_lstm_1")(lstm_0_output) x = concatenate([lstm_1_output, lstm_0_output, x]) # if return_attention is True in AttentionWeightedAverage, an additional tensor # representing the weight at each timestep is returned weights = None x = AttentionWeightedAverage(name='attlayer', return_attention=return_attention)(x) if return_attention: x, weights = x if not feature_output: # output class probabilities if final_dropout_rate != 0: x = Dropout(final_dropout_rate)(x) if nb_classes > 2: outputs = [Dense(nb_classes, activation='softmax', name='softmax')(x)] else: outputs = [Dense(1, activation='sigmoid', name='softmax')(x)] else: # output penultimate feature vector outputs = [x] if return_attention: # add the attention weights to the outputs if required outputs.append(weights) return Model(inputs=[model_input], outputs=outputs, name="DeepMoji") def load_specific_weights(model, weight_path, exclude_names=[], extend_embedding=0, verbose=True): """ Loads model weights from the given file path, excluding any given layers. # Arguments: model: Model whose weights should be loaded. weight_path: Path to file containing model weights. exclude_names: List of layer names whose weights should not be loaded. extend_embedding: Number of new words being added to vocabulary. 
verbose: Verbosity flag. # Raises: ValueError if the file at weight_path does not exist. """ if not exists(weight_path): raise ValueError('ERROR (load_weights): The weights file at {} does ' 'not exist. Refer to the README for instructions.' .format(weight_path)) if extend_embedding and 'embedding' in exclude_names: raise ValueError('ERROR (load_weights): Cannot extend a vocabulary ' 'without loading the embedding weights.') # Copy only weights from the temporary model that are wanted # for the specific task (e.g. the Softmax is often ignored) layer_weights = get_weights_from_hdf5(weight_path) for i, w in enumerate(layer_weights): l_name = w[0] weight_names = w[1] weight_values = w[2] if l_name in exclude_names: if verbose: print('Ignoring weights for {}'.format(l_name)) continue try: model_l = model.get_layer(name=l_name) except ValueError: raise ValueError("Weights had layer {},".format(l_name) + " but could not find this layer in model.") if verbose: print('Loading weights for {}'.format(l_name)) # extend embedding layer to allow new randomly initialized words # if requested. Otherwise, just load the weights for the layer. if type(model_l) is Embedding and extend_embedding > 0: comb_weights = append_to_embedding(weight_values, model_l.get_weights()) model_l.set_weights(comb_weights) if verbose: print('Extended vocabulary for embedding layer ' + 'from {} to {} tokens.'.format( NB_TOKENS, NB_TOKENS + extend_embedding)) else: model_l.set_weights(weight_values) def append_to_embedding(pretrain_weights, random_init_weights): """ Uses pretrained weights for the tokens already in the vocabulary. Remaining weights will be left with the random initialization. """ pretrain_weights = deepcopy(pretrain_weights) if type(pretrain_weights) == list: pretrain_weights = pretrain_weights[0] if type(random_init_weights) == list: random_init_weights = random_init_weights[0] nb_old_tokens = np.shape(pretrain_weights)[0] random_init_weights[:nb_old_tokens] = pretrain_weights # must be returned as a list to be properly inserted into Keras model return [random_init_weights] def get_weights_from_hdf5(filepath): """ Loads the weights from a saved Keras model into numpy arrays. The weights are saved using Keras 2.0 so we don't need all the conversion functionality for handling old weights. 
""" with h5py.File(filepath, mode='r') as f: layer_names = [n.decode('utf8') for n in f.attrs['layer_names']] layer_weights = [] for k, l_name in enumerate(layer_names): g = f[l_name] weight_names = [n.decode('utf8') for n in g.attrs['weight_names']] weight_values = [g[weight_name][:] for weight_name in weight_names] if len(weight_values): layer_weights.append([l_name, weight_names, weight_values]) return layer_weights # # This is Seisflows # # See LICENCE file # ############################################################################### # Import system modules import sys # Import Numpy and Obspy import numpy as np import obspy # Local imports from seisflows.tools import msg, unix from seisflows.tools.tools import exists, getset from seisflows.config import ParameterError from seisflows.plugins import adjoint, misfit, readers, writers from seisflows.tools import signal PAR = sys.modules['seisflows_parameters'] PATH = sys.modules['seisflows_paths'] class base(object): """ Data preprocessing class Provides data processing functions for seismic traces, with options for data misfit, filtering, normalization and muting """ def check(self): """ Checks parameters and paths """ # used for inversion if 'MISFIT' not in PAR: setattr(PAR, 'MISFIT', None) # used for migration if 'BACKPROJECT' not in PAR: setattr(PAR, 'BACKPROJECT', None) # data file format if 'FORMAT' not in PAR: raise ParameterError(PAR, 'FORMAT') # data normalization option if 'NORMALIZE' not in PAR: setattr(PAR, 'NORMALIZE', None) # data muting option if 'MUTE' not in PAR: setattr(PAR, 'MUTE', None) # data filtering option if 'FILTER' not in PAR: setattr(PAR, 'FILTER', None) # assertions if PAR.FORMAT not in dir(readers): print msg.ReaderError raise ParameterError() if PAR.FORMAT not in dir(writers): print msg.WriterError raise ParameterError() self.check_filter() self.check_mute() self.check_normalize() def setup(self): """ Sets up data preprocessing machinery """ # define misfit function and adjoint trace generator if PAR.MISFIT: self.misfit = getattr(misfit, PAR.MISFIT) self.adjoint = getattr(adjoint, PAR.MISFIT) elif PAR.BACKPROJECT: self.adjoint = getattr(adjoint, PAR.BACKPROJECT) # define seismic data reader and writer self.reader = getattr(readers, PAR.FORMAT) self.writer = getattr(writers, PAR.FORMAT) def prepare_eval_grad(self, path='.'): """ Prepares solver for gradient evaluation by writing residuals and adjoint traces :input path: directory containing observed and synthetic seismic data """ solver = sys.modules['seisflows_solver'] for filename in solver.data_filenames: obs = self.reader(path+'/'+'traces/obs', filename) syn = self.reader(path+'/'+'traces/syn', filename) # process observations obs = self.apply_filter(obs) obs = self.apply_mute(obs) obs = self.apply_normalize(obs) # process synthetics syn = self.apply_filter(syn) syn = self.apply_mute(syn) syn = self.apply_normalize(syn) if PAR.MISFIT: self.write_residuals(path, syn, obs) self.write_adjoint_traces(path+'/'+'traces/adj', syn, obs, filename) def write_residuals(self, path, syn, obs): """ Computes residuals :input path: location "adjoint traces" will be written :input syn: obspy Stream object containing synthetic data :input obs: obspy Stream object containing observed data """ nt, dt, _ = self.get_time_scheme(syn) nn, _ = self.get_network_size(syn) residuals = [] for ii in range(nn): residuals.append(self.misfit(syn[ii].data, obs[ii].data, nt, dt)) filename = path+'/'+'residuals' if exists(filename): residuals.extend(list(np.loadtxt(filename))) 
np.savetxt(filename, residuals) def sum_residuals(self, files): """ Sums squares of residuals :input files: list of single-column text files containing residuals :output total_misfit: sum of squares of residuals """ total_misfit = 0. for filename in files: total_misfit += np.sum(np.loadtxt(filename)**2.) return total_misfit def write_adjoint_traces(self, path, syn, obs, channel): """ Writes "adjoint traces" required for gradient computation :input path: location "adjoint traces" will be written :input syn: obspy Stream object containing synthetic data :input obs: obspy Stream object containing observed data :input channel: channel or component code used by writer """ nt, dt, _ = self.get_time_scheme(syn) nn, _ = self.get_network_size(syn) adj = syn for ii in range(nn): adj[ii].data = self.adjoint(syn[ii].data, obs[ii].data, nt, dt) self.writer(adj, path, channel) # Signal processing def apply_filter(self, traces): if not PAR.FILTER: return traces elif PAR.FILTER == 'Bandpass': for tr in traces: tr.detrend('demean') tr.detrend('linear') tr.taper(0.05, type='hann') tr.filter('bandpass', zerophase=True, freqmin=PAR.FREQMIN, freqmax=PAR.FREQMAX) elif PAR.FILTER == 'Lowpass': for tr in traces: tr.detrend('demean') tr.detrend('linear') tr.taper(0.05, type='hann') tr.filter('lowpass', zerophase=True, freq=PAR.FREQ) elif PAR.FILTER == 'Highpass': for tr in traces: tr.detrend('demean') tr.detrend('linear') tr.taper(0.05, type='hann') tr.filter('highpass', zerophase=True, freq=PAR.FREQ) else: raise ParameterError() return traces def apply_mute(self, traces): if not PAR.MUTE: return traces if 'MuteEarlyArrivals' in PAR.MUTE: traces = signal.mute_early_arrivals(traces, PAR.MUTE_EARLY_ARRIVALS_SLOPE, # (units: time/distance) PAR.MUTE_EARLY_ARRIVALS_CONST, # (units: time) self.get_time_scheme(traces), self.get_source_coords(traces), self.get_receiver_coords(traces)) if 'MuteLateArrivals' in PAR.MUTE: traces = signal.mute_late_arrivals(traces, PAR.MUTE_LATE_ARRIVALS_SLOPE, # (units: time/distance) PAR.MUTE_LATE_ARRIVALS_CONST, # (units: time) self.get_time_scheme(traces), self.get_source_coords(traces), self.get_receiver_coords(traces)) if 'MuteShortOffsets' in PAR.MUTE: traces = signal.mute_short_offsets(traces, PAR.MUTE_SHORT_OFFSETS_DIST, self.get_source_coords(traces), self.get_receiver_coords(traces)) if 'MuteLongOffsets' in PAR.MUTE: traces = signal.mute_long_offsets(traces, PAR.MUTE_LONG_OFFSETS_DIST, self.get_source_coords(traces), self.get_receiver_coords(traces)) return traces def apply_normalize(self, traces): if not PAR.NORMALIZE: return traces if 'NormalizeEventsL1' in PAR.NORMALIZE: # normalize event by L1 norm of all traces w = 0. for tr in traces: w += np.linalg.norm(tr.data, ord=1) for tr in traces: tr.data /= w elif 'NormalizeEventsL2' in PAR.NORMALIZE: # normalize event by L2 norm of all traces w = 0. 
for tr in traces: w += np.linalg.norm(tr.data, ord=2) for tr in traces: tr.data /= w if 'NormalizeTracesL1' in PAR.NORMALIZE: # normalize each trace by its L1 norm for tr in traces: w = np.linalg.norm(tr.data, ord=1) if w > 0: tr.data /= w elif 'NormalizeTracesL2' in PAR.NORMALIZE: # normalize each trace by its L2 norm for tr in traces: w = np.linalg.norm(tr.data, ord=2) if w > 0: tr.data /= w return traces def apply_filter_backwards(self, traces): for tr in traces: tr.data = np.flip(tr.data) traces = self.apply_filter() for tr in traces: tr.data = np.flip(tr.data) return traces # Additional parameter checking def check_filter(self): """ Checks filter settings """ assert getset(PAR.FILTER) < set([ 'Bandpass', 'Lowpass', 'Highpass']) if PAR.FILTER == 'Bandpass': if 'FREQMIN' not in PAR: raise ParameterError('FREQMIN') if 'FREQMAX' not in PAR: raise ParameterError('FREQMAX') assert 0 < PAR.FREQMIN assert PAR.FREQMIN < PAR.FREQMAX assert PAR.FREQMAX < np.inf elif PAR.FILTER == 'Lowpass': raise NotImplementedError if 'FREQ' not in PAR: raise ParameterError('FREQ') assert 0 < PAR.FREQ <= np.inf elif PAR.FILTER == 'Highpass': raise NotImplementedError if 'FREQ' not in PAR: raise ParameterError('FREQ') assert 0 <= PAR.FREQ < np.inf def check_mute(self): """ Checks mute settings """ if not PAR.MUTE: return assert getset(PAR.MUTE) <= set([ 'MuteEarlyArrivals', 'MuteLateArrivals', 'MuteShortOffsets', 'MuteLongOffsets']) if 'MuteEarlyArrivals' in PAR.MUTE: assert 'MUTE_EARLY_ARRIVALS_SLOPE' in PAR assert 'MUTE_EARLY_ARRIVALS_CONST' in PAR assert PAR.MUTE_EARLY_ARRIVALS_SLOPE >= 0. if 'MuteLateArrivals' in PAR.MUTE: assert 'MUTE_LATE_ARRIVALS_SLOPE' in PAR assert 'MUTE_LATE_ARRIVALS_CONST' in PAR assert PAR.MUTE_LATE_ARRIVALS_SLOPE >= 0. if 'MuteShortOffsets' in PAR.MUTE: assert 'MUTE_SHORT_OFFSETS_DIST' in PAR assert 0 < PAR.MUTE_SHORT_OFFSETS_DIST if 'MuteLongOffsets' in PAR.MUTE: assert 'MUTE_LONG_OFFSETS_DIST' in PAR assert 0 < PAR.MUTE_LONG_OFFSETS_DIST if 'MuteShortOffsets' not in PAR.MUTE: setattr(PAR, 'MUTE_SHORT_OFFSETS_DIST', 0.) if 'MuteLongOffsets' not in PAR.MUTE: setattr(PAR, 'MUTE_LONG_OFFSETS_DIST', 0.) def check_normalize(self): assert getset(PAR.NORMALIZE) < set([ 'NormalizeTracesL1', 'NormalizeTracesL2', 'NormalizeEventsL1', 'NormalizeEventsL2']) # Utility functions def get_time_scheme(self, traces): """ FIXME: extract time scheme from trace headers rather than parameters file. Note from Alexis Bottero : it is actually better like this in my opinion because this allows for longer traces to be processed. Indeed, in su format only 2 bytes are dedicated to the number of samples which is supposed to be stored as an unsigned int. The maximum NT which can be stored in the header is then 32762 whereas there is no limit in principle. """ nt = PAR.NT dt = PAR.DT t0 = 0. return nt, dt, t0 def get_network_size(self, traces): nrec = len(traces) nsrc = 1 return nrec, nsrc def get_receiver_coords(self, traces): if PAR.FORMAT in ['SU', 'su']: rx = [] ry = [] rz = [] for trace in traces: rx += [trace.stats.su.trace_header.group_coordinate_x] ry += [trace.stats.su.trace_header.group_coordinate_y] rz += [0.] return rx, ry, rz else: raise NotImplementedError def get_source_coords(self, traces): if PAR.FORMAT in ['SU', 'su']: sx = [] sy = [] sz = [] for trace in traces: sx += [trace.stats.su.trace_header.source_coordinate_x] sy += [trace.stats.su.trace_header.source_coordinate_y] sz += [0.] 
return sx, sy, sz else: raise NotImplementedError from ..titanic import digital from ..titanic import gmpmath from ..titanic.ops import OP class MPNum(digital.Digital): # must be implemented in subclasses @classmethod def _select_context(cls, *args, ctx=None): raise ValueError('virtual method: unimplemented') @classmethod def _round_to_context(cls, unrounded, ctx=None, strict=False): raise ValueError('virtual method: unimplemented') # most operations def add(self, other, ctx=None): ctx = self._select_context(self, other, ctx=ctx) result = gmpmath.compute(OP.add, self, other, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def sub(self, other, ctx=None): ctx = self._select_context(self, other, ctx=ctx) result = gmpmath.compute(OP.sub, self, other, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def mul(self, other, ctx=None): ctx = self._select_context(self, other, ctx=ctx) result = gmpmath.compute(OP.mul, self, other, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def div(self, other, ctx=None): ctx = self._select_context(self, other, ctx=ctx) result = gmpmath.compute(OP.div, self, other, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def sqrt(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.sqrt, self, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def fma(self, other1, other2, ctx=None): ctx = self._select_context(self, other1, other2, ctx=ctx) result = gmpmath.compute(OP.fma, self, other1, other2, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def neg(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.neg, self, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def copysign(self, other, ctx=None): ctx = self._select_context(self, other, ctx=ctx) result = gmpmath.compute(OP.copysign, self, other, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def fabs(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.fabs, self, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def fdim(self, other, ctx=None): # emulated ctx = self._select_context(self, other, ctx=ctx) result = gmpmath.compute(OP.sub, self, other, prec=ctx.p) zero = digital.Digital(negative=False, c=0, exp=0) if result < zero: return type(self)(negative=False, c=0, exp=0, inexact=False, rc=0) else: # never return negative zero rounded = self._round_to_context(result, ctx=ctx, strict=True) return type(self)(rounded, negative=False) def fmax(self, other, ctx=None): # emulated ctx = self._select_context(self, other, ctx=ctx) if self.isnan: return self._round_to_context(other, ctx=ctx, strict=False) elif other.isnan: return self._round_to_context(self, ctx=ctx, strict=False) else: return self._round_to_context(max(self, other), ctx=ctx, strict=False) def fmin(self, other, ctx=None): # emulated ctx = self._select_context(self, other, ctx=ctx) if self.isnan: return self._round_to_context(other, ctx=ctx, strict=False) elif other.isnan: return self._round_to_context(self, ctx=ctx, strict=False) else: return self._round_to_context(min(self, other), ctx=ctx, strict=False) def fmod(self, other, ctx=None): ctx = self._select_context(self, other, ctx=ctx) result = gmpmath.compute(OP.fmod, self, other, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def remainder(self, other, ctx=None): ctx = 
self._select_context(self, other, ctx=ctx) result = gmpmath.compute(OP.remainder, self, other, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def ceil(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.ceil, self, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def floor(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.floor, self, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def nearbyint(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.nearbyint, self, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def round(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.round, self, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def trunc(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.trunc, self, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def acos(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.acos, self, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def acosh(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.acosh, self, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def asin(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.asin, self, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def asinh(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.asinh, self, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def atan(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.atan, self, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def atan2(self, other, ctx=None): ctx = self._select_context(self, other, ctx=ctx) result = gmpmath.compute(OP.atan2, self, other, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def atanh(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.atanh, self, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def cos(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.cos, self, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def cosh(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.cosh, self, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def sin(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.sin, self, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def sinh(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.sinh, self, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def tan(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.tan, self, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def tanh(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.tanh, self, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def exp_(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.exp, self, 
prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def exp2(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.exp2, self, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def expm1(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.expm1, self, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def log(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.log, self, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def log10(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.log10, self, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def log1p(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.log1p, self, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def log2(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.log2, self, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def cbrt(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.cbrt, self, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def hypot(self, other, ctx=None): ctx = self._select_context(self, other, ctx=ctx) result = gmpmath.compute(OP.hypot, self, other, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def pow(self, other, ctx=None): ctx = self._select_context(self, other, ctx=ctx) if other.is_zero(): # avoid possibly passing nan to gmpmath.compute return type(self)(negative=False, c=1, exp=0, inexact=False, rc=0) result = gmpmath.compute(OP.pow, self, other, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def erf(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.erf, self, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def erfc(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.erfc, self, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def lgamma(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.lgamma, self, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def tgamma(self, ctx=None): ctx = self._select_context(self, ctx=ctx) result = gmpmath.compute(OP.tgamma, self, prec=ctx.p) return self._round_to_context(result, ctx=ctx, strict=True) def isfinite(self): return not (self.isinf or self.isnan) # isinf and isnan are properties # isnormal is implementation specific - override if necessary def isnormal(self): return not ( self.is_zero() or self.isinf or self.isnan ) def signbit(self): return self.negative import logging from django.core.urlresolvers import reverse from django.db.models import Q from django.utils.encoding import smart_unicode from restlib2.http import codes from restlib2.params import StrParam, IntParam, BoolParam from modeltree.tree import MODELTREE_DEFAULT_ALIAS, trees from avocado.events import usage from avocado.query import pipeline from .base import FieldBase, is_field_orphaned from ..pagination import PaginatorResource, PaginatorParametizer from ...links import patch_response, reverse_tmpl log = logging.getLogger(__name__) class FieldValuesParametizer(PaginatorParametizer): aware = BoolParam(False) limit = IntParam(10) tree = 
StrParam(MODELTREE_DEFAULT_ALIAS, choices=trees) processor = StrParam('default', choices=pipeline.query_processors) query = StrParam() random = IntParam() class FieldValues(FieldBase, PaginatorResource): """Field Values Resource This resource can be overriden for any field to use a more performant search implementation. """ parametizer = FieldValuesParametizer def get_base_values(self, request, instance, params): "Returns the base queryset for this field." # The `aware` flag toggles the behavior of the distribution by making # relative to the applied context or none if params['aware']: context = self.get_context(request) else: context = self.get_context(request, attrs={}) return context.apply(queryset=instance.model.objects.all()) def get_all_values(self, request, instance, queryset): "Returns all distinct values for this field." results = [] for value, label in instance.choices(queryset=queryset): results.append({ 'label': label, 'value': value, }) return results def get_search_values(self, request, instance, query, queryset): """ Performs a search on the underlying data for a field. This method can be overridden to use an alternate search implementation. """ results = [] value_labels = instance.value_labels(queryset=queryset) for value in instance.search(query, queryset=queryset): results.append({ 'label': value_labels.get(value, smart_unicode(value)), 'value': value, }) return results def get_random_values(self, request, instance, random, queryset): """ Returns a random set of value/label pairs. This is useful for pre-populating documents or form fields with example data. """ values = instance.random(random, queryset=queryset) results = [] for value in values: results.append({ 'label': instance.get_label(value, queryset=queryset), 'value': value, }) return results def get_link_templates(self, request): uri = request.build_absolute_uri return { 'parent': reverse_tmpl( uri, 'serrano:field', {'pk': (int, 'parent_id')}) } def get(self, request, pk): instance = self.get_object(request, pk=pk) if is_field_orphaned(instance): data = { 'message': 'Orphaned fields do not support values calls.' } return self.render( request, data, status=codes.unprocessable_entity) params = self.get_params(request) if params['aware']: context = self.get_context(request) else: context = None QueryProcessor = pipeline.query_processors[params['processor']] processor = QueryProcessor(tree=instance.model, context=context) queryset = processor.get_queryset(request=request) if params['random']: # In the case that the queryset contains a population smaller than # the number of random items being requested, a ValueError will be # triggered. Instead of passing the error on to the client, we # simply return all the possible values. try: return self.get_random_values( request, instance, params['random'], queryset) except ValueError: return instance.values(queryset=queryset) page = params['page'] limit = params['limit'] # If a query term is supplied, perform the icontains search. if params['query']: usage.log('items', instance=instance, request=request, data={ 'query': params['query'], }) values = self.get_search_values( request, instance, params['query'], queryset) else: values = self.get_all_values(request, instance, queryset) # No page specified, return everything. if page is None: return values paginator = self.get_paginator(values, limit=limit) page = paginator.page(page) # Get paginator-based response. data = self.get_page_response(request, paginator, page) data.update({ 'items': page.object_list, }) # Add links. 
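# Resolve this field's values endpoint, build pagination links for the current page, and
# attach them (together with the link templates) to the rendered response.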
path = reverse('serrano:field-values', kwargs={'pk': pk}) links = self.get_page_links(request, path, page, extra=params) templates = self.get_link_templates(request) response = self.render(request, content=data) return patch_response(request, response, links, templates) def post(self, request, pk): instance = self.get_object(request, pk=pk) params = self.get_params(request) if not request.data: data = { 'message': 'Error parsing data', } return self.render(request, data, status=codes.unprocessable_entity) if isinstance(request.data, dict): array = [request.data] else: array = request.data values = [] labels = [] array_map = {} # Separate out the values and labels for the lookup. Track indexes # maintain order of array for i, datum in enumerate(array): # Value takes precedence over label if supplied. if 'value' in datum: array_map[i] = 'value' values.append(datum['value']) elif 'label' in datum: array_map[i] = 'label' labels.append(datum['label']) else: data = { 'message': 'Error parsing value or label' } return self.render(request, data, status=codes.unprocessable_entity) value_field_name = instance.field_name label_field_name = instance.label_field.name # Note, this return a context-aware or naive queryset depending # on params. Get the value and label fields so they can be filled # in below. queryset = self.get_base_values(request, instance, params)\ .values_list(value_field_name, label_field_name) lookup = Q() # Validate based on the label. if labels: lookup |= Q(**{'{0}__in'.format(label_field_name): labels}) if values: lookup |= Q(**{'{0}__in'.format(value_field_name): values}) results = queryset.filter(lookup) value_labels = dict(results) label_values = dict([(v, k) for k, v in value_labels.items()]) for i, datum in enumerate(array): if array_map[i] == 'label': valid = datum['label'] in label_values if valid: value = label_values[datum['label']] else: value = datum['label'] datum['valid'] = valid datum['value'] = value else: valid = datum['value'] in value_labels if valid: label = value_labels[datum['value']] else: label = smart_unicode(datum['value']) datum['valid'] = valid datum['label'] = label usage.log('validate', instance=instance, request=request, data={ 'count': len(array), }) # Return the augmented data. return request.data #! /usr/env/python """ This module attempts to "component-ify" GT's Fastscape stream power erosion. Created DEJH, March 2014. """ from __future__ import print_function import numpy import warnings from landlab import ModelParameterDictionary, Component from landlab.core.model_parameter_dictionary import MissingKeyError, \ ParameterValueError from landlab.utils.decorators import use_file_name_or_kwds from landlab.field.scalar_data_fields import FieldError from scipy.optimize import newton, fsolve UNDEFINED_INDEX = -1 class FastscapeEroder(Component): ''' This class uses the Braun-Willett Fastscape approach to calculate the amount of erosion at each node in a grid, following a stream power framework. This should allow it to be stable against larger timesteps than an explicit stream power scheme. Stream power erosion is implemented as:: E = K * (rainfall_intensity*A)**m * S**n - threshold_sp, if K * A**m * S**n > threshold_sp, and:: E = 0, if K * A**m * S**n <= threshold_sp. This module assumes you have already run :func:`landlab.components.flow_routing.route_flow_dn.FlowRouter.route_flow` in the same timestep. 
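Internally, the erosion update is applied node by node in downstream-to-upstream order, using the already-updated elevation of each node's flow receiver; this implicit treatment is what allows stability at larger timesteps.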
It looks for 'flow__upstream_node_order', 'flow__link_to_receiver_node', 'drainage_area', 'flow__receiver_node', and 'topographic__elevation' at the nodes in the grid. 'drainage_area' should be in area upstream, not volume (i.e., set runoff_rate=1.0 when calling FlowRouter.route_flow). The primary method of this class is :func:`run_one_step`. Construction:: FastscapeEroder(grid, K_sp=None, m_sp=0.5, n_sp=1., threshold_sp=0., rainfall_intensity=1.) Parameters ---------- grid : ModelGrid A grid. K_sp : float, array, or field name K in the stream power equation (units vary with other parameters). m_sp : float, optional m in the stream power equation (power on drainage area). n_sp : float, optional, ~ 0.5<n_sp<4. n in the stream power equation (power on slope). Performance will be VERY degraded if n < 1. threshold_sp : float, array, or field name The threshold stream power. rainfall_intensity : float; optional Modifying factor on drainage area to convert it to a true water volume flux in (m/time). i.e., E = K * (r_i*A)**m * S**n. For a time varying rainfall intensity, pass rainfall_intensity_if_used to `run_one_step`. For a spatially variable rainfall, use the StreamPowerEroder component. Examples -------- >>> import numpy as np >>> from landlab import RasterModelGrid >>> from landlab import CLOSED_BOUNDARY, FIXED_VALUE_BOUNDARY >>> from landlab.components import FlowRouter >>> mg = RasterModelGrid((5, 5), 10.) >>> z = np.array([7., 7., 7., 7., 7., ... 7., 5., 3.2, 6., 7., ... 7., 2., 3., 5., 7., ... 7., 1., 1.9, 4., 7., ... 7., 0., 7., 7., 7.]) >>> z = mg.add_field('node', 'topographic__elevation', z) >>> fr = FlowRouter(mg) >>> sp = FastscapeEroder(mg, K_sp=1.) >>> fr.run_one_step() >>> sp.run_one_step(dt=1.) >>> z # doctest: +NORMALIZE_WHITESPACE array([ 7. , 7. , 7. , 7. , 7. , 7. , 2.92996598, 2.02996598, 4.01498299, 7. , 7. , 0.85993197, 1.87743897, 3.28268321, 7. , 7. , 0.28989795, 0.85403051, 2.42701526, 7. , 7. , 0. , 7. , 7. , 7. ]) >>> mg2 = RasterModelGrid((3, 7), 1.) >>> z = np.array(mg2.node_x**2.) >>> z = mg2.add_field('node', 'topographic__elevation', z) >>> mg2.status_at_node[mg2.nodes_at_left_edge] = FIXED_VALUE_BOUNDARY >>> mg2.status_at_node[mg2.nodes_at_top_edge] = CLOSED_BOUNDARY >>> mg2.status_at_node[mg2.nodes_at_bottom_edge] = CLOSED_BOUNDARY >>> mg2.status_at_node[mg2.nodes_at_right_edge] = CLOSED_BOUNDARY >>> fr2 = FlowRouter(mg2) >>> sp2 = FastscapeEroder(mg2, K_sp=0.1, m_sp=0., n_sp=2., ... threshold_sp=2.) >>> fr2.run_one_step() >>> sp2.run_one_step(dt=10.) >>> z.reshape((3, 7))[1, :] # doctest: +NORMALIZE_WHITESPACE array([ 0. , 1. , 4. , 8.52493781, 13.29039716, 18.44367965, 36. ]) >>> mg3 = RasterModelGrid((3, 7), 1.) >>> z = np.array(mg3.node_x**2.) >>> z = mg3.add_field('node', 'topographic__elevation', z) >>> mg3.status_at_node[mg3.nodes_at_left_edge] = FIXED_VALUE_BOUNDARY >>> mg3.status_at_node[mg3.nodes_at_top_edge] = CLOSED_BOUNDARY >>> mg3.status_at_node[mg3.nodes_at_bottom_edge] = CLOSED_BOUNDARY >>> mg3.status_at_node[mg3.nodes_at_right_edge] = CLOSED_BOUNDARY >>> fr3 = FlowRouter(mg3) >>> K_field = mg3.ones('node') # K can be a field >>> sp3 = FastscapeEroder(mg3, K_sp=K_field, m_sp=1., n_sp=0.6, ... threshold_sp=mg3.node_x, ... rainfall_intensity=2.) >>> fr3.run_one_step() >>> sp3.run_one_step(1.) >>> z.reshape((3, 7))[1, :] # doctest: +NORMALIZE_WHITESPACE array([ 0. , 0.0647484 , 0.58634455, 2.67253503, 8.49212152, 20.92606987, 36. ]) >>> previous_z = z.copy() >>> sp3.run_one_step(1., rainfall_intensity_if_used=0.) 
>>> np.allclose(z, previous_z) True ''' _name = 'FastscapeEroder' _input_var_names = ( 'topographic__elevation', 'drainage_area', 'flow__link_to_receiver_node', 'flow__upstream_node_order', 'flow__receiver_node', ) _output_var_names = ( 'topographic__elevation', ) _var_units = { 'topographic__elevation': 'm', 'drainage_area': 'm**2', 'flow__link_to_receiver_node': '-', 'flow__upstream_node_order': '-', 'flow__receiver_node': '-', } _var_mapping = { 'topographic__elevation': 'node', 'drainage_area': 'node', 'flow__link_to_receiver_node': 'node', 'flow__upstream_node_order': 'node', 'flow__receiver_node': 'node', } _var_doc = { 'topographic__elevation': 'Land surface topographic elevation', 'drainage_area': "Upstream accumulated surface area contributing to the node's " "discharge", 'flow__link_to_receiver_node': 'ID of link downstream of each node, which carries the discharge', 'flow__upstream_node_order': 'Node array containing downstream-to-upstream ordered list of ' 'node IDs', 'flow__receiver_node': 'Node array of receivers (node that receives flow from current ' 'node)', } @use_file_name_or_kwds def __init__(self, grid, K_sp=None, m_sp=0.5, n_sp=1., threshold_sp=0., rainfall_intensity=1., **kwds): """ Initialize the Fastscape stream power component. Note: a timestep, dt, can no longer be supplied to this component through the input file. It must instead be passed directly to the run method. Parameters ---------- grid : ModelGrid A grid. K_sp : float, array, or field name K in the stream power equation (units vary with other parameters). m_sp : float, optional m in the stream power equation (power on drainage area). n_sp : float, optional n in the stream power equation (power on slope). rainfall intensity : float, array, or field name; optional Modifying factor on drainage area to convert it to a true water volume flux in (m/time). i.e., E = K * (r_i*A)**m * S**n """ self._grid = grid self.K = K_sp # overwritten below in special cases self.m = float(m_sp) self.n = float(n_sp) if type(threshold_sp) in (float, int): self.thresholds = float(threshold_sp) else: if type(threshold_sp) is str: self.thresholds = self.grid.at_node[threshold_sp] else: self.thresholds = threshold_sp assert self.thresholds.size == self.grid.number_of_nodes # make storage variables self.A_to_the_m = grid.zeros(at='node') self.alpha = grid.empty(at='node') self.alpha_by_flow_link_lengthtothenless1 = numpy.empty_like( self.alpha) try: self.grid._diagonal_links_at_node # calc number of diagonal links except AttributeError: pass # was not a raster if self.K is None: raise ValueError('K_sp must be set as a float, node array, or ' + 'field name. It was None.') # now handle the inputs that could be float, array or field name: # some support here for old-style inputs if type(K_sp) is str: if K_sp == 'array': self.K = None else: self.K = self._grid.at_node[K_sp] elif type(K_sp) in (float, int): # a float self.K = float(K_sp) elif len(K_sp) == self.grid.number_of_nodes: self.K = numpy.array(K_sp) else: raise TypeError('Supplied type of K_sp ' + 'was not recognised, or array was ' + 'not nnodes long!') if type(rainfall_intensity) is str: raise ValueError('This component can no longer handle ' + 'spatially variable rainfall. 
Use ' + 'StreamPowerEroder.') if rainfall_intensity == 'array': self._r_i = None else: self._r_i = self._grid.at_node[rainfall_intensity] elif type(rainfall_intensity) in (float, int): # a float self._r_i = float(rainfall_intensity) elif len(rainfall_intensity) == self.grid.number_of_nodes: raise ValueError('This component can no longer handle ' + 'spatially variable rainfall. Use ' + 'StreamPowerEroder.') self._r_i = numpy.array(rainfall_intensity) else: raise TypeError('Supplied type of rainfall_' + 'intensity was not recognised!') # We now forbid changing of the field name if 'value_field' in kwds.keys(): raise ValueError('This component can no longer support variable' + 'field names. Use "topographic__elevation".') def erode(self, grid_in, dt=None, K_if_used=None, flooded_nodes=None, rainfall_intensity_if_used=None): """ This method implements the stream power erosion, following the Braun- Willett (2013) implicit Fastscape algorithm. This should allow it to be stable against larger timesteps than an explicit stream power scheme. This driving method for this component is now superceded by the new, standardized wrapper :func:`run_one_step`, but is retained for back compatibility. Set 'K_if_used' as a field name or nnodes-long array if you set K_sp as 'array' during initialization. It returns the grid, in which it will have modified the value of *value_field*, as specified in component initialization. Parameters ---------- grid_in : a grid This is a dummy argument maintained for component back- compatibility. It is superceded by the copy of the grid passed during initialization. dt : float Time-step size. If you are calling the deprecated function :func:`gear_timestep`, that method will supercede any value supplied here. K_if_used : array (optional) Set this to an array if you set K_sp to 'array' in your input file. flooded_nodes : ndarray of int (optional) IDs of nodes that are flooded and should have no erosion. If not provided but flow has still been routed across depressions, erosion may still occur beneath the apparent water level (though will always still be positive). rainfall_intensity_if_used : float or None (optional) Supply to drive this component with a time-varying spatially constant rainfall. Returns ------- grid A reference to the grid. """ self.alpha = numpy.zeros(self._grid.number_of_nodes) self.alpha_by_flow_link_lengthtothenless1 = numpy.zeros(self._grid.number_of_nodes) upstream_order_IDs = self._grid['node']['flow__upstream_node_order'] z = self._grid['node']['topographic__elevation'] defined_flow_receivers = numpy.not_equal(self._grid['node'][ 'flow__link_to_receiver_node'], UNDEFINED_INDEX) flow_link_lengths = self._grid._length_of_link_with_diagonals[ self._grid['node']['flow__link_to_receiver_node'][ defined_flow_receivers]] # make arrays from input the right size if type(self.K) is numpy.ndarray: K_here = self.K[defined_flow_receivers] else: K_here = self.K if rainfall_intensity_if_used is not None: assert type(rainfall_intensity_if_used) in (float, int) r_i_here = float(rainfall_intensity_if_used) else: r_i_here = self._r_i if dt is None: dt = self.dt assert dt is not None, ('Fastscape component could not find a dt to ' + 'use. 
Pass dt to the run_one_step() method.') if self.K is None: # "old style" setting of array assert K_if_used is not None self.K = K_if_used numpy.power(self._grid['node']['drainage_area'], self.m, out=self.A_to_the_m) self.alpha[defined_flow_receivers] = r_i_here**self.m * K_here * dt * \ self.A_to_the_m[defined_flow_receivers] / flow_link_lengths flow_receivers = self._grid['node']['flow__receiver_node'] n_nodes = upstream_order_IDs.size alpha = self.alpha # Handle flooded nodes, if any (no erosion there) if flooded_nodes is not None: alpha[flooded_nodes] = 0. else: reversed_flow = z < z[flow_receivers] # this check necessary if flow has been routed across depressions alpha[reversed_flow] = 0. self.alpha_by_flow_link_lengthtothenless1[ defined_flow_receivers] = (alpha[defined_flow_receivers] / flow_link_lengths**(self.n - 1.)) alpha_divided = self.alpha_by_flow_link_lengthtothenless1 n = float(self.n) threshdt = self.thresholds * dt if type(self.thresholds) is float: from .cfuncs import erode_with_link_alpha_fixthresh erode_with_link_alpha_fixthresh(upstream_order_IDs, flow_receivers, threshdt, alpha_divided, n, z) else: from .cfuncs import erode_with_link_alpha_varthresh erode_with_link_alpha_varthresh(upstream_order_IDs, flow_receivers, threshdt, alpha_divided, n, z) # # This replicates the cython for testing: # for i in range(upstream_order_IDs.size): # src_id = upstream_order_IDs[i] # dst_id = flow_receivers[src_id] # thresh = threshdt[i] # if src_id != dst_id: # next_z = z[src_id] # prev_z = 0. # while True: # #for j in range(2): # z_diff = next_z - z[dst_id] # f = alpha_divided[src_id] * pow(z_diff, n - 1.) # # if z_diff -> 0, pow -> nan (in reality, inf) # # print (f, prev_z, next_z, z_diff, z[dst_id]) # next_z = next_z - ((next_z - z[src_id] + ( # f*z_diff - thresh).clip(0.)) / (1. + n * f)) # if next_z < z[dst_id]: # next_z = z[dst_id] + 1.e-15 # # ^maintain connectivity # if next_z != 0.: # if (numpy.fabs((next_z - prev_z)/next_z) < # 1.48e-08) or (n == 1.): # break # else: # break # prev_z = next_z # if next_z < z[src_id]: # z[src_id] = next_z return self._grid def run_one_step(self, dt, flooded_nodes=None, rainfall_intensity_if_used=None, **kwds): """ This method implements the stream power erosion across one time interval, dt, following the Braun-Willett (2013) implicit Fastscape algorithm. This follows Landlab standardized component design, and supercedes the old driving method :func:`erode`. Parameters ---------- dt : float Time-step size flooded_nodes : ndarray of int (optional) IDs of nodes that are flooded and should have no erosion. If not provided but flow has still been routed across depressions, erosion may still occur beneath the apparent water level (though will always still be positive). rainfall_intensity_if_used : float or None (optional) Supply to drive this component with a time-varying spatially constant rainfall. """ self.erode(grid_in=self._grid, dt=dt, flooded_nodes=flooded_nodes, rainfall_intensity_if_used=rainfall_intensity_if_used) import logging import os import shutil import tempfile from crawler_exceptions import CrawlError, CrawlUnsupportedPackageManager from utils import osinfo from utils.features import PackageFeature from utils.misc import subprocess_run logger = logging.getLogger('crawlutils') def get_dpkg_packages( root_dir='/', dbpath='var/lib/dpkg', installed_since=0): if os.path.isabs(dbpath): logger.warning( 'dbpath: ' + dbpath + ' is defined absolute. Ignoring prefix: ' + root_dir + '.') # Update for a different route. 
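# dpkg-query is invoked below with a custom -f format so each installed package arrives as
# one 'Package|Version|Architecture|Installed-Size' record, e.g. (illustrative only):
#     bash|5.0-6ubuntu1|amd64|1680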
dbpath = os.path.join(root_dir, dbpath) output = subprocess_run(['dpkg-query', '-W', '--admindir={0}'.format(dbpath), '-f=${Package}|${Version}' '|${Architecture}|${Installed-Size}\n'], shell=False) dpkglist = output.strip('\n') if dpkglist: for dpkginfo in dpkglist.split('\n'): (name, version, architecture, size) = dpkginfo.split(r'|') # dpkg does not provide any installtime field # feature_key = '{0}/{1}'.format(name, version) --> # changed to below per Suriya's request feature_key = '{0}'.format(name, version) yield (feature_key, PackageFeature(None, name, size, version, architecture)) def get_rpm_packages( root_dir='/', dbpath='var/lib/rpm', installed_since=0, reload_needed=False): if os.path.isabs(dbpath): logger.warning( 'dbpath: ' + dbpath + ' is defined absolute. Ignoring prefix: ' + root_dir + '.') # update for a different route dbpath = os.path.join(root_dir, dbpath) try: if reload_needed: reloaded_db_dir = tempfile.mkdtemp() _rpm_reload_db(root_dir, dbpath, reloaded_db_dir) dbpath = reloaded_db_dir output = subprocess_run(['rpm', '--dbpath', dbpath, '-qa', '--queryformat', '%{installtime}|%{name}|%{version}' '-%{release}|%{arch}|%{size}\n'], shell=False, ignore_failure=True) # We ignore failures because sometimes rpm returns rc=1 but still # outputs all the data. rpmlist = output.strip('\n') finally: if reload_needed: logger.debug('Deleting directory: %s' % (reloaded_db_dir)) shutil.rmtree(reloaded_db_dir) if rpmlist: for rpminfo in rpmlist.split('\n'): (installtime, name, version, architecture, size) = \ rpminfo.split(r'|') """ if int(installtime) <= installed_since: --> this barfs for sth like: 1376416422. Consider try: xxx except ValueError: pass """ if installtime <= installed_since: continue """ feature_key = '{0}/{1}'.format(name, version) --> changed to below per Suriya's request """ feature_key = '{0}'.format(name, version) yield (feature_key, PackageFeature(installtime, name, size, version, architecture)) def _rpm_reload_db( root_dir='/', dbpath='var/lib/rpm', reloaded_db_dir='/tmp/'): """ Dumps and reloads the rpm database. Returns the path to the new rpm database, or raises RuntimeError if the dump and load commands failed. """ try: dump_dir = tempfile.mkdtemp() subprocess_run(['/usr/bin/db_dump', os.path.join(dbpath, 'Packages'), '-f', os.path.join(dump_dir, 'Packages')], shell=False) subprocess_run(['/usr/bin/db_load', '-f', os.path.join(dump_dir, 'Packages'), os.path.join(reloaded_db_dir, 'Packages')], shell=False) finally: logger.debug('Deleting directory: %s' % (dump_dir)) shutil.rmtree(dump_dir) return reloaded_db_dir # from UK crawler codebase def apk_parser(filename): try: db_contents = open(filename).read() packages = db_contents.split('\n\n') logger.debug('Found {} APK packages'.format(len(packages))) for package in packages: if package: attributes = package.split('\n') name = "" version = "" architecture = "" size = "" for attribute in attributes: if (attribute.startswith('P:')): name = attribute[2:] elif (attribute.startswith('V:')): version = attribute[2:] elif (attribute.startswith('A:')): architecture = attribute[2:] elif (attribute.startswith('S:')): size = attribute[2:] yield (name, PackageFeature(None, name, size, version, architecture)) except IOError as e: logger.error('Failed to read APK database to obtain packages. ' 'Check if %s is present. 
[Exception: %s: %s]' ' ' % (filename, type(e).__name__, e.strerror)) raise def get_apk_packages( root_dir='/', dbpath='lib/apk/db'): if os.path.isabs(dbpath): logger.warning( 'dbpath: ' + dbpath + ' is defined absolute. Ignoring prefix: ' + root_dir + '.') # Update for a different route. dbpath = os.path.join(root_dir, dbpath) for feature_key, package_feature in apk_parser( os.path.join(dbpath, 'installed')): yield (feature_key, package_feature) def crawl_packages( dbpath=None, root_dir='/', installed_since=0, reload_needed=True): # package attributes: ["installed", "name", "size", "version"] logger.debug('Crawling Packages') try: pkg_manager = _get_package_manager(root_dir) if pkg_manager == 'dpkg': dbpath = dbpath or 'var/lib/dpkg' for (key, feature) in get_dpkg_packages( root_dir, dbpath, installed_since): yield (key, feature, 'package') elif pkg_manager == 'rpm': dbpath = dbpath or 'var/lib/rpm' for (key, feature) in get_rpm_packages( root_dir, dbpath, installed_since, reload_needed): yield (key, feature, 'package') elif pkg_manager == 'apk': dbpath = dbpath or 'lib/apk/db' for (key, feature) in get_apk_packages( root_dir, dbpath): yield (key, feature, 'package') else: logger.warning('Unsupported package manager for Linux distro') except Exception as e: logger.error('Error crawling packages', exc_info=True) raise CrawlError(e) def _get_package_manager(root_dir): result = osinfo.get_osinfo(mount_point=root_dir) if result: os_distro = result['os'] else: raise CrawlUnsupportedPackageManager() pkg_manager = None if os_distro in ['ubuntu', 'debian']: pkg_manager = 'dpkg' elif os_distro in ['redhat', 'red hat', 'rhel', 'fedora', 'centos']: pkg_manager = 'rpm' elif os_distro in ['alpine']: pkg_manager = 'apk' elif os.path.exists(os.path.join(root_dir, 'var/lib/dpkg')): pkg_manager = 'dpkg' elif os.path.exists(os.path.join(root_dir, 'var/lib/rpm')): pkg_manager = 'rpm' return pkg_manager import math import os import threading from collections import defaultdict from typing import Dict import copy from twisted.internet.address import IPv4Address import bptc from bptc.data.consensus import divide_rounds, decide_fame, find_order from bptc.data.event import Event, Parents from bptc.data.member import Member from bptc.utils.toposort import toposort from bptc.data.transaction import MoneyTransaction, TransactionStatus, PublishNameTransaction class Hashgraph: """ The Hashgraph - storing the events of all nodes """ def __init__(self, me, debug_mode=False): self.lock = threading.RLock() # Member: A reference to the current user. For convenience (e.g. signing) self.me = me self.debug_mode = debug_mode # {member-id => Member}: All members we know if me is not None: self.known_members = {me.id: me} # {event-hash => event}: Dictionary mapping hashes to events self.lookup_table = {} # {event-hash}: Events for which the final order has not yet been determined self.unordered_events = set() # [event-hash]: Final order of events self.ordered_events = [] self.next_ordered_event_idx_to_process = 0 self.idx = {} # {round-num}: rounds where fame is fully decided self.rounds_with_decided_fame = set() # {round-num => {member-pk => event-hash}}: self.witnesses = defaultdict(dict) # {event-hash => set(event-hash)}: Cache for event's self-children (used for fast fork check) self.self_children_cache = defaultdict(set) # set(member-id): A set of member who forked. Members who forked have no visible events. 
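# Forks are detected in add_event() via self_children_cache; once a member is added to this
# blacklist, the per-event can_see caches are cleared so the forked events are no longer
# treated as visible.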
self.fork_blacklist = set() @property def total_stake(self) -> int: """ :return: The total stake in the hashgraph """ return sum([member.stake for _, member in self.known_members.items()]) @property def supermajority_stake(self) -> int: """ :return: The stake needed for a supermajority (2/3 of total) """ return int(math.floor(2 * self.total_stake / 3)) def get_unknown_events_of(self, member: Member) -> Dict[str, Event]: """ Returns the presumably unknown events of a given member, in the same format as lookup_table :param member: The member for which to return unknown events :return: Dictionary mapping hashes to events """ result = dict(self.lookup_table) head = member.head if head is None: return result to_visit = {head} visited = set() while len(to_visit) > 0: event_id = to_visit.pop() if event_id not in visited: event = result[event_id] del result[event_id] if event.parents.self_parent is not None: to_visit.add(event.parents.self_parent) if event.parents.other_parent is not None: to_visit.add(event.parents.other_parent) visited.add(event_id) return result def add_own_event(self, event: Event, calculate_consensus: bool = False): """ Adds an own event to the hashgraph :param event: The event to be added :param calculate_consensus: Whether the consensus should be calculated immediately :return: None """ # Sign event body event.sign(self.me.signing_key) # Add event self.add_event(event) # Only do consensus if this is the first event if calculate_consensus: divide_rounds(self, [event]) decide_fame(self) find_order(self) self.process_ordered_events() def add_event(self, event: Event): # Set the event's correct height if event.parents.self_parent: event.height = self.lookup_table[event.parents.self_parent].height + 1 # Add event to graph self.lookup_table[event.id] = event # Update caches self.unordered_events.add(event.id) if self.known_members[event.verify_key].head is None or \ event.height > self.lookup_table[self.known_members[event.verify_key].head].height: self.known_members[event.verify_key].head = event.id if event.parents.self_parent is not None: self.self_children_cache[event.parents.self_parent].add(event.id) if len(self.self_children_cache[event.parents.self_parent]) > 1: # We just added a fork bptc.logger.warn("A fork was created! Blacklisting member and clearing visibility caches.") # Blacklist the member who forked self.fork_blacklist.add(event.verify_key) # Visibility for events could have changed - throw away the caches for e in self.lookup_table.values(): e.can_see_cache.clear() def process_events(self, from_member: Member, events: Dict[str, Event]) -> None: """ Processes a list of events :param from_member: The member from whom the events were received :param events: The events to be processed :return: None """ events = copy.deepcopy(events) bptc.logger.debug("Processing {} events from {}...".format(len(events), from_member.verify_key[:6])) # Only deal with valid events events = filter_valid_events(events) events_toposorted = toposort(events) # Learn about other members self.learn_members_from_events(events) # Add all new events in topological order and check parent pointer new_events = {} for event in events_toposorted: if event.id not in self.lookup_table: if event.parents.self_parent is not None and event.parents.self_parent not in self.lookup_table: bptc.logger.error('Self parent {} of {} not known. Ignore all data.'. 
format(event.parents.self_parent[:6], event.id[:6])) return if event.parents.other_parent is not None and event.parents.other_parent not in self.lookup_table: bptc.logger.error('Other parent {} of {} not known. Ignore all data'. format(event.parents.other_parent[:6], event.id[:6])) return new_events[event.id] = event self.add_event(event) # Create a new event for the gossip event = Event(self.me.verify_key, None, Parents(self.me.head, from_member.head)) self.add_own_event(event) new_events[event.id] = event # Figure out fame, order, etc. divide_rounds(self, toposort(new_events)) decide_fame(self) find_order(self) self.process_ordered_events() # Debug mode writes the DB to a file every 100 events. if self.debug_mode: number_events = (len(self.lookup_table) // 100) * 100 # Don't store when there are not enough events or it would overwrite # the last temporary db if number_events > 0 and number_events > self.debug_mode: bptc.logger.debug('Store intermediate results containing about {} events'.format(number_events)) from bptc.data.db import DB DB.save(self, temp=True) self.debug_mode = (len(self.lookup_table) // 100) * 100 def learn_members_from_events(self, events: Dict[str, Event]) -> None: """ Goes through a list of events and learns their creators if they are not already known :param events: The list of events :return: None """ for event in events.values(): if event.verify_key not in self.known_members: self.known_members[event.verify_key] = Member(event.verify_key, None) def process_ordered_events(self): for event_id in self.ordered_events[self.next_ordered_event_idx_to_process:len(self.ordered_events)]: event = self.lookup_table[event_id] if event.data is None: continue for transaction in event.data: sender = self.known_members[event.verify_key] if isinstance(transaction, MoneyTransaction): receiver = self.known_members[transaction.receiver] # Check if the sender has the funds if sender.account_balance < transaction.amount or transaction.amount < 0: transaction.status = TransactionStatus.DENIED else: sender.account_balance -= transaction.amount receiver.account_balance += transaction.amount transaction.status = TransactionStatus.CONFIRMED elif isinstance(transaction, PublishNameTransaction): sender.name = transaction.name self.next_ordered_event_idx_to_process = len(self.ordered_events) def parse_transaction(self, event, transaction, plain=False): receiver = self.known_members[transaction.receiver].formatted_name if \ transaction.receiver in self.known_members else transaction.receiver sender = self.known_members[event.verify_key].formatted_name if \ event.verify_key in self.known_members else event.verify_key status = TransactionStatus.text_for_value(transaction.status) is_received = transaction.receiver == self.me.to_verifykey_string() amount = transaction.amount comment = transaction.comment time = event.time rec = { 'receiver': receiver, 'sender': sender, 'amount': amount, 'comment': comment, 'time': time, 'status': status, 'is_received': is_received, } format_string = '{} [b]{} BPTC[/b] {} [b]{}[/b] ({}){}' if plain: format_string = '{} {} BPTC {} {} ({}){}' rec['formatted'] = format_string.format( 'Received' if is_received else 'Sent', amount, 'from' if rec['is_received'] else 'to', sender if rec['is_received'] else receiver, status, '\n"{}"'.format(comment) if comment else '', ).replace('\n', ' - ' if plain else '\n') return rec def get_relevant_transactions(self, plain=False, show_all=False): # Load transactions belonging to this member transactions = [] events = 
list(self.lookup_table.values()) for e in events: for t in e.data or []: if isinstance(t, MoneyTransaction): if show_all or self.me.to_verifykey_string() in [e.verify_key, t.receiver]: transactions.append(self.parse_transaction(e, t, plain)) return sorted(transactions, key=lambda x: x['time'], reverse=True) def filter_valid_events(events: Dict[str, Event]) -> Dict[str, Event]: """ Goes through a dict of events and returns a dict containing only the valid ones :param events: The dict to be filtered :return: A dict containing only valid events """ result = dict() for event_id, event in events.items(): if event.has_valid_signature: result[event_id] = event else: bptc.logger.warn("Event had invalid signature: {}".format(event)) return result def init_hashgraph(app): """Loads the hashgraph from file or creates a new one, if the file doesn't exist.""" from bptc.data.db import DB from bptc.data.network import Network # Try to load the Hashgraph from the database hashgraph = DB.load_hashgraph(os.path.join(app.cl_args.output, 'data.db')) hashgraph.debug_mode = app.cl_args.debug # Create a new hashgraph if it could not be loaded if hashgraph is None or hashgraph.me is None: me = Member.create() me.address = IPv4Address("TCP", bptc.ip, bptc.port) hashgraph = Hashgraph(me, app.cl_args.debug) app.network = Network(hashgraph, create_initial_event=True) else: app.network = Network(hashgraph, create_initial_event=False) # -*- coding: utf-8 -*- # Copyright 2020 Google LLC # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. # from collections import OrderedDict from distutils import util import os import re from typing import Dict, Optional, Sequence, Tuple, Type, Union from google.api_core import client_options as client_options_lib # type: ignore from google.api_core import exceptions as core_exceptions # type: ignore from google.api_core import gapic_v1 # type: ignore from google.api_core import retry as retries # type: ignore from google.auth import credentials as ga_credentials # type: ignore from google.auth.transport import mtls # type: ignore from google.auth.transport.grpc import SslCredentials # type: ignore from google.auth.exceptions import MutualTLSChannelError # type: ignore from google.oauth2 import service_account # type: ignore from google.ads.googleads.v8.resources.types import campaign_simulation from google.ads.googleads.v8.services.types import campaign_simulation_service from .transports.base import ( CampaignSimulationServiceTransport, DEFAULT_CLIENT_INFO, ) from .transports.grpc import CampaignSimulationServiceGrpcTransport class CampaignSimulationServiceClientMeta(type): """Metaclass for the CampaignSimulationService client. This provides class-level methods for building and retrieving support objects (e.g. transport) without polluting the client instance objects. 
""" _transport_registry = ( OrderedDict() ) # type: Dict[str, Type[CampaignSimulationServiceTransport]] _transport_registry["grpc"] = CampaignSimulationServiceGrpcTransport def get_transport_class( cls, label: str = None, ) -> Type[CampaignSimulationServiceTransport]: """Return an appropriate transport class. Args: label: The name of the desired transport. If none is provided, then the first transport in the registry is used. Returns: The transport class to use. """ # If a specific transport is requested, return that one. if label: return cls._transport_registry[label] # No transport is requested; return the default (that is, the first one # in the dictionary). return next(iter(cls._transport_registry.values())) class CampaignSimulationServiceClient( metaclass=CampaignSimulationServiceClientMeta ): """Service to fetch campaign simulations.""" @staticmethod def _get_default_mtls_endpoint(api_endpoint): """Convert api endpoint to mTLS endpoint. Convert "*.sandbox.googleapis.com" and "*.googleapis.com" to "*.mtls.sandbox.googleapis.com" and "*.mtls.googleapis.com" respectively. Args: api_endpoint (Optional[str]): the api endpoint to convert. Returns: str: converted mTLS api endpoint. """ if not api_endpoint: return api_endpoint mtls_endpoint_re = re.compile( r"(?P<name>[^.]+)(?P<mtls>\.mtls)?(?P<sandbox>\.sandbox)?(?P<googledomain>\.googleapis\.com)?" ) m = mtls_endpoint_re.match(api_endpoint) name, mtls, sandbox, googledomain = m.groups() if mtls or not googledomain: return api_endpoint if sandbox: return api_endpoint.replace( "sandbox.googleapis.com", "mtls.sandbox.googleapis.com" ) return api_endpoint.replace(".googleapis.com", ".mtls.googleapis.com") DEFAULT_ENDPOINT = "googleads.googleapis.com" DEFAULT_MTLS_ENDPOINT = _get_default_mtls_endpoint.__func__( # type: ignore DEFAULT_ENDPOINT ) @classmethod def from_service_account_info(cls, info: dict, *args, **kwargs): """Creates an instance of this client using the provided credentials info. Args: info (dict): The service account private key info. args: Additional arguments to pass to the constructor. kwargs: Additional arguments to pass to the constructor. Returns: CampaignSimulationServiceClient: The constructed client. """ credentials = service_account.Credentials.from_service_account_info( info ) kwargs["credentials"] = credentials return cls(*args, **kwargs) @classmethod def from_service_account_file(cls, filename: str, *args, **kwargs): """Creates an instance of this client using the provided credentials file. Args: filename (str): The path to the service account private key json file. args: Additional arguments to pass to the constructor. kwargs: Additional arguments to pass to the constructor. Returns: CampaignSimulationServiceClient: The constructed client. """ credentials = service_account.Credentials.from_service_account_file( filename ) kwargs["credentials"] = credentials return cls(*args, **kwargs) from_service_account_json = from_service_account_file @property def transport(self) -> CampaignSimulationServiceTransport: """Return the transport used by the client instance. Returns: CampaignSimulationServiceTransport: The transport used by the client instance. 
""" return self._transport @staticmethod def campaign_simulation_path( customer_id: str, campaign_id: str, type: str, modification_method: str, start_date: str, end_date: str, ) -> str: """Return a fully-qualified campaign_simulation string.""" return "customers/{customer_id}/campaignSimulations/{campaign_id}~{type}~{modification_method}~{start_date}~{end_date}".format( customer_id=customer_id, campaign_id=campaign_id, type=type, modification_method=modification_method, start_date=start_date, end_date=end_date, ) @staticmethod def parse_campaign_simulation_path(path: str) -> Dict[str, str]: """Parse a campaign_simulation path into its component segments.""" m = re.match( r"^customers/(?P<customer_id>.+?)/campaignSimulations/(?P<campaign_id>.+?)~(?P<type>.+?)~(?P<modification_method>.+?)~(?P<start_date>.+?)~(?P<end_date>.+?)$", path, ) return m.groupdict() if m else {} @staticmethod def common_billing_account_path(billing_account: str,) -> str: """Return a fully-qualified billing_account string.""" return "billingAccounts/{billing_account}".format( billing_account=billing_account, ) @staticmethod def parse_common_billing_account_path(path: str) -> Dict[str, str]: """Parse a billing_account path into its component segments.""" m = re.match(r"^billingAccounts/(?P<billing_account>.+?)$", path) return m.groupdict() if m else {} @staticmethod def common_folder_path(folder: str,) -> str: """Return a fully-qualified folder string.""" return "folders/{folder}".format(folder=folder,) @staticmethod def parse_common_folder_path(path: str) -> Dict[str, str]: """Parse a folder path into its component segments.""" m = re.match(r"^folders/(?P<folder>.+?)$", path) return m.groupdict() if m else {} @staticmethod def common_organization_path(organization: str,) -> str: """Return a fully-qualified organization string.""" return "organizations/{organization}".format(organization=organization,) @staticmethod def parse_common_organization_path(path: str) -> Dict[str, str]: """Parse a organization path into its component segments.""" m = re.match(r"^organizations/(?P<organization>.+?)$", path) return m.groupdict() if m else {} @staticmethod def common_project_path(project: str,) -> str: """Return a fully-qualified project string.""" return "projects/{project}".format(project=project,) @staticmethod def parse_common_project_path(path: str) -> Dict[str, str]: """Parse a project path into its component segments.""" m = re.match(r"^projects/(?P<project>.+?)$", path) return m.groupdict() if m else {} @staticmethod def common_location_path(project: str, location: str,) -> str: """Return a fully-qualified location string.""" return "projects/{project}/locations/{location}".format( project=project, location=location, ) @staticmethod def parse_common_location_path(path: str) -> Dict[str, str]: """Parse a location path into its component segments.""" m = re.match( r"^projects/(?P<project>.+?)/locations/(?P<location>.+?)$", path ) return m.groupdict() if m else {} def __init__( self, *, credentials: Optional[ga_credentials.Credentials] = None, transport: Union[str, CampaignSimulationServiceTransport, None] = None, client_options: Optional[client_options_lib.ClientOptions] = None, client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO, ) -> None: """Instantiate the campaign simulation service client. Args: credentials (Optional[google.auth.credentials.Credentials]): The authorization credentials to attach to requests. 
These credentials identify the application to the service; if none are specified, the client will attempt to ascertain the credentials from the environment. transport (Union[str, ~.CampaignSimulationServiceTransport]): The transport to use. If set to None, a transport is chosen automatically. client_options (google.api_core.client_options.ClientOptions): Custom options for the client. It won't take effect if a ``transport`` instance is provided. (1) The ``api_endpoint`` property can be used to override the default endpoint provided by the client. GOOGLE_API_USE_MTLS_ENDPOINT environment variable can also be used to override the endpoint: "always" (always use the default mTLS endpoint), "never" (always use the default regular endpoint) and "auto" (auto switch to the default mTLS endpoint if client certificate is present, this is the default value). However, the ``api_endpoint`` property takes precedence if provided. (2) If GOOGLE_API_USE_CLIENT_CERTIFICATE environment variable is "true", then the ``client_cert_source`` property can be used to provide client certificate for mutual TLS transport. If not provided, the default SSL client certificate will be used if present. If GOOGLE_API_USE_CLIENT_CERTIFICATE is "false" or not set, no client certificate will be used. client_info (google.api_core.gapic_v1.client_info.ClientInfo): The client info used to send a user-agent string along with API requests. If ``None``, then default info will be used. Generally, you only need to set this if you're developing your own client library. Raises: google.auth.exceptions.MutualTLSChannelError: If mutual TLS transport creation failed for any reason. """ if isinstance(client_options, dict): client_options = client_options_lib.from_dict(client_options) if client_options is None: client_options = client_options_lib.ClientOptions() # Create SSL credentials for mutual TLS if needed. use_client_cert = bool( util.strtobool( os.getenv("GOOGLE_API_USE_CLIENT_CERTIFICATE", "false") ) ) ssl_credentials = None is_mtls = False if use_client_cert: if client_options.client_cert_source: import grpc # type: ignore cert, key = client_options.client_cert_source() ssl_credentials = grpc.ssl_channel_credentials( certificate_chain=cert, private_key=key ) is_mtls = True else: creds = SslCredentials() is_mtls = creds.is_mtls ssl_credentials = creds.ssl_credentials if is_mtls else None # Figure out which api endpoint to use. if client_options.api_endpoint is not None: api_endpoint = client_options.api_endpoint else: use_mtls_env = os.getenv("GOOGLE_API_USE_MTLS_ENDPOINT", "auto") if use_mtls_env == "never": api_endpoint = self.DEFAULT_ENDPOINT elif use_mtls_env == "always": api_endpoint = self.DEFAULT_MTLS_ENDPOINT elif use_mtls_env == "auto": api_endpoint = ( self.DEFAULT_MTLS_ENDPOINT if is_mtls else self.DEFAULT_ENDPOINT ) else: raise MutualTLSChannelError( "Unsupported GOOGLE_API_USE_MTLS_ENDPOINT value. Accepted values: never, auto, always" ) # Save or instantiate the transport. # Ordinarily, we provide the transport, but allowing a custom transport # instance provides an extensibility point for unusual situations. if isinstance(transport, CampaignSimulationServiceTransport): # transport is a CampaignSimulationServiceTransport instance. if credentials: raise ValueError( "When providing a transport instance, " "provide its credentials directly." 
) self._transport = transport elif isinstance(transport, str): Transport = type(self).get_transport_class(transport) self._transport = Transport( credentials=credentials, host=self.DEFAULT_ENDPOINT ) else: self._transport = CampaignSimulationServiceGrpcTransport( credentials=credentials, host=api_endpoint, ssl_channel_credentials=ssl_credentials, client_info=client_info, ) def get_campaign_simulation( self, request: campaign_simulation_service.GetCampaignSimulationRequest = None, *, resource_name: str = None, retry: retries.Retry = gapic_v1.method.DEFAULT, timeout: float = None, metadata: Sequence[Tuple[str, str]] = (), ) -> campaign_simulation.CampaignSimulation: r"""Returns the requested campaign simulation in full detail. Args: request (:class:`google.ads.googleads.v8.services.types.GetCampaignSimulationRequest`): The request object. Request message for [CampaignSimulationService.GetCampaignSimulation][google.ads.googleads.v8.services.CampaignSimulationService.GetCampaignSimulation]. resource_name (:class:`str`): Required. The resource name of the campaign simulation to fetch. This corresponds to the ``resource_name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. retry (google.api_core.retry.Retry): Designation of what errors, if any, should be retried. timeout (float): The timeout for this request. metadata (Sequence[Tuple[str, str]]): Strings which should be sent along with the request as metadata. Returns: google.ads.googleads.v8.resources.types.CampaignSimulation: A campaign simulation. Supported combinations of advertising channel type, simulation type and simulation modification method is detailed below respectively. SEARCH - CPC_BID - UNIFORM SEARCH - CPC_BID - SCALING SEARCH - TARGET_CPA - UNIFORM SEARCH - TARGET_CPA - SCALING SEARCH - TARGET_ROAS - UNIFORM SEARCH - TARGET_IMPRESSION_SHARE - UNIFORM SEARCH - BUDGET - UNIFORM SHOPPING - BUDGET - UNIFORM SHOPPING - TARGET_ROAS - UNIFORM MULTIPLE - TARGET_CPA - UNIFORM OWNED_AND_OPERATED - TARGET_CPA - DEFAULT DISPLAY - TARGET_CPA - UNIFORM """ # Create or coerce a protobuf request object. # Sanity check: If we got a request object, we should *not* have # gotten any keyword arguments that map to the request. if request is not None and any([resource_name]): raise ValueError( "If the `request` argument is set, then none of " "the individual field arguments should be set." ) # Minor optimization to avoid making a copy if the user passes # in a campaign_simulation_service.GetCampaignSimulationRequest. # There's no risk of modifying the input as we've already verified # there are no flattened fields. if not isinstance( request, campaign_simulation_service.GetCampaignSimulationRequest ): request = campaign_simulation_service.GetCampaignSimulationRequest( request ) # If we have keyword arguments corresponding to fields on the # request, apply these. if resource_name is not None: request.resource_name = resource_name # Wrap the RPC method; this adds retry and timeout information, # and friendly error handling. rpc = self._transport._wrapped_methods[ self._transport.get_campaign_simulation ] # Certain fields should be provided within the metadata header; # add these here. metadata = tuple(metadata) + ( gapic_v1.routing_header.to_grpc_metadata( (("resource_name", request.resource_name),) ), ) # Send the request. response = rpc( request, retry=retry, timeout=timeout, metadata=metadata, ) # Done; return the response. 
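        # --- Hedged usage sketch (added for illustration; not part of the generated client) ---
        # A minimal, hypothetical example of calling this service. The IDs below are
        # placeholders, and the client is assumed to pick up credentials from the
        # environment. Kept entirely in comments so it cannot affect the method above.
        #
        #   client = CampaignSimulationServiceClient()
        #   name = CampaignSimulationServiceClient.campaign_simulation_path(
        #       customer_id="1234567890",
        #       campaign_id="111",
        #       type="TARGET_CPA",
        #       modification_method="UNIFORM",
        #       start_date="2021-01-01",
        #       end_date="2021-01-30",
        #   )
        #   simulation = client.get_campaign_simulation(resource_name=name)
        #   # parse_campaign_simulation_path() recovers the individual segments again:
        #   segments = CampaignSimulationServiceClient.parse_campaign_simulation_path(name)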
        return response


__all__ = ("CampaignSimulationServiceClient",)
# -*- coding: utf-8 -*-

############################ Copyrights and license ############################
#                                                                              #
# Copyright 2012 Vincent Jacques <vincent@vincent-jacques.net>                 #
# Copyright 2012 Zearin <zearin@gonk.net>                                      #
# Copyright 2013 AKFish <akfish@gmail.com>                                     #
# Copyright 2013 Vincent Jacques <vincent@vincent-jacques.net>                 #
# Copyright 2013 martinqt <m.ki2@laposte.net>                                  #
# Copyright 2014 Andy Casey <acasey@mso.anu.edu.au>                            #
# Copyright 2014 Vincent Jacques <vincent@vincent-jacques.net>                 #
# Copyright 2016 Jannis Gebauer <ja.geb@me.com>                                #
# Copyright 2016 John Eskew <jeskew@edx.org>                                   #
# Copyright 2016 Peter Buckley <dx-pbuckley@users.noreply.github.com>          #
# Copyright 2018 sfdye <tsfdye@gmail.com>                                      #
#                                                                              #
# This file is part of PyGithub.                                               #
# http://pygithub.readthedocs.io/                                              #
#                                                                              #
# PyGithub is free software: you can redistribute it and/or modify it under   #
# the terms of the GNU Lesser General Public License as published by the Free #
# Software Foundation, either version 3 of the License, or (at your option)   #
# any later version.                                                           #
#                                                                              #
# PyGithub is distributed in the hope that it will be useful, but WITHOUT ANY #
# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS   #
# FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more#
# details.                                                                     #
#                                                                              #
# You should have received a copy of the GNU Lesser General Public License    #
# along with PyGithub. If not, see <http://www.gnu.org/licenses/>.             #
#                                                                              #
################################################################################

import github.GithubObject
import github.PaginatedList

import github.GitCommit
import github.NamedUser
import github.CommitStatus
import github.CommitCombinedStatus
import github.File
import github.CommitStats
import github.CommitComment


class Commit(github.GithubObject.CompletableGithubObject):
    """
    This class represents Commits.
The reference can be found here http://developer.github.com/v3/git/commits/ """ def __repr__(self): return self.get__repr__({"sha": self._sha.value}) @property def author(self): """ :type: :class:`github.NamedUser.NamedUser` """ self._completeIfNotSet(self._author) return self._author.value @property def comments_url(self): """ :type: string """ self._completeIfNotSet(self._comments_url) return self._comments_url.value @property def commit(self): """ :type: :class:`github.GitCommit.GitCommit` """ self._completeIfNotSet(self._commit) return self._commit.value @property def committer(self): """ :type: :class:`github.NamedUser.NamedUser` """ self._completeIfNotSet(self._committer) return self._committer.value @property def files(self): """ :type: list of :class:`github.File.File` """ self._completeIfNotSet(self._files) return self._files.value @property def html_url(self): """ :type: string """ self._completeIfNotSet(self._html_url) return self._html_url.value @property def parents(self): """ :type: list of :class:`github.Commit.Commit` """ self._completeIfNotSet(self._parents) return self._parents.value @property def sha(self): """ :type: string """ self._completeIfNotSet(self._sha) return self._sha.value @property def stats(self): """ :type: :class:`github.CommitStats.CommitStats` """ self._completeIfNotSet(self._stats) return self._stats.value @property def url(self): """ :type: string """ self._completeIfNotSet(self._url) return self._url.value def create_comment(self, body, line=github.GithubObject.NotSet, path=github.GithubObject.NotSet, position=github.GithubObject.NotSet): """ :calls: `POST /repos/:owner/:repo/commits/:sha/comments <http://developer.github.com/v3/repos/comments>`_ :param body: string :param line: integer :param path: string :param position: integer :rtype: :class:`github.CommitComment.CommitComment` """ assert isinstance(body, (str, unicode)), body assert line is github.GithubObject.NotSet or isinstance(line, (int, long)), line assert path is github.GithubObject.NotSet or isinstance(path, (str, unicode)), path assert position is github.GithubObject.NotSet or isinstance(position, (int, long)), position post_parameters = { "body": body, } if line is not github.GithubObject.NotSet: post_parameters["line"] = line if path is not github.GithubObject.NotSet: post_parameters["path"] = path if position is not github.GithubObject.NotSet: post_parameters["position"] = position headers, data = self._requester.requestJsonAndCheck( "POST", self.url + "/comments", input=post_parameters ) return github.CommitComment.CommitComment(self._requester, headers, data, completed=True) def create_status(self, state, target_url=github.GithubObject.NotSet, description=github.GithubObject.NotSet, context=github.GithubObject.NotSet): """ :calls: `POST /repos/:owner/:repo/statuses/:sha <http://developer.github.com/v3/repos/statuses>`_ :param state: string :param target_url: string :param description: string :param context: string :rtype: :class:`github.CommitStatus.CommitStatus` """ assert isinstance(state, (str, unicode)), state assert target_url is github.GithubObject.NotSet or isinstance(target_url, (str, unicode)), target_url assert description is github.GithubObject.NotSet or isinstance(description, (str, unicode)), description assert context is github.GithubObject.NotSet or isinstance(context, (str, unicode)), context post_parameters = { "state": state, } if target_url is not github.GithubObject.NotSet: post_parameters["target_url"] = target_url if description is not 
github.GithubObject.NotSet: post_parameters["description"] = description if context is not github.GithubObject.NotSet: post_parameters["context"] = context headers, data = self._requester.requestJsonAndCheck( "POST", self._parentUrl(self._parentUrl(self.url)) + "/statuses/" + self.sha, input=post_parameters ) return github.CommitStatus.CommitStatus(self._requester, headers, data, completed=True) def get_comments(self): """ :calls: `GET /repos/:owner/:repo/commits/:sha/comments <http://developer.github.com/v3/repos/comments>`_ :rtype: :class:`github.PaginatedList.PaginatedList` of :class:`github.CommitComment.CommitComment` """ return github.PaginatedList.PaginatedList( github.CommitComment.CommitComment, self._requester, self.url + "/comments", None ) def get_statuses(self): """ :calls: `GET /repos/:owner/:repo/statuses/:ref <http://developer.github.com/v3/repos/statuses>`_ :rtype: :class:`github.PaginatedList.PaginatedList` of :class:`github.CommitStatus.CommitStatus` """ return github.PaginatedList.PaginatedList( github.CommitStatus.CommitStatus, self._requester, self._parentUrl(self._parentUrl(self.url)) + "/statuses/" + self.sha, None ) def get_combined_status(self): """ :calls: `GET /repos/:owner/:repo/commits/:ref/status/ <http://developer.github.com/v3/repos/statuses>`_ :rtype: :class:`github.CommitCombinedStatus.CommitCombinedStatus` """ headers, data = self._requester.requestJsonAndCheck( "GET", self.url + "/status" ) return github.CommitCombinedStatus.CommitCombinedStatus(self._requester, headers, data, completed=True) @property def _identity(self): return self.sha def _initAttributes(self): self._author = github.GithubObject.NotSet self._comments_url = github.GithubObject.NotSet self._commit = github.GithubObject.NotSet self._committer = github.GithubObject.NotSet self._files = github.GithubObject.NotSet self._html_url = github.GithubObject.NotSet self._parents = github.GithubObject.NotSet self._sha = github.GithubObject.NotSet self._stats = github.GithubObject.NotSet self._url = github.GithubObject.NotSet def _useAttributes(self, attributes): if "author" in attributes: # pragma no branch self._author = self._makeClassAttribute(github.NamedUser.NamedUser, attributes["author"]) if "comments_url" in attributes: # pragma no branch self._comments_url = self._makeStringAttribute(attributes["comments_url"]) if "commit" in attributes: # pragma no branch self._commit = self._makeClassAttribute(github.GitCommit.GitCommit, attributes["commit"]) if "committer" in attributes: # pragma no branch self._committer = self._makeClassAttribute(github.NamedUser.NamedUser, attributes["committer"]) if "files" in attributes: # pragma no branch self._files = self._makeListOfClassesAttribute(github.File.File, attributes["files"]) if "html_url" in attributes: # pragma no branch self._html_url = self._makeStringAttribute(attributes["html_url"]) if "parents" in attributes: # pragma no branch self._parents = self._makeListOfClassesAttribute(Commit, attributes["parents"]) if "sha" in attributes: # pragma no branch self._sha = self._makeStringAttribute(attributes["sha"]) if "stats" in attributes: # pragma no branch self._stats = self._makeClassAttribute(github.CommitStats.CommitStats, attributes["stats"]) if "url" in attributes: # pragma no branch self._url = self._makeStringAttribute(attributes["url"]) # -*- coding: utf-8 -*- # # Copyright (c) 2020, the cclib development team # # This file is part of cclib (http://cclib.github.io) and is distributed under # the terms of the BSD 3-Clause License. 
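# --- Hedged usage sketch (added for illustration; not part of the cclib source) ---
# The MOPAC parser defined below is normally driven through cclib's high-level
# reader. Assuming cclib is installed and "benzene_mopac.out" is a MOPAC output
# file (a hypothetical filename), usage would look roughly like:
#
#   from cclib.io import ccread
#   data = ccread("benzene_mopac.out")
#   print(data.natom, data.atomnos)    # atom count and atomic numbers
#   print(data.scfenergies)            # heats of formation, converted to eV
#   if hasattr(data, "vibfreqs"):
#       print(data.vibfreqs)           # vibrational frequencies in cm^-1
#
# The attribute names match those populated by MOPAC.extract() below.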
"""Parser for MOPAC output files""" # Based on parser in RMG-Py by Greg Magoon # https://github.com/ReactionMechanismGenerator/RMG-Py/blob/master/external/cclib/parser/mopacparser.py # Also parts from Ben Albrecht # https://github.com/ben-albrecht/cclib/blob/master/cclib/parser/mopacparser.py # Merged and modernized by Geoff Hutchison import re import math import numpy from cclib.parser import data from cclib.parser import logfileparser from cclib.parser import utils def symbol2int(symbol): t = utils.PeriodicTable() return t.number[symbol] class MOPAC(logfileparser.Logfile): """A MOPAC20XX output file.""" def __init__(self, *args, **kwargs): super().__init__(logname="MOPAC", *args, **kwargs) def __str__(self): """Return a string representation of the object.""" return "MOPAC log file %s" % (self.filename) def __repr__(self): """Return a representation of the object.""" return 'MOPAC("%s")' % (self.filename) def normalisesym(self, label): """MOPAC does not require normalizing symmetry labels.""" return label def before_parsing(self): #TODO # Defaults charge = 0 self.set_attribute('charge', charge) mult = 1 self.set_attribute('mult', mult) # Keep track of whether or not we're performing an # (un)restricted calculation. self.unrestricted = False self.is_rohf = False # Keep track of 1SCF vs. gopt since gopt is default self.onescf = False self.geomdone = False # Compile the dashes-and-or-spaces-only regex. self.re_dashes_and_spaces = re.compile(r'^[\s-]+$') self.star = ' * ' self.stars = ' *******************************************************************************' self.spinstate = {'SINGLET': 1, 'DOUBLET': 2, 'TRIPLET': 3, 'QUARTET': 4, 'QUINTET': 5, 'SEXTET': 6, 'HEPTET': 7, 'OCTET': 8, 'NONET': 9} def extract(self, inputfile, line): """Extract information from the file object inputfile.""" # Extract the package version. if "For non-commercial use only" in line: # Ignore the platorm information for now (the last character). self.metadata["package_version"] = line.split()[8][:-1] # Use the year as the legacy (short) package version. self.skip_lines( inputfile, ["Stewart Computational Chemistry", "s", "s", "s", "s"] ) self.metadata["legacy_package_version"] = next(inputfile).split()[1][5:] # Extract the atomic numbers and coordinates from the optimized geometry # note that cartesian coordinates section occurs multiple times in the file, and we want to end up using the last instance # also, note that the section labeled cartesian coordinates doesn't have as many decimal places as the one used here # Example 1 (not used): # CARTESIAN COORDINATES # # NO. ATOM X Y Z # # 1 O 4.7928 -0.8461 0.3641 # 2 O 5.8977 -0.3171 0.0092 # ... # Example 2 (used): # ATOM CHEMICAL X Y Z # NUMBER SYMBOL (ANGSTROMS) (ANGSTROMS) (ANGSTROMS) # # 1 O 4.79280259 * -0.84610232 * 0.36409474 * # 2 O 5.89768035 * -0.31706418 * 0.00917035 * # ... etc. if line.split() == ["NUMBER", "SYMBOL", "(ANGSTROMS)", "(ANGSTROMS)", "(ANGSTROMS)"]: self.updateprogress(inputfile, "Attributes", self.cupdate) self.inputcoords = [] self.inputatoms = [] blankline = inputfile.next() atomcoords = [] line = inputfile.next() while len(line.split()) > 6: # MOPAC Version 14.019L 64BITS suddenly appends this block with # "CARTESIAN COORDINATES" block with no blank line. 
tokens = line.split() self.inputatoms.append(symbol2int(tokens[1])) xc = float(tokens[2]) yc = float(tokens[4]) zc = float(tokens[6]) atomcoords.append([xc, yc, zc]) line = inputfile.next() self.inputcoords.append(atomcoords) if not hasattr(self, "natom"): self.atomnos = numpy.array(self.inputatoms, 'i') self.natom = len(self.atomnos) if 'CHARGE ON SYSTEM =' in line: charge = int(line.split()[5]) self.set_attribute('charge', charge) if 'SPIN STATE DEFINED' in line: # find the multiplicity from the line token (SINGLET, DOUBLET, TRIPLET, etc) mult = self.spinstate[line.split()[1]] self.set_attribute('mult', mult) # Read energy (in kcal/mol, converted to eV) # # FINAL HEAT OF FORMATION = -333.88606 KCAL = -1396.97927 KJ if 'FINAL HEAT OF FORMATION =' in line: if not hasattr(self, "scfenergies"): self.scfenergies = [] self.scfenergies.append(utils.convertor(utils.float(line.split()[5]), "kcal/mol", "eV")) # Molecular mass parsing (units will be amu) # # MOLECULAR WEIGHT == 130.1890 if line[0:35] == ' MOLECULAR WEIGHT =': self.molmass = utils.float(line.split()[3]) #rotational constants #Example: # ROTATIONAL CONSTANTS IN CM(-1) # # A = 0.01757641 B = 0.00739763 C = 0.00712013 # could also read in moment of inertia, but this should just differ by a constant: rot cons= h/(8*Pi^2*I) # note that the last occurence of this in the thermochemistry section has reduced precision, # so we will want to use the 2nd to last instance if line[0:40] == ' ROTATIONAL CONSTANTS IN CM(-1)': blankline = inputfile.next() rotinfo = inputfile.next() if not hasattr(self, "rotcons"): self.rotcons = [] broken = rotinfo.split() # leave the rotational constants in Hz a = float(broken[2]) b = float(broken[5]) c = float(broken[8]) self.rotcons.append([a, b, c]) # Start of the IR/Raman frequency section. # Example: # VIBRATION 1 1A ATOM PAIR ENERGY CONTRIBUTION RADIAL # FREQ. 15.08 C 12 -- C 16 +7.9% (999.0%) 0.0% # T-DIPOLE 0.2028 C 16 -- H 34 +5.8% (999.0%) 28.0% # TRAVEL 0.0240 C 16 -- H 32 +5.6% (999.0%) 35.0% # RED. MASS 1.7712 O 1 -- O 4 +5.2% (999.0%) 0.4% # EFF. MASS7752.8338 # # VIBRATION 2 2A ATOM PAIR ENERGY CONTRIBUTION RADIAL # FREQ. 42.22 C 11 -- C 15 +9.0% (985.8%) 0.0% # T-DIPOLE 0.1675 C 15 -- H 31 +6.6% (843.6%) 3.3% # TRAVEL 0.0359 C 15 -- H 29 +6.0% (802.8%) 24.5% # RED. MASS 1.7417 C 13 -- C 17 +5.8% (792.7%) 0.0% # EFF. MASS1242.2114 if line[1:10] == 'VIBRATION': self.updateprogress(inputfile, "Frequency Information", self.fupdate) # get the vib symmetry if len(line.split()) >= 3: sym = line.split()[2] if not hasattr(self, 'vibsyms'): self.vibsyms = [] self.vibsyms.append(sym) line = inputfile.next() if 'FREQ' in line: if not hasattr(self, 'vibfreqs'): self.vibfreqs = [] freq = float(line.split()[1]) self.vibfreqs.append(freq) line = inputfile.next() if 'T-DIPOLE' in line: if not hasattr(self, 'vibirs'): self.vibirs = [] tdipole = float(line.split()[1]) # transform to km/mol self.vibirs.append(math.sqrt(tdipole)) line = inputfile.next() if 'TRAVEL' in line: pass line = inputfile.next() if 'RED. MASS' in line: if not hasattr(self, 'vibrmasses'): self.vibrmasses = [] rmass = float(line.split()[2]) self.vibrmasses.append(rmass) # Orbital eigenvalues, e.g. 
        # ALPHA EIGENVALUES
        # BETA EIGENVALUES
        # or just "EIGENVALUES" for closed-shell
        if 'EIGENVALUES' in line:
            if not hasattr(self, 'moenergies'):
                self.moenergies = []  # list of arrays
            energies = []
            line = inputfile.next()
            while len(line.split()) > 0:
                energies.extend([float(i) for i in line.split()])
                line = inputfile.next()
            self.moenergies.append(energies)

        # todo:
        # Partial charges and dipole moments
        # Example:
        # NET ATOMIC CHARGES

        if line[:16] == '== MOPAC DONE ==':
            self.metadata['success'] = True


from __future__ import print_function, unicode_literals

import re
from decimal import Decimal as D

from aspen import Response
import pytest

from gratipay.security.user import SESSION
from gratipay.testing import Harness
from gratipay.wireup import find_files


overescaping_re = re.compile(r'&amp;(#[0-9]{4}|[a-z]+);')


class TestPages(Harness):

    def browse(self, setup=None, **kw):
        alice = self.make_participant('alice', claimed_time='now', number='plural')
        exchange_id = self.make_exchange('balanced-cc', 19, 0, alice)
        alice.insert_into_communities(True, 'Wonderland', 'wonderland')
        alan = self.make_participant('alan', claimed_time='now')
        alice.add_member(alan)
        if setup:
            setup(alice)
        i = len(self.client.www_root)
        urls = []
        for spt in find_files(self.client.www_root, '*.spt'):
            url = spt[i:-4].replace('/%team/', '/alice/') \
                           .replace('/alice/%sub', '/alice/foo') \
                           .replace('/~/%username/', '/~alice/') \
                           .replace('/for/%slug/', '/for/wonderland/') \
                           .replace('/%platform/', '/github/') \
                           .replace('/%user_name/', '/gratipay/') \
                           .replace('/%membername', '/alan') \
                           .replace('/%exchange_id.int', '/%s' % exchange_id) \
                           .replace('/%redirect_to', '/giving') \
                           .replace('/%endpoint', '/public') \
                           .replace('/about/me/%sub', '/about/me')
            assert '/%' not in url
            if 'index' in url.split('/')[-1]:
                url = url.rsplit('/', 1)[0] + '/'
            urls.append(url)
        urls.extend("""
            /about/me
            /about/me/
            /about/me/history
        """.split())
        for url in urls:
            try:
                r = self.client.GET(url, **kw)
            except Response as r:
                if r.code == 404 or r.code >= 500:
                    raise
            assert r.code != 404
            assert r.code < 500
            assert not overescaping_re.search(r.body.decode('utf8'))

    def test_anon_can_browse(self):
        self.browse()

    def test_new_participant_can_browse(self):
        self.browse(auth_as='alice')

    def test_on_the_fence_can_browse(self):
        def setup(alice):
            bob = self.make_participant('bob', claimed_time='now', last_bill_result='')
            bob.set_tip_to(alice, D('1.00'))
        self.browse(setup, auth_as='alice')

    def test_escaping_on_homepage(self):
        self.make_participant('alice', claimed_time='now')
        expected = "<a href='/alice/'>"
        actual = self.client.GET('/', auth_as='alice').body
        assert expected in actual

    @pytest.mark.xfail(reason="migrating to Teams; #3399")
    def test_username_is_in_button(self):
        self.make_participant('alice', claimed_time='now')
        self.make_participant('bob', claimed_time='now')
        body = self.client.GET('/~alice/', auth_as='bob').body
        assert '<span class="zero">Give to alice</span>' in body

    @pytest.mark.xfail(reason="migrating to Teams; #3399")
    def test_username_is_in_unauth_giving_cta(self):
        self.make_participant('alice', claimed_time='now')
        body = self.client.GET('/~alice/').body
        assert 'give to alice' in body

    def test_widget(self):
        self.make_participant('cheese', claimed_time='now')
        expected = "javascript: window.open"
        actual = self.client.GET('/~cheese/widget.html').body
        assert expected in actual

    def test_github_associate(self):
        assert self.client.GxT('/on/github/associate').code == 400

    def test_twitter_associate(self):
        assert self.client.GxT('/on/twitter/associate').code == 400
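    # --- Hedged illustration (added; not part of the original test suite) ---
    # The module-level `overescaping_re` used by browse() is meant to catch HTML
    # entities that were escaped twice, which would render as literal "&#8212;" or
    # "&nbsp;" text in the page instead of the intended character. A quick sanity
    # check of the pattern, kept in comments:
    #
    #   assert overescaping_re.search('5 &amp;#8212; 10') is not None   # double-escaped numeric entity
    #   assert overescaping_re.search('Alice &amp;amp; Bob') is not None  # double-escaped named entity
    #   assert overescaping_re.search('Alice &amp; Bob') is None          # single escape is fine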
def test_about(self): expected = "give money every week" actual = self.client.GET('/about/').body assert expected in actual def test_about_stats(self): expected = "have joined Gratipay" actual = self.client.GET('/about/stats.html').body assert expected in actual def test_about_charts(self): assert self.client.GxT('/about/charts.html').code == 302 def test_about_faq(self): expected = "What is Gratipay?" actual = self.client.GET('/about/faq.html').body.decode('utf8') assert expected in actual def test_about_teams_redirect(self): assert self.client.GxT('/about/teams/').code == 302 def test_about_teams(self): expected = "Teams" actual = self.client.GET('/about/features/teams/').body.decode('utf8') assert expected in actual def test_404(self): response = self.client.GET('/about/four-oh-four.html', raise_immediately=False) assert "Not Found" in response.body assert "{%" not in response.body def test_for_contributors_redirects_to_inside_gratipay(self): loc = self.client.GxT('/for/contributors/').headers['Location'] assert loc == 'http://inside.gratipay.com/' def test_mission_statement_also_redirects(self): assert self.client.GxT('/for/contributors/mission-statement.html').code == 302 def test_anonymous_sign_out_redirects(self): response = self.client.PxST('/sign-out.html') assert response.code == 302 assert response.headers['Location'] == '/' def test_sign_out_overwrites_session_cookie(self): self.make_participant('alice') response = self.client.PxST('/sign-out.html', auth_as='alice') assert response.code == 302 assert response.headers.cookie[SESSION].value == '' def test_sign_out_doesnt_redirect_xhr(self): self.make_participant('alice') response = self.client.PxST('/sign-out.html', auth_as='alice', HTTP_X_REQUESTED_WITH=b'XMLHttpRequest') assert response.code == 200 def test_settings_page_available_balance(self): self.make_participant('alice', claimed_time='now') self.db.run("UPDATE participants SET balance = 123.00 WHERE username = 'alice'") actual = self.client.GET("/~alice/settings/", auth_as="alice").body expected = "123" assert expected in actual def test_subscriptions_page(self): self.make_team(is_approved=True) alice = self.make_participant('alice', claimed_time='now') alice.set_subscription_to('TheATeam', "1.00") assert "The A Team" in self.client.GET("/~alice/subscriptions/", auth_as="alice").body def test_giving_page_shows_cancelled(self): self.make_team(is_approved=True) alice = self.make_participant('alice', claimed_time='now') alice.set_subscription_to('TheATeam', "1.00") alice.set_subscription_to('TheATeam', "0.00") assert "Cancelled" in self.client.GET("/~alice/subscriptions/", auth_as="alice").body def test_new_participant_can_edit_profile(self): self.make_participant('alice', claimed_time='now') body = self.client.GET("/~alice/", auth_as="alice").body assert b'Edit' in body def test_tilde_slash_redirects_to_tilde(self): self.make_participant('alice', claimed_time='now') response = self.client.GxT("/~/alice/", auth_as="alice") assert response.code == 302 assert response.headers['Location'] == '/~alice/' def test_tilde_slash_redirects_subpages_with_querystring_to_tilde(self): self.make_participant('alice', claimed_time='now') response = self.client.GxT("/~/alice/foo/bar?baz=buz", auth_as="alice") assert response.code == 302 assert response.headers['Location'] == '/~alice/foo/bar?baz=buz' def test_username_redirected_to_tilde(self): self.make_participant('alice', claimed_time='now') response = self.client.GxT("/alice/", auth_as="alice") assert response.code == 302 assert 
response.headers['Location'] == '/~alice/' def test_username_redirects_everything_to_tilde(self): self.make_participant('alice', claimed_time='now') response = self.client.GxT("/alice/foo/bar?baz=buz", auth_as="alice") assert response.code == 302 assert response.headers['Location'] == '/~alice/foo/bar?baz=buz' def test_team_slug__not__redirected_from_tilde(self): self.make_team(is_approved=True) assert self.client.GET("/TheATeam/").code == 200 assert self.client.GxT("/~TheATeam/").code == 404 from django.contrib import admin from django.contrib.admin import SimpleListFilter from django.db import models from django.db.models.fields import CharField, TextField from django.forms import Textarea, ModelForm from import_export.admin import ImportExportModelAdmin from solo.admin import SingletonModelAdmin from .models import Audiologist from .models import AudiologistResource from .models import Client from .models import ClientResource from .models import MeetingLog from .models import MeetingLogResource from .models import Provider from .models import ProviderResource from .models import IncomeSource from .models import Settings from .models import Grantor from .models import GrantorResource standard_textarea = Textarea(attrs={'rows': 3, 'cols': 40, 'style': 'height: 3.6em;'}) class DeleteNotAllowedModelAdmin(admin.ModelAdmin): def has_delete_permission(self, request, obj=None): return request.user.is_superuser class AudiologistCurrentFilter(SimpleListFilter): ''' Custom filter that defaults to "current" == True ''' title = 'Status' parameter_name = 'current' def lookups(self, request, model_admin): return ( ('a', 'All audiologists'), ('y', 'Current'), ('n', 'Inactive'), ) def queryset(self, request, queryset): if self.value() == 'a': return queryset.filter() url_val_map = { 'y': True, 'n': False, None: True, } val = url_val_map[self.value()] return queryset.filter(current=val) def choices(self, cl, *a, **kw): yield { 'selected': self.value() is None or self.value == 'y', 'query_string': cl.get_query_string({}, [self.parameter_name]), 'display': 'Current', } yield { 'selected': self.value() == 'n', 'query_string': cl.get_query_string({self.parameter_name: 'n'}, []), 'display': 'Inactive', } yield { 'selected': self.value() == 'a', 'query_string': cl.get_query_string({self.parameter_name: 'a'}, []), 'display': 'All', } class AudiologistAdmin(DeleteNotAllowedModelAdmin, ImportExportModelAdmin): list_display = ('name', 'allowed', 'current') list_filter = (AudiologistCurrentFilter,) ordering = ('name',) resource_class = AudiologistResource formfield_overrides = { models.TextField: { 'widget': standard_textarea, }, } class ClientIncomeInlineAdmin(admin.TabularInline): model = IncomeSource can_delete = True extra = 1 class MeetingLogInlineAdminForm(ModelForm): class Meta: model = MeetingLog fields = '__all__' widgets = { 'results': standard_textarea, } class MeetingLogInlineAdmin(admin.TabularInline): model = MeetingLog form = MeetingLogInlineAdminForm can_delete = True extra = 1 class DateYesNoFilter(SimpleListFilter): def lookups(self, request, model_admin): return ( ('y', 'Yes'), ('n', 'No'), ) def queryset(self, request, queryset): query = {} if self.value() == 'y': query = {self.field_name + '__isnull': False} elif self.value() == 'n': query = {self.field_name + '__isnull': True} return queryset.filter(**query) class DeceasedFilter(DateYesNoFilter): title = 'Deceased' parameter_name = 'deceased' field_name = 'date_of_death' class CostShareApprovedFilter(DateYesNoFilter): title = 'Cost Share 
Approved' parameter_name = 'cost_share_approved' field_name = 'cost_share_approval' class UpdateMeetingFilter(DateYesNoFilter): title = 'Had Update Meeting' parameter_name = 'update_meeting' field_name = 'update_meeting' class ProviderAuthReqFilter(DateYesNoFilter): title = 'Provider Auth Requested' parameter_name = 'provider_auth_requested' field_name = 'provider_auth_requested' class ProviderAuthRecvFilter(DateYesNoFilter): title = 'Provider Auth Rcvd' parameter_name = 'provider_auth_received' field_name = 'provider_auth_received' class AudiologistReferredFilter(DateYesNoFilter): title = 'Audiologist Referred' parameter_name = 'audiologist_referral_date' field_name = 'audiologist_referral_date' class AudiologistApptFilter(DateYesNoFilter): title = 'Audiologist Appt Set' parameter_name = 'audiologist_appointment_date' field_name = 'audiologist_appointment_date' class AudiologistInvoicedFilter(DateYesNoFilter): title = 'Audiologist Invoiced' parameter_name = 'audiologist_invoiced_date' field_name = 'audiologist_invoiced_date' class ClientAdmin(ImportExportModelAdmin): resource_class = ClientResource list_display = ('last_name', 'first_name', 'intake_date', 'last_updated', 'hearing_loss', 'audiologist', 'client_grantors', 'cost_share', 'cost_share_approval') list_display_links = ('last_name', 'first_name',) list_filter = ('provider', 'audiologist', 'grantors', 'family_size', 'hearing_loss', DeceasedFilter, CostShareApprovedFilter, UpdateMeetingFilter, 'update_meeting', ProviderAuthReqFilter, ProviderAuthRecvFilter, AudiologistReferredFilter, AudiologistApptFilter, AudiologistInvoicedFilter, 'equipment_requested', 'adaptive_equipment', 'hearing_aid_assistance', 'last_updated', 'quota_client', 'deliverable', 'non_kcsm', 'intake_staff', 'data_entry_staff') ordering = ('-intake_date',) date_hierarchy = 'intake_date' search_fields = [f.name for f in Client._meta.local_fields if isinstance(f, (CharField, TextField))] formfield_overrides = { models.TextField: { 'widget': standard_textarea, }, } inlines = (ClientIncomeInlineAdmin,MeetingLogInlineAdmin) readonly_fields = ('id', 'last_updated') fieldsets = ( (None, { 'fields': ( 'id', 'napis_id', ) }), ('Personal Info', { 'fields': ( 'first_name', 'last_name', 'gender', 'date_of_birth', 'date_of_death', 'is_veteran', 'lives_alone', 'spouse', 'family_size', ) }), ('Contact', { 'fields': ( 'address', 'city', 'county', 'state', 'zip_code', 'deliverable', 'email', 'phone', 'emergency_contact', 'emergency_phone', ) }), ('Notes', { 'fields': ( 'notes', ) }), ('Demographics', { 'fields': ( 'race', 'is_hispanic', 'multiracial', 'multiracial_white', 'multiracial_black', 'multiracial_asian', 'multiracial_amind', ) }), ('Assistance', { 'fields': ( 'hearing_loss', 'aids_requested_left', 'aids_requested_right', 'equipment_requested', 'hearing_assistance', 'adaptive_equipment', 'hearing_aid_assistance', 'equipment_borrowed', ) }), ('Additional Forms', { 'fields': ( 'proof_of_age', 'signed_client_intake', 'signed_disclosure_authorization', 'signed_confidentiality_policy', 'signed_gross_annual_income', 'signed_client_responsibility_fees' ) }), ('DHHS', { 'fields': ( 'intake_date', 'intake_staff', 'data_entry_staff', 'last_updated', 'referrer', 'update_meeting', 'cost_share_approval', 'cost_share', 'quota_client', 'non_kcsm', 'grantors', 'provider', 'audient_id', 'provider_auth_requested', 'provider_auth_received', ) }), ('Audiologist', { 'fields': ( 'audiologist', 'audiologist_referral_date', 'audiologist_appointment_date', 'audiologist_invoiced_date', 
'audiologist_invoiced_amount', ) }), ) class MeetingLogAdmin(ImportExportModelAdmin): resource_class = MeetingLogResource list_display = ('client', 'contact_date', 'consultation_time', 'paperwork_time', 'units', 'results', 'user') list_display_links = ('contact_date',) list_filter = ('client', 'contact_date', 'user') ordering = ('-contact_date',) date_hierarchy = 'contact_date' formfield_overrides = { models.TextField: { 'widget': standard_textarea, }, } def units(self, obj): return (obj.consultation_time + obj.paperwork_time) / 60 class ProviderAdmin(ImportExportModelAdmin): ordering = ('name',) resource_class = ProviderResource formfield_overrides = { models.TextField: { 'widget': standard_textarea, }, } class GrantorAdmin(ImportExportModelAdmin): ordering = ('name',) resource_class = GrantorResource formfield_overrides = { models.TextField: { 'widget': standard_textarea, }, } admin.site.disable_action('delete_selected') admin.site.site_header = 'Deaf & Hard of Hearing Services - ADAPT' admin.site.site_title = 'ADAPT' admin.site.site_url = None admin.site.index_title = '' admin.site.register(Audiologist, AudiologistAdmin) admin.site.register(Client, ClientAdmin) admin.site.register(Provider, ProviderAdmin) admin.site.register(Grantor, GrantorAdmin) admin.site.register(MeetingLog, MeetingLogAdmin) admin.site.register(Settings, SingletonModelAdmin) """ Tests for both experiment.py and experiment_set.py """ import pytest from snovault import TYPES # from snovault.storage import UUID from uuid import uuid4 from ..types.experiment import ExperimentHiC pytestmark = [pytest.mark.setone, pytest.mark.working] @pytest.fixture def custom_experiment_set_data(lab, award): return { 'lab': lab['@id'], 'award': award['@id'], 'description': 'test experiment set', 'experimentset_type': 'custom', 'status': 'in review by lab' } @pytest.fixture def custom_experiment_set(testapp, custom_experiment_set_data): return testapp.post_json('/experiment_set', custom_experiment_set_data).json['@graph'][0] @pytest.fixture def replicate_experiment_set_data(lab, award): return { 'lab': lab['@id'], 'award': award['@id'], 'description': 'test replicate set', 'experimentset_type': 'replicate', 'status': 'in review by lab' } @pytest.fixture def replicate_experiment_set(testapp, replicate_experiment_set_data): return testapp.post_json('/experiment_set_replicate', replicate_experiment_set_data).json['@graph'][0] @pytest.fixture def sop_map_data(protocol, lab, award): return { "sop_name": "in situ Hi-C SOP map", "sop_version": 1, 'lab': lab['@id'], 'award': award['@id'], "associated_item_type": "ExperimentHiC", "id_values": ["in situ Hi-C"], "notes": "This is just a dummy insert not linked to true SOP protocol", "description": "Fields with specified defaults in the SOP for in situ Hi-C experiments as per ??", "sop_protocol": protocol['@id'], "fields_with_default": [ {"field_name": "digestion_enzyme", "field_value": "MboI"}, ] } @pytest.fixture def sop_map_data_2(lab, award): return { "sop_name": "Second in situ hic map", "sop_version": 2, 'lab': lab['@id'], 'award': award['@id'], "associated_item_type": "ExperimentHiC", "id_values": ["in situ Hi-C"], "notes": "This is a dummy second version of map", "description": "Second", } def test_experiment_update_experiment_relation(testapp, base_experiment, experiment): relation = [{'relationship_type': 'controlled by', 'experiment': experiment['@id']}] res = testapp.patch_json(base_experiment['@id'], {'experiment_relation': relation}) assert res.json['@graph'][0]['experiment_relation'] 
== relation # patching an experiement should also update the related experiement exp_res = testapp.get(experiment['@id']) exp_res_id = exp_res.json['experiment_relation'][0]['experiment']['@id'] assert exp_res_id == base_experiment['@id'] def test_experiment_update_hic_sop_mapping_added_on_submit(testapp, experiment_data, sop_map_data): res_sop = testapp.post_json('/sop_map', sop_map_data, status=201) res_exp = testapp.post_json('/experiment_hi_c', experiment_data) assert 'sop_mapping' in res_exp.json['@graph'][0] assert res_exp.json['@graph'][0]['sop_mapping']['has_sop'] == "Yes" assert res_exp.json['@graph'][0]['sop_mapping']['sop_map'] == res_sop.json['@graph'][0]['@id'] def test_experiment_update_hic_sop_mapping_has_map_is_no(testapp, experiment_data, exp_types): experiment_data['experiment_type'] = exp_types['dnase']['@id'] res_exp = testapp.post_json('/experiment_hi_c', experiment_data) assert 'sop_mapping' in res_exp.json['@graph'][0] assert res_exp.json['@graph'][0]['sop_mapping']['has_sop'] == "No" def test_experiment_update_hic_sop_mapping_has_sop2no_when_only_sopmap_deleted( testapp, experiment_data, sop_map_data): sop_map_data['status'] = 'deleted' testapp.post_json('/sop_map', sop_map_data, status=201) res_exp = testapp.post_json('/experiment_hi_c', experiment_data) assert 'sop_mapping' in res_exp.json['@graph'][0] assert res_exp.json['@graph'][0]['sop_mapping']['has_sop'] == "No" def test_experiment_update_hic_sop_mapping_to_v2_when_2_versions( testapp, experiment_data, sop_map_data, sop_map_data_2): testapp.post_json('/sop_map', sop_map_data, status=201) res2chk = testapp.post_json('/sop_map', sop_map_data_2, status=201) res_exp = testapp.post_json('/experiment_hi_c', experiment_data) assert 'sop_mapping' in res_exp.json['@graph'][0] assert res_exp.json['@graph'][0]['sop_mapping']['has_sop'] == "Yes" assert res_exp.json['@graph'][0]['sop_mapping']['sop_map'] == res2chk.json['@graph'][0]['@id'] def test_experiment_update_hic_sop_mapping_to_v1_when_v2_deleted( testapp, experiment_data, sop_map_data, sop_map_data_2): res2chk = testapp.post_json('/sop_map', sop_map_data, status=201) sop_map_data_2['status'] = 'deleted' testapp.post_json('/sop_map', sop_map_data_2, status=201) res_exp = testapp.post_json('/experiment_hi_c', experiment_data) assert 'sop_mapping' in res_exp.json['@graph'][0] assert res_exp.json['@graph'][0]['sop_mapping']['has_sop'] == "Yes" assert res_exp.json['@graph'][0]['sop_mapping']['sop_map'] == res2chk.json['@graph'][0]['@id'] def test_experiment_update_hic_sop_map_not_added_when_already_present(testapp, experiment_data): experiment_data['sop_mapping'] = {} experiment_data['sop_mapping']['has_sop'] = 'No' res = testapp.post_json('/experiment_hi_c', experiment_data) assert 'sop_mapping' in res.json['@graph'][0] assert res.json['@graph'][0]['sop_mapping']['has_sop'] == "No" assert 'sop_map' not in res.json['@graph'][0]['sop_mapping'] def test_calculated_experiment_summary(testapp, experiment, mboI): summary = 'in situ Hi-C on GM12878 with MboI' res = testapp.patch_json(experiment['@id'], {'digestion_enzyme': mboI['@id']}, status=200) assert res.json['@graph'][0]['experiment_summary'] == summary assert summary in res.json['@graph'][0]['display_title'] def test_experiment_summary_repliseq(repliseq_4): assert repliseq_4.get('experiment_summary') == '2-stage Repli-seq on GM12878 S-phase early' # test for experiment_set_replicate _update function def test_experiment_set_replicate_update_adds_experiments_in_set(testapp, experiment, replicate_experiment_set): assert 
not replicate_experiment_set['experiments_in_set'] res = testapp.patch_json( replicate_experiment_set['@id'], {'replicate_exps': [{'replicate_exp': experiment['@id'], 'bio_rep_no': 1, 'tec_rep_no': 1}]}, status=200) assert experiment['@id'] in res.json['@graph'][0]['experiments_in_set'] # test for default_embedding practice with embedded list # this test should change should any of the reference embeds below be altered def test_experiment_set_default_embedded_list(registry, exp_types): exp_data = { 'experiment_type': exp_types['microc']['uuid'], 'status': 'in review by lab' } # create experimentHiC obj; _update (and by extension, add_default_embeds) # are called automatically test_exp = ExperimentHiC.create(registry, None, exp_data) # call reify embedded property (defined in snovault/resources.py) embedded = test_exp.embedded embedded_list = test_exp.embedded_list type_info_embedded = registry[TYPES]['experiment_hi_c'].embedded_list assert type_info_embedded == embedded_list if 'produced_in_pub.*' in embedded_list: assert 'produced_in_pub.*' in embedded assert 'produced_in_pub.award.@id' in embedded assert 'produced_in_pub.award.@type' in embedded assert 'produced_in_pub.award.principals_allowed.*' in embedded assert 'produced_in_pub.award.display_title' in embedded assert 'produced_in_pub.award.uuid' in embedded assert 'experiment_sets.accession' in embedded_list assert 'experiment_sets.@id' in embedded assert 'experiment_sets.@type' in embedded assert 'experiment_sets.principals_allowed.*' in embedded assert 'experiment_sets.display_title' in embedded assert 'experiment_sets.uuid' in embedded # tests for the experiment_sets calculated properties def test_calculated_experiment_sets_for_custom_experiment_set(testapp, experiment, custom_experiment_set): assert len(experiment['experiment_sets']) == 0 res = testapp.patch_json(custom_experiment_set['@id'], {'experiments_in_set': [experiment['@id']]}, status=200) expt_res = testapp.get(experiment['@id']) assert custom_experiment_set['uuid'] == expt_res.json['experiment_sets'][0]['uuid'] def test_calculated_experiment_sets_for_replicate_experiment_set(testapp, experiment, replicate_experiment_set): assert len(experiment['experiment_sets']) == 0 res = testapp.patch_json( replicate_experiment_set['@id'], {'replicate_exps': [{'replicate_exp': experiment['@id'], 'bio_rep_no': 1, 'tec_rep_no': 1}]}, status=200) expt_res = testapp.get(experiment['@id']) assert replicate_experiment_set['uuid'] == expt_res.json['experiment_sets'][0]['uuid'] @pytest.fixture def pub1_data(lab, award): # encode paper published 2012-09-06 return { 'award': award['@id'], 'lab': lab['@id'], 'ID': "PMID:22955616" } @pytest.fixture def pub2_data(lab, award): # Sanborn et al paper published 2015-11-24 return { 'award': award['@id'], 'lab': lab['@id'], 'ID': "PMID:26499245" } def test_calculated_produced_in_pub_for_rep_experiment_set(testapp, replicate_experiment_set, pub1_data): # post single rep_exp_set to single pub pub1_data['exp_sets_prod_in_pub'] = [replicate_experiment_set['@id']] pub1res = testapp.post_json('/publication', pub1_data, status=201) expsetres = testapp.get(replicate_experiment_set['@id']) assert 'produced_in_pub' in expsetres assert '/publications/' + pub1res.json['@graph'][0]['uuid'] + '/' in expsetres.json['produced_in_pub'].values() def test_calculated_produced_in_pub_for_cust_experiment_set(testapp, custom_experiment_set, pub1_data): # post single cust_exp_set to single pub pub1_data['exp_sets_prod_in_pub'] = [custom_experiment_set['@id']] pub1res = 
testapp.post_json('/publication', pub1_data, status=201) expsetres = testapp.get(custom_experiment_set['@id']) assert 'produced_in_pub' in expsetres assert '/publications/' + pub1res.json['@graph'][0]['uuid'] + '/' in expsetres.json['produced_in_pub'].values() def test_calculated_produced_in_pub_for_two_experiment_set_to_one_pub( testapp, replicate_experiment_set, custom_experiment_set, pub1_data): # post two exp_set to single pub pub1_data['exp_sets_prod_in_pub'] = [replicate_experiment_set['@id'], custom_experiment_set['@id']] pub1res = testapp.post_json('/publication', pub1_data, status=201) responses = [testapp.get(replicate_experiment_set['@id']), testapp.get(custom_experiment_set['@id'])] for response in responses: assert 'produced_in_pub' in response assert '/publications/' + pub1res.json['@graph'][0]['uuid'] + '/' == response.json['produced_in_pub']['@id'] def test_calculated_produced_in_pub_for_two_experiment_set_two_pubs( testapp, replicate_experiment_set, custom_experiment_set, pub1_data, pub2_data): # post different exp_set to each pub pub1_data['exp_sets_prod_in_pub'] = [replicate_experiment_set['@id']] pub2_data['exp_sets_prod_in_pub'] = [custom_experiment_set['@id']] pub1res = testapp.post_json('/publication', pub1_data, status=201) pub2res = testapp.post_json('/publication', pub2_data, status=201) responses = [testapp.get(replicate_experiment_set['@id']), testapp.get(custom_experiment_set['@id'])] for response in responses: assert 'produced_in_pub' in response assert '/publications/' + pub1res.json['@graph'][0]['uuid'] + '/' == responses[0].json['produced_in_pub']['@id'] assert '/publications/' + pub2res.json['@graph'][0]['uuid'] + '/' == responses[1].json['produced_in_pub']['@id'] def test_calculated_produced_in_pub_for_one_experiment_set_two_pubs( testapp, replicate_experiment_set, pub1_data, pub2_data): # post one exp_set to two pubs - this one should pick up only the most recent pub pub1_data['exp_sets_prod_in_pub'] = [replicate_experiment_set['@id']] pub2_data['exp_sets_prod_in_pub'] = [replicate_experiment_set['@id']] pub1res = testapp.post_json('/publication', pub1_data, status=201) pub2res = testapp.post_json('/publication', pub2_data, status=201) response = testapp.get(replicate_experiment_set['@id']) assert 'produced_in_pub' in response assert not '/publications/' + pub1res.json['@graph'][0]['uuid'] + '/' == response.json['produced_in_pub']['@id'] assert '/publications/' + pub2res.json['@graph'][0]['uuid'] + '/' == response.json['produced_in_pub']['@id'] def test_calculated_publications_in_experiment_set_no_data( testapp, replicate_experiment_set, custom_experiment_set, pub1_data): pub1res = testapp.post_json('/publication', pub1_data, status=201) print(replicate_experiment_set) print(custom_experiment_set) assert not replicate_experiment_set['publications_of_set'] assert not custom_experiment_set['publications_of_set'] def test_calculated_publications_in_rep_experiment_set_2_fields( testapp, replicate_experiment_set, pub1_data): # post single rep_exp_set to single pub both fields pub1_data['exp_sets_prod_in_pub'] = [replicate_experiment_set['@id']] pub1_data['exp_sets_used_in_pub'] = [replicate_experiment_set['@id']] pub1res = testapp.post_json('/publication', pub1_data, status=201) response = testapp.get(replicate_experiment_set['@id']) print(response) print('JSON:', response.json) assert 'publications_of_set' in response assert len(response.json['publications_of_set']) == 1 assert '/publications/' + pub1res.json['@graph'][0]['uuid'] + '/' in 
response.json['publications_of_set'][0].values() def test_calculated_publications_in_cust_experiment_set_used_in_field( testapp, custom_experiment_set, pub1_data): # post only used in publication one pub one exp set pub1_data['exp_sets_used_in_pub'] = [custom_experiment_set['@id']] pub1res = testapp.post_json('/publication', pub1_data, status=201) response = testapp.get(custom_experiment_set['@id']) assert 'publications_of_set' in response assert len(response.json['publications_of_set']) == 1 assert '/publications/' + pub1res.json['@graph'][0]['uuid'] + '/' in response.json['publications_of_set'][0].values() def test_calculated_publications_in_rep_experiment_set_two_pubs_both_fields( testapp, replicate_experiment_set, pub1_data, pub2_data): # post same experiment set to two pubs in either field pub1_data['exp_sets_prod_in_pub'] = [replicate_experiment_set['@id']] pub2_data['exp_sets_used_in_pub'] = [replicate_experiment_set['@id']] pub1res = testapp.post_json('/publication', pub1_data, status=201) pub2res = testapp.post_json('/publication', pub2_data, status=201) response = testapp.get(replicate_experiment_set['@id']) assert 'publications_of_set' in response assert len(response.json['publications_of_set']) == 2 publications = response.json['publications_of_set'] combined_pub_vals = [p['@id'] for p in publications] assert '/publications/' + pub1res.json['@graph'][0]['uuid'] + '/' in combined_pub_vals assert '/publications/' + pub2res.json['@graph'][0]['uuid'] + '/' in combined_pub_vals def test_calculated_publications_in_rep_experiment_set_two_pubs_in_used( testapp, replicate_experiment_set, pub1_data, pub2_data): # post same experiment set to two pubs in used in pub field pub1_data['exp_sets_used_in_pub'] = [replicate_experiment_set['@id']] pub2_data['exp_sets_used_in_pub'] = [replicate_experiment_set['@id']] pub1res = testapp.post_json('/publication', pub1_data, status=201) pub2res = testapp.post_json('/publication', pub2_data, status=201) response = testapp.get(replicate_experiment_set['@id']) assert 'publications_of_set' in response assert len(response.json['publications_of_set']) == 2 publications = response.json['publications_of_set'] combined_pub_vals = list(publications[0].values()) + list(publications[1].values()) assert '/publications/' + pub1res.json['@graph'][0]['uuid'] + '/' in combined_pub_vals assert '/publications/' + pub2res.json['@graph'][0]['uuid'] + '/' in combined_pub_vals # experiment pub calculated properties tests @pytest.fixture def repset_w_exp1(testapp, replicate_experiment_set_data, experiment): repset = replicate_experiment_set_data repset['replicate_exps'] = [{'replicate_exp': experiment['@id'], 'bio_rep_no': 1, 'tec_rep_no': 1}] return testapp.post_json('/experiment_set_replicate', repset).json['@graph'][0] @pytest.fixture def experiment2(testapp, experiment_data, exp_types): experiment_data['experiment_type'] = exp_types['capc']['@id'] return testapp.post_json('/experiment_capture_c', experiment_data).json['@graph'][0] @pytest.fixture def custset_w_exp1(testapp, custom_experiment_set_data, experiment): custset = custom_experiment_set_data custset['experiments_in_set'] = [experiment['@id']] return testapp.post_json('/experiment_set', custset).json['@graph'][0] @pytest.fixture def custset_w_exp2(testapp, custom_experiment_set_data, experiment2): custset = custom_experiment_set_data custset['experiments_in_set'] = [experiment2['@id']] return testapp.post_json('/experiment_set', custset).json['@graph'][0] def 
test_calculated_expt_produced_in_pub_for_rep_experiment_set( testapp, repset_w_exp1, pub1_data): # post single rep_exp_set to single pub pub1_data['exp_sets_prod_in_pub'] = [repset_w_exp1['@id']] pub1res = testapp.post_json('/publication', pub1_data, status=201) expres = testapp.get(repset_w_exp1['replicate_exps'][0]['replicate_exp']) # import pdb; pdb.set_trace() assert 'produced_in_pub' in expres assert '/publications/' + pub1res.json['@graph'][0]['uuid'] + '/' == expres.json['produced_in_pub']['@id'] def test_calculated_expt_produced_in_pub_for_expt_w_ref( testapp, experiment_data, replicate_experiment_set_data, pub2_data, publication): experiment_data['references'] = [publication['@id']] # just check experiment by itself first expt = testapp.post_json('/experiment_hi_c', experiment_data, status=201).json['@graph'][0] assert 'produced_in_pub' in expt assert publication['@id'] == expt['produced_in_pub'] # post repset with this experiment replicate_experiment_set_data['replicate_exps'] = [{'bio_rep_no': 1, 'tec_rep_no': 1, 'replicate_exp': expt['@id']}] repset = testapp.post_json('/experiment_set_replicate', replicate_experiment_set_data, status=201).json['@graph'][0] # post single rep_exp_set to single pub pub2_data['exp_sets_prod_in_pub'] = [repset['@id']] testapp.post_json('/publication', pub2_data, status=201) expinset = testapp.get(repset['replicate_exps'][0]['replicate_exp']).json assert 'produced_in_pub' in expinset assert publication['@id'] == expinset['produced_in_pub']['@id'] def test_calculated_expt_produced_in_pub_for_cust_experiment_set( testapp, custset_w_exp1, pub1_data): # post single cust_exp_set to single pub pub1_data['exp_sets_prod_in_pub'] = [custset_w_exp1['@id']] pub1res = testapp.post_json('/publication', pub1_data, status=201) expres = testapp.get(custset_w_exp1['experiments_in_set'][0]) assert 'produced_in_pub' not in expres.json.keys() def test_calculated_expt_produced_in_pub_for_one_expt_in_two_expset_one_pub( testapp, repset_w_exp1, custset_w_exp1, pub1_data): # post two exp_set with same experiment (repset and custset) to single pub pub1_data['exp_sets_prod_in_pub'] = [repset_w_exp1['@id'], custset_w_exp1['@id']] pub1res = testapp.post_json('/publication', pub1_data, status=201) # both responses will get the same experiment responses = [testapp.get(repset_w_exp1['replicate_exps'][0]['replicate_exp']), testapp.get(custset_w_exp1['experiments_in_set'][0])] for response in responses: assert 'produced_in_pub' in response assert '/publications/' + pub1res.json['@graph'][0]['uuid'] + '/' == response.json['produced_in_pub']['@id'] def test_calculated_expt_produced_in_pub_for_two_exp_two_expset_two_pubs( testapp, repset_w_exp1, custset_w_exp2, pub1_data, pub2_data): # post 2 exp_set (one repset, one custom) each with diff expt to each pub # only expt in repset should get the pub of repset pub1_data['exp_sets_prod_in_pub'] = [repset_w_exp1['@id']] pub2_data['exp_sets_prod_in_pub'] = [custset_w_exp2['@id']] pub1res = testapp.post_json('/publication', pub1_data, status=201) testapp.post_json('/publication', pub2_data, status=201) responses = [testapp.get(repset_w_exp1['replicate_exps'][0]['replicate_exp']), testapp.get(custset_w_exp2['experiments_in_set'][0])] assert '/publications/' + pub1res.json['@graph'][0]['uuid'] + '/' == responses[0].json['produced_in_pub']['@id'] assert 'produced_in_pub' not in responses[1].json def test_calculated_expt_produced_in_pub_for_one_expt_one_expset_two_pubs( testapp, repset_w_exp1, pub1_data, pub2_data): # post one exp_set to two pubs 
- this one should pick up only the most recent pub pub1_data['exp_sets_prod_in_pub'] = [repset_w_exp1['@id']] pub2_data['exp_sets_prod_in_pub'] = [repset_w_exp1['@id']] pub1res = testapp.post_json('/publication', pub1_data, status=201) pub2res = testapp.post_json('/publication', pub2_data, status=201) response = testapp.get(repset_w_exp1['replicate_exps'][0]['replicate_exp']) assert 'produced_in_pub' in response assert not '/publications/' + pub1res.json['@graph'][0]['uuid'] + '/' == response.json['produced_in_pub']['@id'] assert '/publications/' + pub2res.json['@graph'][0]['uuid'] + '/' == response.json['produced_in_pub']['@id'] def test_calculated_publications_in_experiment_no_data( testapp, repset_w_exp1, custset_w_exp2, pub1_data): pub1res = testapp.post_json('/publication', pub1_data, status=201) responses = [testapp.get(repset_w_exp1['replicate_exps'][0]['replicate_exp']), testapp.get(custset_w_exp2['experiments_in_set'][0])] for response in responses: assert response.json['publications_of_exp'] == [] def test_calculated_publications_in_expt_w_repset_in_both_fields( testapp, repset_w_exp1, pub1_data): # post single rep_exp_set to single pub both fields pub1_data['exp_sets_prod_in_pub'] = [repset_w_exp1['@id']] pub1_data['exp_sets_used_in_pub'] = [repset_w_exp1['@id']] pub1res = testapp.post_json('/publication', pub1_data, status=201) response = testapp.get(repset_w_exp1['replicate_exps'][0]['replicate_exp']) assert 'publications_of_exp' in response assert len(response.json['publications_of_exp']) == 1 assert pub1res.json['@graph'][0]['uuid'] == response.json['publications_of_exp'][0]['uuid'] def test_calculated_publications_in_expt_w_custset_used_in_field( testapp, custset_w_exp2, pub1_data): # post only used in publication one pub one exp set pub1_data['exp_sets_used_in_pub'] = [custset_w_exp2['@id']] pub1res = testapp.post_json('/publication', pub1_data, status=201) response = testapp.get(custset_w_exp2['experiments_in_set'][0]) assert 'publications_of_exp' in response assert len(response.json['publications_of_exp']) == 1 assert pub1res.json['@graph'][0]['uuid'] == response.json['publications_of_exp'][0]['uuid'] def test_calculated_publications_in_expt_w_repset_two_pubs_both_fields( testapp, repset_w_exp1, pub1_data, pub2_data): # post same experiment set to two pubs in either field pub1_data['exp_sets_prod_in_pub'] = [repset_w_exp1['@id']] pub2_data['exp_sets_used_in_pub'] = [repset_w_exp1['@id']] pub1res = testapp.post_json('/publication', pub1_data, status=201) pub2res = testapp.post_json('/publication', pub2_data, status=201) pubuuids = [pub1res.json['@graph'][0]['uuid']] pubuuids.append(pub2res.json['@graph'][0]['uuid']) response = testapp.get(repset_w_exp1['replicate_exps'][0]['replicate_exp']) assert 'publications_of_exp' in response assert len(response.json['publications_of_exp']) == 2 publications = response.json['publications_of_exp'] for pub in publications: assert pub['uuid'] in pubuuids def test_calculated_publications_in_expt_w_repset_two_pubs_in_used( testapp, repset_w_exp1, pub1_data, pub2_data): # post same experiment set to two pubs in used in pub field pub1_data['exp_sets_used_in_pub'] = [repset_w_exp1['@id']] pub2_data['exp_sets_used_in_pub'] = [repset_w_exp1['@id']] pub1res = testapp.post_json('/publication', pub1_data, status=201) pub2res = testapp.post_json('/publication', pub2_data, status=201) pubuuids = [pub1res.json['@graph'][0]['uuid']] pubuuids.append(pub2res.json['@graph'][0]['uuid']) response = 
testapp.get(repset_w_exp1['replicate_exps'][0]['replicate_exp']) assert 'publications_of_exp' in response assert len(response.json['publications_of_exp']) == 2 publications = response.json['publications_of_exp'] for pub in publications: assert pub['uuid'] in pubuuids def test_calculated_no_of_expts_in_set_w_no_exps(empty_replicate_set): assert 'number_of_experiments' not in empty_replicate_set def test_calculated_no_of_expts_in_set_w_2_exps(two_experiment_replicate_set): assert two_experiment_replicate_set['number_of_experiments'] == 2 # tests for category calculated_property @pytest.fixture def target_w_prot(testapp, lab, award): item = { 'description': "Protein target", 'targeted_proteins': ['CTCF (ABCD)'], 'award': award['@id'], 'lab': lab['@id'], } return testapp.post_json('/target', item).json['@graph'][0] @pytest.fixture def exp_w_target_info(lab, award, human_biosample, exp_types, mboI, genomic_region_bio_feature): return { 'lab': lab['@id'], 'award': award['@id'], 'biosample': human_biosample['@id'], 'experiment_type': exp_types['capc']['@id'], 'targeted_regions': [{'target': [genomic_region_bio_feature['@id']]}] } @pytest.fixture def expt_w_targ_region(testapp, exp_w_target_info): return testapp.post_json('/experiment_capture_c', exp_w_target_info).json['@graph'][0] @pytest.fixture def expt_w_2_targ_regions(testapp, exp_w_target_info, gene_bio_feature): region = {'target': [gene_bio_feature['@id']]} exp_w_target_info['targeted_regions'].append(region) return testapp.post_json('/experiment_capture_c', exp_w_target_info).json['@graph'][0] @pytest.fixture def expt_w_target_data(lab, award, human_biosample, prot_bio_feature, exp_types): return { 'lab': lab['@id'], 'award': award['@id'], 'biosample': human_biosample['@id'], 'experiment_type': exp_types['chia']['@id'], 'targeted_factor': [prot_bio_feature['@id']] } @pytest.fixture def expt_w_target(testapp, expt_w_target_data): return testapp.post_json('/experiment_chiapet', expt_w_target_data).json['@graph'][0] @pytest.fixture def chipseq_expt(testapp, expt_w_target_data, exp_types): expt_w_target_data['experiment_type'] = exp_types['chipseq']['@id'] return testapp.post_json('/experiment_seq', expt_w_target_data).json['@graph'][0] @pytest.fixture def tsaseq_expt(testapp, expt_w_target_data, exp_types): expt_w_target_data['experiment_type'] = exp_types['tsaseq']['@id'] return testapp.post_json('/experiment_tsaseq', expt_w_target_data).json['@graph'][0] @pytest.fixture def repliseq_info(lab, award, human_biosample, exp_types): return { 'lab': lab['@id'], 'award': award['@id'], 'biosample': human_biosample['@id'], 'experiment_type': exp_types['repliseq']['@id'], } @pytest.fixture def repliseq_1(testapp, repliseq_info): return testapp.post_json('/experiment_repliseq', repliseq_info).json['@graph'][0] @pytest.fixture def repliseq_2(testapp, repliseq_info): repliseq_info['stage_fraction'] = 'early' return testapp.post_json('/experiment_repliseq', repliseq_info).json['@graph'][0] @pytest.fixture def repliseq_3(testapp, repliseq_info): repliseq_info['stage_fraction'] = 'early' repliseq_info['total_fractions_in_exp'] = 16 return testapp.post_json('/experiment_repliseq', repliseq_info).json['@graph'][0] @pytest.fixture def repliseq_4(testapp, repliseq_info): repliseq_info['stage_fraction'] = 'early' repliseq_info['total_fractions_in_exp'] = 2 repliseq_info['cell_cycle_phase'] = 'S' return testapp.post_json('/experiment_repliseq', repliseq_info).json['@graph'][0] @pytest.fixture def experiment_atacseq(testapp, repliseq_info, exp_types): 
repliseq_info['experiment_type'] = exp_types['atacseq']['@id'] return testapp.post_json('/experiment_atacseq', repliseq_info).json['@graph'][0] @pytest.fixture def damid_no_fusion(testapp, repliseq_info, exp_types): repliseq_info['experiment_type'] = exp_types['dam']['@id'] return testapp.post_json('/experiment_damid', repliseq_info).json['@graph'][0] @pytest.fixture def damid_w_fusion(testapp, repliseq_info, prot_bio_feature, exp_types): repliseq_info['experiment_type'] = exp_types['dam']['@id'] repliseq_info['targeted_factor'] = [prot_bio_feature['@id']] return testapp.post_json('/experiment_damid', repliseq_info).json['@graph'][0] @pytest.fixture def damid_w_multifusion(testapp, repliseq_info, prot_bio_feature, gene_bio_feature, exp_types): repliseq_info['experiment_type'] = exp_types['dam']['@id'] repliseq_info['targeted_factor'] = [prot_bio_feature['@id'], gene_bio_feature['@id']] return testapp.post_json('/experiment_damid', repliseq_info).json['@graph'][0] @pytest.fixture def basic_info(lab, award): return { 'lab': lab['@id'], 'award': award['@id'], } @pytest.fixture def imaging_path_1(testapp, basic_info, genomic_region_bio_feature): basic_info['target'] = [genomic_region_bio_feature['@id']] basic_info['labeled_probe'] = 'FITC goat anti rabbit' return testapp.post_json('/imaging_path', basic_info).json['@graph'][0] @pytest.fixture def imaging_path_2(testapp, basic_info, genomic_region_bio_feature): basic_info['target'] = [genomic_region_bio_feature['@id']] basic_info['labeled_probe'] = 'TRITC horse anti rabbit' return testapp.post_json('/imaging_path', basic_info).json['@graph'][0] @pytest.fixture def imaging_path_3(testapp, basic_info, basic_region_bio_feature): basic_info['target'] = [basic_region_bio_feature['@id']] basic_info['labeled_probe'] = 'DAPI' return testapp.post_json('/imaging_path', basic_info).json['@graph'][0] @pytest.fixture def microscopy_no_path(testapp, repliseq_info, exp_types): repliseq_info['experiment_type'] = exp_types['fish']['@id'] return testapp.post_json('/experiment_mic', repliseq_info).json['@graph'][0] @pytest.fixture def microscopy_w_path(testapp, repliseq_info, imaging_path_1, exp_types): repliseq_info['experiment_type'] = exp_types['fish']['@id'] img_path = {'path': imaging_path_1['@id'], 'channel': 'ch01'} repliseq_info['imaging_paths'] = [img_path] return testapp.post_json('/experiment_mic', repliseq_info).json['@graph'][0] @pytest.fixture def microscopy_w_multipath(testapp, repliseq_info, imaging_path_1, imaging_path_2, imaging_path_3, exp_types): repliseq_info['experiment_type'] = exp_types['fish']['@id'] img_path1 = {'path': imaging_path_1['@id'], 'channel': 'ch01'} img_path2 = {'path': imaging_path_2['@id'], 'channel': 'ch02'} img_path3 = {'path': imaging_path_3['@id'], 'channel': 'ch03'} repliseq_info['imaging_paths'] = [img_path1, img_path2, img_path3] return testapp.post_json('/experiment_mic', repliseq_info).json['@graph'][0] @pytest.fixture def microscopy_w_splitpath(testapp, repliseq_info, exp_types, imaging_path_1, imaging_path_3, basic_region_bio_feature, genomic_region_bio_feature): '''Sometimes a (group of) target(s) is split into different imaging paths, e.g. due to multiplexing. 
If text is formatted as follows, the split group will be found and replaced with the sum''' repliseq_info['experiment_type'] = exp_types['fish']['@id'] img_path1 = {'path': imaging_path_1['@id'], 'channel': 'ch01'} img_path3 = {'path': imaging_path_3['@id'], 'channel': 'ch03'} repliseq_info['imaging_paths'] = [img_path1, img_path3] testapp.patch_json(basic_region_bio_feature['@id'], {'preferred_label': '15 TADs on chr19'}).json['@graph'][0] testapp.patch_json(genomic_region_bio_feature['@id'], {'preferred_label': '22 TADs on chr19'}).json['@graph'][0] return testapp.post_json('/experiment_mic', repliseq_info).json['@graph'][0] def test_experiment_atacseq_display_title(experiment_atacseq): assert experiment_atacseq.get('display_title') == 'ATAC-seq on GM12878 - ' + experiment_atacseq.get('accession') def test_experiment_damid_w_multifusion_display_title(damid_w_multifusion): assert damid_w_multifusion.get('display_title') == 'DamID-seq with mulitiple DAM fusions on GM12878 - ' + damid_w_multifusion.get('accession') def test_experiment_chiapet_w_target_display_title(expt_w_target): assert expt_w_target.get('display_title') == 'ChIA-PET against RAD21 protein on GM12878 - ' + expt_w_target.get('accession') def test_experiment_chipseq_w_target_display_title(chipseq_expt): assert chipseq_expt.get('display_title') == 'ChIP-seq against RAD21 protein on GM12878 - ' + chipseq_expt.get('accession') def test_experiment_tsaseq_display_title(tsaseq_expt): assert tsaseq_expt.get('display_title') == 'TSA-seq against RAD21 protein on GM12878 - ' + tsaseq_expt.get('accession') def test_experiment_categorizer_4_mic_no_path(testapp, microscopy_no_path): assert microscopy_no_path['experiment_categorizer']['field'] == 'Default' assert microscopy_no_path['experiment_categorizer'].get('value') is None def test_experiment_categorizer_4_mic_w_path(testapp, microscopy_w_path, genomic_region_bio_feature): assert microscopy_w_path['experiment_categorizer']['field'] == 'Target' assert microscopy_w_path['experiment_categorizer']['value'] == genomic_region_bio_feature['display_title'] def test_experiment_categorizer_4_mic_w_multi_path(testapp, microscopy_w_multipath, genomic_region_bio_feature, basic_region_bio_feature): vals2chk = [genomic_region_bio_feature['display_title'], basic_region_bio_feature['display_title']] len2chk = len(vals2chk[0]) + len(vals2chk[1]) + 2 assert microscopy_w_multipath['experiment_categorizer']['field'] == 'Target' value = microscopy_w_multipath['experiment_categorizer']['value'] assert len(value) == len2chk for v in vals2chk: assert v in value def test_experiment_categorizer_4_mic_w_split_path(testapp, microscopy_w_splitpath): '''Sometimes a (group of) target(s) is split into different imaging paths, e.g. due to multiplexing. 
Sum the split targets and return only one string.''' assert microscopy_w_splitpath['experiment_categorizer']['value'] == '37 TADs on chr19' def test_experiment_categorizer_4_chiapet_no_fusion(testapp, repliseq_info, exp_types): repliseq_info['experiment_type'] = exp_types['chia']['@id'] res = testapp.post_json('/experiment_chiapet', repliseq_info).json['@graph'][0] assert res['experiment_categorizer']['field'] == 'Default' assert res['experiment_categorizer']['value'] is None def test_experiment_categorizer_4_damid_no_fusion(testapp, damid_no_fusion): assert damid_no_fusion['experiment_categorizer']['field'] == 'Target' assert damid_no_fusion['experiment_categorizer'].get('value') == 'None (Control)' def test_experiment_categorizer_4_damid_w_fusion(testapp, damid_w_fusion, prot_bio_feature): assert damid_w_fusion['experiment_categorizer']['field'] == 'Target' assert damid_w_fusion['experiment_categorizer']['value'] == prot_bio_feature['display_title'] def test_experiment_categorizer_4_repliseq_no_fraction_info(testapp, repliseq_1): assert repliseq_1['experiment_categorizer']['field'] == 'Default' assert repliseq_1['experiment_categorizer'].get('value') is None def test_experiment_categorizer_4_repliseq_only_fraction(testapp, repliseq_2): wanted = 'early of an unspecified number of fractions' assert repliseq_2['experiment_categorizer']['field'] == 'Fraction' assert repliseq_2['experiment_categorizer']['value'] == wanted def test_experiment_categorizer_4_repliseq_fraction_and_total(testapp, repliseq_3): wanted = 'early of 16 fractions' assert repliseq_3['experiment_categorizer']['field'] == 'Fraction' assert repliseq_3['experiment_categorizer']['value'] == wanted def test_experiment_categorizer_w_target(testapp, expt_w_target, prot_bio_feature): assert expt_w_target['experiment_categorizer']['field'] == 'Target' assert expt_w_target['experiment_categorizer']['value'] == prot_bio_feature['display_title'] def test_experiment_categorizer_w_enzyme(testapp, experiment, mboI): assert experiment['experiment_categorizer']['field'] == 'Enzyme' assert experiment['experiment_categorizer']['value'] == mboI['display_title'] def test_experiment_categorizer_w_target_and_enzyme(testapp, expt_w_target, prot_bio_feature, mboI): # import pdb; pdb.set_trace() res = testapp.patch_json(expt_w_target['@id'], {'digestion_enzyme': mboI['@id']}).json['@graph'][0] assert res['digestion_enzyme'] == mboI['@id'] assert res['experiment_categorizer']['field'] == 'Target' assert res['experiment_categorizer']['value'] == prot_bio_feature['display_title'] def test_experiment_categorizer_w_no_cat1(testapp, experiment_data, exp_types): del experiment_data['digestion_enzyme'] experiment_data['experiment_type'] = exp_types['rnaseq']['@id'] expt = testapp.post_json('/experiment_seq', experiment_data).json['@graph'][0] assert expt['experiment_categorizer']['field'] == 'Default' assert expt['experiment_categorizer'].get('value') is None def test_experiment_categorizer_cap_c_no_regions(testapp, experiment_data, mboI, exp_types): experiment_data['experiment_type'] = exp_types['capc']['@id'] expt = testapp.post_json('/experiment_capture_c', experiment_data).json['@graph'][0] assert expt['experiment_categorizer']['field'] == 'Enzyme' assert expt['experiment_categorizer']['value'] == mboI['display_title'] def test_experiment_categorizer_cap_c_w_region(expt_w_targ_region, genomic_region_bio_feature): assert expt_w_targ_region['experiment_categorizer']['field'] == 'Target' assert expt_w_targ_region['experiment_categorizer']['value'] == 
genomic_region_bio_feature['display_title'] def test_experiment_categorizer_cap_c_w_2regions( expt_w_2_targ_regions, genomic_region_bio_feature, gene_bio_feature): wanted = ', '.join(sorted([genomic_region_bio_feature['display_title'], gene_bio_feature['display_title']])) assert expt_w_2_targ_regions['experiment_categorizer']['field'] == 'Target' assert expt_w_2_targ_regions['experiment_categorizer']['value'] == wanted @pytest.fixture def new_exp_type(lab, award): data = { 'uuid': str(uuid4()), 'title': 'Title', 'lab': lab['@id'], 'award': award['@id'], 'status': 'released', 'valid_item_types': ['ExperimentSeq'] } return data def test_validate_exp_type_valid(testapp, experiment_data, new_exp_type): exp_type1 = testapp.post_json('/experiment_type', new_exp_type).json['@graph'][0] experiment_data['experiment_type'] = exp_type1['@id'] expt = testapp.post_json('/experiment_hi_c', experiment_data, status=422) testapp.patch_json(exp_type1['@id'], {'valid_item_types': ['ExperimentSeq', 'ExperimentHiC']}) expt = testapp.post_json('/experiment_hi_c', experiment_data, status=201).json['@graph'][0] assert expt['experiment_type'] == '/experiment-types/title/' def test_validate_experiment_set_duplicate_replicate_experiments(testapp, rep_set_data, experiment): rep_set_data['replicate_exps'] = [{'bio_rep_no': 1, 'tec_rep_no': 1, 'replicate_exp': experiment['@id']}, {'bio_rep_no': 1, 'tec_rep_no': 2, 'replicate_exp': experiment['@id']}] repset = testapp.post_json('/experiment_set_replicate', rep_set_data, status=422) assert repset.json['errors'][0]['name'] == 'ExperimentSet: non-unique exps' assert 'Duplicate experiment' in repset.json['errors'][0]['description'] '''tzinfo timezone information for Australia/NSW.''' from pytz.tzinfo import DstTzInfo from pytz.tzinfo import memorized_datetime as d from pytz.tzinfo import memorized_ttinfo as i class NSW(DstTzInfo): '''Australia/NSW timezone definition. 
See datetime.tzinfo for details''' zone = 'Australia/NSW' _utc_transition_times = [ d(1,1,1,0,0,0), d(1916,12,31,14,1,0), d(1917,3,24,15,0,0), d(1941,12,31,16,0,0), d(1942,3,28,15,0,0), d(1942,9,26,16,0,0), d(1943,3,27,15,0,0), d(1943,10,2,16,0,0), d(1944,3,25,15,0,0), d(1971,10,30,16,0,0), d(1972,2,26,16,0,0), d(1972,10,28,16,0,0), d(1973,3,3,16,0,0), d(1973,10,27,16,0,0), d(1974,3,2,16,0,0), d(1974,10,26,16,0,0), d(1975,3,1,16,0,0), d(1975,10,25,16,0,0), d(1976,3,6,16,0,0), d(1976,10,30,16,0,0), d(1977,3,5,16,0,0), d(1977,10,29,16,0,0), d(1978,3,4,16,0,0), d(1978,10,28,16,0,0), d(1979,3,3,16,0,0), d(1979,10,27,16,0,0), d(1980,3,1,16,0,0), d(1980,10,25,16,0,0), d(1981,2,28,16,0,0), d(1981,10,24,16,0,0), d(1982,4,3,16,0,0), d(1982,10,30,16,0,0), d(1983,3,5,16,0,0), d(1983,10,29,16,0,0), d(1984,3,3,16,0,0), d(1984,10,27,16,0,0), d(1985,3,2,16,0,0), d(1985,10,26,16,0,0), d(1986,3,15,16,0,0), d(1986,10,18,16,0,0), d(1987,3,14,16,0,0), d(1987,10,24,16,0,0), d(1988,3,19,16,0,0), d(1988,10,29,16,0,0), d(1989,3,18,16,0,0), d(1989,10,28,16,0,0), d(1990,3,3,16,0,0), d(1990,10,27,16,0,0), d(1991,3,2,16,0,0), d(1991,10,26,16,0,0), d(1992,2,29,16,0,0), d(1992,10,24,16,0,0), d(1993,3,6,16,0,0), d(1993,10,30,16,0,0), d(1994,3,5,16,0,0), d(1994,10,29,16,0,0), d(1995,3,4,16,0,0), d(1995,10,28,16,0,0), d(1996,3,30,16,0,0), d(1996,10,26,16,0,0), d(1997,3,29,16,0,0), d(1997,10,25,16,0,0), d(1998,3,28,16,0,0), d(1998,10,24,16,0,0), d(1999,3,27,16,0,0), d(1999,10,30,16,0,0), d(2000,3,25,16,0,0), d(2000,8,26,16,0,0), d(2001,3,24,16,0,0), d(2001,10,27,16,0,0), d(2002,3,30,16,0,0), d(2002,10,26,16,0,0), d(2003,3,29,16,0,0), d(2003,10,25,16,0,0), d(2004,3,27,16,0,0), d(2004,10,30,16,0,0), d(2005,3,26,16,0,0), d(2005,10,29,16,0,0), d(2006,4,1,16,0,0), d(2006,10,28,16,0,0), d(2007,3,24,16,0,0), d(2007,10,27,16,0,0), d(2008,3,29,16,0,0), d(2008,10,25,16,0,0), d(2009,3,28,16,0,0), d(2009,10,24,16,0,0), d(2010,3,27,16,0,0), d(2010,10,30,16,0,0), d(2011,3,26,16,0,0), d(2011,10,29,16,0,0), d(2012,3,24,16,0,0), d(2012,10,27,16,0,0), d(2013,3,30,16,0,0), d(2013,10,26,16,0,0), d(2014,3,29,16,0,0), d(2014,10,25,16,0,0), d(2015,3,28,16,0,0), d(2015,10,24,16,0,0), d(2016,3,26,16,0,0), d(2016,10,29,16,0,0), d(2017,3,25,16,0,0), d(2017,10,28,16,0,0), d(2018,3,24,16,0,0), d(2018,10,27,16,0,0), d(2019,3,30,16,0,0), d(2019,10,26,16,0,0), d(2020,3,28,16,0,0), d(2020,10,24,16,0,0), d(2021,3,27,16,0,0), d(2021,10,30,16,0,0), d(2022,3,26,16,0,0), d(2022,10,29,16,0,0), d(2023,3,25,16,0,0), d(2023,10,28,16,0,0), d(2024,3,30,16,0,0), d(2024,10,26,16,0,0), d(2025,3,29,16,0,0), d(2025,10,25,16,0,0), d(2026,3,28,16,0,0), d(2026,10,24,16,0,0), d(2027,3,27,16,0,0), d(2027,10,30,16,0,0), d(2028,3,25,16,0,0), d(2028,10,28,16,0,0), d(2029,3,24,16,0,0), d(2029,10,27,16,0,0), d(2030,3,30,16,0,0), d(2030,10,26,16,0,0), d(2031,3,29,16,0,0), d(2031,10,25,16,0,0), d(2032,3,27,16,0,0), d(2032,10,30,16,0,0), d(2033,3,26,16,0,0), d(2033,10,29,16,0,0), d(2034,3,25,16,0,0), d(2034,10,28,16,0,0), d(2035,3,24,16,0,0), d(2035,10,27,16,0,0), d(2036,3,29,16,0,0), d(2036,10,25,16,0,0), d(2037,3,28,16,0,0), d(2037,10,24,16,0,0), ] _transition_info = [ i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), 
i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), i(36000,0,'EST'), i(39600,3600,'EST'), ] NSW = NSW() Downloads last month10
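The class above is a standard pytz-generated zone definition. As a hedged illustration (not part of the original file), the usual way to consume such a definition is through pytz.timezone rather than by instantiating NSW directly; the zone name and API calls below are ordinary pytz usage, and the example datetimes are arbitrary.

from datetime import datetime
import pytz

# Look up the compiled zone definition by name; this returns the DstTzInfo instance.
nsw = pytz.timezone('Australia/NSW')

# Attach the zone to naive local times; localize() consults the transition table above
# to pick the correct offset on either side of a daylight-saving change.
summer = nsw.localize(datetime(2021, 1, 15, 12, 0))   # during daylight saving
winter = nsw.localize(datetime(2021, 6, 15, 12, 0))   # outside daylight saving

print(summer.tzname(), summer.utcoffset())  # expected UTC+11:00 in January
print(winter.tzname(), winter.utcoffset())  # expected UTC+10:00 in June

Note that naive datetimes should always go through localize() (or astimezone() for aware ones); passing a DstTzInfo straight to the datetime constructor bypasses the transition table.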
https://huggingface.co/microsoft/swin-base-patch4-window7-224
Swin Transformer (base-sized model)
Swin Transformer model trained on ImageNet-1k at resolution 224x224. It was introduced in the paper Swin Transformer: Hierarchical Vision Transformer using Shifted Windows by Liu et al. and first released in this repository.
Disclaimer: The team releasing Swin Transformer did not write a model card for this model, so this model card has been written by the Hugging Face team.
Model description
The Swin Transformer is a type of Vision Transformer. It builds hierarchical feature maps by merging image patches in deeper layers, and its computational complexity is linear in the input image size because self-attention is computed only within each local window. It can therefore serve as a general-purpose backbone for both image classification and dense recognition tasks. In contrast, previous vision Transformers produce feature maps of a single low resolution and have quadratic computational complexity in the input image size because self-attention is computed globally.
Intended uses & limitations
You can use the raw model for image classification. See the model hub to look for fine-tuned versions on a task that interests you.
How to use
Here is how to use this model to classify an image of the COCO 2017 dataset into one of the 1,000 ImageNet classes:
from transformers import AutoFeatureExtractor, SwinForImageClassification
from PIL import Image
import requests

url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

feature_extractor = AutoFeatureExtractor.from_pretrained("microsoft/swin-base-patch4-window7-224")
model = SwinForImageClassification.from_pretrained("microsoft/swin-base-patch4-window7-224")

inputs = feature_extractor(images=image, return_tensors="pt")
outputs = model(**inputs)
logits = outputs.logits
# model predicts one of the 1000 ImageNet classes
predicted_class_idx = logits.argmax(-1).item()
print("Predicted class:", model.config.id2label[predicted_class_idx])
For more code examples, we refer to the documentation.
BibTeX entry and citation info
@article{DBLP:journals/corr/abs-2103-14030,
  author     = {Ze Liu and Yutong Lin and Yue Cao and Han Hu and Yixuan Wei and Zheng Zhang and Stephen Lin and Baining Guo},
  title      = {Swin Transformer: Hierarchical Vision Transformer using Shifted Windows},
  journal    = {CoRR},
  volume     = {abs/2103.14030},
  year       = {2021},
  url        = {https://arxiv.org/abs/2103.14030},
  eprinttype = {arXiv},
  eprint     = {2103.14030},
  timestamp  = {Thu, 08 Apr 2021 07:53:26 +0200},
  biburl     = {https://dblp.org/rec/journals/corr/abs-2103-14030.bib},
  bibsource  = {dblp computer science bibliography, https://dblp.org}
}
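To make the "general-purpose backbone" and "hierarchical feature maps" claims above concrete, here is a minimal sketch, not from the original card, that inspects the intermediate feature maps with the plain SwinModel encoder from transformers; the output_hidden_states flag and the printed shapes rely on standard transformers behaviour rather than anything stated in the card.

from transformers import AutoFeatureExtractor, SwinModel
from PIL import Image
import requests

url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

feature_extractor = AutoFeatureExtractor.from_pretrained("microsoft/swin-base-patch4-window7-224")
# Load only the encoder (backbone); the classification head of the checkpoint is not needed here.
model = SwinModel.from_pretrained("microsoft/swin-base-patch4-window7-224")

inputs = feature_extractor(images=image, return_tensors="pt")
# Request the intermediate hidden states so the hierarchy can be inspected.
outputs = model(**inputs, output_hidden_states=True)

# last_hidden_state is the final, lowest-resolution feature map as a sequence of patch tokens.
print(outputs.last_hidden_state.shape)
# hidden_states is a tuple of intermediate feature maps; the sequence length shrinks
# as patches are merged in deeper stages of the hierarchy.
for i, h in enumerate(outputs.hidden_states):
    print(i, h.shape)

For dense prediction tasks these multi-scale maps would typically be fed to a task-specific head (e.g. a detection or segmentation decoder); that part is outside the scope of this card.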
https://huggingface.co/datasets/microsoft/LCC_csharp
Subset Split context stringlengths 2.52k 185k gt stringclasses 1 value namespace DanTup.DartVS.ProjectSystem.Controls { using System; using System.ComponentModel; using System.Drawing; using System.IO; using System.Windows.Forms; using Microsoft.VisualStudio.Shell.Interop; using Package = Microsoft.VisualStudio.Shell.Package; using Url = Microsoft.VisualStudio.Shell.Url; /// <summary> /// Extends a simple text box specialized for browsing to folders. Supports auto-complete and /// a browse button that brings up the folder browse dialog. /// </summary> internal partial class FolderBrowserTextBox : UserControl { private string _rootFolder; // ========================================================================================= // Constructors // ========================================================================================= /// <summary> /// Initializes a new instance of the <see cref="FolderBrowserTextBox"/> class. /// </summary> public FolderBrowserTextBox() { this.InitializeComponent(); folderTextBox.Enabled = Enabled; browseButton.Enabled = Enabled; } // ========================================================================================= // Events // ========================================================================================= /// <summary> /// Occurs when the text has changed. /// </summary> [Browsable(true)] [EditorBrowsable(EditorBrowsableState.Always)] public new event EventHandler TextChanged { add { base.TextChanged += value; } remove { base.TextChanged -= value; } } [Bindable(true)] [Browsable(true)] [DesignerSerializationVisibility(DesignerSerializationVisibility.Visible)] [EditorBrowsable(EditorBrowsableState.Always)] public string RootFolder { get { try { if (!string.IsNullOrEmpty(_rootFolder)) Path.IsPathRooted(_rootFolder); } catch (ArgumentException) { return string.Empty; } return _rootFolder; } set { _rootFolder = value; } } [Bindable(true)] [Browsable(true)] [DesignerSerializationVisibility(DesignerSerializationVisibility.Visible)] [EditorBrowsable(EditorBrowsableState.Always)] [DefaultValue(false)] [Description("When this property is 'true', the folder path will be made relative to RootFolder, when possible.")] public bool MakeRelative { get; set; } // ========================================================================================= // Properties // ========================================================================================= /// <summary> /// Gets or sets the path of the selected folder. /// </summary> [Bindable(true)] [Browsable(true)] [DesignerSerializationVisibility(DesignerSerializationVisibility.Visible)] [EditorBrowsable(EditorBrowsableState.Always)] public override string Text { get { return this.folderTextBox.Text; } set { this.folderTextBox.Text = value; } } private string FullPath { get { try { string text = Text ?? string.Empty; if (!string.IsNullOrEmpty(RootFolder)) text = Path.Combine(RootFolder, text); return Path.GetFullPath(text); } catch (ArgumentException) { return string.Empty; } } } // ========================================================================================= // Methods // ========================================================================================= /// <summary> /// Sets the bounds of the control. In this case, we fix the height to the text box's height. 
/// </summary> /// <param name="x">The new x value.</param> /// <param name="y">The new y value.</param> /// <param name="width">The new width value.</param> /// <param name="height">The height value.</param> /// <param name="specified">A set of flags indicating which bounds to set.</param> protected override void SetBoundsCore(int x, int y, int width, int height, BoundsSpecified specified) { if ((specified & BoundsSpecified.Height) == BoundsSpecified.Height) { height = this.folderTextBox.Height + 1; } base.SetBoundsCore(x, y, width, height, specified); } /// <summary> /// Brings up the browse folder dialog. /// </summary> /// <param name="sender">The browse button.</param> /// <param name="e">The <see cref="EventArgs"/> object that contains the event data.</param> private void OnBrowseButtonClick(object sender, EventArgs e) { // initialize the dialog to the current directory (if it exists) bool overridePersistedInitialDirectory = false; string initialDirectory = null; if (Directory.Exists(FullPath)) { initialDirectory = FullPath; overridePersistedInitialDirectory = true; } IntPtr parentWindow = Handle; Guid persistenceSlot = typeof(FileBrowserTextBox).GUID; IVsUIShell2 shell = (IVsUIShell2)Package.GetGlobalService(typeof(SVsUIShell)); // show the dialog string path = shell.GetDirectoryViaBrowseDialog(parentWindow, persistenceSlot, "Select folder", initialDirectory, overridePersistedInitialDirectory); if (path != null) { if (MakeRelative && !string.IsNullOrEmpty(RootFolder)) { string rootFolder = Path.GetFullPath(RootFolder); if (Directory.Exists(rootFolder)) { if (!rootFolder.EndsWith(Path.DirectorySeparatorChar.ToString()) && !rootFolder.EndsWith(Path.AltDirectorySeparatorChar.ToString())) rootFolder = rootFolder + Path.DirectorySeparatorChar; path = new Url(rootFolder).MakeRelative(new Url(path)); } } this.folderTextBox.Text = path; } } /// <summary> /// Raises the <see cref="TextChanged"/> event. /// </summary> /// <param name="sender">The folder text box.</param> /// <param name="e">The <see cref="EventArgs"/> object that contains the event data.</param> private void OnFolderTextBoxTextChanged(object sender, EventArgs e) { UpdateColor(); this.OnTextChanged(EventArgs.Empty); } protected override void OnEnabledChanged(EventArgs e) { folderTextBox.Enabled = Enabled; browseButton.Enabled = Enabled; UpdateColor(); browseButton.Invalidate(); base.OnEnabledChanged(e); } private void UpdateColor() { if (!Enabled) { folderTextBox.BackColor = SystemColors.Control; folderTextBox.ForeColor = SystemColors.GrayText; return; } folderTextBox.ForeColor = SystemColors.ControlText; folderTextBox.BackColor = Directory.Exists(FullPath) ? SystemColors.ControlLightLight : Color.LightSalmon; } } } // Copyright (c) Microsoft Corporation. All rights reserved. // Licensed under the MIT License. See License.txt in the project root for license information. 
namespace Microsoft.Azure.Management.ContainerRegistry.Fluent { using Microsoft.Azure.Management.ContainerRegistry.Fluent.Models; using Microsoft.Azure.Management.ContainerRegistry.Fluent.RegistryDockerTaskStep.Definition; using Microsoft.Azure.Management.ContainerRegistry.Fluent.RegistryDockerTaskStep.Update; using Microsoft.Azure.Management.ContainerRegistry.Fluent.RegistryTask.Definition; using Microsoft.Azure.Management.ContainerRegistry.Fluent.RegistryTask.Update; using Microsoft.Azure.Management.ResourceManager.Fluent.Core; using Microsoft.Azure.Management.ResourceManager.Fluent.Core.ChildResourceActions; using System.Collections.Generic; internal partial class RegistryDockerTaskStepImpl { /// <summary> /// Gets the arguments this Docker task step. /// </summary> System.Collections.Generic.IReadOnlyList<Models.Argument> Microsoft.Azure.Management.ContainerRegistry.Fluent.IRegistryDockerTaskStep.Arguments { get { return this.Arguments(); } } /// <summary> /// Gets Docker file path for this Docker task step. /// </summary> string Microsoft.Azure.Management.ContainerRegistry.Fluent.IRegistryDockerTaskStep.DockerFilePath { get { return this.DockerFilePath(); } } /// <summary> /// Gets the image names of this Docker task step. /// </summary> System.Collections.Generic.IReadOnlyList<string> Microsoft.Azure.Management.ContainerRegistry.Fluent.IRegistryDockerTaskStep.ImageNames { get { return this.ImageNames(); } } /// <summary> /// Gets whether push is enabled for this Docker task step. /// </summary> bool Microsoft.Azure.Management.ContainerRegistry.Fluent.IRegistryDockerTaskStep.IsPushEnabled { get { return this.IsPushEnabled(); } } /// <summary> /// Gets whether there is no cache for this Docker task step. /// </summary> bool Microsoft.Azure.Management.ContainerRegistry.Fluent.IRegistryDockerTaskStep.NoCache { get { return this.NoCache(); } } /// <summary> /// Attaches this child object's definition to its parent's definition. /// </summary> /// <return>The next stage of the parent object's definition.</return> RegistryTask.Definition.ISourceTriggerDefinition Microsoft.Azure.Management.ResourceManager.Fluent.Core.ChildResourceActions.IAttachable<RegistryTask.Definition.ISourceTriggerDefinition>.Attach() { return this.Attach(); } /// <summary> /// Begins an update for a child resource. /// This is the beginning of the builder pattern used to update child resources /// The final method completing the update and continue /// the actual parent resource update process in Azure is Settable.parent(). /// </summary> /// <return>The stage of parent resource update.</return> RegistryTask.Update.IUpdate Microsoft.Azure.Management.ResourceManager.Fluent.Core.ChildResourceActions.ISettable<RegistryTask.Update.IUpdate>.Parent() { return this.Parent(); } /// <summary> /// The function that specifies the task has a cache. /// </summary> /// <param name="enabled">Whether caching is enabled.</param> /// <return>The next stage of the container registry DockerTaskStep update.</return> RegistryDockerTaskStep.Update.IUpdate RegistryDockerTaskStep.Update.ICache.WithCacheEnabled(bool enabled) { return this.WithCacheEnabled(enabled); } /// <summary> /// The function that specifies the use of a cache based on user input parameter. 
/// </summary> /// <param name="enabled">Whether caching will be enabled.</param> /// <return>The next step of the container registry DockerTaskStep definition.</return> RegistryDockerTaskStep.Definition.IDockerTaskStepAttachable RegistryDockerTaskStep.Definition.IDockerTaskStepAttachable.WithCacheEnabled(bool enabled) { return this.WithCacheEnabled(enabled); } /// <summary> /// The function that specifies the path to the Docker file. /// </summary> /// <param name="path">The path to the Docker file.</param> /// <return>The next stage of the container registry DockerTaskStep definition.</return> RegistryDockerTaskStep.Definition.IDockerTaskStepAttachable RegistryDockerTaskStep.Definition.IDockerFilePath.WithDockerFilePath(string path) { return this.WithDockerFilePath(path); } /// <summary> /// The function that specifies the path to the Docker file. /// </summary> /// <param name="path">The path to the Docker file.</param> /// <return>The next stage of the container registry DockerTaskStep update.</return> RegistryDockerTaskStep.Update.IUpdate RegistryDockerTaskStep.Update.IDockerFilePath.WithDockerFilePath(string path) { return this.WithDockerFilePath(path); } /// <summary> /// The function that specifies the image names. /// </summary> /// <param name="imageNames">The list of the names of the images.</param> /// <return>The next stage of the container registry DockerTaskStep update.</return> RegistryDockerTaskStep.Update.IUpdate RegistryDockerTaskStep.Update.IImageNames.WithImageNames(IList<string> imageNames) { return this.WithImageNames(imageNames); } /// <summary> /// The function that specifies the list of image names. /// </summary> /// <param name="imageNames">The image names.</param> /// <return>The next step of the container registry DockerTaskStep definition.</return> RegistryDockerTaskStep.Definition.IDockerTaskStepAttachable RegistryDockerTaskStep.Definition.IDockerTaskStepAttachable.WithImageNames(IList<string> imageNames) { return this.WithImageNames(imageNames); } /// <summary> /// The function that specifies the overriding argument and what it will override. /// </summary> /// <param name="name">The name of the value to be overridden.</param> /// <param name="overridingArgument">The content of the overriding argument.</param> /// <return>The next stage of the container Docker task step update.</return> RegistryDockerTaskStep.Update.IUpdate RegistryDockerTaskStep.Update.IOverridingArgumentUpdate.WithOverridingArgument(string name, OverridingArgument overridingArgument) { return this.WithOverridingArgument(name, overridingArgument); } /// <summary> /// The function that specifies the overriding argument and what it will override. /// </summary> /// <param name="name">The name of the value to be overridden.</param> /// <param name="overridingArgument">The content of the overriding argument.</param> /// <return>The next stage of the container Docker task step definition.</return> RegistryDockerTaskStep.Definition.IDockerTaskStepAttachable RegistryDockerTaskStep.Definition.IDockerTaskStepAttachable.WithOverridingArgument(string name, OverridingArgument overridingArgument) { return this.WithOverridingArgument(name, overridingArgument); } /// <summary> /// The function that specifies the overriding arguments and what they will override. 
/// </summary> /// <param name="overridingArguments">Map with key of the name of the value to be overridden and value OverridingArgument specifying the content of the overriding argument.</param> /// <return>The next stage of the container Docker task step update.</return> RegistryDockerTaskStep.Update.IUpdate RegistryDockerTaskStep.Update.IOverridingArgumentUpdate.WithOverridingArguments(IDictionary<string,Microsoft.Azure.Management.ContainerRegistry.Fluent.OverridingArgument> overridingArguments) { return this.WithOverridingArguments(overridingArguments); } /// <summary> /// The function that specifies the overriding arguments and what they will override. /// </summary> /// <param name="overridingArguments">Map with key of the name of the value to be overridden and value OverridingArgument specifying the content of the overriding argument.</param> /// <return>The next stage of the container Docker task step definition.</return> RegistryDockerTaskStep.Definition.IDockerTaskStepAttachable RegistryDockerTaskStep.Definition.IDockerTaskStepAttachable.WithOverridingArguments(IDictionary<string,Microsoft.Azure.Management.ContainerRegistry.Fluent.OverridingArgument> overridingArguments) { return this.WithOverridingArguments(overridingArguments); } /// <summary> /// The function that specifies push is enabled. /// </summary> /// <param name="enabled">Whether push is enabled.</param> /// <return>The next stage of the container registry DockerTaskStep update.</return> RegistryDockerTaskStep.Update.IUpdate RegistryDockerTaskStep.Update.IPush.WithPushEnabled(bool enabled) { return this.WithPushEnabled(enabled); } /// <summary> /// The function that enables push depending on user input parameter. /// </summary> /// <param name="enabled">Whether push will be enabled.</param> /// <return>The next step of the container registry DockerTaskStep definition.</return> RegistryDockerTaskStep.Definition.IDockerTaskStepAttachable RegistryDockerTaskStep.Definition.IDockerTaskStepAttachable.WithPushEnabled(bool enabled) { return this.WithPushEnabled(enabled); } } } #region S# License /****************************************************************************************** NOTICE!!! This program and source code is owned and licensed by StockSharp, LLC, www.stocksharp.com Viewing or use of this code requires your acceptance of the license agreement found at https://github.com/StockSharp/StockSharp/blob/master/LICENSE Removal of this comment is a violation of the license agreement. Project: StockSharp.Algo.Candles.Compression.Algo File: RealTimeCandleBuilderSource.cs Created: 2015, 11, 11, 2:32 PM Copyright 2010 by StockSharp, LLC *******************************************************************************************/ #endregion S# License namespace StockSharp.Algo.Candles.Compression { using System; using System.Collections.Generic; using System.Linq; using Ecng.Collections; using Ecng.ComponentModel; using StockSharp.BusinessEntities; /// <summary> /// The base data source for <see cref="ICandleBuilder"/> which receives data from <see cref="IConnector"/>. 
/// </summary> /// <typeparam name="T">The source data type (for example, <see cref="Trade"/>).</typeparam> public abstract class RealTimeCandleBuilderSource<T> : ConvertableCandleBuilderSource<T> { private readonly SynchronizedDictionary<Security, CachedSynchronizedList<CandleSeries>> _registeredSeries = new SynchronizedDictionary<Security, CachedSynchronizedList<CandleSeries>>(); private readonly OrderedPriorityQueue<DateTimeOffset, CandleSeries> _seriesByDates = new OrderedPriorityQueue<DateTimeOffset, CandleSeries>(); /// <summary> /// Initializes a new instance of the <see cref="RealTimeCandleBuilderSource{T}"/>. /// </summary> /// <param name="connector">The connection through which new data will be received.</param> protected RealTimeCandleBuilderSource(IConnector connector) { if (connector == null) throw new ArgumentNullException(nameof(connector)); Connector = connector; Connector.MarketTimeChanged += OnConnectorMarketTimeChanged; } /// <summary> /// The source priority by speed (0 - the best). /// </summary> public override int SpeedPriority => 1; /// <summary> /// The connection through which new data will be received. /// </summary> public IConnector Connector { get; } /// <summary> /// To send data request. /// </summary> /// <param name="series">The candles series for which data receiving should be started.</param> /// <param name="from">The initial date from which you need to get data.</param> /// <param name="to">The final date by which you need to get data.</param> public override void Start(CandleSeries series, DateTimeOffset from, DateTimeOffset to) { if (series == null) throw new ArgumentNullException(nameof(series)); bool registerSecurity; series.IsNew = true; _registeredSeries.SafeAdd(series.Security, out registerSecurity).Add(series); if (registerSecurity) RegisterSecurity(series.Security); _seriesByDates.Add(new KeyValuePair<DateTimeOffset, CandleSeries>(to, series)); } /// <summary> /// To stop data receiving starting through <see cref="Start"/>. /// </summary> /// <param name="series">Candles series.</param> public override void Stop(CandleSeries series) { if (series == null) throw new ArgumentNullException(nameof(series)); var registeredSeries = _registeredSeries.TryGetValue(series.Security); if (registeredSeries == null) return; registeredSeries.Remove(series); if (registeredSeries.Count == 0) { UnRegisterSecurity(series.Security); _registeredSeries.Remove(series.Security); } _seriesByDates.RemoveWhere(i => i.Value == series); RaiseStopped(series); } /// <summary> /// To register the getting data for the instrument. /// </summary> /// <param name="security">Security.</param> protected abstract void RegisterSecurity(Security security); /// <summary> /// To stop the getting data for the instrument. /// </summary> /// <param name="security">Security.</param> protected abstract void UnRegisterSecurity(Security security); /// <summary> /// To get previously accumulated values. /// </summary> /// <param name="security">Security.</param> /// <returns>Accumulated values.</returns> protected abstract IEnumerable<T> GetSecurityValues(Security security); /// <summary> /// Synchronously to add new data received from <see cref="Connector"/>. 
/// </summary> /// <param name="values">New data.</param> protected void AddNewValues(IEnumerable<T> values) { if (_registeredSeries.Count == 0) return; foreach (var group in Convert(values).GroupBy(v => v.Security)) { var security = group.Key; var registeredSeries = _registeredSeries.TryGetValue(security); if (registeredSeries == null) continue; var seriesCache = registeredSeries.Cache; var securityValues = group.OrderBy(v => v.Time).ToArray(); foreach (var series in seriesCache) { if (series.IsNew) { RaiseProcessing(series, Convert(GetSecurityValues(security)).OrderBy(v => v.Time)); series.IsNew = false; } else { RaiseProcessing(series, securityValues); } } } } private void OnConnectorMarketTimeChanged(TimeSpan value) { if (_seriesByDates.Count == 0) return; var pair = _seriesByDates.Peek(); while (pair.Key <= Connector.CurrentTime) { _seriesByDates.Dequeue(); Stop(pair.Value); if (_seriesByDates.Count == 0) break; pair = _seriesByDates.Peek(); } } } /// <summary> /// The data source for <see cref="CandleBuilder{T}"/> which creates <see cref="ICandleBuilderSourceValue"/> from tick trades <see cref="Trade"/>. /// </summary> public class TradeCandleBuilderSource : RealTimeCandleBuilderSource<Trade> { /// <summary> /// Initializes a new instance of the <see cref="TradeCandleBuilderSource"/>. /// </summary> /// <param name="connector">The connection through which new trades will be received using the event <see cref="IConnector.NewTrades"/>.</param> public TradeCandleBuilderSource(IConnector connector) : base(connector) { Connector.NewTrades += AddNewValues; } /// <summary> /// To get time ranges for which this source of passed candles series has data. /// </summary> /// <param name="series">Candles series.</param> /// <returns>Time ranges.</returns> public override IEnumerable<Range<DateTimeOffset>> GetSupportedRanges(CandleSeries series) { if (series == null) throw new ArgumentNullException(nameof(series)); var trades = GetSecurityValues(series.Security); yield return new Range<DateTimeOffset>(trades.IsEmpty() ? Connector.CurrentTime : trades.Min(v => v.Time), DateTimeOffset.MaxValue); } /// <summary> /// To register the getting data for the instrument. /// </summary> /// <param name="security">Security.</param> protected override void RegisterSecurity(Security security) { Connector.RegisterTrades(security); } /// <summary> /// To stop the getting data for the instrument. /// </summary> /// <param name="security">Security.</param> protected override void UnRegisterSecurity(Security security) { Connector.UnRegisterTrades(security); } /// <summary> /// To get previously accumulated values. /// </summary> /// <param name="security">Security.</param> /// <returns>Accumulated values.</returns> protected override IEnumerable<Trade> GetSecurityValues(Security security) { return Connector.Trades.Filter(security); } /// <summary> /// Release resources. /// </summary> protected override void DisposeManaged() { Connector.NewTrades -= AddNewValues; base.DisposeManaged(); } } /// <summary> /// The data source for <see cref="CandleBuilder{T}"/> which creates <see cref="ICandleBuilderSourceValue"/> from the order book <see cref="MarketDepth"/>. /// </summary> public class MarketDepthCandleBuilderSource : RealTimeCandleBuilderSource<MarketDepth> { /// <summary> /// Initializes a new instance of the <see cref="MarketDepthCandleBuilderSource"/>. 
/// </summary> /// <param name="connector">The connection through which changed order books will be received using the event <see cref="IConnector.MarketDepthsChanged"/>.</param> public MarketDepthCandleBuilderSource(IConnector connector) : base(connector) { Connector.MarketDepthsChanged += OnMarketDepthsChanged; } /// <summary> /// To get time ranges for which this source of passed candles series has data. /// </summary> /// <param name="series">Candles series.</param> /// <returns>Time ranges.</returns> public override IEnumerable<Range<DateTimeOffset>> GetSupportedRanges(CandleSeries series) { if (series == null) throw new ArgumentNullException(nameof(series)); yield return new Range<DateTimeOffset>(Connector.CurrentTime, DateTimeOffset.MaxValue); } /// <summary> /// To register the getting data for the instrument. /// </summary> /// <param name="security">Security.</param> protected override void RegisterSecurity(Security security) { Connector.RegisterMarketDepth(security); } /// <summary> /// To stop the getting data for the instrument. /// </summary> /// <param name="security">Security.</param> protected override void UnRegisterSecurity(Security security) { Connector.UnRegisterMarketDepth(security); } /// <summary> /// To get previously accumulated values. /// </summary> /// <param name="security">Security.</param> /// <returns>Accumulated values.</returns> protected override IEnumerable<MarketDepth> GetSecurityValues(Security security) { return Enumerable.Empty<MarketDepth>(); } private void OnMarketDepthsChanged(IEnumerable<MarketDepth> depths) { AddNewValues(depths.Select(d => d.Clone())); } /// <summary> /// Release resources. /// </summary> protected override void DisposeManaged() { Connector.MarketDepthsChanged -= OnMarketDepthsChanged; base.DisposeManaged(); } } } /* * Infoplus API * * Infoplus API. * * OpenAPI spec version: v1.0 * Contact: api@infopluscommerce.com * Generated by: https://github.com/swagger-api/swagger-codegen.git * * Licensed under the Apache License, Version 2.0 (the "License"); * you may not use this file except in compliance with the License. * You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 * * Unless required by applicable law or agreed to in writing, software * distributed under the License is distributed on an "AS IS" BASIS, * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. * See the License for the specific language governing permissions and * limitations under the License. */ using System; using System.Linq; using System.IO; using System.Text; using System.Collections; using System.Collections.Generic; using System.Collections.ObjectModel; using System.Runtime.Serialization; using Newtonsoft.Json; using Newtonsoft.Json.Converters; namespace Infoplus.Model { /// <summary> /// ShoppingCartConnection /// </summary> [DataContract] public partial class ShoppingCartConnection : IEquatable<ShoppingCartConnection> { /// <summary> /// Initializes a new instance of the <see cref="ShoppingCartConnection" /> class. /// </summary> [JsonConstructorAttribute] protected ShoppingCartConnection() { } /// <summary> /// Initializes a new instance of the <see cref="ShoppingCartConnection" /> class. 
/// </summary> /// <param name="LobId">LobId (required).</param> /// <param name="OrderSourceId">OrderSourceId (required).</param> /// <param name="IntegrationPartnerId">IntegrationPartnerId (required).</param> /// <param name="ConnectionType">ConnectionType (required).</param> /// <param name="ItemFilterId">ItemFilterId.</param> /// <param name="InfoplusSKUFieldToMap">InfoplusSKUFieldToMap (required).</param> /// <param name="ShoppingCartSKUFieldToMap">ShoppingCartSKUFieldToMap (required).</param> /// <param name="Name">Name (required).</param> /// <param name="ShoppingCartStoreURL">ShoppingCartStoreURL (required).</param> /// <param name="AccessCode">AccessCode.</param> /// <param name="AccessToken">AccessToken.</param> /// <param name="SyncOrders">SyncOrders (required) (default to false).</param> /// <param name="SyncInventory">SyncInventory (required) (default to false).</param> /// <param name="SyncTrackingData">SyncTrackingData (required) (default to false).</param> public ShoppingCartConnection(int? LobId = null, int? OrderSourceId = null, int? IntegrationPartnerId = null, string ConnectionType = null, int? ItemFilterId = null, string InfoplusSKUFieldToMap = null, string ShoppingCartSKUFieldToMap = null, string Name = null, string ShoppingCartStoreURL = null, string AccessCode = null, string AccessToken = null, bool? SyncOrders = null, bool? SyncInventory = null, bool? SyncTrackingData = null) { // to ensure "LobId" is required (not null) if (LobId == null) { throw new InvalidDataException("LobId is a required property for ShoppingCartConnection and cannot be null"); } else { this.LobId = LobId; } // to ensure "OrderSourceId" is required (not null) if (OrderSourceId == null) { throw new InvalidDataException("OrderSourceId is a required property for ShoppingCartConnection and cannot be null"); } else { this.OrderSourceId = OrderSourceId; } // to ensure "IntegrationPartnerId" is required (not null) if (IntegrationPartnerId == null) { throw new InvalidDataException("IntegrationPartnerId is a required property for ShoppingCartConnection and cannot be null"); } else { this.IntegrationPartnerId = IntegrationPartnerId; } // to ensure "ConnectionType" is required (not null) if (ConnectionType == null) { throw new InvalidDataException("ConnectionType is a required property for ShoppingCartConnection and cannot be null"); } else { this.ConnectionType = ConnectionType; } // to ensure "InfoplusSKUFieldToMap" is required (not null) if (InfoplusSKUFieldToMap == null) { throw new InvalidDataException("InfoplusSKUFieldToMap is a required property for ShoppingCartConnection and cannot be null"); } else { this.InfoplusSKUFieldToMap = InfoplusSKUFieldToMap; } // to ensure "ShoppingCartSKUFieldToMap" is required (not null) if (ShoppingCartSKUFieldToMap == null) { throw new InvalidDataException("ShoppingCartSKUFieldToMap is a required property for ShoppingCartConnection and cannot be null"); } else { this.ShoppingCartSKUFieldToMap = ShoppingCartSKUFieldToMap; } // to ensure "Name" is required (not null) if (Name == null) { throw new InvalidDataException("Name is a required property for ShoppingCartConnection and cannot be null"); } else { this.Name = Name; } // to ensure "ShoppingCartStoreURL" is required (not null) if (ShoppingCartStoreURL == null) { throw new InvalidDataException("ShoppingCartStoreURL is a required property for ShoppingCartConnection and cannot be null"); } else { this.ShoppingCartStoreURL = ShoppingCartStoreURL; } // to ensure "SyncOrders" is required (not null) if (SyncOrders == 
null) { throw new InvalidDataException("SyncOrders is a required property for ShoppingCartConnection and cannot be null"); } else { this.SyncOrders = SyncOrders; } // to ensure "SyncInventory" is required (not null) if (SyncInventory == null) { throw new InvalidDataException("SyncInventory is a required property for ShoppingCartConnection and cannot be null"); } else { this.SyncInventory = SyncInventory; } // to ensure "SyncTrackingData" is required (not null) if (SyncTrackingData == null) { throw new InvalidDataException("SyncTrackingData is a required property for ShoppingCartConnection and cannot be null"); } else { this.SyncTrackingData = SyncTrackingData; } this.ItemFilterId = ItemFilterId; this.AccessCode = AccessCode; this.AccessToken = AccessToken; } /// <summary> /// Gets or Sets Id /// </summary> [DataMember(Name="id", EmitDefaultValue=false)] public int? Id { get; private set; } /// <summary> /// Gets or Sets CreateDate /// </summary> [DataMember(Name="createDate", EmitDefaultValue=false)] public DateTime? CreateDate { get; private set; } /// <summary> /// Gets or Sets ModifyDate /// </summary> [DataMember(Name="modifyDate", EmitDefaultValue=false)] public DateTime? ModifyDate { get; private set; } /// <summary> /// Gets or Sets ClientId /// </summary> [DataMember(Name="clientId", EmitDefaultValue=false)] public int? ClientId { get; private set; } /// <summary> /// Gets or Sets Nonce /// </summary> [DataMember(Name="nonce", EmitDefaultValue=false)] public string Nonce { get; private set; } /// <summary> /// Gets or Sets LobId /// </summary> [DataMember(Name="lobId", EmitDefaultValue=false)] public int? LobId { get; set; } /// <summary> /// Gets or Sets OrderSourceId /// </summary> [DataMember(Name="orderSourceId", EmitDefaultValue=false)] public int? OrderSourceId { get; set; } /// <summary> /// Gets or Sets IntegrationPartnerId /// </summary> [DataMember(Name="integrationPartnerId", EmitDefaultValue=false)] public int? IntegrationPartnerId { get; set; } /// <summary> /// Gets or Sets ConnectionType /// </summary> [DataMember(Name="connectionType", EmitDefaultValue=false)] public string ConnectionType { get; set; } /// <summary> /// Gets or Sets ItemFilterId /// </summary> [DataMember(Name="itemFilterId", EmitDefaultValue=false)] public int? ItemFilterId { get; set; } /// <summary> /// Gets or Sets InfoplusSKUFieldToMap /// </summary> [DataMember(Name="infoplusSKUFieldToMap", EmitDefaultValue=false)] public string InfoplusSKUFieldToMap { get; set; } /// <summary> /// Gets or Sets ShoppingCartSKUFieldToMap /// </summary> [DataMember(Name="shoppingCartSKUFieldToMap", EmitDefaultValue=false)] public string ShoppingCartSKUFieldToMap { get; set; } /// <summary> /// Gets or Sets Name /// </summary> [DataMember(Name="name", EmitDefaultValue=false)] public string Name { get; set; } /// <summary> /// Gets or Sets ShoppingCartStoreURL /// </summary> [DataMember(Name="shoppingCartStoreURL", EmitDefaultValue=false)] public string ShoppingCartStoreURL { get; set; } /// <summary> /// Gets or Sets AccessCode /// </summary> [DataMember(Name="accessCode", EmitDefaultValue=false)] public string AccessCode { get; set; } /// <summary> /// Gets or Sets AccessToken /// </summary> [DataMember(Name="accessToken", EmitDefaultValue=false)] public string AccessToken { get; set; } /// <summary> /// Gets or Sets SyncOrders /// </summary> [DataMember(Name="syncOrders", EmitDefaultValue=false)] public bool? 
SyncOrders { get; set; } /// <summary> /// Gets or Sets SyncInventory /// </summary> [DataMember(Name="syncInventory", EmitDefaultValue=false)] public bool? SyncInventory { get; set; } /// <summary> /// Gets or Sets SyncTrackingData /// </summary> [DataMember(Name="syncTrackingData", EmitDefaultValue=false)] public bool? SyncTrackingData { get; set; } /// <summary> /// Gets or Sets SyncInventoryLevelsLastRunTime /// </summary> [DataMember(Name="syncInventoryLevelsLastRunTime", EmitDefaultValue=false)] public DateTime? SyncInventoryLevelsLastRunTime { get; private set; } /// <summary> /// Returns the string presentation of the object /// </summary> /// <returns>String presentation of the object</returns> public override string ToString() { var sb = new StringBuilder(); sb.Append("class ShoppingCartConnection {\n"); sb.Append(" Id: ").Append(Id).Append("\n"); sb.Append(" CreateDate: ").Append(CreateDate).Append("\n"); sb.Append(" ModifyDate: ").Append(ModifyDate).Append("\n"); sb.Append(" ClientId: ").Append(ClientId).Append("\n"); sb.Append(" Nonce: ").Append(Nonce).Append("\n"); sb.Append(" LobId: ").Append(LobId).Append("\n"); sb.Append(" OrderSourceId: ").Append(OrderSourceId).Append("\n"); sb.Append(" IntegrationPartnerId: ").Append(IntegrationPartnerId).Append("\n"); sb.Append(" ConnectionType: ").Append(ConnectionType).Append("\n"); sb.Append(" ItemFilterId: ").Append(ItemFilterId).Append("\n"); sb.Append(" InfoplusSKUFieldToMap: ").Append(InfoplusSKUFieldToMap).Append("\n"); sb.Append(" ShoppingCartSKUFieldToMap: ").Append(ShoppingCartSKUFieldToMap).Append("\n"); sb.Append(" Name: ").Append(Name).Append("\n"); sb.Append(" ShoppingCartStoreURL: ").Append(ShoppingCartStoreURL).Append("\n"); sb.Append(" AccessCode: ").Append(AccessCode).Append("\n"); sb.Append(" AccessToken: ").Append(AccessToken).Append("\n"); sb.Append(" SyncOrders: ").Append(SyncOrders).Append("\n"); sb.Append(" SyncInventory: ").Append(SyncInventory).Append("\n"); sb.Append(" SyncTrackingData: ").Append(SyncTrackingData).Append("\n"); sb.Append(" SyncInventoryLevelsLastRunTime: ").Append(SyncInventoryLevelsLastRunTime).Append("\n"); sb.Append("}\n"); return sb.ToString(); } /// <summary> /// Returns the JSON string presentation of the object /// </summary> /// <returns>JSON string presentation of the object</returns> public string ToJson() { return JsonConvert.SerializeObject(this, Formatting.Indented); } /// <summary> /// Returns true if objects are equal /// </summary> /// <param name="obj">Object to be compared</param> /// <returns>Boolean</returns> public override bool Equals(object obj) { // credit: http://stackoverflow.com/a/10454552/677735 return this.Equals(obj as ShoppingCartConnection); } /// <summary> /// Returns true if ShoppingCartConnection instances are equal /// </summary> /// <param name="other">Instance of ShoppingCartConnection to be compared</param> /// <returns>Boolean</returns> public bool Equals(ShoppingCartConnection other) { // credit: http://stackoverflow.com/a/10454552/677735 if (other == null) return false; return ( this.Id == other.Id || this.Id != null && this.Id.Equals(other.Id) ) && ( this.CreateDate == other.CreateDate || this.CreateDate != null && this.CreateDate.Equals(other.CreateDate) ) && ( this.ModifyDate == other.ModifyDate || this.ModifyDate != null && this.ModifyDate.Equals(other.ModifyDate) ) && ( this.ClientId == other.ClientId || this.ClientId != null && this.ClientId.Equals(other.ClientId) ) && ( this.Nonce == other.Nonce || this.Nonce != null && 
this.Nonce.Equals(other.Nonce) ) && ( this.LobId == other.LobId || this.LobId != null && this.LobId.Equals(other.LobId) ) && ( this.OrderSourceId == other.OrderSourceId || this.OrderSourceId != null && this.OrderSourceId.Equals(other.OrderSourceId) ) && ( this.IntegrationPartnerId == other.IntegrationPartnerId || this.IntegrationPartnerId != null && this.IntegrationPartnerId.Equals(other.IntegrationPartnerId) ) && ( this.ConnectionType == other.ConnectionType || this.ConnectionType != null && this.ConnectionType.Equals(other.ConnectionType) ) && ( this.ItemFilterId == other.ItemFilterId || this.ItemFilterId != null && this.ItemFilterId.Equals(other.ItemFilterId) ) && ( this.InfoplusSKUFieldToMap == other.InfoplusSKUFieldToMap || this.InfoplusSKUFieldToMap != null && this.InfoplusSKUFieldToMap.Equals(other.InfoplusSKUFieldToMap) ) && ( this.ShoppingCartSKUFieldToMap == other.ShoppingCartSKUFieldToMap || this.ShoppingCartSKUFieldToMap != null && this.ShoppingCartSKUFieldToMap.Equals(other.ShoppingCartSKUFieldToMap) ) && ( this.Name == other.Name || this.Name != null && this.Name.Equals(other.Name) ) && ( this.ShoppingCartStoreURL == other.ShoppingCartStoreURL || this.ShoppingCartStoreURL != null && this.ShoppingCartStoreURL.Equals(other.ShoppingCartStoreURL) ) && ( this.AccessCode == other.AccessCode || this.AccessCode != null && this.AccessCode.Equals(other.AccessCode) ) && ( this.AccessToken == other.AccessToken || this.AccessToken != null && this.AccessToken.Equals(other.AccessToken) ) && ( this.SyncOrders == other.SyncOrders || this.SyncOrders != null && this.SyncOrders.Equals(other.SyncOrders) ) && ( this.SyncInventory == other.SyncInventory || this.SyncInventory != null && this.SyncInventory.Equals(other.SyncInventory) ) && ( this.SyncTrackingData == other.SyncTrackingData || this.SyncTrackingData != null && this.SyncTrackingData.Equals(other.SyncTrackingData) ) && ( this.SyncInventoryLevelsLastRunTime == other.SyncInventoryLevelsLastRunTime || this.SyncInventoryLevelsLastRunTime != null && this.SyncInventoryLevelsLastRunTime.Equals(other.SyncInventoryLevelsLastRunTime) ); } /// <summary> /// Gets the hash code /// </summary> /// <returns>Hash code</returns> public override int GetHashCode() { // credit: http://stackoverflow.com/a/263416/677735 unchecked // Overflow is fine, just wrap { int hash = 41; // Suitable nullity checks etc, of course :) if (this.Id != null) hash = hash * 59 + this.Id.GetHashCode(); if (this.CreateDate != null) hash = hash * 59 + this.CreateDate.GetHashCode(); if (this.ModifyDate != null) hash = hash * 59 + this.ModifyDate.GetHashCode(); if (this.ClientId != null) hash = hash * 59 + this.ClientId.GetHashCode(); if (this.Nonce != null) hash = hash * 59 + this.Nonce.GetHashCode(); if (this.LobId != null) hash = hash * 59 + this.LobId.GetHashCode(); if (this.OrderSourceId != null) hash = hash * 59 + this.OrderSourceId.GetHashCode(); if (this.IntegrationPartnerId != null) hash = hash * 59 + this.IntegrationPartnerId.GetHashCode(); if (this.ConnectionType != null) hash = hash * 59 + this.ConnectionType.GetHashCode(); if (this.ItemFilterId != null) hash = hash * 59 + this.ItemFilterId.GetHashCode(); if (this.InfoplusSKUFieldToMap != null) hash = hash * 59 + this.InfoplusSKUFieldToMap.GetHashCode(); if (this.ShoppingCartSKUFieldToMap != null) hash = hash * 59 + this.ShoppingCartSKUFieldToMap.GetHashCode(); if (this.Name != null) hash = hash * 59 + this.Name.GetHashCode(); if (this.ShoppingCartStoreURL != null) hash = hash * 59 + 
this.ShoppingCartStoreURL.GetHashCode(); if (this.AccessCode != null) hash = hash * 59 + this.AccessCode.GetHashCode(); if (this.AccessToken != null) hash = hash * 59 + this.AccessToken.GetHashCode(); if (this.SyncOrders != null) hash = hash * 59 + this.SyncOrders.GetHashCode(); if (this.SyncInventory != null) hash = hash * 59 + this.SyncInventory.GetHashCode(); if (this.SyncTrackingData != null) hash = hash * 59 + this.SyncTrackingData.GetHashCode(); if (this.SyncInventoryLevelsLastRunTime != null) hash = hash * 59 + this.SyncInventoryLevelsLastRunTime.GetHashCode(); return hash; } } } } using System; using System.Collections.Generic; using System.Collections.ObjectModel; using System.ComponentModel; using System.Diagnostics; using System.Diagnostics.CodeAnalysis; using System.Globalization; using System.Linq; using System.Net.Http; using System.Net.Http.Headers; using System.Web.Http; using System.Web.Http.Controllers; using System.Web.Http.Description; using ErrorHandling.Areas.HelpPage.ModelDescriptions; using ErrorHandling.Areas.HelpPage.Models; namespace ErrorHandling.Areas.HelpPage { public static class HelpPageConfigurationExtensions { private const string ApiModelPrefix = "MS_HelpPageApiModel_"; /// <summary> /// Sets the documentation provider for help page. /// </summary> /// <param name="config">The <see cref="HttpConfiguration"/>.</param> /// <param name="documentationProvider">The documentation provider.</param> public static void SetDocumentationProvider(this HttpConfiguration config, IDocumentationProvider documentationProvider) { config.Services.Replace(typeof(IDocumentationProvider), documentationProvider); } /// <summary> /// Sets the objects that will be used by the formatters to produce sample requests/responses. /// </summary> /// <param name="config">The <see cref="HttpConfiguration"/>.</param> /// <param name="sampleObjects">The sample objects.</param> public static void SetSampleObjects(this HttpConfiguration config, IDictionary<Type, object> sampleObjects) { config.GetHelpPageSampleGenerator().SampleObjects = sampleObjects; } /// <summary> /// Sets the sample request directly for the specified media type and action. /// </summary> /// <param name="config">The <see cref="HttpConfiguration"/>.</param> /// <param name="sample">The sample request.</param> /// <param name="mediaType">The media type.</param> /// <param name="controllerName">Name of the controller.</param> /// <param name="actionName">Name of the action.</param> public static void SetSampleRequest(this HttpConfiguration config, object sample, MediaTypeHeaderValue mediaType, string controllerName, string actionName) { config.GetHelpPageSampleGenerator().ActionSamples.Add(new HelpPageSampleKey(mediaType, SampleDirection.Request, controllerName, actionName, new[] { "*" }), sample); } /// <summary> /// Sets the sample request directly for the specified media type and action with parameters. 
/// </summary> /// <param name="config">The <see cref="HttpConfiguration"/>.</param> /// <param name="sample">The sample request.</param> /// <param name="mediaType">The media type.</param> /// <param name="controllerName">Name of the controller.</param> /// <param name="actionName">Name of the action.</param> /// <param name="parameterNames">The parameter names.</param> public static void SetSampleRequest(this HttpConfiguration config, object sample, MediaTypeHeaderValue mediaType, string controllerName, string actionName, params string[] parameterNames) { config.GetHelpPageSampleGenerator().ActionSamples.Add(new HelpPageSampleKey(mediaType, SampleDirection.Request, controllerName, actionName, parameterNames), sample); } /// <summary> /// Sets the sample request directly for the specified media type of the action. /// </summary> /// <param name="config">The <see cref="HttpConfiguration"/>.</param> /// <param name="sample">The sample response.</param> /// <param name="mediaType">The media type.</param> /// <param name="controllerName">Name of the controller.</param> /// <param name="actionName">Name of the action.</param> public static void SetSampleResponse(this HttpConfiguration config, object sample, MediaTypeHeaderValue mediaType, string controllerName, string actionName) { config.GetHelpPageSampleGenerator().ActionSamples.Add(new HelpPageSampleKey(mediaType, SampleDirection.Response, controllerName, actionName, new[] { "*" }), sample); } /// <summary> /// Sets the sample response directly for the specified media type of the action with specific parameters. /// </summary> /// <param name="config">The <see cref="HttpConfiguration"/>.</param> /// <param name="sample">The sample response.</param> /// <param name="mediaType">The media type.</param> /// <param name="controllerName">Name of the controller.</param> /// <param name="actionName">Name of the action.</param> /// <param name="parameterNames">The parameter names.</param> public static void SetSampleResponse(this HttpConfiguration config, object sample, MediaTypeHeaderValue mediaType, string controllerName, string actionName, params string[] parameterNames) { config.GetHelpPageSampleGenerator().ActionSamples.Add(new HelpPageSampleKey(mediaType, SampleDirection.Response, controllerName, actionName, parameterNames), sample); } /// <summary> /// Sets the sample directly for all actions with the specified media type. /// </summary> /// <param name="config">The <see cref="HttpConfiguration"/>.</param> /// <param name="sample">The sample.</param> /// <param name="mediaType">The media type.</param> public static void SetSampleForMediaType(this HttpConfiguration config, object sample, MediaTypeHeaderValue mediaType) { config.GetHelpPageSampleGenerator().ActionSamples.Add(new HelpPageSampleKey(mediaType), sample); } /// <summary> /// Sets the sample directly for all actions with the specified type and media type. 
/// </summary> /// <param name="config">The <see cref="HttpConfiguration"/>.</param> /// <param name="sample">The sample.</param> /// <param name="mediaType">The media type.</param> /// <param name="type">The parameter type or return type of an action.</param> public static void SetSampleForType(this HttpConfiguration config, object sample, MediaTypeHeaderValue mediaType, Type type) { config.GetHelpPageSampleGenerator().ActionSamples.Add(new HelpPageSampleKey(mediaType, type), sample); } /// <summary> /// Specifies the actual type of <see cref="System.Net.Http.ObjectContent{T}"/> passed to the <see cref="System.Net.Http.HttpRequestMessage"/> in an action. /// The help page will use this information to produce more accurate request samples. /// </summary> /// <param name="config">The <see cref="HttpConfiguration"/>.</param> /// <param name="type">The type.</param> /// <param name="controllerName">Name of the controller.</param> /// <param name="actionName">Name of the action.</param> public static void SetActualRequestType(this HttpConfiguration config, Type type, string controllerName, string actionName) { config.GetHelpPageSampleGenerator().ActualHttpMessageTypes.Add(new HelpPageSampleKey(SampleDirection.Request, controllerName, actionName, new[] { "*" }), type); } /// <summary> /// Specifies the actual type of <see cref="System.Net.Http.ObjectContent{T}"/> passed to the <see cref="System.Net.Http.HttpRequestMessage"/> in an action. /// The help page will use this information to produce more accurate request samples. /// </summary> /// <param name="config">The <see cref="HttpConfiguration"/>.</param> /// <param name="type">The type.</param> /// <param name="controllerName">Name of the controller.</param> /// <param name="actionName">Name of the action.</param> /// <param name="parameterNames">The parameter names.</param> public static void SetActualRequestType(this HttpConfiguration config, Type type, string controllerName, string actionName, params string[] parameterNames) { config.GetHelpPageSampleGenerator().ActualHttpMessageTypes.Add(new HelpPageSampleKey(SampleDirection.Request, controllerName, actionName, parameterNames), type); } /// <summary> /// Specifies the actual type of <see cref="System.Net.Http.ObjectContent{T}"/> returned as part of the <see cref="System.Net.Http.HttpRequestMessage"/> in an action. /// The help page will use this information to produce more accurate response samples. /// </summary> /// <param name="config">The <see cref="HttpConfiguration"/>.</param> /// <param name="type">The type.</param> /// <param name="controllerName">Name of the controller.</param> /// <param name="actionName">Name of the action.</param> public static void SetActualResponseType(this HttpConfiguration config, Type type, string controllerName, string actionName) { config.GetHelpPageSampleGenerator().ActualHttpMessageTypes.Add(new HelpPageSampleKey(SampleDirection.Response, controllerName, actionName, new[] { "*" }), type); } /// <summary> /// Specifies the actual type of <see cref="System.Net.Http.ObjectContent{T}"/> returned as part of the <see cref="System.Net.Http.HttpRequestMessage"/> in an action. /// The help page will use this information to produce more accurate response samples. 
/// </summary> /// <param name="config">The <see cref="HttpConfiguration"/>.</param> /// <param name="type">The type.</param> /// <param name="controllerName">Name of the controller.</param> /// <param name="actionName">Name of the action.</param> /// <param name="parameterNames">The parameter names.</param> public static void SetActualResponseType(this HttpConfiguration config, Type type, string controllerName, string actionName, params string[] parameterNames) { config.GetHelpPageSampleGenerator().ActualHttpMessageTypes.Add(new HelpPageSampleKey(SampleDirection.Response, controllerName, actionName, parameterNames), type); } /// <summary> /// Gets the help page sample generator. /// </summary> /// <param name="config">The <see cref="HttpConfiguration"/>.</param> /// <returns>The help page sample generator.</returns> public static HelpPageSampleGenerator GetHelpPageSampleGenerator(this HttpConfiguration config) { return (HelpPageSampleGenerator)config.Properties.GetOrAdd( typeof(HelpPageSampleGenerator), k => new HelpPageSampleGenerator()); } /// <summary> /// Sets the help page sample generator. /// </summary> /// <param name="config">The <see cref="HttpConfiguration"/>.</param> /// <param name="sampleGenerator">The help page sample generator.</param> public static void SetHelpPageSampleGenerator(this HttpConfiguration config, HelpPageSampleGenerator sampleGenerator) { config.Properties.AddOrUpdate( typeof(HelpPageSampleGenerator), k => sampleGenerator, (k, o) => sampleGenerator); } /// <summary> /// Gets the model description generator. /// </summary> /// <param name="config">The configuration.</param> /// <returns>The <see cref="ModelDescriptionGenerator"/></returns> public static ModelDescriptionGenerator GetModelDescriptionGenerator(this HttpConfiguration config) { return (ModelDescriptionGenerator)config.Properties.GetOrAdd( typeof(ModelDescriptionGenerator), k => InitializeModelDescriptionGenerator(config)); } /// <summary> /// Gets the model that represents an API displayed on the help page. The model is initialized on the first call and cached for subsequent calls. 
/// </summary> /// <param name="config">The <see cref="HttpConfiguration"/>.</param> /// <param name="apiDescriptionId">The <see cref="ApiDescription"/> ID.</param> /// <returns> /// An <see cref="HelpPageApiModel"/> /// </returns> public static HelpPageApiModel GetHelpPageApiModel(this HttpConfiguration config, string apiDescriptionId) { object model; string modelId = ApiModelPrefix + apiDescriptionId; if (!config.Properties.TryGetValue(modelId, out model)) { Collection<ApiDescription> apiDescriptions = config.Services.GetApiExplorer().ApiDescriptions; ApiDescription apiDescription = apiDescriptions.FirstOrDefault(api => String.Equals(api.GetFriendlyId(), apiDescriptionId, StringComparison.OrdinalIgnoreCase)); if (apiDescription != null) { model = GenerateApiModel(apiDescription, config); config.Properties.TryAdd(modelId, model); } } return (HelpPageApiModel)model; } private static HelpPageApiModel GenerateApiModel(ApiDescription apiDescription, HttpConfiguration config) { HelpPageApiModel apiModel = new HelpPageApiModel() { ApiDescription = apiDescription, }; ModelDescriptionGenerator modelGenerator = config.GetModelDescriptionGenerator(); HelpPageSampleGenerator sampleGenerator = config.GetHelpPageSampleGenerator(); GenerateUriParameters(apiModel, modelGenerator); GenerateRequestModelDescription(apiModel, modelGenerator, sampleGenerator); GenerateResourceDescription(apiModel, modelGenerator); GenerateSamples(apiModel, sampleGenerator); return apiModel; } private static void GenerateUriParameters(HelpPageApiModel apiModel, ModelDescriptionGenerator modelGenerator) { ApiDescription apiDescription = apiModel.ApiDescription; foreach (ApiParameterDescription apiParameter in apiDescription.ParameterDescriptions) { if (apiParameter.Source == ApiParameterSource.FromUri) { HttpParameterDescriptor parameterDescriptor = apiParameter.ParameterDescriptor; Type parameterType = null; ModelDescription typeDescription = null; ComplexTypeModelDescription complexTypeDescription = null; if (parameterDescriptor != null) { parameterType = parameterDescriptor.ParameterType; typeDescription = modelGenerator.GetOrCreateModelDescription(parameterType); complexTypeDescription = typeDescription as ComplexTypeModelDescription; } // Example: // [TypeConverter(typeof(PointConverter))] // public class Point // { // public Point(int x, int y) // { // X = x; // Y = y; // } // public int X { get; set; } // public int Y { get; set; } // } // Class Point is bindable with a TypeConverter, so Point will be added to UriParameters collection. // // public class Point // { // public int X { get; set; } // public int Y { get; set; } // } // Regular complex class Point will have properties X and Y added to UriParameters collection. 
if (complexTypeDescription != null && !IsBindableWithTypeConverter(parameterType)) { foreach (ParameterDescription uriParameter in complexTypeDescription.Properties) { apiModel.UriParameters.Add(uriParameter); } } else if (parameterDescriptor != null) { ParameterDescription uriParameter = AddParameterDescription(apiModel, apiParameter, typeDescription); if (!parameterDescriptor.IsOptional) { uriParameter.Annotations.Add(new ParameterAnnotation() { Documentation = "Required" }); } object defaultValue = parameterDescriptor.DefaultValue; if (defaultValue != null) { uriParameter.Annotations.Add(new ParameterAnnotation() { Documentation = "Default value is " + Convert.ToString(defaultValue, CultureInfo.InvariantCulture) }); } } else { Debug.Assert(parameterDescriptor == null); // If parameterDescriptor is null, this is an undeclared route parameter which only occurs // when source is FromUri. Ignored in request model and among resource parameters but listed // as a simple string here. ModelDescription modelDescription = modelGenerator.GetOrCreateModelDescription(typeof(string)); AddParameterDescription(apiModel, apiParameter, modelDescription); } } } } private static bool IsBindableWithTypeConverter(Type parameterType) { if (parameterType == null) { return false; } return TypeDescriptor.GetConverter(parameterType).CanConvertFrom(typeof(string)); } private static ParameterDescription AddParameterDescription(HelpPageApiModel apiModel, ApiParameterDescription apiParameter, ModelDescription typeDescription) { ParameterDescription parameterDescription = new ParameterDescription { Name = apiParameter.Name, Documentation = apiParameter.Documentation, TypeDescription = typeDescription, }; apiModel.UriParameters.Add(parameterDescription); return parameterDescription; } private static void GenerateRequestModelDescription(HelpPageApiModel apiModel, ModelDescriptionGenerator modelGenerator, HelpPageSampleGenerator sampleGenerator) { ApiDescription apiDescription = apiModel.ApiDescription; foreach (ApiParameterDescription apiParameter in apiDescription.ParameterDescriptions) { if (apiParameter.Source == ApiParameterSource.FromBody) { Type parameterType = apiParameter.ParameterDescriptor.ParameterType; apiModel.RequestModelDescription = modelGenerator.GetOrCreateModelDescription(parameterType); apiModel.RequestDocumentation = apiParameter.Documentation; } else if (apiParameter.ParameterDescriptor != null && apiParameter.ParameterDescriptor.ParameterType == typeof(HttpRequestMessage)) { Type parameterType = sampleGenerator.ResolveHttpRequestMessageType(apiDescription); if (parameterType != null) { apiModel.RequestModelDescription = modelGenerator.GetOrCreateModelDescription(parameterType); } } } } private static void GenerateResourceDescription(HelpPageApiModel apiModel, ModelDescriptionGenerator modelGenerator) { ResponseDescription response = apiModel.ApiDescription.ResponseDescription; Type responseType = response.ResponseType ?? 
response.DeclaredType; if (responseType != null && responseType != typeof(void)) { apiModel.ResourceDescription = modelGenerator.GetOrCreateModelDescription(responseType); } } [SuppressMessage("Microsoft.Design", "CA1031:DoNotCatchGeneralExceptionTypes", Justification = "The exception is recorded as ErrorMessages.")] private static void GenerateSamples(HelpPageApiModel apiModel, HelpPageSampleGenerator sampleGenerator) { try { foreach (var item in sampleGenerator.GetSampleRequests(apiModel.ApiDescription)) { apiModel.SampleRequests.Add(item.Key, item.Value); LogInvalidSampleAsError(apiModel, item.Value); } foreach (var item in sampleGenerator.GetSampleResponses(apiModel.ApiDescription)) { apiModel.SampleResponses.Add(item.Key, item.Value); LogInvalidSampleAsError(apiModel, item.Value); } } catch (Exception e) { apiModel.ErrorMessages.Add(String.Format(CultureInfo.CurrentCulture, "An exception has occurred while generating the sample. Exception message: {0}", HelpPageSampleGenerator.UnwrapException(e).Message)); } } private static bool TryGetResourceParameter(ApiDescription apiDescription, HttpConfiguration config, out ApiParameterDescription parameterDescription, out Type resourceType) { parameterDescription = apiDescription.ParameterDescriptions.FirstOrDefault( p => p.Source == ApiParameterSource.FromBody || (p.ParameterDescriptor != null && p.ParameterDescriptor.ParameterType == typeof(HttpRequestMessage))); if (parameterDescription == null) { resourceType = null; return false; } resourceType = parameterDescription.ParameterDescriptor.ParameterType; if (resourceType == typeof(HttpRequestMessage)) { HelpPageSampleGenerator sampleGenerator = config.GetHelpPageSampleGenerator(); resourceType = sampleGenerator.ResolveHttpRequestMessageType(apiDescription); } if (resourceType == null) { parameterDescription = null; return false; } return true; } private static ModelDescriptionGenerator InitializeModelDescriptionGenerator(HttpConfiguration config) { ModelDescriptionGenerator modelGenerator = new ModelDescriptionGenerator(config); Collection<ApiDescription> apis = config.Services.GetApiExplorer().ApiDescriptions; foreach (ApiDescription api in apis) { ApiParameterDescription parameterDescription; Type parameterType; if (TryGetResourceParameter(api, config, out parameterDescription, out parameterType)) { modelGenerator.GetOrCreateModelDescription(parameterType); } } return modelGenerator; } private static void LogInvalidSampleAsError(HelpPageApiModel apiModel, object sample) { InvalidSample invalidSample = sample as InvalidSample; if (invalidSample != null) { apiModel.ErrorMessages.Add(invalidSample.ErrorMessage); } } } } //------------------------------------------------------------------------------ // Copyright (c) Microsoft Corporation. All rights reserved. 
//------------------------------------------------------------------------------ namespace System.ServiceModel.Configuration { using System.Collections.Generic; using System.Configuration; using System.Reflection; using System.Runtime; using System.Security; using System.ServiceModel; using System.ServiceModel.Channels; using System.Xml; using System.Runtime.Diagnostics; public sealed partial class BindingsSection : ConfigurationSection, IConfigurationContextProviderInternal { static Configuration configuration; ConfigurationPropertyCollection properties; public BindingsSection() { } Dictionary<string, BindingCollectionElement> BindingCollectionElements { get { Dictionary<string, BindingCollectionElement> bindingCollectionElements = new Dictionary<string, BindingCollectionElement>(); foreach (ConfigurationProperty property in this.Properties) { bindingCollectionElements.Add(property.Name, this[property.Name]); } return bindingCollectionElements; } } new public BindingCollectionElement this[string binding] { get { return (BindingCollectionElement)base[binding]; } } protected override ConfigurationPropertyCollection Properties { get { if (this.properties == null) { this.properties = new ConfigurationPropertyCollection(); } this.UpdateBindingSections(); return this.properties; } } [ConfigurationProperty(ConfigurationStrings.BasicHttpBindingCollectionElementName, Options = ConfigurationPropertyOptions.None)] public BasicHttpBindingCollectionElement BasicHttpBinding { get { return (BasicHttpBindingCollectionElement)base[ConfigurationStrings.BasicHttpBindingCollectionElementName]; } } [ConfigurationProperty(ConfigurationStrings.BasicHttpsBindingCollectionElementName, Options = ConfigurationPropertyOptions.None)] public BasicHttpsBindingCollectionElement BasicHttpsBinding { get { return (BasicHttpsBindingCollectionElement)base[ConfigurationStrings.BasicHttpsBindingCollectionElementName]; } } // This property should only be called/set from BindingsSectionGroup TryAdd static Configuration Configuration { get { return BindingsSection.configuration; } set { BindingsSection.configuration = value; } } [ConfigurationProperty(ConfigurationStrings.CustomBindingCollectionElementName, Options = ConfigurationPropertyOptions.None)] public CustomBindingCollectionElement CustomBinding { get { return (CustomBindingCollectionElement)base[ConfigurationStrings.CustomBindingCollectionElementName]; } } [ConfigurationProperty(ConfigurationStrings.MsmqIntegrationBindingCollectionElementName, Options = ConfigurationPropertyOptions.None)] public MsmqIntegrationBindingCollectionElement MsmqIntegrationBinding { get { return (MsmqIntegrationBindingCollectionElement)base[ConfigurationStrings.MsmqIntegrationBindingCollectionElementName]; } } [ConfigurationProperty(ConfigurationStrings.NetHttpBindingCollectionElementName, Options = ConfigurationPropertyOptions.None)] public NetHttpBindingCollectionElement NetHttpBinding { get { return (NetHttpBindingCollectionElement)base[ConfigurationStrings.NetHttpBindingCollectionElementName]; } } [ConfigurationProperty(ConfigurationStrings.NetHttpsBindingCollectionElementName, Options = ConfigurationPropertyOptions.None)] public NetHttpsBindingCollectionElement NetHttpsBinding { get { return (NetHttpsBindingCollectionElement)base[ConfigurationStrings.NetHttpsBindingCollectionElementName]; } } [ConfigurationProperty(ConfigurationStrings.NetPeerTcpBindingCollectionElementName, Options = ConfigurationPropertyOptions.None)] [ObsoleteAttribute ("PeerChannel feature is obsolete and will be 
removed in the future.", false)] public NetPeerTcpBindingCollectionElement NetPeerTcpBinding { get { return (NetPeerTcpBindingCollectionElement)base[ConfigurationStrings.NetPeerTcpBindingCollectionElementName]; } } [ConfigurationProperty(ConfigurationStrings.NetMsmqBindingCollectionElementName, Options = ConfigurationPropertyOptions.None)] public NetMsmqBindingCollectionElement NetMsmqBinding { get { return (NetMsmqBindingCollectionElement)base[ConfigurationStrings.NetMsmqBindingCollectionElementName]; } } [ConfigurationProperty(ConfigurationStrings.NetNamedPipeBindingCollectionElementName, Options = ConfigurationPropertyOptions.None)] public NetNamedPipeBindingCollectionElement NetNamedPipeBinding { get { return (NetNamedPipeBindingCollectionElement)base[ConfigurationStrings.NetNamedPipeBindingCollectionElementName]; } } [ConfigurationProperty(ConfigurationStrings.NetTcpBindingCollectionElementName, Options = ConfigurationPropertyOptions.None)] public NetTcpBindingCollectionElement NetTcpBinding { get { return (NetTcpBindingCollectionElement)base[ConfigurationStrings.NetTcpBindingCollectionElementName]; } } [ConfigurationProperty(ConfigurationStrings.WSFederationHttpBindingCollectionElementName, Options = ConfigurationPropertyOptions.None)] public WSFederationHttpBindingCollectionElement WSFederationHttpBinding { get { return (WSFederationHttpBindingCollectionElement)base[ConfigurationStrings.WSFederationHttpBindingCollectionElementName]; } } [ConfigurationProperty(ConfigurationStrings.WS2007FederationHttpBindingCollectionElementName, Options = ConfigurationPropertyOptions.None)] public WS2007FederationHttpBindingCollectionElement WS2007FederationHttpBinding { get { return (WS2007FederationHttpBindingCollectionElement)base[ConfigurationStrings.WS2007FederationHttpBindingCollectionElementName]; } } [ConfigurationProperty(ConfigurationStrings.WSHttpBindingCollectionElementName, Options = ConfigurationPropertyOptions.None)] public WSHttpBindingCollectionElement WSHttpBinding { get { return (WSHttpBindingCollectionElement)base[ConfigurationStrings.WSHttpBindingCollectionElementName]; } } [ConfigurationProperty(ConfigurationStrings.WS2007HttpBindingCollectionElementName, Options = ConfigurationPropertyOptions.None)] public WS2007HttpBindingCollectionElement WS2007HttpBinding { get { return (WS2007HttpBindingCollectionElement)base[ConfigurationStrings.WS2007HttpBindingCollectionElementName]; } } [ConfigurationProperty(ConfigurationStrings.WSDualHttpBindingCollectionElementName, Options = ConfigurationPropertyOptions.None)] public WSDualHttpBindingCollectionElement WSDualHttpBinding { get { return (WSDualHttpBindingCollectionElement)base[ConfigurationStrings.WSDualHttpBindingCollectionElementName]; } } public static BindingsSection GetSection(Configuration config) { if (config == null) { throw DiagnosticUtility.ExceptionUtility.ThrowHelperArgumentNull("config"); } return (BindingsSection)config.GetSection(ConfigurationStrings.BindingsSectionGroupPath); } public List<BindingCollectionElement> BindingCollections { get { List<BindingCollectionElement> bindingCollections = new List<BindingCollectionElement>(); foreach (ConfigurationProperty property in this.Properties) { bindingCollections.Add(this[property.Name]); } return bindingCollections; } } protected override bool OnDeserializeUnrecognizedElement(string elementName, XmlReader reader) { throw DiagnosticUtility.ExceptionUtility.ThrowHelperError( new ConfigurationErrorsException(SR.GetString(SR.ConfigBindingExtensionNotFound, 
ConfigurationHelpers.GetBindingsSectionPath(elementName)))); } internal static bool TryAdd(string name, Binding binding, Configuration config, out string bindingSectionName) { bool retval = false; BindingsSection.Configuration = config; try { retval = BindingsSection.TryAdd(name, binding, out bindingSectionName); } finally { BindingsSection.Configuration = null; } return retval; } internal static bool TryAdd(string name, Binding binding, out string bindingSectionName) { // TryAdd built on assumption that BindingsSectionGroup.Configuration is valid. // This should be protected at the callers site. If assumption is invalid, then // configuration system is in an indeterminate state. Need to stop in a manner that // user code can not capture. if (null == BindingsSection.Configuration) { Fx.Assert("The TryAdd(string name, Binding binding, Configuration config, out string binding) variant of this function should always be called first. The Configuration object is not set."); DiagnosticUtility.FailFast("The TryAdd(string name, Binding binding, Configuration config, out string binding) variant of this function should always be called first. The Configuration object is not set."); } bool retval = false; string outBindingSectionName = null; BindingsSection sectionGroup = BindingsSection.GetSection(BindingsSection.Configuration); sectionGroup.UpdateBindingSections(); foreach (string sectionName in sectionGroup.BindingCollectionElements.Keys) { BindingCollectionElement bindingCollectionElement = sectionGroup.BindingCollectionElements[sectionName]; // Save the custom bindings as the last choice if (!(bindingCollectionElement is CustomBindingCollectionElement)) { MethodInfo tryAddMethod = bindingCollectionElement.GetType().GetMethod("TryAdd", BindingFlags.Instance | BindingFlags.NonPublic); if (tryAddMethod != null) { retval = (bool)tryAddMethod.Invoke(bindingCollectionElement, new object[] { name, binding, BindingsSection.Configuration }); if (retval) { outBindingSectionName = sectionName; break; } } } } if (!retval) { // Much of the time, the custombinding should come out ok. CustomBindingCollectionElement customBindingSection = CustomBindingCollectionElement.GetBindingCollectionElement(); retval = customBindingSection.TryAdd(name, binding, BindingsSection.Configuration); if (retval) { outBindingSectionName = ConfigurationStrings.CustomBindingCollectionElementName; } } // This little oddity exists to make sure that the out param is assigned to before the method // exits. bindingSectionName = outBindingSectionName; return retval; } void UpdateBindingSections() { UpdateBindingSections(ConfigurationHelpers.GetEvaluationContext(this)); } [Fx.Tag.SecurityNote(Critical = "Calls UnsafeLookupCollection which elevates.", Safe = "Doesn't leak resultant config.")] [SecuritySafeCritical] internal void UpdateBindingSections(ContextInformation evaluationContext) { ExtensionElementCollection bindingExtensions = ExtensionsSection.UnsafeLookupCollection(ConfigurationStrings.BindingExtensions, evaluationContext); // Extension collections are additive only (BasicMap) and do not allow for <clear> // or <remove> tags, nor do they allow for overriding an entry. This allows us // to optimize this to only walk the binding extension collection if the counts // mismatch. 
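// Illustrative sketch (the XML below is an assumed example, not taken from this
// file): binding extensions of the kind walked by the loop that follows are
// typically registered in configuration, where each <add> element supplies the
// Name and Type that become a ConfigurationProperty on this section:
//
//   <system.serviceModel>
//     <extensions>
//       <bindingExtensions>
//         <add name="contosoBinding"
//              type="Contoso.ContosoBindingCollectionElement, Contoso.Bindings" />
//       </bindingExtensions>
//     </extensions>
//   </system.serviceModel>
//
// Because the collection is additive-only, comparing its count against the
// already-registered properties is enough to detect newly added extensions.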
if (bindingExtensions.Count != this.properties.Count) { foreach (ExtensionElement bindingExtension in bindingExtensions) { if (null != bindingExtension) { if (!this.properties.Contains(bindingExtension.Name)) { Type extensionType = Type.GetType(bindingExtension.Type, false); if (extensionType == null) { ConfigurationHelpers.TraceExtensionTypeNotFound(bindingExtension); } else { ConfigurationProperty property = new ConfigurationProperty(bindingExtension.Name, extensionType, null, ConfigurationPropertyOptions.None); this.properties.Add(property); } } } } } } [Fx.Tag.SecurityNote(Critical = "Calls UnsafeGetAssociatedBindingCollectionElement which elevates.", Safe = "Doesn't leak resultant config.")] [SecuritySafeCritical] internal static void ValidateBindingReference(string binding, string bindingConfiguration, ContextInformation evaluationContext, ConfigurationElement configurationElement) { // ValidateBindingReference built on assumption that evaluationContext is valid. // This should be protected at the callers site. If assumption is invalid, then // configuration system is in an indeterminate state. Need to stop in a manner that // user code can not capture. if (null == evaluationContext) { Fx.Assert("ValidateBindingReference() should only called with valid ContextInformation"); DiagnosticUtility.FailFast("ValidateBindingReference() should only called with valid ContextInformation"); } if (!String.IsNullOrEmpty(binding)) { BindingCollectionElement bindingCollectionElement = null; if (null != evaluationContext) { bindingCollectionElement = ConfigurationHelpers.UnsafeGetAssociatedBindingCollectionElement(evaluationContext, binding); } else { bindingCollectionElement = ConfigurationHelpers.UnsafeGetBindingCollectionElement(binding); } if (bindingCollectionElement == null) { throw DiagnosticUtility.ExceptionUtility.ThrowHelperError(new ConfigurationErrorsException(SR.GetString(SR.ConfigInvalidSection, ConfigurationHelpers.GetBindingsSectionPath(binding)), configurationElement.ElementInformation.Source, configurationElement.ElementInformation.LineNumber)); } if (!String.IsNullOrEmpty(bindingConfiguration)) { if (!bindingCollectionElement.ContainsKey(bindingConfiguration)) { throw DiagnosticUtility.ExceptionUtility.ThrowHelperError(new ConfigurationErrorsException(SR.GetString(SR.ConfigInvalidBindingName, bindingConfiguration, ConfigurationHelpers.GetBindingsSectionPath(binding), ConfigurationStrings.BindingConfiguration), configurationElement.ElementInformation.Source, configurationElement.ElementInformation.LineNumber)); } } } } ContextInformation IConfigurationContextProviderInternal.GetEvaluationContext() { return this.EvaluationContext; } [Fx.Tag.SecurityNote(Miscellaneous = "RequiresReview - the return value will be used for a security decision -- see comment in interface definition.")] ContextInformation IConfigurationContextProviderInternal.GetOriginalEvaluationContext() { Fx.Assert("Not implemented: IConfigurationContextProviderInternal.GetOriginalEvaluationContext"); return null; } } } /* * Swaggy Jenkins * * Jenkins API clients generated from Swagger / Open API specification * * The version of the OpenAPI document: 1.1.2-pre.0 * Contact: blah@cliffano.com * Generated by: https://github.com/openapitools/openapi-generator.git */ using System; using System.Collections; using System.Collections.Generic; using System.Collections.ObjectModel; using System.Linq; using System.IO; using System.Runtime.Serialization; using System.Text; using System.Text.RegularExpressions; using 
Newtonsoft.Json; using Newtonsoft.Json.Converters; using Newtonsoft.Json.Linq; using System.ComponentModel.DataAnnotations; using OpenAPIDateConverter = Org.OpenAPITools.Client.OpenAPIDateConverter; namespace Org.OpenAPITools.Models { /// <summary> /// HudsonMasterComputermonitorData /// </summary> [DataContract(Name = "HudsonMasterComputermonitorData")] public partial class HudsonMasterComputermonitorData : IEquatable<HudsonMasterComputermonitorData>, IValidatableObject { /// <summary> /// Initializes a new instance of the <see cref="HudsonMasterComputermonitorData" /> class. /// </summary> /// <param name="hudsonNodeMonitorsSwapSpaceMonitor">hudsonNodeMonitorsSwapSpaceMonitor.</param> /// <param name="hudsonNodeMonitorsTemporarySpaceMonitor">hudsonNodeMonitorsTemporarySpaceMonitor.</param> /// <param name="hudsonNodeMonitorsDiskSpaceMonitor">hudsonNodeMonitorsDiskSpaceMonitor.</param> /// <param name="hudsonNodeMonitorsArchitectureMonitor">hudsonNodeMonitorsArchitectureMonitor.</param> /// <param name="hudsonNodeMonitorsResponseTimeMonitor">hudsonNodeMonitorsResponseTimeMonitor.</param> /// <param name="hudsonNodeMonitorsClockMonitor">hudsonNodeMonitorsClockMonitor.</param> /// <param name="_class">_class.</param> public HudsonMasterComputermonitorData(SwapSpaceMonitorMemoryUsage2 hudsonNodeMonitorsSwapSpaceMonitor = default(SwapSpaceMonitorMemoryUsage2), DiskSpaceMonitorDescriptorDiskSpace hudsonNodeMonitorsTemporarySpaceMonitor = default(DiskSpaceMonitorDescriptorDiskSpace), DiskSpaceMonitorDescriptorDiskSpace hudsonNodeMonitorsDiskSpaceMonitor = default(DiskSpaceMonitorDescriptorDiskSpace), string hudsonNodeMonitorsArchitectureMonitor = default(string), ResponseTimeMonitorData hudsonNodeMonitorsResponseTimeMonitor = default(ResponseTimeMonitorData), ClockDifference hudsonNodeMonitorsClockMonitor = default(ClockDifference), string _class = default(string)) { this.HudsonNodeMonitorsSwapSpaceMonitor = hudsonNodeMonitorsSwapSpaceMonitor; this.HudsonNodeMonitorsTemporarySpaceMonitor = hudsonNodeMonitorsTemporarySpaceMonitor; this.HudsonNodeMonitorsDiskSpaceMonitor = hudsonNodeMonitorsDiskSpaceMonitor; this.HudsonNodeMonitorsArchitectureMonitor = hudsonNodeMonitorsArchitectureMonitor; this.HudsonNodeMonitorsResponseTimeMonitor = hudsonNodeMonitorsResponseTimeMonitor; this.HudsonNodeMonitorsClockMonitor = hudsonNodeMonitorsClockMonitor; this.Class = _class; } /// <summary> /// Gets or Sets HudsonNodeMonitorsSwapSpaceMonitor /// </summary> [DataMember(Name = "hudson.node_monitors.SwapSpaceMonitor", EmitDefaultValue = false)] public SwapSpaceMonitorMemoryUsage2 HudsonNodeMonitorsSwapSpaceMonitor { get; set; } /// <summary> /// Gets or Sets HudsonNodeMonitorsTemporarySpaceMonitor /// </summary> [DataMember(Name = "hudson.node_monitors.TemporarySpaceMonitor", EmitDefaultValue = false)] public DiskSpaceMonitorDescriptorDiskSpace HudsonNodeMonitorsTemporarySpaceMonitor { get; set; } /// <summary> /// Gets or Sets HudsonNodeMonitorsDiskSpaceMonitor /// </summary> [DataMember(Name = "hudson.node_monitors.DiskSpaceMonitor", EmitDefaultValue = false)] public DiskSpaceMonitorDescriptorDiskSpace HudsonNodeMonitorsDiskSpaceMonitor { get; set; } /// <summary> /// Gets or Sets HudsonNodeMonitorsArchitectureMonitor /// </summary> [DataMember(Name = "hudson.node_monitors.ArchitectureMonitor", EmitDefaultValue = false)] public string HudsonNodeMonitorsArchitectureMonitor { get; set; } /// <summary> /// Gets or Sets HudsonNodeMonitorsResponseTimeMonitor /// </summary> [DataMember(Name = 
"hudson.node_monitors.ResponseTimeMonitor", EmitDefaultValue = false)] public ResponseTimeMonitorData HudsonNodeMonitorsResponseTimeMonitor { get; set; } /// <summary> /// Gets or Sets HudsonNodeMonitorsClockMonitor /// </summary> [DataMember(Name = "hudson.node_monitors.ClockMonitor", EmitDefaultValue = false)] public ClockDifference HudsonNodeMonitorsClockMonitor { get; set; } /// <summary> /// Gets or Sets Class /// </summary> [DataMember(Name = "_class", EmitDefaultValue = false)] public string Class { get; set; } /// <summary> /// Returns the string presentation of the object /// </summary> /// <returns>String presentation of the object</returns> public override string ToString() { var sb = new StringBuilder(); sb.Append("class HudsonMasterComputermonitorData {\n"); sb.Append(" HudsonNodeMonitorsSwapSpaceMonitor: ").Append(HudsonNodeMonitorsSwapSpaceMonitor).Append("\n"); sb.Append(" HudsonNodeMonitorsTemporarySpaceMonitor: ").Append(HudsonNodeMonitorsTemporarySpaceMonitor).Append("\n"); sb.Append(" HudsonNodeMonitorsDiskSpaceMonitor: ").Append(HudsonNodeMonitorsDiskSpaceMonitor).Append("\n"); sb.Append(" HudsonNodeMonitorsArchitectureMonitor: ").Append(HudsonNodeMonitorsArchitectureMonitor).Append("\n"); sb.Append(" HudsonNodeMonitorsResponseTimeMonitor: ").Append(HudsonNodeMonitorsResponseTimeMonitor).Append("\n"); sb.Append(" HudsonNodeMonitorsClockMonitor: ").Append(HudsonNodeMonitorsClockMonitor).Append("\n"); sb.Append(" Class: ").Append(Class).Append("\n"); sb.Append("}\n"); return sb.ToString(); } /// <summary> /// Returns the JSON string presentation of the object /// </summary> /// <returns>JSON string presentation of the object</returns> public virtual string ToJson() { return Newtonsoft.Json.JsonConvert.SerializeObject(this, Newtonsoft.Json.Formatting.Indented); } /// <summary> /// Returns true if objects are equal /// </summary> /// <param name="input">Object to be compared</param> /// <returns>Boolean</returns> public override bool Equals(object input) { return this.Equals(input as HudsonMasterComputermonitorData); } /// <summary> /// Returns true if HudsonMasterComputermonitorData instances are equal /// </summary> /// <param name="input">Instance of HudsonMasterComputermonitorData to be compared</param> /// <returns>Boolean</returns> public bool Equals(HudsonMasterComputermonitorData input) { if (input == null) return false; return ( this.HudsonNodeMonitorsSwapSpaceMonitor == input.HudsonNodeMonitorsSwapSpaceMonitor || (this.HudsonNodeMonitorsSwapSpaceMonitor != null && this.HudsonNodeMonitorsSwapSpaceMonitor.Equals(input.HudsonNodeMonitorsSwapSpaceMonitor)) ) && ( this.HudsonNodeMonitorsTemporarySpaceMonitor == input.HudsonNodeMonitorsTemporarySpaceMonitor || (this.HudsonNodeMonitorsTemporarySpaceMonitor != null && this.HudsonNodeMonitorsTemporarySpaceMonitor.Equals(input.HudsonNodeMonitorsTemporarySpaceMonitor)) ) && ( this.HudsonNodeMonitorsDiskSpaceMonitor == input.HudsonNodeMonitorsDiskSpaceMonitor || (this.HudsonNodeMonitorsDiskSpaceMonitor != null && this.HudsonNodeMonitorsDiskSpaceMonitor.Equals(input.HudsonNodeMonitorsDiskSpaceMonitor)) ) && ( this.HudsonNodeMonitorsArchitectureMonitor == input.HudsonNodeMonitorsArchitectureMonitor || (this.HudsonNodeMonitorsArchitectureMonitor != null && this.HudsonNodeMonitorsArchitectureMonitor.Equals(input.HudsonNodeMonitorsArchitectureMonitor)) ) && ( this.HudsonNodeMonitorsResponseTimeMonitor == input.HudsonNodeMonitorsResponseTimeMonitor || (this.HudsonNodeMonitorsResponseTimeMonitor != null && 
this.HudsonNodeMonitorsResponseTimeMonitor.Equals(input.HudsonNodeMonitorsResponseTimeMonitor)) ) && ( this.HudsonNodeMonitorsClockMonitor == input.HudsonNodeMonitorsClockMonitor || (this.HudsonNodeMonitorsClockMonitor != null && this.HudsonNodeMonitorsClockMonitor.Equals(input.HudsonNodeMonitorsClockMonitor)) ) && ( this.Class == input.Class || (this.Class != null && this.Class.Equals(input.Class)) ); } /// <summary> /// Gets the hash code /// </summary> /// <returns>Hash code</returns> public override int GetHashCode() { unchecked // Overflow is fine, just wrap { int hashCode = 41; if (this.HudsonNodeMonitorsSwapSpaceMonitor != null) hashCode = hashCode * 59 + this.HudsonNodeMonitorsSwapSpaceMonitor.GetHashCode(); if (this.HudsonNodeMonitorsTemporarySpaceMonitor != null) hashCode = hashCode * 59 + this.HudsonNodeMonitorsTemporarySpaceMonitor.GetHashCode(); if (this.HudsonNodeMonitorsDiskSpaceMonitor != null) hashCode = hashCode * 59 + this.HudsonNodeMonitorsDiskSpaceMonitor.GetHashCode(); if (this.HudsonNodeMonitorsArchitectureMonitor != null) hashCode = hashCode * 59 + this.HudsonNodeMonitorsArchitectureMonitor.GetHashCode(); if (this.HudsonNodeMonitorsResponseTimeMonitor != null) hashCode = hashCode * 59 + this.HudsonNodeMonitorsResponseTimeMonitor.GetHashCode(); if (this.HudsonNodeMonitorsClockMonitor != null) hashCode = hashCode * 59 + this.HudsonNodeMonitorsClockMonitor.GetHashCode(); if (this.Class != null) hashCode = hashCode * 59 + this.Class.GetHashCode(); return hashCode; } } /// <summary> /// To validate all properties of the instance /// </summary> /// <param name="validationContext">Validation context</param> /// <returns>Validation Result</returns> IEnumerable<System.ComponentModel.DataAnnotations.ValidationResult> IValidatableObject.Validate(ValidationContext validationContext) { yield break; } } } using NUnit.Framework; using System; using System.Collections.Generic; using System.Globalization; #if net452 using System.Threading; #endif using System.Threading.Tasks; namespace Braintree.Tests.Integration { [TestFixture] public class SettlementBatchSummaryIntegrationTest { private BraintreeGateway gateway; [SetUp] public void Setup() { gateway = new BraintreeGateway { Environment = Environment.DEVELOPMENT, MerchantId = "integration_merchant_id", PublicKey = "integration_public_key", PrivateKey = "integration_private_key" }; } [Test] public void Generate_ReturnsAnEmptyCollectionIfThereIsNoData() { Result<SettlementBatchSummary> result = gateway.SettlementBatchSummary.Generate(DateTime.Parse("1979-01-01")); Assert.AreEqual(0, result.Target.Records.Count); } [Test] public void Generate_ReturnsTransactionsSettledOnAGivenDay() { TransactionRequest request = new TransactionRequest { Amount = 1000M, CreditCard = new TransactionCreditCardRequest { Number = "4111111111111111", ExpirationDate = "05/2012", CardholderName = "Tom Smith", }, Options = new TransactionOptionsRequest { SubmitForSettlement = true }, }; Transaction transaction = gateway.Transaction.Sale(request).Target; Transaction settlementResult = gateway.TestTransaction.Settle(transaction.Id); var settlementDate = settlementResult.SettlementBatchId.Substring(0,10); var result = gateway.SettlementBatchSummary.Generate(System.DateTime.Parse(settlementDate)); var visas = new List<IDictionary<string,string>>(); foreach (var row in result.Target.Records) { if (CreditCardCardType.VISA.GetDescription().Equals(row["card_type"])) { visas.Add(row); } } Assert.IsTrue(visas.Count >= 1); } [Test] #if netcore public async Task 
GenerateAsync_ReturnsTransactionsSettledOnAGivenDay() #else public void GenerateAsync_ReturnsTransactionsSettledOnAGivenDay() { Task.Run(async () => #endif { TransactionRequest request = new TransactionRequest { Amount = 1000M, CreditCard = new TransactionCreditCardRequest { Number = "5555555555554444", ExpirationDate = "05/2012", CardholderName = "Jane Smith", }, Options = new TransactionOptionsRequest { SubmitForSettlement = true }, }; Result<Transaction> transactionResult = await gateway.Transaction.SaleAsync(request); Transaction transaction = transactionResult.Target; Transaction settlementResult = await gateway.TestTransaction.SettleAsync(transaction.Id); var settlementDate = settlementResult.SettlementBatchId.Substring(0,10); var result = await gateway.SettlementBatchSummary.GenerateAsync(System.DateTime.Parse(settlementDate)); var mastercards = new List<IDictionary<string,string>>(); foreach (var row in result.Target.Records) { if (CreditCardCardType.MASTER_CARD.GetDescription().Equals(row["card_type"])) { mastercards.Add(row); } } Assert.IsTrue(mastercards.Count >= 1); } #if net452 ).GetAwaiter().GetResult(); } #endif [Test] public void Generate_AcceptsDatesInNonUSFormats() { #if netcore CultureInfo originalCulture = CultureInfo.CurrentCulture; CultureInfo australianCulture = new CultureInfo("en-AU"); CultureInfo.CurrentCulture = australianCulture; #else CultureInfo originalCulture = Thread.CurrentThread.CurrentCulture; CultureInfo australianCulture = new CultureInfo("en-AU"); Thread.CurrentThread.CurrentCulture = australianCulture; #endif DateTime date = new DateTime(2014, 8, 20); var result = gateway.SettlementBatchSummary.Generate(date); Assert.IsTrue(result.IsSuccess()); #if netcore Assert.AreEqual(australianCulture, CultureInfo.CurrentCulture); CultureInfo.CurrentCulture = originalCulture; #else Assert.AreEqual(australianCulture, Thread.CurrentThread.CurrentCulture); Thread.CurrentThread.CurrentCulture = originalCulture; #endif } [Test] public void Generate_CanBeGroupedByACustomField() { TransactionRequest request = new TransactionRequest { Amount = 1000M, CreditCard = new TransactionCreditCardRequest { Number = "4111111111111111", ExpirationDate = "05/2012", CardholderName = "Tom Smith", }, Options = new TransactionOptionsRequest { SubmitForSettlement = true }, CustomFields = new Dictionary<string, string> { { "store_me", "custom value" } } }; Transaction transaction = gateway.Transaction.Sale(request).Target; Transaction settlementResult = gateway.TestTransaction.Settle(transaction.Id); var settlementDate = settlementResult.SettlementBatchId.Substring(0,10); var result = gateway.SettlementBatchSummary.Generate(System.DateTime.Parse(settlementDate), "store_me"); var customValues = new List<IDictionary<string, string>>(); foreach (var row in result.Target.Records) { if ("custom value".Equals(row["store_me"])) { customValues.Add(row); } } Assert.IsTrue(customValues.Count >= 1); } [Test] #if netcore public async Task GenerateAsync_CanBeGroupedByACustomField() #else public void GenerateAsync_CanBeGroupedByACustomField() { Task.Run(async () => #endif { TransactionRequest request = new TransactionRequest { Amount = 1000M, CreditCard = new TransactionCreditCardRequest { Number = "5555555555554444", ExpirationDate = "05/2012", CardholderName = "Jane Smith", }, Options = new TransactionOptionsRequest { SubmitForSettlement = true }, CustomFields = new Dictionary<string, string> { { "store_me", "custom value async" } } }; Result<Transaction> transactionResult = await 
gateway.Transaction.SaleAsync(request); Transaction transaction = transactionResult.Target; Transaction settlementResult = await gateway.TestTransaction.SettleAsync(transaction.Id); var settlementDate = settlementResult.SettlementBatchId.Substring(0,10); var result = await gateway.SettlementBatchSummary.GenerateAsync(System.DateTime.Parse(settlementDate), "store_me"); var customValues = new List<IDictionary<string, string>>(); foreach (var row in result.Target.Records) { if ("custom value async".Equals(row["store_me"])) { customValues.Add(row); } } Assert.AreEqual(1, customValues.Count); } #if net452 ).GetAwaiter().GetResult(); } #endif } } // Visual Studio Shared Project // Copyright(c) Microsoft Corporation // All rights reserved. // // Licensed under the Apache License, Version 2.0 (the License); you may not use // this file except in compliance with the License. You may obtain a copy of the // License at http://www.apache.org/licenses/LICENSE-2.0 // // THIS CODE IS PROVIDED ON AN *AS IS* BASIS, WITHOUT WARRANTIES OR CONDITIONS // OF ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING WITHOUT LIMITATION ANY // IMPLIED WARRANTIES OR CONDITIONS OF TITLE, FITNESS FOR A PARTICULAR PURPOSE, // MERCHANTABLITY OR NON-INFRINGEMENT. // // See the Apache Version 2.0 License for specific language governing // permissions and limitations under the License. using System; using System.IO; using System.Runtime.InteropServices; using Microsoft.Build.Construction; using Microsoft.VisualStudio; using Microsoft.VisualStudio.Shell; using Microsoft.VisualStudio.Shell.Interop; using Microsoft.VisualStudioTools.Infrastructure; using MSBuild = Microsoft.Build.Evaluation; namespace Microsoft.VisualStudioTools.Project { /// <summary> /// Creates projects within the solution /// </summary> public abstract class ProjectFactory : FlavoredProjectFactoryBase, #if DEV11_OR_LATER IVsAsynchronousProjectCreate, IVsProjectUpgradeViaFactory4, #endif IVsProjectUpgradeViaFactory { #region fields private System.IServiceProvider site; /// <summary> /// The msbuild engine that we are going to use. /// </summary> private MSBuild.ProjectCollection buildEngine; /// <summary> /// The msbuild project for the project file. /// </summary> private MSBuild.Project buildProject; #if DEV11_OR_LATER private readonly Lazy<IVsTaskSchedulerService> taskSchedulerService; #endif // (See GetSccInfo below.) // When we upgrade a project, we need to cache the SCC info in case // somebody calls to ask for it via GetSccInfo. // We only need to know about the most recently upgraded project, and // can fail for other projects. 
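// Note on the caching described above (a summary of the behaviour implemented
// later in this file): IVsProjectUpgradeViaFactory.GetSccInfo only answers for
// the single project file name stored in _cachedSccProject and returns
// VSConstants.E_FAIL for every other file, which is what "can fail for other
// projects" means here. The shape of that check is roughly:
//
//     if (string.Equals(_cachedSccProject, bstrProjectFileName,
//                       StringComparison.OrdinalIgnoreCase))
//     {
//         // hand back the four cached Scc* strings and return VSConstants.S_OK
//     }
//     return VSConstants.E_FAIL;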
private string _cachedSccProject; private string _cachedSccProjectName, _cachedSccAuxPath, _cachedSccLocalPath, _cachedSccProvider; #endregion #region properties [Obsolete("Use Site instead")] protected Microsoft.VisualStudio.Shell.Package Package { get { return (Microsoft.VisualStudio.Shell.Package)this.site; } } protected internal System.IServiceProvider Site { get { return this.site; } internal set { this.site = value; } } #endregion #region ctor [Obsolete("Provide an IServiceProvider instead of a package")] protected ProjectFactory(Microsoft.VisualStudio.Shell.Package package) : this((IServiceProvider)package) { } protected ProjectFactory(IServiceProvider serviceProvider) : base(serviceProvider) { this.site = serviceProvider; this.buildEngine = MSBuild.ProjectCollection.GlobalProjectCollection; #if DEV11_OR_LATER this.taskSchedulerService = new Lazy<IVsTaskSchedulerService>(() => Site.GetService(typeof(SVsTaskSchedulerService)) as IVsTaskSchedulerService); #endif } #endregion #region abstract methods internal abstract ProjectNode CreateProject(); #endregion #region overridden methods /// <summary> /// Rather than directly creating the project, ask VS to initiate the process of /// creating an aggregated project in case we are flavored. We will be called /// on the IVsAggregatableProjectFactory to do the real project creation. /// </summary> /// <param name="fileName">Project file</param> /// <param name="location">Path of the project</param> /// <param name="name">Project Name</param> /// <param name="flags">Creation flags</param> /// <param name="projectGuid">Guid of the project</param> /// <param name="project">Project that ends up being created by this method</param> /// <param name="canceled">Was the project creation canceled</param> protected override void CreateProject(string fileName, string location, string name, uint flags, ref Guid projectGuid, out IntPtr project, out int canceled) { using (new DebugTimer("CreateProject")) { project = IntPtr.Zero; canceled = 0; // Get the list of GUIDs from the project/template string guidsList = this.ProjectTypeGuids(fileName); // Launch the aggregate creation process (we should be called back on our IVsAggregatableProjectFactoryCorrected implementation) IVsCreateAggregateProject aggregateProjectFactory = (IVsCreateAggregateProject)this.Site.GetService(typeof(SVsCreateAggregateProject)); int hr = aggregateProjectFactory.CreateAggregateProject(guidsList, fileName, location, name, flags, ref projectGuid, out project); if (hr == VSConstants.E_ABORT) canceled = 1; ErrorHandler.ThrowOnFailure(hr); this.buildProject = null; } } /// <summary> /// Instantiate the project class, but do not proceed with the /// initialization just yet. /// Delegate to CreateProject implemented by the derived class. /// </summary> [System.Diagnostics.CodeAnalysis.SuppressMessage("Microsoft.Reliability", "CA2000:Dispose objects before losing scope", Justification = "The global property handles is instantiated here and used in the project node that will Dispose it")] protected override object PreCreateForOuter(IntPtr outerProjectIUnknown) { Utilities.CheckNotNull(this.buildProject, "The build project should have been initialized before calling PreCreateForOuter."); // Please be very careful what is initialized here on the ProjectNode. Normally this should only instantiate and return a project node.
// The reason why one should very carefully add state to the project node here is that at this point the aggregation has not yet been created and anything that would cause a CCW for the project to be created would cause the aggregation to fail // Our reasoning is that there is no other place where state on the project node can be set that is known by the Factory and has to execute before the Load method. ProjectNode node = this.CreateProject(); Utilities.CheckNotNull(node, "The project failed to be created"); node.BuildEngine = this.buildEngine; node.BuildProject = this.buildProject; return node; } /// <summary> /// Retrieves the list of project guids from the project file. /// If you don't want your project to be flavorable, override /// to only return your project factory Guid: /// return this.GetType().GUID.ToString("B"); /// </summary> /// <param name="file">Project file to look into to find the Guid list</param> /// <returns>List of semi-colon separated GUIDs</returns> protected override string ProjectTypeGuids(string file) { // Load the project so we can extract the list of GUIDs this.buildProject = Utilities.ReinitializeMsBuildProject(this.buildEngine, file, this.buildProject); // Retrieve the list of GUIDs, if it is not specified, make it our GUID string guids = buildProject.GetPropertyValue(ProjectFileConstants.ProjectTypeGuids); if (String.IsNullOrEmpty(guids)) guids = this.GetType().GUID.ToString("B"); return guids; } #endregion #if DEV11_OR_LATER public virtual bool CanCreateProjectAsynchronously(ref Guid rguidProjectID, string filename, uint flags) { return true; } public void OnBeforeCreateProjectAsync(ref Guid rguidProjectID, string filename, string location, string pszName, uint flags) { } public IVsTask CreateProjectAsync(ref Guid rguidProjectID, string filename, string location, string pszName, uint flags) { Guid iid = typeof(IVsHierarchy).GUID; return VsTaskLibraryHelper.CreateAndStartTask(taskSchedulerService.Value, VsTaskRunContext.UIThreadBackgroundPriority, VsTaskLibraryHelper.CreateTaskBody(() => { IntPtr project; int cancelled; CreateProject(filename, location, pszName, flags, ref iid, out project, out cancelled); if (cancelled != 0) { throw new OperationCanceledException(); } return Marshal.GetObjectForIUnknown(project); })); } #endif #region Project Upgrades /// <summary> /// Override this method to upgrade project files. /// </summary> /// <param name="projectXml"> /// The XML of the project file being upgraded. This may be modified /// directly or replaced with a new element. /// </param> /// <param name="userProjectXml"> /// The XML of the user file being upgraded. This may be modified /// directly or replaced with a new element. /// /// If there is no user file before upgrading, this may be null. If it /// is non-null on return, the file is created. /// </param> /// <param name="log"> /// Callback to log messages. These messages will be added to the /// migration log that is displayed after upgrading completes. /// </param> protected virtual void UpgradeProject( ref ProjectRootElement projectXml, ref ProjectRootElement userProjectXml, Action<__VSUL_ERRORLEVEL, string> log ) { } /// <summary> /// Determines whether a project needs to be upgraded. /// </summary> /// <param name="projectXml"> /// The XML of the project file being upgraded. /// </param> /// <param name="userProjectXml"> /// The XML of the user file being upgraded, or null if no user file /// exists. /// </param> /// <param name="log"> /// Callback to log messages.
These messages will be added to the /// migration log that is displayed after upgrading completes. /// </param> /// <param name="projectFactory"> /// The project factory that will be used. This may be replaced with /// another Guid if a new project factory should be used to upgrade the /// project. /// </param> /// <param name="backupSupport"> /// The level of backup support requested for the project. By default, /// the project file (and user file, if any) will be copied alongside /// the originals with ".old" added to the filenames. /// </param> /// <returns> /// The form of upgrade required. /// </returns> protected virtual ProjectUpgradeState UpgradeProjectCheck( ProjectRootElement projectXml, ProjectRootElement userProjectXml, Action<__VSUL_ERRORLEVEL, string> log, ref Guid projectFactory, ref __VSPPROJECTUPGRADEVIAFACTORYFLAGS backupSupport ) { return ProjectUpgradeState.NotNeeded; } class UpgradeLogger { private readonly string _projectFile; private readonly string _projectName; private readonly IVsUpgradeLogger _logger; public UpgradeLogger(string projectFile, IVsUpgradeLogger logger) { _projectFile = projectFile; _projectName = Path.GetFileNameWithoutExtension(projectFile); _logger = logger; } public void Log(__VSUL_ERRORLEVEL level, string text) { if (_logger != null) { ErrorHandler.ThrowOnFailure(_logger.LogMessage((uint)level, _projectName, _projectFile, text)); } } } int IVsProjectUpgradeViaFactory.GetSccInfo( string bstrProjectFileName, out string pbstrSccProjectName, out string pbstrSccAuxPath, out string pbstrSccLocalPath, out string pbstrProvider ) { if (string.Equals(_cachedSccProject, bstrProjectFileName, StringComparison.OrdinalIgnoreCase)) { pbstrSccProjectName = _cachedSccProjectName; pbstrSccAuxPath = _cachedSccAuxPath; pbstrSccLocalPath = _cachedSccLocalPath; pbstrProvider = _cachedSccProvider; return VSConstants.S_OK; } pbstrSccProjectName = null; pbstrSccAuxPath = null; pbstrSccLocalPath = null; pbstrProvider = null; return VSConstants.E_FAIL; } int IVsProjectUpgradeViaFactory.UpgradeProject( string bstrFileName, uint fUpgradeFlag, string bstrCopyLocation, out string pbstrUpgradedFullyQualifiedFileName, IVsUpgradeLogger pLogger, out int pUpgradeRequired, out Guid pguidNewProjectFactory ) { pbstrUpgradedFullyQualifiedFileName = null; // We first run (or re-run) the upgrade check and bail out early if // there is actually no need to upgrade. uint dummy; var hr = ((IVsProjectUpgradeViaFactory)this).UpgradeProject_CheckOnly( bstrFileName, pLogger, out pUpgradeRequired, out pguidNewProjectFactory, out dummy ); if (!ErrorHandler.Succeeded(hr)) { return hr; } var logger = new UpgradeLogger(bstrFileName, pLogger); var backup = (__VSPPROJECTUPGRADEVIAFACTORYFLAGS)fUpgradeFlag; bool anyBackup, sxsBackup, copyBackup; anyBackup = backup.HasFlag(__VSPPROJECTUPGRADEVIAFACTORYFLAGS.PUVFF_BACKUPSUPPORTED); if (anyBackup) { sxsBackup = backup.HasFlag(__VSPPROJECTUPGRADEVIAFACTORYFLAGS.PUVFF_SXSBACKUP); copyBackup = !sxsBackup && backup.HasFlag(__VSPPROJECTUPGRADEVIAFACTORYFLAGS.PUVFF_COPYBACKUP); } else { sxsBackup = copyBackup = false; } if (copyBackup) { throw new NotSupportedException("PUVFF_COPYBACKUP is not supported"); } pbstrUpgradedFullyQualifiedFileName = bstrFileName; if (pUpgradeRequired == 0 && !copyBackup) { // No upgrade required, and no backup required. 
logger.Log(__VSUL_ERRORLEVEL.VSUL_INFORMATIONAL, SR.GetString(SR.UpgradeNotRequired)); return VSConstants.S_OK; } try { UpgradeLogger logger2 = null; var userFileName = bstrFileName + ".user"; if (File.Exists(userFileName)) { logger2 = new UpgradeLogger(userFileName, pLogger); } else { userFileName = null; } if (sxsBackup) { // For SxS backups we want to put the old project file alongside // the current one. bstrCopyLocation = Path.GetDirectoryName(bstrFileName); } if (anyBackup) { var namePart = Path.GetFileNameWithoutExtension(bstrFileName); var extPart = Path.GetExtension(bstrFileName) + (sxsBackup ? ".old" : ""); var projectFileBackup = Path.Combine(bstrCopyLocation, namePart + extPart); for (int i = 1; File.Exists(projectFileBackup); ++i) { projectFileBackup = Path.Combine( bstrCopyLocation, string.Format("{0}{1}{2}", namePart, i, extPart) ); } File.Copy(bstrFileName, projectFileBackup); // Back up the .user file if there is one if (userFileName != null) { if (sxsBackup) { File.Copy( userFileName, Path.ChangeExtension(projectFileBackup, ".user.old") ); } else { File.Copy(userFileName, projectFileBackup + ".old"); } } // TODO: Implement support for backing up all files //if (copyBackup) { // - Open the project // - Inspect all Items // - Copy those items that are referenced relative to the // project file into bstrCopyLocation //} } var queryEdit = site.GetService(typeof(SVsQueryEditQuerySave)) as IVsQueryEditQuerySave2; if (queryEdit != null) { uint editVerdict; uint queryEditMoreInfo; var tagVSQueryEditFlags_QEF_AllowUnopenedProjects = (tagVSQueryEditFlags)0x80; ErrorHandler.ThrowOnFailure(queryEdit.QueryEditFiles( (uint)(tagVSQueryEditFlags.QEF_ForceEdit_NoPrompting | tagVSQueryEditFlags.QEF_DisallowInMemoryEdits | tagVSQueryEditFlags_QEF_AllowUnopenedProjects), 1, new[] { bstrFileName }, null, null, out editVerdict, out queryEditMoreInfo )); if (editVerdict != (uint)tagVSQueryEditResult.QER_EditOK) { logger.Log(__VSUL_ERRORLEVEL.VSUL_ERROR, SR.GetString(SR.UpgradeCannotCheckOutProject)); return VSConstants.E_FAIL; } // File may have been updated during checkout, so check // again whether we need to upgrade. if ((queryEditMoreInfo & (uint)tagVSQueryEditResultFlags.QER_MaybeChanged) != 0) { hr = ((IVsProjectUpgradeViaFactory)this).UpgradeProject_CheckOnly( bstrFileName, pLogger, out pUpgradeRequired, out pguidNewProjectFactory, out dummy ); if (!ErrorHandler.Succeeded(hr)) { return hr; } if (pUpgradeRequired == 0) { logger.Log(__VSUL_ERRORLEVEL.VSUL_INFORMATIONAL, SR.GetString(SR.UpgradeNotRequired)); return VSConstants.S_OK; } } } // Load the project file and user file into MSBuild as plain // XML to make it easier for subclasses. var projectXml = ProjectRootElement.Open(bstrFileName); if (projectXml == null) { throw new Exception(SR.GetString(SR.UpgradeCannotLoadProject)); } var userXml = userFileName != null ? ProjectRootElement.Open(userFileName) : null; // Invoke our virtual UpgradeProject function. If it fails, it // will throw and we will log the exception. UpgradeProject(ref projectXml, ref userXml, logger.Log); // Get the SCC info from the project file. 
if (projectXml != null) { _cachedSccProject = bstrFileName; _cachedSccProjectName = string.Empty; _cachedSccAuxPath = string.Empty; _cachedSccLocalPath = string.Empty; _cachedSccProvider = string.Empty; foreach (var property in projectXml.Properties) { switch (property.Name) { case ProjectFileConstants.SccProjectName: _cachedSccProjectName = property.Value; break; case ProjectFileConstants.SccAuxPath: _cachedSccAuxPath = property.Value; break; case ProjectFileConstants.SccLocalPath: _cachedSccLocalPath = property.Value; break; case ProjectFileConstants.SccProvider: _cachedSccProvider = property.Value; break; default: break; } } } // Save the updated files. if (projectXml != null) { projectXml.Save(); } if (userXml != null) { userXml.Save(); } // Need to add "Converted" (unlocalized) to the report because // the XSLT refers to it. logger.Log(__VSUL_ERRORLEVEL.VSUL_STATUSMSG, "Converted"); return VSConstants.S_OK; } catch (Exception ex) { if (ex.IsCriticalException()) { throw; } logger.Log(__VSUL_ERRORLEVEL.VSUL_ERROR, SR.GetString(SR.UnexpectedUpgradeError, ex.Message)); try { ActivityLog.LogError(GetType().FullName, ex.ToString()); } catch (InvalidOperationException) { // Cannot log to ActivityLog. This may occur if we are // outside of VS right now (for example, unit tests). System.Diagnostics.Trace.TraceError(ex.ToString()); } return VSConstants.E_FAIL; } } int IVsProjectUpgradeViaFactory.UpgradeProject_CheckOnly( string bstrFileName, IVsUpgradeLogger pLogger, out int pUpgradeRequired, out Guid pguidNewProjectFactory, out uint pUpgradeProjectCapabilityFlags ) { pUpgradeRequired = 0; pguidNewProjectFactory = Guid.Empty; if (!File.Exists(bstrFileName)) { pUpgradeProjectCapabilityFlags = 0; return VSConstants.E_INVALIDARG; } var backupSupport = __VSPPROJECTUPGRADEVIAFACTORYFLAGS.PUVFF_BACKUPSUPPORTED | __VSPPROJECTUPGRADEVIAFACTORYFLAGS.PUVFF_COPYBACKUP | __VSPPROJECTUPGRADEVIAFACTORYFLAGS.PUVFF_SXSBACKUP; var logger = new UpgradeLogger(bstrFileName, pLogger); try { var projectXml = ProjectRootElement.Open(bstrFileName); var userProjectName = bstrFileName + ".user"; var userProjectXml = File.Exists(userProjectName) ? ProjectRootElement.Open(userProjectName) : null; var upgradeRequired = UpgradeProjectCheck( projectXml, userProjectXml, logger.Log, ref pguidNewProjectFactory, ref backupSupport ); if (upgradeRequired != ProjectUpgradeState.NotNeeded) { pUpgradeRequired = 1; } } catch (Exception ex) { if (ex.IsCriticalException()) { throw; } // Log the error and don't attempt to upgrade the project. logger.Log(__VSUL_ERRORLEVEL.VSUL_ERROR, SR.GetString(SR.UnexpectedUpgradeError, ex.Message)); try { ActivityLog.LogError(GetType().FullName, ex.ToString()); } catch (InvalidOperationException) { // Cannot log to ActivityLog. This may occur if we are // outside of VS right now (for example, unit tests). 
System.Diagnostics.Trace.TraceError(ex.ToString()); } pUpgradeRequired = 0; } pUpgradeProjectCapabilityFlags = (uint)backupSupport; // If the upgrade checker set the factory GUID to ourselves, we need // to clear it if (pguidNewProjectFactory == GetType().GUID) { pguidNewProjectFactory = Guid.Empty; } return VSConstants.S_OK; } #if DEV11_OR_LATER void IVsProjectUpgradeViaFactory4.UpgradeProject_CheckOnly( string bstrFileName, IVsUpgradeLogger pLogger, out uint pUpgradeRequired, out Guid pguidNewProjectFactory, out uint pUpgradeProjectCapabilityFlags ) { pguidNewProjectFactory = Guid.Empty; if (!File.Exists(bstrFileName)) { pUpgradeRequired = 0; pUpgradeProjectCapabilityFlags = 0; return; } var backupSupport = __VSPPROJECTUPGRADEVIAFACTORYFLAGS.PUVFF_BACKUPSUPPORTED | __VSPPROJECTUPGRADEVIAFACTORYFLAGS.PUVFF_COPYBACKUP | __VSPPROJECTUPGRADEVIAFACTORYFLAGS.PUVFF_SXSBACKUP; var logger = new UpgradeLogger(bstrFileName, pLogger); try { var projectXml = ProjectRootElement.Open(bstrFileName); var userProjectName = bstrFileName + ".user"; var userProjectXml = File.Exists(userProjectName) ? ProjectRootElement.Open(userProjectName) : null; var upgradeRequired = UpgradeProjectCheck( projectXml, userProjectXml, logger.Log, ref pguidNewProjectFactory, ref backupSupport ); switch (upgradeRequired) { case ProjectUpgradeState.SafeRepair: pUpgradeRequired = (uint)__VSPPROJECTUPGRADEVIAFACTORYREPAIRFLAGS.VSPUVF_PROJECT_SAFEREPAIR; break; case ProjectUpgradeState.UnsafeRepair: pUpgradeRequired = (uint)__VSPPROJECTUPGRADEVIAFACTORYREPAIRFLAGS.VSPUVF_PROJECT_UNSAFEREPAIR; break; case ProjectUpgradeState.OneWayUpgrade: pUpgradeRequired = (uint)__VSPPROJECTUPGRADEVIAFACTORYREPAIRFLAGS.VSPUVF_PROJECT_ONEWAYUPGRADE; break; case ProjectUpgradeState.Incompatible: pUpgradeRequired = (uint)__VSPPROJECTUPGRADEVIAFACTORYREPAIRFLAGS.VSPUVF_PROJECT_INCOMPATIBLE; break; case ProjectUpgradeState.Deprecated: pUpgradeRequired = (uint)__VSPPROJECTUPGRADEVIAFACTORYREPAIRFLAGS.VSPUVF_PROJECT_DEPRECATED; break; default: case ProjectUpgradeState.NotNeeded: pUpgradeRequired = (uint)__VSPPROJECTUPGRADEVIAFACTORYREPAIRFLAGS.VSPUVF_PROJECT_NOREPAIR; break; } } catch (Exception ex) { if (ex.IsCriticalException()) { throw; } // Log the error and don't attempt to upgrade the project. logger.Log(__VSUL_ERRORLEVEL.VSUL_ERROR, SR.GetString(SR.UnexpectedUpgradeError, ex.Message)); try { ActivityLog.LogError(GetType().FullName, ex.ToString()); } catch (InvalidOperationException) { // Cannot log to ActivityLog. This may occur if we are // outside of VS right now (for example, unit tests). System.Diagnostics.Trace.TraceError(ex.ToString()); } pUpgradeRequired = (uint)__VSPPROJECTUPGRADEVIAFACTORYREPAIRFLAGS.VSPUVF_PROJECT_NOREPAIR; } pUpgradeProjectCapabilityFlags = (uint)backupSupport; // If the upgrade checker set the factory GUID to ourselves, we need // to clear it if (pguidNewProjectFactory == GetType().GUID) { pguidNewProjectFactory = Guid.Empty; } } #endif #endregion } /// <summary> /// Status indicating whether a project upgrade should occur and how the /// project will be affected. /// </summary> public enum ProjectUpgradeState { /// <summary> /// No action will be taken. /// </summary> NotNeeded, /// <summary> /// The project will be upgraded without asking the user. /// </summary> SafeRepair, /// <summary> /// The project will be upgraded with the user's permission. 
/// </summary> UnsafeRepair, /// <summary> /// The project will be upgraded with the user's permission and they /// will be informed that the project will no longer work with earlier /// versions of Visual Studio. /// </summary> OneWayUpgrade, /// <summary> /// The project will be marked as incompatible. /// </summary> Incompatible, /// <summary> /// The project will be marked as deprecated. /// </summary> Deprecated } } using System; using System.Collections.Generic; namespace GalaxyGen.Engine.Ai.Goap { /** * Plans what actions can be completed in order to fulfill a goal state. */ public class GoapPlanner { /** * Plan what sequence of actions can fulfill the goal. * Returns null if a plan could not be found, or a list of the actions * that must be performed, in order, to fulfill the goal. */ public Queue<GoapAction> Plan(object agent, HashSet<GoapAction> availableActions, Dictionary<string, object> worldState, Dictionary<Int64, Int64> resourceState, Dictionary<string, object> goal, Dictionary<Int64, Int64> resourceGoal) { // reset the actions so we can start fresh with them foreach (GoapAction a in availableActions) { a.doReset(); } // check what actions can run using their checkProceduralPrecondition HashSet<GoapAction> usableActions = new HashSet<GoapAction>(); foreach (GoapAction a in availableActions) { if (a.checkProceduralPrecondition(agent)) usableActions.Add(a); } // we now have all actions that can run, stored in usableActions // build up the tree and record the leaf nodes that provide a solution to the goal. List<GoapNode> leaves = new List<GoapNode>(); // build graph GoapNode start = new GoapNode(null, 0, 0, worldState, resourceState, null); bool success = buildGraph(start, leaves, usableActions, goal, resourceGoal); if (!success) { // oh no, we didn't get a plan // Console.WriteLine("NO PLAN"); return null; } // get the cheapest leaf GoapNode cheapest = null; foreach (GoapNode leaf in leaves) { if (cheapest == null) cheapest = leaf; else { if (leaf.BetterThan(cheapest)) cheapest = leaf; } } // get its node and work back through the parents List<GoapAction> result = new List<GoapAction>(); GoapNode n = cheapest; while (n != null) { if (n.action != null) { result.Insert(0, n.action); // insert the action in the front } n = n.parent; } // we now have this action list in correct order Queue<GoapAction> queue = new Queue<GoapAction>(); foreach (GoapAction a in result) { queue.Enqueue(a); } // hooray we have a plan! return queue; } /** * Returns true if at least one solution was found. * The possible paths are stored in the leaves list. Each leaf has a * 'runningCost' value where the lowest cost will be the best action * sequence. 
*/ private bool buildGraph(GoapNode parent, List<GoapNode> leaves, HashSet<GoapAction> usableActions, Dictionary<string, object> goal, Dictionary<Int64, Int64> resourceGoal) { bool foundOne = false; // go through each action available at this node and see if we can use it here foreach (GoapAction action in usableActions) { // if the parent state has the conditions for this action's preconditions, we can use it here if (inState(action.Preconditions, parent.state)) { // apply the action's effects to the parent state Dictionary<string, object> currentState = populateState(parent.state, action.Effects); Dictionary<Int64, Int64> currentResources = populateResource(parent.resources, action.Resources); // Console.WriteLine(GoapAgent.PrettyPrint(currentState)); GoapNode node = new GoapNode(parent, parent.runningCost + action.GetCost(), parent.weight + action.GetWeight(), currentState, currentResources, action); if (inState(goal, currentState) && inResources(resourceGoal, currentResources)) { // we found a solution! leaves.Add(node); foundOne = true; } else { // not at a solution yet, so test all the remaining actions and branch out the tree HashSet<GoapAction> subset = actionSubset(usableActions, action); bool found = buildGraph(node, leaves, subset, goal, resourceGoal); if (found) foundOne = true; } } } return foundOne; } /** * Create a subset of the actions excluding the removeMe one. Creates a new set. */ private HashSet<GoapAction> actionSubset(HashSet<GoapAction> actions, GoapAction removeMe) { HashSet<GoapAction> subset = new HashSet<GoapAction>(); foreach (GoapAction a in actions) { if (!a.Equals(removeMe)) subset.Add(a); } return subset; } /** * Check that all items in 'test' are in 'state'. If just one does not match or is not there * then this returns false. 
*/ private bool inState(Dictionary<string, object> test, Dictionary<string, object> state) { var allMatch = true; foreach (var t in test) { var match = state.ContainsKey(t.Key) && state[t.Key].Equals(t.Value); if (!match) { allMatch = false; break; } } return allMatch; } private bool inResources(Dictionary<Int64, Int64> resourceGoal, Dictionary<Int64, Int64> currentResources) { var allMatch = true; foreach (var t in resourceGoal) { var match = currentResources.ContainsKey(t.Key) && currentResources[t.Key] >= t.Value; if (!match) { allMatch = false; break; } } return allMatch; } //if there is one true relationship private bool CondRelation(Dictionary<string, object> preconditions , Dictionary<string, object> effects) { foreach (var t in preconditions) { var match = effects.ContainsKey(t.Key) && effects[t.Key].Equals(t.Value); if (match) return true; } return false; } /** * Apply the stateChange to the currentState */ private Dictionary<string, object> populateState(Dictionary<string, object> currentState, Dictionary<string, object> stateChange) { Dictionary<string, object> state = new Dictionary<string, object>(); foreach (var s in currentState) { state.Add(s.Key, s.Value); } foreach (var change in stateChange) { // if the key exists in the current state, update the Value if (state.ContainsKey(change.Key)) { state[change.Key] = change.Value; } else { state.Add(change.Key, change.Value); } } return state; } private Dictionary<Int64, Int64> populateResource(Dictionary<Int64, Int64> currentResource, Dictionary<Int64, Int64> resourceChange) { Dictionary<Int64, Int64> resources = new Dictionary<Int64, Int64>(); foreach (var res in currentResource) { resources.Add(res.Key, res.Value); } foreach (var res in resourceChange) { if (!resources.ContainsKey(res.Key)) { resources.Add(res.Key, res.Value); } else { resources[res.Key] += res.Value; } } return resources; } } } // Licensed to the .NET Foundation under one or more agreements. // The .NET Foundation licenses this file to you under the MIT license. // See the LICENSE file in the project root for more information. // // using System; using System.Runtime.InteropServices; using Windows.Foundation; #pragma warning disable 436 // Redefining types from Windows.Foundation namespace Windows.UI.Xaml.Media.Media3D { // // Matrix3D is the managed projection of Windows.UI.Xaml.Media.Media3D.Matrix3D. Any // changes to the layout of this type must be exactly mirrored on the native WinRT side as well. // // Note that this type is owned by the Jupiter team. Please contact them before making any // changes here. // [StructLayout(LayoutKind.Sequential)] public struct Matrix3D : IFormattable { // Assuming this matrix has fourth column of 0,0,0,1 and isn't identity this function: // Returns false if HasInverse is false, otherwise inverts the matrix. 
private bool NormalizedAffineInvert() { double z20 = _m12 * _m23 - _m22 * _m13; double z10 = _m32 * _m13 - _m12 * _m33; double z00 = _m22 * _m33 - _m32 * _m23; double det = _m31 * z20 + _m21 * z10 + _m11 * z00; if (IsZero(det)) { return false; } // Compute 3x3 non-zero cofactors for the 2nd column double z21 = _m21 * _m13 - _m11 * _m23; double z11 = _m11 * _m33 - _m31 * _m13; double z01 = _m31 * _m23 - _m21 * _m33; // Compute all six 2x2 determinants of 1st two columns double y01 = _m11 * _m22 - _m21 * _m12; double y02 = _m11 * _m32 - _m31 * _m12; double y03 = _m11 * _offsetY - _offsetX * _m12; double y12 = _m21 * _m32 - _m31 * _m22; double y13 = _m21 * _offsetY - _offsetX * _m22; double y23 = _m31 * _offsetY - _offsetX * _m32; // Compute all non-zero and non-one 3x3 cofactors for 2nd // two columns double z23 = _m23 * y03 - _offsetZ * y01 - _m13 * y13; double z13 = _m13 * y23 - _m33 * y03 + _offsetZ * y02; double z03 = _m33 * y13 - _offsetZ * y12 - _m23 * y23; double z22 = y01; double z12 = -y02; double z02 = y12; double rcp = 1.0 / det; // Multiply all 3x3 cofactors by reciprocal & transpose _m11 = (z00 * rcp); _m12 = (z10 * rcp); _m13 = (z20 * rcp); _m21 = (z01 * rcp); _m22 = (z11 * rcp); _m23 = (z21 * rcp); _m31 = (z02 * rcp); _m32 = (z12 * rcp); _m33 = (z22 * rcp); _offsetX = (z03 * rcp); _offsetY = (z13 * rcp); _offsetZ = (z23 * rcp); return true; } // RETURNS true if has inverse & invert was done. Otherwise returns false & leaves matrix unchanged. private bool InvertCore() { if (IsAffine) { return NormalizedAffineInvert(); } // compute all six 2x2 determinants of 2nd two columns double y01 = _m13 * _m24 - _m23 * _m14; double y02 = _m13 * _m34 - _m33 * _m14; double y03 = _m13 * _m44 - _offsetZ * _m14; double y12 = _m23 * _m34 - _m33 * _m24; double y13 = _m23 * _m44 - _offsetZ * _m24; double y23 = _m33 * _m44 - _offsetZ * _m34; // Compute 3x3 cofactors for 1st the column double z30 = _m22 * y02 - _m32 * y01 - _m12 * y12; double z20 = _m12 * y13 - _m22 * y03 + _offsetY * y01; double z10 = _m32 * y03 - _offsetY * y02 - _m12 * y23; double z00 = _m22 * y23 - _m32 * y13 + _offsetY * y12; // Compute 4x4 determinant double det = _offsetX * z30 + _m31 * z20 + _m21 * z10 + _m11 * z00; if (IsZero(det)) { return false; } // Compute 3x3 cofactors for the 2nd column double z31 = _m11 * y12 - _m21 * y02 + _m31 * y01; double z21 = _m21 * y03 - _offsetX * y01 - _m11 * y13; double z11 = _m11 * y23 - _m31 * y03 + _offsetX * y02; double z01 = _m31 * y13 - _offsetX * y12 - _m21 * y23; // Compute all six 2x2 determinants of 1st two columns y01 = _m11 * _m22 - _m21 * _m12; y02 = _m11 * _m32 - _m31 * _m12; y03 = _m11 * _offsetY - _offsetX * _m12; y12 = _m21 * _m32 - _m31 * _m22; y13 = _m21 * _offsetY - _offsetX * _m22; y23 = _m31 * _offsetY - _offsetX * _m32; // Compute all 3x3 cofactors for 2nd two columns double z33 = _m13 * y12 - _m23 * y02 + _m33 * y01; double z23 = _m23 * y03 - _offsetZ * y01 - _m13 * y13; double z13 = _m13 * y23 - _m33 * y03 + _offsetZ * y02; double z03 = _m33 * y13 - _offsetZ * y12 - _m23 * y23; double z32 = _m24 * y02 - _m34 * y01 - _m14 * y12; double z22 = _m14 * y13 - _m24 * y03 + _m44 * y01; double z12 = _m34 * y03 - _m44 * y02 - _m14 * y23; double z02 = _m24 * y23 - _m34 * y13 + _m44 * y12; double rcp = 1.0 / det; // Multiply all 3x3 cofactors by reciprocal & transpose _m11 = (z00 * rcp); _m12 = (z10 * rcp); _m13 = (z20 * rcp); _m14 = (z30 * rcp); _m21 = (z01 * rcp); _m22 = (z11 * rcp); _m23 = (z21 * rcp); _m24 = (z31 * rcp); _m31 = (z02 * rcp); _m32 = (z12 * rcp); _m33 = (z22 * 
rcp); _m34 = (z32 * rcp); _offsetX = (z03 * rcp); _offsetY = (z13 * rcp); _offsetZ = (z23 * rcp); _m44 = (z33 * rcp); return true; } public Matrix3D(double m11, double m12, double m13, double m14, double m21, double m22, double m23, double m24, double m31, double m32, double m33, double m34, double offsetX, double offsetY, double offsetZ, double m44) { _m11 = m11; _m12 = m12; _m13 = m13; _m14 = m14; _m21 = m21; _m22 = m22; _m23 = m23; _m24 = m24; _m31 = m31; _m32 = m32; _m33 = m33; _m34 = m34; _offsetX = offsetX; _offsetY = offsetY; _offsetZ = offsetZ; _m44 = m44; } // the transform is identity by default // Actually fill in the fields - some (internal) code uses the fields directly for perf. private static Matrix3D s_identity = CreateIdentity(); public double M11 { get { return _m11; } set { _m11 = value; } } public double M12 { get { return _m12; } set { _m12 = value; } } public double M13 { get { return _m13; } set { _m13 = value; } } public double M14 { get { return _m14; } set { _m14 = value; } } public double M21 { get { return _m21; } set { _m21 = value; } } public double M22 { get { return _m22; } set { _m22 = value; } } public double M23 { get { return _m23; } set { _m23 = value; } } public double M24 { get { return _m24; } set { _m24 = value; } } public double M31 { get { return _m31; } set { _m31 = value; } } public double M32 { get { return _m32; } set { _m32 = value; } } public double M33 { get { return _m33; } set { _m33 = value; } } public double M34 { get { return _m34; } set { _m34 = value; } } public double OffsetX { get { return _offsetX; } set { _offsetX = value; } } public double OffsetY { get { return _offsetY; } set { _offsetY = value; } } public double OffsetZ { get { return _offsetZ; } set { _offsetZ = value; } } public double M44 { get { return _m44; } set { _m44 = value; } } public static Matrix3D Identity { get { return s_identity; } } public bool IsIdentity { get { return (_m11 == 1 && _m12 == 0 && _m13 == 0 && _m14 == 0 && _m21 == 0 && _m22 == 1 && _m23 == 0 && _m24 == 0 && _m31 == 0 && _m32 == 0 && _m33 == 1 && _m34 == 0 && _offsetX == 0 && _offsetY == 0 && _offsetZ == 0 && _m44 == 1); } } public override string ToString() { // Delegate to the internal method which implements all ToString calls. return ConvertToString(null /* format string */, null /* format provider */); } public string ToString(IFormatProvider provider) { // Delegate to the internal method which implements all ToString calls. return ConvertToString(null /* format string */, provider); } string IFormattable.ToString(string format, IFormatProvider provider) { // Delegate to the internal method which implements all ToString calls. return ConvertToString(format, provider); } private string ConvertToString(string format, IFormatProvider provider) { if (IsIdentity) { return "Identity"; } // Helper to get the numeric list separator for a given culture. 
char separator = TokenizerHelper.GetNumericListSeparator(provider); return String.Format(provider, "{1:" + format + "}{0}{2:" + format + "}{0}{3:" + format + "}{0}{4:" + format + "}{0}{5:" + format + "}{0}{6:" + format + "}{0}{7:" + format + "}{0}{8:" + format + "}{0}{9:" + format + "}{0}{10:" + format + "}{0}{11:" + format + "}{0}{12:" + format + "}{0}{13:" + format + "}{0}{14:" + format + "}{0}{15:" + format + "}{0}{16:" + format + "}", separator, _m11, _m12, _m13, _m14, _m21, _m22, _m23, _m24, _m31, _m32, _m33, _m34, _offsetX, _offsetY, _offsetZ, _m44); } public override int GetHashCode() { // Perform field-by-field XOR of HashCodes return M11.GetHashCode() ^ M12.GetHashCode() ^ M13.GetHashCode() ^ M14.GetHashCode() ^ M21.GetHashCode() ^ M22.GetHashCode() ^ M23.GetHashCode() ^ M24.GetHashCode() ^ M31.GetHashCode() ^ M32.GetHashCode() ^ M33.GetHashCode() ^ M34.GetHashCode() ^ OffsetX.GetHashCode() ^ OffsetY.GetHashCode() ^ OffsetZ.GetHashCode() ^ M44.GetHashCode(); } public override bool Equals(object o) { return o is Matrix3D && Matrix3D.Equals(this, (Matrix3D)o); } public bool Equals(Matrix3D value) { return Matrix3D.Equals(this, value); } public static bool operator ==(Matrix3D matrix1, Matrix3D matrix2) { return matrix1.M11 == matrix2.M11 && matrix1.M12 == matrix2.M12 && matrix1.M13 == matrix2.M13 && matrix1.M14 == matrix2.M14 && matrix1.M21 == matrix2.M21 && matrix1.M22 == matrix2.M22 && matrix1.M23 == matrix2.M23 && matrix1.M24 == matrix2.M24 && matrix1.M31 == matrix2.M31 && matrix1.M32 == matrix2.M32 && matrix1.M33 == matrix2.M33 && matrix1.M34 == matrix2.M34 && matrix1.OffsetX == matrix2.OffsetX && matrix1.OffsetY == matrix2.OffsetY && matrix1.OffsetZ == matrix2.OffsetZ && matrix1.M44 == matrix2.M44; } public static bool operator !=(Matrix3D matrix1, Matrix3D matrix2) { return !(matrix1 == matrix2); } public static Matrix3D operator *(Matrix3D matrix1, Matrix3D matrix2) { Matrix3D matrix3D = new Matrix3D(); matrix3D.M11 = matrix1.M11 * matrix2.M11 + matrix1.M12 * matrix2.M21 + matrix1.M13 * matrix2.M31 + matrix1.M14 * matrix2.OffsetX; matrix3D.M12 = matrix1.M11 * matrix2.M12 + matrix1.M12 * matrix2.M22 + matrix1.M13 * matrix2.M32 + matrix1.M14 * matrix2.OffsetY; matrix3D.M13 = matrix1.M11 * matrix2.M13 + matrix1.M12 * matrix2.M23 + matrix1.M13 * matrix2.M33 + matrix1.M14 * matrix2.OffsetZ; matrix3D.M14 = matrix1.M11 * matrix2.M14 + matrix1.M12 * matrix2.M24 + matrix1.M13 * matrix2.M34 + matrix1.M14 * matrix2.M44; matrix3D.M21 = matrix1.M21 * matrix2.M11 + matrix1.M22 * matrix2.M21 + matrix1.M23 * matrix2.M31 + matrix1.M24 * matrix2.OffsetX; matrix3D.M22 = matrix1.M21 * matrix2.M12 + matrix1.M22 * matrix2.M22 + matrix1.M23 * matrix2.M32 + matrix1.M24 * matrix2.OffsetY; matrix3D.M23 = matrix1.M21 * matrix2.M13 + matrix1.M22 * matrix2.M23 + matrix1.M23 * matrix2.M33 + matrix1.M24 * matrix2.OffsetZ; matrix3D.M24 = matrix1.M21 * matrix2.M14 + matrix1.M22 * matrix2.M24 + matrix1.M23 * matrix2.M34 + matrix1.M24 * matrix2.M44; matrix3D.M31 = matrix1.M31 * matrix2.M11 + matrix1.M32 * matrix2.M21 + matrix1.M33 * matrix2.M31 + matrix1.M34 * matrix2.OffsetX; matrix3D.M32 = matrix1.M31 * matrix2.M12 + matrix1.M32 * matrix2.M22 + matrix1.M33 * matrix2.M32 + matrix1.M34 * matrix2.OffsetY; matrix3D.M33 = matrix1.M31 * matrix2.M13 + matrix1.M32 * matrix2.M23 + matrix1.M33 * matrix2.M33 + matrix1.M34 * matrix2.OffsetZ; matrix3D.M34 = matrix1.M31 * matrix2.M14 + matrix1.M32 * matrix2.M24 + matrix1.M33 * matrix2.M34 + matrix1.M34 * matrix2.M44; matrix3D.OffsetX = matrix1.OffsetX * matrix2.M11 + 
matrix1.OffsetY * matrix2.M21 + matrix1.OffsetZ * matrix2.M31 + matrix1.M44 * matrix2.OffsetX; matrix3D.OffsetY = matrix1.OffsetX * matrix2.M12 + matrix1.OffsetY * matrix2.M22 + matrix1.OffsetZ * matrix2.M32 + matrix1.M44 * matrix2.OffsetY; matrix3D.OffsetZ = matrix1.OffsetX * matrix2.M13 + matrix1.OffsetY * matrix2.M23 + matrix1.OffsetZ * matrix2.M33 + matrix1.M44 * matrix2.OffsetZ; matrix3D.M44 = matrix1.OffsetX * matrix2.M14 + matrix1.OffsetY * matrix2.M24 + matrix1.OffsetZ * matrix2.M34 + matrix1.M44 * matrix2.M44; // matrix3D._type is not set. return matrix3D; } public bool HasInverse { get { return !IsZero(Determinant); } } public void Invert() { if (!InvertCore()) { throw new InvalidOperationException(); } } private static Matrix3D CreateIdentity() { Matrix3D matrix3D = new Matrix3D(); matrix3D.SetMatrix(1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1); return matrix3D; } private void SetMatrix(double m11, double m12, double m13, double m14, double m21, double m22, double m23, double m24, double m31, double m32, double m33, double m34, double offsetX, double offsetY, double offsetZ, double m44) { _m11 = m11; _m12 = m12; _m13 = m13; _m14 = m14; _m21 = m21; _m22 = m22; _m23 = m23; _m24 = m24; _m31 = m31; _m32 = m32; _m33 = m33; _m34 = m34; _offsetX = offsetX; _offsetY = offsetY; _offsetZ = offsetZ; _m44 = m44; } private static bool Equals(Matrix3D matrix1, Matrix3D matrix2) { return matrix1.M11.Equals(matrix2.M11) && matrix1.M12.Equals(matrix2.M12) && matrix1.M13.Equals(matrix2.M13) && matrix1.M14.Equals(matrix2.M14) && matrix1.M21.Equals(matrix2.M21) && matrix1.M22.Equals(matrix2.M22) && matrix1.M23.Equals(matrix2.M23) && matrix1.M24.Equals(matrix2.M24) && matrix1.M31.Equals(matrix2.M31) && matrix1.M32.Equals(matrix2.M32) && matrix1.M33.Equals(matrix2.M33) && matrix1.M34.Equals(matrix2.M34) && matrix1.OffsetX.Equals(matrix2.OffsetX) && matrix1.OffsetY.Equals(matrix2.OffsetY) && matrix1.OffsetZ.Equals(matrix2.OffsetZ) && matrix1.M44.Equals(matrix2.M44); } private double GetNormalizedAffineDeterminant() { double z20 = _m12 * _m23 - _m22 * _m13; double z10 = _m32 * _m13 - _m12 * _m33; double z00 = _m22 * _m33 - _m32 * _m23; return _m31 * z20 + _m21 * z10 + _m11 * z00; } private bool IsAffine { get { return (_m14 == 0.0 && _m24 == 0.0 && _m34 == 0.0 && _m44 == 1.0); } } private double Determinant { get { if (IsAffine) { return GetNormalizedAffineDeterminant(); } // compute all six 2x2 determinants of 2nd two columns double y01 = _m13 * _m24 - _m23 * _m14; double y02 = _m13 * _m34 - _m33 * _m14; double y03 = _m13 * _m44 - _offsetZ * _m14; double y12 = _m23 * _m34 - _m33 * _m24; double y13 = _m23 * _m44 - _offsetZ * _m24; double y23 = _m33 * _m44 - _offsetZ * _m34; // Compute 3x3 cofactors for 1st the column double z30 = _m22 * y02 - _m32 * y01 - _m12 * y12; double z20 = _m12 * y13 - _m22 * y03 + _offsetY * y01; double z10 = _m32 * y03 - _offsetY * y02 - _m12 * y23; double z00 = _m22 * y23 - _m32 * y13 + _offsetY * y12; return _offsetX * z30 + _m31 * z20 + _m21 * z10 + _m11 * z00; } } private static bool IsZero(double value) { return Math.Abs(value) < 10.0 * DBL_EPSILON_RELATIVE_1; } private const double DBL_EPSILON_RELATIVE_1 = 1.1102230246251567e-016; /* smallest such that 1.0+DBL_EPSILON != 1.0 */ private double _m11; private double _m12; private double _m13; private double _m14; private double _m21; private double _m22; private double _m23; private double _m24; private double _m31; private double _m32; private double _m33; private double _m34; private double _offsetX; private double 
_offsetY; private double _offsetZ; private double _m44; } } #pragma warning restore 436 using System; using System.Collections.Generic; using System.Linq; using System.Reactive; using System.Reactive.Linq; using System.Reactive.Subjects; using System.Reactive.Threading.Tasks; using System.Threading.Tasks; using F2F.ReactiveNavigation.ViewModel; using F2F.Testing.Xunit.FakeItEasy; using FakeItEasy; using FluentAssertions; using Microsoft.Reactive.Testing; using Ploeh.AutoFixture; using Ploeh.AutoFixture.AutoFakeItEasy; using ReactiveUI; using ReactiveUI.Testing; using Xunit; namespace F2F.ReactiveNavigation.UnitTests { public class ReactiveViewModel_BusyIndication_Test : AutoMockFeature { [Fact] public void IsBusy_ShouldBeFalseByDefault() { new TestScheduler().With(scheduler => { var sut = Fixture.Create<ReactiveViewModel>(); sut.InitializeAsync().Schedule(scheduler); sut.IsBusy.Should().BeFalse(); }); } [Fact] public void IsBusy_ShouldBeTrueWhenNotYetInitialized() { new TestScheduler().With(scheduler => { var sut = Fixture.Create<ReactiveViewModel>(); sut.IsBusy.Should().BeTrue(); }); } [Fact] public void Task_ShouldBeTrueWhenNavigateToAsyncIsExecuting() { new TestScheduler().With(scheduler => { var navigatedToObservable = Observable .Return(Unit.Default) .Delay(TimeSpan.FromMilliseconds(100), scheduler); var task = navigatedToObservable.ToTask(); task.IsCompleted.Should().BeFalse(); scheduler.AdvanceByMs(101); task.IsCompleted.Should().BeTrue(); }); } [Fact(Skip = "Rethink this test")] public void IsBusy_ShouldBeTrueWhenNavigateToAsyncIsExecuting() { new TestScheduler().With(scheduler => { var sut = Fixture.Create<ReactiveViewModel>(); var navigatedToObservable = Observable .Return(Unit.Default) .Delay(TimeSpan.FromMilliseconds(1), scheduler); sut.WhenNavigatedTo() .DoAsync(_ => navigatedToObservable.ToTask() as Task) .Subscribe(); sut.InitializeAsync().Schedule(scheduler); for (int i = 0; i < 10; i++) { sut.NavigateTo(null); scheduler.Advance(); // schedule navigation call sut.IsBusy.Should().BeTrue(); scheduler.Advance(); // pass delay in navigatedToObservable sut.IsBusy.Should().BeFalse(); } }); } [Fact(Skip = "Rethink this test")] public void IsBusy_ShouldBeTrueWhenNavigateToAsyncWithResultIsExecuting() { new TestScheduler().With(scheduler => { var sut = Fixture.Create<ReactiveViewModel>(); var navigatedToObservable = Observable .Return(Unit.Default) .Delay(TimeSpan.FromMilliseconds(1), scheduler); sut.WhenNavigatedTo() .DoAsync(_ => navigatedToObservable.ToTask()) .Subscribe(); sut.InitializeAsync().Schedule(scheduler); for (int i = 0; i < 10; i++) { sut.NavigateTo(null); scheduler.Advance(); // schedule navigation call sut.IsBusy.Should().BeTrue(); scheduler.Advance(); // pass delay in navigatedToObservable sut.IsBusy.Should().BeFalse(); } }); } [Fact] public void IsBusy_WhenHavingOneBusyObservable_ShouldBeTrueAsLongAsBusyObservableYieldsTrue() { new TestScheduler().With(scheduler => { var sut = A.Fake<ReactiveViewModel>(); var busySubject = new Subject<bool>(); A.CallTo(() => sut.BusyObservables).Returns(new[] { busySubject }); sut.IsBusy.Should().BeTrue(); sut.InitializeAsync().Schedule(scheduler); sut.IsBusy.Should().BeFalse(); busySubject.OnNext(true); sut.IsBusy.Should().BeTrue(); busySubject.OnNext(false); sut.IsBusy.Should().BeFalse(); }); } [Fact] public void IsBusy_WhenHavingTwoBusyObservables_ShouldBeTrueAsLongAsOneBusyObservableYieldsTrue() { new TestScheduler().With(scheduler => { var sut = A.Fake<ReactiveViewModel>(); var busySubject1 = new Subject<bool>(); var 
busySubject2 = new Subject<bool>(); A.CallTo(() => sut.BusyObservables).Returns(new[] { busySubject1, busySubject2 }); sut.IsBusy.Should().BeTrue(); sut.InitializeAsync().Schedule(scheduler); sut.IsBusy.Should().BeFalse(); busySubject1.OnNext(true); sut.IsBusy.Should().BeTrue(); busySubject2.OnNext(true); sut.IsBusy.Should().BeTrue(); busySubject1.OnNext(false); sut.IsBusy.Should().BeTrue(); busySubject2.OnNext(false); sut.IsBusy.Should().BeFalse(); }); } [Fact(Skip = "Rethink this test")] public void IsBusy_WhenHavingTwoBusyObservables_AndNavigation_ShouldBeTrueAsLongAsOneBusyObservableYieldsTrue() { new TestScheduler().With(scheduler => { var sut = A.Fake<ReactiveViewModel>(); var busySubject1 = new Subject<bool>(); var busySubject2 = new Subject<bool>(); A.CallTo(() => sut.BusyObservables).Returns(new[] { busySubject1, busySubject2 }); var navigatedToTask = Observable .Return(Unit.Default) .Delay(TimeSpan.FromMilliseconds(2), scheduler) .ToTask(); sut.IsBusy.Should().BeTrue(); sut.WhenNavigatedTo() .DoAsync(_ => navigatedToTask) .Subscribe(); sut.InitializeAsync().Schedule(scheduler); sut.IsBusy.Should().BeFalse(); sut.NavigateTo(NavigationParameters.Empty); sut.IsBusy.Should().BeFalse(); scheduler.Advance(); // schedule navigation call start sut.IsBusy.Should().BeTrue(); scheduler.Advance(); sut.IsBusy.Should().BeFalse(); // schedule navigation call end busySubject2.OnNext(true); sut.IsBusy.Should().BeTrue(); busySubject2.OnNext(false); sut.IsBusy.Should().BeFalse(); }); } [Fact] public void IsBusy_WhenBusyObservableThrowsObservedException_ShouldPushExceptionToThrownExceptionsObservable() { new TestScheduler().With(scheduler => { var sut = A.Fake<ReactiveViewModel>(); var exception = Fixture.Create<Exception>(); var errorSubject = new Subject<bool>(); A.CallTo(() => sut.BusyObservables).Returns(new[] { errorSubject }); sut.InitializeAsync(); var busyExceptions = sut.ThrownExceptions.CreateCollection(); errorSubject.OnError(exception); scheduler.Advance(); busyExceptions.Single().Should().Be(exception); }); } [Fact] public void IsBusy_WhenBusyObservableThrowsUnobservedException_ShouldThrowDefaultExceptionAtCallSite() { new TestScheduler().With(scheduler => { var sut = A.Fake<ReactiveViewModel>(); var exception = Fixture.Create<Exception>(); var errorSubject = new Subject<bool>(); A.CallTo(() => sut.BusyObservables).Returns(new[] { errorSubject }); sut.InitializeAsync(); errorSubject.OnError(exception); scheduler .Invoking(x => x.Advance()) .ShouldThrow<Exception>() .Which .InnerException .Should() .Be(exception); }); } } }
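The busy-indication tests above assert that a view model's IsBusy flag stays true while at least one of its BusyObservables reports true, and returns to false only once all of them do. The ReactiveViewModel implementation itself is not part of this excerpt, so the following is only a minimal Rx.NET sketch of that aggregation pattern, not F2F.ReactiveNavigation's actual code; the class and variable names here are illustrative assumptions:

using System;
using System.Linq;
using System.Reactive.Linq;
using System.Reactive.Subjects;

static class BusyAggregationSketch
{
    static void Main()
    {
        // Two independent busy sources, both idle to start with.
        var busy1 = new BehaviorSubject<bool>(false);
        var busy2 = new BehaviorSubject<bool>(false);

        // Busy while at least one source currently reports true; only surface changes.
        IObservable<bool> isBusy = Observable
            .CombineLatest(new IObservable<bool>[] { busy1, busy2 })
            .Select(flags => flags.Any(f => f))
            .DistinctUntilChanged();

        isBusy.Subscribe(b => Console.WriteLine("IsBusy = " + b)); // prints False initially

        busy1.OnNext(true);   // prints True
        busy2.OnNext(true);   // no output, still busy
        busy1.OnNext(false);  // no output, busy2 still reports true
        busy2.OnNext(false);  // prints False
    }
}

This mirrors the behaviour the IsBusy_WhenHavingTwoBusyObservables_* tests check, without claiming to reproduce the library's internals.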
https://huggingface.co/soarroyo
Andrea Arroyo soarroyo Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/datasets/microsoft/LCC_java
package de.mpii.rdf3x; import java.io.InputStream; import java.io.Reader; import java.sql.Array; import java.sql.Blob; import java.sql.Clob; import java.sql.Date; import java.sql.NClob; import java.sql.Ref; import java.sql.RowId; import java.sql.SQLException; import java.sql.SQLFeatureNotSupportedException; import java.sql.SQLWarning; import java.sql.SQLXML; import java.sql.Time; import java.sql.Timestamp; import java.util.Map; // RDF-3X // (c) 2009 Thomas Neumann. Web site: http://www.mpi-inf.mpg.de/~neumann/rdf3x // // This work is licensed under the Creative Commons // Attribution-Noncommercial-Share Alike 3.0 Unported License. To view a copy // of this license, visit http://creativecommons.org/licenses/by-nc-sa/3.0/ // or send a letter to Creative Commons, 171 Second Street, Suite 300, // San Francisco, California, 94105, USA. public final class ResultSet implements java.sql.ResultSet { // The header private String[] header; // The data private String[][] data; // The current position private int row; // The last column private int lastCol; // Constructor ResultSet(String[] header,String[][] data) { this.header=header; this.data=data; row=-1; } // Move absolutely public boolean absolute(int row) { if (row>0) { if (row>(data.length+1)) return false; this.row=row-1; return true; } else { if ((-row)>data.length) return false; this.row=data.length-row; return true; } } // Move after the last entry public void afterLast() { row=data.length; } // Move before the first entry public void beforeFirst() throws SQLException { throw new SQLFeatureNotSupportedException(); } // Cancel all updates public void cancelRowUpdates() throws SQLException { throw new SQLFeatureNotSupportedException(); } // Clear all warnings public void clearWarnings() {} // Releases resources public void close() { data=null; } // Deletes the current row public void deleteRow() throws SQLException { throw new SQLFeatureNotSupportedException(); } // Find a column public int findColumn(String columnLabel) throws SQLException { for (int index=0;index<header.length;index++) if (header[index].equals(columnLabel)) return index+1; throw new SQLException(); } // Go to the first entry public boolean first() { row=0; return row<data.length; } // Get an entry as array public Array getArray(int columnIndex) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Get an entry as array public Array getArray(String columnLabel) throws SQLException { return getArray(findColumn(columnLabel)); } // Get an entry as ascii stream public InputStream getAsciiStream(int columnIndex) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Get an entry as ascii stream public InputStream getAsciiStream(String columnLabel) throws SQLException { return getAsciiStream(findColumn(columnLabel)); } // Get an entry as big decimal public java.math.BigDecimal getBigDecimal(int columnIndex) throws SQLException { throw new SQLFeatureNotSupportedException(); } /** * Get an entry as big decimal * @deprecated */ public java.math.BigDecimal getBigDecimal(int columnIndex, int scale) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Get an entry as big decimal public java.math.BigDecimal getBigDecimal(String columnLabel) throws SQLException { return getBigDecimal(findColumn(columnLabel)); } /** * Get an entry as big decimal. 
* @deprecated */ public java.math.BigDecimal getBigDecimal(String columnLabel, int scale) throws SQLException { return getBigDecimal(findColumn(columnLabel),scale); } // Get an entry as binary stream public InputStream getBinaryStream(int columnIndex) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Get an entry as binary stream public InputStream getBinaryStream(String columnLabel) throws SQLException { return getBinaryStream(findColumn(columnLabel)); } // Get an entry as blob public Blob getBlob(int columnIndex) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Get an entry as blob public Blob getBlob(String columnLabel) throws SQLException { return getBlob(findColumn(columnLabel)); } // Get an entry as boolean public boolean getBoolean(int columnIndex) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Get an entry as boolean public boolean getBoolean(String columnLabel) throws SQLException { return getBoolean(findColumn(columnLabel)); } // Get an entry as byte public byte getByte(int columnIndex) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Get an entry as byte public byte getByte(String columnLabel) throws SQLException { return getByte(findColumn(columnLabel)); } // Get an entry as bytes public byte[] getBytes(int columnIndex) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Get an entry as bytes public byte[] getBytes(String columnLabel) throws SQLException { return getBytes(findColumn(columnLabel)); } // Get an entry as character stream public Reader getCharacterStream(int columnIndex) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Get an entry as character stream public Reader getCharacterStream(String columnLabel) throws SQLException { return getCharacterStream(findColumn(columnLabel)); } // Get an entry as clob public Clob getClob(int columnIndex) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Get an entry as clob public Clob getClob(String columnLabel) throws SQLException { return getClob(findColumn(columnLabel)); } // Get the concurrency setting public int getConcurrency() { return java.sql.ResultSet.CONCUR_READ_ONLY; } // Get the cursor name public String getCursorName() throws SQLException { throw new SQLFeatureNotSupportedException(); } // Get an entry as date public Date getDate(int columnIndex) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Get an entry as date public Date getDate(int columnIndex, java.util.Calendar cal) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Get an entry as date public Date getDate(String columnLabel) throws SQLException { return getDate(findColumn(columnLabel)); } // Get an entry as date public Date getDate(String columnLabel, java.util.Calendar cal) throws SQLException { return getDate(findColumn(columnLabel),cal); } // Get an entry as double public double getDouble(int columnIndex) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Get an entry as double public double getDouble(String columnLabel) throws SQLException { return getDouble(findColumn(columnLabel)); } // Get the fetch direction public int getFetchDirection() { return java.sql.ResultSet.FETCH_FORWARD; } // Get the fetch size public int getFetchSize() { return 0; } // Get an entry as float public float getFloat(int columnIndex) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Get an entry as float public float getFloat(String 
columnLabel) throws SQLException { return getFloat(findColumn(columnLabel)); } // Get the holdability public int getHoldability() { return java.sql.ResultSet.CLOSE_CURSORS_AT_COMMIT; } // Get an entry as int public int getInt(int columnIndex) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Get an entry as int public int getInt(String columnLabel) throws SQLException { return getInt(findColumn(columnLabel)); } // Get an entry as long public long getLong(int columnIndex) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Get an entry as long public long getLong(String columnLabel) throws SQLException { return getLong(findColumn(columnLabel)); } // Get the meta data public java.sql.ResultSetMetaData getMetaData() { return new ResultSetMetaData(header); } // Get an entry as stream public Reader getNCharacterStream(int columnIndex) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Get an entry as stream public Reader getNCharacterStream(String columnLabel) throws SQLException { return getNCharacterStream(findColumn(columnLabel)); } // Get an entry as nclob public NClob getNClob(int columnIndex) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Get an entry as nclob public NClob getNClob(String columnLabel) throws SQLException { return getNClob(findColumn(columnLabel)); } // Get an entry as string public String getNString(int columnIndex) throws SQLException { return getString(columnIndex); } // Get an entry as string public String getNString(String columnLabel) throws SQLException { return getNString(findColumn(columnLabel)); } // Get an entry public Object getObject(int columnIndex) throws SQLException { return getString(columnIndex); } // Get an entry public Object getObject(int columnIndex, Map<String,Class<?>> map) throws SQLException { return getString(columnIndex); } // Get an entry public Object getObject(String columnLabel) throws SQLException { return getObject(findColumn(columnLabel)); } // Get an entry public Object getObject(String columnLabel, Map<String,Class<?>> map) throws SQLException { return getObject(findColumn(columnLabel),map); } // Get an entry as ref public Ref getRef(int columnIndex) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Get an entry as ref public Ref getRef(String columnLabel) throws SQLException { return getRef(findColumn(columnLabel)); } // Get the current row number public int getRow() { return row+1; } // Get an entry as rowid public RowId getRowId(int columnIndex) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Get an entry as rowid public RowId getRowId(String columnLabel) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Get an entry as short public short getShort(int columnIndex) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Get an entry as short public short getShort(String columnLabel) throws SQLException { return getShort(findColumn(columnLabel)); } // Get an entry as SQL public SQLXML getSQLXML(int columnIndex) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Get an entry as SQL public SQLXML getSQLXML(String columnLabel) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Get the corresponding statement public Statement getStatement() throws SQLException { throw new SQLFeatureNotSupportedException(); } // Get an entry as string public String getString(int columnIndex) throws SQLException { if 
((row>=data.length)||(columnIndex<1)||(columnIndex>data[row].length)) throw new SQLException(); String s=data[row][columnIndex-1]; lastCol=columnIndex; if ("NULL".equals(s)) return null; else return s; } // Get an entry as string public String getString(String columnLabel) throws SQLException { return getString(findColumn(columnLabel)); } // Get an entry as time public Time getTime(int columnIndex) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Get an entry as time public Time getTime(int columnIndex, java.util.Calendar cal) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Get an entry as time public Time getTime(String columnLabel) throws SQLException { return getTime(findColumn(columnLabel)); } // Get an entry as tme public Time getTime(String columnLabel, java.util.Calendar cal) throws SQLException { return getTime(findColumn(columnLabel),cal); } // Get an entry as timestamp public Timestamp getTimestamp(int columnIndex) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Get an entry as timestamp public Timestamp getTimestamp(int columnIndex, java.util.Calendar cal) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Get an entry as timestamp public Timestamp getTimestamp(String columnLabel) throws SQLException { return getTimestamp(findColumn(columnLabel)); } // Get an entry as timestamp public Timestamp getTimestamp(String columnLabel, java.util.Calendar cal) throws SQLException { return getTimestamp(findColumn(columnLabel),cal); } // Get the type public int getType() { return java.sql.ResultSet.TYPE_FORWARD_ONLY; } /** * Get an entry as unicode stream * @deprecated */ public InputStream getUnicodeStream(int columnIndex) throws SQLException { throw new SQLFeatureNotSupportedException(); } /** * Get an entry as unicode stream * @deprecated */ public InputStream getUnicodeStream(String columnLabel) throws SQLException { return getUnicodeStream(findColumn(columnLabel)); } // Get an entry as URL public java.net.URL getURL(int columnIndex) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Get an entry as URL public java.net.URL getURL(String columnLabel) throws SQLException { return getURL(findColumn(columnLabel)); } // Get warnings public SQLWarning getWarnings() { return null; } // Insert a row public void insertRow() throws SQLException { throw new SQLFeatureNotSupportedException(); } // After the last row public boolean isAfterLast() { return row>=data.length; } // Before the first row public boolean isBeforeFirst() { return false; } // Closed public boolean isClosed() { return data==null; } // At first row public boolean isFirst() { return row==0; } // At last row public boolean isLast() { return row==(data.length-1); } // Go to the last row public boolean last() { if (data.length>0) { row=data.length-1; return true; } else return false; } // Move the cursor public void moveToCurrentRow() throws SQLException { throw new SQLFeatureNotSupportedException(); } // Move the cursor public void moveToInsertRow() throws SQLException { throw new SQLFeatureNotSupportedException(); } // Go to the next row public boolean next() { if (row>=data.length) return false; ++row; return row<data.length; } // Go to the previous row public boolean previous() { if (row==0) return false; --row; return true; } // Refresh the current tow public void refreshRow() {} // Move the cursor relatively public boolean relative(int rows) { if (rows>=0) { if (row+rows>=data.length) { row=data.length; return 
false; } else { row+=rows; return true; } } else { if (row+rows<0) { row=0; return true; } else { row+=rows; return true; } } } // Deleted public boolean rowDeleted() { return false; } // Inserted public boolean rowInserted() { return false; } // Updated public boolean rowUpdated() { return false; } // Fetch direction public void setFetchDirection(int direction) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Fetch size public void setFetchSize(int rows) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateArray(int columnIndex, Array x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateArray(String columnLabel, Array x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateAsciiStream(int columnIndex, InputStream x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateAsciiStream(int columnIndex, InputStream x, int length) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateAsciiStream(int columnIndex, InputStream x, long length) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateAsciiStream(String columnLabel, InputStream x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateAsciiStream(String columnLabel, InputStream x, int length) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateAsciiStream(String columnLabel, InputStream x, long length) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateBigDecimal(int columnIndex, java.math.BigDecimal x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateBigDecimal(String columnLabel, java.math.BigDecimal x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateBinaryStream(int columnIndex, InputStream x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateBinaryStream(int columnIndex, InputStream x, int length) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateBinaryStream(int columnIndex, InputStream x, long length) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateBinaryStream(String columnLabel, InputStream x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateBinaryStream(String columnLabel, InputStream x, int length) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateBinaryStream(String columnLabel, InputStream x, long length) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateBlob(int columnIndex, Blob x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateBlob(int columnIndex, InputStream inputStream) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateBlob(int columnIndex, InputStream inputStream, long length) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateBlob(String columnLabel, Blob x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateBlob(String columnLabel, InputStream
inputStream) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateBlob(String columnLabel, InputStream inputStream, long length) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateBoolean(int columnIndex, boolean x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateBoolean(String columnLabel, boolean x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateByte(int columnIndex, byte x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateByte(String columnLabel, byte x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateBytes(int columnIndex, byte[] x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateBytes(String columnLabel, byte[] x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateCharacterStream(int columnIndex, Reader x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateCharacterStream(int columnIndex, Reader x, int length) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateCharacterStream(int columnIndex, Reader x, long length) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateCharacterStream(String columnLabel, Reader reader) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateCharacterStream(String columnLabel, Reader reader, int length) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateCharacterStream(String columnLabel, Reader reader, long length) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateClob(int columnIndex, Clob x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateClob(int columnIndex, Reader reader) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateClob(int columnIndex, Reader reader, long length) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateClob(String columnLabel, Clob x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateClob(String columnLabel, Reader reader) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateClob(String columnLabel, Reader reader, long length) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateDate(int columnIndex, Date x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateDate(String columnLabel, Date x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateDouble(int columnIndex, double x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateDouble(String columnLabel, double x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateFloat(int columnIndex, float x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateFloat(String columnLabel, float x) throws SQLException { throw new SQLFeatureNotSupportedException(); } 
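/*
 * Usage sketch (added for clarity; not part of the original class, and "rs" and "someColumn"
 * below are illustrative names only). This ResultSet is a read-only, TYPE_FORWARD_ONLY view
 * over the in-memory String[][] "data" array, with column labels resolved through findColumn()
 * and the "header" row exposed via getMetaData(). Only the string-oriented getters (getString,
 * getNString, getObject) return values; the typed getters by index and all update* methods
 * throw SQLFeatureNotSupportedException. A caller would therefore iterate it like any other
 * forward-only result set:
 *
 *   while (rs.next()) {
 *       String value = rs.getString("someColumn");
 *       if (value == null && rs.wasNull()) {
 *           // the backing array held the literal marker "NULL" for this cell
 *       }
 *   }
 */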
// Update public void updateInt(int columnIndex, int x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateInt(String columnLabel, int x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateLong(int columnIndex, long x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateLong(String columnLabel, long x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateNCharacterStream(int columnIndex, Reader x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateNCharacterStream(int columnIndex, Reader x, long length) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateNCharacterStream(String columnLabel, Reader reader) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateNCharacterStream(String columnLabel, Reader reader, long length) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateNClob(int columnIndex, NClob nClob) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateNClob(int columnIndex, Reader reader) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateNClob(int columnIndex, Reader reader, long length) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateNClob(String columnLabel, NClob nClob) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateNClob(String columnLabel, Reader reader) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateNClob(String columnLabel, Reader reader, long length) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateNString(int columnIndex, String nString) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateNString(String columnLabel, String nString) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateNull(int columnIndex) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateNull(String columnLabel) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateObject(int columnIndex, Object x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateObject(int columnIndex, Object x, int scaleOrLength) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateObject(String columnLabel, Object x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateObject(String columnLabel, Object x, int scaleOrLength) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateRef(int columnIndex, Ref x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateRef(String columnLabel, Ref x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateRow() throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateRowId(int columnIndex, RowId x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void 
updateRowId(String columnLabel, RowId x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateShort(int columnIndex, short x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateShort(String columnLabel, short x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateSQLXML(int columnIndex, SQLXML xmlObject) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateSQLXML(String columnLabel, SQLXML xmlObject) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateString(int columnIndex, String x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateString(String columnLabel, String x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateTime(int columnIndex, Time x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateTime(String columnLabel, Time x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateTimestamp(int columnIndex, Timestamp x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Update public void updateTimestamp(String columnLabel, Timestamp x) throws SQLException { throw new SQLFeatureNotSupportedException(); } // Was the last column NULL? public boolean wasNull() throws SQLException { return getString(lastCol)==null; } // Wrapper? public boolean isWrapperFor(Class<?> iface) { return false; } // Unwrap public <T> T unwrap(Class<T> iface) throws SQLException { throw new SQLException(); } public <T> T getObject(int columnIndex, Class<T> type) throws SQLException { // TODO Auto-generated method stub return null; } public <T> T getObject(String columnLabel, Class<T> type) throws SQLException { // TODO Auto-generated method stub return null; } } /* * Copyright 2011 Greg Haines * * Licensed under the Apache License, Version 2.0 (the "License"); * you may not use this file except in compliance with the License. * You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 * * Unless required by applicable law or agreed to in writing, software * distributed under the License is distributed on an "AS IS" BASIS, * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. * See the License for the specific language governing permissions and * limitations under the License. */ package net.greghaines.jesque.meta.dao.impl; import static net.greghaines.jesque.utils.ResqueConstants.FAILED; import static net.greghaines.jesque.utils.ResqueConstants.QUEUE; import static net.greghaines.jesque.utils.ResqueConstants.QUEUES; import static net.greghaines.jesque.utils.ResqueConstants.STAT; import java.io.IOException; import java.util.ArrayList; import java.util.Date; import java.util.List; import java.util.UUID; import net.greghaines.jesque.Config; import net.greghaines.jesque.Job; import net.greghaines.jesque.JobFailure; import net.greghaines.jesque.json.ObjectMapperFactory; import net.greghaines.jesque.meta.dao.FailureDAO; import net.greghaines.jesque.utils.JesqueUtils; import net.greghaines.jesque.utils.PoolUtils; import net.greghaines.jesque.utils.PoolUtils.PoolWork; import redis.clients.jedis.Jedis; import redis.clients.jedis.util.Pool; /** * Accesses failure information about Jesque/Resque from Redis. 
* * @author Greg Haines */ public class FailureDAORedisImpl implements FailureDAO { private final Config config; private final Pool<Jedis> jedisPool; /** * Constructor. * @param config the Jesque configuration * @param jedisPool the connection pool to Redis */ public FailureDAORedisImpl(final Config config, final Pool<Jedis> jedisPool) { if (config == null) { throw new IllegalArgumentException("config must not be null"); } if (jedisPool == null) { throw new IllegalArgumentException("jedisPool must not be null"); } this.config = config; this.jedisPool = jedisPool; } /** * {@inheritDoc} */ @Override public long getCount() { return PoolUtils.doWorkInPoolNicely(this.jedisPool, new PoolWork<Jedis, Long>() { /** * {@inheritDoc} */ @Override public Long doWork(final Jedis jedis) throws Exception { final String failedStr = jedis.get(key(STAT, FAILED)); return (failedStr == null) ? 0L : Long.parseLong(failedStr); } }); } /** * {@inheritDoc} */ @Override public long getFailQueueJobCount() { return PoolUtils.doWorkInPoolNicely(this.jedisPool, new PoolWork<Jedis, Long>() { /** * {@inheritDoc} */ @Override public Long doWork(final Jedis jedis) throws Exception { return jedis.llen(key(FAILED)); } }); } /** * {@inheritDoc} */ @Override public List<JobFailure> getFailures(final long offset, final long count) { return PoolUtils.doWorkInPoolNicely(this.jedisPool, new PoolWork<Jedis, List<JobFailure>>() { /** * {@inheritDoc} */ @Override public List<JobFailure> doWork(final Jedis jedis) throws Exception { final List<String> payloads = jedis.lrange(key(FAILED), offset, offset + count - 1); final List<JobFailure> failures = new ArrayList<JobFailure>(payloads.size()); for (final String payload : payloads) { if (payload.charAt(0) == '{') { // Ignore non-JSON strings failures.add(ObjectMapperFactory.get().readValue(payload, JobFailure.class)); } } return failures; } }); } /** * {@inheritDoc} */ @Override public void clear() { PoolUtils.doWorkInPoolNicely(this.jedisPool, new PoolWork<Jedis, Void>() { /** * {@inheritDoc} */ @Override public Void doWork(final Jedis jedis) throws Exception { jedis.del(key(FAILED)); return null; } }); } /** * {@inheritDoc} */ @Override public Date requeue(final long index) { Date retryDate = null; final List<JobFailure> failures = getFailures(index, 1); if (!failures.isEmpty()) { retryDate = PoolUtils.doWorkInPoolNicely(this.jedisPool, new PoolWork<Jedis, Date>() { /** * {@inheritDoc} */ @Override public Date doWork(final Jedis jedis) throws Exception { final Date retriedAt = new Date(); final JobFailure failure = failures.get(0); failure.setRetriedAt(retriedAt); jedis.lset(key(FAILED), index, ObjectMapperFactory.get().writeValueAsString(failure)); enqueue(jedis, failure.getQueue(), failure.getPayload()); return retriedAt; } }); } return retryDate; } /** * {@inheritDoc} */ @Override public void remove(final long index) { PoolUtils.doWorkInPoolNicely(this.jedisPool, new PoolWork<Jedis, Void>() { /** * {@inheritDoc} */ @Override public Void doWork(final Jedis jedis) throws Exception { final String failedKey = key(FAILED); final String randId = UUID.randomUUID().toString(); jedis.lset(failedKey, index, randId); jedis.lrem(failedKey, 1, randId); return null; } }); } protected void enqueue(final Jedis jedis, final String queue, final Job job) throws IOException { if (queue == null || "".equals(queue)) { throw new IllegalArgumentException("queue must not be null or empty: " + queue); } if (job == null) { throw new IllegalArgumentException("job must not be null"); } if (!job.isValid()) { 
throw new IllegalStateException("job is not valid: " + job); } final String msg = ObjectMapperFactory.get().writeValueAsString(job); jedis.sadd(key(QUEUES), queue); jedis.rpush(key(QUEUE, queue), msg); } /** * Builds a namespaced Redis key with the given arguments. * * @param parts * the key parts to be joined * @return an assembled String key */ private String key(final String... parts) { return JesqueUtils.createKey(this.config.getNamespace(), parts); } } /** * Portions Copyright 2001 Sun Microsystems, Inc. * Portions Copyright 1999-2001 Language Technologies Institute, * Carnegie Mellon University. * All Rights Reserved. Use is subject to license terms. * * See the file "license.terms" for information on usage and * redistribution of this file, and for a DISCLAIMER OF ALL * WARRANTIES. */ package com.sun.speech.freetts; import java.io.BufferedReader; import java.io.IOException; import java.io.InputStream; import java.io.InputStreamReader; import java.io.PrintWriter; import java.io.Reader; import java.net.URL; import java.util.ArrayList; import java.util.Collections; import java.util.HashMap; import java.util.Iterator; import java.util.List; import java.util.Locale; import java.util.Map; import java.util.logging.Level; import java.util.logging.Logger; import org.w3c.dom.Document; import org.w3c.dom.Node; import org.w3c.dom.Text; import com.sun.speech.freetts.audio.AudioPlayer; import com.sun.speech.freetts.lexicon.Lexicon; import com.sun.speech.freetts.relp.LPCResult; import com.sun.speech.freetts.util.BulkTimer; import com.sun.speech.freetts.util.Utilities; /** * Performs text-to-speech using a series of * <code>UtteranceProcessors</code>. It is the main conduit to the FreeTTS * speech synthesizer. It can perform TTS on ASCII text, * a JSML document, an <code>InputStream</code>, or a * <code>FreeTTSSpeakable</code>, by invoking the method <code>speak</code>. * * <p>Before a Voice can perform TTS, it must have a * <code>Lexicon</code>, from which it gets the vocabulary, and * an <code>AudioPlayer</code>, to which it sends the synthesized output. * * <p><b>Example</b> (using the <code>CMUDiphoneVoice</code>, * <code>CMULexicon</code> and <code>JavaClipAudioPlayer</code>): * * <pre> * Voice voice = new CMUDiphoneVoice(); * * // sets the Lexicon * voice.setLexicon(new CMULexicon()); * * // sets the AudioPlayer * voice.setAudioPlayer(new JavaClipAudioPlayer()); * * // loads the Voice * voice.allocate(); * * // start talking * voice.speak("I can talk forever without getting tired!"); * </pre> * * * <p>A user can override the AudioPlayer to use by defining the * "com.sun.speech.freetts.voice.defaultAudioPlayer" system property. * The value of this property must be the name of a class that * implements the AudioPlayer interface, and which also has a no-arg * constructor. * * @see VoiceManager * @see VoiceDirectory */ public abstract class Voice implements UtteranceProcessor, Dumpable { /** Logger instance. */ private static final Logger LOGGER = Logger.getLogger(Voice.class.getName()); /** * Constant that describes the name of the unit database used by * this voice. 
*/ public final static String DATABASE_NAME = "databaseName"; private List utteranceProcessors; private Map featureProcessors; private FeatureSetImpl features; private boolean metrics = false; private boolean detailedMetrics = false; private boolean dumpUtterance = false; private boolean dumpRelations = false; private String runTitle = "unnamed run"; private Lexicon lexicon = null; private AudioPlayer defaultAudioPlayer = null; private AudioPlayer audioPlayer = null; private UtteranceProcessor audioOutput; private OutputQueue outputQueue = null; private String waveDumpFile = null; private BulkTimer runTimer = new BulkTimer(); private BulkTimer threadTimer = new BulkTimer(); private boolean externalOutputQueue = false; private boolean externalAudioPlayer = false; private float nominalRate = 150; // nominal speaking rate for this voice private float pitch = 100; // pitch baseline (hertz) private float range = 10; // pitch range (hertz) private float pitchShift = 1; // F0 Shift private float volume = 0.8f; // the volume (range 0 to 1) private float durationStretch = 1f; // the duration stretch private boolean loaded = false; private String name = "default_name"; private Age age = Age.DONT_CARE; private Gender gender = Gender.DONT_CARE; private String description = "default description"; private Locale locale = Locale.getDefault(); private String domain = "general"; private String style = "standard"; private String organization = "unknown"; /** * Prefix for System property names. */ public final static String PROP_PREFIX = "com.sun.speech.freetts.voice."; /** * Feature name for the silence phone string. */ public final static String FEATURE_SILENCE = "silence"; /** * Feature name for the join type string. */ public final static String FEATURE_JOIN_TYPE = "join_type"; /** * Feature name for the default AudioPlayer class to use. */ public final static String DEFAULT_AUDIO_PLAYER = PROP_PREFIX + "defaultAudioPlayer"; /** * The default class to use for the DEFAULT_AUDIO_PLAYER. */ public final static String DEFAULT_AUDIO_PLAYER_DEFAULT = "com.sun.speech.freetts.audio.JavaStreamingAudioPlayer"; /** * Creates a new Voice. Utterances are sent to an * output queue to be rendered as audio. Utterances are placed * on the queue by an output thread. This * queue is usually created via a call to 'createOutputThread,' * which creates a thread that waits on the queue and sends the * output to the audio player associated with this voice. If * the queue is null, the output is rendered in the calling * thread. * * @see #createOutputThread */ public Voice() { /* Make the utteranceProcessors a synchronized list to avoid * some threading issues. */ utteranceProcessors = Collections.synchronizedList(new ArrayList()); features = new FeatureSetImpl(); featureProcessors = new HashMap(); try { nominalRate = Float.parseFloat( Utilities.getProperty(PROP_PREFIX + "speakingRate","150")); pitch = Float.parseFloat( Utilities.getProperty(PROP_PREFIX + "pitch","100")); range = Float.parseFloat( Utilities.getProperty(PROP_PREFIX + "range","10")); volume = Float.parseFloat( Utilities.getProperty(PROP_PREFIX + "volume","1.0")); } catch (SecurityException se) { // can't get properties, just use defaults } outputQueue = null; audioPlayer = null; defaultAudioPlayer = null; } /** * Creates a new Voice like above, except that it also * stores the properties of the voice. 
* @param name the name of the voice * @param gender the gender of the voice * @param age the age of the voice * @param description a human-readable string providing a * description that can be displayed for the users. * @param locale the locale of the voice * @param domain the domain of this voice. For example, * &quot;general&quot;, &quot;time&quot;, or * &quot;weather&quot;. * @param organization the organization which created the voice * * @see #Voice() */ public Voice(String name, Gender gender, Age age, String description, Locale locale, String domain, String organization) { this(); setName(name); setGender(gender); setAge(age); setDescription(description); setLocale(locale); setDomain(domain); setOrganization(organization); } /** * Speaks the given text. * * @param text the text to speak * * @return <code>true</code> if the given text is spoken properly; * otherwise <code>false</code> */ public boolean speak(String text) { return speak(new FreeTTSSpeakableImpl(text)); } /** * Speaks the given document. * * @param doc the JSML document to speak * * @return <code>true</code> if the given document is spoken properly; * otherwise <code>false</code> */ public boolean speak(Document doc) { return speak(new FreeTTSSpeakableImpl(doc)); } /** * Speaks the input stream. * * @param inputStream the inputStream to speak * * @return <code>true</code> if the given input stream is spoken properly; * otherwise <code>false</code> */ public boolean speak(InputStream inputStream) { return speak(new FreeTTSSpeakableImpl(inputStream)); } /** * Speak the given queue item. This is a synchronous method that * does not return until the speakable is completely * spoken or has been cancelled. * * @param speakable the item to speak * * @return <code>true</code> if the utterance was spoken properly, * <code>false</code> otherwise */ public boolean speak(FreeTTSSpeakable speakable) { if (LOGGER.isLoggable(Level.FINE)) { LOGGER.fine("speak(FreeTTSSpeakable) called"); } boolean ok = true; boolean posted = false; getAudioPlayer().startFirstSampleTimer(); for (Iterator i = tokenize(speakable); !speakable.isCompleted() && i.hasNext() ; ) { try { Utterance utterance = (Utterance) i.next(); if (utterance != null) { processUtterance(utterance); posted = true; } } catch (ProcessException pe) { ok = false; } } if (ok && posted) { runTimer.start("WaitAudio"); ok = speakable.waitCompleted(); runTimer.stop("WaitAudio"); } if (LOGGER.isLoggable(Level.FINE)) { LOGGER.fine("speak(FreeTTSSpeakable) completed"); } return ok; } /** * @deprecated As of FreeTTS 1.2, replaced by {@link #allocate}. */ public void load() { allocate(); } /** * Allocate this Voice. It loads the lexicon and the * audio output handler, and creates an audio output thread by * invoking <code>createOutputThread()</code>, if * one is not already created. It then calls the <code>loader()</code> * method to load Voice-specific data, which include utterance processors.
*/ public void allocate() { if (isLoaded()) { return; } BulkTimer.LOAD.start(); if (!lexicon.isLoaded()) { try { lexicon.load(); } catch (IOException ioe) { LOGGER.severe("Can't load voice " + ioe); throw new Error(ioe); } } try { audioOutput = getAudioOutput(); } catch (IOException ioe) { LOGGER.severe("Can't load audio output handler for voice " + ioe); throw new Error(ioe); } if (outputQueue == null) { outputQueue = createOutputThread(); } try { loader(); } catch (IOException ioe) { LOGGER.severe("Can't load voice " + ioe); throw new Error(ioe); } BulkTimer.LOAD.stop(); if (isMetrics()) { BulkTimer.LOAD.show("loading " + toString() + " for " + getRunTitle()); } setLoaded(true); } /** * Returns true if this voice is loaded. * * @return <code>true</code> if the voice is loaded; * otherwise <code>false</code> */ public boolean isLoaded() { return loaded; } /** * Sets the loaded state. * * @param loaded the new loaded state */ protected void setLoaded(boolean loaded) { this.loaded = loaded; } /** * Processes the given Utterance by passing it to each * UtteranceProcessor managed by this Voice. The * UtteranceProcessors are called in the order they were added to * the Voice. * * @param u the Utterance to process * * @throws ProcessException if an exception occurred while performing * operations on the Utterance */ public void processUtterance(Utterance u) throws ProcessException { UtteranceProcessor[] processors; if (utteranceProcessors == null) { return; } if (u == null) { throw new ProcessException("Utterance is null."); } runTimer.start("processing"); processors = new UtteranceProcessor[utteranceProcessors.size()]; processors = (UtteranceProcessor[]) utteranceProcessors.toArray(processors); if (LOGGER.isLoggable(Level.FINE)) { LOGGER.fine("Processing Utterance: " + u.getString("input_text")); } try { for (int i = 0; i < processors.length && !u.getSpeakable().isCompleted(); i++) { runProcessor(processors[i], u, runTimer); } if (!u.getSpeakable().isCompleted()) { if (outputQueue == null) { if (LOGGER.isLoggable(Level.FINE)) { LOGGER.fine("To AudioOutput"); } outputUtterance(u, runTimer); } else { runTimer.start("..post"); outputQueue.post(u); runTimer.stop("..post"); } } } catch (ProcessException pe) { System.err.println("Processing Utterance: " + pe); } catch (Exception e) { System.err.println("Trouble while processing utterance " + e); e.printStackTrace(); u.getSpeakable().cancelled(); } if (LOGGER.isLoggable(Level.FINE)) { LOGGER.fine("Done Processing Utterance: " + u.getString("input_text")); } runTimer.stop("processing"); if (dumpUtterance) { u.dump("Utterance"); } if (dumpRelations) { u.dumpRelations("Utterance"); } dumpASCII(u); } /** * Dumps the wave for the given utterance. * * @param utterance the utterance of interest */ private void dumpASCII(Utterance utterance) { if (waveDumpFile != null) { LPCResult lpcResult = (LPCResult) utterance.getObject("target_lpcres"); try { if (waveDumpFile.equals("-")) { lpcResult.dumpASCII(); } else { lpcResult.dumpASCII(waveDumpFile); } } catch (IOException ioe) { LOGGER.severe("Can't dump file to " + waveDumpFile + " " + ioe); throw new Error(ioe); } } } /** * Creates an output thread that will asynchronously * output utterances that are generated by this voice (and other * voices). * * @return the queue where utterances should be placed.
*/ public static OutputQueue createOutputThread() { final OutputQueue queue = new OutputQueue(); Thread t = new Thread() { public void run() { Utterance utterance = null; do { utterance = queue.pend(); if (utterance != null) { Voice voice = utterance.getVoice(); if (LOGGER.isLoggable(Level.FINE)) { LOGGER.fine("OUT: " + utterance.getString("input_text")); } voice.outputUtterance(utterance, voice.threadTimer); } } while (utterance != null); } }; t.setDaemon(true); t.start(); return queue; } /** * Sends the given utterance to the audio output processor * associated with this voice. If the queue item associated with * this utterance is completed, then this set of utterances has * been cancelled or otherwise aborted and the utterance should * not be output. * * @param utterance the utterance to be output * @param timer the timer for gathering performance metrics * * @return true if the utterance was output properly; otherwise * false */ private boolean outputUtterance(Utterance utterance, BulkTimer timer) { boolean ok = true; FreeTTSSpeakable speakable = utterance.getSpeakable(); if (!speakable.isCompleted()) { if (utterance.isFirst()) { getAudioPlayer().reset(); speakable.started(); if (LOGGER.isLoggable(Level.FINE)) { LOGGER.fine(" --- started ---"); } } // log(" utt: " + utterance.getString("input_text")); try { if (!speakable.isCompleted()) { runProcessor(audioOutput, utterance, timer); } else { ok = false; } } catch (ProcessException pe) { ok = false; } if (ok && utterance.isLast()) { getAudioPlayer().drain(); speakable.completed(); if (LOGGER.isLoggable(Level.FINE)) { LOGGER.fine(" --- completed ---"); } } else if (!ok) { // getAudioPlayer().drain(); speakable.cancelled(); if (LOGGER.isLoggable(Level.FINE)) { LOGGER.fine(" --- cancelled ---"); } } else { if (LOGGER.isLoggable(Level.FINE)) { LOGGER.fine(" --- not last: " + speakable.getText() + " --- "); } } if (LOGGER.isLoggable(Level.FINE)) { LOGGER.fine("Calling speakable.completed() on " + speakable.getText()); } } else { ok = false; if (LOGGER.isLoggable(Level.FINE)) { LOGGER.fine("STRANGE: speakable already completed: " + speakable.getText()); } } return ok; } /** * Runs the given utterance processor. * * @param processor the processor to run. If the processor * is null, it is ignored * @param utterance the utterance to process * * @throws ProcessException if an exception occurs while processing * the utterance */ private void runProcessor(UtteranceProcessor processor, Utterance utterance, BulkTimer timer) throws ProcessException { if (processor != null) { String processorName = ".." + processor.toString(); if (LOGGER.isLoggable(Level.FINE)) { LOGGER.fine(" Running " + processorName); } timer.start(processorName); processor.processUtterance(utterance); timer.stop(processorName); } } /** * Returns the tokenizer associated with this voice. * * @return the tokenizer */ public abstract Tokenizer getTokenizer(); /** * Return the list of UtteranceProcessor instances. Applications * should use this to obtain and modify the contents of the * UtteranceProcessor list. * * @return a List containing UtteranceProcessor instances */ public List getUtteranceProcessors() { return utteranceProcessors; } /** * Returns the feature set associated with this voice. * * @return the feature set. */ public FeatureSet getFeatures() { return features; } /** * Starts a batch of utterances. Utterances are sometimes * batched in groups for timing purposes.
* * @see #endBatch */ public void startBatch() { runTimer.setVerbose(detailedMetrics); runTimer.start(); } /** * Ends a batch of utterances. * * @see #startBatch */ public void endBatch() { runTimer.stop(); if (metrics) { runTimer.show(getRunTitle() + " run"); threadTimer.show(getRunTitle() + " thread"); getAudioPlayer().showMetrics(); long totalMemory = Runtime.getRuntime().totalMemory(); LOGGER.info ("Memory Use : " + (totalMemory - Runtime.getRuntime().freeMemory()) / 1024 + "k of " + totalMemory / 1024 + "k"); } } /** * Sets the output queue for this voice. If no output queue is set * for the voice when the voice is loaded, a queue and thread will * be created when the voice is loaded. If the outputQueue is set * by an external entity by calling setOutputQueue, the caller is * responsible for shutting down the output thread. That is, if * you call 'setOutputQueue' then you are responsible for shutting * down the output thread on your own. This is necessary since the * output queue may be shared by a number of voices. * * <p>Utterances are placed on the * queue to be output by an output thread. This queue is * usually created via a call to 'createOutputThread' which * creates a thread that waits on the queue and sends the * output to the audio player associated with this voice. If * the queue is null, the output is rendered in the calling * thread. * * @param queue the output queue */ public void setOutputQueue(OutputQueue queue) { externalOutputQueue = true; outputQueue = queue; } /** * Returns the output queue associated with this voice. * * @return the output queue associated with this voice */ public OutputQueue getOutputQueue() { return outputQueue; } /** * Loads voice specific data. Subclasses of voice should * implement this to perform class specific loading. */ protected abstract void loader() throws IOException; /** * Tokenizes the given queue item. * * @return an iterator that will yield a series of utterances */ private Iterator tokenize(FreeTTSSpeakable speakable) { return new FreeTTSSpeakableTokenizer(speakable).iterator(); } /** * Converts the document to a string (a placeholder for more * sophisticated logic to be done). * * @param dom the jsml document * * @return the document as a string. */ private String documentToString(Document dom) { StringBuffer buf = new StringBuffer(); linearize(dom, buf); return buf.toString(); } /** * Appends the text for this node to the given StringBuffer. * * @param n the node to traverse in depth-first order * @param buf the buffer to append text to */ private void linearize(Node n, StringBuffer buf) { StringBuffer endText = processNode(n, buf); for (Node child = n.getFirstChild(); child != null; child = child.getNextSibling()) { linearize(child, buf); } if (endText != null) { buf.append(endText); } } /** * Adds text for just this node and returns any text that might * be needed to undo the effects of this node after it is * processed. * * @param n the node to traverse in depth-first order * @param buf the buffer to append text to * * @return a <code>String</code> containing text to undo the * effects of the node */ protected StringBuffer processNode(Node n, StringBuffer buf) { StringBuffer endText = null; int type = n.getNodeType(); switch (type) { case Node.ATTRIBUTE_NODE: break; case Node.DOCUMENT_NODE: break; case Node.ELEMENT_NODE: // endText = processElement((Element) n, buf); break; case Node.TEXT_NODE: buf.append(((Text) n).getData()); break; // Pass processing instructions (e.g., <?blah?> // right on to the synthesizer.
These types of things // probably should not be used. Instead the 'engine' // element is probably the best thing to do. // case Node.PROCESSING_INSTRUCTION_NODE: break; // The document type had better be JSML. // case Node.DOCUMENT_TYPE_NODE: break; // I think NOTATION nodes are only DTD's. // case Node.NOTATION_NODE: break; // Should not get COMMENTS because the JSMLParser // ignores them. // case Node.COMMENT_NODE: break; // Should not get CDATA because the JSMLParser is // coalescing. // case Node.CDATA_SECTION_NODE: break; // Should not get ENTITY related notes because // entities are expanded by the JSMLParser // case Node.ENTITY_NODE: case Node.ENTITY_REFERENCE_NODE: break; // Should not get DOCUMENT_FRAGMENT nodes because I // [[[WDW]]] think they are only created via the API's // and cannot be defined via content. // case Node.DOCUMENT_FRAGMENT_NODE: break; default: break; } return endText; } /** * Dumps the voice in textual form. * * @param output where to send the formatted output * @param pad the initial padding * @param title the title to print when dumping out */ public void dump(PrintWriter output, int pad, String title) { Utilities.dump(output, pad, title); features.dump(output, pad + 4, title + " Features"); dumpProcessors(output, pad + 4, title + " Processors"); } /** * Dumps the voice processors. * * @param output where to send the formatted output * @param pad the initial padding * @param title the title to print when dumping out */ public void dumpProcessors(PrintWriter output, int pad, String title) { UtteranceProcessor[] processors; if (utteranceProcessors == null) { return; } processors = new UtteranceProcessor[utteranceProcessors.size()]; processors = (UtteranceProcessor[]) utteranceProcessors.toArray(processors); Utilities.dump(output, pad, title); for (int i = 0; i < processors.length; i++) { Utilities.dump(output, pad + 4, processors[i].toString()); } } /** * Returns a language/voice specific Feature Processor. * * @param name the name of the processor * * @return the processor associated with the name or null if none * could be found */ public FeatureProcessor getFeatureProcessor(String name) { return (FeatureProcessor) featureProcessors.get(name); } /** * Adds a language/voice specific Feature Processor to the set of * FeatureProcessors supported by this voice. * * @param name the name of the processor * @param fp the processor */ public void addFeatureProcessor(String name, FeatureProcessor fp) { featureProcessors.put(name, fp); } /** * Gets the state of the metrics mode. * * @return true if metrics mode is on */ public boolean isMetrics() { return metrics; } /** * Sets the metrics mode. * * @param metrics true if metrics mode should be on */ public void setMetrics(boolean metrics) { this.metrics = metrics; if (LOGGER.isLoggable(Level.FINE)) { LOGGER.fine("Metrics mode is " + metrics); } } /** * Gets the state of the detailedMetrics mode. * * @return true if detailedMetrics mode is on */ public boolean isDetailedMetrics() { return detailedMetrics; } /** * Sets the state of the detailedMetrics mode. * * @param detailedMetrics true if detailedMetrics mode should be on */ public void setDetailedMetrics(boolean detailedMetrics) { this.detailedMetrics = detailedMetrics; if (LOGGER.isLoggable(Level.FINE)) { LOGGER.fine("DetailedMetrics mode is " + detailedMetrics); } } /** * Gets the state of the dumpUtterance mode. 
* * @return true if dumpUtterance mode is on */ public boolean isDumpUtterance() { return dumpUtterance; } /** * Sets the state of the dumpUtterance mode. * * @param dumpUtterance true if dumpUtterance mode should be on */ public void setDumpUtterance(boolean dumpUtterance) { this.dumpUtterance = dumpUtterance; if (LOGGER.isLoggable(Level.FINE)) { LOGGER.fine("DumpUtterance mode is " + dumpUtterance); } } /** * Gets the state of the dumpRelations mode. * * @return true if dumpRelations mode is on */ public boolean isDumpRelations() { return dumpRelations; } /** * Sets the state of the dumpRelations mode. * * @param dumpRelations true if dumpRelations mode should be on */ public void setDumpRelations(boolean dumpRelations) { this.dumpRelations = dumpRelations; if (LOGGER.isLoggable(Level.FINE)) { LOGGER.fine("DumpRelations mode is " + dumpRelations); } } /** * Sets the title for this run. * * @param runTitle the title for the run */ public void setRunTitle(String runTitle) { this.runTitle = runTitle; } /** * Gets the title for this run. * * @return the title for the run */ public String getRunTitle() { return runTitle; } /** * Given a phoneme and a feature name, returns the feature. * * @param phone the phoneme of interest * @param featureName the name of the feature of interest * * @return the feature with the given name */ public String getPhoneFeature(String phone, String featureName) { return null; } /** * Shuts down the voice processing. */ public void deallocate() { setLoaded(false); if (!externalAudioPlayer) { if (audioPlayer != null) { audioPlayer.close(); audioPlayer = null; } } if (!externalOutputQueue) { outputQueue.close(); } } /** * Sets the baseline pitch. * * @param hertz the baseline pitch in hertz */ public void setPitch(float hertz) { this.pitch = hertz; } /** * Retrieves the baseline pitch. * * @return the baseline pitch in hertz */ public float getPitch() { return pitch; } /** * Sets the pitch range. * * @param range the range in hertz */ public void setPitchRange(float range) { this.range = range; } /** * Gets the pitch range. * * @return the range in hertz */ public float getPitchRange() { return range; } /** * Sets the pitch shift * * @param shift the pitch shift (1.0 is no shift) */ public void setPitchShift(float shift) { this.pitchShift = shift; } /** * Gets the pitch shift. * * @return the pitch shift */ public float getPitchShift() { return pitchShift; } /** * Sets the duration stretch * * @param stretch the duration stretch (1.0 is no stretch) */ public void setDurationStretch(float stretch) { this.durationStretch = stretch; } /** * Gets the duration stretch. * * @return the duration stretch */ public float getDurationStretch() { return durationStretch; } /** * Sets the rate of speech. * * @param wpm words per minute */ public void setRate(float wpm) { if (wpm > 0 && wpm < 1000) { setDurationStretch(nominalRate / wpm); } } /** * Gets the rate of speech. * * @return words per minute */ public float getRate() { return durationStretch * nominalRate; } /** * Sets the volume. * * @param vol the volume (0 to 1.0) */ public void setVolume(float vol) { volume = vol; } /** * Gets the volume. * * @return the volume (0 to 1.0) */ public float getVolume() { return volume; } /** * Gets the lexicon for this voice. * * @return the lexicon (or null if there is no lexicon) */ public Lexicon getLexicon() { return lexicon; } /** * Sets the lexicon to be used by this voice.
* * @param lexicon the lexicon to use */ public void setLexicon(Lexicon lexicon) { this.lexicon = lexicon; } /** * Sets the dumpfile for this voice. * * @param waveDumpFile the dumpfile */ public void setWaveDumpFile(String waveDumpFile) { this.waveDumpFile = waveDumpFile; } /** * Gets the dumpfile for this voice. * * @return the dumpfile */ public String getWaveDumpFile() { return waveDumpFile; } /** * Sets the audio player associated with this voice. The caller is * responsible for closing this player. * * @param player the audio player */ public void setAudioPlayer(AudioPlayer player) { audioPlayer = player; externalAudioPlayer = true; } /** * Gets the default audio player for this voice. The return * value will be non-null only if the DEFAULT_AUDIO_PLAYER * system property has been set to the name of an AudioPlayer * class, and that class is able to be instantiated via a * no arg constructor. getAudioPlayer will automatically set * the audio player for this voice to the default audio player * if the audio player has not yet been set. * * @see #DEFAULT_AUDIO_PLAYER * @see #getAudioPlayer * @return the default AudioPlayer */ public AudioPlayer getDefaultAudioPlayer() throws InstantiationException { if (defaultAudioPlayer != null) { return defaultAudioPlayer; } String className = Utilities.getProperty( DEFAULT_AUDIO_PLAYER, DEFAULT_AUDIO_PLAYER_DEFAULT); try { Class cls = Class.forName(className); defaultAudioPlayer = (AudioPlayer) cls.newInstance(); return defaultAudioPlayer; } catch (ClassNotFoundException e) { throw new InstantiationException("Can't find class " + className); } catch (IllegalAccessException e) { throw new InstantiationException("Can't find class " + className); } catch (ClassCastException e) { throw new InstantiationException(className + " cannot be cast " + "to AudioPlayer"); } } /** * Gets the audio player associated with this voice. If the * audio player has not yet been set, the value will default * to the return value of getDefaultAudioPlayer. * * @see #getDefaultAudioPlayer * @return the audio player */ public AudioPlayer getAudioPlayer() { if (audioPlayer == null) { try { audioPlayer = getDefaultAudioPlayer(); } catch (InstantiationException e) { e.printStackTrace(); } } return audioPlayer; } /** * Get a resource for this voice. * By default, the voice is searched for in the package * to which the voice class belongs. Subclasses are free to * override this behaviour. */ protected URL getResource(String resource) { return this.getClass().getResource(resource); } /** * Set the name of this voice. * [[[TODO: any standard format to the name?]]] * * @param name the name to assign this voice */ protected void setName(String name) { this.name = name; } /** * Get the name of this voice. * * @return the name */ public String getName() { return name; } /** * Returns the name of this Voice. * * @return the name of this Voice */ public String toString() { return getName(); } /** * Set the gender of this voice. * * @param gender the gender to assign */ protected void setGender(Gender gender) { this.gender = gender; } /** * Get the gender of this voice. * * @return the gender of this voice */ public Gender getGender() { return gender; } /** * Set the age of this voice. * * @param age the age to assign */ protected void setAge(Age age) { this.age = age; } /** * Get the age of this voice. * * @return the age of this voice */ public Age getAge() { return age; } /** * Set the description of this voice. 
* * @param description the human readable description to assign */ protected void setDescription(String description) { this.description = description; } /** * Get the description of this voice. * * @return the human readable description of this voice */ public String getDescription() { return description; } /** * Set the locale of this voice. * * @param locale the locale of this voice. */ protected void setLocale(Locale locale) { this.locale = locale; } /** * Get the locale of this voice. * * @return the locale of this voice. */ public Locale getLocale() { return locale; } /** * Set the domain of this voice. * * @param domain the domain of this voice. For example, * &quot;general&quot;, &quot;time&quot;, or * &quot;weather&quot;. */ protected void setDomain(String domain) { this.domain = domain; } /** * Get the domain of this voice. * * @return the domain of this voice. For example, * &quot;general&quot;, &quot;time&quot;, or * &quot;weather&quot;. */ public String getDomain() { return domain; } /** * Sets the voice style. This parameter is designed for human * interpretation. Values might include "business", "casual", * "robotic", "breathy". * * @param style the style of this voice. */ public void setStyle(String style) { this.style = style; } /** * Gets the voice style. This parameter is designed for human * interpretation. Values might include "business", "casual", * "robotic", "breathy". */ public String getStyle() { return style; } /** * Sets the organization which created this voice. For example * "cmu", "sun", ... * * @param organization the name of the organization */ protected void setOrganization(String organization) { this.organization = organization; } /** * Gets the organization which created this voice. For example * "cmu", "sun", ... * * @return the name of the organization */ public String getOrganization() { return organization; } /** * Returns the AudioOutput processor to be used by this voice. * Derived voices typically override this to customize behaviors. * * @return the audio output processor * * @throws IOException if an IO error occurs while getting * processor */ protected abstract UtteranceProcessor getAudioOutput() throws IOException ; /** * Tokenizes a FreeTTSSpeakable */ private class FreeTTSSpeakableTokenizer { FreeTTSSpeakable speakable; Tokenizer tok = getTokenizer(); /** * Constructor. * * @param speakable the queue item to be pretokenized */ public FreeTTSSpeakableTokenizer(FreeTTSSpeakable speakable) { this.speakable = speakable; if (speakable.isPlainText()) { tok.setInputText(speakable.getText()); } else if (speakable.isStream()) { Reader reader = new BufferedReader( new InputStreamReader(speakable.getInputStream())); tok.setInputReader(reader); } else if (speakable.isDocument()) { tok.setInputText(documentToString(speakable.getDocument())); } } /** * Returns an iterator for this text item. */ public Iterator iterator() { return new Iterator() { boolean first = true; Token savedToken = null; /** * Determines if there are more utterances * * @return true if there are more tokens */ public boolean hasNext() { return savedToken != null || tok.hasMoreTokens(); } /** * Returns the next utterance.
* * @return the next utterance (as an object) or * null if there are no utterances left */ public Object next() { ArrayList tokenList = new ArrayList(); Utterance utterance = null; if (savedToken != null) { tokenList.add(savedToken); savedToken = null; } while (tok.hasMoreTokens()) { Token token = tok.getNextToken(); if ((token.getWord().length() == 0) || (tokenList.size() > 500) || tok.isBreak()) { savedToken = token; break; } tokenList.add(token); } utterance = new Utterance(Voice.this, tokenList); utterance.setSpeakable(speakable); utterance.setFirst(first); first = false; boolean isLast = (!tok.hasMoreTokens() && (savedToken == null || savedToken.getWord().length() == 0)); utterance.setLast(isLast); return utterance; } public void remove() { throw new UnsupportedOperationException("remove"); } }; } } } /* * ==================================================================== * * Licensed to the Apache Software Foundation (ASF) under one or more * contributor license agreements. See the NOTICE file distributed with * this work for additional information regarding copyright ownership. * The ASF licenses this file to You under the Apache License, Version 2.0 * (the "License"); you may not use this file except in compliance with * the License. You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 * * Unless required by applicable law or agreed to in writing, software * distributed under the License is distributed on an "AS IS" BASIS, * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. * See the License for the specific language governing permissions and * limitations under the License. * ==================================================================== * * This software consists of voluntary contributions made by many * individuals on behalf of the Apache Software Foundation. For more * information on the Apache Software Foundation, please see * <http://www.apache.org/>. * */ package org.apache.http.impl.cookie; import java.util.ArrayList; import java.util.HashMap; import java.util.List; import java.util.Locale; import java.util.Map; import org.apache.http.annotation.NotThreadSafe; import org.apache.http.Header; import org.apache.http.HeaderElement; import org.apache.http.NameValuePair; import org.apache.http.cookie.ClientCookie; import org.apache.http.cookie.Cookie; import org.apache.http.cookie.CookieAttributeHandler; import org.apache.http.cookie.CookieOrigin; import org.apache.http.cookie.CookieSpec; import org.apache.http.cookie.MalformedCookieException; import org.apache.http.cookie.SM; import org.apache.http.message.BufferedHeader; import org.apache.http.util.CharArrayBuffer; /** * RFC 2965 compliant {@link CookieSpec} implementation.
* * @since 4.0 */ @NotThreadSafe // superclass is @NotThreadSafe public class RFC2965Spec extends RFC2109Spec { /** * Default constructor * */ public RFC2965Spec() { this(null, false); } public RFC2965Spec(final String[] datepatterns, boolean oneHeader) { super(datepatterns, oneHeader); registerAttribHandler(ClientCookie.DOMAIN_ATTR, new RFC2965DomainAttributeHandler()); registerAttribHandler(ClientCookie.PORT_ATTR, new RFC2965PortAttributeHandler()); registerAttribHandler(ClientCookie.COMMENTURL_ATTR, new RFC2965CommentUrlAttributeHandler()); registerAttribHandler(ClientCookie.DISCARD_ATTR, new RFC2965DiscardAttributeHandler()); registerAttribHandler(ClientCookie.VERSION_ATTR, new RFC2965VersionAttributeHandler()); } @Override public List<Cookie> parse( final Header header, CookieOrigin origin) throws MalformedCookieException { if (header == null) { throw new IllegalArgumentException("Header may not be null"); } if (origin == null) { throw new IllegalArgumentException("Cookie origin may not be null"); } if (!header.getName().equalsIgnoreCase(SM.SET_COOKIE2)) { throw new MalformedCookieException("Unrecognized cookie header '" + header.toString() + "'"); } origin = adjustEffectiveHost(origin); HeaderElement[] elems = header.getElements(); return createCookies(elems, origin); } @Override protected List<Cookie> parse( final HeaderElement[] elems, CookieOrigin origin) throws MalformedCookieException { origin = adjustEffectiveHost(origin); return createCookies(elems, origin); } private List<Cookie> createCookies( final HeaderElement[] elems, final CookieOrigin origin) throws MalformedCookieException { List<Cookie> cookies = new ArrayList<Cookie>(elems.length); for (HeaderElement headerelement : elems) { String name = headerelement.getName(); String value = headerelement.getValue(); if (name == null || name.length() == 0) { throw new MalformedCookieException("Cookie name may not be empty"); } BasicClientCookie2 cookie = new BasicClientCookie2(name, value); cookie.setPath(getDefaultPath(origin)); cookie.setDomain(getDefaultDomain(origin)); cookie.setPorts(new int [] { origin.getPort() }); // cycle through the parameters NameValuePair[] attribs = headerelement.getParameters(); // Eliminate duplicate attributes. 
The first occurrence takes precedence // See RFC2965: 3.2 Origin Server Role Map<String, NameValuePair> attribmap = new HashMap<String, NameValuePair>(attribs.length); for (int j = attribs.length - 1; j >= 0; j--) { NameValuePair param = attribs[j]; attribmap.put(param.getName().toLowerCase(Locale.ENGLISH), param); } for (Map.Entry<String, NameValuePair> entry : attribmap.entrySet()) { NameValuePair attrib = entry.getValue(); String s = attrib.getName().toLowerCase(Locale.ENGLISH); cookie.setAttribute(s, attrib.getValue()); CookieAttributeHandler handler = findAttribHandler(s); if (handler != null) { handler.parse(cookie, attrib.getValue()); } } cookies.add(cookie); } return cookies; } @Override public void validate(final Cookie cookie, CookieOrigin origin) throws MalformedCookieException { if (cookie == null) { throw new IllegalArgumentException("Cookie may not be null"); } if (origin == null) { throw new IllegalArgumentException("Cookie origin may not be null"); } origin = adjustEffectiveHost(origin); super.validate(cookie, origin); } @Override public boolean match(final Cookie cookie, CookieOrigin origin) { if (cookie == null) { throw new IllegalArgumentException("Cookie may not be null"); } if (origin == null) { throw new IllegalArgumentException("Cookie origin may not be null"); } origin = adjustEffectiveHost(origin); return super.match(cookie, origin); } /** * Adds valid Port attribute value, e.g. "8000,8001,8002" */ @Override protected void formatCookieAsVer(final CharArrayBuffer buffer, final Cookie cookie, int version) { super.formatCookieAsVer(buffer, cookie, version); // format port attribute if (cookie instanceof ClientCookie) { // Test if the port attribute as set by the origin server is not blank String s = ((ClientCookie) cookie).getAttribute(ClientCookie.PORT_ATTR); if (s != null) { buffer.append("; $Port"); buffer.append("=\""); if (s.trim().length() > 0) { int[] ports = cookie.getPorts(); if (ports != null) { for (int i = 0, len = ports.length; i < len; i++) { if (i > 0) { buffer.append(","); } buffer.append(Integer.toString(ports[i])); } } } buffer.append("\""); } } } /** * Set 'effective host name' as defined in RFC 2965. * <p> * If a host name contains no dots, the effective host name is * that name with the string .local appended to it. Otherwise * the effective host name is the same as the host name. Note * that all effective host names contain at least one dot. * * @param origin origin where cookie is received from or being sent to. */ private static CookieOrigin adjustEffectiveHost(final CookieOrigin origin) { String host = origin.getHost(); // Test if the host name appears to be a fully qualified DNS name, // IPv4 address or IPv6 address boolean isLocalHost = true; for (int i = 0; i < host.length(); i++) { char ch = host.charAt(i); if (ch == '.' 
|| ch == ':') { isLocalHost = false; break; } } if (isLocalHost) { host += ".local"; return new CookieOrigin( host, origin.getPort(), origin.getPath(), origin.isSecure()); } else { return origin; } } @Override public int getVersion() { return 1; } @Override public Header getVersionHeader() { CharArrayBuffer buffer = new CharArrayBuffer(40); buffer.append(SM.COOKIE2); buffer.append(": "); buffer.append("$Version="); buffer.append(Integer.toString(getVersion())); return new BufferedHeader(buffer); } @Override public String toString() { return "rfc2965"; } } /*************************************************************************** * Copyright 2017 Kieker Project (http://kieker-monitoring.net) * * Licensed under the Apache License, Version 2.0 (the "License"); * you may not use this file except in compliance with the License. * You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 * * Unless required by applicable law or agreed to in writing, software * distributed under the License is distributed on an "AS IS" BASIS, * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. * See the License for the specific language governing permissions and * limitations under the License. ***************************************************************************/ package kieker.common.record.misc; import java.nio.BufferOverflowException; import kieker.common.record.AbstractMonitoringRecord; import kieker.common.record.IMonitoringRecord; import kieker.common.record.io.IValueDeserializer; import kieker.common.record.io.IValueSerializer; import kieker.common.util.registry.IRegistry; /** * @author Jan Waller * API compatibility: Kieker 1.13.0 * * @since 1.7 */ public class KiekerMetadataRecord extends AbstractMonitoringRecord implements IMonitoringRecord.Factory, IMonitoringRecord.BinaryFactory { private static final long serialVersionUID = 8241152536143822747L; /** Descriptive definition of the serialization size of the record. */ public static final int SIZE = TYPE_SIZE_STRING // KiekerMetadataRecord.version + TYPE_SIZE_STRING // KiekerMetadataRecord.controllerName + TYPE_SIZE_STRING // KiekerMetadataRecord.hostname + TYPE_SIZE_INT // KiekerMetadataRecord.experimentId + TYPE_SIZE_BOOLEAN // KiekerMetadataRecord.debugMode + TYPE_SIZE_LONG // KiekerMetadataRecord.timeOffset + TYPE_SIZE_STRING // KiekerMetadataRecord.timeUnit + TYPE_SIZE_LONG // KiekerMetadataRecord.numberOfRecords ; public static final Class<?>[] TYPES = { String.class, // KiekerMetadataRecord.version String.class, // KiekerMetadataRecord.controllerName String.class, // KiekerMetadataRecord.hostname int.class, // KiekerMetadataRecord.experimentId boolean.class, // KiekerMetadataRecord.debugMode long.class, // KiekerMetadataRecord.timeOffset String.class, // KiekerMetadataRecord.timeUnit long.class, // KiekerMetadataRecord.numberOfRecords }; /** user-defined constants. */ public static final String NO_CONTROLLERNAME = "<no-controller-name>"; public static final String NO_HOSTNAME = "<no-hostname>"; public static final String NO_TIMESOURCE = "<no-timesource>"; public static final String NO_TIMEUNIT = "NANOSECONDS"; /** default constants. 
*/ public static final String VERSION = kieker.common.util.Version.getVERSION(); public static final String CONTROLLER_NAME = NO_CONTROLLERNAME; public static final String HOSTNAME = NO_HOSTNAME; public static final int EXPERIMENT_ID = 0; public static final boolean DEBUG_MODE = false; public static final long TIME_OFFSET = 0L; public static final String TIME_UNIT = NO_TIMEUNIT; public static final long NUMBER_OF_RECORDS = 0L; /** property name array. */ private static final String[] PROPERTY_NAMES = { "version", "controllerName", "hostname", "experimentId", "debugMode", "timeOffset", "timeUnit", "numberOfRecords", }; /** property declarations. */ private final String version; private final String controllerName; private final String hostname; private final int experimentId; private final boolean debugMode; private final long timeOffset; private final String timeUnit; private final long numberOfRecords; /** * Creates a new instance of this class using the given parameters. * * @param version * version * @param controllerName * controllerName * @param hostname * hostname * @param experimentId * experimentId * @param debugMode * debugMode * @param timeOffset * timeOffset * @param timeUnit * timeUnit * @param numberOfRecords * numberOfRecords */ public KiekerMetadataRecord(final String version, final String controllerName, final String hostname, final int experimentId, final boolean debugMode, final long timeOffset, final String timeUnit, final long numberOfRecords) { this.version = version == null?VERSION:version; this.controllerName = controllerName == null?NO_CONTROLLERNAME:controllerName; this.hostname = hostname == null?NO_HOSTNAME:hostname; this.experimentId = experimentId; this.debugMode = debugMode; this.timeOffset = timeOffset; this.timeUnit = timeUnit == null?NO_TIMEUNIT:timeUnit; this.numberOfRecords = numberOfRecords; } /** * This constructor converts the given array into a record. * It is recommended to use the array which is the result of a call to {@link #toArray()}. * * @param values * The values for the record. * * @deprecated since 1.13. Use {@link #KiekerMetadataRecord(IValueDeserializer)} instead. */ @Deprecated public KiekerMetadataRecord(final Object[] values) { // NOPMD (direct store of values) AbstractMonitoringRecord.checkArray(values, TYPES); this.version = (String) values[0]; this.controllerName = (String) values[1]; this.hostname = (String) values[2]; this.experimentId = (Integer) values[3]; this.debugMode = (Boolean) values[4]; this.timeOffset = (Long) values[5]; this.timeUnit = (String) values[6]; this.numberOfRecords = (Long) values[7]; } /** * This constructor uses the given array to initialize the fields of this record. * * @param values * The values for the record. * @param valueTypes * The types of the elements in the first array. * * @deprecated since 1.13. Use {@link #KiekerMetadataRecord(IValueDeserializer)} instead. 
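 * <p>
 * Where the field values are already known, the plain all-arguments constructor is the
 * straightforward replacement; a minimal sketch (all values below are purely illustrative):
 * <pre>
 * KiekerMetadataRecord record = new KiekerMetadataRecord(
 *     "1.13.0", "controller-1", "host-1", 0, false, 0L, "NANOSECONDS", 42L);
 * </pre>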
*/ @Deprecated protected KiekerMetadataRecord(final Object[] values, final Class<?>[] valueTypes) { // NOPMD (values stored directly) AbstractMonitoringRecord.checkArray(values, valueTypes); this.version = (String) values[0]; this.controllerName = (String) values[1]; this.hostname = (String) values[2]; this.experimentId = (Integer) values[3]; this.debugMode = (Boolean) values[4]; this.timeOffset = (Long) values[5]; this.timeUnit = (String) values[6]; this.numberOfRecords = (Long) values[7]; } /** * @param deserializer * The deserializer to use */ public KiekerMetadataRecord(final IValueDeserializer deserializer) { this.version = deserializer.getString(); this.controllerName = deserializer.getString(); this.hostname = deserializer.getString(); this.experimentId = deserializer.getInt(); this.debugMode = deserializer.getBoolean(); this.timeOffset = deserializer.getLong(); this.timeUnit = deserializer.getString(); this.numberOfRecords = deserializer.getLong(); } /** * {@inheritDoc} * * @deprecated since 1.13. Use {@link #serialize(IValueSerializer)} with an array serializer instead. */ @Override @Deprecated public Object[] toArray() { return new Object[] { this.getVersion(), this.getControllerName(), this.getHostname(), this.getExperimentId(), this.isDebugMode(), this.getTimeOffset(), this.getTimeUnit(), this.getNumberOfRecords() }; } /** * {@inheritDoc} */ @Override public void registerStrings(final IRegistry<String> stringRegistry) { // NOPMD (generated code) stringRegistry.get(this.getVersion()); stringRegistry.get(this.getControllerName()); stringRegistry.get(this.getHostname()); stringRegistry.get(this.getTimeUnit()); } /** * {@inheritDoc} */ @Override public void serialize(final IValueSerializer serializer) throws BufferOverflowException { //super.serialize(serializer); serializer.putString(this.getVersion()); serializer.putString(this.getControllerName()); serializer.putString(this.getHostname()); serializer.putInt(this.getExperimentId()); serializer.putBoolean(this.isDebugMode()); serializer.putLong(this.getTimeOffset()); serializer.putString(this.getTimeUnit()); serializer.putLong(this.getNumberOfRecords()); } /** * {@inheritDoc} */ @Override public Class<?>[] getValueTypes() { return TYPES; // NOPMD } /** * {@inheritDoc} */ @Override public String[] getValueNames() { return PROPERTY_NAMES; // NOPMD } /** * {@inheritDoc} */ @Override public int getSize() { return SIZE; } /** * {@inheritDoc} * * @deprecated This record uses the {@link kieker.common.record.IMonitoringRecord.Factory} mechanism. Hence, this method is not implemented. 
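 * <p>
 * New instances are instead created through the deserializer-based constructor; a minimal
 * sketch, where {@code deserializer} stands for any available {@link IValueDeserializer}:
 * <pre>
 * KiekerMetadataRecord record = new KiekerMetadataRecord(deserializer);
 * </pre>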
*/ @Override @Deprecated public void initFromArray(final Object[] values) { throw new UnsupportedOperationException(); } /** * {@inheritDoc} */ @Override public boolean equals(final Object obj) { if (obj == null) return false; if (obj == this) return true; if (obj.getClass() != this.getClass()) return false; final KiekerMetadataRecord castedRecord = (KiekerMetadataRecord) obj; if (this.getLoggingTimestamp() != castedRecord.getLoggingTimestamp()) return false; if (!this.getVersion().equals(castedRecord.getVersion())) return false; if (!this.getControllerName().equals(castedRecord.getControllerName())) return false; if (!this.getHostname().equals(castedRecord.getHostname())) return false; if (this.getExperimentId() != castedRecord.getExperimentId()) return false; if (this.isDebugMode() != castedRecord.isDebugMode()) return false; if (this.getTimeOffset() != castedRecord.getTimeOffset()) return false; if (!this.getTimeUnit().equals(castedRecord.getTimeUnit())) return false; if (this.getNumberOfRecords() != castedRecord.getNumberOfRecords()) return false; return true; } public final String getVersion() { return this.version; } public final String getControllerName() { return this.controllerName; } public final String getHostname() { return this.hostname; } public final int getExperimentId() { return this.experimentId; } public final boolean isDebugMode() { return this.debugMode; } public final long getTimeOffset() { return this.timeOffset; } public final String getTimeUnit() { return this.timeUnit; } public final long getNumberOfRecords() { return this.numberOfRecords; } } /* * Licensed to the Apache Software Foundation (ASF) under one * or more contributor license agreements. See the NOTICE file * distributed with this work for additional information * regarding copyright ownership. The ASF licenses this file * to you under the Apache License, Version 2.0 (the * "License"); you may not use this file except in compliance * with the License. You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 * * Unless required by applicable law or agreed to in writing, software * distributed under the License is distributed on an "AS IS" BASIS, * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. * See the License for the specific language governing permissions and * limitations under the License. 
*/ package org.apache.hadoop.fs.s3a; import javax.annotation.Nullable; import java.io.File; import java.io.FileNotFoundException; import java.io.IOException; import java.io.InputStream; import java.util.List; import java.util.Map; import java.util.concurrent.atomic.AtomicInteger; import com.amazonaws.services.s3.model.AmazonS3Exception; import com.amazonaws.services.s3.model.CompleteMultipartUploadRequest; import com.amazonaws.services.s3.model.CompleteMultipartUploadResult; import com.amazonaws.services.s3.model.InitiateMultipartUploadRequest; import com.amazonaws.services.s3.model.MultipartUpload; import com.amazonaws.services.s3.model.ObjectMetadata; import com.amazonaws.services.s3.model.PartETag; import com.amazonaws.services.s3.model.PutObjectRequest; import com.amazonaws.services.s3.model.PutObjectResult; import com.amazonaws.services.s3.model.SelectObjectContentRequest; import com.amazonaws.services.s3.model.SelectObjectContentResult; import com.amazonaws.services.s3.model.UploadPartRequest; import com.amazonaws.services.s3.model.UploadPartResult; import com.amazonaws.services.s3.transfer.model.UploadResult; import org.apache.hadoop.util.Preconditions; import org.slf4j.Logger; import org.slf4j.LoggerFactory; import org.apache.hadoop.classification.InterfaceAudience; import org.apache.hadoop.classification.InterfaceStability; import org.apache.hadoop.conf.Configuration; import org.apache.hadoop.fs.Path; import org.apache.hadoop.fs.PathIOException; import org.apache.hadoop.fs.s3a.api.RequestFactory; import org.apache.hadoop.fs.s3a.impl.StoreContext; import org.apache.hadoop.fs.s3a.statistics.S3AStatisticsContext; import org.apache.hadoop.fs.s3a.select.SelectBinding; import org.apache.hadoop.fs.store.audit.AuditSpan; import org.apache.hadoop.fs.store.audit.AuditSpanSource; import org.apache.hadoop.util.DurationInfo; import org.apache.hadoop.util.functional.CallableRaisingIOE; import static org.apache.hadoop.util.Preconditions.checkNotNull; import static org.apache.hadoop.fs.s3a.Invoker.*; import static org.apache.hadoop.fs.store.audit.AuditingFunctions.withinAuditSpan; /** * Helper for low-level operations against an S3 Bucket for writing data, * creating and committing pending writes, and other S3-layer operations. * <p> * It hides direct access to the S3 API * and is a location where the object operations can be evolved/enhanced. * <p> * Features * <ul> * <li>Methods to create and submit requests to S3, so avoiding * all direct interaction with the AWS APIs.</li> * <li>Some extra preflight checks of arguments, so failing fast on * errors.</li> * <li>Callbacks to let the FS know of events in the output stream * upload process.</li> * <li>Other low-level access to S3 functions, for private use.</li> * <li>Failure handling, including converting exceptions to IOEs.</li> * <li>Integration with instrumentation.</li> * <li>Evolution to add more low-level operations, such as S3 select.</li> * </ul> * * This API is for internal use only. * Span scoping: This helper is instantiated with span; it will be used * before operations which query/update S3 * * History * <pre> * - A nested class in S3AFileSystem * - Single shared instance created and reused. * - [HADOOP-13786] A separate class, single instance in S3AFS * - [HDFS-13934] Split into interface and implementation * - [HADOOP-15711] Adds audit tracking; one instance per use. 
* </pre> */ @InterfaceAudience.Private @InterfaceStability.Unstable public class WriteOperationHelper implements WriteOperations { private static final Logger LOG = LoggerFactory.getLogger(WriteOperationHelper.class); /** * Owning filesystem. */ private final S3AFileSystem owner; /** * Invoker for operations; uses the S3A retry policy and calls int * {@link #operationRetried(String, Exception, int, boolean)} on retries. */ private final Invoker invoker; /** Configuration of the owner. This is a reference, not a copy. */ private final Configuration conf; /** Bucket of the owner FS. */ private final String bucket; /** * statistics context. */ private final S3AStatisticsContext statisticsContext; /** * Store Context; extracted from owner. */ private final StoreContext storeContext; /** * Source of Audit spans. */ private final AuditSpanSource auditSpanSource; /** * Audit Span. */ private AuditSpan auditSpan; /** * Factory for AWS requests. */ private final RequestFactory requestFactory; /** * Constructor. * @param owner owner FS creating the helper * @param conf Configuration object * @param statisticsContext statistics context * @param auditSpanSource source of spans * @param auditSpan span to activate * */ protected WriteOperationHelper(S3AFileSystem owner, Configuration conf, S3AStatisticsContext statisticsContext, final AuditSpanSource auditSpanSource, final AuditSpan auditSpan) { this.owner = owner; this.invoker = new Invoker(new S3ARetryPolicy(conf), this::operationRetried); this.conf = conf; this.statisticsContext = statisticsContext; this.storeContext = owner.createStoreContext(); this.bucket = owner.getBucket(); this.auditSpanSource = auditSpanSource; this.auditSpan = checkNotNull(auditSpan); this.requestFactory = owner.getRequestFactory(); } /** * Callback from {@link Invoker} when an operation is retried. * @param text text of the operation * @param ex exception * @param retries number of retries * @param idempotent is the method idempotent */ void operationRetried(String text, Exception ex, int retries, boolean idempotent) { LOG.info("{}: Retried {}: {}", text, retries, ex.toString()); LOG.debug("Stack", ex); owner.operationRetried(text, ex, retries, idempotent); } /** * Execute a function with retry processing. * Also activates the current span. * @param <T> type of return value * @param action action to execute (used in error messages) * @param path path of work (used in error messages) * @param idempotent does the operation have semantics * which mean that it can be retried even if was already executed? * @param operation operation to execute * @return the result of the call * @throws IOException any IOE raised, or translated exception */ public <T> T retry(String action, String path, boolean idempotent, CallableRaisingIOE<T> operation) throws IOException { activateAuditSpan(); return invoker.retry(action, path, idempotent, operation); } /** * Get the audit span this object was created with. * @return the audit span */ public AuditSpan getAuditSpan() { return auditSpan; } /** * Activate the audit span. * @return the span */ private AuditSpan activateAuditSpan() { return auditSpan.activate(); } /** * Deactivate the audit span. */ private void deactivateAuditSpan() { auditSpan.deactivate(); } /** * Create a {@link PutObjectRequest} request against the specific key. * @param destKey destination key * @param inputStream source data. * @param length size, if known. Use -1 for not known * @param headers optional map of custom headers. 
* @return the request */ @Retries.OnceRaw public PutObjectRequest createPutObjectRequest(String destKey, InputStream inputStream, long length, final Map<String, String> headers) { activateAuditSpan(); ObjectMetadata objectMetadata = newObjectMetadata(length); if (headers != null) { objectMetadata.setUserMetadata(headers); } return getRequestFactory().newPutObjectRequest( destKey, objectMetadata, inputStream); } /** * Create a {@link PutObjectRequest} request to upload a file. * @param dest key to PUT to. * @param sourceFile source file * @return the request */ @Retries.OnceRaw public PutObjectRequest createPutObjectRequest(String dest, File sourceFile) { Preconditions.checkState(sourceFile.length() < Integer.MAX_VALUE, "File length is too big for a single PUT upload"); activateAuditSpan(); return getRequestFactory(). newPutObjectRequest(dest, newObjectMetadata((int) sourceFile.length()), sourceFile); } /** * Callback on a successful write. * @param length length of the write */ public void writeSuccessful(long length) { } /** * Callback on a write failure. * @param ex Any exception raised which triggered the failure. */ public void writeFailed(Exception ex) { LOG.debug("Write to {} failed", this, ex); } /** * Create a new object metadata instance. * Any standard metadata headers are added here, for example: * encryption. * @param length size, if known. Use -1 for not known * @return a new metadata instance */ public ObjectMetadata newObjectMetadata(long length) { return getRequestFactory().newObjectMetadata(length); } /** * Start the multipart upload process. * Retry policy: retrying, translated. * @param destKey destination of upload * @return the upload result containing the ID * @throws IOException IO problem */ @Retries.RetryTranslated public String initiateMultiPartUpload(String destKey) throws IOException { LOG.debug("Initiating Multipart upload to {}", destKey); try (AuditSpan span = activateAuditSpan()) { return retry("initiate MultiPartUpload", destKey, true, () -> { final InitiateMultipartUploadRequest initiateMPURequest = getRequestFactory().newMultipartUploadRequest( destKey); return owner.initiateMultipartUpload(initiateMPURequest) .getUploadId(); }); } } /** * Finalize a multipart PUT operation. * This completes the upload, and, if that works, calls * {@link S3AFileSystem#finishedWrite(String, long, String, String)} * to update the filesystem. * Retry policy: retrying, translated. * @param destKey destination of the commit * @param uploadId multipart operation Id * @param partETags list of partial uploads * @param length length of the upload * @param retrying retrying callback * @return the result of the operation. * @throws IOException on problems. 
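 * <p>
 * A rough sketch of the surrounding multipart flow through this helper's public API
 * ({@code helper}, {@code key} and {@code data} are illustrative names, not part of this class):
 * <pre>
 * String uploadId = helper.initiateMultiPartUpload(key);
 * UploadPartRequest part = helper.newUploadPartRequest(
 *     key, uploadId, 1, data.length, new ByteArrayInputStream(data), null, 0L);
 * PartETag etag = helper.uploadPart(part).getPartETag();
 * helper.completeMPUwithRetries(key, uploadId,
 *     Collections.singletonList(etag), data.length, new AtomicInteger());
 * </pre>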
*/ @Retries.RetryTranslated private CompleteMultipartUploadResult finalizeMultipartUpload( String destKey, String uploadId, List<PartETag> partETags, long length, Retried retrying) throws IOException { if (partETags.isEmpty()) { throw new PathIOException(destKey, "No upload parts in multipart upload"); } try (AuditSpan span = activateAuditSpan()) { CompleteMultipartUploadResult uploadResult; uploadResult = invoker.retry("Completing multipart upload", destKey, true, retrying, () -> { final CompleteMultipartUploadRequest request = getRequestFactory().newCompleteMultipartUploadRequest( destKey, uploadId, partETags); return owner.getAmazonS3Client().completeMultipartUpload( request); }); owner.finishedWrite(destKey, length, uploadResult.getETag(), uploadResult.getVersionId()); return uploadResult; } } /** * This completes a multipart upload to the destination key via * {@code finalizeMultipartUpload()}. * Retry policy: retrying, translated. * Retries increment the {@code errorCount} counter. * @param destKey destination * @param uploadId multipart operation Id * @param partETags list of partial uploads * @param length length of the upload * @param errorCount a counter incremented by 1 on every error; for * use in statistics * @return the result of the operation. * @throws IOException if problems arose which could not be retried, or * the retry count was exceeded */ @Retries.RetryTranslated public CompleteMultipartUploadResult completeMPUwithRetries( String destKey, String uploadId, List<PartETag> partETags, long length, AtomicInteger errorCount) throws IOException { checkNotNull(uploadId); checkNotNull(partETags); LOG.debug("Completing multipart upload {} with {} parts", uploadId, partETags.size()); return finalizeMultipartUpload(destKey, uploadId, partETags, length, (text, e, r, i) -> errorCount.incrementAndGet() ); } /** * Abort a multipart upload operation. * @param destKey destination key of the upload * @param uploadId multipart operation Id * @param shouldRetry should failures trigger a retry? * @param retrying callback invoked on every retry * @throws IOException failure to abort * @throws FileNotFoundException if the abort ID is unknown */ @Retries.RetryTranslated public void abortMultipartUpload(String destKey, String uploadId, boolean shouldRetry, Retried retrying) throws IOException { if (shouldRetry) { // retrying option invoker.retry("Aborting multipart upload ID " + uploadId, destKey, true, retrying, withinAuditSpan(getAuditSpan(), () -> owner.abortMultipartUpload( destKey, uploadId))); } else { // single pass attempt. once("Aborting multipart upload ID " + uploadId, destKey, withinAuditSpan(getAuditSpan(), () -> owner.abortMultipartUpload( destKey, uploadId))); } } /** * Abort a multipart commit operation. * @param upload upload to abort. * @throws IOException on problems. */ @Retries.RetryTranslated public void abortMultipartUpload(MultipartUpload upload) throws IOException { invoker.retry("Aborting multipart commit", upload.getKey(), true, withinAuditSpan(getAuditSpan(), () -> owner.abortMultipartUpload(upload))); } /** * Abort multipart uploads under a path: limited to the first * few hundred. * @param prefix prefix for uploads to abort * @return a count of aborts * @throws IOException trouble; FileNotFoundExceptions are swallowed. 
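 * <p>
 * For example, cleaning up stale uploads left under a job prefix
 * (the prefix below is illustrative):
 * <pre>
 * int aborted = helper.abortMultipartUploadsUnderPath("job-attempts/");
 * </pre>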
*/ @Retries.RetryTranslated public int abortMultipartUploadsUnderPath(String prefix) throws IOException { LOG.debug("Aborting multipart uploads under {}", prefix); int count = 0; List<MultipartUpload> multipartUploads = listMultipartUploads(prefix); LOG.debug("Number of outstanding uploads: {}", multipartUploads.size()); for (MultipartUpload upload: multipartUploads) { try { abortMultipartUpload(upload); count++; } catch (FileNotFoundException e) { LOG.debug("Already aborted: {}", upload.getKey(), e); } } return count; } @Override @Retries.RetryTranslated public List<MultipartUpload> listMultipartUploads(final String prefix) throws IOException { activateAuditSpan(); return owner.listMultipartUploads(prefix); } /** * Abort a multipart commit operation. * @param destKey destination key of ongoing operation * @param uploadId multipart operation Id * @throws IOException on problems. * @throws FileNotFoundException if the abort ID is unknown */ @Override @Retries.RetryTranslated public void abortMultipartCommit(String destKey, String uploadId) throws IOException { abortMultipartUpload(destKey, uploadId, true, invoker.getRetryCallback()); } /** * Create and initialize a part request of a multipart upload. * Exactly one of: {@code uploadStream} or {@code sourceFile} * must be specified. * A subset of the file may be posted, by providing the starting point * in {@code offset} and a length of block in {@code size} equal to * or less than the remaining bytes. * The part number must be less than 10000. * Retry policy is once-translated; to much effort * @param destKey destination key of ongoing operation * @param uploadId ID of ongoing upload * @param partNumber current part number of the upload * @param size amount of data * @param uploadStream source of data to upload * @param sourceFile optional source file. * @param offset offset in file to start reading. * @return the request. * @throws IllegalArgumentException if the parameters are invalid. * @throws PathIOException if the part number is out of range. */ @Override @Retries.OnceTranslated public UploadPartRequest newUploadPartRequest( String destKey, String uploadId, int partNumber, int size, InputStream uploadStream, File sourceFile, Long offset) throws IOException { return once("upload part request", destKey, withinAuditSpan(getAuditSpan(), () -> getRequestFactory().newUploadPartRequest( destKey, uploadId, partNumber, size, uploadStream, sourceFile, offset))); } /** * The toString method is intended to be used in logging/toString calls. * @return a string description. */ @Override public String toString() { final StringBuilder sb = new StringBuilder( "WriteOperationHelper {bucket=").append(bucket); sb.append('}'); return sb.toString(); } /** * PUT an object directly (i.e. not via the transfer manager). * Byte length is calculated from the file length, or, if there is no * file, from the content length of the header. * @param putObjectRequest the request * @return the upload initiated * @throws IOException on problems */ @Retries.RetryTranslated public PutObjectResult putObject(PutObjectRequest putObjectRequest) throws IOException { return retry("Writing Object", putObjectRequest.getKey(), true, withinAuditSpan(getAuditSpan(), () -> owner.putObjectDirect(putObjectRequest))); } /** * PUT an object via the transfer manager. 
* @param putObjectRequest the request * @return the result of the operation * @throws IOException on problems */ @Retries.RetryTranslated public UploadResult uploadObject(PutObjectRequest putObjectRequest) throws IOException { // no retry; rely on xfer manager logic return retry("Writing Object", putObjectRequest.getKey(), true, withinAuditSpan(getAuditSpan(), () -> owner.executePut(putObjectRequest, null))); } /** * Revert a commit by deleting the file. * Relies on retry code in filesystem * @throws IOException on problems * @param destKey destination key */ @Retries.OnceTranslated public void revertCommit(String destKey) throws IOException { once("revert commit", destKey, withinAuditSpan(getAuditSpan(), () -> { Path destPath = owner.keyToQualifiedPath(destKey); owner.deleteObjectAtPath(destPath, destKey, true); owner.maybeCreateFakeParentDirectory(destPath); })); } /** * This completes a multipart upload to the destination key via * {@code finalizeMultipartUpload()}. * Retry policy: retrying, translated. * Retries increment the {@code errorCount} counter. * @param destKey destination * @param uploadId multipart operation Id * @param partETags list of partial uploads * @param length length of the upload * @return the result of the operation. * @throws IOException if problems arose which could not be retried, or * the retry count was exceeded */ @Retries.RetryTranslated public CompleteMultipartUploadResult commitUpload( String destKey, String uploadId, List<PartETag> partETags, long length) throws IOException { checkNotNull(uploadId); checkNotNull(partETags); LOG.debug("Completing multipart upload {} with {} parts", uploadId, partETags.size()); return finalizeMultipartUpload(destKey, uploadId, partETags, length, Invoker.NO_OP ); } /** * Upload part of a multi-partition file. * @param request request * @return the result of the operation. * @throws IOException on problems */ @Retries.RetryTranslated public UploadPartResult uploadPart(UploadPartRequest request) throws IOException { return retry("upload part #" + request.getPartNumber() + " upload ID " + request.getUploadId(), request.getKey(), true, withinAuditSpan(getAuditSpan(), () -> owner.uploadPart(request))); } /** * Get the configuration of this instance; essentially the owning * filesystem configuration. * @return the configuration. */ public Configuration getConf() { return conf; } /** * Create a S3 Select request for the destination path. * This does not build the query. * @param path pre-qualified path for query * @return the request */ public SelectObjectContentRequest newSelectRequest(Path path) { try (AuditSpan span = getAuditSpan()) { return getRequestFactory().newSelectRequest( storeContext.pathToKey(path)); } } /** * Execute an S3 Select operation. * On a failure, the request is only logged at debug to avoid the * select exception being printed. * @param source source for selection * @param request Select request to issue. * @param action the action for use in exception creation * @return response * @throws IOException failure */ @Retries.RetryTranslated public SelectObjectContentResult select( final Path source, final SelectObjectContentRequest request, final String action) throws IOException { // no setting of span here as the select binding is (statically) created // without any span. 
String bucketName = request.getBucketName(); Preconditions.checkArgument(bucket.equals(bucketName), "wrong bucket: %s", bucketName); if (LOG.isDebugEnabled()) { LOG.debug("Initiating select call {} {}", source, request.getExpression()); LOG.debug(SelectBinding.toString(request)); } return invoker.retry( action, source.toString(), true, withinAuditSpan(getAuditSpan(), () -> { try (DurationInfo ignored = new DurationInfo(LOG, "S3 Select operation")) { try { return owner.getAmazonS3Client().selectObjectContent(request); } catch (AmazonS3Exception e) { LOG.error("Failure of S3 Select request against {}", source); LOG.debug("S3 Select request against {}:\n{}", source, SelectBinding.toString(request), e); throw e; } } })); } @Override public AuditSpan createSpan(final String operation, @Nullable final String path1, @Nullable final String path2) throws IOException { return auditSpanSource.createSpan(operation, path1, path2); } @Override public void incrementWriteOperations() { owner.incrementWriteOperations(); } /** * Deactivate the audit span. */ @Override public void close() throws IOException { deactivateAuditSpan(); } /** * Get the request factory which uses this store's audit span. * @return the request factory. */ public RequestFactory getRequestFactory() { return requestFactory; } } /* * Copyright 2016 DiffPlug * * Licensed under the Apache License, Version 2.0 (the "License"); * you may not use this file except in compliance with the License. * You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 * * Unless required by applicable law or agreed to in writing, software * distributed under the License is distributed on an "AS IS" BASIS, * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. * See the License for the specific language governing permissions and * limitations under the License. */ package com.diffplug.gradle.spotless; import java.io.File; import java.io.Serializable; import java.nio.charset.Charset; import java.util.ArrayList; import java.util.Arrays; import java.util.List; import java.util.Map; import java.util.Objects; import java.util.Random; import java.util.stream.Stream; import javax.annotation.Nullable; import org.gradle.api.GradleException; import org.gradle.api.Project; import org.gradle.api.file.FileCollection; import org.gradle.api.internal.file.UnionFileCollection; import com.diffplug.spotless.FormatExceptionPolicyStrict; import com.diffplug.spotless.FormatterFunc; import com.diffplug.spotless.FormatterStep; import com.diffplug.spotless.LazyForwardingEquality; import com.diffplug.spotless.LineEnding; import com.diffplug.spotless.ThrowingEx; import com.diffplug.spotless.generic.EndWithNewlineStep; import com.diffplug.spotless.generic.IndentStep; import com.diffplug.spotless.generic.LicenseHeaderStep; import com.diffplug.spotless.generic.ReplaceRegexStep; import com.diffplug.spotless.generic.ReplaceStep; import com.diffplug.spotless.generic.TrimTrailingWhitespaceStep; import groovy.lang.Closure; /** Adds a `spotless{Name}Check` and `spotless{Name}Apply` task. */ public class FormatExtension { final SpotlessExtension root; public FormatExtension(SpotlessExtension root) { this.root = root; } private String formatName() { for (Map.Entry<String, FormatExtension> entry : root.formats.entrySet()) { if (entry.getValue() == this) { return entry.getKey(); } } throw new IllegalStateException("This format is not contained by any SpotlessExtension."); } boolean paddedCell = false; /** Enables paddedCell mode. 
@see <a href="https://github.com/diffplug/spotless/blob/master/PADDEDCELL.md">Padded cell</a> */ public void paddedCell() { paddedCell(true); } /** Enables paddedCell mode. @see <a href="https://github.com/diffplug/spotless/blob/master/PADDEDCELL.md">Padded cell</a> */ public void paddedCell(boolean paddedCell) { this.paddedCell = paddedCell; } LineEnding lineEndings; /** Returns the line endings to use (defaults to {@link SpotlessExtension#getLineEndings()}. */ public LineEnding getLineEndings() { return lineEndings == null ? root.getLineEndings() : lineEndings; } /** Sets the line endings to use (defaults to {@link SpotlessExtension#getLineEndings()}. */ public void setLineEndings(LineEnding lineEndings) { this.lineEndings = lineEndings; } Charset encoding; /** Returns the encoding to use (defaults to {@link SpotlessExtension#getEncoding()}. */ public Charset getEncoding() { return encoding == null ? root.getEncoding() : encoding; } /** Sets the encoding to use (defaults to {@link SpotlessExtension#getEncoding()}. */ public void setEncoding(String name) { setEncoding(Charset.forName(name)); } /** Sets the encoding to use (defaults to {@link SpotlessExtension#getEncoding()}. */ public void setEncoding(Charset charset) { encoding = Objects.requireNonNull(charset); } FormatExceptionPolicyStrict exceptionPolicy = new FormatExceptionPolicyStrict(); /** Ignores errors in the given step. */ public void ignoreErrorForStep(String stepName) { exceptionPolicy.excludeStep(stepName); } /** Ignores errors for the given relative path. */ public void ignoreErrorForPath(String relativePath) { exceptionPolicy.excludePath(relativePath); } /** Sets encoding to use (defaults to {@link SpotlessExtension#getEncoding()}). */ public void encoding(String charset) { setEncoding(charset); } /** The files that need to be formatted. */ protected FileCollection target; /** * FileCollections pass through raw. * Strings are treated as the 'include' arg to fileTree, with project.rootDir as the dir. * List<String> are treated as the 'includes' arg to fileTree, with project.rootDir as the dir. * Anything else gets passed to getProject().files(). */ public void target(Object... 
targets) { if (targets.length == 0) { this.target = getProject().files(); } else if (targets.length == 1) { this.target = parseTarget(targets[0]); } else { if (Stream.of(targets).allMatch(o -> o instanceof String)) { this.target = parseTarget(Arrays.asList(targets)); } else { UnionFileCollection union = new UnionFileCollection(); for (Object target : targets) { union.add(parseTarget(target)); } this.target = union; } } } @SuppressWarnings("unchecked") protected FileCollection parseTarget(Object target) { if (target instanceof FileCollection) { return (FileCollection) target; } else if (target instanceof String || (target instanceof List && ((List<?>) target).stream().allMatch(o -> o instanceof String))) { // since people are likely to do '**/*.md', we want to make sure to exclude folders // they don't want to format which will slow down the operation greatly File dir = getProject().getProjectDir(); List<String> excludes = new ArrayList<>(); // no git excludes.add(".git"); // no .gradle if (getProject() == getProject().getRootProject()) { excludes.add(".gradle"); } // no build folders excludes.add(relativize(dir, getProject().getBuildDir())); for (Project subproject : getProject().getSubprojects()) { excludes.add(relativize(dir, subproject.getBuildDir())); } if (target instanceof String) { return (FileCollection) getProject().fileTree(dir).include((String) target).exclude(excludes); } else { // target can only be a List<String> at this point return (FileCollection) getProject().fileTree(dir).include((List<String>) target).exclude(excludes); } } else { return getProject().files(target); } } static String relativize(File root, File dest) { String rootPath = root.getAbsolutePath(); String destPath = dest.getAbsolutePath(); if (!destPath.startsWith(rootPath)) { throw new IllegalArgumentException(dest + " is not a child of " + root); } else { return destPath.substring(rootPath.length()); } } /** The steps that need to be added. */ protected List<FormatterStep> steps = new ArrayList<>(); /** Adds a new step. */ public void addStep(FormatterStep newStep) { FormatterStep existing = getExistingStep(newStep.getName()); if (existing != null) { throw new GradleException("Multiple steps with name '" + newStep.getName() + "' for spotless format '" + formatName() + "'"); } steps.add(newStep); } /** Returns the existing step with the given name, if any. */ @Nullable protected FormatterStep getExistingStep(String stepName) { for (FormatterStep step : steps) { if (stepName.equals(step.getName())) { return step; } } return null; } /** Replaces the given step. */ protected void replaceStep(FormatterStep replacementStep) { FormatterStep existing = getExistingStep(replacementStep.getName()); if (existing == null) { throw new GradleException("Cannot replace step '" + replacementStep.getName() + "' for spotless format '" + formatName() + "' because it hasn't been added yet."); } int index = steps.indexOf(existing); steps.set(index, replacementStep); } /** Clears all of the existing steps. */ public void clearSteps() { steps.clear(); } /** * An optional performance optimization if you are using any of the `custom` or `customLazy` * methods. If you aren't explicitly calling `custom` or `customLazy`, then this method * has no effect. * * Spotless tracks what files have changed from run to run, so that it can run faster * by only checking files which have changed, or whose formatting steps have changed. 
* If you use either the `custom` or `customLazy` methods, then gradle can never mark * your files as `up-to-date`, because it can't know if perhaps the behavior of your * custom function has changed. * * If you set `bumpThisNumberIfACustomStepChanges( <some number> )`, then spotless will * assume that the custom rules have not changed if the number has not changed. If a * custom rule does change, then you must bump the number so that spotless will know * that it must recheck the files it has already checked. */ public void bumpThisNumberIfACustomStepChanges(int number) { globalState = number; } private Serializable globalState = new NeverUpToDateBetweenRuns(); static class NeverUpToDateBetweenRuns extends LazyForwardingEquality<Integer> { private static final long serialVersionUID = 1L; private static final Random RANDOM = new Random(); @Override protected Integer calculateState() throws Exception { return RANDOM.nextInt(); } } /** * Adds the given custom step, which is constructed lazily for performance reasons. * * The resulting function will receive a string with unix-newlines, and it must return a string unix newlines. * * If you're getting errors about `closure cannot be cast to com.diffplug.common.base.Throwing$Function`, then use * {@link #customLazyGroovy(String, ThrowingEx.Supplier)}. */ public void customLazy(String name, ThrowingEx.Supplier<FormatterFunc> formatterSupplier) { addStep(FormatterStep.createLazy(name, () -> globalState, unusedState -> formatterSupplier.get())); } /** Same as {@link #customLazy(String, ThrowingEx.Supplier)}, but for Groovy closures. */ public void customLazyGroovy(String name, ThrowingEx.Supplier<Closure<String>> formatterSupplier) { customLazy(name, () -> formatterSupplier.get()::call); } /** Adds a custom step. Receives a string with unix-newlines, must return a string with unix newlines. */ public void custom(String name, Closure<String> formatter) { custom(name, formatter::call); } /** Adds a custom step. Receives a string with unix-newlines, must return a string with unix newlines. */ public void custom(String name, FormatterFunc formatter) { customLazy(name, () -> formatter); } /** Highly efficient find-replace char sequence. */ public void replace(String name, CharSequence original, CharSequence after) { addStep(ReplaceStep.create(name, original, after)); } /** Highly efficient find-replace regex. */ public void replaceRegex(String name, String regex, String replacement) { addStep(ReplaceRegexStep.create(name, regex, replacement)); } /** Removes trailing whitespace. */ public void trimTrailingWhitespace() { addStep(TrimTrailingWhitespaceStep.create()); } /** Ensures that files end with a single newline. */ public void endWithNewline() { addStep(EndWithNewlineStep.create()); } /** Ensures that the files are indented using spaces. */ public void indentWithSpaces(int numSpacesPerTab) { addStep(IndentStep.Type.SPACE.create(numSpacesPerTab)); } /** Ensures that the files are indented using spaces. */ public void indentWithSpaces() { indentWithSpaces(4); } /** Ensures that the files are indented using tabs. */ public void indentWithTabs(int tabToSpaces) { addStep(IndentStep.Type.TAB.create(tabToSpaces)); } /** Ensures that the files are indented using tabs. */ public void indentWithTabs() { indentWithTabs(4); } /** * @param licenseHeader * Content that should be at the top of every file * @param delimiter * Spotless will look for a line that starts with this to know what the "top" is. 
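 * <p>
 * For example, a line-comment style header applied above the package declaration
 * (both arguments below are illustrative; a block-comment header works the same way):
 * <pre>
 * licenseHeader("// (C) 2016 Example Corp", "package ");
 * </pre>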
*/ public void licenseHeader(String licenseHeader, String delimiter) { addStep(LicenseHeaderStep.createFromHeader(licenseHeader, delimiter)); } /** * @param licenseHeaderFile * Content that should be at the top of every file * @param delimiter * Spotless will look for a line that starts with this to know what the "top" is. */ public void licenseHeaderFile(Object licenseHeaderFile, String delimiter) { addStep(LicenseHeaderStep.createFromFile(getProject().file(licenseHeaderFile), getEncoding(), delimiter)); } /** Sets up a format task according to the values in this extension. */ protected void setupTask(SpotlessTask task) { task.setPaddedCell(paddedCell); task.setEncoding(getEncoding().name()); task.setExceptionPolicy(exceptionPolicy); task.setTarget(target); task.setSteps(steps); task.setLineEndingsPolicy(getLineEndings().createPolicy(getProject().getProjectDir(), () -> task.target)); } /** Returns the project that this extension is attached to. */ protected Project getProject() { return root.project; } } /* * Copyright 2014-2019 Amazon.com, Inc. or its affiliates. All Rights Reserved. * * Licensed under the Apache License, Version 2.0 (the "License"). You may not use this file except in compliance with * the License. A copy of the License is located at * * http://aws.amazon.com/apache2.0 * * or in the "license" file accompanying this file. This file is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR * CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions * and limitations under the License. */ package com.amazonaws.services.athena.model; import java.io.Serializable; import javax.annotation.Generated; import com.amazonaws.protocol.StructuredPojo; import com.amazonaws.protocol.ProtocolMarshaller; /** * <p> * A workgroup, which contains a name, description, creation time, state, and other configuration, listed under * <a>WorkGroup$Configuration</a>. Each workgroup enables you to isolate queries for you or your group of users from * other queries in the same account, to configure the query results location and the encryption configuration (known as * workgroup settings), to enable sending query metrics to Amazon CloudWatch, and to establish per-query data usage * control limits for all queries in a workgroup. The workgroup settings override is specified in * EnforceWorkGroupConfiguration (true/false) in the WorkGroupConfiguration. See * <a>WorkGroupConfiguration$EnforceWorkGroupConfiguration</a>. * </p> * * @see <a href="http://docs.aws.amazon.com/goto/WebAPI/athena-2017-05-18/WorkGroup" target="_top">AWS API * Documentation</a> */ @Generated("com.amazonaws:aws-java-sdk-code-generator") public class WorkGroup implements Serializable, Cloneable, StructuredPojo { /** * <p> * The workgroup name. * </p> */ private String name; /** * <p> * The state of the workgroup: ENABLED or DISABLED. * </p> */ private String state; /** * <p> * The configuration of the workgroup, which includes the location in Amazon S3 where query results are stored, the * encryption configuration, if any, used for query results; whether the Amazon CloudWatch Metrics are enabled for * the workgroup; whether workgroup settings override client-side settings; and the data usage limits for the amount * of data scanned per query or per workgroup. The workgroup settings override is specified in * EnforceWorkGroupConfiguration (true/false) in the WorkGroupConfiguration. See * <a>WorkGroupConfiguration$EnforceWorkGroupConfiguration</a>. 
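 * Like the other properties of this class, the configuration is normally populated through the
 * fluent {@code with*} setters; a minimal, illustrative sketch:
 * <pre>
 * WorkGroup workGroup = new WorkGroup()
 *     .withName("analytics")
 *     .withState(WorkGroupState.ENABLED)
 *     .withConfiguration(new WorkGroupConfiguration());
 * </pre>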
* </p> */ private WorkGroupConfiguration configuration; /** * <p> * The workgroup description. * </p> */ private String description; /** * <p> * The date and time the workgroup was created. * </p> */ private java.util.Date creationTime; /** * <p> * The workgroup name. * </p> * * @param name * The workgroup name. */ public void setName(String name) { this.name = name; } /** * <p> * The workgroup name. * </p> * * @return The workgroup name. */ public String getName() { return this.name; } /** * <p> * The workgroup name. * </p> * * @param name * The workgroup name. * @return Returns a reference to this object so that method calls can be chained together. */ public WorkGroup withName(String name) { setName(name); return this; } /** * <p> * The state of the workgroup: ENABLED or DISABLED. * </p> * * @param state * The state of the workgroup: ENABLED or DISABLED. * @see WorkGroupState */ public void setState(String state) { this.state = state; } /** * <p> * The state of the workgroup: ENABLED or DISABLED. * </p> * * @return The state of the workgroup: ENABLED or DISABLED. * @see WorkGroupState */ public String getState() { return this.state; } /** * <p> * The state of the workgroup: ENABLED or DISABLED. * </p> * * @param state * The state of the workgroup: ENABLED or DISABLED. * @return Returns a reference to this object so that method calls can be chained together. * @see WorkGroupState */ public WorkGroup withState(String state) { setState(state); return this; } /** * <p> * The state of the workgroup: ENABLED or DISABLED. * </p> * * @param state * The state of the workgroup: ENABLED or DISABLED. * @return Returns a reference to this object so that method calls can be chained together. * @see WorkGroupState */ public WorkGroup withState(WorkGroupState state) { this.state = state.toString(); return this; } /** * <p> * The configuration of the workgroup, which includes the location in Amazon S3 where query results are stored, the * encryption configuration, if any, used for query results; whether the Amazon CloudWatch Metrics are enabled for * the workgroup; whether workgroup settings override client-side settings; and the data usage limits for the amount * of data scanned per query or per workgroup. The workgroup settings override is specified in * EnforceWorkGroupConfiguration (true/false) in the WorkGroupConfiguration. See * <a>WorkGroupConfiguration$EnforceWorkGroupConfiguration</a>. * </p> * * @param configuration * The configuration of the workgroup, which includes the location in Amazon S3 where query results are * stored, the encryption configuration, if any, used for query results; whether the Amazon CloudWatch * Metrics are enabled for the workgroup; whether workgroup settings override client-side settings; and the * data usage limits for the amount of data scanned per query or per workgroup. The workgroup settings * override is specified in EnforceWorkGroupConfiguration (true/false) in the WorkGroupConfiguration. See * <a>WorkGroupConfiguration$EnforceWorkGroupConfiguration</a>. */ public void setConfiguration(WorkGroupConfiguration configuration) { this.configuration = configuration; } /** * <p> * The configuration of the workgroup, which includes the location in Amazon S3 where query results are stored, the * encryption configuration, if any, used for query results; whether the Amazon CloudWatch Metrics are enabled for * the workgroup; whether workgroup settings override client-side settings; and the data usage limits for the amount * of data scanned per query or per workgroup. 
The workgroup settings override is specified in * EnforceWorkGroupConfiguration (true/false) in the WorkGroupConfiguration. See * <a>WorkGroupConfiguration$EnforceWorkGroupConfiguration</a>. * </p> * * @return The configuration of the workgroup, which includes the location in Amazon S3 where query results are * stored, the encryption configuration, if any, used for query results; whether the Amazon CloudWatch * Metrics are enabled for the workgroup; whether workgroup settings override client-side settings; and the * data usage limits for the amount of data scanned per query or per workgroup. The workgroup settings * override is specified in EnforceWorkGroupConfiguration (true/false) in the WorkGroupConfiguration. See * <a>WorkGroupConfiguration$EnforceWorkGroupConfiguration</a>. */ public WorkGroupConfiguration getConfiguration() { return this.configuration; } /** * <p> * The configuration of the workgroup, which includes the location in Amazon S3 where query results are stored, the * encryption configuration, if any, used for query results; whether the Amazon CloudWatch Metrics are enabled for * the workgroup; whether workgroup settings override client-side settings; and the data usage limits for the amount * of data scanned per query or per workgroup. The workgroup settings override is specified in * EnforceWorkGroupConfiguration (true/false) in the WorkGroupConfiguration. See * <a>WorkGroupConfiguration$EnforceWorkGroupConfiguration</a>. * </p> * * @param configuration * The configuration of the workgroup, which includes the location in Amazon S3 where query results are * stored, the encryption configuration, if any, used for query results; whether the Amazon CloudWatch * Metrics are enabled for the workgroup; whether workgroup settings override client-side settings; and the * data usage limits for the amount of data scanned per query or per workgroup. The workgroup settings * override is specified in EnforceWorkGroupConfiguration (true/false) in the WorkGroupConfiguration. See * <a>WorkGroupConfiguration$EnforceWorkGroupConfiguration</a>. * @return Returns a reference to this object so that method calls can be chained together. */ public WorkGroup withConfiguration(WorkGroupConfiguration configuration) { setConfiguration(configuration); return this; } /** * <p> * The workgroup description. * </p> * * @param description * The workgroup description. */ public void setDescription(String description) { this.description = description; } /** * <p> * The workgroup description. * </p> * * @return The workgroup description. */ public String getDescription() { return this.description; } /** * <p> * The workgroup description. * </p> * * @param description * The workgroup description. * @return Returns a reference to this object so that method calls can be chained together. */ public WorkGroup withDescription(String description) { setDescription(description); return this; } /** * <p> * The date and time the workgroup was created. * </p> * * @param creationTime * The date and time the workgroup was created. */ public void setCreationTime(java.util.Date creationTime) { this.creationTime = creationTime; } /** * <p> * The date and time the workgroup was created. * </p> * * @return The date and time the workgroup was created. */ public java.util.Date getCreationTime() { return this.creationTime; } /** * <p> * The date and time the workgroup was created. * </p> * * @param creationTime * The date and time the workgroup was created. 
* @return Returns a reference to this object so that method calls can be chained together. */ public WorkGroup withCreationTime(java.util.Date creationTime) { setCreationTime(creationTime); return this; } /** * Returns a string representation of this object. This is useful for testing and debugging. Sensitive data will be * redacted from this string using a placeholder value. * * @return A string representation of this object. * * @see java.lang.Object#toString() */ @Override public String toString() { StringBuilder sb = new StringBuilder(); sb.append("{"); if (getName() != null) sb.append("Name: ").append(getName()).append(","); if (getState() != null) sb.append("State: ").append(getState()).append(","); if (getConfiguration() != null) sb.append("Configuration: ").append(getConfiguration()).append(","); if (getDescription() != null) sb.append("Description: ").append(getDescription()).append(","); if (getCreationTime() != null) sb.append("CreationTime: ").append(getCreationTime()); sb.append("}"); return sb.toString(); } @Override public boolean equals(Object obj) { if (this == obj) return true; if (obj == null) return false; if (obj instanceof WorkGroup == false) return false; WorkGroup other = (WorkGroup) obj; if (other.getName() == null ^ this.getName() == null) return false; if (other.getName() != null && other.getName().equals(this.getName()) == false) return false; if (other.getState() == null ^ this.getState() == null) return false; if (other.getState() != null && other.getState().equals(this.getState()) == false) return false; if (other.getConfiguration() == null ^ this.getConfiguration() == null) return false; if (other.getConfiguration() != null && other.getConfiguration().equals(this.getConfiguration()) == false) return false; if (other.getDescription() == null ^ this.getDescription() == null) return false; if (other.getDescription() != null && other.getDescription().equals(this.getDescription()) == false) return false; if (other.getCreationTime() == null ^ this.getCreationTime() == null) return false; if (other.getCreationTime() != null && other.getCreationTime().equals(this.getCreationTime()) == false) return false; return true; } @Override public int hashCode() { final int prime = 31; int hashCode = 1; hashCode = prime * hashCode + ((getName() == null) ? 0 : getName().hashCode()); hashCode = prime * hashCode + ((getState() == null) ? 0 : getState().hashCode()); hashCode = prime * hashCode + ((getConfiguration() == null) ? 0 : getConfiguration().hashCode()); hashCode = prime * hashCode + ((getDescription() == null) ? 0 : getDescription().hashCode()); hashCode = prime * hashCode + ((getCreationTime() == null) ? 0 : getCreationTime().hashCode()); return hashCode; } @Override public WorkGroup clone() { try { return (WorkGroup) super.clone(); } catch (CloneNotSupportedException e) { throw new IllegalStateException("Got a CloneNotSupportedException from Object.clone() " + "even though we're Cloneable!", e); } } @com.amazonaws.annotation.SdkInternalApi @Override public void marshall(ProtocolMarshaller protocolMarshaller) { com.amazonaws.services.athena.model.transform.WorkGroupMarshaller.getInstance().marshall(this, protocolMarshaller); } } /** * Copyright (c) 2016-present, RxJava Contributors. * * Licensed under the Apache License, Version 2.0 (the "License"); * you may not use this file except in compliance with the License. 
* You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 * * Unless required by applicable law or agreed to in writing, software * distributed under the License is distributed on an "AS IS" BASIS, * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. * See the License for the specific language governing permissions and * limitations under the License. */ package io.reactivex.flowable; import static org.junit.Assert.assertEquals; import java.util.*; import org.junit.Test; import org.reactivestreams.Publisher; import io.reactivex.Flowable; import io.reactivex.FlowableTransformer; import io.reactivex.flowables.GroupedFlowable; import io.reactivex.functions.*; import io.reactivex.subscribers.TestSubscriber; /** * Test super/extends of generics. * * See https://github.com/Netflix/RxJava/pull/331 */ public class FlowableCovarianceTest { /** * This won't compile if super/extends isn't done correctly on generics. */ @Test public void testCovarianceOfFrom() { Flowable.<Movie> just(new HorrorMovie()); Flowable.<Movie> fromIterable(new ArrayList<HorrorMovie>()); // Observable.<HorrorMovie>from(new Movie()); // may not compile } @Test public void testSortedList() { Comparator<Media> sortFunction = new Comparator<Media>() { @Override public int compare(Media t1, Media t2) { return 1; } }; // this one would work without the covariance generics Flowable<Media> f = Flowable.just(new Movie(), new TVSeason(), new Album()); f.toSortedList(sortFunction); // this one would NOT work without the covariance generics Flowable<Movie> f2 = Flowable.just(new Movie(), new ActionMovie(), new HorrorMovie()); f2.toSortedList(sortFunction); } @Test public void testGroupByCompose() { Flowable<Movie> movies = Flowable.just(new HorrorMovie(), new ActionMovie(), new Movie()); TestSubscriber<String> ts = new TestSubscriber<String>(); movies .groupBy(new Function<Movie, Object>() { @Override public Object apply(Movie v) { return v.getClass(); } }) .doOnNext(new Consumer<GroupedFlowable<Object, Movie>>() { @Override public void accept(GroupedFlowable<Object, Movie> g) { System.out.println(g.getKey()); } }) .flatMap(new Function<GroupedFlowable<Object, Movie>, Publisher<String>>() { @Override public Publisher<String> apply(GroupedFlowable<Object, Movie> g) { return g .doOnNext(new Consumer<Movie>() { @Override public void accept(Movie v) { System.out.println(v); } }) .compose(new FlowableTransformer<Movie, Movie>() { @Override public Publisher<Movie> apply(Flowable<Movie> m) { return m.concatWith(Flowable.just(new ActionMovie())); } } ) .map(new Function<Object, String>() { @Override public String apply(Object v) { return v.toString(); } }); } }) .subscribe(ts); ts.assertTerminated(); ts.assertNoErrors(); // System.out.println(ts.getOnNextEvents()); assertEquals(6, ts.valueCount()); } @SuppressWarnings("unused") @Test public void testCovarianceOfCompose() { Flowable<HorrorMovie> movie = Flowable.just(new HorrorMovie()); Flowable<Movie> movie2 = movie.compose(new FlowableTransformer<HorrorMovie, Movie>() { @Override public Publisher<Movie> apply(Flowable<HorrorMovie> t) { return Flowable.just(new Movie()); } }); } @SuppressWarnings("unused") @Test public void testCovarianceOfCompose2() { Flowable<Movie> movie = Flowable.<Movie> just(new HorrorMovie()); Flowable<HorrorMovie> movie2 = movie.compose(new FlowableTransformer<Movie, HorrorMovie>() { @Override public Publisher<HorrorMovie> apply(Flowable<Movie> t) { return Flowable.just(new HorrorMovie()); } }); } 
@SuppressWarnings("unused") @Test public void testCovarianceOfCompose3() { Flowable<Movie> movie = Flowable.<Movie>just(new HorrorMovie()); Flowable<HorrorMovie> movie2 = movie.compose(new FlowableTransformer<Movie, HorrorMovie>() { @Override public Publisher<HorrorMovie> apply(Flowable<Movie> t) { return Flowable.just(new HorrorMovie()).map(new Function<HorrorMovie, HorrorMovie>() { @Override public HorrorMovie apply(HorrorMovie v) { return v; } }); } } ); } @SuppressWarnings("unused") @Test public void testCovarianceOfCompose4() { Flowable<HorrorMovie> movie = Flowable.just(new HorrorMovie()); Flowable<HorrorMovie> movie2 = movie.compose(new FlowableTransformer<HorrorMovie, HorrorMovie>() { @Override public Publisher<HorrorMovie> apply(Flowable<HorrorMovie> t1) { return t1.map(new Function<HorrorMovie, HorrorMovie>() { @Override public HorrorMovie apply(HorrorMovie v) { return v; } }); } }); } @Test public void testComposeWithDeltaLogic() { List<Movie> list1 = Arrays.asList(new Movie(), new HorrorMovie(), new ActionMovie()); List<Movie> list2 = Arrays.asList(new ActionMovie(), new Movie(), new HorrorMovie(), new ActionMovie()); Flowable<List<Movie>> movies = Flowable.just(list1, list2); movies.compose(deltaTransformer); } static Function<List<List<Movie>>, Flowable<Movie>> calculateDelta = new Function<List<List<Movie>>, Flowable<Movie>>() { @Override public Flowable<Movie> apply(List<List<Movie>> listOfLists) { if (listOfLists.size() == 1) { return Flowable.fromIterable(listOfLists.get(0)); } else { // diff the two List<Movie> newList = listOfLists.get(1); List<Movie> oldList = new ArrayList<Movie>(listOfLists.get(0)); Set<Movie> delta = new LinkedHashSet<Movie>(); delta.addAll(newList); // remove all that match in old delta.removeAll(oldList); // filter oldList to those that aren't in the newList oldList.removeAll(newList); // for all left in the oldList we'll create DROP events for (@SuppressWarnings("unused") Movie old : oldList) { delta.add(new Movie()); } return Flowable.fromIterable(delta); } } }; static FlowableTransformer<List<Movie>, Movie> deltaTransformer = new FlowableTransformer<List<Movie>, Movie>() { @Override public Publisher<Movie> apply(Flowable<List<Movie>> movieList) { return movieList .startWith(new ArrayList<Movie>()) .buffer(2, 1) .skip(1) .flatMap(calculateDelta); } }; /* * Most tests are moved into their applicable classes such as [Operator]Tests.java */ static class Media { } static class Movie extends Media { } static class HorrorMovie extends Movie { } static class ActionMovie extends Movie { } static class Album extends Media { } static class TVSeason extends Media { } static class Rating { } static class CoolRating extends Rating { } static class Result { } static class ExtendedResult extends Result { } } /* * Licensed to the Apache Software Foundation (ASF) under one or more * contributor license agreements. See the NOTICE file distributed with * this work for additional information regarding copyright ownership. * The ASF licenses this file to You under the Apache License, Version 2.0 * (the "License"); you may not use this file except in compliance with * the License. You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 * * Unless required by applicable law or agreed to in writing, software * distributed under the License is distributed on an "AS IS" BASIS, * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
* See the License for the specific language governing permissions and * limitations under the License. */ package org.apache.accumulo.server.zookeeper; import static java.nio.charset.StandardCharsets.UTF_8; import java.util.ArrayList; import java.util.Collections; import java.util.List; import java.util.Random; import java.util.Set; import java.util.concurrent.ThreadPoolExecutor; import java.util.concurrent.atomic.AtomicInteger; import org.apache.accumulo.core.conf.AccumuloConfiguration; import org.apache.accumulo.fate.zookeeper.ZooUtil.NodeExistsPolicy; import org.apache.accumulo.fate.zookeeper.ZooUtil.NodeMissingPolicy; import org.apache.accumulo.server.util.time.SimpleTimer; import org.apache.zookeeper.KeeperException; import org.apache.zookeeper.KeeperException.NodeExistsException; import org.apache.zookeeper.WatchedEvent; import org.apache.zookeeper.Watcher; import org.slf4j.Logger; import org.slf4j.LoggerFactory; /** * Provides a way to push work out to tablet servers via zookeeper and wait for that work to be done. Any tablet server can pick up a work item and process it. * * Worker processes watch a zookeeper node for tasks to be performed. After getting an exclusive lock on the node, the worker will perform the task. */ public class DistributedWorkQueue { private static final String LOCKS_NODE = "locks"; private static final Logger log = LoggerFactory.getLogger(DistributedWorkQueue.class); private ThreadPoolExecutor threadPool; private ZooReaderWriter zoo = ZooReaderWriter.getInstance(); private String path; private AccumuloConfiguration config; private long timerInitialDelay, timerPeriod; private AtomicInteger numTask = new AtomicInteger(0); private void lookForWork(final Processor processor, List<String> children) { if (children.size() == 0) return; if (numTask.get() >= threadPool.getCorePoolSize()) return; Random random = new Random(); Collections.shuffle(children, random); try { for (final String child : children) { if (child.equals(LOCKS_NODE)) continue; final String lockPath = path + "/locks/" + child; try { // no need to use zoolock, because a queue (ephemeral sequential) is not needed // if can not get the lock right now then do not want to wait zoo.putEphemeralData(lockPath, new byte[0]); } catch (NodeExistsException nee) { // someone else has reserved it continue; } final String childPath = path + "/" + child; // check to see if another node processed it already if (!zoo.exists(childPath)) { zoo.recursiveDelete(lockPath, NodeMissingPolicy.SKIP); continue; } // Great... we got the lock, but maybe we're too busy if (numTask.get() >= threadPool.getCorePoolSize()) { zoo.recursiveDelete(lockPath, NodeMissingPolicy.SKIP); break; } log.debug("got lock for " + child); Runnable task = new Runnable() { @Override public void run() { try { try { processor.newProcessor().process(child, zoo.getData(childPath, null)); // if the task fails, then its entry in the Q is not deleted... 
so it will be retried try { zoo.recursiveDelete(childPath, NodeMissingPolicy.SKIP); } catch (Exception e) { log.error("Error received when trying to delete entry in zookeeper " + childPath, e); } } catch (Exception e) { log.warn("Failed to process work " + child, e); } try { zoo.recursiveDelete(lockPath, NodeMissingPolicy.SKIP); } catch (Exception e) { log.error("Error received when trying to delete entry in zookeeper " + childPath, e); } } finally { numTask.decrementAndGet(); } try { // its important that this is called after numTask is decremented lookForWork(processor, zoo.getChildren(path)); } catch (KeeperException e) { log.error("Failed to look for work", e); } catch (InterruptedException e) { log.info("Interrupted looking for work", e); } } }; numTask.incrementAndGet(); threadPool.execute(task); } } catch (Throwable t) { log.error("Unexpected error", t); } } public interface Processor { Processor newProcessor(); void process(String workID, byte[] data); } public DistributedWorkQueue(String path, AccumuloConfiguration config) { // Preserve the old delay and period this(path, config, new Random().nextInt(60 * 1000), 60 * 1000); } public DistributedWorkQueue(String path, AccumuloConfiguration config, long timerInitialDelay, long timerPeriod) { this.path = path; this.config = config; this.timerInitialDelay = timerInitialDelay; this.timerPeriod = timerPeriod; } public void startProcessing(final Processor processor, ThreadPoolExecutor executorService) throws KeeperException, InterruptedException { threadPool = executorService; zoo.mkdirs(path); zoo.mkdirs(path + "/" + LOCKS_NODE); List<String> children = zoo.getChildren(path, new Watcher() { @Override public void process(WatchedEvent event) { switch (event.getType()) { case NodeChildrenChanged: if (event.getPath().equals(path)) try { lookForWork(processor, zoo.getChildren(path, this)); } catch (KeeperException e) { log.error("Failed to look for work", e); } catch (InterruptedException e) { log.info("Interrupted looking for work", e); } else log.info("Unexpected path for NodeChildrenChanged event " + event.getPath()); break; case NodeCreated: case NodeDataChanged: case NodeDeleted: case None: log.info("Got unexpected zookeeper event: " + event.getType() + " for " + path); break; } } }); lookForWork(processor, children); // Add a little jitter to avoid all the tservers slamming zookeeper at once SimpleTimer.getInstance(config).schedule(new Runnable() { @Override public void run() { log.debug("Looking for work in " + path); try { lookForWork(processor, zoo.getChildren(path)); } catch (KeeperException e) { log.error("Failed to look for work", e); } catch (InterruptedException e) { log.info("Interrupted looking for work", e); } } }, timerInitialDelay, timerPeriod); } /** * Adds work to the queue, automatically converting the String to bytes using UTF-8 */ public void addWork(String workId, String data) throws KeeperException, InterruptedException { addWork(workId, data.getBytes(UTF_8)); } public void addWork(String workId, byte[] data) throws KeeperException, InterruptedException { if (workId.equalsIgnoreCase(LOCKS_NODE)) throw new IllegalArgumentException("locks is reserved work id"); zoo.mkdirs(path); zoo.putPersistentData(path + "/" + workId, data, NodeExistsPolicy.SKIP); } public List<String> getWorkQueued() throws KeeperException, InterruptedException { ArrayList<String> children = new ArrayList<String>(zoo.getChildren(path)); children.remove(LOCKS_NODE); return children; } public void waitUntilDone(Set<String> workIDs) throws 
KeeperException, InterruptedException { final Object condVar = new Object(); Watcher watcher = new Watcher() { @Override public void process(WatchedEvent event) { switch (event.getType()) { case NodeChildrenChanged: synchronized (condVar) { condVar.notify(); } break; case NodeCreated: case NodeDataChanged: case NodeDeleted: case None: log.info("Got unexpected zookeeper event: " + event.getType() + " for " + path); break; } } }; List<String> children = zoo.getChildren(path, watcher); while (!Collections.disjoint(children, workIDs)) { synchronized (condVar) { condVar.wait(10000); } children = zoo.getChildren(path, watcher); } } } package com.fincatto.documentofiscal.nfe310.classes.inutilizacao; import org.junit.Assert; import org.junit.Test; import com.fincatto.documentofiscal.DFAmbiente; import com.fincatto.documentofiscal.DFUnidadeFederativa; import com.fincatto.documentofiscal.nfe310.FabricaDeObjetosFake; import com.fincatto.documentofiscal.nfe310.classes.evento.inutilizacao.NFEventoInutilizacaoDados; public class NFEventoCancelamentoDadosTest { @Test(expected = IllegalStateException.class) public void naoDevePermitirModeloInvalido() { final NFEventoInutilizacaoDados dados = new NFEventoInutilizacaoDados(); dados.setModeloDocumentoFiscal("75"); } @Test public void devePermitirAmbosModelosDeNFe() { final NFEventoInutilizacaoDados dados = new NFEventoInutilizacaoDados(); dados.setModeloDocumentoFiscal("55"); dados.setModeloDocumentoFiscal("65"); } @Test(expected = IllegalStateException.class) public void naoDevePermitirJustificativaInvalido() { final NFEventoInutilizacaoDados dados = new NFEventoInutilizacaoDados(); try { dados.setJustificativa("rcAYGVaFoYcW8q"); } catch (final IllegalStateException e) { dados.setJustificativa("WDou2V29BncPEppZRB7XnD7BAQPYFgewTmEu2kCCRbESq01soGjLJVxhJmcYMxAY3t0nXCXmWJh8suPIikxWuUxaJCAMBKUiMMm04AyySjtjSrNqThH0W14IpNWM5bCkKOqyoV58HFVxfZLfZOYmn7SCUW3QTOoaos09TFbMMIccnW2kfVMrb8T419Mpy60IIjo6hqORvMPZiDKjSrmpWiYLCIGLLBpqjbO9XmSHryazw2XoT2yJMpfE9N53GCRh"); } } @Test(expected = IllegalStateException.class) public void naoDevePermitirCNPJInvalido() { final NFEventoInutilizacaoDados dados = new NFEventoInutilizacaoDados(); dados.setCnpj("1234567890123"); } @Test(expected = NumberFormatException.class) public void naoDevePermitirAnoDiferenteDeDuasCasas() { final NFEventoInutilizacaoDados dados = new NFEventoInutilizacaoDados(); try { dados.setAno(9); } catch (final NumberFormatException e) { dados.setAno(100); } } @Test(expected = IllegalStateException.class) public void naoDevePermitirNumeroNFInicialInvalido() { final NFEventoInutilizacaoDados dados = new NFEventoInutilizacaoDados(); try { dados.setNumeroNFInicial(""); } catch (final IllegalStateException e) { dados.setNumeroNFInicial("1000000000"); } } @Test(expected = IllegalStateException.class) public void naoDevePermitirNumeroNFFinalInvalido() { final NFEventoInutilizacaoDados dados = new NFEventoInutilizacaoDados(); try { dados.setNumeroNFFinal(""); } catch (final IllegalStateException e) { dados.setNumeroNFFinal("1000000000"); } } @Test(expected = IllegalStateException.class) public void naoDevePermitirSerieInvalido() { final NFEventoInutilizacaoDados dados = new NFEventoInutilizacaoDados(); try { dados.setSerie(""); } catch (final IllegalStateException e) { dados.setSerie("1000"); } } @Test(expected = IllegalStateException.class) public void naoDevePermitirServicoInvalido() { final NFEventoInutilizacaoDados dados = new NFEventoInutilizacaoDados(); dados.setNomeServico("inutiliza"); } @Test(expected = 
IllegalStateException.class) public void naoDevePermitirIDInvalido() { final NFEventoInutilizacaoDados dados = new NFEventoInutilizacaoDados(); dados.setIdentificador("IDw6cRIPJzP4sv6gBWQFCNcFSITQK7rOxjmBFcW2Mzf"); } @Test(expected = IllegalStateException.class) public void naoDevePermitirIdentificadorNulo() { final NFEventoInutilizacaoDados dados = new NFEventoInutilizacaoDados(); dados.setAmbiente(DFAmbiente.HOMOLOGACAO); dados.setAno(15); dados.setCnpj("12345678901234"); dados.setJustificativa("u2MGhwXFQDFtSuKsLkmgowBZNNhOWBL4JKIqYnIj5iDPTAUqHSwKL1O2olgmZwigRS1P58Zoc1qDxzqmvv3hBE1LYuLHNPbFXuLwM5ZxvH7xfSpnkX5VBGjrkR3cuiXLr1uz3chFb9JrNY5xU3X0eF9Byc2Q9TkPbFyPj7iRwwQVMNt6FGvpUyRMHGmhSDYhFRD2Dst0UaauvA4V0breWHyN4WUSEm9z377jXHNwtVLQQCxB2wcEIZGWVIT4CF5"); dados.setModeloDocumentoFiscal("55"); dados.setNomeServico("INUTILIZAR"); dados.setNumeroNFInicial("1"); dados.setNumeroNFFinal("999999999"); dados.setSerie("999"); dados.setUf(DFUnidadeFederativa.SC); dados.toString(); } @Test(expected = IllegalStateException.class) public void naoDevePermitirAmbienteNulo() { final NFEventoInutilizacaoDados dados = new NFEventoInutilizacaoDados(); dados.setAno(15); dados.setCnpj("12345678901234"); dados.setIdentificador("ID55605654557305333405403926218856863798956"); dados.setJustificativa("u2MGhwXFQDFtSuKsLkmgowBZNNhOWBL4JKIqYnIj5iDPTAUqHSwKL1O2olgmZwigRS1P58Zoc1qDxzqmvv3hBE1LYuLHNPbFXuLwM5ZxvH7xfSpnkX5VBGjrkR3cuiXLr1uz3chFb9JrNY5xU3X0eF9Byc2Q9TkPbFyPj7iRwwQVMNt6FGvpUyRMHGmhSDYhFRD2Dst0UaauvA4V0breWHyN4WUSEm9z377jXHNwtVLQQCxB2wcEIZGWVIT4CF5"); dados.setModeloDocumentoFiscal("55"); dados.setNomeServico("INUTILIZAR"); dados.setNumeroNFInicial("1"); dados.setNumeroNFFinal("999999999"); dados.setSerie("999"); dados.setUf(DFUnidadeFederativa.SC); dados.toString(); } @Test(expected = IllegalStateException.class) public void naoDevePermitirServicoNulo() { final NFEventoInutilizacaoDados dados = new NFEventoInutilizacaoDados(); dados.setAmbiente(DFAmbiente.HOMOLOGACAO); dados.setAno(15); dados.setCnpj("12345678901234"); dados.setIdentificador("ID55605654557305333405403926218856863798956"); dados.setJustificativa("u2MGhwXFQDFtSuKsLkmgowBZNNhOWBL4JKIqYnIj5iDPTAUqHSwKL1O2olgmZwigRS1P58Zoc1qDxzqmvv3hBE1LYuLHNPbFXuLwM5ZxvH7xfSpnkX5VBGjrkR3cuiXLr1uz3chFb9JrNY5xU3X0eF9Byc2Q9TkPbFyPj7iRwwQVMNt6FGvpUyRMHGmhSDYhFRD2Dst0UaauvA4V0breWHyN4WUSEm9z377jXHNwtVLQQCxB2wcEIZGWVIT4CF5"); dados.setModeloDocumentoFiscal("55"); dados.setNumeroNFInicial("1"); dados.setNumeroNFFinal("999999999"); dados.setSerie("999"); dados.setUf(DFUnidadeFederativa.SC); dados.toString(); } @Test(expected = IllegalStateException.class) public void naoDevePermitirUFNulo() { final NFEventoInutilizacaoDados dados = new NFEventoInutilizacaoDados(); dados.setAmbiente(DFAmbiente.HOMOLOGACAO); dados.setAno(15); dados.setCnpj("12345678901234"); dados.setIdentificador("ID55605654557305333405403926218856863798956"); dados.setJustificativa("u2MGhwXFQDFtSuKsLkmgowBZNNhOWBL4JKIqYnIj5iDPTAUqHSwKL1O2olgmZwigRS1P58Zoc1qDxzqmvv3hBE1LYuLHNPbFXuLwM5ZxvH7xfSpnkX5VBGjrkR3cuiXLr1uz3chFb9JrNY5xU3X0eF9Byc2Q9TkPbFyPj7iRwwQVMNt6FGvpUyRMHGmhSDYhFRD2Dst0UaauvA4V0breWHyN4WUSEm9z377jXHNwtVLQQCxB2wcEIZGWVIT4CF5"); dados.setModeloDocumentoFiscal("55"); dados.setNomeServico("INUTILIZAR"); dados.setNumeroNFInicial("1"); dados.setNumeroNFFinal("999999999"); dados.setSerie("999"); dados.toString(); } @Test(expected = IllegalStateException.class) public void naoDevePermitirAnoNulo() { final NFEventoInutilizacaoDados dados = new NFEventoInutilizacaoDados(); 
dados.setAmbiente(DFAmbiente.HOMOLOGACAO); dados.setCnpj("12345678901234"); dados.setIdentificador("ID55605654557305333405403926218856863798956"); dados.setJustificativa("u2MGhwXFQDFtSuKsLkmgowBZNNhOWBL4JKIqYnIj5iDPTAUqHSwKL1O2olgmZwigRS1P58Zoc1qDxzqmvv3hBE1LYuLHNPbFXuLwM5ZxvH7xfSpnkX5VBGjrkR3cuiXLr1uz3chFb9JrNY5xU3X0eF9Byc2Q9TkPbFyPj7iRwwQVMNt6FGvpUyRMHGmhSDYhFRD2Dst0UaauvA4V0breWHyN4WUSEm9z377jXHNwtVLQQCxB2wcEIZGWVIT4CF5"); dados.setModeloDocumentoFiscal("55"); dados.setNomeServico("INUTILIZAR"); dados.setNumeroNFInicial("1"); dados.setNumeroNFFinal("999999999"); dados.setSerie("999"); dados.setUf(DFUnidadeFederativa.SC); dados.toString(); } @Test(expected = IllegalStateException.class) public void naoDevePermitirCNPJNulo() { final NFEventoInutilizacaoDados dados = new NFEventoInutilizacaoDados(); dados.setAmbiente(DFAmbiente.HOMOLOGACAO); dados.setAno(15); dados.setIdentificador("ID55605654557305333405403926218856863798956"); dados.setJustificativa("u2MGhwXFQDFtSuKsLkmgowBZNNhOWBL4JKIqYnIj5iDPTAUqHSwKL1O2olgmZwigRS1P58Zoc1qDxzqmvv3hBE1LYuLHNPbFXuLwM5ZxvH7xfSpnkX5VBGjrkR3cuiXLr1uz3chFb9JrNY5xU3X0eF9Byc2Q9TkPbFyPj7iRwwQVMNt6FGvpUyRMHGmhSDYhFRD2Dst0UaauvA4V0breWHyN4WUSEm9z377jXHNwtVLQQCxB2wcEIZGWVIT4CF5"); dados.setModeloDocumentoFiscal("55"); dados.setNomeServico("INUTILIZAR"); dados.setNumeroNFInicial("1"); dados.setNumeroNFFinal("999999999"); dados.setSerie("999"); dados.setUf(DFUnidadeFederativa.SC); dados.toString(); } @Test(expected = IllegalStateException.class) public void naoDevePermitirModeloNulo() { final NFEventoInutilizacaoDados dados = new NFEventoInutilizacaoDados(); dados.setAmbiente(DFAmbiente.HOMOLOGACAO); dados.setAno(15); dados.setCnpj("12345678901234"); dados.setIdentificador("ID55605654557305333405403926218856863798956"); dados.setJustificativa("u2MGhwXFQDFtSuKsLkmgowBZNNhOWBL4JKIqYnIj5iDPTAUqHSwKL1O2olgmZwigRS1P58Zoc1qDxzqmvv3hBE1LYuLHNPbFXuLwM5ZxvH7xfSpnkX5VBGjrkR3cuiXLr1uz3chFb9JrNY5xU3X0eF9Byc2Q9TkPbFyPj7iRwwQVMNt6FGvpUyRMHGmhSDYhFRD2Dst0UaauvA4V0breWHyN4WUSEm9z377jXHNwtVLQQCxB2wcEIZGWVIT4CF5"); dados.setNomeServico("INUTILIZAR"); dados.setNumeroNFInicial("1"); dados.setNumeroNFFinal("999999999"); dados.setSerie("999"); dados.setUf(DFUnidadeFederativa.SC); dados.toString(); } @Test(expected = IllegalStateException.class) public void naoDevePermitirSerieNulo() { final NFEventoInutilizacaoDados dados = new NFEventoInutilizacaoDados(); dados.setAmbiente(DFAmbiente.HOMOLOGACAO); dados.setAno(15); dados.setCnpj("12345678901234"); dados.setIdentificador("ID55605654557305333405403926218856863798956"); dados.setJustificativa("u2MGhwXFQDFtSuKsLkmgowBZNNhOWBL4JKIqYnIj5iDPTAUqHSwKL1O2olgmZwigRS1P58Zoc1qDxzqmvv3hBE1LYuLHNPbFXuLwM5ZxvH7xfSpnkX5VBGjrkR3cuiXLr1uz3chFb9JrNY5xU3X0eF9Byc2Q9TkPbFyPj7iRwwQVMNt6FGvpUyRMHGmhSDYhFRD2Dst0UaauvA4V0breWHyN4WUSEm9z377jXHNwtVLQQCxB2wcEIZGWVIT4CF5"); dados.setModeloDocumentoFiscal("55"); dados.setNomeServico("INUTILIZAR"); dados.setNumeroNFInicial("1"); dados.setNumeroNFFinal("999999999"); dados.setUf(DFUnidadeFederativa.SC); dados.toString(); } @Test(expected = IllegalStateException.class) public void naoDevePermitirNumeroNotaInicialNulo() { final NFEventoInutilizacaoDados dados = new NFEventoInutilizacaoDados(); dados.setAmbiente(DFAmbiente.HOMOLOGACAO); dados.setAno(15); dados.setCnpj("12345678901234"); dados.setIdentificador("ID55605654557305333405403926218856863798956"); 
dados.setJustificativa("u2MGhwXFQDFtSuKsLkmgowBZNNhOWBL4JKIqYnIj5iDPTAUqHSwKL1O2olgmZwigRS1P58Zoc1qDxzqmvv3hBE1LYuLHNPbFXuLwM5ZxvH7xfSpnkX5VBGjrkR3cuiXLr1uz3chFb9JrNY5xU3X0eF9Byc2Q9TkPbFyPj7iRwwQVMNt6FGvpUyRMHGmhSDYhFRD2Dst0UaauvA4V0breWHyN4WUSEm9z377jXHNwtVLQQCxB2wcEIZGWVIT4CF5"); dados.setModeloDocumentoFiscal("55"); dados.setNomeServico("INUTILIZAR"); dados.setNumeroNFFinal("999999999"); dados.setSerie("999"); dados.setUf(DFUnidadeFederativa.SC); dados.toString(); } @Test(expected = IllegalStateException.class) public void naoDevePermitirNumeroNotaFinalNulo() { final NFEventoInutilizacaoDados dados = new NFEventoInutilizacaoDados(); dados.setAmbiente(DFAmbiente.HOMOLOGACAO); dados.setAno(15); dados.setCnpj("12345678901234"); dados.setIdentificador("ID55605654557305333405403926218856863798956"); dados.setJustificativa("u2MGhwXFQDFtSuKsLkmgowBZNNhOWBL4JKIqYnIj5iDPTAUqHSwKL1O2olgmZwigRS1P58Zoc1qDxzqmvv3hBE1LYuLHNPbFXuLwM5ZxvH7xfSpnkX5VBGjrkR3cuiXLr1uz3chFb9JrNY5xU3X0eF9Byc2Q9TkPbFyPj7iRwwQVMNt6FGvpUyRMHGmhSDYhFRD2Dst0UaauvA4V0breWHyN4WUSEm9z377jXHNwtVLQQCxB2wcEIZGWVIT4CF5"); dados.setModeloDocumentoFiscal("55"); dados.setNomeServico("INUTILIZAR"); dados.setNumeroNFInicial("1"); dados.setSerie("999"); dados.setUf(DFUnidadeFederativa.SC); dados.toString(); } @Test(expected = IllegalStateException.class) public void naoDevePermitirJustificativaNulo() { final NFEventoInutilizacaoDados dados = new NFEventoInutilizacaoDados(); dados.setAmbiente(DFAmbiente.HOMOLOGACAO); dados.setAno(15); dados.setCnpj("12345678901234"); dados.setIdentificador("ID55605654557305333405403926218856863798956"); dados.setModeloDocumentoFiscal("55"); dados.setNomeServico("INUTILIZAR"); dados.setNumeroNFInicial("1"); dados.setNumeroNFFinal("999999999"); dados.setSerie("999"); dados.setUf(DFUnidadeFederativa.SC); dados.toString(); } @Test public void deveGerarXMLDeAcordoComOPadraoEstabelecido() { final String xmlEsperado = "<infInut Id=\"ID42161234567890123455123123456789987654321\"><tpAmb>2</tpAmb><xServ>INUTILIZAR</xServ><cUF>42</cUF><ano>16</ano><CNPJ>12345678901234</CNPJ><mod>55</mod><serie>123</serie><nNFIni>123456789</nNFIni><nNFFin>987654321</nNFFin><xJust>u2MGhwXFQDFtSuKsLkmgowBZNNhOWBL4JKIqYnIj5iDPTAUqHSwKL1O2olgmZwigRS1P58Zoc1qDxzqmvv3hBE1LYuLHNPbFXuLwM5ZxvH7xfSpnkX5VBGjrkR3cuiXLr1uz3chFb9JrNY5xU3X0eF9Byc2Q9TkPbFyPj7iRwwQVMNt6FGvpUyRMHGmhSDYhFRD2Dst0UaauvA4V0breWHyN4WUSEm9z377jXHNwtVLQQCxB2wcEIZGWVIT4CF5</xJust></infInut>"; Assert.assertEquals(xmlEsperado, FabricaDeObjetosFake.getNFEventoInutilizacaoDados().toString()); } } Downloads last month3
https://huggingface.co/datasets/microsoft/CLUES
CLUES: Few-Shot Learning Evaluation in Natural Language Understanding This repo contains the data for the NeurIPS 2021 benchmark Constrained Language Understanding Evaluation Standard (CLUES). Leaderboard We maintain a Leaderboard allowing researchers to submit their results as entries. Submission Instructions Each submission must be submitted as a pull request modifying the markdown file underlying the leaderboard. The submission must attach an accompanying public paper and public source code for reproducing their results on our dataset. A submission can be toward any subset of tasks in our benchmark, or toward the aggregate leaderboard. For any task targeted by the submission, we require evaluation on (1) 10, 20, and 30 shots, and (2) all 5 splits of the corresponding dataset and a report of their mean and standard deviation. Each leaderboard will be sorted by the 30-shot mean S1 score (where S1 score is a variant of F1 score defined in our paper). The submission should not use data from the 4 other splits during few-shot finetuning of any 1 split, either as extra training set or as validation set for hyperparameter tuning. However, we allow external data, labeled or unlabeled, to be used for such purposes. Each submission using external data must mark the corresponding columns "external labeled" and/or "external unlabeled". Note, in this context, "external data" refers to data used after pretraining (e.g., for task-specific tuning); in particular, methods using existing pretrained models only, without extra data, should not mark either column. For obvious reasons, models cannot be trained on the original labeled datasets from where we sampled the few-shot CLUES data. In the table entry, the submission should include a method name and a citation, hyperlinking to their publicly released source code reproducing the results. See the last entry of the table below for an example. Abbreviations FT = (classic) finetuning PT = prompt based tuning ICL = in-context learning, in the style of GPT-3 ΞΌΒ±Οƒ = mean ΞΌ and standard deviation Οƒ across our 5 splits. Aggregate standard deviation is calculated using the sum-of-variance formula from individual tasks' standard deviations. Benchmarking CLUES for Aggregate 30-shot Evaluation Shots (K=30) external labeled external unlabeled Average β–Ό SST-2 MNLI CoNLL03 WikiANN SQuAD-v2 ReCoRD Human N N 81.4 83.7 69.4 87.4 82.6 73.5 91.9 T5-Large-770M-FT N N 43.1Β±6.7 52.3Β±2.9 36.8Β±3.8 51.2Β±0.1 62.4Β±0.6 43.7Β±2.7 12Β±3.8 BERT-Large-336M-FT N N 42.1Β±7.8 55.4Β±2.5 33.3Β±1.4 51.3Β±0 62.5Β±0.6 35.3Β±6.4 14.9Β±3.4 BERT-Base-110M-FT N N 41.5Β±9.2 53.6Β±5.5 35.4Β±3.2 51.3Β±0 62.8Β±0 32.6Β±5.8 13.1Β±3.3 DeBERTa-Large-400M-FT N N 40.1Β±17.8 47.7Β±9.0 26.7Β±11 48.2Β±2.9 58.3Β±6.2 38.7Β±7.4 21.1Β±3.6 RoBERTa-Large-355M-FT N N 40.0Β±10.6 53.2Β±5.6 34.0Β±1.1 44.7Β±2.6 48.4Β±6.7 43.5Β±4.4 16Β±2.8 RoBERTa-Large-355M-PT N N 90.2Β±1.8 61.6Β±3.5 DeBERTa-Large-400M-PT N N 88.4Β±3.3 62.9Β±3.1 BERT-Large-336M-PT N N 82.7Β±4.1 45.3Β±2.0 GPT3-175B-ICL N N 91.0Β±1.6 33.2Β±0.2 BERT-Base-110M-PT N N 79.4Β±5.6 42.5Β±3.2 LiST (Wang et al.) N Y 91.3 Β±0.7 67.9Β±3.0 Example (lastname et al.) 
Y/N Y/N 0Β±0 0Β±0 0Β±0 0Β±0 0Β±0 0Β±0 0Β±0 Individual Task Performance over Multiple Shots SST-2 Shots (K) external labeled external unlabeled 10 20 30 β–Ό All GPT-3 (175B) ICL N N 85.9Β±3.7 92.0Β±0.7 91.0Β±1.6 - RoBERTa-Large PT N N 88.8Β±3.9 89.0Β±1.1 90.2Β±1.8 93.8 DeBERTa-Large PT N N 83.4Β±5.3 87.8Β±3.5 88.4Β±3.3 91.9 Human N N 79.8 83 83.7 - BERT-Large PT N N 63.2Β±11.3 78.2Β±9.9 82.7Β±4.1 91 BERT-Base PT N N 63.9Β±10.0 76.7Β±6.6 79.4Β±5.6 91.9 BERT-Large FT N N 46.3Β±5.5 55.5Β±3.4 55.4Β±2.5 99.1 BERT-Base FT N N 46.2Β±5.6 54.0Β±2.8 53.6Β±5.5 98.1 RoBERTa-Large FT N N 38.4Β±21.7 52.3Β±5.6 53.2Β±5.6 98.6 T5-Large FT N N 51.2Β±1.8 53.4Β±3.2 52.3Β±2.9 97.6 DeBERTa-Large FT N N 43.0Β±11.9 40.8Β±22.6 47.7Β±9.0 100 Example (lastname et al.) Y/N Y/N 0Β±0 0Β±0 0Β±0 - MNLI Shots (K) external labeled external unlabeled 10 20 30 β–Ό All Human N Y 78.1 78.6 69.4 - LiST (wang et al.) N N 60.5Β±8.3 67.2Β±4.5 67.9Β±3.0 - DeBERTa-Large PT N N 44.5Β±8.2 60.7Β±5.3 62.9Β±3.1 88.1 RoBERTa-Large PT N N 57.7Β±3.6 58.6Β±2.9 61.6Β±3.5 87.1 BERT-Large PT N N 41.7Β±1.0 43.7Β±2.1 45.3Β±2.0 81.9 BERT-Base PT N N 40.4Β±1.8 42.1Β±4.4 42.5Β±3.2 81 T5-Large FT N N 39.8Β±3.3 37.9Β±4.3 36.8Β±3.8 85.9 BERT-Base FT N N 37.0Β±5.2 35.2Β±2.7 35.4Β±3.2 81.6 RoBERTa-Large FT N N 34.3Β±2.8 33.4Β±0.9 34.0Β±1.1 85.5 BERT-Large FT N N 33.7Β±0.4 28.2Β±14.8 33.3Β±1.4 80.9 GPT-3 (175B) ICL N N 33.5Β±0.7 33.1Β±0.3 33.2Β±0.2 - DeBERTa-Large FT N N 27.4Β±14.1 33.6Β±2.5 26.7Β±11.0 87.6 CoNLL03 Shots (K) external labeled external unlabeled 10 20 30 β–Ό All Human N N 87.7 89.7 87.4 - BERT-Base FT N N 51.3Β±0 51.3Β±0 51.3Β±0 - BERT-Large FT N N 51.3Β±0 51.3Β±0 51.3Β±0 89.3 T5-Large FT N N 46.3Β±6.9 50.0Β±0.7 51.2Β±0.1 92.2 DeBERTa-Large FT N N 50.1Β±1.2 47.8Β±2.5 48.2Β±2.9 93.6 RoBERTa-Large FT N N 50.8Β±0.5 44.6Β±5.1 44.7Β±2.6 93.2 WikiANN Shots (K) external labeled external unlabeled 10 20 30 β–Ό All Human N N 81.4 83.5 82.6 - BERT-Base FT N N 62.8Β±0 62.8Β±0 62.8Β±0 88.8 BERT-Large FT N N 62.8Β±0 62.6Β±0.4 62.5Β±0.6 91 T5-Large FT N N 61.7Β±0.7 62.1Β±0.2 62.4Β±0.6 87.4 DeBERTa-Large FT N N 58.5Β±3.3 57.9Β±5.8 58.3Β±6.2 91.1 RoBERTa-Large FT N N 58.5Β±8.8 56.9Β±3.4 48.4Β±6.7 91.2 SQuAD v2 Shots (K) external labeled external unlabeled 10 20 30 β–Ό All Human N N 71.9 76.4 73.5 - T5-Large FT N N 43.6Β±3.5 28.7Β±13.0 43.7Β±2.7 87.2 RoBERTa-Large FT N N 38.1Β±7.2 40.1Β±6.4 43.5Β±4.4 89.4 DeBERTa-Large FT N N 41.4Β±7.3 44.4Β±4.5 38.7Β±7.4 90 BERT-Large FT N N 42.3Β±5.6 35.8Β±9.7 35.3Β±6.4 81.8 BERT-Base FT N N 46.0Β±2.4 34.9Β±9.0 32.6Β±5.8 76.3 ReCoRD Shots (K) external labeled external unlabeled 10 20 30 β–Ό All Human N N 94.1 94.2 91.9 - DeBERTa-Large FT N N 15.7Β±5.0 16.8Β±5.7 21.1Β±3.6 80.7 RoBERTa-Large FT N N 12.0Β±1.9 9.9Β±6.2 16.0Β±2.8 80.3 BERT-Large FT N N 9.9Β±5.2 11.8Β±4.9 14.9Β±3.4 66 BERT-Base FT N N 10.3Β±1.8 11.7Β±2.4 13.1Β±3.3 54.4 T5-Large FT N N 11.9Β±2.7 11.7Β±1.5 12.0Β±3.8 77.3 How do I cite CLUES? @article{cluesteam2021, title={Few-Shot Learning Evaluation in Natural Language Understanding}, author={Mukherjee, Subhabrata and Liu, Xiaodong and Zheng, Guoqing and Hosseini, Saghar and Cheng, Hao and Yang, Greg and Meek, Christopher and Awadallah, Ahmed Hassan and Gao, Jianfeng}, booktitle = {NeurIPS 2021}, year = {2021}, month = {December}, url = {https://www.microsoft.com/en-us/research/publication/clues-few-shot-learning-evaluation-in-natural-language-understanding/}, } Contributing This project welcomes contributions and suggestions. 
Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com. When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA. This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments. Trademarks This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.
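As a worked illustration of the aggregation described in the submission instructions above (report mean and standard deviation over the 5 splits per task, then combine tasks, with the aggregate standard deviation computed by the sum-of-variance formula), here is a minimal Python sketch. It is not from the CLUES repository; it simply averages the per-task means and takes the square root of the summed per-task variances. Plugging in the 30-shot T5-Large-770M-FT row from the aggregate table above reproduces its published 43.1±6.7.

import math

def aggregate(per_task_scores):
    """Combine per-task (mean, std) pairs into an aggregate (mean, std).

    Aggregate mean: plain average of the task means.
    Aggregate std: sum-of-variance rule, sqrt of the summed per-task variances.
    """
    means = [m for m, _ in per_task_scores]
    stds = [s for _, s in per_task_scores]
    agg_mean = sum(means) / len(means)
    agg_std = math.sqrt(sum(s ** 2 for s in stds))
    return agg_mean, agg_std

# 30-shot T5-Large-770M-FT numbers taken from the aggregate table above:
# SST-2, MNLI, CoNLL03, WikiANN, SQuAD-v2, ReCoRD
t5_large_30_shot = [(52.3, 2.9), (36.8, 3.8), (51.2, 0.1),
                    (62.4, 0.6), (43.7, 2.7), (12.0, 3.8)]

mean, std = aggregate(t5_large_30_shot)
print(f"{mean:.1f} +/- {std:.1f}")  # prints 43.1 +/- 6.7, matching the table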
https://huggingface.co/Kunda
Chaikatisha Kunda Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/ramaramani
Rama Ramani ramaramani rramani Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/microsoft/phi-1_5
Model Summary The language model phi-1.5 is a Transformer with 1.3 billion parameters. It was trained using the same data sources as phi-1, augmented with a new data source that consists of various NLP synthetic texts. When assessed against benchmarks testing common sense, language understanding, and logical reasoning, phi-1.5 demonstrates a nearly state-of-the-art performance among models with less than 10 billion parameters. We did not fine-tune phi-1.5 either for instruction following or through reinforcement learning from human feedback. The intention behind crafting this open-source model is to provide the research community with a non-restricted small model to explore vital safety challenges, such as reducing toxicity, understanding societal biases, enhancing controllability, and more. For a safer model release, we exclude generic web-crawl data sources such as common-crawl from the training. This strategy prevents direct exposure to potentially harmful online content, enhancing the model's safety without RLHF. However, the model is still vulnerable to generating harmful content. We hope the model can help the research community to further study the safety of language models. phi-1.5 can write poems, draft emails, create stories, summarize texts, write Python code (such as downloading a Hugging Face transformer model), etc. Intended Uses Given the nature of the training data, phi-1.5 is best suited for prompts using the QA format, the chat format, and the code format. Note that phi-1.5, being a base model, often produces irrelevant text following the main answer. In the following example, we've truncated the answer for illustrative purposes only. QA format: Write a detailed analogy between mathematics and a lighthouse. Answer: Mathematics is like a lighthouse, guiding us through the vast ocean of numbers and calculations. Just as a lighthouse illuminates the darkness, mathematics provides us with a clear path to navigate through complex problems. It helps us make sense of the world around us, just like a lighthouse helps ships find their way home. where the model generates the text after "Answer:". Chat format: Alice: I don't know why, I'm struggling to maintain focus while studying. Any suggestions? Bob: Have you tried using a timer? It can help you stay on track and avoid distractions. Alice: That's a good idea. I'll give it a try. Charlie: Another thing that can help is to break up your study sessions into smaller chunks. It's easier to concentrate on one thing at a time. Alice: That makes sense. I'll try that too. Bob: And don't forget to take breaks! It's important to give your brain a rest so you can come back to your studies with a fresh perspective. Alice: Thanks for the advice, guys. I feel more motivated now. Charlie: No problem, Alice. We're all in this together. Bob: Yeah, and remember that it's okay to ask for help if you need it. We're here to support each other. where the model generates the text after the first "Bob:". Code format: def print_prime(n): """ Print all primes between 1 and n """ primes = [] for num in range(2, n+1): is_prime = True for i in range(2, int(math.sqrt(num))+1): if num % i == 0: is_prime = False break if is_prime: primes.append(num) print(primes) where the model generates the text after the comments. Notes phi-1.5 is intended for research purposes. The model-generated text/code should be treated as a starting point rather than a definitive solution for potential use cases. 
Users should be cautious when employing these models in their applications. Direct adoption for production tasks is out of the scope of this research project. As a result, phi-1.5 has not been tested to ensure that it performs adequately for any production-level application. Please refer to the limitation sections of this document for more details. Limitations of phi-1.5 Generate Inaccurate Code and Facts: The model often produces incorrect code snippets and statements. Users should treat these outputs as suggestions or starting points, not as definitive or accurate solutions. Limited Scope for code: If the model generates Python scripts that utilize uncommon packages or scripts in other languages, we strongly recommend users manually verify all API uses. Unreliable Responses to Instruction: The model has not undergone instruction fine-tuning. As a result, it may struggle or fail to adhere to intricate or nuanced instructions provided by users. Language Limitations: The model is primarily designed to understand standard English. Informal English, slang, or any other language outside of English might pose challenges to its comprehension, leading to potential misinterpretations or errors in response. Potential Societal Biases: Regardless of the safe data used for its training, the model is not entirely free from societal biases. There's a possibility it may generate content that mirrors these societal biases, particularly if prompted or instructed to do so. We urge users to be aware of this and to exercise caution and critical thinking when interpreting model outputs. Toxicity: Despite that the model is trained with carefully selected data, the model can still produce harmful content if explicitly prompted or instructed to do so. We chose to release the model for research purposes only -- We hope to help the open-source community develop the most effective ways to reduce the toxicity of a model directly after pretraining. Training Model Architecture: a Transformer-based model with next-word prediction objective Dataset size: 30B tokens Training tokens: 150B tokens Precision: fp16 GPUs: 32xA100-40G Training time: 8 days Software PyTorch DeepSpeed flash-attention License The model is licensed under the Research License. Sample Code import torch from transformers import AutoModelForCausalLM, AutoTokenizer torch.set_default_device("cuda") model = AutoModelForCausalLM.from_pretrained("microsoft/phi-1_5", trust_remote_code=True) tokenizer = AutoTokenizer.from_pretrained("microsoft/phi-1_5", trust_remote_code=True) inputs = tokenizer('''```python def print_prime(n): """ Print all primes between 1 and n """''', return_tensors="pt", return_attention_mask=False) outputs = model.generate(**inputs, max_length=200) text = tokenizer.batch_decode(outputs)[0] print(text) If you need to use the model in a lower precision (e.g., FP16), please wrap the model's forward pass with torch.autocast(), as follows: with torch.autocast(model.device.type, dtype=torch.float16, enabled=True): outputs = model.generate(**inputs, max_length=200) Remark. In the generation function, our model currently does not support beam search (num_beams > 1). Furthermore, in the forward pass of the model, we currently do not support attention mask during training, outputting hidden states or attention values, or using custom input embeddings (instead of the model's). 
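As a minimal sketch (not from the original card) of running the QA-format prompt shown under Intended Uses, the snippet below reuses the same API calls as the sample code above; the exact whitespace around "Answer:" is a small assumption, and the generated continuation will vary from run to run.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

torch.set_default_device("cuda")

model = AutoModelForCausalLM.from_pretrained("microsoft/phi-1_5", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("microsoft/phi-1_5", trust_remote_code=True)

# QA-format prompt, following the example in the "Intended Uses" section above.
prompt = "Write a detailed analogy between mathematics and a lighthouse.\n\nAnswer:"

inputs = tokenizer(prompt, return_tensors="pt", return_attention_mask=False)
outputs = model.generate(**inputs, max_length=200)

# The model continues the text after "Answer:"; being a base model, it may
# append extra, less relevant text after the main answer.
print(tokenizer.batch_decode(outputs)[0])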
Citation You can find the paper at https://arxiv.org/abs/2309.05463 @article{textbooks2, title={Textbooks Are All You Need II: \textbf{phi-1.5} technical report}, author={Li, Yuanzhi and Bubeck, S{\'e}bastien and Eldan, Ronen and Del Giorno, Allie and Gunasekar, Suriya and Lee, Yin Tat}, journal={arXiv preprint arXiv:2309.05463}, year={2023} }
https://huggingface.co/memoryz
Jason Wang memoryz memoryz Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/msftndubuisi
Emmanuel Ndubuisi msftndubuisi Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/pidajay
Ajay P pidajay pidajay Research interests AI, Medical Imaging, Healthcare Organizations models None public yet datasets None public yet
https://huggingface.co/devinHMS
Devin Howard devinHMS Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/datasets/microsoft/codexglue_method_generation
a, b = tee(iterable)<EOL>next(b, None)<EOL>return zip(a, b)<EOL> s -> (s0,s1), (s1,s2), (s2, s3), ... <EOL>all_characters = re.findall('<STR_LIT>', text) <EOL>if len(all_characters) == <NUM_LIT:0>:<EOL><INDENT>return <NUM_LIT:0><EOL><DEDENT>repetition_count = Counter(all_characters)<EOL>score = (len(all_characters)) ** <NUM_LIT:2> / (len(repetition_count) + len(text) / <NUM_LIT>)<EOL>return score<EOL> Returns a score in [0,1] range if the text makes any sense in English. def get_top_n_meanings(strings, n): scored_strings = [(s, score_meaning(s)) for s in strings]<EOL>scored_strings.sort(key=lambda tup: -tup[<NUM_LIT:1>])<EOL>return scored_strings[:n]<EOL> Returns (text, score) for top n strings for k, v in PokeAPI().get_endpoints().items():<EOL><INDENT>string = "<STR_LIT>"<EOL>string += ("<STR_LIT>"<EOL>.format(k.replace('<STR_LIT:->', '<STR_LIT:_>')) + '<STR_LIT>')<EOL>string += ("<STR_LIT>" +<EOL>"<STR_LIT>")<EOL>string += "<STR_LIT>".format(v.split('<STR_LIT:/>')[-<NUM_LIT:2>])<EOL>string += "<STR_LIT>"<EOL>string += '<STR_LIT>'<EOL>string += '<STR_LIT>'<EOL>print(string)<EOL><DEDENT> Automagically generates methods based on the API endpoints Returns a local filesystem path where the file can be retrieved using Python's built-in open() function. Storage systems that can't be accessed using open() should *not* implement this method. def open(self, name, mode='<STR_LIT:rb>'): self.request = urlopen(self.url)<EOL>if self.algorithm:<EOL><INDENT>self.hash = hashlib.new(self.algorithm)<EOL><DEDENT>return self<EOL> Retrieves the specified file from storage. def list(self, ignore_patterns): return six.iteritems(self.firsts)<EOL> List all files in all storages. def find(self, path, all=False): found = os.path.join(settings.STATIC_ROOT, path)<EOL>if all:<EOL><INDENT>return [found]<EOL><DEDENT>else:<EOL><INDENT>return found<EOL><DEDENT> Looks for files in the app directories. def give_unexpected_calls(method_calls, expected_methods_names): return [call for call in method_calls<EOL>if call[<NUM_LIT:0>] not in expected_methods_names]<EOL> TODO: Move this to a common test utils module. @classmethod<EOL><INDENT>def _unwrap_func(cls, decorated_func):<DEDENT> if click is not None:<EOL><INDENT>if isinstance(decorated_func, click.Command):<EOL><INDENT>return cls._unwrap_func(decorated_func.callback)<EOL><DEDENT><DEDENT>if hasattr(decorated_func, '<STR_LIT>'):<EOL><INDENT>return cls._unwrap_func(decorated_func.__wrapped__)<EOL><DEDENT>else:<EOL><INDENT>return decorated_func<EOL><DEDENT> This unwraps a decorated func, returning the inner wrapped func. This may become unnecessary with Python 3.4's inspect.unwrap(). def _register_dependent(self, dependent, resource_name): if dependent not in self.dependents:<EOL><INDENT>self.dependents[dependent] = []<EOL><DEDENT>self.dependents[dependent].insert(<NUM_LIT:0>, resource_name)<EOL> Register a mapping of the dependent to resource name. After calling, dependency_register.dependents[dependent] should contain resource_name. def register(self, resource_name, dependent=None): if dependent is None:<EOL><INDENT>return partial(self.register, resource_name)<EOL><DEDENT>dependent = self._unwrap_dependent(dependent)<EOL>self._register_dependent(dependent, resource_name)<EOL>self._register_resource_dependency(resource_name, dependent)<EOL>return dependent<EOL> Register the given dependent as depending on the "resource" named by resource_name. 
@di.dependsOn('<STR_LIT>')<EOL>def multiply(n): multiplier = di.resolver.unpack(multiply)<EOL>return multiplier * n<EOL> Multiply the given number n by some configured multiplier. @providers.register('<STR_LIT>')<EOL>def give_multiplier(): @di.dependsOn('<STR_LIT>')<EOL>@di.dependsOn('<STR_LIT>')<EOL>def multiply_and_add(n): multiplier, offset = di.resolver.unpack(multiply_and_add)<EOL>return (multiplier * n) + offset<EOL> Multiply the given number n by some configured multiplier, and then add a configured offset. @providers.register('<STR_LIT>')<EOL>def give_multiplier(): @providers.register('<STR_LIT>')<EOL>def give_offset(): Give an offset value of 3. self._msg('<STR_LIT>')<EOL>self._msg2('<STR_LIT>'.format(self._curdir))<EOL>self._msg2('<STR_LIT>'.format(self._session.cookies))<EOL>self._msg2('<STR_LIT>'.format(self._session.headers))<EOL>self._msg2('<STR_LIT>'.format(self._config))<EOL>self._msg2('<STR_LIT>'.format(self._custom))<EOL>self._msg2('<STR_LIT>'.format(self._account))<EOL> Show a list of recently variables info. def register(self, argtypes=r'<STR_LIT:M>', help_msg=None): def format_args(method):<EOL><INDENT>def wrapped_method(*args, **kwargs):<EOL><INDENT>args_count = len(args) <EOL>argtypes_count = len(argtypes)<EOL>placeholder_count = argtypes.count('<STR_LIT:H>') + argtypes.count('<STR_LIT:h>')<EOL>if placeholder_count:<EOL><INDENT>min_args_count = (argtypes_count - placeholder_count)<EOL>if args_count < min_args_count or args_count > argtypes_count:<EOL><INDENT>raise KngetError("<STR_LIT>",<EOL>reason='<STR_LIT>'.format(args_count))<EOL><DEDENT><DEDENT>elif args_count != argtypes_count:<EOL><INDENT>raise KngetError("<STR_LIT>",<EOL>reason='<STR_LIT>'.format(args_count))<EOL><DEDENT>argv = [] <EOL>for i in range(args_count):<EOL><INDENT>if argtypes[i] in ('<STR_LIT:m>', '<STR_LIT:M>'):<EOL><INDENT>argv.append(args[i])<EOL><DEDENT>elif argtypes[i] in ('<STR_LIT:i>', '<STR_LIT:I>'):<EOL><INDENT>argv.append(int(args[i]))<EOL><DEDENT>elif argtypes[i] in ('<STR_LIT:s>', '<STR_LIT:S>'):<EOL><INDENT>argv.append(str(args[i]))<EOL><DEDENT>elif argtypes[i] in ('<STR_LIT:h>', '<STR_LIT:H>'):<EOL><INDENT>argv.append(args[i])<EOL><DEDENT>else:<EOL><INDENT>raise KngetError('<STR_LIT>'.format(argtypes[i]))<EOL><DEDENT><DEDENT>return method(*argv, **kwargs)<EOL><DEDENT>wrapped_method.__doc__ = method.__doc__<EOL>self._commands[method.__name__] = (<EOL>wrapped_method, help_msg<EOL>)<EOL>return wrapped_method<EOL><DEDENT>return format_args<EOL> Register a method to a command. NOTE: Method registered here is unbound method, e.g. registered `run` command -> `KngetShell.run` So we call it should add `self` at first. See also: KngetShell.execute() :param argtypes: a str of the command args type. M: Myself -> self S: String -> str I: Integer -> int H: placeHolder -> pass or anything :param help_msg: a short help string of commands. :return: a callable function or method. 
@command.register(argtypes=r'<STR_LIT>', help_msg="<STR_LIT>")<EOL><INDENT>def run(self, tags, begin, end=False):<DEDENT> if not end:<EOL><INDENT>end = begin<EOL><DEDENT>super(KngetShell, self).run(tags, begin, int(end))<EOL> Override method of class Knget @command.register(argtypes=r'<STR_LIT:M>', help_msg="<STR_LIT>")<EOL><INDENT>def debug(self):<DEDENT> Override method of `Knget._debug_info()` @command.register(argtypes=r'<STR_LIT>', help_msg="<STR_LIT>")<EOL><INDENT>def dbgrun(self, source):<DEDENT> try:<EOL><INDENT>exec(source)<EOL><DEDENT>except Exception as e:<EOL><INDENT>self._msg2('<STR_LIT>'.format(e))<EOL><DEDENT> Debug run. based on exec(), unsafe. try:<EOL><INDENT>return load_pem_public_key(pubkey.encode(), default_backend())<EOL><DEDENT>except ValueError:<EOL><INDENT>pubkey = pubkey.replace('<STR_LIT>', '<STR_LIT>').replace('<STR_LIT>', '<STR_LIT>')<EOL>return load_pem_public_key(pubkey.encode(), default_backend())<EOL><DEDENT> Load public RSA key, with work-around for keys using incorrect header/footer format. Read more about RSA encryption with cryptography: https://cryptography.io/latest/hazmat/primitives/asymmetric/rsa/ def encrypt(pubkey, password): key = load_key(pubkey)<EOL>encrypted_password = key.encrypt(password, PKCS1v15())<EOL>return base64.b64encode(encrypted_password)<EOL> Encrypt password using given RSA public key and encode it with base64. The encrypted password can only be decrypted by someone with the private key (in this case, only Travis). def fetch_public_key(repo): keyurl = '<STR_LIT>'.format(repo)<EOL>data = json.loads(urlopen(keyurl).read().decode())<EOL>if '<STR_LIT:key>' not in data:<EOL><INDENT>errmsg = "<STR_LIT>".format(repo)<EOL>errmsg += "<STR_LIT>"<EOL>raise ValueError(errmsg)<EOL><DEDENT>return data['<STR_LIT:key>']<EOL> Download RSA public key Travis will use for this repo. Travis API docs: http://docs.travis-ci.com/api/#repository-keys def prepend_line(filepath, line): with open(filepath) as f:<EOL><INDENT>lines = f.readlines()<EOL><DEDENT>lines.insert(<NUM_LIT:0>, line)<EOL>with open(filepath, '<STR_LIT:w>') as f:<EOL><INDENT>f.writelines(lines)<EOL><DEDENT> Rewrite a file adding a line to its beginning. def update_travis_deploy_password(encrypted_password): config = load_yaml_config(TRAVIS_CONFIG_FILE)<EOL>config['<STR_LIT>']['<STR_LIT:password>'] = dict(secure=encrypted_password)<EOL>save_yaml_config(TRAVIS_CONFIG_FILE, config)<EOL>line = ('<STR_LIT>'<EOL>'<STR_LIT>')<EOL>prepend_line(TRAVIS_CONFIG_FILE, line)<EOL> Update the deploy section of the .travis.yml file to use the given encrypted password. def tokenize_words(string): string = six.text_type(string)<EOL>return re.findall(WORD_TOKENIZATION_RULES, string)<EOL> Tokenize input text to words. 
:param string: Text to tokenize :type string: str or unicode :return: words :rtype: list of strings def tokenize_sents(string): string = six.text_type(string)<EOL>spans = []<EOL>for match in re.finditer('<STR_LIT>', string):<EOL><INDENT>spans.append(match)<EOL><DEDENT>spans_count = len(spans)<EOL>rez = []<EOL>off = <NUM_LIT:0><EOL>for i in range(spans_count):<EOL><INDENT>tok = string[spans[i].start():spans[i].end()]<EOL>if i == spans_count - <NUM_LIT:1>:<EOL><INDENT>rez.append(string[off:spans[i].end()])<EOL><DEDENT>elif tok[-<NUM_LIT:1>] in ['<STR_LIT:.>', '<STR_LIT:!>', '<STR_LIT:?>', '<STR_LIT>', '<STR_LIT>']:<EOL><INDENT>tok1 = tok[re.search('<STR_LIT>', tok).start()-<NUM_LIT:1>]<EOL>next_tok = string[spans[i + <NUM_LIT:1>].start():spans[i + <NUM_LIT:1>].end()]<EOL>if (next_tok[<NUM_LIT:0>].isupper()<EOL>and not tok1.isupper()<EOL>and not (tok[-<NUM_LIT:1>] != '<STR_LIT:.>'<EOL>or tok1[<NUM_LIT:0>] == '<STR_LIT:(>'<EOL>or tok in ABBRS)):<EOL><INDENT>rez.append(string[off:spans[i].end()])<EOL>off = spans[i + <NUM_LIT:1>].start()<EOL><DEDENT><DEDENT><DEDENT>return rez<EOL> Tokenize input text to sentences. :param string: Text to tokenize :type string: str or unicode :return: sentences :rtype: list of strings def tokenize_text(string): string = six.text_type(string)<EOL>rez = []<EOL>for part in string.split('<STR_LIT:\n>'):<EOL><INDENT>par = []<EOL>for sent in tokenize_sents(part):<EOL><INDENT>par.append(tokenize_words(sent))<EOL><DEDENT>if par:<EOL><INDENT>rez.append(par)<EOL><DEDENT><DEDENT>return rez<EOL> Tokenize input text to paragraphs, sentences and words. Tokenization to paragraphs is done using simple Newline algorithm For sentences and words tokenizers above are used :param string: Text to tokenize :type string: str or unicode :return: text, tokenized into paragraphs, sentences and words :rtype: list of list of list of words def crypt(header, body_bytes, secret): <EOL><INDENT>unsigned char<EOL><DEDENT>= network-order (big-endian) unsigned int<EOL>length = len(body_bytes)<EOL>hed = (<EOL>truct.pack('<STR_LIT>', header.session_id) +<EOL>ix.b(secret) +<EOL>truct.pack('<STR_LIT:B>', header.version) +<EOL>truct.pack('<STR_LIT:B>', header.seq_no)<EOL>hashed = md5(unhashed).digest()<EOL>en(pad) < body_length):<EOL><INDENT>remake hash, appending it to pad until pad >= header.length<EOL><DEDENT>hile True:<EOL><INDENT>hashed = md5(unhashed + hashed).digest()<EOL>pad += hashed<EOL>if len(pad) >= body_length:<EOL><INDENT>break<EOL> TACACS+ uses a shared secret key (known to both the client and server) to obfuscate the body of sent packets. Only the packet body (not the header) is obfuscated. https://datatracker.ietf.org/doc/draft-ietf-opsawg-tacacs/?include_text=1#section-3.7 ENCRYPTED {data} == data ^ pseudo_pad The pad is generated by concatenating a series of MD5 hashes (each 16 bytes long) and truncating it to the length of the input data. pseudo_pad = {MD5_1 [,MD5_2 [ ... ,MD5_n]]} truncated to len(data) The first MD5 hash is generated by concatenating the session_id, the secret key, the version number and the sequence number and then running MD5 over that stream. All of those input values are available in the packet header, except for the secret key which is a shared secret between the TACACS+ client and server. Subsequent hashes are generated by using the same input stream, but concatenating the previous hash value at the end of the input stream. MD5_1 = MD5{session_id, key, version, seq_no} MD5_2 = MD5{session_id, key, version, seq_no, MD5_1} .... 
MD5_n = MD5{session_id, key, version, seq_no, MD5_n-1} :param header: a TACACSHeader object :param body_bytes: packed bytes, i.e., `struct.pack(...)` :param secret: a key used to encrypt/obfuscate packets according to the TACACS+ spec :return: packed bytes, i.e., `struct.pack(...)` representing the obfuscated packet body def __init__(self, header, body_bytes, secret): self.header = header<EOL>self.body_bytes = body_bytes<EOL>self.secret = secret<EOL> :param header: a TACACSHeader object :param body_bytes: packed bytes, i.e., `struct.pack(...)` :param secret: a key used to encrypt/obfuscate packets according to the TACACS+ spec def __init__(self, host, port, secret, timeout=<NUM_LIT:10>, session_id=None,<EOL>family=socket.AF_INET, version_max=TAC_PLUS_MAJOR_VER,<EOL>version_min=TAC_PLUS_MINOR_VER): self._sock = None<EOL>self.host = host<EOL>self.port = port<EOL>self.secret = secret<EOL>self.timeout = timeout<EOL>self.version_max = version_max<EOL>self.version_min = version_min<EOL>self.family = family<EOL>self.session_id = session_id or random.randint(<NUM_LIT:1>, <NUM_LIT:2> ** <NUM_LIT:32> - <NUM_LIT:1>)<EOL> :param host: hostname of the TACACS+ server :param port: port of the TACACS+ server, generally 49 :param secret: the secret key used to obfuscate packet bodies; can be `None` to disable packet body obfuscation :param session_id: a unique 32-bit int representing the session; if left empty, one will be auto-generated :param version_max: TACACS+ major version number, 12 :param version_min: TACACS+ minor version number, 0 or 1 def send(self, body, req_type, seq_no=<NUM_LIT:1>): <EOL>header = TACACSHeader(<EOL>self.version,<EOL>req_type,<EOL>self.session_id,<EOL>len(body.packed),<EOL>seq_no=seq_no<EOL>)<EOL>packet = TACACSPacket(header, body.packed, self.secret)<EOL>logger.debug('<STR_LIT:\n>'.join([<EOL>body.__class__.__name__,<EOL>'<STR_LIT>' % header,<EOL>'<STR_LIT>' % body,<EOL>]))<EOL>self.sock.send(bytes(packet))<EOL>readable, _, _ = select.select([self.sock], [], [], self.timeout)<EOL>if readable:<EOL><INDENT>header_bytes = self.sock.recv(<NUM_LIT:12>)<EOL>resp_header = TACACSHeader.unpacked(header_bytes)<EOL>if any([<EOL>resp_header.version_max != header.version_max,<EOL>resp_header.type != header.type,<EOL>resp_header.session_id != header.session_id<EOL>]):<EOL><INDENT>logger.error('<STR_LIT:\n>'.join([<EOL>resp_header.__class__.__name__,<EOL>'<STR_LIT>' % resp_header,<EOL>str(resp_header.packed)<EOL>]))<EOL>raise socket.error<EOL><DEDENT>body_bytes = six.b('<STR_LIT>')<EOL>remaining = resp_header.length<EOL>while remaining > <NUM_LIT:0>:<EOL><INDENT>body_bytes += self.sock.recv(remaining)<EOL>remaining = resp_header.length - len(body_bytes)<EOL><DEDENT>return TACACSPacket(<EOL>resp_header,<EOL>body_bytes,<EOL>self.secret<EOL>)<EOL><DEDENT>raise socket.timeout<EOL> Send a TACACS+ message body :param body: packed bytes, i.e., `struct.pack(...)` :param req_type: TAC_PLUS_AUTHEN, TAC_PLUS_AUTHOR, TAC_PLUS_ACCT :param seq_no: The sequence number of the current packet. The first packet in a session MUST have the sequence number 1 and each subsequent packet will increment the sequence number by one. Thus clients only send packets containing odd sequence numbers, and TACACS+ servers only send packets containing even sequence numbers. 
TACACSClient.authenticate(username, password, priv_lvl=TAC_PLUS_PRIV_LVL_MIN, authen_type=TAC_PLUS_AUTHEN_TYPE_ASCII, chap_ppp_id=None, chap_challenge=None, rem_addr=TAC_PLUS_VIRTUAL_REM_ADDR, port=TAC_PLUS_VIRTUAL_PORT): authenticate to a TACACS+ server with a username and password. authen_type may be TAC_PLUS_AUTHEN_TYPE_ASCII, TAC_PLUS_AUTHEN_TYPE_PAP or TAC_PLUS_AUTHEN_TYPE_CHAP. For PAP the password itself is sent as the start data; for CHAP the start data is the single-character chap_ppp_id, followed by the chap_challenge (at most 255 bytes), followed by MD5(chap_ppp_id + password + chap_challenge). For ASCII authentication a continue packet carrying the password is sent when the server asks for it, and an abort flag in the reply marks the authentication as failed. rem_addr is the AAA request source and port the AAA port. Returns a TACACSAuthenticationReply; raises socket.timeout or socket.error.

TACACSClient.authorize(username, arguments=[], authen_type=TAC_PLUS_AUTHEN_TYPE_ASCII, priv_lvl=TAC_PLUS_PRIV_LVL_MIN, rem_addr=TAC_PLUS_VIRTUAL_REM_ADDR, port=TAC_PLUS_VIRTUAL_PORT): authorize with a TACACS+ server. arguments are the authorization arguments and priv_lvl the minimal required privilege level; if the privilege level returned by the server is lower than the one requested, the reply status is set to TAC_PLUS_AUTHOR_STATUS_FAIL. Returns the authorization reply; raises socket.timeout or socket.error.

TACACSClient.account(username, flags, arguments=[], authen_type=TAC_PLUS_AUTHEN_TYPE_ASCII, priv_lvl=TAC_PLUS_PRIV_LVL_MIN, rem_addr=TAC_PLUS_VIRTUAL_REM_ADDR, port=TAC_PLUS_VIRTUAL_PORT): account with a TACACS+ server. flags is one of TAC_PLUS_ACCT_FLAG_START, TAC_PLUS_ACCT_FLAG_WATCHDOG or TAC_PLUS_ACCT_FLAG_STOP; arguments are the accounting arguments and priv_lvl the minimal required privilege level; rem_addr is the AAA request source and port the AAA port. Returns a TACACSAccountingReply; raises socket.timeout or socket.error.
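The CHAP start data described for authenticate() can be shown in a few lines. This is an illustrative helper, not part of the client; the validation mirrors the constraints listed above and the encoding is simplified.

```python
import hashlib

def chap_start_data(ppp_id, password, challenge):
    """Build the CHAP start payload: ppp_id + challenge + MD5(ppp_id + password + challenge)."""
    if len(ppp_id) != 1:
        raise ValueError("chap_ppp_id must be a single character")
    if len(challenge) > 255:
        raise ValueError("chap_challenge must be at most 255 characters")
    digest = hashlib.md5((ppp_id + password + challenge).encode()).digest()
    return ppp_id.encode() + challenge.encode() + digest
```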
get_or_create(model, **spec): get a matching instance or create one; model is the Model class, the lookup values are used for finding an instance and the create data for creating an instance if none could be found.

create_order(debtor, is_vat_included=True, due_date=None, heading='', text_line1='', text_line2='', debtor_data=None, delivery_data=None, products=None, project=None, other_reference='', model=models.Order, **extra): create a new Order. debtor is the Debtor of the order; debtor_data and delivery_data are mappings (postal code, city, EAN, address and so on) that default to the values on the debtor instance for missing keys; due_date is a datetime; heading, text_line1 and text_line2 are displayed in the order PDF; other_reference is a custom identification string; extra carries any further values passed to the server call. Missing values such as our_reference, currency, layout and term_of_payment are taken from the debtor, and an order line is created for each product given. Returns the Order instance.

build_model_code(client): generate source code for e-conomic models based on a WSDL connection. This relies on the assumption that the API follows a specific method naming convention (Model_Action); not all models and attributes have been tested, and the generated source mostly serves readability and IDE auto-completion. Returns source code for models.py.

__find_handles(model, **spec): find model instances matching the given filter, where model is a subclass of EConomicsModel and spec a mapping of values to filter by. The filter is based on the available server calls, so some values may not be filterable, and multiple filter values lead to multiple server calls; for complex filters on small datasets it can be faster to fetch everything and filter in memory. An empty filter fetches all handles.

get(model, **spec): get a single model instance; raises MultipleObjectsReturned if more than one handle matches and ObjectDoesNotExist if none does.

MetricGroupAdmin.has_add_permission(request): hides the add-metric link in the admin.

MetricGroupAdmin.get_queryset(request): shows one entry per distinct metric name.

MetricGroupAdmin.save_model(request, obj, form, change): updates all metrics with the same name.

generate_sample_data(point_numbers, interval): generates sample data and populates the database; point_numbers is the number of values per metric and interval the number of minutes between results.

Dashboard view: shows the latest results for each source, grouping metrics by source and listing metrics without a source separately.

Transform.replace_variable(variable): substitute variables with numeric values; 'x' is replaced by the metric value and 't' by the time delta, and any other variable raises ValueError.

Expression evaluation: the transform expression is parsed with ast and evaluated to produce the result.
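A small sketch of the x/t substitution and AST-based evaluation described above. It is a simplified stand-in for the Transform helper, using only the standard library and supporting basic arithmetic.

```python
import ast
import operator

_OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
        ast.Mult: operator.mul, ast.Div: operator.truediv}

def evaluate_transform(expr, value, timedelta_seconds):
    """Evaluate an arithmetic expression where x is the latest value and t the seconds between updates."""
    def _eval(node):
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.left), _eval(node.right))
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.Name) and node.id == "x":
            return value
        if isinstance(node, ast.Name) and node.id == "t":
            return timedelta_seconds
        raise ValueError("unsupported expression element")
    return _eval(ast.parse(expr, mode="eval").body)

# e.g. convert a counter delta into a per-second rate
print(evaluate_transform("x / t", 300, 60))  # 5.0
```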
Metric._reset_changes(): stores the current values (for example last_updated) in self._original for comparison later.

Metric.whisper_filename (property): builds a file path to the Whisper database from the source name and the metric name.

Metric.get_or_create_archive(): gets a Whisper DB instance, creating it if it doesn't exist.

Metric.load_archive(from_date, to_date=None): loads historical data from the Whisper database.

Metric.get_value_display(): human-friendly value output; depending on display_as, the latest value is rendered as a percentage, a boolean, a file size or a formatted time, and otherwise returned as-is.

Metric.time_between_updates(): the time between the current last_updated and the previous one (0 if there is no previous value).

Transform hook: applies the transformation expression (if one is set) to latest_value, substituting the value and the seconds between updates; a TypeError or ValueError during evaluation is logged as a warning.

Metric.do_counter_conversion(): for counter metrics, replaces the latest value with the difference between it and the previously seen counter value.

WhisperDatabase create: creates the Whisper file on disk, creating settings.SALMON_WHISPER_DB_PATH if needed and building the archive list from the retention definitions in settings.ARCHIVES with the configured xFilesFactor and aggregation method.

WhisperDatabase._update(datapoints): stores the datapoints in the current database; datapoints is a list of tuples of epoch timestamp and value, for example [(1368977629, 10)]; a single point uses whisper.update and several points use whisper.update_many.

WhisperDatabase.fetch(from_time, until_time=None): fetches data for the given period, with until_time defaulting to now; whisper.fetch returns a (timeInfo, valueList) pair where timeInfo is (fromTime, untilTime, step), and the method returns the values zipped with their timestamps, or None if no data can be returned.
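A minimal sketch of storing and reading points with Graphite's whisper package, which the database wrapper above builds on; the retention definition and file name here are illustrative.

```python
import os
import time
import whisper  # Graphite's whisper package; assumed installed (pip install whisper)

def store_and_fetch(path, datapoints):
    """Create the archive if needed, store (timestamp, value) pairs, read the last hour back."""
    if not os.path.exists(path):
        # 60-second points kept for one day: an illustrative retention policy
        whisper.create(path, [whisper.parseRetentionDef("60:1440")])
    whisper.update_many(path, datapoints)
    (start, end, step), values = whisper.fetch(path, int(time.time()) - 3600)
    return list(zip(range(start, end, step), values))

now = int(time.time())
print(store_and_fetch("example.wsp", [(now - 120, 10), (now - 60, 12)]))
```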
Settings bootstrap: run when the default settings path does not exist or when init is invoked; it reads the bundled configuration template, prompts for the site URL (falling back to a default), generates random base64-encoded secret and API keys, and returns the rendered settings text to put into the user's settings file.

configure_app(**kwargs): builds up the settings using the same method as logan, parsing a --config option from the command line and passing the config path on to logan's configure call.

netrc dump: dumps the class data in the format of a .netrc file, emitting machine, login, account and password entries for every host plus any macro definitions.

Version lookup: attempts to find the version number in the named file by scanning it line by line with a regular expression; raises RuntimeError if no version is found.
ChrisApp constructor: initializes the underlying ArgumentParser with the app's DESCRIPTION and registers the built-in arguments: flags for printing and saving the app's JSON representation, the input directory (for 'ds' type apps) and output directory, options to save input and output meta data, and flags for the version, the app meta data, the number of threads and the man page; finally it calls define_parameters().

ChrisApp.path(string) (static method): defines the 'path' data type that can be used by apps; raises ArgumentTypeError if the path does not exist, otherwise returns it unchanged.

Man page hook: show the app's man page (abstract method in this class).

ChrisApp.define_parameters(): define the parameters used by this app (abstract method in this class).

ChrisApp.run(options): execute this app (abstract method in this class).

ChrisApp.add_argument(*args, **kwargs): add a parameter to this app. Unless the argument is a plain help action, the call must provide the parameter name, its type and whether it is optional, and optional parameters must also declare a default. Only str, int, float, bool and ChrisApp.path types are accepted; bool parameters become store_true/store_false actions. The parameter description is recorded in the app's parameter list and then forwarded to ArgumentParser.add_argument.

ChrisApp.get_json_representation(): return a JSON object representing the app: its type, parameters, icon, authors, title, category, description, documentation, license, version, executable information and the worker, memory, CPU and GPU limits.

ChrisApp.save_json_representation(dir_path): save the app's JSON representation to a JSON file named after the class.

ChrisApp.launch(args=None): triggers the parsing of arguments; optionally saves the input meta data, loads options from an input meta file if one is given, runs the app and optionally saves the output meta data.

ChrisApp.get_options_from_file(file_path): return the options parsed from a JSON file.

ChrisApp.save_input_meta() and save_output_meta(): save the input meta data (the options passed to the app) and the descriptive output meta data to JSON files in the output directory.

ChrisApp.load_output_meta(): load descriptive output meta data from a JSON file in the input directory.

Version hook: return the app's version.

ChrisApp.print_app_meta_data(): print the app's meta data (all upper-case class attributes and their values).

ChrisApp.error(message): error handler for wrong command-line arguments; prints the message and the help text and exits with status 2.
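To make the plugin contract above concrete, here is a hypothetical subclass. It assumes the base class is importable as chrisapp.base.ChrisApp and that add_argument accepts dest/type/optional/default keywords as sketched above; the class attributes and the file pattern are illustrative.

```python
import glob
import os

from chrisapp.base import ChrisApp  # assumed import path for the base class

class WordCountApp(ChrisApp):
    """Hypothetical plugin: counts words across the files in the input directory."""
    TITLE = 'Word count'
    TYPE = 'ds'
    VERSION = '0.1'

    def define_parameters(self):
        # Optional parameters must declare a default, per add_argument() above.
        self.add_argument('--pattern', dest='pattern', type=str, optional=True,
                          default='*.txt', help='glob of files to count')

    def run(self, options):
        total = 0
        for path in glob.glob(os.path.join(options.inputdir, options.pattern)):
            with open(path) as f:
                total += len(f.read().split())
        with open(os.path.join(options.outputdir, 'count.txt'), 'w') as out:
            out.write(str(total))
```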
find_tarball(directory, name, version): return the matching tarball filename from the dist/ directory, if found. Setuptools generates a source distribution in dist/ and the exact filename has to be located, whether .tgz or .zip; the expected name is name + '-' + version + '.tar.gz', but a -dev.r1234.tar.gz can also appear as that can be configured in a setup.cfg. If several candidates match, the first one is used with a warning; if none matches, an error is logged and nothing is returned.

checkout_dirs(): return the directories inside the base directory.

CheckoutDir.missing_tags(existing_sdists=None): return the difference between existing sdists and the available tags. Pre-release tags are skipped with a warning, and the scan stops at the newest tag that already has an sdist; the remaining tags are returned oldest-first.

CheckoutDir.create_sdist(tag): create an sdist and return the full file path of the .tar.gz; the tag is checked out, the sdist is built with the current Python interpreter, and the resulting tarball is located with find_tarball.

CheckoutDir.cleanup(): clean up the temporary tag checkout directory and return to the start directory.

main(): bin/tags2sdists entry point; create an sdist for a directory of checkouts. It parses the checkouts directory and the sdists directory from the command line (with verbose and quiet options controlling the log level), then walks every checkout directory and builds the sdists that are missing from the package directory.

PackageDir.__init__(root_directory): initialize with the root of the packages directory.

PackageDir.parse(): iterate through the directory and extract package/version information from the existing tarball names.

PackageDir.add_tarball(tarball, package): add a tarball, possibly creating the target directory if needed; a missing tarball is logged as an error.

command(cmd): execute a command and raise SdistCreationError upon an error, returning the command's output. For example, 'README' in command('ls') is True, while command('nonexistingcommand') raises SdistCreationError.

A small debugging helper prints sys.path and the entries of sys.path_importer_cache.

_resolve_name(name, package, level): resolve a relative module name to an absolute one by splitting the package on the last level - 1 dots and prepending the resulting base to the name.

resolve_name(name, package): resolve a relative module name to an absolute one; names without a leading dot are returned unchanged, a relative name without a package raises ValueError, and otherwise the number of leading dots determines how many package levels to climb.

_find_spec_from_path(name, path=None) and find_spec(name, package=None): return the spec for the specified module. sys.modules is checked first; if the module was already imported, sys.modules[name].__spec__ is returned, and a spec of None raises ValueError. If the module is not in sys.modules, sys.meta_path is searched for a suitable spec with the given value of path. None is returned if no spec could be found. With _find_spec_from_path, dotted names do not have their parent packages implicitly imported, so parent packages may need to be imported explicitly in the proper order for a submodule to get the correct spec; find_spec, by contrast, automatically imports the parent package of a dotted name, and its name and package arguments work like importlib.import_module(), so relative module names with leading dots are accepted.

set_package (deprecated): decorator that sets __package__ on the returned module when it is missing.

set_loader (deprecated): decorator that sets __loader__ on the returned module when it is missing.

module_for_loader(fxn) (deprecated): decorator to handle selecting the proper module for loaders. The decorated function is passed the module to use instead of the module name; the module comes from sys.modules if it already exists, otherwise a new module is created with __name__ set to the requested name, __loader__ set to self and __package__ set accordingly (using self.is_package() when it works, otherwise post-load). If an exception is raised and the decorator created the module, it is removed from sys.modules again. The decorator assumes the decorated function takes the module name as its second argument.

all_suffixes(): returns a list of all recognized module suffixes for this process (source, bytecode and extension suffixes).
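The relative-name resolution described above for resolve_name is compact enough to restate directly. This sketch mirrors the behaviour of importlib.util.resolve_name rather than reproducing the exact code.

```python
def resolve_name(name, package):
    """Resolve a relative module name (leading dots) against `package`."""
    if not name.startswith('.'):
        return name
    if not package:
        raise ValueError('no package specified for relative import of %r' % name)
    level = len(name) - len(name.lstrip('.'))   # number of leading dots
    bits = package.rsplit('.', level - 1)       # climb level - 1 package levels
    if len(bits) < level:
        raise ValueError('attempted relative import beyond top-level package')
    base = bits[0]
    remainder = name[level:]
    return '{}.{}'.format(base, remainder) if remainder else base

print(resolve_name('..sibling', 'pkg.sub.mod'))  # -> 'pkg.sub.sibling'
```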
https://huggingface.co/sanjeev3
Sanjeev Jagtap sanjeev3 sanjeev3 Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/collections/microsoft/speecht5-650995fc647a3ea442cc6c7b
microsoft's Collections SpeechT5 SpeechT5 updated 14 days ago The SpeechT5 framework consists of a shared seq2seq encoder-decoder and six modal-specific (speech/text) pre/post-nets that can address a few audio-related tasks. Upvote 1 SpeechT5: Unified-Modal Encoder-Decoder Pre-Training for Spoken Language Processing Paper β€’ 2110.07205 β€’ Published Oct 14, 2021 β€’ 1 microsoft/speecht5_tts Text-to-Speech β€’ Updated Aug 25 β€’ 53.2k β€’ 273 Note Text-to-speech version of SpeechT5 Running on T4 182 πŸ‘©β€πŸŽ€ SpeechT5 Speech Synthesis Demo microsoft/speecht5_vc Audio-to-Audio β€’ Updated Mar 22 β€’ 16.3k β€’ 36 Note Voice-conversion version of SpeechT5 81 πŸ‘©β€πŸŽ€ SpeechT5 Voice Conversion Demo microsoft/speecht5_asr Automatic Speech Recognition β€’ Updated Mar 22 β€’ 3.55k β€’ 18 Note Automatic-speech-recognition version of SpeechT5 32 πŸ‘©β€πŸŽ€ SpeechT5 Speech Recognition Demo microsoft/speecht5_hifigan Updated Feb 2 β€’ 72.2k β€’ 10 Note SpeechT5 produces a spectrogram; this model converts it to a waveform Upvote 1 Collection guide
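A short usage sketch for the voice-conversion checkpoint in this collection, based on the transformers SpeechT5 classes; the input file name is illustrative and the zero speaker embedding is a placeholder for a real 512-dimensional x-vector of the target voice.

```python
import soundfile as sf
import torch
from transformers import SpeechT5ForSpeechToSpeech, SpeechT5HifiGan, SpeechT5Processor

processor = SpeechT5Processor.from_pretrained("microsoft/speecht5_vc")
model = SpeechT5ForSpeechToSpeech.from_pretrained("microsoft/speecht5_vc")
vocoder = SpeechT5HifiGan.from_pretrained("microsoft/speecht5_hifigan")

# Mono 16 kHz input speech (illustrative file name)
speech, sampling_rate = sf.read("input_16khz.wav")
inputs = processor(audio=speech, sampling_rate=16000, return_tensors="pt")

# Placeholder speaker embedding; a real x-vector gives a real target voice
speaker_embeddings = torch.zeros((1, 512))

converted = model.generate_speech(inputs["input_values"], speaker_embeddings, vocoder=vocoder)
sf.write("converted.wav", converted.numpy(), samplerate=16000)
```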
https://huggingface.co/bsmock
1 Brandon Smock bsmock bsmock Research interests None yet Organizations models 4 bsmock/TATR-v1.1-Pub Updated Aug 22 bsmock/TATR-v1.1-All Updated Aug 22 β€’ 2 bsmock/TATR-v1.1-Fin Updated Aug 22 β€’ 2 bsmock/tatr-pubtables1m-v1.0 Updated Jun 15 β€’ 1 datasets 4 bsmock/FinTabNet.c Viewer β€’ Updated 26 days ago bsmock/ICDAR-2013.c Viewer β€’ Updated 26 days ago bsmock/ICDAR-2013-Table-Competition-Corrected Viewer β€’ Updated 29 days ago bsmock/pubtables-1m Viewer β€’ Updated Aug 8 β€’ 29 β€’ 16
https://huggingface.co/sakulka
Sandip Kulkarni sakulka Research interests Reinforcement Learning, Control Theory, Systems Organizations
https://huggingface.co/chenglong-ms
Chenglong Wang chenglong-ms Research interests None yet Organizations Papers 1 arxiv:2306.09896 models None public yet datasets None public yet
https://huggingface.co/jwilcox
Jeff Wilcox jwilcox Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/jennifermarsman
Jennifer Marsman jennifermarsman jennifermarsman jennifermarsman Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/Eitamar
Saraf Eitamar Research interests NLP, CV Organizations models None public yet datasets None public yet
https://huggingface.co/nkishan
Neel Kishan nkishan Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/millicentochieng
Millicent Ochieng millicentochieng millicentochieng Research interests NLP, ML, AI Organizations models 1 millicentochieng/en-fake-news-classifier Updated Mar 18 datasets None public yet
https://huggingface.co/gugarosa
29 3 1 Gustavo de Rosa gugarosa gugarosa Research interests None yet Organizations Papers 1 arxiv:2306.11644 spaces 2 No application file πŸ”₯ Majority Voting Runtime error 🌍 Codegen The Stack Inference models None public yet datasets None public yet
https://huggingface.co/shengz
6 8 2 Sheng Zhang shengz https://sheng-z.github.io/ sheng_zh sheng-z Research interests None yet Organizations Papers 4 arxiv:2303.00915 arxiv:2306.00890 arxiv:2308.02180 arxiv:2308.03279 models None public yet datasets None public yet
https://huggingface.co/ancientmooner
Han Hu ancientmooner https://ancientmooner.github.io/ ancientmooner Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/Inhenn
Yinheng Li Inhenn Inhenn Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/impiga
1 Yutong Lin impiga impiga Research interests None yet Organizations Papers 1 arxiv:2308.01904 models None public yet datasets None public yet
https://huggingface.co/Ishmam
zabir Ishmam Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/zeliu98
Ze Liu zeliu98 https://zeliu98.github.io/ zeliu98 Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/papers/2110.07205
Published on Oct 14, 2021 Abstract Motivated by the success of T5 (Text-To-Text Transfer Transformer) in pre-trained natural language processing models, we propose a unified-modal SpeechT5 framework that explores the encoder-decoder pre-training for self-supervised speech/text representation learning. The SpeechT5 framework consists of a shared encoder-decoder network and six modal-specific (speech/text) pre/post-nets. After preprocessing the input speech/text through the pre-nets, the shared encoder-decoder network models the sequence-to-sequence transformation, and then the post-nets generate the output in the speech/text modality based on the output of the decoder. Leveraging large-scale unlabeled speech and text data, we pre-train SpeechT5 to learn a unified-modal representation, hoping to improve the modeling capability for both speech and text. To align the textual and speech information into this unified semantic space, we propose a cross-modal vector quantization approach that randomly mixes up speech/text states with latent units as the interface between encoder and decoder. Extensive evaluations show the superiority of the proposed SpeechT5 framework on a wide variety of spoken language processing tasks, including automatic speech recognition, speech synthesis, speech translation, voice conversion, speech enhancement, and speaker identification. We release our code and model at https://github.com/microsoft/SpeechT5. Community Models citing this paper 4 Datasets citing this paper 0 No dataset linking this paper Cite arxiv.org/abs/2110.07205 in a dataset README.md to link it from this page. Spaces citing this paper 283
https://huggingface.co/altorremsft
Alyssa Torres altorremsft Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/bs1
B S bs1 Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/ARahul
Rahul ARahul Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/kleinhu
Klein Hu kleinhu Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/nileshdattani
Nilesh Dattani nileshdattani Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/shruthib
Shruthi Bannur shruthib Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/PrithviMicrosoft
1 Prithvishankar Srinivasan PrithviMicrosoft Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/ozanoktay
1 3 Ozan Oktay ozanoktay Research interests None yet Organizations Papers 2 arxiv:2204.09817 arxiv:2301.04558 models None public yet datasets None public yet
https://huggingface.co/sthyland
Stephanie sthyland https://sthy.land _hylandSL corcra Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/rogerpengyu
Roger (Peng) Yu rogerpengyu rogerpengyu Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/fosoromo
Foromo Daniel Soromou fosoromo grandfather1 Research interests NLP Organizations models None public yet datasets None public yet
https://huggingface.co/avikram
Ayush Vikram avikram avikram2 Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/nvalluri
1 Naveen Valluri nvalluri Research interests None yet Organizations Papers 1 arxiv:2307.06439 models None public yet datasets None public yet
https://huggingface.co/henryz
henryzeng henryz Research interests LLM, AIGC, ML Platform, Prompt Engineering, product manager, LLM product design and dev Organizations spaces 3
https://huggingface.co/Jiahang
6 Jiahang Xu Jiahang JiahangXu Research interests None yet Organizations Papers 1 arxiv:2306.14393 spaces 1 Runtime error πŸŒ– Latency Prediction by nn-Meter models None public yet datasets None public yet
https://huggingface.co/fepegar
Fernando PΓ©rez-GarcΓ­a fepegar Research interests Medical image computing Organizations Papers 1 models 1
https://huggingface.co/BSlininger
Brian Slininger BSlininger Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/Nancc
8 Nan Chen Nancc Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/yufan
44 yufan zhao yufan Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/test233
WJ test233 Research interests NLP Organizations models None public yet datasets None public yet
https://huggingface.co/intfloat
Liang Wang intfloat Research interests natural language processing Organizations models 17 datasets 6
https://huggingface.co/gavrilo
1 Gavrilo Andrić gavrilo gavrilo91 Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/VenerableSpace
Farrukh Rahman VenerableSpace Research interests low labeled settings (few shot, semi-supervised), Fine grained tasks, Representation learning, Video Understanding, Anomaly Detection, Organizations Papers 1
https://huggingface.co/saghar-h
Saghar Hosseini saghar-h sagharh Saghar-Hosseini Research interests NLP, Fairness, Natural Language Safety Organizations models None public yet datasets None public yet
https://huggingface.co/shubhamv199
Shubham Verma shubhamv199 shubhamv199 Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/t-aakritilnu
Aakriti Lnu t-aakritilnu Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/amka66
Amir Kantor amka66 Research interests machine learning, artificial intelligence, natural language processing, formal methods, formal mathematics Organizations
https://huggingface.co/pavelkliuiev
Pavel Kliuiev pavelkliuiev Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/knxylan
Su He knxylan Research interests None yet Organizations models 2 knxylan/sumodel Updated Jul 25, 2022 knxylan/Su Updated Jul 25, 2022 datasets None public yet
https://huggingface.co/netoront
Neil Toronto netoront netoront Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/vishalsood
Vishal Sood vishalsood vishalsood vishalsood Research interests Multi-modal, computer vision Organizations models None public yet datasets None public yet
https://huggingface.co/jmagosta
John-Mark Agosta jmagosta jmagosta Research interests Bayesian methods Organizations models None public yet datasets None public yet
https://huggingface.co/shikharmn
Mohan shikharmn shikharmn Research interests Extreme Classification, retrieval Organizations models None public yet datasets None public yet
https://huggingface.co/Pokennat
Wang Pokennat Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/danielsc
Daniel Schneider danielsc danielsc Research interests None yet Organizations models 1 danielsc/bert_test Automatic Speech Recognition β€’ Updated Nov 16, 2022 β€’ 3 datasets None public yet
https://huggingface.co/ssushant
Sushant Srivastava ssushant sgunadhya Research interests None yet Organizations models None public yet datasets None public yet
https://huggingface.co/jeffra
1 Jeff Rasley jeffra jeffra45 jeffra Research interests None yet Organizations Papers 1 arxiv:2308.01320 models None public yet datasets None public yet
https://huggingface.co/gok
Govind K gok t2govind thegovind Research interests None yet Organizations models None public yet datasets None public yet