repository_name (stringlengths 7-55) | func_path_in_repository (stringlengths 4-223) | func_name (stringlengths 1-134) | whole_func_string (stringlengths 75-104k) | language (stringclasses, 1 value) | func_code_string (stringlengths 75-104k) | func_code_tokens (sequencelengths 19-28.4k) | func_documentation_string (stringlengths 1-46.9k) | func_documentation_tokens (sequencelengths 1-1.97k) | split_name (stringclasses, 1 value) | func_code_url (stringlengths 87-315) |
---|---|---|---|---|---|---|---|---|---|---|
InfoAgeTech/django-core | django_core/utils/random_utils.py | random_alphanum | def random_alphanum(length=10, lower_only=False):
"""
Gets a random alphanumeric value using both letters and numbers.
:param length: size of the random alphanumeric string.
:param lower_only: boolean indicating if only lower case letters should be
used.
:return: alphanumeric string size of length
This function uses all numbers except for:
* 0
* 1
and uses all letters except for:
* lower case "l" (el)
* lower and upper case "o" and "O" (oh)
For upper and lower cased letters...
------------------------------------
Upper and lower cased letters and numbers can be used more than once which
leaves the possible combinations as follows:
8 numbers used + 49 letters used (upper and lower) = 57 total characters
Which leads us to the following equation:
57 total characters ^ length = total possible combinations
The following total possible combinations are below for a given length:
57 ^ 1 = 57
57 ^ 2 = 3,249
57 ^ 3 = 185,193
57 ^ 4 = 10,556,001
57 ^ 5 = 601,692,057
57 ^ 6 = 34,296,447,249
57 ^ 7 = 1,954,897,493,193
57 ^ 8 = 111,429,157,112,001
57 ^ 9 = 6,351,461,955,384,057
57 ^ 10 = 362,033,331,456,891,249
...
For lower cased letters...
--------------------------
Lower cased letters and numbers can be used more than once which leaves the
possible combinations as follows:
8 numbers used + 24 letters used (lower only) = 32 total characters
Which leads us to the following equation:
32 total characters ^ length = total possible combinations
The following total possible combinations are below for a given length:
32 ^ 1 = 32
32 ^ 2 = 1,024
32 ^ 3 = 32,768
32 ^ 4 = 1,048,576
32 ^ 5 = 33,554,432
32 ^ 6 = 1,073,741,824
32 ^ 7 = 34,359,738,368
32 ^ 8 = 1,099,511,627,776
32 ^ 9 = 35,184,372,088,832
32 ^ 10 = 1,125,899,906,842,624
...
"""
character_set = ALPHANUM_LOWER if lower_only else ALPHANUM
sample_size = 5
chars = random.sample(character_set, sample_size)
while len(chars) < length:
chars += random.sample(character_set, sample_size)
random.shuffle(chars)
return ''.join(chars[:length]) | python | def random_alphanum(length=10, lower_only=False):
"""
Gets a random alphanumeric value using both letters and numbers.
:param length: size of the random alphanumeric string.
:param lower_only: boolean indicating if only lower case letters should be
used.
:return: alphanumeric string size of length
This function uses all numbers except for:
* 0
* 1
and uses all letters except for:
* lower case "l" (el)
* lower and upper case "o" and "O" (oh)
For upper and lower cased letters...
------------------------------------
Upper and lower cased letters and numbers can be used more than once which
leaves the possible combinations as follows:
8 numbers used + 49 letters used (upper and lower) = 57 total characters
Which leads us to the following equation:
57 total characters ^ length = total possible combinations
The following total possible combinations are below for a given length:
57 ^ 1 = 57
57 ^ 2 = 3,249
57 ^ 3 = 185,193
57 ^ 4 = 10,556,001
57 ^ 5 = 601,692,057
57 ^ 6 = 34,296,447,249
57 ^ 7 = 1,954,897,493,193
57 ^ 8 = 111,429,157,112,001
57 ^ 9 = 6,351,461,955,384,057
57 ^ 10 = 362,033,331,456,891,249
...
For lower cased letters...
--------------------------
Lower cased letters and numbers can be used more than once which leaves the
possible combinations as follows:
8 numbers used + 24 letters used (lower only) = 32 total characters
Which leads us to the following equation:
32 total characters ^ length = total possible combinations
The following total possible combinations are below for a given length:
32 ^ 1 = 32
32 ^ 2 = 1,024
32 ^ 3 = 32,768
32 ^ 4 = 1,048,576
32 ^ 5 = 33,554,432
32 ^ 6 = 1,073,741,824
32 ^ 7 = 34,359,738,368
32 ^ 8 = 1,099,511,627,776
32 ^ 9 = 35,184,372,088,832
32 ^ 10 = 1,125,899,906,842,624
...
"""
character_set = ALPHANUM_LOWER if lower_only else ALPHANUM
sample_size = 5
chars = random.sample(character_set, sample_size)
while len(chars) < length:
chars += random.sample(character_set, sample_size)
random.shuffle(chars)
return ''.join(chars[:length]) | [
"def",
"random_alphanum",
"(",
"length",
"=",
"10",
",",
"lower_only",
"=",
"False",
")",
":",
"character_set",
"=",
"ALPHANUM_LOWER",
"if",
"lower_only",
"else",
"ALPHANUM",
"sample_size",
"=",
"5",
"chars",
"=",
"random",
".",
"sample",
"(",
"character_set",
",",
"sample_size",
")",
"while",
"len",
"(",
"chars",
")",
"<",
"length",
":",
"chars",
"+=",
"random",
".",
"sample",
"(",
"character_set",
",",
"sample_size",
")",
"random",
".",
"shuffle",
"(",
"chars",
")",
"return",
"''",
".",
"join",
"(",
"chars",
"[",
":",
"length",
"]",
")"
] | Gets a random alphanumeric value using both letters and numbers.
:param length: size of the random alphanumeric string.
:param lower_only: boolean indicating if only lower case letters should be
used.
:return: alphanumeric string size of length
This function uses all numbers except for:
* 0
* 1
and uses all letters except for:
* lower case "l" (el)
* lower and upper case "o" and "O" (oh)
For upper and lower cased letters...
------------------------------------
Upper and lower cased letters and numbers can be used more than once which
leaves the possible combinations as follows:
8 numbers used + 49 letters used (upper and lower) = 57 total characters
Which leads us to the following equation:
57 total characters ^ length = total possible combinations
The following total possible combinations are below for a given length:
57 ^ 1 = 57
57 ^ 2 = 3,249
57 ^ 3 = 185,193
57 ^ 4 = 10,556,001
57 ^ 5 = 601,692,057
57 ^ 6 = 34,296,447,249
57 ^ 7 = 1,954,897,493,193
57 ^ 8 = 111,429,157,112,001
57 ^ 9 = 6,351,461,955,384,057
57 ^ 10 = 362,033,331,456,891,249
...
For lower cased letters...
--------------------------
Lower cased letters and numbers can be used more than once which leaves the
possible combinations as follows:
8 numbers used + 24 letters used (lower only) = 32 total characters
Which leads us to the following equation:
32 total characters ^ length = total possible combinations
The following total possible combinations are below for a given length:
32 ^ 1 = 32
32 ^ 2 = 1,024
32 ^ 3 = 32,768
32 ^ 4 = 1,048,576
32 ^ 5 = 33,554,432
32 ^ 6 = 1,073,741,824
32 ^ 7 = 34,359,738,368
32 ^ 8 = 1,099,511,627,776
32 ^ 9 = 35,184,372,088,832
32 ^ 10 = 1,125,899,906,842,624
... | [
"Gets",
"a",
"random",
"alphanumeric",
"value",
"using",
"both",
"letters",
"and",
"numbers",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/utils/random_utils.py#L20-L98 |
InfoAgeTech/django-core | django_core/utils/random_utils.py | generate_key | def generate_key(low=7, high=10, lower_only=False):
"""Gets a random alphanumeric key between low and high characters in
length.
"""
return random_alphanum(length=randint(low, high), lower_only=lower_only) | python | def generate_key(low=7, high=10, lower_only=False):
"""Gets a random alphanumeric key between low and high characters in
length.
"""
return random_alphanum(length=randint(low, high), lower_only=lower_only) | [
"def",
"generate_key",
"(",
"low",
"=",
"7",
",",
"high",
"=",
"10",
",",
"lower_only",
"=",
"False",
")",
":",
"return",
"random_alphanum",
"(",
"length",
"=",
"randint",
"(",
"low",
",",
"high",
")",
",",
"lower_only",
"=",
"lower_only",
")"
] | Gets a random alphanumeric key between low and high characters in
length. | [
"Gets",
"a",
"random",
"alphanumeric",
"key",
"between",
"low",
"and",
"high",
"characters",
"in",
"length",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/utils/random_utils.py#L101-L105 |
InfoAgeTech/django-core | django_core/db/models/mixins/urls.py | AbstractUrlLinkModelMixin.get_absolute_url_link | def get_absolute_url_link(self, text=None, cls=None, icon_class=None,
**attrs):
"""Gets the html link for the object."""
if text is None:
text = self.get_link_text()
return build_link(href=self.get_absolute_url(),
text=text,
cls=cls,
icon_class=icon_class,
**attrs) | python | def get_absolute_url_link(self, text=None, cls=None, icon_class=None,
**attrs):
"""Gets the html link for the object."""
if text is None:
text = self.get_link_text()
return build_link(href=self.get_absolute_url(),
text=text,
cls=cls,
icon_class=icon_class,
**attrs) | [
"def",
"get_absolute_url_link",
"(",
"self",
",",
"text",
"=",
"None",
",",
"cls",
"=",
"None",
",",
"icon_class",
"=",
"None",
",",
"*",
"*",
"attrs",
")",
":",
"if",
"text",
"is",
"None",
":",
"text",
"=",
"self",
".",
"get_link_text",
"(",
")",
"return",
"build_link",
"(",
"href",
"=",
"self",
".",
"get_absolute_url",
"(",
")",
",",
"text",
"=",
"text",
",",
"cls",
"=",
"cls",
",",
"icon_class",
"=",
"icon_class",
",",
"*",
"*",
"attrs",
")"
] | Gets the html link for the object. | [
"Gets",
"the",
"html",
"link",
"for",
"the",
"object",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/db/models/mixins/urls.py#L26-L36 |
InfoAgeTech/django-core | django_core/db/models/mixins/urls.py | AbstractUrlLinkModelMixin.get_edit_url_link | def get_edit_url_link(self, text=None, cls=None, icon_class=None,
**attrs):
"""Gets the html edit link for the object."""
if text is None:
text = 'Edit'
return build_link(href=self.get_edit_url(),
text=text,
cls=cls,
icon_class=icon_class,
**attrs) | python | def get_edit_url_link(self, text=None, cls=None, icon_class=None,
**attrs):
"""Gets the html edit link for the object."""
if text is None:
text = 'Edit'
return build_link(href=self.get_edit_url(),
text=text,
cls=cls,
icon_class=icon_class,
**attrs) | [
"def",
"get_edit_url_link",
"(",
"self",
",",
"text",
"=",
"None",
",",
"cls",
"=",
"None",
",",
"icon_class",
"=",
"None",
",",
"*",
"*",
"attrs",
")",
":",
"if",
"text",
"is",
"None",
":",
"text",
"=",
"'Edit'",
"return",
"build_link",
"(",
"href",
"=",
"self",
".",
"get_edit_url",
"(",
")",
",",
"text",
"=",
"text",
",",
"cls",
"=",
"cls",
",",
"icon_class",
"=",
"icon_class",
",",
"*",
"*",
"attrs",
")"
] | Gets the html edit link for the object. | [
"Gets",
"the",
"html",
"edit",
"link",
"for",
"the",
"object",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/db/models/mixins/urls.py#L38-L48 |
InfoAgeTech/django-core | django_core/db/models/mixins/urls.py | AbstractUrlLinkModelMixin.get_delete_url_link | def get_delete_url_link(self, text=None, cls=None, icon_class=None,
**attrs):
"""Gets the html delete link for the object."""
if text is None:
text = 'Delete'
return build_link(href=self.get_delete_url(),
text=text,
cls=cls,
icon_class=icon_class,
**attrs) | python | def get_delete_url_link(self, text=None, cls=None, icon_class=None,
**attrs):
"""Gets the html delete link for the object."""
if text is None:
text = 'Delete'
return build_link(href=self.get_delete_url(),
text=text,
cls=cls,
icon_class=icon_class,
**attrs) | [
"def",
"get_delete_url_link",
"(",
"self",
",",
"text",
"=",
"None",
",",
"cls",
"=",
"None",
",",
"icon_class",
"=",
"None",
",",
"*",
"*",
"attrs",
")",
":",
"if",
"text",
"is",
"None",
":",
"text",
"=",
"'Delete'",
"return",
"build_link",
"(",
"href",
"=",
"self",
".",
"get_delete_url",
"(",
")",
",",
"text",
"=",
"text",
",",
"cls",
"=",
"cls",
",",
"icon_class",
"=",
"icon_class",
",",
"*",
"*",
"attrs",
")"
] | Gets the html delete link for the object. | [
"Gets",
"the",
"html",
"delete",
"link",
"for",
"the",
"object",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/db/models/mixins/urls.py#L50-L60 |
InfoAgeTech/django-core | django_core/managers.py | TokenAuthorizationManager.expire_by_email | def expire_by_email(self, email_address, **kwargs):
"""Expires tokens for an email address or email addresses.
:param email_address: the string email address or email addresses to
expire tokens for.
:param reason: the codified reason for the tokens. If explicitly set
to None, this will expire all tokens for the email provided.
"""
if not email_address:
# no email(s) provided. Nothing to do.
return None
if isinstance(email_address, (set, list, tuple)):
email_address = [e.strip() for e in set(email_address)
if e and e.strip()]
# make sure there's at least 1 valid email address
if len(email_address) <= 0:
# no valid emails
return None
kwargs['email_address__in'] = email_address
else:
kwargs['email_address'] = email_address
# try setting the reason default if one exists (in the case of proxy
# models)
if 'reason' not in kwargs and self.model.reason_default:
kwargs['reason'] = self.model.reason_default
if 'reason' in kwargs and kwargs.get('reason') is None:
# explicitly setting the reason to None will expire all tokens for
# a user regardless of the reason.
del kwargs['reason']
self.filter(**kwargs).update(expires=datetime(1970, 1, 1)) | python | def expire_by_email(self, email_address, **kwargs):
"""Expires tokens for an email address or email addresses.
:param email_address: the string email address or email addresses to
expire tokens for.
:param reason: the codified reason for the tokens. If explicitly set
to None, this will expire all tokens for the email provided.
"""
if not email_address:
# no email(s) provided. Nothing to do.
return None
if isinstance(email_address, (set, list, tuple)):
email_address = [e.strip() for e in set(email_address)
if e and e.strip()]
# make sure there's at least 1 valid email address
if len(email_address) <= 0:
# no valid emails
return None
kwargs['email_address__in'] = email_address
else:
kwargs['email_address'] = email_address
# try setting the reason default if one exists (in the case of proxy
# models)
if 'reason' not in kwargs and self.model.reason_default:
kwargs['reason'] = self.model.reason_default
if 'reason' in kwargs and kwargs.get('reason') is None:
# explicitly setting the reason to None will expire all tokens for
# a user regardless of the reason.
del kwargs['reason']
self.filter(**kwargs).update(expires=datetime(1970, 1, 1)) | [
"def",
"expire_by_email",
"(",
"self",
",",
"email_address",
",",
"*",
"*",
"kwargs",
")",
":",
"if",
"not",
"email_address",
":",
"# no email(s) provided. Nothing to do.",
"return",
"None",
"if",
"isinstance",
"(",
"email_address",
",",
"(",
"set",
",",
"list",
",",
"tuple",
")",
")",
":",
"email_address",
"=",
"[",
"e",
".",
"strip",
"(",
")",
"for",
"e",
"in",
"set",
"(",
"email_address",
")",
"if",
"e",
"and",
"e",
".",
"strip",
"(",
")",
"]",
"# make sure there's at least 1 valid email address",
"if",
"len",
"(",
"email_address",
")",
"<=",
"0",
":",
"# no valid emails",
"return",
"None",
"kwargs",
"[",
"'email_address__in'",
"]",
"=",
"email_address",
"else",
":",
"kwargs",
"[",
"'email_address'",
"]",
"=",
"email_address",
"# try setting the reason default if one exists (in the case of proxy",
"# models)",
"if",
"'reason'",
"not",
"in",
"kwargs",
"and",
"self",
".",
"model",
".",
"reason_default",
":",
"kwargs",
"[",
"'reason'",
"]",
"=",
"self",
".",
"model",
".",
"reason_default",
"if",
"'reason'",
"in",
"kwargs",
"and",
"kwargs",
".",
"get",
"(",
"'reason'",
")",
"is",
"None",
":",
"# explicitly setting the reason to None will expire all tokens for",
"# a user regardless of the reason.",
"del",
"kwargs",
"[",
"'reason'",
"]",
"self",
".",
"filter",
"(",
"*",
"*",
"kwargs",
")",
".",
"update",
"(",
"expires",
"=",
"datetime",
"(",
"1970",
",",
"1",
",",
"1",
")",
")"
] | Expires tokens for an email address or email addresses.
:param email_address: the string email address or email addresses to
expire tokens for.
:param reason: the codified reason for the tokens. If explicitly set
to None, this will expire all tokens for the email provided. | [
"Expires",
"tokens",
"for",
"an",
"email",
"address",
"or",
"email",
"addresses",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/managers.py#L12-L47 |
InfoAgeTech/django-core | django_core/views/mixins/query.py | QueryStringAliasViewMixin.map_query_string | def map_query_string(self):
"""Maps the GET query string params to the query_key_mapper dict and
updates the request's GET QueryDict with the mapped keys.
"""
if (not self.query_key_mapper or
self.request.method == 'POST'):
# Nothing to map, don't do anything.
# return self.request.POST
return {}
keys = list(self.query_key_mapper.keys())
return {self.query_key_mapper.get(k) if k in keys else k: v.strip()
for k, v in self.request.GET.items()} | python | def map_query_string(self):
"""Maps the GET query string params to the query_key_mapper dict and
updates the request's GET QueryDict with the mapped keys.
"""
if (not self.query_key_mapper or
self.request.method == 'POST'):
# Nothing to map, don't do anything.
# return self.request.POST
return {}
keys = list(self.query_key_mapper.keys())
return {self.query_key_mapper.get(k) if k in keys else k: v.strip()
for k, v in self.request.GET.items()} | [
"def",
"map_query_string",
"(",
"self",
")",
":",
"if",
"(",
"not",
"self",
".",
"query_key_mapper",
"or",
"self",
".",
"request",
".",
"method",
"==",
"'POST'",
")",
":",
"# Nothing to map, don't do anything.",
"# return self.request.POST",
"return",
"{",
"}",
"keys",
"=",
"list",
"(",
"self",
".",
"query_key_mapper",
".",
"keys",
"(",
")",
")",
"return",
"{",
"self",
".",
"query_key_mapper",
".",
"get",
"(",
"k",
")",
"if",
"k",
"in",
"keys",
"else",
"k",
":",
"v",
".",
"strip",
"(",
")",
"for",
"k",
",",
"v",
"in",
"self",
".",
"request",
".",
"GET",
".",
"items",
"(",
")",
"}"
] | Maps the GET query string params to the query_key_mapper dict and
updates the request's GET QueryDict with the mapped keys. | [
"Maps",
"the",
"GET",
"query",
"string",
"params",
"to",
"the",
"query_key_mapper",
"dict",
"and",
"updates",
"the",
"request",
"s",
"GET",
"QueryDict",
"with",
"the",
"mapped",
"keys",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/views/mixins/query.py#L40-L53 |
edx/edx-django-release-util | scripts/update_repos_version.py | bump_repos_version | def bump_repos_version(module_name, new_version, local_only):
"""
Changes the pinned version number in the requirements files of all repos
which have the specified Python module as a dependency.
This script assumes that GITHUB_TOKEN is set for GitHub authentication.
"""
# Make the cloning directory and change directories into it.
tmp_dir = tempfile.mkdtemp(dir=os.getcwd())
# Iterate through each repository.
for owner, repo_name in REPOS_TO_CHANGE:
repo_url = REPO_URL_FORMAT.format(owner, repo_name)
gh = GitHubApiUtils(owner, repo_name)
os.chdir(tmp_dir)
# Clone the repo.
ret_code = subprocess.call(['git', 'clone', '{}.git'.format(repo_url)])
if ret_code:
logging.error('Failed to clone repo {}'.format(repo_url))
continue
# Change into the cloned repo dir.
os.chdir(repo_name)
# Create a branch, using the version number.
branch_name = '{}/{}'.format(module_name, new_version)
ret_code = subprocess.call(['git', 'checkout', '-b', branch_name])
if ret_code:
logging.error('Failed to create branch in repo {}'.format(repo_url))
continue
# Search through all TXT files to find all lines with the module name, changing the pinned version.
files_changed = False
for root, _dirs, files in os.walk('.'):
for file in files:
if file.endswith('.txt') and (('requirements' in file) or ('requirements' in root)):
found = False
filepath = os.path.join(root, file)
with open(filepath) as f:
if '{}=='.format(module_name) in f.read():
found = True
if found:
files_changed = True
# Change the file in-place.
for line in fileinput.input(filepath, inplace=True):
if '{}=='.format(module_name) in line:
print '{}=={}'.format(module_name, new_version)
else:
print line,
if not files_changed:
# Module name wasn't found in the requirements files.
logging.info("Module name '{}' not found in repo {} - skipping.".format(module_name, repo_url))
continue
# Add/commit the files.
ret_code = subprocess.call(['git', 'commit', '-am', 'Updating {} requirement to version {}'.format(module_name, new_version)])
if ret_code:
logging.error("Failed to add and commit changed files to repo {}".format(repo_url))
continue
if local_only:
# For local_only, don't push the branch to the remote and create the PR - leave all changes local for review.
continue
# Push the branch.
ret_code = subprocess.call(['git', 'push', '--set-upstream', 'origin', branch_name])
if ret_code:
logging.error("Failed to push branch {} upstream for repo {}".format(branch_name, repo_url))
continue
# Create a PR with an automated message.
rollback_branch_push = False
try:
# The GitHub "mention" below does not work via the API - unfortunately...
response = gh.create_pull(
title='Change {} version.'.format(module_name),
body='Change the required version of {} to {}.\n\n@edx-ops/pipeline-team Please review and tag appropriate parties.'.format(module_name, new_version),
head=branch_name,
base='master'
)
except:
logging.error('Failed to create PR for repo {} - did you set GITHUB_TOKEN?'.format(repo_url))
rollback_branch_push = True
else:
logging.info('Created PR #{} for repo {}: {}'.format(response.number, repo_url, response.html_url))
if rollback_branch_push:
# Since the PR creation failed, delete the branch in the remote repo as well.
ret_code = subprocess.call(['git', 'push', 'origin', '--delete', branch_name])
if ret_code:
logging.error("ROLLBACK: Failed to delete upstream branch {} for repo {}".format(branch_name, repo_url))
if not local_only:
# Remove the temp directory containing all the cloned repos.
shutil.rmtree(tmp_dir) | python | def bump_repos_version(module_name, new_version, local_only):
"""
Changes the pinned version number in the requirements files of all repos
which have the specified Python module as a dependency.
This script assumes that GITHUB_TOKEN is set for GitHub authentication.
"""
# Make the cloning directory and change directories into it.
tmp_dir = tempfile.mkdtemp(dir=os.getcwd())
# Iterate through each repository.
for owner, repo_name in REPOS_TO_CHANGE:
repo_url = REPO_URL_FORMAT.format(owner, repo_name)
gh = GitHubApiUtils(owner, repo_name)
os.chdir(tmp_dir)
# Clone the repo.
ret_code = subprocess.call(['git', 'clone', '{}.git'.format(repo_url)])
if ret_code:
logging.error('Failed to clone repo {}'.format(repo_url))
continue
# Change into the cloned repo dir.
os.chdir(repo_name)
# Create a branch, using the version number.
branch_name = '{}/{}'.format(module_name, new_version)
ret_code = subprocess.call(['git', 'checkout', '-b', branch_name])
if ret_code:
logging.error('Failed to create branch in repo {}'.format(repo_url))
continue
# Search through all TXT files to find all lines with the module name, changing the pinned version.
files_changed = False
for root, _dirs, files in os.walk('.'):
for file in files:
if file.endswith('.txt') and (('requirements' in file) or ('requirements' in root)):
found = False
filepath = os.path.join(root, file)
with open(filepath) as f:
if '{}=='.format(module_name) in f.read():
found = True
if found:
files_changed = True
# Change the file in-place.
for line in fileinput.input(filepath, inplace=True):
if '{}=='.format(module_name) in line:
print '{}=={}'.format(module_name, new_version)
else:
print line,
if not files_changed:
# Module name wasn't found in the requirements files.
logging.info("Module name '{}' not found in repo {} - skipping.".format(module_name, repo_url))
continue
# Add/commit the files.
ret_code = subprocess.call(['git', 'commit', '-am', 'Updating {} requirement to version {}'.format(module_name, new_version)])
if ret_code:
logging.error("Failed to add and commit changed files to repo {}".format(repo_url))
continue
if local_only:
# For local_only, don't push the branch to the remote and create the PR - leave all changes local for review.
continue
# Push the branch.
ret_code = subprocess.call(['git', 'push', '--set-upstream', 'origin', branch_name])
if ret_code:
logging.error("Failed to push branch {} upstream for repo {}".format(branch_name, repo_url))
continue
# Create a PR with an automated message.
rollback_branch_push = False
try:
# The GitHub "mention" below does not work via the API - unfortunately...
response = gh.create_pull(
title='Change {} version.'.format(module_name),
body='Change the required version of {} to {}.\n\n@edx-ops/pipeline-team Please review and tag appropriate parties.'.format(module_name, new_version),
head=branch_name,
base='master'
)
except:
logging.error('Failed to create PR for repo {} - did you set GITHUB_TOKEN?'.format(repo_url))
rollback_branch_push = True
else:
logging.info('Created PR #{} for repo {}: {}'.format(response.number, repo_url, response.html_url))
if rollback_branch_push:
# Since the PR creation failed, delete the branch in the remote repo as well.
ret_code = subprocess.call(['git', 'push', 'origin', '--delete', branch_name])
if ret_code:
logging.error("ROLLBACK: Failed to delete upstream branch {} for repo {}".format(branch_name, repo_url))
if not local_only:
# Remove the temp directory containing all the cloned repos.
shutil.rmtree(tmp_dir) | [
"def",
"bump_repos_version",
"(",
"module_name",
",",
"new_version",
",",
"local_only",
")",
":",
"# Make the cloning directory and change directories into it.",
"tmp_dir",
"=",
"tempfile",
".",
"mkdtemp",
"(",
"dir",
"=",
"os",
".",
"getcwd",
"(",
")",
")",
"# Iterate through each repository.",
"for",
"owner",
",",
"repo_name",
"in",
"REPOS_TO_CHANGE",
":",
"repo_url",
"=",
"REPO_URL_FORMAT",
".",
"format",
"(",
"owner",
",",
"repo_name",
")",
"gh",
"=",
"GitHubApiUtils",
"(",
"owner",
",",
"repo_name",
")",
"os",
".",
"chdir",
"(",
"tmp_dir",
")",
"# Clone the repo.",
"ret_code",
"=",
"subprocess",
".",
"call",
"(",
"[",
"'git'",
",",
"'clone'",
",",
"'{}.git'",
".",
"format",
"(",
"repo_url",
")",
"]",
")",
"if",
"ret_code",
":",
"logging",
".",
"error",
"(",
"'Failed to clone repo {}'",
".",
"format",
"(",
"repo_url",
")",
")",
"continue",
"# Change into the cloned repo dir.",
"os",
".",
"chdir",
"(",
"repo_name",
")",
"# Create a branch, using the version number.",
"branch_name",
"=",
"'{}/{}'",
".",
"format",
"(",
"module_name",
",",
"new_version",
")",
"ret_code",
"=",
"subprocess",
".",
"call",
"(",
"[",
"'git'",
",",
"'checkout'",
",",
"'-b'",
",",
"branch_name",
"]",
")",
"if",
"ret_code",
":",
"logging",
".",
"error",
"(",
"'Failed to create branch in repo {}'",
".",
"format",
"(",
"repo_url",
")",
")",
"continue",
"# Search through all TXT files to find all lines with the module name, changing the pinned version.",
"files_changed",
"=",
"False",
"for",
"root",
",",
"_dirs",
",",
"files",
"in",
"os",
".",
"walk",
"(",
"'.'",
")",
":",
"for",
"file",
"in",
"files",
":",
"if",
"file",
".",
"endswith",
"(",
"'.txt'",
")",
"and",
"(",
"(",
"'requirements'",
"in",
"file",
")",
"or",
"(",
"'requirements'",
"in",
"root",
")",
")",
":",
"found",
"=",
"False",
"filepath",
"=",
"os",
".",
"path",
".",
"join",
"(",
"root",
",",
"file",
")",
"with",
"open",
"(",
"filepath",
")",
"as",
"f",
":",
"if",
"'{}=='",
".",
"format",
"(",
"module_name",
")",
"in",
"f",
".",
"read",
"(",
")",
":",
"found",
"=",
"True",
"if",
"found",
":",
"files_changed",
"=",
"True",
"# Change the file in-place.",
"for",
"line",
"in",
"fileinput",
".",
"input",
"(",
"filepath",
",",
"inplace",
"=",
"True",
")",
":",
"if",
"'{}=='",
".",
"format",
"(",
"module_name",
")",
"in",
"line",
":",
"print",
"'{}=={}'",
".",
"format",
"(",
"module_name",
",",
"new_version",
")",
"else",
":",
"print",
"line",
",",
"if",
"not",
"files_changed",
":",
"# Module name wasn't found in the requirements files.",
"logging",
".",
"info",
"(",
"\"Module name '{}' not found in repo {} - skipping.\"",
".",
"format",
"(",
"module_name",
",",
"repo_url",
")",
")",
"continue",
"# Add/commit the files.",
"ret_code",
"=",
"subprocess",
".",
"call",
"(",
"[",
"'git'",
",",
"'commit'",
",",
"'-am'",
",",
"'Updating {} requirement to version {}'",
".",
"format",
"(",
"module_name",
",",
"new_version",
")",
"]",
")",
"if",
"ret_code",
":",
"logging",
".",
"error",
"(",
"\"Failed to add and commit changed files to repo {}\"",
".",
"format",
"(",
"repo_url",
")",
")",
"continue",
"if",
"local_only",
":",
"# For local_only, don't push the branch to the remote and create the PR - leave all changes local for review.",
"continue",
"# Push the branch.",
"ret_code",
"=",
"subprocess",
".",
"call",
"(",
"[",
"'git'",
",",
"'push'",
",",
"'--set-upstream'",
",",
"'origin'",
",",
"branch_name",
"]",
")",
"if",
"ret_code",
":",
"logging",
".",
"error",
"(",
"\"Failed to push branch {} upstream for repo {}\"",
".",
"format",
"(",
"branch_name",
",",
"repo_url",
")",
")",
"continue",
"# Create a PR with an automated message.",
"rollback_branch_push",
"=",
"False",
"try",
":",
"# The GitHub \"mention\" below does not work via the API - unfortunately...",
"response",
"=",
"gh",
".",
"create_pull",
"(",
"title",
"=",
"'Change {} version.'",
".",
"format",
"(",
"module_name",
")",
",",
"body",
"=",
"'Change the required version of {} to {}.\\n\\n@edx-ops/pipeline-team Please review and tag appropriate parties.'",
".",
"format",
"(",
"module_name",
",",
"new_version",
")",
",",
"head",
"=",
"branch_name",
",",
"base",
"=",
"'master'",
")",
"except",
":",
"logging",
".",
"error",
"(",
"'Failed to create PR for repo {} - did you set GITHUB_TOKEN?'",
".",
"format",
"(",
"repo_url",
")",
")",
"rollback_branch_push",
"=",
"True",
"else",
":",
"logging",
".",
"info",
"(",
"'Created PR #{} for repo {}: {}'",
".",
"format",
"(",
"response",
".",
"number",
",",
"repo_url",
",",
"response",
".",
"html_url",
")",
")",
"if",
"rollback_branch_push",
":",
"# Since the PR creation failed, delete the branch in the remote repo as well.",
"ret_code",
"=",
"subprocess",
".",
"call",
"(",
"[",
"'git'",
",",
"'push'",
",",
"'origin'",
",",
"'--delete'",
",",
"branch_name",
"]",
")",
"if",
"ret_code",
":",
"logging",
".",
"error",
"(",
"\"ROLLBACK: Failed to delete upstream branch {} for repo {}\"",
".",
"format",
"(",
"branch_name",
",",
"repo_url",
")",
")",
"if",
"not",
"local_only",
":",
"# Remove the temp directory containing all the cloned repos.",
"shutil",
".",
"rmtree",
"(",
"tmp_dir",
")"
] | Changes the pinned version number in the requirements files of all repos
which have the specified Python module as a dependency.
This script assumes that GITHUB_TOKEN is set for GitHub authentication. | [
"Changes",
"the",
"pinned",
"version",
"number",
"in",
"the",
"requirements",
"files",
"of",
"all",
"repos",
"which",
"have",
"the",
"specified",
"Python",
"module",
"as",
"a",
"dependency",
"."
] | train | https://github.com/edx/edx-django-release-util/blob/de0fde41d6a19885ab7dc309472b94fd0fccbc1d/scripts/update_repos_version.py#L55-L153 |
txomon/abot | abot/slack.py | SlackAPI.call | async def call(self, method, **params):
"""
Call a Slack Web API method
:param method: Slack Web API method to call
:param params: {str: object} parameters to method
:return: dict()
"""
url = self.SLACK_RPC_PREFIX + method
data = FormData()
data.add_fields(MultiDict(token=self.bot_token, charset='utf-8', **params))
response_body = await self.request(
method='POST',
url=url,
data=data
)
if 'warning' in response_body:
logger.warning(f'Warnings received from API call {method}: {response_body["warning"]}')
if 'ok' not in response_body:
logger.error(f'No ok marker in slack API call {method} {params} => {response_body}')
raise SlackCallException('There is no ok marker, ... strange', method=method)
if not response_body['ok']:
logger.error(f'Slack API call failed {method} {params} => {response_body}')
raise SlackCallException(f'No OK response returned', method=method)
return response_body | python | async def call(self, method, **params):
"""
Call a Slack Web API method
:param method: Slack Web API method to call
:param params: {str: object} parameters to method
:return: dict()
"""
url = self.SLACK_RPC_PREFIX + method
data = FormData()
data.add_fields(MultiDict(token=self.bot_token, charset='utf-8', **params))
response_body = await self.request(
method='POST',
url=url,
data=data
)
if 'warning' in response_body:
logger.warning(f'Warnings received from API call {method}: {response_body["warning"]}')
if 'ok' not in response_body:
logger.error(f'No ok marker in slack API call {method} {params} => {response_body}')
raise SlackCallException('There is no ok marker, ... strange', method=method)
if not response_body['ok']:
logger.error(f'Slack API call failed {method} {params} => {response_body}')
raise SlackCallException(f'No OK response returned', method=method)
return response_body | [
"async",
"def",
"call",
"(",
"self",
",",
"method",
",",
"*",
"*",
"params",
")",
":",
"url",
"=",
"self",
".",
"SLACK_RPC_PREFIX",
"+",
"method",
"data",
"=",
"FormData",
"(",
")",
"data",
".",
"add_fields",
"(",
"MultiDict",
"(",
"token",
"=",
"self",
".",
"bot_token",
",",
"charset",
"=",
"'utf-8'",
",",
"*",
"*",
"params",
")",
")",
"response_body",
"=",
"await",
"self",
".",
"request",
"(",
"method",
"=",
"'POST'",
",",
"url",
"=",
"url",
",",
"data",
"=",
"data",
")",
"if",
"'warning'",
"in",
"response_body",
":",
"logger",
".",
"warning",
"(",
"f'Warnings received from API call {method}: {response_body[\"warning\"]}'",
")",
"if",
"'ok'",
"not",
"in",
"response_body",
":",
"logger",
".",
"error",
"(",
"f'No ok marker in slack API call {method} {params} => {response_body}'",
")",
"raise",
"SlackCallException",
"(",
"'There is no ok marker, ... strange'",
",",
"method",
"=",
"method",
")",
"if",
"not",
"response_body",
"[",
"'ok'",
"]",
":",
"logger",
".",
"error",
"(",
"f'Slack API call failed {method} {params} => {response_body}'",
")",
"raise",
"SlackCallException",
"(",
"f'No OK response returned'",
",",
"method",
"=",
"method",
")",
"return",
"response_body"
] | Call a Slack Web API method
:param method: Slack Web API method to call
:param params: {str: object} parameters to method
:return: dict() | [
"Call",
"a",
"Slack",
"Web",
"API",
"method"
] | train | https://github.com/txomon/abot/blob/3ac23c6d14965d4608ed13c284ae1a886b462252/abot/slack.py#L83-L107 |
txomon/abot | abot/slack.py | SlackAPI.rtm_handler | def rtm_handler(self, ws_message):
"""
Handle a message, processing it internally if required. If it's a message that should go outside the bot,
this function will return True
:param message:
:return: Boolean if message should be yielded
"""
message = json.loads(ws_message.data)
if 'reply_to' in message:
reply_to = message['reply_to']
future = self.response_futures.pop(reply_to, None)
if future is None:
logger.error(f'This should not happen, received reply to unknown message! {message}')
return None
future.set_result(message)
return None
if 'type' not in message:
logger.error(f'No idea what this could be {message}')
return
message_type = message['type']
if hasattr(self, f'handle_{message_type}'):
function = getattr(self, f'handle_{message_type}')
return function(message)
if message_type in self.SLACK_RTM_EVENTS:
logger.debug(f'Unhandled {message_type}. {message}')
else:
logger.warning(f'Unknown {message_type}. {message}')
return message | python | def rtm_handler(self, ws_message):
"""
Handle a message, processing it internally if required. If it's a message that should go outside the bot,
this function will return True
:param message:
:return: Boolean if message should be yielded
"""
message = json.loads(ws_message.data)
if 'reply_to' in message:
reply_to = message['reply_to']
future = self.response_futures.pop(reply_to, None)
if future is None:
logger.error(f'This should not happen, received reply to unknown message! {message}')
return None
future.set_result(message)
return None
if 'type' not in message:
logger.error(f'No idea what this could be {message}')
return
message_type = message['type']
if hasattr(self, f'handle_{message_type}'):
function = getattr(self, f'handle_{message_type}')
return function(message)
if message_type in self.SLACK_RTM_EVENTS:
logger.debug(f'Unhandled {message_type}. {message}')
else:
logger.warning(f'Unknown {message_type}. {message}')
return message | [
"def",
"rtm_handler",
"(",
"self",
",",
"ws_message",
")",
":",
"message",
"=",
"json",
".",
"loads",
"(",
"ws_message",
".",
"data",
")",
"if",
"'reply_to'",
"in",
"message",
":",
"reply_to",
"=",
"message",
"[",
"'reply_to'",
"]",
"future",
"=",
"self",
".",
"response_futures",
".",
"pop",
"(",
"reply_to",
",",
"None",
")",
"if",
"future",
"is",
"None",
":",
"logger",
".",
"error",
"(",
"f'This should not happen, received reply to unknown message! {message}'",
")",
"return",
"None",
"future",
".",
"set_result",
"(",
"message",
")",
"return",
"None",
"if",
"'type'",
"not",
"in",
"message",
":",
"logger",
".",
"error",
"(",
"f'No idea what this could be {message}'",
")",
"return",
"message_type",
"=",
"message",
"[",
"'type'",
"]",
"if",
"hasattr",
"(",
"self",
",",
"f'handle_{message_type}'",
")",
":",
"function",
"=",
"getattr",
"(",
"self",
",",
"f'handle_{message_type}'",
")",
"return",
"function",
"(",
"message",
")",
"if",
"message_type",
"in",
"self",
".",
"SLACK_RTM_EVENTS",
":",
"logger",
".",
"debug",
"(",
"f'Unhandled {message_type}. {message}'",
")",
"else",
":",
"logger",
".",
"warning",
"(",
"f'Unknown {message_type}. {message}'",
")",
"return",
"message"
] | Handle a message, processing it internally if required. If it's a message that should go outside the bot,
this function will return True
:param message:
:return: Boolean if message should be yielded | [
"Handle",
"a",
"message",
"processing",
"it",
"internally",
"if",
"required",
".",
"If",
"it",
"s",
"a",
"message",
"that",
"should",
"go",
"outside",
"the",
"bot",
"this",
"function",
"will",
"return",
"True"
] | train | https://github.com/txomon/abot/blob/3ac23c6d14965d4608ed13c284ae1a886b462252/abot/slack.py#L591-L621 |
MarcoFavorito/flloat | flloat/parser/ltlf.py | LTLfLexer.t_ATOM | def t_ATOM(self, t):
r'[a-zA-Z_][a-zA-Z_0-9]*'
t.type = LTLfLexer.reserved.get(t.value, 'ATOM') # Check for reserved words
return t | python | def t_ATOM(self, t):
r'[a-zA-Z_][a-zA-Z_0-9]*'
t.type = LTLfLexer.reserved.get(t.value, 'ATOM') # Check for reserved words
return t | [
"def",
"t_ATOM",
"(",
"self",
",",
"t",
")",
":",
"t",
".",
"type",
"=",
"LTLfLexer",
".",
"reserved",
".",
"get",
"(",
"t",
".",
"value",
",",
"'ATOM'",
")",
"# Check for reserved words",
"return",
"t"
] | r'[a-zA-Z_][a-zA-Z_0-9]* | [
"r",
"[",
"a",
"-",
"zA",
"-",
"Z_",
"]",
"[",
"a",
"-",
"zA",
"-",
"Z_0",
"-",
"9",
"]",
"*"
] | train | https://github.com/MarcoFavorito/flloat/blob/5e6de1bea444b68d46d288834031860a8b2f8c2d/flloat/parser/ltlf.py#L54-L57 |
MarcoFavorito/flloat | flloat/parser/ltlf.py | LTLfParser.p_formula | def p_formula(self, p):
"""formula : formula EQUIVALENCE formula
| formula IMPLIES formula
| formula OR formula
| formula AND formula
| formula UNTIL formula
| formula RELEASE formula
| EVENTUALLY formula
| ALWAYS formula
| NEXT formula
| WEAK_NEXT formula
| NOT formula
| TRUE
| FALSE
| ATOM"""
if len(p) == 2:
if p[1] == Symbols.TRUE.value:
p[0] = LTLfTrue()
elif p[1] == Symbols.FALSE.value:
p[0] = LTLfFalse()
else:
p[0] = LTLfAtomic(Symbol(p[1]))
elif len(p) == 3:
if p[1] == Symbols.NEXT.value:
p[0] = LTLfNext(p[2])
elif p[1] == Symbols.WEAK_NEXT.value:
p[0] = LTLfWeakNext(p[2])
elif p[1] == Symbols.EVENTUALLY.value:
p[0] = LTLfEventually(p[2])
elif p[1] == Symbols.ALWAYS.value:
p[0] = LTLfAlways(p[2])
elif p[1] == Symbols.NOT.value:
p[0] = LTLfNot(p[2])
elif len(p) == 4:
l, o, r = p[1:]
if o == Symbols.EQUIVALENCE.value:
p[0] = LTLfEquivalence([l, r])
elif o == Symbols.IMPLIES.value:
p[0] = LTLfImplies([l, r])
elif o == Symbols.OR.value:
p[0] = LTLfOr([l, r])
elif o == Symbols.AND.value:
p[0] = LTLfAnd([l, r])
elif o == Symbols.UNTIL.value:
p[0] = LTLfUntil([l, r])
elif o == Symbols.RELEASE.value:
p[0] = LTLfRelease([l, r])
else:
raise ValueError
else:
raise ValueError | python | def p_formula(self, p):
"""formula : formula EQUIVALENCE formula
| formula IMPLIES formula
| formula OR formula
| formula AND formula
| formula UNTIL formula
| formula RELEASE formula
| EVENTUALLY formula
| ALWAYS formula
| NEXT formula
| WEAK_NEXT formula
| NOT formula
| TRUE
| FALSE
| ATOM"""
if len(p) == 2:
if p[1] == Symbols.TRUE.value:
p[0] = LTLfTrue()
elif p[1] == Symbols.FALSE.value:
p[0] = LTLfFalse()
else:
p[0] = LTLfAtomic(Symbol(p[1]))
elif len(p) == 3:
if p[1] == Symbols.NEXT.value:
p[0] = LTLfNext(p[2])
elif p[1] == Symbols.WEAK_NEXT.value:
p[0] = LTLfWeakNext(p[2])
elif p[1] == Symbols.EVENTUALLY.value:
p[0] = LTLfEventually(p[2])
elif p[1] == Symbols.ALWAYS.value:
p[0] = LTLfAlways(p[2])
elif p[1] == Symbols.NOT.value:
p[0] = LTLfNot(p[2])
elif len(p) == 4:
l, o, r = p[1:]
if o == Symbols.EQUIVALENCE.value:
p[0] = LTLfEquivalence([l, r])
elif o == Symbols.IMPLIES.value:
p[0] = LTLfImplies([l, r])
elif o == Symbols.OR.value:
p[0] = LTLfOr([l, r])
elif o == Symbols.AND.value:
p[0] = LTLfAnd([l, r])
elif o == Symbols.UNTIL.value:
p[0] = LTLfUntil([l, r])
elif o == Symbols.RELEASE.value:
p[0] = LTLfRelease([l, r])
else:
raise ValueError
else:
raise ValueError | [
"def",
"p_formula",
"(",
"self",
",",
"p",
")",
":",
"if",
"len",
"(",
"p",
")",
"==",
"2",
":",
"if",
"p",
"[",
"1",
"]",
"==",
"Symbols",
".",
"TRUE",
".",
"value",
":",
"p",
"[",
"0",
"]",
"=",
"LTLfTrue",
"(",
")",
"elif",
"p",
"[",
"1",
"]",
"==",
"Symbols",
".",
"FALSE",
".",
"value",
":",
"p",
"[",
"0",
"]",
"=",
"LTLfFalse",
"(",
")",
"else",
":",
"p",
"[",
"0",
"]",
"=",
"LTLfAtomic",
"(",
"Symbol",
"(",
"p",
"[",
"1",
"]",
")",
")",
"elif",
"len",
"(",
"p",
")",
"==",
"3",
":",
"if",
"p",
"[",
"1",
"]",
"==",
"Symbols",
".",
"NEXT",
".",
"value",
":",
"p",
"[",
"0",
"]",
"=",
"LTLfNext",
"(",
"p",
"[",
"2",
"]",
")",
"elif",
"p",
"[",
"1",
"]",
"==",
"Symbols",
".",
"WEAK_NEXT",
".",
"value",
":",
"p",
"[",
"0",
"]",
"=",
"LTLfWeakNext",
"(",
"p",
"[",
"2",
"]",
")",
"elif",
"p",
"[",
"1",
"]",
"==",
"Symbols",
".",
"EVENTUALLY",
".",
"value",
":",
"p",
"[",
"0",
"]",
"=",
"LTLfEventually",
"(",
"p",
"[",
"2",
"]",
")",
"elif",
"p",
"[",
"1",
"]",
"==",
"Symbols",
".",
"ALWAYS",
".",
"value",
":",
"p",
"[",
"0",
"]",
"=",
"LTLfAlways",
"(",
"p",
"[",
"2",
"]",
")",
"elif",
"p",
"[",
"1",
"]",
"==",
"Symbols",
".",
"NOT",
".",
"value",
":",
"p",
"[",
"0",
"]",
"=",
"LTLfNot",
"(",
"p",
"[",
"2",
"]",
")",
"elif",
"len",
"(",
"p",
")",
"==",
"4",
":",
"l",
",",
"o",
",",
"r",
"=",
"p",
"[",
"1",
":",
"]",
"if",
"o",
"==",
"Symbols",
".",
"EQUIVALENCE",
".",
"value",
":",
"p",
"[",
"0",
"]",
"=",
"LTLfEquivalence",
"(",
"[",
"l",
",",
"r",
"]",
")",
"elif",
"o",
"==",
"Symbols",
".",
"IMPLIES",
".",
"value",
":",
"p",
"[",
"0",
"]",
"=",
"LTLfImplies",
"(",
"[",
"l",
",",
"r",
"]",
")",
"elif",
"o",
"==",
"Symbols",
".",
"OR",
".",
"value",
":",
"p",
"[",
"0",
"]",
"=",
"LTLfOr",
"(",
"[",
"l",
",",
"r",
"]",
")",
"elif",
"o",
"==",
"Symbols",
".",
"AND",
".",
"value",
":",
"p",
"[",
"0",
"]",
"=",
"LTLfAnd",
"(",
"[",
"l",
",",
"r",
"]",
")",
"elif",
"o",
"==",
"Symbols",
".",
"UNTIL",
".",
"value",
":",
"p",
"[",
"0",
"]",
"=",
"LTLfUntil",
"(",
"[",
"l",
",",
"r",
"]",
")",
"elif",
"o",
"==",
"Symbols",
".",
"RELEASE",
".",
"value",
":",
"p",
"[",
"0",
"]",
"=",
"LTLfRelease",
"(",
"[",
"l",
",",
"r",
"]",
")",
"else",
":",
"raise",
"ValueError",
"else",
":",
"raise",
"ValueError"
] | formula : formula EQUIVALENCE formula
| formula IMPLIES formula
| formula OR formula
| formula AND formula
| formula UNTIL formula
| formula RELEASE formula
| EVENTUALLY formula
| ALWAYS formula
| NEXT formula
| WEAK_NEXT formula
| NOT formula
| TRUE
| FALSE
| ATOM | [
"formula",
":",
"formula",
"EQUIVALENCE",
"formula",
"|",
"formula",
"IMPLIES",
"formula",
"|",
"formula",
"OR",
"formula",
"|",
"formula",
"AND",
"formula",
"|",
"formula",
"UNTIL",
"formula",
"|",
"formula",
"RELEASE",
"formula",
"|",
"EVENTUALLY",
"formula",
"|",
"ALWAYS",
"formula",
"|",
"NEXT",
"formula",
"|",
"WEAK_NEXT",
"formula",
"|",
"NOT",
"formula",
"|",
"TRUE",
"|",
"FALSE",
"|",
"ATOM"
] | train | https://github.com/MarcoFavorito/flloat/blob/5e6de1bea444b68d46d288834031860a8b2f8c2d/flloat/parser/ltlf.py#L77-L127 |
InfoAgeTech/django-core | django_core/forms/mixins/common.py | PrefixFormMixin.get_default_prefix | def get_default_prefix(self, instance=None):
"""Gets the prefix for this form.
:param instance: the form model instance. When calling this method
directly this should almost always stay None so it looks for
self.instance.
"""
if instance is None and hasattr(self, 'instance'):
instance = self.instance
if instance and instance.id is not None:
# it's an existing instance, use the instance prefix
instance_prefix = self.default_instance_prefix
if instance_prefix is None:
instance_prefix = self.__class__.__name__.lower() + 'i-'
return '{0}{1}'.format(instance_prefix,
instance.id)
if self.default_new_prefix is not None:
return self.default_new_prefix
return self.__class__.__name__.lower() + 'new-' | python | def get_default_prefix(self, instance=None):
"""Gets the prefix for this form.
:param instance: the form model instance. When calling this method
directly this should almost always stay None so it looks for
self.instance.
"""
if instance is None and hasattr(self, 'instance'):
instance = self.instance
if instance and instance.id is not None:
# it's an existing instance, use the instance prefix
instance_prefix = self.default_instance_prefix
if instance_prefix is None:
instance_prefix = self.__class__.__name__.lower() + 'i-'
return '{0}{1}'.format(instance_prefix,
instance.id)
if self.default_new_prefix is not None:
return self.default_new_prefix
return self.__class__.__name__.lower() + 'new-' | [
"def",
"get_default_prefix",
"(",
"self",
",",
"instance",
"=",
"None",
")",
":",
"if",
"instance",
"is",
"None",
"and",
"hasattr",
"(",
"self",
",",
"'instance'",
")",
":",
"instance",
"=",
"self",
".",
"instance",
"if",
"instance",
"and",
"instance",
".",
"id",
"is",
"not",
"None",
":",
"# it's an existing instance, use the instance prefix",
"instance_prefix",
"=",
"self",
".",
"default_instance_prefix",
"if",
"instance_prefix",
"is",
"None",
":",
"instance_prefix",
"=",
"self",
".",
"__class__",
".",
"__name__",
".",
"lower",
"(",
")",
"+",
"'i-'",
"return",
"'{0}{1}'",
".",
"format",
"(",
"instance_prefix",
",",
"instance",
".",
"id",
")",
"if",
"self",
".",
"default_new_prefix",
"is",
"not",
"None",
":",
"return",
"self",
".",
"default_new_prefix",
"return",
"self",
".",
"__class__",
".",
"__name__",
".",
"lower",
"(",
")",
"+",
"'new-'"
] | Gets the prefix for this form.
:param instance: the form model instance. When calling this method
directly this should almost always stay None so it looks for
self.instance. | [
"Gets",
"the",
"prefix",
"for",
"this",
"form",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/forms/mixins/common.py#L36-L58 |
InfoAgeTech/django-core | django_core/contrib/auth/backends.py | EmailOrUsernameBackend.authenticate | def authenticate(self, username, password):
"""
:param username: this is the email or username to check
"""
# If username is an email address, then try to pull it up
user = self.get_by_username_or_email(username)
if not user:
return None
if user.check_password(password):
return user
return None | python | def authenticate(self, username, password):
"""
:param username: this is the email or username to check
"""
# If username is an email address, then try to pull it up
user = self.get_by_username_or_email(username)
if not user:
return None
if user.check_password(password):
return user
return None | [
"def",
"authenticate",
"(",
"self",
",",
"username",
",",
"password",
")",
":",
"# If username is an email address, then try to pull it up",
"user",
"=",
"self",
".",
"get_by_username_or_email",
"(",
"username",
")",
"if",
"not",
"user",
":",
"return",
"None",
"if",
"user",
".",
"check_password",
"(",
"password",
")",
":",
"return",
"user",
"return",
"None"
] | :param username: this is the email or username to check | [
":",
"param",
"username",
":",
"this",
"is",
"the",
"email",
"or",
"username",
"to",
"check"
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/contrib/auth/backends.py#L13-L26 |
MarcoFavorito/flloat | flloat/utils.py | powerset | def powerset(iterable) -> FrozenSet:
"powerset([1,2,3]) --> () (1,) (2,) (3,) (1,2) (1,3) (2,3) (1,2,3)"
combs = _powerset(iterable)
res = frozenset(frozenset(x) for x in combs)
# res = map(frozenset, combs)
return res | python | def powerset(iterable) -> FrozenSet:
"powerset([1,2,3]) --> () (1,) (2,) (3,) (1,2) (1,3) (2,3) (1,2,3)"
combs = _powerset(iterable)
res = frozenset(frozenset(x) for x in combs)
# res = map(frozenset, combs)
return res | [
"def",
"powerset",
"(",
"iterable",
")",
"->",
"FrozenSet",
":",
"combs",
"=",
"_powerset",
"(",
"iterable",
")",
"res",
"=",
"frozenset",
"(",
"frozenset",
"(",
"x",
")",
"for",
"x",
"in",
"combs",
")",
"# res = map(frozenset, combs)",
"return",
"res"
] | powerset([1,2,3]) --> () (1,) (2,) (3,) (1,2) (1,3) (2,3) (1,2,3) | [
"powerset",
"(",
"[",
"1",
"2",
"3",
"]",
")",
"--",
">",
"()",
"(",
"1",
")",
"(",
"2",
")",
"(",
"3",
")",
"(",
"1",
"2",
")",
"(",
"1",
"3",
")",
"(",
"2",
"3",
")",
"(",
"1",
"2",
"3",
")"
] | train | https://github.com/MarcoFavorito/flloat/blob/5e6de1bea444b68d46d288834031860a8b2f8c2d/flloat/utils.py#L13-L18 |
InfoAgeTech/django-core | django_core/templatetags/url_tags.py | edit_url_link | def edit_url_link(obj, **kwargs):
"""This method assumes that the "get_edit_url_link" method has been
implemented on the obj.
"""
if hasattr(obj, 'get_edit_url_link'):
return obj.get_edit_url_link(**kwargs)
edit_url = obj.get_edit_url()
return build_link(href=edit_url, **kwargs) | python | def edit_url_link(obj, **kwargs):
"""This method assumes that the "get_edit_url_link" method has been
implemented on the obj.
"""
if hasattr(obj, 'get_edit_url_link'):
return obj.get_edit_url_link(**kwargs)
edit_url = obj.get_edit_url()
return build_link(href=edit_url, **kwargs) | [
"def",
"edit_url_link",
"(",
"obj",
",",
"*",
"*",
"kwargs",
")",
":",
"if",
"hasattr",
"(",
"obj",
",",
"'get_edit_url_link'",
")",
":",
"return",
"obj",
".",
"get_edit_url_link",
"(",
"*",
"*",
"kwargs",
")",
"edit_url",
"=",
"obj",
".",
"get_edit_url",
"(",
")",
"return",
"build_link",
"(",
"href",
"=",
"edit_url",
",",
"*",
"*",
"kwargs",
")"
] | This method assumes that the "get_edit_url_link" method has been
implemented on the obj. | [
"This",
"method",
"assumes",
"that",
"the",
"get_edit_url_link",
"method",
"has",
"been",
"implemented",
"on",
"the",
"obj",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/templatetags/url_tags.py#L49-L57 |
InfoAgeTech/django-core | django_core/templatetags/url_tags.py | delete_url_link | def delete_url_link(obj, **kwargs):
"""This method assumes that the "get_delete_url_link" method has been
implemented on the obj.
"""
if hasattr(obj, 'get_delete_url_link'):
return obj.get_delete_url_link(**kwargs)
delete_url = obj.get_delete_url()
return build_link(href=delete_url, **kwargs) | python | def delete_url_link(obj, **kwargs):
"""This method assumes that the "get_delete_url_link" method has been
implemented on the obj.
"""
if hasattr(obj, 'get_delete_url_link'):
return obj.get_delete_url_link(**kwargs)
delete_url = obj.get_delete_url()
return build_link(href=delete_url, **kwargs) | [
"def",
"delete_url_link",
"(",
"obj",
",",
"*",
"*",
"kwargs",
")",
":",
"if",
"hasattr",
"(",
"obj",
",",
"'get_delete_url_link'",
")",
":",
"return",
"obj",
".",
"get_delete_url_link",
"(",
"*",
"*",
"kwargs",
")",
"delete_url",
"=",
"obj",
".",
"get_delete_url",
"(",
")",
"return",
"build_link",
"(",
"href",
"=",
"delete_url",
",",
"*",
"*",
"kwargs",
")"
] | This method assumes that the "get_delete_url_link" method has been
implemented on the obj. | [
"This",
"method",
"assumes",
"that",
"the",
"get_delete_url_link",
"method",
"has",
"been",
"implemented",
"on",
"the",
"obj",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/templatetags/url_tags.py#L68-L76 |
InfoAgeTech/django-core | django_core/db/models/fields.py | ListField.formfield | def formfield(self, form_class=None, choices_form_class=None, **kwargs):
"""Make the default formfield a CommaSeparatedListField."""
defaults = {
'form_class': form_class or self.get_form_class()
}
defaults.update(kwargs)
return super(ListField, self).formfield(**defaults) | python | def formfield(self, form_class=None, choices_form_class=None, **kwargs):
"""Make the default formfield a CommaSeparatedListField."""
defaults = {
'form_class': form_class or self.get_form_class()
}
defaults.update(kwargs)
return super(ListField, self).formfield(**defaults) | [
"def",
"formfield",
"(",
"self",
",",
"form_class",
"=",
"None",
",",
"choices_form_class",
"=",
"None",
",",
"*",
"*",
"kwargs",
")",
":",
"defaults",
"=",
"{",
"'form_class'",
":",
"form_class",
"or",
"self",
".",
"get_form_class",
"(",
")",
"}",
"defaults",
".",
"update",
"(",
"kwargs",
")",
"return",
"super",
"(",
"ListField",
",",
"self",
")",
".",
"formfield",
"(",
"*",
"*",
"defaults",
")"
] | Make the default formfield a CommaSeparatedListField. | [
"Make",
"the",
"default",
"formfield",
"a",
"CommaSeparatedListField",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/db/models/fields.py#L94-L101 |
InfoAgeTech/django-core | django_core/db/models/fields.py | ListField.validate | def validate(self, value, model_instance, **kwargs):
"""This follows the validate rules for choices_form_class field used.
"""
self.get_choices_form_class().validate(value, model_instance, **kwargs) | python | def validate(self, value, model_instance, **kwargs):
"""This follows the validate rules for choices_form_class field used.
"""
self.get_choices_form_class().validate(value, model_instance, **kwargs) | [
"def",
"validate",
"(",
"self",
",",
"value",
",",
"model_instance",
",",
"*",
"*",
"kwargs",
")",
":",
"self",
".",
"get_choices_form_class",
"(",
")",
".",
"validate",
"(",
"value",
",",
"model_instance",
",",
"*",
"*",
"kwargs",
")"
] | This follows the validate rules for choices_form_class field used. | [
"This",
"follows",
"the",
"validate",
"rules",
"for",
"choices_form_class",
"field",
"used",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/db/models/fields.py#L110-L113 |
InfoAgeTech/django-core | django_core/db/models/fields.py | JSONField.to_python | def to_python(self, value):
"""
Convert the input JSON value into python structures, raises
django.core.exceptions.ValidationError if the data can't be converted.
"""
if isinstance(value, dict):
return value
if self.blank and not value:
return None
if isinstance(value, string_types):
try:
return json.loads(value)
except Exception as e:
raise ValidationError(str(e))
return value | python | def to_python(self, value):
"""
Convert the input JSON value into python structures, raises
django.core.exceptions.ValidationError if the data can't be converted.
"""
if isinstance(value, dict):
return value
if self.blank and not value:
return None
if isinstance(value, string_types):
try:
return json.loads(value)
except Exception as e:
raise ValidationError(str(e))
return value | [
"def",
"to_python",
"(",
"self",
",",
"value",
")",
":",
"if",
"isinstance",
"(",
"value",
",",
"dict",
")",
":",
"return",
"value",
"if",
"self",
".",
"blank",
"and",
"not",
"value",
":",
"return",
"None",
"if",
"isinstance",
"(",
"value",
",",
"string_types",
")",
":",
"try",
":",
"return",
"json",
".",
"loads",
"(",
"value",
")",
"except",
"Exception",
"as",
"e",
":",
"raise",
"ValidationError",
"(",
"str",
"(",
"e",
")",
")",
"return",
"value"
] | Convert the input JSON value into python structures, raises
django.core.exceptions.ValidationError if the data can't be converted. | [
"Convert",
"the",
"input",
"JSON",
"value",
"into",
"python",
"structures",
"raises",
"django",
".",
"core",
".",
"exceptions",
".",
"ValidationError",
"if",
"the",
"data",
"can",
"t",
"be",
"converted",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/db/models/fields.py#L197-L214 |
InfoAgeTech/django-core | django_core/db/models/fields.py | JSONField.get_prep_value | def get_prep_value(self, value):
"""Convert value to JSON string before save"""
try:
return json.dumps(value, cls=DjangoJSONEncoder)
except Exception as e:
raise ValidationError(str(e)) | python | def get_prep_value(self, value):
"""Convert value to JSON string before save"""
try:
return json.dumps(value, cls=DjangoJSONEncoder)
except Exception as e:
raise ValidationError(str(e)) | [
"def",
"get_prep_value",
"(",
"self",
",",
"value",
")",
":",
"try",
":",
"return",
"json",
".",
"dumps",
"(",
"value",
",",
"cls",
"=",
"DjangoJSONEncoder",
")",
"except",
"Exception",
"as",
"e",
":",
"raise",
"ValidationError",
"(",
"str",
"(",
"e",
")",
")"
] | Convert value to JSON string before save | [
"Convert",
"value",
"to",
"JSON",
"string",
"before",
"save"
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/db/models/fields.py#L226-L231 |
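A hedged round-trip sketch for the two JSONField records above: get_prep_value serialises Python data to a JSON string for storage and to_python parses it back. It assumes a configured Django environment, and the standalone field instance exists purely for illustration.

from django_core.db.models.fields import JSONField

field = JSONField(blank=True)
stored = field.get_prep_value({"count": 3, "tags": ["a", "b"]})
# stored is now a JSON string, e.g. '{"count": 3, "tags": ["a", "b"]}'
restored = field.to_python(stored)
assert restored == {"count": 3, "tags": ["a", "b"]}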
dls-controls/annotypes | annotypes/_fake_typing.py | _gorg | def _gorg(a):
"""Return the farthest origin of a generic class (internal helper)."""
assert isinstance(a, GenericMeta)
while a.__origin__ is not None:
a = a.__origin__
return a | python | def _gorg(a):
"""Return the farthest origin of a generic class (internal helper)."""
assert isinstance(a, GenericMeta)
while a.__origin__ is not None:
a = a.__origin__
return a | [
"def",
"_gorg",
"(",
"a",
")",
":",
"assert",
"isinstance",
"(",
"a",
",",
"GenericMeta",
")",
"while",
"a",
".",
"__origin__",
"is",
"not",
"None",
":",
"a",
"=",
"a",
".",
"__origin__",
"return",
"a"
] | Return the farthest origin of a generic class (internal helper). | [
"Return",
"the",
"farthest",
"origin",
"of",
"a",
"generic",
"class",
"(",
"internal",
"helper",
")",
"."
] | train | https://github.com/dls-controls/annotypes/blob/31ab68a0367bb70ebd9898e8b9fa9405423465bd/annotypes/_fake_typing.py#L135-L140 |
dls-controls/annotypes | annotypes/_fake_typing.py | _next_in_mro | def _next_in_mro(cls):
"""Helper for Generic.__new__.
Returns the class after the last occurrence of Generic or
Generic[...] in cls.__mro__.
"""
next_in_mro = object
# Look for the last occurrence of Generic or Generic[...].
for i, c in enumerate(cls.__mro__[:-1]):
if isinstance(c, GenericMeta) and _gorg(c) is Generic:
next_in_mro = cls.__mro__[i + 1]
return next_in_mro | python | def _next_in_mro(cls):
"""Helper for Generic.__new__.
Returns the class after the last occurrence of Generic or
Generic[...] in cls.__mro__.
"""
next_in_mro = object
# Look for the last occurrence of Generic or Generic[...].
for i, c in enumerate(cls.__mro__[:-1]):
if isinstance(c, GenericMeta) and _gorg(c) is Generic:
next_in_mro = cls.__mro__[i + 1]
return next_in_mro | [
"def",
"_next_in_mro",
"(",
"cls",
")",
":",
"next_in_mro",
"=",
"object",
"# Look for the last occurrence of Generic or Generic[...].",
"for",
"i",
",",
"c",
"in",
"enumerate",
"(",
"cls",
".",
"__mro__",
"[",
":",
"-",
"1",
"]",
")",
":",
"if",
"isinstance",
"(",
"c",
",",
"GenericMeta",
")",
"and",
"_gorg",
"(",
"c",
")",
"is",
"Generic",
":",
"next_in_mro",
"=",
"cls",
".",
"__mro__",
"[",
"i",
"+",
"1",
"]",
"return",
"next_in_mro"
] | Helper for Generic.__new__.
Returns the class after the last occurrence of Generic or
Generic[...] in cls.__mro__. | [
"Helper",
"for",
"Generic",
".",
"__new__",
"."
] | train | https://github.com/dls-controls/annotypes/blob/31ab68a0367bb70ebd9898e8b9fa9405423465bd/annotypes/_fake_typing.py#L143-L154 |
dls-controls/annotypes | annotypes/_fake_typing.py | _make_subclasshook | def _make_subclasshook(cls):
"""Construct a __subclasshook__ callable that incorporates
the associated __extra__ class in subclass checks performed
against cls.
"""
if isinstance(cls.__extra__, abc.ABCMeta):
# The logic mirrors that of ABCMeta.__subclasscheck__.
# Registered classes need not be checked here because
# cls and its extra share the same _abc_registry.
def __extrahook__(cls, subclass):
res = cls.__extra__.__subclasshook__(subclass)
if res is not NotImplemented:
return res
if cls.__extra__ in getattr(subclass, '__mro__', ()):
return True
for scls in cls.__extra__.__subclasses__():
# If we have fake typing and typing detect by module name
if scls.__class__.__module__ == "typing":
continue
if isinstance(scls, GenericMeta):
continue
if issubclass(subclass, scls):
return True
return NotImplemented
else:
# For non-ABC extras we'll just call issubclass().
def __extrahook__(cls, subclass):
if cls.__extra__ and issubclass(subclass, cls.__extra__):
return True
return NotImplemented
return classmethod(__extrahook__) | python | def _make_subclasshook(cls):
"""Construct a __subclasshook__ callable that incorporates
the associated __extra__ class in subclass checks performed
against cls.
"""
if isinstance(cls.__extra__, abc.ABCMeta):
# The logic mirrors that of ABCMeta.__subclasscheck__.
# Registered classes need not be checked here because
# cls and its extra share the same _abc_registry.
def __extrahook__(cls, subclass):
res = cls.__extra__.__subclasshook__(subclass)
if res is not NotImplemented:
return res
if cls.__extra__ in getattr(subclass, '__mro__', ()):
return True
for scls in cls.__extra__.__subclasses__():
# If we have fake typing and typing detect by module name
if scls.__class__.__module__ == "typing":
continue
if isinstance(scls, GenericMeta):
continue
if issubclass(subclass, scls):
return True
return NotImplemented
else:
# For non-ABC extras we'll just call issubclass().
def __extrahook__(cls, subclass):
if cls.__extra__ and issubclass(subclass, cls.__extra__):
return True
return NotImplemented
return classmethod(__extrahook__) | [
"def",
"_make_subclasshook",
"(",
"cls",
")",
":",
"if",
"isinstance",
"(",
"cls",
".",
"__extra__",
",",
"abc",
".",
"ABCMeta",
")",
":",
"# The logic mirrors that of ABCMeta.__subclasscheck__.",
"# Registered classes need not be checked here because",
"# cls and its extra share the same _abc_registry.",
"def",
"__extrahook__",
"(",
"cls",
",",
"subclass",
")",
":",
"res",
"=",
"cls",
".",
"__extra__",
".",
"__subclasshook__",
"(",
"subclass",
")",
"if",
"res",
"is",
"not",
"NotImplemented",
":",
"return",
"res",
"if",
"cls",
".",
"__extra__",
"in",
"getattr",
"(",
"subclass",
",",
"'__mro__'",
",",
"(",
")",
")",
":",
"return",
"True",
"for",
"scls",
"in",
"cls",
".",
"__extra__",
".",
"__subclasses__",
"(",
")",
":",
"# If we have fake typing and typing detect by module name",
"if",
"scls",
".",
"__class__",
".",
"__module__",
"==",
"\"typing\"",
":",
"continue",
"if",
"isinstance",
"(",
"scls",
",",
"GenericMeta",
")",
":",
"continue",
"if",
"issubclass",
"(",
"subclass",
",",
"scls",
")",
":",
"return",
"True",
"return",
"NotImplemented",
"else",
":",
"# For non-ABC extras we'll just call issubclass().",
"def",
"__extrahook__",
"(",
"cls",
",",
"subclass",
")",
":",
"if",
"cls",
".",
"__extra__",
"and",
"issubclass",
"(",
"subclass",
",",
"cls",
".",
"__extra__",
")",
":",
"return",
"True",
"return",
"NotImplemented",
"return",
"classmethod",
"(",
"__extrahook__",
")"
] | Construct a __subclasshook__ callable that incorporates
the associated __extra__ class in subclass checks performed
against cls. | [
"Construct",
"a",
"__subclasshook__",
"callable",
"that",
"incorporates",
"the",
"associated",
"__extra__",
"class",
"in",
"subclass",
"checks",
"performed",
"against",
"cls",
"."
] | train | https://github.com/dls-controls/annotypes/blob/31ab68a0367bb70ebd9898e8b9fa9405423465bd/annotypes/_fake_typing.py#L157-L187 |
InfoAgeTech/django-core | django_core/auth/views.py | AuthorizationTokenRequiredViewMixin.get_authorization | def get_authorization(self, **kwargs):
"""Gets the authorization object for the view."""
if self.authorization is not None:
return self.authorization
auth_class = self.get_authorization_class()
auth_user = self.get_authorization_user()
auth_kwargs = {
'token': self.get_authorization_token(**kwargs)
}
if auth_user and auth_user.is_authenticated():
auth_kwargs['created_user'] = self.get_authorization_user()
self.authorization = auth_class.objects.get_by_token_or_404(
**auth_kwargs
)
return self.authorization | python | def get_authorization(self, **kwargs):
"""Gets the authorization object for the view."""
if self.authorization is not None:
return self.authorization
auth_class = self.get_authorization_class()
auth_user = self.get_authorization_user()
auth_kwargs = {
'token': self.get_authorization_token(**kwargs)
}
if auth_user and auth_user.is_authenticated():
auth_kwargs['created_user'] = self.get_authorization_user()
self.authorization = auth_class.objects.get_by_token_or_404(
**auth_kwargs
)
return self.authorization | [
"def",
"get_authorization",
"(",
"self",
",",
"*",
"*",
"kwargs",
")",
":",
"if",
"self",
".",
"authorization",
"is",
"not",
"None",
":",
"return",
"self",
".",
"authorization",
"auth_class",
"=",
"self",
".",
"get_authorization_class",
"(",
")",
"auth_user",
"=",
"self",
".",
"get_authorization_user",
"(",
")",
"auth_kwargs",
"=",
"{",
"'token'",
":",
"self",
".",
"get_authorization_token",
"(",
"*",
"*",
"kwargs",
")",
"}",
"if",
"auth_user",
"and",
"auth_user",
".",
"is_authenticated",
"(",
")",
":",
"auth_kwargs",
"[",
"'created_user'",
"]",
"=",
"self",
".",
"get_authorization_user",
"(",
")",
"self",
".",
"authorization",
"=",
"auth_class",
".",
"objects",
".",
"get_by_token_or_404",
"(",
"*",
"*",
"auth_kwargs",
")",
"return",
"self",
".",
"authorization"
] | Gets the authorization object for the view. | [
"Gets",
"the",
"authorization",
"object",
"for",
"the",
"view",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/auth/views.py#L31-L48 |
InfoAgeTech/django-core | django_core/auth/views.py | AuthorizationTokenRequiredViewMixin.get_authorization_user | def get_authorization_user(self, **kwargs):
"""Gets the user the authorization object is for."""
if self.authorization_user is not None:
return self.authorization_user
self.authorization_user = self.request.user
return self.request.user | python | def get_authorization_user(self, **kwargs):
"""Gets the user the authorization object is for."""
if self.authorization_user is not None:
return self.authorization_user
self.authorization_user = self.request.user
return self.request.user | [
"def",
"get_authorization_user",
"(",
"self",
",",
"*",
"*",
"kwargs",
")",
":",
"if",
"self",
".",
"authorization_user",
"is",
"not",
"None",
":",
"return",
"self",
".",
"authorization_user",
"self",
".",
"authorization_user",
"=",
"self",
".",
"request",
".",
"user",
"return",
"self",
".",
"request",
".",
"user"
] | Gets the user the authorization object is for. | [
"Gets",
"the",
"user",
"the",
"authorization",
"object",
"is",
"for",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/auth/views.py#L54-L60 |
dls-controls/annotypes | annotypes/_serializable.py | Serializable.to_dict | def to_dict(self, dict_cls=FrozenOrderedDict):
# type: (Type[dict]) -> Dict[str, Any]
"""Create a dictionary representation of object attributes
Returns:
OrderedDict serialised version of self
"""
pairs = tuple((k, serialize_object(getattr(self, k), dict_cls))
for k in self.call_types)
if self.typeid:
d = dict_cls((("typeid", self.typeid),) + pairs)
else:
d = dict_cls(pairs)
return d | python | def to_dict(self, dict_cls=FrozenOrderedDict):
# type: (Type[dict]) -> Dict[str, Any]
"""Create a dictionary representation of object attributes
Returns:
OrderedDict serialised version of self
"""
pairs = tuple((k, serialize_object(getattr(self, k), dict_cls))
for k in self.call_types)
if self.typeid:
d = dict_cls((("typeid", self.typeid),) + pairs)
else:
d = dict_cls(pairs)
return d | [
"def",
"to_dict",
"(",
"self",
",",
"dict_cls",
"=",
"FrozenOrderedDict",
")",
":",
"# type: (Type[dict]) -> Dict[str, Any]",
"pairs",
"=",
"tuple",
"(",
"(",
"k",
",",
"serialize_object",
"(",
"getattr",
"(",
"self",
",",
"k",
")",
",",
"dict_cls",
")",
")",
"for",
"k",
"in",
"self",
".",
"call_types",
")",
"if",
"self",
".",
"typeid",
":",
"d",
"=",
"dict_cls",
"(",
"(",
"(",
"\"typeid\"",
",",
"self",
".",
"typeid",
")",
",",
")",
"+",
"pairs",
")",
"else",
":",
"d",
"=",
"dict_cls",
"(",
"pairs",
")",
"return",
"d"
] | Create a dictionary representation of object attributes
Returns:
OrderedDict serialised version of self | [
"Create",
"a",
"dictionary",
"representation",
"of",
"object",
"attributes"
] | train | https://github.com/dls-controls/annotypes/blob/31ab68a0367bb70ebd9898e8b9fa9405423465bd/annotypes/_serializable.py#L125-L138 |
InfoAgeTech/django-core | django_core/utils/urls.py | safe_redirect | def safe_redirect(next_url, default=None):
"""Makes sure it's a legit site to redirect to.
:param default: this is the default url or named url to redirect to in the
event where next_url is not legit.
"""
if is_legit_next_url(next_url):
return redirect(next_url)
if default:
return redirect(default)
return redirect('/') | python | def safe_redirect(next_url, default=None):
"""Makes sure it's a legit site to redirect to.
:param default: this is the default url or named url to redirect to in the
event where next_url is not legit.
"""
if is_legit_next_url(next_url):
return redirect(next_url)
if default:
return redirect(default)
return redirect('/') | [
"def",
"safe_redirect",
"(",
"next_url",
",",
"default",
"=",
"None",
")",
":",
"if",
"is_legit_next_url",
"(",
"next_url",
")",
":",
"return",
"redirect",
"(",
"next_url",
")",
"if",
"default",
":",
"return",
"redirect",
"(",
"default",
")",
"return",
"redirect",
"(",
"'/'",
")"
] | Makes sure it's a legit site to redirect to.
:param default: this is the default url or named url to redirect to in the
event where next_url is not legit. | [
"Makes",
"sure",
"it",
"s",
"a",
"legit",
"site",
"to",
"redirect",
"to",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/utils/urls.py#L53-L66 |
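A hedged usage sketch for safe_redirect above: only values accepted by is_legit_next_url are followed, otherwise the default (and finally '/') wins. The view function and the 'dashboard' URL name are invented for illustration.

from django_core.utils.urls import safe_redirect

def finish_signup(request):
    # An attacker-controlled ?next= value is only honoured if it looks legit.
    next_url = request.GET.get('next')
    return safe_redirect(next_url, default='dashboard')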
InfoAgeTech/django-core | django_core/utils/urls.py | replace_url_query_values | def replace_url_query_values(url, replace_vals):
"""Replace querystring values in a url string.
>>> url = 'http://helloworld.com/some/path?test=5'
>>> replace_vals = {'test': 10}
>>> replace_url_query_values(url=url, replace_vals=replace_vals)
'http://helloworld.com/some/path?test=10'
"""
if '?' not in url:
return url
parsed_url = urlparse(url)
query = dict(parse_qsl(parsed_url.query))
query.update(replace_vals)
return '{0}?{1}'.format(url.split('?')[0], urlencode(query)) | python | def replace_url_query_values(url, replace_vals):
"""Replace querystring values in a url string.
>>> url = 'http://helloworld.com/some/path?test=5'
>>> replace_vals = {'test': 10}
>>> replace_url_query_values(url=url, replace_vals=replace_vals)
'http://helloworld.com/some/path?test=10'
"""
if '?' not in url:
return url
parsed_url = urlparse(url)
query = dict(parse_qsl(parsed_url.query))
query.update(replace_vals)
return '{0}?{1}'.format(url.split('?')[0], urlencode(query)) | [
"def",
"replace_url_query_values",
"(",
"url",
",",
"replace_vals",
")",
":",
"if",
"'?'",
"not",
"in",
"url",
":",
"return",
"url",
"parsed_url",
"=",
"urlparse",
"(",
"url",
")",
"query",
"=",
"dict",
"(",
"parse_qsl",
"(",
"parsed_url",
".",
"query",
")",
")",
"query",
".",
"update",
"(",
"replace_vals",
")",
"return",
"'{0}?{1}'",
".",
"format",
"(",
"url",
".",
"split",
"(",
"'?'",
")",
"[",
"0",
"]",
",",
"urlencode",
"(",
"query",
")",
")"
] | Replace querystring values in a url string.
>>> url = 'http://helloworld.com/some/path?test=5'
>>> replace_vals = {'test': 10}
>>> replace_url_query_values(url=url, replace_vals=replace_vals)
'http://helloworld.com/some/path?test=10' | [
"Replace",
"querystring",
"values",
"in",
"a",
"url",
"string",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/utils/urls.py#L82-L96 |
InfoAgeTech/django-core | django_core/utils/urls.py | get_query_values_from_url | def get_query_values_from_url(url, keys=None):
"""Gets query string values from a url.
If a list of keys is provided, then a dict will be returned. If only a
single string key is provided, then only a single value will be returned.
>>> url = 'http://helloworld.com/some/path?test=5&hello=world&john=doe'
>>> get_query_values_from_url(url=url, keys='test')
"5"
>>> get_query_values_from_url(url=url, keys=['test'])
{'test': '5'}
>>> get_query_values_from_url(url=url, keys=['test', 'john'])
{'test': '5', 'john': 'doe'}
>>> get_query_values_from_url(url=url, keys=['test', 'john', 'blah'])
{'test': '5', 'john': 'doe', 'blah': None}
"""
if not url or '?' not in url:
# no query params
return None
parsed_url = urlparse(url)
query = dict(parse_qsl(parsed_url.query))
if keys is None:
return query
if isinstance(keys, string_types):
return query.get(keys)
return {k: query.get(k) for k in keys} | python | def get_query_values_from_url(url, keys=None):
"""Gets query string values from a url.
If a list of keys is provided, then a dict will be returned. If only a
single string key is provided, then only a single value will be returned.
>>> url = 'http://helloworld.com/some/path?test=5&hello=world&john=doe'
>>> get_query_values_from_url(url=url, keys='test')
"5"
>>> get_query_values_from_url(url=url, keys=['test'])
{'test': '5'}
>>> get_query_values_from_url(url=url, keys=['test', 'john'])
{'test': '5', 'john': 'doe'}
>>> get_query_values_from_url(url=url, keys=['test', 'john', 'blah'])
{'test': '5', 'john': 'doe', 'blah': None}
"""
if not url or '?' not in url:
# no query params
return None
parsed_url = urlparse(url)
query = dict(parse_qsl(parsed_url.query))
if keys is None:
return query
if isinstance(keys, string_types):
return query.get(keys)
return {k: query.get(k) for k in keys} | [
"def",
"get_query_values_from_url",
"(",
"url",
",",
"keys",
"=",
"None",
")",
":",
"if",
"not",
"url",
"or",
"'?'",
"not",
"in",
"url",
":",
"# no query params",
"return",
"None",
"parsed_url",
"=",
"urlparse",
"(",
"url",
")",
"query",
"=",
"dict",
"(",
"parse_qsl",
"(",
"parsed_url",
".",
"query",
")",
")",
"if",
"keys",
"is",
"None",
":",
"return",
"query",
"if",
"isinstance",
"(",
"keys",
",",
"string_types",
")",
":",
"return",
"query",
".",
"get",
"(",
"keys",
")",
"return",
"{",
"k",
":",
"query",
".",
"get",
"(",
"k",
")",
"for",
"k",
"in",
"keys",
"}"
] | Gets query string values from a url.
If a list of keys is provided, then a dict will be returned. If only a
single string key is provided, then only a single value will be returned.
>>> url = 'http://helloworld.com/some/path?test=5&hello=world&john=doe'
>>> get_query_values_from_url(url=url, keys='test')
"5"
>>> get_query_values_from_url(url=url, keys=['test'])
{'test': '5'}
>>> get_query_values_from_url(url=url, keys=['test', 'john'])
{'test': '5', 'john': 'doe'}
>>> get_query_values_from_url(url=url, keys=['test', 'john', 'blah'])
{'test': '5', 'john': 'doe', 'blah': None} | [
"Gets",
"query",
"string",
"values",
"from",
"a",
"url",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/utils/urls.py#L99-L128 |
InfoAgeTech/django-core | django_core/views/response.py | JSONResponseMixin.get_json_response | def get_json_response(self, content, **kwargs):
"""Returns a json response object."""
# Don't care to return a django form or view in the response here.
# Remove those from the context.
if isinstance(content, dict):
response_content = {k: deepcopy(v) for k, v in content.items()
if k not in ('form', 'view') or k in ('form', 'view')
and not isinstance(v, (Form, View))}
else:
response_content = content
return HttpResponse(content=json.dumps(response_content),
content_type='application/json; charset=utf-8',
**kwargs) | python | def get_json_response(self, content, **kwargs):
"""Returns a json response object."""
# Don't care to return a django form or view in the response here.
# Remove those from the context.
if isinstance(content, dict):
response_content = {k: deepcopy(v) for k, v in content.items()
if k not in ('form', 'view') or k in ('form', 'view')
and not isinstance(v, (Form, View))}
else:
response_content = content
return HttpResponse(content=json.dumps(response_content),
content_type='application/json; charset=utf-8',
**kwargs) | [
"def",
"get_json_response",
"(",
"self",
",",
"content",
",",
"*",
"*",
"kwargs",
")",
":",
"# Don't care to return a django form or view in the response here.",
"# Remove those from the context.",
"if",
"isinstance",
"(",
"content",
",",
"dict",
")",
":",
"response_content",
"=",
"{",
"k",
":",
"deepcopy",
"(",
"v",
")",
"for",
"k",
",",
"v",
"in",
"content",
".",
"items",
"(",
")",
"if",
"k",
"not",
"in",
"(",
"'form'",
",",
"'view'",
")",
"or",
"k",
"in",
"(",
"'form'",
",",
"'view'",
")",
"and",
"not",
"isinstance",
"(",
"v",
",",
"(",
"Form",
",",
"View",
")",
")",
"}",
"else",
":",
"response_content",
"=",
"content",
"return",
"HttpResponse",
"(",
"content",
"=",
"json",
".",
"dumps",
"(",
"response_content",
")",
",",
"content_type",
"=",
"'application/json; charset=utf-8'",
",",
"*",
"*",
"kwargs",
")"
] | Returns a json response object. | [
"Returns",
"a",
"json",
"response",
"object",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/views/response.py#L23-L37 |
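A minimal sketch of using JSONResponseMixin.get_json_response above from a class-based view; note how the 'view' key is filtered out of the serialised payload because its value is a View instance. Everything except the mixin itself is an invented example.

from django.views.generic import View
from django_core.views.response import JSONResponseMixin

class StatusView(JSONResponseMixin, View):
    def get(self, request, *args, **kwargs):
        context = {'status': 'ok', 'view': self}
        # The 'view' entry is dropped before the dict is JSON-encoded.
        return self.get_json_response(context)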
InfoAgeTech/django-core | django_core/views/response.py | JSONHybridDeleteView.delete | def delete(self, request, *args, **kwargs):
"""
Calls the delete() method on the fetched object and then
redirects to the success URL.
"""
self.object = self.get_object()
success_url = self.get_success_url()
self.object.delete()
if self.request.is_ajax():
return JSONResponseMixin.render_to_response(self, context={})
return HttpResponseRedirect(success_url) | python | def delete(self, request, *args, **kwargs):
"""
Calls the delete() method on the fetched object and then
redirects to the success URL.
"""
self.object = self.get_object()
success_url = self.get_success_url()
self.object.delete()
if self.request.is_ajax():
return JSONResponseMixin.render_to_response(self, context={})
return HttpResponseRedirect(success_url) | [
"def",
"delete",
"(",
"self",
",",
"request",
",",
"*",
"args",
",",
"*",
"*",
"kwargs",
")",
":",
"self",
".",
"object",
"=",
"self",
".",
"get_object",
"(",
")",
"success_url",
"=",
"self",
".",
"get_success_url",
"(",
")",
"self",
".",
"object",
".",
"delete",
"(",
")",
"if",
"self",
".",
"request",
".",
"is_ajax",
"(",
")",
":",
"return",
"JSONResponseMixin",
".",
"render_to_response",
"(",
"self",
",",
"context",
"=",
"{",
"}",
")",
"return",
"HttpResponseRedirect",
"(",
"success_url",
")"
] | Calls the delete() method on the fetched object and then
redirects to the success URL. | [
"Calls",
"the",
"delete",
"()",
"method",
"on",
"the",
"fetched",
"object",
"and",
"then",
"redirects",
"to",
"the",
"success",
"URL",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/views/response.py#L166-L178 |
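A possible subclass of JSONHybridDeleteView above: AJAX callers get an empty JSON body, regular requests are redirected to success_url. The Note model, its import path and the URL name are assumptions, not from the source.

from django.urls import reverse_lazy
from django_core.views.response import JSONHybridDeleteView
from notes.models import Note  # hypothetical model

class NoteDeleteView(JSONHybridDeleteView):
    model = Note
    success_url = reverse_lazy('note-list')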
MarcoFavorito/flloat | flloat/flloat.py | find_atomics | def find_atomics(formula: Formula) -> Set[PLAtomic]:
"""Finds all the atomic formulas"""
f = formula
res = set()
if isinstance(formula, PLFormula):
res = formula.find_atomics()
# elif isinstance(f, PLNot):
# res = res.union(find_atomics(f.f))
# elif isinstance(f, PLBinaryOperator):
# for subf in f.formulas:
# res = res.union(find_atomics(subf))
else:
res.add(f)
return res | python | def find_atomics(formula: Formula) -> Set[PLAtomic]:
"""Finds all the atomic formulas"""
f = formula
res = set()
if isinstance(formula, PLFormula):
res = formula.find_atomics()
# elif isinstance(f, PLNot):
# res = res.union(find_atomics(f.f))
# elif isinstance(f, PLBinaryOperator):
# for subf in f.formulas:
# res = res.union(find_atomics(subf))
else:
res.add(f)
return res | [
"def",
"find_atomics",
"(",
"formula",
":",
"Formula",
")",
"->",
"Set",
"[",
"PLAtomic",
"]",
":",
"f",
"=",
"formula",
"res",
"=",
"set",
"(",
")",
"if",
"isinstance",
"(",
"formula",
",",
"PLFormula",
")",
":",
"res",
"=",
"formula",
".",
"find_atomics",
"(",
")",
"# elif isinstance(f, PLNot):",
"# res = res.union(find_atomics(f.f))",
"# elif isinstance(f, PLBinaryOperator):",
"# for subf in f.formulas:",
"# res = res.union(find_atomics(subf))",
"else",
":",
"res",
".",
"add",
"(",
"f",
")",
"return",
"res"
] | Finds all the atomic formulas | [
"Finds",
"all",
"the",
"atomic",
"formulas"
] | train | https://github.com/MarcoFavorito/flloat/blob/5e6de1bea444b68d46d288834031860a8b2f8c2d/flloat/flloat.py#L18-L31 |
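A sketch of calling find_atomics above on a propositional formula built from the same classes this module already uses (PLAtomic, PLAnd, Symbol). The import paths are assumptions about flloat's layout and may differ between versions.

from flloat.base.Symbols import Symbol        # assumed module path
from flloat.syntax.pl import PLAtomic, PLAnd  # assumed module path
from flloat.flloat import find_atomics

a, b = PLAtomic(Symbol("a")), PLAtomic(Symbol("b"))
atoms = find_atomics(PLAnd([a, b]))
# Expected: a set containing the two atomic subformulas a and b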
MarcoFavorito/flloat | flloat/flloat.py | _transform_delta | def _transform_delta(f:Formula, formula2AtomicFormula):
"""From a Propositional Formula to a Propositional Formula
with non-propositional subformulas replaced with a "freezed" atomic formula."""
t = type(f)
if t == PLNot:
return PLNot(_transform_delta(f, formula2AtomicFormula))
# elif isinstance(f, PLBinaryOperator): #PLAnd, PLOr, PLImplies, PLEquivalence
elif t == PLAnd or t == PLOr or t == PLImplies or t == PLEquivalence:
return t([_transform_delta(subf, formula2AtomicFormula) for subf in f.formulas])
elif t == PLTrue or t == PLFalse:
return f
else:
return formula2AtomicFormula[f] | python | def _transform_delta(f:Formula, formula2AtomicFormula):
"""From a Propositional Formula to a Propositional Formula
with non-propositional subformulas replaced with a "freezed" atomic formula."""
t = type(f)
if t == PLNot:
return PLNot(_transform_delta(f, formula2AtomicFormula))
# elif isinstance(f, PLBinaryOperator): #PLAnd, PLOr, PLImplies, PLEquivalence
elif t == PLAnd or t == PLOr or t == PLImplies or t == PLEquivalence:
return t([_transform_delta(subf, formula2AtomicFormula) for subf in f.formulas])
elif t == PLTrue or t == PLFalse:
return f
else:
return formula2AtomicFormula[f] | [
"def",
"_transform_delta",
"(",
"f",
":",
"Formula",
",",
"formula2AtomicFormula",
")",
":",
"t",
"=",
"type",
"(",
"f",
")",
"if",
"t",
"==",
"PLNot",
":",
"return",
"PLNot",
"(",
"_transform_delta",
"(",
"f",
",",
"formula2AtomicFormula",
")",
")",
"# elif isinstance(f, PLBinaryOperator): #PLAnd, PLOr, PLImplies, PLEquivalence",
"elif",
"t",
"==",
"PLAnd",
"or",
"t",
"==",
"PLOr",
"or",
"t",
"==",
"PLImplies",
"or",
"t",
"==",
"PLEquivalence",
":",
"return",
"t",
"(",
"[",
"_transform_delta",
"(",
"subf",
",",
"formula2AtomicFormula",
")",
"for",
"subf",
"in",
"f",
".",
"formulas",
"]",
")",
"elif",
"t",
"==",
"PLTrue",
"or",
"t",
"==",
"PLFalse",
":",
"return",
"f",
"else",
":",
"return",
"formula2AtomicFormula",
"[",
"f",
"]"
] | From a Propositional Formula to a Propositional Formula
with non-propositional subformulas replaced with a "freezed" atomic formula. | [
"From",
"a",
"Propositional",
"Formula",
"to",
"a",
"Propositional",
"Formula",
"with",
"non",
"-",
"propositional",
"subformulas",
"replaced",
"with",
"a",
"freezed",
"atomic",
"formula",
"."
] | train | https://github.com/MarcoFavorito/flloat/blob/5e6de1bea444b68d46d288834031860a8b2f8c2d/flloat/flloat.py#L33-L45 |
MarcoFavorito/flloat | flloat/flloat.py | to_automaton_ | def to_automaton_(f, labels:Set[Symbol]=None):
"""
DEPRECATED
From a LDLfFormula, build the automaton.
:param f: a LDLfFormula;
:param labels: a set of Symbol, the fluents of our domain. If None, retrieve them from the formula;
:param determinize: True if you need to determinize the NFA, obtaining a DFA;
:param minimize: True if you need to minimize the DFA (if determinize is False this flag has no effect.)
:return: an NFA or a DFA which accepts the same traces that make the formula True.
"""
nnf = f.to_nnf()
if labels is None:
# if the labels of the formula are not specified in input,
# retrieve them from the formula
labels = nnf.find_labels()
# the alphabet is the powerset of the set of fluents
alphabet = powerset(labels)
initial_state = MacroState({nnf})
final_states = {MacroState()}
delta = set()
d = f.delta(PLFalseInterpretation(), epsilon=True)
if d.truth(d):
final_states.add(initial_state)
states = {MacroState(), initial_state}
states_changed, delta_changed = True, True
while states_changed or delta_changed:
states_changed, delta_changed = False, False
for actions_set in alphabet:
states_list = list(states)
for q in states_list:
# delta function applied to every formula in the macro state Q
delta_formulas = [f.delta(actions_set) for f in q]
# find the list of atoms, which are "true" atoms (i.e. propositional atoms) or LDLf formulas
atomics = [s for subf in delta_formulas for s in find_atomics(subf)]
# "freeze" the found atoms as symbols and build a mapping from symbols to formulas
symbol2formula = {Symbol(str(f)): f for f in atomics if f != PLTrue() and f != PLFalse()}
# build a map from formula to a "freezed" propositional Atomic Formula
formula2atomic_formulas = {
f: PLAtomic(Symbol(str(f)))
if f != PLTrue() and f != PLFalse()# and not isinstance(f, PLAtomic)
else f for f in atomics
}
# the final list of Propositional Atomic Formulas, one for each formula in the original macro state Q
transformed_delta_formulas = [_transform_delta(f, formula2atomic_formulas) for f in delta_formulas]
# the empty conjunction stands for true
if len(transformed_delta_formulas) == 0:
conjunctions = PLTrue()
elif len(transformed_delta_formulas) == 1:
conjunctions = transformed_delta_formulas[0]
else:
conjunctions = PLAnd(transformed_delta_formulas)
# the model in this case is the smallest set of symbols s.t. the conjunction of "freezed" atomic formula
# is true.
models = frozenset(conjunctions.minimal_models(Alphabet(symbol2formula)))
if len(models) == 0:
continue
for min_model in models:
q_prime = MacroState(
{symbol2formula[s] for s in min_model.true_propositions})
len_before = len(states)
states.add(q_prime)
if len(states) == len_before + 1:
states_list.append(q_prime)
states_changed = True
len_before = len(delta)
delta.add((q, actions_set, q_prime))
if len(delta) == len_before + 1:
delta_changed = True
# check if q_prime should be added as final state
if len(q_prime) == 0:
final_states.add(q_prime)
else:
subf_deltas = [subf.delta(PLFalseInterpretation(), epsilon=True) for subf in q_prime]
if len(subf_deltas)==1:
q_prime_delta_conjunction = subf_deltas[0]
else:
q_prime_delta_conjunction = PLAnd(subf_deltas)
if q_prime_delta_conjunction.truth(PLFalseInterpretation()):
final_states.add(q_prime)
alphabet = PythomataAlphabet({PLInterpretation(set(sym)) for sym in alphabet})
delta = frozenset((i, PLInterpretation(set(a)), o) for i, a, o in delta)
nfa = NFA.fromTransitions(
alphabet=alphabet,
states=frozenset(states),
initial_state=initial_state,
accepting_states=frozenset(final_states),
transitions=delta
)
return nfa | python | def to_automaton_(f, labels:Set[Symbol]=None):
"""
DEPRECATED
From a LDLfFormula, build the automaton.
:param f: a LDLfFormula;
:param labels: a set of Symbol, the fluents of our domain. If None, retrieve them from the formula;
:param determinize: True if you need to determinize the NFA, obtaining a DFA;
:param minimize: True if you need to minimize the DFA (if determinize is False this flag has no effect.)
:return: an NFA or a DFA which accepts the same traces that make the formula True.
"""
nnf = f.to_nnf()
if labels is None:
# if the labels of the formula are not specified in input,
# retrieve them from the formula
labels = nnf.find_labels()
# the alphabet is the powerset of the set of fluents
alphabet = powerset(labels)
initial_state = MacroState({nnf})
final_states = {MacroState()}
delta = set()
d = f.delta(PLFalseInterpretation(), epsilon=True)
if d.truth(d):
final_states.add(initial_state)
states = {MacroState(), initial_state}
states_changed, delta_changed = True, True
while states_changed or delta_changed:
states_changed, delta_changed = False, False
for actions_set in alphabet:
states_list = list(states)
for q in states_list:
# delta function applied to every formula in the macro state Q
delta_formulas = [f.delta(actions_set) for f in q]
# find the list of atoms, which are "true" atoms (i.e. propositional atoms) or LDLf formulas
atomics = [s for subf in delta_formulas for s in find_atomics(subf)]
# "freeze" the found atoms as symbols and build a mapping from symbols to formulas
symbol2formula = {Symbol(str(f)): f for f in atomics if f != PLTrue() and f != PLFalse()}
# build a map from formula to a "freezed" propositional Atomic Formula
formula2atomic_formulas = {
f: PLAtomic(Symbol(str(f)))
if f != PLTrue() and f != PLFalse()# and not isinstance(f, PLAtomic)
else f for f in atomics
}
# the final list of Propositional Atomic Formulas, one for each formula in the original macro state Q
transformed_delta_formulas = [_transform_delta(f, formula2atomic_formulas) for f in delta_formulas]
# the empty conjunction stands for true
if len(transformed_delta_formulas) == 0:
conjunctions = PLTrue()
elif len(transformed_delta_formulas) == 1:
conjunctions = transformed_delta_formulas[0]
else:
conjunctions = PLAnd(transformed_delta_formulas)
# the model in this case is the smallest set of symbols s.t. the conjunction of "freezed" atomic formula
# is true.
models = frozenset(conjunctions.minimal_models(Alphabet(symbol2formula)))
if len(models) == 0:
continue
for min_model in models:
q_prime = MacroState(
{symbol2formula[s] for s in min_model.true_propositions})
len_before = len(states)
states.add(q_prime)
if len(states) == len_before + 1:
states_list.append(q_prime)
states_changed = True
len_before = len(delta)
delta.add((q, actions_set, q_prime))
if len(delta) == len_before + 1:
delta_changed = True
# check if q_prime should be added as final state
if len(q_prime) == 0:
final_states.add(q_prime)
else:
subf_deltas = [subf.delta(PLFalseInterpretation(), epsilon=True) for subf in q_prime]
if len(subf_deltas)==1:
q_prime_delta_conjunction = subf_deltas[0]
else:
q_prime_delta_conjunction = PLAnd(subf_deltas)
if q_prime_delta_conjunction.truth(PLFalseInterpretation()):
final_states.add(q_prime)
alphabet = PythomataAlphabet({PLInterpretation(set(sym)) for sym in alphabet})
delta = frozenset((i, PLInterpretation(set(a)), o) for i, a, o in delta)
nfa = NFA.fromTransitions(
alphabet=alphabet,
states=frozenset(states),
initial_state=initial_state,
accepting_states=frozenset(final_states),
transitions=delta
)
return nfa | [
"def",
"to_automaton_",
"(",
"f",
",",
"labels",
":",
"Set",
"[",
"Symbol",
"]",
"=",
"None",
")",
":",
"nnf",
"=",
"f",
".",
"to_nnf",
"(",
")",
"if",
"labels",
"is",
"None",
":",
"# if the labels of the formula are not specified in input,",
"# retrieve them from the formula",
"labels",
"=",
"nnf",
".",
"find_labels",
"(",
")",
"# the alphabet is the powerset of the set of fluents",
"alphabet",
"=",
"powerset",
"(",
"labels",
")",
"initial_state",
"=",
"MacroState",
"(",
"{",
"nnf",
"}",
")",
"final_states",
"=",
"{",
"MacroState",
"(",
")",
"}",
"delta",
"=",
"set",
"(",
")",
"d",
"=",
"f",
".",
"delta",
"(",
"PLFalseInterpretation",
"(",
")",
",",
"epsilon",
"=",
"True",
")",
"if",
"d",
".",
"truth",
"(",
"d",
")",
":",
"final_states",
".",
"add",
"(",
"initial_state",
")",
"states",
"=",
"{",
"MacroState",
"(",
")",
",",
"initial_state",
"}",
"states_changed",
",",
"delta_changed",
"=",
"True",
",",
"True",
"while",
"states_changed",
"or",
"delta_changed",
":",
"states_changed",
",",
"delta_changed",
"=",
"False",
",",
"False",
"for",
"actions_set",
"in",
"alphabet",
":",
"states_list",
"=",
"list",
"(",
"states",
")",
"for",
"q",
"in",
"states_list",
":",
"# delta function applied to every formula in the macro state Q",
"delta_formulas",
"=",
"[",
"f",
".",
"delta",
"(",
"actions_set",
")",
"for",
"f",
"in",
"q",
"]",
"# find the list of atoms, which are \"true\" atoms (i.e. propositional atoms) or LDLf formulas",
"atomics",
"=",
"[",
"s",
"for",
"subf",
"in",
"delta_formulas",
"for",
"s",
"in",
"find_atomics",
"(",
"subf",
")",
"]",
"# \"freeze\" the found atoms as symbols and build a mapping from symbols to formulas",
"symbol2formula",
"=",
"{",
"Symbol",
"(",
"str",
"(",
"f",
")",
")",
":",
"f",
"for",
"f",
"in",
"atomics",
"if",
"f",
"!=",
"PLTrue",
"(",
")",
"and",
"f",
"!=",
"PLFalse",
"(",
")",
"}",
"# build a map from formula to a \"freezed\" propositional Atomic Formula",
"formula2atomic_formulas",
"=",
"{",
"f",
":",
"PLAtomic",
"(",
"Symbol",
"(",
"str",
"(",
"f",
")",
")",
")",
"if",
"f",
"!=",
"PLTrue",
"(",
")",
"and",
"f",
"!=",
"PLFalse",
"(",
")",
"# and not isinstance(f, PLAtomic)",
"else",
"f",
"for",
"f",
"in",
"atomics",
"}",
"# the final list of Propositional Atomic Formulas, one for each formula in the original macro state Q",
"transformed_delta_formulas",
"=",
"[",
"_transform_delta",
"(",
"f",
",",
"formula2atomic_formulas",
")",
"for",
"f",
"in",
"delta_formulas",
"]",
"# the empty conjunction stands for true",
"if",
"len",
"(",
"transformed_delta_formulas",
")",
"==",
"0",
":",
"conjunctions",
"=",
"PLTrue",
"(",
")",
"elif",
"len",
"(",
"transformed_delta_formulas",
")",
"==",
"1",
":",
"conjunctions",
"=",
"transformed_delta_formulas",
"[",
"0",
"]",
"else",
":",
"conjunctions",
"=",
"PLAnd",
"(",
"transformed_delta_formulas",
")",
"# the model in this case is the smallest set of symbols s.t. the conjunction of \"freezed\" atomic formula",
"# is true.",
"models",
"=",
"frozenset",
"(",
"conjunctions",
".",
"minimal_models",
"(",
"Alphabet",
"(",
"symbol2formula",
")",
")",
")",
"if",
"len",
"(",
"models",
")",
"==",
"0",
":",
"continue",
"for",
"min_model",
"in",
"models",
":",
"q_prime",
"=",
"MacroState",
"(",
"{",
"symbol2formula",
"[",
"s",
"]",
"for",
"s",
"in",
"min_model",
".",
"true_propositions",
"}",
")",
"len_before",
"=",
"len",
"(",
"states",
")",
"states",
".",
"add",
"(",
"q_prime",
")",
"if",
"len",
"(",
"states",
")",
"==",
"len_before",
"+",
"1",
":",
"states_list",
".",
"append",
"(",
"q_prime",
")",
"states_changed",
"=",
"True",
"len_before",
"=",
"len",
"(",
"delta",
")",
"delta",
".",
"add",
"(",
"(",
"q",
",",
"actions_set",
",",
"q_prime",
")",
")",
"if",
"len",
"(",
"delta",
")",
"==",
"len_before",
"+",
"1",
":",
"delta_changed",
"=",
"True",
"# check if q_prime should be added as final state",
"if",
"len",
"(",
"q_prime",
")",
"==",
"0",
":",
"final_states",
".",
"add",
"(",
"q_prime",
")",
"else",
":",
"subf_deltas",
"=",
"[",
"subf",
".",
"delta",
"(",
"PLFalseInterpretation",
"(",
")",
",",
"epsilon",
"=",
"True",
")",
"for",
"subf",
"in",
"q_prime",
"]",
"if",
"len",
"(",
"subf_deltas",
")",
"==",
"1",
":",
"q_prime_delta_conjunction",
"=",
"subf_deltas",
"[",
"0",
"]",
"else",
":",
"q_prime_delta_conjunction",
"=",
"PLAnd",
"(",
"subf_deltas",
")",
"if",
"q_prime_delta_conjunction",
".",
"truth",
"(",
"PLFalseInterpretation",
"(",
")",
")",
":",
"final_states",
".",
"add",
"(",
"q_prime",
")",
"alphabet",
"=",
"PythomataAlphabet",
"(",
"{",
"PLInterpretation",
"(",
"set",
"(",
"sym",
")",
")",
"for",
"sym",
"in",
"alphabet",
"}",
")",
"delta",
"=",
"frozenset",
"(",
"(",
"i",
",",
"PLInterpretation",
"(",
"set",
"(",
"a",
")",
")",
",",
"o",
")",
"for",
"i",
",",
"a",
",",
"o",
"in",
"delta",
")",
"nfa",
"=",
"NFA",
".",
"fromTransitions",
"(",
"alphabet",
"=",
"alphabet",
",",
"states",
"=",
"frozenset",
"(",
"states",
")",
",",
"initial_state",
"=",
"initial_state",
",",
"accepting_states",
"=",
"frozenset",
"(",
"final_states",
")",
",",
"transitions",
"=",
"delta",
")",
"return",
"nfa"
] | DEPRECATED
From a LDLfFormula, build the automaton.
:param f: a LDLfFormula;
:param labels: a set of Symbol, the fluents of our domain. If None, retrieve them from the formula;
:param determinize: True if you need to determinize the NFA, obtaining a DFA;
:param minimize: True if you need to minimize the DFA (if determinize is False this flag has no effect.)
:return: an NFA or a DFA which accepts the same traces that make the formula True. | [
"DEPRECATED",
"From",
"a",
"LDLfFormula",
"build",
"the",
"automaton",
".",
":",
"param",
"f",
":",
"a",
"LDLfFormula",
";",
":",
"param",
"labels",
":",
"a",
"set",
"of",
"Symbol",
"the",
"fluents",
"of",
"our",
"domain",
".",
"If",
"None",
"retrieve",
"them",
"from",
"the",
"formula",
";",
":",
"param",
"determinize",
":",
"True",
"if",
"you",
"need",
"to",
"determinize",
"the",
"NFA",
"obtaining",
"a",
"DFA",
";",
":",
"param",
"minimize",
":",
"True",
"if",
"you",
"need",
"to",
"minimize",
"the",
"DFA",
"(",
"if",
"determinize",
"is",
"False",
"this",
"flag",
"has",
"no",
"effect",
".",
")",
":",
"return",
":",
"a",
"NFA",
"or",
"a",
"DFA",
"which",
"accepts",
"the",
"same",
"traces",
"that",
"makes",
"the",
"formula",
"True",
"."
] | train | https://github.com/MarcoFavorito/flloat/blob/5e6de1bea444b68d46d288834031860a8b2f8c2d/flloat/flloat.py#L48-L161 |
InfoAgeTech/django-core | django_core/utils/loading.py | get_setting | def get_setting(key, **kwargs):
"""Gets a settings key or raises an improperly configured error.
:param key: the settings key to get.
:param default: the default value to return if no value is found
"""
has_default = 'default' in kwargs
default_val = kwargs.get('default')
try:
if has_default:
return getattr(settings, key, default_val)
else:
return getattr(settings, key)
except Exception as e:
raise ImproperlyConfigured(
_('"{0}" setting has not been properly set. {1}').format(key, e)
) | python | def get_setting(key, **kwargs):
"""Gets a settings key or raises an improperly configured error.
:param key: the settings key to get.
:param default: the default value to return if no value is found
"""
has_default = 'default' in kwargs
default_val = kwargs.get('default')
try:
if has_default:
return getattr(settings, key, default_val)
else:
return getattr(settings, key)
except Exception as e:
raise ImproperlyConfigured(
_('"{0}" setting has not been properly set. {1}').format(key, e)
) | [
"def",
"get_setting",
"(",
"key",
",",
"*",
"*",
"kwargs",
")",
":",
"has_default",
"=",
"'default'",
"in",
"kwargs",
"default_val",
"=",
"kwargs",
".",
"get",
"(",
"'default'",
")",
"try",
":",
"if",
"has_default",
":",
"return",
"getattr",
"(",
"settings",
",",
"key",
",",
"default_val",
")",
"else",
":",
"return",
"getattr",
"(",
"settings",
",",
"key",
")",
"except",
"Exception",
"as",
"e",
":",
"raise",
"ImproperlyConfigured",
"(",
"_",
"(",
"'\"{0}\" setting has not been properly set. {1}'",
")",
".",
"format",
"(",
"key",
",",
"e",
")",
")"
] | Gets a settings key or raises an improperly configured error.
:param key: the settings key to get.
:param default: the default value to return if no value is found | [
"Gets",
"a",
"settings",
"key",
"or",
"raises",
"an",
"improperly",
"configured",
"error",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/utils/loading.py#L12-L29 |
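A short sketch for get_setting above: with a default it behaves like getattr on django.conf.settings, without one a missing key is surfaced as ImproperlyConfigured. The setting names are invented.

from django_core.utils.loading import get_setting

page_size = get_setting('API_PAGE_SIZE', default=25)  # 25 unless overridden in settings
api_key = get_setting('PAYMENT_API_KEY')              # raises ImproperlyConfigured if unset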
InfoAgeTech/django-core | django_core/utils/loading.py | get_class_from_settings | def get_class_from_settings(settings_key):
"""Gets a class from a setting key. This will first check loaded models,
then look in installed apps, then fall back to importing from lib.
:param settings_key: the key defined in settings to the value for
"""
cls_path = getattr(settings, settings_key, None)
if not cls_path:
raise NotImplementedError()
try:
# First check to see if it's an installed model
return get_model_from_settings(settings_key=settings_key)
except:
try:
# Next, check from installed apps
return get_class_from_settings_from_apps(settings_key=settings_key)
except:
# Last, try to load from the full path
return get_class_from_settings_full_path(settings_key) | python | def get_class_from_settings(settings_key):
"""Gets a class from a setting key. This will first check loaded models,
then look in installed apps, then fall back to importing from lib.
:param settings_key: the key defined in settings to the value for
"""
cls_path = getattr(settings, settings_key, None)
if not cls_path:
raise NotImplementedError()
try:
# First check to see if it's an installed model
return get_model_from_settings(settings_key=settings_key)
except:
try:
# Next, check from installed apps
return get_class_from_settings_from_apps(settings_key=settings_key)
except:
# Last, try to load from the full path
return get_class_from_settings_full_path(settings_key) | [
"def",
"get_class_from_settings",
"(",
"settings_key",
")",
":",
"cls_path",
"=",
"getattr",
"(",
"settings",
",",
"settings_key",
",",
"None",
")",
"if",
"not",
"cls_path",
":",
"raise",
"NotImplementedError",
"(",
")",
"try",
":",
"# First check to see if it's an installed model",
"return",
"get_model_from_settings",
"(",
"settings_key",
"=",
"settings_key",
")",
"except",
":",
"try",
":",
"# Next, check from installed apps",
"return",
"get_class_from_settings_from_apps",
"(",
"settings_key",
"=",
"settings_key",
")",
"except",
":",
"# Last, try to load from the full path",
"return",
"get_class_from_settings_full_path",
"(",
"settings_key",
")"
] | Gets a class from a setting key. This will first check loaded models,
then look in installed apps, then fall back to importing from lib.
:param settings_key: the key defined in settings to the value for | [
"Gets",
"a",
"class",
"from",
"a",
"setting",
"key",
".",
"This",
"will",
"first",
"check",
"loaded",
"models",
"then",
"look",
"in",
"installed",
"apps",
"then",
"fallback",
"to",
"import",
"from",
"lib",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/utils/loading.py#L32-L51 |
InfoAgeTech/django-core | django_core/utils/loading.py | get_model_from_settings | def get_model_from_settings(settings_key):
"""Return the django model from a settings key.
This is the same pattern used for django's "get_user_model()" method. To
allow you to set the model instance to a different model subclass.
:param settings_key: the key defined in settings to the value for
"""
cls_path = getattr(settings, settings_key, None)
if not cls_path:
raise NotImplementedError()
try:
app_label, model_name = cls_path.split('.')
except ValueError:
raise ImproperlyConfigured("{0} must be of the form "
"'app_label.model_name'".format(settings_key))
model = apps.get_model(app_label, model_name)
if model is None:
raise ImproperlyConfigured("{0} refers to model '%s' that has not "
"been installed".format(settings_key))
return model | python | def get_model_from_settings(settings_key):
"""Return the django model from a settings key.
This is the same pattern used for django's "get_user_model()" method. To
allow you to set the model instance to a different model subclass.
:param settings_key: the key defined in settings to the value for
"""
cls_path = getattr(settings, settings_key, None)
if not cls_path:
raise NotImplementedError()
try:
app_label, model_name = cls_path.split('.')
except ValueError:
raise ImproperlyConfigured("{0} must be of the form "
"'app_label.model_name'".format(settings_key))
model = apps.get_model(app_label, model_name)
if model is None:
raise ImproperlyConfigured("{0} refers to model '%s' that has not "
"been installed".format(settings_key))
return model | [
"def",
"get_model_from_settings",
"(",
"settings_key",
")",
":",
"cls_path",
"=",
"getattr",
"(",
"settings",
",",
"settings_key",
",",
"None",
")",
"if",
"not",
"cls_path",
":",
"raise",
"NotImplementedError",
"(",
")",
"try",
":",
"app_label",
",",
"model_name",
"=",
"cls_path",
".",
"split",
"(",
"'.'",
")",
"except",
"ValueError",
":",
"raise",
"ImproperlyConfigured",
"(",
"\"{0} must be of the form \"",
"\"'app_label.model_name'\"",
".",
"format",
"(",
"settings_key",
")",
")",
"model",
"=",
"apps",
".",
"get_model",
"(",
"app_label",
",",
"model_name",
")",
"if",
"model",
"is",
"None",
":",
"raise",
"ImproperlyConfigured",
"(",
"\"{0} refers to model '%s' that has not \"",
"\"been installed\"",
".",
"format",
"(",
"settings_key",
")",
")",
"return",
"model"
] | Return the django model from a settings key.
This is the same pattern used for django's "get_user_model()" method. To
allow you to set the model instance to a different model subclass.
:param settings_key: the key defined in settings to the value for | [
"Return",
"the",
"django",
"model",
"from",
"a",
"settings",
"key",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/utils/loading.py#L54-L79 |
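A sketch of the get_user_model()-style pattern that get_model_from_settings above implements; the setting value must be an 'app_label.ModelName' string naming an installed model. The app, model and setting names are invented.

# settings.py (illustrative)
#   NOTIFICATION_MODEL = 'notifications.Notification'

from django_core.utils.loading import get_model_from_settings

Notification = get_model_from_settings('NOTIFICATION_MODEL')
unread = Notification.objects.filter(read=False)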
InfoAgeTech/django-core | django_core/utils/loading.py | get_class_from_settings_from_apps | def get_class_from_settings_from_apps(settings_key):
"""Try and get a class from a settings path by lookin in installed apps.
"""
cls_path = getattr(settings, settings_key, None)
if not cls_path:
raise NotImplementedError()
try:
app_label = cls_path.split('.')[-2]
model_name = cls_path.split('.')[-1]
except ValueError:
raise ImproperlyConfigured("{0} must be of the form "
"'app_label.model_name'".format(
settings_key))
app = apps.get_app_config(app_label).models_module
if not app:
raise ImproperlyConfigured("{0} setting refers to an app that has not "
"been installed".format(settings_key))
return getattr(app, model_name) | python | def get_class_from_settings_from_apps(settings_key):
"""Try and get a class from a settings path by lookin in installed apps.
"""
cls_path = getattr(settings, settings_key, None)
if not cls_path:
raise NotImplementedError()
try:
app_label = cls_path.split('.')[-2]
model_name = cls_path.split('.')[-1]
except ValueError:
raise ImproperlyConfigured("{0} must be of the form "
"'app_label.model_name'".format(
settings_key))
app = apps.get_app_config(app_label).models_module
if not app:
raise ImproperlyConfigured("{0} setting refers to an app that has not "
"been installed".format(settings_key))
return getattr(app, model_name) | [
"def",
"get_class_from_settings_from_apps",
"(",
"settings_key",
")",
":",
"cls_path",
"=",
"getattr",
"(",
"settings",
",",
"settings_key",
",",
"None",
")",
"if",
"not",
"cls_path",
":",
"raise",
"NotImplementedError",
"(",
")",
"try",
":",
"app_label",
"=",
"cls_path",
".",
"split",
"(",
"'.'",
")",
"[",
"-",
"2",
"]",
"model_name",
"=",
"cls_path",
".",
"split",
"(",
"'.'",
")",
"[",
"-",
"1",
"]",
"except",
"ValueError",
":",
"raise",
"ImproperlyConfigured",
"(",
"\"{0} must be of the form \"",
"\"'app_label.model_name'\"",
".",
"format",
"(",
"settings_key",
")",
")",
"app",
"=",
"apps",
".",
"get_app_config",
"(",
"app_label",
")",
".",
"models_module",
"if",
"not",
"app",
":",
"raise",
"ImproperlyConfigured",
"(",
"\"{0} setting refers to an app that has not \"",
"\"been installed\"",
".",
"format",
"(",
"settings_key",
")",
")",
"return",
"getattr",
"(",
"app",
",",
"model_name",
")"
] | Try and get a class from a settings path by looking in installed apps. | [
"Try",
"and",
"get",
"a",
"class",
"from",
"a",
"settings",
"path",
"by",
"lookin",
"in",
"installed",
"apps",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/utils/loading.py#L82-L104 |
InfoAgeTech/django-core | django_core/utils/loading.py | get_class_from_settings_full_path | def get_class_from_settings_full_path(settings_key):
"""Get a class from it's full path.
Example:
some.path.module.MyClass
"""
cls_path = getattr(settings, settings_key, None)
if not cls_path:
raise NotImplementedError()
try:
module_name, class_name = cls_path.rsplit('.', 1)
except ValueError:
raise ImproperlyConfigured("{0} must be of the form "
"'some.path.module.MyClass'".format(
settings_key))
manager_module = importlib.import_module(module_name)
if not manager_module:
raise ImproperlyConfigured("{0} refers to a module that has not been "
"installed".format(settings_key))
return getattr(manager_module, class_name) | python | def get_class_from_settings_full_path(settings_key):
"""Get a class from it's full path.
Example:
some.path.module.MyClass
"""
cls_path = getattr(settings, settings_key, None)
if not cls_path:
raise NotImplementedError()
try:
module_name, class_name = cls_path.rsplit('.', 1)
except ValueError:
raise ImproperlyConfigured("{0} must be of the form "
"'some.path.module.MyClass'".format(
settings_key))
manager_module = importlib.import_module(module_name)
if not manager_module:
raise ImproperlyConfigured("{0} refers to a module that has not been "
"installed".format(settings_key))
return getattr(manager_module, class_name) | [
"def",
"get_class_from_settings_full_path",
"(",
"settings_key",
")",
":",
"cls_path",
"=",
"getattr",
"(",
"settings",
",",
"settings_key",
",",
"None",
")",
"if",
"not",
"cls_path",
":",
"raise",
"NotImplementedError",
"(",
")",
"try",
":",
"module_name",
",",
"class_name",
"=",
"cls_path",
".",
"rsplit",
"(",
"'.'",
",",
"1",
")",
"except",
"ValueError",
":",
"raise",
"ImproperlyConfigured",
"(",
"\"{0} must be of the form \"",
"\"'some.path.module.MyClass'\"",
".",
"format",
"(",
"settings_key",
")",
")",
"manager_module",
"=",
"importlib",
".",
"import_module",
"(",
"module_name",
")",
"if",
"not",
"manager_module",
":",
"raise",
"ImproperlyConfigured",
"(",
"\"{0} refers to a module that has not been \"",
"\"installed\"",
".",
"format",
"(",
"settings_key",
")",
")",
"return",
"getattr",
"(",
"manager_module",
",",
"class_name",
")"
] | Get a class from its full path.
Example:
some.path.module.MyClass | [
"Get",
"a",
"class",
"from",
"it",
"s",
"full",
"path",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/utils/loading.py#L107-L132 |
InfoAgeTech/django-core | django_core/utils/loading.py | get_function_from_settings | def get_function_from_settings(settings_key):
"""Gets a function from the string path defined in a settings file.
Example:
# my_app/my_file.py
def some_function():
# do something
pass
# settings.py
SOME_FUNCTION = 'my_app.my_file.some_function'
> get_function_from_settings('SOME_FUNCTION')
<function my_app.my_file.some_function>
"""
renderer_func_str = getattr(settings, settings_key, None)
if not renderer_func_str:
return None
module_str, renderer_func_name = renderer_func_str.rsplit('.', 1)
try:
mod = importlib.import_module(module_str)
return getattr(mod, renderer_func_name)
except Exception:
return None | python | def get_function_from_settings(settings_key):
"""Gets a function from the string path defined in a settings file.
Example:
# my_app/my_file.py
def some_function():
# do something
pass
# settings.py
SOME_FUNCTION = 'my_app.my_file.some_function'
> get_function_from_settings('SOME_FUNCTION')
<function my_app.my_file.some_function>
"""
renderer_func_str = getattr(settings, settings_key, None)
if not renderer_func_str:
return None
module_str, renderer_func_name = renderer_func_str.rsplit('.', 1)
try:
mod = importlib.import_module(module_str)
return getattr(mod, renderer_func_name)
except Exception:
return None | [
"def",
"get_function_from_settings",
"(",
"settings_key",
")",
":",
"renderer_func_str",
"=",
"getattr",
"(",
"settings",
",",
"settings_key",
",",
"None",
")",
"if",
"not",
"renderer_func_str",
":",
"return",
"None",
"module_str",
",",
"renderer_func_name",
"=",
"renderer_func_str",
".",
"rsplit",
"(",
"'.'",
",",
"1",
")",
"try",
":",
"mod",
"=",
"importlib",
".",
"import_module",
"(",
"module_str",
")",
"return",
"getattr",
"(",
"mod",
",",
"renderer_func_name",
")",
"except",
"Exception",
":",
"return",
"None"
] | Gets a function from the string path defined in a settings file.
Example:
# my_app/my_file.py
def some_function():
# do something
pass
# settings.py
SOME_FUNCTION = 'my_app.my_file.some_function'
> get_function_from_settings('SOME_FUNCTION')
<function my_app.my_file.some_function> | [
"Gets",
"a",
"function",
"from",
"the",
"string",
"path",
"defined",
"in",
"a",
"settings",
"file",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/utils/loading.py#L135-L162 |
dls-controls/annotypes | annotypes/_anno.py | caller_locals | def caller_locals():
# type: () -> Dict
"""Return the frame object for the caller's stack frame."""
try:
raise ValueError
except ValueError:
_, _, tb = sys.exc_info()
assert tb, "Can't get traceback, this shouldn't happen"
caller_frame = tb.tb_frame.f_back.f_back
return caller_frame.f_locals | python | def caller_locals():
# type: () -> Dict
"""Return the frame object for the caller's stack frame."""
try:
raise ValueError
except ValueError:
_, _, tb = sys.exc_info()
assert tb, "Can't get traceback, this shouldn't happen"
caller_frame = tb.tb_frame.f_back.f_back
return caller_frame.f_locals | [
"def",
"caller_locals",
"(",
")",
":",
"# type: () -> Dict",
"try",
":",
"raise",
"ValueError",
"except",
"ValueError",
":",
"_",
",",
"_",
",",
"tb",
"=",
"sys",
".",
"exc_info",
"(",
")",
"assert",
"tb",
",",
"\"Can't get traceback, this shouldn't happen\"",
"caller_frame",
"=",
"tb",
".",
"tb_frame",
".",
"f_back",
".",
"f_back",
"return",
"caller_frame",
".",
"f_locals"
] | Return the local variables of the caller's stack frame. | [
"Return",
"the",
"frame",
"object",
"for",
"the",
"caller",
"s",
"stack",
"frame",
"."
] | train | https://github.com/dls-controls/annotypes/blob/31ab68a0367bb70ebd9898e8b9fa9405423465bd/annotypes/_anno.py#L43-L52 |
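caller_locals above raises and catches a ValueError only to obtain a traceback, then walks two frames back, so what it returns are the local variables of whoever called the function that invoked it. A tiny demonstration; the helper functions are invented and the import path simply mirrors the file above.

from annotypes._anno import caller_locals

def grab():
    return caller_locals()  # two frames up from inside caller_locals is outer()

def outer():
    x = 42
    return grab()

assert outer()["x"] == 42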
dls-controls/annotypes | annotypes/_anno.py | make_repr | def make_repr(inst, attrs):
# type: (object, Sequence[str]) -> str
"""Create a repr from an instance of a class
Args:
inst: The class instance we are generating a repr of
attrs: The attributes that should appear in the repr
"""
arg_str = ", ".join(
"%s=%r" % (a, getattr(inst, a)) for a in attrs if hasattr(inst, a))
repr_str = "%s(%s)" % (inst.__class__.__name__, arg_str)
return repr_str | python | def make_repr(inst, attrs):
# type: (object, Sequence[str]) -> str
"""Create a repr from an instance of a class
Args:
inst: The class instance we are generating a repr of
attrs: The attributes that should appear in the repr
"""
arg_str = ", ".join(
"%s=%r" % (a, getattr(inst, a)) for a in attrs if hasattr(inst, a))
repr_str = "%s(%s)" % (inst.__class__.__name__, arg_str)
return repr_str | [
"def",
"make_repr",
"(",
"inst",
",",
"attrs",
")",
":",
"# type: (object, Sequence[str]) -> str",
"arg_str",
"=",
"\", \"",
".",
"join",
"(",
"\"%s=%r\"",
"%",
"(",
"a",
",",
"getattr",
"(",
"inst",
",",
"a",
")",
")",
"for",
"a",
"in",
"attrs",
"if",
"hasattr",
"(",
"inst",
",",
"a",
")",
")",
"repr_str",
"=",
"\"%s(%s)\"",
"%",
"(",
"inst",
".",
"__class__",
".",
"__name__",
",",
"arg_str",
")",
"return",
"repr_str"
] | Create a repr from an instance of a class
Args:
inst: The class instance we are generating a repr of
attrs: The attributes that should appear in the repr | [
"Create",
"a",
"repr",
"from",
"an",
"instance",
"of",
"a",
"class"
] | train | https://github.com/dls-controls/annotypes/blob/31ab68a0367bb70ebd9898e8b9fa9405423465bd/annotypes/_anno.py#L55-L66 |
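An illustrative sketch for make_repr (the class and attribute names are invented): attributes that are listed but missing on the instance are skipped rather than raising, because of the hasattr check above.

from annotypes._anno import make_repr  # path from the record above

class Point:
    def __init__(self, x, y):
        self.x = x
        self.y = y

    def __repr__(self):
        # 'z' is requested but never set on the instance, so it is silently omitted.
        return make_repr(self, ["x", "y", "z"])

print(Point(1, 2))  # expected: Point(x=1, y=2)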
openstack/pymod2pkg | pymod2pkg/__init__.py | default_rdo_tr | def default_rdo_tr(mod):
"""
Default translation function for Fedora/RDO based systems
"""
pkg = mod.rsplit('-python')[0]
pkg = pkg.replace('_', '-').replace('.', '-').lower()
if not pkg.startswith('python-'):
pkg = 'python-' + pkg
py2pkg = pkg
py3pkg = re.sub('python', 'python3', pkg)
return (pkg, py2pkg, py3pkg) | python | def default_rdo_tr(mod):
"""
Default translation function for Fedora/RDO based systems
"""
pkg = mod.rsplit('-python')[0]
pkg = pkg.replace('_', '-').replace('.', '-').lower()
if not pkg.startswith('python-'):
pkg = 'python-' + pkg
py2pkg = pkg
py3pkg = re.sub('python', 'python3', pkg)
return (pkg, py2pkg, py3pkg) | [
"def",
"default_rdo_tr",
"(",
"mod",
")",
":",
"pkg",
"=",
"mod",
".",
"rsplit",
"(",
"'-python'",
")",
"[",
"0",
"]",
"pkg",
"=",
"pkg",
".",
"replace",
"(",
"'_'",
",",
"'-'",
")",
".",
"replace",
"(",
"'.'",
",",
"'-'",
")",
".",
"lower",
"(",
")",
"if",
"not",
"pkg",
".",
"startswith",
"(",
"'python-'",
")",
":",
"pkg",
"=",
"'python-'",
"+",
"pkg",
"py2pkg",
"=",
"pkg",
"py3pkg",
"=",
"re",
".",
"sub",
"(",
"'python'",
",",
"'python3'",
",",
"pkg",
")",
"return",
"(",
"pkg",
",",
"py2pkg",
",",
"py3pkg",
")"
] | Default translation function for Fedora/RDO based systems | [
"Default",
"translation",
"function",
"for",
"Fedora",
"/",
"RDO",
"based",
"systems"
] | train | https://github.com/openstack/pymod2pkg/blob/f9a2f02fbfa0b2cfcdb4a7494c9ddbd10859065a/pymod2pkg/__init__.py#L74-L84 |
openstack/pymod2pkg | pymod2pkg/__init__.py | default_ubuntu_tr | def default_ubuntu_tr(mod):
"""
Default translation function for Ubuntu based systems
"""
pkg = 'python-%s' % mod.lower()
py2pkg = pkg
py3pkg = 'python3-%s' % mod.lower()
return (pkg, py2pkg, py3pkg) | python | def default_ubuntu_tr(mod):
"""
Default translation function for Ubuntu based systems
"""
pkg = 'python-%s' % mod.lower()
py2pkg = pkg
py3pkg = 'python3-%s' % mod.lower()
return (pkg, py2pkg, py3pkg) | [
"def",
"default_ubuntu_tr",
"(",
"mod",
")",
":",
"pkg",
"=",
"'python-%s'",
"%",
"mod",
".",
"lower",
"(",
")",
"py2pkg",
"=",
"pkg",
"py3pkg",
"=",
"'python3-%s'",
"%",
"mod",
".",
"lower",
"(",
")",
"return",
"(",
"pkg",
",",
"py2pkg",
",",
"py3pkg",
")"
] | Default translation function for Ubuntu based systems | [
"Default",
"translation",
"function",
"for",
"Ubuntu",
"based",
"systems"
] | train | https://github.com/openstack/pymod2pkg/blob/f9a2f02fbfa0b2cfcdb4a7494c9ddbd10859065a/pymod2pkg/__init__.py#L87-L94 |
openstack/pymod2pkg | pymod2pkg/__init__.py | default_suse_tr | def default_suse_tr(mod):
"""
Default translation function for openSUSE, SLES, and other
SUSE based systems
Returns a tuple of 3 elements - the unversioned name, the python2 versioned
name and the python3 versioned name.
"""
pkg = 'python-%s' % mod
py2pkg = 'python2-%s' % mod
py3pkg = 'python3-%s' % mod
return (pkg, py2pkg, py3pkg) | python | def default_suse_tr(mod):
"""
Default translation function for openSUSE, SLES, and other
SUSE based systems
Returns a tuple of 3 elements - the unversioned name, the python2 versioned
name and the python3 versioned name.
"""
pkg = 'python-%s' % mod
py2pkg = 'python2-%s' % mod
py3pkg = 'python3-%s' % mod
return (pkg, py2pkg, py3pkg) | [
"def",
"default_suse_tr",
"(",
"mod",
")",
":",
"pkg",
"=",
"'python-%s'",
"%",
"mod",
"py2pkg",
"=",
"'python2-%s'",
"%",
"mod",
"py3pkg",
"=",
"'python3-%s'",
"%",
"mod",
"return",
"(",
"pkg",
",",
"py2pkg",
",",
"py3pkg",
")"
] | Default translation function for openSUSE, SLES, and other
SUSE based systems
Returns a tuple of 3 elements - the unversioned name, the python2 versioned
name and the python3 versioned name. | [
"Default",
"translation",
"function",
"for",
"openSUSE",
"SLES",
"and",
"other",
"SUSE",
"based",
"systems"
] | train | https://github.com/openstack/pymod2pkg/blob/f9a2f02fbfa0b2cfcdb4a7494c9ddbd10859065a/pymod2pkg/__init__.py#L97-L108 |
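A sketch comparing the three default translators on one module name; the commented results are what the code above implies for this input, assuming pymod2pkg exposes the functions at package level (they live in pymod2pkg/__init__.py per the records).

import pymod2pkg

print(pymod2pkg.default_rdo_tr('oslo.config'))
# implied: ('python-oslo-config', 'python-oslo-config', 'python3-oslo-config')
print(pymod2pkg.default_ubuntu_tr('oslo.config'))
# implied: ('python-oslo.config', 'python-oslo.config', 'python3-oslo.config')
print(pymod2pkg.default_suse_tr('oslo.config'))
# implied: ('python-oslo.config', 'python2-oslo.config', 'python3-oslo.config')

Only the RDO translator normalises dots and underscores to dashes, which is why its output differs from the Ubuntu and SUSE forms.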
openstack/pymod2pkg | pymod2pkg/__init__.py | module2package | def module2package(mod, dist, pkg_map=None, py_vers=('py',)):
"""Return a corresponding package name for a python module.
mod: python module name
dist: a linux distribution as returned by
`platform.linux_distribution()[0]`
pkg_map: a custom package mapping. None means autodetected based on the
given dist parameter
py_vers: a list of python versions the function should return. Default is
'py' which is the unversioned translation. Possible values are
'py', 'py2' and 'py3'
"""
if not pkg_map:
pkg_map = get_pkg_map(dist)
for rule in pkg_map:
pkglist = rule(mod, dist)
if pkglist:
break
else:
tr_func = get_default_tr_func(dist)
pkglist = tr_func(mod)
output = []
for v in py_vers:
if v == 'py':
output.append(pkglist[0])
elif v == 'py2':
output.append(pkglist[1])
elif v == 'py3':
output.append(pkglist[2])
else:
raise Exception('Invalid version "%s"' % (v))
if len(output) == 1:
# just return a single value (backwards compatible)
return output[0]
else:
return output | python | def module2package(mod, dist, pkg_map=None, py_vers=('py',)):
"""Return a corresponding package name for a python module.
mod: python module name
dist: a linux distribution as returned by
`platform.linux_distribution()[0]`
pkg_map: a custom package mapping. None means autodetected based on the
given dist parameter
py_vers: a list of python versions the function should return. Default is
'py' which is the unversioned translation. Possible values are
'py', 'py2' and 'py3'
"""
if not pkg_map:
pkg_map = get_pkg_map(dist)
for rule in pkg_map:
pkglist = rule(mod, dist)
if pkglist:
break
else:
tr_func = get_default_tr_func(dist)
pkglist = tr_func(mod)
output = []
for v in py_vers:
if v == 'py':
output.append(pkglist[0])
elif v == 'py2':
output.append(pkglist[1])
elif v == 'py3':
output.append(pkglist[2])
else:
raise Exception('Invalid version "%s"' % (v))
if len(output) == 1:
# just return a single value (backwards compatible)
return output[0]
else:
return output | [
"def",
"module2package",
"(",
"mod",
",",
"dist",
",",
"pkg_map",
"=",
"None",
",",
"py_vers",
"=",
"(",
"'py'",
",",
")",
")",
":",
"if",
"not",
"pkg_map",
":",
"pkg_map",
"=",
"get_pkg_map",
"(",
"dist",
")",
"for",
"rule",
"in",
"pkg_map",
":",
"pkglist",
"=",
"rule",
"(",
"mod",
",",
"dist",
")",
"if",
"pkglist",
":",
"break",
"else",
":",
"tr_func",
"=",
"get_default_tr_func",
"(",
"dist",
")",
"pkglist",
"=",
"tr_func",
"(",
"mod",
")",
"output",
"=",
"[",
"]",
"for",
"v",
"in",
"py_vers",
":",
"if",
"v",
"==",
"'py'",
":",
"output",
".",
"append",
"(",
"pkglist",
"[",
"0",
"]",
")",
"elif",
"v",
"==",
"'py2'",
":",
"output",
".",
"append",
"(",
"pkglist",
"[",
"1",
"]",
")",
"elif",
"v",
"==",
"'py3'",
":",
"output",
".",
"append",
"(",
"pkglist",
"[",
"2",
"]",
")",
"else",
":",
"raise",
"Exception",
"(",
"'Invalid version \"%s\"'",
"%",
"(",
"v",
")",
")",
"if",
"len",
"(",
"output",
")",
"==",
"1",
":",
"# just return a single value (backwards compatible)",
"return",
"output",
"[",
"0",
"]",
"else",
":",
"return",
"output"
] | Return a corresponding package name for a python module.
mod: python module name
dist: a linux distribution as returned by
`platform.linux_distribution()[0]`
pkg_map: a custom package mapping. None means autodetected based on the
given dist parameter
py_vers: a list of python versions the function should return. Default is
'py' which is the unversioned translation. Possible values are
'py', 'py2' and 'py3' | [
"Return",
"a",
"corresponding",
"package",
"name",
"for",
"a",
"python",
"module",
"."
] | train | https://github.com/openstack/pymod2pkg/blob/f9a2f02fbfa0b2cfcdb4a7494c9ddbd10859065a/pymod2pkg/__init__.py#L359-L396 |
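A usage sketch for module2package; the module name 'mymod' is invented, and the commented results assume no distro-specific rule in the package maps matches it, so the default translators shown earlier apply.

import pymod2pkg

# A plain string is returned when py_vers is left at its default of ('py',).
print(pymod2pkg.module2package('mymod', 'Ubuntu'))
# expected under the stated assumption: 'python-mymod'

# Requesting several python versions returns a list instead of a string.
print(pymod2pkg.module2package('mymod', 'Fedora', py_vers=('py2', 'py3')))
# expected under the stated assumption: ['python-mymod', 'python3-mymod']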
openstack/pymod2pkg | pymod2pkg/__init__.py | module2upstream | def module2upstream(mod):
"""Return a corresponding OpenStack upstream name for a python module.
mod -- python module name
"""
for rule in OPENSTACK_UPSTREAM_PKG_MAP:
pkglist = rule(mod, dist=None)
if pkglist:
return pkglist[0]
return mod | python | def module2upstream(mod):
"""Return a corresponding OpenStack upstream name for a python module.
mod -- python module name
"""
for rule in OPENSTACK_UPSTREAM_PKG_MAP:
pkglist = rule(mod, dist=None)
if pkglist:
return pkglist[0]
return mod | [
"def",
"module2upstream",
"(",
"mod",
")",
":",
"for",
"rule",
"in",
"OPENSTACK_UPSTREAM_PKG_MAP",
":",
"pkglist",
"=",
"rule",
"(",
"mod",
",",
"dist",
"=",
"None",
")",
"if",
"pkglist",
":",
"return",
"pkglist",
"[",
"0",
"]",
"return",
"mod"
] | Return a corresponding OpenStack upstream name for a python module.
mod -- python module name | [
"Return",
"a",
"corresponding",
"OpenStack",
"upstream",
"name",
"for",
"a",
"python",
"module",
"."
] | train | https://github.com/openstack/pymod2pkg/blob/f9a2f02fbfa0b2cfcdb4a7494c9ddbd10859065a/pymod2pkg/__init__.py#L399-L408 |
openstack/pymod2pkg | pymod2pkg/__init__.py | main | def main():
"""for resolving names from command line"""
parser = argparse.ArgumentParser(description='Python module name to'
'package name')
group = parser.add_mutually_exclusive_group()
group.add_argument('--dist', help='distribution style '
'(default: %(default)s)',
default=platform.linux_distribution()[0])
group.add_argument('--upstream', help='map to OpenStack project name',
action='store_true')
parser.add_argument('--pyver', help='Python versions to return. "py" is '
'the unversioned name',
action='append', choices=['py', 'py2', 'py3'],
default=[])
parser.add_argument('modulename', help='python module name')
args = vars(parser.parse_args())
pyversions = args['pyver'] if args['pyver'] else ['py']
if args['upstream']:
print(module2upstream(args['modulename']))
else:
pylist = module2package(args['modulename'], args['dist'],
py_vers=pyversions)
# When only 1 version is requested, it will be returned as a string,
# for backwards compatibility. Else, it will be a list.
if type(pylist) is list:
print(' '.join(pylist))
else:
print(pylist) | python | def main():
"""for resolving names from command line"""
parser = argparse.ArgumentParser(description='Python module name to'
'package name')
group = parser.add_mutually_exclusive_group()
group.add_argument('--dist', help='distribution style '
'(default: %(default)s)',
default=platform.linux_distribution()[0])
group.add_argument('--upstream', help='map to OpenStack project name',
action='store_true')
parser.add_argument('--pyver', help='Python versions to return. "py" is '
'the unversioned name',
action='append', choices=['py', 'py2', 'py3'],
default=[])
parser.add_argument('modulename', help='python module name')
args = vars(parser.parse_args())
pyversions = args['pyver'] if args['pyver'] else ['py']
if args['upstream']:
print(module2upstream(args['modulename']))
else:
pylist = module2package(args['modulename'], args['dist'],
py_vers=pyversions)
# When only 1 version is requested, it will be returned as a string,
# for backwards compatibility. Else, it will be a list.
if type(pylist) is list:
print(' '.join(pylist))
else:
print(pylist) | [
"def",
"main",
"(",
")",
":",
"parser",
"=",
"argparse",
".",
"ArgumentParser",
"(",
"description",
"=",
"'Python module name to'",
"'package name'",
")",
"group",
"=",
"parser",
".",
"add_mutually_exclusive_group",
"(",
")",
"group",
".",
"add_argument",
"(",
"'--dist'",
",",
"help",
"=",
"'distribution style '",
"'(default: %(default)s)'",
",",
"default",
"=",
"platform",
".",
"linux_distribution",
"(",
")",
"[",
"0",
"]",
")",
"group",
".",
"add_argument",
"(",
"'--upstream'",
",",
"help",
"=",
"'map to OpenStack project name'",
",",
"action",
"=",
"'store_true'",
")",
"parser",
".",
"add_argument",
"(",
"'--pyver'",
",",
"help",
"=",
"'Python versions to return. \"py\" is '",
"'the unversioned name'",
",",
"action",
"=",
"'append'",
",",
"choices",
"=",
"[",
"'py'",
",",
"'py2'",
",",
"'py3'",
"]",
",",
"default",
"=",
"[",
"]",
")",
"parser",
".",
"add_argument",
"(",
"'modulename'",
",",
"help",
"=",
"'python module name'",
")",
"args",
"=",
"vars",
"(",
"parser",
".",
"parse_args",
"(",
")",
")",
"pyversions",
"=",
"args",
"[",
"'pyver'",
"]",
"if",
"args",
"[",
"'pyver'",
"]",
"else",
"[",
"'py'",
"]",
"if",
"args",
"[",
"'upstream'",
"]",
":",
"print",
"(",
"module2upstream",
"(",
"args",
"[",
"'modulename'",
"]",
")",
")",
"else",
":",
"pylist",
"=",
"module2package",
"(",
"args",
"[",
"'modulename'",
"]",
",",
"args",
"[",
"'dist'",
"]",
",",
"py_vers",
"=",
"pyversions",
")",
"# When only 1 version is requested, it will be returned as a string,",
"# for backwards compatibility. Else, it will be a list.",
"if",
"type",
"(",
"pylist",
")",
"is",
"list",
":",
"print",
"(",
"' '",
".",
"join",
"(",
"pylist",
")",
")",
"else",
":",
"print",
"(",
"pylist",
")"
] | for resolving names from command line | [
"for",
"resolving",
"names",
"from",
"command",
"line"
] | train | https://github.com/openstack/pymod2pkg/blob/f9a2f02fbfa0b2cfcdb4a7494c9ddbd10859065a/pymod2pkg/__init__.py#L411-L440 |
InfoAgeTech/django-core | django_core/views/mixins/paging.py | PagingViewMixin.get_paging | def get_paging(self):
"""Gets the paging values passed through the query string params.
* "p" for "page number" and
* "ps" for "page size".
:returns: tuple with the page being the first part and the page size
being the second part.
"""
orig_page_num = self.page_num
orig_page_size = self.page_size
try:
page_num = int(self.request.GET.get(self.page_kwarg or 'p'))
if page_num < 1:
page_num = orig_page_num
except:
page_num = orig_page_num
try:
orig_page_size = self.page_size
page_size = int(self.request.GET.get(self.page_size_kwarg or 'ps'))
if page_size < 1:
page_size = orig_page_size
except:
page_size = orig_page_size
return page_num, page_size | python | def get_paging(self):
"""Gets the paging values passed through the query string params.
* "p" for "page number" and
* "ps" for "page size".
:returns: tuple with the page being the first part and the page size
being the second part.
"""
orig_page_num = self.page_num
orig_page_size = self.page_size
try:
page_num = int(self.request.GET.get(self.page_kwarg or 'p'))
if page_num < 1:
page_num = orig_page_num
except:
page_num = orig_page_num
try:
orig_page_size = self.page_size
page_size = int(self.request.GET.get(self.page_size_kwarg or 'ps'))
if page_size < 1:
page_size = orig_page_size
except:
page_size = orig_page_size
return page_num, page_size | [
"def",
"get_paging",
"(",
"self",
")",
":",
"orig_page_num",
"=",
"self",
".",
"page_num",
"orig_page_size",
"=",
"self",
".",
"page_size",
"try",
":",
"page_num",
"=",
"int",
"(",
"self",
".",
"request",
".",
"GET",
".",
"get",
"(",
"self",
".",
"page_kwarg",
"or",
"'p'",
")",
")",
"if",
"page_num",
"<",
"1",
":",
"page_num",
"=",
"orig_page_num",
"except",
":",
"page_num",
"=",
"orig_page_num",
"try",
":",
"orig_page_size",
"=",
"self",
".",
"page_size",
"page_size",
"=",
"int",
"(",
"self",
".",
"request",
".",
"GET",
".",
"get",
"(",
"self",
".",
"page_size_kwarg",
"or",
"'ps'",
")",
")",
"if",
"page_size",
"<",
"1",
":",
"page_size",
"=",
"orig_page_size",
"except",
":",
"page_size",
"=",
"orig_page_size",
"return",
"page_num",
",",
"page_size"
] | Gets the paging values passed through the query string params.
* "p" for "page number" and
* "ps" for "page size".
:returns: tuple with the page being the first part and the page size
being the second part. | [
"Gets",
"the",
"paging",
"values",
"passed",
"through",
"the",
"query",
"string",
"params",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/views/mixins/paging.py#L38-L68 |
dls-controls/annotypes | annotypes/_calltypes.py | make_call_types | def make_call_types(f, globals_d):
# type: (Callable, Dict) -> Tuple[Dict[str, Anno], Anno]
"""Make a call_types dictionary that describes what arguments to pass to f
Args:
f: The function to inspect for argument names (without self)
globals_d: A dictionary of globals to lookup annotation definitions in
"""
arg_spec = getargspec(f)
args = [k for k in arg_spec.args if k != "self"]
defaults = {} # type: Dict[str, Any]
if arg_spec.defaults:
default_args = args[-len(arg_spec.defaults):]
for a, default in zip(default_args, arg_spec.defaults):
defaults[a] = default
if not getattr(f, "__annotations__", None):
# Make string annotations from the type comment if there is one
annotations = make_annotations(f, globals_d)
else:
annotations = f.__annotations__
call_types = OrderedDict() # type: Dict[str, Anno]
for a in args:
anno = anno_with_default(annotations[a], defaults.get(a, NO_DEFAULT))
assert isinstance(anno, Anno), \
"Argument %r has type %r which is not an Anno" % (a, anno)
call_types[a] = anno
return_type = anno_with_default(annotations.get("return", None))
if return_type is Any:
return_type = Anno("Any return value", Any, "return")
assert return_type is None or isinstance(return_type, Anno), \
"Return has type %r which is not an Anno" % (return_type,)
return call_types, return_type | python | def make_call_types(f, globals_d):
# type: (Callable, Dict) -> Tuple[Dict[str, Anno], Anno]
"""Make a call_types dictionary that describes what arguments to pass to f
Args:
f: The function to inspect for argument names (without self)
globals_d: A dictionary of globals to lookup annotation definitions in
"""
arg_spec = getargspec(f)
args = [k for k in arg_spec.args if k != "self"]
defaults = {} # type: Dict[str, Any]
if arg_spec.defaults:
default_args = args[-len(arg_spec.defaults):]
for a, default in zip(default_args, arg_spec.defaults):
defaults[a] = default
if not getattr(f, "__annotations__", None):
# Make string annotations from the type comment if there is one
annotations = make_annotations(f, globals_d)
else:
annotations = f.__annotations__
call_types = OrderedDict() # type: Dict[str, Anno]
for a in args:
anno = anno_with_default(annotations[a], defaults.get(a, NO_DEFAULT))
assert isinstance(anno, Anno), \
"Argument %r has type %r which is not an Anno" % (a, anno)
call_types[a] = anno
return_type = anno_with_default(annotations.get("return", None))
if return_type is Any:
return_type = Anno("Any return value", Any, "return")
assert return_type is None or isinstance(return_type, Anno), \
"Return has type %r which is not an Anno" % (return_type,)
return call_types, return_type | [
"def",
"make_call_types",
"(",
"f",
",",
"globals_d",
")",
":",
"# type: (Callable, Dict) -> Tuple[Dict[str, Anno], Anno]",
"arg_spec",
"=",
"getargspec",
"(",
"f",
")",
"args",
"=",
"[",
"k",
"for",
"k",
"in",
"arg_spec",
".",
"args",
"if",
"k",
"!=",
"\"self\"",
"]",
"defaults",
"=",
"{",
"}",
"# type: Dict[str, Any]",
"if",
"arg_spec",
".",
"defaults",
":",
"default_args",
"=",
"args",
"[",
"-",
"len",
"(",
"arg_spec",
".",
"defaults",
")",
":",
"]",
"for",
"a",
",",
"default",
"in",
"zip",
"(",
"default_args",
",",
"arg_spec",
".",
"defaults",
")",
":",
"defaults",
"[",
"a",
"]",
"=",
"default",
"if",
"not",
"getattr",
"(",
"f",
",",
"\"__annotations__\"",
",",
"None",
")",
":",
"# Make string annotations from the type comment if there is one",
"annotations",
"=",
"make_annotations",
"(",
"f",
",",
"globals_d",
")",
"else",
":",
"annotations",
"=",
"f",
".",
"__annotations__",
"call_types",
"=",
"OrderedDict",
"(",
")",
"# type: Dict[str, Anno]",
"for",
"a",
"in",
"args",
":",
"anno",
"=",
"anno_with_default",
"(",
"annotations",
"[",
"a",
"]",
",",
"defaults",
".",
"get",
"(",
"a",
",",
"NO_DEFAULT",
")",
")",
"assert",
"isinstance",
"(",
"anno",
",",
"Anno",
")",
",",
"\"Argument %r has type %r which is not an Anno\"",
"%",
"(",
"a",
",",
"anno",
")",
"call_types",
"[",
"a",
"]",
"=",
"anno",
"return_type",
"=",
"anno_with_default",
"(",
"annotations",
".",
"get",
"(",
"\"return\"",
",",
"None",
")",
")",
"if",
"return_type",
"is",
"Any",
":",
"return_type",
"=",
"Anno",
"(",
"\"Any return value\"",
",",
"Any",
",",
"\"return\"",
")",
"assert",
"return_type",
"is",
"None",
"or",
"isinstance",
"(",
"return_type",
",",
"Anno",
")",
",",
"\"Return has type %r which is not an Anno\"",
"%",
"(",
"return_type",
",",
")",
"return",
"call_types",
",",
"return_type"
] | Make a call_types dictionary that describes what arguments to pass to f
Args:
f: The function to inspect for argument names (without self)
globals_d: A dictionary of globals to lookup annotation definitions in | [
"Make",
"a",
"call_types",
"dictionary",
"that",
"describes",
"what",
"arguments",
"to",
"pass",
"to",
"f"
] | train | https://github.com/dls-controls/annotypes/blob/31ab68a0367bb70ebd9898e8b9fa9405423465bd/annotypes/_calltypes.py#L44-L80 |
dls-controls/annotypes | annotypes/_calltypes.py | make_annotations | def make_annotations(f, globals_d=None):
# type: (Callable, Dict) -> Dict[str, Any]
"""Create an annotations dictionary from Python2 type comments
http://mypy.readthedocs.io/en/latest/python2.html
Args:
f: The function to examine for type comments
globals_d: The globals dictionary to get type idents from. If not
specified then make the annotations dict contain strings rather
than the looked up objects
"""
locals_d = {} # type: Dict[str, Any]
if globals_d is None:
# If not given a globals_d then we should just populate annotations with
# the strings in the type comment.
globals_d = {}
# The current approach is to use eval, which means manufacturing a
# dict like object that will just echo the string back to you. This
        # has a number of complexities for something like numpy.number or
# Callable[..., int], which are handled in EchoStr above, so it might be
# better off as an ast.parse in the future...
locals_d = EchoDict()
lines, _ = inspect.getsourcelines(f)
arg_spec = getargspec(f)
args = list(arg_spec.args)
if arg_spec.varargs is not None:
args.append(arg_spec.varargs)
if arg_spec.keywords is not None:
args.append(arg_spec.keywords)
it = iter(lines)
types = [] # type: List
found = None
for token in tokenize.generate_tokens(lambda: next(it)):
typ, string, start, end, line = token
if typ == tokenize.COMMENT:
found = type_re.match(string)
if found:
parts = found.groups()
# (...) is used to represent all the args so far
if parts[0] != "(...)":
expr = parts[0].replace("*", "")
try:
ob = eval(expr, globals_d, locals_d)
except Exception as e:
raise ValueError(
"Error evaluating %r: %s" % (expr, e))
if isinstance(ob, tuple):
# We got more than one argument
types += list(ob)
else:
# We got a single argument
types.append(ob)
if parts[1]:
# Got a return, done
try:
ob = eval(parts[2], globals_d, locals_d)
except Exception as e:
raise ValueError(
"Error evaluating %r: %s" % (parts[2], e))
if args and args[0] in ["self", "cls"]:
# Allow the first argument to be inferred
if len(args) == len(types) + 1:
args = args[1:]
assert len(args) == len(types), \
"Args %r Types %r length mismatch" % (args, types)
ret = dict(zip(args, types))
ret["return"] = ob
return ret
if found:
# If we have ever found a type comment, but not the return value, error
raise ValueError("Got to the end of the function without seeing ->")
return {} | python | def make_annotations(f, globals_d=None):
# type: (Callable, Dict) -> Dict[str, Any]
"""Create an annotations dictionary from Python2 type comments
http://mypy.readthedocs.io/en/latest/python2.html
Args:
f: The function to examine for type comments
globals_d: The globals dictionary to get type idents from. If not
specified then make the annotations dict contain strings rather
than the looked up objects
"""
locals_d = {} # type: Dict[str, Any]
if globals_d is None:
# If not given a globals_d then we should just populate annotations with
# the strings in the type comment.
globals_d = {}
# The current approach is to use eval, which means manufacturing a
# dict like object that will just echo the string back to you. This
        # has a number of complexities for something like numpy.number or
# Callable[..., int], which are handled in EchoStr above, so it might be
# better off as an ast.parse in the future...
locals_d = EchoDict()
lines, _ = inspect.getsourcelines(f)
arg_spec = getargspec(f)
args = list(arg_spec.args)
if arg_spec.varargs is not None:
args.append(arg_spec.varargs)
if arg_spec.keywords is not None:
args.append(arg_spec.keywords)
it = iter(lines)
types = [] # type: List
found = None
for token in tokenize.generate_tokens(lambda: next(it)):
typ, string, start, end, line = token
if typ == tokenize.COMMENT:
found = type_re.match(string)
if found:
parts = found.groups()
# (...) is used to represent all the args so far
if parts[0] != "(...)":
expr = parts[0].replace("*", "")
try:
ob = eval(expr, globals_d, locals_d)
except Exception as e:
raise ValueError(
"Error evaluating %r: %s" % (expr, e))
if isinstance(ob, tuple):
# We got more than one argument
types += list(ob)
else:
# We got a single argument
types.append(ob)
if parts[1]:
# Got a return, done
try:
ob = eval(parts[2], globals_d, locals_d)
except Exception as e:
raise ValueError(
"Error evaluating %r: %s" % (parts[2], e))
if args and args[0] in ["self", "cls"]:
# Allow the first argument to be inferred
if len(args) == len(types) + 1:
args = args[1:]
assert len(args) == len(types), \
"Args %r Types %r length mismatch" % (args, types)
ret = dict(zip(args, types))
ret["return"] = ob
return ret
if found:
# If we have ever found a type comment, but not the return value, error
raise ValueError("Got to the end of the function without seeing ->")
return {} | [
"def",
"make_annotations",
"(",
"f",
",",
"globals_d",
"=",
"None",
")",
":",
"# type: (Callable, Dict) -> Dict[str, Any]",
"locals_d",
"=",
"{",
"}",
"# type: Dict[str, Any]",
"if",
"globals_d",
"is",
"None",
":",
"# If not given a globals_d then we should just populate annotations with",
"# the strings in the type comment.",
"globals_d",
"=",
"{",
"}",
"# The current approach is to use eval, which means manufacturing a",
"# dict like object that will just echo the string back to you. This",
"# has a number of complexities for somthing like numpy.number or",
"# Callable[..., int], which are handled in EchoStr above, so it might be",
"# better off as an ast.parse in the future...",
"locals_d",
"=",
"EchoDict",
"(",
")",
"lines",
",",
"_",
"=",
"inspect",
".",
"getsourcelines",
"(",
"f",
")",
"arg_spec",
"=",
"getargspec",
"(",
"f",
")",
"args",
"=",
"list",
"(",
"arg_spec",
".",
"args",
")",
"if",
"arg_spec",
".",
"varargs",
"is",
"not",
"None",
":",
"args",
".",
"append",
"(",
"arg_spec",
".",
"varargs",
")",
"if",
"arg_spec",
".",
"keywords",
"is",
"not",
"None",
":",
"args",
".",
"append",
"(",
"arg_spec",
".",
"keywords",
")",
"it",
"=",
"iter",
"(",
"lines",
")",
"types",
"=",
"[",
"]",
"# type: List",
"found",
"=",
"None",
"for",
"token",
"in",
"tokenize",
".",
"generate_tokens",
"(",
"lambda",
":",
"next",
"(",
"it",
")",
")",
":",
"typ",
",",
"string",
",",
"start",
",",
"end",
",",
"line",
"=",
"token",
"if",
"typ",
"==",
"tokenize",
".",
"COMMENT",
":",
"found",
"=",
"type_re",
".",
"match",
"(",
"string",
")",
"if",
"found",
":",
"parts",
"=",
"found",
".",
"groups",
"(",
")",
"# (...) is used to represent all the args so far",
"if",
"parts",
"[",
"0",
"]",
"!=",
"\"(...)\"",
":",
"expr",
"=",
"parts",
"[",
"0",
"]",
".",
"replace",
"(",
"\"*\"",
",",
"\"\"",
")",
"try",
":",
"ob",
"=",
"eval",
"(",
"expr",
",",
"globals_d",
",",
"locals_d",
")",
"except",
"Exception",
"as",
"e",
":",
"raise",
"ValueError",
"(",
"\"Error evaluating %r: %s\"",
"%",
"(",
"expr",
",",
"e",
")",
")",
"if",
"isinstance",
"(",
"ob",
",",
"tuple",
")",
":",
"# We got more than one argument",
"types",
"+=",
"list",
"(",
"ob",
")",
"else",
":",
"# We got a single argument",
"types",
".",
"append",
"(",
"ob",
")",
"if",
"parts",
"[",
"1",
"]",
":",
"# Got a return, done",
"try",
":",
"ob",
"=",
"eval",
"(",
"parts",
"[",
"2",
"]",
",",
"globals_d",
",",
"locals_d",
")",
"except",
"Exception",
"as",
"e",
":",
"raise",
"ValueError",
"(",
"\"Error evaluating %r: %s\"",
"%",
"(",
"parts",
"[",
"2",
"]",
",",
"e",
")",
")",
"if",
"args",
"and",
"args",
"[",
"0",
"]",
"in",
"[",
"\"self\"",
",",
"\"cls\"",
"]",
":",
"# Allow the first argument to be inferred",
"if",
"len",
"(",
"args",
")",
"==",
"len",
"(",
"types",
")",
"+",
"1",
":",
"args",
"=",
"args",
"[",
"1",
":",
"]",
"assert",
"len",
"(",
"args",
")",
"==",
"len",
"(",
"types",
")",
",",
"\"Args %r Types %r length mismatch\"",
"%",
"(",
"args",
",",
"types",
")",
"ret",
"=",
"dict",
"(",
"zip",
"(",
"args",
",",
"types",
")",
")",
"ret",
"[",
"\"return\"",
"]",
"=",
"ob",
"return",
"ret",
"if",
"found",
":",
"# If we have ever found a type comment, but not the return value, error",
"raise",
"ValueError",
"(",
"\"Got to the end of the function without seeing ->\"",
")",
"return",
"{",
"}"
] | Create an annotations dictionary from Python2 type comments
http://mypy.readthedocs.io/en/latest/python2.html
Args:
f: The function to examine for type comments
globals_d: The globals dictionary to get type idents from. If not
specified then make the annotations dict contain strings rather
than the looked up objects | [
"Create",
"an",
"annotations",
"dictionary",
"from",
"Python2",
"type",
"comments"
] | train | https://github.com/dls-controls/annotypes/blob/31ab68a0367bb70ebd9898e8b9fa9405423465bd/annotypes/_calltypes.py#L106-L178 |
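A small sketch of make_annotations on a mypy-style Python 2 type comment, assuming the single-line "(arg types) -> return type" comment form is matched by type_re; passing this module's globals() lets the names in the comment resolve to real objects, and the function itself is only illustrative.

from annotypes._calltypes import make_annotations  # path from the record above

def add(x, y):
    # type: (int, float) -> float
    return x + y

annotations = make_annotations(add, globals())
print(annotations)
# expected roughly: {'x': <class 'int'>, 'y': <class 'float'>, 'return': <class 'float'>}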
InfoAgeTech/django-core | django_core/views/mixins/common.py | CommonSingleObjectViewMixin.get_object | def get_object(self, **kwargs):
"""Sometimes preprocessing of a view need to happen before the object
attribute has been set for a view. In this case, just return the
object if it has already been set when it's called down the road since
there's no need to make another query.
"""
if hasattr(self, 'object') and self.object:
return self.object
obj = super(CommonSingleObjectViewMixin, self).get_object(**kwargs)
self.object = obj
return obj | python | def get_object(self, **kwargs):
"""Sometimes preprocessing of a view need to happen before the object
attribute has been set for a view. In this case, just return the
object if it has already been set when it's called down the road since
there's no need to make another query.
"""
if hasattr(self, 'object') and self.object:
return self.object
obj = super(CommonSingleObjectViewMixin, self).get_object(**kwargs)
self.object = obj
return obj | [
"def",
"get_object",
"(",
"self",
",",
"*",
"*",
"kwargs",
")",
":",
"if",
"hasattr",
"(",
"self",
",",
"'object'",
")",
"and",
"self",
".",
"object",
":",
"return",
"self",
".",
"object",
"obj",
"=",
"super",
"(",
"CommonSingleObjectViewMixin",
",",
"self",
")",
".",
"get_object",
"(",
"*",
"*",
"kwargs",
")",
"self",
".",
"object",
"=",
"obj",
"return",
"obj"
] | Sometimes preprocessing of a view needs to happen before the object
attribute has been set for a view. In this case, just return the
object if it has already been set when it's called down the road since
there's no need to make another query. | [
"Sometimes",
"preprocessing",
"of",
"a",
"view",
"need",
"to",
"happen",
"before",
"the",
"object",
"attribute",
"has",
"been",
"set",
"for",
"a",
"view",
".",
"In",
"this",
"case",
"just",
"return",
"the",
"object",
"if",
"it",
"has",
"already",
"been",
"set",
"when",
"it",
"s",
"called",
"down",
"the",
"road",
"since",
"there",
"s",
"no",
"need",
"to",
"make",
"another",
"query",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/views/mixins/common.py#L6-L17 |
InfoAgeTech/django-core | django_core/utils/date_parsers.py | hex_timestamp_to_datetime | def hex_timestamp_to_datetime(hex_timestamp):
"""Converts hex timestamp to a datetime object.
>>> hex_timestamp_to_datetime('558BBCF9')
datetime.datetime(2015, 6, 25, 8, 34, 1)
>>> hex_timestamp_to_datetime('0x558BBCF9')
datetime.datetime(2015, 6, 25, 8, 34, 1)
>>> datetime.fromtimestamp(0x558BBCF9)
datetime.datetime(2015, 6, 25, 8, 34, 1)
"""
if not hex_timestamp.startswith('0x'):
hex_timestamp = '0x{0}'.format(hex_timestamp)
return datetime.fromtimestamp(int(hex_timestamp, 16)) | python | def hex_timestamp_to_datetime(hex_timestamp):
"""Converts hex timestamp to a datetime object.
>>> hex_timestamp_to_datetime('558BBCF9')
datetime.datetime(2015, 6, 25, 8, 34, 1)
>>> hex_timestamp_to_datetime('0x558BBCF9')
datetime.datetime(2015, 6, 25, 8, 34, 1)
>>> datetime.fromtimestamp(0x558BBCF9)
datetime.datetime(2015, 6, 25, 8, 34, 1)
"""
if not hex_timestamp.startswith('0x'):
hex_timestamp = '0x{0}'.format(hex_timestamp)
return datetime.fromtimestamp(int(hex_timestamp, 16)) | [
"def",
"hex_timestamp_to_datetime",
"(",
"hex_timestamp",
")",
":",
"if",
"not",
"hex_timestamp",
".",
"startswith",
"(",
"'0x'",
")",
":",
"hex_timestamp",
"=",
"'0x{0}'",
".",
"format",
"(",
"hex_timestamp",
")",
"return",
"datetime",
".",
"fromtimestamp",
"(",
"int",
"(",
"hex_timestamp",
",",
"16",
")",
")"
] | Converts hex timestamp to a datetime object.
>>> hex_timestamp_to_datetime('558BBCF9')
datetime.datetime(2015, 6, 25, 8, 34, 1)
>>> hex_timestamp_to_datetime('0x558BBCF9')
datetime.datetime(2015, 6, 25, 8, 34, 1)
>>> datetime.fromtimestamp(0x558BBCF9)
datetime.datetime(2015, 6, 25, 8, 34, 1) | [
"Converts",
"hex",
"timestamp",
"to",
"a",
"datetime",
"object",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/utils/date_parsers.py#L10-L23 |
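A round-trip sketch for hex_timestamp_to_datetime; the import path mirrors the record, and the result is a naive local datetime because the helper uses datetime.fromtimestamp internally.

import time
from datetime import datetime
from django_core.utils.date_parsers import hex_timestamp_to_datetime  # assumed path

stamp = int(time.time())
hex_stamp = format(stamp, 'X')  # uppercase hex without the '0x' prefix, e.g. '558BBCF9'
print(hex_timestamp_to_datetime(hex_stamp) == datetime.fromtimestamp(stamp))  # True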
InfoAgeTech/django-core | django_core/utils/date_parsers.py | now_by_tz | def now_by_tz(tz='US/Central', ignoretz=True):
"""Gets the current datetime object by timezone.
:param tz: is the timezone to get the date for. tz can be passed as a
string or as a timezone object. (i.e. 'US/Central' or
pytz.timezone('US/Central'), etc)
:param ignoretz: will ignore the timezone portion of the datetime object and
tzinfo will be None.
:return: the current datetime object by tz
Examples:
>>> now_by_tz('US/Pacific')
2011-09-28 10:06:01.130025
>>> now_by_tz('US/Pacific', False)
2011-09-28 10:06:01.130025-07:00
>>> now_by_tz(pytz.timezone('US/Central'))
2011-09-28 12:06:01.130025
>>> now_by_tz(pytz.timezone('US/Central'), False)
2011-09-28 12:06:01.130025-05:00
"""
if isinstance(tz, string_types):
tz = pytz.timezone(tz)
if ignoretz:
return datetime.now(tz).replace(tzinfo=None)
return datetime.now(tz) | python | def now_by_tz(tz='US/Central', ignoretz=True):
"""Gets the current datetime object by timezone.
:param tz: is the timezone to get the date for. tz can be passed as a
string or as a timezone object. (i.e. 'US/Central' or
pytz.timezone('US/Central'), etc)
:param ignoretz: will ignore the timezone portion of the datetime object and
tzinfo will be None.
:return: the current datetime object by tz
Examples:
>>> now_by_tz('US/Pacific')
2011-09-28 10:06:01.130025
>>> now_by_tz('US/Pacific', False)
2011-09-28 10:06:01.130025-07:00
>>> now_by_tz(pytz.timezone('US/Central'))
2011-09-28 12:06:01.130025
>>> now_by_tz(pytz.timezone('US/Central'), False)
2011-09-28 12:06:01.130025-05:00
"""
if isinstance(tz, string_types):
tz = pytz.timezone(tz)
if ignoretz:
return datetime.now(tz).replace(tzinfo=None)
return datetime.now(tz) | [
"def",
"now_by_tz",
"(",
"tz",
"=",
"'US/Central'",
",",
"ignoretz",
"=",
"True",
")",
":",
"if",
"isinstance",
"(",
"tz",
",",
"string_types",
")",
":",
"tz",
"=",
"pytz",
".",
"timezone",
"(",
"tz",
")",
"if",
"ignoretz",
":",
"return",
"datetime",
".",
"now",
"(",
"tz",
")",
".",
"replace",
"(",
"tzinfo",
"=",
"None",
")",
"return",
"datetime",
".",
"now",
"(",
"tz",
")"
] | Gets the current datetime object by timezone.
:param tz: is the timezone to get the date for. tz can be passed as a
string or as a timezone object. (i.e. 'US/Central' or
pytz.timezone('US/Central'), etc)
:param ignoretz: will ignore the timezone portion of the datetime object and
tzinfo will be None.
:return: the current datetime object by tz
Examples:
>>> now_by_tz('US/Pacific')
2011-09-28 10:06:01.130025
>>> now_by_tz('US/Pacific', False)
2011-09-28 10:06:01.130025-07:00
>>> now_by_tz(pytz.timezone('US/Central'))
2011-09-28 12:06:01.130025
>>> now_by_tz(pytz.timezone('US/Central'), False)
2011-09-28 12:06:01.130025-05:00 | [
"Gets",
"the",
"current",
"datetime",
"object",
"by",
"timezone",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/utils/date_parsers.py#L26-L53 |
InfoAgeTech/django-core | django_core/utils/date_parsers.py | tz_to_utc | def tz_to_utc(dt, tz, ignoretz=True):
"""Converts a datetime object from the specified timezone to a UTC datetime.
:param tz: the timezone the datetime is currently in. tz can be passed
as a string or as a timezone object. (i.e. 'US/Central' or
pytz.timezone('US/Central'), etc)
:param ignoretz: will ignore the timezone portion of the datetime object and
tzinfo will be None.
    :return: the datetime object in UTC time.
Examples:
>>> tz_to_utc(datetime(2011, 11, 25, 9), 'US/Central')
2011-11-25 15:00:00
>>> tz_to_utc(datetime(2011, 11, 25, 9), pytz.timezone('US/Central'))
2011-11-25 15:00:00
>>> tz_to_utc(datetime(2011, 11, 25, 9), 'US/Central', False)
2011-11-25 15:00:00+00:00
"""
if isinstance(tz, string_types):
tz = pytz.timezone(tz)
dt = tz.localize(dt)
dt = datetime.astimezone(dt, pytz.timezone('UTC'))
if ignoretz:
return dt.replace(tzinfo=None)
return dt | python | def tz_to_utc(dt, tz, ignoretz=True):
"""Converts a datetime object from the specified timezone to a UTC datetime.
:param tz: the timezone the datetime is currently in. tz can be passed
as a string or as a timezone object. (i.e. 'US/Central' or
pytz.timezone('US/Central'), etc)
:param ignoretz: will ignore the timezone portion of the datetime object and
tzinfo will be None.
    :return: the datetime object in UTC time.
Examples:
>>> tz_to_utc(datetime(2011, 11, 25, 9), 'US/Central')
2011-11-25 15:00:00
>>> tz_to_utc(datetime(2011, 11, 25, 9), pytz.timezone('US/Central'))
2011-11-25 15:00:00
>>> tz_to_utc(datetime(2011, 11, 25, 9), 'US/Central', False)
2011-11-25 15:00:00+00:00
"""
if isinstance(tz, string_types):
tz = pytz.timezone(tz)
dt = tz.localize(dt)
dt = datetime.astimezone(dt, pytz.timezone('UTC'))
if ignoretz:
return dt.replace(tzinfo=None)
return dt | [
"def",
"tz_to_utc",
"(",
"dt",
",",
"tz",
",",
"ignoretz",
"=",
"True",
")",
":",
"if",
"isinstance",
"(",
"tz",
",",
"string_types",
")",
":",
"tz",
"=",
"pytz",
".",
"timezone",
"(",
"tz",
")",
"dt",
"=",
"tz",
".",
"localize",
"(",
"dt",
")",
"dt",
"=",
"datetime",
".",
"astimezone",
"(",
"dt",
",",
"pytz",
".",
"timezone",
"(",
"'UTC'",
")",
")",
"if",
"ignoretz",
":",
"return",
"dt",
".",
"replace",
"(",
"tzinfo",
"=",
"None",
")",
"return",
"dt"
] | Converts a datetime object from the specified timezone to a UTC datetime.
:param tz: the timezone the datetime is currently in. tz can be passed
as a string or as a timezone object. (i.e. 'US/Central' or
pytz.timezone('US/Central'), etc)
:param ignoretz: will ignore the timezone portion of the datetime object and
tzinfo will be None.
    :return: the datetime object in UTC time.
Examples:
>>> tz_to_utc(datetime(2011, 11, 25, 9), 'US/Central')
2011-11-25 15:00:00
>>> tz_to_utc(datetime(2011, 11, 25, 9), pytz.timezone('US/Central'))
2011-11-25 15:00:00
>>> tz_to_utc(datetime(2011, 11, 25, 9), 'US/Central', False)
2011-11-25 15:00:00+00:00 | [
"Converts",
"a",
"datetime",
"object",
"from",
"the",
"specified",
"timezone",
"to",
"a",
"UTC",
"datetime",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/utils/date_parsers.py#L56-L84 |
InfoAgeTech/django-core | django_core/utils/date_parsers.py | utc_to_tz | def utc_to_tz(dt, tz, ignoretz=True):
""" Converts UTC datetime object to the specific timezone.
:param dt: the UTC datetime object to convert.
    :param tz: the timezone to convert the UTC datetime object into. tz can be
passed as a string or as a timezone object. (i.e. 'US/Central' or
pytz.timezone('US/Central'), etc)
:param ignoretz: will ignore the timezone portion of the datetime object and
tzinfo will be None.
    :return: the datetime object in the given timezone.
Examples:
>>> utc_to_tz(datetime(2011, 11, 25, 9), pytz.timezone('US/Central'))
2011-11-25 03:00:00
>>> utc_to_tz(datetime(2011, 11, 25, 9), 'US/Central', False)
2011-11-25 03:00:00-06:00
"""
if isinstance(tz, string_types):
tz = pytz.timezone(tz)
dt = pytz.utc.localize(dt)
dt = dt.astimezone(tz)
if ignoretz:
return dt.replace(tzinfo=None)
return dt | python | def utc_to_tz(dt, tz, ignoretz=True):
""" Converts UTC datetime object to the specific timezone.
:param dt: the UTC datetime object to convert.
    :param tz: the timezone to convert the UTC datetime object into. tz can be
passed as a string or as a timezone object. (i.e. 'US/Central' or
pytz.timezone('US/Central'), etc)
:param ignoretz: will ignore the timezone portion of the datetime object and
tzinfo will be None.
    :return: the datetime object in the given timezone.
Examples:
>>> utc_to_tz(datetime(2011, 11, 25, 9), pytz.timezone('US/Central'))
2011-11-25 03:00:00
>>> utc_to_tz(datetime(2011, 11, 25, 9), 'US/Central', False)
2011-11-25 03:00:00-06:00
"""
if isinstance(tz, string_types):
tz = pytz.timezone(tz)
dt = pytz.utc.localize(dt)
dt = dt.astimezone(tz)
if ignoretz:
return dt.replace(tzinfo=None)
return dt | [
"def",
"utc_to_tz",
"(",
"dt",
",",
"tz",
",",
"ignoretz",
"=",
"True",
")",
":",
"if",
"isinstance",
"(",
"tz",
",",
"string_types",
")",
":",
"tz",
"=",
"pytz",
".",
"timezone",
"(",
"tz",
")",
"dt",
"=",
"pytz",
".",
"utc",
".",
"localize",
"(",
"dt",
")",
"dt",
"=",
"dt",
".",
"astimezone",
"(",
"tz",
")",
"if",
"ignoretz",
":",
"return",
"dt",
".",
"replace",
"(",
"tzinfo",
"=",
"None",
")",
"return",
"dt"
] | Converts UTC datetime object to the specific timezone.
:param dt: the UTC datetime object to convert.
    :param tz: the timezone to convert the UTC datetime object into. tz can be
passed as a string or as a timezone object. (i.e. 'US/Central' or
pytz.timezone('US/Central'), etc)
:param ignoretz: will ignore the timezone portion of the datetime object and
tzinfo will be None.
    :return: the datetime object in the given timezone.
Examples:
>>> utc_to_tz(datetime(2011, 11, 25, 9), pytz.timezone('US/Central'))
2011-11-25 03:00:00
>>> utc_to_tz(datetime(2011, 11, 25, 9), 'US/Central', False)
2011-11-25 03:00:00-06:00 | [
"Converts",
"UTC",
"datetime",
"object",
"to",
"the",
"specific",
"timezone",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/utils/date_parsers.py#L87-L114 |
InfoAgeTech/django-core | django_core/utils/date_parsers.py | parse_date | def parse_date(dt, ignoretz=True, as_tz=None):
"""
:param dt: string datetime to convert into datetime object.
:return: date object if the string can be parsed into a date. Otherwise,
return None.
:see: http://labix.org/python-dateutil
Examples:
>>> parse_date('2011-12-30')
datetime.date(2011, 12, 30)
>>> parse_date('12/30/2011')
datetime.date(2011, 12, 30)
"""
dttm = parse_datetime(dt, ignoretz=ignoretz)
return None if dttm is None else dttm.date() | python | def parse_date(dt, ignoretz=True, as_tz=None):
"""
:param dt: string datetime to convert into datetime object.
:return: date object if the string can be parsed into a date. Otherwise,
return None.
:see: http://labix.org/python-dateutil
Examples:
>>> parse_date('2011-12-30')
datetime.date(2011, 12, 30)
>>> parse_date('12/30/2011')
datetime.date(2011, 12, 30)
"""
dttm = parse_datetime(dt, ignoretz=ignoretz)
return None if dttm is None else dttm.date() | [
"def",
"parse_date",
"(",
"dt",
",",
"ignoretz",
"=",
"True",
",",
"as_tz",
"=",
"None",
")",
":",
"dttm",
"=",
"parse_datetime",
"(",
"dt",
",",
"ignoretz",
"=",
"ignoretz",
")",
"return",
"None",
"if",
"dttm",
"is",
"None",
"else",
"dttm",
".",
"date",
"(",
")"
] | :param dt: string datetime to convert into datetime object.
:return: date object if the string can be parsed into a date. Otherwise,
return None.
:see: http://labix.org/python-dateutil
Examples:
>>> parse_date('2011-12-30')
datetime.date(2011, 12, 30)
>>> parse_date('12/30/2011')
datetime.date(2011, 12, 30) | [
":",
"param",
"dt",
":",
"string",
"datetime",
"to",
"convert",
"into",
"datetime",
"object",
".",
":",
"return",
":",
"date",
"object",
"if",
"the",
"string",
"can",
"be",
"parsed",
"into",
"a",
"date",
".",
"Otherwise",
"return",
"None",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/utils/date_parsers.py#L117-L133 |
InfoAgeTech/django-core | django_core/utils/date_parsers.py | parse_datetime | def parse_datetime(dt, ignoretz=True, **kwargs):
"""
:param dt: string datetime to convert into datetime object.
:return: datetime object if the string can be parsed into a datetime.
Otherwise, return None.
:see: http://labix.org/python-dateutil
Examples:
>>> parse_datetime('2011-12-30 13:45:12 CDT')
2011-12-30 13:45:12
>>> parse_datetime('12/30/2011 13:45:12 CDT')
2011-12-30 13:45:12
>>> parse_datetime('2011-12-30 13:45:12 CDT', ignoretz=False)
2011-12-30 13:45:12-06:00
>>> parse_datetime('12/30/2011 13:45:12 CDT', ignoretz=False)
2011-12-30 13:45:12-06:00
"""
try:
return parse(dt, ignoretz=ignoretz, **kwargs)
except:
return None | python | def parse_datetime(dt, ignoretz=True, **kwargs):
"""
:param dt: string datetime to convert into datetime object.
:return: datetime object if the string can be parsed into a datetime.
Otherwise, return None.
:see: http://labix.org/python-dateutil
Examples:
>>> parse_datetime('2011-12-30 13:45:12 CDT')
2011-12-30 13:45:12
>>> parse_datetime('12/30/2011 13:45:12 CDT')
2011-12-30 13:45:12
>>> parse_datetime('2011-12-30 13:45:12 CDT', ignoretz=False)
2011-12-30 13:45:12-06:00
>>> parse_datetime('12/30/2011 13:45:12 CDT', ignoretz=False)
2011-12-30 13:45:12-06:00
"""
try:
return parse(dt, ignoretz=ignoretz, **kwargs)
except:
return None | [
"def",
"parse_datetime",
"(",
"dt",
",",
"ignoretz",
"=",
"True",
",",
"*",
"*",
"kwargs",
")",
":",
"try",
":",
"return",
"parse",
"(",
"dt",
",",
"ignoretz",
"=",
"ignoretz",
",",
"*",
"*",
"kwargs",
")",
"except",
":",
"return",
"None"
] | :param dt: string datetime to convert into datetime object.
:return: datetime object if the string can be parsed into a datetime.
Otherwise, return None.
:see: http://labix.org/python-dateutil
Examples:
>>> parse_datetime('2011-12-30 13:45:12 CDT')
2011-12-30 13:45:12
>>> parse_datetime('12/30/2011 13:45:12 CDT')
2011-12-30 13:45:12
>>> parse_datetime('2011-12-30 13:45:12 CDT', ignoretz=False)
2011-12-30 13:45:12-06:00
>>> parse_datetime('12/30/2011 13:45:12 CDT', ignoretz=False)
2011-12-30 13:45:12-06:00 | [
":",
"param",
"dt",
":",
"string",
"datetime",
"to",
"convert",
"into",
"datetime",
"object",
".",
":",
"return",
":",
"datetime",
"object",
"if",
"the",
"string",
"can",
"be",
"parsed",
"into",
"a",
"datetime",
".",
"Otherwise",
"return",
"None",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/utils/date_parsers.py#L136-L159 |
InfoAgeTech/django-core | django_core/decorators.py | turn_emails_off | def turn_emails_off(view_func):
"""Turns emails off so no emails will be sent."""
# Dummy email backend so no emails are sent.
EMAIL_BACKEND_DUMMY = 'django.core.mail.backends.dummy.EmailBackend'
def decorated(request, *args, **kwargs):
orig_email_backend = settings.EMAIL_BACKEND
settings.EMAIL_BACKEND = EMAIL_BACKEND_DUMMY
response = view_func(request, *args, **kwargs)
settings.EMAIL_BACKEND = orig_email_backend
return response
return decorated | python | def turn_emails_off(view_func):
"""Turns emails off so no emails will be sent."""
# Dummy email backend so no emails are sent.
EMAIL_BACKEND_DUMMY = 'django.core.mail.backends.dummy.EmailBackend'
def decorated(request, *args, **kwargs):
orig_email_backend = settings.EMAIL_BACKEND
settings.EMAIL_BACKEND = EMAIL_BACKEND_DUMMY
response = view_func(request, *args, **kwargs)
settings.EMAIL_BACKEND = orig_email_backend
return response
return decorated | [
"def",
"turn_emails_off",
"(",
"view_func",
")",
":",
"# Dummy email backend so no emails are sent.",
"EMAIL_BACKEND_DUMMY",
"=",
"'django.core.mail.backends.dummy.EmailBackend'",
"def",
"decorated",
"(",
"request",
",",
"*",
"args",
",",
"*",
"*",
"kwargs",
")",
":",
"orig_email_backend",
"=",
"settings",
".",
"EMAIL_BACKEND",
"settings",
".",
"EMAIL_BACKEND",
"=",
"EMAIL_BACKEND_DUMMY",
"response",
"=",
"view_func",
"(",
"request",
",",
"*",
"args",
",",
"*",
"*",
"kwargs",
")",
"settings",
".",
"EMAIL_BACKEND",
"=",
"orig_email_backend",
"return",
"response",
"return",
"decorated"
] | Turns emails off so no emails will be sent. | [
"Turns",
"emails",
"off",
"so",
"no",
"emails",
"will",
"be",
"sent",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/decorators.py#L6-L20 |
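A usage sketch for the decorator (the view below is made up, and the import path is assumed from the record). As a design observation, the original backend is only restored when the view returns normally, since the swap is not wrapped in try/finally.

from django.http import HttpResponse
from django_core.decorators import turn_emails_off  # assumed import path

@turn_emails_off
def signup(request):
    # Any send_mail() call made while this view runs goes to the dummy backend
    # and is therefore discarded instead of being delivered.
    return HttpResponse('ok')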
InfoAgeTech/django-core | django_core/utils/file_utils.py | get_md5_for_file | def get_md5_for_file(file):
"""Get the md5 hash for a file.
:param file: the file to get the md5 hash for
"""
md5 = hashlib.md5()
while True:
data = file.read(md5.block_size)
if not data:
break
md5.update(data)
return md5.hexdigest() | python | def get_md5_for_file(file):
"""Get the md5 hash for a file.
:param file: the file to get the md5 hash for
"""
md5 = hashlib.md5()
while True:
data = file.read(md5.block_size)
if not data:
break
md5.update(data)
return md5.hexdigest() | [
"def",
"get_md5_for_file",
"(",
"file",
")",
":",
"md5",
"=",
"hashlib",
".",
"md5",
"(",
")",
"while",
"True",
":",
"data",
"=",
"file",
".",
"read",
"(",
"md5",
".",
"block_size",
")",
"if",
"not",
"data",
":",
"break",
"md5",
".",
"update",
"(",
"data",
")",
"return",
"md5",
".",
"hexdigest",
"(",
")"
] | Get the md5 hash for a file.
:param file: the file to get the md5 hash for | [
"Get",
"the",
"md5",
"hash",
"for",
"a",
"file",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/utils/file_utils.py#L72-L87 |
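A sketch of the expected calling convention: the helper takes an already-open file object and reads it in md5.block_size chunks, so the file should be opened in binary mode; the file name is hypothetical. Since md5.block_size is only 64 bytes, each read is tiny, and a larger chunk size would likely be faster for big files while producing the same digest.

from django_core.utils.file_utils import get_md5_for_file  # assumed import path

with open('backup.tar.gz', 'rb') as fh:  # hypothetical file
    digest = get_md5_for_file(fh)
print(digest)  # 32-character hexadecimal string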
InfoAgeTech/django-core | django_core/utils/file_utils.py | get_dict_from_json_file | def get_dict_from_json_file(path, encoding='utf-8'):
"""Gets a dict of data form a json file.
:param path: the absolute path to the file
:param encoding: the encoding the file is in
"""
with open(path, encoding=encoding) as data_file:
return json.loads(data_file.read()) | python | def get_dict_from_json_file(path, encoding='utf-8'):
"""Gets a dict of data form a json file.
:param path: the absolute path to the file
:param encoding: the encoding the file is in
"""
with open(path, encoding=encoding) as data_file:
return json.loads(data_file.read()) | [
"def",
"get_dict_from_json_file",
"(",
"path",
",",
"encoding",
"=",
"'utf-8'",
")",
":",
"with",
"open",
"(",
"path",
",",
"encoding",
"=",
"encoding",
")",
"as",
"data_file",
":",
"return",
"json",
".",
"loads",
"(",
"data_file",
".",
"read",
"(",
")",
")"
] | Gets a dict of data from a json file.
:param path: the absolute path to the file
:param encoding: the encoding the file is in | [
"Gets",
"a",
"dict",
"of",
"data",
"form",
"a",
"json",
"file",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/utils/file_utils.py#L90-L97 |
InfoAgeTech/django-core | django_core/templatetags/html_tags.py | linebreaks_safe | def linebreaks_safe(value, autoescape=True):
"""
Adds linebreaks only for text that has a newline character.
"""
if isinstance(value, string_types) and '\n' in value:
return linebreaks_filter(value, autoescape=autoescape)
return value | python | def linebreaks_safe(value, autoescape=True):
"""
Adds linebreaks only for text that has a newline character.
"""
if isinstance(value, string_types) and '\n' in value:
return linebreaks_filter(value, autoescape=autoescape)
return value | [
"def",
"linebreaks_safe",
"(",
"value",
",",
"autoescape",
"=",
"True",
")",
":",
"if",
"isinstance",
"(",
"value",
",",
"string_types",
")",
"and",
"'\\n'",
"in",
"value",
":",
"return",
"linebreaks_filter",
"(",
"value",
",",
"autoescape",
"=",
"autoescape",
")",
"return",
"value"
] | Adds linebreaks only for text that has a newline character. | [
"Adds",
"linebreaks",
"only",
"for",
"text",
"that",
"has",
"a",
"newline",
"character",
"."
] | train | https://github.com/InfoAgeTech/django-core/blob/9664a145473b75120bf71e1644e9c8086e7e8955/django_core/templatetags/html_tags.py#L11-L18 |
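An illustrative sketch of the filter's pass-through behaviour when called directly as a Python function; the import path follows the record, and in a template the filter would typically be applied as {{ value|linebreaks_safe }} after loading the tag library (library name assumed).

from django_core.templatetags.html_tags import linebreaks_safe  # assumed path

print(linebreaks_safe('single line'))        # returned unchanged
print(linebreaks_safe('line 1\nline 2'))     # run through Django's linebreaks filter
print(linebreaks_safe(42))                   # non-strings pass through untouched: 42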
kmedian/korr | korr/kendall.py | kendall | def kendall(x, axis=0):
"""Kendall' tau (Rank) Correlation Matrix (for ordinal data)
Parameters
----------
x : ndarray
data set
axis : int, optional
Variables as columns is the default (axis=0). If variables are
in the rows use axis=1
Returns
-------
r : ndarray
Correlation Matrix (Kendall tau)
p : ndarray
p-values
"""
# transpose if axis<>0
if axis is not 0:
x = x.T
# read dimensions and
n, c = x.shape
# check if enough variables provided
if c < 2:
raise Exception(
"Only " + str(c) + " variables provided. Min. 2 required.")
# allocate variables
r = np.ones((c, c))
p = np.zeros((c, c))
# compute each (i,j)-th correlation
for i in range(0, c):
for j in range(i + 1, c):
r[i, j], p[i, j] = scipy.stats.kendalltau(x[:, i], x[:, j])
r[j, i] = r[i, j]
p[j, i] = p[i, j]
# done
return r, p | python | def kendall(x, axis=0):
"""Kendall' tau (Rank) Correlation Matrix (for ordinal data)
Parameters
----------
x : ndarray
data set
axis : int, optional
Variables as columns is the default (axis=0). If variables are
in the rows use axis=1
Returns
-------
r : ndarray
Correlation Matrix (Kendall tau)
p : ndarray
p-values
"""
# transpose if axis<>0
if axis is not 0:
x = x.T
# read dimensions and
n, c = x.shape
# check if enough variables provided
if c < 2:
raise Exception(
"Only " + str(c) + " variables provided. Min. 2 required.")
# allocate variables
r = np.ones((c, c))
p = np.zeros((c, c))
# compute each (i,j)-th correlation
for i in range(0, c):
for j in range(i + 1, c):
r[i, j], p[i, j] = scipy.stats.kendalltau(x[:, i], x[:, j])
r[j, i] = r[i, j]
p[j, i] = p[i, j]
# done
return r, p | [
"def",
"kendall",
"(",
"x",
",",
"axis",
"=",
"0",
")",
":",
"# transpose if axis<>0",
"if",
"axis",
"is",
"not",
"0",
":",
"x",
"=",
"x",
".",
"T",
"# read dimensions and",
"n",
",",
"c",
"=",
"x",
".",
"shape",
"# check if enough variables provided",
"if",
"c",
"<",
"2",
":",
"raise",
"Exception",
"(",
"\"Only \"",
"+",
"str",
"(",
"c",
")",
"+",
"\" variables provided. Min. 2 required.\"",
")",
"# allocate variables",
"r",
"=",
"np",
".",
"ones",
"(",
"(",
"c",
",",
"c",
")",
")",
"p",
"=",
"np",
".",
"zeros",
"(",
"(",
"c",
",",
"c",
")",
")",
"# compute each (i,j)-th correlation",
"for",
"i",
"in",
"range",
"(",
"0",
",",
"c",
")",
":",
"for",
"j",
"in",
"range",
"(",
"i",
"+",
"1",
",",
"c",
")",
":",
"r",
"[",
"i",
",",
"j",
"]",
",",
"p",
"[",
"i",
",",
"j",
"]",
"=",
"scipy",
".",
"stats",
".",
"kendalltau",
"(",
"x",
"[",
":",
",",
"i",
"]",
",",
"x",
"[",
":",
",",
"j",
"]",
")",
"r",
"[",
"j",
",",
"i",
"]",
"=",
"r",
"[",
"i",
",",
"j",
"]",
"p",
"[",
"j",
",",
"i",
"]",
"=",
"p",
"[",
"i",
",",
"j",
"]",
"# done",
"return",
"r",
",",
"p"
] | Kendall's tau (Rank) Correlation Matrix (for ordinal data)
Parameters
----------
x : ndarray
data set
axis : int, optional
Variables as columns is the default (axis=0). If variables are
in the rows use axis=1
Returns
-------
r : ndarray
Correlation Matrix (Kendall tau)
p : ndarray
p-values | [
"Kendall",
"tau",
"(",
"Rank",
")",
"Correlation",
"Matrix",
"(",
"for",
"ordinal",
"data",
")"
] | train | https://github.com/kmedian/korr/blob/4eb86fc14b1fc1b69204069b7753d115b327c937/korr/kendall.py#L5-L49 |
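A quick sketch of kendall on random ordinal data; it assumes the function is re-exported at the korr package level, matching the "import korr" style used in the slice_yx docstring below.

import numpy as np
from korr import kendall  # assumed package-level export

np.random.seed(0)
x = np.random.randint(1, 6, size=(100, 3))  # 100 answers on a 1-5 ordinal scale
r, p = kendall(x)                            # variables are columns (axis=0 default)
print(r.shape, p.shape)                      # (3, 3) (3, 3)
print(np.round(r, 2))                        # symmetric matrix with ones on the diagonal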
QualiSystems/cloudshell-networking-devices | cloudshell/devices/standards/base.py | AbstractResource.add_sub_resource | def add_sub_resource(self, relative_id, sub_resource):
"""Add sub resource"""
existing_sub_resources = self.resources.get(sub_resource.RELATIVE_PATH_TEMPLATE, defaultdict(list))
existing_sub_resources[relative_id].append(sub_resource)
self.resources.update({sub_resource.RELATIVE_PATH_TEMPLATE: existing_sub_resources}) | python | def add_sub_resource(self, relative_id, sub_resource):
"""Add sub resource"""
existing_sub_resources = self.resources.get(sub_resource.RELATIVE_PATH_TEMPLATE, defaultdict(list))
existing_sub_resources[relative_id].append(sub_resource)
self.resources.update({sub_resource.RELATIVE_PATH_TEMPLATE: existing_sub_resources}) | [
"def",
"add_sub_resource",
"(",
"self",
",",
"relative_id",
",",
"sub_resource",
")",
":",
"existing_sub_resources",
"=",
"self",
".",
"resources",
".",
"get",
"(",
"sub_resource",
".",
"RELATIVE_PATH_TEMPLATE",
",",
"defaultdict",
"(",
"list",
")",
")",
"existing_sub_resources",
"[",
"relative_id",
"]",
".",
"append",
"(",
"sub_resource",
")",
"self",
".",
"resources",
".",
"update",
"(",
"{",
"sub_resource",
".",
"RELATIVE_PATH_TEMPLATE",
":",
"existing_sub_resources",
"}",
")"
] | Add sub resource | [
"Add",
"sub",
"resource"
] | train | https://github.com/QualiSystems/cloudshell-networking-devices/blob/009aab33edb30035b52fe10dbb91db61c95ba4d9/cloudshell/devices/standards/base.py#L30-L34 |
QualiSystems/cloudshell-networking-devices | cloudshell/devices/standards/base.py | AbstractResource.cloudshell_model_name | def cloudshell_model_name(self):
"""Return the name of the CloudShell model"""
if self.shell_name:
return "{shell_name}.{resource_model}".format(shell_name=self.shell_name,
resource_model=self.RESOURCE_MODEL.replace(" ", ""))
else:
return self.RESOURCE_MODEL | python | def cloudshell_model_name(self):
"""Return the name of the CloudShell model"""
if self.shell_name:
return "{shell_name}.{resource_model}".format(shell_name=self.shell_name,
resource_model=self.RESOURCE_MODEL.replace(" ", ""))
else:
return self.RESOURCE_MODEL | [
"def",
"cloudshell_model_name",
"(",
"self",
")",
":",
"if",
"self",
".",
"shell_name",
":",
"return",
"\"{shell_name}.{resource_model}\"",
".",
"format",
"(",
"shell_name",
"=",
"self",
".",
"shell_name",
",",
"resource_model",
"=",
"self",
".",
"RESOURCE_MODEL",
".",
"replace",
"(",
"\" \"",
",",
"\"\"",
")",
")",
"else",
":",
"return",
"self",
".",
"RESOURCE_MODEL"
] | Return the name of the CloudShell model | [
"Return",
"the",
"name",
"of",
"the",
"CloudShell",
"model"
] | train | https://github.com/QualiSystems/cloudshell-networking-devices/blob/009aab33edb30035b52fe10dbb91db61c95ba4d9/cloudshell/devices/standards/base.py#L37-L43 |
kmedian/korr | korr/slice_yx.py | slice_yx | def slice_yx(r, pval, ydim=1):
"""slice a correlation and p-value matrix of a (y,X) dataset
into a (y,x_i) vector and (x_j, x_k) matrices
Parameters
----------
r : ndarray
Correlation Matrix of a (y,X) dataset
pval : ndarray
p-values
ydim : int
Number of target variables y, i.e. the first ydim-th columns
and rows are (y, x_i) correlations
Returns
-------
y_r : ndarray
1D vector or ydim-column array with (y,x) correlations
y_pval : ndarray
1D vector or ydim-column array with (y,x) p-values
x_r : ndarray
correlation matrix (x_j, x_k)
x_pval : ndarray
matrix with p-values
Example
-------
import korr
r, pval = korr.mcc(np.c_[y, X])
y_r, y_pval, x_r, x_pval = slice_yx(r, pval, ydim=1)
print(np.c_[y_r, y_pval])
korr.corrgram(x_r, x_pval)
"""
if ydim == 1:
return (
r[1:, :1].reshape(-1, ), pval[1:, :1].reshape(-1, ),
r[1:, 1:], pval[1:, 1:])
else:
return (
r[ydim:, :ydim], pval[ydim:, :ydim],
r[ydim:, ydim:], pval[ydim:, ydim:]) | python | def slice_yx(r, pval, ydim=1):
"""slice a correlation and p-value matrix of a (y,X) dataset
into a (y,x_i) vector and (x_j, x_k) matrices
Parameters
----------
r : ndarray
Correlation Matrix of a (y,X) dataset
pval : ndarray
p-values
ydim : int
Number of target variables y, i.e. the first ydim-th columns
and rows are (y, x_i) correlations
Returns
-------
y_r : ndarray
1D vector or ydim-column array with (y,x) correlations
y_pval : ndarray
1D vector or ydim-column array with (y,x) p-values
x_r : ndarray
correlation matrix (x_j, x_k)
x_pval : ndarray
matrix with p-values
Example
-------
import korr
r, pval = korr.mcc(np.c_[y, X])
y_r, y_pval, x_r, x_pval = slice_yx(r, pval, ydim=1)
print(np.c_[y_r, y_pval])
korr.corrgram(x_r, x_pval)
"""
if ydim == 1:
return (
r[1:, :1].reshape(-1, ), pval[1:, :1].reshape(-1, ),
r[1:, 1:], pval[1:, 1:])
else:
return (
r[ydim:, :ydim], pval[ydim:, :ydim],
r[ydim:, ydim:], pval[ydim:, ydim:]) | [
"def",
"slice_yx",
"(",
"r",
",",
"pval",
",",
"ydim",
"=",
"1",
")",
":",
"if",
"ydim",
"is",
"1",
":",
"return",
"(",
"r",
"[",
"1",
":",
",",
":",
"1",
"]",
".",
"reshape",
"(",
"-",
"1",
",",
")",
",",
"pval",
"[",
"1",
":",
",",
":",
"1",
"]",
".",
"reshape",
"(",
"-",
"1",
",",
")",
",",
"r",
"[",
"1",
":",
",",
"1",
":",
"]",
",",
"pval",
"[",
"1",
":",
",",
"1",
":",
"]",
")",
"else",
":",
"return",
"(",
"r",
"[",
"ydim",
":",
",",
":",
"ydim",
"]",
",",
"pval",
"[",
"ydim",
":",
",",
":",
"ydim",
"]",
",",
"r",
"[",
"ydim",
":",
",",
"ydim",
":",
"]",
",",
"pval",
"[",
"ydim",
":",
",",
"ydim",
":",
"]",
")"
] | slice a correlation and p-value matrix of a (y,X) dataset
into a (y,x_i) vector and (x_j, x_k) matrices
Parameters
----------
r : ndarray
Correlation Matrix of a (y,X) dataset
pval : ndarray
p-values
ydim : int
Number of target variables y, i.e. the first ydim-th columns
and rows are (y, x_i) correlations
Returns
-------
y_r : ndarray
1D vector or ydim-column array with (y,x) correlations
y_pval : ndarray
1D vector or ydim-column array with (y,x) p-values
x_r : ndarray
correlation matrix (x_j, x_k)
x_pval : ndarray
matrix with p-values
Example
-------
import korr
r, pval = korr.mcc(np.c_[y, X])
y_r, y_pval, x_r, x_pval = slice_yx(r, pval, ydim=1)
print(np.c_[y_r, y_pval])
korr.corrgram(x_r, x_pval) | [
"slice",
"a",
"correlation",
"and",
"p",
"-",
"value",
"matrix",
"of",
"a",
"(",
"y",
"X",
")",
"dataset",
"into",
"a",
"(",
"y",
"x_i",
")",
"vector",
"and",
"(",
"x_j",
"x_k",
")",
"matrices"
] | train | https://github.com/kmedian/korr/blob/4eb86fc14b1fc1b69204069b7753d115b327c937/korr/slice_yx.py#L2-L47 |
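The docstring above sketches usage with korr.mcc; here is a self-contained variant using korr.pearson (also referenced later in this dataset) on hypothetical data, so the shapes of the returned slices are easy to see.

import numpy as np
from korr import pearson, slice_yx   # assumes korr is installed

# hypothetical data: first column is the target y, the remaining three are features X
data = np.random.randn(200, 4)
r, pval = pearson(data)

y_r, y_pval, x_r, x_pval = slice_yx(r, pval, ydim=1)
print(y_r.shape, y_pval.shape)    # (3,) (3,)     -- y vs. each feature
print(x_r.shape, x_pval.shape)    # (3, 3) (3, 3) -- feature vs. feature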
eumis/pyviews | pyviews/compilation/parsing.py | parse_expression | def parse_expression(source: str) -> ExpressionSource:
'''Returns tuple with expression type and expression body'''
if not is_expression(source):
msg = 'Expression is not valid. Expression should be matched with regular expression: {0}'\
.format(EXPRESSION_REGEX)
raise ExpressionError(msg, source)
if not source.startswith('{'):
[type_, source] = source.split(':', 1)
elif source.endswith('}}'):
type_ = 'twoways'
else:
type_ = 'oneway'
return (type_, source[1:-1]) | python | def parse_expression(source: str) -> ExpressionSource:
'''Returns tuple with expression type and expression body'''
if not is_expression(source):
msg = 'Expression is not valid. Expression should be matched with regular expression: {0}'\
.format(EXPRESSION_REGEX)
raise ExpressionError(msg, source)
if not source.startswith('{'):
[type_, source] = source.split(':', 1)
elif source.endswith('}}'):
type_ = 'twoways'
else:
type_ = 'oneway'
return (type_, source[1:-1]) | [
"def",
"parse_expression",
"(",
"source",
":",
"str",
")",
"->",
"ExpressionSource",
":",
"if",
"not",
"is_expression",
"(",
"source",
")",
":",
"msg",
"=",
"'Expression is not valid. Expression should be matched with regular expression: {0}'",
".",
"format",
"(",
"EXPRESSION_REGEX",
")",
"raise",
"ExpressionError",
"(",
"msg",
",",
"source",
")",
"if",
"not",
"source",
".",
"startswith",
"(",
"'{'",
")",
":",
"[",
"type_",
",",
"source",
"]",
"=",
"source",
".",
"split",
"(",
"':'",
",",
"1",
")",
"elif",
"source",
".",
"endswith",
"(",
"'}}'",
")",
":",
"type_",
"=",
"'twoways'",
"else",
":",
"type_",
"=",
"'oneway'",
"return",
"(",
"type_",
",",
"source",
"[",
"1",
":",
"-",
"1",
"]",
")"
] | Returns tuple with expression type and expression body | [
"Returns",
"tuple",
"with",
"expression",
"type",
"and",
"expression",
"body"
] | train | https://github.com/eumis/pyviews/blob/80a868242ee9cdc6f4ded594b3e0544cc238ed55/pyviews/compilation/parsing.py#L20-L32 |
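A few illustrative calls for parse_expression() from the row above. The strings are assumed to pass the is_expression() check, whose EXPRESSION_REGEX is not shown in this row, so the accepted syntax is an assumption; the returned tuples follow directly from the branching in the function body.

from pyviews.compilation.parsing import parse_expression   # assumes pyviews is installed

print(parse_expression('{vm.name}'))        # ('oneway', 'vm.name')
print(parse_expression('once:{vm.name}'))   # ('once', 'vm.name')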
internetarchive/doublethink | doublethink/rethinker.py | Rethinker._server_whitelist | def _server_whitelist(self):
'''
Returns list of servers that have not errored in the last five minutes.
If all servers have errored in the last five minutes, returns list with
one item, the server that errored least recently.
'''
whitelist = []
for server in self.servers:
if (server not in self.last_error
or self.last_error[server] < time.time() - self.PENALTY_BOX_TIME):
whitelist.append(server)
if not whitelist:
whitelist.append(sorted(
self.last_error.items(), key=lambda kv: kv[1])[0][0])
return whitelist | python | def _server_whitelist(self):
'''
Returns list of servers that have not errored in the last five minutes.
If all servers have errored in the last five minutes, returns list with
one item, the server that errored least recently.
'''
whitelist = []
for server in self.servers:
if (server not in self.last_error
or self.last_error[server] < time.time() - self.PENALTY_BOX_TIME):
whitelist.append(server)
if not whitelist:
whitelist.append(sorted(
self.last_error.items(), key=lambda kv: kv[1])[0][0])
return whitelist | [
"def",
"_server_whitelist",
"(",
"self",
")",
":",
"whitelist",
"=",
"[",
"]",
"for",
"server",
"in",
"self",
".",
"servers",
":",
"if",
"(",
"server",
"not",
"in",
"self",
".",
"last_error",
"or",
"self",
".",
"last_error",
"[",
"server",
"]",
"<",
"time",
".",
"time",
"(",
")",
"-",
"self",
".",
"PENALTY_BOX_TIME",
")",
":",
"whitelist",
".",
"append",
"(",
"server",
")",
"if",
"not",
"whitelist",
":",
"whitelist",
".",
"append",
"(",
"sorted",
"(",
"self",
".",
"last_error",
".",
"items",
"(",
")",
",",
"key",
"=",
"lambda",
"kv",
":",
"kv",
"[",
"1",
"]",
")",
"[",
"0",
"]",
"[",
"0",
"]",
")",
"return",
"whitelist"
] | Returns list of servers that have not errored in the last five minutes.
If all servers have errored in the last five minutes, returns list with
one item, the server that errored least recently. | [
"Returns",
"list",
"of",
"servers",
"that",
"have",
"not",
"errored",
"in",
"the",
"last",
"five",
"minutes",
".",
"If",
"all",
"servers",
"have",
"errored",
"in",
"the",
"last",
"five",
"minutes",
"returns",
"list",
"with",
"one",
"item",
"the",
"server",
"that",
"errored",
"least",
"recently",
"."
] | train | https://github.com/internetarchive/doublethink/blob/f7fc7da725c9b572d473c717b3dad9af98a7a2b4/doublethink/rethinker.py#L141-L155 |
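A standalone sketch of the penalty-box selection logic documented above, using plain lists and dicts instead of a Rethinker instance. The 300-second window is an assumed stand-in for PENALTY_BOX_TIME, which is defined elsewhere in the class and not shown in this row.

import time

PENALTY_BOX_TIME = 300   # assumed value, for illustration only

def server_whitelist(servers, last_error):
    # keep servers whose last error is outside the penalty window (or that never errored)
    whitelist = [s for s in servers
                 if s not in last_error
                 or last_error[s] < time.time() - PENALTY_BOX_TIME]
    if not whitelist:
        # every server errored recently: fall back to the least recent offender
        whitelist.append(sorted(last_error.items(), key=lambda kv: kv[1])[0][0])
    return whitelist

now = time.time()
print(server_whitelist(['db1', 'db2'], {'db1': now}))                    # ['db2']
print(server_whitelist(['db1', 'db2'], {'db1': now, 'db2': now - 10}))   # ['db2']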
QualiSystems/cloudshell-networking-devices | cloudshell/devices/standards/networking/configuration_attributes_structure.py | create_networking_resource_from_context | def create_networking_resource_from_context(shell_name, supported_os, context):
"""
Creates an instance of Networking Resource by given context
:param shell_name: Shell Name
:type shell_name: str
:param supported_os: list of supported OS
:type supported_os: list
:param context: cloudshell.shell.core.driver_context.ResourceCommandContext
:type context: cloudshell.shell.core.driver_context.ResourceCommandContext
:return:
:rtype GenericNetworkingResource
"""
result = GenericNetworkingResource(shell_name=shell_name, name=context.resource.name, supported_os=supported_os)
result.address = context.resource.address
result.family = context.resource.family
result.fullname = context.resource.fullname
result.attributes = dict(context.resource.attributes)
return result | python | def create_networking_resource_from_context(shell_name, supported_os, context):
"""
Creates an instance of Networking Resource by given context
:param shell_name: Shell Name
:type shell_name: str
:param supported_os: list of supported OS
:type supported_os: list
:param context: cloudshell.shell.core.driver_context.ResourceCommandContext
:type context: cloudshell.shell.core.driver_context.ResourceCommandContext
:return:
:rtype GenericNetworkingResource
"""
result = GenericNetworkingResource(shell_name=shell_name, name=context.resource.name, supported_os=supported_os)
result.address = context.resource.address
result.family = context.resource.family
result.fullname = context.resource.fullname
result.attributes = dict(context.resource.attributes)
return result | [
"def",
"create_networking_resource_from_context",
"(",
"shell_name",
",",
"supported_os",
",",
"context",
")",
":",
"result",
"=",
"GenericNetworkingResource",
"(",
"shell_name",
"=",
"shell_name",
",",
"name",
"=",
"context",
".",
"resource",
".",
"name",
",",
"supported_os",
"=",
"supported_os",
")",
"result",
".",
"address",
"=",
"context",
".",
"resource",
".",
"address",
"result",
".",
"family",
"=",
"context",
".",
"resource",
".",
"family",
"result",
".",
"fullname",
"=",
"context",
".",
"resource",
".",
"fullname",
"result",
".",
"attributes",
"=",
"dict",
"(",
"context",
".",
"resource",
".",
"attributes",
")",
"return",
"result"
] | Creates an instance of Networking Resource by given context
:param shell_name: Shell Name
:type shell_name: str
:param supported_os: list of supported OS
:type supported_os: list
:param context: cloudshell.shell.core.driver_context.ResourceCommandContext
:type context: cloudshell.shell.core.driver_context.ResourceCommandContext
:return:
:rtype GenericNetworkingResource | [
"Creates",
"an",
"instance",
"of",
"Networking",
"Resource",
"by",
"given",
"context",
":",
"param",
"shell_name",
":",
"Shell",
"Name",
":",
"type",
"shell_name",
":",
"str",
":",
"param",
"supported_os",
":",
"list",
"of",
"supported",
"OS",
":",
"type",
"supported_os",
":",
"list",
":",
"param",
"context",
":",
"cloudshell",
".",
"shell",
".",
"core",
".",
"driver_context",
".",
"ResourceCommandContext",
":",
"type",
"context",
":",
"cloudshell",
".",
"shell",
".",
"core",
".",
"driver_context",
".",
"ResourceCommandContext",
":",
"return",
":",
":",
"rtype",
"GenericNetworkingResource"
] | train | https://github.com/QualiSystems/cloudshell-networking-devices/blob/009aab33edb30035b52fe10dbb91db61c95ba4d9/cloudshell/devices/standards/networking/configuration_attributes_structure.py#L213-L233 |
kmedian/korr | korr/find_worst.py | find_worst | def find_worst(rho, pval, m=1, rlim=.10, plim=.35):
"""Find the N "worst", i.e. insignificant/random and low, correlations
Parameters
----------
rho : ndarray, list
1D array with correlation coefficients
pval : ndarray, list
1D array with p-values
m : int
The desired number of indices to return
(How many "worst" correlations to find?)
rlim : float
Desired maximum absolute correlation coefficient
(Default: 0.10)
plim : float
Desired minimum p-value
(Default: 0.35)
Return
------
selected : list
Indices of rho and pval of the "worst" correlations.
"""
# convert to lists
n = len(rho)
r = list(np.abs(rho))
p = list(pval)
i = list(range(n))
# check m
if m > n:
warnings.warn(
'm is bigger than the available correlations in rho and pval.')
m = n
# selected indicies
selected = list()
# (1) pick the highest/worst p-value
# |r| <= r_lim
# p > p_lim
it = 0
while (len(selected) < m) and (it < n):
temp = p.index(max(p)) # temporary index of the remaining values
worst = i[temp] # store original index as 'worst' before abort loop
# check
if (r[temp] <= rlim) and (p[temp] > plim):
# delete from lists
r.pop(temp)
p.pop(temp)
i.pop(temp)
# append to abort
selected.append(worst)
# next step
it = it + 1
# print(selected, i)
# (2) Just pick the highest/worst p-value of the remaining
# with bad correlations
# |r| <= r_lim
it = 0
n2 = len(i)
while (len(selected) < m) and (it < n2):
temp = p.index(max(p)) # temporary index of the remaining values
worst = i[temp] # store original index as 'worst' before abort loop
# check
if (r[temp] <= rlim):
# delete from lists
r.pop(temp)
p.pop(temp)
i.pop(temp)
# append to abort
selected.append(worst)
# next step
it = it + 1
# (3) Pick the lowest correlations
it = 0
n3 = len(i)
while (len(selected) < m) and (it < n3):
# find the smallest correlation
temp = r.index(min(r))
worst = i[temp]
# delete from lists
r.pop(temp)
p.pop(temp)
i.pop(temp)
# append to abort
selected.append(worst)
# next step
it = it + 1
return selected | python | def find_worst(rho, pval, m=1, rlim=.10, plim=.35):
"""Find the N "worst", i.e. insignificant/random and low, correlations
Parameters
----------
rho : ndarray, list
1D array with correlation coefficients
pval : ndarray, list
1D array with p-values
m : int
The desired number of indices to return
(How many "worst" correlations to find?)
rlim : float
Desired maximum absolute correlation coefficient
(Default: 0.10)
plim : float
Desired minimum p-value
(Default: 0.35)
Return
------
selected : list
Indices of rho and pval of the "worst" correlations.
"""
# convert to lists
n = len(rho)
r = list(np.abs(rho))
p = list(pval)
i = list(range(n))
# check m
if m > n:
warnings.warn(
'm is bigger than the available correlations in rho and pval.')
m = n
# selected indicies
selected = list()
# (1) pick the highest/worst p-value
# |r| <= r_lim
# p > p_lim
it = 0
while (len(selected) < m) and (it < n):
temp = p.index(max(p)) # temporary index of the remaining values
worst = i[temp] # store original index as 'worst' before abort loop
# check
if (r[temp] <= rlim) and (p[temp] > plim):
# delete from lists
r.pop(temp)
p.pop(temp)
i.pop(temp)
# append to abort
selected.append(worst)
# next step
it = it + 1
# print(selected, i)
# (2) Just pick the highest/worst p-value of the remaining
# with bad correlations
# |r| <= r_lim
it = 0
n2 = len(i)
while (len(selected) < m) and (it < n2):
temp = p.index(max(p)) # temporary index of the remaining values
worst = i[temp] # store original index as 'worst' before abort loop
# check
if (r[temp] <= rlim):
# delete from lists
r.pop(temp)
p.pop(temp)
i.pop(temp)
# append to abort
selected.append(worst)
# next step
it = it + 1
# (3) Pick the lowest correlations
it = 0
n3 = len(i)
while (len(selected) < m) and (it < n3):
# find the smallest correlation
temp = r.index(min(r))
worst = i[temp]
# delete from lists
r.pop(temp)
p.pop(temp)
i.pop(temp)
# append to abort
selected.append(worst)
# next step
it = it + 1
return selected | [
"def",
"find_worst",
"(",
"rho",
",",
"pval",
",",
"m",
"=",
"1",
",",
"rlim",
"=",
".10",
",",
"plim",
"=",
".35",
")",
":",
"# convert to lists",
"n",
"=",
"len",
"(",
"rho",
")",
"r",
"=",
"list",
"(",
"np",
".",
"abs",
"(",
"rho",
")",
")",
"p",
"=",
"list",
"(",
"pval",
")",
"i",
"=",
"list",
"(",
"range",
"(",
"n",
")",
")",
"# check m",
"if",
"m",
">",
"n",
":",
"warnings",
".",
"warn",
"(",
"'m is bigger than the available correlations in rho and pval.'",
")",
"m",
"=",
"n",
"# selected indicies",
"selected",
"=",
"list",
"(",
")",
"# (1) pick the highest/worst p-value",
"# |r| <= r_lim",
"# p > p_lim",
"it",
"=",
"0",
"while",
"(",
"len",
"(",
"selected",
")",
"<",
"m",
")",
"and",
"(",
"it",
"<",
"n",
")",
":",
"temp",
"=",
"p",
".",
"index",
"(",
"max",
"(",
"p",
")",
")",
"# temporary index of the remaining values",
"worst",
"=",
"i",
"[",
"temp",
"]",
"# store original index as 'worst' before abort loop",
"# check",
"if",
"(",
"r",
"[",
"temp",
"]",
"<=",
"rlim",
")",
"and",
"(",
"p",
"[",
"temp",
"]",
">",
"plim",
")",
":",
"# delete from lists",
"r",
".",
"pop",
"(",
"temp",
")",
"p",
".",
"pop",
"(",
"temp",
")",
"i",
".",
"pop",
"(",
"temp",
")",
"# append to abort",
"selected",
".",
"append",
"(",
"worst",
")",
"# next step",
"it",
"=",
"it",
"+",
"1",
"# print(selected, i)",
"# (2) Just pick the highest/worst p-value of the remaining",
"# with bad correlations",
"# |r| <= r_lim",
"it",
"=",
"0",
"n2",
"=",
"len",
"(",
"i",
")",
"while",
"(",
"len",
"(",
"selected",
")",
"<",
"m",
")",
"and",
"(",
"it",
"<",
"n2",
")",
":",
"temp",
"=",
"p",
".",
"index",
"(",
"max",
"(",
"p",
")",
")",
"# temporary index of the remaining values",
"worst",
"=",
"i",
"[",
"temp",
"]",
"# store original index as 'worst' before abort loop",
"# check",
"if",
"(",
"r",
"[",
"temp",
"]",
"<=",
"rlim",
")",
":",
"# delete from lists",
"r",
".",
"pop",
"(",
"temp",
")",
"p",
".",
"pop",
"(",
"temp",
")",
"i",
".",
"pop",
"(",
"temp",
")",
"# append to abort",
"selected",
".",
"append",
"(",
"worst",
")",
"# next step",
"it",
"=",
"it",
"+",
"1",
"# (3) Pick the lowest correlations",
"it",
"=",
"0",
"n3",
"=",
"len",
"(",
"i",
")",
"while",
"(",
"len",
"(",
"selected",
")",
"<",
"m",
")",
"and",
"(",
"it",
"<",
"n3",
")",
":",
"# find the smallest p-value",
"temp",
"=",
"r",
".",
"index",
"(",
"min",
"(",
"r",
")",
")",
"worst",
"=",
"i",
"[",
"temp",
"]",
"# delete from lists",
"r",
".",
"pop",
"(",
"temp",
")",
"p",
".",
"pop",
"(",
"temp",
")",
"i",
".",
"pop",
"(",
"temp",
")",
"# append to abort",
"selected",
".",
"append",
"(",
"worst",
")",
"# next step",
"it",
"=",
"it",
"+",
"1",
"return",
"selected"
] | Find the N "worst", i.e. insignificant/random and low, correlations
Parameters
----------
rho : ndarray, list
1D array with correlation coefficients
pval : ndarray, list
1D array with p-values
m : int
The desired number of indices to return
(How many "worst" correlations to find?)
rlim : float
Desired maximum absolute correlation coefficient
(Default: 0.10)
plim : float
Desired minimum p-value
(Default: 0.35)
Return
------
selected : list
Indices of rho and pval of the "worst" correlations.
"Find",
"the",
"N",
"worst",
"i",
".",
"e",
".",
"insignificant",
"/",
"random",
"and",
"low",
"correlations"
] | train | https://github.com/kmedian/korr/blob/4eb86fc14b1fc1b69204069b7753d115b327c937/korr/find_worst.py#L5-L101 |
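A small worked example for find_worst() from the row above, using hypothetical correlation results. With the default thresholds (rlim=0.10, plim=0.35), the two weak and insignificant pairs at indices 0 and 2 are selected first.

import numpy as np
from korr import find_worst   # assumes korr is installed

rho  = np.array([0.02, 0.85, -0.04, 0.40])   # hypothetical correlation coefficients
pval = np.array([0.90, 0.001, 0.45, 0.03])   # hypothetical p-values

print(find_worst(rho, pval, m=2))   # [0, 2]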
upsight/doctor | doctor/parsers.py | _parse_array | def _parse_array(value):
"""Coerce value into a list.
:param str value: Value to parse.
:returns: list or None if the value is not a JSON array
:raises: TypeError or ValueError if value appears to be an array but can't
be parsed as JSON.
"""
value = value.lstrip()
if not value or value[0] not in _bracket_strings:
return None
return json.loads(value) | python | def _parse_array(value):
"""Coerce value into a list.
:param str value: Value to parse.
:returns: list or None if the value is not a JSON array
:raises: TypeError or ValueError if value appears to be an array but can't
be parsed as JSON.
"""
value = value.lstrip()
if not value or value[0] not in _bracket_strings:
return None
return json.loads(value) | [
"def",
"_parse_array",
"(",
"value",
")",
":",
"value",
"=",
"value",
".",
"lstrip",
"(",
")",
"if",
"not",
"value",
"or",
"value",
"[",
"0",
"]",
"not",
"in",
"_bracket_strings",
":",
"return",
"None",
"return",
"json",
".",
"loads",
"(",
"value",
")"
] | Coerce value into a list.
:param str value: Value to parse.
:returns: list or None if the value is not a JSON array
:raises: TypeError or ValueError if value appears to be an array but can't
be parsed as JSON. | [
"Coerce",
"value",
"into",
"an",
"list",
"."
] | train | https://github.com/upsight/doctor/blob/2cf1d433f6f1aa1355644b449a757c0660793cdd/doctor/parsers.py#L22-L33 |
upsight/doctor | doctor/parsers.py | _parse_boolean | def _parse_boolean(value):
"""Coerce value into a bool.
:param str value: Value to parse.
:returns: bool or None if the value is not a boolean string.
"""
value = value.lower()
if value in _true_strings:
return True
elif value in _false_strings:
return False
else:
return None | python | def _parse_boolean(value):
"""Coerce value into a bool.
:param str value: Value to parse.
:returns: bool or None if the value is not a boolean string.
"""
value = value.lower()
if value in _true_strings:
return True
elif value in _false_strings:
return False
else:
return None | [
"def",
"_parse_boolean",
"(",
"value",
")",
":",
"value",
"=",
"value",
".",
"lower",
"(",
")",
"if",
"value",
"in",
"_true_strings",
":",
"return",
"True",
"elif",
"value",
"in",
"_false_strings",
":",
"return",
"False",
"else",
":",
"return",
"None"
] | Coerce value into a bool.
:param str value: Value to parse.
:returns: bool or None if the value is not a boolean string. | [
"Coerce",
"value",
"into",
"an",
"bool",
"."
] | train | https://github.com/upsight/doctor/blob/2cf1d433f6f1aa1355644b449a757c0660793cdd/doctor/parsers.py#L36-L48 |
upsight/doctor | doctor/parsers.py | _parse_object | def _parse_object(value):
"""Coerce value into a dict.
:param str value: Value to parse.
:returns: dict or None if the value is not a JSON object
:raises: TypeError or ValueError if value appears to be an object but can't
be parsed as JSON.
"""
value = value.lstrip()
if not value or value[0] not in _brace_strings:
return None
return json.loads(value) | python | def _parse_object(value):
"""Coerce value into a dict.
:param str value: Value to parse.
:returns: dict or None if the value is not a JSON object
:raises: TypeError or ValueError if value appears to be an object but can't
be parsed as JSON.
"""
value = value.lstrip()
if not value or value[0] not in _brace_strings:
return None
return json.loads(value) | [
"def",
"_parse_object",
"(",
"value",
")",
":",
"value",
"=",
"value",
".",
"lstrip",
"(",
")",
"if",
"not",
"value",
"or",
"value",
"[",
"0",
"]",
"not",
"in",
"_brace_strings",
":",
"return",
"None",
"return",
"json",
".",
"loads",
"(",
"value",
")"
] | Coerce value into a dict.
:param str value: Value to parse.
:returns: dict or None if the value is not a JSON object
:raises: TypeError or ValueError if value appears to be an object but can't
be parsed as JSON. | [
"Coerce",
"value",
"into",
"a",
"dict",
"."
] | train | https://github.com/upsight/doctor/blob/2cf1d433f6f1aa1355644b449a757c0660793cdd/doctor/parsers.py#L51-L62 |
upsight/doctor | doctor/parsers.py | parse_value | def parse_value(value, allowed_types, name='value'):
"""Parse a value into one of a number of types.
This function is used to coerce untyped HTTP parameter strings into an
appropriate type. It tries to coerce the value into each of the allowed
types, and uses the first that evaluates properly.
Because this is coercing a string into multiple, potentially ambiguous,
types, it tests things in the order of least ambiguous to most ambiguous:
- The "null" type is checked first. If allowed, and the value is blank
(""), None will be returned.
- The "boolean" type is checked next. Values of "true" (case insensitive)
are True, and values of "false" are False.
- Numeric types are checked next -- first "integer", then "number".
- The "array" type is checked next. A value is only considered a valid
array if it begins with a "[" and can be parsed as JSON.
- The "object" type is checked next. A value is only considered a valid
object if it begins with a "{" and can be parsed as JSON.
- The "string" type is checked last, since any value is a valid string.
Unicode strings are encoded as UTF-8.
:param str value: Parameter value. Example: "1"
:param list allowed_types: Types that should be attempted. Example:
["integer", "null"]
:param str name: Parameter name. If not specified, "value" is used.
Example: "campaign_id"
:returns: a tuple of a type string and coerced value
:raises: ParseError if the value cannot be coerced to any of the types
"""
if not isinstance(value, str):
raise ValueError('value for %r must be a string' % name)
if isinstance(allowed_types, str):
allowed_types = [allowed_types]
# Note that the order of these type considerations is important. Because we
# have an untyped value that may be one of any given number of types, we
# need a consistent order of evaluation in cases when there is ambiguity
# between types.
if 'null' in allowed_types and value == '':
return 'null', None
# For all of these types, we'll pass the value to the function and it will
# raise a TypeError or ValueError or return None if it can't be parsed as
# the given type.
for allowed_type, parser in _parser_funcs:
if allowed_type in allowed_types:
try:
parsed_value = parser(value)
if parsed_value is not None:
return allowed_type, parsed_value
except (TypeError, ValueError):
# Ignore any errors, and continue trying other types
pass
raise ParseError('%s must be a valid type (%s)' %
(name, ', '.join(allowed_types))) | python | def parse_value(value, allowed_types, name='value'):
"""Parse a value into one of a number of types.
This function is used to coerce untyped HTTP parameter strings into an
appropriate type. It tries to coerce the value into each of the allowed
types, and uses the first that evaluates properly.
Because this is coercing a string into multiple, potentially ambiguous,
types, it tests things in the order of least ambiguous to most ambiguous:
- The "null" type is checked first. If allowed, and the value is blank
(""), None will be returned.
- The "boolean" type is checked next. Values of "true" (case insensitive)
are True, and values of "false" are False.
- Numeric types are checked next -- first "integer", then "number".
- The "array" type is checked next. A value is only considered a valid
array if it begins with a "[" and can be parsed as JSON.
- The "object" type is checked next. A value is only considered a valid
object if it begins with a "{" and can be parsed as JSON.
- The "string" type is checked last, since any value is a valid string.
Unicode strings are encoded as UTF-8.
:param str value: Parameter value. Example: "1"
:param list allowed_types: Types that should be attempted. Example:
["integer", "null"]
:param str name: Parameter name. If not specified, "value" is used.
Example: "campaign_id"
:returns: a tuple of a type string and coerced value
:raises: ParseError if the value cannot be coerced to any of the types
"""
if not isinstance(value, str):
raise ValueError('value for %r must be a string' % name)
if isinstance(allowed_types, str):
allowed_types = [allowed_types]
# Note that the order of these type considerations is important. Because we
# have an untyped value that may be one of any given number of types, we
# need a consistent order of evaluation in cases when there is ambiguity
# between types.
if 'null' in allowed_types and value == '':
return 'null', None
# For all of these types, we'll pass the value to the function and it will
# raise a TypeError or ValueError or return None if it can't be parsed as
# the given type.
for allowed_type, parser in _parser_funcs:
if allowed_type in allowed_types:
try:
parsed_value = parser(value)
if parsed_value is not None:
return allowed_type, parsed_value
except (TypeError, ValueError):
# Ignore any errors, and continue trying other types
pass
raise ParseError('%s must be a valid type (%s)' %
(name, ', '.join(allowed_types))) | [
"def",
"parse_value",
"(",
"value",
",",
"allowed_types",
",",
"name",
"=",
"'value'",
")",
":",
"if",
"not",
"isinstance",
"(",
"value",
",",
"str",
")",
":",
"raise",
"ValueError",
"(",
"'value for %r must be a string'",
"%",
"name",
")",
"if",
"isinstance",
"(",
"allowed_types",
",",
"str",
")",
":",
"allowed_types",
"=",
"[",
"allowed_types",
"]",
"# Note that the order of these type considerations is important. Because we",
"# have an untyped value that may be one of any given number of types, we",
"# need a consistent order of evaluation in cases when there is ambiguity",
"# between types.",
"if",
"'null'",
"in",
"allowed_types",
"and",
"value",
"==",
"''",
":",
"return",
"'null'",
",",
"None",
"# For all of these types, we'll pass the value to the function and it will",
"# raise a TypeError or ValueError or return None if it can't be parsed as",
"# the given type.",
"for",
"allowed_type",
",",
"parser",
"in",
"_parser_funcs",
":",
"if",
"allowed_type",
"in",
"allowed_types",
":",
"try",
":",
"parsed_value",
"=",
"parser",
"(",
"value",
")",
"if",
"parsed_value",
"is",
"not",
"None",
":",
"return",
"allowed_type",
",",
"parsed_value",
"except",
"(",
"TypeError",
",",
"ValueError",
")",
":",
"# Ignore any errors, and continue trying other types",
"pass",
"raise",
"ParseError",
"(",
"'%s must be a valid type (%s)'",
"%",
"(",
"name",
",",
"', '",
".",
"join",
"(",
"allowed_types",
")",
")",
")"
] | Parse a value into one of a number of types.
This function is used to coerce untyped HTTP parameter strings into an
appropriate type. It tries to coerce the value into each of the allowed
types, and uses the first that evaluates properly.
Because this is coercing a string into multiple, potentially ambiguous,
types, it tests things in the order of least ambiguous to most ambiguous:
- The "null" type is checked first. If allowed, and the value is blank
(""), None will be returned.
- The "boolean" type is checked next. Values of "true" (case insensitive)
are True, and values of "false" are False.
- Numeric types are checked next -- first "integer", then "number".
- The "array" type is checked next. A value is only considered a valid
array if it begins with a "[" and can be parsed as JSON.
- The "object" type is checked next. A value is only considered a valid
object if it begins with a "{" and can be parsed as JSON.
- The "string" type is checked last, since any value is a valid string.
Unicode strings are encoded as UTF-8.
:param str value: Parameter value. Example: "1"
:param list allowed_types: Types that should be attempted. Example:
["integer", "null"]
:param str name: Parameter name. If not specified, "value" is used.
Example: "campaign_id"
:returns: a tuple of a type string and coerced value
:raises: ParseError if the value cannot be coerced to any of the types | [
"Parse",
"a",
"value",
"into",
"one",
"of",
"a",
"number",
"of",
"types",
"."
] | train | https://github.com/upsight/doctor/blob/2cf1d433f6f1aa1355644b449a757c0660793cdd/doctor/parsers.py#L87-L145 |
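A few hypothetical calls illustrating the coercion order documented above (null, then boolean, then numeric, then array/object, then string). The integer parser itself is not shown in this row, so that output follows from the docstring rather than from visible code.

from doctor.parsers import parse_value   # assumes the doctor package is installed

print(parse_value('', ['integer', 'null']))         # ('null', None)
print(parse_value('true', ['boolean', 'string']))   # ('boolean', True)
print(parse_value('1', ['integer', 'null']))        # ('integer', 1)
print(parse_value('[1, 2]', ['array', 'string']))   # ('array', [1, 2])
print(parse_value('hello', ['string']))             # ('string', 'hello')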
upsight/doctor | doctor/parsers.py | parse_json | def parse_json(value: str, sig_params: List[inspect.Parameter] = None) -> dict:
"""Parse a value as JSON.
This is just a wrapper around json.loads which re-raises any errors as a
ParseError instead.
:param str value: JSON string.
:param dict sig_params: The logic function's signature parameters.
:returns: the parsed JSON value
"""
try:
loaded = json.loads(value)
except Exception as e:
message = 'Error parsing JSON: %r error: %s' % (value, e)
logging.debug(message, exc_info=e)
raise ParseError(message)
if sig_params is not None:
return map_param_names(loaded, sig_params)
return loaded | python | def parse_json(value: str, sig_params: List[inspect.Parameter] = None) -> dict:
"""Parse a value as JSON.
This is just a wrapper around json.loads which re-raises any errors as a
ParseError instead.
:param str value: JSON string.
:param dict sig_params: The logic function's signature parameters.
:returns: the parsed JSON value
"""
try:
loaded = json.loads(value)
except Exception as e:
message = 'Error parsing JSON: %r error: %s' % (value, e)
logging.debug(message, exc_info=e)
raise ParseError(message)
if sig_params is not None:
return map_param_names(loaded, sig_params)
return loaded | [
"def",
"parse_json",
"(",
"value",
":",
"str",
",",
"sig_params",
":",
"List",
"[",
"inspect",
".",
"Parameter",
"]",
"=",
"None",
")",
"->",
"dict",
":",
"try",
":",
"loaded",
"=",
"json",
".",
"loads",
"(",
"value",
")",
"except",
"Exception",
"as",
"e",
":",
"message",
"=",
"'Error parsing JSON: %r error: %s'",
"%",
"(",
"value",
",",
"e",
")",
"logging",
".",
"debug",
"(",
"message",
",",
"exc_info",
"=",
"e",
")",
"raise",
"ParseError",
"(",
"message",
")",
"if",
"sig_params",
"is",
"not",
"None",
":",
"return",
"map_param_names",
"(",
"loaded",
",",
"sig_params",
")",
"return",
"loaded"
] | Parse a value as JSON.
This is just a wrapper around json.loads which re-raises any errors as a
ParseError instead.
:param str value: JSON string.
:param dict sig_params: The logic function's signature parameters.
:returns: the parsed JSON value | [
"Parse",
"a",
"value",
"as",
"JSON",
"."
] | train | https://github.com/upsight/doctor/blob/2cf1d433f6f1aa1355644b449a757c0660793cdd/doctor/parsers.py#L148-L167 |
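A short usage sketch for parse_json() as documented above: valid JSON is returned as parsed Python data, while invalid input is re-raised as doctor's ParseError.

from doctor.parsers import parse_json   # assumes the doctor package is installed

print(parse_json('{"id": 5, "name": "x"}'))   # {'id': 5, 'name': 'x'}
try:
    parse_json('not json')
except Exception as exc:
    print(type(exc).__name__)                 # ParseError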
upsight/doctor | doctor/parsers.py | map_param_names | def map_param_names(
req_params: dict, sig_params: List[inspect.Parameter]) -> dict:
"""Maps request param names to match logic function param names.
If a doctor type defined a `param_name` attribute for the name of the
parameter in the request, we should use that as the key when looking up
the value for the request parameter.
When we declare a type we can specify what the parameter name
should be in the request that the annotated type should get mapped to.
>>> from doctor.types import number
>>> Latitude = number('The latitude', param_name='location.lat')
>>> def my_logic(lat: Latitude): pass
>>> request_params = {'location.lat': 45.2342343}
In the above example doctor knows to pass the value at key `location.lat`
to the logic function variable named `lat` since it's annotated by the
`Latitude` type which specifies what the param_name is on the request.
:param dict req_params: The parameters specified in the request.
:param dict sig_params: The logic function's signature parameters.
:returns: A dict of re-mapped params.
"""
new_request_params = {}
for k, param in sig_params.items():
param_name = getattr(param.annotation, 'param_name', None)
key = k if param_name is None else param_name
if key in req_params:
new_request_params[k] = req_params[key]
return new_request_params | python | def map_param_names(
req_params: dict, sig_params: List[inspect.Parameter]) -> dict:
"""Maps request param names to match logic function param names.
If a doctor type defined a `param_name` attribute for the name of the
parameter in the request, we should use that as the key when looking up
the value for the request parameter.
When we declare a type we can specify what the parameter name
should be in the request that the annotated type should get mapped to.
>>> from doctor.types import number
>>> Latitude = number('The latitude', param_name='location.lat')
>>> def my_logic(lat: Latitude): pass
>>> request_params = {'location.lat': 45.2342343}
In the above example doctor knows to pass the value at key `location.lat`
to the logic function variable named `lat` since it's annotated by the
`Latitude` type which specifies what the param_name is on the request.
:param dict req_params: The parameters specified in the request.
:param dict sig_params: The logic function's signature parameters.
:returns: A dict of re-mapped params.
"""
new_request_params = {}
for k, param in sig_params.items():
param_name = getattr(param.annotation, 'param_name', None)
key = k if param_name is None else param_name
if key in req_params:
new_request_params[k] = req_params[key]
return new_request_params | [
"def",
"map_param_names",
"(",
"req_params",
":",
"dict",
",",
"sig_params",
":",
"List",
"[",
"inspect",
".",
"Parameter",
"]",
")",
"->",
"dict",
":",
"new_request_params",
"=",
"{",
"}",
"for",
"k",
",",
"param",
"in",
"sig_params",
".",
"items",
"(",
")",
":",
"param_name",
"=",
"getattr",
"(",
"param",
".",
"annotation",
",",
"'param_name'",
",",
"None",
")",
"key",
"=",
"k",
"if",
"param_name",
"is",
"None",
"else",
"param_name",
"if",
"key",
"in",
"req_params",
":",
"new_request_params",
"[",
"k",
"]",
"=",
"req_params",
"[",
"key",
"]",
"return",
"new_request_params"
] | Maps request param names to match logic function param names.
If a doctor type defined a `param_name` attribute for the name of the
parameter in the request, we should use that as the key when looking up
the value for the request parameter.
When we declare a type we can specify what the parameter name
should be in the request that the annotated type should get mapped to.
>>> from doctor.types import number
>>> Latitude = number('The latitude', param_name='location.lat')
>>> def my_logic(lat: Latitude): pass
>>> request_params = {'location.lat': 45.2342343}
In the above example doctor knows to pass the value at key `location.lat`
to the logic function variable named `lat` since it's annotated by the
`Latitude` type which specifies what the param_name is on the request.
:param dict req_params: The parameters specified in the request.
:param dict sig_params: The logic function's signature parameters.
:returns: A dict of re-mapped params. | [
"Maps",
"request",
"param",
"names",
"to",
"match",
"logic",
"function",
"param",
"names",
"."
] | train | https://github.com/upsight/doctor/blob/2cf1d433f6f1aa1355644b449a757c0660793cdd/doctor/parsers.py#L180-L210 |
upsight/doctor | doctor/parsers.py | parse_form_and_query_params | def parse_form_and_query_params(req_params: dict, sig_params: dict) -> dict:
"""Uses the parameter annotations to coerce string params.
This is used for HTTP requests, in which the form parameters are all
strings, but need to be converted to the appropriate types before
validating them.
:param dict req_params: The parameters specified in the request.
:param dict sig_params: The logic function's signature parameters.
:returns: a dict of params parsed from the input dict.
:raises TypeSystemError: If there are errors parsing values.
"""
# Importing here to prevent circular dependencies.
from doctor.types import SuperType, UnionType
errors = {}
parsed_params = {}
for param, value in req_params.items():
# Skip request variables not in the function signature.
if param not in sig_params:
continue
# Skip coercing parameters not annotated by a doctor type.
if not issubclass(sig_params[param].annotation, SuperType):
continue
# Check if the type has a custom parser for the parameter.
custom_parser = sig_params[param].annotation.parser
if custom_parser is not None:
if not callable(custom_parser):
warnings.warn(
'Parser `{}` is not callable, using default parser.'.format(
custom_parser))
custom_parser = None
try:
if custom_parser is not None:
parsed_params[param] = custom_parser(value)
else:
if issubclass(sig_params[param].annotation, UnionType):
json_type = [
_native_type_to_json[_type.native_type]
for _type in sig_params[param].annotation.types
]
else:
native_type = sig_params[param].annotation.native_type
json_type = [_native_type_to_json[native_type]]
# If the type is nullable, also add null as an allowed type.
if sig_params[param].annotation.nullable:
json_type.append('null')
_, parsed_params[param] = parse_value(value, json_type)
except ParseError as e:
errors[param] = str(e)
if errors:
raise TypeSystemError(errors, errors=errors)
return parsed_params | python | def parse_form_and_query_params(req_params: dict, sig_params: dict) -> dict:
"""Uses the parameter annotations to coerce string params.
This is used for HTTP requests, in which the form parameters are all
strings, but need to be converted to the appropriate types before
validating them.
:param dict req_params: The parameters specified in the request.
:param dict sig_params: The logic function's signature parameters.
:returns: a dict of params parsed from the input dict.
:raises TypeSystemError: If there are errors parsing values.
"""
# Importing here to prevent circular dependencies.
from doctor.types import SuperType, UnionType
errors = {}
parsed_params = {}
for param, value in req_params.items():
# Skip request variables not in the function signature.
if param not in sig_params:
continue
# Skip coercing parameters not annotated by a doctor type.
if not issubclass(sig_params[param].annotation, SuperType):
continue
# Check if the type has a custom parser for the parameter.
custom_parser = sig_params[param].annotation.parser
if custom_parser is not None:
if not callable(custom_parser):
warnings.warn(
'Parser `{}` is not callable, using default parser.'.format(
custom_parser))
custom_parser = None
try:
if custom_parser is not None:
parsed_params[param] = custom_parser(value)
else:
if issubclass(sig_params[param].annotation, UnionType):
json_type = [
_native_type_to_json[_type.native_type]
for _type in sig_params[param].annotation.types
]
else:
native_type = sig_params[param].annotation.native_type
json_type = [_native_type_to_json[native_type]]
# If the type is nullable, also add null as an allowed type.
if sig_params[param].annotation.nullable:
json_type.append('null')
_, parsed_params[param] = parse_value(value, json_type)
except ParseError as e:
errors[param] = str(e)
if errors:
raise TypeSystemError(errors, errors=errors)
return parsed_params | [
"def",
"parse_form_and_query_params",
"(",
"req_params",
":",
"dict",
",",
"sig_params",
":",
"dict",
")",
"->",
"dict",
":",
"# Importing here to prevent circular dependencies.",
"from",
"doctor",
".",
"types",
"import",
"SuperType",
",",
"UnionType",
"errors",
"=",
"{",
"}",
"parsed_params",
"=",
"{",
"}",
"for",
"param",
",",
"value",
"in",
"req_params",
".",
"items",
"(",
")",
":",
"# Skip request variables not in the function signature.",
"if",
"param",
"not",
"in",
"sig_params",
":",
"continue",
"# Skip coercing parameters not annotated by a doctor type.",
"if",
"not",
"issubclass",
"(",
"sig_params",
"[",
"param",
"]",
".",
"annotation",
",",
"SuperType",
")",
":",
"continue",
"# Check if the type has a custom parser for the parameter.",
"custom_parser",
"=",
"sig_params",
"[",
"param",
"]",
".",
"annotation",
".",
"parser",
"if",
"custom_parser",
"is",
"not",
"None",
":",
"if",
"not",
"callable",
"(",
"custom_parser",
")",
":",
"warnings",
".",
"warn",
"(",
"'Parser `{}` is not callable, using default parser.'",
".",
"format",
"(",
"custom_parser",
")",
")",
"custom_parser",
"=",
"None",
"try",
":",
"if",
"custom_parser",
"is",
"not",
"None",
":",
"parsed_params",
"[",
"param",
"]",
"=",
"custom_parser",
"(",
"value",
")",
"else",
":",
"if",
"issubclass",
"(",
"sig_params",
"[",
"param",
"]",
".",
"annotation",
",",
"UnionType",
")",
":",
"json_type",
"=",
"[",
"_native_type_to_json",
"[",
"_type",
".",
"native_type",
"]",
"for",
"_type",
"in",
"sig_params",
"[",
"param",
"]",
".",
"annotation",
".",
"types",
"]",
"else",
":",
"native_type",
"=",
"sig_params",
"[",
"param",
"]",
".",
"annotation",
".",
"native_type",
"json_type",
"=",
"[",
"_native_type_to_json",
"[",
"native_type",
"]",
"]",
"# If the type is nullable, also add null as an allowed type.",
"if",
"sig_params",
"[",
"param",
"]",
".",
"annotation",
".",
"nullable",
":",
"json_type",
".",
"append",
"(",
"'null'",
")",
"_",
",",
"parsed_params",
"[",
"param",
"]",
"=",
"parse_value",
"(",
"value",
",",
"json_type",
")",
"except",
"ParseError",
"as",
"e",
":",
"errors",
"[",
"param",
"]",
"=",
"str",
"(",
"e",
")",
"if",
"errors",
":",
"raise",
"TypeSystemError",
"(",
"errors",
",",
"errors",
"=",
"errors",
")",
"return",
"parsed_params"
] | Uses the parameter annotations to coerce string params.
This is used for HTTP requests, in which the form parameters are all
strings, but need to be converted to the appropriate types before
validating them.
:param dict req_params: The parameters specified in the request.
:param dict sig_params: The logic function's signature parameters.
:returns: a dict of params parsed from the input dict.
:raises TypeSystemError: If there are errors parsing values. | [
"Uses",
"the",
"parameter",
"annotations",
"to",
"coerce",
"string",
"params",
"."
] | train | https://github.com/upsight/doctor/blob/2cf1d433f6f1aa1355644b449a757c0660793cdd/doctor/parsers.py#L213-L268 |
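A hedged usage sketch for parse_form_and_query_params() from the row above. The doctor.types.number factory is taken from the map_param_names docstring earlier in this dataset; the plain-str parameter is included to show that arguments not annotated with a doctor type are skipped.

import inspect
from doctor.parsers import parse_form_and_query_params   # assumes doctor is installed
from doctor.types import number

Latitude = number('The latitude')

def my_logic(lat: Latitude, note: str = ''):
    return lat

sig_params = inspect.signature(my_logic).parameters
print(parse_form_and_query_params({'lat': '45.25', 'note': 'hi'}, sig_params))
# {'lat': 45.25}   -- 'note' is left out because str is not a doctor SuperType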
Workiva/furious | furious/handlers/webapp.py | AsyncJobHandler._handle_task | def _handle_task(self):
"""Pass request info to the async framework."""
headers = self.request.headers
message = None
try:
status_code, output = process_async_task(
headers, self.request.body)
except AbortAndRestart as restart:
# Async retry status code
status_code = 549
message = 'Retry Async Task'
output = str(restart)
self.response.set_status(status_code, message)
self.response.out.write(output) | python | def _handle_task(self):
"""Pass request info to the async framework."""
headers = self.request.headers
message = None
try:
status_code, output = process_async_task(
headers, self.request.body)
except AbortAndRestart as restart:
# Async retry status code
status_code = 549
message = 'Retry Async Task'
output = str(restart)
self.response.set_status(status_code, message)
self.response.out.write(output) | [
"def",
"_handle_task",
"(",
"self",
")",
":",
"headers",
"=",
"self",
".",
"request",
".",
"headers",
"message",
"=",
"None",
"try",
":",
"status_code",
",",
"output",
"=",
"process_async_task",
"(",
"headers",
",",
"self",
".",
"request",
".",
"body",
")",
"except",
"AbortAndRestart",
"as",
"restart",
":",
"# Async retry status code",
"status_code",
"=",
"549",
"message",
"=",
"'Retry Async Task'",
"output",
"=",
"str",
"(",
"restart",
")",
"self",
".",
"response",
".",
"set_status",
"(",
"status_code",
",",
"message",
")",
"self",
".",
"response",
".",
"out",
".",
"write",
"(",
"output",
")"
] | Pass request info to the async framework. | [
"Pass",
"request",
"info",
"to",
"the",
"async",
"framework",
"."
] | train | https://github.com/Workiva/furious/blob/c29823ec8b98549e7439d7273aa064d1e5830632/furious/handlers/webapp.py#L30-L45 |
QualiSystems/cloudshell-networking-devices | cloudshell/devices/autoload/autoload_migration_helper.py | migrate_autoload_details | def migrate_autoload_details(autoload_details, shell_name, shell_type):
""" Migrate autoload details. Add namespace for attributes
:param autoload_details:
:param shell_name:
:param shell_type:
:return:
"""
mapping = {}
for resource in autoload_details.resources:
resource.model = "{shell_name}.{model}".format(shell_name=shell_name, model=resource.model)
mapping[resource.relative_address] = resource.model
for attribute in autoload_details.attributes:
if not attribute.relative_address: # Root element
attribute.attribute_name = "{shell_type}.{attr_name}".format(shell_type=shell_type,
attr_name=attribute.attribute_name)
else:
attribute.attribute_name = "{model}.{attr_name}".format(model=mapping[attribute.relative_address],
attr_name=attribute.attribute_name)
return autoload_details | python | def migrate_autoload_details(autoload_details, shell_name, shell_type):
""" Migrate autoload details. Add namespace for attributes
:param autoload_details:
:param shell_name:
:param shell_type:
:return:
"""
mapping = {}
for resource in autoload_details.resources:
resource.model = "{shell_name}.{model}".format(shell_name=shell_name, model=resource.model)
mapping[resource.relative_address] = resource.model
for attribute in autoload_details.attributes:
if not attribute.relative_address: # Root element
attribute.attribute_name = "{shell_type}.{attr_name}".format(shell_type=shell_type,
attr_name=attribute.attribute_name)
else:
attribute.attribute_name = "{model}.{attr_name}".format(model=mapping[attribute.relative_address],
attr_name=attribute.attribute_name)
return autoload_details | [
"def",
"migrate_autoload_details",
"(",
"autoload_details",
",",
"shell_name",
",",
"shell_type",
")",
":",
"mapping",
"=",
"{",
"}",
"for",
"resource",
"in",
"autoload_details",
".",
"resources",
":",
"resource",
".",
"model",
"=",
"\"{shell_name}.{model}\"",
".",
"format",
"(",
"shell_name",
"=",
"shell_name",
",",
"model",
"=",
"resource",
".",
"model",
")",
"mapping",
"[",
"resource",
".",
"relative_address",
"]",
"=",
"resource",
".",
"model",
"for",
"attribute",
"in",
"autoload_details",
".",
"attributes",
":",
"if",
"not",
"attribute",
".",
"relative_address",
":",
"# Root element",
"attribute",
".",
"attribute_name",
"=",
"\"{shell_type}.{attr_name}\"",
".",
"format",
"(",
"shell_type",
"=",
"shell_type",
",",
"attr_name",
"=",
"attribute",
".",
"attribute_name",
")",
"else",
":",
"attribute",
".",
"attribute_name",
"=",
"\"{model}.{attr_name}\"",
".",
"format",
"(",
"model",
"=",
"mapping",
"[",
"attribute",
".",
"relative_address",
"]",
",",
"attr_name",
"=",
"attribute",
".",
"attribute_name",
")",
"return",
"autoload_details"
] | Migrate autoload details. Add namespace for attributes
:param autoload_details:
:param shell_name:
:param shell_type:
:return: | [
"Migrate",
"autoload",
"details",
".",
"Add",
"namespace",
"for",
"attributes"
] | train | https://github.com/QualiSystems/cloudshell-networking-devices/blob/009aab33edb30035b52fe10dbb91db61c95ba4d9/cloudshell/devices/autoload/autoload_migration_helper.py#L76-L99 |
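A rough illustration of the namespace-prefixing behaviour documented above, using types.SimpleNamespace objects as hypothetical stand-ins for CloudShell's autoload detail/resource/attribute classes.

from types import SimpleNamespace
from cloudshell.devices.autoload.autoload_migration_helper import migrate_autoload_details   # assumes the package is installed

chassis    = SimpleNamespace(model='GenericChassis', relative_address='1')
root_attr  = SimpleNamespace(relative_address='', attribute_name='Vendor')
child_attr = SimpleNamespace(relative_address='1', attribute_name='Model Name')
details    = SimpleNamespace(resources=[chassis], attributes=[root_attr, child_attr])

migrate_autoload_details(details, shell_name='MyShell', shell_type='CS_Switch')
print(chassis.model)               # MyShell.GenericChassis
print(root_attr.attribute_name)    # CS_Switch.Vendor
print(child_attr.attribute_name)   # MyShell.GenericChassis.Model Name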
QualiSystems/cloudshell-networking-devices | cloudshell/devices/standards/firewall/configuration_attributes_structure.py | create_firewall_resource_from_context | def create_firewall_resource_from_context(shell_name, supported_os, context):
"""
Creates an instance of Firewall Resource by given context
:param shell_name: Shell Name
:type shell_name: str
:param supported_os: list of supported OS
:type supported_os: list
:param context: cloudshell.shell.core.driver_context.ResourceCommandContext
:type context: cloudshell.shell.core.driver_context.ResourceCommandContext
:return:
:rtype GenericNetworkingResource
"""
result = GenericFirewallResource(shell_name=shell_name, name=context.resource.name, supported_os=supported_os)
result.address = context.resource.address
result.family = context.resource.family
result.fullname = context.resource.fullname
result.attributes = dict(context.resource.attributes)
return result | python | def create_firewall_resource_from_context(shell_name, supported_os, context):
"""
Creates an instance of Firewall Resource by given context
:param shell_name: Shell Name
:type shell_name: str
:param supported_os: list of supported OS
:type supported_os: list
:param context: cloudshell.shell.core.driver_context.ResourceCommandContext
:type context: cloudshell.shell.core.driver_context.ResourceCommandContext
:return:
:rtype GenericNetworkingResource
"""
result = GenericFirewallResource(shell_name=shell_name, name=context.resource.name, supported_os=supported_os)
result.address = context.resource.address
result.family = context.resource.family
result.fullname = context.resource.fullname
result.attributes = dict(context.resource.attributes)
return result | [
"def",
"create_firewall_resource_from_context",
"(",
"shell_name",
",",
"supported_os",
",",
"context",
")",
":",
"result",
"=",
"GenericFirewallResource",
"(",
"shell_name",
"=",
"shell_name",
",",
"name",
"=",
"context",
".",
"resource",
".",
"name",
",",
"supported_os",
"=",
"supported_os",
")",
"result",
".",
"address",
"=",
"context",
".",
"resource",
".",
"address",
"result",
".",
"family",
"=",
"context",
".",
"resource",
".",
"family",
"result",
".",
"fullname",
"=",
"context",
".",
"resource",
".",
"fullname",
"result",
".",
"attributes",
"=",
"dict",
"(",
"context",
".",
"resource",
".",
"attributes",
")",
"return",
"result"
] | Creates an instance of Firewall Resource by given context
:param shell_name: Shell Name
:type shell_name: str
:param supported_os: list of supported OS
:type supported_os: list
:param context: cloudshell.shell.core.driver_context.ResourceCommandContext
:type context: cloudshell.shell.core.driver_context.ResourceCommandContext
:return:
:rtype GenericNetworkingResource | [
"Creates",
"an",
"instance",
"of",
"Firewall",
"Resource",
"by",
"given",
"context",
":",
"param",
"shell_name",
":",
"Shell",
"Name",
":",
"type",
"shell_name",
":",
"str",
":",
"param",
"supported_os",
":",
"list",
"of",
"supported",
"OS",
":",
"type",
"supported_os",
":",
"list",
":",
"param",
"context",
":",
"cloudshell",
".",
"shell",
".",
"core",
".",
"driver_context",
".",
"ResourceCommandContext",
":",
"type",
"context",
":",
"cloudshell",
".",
"shell",
".",
"core",
".",
"driver_context",
".",
"ResourceCommandContext",
":",
"return",
":",
":",
"rtype",
"GenericNetworkingResource"
] | train | https://github.com/QualiSystems/cloudshell-networking-devices/blob/009aab33edb30035b52fe10dbb91db61c95ba4d9/cloudshell/devices/standards/firewall/configuration_attributes_structure.py#L213-L233 |
horejsek/python-sqlpuzzle | sqlpuzzle/_common/sqlvalue.py | SqlValue._get_convert_method | def _get_convert_method(self):
"""
Get the right method to convert the value.
"""
for type_, method in self._map.items():
if type(self.value) is bool and type_ is not bool:
continue
if isinstance(self.value, type_):
return method
if is_sql_instance(self.value):
return self._raw
return self._undefined | python | def _get_convert_method(self):
"""
Get the right method to convert the value.
"""
for type_, method in self._map.items():
if type(self.value) is bool and type_ is not bool:
continue
if isinstance(self.value, type_):
return method
if is_sql_instance(self.value):
return self._raw
return self._undefined | [
"def",
"_get_convert_method",
"(",
"self",
")",
":",
"for",
"type_",
",",
"method",
"in",
"self",
".",
"_map",
".",
"items",
"(",
")",
":",
"if",
"type",
"(",
"self",
".",
"value",
")",
"is",
"bool",
"and",
"type_",
"is",
"not",
"bool",
":",
"continue",
"if",
"isinstance",
"(",
"self",
".",
"value",
",",
"type_",
")",
":",
"return",
"method",
"if",
"is_sql_instance",
"(",
"self",
".",
"value",
")",
":",
"return",
"self",
".",
"_raw",
"return",
"self",
".",
"_undefined"
] | Get the right method to convert the value. | [
"Get",
"right",
"method",
"to",
"convert",
"of",
"the",
"value",
"."
] | train | https://github.com/horejsek/python-sqlpuzzle/blob/d3a42ed1b339b8eafddb8d2c28a3a5832b3998dd/sqlpuzzle/_common/sqlvalue.py#L68-L79 |
kmedian/korr | korr/flatten.py | flatten | def flatten(rho, pval, sortby="cor"):
"""Flatten correlation and p-value matrix
Parameters:
-----------
rho : ndarray
Correlation Matrix
pval : ndarray
Matrix with p-values
sortby : str
sort the output table by
- "cor" the highest absolute correlation coefficient
- "pval" the lowest p-value
Return:
-------
tab : ndarray
Table with (i, j, cor, pval) rows
Example:
--------
from korr import pearson, flatten
rho, pval = pearson(X)
tab = flatten(rho, pval, sortby="pval")
tab.values
"""
n = rho.shape[0]
idx = np.triu_indices(n, k=1)
tab = pd.DataFrame(
columns=['i', 'j', 'cor', 'pval'],
data=np.c_[idx[0], idx[1], rho[idx], pval[idx]])
tab[['i', "j"]] = tab[['i', "j"]].astype(int)
if sortby == "cor":
tab['abscor'] = np.abs(tab['cor'])
tab.sort_values(by='abscor', inplace=True, ascending=False)
elif sortby == "pval":
tab.sort_values(by='pval', inplace=True, ascending=True)
return tab[["i", "j", "cor", "pval"]] | python | def flatten(rho, pval, sortby="cor"):
"""Flatten correlation and p-value matrix
Parameters:
-----------
rho : ndarray
Correlation Matrix
pval : ndarray
Matrix with p-values
sortby : str
sort the output table by
- "cor" the highest absolute correlation coefficient
- "pval" the lowest p-value
Return:
-------
tab : ndarray
Table with (i, j, cor, pval) rows
Example:
--------
from korr import pearson, flatten
rho, pval = pearson(X)
tab = flatten(rho, pval, sortby="pval")
tab.values
"""
n = rho.shape[0]
idx = np.triu_indices(n, k=1)
tab = pd.DataFrame(
columns=['i', 'j', 'cor', 'pval'],
data=np.c_[idx[0], idx[1], rho[idx], pval[idx]])
tab[['i', "j"]] = tab[['i', "j"]].astype(int)
if sortby == "cor":
tab['abscor'] = np.abs(tab['cor'])
tab.sort_values(by='abscor', inplace=True, ascending=False)
elif sortby == "pval":
tab.sort_values(by='pval', inplace=True, ascending=True)
return tab[["i", "j", "cor", "pval"]] | [
"def",
"flatten",
"(",
"rho",
",",
"pval",
",",
"sortby",
"=",
"\"cor\"",
")",
":",
"n",
"=",
"rho",
".",
"shape",
"[",
"0",
"]",
"idx",
"=",
"np",
".",
"triu_indices",
"(",
"n",
",",
"k",
"=",
"1",
")",
"tab",
"=",
"pd",
".",
"DataFrame",
"(",
"columns",
"=",
"[",
"'i'",
",",
"'j'",
",",
"'cor'",
",",
"'pval'",
"]",
",",
"data",
"=",
"np",
".",
"c_",
"[",
"idx",
"[",
"0",
"]",
",",
"idx",
"[",
"1",
"]",
",",
"rho",
"[",
"idx",
"]",
",",
"pval",
"[",
"idx",
"]",
"]",
")",
"tab",
"[",
"[",
"'i'",
",",
"\"j\"",
"]",
"]",
"=",
"tab",
"[",
"[",
"'i'",
",",
"\"j\"",
"]",
"]",
".",
"astype",
"(",
"int",
")",
"if",
"sortby",
"==",
"\"cor\"",
":",
"tab",
"[",
"'abscor'",
"]",
"=",
"np",
".",
"abs",
"(",
"tab",
"[",
"'cor'",
"]",
")",
"tab",
".",
"sort_values",
"(",
"by",
"=",
"'abscor'",
",",
"inplace",
"=",
"True",
",",
"ascending",
"=",
"False",
")",
"elif",
"sortby",
"==",
"\"pval\"",
":",
"tab",
".",
"sort_values",
"(",
"by",
"=",
"'pval'",
",",
"inplace",
"=",
"True",
",",
"ascending",
"=",
"True",
")",
"return",
"tab",
"[",
"[",
"\"i\"",
",",
"\"j\"",
",",
"\"cor\"",
",",
"\"pval\"",
"]",
"]"
] | Flatten correlation and p-value matrix
Parameters:
-----------
rho : ndarray
Correlation Matrix
pval : ndarray
Matrix with p-values
sortby : str
sort the output table by
- "cor" the highest absolute correlation coefficient
- "pval" the lowest p-value
Return:
-------
tab : ndarray
Table with (i, j, cor, pval) rows
Example:
--------
from korr import pearson, flatten
rho, pval = pearson(X)
tab = flatten(rho, pval, sortby="pval")
tab.values | [
"Flatten",
"correlation",
"and",
"p",
"-",
"value",
"matrix"
] | train | https://github.com/kmedian/korr/blob/4eb86fc14b1fc1b69204069b7753d115b327c937/korr/flatten.py#L5-L48 |
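The heart of flatten is np.triu_indices over the strict upper triangle, so each unordered pair (i, j) appears exactly once. A self-contained sketch of that step with hand-made matrices (the numbers below are made up; with korr installed you would pass the output of pearson straight in):

```python
import numpy as np
import pandas as pd

# Hand-made 3x3 correlation and p-value matrices (values are made up).
rho = np.array([[1.0, 0.8, -0.1],
                [0.8, 1.0, 0.3],
                [-0.1, 0.3, 1.0]])
pval = np.array([[0.0, 0.01, 0.70],
                 [0.01, 0.0, 0.20],
                 [0.70, 0.20, 0.0]])

# The same strict upper-triangle flattening that korr.flatten performs.
idx = np.triu_indices(rho.shape[0], k=1)
tab = pd.DataFrame({"i": idx[0], "j": idx[1],
                    "cor": rho[idx], "pval": pval[idx]})
print(tab.sort_values("pval"))
# Rows come out as (0,1), (1,2), (0,2) when ordered by ascending p-value.
```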
wallento/riscv-python-model | riscvmodel/types.py | Immediate.max | def max(self) -> int:
"""
Get the maximum value this immediate can have
:return: Maximum value of this immediate
"""
if self.signed:
v = (1 << (self.bits - 1)) - 1
else:
v = (1 << self.bits) - 1
if self.lsb0:
v = v - (v % 2)
return v | python | def max(self) -> int:
"""
Get the maximum value this immediate can have
:return: Maximum value of this immediate
"""
if self.signed:
v = (1 << (self.bits - 1)) - 1
else:
v = (1 << self.bits) - 1
if self.lsb0:
v = v - (v % 2)
return v | [
"def",
"max",
"(",
"self",
")",
"->",
"int",
":",
"if",
"self",
".",
"signed",
":",
"v",
"=",
"(",
"1",
"<<",
"(",
"self",
".",
"bits",
"-",
"1",
")",
")",
"-",
"1",
"else",
":",
"v",
"=",
"(",
"1",
"<<",
"self",
".",
"bits",
")",
"-",
"1",
"if",
"self",
".",
"lsb0",
":",
"v",
"=",
"v",
"-",
"(",
"v",
"%",
"2",
")",
"return",
"v"
] | Get the maximum value this immediate can have
:return: Maximum value of this immediate | [
"Get",
"the",
"maximum",
"value",
"this",
"immediate",
"can",
"have"
] | train | https://github.com/wallento/riscv-python-model/blob/51df07d16b79b143eb3d3c1e95bf26030c64a39b/riscvmodel/types.py#L34-L46 |
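The bound arithmetic is ordinary two's-complement range math; a 12-bit signed immediate (the width RISC-V I-type instructions use) tops out at 2047. A standalone sketch of the same computation, not using the Immediate class itself:

```python
def imm_max(bits, signed, lsb0=False):
    # Mirrors Immediate.max above: signed fields keep bits-1 magnitude bits.
    v = (1 << (bits - 1)) - 1 if signed else (1 << bits) - 1
    if lsb0:
        v -= v % 2  # force the least significant bit to zero
    return v

print(imm_max(12, signed=True))              # 2047  (I-type immediate)
print(imm_max(12, signed=False))             # 4095
print(imm_max(13, signed=True, lsb0=True))   # 4094  (branch offset, bit 0 = 0)
```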
wallento/riscv-python-model | riscvmodel/types.py | Immediate.set | def set(self, value: int):
"""
Set the immediate to a value. This function checks if the value is valid and will raise an
:class:`InvalidImmediateException` if it doesn't.
:param value: Value to set the immediate to
:type value: int
:raises InvalidImmediateException: value does not match immediate
"""
if not isinstance(value, int):
raise self.exception("{} is not an integer".format(value))
if self.lsb0 and self.value % 2 == 1:
raise self.exception("{} not power of two".format(value))
if not self.signed and value < 0:
raise self.exception("{} cannot be negative".format(value))
if value < self.min() or value > self.max():
raise self.exception("{} not in allowed range {}-{}".format(value, self.min(), self.max()))
self.value = value | python | def set(self, value: int):
"""
Set the immediate to a value. This function checks if the value is valid and will raise an
:class:`InvalidImmediateException` if it doesn't.
:param value: Value to set the immediate to
:type value: int
:raises InvalidImmediateException: value does not match immediate
"""
if not isinstance(value, int):
raise self.exception("{} is not an integer".format(value))
if self.lsb0 and self.value % 2 == 1:
raise self.exception("{} not power of two".format(value))
if not self.signed and value < 0:
raise self.exception("{} cannot be negative".format(value))
if value < self.min() or value > self.max():
raise self.exception("{} not in allowed range {}-{}".format(value, self.min(), self.max()))
self.value = value | [
"def",
"set",
"(",
"self",
",",
"value",
":",
"int",
")",
":",
"if",
"not",
"isinstance",
"(",
"value",
",",
"int",
")",
":",
"raise",
"self",
".",
"exception",
"(",
"\"{} is not an integer\"",
".",
"format",
"(",
"value",
")",
")",
"if",
"self",
".",
"lsb0",
"and",
"self",
".",
"value",
"%",
"2",
"==",
"1",
":",
"raise",
"self",
".",
"exception",
"(",
"\"{} not power of two\"",
".",
"format",
"(",
"value",
")",
")",
"if",
"not",
"self",
".",
"signed",
"and",
"value",
"<",
"0",
":",
"raise",
"self",
".",
"exception",
"(",
"\"{} cannot be negative\"",
".",
"format",
"(",
"value",
")",
")",
"if",
"value",
"<",
"self",
".",
"min",
"(",
")",
"or",
"value",
">",
"self",
".",
"max",
"(",
")",
":",
"raise",
"self",
".",
"exception",
"(",
"\"{} not in allowed range {}-{}\"",
".",
"format",
"(",
"value",
",",
"self",
".",
"min",
"(",
")",
",",
"self",
".",
"max",
"(",
")",
")",
")",
"self",
".",
"value",
"=",
"value"
] | Set the immediate to a value. This function checks if the value is valid and will raise an
:class:`InvalidImmediateException` if it doesn't.
:param value: Value to set the immediate to
:type value: int
:raises InvalidImmediateException: value does not match immediate | [
"Set",
"the",
"immediate",
"to",
"a",
"value",
".",
"This",
"function",
"checks",
"if",
"the",
"value",
"is",
"valid",
"and",
"will",
"raise",
"an",
":",
"class",
":",
"InvalidImmediateException",
"if",
"it",
"doesn",
"t",
"."
] | train | https://github.com/wallento/riscv-python-model/blob/51df07d16b79b143eb3d3c1e95bf26030c64a39b/riscvmodel/types.py#L64-L83 |
wallento/riscv-python-model | riscvmodel/types.py | Immediate.set_from_bits | def set_from_bits(self, value: int):
"""
Set the immediate value from machine code bits. Those are not sign extended, so it will take care of the
proper handling.
:param value: Value to set the immediate to
:type value: int
"""
if self.signed:
value = -(value & self.tcmask) + (value & ~self.tcmask)
self.set(value) | python | def set_from_bits(self, value: int):
"""
Set the immediate value from machine code bits. Those are not sign extended, so it will take care of the
proper handling.
:param value: Value to set the immediate to
:type value: int
"""
if self.signed:
value = -(value & self.tcmask) + (value & ~self.tcmask)
self.set(value) | [
"def",
"set_from_bits",
"(",
"self",
",",
"value",
":",
"int",
")",
":",
"if",
"self",
".",
"signed",
":",
"value",
"=",
"-",
"(",
"value",
"&",
"self",
".",
"tcmask",
")",
"+",
"(",
"value",
"&",
"~",
"self",
".",
"tcmask",
")",
"self",
".",
"set",
"(",
"value",
")"
] | Set the immediate value from machine code bits. Those are not sign extended, so it will take care of the
proper handling.
:param value: Value to set the immediate to
:type value: int | [
"Set",
"the",
"immediate",
"value",
"from",
"machine",
"code",
"bits",
".",
"Those",
"are",
"not",
"sign",
"extended",
"so",
"it",
"will",
"take",
"care",
"of",
"the",
"proper",
"handling",
"."
] | train | https://github.com/wallento/riscv-python-model/blob/51df07d16b79b143eb3d3c1e95bf26030c64a39b/riscvmodel/types.py#L85-L95 |
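set_from_bits undoes the two's-complement encoding with the sign-bit mask tcmask; its exact value is not shown in this record, but given the expression it is presumably 1 << (bits - 1), which is what the standalone sketch below assumes:

```python
def from_bits(raw, bits):
    # Assumed mask: the sign bit of a two's-complement field `bits` wide.
    tcmask = 1 << (bits - 1)
    # Same expression as Immediate.set_from_bits above.
    return -(raw & tcmask) + (raw & ~tcmask)

# 12-bit field: 0xFFF encodes -1, 0x800 encodes -2048, 0x7FF encodes +2047.
print(from_bits(0xFFF, 12))  # -1
print(from_bits(0x800, 12))  # -2048
print(from_bits(0x7FF, 12))  # 2047
```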
wallento/riscv-python-model | riscvmodel/types.py | Immediate.randomize | def randomize(self):
"""
Randomize this immediate to a legal value
"""
self.value = randint(self.min(), self.max())
if self.lsb0:
self.value = self.value - (self.value % 2) | python | def randomize(self):
"""
Randomize this immediate to a legal value
"""
self.value = randint(self.min(), self.max())
if self.lsb0:
self.value = self.value - (self.value % 2) | [
"def",
"randomize",
"(",
"self",
")",
":",
"self",
".",
"value",
"=",
"randint",
"(",
"self",
".",
"min",
"(",
")",
",",
"self",
".",
"max",
"(",
")",
")",
"if",
"self",
".",
"lsb0",
":",
"self",
".",
"value",
"=",
"self",
".",
"value",
"-",
"(",
"self",
".",
"value",
"%",
"2",
")"
] | Randomize this immediate to a legal value | [
"Randomize",
"this",
"immediate",
"to",
"a",
"legal",
"value"
] | train | https://github.com/wallento/riscv-python-model/blob/51df07d16b79b143eb3d3c1e95bf26030c64a39b/riscvmodel/types.py#L97-L103 |
Workiva/furious | furious/context/_local.py | _init | def _init():
"""Initialize the furious context and registry.
NOTE: Do not directly run this method.
"""
# If there is a context and it is initialized to this request,
# return, otherwise reinitialize the _local_context.
if (hasattr(_local_context, '_initialized') and
_local_context._initialized == os.environ.get('REQUEST_ID_HASH')):
return
# Used to track the context object stack.
_local_context.registry = []
# Used to provide easy access to the currently running Async job.
_local_context._executing_async_context = None
_local_context._executing_async = []
# So that we do not inadvertently reinitialize the local context.
_local_context._initialized = os.environ.get('REQUEST_ID_HASH')
return _local_context | python | def _init():
"""Initialize the furious context and registry.
NOTE: Do not directly run this method.
"""
# If there is a context and it is initialized to this request,
# return, otherwise reinitialize the _local_context.
if (hasattr(_local_context, '_initialized') and
_local_context._initialized == os.environ.get('REQUEST_ID_HASH')):
return
# Used to track the context object stack.
_local_context.registry = []
# Used to provide easy access to the currently running Async job.
_local_context._executing_async_context = None
_local_context._executing_async = []
# So that we do not inadvertently reinitialize the local context.
_local_context._initialized = os.environ.get('REQUEST_ID_HASH')
return _local_context | [
"def",
"_init",
"(",
")",
":",
"# If there is a context and it is initialized to this request,",
"# return, otherwise reinitialize the _local_context.",
"if",
"(",
"hasattr",
"(",
"_local_context",
",",
"'_initialized'",
")",
"and",
"_local_context",
".",
"_initialized",
"==",
"os",
".",
"environ",
".",
"get",
"(",
"'REQUEST_ID_HASH'",
")",
")",
":",
"return",
"# Used to track the context object stack.",
"_local_context",
".",
"registry",
"=",
"[",
"]",
"# Used to provide easy access to the currently running Async job.",
"_local_context",
".",
"_executing_async_context",
"=",
"None",
"_local_context",
".",
"_executing_async",
"=",
"[",
"]",
"# So that we do not inadvertently reinitialize the local context.",
"_local_context",
".",
"_initialized",
"=",
"os",
".",
"environ",
".",
"get",
"(",
"'REQUEST_ID_HASH'",
")",
"return",
"_local_context"
] | Initialize the furious context and registry.
NOTE: Do not directly run this method. | [
"Initialize",
"the",
"furious",
"context",
"and",
"registry",
"."
] | train | https://github.com/Workiva/furious/blob/c29823ec8b98549e7439d7273aa064d1e5830632/furious/context/_local.py#L50-L71 |
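The guard keys the thread-local off REQUEST_ID_HASH, so state is rebuilt once per App Engine request even when instances and threads are reused. A minimal sketch of that per-request initialization pattern with stand-in names (not furious's actual module layout):

```python
import os
import threading

_local = threading.local()

def init_per_request():
    # Re-initialize only when handling a new request; App Engine exposes a
    # fresh REQUEST_ID_HASH in the environment for each incoming request.
    request_id = os.environ.get('REQUEST_ID_HASH')
    if getattr(_local, '_initialized', None) == request_id:
        return _local
    _local.registry = []            # context object stack
    _local.executing_async = []     # currently running jobs
    _local._initialized = request_id
    return _local
```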
Workiva/furious | furious/processors.py | run_job | def run_job():
"""Takes an async object and executes its job."""
async = get_current_async()
async_options = async.get_options()
job = async_options.get('job')
if not job:
raise Exception('This async contains no job to execute!')
__, args, kwargs = job
if args is None:
args = ()
if kwargs is None:
kwargs = {}
function = async._decorate_job()
try:
async.executing = True
async.result = AsyncResult(payload=function(*args, **kwargs),
status=AsyncResult.SUCCESS)
except Abort as abort:
logging.info('Async job was aborted: %r', abort)
async.result = AsyncResult(status=AsyncResult.ABORT)
# QUESTION: In this eventuality, we should probably tell the context we
# are "complete" and let it handle completion checking.
_handle_context_completion_check(async)
return
except AbortAndRestart as restart:
logging.info('Async job was aborted and restarted: %r', restart)
raise
except BaseException as e:
async.result = AsyncResult(payload=encode_exception(e),
status=AsyncResult.ERROR)
_handle_results(async_options)
_handle_context_completion_check(async) | python | def run_job():
"""Takes an async object and executes its job."""
async = get_current_async()
async_options = async.get_options()
job = async_options.get('job')
if not job:
raise Exception('This async contains no job to execute!')
__, args, kwargs = job
if args is None:
args = ()
if kwargs is None:
kwargs = {}
function = async._decorate_job()
try:
async.executing = True
async.result = AsyncResult(payload=function(*args, **kwargs),
status=AsyncResult.SUCCESS)
except Abort as abort:
logging.info('Async job was aborted: %r', abort)
async.result = AsyncResult(status=AsyncResult.ABORT)
# QUESTION: In this eventuality, we should probably tell the context we
# are "complete" and let it handle completion checking.
_handle_context_completion_check(async)
return
except AbortAndRestart as restart:
logging.info('Async job was aborted and restarted: %r', restart)
raise
except BaseException as e:
async.result = AsyncResult(payload=encode_exception(e),
status=AsyncResult.ERROR)
_handle_results(async_options)
_handle_context_completion_check(async) | [
"def",
"run_job",
"(",
")",
":",
"async",
"=",
"get_current_async",
"(",
")",
"async_options",
"=",
"async",
".",
"get_options",
"(",
")",
"job",
"=",
"async_options",
".",
"get",
"(",
"'job'",
")",
"if",
"not",
"job",
":",
"raise",
"Exception",
"(",
"'This async contains no job to execute!'",
")",
"__",
",",
"args",
",",
"kwargs",
"=",
"job",
"if",
"args",
"is",
"None",
":",
"args",
"=",
"(",
")",
"if",
"kwargs",
"is",
"None",
":",
"kwargs",
"=",
"{",
"}",
"function",
"=",
"async",
".",
"_decorate_job",
"(",
")",
"try",
":",
"async",
".",
"executing",
"=",
"True",
"async",
".",
"result",
"=",
"AsyncResult",
"(",
"payload",
"=",
"function",
"(",
"*",
"args",
",",
"*",
"*",
"kwargs",
")",
",",
"status",
"=",
"AsyncResult",
".",
"SUCCESS",
")",
"except",
"Abort",
"as",
"abort",
":",
"logging",
".",
"info",
"(",
"'Async job was aborted: %r'",
",",
"abort",
")",
"async",
".",
"result",
"=",
"AsyncResult",
"(",
"status",
"=",
"AsyncResult",
".",
"ABORT",
")",
"# QUESTION: In this eventuality, we should probably tell the context we",
"# are \"complete\" and let it handle completion checking.",
"_handle_context_completion_check",
"(",
"async",
")",
"return",
"except",
"AbortAndRestart",
"as",
"restart",
":",
"logging",
".",
"info",
"(",
"'Async job was aborted and restarted: %r'",
",",
"restart",
")",
"raise",
"except",
"BaseException",
"as",
"e",
":",
"async",
".",
"result",
"=",
"AsyncResult",
"(",
"payload",
"=",
"encode_exception",
"(",
"e",
")",
",",
"status",
"=",
"AsyncResult",
".",
"ERROR",
")",
"_handle_results",
"(",
"async_options",
")",
"_handle_context_completion_check",
"(",
"async",
")"
] | Takes an async object and executes its job. | [
"Takes",
"an",
"async",
"object",
"and",
"executes",
"its",
"job",
"."
] | train | https://github.com/Workiva/furious/blob/c29823ec8b98549e7439d7273aa064d1e5830632/furious/processors.py#L36-L75 |
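run_job funnels every outcome of the target into a result object carrying a payload plus a SUCCESS/ABORT/ERROR status, which the later callback handling branches on. A compact sketch of that wrap-and-classify flow using stand-in classes (furious's real AsyncResult and Abort types differ in detail):

```python
# Stand-in result wrapper; furious's AsyncResult carries payload + status.
class Result:
    SUCCESS, ERROR, ABORT = 'success', 'error', 'abort'

    def __init__(self, payload=None, status=SUCCESS):
        self.payload, self.status = payload, status

class Abort(Exception):
    """Raised by a job to stop without treating it as a failure."""

def run(target, *args, **kwargs):
    try:
        return Result(target(*args, **kwargs), Result.SUCCESS)
    except Abort:
        return Result(status=Result.ABORT)
    except BaseException as exc:  # broad on purpose, mirroring run_job above
        return Result(payload=exc, status=Result.ERROR)

print(run(lambda x: x * 2, 21).status)   # success
print(run(lambda: 1 / 0).status)         # error
```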
Workiva/furious | furious/processors.py | _handle_results | def _handle_results(options):
"""Process the results of executing the Async's target."""
results_processor = options.get('_process_results')
if not results_processor:
results_processor = _process_results
processor_result = results_processor()
if isinstance(processor_result, (Async, Context)):
processor_result.start() | python | def _handle_results(options):
"""Process the results of executing the Async's target."""
results_processor = options.get('_process_results')
if not results_processor:
results_processor = _process_results
processor_result = results_processor()
if isinstance(processor_result, (Async, Context)):
processor_result.start() | [
"def",
"_handle_results",
"(",
"options",
")",
":",
"results_processor",
"=",
"options",
".",
"get",
"(",
"'_process_results'",
")",
"if",
"not",
"results_processor",
":",
"results_processor",
"=",
"_process_results",
"processor_result",
"=",
"results_processor",
"(",
")",
"if",
"isinstance",
"(",
"processor_result",
",",
"(",
"Async",
",",
"Context",
")",
")",
":",
"processor_result",
".",
"start",
"(",
")"
] | Process the results of executing the Async's target. | [
"Process",
"the",
"results",
"of",
"executing",
"the",
"Async",
"s",
"target",
"."
] | train | https://github.com/Workiva/furious/blob/c29823ec8b98549e7439d7273aa064d1e5830632/furious/processors.py#L78-L86 |
Workiva/furious | furious/processors.py | encode_exception | def encode_exception(exception):
"""Encode exception to a form that can be passed around and serialized.
This will grab the stack, then strip off the last two calls which are
encode_exception and the function that called it.
"""
import sys
return AsyncException(unicode(exception),
exception.args,
sys.exc_info(),
exception) | python | def encode_exception(exception):
"""Encode exception to a form that can be passed around and serialized.
This will grab the stack, then strip off the last two calls which are
encode_exception and the function that called it.
"""
import sys
return AsyncException(unicode(exception),
exception.args,
sys.exc_info(),
exception) | [
"def",
"encode_exception",
"(",
"exception",
")",
":",
"import",
"sys",
"return",
"AsyncException",
"(",
"unicode",
"(",
"exception",
")",
",",
"exception",
".",
"args",
",",
"sys",
".",
"exc_info",
"(",
")",
",",
"exception",
")"
] | Encode exception to a form that can be passed around and serialized.
This will grab the stack, then strip off the last two calls which are
encode_exception and the function that called it. | [
"Encode",
"exception",
"to",
"a",
"form",
"that",
"can",
"be",
"passed",
"around",
"and",
"serialized",
"."
] | train | https://github.com/Workiva/furious/blob/c29823ec8b98549e7439d7273aa064d1e5830632/furious/processors.py#L100-L110 |
Workiva/furious | furious/processors.py | _process_results | def _process_results():
"""Process the results from an Async job."""
async = get_current_async()
callbacks = async.get_callbacks()
if not isinstance(async.result.payload, AsyncException):
callback = callbacks.get('success')
else:
callback = callbacks.get('error')
if not callback:
raise async.result.payload.exception, None, \
async.result.payload.traceback[2]
return _execute_callback(async, callback) | python | def _process_results():
"""Process the results from an Async job."""
async = get_current_async()
callbacks = async.get_callbacks()
if not isinstance(async.result.payload, AsyncException):
callback = callbacks.get('success')
else:
callback = callbacks.get('error')
if not callback:
raise async.result.payload.exception, None, \
async.result.payload.traceback[2]
return _execute_callback(async, callback) | [
"def",
"_process_results",
"(",
")",
":",
"async",
"=",
"get_current_async",
"(",
")",
"callbacks",
"=",
"async",
".",
"get_callbacks",
"(",
")",
"if",
"not",
"isinstance",
"(",
"async",
".",
"result",
".",
"payload",
",",
"AsyncException",
")",
":",
"callback",
"=",
"callbacks",
".",
"get",
"(",
"'success'",
")",
"else",
":",
"callback",
"=",
"callbacks",
".",
"get",
"(",
"'error'",
")",
"if",
"not",
"callback",
":",
"raise",
"async",
".",
"result",
".",
"payload",
".",
"exception",
",",
"None",
",",
"async",
".",
"result",
".",
"payload",
".",
"traceback",
"[",
"2",
"]",
"return",
"_execute_callback",
"(",
"async",
",",
"callback",
")"
] | Process the results from an Async job. | [
"Process",
"the",
"results",
"from",
"an",
"Async",
"job",
"."
] | train | https://github.com/Workiva/furious/blob/c29823ec8b98549e7439d7273aa064d1e5830632/furious/processors.py#L113-L127 |
Workiva/furious | furious/processors.py | _execute_callback | def _execute_callback(async, callback):
"""Execute the given callback or insert the Async callback, or if no
callback is given return the async.result.
"""
from furious.async import Async
if not callback:
return async.result.payload
if isinstance(callback, Async):
return callback.start()
return callback() | python | def _execute_callback(async, callback):
"""Execute the given callback or insert the Async callback, or if no
callback is given return the async.result.
"""
from furious.async import Async
if not callback:
return async.result.payload
if isinstance(callback, Async):
return callback.start()
return callback() | [
"def",
"_execute_callback",
"(",
"async",
",",
"callback",
")",
":",
"from",
"furious",
".",
"async",
"import",
"Async",
"if",
"not",
"callback",
":",
"return",
"async",
".",
"result",
".",
"payload",
"if",
"isinstance",
"(",
"callback",
",",
"Async",
")",
":",
"return",
"callback",
".",
"start",
"(",
")",
"return",
"callback",
"(",
")"
] | Execute the given callback or insert the Async callback, or if no
callback is given return the async.result. | [
"Execute",
"the",
"given",
"callback",
"or",
"insert",
"the",
"Async",
"callback",
"or",
"if",
"no",
"callback",
"is",
"given",
"return",
"the",
"async",
".",
"result",
"."
] | train | https://github.com/Workiva/furious/blob/c29823ec8b98549e7439d7273aa064d1e5830632/furious/processors.py#L130-L142 |
Workiva/furious | example/complex_workflow.py | complex_state_generator_bravo | def complex_state_generator_bravo(last_state=''):
"""Pick a state."""
from random import choice
states = ['ALPHA', 'BRAVO', 'BRAVO', 'DONE']
if last_state:
states.remove(last_state) # Slightly lower chances of previous state.
state = choice(states)
logging.info('Generating a state... %s', state)
return state | python | def complex_state_generator_bravo(last_state=''):
"""Pick a state."""
from random import choice
states = ['ALPHA', 'BRAVO', 'BRAVO', 'DONE']
if last_state:
states.remove(last_state) # Slightly lower chances of previous state.
state = choice(states)
logging.info('Generating a state... %s', state)
return state | [
"def",
"complex_state_generator_bravo",
"(",
"last_state",
"=",
"''",
")",
":",
"from",
"random",
"import",
"choice",
"states",
"=",
"[",
"'ALPHA'",
",",
"'BRAVO'",
",",
"'BRAVO'",
",",
"'DONE'",
"]",
"if",
"last_state",
":",
"states",
".",
"remove",
"(",
"last_state",
")",
"# Slightly lower chances of previous state.",
"state",
"=",
"choice",
"(",
"states",
")",
"logging",
".",
"info",
"(",
"'Generating a state... %s'",
",",
"state",
")",
"return",
"state"
] | Pick a state. | [
"Pick",
"a",
"state",
"."
] | train | https://github.com/Workiva/furious/blob/c29823ec8b98549e7439d7273aa064d1e5830632/example/complex_workflow.py#L61-L73 |
Workiva/furious | example/complex_workflow.py | state_machine_success | def state_machine_success():
"""A positive result! Iterate!"""
from furious.async import Async
from furious.context import get_current_async
result = get_current_async().result
if result == 'ALPHA':
logging.info('Inserting continuation for state %s.', result)
return Async(target=complex_state_generator_alpha, args=[result])
elif result == 'BRAVO':
logging.info('Inserting continuation for state %s.', result)
return Async(target=complex_state_generator_bravo, args=[result])
logging.info('Done working, stop now.') | python | def state_machine_success():
"""A positive result! Iterate!"""
from furious.async import Async
from furious.context import get_current_async
result = get_current_async().result
if result == 'ALPHA':
logging.info('Inserting continuation for state %s.', result)
return Async(target=complex_state_generator_alpha, args=[result])
elif result == 'BRAVO':
logging.info('Inserting continuation for state %s.', result)
return Async(target=complex_state_generator_bravo, args=[result])
logging.info('Done working, stop now.') | [
"def",
"state_machine_success",
"(",
")",
":",
"from",
"furious",
".",
"async",
"import",
"Async",
"from",
"furious",
".",
"context",
"import",
"get_current_async",
"result",
"=",
"get_current_async",
"(",
")",
".",
"result",
"if",
"result",
"==",
"'ALPHA'",
":",
"logging",
".",
"info",
"(",
"'Inserting continuation for state %s.'",
",",
"result",
")",
"return",
"Async",
"(",
"target",
"=",
"complex_state_generator_alpha",
",",
"args",
"=",
"[",
"result",
"]",
")",
"elif",
"result",
"==",
"'BRAVO'",
":",
"logging",
".",
"info",
"(",
"'Inserting continuation for state %s.'",
",",
"result",
")",
"return",
"Async",
"(",
"target",
"=",
"complex_state_generator_bravo",
",",
"args",
"=",
"[",
"result",
"]",
")",
"logging",
".",
"info",
"(",
"'Done working, stop now.'",
")"
] | A positive result! Iterate! | [
"A",
"positive",
"result!",
"Iterate!"
] | train | https://github.com/Workiva/furious/blob/c29823ec8b98549e7439d7273aa064d1e5830632/example/complex_workflow.py#L76-L91 |
upsight/doctor | doctor/docs/base.py | get_example_curl_lines | def get_example_curl_lines(method: str, url: str, params: dict,
headers: dict) -> List[str]:
"""Render a cURL command for the given request.
:param str method: HTTP request method (e.g. "GET").
:param str url: HTTP request URL.
:param dict params: JSON body, for POST and PUT requests.
:param dict headers: A dict of HTTP headers.
:returns: list
"""
parts = ['curl {}'.format(pipes.quote(url))]
parts.append('-X {}'.format(method))
for header in headers:
parts.append("-H '{}: {}'".format(header, headers[header]))
if method not in ('DELETE', 'GET'):
# Don't append a json body if there are no params.
if params:
parts.append("-H 'Content-Type: application/json' -d")
pretty_json = json.dumps(params, separators=(',', ': '), indent=4,
sort_keys=True)
# add indentation for the closing bracket of the json body
json_lines = pretty_json.split('\n')
json_lines[-1] = ' ' + json_lines[-1]
pretty_json = '\n'.join(json_lines)
parts.append(pipes.quote(pretty_json))
wrapped = [parts.pop(0)]
for part in parts:
if len(wrapped[-1]) + len(part) < 80:
wrapped[-1] += ' ' + part
else:
wrapped[-1] += ' \\'
wrapped.append(' ' + part)
return wrapped | python | def get_example_curl_lines(method: str, url: str, params: dict,
headers: dict) -> List[str]:
"""Render a cURL command for the given request.
:param str method: HTTP request method (e.g. "GET").
:param str url: HTTP request URL.
:param dict params: JSON body, for POST and PUT requests.
:param dict headers: A dict of HTTP headers.
:returns: list
"""
parts = ['curl {}'.format(pipes.quote(url))]
parts.append('-X {}'.format(method))
for header in headers:
parts.append("-H '{}: {}'".format(header, headers[header]))
if method not in ('DELETE', 'GET'):
# Don't append a json body if there are no params.
if params:
parts.append("-H 'Content-Type: application/json' -d")
pretty_json = json.dumps(params, separators=(',', ': '), indent=4,
sort_keys=True)
# add indentation for the closing bracket of the json body
json_lines = pretty_json.split('\n')
json_lines[-1] = ' ' + json_lines[-1]
pretty_json = '\n'.join(json_lines)
parts.append(pipes.quote(pretty_json))
wrapped = [parts.pop(0)]
for part in parts:
if len(wrapped[-1]) + len(part) < 80:
wrapped[-1] += ' ' + part
else:
wrapped[-1] += ' \\'
wrapped.append(' ' + part)
return wrapped | [
"def",
"get_example_curl_lines",
"(",
"method",
":",
"str",
",",
"url",
":",
"str",
",",
"params",
":",
"dict",
",",
"headers",
":",
"dict",
")",
"->",
"List",
"[",
"str",
"]",
":",
"parts",
"=",
"[",
"'curl {}'",
".",
"format",
"(",
"pipes",
".",
"quote",
"(",
"url",
")",
")",
"]",
"parts",
".",
"append",
"(",
"'-X {}'",
".",
"format",
"(",
"method",
")",
")",
"for",
"header",
"in",
"headers",
":",
"parts",
".",
"append",
"(",
"\"-H '{}: {}'\"",
".",
"format",
"(",
"header",
",",
"headers",
"[",
"header",
"]",
")",
")",
"if",
"method",
"not",
"in",
"(",
"'DELETE'",
",",
"'GET'",
")",
":",
"# Don't append a json body if there are no params.",
"if",
"params",
":",
"parts",
".",
"append",
"(",
"\"-H 'Content-Type: application/json' -d\"",
")",
"pretty_json",
"=",
"json",
".",
"dumps",
"(",
"params",
",",
"separators",
"=",
"(",
"','",
",",
"': '",
")",
",",
"indent",
"=",
"4",
",",
"sort_keys",
"=",
"True",
")",
"# add indentation for the closing bracket of the json body",
"json_lines",
"=",
"pretty_json",
".",
"split",
"(",
"'\\n'",
")",
"json_lines",
"[",
"-",
"1",
"]",
"=",
"' '",
"+",
"json_lines",
"[",
"-",
"1",
"]",
"pretty_json",
"=",
"'\\n'",
".",
"join",
"(",
"json_lines",
")",
"parts",
".",
"append",
"(",
"pipes",
".",
"quote",
"(",
"pretty_json",
")",
")",
"wrapped",
"=",
"[",
"parts",
".",
"pop",
"(",
"0",
")",
"]",
"for",
"part",
"in",
"parts",
":",
"if",
"len",
"(",
"wrapped",
"[",
"-",
"1",
"]",
")",
"+",
"len",
"(",
"part",
")",
"<",
"80",
":",
"wrapped",
"[",
"-",
"1",
"]",
"+=",
"' '",
"+",
"part",
"else",
":",
"wrapped",
"[",
"-",
"1",
"]",
"+=",
"' \\\\'",
"wrapped",
".",
"append",
"(",
"' '",
"+",
"part",
")",
"return",
"wrapped"
] | Render a cURL command for the given request.
:param str method: HTTP request method (e.g. "GET").
:param str url: HTTP request URL.
:param dict params: JSON body, for POST and PUT requests.
:param dict headers: A dict of HTTP headers.
:returns: list | [
"Render",
"a",
"cURL",
"command",
"for",
"the",
"given",
"request",
"."
] | train | https://github.com/upsight/doctor/blob/2cf1d433f6f1aa1355644b449a757c0660793cdd/doctor/docs/base.py#L62-L94 |
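Given the signature above, a usage example is straightforward; the import path follows the file path recorded for this function (doctor/docs/base.py), and installing doctor is assumed. The URL, headers, and body below are invented for illustration:

```python
# Assumes the doctor package is installed; signature taken from the code above.
from doctor.docs.base import get_example_curl_lines

lines = get_example_curl_lines(
    method='POST',
    url='https://api.example.com/v1/notes',        # example URL, not doctor's
    params={'title': 'hello', 'done': False},
    headers={'Authorization': 'Bearer TOKEN'},
)
print('\n'.join(lines))
# Produces a wrapped command roughly of the form:
#   curl https://api.example.com/v1/notes -X POST -H 'Authorization: Bearer TOKEN' \
#       -H 'Content-Type: application/json' -d \
#       '{ ... pretty-printed JSON body ... }'
```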
upsight/doctor | doctor/docs/base.py | get_example_lines | def get_example_lines(headers: Dict[str, str], method: str, url: str,
params: Dict[str, Any], response: str) -> List[str]:
"""Render a reStructuredText example for the given request and response.
:param dict headers: A dict of HTTP headers.
:param str method: HTTP request method (e.g. "GET").
:param str url: HTTP request URL.
:param dict params: Form parameters, for POST and PUT requests.
:param str response: Text response body.
:returns: list
"""
lines = ['', 'Example Request:', '', '.. code-block:: bash', '']
lines.extend(prefix_lines(
get_example_curl_lines(method, url, params, headers), ' '))
lines.extend(['', 'Example Response:', ''])
try:
# Try to parse and prettify the response as JSON. If it fails
# (for whatever reason), we'll treat it as text instead.
response = json.dumps(json.loads(response), indent=2,
separators=(',', ': '), sort_keys=True)
lines.extend(['.. code-block:: json', ''])
except Exception:
lines.extend(['.. code-block:: text', ''])
lines.extend(prefix_lines(response, ' '))
return lines | python | def get_example_lines(headers: Dict[str, str], method: str, url: str,
params: Dict[str, Any], response: str) -> List[str]:
"""Render a reStructuredText example for the given request and response.
:param dict headers: A dict of HTTP headers.
:param str method: HTTP request method (e.g. "GET").
:param str url: HTTP request URL.
:param dict params: Form parameters, for POST and PUT requests.
:param str response: Text response body.
:returns: list
"""
lines = ['', 'Example Request:', '', '.. code-block:: bash', '']
lines.extend(prefix_lines(
get_example_curl_lines(method, url, params, headers), ' '))
lines.extend(['', 'Example Response:', ''])
try:
# Try to parse and prettify the response as JSON. If it fails
# (for whatever reason), we'll treat it as text instead.
response = json.dumps(json.loads(response), indent=2,
separators=(',', ': '), sort_keys=True)
lines.extend(['.. code-block:: json', ''])
except Exception:
lines.extend(['.. code-block:: text', ''])
lines.extend(prefix_lines(response, ' '))
return lines | [
"def",
"get_example_lines",
"(",
"headers",
":",
"Dict",
"[",
"str",
",",
"str",
"]",
",",
"method",
":",
"str",
",",
"url",
":",
"str",
",",
"params",
":",
"Dict",
"[",
"str",
",",
"Any",
"]",
",",
"response",
":",
"str",
")",
"->",
"List",
"[",
"str",
"]",
":",
"lines",
"=",
"[",
"''",
",",
"'Example Request:'",
",",
"''",
",",
"'.. code-block:: bash'",
",",
"''",
"]",
"lines",
".",
"extend",
"(",
"prefix_lines",
"(",
"get_example_curl_lines",
"(",
"method",
",",
"url",
",",
"params",
",",
"headers",
")",
",",
"' '",
")",
")",
"lines",
".",
"extend",
"(",
"[",
"''",
",",
"'Example Response:'",
",",
"''",
"]",
")",
"try",
":",
"# Try to parse and prettify the response as JSON. If it fails",
"# (for whatever reason), we'll treat it as text instead.",
"response",
"=",
"json",
".",
"dumps",
"(",
"json",
".",
"loads",
"(",
"response",
")",
",",
"indent",
"=",
"2",
",",
"separators",
"=",
"(",
"','",
",",
"': '",
")",
",",
"sort_keys",
"=",
"True",
")",
"lines",
".",
"extend",
"(",
"[",
"'.. code-block:: json'",
",",
"''",
"]",
")",
"except",
"Exception",
":",
"lines",
".",
"extend",
"(",
"[",
"'.. code-block:: text'",
",",
"''",
"]",
")",
"lines",
".",
"extend",
"(",
"prefix_lines",
"(",
"response",
",",
"' '",
")",
")",
"return",
"lines"
] | Render a reStructuredText example for the given request and response.
:param dict headers: A dict of HTTP headers.
:param str method: HTTP request method (e.g. "GET").
:param str url: HTTP request URL.
:param dict params: Form parameters, for POST and PUT requests.
:param str response: Text response body.
:returns: list | [
"Render",
"a",
"reStructuredText",
"example",
"for",
"the",
"given",
"request",
"and",
"response",
"."
] | train | https://github.com/upsight/doctor/blob/2cf1d433f6f1aa1355644b449a757c0660793cdd/doctor/docs/base.py#L97-L121 |
upsight/doctor | doctor/docs/base.py | get_object_reference | def get_object_reference(obj: Object) -> str:
"""Gets an object reference string from the obj instance.
This adds the object type to ALL_RESOURCES so that it gets documented and
returns a str which contains a sphinx reference to the documented object.
:param obj: The Object instance.
:returns: A sphinx docs reference str.
"""
resource_name = obj.title
if resource_name is None:
class_name = obj.__name__
resource_name = class_name_to_resource_name(class_name)
ALL_RESOURCES[resource_name] = obj
return ' See :ref:`resource-{}`.'.format(
'-'.join(resource_name.split(' ')).lower().strip()) | python | def get_object_reference(obj: Object) -> str:
"""Gets an object reference string from the obj instance.
This adds the object type to ALL_RESOURCES so that it gets documented and
returns a str which contains a sphinx reference to the documented object.
:param obj: The Object instance.
:returns: A sphinx docs reference str.
"""
resource_name = obj.title
if resource_name is None:
class_name = obj.__name__
resource_name = class_name_to_resource_name(class_name)
ALL_RESOURCES[resource_name] = obj
return ' See :ref:`resource-{}`.'.format(
'-'.join(resource_name.split(' ')).lower().strip()) | [
"def",
"get_object_reference",
"(",
"obj",
":",
"Object",
")",
"->",
"str",
":",
"resource_name",
"=",
"obj",
".",
"title",
"if",
"resource_name",
"is",
"None",
":",
"class_name",
"=",
"obj",
".",
"__name__",
"resource_name",
"=",
"class_name_to_resource_name",
"(",
"class_name",
")",
"ALL_RESOURCES",
"[",
"resource_name",
"]",
"=",
"obj",
"return",
"' See :ref:`resource-{}`.'",
".",
"format",
"(",
"'-'",
".",
"join",
"(",
"resource_name",
".",
"split",
"(",
"' '",
")",
")",
".",
"lower",
"(",
")",
".",
"strip",
"(",
")",
")"
] | Gets an object reference string from the obj instance.
This adds the object type to ALL_RESOURCES so that it gets documented and
returns a str which contains a sphinx reference to the documented object.
:param obj: The Object instance.
:returns: A sphinx docs reference str. | [
"Gets",
"an",
"object",
"reference",
"string",
"from",
"the",
"obj",
"instance",
"."
] | train | https://github.com/upsight/doctor/blob/2cf1d433f6f1aa1355644b449a757c0660793cdd/doctor/docs/base.py#L124-L139 |
upsight/doctor | doctor/docs/base.py | get_array_items_description | def get_array_items_description(item: Array) -> str:
"""Returns a description for an array's items.
:param item: The Array type whose items should be documented.
:returns: A string documenting what type the array's items should be.
"""
desc = ''
if isinstance(item.items, list):
# This means the type has a list of types where each position is
# mapped to a different type. Document what each type should be.
desc = ''
item_pos_template = (
' *Item {pos} must be*: {description}{enum}{ref}')
for pos, item in enumerate(item.items):
_enum = ''
ref = ''
if issubclass(item, Enum):
_enum = ' Must be one of: `{}`'.format(item.enum)
if item.case_insensitive:
_enum += ' (case-insensitive)'
_enum += '.'
elif issubclass(item, Object):
ref = get_object_reference(item)
desc += item_pos_template.format(
pos=pos, description=item.description, enum=_enum,
ref=ref)
else:
# Otherwise just document the type assigned to `items`.
desc = item.items.description
_enum = ''
ref = ''
if issubclass(item.items, Enum):
_enum = ' Must be one of: `{}`'.format(
item.items.enum)
if item.items.case_insensitive:
_enum += ' (case-insensitive)'
_enum += '.'
elif issubclass(item.items, Object):
ref = get_object_reference(item.items)
desc = ' *Items must be*: {description}{enum}{ref}'.format(
description=desc, enum=_enum, ref=ref)
return desc | python | def get_array_items_description(item: Array) -> str:
"""Returns a description for an array's items.
:param item: The Array type whose items should be documented.
:returns: A string documenting what type the array's items should be.
"""
desc = ''
if isinstance(item.items, list):
# This means the type has a list of types where each position is
# mapped to a different type. Document what each type should be.
desc = ''
item_pos_template = (
' *Item {pos} must be*: {description}{enum}{ref}')
for pos, item in enumerate(item.items):
_enum = ''
ref = ''
if issubclass(item, Enum):
_enum = ' Must be one of: `{}`'.format(item.enum)
if item.case_insensitive:
_enum += ' (case-insensitive)'
_enum += '.'
elif issubclass(item, Object):
ref = get_object_reference(item)
desc += item_pos_template.format(
pos=pos, description=item.description, enum=_enum,
ref=ref)
else:
# Otherwise just document the type assigned to `items`.
desc = item.items.description
_enum = ''
ref = ''
if issubclass(item.items, Enum):
_enum = ' Must be one of: `{}`'.format(
item.items.enum)
if item.items.case_insensitive:
_enum += ' (case-insensitive)'
_enum += '.'
elif issubclass(item.items, Object):
ref = get_object_reference(item.items)
desc = ' *Items must be*: {description}{enum}{ref}'.format(
description=desc, enum=_enum, ref=ref)
return desc | [
"def",
"get_array_items_description",
"(",
"item",
":",
"Array",
")",
"->",
"str",
":",
"desc",
"=",
"''",
"if",
"isinstance",
"(",
"item",
".",
"items",
",",
"list",
")",
":",
"# This means the type has a list of types where each position is",
"# mapped to a different type. Document what each type should be.",
"desc",
"=",
"''",
"item_pos_template",
"=",
"(",
"' *Item {pos} must be*: {description}{enum}{ref}'",
")",
"for",
"pos",
",",
"item",
"in",
"enumerate",
"(",
"item",
".",
"items",
")",
":",
"_enum",
"=",
"''",
"ref",
"=",
"''",
"if",
"issubclass",
"(",
"item",
",",
"Enum",
")",
":",
"_enum",
"=",
"' Must be one of: `{}`'",
".",
"format",
"(",
"item",
".",
"enum",
")",
"if",
"item",
".",
"case_insensitive",
":",
"_enum",
"+=",
"' (case-insensitive)'",
"_enum",
"+=",
"'.'",
"elif",
"issubclass",
"(",
"item",
",",
"Object",
")",
":",
"ref",
"=",
"get_object_reference",
"(",
"item",
")",
"desc",
"+=",
"item_pos_template",
".",
"format",
"(",
"pos",
"=",
"pos",
",",
"description",
"=",
"item",
".",
"description",
",",
"enum",
"=",
"_enum",
",",
"ref",
"=",
"ref",
")",
"else",
":",
"# Otherwise just document the type assigned to `items`.",
"desc",
"=",
"item",
".",
"items",
".",
"description",
"_enum",
"=",
"''",
"ref",
"=",
"''",
"if",
"issubclass",
"(",
"item",
".",
"items",
",",
"Enum",
")",
":",
"_enum",
"=",
"' Must be one of: `{}`'",
".",
"format",
"(",
"item",
".",
"items",
".",
"enum",
")",
"if",
"item",
".",
"items",
".",
"case_insensitive",
":",
"_enum",
"+=",
"' (case-insensitive)'",
"_enum",
"+=",
"'.'",
"elif",
"issubclass",
"(",
"item",
".",
"items",
",",
"Object",
")",
":",
"ref",
"=",
"get_object_reference",
"(",
"item",
".",
"items",
")",
"desc",
"=",
"' *Items must be*: {description}{enum}{ref}'",
".",
"format",
"(",
"description",
"=",
"desc",
",",
"enum",
"=",
"_enum",
",",
"ref",
"=",
"ref",
")",
"return",
"desc"
] | Returns a description for an array's items.
:param item: The Array type whose items should be documented.
:returns: A string documenting what type the array's items should be. | [
"Returns",
"a",
"description",
"for",
"an",
"array",
"s",
"items",
"."
] | train | https://github.com/upsight/doctor/blob/2cf1d433f6f1aa1355644b449a757c0660793cdd/doctor/docs/base.py#L142-L185 |
upsight/doctor | doctor/docs/base.py | get_json_types | def get_json_types(annotated_type: SuperType) -> List[str]:
"""Returns the json types for the provided annotated type.
This handles special cases for when we encounter UnionType and an Array.
UnionType's will have all valid types returned. An Array will document
what the items type is by placing that value in brackets, e.g. `list[str]`.
:param annotated_type: A subclass of SuperType.
:returns: A list of json types.
"""
types = []
if issubclass(annotated_type, UnionType):
types = [str(t.native_type.__name__) for t in annotated_type.types]
elif issubclass(annotated_type, Array):
# Include the type of items in the list if items is defined.
if annotated_type.items is not None:
if not isinstance(annotated_type.items, list):
# items are all of the same type.
types.append('list[{}]'.format(
str(annotated_type.items.native_type.__name__)))
else:
# items are different at each index.
_types = [
str(t.native_type.__name__) for t in annotated_type.items]
types.append('list[{}]'.format(','.join(_types)))
else:
types.append('list')
else:
types.append(str(annotated_type.native_type.__name__))
return types | python | def get_json_types(annotated_type: SuperType) -> List[str]:
"""Returns the json types for the provided annotated type.
This handles special cases for when we encounter UnionType and an Array.
UnionType's will have all valid types returned. An Array will document
what the items type is by placing that value in brackets, e.g. `list[str]`.
:param annotated_type: A subclass of SuperType.
:returns: A list of json types.
"""
types = []
if issubclass(annotated_type, UnionType):
types = [str(t.native_type.__name__) for t in annotated_type.types]
elif issubclass(annotated_type, Array):
# Include the type of items in the list if items is defined.
if annotated_type.items is not None:
if not isinstance(annotated_type.items, list):
# items are all of the same type.
types.append('list[{}]'.format(
str(annotated_type.items.native_type.__name__)))
else:
# items are different at each index.
_types = [
str(t.native_type.__name__) for t in annotated_type.items]
types.append('list[{}]'.format(','.join(_types)))
else:
types.append('list')
else:
types.append(str(annotated_type.native_type.__name__))
return types | [
"def",
"get_json_types",
"(",
"annotated_type",
":",
"SuperType",
")",
"->",
"List",
"[",
"str",
"]",
":",
"types",
"=",
"[",
"]",
"if",
"issubclass",
"(",
"annotated_type",
",",
"UnionType",
")",
":",
"types",
"=",
"[",
"str",
"(",
"t",
".",
"native_type",
".",
"__name__",
")",
"for",
"t",
"in",
"annotated_type",
".",
"types",
"]",
"elif",
"issubclass",
"(",
"annotated_type",
",",
"Array",
")",
":",
"# Include the type of items in the list if items is defined.",
"if",
"annotated_type",
".",
"items",
"is",
"not",
"None",
":",
"if",
"not",
"isinstance",
"(",
"annotated_type",
".",
"items",
",",
"list",
")",
":",
"# items are all of the same type.",
"types",
".",
"append",
"(",
"'list[{}]'",
".",
"format",
"(",
"str",
"(",
"annotated_type",
".",
"items",
".",
"native_type",
".",
"__name__",
")",
")",
")",
"else",
":",
"# items are different at each index.",
"_types",
"=",
"[",
"str",
"(",
"t",
".",
"native_type",
".",
"__name__",
")",
"for",
"t",
"in",
"annotated_type",
".",
"items",
"]",
"types",
".",
"append",
"(",
"'list[{}]'",
".",
"format",
"(",
"','",
".",
"join",
"(",
"_types",
")",
")",
")",
"else",
":",
"types",
".",
"append",
"(",
"'list'",
")",
"else",
":",
"types",
".",
"append",
"(",
"str",
"(",
"annotated_type",
".",
"native_type",
".",
"__name__",
")",
")",
"return",
"types"
] | Returns the json types for the provided annotated type.
This handles special cases for when we encounter UnionType and an Array.
UnionType's will have all valid types returned. An Array will document
what the items type is by placing that value in brackets, e.g. `list[str]`.
:param annotated_type: A subclass of SuperType.
:returns: A list of json types. | [
"Returns",
"the",
"json",
"types",
"for",
"the",
"provided",
"annotated",
"type",
"."
] | train | https://github.com/upsight/doctor/blob/2cf1d433f6f1aa1355644b449a757c0660793cdd/doctor/docs/base.py#L188-L218 |
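get_json_types only reads the native_type, types, and items attributes of the annotated type classes, so its branching can be mirrored with duck typing and tiny stand-in classes. The sketch below is a simplified mirror, not doctor's real implementation or type hierarchy:

```python
# Minimal stand-ins exposing only the attributes the logic above reads.
class Str:
    native_type = str

class Int:
    native_type = int

class StrOrInt:                  # union-like: exposes .types
    types = [Str, Int]

class ListOfInt:                 # array-like: exposes .native_type and .items
    native_type = list
    items = Int

def json_types(annotated):
    # Simplified mirror of get_json_types' branching, using duck typing
    # instead of doctor's issubclass checks.
    if hasattr(annotated, 'types'):                      # union case
        return [t.native_type.__name__ for t in annotated.types]
    if getattr(annotated, 'native_type', None) is list:  # array case
        items = getattr(annotated, 'items', None)
        if items is None:
            return ['list']
        if isinstance(items, list):                      # per-position items
            return ['list[{}]'.format(','.join(t.native_type.__name__ for t in items))]
        return ['list[{}]'.format(items.native_type.__name__)]
    return [annotated.native_type.__name__]

print(json_types(StrOrInt))   # ['str', 'int']
print(json_types(ListOfInt))  # ['list[int]']
print(json_types(Str))        # ['str']
```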