code: string (lengths 26 to 79.6k)
docstring: string (lengths 1 to 46.9k)
def get_subject(self, msg):
    # The last fragment of the decoded header carries the subject text.
    text, encoding = decode_header(msg['Subject'])[-1]
    try:
        text = text.decode(encoding)
    except AttributeError:
        pass  # already a str (unencoded header fragment)
    return text
Extracts the subject line from an EmailMessage object.
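A self-contained sketch of the same idea, using only the stdlib `email.header` module; `decode_header` returns bytes plus a charset for RFC 2047 encoded words but a plain `str` for unencoded fragments, which is what the `AttributeError` guard handles:

```python
from email.header import decode_header

def decode_subject(raw_subject):
    # decode_header yields (fragment, charset) pairs; the fragment is
    # bytes for encoded words and str for plain ASCII text.
    text, encoding = decode_header(raw_subject)[-1]
    try:
        text = text.decode(encoding)
    except AttributeError:
        pass  # already a str
    return text

print(decode_subject("=?utf-8?q?caf=C3=A9?="))  # café
print(decode_subject("plain subject"))          # plain subject
```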
def tradingStatus(symbol=None, token=, version=): _raiseIfNotStr(symbol) if symbol: return _getJson( + symbol, token, version) return _getJson(, token, version)
The Trading status message is used to indicate the current trading status of a security. For IEX-listed securities, IEX acts as the primary market and has the authority to institute a trading halt or trading pause in a security due to news dissemination or regulatory reasons. For non-IEX-listed securities, IE...
def _get_param_names(self):
    template = Template(self.yaml_string)
    names = []
    # Template.pattern exposes 'named' ($var) and 'braced' (${var}) groups.
    for match in re.finditer(template.pattern, template.template):
        name = match.group('named') or match.group('braced')
        assert name is not None
        names.append(name)
    return names
Get mappable parameters from YAML.
def acquire_hosting_device_slots(self, context, hosting_device, resource, resource_type, resource_service, num, exclusive=False): bound = hosting_device[] if ((bound is not None and bound != resource[]) or (ex...
Assign <num> slots in <hosting_device> to logical <resource>. If exclusive is True the hosting device is bound to the resource's tenant. Otherwise it is not bound to any tenant. Returns True if allocation was granted, False otherwise.
def compact(self, term_doc_matrix):
    rank_df = self.scorer.get_rank_df(term_doc_matrix)
    return self._prune_higher_ranked_terms(term_doc_matrix, rank_df, self.rank)
Parameters ---------- term_doc_matrix : TermDocMatrix Term document matrix object to compact Returns ------- TermDocMatrix
def batch_retrieve_overrides_in_course(self, course_id, assignment_overrides_id, assignment_overrides_assignment_id): path = {} data = {} params = {} path["course_id"] = course_id params["assignment_overrides[id]"] = assign...
Batch retrieve overrides in a course. Returns a list of specified overrides in this course, providing they target sections/groups/students visible to the current user. Returns null elements in the list for requests that were not found.
def _set_predictor(self, predictor):
    if predictor is self._predictor:
        return self
    if self.data is not None:
        self._predictor = predictor
        return self._free_handle()
    else:
        raise LightGBMError("Cannot set predictor after freed raw data, " ...
Set predictor for continued training. It is not recommended for user to call this function. Please use init_model argument in engine.train() or engine.cv() instead.
def split(self, bits_count):
    result = []
    array = WBinArray(self.__value, self.__size)
    # Pad to a multiple of bits_count so every chunk is full-length.
    if (len(array) % bits_count) > 0:
        array.resize(len(array) + (bits_count - (len(array) % bits_count)))
    while len(array):
        result.append(WBinArray(array[:bits_count], bits_count))
        array = array[bits_count:]
    return result
Split array into smaller parts. Each small array is fixed-length WBinArray (length of that array is bits_count). :param bits_count: array length :return: list of WBinArray
def check_differences(self): logger.info("Check that mail differences are within the limits.") if self.conf.size_threshold < 0: logger.info("Skip checking for size differences.") if self.conf.content_threshold < 0: logger.info("Skip checking for content differenc...
In-depth check of mail differences. Compare all mails of the duplicate set with each other, both in size and content. Raise an error if we're not within the limits imposed by the threshold setting.
def avgwave(self, wavelengths=None):
    x = self._validate_wavelengths(wavelengths).value
    y = self(x).value
    num = np.trapz(y * x, x=x)
    den = np.trapz(y, x=x)
    if den == 0:
        avg_wave = 0.0
    else:
        avg_wave = abs(num / den)
    return avg_wave
Calculate the :ref:`average wavelength <synphot-formula-avgwv>`. Parameters ---------- wavelengths : array-like, `~astropy.units.quantity.Quantity`, or `None` Wavelength values for sampling. If not a Quantity, assumed to be in Angstrom. If `None`, `waveset` i...
def show_correlation_matrix(self, correlation_matrix):
    cr_plot.create_correlation_matrix_plot(
        correlation_matrix, self.title, self.headers_to_test)
    pyplot.show()
Shows the given correlation matrix as image :param correlation_matrix: Correlation matrix of features
def get_parent_of_type(typ, obj):
    if type(typ) is not text:
        typ = typ.__name__
    while hasattr(obj, 'parent'):
        obj = obj.parent
        if obj.__class__.__name__ == typ:
            return obj
Finds first object up the parent chain of the given type. If no parent of the given type exists None is returned. Args: typ(str or python class): The type of the model object we are looking for. obj (model object): Python model object which is the start of the search pro...
def launch_job(self, job_id): assert self.api_version.lower() in [, ], \ try: self.create_job(job_id, {: True}) except ValueError: pass return self.read_job(job_id)
Convenience method for launching a job. We use POST for actions outside of HTTP verbs (job launch in this case).
def printed_out(self, name): opt = self.variables().optional_namestring() req = self.variables().required_namestring() out = out += out += .format(name, req, opt) if self.description: out += .format(self.description) return out
Create a string representation of the action
def send_short_lpp_packet(self, dest_id, data): pk = CRTPPacket() pk.port = CRTPPort.LOCALIZATION pk.channel = self.GENERIC_CH pk.data = struct.pack(, self.LPS_SHORT_LPP_PACKET, dest_id) + data self._cf.send_packet(pk)
Send ultra-wide-band LPP packet to dest_id
def network_delete_event(self, network_info): net_id = network_info[] if net_id not in self.network: LOG.error(, net_id) return segid = self.network[net_id].get() tenant_id = self.network[net_id].get() tenant_name = self.ge...
Process network delete event.
def create_backed_vol(self, name, backer, _format=): vol_xml = ElementTree.Element() vol_name = ElementTree.SubElement(vol_xml, ) name = .format(name, _format) vol_name.text = name target = ElementTree.SubElement(vol_xml, ) target_format = ElementTree.SubElement...
TODO(rdelinger) think about changing _format This is a pretty specialized function. It takes an existing volume, and creates a new volume that is backed by the existing volume Sadly there is no easy way to do this in libvirt, the best way I've found is to just create some xml and...
def get_context_data(self, **kwargs): context = super().get_context_data(**kwargs) topic = self.get_topic() context[] = topic context[] = topic.forum try: if hasattr(topic, ) and topic.poll.options.exists(): context[] = top...
Returns the context data to provide to the template.
def serverdir(): path = join(ROOT_DIR, ) path = normpath(path) if sys.platform == : path = realpath(path) return path
Get the location of the server subpackage
def binarize_signal(signal, treshold="auto", cut="higher"):
    if treshold == "auto":
        treshold = (np.max(np.array(signal)) - np.min(np.array(signal))) / 2
    signal = list(signal)
    binary_signal = []
    for i in range(len(signal)):
        if cut == "higher":
            if signal[i] > treshold:
                ...
Binarize a channel based on a continuous channel. Parameters ---------- signal = array or list The signal channel. treshold = float The treshold value by which to select the events. If "auto", takes the value between the max and the min. cut = str "higher" or "lower", define...
def permutation_entropy(time_series, order=3, delay=1, normalize=False): x = np.array(time_series) hashmult = np.power(order, np.arange(order)) sorted_idx = _embed(x, order=order, delay=delay).argsort(kind=) hashval = (np.multiply(sorted_idx, hashmult)).sum(1) _, c = np.unique(ha...
Permutation Entropy. Parameters ---------- time_series : list or np.array Time series order : int Order of permutation entropy delay : int Time delay normalize : bool If True, divide by log2(factorial(m)) to normalize the entropy between 0 and 1. Otherwis...
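A pure-stdlib sketch of the ordinal-pattern counting the docstring describes (the library version vectorizes this with NumPy; this toy version is only for illustration):

```python
import math
from collections import Counter

def permutation_entropy(series, order=3, delay=1, normalize=False):
    # Count ordinal patterns: the argsort of each embedded window.
    patterns = Counter()
    for i in range(len(series) - delay * (order - 1)):
        window = series[i:i + delay * order:delay]
        pattern = tuple(sorted(range(order), key=window.__getitem__))
        patterns[pattern] += 1
    total = sum(patterns.values())
    pe = -sum((c / total) * math.log2(c / total) for c in patterns.values())
    if normalize:
        pe /= math.log2(math.factorial(order))
    return pe

print(permutation_entropy([4, 7, 9, 10, 6, 11, 3]))
```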
def expand_as_args(args): return (isinstance(args, collections.Sequence) and not _is_namedtuple(args) and not _force_leaf(args))
Returns `True` if `args` should be expanded as `*args`.
def get_startup(self, id_):
    return _get_request(_STARTUP.format(c_api=_C_API_BEGINNING,
                                        api=_API_VERSION, id_=id_,
                                        at=self.access_token))
Get startup based on id
def set_ifo(self, ifo):
    self.__ifo = ifo
    if self.job().channel():
        self.add_var_opt('channel-name', ifo + ':' + self.job().channel())
Set the ifo name to analyze. If the channel name for the job is defined, then the name of the ifo is prepended to the channel name obtained from the job configuration file and passed with a --channel-name option. @param ifo: two letter ifo code (e.g. L1, H1 or H2).
def rename(self, old_name, new_name):
    try:
        self.api.rename(mkey(old_name), mkey(new_name))
    except ResponseError as exc:
        if "no such key" in exc.args:
            raise KeyError(old_name)
        raise
Rename key to a new name.
def cli(self, prt=sys.stdout): kws = self.objdoc.get_docargs(prt=None) godag = get_godag(kws[], prt=None, loading_bar=False, optional_attrs=[]) usrgos = GetGOs(godag, max_gos=200).get_usrgos(kws.get(), prt) tcntobj = self._get_tcntobj(usrgos, godag, **kws) self.gosubda...
Command-line interface for go_draw script.
def quick_str_input(prompt, default_value): valid = False str_val = default_value while not valid: input_val = raw_input(prompt + "[{0}]: ".format(default_value)) if input_val == "": str_val = default_value valid = True ...
Function to display a quick question for text input. **Parameters:** - **prompt:** Text / question to display - **default_value:** Default value for no entry **Returns:** text_type() or default_value.
def main(): args = parse_args() logging.info(, __version__) completed_classes = [] classes_with_errors = [] mkdir_p(PATH_CACHE, 0o700) if args.clear_cache: shutil.rmtree(PATH_CACHE) if args.list_courses: logging.info() list_courses(args) return ses...
Main entry point for execution as a program (instead of as a module).
def bsn(self) -> str: def _is_valid_bsn(number: str) -> bool: total = 0 multiplier = 9 for char in number: multiplier = -multiplier if multiplier == 1 else multiplier total += int(char) * multiplier multiplier -= 1 ...
Generate a random, but valid ``Burgerservicenummer``. :returns: Random BSN. :Example: 255159705
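The validity check alluded to above is the Dutch "11-test"; a standalone sketch (the weights match the multiplier loop shown in the code, with the last digit weighted -1):

```python
def is_valid_bsn(number: str) -> bool:
    # Dutch "11-test": digits weighted 9..2, last digit weighted -1;
    # the weighted sum must be divisible by 11.
    if len(number) != 9 or not number.isdigit():
        return False
    weights = [9, 8, 7, 6, 5, 4, 3, 2, -1]
    total = sum(int(d) * w for d, w in zip(number, weights))
    return total % 11 == 0

print(is_valid_bsn("255159705"))  # True (the docstring's example)
print(is_valid_bsn("255159706"))  # False
```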
def eth_getBlockByNumber(self, number): block_hash = self.reader._get_block_hash(number) block_number = _format_block_number(number) body_key = body_prefix + block_number + block_hash block_data = self.db.get(body_key) body = rlp.decode(block_data, sedes=Block) r...
Get block body by block number. :param number: :return:
def execute(self, query, *args, **kwargs):
    tornado_future = Future()
    cassandra_future = self._session.execute_async(query, *args, **kwargs)
    self._ioloop.add_callback(
        self._callback, cassandra_future, tornado_future)
    return tornado_future
Asynchronously execute the specified CQL query. The execute command also takes optional parameters and trace keyword arguments. See cassandra-python documentation for definition of those parameters.
def thermal_expansion_coeff(self, structure, temperature, mode="debye"): soec = ElasticTensor(self[0]) v0 = (structure.volume * 1e-30 / structure.num_sites) if mode == "debye": td = soec.debye_temperature(structure) t_ratio = temperature / td integran...
Gets thermal expansion coefficient from third-order constants. Args: temperature (float): Temperature in kelvin, if not specified will return non-cv-normalized value structure (Structure): Structure to be used in directional heat capacity determination, o...
def api_version(self, verbose=False): return self.__auth_req_get(self.rest_url, verbose=verbose)
Get information about the API http://docs.opsview.com/doku.php?id=opsview4.6:restapi#api_version_information
def accepts_contributor_roles(func): if inspect.isclass(func): apply_function_to_members(func, accepts_contributor_roles) return func else: @functools.wraps(func) def decorator(*args, **kwargs): return accepts_roles(*ROLES_CONTRIBUTOR)(func)(*args, **kwargs) ...
Decorator that accepts only contributor roles :param func: :return:
def add(self, data, overwrite=False): if is_srec(data): self.add_srec(data, overwrite) elif is_ihex(data): self.add_ihex(data, overwrite) elif is_ti_txt(data): self.add_ti_txt(data, overwrite) else: raise UnsupportedFileFormatErro...
Add given data string by guessing its format. The format must be Motorola S-Records, Intel HEX or TI-TXT. Set `overwrite` to ``True`` to allow already added data to be overwritten.
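The `is_srec`/`is_ihex`/`is_ti_txt` guards amount to sniffing the leading record character; a deliberately loose, hypothetical stand-in (the real checks also validate record structure and checksums):

```python
def guess_format(data: str) -> str:
    # Very loose sniffing by leading record character; a real library
    # would also validate record structure and checksums.
    first = next((line for line in data.splitlines() if line.strip()), "")
    if first.startswith("S"):
        return "srec"    # Motorola S-Records: 'S0'..'S9' records
    if first.startswith(":"):
        return "ihex"    # Intel HEX: each record starts with ':'
    if first.startswith("@") or first.strip() == "q":
        return "ti_txt"  # TI-TXT: '@addr' sections, terminated by 'q'
    raise ValueError("unsupported file format")

print(guess_format(":00000001FF"))  # ihex
```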
def calcFontScaling(self):
    self.ypx = self.figure.get_size_inches()[1] * self.figure.dpi
    self.xpx = self.figure.get_size_inches()[0] * self.figure.dpi
    self.fontSize = self.vertSize * (self.ypx / 2.0)
    self.leftPos = self.axes.get_xlim()[0]
    self.rightPos = self.axes.get_xlim()[1]
Calculates the current font size and left position for the current window.
def z_angle_rotate(xy, theta):
    xy = np.array(xy).T
    theta = np.array(theta).T
    out = np.zeros_like(xy)
    out[..., 0] = np.cos(theta) * xy[..., 0] - np.sin(theta) * xy[..., 1]
    out[..., 1] = np.sin(theta) * xy[..., 0] + np.cos(theta) * xy[..., 1]
    return out.T
Rotate the input vector or set of vectors `xy` by the angle `theta`. Parameters ---------- xy : array_like The vector or array of vectors to transform. Must have shape
def get_portal_by_name(self, portal_name): portals = self.get_portals_list() for p in portals: if portal_name == p[1]: self.set_portal_name( p[1] ) self.set_portal_id( p[0] ) self.set_portal_cik( p[2][1][...
Set active portal according to the name passed in 'portal_name'. Returns dictionary of device 'serial_number: rid'
def pan_delta(self, dx_px, dy_px): direction = self.target - self.position distance_from_target = direction.length() direction = direction.normalized() speed_per_radius = self.get_translation_speed(distance_from_target) px_per_unit = self.vport_radius_px / speed_per_rad...
This causes the scene to appear to translate right and up (i.e., what really happens is the camera is translated left and down). This is also called "panning" in some software packages. Passing in negative delta values causes the opposite motion.
def hostcmd_push(base_path, project_name, engine_name, vars_files=None, config_file=None, **kwargs): assert_initialized(base_path, config_file) config = get_config(base_path, vars_files=vars_files, engine_name=engine_name, project_name=project_name, config_file=config_file) eng...
Push images to a registry. Requires authenticating with the registry prior to starting the push. If your engine's config file does not already contain an authorization for the registry, pass username and/or password. If you exclude password, you will be prompted.
def mime(self): author = self.author sender = self.sender if not author: raise ValueError("You must specify an author.") if not self.subject: raise ValueError("You must specify a subject.") if len(self.recipients) == 0: raise ValueError("You must specify at least one recipient.") if not se...
Produce the final MIME message.
def two_lorentzian(freq, freq0_1, freq0_2, area1, area2, hwhm1, hwhm2,
                   phase1, phase2, offset, drift):
    return (lorentzian(freq, freq0_1, area1, hwhm1, phase1, offset, drift) +
            lorentzian(freq, freq0_2, area2, hwhm2, phase2, offset, drift))
A two-Lorentzian model. This is simply the sum of two lorentzian functions in some part of the spectrum. Each individual Lorentzian has its own peak frequency, area, hwhm and phase, but they share common offset and drift parameters.
def blend(self, blend_function=stack):
    new_scn = Scene()
    common_datasets = self.shared_dataset_ids
    for ds_id in common_datasets:
        datasets = [scn[ds_id] for scn in self.scenes if ds_id in scn]
        new_scn[ds_id] = blend_function(datasets)
    return new_scn
Blend the datasets into one scene. .. note:: Blending is not currently optimized for generator-based MultiScene.
def normalize_modpath(modpath, hide_init=True, hide_main=False): if six.PY2: if modpath.endswith(): modpath = modpath[:-1] if hide_init: if basename(modpath) == : modpath = dirname(modpath) hide_main = True else: modpath_with_init = j...
Normalizes __init__ and __main__ paths. Notes: Adds __init__ if reasonable, but only removes __main__ by default Args: hide_init (bool): if True, always return package modules as __init__.py files otherwise always return the dpath. hide_main (bool): if True, always strip awa...
def _slice_replace(code, index, old, new):
    nodes = [str(node) for node in code.get(index)]
    substring = "".join(nodes).replace(old, new)
    code.nodes[index] = parse_anything(substring).nodes
Replace the string *old* with *new* across *index* in *code*.
def get_languages_from_item(ct_item, item): try: item_lan = TransItemLanguage.objects.filter(content_type__pk=ct_item.id, object_id=item.id).get() languages = [lang.code for lang in item_lan.languages.all()] return languages except TransItemLanguage.DoesNotEx...
Get the languages configured for the current item :param ct_item: :param item: :return:
def plot_blob( sampler, blobidx=0, label=None, last_step=False, figure=None, **kwargs ): modelx, model = _process_blob(sampler, blobidx, last_step) if label is None: label = "Model output {0}".format(blobidx) if modelx is None: f = plot_distribution(model, label, figure=f...
Plot a metadata blob as a fit to spectral data or value distribution Additional ``kwargs`` are passed to `plot_fit`. Parameters ---------- sampler : `emcee.EnsembleSampler` Sampler with a stored chain. blobidx : int, optional Metadata blob index to plot. label : str, optional ...
def style_data(self): def recursive_get(data, keys): if len(keys): return recursive_get(data.get(keys[0]), keys[1:]) else: return data geometries = recursive_get(self.data, self.object_path.split())[] for feature in geometries:...
Applies self.style_function to each feature of self.data.
def build_query_string(self, data): query = [] keys_to_be_removed = [] for key, value in data.items(): if key not in [, , ]: if not key == : if key == : value = .join(str(val) for val in value) ...
This method occurs after dumping the data into the class. Args: data (dict): dictionary of all the query values Returns: data (dict): ordered dict of all the values
def from_bits(self, bits):
    if len(bits) != Person.BITS_PER_PERSON:
        raise ValueError(u"Person requires exactly {} bits".format(
            Person.BITS_PER_PERSON))
    self.sitting = bool(bits[0])
    return self
Set this person from bits (ignores the id) :param bits: Bits representing a person :type bits: bytearray :rtype: Person :raises ValueError: Bits has an unexpected length
def create(style_dataset, content_dataset, style_feature=None, content_feature=None, max_iterations=None, model=, verbose=True, batch_size = 6, **kwargs): if len(style_dataset) == 0: raise _ToolkitError("style_dataset SFrame cannot be empty") if len(content_dataset) == 0: ra...
Create a :class:`StyleTransfer` model. Parameters ---------- style_dataset: SFrame Input style images. The columns named by the ``style_feature`` parameters will be extracted for training the model. content_dataset : SFrame Input content images. The columns named by the ``conte...
def use_certificate(self, cert):
    if not isinstance(cert, X509):
        raise TypeError("cert must be an X509 instance")
    use_result = _lib.SSL_CTX_use_certificate(self._context, cert._x509)
    if not use_result:
        _raise_current_error()
Load a certificate from an X509 object :param cert: The X509 object :return: None
def circles(st, layer, axis, ax=None, talpha=1.0, cedge=, cface=): pos = st.obj_get_positions() rad = st.obj_get_radii() shape = st.ishape.shape.tolist() shape.pop(axis) if ax is None: fig = plt.figure() axisbg = if cface == else sx, sy = ((1,shape[1]/float(shape[0])...
Plots a set of circles corresponding to a slice through the platonic structure. Copied from twoslice_overlay with comments, standaloneness. Inputs ------ pos : array of particle positions; [N,3] rad : array of particle radii; [N] ax : plt.axis instance layer : Which layer of...
def visit_exact_match_value(self, node, fieldnames=None): if not fieldnames: fieldnames = [] else: fieldnames = force_list(fieldnames) if ElasticSearchVisitor.KEYWORD_TO_ES_FIELDNAME[] == fieldnames[0]: return self._generate_exact_author_query(node.v...
Generates a term query (exact search in ElasticSearch).
def _handleDelete(self):
    if self.cursorPos < len(self.inputBuffer):
        self.inputBuffer = (self.inputBuffer[0:self.cursorPos]
                            + self.inputBuffer[self.cursorPos + 1:])
        self._refreshInputPrompt(len(self.inputBuffer) + 1)
Handles "delete" characters
def verify_sc_url(url: str) -> bool: parsed = urlsplit(url) scheme: str = parsed.scheme netloc: str = parsed.netloc path: str = parsed.path try: port = parsed.port except ValueError: port = None result = (scheme.lower() == and netloc.lower().split()[0] ...
Verify signature certificate URL against Amazon Alexa requirements. Args: url: signature certificate URL to verify. Returns: bool: result of verification.
def write(self, handle): if not self._frames: return def add(name, desc, bpe, format, bytes, *dimensions): group.add_param(name, desc=desc, bytes_per_element=bpe, bytes=struct.pack(forma...
Write metadata and point + analog frames to a file handle. Parameters ---------- handle : file Write metadata and C3D motion frames to the given file handle. The writer does not close the handle.
def sort_values(self, by=None, axis=0, ascending=True, inplace=False,
                kind='quicksort', na_position='last'):
    raise NotImplementedError("sort_values has not been implemented "
                              "on Panel or Panel4D objects.")
Sort by the values along either axis. Parameters ----------%(optional_by)s axis : %(axes_single_arg)s, default 0 Axis to be sorted. ascending : bool or list of bool, default True Sort ascending vs. descending. Specify list for multiple sort orders....
def send(self, request, **kwargs): kwargs.setdefault(, self.stream) kwargs.setdefault(, self.verify) kwargs.setdefault(, self.cert) kwargs.setdefault(, self.proxies) if history: history.insert(0, r) ...
Send a given PreparedRequest.
def grouper_nofill_str(n, iterable):
    res = more_itertools.chunked(iterable, n)
    if isinstance(iterable, six.string_types):
        res = ("".join(item) for item in res)
    return res
Take a sequence and break it up into chunks of the specified size. The last chunk may be smaller than size. This works very similar to grouper_nofill, except it works with strings as well. >>> tuple(grouper_nofill_str(3, 'foobarbaz')) ('foo', 'bar', 'baz') You can still use it on non-strings too if you like. ...
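The same chunking behavior can be sketched with only the stdlib, in case `more_itertools` is unavailable (an illustrative stand-in, not the library implementation):

```python
from itertools import islice

def grouper_nofill_str(n, iterable):
    # Batch any iterable into tuples of size n (the last may be short),
    # re-joining chunks into strings when the input is a string.
    it = iter(iterable)
    while True:
        chunk = tuple(islice(it, n))
        if not chunk:
            return
        yield "".join(chunk) if isinstance(iterable, str) else chunk

print(tuple(grouper_nofill_str(3, "foobarbaz")))  # ('foo', 'bar', 'baz')
```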
def GetForwardedIps(self, interface, interface_ip=None): try: ips = netifaces.ifaddresses(interface) ips = ips[netifaces.AF_INET] except (ValueError, IndexError): return [] forwarded_ips = [] for ip in ips: if ip[] != interface_ip: full_addr = % (ip[], netaddr.IPAdd...
Retrieve the list of configured forwarded IP addresses. Args: interface: string, the output device to query. interface_ip: string, current interface ip address. Returns: list, the IP address strings.
def eval_genome(genome, config):
    net = neat.nn.FeedForwardNetwork.create(genome, config)
    error = 4.0
    for xi, xo in zip(xor_inputs, xor_outputs):
        output = net.activate(xi)
        error -= (output[0] - xo[0]) ** 2
    return error
This function will be run in parallel by ParallelEvaluator. It takes two arguments (a single genome and the genome class configuration data) and should return one float (that genome's fitness). Note that this function needs to be in module scope for multiprocessing.Pool (which is what ParallelEvaluato...
def quantize_model(sym, arg_params, aux_params, data_names=(,), label_names=(,), ctx=cpu(), excluded_sym_names=None, calib_mode=, calib_data=None, num_calib_examples=None, calib_layer=None, quantized_dtype=, logger=logging): if exclude...
User-level API for generating a quantized model from a FP32 model w/ or w/o calibration. The backend quantized operators are only enabled for Linux systems. Please do not run inference using the quantized models on Windows for now. The quantization implementation adopts the TensorFlow's approach: https:...
def make_venv(self, dj_version): venv_path = self._get_venv_path(dj_version) self.logger.info( % dj_version) try: create_venv(venv_path, **VENV_CREATE_KWARGS) except ValueError: self.logger.warning() self.venv_install( % dj_version, venv_path) ...
Creates a virtual environment for a given Django version. :param str dj_version: :rtype: str :return: path to created virtual env
def _resolve_name(name, package, level): if not hasattr(package, ): raise ValueError(" not set to a string") dot = len(package) for x in xrange(level, 1, -1): try: dot = package.rindex(, 0, dot) except ValueError: raise ValueError("attempted relative impo...
Return the absolute name of the module to be imported.
def readPattern(self):
    if self.dev is None:
        return None
    pattern = []
    for i in range(0, 16):
        pattern.append(self.readPatternLine(i))
    return pattern
Read the entire color pattern :return List of pattern line tuples
def mark_flags_as_mutual_exclusive(flag_names, required=False, flag_values=_flagvalues.FLAGS): for flag_name in flag_names: if flag_values[flag_name].default is not None: warnings.warn( .format(flag_name)) def validate_mutual_exclusion...
Ensures that only one flag among flag_names is not None. Important note: This validator checks if flag values are None, and it does not distinguish between default and explicit values. Therefore, this validator does not make sense when applied to flags with default values other than None, including other false...
def _set_allowed_services_and_actions(self, services): for service in services: self.services[service[]] = {} for action in service[]: name = action.pop() self.services[service[]][name] = action
Expect services to be a list of service dictionaries, each with `name` and `actions` keys.
def _read_parsed(self, lines): self.log(u"Parsing fragments from parsed text format") pairs = [] for line in lines: pieces = line.split(gc.PARSED_TEXT_SEPARATOR) if len(pieces) == 2: identifier = pieces[0].strip() text = pieces[1]....
Read text fragments from a parsed format text file. :param list lines: the lines of the parsed text file :param dict parameters: additional parameters for parsing (e.g., class/id regex strings)
def addmsg(self, msg_p): return lib.zmsg_addmsg(self._as_parameter_, byref(zmsg_p.from_param(msg_p)))
Push encoded message as a new frame. Message takes ownership of submessage, so the original is destroyed in this call. Returns 0 on success, -1 on error.
def check_timers(self): if self._current is None: advance = min([self.clocks] + [x for x in self.timers if x is not None]) + 1 logger.debug(f"Advancing the clock from {self.clocks} to {advance}") self.clocks = advance for procid in range(len(self...
Awaken processes whose timers have expired
def triangle_area(point1, point2, point3):
    # Heron's formula: area from the semi-perimeter and side lengths.
    a = point_distance(point1, point2)
    b = point_distance(point1, point3)
    c = point_distance(point2, point3)
    s = (a + b + c) / 2.0
    return math.sqrt(s * (s - a) * (s - b) * (s - c))
Uses Heron's formula to find the area of a triangle based on the coordinates of three points. Args: point1: list or tuple, the x y coordinate of point one. point2: list or tuple, the x y coordinate of point two. point3: list or tuple, the x y coordinate of point three. Returns: ...
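Assuming a Euclidean `point_distance`, the whole computation fits in a few stdlib lines; a 3-4-5 right triangle is a handy sanity check:

```python
import math

def point_distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def triangle_area(point1, point2, point3):
    # Heron's formula: area from the semi-perimeter and side lengths.
    a = point_distance(point1, point2)
    b = point_distance(point1, point3)
    c = point_distance(point2, point3)
    s = (a + b + c) / 2.0
    return math.sqrt(s * (s - a) * (s - b) * (s - c))

print(triangle_area((0, 0), (3, 0), (0, 4)))  # 6.0
```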
def _get_role_arn(name, **conn_params): if name.startswith(): return name role = __salt__[](name, **conn_params) rolearn = role.get() if role else None return rolearn
Helper function to turn a name into an arn string, returns None if not able to resolve
def value_str(sc): if sc.type in (STRING, INT, HEX): return "({})".format(sc.str_value) return "-->" if sc.choice.selection is sc else " " tri_val_str = (" ", "M", "*")[sc.tri_value] if len(sc.assignable) == 1: return "-{}-".format(tri_val_str) ...
Returns the value part ("[*]", "<M>", "(foo)" etc.) of a menu entry. sc: Symbol or Choice.
def load_mod_from_file(self, fpath): shutit_global.shutit_global_object.yield_to_draw() fpath = os.path.abspath(fpath) file_ext = os.path.splitext(os.path.split(fpath)[-1])[-1] if file_ext.lower() != : return with open(fpath) as f: content = f.read().splitlines() ok = False for line in content: ...
Loads modules from a .py file into ShutIt if there are no modules from this file already. We expect to have a callable 'module/0' which returns one or more module objects. If this doesn't exist we assume that the .py file works in the old style (automatically inserting the module into shutit_global) or it's n...
def NewFromJSON(data): if data.get(, None): shakes = [Shake.NewFromJSON(shk) for shk in data.get()] else: shakes = None return User( id=data.get(, None), name=data.get(, None), profile_image_url=data.get(, None), a...
Create a new User instance from a JSON dict. Args: data (dict): JSON dictionary representing a user. Returns: A User instance.
def _estimate_label_shape(self):
    max_count = 0
    self.reset()
    try:
        while True:
            label, _ = self.next_sample()
            label = self._parse_label(label)
            max_count = max(max_count, label.shape[0])
    except StopIteration:
        pass
    ...
Helper function to estimate label shape
def stretch_weber_fechner(self, k, s0):
    attrs = self.data.attrs
    self.data = k * xu.log(self.data / s0)
    self.data.attrs = attrs
Stretch according to the Weber-Fechner law. p = k.ln(S/S0) p is perception, S is the stimulus, S0 is the stimulus threshold (the highest unperceived stimulus), and k is the factor.
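On plain scalars (rather than the xarray data the method operates on), the Weber-Fechner mapping is just a log stretch:

```python
import math

def weber_fechner(stimulus, k, s0):
    # Perceived intensity p = k * ln(S / S0): zero exactly at the
    # threshold stimulus S0, growing logarithmically above it.
    return k * math.log(stimulus / s0)

print(weber_fechner(math.e, k=1.0, s0=1.0))  # 1.0
print(weber_fechner(5.0, k=2.0, s0=5.0))     # 0.0
```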
def read(path, encoding="utf-8"):
    try:
        with io.open(path, encoding=encoding) as f:
            return f.read()
    except Exception as e:
        logger.error("read: %s failed. Error: %s", path, e)
        return ""
Read the content of the file. Args: path (str): Path to the file encoding (str): File encoding. Default: utf-8 Returns: str: File content or empty string if there was an error
def send_zipfile(request, fileList): temp = tempfile.TemporaryFile() archive = zipfile.ZipFile(temp, , zipfile.ZIP_DEFLATED) for artist,files in fileList.iteritems(): for f in files: archive.write(f[0], % (artist, f[1])) archive.close() wrapper = FixedFileWrapper(temp) ...
Create a ZIP file on disk and transmit it in chunks of 8KB, without loading the whole file into memory. A similar approach can be used for large dynamic PDF files.
def cmd(send, msg, args):
    if not msg:
        msg = gen_word()
    morse = gen_morse(msg)
    if len(morse) > 100:
        send("Your morse is too long. Have you considered Western Union?")
    else:
        send(morse)
Converts text to morse code. Syntax: {command} [text]
def is_multicast(text):
    try:
        first = ord(dns.ipv4.inet_aton(text)[0])
        return first >= 224 and first <= 239
    except Exception:
        try:
            first = ord(dns.ipv6.inet_aton(text)[0])
            return first == 255
        except Exception:
            raise ValueError
Is the textual-form network address a multicast address? @param text: the textual address @raises ValueError: the address family cannot be determined from the input. @rtype: bool
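The stdlib `ipaddress` module offers the same test for both address families, without the nested try/except (a sketch, not the dnspython implementation above):

```python
import ipaddress

def is_multicast(text):
    # IPv4 multicast is 224.0.0.0/4; IPv6 multicast is ff00::/8.
    try:
        return ipaddress.ip_address(text).is_multicast
    except ValueError:
        raise ValueError("cannot determine address family")

print(is_multicast("224.0.0.1"))  # True
print(is_multicast("ff02::1"))    # True
print(is_multicast("10.0.0.1"))   # False
```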
def toTypeURIs(namespace_map, alias_list_s): uris = [] if alias_list_s: for alias in alias_list_s.split(): type_uri = namespace_map.getNamespaceURI(alias) if type_uri is None: raise KeyError( % (alias,)) else: ...
Given a namespace mapping and a string containing a comma-separated list of namespace aliases, return a list of type URIs that correspond to those aliases. @param namespace_map: The mapping from namespace URI to alias @type namespace_map: openid.message.NamespaceMap @param alias_list_s: The string...
def __query_spec(self):
    operators = self.__modifiers.copy()
    if self.__ordering:
        operators["$orderby"] = self.__ordering
    if self.__explain:
        operators["$explain"] = True
    if self.__hint:
        operators["$hint"] = self.__hint
    if self.__comment:
        ...
Get the spec to use for a query.
def followed_topic_num(self): if self.url is not None: tag = self.soup.find(, class_=) if tag is not None: return int(re_get_number.match( tag.parent.strong.text).group(1)) return 0
Get the number of topics the user follows. :return: number of followed topics :rtype: int
def response_to_json_dict(response, **kwargs):
    if response.encoding is None:
        response.encoding = 'utf-8'
    return json.loads(response.text, **kwargs)
Standard place to convert responses to JSON. :param response: requests response object :param **kwargs: arguments accepted by json.loads :returns: dict of JSON response
def get_dates(raw_table) -> "list of dates":
    dates = []
    found_first = False
    for i, dstr in enumerate([raw_table[i][0] for i in range(0, len(raw_table))]):
        if dstr:
            if len(dstr.split("/")) == 3:
                # the strptime format literal was lost in extraction
                d = datetime.datetime.strptime(dstr, ...)
                ...
Goes through the first column of input table and returns the first sequence of dates it finds.
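The scan described above can be sketched as a self-contained helper. The `"%m/%d/%Y"` format is an assumption (the original format string is not shown); the function name is illustrative:

```python
import datetime

def first_date_run(col, fmt="%m/%d/%Y"):
    """Return the first consecutive run of parseable dates in a column.

    Cells before the run are skipped; the scan stops at the first
    non-date cell after the run has started.
    """
    dates, in_run = [], False
    for cell in col:
        try:
            d = datetime.datetime.strptime(cell, fmt)
        except (TypeError, ValueError):
            if in_run:  # the run has ended; stop at the first gap
                break
            continue
        in_run = True
        dates.append(d)
    return dates
```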
def write_url(self, url_data):
    self.writeln(u"<tr>")
    # the two template literals below were lost in extraction
    self.writeln(u... % self.part("url"))
    self.write(u...)
    self.write(u"`%s'" % cgi.escape(url_data.base_url))
    self.writeln(u"</td></tr>")
Write url_data.base_url.
def _set_token(self):
    try:
        # the environment variable name was lost in extraction
        self.token = os.environ[...]
        if self.verbose:
            print("Overriding Cerberus token with environment variable.",
                  file=sys.stderr)
        logger.info("Overriding Cerberus token with environment variable.")
        return
    except KeyError:  # narrowed from a bare `except:`; remainder truncated
        ...
Set the Cerberus token based on auth type
async def renew(self, session, *, dc=None):
    session_id = extract_attr(session, keys=["ID"])
    response = await self._api.put("/v1/session/renew",
                                   session_id,
                                   params={"dc": dc})
    try:
        result = response.body[0]
    except IndexError:
        ...
Renews a TTL-based session Parameters: session (ObjectID): Session ID dc (str): Specify datacenter that will be used. Defaults to the agent's local datacenter. Returns: ObjectMeta: where value is session Raises: NotFound: ses...
def reduce(source, func, initializer=None):
    acc = accumulate.raw(source, func, initializer)
    return select.item.raw(acc, -1)
Apply a function of two arguments cumulatively to the items of an asynchronous sequence, reducing the sequence to a single value. If ``initializer`` is present, it is placed before the items of the sequence in the calculation, and serves as a default when the sequence is empty.
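The semantics described above can be sketched with plain asyncio (this is an illustrative reimplementation, not the library's actual `accumulate`/`select` pipeline):

```python
import asyncio

async def areduce(source, func, initializer=None):
    """Reduce an async iterable to a single value.

    If `initializer` is None, the first item seeds the accumulator,
    matching the usual reduce() contract described above.
    """
    it = source.__aiter__()
    if initializer is None:
        try:
            acc = await it.__anext__()
        except StopAsyncIteration:
            raise TypeError("reduce of empty sequence with no initializer")
    else:
        acc = initializer
    async for item in it:
        acc = func(acc, item)
    return acc

async def numbers():
    # a trivial async source for demonstration
    for x in (1, 2, 3, 4):
        yield x
```

For example, `asyncio.run(areduce(numbers(), lambda a, b: a + b))` sums the sequence.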
def create_init(self, path):
    # the boilerplate source literal was lost in extraction
    source = ...
    with io.open(path, "w", encoding="utf-8") as outfile:
        outfile.write(source)
Create a minimal __init__ file with enough boilerplate to not add to lint messages :param path: destination path of the __init__ file :return: None
def gene_to_panels(self, case_obj):
    LOG.info("Building gene to panels")
    gene_dict = {}
    # the dict-key literals below were reconstructed from context
    for panel_info in case_obj.get('panels', []):
        panel_name = panel_info['panel_name']
        panel_version = panel_info['version']
        panel_obj = self.gene_panel(panel_name, version=panel_version)
        ...
Fetch all gene panels and group them by gene Args: case_obj(scout.models.Case) Returns: gene_dict(dict): A dictionary with gene as keys and a set of panel names as value
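The grouping the docstring describes, inverting a panel-to-genes mapping into a gene-to-panel-names dict of sets, can be sketched generically (the panel structure here is illustrative, not Scout's actual schema):

```python
# Hypothetical panel records: each panel lists the genes it covers.
panels = [
    {"panel_name": "cardio", "genes": ["MYH7", "TTN"]},
    {"panel_name": "neuro", "genes": ["TTN", "SCN1A"]},
]

# Invert: gene -> set of panel names, as in gene_to_panels above.
gene_dict = {}
for panel in panels:
    for gene in panel["genes"]:
        gene_dict.setdefault(gene, set()).add(panel["panel_name"])
```

A set is used as the value so a gene appearing in several panels is recorded once per panel, with no duplicates.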
def get_count(self, prefix=''):  # empty default restored: an empty prefix matches every key
    return sum(self.counters[key] for key in self.messages
               if key.startswith(prefix))
Return the total count of errors and warnings.
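The prefix-based counting above can be shown standalone with a `Counter` (the message codes here are illustrative):

```python
from collections import Counter

# Hypothetical message counters, keyed pycodestyle-style: E* errors, W* warnings.
counters = Counter({"E101": 2, "E225": 1, "W291": 3})

def get_count(prefix=""):
    """Sum counts for keys starting with `prefix`.

    An empty prefix matches every key, giving the grand total of
    errors and warnings, as in the method above.
    """
    return sum(n for key, n in counters.items() if key.startswith(prefix))
```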
def load_watch():
    module_path = dirname(__file__)
    data = np.load(module_path + "/data/watch_dataset.npy").item()
    return data
Loads some of the 6-axis inertial sensor data from my smartwatch project. The sensor data was recorded as study subjects performed sets of 20 shoulder exercise repetitions while wearing a smartwatch. It is a multivariate time series. The study can be found here: https://arxiv.org/abs/1802.01489 Return...
def add_filter(self, ftype, func):
    if not isinstance(ftype, type):
        raise TypeError("Expected type object, got %s" % type(ftype))
    self.castfilter = [(t, f) for (t, f) in self.castfilter if t != ftype]
    self.castfilter.append((ftype, func))
    self.castfilter.sort()
Register a new output filter. Whenever bottle hits a handler output matching `ftype`, `func` is applied to it.
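The registry pattern above can be sketched standalone. Note one change: in Python 3, type objects are not orderable, so the sketch sorts by type name instead of the original bare `sort()` (the class and `apply` method are illustrative, not Bottle's API):

```python
class FilterRegistry:
    """A minimal output-filter registry, as described above."""

    def __init__(self):
        self.castfilter = []

    def add_filter(self, ftype, func):
        if not isinstance(ftype, type):
            raise TypeError("Expected type object, got %s" % type(ftype))
        # drop any existing filter for this type, then re-register
        self.castfilter = [(t, f) for (t, f) in self.castfilter if t != ftype]
        self.castfilter.append((ftype, func))
        # keep a stable order; Python 3 types cannot be compared directly
        self.castfilter.sort(key=lambda pair: pair[0].__name__)

    def apply(self, value):
        # run the first filter whose type matches, else pass through
        for t, f in self.castfilter:
            if isinstance(value, t):
                return f(value)
        return value
```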
def vgcreate(vgname, devices, **kwargs):
    if not vgname or not devices:
        return
    if isinstance(devices, six.string_types):
        devices = devices.split(',')  # comma-separated, per the CLI example
    cmd = ['vgcreate', vgname]
    for device in devices:
        cmd.append(device)
    # the six valid option-name literals were lost in extraction
    valid = (...)
    for var in kwargs:
        ...
Create an LVM volume group CLI Examples: .. code-block:: bash salt mymachine lvm.vgcreate my_vg /dev/sdb1,/dev/sdb2 salt mymachine lvm.vgcreate my_vg /dev/sdb1 clustered=y
def stop_instance(self, instance_id):
    if not instance_id:
        log.info("Instance to stop has no instance id")
        return
    gce = self._connect()
    try:
        request = gce.instances().delete(project=self._project_id, instance=i...
Stops the instance gracefully. :param str instance_id: instance identifier :raises: `InstanceError` if instance can not be stopped
def connect(self, hostkey=None, username='', password=None, pkey=None):
    if hostkey is not None:
        self._preferred_keys = [hostkey.get_name()]
    self.start_client()
    if hostkey is not None:
        key = self.get_remote_server_key()
        if key.get_name() ...
Negotiate an SSH2 session, and optionally verify the server's host key and authenticate using a password or private key. This is a shortcut for L{start_client}, L{get_remote_server_key}, and L{Transport.auth_password} or L{Transport.auth_publickey}. Use those methods if you want more c...
def overview():
    search = Service.search()
    # the filter value and the aggregation name/field literals were lost in extraction
    search = search.filter("term", state=...)
    search.aggs.bucket(..., ..., field=..., order={...}, size=100) \
        .metric(..., ..., field=...)
    response = search.execute()
    print_line("Port Count")
    print_line("---------------")
    for entry in response.aggregations.p...
Function to create an overview of the services. Will print a list of ports found and the number of times each port was seen.