## {{{ http://code.activestate.com/recipes/144838/ (r1)
#messaging.py
#this is a module used for messaging. It allows multiple classes
#to handle various types of messages. It should work on all python
#versions >= 1.5.2
# These are useful slides explaining basic usage of regex in Notepad++
# http://www.slideshare.net/anjesh/the-power-of-regular-expression-use-in-notepad
# example: to replace print.... statements with dbgMsg(....), put print[\s]*(.*) in the Find box and dbgMsg(\1) in the Replace box. \1 refers to the group of characters matched by (.*); if we had another group (...) we would use \2 to refer to it
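# The same substitution can be reproduced in Python itself; a minimal,
# illustrative sketch (not part of the original recipe) using the standard
# re module:
#
#   import re
#   line = 'print "saving", filename'
#   converted = re.sub(r'print[\s]*(.*)', r'dbgMsg(\1)', line)
#   # converted == 'dbgMsg("saving", filename)'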
import sys, string, exceptions
#this flag determines whether debug output is sent to debug handlers themselves
debug = 1
def setDebugging(debugging):
global debug
debug = debugging
class MessagingException(exceptions.Exception):
"""an exception class for any errors that may occur in
a messaging function"""
def __init__(self, args=None):
self.args = args
class FakeException(exceptions.Exception):
"""an exception that is thrown and then caught
to get a reference to the current execution frame"""
pass
class MessageHandler:
"""All message handlers should inherit this class. Each method will be
passed a string when the executing program calls a messaging function"""
def handleStdMsg(self, msg):
"""do something with a standard message from the program"""
pass
def handleErrMsg(self, msg):
"""do something with an error message. This will already include the
class, method, and line of the call"""
pass
def handleDbgMsg(self, msg):
"""do something with a debug message. This will already include the
class, method, and line of the call"""
pass
class defaultMessageHandler(MessageHandler):
"""This is a default message handler. It simply spits all strings to
standard out"""
def handleStdMsg(self, msg):
sys.stdout.write(msg + "\n")
def handleErrMsg(self, msg):
sys.stderr.write(msg + "\n")
def handleDbgMsg(self, msg):
sys.stdout.write(msg + "\n")
#this keeps track of the handlers
_messageHandlers = []
#call this with the handler to register it for receiving messages
def registerMessageHandler(handler):
"""we're not going to check for inheritance, but we should check to make
sure that it has the correct methods"""
for methodName in ["handleStdMsg", "handleErrMsg", "handleDbgMsg"]:
try:
getattr(handler, methodName)
except:
raise MessagingException, "The class " + handler.__class__.__name__ + " is missing a " + methodName + " method"
_messageHandlers.append(handler)
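#a minimal example of a custom handler (illustration only, not part of the
#original recipe): anything providing the three handle*Msg methods can be
#registered, e.g. a handler that appends every message to a log file.
#
#   class FileMessageHandler(MessageHandler):
#       def __init__(self, path):
#           self.logfile = open(path, "a")
#       def handleStdMsg(self, msg):
#           self.logfile.write(msg + "\n")
#       def handleErrMsg(self, msg):
#           self.logfile.write(msg + "\n")
#       def handleDbgMsg(self, msg):
#           self.logfile.write(msg + "\n")
#
#   registerMessageHandler(FileMessageHandler("messages.log"))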
def getCallString(level):
#this gets us the frame of the caller and will work
#in python versions 1.5.2 and greater (there are better
#ways starting in 2.1; see the sketch after this function)
try:
raise FakeException("this is fake")
except Exception, e:
#get the current execution frame
f = sys.exc_info()[2].tb_frame
#go back as many call-frames as was specified
while level >= 0:
f = f.f_back
level = level-1
#if there is a self variable in the caller's local namespace then
#we'll make the assumption that the caller is a class method
obj = f.f_locals.get("self", None)
functionName = f.f_code.co_name
if obj:
callStr = obj.__class__.__name__+"::"+f.f_code.co_name+" (line "+str(f.f_lineno)+")"
else:
callStr = f.f_code.co_name+" (line "+str(f.f_lineno)+")"
return callStr
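#for illustration only (not part of the original recipe): on Python 2.1 and
#later the same information can be read directly from sys._getframe(),
#without raising a throw-away exception. getCallStringModern is a
#hypothetical helper and is not used by the module.
def getCallStringModern(level):
    f = sys._getframe(level + 1)
    obj = f.f_locals.get("self", None)
    if obj:
        return obj.__class__.__name__ + "::" + f.f_code.co_name + " (line " + str(f.f_lineno) + ")"
    return f.f_code.co_name + " (line " + str(f.f_lineno) + ")"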
#send this message to all handlers of std messages
def stdMsg(*args):
stdStr = string.join(map(str, args), " ")
for handler in _messageHandlers:
handler.handleStdMsg(stdStr)
#send this message to all handlers of error messages
def errMsg(*args):
errStr = "Error in "+getCallString(1)+" : "+string.join(map(str, args), " ")
for handler in _messageHandlers:
handler.handleErrMsg(errStr)
#send this message to all handlers of debug messages
def dbgMsg(*args):
if not debug:
return
errStr = getCallString(1)+" : "+string.join(map(str, args), " ")
for handler in _messageHandlers:
handler.handleDbgMsg(errStr)
def pd(*args):
# dbgMsg(*args)
if not debug:
return
errStr = getCallString(1)+" : "+string.join(map(str, args), " ")
for handler in _messageHandlers:
handler.handleDbgMsg(errStr)
registerMessageHandler(defaultMessageHandler())
#end of messaging.py
#test.py
#here is a simple use case for the above module
# from messaging import stdMsg, dbgMsg, errMsg, setDebugging
# setDebugging(0)
# dbgMsg("this won't be printed")
# stdMsg("but this will")
# setDebugging(1)
# def foo():
# dbgMsg("this is a debug message in", "foo")
# class bar:
# def baz(self):
# errMsg("this is an error message in bar")
# foo()
# b = bar()
# b.baz()
# #end of test.py
# output is :
# but this will
# foo (line 12) : this is a debug message in foo
# Error in bar::baz (line 16) : this is an error message in bar
# ## end of http://code.activestate.com/recipes/144838/ }}}
|
All enquiries, bookings, issues and cancellations are handled by individual Tourism Operators, or their chosen Booking Services. greatoceanroad.PAGES is not involved in these transactions, and greatoceanroad.PAGES will not be liable for any of these issues. As a website Visitor, please make sure of the Tourism Operator's identity and trustworthiness before placing a Booking.
Any malicious use intended to harm greatoceanroad.PAGES, or its Contributors, or any other entity, will be considered as a breach of this Policy.
greatoceanroad.PAGES may collect Statistics about the usage of the website, so that it can be improved generally, and also to change the behaviour of greatoceanroad.PAGES to better suit you.
greatoceanroad.PAGES links to many independent websites, and you should check their Privacy Policies if you have any concerns.
greatoceanroad.PAGES may have a commercial relationship with a Contributor, but uses automated random algorithms to ensure fairness.
Views, Opinions and Descriptions are those of the individual Contributors, and not necessarily those of greatoceanroad.PAGES.
All content (Text and Images) on greatoceanroad.PAGES has been provided by individual Contributors who have stated their right to Copyright over the published Text and Images, or have confirmed that they are licensed to use them. Any breaches of Copyright by a Contributor will be the responsibility of that Contributor.
greatoceanroad.PAGES respects the rights of Contributors, and their Licensors, to have legal protection over their creative works. If a Website Visitor attempts to copy or change any of these Text and Images then greatoceanroad.PAGES will support its Contributors in pursuing any legal options open to them. |
from django.core.urlresolvers import reverse
from django.db import models
from django.db.models.signals import pre_save
from django.dispatch import receiver
from django.template.defaultfilters import slugify, stringformat
import markdown
import re
from bs4 import BeautifulSoup
class Tag(models.Model):
name = models.CharField(max_length=50, unique=True)
slug = models.SlugField(max_length=50, unique=True)
def __str__(self):
return self.name
class Post(models.Model):
title = models.CharField(max_length=200)
date = models.DateTimeField()
content = models.TextField()
content_html = models.TextField(blank=True)
tags = models.ManyToManyField(Tag, related_name='posts', blank=True)
slug = models.SlugField(max_length=200, unique=True, blank=True)
def get_absolute_url(self):
return reverse('blog:post', kwargs={
'year': self.date.year,
'month': stringformat(self.date.month, '02d'),
'day': stringformat(self.date.day, '02d'),
'slug': self.slug
})
def __str__(self):
return self.title
@receiver(pre_save, sender=Post)
def pre_save_post(**kwargs):
content_html = markdown.markdown(kwargs['instance'].content,
extensions=['codehilite'])
    soup = BeautifulSoup(content_html, 'html.parser')
for tag in soup.find_all(re.compile(r'h\d')):
if tag.parent is soup:
tag.name = 'h%d' % (int(tag.name[1]) + 1)
kwargs['instance'].content_html = str(soup)
kwargs['instance'].slug = slugify(
kwargs['instance'].title.replace('.', '-'))
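# Illustrative sketch (not part of the original module): what the pre_save
# handler above does to an instance, assuming the markdown + BeautifulSoup
# pipeline shown.
#
#   post = Post(title='Hello. World', date=timezone.now(), content='# Intro')
#   post.save()        # fires pre_save_post via the pre_save signal
#   post.content_html  # '<h2>Intro</h2>'  -- top-level headings are demoted one level
#   post.slug          # 'hello-world'     -- '.' replaced by '-' before slugify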
|
Ability to cruise quietly and comfortably, ideal for elite marques.
Our Turbosteel and Turbospeed ranges are perfect for large, luxury cars. With enough flex in the sidewall to deliver a comfortable ride, and a tread pattern that’s efficient in reducing noise and dispersing water, if you drive a premium car, these premium tyres will do it justice. |
from __future__ import absolute_import
from builtins import object
from proteus import *
from proteus.default_p import *
from .NS_hotstart import *
from proteus.mprans import RANS3PF
LevelModelType = RANS3PF.LevelModel
name="momentum_eqn"
coefficients = RANS3PF.Coefficients(epsFact=epsFact_viscosity,
sigma=0.0,
rho_0 = rho_0,
nu_0 = nu_0,
rho_1 = rho_1,
nu_1 = nu_1,
g=g,
nd=nd,
ME_model=0,
PRESSURE_model=2,
SED_model=None,
VOS_model=None,
VOF_model=None,
LS_model=None,
Closure_0_model=None,
Closure_1_model=None,
epsFact_density=epsFact_density,
stokes=False,
useVF=useVF,
useRBLES=useRBLES,
useMetrics=useMetrics,
eb_adjoint_sigma=1.0,
eb_penalty_constant=weak_bc_penalty_constant,
forceStrongDirichlet=ns_forceStrongDirichlet,
turbulenceClosureModel=ns_closure,
movingDomain=movingDomain,
dragAlpha=dragAlpha,
PSTAB=1.0,
cE=cE,
cMax=cMax,
CORRECT_VELOCITY=CORRECT_VELOCITY)
#######################
# BOUNDARY CONDITIONS #
#######################
def getDBC_u(x,flag):
#None
pi = np.pi
if (flag==1 or flag==2 or flag==3 or flag==4):
if manufactured_solution == 1:
return lambda x,t: np.sin(x[0])*np.sin(x[1]+t)
else:
return lambda x,t: np.sin(pi*x[0])*np.cos(pi*x[1])*np.sin(t)
def getDBC_v(x,flag):
#None
pi = np.pi
if (flag==1 or flag==2 or flag==3 or flag==4):
if manufactured_solution == 1:
return lambda x,t: np.cos(x[0])*np.cos(x[1]+t)
else:
return lambda x,t: -np.cos(pi*x[0])*np.sin(pi*x[1])*np.sin(t)
def getAFBC_u(x,flag):
    return None
def getAFBC_v(x,flag):
    return None
def getDFBC_u(x,flag):
    # Set grad(u).n
    return None
def getDFBC_v(x,flag):
    return None
dirichletConditions = {0:getDBC_u,
1:getDBC_v}
advectiveFluxBoundaryConditions = {0:getAFBC_u,
1:getAFBC_v}
diffusiveFluxBoundaryConditions = {0:{0:getDFBC_u},
1:{1:getDFBC_v}}
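# A note on the structure above (inferred, not taken from the Proteus docs):
# the dictionaries are keyed by velocity component (0 -> u, 1 -> v), and each
# get*BC_* callable either returns None (no condition imposed at that point)
# or a function of (x, t) giving the prescribed boundary value. The flags
# 1-4 are assumed to label the four sides of the rectangular domain set up in
# NS_hotstart.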
######################
# INITIAL CONDITIONS #
######################
class AtRest(object):
def __init__(self):
pass
def uOfXT(self,x,t):
return 0.0
class velx_at_t0(object):
def __init__(self):
pass
def uOfXT(self,x,t):
if manufactured_solution == 1:
return np.sin(x[0])*np.sin(x[1])
else:
return 0.
class vely_at_t0(object):
def __init__(self):
pass
def uOfXT(self,x,t):
if manufactured_solution == 1:
return np.cos(x[0])*np.cos(x[1])
else:
return 0.
initialConditions = {0:velx_at_t0(),
1:vely_at_t0()}
#############################
# MATERIAL PARAMETER FIELDS #
#############################
def density(X,t):
x = X[0]
y = X[1]
return np.sin(x+y+t)**2+1
mu_constant=False
def dynamic_viscosity(X,t):
x = X[0]
y = X[1]
return mu*(np.cos(x+y+t)**2+1)
materialParameters = {'density':density,
'dynamic_viscosity':dynamic_viscosity}
###############
# FORCE TERMS #
###############
def forcex(X,t):
x = X[0]
y = X[1]
rho = density(X,t)
    pi = np.pi
if manufactured_solution == 1: #u.n!=0
return (rho*np.sin(x)*np.cos(y+t) # Time derivative
+ rho*np.sin(x)*np.cos(x) # Non-linearity
- (0. if KILL_PRESSURE_TERM==True else 1.)*np.sin(x)*np.sin(y+t) # Pressure
+ (2*dynamic_viscosity(X,t)*np.sin(x)*np.sin(y+t) # Diffusion
+(0. if mu_constant==True else 1.)*mu*2*np.cos(x+y+t)*np.sin(x+y+t)*(np.cos(x)*np.sin(y+t)+np.sin(x)*np.cos(y+t))
+(0. if mu_constant==True else 1.)*mu*2*np.cos(x+y+t)*np.sin(x+y+t)*(np.cos(x)*np.sin(y+t)-np.sin(x)*np.cos(y+t)))
)
else: # u.n=0
return (rho*np.sin(pi*x)*np.cos(pi*y)*np.cos(t) # Time derivative
+ rho*pi*np.sin(pi*x)*np.cos(pi*x)*np.sin(t)**2 # non-linearity
- (0. if KILL_PRESSURE_TERM==True else 1.)*np.sin(x)*np.sin(y+t) # Pressure
- dynamic_viscosity(X,t)*(-2*pi**2*np.sin(pi*x)*np.cos(pi*y)*np.sin(t)) # Diffusion
)
def forcey(X,t):
x = X[0]
y = X[1]
rho = density(X,t)
pi = np.pi
if manufactured_solution == 1: #u.n!=0
return (-rho*np.cos(x)*np.sin(y+t) # Time derivative
- rho*np.sin(y+t)*np.cos(y+t) #Non-linearity
+ (0. if KILL_PRESSURE_TERM==True else 1.)*np.cos(x)*np.cos(y+t) #Pressure
+ (2*dynamic_viscosity(X,t)*np.cos(x)*np.cos(y+t) # Diffusion
+(0. if mu_constant==True else 1.)*mu*2*np.cos(x+y+t)*np.sin(x+y+t)*(-np.sin(x)*np.cos(y+t)-np.cos(x)*np.sin(y+t))
+(0. if mu_constant==True else 1.)*mu*2*np.cos(x+y+t)*np.sin(x+y+t)*(np.sin(x)*np.cos(y+t)-np.cos(x)*np.sin(y+t)))
)
else:
return (-rho*np.cos(pi*x)*np.sin(pi*y)*np.cos(t) # Time derivative
+ rho*pi*np.sin(pi*y)*np.cos(pi*y)*np.sin(t)**2 # non-linearity
+ (0. if KILL_PRESSURE_TERM==True else 1.)*np.cos(x)*np.cos(y+t) #Pressure
- dynamic_viscosity(X,t)*(2*pi**2*np.cos(pi*x)*np.sin(pi*y)*np.sin(t)) # Diffusion
)
forceTerms = {0:forcex,
1:forcey}
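# What the force terms above appear to encode (a sketch of the method of
# manufactured solutions, not taken verbatim from the Proteus sources): the
# right-hand side is the residual of the variable-density momentum equation
#
#   f = rho * ( du/dt + (u . grad) u ) + grad p - div( 2 * mu * D(u) ),
#   D(u) = ( grad u + (grad u)^T ) / 2,
#
# evaluated at the chosen exact fields (u, v, p) listed below, so that the
# discrete solution can be compared against the analytical one. The
# KILL_PRESSURE_TERM and mu_constant switches simply drop the
# pressure-gradient and variable-viscosity contributions from that residual.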
##################
# EXACT SOLUTION #
##################
class velx(object):
def __init__(self):
pass
def uOfXT(self,x,t):
pi = np.pi
if manufactured_solution == 1:
return np.sin(x[0])*np.sin(x[1]+t)
else:
return np.sin(pi*x[0])*np.cos(pi*x[1])*np.sin(t)
def duOfXT(self,x,t):
if manufactured_solution == 1:
return [np.cos(x[0])*np.sin(x[1]+t),
np.sin(x[0])*np.cos(x[1]+t)]
else:
return [pi*np.cos(pi*x[0])*np.cos(pi*x[1])*np.sin(t),
-pi*np.sin(pi*x[0])*np.sin(pi*x[1])*np.sin(t)]
class vely(object):
def __init__(self):
pass
def uOfXT(self,x,t):
pi = np.pi
if manufactured_solution == 1:
return np.cos(x[0])*np.cos(x[1]+t)
else:
return -np.cos(pi*x[0])*np.sin(pi*x[1])*np.sin(t)
def duOfXT(self,x,t):
if manufactured_solution == 1:
return [-np.sin(x[0])*np.cos(x[1]+t),
-np.cos(x[0])*np.sin(x[1]+t)]
else:
return [pi*np.sin(pi*x[0])*np.sin(pi*x[1])*np.sin(t),
-pi*np.cos(pi*x[0])*np.cos(pi*x[1])*np.sin(t)]
analyticalSolution = {0:velx(),
1:vely()}
class pressure(object):
def __init__(self):
pass
def uOfXT(self,x,t):
return np.cos(x[0])*np.sin(x[1]+t)
analyticalPressureSolution={0:pressure()}
|
If you have high blood pressure then you are at greater risk of stroke, heart disease, cardiovascular disease and kidney disease. As we get older we are more susceptible to high blood pressure, particularly if we have a family history of it. Another factor that can influence our blood pressure is our lifestyle: smoking, alcohol, diet and exercise can all have a big influence, particularly if you are overweight. We have various treatments available to help you keep your blood pressure within a healthy range. |
import re
examples = ""
def full_usage():
global examples
out = ""
out += main(False)
out += strip_extras(resource([],False))
out += strip_extras(cluster([],False))
out += strip_extras(stonith([],False))
out += strip_extras(property([],False))
out += strip_extras(constraint([],False))
out += strip_extras(acl([],False))
out += strip_extras(status([],False))
out += strip_extras(config([],False))
out += strip_extras(pcsd([],False))
print out.strip()
print "Examples:\n" + examples.replace(" \ ","")
def strip_extras(text):
global examples
ret = ""
group_name = text.split(" ")[2]
in_commands = False
in_examples = False
lines = text.split("\n")
minicmd = ""
ret += group_name.title() + ":\n"
for line in lines:
if not in_commands:
if line == "Commands:":
in_commands = True
continue
if not in_examples:
if line == "Examples:":
in_examples = True
continue
if not in_examples and not in_commands:
continue
if len(line) >= 4:
if line[0:4] == " ":
if line[4:8] != " ":
if in_examples:
minicmd = line.lstrip() + " "
else:
minicmd = " " + " " + line.lstrip() + " "
else:
minicmd += line.lstrip() + " "
else:
if in_commands:
break
else:
if in_examples:
examples += minicmd + "\n\n"
else:
ret += minicmd + "\n"
minicmd = ""
return ret
# Print only output for items that match the args
# For now we only look at the first arg
# If no args, then we return the full output
def sub_usage(args, output):
if len(args) == 0:
return output
ret = ""
lines = output.split('\n')
begin_printing = False
usage = re.sub("\[commands\]", args[0], lines[1])
for line in lines:
if begin_printing == True and re.match("^ [^ ]",line) and not re.match("^ " + args[0], line):
begin_printing = False
if not re.match("^ ",line) and not re.match("^$",line):
begin_printing = False
if re.match("^ " + args[0], line):
begin_printing = True
if begin_printing:
ret += line + "\n"
if ret != "":
return "\n" + usage + "\n" + ret.rstrip() + "\n"
else:
return output
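# A rough illustration of the filtering above (assumed call, not captured
# pcs output): sub_usage(["show"], resource([], False)) would return the
# usage header with "show" substituted for [commands], followed only by the
# indented help block that starts with "show"; with no args the full text is
# returned unchanged.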
def dict_depth(d, depth=0):
if not isinstance(d, dict) or not d:
return depth
return max(dict_depth(v, depth+1) for k, v in d.iteritems())
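# e.g. dict_depth({}) == 0, dict_depth({"a": {}}) == 1,
# dict_depth({"a": {"b": {}}}) == 2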
def sub_gen_code(level,item,prev_level=[],spaces=""):
out = ""
if dict_depth(item) <= level:
return ""
out += 'case "${cur' + str(level) + '}" in\n'
next_level = []
for key,val in item.items():
if len(val) == 0:
continue
values = " ".join(val.keys())
values = values.replace("|"," ")
out += " " + key + ")\n"
if len(val) > 0 and level != 1:
out += sub_gen_code(level-1,item[key],[] ,spaces + " ")
else:
out += " " + 'COMPREPLY=($(compgen -W "' + values + '" -- ${cur}))\n'
out += " return 0\n"
out += " ;;\n"
out += " *)\n"
out += " ;;\n"
out += 'esac\n'
temp = out.split('\n')
new_out = ""
for l in temp:
new_out += spaces + l + "\n"
return new_out
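# Roughly what sub_gen_code(1, tree) prints for the top level (an assumed
# illustration, not captured output): one case arm per command group,
# completing that group's sub-commands.
#
#   case "${cur1}" in
#       resource)
#           COMPREPLY=($(compgen -W "show list describe create ..." -- ${cur}))
#           return 0
#           ;;
#       cluster)
#           ...
#       *)
#           ;;
#   esac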
def sub_generate_bash_completion():
tree = {}
tree["resource"] = generate_tree(resource([],False))
tree["cluster"] = generate_tree(cluster([],False))
tree["stonith"] = generate_tree(stonith([],False))
tree["property"] = generate_tree(property([],False))
tree["acl"] = generate_tree(acl([],False))
tree["constraint"] = generate_tree(constraint([],False))
tree["status"] = generate_tree(status([],False))
tree["config"] = generate_tree(config([],False))
tree["pcsd"] = generate_tree(pcsd([],False))
print """
_pcs()
{
local cur cur1 cur2 cur3
COMPREPLY=()
cur="${COMP_WORDS[COMP_CWORD]}"
if [ "$COMP_CWORD" -gt "0" ]; then cur1="${COMP_WORDS[COMP_CWORD-1]}";fi
if [ "$COMP_CWORD" -gt "1" ]; then cur2="${COMP_WORDS[COMP_CWORD-2]}";fi
if [ "$COMP_CWORD" -gt "2" ]; then cur3="${COMP_WORDS[COMP_CWORD-3]}";fi
"""
print sub_gen_code(3,tree,[])
print sub_gen_code(2,tree,[])
print sub_gen_code(1,tree,[])
print """
if [ $COMP_CWORD -eq 1 ]; then
COMPREPLY=( $(compgen -W "resource cluster stonith property acl constraint status config" -- $cur) )
fi
return 0
}
complete -F _pcs pcs
"""
def generate_tree(usage_txt):
ignore = True
ret_hash = {}
cur_stack = []
for l in usage_txt.split('\n'):
if l.startswith("Commands:"):
ignore = False
continue
if l.startswith("Examples:"):
break
if ignore == True:
continue
if re.match("^ \w",l):
args = l.split()
arg = args.pop(0)
if not arg in ret_hash:
ret_hash[arg] = {}
cur_hash = ret_hash[arg]
for arg in args:
if arg.startswith('[') or arg.startswith('<'):
break
if not arg in cur_hash:
cur_hash[arg] = {}
cur_hash = cur_hash[arg]
return ret_hash
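# A rough illustration of what generate_tree() builds (assumed, not captured
# output): every command line under "Commands:" contributes a nested dict of
# its literal words, stopping at the first [optional] or <placeholder> token.
#
#   "failcount show <resource id> [node]"  ->  {"failcount": {"show": {}}}
#   "op defaults [options]"                ->  {"op": {"defaults": {}}}
#
# sub_gen_code() then walks this tree to emit the nested bash case blocks
# used for tab completion.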
def main(pout=True):
output = """
Usage: pcs [-f file] [-h] [commands]...
Control and configure pacemaker and corosync.
Options:
-h, --help Display usage and exit
-f file Perform actions on file instead of active CIB
--debug Print all network traffic and external commands run
--version Print pcs version information
Commands:
cluster Configure cluster options and nodes
resource Manage cluster resources
stonith Configure fence devices
constraint Set resource constraints
property Set pacemaker properties
acl Set pacemaker access control lists
status View cluster status
config View and manage cluster configuration
pcsd Manage pcs daemon
"""
# Advanced usage to possibly add later
# --corosync_conf=<corosync file> Specify alternative corosync.conf file
if pout:
print output
else:
return output
def resource(args = [], pout = True):
output = """
Usage: pcs resource [commands]...
Manage pacemaker resources
Commands:
show [resource id] [--full] [--groups]
Show all currently configured resources or if a resource is specified
show the options for the configured resource. If --full is specified
all configured resource options will be displayed. If --groups is
specified, only show groups (and their resources).
list [<standard|provider|type>] [--nodesc]
Show list of all available resources, optionally filtered by specified
type, standard or provider. If --nodesc is used then descriptions
of resources are not printed.
describe <standard:provider:type|type>
Show options for the specified resource
create <resource id> <standard:provider:type|type> [resource options]
[op <operation action> <operation options> [<operation action>
<operation options>]...] [meta <meta options>...]
[--clone <clone options> | --master <master options> |
--group <group name> [--before <resource id> | --after <resource id>]
] [--disabled] [--wait[=n]]
Create specified resource. If --clone is used a clone resource is
created; if --master is specified a master/slave resource is created.
If --group is specified the resource is added to the group named. You
can use --before or --after to specify the position of the added
resource relatively to some resource already existing in the group.
If --disabled is specified the resource is not started automatically.
If --wait is specified, pcs will wait up to 'n' seconds for the resource
to start and then return 0 if the resource is started, or 1 if
the resource has not yet started. If 'n' is not specified it defaults
to 60 minutes.
Example: Create a new resource called 'VirtualIP' with IP address
192.168.0.99, netmask of 32, monitored every 30 seconds,
on eth2.
pcs resource create VirtualIP ocf:heartbeat:IPaddr2 \\
ip=192.168.0.99 cidr_netmask=32 nic=eth2 \\
op monitor interval=30s
delete <resource id|group id|master id|clone id>
Deletes the resource, group, master or clone (and all resources within
the group/master/clone).
enable <resource id> [--wait[=n]]
Allow the cluster to start the resource. Depending on the rest of the
configuration (constraints, options, failures, etc), the resource may
remain stopped. If --wait is specified, pcs will wait up to 'n' seconds
for the resource to start and then return 0 if the resource is started,
or 1 if the resource has not yet started. If 'n' is not specified it
defaults to 60 minutes.
disable <resource id> [--wait[=n]]
Attempt to stop the resource if it is running and forbid the cluster
from starting it again. Depending on the rest of the configuration
(constraints, options, failures, etc), the resource may remain
started. If --wait is specified, pcs will wait up to 'n' seconds for
the resource to stop and then return 0 if the resource is stopped or 1
if the resource has not stopped. If 'n' is not specified it defaults
to 60 minutes.
restart <resource id> [node] [--wait=n]
Restart the resource specified. If a node is specified and if the
resource is a clone or master/slave it will be restarted only on
the node specified. If --wait is specified, then we will wait
up to 'n' seconds for the resource to be restarted and return 0 if
the restart was successful or 1 if it was not.
debug-start <resource id> [--full]
This command will force the specified resource to start on this node
ignoring the cluster recommendations and print the output from
starting the resource. Using --full will give more detailed output.
This is mainly used for debugging resources that fail to start.
debug-stop <resource id> [--full]
This command will force the specified resource to stop on this node
ignoring the cluster recommendations and print the output from
stopping the resource. Using --full will give more detailed output.
This is mainly used for debugging resources that fail to stop.
debug-promote <resource id> [--full]
This command will force the specified resource to be promoted on this
node ignoring the cluster recommendations and print the output from
promoting the resource. Using --full will give more detailed output.
This is mainly used for debugging resources that fail to promote.
debug-demote <resource id> [--full]
This command will force the specified resource to be demoted on this
node ignoring the cluster recommendations and print the output from
demoting the resource. Using --full will give more detailed output.
This is mainly used for debugging resources that fail to demote.
debug-monitor <resource id> [--full]
This command will force the specified resource to be monitored on this
node ignoring the cluster recommendations and print the output from
monitoring the resource. Using --full will give more detailed output.
This is mainly used for debugging resources that fail to be monitored.
move <resource id> [destination node] [--master] [lifetime=<lifetime>]
[--wait[=n]]
Move the resource off the node it is currently running on by creating a
-INFINITY location constraint to ban the node. If destination node is
specified the resource will be moved to that node by creating an
INFINITY location constraint to prefer the destination node. If
--master is used the scope of the command is limited to the master role
and you must use the master id (instead of the resource id). If
lifetime is specified then the constraint will expire after that time,
otherwise it defaults to infinity and the constraint can be cleared
manually with 'pcs resource clear' or 'pcs constraint delete'. If
--wait is specified, pcs will wait up to 'n' seconds for the resource
to move and then return 0 on success or 1 on error. If 'n' is not
specified it defaults to 60 minutes.
If you want the resource to preferably avoid running on some nodes but
be able to failover to them use 'pcs location avoids'.
ban <resource id> [node] [--master] [lifetime=<lifetime>] [--wait[=n]]
Prevent the resource id specified from running on the node (or on the
current node it is running on if no node is specified) by creating a
-INFINITY location constraint. If --master is used the scope of the
command is limited to the master role and you must use the master id
(instead of the resource id). If lifetime is specified then the
constraint will expire after that time, otherwise it defaults to
infinity and the constraint can be cleared manually with 'pcs resource
clear' or 'pcs constraint delete'. If --wait is specified, pcs will
wait up to 'n' seconds for the resource to move and then return 0
on success or 1 on error. If 'n' is not specified it defaults to 60
minutes.
If you want the resource to preferably avoid running on some nodes but
be able to failover to them use 'pcs location avoids'.
clear <resource id> [node] [--master] [--wait[=n]]
Remove constraints created by move and/or ban on the specified
resource (and node if specified).
If --master is used the scope of the command is limited to the
master role and you must use the master id (instead of the resource id).
If --wait is specified, pcs will wait up to 'n' seconds for the
operation to finish (including starting and/or moving resources if
appropriate) and then return 0 on success or 1 on error. If 'n' is not
specified it defaults to 60 minutes.
standards
List available resource agent standards supported by this installation.
(OCF, LSB, etc.)
providers
List available OCF resource agent providers
agents [standard[:provider]]
List available agents optionally filtered by standard and provider
update <resource id> [resource options] [op [<operation action>
<operation options>]...] [meta <meta operations>...] [--wait[=n]]
Add/Change options to specified resource, clone or multi-state
resource. If an operation (op) is specified it will update the first
found operation with the same action on the specified resource, if no
operation with that action exists then a new operation will be created.
(WARNING: all existing options on the updated operation will be reset
if not specified.) If you want to create multiple monitor operations
you should use the 'op add' & 'op remove' commands. If --wait is
specified, pcs will wait up to 'n' seconds for the changes to take
effect and then return 0 if the changes have been processed or 1
otherwise. If 'n' is not specified it defaults to 60 minutes.
op add <resource id> <operation action> [operation properties]
Add operation for specified resource
op remove <resource id> <operation action> [<operation properties>...]
Remove specified operation (note: you must specify the exact operation
properties to properly remove an existing operation).
op remove <operation id>
Remove the specified operation id
op defaults [options]
Set default values for operations, if no options are passed, lists
currently configured defaults
meta <resource id | group id | master id | clone id> <meta options>
[--wait[=n]]
Add specified options to the specified resource, group, master/slave
or clone. Meta options should be in the format of name=value, options
may be removed by setting an option without a value. If --wait is
specified, pcs will wait up to 'n' seconds for the changes to take
effect and then return 0 if the changes have been processed or 1
otherwise. If 'n' is not specified it defaults to 60 minutes.
Example: pcs resource meta TestResource failure-timeout=50 stickiness=
group add <group name> <resource id> [resource id] ... [resource id]
[--before <resource id> | --after <resource id>] [--wait[=n]]
Add the specified resource to the group, creating the group if it does
not exist. If the resource is present in another group it is moved
to the new group. You can use --before or --after to specify
the position of the added resources relatively to some resource already
existing in the group. If --wait is specified, pcs will wait up to 'n'
seconds for the operation to finish (including moving resources if
appropriate) and then return 0 on success or 1 on error. If 'n' is not
specified it defaults to 60 minutes.
group remove <group name> <resource id> [resource id] ... [resource id]
[--wait[=n]]
Remove the specified resource(s) from the group, removing the group if
no resources remain. If --wait is specified, pcs will wait up to 'n'
seconds for the operation to finish (including moving resources if
appropriate) and then return 0 on success or 1 on error. If 'n' is not
specified it defaults to 60 minutes.
ungroup <group name> [resource id] ... [resource id] [--wait[=n]]
Remove the group (Note: this does not remove any resources from the
cluster) or if resources are specified, remove the specified resources
from the group. If --wait is specified, pcs will wait up to 'n' seconds
for the operation to finish (including moving resources if appropriate)
and then return 0 on success or 1 on error. If 'n' is not specified it
defaults to 60 minutes.
clone <resource id | group id> [clone options]... [--wait[=n]]
Set up the specified resource or group as a clone. If --wait is
specified, pcs will wait up to 'n' seconds for the operation to finish
(including starting clone instances if appropriate) and then return 0
on success or 1 on error. If 'n' is not specified it defaults to 60
minutes.
unclone <resource id | group name> [--wait[=n]]
Remove the clone which contains the specified group or resource (the
resource or group will not be removed). If --wait is specified, pcs
will wait up to 'n' seconds for the operation to finish (including
stopping clone instances if appropriate) and then return 0 on success
or 1 on error. If 'n' is not specified it defaults to 60 minutes.
master [<master/slave name>] <resource id | group name> [options]
[--wait[=n]]
Configure a resource or group as a multi-state (master/slave) resource.
If --wait is specified, pcs will wait up to 'n' seconds for the operation
to finish (including starting and promoting resource instances if
appropriate) and then return 0 on success or 1 on error. If 'n' is not
specified it defaults to 60 minutes.
Note: to remove a master you must remove the resource/group it contains.
manage <resource id> ... [resource n]
Set resources listed to managed mode (default)
unmanage <resource id> ... [resource n]
Set resources listed to unmanaged mode
defaults [options]
Set default values for resources, if no options are passed, lists
currently configured defaults
cleanup [<resource id>]
Cleans up the resource in the lrmd (useful to reset the resource
status and failcount). This tells the cluster to forget the
operation history of a resource and re-detect its current state.
This can be useful to purge knowledge of past failures that have
since been resolved. If a resource id is not specified then all
resources/stonith devices will be cleaned up.
failcount show <resource id> [node]
Show current failcount for specified resource from all nodes or
only on specified node
failcount reset <resource id> [node]
Reset failcount for specified resource on all nodes or only on
specified node. This tells the cluster to forget how many times
a resource has failed in the past. This may allow the resource to
be started or moved to a more preferred location.
relocate dry-run [resource1] [resource2] ...
The same as 'relocate run' but has no effect on the cluster.
relocate run [resource1] [resource2] ...
Relocate specified resources to their preferred nodes. If no resources
are specified, relocate all resources.
This command calculates the preferred node for each resource while
ignoring resource stickiness. Then it creates location constraints
which will cause the resources to move to their preferred nodes. Once
the resources have been moved the constraints are deleted automatically.
Note that the preferred node is calculated based on current cluster
status, constraints, location of resources and other settings and thus
it might change over time.
relocate show
Display current status of resources and their optimal node ignoring
resource stickiness.
relocate clean
Remove all constraints created by the 'relocate run' command.
Examples:
pcs resource show
Show all resources
pcs resource show VirtualIP
Show options specific to the 'VirtualIP' resource
pcs resource create VirtualIP ocf:heartbeat:IPaddr2 ip=192.168.0.99 \\
cidr_netmask=32 nic=eth2 op monitor interval=30s
Create a new resource called 'VirtualIP' with options
pcs resource create VirtualIP IPaddr2 ip=192.168.0.99 \\
cidr_netmask=32 nic=eth2 op monitor interval=30s
Create a new resource called 'VirtualIP' with options
pcs resource update VirtualIP ip=192.168.0.98 nic=
Change the ip address of VirtualIP and remove the nic option
pcs resource delete VirtualIP
Delete the VirtualIP resource
Notes:
Starting resources on a cluster is (almost) always done by pacemaker and
not directly from pcs. If your resource isn't starting, it's usually
due to either a misconfiguration of the resource (which you debug in
the system log), or constraints preventing the resource from starting or
the resource being disabled. You can use 'pcs resource debug-start' to
test resource configuration, but it should *not* normally be used to start
resources in a cluster.
"""
if pout:
print sub_usage(args, output)
else:
return output
def cluster(args = [], pout = True):
output = """
Usage: pcs cluster [commands]...
Configure cluster for use with pacemaker
Commands:
auth [node] [...] [-u username] [-p password] [--force] [--local]
Authenticate pcs to pcsd on nodes specified, or on all nodes
configured in corosync.conf if no nodes are specified (authorization
tokens are stored in ~/.pcs/tokens or /var/lib/pcsd/tokens for root).
By default all nodes are also authenticated to each other, using
--local only authenticates the local node (and does not authenticate
the remote nodes with each other). Using --force forces
re-authentication to occur.
setup [--start] [--local] [--enable] --name <cluster name> <node1[,node1-altaddr]>
[node2[,node2-altaddr]] [..] [--transport <udpu|udp>] [--rrpmode active|passive]
[--addr0 <addr/net> [[[--mcast0 <address>] [--mcastport0 <port>]
[--ttl0 <ttl>]] | [--broadcast0]]
[--addr1 <addr/net> [[[--mcast1 <address>] [--mcastport1 <port>]
[--ttl1 <ttl>]] | [--broadcast1]]]]
[--wait_for_all=<0|1>] [--auto_tie_breaker=<0|1>]
[--last_man_standing=<0|1> [--last_man_standing_window=<time in ms>]]
[--ipv6] [--token <timeout>] [--token_coefficient <timeout>]
[--join <timeout>] [--consensus <timeout>] [--miss_count_const <count>]
[--fail_recv_const <failures>]
Configure corosync and sync configuration out to listed nodes.
--local will only perform changes on the local node,
--start will also start the cluster on the specified nodes,
--enable will enable corosync and pacemaker on node startup,
--transport allows specification of corosync transport (default: udpu),
--rrpmode allows you to set the RRP mode of the system. Currently only
'passive' is supported or tested (using 'active' is not
recommended).
The --wait_for_all, --auto_tie_breaker, --last_man_standing,
--last_man_standing_window options are all documented in corosync's
votequorum(5) man page.
--ipv6 will configure corosync to use ipv6 (instead of ipv4)
--token <timeout> sets time in milliseconds until a token loss is
declared after not receiving a token (default 1000 ms)
--token_coefficient <timeout> sets time in milliseconds used for clusters
with at least 3 nodes as a coefficient for real token timeout calculation
(token + (number_of_nodes - 2) * token_coefficient) (default 650 ms)
--join <timeout> sets time in milliseconds to wait for join messages
(default 50 ms)
--consensus <timeout> sets time in milliseconds to wait for consensus
to be achieved before starting a new round of membership configuration
(default 1200 ms)
--miss_count_const <count> sets the maximum number of times on
receipt of a token a message is checked for retransmission before
a retransmission occurs (default 5 messages)
--fail_recv_const <failures> specifies how many rotations of the token
without receiving any messages when messages should be received
may occur before a new configuration is formed (default 2500 failures)
Configuring Redundant Ring Protocol (RRP)
When using udpu (the default) specifying nodes, specify the ring 0
address first followed by a ',' and then the ring 1 address.
Example: pcs cluster setup --name cname nodeA-0,nodeA-1 nodeB-0,nodeB-1
When using udp, using --addr0 and --addr1 will allow you to configure
rrp mode for corosync. It's recommended to use a network (instead of
IP address) for --addr0 and --addr1 so the same corosync.conf file can
be used around the cluster. --mcast0 defaults to 239.255.1.1 and
--mcast1 defaults to 239.255.2.1, --mcastport0/1 default to 5405 and
ttl defaults to 1. If --broadcast is specified, --mcast0/1,
--mcastport0/1 & --ttl0/1 are ignored.
start [--all] [node] [...]
Start corosync & pacemaker on specified node(s), if a node is not
specified then corosync & pacemaker are started on the local node.
If --all is specified then corosync & pacemaker are started on all
nodes.
stop [--all] [node] [...]
Stop corosync & pacemaker on specified node(s), if a node is not
specified then corosync & pacemaker are stopped on the local node.
If --all is specified then corosync & pacemaker are stopped on all
nodes.
kill
Force corosync and pacemaker daemons to stop on the local node
(performs kill -9).
enable [--all] [node] [...]
Configure corosync & pacemaker to run on node boot on specified
node(s), if node is not specified then corosync & pacemaker are
enabled on the local node. If --all is specified then corosync &
pacemaker are enabled on all nodes.
disable [--all] [node] [...]
Configure corosync & pacemaker to not run on node boot on specified
node(s), if node is not specified then corosync & pacemaker are
disabled on the local node. If --all is specified then corosync &
pacemaker are disabled on all nodes. (Note: this is the default after
installation)
standby [<node>] | --all
Put specified node into standby mode (the node specified will no longer
be able to host resources), if no node or options are specified the
current node will be put into standby mode, if --all is specified all
nodes will be put into standby mode.
unstandby [<node>] | --all
Remove node from standby mode (the node specified will now be able to
host resources), if no node or options are specified the current node
will be removed from standby mode, if --all is specified all nodes will
be removed from standby mode.
remote-node add <hostname> <resource id> [options]
Enables the specified resource as a remote-node resource on the
specified hostname (hostname should be the same as 'uname -n')
remote-node remove <hostname>
Disables any resources configured to be remote-node resource on the
specified hostname (hostname should be the same as 'uname -n')
status
View current cluster status (an alias of 'pcs status cluster')
pcsd-status [node] [...]
Get current status of pcsd on nodes specified, or on all nodes
configured in corosync.conf if no nodes are specified
sync
Sync corosync configuration to all nodes found from current
corosync.conf file (cluster.conf on systems running Corosync 1.x)
quorum unblock
Cancel waiting for all nodes when establishing quorum. Useful in
situations where you know the cluster is inquorate, but you are
confident that the cluster should proceed with resource management
regardless.
cib [filename] [scope=<scope> | --config]
Get the raw xml from the CIB (Cluster Information Base). If a
filename is provided, we save the cib to that file, otherwise the cib
is printed. Specify scope to get a specific section of the CIB. Valid
values of the scope are: configuration, nodes, resources, constraints,
crm_config, rsc_defaults, op_defaults, status. --config is the same
as scope=configuration. Use of --config is recommended. Do not specify
a scope if you need to get the whole CIB or be warned in the case
of outdated CIB on cib-push.
cib-push <filename> [scope=<scope> | --config]
Push the raw xml from <filename> to the CIB (Cluster Information Base).
Specify scope to push a specific section of the CIB. Valid values
of the scope are: configuration, nodes, resources, constraints,
crm_config, rsc_defaults, op_defaults. --config is the same as
scope=configuration. Use of --config is recommended. Do not specify
a scope if you need to push the whole CIB or be warned in the case
of outdated CIB.
cib-upgrade
Upgrade the cib to the latest version
edit [scope=<scope> | --config]
Edit the cib in the editor specified by the $EDITOR environment
variable and push out any changes upon saving. Specify scope to edit
a specific section of the CIB. Valid values of the scope are:
configuration, nodes, resources, constraints, crm_config, rsc_defaults,
op_defaults. --config is the same as scope=configuration. Use of
--config is recommended. Do not specify a scope if you need to edit
the whole CIB or be warned in the case of outdated CIB.
node add <node[,node-altaddr]> [--start] [--enable]
Add the node to corosync.conf and corosync on all nodes in the cluster
and sync the new corosync.conf to the new node. If --start is specified
also start corosync/pacemaker on the new node, if --enable is specified
enable corosync/pacemaker on new node.
When using Redundant Ring Protocol (RRP) with udpu transport, specify
the ring 0 address first followed by a ',' and then the ring 1 address.
node remove <node>
Shutdown specified node and remove it from pacemaker and corosync on
all other nodes in the cluster
uidgid
List the current configured uids and gids of users allowed to connect
to corosync
uidgid add [uid=<uid>] [gid=<gid>]
Add the specified uid and/or gid to the list of users/groups
allowed to connect to corosync
uidgid rm [uid=<uid>] [gid=<gid>]
Remove the specified uid and/or gid from the list of users/groups
allowed to connect to corosync
corosync [node]
Get the corosync.conf from the specified node or from the current node
if node not specified
reload corosync
Reload the corosync configuration on the current node
destroy [--all]
Permanently destroy the cluster on the current node, killing all
corosync/pacemaker processes removing all cib files and the
corosync.conf file. Using --all will attempt to destroy the
cluster on all nodes configure in the corosync.conf file.
WARNING: This command permanently removes any cluster configuration that
has been created. It is recommended to run 'pcs cluster stop' before
destroying the cluster.
verify [-V] [filename]
Checks the pacemaker configuration (cib) for syntax and common
conceptual errors. If no filename is specified the check is
performed on the currently running cluster. If -V is used
more verbose output will be printed
report [--from "YYYY-M-D H:M:S" [--to "YYYY-M-D H:M:S"]] dest
Create a tarball containing everything needed when reporting cluster
problems. If --from and --to are not used, the report will include
the past 24 hours.
"""
if pout:
print sub_usage(args, output)
else:
return output
def stonith(args = [], pout = True):
output = """
Usage: pcs stonith [commands]...
Configure fence devices for use with pacemaker
Commands:
show [stonith id] [--full]
Show all currently configured stonith devices or if a stonith id is
specified show the options for the configured stonith device. If
--full is specified all configured stonith options will be displayed
list [filter] [--nodesc]
Show list of all available stonith agents (if filter is provided then
only stonith agents matching the filter will be shown). If --nodesc is
used then descriptions of stonith agents are not printed.
describe <stonith agent>
Show options for specified stonith agent
create <stonith id> <stonith device type> [stonith device options]
Create stonith device with specified type and options
update <stonith id> [stonith device options]
Add/Change options to specified stonith id
delete <stonith id>
Remove stonith id from configuration
cleanup [<stonith id>]
Cleans up the stonith device in the lrmd (useful to reset the
status and failcount). This tells the cluster to forget the
operation history of a stonith device and re-detect its current state.
This can be useful to purge knowledge of past failures that have
since been resolved. If a stonith id is not specified then all
resources/stonith devices will be cleaned up.
level
Lists all of the fencing levels currently configured
level add <level> <node> <devices>
Add the fencing level for the specified node with a comma separated
list of devices (stonith ids) to attempt for that node at that level.
Fence levels are attempted in numerical order (starting with 1) if
a level succeeds (meaning all devices are successfully fenced in that
level) then no other levels are tried, and the node is considered
fenced.
level remove <level> [node id] [stonith id] ... [stonith id]
Removes the fence level for the level, node and/or devices specified.
If no nodes or devices are specified then the fence level is removed
level clear [node|stonith id(s)]
Clears the fence levels on the node (or stonith id) specified or clears
all fence levels if a node/stonith id is not specified. If more than
one stonith id is specified they must be separated by a comma and no
spaces. Example: pcs stonith level clear dev_a,dev_b
level verify
Verifies all fence devices and nodes specified in fence levels exist
fence <node> [--off]
Fence the node specified (if --off is specified, use the 'off' API
call to stonith which will turn the node off instead of rebooting it)
confirm <node>
Confirm that the host specified is currently down.
WARNING: if this node is not actually down data corruption/cluster
failure can occur.
Examples:
pcs stonith create MyStonith fence_virt pcmk_host_list=f1
"""
if pout:
print sub_usage(args, output)
else:
return output
def property(args = [], pout = True):
output = """
Usage: pcs property <properties>...
Configure pacemaker properties
Commands:
list|show [<property> | --all | --defaults]
List property settings (default: lists configured properties).
If --defaults is specified will show all property defaults, if --all
is specified, current configured properties will be shown with unset
properties and their defaults.
Run 'man pengine' and 'man crmd' to get a description of the properties.
set [--force] [--node <nodename>] <property>=[<value>]
Set specific pacemaker properties (if the value is blank then the
property is removed from the configuration). If a property is not
recognized by pcs the property will not be created unless the
--force is used. If --node is used a node attribute is set on
the specified node.
Run 'man pengine' and 'man crmd' to get a description of the properties.
unset [--node <nodename>] <property>
Remove property from configuration (or remove attribute from
specified node if --node is used).
Run 'man pengine' and 'man crmd' to get a description of the properties.
Examples:
pcs property set stonith-enabled=false
"""
if pout:
print sub_usage(args, output)
else:
return output
def constraint(args = [], pout = True):
output = """
Usage: pcs constraint [constraints]...
Manage resource constraints
Commands:
[list|show] --full
List all current location, order and colocation constraints, if --full
is specified also list the constraint ids.
location <resource id> prefers <node[=score]>...
Create a location constraint on a resource to prefer the specified
node and score (default score: INFINITY)
location <resource id> avoids <node[=score]>...
Create a location constraint on a resource to avoid the specified
node and score (default score: INFINITY)
location <resource id> rule [id=<rule id>] [resource-discovery=<option>]
[role=master|slave] [constraint-id=<id>]
[score=<score>|score-attribute=<attribute>] <expression>
Creates a location rule on the specified resource where the expression
looks like one of the following:
defined|not_defined <attribute>
<attribute> lt|gt|lte|gte|eq|ne [string|integer|version] <value>
date gt|lt <date>
date in_range <date> to <date>
date in_range <date> to duration <duration options>...
date-spec <date spec options>...
<expression> and|or <expression>
( <expression> )
where duration options and date spec options are: hours, monthdays,
weekdays, yeardays, months, weeks, years, weekyears, moon.
If score is omitted it defaults to INFINITY. If id is omitted one is
generated from the resource id. If resource-discovery is omitted it
defaults to 'always'.
location show [resources|nodes [node id|resource id]...] [--full]
List all the current location constraints, if 'resources' is specified
location constraints are displayed per resource (default), if 'nodes'
is specified location constraints are displayed per node. If specific
nodes or resources are specified then we only show information about
them. If --full is specified show the internal constraint id's as well.
location add <id> <resource name> <node> <score> [resource-discovery=<option>]
Add a location constraint with the appropriate id, resource name,
node name and score. (For more advanced pacemaker usage)
location remove <id> [<resource name> <node> <score>]
Remove a location constraint with the appropriate id, resource name,
node name and score. (For more advanced pacemaker usage)
order show [--full]
List all current ordering constraints (if --full is specified show
the internal constraint id's as well).
order [action] <resource id> then [action] <resource id> [options]
Add an ordering constraint specifying actions (start, stop, promote,
demote) and if no action is specified the default action will be
start.
Available options are kind=Optional/Mandatory/Serialize,
symmetrical=true/false, require-all=true/false and id=<constraint-id>.
order set <resource1> <resource2> [resourceN]... [options] [set
<resourceX> <resourceY> ... [options]]
[setoptions [constraint_options]]
Create an ordered set of resources.
Available options are sequential=true/false, require-all=true/false,
action=start/promote/demote/stop and role=Stopped/Started/Master/Slave.
Available constraint_options are id=<constraint-id>,
kind=Optional/Mandatory/Serialize and symmetrical=true/false.
order remove <resource1> [resourceN]...
Remove resource from any ordering constraint
colocation show [--full]
List all current colocation constraints (if --full is specified show
the internal constraint id's as well).
colocation add [master|slave] <source resource id> with [master|slave]
<target resource id> [score] [options] [id=constraint-id]
Request <source resource> to run on the same node where pacemaker has
determined <target resource> should run. Positive values of score
mean the resources should be run on the same node, negative values
mean the resources should not be run on the same node. Specifying
'INFINITY' (or '-INFINITY') for the score forces <source resource> to
run (or not run) with <target resource>. (score defaults to "INFINITY")
A role can be master or slave (if no role is specified, it defaults to
'started').
colocation set <resource1> <resource2> [resourceN]... [options]
[set <resourceX> <resourceY> ... [options]]
[setoptions [constraint_options]]
Create a colocation constraint with a resource set.
Available options are sequential=true/false, require-all=true/false,
action=start/promote/demote/stop and role=Stopped/Started/Master/Slave.
Available constraint_options are id, score, score-attribute and
score-attribute-mangle.
colocation remove <source resource id> <target resource id>
Remove colocation constraints with <source resource>
remove [constraint id]...
Remove constraint(s) or constraint rules with the specified id(s)
ref <resource>...
List constraints referencing specified resource
rule add <constraint id> [id=<rule id>] [role=master|slave]
[score=<score>|score-attribute=<attribute>] <expression>
Add a rule to a constraint where the expression looks like one of
the following:
defined|not_defined <attribute>
<attribute> lt|gt|lte|gte|eq|ne [string|integer|version] <value>
date gt|lt <date>
date in_range <date> to <date>
date in_range <date> to duration <duration options>...
date-spec <date spec options>...
<expression> and|or <expression>
( <expression> )
where duration options and date spec options are: hours, monthdays,
weekdays, yeardays, months, weeks, years, weekyears, moon
If score is omitted it defaults to INFINITY. If id is omitted one is
generated from the constraint id.
rule remove <rule id>
Remove a rule if a rule id is specified, if rule is last rule in its
constraint, the constraint will be removed
"""
if pout:
print sub_usage(args, output)
else:
return output
def acl(args = [], pout = True):
output = """
Usage: pcs acl [commands]...
View and modify current cluster access control lists
Commands:
[show]
List all current access control lists
enable
Enable access control lists
disable
Disable access control lists
role create <role name> [description=<description>] [((read | write | deny)
(xpath <query> | id <id>))...]
Create a role with the name and (optional) description specified.
Each role can also have an unlimited number of permissions
(read/write/deny) applied to either an xpath query or the id
of a specific element in the cib
role delete <role name>
Delete the role specified and remove it from any users/groups it was
assigned to
role assign <role name> [to] <username/group>
Assign a role to a user or group already created with 'pcs acl
user/group create'
role unassign <role name> [from] <username/group>
Remove a role from the specified user
user create <username> <role name> [<role name>]...
Create an ACL for the user specified and assign roles to the user
user delete <username>
Remove the user specified (and roles assigned will be unassigned for
the specified user)
group create <group> <role name> [<role name>]...
Create an ACL for the group specified and assign roles to the group
group delete <group>
Remove the group specified (and roles assigned will be unassigned for
the specified group)
permission add <role name> ((read | write | deny) (xpath <query> |
id <id>))...
Add the listed permissions to the role specified
permission delete <permission id>
Remove the permission id specified (permission id's are listed in
parenthesis after permissions in 'pcs acl' output)
"""
if pout:
print sub_usage(args, output)
else:
return output
def status(args = [], pout = True):
output = """
Usage: pcs status [commands]...
View current cluster and resource status
Commands:
[status] [--full]
View all information about the cluster and resources (--full provides
more details)
resources
View current status of cluster resources
groups
View currently configured groups and their resources
cluster
View current cluster status
corosync
View current membership information as seen by corosync
nodes [corosync|both|config]
View current status of nodes from pacemaker. If 'corosync' is
specified, print nodes currently configured in corosync, if 'both'
is specified, print nodes from both corosync & pacemaker. If 'config'
is specified, print nodes from corosync & pacemaker configuration.
pcsd <node> ...
Show the current status of pcsd on the specified nodes
xml
View xml version of status (output from crm_mon -r -1 -X)
"""
if pout:
print sub_usage(args, output)
else:
return output
def config(args=[], pout=True):
output = """
Usage: pcs config [commands]...
View and manage cluster configuration
Commands:
[show]
View full cluster configuration
backup [filename]
Creates the tarball containing the cluster configuration files.
If filename is not specified the standard output will be used.
restore [--local] [filename]
Restores the cluster configuration files on all nodes from the backup.
If filename is not specified the standard input will be used.
If --local is specified only the files on the current node will
be restored.
checkpoint
List all available configuration checkpoints.
checkpoint view <checkpoint_number>
Show specified configuration checkpoint.
checkpoint restore <checkpoint_number>
Restore cluster configuration to specified checkpoint.
import-cman output=<filename> [input=<filename>] [--interactive]
[output-format=corosync.conf|cluster.conf]
Converts CMAN cluster configuration to Pacemaker cluster configuration.
Converted configuration will be saved to 'output' file. To send
the configuration to the cluster nodes the 'pcs config restore'
command can be used. If --interactive is specified you will be
prompted to solve incompatibilities manually. If no input is specified
/etc/cluster/cluster.conf will be used. You can force the output to contain
either cluster.conf or corosync.conf by using the output-format option.
"""
if pout:
print sub_usage(args, output)
else:
return output
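# Illustrative command lines for the config usage text above (an informal
# note, not part of the original file; the file names are made up):
#   pcs config backup mycluster-backup.tar.bz2
#   pcs config restore mycluster-backup.tar.bz2
#   pcs config import-cman output=corosync.conf input=/etc/cluster/cluster.conf --interactive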
def pcsd(args=[], pout=True):
output = """
Usage: pcs pcsd [commands]...
Manage pcs daemon
Commands:
certkey <certificate file> <key file>
Load custom certificate and key files for use in pcsd.
sync-certificates
Sync pcsd certificates to all nodes found from current corosync.conf
file (cluster.conf on systems running Corosync 1.x). WARNING: This will
restart pcsd daemon on the nodes.
"""
if pout:
print sub_usage(args, output)
else:
return output
|
Upcoming February Events With The Reuseum!
Now that we are finally moved in and organized, the swarms of classes will start again! We are so ready, in fact, that we have scheduled a handful of events that our people at the Reuseum will be participating in to spread the word about the store! Below are just some of the events we have planned for the month of February; we hope everyone has a chance to attend!
The Reuseum will be participating in the Engineering & Science Festival on the Boise State University Campus. We will be running a drop-in display with Bristlebots in the Bergquist Lounge. The display will run throughout the day, so people can drop by at any time to participate! The Engineering & Science Festival is a free event for all ages with a wide variety of engaging activities designed for K-12 students and their families.
We will be participating in the Idaho Business & Technology Expo at The Riverside Hotel! This event brings local technology businesses from Idaho together to network and spread the word about what they do! If anyone is interested in attending, we ask that you stop by our display and say hello!
The Idaho Business & Technology Expo is celebrating 25 years of connecting leaders to resources, creating lifetime relationships and unrivaled solutions, making this a must-attend event for all local businesses.
The Reuseum will be participating as a vendor at the 2016 Kids Fair at Expo Idaho! We will be showing off our 3D printer, as well as what you can expect when stopping by the Reuseum! The Kids Fair offers kids and families the unique opportunity to experience interactive activities at every booth. The day will be filled with the best in exciting entertainment, interactive activities and games for all. The Kids Fair features exhibit and activity areas offering products and services that enhance, enrich and impact children’s lives.
Engineering & Science Festival at Boise State University! |
# Copyright 2011 OpenStack LLC.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from nova import context
from nova import db
from nova import log as logging
LOG = logging.getLogger(__name__)
def notify(message):
"""Look for specific compute manager events and interprete them
so as to keep the Capacity table up to date.
NOTE: the True/False return codes are only for testing.
"""
# The event_type must start with 'compute.instance.'
event_type = message.get('event_type', None)
preamble = 'compute.instance.'
if not event_type or not event_type.startswith(preamble):
return False
# Events we're interested in end with .start and .end
event = event_type[len(preamble):]
parts = event.split('.')
suffix = parts[-1].lower()
event = event[:(-len(suffix) - 1)]
if suffix not in ['start', 'end']:
return False
started = suffix == 'start'
ended = suffix == 'end'
if started and event == 'create':
# We've already updated this stuff in the scheduler. Don't redo the
# work here.
return False
work = 1 if started else -1
# Extract the host name from the publisher id ...
publisher_preamble = 'compute.'
publisher = message.get('publisher_id', None)
if not publisher or not publisher.startswith(publisher_preamble):
return False
host = publisher[len(publisher_preamble):]
# If we deleted an instance, make sure we reclaim the resources.
# We may need to do something explicit for rebuild/migrate.
free_ram_mb = 0
free_disk_gb = 0
vms = 0
if ended and event == 'delete':
vms = -1
payload = message.get('payload', {})
free_ram_mb = payload.get('memory_mb', 0)
free_disk_gb = payload.get('disk_gb', 0)
LOG.debug("EventType=%(event_type)s -> host %(host)s: "
"ram %(free_ram_mb)d, disk %(free_disk_gb)d, "
"work %(work)d, vms%(vms)d" % locals())
db.api.compute_node_utilization_update(context.get_admin_context(), host,
free_ram_mb_delta=free_ram_mb, free_disk_gb_delta=free_disk_gb,
work_delta=work, vm_delta=vms)
return True
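# Minimal usage sketch (an assumption, not part of the original module): a
# 'compute.instance.delete.end' message for a hypothetical host, of the shape
# the function above expects. Calling notify() on it would subtract one VM
# plus its RAM and disk from the capacity table for 'host1'. Left commented
# out because it needs a configured database.
#
# sample_message = {
#     'event_type': 'compute.instance.delete.end',
#     'publisher_id': 'compute.host1',
#     'payload': {'memory_mb': 2048, 'disk_gb': 20},
# }
# notify(sample_message)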
|
I wonder what the seagull is thinking; he/she looks so intent on what is out there on the horizon. Love the close up.
I bet they dare not poo anywhere in MC, otherwise they might get banned. |
# -*- coding: utf-8 -*-
# System Imports
from xml.dom.minidom import parseString
# Flask Imports
from flask import render_template, redirect
from flask import url_for, request, send_from_directory, session
from werkzeug import secure_filename
# Local Imports
from app import app
from app.forms import *
# Package Imports
from decorators import login_required
from models import db
# Project dashboard (move to a dashboard.py file!!)
@app.route('/opengrow/<id>', methods=['GET', 'POST'])
@login_required
def opengrow(id):
print "Project "+str(id)+" selected"
return render_template('opengrow.html', title='OpenGrow', project_id=id)
# Photo history (move to a dashboard.py file!!)
@app.route('/stream', methods=['GET', 'POST'])
@login_required
def stream():
return render_template('camera.html', title='OpenGrow')
# Future settings page
@app.route('/settings', methods=['GET', 'POST'])
@login_required
def settings():
return render_template('settings.html', title='OpenGrow', settings=globalsettings, form=form, ips=ips)
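# Note (assumption): 'globalsettings', 'form' and 'ips' are not defined in
# this module; they are presumably provided elsewhere in the app (e.g. built
# from the imported forms/models), otherwise this view raises a NameError.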
|
Retail at $2.50 to $3.50 each, and you can see the potential for numerous busy Belgian Waffle Bakers! The dessert, with a slab of ice cream, strawberry syrup and whipped cream, has a food cost of about 30¢ against a selling price of $2.50. Also great for use on a breakfast buffet.
#!/usr/bin/env python
"""The package's classes
Slight Fimulator - Flight simulator in Python
Copyright (C) 2017, 2018 Hao Tian and Adrien Hopkins
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program. If not, see <http://www.gnu.org/licenses/>.
"""
# Installs Python 3 division and print behaviour
from __future__ import division, print_function
import math
import os
import time
import pygame
PATH = os.path.dirname(os.path.realpath(__file__))
class Airplane(pygame.sprite.Sprite):
"""The class for an airplane sprite.
All units are stored internally in SI base units
"""
NEXT_ID = 0
MAX_SPEED = 500
TERMINAL_VELOCITY = MAX_SPEED / 5 # Why not?
LABELS = "ID:\tX:\tY:\tALT:\tSPD:\tACCEL:\tVSPD:\t\
HDG:\tROLL:\tPITCH:\tPTS:\tDMG:\t"
def __init__(self, x=(0, 0, 0, 0, 0), z=None, width=None,
height=None, altitude=None, player_id=None):
"""Initialize the instance."""
super(Airplane, self).__init__()
if z is None:
x, z, width, height, altitude = x
elif width is None:
altitude = z
x, z, width, height = x
elif height is None:
altitude = width
width, height = z
x, z = x
if player_id is None: # Get an ID for the airplane
self._id = Airplane.NEXT_ID
Airplane.NEXT_ID += 1
else: self._id = player_id
# Initialize private variables
self._pos = [x, z]
self._size = [width, height]
self._altitude = altitude
self._heading = 0
self._pitch = 0
self._speed = 0
self._acceleration = 0
self._gravity = 0
self._throttle = 0
self._roll_level = 0
self._vertical_roll_level = 0
self._autopilot_info = {
'enabled': False,
'conditions': {
'roll-centered': True,
'vertical-roll-centered': True,
'throttle-centered': True
}
}
self._within_objective_range = False
self._points = 0
self._exit_code = 0
self._health = 100
self._time = time.time()
def __repr__(self, show_labels=True):
"""Display some important stats about the plane."""
msg = ("%i\t%i\t%i\t%i\t%.1f\t%.1f\t%.1f\t%.1f\t%.1f\t%.1f\t\
%i\t%.1f\t" % (self.id_, self.x, self.z,
self.altitude, self.speed, self.acceleration,
self.vertical_velocity, self.heading, self.roll,
self.pitch, self.points, 100 - self.health))
if show_labels:
return "%s\n%s" % (Airplane.LABELS, msg)
else:
return msg
## variables
@property
def id_(self):
"""Get the plane's ID."""
return self._id
@property
def pos(self):
"""Get the plane's (x, z) position in metres."""
return self._pos
@pos.setter
def pos(self, new_value):
"""Set the plane's (x, z) position in metres."""
if not isinstance(new_value, (list, tuple)):
raise TypeError("Position must be a list or a tuple.")
if len(new_value) != 2:
raise ValueError("Position must contain two values.")
if not isinstance(new_value[0], (int, float)):
raise ValueError("X must be a number.")
if not isinstance(new_value[1], (int, float)):
raise ValueError("Z must be a number.")
self._pos = new_value
@property
def x(self):
"""Get the plane's x coordinate in metres."""
return self._pos[0]
@x.setter
def x(self, new_value):
"""Set the plane's x coordinate in metres."""
if not isinstance(new_value, (int, float)):
raise ValueError("X must be a number")
self._pos[0] = new_value
@property
def z(self):
"""Get the plane's z coordinate in metres."""
return self._pos[1]
@z.setter
def z(self, new_value):
"""Set the plane's z coordinate in metres."""
if not isinstance(new_value, (int, float)):
raise ValueError("Z must be a number")
self._pos[1] = new_value
@property
def altitude(self):
"""Get the plane's altitude in metres."""
return self._altitude
@altitude.setter
def altitude(self, new_value):
"""Set the plane's altitude in metres."""
if not isinstance(new_value, (int, float)):
raise TypeError("Altitude must be a number.")
self._altitude = new_value
y = altitude
@property
def heading(self):
"""Get the plane's heading in radians."""
return self._heading
@heading.setter
def heading(self, new_value):
"""Set the plane's heading in radians."""
if not isinstance(new_value, (int, float)):
raise TypeError("Heading must be a number.")
new_value %= math.pi * 2
self._heading = new_value
@property
def heading_degrees(self):
"""Get the plane's heading in degrees."""
return math.degrees(self.heading)
@heading_degrees.setter
def heading_degrees(self, new_value):
"""Set the plane's heading in degrees."""
self.heading = math.radians(new_value)
@property
def pitch(self):
"""Get the plane's pitch in radians."""
return self._pitch
@pitch.setter
def pitch(self, new_value):
"""Set the plane's pitch in radians."""
if not isinstance(new_value, (int, float)):
raise TypeError("Pitch must be a number.")
self._pitch = new_value
@property
def pitch_degrees(self):
"""Get the plane's pitch in degrees."""
return math.degrees(self._pitch)
@pitch_degrees.setter
def pitch_degrees(self, new_value):
"""Set the plane's pitch in degrees."""
self.pitch = math.radians(new_value)
@property
def speed(self):
"""Get the plane's speed in m/s."""
return self._speed
@speed.setter
def speed(self, new_value):
"""Set the plane's speed in m/s."""
if not isinstance(new_value, (int, float)):
raise TypeError("Speed must be a number.")
self._speed = new_value
@property
def horizontal_velocity(self):
"""Get the plane's horizontal speed in m/s."""
return self.speed * math.cos(self.pitch)
horizontal_speed = horizontal_velocity
@property
def vertical_velocity(self):
"""Get the plane's vertical speed in m/s."""
return self.speed * math.sin(self.pitch)
@property
def gravity(self):
"""Get the plane's gravity-caused vertical speed drop in m/s."""
return self._gravity
@gravity.setter
def gravity(self, new_value):
"""Set the plane's gravity-caused vertical speed drop in m/s."""
if not isinstance(new_value, (int, float)):
raise TypeError("Gravity must be a number.")
self._gravity = new_value
@property
def total_vertical_velocity(self):
"""Get the plane's total vertical speed in m/s."""
return self.vertical_velocity - self.gravity
@property
def acceleration(self):
"""Get the plane's acceleration in m/s."""
return self._acceleration
@acceleration.setter
def acceleration(self, new_value):
"""Set the plane's acceleration in m/s."""
if not isinstance(new_value, (int, float)):
raise ValueError("Acceleration must be a number")
self._acceleration = new_value
@property
def throttle(self):
"""Get the plane's throttle in m/s."""
return self._throttle
@throttle.setter
def throttle(self, new_value):
"""Set the plane's throttle in m/s."""
if not isinstance(new_value, (int, float)):
raise ValueError("Throttle must be a number")
if new_value < 0:
new_value = 0
elif new_value > 100:
new_value = 100
self._throttle = new_value
@property
def roll(self):
"""Get the plane's horizontal roll in radians."""
return math.radians(self.roll_degrees)
@property
def roll_degrees(self):
"""Get the plane's horizontal roll in degrees."""
return ((35/198) * self._roll_level**3 + (470/99)
* self._roll_level)
@property
def roll_level(self):
"""Get the plane's horizontal roll level."""
return self._roll_level
@roll_level.setter
def roll_level(self, new_value):
"""Set the plane's horizontal roll level."""
if not isinstance(new_value, (int, float)):
raise TypeError("Roll Level must be a number.")
if new_value < -4:
new_value = -4
elif new_value > 4:
new_value = 4
self._roll_level = new_value
@property
def vertical_roll_level(self):
"""Get the plane's vertical roll level."""
return self._vertical_roll_level
@vertical_roll_level.setter
def vertical_roll_level(self, new_value):
"""Set the plane's vertical roll level."""
if not isinstance(new_value, (int, float)):
raise TypeError("Vertical Roll Level must be a number.")
if new_value < -4:
new_value = -4
elif new_value > 4:
new_value = 4
self._vertical_roll_level = new_value
@property
def autopilot_enabled(self):
"""Get the plane's autopilot's status."""
if not self._autopilot_info['enabled']:
return False
else: # See if the autopilot can be disabled
if abs(self.roll_level) < 0.1:
self.roll_level = 0
self._autopilot_info['conditions'][
'roll-centered'] = True
if abs(self.vertical_roll_level) < 0.1:
self.vertical_roll_level = 0
self._autopilot_info['conditions'][
'vertical-roll-centered'] = True
if abs(50 - self.throttle) < 1:
self.throttle = 50
self._autopilot_info['conditions'][
'throttle-centered'] = True
if all(self._autopilot_info['conditions'].values()):
self._autopilot_info['enabled'] = False
return self._autopilot_info['enabled']
@property
def health(self):
"""Get the plane's health."""
return self._health
@health.setter
def health(self, new_value):
"""Set the plane's health."""
if not isinstance(new_value, (int, float)):
raise TypeError("Health must be a number.")
self._health = new_value
@property
def damage(self):
"""Get the plane's damage."""
return 100-self._health
@property
def points(self):
"""Get the plane's score."""
return self._points
@points.setter
def points(self, new_value):
"""Set the plane's score."""
if not isinstance(new_value, (int, float)):
raise TypeError("Score must be a number.")
self._points = new_value
score = points # score is an alias for points.
@property
def image(self):
"""Get the plane's image."""
return self._image
@property
def rect(self):
"""Get the plane's rect."""
return pygame.rect.Rect(self._pos, self._size)
def enable_autopilot(self):
"""Enable the autopilot."""
self._autopilot_info['enabled'] = True
for condition in self._autopilot_info['conditions']:
self._autopilot_info['conditions'][condition] = False
def draw(self, client, airspace):
"""Draw the airplane."""
image = pygame.transform.rotate(
client.scaled_images['navmarker'], -self.heading_degrees)
draw_rect = image.get_rect()
draw_rect.center = (
self.x / airspace.width * client.airspace_rect.width
+ client.airspace_rect.left,
self.z / airspace.height * client.airspace_rect.height
+ client.airspace_rect.top
)
client.screen.blit(image, draw_rect)
def update(self):
"""Update the plane."""
tick_duration = time.time() - self._time
self._time = time.time()
# initialize damage
damage = 0
# stall and gravity
if self.speed <= (self.MAX_SPEED / 5):
max_vert_roll = max((self.speed-(self.MAX_SPEED / 10))
/ (self.MAX_SPEED / 40), 0)
else: max_vert_roll = 4
self.gravity += (((self.MAX_SPEED / 10 - self.speed)
/ self.MAX_SPEED * self.TERMINAL_VELOCITY)
- (self.gravity ** 2
/ (self.TERMINAL_VELOCITY ** 2 / 10)))
if self.gravity < 0:
self.gravity = 0
if self.altitude <= 0.1:
self.gravity = 0
# get heading and pitch
self.heading += (self.roll * tick_duration)
if self.vertical_roll_level > max_vert_roll:
self.vertical_roll_level = max_vert_roll
self.pitch_degrees = self.vertical_roll_level * 10
# acceleration
self.acceleration = (self.throttle**2 / 250
- self.speed**2 * 40 / self.MAX_SPEED**2)
self.speed += (self.acceleration * tick_duration)
# move plane
hspeed = self.horizontal_speed * tick_duration
vspeed = self.total_vertical_velocity * tick_duration
self.x += math.sin(self.heading) * hspeed
self.z -= math.cos(self.heading) * hspeed
self.altitude += vspeed
if self.altitude < 0.1:
self.altitude = 0
# overspeed damage
if self.speed > self.MAX_SPEED * 0.75:
damage += ((self.speed - self.MAX_SPEED*0.75) ** 2
/ (self.MAX_SPEED**2*10) * tick_duration)
if self._throttle > 75:
damage += (self._throttle - 75) ** 2 / 1000 * tick_duration
# autopilot
if self.autopilot_enabled:
self.roll_level *= (0.5 ** tick_duration)
self.vertical_roll_level *= (0.5 ** tick_duration)
self._throttle = 50 + (self.throttle-50) * (
0.5 ** tick_duration)
# deal damage
self.health -= damage
# Function that approximates the 5, 10, 20, 30
# roll of Slight Fimulator 1.0
get_roll = lambda s, r: (35/198) * r**3 + (470/99) * r
get_pitch = lambda s, r: 10*r
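# Worked values for the cubic above (an illustrative note, not in the original
# source): roll levels 1, 2, 3 and 4 map to roughly 4.9, 10.9, 19.0 and 30.3
# degrees, which is how the 5/10/20/30 degree steps of version 1.0 are
# approximated by a single smooth curve.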
class Objective(pygame.sprite.Sprite):
"""The class for an objective sprite."""
NEXT_ID = 0
LABELS = "ID:\tX:\tY:\tALT:\t"
def __init__(self, x=(0, 0, 0, 0, 0), z=None, width=None,
height=None, altitude=None, obj_id=None):
"""Initialize the instance."""
super(Objective, self).__init__()
if z is None:
x, z, width, height, altitude = x
elif width is None:
altitude = z
x, z, width, height = x
elif height is None:
altitude = width
width, height = z
x, z = x
if obj_id is None: # Get an ID for the objective
self._id = Objective.NEXT_ID
Objective.NEXT_ID += 1
else: self._id = obj_id
# Initialize private variables
self._pos = [x, z]
self._size = [width, height]
self._altitude = altitude
def __repr__(self, show_labels=True):
"""Display some important stats about the objective."""
msg = "%i\t%i\t%i\t%i\t" % (self.id_, self.x, self.z,
self.altitude)
if show_labels:
return "%s\n%s" % (self.labels(), msg)
else:
return msg
@property
def id_(self):
"""Get the objective's ID."""
return self._id
@property
def pos(self):
"""Get the objective's (x, z) position in metres."""
return self._pos
@pos.setter
def pos(self, new_value):
"""Set the objective's (x, z) position in metres."""
if not isinstance(new_value, (list, tuple)):
raise TypeError("Position must be a list or a tuple.")
if len(new_value) != 2:
raise ValueError("Position must contain two values.")
if not isinstance(new_value[0], (int, float)):
raise ValueError("X must be a number.")
if not isinstance(new_value[1], (int, float)):
raise ValueError("Z must be a number.")
self._pos = new_value
@property
def x(self):
"""Get the objective's x coordinate in metres."""
return self._pos[0]
@x.setter
def x(self, new_value):
"""Set the objective's x coordinate in metres."""
if not isinstance(new_value, (int, float)):
raise ValueError("X must be a number")
self._pos[0] = new_value
@property
def z(self):
"""Get the objective's z coordinate in metres."""
return self._pos[1]
@z.setter
def z(self, new_value):
"""Set the objective's z coordinate in metres."""
if not isinstance(new_value, (int, float)):
raise ValueError("Z must be a number")
self._pos[1] = new_value
@property
def altitude(self):
"""Get the objective's altitude in metres."""
return self._altitude
@altitude.setter
def altitude(self, new_value):
"""Set the objective's altitude in metres."""
if not isinstance(new_value, (int, float)):
raise TypeError("Altitude must be a number.")
self._altitude = new_value
y = altitude
@property
def image(self):
"""Get the objective's image."""
return self._image
@property
def rect(self):
"""Get the plane's rect."""
return pygame.rect.Rect(self._pos, self._size)
def draw(self, client, airspace):
"""Draw the objective."""
draw_rect = client.scaled_images['objectivemarker'].get_rect()
draw_rect.center = (
self.x / airspace.width * client.airspace_rect.width
+ client.airspace_rect.left,
self.z / airspace.height * client.airspace_rect.height
+ client.airspace_rect.top
)
client.screen.blit(
client.scaled_images['objectivemarker'], draw_rect)
class AdvancedSpriteGroup(pygame.sprite.Group):
"""A Pygame sprite group, except you can index it."""
def __init__(self, *args, **kw):
"""Initialize the instance."""
super(AdvancedSpriteGroup, self).__init__(*args, **kw)
def __getitem__(self, key):
"""Get the sprite at key."""
for sprite in self:
if sprite.id_ == key:
return sprite
raise KeyError("Item {} not found.".format(key))
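if __name__ == '__main__':
    # Minimal usage sketch (an assumption; the real game drives these objects
    # from its client and airspace code): fly one plane for a single tick.
    plane = Airplane(0, 0, 10, 10, altitude=1000, player_id=0)
    plane.throttle = 75       # percent, clamped to 0-100 by the setter
    plane.roll_level = 2      # roughly an 11 degree bank (see get_roll above)
    time.sleep(0.05)          # let a little simulated time pass
    plane.update()            # integrates speed, heading, position and damage
    print(plane)              # __repr__ prints the labelled stat line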
|
Millionaire 2018 – Who will win all questions - Google Friv games at Friv.land!
Millionaire 2018 is a nerve-wracking game on Friv.land. You will test your knowledge, enjoy some happy moments and release all your stress when playing this game. The free Millionaire 2018 game is one of the good ways to exercise your brain and have fun in google friv. How about trying the first question?
Millionaire 2018 is easy to start playing. It is a one-player puzzle game. Do you know the American game show named Who Wants to Be a Millionaire? This game is based on that popular show. Imagine you are a player in the show. You have to hit the books to win this game at google friv games. A host will put questions to you and read out the rules of the game.
You must get through 15 multiple-choice questions. Each question you win brings you an amount of money, and the rewards keep increasing. The 15th question is worth 1 million dollars. How valuable it is! After the host reads a question, four answers appear on the screen of this google friv player game. Only one of the four is the right answer, so do your best to choose the right reply. After you click an answer, we will ask whether you are sure about your choice. Click Yes or No. It’s up to you.
If you answer wrongly, the game will end, but you can replay right away. You must think and give a response within 30 seconds; if you cannot give a reply within 30 seconds, you lose the game and begin a new turn on this google friv online game. We provide you with 2 lifelines: you can make a phone call to a friend to ask for a piece of advice, or survey the opinion of the audience. Use your understanding to face all the questions.
Millionaire 2018 is an online HTML5 game at http://www.friv.land/, playable in browsers such as Safari and Chrome. You can play the game on smartphones and tablets (iPhone, iPad, Samsung, Android devices and Windows Phone). If you think our game is fantastic, please introduce it to your friends. All your comments and ratings are warmly welcomed. Play many other amusing games such as WikiHow Game, Idiot Win and Handless Millionaire.
#------------------------------------------------------------------------------
# Copyright (c) 2005, Enthought, Inc.
# All rights reserved.
#
# This software is provided without warranty under the terms of the BSD
# license included in enthought/LICENSE.txt and may be redistributed only
# under the conditions described in the aforementioned license. The license
# is also available online at http://www.enthought.com/licenses/BSD.txt
# Thanks for using Enthought open source!
#
# Author: Enthought, Inc.
# Description: <Enthought pyface package component>
#------------------------------------------------------------------------------
""" A widget for editing Python code. """
# Enthought library imports.
from traits.api import Bool, Event, Instance, File, Interface, Unicode
from pyface.tasks.i_editor import IEditor
# Local imports.
from pyface.key_pressed_event import KeyPressedEvent
class IPythonEditor(IEditor):
""" A widget for editing Python code. """
#### 'IPythonEditor' interface ############################################
# The object being edited is a file.
obj = Instance(File)
# The pathname of the file being edited.
path = Unicode
# Should line numbers be shown in the margin?
show_line_numbers = Bool(True)
#### Events ####
# The contents of the editor have changed.
changed = Event
# A key has been pressed.
key_pressed = Event(KeyPressedEvent)
###########################################################################
# 'IPythonEditor' interface.
###########################################################################
def load(self, path=None):
""" Loads the contents of the editor. """
def save(self, path=None):
""" Saves the contents of the editor. """
def select_line(self, lineno):
""" Selects the specified line. """
|
Veronica Gonzales, a struggling Latina artist, is trying to find her place in the elite world of painting. Moved by her endless determination and her willingness not to become a high school art teacher, Veronica pushes herself forward in an attempt to overcome her artistic and personal life struggles, including the sudden death of her father.
This novel written by Vanessa Garcia embodies a dichotomy between the art and its creator. At times we use the world as our canvas, leaving brushstrokes along our path. Other times we are the canvas, and the world leaves its mark on us. The protagonist's endless attempts to shape her destiny are contrasted with the inevitable and painful loss which is bound to shape her outcome.
Written in a genuine and honest style, White Light is a story of artistic passion, love and grief. It is a novel that will make the reader contemplate the significance of his or her own life masterpiece, using only the hues we find along our journey.
Vanessa Garcia (Miami, Florida) is a multidisciplinary artist working as a novelist, playwright, and journalist. Her work has been published in the Los Angeles Times, Miami Herald and Washington Post. Her novel White Light was named one of the best books of 2015 by NPR.
You can purchase White Light through Shade Mountain Press. Click here to purchase. |
# Generated by Django 1.11.4 on 2017-08-20 14:47
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [("logs", "0001_initial")]
operations = [
migrations.AddField(
model_name="emaillog",
name="status",
field=models.CharField(
choices=[
(b"open", "Open"),
(b"ok", "Open"),
(b"spambounce", "Open"),
(b"softbounce", "Open"),
(b"hardbounce", "Open"),
(b"dropped", "Open"),
(b"deferred", "Deferred"),
(b"unknown", "Unknown"),
],
default=b"unknown",
max_length=20,
),
),
migrations.AddField(
model_name="emaillog",
name="to",
field=models.CharField(default="", max_length=255, verbose_name="To"),
preserve_default=False,
),
]
|
Our team has been through professional training. With skilled professional knowledge and a strong sense of service, we meet the service needs of customers for Oem Odm Alloy Case Watch, Mens Alloy Case Watch, OEM/ODM Alloy Case Watch, and build long-term business relationships with purchasers and users all over the world.
We pursue the management tenet of "Quality is superior, Service is supreme, Reputation is first", and will sincerely create and share success with all clients for Oem Odm Alloy Case Watch, Mens Alloy Case Watch, OEM/ODM Alloy Case Watch. You are welcome to visit our factory and our showroom, which displays various products that will meet your expectations. Meanwhile, it is convenient to visit our website, and our sales staff will try their best to provide you with the best service. Please contact us if you need more information. Our aim is to help customers realize their goals. We are making great efforts to achieve this win-win situation.
import pulsar as psr
def run_test():
tester=psr.PyTester("Testing MathSet and MathSet Iterator C++ Interface")
U1=psr.DoubleUniverse([1.0,2.0,3.0])
U2, U3=psr.DoubleUniverse([3.0,4.0,5.0]),psr.DoubleUniverse([3.0])
U4=psr.DoubleUniverse([9.0,51.0,100.0])
#Constructors
S1=psr.DoubleSet(U1,{2});S2=psr.DoubleSet(S1)
tester.test_equal("Constructor 1 and copy constructor work",S1,S2)
U1.insert(4.0);
tester.test_equal("Shallow copy of universe",S1,S2)
S3,S4=psr.DoubleSet(U2,True),psr.DoubleSet(U2,False)
tester.test_return("Constructor 2",True,False,S3.__eq__,S4)
B2,B3=psr.DoubleSet(S1,True),psr.DoubleSet(U1,True)
tester.test_return("Copy and fill works",True,True,B2.__eq__,B3)
S5=psr.DoubleSet(U2,{0,1,2})
tester.test_equal("Fill constructor works",S3,S5);
S8=psr.DoubleSet(S2.clone())
U1.insert(5.0)
tester.test_return("Clone is not aliased",True,False,S8.__eq__,S2)
#Access and manipulation
tester.test_return("Get universe works",True,U1,S1.get_universe)
tester.test_return("As universe",True,U3,S1.as_universe)
tester.test_return("Size works",True,1,S8.size)
tester.test_return("count element false",True,False,S2.count,15.0)
tester.test_return("count element true",True,True,S3.count,3.0)
tester.test_return("count index false",True,False,S2.count_idx,15)
tester.test_return("count index true",True,True,S3.count_idx,0)
vals=[3.0,4.0,5.0]
itercheck=[i for i in S5]
tester.test_equal("iterators work",vals,itercheck)
tester.test_return("idx valid",True,0,S3.idx,3.0)
tester.test_call("idx invalid",False,S8.idx,55.0)
S11=psr.DoubleSet(U1,{2,3})
tester.test_return("insert by valid elem",True,S11,S1.insert,4.0)
tester.test_call("insert by invalid elem",False,S1.insert,55.0)
tester.test_return("insert by valid index",True,S11,S2.insert_idx,3)
tester.test_call("insert by invalid index",False,S2.insert_idx,99)
S9=psr.DoubleSet(U1,{1,2,3})
S10=psr.DoubleSet(U1,{1,2,4})
S12=psr.DoubleSet(U1,{1,2})
S13=psr.DoubleSet(U1,{3})
S99=psr.DoubleSet(U4,{1})
tester.test_return("union",True,S9,S1.set_union,S12)
tester.test_call("union fail",False,S1.set_union,S99)
tester.test_return("union assign",True,S9,S1.union_assign,S12)
tester.test_call("union assign fail",False,S1.union_assign,S99)
tester.test_return("intersection",True,S12,S1.intersection,S10)
tester.test_call("intersection fail",False,S1.intersection,S99)
tester.test_return("intersection assign",True,S12,S1.intersection_assign,S10)
tester.test_call("intersection assign fail",False,S1.intersection_assign,S99)
tester.test_return("difference",True,S13,S2.difference,S12)
tester.test_call("difference fail",False,S2.difference,S99)
tester.test_return("difference assign",True,S13,S2.difference_assign,S12)
tester.test_call("difference assign fail",False,S2.difference_assign,S99)
S14=psr.DoubleSet(U1,True)
S14-=S2
tester.test_return("complement",True,S14,S2.complement);
#Set comparisons
tester.test_return("subset equal",True,True,S2.is_subset_of,S13)
tester.test_return("subset true",True,True,S2.is_subset_of,S9)
tester.test_return("subset false",True,False,S9.is_subset_of,S2)
tester.test_return("proper subset equal",True,False,S2.is_proper_subset_of,S13)
tester.test_return("proper subset true",True,True,S2.is_proper_subset_of,S9)
tester.test_return("proper subset false",True,False,S9.is_proper_subset_of,S2)
tester.test_return("superset equal",True,True,S2.is_superset_of,S13)
tester.test_return("superset true",True,True,S9.is_superset_of,S2)
tester.test_return("superset false",True,False,S2.is_superset_of,S9)
tester.test_return("proper superset equal",True,False,S2.is_proper_superset_of,S13)
tester.test_return("proper superset true",True,True,S9.is_proper_superset_of,S2)
tester.test_return("proper superset false",True,False,S2.is_proper_superset_of,S9)
tester.test_return("not equal",True,True,S2.__ne__,S14)
#Manipulations
transresults=[4.0,6.0]
def transfxn(in_val):
return 2.0*in_val
NewS=S1.transform(transfxn)
tresults2=[i for i in NewS]
tester.test_equal("transform works",transresults,tresults2)
def partfxn(in_val):
return in_val==2.0
NewS2=S1.partition(partfxn)
partresults=[2.0]
presults2=[i for i in NewS2]
tester.test_equal("partition works",partresults,presults2)
tester.test_return("hash works check 1",True,S2.my_hash(),S13.my_hash)
S2.clear()
tester.test_return("clear works",True,0,S2.size)
tester.print_results()
return tester.nfailed()
|
Every young lady longs for luscious, healthy hair that falls below her shoulders. However, the scores of beauty tips and long rows of hair products can make it all seem very overwhelming. Even though hair growth tips are extremely common, the trick lies in zeroing in on the best ones. To do this, you must trust the hair care specialists. Thankfully, you’re in luck. If you’ve been wondering how to get thick hair for a really long time, we have the top 5 hair care tips for you to get the tresses you have always wanted.
While a hot shower can feel particularly comforting on winter days, make sure it’s not the last part of your hair washing routine. After you cleanse and condition your hair, tilt your head back and rinse your hair with cold water for a few seconds. This hair care tip will seal moisture inside the hair cuticle so that the strands stay hydrated when you step out.
To get the long hair you’ve always wanted, nothing works better than a good old-fashioned oil massage. What you need to pick is the right hair oil for the purpose. We suggest the Dove Nourished Shine Elixir. Enriched with argan oil and hibiscus, it not only lends tresses a glowing finish but also restores damaged hair so that new hair growth is healthy. Turn to this one for your next head massage!
If you thought yoga only strengthened the body, you’ll be surprised to learn that it can benefit the hair as well! You know how exercise helps the surface of your skin improve? The same process applies to hair too. To get healthy, thick hair with the help of yoga, we recommend a couple of stress-relieving poses like the downward dog. It works so well because these exercises encourage the flow of blood to the head, which in turn helps hair grow faster.
This may seem like a counterproductive hair growth tip, but just trust us on this. If you try to grow your hair without regular trims, you will soon see the ends of your hair turn shaggy and damaged, and they will eventually have to be chopped off. If, however, you trim your hair at regular intervals, the growth will be steady, while split ends, which make hair look unhealthy and lifeless, will be kept at bay. In addition, you should know that the more split ends you have, the greater the odds of your hair breaking and falling out, something you absolutely do not want when trying to grow your hair. That is reason enough to schedule your next haircut appointment!
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/.
import string, sys
def usage():
print >>sys.stderr, """
%s template_file -t unit_tests... -e extra_protocols...
TEMPLATE_FILE is used to generate the unit-tester .cpp
UNIT_TESTS are the top-level protocols defining unit tests
EXTRA_PROTOCOLS are top-level protocols for subprocesses that can be
spawned in tests but are not unit tests in and of
themselves
"""% (sys.argv[0])
sys.exit(1)
def main(argv):
template = argv[1]
if argv[2] != '-t': usage()
i = 3
unittests = []
while argv[i] != '-e':
unittests.append(argv[i])
i += 1
extras = argv[(i+1):]
includes = '\n'.join([
'#include "%s.h"'% (t) for t in unittests ])
enum_values = '\n'.join([
' %s,'% (t) for t in unittests+extras ])
last_enum = unittests[-1]
string_to_enums = '\n'.join([
''' else if (!strcmp(aString, "%s"))
return %s;'''% (t, t) for t in unittests+extras ])
enum_to_strings = '\n'.join([
''' case %s:
return "%s";'''%(t, t) for t in unittests+extras ])
parent_delete_cases = '\n'.join([
''' case %s: {
delete reinterpret_cast<%sParent*>(gParentActor);
return;
}
'''% (t, t) for t in unittests ])
parent_enabled_cases_proc = '\n'.join([
''' case %s: {
if (!%sParent::RunTestInProcesses()) {
passed("N/A to proc");
DeferredParentShutdown();
return;
}
break;
}
''' % (t, t) for t in unittests ])
parent_main_cases_proc = '\n'.join([
''' case %s: {
%sParent** parent =
reinterpret_cast<%sParent**>(&gParentActor);
*parent = new %sParent();
(*parent)->Open(transport, child);
return (*parent)->Main();
}
'''% (t, t, t, t) for t in unittests ])
parent_enabled_cases_thread = '\n'.join([
''' case %s: {
if (!%sParent::RunTestInThreads()) {
passed("N/A to threads");
DeferredParentShutdown();
return;
}
break;
}
''' % (t, t) for t in unittests ])
parent_main_cases_thread = '\n'.join([
''' case %s: {
%sParent** parent =
reinterpret_cast<%sParent**>(&gParentActor);
*parent = new %sParent();
%sChild** child =
reinterpret_cast<%sChild**>(&gChildActor);
*child = new %sChild();
::mozilla::ipc::MessageChannel *childChannel = (*child)->GetIPCChannel();
::mozilla::ipc::Side parentSide =
::mozilla::ipc::ParentSide;
(*parent)->Open(childChannel, childMessageLoop, parentSide);
return (*parent)->Main();
}
'''% (t, t, t, t, t, t, t) for t in unittests ])
child_delete_cases = '\n'.join([
''' case %s: {
delete reinterpret_cast<%sChild*>(gChildActor);
return;
}
'''% (t, t) for t in unittests+extras ])
child_init_cases = '\n'.join([
''' case %s: {
%sChild** child =
reinterpret_cast<%sChild**>(&gChildActor);
*child = new %sChild();
(*child)->Open(transport, parentPid, worker);
return;
}
'''% (t, t, t, t) for t in unittests+extras ])
templatefile = open(template, 'r')
sys.stdout.write(
string.Template(templatefile.read()).substitute(
INCLUDES=includes,
ENUM_VALUES=enum_values, LAST_ENUM=last_enum,
STRING_TO_ENUMS=string_to_enums,
ENUM_TO_STRINGS=enum_to_strings,
PARENT_DELETE_CASES=parent_delete_cases,
PARENT_ENABLED_CASES_PROC=parent_enabled_cases_proc,
PARENT_MAIN_CASES_PROC=parent_main_cases_proc,
PARENT_ENABLED_CASES_THREAD=parent_enabled_cases_thread,
PARENT_MAIN_CASES_THREAD=parent_main_cases_thread,
CHILD_DELETE_CASES=child_delete_cases,
CHILD_INIT_CASES=child_init_cases))
templatefile.close()
if __name__ == '__main__':
main(sys.argv)
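# Example invocation (an informal note, not part of the original script; the
# template and protocol names are hypothetical):
#
#   python <this-script> ipdl_unittest_template.cpp -t TestSanity TestHangs -e TestBridgeSub
#
# The filled-in template is written to standard output.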
|
All about you, your space and your city. Discover what’s happening close to home in The Moment’s Community section.
We are NOT Free Moments!
Peterborough Green Festival is back! |
# -*- coding: utf-8 -*-
#########################################################################
#
# Copyright (C) 2016 OSGeo
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
#########################################################################
import os
from lxml import etree
from django.conf import settings
from ConfigParser import SafeConfigParser
from owslib.iso import MD_Metadata
from pycsw import server
from geonode.catalogue.backends.generic import CatalogueBackend as GenericCatalogueBackend
from geonode.catalogue.backends.generic import METADATA_FORMATS
from shapely.geometry.base import ReadingError
true_value = 'true'
if settings.DATABASES['default']['ENGINE'].endswith(('sqlite', 'sqlite3', 'spatialite',)):
true_value = '1'
# pycsw settings that the user shouldn't have to worry about
CONFIGURATION = {
'server': {
'home': '.',
'url': settings.CATALOGUE['default']['URL'],
'encoding': 'UTF-8',
'language': settings.LANGUAGE_CODE,
'maxrecords': '10',
# 'loglevel': 'DEBUG',
# 'logfile': '/tmp/pycsw.log',
# 'federatedcatalogues': 'http://geo.data.gov/geoportal/csw/discovery',
# 'pretty_print': 'true',
# 'domainquerytype': 'range',
'domaincounts': 'true',
'profiles': 'apiso,ebrim',
},
'repository': {
'source': 'geonode',
'filter': 'is_published = %s' % true_value,
'mappings': os.path.join(os.path.dirname(__file__), 'pycsw_local_mappings.py')
}
}
class CatalogueBackend(GenericCatalogueBackend):
def __init__(self, *args, **kwargs):
super(CatalogueBackend, self).__init__(*args, **kwargs)
self.catalogue.formats = ['Atom', 'DIF', 'Dublin Core', 'ebRIM', 'FGDC', 'ISO']
self.catalogue.local = True
def remove_record(self, uuid):
pass
def create_record(self, item):
pass
def get_record(self, uuid):
results = self._csw_local_dispatch(identifier=uuid)
if len(results) < 1:
return None
result = etree.fromstring(results).find('{http://www.isotc211.org/2005/gmd}MD_Metadata')
if result is None:
return None
record = MD_Metadata(result)
record.keywords = []
if hasattr(record, 'identification') and hasattr(record.identification, 'keywords'):
for kw in record.identification.keywords:
record.keywords.extend(kw['keywords'])
record.links = {}
record.links['metadata'] = self.catalogue.urls_for_uuid(uuid)
record.links['download'] = self.catalogue.extract_links(record)
return record
def search_records(self, keywords, start, limit, bbox):
with self.catalogue:
lresults = self._csw_local_dispatch(keywords, start+1, limit, bbox)
# serialize XML
e = etree.fromstring(lresults)
self.catalogue.records = \
[MD_Metadata(x) for x in e.findall('//{http://www.isotc211.org/2005/gmd}MD_Metadata')]
# build results into JSON for API
results = [self.catalogue.metadatarecord2dict(doc) for doc in self.catalogue.records]
result = {'rows': results,
'total': e.find('{http://www.opengis.net/cat/csw/2.0.2}SearchResults').attrib.get(
'numberOfRecordsMatched'),
'next_page': e.find('{http://www.opengis.net/cat/csw/2.0.2}SearchResults').attrib.get(
'nextRecord')
}
return result
def _csw_local_dispatch(self, keywords=None, start=0, limit=10, bbox=None, identifier=None):
"""
HTTP-less CSW
"""
# serialize pycsw settings into SafeConfigParser
# object for interaction with pycsw
mdict = dict(settings.PYCSW['CONFIGURATION'], **CONFIGURATION)
if 'server' in settings.PYCSW['CONFIGURATION']:
# override server system defaults with user specified directives
mdict['server'].update(settings.PYCSW['CONFIGURATION']['server'])
config = SafeConfigParser()
for section, options in mdict.iteritems():
config.add_section(section)
for option, value in options.iteritems():
config.set(section, option, value)
# fake HTTP environment variable
os.environ['QUERY_STRING'] = ''
# init pycsw
csw = server.Csw(config, version='2.0.2')
# fake HTTP method
csw.requesttype = 'GET'
# fake HTTP request parameters
if identifier is None: # it's a GetRecords request
formats = []
for f in self.catalogue.formats:
formats.append(METADATA_FORMATS[f][0])
csw.kvp = {
'service': 'CSW',
'version': '2.0.2',
'elementsetname': 'full',
'typenames': formats,
'resulttype': 'results',
'constraintlanguage': 'CQL_TEXT',
'outputschema': 'http://www.isotc211.org/2005/gmd',
'constraint': None,
'startposition': start,
'maxrecords': limit
}
response = csw.getrecords()
else: # it's a GetRecordById request
csw.kvp = {
'service': 'CSW',
'version': '2.0.2',
'request': 'GetRecordById',
'id': identifier,
'outputschema': 'http://www.isotc211.org/2005/gmd',
}
# FIXME(Ariel): Remove this try/except block when pycsw deals with
# empty geometry fields better.
# https://gist.github.com/ingenieroariel/717bb720a201030e9b3a
try:
response = csw.dispatch()
except ReadingError:
return []
if isinstance(response, list): # pycsw 2.0+
response = response[1]
return response
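# Minimal usage sketch (an assumption, not part of the original module; it
# needs a configured GeoNode/Django environment and a populated catalogue):
#
#   backend = CatalogueBackend()
#   record = backend.get_record('some-layer-uuid')
#   if record is not None:
#       print record.links['metadata']
#
#   hits = backend.search_records(['rivers'], start=0, limit=10, bbox=None)
#   print hits['total']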
|
Cushard Consequential: What Makes Me Run in the Rain in December?
What Makes Me Run in the Rain in December?
It is cloudy, drizzly and quite warm for a December day (63 degrees at noon). When you look outside, though, it looks cold and dreary. It is just the type of day during which you want to sink into your favorite chair with a book and a football game, played in the snow. I did not want to go for my run. So it was decided…I put on my running shoes and went. Sometimes I am most motivated when I least want to run. I start thinking about all the days real athletes don’t want to get out there. I think about what sets elite athletes (or any successful person, for that matter) apart…getting out there on the days they just want to sleep in and have a late pancake breakfast.
I was motivated by wanting to emulate what successful people do; motivated to stick to my fitness plan; and motivated by doing something that most other people would not do….run on a Sunday in December in the rain.
During my run I thought about what motivates people and how to tap into that motivation to get people to perform. Motivation is complicated and individual once you get past Maslow’s hierarchy, and perhaps a leader cannot know everything that motivates everyone. But think about it, if you can get to know your people well enough to know what gets them excited, you might just be able to get them to go for a run in a rain on a Sunday in December.
Are You an Ostrich or an Owl? |
# -*- coding: mbcs -*-
typelib_path = 'd:\\bogo\\bogo-win32\\interfaces\\tsf.tlb'
_lcid = 0 # change this if required
from ctypes import *
from comtypes import GUID
from comtypes import IUnknown
from comtypes import GUID
from ctypes import HRESULT
from comtypes import BSTR
from ctypes.wintypes import HKL
from comtypes import helpstring
from comtypes import COMMETHOD
from comtypes import dispid
TfGuidAtom = c_ulong
from comtypes import CoClass
UINT_PTR = c_ulong
class ITfInputProcessorProfiles(IUnknown):
_case_insensitive_ = True
_iid_ = GUID('{1F02B6C5-7842-4EE6-8A0B-9A24183A95CA}')
_idlflags_ = []
class IEnumGUID(IUnknown):
_case_insensitive_ = True
_iid_ = GUID('{0002E000-0000-0000-C000-000000000046}')
_idlflags_ = []
class IEnumTfLanguageProfiles(IUnknown):
_case_insensitive_ = True
_iid_ = GUID('{3D61BF11-AC5F-42C8-A4CB-931BCC28C744}')
_idlflags_ = []
def __iter__(self):
return self
def next(self):
item, fetched = self.Next(1)
if fetched:
return item
raise StopIteration
def __getitem__(self, index):
self.Reset()
self.Skip(index)
item, fetched = self.Next(1)
if fetched:
return item
raise IndexError(index)
ITfInputProcessorProfiles._methods_ = [
COMMETHOD([], HRESULT, 'Register',
( ['in'], POINTER(GUID), 'rclsid' )),
COMMETHOD([], HRESULT, 'Unregister',
( ['in'], POINTER(GUID), 'rclsid' )),
COMMETHOD([], HRESULT, 'AddLanguageProfile',
( ['in'], POINTER(GUID), 'rclsid' ),
( ['in'], c_ushort, 'langid' ),
( ['in'], POINTER(GUID), 'guidProfile' ),
( ['in'], POINTER(c_ushort), 'pchDesc' ),
( ['in'], c_ulong, 'cchDesc' ),
( ['in'], POINTER(c_ushort), 'pchIconFile' ),
( ['in'], c_ulong, 'cchFile' ),
( ['in'], c_ulong, 'uIconIndex' )),
COMMETHOD([], HRESULT, 'RemoveLanguageProfile',
( ['in'], POINTER(GUID), 'rclsid' ),
( ['in'], c_ushort, 'langid' ),
( ['in'], POINTER(GUID), 'guidProfile' )),
COMMETHOD([], HRESULT, 'EnumInputProcessorInfo',
( ['out'], POINTER(POINTER(IEnumGUID)), 'ppenum' )),
COMMETHOD([], HRESULT, 'GetDefaultLanguageProfile',
( ['in'], c_ushort, 'langid' ),
( ['in'], POINTER(GUID), 'catid' ),
( ['out'], POINTER(GUID), 'pclsid' ),
( ['out'], POINTER(GUID), 'pguidProfile' )),
COMMETHOD([], HRESULT, 'SetDefaultLanguageProfile',
( ['in'], c_ushort, 'langid' ),
( ['in'], POINTER(GUID), 'rclsid' ),
( ['in'], POINTER(GUID), 'guidProfiles' )),
COMMETHOD([], HRESULT, 'ActivateLanguageProfile',
( ['in'], POINTER(GUID), 'rclsid' ),
( ['in'], c_ushort, 'langid' ),
( ['in'], POINTER(GUID), 'guidProfiles' )),
COMMETHOD([], HRESULT, 'GetActiveLanguageProfile',
( ['in'], POINTER(GUID), 'rclsid' ),
( ['out'], POINTER(c_ushort), 'plangid' ),
( ['out'], POINTER(GUID), 'pguidProfile' )),
COMMETHOD([], HRESULT, 'GetLanguageProfileDescription',
( ['in'], POINTER(GUID), 'rclsid' ),
( ['in'], c_ushort, 'langid' ),
( ['in'], POINTER(GUID), 'guidProfile' ),
( ['out'], POINTER(BSTR), 'pbstrProfile' )),
COMMETHOD([], HRESULT, 'GetCurrentLanguage',
( ['out'], POINTER(c_ushort), 'plangid' )),
COMMETHOD([], HRESULT, 'ChangeCurrentLanguage',
( ['in'], c_ushort, 'langid' )),
COMMETHOD([], HRESULT, 'GetLanguageList',
( ['out'], POINTER(POINTER(c_ushort)), 'ppLangId' ),
( ['out'], POINTER(c_ulong), 'pulCount' )),
COMMETHOD([], HRESULT, 'EnumLanguageProfiles',
( ['in'], c_ushort, 'langid' ),
( ['out'], POINTER(POINTER(IEnumTfLanguageProfiles)), 'ppenum' )),
COMMETHOD([], HRESULT, 'EnableLanguageProfile',
( ['in'], POINTER(GUID), 'rclsid' ),
( ['in'], c_ushort, 'langid' ),
( ['in'], POINTER(GUID), 'guidProfile' ),
( ['in'], c_int, 'fEnable' )),
COMMETHOD([], HRESULT, 'IsEnabledLanguageProfile',
( ['in'], POINTER(GUID), 'rclsid' ),
( ['in'], c_ushort, 'langid' ),
( ['in'], POINTER(GUID), 'guidProfile' ),
( ['out'], POINTER(c_int), 'pfEnable' )),
COMMETHOD([], HRESULT, 'EnableLanguageProfileByDefault',
( ['in'], POINTER(GUID), 'rclsid' ),
( ['in'], c_ushort, 'langid' ),
( ['in'], POINTER(GUID), 'guidProfile' ),
( ['in'], c_int, 'fEnable' )),
COMMETHOD([], HRESULT, 'SubstituteKeyboardLayout',
( ['in'], POINTER(GUID), 'rclsid' ),
( ['in'], c_ushort, 'langid' ),
( ['in'], POINTER(GUID), 'guidProfile' ),
( ['in'], HKL, 'HKL' )),
]
################################################################
## code template for ITfInputProcessorProfiles implementation
##class ITfInputProcessorProfiles_Impl(object):
## def EnumInputProcessorInfo(self):
## '-no docstring-'
## #return ppenum
##
## def EnumLanguageProfiles(self, langid):
## '-no docstring-'
## #return ppenum
##
## def GetDefaultLanguageProfile(self, langid, catid):
## '-no docstring-'
## #return pclsid, pguidProfile
##
## def Unregister(self, rclsid):
## '-no docstring-'
## #return
##
## def GetLanguageList(self):
## '-no docstring-'
## #return ppLangId, pulCount
##
## def GetCurrentLanguage(self):
## '-no docstring-'
## #return plangid
##
## def Register(self, rclsid):
## '-no docstring-'
## #return
##
## def ActivateLanguageProfile(self, rclsid, langid, guidProfiles):
## '-no docstring-'
## #return
##
## def RemoveLanguageProfile(self, rclsid, langid, guidProfile):
## '-no docstring-'
## #return
##
## def AddLanguageProfile(self, rclsid, langid, guidProfile, pchDesc, cchDesc, pchIconFile, cchFile, uIconIndex):
## '-no docstring-'
## #return
##
## def EnableLanguageProfile(self, rclsid, langid, guidProfile, fEnable):
## '-no docstring-'
## #return
##
## def ChangeCurrentLanguage(self, langid):
## '-no docstring-'
## #return
##
## def SubstituteKeyboardLayout(self, rclsid, langid, guidProfile, HKL):
## '-no docstring-'
## #return
##
## def IsEnabledLanguageProfile(self, rclsid, langid, guidProfile):
## '-no docstring-'
## #return pfEnable
##
## def GetLanguageProfileDescription(self, rclsid, langid, guidProfile):
## '-no docstring-'
## #return pbstrProfile
##
## def GetActiveLanguageProfile(self, rclsid):
## '-no docstring-'
## #return plangid, pguidProfile
##
## def SetDefaultLanguageProfile(self, langid, rclsid, guidProfiles):
## '-no docstring-'
## #return
##
## def EnableLanguageProfileByDefault(self, rclsid, langid, guidProfile, fEnable):
## '-no docstring-'
## #return
##
class ITfCategoryMgr(IUnknown):
_case_insensitive_ = True
_iid_ = GUID('{C3ACEFB5-F69D-4905-938F-FCADCF4BE830}')
_idlflags_ = []
ITfCategoryMgr._methods_ = [
COMMETHOD([], HRESULT, 'RegisterCategory',
( ['in'], POINTER(GUID), 'rclsid' ),
( ['in'], POINTER(GUID), 'rcatid' ),
( ['in'], POINTER(GUID), 'rguid' )),
COMMETHOD([], HRESULT, 'UnregisterCategory',
( ['in'], POINTER(GUID), 'rclsid' ),
( ['in'], POINTER(GUID), 'rcatid' ),
( ['in'], POINTER(GUID), 'rguid' )),
COMMETHOD([], HRESULT, 'EnumCategoriesInItem',
( ['in'], POINTER(GUID), 'rguid' ),
( ['out'], POINTER(POINTER(IEnumGUID)), 'ppenum' )),
COMMETHOD([], HRESULT, 'EnumItemsInCategory',
( ['in'], POINTER(GUID), 'rcatid' ),
( ['out'], POINTER(POINTER(IEnumGUID)), 'ppenum' )),
COMMETHOD([], HRESULT, 'FindClosestCategory',
( ['in'], POINTER(GUID), 'rguid' ),
( ['out'], POINTER(GUID), 'pcatid' ),
( ['in'], POINTER(POINTER(GUID)), 'ppcatidList' ),
( ['in'], c_ulong, 'ulCount' )),
COMMETHOD([], HRESULT, 'RegisterGUIDDescription',
( ['in'], POINTER(GUID), 'rclsid' ),
( ['in'], POINTER(GUID), 'rguid' ),
( ['in'], POINTER(c_ushort), 'pchDesc' ),
( ['in'], c_ulong, 'cch' )),
COMMETHOD([], HRESULT, 'UnregisterGUIDDescription',
( ['in'], POINTER(GUID), 'rclsid' ),
( ['in'], POINTER(GUID), 'rguid' )),
COMMETHOD([], HRESULT, 'GetGUIDDescription',
( ['in'], POINTER(GUID), 'rguid' ),
( ['out'], POINTER(BSTR), 'pbstrDesc' )),
COMMETHOD([], HRESULT, 'RegisterGUIDDWORD',
( ['in'], POINTER(GUID), 'rclsid' ),
( ['in'], POINTER(GUID), 'rguid' ),
( ['in'], c_ulong, 'dw' )),
COMMETHOD([], HRESULT, 'UnregisterGUIDDWORD',
( ['in'], POINTER(GUID), 'rclsid' ),
( ['in'], POINTER(GUID), 'rguid' )),
COMMETHOD([], HRESULT, 'GetGUIDDWORD',
( ['in'], POINTER(GUID), 'rguid' ),
( ['out'], POINTER(c_ulong), 'pdw' )),
COMMETHOD([], HRESULT, 'RegisterGUID',
( ['in'], POINTER(GUID), 'rguid' ),
( ['out'], POINTER(TfGuidAtom), 'pguidatom' )),
COMMETHOD([], HRESULT, 'GetGUID',
( ['in'], TfGuidAtom, 'guidatom' ),
( ['out'], POINTER(GUID), 'pguid' )),
COMMETHOD([], HRESULT, 'IsEqualTfGuidAtom',
( ['in'], TfGuidAtom, 'guidatom' ),
( ['in'], POINTER(GUID), 'rguid' ),
( ['out'], POINTER(c_int), 'pfEqual' )),
]
################################################################
## code template for ITfCategoryMgr implementation
##class ITfCategoryMgr_Impl(object):
## def RegisterGUIDDescription(self, rclsid, rguid, pchDesc, cch):
## '-no docstring-'
## #return
##
## def IsEqualTfGuidAtom(self, guidatom, rguid):
## '-no docstring-'
## #return pfEqual
##
## def GetGUIDDescription(self, rguid):
## '-no docstring-'
## #return pbstrDesc
##
## def RegisterCategory(self, rclsid, rcatid, rguid):
## '-no docstring-'
## #return
##
## def UnregisterGUIDDescription(self, rclsid, rguid):
## '-no docstring-'
## #return
##
## def FindClosestCategory(self, rguid, ppcatidList, ulCount):
## '-no docstring-'
## #return pcatid
##
## def GetGUIDDWORD(self, rguid):
## '-no docstring-'
## #return pdw
##
## def UnregisterGUIDDWORD(self, rclsid, rguid):
## '-no docstring-'
## #return
##
## def RegisterGUIDDWORD(self, rclsid, rguid, dw):
## '-no docstring-'
## #return
##
## def RegisterGUID(self, rguid):
## '-no docstring-'
## #return pguidatom
##
## def UnregisterCategory(self, rclsid, rcatid, rguid):
## '-no docstring-'
## #return
##
## def EnumCategoriesInItem(self, rguid):
## '-no docstring-'
## #return ppenum
##
## def GetGUID(self, guidatom):
## '-no docstring-'
## #return pguid
##
## def EnumItemsInCategory(self, rcatid):
## '-no docstring-'
## #return ppenum
##
class FakeClass(CoClass):
_reg_clsid_ = GUID('{DEC2C382-120C-4D57-BEDA-9C15678C863F}')
_idlflags_ = []
_typelib_path_ = typelib_path
_reg_typelib_ = ('{6A160507-C2CC-4C37-A435-B4D645642BDD}', 0, 0)
FakeClass._com_interfaces_ = [ITfInputProcessorProfiles, ITfCategoryMgr]
class __MIDL___MIDL_itf_tsf_0006_0001_0001(Structure):
pass
__MIDL___MIDL_itf_tsf_0006_0001_0001._fields_ = [
('Data1', c_ulong),
('Data2', c_ushort),
('Data3', c_ushort),
('Data4', c_ubyte * 8),
]
assert sizeof(__MIDL___MIDL_itf_tsf_0006_0001_0001) == 16, sizeof(__MIDL___MIDL_itf_tsf_0006_0001_0001)
assert alignment(__MIDL___MIDL_itf_tsf_0006_0001_0001) == 4, alignment(__MIDL___MIDL_itf_tsf_0006_0001_0001)
IEnumGUID._methods_ = [
COMMETHOD([], HRESULT, 'RemoteNext',
( ['in'], c_ulong, 'celt' ),
( ['out'], POINTER(GUID), 'rgelt' ),
( ['out'], POINTER(c_ulong), 'pceltFetched' )),
COMMETHOD([], HRESULT, 'Skip',
( ['in'], c_ulong, 'celt' )),
COMMETHOD([], HRESULT, 'Reset'),
COMMETHOD([], HRESULT, 'Clone',
( ['out'], POINTER(POINTER(IEnumGUID)), 'ppenum' )),
]
################################################################
## code template for IEnumGUID implementation
##class IEnumGUID_Impl(object):
## def Reset(self):
## '-no docstring-'
## #return
##
## def Skip(self, celt):
## '-no docstring-'
## #return
##
## def Clone(self):
## '-no docstring-'
## #return ppenum
##
## def RemoteNext(self, celt):
## '-no docstring-'
## #return rgelt, pceltFetched
##
class TF_LANGUAGEPROFILE(Structure):
_recordinfo_ = ('{6A160507-C2CC-4C37-A435-B4D645642BDD}', 0, 0, 0L, '{E1B5808D-1E46-4C19-84DC-68C5F5978CC8}')
IEnumTfLanguageProfiles._methods_ = [
COMMETHOD([], HRESULT, 'Clone',
( ['out'], POINTER(POINTER(IEnumTfLanguageProfiles)), 'ppenum' )),
COMMETHOD([], HRESULT, 'Next',
( ['in'], c_ulong, 'ulCount' ),
( ['out'], POINTER(TF_LANGUAGEPROFILE), 'pProfile' ),
( ['out'], POINTER(c_ulong), 'pcFetch' )),
COMMETHOD([], HRESULT, 'Reset'),
COMMETHOD([], HRESULT, 'Skip',
( ['in'], c_ulong, 'ulCount' )),
]
################################################################
## code template for IEnumTfLanguageProfiles implementation
##class IEnumTfLanguageProfiles_Impl(object):
## def Reset(self):
## '-no docstring-'
## #return
##
## def Skip(self, ulCount):
## '-no docstring-'
## #return
##
## def Clone(self):
## '-no docstring-'
## #return ppenum
##
## def Next(self, ulCount):
## '-no docstring-'
## #return pProfile, pcFetch
##
TF_LANGUAGEPROFILE._fields_ = [
('clsid', GUID),
('langid', c_ushort),
('catid', GUID),
('fActive', c_int),
('guidProfile', GUID),
]
assert sizeof(TF_LANGUAGEPROFILE) == 56, sizeof(TF_LANGUAGEPROFILE)
assert alignment(TF_LANGUAGEPROFILE) == 4, alignment(TF_LANGUAGEPROFILE)
class Library(object):
name = u'TSF'
_reg_typelib_ = ('{6A160507-C2CC-4C37-A435-B4D645642BDD}', 0, 0)
__all__ = ['ITfInputProcessorProfiles', 'FakeClass',
'IEnumTfLanguageProfiles', 'TfGuidAtom',
'TF_LANGUAGEPROFILE', 'UINT_PTR', 'IEnumGUID',
'ITfCategoryMgr', '__MIDL___MIDL_itf_tsf_0006_0001_0001']
from comtypes import _check_version; _check_version('501')
|
At the age of 23, Jerome Increase Case set out from his birthplace and home in Oswego County, New York, in the summer of 1842. He had purchased six groundhog threshing machines on credit. He traveled to Wisconsin with the intent of selling the groundhog threshers along the way. Arriving in Racine, Wisconsin, Jerome began to work on his own design for a thresher. In 1844, he rented a small shop on the bank of the river in Racine and began making threshers. This was the beginning of what would become the J.I. Case Threshing Machine Company. The Company became one of the leading manufacturers of threshing machines. To power these threshing machines, the company began the manufacture of a sweep-style horsepower in the early 1860s. (See the article on the Case sweep-style horsepower in the January/February 2006 issue of Belt Pulley magazine.) The company soon realized the limitations of the sweep as a power source. This was particularly true as Case began to add innovative improvements to the basic design of their threshers. In 1880, Case introduced the Agitator thresher with its vibrating or agitating separator tables. In 1882, Case installed their patented tubular-style elevator on their threshers. Case developed their own straw stacker for the rear of the thresher, which could lift and stack the straw from the threshing operation into a tall stack behind the thresher. In 1888, a mechanical grain weigher was added to the top of the grain elevator. By 1893, self feeders were becoming a common part of nearly all Case threshers. These new improvements made the J.I. Case Threshing Machine Company the leading producer of threshers. However, nearly all of these improvements imposed additional power requirements on the power source driving the thresher. At this time, Case offered threshers in a variety of sizes—one model with a 28 inch cylinder and a 46 inch separating unit, a model with a 32 inch cylinder and a 54 inch separator, a 36 inch x 58 inch thresher and a 40 x 62 model. The largest of the Case sweep-style horsepowers—the seven-team sweep—could produce up to about 28 horsepower. However, even the smallest of the new Case threshers—the 28 x 46 model—when fully outfitted with the new improvements, required 34 hp to run at top efficiency. Obviously, the sweep-style horsepower was hopelessly outdated as a power source for these new threshers. Consequently, the Case Company began to look to a new source of power for their new threshers. The Company began the manufacture of steam engines in 1869. In 1876, the Company introduced its first “traction” steam engine, a steam engine that could move under its own power. From this time forward, the Case Company also became a leading manufacturer of steam engines and particularly traction steam engines. Until the 1890s, the J.I. Case Threshing Machine Company operated out of a single factory located on Bridge Street in Racine, Wisconsin. Then, during the 1890s, this building was torn down and replaced with the “Eagle” Building, which became part of a new factory complex of buildings known as the “Main Works.” From the Main Works, the Case Company became a leading manufacturer of both a wide range of steam engines and a wide range of wood-frame grain threshers/separators.
In 1904, Case continued its innovations in thresher technology. One of the major shortcomings of wood-frame threshers was the threat of fire posed by a wood-frame machine working in association with a steam engine sitting next to a highly flammable stack of dry straw. Consequently, in 1904 the Case Company introduced the first “all-steel” thresher. These threshers were sold side by side with the wood-frame threshers until 1906, when production of the wooden threshers was discontinued.
Food, clothing and shelter are well known as the three basic requirements of human beings. Agriculture is generally concerned with the production of the raw materials, i.e. plants and animals, that become the food for mankind. To a lesser degree, agriculture is also concerned with the production of raw materials for clothing for mankind, e.g. cotton and wool. To a still lesser degree, agriculture may be said to be involved in one of the most basic building materials used in providing shelter for mankind, i.e. wood. This is especially true in recent days when forests are replanted after harvest in preparation for another harvest of trees in the future.
Just as the development of the mechanical thresher/separator revolutionized the threshing of small grains, so too did the sawmill revolutionize the lumber industry. In the early days of the settlement of the upper Midwest of the United States and Canada, homes were made from logs. However, a log house had a tremendous tendency to shrink or “settle” over the years. This settling was especially pronounced in the first couple of years after construction. Settling meant that windows and doors would not remain square and, thus, tight-fitting doors and windows were impossible in traditional log homes. Only frame-built houses would allow for tight-fitting windows and doors. As civilization came to the Midwest, with more people settling in the towns and on the farms of the Midwest, the frame house became the rule in home construction.
This tremendous growth of frame housing got under way in the period following the War Between the States—the golden age of American agriculture. This boom in frame-built housing created a vigorous demand for sawn lumber. Thus sawmills sprang up all over the Midwest. Usually, these sawmills were located at the falls of a particular river. This would allow the sawmill to use the power generated by the falling water and a water wheel to power the saw. Additionally, the river would be used as a transportation medium for the logs, as lumber camps cut the native timber of the watershed upriver from the sawmill and floated the logs down the river to the sawmill. The water might be captured by a dam on the river just above the sawmill to provide a reservoir of water to power the sawmill through any dry spells. This “mill pond” above the sawmill also served as a storage place for all the logs that came floating down the river.
The wood most in demand for building construction was pine. Pine is a straight-grained, light but strong wood. It is easily worked with a handsaw and/or a plane. Furthermore, it tends to maintain its proper dimensions and shape once it has been properly seasoned. (Robert C. Nesbit and William F. Thompson, Wisconsin: A History [University of Wisconsin: Madison, 1989] p. 297.) However, pine was not available in all areas of the United States.
Because of these desirable characteristics, pine could be transported a considerable distance and compete economically with any lumber found locally in any hardwood community. (Ibid.) Any person that has tried to hammer a nail into a “native” hardwood board will recognize why this is true. Pine forests were discovered to be most abundant in two belts of land in the United States. First was the wide belt of land that reached from New England through the Great Lakes area, with Lake Erie representing the southernmost fringe of this belt, and extending on to present-day northern Minnesota. (Ibid.) Secondly, there was the Southern pine wood belt, which started in eastern North Carolina (Hugh Talmage Lefler & Albert Ray Newsome, North Carolina: The History of a Southern State [University of North Carolina Press: Chapel Hill, 1973] pp. 100-101.) and arched to the south, including nearly all of South Carolina (David Duncan Wallace, South Carolina: A Short History [University of North Carolina Press: Chapel Hill, 1951] pp. 3-4.), southern Georgia (Kenneth Coleman et al., A History of Georgia), northern Florida (Charlton W. Tebeau, A History of Florida [University of Miami Press: Coral Gables, Florida, 1971] pp. 42 & 52.), southern Alabama and southern Mississippi (Nollie Hickman, Mississippi Harvest: Lumbering in the Longleaf Pine Belt 1840-1915 [Paragon Press: Montgomery, Alabama, 1962] pp. 3-11.).
Lumbering of the northern pine woods began in Maine and followed the virgin forests of this band of land westward. The market for all this lumber was south of this belt, where civilization in the form of towns and farms arose along the upper Ohio River valley during the early nineteenth century. The cities of Pittsburgh, Cincinnati, Louisville and Evansville were all built with pine wood harvested from the northern pine woods.
Scene from an early American steam-powered sawmill. |
import string
import nltk
from nltk.collocations import *
from nltk.tokenize import word_tokenize
from apps.wordtrack.levenshtein_reduce import LevenshteinReduce
from spotify.utils.spotify_client import SpotifyClient
class LyricsTrack(object):
def __init__(self, lyrics):
"""
Converts the input lyrics to Spotify tracks
:param lyrics: (str) lyrics
"""
self.lyrics = lyrics.lower()
self.original_lyrics = self.lyrics
self.spotify_client = SpotifyClient()
self.last_ngrams = []
self.acceptable_levenshtein = 3
self.acquired_tracks = []
self.ngram_degree = 3
def get_tracks_for_ngram(self, ngrams):
"""
Makes a search request to Spotify using the
terms of the ngrams as the search query
:param ngrams: (list) ngrams
        :return: (list) a dict per ngram pairing the phrase with its Spotify tracks
"""
return [
{
'phrase': ngram,
'tracks': self.spotify_client.get_tracks(ngram),
} for ngram in ngrams
]
def convert_phrase_to_track(self, ngram_tracks, lyrics):
"""
Given the tracks retrieved from Spotify for each ngram,
a Levenshtein Reduce mapping is applied to the tracks
and phrases from the input lyrics.
:param ngram_tracks: (list) ngram_tracks
:param lyrics: (str) lyrics
:return lyrics: (str) consumed phrases are removed from lyrics
"""
phrase_to_tracks = []
for ngram_track in ngram_tracks:
phrase_to_tracks.append(LevenshteinReduce(
phrase=ngram_track['phrase'],
tracks=ngram_track['tracks']
).get_most_similar_track())
for track in phrase_to_tracks:
if track and track['levenshtein'] <= self.acceptable_levenshtein:
self.acquired_tracks.append(track)
lyrics = lyrics.replace(track['phrase'], '').strip()
return lyrics
def process(self, ngram_degree=3):
"""
        Processes the lyrics into Spotify tracks. The lyrics are processed recursively.
:param ngram_degree: (int) the greatest degree of ngrams to use.
"""
self.ngram_degree = ngram_degree
self._remove_punctuations()
if ngram_degree == 3:
ngrams = self._get_trigrams_with_collocation_pmi_for_lyrics()
elif ngram_degree == 2:
ngrams = self._get_bigrams_with_collocation_pmi_for_lyrics()
else:
ngrams = self.lyrics.split(' ')
self.last_ngrams = ngrams
ngram_tracks = self.get_tracks_for_ngram(ngrams)
self.lyrics = self.convert_phrase_to_track(ngram_tracks, self.lyrics)
if self.lyrics.strip() != '':
if len(self.last_ngrams) == len(ngrams):
self.acceptable_levenshtein += 1
self.ngram_degree -= 1
self.process(self.ngram_degree)
def get_tracks(self):
"""
:return tracks: (list) the tracks best matching the lyrics.
"""
return self.acquired_tracks
def _get_bigrams_with_collocation_pmi_for_lyrics(self):
bigram_measures = nltk.collocations.BigramAssocMeasures()
finder = BigramCollocationFinder.from_words(word_tokenize(self.lyrics))
bi_phraseme = finder.score_ngrams(bigram_measures.pmi)
phrasemes = ["%s %s" % (phrase[0][0], phrase[0][1]) for phrase in bi_phraseme]
return phrasemes
def _get_trigrams_with_collocation_pmi_for_lyrics(self):
trigram_measures = nltk.collocations.TrigramAssocMeasures()
finder = TrigramCollocationFinder.from_words(word_tokenize(self.lyrics))
tri_phraseme = finder.score_ngrams(trigram_measures.pmi)
phrasemes = ["%s %s %s" % (phrase[0][0], phrase[0][1], phrase[0][2]) for phrase in tri_phraseme]
return phrasemes
def _remove_punctuations(self):
for c in string.punctuation:
self.lyrics = self.lyrics.replace(c, '')
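
# --- Hedged usage sketch (not part of the original module) ---
# Assumes valid Spotify credentials are configured for SpotifyClient and that
# the NLTK tokenizer data ('punkt') has been downloaded; the sample lyric is
# purely illustrative.
if __name__ == '__main__':
    demo = LyricsTrack("call me maybe tonight")
    demo.process(ngram_degree=3)  # recursively matches phrases to tracks
    for matched in demo.get_tracks():
        # each match carries at least the phrase and its Levenshtein distance
        print("{} (distance {})".format(matched['phrase'], matched['levenshtein']))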
|
Attention Investors! Amazing opportunity for an income property or owner occupied duplex. Separate utilities, refrigerator and ovens included! Newer carpet and laminate flooring throughout. Located on a quiet Village street with large yard. CALL TODAY!!! |
from __future__ import print_function
import IMP
import IMP.test
import IMP.core
import IMP.display
import io
import re
class Tests(IMP.test.TestCase):
def assertColorEqual(self, c, red, green, blue, delta=1e-6):
self.assertAlmostEqual(c.get_red(), red, delta=delta)
self.assertAlmostEqual(c.get_green(), green, delta=delta)
self.assertAlmostEqual(c.get_blue(), blue, delta=delta)
def test_color(self):
"""Test Color class"""
c = IMP.display.Color()
self.assertColorEqual(c, -1.0, -1.0, -1.0)
c = IMP.display.Color(0.1, 0.2, 0.3)
self.assertColorEqual(c, 0.1, 0.2, 0.3)
c.show()
for bad in range(3):
rgb = [0.5, 0.5, 0.5]
rgb[bad] = -1.0
self.assertRaisesUsageException(IMP.display.Color, *rgb)
rgb[bad] = 2.0
self.assertRaisesUsageException(IMP.display.Color, *rgb)
def test_get_interpolated_rgb(self):
"""Test get_interpolated_rgb()"""
a = IMP.display.Color(0.1, 0.2, 0.3)
b = IMP.display.Color(0.4, 0.9, 0.8)
# c == a when f=0
c = IMP.display.get_interpolated_rgb(a, b, 0.)
self.assertColorEqual(c, 0.1, 0.2, 0.3)
# c == b when f=1
c = IMP.display.get_interpolated_rgb(a, b, 1.)
self.assertColorEqual(c, 0.4, 0.9, 0.8)
c = IMP.display.get_interpolated_rgb(a, b, 0.4)
self.assertColorEqual(c, 0.22, 0.48, 0.5)
def test_get_linear_color_map_value(self):
"""Test get_linear_color_map_value()"""
self.assertAlmostEqual(IMP.display.get_linear_color_map_value(
10, 40, 30), 0.66, delta=0.1)
self.assertAlmostEqual(IMP.display.get_linear_color_map_value(
10, 40, 50), 1.0, delta=0.1)
self.assertAlmostEqual(IMP.display.get_linear_color_map_value(
10, 40, -50), 0.0, delta=0.1)
self.assertRaisesUsageException(
IMP.display.get_linear_color_map_value, 100, 50, 70)
def test_get_display_color(self):
"""Test get_display_color()"""
self.assertColorEqual(IMP.display.get_display_color(0),
166./255., 206./255., 227./255.)
self.assertColorEqual(IMP.display.get_display_color(105),
253./255., 191./255., 111./255.)
def test_get_jet_color(self):
"""Test the jet color map"""
self.assertColorEqual(IMP.display.get_jet_color(0.), 0., 0., 1.)
self.assertColorEqual(IMP.display.get_jet_color(1.), 0., 0., 1.)
self.assertColorEqual(IMP.display.get_jet_color(0.5), 1., 0.5, 0.)
# Some rounding error over 1.0 should be OK
self.assertColorEqual(IMP.display.get_jet_color(1.0001), 0., 0., 1.)
# Check out of range condition
self.assertRaisesUsageException(IMP.display.get_jet_color, -1.0)
self.assertRaisesUsageException(IMP.display.get_jet_color, 1.1)
def test_get_rgb_color(self):
"""Test the rgb color map"""
self.assertColorEqual(IMP.display.get_rgb_color(0.), 0., 0., 1.)
self.assertColorEqual(IMP.display.get_rgb_color(1.), 1., 0., 0.)
self.assertColorEqual(IMP.display.get_rgb_color(0.5), 0., 1., 0.)
def test_get_hot_color(self):
"""Test the hot color map"""
self.assertColorEqual(IMP.display.get_hot_color(0.), 0., 0., 0.)
self.assertColorEqual(IMP.display.get_hot_color(1.), 1., 1., 1.)
self.assertColorEqual(IMP.display.get_hot_color(0.5), 1., 0.5, 0.)
def test_get_gray_color(self):
"""Test the gray color map"""
self.assertColorEqual(IMP.display.get_gray_color(0.), 0., 0., 0.)
self.assertColorEqual(IMP.display.get_gray_color(1.), 1., 1., 1.)
self.assertColorEqual(IMP.display.get_gray_color(0.5), 0.5, 0.5, 0.5)
def test_get_gnuplot_color(self):
"""Test the gnuplot color map"""
self.assertColorEqual(IMP.display.get_gnuplot_color(0.), 0., 0., 0.)
self.assertColorEqual(IMP.display.get_gnuplot_color(1.), 1., 1., 0.)
self.assertColorEqual(IMP.display.get_gnuplot_color(0.5),
0.675, 0.125, 0.3)
if __name__ == '__main__':
IMP.test.main()
|
Saudi Arabia’s public prosecutor has concluded that it was an intelligence officer, and not Crown Prince Mohammed bin Salman, who ordered Jamal Khashoggi’s murder.
The officer was tasked with persuading the dissident journalist to return to the Gulf kingdom, a spokesman said.
The body parts were then handed over to a local “collaborator” outside the grounds, he added. A composite sketch of the collaborator has been produced and investigations are continuing to locate the remains.
Turkish officials have alleged that the 15 Saudi agents who flew to Istanbul in the hours before the murder, one of whom is believed to have been a forensic pathologist working for the Saudi interior ministry, were carrying a bone saw. |
import os
import pickle
import unittest
from mock import MagicMock, patch
from pokemongo_bot.cell_workers.spin_fort import SpinFort
from pokemongo_bot.inventory import Items
config = {
"spin_wait_min": 0,
"spin_wait_max": 0,
"daily_spin_limit": 100,
}
response_dict = {'responses':
{'FORT_SEARCH': {
'experience_awarded': 50,
'items_awarded': [
{'item_id': 1, 'item_count': 1},
{'item_id': 1, 'item_count': 1},
{'item_id': 1, 'item_count': 1}
],
'result': 1,
'cooldown_complete_timestamp_ms': 1474592183629L,
'chain_hack_sequence_number': 1}
},
'status_code': 1,
'platform_returns': [
{'type': 6, 'response': 'CAE='}
],
'request_id': 4916374460149268503L
}
items_awarded = {u'Pokeball': 4}
egg_awarded = None
experience_awarded = 50
class SpinFortTestCase(unittest.TestCase):
def setUp(self):
self.patcherPokemonGoBot = patch('pokemongo_bot.PokemonGoBot')
self.bot = self.patcherPokemonGoBot.start()
forts_path = os.path.join(os.path.dirname(__file__),
'resources', 'example_forts.pickle')
with open(forts_path, 'rb') as forts:
ex_forts = pickle.load(forts)
self.patcherFortRange = patch('pokemongo_bot.cell_workers.spin_fort.SpinFort.get_forts_in_range')
self.fort_range = self.patcherFortRange.start()
self.fort_range.return_value = ex_forts
self.patcherInventoryItem = patch('pokemongo_bot.inventory.Items')
self.inventory_item = self.patcherInventoryItem.start()
def tearDown(self):
self.patcherPokemonGoBot.stop()
self.patcherFortRange.stop()
self.patcherInventoryItem.stop()
# @patch('pokemongo_bot.cell_workers.spin_fort.SpinFort.get_items_awarded_from_fort_spinned')
# def test_spin_fort(self, items_awarded):
# spin_fort = SpinFort(self.bot, config)
# self.bot.api = MagicMock()
# self.bot.api.fort_search.return_value = response_dict
# items_awarded.return_value = items_awarded
# result = spin_fort.work()
# self.assertEqual(result, 1)
|
Add to the joy of gift-giving with a toddler mermaid tail set featuring our refreshed Rainbow Reef design! A matching swim top and fun mermaid extras are included!
Swim the extra mile with a mermaid tail gift set for your sweet toddler this holiday season! For the little mermaid who loves all the colors of the rainbow, this mermaid tail skirt in our signature Rainbow Reef design is sure to bring happiness whenever the urge strikes to play dress-up! She’ll also get a matching swim top in a bandeau or tankini style to look absolutely magical in all the places she’ll play, in or out of the water. Once she discovers the adorable FinFriend Bubbles the Dolphin waiting within, we’re certain she’ll instantly add this soft plush toy to her stuffed animal collection. As an extra bonus, mom or dad can enjoy reading the bonus Mermaiden Tales book to their child any time of the year! |
import random
from Utilities.text_converter import *
from pokemon import Pokemon
from pokemon_species import Species
def pokemon_type_block_encode(pokemon):
length = len(pokemon)
if(length > 6):
raise ValueError("Cannot have more than 6 Pokemon")
out = []
out.append(length)
for i in range(6):
if(i < length):
out.append(pokemon[i].species.hex)
else:
out.append(0xFF)
out.append(0xFF)
return out
def pokemon_type_block_decode(bytes):
pokemon_count = bytes[0]
species = []
for i in range(pokemon_count):
species.append(Species.fromBytes(bytes[i+1]))
return [pokemon_count, species]
def trainer_name_encode(name):
if len(name) > 7:
raise ValueError("Name cannot be longer than 7 characters")
return padTo(terminate(encode(name)), 0x00, 11)
def trainer_name_decode(bytes):
    if len(bytes) != 11:
        print "Warning: trainer name data should be 11 bytes"
return decode(unterminate(removePad(bytes, 0)))
def extend(bytes, arr):
for a in arr:
bytes.append(a)
class PokemonTeam():
def __init__(self, name, pokemon):
self.name = name
self.pokemon = pokemon
if len(name) > 7:
raise ValueError("Name cannot be longer than 7 characters")
if len(pokemon) > 6:
raise ValueError("Cannot have more than 6 Pokemon")
def __str__(self):
out = "Trainer: " + self.name + "\n"
for p in self.pokemon:
out += p.__str__() + "\n"
return out
def trade_pokemon(self, idx, pokemon):
self.pokemon[idx] = Pokemon.fromBytes(pokemon.toBytes())
def toBytes(self):
dataBlock = []
extend(dataBlock, trainer_name_encode(self.name))
extend(dataBlock, pokemon_type_block_encode(self.pokemon))
length = len(self.pokemon)
for i in range(6):
if (i < length):
extend(dataBlock, self.pokemon[i].toBytes())
else:
# Fill with 0 bytes
extend(dataBlock, padTo([], 0x00, 44))
for i in range(6):
if (i < length):
extend(dataBlock, trainer_name_encode(self.pokemon[i].originalTrainerName))
else:
# Fill with 0 bytes
extend(dataBlock, padTo([], 0x00, 11))
for i in range(6):
if (i < length):
extend(dataBlock, self.pokemon[i].terminatedNickname())
else:
# Fill with 0 bytes
extend(dataBlock, padTo([], 0x00, 11))
return dataBlock
@staticmethod
def fromBytes(bytes):
trainer_name = trainer_name_decode(bytes[0:11])
meta = pokemon_type_block_decode(bytes[11:19])
pokemon = []
byte_idx = 19
for i in range(meta[0]):
pokemon.append(Pokemon.fromBytes(bytes[byte_idx:byte_idx+44]))
byte_idx += 44
byte_idx = 283
for i in range(meta[0]):
pokemon[i].originalTrainerName = trainer_name_decode(bytes[byte_idx:byte_idx+11])
byte_idx += 11
byte_idx = 349
for i in range(meta[0]):
pokemon[i].setNickname(bytes[byte_idx:byte_idx+11])
byte_idx += 11
return PokemonTeam(trainer_name, pokemon)
@staticmethod
def rnd():
pkmn_cnt = random.randint(1, 3) + 3
pkmn = []
for i in range(pkmn_cnt):
pkmn.append(Pokemon.rnd())
return PokemonTeam("HACKER", pkmn)
|
It’s a chilly morning here in Conway – it’s crisp and sunny and perfect for planning. We have meetings all day with people who are contributing to this cause. Thanksgiving has come and gone, but I find myself filled with this overwhelming feeling of gratitude. What better way to celebrate a season filled with magic than to plan a market with people we love, people who inspire us to be better versions of ourselves.
Sometimes, in the process of growing up – becoming adults, buying homes, running businesses – we lose our sense of wonder. We forget to take time to breathe the cold winter air – to feel it fill our lungs. The holiday season begins to lose its magic as it fills up with deadlines and obligations. Snowflakes become obstacles rather than opportunities. Now is the start of the holiday season. Shopping centers will fill with people who, pressed for time and stressed by the ‘requirements’ of the season, may forget that “so-and-so” who double-parked is a person. I know. I’ve been there.
But each day is an opportunity to make the world around us brighter. If I think of that in very small terms, it’s not such a difficult thing to manage. What small steps can I take today to make the world more wonderful tomorrow? How can we make this season about time, versus dollars spent? How can we give Love – the gift that, when given, is exponentiated?
For us, the Winter Market is a step in the right direction. Bringing a renewed sense of wonder to our little corner of Locust & Oak Street is a small but worthwhile effort. I have the rare opportunity to see this thing through with my family by my side… so it feels a little like planning a massive Christmas party.
As we plan for what looks to be a very busy week, know that we are thinking of all of you and your families too, and how we can make this holiday season a magical one. |
from fontTools.misc.transform import Transform
from ufo2ft.filters import BaseFilter
import logging
logger = logging.getLogger(__name__)
class FlattenComponentsFilter(BaseFilter):
def __call__(self, font, glyphSet=None):
if super(FlattenComponentsFilter, self).__call__(font, glyphSet):
modified = self.context.modified
if modified:
logger.info('Flattened composite glyphs: %i' %
len(modified))
return modified
def filter(self, glyph):
flattened = False
if not glyph.components:
return flattened
pen = glyph.getPen()
for comp in list(glyph.components):
flattened_tuples = _flattenComponent(self.context.glyphSet, comp)
if flattened_tuples[0] != (comp.baseGlyph, comp.transformation):
flattened = True
glyph.removeComponent(comp)
for flattened_tuple in flattened_tuples:
pen.addComponent(*flattened_tuple)
if flattened:
self.context.modified.add(glyph.name)
return flattened
def _flattenComponent(glyphSet, component):
"""Returns a list of tuples (baseGlyph, transform) of nested component."""
glyph = glyphSet[component.baseGlyph]
if not glyph.components:
transformation = Transform(*component.transformation)
return [(component.baseGlyph, transformation)]
all_flattened_components = []
for nested in glyph.components:
flattened_components = _flattenComponent(glyphSet, nested)
for i, (_, tr) in enumerate(flattened_components):
tr = tr.transform(component.transformation)
flattened_components[i] = (flattened_components[i][0], tr)
all_flattened_components.extend(flattened_components)
return all_flattened_components
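
# --- Hedged usage sketch (not part of the original module) ---
# Assuming a defcon/ufoLib2 Font object named `ufo` whose composite glyphs may
# reference other composites, the filter can be applied directly; it returns
# the set of glyph names whose components were flattened:
#
#     philter = FlattenComponentsFilter()
#     flattened_names = philter(ufo)
#     print(sorted(flattened_names))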
|
Jiang Cheng walked away but left behind that spooky mask on the table. Lan Jinyao’s eyes locked on the mask, her heart distraught with anxiety.
Jiang Cheng had kidnapped her, but what did he want to do later? Could it be that he wanted to drag her to hell with him again? If the person who’d kidnapped her was Jiang Cheng all along, then who was the one who’d died in the sea of fire?
All these questions were circling her mind, tangling themselves into a complete mess.
As she lay in bed, she gradually calmed down.
When Lan Jinyao quieted down, she suddenly noticed that the sound echoing in her ears became louder and clearer. It was the sound of waves crashing against rocks!
Lan Jinyao’s eyes lit up at this thought. Did this mean that Jiang Cheng had brought her back to the island where the filming crew was located? It was daybreak now, so the crew must still be on the island. They shouldn’t have left yet.
She briefly checked her surroundings. This room was brightly lit and decorated in almost the same way as the room she’d previously stayed in on the island, just a bit more luxurious.
She couldn’t call anyone else’s name. Otherwise, her real intentions would’ve been exposed. Besides, even if the crew really hadn’t left yet and came to her rescue, Jiang Cheng would surely rush here one step ahead of them. In that case, her life would be at risk.
She shouted his name several times. Soon after, hurried footsteps could be heard coming from the door, and Jiang Cheng entered the room.
On the other hand, Lan Jinyao felt disappointed because no one else came to save her.
The nervous look on his face...it didn’t seem like he was faking it. At least, that was what Lan Jinyao had thought.
Soon, however, she realised that her guess was incorrect. This man’s train of thought had always been undecipherable, so she wouldn’t be able to guess it so easily.
She’d thought that her acting skills were impeccable, but in fact, her act was easily seen through by Jiang Cheng.
While talking, the corners of Jiang Cheng’s lips slightly curled up, revealing a sinister smile that made Lan Jinyao’s hair stand on end.
As expected, Lan Jinyao’s last glimmer of hope was extinguished; her eyelids slightly drooped, concealing the disappointment contained within her eyes.
One of his hands was about to touch her hair, but Lan Jinyao instantly tilted her head, making him touch air instead. She’d initially thought that he’d become angry, but he was still all smiles and it didn’t seem like he’d get mad at all.
At this moment, someone knocked on the door. Upon hearing that person’s subtle voice, Lan Jinyao’s eyes suddenly lit up again, and she intently stared at the door.
Upon hearing him say this, that little bit of excitement in Lan Jinyao’s heart completely vanished.
That was right; this was Jiang Cheng’s territory. He wouldn’t let just anyone in. Even that person who’d knocked earlier was one of his people. There was no way they would help her escape from here.
The person who came in was an aunt in her work uniform, somewhere in her fifties. She held a tray in her hands, and on it was a bowl of soup or something. Lan Jinyao noticed that from the moment this aunt stepped in, her head was hung low and she didn’t dare to look around.
The moment the aunt opened the lid, a strong smell of medicine infiltrated Lan Jinyao’s nostrils. For a moment, her heart leapt into her throat. In fact, Jiang Cheng shouldn’t know that she was pregnant, so, what was this medicine for?
Her hands and feet were bound; the rope used was also very short, so if Jiang Cheng wanted to force the medicine down her throat, she wouldn’t be able to escape.
At this thought, the panic she felt in her heart inexplicably increased by tenfold.
“What do you want to do?” Lan Jinyao asked with a slightly trembling voice. She looked like a pointy hedgehog, and even her gaze was extremely ferocious; as if she’d bite Jiang Cheng if he made a move.
Lan Jinyao repeatedly shook her head. She was never worried that Jiang Cheng would hurt her. The thing that she was worried about was that he’d not let the child in her stomach off the hook. This body was Chen Meimei’s; the child also carried Chen Meimei’s blood.
Lan Jinyao intently stared at that bowl of soup. When she ascertained that it was chicken soup with some unknown medicinal herbs, her heart tensed up a bit.
Upon hearing this, Jiang Cheng suddenly laughed, his eerie laughter reverberating around the room.
Lan Jinyao’s eyes widened in shock when she heard this. At this moment, she was feeling utterly remorseful.
She kept scolding herself inwardly: Who told you to be so stupid? Now, you’re stuck with this madman! This time, how are you going to get yourself out of this mess? |
class User():
def __init__(self, first_name, last_name, phone, email, twitter):
self.first_name = first_name
self.last_name = last_name
self.phone = phone
self.email = email
self.twitter = twitter
def describe_user(self):
print("The user first name is: {} \nThe user last name is: {} \nThe user phone is: {} \nThe user email is: {} \nThe user Twitter is: {}".format(self.first_name,self.last_name,self.phone,self.email,self.twitter))
def greet_user(self):
print("Hey", self.first_name, "have a nice day!")
user_1 = User("Jonathan","Castillo", 5559864, "jonatillo@gmail.com", "@Jonatillo")
user_2 = User("Terry","Flores", 5552148, "Teero1@gmail.com", "@Ter_ser")
user_3 = User("Mary","Adams", 5559794, "maryni@gmail.com", "@mar_y")
user_4 = User("Hugo","Jacobo", 5556444, "HugeJA@gmail.com", "@Hugo_tarugo")
users = [user_1, user_2, user_3, user_4]
for user in users:
    user.describe_user()
    user.greet_user()
    print("")
"""
for i in range(len(users)):
    users[i].describe_user()
    users[i].greet_user()
print("")
"""
"""
user_1.describe_user()
user_1.greet_user()
user_2.describe_user()
user_2.greet_user()
user_3.describe_user()
user_3.greet_user()
user_4.describe_user()
user_4.greet_user()
"""
|
THE EHUDSON AMERICAN TEE REPRESENTS OUR COUNTRY AND ALL ITS GLORY. I'M PROUD TO BE AN AMERICAN.
It was my grandparents' anniversary & since we love them so much & they feel the same way, we decided to give them a shirt with a family picture of just grandkids & grandparents. They absolutely loved it & can twin whenever they want.
All of the Orange & Blue shirts were through DesignAShirt.com, for our 33 US based project team members, and the workers from the project in Rancho Arriba, Dominican Republic, where we finished the construction of a Child Development Center and orphanage.
I stumbled upon a picture of my Uncle Jesse at a family reunion a few years back, and I had been waiting for the perfect opportunity to use it ever since. He was this cute little kid in a striped shirt and a SWEET comb-over. If you know Uncle Jesse now, you know he doesn't have any hair, and he absolutely hates that picture. Well, we decided to order a bunch of shirts for his 51st birthday, and he was totally surprised!
Five of our children participated in this amazing group during the fall/winter and thoroughly enjoyed it! Everyone wore their shirt for the celebration in January. This is a free ministry in our area and we are truly blessed by it! |
# This file was created automatically by SWIG 1.3.29.
# Don't modify this file, modify the SWIG interface instead.
"""
`GLCanvas` provides an OpenGL Context on a `wx.Window`.
"""
import _glcanvas
import new
new_instancemethod = new.instancemethod
def _swig_setattr_nondynamic(self,class_type,name,value,static=1):
if (name == "thisown"): return self.this.own(value)
if (name == "this"):
if type(value).__name__ == 'PySwigObject':
self.__dict__[name] = value
return
method = class_type.__swig_setmethods__.get(name,None)
if method: return method(self,value)
if (not static) or hasattr(self,name):
self.__dict__[name] = value
else:
raise AttributeError("You cannot add attributes to %s" % self)
def _swig_setattr(self,class_type,name,value):
return _swig_setattr_nondynamic(self,class_type,name,value,0)
def _swig_getattr(self,class_type,name):
if (name == "thisown"): return self.this.own()
method = class_type.__swig_getmethods__.get(name,None)
if method: return method(self)
raise AttributeError,name
def _swig_repr(self):
try: strthis = "proxy of " + self.this.__repr__()
except: strthis = ""
return "<%s.%s; %s >" % (self.__class__.__module__, self.__class__.__name__, strthis,)
import types
try:
_object = types.ObjectType
_newclass = 1
except AttributeError:
class _object : pass
_newclass = 0
del types
def _swig_setattr_nondynamic_method(set):
def set_attr(self,name,value):
if (name == "thisown"): return self.this.own(value)
if hasattr(self,name) or (name == "this"):
set(self,name,value)
else:
raise AttributeError("You cannot add attributes to %s" % self)
return set_attr
import _core
wx = _core
__docfilter__ = wx.__DocFilter(globals())
class GLContext(_core.Object):
"""Proxy of C++ GLContext class"""
thisown = property(lambda x: x.this.own(), lambda x, v: x.this.own(v), doc='The membership flag')
__repr__ = _swig_repr
def __init__(self, *args, **kwargs):
"""__init__(self, GLCanvas win, GLContext other=None) -> GLContext"""
_glcanvas.GLContext_swiginit(self,_glcanvas.new_GLContext(*args, **kwargs))
__swig_destroy__ = _glcanvas.delete_GLContext
__del__ = lambda self : None;
def SetCurrent(*args, **kwargs):
"""SetCurrent(self, GLCanvas win)"""
return _glcanvas.GLContext_SetCurrent(*args, **kwargs)
_glcanvas.GLContext_swigregister(GLContext)
cvar = _glcanvas.cvar
GLCanvasNameStr = cvar.GLCanvasNameStr
WX_GL_RGBA = _glcanvas.WX_GL_RGBA
WX_GL_BUFFER_SIZE = _glcanvas.WX_GL_BUFFER_SIZE
WX_GL_LEVEL = _glcanvas.WX_GL_LEVEL
WX_GL_DOUBLEBUFFER = _glcanvas.WX_GL_DOUBLEBUFFER
WX_GL_STEREO = _glcanvas.WX_GL_STEREO
WX_GL_AUX_BUFFERS = _glcanvas.WX_GL_AUX_BUFFERS
WX_GL_MIN_RED = _glcanvas.WX_GL_MIN_RED
WX_GL_MIN_GREEN = _glcanvas.WX_GL_MIN_GREEN
WX_GL_MIN_BLUE = _glcanvas.WX_GL_MIN_BLUE
WX_GL_MIN_ALPHA = _glcanvas.WX_GL_MIN_ALPHA
WX_GL_DEPTH_SIZE = _glcanvas.WX_GL_DEPTH_SIZE
WX_GL_STENCIL_SIZE = _glcanvas.WX_GL_STENCIL_SIZE
WX_GL_MIN_ACCUM_RED = _glcanvas.WX_GL_MIN_ACCUM_RED
WX_GL_MIN_ACCUM_GREEN = _glcanvas.WX_GL_MIN_ACCUM_GREEN
WX_GL_MIN_ACCUM_BLUE = _glcanvas.WX_GL_MIN_ACCUM_BLUE
WX_GL_MIN_ACCUM_ALPHA = _glcanvas.WX_GL_MIN_ACCUM_ALPHA
class GLCanvas(_core.Window):
"""Proxy of C++ GLCanvas class"""
thisown = property(lambda x: x.this.own(), lambda x, v: x.this.own(v), doc='The membership flag')
__repr__ = _swig_repr
def __init__(self, *args, **kwargs):
"""
__init__(self, Window parent, int id=-1, Point pos=DefaultPosition,
Size size=DefaultSize, long style=0, String name=GLCanvasNameStr,
int attribList=None, Palette palette=wxNullPalette) -> GLCanvas
"""
_glcanvas.GLCanvas_swiginit(self,_glcanvas.new_GLCanvas(*args, **kwargs))
self._setOORInfo(self)
def SetCurrent(*args):
"""
SetCurrent(self)
SetCurrent(self, GLContext RC)
"""
return _glcanvas.GLCanvas_SetCurrent(*args)
def SetColour(*args, **kwargs):
"""SetColour(self, String colour)"""
return _glcanvas.GLCanvas_SetColour(*args, **kwargs)
def SwapBuffers(*args, **kwargs):
"""SwapBuffers(self)"""
return _glcanvas.GLCanvas_SwapBuffers(*args, **kwargs)
def GetContext(*args, **kwargs):
"""GetContext(self) -> GLContext"""
return _glcanvas.GLCanvas_GetContext(*args, **kwargs)
def SetupPixelFormat(*args, **kwargs):
"""SetupPixelFormat(self, int attribList=None)"""
return _glcanvas.GLCanvas_SetupPixelFormat(*args, **kwargs)
def SetupPalette(*args, **kwargs):
"""SetupPalette(self, Palette palette)"""
return _glcanvas.GLCanvas_SetupPalette(*args, **kwargs)
def CreateDefaultPalette(*args, **kwargs):
"""CreateDefaultPalette(self) -> Palette"""
return _glcanvas.GLCanvas_CreateDefaultPalette(*args, **kwargs)
def GetPalette(*args, **kwargs):
"""GetPalette(self) -> Palette"""
return _glcanvas.GLCanvas_GetPalette(*args, **kwargs)
Context = property(GetContext,doc="See `GetContext`")
_glcanvas.GLCanvas_swigregister(GLCanvas)
def GLCanvasWithContext(*args, **kwargs):
"""
GLCanvasWithContext(Window parent, GLContext shared=None, int id=-1, Point pos=DefaultPosition,
Size size=DefaultSize,
long style=0, String name=GLCanvasNameStr,
int attribList=None, Palette palette=wxNullPalette) -> GLCanvas
"""
val = _glcanvas.new_GLCanvasWithContext(*args, **kwargs)
val._setOORInfo(val)
return val
|
Phone Number of Airtel Data Card Dongle is +91 11 4666 6100 .
Airtel is also recognized by the name of Bharti Airtel Limited, one of the most dignified and advanced telecom subsidiaries, tracing its roots to the year 2006. Airtel is listed as the third leading mobile service provider and has a customer base of three hundred fifty-eight million. Bharti Airtel offers a wide array of cost-effective products such as the data card dongle, DTH and enterprise services.
The workplace of Airtel Data Card Dongle offers numerous career opportunities in distinctive fields and gives initial priority to those who are eager to become part of a reputed corporation.
The address of Airtel Data Card Dongle is bharti airtel Ltd., 6th floor, tower A, Plot # 16 udyog vihar - Ph-IV gurgaon haryana 122001.
The email address of Airtel Data Card Dongle is 121@in.airtel.com.
The Website of Airtel Data Card Dongle is www.airtel.in.
The customer support phone number of Airtel Data Card Dongle is +91 11 4666 6100 (Click phone number to call).
The postal and official address, email address and phone number (helpline) of Airtel Data Card Dongle Service Center and the Airtel Data Card Dongle customer care number are given below. The helpline of the Airtel Data Card Dongle customer care number may or may not be toll free.
No complaints or reviews have been submitted so far for Airtel Data Card Dongle. To add a review or complaint against Airtel Data Card Dongle, click here.
We advise you to express your problem or complaint against Airtel Data Card Dongle. Your phone number should be mentioned in your comment, so that Airtel Data Card Dongle can contact you on the phone number mentioned in your comment. |
# -*- coding: utf-8 -*-
'''
Created on 2013-01-04 12:10
@summary:
@author: Martin Predki
'''
from twisted.internet import reactor
from CompilerProcessProtocol import CompilerProcessProtocol
import logging
import os
class Compiler(object):
'''
@summary: Class containing the compile logic for jobs.
'''
def __init__(self, scheduler):
'''
@summary: Initializes the compiler.
@param scheduler: A reference to the scheduler
@result:
'''
self.scheduler = scheduler
self.logger = logging.getLogger("PySchedServer")
def compileJob(self, job):
'''
@summary: Compiles a job.
@param job:
@result:
'''
# Setting up the compile process parameter
# ==============================
jobPath = os.path.join(self.scheduler.workingDir, str(job.jobId))
# parse command Template
template = job.compilerStr.split(" ")
# Start the compiler
# ==============================
self.logger.debug("Spawn process: {}".format(template))
reactor.spawnProcess(CompilerProcessProtocol(job, jobPath, self.scheduler), executable=template[0],
args=template, path=jobPath, env=os.environ)
# write a log file
# ==============================
self.logger.info("Compile process for job {} started.".format(job.jobId))
return True
def compilingCompleted(self, job):
'''
@summary: Is called when a job is compiled successful.
@param job:
@result:
'''
self.scheduler.compilingComplete(job)
def compilingFailed(self, job):
'''
@summary: Is called when a job could not be compiled
@param job:
@result:
'''
self.scheduler.compilingFailed(job)
|
We are delighted to publish this literature review on ‘The relationship between reading age, education and life outcomes‘, conducted for Sound Training in 2016, now Lexonik.
The review examines in detail the relationship between reading ability and a wide range of negative life outcomes in order to highlight the profoundly negative consequences, to the individual and the economy, of low literacy levels.
The relationships between reading ability, education and wider life outcomes are complex and difficult to establish. Whilst poor reading may cause unemployment for some individuals, not all poor readers are unemployed. However, on balance the existing literature suggests that reading is a crucially important component of achieving success in education and throughout life. |
from docset.index import Index
from docset import rules
import os
import shutil
def build_docset(info, ds_rules, src_dir, out_dir):
root_dir = os.path.join(out_dir, info['name'] + '.docset')
content_dir = os.path.join(root_dir, 'Contents')
resources_dir = os.path.join(content_dir, 'Resources')
doc_dir = os.path.join(resources_dir, 'Documents')
index_path = os.path.join(resources_dir, 'docSet.dsidx')
if not os.path.exists(doc_dir):
os.makedirs(doc_dir)
with open(os.path.join(content_dir, "Info.plist"), "w+t") as f:
f.write(info['plist'])
if 'icon' in info and os.path.exists(info['icon']):
shutil.copy2(info['icon'], root_dir)
idx = Index(index_path)
for root, dirnames, filenames in os.walk(src_dir):
for filename in filenames:
full_path = os.path.join(root, filename)
rel_path = os.path.relpath(full_path, src_dir)
dest_path = os.path.join(doc_dir, rel_path)
ctx = {
'src_path': full_path,
'dest_path': dest_path,
'rel_path': rel_path,
'idx': idx
}
rules.process_file_rules(ds_rules, ctx)
idx.flush()
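
# --- Hedged usage sketch (not part of the original module) ---
# The plist string and the empty rule list below are placeholders; a real
# invocation would supply an Info.plist tailored to the docset and a rule set
# built with the helpers in docset.rules.
if __name__ == '__main__':
    example_info = {
        'name': 'MyDocs',
        'plist': '<?xml version="1.0" encoding="UTF-8"?>\n'
                 '<plist version="1.0"><dict/></plist>',
    }
    build_docset(example_info, [], 'build/html', 'dist')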
|
Mascalls Academy has a rare opportunity for an exceptional Music Teacher to join us in the role of Head of Department for September 2019. Music is thriving at Mascalls Academy; we have a cohort of over 45 students in Key Stage 4 and 10 Key Stage 5 students who we see flourish due to the passionate and inspiring teaching in the department. The students are engaged and energetic, thanks to our strong practice, and you'll have the opportunity to work within our wider collaborative and dynamic Performing Arts team to oversee the delivery of this subject.
We are looking for an organised, flexible, emotionally intelligent leader who can inspire and motivate. If that's you, we'd love to hear from you!
To agree and fully support the achievement of subject pupil progress targets to make a measurable contribution towards whole academy targets.
To create, support and monitor the progress of subject development plans which contribute positively to the achievement of the academy plan.
To provide regular feedback for subject colleagues in a way that recognises good practice and supports their progress against performance objectives resulting in a tangible impact on student learning across the whole subject area.
To review and report regularly on the standards of leadership and teaching and learning and attainment across the subject area.
To be consistent with the procedures in the academy’s self-evaluation policy and performance management policy.
To consult with all subject teachers and assist with the formulation, communication and monitoring of the academy improvement plan, ensuring that concerns and ideas are considered and all staff understand the key academy targets and the part they play in achieving them.
To oversee and evaluate the team’s budget allocation to ensure the budget is spent in line with academy learning priorities and best value principles.
To provide regular progress updates to the Senior Leadership Team so that it is aware of all successes, concerns and obstacles to progress.
The main message we’d like potential candidates to take away from this advert is that the Music department at Mascalls is full of dynamism, talent, vitality and success among staff and students alike. Come and be a part of it. The responsibilities outlined above are in addition to the National Teaching Standards – please click here to view them. |
# -*- coding: utf-8 -*-
##############################################################################
#
# OpenERP, Open Source Management Solution
# Copyright (C) 2004-2010 Tiny SPRL (<http://tiny.be>).
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
##############################################################################
import time
from report import report_sxw
class buyer_form_report(report_sxw.rml_parse):
count=0
c=0
def __init__(self, cr, uid, name, context):
super(buyer_form_report, self).__init__(cr, uid, name, context=context)
self.localcontext.update({
'time': time,
'sum_taxes': self.sum_taxes,
'buyerinfo' : self.buyer_info,
'grand_total' : self.grand_buyer_total,
})
def sum_taxes(self, lot):
amount=0.0
taxes=[]
if lot.author_right:
taxes.append(lot.author_right)
if lot.auction_id:
taxes += lot.auction_id.buyer_costs
tax=self.pool.get('account.tax').compute_all(self.cr, self.uid, taxes, lot.obj_price, 1)
for t in tax:
amount+=t['amount']
return amount
def buyer_info(self):
objects = [object for object in self.localcontext.get('objects')]
ret_dict = {}
for object in objects:
partner = ret_dict.get(object.ach_uid.id,False)
if not partner:
ret_dict[object.ach_uid.id] = {'partner' : object.ach_uid or False, 'lots':[object]}
else:
lots = partner.get('lots')
lots.append(object)
return ret_dict.values()
def grand_buyer_total(self,o):
grand_total = 0
for oo in o:
grand_total =grand_total + oo['obj_price'] +self.sum_taxes(oo)
return grand_total
report_sxw.report_sxw('report.buyer_form_report', 'auction.lots', 'addons/auction/report/buyer_form_report.rml', parser=buyer_form_report)
# vim:expandtab:smartindent:tabstop=4:softtabstop=4:shiftwidth=4:
|
The Lakeside House has been completely restored, redecorated and refurnished to offer luxurious, contemporary guest accommodation for visitors seeking a special bed and breakfast stay in Keswick.
The building presents fourteen spacious bedrooms, including twin rooms, all en-suite, and most bedrooms have spectacular views of the Lake District mountains.
We are superbly located on the corner of Lake Road and The Heads just a few minutes' walk from Derwentwater and The Theatre By The Lake, and very close to the shops, restaurants and pubs of Keswick town centre - location, location, location - we have it!
- Rated as "Exceptional" 9.8/10 by booking.com for the sixth year running and many more.
All of the above awards and nominations were based purely on the feedback of our guests which is why they mean so much to us all here at LakeSide House. We aim to be the choice accommodation for guests looking to stay in Keswick and invest our time and our energy into ensuring all of our guests wish to return and tell their family and friends about their stay to promote not only LakeSide House but also Keswick, the Heart of the Lakes.
LakeSide House is a great base from which to explore all of the beautiful Lake District. Whether you plan to fellwalk or cycle, or just want to relax in comfort and enjoy the magnificent scenery. Our accommodation covers 3 floors, with a lovely light and airy breakfast room and drying room facilities for cyclists and walkers.
Book directly with us to get our lowest available rates! |
# Copyright 2015, Google Inc.
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are
# met:
#
# * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above
# copyright notice, this list of conditions and the following disclaimer
# in the documentation and/or other materials provided with the
# distribution.
# * Neither the name of Google Inc. nor the names of its
# contributors may be used to endorse or promote products derived from
# this software without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
"""State and behavior for ingestion during an operation."""
import abc
import collections
from grpc.framework.base import exceptions
from grpc.framework.base import interfaces
from grpc.framework.base.packets import _constants
from grpc.framework.base.packets import _interfaces
from grpc.framework.base.packets import packets
from grpc.framework.foundation import abandonment
from grpc.framework.foundation import callable_util
from grpc.framework.foundation import stream
_CREATE_CONSUMER_EXCEPTION_LOG_MESSAGE = 'Exception initializing ingestion!'
_CONSUME_EXCEPTION_LOG_MESSAGE = 'Exception during ingestion!'
class _ConsumerCreation(collections.namedtuple(
'_ConsumerCreation', ('consumer', 'remote_error', 'abandoned'))):
"""A sum type for the outcome of ingestion initialization.
Either consumer will be non-None, remote_error will be True, or abandoned will
be True.
Attributes:
consumer: A stream.Consumer for ingesting payloads.
remote_error: A boolean indicating that the consumer could not be created
due to an error on the remote side of the operation.
abandoned: A boolean indicating that the consumer creation was abandoned.
"""
class _EmptyConsumer(stream.Consumer):
"""A no-operative stream.Consumer that ignores all inputs and calls."""
def consume(self, value):
"""See stream.Consumer.consume for specification."""
def terminate(self):
"""See stream.Consumer.terminate for specification."""
def consume_and_terminate(self, value):
"""See stream.Consumer.consume_and_terminate for specification."""
class _ConsumerCreator(object):
"""Common specification of different consumer-creating behavior."""
__metaclass__ = abc.ABCMeta
@abc.abstractmethod
def create_consumer(self, requirement):
"""Creates the stream.Consumer to which customer payloads will be delivered.
Any exceptions raised by this method should be attributed to and treated as
defects in the serviced or servicer code called by this method.
Args:
requirement: A value required by this _ConsumerCreator for consumer
creation.
Returns:
A _ConsumerCreation describing the result of consumer creation.
"""
raise NotImplementedError()
class _FrontConsumerCreator(_ConsumerCreator):
"""A _ConsumerCreator appropriate for front-side use."""
def __init__(self, subscription, operation_context):
"""Constructor.
Args:
subscription: The serviced's interfaces.ServicedSubscription for the
operation.
operation_context: The interfaces.OperationContext object for the
operation.
"""
self._subscription = subscription
self._operation_context = operation_context
def create_consumer(self, requirement):
"""See _ConsumerCreator.create_consumer for specification."""
if self._subscription.kind is interfaces.ServicedSubscription.Kind.FULL:
try:
return _ConsumerCreation(
self._subscription.ingestor.consumer(self._operation_context),
False, False)
except abandonment.Abandoned:
return _ConsumerCreation(None, False, True)
else:
return _ConsumerCreation(_EmptyConsumer(), False, False)
class _BackConsumerCreator(_ConsumerCreator):
"""A _ConsumerCreator appropriate for back-side use."""
def __init__(self, servicer, operation_context, emission_consumer):
"""Constructor.
Args:
servicer: The interfaces.Servicer that will service the operation.
operation_context: The interfaces.OperationContext object for the
operation.
emission_consumer: The stream.Consumer object to which payloads emitted
from the operation will be passed.
"""
self._servicer = servicer
self._operation_context = operation_context
self._emission_consumer = emission_consumer
def create_consumer(self, requirement):
"""See _ConsumerCreator.create_consumer for full specification.
Args:
requirement: The name of the Servicer method to be called during this
operation.
Returns:
A _ConsumerCreation describing the result of consumer creation.
"""
try:
return _ConsumerCreation(
self._servicer.service(
requirement, self._operation_context, self._emission_consumer),
False, False)
except exceptions.NoSuchMethodError:
return _ConsumerCreation(None, True, False)
except abandonment.Abandoned:
return _ConsumerCreation(None, False, True)
class _WrappedConsumer(object):
"""Wraps a consumer to catch the exceptions that it is allowed to throw."""
def __init__(self, consumer):
"""Constructor.
Args:
consumer: A stream.Consumer that may raise abandonment.Abandoned from any
of its methods.
"""
self._consumer = consumer
def moar(self, payload, complete):
"""Makes progress with the wrapped consumer.
This method catches all exceptions allowed to be thrown by the wrapped
consumer. Any exceptions raised by this method should be blamed on the
customer-supplied consumer.
Args:
payload: A customer-significant payload object. May be None only if
complete is True.
complete: Whether or not the end of the payload sequence has been reached.
Must be True if payload is None.
Returns:
True if the wrapped consumer made progress or False if the wrapped
consumer raised abandonment.Abandoned to indicate its abandonment of
progress.
"""
try:
if payload is None:
self._consumer.terminate()
elif complete:
self._consumer.consume_and_terminate(payload)
else:
self._consumer.consume(payload)
return True
except abandonment.Abandoned:
return False
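# Illustrative summary of _WrappedConsumer.moar's dispatch on (payload, complete):
#   moar(value, False) -> consumer.consume(value)
#   moar(value, True)  -> consumer.consume_and_terminate(value)
#   moar(None, True)   -> consumer.terminate()
# Each call returns True on success and False if the wrapped consumer raised
# abandonment.Abandoned.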
class _IngestionManager(_interfaces.IngestionManager):
"""An implementation of _interfaces.IngestionManager."""
def __init__(
self, lock, pool, consumer_creator, failure_kind, termination_manager,
transmission_manager):
"""Constructor.
Args:
lock: The operation-wide lock.
pool: A thread pool in which to execute customer code.
consumer_creator: A _ConsumerCreator wrapping the portion of customer code
that when called returns the stream.Consumer with which the customer
code will ingest payload values.
failure_kind: Whichever one of packets.Kind.SERVICED_FAILURE or
packets.Kind.SERVICER_FAILURE describes local failure of customer code.
termination_manager: The _interfaces.TerminationManager for the operation.
transmission_manager: The _interfaces.TransmissionManager for the
operation.
"""
self._lock = lock
self._pool = pool
self._consumer_creator = consumer_creator
self._failure_kind = failure_kind
self._termination_manager = termination_manager
self._transmission_manager = transmission_manager
self._expiration_manager = None
self._wrapped_ingestion_consumer = None
self._pending_ingestion = []
self._ingestion_complete = False
self._processing = False
def set_expiration_manager(self, expiration_manager):
self._expiration_manager = expiration_manager
def _abort_internal_only(self):
self._wrapped_ingestion_consumer = None
self._pending_ingestion = None
def _abort_and_notify(self, outcome):
self._abort_internal_only()
self._termination_manager.abort(outcome)
self._transmission_manager.abort(outcome)
self._expiration_manager.abort()
def _next(self):
"""Computes the next step for ingestion.
Returns:
A payload, complete, continue triplet indicating what payload (if any) is
available to feed into customer code, whether or not the sequence of
payloads has terminated, and whether or not there is anything
immediately actionable to call customer code to do.
"""
if self._pending_ingestion is None:
return None, False, False
elif self._pending_ingestion:
payload = self._pending_ingestion.pop(0)
complete = self._ingestion_complete and not self._pending_ingestion
return payload, complete, True
elif self._ingestion_complete:
return None, True, True
else:
return None, False, False
def _process(self, wrapped_ingestion_consumer, payload, complete):
"""A method to call to execute customer code.
This object's lock must *not* be held when calling this method.
Args:
wrapped_ingestion_consumer: The _WrappedConsumer with which to pass
payloads to customer code.
payload: A customer payload. May be None only if complete is True.
complete: Whether or not the sequence of payloads to pass to the customer
has concluded.
"""
while True:
consumption_outcome = callable_util.call_logging_exceptions(
wrapped_ingestion_consumer.moar, _CONSUME_EXCEPTION_LOG_MESSAGE,
payload, complete)
if consumption_outcome.exception is None:
if consumption_outcome.return_value:
with self._lock:
if complete:
self._pending_ingestion = None
self._termination_manager.ingestion_complete()
return
else:
payload, complete, moar = self._next()
if not moar:
self._processing = False
return
else:
with self._lock:
if self._pending_ingestion is not None:
self._abort_and_notify(self._failure_kind)
self._processing = False
return
else:
with self._lock:
self._abort_and_notify(self._failure_kind)
self._processing = False
return
def start(self, requirement):
if self._pending_ingestion is not None:
def initialize():
consumer_creation_outcome = callable_util.call_logging_exceptions(
self._consumer_creator.create_consumer,
_CREATE_CONSUMER_EXCEPTION_LOG_MESSAGE, requirement)
if consumer_creation_outcome.return_value is None:
with self._lock:
self._abort_and_notify(self._failure_kind)
self._processing = False
elif consumer_creation_outcome.return_value.remote_error:
with self._lock:
self._abort_and_notify(packets.Kind.RECEPTION_FAILURE)
self._processing = False
elif consumer_creation_outcome.return_value.abandoned:
with self._lock:
if self._pending_ingestion is not None:
self._abort_and_notify(self._failure_kind)
self._processing = False
else:
wrapped_ingestion_consumer = _WrappedConsumer(
consumer_creation_outcome.return_value.consumer)
with self._lock:
self._wrapped_ingestion_consumer = wrapped_ingestion_consumer
payload, complete, moar = self._next()
if not moar:
self._processing = False
return
self._process(wrapped_ingestion_consumer, payload, complete)
self._pool.submit(
callable_util.with_exceptions_logged(
initialize, _constants.INTERNAL_ERROR_LOG_MESSAGE))
self._processing = True
def consume(self, payload):
if self._ingestion_complete:
self._abort_and_notify(self._failure_kind)
elif self._pending_ingestion is not None:
if self._processing:
self._pending_ingestion.append(payload)
else:
self._pool.submit(
callable_util.with_exceptions_logged(
self._process, _constants.INTERNAL_ERROR_LOG_MESSAGE),
self._wrapped_ingestion_consumer, payload, False)
self._processing = True
def terminate(self):
if self._ingestion_complete:
self._abort_and_notify(self._failure_kind)
else:
self._ingestion_complete = True
if self._pending_ingestion is not None and not self._processing:
self._pool.submit(
callable_util.with_exceptions_logged(
self._process, _constants.INTERNAL_ERROR_LOG_MESSAGE),
self._wrapped_ingestion_consumer, None, True)
self._processing = True
def consume_and_terminate(self, payload):
if self._ingestion_complete:
self._abort_and_notify(self._failure_kind)
else:
self._ingestion_complete = True
if self._pending_ingestion is not None:
if self._processing:
self._pending_ingestion.append(payload)
else:
self._pool.submit(
callable_util.with_exceptions_logged(
self._process, _constants.INTERNAL_ERROR_LOG_MESSAGE),
self._wrapped_ingestion_consumer, payload, True)
self._processing = True
def abort(self):
"""See _interfaces.IngestionManager.abort for specification."""
self._abort_internal_only()
def front_ingestion_manager(
lock, pool, subscription, termination_manager, transmission_manager,
operation_context):
"""Creates an IngestionManager appropriate for front-side use.
Args:
lock: The operation-wide lock.
pool: A thread pool in which to execute customer code.
subscription: A base_interfaces.ServicedSubscription indicating the
customer's interest in the results of the operation.
termination_manager: The _interfaces.TerminationManager for the operation.
transmission_manager: The _interfaces.TransmissionManager for the
operation.
operation_context: A base_interfaces.OperationContext for the operation.
Returns:
An IngestionManager appropriate for front-side use.
"""
ingestion_manager = _IngestionManager(
lock, pool, _FrontConsumerCreator(subscription, operation_context),
packets.Kind.SERVICED_FAILURE, termination_manager, transmission_manager)
ingestion_manager.start(None)
return ingestion_manager
def back_ingestion_manager(
lock, pool, servicer, termination_manager, transmission_manager,
operation_context, emission_consumer):
"""Creates an IngestionManager appropriate for back-side use.
Args:
lock: The operation-wide lock.
pool: A thread pool in which to execute customer code.
servicer: A base_interfaces.Servicer for servicing the operation.
termination_manager: The _interfaces.TerminationManager for the operation.
transmission_manager: The _interfaces.TransmissionManager for the
operation.
operation_context: A base_interfaces.OperationContext for the operation.
emission_consumer: The _interfaces.EmissionConsumer for the operation.
Returns:
An IngestionManager appropriate for back-side use.
"""
ingestion_manager = _IngestionManager(
lock, pool, _BackConsumerCreator(
servicer, operation_context, emission_consumer),
packets.Kind.SERVICER_FAILURE, termination_manager, transmission_manager)
return ingestion_manager
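# Usage sketch (illustrative only; the lock, pool, subscription, manager, and
# payload objects named below are placeholders supplied by the surrounding
# operation setup, not values defined in this module):
#   manager = front_ingestion_manager(
#       lock, pool, subscription, termination_manager, transmission_manager,
#       operation_context)
#   manager.set_expiration_manager(expiration_manager)
#   manager.consume(first_payload)
#   manager.consume_and_terminate(last_payload)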
|
# coding: utf-8
# Question 2) Find all the mentions of world countries in the whole corpus,
# using the pycountry utility (HINT: remember that there will be different surface forms
# for the same country in the text, e.g., Switzerland, switzerland, CH, etc.)
# Perform sentiment analysis on every email message using the demo methods
# in the nltk.sentiment.util module. Aggregate the polarity information of all
# the emails by country, and plot a histogram (ordered and colored by polarity level)
# that summarizes the perception of the different countries. Repeat the aggregation and plotting steps using different demo methods from the sentiment analysis module.
# Can you find substantial differences?
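# (For instance, pycountry.countries.get(alpha_2='CH'), pycountry.countries.get(alpha_3='CHE')
# and pycountry.countries.get(name='Switzerland') should all resolve to the same record,
# whose .name attribute is 'Switzerland'.)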
# In[51]:
import pandas as pd
import pycountry
from nltk.sentiment import *
import numpy as np
import matplotlib.pyplot as plt
import codecs
import math
import re
import string
# Pre Process the Data, Dropping Irrelevant Columns
# In[204]:
emails = pd.read_csv("hillary-clinton-emails/Emails.csv")
# In[205]:
# Drop columns that won't be used
emails = emails.drop(['DocNumber', 'MetadataPdfLink', 'ExtractedDocNumber', 'MetadataCaseNumber'], axis=1)
emails.head()
# In[206]:
emails_cut = emails[['ExtractedBodyText']].copy()
emails_cut.head()
# In[207]:
emails_cut = emails_cut.dropna()
emails_cut.head()
# Now we must tokenize the data...
# In[208]:
from nltk import word_tokenize
from nltk.tokenize import RegexpTokenizer
from nltk.corpus import stopwords
tokenizer = RegexpTokenizer(r'\w+')
# In[209]:
emails_tokenized = emails_cut.copy()
emails_tokenized['ExtractedBodyText'] = emails_tokenized['ExtractedBodyText'].apply(tokenizer.tokenize)
emails_tokenized.columns = ['TokenizedText']
emails_tokenized.reset_index(drop=True, inplace=True)
emails_tokenized.head()
# Figure out what words to remove...
# In[210]:
words_delete = ['IT', 'RE','LA','AND', 'AM', 'AT', 'IN', 'I', 'ME', 'DO',
'A', 'AN','BUT', 'IF', 'OR','AS','OF','BY', 'TO', 'UP','ON','ANY', 'NO', 'NOR', 'NOT','SO',
'S', 'T','DON','D', 'LL', 'M', 'O','VE', 'Y','PM', 'TV','CD','PA','ET', 'BY', 'IE','MS', 'MP', 'CC',
'GA','VA', 'BI','CV', 'AL','VAT', 'VA','AI', 'MD', 'SM', 'FM', 'EST', 'BB', 'BRB', 'AQ', 'MA', 'MAR', 'JAM', 'BM',
'Lybia', 'LY', 'LBY', 'MC', 'MCO', 'MO', 'MAC', 'NC', 'PG', 'PNG', 'SUR', 'VI', 'lybia', 'ARM']
emails_final = emails_tokenized.copy()
emails_final['TokenizedText'] = emails_final['TokenizedText'].apply(lambda x: [item for item in x if item not in words_delete])
emails_final.head()
# Create list of countries
# In[211]:
countries_cited = []
for email_tokens in emails_final['TokenizedText']:
for word in email_tokens:
try:
country = pycountry.countries.get(alpha_2=word)
countries_cited.append(country.name)
except KeyError:
try:
country = pycountry.countries.get(alpha_3=word)
countries_cited.append(country.name)
except KeyError:
try:
country = pycountry.countries.get(name=word)
countries_cited.append(country.name)
except KeyError: pass
# Organize List and Count Occurrence of Each Country
# In[212]:
#List with Unique Entries of Countries Cited
final_countries = list(set(countries_cited))
size = len(final_countries)
final_countries
# In[213]:
#Create New DataFrame for the Counts
Country_Sent = pd.DataFrame(index=range(0,size),columns=['Country', 'Count'])
Country_Sent['Country']=final_countries
Country_Sent.head()
# In[214]:
count_list = []
for country in Country_Sent['Country']:
count = countries_cited.count(country)
count_list.append(count)
Country_Sent['Count']=count_list
Country_Sent.head()
# In[215]:
#Take out countries with fewer than 15 citations
Country_Sent = Country_Sent[Country_Sent['Count'] > 14]
Country_Sent = Country_Sent.reset_index(drop=True)
Country_Sent.head()
# In[216]:
#plot to see frequencies
Country_Sent.plot.bar(x='Country', y='Count')
plt.show()
#We have repeatedly plotted this, identifying weird occurrences (small countries with high counts),
#then eliminating them from the data set and repeating the process
# In[217]:
#create a list with all possible names of the countries above
countries_used_name = []
countries_used_alpha_2 =[]
countries_used_alpha_3 =[]
for country in Country_Sent['Country']:
country_names = pycountry.countries.get(name=country)
countries_used_name.append(country_names.name)
countries_used_alpha_2.append(country_names.alpha_2)
countries_used_alpha_3.append(country_names.alpha_3)
Country_Sent['Alpha_2']=countries_used_alpha_2
Country_Sent['Alpha_3']=countries_used_alpha_3
Country_Sent.head()
# In[218]:
len(Country_Sent)
# Now we check sentiment on emails around these names
# In[170]:
sentiments = []
vader_analyzer = SentimentIntensityAnalyzer()
size = len(Country_Sent['Alpha_2'])
for i in range(1,size):
country_score =[]
for email in emails_final['TokenizedText']:
if Country_Sent['Alpha_2'][i] in email or Country_Sent['Alpha_3'][i] in email or Country_Sent['Country'][i] in email:
str_email = ' '.join(email)
sentiment = vader_analyzer.polarity_scores(str_email)
score = sentiment['compound']
country_score.append(score)
else: pass
if len(country_score)!=0:
sentiment_score = sum(country_score) / float(len(country_score))
sentiments.append(sentiment_score)
else:
sentiments.append(999)
# In[291]:
sentiments
# In[220]:
#the loop above started at index 1, so the first country (NZ) received no sentiment score and must be dropped
Country_Sent = Country_Sent.drop(Country_Sent.index[[0]])
len(Country_Sent)
# In[222]:
#add sentiment list to data frame
Country_Sent['Sentiment'] = sentiments
Country_Sent.head()
# In[224]:
#delete any row with sentiment value of 999
Country_Sent = Country_Sent[Country_Sent['Sentiment'] != 999]
Country_Sent.head()
# In[226]:
#reorder dataframe in ascending order of sentiment
Country_Sent.sort_values(['Sentiment'], ascending=True, inplace=True)
Country_Sent.head()
# In[254]:
#reorder index
Country_Sent = Country_Sent.reset_index(drop=True)
Country_Sent.head()
# Now we make a color gradient for the histogram
# In[288]:
#We must normalize the sentiment scores and create a gradient based on that (green, blue & red gradient)
#first we sort the ones that are below zero, than the ones above zero
color_grad = []
size = len(Country_Sent['Sentiment'])
for i in range(0,size):
if Country_Sent['Sentiment'][i] < 0:
high = 0
low = np.min(sentiments)
rg = low-high
new_entry = (low-Country_Sent['Sentiment'][i])/rg
red = 1 - new_entry
color_grad.append((red,0,0))
else:
high = np.max(sentiments)
low = 0
rg2 = high-low
new_entry = (Country_Sent['Sentiment'][i]-low)/rg2
green = 1 - new_entry
color_grad.append((0,green,0))
Country_Sent['color_grad'] = color_grad
Country_Sent.head()
# In[289]:
#Now we create the bar plot based on this palette
import seaborn as sns
plt.figure(figsize=(30,20))
plot = sns.barplot(x='Country', y='Sentiment', data=Country_Sent, orient='vertical', palette=color_grad)
plt.ylabel('Country Sentiment');
plt.show()
# In[252]:
#Now we create a bar plot with an automatic gradient based on sentiment
size = len(Country_Sent['Sentiment'])
plt.figure(figsize=(30,20))
grad = sns.diverging_palette(10, 225, n=32)
plot = sns.barplot(x='Country', y='Sentiment', data=Country_Sent, orient='vertical', palette = grad )
plt.xticks(rotation=60);
plt.ylabel('Country Sentiment');
plt.show()
# Comment on Sentiment Data:
# Some countries were lost in this analysis; it is not clear why yet.
# Comments on Data Viz:
# Creating my own palette somehow erased the nuances between countries even when the difference in
# scores was significant. The automatically generated palette performed much better at conveying the info.
# In[ ]:
|
But today I am sharing a new one pot recipe that is once again super easy and delicious! Everything cooks in one skillet, and I also think it brings something different to the table for your family. This recipe serves four; we ate half for dinner and the other half for lunch the next day.
That's it! I promise this will be a pleaser for everyone and a great weeknight meal! If you want to see my list of one pot meals click here. I am working my way to 100!
Thanks for catching up! I will be back here on Wednesday for What’s up Wednesday! |
import numpy as np
import regreg.api as rr
np.random.seed(400)
N = 500
P = 2
Y = 2 * np.random.binomial(1, 0.5, size=(N,)) - 1.
X = np.random.standard_normal((N,P))
X[Y==1] += np.array([3,-2])[np.newaxis,:]
X_1 = np.hstack([X, np.ones((N,1))])
X_1_signs = -Y[:,np.newaxis] * X_1
transform = rr.affine_transform(X_1_signs, np.ones(N))
C = 0.2
hinge = rr.positive_part(N, lagrange=C)
hinge_loss = rr.linear_atom(hinge, transform)
quadratic = rr.quadratic.linear(rr.selector(slice(0,P), (P+1,)), coef=0.5)
problem = rr.container(quadratic, hinge_loss)
solver = rr.FISTA(problem)
solver.fit()
import pylab
pylab.clf()
pylab.scatter(X[Y==1,0],X[Y==1,1], facecolor='red')
pylab.scatter(X[Y==-1,0],X[Y==-1,1], facecolor='blue')
fits = np.dot(X_1, problem.coefs)
labels = 2 * (fits > 0) - 1
pointX = [X[:,0].min(), X[:,0].max()]
pointY = [-(pointX[0]*problem.coefs[0]+problem.coefs[2])/problem.coefs[1],
-(pointX[1]*problem.coefs[0]+problem.coefs[2])/problem.coefs[1]]
pylab.plot(pointX, pointY, linestyle='--', label='Separating hyperplane')
pylab.title("Accuracy = %0.1f %%" % (100-100 * np.fabs(labels - Y).sum() / (2 * N)))
#pylab.show()
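# For reference, the problem assembled above is the usual soft-margin SVM primal:
#   minimize over (w, b):  0.5 * ||w||^2 + C * sum_i max(0, 1 - y_i * (x_i . w + b))
# with C = 0.2; the rr.selector restricts the quadratic penalty to w, leaving the
# intercept b unpenalized.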
|
The Phantom cruiser is one of the three latest additions to the Honda Shadow lineup, and it has drawn attention ever since its release. Now the 2018 Honda Shadow Phantom is on the way, with new features and options. It will use the same 750 cc engine with all the characteristics of Shadow bikes, and the highlight of this motorcycle is its low seating position. After the Shadow Phantom, Honda could come out with a new Shadow Aero bike, which has not been available since the 2016 model year.
The engine of the 2018 Honda Shadow Phantom is a 745 cc unit with liquid cooling. This drivetrain is very economical, with a combined fuel economy of 56 mpg. With a 3.7-gallon fuel tank, the 2018 Shadow Phantom has a range of around 200 miles. The V-twin engine combines an electric starter, a 79×76 mm bore and stroke, and a 9.6:1 compression ratio. The manual transmission has 5 gears. The valve configuration is SOHC, and braking comes from a disc on the front wheel and a drum on the rear.
The seating position of the 2018 Honda Shadow Phantom is very low, so whoever loves cruising will like this bike because of its saddle. At 26 inches, riders will have the impression that they almost touch the ground. The front wheel is 17 inches and the rear 15 inches. The wheelbase of the 2018 Shadow Phantom is 65 inches, and the curb weight, with all liquids, is 550 pounds. We are still waiting for confirmation of color options. Buyers will be able to upgrade the bike with a tall backrest or various bags and extra carriers.
The major rival for the 2018 Honda Shadow Phantom will be the Suzuki Boulevard S40; Kawasaki and Harley-Davidson also offer a few models in this class. Nevertheless, Honda fans can find an alternative to the Shadow Phantom in the smaller Rebel 500. The 2018 Phantom will cost $7,800, which is even more than Harley's sub-750 cc Street model. |
from tornado import gen
from itertools import izip_longest
from functools import partial
from operator import is_not
class Delete(object):
def __init__(self):
self.delete_count = 0
self.message_count = 0
@gen.coroutine
def _one_request(self, sqs_delete_message_batch, queue_url, ids):
self.delete_count += 1
resp = yield gen.Task(sqs_delete_message_batch.call,
QueueUrl=queue_url,
Entries=ids)
if 'Successful' in resp:
self.message_count += len(resp['Successful'])
if 'Failed' in resp:
raise Exception('failed to delete messages')
@gen.coroutine
def execute(self, sqs_delete_message_batch, queue_url, ids):
id_groups = group_by_10(ids)
r = []
for id_group in id_groups:
r.append(self._one_request(sqs_delete_message_batch,
queue_url, id_group))
yield r
def group_by_10(ids):
def grouper(iterable, n):
args = [iter(iterable)] * n
return izip_longest(*args)
def convert(t):
return list(filter(partial(is_not, None), t))
return map(convert, grouper(ids, 10))
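# Minimal sketch of the batching helper (doctest-style; the 23-id input is just an
# assumed example):
# >>> [len(batch) for batch in group_by_10(range(23))]
# [10, 10, 3]
# izip_longest pads the final tuple with None and convert() strips that padding,
# so each Entries list sent to SQS holds at most 10 ids.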
|
Police have named a 39-year-old man who died on a Stoke-on-Trent street at the weekend.
John Hughes died after collapsing on Pevensey Grove, Longton at around 6.45pm on Saturday evening.
Paramedics were unable to save John, who was from Longton, and he was pronounced dead at the scene.
A Staffordshire Police spokesman said: "We can confirm that the man who sadly passed away on Pevensey Grove, Longton on Saturday November 25 was 39-year-old John Hughes from Longton.
"John collapsed in Pevensey Grove at around 6:45pm. Despite the best efforts of paramedics, he passed away a short time after.
"His family is being supported by specially trained officers and they have asked that their privacy is respected at this difficult time.
"The death is not considered suspicious and a report is being prepared for HM Coroner." |
"""
sentry.models.eventmapping
~~~~~~~~~~~~~~~~~~~~~~~~~~
:copyright: (c) 2010-2014 by the Sentry Team, see AUTHORS for more details.
:license: BSD, see LICENSE for more details.
"""
from __future__ import absolute_import
from django.db import models
from django.utils import timezone
from sentry.db.models import (BoundedBigIntegerField, Model, sane_repr)
class EventMapping(Model):
__core__ = False
project_id = BoundedBigIntegerField()
group_id = BoundedBigIntegerField()
event_id = models.CharField(max_length=32)
date_added = models.DateTimeField(default=timezone.now)
class Meta:
app_label = 'sentry'
db_table = 'sentry_eventmapping'
unique_together = (('project_id', 'event_id'), )
__repr__ = sane_repr('project_id', 'group_id', 'event_id')
# Implement a ForeignKey-like accessor for backwards compat
def _set_group(self, group):
self.group_id = group.id
self._group_cache = group
def _get_group(self):
from sentry.models import Group
if not hasattr(self, '_group_cache'):
self._group_cache = Group.objects.get(id=self.group_id)
return self._group_cache
group = property(_get_group, _set_group)
# Implement a ForeignKey-like accessor for backwards compat
def _set_project(self, project):
self.project_id = project.id
self._project_cache = project
def _get_project(self):
from sentry.models import Project
if not hasattr(self, '_project_cache'):
self._project_cache = Project.objects.get(id=self.project_id)
return self._project_cache
project = property(_get_project, _set_project)
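# Illustrative use of the backwards-compat accessors above (group/project are
# placeholder objects, not values defined here): assigning mapping.group = group
# stores group.id in group_id and caches the instance, while reading
# mapping.project lazily fetches Project.objects.get(id=mapping.project_id) on
# first access and caches it.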
|
# -*- coding: utf-8 -*-
"""
ReklaminiaiParduotuviųLankstinukai
Copyright (C) <2014> <Algirdas Butkus> <butkus.algirdas@gmail.com>
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU Affero General Public License as
published by the Free Software Foundation, either version 3 of the
License, or (at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU Affero General Public License for more details.
You should have received a copy of the GNU Affero General Public License
along with this program. If not, see <http://www.gnu.org/licenses/>.
"""
version = 0.011
from PyQt4 import QtCore, QtGui, QtWebKit
import os
def SEP(path):
separator = os.path.sep
if separator != '/':
path = path.replace('/', os.path.sep)
return path
userdir = os.path.expanduser('~')
userprogpath = SEP('/.cache/deadprogram/')
try:
_fromUtf8 = QtCore.QString.fromUtf8
except AttributeError:
def _fromUtf8(s):
return s
try:
_encoding = QtGui.QApplication.UnicodeUTF8
def _translate(context, text, disambig):
return QtGui.QApplication.translate(context, text, disambig, _encoding)
except AttributeError:
def _translate(context, text, disambig):
return QtGui.QApplication.translate(context, text, disambig)
class Ui_MainWindow(object):
def setupUi(self, MainWindow):
MainWindow.setObjectName(_fromUtf8("MainWindow"))
MainWindow.resize(905, 636)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Expanding, QtGui.QSizePolicy.Expanding)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(MainWindow.sizePolicy().hasHeightForWidth())
MainWindow.setSizePolicy(sizePolicy)
MainWindow.setMinimumSize(QtCore.QSize(0, 600))
font = QtGui.QFont()
font.setFamily(_fromUtf8("Sans"))
MainWindow.setFont(font)
icon = QtGui.QIcon()
icon.addPixmap(QtGui.QPixmap(_fromUtf8(userdir + userprogpath + SEP("icons/image.png"))), QtGui.QIcon.Normal, QtGui.QIcon.Off)
MainWindow.setWindowIcon(icon)
self.centralwidget = QtGui.QWidget(MainWindow)
self.centralwidget.setObjectName(_fromUtf8("centralwidget"))
self.verticalLayout_2 = QtGui.QVBoxLayout(self.centralwidget)
self.verticalLayout_2.setSpacing(6)
self.verticalLayout_2.setMargin(6)
self.verticalLayout_2.setObjectName(_fromUtf8("verticalLayout_2"))
self.gridLayout = QtGui.QGridLayout()
self.gridLayout.setSizeConstraint(QtGui.QLayout.SetMinAndMaxSize)
self.gridLayout.setSpacing(0)
self.gridLayout.setObjectName(_fromUtf8("gridLayout"))
self.horizontalLayout = QtGui.QHBoxLayout()
self.horizontalLayout.setSpacing(0)
self.horizontalLayout.setSizeConstraint(QtGui.QLayout.SetNoConstraint)
self.horizontalLayout.setObjectName(_fromUtf8("horizontalLayout"))
self.tabWidget = QtGui.QTabWidget(self.centralwidget)
self.tabWidget.setMinimumSize(QtCore.QSize(0, 27))
self.tabWidget.setBaseSize(QtCore.QSize(0, 0))
self.tabWidget.setFocusPolicy(QtCore.Qt.NoFocus)
self.tabWidget.setTabShape(QtGui.QTabWidget.Rounded)
self.tabWidget.setDocumentMode(False)
self.tabWidget.setMovable(False)
self.tabWidget.setObjectName(_fromUtf8("tabWidget"))
self.pdftab = QtGui.QWidget()
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Expanding, QtGui.QSizePolicy.Expanding)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.pdftab.sizePolicy().hasHeightForWidth())
self.pdftab.setSizePolicy(sizePolicy)
self.pdftab.setObjectName(_fromUtf8("pdftab"))
self.verticalLayout_4 = QtGui.QVBoxLayout(self.pdftab)
self.verticalLayout_4.setSpacing(0)
self.verticalLayout_4.setMargin(0)
self.verticalLayout_4.setObjectName(_fromUtf8("verticalLayout_4"))
self.horizontalLayout_5 = QtGui.QHBoxLayout()
self.horizontalLayout_5.setSpacing(1)
self.horizontalLayout_5.setContentsMargins(-1, 2, -1, 1)
self.horizontalLayout_5.setObjectName(_fromUtf8("horizontalLayout_5"))
self.comboBox_2 = QtGui.QComboBox(self.pdftab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Preferred, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.comboBox_2.sizePolicy().hasHeightForWidth())
self.comboBox_2.setSizePolicy(sizePolicy)
self.comboBox_2.setMinimumSize(QtCore.QSize(100, 0))
self.comboBox_2.setMaximumSize(QtCore.QSize(100, 23))
font = QtGui.QFont()
font.setFamily(_fromUtf8("Sans Serif"))
self.comboBox_2.setFont(font)
self.comboBox_2.setFocusPolicy(QtCore.Qt.StrongFocus)
self.comboBox_2.setObjectName(_fromUtf8("comboBox_2"))
self.comboBox_2.addItem(_fromUtf8(""))
self.horizontalLayout_5.addWidget(self.comboBox_2)
self.comboBox_3 = QtGui.QComboBox(self.pdftab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Preferred, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.comboBox_3.sizePolicy().hasHeightForWidth())
self.comboBox_3.setSizePolicy(sizePolicy)
self.comboBox_3.setMinimumSize(QtCore.QSize(100, 0))
self.comboBox_3.setMaximumSize(QtCore.QSize(100, 23))
self.comboBox_3.setFocusPolicy(QtCore.Qt.StrongFocus)
self.comboBox_3.setObjectName(_fromUtf8("comboBox_3"))
self.comboBox_3.addItem(_fromUtf8(""))
self.horizontalLayout_5.addWidget(self.comboBox_3)
self.comboBox_4 = QtGui.QComboBox(self.pdftab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Preferred, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.comboBox_4.sizePolicy().hasHeightForWidth())
self.comboBox_4.setSizePolicy(sizePolicy)
self.comboBox_4.setMinimumSize(QtCore.QSize(100, 0))
self.comboBox_4.setMaximumSize(QtCore.QSize(100, 23))
self.comboBox_4.setFocusPolicy(QtCore.Qt.StrongFocus)
self.comboBox_4.setObjectName(_fromUtf8("comboBox_4"))
self.comboBox_4.addItem(_fromUtf8(""))
self.horizontalLayout_5.addWidget(self.comboBox_4)
self.comboBox_6 = QtGui.QComboBox(self.pdftab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Preferred, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.comboBox_6.sizePolicy().hasHeightForWidth())
self.comboBox_6.setSizePolicy(sizePolicy)
self.comboBox_6.setMinimumSize(QtCore.QSize(100, 0))
self.comboBox_6.setMaximumSize(QtCore.QSize(100, 23))
self.comboBox_6.setFocusPolicy(QtCore.Qt.StrongFocus)
self.comboBox_6.setObjectName(_fromUtf8("comboBox_6"))
self.comboBox_6.addItem(_fromUtf8(""))
self.horizontalLayout_5.addWidget(self.comboBox_6)
self.comboBox_5 = QtGui.QComboBox(self.pdftab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Preferred, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.comboBox_5.sizePolicy().hasHeightForWidth())
self.comboBox_5.setSizePolicy(sizePolicy)
self.comboBox_5.setMinimumSize(QtCore.QSize(100, 0))
self.comboBox_5.setMaximumSize(QtCore.QSize(100, 23))
self.comboBox_5.setFocusPolicy(QtCore.Qt.StrongFocus)
self.comboBox_5.setObjectName(_fromUtf8("comboBox_5"))
self.comboBox_5.addItem(_fromUtf8(""))
self.horizontalLayout_5.addWidget(self.comboBox_5)
self.comboBox_7 = QtGui.QComboBox(self.pdftab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Preferred, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.comboBox_7.sizePolicy().hasHeightForWidth())
self.comboBox_7.setSizePolicy(sizePolicy)
self.comboBox_7.setMinimumSize(QtCore.QSize(130, 0))
self.comboBox_7.setMaximumSize(QtCore.QSize(100, 23))
self.comboBox_7.setFocusPolicy(QtCore.Qt.StrongFocus)
self.comboBox_7.setObjectName(_fromUtf8("comboBox_7"))
self.comboBox_7.addItem(_fromUtf8(""))
self.horizontalLayout_5.addWidget(self.comboBox_7)
self.comboBox_10 = QtGui.QComboBox(self.pdftab)
self.comboBox_10.setMinimumSize(QtCore.QSize(145, 0))
self.comboBox_10.setMaximumSize(QtCore.QSize(16777215, 23))
self.comboBox_10.setFocusPolicy(QtCore.Qt.StrongFocus)
self.comboBox_10.setObjectName(_fromUtf8("comboBox_10"))
self.comboBox_10.addItem(_fromUtf8(""))
self.horizontalLayout_5.addWidget(self.comboBox_10)
self.comboBox_11 = QtGui.QComboBox(self.pdftab)
self.comboBox_11.setMinimumSize(QtCore.QSize(100, 0))
self.comboBox_11.setMaximumSize(QtCore.QSize(100, 23))
self.comboBox_11.setFocusPolicy(QtCore.Qt.StrongFocus)
self.comboBox_11.setObjectName(_fromUtf8("comboBox_11"))
self.comboBox_11.addItem(_fromUtf8(""))
self.horizontalLayout_5.addWidget(self.comboBox_11)
spacerItem = QtGui.QSpacerItem(40, 20, QtGui.QSizePolicy.Expanding, QtGui.QSizePolicy.Minimum)
self.horizontalLayout_5.addItem(spacerItem)
self.verticalLayout_4.addLayout(self.horizontalLayout_5)
self.horizontalLayout_12 = QtGui.QHBoxLayout()
self.horizontalLayout_12.setSpacing(1)
self.horizontalLayout_12.setContentsMargins(-1, 1, -1, 2)
self.horizontalLayout_12.setObjectName(_fromUtf8("horizontalLayout_12"))
self.comboBox_12 = QtGui.QComboBox(self.pdftab)
self.comboBox_12.setMinimumSize(QtCore.QSize(100, 0))
self.comboBox_12.setMaximumSize(QtCore.QSize(100, 23))
self.comboBox_12.setFocusPolicy(QtCore.Qt.StrongFocus)
self.comboBox_12.setObjectName(_fromUtf8("comboBox_12"))
self.comboBox_12.addItem(_fromUtf8(""))
self.horizontalLayout_12.addWidget(self.comboBox_12)
self.comboBox_13 = QtGui.QComboBox(self.pdftab)
self.comboBox_13.setMinimumSize(QtCore.QSize(100, 0))
self.comboBox_13.setMaximumSize(QtCore.QSize(100, 23))
self.comboBox_13.setFocusPolicy(QtCore.Qt.StrongFocus)
self.comboBox_13.setObjectName(_fromUtf8("comboBox_13"))
self.comboBox_13.addItem(_fromUtf8(""))
self.horizontalLayout_12.addWidget(self.comboBox_13)
self.comboBox_14 = QtGui.QComboBox(self.pdftab)
self.comboBox_14.setMinimumSize(QtCore.QSize(100, 0))
self.comboBox_14.setMaximumSize(QtCore.QSize(100, 23))
self.comboBox_14.setFocusPolicy(QtCore.Qt.StrongFocus)
self.comboBox_14.setObjectName(_fromUtf8("comboBox_14"))
self.comboBox_14.addItem(_fromUtf8(""))
self.horizontalLayout_12.addWidget(self.comboBox_14)
self.comboBox_8 = QtGui.QComboBox(self.pdftab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Preferred, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.comboBox_8.sizePolicy().hasHeightForWidth())
self.comboBox_8.setSizePolicy(sizePolicy)
self.comboBox_8.setMinimumSize(QtCore.QSize(100, 0))
self.comboBox_8.setMaximumSize(QtCore.QSize(100, 23))
self.comboBox_8.setFocusPolicy(QtCore.Qt.StrongFocus)
self.comboBox_8.setObjectName(_fromUtf8("comboBox_8"))
self.comboBox_8.addItem(_fromUtf8(""))
self.horizontalLayout_12.addWidget(self.comboBox_8)
self.comboBox_9 = QtGui.QComboBox(self.pdftab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Preferred, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.comboBox_9.sizePolicy().hasHeightForWidth())
self.comboBox_9.setSizePolicy(sizePolicy)
self.comboBox_9.setMinimumSize(QtCore.QSize(100, 0))
self.comboBox_9.setMaximumSize(QtCore.QSize(100, 23))
self.comboBox_9.setFocusPolicy(QtCore.Qt.StrongFocus)
self.comboBox_9.setObjectName(_fromUtf8("comboBox_9"))
self.comboBox_9.addItem(_fromUtf8(""))
self.horizontalLayout_12.addWidget(self.comboBox_9)
spacerItem1 = QtGui.QSpacerItem(40, 20, QtGui.QSizePolicy.Expanding, QtGui.QSizePolicy.Minimum)
self.horizontalLayout_12.addItem(spacerItem1)
self.checkBox_5 = QtGui.QCheckBox(self.pdftab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Minimum, QtGui.QSizePolicy.Maximum)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.checkBox_5.sizePolicy().hasHeightForWidth())
self.checkBox_5.setSizePolicy(sizePolicy)
self.checkBox_5.setMaximumSize(QtCore.QSize(16777215, 23))
self.checkBox_5.setFocusPolicy(QtCore.Qt.StrongFocus)
self.checkBox_5.setText(_fromUtf8(""))
icon1 = QtGui.QIcon()
icon1.addPixmap(QtGui.QPixmap(_fromUtf8(userdir + userprogpath + SEP("icons/zoom-fit.png"))), QtGui.QIcon.Normal, QtGui.QIcon.On)
self.checkBox_5.setIcon(icon1)
self.checkBox_5.setObjectName(_fromUtf8("checkBox_5"))
self.horizontalLayout_12.addWidget(self.checkBox_5)
self.doubleSpinBox = QtGui.QDoubleSpinBox(self.pdftab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Fixed, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.doubleSpinBox.sizePolicy().hasHeightForWidth())
self.doubleSpinBox.setSizePolicy(sizePolicy)
self.doubleSpinBox.setMinimumSize(QtCore.QSize(0, 0))
self.doubleSpinBox.setMaximumSize(QtCore.QSize(16777215, 23))
self.doubleSpinBox.setPrefix(_fromUtf8(""))
self.doubleSpinBox.setSuffix(_fromUtf8(""))
self.doubleSpinBox.setDecimals(2)
self.doubleSpinBox.setMinimum(0.5)
self.doubleSpinBox.setMaximum(1.5)
self.doubleSpinBox.setSingleStep(0.05)
self.doubleSpinBox.setProperty("value", 1.0)
self.doubleSpinBox.setObjectName(_fromUtf8("doubleSpinBox"))
self.horizontalLayout_12.addWidget(self.doubleSpinBox)
self.pushButton_11 = QtGui.QPushButton(self.pdftab)
self.pushButton_11.setMaximumSize(QtCore.QSize(27, 23))
self.pushButton_11.setFocusPolicy(QtCore.Qt.StrongFocus)
self.pushButton_11.setText(_fromUtf8(""))
icon2 = QtGui.QIcon()
icon2.addPixmap(QtGui.QPixmap(_fromUtf8(userdir + userprogpath + SEP("icons/go-up.png"))), QtGui.QIcon.Normal, QtGui.QIcon.On)
self.pushButton_11.setIcon(icon2)
self.pushButton_11.setFlat(True)
self.pushButton_11.setObjectName(_fromUtf8("pushButton_11"))
self.horizontalLayout_12.addWidget(self.pushButton_11)
self.pushButton_6 = QtGui.QPushButton(self.pdftab)
self.pushButton_6.setMaximumSize(QtCore.QSize(27, 23))
self.pushButton_6.setBaseSize(QtCore.QSize(0, 27))
self.pushButton_6.setFocusPolicy(QtCore.Qt.StrongFocus)
self.pushButton_6.setText(_fromUtf8(""))
icon3 = QtGui.QIcon()
icon3.addPixmap(QtGui.QPixmap(_fromUtf8(userdir + userprogpath + SEP("icons/go-down.png"))), QtGui.QIcon.Normal, QtGui.QIcon.On)
self.pushButton_6.setIcon(icon3)
self.pushButton_6.setFlat(True)
self.pushButton_6.setObjectName(_fromUtf8("pushButton_6"))
self.horizontalLayout_12.addWidget(self.pushButton_6)
self.label_8 = QtGui.QLabel(self.pdftab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Maximum, QtGui.QSizePolicy.Maximum)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.label_8.sizePolicy().hasHeightForWidth())
self.label_8.setSizePolicy(sizePolicy)
self.label_8.setObjectName(_fromUtf8("label_8"))
self.horizontalLayout_12.addWidget(self.label_8)
self.verticalLayout_4.addLayout(self.horizontalLayout_12)
self.line_2 = QtGui.QFrame(self.pdftab)
self.line_2.setFrameShape(QtGui.QFrame.HLine)
self.line_2.setFrameShadow(QtGui.QFrame.Sunken)
self.line_2.setObjectName(_fromUtf8("line_2"))
self.verticalLayout_4.addWidget(self.line_2)
self.webView_2 = QtWebKit.QWebView(self.pdftab)
self.webView_2.setContextMenuPolicy(QtCore.Qt.NoContextMenu)
self.webView_2.setAutoFillBackground(False)
self.webView_2.setProperty("url", QtCore.QUrl(_fromUtf8("about:blank")))
self.webView_2.setObjectName(_fromUtf8("webView_2"))
self.verticalLayout_4.addWidget(self.webView_2)
self.tabWidget.addTab(self.pdftab, _fromUtf8(""))
self.Internettab = QtGui.QWidget()
self.Internettab.setObjectName(_fromUtf8("Internettab"))
self.verticalLayout_3 = QtGui.QVBoxLayout(self.Internettab)
self.verticalLayout_3.setSpacing(0)
self.verticalLayout_3.setMargin(0)
self.verticalLayout_3.setObjectName(_fromUtf8("verticalLayout_3"))
self.horizontalLayout_2 = QtGui.QHBoxLayout()
self.horizontalLayout_2.setSpacing(1)
self.horizontalLayout_2.setContentsMargins(0, 2, 0, 1)
self.horizontalLayout_2.setObjectName(_fromUtf8("horizontalLayout_2"))
self.Intbuttonmaxima = QtGui.QPushButton(self.Internettab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Minimum, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.Intbuttonmaxima.sizePolicy().hasHeightForWidth())
self.Intbuttonmaxima.setSizePolicy(sizePolicy)
self.Intbuttonmaxima.setMinimumSize(QtCore.QSize(0, 0))
self.Intbuttonmaxima.setMaximumSize(QtCore.QSize(16777215, 23))
self.Intbuttonmaxima.setFocusPolicy(QtCore.Qt.StrongFocus)
self.Intbuttonmaxima.setAcceptDrops(False)
self.Intbuttonmaxima.setCheckable(True)
self.Intbuttonmaxima.setChecked(False)
self.Intbuttonmaxima.setFlat(False)
self.Intbuttonmaxima.setObjectName(_fromUtf8("Intbuttonmaxima"))
self.horizontalLayout_2.addWidget(self.Intbuttonmaxima)
self.Intbuttonnorfa = QtGui.QPushButton(self.Internettab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Minimum, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.Intbuttonnorfa.sizePolicy().hasHeightForWidth())
self.Intbuttonnorfa.setSizePolicy(sizePolicy)
self.Intbuttonnorfa.setMinimumSize(QtCore.QSize(0, 0))
self.Intbuttonnorfa.setMaximumSize(QtCore.QSize(16777215, 23))
self.Intbuttonnorfa.setFocusPolicy(QtCore.Qt.StrongFocus)
self.Intbuttonnorfa.setCheckable(True)
self.Intbuttonnorfa.setFlat(False)
self.Intbuttonnorfa.setObjectName(_fromUtf8("Intbuttonnorfa"))
self.horizontalLayout_2.addWidget(self.Intbuttonnorfa)
self.Intbuttoniki = QtGui.QPushButton(self.Internettab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Minimum, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.Intbuttoniki.sizePolicy().hasHeightForWidth())
self.Intbuttoniki.setSizePolicy(sizePolicy)
self.Intbuttoniki.setMinimumSize(QtCore.QSize(0, 0))
self.Intbuttoniki.setMaximumSize(QtCore.QSize(16777215, 23))
self.Intbuttoniki.setFocusPolicy(QtCore.Qt.StrongFocus)
self.Intbuttoniki.setCheckable(True)
self.Intbuttoniki.setFlat(False)
self.Intbuttoniki.setObjectName(_fromUtf8("Intbuttoniki"))
self.horizontalLayout_2.addWidget(self.Intbuttoniki)
self.Intbuttonrimi = QtGui.QPushButton(self.Internettab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Minimum, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.Intbuttonrimi.sizePolicy().hasHeightForWidth())
self.Intbuttonrimi.setSizePolicy(sizePolicy)
self.Intbuttonrimi.setMinimumSize(QtCore.QSize(0, 0))
self.Intbuttonrimi.setMaximumSize(QtCore.QSize(16777215, 23))
self.Intbuttonrimi.setFocusPolicy(QtCore.Qt.StrongFocus)
self.Intbuttonrimi.setCheckable(True)
self.Intbuttonrimi.setFlat(False)
self.Intbuttonrimi.setObjectName(_fromUtf8("Intbuttonrimi"))
self.horizontalLayout_2.addWidget(self.Intbuttonrimi)
self.intbuttonaibe = QtGui.QPushButton(self.Internettab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Minimum, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.intbuttonaibe.sizePolicy().hasHeightForWidth())
self.intbuttonaibe.setSizePolicy(sizePolicy)
self.intbuttonaibe.setMinimumSize(QtCore.QSize(0, 0))
self.intbuttonaibe.setMaximumSize(QtCore.QSize(16777215, 23))
self.intbuttonaibe.setFocusPolicy(QtCore.Qt.StrongFocus)
self.intbuttonaibe.setCheckable(True)
self.intbuttonaibe.setObjectName(_fromUtf8("intbuttonaibe"))
self.horizontalLayout_2.addWidget(self.intbuttonaibe)
self.intbuttonFRESH_MARKET = QtGui.QPushButton(self.Internettab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Minimum, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.intbuttonFRESH_MARKET.sizePolicy().hasHeightForWidth())
self.intbuttonFRESH_MARKET.setSizePolicy(sizePolicy)
self.intbuttonFRESH_MARKET.setMinimumSize(QtCore.QSize(0, 0))
self.intbuttonFRESH_MARKET.setMaximumSize(QtCore.QSize(16777215, 23))
self.intbuttonFRESH_MARKET.setFocusPolicy(QtCore.Qt.StrongFocus)
self.intbuttonFRESH_MARKET.setCheckable(True)
self.intbuttonFRESH_MARKET.setObjectName(_fromUtf8("intbuttonFRESH_MARKET"))
self.horizontalLayout_2.addWidget(self.intbuttonFRESH_MARKET)
self.intbuttonPROMO = QtGui.QPushButton(self.Internettab)
self.intbuttonPROMO.setMaximumSize(QtCore.QSize(16777215, 23))
self.intbuttonPROMO.setFocusPolicy(QtCore.Qt.StrongFocus)
self.intbuttonPROMO.setCheckable(True)
self.intbuttonPROMO.setObjectName(_fromUtf8("intbuttonPROMO"))
self.horizontalLayout_2.addWidget(self.intbuttonPROMO)
self.intbuttonPRISMA = QtGui.QPushButton(self.Internettab)
self.intbuttonPRISMA.setMaximumSize(QtCore.QSize(16777215, 23))
self.intbuttonPRISMA.setFocusPolicy(QtCore.Qt.StrongFocus)
self.intbuttonPRISMA.setCheckable(True)
self.intbuttonPRISMA.setObjectName(_fromUtf8("intbuttonPRISMA"))
self.horizontalLayout_2.addWidget(self.intbuttonPRISMA)
spacerItem2 = QtGui.QSpacerItem(40, 20, QtGui.QSizePolicy.Expanding, QtGui.QSizePolicy.Minimum)
self.horizontalLayout_2.addItem(spacerItem2)
self.pushButton_9 = QtGui.QPushButton(self.Internettab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Maximum, QtGui.QSizePolicy.Maximum)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.pushButton_9.sizePolicy().hasHeightForWidth())
self.pushButton_9.setSizePolicy(sizePolicy)
self.pushButton_9.setMinimumSize(QtCore.QSize(0, 0))
self.pushButton_9.setMaximumSize(QtCore.QSize(16777215, 23))
font = QtGui.QFont()
font.setStyleStrategy(QtGui.QFont.PreferDefault)
self.pushButton_9.setFont(font)
self.pushButton_9.setFocusPolicy(QtCore.Qt.StrongFocus)
self.pushButton_9.setAcceptDrops(True)
self.pushButton_9.setText(_fromUtf8(""))
icon4 = QtGui.QIcon()
icon4.addPixmap(QtGui.QPixmap(_fromUtf8(userdir + userprogpath + SEP("icons/user-trash.png"))), QtGui.QIcon.Normal, QtGui.QIcon.On)
self.pushButton_9.setIcon(icon4)
self.pushButton_9.setIconSize(QtCore.QSize(24, 24))
self.pushButton_9.setAutoRepeat(False)
self.pushButton_9.setFlat(True)
self.pushButton_9.setObjectName(_fromUtf8("pushButton_9"))
self.horizontalLayout_2.addWidget(self.pushButton_9)
self.verticalLayout_3.addLayout(self.horizontalLayout_2)
self.horizontalLayout_17 = QtGui.QHBoxLayout()
self.horizontalLayout_17.setSpacing(1)
self.horizontalLayout_17.setContentsMargins(-1, 1, -1, 1)
self.horizontalLayout_17.setObjectName(_fromUtf8("horizontalLayout_17"))
self.intbuttonEUROKOS = QtGui.QPushButton(self.Internettab)
self.intbuttonEUROKOS.setMaximumSize(QtCore.QSize(16777215, 23))
self.intbuttonEUROKOS.setFocusPolicy(QtCore.Qt.StrongFocus)
self.intbuttonEUROKOS.setCheckable(True)
self.intbuttonEUROKOS.setObjectName(_fromUtf8("intbuttonEUROKOS"))
self.horizontalLayout_17.addWidget(self.intbuttonEUROKOS)
self.intbuttonDrogas = QtGui.QPushButton(self.Internettab)
self.intbuttonDrogas.setMaximumSize(QtCore.QSize(16777215, 23))
self.intbuttonDrogas.setFocusPolicy(QtCore.Qt.StrongFocus)
self.intbuttonDrogas.setCheckable(True)
self.intbuttonDrogas.setObjectName(_fromUtf8("intbuttonDrogas"))
self.horizontalLayout_17.addWidget(self.intbuttonDrogas)
self.intbuttonERMITAZAS = QtGui.QPushButton(self.Internettab)
self.intbuttonERMITAZAS.setMaximumSize(QtCore.QSize(16777215, 23))
self.intbuttonERMITAZAS.setFocusPolicy(QtCore.Qt.StrongFocus)
self.intbuttonERMITAZAS.setCheckable(True)
self.intbuttonERMITAZAS.setObjectName(_fromUtf8("intbuttonERMITAZAS"))
self.horizontalLayout_17.addWidget(self.intbuttonERMITAZAS)
self.intbuttonSenukai = QtGui.QPushButton(self.Internettab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Minimum, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.intbuttonSenukai.sizePolicy().hasHeightForWidth())
self.intbuttonSenukai.setSizePolicy(sizePolicy)
self.intbuttonSenukai.setMinimumSize(QtCore.QSize(0, 0))
self.intbuttonSenukai.setMaximumSize(QtCore.QSize(16777215, 23))
self.intbuttonSenukai.setFocusPolicy(QtCore.Qt.StrongFocus)
self.intbuttonSenukai.setCheckable(True)
self.intbuttonSenukai.setObjectName(_fromUtf8("intbuttonSenukai"))
self.horizontalLayout_17.addWidget(self.intbuttonSenukai)
self.intbuttonMoki_Vezi = QtGui.QPushButton(self.Internettab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Minimum, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.intbuttonMoki_Vezi.sizePolicy().hasHeightForWidth())
self.intbuttonMoki_Vezi.setSizePolicy(sizePolicy)
self.intbuttonMoki_Vezi.setMinimumSize(QtCore.QSize(0, 0))
self.intbuttonMoki_Vezi.setMaximumSize(QtCore.QSize(16777215, 23))
self.intbuttonMoki_Vezi.setFocusPolicy(QtCore.Qt.StrongFocus)
self.intbuttonMoki_Vezi.setCheckable(True)
self.intbuttonMoki_Vezi.setObjectName(_fromUtf8("intbuttonMoki_Vezi"))
self.horizontalLayout_17.addWidget(self.intbuttonMoki_Vezi)
self.intbuttonJysk = QtGui.QPushButton(self.Internettab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Minimum, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.intbuttonJysk.sizePolicy().hasHeightForWidth())
self.intbuttonJysk.setSizePolicy(sizePolicy)
self.intbuttonJysk.setMinimumSize(QtCore.QSize(0, 0))
self.intbuttonJysk.setMaximumSize(QtCore.QSize(16777215, 23))
self.intbuttonJysk.setFocusPolicy(QtCore.Qt.StrongFocus)
self.intbuttonJysk.setCheckable(True)
self.intbuttonJysk.setObjectName(_fromUtf8("intbuttonJysk"))
self.horizontalLayout_17.addWidget(self.intbuttonJysk)
spacerItem3 = QtGui.QSpacerItem(40, 20, QtGui.QSizePolicy.Expanding, QtGui.QSizePolicy.Minimum)
self.horizontalLayout_17.addItem(spacerItem3)
self.comboBox = QtGui.QComboBox(self.Internettab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Fixed, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.comboBox.sizePolicy().hasHeightForWidth())
self.comboBox.setSizePolicy(sizePolicy)
self.comboBox.setMinimumSize(QtCore.QSize(200, 0))
self.comboBox.setMaximumSize(QtCore.QSize(200, 23))
font = QtGui.QFont()
font.setFamily(_fromUtf8("Sans Serif"))
font.setBold(False)
font.setWeight(50)
font.setStrikeOut(False)
self.comboBox.setFont(font)
self.comboBox.setFocusPolicy(QtCore.Qt.StrongFocus)
self.comboBox.setAcceptDrops(True)
self.comboBox.setEditable(False)
self.comboBox.setMaxVisibleItems(20)
self.comboBox.setFrame(False)
self.comboBox.setObjectName(_fromUtf8("comboBox"))
self.horizontalLayout_17.addWidget(self.comboBox)
self.verticalLayout_3.addLayout(self.horizontalLayout_17)
self.horizontalLayout_4 = QtGui.QHBoxLayout()
self.horizontalLayout_4.setSpacing(1)
self.horizontalLayout_4.setContentsMargins(-1, 1, -1, 2)
self.horizontalLayout_4.setObjectName(_fromUtf8("horizontalLayout_4"))
self.pushButton_5 = QtGui.QPushButton(self.Internettab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Maximum, QtGui.QSizePolicy.Maximum)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.pushButton_5.sizePolicy().hasHeightForWidth())
self.pushButton_5.setSizePolicy(sizePolicy)
self.pushButton_5.setMinimumSize(QtCore.QSize(0, 0))
self.pushButton_5.setMaximumSize(QtCore.QSize(16777215, 23))
self.pushButton_5.setMouseTracking(False)
self.pushButton_5.setFocusPolicy(QtCore.Qt.StrongFocus)
self.pushButton_5.setAutoFillBackground(False)
self.pushButton_5.setText(_fromUtf8(""))
icon5 = QtGui.QIcon()
icon5.addPixmap(QtGui.QPixmap(_fromUtf8(userdir + userprogpath + SEP("icons/go-previous.png"))), QtGui.QIcon.Normal, QtGui.QIcon.On)
self.pushButton_5.setIcon(icon5)
self.pushButton_5.setIconSize(QtCore.QSize(24, 24))
self.pushButton_5.setShortcut(_fromUtf8(""))
self.pushButton_5.setAutoExclusive(False)
self.pushButton_5.setAutoDefault(False)
self.pushButton_5.setDefault(False)
self.pushButton_5.setFlat(True)
self.pushButton_5.setObjectName(_fromUtf8("pushButton_5"))
self.horizontalLayout_4.addWidget(self.pushButton_5)
self.pushButton_4 = QtGui.QPushButton(self.Internettab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Maximum, QtGui.QSizePolicy.Maximum)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.pushButton_4.sizePolicy().hasHeightForWidth())
self.pushButton_4.setSizePolicy(sizePolicy)
self.pushButton_4.setMinimumSize(QtCore.QSize(0, 0))
self.pushButton_4.setMaximumSize(QtCore.QSize(16777215, 23))
self.pushButton_4.setMouseTracking(False)
self.pushButton_4.setFocusPolicy(QtCore.Qt.StrongFocus)
self.pushButton_4.setText(_fromUtf8(""))
icon6 = QtGui.QIcon()
icon6.addPixmap(QtGui.QPixmap(_fromUtf8(userdir + userprogpath + SEP("icons/go-next.png"))), QtGui.QIcon.Normal, QtGui.QIcon.On)
self.pushButton_4.setIcon(icon6)
self.pushButton_4.setIconSize(QtCore.QSize(24, 24))
self.pushButton_4.setCheckable(False)
self.pushButton_4.setFlat(True)
self.pushButton_4.setObjectName(_fromUtf8("pushButton_4"))
self.horizontalLayout_4.addWidget(self.pushButton_4)
self.pushButton_3 = QtGui.QPushButton(self.Internettab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Maximum, QtGui.QSizePolicy.Maximum)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.pushButton_3.sizePolicy().hasHeightForWidth())
self.pushButton_3.setSizePolicy(sizePolicy)
self.pushButton_3.setMinimumSize(QtCore.QSize(0, 0))
self.pushButton_3.setMaximumSize(QtCore.QSize(16777215, 23))
self.pushButton_3.setMouseTracking(False)
self.pushButton_3.setFocusPolicy(QtCore.Qt.StrongFocus)
icon7 = QtGui.QIcon()
icon7.addPixmap(QtGui.QPixmap(_fromUtf8(userdir + userprogpath + SEP("icons/process-stop.png"))), QtGui.QIcon.Normal, QtGui.QIcon.On)
self.pushButton_3.setIcon(icon7)
self.pushButton_3.setIconSize(QtCore.QSize(24, 24))
self.pushButton_3.setFlat(True)
self.pushButton_3.setObjectName(_fromUtf8("pushButton_3"))
self.horizontalLayout_4.addWidget(self.pushButton_3)
self.pushButton = QtGui.QPushButton(self.Internettab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Maximum, QtGui.QSizePolicy.Maximum)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.pushButton.sizePolicy().hasHeightForWidth())
self.pushButton.setSizePolicy(sizePolicy)
self.pushButton.setMinimumSize(QtCore.QSize(0, 0))
self.pushButton.setMaximumSize(QtCore.QSize(16777215, 23))
self.pushButton.setMouseTracking(False)
self.pushButton.setFocusPolicy(QtCore.Qt.StrongFocus)
icon8 = QtGui.QIcon()
icon8.addPixmap(QtGui.QPixmap(_fromUtf8(userdir + userprogpath + SEP("icons/view-refresh.png"))), QtGui.QIcon.Normal, QtGui.QIcon.On)
self.pushButton.setIcon(icon8)
self.pushButton.setIconSize(QtCore.QSize(24, 24))
self.pushButton.setFlat(True)
self.pushButton.setObjectName(_fromUtf8("pushButton"))
self.horizontalLayout_4.addWidget(self.pushButton)
self.pushButton_22 = QtGui.QPushButton(self.Internettab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Maximum, QtGui.QSizePolicy.Maximum)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.pushButton_22.sizePolicy().hasHeightForWidth())
self.pushButton_22.setSizePolicy(sizePolicy)
self.pushButton_22.setMinimumSize(QtCore.QSize(0, 0))
self.pushButton_22.setMaximumSize(QtCore.QSize(16777215, 23))
self.pushButton_22.setMouseTracking(False)
self.pushButton_22.setFocusPolicy(QtCore.Qt.StrongFocus)
icon9 = QtGui.QIcon()
icon9.addPixmap(QtGui.QPixmap(_fromUtf8(userdir + userprogpath + SEP("icons/go-home.png"))), QtGui.QIcon.Normal, QtGui.QIcon.On)
self.pushButton_22.setIcon(icon9)
self.pushButton_22.setIconSize(QtCore.QSize(24, 24))
self.pushButton_22.setFlat(True)
self.pushButton_22.setObjectName(_fromUtf8("pushButton_22"))
self.horizontalLayout_4.addWidget(self.pushButton_22)
self.lineEdit = QtGui.QLineEdit(self.Internettab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Expanding, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.lineEdit.sizePolicy().hasHeightForWidth())
self.lineEdit.setSizePolicy(sizePolicy)
self.lineEdit.setMinimumSize(QtCore.QSize(360, 0))
self.lineEdit.setMaximumSize(QtCore.QSize(16777215, 23))
font = QtGui.QFont()
font.setStyleStrategy(QtGui.QFont.PreferAntialias)
self.lineEdit.setFont(font)
self.lineEdit.setMouseTracking(False)
self.lineEdit.setFocusPolicy(QtCore.Qt.StrongFocus)
self.lineEdit.setDragEnabled(True)
self.lineEdit.setCursorMoveStyle(QtCore.Qt.LogicalMoveStyle)
self.lineEdit.setObjectName(_fromUtf8("lineEdit"))
self.horizontalLayout_4.addWidget(self.lineEdit)
self.pushButton_2 = QtGui.QPushButton(self.Internettab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Maximum, QtGui.QSizePolicy.Maximum)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.pushButton_2.sizePolicy().hasHeightForWidth())
self.pushButton_2.setSizePolicy(sizePolicy)
self.pushButton_2.setMinimumSize(QtCore.QSize(0, 0))
self.pushButton_2.setMaximumSize(QtCore.QSize(16777215, 23))
self.pushButton_2.setMouseTracking(True)
self.pushButton_2.setFocusPolicy(QtCore.Qt.StrongFocus)
self.pushButton_2.setText(_fromUtf8(""))
icon10 = QtGui.QIcon()
icon10.addPixmap(QtGui.QPixmap(_fromUtf8(userdir + userprogpath + SEP("icons/go-jump.png"))), QtGui.QIcon.Normal, QtGui.QIcon.On)
self.pushButton_2.setIcon(icon10)
self.pushButton_2.setIconSize(QtCore.QSize(24, 24))
self.pushButton_2.setFlat(True)
self.pushButton_2.setObjectName(_fromUtf8("pushButton_2"))
self.horizontalLayout_4.addWidget(self.pushButton_2)
spacerItem4 = QtGui.QSpacerItem(40, 23, QtGui.QSizePolicy.Expanding, QtGui.QSizePolicy.Minimum)
self.horizontalLayout_4.addItem(spacerItem4)
self.pushButton_12 = QtGui.QPushButton(self.Internettab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Maximum, QtGui.QSizePolicy.Maximum)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.pushButton_12.sizePolicy().hasHeightForWidth())
self.pushButton_12.setSizePolicy(sizePolicy)
self.pushButton_12.setMaximumSize(QtCore.QSize(16777215, 23))
self.pushButton_12.setFocusPolicy(QtCore.Qt.StrongFocus)
self.pushButton_12.setObjectName(_fromUtf8("pushButton_12"))
self.horizontalLayout_4.addWidget(self.pushButton_12)
self.progressBar_2 = QtGui.QProgressBar(self.Internettab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Maximum, QtGui.QSizePolicy.Maximum)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.progressBar_2.sizePolicy().hasHeightForWidth())
self.progressBar_2.setSizePolicy(sizePolicy)
self.progressBar_2.setMinimumSize(QtCore.QSize(0, 0))
self.progressBar_2.setMaximumSize(QtCore.QSize(16777215, 23))
self.progressBar_2.setProperty("value", 0)
self.progressBar_2.setObjectName(_fromUtf8("progressBar_2"))
self.horizontalLayout_4.addWidget(self.progressBar_2)
self.verticalLayout_3.addLayout(self.horizontalLayout_4)
self.line_3 = QtGui.QFrame(self.Internettab)
self.line_3.setFrameShape(QtGui.QFrame.HLine)
self.line_3.setFrameShadow(QtGui.QFrame.Sunken)
self.line_3.setObjectName(_fromUtf8("line_3"))
self.verticalLayout_3.addWidget(self.line_3)
self.webView = QtWebKit.QWebView(self.Internettab)
font = QtGui.QFont()
font.setStyleStrategy(QtGui.QFont.PreferAntialias)
self.webView.setFont(font)
self.webView.setMouseTracking(False)
self.webView.setProperty("url", QtCore.QUrl(_fromUtf8("about:blank")))
self.webView.setObjectName(_fromUtf8("webView"))
self.verticalLayout_3.addWidget(self.webView)
self.tabWidget.addTab(self.Internettab, _fromUtf8(""))
self.tab = QtGui.QWidget()
self.tab.setObjectName(_fromUtf8("tab"))
self.verticalLayout = QtGui.QVBoxLayout(self.tab)
self.verticalLayout.setObjectName(_fromUtf8("verticalLayout"))
self.verticalLayout_11 = QtGui.QVBoxLayout()
self.verticalLayout_11.setSpacing(2)
self.verticalLayout_11.setContentsMargins(-1, 0, 0, -1)
self.verticalLayout_11.setObjectName(_fromUtf8("verticalLayout_11"))
self.line_4 = QtGui.QFrame(self.tab)
self.line_4.setFrameShape(QtGui.QFrame.HLine)
self.line_4.setFrameShadow(QtGui.QFrame.Sunken)
self.line_4.setObjectName(_fromUtf8("line_4"))
self.verticalLayout_11.addWidget(self.line_4)
self.horizontalLayout_15 = QtGui.QHBoxLayout()
self.horizontalLayout_15.setContentsMargins(-1, 0, -1, 0)
self.horizontalLayout_15.setObjectName(_fromUtf8("horizontalLayout_15"))
self.label_3 = QtGui.QLabel(self.tab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Minimum, QtGui.QSizePolicy.Maximum)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.label_3.sizePolicy().hasHeightForWidth())
self.label_3.setSizePolicy(sizePolicy)
self.label_3.setTextFormat(QtCore.Qt.RichText)
self.label_3.setScaledContents(True)
self.label_3.setAlignment(QtCore.Qt.AlignCenter)
self.label_3.setWordWrap(False)
self.label_3.setObjectName(_fromUtf8("label_3"))
self.horizontalLayout_15.addWidget(self.label_3)
spacerItem5 = QtGui.QSpacerItem(40, 20, QtGui.QSizePolicy.Expanding, QtGui.QSizePolicy.Minimum)
self.horizontalLayout_15.addItem(spacerItem5)
self.verticalLayout_11.addLayout(self.horizontalLayout_15)
self.horizontalLayout_14 = QtGui.QHBoxLayout()
self.horizontalLayout_14.setSpacing(2)
self.horizontalLayout_14.setSizeConstraint(QtGui.QLayout.SetDefaultConstraint)
self.horizontalLayout_14.setMargin(0)
self.horizontalLayout_14.setObjectName(_fromUtf8("horizontalLayout_14"))
self.verticalLayout_7 = QtGui.QVBoxLayout()
self.verticalLayout_7.setSpacing(2)
self.verticalLayout_7.setSizeConstraint(QtGui.QLayout.SetDefaultConstraint)
self.verticalLayout_7.setMargin(0)
self.verticalLayout_7.setObjectName(_fromUtf8("verticalLayout_7"))
self.checkboxmaxima = QtGui.QCheckBox(self.tab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Minimum, QtGui.QSizePolicy.Preferred)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.checkboxmaxima.sizePolicy().hasHeightForWidth())
self.checkboxmaxima.setSizePolicy(sizePolicy)
self.checkboxmaxima.setMinimumSize(QtCore.QSize(0, 0))
self.checkboxmaxima.setFocusPolicy(QtCore.Qt.StrongFocus)
self.checkboxmaxima.setLayoutDirection(QtCore.Qt.LeftToRight)
self.checkboxmaxima.setAutoFillBackground(False)
self.checkboxmaxima.setChecked(False)
self.checkboxmaxima.setTristate(False)
self.checkboxmaxima.setObjectName(_fromUtf8("checkboxmaxima"))
self.verticalLayout_7.addWidget(self.checkboxmaxima)
self.checkBoxnorfa = QtGui.QCheckBox(self.tab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Minimum, QtGui.QSizePolicy.Preferred)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.checkBoxnorfa.sizePolicy().hasHeightForWidth())
self.checkBoxnorfa.setSizePolicy(sizePolicy)
self.checkBoxnorfa.setFocusPolicy(QtCore.Qt.StrongFocus)
self.checkBoxnorfa.setChecked(False)
self.checkBoxnorfa.setObjectName(_fromUtf8("checkBoxnorfa"))
self.verticalLayout_7.addWidget(self.checkBoxnorfa)
self.checkBoxiki = QtGui.QCheckBox(self.tab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Minimum, QtGui.QSizePolicy.Preferred)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.checkBoxiki.sizePolicy().hasHeightForWidth())
self.checkBoxiki.setSizePolicy(sizePolicy)
self.checkBoxiki.setFocusPolicy(QtCore.Qt.StrongFocus)
self.checkBoxiki.setChecked(False)
self.checkBoxiki.setObjectName(_fromUtf8("checkBoxiki"))
self.verticalLayout_7.addWidget(self.checkBoxiki)
self.checkBoxrimi = QtGui.QCheckBox(self.tab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Minimum, QtGui.QSizePolicy.Preferred)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.checkBoxrimi.sizePolicy().hasHeightForWidth())
self.checkBoxrimi.setSizePolicy(sizePolicy)
self.checkBoxrimi.setFocusPolicy(QtCore.Qt.StrongFocus)
self.checkBoxrimi.setChecked(False)
self.checkBoxrimi.setObjectName(_fromUtf8("checkBoxrimi"))
self.verticalLayout_7.addWidget(self.checkBoxrimi)
self.checkboxAibe = QtGui.QCheckBox(self.tab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Minimum, QtGui.QSizePolicy.Preferred)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.checkboxAibe.sizePolicy().hasHeightForWidth())
self.checkboxAibe.setSizePolicy(sizePolicy)
self.checkboxAibe.setFocusPolicy(QtCore.Qt.StrongFocus)
self.checkboxAibe.setObjectName(_fromUtf8("checkboxAibe"))
self.verticalLayout_7.addWidget(self.checkboxAibe)
self.checkboxFRESH_MARKET = QtGui.QCheckBox(self.tab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Minimum, QtGui.QSizePolicy.Preferred)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.checkboxFRESH_MARKET.sizePolicy().hasHeightForWidth())
self.checkboxFRESH_MARKET.setSizePolicy(sizePolicy)
self.checkboxFRESH_MARKET.setFocusPolicy(QtCore.Qt.StrongFocus)
self.checkboxFRESH_MARKET.setObjectName(_fromUtf8("checkboxFRESH_MARKET"))
self.verticalLayout_7.addWidget(self.checkboxFRESH_MARKET)
self.checkboxPROMO = QtGui.QCheckBox(self.tab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Minimum, QtGui.QSizePolicy.Preferred)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.checkboxPROMO.sizePolicy().hasHeightForWidth())
self.checkboxPROMO.setSizePolicy(sizePolicy)
self.checkboxPROMO.setFocusPolicy(QtCore.Qt.StrongFocus)
self.checkboxPROMO.setObjectName(_fromUtf8("checkboxPROMO"))
self.verticalLayout_7.addWidget(self.checkboxPROMO)
self.checkboxPRISMA = QtGui.QCheckBox(self.tab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Minimum, QtGui.QSizePolicy.Preferred)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.checkboxPRISMA.sizePolicy().hasHeightForWidth())
self.checkboxPRISMA.setSizePolicy(sizePolicy)
self.checkboxPRISMA.setFocusPolicy(QtCore.Qt.StrongFocus)
self.checkboxPRISMA.setObjectName(_fromUtf8("checkboxPRISMA"))
self.verticalLayout_7.addWidget(self.checkboxPRISMA)
self.checkboxEUROKOS = QtGui.QCheckBox(self.tab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Minimum, QtGui.QSizePolicy.Preferred)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.checkboxEUROKOS.sizePolicy().hasHeightForWidth())
self.checkboxEUROKOS.setSizePolicy(sizePolicy)
self.checkboxEUROKOS.setFocusPolicy(QtCore.Qt.StrongFocus)
self.checkboxEUROKOS.setObjectName(_fromUtf8("checkboxEUROKOS"))
self.verticalLayout_7.addWidget(self.checkboxEUROKOS)
self.checkboxDrogas = QtGui.QCheckBox(self.tab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Minimum, QtGui.QSizePolicy.Preferred)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.checkboxDrogas.sizePolicy().hasHeightForWidth())
self.checkboxDrogas.setSizePolicy(sizePolicy)
self.checkboxDrogas.setFocusPolicy(QtCore.Qt.StrongFocus)
self.checkboxDrogas.setObjectName(_fromUtf8("checkboxDrogas"))
self.verticalLayout_7.addWidget(self.checkboxDrogas)
self.checkboxERMITAZAS = QtGui.QCheckBox(self.tab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Minimum, QtGui.QSizePolicy.Preferred)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.checkboxERMITAZAS.sizePolicy().hasHeightForWidth())
self.checkboxERMITAZAS.setSizePolicy(sizePolicy)
self.checkboxERMITAZAS.setFocusPolicy(QtCore.Qt.StrongFocus)
self.checkboxERMITAZAS.setObjectName(_fromUtf8("checkboxERMITAZAS"))
self.verticalLayout_7.addWidget(self.checkboxERMITAZAS)
self.checkboxSenukai = QtGui.QCheckBox(self.tab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Minimum, QtGui.QSizePolicy.Preferred)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.checkboxSenukai.sizePolicy().hasHeightForWidth())
self.checkboxSenukai.setSizePolicy(sizePolicy)
self.checkboxSenukai.setFocusPolicy(QtCore.Qt.StrongFocus)
self.checkboxSenukai.setObjectName(_fromUtf8("checkboxSenukai"))
self.verticalLayout_7.addWidget(self.checkboxSenukai)
self.checkboxMoki_Vezi = QtGui.QCheckBox(self.tab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Minimum, QtGui.QSizePolicy.Preferred)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.checkboxMoki_Vezi.sizePolicy().hasHeightForWidth())
self.checkboxMoki_Vezi.setSizePolicy(sizePolicy)
self.checkboxMoki_Vezi.setFocusPolicy(QtCore.Qt.StrongFocus)
self.checkboxMoki_Vezi.setObjectName(_fromUtf8("checkboxMoki_Vezi"))
self.verticalLayout_7.addWidget(self.checkboxMoki_Vezi)
self.horizontalLayout_14.addLayout(self.verticalLayout_7)
spacerItem6 = QtGui.QSpacerItem(40, 20, QtGui.QSizePolicy.Expanding, QtGui.QSizePolicy.Minimum)
self.horizontalLayout_14.addItem(spacerItem6)
self.verticalLayout_11.addLayout(self.horizontalLayout_14)
self.horizontalLayout_8 = QtGui.QHBoxLayout()
self.horizontalLayout_8.setSpacing(0)
self.horizontalLayout_8.setContentsMargins(-1, 0, -1, 0)
self.horizontalLayout_8.setObjectName(_fromUtf8("horizontalLayout_8"))
self.pushButtondownloadpdf = QtGui.QPushButton(self.tab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Preferred, QtGui.QSizePolicy.Preferred)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.pushButtondownloadpdf.sizePolicy().hasHeightForWidth())
self.pushButtondownloadpdf.setSizePolicy(sizePolicy)
self.pushButtondownloadpdf.setMinimumSize(QtCore.QSize(85, 0))
self.pushButtondownloadpdf.setFocusPolicy(QtCore.Qt.StrongFocus)
self.pushButtondownloadpdf.setObjectName(_fromUtf8("pushButtondownloadpdf"))
self.horizontalLayout_8.addWidget(self.pushButtondownloadpdf)
spacerItem7 = QtGui.QSpacerItem(40, 20, QtGui.QSizePolicy.Expanding, QtGui.QSizePolicy.Minimum)
self.horizontalLayout_8.addItem(spacerItem7)
self.verticalLayout_11.addLayout(self.horizontalLayout_8)
self.horizontalLayout_6 = QtGui.QHBoxLayout()
self.horizontalLayout_6.setSpacing(0)
self.horizontalLayout_6.setContentsMargins(-1, 2, -1, 2)
self.horizontalLayout_6.setObjectName(_fromUtf8("horizontalLayout_6"))
self.checkBox_4 = QtGui.QCheckBox(self.tab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Fixed, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.checkBox_4.sizePolicy().hasHeightForWidth())
self.checkBox_4.setSizePolicy(sizePolicy)
self.checkBox_4.setBaseSize(QtCore.QSize(0, 0))
self.checkBox_4.setFocusPolicy(QtCore.Qt.StrongFocus)
self.checkBox_4.setObjectName(_fromUtf8("checkBox_4"))
self.horizontalLayout_6.addWidget(self.checkBox_4)
self.spinBox_3 = QtGui.QSpinBox(self.tab)
self.spinBox_3.setMaximum(30)
self.spinBox_3.setProperty("value", 1)
self.spinBox_3.setObjectName(_fromUtf8("spinBox_3"))
self.horizontalLayout_6.addWidget(self.spinBox_3)
self.label_5 = QtGui.QLabel(self.tab)
self.label_5.setObjectName(_fromUtf8("label_5"))
self.horizontalLayout_6.addWidget(self.label_5)
spacerItem8 = QtGui.QSpacerItem(40, 20, QtGui.QSizePolicy.Expanding, QtGui.QSizePolicy.Minimum)
self.horizontalLayout_6.addItem(spacerItem8)
self.verticalLayout_11.addLayout(self.horizontalLayout_6)
self.horizontalLayout_3 = QtGui.QHBoxLayout()
self.horizontalLayout_3.setSpacing(0)
self.horizontalLayout_3.setContentsMargins(-1, 2, -1, 2)
self.horizontalLayout_3.setObjectName(_fromUtf8("horizontalLayout_3"))
self.checkBox_3 = QtGui.QCheckBox(self.tab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Fixed, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.checkBox_3.sizePolicy().hasHeightForWidth())
self.checkBox_3.setSizePolicy(sizePolicy)
self.checkBox_3.setFocusPolicy(QtCore.Qt.StrongFocus)
self.checkBox_3.setChecked(False)
self.checkBox_3.setObjectName(_fromUtf8("checkBox_3"))
self.horizontalLayout_3.addWidget(self.checkBox_3)
self.spinBox = QtGui.QSpinBox(self.tab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Fixed, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.spinBox.sizePolicy().hasHeightForWidth())
self.spinBox.setSizePolicy(sizePolicy)
self.spinBox.setMinimum(5)
self.spinBox.setMaximum(365)
self.spinBox.setSingleStep(5)
self.spinBox.setProperty("value", 180)
self.spinBox.setObjectName(_fromUtf8("spinBox"))
self.horizontalLayout_3.addWidget(self.spinBox)
self.label_2 = QtGui.QLabel(self.tab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Fixed, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.label_2.sizePolicy().hasHeightForWidth())
self.label_2.setSizePolicy(sizePolicy)
self.label_2.setObjectName(_fromUtf8("label_2"))
self.horizontalLayout_3.addWidget(self.label_2)
self.pushButton_8 = QtGui.QPushButton(self.tab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Fixed, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.pushButton_8.sizePolicy().hasHeightForWidth())
self.pushButton_8.setSizePolicy(sizePolicy)
self.pushButton_8.setFocusPolicy(QtCore.Qt.StrongFocus)
self.pushButton_8.setObjectName(_fromUtf8("pushButton_8"))
self.horizontalLayout_3.addWidget(self.pushButton_8)
spacerItem9 = QtGui.QSpacerItem(40, 20, QtGui.QSizePolicy.Expanding, QtGui.QSizePolicy.Minimum)
self.horizontalLayout_3.addItem(spacerItem9)
self.verticalLayout_11.addLayout(self.horizontalLayout_3)
self.line = QtGui.QFrame(self.tab)
self.line.setFrameShape(QtGui.QFrame.HLine)
self.line.setFrameShadow(QtGui.QFrame.Sunken)
self.line.setObjectName(_fromUtf8("line"))
self.verticalLayout_11.addWidget(self.line)
self.verticalLayout.addLayout(self.verticalLayout_11)
self.horizontalLayout_16 = QtGui.QHBoxLayout()
self.horizontalLayout_16.setContentsMargins(-1, 0, -1, 0)
self.horizontalLayout_16.setObjectName(_fromUtf8("horizontalLayout_16"))
self.label = QtGui.QLabel(self.tab)
self.label.setObjectName(_fromUtf8("label"))
self.horizontalLayout_16.addWidget(self.label)
spacerItem10 = QtGui.QSpacerItem(40, 20, QtGui.QSizePolicy.Expanding, QtGui.QSizePolicy.Minimum)
self.horizontalLayout_16.addItem(spacerItem10)
self.verticalLayout.addLayout(self.horizontalLayout_16)
self.horizontalLayout_7 = QtGui.QHBoxLayout()
self.horizontalLayout_7.setSpacing(0)
self.horizontalLayout_7.setContentsMargins(0, 2, 0, 2)
self.horizontalLayout_7.setObjectName(_fromUtf8("horizontalLayout_7"))
self.checkBox_2 = QtGui.QCheckBox(self.tab)
self.checkBox_2.setFocusPolicy(QtCore.Qt.StrongFocus)
self.checkBox_2.setChecked(False)
self.checkBox_2.setObjectName(_fromUtf8("checkBox_2"))
self.horizontalLayout_7.addWidget(self.checkBox_2)
self.spinBox_2 = QtGui.QSpinBox(self.tab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Fixed, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.spinBox_2.sizePolicy().hasHeightForWidth())
self.spinBox_2.setSizePolicy(sizePolicy)
self.spinBox_2.setMinimum(0)
self.spinBox_2.setMaximum(30)
self.spinBox_2.setProperty("value", 1)
self.spinBox_2.setObjectName(_fromUtf8("spinBox_2"))
self.horizontalLayout_7.addWidget(self.spinBox_2)
self.label_4 = QtGui.QLabel(self.tab)
self.label_4.setObjectName(_fromUtf8("label_4"))
self.horizontalLayout_7.addWidget(self.label_4)
self.pushButton_7 = QtGui.QPushButton(self.tab)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Fixed, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.pushButton_7.sizePolicy().hasHeightForWidth())
self.pushButton_7.setSizePolicy(sizePolicy)
self.pushButton_7.setFocusPolicy(QtCore.Qt.StrongFocus)
self.pushButton_7.setObjectName(_fromUtf8("pushButton_7"))
self.horizontalLayout_7.addWidget(self.pushButton_7)
spacerItem11 = QtGui.QSpacerItem(40, 20, QtGui.QSizePolicy.Expanding, QtGui.QSizePolicy.Minimum)
self.horizontalLayout_7.addItem(spacerItem11)
self.verticalLayout.addLayout(self.horizontalLayout_7)
self.line_6 = QtGui.QFrame(self.tab)
self.line_6.setFrameShape(QtGui.QFrame.HLine)
self.line_6.setFrameShadow(QtGui.QFrame.Sunken)
self.line_6.setObjectName(_fromUtf8("line_6"))
self.verticalLayout.addWidget(self.line_6)
spacerItem12 = QtGui.QSpacerItem(20, 40, QtGui.QSizePolicy.Minimum, QtGui.QSizePolicy.Expanding)
self.verticalLayout.addItem(spacerItem12)
self.plainTextEdit = QtGui.QPlainTextEdit(self.tab)
self.plainTextEdit.setFocusPolicy(QtCore.Qt.NoFocus)
self.plainTextEdit.setContextMenuPolicy(QtCore.Qt.NoContextMenu)
self.plainTextEdit.setAcceptDrops(False)
self.plainTextEdit.setAutoFillBackground(True)
self.plainTextEdit.setFrameShape(QtGui.QFrame.StyledPanel)
self.plainTextEdit.setFrameShadow(QtGui.QFrame.Plain)
self.plainTextEdit.setVerticalScrollBarPolicy(QtCore.Qt.ScrollBarAsNeeded)
self.plainTextEdit.setHorizontalScrollBarPolicy(QtCore.Qt.ScrollBarAsNeeded)
self.plainTextEdit.setUndoRedoEnabled(True)
self.plainTextEdit.setLineWrapMode(QtGui.QPlainTextEdit.NoWrap)
self.plainTextEdit.setReadOnly(True)
self.plainTextEdit.setBackgroundVisible(False)
self.plainTextEdit.setObjectName(_fromUtf8("plainTextEdit"))
self.verticalLayout.addWidget(self.plainTextEdit)
self.progressBar = QtGui.QProgressBar(self.tab)
self.progressBar.setProperty("value", 0)
self.progressBar.setTextDirection(QtGui.QProgressBar.TopToBottom)
self.progressBar.setObjectName(_fromUtf8("progressBar"))
self.verticalLayout.addWidget(self.progressBar)
self.tabWidget.addTab(self.tab, _fromUtf8(""))
self.tab_2 = QtGui.QWidget()
self.tab_2.setObjectName(_fromUtf8("tab_2"))
self.verticalLayout_6 = QtGui.QVBoxLayout(self.tab_2)
self.verticalLayout_6.setObjectName(_fromUtf8("verticalLayout_6"))
self.verticalLayout_5 = QtGui.QVBoxLayout()
self.verticalLayout_5.setSpacing(6)
self.verticalLayout_5.setContentsMargins(-1, 0, -1, 0)
self.verticalLayout_5.setObjectName(_fromUtf8("verticalLayout_5"))
self.horizontalLayout_13 = QtGui.QHBoxLayout()
self.horizontalLayout_13.setContentsMargins(-1, 0, -1, 0)
self.horizontalLayout_13.setObjectName(_fromUtf8("horizontalLayout_13"))
self.checkBox = QtGui.QCheckBox(self.tab_2)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Fixed, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.checkBox.sizePolicy().hasHeightForWidth())
self.checkBox.setSizePolicy(sizePolicy)
self.checkBox.setFocusPolicy(QtCore.Qt.StrongFocus)
self.checkBox.setObjectName(_fromUtf8("checkBox"))
self.horizontalLayout_13.addWidget(self.checkBox)
spacerItem13 = QtGui.QSpacerItem(40, 20, QtGui.QSizePolicy.Expanding, QtGui.QSizePolicy.Minimum)
self.horizontalLayout_13.addItem(spacerItem13)
self.verticalLayout_5.addLayout(self.horizontalLayout_13)
self.horizontalLayout_9 = QtGui.QHBoxLayout()
self.horizontalLayout_9.setContentsMargins(-1, 0, -1, 0)
self.horizontalLayout_9.setObjectName(_fromUtf8("horizontalLayout_9"))
self.spinBox_4 = QtGui.QSpinBox(self.tab_2)
self.spinBox_4.setFrame(True)
self.spinBox_4.setButtonSymbols(QtGui.QAbstractSpinBox.UpDownArrows)
self.spinBox_4.setMinimum(100)
self.spinBox_4.setMaximum(250)
self.spinBox_4.setSingleStep(10)
self.spinBox_4.setProperty("value", 150)
self.spinBox_4.setObjectName(_fromUtf8("spinBox_4"))
self.horizontalLayout_9.addWidget(self.spinBox_4)
self.label_6 = QtGui.QLabel(self.tab_2)
self.label_6.setObjectName(_fromUtf8("label_6"))
self.horizontalLayout_9.addWidget(self.label_6)
spacerItem14 = QtGui.QSpacerItem(40, 20, QtGui.QSizePolicy.Expanding, QtGui.QSizePolicy.Minimum)
self.horizontalLayout_9.addItem(spacerItem14)
self.verticalLayout_5.addLayout(self.horizontalLayout_9)
self.horizontalLayout_10 = QtGui.QHBoxLayout()
self.horizontalLayout_10.setContentsMargins(-1, 0, -1, 0)
self.horizontalLayout_10.setObjectName(_fromUtf8("horizontalLayout_10"))
self.pushButton_10 = QtGui.QPushButton(self.tab_2)
self.pushButton_10.setFocusPolicy(QtCore.Qt.StrongFocus)
self.pushButton_10.setObjectName(_fromUtf8("pushButton_10"))
self.horizontalLayout_10.addWidget(self.pushButton_10)
self.label_7 = QtGui.QLabel(self.tab_2)
self.label_7.setObjectName(_fromUtf8("label_7"))
self.horizontalLayout_10.addWidget(self.label_7)
spacerItem15 = QtGui.QSpacerItem(40, 20, QtGui.QSizePolicy.Expanding, QtGui.QSizePolicy.Minimum)
self.horizontalLayout_10.addItem(spacerItem15)
self.verticalLayout_5.addLayout(self.horizontalLayout_10)
self.horizontalLayout_11 = QtGui.QHBoxLayout()
self.horizontalLayout_11.setSpacing(0)
self.horizontalLayout_11.setContentsMargins(-1, 0, -1, 0)
self.horizontalLayout_11.setObjectName(_fromUtf8("horizontalLayout_11"))
self.pushButton_13 = QtGui.QPushButton(self.tab_2)
self.pushButton_13.setFocusPolicy(QtCore.Qt.StrongFocus)
self.pushButton_13.setObjectName(_fromUtf8("pushButton_13"))
self.horizontalLayout_11.addWidget(self.pushButton_13)
spacerItem16 = QtGui.QSpacerItem(40, 20, QtGui.QSizePolicy.Expanding, QtGui.QSizePolicy.Minimum)
self.horizontalLayout_11.addItem(spacerItem16)
self.verticalLayout_5.addLayout(self.horizontalLayout_11)
self.verticalLayout_6.addLayout(self.verticalLayout_5)
spacerItem17 = QtGui.QSpacerItem(20, 40, QtGui.QSizePolicy.Minimum, QtGui.QSizePolicy.Expanding)
self.verticalLayout_6.addItem(spacerItem17)
self.tabWidget.addTab(self.tab_2, _fromUtf8(""))
self.horizontalLayout.addWidget(self.tabWidget)
self.gridLayout.addLayout(self.horizontalLayout, 0, 0, 1, 1)
self.verticalLayout_2.addLayout(self.gridLayout)
MainWindow.setCentralWidget(self.centralwidget)
self.retranslateUi(MainWindow)
self.tabWidget.setCurrentIndex(0)
self.comboBox_5.setCurrentIndex(0)
self.comboBox_7.setCurrentIndex(0)
QtCore.QMetaObject.connectSlotsByName(MainWindow)
def retranslateUi(self, MainWindow):
MainWindow.setWindowTitle(_translate("MainWindow", "Reklaminiai Parduotuvių Lankstinukai", None))
self.comboBox_2.setItemText(0, _translate("MainWindow", "Maxima", None))
self.comboBox_3.setItemText(0, _translate("MainWindow", "Norfa", None))
self.comboBox_4.setItemText(0, _translate("MainWindow", "Iki", None))
self.comboBox_6.setItemText(0, _translate("MainWindow", "Rimi", None))
self.comboBox_5.setItemText(0, _translate("MainWindow", "Aibė", None))
self.comboBox_7.setItemText(0, _translate("MainWindow", "FRESH MARKET", None))
self.comboBox_10.setItemText(0, _translate("MainWindow", "PROMO CashCarry", None))
self.comboBox_11.setItemText(0, _translate("MainWindow", "PRISMA", None))
self.comboBox_12.setItemText(0, _translate("MainWindow", "EUROKOS", None))
self.comboBox_13.setItemText(0, _translate("MainWindow", "Drogas", None))
self.comboBox_14.setItemText(0, _translate("MainWindow", "ERMITAŽAS", None))
self.comboBox_8.setItemText(0, _translate("MainWindow", "Senukai", None))
self.comboBox_9.setItemText(0, _translate("MainWindow", "Moki*Veži", None))
self.label_8.setText(_translate("MainWindow", "TextLabel", None))
self.tabWidget.setTabText(self.tabWidget.indexOf(self.pdftab), _translate("MainWindow", "Lankstinukai", None))
self.Intbuttonmaxima.setText(_translate("MainWindow", "Maxima", None))
self.Intbuttonnorfa.setText(_translate("MainWindow", "Norfa", None))
self.Intbuttoniki.setText(_translate("MainWindow", "Iki", None))
self.Intbuttonrimi.setText(_translate("MainWindow", "Rimi", None))
self.intbuttonaibe.setText(_translate("MainWindow", "Aibė", None))
self.intbuttonFRESH_MARKET.setText(_translate("MainWindow", "FRESH MARKET", None))
self.intbuttonPROMO.setText(_translate("MainWindow", "PROMO CashCarry", None))
self.intbuttonPRISMA.setText(_translate("MainWindow", "PRISMA", None))
self.intbuttonEUROKOS.setText(_translate("MainWindow", "EUROKOS", None))
self.intbuttonDrogas.setText(_translate("MainWindow", "Drogas", None))
self.intbuttonERMITAZAS.setText(_translate("MainWindow", "ERMITAŽAS", None))
self.intbuttonSenukai.setText(_translate("MainWindow", "Senukai", None))
self.intbuttonMoki_Vezi.setText(_translate("MainWindow", "Moki*Veži", None))
self.intbuttonJysk.setText(_translate("MainWindow", "Jysk", None))
self.pushButton_12.setText(_translate("MainWindow", "Į adresyną", None))
self.tabWidget.setTabText(self.tabWidget.indexOf(self.Internettab), _translate("MainWindow", "Internetas", None))
self.label_3.setText(_translate("MainWindow", "<html><head/><body><p align=\"justify\"><span style=\" font-weight:600;\">Lankstinukų atnaujinimas</span><br/></p></body></html>", None))
self.checkboxmaxima.setText(_translate("MainWindow", "Maxima", None))
self.checkBoxnorfa.setText(_translate("MainWindow", "Norfa", None))
self.checkBoxiki.setText(_translate("MainWindow", "Iki", None))
self.checkBoxrimi.setText(_translate("MainWindow", "Rimi", None))
self.checkboxAibe.setText(_translate("MainWindow", "Aibė", None))
self.checkboxFRESH_MARKET.setText(_translate("MainWindow", "FRESH MARKET", None))
self.checkboxPROMO.setText(_translate("MainWindow", "PROMO CashCarry", None))
self.checkboxPRISMA.setText(_translate("MainWindow", "PRISMA", None))
self.checkboxEUROKOS.setText(_translate("MainWindow", "EUROKOS", None))
self.checkboxDrogas.setText(_translate("MainWindow", "Drogas", None))
self.checkboxERMITAZAS.setText(_translate("MainWindow", "ERMITAŽAS", None))
self.checkboxSenukai.setText(_translate("MainWindow", "Senukai", None))
self.checkboxMoki_Vezi.setText(_translate("MainWindow", "Moki*Veži", None))
self.pushButtondownloadpdf.setText(_translate("MainWindow", "Tikrinti ir atsiųsti dabar", None))
self.checkBox_4.setText(_translate("MainWindow", "Automatiškai tikrinti ar yra naujų lankstinukų kas ", None))
self.label_5.setText(_translate("MainWindow", " dienų ", None))
self.checkBox_3.setText(_translate("MainWindow", "Automatiškai trinti senus lankstinukus po ", None))
self.label_2.setText(_translate("MainWindow", " dienų ", None))
self.pushButton_8.setText(_translate("MainWindow", "Trinti dabar", None))
self.label.setText(_translate("MainWindow", "<html><head/><body><p align=\"justify\"><span style=\" font-weight:600;\">Programos atnaujinimas</span><br/></p></body></html>", None))
self.checkBox_2.setText(_translate("MainWindow", "Automatiškai tikrinti įjungiant programą kas ", None))
self.label_4.setText(_translate("MainWindow", " dienų ", None))
self.pushButton_7.setText(_translate("MainWindow", "Tikrinti ir atsiųsti dabar", None))
self.tabWidget.setTabText(self.tabWidget.indexOf(self.tab), _translate("MainWindow", "Naujinimas", None))
self.checkBox.setText(_translate("MainWindow", "Naudoti pdf.js. Lėtas ir kolkas kreivokai atvaizduoja su standartiniu webkit\'u.", None))
self.label_6.setText(_translate("MainWindow", "PPI paveikslėlių kūrimui iš lankstinukų. 1920x* ekranui reikėtų 200.", None))
self.pushButton_10.setText(_translate("MainWindow", "Ištrinti paveikslėlius", None))
self.label_7.setText(_translate("MainWindow", "Nespausk. Rimtai ;)", None))
self.pushButton_13.setText(_translate("MainWindow", "Pagalba", None))
self.tabWidget.setTabText(self.tabWidget.indexOf(self.tab_2), _translate("MainWindow", "Nustatymai", None))
|
I want to ask about peer detection, based on this statement: "Peers detect each other with TCP Option 76. To trigger peers to come online, start a new TCP session."
I have been testing this in the lab. When I open a Windows network share the status comes UP, but once the status drops back to idle/down, reloading the shared folder a few times does not bring the status UP again. I have to close the window and open the share link \\PC-02 again to get the status UP.
Does this mean the TCP connection state of a 'related connection' cannot be used to trigger the peer?
I think using ping (ICMP) from the other Wanos would be the best way to make sure the peer is still UP, so that Wanos does not pass the traffic through. Or maybe Wanos could have an option to automatically send TCP Option 76 to the other peer at an interval that users can configure.
Silver Peak behaves the same way: when testing in the lab, the TCP sessions need to be reset when enabling Boost.
This is new in v4, since we now auto-detect which sessions to optimize and which to leave alone. If a TCP session is seen on both appliances, it is eligible to be optimized. If a session is only seen on one appliance but not the other, it is bypassed. Hence, MultiSite has been removed and configuring Traffic Policies is now optional. This is a big step forward, considering that these were often misconfigured in v3.
As with the other vendors, this is only expected to matter in a lab/test environment. In a normal production environment it would not be usual to reset TCP sessions in order to force a TCP reconnect and start optimization for a particular application like CIFS/SMB.
As for the ping idea, it is not really workable. The remote peer could be removed from the inline position but still respond to ICMP, and even Wanos control traffic like PLR and RTT measurements may still flow between the sites. This would cause the peers to remain "Active", causing the device that is still inline to keep optimization alive, which would result in a blackout for optimized traffic. It is better to be safe and only optimize while valid traffic is seen. When one of the peers is removed, or moved so that the traffic flow is no longer correct, the peers should go into the Idle state to avoid a traffic blackout.
Test in a production network. If you still see peers go Idle occasionally but want to force them to "Active", increase the peer timeout value under Settings.
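For lab testing, rather than closing and re-opening the share, any new TCP session across the link will do. A minimal Python sketch (the host name and SMB port below are just placeholders taken from this thread, not anything Wanos-specific):

import socket

# Opening any fresh TCP connection across the optimized path creates a new
# session that both appliances can see, so it becomes eligible for optimization.
with socket.create_connection(("PC-02", 445), timeout=5) as sock:
    pass  # the TCP handshake alone is enough; close straight away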
Thanks for the explanation. One more question: has Router mode been removed from v4?
I only see Bridge and Tunnel mode. How can I mix modes across two devices, where one device has only 1 NIC and the other has 2 NICs?
Also, can I modify the default policy map ports? I want to optimize RDP (3389); will this work if I add rule #99 to allow RDP traffic to be optimized?
Router mode is not supported in Express. |
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from __future__ import absolute_import
from os import path
import mock
import pytest
from wonderful_bing.wonderful_bing import Computer
@pytest.fixture
def computer():
computer = Computer()
return computer
def test_computer(computer):
gnome_based = ("DISPLAY=:0 GSETTINGS_BACKEND=dconf "
"/usr/bin/gsettings set org.gnome.desktop.background "
"picture-uri file://{0}")
mate_based = ("DISPLAY=:0 GSETTINGS_BACKEND=dconf "
"/usr/bin/gsettings set org.mate.background "
"picture-filename '{0}'")
xfce_based = ("DISPLAY=:0 xfconf-query -c xfce4-desktop "
"-p /backdrop/screen0/monitor0/image-path -s {0}")
assert computer._get_command('gnome') == gnome_based
assert computer._get_command('gnome2') == gnome_based
assert computer._get_command('cinnamon') == gnome_based
assert computer._get_command('mate') == mate_based
assert computer._get_command('xfce4') == xfce_based
assert computer._get_command('blablabla') is None
def test_set_wallpaper_with_unsupported_environment(computer):
with pytest.raises(SystemExit):
computer.set_wallpaper('blablabla', 'tmp/blabla.jpg')
def test_set_wallpaper(computer):
with mock.patch('wonderful_bing.wonderful_bing.subprocess') as subprocess:
subprocess.Popen.return_value.returncode = 0
computer.set_wallpaper('gnome', '/tmp/blabla.jpg')
command = computer._get_command('gnome').format('/tmp/blabla.jpg')
subprocess.Popen.assert_called_once_with(command, shell=True)
def test_show_notify(computer):
with mock.patch('wonderful_bing.wonderful_bing.subprocess') as subprocess:
computer.show_notify('Hello, world')
notify_icon = path.join(
path.dirname(path.dirname(path.realpath(__file__))),
'wonderful_bing/img/icon.png')
subprocess.Popen.assert_called_once_with(
["notify-send", "-a", "wonderful_bing", "-i",
notify_icon, "Today's Picture Story", "Hello, world"])
|
XP 11.11 - how to assign the Condition levers?
Sorry, I read the manual section but I can't understand it. I have only the multipanel. |
'''
Created on Jan 9, 2014
@author: Stefan Koelbl
'''
from random import sample
from gost.gost import GOST
DDT = []
SBOX_MATCH = []
LinearStepList = []
InverseLinearStepList = []
possibleOutputDifferences = []
possibleInputDifferences = []
def propagateDifferencesThroughSout(state):
"""
Returns a state containing a possible output difference after applying
the S-Box for the given input difference.
"""
result = GOST()
for x in range(8):
for y in range(8):
result.setValue(x, y, sample(getValidDiffsForOutputDiff(state.getValue(x, y)), 1)[0])
return result
def propagateDifferencesThroughSin(state):
"""
Returns a state containing a possible input difference after applying
the inverse S-Box for the given output difference.
"""
result = GOST()
for x in range(8):
for y in range(8):
result.setValue(x, y, sample(getValidDiffsForInputDiff(state.getValue(x, y)), 1)[0])
return result
def computeLinearStepList():
"""
Compute the list of all possible values for
(x 0 0 0 0 0 0 0) * L = (y0 y1 y2 y3 y4 y5 y6 y7)
"""
global LinearStepList
gost = GOST()
for value in range(1, 256):
gost.setValue(0, 0, value)
LinearStepList.append(gost.L().getRow(0))
def computeInverseLinearStepList():
"""
Compute the list of all possible values for
(x 0 0 0 0 0 0 0) * Linverse = (y0 y1 y2 y3 y4 y5 y6 y7)
"""
global InverseLinearStepList
gost = GOST()
for value in range(1, 256):
gost.setValue(0, 0, value)
InverseLinearStepList.append(gost.Linverse().getRow(0))
def computeDDT(sbox):
"""
Compute the differential distribution table (DDT) for a given S-Box
"""
global DDT
DDT = [[0 for _ in range(len(sbox))] for _ in range(len(sbox))]
for a in range(len(sbox)):
for b in range(len(sbox)):
DDT[a ^ b][sbox[a] ^ sbox[b]] += 1
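# Example (illustrative only; the S-box below is a stand-in permutation, not the
# real GOST S-box): DDT[dx][dy] counts the pairs (a, b) with a ^ b == dx and
# sbox[a] ^ sbox[b] == dy.
#
#   from random import shuffle
#   sbox = list(range(256))
#   shuffle(sbox)
#   computeDDT(sbox)
#   print(DDT[0][0])   # 256 -- the zero input difference always maps to zero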
def computeSBOX_MATCH(sbox):
"""
Compute the valid pairs for each input/output difference.
"""
global SBOX_MATCH
SBOX_MATCH = [[[] for _ in range(len(sbox))] for _ in range(len(sbox))]
for a in range(len(sbox)):
for b in range(len(sbox)):
SBOX_MATCH[a ^ b][sbox[a] ^ sbox[b]].append([a, b])
def getValidBytePairsForOutputDiff(outputDiff):
"""
Get all possible pairs (a, b) such that:
S(a) xor S(b) = outputDiff
"""
bytePairs = []
for i in range(len(SBOX_MATCH)):
if(len(SBOX_MATCH[i][outputDiff]) > 0):
bytePairs.append(SBOX_MATCH[i][outputDiff])
return bytePairs
def getValidBytePairsForInputDiff(inputDiff):
"""
Get all possible pairs (a, b) such that:
Sinverse(a) xor Sinverse(b) = inputDiff
"""
bytePairs = []
for i in range(len(SBOX_MATCH)):
if(len(SBOX_MATCH[inputDiff][i]) > 0):
bytePairs.append(SBOX_MATCH[inputDiff][i])
return bytePairs
def getValidDiffsForInputDiff(inputDiff):
"""
Get all possible output differences for a given input difference.
"""
global possibleOutputDifferences
if not possibleOutputDifferences:
possibleOutputDifferences = [set([]) for _ in range(256)]
# Compute Table
for diffIn in range(256):
for diffOut in range(256):
if(DDT[diffIn][diffOut] > 0):
possibleOutputDifferences[diffIn].add(diffOut)
return possibleOutputDifferences[inputDiff]
def getValidDiffsForOutputDiff(outputDiff):
"""
Get all possible input differences for a given output difference.
"""
global possibleInputDifferences
if not possibleInputDifferences:
possibleInputDifferences = [set([]) for _ in range(256)]
# Compute Table
for diffIn in range(256):
for diffOut in range(256):
if(DDT[diffIn][diffOut] > 0):
possibleInputDifferences[diffOut].add(diffIn)
return possibleInputDifferences[outputDiff] |
Do you agree that we influence one another to desire certain objects? Or, do our desires arise within our own selves?
Have you ever tried to “keep up with the Joneses”? |
"""Parse an tokenized expression into an AST."""
import codecs
from runtime import ast, lexer, env, lib, flags
class ParseException(Exception):
def __init__(self, msg):
super().__init__("ParseException: " + msg)
class MissingOperand(ParseException):
def __init__(self, op):
super().__init__("%s is missing operands" % op)
class UnknownOperator(ParseException):
def __init__(self, op):
super().__init__("Unknown operator %s" % op)
class BadStatement(ParseException):
"""A statement exception."""
def __init__(self, msg="Bad statement without semicolon"):
super().__init__(msg)
class NotImplemented(ParseException):
"""A parse exception."""
def __init__(self, msg="Functionality not implemented"):
super().__init__(msg)
class InvalidStatement(ParseException):
def __init__(self, msg):
super().__init__("Invalid statement: Unexpected %s" % str(msg))
class InvalidDeclaration(ParseException):
def __init__(self, msg):
super().__init__("Invalid declaration: Unexpected %s" % str(msg))
class InvalidDefinition(ParseException):
def __init__(self, msg):
super().__init__("Invalid definition: Unexpected %s" % str(msg))
class InvalidAssignment(ParseException):
def __init__(self, msg="Invalid assignment"):
super().__init__(msg)
class InvalidBlock(ParseException):
def __init__(self, msg="Missing block borders"):
super().__init__(msg)
class InvalidExpression(ParseException):
def __init__(self, msg="Invalid expression"):
super().__init__(msg)
class InvalidCondition(ParseException):
def __init__(self, msg="Invalid condition"):
super().__init__(msg)
class InvalidLoop(ParseException):
def __init__(self, msg):
super().__init__("Invalid loop: Unexpected %s" % str(msg))
def is_assignment(token):
return token != None and (token.kind is lexer.OPERATOR) and (token.value in ["=", "+=", "-=", "*=", "/=", "%=", "^="])
def find_matching_block(stream, start):
level = 1
max = len(stream)
for i in range(start, max):
if stream[i].kind == lexer.LBLOCK:
level += 1
elif stream[i].kind == lexer.RBLOCK:
level -= 1
if level == 0:
return i
return -1
def find_matching_prt(stream, start):
level = 1
max = len(stream)
for i in range(start, max):
if flags.debug:
print("scanned", str(stream[i]), ":", level)
if stream[i].kind == lexer.LPRT:
level += 1
elif stream[i].kind == lexer.RPRT:
level -= 1
if level == 0:
return i
return -1
def get_arg_count(operator, last_token):
if operator in ["+", "-"] and (last_token == None or last_token.kind not in [lexer.NUMBER, lexer.IDENTIFIER, lexer.STRING, lexer.RPRT]):
if flags.debug:
print("unary operator because of", last_token)
return 1
elif operator in ["!"]:
return 1
if flags.debug:
print("binary because of", last_token)
return 2
def is_left_associative(operator, last_token):
if operator in ["+", "-"] and (last_token == None or last_token.kind not in [lexer.NUMBER, lexer.IDENTIFIER, lexer.STRING, lexer.RPRT]):
if flags.debug:
print("right associative because of", last_token)
return False
elif operator in ["!", "^"]:
return False
if flags.debug:
print("left associative because of", last_token)
return True
def get_precedence(operator, last_token):
if operator in ["+", "-"] and (last_token == None or last_token.kind not in [lexer.NUMBER, lexer.IDENTIFIER, lexer.STRING, lexer.RPRT]):
return 7
elif operator in ["!"]:
return 7
elif operator in ["^"]:
return 6
elif operator in ["*", "/"]:
return 5
elif operator in ["+", "-", ":"]:
return 4
elif operator in ["%"]:
return 3
elif operator in ["<", ">", "<=", ">=", "!=", "=="]:
return 2
elif operator in ["&&", "||", "^|"]:
return 1
return 0
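# Worked example: in "1 + 2 * 3" the binary "+" gets precedence 4 and "*" gets 5,
# so the shunting-yard loop below keeps "+" on the operator stack while "2 * 3"
# is reduced first, i.e. the expression parses as 1 + (2 * 3).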
def generate_expression(stream):
if flags.debug:
print("Starting generate expression")
operand_stack = []
operator_stack = []
max = len(stream) - 1
last_token = None
token = None
def pop_off_operator():
if len(operator_stack) < 1:
raise ParseException("Empty operator stack, could not pop off operator")
operator = operator_stack.pop()
if flags.debug:
print("popping of", operator)
arg_count = operator.tag("arg_count")
if len(operand_stack) < arg_count:
raise MissingOperand(operator.symbol)
for j in range(arg_count):
operator.add_front(operand_stack.pop())
operand_stack.append(operator)
for i in range(max + 1):
last_token = token
token = stream[i]
if flags.debug:
print(">>> Parsing next token:", token)
print("Operands: ", ', '.join(str(e) for e in operand_stack))
print("Operators:", ', '.join(str(e) for e in operator_stack))
if token.kind == lexer.NUMBER:
value = None
if '.' in token.value:
value = env.Value(lib.FLOAT, data=float(token.value))
else:
value = env.Value(lib.INTEGER, data=int(token.value))
operand_stack.append(ast.Literal(value))
elif token.kind == lexer.STRING:
stripped = token.value.strip("\"")
decoded = codecs.decode(stripped, "unicode_escape")
value = env.Value(lib.STRING, data=decoded)
operand_stack.append(ast.Literal(value))
elif token.kind == lexer.SEPARATOR:
while len(operator_stack) > 0 and operator_stack[-1] != "(":
pop_off_operator()
elif token.kind == lexer.IDENTIFIER:
if i < max and stream[i+1].kind == lexer.LPRT:
operator_stack.append(ast.Call(token.value))
else:
if token.value == "false":
operand_stack.append(ast.Literal(env.Value(lib.BOOLEAN, data=False)))
elif token.value == "true":
operand_stack.append(ast.Literal(env.Value(lib.BOOLEAN, data=True)))
elif token.value == "null":
operand_stack.append(ast.Literal(env.Value(lib.NULL)))
else:
operand_stack.append(ast.Identifier(token.value))
elif token.kind == lexer.OPERATOR:
new_operator = ast.Operation(token.value)
prec = get_precedence(token.value, last_token)
arg_count = get_arg_count(token.value, last_token)
left_associative = is_left_associative(token.value, last_token)
new_operator.tag("precedence", prec)
new_operator.tag("arg_count", arg_count)
new_operator.tag("left_associative", left_associative)
if flags.debug:
print("adding operator", new_operator.symbol, "to", len(operator_stack))
while len(operator_stack) > 0 and (type(operator_stack[-1]) is ast.Operation):
other = operator_stack[-1]
other_prec = operator_stack[-1].tag("precedence")
if flags.debug:
print("comparing precedence of ", new_operator.symbol, prec, "to", other.symbol, other_prec)
if left_associative:
if prec > other_prec:
break
else:
if prec >= other_prec:
break
pop_off_operator()
operator_stack.append(new_operator)
if flags.debug:
print("pushed operator on stack")
elif token.kind == lexer.LPRT:
operand_stack.append(token.value)
operator_stack.append(token.value)
elif token.kind == lexer.RPRT:
while len(operator_stack) > 0 and operator_stack[-1] != "(":
pop_off_operator()
if len(operator_stack) < 1:
raise ParseException("Mismatched parentheses")
operator_stack.pop()
if len(operator_stack) > 0 and type(operator_stack[-1]) is ast.Call:
function = operator_stack.pop()
while len(operand_stack) > 0 and operand_stack[-1] != "(":
function.add_front(operand_stack.pop())
operand_stack.pop()
operand_stack.append(function)
else:
j = len(operand_stack) - 1
while j >= 0 and operand_stack[j] != "(":
j -= 1
del operand_stack[j]
elif token.kind == lexer.STATEMENT:
while len(operator_stack) > 0:
pop_off_operator()
if len(operand_stack) > 1:
raise InvalidExpression()
if len(operand_stack) != 1:
raise InvalidExpression("Empty expression")
return operand_stack[0], i
last_token = token
while len(operator_stack) > 0:
pop_off_operator()
if flags.debug:
print("Operands: ", ', '.join(str(e) for e in operand_stack))
print("Operators:", ', '.join(str(e) for e in operator_stack))
if len(operand_stack) != 1:
raise InvalidExpression("Empty expression")
if flags.debug:
print("Parsed expression with length %d" % (max + 1))
return operand_stack[0], max + 1
def generate_declaration(stream):
if flags.debug:
print("Starting generating declaration")
end = len(stream)
for j in range(len(stream)):
if stream[j].kind is lexer.STATEMENT:
end = j
break
if end < 3:
raise ParseException("Declaration too short")
if not (stream[0].kind is lexer.IDENTIFIER and stream[0].value == "var"):
raise InvalidDeclaration(stream[0])
if stream[1].kind != lexer.IDENTIFIER:
raise InvalidDeclaration(stream[1])
declared_names = []
sequ = ast.Sequence()
expr_begin = 2
ignore_type = True
while expr_begin < end and ((stream[expr_begin].kind is lexer.SEPARATOR) or
((stream[expr_begin].kind is lexer.OPERATOR) and
(stream[expr_begin].value == ":" or
(stream[expr_begin].value == "=" and ignore_type)))):
if (stream[expr_begin].kind is lexer.OPERATOR) and stream[expr_begin].value == ":":
ignore_type = False
if expr_begin > 2 and stream[expr_begin - 2].kind != lexer.SEPARATOR:
raise InvalidDeclaration(stream[expr_begin - 2])
if stream[expr_begin - 1].kind is not lexer.IDENTIFIER:
raise InvalidDeclaration(stream[expr_begin - 1])
declared_names.append(stream[expr_begin - 1].value)
expr_begin += 2
if not ignore_type and stream[expr_begin - 1].kind is not lexer.IDENTIFIER:
raise InvalidDeclaration(stream[expr_begin - 1])
datatype = "null"
if not ignore_type:
datatype = stream[expr_begin - 1].value
else:
expr_begin -= 2
expr = None
if expr_begin < end and is_assignment(stream[expr_begin]):
expr, _ = generate_expression(stream[expr_begin + 1:])
for name in declared_names:
decl = ast.Declaration(name, datatype)
sequ.add(decl)
if expr is not None:
assgn = ast.Assignment(name, ignore_type)
assgn.add(expr)
expr = assgn
if expr is not None:
sequ.add(expr)
return sequ, end - 1
def generate_assignment(stream):
if flags.debug:
print("Starting generating assigment")
if len(stream) < 3:
raise InvalidAssignment()
name_token, equ_token = stream[0], stream[1]
if name_token.kind != lexer.IDENTIFIER or not is_assignment(equ_token):
raise InvalidAssignment()
expr, offset = generate_expression(stream[2:])
if flags.debug:
print("Expression has offset %d" % offset)
if len(equ_token.value) != 1:
operation = ast.Operation(equ_token.value[0])
operation.add(ast.Identifier(name_token.value))
operation.add(expr)
expr = operation
assgn = ast.Assignment(name_token.value)
assgn.add(expr)
if flags.debug:
print("Assignment has offset %d" % (1 + offset))
return assgn, 1 + offset
def generate_function(stream):
if flags.debug:
print("Starting generating function definition")
head_name = stream[0]
if head_name.kind is not lexer.IDENTIFIER:
raise InvalidDefinition(head_name)
fnc_name = head_name.value
head_start = stream[1]
if head_start.kind != lexer.LPRT:
raise InvalidDefinition(head_start)
head_end_index = find_matching_prt(stream, 2)
arguments = []
arg_index = 2
while arg_index < head_end_index:
if stream[arg_index].kind is not lexer.IDENTIFIER:
raise InvalidDefinition(stream[arg_index])
arg_name = stream[arg_index].value
if arg_index + 3 >= len(stream):
raise InvalidDefinition(stream[arg_index+1])
if (stream[arg_index+1].kind is not lexer.OPERATOR) or stream[arg_index+1].value != ":":
raise InvalidDefinition(stream[arg_index+1])
if stream[arg_index+2].kind is not lexer.IDENTIFIER:
raise InvalidDefinition(stream[arg_index+2])
arg_type = stream[arg_index+2].value
arguments.append(env.Value(arg_type, None, arg_name))
arg_index += 4
if flags.debug:
print("Adding arguments:", ', '.join(str(e) for e in arguments))
body_start_index = head_end_index + 1
body_start = stream[body_start_index]
if body_start.kind is not lexer.LBLOCK:
raise InvalidDefinition(body_start)
body, body_len = generate_sequence(stream[body_start_index+1:])
defi_node = ast.Definition(fnc_name, arguments)
defi_node.add(body)
return defi_node, 3 + head_end_index + body_len
def generate_if(stream):
if flags.debug:
print("Starting generating if statement")
cond_head = stream[0]
if not (cond_head.kind is lexer.IDENTIFIER and cond_head.value == "if"):
raise InvalidCondition()
cond_start_index = 1
cond_start = stream[cond_start_index]
if cond_start.kind != lexer.LPRT:
raise InvalidCondition()
cond_end_index = find_matching_prt(stream, cond_start_index + 1)
if cond_end_index == -1:
raise InvalidCondition()
cond_block = stream[cond_start_index+1:cond_end_index]
if flags.debug:
print("if-condition: " + ' '.join(str(e) for e in cond_block))
body_start_index = cond_end_index + 1
body_start = stream[body_start_index]
if body_start.kind != lexer.LBLOCK:
raise InvalidBlock()
body_end_index = find_matching_block(stream, body_start_index)
body_block = stream[body_start_index+1:body_end_index]
condition, cond_len = generate_expression(cond_block)
body, body_len = generate_sequence(body_block)
body.substitute = True
branch_node = ast.Branch()
cond_node = ast.Conditional()
cond_node.add(condition)
cond_node.add(body)
branch_node.add(cond_node)
# if ( .... ) { .... }
# 0 1 cond 2 3 body 4
offset = 4 + cond_len + body_len
if offset + 1 >= len(stream) or not (stream[offset+1].kind is lexer.IDENTIFIER and
stream[offset+1].value == "else"):
return branch_node, offset
if flags.debug:
print("Possible else (if) at", str(stream[offset+1]))
# else if? (offset+2 == 'if')
if stream[offset+2].kind is lexer.IDENTIFIER and stream[offset+2].value == "if":
if flags.debug:
print("Parsing else-if at token", offset + 2)
elif_node, elif_len = generate_if(stream[offset+2:])
branch_node.add(elif_node)
# ...... else .......
# offset 1 elif_len
offset += elif_len + 2
# guaranteed to be else
else:
if flags.debug:
print("Parsing else at token", offset + 2)
else_body, else_len = generate_sequence(stream[offset+3:])
else_body.substitute = True
branch_node.add(else_body)
# ...... else { ........ }
# offset 1 2 else_len 3
offset += else_len + 3
return branch_node, offset
def generate_for(stream):
if flags.debug:
print("Starting generating for statement")
for_ident = stream[0]
if not (for_ident.kind is lexer.IDENTIFIER and for_ident.value == "for"):
raise InvalidCondition()
cond_start = 2
head_start = stream[cond_start - 1]
if head_start.kind is not lexer.LPRT:
raise InvalidCondition()
head_end_index = find_matching_prt(stream, cond_start)
if head_end_index == -1:
raise InvalidCondition()
# find first ;
init_end_index = cond_start
for j in range(len(stream)):
if stream[j].kind is lexer.STATEMENT:
init_end_index = j
break
init_stmt, init_len = generate_sequence(stream[cond_start:init_end_index+1])
cond_expr, cond_len = generate_expression(stream[cond_start + init_len:head_end_index])
iter_stmt, iter_len = generate_sequence(stream[cond_start + init_len + cond_len:head_end_index])
body_start_index = head_end_index + 1
body_start = stream[body_start_index]
if body_start.kind is not lexer.LBLOCK:
raise InvalidBlock()
body_end_index = find_matching_block(stream, body_start_index + 1)
body, body_len = generate_sequence(stream[body_start_index+1:])
inner_sequ = ast.Sequence()
inner_sequ.add(body)
inner_sequ.add(iter_stmt)
loop = ast.Loop()
loop.add(cond_expr)
loop.add(inner_sequ)
sequ = ast.Sequence(True)
sequ.add(init_stmt)
sequ.add(loop)
return sequ, 4 + init_len + cond_len + iter_len + body_len
def generate_while(stream):
if flags.debug:
print("Starting generating while statement")
if len(stream) < 6:
raise InvalidLoop("length %d" % len(stream))
if not (stream[0].kind is lexer.IDENTIFIER and stream[0].value == "while"):
raise InvalidLoop(stream[0])
cond_start = stream[1]
if cond_start.kind != lexer.LPRT:
raise InvalidCondition()
cond_end_index = find_matching_prt(stream, 2)
if cond_end_index == -1:
raise InvalidCondition()
body_start_index = cond_end_index+1
body_start = stream[body_start_index]
if body_start.kind != lexer.LBLOCK:
raise InvalidBlock()
body_end_index = find_matching_block(stream, body_start_index+1)
condition, cond_len = generate_expression(stream[2:cond_end_index])
body, offset = generate_sequence(stream[body_start_index+1:])
body.substitute = True
loop = ast.Loop()
loop.add(condition)
loop.add(body)
return loop, 4 + cond_len + offset
def generate_sequence(stream):
if flags.debug:
print("Starting generating sequence")
print("Generating on", stream)
sequence = ast.Sequence()
stack = []
queue = []
max = len(stream) - 1
i = 0
def next():
if i < max:
return stream[i+1]
return None
while i <= max:
if flags.debug:
print("Operating on", i, stream[i])
token = stream[i]
if token.kind == lexer.IDENTIFIER:
if token.value == "func":
func, offset = generate_function(stream[i+1:])
sequence.add(func)
i += offset
elif token.value == "return":
expr, offset = generate_expression(stream[i+1:])
return_node = ast.Return()
return_node.add(expr)
sequence.add(return_node)
i += offset
elif token.value == "continue":
sequence.add(ast.Continue())
elif token.value == "break":
sequence.add(ast.Break())
elif token.value == "while":
while_node, offset = generate_while(stream[i:])
sequence.add(while_node)
i += offset
elif token.value == "if":
if_node, offset = generate_if(stream[i:])
sequence.add(if_node)
i += offset
elif token.value == "for":
for_node, offset = generate_for(stream[i:])
sequence.add(for_node)
i += offset
elif token.value == "import":
raise NotImplemented()
elif token.value == "var":
decl, offset = generate_declaration(stream[i:])
sequence.add(decl)
i += offset
else:
if i < max and is_assignment(next()):
assgn, offset = generate_assignment(stream[i:])
sequence.add(assgn)
i += offset
else:
expr, offset = generate_expression(stream[i:])
sequence.add(expr)
i += offset - 1
elif token.kind in [lexer.NUMBER, lexer.STRING, lexer.OPERATOR, lexer.LPRT]:
expr, offset = generate_expression(stream[i:])
sequence.add(expr)
i += offset
elif token.kind == lexer.LBLOCK:
sequ, offset = generate_sequence(stream[i+1:])
i += offset + 1
sequence.add(sequ)
elif token.kind == lexer.STATEMENT:
pass
elif token.kind == lexer.RBLOCK:
if flags.debug:
print("Stopping generating sequence")
return sequence, i
else:
raise InvalidStatement(stream[i])
i += 1
return sequence, i
def optimize_ast(root):
for i in range(len(root.children)):
node = root.children[i]
if type(node) is ast.Operation and node.symbol == ":":
values = node.children
datatype = values.pop().identity
if flags.debug:
print("Replacing cast of %s" % datatype)
node = ast.Cast(datatype)
node.children = values
root.children[i] = node
else:
optimize_ast(node)
def generate(tokens):
"""Parse the tokens to AST notation."""
# clean off whitespaces
clean = [t for t in tokens if t.kind != lexer.WHITESPACE]
if flags.debug:
print("Optimized tokens:", '; '.join(str(e) for e in clean))
sequ, _ = generate_sequence(clean)
if flags.debug:
print("Optimizing AST ...")
optimize_ast(sequ)
if flags.debug:
print("Final AST:", str(sequ))
return sequ
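# Usage sketch (illustrative only; assumes the project's lexer module exposes a
# tokenize()-style helper returning Token objects -- adjust to the real API):
#
#   tokens = lexer.tokenize("var x = 1 + 2;")
#   tree = generate(tokens)
#   print(tree)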
def demo_syntax_tree():
"""Initialize a demo syntax tree."""
tree = ast.syntax_tree()
return tree
|
Please note: All prices are listed in US Dollars (USD). The term Meridian Centre as well as all associated graphics, logos, and/or other trademarks, tradenames or copyrights are the property of the Meridian Centre and are used herein for factual descriptive purposes only. We are in no way associated with or authorized by the Meridian Centre and neither that entity nor any of its affiliates have licensed or endorsed us to sell tickets, goods and or services in conjunction with their events. |
"""
Models contains the database models for the application.
"""
import datetime
from uuid import uuid4
from passlib.hash import pbkdf2_sha256
from peewee import DateTimeField, TextField, CharField, BooleanField
from peewee import SqliteDatabase, DecimalField
from peewee import UUIDField, ForeignKeyField, IntegerField
from playhouse.signals import Model, post_delete, pre_delete
from exceptions import InsufficientAvailabilityException, WrongQuantity
from schemas import (ItemSchema, UserSchema, OrderSchema, OrderItemSchema,
BaseSchema, AddressSchema)
from utils import remove_image
database = SqliteDatabase('database.db')
class BaseModel(Model):
""" Base model for all the database models. """
created_at = DateTimeField(default=datetime.datetime.now)
updated_at = DateTimeField(default=datetime.datetime.now)
_schema = BaseSchema
def save(self, *args, **kwargs):
"""Automatically update updated_at time during save"""
self.updated_at = datetime.datetime.now()
return super(BaseModel, self).save(*args, **kwargs)
class Meta:
database = database
@classmethod
def get_all(cls):
return [o for o in cls.select()]
@classmethod
def json_list(cls, objs_list):
return cls._schema.jsonapi_list(objs_list)
def json(self, include_data=[]):
parsed, errors = self._schema.jsonapi(self, include_data)
return parsed
@classmethod
def validate_input(cls, data, partial=False):
return cls._schema.validate_input(data, partial=partial)
class Item(BaseModel):
"""
Product model
name: product unique name
price: product price
description: product description text
availability: number of available products of this kind
"""
uuid = UUIDField(unique=True)
name = CharField()
price = DecimalField(auto_round=True)
description = TextField()
availability = IntegerField()
_schema = ItemSchema
def __str__(self):
return '{}, {}, {}, {}'.format(
self.uuid,
self.name,
self.price,
self.description)
@database.atomic()
@pre_delete(sender=Item)
def on_delete_item_handler(model_class, instance):
"""Delete item pictures in cascade"""
pictures = Picture.select().join(Item).where(
Item.uuid == instance.uuid)
for pic in pictures:
pic.delete_instance()
class Picture(BaseModel):
"""
Picture model
uuid: picture identifier and file name stored
extension: picture type
item: referenced item
"""
uuid = UUIDField(unique=True)
extension = CharField()
item = ForeignKeyField(Item, related_name='pictures')
def filename(self):
return '{}.{}'.format(
self.uuid,
self.extension)
def json(self):
return {
'uuid': str(self.uuid),
'extension': self.extension,
'item_uuid': str(self.item.uuid)
}
def __str__(self):
return '{}.{} -> item: {}'.format(
self.uuid,
self.extension,
self.item.uuid)
@post_delete(sender=Picture)
def on_delete_picture_handler(model_class, instance):
"""Delete file picture"""
# TODO log eventual inconsistency
remove_image(instance.uuid, instance.extension)
class User(BaseModel):
"""
    User represents a user of the application.
    Users are always created with the "normal" role (admin field = False).
"""
uuid = UUIDField(unique=True)
first_name = CharField()
last_name = CharField()
email = CharField(unique=True)
password = CharField()
admin = BooleanField(default=False)
_schema = UserSchema
@staticmethod
def exists(email):
"""
        Check whether a user exists by looking up the (unique) email field.
"""
try:
User.get(User.email == email)
except User.DoesNotExist:
return False
return True
@staticmethod
def hash_password(password):
"""Use passlib to get a crypted password.
:returns: str
"""
return pbkdf2_sha256.hash(password)
def verify_password(self, password):
"""
Verify a clear password against the stored hashed password of the user
using passlib.
:returns: bool
"""
return pbkdf2_sha256.verify(password, self.password)
class Address(BaseModel):
""" The model Address represent a user address.
Each address is releated to one user, but one user can have
more addresses."""
uuid = UUIDField(unique=True)
user = ForeignKeyField(User, related_name='addresses')
country = CharField()
city = CharField()
post_code = CharField()
address = CharField()
phone = CharField()
_schema = AddressSchema
class Order(BaseModel):
""" The model Order contains a list of orders - one row per order.
Each order will be place by one client.
An order is represented by an uuid,
a dateTimeField which is the date of the order, a FloatField which
is the total price of the order. Finally, there is the delivery address,
if it's different from the customers address from their record.
"""
uuid = UUIDField(unique=True, default=uuid4)
total_price = DecimalField(default=0)
delivery_address = ForeignKeyField(Address, related_name="orders")
user = ForeignKeyField(User, related_name="orders")
_schema = OrderSchema
class Meta:
order_by = ('created_at',)
@property
def order_items(self):
"""
Returns the list of OrderItem related to the order.
"""
query = (
OrderItem
.select(OrderItem, Order)
.join(Order)
.where(Order.uuid == self.uuid)
)
return [orderitem for orderitem in query]
def empty_order(self):
"""
Remove all the items from the order.
Delete all OrderItem related to this order and reset the total_price
value to 0.
"""
self.total_price = 0
OrderItem.delete().where(OrderItem.order == self).execute()
self.save()
return self
def add_item(self, item, quantity=1):
"""
Add one item to the order.
        Creates one OrderItem row if the item is not present in the order yet,
        or increases the count of the existing OrderItem. It also updates the
        item availability counter and raises InsufficientAvailabilityException
        if the requested quantity exceeds the item availability.
:param item Item: instance of models.Item
"""
for orderitem in self.order_items:
# Looping all the OrderItem related to this order, if one with the
# same item is found we update that row.
if orderitem.item == item:
orderitem.add_item(quantity)
self.total_price += (item.price * quantity)
self.save()
return self
# if no existing OrderItem is found with this order and this Item,
# create a new row in the OrderItem table and use OrderItem.add_item
# to properly use the calculus logic that handles updating prices and
# availability. To use correctly add_item the initial quantity and
# subtotal are set to 0
OrderItem.create(
order=self,
item=item,
quantity=0,
subtotal=0,
).add_item(quantity)
self.total_price += (item.price * quantity)
self.save()
return self
def update_item(self, item, quantity):
"""
Update the quantity of the orderitem of the given item.
"""
for order_item in self.order_items:
if order_item.item == item:
diff = quantity - order_item.quantity
if diff > 0:
self.add_item(item, abs(diff))
elif diff < 0:
self.remove_item(item, abs(diff))
break
else:
self.add_item(item, quantity)
def remove_item(self, item, quantity=1):
"""
        Remove the given item from the order, reducing the quantity of the
        related OrderItem entity or deleting it entirely if the last items are
        removed (OrderItem.quantity == 0).
It also restores the item availability.
"""
for orderitem in self.order_items:
if orderitem.item == item:
removed_items = orderitem.remove_item(quantity)
item.availability += quantity
item.save()
self.total_price -= (item.price * removed_items)
self.save()
return self
# No OrderItem found for this item
# TODO: Raise or return something more explicit
return self
class OrderItem(BaseModel):
""" The model OrderItem is a cross table that contains the order
items - one row for each item on an order (so each order can
generate multiple rows).
    It contains two reference fields: one references the Order model
    and the other references the Item.
    It also contains the quantity of the item and the total price of
    that item.
"""
order = ForeignKeyField(Order)
item = ForeignKeyField(Item)
quantity = IntegerField()
subtotal = DecimalField()
_schema = OrderItemSchema
def add_item(self, quantity=1):
"""
Add one item to the OrderItem, increasing the quantity count and
recalculating the subtotal value for this item(s)
"""
if quantity > self.item.availability:
raise InsufficientAvailabilityException(self.item, quantity)
self.item.availability -= quantity
self.item.save()
self.quantity += quantity
self._calculate_subtotal()
self.save()
def remove_item(self, quantity=1):
"""
Remove one item from the OrderItem, decreasing the quantity count and
recalculating the subtotal value for this item(s)
:returns: int - quantity of items really removed.
"""
if self.quantity < quantity:
            raise WrongQuantity('Quantity of items to be removed ({}) is higher '
                                'than the quantity in the order ({})'
                                .format(quantity, self.quantity))
elif self.quantity > quantity:
self.quantity -= quantity
self._calculate_subtotal()
self.save()
else: # elif self.quantity == quantity
quantity = self.quantity
self.delete_instance()
return quantity
def _calculate_subtotal(self):
"""Calculate the subtotal value of the item(s) in the order."""
self.subtotal = self.item.price * self.quantity
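
# A hedged usage sketch, not part of the original module: it shows how the
# models above fit together. The example field values, the table-creation call
# and the final print are illustrative assumptions.
if __name__ == '__main__':
    database.connect()
    database.create_tables(
        [Item, Picture, User, Address, Order, OrderItem], safe=True)
    user = User.create(
        uuid=uuid4(), first_name='Jane', last_name='Doe',
        email='jane@example.com', password=User.hash_password('secret'))
    address = Address.create(
        uuid=uuid4(), user=user, country='IT', city='Rome',
        post_code='00100', address='Via Roma 1', phone='555-0100')
    item = Item.create(
        uuid=uuid4(), name='mug', price=5.50,
        description='A ceramic mug', availability=10)
    order = Order.create(delivery_address=address, user=user)
    order.add_item(item, quantity=2)     # creates the OrderItem row
    order.update_item(item, quantity=3)  # raises the quantity from 2 to 3
    order.remove_item(item)              # gives one item back to availability
    print(order.total_price, item.availability)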
|
Carpenter Co is currently seeking a Class A CDL Truck Driver - Local / Hourly to join our Dallas, TX location. The successful candidate will be expected to safely and efficiently deliver finished products to customers at various locations.
Other key job duties may include working in the branch warehouse as needed.
One to two years of driving and product-delivery experience are required to perform the job. Specific on-the-job training is provided. |
#!/usr/bin/python
#
# This file is part of Ansible
#
# Ansible is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# Ansible is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Ansible. If not, see <http://www.gnu.org/licenses/>.
#
from __future__ import absolute_import, division, print_function
__metaclass__ = type
ANSIBLE_METADATA = {'status': ['preview'],
'supported_by': 'community',
'metadata_version': '1.1'}
DOCUMENTATION = '''
---
module: fmgr_secprof_web
version_added: "2.8"
notes:
- Full Documentation at U(https://ftnt-ansible-docs.readthedocs.io/en/latest/).
author:
- Luke Weighall (@lweighall)
- Andrew Welsh (@Ghilli3)
- Jim Huber (@p4r4n0y1ng)
short_description: Manage web filter security profiles in FortiManager
description:
- Manage web filter security profiles in FortiManager through playbooks using the FMG API
options:
adom:
description:
- The ADOM the configuration should belong to.
required: false
default: root
mode:
description:
- Sets one of three modes for managing the object.
- Allows use of soft-adds instead of overwriting existing values
choices: ['add', 'set', 'delete', 'update']
required: false
default: add
youtube_channel_status:
description:
- YouTube channel filter status.
- choice | disable | Disable YouTube channel filter.
- choice | blacklist | Block matches.
- choice | whitelist | Allow matches.
required: false
choices: ["disable", "blacklist", "whitelist"]
wisp_servers:
description:
- WISP servers.
required: false
wisp_algorithm:
description:
- WISP server selection algorithm.
- choice | auto-learning | Select the lightest loading healthy server.
- choice | primary-secondary | Select the first healthy server in order.
- choice | round-robin | Select the next healthy server.
required: false
choices: ["auto-learning", "primary-secondary", "round-robin"]
wisp:
description:
- Enable/disable web proxy WISP.
- choice | disable | Disable web proxy WISP.
- choice | enable | Enable web proxy WISP.
required: false
choices: ["disable", "enable"]
web_url_log:
description:
- Enable/disable logging URL filtering.
- choice | disable | Disable setting.
- choice | enable | Enable setting.
required: false
choices: ["disable", "enable"]
web_invalid_domain_log:
description:
- Enable/disable logging invalid domain names.
- choice | disable | Disable setting.
- choice | enable | Enable setting.
required: false
choices: ["disable", "enable"]
web_ftgd_quota_usage:
description:
- Enable/disable logging daily quota usage.
- choice | disable | Disable setting.
- choice | enable | Enable setting.
required: false
choices: ["disable", "enable"]
web_ftgd_err_log:
description:
- Enable/disable logging rating errors.
- choice | disable | Disable setting.
- choice | enable | Enable setting.
required: false
choices: ["disable", "enable"]
web_filter_vbs_log:
description:
- Enable/disable logging VBS scripts.
- choice | disable | Disable setting.
- choice | enable | Enable setting.
required: false
choices: ["disable", "enable"]
web_filter_unknown_log:
description:
- Enable/disable logging unknown scripts.
- choice | disable | Disable setting.
- choice | enable | Enable setting.
required: false
choices: ["disable", "enable"]
web_filter_referer_log:
description:
- Enable/disable logging referrers.
- choice | disable | Disable setting.
- choice | enable | Enable setting.
required: false
choices: ["disable", "enable"]
web_filter_jscript_log:
description:
- Enable/disable logging JScripts.
- choice | disable | Disable setting.
- choice | enable | Enable setting.
required: false
choices: ["disable", "enable"]
web_filter_js_log:
description:
- Enable/disable logging Java scripts.
- choice | disable | Disable setting.
- choice | enable | Enable setting.
required: false
choices: ["disable", "enable"]
web_filter_cookie_removal_log:
description:
- Enable/disable logging blocked cookies.
- choice | disable | Disable setting.
- choice | enable | Enable setting.
required: false
choices: ["disable", "enable"]
web_filter_cookie_log:
description:
- Enable/disable logging cookie filtering.
- choice | disable | Disable setting.
- choice | enable | Enable setting.
required: false
choices: ["disable", "enable"]
web_filter_command_block_log:
description:
- Enable/disable logging blocked commands.
- choice | disable | Disable setting.
- choice | enable | Enable setting.
required: false
choices: ["disable", "enable"]
web_filter_applet_log:
description:
- Enable/disable logging Java applets.
- choice | disable | Disable setting.
- choice | enable | Enable setting.
required: false
choices: ["disable", "enable"]
web_filter_activex_log:
description:
- Enable/disable logging ActiveX.
- choice | disable | Disable setting.
- choice | enable | Enable setting.
required: false
choices: ["disable", "enable"]
web_extended_all_action_log:
description:
- Enable/disable extended any filter action logging for web filtering.
- choice | disable | Disable setting.
- choice | enable | Enable setting.
required: false
choices: ["disable", "enable"]
web_content_log:
description:
      - Enable/disable logging blocked web content.
- choice | disable | Disable setting.
- choice | enable | Enable setting.
required: false
choices: ["disable", "enable"]
replacemsg_group:
description:
- Replacement message group.
required: false
post_action:
description:
- Action taken for HTTP POST traffic.
- choice | normal | Normal, POST requests are allowed.
- choice | block | POST requests are blocked.
required: false
choices: ["normal", "block"]
ovrd_perm:
description:
- FLAG Based Options. Specify multiple in list form.
- flag | bannedword-override | Banned word override.
- flag | urlfilter-override | URL filter override.
- flag | fortiguard-wf-override | FortiGuard Web Filter override.
- flag | contenttype-check-override | Content-type header override.
required: false
choices:
- bannedword-override
- urlfilter-override
- fortiguard-wf-override
- contenttype-check-override
options:
description:
- FLAG Based Options. Specify multiple in list form.
- flag | block-invalid-url | Block sessions contained an invalid domain name.
- flag | jscript | Javascript block.
- flag | js | JS block.
- flag | vbs | VB script block.
- flag | unknown | Unknown script block.
- flag | wf-referer | Referring block.
- flag | intrinsic | Intrinsic script block.
- flag | wf-cookie | Cookie block.
- flag | per-user-bwl | Per-user black/white list filter
- flag | activexfilter | ActiveX filter.
- flag | cookiefilter | Cookie filter.
- flag | javafilter | Java applet filter.
required: false
choices:
- block-invalid-url
- jscript
- js
- vbs
- unknown
- wf-referer
- intrinsic
- wf-cookie
- per-user-bwl
- activexfilter
- cookiefilter
- javafilter
name:
description:
- Profile name.
required: false
log_all_url:
description:
- Enable/disable logging all URLs visited.
- choice | disable | Disable setting.
- choice | enable | Enable setting.
required: false
choices: ["disable", "enable"]
inspection_mode:
description:
- Web filtering inspection mode.
- choice | proxy | Proxy.
- choice | flow-based | Flow based.
required: false
choices: ["proxy", "flow-based"]
https_replacemsg:
description:
- Enable replacement messages for HTTPS.
- choice | disable | Disable setting.
- choice | enable | Enable setting.
required: false
choices: ["disable", "enable"]
extended_log:
description:
- Enable/disable extended logging for web filtering.
- choice | disable | Disable setting.
- choice | enable | Enable setting.
required: false
choices: ["disable", "enable"]
comment:
description:
- Optional comments.
required: false
ftgd_wf:
description:
- EXPERTS ONLY! KNOWLEDGE OF FMGR JSON API IS REQUIRED!
- List of multiple child objects to be added. Expects a list of dictionaries.
- Dictionaries must use FortiManager API parameters, not the ansible ones listed below.
- If submitted, all other prefixed sub-parameters ARE IGNORED.
- This object is MUTUALLY EXCLUSIVE with its options.
- We expect that you know what you are doing with these list parameters, and are leveraging the JSON API Guide.
- WHEN IN DOUBT, USE THE SUB OPTIONS BELOW INSTEAD TO CREATE OBJECTS WITH MULTIPLE TASKS
required: false
ftgd_wf_exempt_quota:
description:
- Do not stop quota for these categories.
required: false
ftgd_wf_max_quota_timeout:
description:
- Maximum FortiGuard quota used by single page view in seconds (excludes streams).
required: false
ftgd_wf_options:
description:
- Options for FortiGuard Web Filter.
- FLAG Based Options. Specify multiple in list form.
- flag | error-allow | Allow web pages with a rating error to pass through.
- flag | rate-server-ip | Rate the server IP in addition to the domain name.
- flag | connect-request-bypass | Bypass connection which has CONNECT request.
- flag | ftgd-disable | Disable FortiGuard scanning.
required: false
choices: ["error-allow", "rate-server-ip", "connect-request-bypass", "ftgd-disable"]
ftgd_wf_ovrd:
description:
- Allow web filter profile overrides.
required: false
ftgd_wf_rate_crl_urls:
description:
- Enable/disable rating CRL by URL.
- choice | disable | Disable rating CRL by URL.
- choice | enable | Enable rating CRL by URL.
required: false
choices: ["disable", "enable"]
ftgd_wf_rate_css_urls:
description:
- Enable/disable rating CSS by URL.
- choice | disable | Disable rating CSS by URL.
- choice | enable | Enable rating CSS by URL.
required: false
choices: ["disable", "enable"]
ftgd_wf_rate_image_urls:
description:
- Enable/disable rating images by URL.
- choice | disable | Disable rating images by URL (blocked images are replaced with blanks).
- choice | enable | Enable rating images by URL (blocked images are replaced with blanks).
required: false
choices: ["disable", "enable"]
ftgd_wf_rate_javascript_urls:
description:
- Enable/disable rating JavaScript by URL.
- choice | disable | Disable rating JavaScript by URL.
- choice | enable | Enable rating JavaScript by URL.
required: false
choices: ["disable", "enable"]
ftgd_wf_filters_action:
description:
- Action to take for matches.
- choice | block | Block access.
- choice | monitor | Allow access while logging the action.
- choice | warning | Allow access after warning the user.
- choice | authenticate | Authenticate user before allowing access.
required: false
choices: ["block", "monitor", "warning", "authenticate"]
ftgd_wf_filters_auth_usr_grp:
description:
- Groups with permission to authenticate.
required: false
ftgd_wf_filters_category:
description:
- Categories and groups the filter examines.
required: false
ftgd_wf_filters_log:
description:
- Enable/disable logging.
- choice | disable | Disable setting.
- choice | enable | Enable setting.
required: false
choices: ["disable", "enable"]
ftgd_wf_filters_override_replacemsg:
description:
- Override replacement message.
required: false
ftgd_wf_filters_warn_duration:
description:
- Duration of warnings.
required: false
ftgd_wf_filters_warning_duration_type:
description:
- Re-display warning after closing browser or after a timeout.
- choice | session | After session ends.
- choice | timeout | After timeout occurs.
required: false
choices: ["session", "timeout"]
ftgd_wf_filters_warning_prompt:
description:
- Warning prompts in each category or each domain.
- choice | per-domain | Per-domain warnings.
- choice | per-category | Per-category warnings.
required: false
choices: ["per-domain", "per-category"]
ftgd_wf_quota_category:
description:
- FortiGuard categories to apply quota to (category action must be set to monitor).
required: false
ftgd_wf_quota_duration:
description:
- Duration of quota.
required: false
ftgd_wf_quota_override_replacemsg:
description:
- Override replacement message.
required: false
ftgd_wf_quota_type:
description:
- Quota type.
- choice | time | Use a time-based quota.
- choice | traffic | Use a traffic-based quota.
required: false
choices: ["time", "traffic"]
ftgd_wf_quota_unit:
description:
- Traffic quota unit of measurement.
- choice | B | Quota in bytes.
- choice | KB | Quota in kilobytes.
- choice | MB | Quota in megabytes.
- choice | GB | Quota in gigabytes.
required: false
choices: ["B", "KB", "MB", "GB"]
ftgd_wf_quota_value:
description:
- Traffic quota value.
required: false
override:
description:
- EXPERTS ONLY! KNOWLEDGE OF FMGR JSON API IS REQUIRED!
- List of multiple child objects to be added. Expects a list of dictionaries.
- Dictionaries must use FortiManager API parameters, not the ansible ones listed below.
- If submitted, all other prefixed sub-parameters ARE IGNORED.
- This object is MUTUALLY EXCLUSIVE with its options.
- We expect that you know what you are doing with these list parameters, and are leveraging the JSON API Guide.
- WHEN IN DOUBT, USE THE SUB OPTIONS BELOW INSTEAD TO CREATE OBJECTS WITH MULTIPLE TASKS
required: false
override_ovrd_cookie:
description:
- Allow/deny browser-based (cookie) overrides.
- choice | deny | Deny browser-based (cookie) override.
- choice | allow | Allow browser-based (cookie) override.
required: false
choices: ["deny", "allow"]
override_ovrd_dur:
description:
- Override duration.
required: false
override_ovrd_dur_mode:
description:
- Override duration mode.
- choice | constant | Constant mode.
- choice | ask | Prompt for duration when initiating an override.
required: false
choices: ["constant", "ask"]
override_ovrd_scope:
description:
- Override scope.
- choice | user | Override for the user.
- choice | user-group | Override for the user's group.
- choice | ip | Override for the initiating IP.
- choice | ask | Prompt for scope when initiating an override.
- choice | browser | Create browser-based (cookie) override.
required: false
choices: ["user", "user-group", "ip", "ask", "browser"]
override_ovrd_user_group:
description:
- User groups with permission to use the override.
required: false
override_profile:
description:
- Web filter profile with permission to create overrides.
required: false
override_profile_attribute:
description:
- Profile attribute to retrieve from the RADIUS server.
- choice | User-Name | Use this attribute.
- choice | NAS-IP-Address | Use this attribute.
- choice | Framed-IP-Address | Use this attribute.
- choice | Framed-IP-Netmask | Use this attribute.
- choice | Filter-Id | Use this attribute.
- choice | Login-IP-Host | Use this attribute.
- choice | Reply-Message | Use this attribute.
- choice | Callback-Number | Use this attribute.
- choice | Callback-Id | Use this attribute.
- choice | Framed-Route | Use this attribute.
- choice | Framed-IPX-Network | Use this attribute.
- choice | Class | Use this attribute.
- choice | Called-Station-Id | Use this attribute.
- choice | Calling-Station-Id | Use this attribute.
- choice | NAS-Identifier | Use this attribute.
- choice | Proxy-State | Use this attribute.
- choice | Login-LAT-Service | Use this attribute.
- choice | Login-LAT-Node | Use this attribute.
- choice | Login-LAT-Group | Use this attribute.
- choice | Framed-AppleTalk-Zone | Use this attribute.
- choice | Acct-Session-Id | Use this attribute.
- choice | Acct-Multi-Session-Id | Use this attribute.
required: false
choices:
- User-Name
- NAS-IP-Address
- Framed-IP-Address
- Framed-IP-Netmask
- Filter-Id
- Login-IP-Host
- Reply-Message
- Callback-Number
- Callback-Id
- Framed-Route
- Framed-IPX-Network
- Class
- Called-Station-Id
- Calling-Station-Id
- NAS-Identifier
- Proxy-State
- Login-LAT-Service
- Login-LAT-Node
- Login-LAT-Group
- Framed-AppleTalk-Zone
- Acct-Session-Id
- Acct-Multi-Session-Id
override_profile_type:
description:
- Override profile type.
- choice | list | Profile chosen from list.
- choice | radius | Profile determined by RADIUS server.
required: false
choices: ["list", "radius"]
url_extraction:
description:
- EXPERTS ONLY! KNOWLEDGE OF FMGR JSON API IS REQUIRED!
- List of multiple child objects to be added. Expects a list of dictionaries.
- Dictionaries must use FortiManager API parameters, not the ansible ones listed below.
- If submitted, all other prefixed sub-parameters ARE IGNORED.
- This object is MUTUALLY EXCLUSIVE with its options.
- We expect that you know what you are doing with these list parameters, and are leveraging the JSON API Guide.
- WHEN IN DOUBT, USE THE SUB OPTIONS BELOW INSTEAD TO CREATE OBJECTS WITH MULTIPLE TASKS
required: false
url_extraction_redirect_header:
description:
- HTTP header name to use for client redirect on blocked requests
required: false
url_extraction_redirect_no_content:
description:
- Enable / Disable empty message-body entity in HTTP response
- choice | disable | Disable setting.
- choice | enable | Enable setting.
required: false
choices: ["disable", "enable"]
url_extraction_redirect_url:
description:
- HTTP header value to use for client redirect on blocked requests
required: false
url_extraction_server_fqdn:
description:
- URL extraction server FQDN (fully qualified domain name)
required: false
url_extraction_status:
description:
- Enable URL Extraction
- choice | disable | Disable setting.
- choice | enable | Enable setting.
required: false
choices: ["disable", "enable"]
web:
description:
- EXPERTS ONLY! KNOWLEDGE OF FMGR JSON API IS REQUIRED!
- List of multiple child objects to be added. Expects a list of dictionaries.
- Dictionaries must use FortiManager API parameters, not the ansible ones listed below.
- If submitted, all other prefixed sub-parameters ARE IGNORED.
- This object is MUTUALLY EXCLUSIVE with its options.
- We expect that you know what you are doing with these list parameters, and are leveraging the JSON API Guide.
- WHEN IN DOUBT, USE THE SUB OPTIONS BELOW INSTEAD TO CREATE OBJECTS WITH MULTIPLE TASKS
required: false
web_blacklist:
description:
- Enable/disable automatic addition of URLs detected by FortiSandbox to blacklist.
- choice | disable | Disable setting.
- choice | enable | Enable setting.
required: false
choices: ["disable", "enable"]
web_bword_table:
description:
- Banned word table ID.
required: false
web_bword_threshold:
description:
- Banned word score threshold.
required: false
web_content_header_list:
description:
- Content header list.
required: false
web_keyword_match:
description:
- Search keywords to log when match is found.
required: false
web_log_search:
description:
- Enable/disable logging all search phrases.
- choice | disable | Disable setting.
- choice | enable | Enable setting.
required: false
choices: ["disable", "enable"]
web_safe_search:
description:
- Safe search type.
- FLAG Based Options. Specify multiple in list form.
- flag | url | Insert safe search string into URL.
- flag | header | Insert safe search header.
required: false
choices: ["url", "header"]
web_urlfilter_table:
description:
- URL filter table ID.
required: false
web_whitelist:
description:
- FortiGuard whitelist settings.
- FLAG Based Options. Specify multiple in list form.
- flag | exempt-av | Exempt antivirus.
- flag | exempt-webcontent | Exempt web content.
- flag | exempt-activex-java-cookie | Exempt ActiveX-JAVA-Cookie.
- flag | exempt-dlp | Exempt DLP.
- flag | exempt-rangeblock | Exempt RangeBlock.
- flag | extended-log-others | Support extended log.
required: false
choices:
- exempt-av
- exempt-webcontent
- exempt-activex-java-cookie
- exempt-dlp
- exempt-rangeblock
- extended-log-others
web_youtube_restrict:
description:
- YouTube EDU filter level.
- choice | strict | Strict access for YouTube.
- choice | none | Full access for YouTube.
- choice | moderate | Moderate access for YouTube.
required: false
choices: ["strict", "none", "moderate"]
youtube_channel_filter:
description:
- EXPERTS ONLY! KNOWLEDGE OF FMGR JSON API IS REQUIRED!
- List of multiple child objects to be added. Expects a list of dictionaries.
- Dictionaries must use FortiManager API parameters, not the ansible ones listed below.
- If submitted, all other prefixed sub-parameters ARE IGNORED.
- This object is MUTUALLY EXCLUSIVE with its options.
- We expect that you know what you are doing with these list parameters, and are leveraging the JSON API Guide.
- WHEN IN DOUBT, USE THE SUB OPTIONS BELOW INSTEAD TO CREATE OBJECTS WITH MULTIPLE TASKS
required: false
youtube_channel_filter_channel_id:
description:
- YouTube channel ID to be filtered.
required: false
youtube_channel_filter_comment:
description:
- Comment.
required: false
'''
EXAMPLES = '''
- name: DELETE Profile
fmgr_secprof_web:
name: "Ansible_Web_Filter_Profile"
mode: "delete"
- name: CREATE Profile
fmgr_secprof_web:
name: "Ansible_Web_Filter_Profile"
comment: "Created by Ansible Module TEST"
mode: "set"
extended_log: "enable"
inspection_mode: "proxy"
log_all_url: "enable"
options: "js"
ovrd_perm: "bannedword-override"
post_action: "block"
web_content_log: "enable"
web_extended_all_action_log: "enable"
web_filter_activex_log: "enable"
web_filter_applet_log: "enable"
web_filter_command_block_log: "enable"
web_filter_cookie_log: "enable"
web_filter_cookie_removal_log: "enable"
web_filter_js_log: "enable"
web_filter_jscript_log: "enable"
web_filter_referer_log: "enable"
web_filter_unknown_log: "enable"
web_filter_vbs_log: "enable"
web_ftgd_err_log: "enable"
web_ftgd_quota_usage: "enable"
web_invalid_domain_log: "enable"
web_url_log: "enable"
wisp: "enable"
wisp_algorithm: "auto-learning"
youtube_channel_status: "blacklist"
'''
RETURN = """
api_result:
description: full API response, includes status code and message
returned: always
type: str
"""
from ansible.module_utils.basic import AnsibleModule, env_fallback
from ansible.module_utils.connection import Connection
from ansible.module_utils.network.fortimanager.fortimanager import FortiManagerHandler
from ansible.module_utils.network.fortimanager.common import FMGBaseException
from ansible.module_utils.network.fortimanager.common import FMGRCommon
from ansible.module_utils.network.fortimanager.common import FMGRMethods
from ansible.module_utils.network.fortimanager.common import DEFAULT_RESULT_OBJ
from ansible.module_utils.network.fortimanager.common import FAIL_SOCKET_MSG
from ansible.module_utils.network.fortimanager.common import prepare_dict
from ansible.module_utils.network.fortimanager.common import scrub_dict
def fmgr_webfilter_profile_modify(fmgr, paramgram):
mode = paramgram["mode"]
adom = paramgram["adom"]
response = DEFAULT_RESULT_OBJ
url = ""
datagram = {}
# EVAL THE MODE PARAMETER FOR SET OR ADD
if mode in ['set', 'add', 'update']:
url = '/pm/config/adom/{adom}/obj/webfilter/profile'.format(adom=adom)
datagram = scrub_dict(prepare_dict(paramgram))
# EVAL THE MODE PARAMETER FOR DELETE
elif mode == "delete":
# SET THE CORRECT URL FOR DELETE
url = '/pm/config/adom/{adom}/obj/webfilter/profile/{name}'.format(adom=adom, name=paramgram["name"])
datagram = {}
response = fmgr.process_request(url, datagram, paramgram["mode"])
return response
#############
# END METHODS
#############
def main():
argument_spec = dict(
adom=dict(type="str", default="root"),
mode=dict(choices=["add", "set", "delete", "update"], type="str", default="add"),
youtube_channel_status=dict(required=False, type="str", choices=["disable", "blacklist", "whitelist"]),
wisp_servers=dict(required=False, type="str"),
wisp_algorithm=dict(required=False, type="str", choices=["auto-learning", "primary-secondary", "round-robin"]),
wisp=dict(required=False, type="str", choices=["disable", "enable"]),
web_url_log=dict(required=False, type="str", choices=["disable", "enable"]),
web_invalid_domain_log=dict(required=False, type="str", choices=["disable", "enable"]),
web_ftgd_quota_usage=dict(required=False, type="str", choices=["disable", "enable"]),
web_ftgd_err_log=dict(required=False, type="str", choices=["disable", "enable"]),
web_filter_vbs_log=dict(required=False, type="str", choices=["disable", "enable"]),
web_filter_unknown_log=dict(required=False, type="str", choices=["disable", "enable"]),
web_filter_referer_log=dict(required=False, type="str", choices=["disable", "enable"]),
web_filter_jscript_log=dict(required=False, type="str", choices=["disable", "enable"]),
web_filter_js_log=dict(required=False, type="str", choices=["disable", "enable"]),
web_filter_cookie_removal_log=dict(required=False, type="str", choices=["disable", "enable"]),
web_filter_cookie_log=dict(required=False, type="str", choices=["disable", "enable"]),
web_filter_command_block_log=dict(required=False, type="str", choices=["disable", "enable"]),
web_filter_applet_log=dict(required=False, type="str", choices=["disable", "enable"]),
web_filter_activex_log=dict(required=False, type="str", choices=["disable", "enable"]),
web_extended_all_action_log=dict(required=False, type="str", choices=["disable", "enable"]),
web_content_log=dict(required=False, type="str", choices=["disable", "enable"]),
replacemsg_group=dict(required=False, type="str"),
post_action=dict(required=False, type="str", choices=["normal", "block"]),
ovrd_perm=dict(required=False, type="list", choices=["bannedword-override",
"urlfilter-override",
"fortiguard-wf-override",
"contenttype-check-override"]),
options=dict(required=False, type="list", choices=["block-invalid-url",
"jscript",
"js",
"vbs",
"unknown",
"wf-referer",
"intrinsic",
"wf-cookie",
"per-user-bwl",
"activexfilter",
"cookiefilter",
"javafilter"]),
name=dict(required=False, type="str"),
log_all_url=dict(required=False, type="str", choices=["disable", "enable"]),
inspection_mode=dict(required=False, type="str", choices=["proxy", "flow-based"]),
https_replacemsg=dict(required=False, type="str", choices=["disable", "enable"]),
extended_log=dict(required=False, type="str", choices=["disable", "enable"]),
comment=dict(required=False, type="str"),
ftgd_wf=dict(required=False, type="list"),
ftgd_wf_exempt_quota=dict(required=False, type="str"),
ftgd_wf_max_quota_timeout=dict(required=False, type="int"),
ftgd_wf_options=dict(required=False, type="str", choices=["error-allow", "rate-server-ip",
"connect-request-bypass", "ftgd-disable"]),
ftgd_wf_ovrd=dict(required=False, type="str"),
ftgd_wf_rate_crl_urls=dict(required=False, type="str", choices=["disable", "enable"]),
ftgd_wf_rate_css_urls=dict(required=False, type="str", choices=["disable", "enable"]),
ftgd_wf_rate_image_urls=dict(required=False, type="str", choices=["disable", "enable"]),
ftgd_wf_rate_javascript_urls=dict(required=False, type="str", choices=["disable", "enable"]),
ftgd_wf_filters_action=dict(required=False, type="str", choices=["block", "monitor",
"warning", "authenticate"]),
ftgd_wf_filters_auth_usr_grp=dict(required=False, type="str"),
ftgd_wf_filters_category=dict(required=False, type="str"),
ftgd_wf_filters_log=dict(required=False, type="str", choices=["disable", "enable"]),
ftgd_wf_filters_override_replacemsg=dict(required=False, type="str"),
ftgd_wf_filters_warn_duration=dict(required=False, type="str"),
ftgd_wf_filters_warning_duration_type=dict(required=False, type="str", choices=["session", "timeout"]),
ftgd_wf_filters_warning_prompt=dict(required=False, type="str", choices=["per-domain", "per-category"]),
ftgd_wf_quota_category=dict(required=False, type="str"),
ftgd_wf_quota_duration=dict(required=False, type="str"),
ftgd_wf_quota_override_replacemsg=dict(required=False, type="str"),
ftgd_wf_quota_type=dict(required=False, type="str", choices=["time", "traffic"]),
ftgd_wf_quota_unit=dict(required=False, type="str", choices=["B", "KB", "MB", "GB"]),
ftgd_wf_quota_value=dict(required=False, type="int"),
override=dict(required=False, type="list"),
override_ovrd_cookie=dict(required=False, type="str", choices=["deny", "allow"]),
override_ovrd_dur=dict(required=False, type="str"),
override_ovrd_dur_mode=dict(required=False, type="str", choices=["constant", "ask"]),
override_ovrd_scope=dict(required=False, type="str", choices=["user", "user-group", "ip", "ask", "browser"]),
override_ovrd_user_group=dict(required=False, type="str"),
override_profile=dict(required=False, type="str"),
override_profile_attribute=dict(required=False, type="list", choices=["User-Name",
"NAS-IP-Address",
"Framed-IP-Address",
"Framed-IP-Netmask",
"Filter-Id",
"Login-IP-Host",
"Reply-Message",
"Callback-Number",
"Callback-Id",
"Framed-Route",
"Framed-IPX-Network",
"Class",
"Called-Station-Id",
"Calling-Station-Id",
"NAS-Identifier",
"Proxy-State",
"Login-LAT-Service",
"Login-LAT-Node",
"Login-LAT-Group",
"Framed-AppleTalk-Zone",
"Acct-Session-Id",
"Acct-Multi-Session-Id"]),
override_profile_type=dict(required=False, type="str", choices=["list", "radius"]),
url_extraction=dict(required=False, type="list"),
url_extraction_redirect_header=dict(required=False, type="str"),
url_extraction_redirect_no_content=dict(required=False, type="str", choices=["disable", "enable"]),
url_extraction_redirect_url=dict(required=False, type="str"),
url_extraction_server_fqdn=dict(required=False, type="str"),
url_extraction_status=dict(required=False, type="str", choices=["disable", "enable"]),
web=dict(required=False, type="list"),
web_blacklist=dict(required=False, type="str", choices=["disable", "enable"]),
web_bword_table=dict(required=False, type="str"),
web_bword_threshold=dict(required=False, type="int"),
web_content_header_list=dict(required=False, type="str"),
web_keyword_match=dict(required=False, type="str"),
web_log_search=dict(required=False, type="str", choices=["disable", "enable"]),
web_safe_search=dict(required=False, type="str", choices=["url", "header"]),
web_urlfilter_table=dict(required=False, type="str"),
web_whitelist=dict(required=False, type="list", choices=["exempt-av",
"exempt-webcontent",
"exempt-activex-java-cookie",
"exempt-dlp",
"exempt-rangeblock",
"extended-log-others"]),
web_youtube_restrict=dict(required=False, type="str", choices=["strict", "none", "moderate"]),
youtube_channel_filter=dict(required=False, type="list"),
youtube_channel_filter_channel_id=dict(required=False, type="str"),
youtube_channel_filter_comment=dict(required=False, type="str"),
)
module = AnsibleModule(argument_spec=argument_spec, supports_check_mode=False, )
# MODULE PARAMGRAM
paramgram = {
"mode": module.params["mode"],
"adom": module.params["adom"],
"youtube-channel-status": module.params["youtube_channel_status"],
"wisp-servers": module.params["wisp_servers"],
"wisp-algorithm": module.params["wisp_algorithm"],
"wisp": module.params["wisp"],
"web-url-log": module.params["web_url_log"],
"web-invalid-domain-log": module.params["web_invalid_domain_log"],
"web-ftgd-quota-usage": module.params["web_ftgd_quota_usage"],
"web-ftgd-err-log": module.params["web_ftgd_err_log"],
"web-filter-vbs-log": module.params["web_filter_vbs_log"],
"web-filter-unknown-log": module.params["web_filter_unknown_log"],
"web-filter-referer-log": module.params["web_filter_referer_log"],
"web-filter-jscript-log": module.params["web_filter_jscript_log"],
"web-filter-js-log": module.params["web_filter_js_log"],
"web-filter-cookie-removal-log": module.params["web_filter_cookie_removal_log"],
"web-filter-cookie-log": module.params["web_filter_cookie_log"],
"web-filter-command-block-log": module.params["web_filter_command_block_log"],
"web-filter-applet-log": module.params["web_filter_applet_log"],
"web-filter-activex-log": module.params["web_filter_activex_log"],
"web-extended-all-action-log": module.params["web_extended_all_action_log"],
"web-content-log": module.params["web_content_log"],
"replacemsg-group": module.params["replacemsg_group"],
"post-action": module.params["post_action"],
"ovrd-perm": module.params["ovrd_perm"],
"options": module.params["options"],
"name": module.params["name"],
"log-all-url": module.params["log_all_url"],
"inspection-mode": module.params["inspection_mode"],
"https-replacemsg": module.params["https_replacemsg"],
"extended-log": module.params["extended_log"],
"comment": module.params["comment"],
"ftgd-wf": {
"exempt-quota": module.params["ftgd_wf_exempt_quota"],
"max-quota-timeout": module.params["ftgd_wf_max_quota_timeout"],
"options": module.params["ftgd_wf_options"],
"ovrd": module.params["ftgd_wf_ovrd"],
"rate-crl-urls": module.params["ftgd_wf_rate_crl_urls"],
"rate-css-urls": module.params["ftgd_wf_rate_css_urls"],
"rate-image-urls": module.params["ftgd_wf_rate_image_urls"],
"rate-javascript-urls": module.params["ftgd_wf_rate_javascript_urls"],
"filters": {
"action": module.params["ftgd_wf_filters_action"],
"auth-usr-grp": module.params["ftgd_wf_filters_auth_usr_grp"],
"category": module.params["ftgd_wf_filters_category"],
"log": module.params["ftgd_wf_filters_log"],
"override-replacemsg": module.params["ftgd_wf_filters_override_replacemsg"],
"warn-duration": module.params["ftgd_wf_filters_warn_duration"],
"warning-duration-type": module.params["ftgd_wf_filters_warning_duration_type"],
"warning-prompt": module.params["ftgd_wf_filters_warning_prompt"],
},
"quota": {
"category": module.params["ftgd_wf_quota_category"],
"duration": module.params["ftgd_wf_quota_duration"],
"override-replacemsg": module.params["ftgd_wf_quota_override_replacemsg"],
"type": module.params["ftgd_wf_quota_type"],
"unit": module.params["ftgd_wf_quota_unit"],
"value": module.params["ftgd_wf_quota_value"],
},
},
"override": {
"ovrd-cookie": module.params["override_ovrd_cookie"],
"ovrd-dur": module.params["override_ovrd_dur"],
"ovrd-dur-mode": module.params["override_ovrd_dur_mode"],
"ovrd-scope": module.params["override_ovrd_scope"],
"ovrd-user-group": module.params["override_ovrd_user_group"],
"profile": module.params["override_profile"],
"profile-attribute": module.params["override_profile_attribute"],
"profile-type": module.params["override_profile_type"],
},
"url-extraction": {
"redirect-header": module.params["url_extraction_redirect_header"],
"redirect-no-content": module.params["url_extraction_redirect_no_content"],
"redirect-url": module.params["url_extraction_redirect_url"],
"server-fqdn": module.params["url_extraction_server_fqdn"],
"status": module.params["url_extraction_status"],
},
"web": {
"blacklist": module.params["web_blacklist"],
"bword-table": module.params["web_bword_table"],
"bword-threshold": module.params["web_bword_threshold"],
"content-header-list": module.params["web_content_header_list"],
"keyword-match": module.params["web_keyword_match"],
"log-search": module.params["web_log_search"],
"safe-search": module.params["web_safe_search"],
"urlfilter-table": module.params["web_urlfilter_table"],
"whitelist": module.params["web_whitelist"],
"youtube-restrict": module.params["web_youtube_restrict"],
},
"youtube-channel-filter": {
"channel-id": module.params["youtube_channel_filter_channel_id"],
"comment": module.params["youtube_channel_filter_comment"],
}
}
module.paramgram = paramgram
fmgr = None
if module._socket_path:
connection = Connection(module._socket_path)
fmgr = FortiManagerHandler(connection, module)
fmgr.tools = FMGRCommon()
else:
module.fail_json(**FAIL_SOCKET_MSG)
list_overrides = ['ftgd-wf', 'override', 'url-extraction', 'web', 'youtube-channel-filter']
paramgram = fmgr.tools.paramgram_child_list_override(list_overrides=list_overrides,
paramgram=paramgram, module=module)
results = DEFAULT_RESULT_OBJ
try:
results = fmgr_webfilter_profile_modify(fmgr, paramgram)
fmgr.govern_response(module=module, results=results,
ansible_facts=fmgr.construct_ansible_facts(results, module.params, paramgram))
except Exception as err:
raise FMGBaseException(err)
return module.exit_json(**results[1])
if __name__ == "__main__":
main()
|
High quality 3D model of Molteni Dart with arms. The model is textured and ready to use. It has a reasonable amount of polygons and an accurate grid. All models have been created in Cinema 4D and are not the result of exports from other software. The objects are modelled in Cinema 4D and non-connected, so you can modify them as you prefer before you put them in the scene. Models have sweep objects, extrusions, HyperNURBS, modifiers, etc. that make it easier for you to modify them.
import numpy as N
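# First-order (Gaussian) error propagation helpers. Each function returns the
# propagated value and its uncertainty for one elementary operation, using the
# standard rules: for z = log10(x), zerr = xerr / (x * ln 10); for z = 10**x,
# zerr = z * ln 10 * xerr; for products and quotients the relative errors add
# in quadrature; for sums and differences the absolute errors add in
# quadrature. When mask=True, a boolean validity array is returned as well.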
def log10(x, xerr, mask=False):
z = N.log10(x)
zerr = N.absolute(xerr * N.log10(N.e)/x)
if not mask:
return N.array([z, zerr])
else:
ok = x > 0.0
return N.array([z, zerr]), ok
def pow10(x, xerr, mask=False):
z = 10**x
zerr = N.absolute(N.log(10) * xerr * z)
if not mask:
return N.array([z, zerr])
else:
        ok = N.ones(z.shape, bool)
return N.array([z, zerr]), ok
def multiply(x, xerr, y, yerr, mask=False):
z = x*y
zerr = (xerr/x)**2 + (yerr/y)**2
zerr = N.absolute(N.sqrt(zerr) * z)
if not mask:
return N.array([z, zerr])
else:
        ok = N.ones(z.shape, bool)
return N.array([z, zerr]), ok
def divide(x, xerr, y, yerr, mask=False):
z = x/y
zerr = (xerr/x)**2 + (yerr/y)**2
zerr = N.absolute(N.sqrt(zerr) * z)
if not mask:
return N.array([z, zerr])
else:
ok = x != 0.0
return N.array([z, zerr]), ok
def add(x, xerr, y, yerr, mask=False):
z = x+y
zerr = N.absolute((xerr)**2 + (yerr)**2)
zerr = N.sqrt(zerr)
if not mask:
return N.array([z, zerr])
else:
        ok = N.ones(z.shape, bool)
return N.array([z, zerr]), ok
def subtract(x, xerr, y, yerr, mask=False):
z = x-y
zerr = N.absolute((xerr)**2 + (yerr)**2)
zerr = N.sqrt(zerr)
if not mask:
return N.array([z, zerr])
else:
        ok = N.ones(z.shape, bool)
return N.array([z, zerr]), ok
def test():
n = 100000
a = 10.0
aerr = 2.0
b = 3.0
berr = 0.3
x = N.random.normal(a, aerr, n)
y = N.random.normal(b, berr, n)
# log10
t = N.log10(x)
z, zerr = log10(a, aerr)
delta = N.absolute(t.mean() - z)/t.mean()
deltaerr = N.absolute(t.std() - zerr)/t.std()
    print('log10', delta, deltaerr)
# pow10
t = 10**y
z, zerr = pow10(b, berr)
delta = N.absolute(t.mean() - z)/t.mean()
deltaerr = N.absolute(t.std() - zerr)/t.std()
    print('pow10', delta, deltaerr)
# multiply
t = N.multiply(x, y)
z, zerr = multiply(a, aerr, b, berr)
delta = N.absolute(t.mean() - z)/t.mean()
deltaerr = N.absolute(t.std() - zerr)/t.std()
    print('multiply', delta, deltaerr)
# divide
t = N.divide(x, y)
z, zerr = divide(a, aerr, b, berr)
delta = N.absolute(t.mean() - z)/t.mean()
deltaerr = N.absolute(t.std() - zerr)/t.std()
    print('divide', delta, deltaerr)
# add
t = N.add(x, y)
z, zerr = add(a, aerr, b, berr)
delta = N.absolute(t.mean() - z)/t.mean()
deltaerr = N.absolute(t.std() - zerr)/t.std()
    print('add', delta, deltaerr)
# subtract
t = N.subtract(x, y)
z, zerr = subtract(a, aerr, b, berr)
delta = N.absolute(t.mean() - z)/t.mean()
deltaerr = N.absolute(t.std() - zerr)/t.std()
    print('subtract', delta, deltaerr)
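
# A small hedged addition: run the Monte Carlo sanity checks above when the
# module is executed directly.
if __name__ == '__main__':
    test()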
|
Hugo Cabret (Asa Butterfield) lives in the walls of the Paris train station. The 12-year-old, much like his late clock-making father (Jude Law), is something of a mechanical genius, and with his drunken uncle Claude (Ray Winstone) mysteriously away, it is up to him to make sure all of the station's clocks run with their usual precision. To do so, he must avoid the constant badgering of the facility's chief inspector (Sacha Baron Cohen), whose main joy in life is ridding the massive building of orphans by taking them away to the orphanage.
But Hugo isn't just intent on making sure the clocks continue ticking. He is also consumed with a passion to rebuild a broken automaton his father once discovered in a museum's attic. He is sure that by fixing it, he will reveal a final message from his dearly departed dad, that its words will give him some sort of reason as to why these troubles have befallen him and help the lad discover his life's purpose. What he does not know is that this quest will also lead him to the mysterious Georges Méliès (Ben Kingsley), and with the help of the old man's free-spirited goddaughter Isabelle (Chloë Grace Moretz), Hugo will change more lives for the better than just his own.
Based on the beloved novel The Invention of Hugo Cabret by Brian Selznick, legendary filmmaker Martin Scorsese's (Shutter Island, The Departed) 3D adaptation Hugo is at times a visual and emotional marvel that moved me to euphoric tears. Its sensational final 30 minutes are a celebration of the cinematic medium, a jovial harkening to the days of silent film and childlike imagination that speaks to the very best of who we are. It is a miraculous achievement that Hugo, during this home stretch, engages on levels and in ways few other films can admit to, and as such, it makes a decided case to be considered as one of the year's finest achievements.
Yet there are issues. Scorsese and screenwriter John Logan (Rango, The Aviator) are not entirely successful in translating Selznick's prose to the big screen. The entire subplot involving the inspector's never-ending quest to capture Hugo gets old far too fast, and there are moments where the movie dips into a state of juvenile sentimentality more suited to a Nickelodeon or Disney Channel sitcom than to high drama. Moreover, while I was never bored by the proceedings, it does take the movie a bit of time to hit its stride, and as wonderful as Hugo and Isabelle's friendship climactically proves to be, getting there took a tad more effort than it needed to.
But no filmmaker, not even James Cameron with Avatar, has used the 3D process in such a profound and intimate way. There were moments where I could actually feel myself disappearing within the frame, becoming one with the wispy bits of dust and spiraling layers of smoke filtering through it. This is as immersive a motion picture as any I have ever had the pleasure to experience, and if this is truly the future of 3D as it pertains to cinema, then I might finally have to reconsider my reticence toward the technology.
More than that, though, Scorsese has found a way to extrapolate on his love for the camera and the cinematic medium in a way that speaks universally to the child within us all. Hugo is more than a history lesson; it is a love letter, beautifully conveying the importance and significance of the early days of moviemaking and lovingly showcasing how those first moving images of trains, crowds mingling, and men journeying to the moon shaped the filmmakers of today. It enraptures the soul, engages the intellect, and connects in an emotional way that had me mesmerized. I could not look away from the screen, and the smile never left my face during the third act's bit of blissful delirium.
Yet do not misunderstand, what makes all this borderline brilliant is that Scorsese never forgets about his characters, never loses sight of Hugo's story or how his journey plays upon Méliès and his family. What is discovered comes from a character-driven place that is as distinct as it is wonderful, adding to the film's innate power to charm and to beguile and proving once again the best stories are always the ones you can relate to on a personal level.
This review could go on forever. There is so much more to talk about, so much just on a technical front - whether it be Howard Shore's (The Twilight Saga: Eclipse) score, Robert Richardson's (Inglourious Basterds) cinematography, or Dante Ferretti's (The Black Dahlia) eye-popping production design - I don't even know where to begin. I could go on about the intricacies of the script, the delicate and subtly complex nature of the majority of the performances (although Cohen did get on my nerves at times), or how Moretz's use of the word 'clandestine' made me shiver in absolute giggly glee.
The point is that, even with its flaws, Hugo is such a wondrous achievement on so many different levels that trying to go into detail in regard to them all borders on impossible. For me, the end result is that Scorsese has manufactured a motion picture that articulates everything I love and adore about cinema, but has done so in a way that also speaks to the greater angels within us all and to the better people each and every one of us hopes on some level to be. It is, in a word, sublime, and here's hoping general audiences will take the time to discover its heartwarming magic for themselves. |
import unittest
import tempfile
import collections
import os
import multiprocessing
from ngs_python.structure import alignedPair
class PairTestCase(unittest.TestCase):
def setUp(self):
''' Create temporary directory and example read pairs '''
# Make temporary file
self.dirName = tempfile.mkdtemp()
self.testPair = self.dirName + 'test.pair'
# Create read pairs
self.pair1 = ('chr1',1,40,'+','chr1',1960,2000,'-')
self.pair2 = ('chr1',1,40,'+','chr1',1959,2001,'-')
self.pair3 = ('chr1',1,40,'+','chr2',1959,1999,'-')
self.pair4 = ('chr1',1,40,'+','chr1',1959,1999,'+')
self.pair5 = ('chr1',100,140,'-','chr1',100,140,'+')
self.pair6 = ('chr1',100,140,'-','chr1',90,130,'+')
self.pair7 = ('chr1',100,140,'-','chr1',90,141,'+')
self.pair8 = ('chr1',99,140,'-','chr1',100,130,'+')
# Create pair list
self.pairList = ([self.pair2] + [self.pair3] * 2 + [self.pair4] * 3 +
[self.pair5] + [self.pair6] * 2)
def tearDown(self):
''' Remove temporary files and directories '''
if os.path.isfile(self.testPair):
os.remove(self.testPair)
os.removedirs(self.dirName)
    def readFile(self):
        ''' Read the temporary pair file and return its tab-split lines '''
with open(self.testPair) as f:
data = f.readlines()
output = [d.strip().split('\t') for d in data]
return(output)
    def processPair(self, rmDup, rmConcord, maxSize):
        ''' Run alignedPair.processPairs in a child process, feed it the
        example pairs through a pipe, and return the reported metrics '''
pipes = multiprocessing.Pipe(True)
process = multiprocessing.Process(
target = alignedPair.processPairs,
args = (pipes[0], self.testPair, rmDup, rmConcord, maxSize)
)
process.start()
pipes[0].close()
for pair in self.pairList:
pipes[1].send(pair)
pipes[1].send(None)
metrics = pipes[1].recv()
pipes[1].close()
process.join()
return(metrics)
class TestPairProcessing(PairTestCase):
def test_find_concordant(self):
''' Testing identification of concordant read pairs '''
# Check proper pair
self.assertTrue(alignedPair.concordant(self.pair1,2000))
# Check pair that is too big
self.assertFalse(alignedPair.concordant(self.pair2,2000))
# Check pair on different chromosome
self.assertFalse(alignedPair.concordant(self.pair3,2000))
# Check pair on same strand
self.assertFalse(alignedPair.concordant(self.pair4,2000))
# Check overlapping proper pairs
self.assertTrue(alignedPair.concordant(self.pair5,2000))
self.assertTrue(alignedPair.concordant(self.pair6,2000))
# Check when read pairs extend beyond each other
self.assertFalse(alignedPair.concordant(self.pair7,2000))
self.assertFalse(alignedPair.concordant(self.pair8,2000))
def test_process_concord_duplication(self):
''' Test correct processing of concordant and duplicated reads '''
# Check processing with concordant and duplicates removed
pairMetrics = self.processPair(rmDup = True, rmConcord = True,
maxSize = 2000)
self.assertEqual(self.readFile(), [map(str,self.pair2),
map(str,self.pair3), map(str,self.pair4)])
self.assertEqual(pairMetrics, collections.defaultdict(int, {
'total':9, 'unique':5, 'duplicate':4, 'concord':3, 'concorduni':2,
'discord':6, 'discorduni':3}))
# Check processing with duplicates removed
pairMetrics = self.processPair(rmDup = True, rmConcord = False,
maxSize = 2000)
self.assertEqual(self.readFile(), [map(str,self.pair2),
map(str,self.pair3), map(str,self.pair4), map(str,self.pair5),
map(str,self.pair6)])
self.assertEqual(pairMetrics, collections.defaultdict(int, {
'total':9, 'unique':5, 'duplicate':4, 'concord':3, 'concorduni':2,
'discord':6, 'discorduni':3}))
# Check processing with concordant removed
pairMetrics = self.processPair(rmDup = False, rmConcord = True,
maxSize = 2000)
self.assertEqual(self.readFile(), [map(str,self.pair2)] +
[map(str,self.pair3)] * 2 + [map(str,self.pair4)] * 3)
self.assertEqual(pairMetrics, collections.defaultdict(int, {
'total':9, 'unique':5, 'duplicate':4, 'concord':3, 'concorduni':2,
'discord':6, 'discorduni':3}))
# Check processing with nothing removed
pairMetrics = self.processPair(rmDup = False, rmConcord = False,
maxSize = 2000)
self.assertEqual(self.readFile(), [map(str,self.pair2)] +
[map(str,self.pair3)] * 2 + [map(str,self.pair4)] * 3 +
[map(str,self.pair5)] + [map(str,self.pair6)] * 2)
self.assertEqual(pairMetrics, collections.defaultdict(int, {
'total':9, 'unique':5, 'duplicate':4, 'concord':3, 'concorduni':2,
'discord':6, 'discorduni':3}))
suite = unittest.TestLoader().loadTestsFromTestCase(TestPairProcessing)
unittest.TextTestRunner(verbosity=3).run(suite)
|
# -*- coding:utf-8 -*-
"""
/***************************************************************************
Python Console for QGIS
-------------------
begin : 2012-09-10
copyright : (C) 2012 by Salvatore Larosa
email : lrssvtml (at) gmail (dot) com
***************************************************************************/
/***************************************************************************
* *
* This program is free software; you can redistribute it and/or modify *
* it under the terms of the GNU General Public License as published by *
* the Free Software Foundation; either version 2 of the License, or *
* (at your option) any later version. *
* *
***************************************************************************/
Some portions of code were taken from https://code.google.com/p/pydee/
"""
from qgis.PyQt.QtCore import QCoreApplication, QSize, Qt
from qgis.PyQt.QtWidgets import QDialog, QFileDialog, QMessageBox, QTableWidgetItem
from qgis.PyQt.QtGui import QIcon, QFont, QColor, QFontDatabase
from qgis.core import QgsSettings
from .console_compile_apis import PrepareAPIDialog
from .ui_console_settings import Ui_SettingsDialogPythonConsole
class optionsDialog(QDialog, Ui_SettingsDialogPythonConsole):
DEFAULT_COLOR = "#4d4d4c"
KEYWORD_COLOR = "#8959a8"
CLASS_COLOR = "#4271ae"
METHOD_COLOR = "#4271ae"
DECORATION_COLOR = "#3e999f"
NUMBER_COLOR = "#c82829"
COMMENT_COLOR = "#8e908c"
COMMENT_BLOCK_COLOR = "#8e908c"
BACKGROUND_COLOR = "#ffffff"
CURSOR_COLOR = "#636363"
CARET_LINE_COLOR = "#efefef"
SINGLE_QUOTE_COLOR = "#718c00"
DOUBLE_QUOTE_COLOR = "#718c00"
TRIPLE_SINGLE_QUOTE_COLOR = "#eab700"
TRIPLE_DOUBLE_QUOTE_COLOR = "#eab700"
MARGIN_BACKGROUND_COLOR = "#efefef"
MARGIN_FOREGROUND_COLOR = "#636363"
SELECTION_BACKGROUND_COLOR = "#d7d7d7"
SELECTION_FOREGROUND_COLOR = "#303030"
MATCHED_BRACE_BACKGROUND_COLOR = "#b7f907"
MATCHED_BRACE_FOREGROUND_COLOR = "#303030"
EDGE_COLOR = "#efefef"
FOLD_COLOR = "#efefef"
ERROR_COLOR = "#e31a1c"
def __init__(self, parent):
QDialog.__init__(self, parent)
self.setWindowTitle(QCoreApplication.translate(
"SettingsDialogPythonConsole", "Python Console Settings"))
self.parent = parent
self.setupUi(self)
self.listPath = []
self.lineEdit.setReadOnly(True)
self.restoreSettings()
self.initialCheck()
self.addAPIpath.setIcon(QIcon(":/images/themes/default/symbologyAdd.svg"))
self.addAPIpath.setToolTip(QCoreApplication.translate("PythonConsole", "Add API path"))
self.removeAPIpath.setIcon(QIcon(":/images/themes/default/symbologyRemove.svg"))
self.removeAPIpath.setToolTip(QCoreApplication.translate("PythonConsole", "Remove API path"))
self.preloadAPI.stateChanged.connect(self.initialCheck)
self.addAPIpath.clicked.connect(self.loadAPIFile)
self.removeAPIpath.clicked.connect(self.removeAPI)
self.compileAPIs.clicked.connect(self._prepareAPI)
self.resetFontColor.setIcon(QIcon(":/images/themes/default/mActionUndo.svg"))
self.resetFontColor.setIconSize(QSize(18, 18))
self.resetFontColorEditor.setIcon(QIcon(":/images/themes/default/mActionUndo.svg"))
self.resetFontColorEditor.setIconSize(QSize(18, 18))
self.resetFontColor.clicked.connect(self._resetFontColor)
self.resetFontColorEditor.clicked.connect(self._resetFontColorEditor)
def initialCheck(self):
if self.preloadAPI.isChecked():
self.enableDisable(False)
else:
self.enableDisable(True)
def enableDisable(self, value):
self.tableWidget.setEnabled(value)
self.addAPIpath.setEnabled(value)
self.removeAPIpath.setEnabled(value)
self.groupBoxPreparedAPI.setEnabled(value)
def loadAPIFile(self):
settings = QgsSettings()
lastDirPath = settings.value("pythonConsole/lastDirAPIPath", "", type=str)
fileAPI, selected_filter = QFileDialog.getOpenFileName(
self, "Open API File", lastDirPath, "API file (*.api)")
if fileAPI:
self.addAPI(fileAPI)
settings.setValue("pythonConsole/lastDirAPIPath", fileAPI)
def _prepareAPI(self):
if self.tableWidget.rowCount() != 0:
pap_file, filter = QFileDialog().getSaveFileName(
self,
"",
'*.pap',
"Prepared APIs file (*.pap)")
else:
QMessageBox.information(
self, self.tr("Warning!"),
                self.tr('You need to add some API files in order to compile'))
return
if pap_file:
api_lexer = 'QsciLexerPython'
api_files = []
count = self.tableWidget.rowCount()
for i in range(0, count):
api_files.append(self.tableWidget.item(i, 1).text())
api_dlg = PrepareAPIDialog(api_lexer, api_files, pap_file, self)
api_dlg.show()
api_dlg.activateWindow()
api_dlg.raise_()
api_dlg.prepareAPI()
self.lineEdit.setText(pap_file)
def accept(self):
if not self.preloadAPI.isChecked() and \
not self.groupBoxPreparedAPI.isChecked():
if self.tableWidget.rowCount() == 0:
QMessageBox.information(
self, self.tr("Warning!"),
self.tr('Please specify API file or check "Use preloaded API files"'))
return
if self.groupBoxPreparedAPI.isChecked() and \
not self.lineEdit.text():
QMessageBox.information(
self, self.tr("Warning!"),
QCoreApplication.translate('optionsDialog', 'The APIs file was not compiled, click on "Compile APIs…"')
)
return
self.saveSettings()
self.listPath = []
QDialog.accept(self)
def addAPI(self, pathAPI):
count = self.tableWidget.rowCount()
self.tableWidget.setColumnCount(2)
self.tableWidget.insertRow(count)
pathItem = QTableWidgetItem(pathAPI)
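        # Derive the display name from the file name by stripping the '.api' extension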
pathSplit = pathAPI.split("/")
apiName = pathSplit[-1][0:-4]
apiNameItem = QTableWidgetItem(apiName)
self.tableWidget.setItem(count, 0, apiNameItem)
self.tableWidget.setItem(count, 1, pathItem)
def removeAPI(self):
listItemSel = self.tableWidget.selectionModel().selectedRows()
for index in reversed(listItemSel):
self.tableWidget.removeRow(index.row())
def saveSettings(self):
settings = QgsSettings()
settings.setValue("pythonConsole/preloadAPI", self.preloadAPI.isChecked())
settings.setValue("pythonConsole/autoSaveScript", self.autoSaveScript.isChecked())
fontFamilyText = self.fontComboBox.currentText()
settings.setValue("pythonConsole/fontfamilytext", fontFamilyText)
fontFamilyTextEditor = self.fontComboBoxEditor.currentText()
settings.setValue("pythonConsole/fontfamilytextEditor", fontFamilyTextEditor)
fontSize = self.spinBox.value()
fontSizeEditor = self.spinBoxEditor.value()
for i in range(0, self.tableWidget.rowCount()):
text = self.tableWidget.item(i, 1).text()
self.listPath.append(text)
settings.setValue("pythonConsole/fontsize", fontSize)
settings.setValue("pythonConsole/fontsizeEditor", fontSizeEditor)
settings.setValue("pythonConsole/userAPI", self.listPath)
settings.setValue("pythonConsole/autoCompThreshold", self.autoCompThreshold.value())
settings.setValue("pythonConsole/autoCompThresholdEditor", self.autoCompThresholdEditor.value())
settings.setValue("pythonConsole/autoCompleteEnabledEditor", self.groupBoxAutoCompletionEditor.isChecked())
settings.setValue("pythonConsole/autoCompleteEnabled", self.groupBoxAutoCompletion.isChecked())
settings.setValue("pythonConsole/usePreparedAPIFile", self.groupBoxPreparedAPI.isChecked())
settings.setValue("pythonConsole/preparedAPIFile", self.lineEdit.text())
if self.autoCompFromAPIEditor.isChecked():
settings.setValue("pythonConsole/autoCompleteSourceEditor", 'fromAPI')
elif self.autoCompFromDocEditor.isChecked():
settings.setValue("pythonConsole/autoCompleteSourceEditor", 'fromDoc')
elif self.autoCompFromDocAPIEditor.isChecked():
settings.setValue("pythonConsole/autoCompleteSourceEditor", 'fromDocAPI')
if self.autoCompFromAPI.isChecked():
settings.setValue("pythonConsole/autoCompleteSource", 'fromAPI')
elif self.autoCompFromDoc.isChecked():
settings.setValue("pythonConsole/autoCompleteSource", 'fromDoc')
elif self.autoCompFromDocAPI.isChecked():
settings.setValue("pythonConsole/autoCompleteSource", 'fromDocAPI')
settings.setValue("pythonConsole/enableObjectInsp", self.enableObjectInspector.isChecked())
settings.setValue("pythonConsole/autoCloseBracket", self.autoCloseBracket.isChecked())
settings.setValue("pythonConsole/autoCloseBracketEditor", self.autoCloseBracketEditor.isChecked())
settings.setValue("pythonConsole/autoInsertionImport", self.autoInsertionImport.isChecked())
settings.setValue("pythonConsole/autoInsertionImportEditor", self.autoInsertionImportEditor.isChecked())
settings.setValue("pythonConsole/defaultFontColor", self.defaultFontColor.color())
settings.setValue("pythonConsole/defaultFontColorEditor", self.defaultFontColorEditor.color())
settings.setValue("pythonConsole/classFontColor", self.classFontColor.color())
settings.setValue("pythonConsole/classFontColorEditor", self.classFontColorEditor.color())
settings.setValue("pythonConsole/keywordFontColor", self.keywordFontColor.color())
settings.setValue("pythonConsole/keywordFontColorEditor", self.keywordFontColorEditor.color())
settings.setValue("pythonConsole/decorFontColor", self.decorFontColor.color())
settings.setValue("pythonConsole/decorFontColorEditor", self.decorFontColorEditor.color())
settings.setValue("pythonConsole/numberFontColor", self.numberFontColor.color())
settings.setValue("pythonConsole/numberFontColorEditor", self.numberFontColorEditor.color())
settings.setValue("pythonConsole/methodFontColor", self.methodFontColor.color())
settings.setValue("pythonConsole/methodFontColorEditor", self.methodFontColorEditor.color())
settings.setValue("pythonConsole/commentFontColor", self.commentFontColor.color())
settings.setValue("pythonConsole/commentFontColorEditor", self.commentFontColorEditor.color())
settings.setValue("pythonConsole/commentBlockFontColor", self.commentBlockFontColor.color())
settings.setValue("pythonConsole/commentBlockFontColorEditor", self.commentBlockFontColorEditor.color())
settings.setValue("pythonConsole/paperBackgroundColor", self.paperBackgroundColor.color())
settings.setValue("pythonConsole/paperBackgroundColorEditor", self.paperBackgroundColorEditor.color())
settings.setValue("pythonConsole/cursorColor", self.cursorColor.color())
settings.setValue("pythonConsole/cursorColorEditor", self.cursorColorEditor.color())
settings.setValue("pythonConsole/caretLineColor", self.caretLineColor.color())
settings.setValue("pythonConsole/caretLineColorEditor", self.caretLineColorEditor.color())
settings.setValue("pythonConsole/stderrFontColor", self.stderrFontColor.color())
settings.setValue("pythonConsole/singleQuoteFontColor", self.singleQuoteFontColor.color())
settings.setValue("pythonConsole/singleQuoteFontColorEditor", self.singleQuoteFontColorEditor.color())
settings.setValue("pythonConsole/doubleQuoteFontColor", self.doubleQuoteFontColor.color())
settings.setValue("pythonConsole/doubleQuoteFontColorEditor", self.doubleQuoteFontColorEditor.color())
settings.setValue("pythonConsole/tripleSingleQuoteFontColor", self.tripleSingleQuoteFontColor.color())
settings.setValue("pythonConsole/tripleSingleQuoteFontColorEditor",
self.tripleSingleQuoteFontColorEditor.color())
settings.setValue("pythonConsole/tripleDoubleQuoteFontColor", self.tripleDoubleQuoteFontColor.color())
settings.setValue("pythonConsole/tripleDoubleQuoteFontColorEditor",
self.tripleDoubleQuoteFontColorEditor.color())
settings.setValue("pythonConsole/edgeColorEditor", self.edgeColorEditor.color())
settings.setValue("pythonConsole/marginBackgroundColor", self.marginBackgroundColor.color())
settings.setValue("pythonConsole/marginBackgroundColorEditor", self.marginBackgroundColorEditor.color())
settings.setValue("pythonConsole/marginForegroundColor", self.marginForegroundColor.color())
settings.setValue("pythonConsole/marginForegroundColorEditor", self.marginForegroundColorEditor.color())
settings.setValue("pythonConsole/foldColorEditor", self.foldColorEditor.color())
settings.setValue("pythonConsole/selectionBackgroundColor", self.selectionBackgroundColor.color())
settings.setValue("pythonConsole/selectionBackgroundColorEditor", self.selectionBackgroundColorEditor.color())
settings.setValue("pythonConsole/selectionForegroundColor", self.selectionForegroundColor.color())
settings.setValue("pythonConsole/selectionForegroundColorEditor", self.selectionForegroundColorEditor.color())
settings.setValue("pythonConsole/matchedBraceBackgroundColor", self.matchedBraceBackgroundColor.color())
settings.setValue("pythonConsole/matchedBraceBackgroundColorEditor", self.matchedBraceBackgroundColorEditor.color())
settings.setValue("pythonConsole/matchedBraceForegroundColor", self.matchedBraceForegroundColor.color())
settings.setValue("pythonConsole/matchedBraceForegroundColorEditor", self.matchedBraceForegroundColorEditor.color())
def restoreSettings(self):
settings = QgsSettings()
font = QFontDatabase.systemFont(QFontDatabase.FixedFont)
self.spinBox.setValue(settings.value("pythonConsole/fontsize", font.pointSize(), type=int))
self.spinBoxEditor.setValue(settings.value("pythonConsole/fontsizeEditor", font.pointSize(), type=int))
self.fontComboBox.setCurrentFont(QFont(settings.value("pythonConsole/fontfamilytext",
font.family())))
self.fontComboBoxEditor.setCurrentFont(QFont(settings.value("pythonConsole/fontfamilytextEditor",
font.family())))
self.preloadAPI.setChecked(settings.value("pythonConsole/preloadAPI", True, type=bool))
self.lineEdit.setText(settings.value("pythonConsole/preparedAPIFile", "", type=str))
itemTable = settings.value("pythonConsole/userAPI", [])
if itemTable:
self.tableWidget.setRowCount(0)
for i in range(len(itemTable)):
self.tableWidget.insertRow(i)
self.tableWidget.setColumnCount(2)
pathSplit = itemTable[i].split("/")
apiName = pathSplit[-1][0:-4]
self.tableWidget.setItem(i, 0, QTableWidgetItem(apiName))
self.tableWidget.setItem(i, 1, QTableWidgetItem(itemTable[i]))
self.autoSaveScript.setChecked(settings.value("pythonConsole/autoSaveScript", False, type=bool))
self.autoCompThreshold.setValue(settings.value("pythonConsole/autoCompThreshold", 2, type=int))
self.autoCompThresholdEditor.setValue(settings.value("pythonConsole/autoCompThresholdEditor", 2, type=int))
self.groupBoxAutoCompletionEditor.setChecked(
settings.value("pythonConsole/autoCompleteEnabledEditor", True, type=bool))
self.groupBoxAutoCompletion.setChecked(settings.value("pythonConsole/autoCompleteEnabled", True, type=bool))
self.enableObjectInspector.setChecked(settings.value("pythonConsole/enableObjectInsp", False, type=bool))
self.autoCloseBracketEditor.setChecked(settings.value("pythonConsole/autoCloseBracketEditor", False, type=bool))
self.autoCloseBracket.setChecked(settings.value("pythonConsole/autoCloseBracket", False, type=bool))
self.autoInsertionImportEditor.setChecked(
settings.value("pythonConsole/autoInsertionImportEditor", True, type=bool))
self.autoInsertionImport.setChecked(settings.value("pythonConsole/autoInsertionImport", True, type=bool))
if settings.value("pythonConsole/autoCompleteSource") == 'fromDoc':
self.autoCompFromDoc.setChecked(True)
elif settings.value("pythonConsole/autoCompleteSource") == 'fromAPI':
self.autoCompFromAPI.setChecked(True)
elif settings.value("pythonConsole/autoCompleteSource") == 'fromDocAPI':
self.autoCompFromDocAPI.setChecked(True)
if settings.value("pythonConsole/autoCompleteSourceEditor") == 'fromDoc':
self.autoCompFromDocEditor.setChecked(True)
elif settings.value("pythonConsole/autoCompleteSourceEditor") == 'fromAPI':
self.autoCompFromAPIEditor.setChecked(True)
elif settings.value("pythonConsole/autoCompleteSourceEditor") == 'fromDocAPI':
self.autoCompFromDocAPIEditor.setChecked(True)
# Setting font lexer color
self.defaultFontColor.setColor(QColor(settings.value("pythonConsole/defaultFontColor", QColor(self.DEFAULT_COLOR))))
self.defaultFontColorEditor.setColor(
QColor(settings.value("pythonConsole/defaultFontColorEditor", QColor(self.DEFAULT_COLOR))))
self.keywordFontColor.setColor(QColor(settings.value("pythonConsole/keywordFontColor", QColor(self.KEYWORD_COLOR))))
self.keywordFontColorEditor.setColor(
QColor(settings.value("pythonConsole/keywordFontColorEditor", QColor(self.KEYWORD_COLOR))))
self.classFontColor.setColor(QColor(settings.value("pythonConsole/classFontColor", QColor(self.CLASS_COLOR))))
self.classFontColorEditor.setColor(
QColor(settings.value("pythonConsole/classFontColorEditor", QColor(self.CLASS_COLOR))))
self.methodFontColor.setColor(QColor(settings.value("pythonConsole/methodFontColor", QColor(self.METHOD_COLOR))))
self.methodFontColorEditor.setColor(
QColor(settings.value("pythonConsole/methodFontColorEditor", QColor(self.METHOD_COLOR))))
self.decorFontColor.setColor(QColor(settings.value("pythonConsole/decorFontColor", QColor(self.DECORATION_COLOR))))
self.decorFontColorEditor.setColor(
QColor(settings.value("pythonConsole/decorFontColorEditor", QColor(self.DECORATION_COLOR))))
self.numberFontColor.setColor(QColor(settings.value("pythonConsole/numberFontColor", QColor(self.NUMBER_COLOR))))
self.numberFontColorEditor.setColor(
QColor(settings.value("pythonConsole/numberFontColorEditor", QColor(self.NUMBER_COLOR))))
self.commentFontColor.setColor(QColor(settings.value("pythonConsole/commentFontColor", QColor(self.COMMENT_COLOR))))
self.commentFontColorEditor.setColor(
QColor(settings.value("pythonConsole/commentFontColorEditor", QColor(self.COMMENT_COLOR))))
self.commentBlockFontColor.setColor(
QColor(settings.value("pythonConsole/commentBlockFontColor", QColor(self.COMMENT_BLOCK_COLOR))))
self.commentBlockFontColorEditor.setColor(
QColor(settings.value("pythonConsole/commentBlockFontColorEditor", QColor(self.COMMENT_BLOCK_COLOR))))
self.paperBackgroundColor.setColor(
QColor(settings.value("pythonConsole/paperBackgroundColor", QColor(self.BACKGROUND_COLOR))))
self.paperBackgroundColorEditor.setColor(
QColor(settings.value("pythonConsole/paperBackgroundColorEditor", QColor(self.BACKGROUND_COLOR))))
self.caretLineColor.setColor(QColor(settings.value("pythonConsole/caretLineColor", QColor(self.CARET_LINE_COLOR))))
self.caretLineColorEditor.setColor(
QColor(settings.value("pythonConsole/caretLineColorEditor", QColor(self.CARET_LINE_COLOR))))
self.cursorColor.setColor(QColor(settings.value("pythonConsole/cursorColor", QColor(self.CURSOR_COLOR))))
self.cursorColorEditor.setColor(QColor(settings.value("pythonConsole/cursorColorEditor", QColor(self.CURSOR_COLOR))))
self.singleQuoteFontColor.setColor(settings.value("pythonConsole/singleQuoteFontColor", QColor(self.SINGLE_QUOTE_COLOR)))
self.singleQuoteFontColorEditor.setColor(
settings.value("pythonConsole/singleQuoteFontColorEditor", QColor(self.SINGLE_QUOTE_COLOR)))
self.doubleQuoteFontColor.setColor(settings.value("pythonConsole/doubleQuoteFontColor", QColor(self.DOUBLE_QUOTE_COLOR)))
self.doubleQuoteFontColorEditor.setColor(
settings.value("pythonConsole/doubleQuoteFontColorEditor", QColor(self.DOUBLE_QUOTE_COLOR)))
self.tripleSingleQuoteFontColor.setColor(
settings.value("pythonConsole/tripleSingleQuoteFontColor", QColor(self.TRIPLE_SINGLE_QUOTE_COLOR)))
self.tripleSingleQuoteFontColorEditor.setColor(
settings.value("pythonConsole/tripleSingleQuoteFontColorEditor", QColor(self.TRIPLE_SINGLE_QUOTE_COLOR)))
self.tripleDoubleQuoteFontColor.setColor(
settings.value("pythonConsole/tripleDoubleQuoteFontColor", QColor(self.TRIPLE_DOUBLE_QUOTE_COLOR)))
self.tripleDoubleQuoteFontColorEditor.setColor(
settings.value("pythonConsole/tripleDoubleQuoteFontColorEditor", QColor(self.TRIPLE_DOUBLE_QUOTE_COLOR)))
self.marginBackgroundColor.setColor(settings.value("pythonConsole/marginBackgroundColor", QColor(self.MARGIN_BACKGROUND_COLOR)))
self.marginBackgroundColorEditor.setColor(settings.value("pythonConsole/marginBackgroundColorEditor", QColor(self.MARGIN_BACKGROUND_COLOR)))
self.marginForegroundColor.setColor(settings.value("pythonConsole/marginForegroundColor", QColor(self.MARGIN_FOREGROUND_COLOR)))
self.marginForegroundColorEditor.setColor(settings.value("pythonConsole/marginForegroundColorEditor", QColor(self.MARGIN_FOREGROUND_COLOR)))
self.selectionForegroundColor.setColor(settings.value("pythonConsole/selectionForegroundColor", QColor(self.SELECTION_FOREGROUND_COLOR)))
self.selectionForegroundColorEditor.setColor(settings.value("pythonConsole/selectionForegroundColorEditor", QColor(self.SELECTION_FOREGROUND_COLOR)))
self.selectionBackgroundColor.setColor(settings.value("pythonConsole/selectionBackgroundColor", QColor(self.SELECTION_BACKGROUND_COLOR)))
self.selectionBackgroundColorEditor.setColor(settings.value("pythonConsole/selectionBackgroundColorEditor", QColor(self.SELECTION_BACKGROUND_COLOR)))
self.matchedBraceForegroundColor.setColor(settings.value("pythonConsole/matchedBraceForegroundColor", QColor(self.MATCHED_BRACE_FOREGROUND_COLOR)))
self.matchedBraceForegroundColorEditor.setColor(settings.value("pythonConsole/matchedBraceForegroundColorEditor", QColor(self.MATCHED_BRACE_FOREGROUND_COLOR)))
self.matchedBraceBackgroundColor.setColor(settings.value("pythonConsole/matchedBraceBackgroundColor", QColor(self.MATCHED_BRACE_BACKGROUND_COLOR)))
self.matchedBraceBackgroundColorEditor.setColor(settings.value("pythonConsole/matchedBraceBackgroundColorEditor", QColor(self.MATCHED_BRACE_BACKGROUND_COLOR)))
self.stderrFontColor.setColor(QColor(settings.value("pythonConsole/stderrFontColor", QColor(self.ERROR_COLOR))))
self.edgeColorEditor.setColor(settings.value("pythonConsole/edgeColorEditor", QColor(self.EDGE_COLOR)))
self.foldColorEditor.setColor(settings.value("pythonConsole/foldColorEditor", QColor(self.FOLD_COLOR)))
def _resetFontColor(self):
self.defaultFontColor.setColor(QColor(self.DEFAULT_COLOR))
self.keywordFontColor.setColor(QColor(self.KEYWORD_COLOR))
self.classFontColor.setColor(QColor(self.CLASS_COLOR))
self.methodFontColor.setColor(QColor(self.METHOD_COLOR))
self.decorFontColor.setColor(QColor(self.DECORATION_COLOR))
self.numberFontColor.setColor(QColor(self.NUMBER_COLOR))
self.commentFontColor.setColor(QColor(self.COMMENT_COLOR))
self.commentBlockFontColor.setColor(QColor(self.COMMENT_BLOCK_COLOR))
self.paperBackgroundColor.setColor(QColor(self.BACKGROUND_COLOR))
self.cursorColor.setColor(QColor(self.CURSOR_COLOR))
self.caretLineColor.setColor(QColor(self.CARET_LINE_COLOR))
self.singleQuoteFontColor.setColor(QColor(self.SINGLE_QUOTE_COLOR))
self.doubleQuoteFontColor.setColor(QColor(self.DOUBLE_QUOTE_COLOR))
self.tripleSingleQuoteFontColor.setColor(QColor(self.TRIPLE_SINGLE_QUOTE_COLOR))
self.tripleDoubleQuoteFontColor.setColor(QColor(self.TRIPLE_DOUBLE_QUOTE_COLOR))
self.marginBackgroundColor.setColor(QColor(self.MARGIN_BACKGROUND_COLOR))
self.marginForegroundColor.setColor(QColor(self.MARGIN_FOREGROUND_COLOR))
self.selectionBackgroundColor.setColor(QColor(self.SELECTION_BACKGROUND_COLOR))
self.selectionForegroundColor.setColor(QColor(self.SELECTION_FOREGROUND_COLOR))
self.matchedBraceBackgroundColor.setColor(QColor(self.MATCHED_BRACE_BACKGROUND_COLOR))
self.matchedBraceForegroundColor.setColor(QColor(self.MATCHED_BRACE_FOREGROUND_COLOR))
self.stderrFontColor.setColor(QColor(self.ERROR_COLOR))
def _resetFontColorEditor(self):
self.defaultFontColorEditor.setColor(QColor(self.DEFAULT_COLOR))
self.keywordFontColorEditor.setColor(QColor(self.KEYWORD_COLOR))
self.classFontColorEditor.setColor(QColor(self.CLASS_COLOR))
self.methodFontColorEditor.setColor(QColor(self.METHOD_COLOR))
self.decorFontColorEditor.setColor(QColor(self.DECORATION_COLOR))
self.numberFontColorEditor.setColor(QColor(self.NUMBER_COLOR))
self.commentFontColorEditor.setColor(QColor(self.COMMENT_COLOR))
self.commentBlockFontColorEditor.setColor(QColor(self.COMMENT_BLOCK_COLOR))
self.paperBackgroundColorEditor.setColor(QColor(self.BACKGROUND_COLOR))
self.cursorColorEditor.setColor(QColor(self.CURSOR_COLOR))
self.caretLineColorEditor.setColor(QColor(self.CARET_LINE_COLOR))
self.singleQuoteFontColorEditor.setColor(QColor(self.SINGLE_QUOTE_COLOR))
self.doubleQuoteFontColorEditor.setColor(QColor(self.DOUBLE_QUOTE_COLOR))
self.tripleSingleQuoteFontColorEditor.setColor(QColor(self.TRIPLE_SINGLE_QUOTE_COLOR))
self.tripleDoubleQuoteFontColorEditor.setColor(QColor(self.TRIPLE_DOUBLE_QUOTE_COLOR))
self.marginBackgroundColorEditor.setColor(QColor(self.MARGIN_BACKGROUND_COLOR))
self.marginForegroundColorEditor.setColor(QColor(self.MARGIN_FOREGROUND_COLOR))
self.selectionBackgroundColorEditor.setColor(QColor(self.SELECTION_BACKGROUND_COLOR))
self.selectionForegroundColorEditor.setColor(QColor(self.SELECTION_FOREGROUND_COLOR))
self.matchedBraceBackgroundColorEditor.setColor(QColor(self.MATCHED_BRACE_BACKGROUND_COLOR))
self.matchedBraceForegroundColorEditor.setColor(QColor(self.MATCHED_BRACE_FOREGROUND_COLOR))
self.edgeColorEditor.setColor(QColor(self.EDGE_COLOR))
self.foldColorEditor.setColor(QColor(self.FOLD_COLOR))
def reject(self):
self.restoreSettings()
QDialog.reject(self)
|
Mr. Day considers Virginia Beach his home, having lived in the city for 39 years. Originally from New Philadelphia, OH, he graduated from St. Leo University with a B.S. in Criminal Justice. Mr. Day is a retired Virginia Beach police officer who served his community for 27 years. His two children both attended VBCPS, and he was an active dad in their schools, including serving on the Ocean Lakes High Planning Council. Paul currently has a grandchild who attends Virginia Beach public schools. He continues to be an active participant in our schools, having served as a substitute teacher for nearly 10 years. Mr. Day seeks to ensure that VBCPS adheres to the highest standards in the areas of both education and discipline. Paul is active in his church, Azalea Garden, and serves in its food bank. |
import time
from django.shortcuts import render
from django.http import HttpResponse, Http404
from MergeServer.models import Results
from KeyManager.Util.Paillier import mul, add
from KeyManager.Util.TimeCost import SPATime
from CommunicationServer.models import Transaction
# Create your views here.
def get_merge_response(request):
start = time.time()
if request.method != 'POST':
        raise Http404
values = request.POST['values'].split(',')
requester = request.POST['requester']
requester_number = long(requester)
respondent = request.POST['respondent']
paillier_n = long(request.POST['paillier_n'])
n_square = paillier_n * paillier_n
paillier_g = long(request.POST['paillier_g'])
spa_policies = request.POST['spa_policies'].split(',')
settings = request.POST['settings']
size = len(spa_policies)
results = Results.objects.filter(requester=requester_number)
if results.exists():
result = results[0].values.split(',')
count = results[0].count
print "count: " + str(count)
new_result = []
for i in range(size):
if spa_policies[i] == 'Average':
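                # 'Average': homomorphically add the encrypted value sums; weights are added in the clear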
[this_value, this_weight] = result[i].split(' ')
[sum_value, sum_weight] = values[i].split(' ')
this_result = str(add(long(this_value), long(sum_value), n_square)) \
+ ' ' + str(long(this_weight) + long(sum_weight))
new_result.append(this_result)
if spa_policies[i] == 'MaximumValue' or spa_policies[i] == 'MinimumValue':
this_result = result[i] + ' ' + values[i]
new_result.append(this_result)
if spa_policies[i] == 'MajorityPreferred' or spa_policies[i] == 'MinorityPreferred':
this_result = []
this_values = values[i].split(' ')
result_element = result[i].split(' ')
this_size = len(result_element)
for j in range(this_size):
ans = add(long(this_values[j]), long(result_element[j]), n_square)
this_result.append(str(ans))
new_result.append(' '.join(this_result))
results.update(values=','.join(new_result), count=count + 1)
end = time.time()
SPATime(end - start)
else:
r = Results(values=request.POST['values']
, requester=requester_number
, spa_policies=request.POST['spa_policies']
, n_square=n_square
, settings=settings)
r.save()
end = time.time()
SPATime(end - start)
return HttpResponse('ok')
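# --- Illustrative sketch (assumption, not the project's implementation) ---
# The add() helper imported from KeyManager.Util.Paillier is used above to
# combine encrypted values; mul() is imported but unused here. A minimal
# sketch of what such Paillier helpers typically look like is given below;
# the real KeyManager module may differ.
def _paillier_add_sketch(c1, c2, n_square):
    # Multiplying ciphertexts modulo n^2 adds the underlying plaintexts:
    # E(m1) * E(m2) mod n^2 == E(m1 + m2)
    return (c1 * c2) % n_square

def _paillier_mul_sketch(ciphertext, scalar, n_square):
    # Raising a ciphertext to a scalar multiplies the plaintext:
    # E(m) ** k mod n^2 == E(k * m)
    return pow(ciphertext, scalar, n_square)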
|
The Aristotle team rounds up the week’s best marketing news stories, interesting op-eds on the latest trends and just plain cool sites on the Web. This week we’re bringing you even more Vine vs. Instagram stats, Facebook’s new #hashtags and US tablet stats.
The Aristotle team has rounded up this week’s best marketing news stories, interesting op-eds on the latest trends, and just plain cool sites on the Web. This week we’re bringing you new Facebook statuses, Foursquare for business owners, and infographic tattoos. |
# def comet(mvs, pcs, recdepth, maxrecdepth):
# options = []
# curxp = xp_diff(mvs[0][0].color, pcs)
# intro = '// ' + ('-' * (recdepth * 2)) + ' COMET: '
# if DEBUG_OUTPUT:
# print(intro + '{} opts'.format(len(mvs)))
# else:
# print(' ' * recdepth + '*')
# if len(mvs) == 1:
# tmp, xp, oma = can_move_piece(mvs[0][0], mvs[0][1][0], mvs[0][1][1], pcs, aimode=True)
# if DEBUG_OUTPUT:
# print(intro + 'one opt ({} -> {}) - xp {}'.format(num_to_chess_coord(
# mvs[0][0].x,
# mvs[0][0].y),
# num_to_chess_coord(
# mvs[0][1][0],
# mvs[0][1][1]),
# xp))
# return 0, xp
# if recdepth == maxrecdepth:
# if DEBUG_OUTPUT:
# print(intro + 'unable to continue analysis; maximum recursion depth exceeded;'
# ' xp set to current xp ({})'.format(curxp))
# return 0, curxp
# for m in mvs:
# if m[0].type != KING:
# break
# else:
# # Check if possible to check or checkmate
# for n in range(len(mvs)):
# tmp, xp, oma = can_move_piece(mvs[n][0], mvs[n][1][0], mvs[n][1][1], pcs, aimode=True)
# if type(oma) == bool and oma:
# return n, xp
#
# for n in range(len(mvs)):
# tmp, xp, oma = can_move_piece(mvs[n][0], mvs[n][1][0], mvs[n][1][1], pcs, aimode=True)
# if xp - curxp >= 2:
# return n, xp
# # / Check if possible to check or checkmate /
# myking = m[0]
# for p in pcs:
# if p.color != myking.color and p.type == KING:
# otherking = p
# if myking.resist >= 6:
# mv = select_move_toward(myking, otherking, mvs)
# else:
# mv = select_move_away(myking, otherking, mvs)
# a, x, c = can_move_piece(myking, mvs[mv][1][0], mvs[mv][1][1], pcs, aimode=True)
# return mv, x
# for i in range(len(mvs)):
# tmp, xp, oma = can_move_piece(mvs[i][0], mvs[i][1][0], mvs[i][1][1], pcs, aimode=True)
# if type(oma) == bool and oma:
# turn = i
# if DEBUG_OUTPUT:
# print(intro + 'opt {} ({} -> {}) leads to victory, xp counted as inf; exiting analysis'.format(i,
# num_to_chess_coord(
# mvs[i][0].x,
# mvs[i][0].y),
# num_to_chess_coord(
# mvs[i][1][0],
# mvs[i][1][1])))
# xp = INF
# return i, INF
#
# elif type(oma) != bool and len(oma) == 1:
# bpcs = []
# for p in pcs:
# bpcs.append(copy.deepcopy(p))
#
# if DEBUG_OUTPUT:
# print(intro + 'analyzing our opt {} ({} -> {})...'.format(i,
# num_to_chess_coord(
# mvs[i][0].x,
# mvs[i][0].y),
# num_to_chess_coord(
# mvs[i][1][0],
# mvs[i][1][1])))
#
# move_piece(mvs[i][0], mvs[i][1][0], mvs[i][1][1], bpcs)
# if DEBUG_OUTPUT:
# print(intro + 'fc; one opponent opt ({} -> {})'.format(num_to_chess_coord(
# oma[0][0].x,
# oma[0][0].y),
# num_to_chess_coord(
# oma[0][1][0],
# oma[0][1][1]
# )))
# move_piece(oma[0][0], oma[0][1][0], oma[0][1][1], bpcs)
#
# newmv = get_all_moves(mvs[0][0].color, bpcs)
# if type(newmv) != bool:
# tmptmp, xp = comet(newmv, bpcs, recdepth + 1, maxrecdepth)# if maxrecdepth - recdepth >= 1 else -1, curxp
# if DEBUG_OUTPUT:
# print(intro + 'analysis of opt {} finished; xp {}'.format(i, xp))
# if xp == INF:
# if DEBUG_OUTPUT:
# print(intro + 'checkmate detected, exiting analysis')
# return i, INF
# else:
# if DEBUG_OUTPUT:
# print(intro + 'opt {} leads to defeat/stalemate, xp counted as -inf'.format(i))
# xp = -INF
#
# elif type(oma) != bool and get_piece_by_coords(oma[0][1][0], oma[0][1][1], pcs) is not None:
# bpcs = []
# for p in pcs:
# bpcs.append(copy.deepcopy(p))
#
# if DEBUG_OUTPUT:
# print(intro + 'analyzing opt {} ({} -> {})...'.format(i,
# num_to_chess_coord(
# mvs[i][0].x,
# mvs[i][0].y),
# num_to_chess_coord(
# mvs[i][1][0],
# mvs[i][1][1])))
#
# move_piece(mvs[i][0], mvs[i][1][0], mvs[i][1][1], bpcs)
# if DEBUG_OUTPUT:
# print(intro + 'fc; {} opponent opts'.format(len(oma)))
# xps = []
# for q in range(len(oma)):
# nbpcs = []
# for p in bpcs:
# nbpcs.append(copy.deepcopy(p))
# if DEBUG_OUTPUT:
# print(intro + 'analyzing opponent opt {} ({} -> {})'.format(q, num_to_chess_coord(
# oma[0][0].x,
# oma[0][0].y),
# num_to_chess_coord(
# oma[0][1][0],
# oma[0][1][1]
# )))
# move_piece(oma[q][0], oma[q][1][0], oma[q][1][1], nbpcs)
#
# newmv = get_all_moves(mvs[0][0].color, nbpcs)
# if type(newmv) != bool:
# if maxrecdepth - recdepth >= 1:
# t, xpn = comet(newmv, nbpcs, recdepth + 1, maxrecdepth)
# if DEBUG_OUTPUT:
# print(intro + 'analysis of opponent opt {} finished; xp {}'.format(q, xpn))
# else:
# xpn = curxp
# if DEBUG_OUTPUT:
# print(intro + 'unable to analyze opponent opt {}; maximum recursion depth exceeded;'
# ' xp set to current xp ({})'.format(q, xpn))
#
# else:
# if DEBUG_OUTPUT:
# print(intro + 'opponent opt {} leads to defeat/stalemate, xp counted as -inf'.format(q))
# xpn = -INF
#
# xps.append(xpn)
#
# xp = min(xps)
# if DEBUG_OUTPUT:
# print(intro + 'analysis of opt {} finished, final possible xps {}'.format(i, xps))
# print(intro + 'min xp {}'.format(xp))
#
# # elif type(oma) != bool and len(oma) == 2:
# # bpcs = []
# # for p in pcs:
# # bpcs.append(copy.deepcopy(p))
# #
# # if DEBUG_OUTPUT:
# # print(
# # intro + 'semi-analyzing opt {} ({} -> {})...'.format(i,
# # num_to_chess_coord(
# # mvs[
# # i][0].x,
# # mvs[
# # i][0].y),
# # num_to_chess_coord(
# # mvs[
# # i][
# # 1][
# # 0],
# # mvs[
# # i][
# # 1][
# # 1])))
# #
# # move_piece(mvs[i][0], mvs[i][1][0], mvs[i][1][1], bpcs)
# # t, xp = comet(oma, bpcs, -1, -1)
# # move_piece(oma[t][0], oma[t][1][0], oma[t][1][1], bpcs)
# # xp = xp_sum(mvs[0][0].color, bpcs)
# # if DEBUG_OUTPUT:
# # print(intro + 'semi-analysis of opt {} finished; xp {}'.format(i, xp))
#
# elif DEBUG_OUTPUT:
# print(intro + 'opt {} ({} -> {}) - not fc, xp {}'.format(i,
# num_to_chess_coord(mvs[i][0].x, mvs[i][0].y),
# num_to_chess_coord(mvs[i][1][0], mvs[i][1][1]),
# xp))
# options.append(xp)
# else:
# m = max(options)
# turns = [i for i in range(len(options)) if options[i] == m]
# turn = random.choice(turns)
# if DEBUG_OUTPUT:
# print(intro + 'final opts {}'.format(str(options).replace('100000000000000000000', 'inf')))
#
# if DEBUG_OUTPUT:
# print(intro + 'selected opt {}'.format(turn))
#
# return turn, max(options)
# def get_piece_by_coords(x, y, pieces):
# for piece in pieces:
# if piece.x == x and piece.y == y:
# return piece
# return None
#
# def get_index_by_coords(x, y, pieces):
# for i in range(len(pieces)):
# if pieces[i].x == x and pieces[i].y == y:
# return i
# return None
# def get_moves_by_offset(diags, x, y, board):
# stopped = [False] * 4
# ret = []
#
# for i in range(1, max(BOARD_X, BOARD_Y)):
# for d in range(4):
# if not stopped[d]:
# p = board[x + diags[d][0] * i, y + diags[d][1] * i]
# if p is not None:
# stopped[d] = True
# if p.color != self.color:
# ret.append((p.x, p.y))
# else:
# ret.append((self.x + diags[d][0] * i, self.y + diags[d][1] * i))
# return ret
# def is_check(color, pieces):
# return get_all_moves(not color, pieces, has_king_cpt=True)
#
# def is_under_attack_of(piece1, piece2, pieces):
# allm = get_all_moves(True, pieces) + get_all_moves(False, pieces)
# allm = [x for x in allm if x[0] == piece1 and x[1][0] == piece2.x and x[1][1] == piece2.y]
# return not len(allm) == 0
#
# def can_move_piece(piece, x2, y2, pieces, aimode=False):
# pieces_back = []
#
# for p in pieces:
# pieces_back.append(copy.deepcopy(p))
#
# p1 = get_index_by_coords(piece.x, piece.y, pieces_back)
# p2 = get_index_by_coords(x2, y2, pieces_back)
#
# xp = 0
#
# if p1 is None:
# raise Exception('No such piece')
# if p2 is not None:
# xp += pieces_back[p2].type.capture_price
# pieces_back.pop(p2)
# if p1 > p2:
# p1 -= 1
#
# pieces_back[p1].x = x2
# pieces_back[p1].y = y2
#
# ret = not is_check(pieces_back[p1].color, pieces_back)
#
# xp += CHECK_XP if is_check(not piece.color, pieces_back) else 0
# xp = TURN_XP if xp == 0 else xp
# pieces_back[p1].add_xp(xp)
#
# total_xp = xp_diff(piece.color, pieces_back)
#
# if aimode:
# return ret, total_xp, get_all_moves(not piece.color, pieces_back) # total_xp = difference between sum of xp of pcs
# # of this color and pieces of the opp color
# else:
# return ret
#
# def move_piece(piece, x2, y2, pieces):
# global pawn_prom
# p1 = get_index_by_coords(piece.x, piece.y, pieces)
# p2 = get_index_by_coords(x2, y2, pieces)
#
# xpsum = 0
#
# if p1 is None:
# raise Exception('No such piece')
# if p1 == p2:
# raise Exception('Can\'t move piece to previous location')
# if p2 is not None:
# xpsum += pieces[p2].type.capture_price
# pieces.pop(p2)
# if p1 > p2:
# p1 -= 1
#
# if pieces[p1].type == PAWN and pawn_prom:
# print(LANGUAGE.phrases.PROMOTION_CHOICE)
# typ = input()
# if typ == '1':
# pieces[p1] = Piece(p1.color, ROOK, x2, y2, moved=True)
# elif typ == '2':
# pieces[p1] = Piece(p1.color, BISHOP, x2, y2, moved=True)
# elif typ == '3':
# pieces[p1] = Piece(p1.color, KNIGHT, x2, y2, moved=True)
# else:
# pieces[p1] = Piece(p1.color, QUEEN, x2, y2, moved=True)
# else:
# pieces[p1].x = x2
# pieces[p1].y = y2
#
# xpsum += CHECK_XP if is_check(not piece.color, pieces) else 0
# xpsum = TURN_XP if xpsum == 0 else xpsum
# pieces[p1].add_xp(xpsum)
#
# def get_all_moves(color, pieces, has_king_cpt=False, has_mob_cpt=False):
# ret = []
# captures = []
# res = []
# for p in pieces:
# if p.color == color:
# rt, cpt = p.get_moves(pieces)
# ret.extend([(p, x) for x in rt])
# captures.extend([(p, x) for x in cpt])
# for x, y in cpt:
# if get_piece_by_coords(x, y, pieces).type == KING:
# return True
# if get_piece_by_coords(x, y, pieces).type == MOB and has_mob_cpt:
# return True
#
# if has_king_cpt or has_mob_cpt:
# return False
#
# # --- Check all capture variants for checks
# popped = []
#
# for i in range(len(captures)):
# b = can_move_piece(captures[i][0], captures[i][1][0], captures[i][1][1], pieces)
# if not b:
# popped.append(captures[i])
#
# for p in popped:
# captures.remove(p)
#
# if len(captures) == 0:
# # --- Same with ret
# popped = []
#
# for i in range(len(ret)):
# b = can_move_piece(ret[i][0], ret[i][1][0], ret[i][1][1], pieces)
# if not b:
# popped.append(ret[i])
#
# for p in popped:
# ret.remove(p)
#
# res = ret
# else:
# res = captures
#
# if len(res) == 0:
# return is_check(color, pieces)
# else:
# return res
# def change_back(v):
# global curon
#
# v = False
# if v:
# curon = not curon
#
# if curon:
# print('\033[47m', end='')
# else:
# print('\033[0m', end='')
# def power(type):
# if type == PAWN:
# return 1
# if type == KNIGHT:
# return 2
# if type == BISHOP:
# return 3
# if type == ROOK:
# return 4
# if type == QUEEN:
# return 5
# if type == KING:
# return 6
# if type == AMAZON:
# return 7
#
# def select_move_toward(p1, p2, mvs):
# xd, yd = abs(p1.x - p2.x), abs(p1.y - p2.y)
# resindex = -1
# resval = -INF
# for m in range(len(mvs)):
# nx, ny = abs(mvs[m][1][0] - p2.x), abs(mvs[m][1][1] - p2.y)
# change = xd - nx + yd - ny
# if change > resval:
# resval = change
# resindex = m
#
# return resindex
#
# def select_move_away(p1, p2, mvs):
# xd, yd = abs(p1.x - p2.x), abs(p1.y - p2.y)
# resindex = -1
# resval = -INF
# for m in range(len(mvs)):
# nx, ny = abs(mvs[m][1][0] - p2.x), abs(mvs[m][1][1] - p2.y)
# change = nx - xd + ny - yd
# if change > resval:
# resval = change
# resindex = m
#
# return resindex
#
# def escape_capture_of(mob, piece1, pieces):
# variants = [(-1, 0), (1, 0), (0, -1), (0, 1)]
# ret = []
# for v in variants:
# pieces_back = []
# for p in pieces:
# pieces_back.append(copy.deepcopy(p))
# move_piece(mob, mob.x + v[0], mob.y + v[1], pieces_back)
# if not is_under_attack_of(piece1, mob, pieces_back):
# ret.append(v)
# return ret
#
# def get_all_mobs(pieces):
# return [x for x in pieces if x.type == MOB]
#
# def can_move_piece(self, p1, x, y, return_details=False):
# board_c = copy.copy(self)
#
# p2 = self[x, y]
# board_c.pieces.pop((self[p1].x, self[p1].y))
#
# xp = 0
#
# if p1 is None:
# raise Exception('No such piece')
# if p2 is not None:
# xp += board_c.pieces_l[p2].type.capture_price
# self.remove(p2)
# if p1 > p2:
# p1 -= 1
#
# board_c[p1].x = x
# board_c[p1].y = y
#
# ret = not self.is_check(board_c.pieces_l[p1].color)
#
# xp += CHECK_XP if self.is_check(not board_c.pieces_l[p1].color) else 0
# xp = TURN_XP if xp == 0 else xp
# board_c.pieces_l[p1].add_xp(xp)
#
# total_xp = board_c.xp_diff(board_c.pieces_l[p1].color)
#
# if return_details:
# mv, cpt = self.get_all_moves()
# return ret, total_xp, mv[int(not board_c.pieces_l[p1].color)], cpt[int(not board_c.pieces_l[p1].color)]
# # total_xp = difference between sum of xp of pcs
# # of this color and pieces of the opp color
# else:
# return ret
#
# def move_to_chess(piece, x, y, pieces):
# if piece.type == PAWN:
# if piece.x == x:
# ret = num_to_chess_coord(x, y)
# else:
# ret = LINES[piece.x] + LINES[x]
# else:
# if get_piece_by_coords(x, y, pieces) is None:
# ret = piece.type.abbr + num_to_chess_coord(x, y)
# else:
# ret = piece.type.abbr + ':' + num_to_chess_coord(x, y)
# return ret
#
# bool1 = len(t) != 2 or len(t[0]) != 2 or len(t[1]) != 2 or \
# t[0][0] not in LINES[:BOARD_X] or t[1][0] not in LINES[:BOARD_X] or \
# t[0][1] not in string.digits or t[1][1] not in string.digits |
So: later on this year, Sub Pop will be releasing Fear Fun, the debut album from an artist called Father John Misty. That’s the latest project from one J. Tillman, who’s released several solo albums, and has also spent time in an obscure folk-rock outfit called Fleet Foxes.
From looking at the bio, it’s worth noting the literary types mentioned: the names of Richard Brautigan, Oscar Wilde, and Charles Bukowski are invoked alongside those of Arthur Russell and Harry Nilsson. All of which seems like catnip designed to lure in lit-bloggers such as ourselves. And…evidently, it worked. |
import mock
from bugwarrior.services.jira import JiraService
from .base import ServiceTest
class TestJiraIssue(ServiceTest):
SERVICE_CONFIG = {
'jira.username': 'one',
'jira.base_uri': 'two',
'jira.password': 'three',
}
def setUp(self):
with mock.patch('jira.client.JIRA._get_json'):
self.service = self.get_mock_service(JiraService)
def test_to_taskwarrior(self):
arbitrary_project = 'DONUT'
arbitrary_id = '10'
arbitrary_url = 'http://one'
arbitrary_summary = 'lkjaldsfjaldf'
arbitrary_record = {
'fields': {
'priority': 'Blocker',
'summary': arbitrary_summary,
},
'key': '%s-%s' % (arbitrary_project, arbitrary_id, ),
}
arbitrary_extra = {
'jira_version': 5,
'annotations': ['an annotation'],
}
issue = self.service.get_issue_for_record(
arbitrary_record, arbitrary_extra
)
expected_output = {
'project': arbitrary_project,
'priority': (
issue.PRIORITY_MAP[arbitrary_record['fields']['priority']]
),
'annotations': arbitrary_extra['annotations'],
'tags': [],
issue.URL: arbitrary_url,
issue.FOREIGN_ID: arbitrary_record['key'],
issue.SUMMARY: arbitrary_summary,
issue.DESCRIPTION: None,
}
def get_url(*args):
return arbitrary_url
with mock.patch.object(issue, 'get_url', side_effect=get_url):
actual_output = issue.to_taskwarrior()
self.assertEqual(actual_output, expected_output)
|
This regulation prescribes policies and procedures for investigating the circumstances of disease, injury, or death of a soldier. It provides standards and considerations used in determining line of duty (LD) status.
Line of duty determinations are essential for protecting the interest of both the individual concerned and the U.S. Government where service is interrupted by injury, disease, or death. Soldiers who are on active duty (AD) for a period of more than 30 days will not lose their entitlement to medical and dental care, even if the injury or disease is found to have been incurred not in LD and/or because of the soldier’s intentional misconduct or willful negligence, Section 1074, Title 10, United States Code (10 USC 1074). A person who becomes a casualty because of his or her intentional misconduct or willful negligence can never be said to be injured, diseased, or deceased in LD. Such a person stands to lose substantial benefits as a consequence of his or her actions; therefore, it is critical that the decision to categorize injury, disease, or death as not in LD only be made after following the deliberate, ordered procedures described in this regulation.
An enlisted soldier who is unable to perform duties for more than one day because of his or her intemperate use of drugs or alcohol or because of disease or injury resulting from the soldier’s misconduct is liable after returning to duty to serve for a period that, when added to the period that he or she served before the absence from duty, amounts to the term for which he or she was enlisted or inducted (10 USC 972).
b. Longevity and retirement multiplier.
Eligibility for increases in pay because of longevity and the amount of retirement pay to which a soldier may be entitled depend on the soldier’s cumulative years of creditable service. An enlisted soldier who is unable to perform duties for more than one day because of his or her intemperate use of drugs or alcohol or because of disease or injury resulting from misconduct is not entitled to include such periods in computing creditable service in accordance with the Department of Defense Financial Management Regulation (DODFMR).
Any soldier on AD who is absent from regular duties for a continuous period of more than one day because of disease that is directly caused by and immediately following his or her intemperate use of drugs or alcohol is not entitled to pay for the period of that absence. Pay is not forfeited for absence from duty caused by injuries. Pay is not forfeited for disease not directly caused by and immediately following the intemperate use of drugs and alcohol.
d. Disability retirement and severance pay.
1204, 1206, and 1207). Physical Evaluation Board determinations are made independently and are not controlled by LD determinations. However, entitlement to disability compensation may depend on those facts that have been officially recorded and are on file within the Department of the Army (DA). This includes reports and investigations submitted in accordance with this regulation.
(6) while remaining overnight immediately before serving on funeral honors duty under 10 USC 12503 or 32 USC 115 at or in the vicinity of the place at which the soldier was to so serve, if the place is outside reasonable commuting distance from the soldier’s residence.
f. Benefits administered by the Department of Veterans Affairs (DVA).
In determining whether a veteran or his or her survivors or family members are eligible for certain benefits, the DVA makes its own determinations with respect to LD. These determinations rest upon the evidence available. Usually this consists of those facts that have been officially recorded and are on file within DA, including reports and LD investigations submitted in accordance with the provisions of this regulation. Statutes governing these benefits generally require that disabling injury or death be service connected, which means that the disability was incurred or aggravated in LD (38 USC 101). The statutory criteria for making such determinations are in 38 USC 105.
Line of duty investigations are conducted essentially to arrive at a determination of whether misconduct or negligence was involved in the disease, injury, or death and, if so, to what degree. Depending on the circumstances of the case, an LD investigation may or may not be required to make this determination.
a. An LD investigation is not required in the following cases:
(1) In the case of disease, except as described in paragraphs c(1) and (8) below.
(2) In the case of injuries clearly incurred as a result of enemy action or attack by terrorists.
b. In all other cases of death or injury, except injuries so slight as to be clearly of no lasting significance (for example, superficial lacerations or abrasions or mild heat injuries), an LD investigation must be conducted.
c. A formal LD investigation must be conducted in the following circumstances:
(1) Injury, disease, or death that occurs under strange or doubtful circumstances or is apparently due to misconduct or willful negligence.
(2) Injury or death involving the abuse of alcohol or other drugs.
(3) Self-inflicted injuries or possible suicide.
(4) Injury or death incurred while AWOL.
(5) Injury or death that occurs while an individual was en route to final acceptance in the Army.
(6) Death of a USAR or ARNG soldier while participating in authorized training or duty.
(7) Injury or death of a USAR or ARNG soldier while traveling to or from authorized training or duty.
(8) When a USAR or ARNG soldier serving on an AD tour of 30 days or less is disabled due to disease.
(9) In connection with an appeal of an unfavorable determination of abuse of alcohol or other drugs (para 4–10 a).
(10) When requested or directed for other cases. |
"""
This module retrieves a simple string from a PDA
using the state removal method
"""
from pda import PDAState
class PdaString():
"""Retrieves a string from a PDA"""
def __init__(self):
"""Class Initialization"""
self.statediag = []
self.quickresponse = {}
self.quickresponse_types = {}
pass
def _combine_rest_push(self):
"""Combining Rest and Push States"""
new = []
change = 0
# DEBUG
# logging.debug('Combining Rest and Push')
i = 0
examinetypes = self.quickresponse_types[3]
for state in examinetypes:
if state.type == 3:
for nextstate_id in state.trans.keys():
found = 0
# if nextstate_id != state.id:
if nextstate_id in self.quickresponse:
examines = self.quickresponse[nextstate_id]
for examine in examines:
if examine.id == nextstate_id and examine.type == 1:
temp = PDAState()
temp.type = 1
temp.sym = examine.sym
temp.id = state.id
for nextnextstate_id in examine.trans:
# if nextnextstate_id != examine.id :
for x_char in state.trans[nextstate_id]:
for z_char in examine.trans[
nextnextstate_id]:
if nextnextstate_id not in temp.trans:
temp.trans[
nextnextstate_id] = []
if x_char != 0 and z_char != 0:
temp.trans[
nextnextstate_id].append(x_char + z_char)
# DEBUGprint 'transition is now
# '+x_char +' + '+ z_char
elif x_char != 0 and z_char == 0:
temp.trans[
nextnextstate_id].append(x_char)
# DEBUGprint 'transition is now
# '+x_char
elif x_char == 0 and z_char != 0:
temp.trans[
nextnextstate_id].append(z_char)
# DEBUGprint 'transition is now
# '+z_char
elif x_char == 0 and z_char == 0:
temp.trans[
nextnextstate_id].append(0)
# DEBUGprint 'transition is now
# empty'
else:
pass
found = 1
new.append(temp)
if found == 1:
# print 'Lets combine one with id '+`state.id`+'(rest)
# and one with id '+`nextstate_id`+'(push)'
change = 1
# del(state.trans[nextstate_id])
i = i + 1
if change == 0:
return []
else:
return new
def _combine_push_rest(self):
"""Combining Push and Rest"""
new = []
change = 0
# DEBUG
# logging.debug('Combining Push and Rest')
i = 0
examinetypes = self.quickresponse_types[1]
for state in examinetypes:
if state.type == 1:
for nextstate_id in state.trans.keys():
found = 0
# if nextstate_id != state.id:
if nextstate_id in self.quickresponse:
examines = self.quickresponse[nextstate_id]
for examine in examines:
if examine.id == nextstate_id and examine.type == 3:
temp = PDAState()
temp.type = 1
temp.sym = state.sym
temp.id = state.id
for nextnextstate_id in examine.trans:
# if nextnextstate_id != examine.id :
for x_char in state.trans[nextstate_id]:
for z_char in examine.trans[
nextnextstate_id]:
if nextnextstate_id not in temp.trans:
temp.trans[
nextnextstate_id] = []
if x_char != 0 and z_char != 0:
temp.trans[
nextnextstate_id].append(x_char + z_char)
# DEBUGprint 'transition is now
# '+x_char +' + '+ z_char
elif x_char != 0 and z_char == 0:
temp.trans[
nextnextstate_id].append(x_char)
# DEBUGprint 'transition is now
# '+x_char
elif x_char == 0 and z_char != 0:
temp.trans[
nextnextstate_id].append(z_char)
# DEBUGprint 'transition is now
# '+z_char
elif x_char == 0 and z_char == 0:
temp.trans[
nextnextstate_id].append(0)
# DEBUGprint 'transition is now
# empty'
else:
pass
found = 1
new.append(temp)
if found == 1:
# DEBUGprint 'Lets combine one with id
# '+`state.id`+'(push) and one with id
# '+`nextstate_id`+'(rest)'
change = 1
del state.trans[nextstate_id]
i = i + 1
if change == 0:
return []
else:
return new
def _combine_pop_rest(self):
"""Combining Pop and Rest"""
new = []
change = 0
# DEBUG
# logging.debug('Combining Pop and Rest')
i = 0
examinetypes = self.quickresponse_types[2]
for state in examinetypes:
if state.type == 2:
for nextstate_id in state.trans.keys():
found = 0
# if nextstate_id != state.id:
if nextstate_id in self.quickresponse:
examines = self.quickresponse[nextstate_id]
for examine in examines:
if examine.id == nextstate_id and examine.type == 3:
if state.sym != 0:
temp = PDAState()
temp.type = 2
temp.sym = state.sym
temp.id = state.id
for nextnextstate_id in examine.trans:
# if nextnextstate_id != examine.id:
for x_char in state.trans[nextstate_id]:
for z_char in examine.trans[
nextnextstate_id]:
if nextnextstate_id not in temp.trans:
temp.trans[
nextnextstate_id] = []
if x_char != 0 and z_char != 0:
temp.trans[
nextnextstate_id].append(x_char + z_char)
# DEBUGprint 'transition is
# now '+x_char +' + '+ z_char
elif x_char != 0 and z_char == 0:
temp.trans[
nextnextstate_id].append(x_char)
# DEBUGprint 'transition is
# now '+x_char
elif x_char == 0 and z_char != 0:
temp.trans[
nextnextstate_id].append(z_char)
# DEBUGprint 'transition is
# now '+z_char
elif x_char == 0 and z_char == 0:
temp.trans[
nextnextstate_id].append(0)
# DEBUGprint 'transition is
# now empty'
else:
pass
found = 1
new.append(temp)
else:
for nextnextstate_id in examine.trans:
# if nextnextstate_id != examine.id:
for x_char in state.trans[nextstate_id]:
temp = PDAState()
temp.type = 2
temp.id = state.id
temp.sym = x_char
temp.trans[nextnextstate_id] = []
for z_char in examine.trans[
nextnextstate_id]:
if z_char != 0:
temp.trans[
nextnextstate_id].append(z_char)
# DEBUGprint 'transition is
# now '+z_char
elif z_char == 0:
temp.trans[
nextnextstate_id].append(0)
# DEBUGprint 'transition is
# now empty'
else:
pass
found = 1
new.append(temp)
if found == 1:
# DEBUGprint 'Lets combine one with id
# '+`state.id`+'(push) and one with id
# '+`nextstate_id`+'(rest)'
change = 1
del state.trans[nextstate_id]
i = i + 1
if change == 0:
return []
else:
return new
def _combine_rest_rest(self):
"""Combining Rest and Rest"""
new = []
change = 0
# DEBUG
# logging.debug('Combining Rest and Rest')
i = 0
examinetypes = self.quickresponse_types[3]
for state in examinetypes:
if state.type == 3:
found = 0
for nextstate_id in state.trans.keys():
secondfound = 0
# if nextstate_id != state.id:
if nextstate_id in self.quickresponse:
examines = self.quickresponse[nextstate_id]
for examine in examines:
if examine.id == nextstate_id and examine.type == 3:
temp = PDAState()
temp.type = 3
temp.sym = state.sym
temp.id = state.id
for nextnextstate_id in examine.trans:
if nextnextstate_id != examine.id:
for x_char in state.trans[nextstate_id]:
for z_char in examine.trans[
nextnextstate_id]:
if nextnextstate_id not in temp.trans:
temp.trans[
nextnextstate_id] = []
if x_char != 0 and z_char != 0:
temp.trans[
nextnextstate_id].append(x_char + z_char)
# DEBUGprint 'transition is
# now '+x_char +' + '+ z_char
elif x_char != 0 and z_char == 0:
temp.trans[
nextnextstate_id].append(x_char)
# DEBUGprint 'transition is
# now '+x_char
elif x_char == 0 and z_char != 0:
temp.trans[
nextnextstate_id].append(z_char)
# DEBUGprint 'transition is
# now '+z_char
elif x_char == 0 and z_char == 0:
temp.trans[
nextnextstate_id].append(0)
# DEBUGprint 'transition is
# now empty'
else:
pass
secondfound = 1
if secondfound == 1:
new.append(temp)
found = 1
if found == 1:
# DEBUGprint 'Lets combine one with id
# '+`state.id`+'(rest) and one with id
# '+`nextstate_id`+'(rest)'
change = 1
del state.trans[nextstate_id]
i = i + 1
if change == 0:
return []
else:
return new
def _combine_push_pop(self):
"""Combining Push and Pop"""
new = []
change = 0
# DEBUG
# logging.debug('Combining Push and Pop')
i = 0
examinetypes = self.quickresponse_types[1]
for state in examinetypes:
if state.type == 1:
found = 0
for nextstate_id in state.trans.keys():
# if nextstate_id != state.id:
if nextstate_id in self.quickresponse:
examines = self.quickresponse[nextstate_id]
for examine in examines:
secondfound = 0
if examine.id == nextstate_id and examine.type == 2:
temp = PDAState()
temp.type = 3
temp.sym = 0
temp.id = state.id
if examine.sym == 0:
for nextnextstate_id in examine.trans:
# if nextnextstate_id != examine.id :
for z_char in examine.trans[
nextnextstate_id]:
if state.sym == z_char:
for x_char in state.trans[
nextstate_id]:
# DEBUGprint state.sym+' vs
# '+z_char
if nextnextstate_id not in temp.trans:
temp.trans[
nextnextstate_id] = []
if x_char != 0:
temp.trans[
nextnextstate_id].append(x_char)
# DEBUGprint
# 'transition is now
# '+x_char
else:
temp.trans[
nextnextstate_id].append(0)
# DEBUGprint
# 'transition is now
# empty'
secondfound = 1
elif state.sym == examine.sym:
for nextnextstate_id in examine.trans:
# if nextnextstate_id != examine.id :
for x_char in state.trans[nextstate_id]:
for z_char in examine.trans[
nextnextstate_id]:
if nextnextstate_id not in temp.trans:
temp.trans[
nextnextstate_id] = []
if x_char != 0 and z_char != 0:
temp.trans[
nextnextstate_id].append(x_char + z_char)
# DEBUGprint 'transition is
# now '+x_char +' + '+ z_char
elif x_char != 0 and z_char == 0:
temp.trans[
nextnextstate_id].append(x_char)
# DEBUGprint 'transition is
# now '+x_char
elif x_char == 0 and z_char != 0:
temp.trans[
nextnextstate_id].append(z_char)
# DEBUGprint 'transition is
# now '+z_char
elif x_char == 0 and z_char == 0:
temp.trans[
nextnextstate_id].append(0)
# DEBUGprint 'transition is
# now empty'
else:
pass
secondfound = 1
if secondfound == 1:
new.append(temp)
found = 1
if found == 1:
# DEBUGprint 'Lets combine one with id
# '+`state.id`+'(push) and one with id
# '+`nextstate_id`+'(pop)'
change = 1
# DEBUGprint 'delete '+`nextstate_id`+' from
# '+`state.id`
del state.trans[nextstate_id]
i = i + 1
if change == 0:
return []
else:
return new
def _check(self, accepted):
"""_check for string existence"""
# logging.debug('A check is now happening...')
# for key in self.statediag[1].trans:
# logging.debug('transition to '+`key`+" with "+self.statediag[1].trans[key][0])
total = []
if 1 in self.quickresponse:
total = total + self.quickresponse[1]
if (1, 0) in self.quickresponse:
total = total + self.quickresponse[(1, 0)]
for key in total:
if (key.id == 1 or key.id == (1, 0)) and key.type == 3:
if accepted is None:
if 2 in key.trans:
# print 'Found'
return key.trans[2]
else:
for state in accepted:
if (2, state) in key.trans:
# print 'Found'
return key.trans[(2, state)]
return -1
def _stage(self, accepted, count=0):
"""This is a repeated state in the state removal algorithm"""
new5 = self._combine_rest_push()
new1 = self._combine_push_pop()
new2 = self._combine_push_rest()
new3 = self._combine_pop_rest()
new4 = self._combine_rest_rest()
new = new1 + new2 + new3 + new4 + new5
del new1
del new2
del new3
del new4
del new5
if len(new) == 0:
# self.printer()
# print 'PDA is empty'
# logging.debug('PDA is empty')
return None
self.statediag = self.statediag + new
del new
# print 'cleaning...'
# It is cheaper to create a new array than to use the old one and
# delete a key
newstates = []
for key in self.statediag:
if len(key.trans) == 0 or key.trans == {}:
                # print 'delete '+`key.id`
# self.statediag.remove(key)
pass
else:
newstates.append(key)
del self.statediag
self.statediag = newstates
self.quickresponse = {}
self.quickresponse_types = {}
self.quickresponse_types[0] = []
self.quickresponse_types[1] = []
self.quickresponse_types[2] = []
self.quickresponse_types[3] = []
self.quickresponse_types[4] = []
for state in self.statediag:
if state.id not in self.quickresponse:
self.quickresponse[state.id] = [state]
else:
self.quickresponse[state.id].append(state)
self.quickresponse_types[state.type].append(state)
# else:
# print `key.id`+' (type: '+`key.type`+' and sym:'+`key.sym`+')'
# print key.trans
# print 'checking...'
exists = self._check(accepted)
if exists == -1:
# DEBUGself.printer()
# raw_input('next step?')
return self._stage(accepted, count + 1)
else:
# DEBUGself.printer()
# print 'Found '
print exists
# return self._stage(accepted, count+1)
return exists
def printer(self):
"""Visualizes the current state"""
for key in self.statediag:
if key.trans is not None and len(key.trans) > 0:
print '****** ' + repr(key.id) + '(' + repr(key.type)\
+ ' on sym ' + repr(key.sym) + ') ******'
print key.trans
def init(self, states, accepted):
"""Initialization of the indexing dictionaries"""
self.statediag = []
for key in states:
self.statediag.append(states[key])
self.quickresponse = {}
self.quickresponse_types = {}
self.quickresponse_types[0] = []
self.quickresponse_types[1] = []
self.quickresponse_types[2] = []
self.quickresponse_types[3] = []
self.quickresponse_types[4] = []
for state in self.statediag:
if state.id not in self.quickresponse:
self.quickresponse[state.id] = [state]
else:
self.quickresponse[state.id].append(state)
self.quickresponse_types[state.type].append(state)
# self.printer()
# raw_input('next stepA?')
return self._stage(accepted, 0)
|
By earning your clinical medical assisting certificate and passing the medical assisting certification exam, you'll become qualified to pursue a career as a medical assistant. Medical assistants provide clinical and administrative support to physicians and other health care practitioners. Clinical medical assisting job duties can include collecting lab samples and giving medications.
Medical assistants support physicians, podiatrists and chiropractors by performing both clerical and clinical duties. According to the U.S. Bureau of Labor Statistics (BLS), clinical medical assistants work in private practice offices, general hospitals, outpatient care facilities and surgical centers. Their clinical medical assisting duties vary by individual skill level, location and the needs of the facility in which they work.
On-the-job training: some employers will train assistants rather than hire those with formal training.
|
# Higgins - A multi-media server
# Copyright (c) 2007-2009 Michael Frank <msfrank@syntaxjockey.com>
#
# This program is free software; for license information see
# the COPYING file.
import os, tempfile
from higgins.http import resource, http_headers
from higgins.http.http import Response as HttpResponse
from higgins.core.models import File, Artist, Album, Song, Genre
from higgins.core.postable_resource import PostableResource
from higgins.core.logger import CoreLogger
class UniqueFile:
def __init__(self, filename, mimetype='application/octet-stream'):
self.mimetype = mimetype
self._fd,self.path = tempfile.mkstemp(prefix=filename + '.', dir='.')
def write(self, data):
os.write(self._fd, data)
def close(self):
os.close(self._fd)
del(self._fd)
class CreateCommand(PostableResource, CoreLogger):
def acceptFile(self, headers):
content_disposition = headers.getHeader('content-disposition')
if 'filename' in content_disposition.params:
filename = content_disposition.params['filename']
else:
filename = "file"
content_type = headers.getHeader('content-type')
if isinstance(content_type, http_headers.MimeType):
mimetype = content_type.mediaType + '/' + content_type.mediaSubtype
else:
mimetype = 'application/octet-stream'
file = UniqueFile(filename, mimetype)
self.log_debug("acceptFile: created new unique file %s" % file.path);
return file
def render(self, request):
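        # Two modes: when 'is_local' is set, the request only references a
        # file already on disk ('local_path' plus 'mimetype'); otherwise
        # exactly one uploaded file is expected. Either way a File record is
        # created and then linked to Artist/Genre/Album/Song objects built
        # from the remaining form fields.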
try:
# title is required
title = request.post.get('title', None)
            if title is None:
                return HttpResponse(400, stream="Missing required form item 'title'")
            is_local = request.args.get('is_local', None)
            # process in local mode
            if is_local is not None:
                local_path = request.post.get('local_path', None)
                if local_path is None:
                    return HttpResponse(400, stream="Missing required form item 'local_path'")
                mimetype = request.post.get('mimetype', None)
                if mimetype is None:
                    return HttpResponse(400, stream="Missing required form item 'mimetype'")
# verify that file exists at local_path
try:
s = os.stat(local_path)
except:
return HttpResponse(400, stream="Failed to stat() local file %s" % local_path)
file = File(path=local_path, mimetype=mimetype, size=s.st_size)
file.save()
else:
nfiles = len(request.files)
if nfiles == 0:
return HttpResponse(400, stream="Not local mode and no file specified")
if nfiles > 1:
return HttpResponse(400, stream="More than one file specified")
posted = request.files[0]
try:
s = os.stat(posted.path)
except:
return HttpResponse(400, stream="Failed to stat() local file %s" % local_path)
file = File(path=posted.path, mimetype=posted.mimetype, size=s.st_size)
file.save()
self.log_debug("CreateCommand: created new file %s" % posted.path)
# create or get the artist object
value = request.post.get('artist', None)
if value:
artist,created = Artist.objects.get_or_create(name=value)
else:
artist,created = Artist.objects.get_or_create(name="")
artist.save()
# create or get the genre object
value = request.post.get('genre', None)
if value:
genre,created = Genre.objects.get_or_create(name=value)
else:
genre,created = Genre.objects.get_or_create(name="")
genre.save()
# create or get the album object
value = request.post.get('album', None)
if value:
album,created = Album.objects.get_or_create(name=value, artist=artist, genre=genre)
else:
album,created = Album.objects.get_or_create(name="", artist=artist, genre=genre)
album.save()
# create the song object
song = Song(name=title, album=album, artist=artist, file=file)
value = request.post.get('track', None)
if value:
song.track_number = int(value)
value = request.post.get('length', None)
if value:
song.duration = int(value)
song.save()
self.log_debug("successfully added new song '%s'" % title)
return HttpResponse(200, stream="success!")
except Exception, e:
self.log_debug("CreateCommand failed: %s" % e)
return HttpResponse(500, stream="Internal Server Error")
class UpdateCommand(resource.Resource, CoreLogger):
def render(self, request):
return HttpResponse(404)
class DeleteCommand(resource.Resource, CoreLogger):
def render(self, request):
return HttpResponse(404)
class ManagerResource(resource.Resource, CoreLogger):
def locateChild(self, request, segments):
if segments[0] == "create":
return CreateCommand(), []
if segments[0] == "update":
return UpdateCommand(), []
if segments[0] == "delete":
return DeleteCommand(), []
return None, []
|
Akka implements a specific form called “parental supervision”. Actors can only be created by other actors—where the top-level actor is provided by the library—and each created actor is supervised by its parent. This restriction makes the formation of actor supervision hierarchies explicit and encourages sound design decisions. It should be noted that this also guarantees that actors cannot be orphaned or attached to supervisors from the outside, which might otherwise catch them unawares. In addition, this yields a natural and clean shutdown procedure for (sub-trees of) actor applications.
Lifecycle monitoring is implemented using a Terminated message to be received by the monitoring actor, where the default behavior is to throw a special DeathPactException if not otherwise handled. One important property is that the message will be delivered irrespective of the order in which the monitoring request and target’s termination occur, i.e. you still get the message even if at the time of registration the target is already dead.
The AllForOneStrategy is applicable in cases where the ensemble of children has such tight dependencies among them that a failure of one child affects the function of the others, i.e. they are inextricably linked. Since a restart does not clear out the mailbox, it is often best to terminate the children upon failure and re-create them explicitly from the supervisor (by watching the children’s lifecycle); otherwise you have to make sure that it is no problem for any of the actors to receive a message which was queued before the restart but processed afterwards.
Normally stopping a child (i.e. not in response to a failure) will not automatically terminate the other children in an all-for-one strategy; that can easily be done by watching their lifecycle: if the Terminated message is not handled by the supervisor, it will throw a DeathPactException which (depending on its supervisor) will restart it, and the default preRestart action will terminate all children. Of course this can be handled explicitly as well.
Please note that creating one-off actors from an all-for-one supervisor entails that failures escalated by the temporary actor will affect all the permanent ones. If this is not desired, install an intermediate supervisor; this can very easily be done by declaring a router of size 1 for the worker, see Routing (Scala) or Routing (Java). |
import traceback, threading, sys
class ExitPipeline(Exception):
pass
class Pipeline(object):
def __init__(self, pipeline):
self.lock = threading.Condition(threading.Lock())
self.thread = threading.Thread(target=self._run, args=(pipeline,))
self.item = None
self.error = None
self.active = False
self.exit = False
self.thread.start()
def _run(self, pipeline):
try:
for x in pipeline(self._iter()):
pass
except ExitPipeline:
pass
except:
self.error = traceback.format_exc()
            # there are two ways we may end up here:
if not self.active:
# case 1) initialization of the pipeline crashes
# before we get to 'yield item' in _iter below: We must
# wait for the main thread to call next(), so we can
# release it properly below.
self.lock.acquire()
while self.item == None:
self.lock.wait()
# case 2) crash occurs after 'yield item' moves control to
# the other stages in the pipeline: release lock as if we
# had processed the item as usual.
self.lock.notify()
self.lock.release()
def _iter(self):
while True:
self.lock.acquire()
self.active = True
while self.item == None:
self.lock.wait()
if self.exit:
if self.error:
raise ExitPipeline()
else:
return
item = self.item
self.item = None
yield item
self.lock.notify()
self.lock.release()
def next(self, item):
self.lock.acquire()
self.item = item
self.lock.notify()
self.lock.wait()
self.lock.release()
return self.error
def close(self, error):
self.lock.acquire()
self.item = True
self.exit = True
self.error = error
self.lock.notify()
self.lock.release()
self.thread.join()
return self.error
def run(source, pipelines):
def run(pipes):
for item in source:
for pipe in pipes:
error = pipe.next(item)
if error:
return error
return False
pipes = [Pipeline(p) for p in pipelines]
error = run(pipes)
for pipe in pipes:
error = pipe.close(error)
if error:
sys.stderr.write(error)
return False
return True
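# Hypothetical usage sketch (not part of the original module): a pipeline
# stage is any callable that takes an iterator of items and returns an
# iterable (typically a generator); run() pushes every item of `source`
# through each stage in its own thread. The stage names below (double,
# collect) are made up for illustration only.
if __name__ == '__main__':
    doubled = []
    def double(items):
        for item in items:
            doubled.append(item * 2)
            yield item
    def collect(items):
        for item in items:
            yield item
    ok = run([1, 2, 3], [double, collect])
    # doubled is now [2, 4, 6]; ok is True on success, False if a stage failed
    sys.stdout.write("%r %r\n" % (ok, doubled))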
|
The Lakers also survived despite their third straight game without suspended starters Brandon Ingram and Rajon Rondo. "[One] of the all-time Laker greats".
"It feels great; it's a win, period", said James, who got to rest in the fourth quarter with the Lakers up by as much as 27.
Through four games, McGee is averaging 16.2 points, 7 rebounds and 3.2 blocks in 24 minutes, and he made it clear afterward that he can play more and is not restricted by having asthma. "We were able to get them out of it in the later stages of the first quarter, and then took it into the second quarter and into the half, all the way to the end of the game".
"I just felt like my clutch badge came on", laughed Stephenson, who had 12 points and four assists.
Ingram will miss one more game for his role in the Lakers' brawl with the Houston Rockets last weekend, while Rondo can return Saturday in San Antonio.
Jamal Murray scored 22 points and Monte Morris had 20, but Denver couldn't answer the Lakers' rally led by James, who posted his third career triple-double in Staples Center for his third different team.
"A lot of people always try to discredit what he can do offensively but they never give him enough credit what he does defensively", James continued.
Bryant watched the Lakers erase an eight-point fourth-quarter deficit with a 15-2 blitz in just over three minutes.
The two-time MVP, 30, finished three points short of his career high of 54 and three three-pointers short of breaking his National Basketball Association record of 13 made in a single game as he did not enter for the fourth quarter. The Ukrainian guard scored his first National Basketball Association points one night earlier in Phoenix.
Los Angeles clobbered the Phoenix Suns 131-113. I wasn't going to say anything. |
# These are my implementations of two algorithms that find a maximum subarray
# (a subarray whose sum is maximum).
#
# It is also possible to find the maximum subarray by testing C(n, 2) pairs.
# This is the brute-force approach and it is O(n^2). Don't use this.
#
# The first one finds the maximum subarray using D&C in O(n lg n).
#
# The second one can find the maximum subarray in O(n).
# It is known as Kadane's algorithm.
def find_maximum_crossing_subarray(array, start, end):
"""
Finds the maximum middle-crossing subarray between start and end, inclusive.
Returns a tuple of the form (left index, right index, sum).
"""
mid = (start + end) // 2
cur_l_sum = 0
max_l_sum = None
max_l_index = mid
for i in range(mid, start - 1, -1):
cur_l_sum += array[i]
if max_l_sum is None or cur_l_sum > max_l_sum:
max_l_sum = cur_l_sum
max_l_index = i
cur_r_sum = 0
max_r_sum = 0
max_r_index = mid
    # While the left sum includes the middle element, the right sum does not.
for i in range(mid + 1, end + 1):
cur_r_sum += array[i]
if cur_r_sum > max_r_sum:
max_r_sum = cur_r_sum
max_r_index = i
return (max_l_index, max_r_index, max_l_sum + max_r_sum)
def find_maximum_subarray(array, start, end):
"""
Finds the maximum subarray between start and end (inclusive).
Returns a tuple of the form (left index, right index, sum).
"""
if start == end:
return (start, end, array[start])
else:
mid = (start + end) // 2
l = find_maximum_subarray(array, start, mid)
m = find_maximum_crossing_subarray(array, start, end)
r = find_maximum_subarray(array, mid + 1, end)
if l[2] >= r[2] and l[2] >= m[2]:
return l
elif r[2] >= l[2] and r[2] >= m[2]:
return r
else:
return m
def find_maximum_subarray_by_kadane(array, start, end):
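    """
    Finds the maximum subarray between start and end (inclusive) in O(n).
    Returns a tuple of the form (left index, right index, sum).
    Invariant: max_ending_here holds the best sum of a subarray ending at the
    current index, i.e. max(array[i], max_ending_here + array[i]).
    """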
max_ending_here = array[start]
max_ending_here_l = start
max_so_far = array[start]
max_so_far_l = start
max_so_far_r = start
for i in range(start + 1, end + 1):
if array[i] > max_ending_here + array[i]:
max_ending_here = array[i]
max_ending_here_l = i
else:
max_ending_here += array[i]
if max_ending_here > max_so_far:
max_so_far = max_ending_here
max_so_far_l = max_ending_here_l
max_so_far_r = i
return (max_so_far_l, max_so_far_r, max_so_far)
if __name__ == '__main__':
array_a = [1]
array_b = [-1]
array_c = [10, -20]
array_d = [-15, 20, 40, -10, 15]
print(find_maximum_subarray(array_a, 0, len(array_a) - 1))
print(find_maximum_subarray(array_b, 0, len(array_b) - 1))
print(find_maximum_subarray(array_c, 0, len(array_c) - 1))
print(find_maximum_subarray(array_d, 0, len(array_d) - 1))
print(find_maximum_subarray_by_kadane(array_a, 0, len(array_a) - 1))
print(find_maximum_subarray_by_kadane(array_b, 0, len(array_b) - 1))
print(find_maximum_subarray_by_kadane(array_c, 0, len(array_c) - 1))
print(find_maximum_subarray_by_kadane(array_d, 0, len(array_d) - 1))
|
Lubiprostone is used for the treatment of long-term constipation (chronic idiopathic constipation), opioid-induced constipation (OIC) in adults with chronic non-cancer pain and irritable bowel syndrome (IBS, a common disorder of the large intestine that causes abdominal pain, cramps, bloating, gas, diarrhea and constipation).
Lubiprostone belongs to a class of medications called laxatives. It activates proteins called chloride channels (ClC-2) in the intestine resulting in increased flow of fluids into the bowel and increased motility that facilitates easy passage of stools thereby relieving constipation and associated symptoms. |
import zmq
from getpass import getpass
def jsonify(text_command):
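    # e.g. jsonify("play Purple Rain") -> {"opcode": "play", "args": ["Purple", "Rain"]}
    # (the opcode shown is illustrative; the server defines which opcodes exist)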
tokens = text_command.split()
args = []
if len(tokens) > 1:
args = tokens[1:]
json = { "opcode": tokens[0], "args": args }
return json
def login_prompt():
login_payload = {
"opcode": "login",
"args": [
raw_input("Username: "),
getpass()
]
}
return login_payload
def enter_repl(socket):
print("Jukeboxify CLI - Developed by Steve Parrington")
try:
while True:
text_command = raw_input("> ")
json = jsonify(text_command)
if json['opcode'] == 'exit':
raise KeyboardInterrupt
elif json['opcode'] == 'login':
json = login_prompt()
socket.send_json(json)
response = socket.recv_json()
if "message" in response:
print(response["message"])
else:
print(response)
except KeyboardInterrupt:
print("Exiting Jukeboxify CLI...")
socket.disconnect('tcp://127.0.0.1:7890')
def main():
context = zmq.Context.instance()
socket = context.socket(zmq.REQ)
socket.connect('tcp://127.0.0.1:7890')
enter_repl(socket)
if __name__ == '__main__':
main()
|
If you have any questions about Tingirana they may already be answered below.
Alternatively you can contact us if your question is still unanswered.
Yes. Tingirana Noosa provides complimentary pool and beach towels for use during your stay. We also have beach chairs and beach umbrellas you are welcome to use.
We prefer if you do not use your guest room bathroom towels at the pool or outside the Resort.
Can we arrange a late checkout?
No. We do not offer a late check out service as we run at very high occupancy and there is someone else wanting to use the room from 2.00pm.
Guests who are departing Noosa later than 10.00am are welcome to take advantage of our complimentary baggage store and make use of the pool, car park, guest lounge and bathroom facilities. You are welcome to change and freshen up in our transit lounge prior to your departure.
Yes. We can arrange bus and limousine transfers from the Sunshine Coast and Brisbane airports on your behalf.
Please have your flight number and arrival and/or departure times available when making the booking.
Yes. We have a 20 metre heated lap pool which overlooks the beach. We also have a spa and children’s wading pool.
Do you have a tour desk?
Yes. Our reception staff can help you with ideas for things to do and see during your stay at Tingirana Noosa. Alternatively you can visit our Things to Do page and book your tours before you arrive.
Do you have cots to hire?
How many apartments are at Tingirana?
We have 36 guest apartments available. Studio, One and Two Bedroom Apartments.
Studio Apartments overlook Hastings Street and do not have a balcony.
The One and Two Bedroom Apartments are located from ground to level 3 and overlook the pool and ocean.
How new are the apartments?
Tingirana Noosa first opened in October 2000. We are constantly striving to improve and maintain the standard of quality in the rooms and the resort.
We have upgraded the entire resort in 2017 to our current Hampton style luxury accommodation. The entire resort was rebuilt from reception, to rooms, to guest facilities including the pool.
No. Our rates are room only, however, Season Restaurant is within the resort and offers room service and pool side service for Tingirana Noosa guests.
Season Restaurant is owned and operated separately so you are not able to charge meals to your room account at this stage.
Is there lift access to all levels?
Yes. We have a guest lift that takes you from the car park to our level 3 apartments (stopping at all floors).
Yes. Tingirana Noosa offers complimentary secure parking in the basement car park. There are wheelchair access spaces available. There is a height restriction of 1.9 metres.
What are your reception hours?
If you are arriving before or after reception hours please make sure you have advised us of your expected time of arrival before you leave home. We can then provide you with after hours check in instructions.
What deposit do I need to pay?
We require one night’s accommodation per week of stay to confirm your reservation. This will be charged at time of booking and credited to your stay. We require a valid credit card as a guarantee.
What is the closest airport to Tingirana Noosa?
30 minute drive – Sunshine Coast airport.
90 minute drive – Brisbane Domestic and International airports.
Jetstar, Qantas Airways and Virgin Blue all fly into the Sunshine Coast airport along with International flights from New Zealand.
All other international arrivals are in to the Brisbane airport.
Visit our UPGRADE YOUR STAY page to book your airport transfers.
Deposits are payable on all bookings. A deposit equivalent to one night’s accommodation is required per week (or part thereof) of booking. Your booking will be confirmed once your deposit has been received.
We are unable to offer access to your room earlier than 2.00pm on your day of arrival.
We do understand that you may be wanting to start your holiday prior to 2.00pm so we organise for you to park your car, store your luggage and provide you with access to the transit lounge (refresh room). Plus if you’d like a towel or two you can hang out by the pool or on the beach until your room is ready.
If your question has not been answered, you can submit your own by using the form below.
Alternatively, feel free to contact us and one of our staff will be happy to assist. |
# Copyright (c) 2011 Hesky Fisher
# See LICENSE.txt for details.
"Thread-based monitoring of a stenotype machine using the TX Bolt protocol."
import plover.machine.base
# In the TX Bolt protocol, there are four sets of keys grouped in
# order from left to right. Each byte represents all the keys that
# were pressed in that set. The first two bits indicate which set this
# byte represents. The next bits are set if the corresponding key was
# pressed for the stroke.
# 00XXXXXX 01XXXXXX 10XXXXXX 110XXXXX
# HWPKTS UE*OAR GLBPRF #ZDST
# The protocol uses variable length packets of one, two, three or four
# bytes. Only those bytes for which keys were pressed will be
# transmitted. The bytes arrive in order of the sets so it is clear
# when a new stroke starts. Also, if a key is pressed in an earlier
# set in one stroke and then a key is pressed only in a later set then
# there will be a zero byte to indicate that this is a new stroke. So,
# it is reliable to assume that a stroke ended when a lower set is
# seen. Additionally, if there is no activity then the machine will
# send a zero byte every few seconds.
STENO_KEY_CHART = ("S-", "T-", "K-", "P-", "W-", "H-", # 00
"R-", "A-", "O-", "*", "-E", "-U", # 01
"-F", "-R", "-P", "-B", "-L", "-G", # 10
"-T", "-S", "-D", "-Z", "#") # 11
class TxBolt(plover.machine.base.SerialStenotypeBase):
"""TX Bolt interface.
This class implements the three methods necessary for a standard
stenotype interface: start_capture, stop_capture, and
add_callback.
"""
KEYS_LAYOUT = '''
# # # # # # # # # #
S- T- P- H- * -F -P -L -T -D
S- K- W- R- * -R -B -G -S -Z
A- O- -E -U
'''
def __init__(self, params):
super().__init__(params)
self._reset_stroke_state()
def _reset_stroke_state(self):
self._pressed_keys = []
self._last_key_set = 0
def _finish_stroke(self):
steno_keys = self.keymap.keys_to_actions(self._pressed_keys)
if steno_keys:
self._notify(steno_keys)
self._reset_stroke_state()
def run(self):
"""Overrides base class run method. Do not call directly."""
settings = self.serial_port.getSettingsDict()
settings['timeout'] = 0.1 # seconds
self.serial_port.applySettingsDict(settings)
self._ready()
while not self.finished.isSet():
# Grab data from the serial port, or wait for timeout if none available.
raw = self.serial_port.read(max(1, self.serial_port.inWaiting()))
if not raw:
# Timeout, finish the current stroke.
self._finish_stroke()
continue
for byte in raw:
key_set = byte >> 6
if key_set <= self._last_key_set:
# Starting a new stroke, finish previous one.
self._finish_stroke()
self._last_key_set = key_set
for i in range(5 if key_set == 3 else 6):
if (byte >> i) & 1:
key = STENO_KEY_CHART[(key_set * 6) + i]
self._pressed_keys.append(key)
if key_set == 3:
# Last possible set, the stroke is finished.
self._finish_stroke()
|
A modern and unusual ring. Reminiscent of the soft rounded shapes found on the water surface, this ring represents water. Finished in a high polish to create reflections and illusions.
Available in ring sizes J-U. |
#! /usr/bin/env python
# ----------------------------------------------------------------------
# Numenta Platform for Intelligent Computing (NuPIC)
# Copyright (C) 2013, Numenta, Inc. Unless you have purchased from
# Numenta, Inc. a separate commercial license for this software code, the
# following terms and conditions apply:
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License version 3 as
# published by the Free Software Foundation.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
# See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see http://www.gnu.org/licenses.
#
# http://numenta.org/licenses/
# ----------------------------------------------------------------------
"""Unify the various HotGym CSV files to a single coherent StandardFile
See README.txt for details
"""
import os
import sys
import glob
import operator
import datetime
"""from nupic.providers.WeatherProvider import (
WeatherStation,
getClosestStation)
"""
from nupic.data.file import File
def fixEOL(f):
"""Make sure the end of line character is '\n'
This is needed in order to use fileinput.input() to process the files. The
file format of the raw gym dataset unfortunately contains \r (old Mac format)
EOL characters.
"""
text = open(f).read()
# If there are no carriage returns (\r) just return
if text.find('\r') == -1:
return
# Takes care of Windows format
text = text.replace('\r\n', '\n')
# Takes care of old Mac format
text = text.replace('\r', '\n')
open(f, 'w').write(text)
def _parseTimestamp(t):
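  # Illustrative value: '2/7/2010 4:30:00 PM' -> datetime(2010, 7, 2, 16, 30)
  # (the raw data uses day/month/year order and a 12-hour clock)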
tokens = t.split()
day, month, year = [int(x) for x in tokens[0].split('/')]
if len(tokens) == 1:
hour = 0
minute = 0
else:
assert len(tokens) == 3
hour, minute, seconds = [int(x) for x in tokens[1].split(':')]
hour %= 12
if tokens[2] == 'PM':
hour += 12
result = datetime.datetime(year, month, day, hour, minute)
assert datetime.datetime(2010, 7, 2) <= result < datetime.datetime(2011, 1, 1)
return result
def _parseLine(line):
  # Get rid of the double quotes around each field
line = line.replace('"', '')
# Split the line and get rid of the first field (running count)
fields = line[:-1].split(',')[1:]
gym = fields[0]
record = [gym] # Gym
# Add in an address for each Gym
gymAddresses = {
'Balgowlah Platinum': 'Shop 67 197-215 Condamine Street Balgowlah 2093',
'Lane Cove': '24-28 Lane Cove Plaza Lane Cove 2066',
'Mosman': '555 Military Rd Mosman 2088',
'North Sydney - Walker St': '100 Walker St North Sydney 2060',
'Randwick': 'Royal Randwick Shopping Centre 73 Belmore Rd Randwick 2031'
}
address = gymAddresses[gym]
record.append(address)
# Parse field 2 to a datetime object
record.append(_parseTimestamp(fields[1]))
# Add the consumption
record.append(float(fields[2]))
return record
def makeDataset():
"""
"""
inputFile = 'numenta_air_Con.csv'
fixEOL(inputFile)
fields = [
('gym', 'string', 'S'),
('address', 'string', ''),
('timestamp', 'datetime', 'T'),
('consumption', 'float', '')]
gymName = None
missing = 0
total = 0
  # Create the output file by parsing the customer-given csv
with File('./hotgym2.csv', fields) as o:
with open(inputFile) as f:
# Skip header
f.readline()
# iterate over all the lines in the input file
for line in f.xreadlines():
# Parse the fields in the current line
record = _parseLine(line)
# Write the merged record to the output file
o.write(record)
if record[0] != gymName:
gymName = record[0]
print gymName
return total, missing
if __name__ == '__main__':
makeDataset()
print 'Done.'
|
We’ve made comparing vehicles easier, quicker and as risk-free as it gets.
When you read a vehicle inspection report, you need to know the inspection was carried out thoroughly, accurately and to a nationally-recognised set of standards.
At Manheim, every vehicle we auction has been carefully examined and graded by one of our qualified vehicle inspectors. Not just that, our inspectors are accredited by NAMA, the National Association of Motor Auctions. This means they work to a strict set of independently approved standards, using a clearly-defined 6-tier grading system so you know exactly what condition to expect.
NAMA grading is particularly useful if you’re buying online, because it gives added transparency to the whole issue of vehicle inspections and grading, and this means extra peace of mind. And the fee for all this extra peace of mind that comes from NAMA grading? There isn’t one – it’s all part of the Manheim service.
To see the six NAMA standards of vehicle grading and find out what each one means click the link below.
If you have a query about our vehicle grading and inspections, or you'd just like to know more, contact us and we'll get back to you. |
#!/usr/bin/env python
#coding:utf-8
# Purpose: insert block references with appended attributes
# Created: 11.04.2010
# Copyright (C) 2010, Manfred Moitzi
# License: MIT License
"""
Provides the Insert2 composite-entity.
Insert a new block-reference with auto-creating of attribs from attdefs,
and setting attrib-text by the attribs-dict.
"""
__author__ = "mozman <mozman@gmx.at>"
from dxfwrite.entities import Insert
import dxfwrite.const as const
__all__ = ['Insert2']
class Insert2(object):
"""
Insert a new block-reference with auto-creating of attribs from attdefs,
and setting attrib-text by the attribs-dict.
"""
def __init__(self, blockdef, insert, attribs, rotation=0,
xscale=1., yscale=1., zscale=1.,
layer=const.BYBLOCK, color=const.BYLAYER, linetype=None):
"""
Insert a new block-reference with auto-creating of :ref:`ATTRIB` from
:ref:`ATTDEF`, and setting attrib-text by the attribs-dict.
(multi-insert is not supported)
:param blockdef: the block definition itself
:param insert: insert point (xy- or xyz-tuple), z-axis is 0 by default
:param float xscale: x-scale factor, default=1.
:param float yscale: y-scale factor, default=1.
:param float zscale: z-scale factor, default=1.
:param float rotation: rotation angle in degree, default=0.
        :param dict attribs: dict with tag:value pairs, to fill the attdefs in the
block-definition. example: {'TAG1': 'TextOfTAG1'}, create and insert
an attrib from an attdef (with tag-value == 'TAG1'), and set
text-value of the attrib to value 'TextOfTAG1'.
:param string linetype: linetype name, if not defined = **BYLAYER**
:param string layer: layer name
:param int color: range [1..255], 0 = **BYBLOCK**, 256 = **BYLAYER**
"""
self.blockdef = blockdef
self.insert = insert
self.attribs = attribs
self.xscale = xscale
self.yscale = yscale
self.zscale = zscale
self.rotation = rotation
self.layer = layer
self.color = color
self.linetype = linetype
def _build(self):
def set_tags(insert_entity):
basepoint = self.blockdef['basepoint']['xyz']
for tag, text in self.attribs.items():
try:
attdef = self.blockdef.find_attdef(tag)
attrib = attdef.new_attrib(text=text)
insert_entity.add(attrib, relative=True, block_basepoint=basepoint)
except KeyError: # no attdef <tag> found
pass
insert = Insert(blockname=self.blockdef['name'], insert=self.insert,
rotation=self.rotation,
layer=self.layer, color=self.color,
linetype=self.linetype)
for key, value in [('xscale', self.xscale),
('yscale', self.yscale),
('zscale', self.zscale)]:
if value != 1.:
insert[key] = value
set_tags(insert)
return insert.__dxf__()
def __dxf__(self):
return self._build()
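# Hypothetical usage sketch (the block/attdef construction below is an
# assumption about the surrounding dxfwrite API, not part of this module):
#
#   from dxfwrite import DXFEngine as dxf
#   drawing = dxf.drawing('insert2.dxf')
#   flag = dxf.block('FLAG', basepoint=(0., 0.))
#   flag.add(dxf.attdef(insert=(0., 1.), tag='NAME', height=0.5))
#   drawing.blocks.add(flag)
#   drawing.add(Insert2(blockdef=flag, insert=(5., 5.),
#                       attribs={'NAME': 'flag one'}))
#   drawing.save()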
|
The provisions of this Bill shall not apply to acquisitions under 16 existing legislations including the Special Economic Zones Act, 2005, the Atomic Energy Act, 1962, the Railways Act, 1989, etc.
It is not clear whether Parliament has jurisdiction to impose rehabilitation and resettlement requirements on private purchase of agricultural land.
The requirement of a Social Impact Assessment for every acquisition without a minimum threshold may delay the implementation of certain government programmes.
Projects involving land acquisition and undertaken by private companies or public private partnerships require the consent of 80 per cent of the people affected. However, no such consent is required in case of PSUs.
The market value is based on recent reported transactions. This value is doubled in rural areas to arrive at the compensation amount. This method may not lead to an accurate adjustment for the possible underreporting of prices in land transactions.
The government can temporarily acquire land for a maximum period of three years. There is no provision for rehabilitation and resettlement in such cases. |
'''
Part 1:
Given the following list:
students = [
{'first_name': 'Michael', 'last_name' : 'Jordan'},
{'first_name' : 'John', 'last_name' : 'Rosales'},
{'first_name' : 'Mark', 'last_name' : 'Guillen'},
{'first_name' : 'KB', 'last_name' : 'Tonel'}
]
Create a program that outputs:
Michael Jordan
John Rosales
Mark Guillen
KB Tonel
Part 2:
Given the following dictionary:
users = {
'Students': [
{'first_name': 'Michael', 'last_name' : 'Jordan'},
{'first_name' : 'John', 'last_name' : 'Rosales'},
{'first_name' : 'Mark', 'last_name' : 'Guillen'},
{'first_name' : 'KB', 'last_name' : 'Tonel'}
],
'Instructors': [
{'first_name' : 'Michael', 'last_name' : 'Choi'},
{'first_name' : 'Martin', 'last_name' : 'Puryear'}
]
}
Create a program that prints the following format (including number of characters in each combined name):
Students
1 - MICHAEL JORDAN - 13
2 - JOHN ROSALES - 11
3 - MARK GUILLEN - 11
4 - KB TONEL - 7
Instructors
1 - MICHAEL CHOI - 11
2 - MARTIN PURYEAR - 13
'''
students = [
{'first_name': 'Michael', 'last_name' : 'Jordan'},
{'first_name' : 'John', 'last_name' : 'Rosales'},
{'first_name' : 'Mark', 'last_name' : 'Guillen'},
{'first_name' : 'KB', 'last_name' : 'Tonel'}
]
# print just names for part 1
for student in students:
    print('{} {}'.format(student['first_name'], student['last_name']))
print # blank line
users = {
'Students': [
{'first_name': 'Michael', 'last_name' : 'Jordan'},
{'first_name' : 'John', 'last_name' : 'Rosales'},
{'first_name' : 'Mark', 'last_name' : 'Guillen'},
{'first_name' : 'KB', 'last_name' : 'Tonel'}
],
'Instructors': [
{'first_name' : 'Michael', 'last_name' : 'Choi'},
{'first_name' : 'Martin', 'last_name' : 'Puryear'}
]
}
# print names with associated index and number of chars
for user_item, user_list in users.iteritems():
dict_count = 0
print(user_item)
    for user in user_list:
        dict_count += 1
        print('{} - {} {} - {}'.format(str(dict_count),
            user['first_name'].upper(),
            user['last_name'].upper(),
            str(len(user['first_name']) + len(user['last_name']))))
|
This week we watched Modern Times (Chaplin 1936), read a recent article on it by Lawrence Howe (which contains some useful contextualisation for the film, even though I am not wholly convinced by its argument), had a brief introduction to Marxist ideas about capitalism and the class society it produces, and then spent quite a while discussing some basic essay writing skills.
Capitalism, Marx argued, is defined by the exploitative relationship between the bourgeoisie (or capitalist class), who own and control the means of production (from factories to financial instruments), and the proletariat (or working class), who sell their labour for a wage which is worth less than the value created by their labour. All that extra value they create is used to pay for raw materials, plant, etc; and all that is left over from that – surplus value, in Marx’s term – is taken by the capitalist. Although there might be small individual and partial exceptions, the capitalist will always look to increase production of surplus value – by introducing ‘rationalised’ production processes and increasing automation, by lowering or freezing wages, by extending the working day (including reducing breaks), by offering productivity bonuses, by resisting unionisation of the workforce and health/safety legislation, by casualising the workforce, by not paying the costs of pollution, by relocating to countries with weaker unions/workplace protections/ environmental laws, and by avoiding/evading taxes and manipulating political systems.
Before discussing Modern Times, we took a look at several short sequences from Metropolis (Lang 1927), a film I really wanted to include on the module but which is too long for the screening session (and perhaps in that respect a bit cruel as an introduction to silent cinema – although next week we will be watching Man with a Movie Camera, so I am not sure where the greater cruelty lies).
Lang’s film spatialises class relations in a manner that will become common in dystopian visions, and also in the real world. Here the spatial division is vertical, recalling the literal and figurative descents into poverty in Gaskell’s Mary Barton. The garden in which the city’s wealthy youths play is somewhere high up and pristine. Freder’s father’s office – as controller of the city – is also elevated above all, symbolising his panoptical powers (making him an important figure when we dip our toes into a little de Certeau in a few weeks). Then there is the magnificent metropolis itself, beneath which are the machines which sustain it. And beneath the level even of the machines, as Lang’s opening sequence shows, is the city of the workers.
We also took a look at some of the machinery in the film: the 10-hour shift clock and 24-hour clock over which the shift change is announced (we have already seen Lang’s obsession with clocks in M), the rather abstract machine which overheats and transforms, in Freder’s eyes, into a barbaric ancient idol into whose maw the workers are fed; and the even more abstract clock machine that Freder undertakes to operate so as to free an exhausted worker, only to become a kind of knackered Christ figure himself as he struggles to keep up with its incomprehensible demands for repetitive motion.
Some of this imagery is picked up on directly in Chaplin’s film, which also begins with the image of a clock and workers trudging to the factory like lambs to the slaughter.
As ever, a lot of these terms sort of overlap or seem to be describing the same things from different angles.
Such control systems or disciplinary structures as the factory represents also provide most of the other key locations of the film: asylum, prison, orphanage, department store, restaurant.
Talking about the department store – designed to move customers through the space in such a way as to organise and prolong their experience within the retail environment (think about how IKEA has no windows or clocks and only one route through the warehouse – and, at least according to one of the class, blocks cell phone reception) – also facilitated a way to think about the interconnections of production and consumption.
Then it was time for a break, for the grand unveiling of the essay questions, for reminders to do the library quiz online within 24 hours, and for essay-writing advice.
The latter is especially tough, I find, to do for a whole group, none of whom have yet submitted any work. Makes it hard to know where to begin, what particular strengths and weaknesses each student has. So we did some very basic stuff.
On structure, taking Strunk and White’s advice: ‘Make the paragraph the unit of composition: one paragraph to each topic.’ So a brief introduction to what is going to be discussed, probably somewhere between 5 and 8 paragraphs, each devoted to making, developing and supporting a single idea in a chain of ideas/paragraphs, and a short conclusion tying it all together. For a 1200 word essay, the introduction and conclusion should probably need no more than a sentence or two each. Revise the introduction once the essay is completed so as to ensure it describes what the essay actually does, rather than what you intended to do (the initial introduction can also be used to help think through revisions to early drafts). No new ideas to be introduced in the conclusion – and never end with a quotation (it is supposed to be your conclusion).
Using spell-check (make sure it is set to English UK; remember it won’t catch certain kinds of errors, such as typing ‘form’ when you mean ‘from’). Use grammar-check sparingly, as typically you need to understand grammar in order to make sense of its recommendations. Instead, concentrate on becoming a better writer (obligatory plug for the genuinely excellent kids’ book, The English Repair Kit by Angela Burt and William Vandyck).
We covered rules laying out how to quote, paraphrase and reference (MLA-style).
Finally, we thought about writing in a more formal academic style, but how that did not necessarily mean writing in long sentences. Focus on short, clear sentences, and work in length-variety where necessary – focus on the connection between what you want to say and the best way to say it clearly.
Jenkins, Henry. “Looking at the City in The Matrix Franchise.” Cities in Transition: The Moving Image and the Modern Metropolis. Ed. Andrew Webber and Emma Wilson. London: Wallflower, 2008. 176–192.
Mellen, Joan. Modern Times. London: BFI, 2006.
By imagining future cities, sf often highlights contemporary concerns about the city. See, for example, Yevgeny Zamyatin’s We (1924), Aldous Huxley’s Brave New World (1932), George Orwell’s Nineteen Eighty-four (1949), Frederik Pohl and Cyril Kornbluth’s The Space Merchants (1953), Harry Harrison’s Make Room! Make Room! (1966), John Brunner’s Stand on Zanzibar (1968), Thomas Disch’s 334 (1972), Marge Piercy’s Woman on the Edge of Time (1976), William Gibson’s Neuromancer (1984), Neal Stephenson’s Snow Crash (1992), Colson Whitehead’s The Intuitionist (1999), Tricia Sullivan’s Maul (2003) and Nnedi Okorafor’s Lagoon (2014).
The same is true of many sf films, such as Metropolis (Lang 1927), Things to Come (Menzies 1936), Alphaville (Godard 1965), Clockwork Orange (Kubrick 1971), THX 1138 (Lucas 1971), Soylent Green (Fleischer 1973), Blade Runner (1982), Akira (Ôtomo 1988), Dark City (Proyas 1998), Minority Report (Spielberg 2002), Code 46 (Winterbottom 2003), District 13 (Morel 2004), Children of Men (Cuarón 2006), La Antena (Sapir 2007) and In Time (Niccol 2011).
Modern Times was partly inspired by À Nous la Liberté (Clair 1931).
The Iron Heel is the incomplete memoir of Avis, written in 1932, on the eve of the Second Revolt. It recounts how, in 1912, she and her wealthy father met the revolutionary socialist, and her future husband, Ernest Everhard, and were won to his cause. Within a year, their lives are in disarray as the capitalist interests who dominate institutions and an increasingly tyrannical government seize complete control of America. This plutocratic oligarchy – dubbed ‘the Iron Heel’ by Ernest – forces the socialists underground. The novel ends with horrific descriptions of the destruction of Chicago in the failed First Revolt, and breaks off abruptly, leaving no account of the subsequent fifteen years. The memoir is introduced and edited by Anthony Meredith, writing in the year 409 B.O.M. (Brotherhood Of Man), the socialist era that follows three centuries of the Iron Heel.
In the first issue of Amazing Stories, Hugo Gernsback described the sf story as ‘a charming romance intermingled with scientific fact and prophetic vision’. To his list of exemplars – Jules Verne, H.G. Wells, Edgar Allan Poe – Gernsback could have added Jack London. The Iron Heel reworks the ‘guided tour’ typical of the utopian novel: Avis, her father and Bishop Morehouse enter a new world – that of the immiserated, impoverished working class – and Ernest, their guide, explains in detail its logic and inner workings. The novel also anticipates the hard-sf which emerged from the pulp tradition (and which can still, arguably, be defined in Gernsback’s terms). The science in question is not, however, physics or astronomy but London’s idiosyncratic version of scientific socialism. The near-future events of the novel are predicated on a (vulgar) Marxist analysis of the process of capital accumulation and the cyclical crises it inevitably produces.
London’s extrapolative premises and technique are most obvious in the chapter ‘The Mathematics of a Dream’. Beginning with the ‘ABCs of commerce’ (108) – the production of value by labour and the extraction of surplus-value (profit) by the capitalist – Ernest takes his audience step-by-step through the logic of capital accumulation which leads to periods of overproduction and mass unemployment. His satiric proposal – that destruction of surpluses would be an effective way to deal with cyclical over-production, as in Frederik Pohl’s ‘The Midas Plague’ (1954) – is dismissed as absurd. But in a world in which, for example, agricultural subsidies are paid for deliberate underproduction to stabilise prices and ‘surplus’ crops are routinely destroyed (while people elsewhere starve), The Iron Heel’s prophetic value is difficult to ignore.
This denial of agency to the working class is indicative of the peculiar type of socialism, blended with aspects of Friedrich Nietzsche and with Herbert Spencer’s ‘survival of the fittest’ misunderstanding of evolution, advocated by London (who puts at least one of his own speeches/essays, ‘Revolution’, into Ernest’s mouth).
She conceives of him as a messiah – he is an eagle, a lamb, a lion, ‘the spirit of regnant labour’ (63), Christ – and longs to melt before him, to merge her ‘life completely into his’ (138). When they are on the run, Avis learns to take on a completely different appearance through controlling her body whereas Ernest requires cosmetic surgery to transform him: she is fluid, he is hard.
Avis, then, despite her origins, exemplifies what London’s socialism requires of the working class, ‘the People of the Abyss’. In the Chicago uprising, they are not only depicted as dumb beasts but as an inundation, a surging fluid mass. Without form or identity, they are to be shaped or sacrificed by the revolutionary party.
This system of images – rigidly armoured male bodies; women and the feminised masses as a threatening flood – is typical of the literature produced by the German Freikorps in the 1920s, many of whom later played significant roles in the SA and SS. And so at the heart of this ‘small folk Bible of scientific socialism’ we find a form of fascism.
North America and Asia are beneath the Iron Heel of the Oligarchs for 300 years, but as early as 1912 a wave of socialist revolutions swept the world, inspired and empowered by the general strike which prevented a war between the US and Germany. Perhaps it is such collective action and international solidarity that leads to the Brotherhood of Man.
This expression, which London also used as the title of his 1903 book of reportage on the London poor, is borrowed from HG Wells’s Anticipations of the Reaction of Mechanical and Human Progress upon Human Life and Thought (1901). |
#!/usr/bin/env python
# (C) Copyright IBM Corporation 2004, 2005
# All Rights Reserved.
#
# Permission is hereby granted, free of charge, to any person obtaining a
# copy of this software and associated documentation files (the "Software"),
# to deal in the Software without restriction, including without limitation
# on the rights to use, copy, modify, merge, publish, distribute, sub
# license, and/or sell copies of the Software, and to permit persons to whom
# the Software is furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice (including the next
# paragraph) shall be included in all copies or substantial portions of the
# Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT. IN NO EVENT SHALL
# IBM AND/OR ITS SUPPLIERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
# IN THE SOFTWARE.
#
# Authors:
# Ian Romanick <idr@us.ibm.com>
import libxml2
import re, sys, string
import typeexpr
def parse_GL_API( file_name, factory = None ):
doc = libxml2.readFile( file_name, None, libxml2.XML_PARSE_XINCLUDE + libxml2.XML_PARSE_NOBLANKS + libxml2.XML_PARSE_DTDVALID + libxml2.XML_PARSE_DTDATTR + libxml2.XML_PARSE_DTDLOAD + libxml2.XML_PARSE_NOENT )
ret = doc.xincludeProcess()
if not factory:
factory = gl_item_factory()
api = factory.create_item( "api", None, None )
api.process_element( doc )
# After the XML has been processed, we need to go back and assign
# dispatch offsets to the functions that request that their offsets
# be assigned by the scripts. Typically this means all functions
# that are not part of the ABI.
for func in api.functionIterateByCategory():
if func.assign_offset:
func.offset = api.next_offset;
api.next_offset += 1
doc.freeDoc()
return api
def is_attr_true( element, name ):
"""Read a name value from an element's attributes.
The value read from the attribute list must be either 'true' or
'false'. If the value is 'false', zero will be returned. If the
value is 'true', non-zero will be returned. An exception will be
raised for any other value."""
value = element.nsProp( name, None )
if value == "true":
return 1
elif value == "false":
return 0
else:
raise RuntimeError('Invalid value "%s" for boolean "%s".' % (value, name))
class gl_print_base:
"""Base class of all API pretty-printers.
In the model-view-controller pattern, this is the view. Any derived
    class will want to over-ride the printBody, printRealHeader, and
printRealFooter methods. Some derived classes may want to over-ride
printHeader and printFooter, or even Print (though this is unlikely).
"""
def __init__(self):
# Name of the script that is generating the output file.
# Every derived class should set this to the name of its
# source file.
self.name = "a"
# License on the *generated* source file. This may differ
# from the license on the script that is generating the file.
# Every derived class should set this to some reasonable
# value.
#
# See license.py for an example of a reasonable value.
self.license = "The license for this file is unspecified."
# The header_tag is the name of the C preprocessor define
# used to prevent multiple inclusion. Typically only
# generated C header files need this to be set. Setting it
# causes code to be generated automatically in printHeader
# and printFooter.
self.header_tag = None
# List of file-private defines that must be undefined at the
# end of the file. This can be used in header files to define
# names for use in the file, then undefine them at the end of
# the header file.
self.undef_list = []
return
def Print(self, api):
self.printHeader()
self.printBody(api)
self.printFooter()
return
def printHeader(self):
"""Print the header associated with all files and call the printRealHeader method."""
print '/* DO NOT EDIT - This file generated automatically by %s script */' \
% (self.name)
print ''
print '/*'
print ' * ' + self.license.replace('\n', '\n * ')
print ' */'
print ''
if self.header_tag:
print '#if !defined( %s )' % (self.header_tag)
print '# define %s' % (self.header_tag)
print ''
self.printRealHeader();
return
def printFooter(self):
"""Print the header associated with all files and call the printRealFooter method."""
self.printRealFooter()
if self.undef_list:
print ''
for u in self.undef_list:
print "# undef %s" % (u)
if self.header_tag:
print ''
print '#endif /* !defined( %s ) */' % (self.header_tag)
def printRealHeader(self):
"""Print the "real" header for the created file.
In the base class, this function is empty. All derived
classes should over-ride this function."""
return
def printRealFooter(self):
"""Print the "real" footer for the created file.
In the base class, this function is empty. All derived
classes should over-ride this function."""
return
def printPure(self):
"""Conditionally define `PURE' function attribute.
Conditionally defines a preprocessor macro `PURE' that wraps
GCC's `pure' function attribute. The conditional code can be
        easily adapted to other compilers that support a similar
feature.
The name is also added to the file's undef_list.
"""
self.undef_list.append("PURE")
print """# if defined(__GNUC__) || (defined(__SUNPRO_C) && (__SUNPRO_C >= 0x590))
# define PURE __attribute__((pure))
# else
# define PURE
# endif"""
return
def printFastcall(self):
"""Conditionally define `FASTCALL' function attribute.
Conditionally defines a preprocessor macro `FASTCALL' that
wraps GCC's `fastcall' function attribute. The conditional
        code can be easily adapted to other compilers that support a
similar feature.
The name is also added to the file's undef_list.
"""
self.undef_list.append("FASTCALL")
print """# if defined(__i386__) && defined(__GNUC__) && !defined(__CYGWIN__) && !defined(__MINGW32__)
# define FASTCALL __attribute__((fastcall))
# else
# define FASTCALL
# endif"""
return
def printVisibility(self, S, s):
"""Conditionally define visibility function attribute.
Conditionally defines a preprocessor macro name S that wraps
GCC's visibility function attribute. The visibility used is
        the parameter s. The conditional code can be easily adapted
to other compilers that support a similar feature.
The name is also added to the file's undef_list.
"""
self.undef_list.append(S)
print """# if defined(__GNUC__) || (defined(__SUNPRO_C) && (__SUNPRO_C >= 0x590)) && defined(__ELF__)
# define %s __attribute__((visibility("%s")))
# else
# define %s
# endif""" % (S, s, S)
return
def printNoinline(self):
"""Conditionally define `NOINLINE' function attribute.
Conditionally defines a preprocessor macro `NOINLINE' that
wraps GCC's `noinline' function attribute. The conditional
        code can be easily adapted to other compilers that support a
similar feature.
The name is also added to the file's undef_list.
"""
self.undef_list.append("NOINLINE")
print """# if defined(__GNUC__) || (defined(__SUNPRO_C) && (__SUNPRO_C >= 0x590))
# define NOINLINE __attribute__((noinline))
# else
# define NOINLINE
# endif"""
return
def real_function_name(element):
name = element.nsProp( "name", None )
alias = element.nsProp( "alias", None )
if alias:
return alias
else:
return name
def real_category_name(c):
if re.compile("[1-9][0-9]*[.][0-9]+").match(c):
return "GL_VERSION_" + c.replace(".", "_")
else:
return c
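# Hedged usage notes (illustrative, not from the original file):
#
#     real_category_name("1.5")                  # -> "GL_VERSION_1_5"
#     real_category_name("GL_ARB_multitexture")  # -> "GL_ARB_multitexture" (unchanged)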
def classify_category(name, number):
"""Based on the category name and number, select a numerical class for it.
Categories are divided into four classes numbered 0 through 3. The
classes are:
0. Core GL versions, sorted by version number.
1. ARB extensions, sorted by extension number.
2. Non-ARB extensions, sorted by extension number.
3. Un-numbered extensions, sorted by extension name.
"""
try:
core_version = float(name)
except Exception,e:
core_version = 0.0
if core_version > 0.0:
cat_type = 0
key = name
elif name.startswith("GL_ARB_") or name.startswith("GLX_ARB_") or name.startswith("WGL_ARB_"):
cat_type = 1
key = int(number)
else:
if number != None:
cat_type = 2
key = int(number)
else:
cat_type = 3
key = name
return [cat_type, key]
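# Hedged examples of the classification above (category names are illustrative):
#
#     classify_category("1.0", None)                  # -> [0, "1.0"]  core GL version
#     classify_category("GL_ARB_multitexture", "1")   # -> [1, 1]      ARB extension
#     classify_category("GL_EXT_texture3D", "6")      # -> [2, 6]      non-ARB, numbered
#     classify_category("GL_SGIS_multisample", None)  # -> [3, "GL_SGIS_multisample"]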
def create_parameter_string(parameters, include_names):
"""Create a parameter string from a list of gl_parameters."""
list = []
for p in parameters:
if p.is_padding:
continue
if include_names:
list.append( p.string() )
else:
list.append( p.type_string() )
if len(list) == 0: list = ["void"]
return string.join(list, ", ")
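# Hedged illustration (parameter names and exact spacing depend on the XML
# and on typeexpr): given a list parameter_list of gl_parameters describing
# "GLenum pname" and "const GLfloat * params",
#
#     create_parameter_string(parameter_list, 1)  # -> roughly "GLenum pname, const GLfloat * params"
#     create_parameter_string(parameter_list, 0)  # -> roughly "GLenum, const GLfloat *"
#     create_parameter_string([], 1)              # -> "void"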
class gl_item:
def __init__(self, element, context):
self.context = context
self.name = element.nsProp( "name", None )
self.category = real_category_name( element.parent.nsProp( "name", None ) )
return
class gl_type( gl_item ):
def __init__(self, element, context):
gl_item.__init__(self, element, context)
self.size = int( element.nsProp( "size", None ), 0 )
te = typeexpr.type_expression( None )
tn = typeexpr.type_node()
tn.size = int( element.nsProp( "size", None ), 0 )
tn.integer = not is_attr_true( element, "float" )
tn.unsigned = is_attr_true( element, "unsigned" )
tn.name = "GL" + self.name
te.set_base_type_node( tn )
self.type_expr = te
return
def get_type_expression(self):
return self.type_expr
class gl_enum( gl_item ):
def __init__(self, element, context):
gl_item.__init__(self, element, context)
self.value = int( element.nsProp( "value", None ), 0 )
temp = element.nsProp( "count", None )
if not temp or temp == "?":
self.default_count = -1
else:
try:
c = int(temp)
except Exception,e:
raise RuntimeError('Invalid count value "%s" for enum "%s" when an integer was expected.' % (temp, self.name))
self.default_count = c
return
def priority(self):
"""Calculate a 'priority' for this enum name.
When an enum is looked up by number, there may be many
possible names, but only one is the 'preferred' name. The
priority is used to select which name is the 'best'.
Highest precedence is given to core GL names. ARB extension
names have the next highest, followed by EXT extension names.
Vendor extension names are the lowest.
"""
if self.name.endswith( "_BIT" ):
bias = 1
else:
bias = 0
if self.category.startswith( "GL_VERSION_" ):
priority = 0
elif self.category.startswith( "GL_ARB_" ):
priority = 2
elif self.category.startswith( "GL_EXT_" ):
priority = 4
else:
priority = 6
return priority + bias
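# Hedged illustration of the resulting priorities (lower value wins when
# several names share the same enum value):
#
#     name in a GL_VERSION_* category, no _BIT suffix  -> 0
#     name in a GL_ARB_* category                      -> 2
#     name in a GL_EXT_* category ending in _BIT       -> 5  (4 + 1 bias)
#     vendor extension name                            -> 6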
class gl_parameter:
def __init__(self, element, context):
self.name = element.nsProp( "name", None )
ts = element.nsProp( "type", None )
self.type_expr = typeexpr.type_expression( ts, context )
temp = element.nsProp( "variable_param", None )
if temp:
self.count_parameter_list = temp.split( ' ' )
else:
self.count_parameter_list = []
# The count tag can be either a numeric string or the name of
# a variable. If it is the name of a variable, the int(c)
# statement will throw an exception, and the except block will
# take over.
c = element.nsProp( "count", None )
try:
count = int(c)
self.count = count
self.counter = None
except Exception,e:
count = 1
self.count = 0
self.counter = c
# count_scale defaults to 1 when the attribute is absent
self.count_scale = int(element.nsProp( "count_scale", None ) or 1)
elements = (count * self.count_scale)
if elements == 1:
elements = 0
#if ts == "GLdouble":
# print '/* stack size -> %s = %u (before)*/' % (self.name, self.type_expr.get_stack_size())
# print '/* # elements = %u */' % (elements)
self.type_expr.set_elements( elements )
#if ts == "GLdouble":
# print '/* stack size -> %s = %u (after) */' % (self.name, self.type_expr.get_stack_size())
self.is_client_only = is_attr_true( element, 'client_only' )
self.is_counter = is_attr_true( element, 'counter' )
self.is_output = is_attr_true( element, 'output' )
# Pixel data has special parameters.
self.width = element.nsProp('img_width', None)
self.height = element.nsProp('img_height', None)
self.depth = element.nsProp('img_depth', None)
self.extent = element.nsProp('img_extent', None)
self.img_xoff = element.nsProp('img_xoff', None)
self.img_yoff = element.nsProp('img_yoff', None)
self.img_zoff = element.nsProp('img_zoff', None)
self.img_woff = element.nsProp('img_woff', None)
self.img_format = element.nsProp('img_format', None)
self.img_type = element.nsProp('img_type', None)
self.img_target = element.nsProp('img_target', None)
self.img_pad_dimensions = is_attr_true( element, 'img_pad_dimensions' )
self.img_null_flag = is_attr_true( element, 'img_null_flag' )
self.img_send_null = is_attr_true( element, 'img_send_null' )
self.is_padding = is_attr_true( element, 'padding' )
return
def compatible(self, other):
return 1
def is_array(self):
return self.is_pointer()
def is_pointer(self):
return self.type_expr.is_pointer()
def is_image(self):
if self.width:
return 1
else:
return 0
def is_variable_length(self):
return len(self.count_parameter_list) or self.counter
def is_64_bit(self):
count = self.type_expr.get_element_count()
if count:
if (self.size() / count) == 8:
return 1
else:
if self.size() == 8:
return 1
return 0
def string(self):
return self.type_expr.original_string + " " + self.name
def type_string(self):
return self.type_expr.original_string
def get_base_type_string(self):
return self.type_expr.get_base_name()
def get_dimensions(self):
if not self.width:
return [ 0, "0", "0", "0", "0" ]
dim = 1
w = self.width
h = "1"
d = "1"
e = "1"
if self.height:
dim = 2
h = self.height
if self.depth:
dim = 3
d = self.depth
if self.extent:
dim = 4
e = self.extent
return [ dim, w, h, d, e ]
def get_stack_size(self):
return self.type_expr.get_stack_size()
def size(self):
if self.is_image():
return 0
else:
return self.type_expr.get_element_size()
def get_element_count(self):
c = self.type_expr.get_element_count()
if c == 0:
return 1
return c
def size_string(self, use_parens = 1):
s = self.size()
if self.counter or self.count_parameter_list:
list = [ "compsize" ]
if self.counter and self.count_parameter_list:
list.append( self.counter )
elif self.counter:
list = [ self.counter ]
if s > 1:
list.append( str(s) )
if len(list) > 1 and use_parens :
return "(%s)" % (string.join(list, " * "))
else:
return string.join(list, " * ")
elif self.is_image():
return "compsize"
else:
return str(s)
def format_string(self):
if self.type_expr.original_string == "GLenum":
return "0x%x"
else:
return self.type_expr.format_string()
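# Hedged examples of size_string() (assuming a 4-byte element type such as
# GLfloat; the counter name "n" is illustrative):
#
#     plain scalar parameter                 -> "4"
#     pointer parameter counted by "n"       -> "(n * 4)"
#     parameter with a variable_param list   -> "(compsize * 4)"
#     image parameter                        -> "compsize"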
class gl_function( gl_item ):
def __init__(self, element, context):
self.context = context
self.name = None
self.entry_points = []
self.return_type = "void"
self.parameters = []
self.offset = -1
self.initialized = 0
self.images = []
self.assign_offset = 0
self.static_entry_points = []
# Track the parameter string (for the function prototype)
# for each entry-point. This is done because some functions
# change their prototype slightly when promoted from extension
# to ARB extension to core. glTexImage3DEXT and glTexImage3D
# are good examples of this. Scripts that need to generate
# code for these differing aliases need the real prototype
# for each entry-point. Otherwise, they may generate code
# that won't compile.
self.parameter_strings = {}
self.process_element( element )
return
def process_element(self, element):
name = element.nsProp( "name", None )
alias = element.nsProp( "alias", None )
if is_attr_true(element, "static_dispatch"):
self.static_entry_points.append(name)
self.entry_points.append( name )
if alias:
true_name = alias
else:
true_name = name
# Only try to set the offset when a non-alias
# entry-point is being processed.
offset = element.nsProp( "offset", None )
if offset:
try:
o = int( offset )
self.offset = o
except Exception, e:
self.offset = -1
if offset == "assign":
self.assign_offset = 1
if not self.name:
self.name = true_name
elif self.name != true_name:
raise RuntimeError("Function true name redefined. Was %s, now %s." % (self.name, true_name))
# There are two possible cases. The first time an entry-point
# with data is seen, self.initialized will be 0. On that
# pass, we just fill in the data. The next time an
# entry-point with data is seen, self.initialized will be 1.
# On that pass we have to make sure that the new values match the
# values from the previous entry-point.
parameters = []
return_type = "void"
child = element.children
while child:
if child.type == "element":
if child.name == "return":
return_type = child.nsProp( "type", None )
elif child.name == "param":
param = self.context.factory.create_item( "parameter", child, self.context)
parameters.append( param )
child = child.next
if self.initialized:
if self.return_type != return_type:
raise RuntimeError( "Return type changed in %s. Was %s, now %s." % (name, self.return_type, return_type))
if len(parameters) != len(self.parameters):
raise RuntimeError( "Parameter count mismatch in %s. Was %d, now %d." % (name, len(self.parameters), len(parameters)))
for j in range(0, len(parameters)):
p1 = parameters[j]
p2 = self.parameters[j]
if not p1.compatible( p2 ):
raise RuntimeError( 'Parameter type mismatch in %s. "%s" was "%s", now "%s".' % (name, p2.name, p2.type_expr.original_string, p1.type_expr.original_string))
if true_name == name or not self.initialized:
self.return_type = return_type
self.parameters = parameters
for param in self.parameters:
if param.is_image():
self.images.append( param )
if element.children:
self.initialized = 1
self.parameter_strings[name] = create_parameter_string(parameters, 1)
else:
self.parameter_strings[name] = None
return
def get_images(self):
"""Return potentially empty list of input images."""
return self.images
def parameterIterator(self):
return self.parameters.__iter__()
def get_parameter_string(self, entrypoint = None):
if entrypoint:
s = self.parameter_strings[ entrypoint ]
if s:
return s
return create_parameter_string( self.parameters, 1 )
def get_called_parameter_string(self):
p_string = ""
comma = ""
for p in self.parameterIterator():
p_string = p_string + comma + p.name
comma = ", "
return p_string
def is_abi(self):
return (self.offset >= 0 and not self.assign_offset)
def is_static_entry_point(self, name):
return name in self.static_entry_points
def dispatch_name(self):
if self.name in self.static_entry_points:
return self.name
else:
return "_dispatch_stub_%u" % (self.offset)
def static_name(self, name):
if name in self.static_entry_points:
return name
else:
return "_dispatch_stub_%u" % (self.offset)
class gl_item_factory:
"""Factory to create objects derived from gl_item."""
def create_item(self, item_name, element, context):
if item_name == "function":
return gl_function(element, context)
if item_name == "type":
return gl_type(element, context)
elif item_name == "enum":
return gl_enum(element, context)
elif item_name == "parameter":
return gl_parameter(element, context)
elif item_name == "api":
return gl_api(self)
else:
return None
class gl_api:
def __init__(self, factory):
self.functions_by_name = {}
self.enums_by_name = {}
self.types_by_name = {}
self.category_dict = {}
self.categories = [{}, {}, {}, {}]
self.factory = factory
self.next_offset = 0
typeexpr.create_initial_types()
return
def process_element(self, doc):
element = doc.children
while element.type != "element" or element.name != "OpenGLAPI":
element = element.next
if element:
self.process_OpenGLAPI(element)
return
def process_OpenGLAPI(self, element):
child = element.children
while child:
if child.type == "element":
if child.name == "category":
self.process_category( child )
elif child.name == "OpenGLAPI":
self.process_OpenGLAPI( child )
child = child.next
return
def process_category(self, cat):
cat_name = cat.nsProp( "name", None )
cat_number = cat.nsProp( "number", None )
[cat_type, key] = classify_category(cat_name, cat_number)
self.categories[cat_type][key] = [cat_name, cat_number]
child = cat.children
while child:
if child.type == "element":
if child.name == "function":
func_name = real_function_name( child )
temp_name = child.nsProp( "name", None )
self.category_dict[ temp_name ] = [cat_name, cat_number]
if self.functions_by_name.has_key( func_name ):
func = self.functions_by_name[ func_name ]
func.process_element( child )
else:
func = self.factory.create_item( "function", child, self )
self.functions_by_name[ func_name ] = func
if func.offset >= self.next_offset:
self.next_offset = func.offset + 1
elif child.name == "enum":
enum = self.factory.create_item( "enum", child, self )
self.enums_by_name[ enum.name ] = enum
elif child.name == "type":
t = self.factory.create_item( "type", child, self )
self.types_by_name[ "GL" + t.name ] = t
child = child.next
return
def functionIterateByCategory(self, cat = None):
"""Iterate over functions by category.
If cat is None, all known functions are iterated in category
order. See classify_category for details of the ordering.
Within a category, functions are sorted by name. If cat is
not None, then only functions in that category are iterated.
"""
lists = [{}, {}, {}, {}]
for func in self.functionIterateAll():
[cat_name, cat_number] = self.category_dict[func.name]
if (cat == None) or (cat == cat_name):
[func_cat_type, key] = classify_category(cat_name, cat_number)
if not lists[func_cat_type].has_key(key):
lists[func_cat_type][key] = {}
lists[func_cat_type][key][func.name] = func
functions = []
for func_cat_type in range(0,4):
keys = lists[func_cat_type].keys()
keys.sort()
for key in keys:
names = lists[func_cat_type][key].keys()
names.sort()
for name in names:
functions.append(lists[func_cat_type][key][name])
return functions.__iter__()
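# Hedged sketch of how a generator script might walk the API in category
# order (the variable and function names here are illustrative):
#
#     for func in api.functionIterateByCategory():
#         emit(func)   # core versions first, then ARB, then numbered
#                      # non-ARB, then un-numbered extensions; functions are
#                      # sorted by name within each category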
def functionIterateByOffset(self):
max_offset = -1
for func in self.functions_by_name.itervalues():
if func.offset > max_offset:
max_offset = func.offset
temp = [None for i in range(0, max_offset + 1)]
for func in self.functions_by_name.itervalues():
if func.offset != -1:
temp[ func.offset ] = func
list = []
for i in range(0, max_offset + 1):
if temp[i]:
list.append(temp[i])
return list.__iter__()
def functionIterateAll(self):
return self.functions_by_name.itervalues()
def enumIterateByName(self):
keys = self.enums_by_name.keys()
keys.sort()
list = []
for enum in keys:
list.append( self.enums_by_name[ enum ] )
return list.__iter__()
def categoryIterate(self):
"""Iterate over categories.
Iterate over all known categories in the order specified by
classify_category. Each iterated value is a tuple of the
name and number (which may be None) of the category.
"""
list = []
for cat_type in range(0,4):
keys = self.categories[cat_type].keys()
keys.sort()
for key in keys:
list.append(self.categories[cat_type][key])
return list.__iter__()
def get_category_for_name( self, name ):
if self.category_dict.has_key(name):
return self.category_dict[name]
else:
return ["<unknown category>", None]
def typeIterate(self):
return self.types_by_name.itervalues()
def find_type( self, type_name ):
if type_name in self.types_by_name:
return self.types_by_name[ type_name ].type_expr
else:
print "Unable to find base type matching \"%s\"." % (type_name)
return None
Meanwhile, the fear of radiation caused Germany and others to radically curtail their nuclear power generation. Thus we have people with “green” sensibilities moving away from the one reliable power source that does not emit carbon.
It is two years since Japan’s 9.0-magnitude earthquake, one so powerful it shifted the position of the Earth’s figure axis by as much as 6 inches and moved Honshu, Japan’s main island, 8 feet eastward. The tsunami generated by the earthquake obliterated towns, drowned almost 20,000 people and left more than 300,000 homeless. Everyone living within 15 miles of Fukushima was evacuated; many are still in temporary housing. Some will never be able to return home.
More than 300,000 buildings were destroyed and another million damaged, including four reactors at the Fukushima Daiichi nuclear power plant on the northeast coast. The earthquake caused the immediate shutdown of this and three other nuclear-power facilities.
Since the earthquake, a powerful movement has gained momentum to halt Japan’s use of nuclear energy, which provided 30 percent of the country’s electricity. The last of 54 nuclear reactors was shut down in May 2012. Two facilities were restarted in June 2012; 52 remain shut. Japan has therefore had to increase its imports of natural gas, low-sulfur crude oil and fuel oil at a substantial economic and environmental cost. Seventy-five percent of the country’s electricity now comes from fossil fuels.
Accustomed to large trade surpluses, Japan had a record $78 billion trade deficit in 2012, thanks to increased energy imports and a drop in exports as Japanese goods became more expensive to produce.
And what of the lasting threat from radiation? Remarkably, outside the immediate area of Fukushima, this is hardly a problem at all. Although the crippled nuclear reactors themselves still pose a danger, no one, including personnel who worked in the buildings, died from radiation exposure. Most experts agree that future health risks from the released radiation, notably radioactive iodine-131 and cesium-134 and cesium-137, are extremely small and likely to be undetectable.
Even considering the upper boundary of estimated effects, there is unlikely to be any detectable increase in cancers in Japan, Asia or the world except close to the facility, according to a World Health Organization report. There will almost certainly be no increase in birth defects or genetic abnormalities from radiation. |