"""
Utility for model parameter
"""
import os
try:
from cPickle import load
except ImportError:
from pickle import load
class Params(object):
pass
def load_dcnn_model_params(path, param_str = None):
"""
>>> p = load_dcnn_model_params("models/filter_widths=8,6,,batch_size=10,,ks=20,8,,fold=1,1,,conv_layer_n=2,,ebd_dm=48,,l2_regs=1e-06,1e-06,1e-06,0.0001,,dr=0.5,0.5,,nkerns=7,12.pkl")
>>> p.ks
(20, 8)
>>> len(p.W)
2
>>> type(p.logreg_W)
<type 'numpy.ndarray'>
"""
if param_str is None:
param_str = os.path.basename(path).split('.')[0]
p = parse_param_string(param_str)
stuff = load(open(path, "r"))
for name, value in stuff:
if not hasattr(p, name):
setattr(p, name, value)
else:
# if appear multiple times,
# make it a list
setattr(p, name, [getattr(p, name), value])
return p
def parse_param_string(s, desired_fields = {"ks", "fold", "conv_layer_n"}):
"""
>>> p = parse_param_string("twitter4,,filter_widths=8,6,,batch_size=10,,ks=20,8,,fold=1,1,,conv_layer_n=2,,ebd_dm=48,,l2_regs=1e-06,1e-06,1e-06,0.0001,,dr=0.5,0.5,,nkerns=7,12")
>>> p.ks
(20, 8)
>>> p.fold
(1, 1)
>>> p.conv_layer_n
2
"""
p = Params()
segs = s.split(',,')
for s in segs:
if "=" in s:
key, value = s.split('=')
if key in desired_fields:
if not ',' in value:
setattr(p, key, int(value))
else:
setattr(p, key, tuple(map(int, value.split(','))))
return p
Not all Mi-branded products are manufactured by Xiaomi itself. In most cases they are produced by third-party companies that belong to the Xiaomi ecosystem: the popular wearables made by Huami are one example, and the Mi Laser Projector, manufactured by Fengmi (Beijing) Technology Co., Ltd., is another. Fengmi recently launched a laser projector under its own brand name, the Wemax One. The product is now listed for sale on Xiaomi's Youpin platform in China at 12,999 yuan ($1,870), a price that includes the Wemax S1 subwoofer.
The Wemax One is available in several packages at different prices. The basic version with the Wemax S1 subwoofer is priced at 12,999 yuan ($1,870). An edition that adds an anti-reflection projection screen costs 16,999 yuan ($2,446) in total. Finally, the full package, which includes the projector, the Wemax S1 subwoofer, and the screen, is priced at 17,999 yuan ($2,590).
In design and features the Wemax One closely resembles the Mi Laser Projector, but its light source is rated at up to 7,000 lumens instead of 5,000, approaching cinema-screen brightness; its real-world output of 1,688 ANSI lumens is still admirable. The projector uses ALPD 3.0 (Advanced Laser Phosphor Display) technology with a rated light-source life of 25,000 hours. At 2 hours of use per day that works out to roughly 34 years, and even 4 hours a day still yields about 17 years, which effectively makes the laser a lifetime component. The light-source technology was developed by Guangfeng (Appotronics), while the imaging chip comes from Texas Instruments, combined with Guangfeng's custom-developed optoelectronics.
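As a quick sanity check on the lifespan arithmetic above (a minimal sketch; the only input is the 25,000-hour rating quoted in the article):

```python
# Rated light-source life and the two daily-usage scenarios from the article
rated_hours = 25_000

for hours_per_day in (2, 4):
    years = rated_hours / hours_per_day / 365
    print(f"{hours_per_day} h/day -> about {years:.1f} years")
# 2 h/day -> about 34.2 years
# 4 h/day -> about 17.1 years
```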
The Wemax One can project an image of up to 150 inches at a resolution of 1920 x 1080 pixels. It offers a wide color gamut and a contrast ratio of 3000:1, and, unlike a direct light source, the reflected image is gentler on the eyes. Sound quality is high as well, and the Wemax S1 subwoofer could hardly be better, so the Wemax One can easily bring theater-grade picture and audio into the living room.
The Wemax One laser projector is currently available from some stores outside China, but the Wemax S1 subwoofer has not been listed internationally yet.
# MIT License
#
# Copyright (c) 2017 Alex Ignatov
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in all
# copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
# SOFTWARE.
# -*- coding: utf-8 -*-
import ctypes
import json
import multiprocessing
import os
import os.path
import re
import shutil
import tempfile
import time
import random
from collections import OrderedDict
from ctypes import *
from ctypes.wintypes import RECT, DWORD, LONG, WORD, LPVOID, LPCWSTR, HWND, POINT, UINT, INT
from multiprocessing import Process
from operator import itemgetter
import cv2
import itertools
import numpy as np
DEBUG = False
STATS_DIR = 'stats'
PATTERNS_DIR = 'patterns'
FARMING_INTERVAL = 30
PROGRESSION_INTERVAL = 30
FISHING_INTERVAL = 30
ASCENSION_INTERVAL = 10800
TOP_HEROES_UPGRADE_INTERVAL = 120
ALL_HEROES_UPGRADE_INTERVAL = 10
ALL_HEROES_UPGRADE_MAX_TIMER = 3600
MAX_NUMBER_OF_VISIBLE_HEROES = 5
WM_MOUSEWHEEL = 0x020A
WHEEL_DOWN = -1
WHEEL_UP = 1
WHEEL_DELTA = 120
WM_LBUTTONDOWN = 0x0201
WM_LBUTTONUP = 0x0202
WM_MOUSEMOVE = 0x0200
WM_MOUSELEAVE = 0x02A3
WM_MOUSEHOVER = 0x02A1
HTCLIENT = 1
WM_SETCURSOR = 0x0020
WM_CHAR = 0x0102
WM_KEYDOWN = 0x0100
WM_KEYUP = 0x0101
HWND_TOP = 0x0
SWP_NOMOVE = 0x0002
VK_CONTROL = 0x11
VK_SHIFT = 0x10
BI_RGB = 0x0000
DIB_RGB_COLORS = 0x00
SRCCOPY = 0xCC0020
SW_SHOWMINIMIZED = 0x2
SW_SHOWNORMAL = 0x1
SW_SHOWMAXIMIZED = 0x3
SW_RESTORE = 0x9
SW_MINIMIZE = 6
SW_SHOWMINNOACTIVE = 7
SW_SHOWNOACTIVATE = 4
SW_HIDE = 0x0
GWL_EXSTYLE = -20
GWL_STYLE = -16
WS_EX_LAYERED = 0x00080000
LWA_ALPHA = 0x00000002
SPI_GETANIMATION = 0x0048
SPI_SETANIMATION = 0x0049
SPIF_SENDCHANGE = 2
WS_MINIMIZEBOX = 0x00020000
WS_MAXIMIZEBOX = 0x00010000
WS_VSCROLL = 0x00200000
WS_HSCROLL = 0x00100000
WS_SIZEBOX = 0x00040000
WS_CAPTION = 0x00C00000
WS_SYSMENU = 0x00080000
SendMessage = ctypes.windll.user32.SendMessageW
FindWindow = ctypes.windll.user32.FindWindowW
FindWindow.argtypes = [LPCWSTR, LPCWSTR]
FindWindow.restype = HWND
SetForegroundWindow = ctypes.windll.user32.SetForegroundWindow
SetWindowPos = ctypes.windll.user32.SetWindowPos
GetWindowRect = ctypes.windll.user32.GetWindowRect
AdjustWindowRect = ctypes.windll.user32.AdjustWindowRect
GetwDesktopWindow = ctypes.windll.user32.GetDesktopWindow
ClientToScreen = ctypes.windll.user32.ClientToScreen
GetClientRect = ctypes.windll.user32.GetClientRect
GetWindowDC = ctypes.windll.user32.GetWindowDC
GetDC = ctypes.windll.user32.GetDC
GetDIBits = ctypes.windll.gdi32.GetDIBits
GetObject = ctypes.windll.gdi32.GetObjectW
CreateBitmap = ctypes.windll.Gdi32.CreateBitmap
CreateCompatibleDC = ctypes.windll.Gdi32.CreateCompatibleDC
CreateCompatibleBitmap = ctypes.windll.Gdi32.CreateCompatibleBitmap
EnumWindows = ctypes.windll.user32.EnumWindows
BitBlt = ctypes.windll.Gdi32.BitBlt
SelectObject = ctypes.windll.Gdi32.SelectObject
GetWindowPlacement = ctypes.windll.user32.GetWindowPlacement
ShowWindow = ctypes.windll.user32.ShowWindow
PrintWindow = ctypes.windll.user32.PrintWindow
GetWindowLong = ctypes.windll.user32.GetWindowLongW
SetWindowLong = ctypes.windll.user32.SetWindowLongW
SetLayeredWindowAttributes = ctypes.windll.user32.SetLayeredWindowAttributes
SystemParametersInfo = ctypes.windll.user32.SystemParametersInfoW
IsIconic = ctypes.windll.user32.IsIconic
ReleaseDC = ctypes.windll.user32.ReleaseDC
DeleteObject = ctypes.windll.gdi32.DeleteObject
DeleteDC = ctypes.windll.Gdi32.DeleteDC
def charToKeyCode(char):
if char in ('1', '2', '3', '4', '5', '6', '7', '8', '9', '0'):
return 0x30 + (ord(char) - ord('0'))
if char == 'ctrl':
return VK_CONTROL
if char == 'shift':
return VK_SHIFT
if 'a' <= char <= 'z':
return 0x41 + (ord(char) - ord('a'))
return None
class BITMAPINFOHEADER(Structure):
_fields_ = [("biSize", DWORD),
("biWidth", LONG),
("biHeight", LONG),
("biPlanes", WORD),
("biBitCount", WORD),
("biCompression", DWORD),
("biSizeImage", DWORD),
("biXPelsPerMeter", LONG),
("biYPelsPerMeter", LONG),
("biClrUsed", DWORD),
("biClrImportant", DWORD)]
class BITMAP(Structure):
_fields_ = [("bmType", LONG),
("bmWidth", LONG),
("bmHeight", LONG),
("bmWidthBytes", LONG),
("bmPlanes", WORD),
("bmBitsPixel", WORD),
("bmBits", LPVOID)]
class WINDOWPLACEMENT(Structure):
_fields_ = [("length", UINT),
("flags", UINT),
("showCmd", UINT),
("ptMinPosition", POINT),
("ptMaxPosition", POINT),
("rcNormalPosition", RECT)]
class ANIMATIONINFO(Structure):
_fields_ = [("cbSize", UINT),
("iMinAnimate", INT)]
def find_single_grey_old(image, pattern, method=cv2.TM_CCOEFF_NORMED, threshold=0.8):
height_pattern, width_pattern = pattern.getSize()
height_image, width_image = image.getSize()
if height_pattern > height_image or width_pattern > width_image:
if DEBUG:
            print('find_single_grey_old: pattern size is greater than image size')
return None
pattern_array = pattern.get_grey_array()
image_array = image.get_grey_array()
try:
res = cv2.matchTemplate(image_array, pattern_array, method)
except cv2.error as e:
print('find_single_grey: catch cv2 exception!!! %s ' % str(e))
# cv2.imshow('image', image)
# cv2.imshow('pimage', pimage)
# cv2.waitKey()
return None
min_val, max_val, min_loc, max_loc = cv2.minMaxLoc(res)
if method in [cv2.TM_SQDIFF, cv2.TM_SQDIFF_NORMED] and min_val <= 1 - threshold:
top_left = min_loc
elif max_val >= threshold:
top_left = max_loc
else:
# if image.name == '123456':
# cv2.imshow('image', image_array)
# cv2.imshow('pimage', pattern_array)
# cv2.waitKey(50)
return None
# cv2.rectangle(image.get_array(), top_left,
# (top_left[0] + width, top_left[1] + height),
# (0, 0, 255),
# 1)
return [Region(top_left[0], top_left[1], width_pattern, height_pattern)]
def find_lvlup(image, pattern, all=False):
t_min = 128
t_max = 255
reg = find_single_grey(image, pattern)
if not reg:
return None
reg = reg[0]
pat_max = pattern.get_threshold(t_min, t_max).get_array().max()
tcc = image.crop_copy(reg).get_threshold(t_min, t_max).get_array().max()
if tcc != pat_max:
return None
return [reg]
def find_progress_button(image, pattern, all=False):
    return find_single_grey(image, pattern, threshold=0.9, all=all)
def find_single_grey_90(image, pattern, all=False):
return find_single_grey(image, pattern, threshold=0.9, all=all)
def find_single_grey_95(image, pattern, all=False):
return find_single_grey(image, pattern, threshold=0.95, all=all)
def find_single_grey_97(image, pattern, all=False):
return find_single_grey(image, pattern, threshold=0.97, all=all)
def find_level(image, pattern, all=False):
image = image.get_threshold(235, 255)
pattern = pattern.get_threshold(235, 255)
return find_single_grey(image, pattern, threshold=0.96, all=all)
def find_checked_skills(image, pattern, all=False, parts=4):
# image = image.get_threshold(128, 255)
# pattern = pattern.get_threshold(128, 255)
topLeft = None
if parts == 1:
return find_single_grey(image, pattern)
cv2.imshow("find_checked_skills:image", image.get_array())
for sect in np.array_split(pattern.get_array(), parts, axis=1):
sect_img = Image.fromArray(sect)
sect_reg = find_single_grey(image, sect_img)
cv2.imshow("find_checked_skills:pattern", sect)
cv2.waitKey(50)
if not sect_reg:
return None
sect_reg = sect_reg[0]
if topLeft is None and sect_reg:
topLeft = sect_reg.getTopLeft()
bottomRight = sect_reg.getBottomRight()
return [Region.from2Location(topLeft, bottomRight)]
# return find_single_grey(image, pattern, threshold=0.90)
def find_single_grey(image, pattern, method=cv2.TM_CCOEFF_NORMED, threshold=0.8, all=False):
pattern_height, pattern_width = pattern.getSize()
height_image, width_image = image.getSize()
if pattern_height > height_image or pattern_width > width_image:
if DEBUG:
            print('find_single_grey: pattern size is greater than image size')
return None
pattern_grey = pattern.get_grey_array()
image_grey = image.get_grey_array()
if all:
image_grey = image_grey.copy()
regions = []
while 1:
try:
res = cv2.matchTemplate(image_grey, pattern_grey, method)
except cv2.error as e:
print('find_single_grey: catch cv2 exception!!! %s ' % str(e))
# cv2.imshow('image', image)
# cv2.imshow('pimage', pimage)
# cv2.waitKey()
return None
min_val, max_val, min_loc, max_loc = cv2.minMaxLoc(res)
if method in [cv2.TM_SQDIFF, cv2.TM_SQDIFF_NORMED] and min_val <= 1 - threshold:
top_left = min_loc
elif max_val >= threshold:
top_left = max_loc
else:
# if image.name == '123456':
# cv2.imshow('image', image_array)
# cv2.imshow('pimage', pattern_array)
# cv2.waitKey(50)
break
regions.append(Region(top_left[0], top_left[1], pattern_width, pattern_height))
if not all:
break
        cv2.rectangle(image_grey, top_left,
                      (top_left[0] + pattern_width, top_left[1] + pattern_height),
                      (0, 0, 0),
                      cv2.FILLED)
# return [Region(top_left[0], top_left[1], pattern_width, pattern_height)]
return regions
def find_all_grey_old(image, pattern, method=cv2.TM_CCOEFF_NORMED, threshold=0.8):
height, width = pattern.getSize()
# pimage = pattern.get_canny_array()
# image = image.get_canny_array()
pimage = pattern.get_grey_array()
image = image.get_grey_array()
res = cv2.matchTemplate(image, pimage, method)
# res = ((res - res.min()) / (res.max() - res.min()))
cv2.imshow('find_all_grey', res)
cv2.waitKey(500)
min_val, max_val, min_loc, max_loc = cv2.minMaxLoc(res)
if method in [cv2.TM_SQDIFF, cv2.TM_SQDIFF_NORMED]:
loc = np.where(res <= 1 - threshold)
else:
loc = np.where(res >= threshold)
regArr = []
for pt in zip(*loc[::-1]):
val = res.item(pt[1], pt[0])
print("find_all_grey: val %s, location %s " % (val, pt))
reg = Region(pt[0], pt[1], width, height)
regArr.append((reg, val))
regRet = []
valRet = []
while regArr:
# Get first region from regArr
cr = regArr[0][0]
# Create array of regions which intersect with cr
intersect_reg = [r for r in regArr if cr.is_intersect(r[0])]
# Sort it by res value in descending order
intersect_reg.sort(key=itemgetter(1), reverse=True)
# Append to returned array region with highest res value
reg = intersect_reg[0][0]
val = intersect_reg[0][1]
regRet.append(reg)
valRet.append(val)
# cv2.rectangle(image.img, (reg.x, reg.y),
# (reg.getRight(), reg.getBottom()),
# (0, 0, 255),
# 1)
        # Keep only the regions in regArr that do not intersect with cr
regArr = [r for r in regArr if r not in intersect_reg]
return regRet
def find_pattern_hist(image, pattern, method=cv2.TM_CCOEFF_NORMED, threshold=0.8, corr_coeff=0.9, all=False):
pattern_grey = pattern.get_grey()
image_grey = image.get_grey()
reg_list = find_single_grey(image_grey, pattern_grey, method=method, threshold=threshold, all=all)
print('find_pattern_hist: reg_list %s' % (reg_list))
pattern_region = None
if reg_list:
img1 = None
img2 = None
corr_prev = None
for reg in reg_list:
img1 = pattern
img2 = image.crop(reg)
# hist_img1 = cv2.calcHist([img1.get_grey_array()], [0], None, [32], [0, 256])
# hist_img2 = cv2.calcHist([img2.get_grey_array()], [0], None, [32], [0, 256])
# corr_color = []
# corr_grey=[]
# for i in range(1):
hist_img1 = cv2.calcHist([img1.get_hsv_array()], [0, 1], None, [180, 256], [0, 180, 0, 256])
# # hist_img1 = cv2.calcHist([img1.get_array()], [i], None, [256], [0, 256])
# # hist_img2 = cv2.calcHist([img2.get_array()], [i], None, [256], [0, 256])
# corr_color.append(cv2.compareHist(hist_img1, hist_img2, cv2.HISTCMP_CORREL))
#
hist_img2 = cv2.calcHist([img2.get_hsv_array()], [0, 1], None, [180, 256], [0, 180, 0, 256])
# # hist_img1 = cv2.calcHist([img1.get_array()], [0], None, [8], [0, 256])
# # hist_img2 = cv2.calcHist([img2.get_array()], [0], None, [8], [0, 256])
# # hist_img1 = cv2.calcHist([(cv2.medianBlur(img1.get_grey_array(), 3))], [0], None, [256],
# # [0, 256])
# # hist_img2 = cv2.calcHist([(cv2.medianBlur(img2.get_grey_array(), 3))], [0], None, [256],
# # [0, 256])
# # hist_img1 = cv2.calcHist([img1.get_grey_array()], [0], None, [256],
# # [0, 256])
# # hist_img2 = cv2.calcHist([img2.get_grey_array()], [0], None, [256],
# # [0, 256])
corr_grey = cv2.compareHist(hist_img1, hist_img2, cv2.HISTCMP_CORREL)
# print('find_pattern_hist: %s to %s corr_color is B %s G %s R %s corr_grey =%s' % (
print('find_pattern_hist: %s to %s corr_grey =%s' % (
# img1.get_name(), img2.get_name(), corr_color[0], corr_color[1], corr_color[2], corr_grey))
img1.get_name(), img2.get_name(), corr_grey))
print('find_pattern_hist: img1.getSize() %s to img2.getSize() %s' % (img1.getSize(), img2.getSize()))
# if pattern.get_name()=='.\\patterns\\main\\fish_1.png':
#
# x_size = 300
#
# for img in (img1, img2):
# y_size = int(x_size / img.get_array().shape[1] * img.get_array().shape[0])
# cv2.namedWindow(img.get_name(), cv2.WINDOW_NORMAL)
# cv2.resizeWindow(img.get_name(), x_size, y_size)
# cv2.imshow(img.get_name(),cv2.medianBlur(img.get_array(),3))
# cv2.waitKey(500)
# pass
# if min(corr_color) >= corr_coeff or corr_grey >= corr_coeff:
# if corr_grey >= corr_coeff:
if corr_grey >= corr_coeff:
pattern_region = reg_list
# corr_prev = corr_color
# # print('find_pattern_hist: %s to %s corr is %s' % (img1.get_name(), img2.get_name(), corr_prev))
# # If pattern_region is not already create do it
# if not pattern_region:
# pattern_region = []
#
# if not pattern_region or not pattern_region[-1].is_intersect(reg):
# pattern_region.append(reg)
# # cv2.rectangle(self.img, (reg.getLeft(), reg.getTop()),
# # (reg.getRight(), reg.getBottom()),
# # (0, 0, 255),
# # 1)
# # if corr_prev and corr_prev <= corr:
# # reg_ret = reg
print('find_pattern_hist: %s to %s is %s. pattern_region ' % (img1.get_name(), img2.get_name(), corr_prev))
return pattern_region
def find_all_grey_multi(image, pattern, method=cv2.TM_CCOEFF_NORMED, threshold=0.8):
height, width = pattern.getSize()
# pimage = cv2.medianBlur(pattern.get_canny_array(), 3)
# image = cv2.medianBlur(image.get_canny_array(),3)
# pimage = cv2.blur(pattern.get_array(), (3, 3))
# image = cv2.blur(image.get_array(), (3, 3))
# pimage=cv2.cvtColor(pimage, cv2.COLOR_BGR2GRAY)
# image = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
# pimage = cv2.Canny(pimage, 100, 200)
# image = cv2.Canny(image, 100, 200)
# pimage = pattern.get_grey_array()
# image = image.get_grey_array()
pimage = pattern.get_canny_array()
image = image.get_canny_array()
set_list = []
for method in [cv2.TM_SQDIFF, cv2.TM_CCOEFF]:
res = cv2.matchTemplate(image, pimage, method)
# cv2.normalize(res,res,0,1,norm_type=cv2.NORM_MINMAX)
        # Normalize res values to the 0..1 range
res = ((res - res.min()) / (res.max() - res.min()))
# if pattern.get_name() == '.\\patterns\\main\\heroes_menu_active.PNG':
# cv2.imshow(pattern.get_name(), pattern.get_array())
# cv2.imshow(image.get_name(), image.get_array())
# cv2.waitKey()
# cv2.imshow('res', res)
# cv2.imshow('template', pimage)
# cv2.waitKey(300)
if method in [cv2.TM_SQDIFF, cv2.TM_SQDIFF_NORMED]:
# Find the most minimal value in res
sort = np.sort(res, axis=None)[0:10]
sort = np.where(sort < 1 - threshold)
else:
# Find the maximum values in res
sort = np.sort(res, axis=None)[:-10:-1]
        # Find the indices of the sorted res values
ix = np.in1d(res.ravel(), sort).reshape(res.shape)
loc = np.where(ix)
regArr = []
if method == cv2.TM_SQDIFF_NORMED:
color = (0, 0, 255)
thickness = 3
elif method == cv2.TM_CCOEFF_NORMED:
color = (0, 255, 0)
thickness = 2
elif method == cv2.TM_CCORR_NORMED:
color = (255, 0, 0)
thickness = 1
# return regArr
for pt in zip(*loc[::-1]):
val = res.item(pt[1], pt[0])
print("res %s %s " % (val, pt))
reg = Region(pt[0], pt[1], width, height)
regArr.append((reg, res.item(pt[1], pt[0])))
# return regArr
regRet = []
while regArr:
# Get first region from regArr
cr = regArr[0][0]
# Create array of regions which intersect with cr
intersect_reg = [r for r in regArr if cr.is_intersect(r[0])]
# Sort it by res value in descending order
intersect_reg.sort(key=itemgetter(1), reverse=True)
# Append to returned array region with highest res value
reg = intersect_reg[0][0]
regRet.append(reg)
# cv2.rectangle(image.img, (reg.x, reg.y),
# (reg.getRight(), reg.getBottom()),
# (0, 0, 255),
# 1)
            # Keep only the regions in regArr that do not intersect with cr
regArr = [r for r in regArr if r not in intersect_reg]
#
set_list.append(set(regRet))
# Find instersection of regions created by different methods
# Suppose that this
regRet = set.intersection(*set_list)
for reg in regRet:
cv2.rectangle(image.img, (reg.x, reg.y),
(reg.getRight(), reg.getBottom()),
(255, 255, 255),
4)
if not regRet:
return None
return regRet
class Image:
def __init__(self, name=None, find_in_func=find_single_grey):
self.img = None
self.img_buffer = None
self.name = name
self.pattern_finder = find_in_func
# print("Image:__init__: name %s pattern_finder %s", name, self.pattern_finder)
pass
@classmethod
def fromFile(cls, path, name=None, find_in_func=find_single_grey, method=cv2.IMREAD_COLOR):
image = Image(name=(name if name is not None else path), find_in_func=find_in_func)
image.img = cv2.imread(path, method)
return image
@classmethod
def fromArray(cls, arr):
image = Image('from array %s' % (id(arr)))
image.img = arr
return image
def set_pattern_finder(self, find_in_func):
self.pattern_finder = find_in_func
def crop(self, region):
arr = self.get_array()[region.getTop():region.getBottom(), region.getLeft():region.getRight()]
img = Image.fromArray(arr)
        img.set_name('cropped at top %s bottom %s left %s right %s of %s' % (
            region.getTop(), region.getBottom(), region.getLeft(), region.getRight(), self.get_name()))
return img
def crop_copy(self, region):
arr = self.get_array()[region.getTop():region.getBottom(), region.getLeft():region.getRight()].copy()
img = Image.fromArray(arr)
        img.set_name('cropped at top %s bottom %s left %s right %s of %s' % (
            region.getTop(), region.getBottom(), region.getLeft(), region.getRight(), self.get_name()))
return img
def show(self, time=0):
cv2.imshow(self.get_name(), self.img)
cv2.waitKey(time)
def get_array(self):
return self.img
def get_threshold(self, low, high, method=cv2.THRESH_BINARY):
ret, thresh1 = cv2.threshold(self.get_grey_array(), low, high, method)
return self.fromArray(thresh1)
def get_grey(self):
if self.is_grey():
return self
return self.fromArray(cv2.cvtColor(self.img, cv2.COLOR_BGR2GRAY))
def get_grey_array(self):
if self.is_grey():
return self.img
return cv2.cvtColor(self.img, cv2.COLOR_BGR2GRAY)
def get_canny_array(self):
return cv2.Canny(self.img, 100, 200)
def get_hsv_array(self):
return cv2.cvtColor(self.img, cv2.COLOR_BGR2HSV)
def get_width(self):
return self.img.shape[:2][1]
def get_height(self):
return self.img.shape[:2][0]
def getSize(self):
return self.img.shape[:2]
def is_grey(self):
return len(self.img.shape) == 2
    def resize_to(self, w, h):  # renamed from `resize`, which was shadowed by `resize(self, scale)` below
if (self.get_height() == h and self.get_width() == w):
return
w_c = float(w) / self.get_width()
h_c = float(h) / self.get_height()
if w_c < 1 or h_c < 1:
method = cv2.INTER_AREA
else:
method = cv2.INTER_LINEAR
self.img = cv2.resize(self.img, None, fx=w_c, fy=h_c, interpolation=method)
def get_resized_copy(self, w=None, h=None):
if w is None and h is None:
            raise AttributeError("Image:get_resized_copy: width and height cannot both be None")
        if self.get_height() == h and self.get_width() == w:
            return Image.fromArray(self.img.copy())
if w:
w_c = float(w) / self.get_width()
else:
w_c = float(h) / self.get_height()
if h:
h_c = float(h) / self.get_height()
else:
h_c = float(w) / self.get_width()
if w_c < 1 or h_c < 1:
method = cv2.INTER_AREA
else:
method = cv2.INTER_LINEAR
return Image.fromArray(cv2.resize(self.img, None, fx=w_c, fy=h_c, interpolation=method))
def resize(self, scale):
if scale == 1:
return
elif scale < 1:
method = cv2.INTER_AREA
elif scale > 1:
method = cv2.INTER_LINEAR
self.img = cv2.resize(self.img, None, fx=scale, fy=scale, interpolation=method)
def get_name(self):
return self.name
def set_name(self, name=None):
        if not name:
            self.name = 'image id:%s' % id(self)
            return
        self.name = name
def cvtColor(self, method=cv2.COLOR_BGR2GRAY):
self.img = cv2.cvtColor(self.img, method)
def find_pattern_from_list(self, pat_list, cache=False):
reg = None
for pat in pat_list:
reg = self.find_pattern(pat)
if reg:
break
return reg
def find_pattern(self, pattern, all=False):
return pattern.pattern_finder(self, pattern, all=all)
# Search for all occurrences of a pattern in the source image
class App(object):
def __init__(self, name='Clicker Heroes', width: int = None):
print("init App")
FindWindow.argtypes = [LPCWSTR, LPCWSTR]
FindWindow.restype = HWND
self.name = name
self.window = Window(FindWindow(None, name))
if width:
self.window.resizeCliPropW(width)
# FindWindow.argtypes = [ctypes.c_wchar_p,ctypes.c_wchar_p]
# FindWindow.restype = ctypes.c_void_p
def getWindow(self):
return self.window
class SingletonMetaClass(type):
def __init__(cls, name, bases, dict):
super(SingletonMetaClass, cls) \
.__init__(name, bases, dict)
original_new = cls.__new__
        def my_new(cls, *args, **kwds):
            if cls.instance is None:
                cls.instance = original_new(cls, *args, **kwds)
            return cls.instance
cls.instance = None
cls.__new__ = staticmethod(my_new)
class Singleton(type):
instance = None
def __call__(cls, *args, **kw):
if not cls.instance:
cls.instance = super(Singleton, cls).__call__(*args, **kw)
return cls.instance
class MouseClick:
def __init__(self, window, x, y):
self.hwnd = window.hwnd
self.x = x
self.y = y
def apply(self):
self.click(self.x, self.y)
def click(self, x, y, park=True, cps=30):
x = int(x)
y = int(y)
self.last_click_location = (x, y)
tmp = (y << 16) | x
delay = 1 / cps
if park:
delay /= 2
err = 0
err += SendMessage(self.hwnd, WM_LBUTTONDOWN, 0, tmp)
time.sleep(delay)
err += SendMessage(self.hwnd, WM_LBUTTONUP, 0, tmp)
if park:
x = 1
y = 1
tmp = (y << 16) | x
time.sleep(delay)
err += SendMessage(self.hwnd, WM_MOUSEMOVE, 0, tmp)
if err > 0:
return None
return True
    def move(self, x, y, park=True, cps=30):
        x = int(x)
        y = int(y)
        tmp = (y << 16) | x
        delay = 1 / cps
        if park:
            delay /= 2
        err = 0
        # Unlike click(), only a WM_MOUSEMOVE is sent here
        err += SendMessage(self.hwnd, WM_MOUSEMOVE, 0, tmp)
        time.sleep(delay)
        if park:
            # Park the cursor back at (1, 1)
            tmp = (1 << 16) | 1
            time.sleep(delay)
            err += SendMessage(self.hwnd, WM_MOUSEMOVE, 0, tmp)
        if err > 0:
            return None
        return True
class MouseScroll:
def __init__(self, window, direction):
self.hwnd = window.hwnd
self.direction = direction
def apply(self):
self.scroll(direction=self.direction)
def scroll(self, direction, x=1, y=1, park=True, cps=30):
tmp = (y << 16) | x
delay = 1 / cps
if park:
delay /= 2
err = 0
err += SendMessage(self.hwnd, WM_MOUSEWHEEL,
(WHEEL_DELTA * direction) << 16, tmp)
time.sleep(delay)
if park:
x = 1
y = 1
tmp = (y << 16) | x
time.sleep(delay)
err += SendMessage(self.hwnd, WM_MOUSEMOVE, 0, tmp)
if err > 0:
return None
return True
class ClickerHeroes(metaclass=Singleton):
# class ClickerHeroes(App):
# __metaclass__ = Singleton
def __init__(self, lock, width: int = None) -> None:
if DEBUG:
print("init ClickerHeroes")
name = 'Clicker Heroes'
self.name = name
self.window = Window(FindWindow(None, name), lock)
if width:
self.window.resizeCliPropW(width)
self.lock = lock
self.fish_time = -1000000
self.newyear = -1000000
self.farm_mode_start_time = -1000000
self.ascend_time = 0
self.ascend_checker_time = 0
self.got_heroes_souls = False
self.relic_ooze_collected = False
self.reindex_heroes_list_time = 0
self.patterns = {}
self.menus = {}
self.hero_patterns_location_cache = {}
self.patterns_location_cache = {}
self.patterns_cache = {}
self.click_monster_location = None
self.starter_clicks = True
self.lvlup_all_heroes_time = 0
self.boss_time = None
self.boss_check_time = 0
self.levels_region = None
self.levels_region_scrshot = None
self.progress_button_time = -1000000
self.cache_state = False
self.reindex_heroes_list_time = -1000000
self.skills_upgrades_time = 0
width, height = self.window.get_size()
sss = MouseClick(self.window, 1, 1)
scale = 1
if width > height * 16.0 / 9:
scale = height / (1600.0 * 9 / 16)
if height > width * 9.0 / 16:
scale = width / 1600.0
self.script_path = os.path.realpath(__file__)
self.script_dir = os.path.dirname(self.script_path)
self.stats_dir = os.path.join(self.script_dir, STATS_DIR)
self.patterns_path = os.path.join(self.script_dir, PATTERNS_DIR)
self.load_patterns(self.patterns_path, self.patterns, scale)
self.hero_patterns_location_cache = {}
for menu_name in ('heroes', 'ancients'):
self.menus[menu_name] = {}
# self.menus[menu_name]['sorted_heroes_list'] = self.load_sorted_heroes_list(menu_name)
self.menus[menu_name]['sorted_heroes_list'] = self.load_container(menu_name, "sorted_heroes_list", [])
self.menus[menu_name]['sb_min_position'] = None
self.menus[menu_name]['sb_max_position'] = None
self.menus[menu_name]['sb_position'] = 0
self.menus[menu_name]['last_available_hero'] = None
self.menus[menu_name]['max_seen_hero'] = None
self.menus[menu_name]['heroes_list'] = None
self.menus[menu_name]['visible_heroes_cache'] = None
# self.menus[menu_name]['hero_level'] = self.load_heroes_levels(menu_name)
self.menus[menu_name]['hero_level'] = self.load_container(menu_name, "hero_level", {})
self.menus[menu_name]['heroes_upgraded_list'] = self.load_container(menu_name,
"heroes_upgraded_list",
[])
self.menus[menu_name]['last_ascend_seen_heroes'] = set()
self.window.makeScreenshotClientAreaRegion()
# self.set_monster_click_location()
def do(self):
self.screenShot = self.window.getScreenshot()
self.lvlup_top_heroes()
# self.buyHeroesUpgrade()
self.lvl_progress()
# if self.ascensionNeed():
# self.ascend()
# self.lvlUpAncient()
# if self.transcendNeed():
# self.trascend()
# self.lvlUpOutsiders()
# Loading image patterns structure in self.patterns
def get_sorted_heroes_list(self, menu_name):
return self.menus[menu_name]['sorted_heroes_list']
    def load_patterns(self, path, patterns, scale):
        """Load image patterns from `path` into the `patterns` dict, keyed by relative name."""
        for root, dirs, files in os.walk(path):
            for fn in files:
                name, ext = os.path.splitext(os.path.basename(fn))
                nm = root[root.find(path) + len(path):]
                # Change the OS path separator to /
                nm = re.sub(re.escape(os.sep), '/', os.path.join(nm, name))
                if 'lvlup_' in nm:
                    find_in_func = find_lvlup
                elif 'button_progression' in nm:
                    find_in_func = find_single_grey_90
                elif all(x in nm for x in ['_c', 'heroes_skills']):
                    find_in_func = find_checked_skills
                    # find_in_func = find_pattern_hist
                else:
                    find_in_func = find_single_grey
                img = Image.fromFile(path=os.path.join(root, fn), name=name, find_in_func=find_in_func)
                img.resize(scale)
                patterns[nm] = img
def find_pattern(self, pat):
return self.find_pattern_cached(pat)
def find_pattern_from_list(self, pat_list, cache=True, all=False):
regions = []
for pat in pat_list:
if cache:
reg = self.find_pattern_cached(pat, all=all)
else:
reg = self.window.getScreenshot().find_pattern(pat, all=all)
if reg:
regions.extend(reg)
if not all:
break
return regions
def find_pattern_reg_name(self, pat_list):
reg = None
reg_name = []
for pat in pat_list:
regs = self.find_pattern(pat)
if regs:
for r in regs:
reg_name.append((r, pat.get_name()))
return reg_name
def find_pattern_reg_name_single(self, reg, pat_list):
reg_name = None
for pat in pat_list:
regs = self.window.getScreenshot(reg).find_pattern(pat)
if regs:
regs = regs[0]
reg_name = (regs, pat.get_name())
break
# if not reg_name:
# return None
return reg_name
def find_pattern_cached(self, pat, all=False):
# pat_id = id(pat)
pat_id = pat.get_name()
if pat_id not in self.patterns_location_cache.keys():
self.patterns_location_cache[pat_id] = {}
# pattern location cache
plc = self.patterns_location_cache[pat_id]
regions = []
if plc:
# print("find_pattern_cached: Pattern %s has %s entries in cache location" % (pat.get_name(), len(plc)))
# Quickly scan pattern location cache
cnt = 0
for cached_location in plc.keys():
reg = self.window.getScreenshot(cached_location).find_pattern(pat, all=False)
cnt += 1
                # If the location exists in the cache and the pattern is on screen, add it to the result
if reg:
# print("find_pattern_cached: Cache hit!! Pattern %s" % (pat.get_name()))
plc[cached_location] += 1
if DEBUG and cnt > 1:
print("find_pattern_cached: Pattern %s : Cache hit on %s cache entry" % (pat_id, cnt))
regions.append(cached_location)
break
        # If the pattern wasn't found at any cached location, scan the whole screen and cache it
if not regions:
# Scan the whole screen
# print("find_pattern_cached: Cache missed!! Searching for %s " % (pat.get_name()))
reg = self.window.getScreenshot().find_pattern(pat, all=all)
# print("find_pattern_cached: Found reg %s " % (reg))
            # If locations were found, add them to the cache
if reg:
# hit_count = [1] * len(reg)
# cache_entry = zip(reg, hit_count)
plc = self.patterns_location_cache[pat_id]
plc.update(dict.fromkeys(reg, 1))
regions.extend(reg)
else:
# Nothing found in cache and on screen
return None
if plc:
if len(plc) != 1:
# Sort by cache hit count
plc = OrderedDict(sorted(plc.items(), key=lambda t: t[1], reverse=True))
self.patterns_location_cache[pat_id] = plc
# print(plc)
return regions
def scroll_to_last_available_hero(self, menu_name):
self.scroll_to_position(menu_name, self.menus[menu_name]['sorted_heroes_list'])
return
def get_prev_hero_name(self, menu_name, hero_name):
if hero_name is None:
return None
unsorted_heroes_list = self.get_unsorted_hero_list(menu_name)
sorted_heroes_list = self.get_sorted_heroes_list(menu_name)
# Previous hero index
try:
if sorted_heroes_list:
hindex = sorted_heroes_list.index(hero_name) - 1
                if hindex >= 0:
                    # The previous hero name can be determined unambiguously
                    ret_hlist = [sorted_heroes_list[hindex]]
                else:
                    # hero_name is first in the sorted list; any hero not yet
                    # placed in sorted_heroes_list could possibly precede it
                    # ret_hlist = [name for name in unsorted_heroes_list if name not in sorted_heroes_list and name != hero_name]
                    ret_hlist = [name for name in unsorted_heroes_list if name != hero_name]
else:
# Hero ordered list is empty so return all heroes from hero list except hero_name
ret_hlist = [name for name in unsorted_heroes_list if name != hero_name]
except ValueError as e:
ret_hlist = None
return ret_hlist
def get_next_hero_name(self, menu_name, hero_name):
if hero_name is None:
return None
unsorted_heroes_list = self.get_unsorted_hero_list(menu_name)
sorted_heroes_list = self.get_sorted_heroes_list(menu_name)
ret_hlist = None
# Next hero index
try:
if sorted_heroes_list:
hindex = sorted_heroes_list.index(hero_name) + 1
                if hindex >= len(sorted_heroes_list):
                    # hero_name is last in the sorted list; any hero not yet
                    # placed in sorted_heroes_list could possibly follow it
                    # ret_hlist = [name for name in unsorted_heroes_list if name not in sorted_heroes_list and name != hero_name]
                    ret_hlist = [name for name in unsorted_heroes_list if name != hero_name]
else:
                    # The next hero name can be determined unambiguously
ret_hlist = [sorted_heroes_list[hindex]]
else:
# Hero ordered list is empty so return all heroes from hero list except hero_name
ret_hlist = [name for name in unsorted_heroes_list if name != hero_name]
except ValueError as e:
ret_hlist = None
return ret_hlist
def get_max_seen_hero(self, menu_name):
return self.menus[menu_name]['max_seen_hero']
def set_max_seen_hero(self, menu_name, hero_name):
self.menus[menu_name]['max_seen_hero'] = hero_name
def lvlup_all_heroes(self, menu_name, max_level=200, timer=180):
self.window.makeScreenshotClientAreaRegion()
curr_time = time.clock()
if curr_time - self.lvlup_all_heroes_time < timer:
return None
self.lvlup_all_heroes_time = curr_time
sorted_hero_list = self.get_sorted_heroes_list(menu_name)
if sorted_hero_list is None:
return None
last_available_hero = self.get_last_available_hero(menu_name)
if last_available_hero:
last_available_hero_index = sorted_hero_list.index(last_available_hero)
else:
return None
heroes_upgraded_list = self.menus[menu_name]['heroes_upgraded_list']
heroes_to_lvlup = [hero_name for hero_name in sorted_hero_list if
self.get_hero_level(menu_name, hero_name) < max_level
and sorted_hero_list.index(hero_name) <= last_available_hero_index
and hero_name not in heroes_upgraded_list]
for hero_name in heroes_to_lvlup:
self.lvlup_hero(menu_name, hero_name, max_level=max_level)
return True
    ### Buy hero skills except ascension (currently disabled)
# hero_reg = self.scroll_to_hero(menu_name, hero_name)
# hero_reg_scr = self.window.makeScreenshotClientAreaRegion(hero_reg)
# skills_reg = hero_reg_scr.find_pattern_from_list(self.get_pattern('heroes_skills', '%s_c' % hero_name))
# if skills_reg:
# continue
#
# if hero_name == 'amenhotep':
# ascend_skill_reg = hero_reg_scr.find_pattern_from_list(
# self.get_pattern('heroes_skills', 'amenhotep_ascend'),
# cache=False)
# if ascend_skill_reg:
# ascend_skill_reg = ascend_skill_reg[0]
# else:
# continue
# else:
# ascend_skill_reg = None
# button_edge_reg = hero_reg_scr.find_pattern_from_list(self.get_pattern('heroes_button', 'edge_'),
# cache=False)
# if button_edge_reg is None:
# continue
# button_edge_reg = button_edge_reg[0]
# hero_name_reg = hero_reg_scr.find_pattern_from_list(self.get_pattern(menu_name, hero_name))
# if hero_name_reg is None:
# continue
# hero_name_reg = hero_name_reg[0]
# skills_reg_left_x, skills_reg_left_y = button_edge_reg.center().get_xy()
# skills_reg_right_x = hero_name_reg.getRight()
# y = hero_reg.getTop() + skills_reg_left_y
# for i in range(100):
# x = hero_reg.getLeft() + skills_reg_left_x + int(
# random.random() * (skills_reg_right_x - skills_reg_left_x))
# if ascend_skill_reg and ascend_skill_reg.contains((x - hero_reg.getLeft(), y - hero_reg.getTop())):
# continue
# hero_reg_scr = self.window.makeScreenshotClientAreaRegion(hero_reg)
# cv2.imshow("hero_reg_scr", hero_reg_scr.get_array())
# cv2.waitKey(50)
# # skills_reg = hero_reg_scr.find_pattern_from_list(self.get_pattern('heroes_skills', '%s_c' % hero_name))
# # if skills_reg:
# # break
# self.window.click(x, y, cps=5)
def lvlup_top_heroes(self, menu_name, dist=0):
self.window.makeScreenshotClientAreaRegion()
img = self.window.getScreenshot().get_resized_copy(w=300).get_array()
cv2.imshow('lvlup_top_heroes:img', img)
cv2.waitKey(50)
hero_name = self.get_last_available_hero(menu_name)
if hero_name is None:
return None
i = 0
while i <= dist:
if hero_name:
res = self.lvlup_hero(menu_name, hero_name)
hero_lst = self.get_prev_hero_name(menu_name, hero_name)
if hero_lst:
hero_name = hero_lst[0]
else:
break
i += 1
def set_last_available_hero(self, menu_name, hero_name):
self.menus[menu_name]['last_available_hero'] = hero_name
def get_last_available_hero(self, menu_name):
        hol = self.get_sorted_heroes_list(menu_name)
        if hol is None:
            return None
        lah = None
# pah = None
lahp = self.menus[menu_name]['last_available_hero']
if lahp:
# Check that lahp is the last hero in the list
if lahp == hol[-1]:
return lahp
else:
next_lah_index = hol.index(lahp) + 1
# pah = self.get_next_hero_name(menu_name, lahp)
else:
next_lah_index = 0
max_seen_hero = self.get_max_seen_hero(menu_name)
if not max_seen_hero:
return None
max_seen_hero_index = hol.index(max_seen_hero) + 1
if max_seen_hero_index >= next_lah_index:
to_check_heroes = hol[next_lah_index:max_seen_hero_index]
for h in reversed(to_check_heroes):
reg = self.scroll_to_hero(menu_name, h)
if reg is None:
return None
if self.is_hero_lvlup_button_active(reg):
lah = h
cv2.imshow("get_last_available_hero:lah_reg", self.window.getScreenshot(reg).get_array())
cv2.waitKey(50)
break
if lah:
lah_index = hol.index(lah)
else:
lah_index = 0
if lah_index >= next_lah_index:
self.set_last_available_hero(menu_name, lah)
return self.menus[menu_name]['last_available_hero']
def find_hero_lvlup_button(self, hero_reg):
if hero_reg is None:
return None
cv2.imshow("find_hero_lvlup_button:hero_reg", self.window.getScreenshot(hero_reg).get_array())
cv2.waitKey(50)
return self.find_lvlup_button(hero_reg)
def find_hero_level_reg(self, reg):
if reg is None:
return None
level_mark_patterns = self.get_pattern('heroes_button', 'level_mark')
reg_name = self.find_pattern_reg_name_single(reg, level_mark_patterns)
if reg_name is None:
return None
# Absolute region of level mark
level_mark_reg = reg_name[0]
loc1 = Location(reg.getLeft() + level_mark_reg.getLeft(), reg.getTop() + level_mark_reg.getTop())
loc2 = Location(reg.getRight(), reg.getTop() + level_mark_reg.getBottom())
level_mark_reg = Region.from2Location(loc1, loc2)
return level_mark_reg
###################################
# def find_hero_region(self, menu_name, hero_name):
# reg = self.scroll_to_hero(menu_name, hero_name)
# return reg
###################################
def find_lvlup_button(self, reg):
button_patterns = self.get_pattern('heroes_button', 'lvlup_')
reg_name = self.find_pattern_reg_name_single(reg, button_patterns)
if reg_name is None:
return None
# Absolute region
reg = reg_name[0] + (reg.x, reg.y)
cv2.imshow("find_lvlup_button:reg_name[0]", self.window.getScreenshot(reg).get_array())
cv2.waitKey(50)
butt_reg_name = (reg, reg_name[1])
if 'hire_inactive' in butt_reg_name[1]:
# if all(x in butt_reg_name[1] for x in ['inactive', 'hire']):
status = False
else:
status = True
return (butt_reg_name[0], status)
    def find_level_reg(self, reg):
        # TODO: not implemented
        return
def is_hero_lvlup_button_active(self, hero_reg):
self.window.makeScreenshotClientAreaRegion()
lvl_button = self.find_hero_lvlup_button(hero_reg)
if lvl_button is None:
return None
status = lvl_button[1]
return status
def get_hero_level(self, menu_name, hero_name):
hero_level_dict = self.menus[menu_name]['hero_level']
if hero_name not in hero_level_dict.keys():
hero_level_dict[hero_name] = 0
return hero_level_dict[hero_name]
def set_hero_level(self, menu_name, hero_name, level):
hero_level_dict = self.menus[menu_name]['hero_level']
if hero_name not in hero_level_dict.keys():
hero_level_dict[hero_name] = 0
hero_level_dict[hero_name] = level
def add_hero_level(self, menu_name, hero_name, level):
hero_level_dict = self.menus[menu_name]['hero_level']
if hero_name not in hero_level_dict.keys():
hero_level_dict[hero_name] = 0
hero_level_dict[hero_name] += level
def get_heroes_level_dict(self, menu_name):
return self.menus[menu_name]['hero_level']
def save_sorted_heroes_list(self, menu_name, shl):
try:
shl_filename = os.path.join(self.stats_dir, '%s_sorted_heroes_list.dat' % menu_name)
            with tempfile.NamedTemporaryFile(mode='w+t', delete=False, dir=self.stats_dir) as temp_file:
json.dump(shl, temp_file)
if os.path.isfile(shl_filename):
shutil.copy(shl_filename, shl_filename + '.bck')
os.replace(temp_file.name, shl_filename)
except OSError:
raise
def load_sorted_heroes_list(self, menu_name):
try:
            fn = os.path.join(self.stats_dir, '%s_sorted_heroes_list.dat' % menu_name)
with open(fn, 'r') as f:
return json.load(f)
except FileNotFoundError:
return []
def save_container(self, menu_name, container_name, container):
try:
shl_filename = os.path.join(self.stats_dir, '%s_%s' % (menu_name, container_name))
with tempfile.NamedTemporaryFile(mode='w+t', delete=False, dir=self.stats_dir) as temp_file:
json.dump(container, temp_file)
if os.path.isfile(shl_filename):
shutil.copy(shl_filename, shl_filename + '.bck')
os.replace(temp_file.name, shl_filename)
except OSError:
raise
def load_container(self, menu_name, container_name, default_container):
try:
fn = os.path.join(self.stats_dir, '%s_%s' % (menu_name, container_name))
# fn = STATS_DIR + '/%s_%s' % (menu_name, container_name)
with open(fn, 'r') as f:
return json.load(f)
except FileNotFoundError:
return default_container
def save_heroes_levels(self, menu_name, hld):
try:
hld_filename = STATS_DIR + '/%s_heroes_levels.dat' % menu_name
with tempfile.NamedTemporaryFile(mode='w+t', delete=False, dir=STATS_DIR) as temp_file:
json.dump(hld, temp_file)
if os.path.isfile(hld_filename):
shutil.copy(hld_filename, hld_filename + '.bck')
os.replace(temp_file.name, hld_filename)
except OSError:
raise
def load_heroes_levels(self, menu_name):
try:
fn = STATS_DIR + '/%s_heroes_levels.dat' % menu_name
with open(fn, 'r') as f:
return json.load(f)
except FileNotFoundError:
return {}
def lvlup_hero(self, menu_name, hero_name, lvl_count=None, max_level=None):
self.open_menu(menu_name)
hero_reg = self.scroll_to_hero(menu_name, hero_name=hero_name)
# hero_reg_scr= self.window.makeScreenshotClientAreaRegion(hero_reg)
# button_edge_reg=hero_reg_scr.find_pattern(self.get_pattern('heroes_button','edge_'))
# skills_reg_left_x,skills_reg_left_y=button_edge_reg.center().get_xy()
# hero_name_reg=hero_reg_scr.find_pattern(self.get_pattern(menu_name,hero_name))
# skills_reg_right_x=hero_name_reg.getRight()
# for i in range(100):
# x=skills_reg_left_x+int(random.random()*(skills_reg_right_x-skills_reg_left_x))
# y=skills_reg_left_y
# self.window.click(hero_reg.getRight()+x,hero_reg.getTop()+y)
if hero_reg is None:
return None
button = self.find_hero_lvlup_button(hero_reg)
if button is None:
return None
hero_level = self.get_hero_level(menu_name, hero_name)
levelup_button = button[0]
if lvl_count is None:
lvl_count = 1000 * 1000 * 1000
if max_level:
lvl_count = max_level - hero_level
time_1 = time.clock()
start_time = time.clock()
cnt = 0
# hold_key = 'shift'
# if max_level is None:
# hold_key = 'q'
# self.window.pressAndHoldKey(hold_key)
while True:
# time.sleep(0.2)
# if menu_name == 'heroes':
# For speed make screenshot of lvlup button area
time_chk = time.clock()
# delay = 0.01
# max = 0
# while max == 0 and delay<=1: # time.clock() - time_chk < 0.3:
# scrshot = self.window.makeScreenshotClientAreaRegion(reg)
# # Quick and dirty check for active button
# max = scrshot.get_threshold(128, 255).get_array().max()
#
# time.sleep(delay)
# delay *= 2
# reg, status = self.find_hero_lvlup_button(menu_name, hero_name)
self.window.makeScreenshotClientAreaRegion()
scr_levelup = self.window.makeScreenshotClientAreaRegion(levelup_button)
            max_px = scr_levelup.get_threshold(128, 255).get_array().max()
            if max_px == 0:
break
level_reg = self.find_hero_level_reg(hero_reg)
if level_reg is None:
check_reg = levelup_button
pattern_finder = find_lvlup
else:
check_reg = level_reg
pattern_finder = find_level
scr_level_before = self.window.makeScreenshotClientAreaRegion(check_reg) # .get_threshold(235,255)
self.click_region(levelup_button)
delay = 0.01
total_delay = 0
# Wait a little after click applied
while total_delay <= 1: # time.clock() - time_chk < 0.3:
time.sleep(delay)
total_delay += delay
scr_level_after = self.window.makeScreenshotClientAreaRegion(check_reg) # .get_threshold(235,255)
scr_level_after.set_pattern_finder(pattern_finder)
cv2.imshow('scr_level_before', scr_level_before.get_array())
cv2.imshow('scr_level_after', scr_level_after.get_array())
cv2.waitKey(25)
# comp=cv2.compare(scr_level_before.get_array(),scr_level_after.get_array(),cv2.CMP_EQ)
# comp = cv2.bitwise_xor(scr_level_before.get_array(), scr_level_after.get_array())
# if comp.min()==0:
if not scr_level_before.find_pattern(scr_level_after):
break
delay *= 2
if total_delay > 1 and check_reg == level_reg:
break
# cnt += 10
cnt += 1
if time.clock() - start_time > 120 or cnt >= lvl_count:
break
# self.window.releaseKey(hold_key)
# self.window.move_mouse(y=10)
time.sleep(0.5)
time_2 = time.clock()
# self.menus[menu_name][hero_name]['lvl'] += 1
self.add_hero_level(menu_name, hero_name, cnt)
if cnt == 0:
return None
if DEBUG:
print("lvlup_hero:lvl/sec=%s" % (cnt / (time_2 - time_1)))
# self.save_heroes_levels(menu_name, self.get_heroes_level_dict(menu_name))
self.save_container(menu_name, 'hero_level', self.menus[menu_name]['hero_level'])
return cnt
def click_location(self, loc, refresh=False):
# x, y = loc.get_xy()
# mouse_event = MouseClick(self.window, x, y)
# # self.mouse_event_queue.put((mouse_event, self.mp_event))
# l=list()
# # self.mouse_event_queue.put(mouse_event)
# self.mouse_event_queue.put(l)
# # self.mouse_event_queue.put('123123123')
# self.mp_event.wait()
return self.window.click_location(loc, refresh=refresh)
def click_region(self, reg, refresh=False):
# x, y = reg.center().get_xy()
# mouse_event = MouseClick(self.window, x, y)
# self.mouse_event_queue.put((mouse_event, self.mp_event))
# self.mp_event.wait()
ret = self.window.click_region(reg, refresh=refresh)
return ret
def scrollDownMenu(self, name):
self.menus[name]['scrollPosition'] += 1
scrPos = self.menus[name]['scrollPosition']
if scrPos > self.menus[name]['maxScrollPosition']:
self.menus[name]['maxScrollPosition'] = scrPos
def get_pattern_old(self, menu_name, pattern_name):
if pattern_name not in self.patterns[menu_name].keys():
return None
return self.patterns[menu_name][pattern_name]
def get_pattern(self, menu_name, pattern_name):
path = '/%s/%s' % (menu_name, pattern_name)
if path in self.patterns_cache.keys():
patterns_list = self.patterns_cache[path]
else:
patterns_list = [self.patterns[key] for key in self.patterns.keys() if key.startswith(path)]
self.patterns_cache[path] = patterns_list
return patterns_list
def find_hero_location_old(self, menu_name, hero_name):
if hero_name not in self.hero_patterns_location_cache[menu_name]:
self.hero_patterns_location_cache[menu_name][hero_name] = []
# hero pattern location cache
hplc = self.hero_patterns_location_cache[menu_name][hero_name]
for cached_location in hplc:
if cached_location is None:
break
pat = self.get_pattern(menu_name, hero_name)
for img in pat:
if self.window.getScreenshot(cached_location).find_pattern(img) is not None:
return cached_location
location = None
for pat in self.get_pattern(menu_name, hero_name):
location = self.window.getScreenshot().find_pattern(pat)
if location not in hplc and location is not None:
hplc.append(location)
break
return location
def find_hero_region(self, menu_name, hero_name):
pat = self.get_pattern(menu_name, hero_name)
return self.find_pattern_from_list(pat)
def get_last_hero(self, menu_name):
return self.get_sorted_heroes_list(menu_name)[-1]
def get_hero_list(self, menu_name):
path = '/%s/' % (menu_name)
#
hl = self.menus[menu_name]['heroes_list']
if hl is None:
hl = set(
[key.rpartition('/')[2].rpartition('_')[0] for key in self.patterns.keys() if key.startswith(path)])
self.menus[menu_name]['heroes_list'] = hl
return hl
def get_unsorted_hero_list(self, menu_name):
hero_list = self.get_hero_list(menu_name)
sorted_heroes_list = self.get_sorted_heroes_list(menu_name)
return [name for name in hero_list if name not in sorted_heroes_list]
def get_visible_heroes_cached_old(self, menu_name):
vhc = self.menus[menu_name]['visible_heroes_cache']
sp = self.get_scroll_pos(menu_name)
if sp not in vhc.keys():
return None
return vhc[sp]
def cache_visible_heroes_old(self, menu_name: str, hero_name_list):
vhc = self.menus[menu_name]['visible_heroes_cache']
sp = self.get_scroll_pos(menu_name)
if sp not in vhc.keys():
vhc[sp] = set()
# vhc[sp].update(hero_name_list)
vhc[sp] = hero_name_list
def get_visible_heroes_cached(self, menu_name):
vhc = self.menus[menu_name]['visible_heroes_cache']
return vhc
def cache_visible_heroes(self, menu_name: str, hero_name_list):
self.menus[menu_name]['visible_heroes_cache'] = hero_name_list
self.validate_cache_state()
def get_visible_heroes_old(self, menu_name):
self.open_menu(menu_name)
visible_heroes = []
hl = self.get_visible_heroes_cached(menu_name)
loop = 1
while not visible_heroes:
if not hl or loop > 1:
hl = self.get_hero_list(menu_name)
if DEBUG:
print("get_visible_heroes: visible_heroes_cache missed")
for name in hl:
reg = self.find_hero_region(menu_name, name)
if reg:
visible_heroes.append((name, reg[0]))
loop += 1
visible_heroes_names = list(zip(*visible_heroes))[0]
self.cache_visible_heroes(menu_name, visible_heroes_names)
# Sort visible heroes list by y position
return sorted(visible_heroes, key=lambda x: x[1].y)
def get_last_ascend_seen_heroes(self, menu_name):
#
#
# shl=self.get_sorted_heroes_list(menu_name)
# if not shl:
# return None
#
# hl=self.get_hero_list(menu_name)
# if not hl:
# return None
# msh=self.get_max_seen_hero(menu_name)
# if not msh:
# return None
#
        # mshi=hl.index(msh)
# return shl[:hl.index(self.get_max_seen_hero(menu_name))]
return self.menus[menu_name]['last_ascend_seen_heroes']
def add_last_ascend_seen_heroes(self, menu_name, hero_name):
self.menus[menu_name]['last_ascend_seen_heroes'].update(hero_name)
def get_visible_heroes(self, menu_name, number_of_vh=MAX_NUMBER_OF_VISIBLE_HEROES):
self.open_menu(menu_name)
visible_heroes = []
hl = self.get_hero_list(menu_name)
hlc = self.get_visible_heroes_cached(menu_name)
hol = self.get_sorted_heroes_list(menu_name)
check_remain_heroes = True
cache_state = self.get_cache_state()
if hlc:
# if self.cache_state_is_valid():
# return hlc
# Get hero name list from cache
hero_name_cached = list(zip(*hlc))[0]
for name in hero_name_cached:
reg = self.find_hero_region(menu_name, name)
if reg:
visible_heroes.append((name, reg[0]))
if len(visible_heroes) >= number_of_vh:
check_remain_heroes = False
break
visible_heroes = sorted(visible_heroes, key=lambda x: x[1].y)
if visible_heroes and check_remain_heroes:
top_hero_name = visible_heroes[0][0]
bottom_hero_name = visible_heroes[-1][0]
            for name, func in [(top_hero_name, self.get_prev_hero_name),
                               (bottom_hero_name, self.get_next_hero_name)]:
can_change_edge = False
if check_remain_heroes:
                    while True:
                        # name_list = self.get_prev_hero_name(menu_name, name)
name = func(menu_name, name)
if not name:
break
for n in name:
reg = self.find_hero_region(menu_name, n)
if reg:
visible_heroes.append((n, reg[0]))
else:
can_change_edge = True
if len(visible_heroes) >= number_of_vh:
check_remain_heroes = False
break
if len(name) > 1 or not check_remain_heroes:
break
                        if len(name) == 1 and can_change_edge:
break
if len(name) == 1:
name = name[0]
# name_list = self.get_prev_hero_name(menu_name, name)
# if name_list and check_remain_heroes:
# for name in name_list:
# reg = self.find_hero_region(menu_name, name)
# if reg:
# visible_heroes.append((name, reg[0]))
# if len(visible_heroes) >= number_of_vh:
# check_remain_heroes = False
# break
# name = bhn
# while 1:
# name_list = self.get_next_hero_name(menu_name, name)
# for name in name_list:
# reg = self.find_hero_region(menu_name, name)
# if reg:
# visible_heroes.append((name, reg[0]))
# if len(visible_heroes) >= number_of_vh:
# check_remain_heroes = False
# break
# if len(name_list)>1:
# break
if not visible_heroes:
name_list = hl
if name_list and check_remain_heroes:
for name in name_list:
reg = self.find_hero_region(menu_name, name)
if reg:
visible_heroes.append((name, reg[0]))
if len(visible_heroes) >= number_of_vh:
check_remain_heroes = False
break
visible_heroes = set(visible_heroes)
visible_heroes = sorted(visible_heroes, key=lambda x: x[1].y)
if visible_heroes:
visible_heroes_names = list(zip(*visible_heroes))[0]
self.add_last_ascend_seen_heroes(menu_name, visible_heroes_names)
self.cache_visible_heroes(menu_name, visible_heroes)
# Sort visible heroes list by y position
# self.add_last_ascend_seen_heroes( menu_name, visible_heroes_names)
return visible_heroes
def set_max_scroll_position(self, menu_name, pos):
self.menus[menu_name]['sb_max_position'] = pos
def set_min_scroll_position(self, menu_name, pos):
self.menus[menu_name]['sb_min_position'] = pos
def get_scroll_pos(self, menu_name):
sp = self.menus[menu_name]['sb_position']
# spx = self.get_scroll_max_pos(menu_name)
# spm = self.get_scroll_min_pos(menu_name)
# if spm and sp < spm:
# return spm
# if spx and sp > spx:
# return spx
return sp
def set_scroll_pos(self, menu_name, sp):
self.menus[menu_name]['sb_position'] = sp
def reindex_heroes_list(self, menu_name, reindex_timer=30):
self.window.makeScreenshotClientAreaRegion()
img = self.window.getScreenshot().get_resized_copy(w=300).get_array()
cv2.imshow('reindex_heroes_list:img', img)
cv2.waitKey(50)
curr_time = time.clock()
if curr_time - self.reindex_heroes_list_time < reindex_timer:
return False
self.reindex_heroes_list_time = curr_time
self.open_menu(menu_name)
if self.get_sorted_heroes_list(menu_name) is None:
dir_list = [WHEEL_UP, WHEEL_DOWN]
else:
dir_list = [WHEEL_DOWN]
# self.scroll_to_last(menu_name)
# dir_list = [WHEEL_UP, WHEEL_DOWN]
# Start scrolling to find location of heroes
for direction in dir_list:
visible_heroes = None
bug_scroll_heroes = None
while True:
# if direction == WHEEL_UP:
# op_dir=WHEEL_DOWN
# else:
# op_dir=WHEEL_UP
#
# self.scroll_menu(menu_name, op_dir)
# self.scroll_menu(menu_name, direction)
prev_vis_heroes = visible_heroes
visible_heroes = self.get_visible_heroes(menu_name)
if not visible_heroes:
return None
# if (visible_heroes and prev_vis_heroes):
# print("reindex_heroes_list: set==set %s" % (set(visible_heroes) == set(prev_vis_heroes)))
if (visible_heroes and prev_vis_heroes) and set(visible_heroes) == set(prev_vis_heroes):
if direction == WHEEL_DOWN:
self.scroll_menu(menu_name, WHEEL_UP)
self.scroll_menu(menu_name, WHEEL_DOWN)
bug_scroll_heroes = self.get_visible_heroes(menu_name)
if bug_scroll_heroes == None:
return None
if set(visible_heroes) != set(bug_scroll_heroes):
continue
if direction == WHEEL_DOWN:
self.set_scroll_pos(menu_name, self.get_scroll_pos(menu_name) - 1)
self.set_max_scroll_position(menu_name, self.get_scroll_pos(menu_name))
else:
self.set_scroll_pos(menu_name, self.get_scroll_pos(menu_name) + 1)
self.set_min_scroll_position(menu_name, self.get_scroll_pos(menu_name))
break
hol = self.menus[menu_name]['sorted_heroes_list']
visible_heroes_names = list(zip(*visible_heroes))[0]
if direction == WHEEL_UP:
# Adding heroes in front of sorted_heroes_list
hol[0:0] = [item for item in visible_heroes_names if item not in hol]
else:
                # Append heroes to the end of sorted_heroes_list
hol.extend([item for item in visible_heroes_names if item not in hol])
local_max_seen_hero = visible_heroes_names[-1]
global_max_seen_hero = self.get_max_seen_hero(menu_name)
if global_max_seen_hero:
if hol.index(local_max_seen_hero) > hol.index(global_max_seen_hero):
self.set_max_seen_hero(menu_name, local_max_seen_hero)
else:
self.set_max_seen_hero(menu_name, local_max_seen_hero)
# Check if we need to scroll
# if self.find_pattern_from_list(self.get_pattern('main', 'scroll_up')):
self.scroll_menu(menu_name, direction)
# else:
# # Just make screenshot
# self.window.makeScreenshotClientAreaRegion()
self.invalidate_cache_state()
self.validate_cache_state()
# self.save_sorted_heroes_list(menu_name, shl=self.menus[menu_name]['sorted_heroes_list'])
self.save_container(menu_name, 'sorted_heroes_list', self.menus[menu_name]['sorted_heroes_list'])
return True
def get_hero_scroll_position(self, menu_name, hero_name):
return self.menus[menu_name]['sb_position'][hero_name]
    def set_hero_scroll_position(self, menu_name, hero_name):
        hsbp = self.get_hero_scroll_position(menu_name, hero_name)
        if hsbp is None:
            self.init_hero_scroll_position(menu_name, hero_name)
        # Record the current scroll position for this hero
        self.menus[menu_name][hero_name]['sb_position'] = self.get_scroll_pos(menu_name)
def scroll_to_hero(self, menu_name, hero_name):
if hero_name is None:
return None
self.open_menu(menu_name)
sorted_heroes_list = self.get_sorted_heroes_list(menu_name)
if sorted_heroes_list is None:
return None
direction = None
while True:
visible_heroes = self.get_visible_heroes(menu_name)
if not visible_heroes:
return None
pass
hero_reg_dict = dict(visible_heroes)
visible_heroes_names = list(zip(*visible_heroes))[0]
top_vh = visible_heroes_names[0]
bottom_vh = visible_heroes_names[-1]
if direction == WHEEL_DOWN or direction is None:
                # Append heroes to the end of sorted_heroes_list
lst = [name for name in visible_heroes_names if name not in sorted_heroes_list]
sorted_heroes_list.extend(lst)
next_hero_name_list = self.get_next_hero_name(menu_name, hero_name)
if not next_hero_name_list:
return None
if set(next_hero_name_list).issubset(sorted_heroes_list) and hero_name != bottom_vh:
next_hero_name = next_hero_name_list[0]
                    # Both hero_name and next_hero_name must be visible,
                    # so the lvlup button sits between them
if sorted_heroes_list.index(next_hero_name) > sorted_heroes_list.index(bottom_vh):
direction = WHEEL_DOWN
elif sorted_heroes_list.index(hero_name) < sorted_heroes_list.index(top_vh):
direction = WHEEL_UP
if all(h in visible_heroes_names for h in (hero_name, next_hero_name)):
hero_name_reg = hero_reg_dict[hero_name]
next_hero_name_reg = hero_reg_dict[next_hero_name]
hero_reg_height = next_hero_name_reg.y - hero_name_reg.y
hero_reg = Region(0, hero_name_reg.y, hero_name_reg.getRight(), hero_reg_height)
break
else:
                # We may be at the end of the hero list,
                # so the lvlup button must be visible below the hero name
direction = WHEEL_DOWN
if hero_name in visible_heroes_names:
# button_patterns = self.get_pattern('heroes_button', 'lvlup_')
bottom_patterns = self.get_pattern('heroes_button', 'edge_')
hero_name_reg = hero_reg_dict[hero_name]
hero_reg_height = self.window.getClientRegion().getHeight() - hero_name_reg.y
hero_reg = Region(0, hero_name_reg.y, hero_name_reg.getRight(), hero_reg_height)
# Check that we have lvlup button in butt_reg
if self.find_pattern_reg_name_single(hero_reg, bottom_patterns):
break
# if direction == WHEEL_UP:
# # Adding heroes in front of sorted_heroes_list
# hol[0:0] = [item for item in visible_heroes_names if item not in hol]
# elif direction == WHEEL_UP:
# # Adding heroes in the end of of sorted_heroes_list
# hol.extend([item for item in visible_heroes_names if item not in hol])
if direction:
if direction == WHEEL_DOWN:
self.scroll_menu(menu_name, WHEEL_UP)
self.scroll_menu(menu_name, WHEEL_DOWN)
self.scroll_menu(menu_name, direction)
img = self.window.getScreenshot(hero_reg).get_array()
cv2.imshow('scroll_to_hero:hero_reg', img)
cv2.waitKey(50)
return hero_reg
def get_lvlup_toggle(self):
return 'z'
def needToUpgrade(self):
numberOfSkill = [len(v) for v in self.menus['mainMenu']['skills'].values()]
if numberOfSkill == SKILL_NUMBER:
return False
return True
def get_scroll_min_pos(self, menu_name):
return self.menus[menu_name]['sb_min_position']
def get_scroll_max_pos(self, menu_name):
return self.menus[menu_name]['sb_max_position']
def scroll_pos_inc(self, menu_name, count=1):
        self.set_scroll_pos(menu_name, self.get_scroll_pos(menu_name) + count)
def scroll_menu(self, menu_name, direction, count=1):
self.open_menu(menu_name)
# if not self.find_pattern_from_list(self.get_pattern('main', 'scroll_')):
# return None
for i in range(count):
# mouse_event = MouseScroll(self.window, direction)
# self.mouse_event_queue.put((mouse_event, self.mp_event))
# self.mp_event.wait()
self.window.scroll(direction)
self.menus[menu_name]['sb_position'] -= direction
time.sleep(0.3)
self.window.makeScreenshotClientAreaRegion()
self.invalidate_cache_state()
def scroll_to_position(self, menu_name, position):
if position is None:
return
cur_pos = self.get_scroll_pos(menu_name)
if position < cur_pos:
direction = WHEEL_UP
elif position > cur_pos:
direction = WHEEL_DOWN
else:
return
self.scroll_menu(menu_name, direction, abs(position - cur_pos))
def scroll_to_start(self, menu_name):
self.scroll_to_position(menu_name, self.get_scroll_min_pos(menu_name))
def scroll_to_last(self, menu_name):
self.scroll_to_position(menu_name, self.get_scroll_max_pos(menu_name))
def findItem(self, item):
if item['pattern']['location'] is None:
self.window.getScreenshot('').find_pattern(item['pattern'])
def nextLvl(self):
self.open_menu('mainMenu')
self.clickItem(self.findItem(self.menus['mainMenu']['nextLvlButton']))
def prevLvl(self):
self.open_menu('mainMenu')
self.clickItem(self.findItem(self.menus['mainMenu']['prevLvlButton']))
def upgradeTopHero(self, offset=0):
self.open_menu('heroesTab')
h = self.findTopHero()
self.lvlUpHero(h)
def get_current_menu(self):
name = None
for menu_name in ['news', 'ancients_summon', 'settings', 'shop', 'heroes', 'ancients', 'relics', 'clan',
'merceneries', 'transcendence']:
# Check that
reg = self.find_pattern_from_list(self.get_pattern('main', menu_name + '_menu_active'))
if reg:
name = menu_name
break
return name
def open_menu(self, menu_name):
cur_menu = self.get_current_menu()
if DEBUG:
print('open_menu: menu name is %s ' % (cur_menu))
if cur_menu == menu_name:
return
self.close_menu(wait=None)
pat_list = self.get_pattern('main', menu_name + '_menu')
reg = self.find_pattern_from_list(pat_list)
if not reg:
return None
self.click_location(reg[0].center())
def getCurrentMenu(self):
return self.currentmenu_name
def close_menu(self, menu_name=None, wait=1):
# self.wait_for_pattern_name(menu_name, 'close_menu')
while 1:
self.wait_for_pattern_list(self.get_pattern('buttons', 'button_close_menu'), wait=wait)
if not self.click_pattern('buttons', 'button_close_menu', all=True):
break
# self.click_pattern(menu_name, 'close_menu', all=False)
def close_popups(self, menu_name):
self.wait_for_pattern_name(menu_name, 'close_menu')
self.click_pattern(menu_name, 'close_menu', all=False)
def wait_for_pattern_name(self, menu_name, pat_name):
pat_list = self.get_pattern(menu_name, pat_name)
return self.wait_for_pattern_list(pat_list)
def wait_for_pattern_list(self, pat_list, wait=1):
delay = 0.05
wait_start = time.clock()
total_delay = 0
while wait is None or wait == -1 or total_delay <= wait:
self.window.makeScreenshotClientAreaRegion()
reg = self.find_pattern_from_list(pat_list)
if reg:
return reg
if wait is None:
return None
time.sleep(delay)
# if time.clock() - wait_start >= wait:
# return None
total_delay += delay
delay *= 2
return None
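The polling loop above doubles its sleep between checks until the accumulated wait exceeds the budget. A minimal standalone sketch of the same exponential-backoff idea (the helper name and the simplified handling of `wait=None`/`-1` are illustrative, not part of the bot):

```python
import itertools
import time

def wait_until(predicate, timeout=1.0, initial_delay=0.05):
    # Poll `predicate`, doubling the sleep between checks, until it
    # returns a truthy value or `timeout` seconds of sleep accumulate.
    total, delay = 0.0, initial_delay
    while total <= timeout:
        result = predicate()
        if result:
            return result
        time.sleep(delay)
        total += delay
        delay *= 2
    return None

# Succeeds on the third poll
calls = itertools.count()
print(wait_until(lambda: next(calls) >= 2))  # → True
```

Doubling the delay keeps the first checks responsive while bounding the number of screenshots taken during a long wait.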
def click_pattern(self, menu_name, pattern_name, all=False, refresh=True):
if refresh:
self.window.makeScreenshotClientAreaRegion()
patt_list = self.get_pattern(menu_name, pattern_name)
if patt_list:
regs = self.find_pattern_from_list(patt_list, all=all)
if regs:
for reg in regs:
self.click_region(reg)
if not all:
break
if refresh:
self.window.makeScreenshotClientAreaRegion()
return True
return None
def get_monster_click_location(self):
if not self.click_monster_location:
next_lvl_button = self.find_pattern_from_list(self.get_pattern('main', 'levelnext'))
if next_lvl_button:
next_lvl_button = next_lvl_button[0]
else:
return None
prev_lvl_button = self.find_pattern_from_list(self.get_pattern('main', 'levelprev'))
if prev_lvl_button:
prev_lvl_button = prev_lvl_button[0]
else:
return None
skull_hp = self.find_pattern_from_list(self.get_pattern('main', 'skull_hp'))
if skull_hp:
skull_hp = skull_hp[0]
else:
return None
x_n, y_n = next_lvl_button.center().get_xy()
x_p, y_p = prev_lvl_button.center().get_xy()
shop_y = skull_hp.center().get_y()
# click_x is halfway between next and previous level button
click_x = (x_p + x_n) // 2
# click_y is halfway between current level rect and shop button
click_y = (shop_y + y_n) // 2
self.click_monster_location = Location(click_x, click_y)
return self.click_monster_location
def click_monster(self, cps=10):
mcl = self.get_monster_click_location()
if not mcl:
return None
return self.click_location(mcl)
def collect_fish(self, timer=15):
curr_time = time.clock()
if curr_time - self.fish_time >= timer:
self.window.makeScreenshotClientAreaRegion()
self.fish_time = curr_time
return self.click_pattern('main', 'fish')
def collect_newyear(self, timer=30):
curr_time = time.clock()
if curr_time - self.newyear >= timer:
self.newyear = curr_time
return self.click_pattern('main', 'new_year')
return None
def collect_relic_ooze(self):
# if not self.relic_ooze_collected:
self.window.makeScreenshotClientAreaRegion()
if self.find_pattern_from_list(self.get_pattern('main', 'relic_ooze')):
with self.lock:
if self.click_location(self.window.getClientRegion().center(), refresh=True):
self.close_menu('relic_ooze')
self.relic_ooze_collected = True
def lvlup(self):
self.click_pattern('heroes_button', 'lvlup_active')
def ascend(self, ascension_life=3600, check_timer=60, check_progress=True, check_hero_souls=True):
self.window.makeScreenshotClientAreaRegion()
self.click_location(Location(1, 1), refresh=True)
curr_time = time.clock()
if curr_time - self.ascend_checker_time < check_timer:
return None
self.ascend_checker_time = curr_time
if curr_time - self.ascend_time < ascension_life:
return None
if self.got_heroes_souls == False and check_hero_souls:
got_heroes_souls = self.find_pattern_from_list(self.get_pattern('main', 'got_heroes_souls'))
if got_heroes_souls:
self.got_heroes_souls = True
else:
return None
progress_on = self.find_pattern_from_list(self.get_pattern('main', 'button_progression_on'))
if progress_on and check_progress:
return None
# if not self.find_pattern_from_list(self.get_pattern('main', 'button_ascend')):
# if self.get_hero_level('heroes', 'amenhotep') < 200:
# if not self.lvlup_hero('heroes', 'amenhotep', max_level=200):
# return None
cnt = 0
while not self.wait_for_pattern_list(self.get_pattern('main', 'button_ascend'), wait=1) and cnt < 10:
self.lvlup_hero('heroes', 'amenhotep', lvl_count=100)
cnt += 1
destroy_relics_pat = self.get_pattern('main', 'destroy_relics')
wish_to_ascend_pat = self.get_pattern('main', 'wish_to_ascend')
# Refresh screenshot
self.window.makeScreenshotClientAreaRegion()
# if self.find_pattern_from_list(self.get_pattern('main', 'button_ascend')):
if self.wait_for_pattern_list(self.get_pattern('main', 'button_ascend')):
with self.lock:
if self.click_pattern('main', 'button_ascend'):
if self.wait_for_pattern_list(destroy_relics_pat):
self.click_pattern('main', 'button_yes')
if self.wait_for_pattern_list(wish_to_ascend_pat):
if self.click_pattern('main', 'button_yes'):
time.sleep(5)
curr_time = time.clock()
self.menus['heroes']['last_available_hero'] = None
self.menus['heroes']['max_seen_hero'] = None
self.menus['heroes']['visible_heroes_cache'] = None
self.menus['heroes']['hero_level'] = {}
# self.save_heroes_levels('heroes', self.get_heroes_level_dict('heroes'))
self.save_container('heroes', 'hero_level', self.menus['heroes']['hero_level'])
self.starter_clicks = True
self.got_heroes_souls = False
self.ascend_time = curr_time
self.lvlup_all_heroes_time = curr_time
self.click_pattern('main', 'button_progression_off')
self.buy_quick_ascension()
def monster_clicker(self, count=100, cps=30):
for i in range(count):
if not self.click_monster(cps):
break
def collect_gilds(self):
self.window.makeScreenshotClientAreaRegion()
present_reg = self.find_pattern_from_list(self.get_pattern('main', 'transcension_highest_zone_gift'))
if present_reg:
with self.lock:
if self.click_pattern('main', 'transcension_highest_zone_gift'):
transcension_highest_zone_menu = self.get_pattern('main', 'transcension_highest_zone_menu')
if self.wait_for_pattern_list(transcension_highest_zone_menu):
if self.click_location(self.window.getClientRegion().center(), refresh=True):
self.close_menu('main')
def get_np_level(self):
next_lvl_button = self.find_pattern_from_list(self.get_pattern('main', 'levelnext'))
if next_lvl_button:
next_lvl_button = next_lvl_button[0]
else:
return None
prev_lvl_button = self.find_pattern_from_list(self.get_pattern('main', 'levelprev'))
if prev_lvl_button:
prev_lvl_button = prev_lvl_button[0]
else:
return None
x_n, y_n = next_lvl_button.center().get_xy()
x_p, y_p = prev_lvl_button.center().get_xy()
x_curr_level, y_curr_level = ((x_n + x_p) / 2, (y_n + y_p) / 2)
x_next_level = x_curr_level + 0.4 * (x_n - x_curr_level)
y_next_level = y_curr_level
x_prev_level = x_curr_level - 0.4 * (x_curr_level - x_p)
y_prev_level = y_curr_level
return (Location(x_prev_level, y_prev_level), Location(x_next_level, y_next_level))
def next_level(self):
skull_farm = self.find_pattern_from_list(self.get_pattern('main', 'skull_farm'))
if skull_farm:
return
np_level = self.get_np_level()
if np_level:
next_level = np_level[1]
else:
return None
if next_level:
self.click_location(next_level)
def prev_level(self):
np_level = self.get_np_level()
if np_level:
prev_level = np_level[0]
else:
return None
self.click_location(prev_level)
def progress_auto(self, farm_mode_timer=300, boss_timer=5):
curr_time = time.clock()
progress_off = self.find_pattern_from_list(self.get_pattern('main', 'button_progression_off'))
progress_on = self.find_pattern_from_list(self.get_pattern('main', 'button_progression_on'))
if progress_on and self.stuck_on_boss(boss_time=boss_timer, check_interval=1):
if self.try_skill_combos('869', '123457', '123'):
time.sleep(30)
self.prev_level()
self.farm_mode_start_time = curr_time
self.click_pattern('main', 'button_progression_off')
return True
if not progress_off:
return False
if progress_off and self.farm_mode_start_time is None:
self.farm_mode_start_time = curr_time
return True
if curr_time - self.farm_mode_start_time >= farm_mode_timer:
self.farm_mode_start_time = None
self.click_pattern('main', 'button_progression_off')
return True
def progress_manual(self, farm_mode_timer=300, boss_timer=10):
curr_time = time.clock()
if self.farm_mode_start_time and curr_time - self.farm_mode_start_time > farm_mode_timer:
self.farm_mode_start_time = None
# return False
if not self.farm_mode_start_time:
self.next_level()
if not self.farm_mode_start_time and self.stuck_on_boss(boss_time=boss_timer, check_interval=1):
if self.try_skill_combos('869', '123457', '123'):
time.sleep(30)
self.prev_level()
self.farm_mode_start_time = curr_time
return True
return False
def stuck_on_boss(self, boss_time, check_interval=5):
curr_time = time.clock()
if self.boss_time and curr_time - self.boss_check_time <= check_interval:
return False
self.boss_check_time = curr_time
boss_clock = self.find_pattern_from_list(self.get_pattern('main', 'boss_clock'))
if not boss_clock:
return False
skull_farm = self.find_pattern_from_list(self.get_pattern('main', 'skull_farm'))
if not skull_farm:
return False
if not self.levels_region:
next_lvl_button = self.find_pattern_from_list(self.get_pattern('main', 'levelnext'))[0]
prev_lvl_button = self.find_pattern_from_list(self.get_pattern('main', 'levelprev'))[0]
x_n, y_n = next_lvl_button.getBottomRight().get_xy()
x_p, y_p = prev_lvl_button.getTopLeft().get_xy()
x_c, y_c = (x_n + x_p) / 2, (y_n + y_p) / 2
h = (y_c - y_p) * 2
y_p = int(y_c - h)
y_n = int(y_c + h)
self.levels_region = Region.from2POINT(x_p, y_p, x_n, y_n)
time.sleep(1.1)
if self.boss_time is None:
self.boss_time = curr_time
self.levels_region_scrshot = self.window.makeScreenshotClientAreaRegion(self.levels_region)
# cv2.imshow('self.levels_region_scrshot', self.levels_region_scrshot.get_array())
# cv2.waitKey(50)
return False
self.boss_check_time = curr_time
if curr_time - self.boss_time >= boss_time:
levels_region_scrshot = self.window.makeScreenshotClientAreaRegion(self.levels_region)
levels_region_scrshot.set_name('123456')
# cv2.imshow('levels_region_scrshot',levels_region_scrshot.get_array())
# cv2.imshow('self.levels_region_scrshot', self.levels_region_scrshot.get_array())
# cv2.waitKey(50)
self.boss_time = None
if levels_region_scrshot.find_pattern(self.levels_region_scrshot):
# cv2.imshow('levels_region_scrshot',levels_region_scrshot.get_array())
# cv2.imshow('self.levels_region_scrshot', self.levels_region_scrshot.get_array())
# cv2.waitKey(50)
return True
return False
def progress_level(self, farm_mode_timer=300, boss_timer=30, progress_button_timer=30):
self.window.makeScreenshotClientAreaRegion()
curr_time = time.clock()
progress_button = None
if self.progress_button_time and curr_time - self.progress_button_time >= progress_button_timer:
progress_button = self.find_pattern_from_list(self.get_pattern('main', 'button_progression'))
if progress_button:
self.progress_button_time = None
else:
self.progress_button_time = curr_time
if progress_button or self.progress_button_time is None:
return self.progress_auto(farm_mode_timer=farm_mode_timer, boss_timer=boss_timer)
return self.progress_manual(farm_mode_timer=farm_mode_timer, boss_timer=boss_timer)
def get_cache_state(self):
return self.cache_state
def invalidate_cache_state(self):
self.cache_state = False
def validate_cache_state(self):
self.cache_state = True
def cache_state_is_invalid(self):
return not self.get_cache_state()
def cache_state_is_valid(self):
return self.get_cache_state()
# def buy_available_upgrades_old(self):
# self.window.makeScreenshotClientAreaRegion()
#
# menu_name = 'heroes'
# max_seen_hero = self.get_max_seen_hero(menu_name)
# if max_seen_hero is None:
# return None
# self.scroll_to_hero(menu_name, max_seen_hero)
# while not self.click_pattern('main', 'buy_available_upgrades_old'):
# self.scroll_menu(menu_name, WHEEL_DOWN)
# self.scroll_menu(menu_name, WHEEL_UP)
# self.scroll_menu(menu_name, WHEEL_DOWN)
def buy_available_upgrades(self, upgrades_timer=300):
curr_time = time.clock()
if curr_time - self.skills_upgrades_time < upgrades_timer:
return None
self.window.makeScreenshotClientAreaRegion()
menu_name = 'heroes'
max_seen_hero = self.get_max_seen_hero(menu_name)
if max_seen_hero is None:
return None
self.scroll_to_hero(menu_name, max_seen_hero)
cnt = 0
MAX_RETRY = 3
while cnt <= MAX_RETRY:
if not self.click_pattern('main', 'buy_available_upgrades'):
self.scroll_menu(menu_name, WHEEL_DOWN)
self.scroll_menu(menu_name, WHEEL_UP)
self.scroll_menu(menu_name, WHEEL_DOWN)
else:
self.skills_upgrades_time = time.clock()
return True
cnt += 1
self.window.makeScreenshotClientAreaRegion()
sorted_hero_list = self.get_sorted_heroes_list(menu_name)
if sorted_hero_list is None:
return None
heroes_upgraded_list = self.menus[menu_name]['heroes_upgraded_list']
# if heroes_upgraded_list is None:
# return None
# heroes_to_lvlup = [hero_name for hero_name in last_ascend_seen_heroes if hero_name not in heroes_upgraded_list]
# Make list from sorted heroes list up to max_seen_hero included.
# heroes_to_lvlup = list(itertools.takewhile(lambda x: x != max_seen_hero, sorted_hero_list))+[max_seen_hero]
heroes_to_lvlup = list(
itertools.takewhile(lambda x: x not in self.get_next_hero_name(menu_name, max_seen_hero), sorted_hero_list))
# Exclude from this list upgraded heroes
heroes_to_lvlup = [hero_name for hero_name in heroes_to_lvlup if hero_name not in heroes_upgraded_list]
for hero_name in heroes_to_lvlup:
###Buy heroes skill except ascension
hero_reg = self.scroll_to_hero(menu_name, hero_name)
hero_reg_scr = self.window.makeScreenshotClientAreaRegion(hero_reg)
ascend_skill_reg = None
if hero_name == 'amenhotep':
ascend_skill_reg = hero_reg_scr.find_pattern_from_list(
self.get_pattern('heroes_skills', 'amenhotep_ascend'),
cache=False)
if ascend_skill_reg:
ascend_skill_reg = ascend_skill_reg[0]
button_edge_reg = hero_reg_scr.find_pattern_from_list(self.get_pattern('heroes_button', 'edge_'),
cache=False)
if not button_edge_reg:
continue
button_edge_reg = button_edge_reg[0]
hero_name_reg = hero_reg_scr.find_pattern_from_list(self.get_pattern(menu_name, hero_name))
if hero_name_reg is None:
continue
hero_name_reg = hero_name_reg[0]
# skills_reg_left_x, skills_reg_left_y = button_edge_reg.center().get_xy()
skills_reg_left_x, skills_reg_left_y = button_edge_reg.getRight(), button_edge_reg.center().get_y()
skills_reg_right_x = hero_name_reg.getRight()
y = hero_reg.getTop() + skills_reg_left_y
for i in range(100):
x = hero_reg.getLeft() + skills_reg_left_x + int(
random.random() * (skills_reg_right_x - skills_reg_left_x))
if ascend_skill_reg and ascend_skill_reg.contains((x - hero_reg.getLeft(), y - hero_reg.getTop())):
continue
# hero_reg_scr = self.window.makeScreenshotClientAreaRegion(hero_reg)
# cv2.imshow("hero_reg_scr", hero_reg_scr.get_array())
# cv2.waitKey(10)
# skills_reg = hero_reg_scr.find_pattern_from_list(self.get_pattern('heroes_skills', '%s_c' % hero_name))
# if skills_reg:
# break
self.window.click(x, y, cps=30)
hero_reg_scr = self.window.makeScreenshotClientAreaRegion(hero_reg)
skills_reg = hero_reg_scr.find_pattern_from_list(self.get_pattern('heroes_skills', '%s_c' % hero_name))
if skills_reg:
# heroes_upgraded_list.remove(hero_name)
heroes_upgraded_list = self.menus[menu_name]['heroes_upgraded_list']
heroes_upgraded_list.append(hero_name)
self.save_container(menu_name, 'heroes_upgraded_list', heroes_upgraded_list)
self.skills_upgrades_time = time.clock()
return True
def buy_quick_ascension(self):
self.window.makeScreenshotClientAreaRegion()
self.close_menu()
with self.window.lock:
if self.click_pattern('main', 'button_shop'):
if self.wait_for_pattern_list(self.get_pattern('shop', 'shop_title')):
self.click_pattern('shop', 'button_buy_quick_ascension')
if self.wait_for_pattern_list(self.get_pattern('shop', 'buy_confirm')):
self.click_pattern('shop', 'button_yes')
if self.wait_for_pattern_list(self.get_pattern('shop', 'title_thank_you')):
# Close all shop submenu
self.click_pattern('shop', 'button_close_menu', all=True)
# self.click_pattern('shop', 'button_okey')
# if self.wait_for_pattern_list(self.get_pattern('shop', 'shop_title')):
else:
if self.wait_for_pattern_list(self.get_pattern('shop', 'title_you_need_more_rubies')):
self.click_pattern('shop', 'button_close_menu', all=True)
# self.click_pattern('shop', 'button_no')
return False
return True
def try_skill_combos(self, *args):
def is_skill_combo_available(skill_combo):
for sn in skill_combo:
if not self.find_pattern_from_list(self.get_pattern('skills', 'skill_%s' % sn)):
if DEBUG:
print("try_skill_combos: skill %s is not ready yet. Try another combo" % sn)
return False
return True
self.window.makeScreenshotClientAreaRegion()
for combo in args:
if is_skill_combo_available(combo):
if DEBUG:
print("try_skill_combos: Combo %s is ready to activate" % combo)
self.window.pressKeyList(combo)
return True
return False
def start_play(self):
if self.click_pattern('buttons', 'button_play'):
if self.click_pattern('buttons', 'button_close_menu', all=True):
return True
return None
class Window:
def __init__(self, hwnd, lock):
self.hwnd = hwnd
self.lock = lock
self.screenshot = None
self.last_click_location = (None, None)
if DEBUG:
print("Window:_init_:hwnd=%s" % (hwnd))
winLong = GetWindowLong(self.hwnd, GWL_STYLE)
# SetWindowLong(hwnd, GWL_STYLE, winLong & ~WS_SIZEBOX)
# # SetWindowLong(self.hwnd, GWL_STYLE, winLong |WS_SYSMENU|WS_CAPTION| WS_MAXIMIZEBOX | WS_MINIMIZEBOX)
# SetWindowLong(self.hwnd, GWL_STYLE, winLong |WS_SYSMENU|WS_CAPTION| ~WS_MAXIMIZEBOX | ~WS_MINIMIZEBOX)
pass
def move(self, x, y):
reg = self.getWindowRegion()
SetWindowPos(self.hwnd,
HWND_TOP,
x,
y,
reg.w,
reg.h,
0)
def resize(self, width, height):
reg = self.getWindowRegion()
SetWindowPos(self.hwnd,
HWND_TOP,
reg.x,
reg.y,
width,
height,
0)
def resizeRel(self, dwidth, dheight):
reg = self.getWindowRegion()
SetWindowPos(self.hwnd,
HWND_TOP,
reg.x,
reg.y,
reg.w + dwidth,
reg.h + dheight,
0)
def resizeCliPropW(self, width):
self.resizeClientArea(width, int(width * 9.0 / 16))
def resizeCliPropH(self, height):
self.resizeClientArea(int(round(height * 16.0 / 9)), height)
def resizeClientArea(self, width, height):
cliReg = self.getClientRegion()
dx = width - cliReg.getWidth()
dy = height - cliReg.getHeight()
self.resizeRel(dx, dy)
def getClientRegion(self):
cliRect = RECT()
GetClientRect(self.hwnd, ctypes.byref(cliRect))
return Region(cliRect.left, cliRect.top, cliRect.right - cliRect.left, cliRect.bottom - cliRect.top)
def getWidth(self):
return self.getClientRegion().getWidth()
def getHeight(self):
return self.getClientRegion().getHeight()
def get_size(self):
return self.getClientRegion().get_size()
def getWindowRegion(self):
winRect = RECT()
GetWindowRect(self.hwnd, ctypes.byref(winRect))
return Region(winRect.left, winRect.top, winRect.right - winRect.left, winRect.bottom - winRect.top)
def getRegionScreenShot(self, region):
# Not implemented; use makeScreenshotClientAreaRegion(region) instead
raise NotImplementedError
def pressKey(self, char):
with self.lock:
SendMessage(self.hwnd, WM_KEYDOWN, charToKeyCode(char), 1)
# time.sleep(0.1)
SendMessage(self.hwnd, WM_KEYUP, charToKeyCode(char), 1)
def pressAndHoldKey(self, char):
with self.lock:
SendMessage(self.hwnd, WM_KEYDOWN, charToKeyCode(char), 1)
def releaseKey(self, char):
with self.lock:
SendMessage(self.hwnd, WM_KEYUP, charToKeyCode(char), 1)
def pressKeyList(self, chars):
with self.lock:
for c in chars:
self.pressKey(c)
return
def getScreenshot(self, region=None):
if region:
return self.screenshot.crop(region)
return self.screenshot
def getScreenshotCliRegion(self, name):
return self.getClientRegion().getScreenshot()
def makeScreenshotClientAreaRegion(self, region=None):
with self.lock:
isIconic = IsIconic(self.hwnd)
winLong = None
# ShowWindow(self.hwnd, SW_HIDE)
if isIconic:
animationInfo = ANIMATIONINFO()
animationInfo.iMinAnimate = 0
animationInfo.cbSize = ctypes.sizeof(ANIMATIONINFO)
winLong = GetWindowLong(self.hwnd, GWL_EXSTYLE)
SetWindowLong(self.hwnd, GWL_EXSTYLE, winLong | WS_EX_LAYERED)
SetLayeredWindowAttributes(self.hwnd, 0, 0, LWA_ALPHA)
# SystemParametersInfo(SPI_GETANIMATION, animationInfo.cbSize,ctypes.byref(animationInfo), 0)
SystemParametersInfo(SPI_SETANIMATION, animationInfo.cbSize, ctypes.byref(animationInfo),
SPIF_SENDCHANGE)
ShowWindow(self.hwnd, SW_SHOWNOACTIVATE)
wr = RECT()
cliRect = RECT()
GetClientRect(self.hwnd, ctypes.byref(cliRect))
if region is None:
x = 0
y = 0
w = cliRect.right
h = cliRect.bottom
else:
ir = region.intersection(Region.fromRECT(cliRect))
if ir is None:
raise Exception(
'Region ' + str(region) + ' is not intersect with client area rectangle' + str(cliRect))
x = ir.x
y = ir.y
w = ir.w
h = ir.h
# w = cliRect.right
# h = cliRect.bottom
# x = region.get_x()
# y = region.get_y()
# w = region.getWidth()
# h = region.getHeight()
hDC = GetDC(self.hwnd)
myDC = CreateCompatibleDC(hDC)
myBitMap = CreateCompatibleBitmap(hDC, w, h)
SelectObject(myDC, myBitMap)
BitBlt(myDC, 0, 0, w, h, hDC, x, y, SRCCOPY)
if isIconic:
ShowWindow(self.hwnd, SW_SHOWMINNOACTIVE)
SetWindowLong(self.hwnd, GWL_EXSTYLE, winLong)
# SystemParametersInfo(SPI_GETANIMATION, animationInfo.cbSize,ctypes.byref(animationInfo), 0)
animationInfo = ANIMATIONINFO()
animationInfo.iMinAnimate = 1
animationInfo.cbSize = ctypes.sizeof(ANIMATIONINFO)
SystemParametersInfo(SPI_SETANIMATION, animationInfo.cbSize, ctypes.byref(animationInfo),
SPIF_SENDCHANGE)
bmpScreen = BITMAP()
GetObject(myBitMap, ctypes.sizeof(BITMAP), ctypes.byref(bmpScreen))
bi = BITMAPINFOHEADER()
bi.biSize = ctypes.sizeof(BITMAPINFOHEADER)
bi.biWidth = bmpScreen.bmWidth
bi.biHeight = bmpScreen.bmHeight
bi.biPlanes = 1
bi.biBitCount = bmpScreen.bmBitsPixel
bi.biCompression = BI_RGB
bi.biSizeImage = 0
bi.biXPelsPerMeter = 0
bi.biYPelsPerMeter = 0
bi.biClrUsed = 0
bi.biClrImportant = 0
img = np.empty((h, w, int(bmpScreen.bmBitsPixel / 8)), np.uint8)
winplace = WINDOWPLACEMENT()
GetWindowPlacement(self.hwnd, ctypes.byref(winplace))
wr = winplace.rcNormalPosition
if (GetDIBits(hDC, myBitMap, 0,
bmpScreen.bmHeight,
ctypes.c_void_p(img.ctypes.data),
ctypes.byref(bi), DIB_RGB_COLORS) == 0):
print("makeScreenshotClientAreaRegion: GetDIBits = 0")
# Release GDI handles before bailing out
DeleteDC(myDC)
DeleteObject(myBitMap)
ReleaseDC(self.hwnd, hDC)
return None
DeleteDC(myDC)
DeleteObject(myBitMap)
ReleaseDC(self.hwnd, hDC)
screenshot = Image.fromArray(cv2.flip(img, 0))
screenshot.set_name('Screenshot of %s %s' % (self.hwnd, id(screenshot)))
if region is None:
self.screenshot = screenshot
return screenshot
def scroll(self, direction, x=1, y=1):
with self.lock:
tmp = (y << 16) | x
time1 = time.clock()
err = 0
err += SendMessage(self.hwnd, WM_MOUSEWHEEL,
(WHEEL_DELTA * direction) << 16, tmp)
time.sleep(0.02)
x = 1
y = 1
tmp = (y << 16) | x
err += SendMessage(self.hwnd, WM_MOUSEMOVE, 0, tmp)
time2 = time.clock()
if time2 - time1 > 1 or err > 0:
print("scroll: got delay > 1 sec %s err %s" % (time2 - time1, err))
def scrollDown(self, x=1, y=1):
tmp = (y << 16) | x
SendMessage(self.hwnd, WM_MOUSEWHEEL,
(WHEEL_DELTA * -1) << 16, tmp)
def scrollUp(self, x=1, y=1):
tmp = (y << 16) | x
SendMessage(self.hwnd, WM_MOUSEWHEEL,
(WHEEL_DELTA * 1) << 16, tmp)
def click(self, x, y, refresh=False, park=True, cps=30):
x = int(x)
y = int(y)
self.last_click_location = (x, y)
tmp = (y << 16) | x
delay = 1 / cps
# if park:
# delay /= 2
with self.lock:
err = 0
time1 = time.clock()
err += SendMessage(self.hwnd, WM_LBUTTONDOWN, 0, tmp)
err += SendMessage(self.hwnd, WM_LBUTTONUP, 0, tmp)
time2 = time.clock()
if time2 - time1 > 1 or err > 0:
print("click: got delay > 1 sec %s err %s" % (time2 - time1, err))
time.sleep(delay)
if park:
x = 1
y = 1
tmp = (y << 16) | x
err += SendMessage(self.hwnd, WM_MOUSEMOVE, 0, tmp)
# time.sleep(delay / 2)
if refresh:
self.makeScreenshotClientAreaRegion()
if err > 0:
return None
return True
def move_mouse(self, x=None, y=None, refresh=False, park=True, cps=30):
l_x, l_y = self.last_click_location
if l_x is None or l_y is None:
# No previous click location to start from
return None
xc, yc = l_x, l_y
steps = 30
if x:
dx = (x - l_x) / steps
else:
dx = 0
if y:
dy = (y - l_y) / steps
else:
dy = 0
for i in range(steps):
xc += dx
yc += dy
xi, yi = int(xc), int(yc)
tmp = (yi << 16) | xi
delay = 1 / cps
err = 0
with self.lock:
err += SendMessage(self.hwnd, WM_MOUSEMOVE, 0, tmp)
time.sleep(delay)
if err > 0:
return None
return True
def click_location(self, loc, refresh=False, park=True, cps=50):
return self.click(loc.get_x(), loc.get_y(), refresh=refresh, park=park, cps=cps)
def click_region(self, reg, refresh=False, park=True, cps=30):
x, y = reg.center().get_xy()
return self.click(x, y, refresh=refresh, park=park, cps=cps)
class Location:
def __init__(self, x, y):
self.x = x
self.y = y
def get_x(self):
return self.x
def get_y(self):
return self.y
def set(self, x, y):
self.x = x
self.y = y
def get_xy(self):
return (self.x, self.y)
class Region:
def __init__(self, x, y, w, h):
self.x = x
self.y = y
self.w = w
self.h = h
# def get_xy(self):
# self.hwnd = hwnd
# r = RECT()
# GetWindowRect(hwnd, ctypes.byref(r))
# (self.x, self.y, self.w, self.h) = (r.left, r.top, r.right - r.left, r.bottom - r.top)
@classmethod
def fromRECT(cls, rect):
return cls(rect.left, rect.top, rect.right - rect.left, rect.bottom - rect.top)
@classmethod
def from2POINT(cls, left, top, right, bottom):
return cls(left, top, right - left, bottom - top)
@classmethod
def from2Location(cls, l1, l2):
x1, y1 = l1.get_xy()
x2, y2 = l2.get_xy()
w = x2 - x1
h = y2 - y1
return cls(x1, y1, w, h)
def getTopLeft(self):
return Location(self.getLeft(), self.getTop())
def getTopRight(self):
return Location(self.getRight(), self.getTop())
# def __eq__(self, other):
# if isinstance(other, Region):
# return self.is_intersect(other)
# return NotImplemented
#
# def __hash__(self):
# # return hash((self.x,self.y,self.w,self.h))
# return 1
def getBottomLeft(self):
return Location(self.getLeft(), self.getBottom())
def getBottomRight(self):
return Location(self.getRight(), self.getBottom())
def resize(self, x, y, w, h):
return 1
def getX(self):
return self.x
def getY(self):
return self.y
def getWidth(self):
return self.w
def getHeight(self):
return self.h
def get_size(self):
return self.w, self.h
def getLeft(self):
return self.x
def getRight(self):
return self.x + self.w
def getTop(self):
return self.y
def getBottom(self):
return self.y + self.h
def setLeft(self, left):
self.x = left
def setRight(self, right):
self.w = right - self.x
def setTop(self, top):
self.y = top
def setBottom(self, bottom):
self.h = bottom - self.y
def center(self):
return Location(int(self.x + self.w / 2), int(self.y + self.h / 2))
###################################
#
# t1
# |----------------|
# l1| |r1
# | |
# | maxt | t2
# | |-------|------|
# | maxl|#######|minr |
# |--------|-------| |
# b1 | minb |
# | |
# l2| |r2
# | |
# |--------------|
# b2
#
###################################
def intersection(self, region):
t1 = self.getTop()
b1 = self.getBottom()
l1 = self.getLeft()
r1 = self.getRight()
t2 = region.getTop()
b2 = region.getBottom()
l2 = region.getLeft()
r2 = region.getRight()
maxt = max(t1, t2)
minb = min(b1, b2)
maxl = max(l1, l2)
minr = min(r1, r2)
if not (maxt < minb and maxl < minr):
return None
return Region(maxl, maxt, minr - maxl, minb - maxt)
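The max/min arithmetic above (and the ASCII diagram) can be checked with a standalone sketch over plain `(x, y, w, h)` tuples (the helper name is illustrative):

```python
def intersect(r1, r2):
    # Each region is (x, y, w, h); returns the overlap, or None if
    # the regions do not overlap (touching edges count as no overlap).
    maxl = max(r1[0], r2[0])
    maxt = max(r1[1], r2[1])
    minr = min(r1[0] + r1[2], r2[0] + r2[2])
    minb = min(r1[1] + r1[3], r2[1] + r2[3])
    if not (maxl < minr and maxt < minb):
        return None
    return (maxl, maxt, minr - maxl, minb - maxt)

print(intersect((0, 0, 10, 10), (5, 5, 10, 10)))  # → (5, 5, 5, 5)
print(intersect((0, 0, 4, 4), (5, 5, 2, 2)))      # → None
```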
def is_intersect(self, region):
t1 = self.getTop()
b1 = self.getBottom()
l1 = self.getLeft()
r1 = self.getRight()
t2 = region.getTop()
b2 = region.getBottom()
l2 = region.getLeft()
r2 = region.getRight()
maxt = max(t1, t2)
minb = min(b1, b2)
maxl = max(l1, l2)
minr = min(r1, r2)
if (maxt > minb or maxl > minr):
return False
return True
def contains(self, loc):
return self.getLeft() <= loc[0] <= self.getRight() and self.getTop() <= loc[1] <= self.getBottom()
def __add__(self, x):
return Region(self.getX() + x[0], self.getY() + x[1], self.getWidth(), self.getHeight())
def get_collectables(click_lock, start_barrier):
ch = ClickerHeroes(click_lock)
print("get_collectables: Started")
ch.start_play()
start_barrier.wait()
while True:
# try:
ch.collect_fish()
ch.collect_gilds()
if ch.collect_newyear():
ch.monster_clicker(count=750)
ch.collect_relic_ooze()
ch.monster_clicker()
# except Exception as e:
# print("get_collectables:Exception:%s" % repr(e))
# continue
def levelup_heroes(click_lock, start_barrier):
start_barrier.wait()
print("levelup_heroes: Started")
ch = ClickerHeroes(click_lock)
i = 0
while True:
# try:
time.sleep(10)
i += 1
# time1=time.clock()
ch.window.makeScreenshotClientAreaRegion()
cv2.imwrite('D:\\tmp\\scr\\scr_%d.png' % i, ch.window.getScreenshot().get_array(), [cv2.IMWRITE_PNG_COMPRESSION, 9])
# cv2.imshow('Test screenshot', ch.window.getScreenshot().get_array())
# cv2.waitKey(10)
# time.sleep(10)
# time2 = time.clock()
# print("%d time2-time1= %s" % (i,time2-time1))
# ch.buy_quick_ascension()
ch.reindex_heroes_list('heroes')
# if ch.lvlup_all_heroes('heroes', max_level=150, timer=600):
# continue
ch.lvlup_top_heroes('heroes')
# ch.buy_quick_ascension()
# ch.lvlup_all_heroes('heroes', timer=1800)
# ch.buy_available_upgrades(upgrades_timer=1800)
# ch.ascend(ascension_life=7200, check_timer=30, check_progress=False)
# except Exception as e:
# print("levelup_heroes:Exception:%s" % repr(e))
# continue
def progress_levels(click_lock, start_barrier):
start_barrier.wait()
print("progress_levels: Started")
ch = ClickerHeroes(click_lock)
while True:
# try:
# img = ch.window.getScreenshot().get_canny_array()
# cv2.imshow('Clicker Heroes', img)
ch.progress_level(farm_mode_timer=180, boss_timer=30)
# ch.try_skill_combos('12345')
# ch.try_skill_combos('869', '123457','123')
# except Exception as e:
# print("progress_levels:Exception:%s" % repr(e))
# continue
if __name__ == '__main__':
c_lock = multiprocessing.RLock()
start_condition = multiprocessing.Condition()
mp_target = [progress_levels, get_collectables, levelup_heroes]
# mp_target = [progress_levels]
# mp_target = [levelup_heroes]
start_barrier = multiprocessing.Barrier(len(mp_target))
proc = []
for target in mp_target:
proc.append(Process(target=target, args=(c_lock, start_barrier,)))
for p in proc:
p.start()
ch = ClickerHeroes(c_lock)
while True:
time.sleep(1)
if DEBUG:
print("Bot is running")
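The `__main__` block above starts its three workers against a shared `multiprocessing.Barrier` and `RLock` so none begins before the others are ready. A minimal sketch of the same start-synchronization pattern, shown with threads for brevity (the worker names here are illustrative, not from the bot):

```python
import threading

results = []

def worker(name, start_barrier, lock):
    # Block until every worker has reached this point, mirroring
    # start_barrier.wait() in the processes above.
    start_barrier.wait()
    with lock:  # serialize the shared work, like the bot's click_lock
        results.append(name)

barrier = threading.Barrier(3)
lock = threading.RLock()
threads = [threading.Thread(target=worker, args=(n, barrier, lock))
           for n in ("progress", "collect", "levelup")]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(sorted(results))  # ['collect', 'levelup', 'progress']
```

The barrier guarantees no worker runs ahead while the others are still constructing their state, which is exactly why the bot passes `start_barrier` to every `Process`.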
|
Sebastian's Table closing on Saturday. To be replaced by second Honest Abe's.
Billy Joel coming to PBA on March 24.
Cottonwood and Meadowlark coffee shops are closed. Tax reasons. May or may not reopen. Story coming soon at JournalStar.com.
|
# -*- coding: utf-8 -*-
import json
from tempfile import NamedTemporaryFile
from zipfile import ZipFile
from pprint import pprint
URLS = {
'CREATE_REPO': 'repositories/',
'CREATE_REPO_V2': 'repositories/%(username)s/%(repo_slug)s/',
'GET_REPO': 'repositories/%(username)s/%(repo_slug)s/',
'UPDATE_REPO': 'repositories/%(username)s/%(repo_slug)s/',
'DELETE_REPO': 'repositories/%(username)s/%(repo_slug)s/',
'GET_USER_REPOS': 'user/repositories/',
# Get archive
'GET_ARCHIVE': 'repositories/%(username)s/%(repo_slug)s/%(format)s/master/',
}
class Repository(object):
""" This class provides repository-related methods to Bitbucket objects."""
def __init__(self, bitbucket):
self.bitbucket = bitbucket
self.bitbucket.URLS.update(URLS)
def _get_files_in_dir(self, repo_slug=None, owner=None, dir='/'):
repo_slug = repo_slug or self.bitbucket.repo_slug or ''
owner = owner or self.bitbucket.username
dir = dir.lstrip('/')
url = self.bitbucket.url(
'GET_ARCHIVE',
username=owner,
repo_slug=repo_slug,
format='src')
dir_url = url + dir
response = self.bitbucket.dispatch('GET', dir_url, auth=self.bitbucket.auth)
if response[0] and isinstance(response[1], dict):
repo_tree = response[1]
url = self.bitbucket.url(
'GET_ARCHIVE',
username=owner,
repo_slug=repo_slug,
format='raw')
# Download all files in dir
for file in repo_tree['files']:
file_url = url + '/'.join((file['path'],))
response = self.bitbucket.dispatch('GET', file_url, auth=self.bitbucket.auth)
self.bitbucket.repo_tree[file['path']] = response[1]
# recursively download in dirs
for directory in repo_tree['directories']:
dir_path = '/'.join((dir, directory))
self._get_files_in_dir(repo_slug=repo_slug, owner=owner, dir=dir_path)
def public(self, username=None):
""" Returns all public repositories from an user.
If username is not defined, tries to return own public repos.
"""
username = username or self.bitbucket.username or ''
url = self.bitbucket.url('GET_USER', username=username)
response = self.bitbucket.dispatch('GET', url)
try:
return (response[0], response[1]['repositories'])
except TypeError:
pass
return response
def all(self, owner=None):
""" Return all repositories owned by a given owner """
owner = owner or self.bitbucket.username
url = self.bitbucket.url('GET_USER', username=owner)
response = self.bitbucket.dispatch('GET', url, auth=self.bitbucket.auth)
try:
return (response[0], response[1]['repositories'])
except TypeError:
pass
return response
def team(self, include_owned=True):
"""Return all repositories for which the authenticated user is part of
the team.
If include_owned is True (default), repos owned by the user are
included (and therefore this is a superset of the repos returned by
all()).
If include_owned is False, only repositories owned by other
users are returned.
"""
url = self.bitbucket.url('GET_USER_REPOS')
status, repos = self.bitbucket.dispatch('GET', url, auth=self.bitbucket.auth)
if status and not include_owned:
return status,[r for r in repos if r['owner'] != self.bitbucket.username]
return status, repos
def get(self, repo_slug=None, owner=None):
""" Get a single repository on Bitbucket and return it."""
repo_slug = repo_slug or self.bitbucket.repo_slug or ''
owner = owner or self.bitbucket.username
url = self.bitbucket.url('GET_REPO', username=owner, repo_slug=repo_slug)
return self.bitbucket.dispatch('GET', url, auth=self.bitbucket.auth)
def create(self, repo_name=None, repo_slug=None, owner=None, scm='git', private=True, **kwargs):
""" Creates a new repository on a Bitbucket account and return it."""
repo_slug = repo_slug or self.bitbucket.repo_slug or ''
if owner:
url = self.bitbucket.url_v2('CREATE_REPO_V2', username=owner, repo_slug=repo_slug)
else:
owner = self.bitbucket.username
url = self.bitbucket.url('CREATE_REPO')
return self.bitbucket.dispatch('POST', url, auth=self.bitbucket.auth, name=repo_name, scm=scm, is_private=private, **kwargs)
def update(self, repo_slug=None, owner=None, **kwargs):
""" Updates repository on a Bitbucket account and return it."""
repo_slug = repo_slug or self.bitbucket.repo_slug or ''
owner = owner or self.bitbucket.username
url = self.bitbucket.url('UPDATE_REPO', username=owner, repo_slug=repo_slug)
return self.bitbucket.dispatch('PUT', url, auth=self.bitbucket.auth, **kwargs)
def delete(self, repo_slug=None, owner=None):
""" Delete a repository on own Bitbucket account.
Please use with caution as there is NO confirmation and NO undo.
"""
repo_slug = repo_slug or self.bitbucket.repo_slug or ''
owner = owner or self.bitbucket.username
url = self.bitbucket.url_v2('DELETE_REPO', username=owner, repo_slug=repo_slug)
return self.bitbucket.dispatch('DELETE', url, auth=self.bitbucket.auth)
def archive(self, repo_slug=None, owner=None, format='zip', prefix=''):
""" Get one of your repositories and compress it as an archive.
Return the path of the archive.
format parameter is currently not supported.
"""
owner = owner or self.bitbucket.username
prefix = ('%s' % prefix).lstrip('/')
self._get_files_in_dir(repo_slug=repo_slug, owner=owner, dir='/')
if self.bitbucket.repo_tree:
with NamedTemporaryFile(delete=False) as archive:
with ZipFile(archive, 'w') as zip_archive:
for name, f in self.bitbucket.repo_tree.items():
with NamedTemporaryFile(delete=False) as temp_file:
if isinstance(f, dict):
f = json.dumps(f)
try:
temp_file.write(f.encode('utf-8'))
except UnicodeDecodeError:
temp_file.write(f)
zip_archive.write(temp_file.name, prefix + name)
return (True, archive.name)
return (False, 'Could not archive your project.')
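The `archive()` method above builds a zip by round-tripping every entry through a `NamedTemporaryFile`. A self-contained sketch of the same step, assuming `repo_tree` is a path-to-content dict like `self.bitbucket.repo_tree`; it uses `ZipFile.writestr`, which avoids the per-entry temp files (the sample tree is made up):

```python
import json
from tempfile import NamedTemporaryFile
from zipfile import ZipFile

def build_archive(repo_tree, prefix=''):
    """Write each path -> content entry into a zip, like Repository.archive()."""
    with NamedTemporaryFile(delete=False, suffix='.zip') as archive:
        with ZipFile(archive, 'w') as zip_archive:
            for name, content in repo_tree.items():
                if isinstance(content, dict):  # the API may hand back parsed JSON
                    content = json.dumps(content)
                zip_archive.writestr(prefix + name, content)
    return archive.name

path = build_archive({'README': 'hello', 'meta': {'ok': True}}, prefix='repo/')
with ZipFile(path) as z:
    print(sorted(z.namelist()))  # ['repo/README', 'repo/meta']
```

`writestr` takes the content directly, so there is nothing to clean up besides the archive itself; the method above leaves its per-entry temp files behind because it never deletes them.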
|
Aabeeabbdcda Fabulous Blue Floral Wall Decor is just one of the many collections of Sample Home Decorating Ideas Reference that we have on this website. We have a lot of Home Decorating Ideas, interior decorating ideas, and other related things on this website. We're not just providing info about them; you can also get a lot more references to create your dream home. So don't forget to keep visiting Interactifideas.net to get the latest updates about Home Decorating Ideas and more.
Aabeeabbdcda Fabulous Blue Floral Wall Decor was posted on July 25, 2018 at 9:24 pm and has been viewed by 26 users. Click it to download the Aabeeabbdcda Fabulous Blue Floral Wall Decor.
|
from __future__ import unicode_literals
from .common import InfoExtractor
from .youtube import YoutubeIE
class UnityIE(InfoExtractor):
_VALID_URL = r'https?://(?:www\.)?unity3d\.com/learn/tutorials/(?:[^/]+/)*(?P<id>[^/?#&]+)'
_TESTS = [{
'url': 'https://unity3d.com/learn/tutorials/topics/animation/animate-anything-mecanim',
'info_dict': {
'id': 'jWuNtik0C8E',
'ext': 'mp4',
'title': 'Live Training 22nd September 2014 - Animate Anything',
'description': 'md5:e54913114bd45a554c56cdde7669636e',
'duration': 2893,
'uploader': 'Unity',
'uploader_id': 'Unity3D',
'upload_date': '20140926',
}
}, {
'url': 'https://unity3d.com/learn/tutorials/projects/2d-ufo-tutorial/following-player-camera?playlist=25844',
'only_matching': True,
}]
def _real_extract(self, url):
video_id = self._match_id(url)
webpage = self._download_webpage(url, video_id)
youtube_id = self._search_regex(
r'data-video-id="([_0-9a-zA-Z-]+)"',
webpage, 'youtube ID')
return self.url_result(youtube_id, ie=YoutubeIE.ie_key(), video_id=video_id)
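`_real_extract` only needs to pull the YouTube ID out of a `data-video-id` attribute before delegating to the YouTube extractor. A quick sketch of that regex against a made-up page snippet:

```python
import re

# Illustrative HTML; a real Unity tutorial page embeds the same attribute.
webpage = '<div class="video" data-video-id="jWuNtik0C8E"></div>'
youtube_id = re.search(r'data-video-id="([_0-9a-zA-Z-]+)"', webpage).group(1)
print(youtube_id)  # jWuNtik0C8E
```

The character class matches the full YouTube ID alphabet (letters, digits, `_`, `-`), which is why the extractor can hand the captured group straight to `url_result`.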
|
Providing year-round performance for total home comfort, heat pumps are a great solution for your home comfort system. That's because they work to provide both heating and cooling. Whether it's the hottest day of the summer or the coldest day of winter, Trane heat pumps work day in and day out to keep your family in premium comfort.
|
# -*- coding: utf-8 -*-
# Copyright 2010 British Broadcasting Corporation and Kamaelia Contributors(1)
#
# (1) Kamaelia Contributors are listed in the AUTHORS file and at
# http://www.kamaelia.org/AUTHORS - please extend this file,
# not this notice.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import pygame
import Axon
from Axon.Ipc import producerFinished, WaitComplete
from Kamaelia.UI.Pygame.Display import PygameDisplay
from Kamaelia.UI.Pygame.Button import Button
from ColourSelector import ColourSelector
from Slider import Slider
class ToolBox(Axon.Component.component):
Inboxes = {"inbox" : "Receive events from Pygame Display",
"control" : "For shutdown messages",
"callback" : "Receive callbacks from Pygame Display",
"buttons" : "Receive interrupts from the buttons"
}
Outboxes = {"outbox" : "XY positions emitted here",
"signal" : "For shutdown messages",
"display_signal" : "Outbox used for communicating to the display surface"
}
def __init__(self, position=None, size=(500,500)):
"""x.__init__(...) initializes x; see x.__class__.__doc__ for signature"""
super(ToolBox,self).__init__()
self.size = size
self.dispRequest = { "DISPLAYREQUEST" : True,
"callback" : (self,"callback"),
"events" : (self, "inbox"),
"size": self.size,
"transparency" : None }
if position is not None:
self.dispRequest["position"] = position
def waitBox(self,boxname):
"""Generator. yields 1 until data ready on the named inbox."""
waiting = True
while waiting:
if self.dataReady(boxname): return
else: yield 1
def main(self):
"""Main loop."""
displayservice = PygameDisplay.getDisplayService()
self.link((self,"display_signal"), displayservice)
self.send( self.dispRequest,
"display_signal")
for _ in self.waitBox("callback"): yield 1
self.display = self.recv("callback")
# tool buttons
circleb = Button(caption="Circle",position=(10,10), msg = (("Tool", "Circle"),)).activate()
eraseb = Button(caption="Eraser",position=(100,10), msg = (("Tool", "Eraser"),)).activate()
lineb = Button(caption="Line",position=(10,50), msg = (("Tool", "Line"),)).activate()
bucketb = Button(caption="Bucket",position=(10,90), msg = (("Tool", "Bucket"),)).activate()
eyeb = Button(caption="Eyedropper",position=(10,130), msg = (("Tool", "Eyedropper"),)).activate()
addlayerb = Button(caption="Add Layer",position=(10,540), msg = (("Layer", "Add"),)).activate()
prevlayerb = Button(caption="<-",position=(80,540), msg = (("Layer", "Prev"),)).activate()
nextlayerb = Button(caption="->",position=(110,540), msg = (("Layer", "Next"),)).activate()
dellayerb = Button(caption="Delete",position=(140,540), msg = (("Layer", "Delete"),)).activate()
self.link( (circleb,"outbox"), (self,"outbox"), passthrough = 2 )
self.link( (eraseb,"outbox"), (self,"outbox"), passthrough = 2 )
self.link( (lineb,"outbox"), (self,"outbox"), passthrough = 2 )
self.link( (bucketb,"outbox"), (self,"outbox"), passthrough = 2 )
self.link( (eyeb,"outbox"), (self,"outbox"), passthrough = 2 )
self.link( (addlayerb,"outbox"), (self,"outbox"), passthrough = 2 )
self.link( (prevlayerb,"outbox"), (self,"outbox"), passthrough = 2 )
self.link( (nextlayerb,"outbox"), (self,"outbox"), passthrough = 2 )
self.link( (dellayerb,"outbox"), (self,"outbox"), passthrough = 2 )
colSel = ColourSelector(position = (10,170), size = (255,255)).activate()
self.link( (colSel,"outbox"), (self,"outbox"), passthrough = 2 )
SizeSlider = Slider(size=(255, 50), messagePrefix = "Size", position = (10, 460), default = 9).activate()
self.link( (SizeSlider,"outbox"), (self,"outbox"), passthrough = 2 )
AlphaSlider = Slider(size=(255, 10), messagePrefix = "Alpha", position = (10, 515), default = 255).activate()
self.link( (AlphaSlider,"outbox"), (self,"outbox"), passthrough = 2 )
self.drawBG()
done = False
while not done:
if not self.anyReady():
self.pause()
yield 1
def drawBG(self):
self.display.fill( (255,255,255) )
if __name__ == "__main__":
from Kamaelia.Chassis.Pipeline import Pipeline
from Kamaelia.Util.Console import ConsoleEchoer
Pipeline(ToolBox(size = (275,600)), ConsoleEchoer()).run()
# ToolBox(size = (275,600)).activate()
Axon.Scheduler.scheduler.run.runThreads()
|
>>> Thunderbird inbox? Since it's quite new, nothing seems to support it.
> inbox as requested by the OP.
> 0.8, or 0.9. I haven't tried on 1.0 yet.
Thank you guys for the link and the help.
|
#!/usr/bin/python
import sys
import os
import hashlib
import json
path = ""
towrite = {}
hash = hashlib.md5()
def PrintHelp():
print """\nHasher v0.1 is a utility that produces a .txt of MD5 checksums in a folder.
Simply run in a terminal and specify the folder's path as an argument. The .txt will be created
or updated if already existent in the above-mentioned folder."""
def hashfile(afile, hasher, blocksize=65536):
buf = afile.read(blocksize)
while len(buf) > 0:
hasher.update(buf)
buf = afile.read(blocksize)
print " ->" + hasher.hexdigest()
print ""
return str(hasher.hexdigest())
def Hash(p):
    print ""
    for root, dirs, files in os.walk(p):
        for file in files:
            if file != "hashes.txt":
                print file
                # Use a fresh hasher per file; reusing one MD5 object makes
                # every digest cumulative. Join against root (not p) so files
                # in subdirectories are found, and read in binary mode.
                filex = open(os.path.join(root, file), 'rb')
                towrite[file] = hashfile(filex, hashlib.md5())
                filex.close()
    json.dump(towrite, open(os.path.join(p, 'hashes.txt'), 'w'))
if len(sys.argv) <= 1:
print "\nTry -h/help arguments for help."
else:
if sys.argv[1] == "-h" or sys.argv[1] == "help":
PrintHelp()
else:
path = sys.argv[1]
Hash(path)
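`hashfile` reads in 64 KiB blocks so large files never sit in memory all at once. The same chunked pattern in a self-contained Python 3 sketch, with a fresh hasher per file and binary-mode reads (the temp file here stands in for a real input):

```python
import hashlib
import tempfile

def hash_file(path, blocksize=65536):
    hasher = hashlib.md5()              # fresh hasher per file
    with open(path, 'rb') as f:         # binary mode for stable digests
        for chunk in iter(lambda: f.read(blocksize), b''):
            hasher.update(chunk)
    return hasher.hexdigest()

with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b'hello world\n')
print(hash_file(tmp.name))  # same digest as hashing the bytes in one shot
```

Because `md5.update` is incremental, hashing a file chunk by chunk yields the same digest as hashing its entire contents at once.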
|
Are you the child of senior parents who are still driving around town locally? Just two years ago, my 75 year old mother and her girlfriend from church wanted to drive from Roosevelt, NY to Norfolk VA, alone. My siblings and I were on pins and needles. My mother is an excellent driver, but there are physical changes occurring naturally with age, and they impact her ability to respond while driving. She has driven this route many times, but now she is much older. Her ability to respond and react to hazards is a huge concern for us.
The State of Virginia has developed a website, “Recognizing the Signs,” at www.granddriver.net.
If you are concerned, remember you are not alone. This is a very uncomfortable conversation and feelings can get hurt. This could possibly be the person who drove you home from the hospital the day you were born, and now you are telling them you are no longer comfortable with them driving.
Granddriver.net has also set up a Granddriver Safety Questionnaire you can review with them, or you can suggest an appointment with a trained professional who can evaluate their driving skills.
|
r"""
>>> template = ''' template <class T%(, class A%+%)>
... static PyObject* call( %1(T::*pmf)(%(A%+%:, %))%2, PyObject* args, PyObject* ) {
... PyObject* self;
... %( PyObject* a%+;
... %) if (!PyArg_ParseTuple(args, const_cast<char*>("O%(O%)"), &self%(, &a%+%)))
... return 0;
... T& target = from_python(self, type<T&>());
... %3to_python((target.*pmf)(%(
... from_python(a%+, type<A%+>())%:,%)
... ));%4
... }'''
>>> print gen_function(template, 0, 'R ', '', 'return ', '')
template <class T>
static PyObject* call( R (T::*pmf)(), PyObject* args, PyObject* ) {
PyObject* self;
if (!PyArg_ParseTuple(args, const_cast<char*>("O"), &self))
return 0;
T& target = from_python(self, type<T&>());
return to_python((target.*pmf)(
));
}
>>> print gen_function(template, 2, 'R ', '', 'return ', '')
template <class T, class A1, class A2>
static PyObject* call( R (T::*pmf)(A1, A2), PyObject* args, PyObject* ) {
PyObject* self;
PyObject* a1;
PyObject* a2;
if (!PyArg_ParseTuple(args, const_cast<char*>("OOO"), &self, &a1, &a2))
return 0;
T& target = from_python(self, type<T&>());
return to_python((target.*pmf)(
from_python(a1, type<A1>()),
from_python(a2, type<A2>())
));
}
>>> print gen_function(template, 3, 'void ', ' const', '', '\n'+8*' ' + 'return none();')
template <class T, class A1, class A2, class A3>
static PyObject* call( void (T::*pmf)(A1, A2, A3) const, PyObject* args, PyObject* ) {
PyObject* self;
PyObject* a1;
PyObject* a2;
PyObject* a3;
if (!PyArg_ParseTuple(args, const_cast<char*>("OOOO"), &self, &a1, &a2, &a3))
return 0;
T& target = from_python(self, type<T&>());
to_python((target.*pmf)(
from_python(a1, type<A1>()),
from_python(a2, type<A2>()),
from_python(a3, type<A3>())
));
return none();
}
"""
import string
def _find(s, sub, start=0, end=None):
"""Just like string.find, except it returns end or len(s) when not found.
"""
if end == None:
end = len(s)
pos = string.find(s, sub, start, end)
if pos < 0:
return end
else:
return pos
def _raise_no_argument(key, n, args):
raise IndexError(str(key) + " extra arg(s) not passed to gen_function")
def _gen_common_key(key, n, args, fill = _raise_no_argument):
# import sys
# print >> sys.stderr, "_gen_common_key(", repr(key), ",", repr(n), ',', repr(args), ',', fill, ')'
# sys.stderr.flush()
if len(key) > 0 and key in '123456789':
index = int(key) - 1;
if index >= len(args):
return fill(key, n, args)
arg = args[index]
if callable(arg):
return str(arg(key, n, args))
else:
return str(arg)
elif key in ('x','n','-','+'):
return str(n + {'-':-1,'+':+1,'x':0,'n':0}[key])
else:
return key
def _gen_arg(template, n, args, fill = _raise_no_argument):
result = ''
i = 0
while i < len(template): # until the template is consumed
# consume everything up to the first '%'
delimiter_pos = _find(template, '%', i)
result = result + template[i:delimiter_pos]
# The start position of whatever comes after the '%'+key
start = delimiter_pos + 2
key = template[start - 1 : start] # the key character. If there were no
# '%'s left, key will be empty
if 0 and key == 'n':
result = result + `n`
else:
result = result + _gen_common_key(key, n, args, fill)
i = start
return result
def gen_function(template, n, *args, **keywords):
r"""gen_function(template, n, [args...] ) -> string
Generate a function declaration based on the given template.
Sections of the template between '%(', '%)' pairs are repeated n times. If '%:'
appears in the middle, it marks the beginning of the separator text inserted
between repetitions.
Sections of the template between '%{', '%}' pairs are omitted if n == 0.
%n is transformed into the string representation of 0..n-1 for each
repetition within %(...%). Elsewhere, %n is transformed into the
string representation of n.
%- is transformed into the string representation of -1..n-2 for
each repetition within %(...%). Elsewhere, %- is transformed into the
string representation of n-1.
%+ is transformed into the string representation of 1..n for
each repetition within %(...%). Elsewhere, %+ is transformed into the
string representation of n+1.
%x is always transformed into the string representation of n.
%z, where z is a digit, selects the corresponding additional
argument. If that argument is callable, it is called with three
arguments:
key - the string representation of 'z'
n - the iteration number
args - a tuple consisting of all the additional arguments to
this function
otherwise, the selected argument is converted to a string representation
for example,
>>> gen_function('%1 abc%x(%(int a%n%:, %));%{ // all args are ints%}', 2, 'void')
'void abc2(int a0, int a1); // all args are ints'
>>> gen_function('%1 abc(%(int a%n%:, %));%{ // all args are ints%}', 0, 'x')
'x abc();'
>>> gen_function('%1 abc(%(int a%n%:, %));%{ // all args are ints%}', 0, lambda key, n, args: 'abcd'[n])
'a abc();'
>>> gen_function('%2 %1 abc(%(int a%n%:, %));%{ // all args are ints%}', 0, 'x', fill = lambda key, n, args: 'const')
'const x abc();'
>>> gen_function('abc%[k%:v%]', 0, fill = lambda key, n, args, value = None: '<' + key + ',' + value + '>')
'abc<k,v>'
"""
expand = (lambda s, n = n:
apply(gen_function, (s, n) + args, keywords))
fill = keywords.get('fill', _raise_no_argument);
result = ''
i = 0
while i < len(template): # until the template is consumed
# consume everything up to the first '%'
delimiter_pos = _find(template, '%', i)
result = result + template[i:delimiter_pos]
# The start position of whatever comes after the '%'+key
start = delimiter_pos + 2
key = template[start - 1 : start] # the key character. If there were no
# '%'s left, key will be empty
pairs = { '(':')', '{':'}', '[':']' }
if key in pairs.keys():
end = string.find(template, '%' + pairs[key], start)
assert end >= 0, "Matching '" + '%' + pairs[key] +"' not found!"
delimiter_pos = end
if key == '{':
if n > 0:
result = result + expand(template[start:end])
else:
separator_pos = _find(template, '%:', start, end)
remainder = template[separator_pos+2 : end]
if key == '(':
for x in range(n):
iteration = expand(
template[start:separator_pos], x)
result = result + expand(iteration, x)
if x != n - 1:
result = result + expand(remainder, x)
else:
function_result = fill(
template[start:separator_pos], n, args, value = remainder)
result = result + expand(function_result)
else:
result = result + expand(_gen_common_key(key, n, args, fill))
i = delimiter_pos + 2
return result
def gen_functions(template, n, *args, **keywords):
r"""gen_functions(template, n, [args...]) -> string
Call gen_function repeatedly with from 0..n and the given optional
arguments.
>>> print gen_functions('%1 abc(%(int a%n%:, %));%{ // all args are ints%}\n', 2, 'void'),
void abc();
void abc(int a0); // all args are ints
void abc(int a0, int a1); // all args are ints
"""
fill = keywords.get('fill', _raise_no_argument);
result = ''
for x in range(n + 1):
result = result + apply(gen_function, (template, x) + args, keywords)
return result
if __name__ == '__main__':
import doctest
import sys
doctest.testmod(sys.modules.get(__name__))
|
Minnesota Vikings fan confidence plummets; Buffalo Bills fans think things are looking up. Those are the results of the latest fan confidence poll conducted by SB Nation. The results, while timid after rookie quarterback Josh Allen’s first start, show fans believing not only in the resurrected defense but also in the offense. While Bills fans weren’t down in the dumps last week, they’ve come up 43% since their low before Week 2, when it was unclear whether Nathan Peterman would start again or the team would turn to Allen. From Week 3 to Week 4, Bills fans gained 30 percentage points to reach 73% of fans feeling confident in the direction of the team. In Minnesota, it was a completely different story. Following a 1-0-1 start to their season, Vikings fans now think the sky is falling. A 57% drop in fan confidence, the largest of the week among NFL fan bases, moved Minnesota from 92% expressing confidence to just 35% of respondents saying they were confident. Hmmmmm...I wonder what could have happened to Minnesota...
The Buffalo Bills could not get any footing on offense, defense, or special teams Sunday. They were beaten soundly by a vulnerable New England Patriots team in all three phases. With one game left in the season, the coaching staff has a lot of evaluating to do as they enter Week 17. Here are our five takeaways from the game:
Bills were out-coached
The Patriots kept attacking Buffalo’s inexperienced linebackers by keeping their heavy units on the field. They were able to run the ball down the Bills’ collective throats and influence the need for Lorenzo Alexander to cover Julian Edelman. On defense, the Patriots put Pro Bowler Stephon Gilmore on Robert Foster and then dared Josh Allen to pass the ball.
He had opportunities, but New England made the Bills make plays, and so many drops doomed the Bills.
Bills rushing defense remains a concern
After their second touchdown, the Patriots were averaging 8.4 yards per rush while Tom Brady was having a pedestrian day. New England had 179 yards rushing before halftime. With 20 minutes left in the game, New England had already passed the Indianapolis Colts for most rushing yards against the Bills in a game this season. With 273 yards on the ground, it was tied for the seventh-most allowed since 2000. Missed tackles, the defensive line not settling into their lanes, and sub-par linebacker play all led to the problem. In four games against New England, Sean McDermott’s defense has allowed more than 190 yards on the ground three times. They entered the game allowing only 105 yards per game, but they have had some hiccups this year.
Josh Allen didn’t play well
Bill Belichick is pretty darn good at home against rookie quarterbacks. It was a story line going into this week, but it came true in spades. Belichick spied Allen with a cornerback and other defensive backs on multiple plays. They converted 9% of their third downs. Allen threw two picks and was 14-for-33 for just 148 yards. He had five rushes for 30 yards and was sacked a few times. Allen had several very good passes, for sure, and he was able to settle in more as the game wore on, but they couldn’t get anything going when it mattered.
Convert some freaking turnovers
The Bills forced three turnovers in Patriots territory and were only able to convert three points. The opening drive of the third quarter got the Bills on the board, down 14-3, but the Bills turned it over on downs in the second quarter and Stephen Hauschka bounced off the crossbar from 43 yards out in the first quarter. Not exactly how you defeat the Patriots.
Special teams (again)
Holy cow. That onside kick was pathetic. The Patriots blocked an extra point (kind of but not really).
It was abysmal, and it’s not the first time we’ve called for coordinator Danny Crossman to be fired. Let’s do it again anyway: Danny Crossman should be fired.
Send your questions to me for the Buffalo Rumblings Mailbag: 716-508-0405. It was a talk-able Bills game, for sure.
|
# -*- coding: utf-8 -*-
"""
cli
~~~
Implements CLI mode
:author: Feei <feei@feei.cn>
:homepage: https://github.com/WhaleShark-Team/cobra
:license: MIT, see LICENSE for more details.
:copyright: Copyright (c) 2018 Feei. All rights reserved
"""
import re
import os
from .detection import Detection
from .engine import scan, Running
from .exceptions import PickupException
from .export import write_to_file
from .log import logger
from .pickup import Directory
from .send_mail import send_mail
from .utils import ParseArgs
from .utils import md5, random_generator, clean_dir
from .push_to_api import PushToThird
def get_sid(target, is_a_sid=False):
    if isinstance(target, list):
        target = ';'.join(target)
    sid = md5(target)[:5]
if is_a_sid:
pre = 'a'
else:
pre = 's'
sid = '{p}{sid}{r}'.format(p=pre, sid=sid, r=random_generator())
return sid.lower()
def start(target, formatter, output, special_rules, a_sid=None, is_del=False):
"""
Start CLI
:param target: File, FOLDER, GIT
:param formatter:
:param output:
:param special_rules:
:param a_sid: all scan id
:param is_del: del target directory
:return:
"""
# generate single scan id
s_sid = get_sid(target)
r = Running(a_sid)
data = (s_sid, target)
r.init_list(data=target)
r.list(data)
report = '?sid={a_sid}'.format(a_sid=a_sid)
d = r.status()
d['report'] = report
r.status(d)
logger.info('[REPORT] Report URL: {u}'.format(u=report))
# parse target mode and output mode
pa = ParseArgs(target, formatter, output, special_rules, a_sid=None)
target_mode = pa.target_mode
output_mode = pa.output_mode
# target directory
try:
target_directory = pa.target_directory(target_mode)
target_directory = target_directory.rstrip("/")
logger.info('[CLI] Target directory: {d}'.format(d=target_directory))
# static analyse files info
files, file_count, time_consume = Directory(target_directory).collect_files()
# detection main language and framework
dt = Detection(target_directory, files)
main_language = dt.language
main_framework = dt.framework
logger.info('[CLI] [STATISTIC] Language: {l} Framework: {f}'.format(l=main_language, f=main_framework))
logger.info('[CLI] [STATISTIC] Files: {fc}, Extensions:{ec}, Consume: {tc}'.format(fc=file_count,
ec=len(files),
tc=time_consume))
if pa.special_rules is not None:
logger.info('[CLI] [SPECIAL-RULE] only scan used by {r}'.format(r=','.join(pa.special_rules)))
# scan
scan(target_directory=target_directory, a_sid=a_sid, s_sid=s_sid, special_rules=pa.special_rules,
language=main_language, framework=main_framework, file_count=file_count, extension_count=len(files))
if target_mode == 'git' and '/tmp/cobra/git/' in target_directory and is_del is True:
res = clean_dir(target_directory)
if res is True:
logger.info('[CLI] Target directory remove success')
else:
logger.info('[CLI] Target directory remove fail')
except PickupException:
result = {
'code': 1002,
'msg': 'Repository not exist!'
}
Running(s_sid).data(result)
logger.critical('Repository or branch not exist!')
exit()
except Exception:
result = {
'code': 1002,
'msg': 'Exception'
}
Running(s_sid).data(result)
raise
# Match an email address
if re.match(r'^[A-Za-z\d]+([-_.][A-Za-z\d]+)*@([A-Za-z\d]+[-.])+[A-Za-z\d]{2,4}$', output):
# Generate the mail attachment
attachment_name = s_sid + '.' + formatter
write_to_file(target=target, sid=s_sid, output_format=formatter, filename=attachment_name)
# Send the mail
send_mail(target=target, filename=attachment_name, receiver=output)
elif output.startswith('http'):
# HTTP API URL
pusher = PushToThird(url=output)
pusher.add_data(target=target, sid=s_sid)
pusher.push()
else:
write_to_file(target=target, sid=s_sid, output_format=formatter, filename=output)
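`start()` picks a delivery channel by inspecting `output`: an email address sends mail, an `http(s)` URL pushes to a third-party API, and anything else is treated as a file path. A minimal sketch of that dispatch, reusing the email regex from the code above (the returned labels are illustrative stand-ins for the real handlers):

```python
import re

# Same pattern used in start() to detect an email address.
EMAIL_RE = r'^[A-Za-z\d]+([-_.][A-Za-z\d]+)*@([A-Za-z\d]+[-.])+[A-Za-z\d]{2,4}$'

def route_output(output):
    if re.match(EMAIL_RE, output):
        return 'mail'      # would call send_mail(...)
    elif output.startswith('http'):
        return 'api'       # would call PushToThird(...).push()
    return 'file'          # would call write_to_file(...)

print(route_output('dev@example.com'),
      route_output('https://x.test/hook'),
      route_output('report.json'))  # mail api file
```

Checking the email pattern first matters: a plain filename like `report.json` fails the regex and falls through to the file branch, while a URL is caught by the `http` prefix test.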
|
University of Northern Colorado cornerback Quincy Wofford isn’t worried about facing the high-powered run-and-shoot offense of Portland State at 1:35 p.m. today at Nottingham Field.
Actually, he is concerned for the Vikings.
After the Bears’ No. 1-ranked pass defense in the Big Sky Conference gave up four touchdown passes to Northern Arizona last week, including an 81-yard bomb to Lumberjacks receiver Ed Berry, Wofford and the Bears are determined not to let it happen again.
It will not be easy for the Bears (0-3 Big Sky, 1-4 overall). The Vikings (0-3, 1-4) come into the game with the No. 1 pass offense in the Big Sky, led by quarterback Drew Hubel. Hubel is leading the Big Sky in passing with 344 yards per game through the air.
Of course, the run-and-shoot is a quarterback-friendly system that uses only one running back and four receivers who spread out the defense, hoping to create mismatches. Wide receivers usually determine their own routes based on the defense.
The run-and-shoot was made famous by Mouse Davis, the former long-time college and pro coach who retired this year as the Vikings’ offensive coordinator.
While the offense may seem like a nightmare for a defensive back because a quarterback might throw the ball more than 50 times a game, Wofford loves playing against it.
He may get that opportunity. Hubel has thrown seven interceptions so far, only one fewer than his touchdown passes. Besides, the Bears believe they are the best pass defense in the league, and they want to go out and prove it.
UNC coach Scott Downing knows the Bears defense will be tested, but said the Bears shouldn’t be surprised by anything Portland State throws at them with the run-and-shoot.
Of course, Wofford would like nothing better than to hold the Vikings to less than 100 yards passing. While that is unlikely, he believes the defense should be rewarded by defensive coordinator Cody Deti if it does.
UNC secondary vs. Portland State QB Drew Hubel: Hubel is putting up big numbers running the run-and-shoot offense. He already has 1,670 yards passing to lead the Big Sky and has four games of throwing for more than 300 yards. The downside of the run-and-shoot is that he is averaging 37 attempts per game, which means there are more opportunities for interceptions; Hubel has thrown seven so far. That has the UNC defense licking its chops. Although the Bears are only tied for fourth in the conference in interceptions so far this season, they have the top-ranked pass defense led by safety Max Hewitt and cornerback Korey Askew. However, the Bears have given up 11 passing touchdowns, tied for third-most in the conference. The key will be not letting Hubel get into a rhythm and start picking the Bears apart. If he does that, it could be a long afternoon.
UNC red-zone offense vs. Portland State red-zone offense: Last season, one of the areas the UNC offense prided itself on most was its efficiency inside the 20. The Bears scored on 30-of-36 possessions inside the red zone or 83.3 percent. That was second best in the conference to only Weber State. So far this season, the Bears are second-to-last in the conference in red zone efficiency. UNC has scored on 8-of-14 attempts in the red zone, a 57.1 percent efficiency. Fortunately, Portland State isn’t much better. The Vikings are only one spot ahead of the Bears in the conference in red-zone efficiency (seventh). Portland State has scored on only 8-of-13 attempts in the red zone, or 61.5 percent. Whoever wins this battle probably wins the game.
UNC special teams vs. Portland State special teams: Portland State’s Aaron Woods is not only one of the Vikings’ best receivers, but also one of the conference’s best return men. He is the only player in the Big Sky to be in the top five in kickoff return and punt return yardage. Woods is averaging 24.2 per kickoff return and is the only player in the Big Sky to have returned one for a touchdown this season (97 yards). He is also averaging 6.5 per punt return. UNC has two pretty good returners of its own. Redshirt freshman John Burnley is third in the conference in kickoff returns, averaging 24.4 per return. Senior Ryan Lutz is averaging 7.2 per punt return, third best in the Big Sky. UNC will have to do a much better job on kickoff coverage if they want to contain Woods and win the battle for field position. The Bears are second to last in the conference in kickoff coverage. Portland State is only sixth in the conference, though.
Portland State 24, UNC 21: Drew Hubel completed 28-of-36 passes for 343 yards and three touchdowns as Portland State won the season finale at PGE Park. The Bears outgained the Vikings, 431-426, with running back David Woods leading the way. Woods rushed for 121 yards and a pair of touchdowns. UNC quarterback Bryan Waggener completed 24-of-38 passes for 254 yards and a touchdown with an interception. Dominic Gunn led all UNC receivers with four catches for 60 yards. Portland State receiver Aaron Woods caught eight passes for 113 yards and Daniel Wolverton had six catches for 112 yards. UNC trailed 24-14 entering the fourth quarter, but UNC receiver Cory Fauver caught a 10-yard touchdown pass from Waggener with 4:11 left to pull the Bears within three. However, the Bears could pull no closer.
Streaks: Portland State has lost eight straight road games and five straight Big Sky road games. UNC has lost nine straight Big Sky Conference games. Both teams are on a three-game losing streak.
The numbers: UNC quarterback Bryan Waggener ranks sixth in the nation among FCS active career leaders in passing attempts (34.4 per game) and is 11th in passing yards (215 per game). He is also seventh in completions per game (19.6). Waggener needs only 48 passing yards to reach 3,500 for his career. UNC sophomore Cameron Kaman ranks 10th among FCS active career leaders in punting average (39.91). UNC safety Max Hewitt is two tackles shy of 150 for his career. UNC linebacker Matt King needs just eight more tackles to match his season total of 44 from each of his first two years. PSU wide receiver Raymond Fry III is averaging 109.6 yards receiving per game, first in the Big Sky and third in the nation.
For the record: The Vikings are 1-4 for the first time since 1997 and 0-3 in the Big Sky Conference for the first time since 1996, their first year in the conference.
Big Thompson: UNC wide receiver Alex Thompson caught 12 passes for 164 yards and three touchdowns last week against Northern Arizona. The 12 catches tied for second all-time in UNC history for a single game. The last time a UNC player had 10 or more catches was when Vincent Jackson caught 10 passes against Florida Atlantic on Oct. 16, 2004. Thompson’s three touchdowns were second most in a game. Thompson also became the first player since 2006 to score three or more touchdowns in a game. The last was Andre Wilson against Western Illinois on Sept. 23, 2006.
Turn around: The Bears were just 17-for-55 on third downs (.309) before last week when they converted 13-of-23 third downs (.565) against NAU.
An omen?: The only time UNC has beaten Portland State in its history was 30 years ago on its homecoming. The Bears defeated the Vikings 21-20 on Oct. 20, 1979.
Man in Black: Portland State head coach Jerry Glanville was head coach of the Atlanta Falcons (1990-93) and Houston Oilers (1986-89) as well as a long time assistant coach in the NFL. Glanville is 8-19 in his third season at Portland State.
Moving on up: Portland State quarterback Drew Hubel moved into fifth place on the Vikings all-time passing list with 6,052 yards. He passed Vikings’ legend June Jones (5,798, 1975-76).
|
from collections import OrderedDict, namedtuple
from datetime import datetime
from conans.model.ref import ConanFileReference
class _UploadElement(namedtuple("UploadElement", "reference, remote_name, remote_url, time")):
def __new__(cls, reference, remote_name, remote_url):
the_time = datetime.utcnow()
        return super(_UploadElement, cls).__new__(cls, reference, remote_name, remote_url, the_time)
def to_dict(self):
ret = {"remote_name": self.remote_name,
"remote_url": self.remote_url, "time": self.time}
ret.update(_id_dict(self.reference))
return ret
def _id_dict(ref):
if isinstance(ref, ConanFileReference):
ret = {"id": str(ref)}
else:
ret = {"id": ref.id}
    # FIXME: When the revisions feature is completely released, this field
    # should always be present, with None if needed
if ref.revision:
ret["revision"] = ref.revision
return ret
class UploadRecorder(object):
def __init__(self):
self.error = False
self._info = OrderedDict()
def add_recipe(self, ref, remote_name, remote_url):
self._info[str(ref)] = {"recipe": _UploadElement(ref, remote_name, remote_url),
"packages": []}
def add_package(self, pref, remote_name, remote_url):
self._info[str(pref.ref)]["packages"].append(_UploadElement(pref, remote_name, remote_url))
def get_info(self):
info = {"error": self.error, "uploaded": []}
for item in self._info.values():
recipe_info = item["recipe"].to_dict()
packages_info = [package.to_dict() for package in item["packages"]]
info["uploaded"].append({"recipe": recipe_info, "packages": packages_info})
return info
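A hedged usage sketch of the recorder pattern above. `MiniUploadRecorder`, `FakeRef` and `FakePref` are hypothetical stand-ins (so the snippet runs without Conan installed) that mirror the `add_recipe`/`add_package`/`get_info` flow:

```python
from collections import OrderedDict, namedtuple

# Stand-ins for ConanFileReference / package reference (hypothetical names).
FakeRef = namedtuple("FakeRef", "id revision")
FakePref = namedtuple("FakePref", "ref id revision")

class MiniUploadRecorder:
    """Minimal re-implementation of the recorder pattern, for illustration."""
    def __init__(self):
        self.error = False
        self._info = OrderedDict()  # keeps recipes in upload order

    def add_recipe(self, ref, remote_name, remote_url):
        self._info[str(ref.id)] = {"recipe": (ref, remote_name, remote_url),
                                   "packages": []}

    def add_package(self, pref, remote_name, remote_url):
        # packages are grouped under the recipe they belong to
        self._info[str(pref.ref.id)]["packages"].append((pref, remote_name, remote_url))

    def get_info(self):
        return {"error": self.error,
                "uploaded": [{"recipe": v["recipe"], "packages": v["packages"]}
                             for v in self._info.values()]}

recorder = MiniUploadRecorder()
ref = FakeRef("pkg/1.0@user/channel", None)
recorder.add_recipe(ref, "my-remote", "https://example.org")
recorder.add_package(FakePref(ref, "abc123", None), "my-remote", "https://example.org")
info = recorder.get_info()
print(len(info["uploaded"]))  # → 1
```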
|
A co-worker sends you an email asking if you’re available for a 2:30 PM meeting, and your first thought is: How am I going to get out of this?
Before you automatically (though, begrudgingly) agree because you don’t think you have a choice, keep this is mind: While it’s true people will judge someone who declines every meeting or backs out after agreeing, that’s not what’s happening here. If your teammate’s asking you to join at the very last-minute—and you explain why you can’t in a thoughtful manner—odds are he or she will understand.
Like a courtesy CC, sometimes co-workers think that including you even when it’s not totally necessary is the nice thing to do—and they have no idea that it’s creating extra work on your end.
Successful people avoid extra meetings. Knowing this, you may’ve decided to try to limit the number you attend each week.
If it’s a one-on-one meeting, follow up by asking if you could reschedule for a time that works better for you. And say that in the meantime, you’re happy to start the conversation over email (you’d be amazed how many issues can be resolved this way!). If a large group’s invited, ask if it’d be alright for you to send in thoughts in advance, or comment on notes afterward, whichever would be more helpful.
If you always agree to meeting requests, declining may feel a little nerve-wracking at first. So, think about what you’re achieving. Not only could it save you from a time-waster today, but in the future, it could encourage people to contact you with relevant invites only.
Photo of people talking at desk courtesy of Hero Images/Getty Images.
|
#!/usr/bin/env python3
# vim: set encoding=utf-8 tabstop=4 softtabstop=4 shiftwidth=4 expandtab
#########################################################################
# Copyright 2018- Martin Sinn m.sinn@gmx.de
#########################################################################
# This file is part of SmartHomeNG
#
# SmartHomeNG is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# SmartHomeNG is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with SmartHomeNG If not, see <http://www.gnu.org/licenses/>.
#########################################################################
"""
This script assembles a complete list of requirements for the SmartHomeNG core and all plugins.
The list is not tested for correctness nor checked for contrary
requirements.
The procedure is as follows:
1) walk the plugins subdirectory and collect all files with requirements
2) read the requirements for the core
3) read all files with requirements and add each requirement, with its source, to a dict
4) write it all to the file all.txt in the requirements directory
"""
import os
import sys
sh_basedir = os.sep.join(os.path.realpath(__file__).split(os.sep)[:-2])
sys.path.insert(0, sh_basedir)
program_name = sys.argv[0]
arguments = sys.argv[1:]
if "-debug_tox" in arguments:
import logging
logging.basicConfig(level=logging.DEBUG)
logger = logging.getLogger('build_requirements')
logger.setLevel(logging.DEBUG)
logger.debug("sys.path = {}".format(sys.path))
import lib.shpypi as shpypi
# ==========================================================================
selection = 'all'
if not os.path.exists(os.path.join(sh_basedir, 'modules')):
print ("Directory <shng-root>/modules not found!")
exit(1)
if not os.path.exists(os.path.join(sh_basedir, 'plugins')):
print ("Directory <shng-root>/plugins not found!")
exit(1)
if not os.path.exists(os.path.join(sh_basedir, 'requirements')):
print ("Directory <shng-root>/requirements not found!")
exit(1)
req_files = shpypi.Requirements_files()
# req_files.create_requirementsfile('core')
# print("File 'requirements" + os.sep + "core.txt' created.")
# req_files.create_requirementsfile('modules')
# print("File 'requirements" + os.sep + "modules.txt' created.")
fn = req_files.create_requirementsfile('base')
print("File {} created.".format(fn))
# req_files.create_requirementsfile('plugins')
# print("File 'requirements" + os.sep + "plugins.txt' created.")
fn = req_files.create_requirementsfile('all')
print("File {} created.".format(fn))
|
28 Royal Blue Dress For Wedding 2019 Uploaded by Michael Stanley on Sunday, February 10th, 2019 in category junior wedding guest dresses.
See also 20 Royal Blue Dress For Wedding 2017 from junior wedding guest dresses Topic.
|
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#from senlin.drivers import heat_v1 as heat
from senlin.profiles import base
__type_name__ = 'os.heat.resource'
class ResourceProfile(base.Profile):
    '''Profile for an OpenStack Heat resource.

    When this profile is used, the whole cluster is a Heat stack, composed
    of resources initialized from this profile.
    '''
spec_schema = {}
def __init__(self, ctx, name, type_name=__type_name__, **kwargs):
super(ResourceProfile, self).__init__(ctx, name, type_name, **kwargs)
def do_create(self):
return {}
def do_delete(self, id):
return True
    def do_update(self):
self.status = self.UPDATING
# TODO(anyone): do update
self.status = self.ACTIVE
return {}
def do_check(self, id):
#TODO(liuh): add actual checking logic
return True
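How a profile like the stub above might be driven can be sketched without an OpenStack deployment. `MiniProfile` below is a hypothetical stand-in for senlin's `base.Profile`, exercising the create → update → delete lifecycle:

```python
# Hedged sketch: MiniProfile is a hypothetical stand-in for senlin's
# base.Profile, showing the status transitions from the do_update stub.
class MiniProfile:
    UPDATING, ACTIVE = 'UPDATING', 'ACTIVE'

    def __init__(self, name):
        self.name = name
        self.status = None

    def do_create(self):
        self.status = self.ACTIVE
        return {}

    def do_update(self):
        self.status = self.UPDATING
        # ... apply the actual update here ...
        self.status = self.ACTIVE
        return {}

    def do_delete(self, id):
        return True

p = MiniProfile('web-server')
p.do_create()
p.do_update()
print(p.status)  # → ACTIVE
```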
|
My beer taste watery HELP!!
Will the beer get any better after secondary and carbonation, or am I stuck with a watery beer again?
Re: My beer taste watery HELP!!
Carbonation will change it a lot. It is a funny thing where if you have too little, it is watery and if you have too much it is thin and fizzy, but just the right amount will make it full.
This is something that commonly is seen with low gravity beers like you made (1.040), and you will see reference to it for english bitters and some other low gravity styles. If you look at the carbonation levels for those beers, they tend to be in the low carbonation category.
If you have one of your other beers and it is pretty carbonated, try taking a chopstick or something and whisking out some of the carbonation, then try it, then do it again and so on, paying attention to how the mouthfeel presents itself.
Shoot for 1.060 instead of 1.040. That might do the trick.
Also, steep or mini-mash in the 150s F for just 30-40 minutes instead of 60 minutes. This will leave more unfermentable dextrins in the wort compared to a full hour mash. If you were steeping in the 140s for a full hour, the enzymes from the pilsner malt will be doing cartwheels and jumping jacks in your wort.
I steeped the grains at 150 for 1 hour. So that may have affected it.
In order to get the higher OG do I modify the recipe? I bought this as a kit from my LHBS.
This^^. Most beer at 1.040 is going to taste thin.
This is a wit beer. I don't think you want to go too much over 1.040. It's a light bodied beer. If you're tasting it flat that is the main problem. Get it carbed up and you should be fine.
As heretical as this may sound, some lower gravity/low carbonation beers will seem thin if served cold. Maybe try taking one out of the fridge for 10-15 minutes before pouring? If over carbonation is the issue this will make things worse though.
Witbier is a style that should be highly carbonated. This helps elevate the mouthfeel, giving it a light and spritzy feeling, increasing the aromatics, and adding carbonic acidity, which adds a light tartness.
1.040 is below the low end of the style, which according to the 2015 guidelines goes 1.044 up to 1.052.
If its still in primary, I'd boil up about 1.25 lbs of wheat DME in about 3/4 gallon of water and a 1/4 oz of Styrian goldings or a noble hop like Tettnanger or Hallertaur, and add that to the primary. You'll get more volume than a 5 gal batch (which is why I suggest overshooting the gravity a touch and adding a few more hops).
One other thing you can do is get some maltodextrin powder and add about 1/3 cup of that when adding the extract.
Be sure to use a blowoff tube if you are increasing the volume you have going in the fermenter.
It will take time to figure out how small a beer you can brew before it gets "watery" to you. For me, it's around 1.044, but 90% of the time I am around 1.050. That said, I agree that carbonation will really help and it could very well be just fine. I don't like to doctor beers post-ferment, so I suggest you leave it be. Someone much wiser than me suggested planning your next brew rather than obsessing about your current one... Not sure if you are doing full wort boils or partial and topping up, but I think partial boils can also contribute to a watery/diluted flavor.
I don't agree that topping up would cause watery beer. I suppose it could depend on how much water you add but if you top up to your intended gravity there should be no effective difference from a full boil (assuming hopping and color is accounted for in the recipe).
I agree that leaving it alone is probably the best route. Adding dme at this point would be an effective way to boost the gravity but that also changes the beer from the intended recipe (assuming 1.040 was where you wanted to be).
Add calcium chloride to your brewing water. Like 3-5 grams. Should fatten the body right up. Also wait for it to be carbed. Just because it is low gravity doesn't mean it has to taste watery.
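For what it's worth, the gravity boost from the DME addition suggested earlier in the thread can be sanity-checked with standard points-per-pound-per-gallon arithmetic (assuming roughly 44 ppg for dry malt extract, a commonly published figure):

```python
# Rough gravity-boost check for adding 1.25 lb of DME to a 5 gal batch.
# Assumes ~44 gravity points per pound per gallon for DME (typical figure).
def gravity_boost(lbs_dme, batch_gal, ppg=44):
    return lbs_dme * ppg / batch_gal

boost = gravity_boost(1.25, 5.0)   # ~11 gravity points
new_og = 1.040 + boost / 1000
print(round(new_og, 3))            # → 1.051
```

So that addition would take a 1.040 wort up to roughly 1.051, right in the middle of the witbier style range.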
|
"""
Django settings for pxl_master project.
Generated by 'django-admin startproject' using Django 1.9.5.
For more information on this file, see
https://docs.djangoproject.com/en/1.9/topics/settings/
For the full list of settings and their values, see
https://docs.djangoproject.com/en/1.9/ref/settings/
"""
import os
# Build paths inside the project like this: os.path.join(BASE_DIR, ...)
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
# Quick-start development settings - unsuitable for production
# See https://docs.djangoproject.com/en/1.9/howto/deployment/checklist/
# SECURITY WARNING: keep the secret key used in production secret!
SECRET_KEY = '&6_#1&951zc&9*$9br_#jezu2%7s&7cx5^%w5@z8_@aq#5!epe'
# SECURITY WARNING: don't run with debug turned on in production!
DEBUG = False
ALLOWED_HOSTS = ['*']
# Application definition
INSTALLED_APPS = [
'django.contrib.admin',
'django.contrib.auth',
'django.contrib.contenttypes',
'django.contrib.sessions',
'django.contrib.messages',
'django.contrib.staticfiles',
'rest_framework',
'pxl',
'rest_framework.authtoken',
]
REST_FRAMEWORK = {
'DEFAULT_PERMISSION_CLASSES': (
'rest_framework.permissions.IsAuthenticated',
),
'DEFAULT_AUTHENTICATION_CLASSES': (
# 'rest_framework.authentication.BasicAuthentication',
'rest_framework.authentication.TokenAuthentication',
),
'PAGE_SIZE': 10
}
MIDDLEWARE_CLASSES = [
'django.middleware.security.SecurityMiddleware',
'django.contrib.sessions.middleware.SessionMiddleware',
'django.middleware.common.CommonMiddleware',
'django.middleware.csrf.CsrfViewMiddleware',
'django.contrib.auth.middleware.AuthenticationMiddleware',
'django.contrib.auth.middleware.SessionAuthenticationMiddleware',
'django.contrib.messages.middleware.MessageMiddleware',
'django.middleware.clickjacking.XFrameOptionsMiddleware',
]
ROOT_URLCONF = 'pxl_master.urls'
TEMPLATES = [
{
'BACKEND': 'django.template.backends.django.DjangoTemplates',
'DIRS': [],
'APP_DIRS': True,
'OPTIONS': {
'context_processors': [
'django.template.context_processors.debug',
'django.template.context_processors.request',
'django.contrib.auth.context_processors.auth',
'django.contrib.messages.context_processors.messages',
],
},
},
]
WSGI_APPLICATION = 'pxl_master.wsgi.application'
# Database
# https://docs.djangoproject.com/en/1.9/ref/settings/#databases
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.postgresql',
'NAME': 'pxldb',
}
}
# Password validation
# https://docs.djangoproject.com/en/1.9/ref/settings/#auth-password-validators
AUTH_PASSWORD_VALIDATORS = [
{
'NAME': 'django.contrib.auth.password_validation.UserAttributeSimilarityValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.MinimumLengthValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.CommonPasswordValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.NumericPasswordValidator',
},
]
# Internationalization
# https://docs.djangoproject.com/en/1.9/topics/i18n/
LANGUAGE_CODE = 'en-us'
TIME_ZONE = 'UTC'
USE_I18N = True
USE_L10N = True
USE_TZ = True
# Static files (CSS, JavaScript, Images)
# https://docs.djangoproject.com/en/1.9/howto/static-files/
STATIC_URL = '/static/'
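With `TokenAuthentication` enabled in `REST_FRAMEWORK` above, API clients must send their DRF token in the `Authorization` header. A minimal sketch of the header format (the token value and endpoint are hypothetical):

```python
# DRF TokenAuthentication expects: Authorization: Token <key>
# The token value below is hypothetical, for illustration only.
token = "9944b09199c62bcf9418ad846dd0e4bbdfc6ee4b"
headers = {"Authorization": "Token {}".format(token)}

# e.g. requests.get("https://example.org/api/items/", headers=headers)
print(headers["Authorization"].split()[0])  # → Token
```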
|
If your New Year’s Resolution was to get organized or you are prepping for some heavy-duty spring cleaning, then you’ll love our ideas for organizing your craft stash! If you’re anything like me, you can’t remember what color your desk is because it’s been so long since it was clean. Any sudden movements result in an avalanche of stacks of paper, boxes of pens, and piles of washi tape.
Here are some quick and effective ways to corral the art supplies and create a visually stunning place to create.
Boxes are great for storing art supplies and organizing your craft stash, but most boxes aren’t pretty to look at. Explore other options for storing your art supplies that will enhance your work space and inspire you to create.
These vintage drawers have darling vintage-inspired details that are an eye-catching solution to the desktop clutter.
Vintage suitcases provide an elegant display solution for craft supplies. Stack some old leather suitcases in the corner to fill with papers, ephemera, and more. Everyone will appreciate your taste in antiques, and no one will know that they are looking at your craft stash.
Hangers are a wonderful way to get supplies up and out of the way. Hang up your papers, fabric samples, or artwork to keep it easy-to-reach without cluttering your creative space. Use hangers to clip small baggies of ephemera so that they are always on hand and ready to use.
Even a string with some clothespins can turn a pile into an instant inspiration board.
Rinse out old jars and decorate to create an easy way to store washi tapes, pens and pencils, or buttons.
Tin cans also provide the great foundation for creating all sorts of organizational solutions. Attach a few to create a desktop organizer, or use them in drawers to create dividers for your markers and paints.
If hiding away your supplies isn’t an option, but you still want a cleaner look, consider organizing items by color. Stacking reds with reds and blues with blues can create a more unified, cohesive look while still having your supplies close at hand. Stephanie Miller Corfee stacked her supplies by color, creating a calming and organized display.
For crafters with very large craft stashes, you may want to invest in a big furniture piece to store your supplies. You’ll save a lot of money in the long run if you forgo the hundreds of plastic bins and just get a larger armoire to begin with. Jeanne Oliver’s antique cabinet provides storage for loads of supplies, and is a beautiful focal point for the room.
Have a great solution for organizing your craft stash? Leave a comment below!
Tags: afforable ways to organize your craft stash, craft stash, organization ideas, organize your craft stash.
What a great post and so many creative and inspiring ideas!
I have found some toiletry organizer bags that unroll and can be hung from a hook in the bathroom when you travel. There are four clear plastic, triangular-shaped containers that I have stuffed with all my different lace stash, organized by types and colours. The containers can also be removed, which is very handy when working, and when not in use they look very nice on the wall.
I have some great antique pieces of furniture for storing my supplies. I have an old mug shot cabinet that holds all the different kinds of paper I use. I think there are 20 drawers. For my sewing supplies, I have an antique spool cabinet.
I am a collector of found objects, they are stored in an old library card catalog file. My brushes, pens etc are stored in an old file drawer that is divided. My fabrics are stored in a reproduction storage bin from Pier 1. I have two of them stacked on top of each other.
Now this is the inspiration I need to clean up my wonderful mess!
Beautiful Studios and organization! Thanks so much for the tips and visual delights!
I use empty fruit containers to store my washi tape & other small craft supplies. I like them because they’re clear. I can see what I have without having to open the box.
Love these clever and creative ideas.
I love the solutions for organizing craft stash. Always welcome. Your magazines are so helpful and inspiring. Thank you.
All these beautiful ideas inspire me to get my act together! It’s never too late to get organized and these ideas have given me great motivation to reorganize my space and make it what it should have been all along!
|
import sys
def reactions_with_no_proteins(reactions, verbose=False):
"""
Figure out which reactions in our set have no proteins associated with them.
:param reactions: The reactions dictionary
:type reactions: dict
:param verbose: prints out how many reactions have no proteins out of the total
:type verbose: bool
:return: a set of reaction ids that have no proteins associated with them.
:rtype: set
"""
nopegs = set()
for r in reactions:
if reactions[r].number_of_enzymes() == 0:
nopegs.add(r)
if verbose:
sys.stderr.write("REACTIONS WITH NO PROTEINS: {} reactions have no pegs associated ".format(len(nopegs)) +
"with them (out of {} reactions)\n".format(len(reactions)))
return nopegs
def reactions_with_proteins(reactions, verbose=False):
"""
Figure out which reactions in our set have proteins associated with them.
:param reactions: The reactions dictionary
:type reactions: dict
:param verbose: prints out how many reactions have no proteins out of the total
:type verbose: bool
:return: a set of reaction ids that have proteins associated with them.
:rtype: set
"""
pegs = set()
for r in reactions:
if reactions[r].number_of_enzymes() != 0:
pegs.add(r)
if verbose:
sys.stderr.write("REACTIONS WITH PROTEINS: {} reactions have pegs associated ".format(len(pegs)) +
"with them (out of {} reactions)\n".format(len(reactions)))
return pegs
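The two helpers above make complementary passes over the same dictionary; the partition they compute can be sketched in a single pass. `MockReaction` is a hypothetical stand-in exposing the `number_of_enzymes()` interface the functions rely on, so the snippet runs standalone:

```python
# Hedged sketch: partition reaction ids by enzyme count in one pass,
# mirroring reactions_with_proteins / reactions_with_no_proteins above.
def partition_by_enzymes(reactions):
    with_pegs, nopegs = set(), set()
    for r, rxn in reactions.items():
        (with_pegs if rxn.number_of_enzymes() else nopegs).add(r)
    return with_pegs, nopegs

class MockReaction:
    """Hypothetical stand-in with the number_of_enzymes() interface."""
    def __init__(self, n):
        self._n = n
    def number_of_enzymes(self):
        return self._n

reactions = {'rxn1': MockReaction(0), 'rxn2': MockReaction(3)}
with_pegs, nopegs = partition_by_enzymes(reactions)
print(sorted(nopegs))     # → ['rxn1']
print(sorted(with_pegs))  # → ['rxn2']
```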
|
Aggregator of company information and news intended to help investors make informed decisions. The company provides up-to-the-minute multimedia breaking news, comment, and analysis on hundreds of listed companies, event organization, digital services and investor research, enabling high net worth sophisticated investors, fund managers, hedge funds, private client brokers and analysts to connect with other investors in an informative manner.
This information is reserved for PitchBook Platform users. To explore Proactive Investors' full profile, request access.
|
from FileTester import FileTester
import os
import sys
from mooseutils.ImageDiffer import ImageDiffer
class ImageDiff(FileTester):
@staticmethod
def validParams():
params = FileTester.validParams()
params.addRequiredParam('imagediff', [], 'A list of files to compare against the gold.')
        params.addParam('allowed', 0.98, "Absolute zero cutoff used in image comparisons.")
        params.addParam('allowed_linux', "Absolute zero cutoff used for Linux machines; if not provided, 'allowed' is used.")
        params.addParam('allowed_darwin', "Absolute zero cutoff used for Mac OS (Darwin) machines; if not provided, 'allowed' is used.")
# We don't want to check for any errors on the screen with this. If there are any real errors then the image test will fail.
params['errors'] = []
params['display_required'] = True
return params
def __init__(self, name, params):
FileTester.__init__(self, name, params)
def getOutputFiles(self):
return self.specs['imagediff']
def processResults(self, moose_dir, options, output):
"""
Perform image diff
"""
# Call base class processResults
FileTester.processResults(self, moose_dir, options, output)
if self.getStatus() == self.bucket_fail:
return output
# Loop through files
specs = self.specs
for filename in specs['imagediff']:
# Error if gold file does not exist
if not os.path.exists(os.path.join(specs['test_dir'], specs['gold_dir'], filename)):
output += "File Not Found: " + os.path.join(specs['test_dir'], specs['gold_dir'], filename)
self.setStatus('MISSING GOLD FILE', self.bucket_fail)
break
# Perform diff
else:
                output += 'Running ImageDiffer.py'
gold = os.path.join(specs['test_dir'], specs['gold_dir'], filename)
test = os.path.join(specs['test_dir'], filename)
if sys.platform in ['linux', 'linux2']:
name = 'allowed_linux'
elif sys.platform == 'darwin':
name = 'allowed_darwin'
allowed = specs[name] if specs.isValid(name) else specs['allowed']
differ = ImageDiffer(gold, test, allowed=allowed)
# Update golds (e.g., uncomment this to re-gold for new system or new defaults)
#import shutil; shutil.copy(test, gold)
output += differ.message()
if differ.fail():
self.setStatus('IMAGEDIFF', self.bucket_diff)
break
# If status is still pending, then it is a passing test
if self.getStatus() == self.bucket_pending:
self.setStatus(self.success_message, self.bucket_success)
return output
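The platform-specific tolerance lookup in `processResults` (prefer `allowed_linux`/`allowed_darwin` when set, else fall back to `allowed`) can be sketched on its own. `specs` here is a plain dict standing in for the tester's parameter object:

```python
# Hedged sketch of the per-platform threshold selection used above.
def pick_allowed(specs, platform):
    if platform.startswith('linux'):       # covers 'linux' and 'linux2'
        key = 'allowed_linux'
    elif platform == 'darwin':
        key = 'allowed_darwin'
    else:
        key = None
    if key and specs.get(key) is not None:
        return specs[key]                  # OS-specific override
    return specs['allowed']                # generic fallback

specs = {'allowed': 0.98, 'allowed_linux': 0.95, 'allowed_darwin': None}
print(pick_allowed(specs, 'linux'))   # → 0.95
print(pick_allowed(specs, 'darwin'))  # → 0.98
```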
|
These beautifully crafted observational films immerse us in the world of indigenous artists across the Pacific, giving us insight into the unique creative process behind their most cherished works and their role and importance within the community. Taking us from Australia to New Zealand, by way of Rurutu and Hawaii, the series explores the deep reverence many indigenous people continue to have for their artistic traditions and heritage.
Lovingly recording the creation of a new object from start to finish, each film explores the layers of inherited knowledge and the years of practice needed to develop the skill to make each object. Simple and stripped back, the films have little or no music to accompany them, and no narration or presenter to distract. Instead, the artists themselves, during the process of making their objects, are given time and space to explain their life’s work, inspiration and sense of calling. Through exploring how tradition, master craftsmanship and artistic inspiration continue to provide meaning for indigenous people across the Pacific, the films also become intimate portraits of community life.
In Arnhem Land, in the remote tropical north of Australia, the Gurruwiwi Family of the Yolngu Aboriginal people take us into the beguiling world of the ‘yidaki’, a sacred instrument better known to outsiders as the ‘didgeridoo’. Believing the yidaki can heal people, control the weather, and summon ancestral spirits, the Yolngu place great importance on the making and playing of this powerful instrument. The yidaki is a key feature of local ceremonial life and is used to play ‘songlines’, the stories of ancestors that the Yolngu communicate through music and dance. The whole family is involved in the making of a new yidaki. We follow them as they hunt for suitable stringybark trees and watch each stage of the process as the tree is hollowed out, shaped, and given sacred ceremonial paintings with ochres. With the new yidaki finished, the film culminates in a ‘bunggul’, a ceremonial dance for the whole village, where the instrument is given its first outing. Many of the beliefs held by the Gurruwiwi family have remained unchanged for tens of thousands of years. Yet, the modern world has definitely arrived. iPhones and machine sanders mix seemingly with ease with ancient belief systems, in an intimate film that explores and straddles this moment of change for Aboriginal people in Australia.
‘Mama’, from the beautiful South Seas island of Rurutu, French Polynesia, is one of the last traditional weavers to make the ‘taupoo’, a ceremonial hat woven from dried pandanus tree leaves. Taking 5 weeks to make, these hats were originally introduced to the island by British missionaries in the early 19th Century. Now, they’re worn to church and given as wedding gifts. But, the knowledge of how to make them is dying out. To create each hat, 30 or more long pandanus leaves are cut down, spliced together, hung, dried, rolled, sorted, dyed and bleached. And then the complex weaving begins. Without a template or stitches or any thread, we watch Mama almost magically weaving the dried leaves into a stunningly elaborate hat, which she wears to church in the final scenes of the film. Touching upon the island’s Christian history, local myths and legends, and offering a unique and relaxed sense of this island idyll in a moment of flux, this film is both a rare visual treat and a last chance to see the intricacies of an ancient and dying craft.
Master Maori carver Logan Okiwi Shipgood crafts a beautiful 6ft tall ‘pou’ statue from native New Zealand timber. Starting from a block of native white pine, and using chainsaws, adzes and around 30 chisels, Logan gradually carves out the figure of ‘Hene Te Akiri’, a Maori warrior princess, as he lovingly chips away at the wood. Inlaid with sacred shells and given a powerful facial tattoo to denote her social rank, the finished statue is finally unveiled to the public. Throughout the film, Logan explores the deep spiritual connection between Maori carvers and the objects they create, and the significance of his home – Rotorua – in the revival of Maori art and culture in the 20th Century. For Maori today, carving remains a key way of telling stories and honouring ancestors, and Logan – an internationally famous sculptor and carver – is proud to be keeping the tradition alive.
Indigenous Hawaiian artist Dalani Tanahy painstakingly beats tree bark into sheets of cloth-like fabric, known as ‘kapa’, a process that takes several weeks. This ancient Hawaiian bark cloth was once a staple material of the islands. But when Captain Cook arrived in Hawaii, he introduced far more easily-produced cotton and kapa-making completely disappeared. Today, Dalani is one of only a handful of dedicated practitioners bringing this traditional art form back. Kapa-making has become an integral part of the nascent Hawaiian cultural nationalism taking hold in indigenous communities, and a powerful source of pride and identity. But making kapa is a lot of work. Dalani shows us the process from start to finish as trees are cut, stripped, and the bark beaten and fermented. Then sheets of bark are beaten together to make large single sheets. Natural dyes and paints are printed on. And only then, after weeks of work, is the kapa ready to be worn. In this film, Dalani finally delivers her kapa to a dancer, who plans to use it as a traditional ‘hula’ skirt. The film ends with an emotional performance at the Royal Palace in Honolulu, Hawaii. A traditional ‘hula’ dance is performed to honour Hawaii’s last monarch, Queen Lili’uokalani, who was overthrown by the Americans. Like kapa itself, the Queen has become a symbol of reborn Hawaiian identity.
|
# -*- encoding: utf-8 -*-
from . import db
from .activity import Activity
from flask import url_for
class Item(db.Model):
id = db.Column(db.Integer, primary_key=True, autoincrement=False)
name = db.Column(db.String(100), nullable=True)
max_production_limit = db.Column(db.Integer, nullable=True)
market_group_id = db.Column(db.Integer)
group_id = db.Column(db.Integer)
category_id = db.Column(db.Integer)
volume = db.Column(db.Numeric(precision=16, scale=4, decimal_return_scale=4, asdecimal=False), nullable=True)
# calculated field on import
is_from_manufacturing = db.Column(db.Boolean(), default=True)
is_from_reaction = db.Column(db.Boolean(), default=True)
base_cost = db.Column(
db.Numeric(
precision=17,
scale=2,
decimal_return_scale=2,
asdecimal=False
),
nullable=True,
)
# foreign keys
activities = db.relationship(
'Activity',
backref='blueprint',
lazy='dynamic'
)
activity_products = db.relationship(
'ActivityProduct',
backref='blueprint',
lazy='dynamic',
foreign_keys='ActivityProduct.item_id'
)
activity_skills = db.relationship(
'ActivitySkill',
backref='blueprint',
lazy='dynamic',
foreign_keys='ActivitySkill.item_id'
)
activity_materials = db.relationship(
'ActivityMaterial',
backref='blueprint',
lazy='dynamic',
foreign_keys='ActivityMaterial.item_id'
)
product_for_activities = db.relationship(
'ActivityProduct',
backref='product',
lazy='dynamic',
foreign_keys='ActivityProduct.product_id'
)
skill_for_activities = db.relationship(
'ActivitySkill',
backref='skill',
lazy='dynamic',
foreign_keys='ActivitySkill.skill_id'
)
material_for_activities = db.relationship(
'ActivityMaterial',
backref='material',
lazy='dynamic',
foreign_keys='ActivityMaterial.material_id'
)
# relationship only defined for performance issues
# ------------------------------------------------
activity_products__eager = db.relationship(
'ActivityProduct',
lazy='joined',
foreign_keys='ActivityProduct.item_id'
)
def icon_32(self):
static_url = "ccp/Types/%d_32.png" % self.id
return url_for('static', filename=static_url)
def icon_64(self):
static_url = "ccp/Types/%d_64.png" % self.id
return url_for('static', filename=static_url)
def is_moon_goo(self):
return self.market_group_id == 499
def is_pi(self):
return self.category_id == 43
def is_mineral_salvage(self):
return self.market_group_id in [1857, 1033, 1863]
def is_ancient_relic(self):
return self.category_id == 34
def is_cap_part(self):
""" Return if the item is a cap part / blueprint of cap part.
914 / 915 are Blueprints
913 / 873 are their respective items """
return self.group_id in [914, 915, 913, 873]
|
Antigua is Guatemala’s colonial gem. The small town sits just outside of Guatemala City but breathes a different type of air than the capital city. Whereas Guate is fast, noisy, dirty and busy, the town of Antigua sometimes seems to be at a complete standstill. The streets are pieced together by awkwardly placed cobble that refuses to allow even the fastest cars to move too quickly through town. The low-level buildings are painted in variations of blue, red, yellow and green and are never stacked too high as to block the view of the stunning and imposing volcano. The rhythm in Antigua is one of a slow dance. Romantic and intimate, Antigua will easily draw your sentiments back to an easier time.
There’s no real trick to this shot other than getting up early and being prepared for the right light. There are so many advantages to shooting in the morning rather than any other time of day, as I’ve said a number of times. In this case I actually missed the shot that I was going for because, above everything, photography is about capturing light. As I was passing the church, the light was right. I had intended on making it to another location for the good blue sky that happens about an hour before the sun rises, but was late. Thus, I had to make a choice between location and light. In my opinion, light always wins that battle. Great light can make even the most boring scene look dramatic; not to say that this church in Antigua, Guatemala is a boring subject though. Shoot the light, and you’ll never go wrong.
|
#!/usr/bin/python
# encoding: utf-8
import sys
from workflow import Workflow, web
ICON_DEFAULT = 'icon.png'
def get_yunbi_tickers(query):
url = 'https://yunbi.com//api/v2/tickers.json'
r = web.get(url)
r.raise_for_status()
tickers = r.json()
for name in tickers:
if name[:-3] not in query:
continue
last = tickers[name]['ticker']['last']
sell = tickers[name]['ticker']['sell']
high = tickers[name]['ticker']['high']
buy = tickers[name]['ticker']['buy']
low = tickers[name]['ticker']['low']
vol = tickers[name]['ticker']['vol']
yunb_title = (name[:-3]).upper() + u":\t云币\t最新价 " + str(last)
yunb_arg = "https://yunbi.com/markets/" + name
yunb_subtitle = u"量:" + vol + u" 买:" + str(buy) + u" 卖:" + \
str(sell) + u" 高:" + high + u" 低:" + low
wf.add_item(title = yunb_title,
subtitle = yunb_subtitle,
arg = yunb_arg,
valid = True,
icon = ICON_DEFAULT)
def get_jubi_tickers(query):
url = 'https://www.jubi.com/api/v1/allticker'
r = web.get(url)
r.raise_for_status()
tickers = r.json()
for name in tickers:
if name not in query:
continue
last = tickers[name]['last']
buy = tickers[name]['buy']
sell = tickers[name]['sell']
high = tickers[name]['high']
low = tickers[name]['low']
vol = tickers[name]['vol']
jub_title = name.upper() + u":\t聚币\t最新价 " + str(last)
jub_arg = "https://www.jubi.com/coin/" + name
jub_subtitle = u"量:" + str(vol) + u" 买:" + str(buy) + u" 卖:" + \
str(sell) + u" 高:" + str(high) + u" 低:" + str(low)
wf.add_item(title = jub_title,
subtitle = jub_subtitle,
arg = jub_arg,
valid = True,
icon = ICON_DEFAULT)
def get_yuanbao_tickers(query):
names = ['ans', 'btc', 'eth', 'etc', 'ltc', 'zec', 'qtum']
for id in query:
if id not in names:
continue
url = 'https://www.yuanbao.com/api_market/getInfo_cny/coin/' + id
r = web.get(url)
r.raise_for_status()
ticker = r.json()
name = ticker['name']
last = ticker['price']
buy = ticker['buy']
sell = ticker['sale']
high = ticker['max']
low = ticker['min']
vol = ticker['volume_24h']
url = ticker['Markets']
yuanb_title = name + u":\t元宝\t最新价 " + last
yuanb_subtitle = u"量:" + vol + u" 买:" + str(buy) + u" 卖:" + \
str(sell) + u" 高:" + high + u" 低:" + low
wf.add_item(title = yuanb_title,
subtitle = yuanb_subtitle,
arg = url,
valid = True,
icon = ICON_DEFAULT)
def main(wf):
query = wf.args[0].strip().replace("\\", "")
if not isinstance(query, unicode):
query = query.decode('utf8')
names = ['ans', 'btc', 'eth', 'etc', 'ltc', 'zec', 'qtum', 'bts', 'eos', 'sc']
if query == 'yun':
get_yunbi_tickers(names)
elif query == 'ju':
get_jubi_tickers(names)
elif query == 'bao':
get_yuanbao_tickers(names)
else:
get_yuanbao_tickers([query])
get_yunbi_tickers([query])
get_jubi_tickers([query])
wf.send_feedback()
if __name__ == '__main__':
wf = Workflow()
logger = wf.logger
sys.exit(wf.run(main))
|
Came on for consideration the motion of Holly Leanne Frantzen ("movant") under 28 U.S.C. § 2255 to vacate, set aside, or correct sentence. After having considered such motion, its supporting memorandum, the government's response, the reply, and pertinent parts of the record in Case No. 4:16-CR-132-A, styled "United States of America v. Charles Ben Bounds, et al., " the court has concluded that the motion should be dismissed as untimely.
On June 15, 2016, movant was named with others in a one-count second superseding indictment charging her with conspiracy to possess with intent to distribute 50 grams or more of a mixture and substance containing a detectable amount of methamphetamine, in violation of 21 U.S.C. § 846. CR Doc. 286. On July 29, 2016, movant appeared for rearraignment and pleaded guilty without benefit of a plea agreement. CR Doc. 459. Movant signed a factual resume setting forth the penalties she faced, the elements of the offense, and the stipulated facts reflecting that she had committed each of the elements of the offense. CR Doc. 461. Under oath, movant stated that no one had made any promise or assurance of any kind to induce her to plead guilty. Further, movant stated her understanding that the guideline range was advisory and was one of many sentencing factors the court could consider; that the guideline range could not be calculated until the presentence report ("PSR") was prepared; the court could impose a sentence more severe than the sentence recommended by the advisory guidelines and movant would be bound by her guilty plea; movant was satisfied with her counsel and had no complaints regarding her representation; and, movant and counsel had reviewed the factual resume and movant understood the meaning of everything in it and the stipulated facts were true and accurate. CR Doc. 1451.
Movant's PSR calculated her total offense level to be 35, based on a base offense level of 36 with a two-level enhancement for possession of a firearm and a three-level reduction for acceptance of responsibility. CR Doc. 778, ¶¶ 36-45. Her total offense level combined with a criminal history category of IV produced a guideline range of 235 to 293 months. Id., ¶ 96. The PSR also noted that movant had pending state charges against her. Id., ¶ 97. Movant did not object to the PSR. CR Doc. 1001. The government filed a motion for downward departure based on movant's substantial assistance to the government in its investigation and prosecution of others. CR Doc. 817. The court granted the motion and sentenced movant to a term of imprisonment of 200 months, giving her the benefit of a 35-month reduction below the bottom of the guideline range. CR Doc. 1454 at 13; CR Doc. 997. The court specifically informed movant of her right to appeal and told her that the clerk would file a notice of appeal forthwith if she were to specifically request it. CR Doc. 1454 at 16. The court noted that movant and her attorney had been given a form explaining appeal rights that they were to review and sign once they were satisfied that they understood it. Movant's counsel affirmed that the form had been signed and returned to the court co-ordinator. Id.
Movant did not appeal and her judgment became final on January 6, 2017. United States v. Plascencia, 537 F.3d 385, 388 (5th Cir. 2008).
Id. In her memorandum, movant admits that she knew she had fourteen days in which to file a notice of appeal because the court had so admonished her. Doc. 2 at 1.
|
import bpy
import sys
scene = bpy.context.scene
scene.frame_set(1)
output_filename = None
parsing = False
# Arguments after '--' belong to this script; the last one selects which
# object to export ('*' exports every object).
for i in sys.argv:
    if parsing:
        output_filename = i
    elif i == '--':
        parsing = True
for obj in bpy.data.objects:
obj.select = False
for obj in bpy.data.objects:
if obj.name != 'Lamp' and obj.name != 'Lamp2' and obj.name != 'Camera':
filename = obj.name
        if output_filename is not None:
if output_filename == '*' or output_filename == filename + '.obj':
obj.hide = False
obj.hide_render = False
obj.select = True
#bpy.ops.export_scene.autodesk_3ds(filepath=bpy.path.abspath(filename))
bpy.ops.export_scene.obj(filepath=bpy.path.abspath(filename) + '.obj',
check_existing=False,
use_selection=True,
use_normals=True,
use_uvs=True,
use_materials=True,
use_triangles=True,
group_by_material=True,
path_mode='RELATIVE')
obj.select = False
obj.hide = True
obj.hide_render = True
else:
print('Filename: ' + filename)
|
You are able to rely on Bathroom Toilet Guys to supply the most effective expert services for Bathroom Toilets in Lava Hot Springs, ID. You want the most sophisticated solutions in the market, and our team of experienced contractors can provide this. We work with top of the line materials and budget friendly solutions to guarantee that you'll get the finest service for the greatest value. Call us today at 888-247-8873 and we'll be ready to go over the options, address all your questions, and set up a consultation to start planning your task.
Here at Bathroom Toilet Guys, we are aware that you have to stay within your budget and reduce costs everywhere it's possible to. On the other hand, you'll need the most beneficial and highest standard of work when it comes to Bathroom Toilets in Lava Hot Springs, ID. Our company offers the best quality even while still costing you less. Our plan is to be sure you enjoy the best supplies and an end result which can last over time. We'll achieve this by supplying you with the top prices around and avoiding costly mistakes. Choose Bathroom Toilet Guys when you'd like the most impressive products and services at a minimal cost. You'll be able to reach our team by dialing 888-247-8873 to learn more.
To come up with the best decisions regarding Bathroom Toilets in Lava Hot Springs, ID, you've got to be well informed. We will not let you make imprudent choices, because we understand just what we are undertaking, and we make sure you know what to anticipate from the work. You won't deal with any unexpected situations whenever you work with Bathroom Toilet Guys. Start by contacting 888-247-8873 to talk about your work. Within this phone call, you'll get the questions you have responded to, and we're going to set up a time to commence work. Our staff can arrive at the scheduled time with the required resources, and will work with you throughout the process.
You have plenty of reasons to choose Bathroom Toilet Guys for Bathroom Toilets in Lava Hot Springs, ID. We have the best customer support ratings, the very best quality products, and the most useful and productive cash saving solutions. Our company is there to help you with the most expertise and experience in the field. Contact 888-247-8873 whenever you need Bathroom Toilets in Lava Hot Springs, and we are going to work together with you to successfully finish the project.
|
"""
SALTS XBMC Addon
Copyright (C) 2014 tknorris
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program. If not, see <http://www.gnu.org/licenses/>.
"""
import re
import urlparse
import kodi
import log_utils # @UnusedImport
import dom_parser
from salts_lib import scraper_utils
from salts_lib.constants import FORCE_NO_MATCH
from salts_lib.constants import VIDEO_TYPES
from salts_lib.constants import QUALITIES
from salts_lib.constants import Q_ORDER
from salts_lib.utils2 import i18n
import scraper
BASE_URL = 'http://www.tvshow.me'
class Scraper(scraper.Scraper):
base_url = BASE_URL
def __init__(self, timeout=scraper.DEFAULT_TIMEOUT):
self.timeout = timeout
self.base_url = kodi.get_setting('%s-base_url' % (self.get_name()))
@classmethod
def provides(cls):
return frozenset([VIDEO_TYPES.EPISODE])
@classmethod
def get_name(cls):
return 'TVShow.me'
def get_sources(self, video):
source_url = self.get_url(video)
hosters = []
if source_url and source_url != FORCE_NO_MATCH:
url = urlparse.urljoin(self.base_url, source_url)
html = self._http_get(url, require_debrid=True, cache_limit=.5)
title = dom_parser.parse_dom(html, 'title')
if title:
title = re.sub('^\[ST\]\s*–\s*', '', title[0])
meta = scraper_utils.parse_episode_link(title)
page_quality = scraper_utils.height_get_quality(meta['height'])
else:
page_quality = QUALITIES.HIGH
fragment = dom_parser.parse_dom(html, 'section', {'class': '[^"]*entry-content[^"]*'})
if fragment:
for section in dom_parser.parse_dom(fragment[0], 'p'):
match = re.search('([^<]*)', section)
meta = scraper_utils.parse_episode_link(match.group(1))
if meta['episode'] != '-1' or meta['airdate']:
section_quality = scraper_utils.height_get_quality(meta['height'])
else:
section_quality = page_quality
if Q_ORDER[section_quality] < Q_ORDER[page_quality]:
quality = section_quality
else:
quality = page_quality
for stream_url in dom_parser.parse_dom(section, 'a', ret='href'):
host = urlparse.urlparse(stream_url).hostname
hoster = {'multi-part': False, 'host': host, 'class': self, 'views': None, 'url': stream_url, 'rating': None, 'quality': quality, 'direct': False}
hosters.append(hoster)
return hosters
def get_url(self, video):
return self._blog_get_url(video, delim=' ')
@classmethod
def get_settings(cls):
settings = super(cls, cls).get_settings()
settings = scraper_utils.disable_sub_check(settings)
name = cls.get_name()
settings.append(' <setting id="%s-filter" type="slider" range="0,180" option="int" label=" %s" default="60" visible="eq(-3,true)"/>' % (name, i18n('filter_results_days')))
settings.append(' <setting id="%s-select" type="enum" label=" %s" lvalues="30636|30637" default="0" visible="eq(-4,true)"/>' % (name, i18n('auto_select')))
return settings
def search(self, video_type, title, year, season=''): # @UnusedVariable
html = self._http_get(self.base_url, params={'s': title}, require_debrid=True, cache_limit=1)
post_pattern = '<h\d+[^>]+class="entry-title[^>]*>\s*<[^>]+href="(?P<url>[^"]*/(?P<date>\d{4}/\d{1,2}/\d{1,2})[^"]*)[^>]+>(?:\[ST\]\s+–\s*)?(?P<post_title>[^<]+)'
date_format = '%Y/%m/%d'
return self._blog_proc_results(html, post_pattern, date_format, video_type, title, year)
|
A suburban and low-rise urban district of the north, in the outer London Boroughs of Haringey and a small part of Barnet. It is between Highgate, Hampstead Garden Suburb, East Finchley and Crouch End. It has many roads with Edwardian architecture.
Family life is what Muswell Hill is all about, with people making a beeline for homes in the catchment areas of the most popular state schools.
Located seven miles north of central London with Highgate to the south, Finchley to the west, Wood Green to the east and Friern Barnet and Whetstone to the north, Muswell Hill’s popularity has been achieved in spite of its relative isolation.
Some Muswell Hill residents live within walking distance of East Finchley or Highgate Tube stations, which are both on the Northern line. Others can walk to Alexandra Palace train station with services to Moorgate via Finsbury Park. All stations are in Zone 3 and an annual travel card to Zone 1 will cost around £1,600. For the rest, the only form of public transport to take them out of the area is the very regular W7 bus service to Finsbury Park, or the 102 or 299 buses to Bounds Green Tube on the Piccadilly line.
If Muswell Hill sounds like the place for you then give one of our Niche consultants a call today on: 020 3970 4142 and they will help guide you to your dream property.
|
import psycopg2
from psycopg2 import extras
from principal.parameters import *
# Create your views here.
import sys
reload(sys) # to re-enable sys.setdefaultencoding()
sys.setdefaultencoding('utf-8')
class AdminBD:
#conn_string = ""
def __init__(self):
#host='127.0.0.1' dbname='docker' user='docker' password='docker' port='49153'"
try:
#self.conn = psycopg2.connect(database="docker", user="docker", password="docker", host="localhost", port="49153")
self.conn = psycopg2.connect(database=DATABASE, user=USER, password=PASSWORD, host=HOST, port=PORT)
# get a connection, if a connect cannot be made an exception will be raised here
# conn.cursor will return a cursor object, you can use this cursor to perform queries
self.cursor = self.conn.cursor(cursor_factory=psycopg2.extras.DictCursor)
#self.conn.set_character_encoding('utf8')
self.conn.set_client_encoding('UTF-8')
#self.cursor.execute("SET CLIENT_ENCODING TO 'LATIN-1';")
#cursor_factory=psycopg2.extras.DictCursor
except:
raise Exception('No se pudo conectar a la DB!')
def get_eid_papers_proyecto(self, proyecto):
consulta = "SELECT eid from paper_proyecto JOIN paper ON id_paper=id WHERE id_proyecto = %s;" %(str(proyecto))
try:
self.cursor.execute(consulta)
filas = self.cursor.fetchall()
eids=[]
for row in filas:
eids.append( row['eid'])
return eids
except psycopg2.DatabaseError, e:
raise Exception('No se pudo get_eid_papers_proyecto()')
def get_autores(self, proyecto):
consulta= "Select au.nombre_autor from paper_proyecto pp JOIN paper_autor pa ON pa.paper_id = pp.id_paper JOIN autor au ON au.id = autor_id WHERE pp.id_proyecto = %s;" %(str(proyecto))
try:
self.cursor.execute(consulta)
filas = self.cursor.fetchall()
return filas
except psycopg2.DatabaseError, e:
raise Exception('No se pudo get_autores()')
#Select au.nombre_autor from paper_proyecto pp JOIN paper_autor pa ON pa.paper_id = pp.id_paper JOIN autor au ON au.id = autor_id WHERE pp.id_proyecto = 1;
def get_dois_proyecto(self, proyecto):
consulta= "SELECT doi from paper_proyecto pp JOIN paper p ON pp.id_paper=p.id WHERE pp.id_proyecto =%s AND p.descargo=false AND NOT doi='00000';" %str(proyecto)
try:
self.cursor.execute(consulta)
filas = self.cursor.fetchall()
doi=[]
for row in filas:
doi.append( row['doi'])
return doi
except psycopg2.DatabaseError, e:
raise Exception('No se pudo get_dois_proyecto()')
def insertar_papers(self, proyecto,papers):
for paper in papers:
consulta = "INSERT INTO paper (doi,fecha,titulo_paper, total_citaciones,eid,abstract,descargo,link_source) VALUES (\'%s\',\'%s\',\'%s\',\'%s\',\'%s\',\'%s\',\'%s\',\'%s\') RETURNING id"%(paper['doi'], paper['fecha'], paper['titulo'],str(0), '00000', paper['abstract'], 'FALSE', paper['link'])
try:
self.cursor.execute(consulta)
self.conn.commit()
data = self.cursor.fetchall()
id_paper=data[0][0]
self.insertar_autores(paper['autores'], id_paper)
self.insertar_paper_proyecto(proyecto,id_paper)
except psycopg2.DatabaseError, e:
if self.conn:
self.conn.rollback()
raise Exception('No se pudo insertar_papers()')
sys.exit(1)
def insertar_autor(self,autor):
autor = autor.replace("'","''")
consulta = "INSERT INTO autor (nombre_autor) VALUES('%s') RETURNING id;"%(autor)
try:
self.cursor.execute(consulta)
self.conn.commit()
data = self.cursor.fetchall()
return data[0][0]
except psycopg2.DatabaseError, e:
if self.conn:
self.conn.rollback()
raise Exception('No se pudo insertar_autor()')
sys.exit(1)
def insertar_paper_autor(self,id_autor,id_paper):
consulta = "INSERT INTO paper_autor (paper_id,autor_id) VALUES(\'%s\',\'%s\');"%(str(id_paper), str(id_autor))
try:
self.cursor.execute(consulta)
self.conn.commit()
except psycopg2.DatabaseError, e:
if self.conn:
self.conn.rollback()
raise Exception('No se pudo insertar_paper_autor()')
sys.exit(1)
def insertar_autores(self,autores,id_paper):
for autor in autores:
id_autor=self.insertar_autor(autor)
self.insertar_paper_autor(id_autor,id_paper)
def insertar_paper_proyecto(self,id_proyecto,id_paper):
consulta = "INSERT INTO paper_proyecto (id_proyecto,id_paper) VALUES(\'%s\',\'%s\');"%(str(id_proyecto), str(id_paper))
try:
self.cursor.execute(consulta)
self.conn.commit()
except psycopg2.DatabaseError, e:
if self.conn:
self.conn.rollback()
raise Exception('No se pudo insertar_paper_proyecto()')
sys.exit(1)
def get_papers_eid(self, eids):
consulta = 'SELECT titulo_paper, link_source FROM paper WHERE '
count = 0
for eid in eids:
if count == 0:
concat = ' eid = \'%s\'' %(str(eid))
consulta += concat
else:
concat = ' OR eid = \'%s\'' %(str(eid))
consulta += concat
count +=1
try:
self.cursor.execute(consulta)
filas = self.cursor.fetchall()
#filas=[]
papers=[]
for row in filas:
#papers.append({"titulo": row['titulo_paper'], "link": row['link'])
papers.append({"titulo": row['titulo_paper'], "link_source": row['link_source']})
#eids.append( row['eid'])
return papers
except psycopg2.DatabaseError, e:
raise Exception('No se pudo get_papers_eid()')
sys.exit(1)
    def get_papers_proyecto(self, proyecto):
        consulta = "SELECT id_paper, titulo_paper, fecha, total_citaciones, revista_issn, eid, abstract from paper_proyecto pp JOIN paper p ON p.id=pp.id_paper WHERE pp.id_proyecto=%s" % str(proyecto)
        try:
            self.cursor.execute(consulta)
            filas = self.cursor.fetchall()
            papers = []
            for row in filas:
                # DictCursor rows convert directly to a dict of the selected columns
                papers.append(dict(row))
            return papers
        except psycopg2.DatabaseError, e:
            if self.conn:
                self.conn.rollback()
            raise Exception('No se pudo get_papers_proyecto()')
def getAuthors(self, paper_id):
#cur = self.cursor(cursor_factory=psycopg2.extras.DictCursor)
query = """
SELECT
id_scopus AS authid,
nombre_autor AS authname,
id_afiliacion_scopus AS afid
FROM
paper_autor pau, autor au
WHERE
pau.paper_id = {} AND pau.autor_id = au.id;
"""
query = query.format(paper_id)
self.cursor.execute(query)
#cur.execute(query)
data = self.cursor.fetchall()
authors = []
for data_tuple in data:
authors.append(dict(data_tuple))
return authors
def getAffiliation(self, paper_id):
#cur = self.cursor(cursor_factory=psycopg2.extras.DictCursor)
query = """
SELECT
scopus_id AS afid,
nombre AS affilname,
pais AS affiliation__country,
ciudad AS affiliation__city,
variante_nombre AS name__variant
FROM
paper_afiliacion pa, afiliacion a
WHERE
pa.paper_id = {} AND pa.afiliacion_id = a.id
"""
query = query.format(paper_id)
#cur.execute(query)
self.cursor.execute(query)
data = self.cursor.fetchone()
return dict(data) if data else {}
def getKeywords(self, paper_id):
#cur = conn.cursor(cursor_factory=psycopg2.extras.DictCursor)
query = """
SELECT
pk.paper_id,
string_agg(k.keyword, '|') as authkeywords
FROM
paper_keyword pk, keyword k
WHERE
pk.paper_id = {} AND pk.keyword_id = k.id
GROUP BY pk.paper_id
"""
query = query.format(paper_id)
self.cursor.execute(query)
#cur.execute(query)
data = self.cursor.fetchone()
return data['authkeywords'] if data else ''
def getPapers(self, project_id):
#cur = self.cursor(cursor_factory=psycopg2.extras.DictCursor)
query = """
SELECT
id,
p.link_source AS prism_url,
eid, titulo_paper AS dc_title,
doi AS prism_doi,
abstract AS dc_description,
fecha AS prism_coverDate,
total_citaciones AS citedby__count
FROM
paper p, paper_proyecto pp
WHERE
pp.id_proyecto = {} AND pp.id_paper = p.id;
"""
query = query.format(project_id)
self.cursor.execute(query)
data = self.cursor.fetchall()
papers = []
for data_tuple in data:
paper_id = data_tuple[0]
paper = dict(data_tuple)
paper['authors'] = self.getAuthors(paper_id)
paper['affiliation'] = self.getAffiliation(paper_id)
paper['authkeywords'] = self.getKeywords(paper_id)
papers.append(paper)
return papers
"""
data = {"Hola": "hola", "mundo": [1,2,3] }
import json
with open('data.txt', 'w') as outfile:
json.dump(data, outfile)
"""
|
Ottoman science emerged and developed on the basis of the scientific legacy and institutions of the Seljukid Turks. It greatly benefited from the activities of scholars who came from Egypt, Syria, Iran, and Turkistan, which were homelands of some of the most important scientific and cultural centers of the time. The Ottomans preserved and enriched the cultural and scientific heritage of the Islamic world, giving it new dynamism and vigor. Thus, the Islamic scientific tradition reached its climax in the 16th century. Moreover, proximity allowed the Ottomans to learn early on of European innovations and discoveries. The Ottomans began, already in the 15th century, to transfer Western technology (especially firearms, cartography, and mining), and they also had some access to Renaissance astronomy and medicine through emigrant Jewish scholars. The interests of the Ottomans remained selective, however, because of their feelings of moral and cultural superiority and the self-sufficiency of their economic and educational system. They thus did not track the scientific and intellectual developments of the Renaissance and the Scientific Revolution during their heyday.
In the 17th century, contacts with Europe became closer and Ottoman knowledge of the West came through translations made from European languages, personal observations of Ottoman ambassadors who paid official visits to Europe, and the modern educational institutions established in the 18th and 19th centuries.
The first work of astronomy translated from European languages was the astronomical tables by the French astronomer Noel Duret (d. ca. 1650). The translation, made by the Ottoman astronomer Tezkereci Köse İbrahim Efendi (Zigetvarlı) in 1660, was also the first book in Ottoman literature to mention Copernicus and his heliocentric system. Astronomy books subsequently translated from European languages also dealt mostly with astronomical tables. From the 16th century onwards, the arrival of physicians and diseases from the West introduced new medical ideas and methods of prophylaxis and treatment. The medical doctrines of Paracelsus and his followers began to appear in Ottoman medical literature under the names of tıbb-ı cedid (new medicine) and tıbb-ı kimyaî (chemical medicine).
Instructors at the imperial schools of engineering or medical sciences translated and compiled books from European scientific literature. They relied on the textbooks used in European military technical or medical schools. In the 19th century, modern education became widespread, civilian education was reorganized, and new scientific and technical books were printed. The mid-19th century thus witnessed an increase in both the number of printed books on modern science and techniques, and in the variety of subjects introduced.
Prior to the 18th century, it was not difficult for the Ottomans to keep up with European technology, for it changed relatively slowly. Large state enterprises such as the Maritime Arsenal, the Arsenal of Ordnance and Artillery, the Powder mill and the Mint, functioned fairly successfully to meet the needs of the military. In the 18th century, forced into constant retreat in Central Europe , the Ottomans gave up their policy of conquest and began to follow European developments closely, turning their attention to the cultural and technical sources of European superiority. Thus commenced a period of affluence, called the Tulip Age (1718-1730); under Western influence, new developments emerged not only in the technical fields, but also in art and architecture. Innovations such as the fire pump and the printing press were established in the Ottoman capital. Ottoman administrators who learnt about European daily life also developed a great interest in nonmilitary European inventions.
During the 18th century, innovation in European war technology began to accelerate, and it became harder for the Ottomans to keep pace. The Ottomans sought gradually to import Western military science and to modernize their army. A first attempt was the creation in 1735 of the Corps of Bombardiers, under the supervision of the Comte de Bonneval. Besides undergoing drills, the bombardiers in this corps received theoretical training in geometry, trigonometry, ballistics, and technical drawing. In the second half of the 18th century, a group of French experts came to Istanbul within the framework of military aid agreements. One of them, the Baron de Tott, was employed in building fortifications and in teaching new European military techniques. He established a new foundry; the "Corps de Diligents", where artillerymen were trained in the European manner; and a school where courses were given for the first time on theoretical mathematics and military techniques. He introduced European techniques to the Imperial Maritime Arsenal as well. De Tott's cooperation with the Ottomans lasted six years, and he returned to France when local, French, and personal interests ceased to overlap. Between 1783 and 1788, numerous French military experts and officers came to Istanbul to work on various technical projects and the fortification of the Ottoman borders. When French experts, masters and teachers left Istanbul in 1788, their native Ottoman counterparts were employed in their place. In the 19th century, European technical knowledge continued to filter through previously established schools of engineering and also through the students sent abroad to study in various fields.
During the early decades of the 18th century, while transferring European techniques to strengthen their military power, the Ottomans started industrialization by establishing in Istanbul small-scale workshops for wool, cotton, paper, silk and porcelain manufacture. Fostered by the reforms of Sultan Mustafa III (1757-1774) and Selim III (1789-1807), the workshops established in the late 18th century were generally designed to serve the military. As early as 1793-1794, Sultan Selim III introduced contemporary European methods and equipment for the production of cannons, rifles, mines, and gunpowder. In 1804, he undertook the construction of elaborate buildings to house a woolen mill for uniforms, and a paper factory. During Sultan Mahmud II's reign (1808-1839), a spinning mill and a leather tannery were built and boot works were improved. After the abolition of the Janissary Corps (1826), the army adopted European-style equipment; this, however, worked against domestic self-sufficiency. A factory was opened in 1832-1833 to manufacture the fez, a kind of headgear. In 1841, the Ottomans began to produce the fez with steam-powered machines.
By 1841, the need for a massive industrial program had become obvious. The enterprises set up between 1847 and 1848 can be called imperial factories. Among these were the Zeytinburnu Iron Factory, the Izmit Woolen Cloth Factory, the Hereke Silk Cloth Factory, the Veliefendi Printed Wool-Cloth Factory, the Mihalic State Farms, and the School of Iron Ore and Agriculture in Büyükada. There were also plans to open talimhanes (training courses) on "mines, geometry, chemistry and sheep-breeding".
To serve the army's needs, the Gun Foundry and the Arsenal were installed with steam engines. The Arsenal at the Golden Horn , provided with European equipment and personnel, typifies the attitude of the Ottoman state toward technology transfer. Like the Feshane, it employed numerous foreign (particularly English, French and American) workers and administrators. Until the end of the century, Ottoman industries equipped with Western technologies depended largely on a foreign labor force.
The main purpose of founding and building imperial factories was to produce the necessary materials for the army, and to meet expenses with internal rather than external resources. The bureaucrats of the Tanzimat were aware that it was as important to encourage exports as to limit imports. Nevertheless, this objective went unrealized.
In the 1880s, the state shifted its emphasis from building factories to encouraging their creation by entrepreneurs. Thus, the role of entrepreneurs gradually increased, and the number of factories grew significantly in the 1880s, when three-quarters of the Ottoman factories were established. The 19th-century attempts to industrialize and to transfer Western technology did not yield the expected results. This limited success may have resulted from mistakes in Ottoman policies and the pressure of foreign powers. To begin with, it was very hard for the Ottomans to find the necessary capital for industrialization; Western capital investments, which entailed heavy conditions and difficulties, did not develop in the direction that the Ottomans wished. Instead, Western investments favored the interests of the non-Muslim subjects and ethnic groups who had cultural affinities with Europe. Moreover, the West quite naturally made its investment decisions with an eye, above all, toward profits. Ottoman attempts in the 19th century to transfer modern technology and to found independent industrial enterprises were also hindered by deep-rooted European hostility. Although the initiatives in heavy industry met with limited success, there was rapid growth in low-level technology transfer. For example, yarn and dye technologies were adopted quickly and quite extensively.
The Ottomans' haste to bridge the gap with Europe and regain their old power led them to commit political errors. The Ottomans adopted modern science and technology mostly through "translations" and "purchase", and failed to produce science and develop technology themselves; they failed, that is, to establish an indigenous tradition in science and industry which would decrease their dependence on the West. I believe that this was the most critical factor that made the Ottoman experience different from that of Russia and Japan.
|
# coding=utf-8
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
#
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is
# regenerated.
# --------------------------------------------------------------------------
from msrest.pipeline import ClientRawResponse
from .. import models
class Polymorphism(object):
"""Polymorphism operations.
:param client: Client for service requests.
:param config: Configuration of service client.
:param serializer: An object model serializer.
    :param deserializer: An object model deserializer.
"""
def __init__(self, client, config, serializer, deserializer):
self._client = client
self._serialize = serializer
self._deserialize = deserializer
self.config = config
def get_valid(
self, custom_headers=None, raw=False, **operation_config):
"""Get complex types that are polymorphic.
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:rtype: :class:`Fish
<fixtures.acceptancetestsbodycomplex.models.Fish>`
:rtype: :class:`ClientRawResponse<msrest.pipeline.ClientRawResponse>`
if raw=true
"""
# Construct URL
url = '/complex/polymorphism/valid'
# Construct parameters
query_parameters = {}
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if custom_headers:
header_parameters.update(custom_headers)
# Construct and send request
request = self._client.get(url, query_parameters)
response = self._client.send(request, header_parameters, **operation_config)
if response.status_code not in [200]:
raise models.ErrorException(self._deserialize, response)
deserialized = None
if response.status_code == 200:
deserialized = self._deserialize('Fish', response)
if raw:
client_raw_response = ClientRawResponse(deserialized, response)
return client_raw_response
return deserialized
def put_valid(
self, complex_body, custom_headers=None, raw=False, **operation_config):
"""Put complex types that are polymorphic.
:param complex_body: Please put a salmon that looks like this:
{
'fishtype':'Salmon',
'location':'alaska',
'iswild':true,
'species':'king',
'length':1.0,
'siblings':[
{
'fishtype':'Shark',
'age':6,
'birthday': '2012-01-05T01:00:00Z',
'length':20.0,
'species':'predator',
},
{
'fishtype':'Sawshark',
'age':105,
'birthday': '1900-01-05T01:00:00Z',
'length':10.0,
'picture': new Buffer([255, 255, 255, 255,
254]).toString('base64'),
'species':'dangerous',
},
{
'fishtype': 'goblin',
'age': 1,
'birthday': '2015-08-08T00:00:00Z',
'length': 30.0,
'species': 'scary',
'jawsize': 5
}
]
};
:type complex_body: :class:`Fish
<fixtures.acceptancetestsbodycomplex.models.Fish>`
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:rtype: None
:rtype: :class:`ClientRawResponse<msrest.pipeline.ClientRawResponse>`
if raw=true
"""
# Construct URL
url = '/complex/polymorphism/valid'
# Construct parameters
query_parameters = {}
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if custom_headers:
header_parameters.update(custom_headers)
# Construct body
body_content = self._serialize.body(complex_body, 'Fish')
# Construct and send request
request = self._client.put(url, query_parameters)
response = self._client.send(
request, header_parameters, body_content, **operation_config)
if response.status_code not in [200]:
raise models.ErrorException(self._deserialize, response)
if raw:
client_raw_response = ClientRawResponse(None, response)
return client_raw_response
def put_valid_missing_required(
self, complex_body, custom_headers=None, raw=False, **operation_config):
"""Put complex types that are polymorphic, attempting to omit required
'birthday' field - the request should not be allowed from the client.
:param complex_body: Please attempt put a sawshark that looks like
this, the client should not allow this data to be sent:
{
"fishtype": "sawshark",
"species": "snaggle toothed",
"length": 18.5,
"age": 2,
"birthday": "2013-06-01T01:00:00Z",
"location": "alaska",
"picture": base64(FF FF FF FF FE),
"siblings": [
{
"fishtype": "shark",
"species": "predator",
"birthday": "2012-01-05T01:00:00Z",
"length": 20,
"age": 6
},
{
"fishtype": "sawshark",
"species": "dangerous",
"picture": base64(FF FF FF FF FE),
"length": 10,
"age": 105
}
]
}
:type complex_body: :class:`Fish
<fixtures.acceptancetestsbodycomplex.models.Fish>`
:param dict custom_headers: headers that will be added to the request
:param bool raw: returns the direct response alongside the
deserialized response
:param operation_config: :ref:`Operation configuration
overrides<msrest:optionsforoperations>`.
:rtype: None
:rtype: :class:`ClientRawResponse<msrest.pipeline.ClientRawResponse>`
if raw=true
"""
# Construct URL
url = '/complex/polymorphism/missingrequired/invalid'
# Construct parameters
query_parameters = {}
# Construct headers
header_parameters = {}
header_parameters['Content-Type'] = 'application/json; charset=utf-8'
if custom_headers:
header_parameters.update(custom_headers)
# Construct body
body_content = self._serialize.body(complex_body, 'Fish')
# Construct and send request
request = self._client.put(url, query_parameters)
response = self._client.send(
request, header_parameters, body_content, **operation_config)
if response.status_code not in [200]:
raise models.ErrorException(self._deserialize, response)
if raw:
client_raw_response = ClientRawResponse(None, response)
return client_raw_response
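A quick way to sanity-check the polymorphic payloads described in the docstrings above is to walk the nested dict and confirm every node carries the `fishtype` discriminator the serializer dispatches on. This is an illustrative sketch, not part of the generated client; the `discriminators` helper and the trimmed payload are assumptions built from the `put_valid` docstring.

```python
def discriminators(fish):
    """Yield the 'fishtype' discriminator of a Fish payload and its siblings."""
    yield fish['fishtype']
    for sibling in fish.get('siblings', []):
        yield from discriminators(sibling)

# Trimmed version of the salmon payload from the put_valid docstring.
salmon = {
    'fishtype': 'Salmon',
    'location': 'alaska',
    'iswild': True,
    'species': 'king',
    'length': 1.0,
    'siblings': [
        {'fishtype': 'Shark', 'age': 6,
         'birthday': '2012-01-05T01:00:00Z',
         'length': 20.0, 'species': 'predator'},
        {'fishtype': 'Sawshark', 'age': 105,
         'birthday': '1900-01-05T01:00:00Z',
         'length': 10.0, 'species': 'dangerous'},
    ],
}

print(sorted(discriminators(salmon)))  # ['Salmon', 'Sawshark', 'Shark']
```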
|
The Prototype mode is an amazing feature which sets Adobe XD apart from other design tools.
In this mode, you can easily create an interactive prototype of what you are designing by simply connecting the artboards included in your web or mobile project, choosing a type of transition and previewing the final result on your computer or on a mobile device.
This allows you to test and validate the navigation, usability and overall user experience of your mobile app or website before you start with development, avoiding fundamental UX design flaws down the line which might cost you more time and money to fix.
1. To create a prototype, you first need to switch to Prototype Mode by using the little toggle on the top left of the page. You will notice that all the design related toolbars and features are not accessible anymore.
2. Now click on an artboard you want to be part of your prototype. You will notice that a little home icon appears at the top left of your artboard.
If you want this screen to be the first screen of your prototype, click on the home icon. It will turn blue, meaning that this will be your home screen, the first one appearing when the prototype is launched or shared.
3. Now let’s say you want to connect the button on the first screen to the screen next to it, and create an interaction between the two.
Select the button by clicking on it directly on your artboard or from the layers panel on the left. (Note: if your object is a layer group, make sure you select the entire group and not just one of its individual layers.)
The object will be highlighted in blue, and a little arrow handle will be displayed on the right side of it.
Click on this handle, drag the connecting line which will be displayed and release the mouse on the screen you want to connect to the button.
Target: this is the screen we are connecting our object to. You generally do not need to change this, as it automatically picks up the target screen when you connect the two screens together.
Transition: you can choose from a bunch of different transition effects which you should change depending on the actual flow of your prototype. If for example you are prototyping the opening of a sidebar menu on a mobile app, triggered by the tap on a hamburger menu icon on the top left of the screen, you should use a “Slide right” transition.
Easing: I personally don’t bother touching this setting as I don’t see much of a difference. Feel free to play with it though.
Duration: This setting defines how long the chosen transition between the connected screens will last. I am a fan of fast transitions so I generally use 0.2s or 0.4s.
After you are done changing these settings, or if you don’t want to change them at all, just press ESC or click somewhere on the grey canvas and your connection will be completed.
Repeat this process for each screen of your app or website you want to connect.
• If you want to review all the connections a specific artboard has, just click on its title: all connections to other screens will be displayed.
• If you made a mistake and want to delete a connection between two screens, just drag one of the two connecting handles on the grey canvas, or click on it and then press DELETE on the keyboard.
• If you want to change the target screen of a connection, you can do it through the “Target” dropdown menu, or you can just drag the destination handle to a different screen.
You can preview the XD prototype you just created by clicking on the “Play” icon at the top right of the screen.
A preview window will popup and the currently selected artboard will be displayed. You can now click on the objects you connected and navigate through your prototype.
In the same window you can also record a video of your prototype, which you can then share with your clients, coworkers or stakeholders.
To do that, just click on the little record icon at the top right side of the preview window (a timer with the video duration will start). Do all the actions you want to record in the video and then click on the same icon to stop recording and save it (in .mp4 format).
|
from fasta.Fasta import read_fasta
import sys
if __name__ == '__main__':
data = '''
>Rosalind_4592
>Rosalind_9829
TTAGACCATGCTGTTGTACTCCCCCCGTCATGGCAAAAATGACTCATTCGAGTCTTTCGC
ATGCGTCCACCCGGCTGTGGACTTGTCTGTTCGGCCTAGGCGTGACAAAGGTTAAAGTCA
TGTATACAGGATGCCGACCAAATGTAGAGCTACTCATTCCGTATAGCTTCTAAGCCACTA
ACGAGGATACGAAGTGTATAGATATCCTTAAAGCTGGCGTTTGTAGCCTAGTTCGGCTAG
CCGGCATTATAATAATGAGATAAATTACAGGTGATCGCCAAGGCTCATTTAGTTGTGTTC
TTCGCAACTTTCCGGACCTTCGCCCTAATAATTAAAGCGCAGAGCCGCCTCGTATAGGAG
ATTAGTTTTATGCGACCGGATCTCGTAAGAAAATTAGATCGGGAAATTGTTAAACTACAG
TACTTAATTCAGGTAATGACATTTCCGTCGTTTGTATCTTTATAATAGTATATGATCGCC
TATAGTTATAAGAGCGCGAACGACAGCTTACGCGTGGTTGCGATTATCGGGTTAGTCTTA
ACTCGGCCCAAATGAGAAGAATTCACGTGGTAAAGTGTCGGACTGATAGCGCACGCATTC
GGTGACTTCTGATAGGCGACCCAGTATCTACGTCTGCCTATTCTGCACAGCGTATATCCT
ATTCAACCTAGGGTTACGTCGCGAAAGCTATATGCCCACACGGCAAGGGTAGCTGCAAGG
TGATAAGGTTGTTTGATTTGCCTCATAGCTGAGAAGACCGACAGAGGCTGCCTACTGGCA
CGTACAAACCAAGGTAACGTATCTGTGGGACACTTCGATGGTTGTGTCTAACGCCTTGAG
ACGTACGATACAAAACGTAGCGGGATTCACATGCAGGTAGTTGAAGTGTGTTCGGAGCTA
>Rosalind_8886
ACCAGCTATAACGTCCTAGCGTACCCTGCTTTGGTTAAAACACGAAATTGTTACAGTTGA
CGTCCGCCACTACTACGTGGAATATACAACTAGCCGAGTAGCTAAGCACTGTTCGTATCA
TCGGAGGATCCTAGTCTTGTCCTCGCTCTACAAAGGCCAAAAAATGTGTGGCTAAATGGC
GTAGGGTGGCTTATTAGAACTCTATTCATTCTCTATACTAAACTGCACGTGCTACATGGT
CTCCACGCGGGGTTCCGATAGTCGCCGCGAGTCTACGATTGGTACGTCCAAAAACCGTGG
GAGCGGGGAATAAATTTGCGGAACAGCCGACCACTGCCTCAAACACGGCGGGTTGAAGCC
GGGGCGTAGCCCTGAGGCAGTCTTGGGGGCGAGTTCCGCGCCGAGACGTACTGTCTAAGT
CGGACTGGGTACATCAGCGTGCTAGACAATAGCAATCGTCCTATTGATTACCCAAGACCG
CTAAGATTTGGAATCAACACCCTATAGAGGCGTATTTGACGAACCAGGAGTAATAGTTGA
GCATCTATGGATACAATAACCGAGGCCTACTACAGAAAAGTGTCCTAGTTGAGTAATACG
GCGTAACTACTAAAGCCGTTCGTGCAGAAGAGGTCAAGTGTAACGGCTCACGAGCCCAGC
GATCAGCCTTAACTTCTACAATGTGCGAGATATGTTTCGAAGGCACATCGCTTGGCGGCC
TCGCATTTTCTTTCTCGTTTTGGACAAGGCAAACTCAGGATGATGTAGTCGGCTTAGCAA
AAACCATACACAGGGATCTAGCCGAATCGGTCCGGGTCTCAAACACTAGGCTACCAGAAT
CCCTTCGACTGAAGACTCGGTGCAAGCCTCTATGGTACGCATACTCGAAAGCAGATAATA
>Rosalind_8552
GATTCCCGGACACCCGTGCTACCGCACGCCTCTTGTAAGTCAGTTTTCCGCTGAGTGTGG
ATCCGCACGTGATCAGCTGCAGTCACGCCGCACAGTATGGTGTATCTAGGTTCAGACACT
ACAGGCTGCGGGGTGACCGGATTTAGCGTTATCCATAAACAGATAGCGGGTTATGATGCA
TGTTACCACTATGTATATCGGGCCGGGCAGCGATCTGTTAACGGTCGGTAAACTGATCGG
CCGCTAGACGTAGTCTTTAGTAGCGGATCCTGTTCACCCACAATACGACAGGTAGTGCTT
CTGAAGTTATATAGGGCAAATATTGTATGTGGCCGACCGGCACAGCATTGACACGACGTC
TTGTCTCCTGACATCAACCGATAGAAGCATGAGAAGATAGTTACGTTATGGCGAATAGGA
AGGTACGCACCGAAACCTTCCATGAGATGGATGCCCATGCTTCTCTATGGGCGTTCCGGG
GAGCTATGGACTACCGGGATTCAACAGGCACCAGACGTTGTGGCGGGACTGTCGTCCATT
GCCGTGAATCTGGACTTTTAGTTCTAAGTATGAAGCCGTCGGGTCTGTATATGGAATCTG
AAAATCCATAGAGATGGATCACTGTGTATTGTTACGGAGGACTGATTTTCCAAAGTTTAC
CTGGTTACAGACCGCCGGTGGCAATTTTGATTAAAGTGGGGTCTTGATCCTGGCTGTATA
CGTGCTAGCGTCTCTCGCGTACCCCGCTTGAGTCGCAACAGCCGCTACGTCAAGAGACGT
GGTCCTACTAAACGGGCATGGGTCCGATGGTTCGACTCTCGATTGCTGTTCGAACCGGAG
ATTTATAGGGACTGAACCGCCCATCCACCTACTTGACTTCAGAGTGCTTCGCCATAACCC
>Rosalind_3431
TCCCCCGATCCTGCATTTTAGACCGTCATTTCTGAGCGCAGCCGTTACTCTGTGTTCTAG
TCAATACGTGAGCGACGCGTGGTCAAGGATTAGCTTTGTTCTAGCATCATAGGTGGAACT
GTTTCCGAAACCTAGAGCTTGCAAGTAGCTCACCCTGTTTCACTGCATAACGAATTAAGT
GATACGAGCCTTAGGTAAACTATGGAATAGCATCCCCCAACGCTGCCCTTTGCTAGTTTT
CCAAGCATGCCTGGTTTTAGATCAGTTTACTTTTAAGTGAGTGCGCGTGATGCGAATCTC
TCAGCGATTTATTGCTGACCTACCAGTGAAACTTATGCAAGGCTATGTGCGCCTGGCCGT
ACCTGAAGCCGGGACCAGTTCTATGAGTGGCAATATACCTTCTTTTGGTTCCCTCGTATG
AACGTTAACATGGGGATTGAAAACTTGTTGATGTTTTATTTCAATTGTTCCCATGATGTG
TCGATGGGTGGAACGCATCGCAGTTCGCACAGATAGCGTCCAGGAATGTTCACGCGGGAT
TGCGGGAGCGTAGCTTTCGGGAAAAGGACGACGTTCGCATACCGATCAGTCGCCATGCCA
CATATACGGAGTTGGTATCAGTCTTTTGCATGGTCAGAGCGTACCAGGCCAACCGGACCC
ACATATCGTGGTTACCGCGACAGCAGTATGCACCGGGTGCACAATCCTGCTAAACCCCGG
ATCTTGGGCCATCAAGAGGTTACTTCGAAAGGCTCAATGGCCGTAGTGTGGTGCCGATCC
GGGCATGATCTCCTCGTTTGAATGTTCTGCCGCACCTCAACGGTTAAGTGAACTTACACT
GGAAGGTAGATCGTCACGGCTAAGTTCGGCCAAAACCTCGCCCGCAGTTGGAGCCAATCC
>Rosalind_2026
CTCATCATTTCCCAGCAATGGAGTTAAAGTTGGTCCTCCTCTCGTGAGTGAGCGTTGAAT
TTATAAGTAACCTCGTAGGTCCGAAGGAGAGTAAGGGAATAAGAAACGGCTCCGTTCCTA
ATGACTAGTTAGGAGGTTTGGGATGACGTGAGAAGGGTGTCCCTTTGGTACTCGAATCGG
AATATGTCGCTCGCATCCATGTGCTATACATCCTTACTTGCAAGTCATATGCGGGGTCAG
GGTTAGGTAGCCAGTGGCCTCTGAACTATCGGGATGACCTGTACTAACCGGTTTACAACC
AGACGGACCAGGGCACGGGAGTCCCTACGGTGCCCAGTACTACTGCGGGAAAATACACCC
TCACTGCAATAACGCGAAGACTAAACTCTGCCATAATATCGTAGGTATGCTCGCTCGCGC
GAATCGGTATCCTAGCCTTGGTATTCTTGGCCGGGGCCAATCTGCCCTCCTTAAGCGGAC
CATAATACCGGACCTCGTAATAAGCCGAAATAGATATTCCCATCCAAACGAGTTACCCTA
GGCGAGACGGCAGAGCTTCCATATGGTAAACCTACCTGATGCGGGTATGGCTCCAAAGCG
TTGGCATTCCACCATCCCGGATAAATTAAGGAACCTCATGAGTGGGTTTGCAACTGGGAG
CGTTTGCGCCGACCTTCGCTTTCCCACGTCTTAACTCGATCCGATATCCTGGTCGGGGGT
GAGGGCAGCACGGTACCTGGTTTGCTGACAGGTTCTCTCCGGCCGCCAGCCCCAGGGCGT
GTTAAGAGAAACCGCAACGGAGACCACCTAGTATTTTAGGCCGCCGGGGTGTTAGGTGAA
TAAACACAGACAACTTCCACAACACTCCAATTCTCATACGAACGGGTTAATAAGGATTTT
>Rosalind_5007
GGGAAGCTAAATCCCCCCCCAAGTGGGGCCGAGAAAAATAAGTACGCAGTCCTGTCAACC
CCGGATCTACCTACTCCCATCTTGCGCGGCTTGAACAGATGTGAGGGTCAGCGGCCTCTG
GATGTTCTCTGGCTTGGGTACTAGGAGACGAACCCAATTGTTAGTTCGAATTATTTGCCC
CAGGCGCCGATCCTTCTTCTCGAACGCCTCTTCCTAGCCCTGCCGGCTCCCTCCGAAATG
GACTCAGGCGCTTTCGTAGTCCCGACACGCGCGTTCTCATCTTGGTAATCAACGGTTCTT
GTGAGTGCGAATTGGTCGCACTCGTGCCCTGACACTTGCGGGCGGTGTGGACTTTACAAC
ATATGACCTGCGCACTTTGGGGCATAATCATGAAAATGAACCGCTTGTGACGTAGCGGGT
AGAGTTTGGGTAACTGAAGTGAAGTGCACGCGGGGGACGACAGTCGAGGCGGGTTAGAAT
ATTCGGGAAGCACACACCTACCTTCAGGGTCAGCCGGTGGAGAGGGGGGCTTCCGTGGCA
TTACACATACTGGACCAGATCAACTTACTGCGGTTTATCATAGTTTTCATCAGATTAATT
TTTCACGGTCTGCGAAGCGGTCTCTCAGCACAATAACCCTTTACCTTTCCGCAGATGACT
GTTTGGAATAGATTGGGTAGACACCCCGTCGCCCGCTTATCATGTAAATTACCCACTAAG
GAAGTTCGTACTAAGTAGACGTTTCTGGAAGGACGTCAAGAGAGTGTACTAGACACTATT
AATCTCACCACGATTTGTTGACACTATGCAGAAACTCAGGTTAGATATCCTCCTGTGGCC
TTCCACTTGCACTTTCCATTATCGTGCGCTAACAAAGCACACGACTGGGTCTACAACGTA
>Rosalind_2348
AAAACCCGCGCAGCTCTACGGCCCATCAATCTGAGCTAATAAGTCGTTCGCTTAAAGGGA
CTTCGCACCCCATCATCTGAACAAACCGTCAGACGTCTCTTGTGGACTCTACTGGTACGG
TTCTCGACGAAATTGCGCCATCAACGCAGCACGTAGGTCCACCGTAGCCACCTGAGGTAC
GGCTGGGCACAGTTTGCTCTGTATGCTACTGGGCAGAGAGTGTCTACTACTGCCGGTGCC
TGGACGCGCTCCTGCTAGACCACAATCTCCAAGGAGATTGCCTTGAAAGCTGCATATGTA
GAGTTCATCAATCCATAACTTCTCCGGACGCACTGTAAATCAATTATAGCATCTGTTCAC
TGTGAGATGTGTTTCAGGATGATTCCTTTCTAGAGATACCTTTTGATTGGCAGAGTCCTC
TGGAATCTCGGTGGACCATGGTTCTCATAACTCAGGGATCTCCATTCTATCGCACCGGTT
AACGTGGACACTCTTGTTCTGCGACGCCTGTCTTGTCGCGATGACGAGATACCGTGGGCT
TCGAGTACTATCAAGGACTACGCCGTACCCAAATCAATTGAACGATCAGCATATTGGCGC
GAGGCTTAAACGGCGTCTGCATGAGTATAGTTGCTTGGTCCAGGGCCTTATCATTGCATG
ACTTTATGGTTAGTATATGGTGTGGTAAGTTGTCGGTGGAGCTTTACTGGCCCGTTTATG
ACTGCCCATTACAGACTCCGTGTGCTGAATCGGGTCAGCCACCTCCATCTATGCCGCAGC
TCCCCGCTGGCTACATATCCGCACCGTATACAGGGGAAAGAGTACGCTCTAGACAGACCC
GCTGGCATCTCGTGCCCACCTGGACCGAATAGCGACCACAGTCATTCACGACTTTGCAAT
>Rosalind_3740
GAGCCGTTCTTTTGATATACTTAGGTCCGTGCGGTCTCCACCCAAGCAACTCCTCTTAAC
GCTATTACTCGGATTGATAGTATATAGGTACCTCTGTGAAGTTTTGCAACGAGCATCTTA
CATGTCCCCAGGTGGACTCAAGTATTGACGAGGTGACCGGCGTCCTAGGATTGGGCCAAT
GACTTAAGGGTATTGTGAACTAAGTTATGCTACCTTTCGATCGAGAGATCCACCCTTGCA
TCAGGCCATTTACGGGAGCAGCGTGTAATATTGGCTAGACTATCTCTAGTTCGAAATCGA
CAGATGGGTGCCCTTAGCTACTGTGTCCAGCCAGCCTAATGGTCGAGGGTTGATACATCT
TGATCGTACGTTCCTCTAGCGGCCAAGAACTCGATCGTGTGGAGAACATGAGCAACCCCA
CAGACATGCTATAGTCCTAGAGCTGATCTTTACCCTGGGAGAAGTCTGTTTTCTCGGGGC
ATGCCGTGTTTGGCTGGGCTATTACGGTGCCCCACACGGTACTTACAGTACAAGAGTTAG
CACTGGTTAAGGGATAAATTATCGATATATTGTCCCCCGAGAGCACTTTCAGAGGACCTG
ACCAAGTAACTTATGGGGCCAAGCACGAATCGGTGTCCGTTCGCCTACGCGTGAACCAGC
CGCAGGAGGTTAACGTTACCTATTCTGCTTAGTCCTCAGTCCGATAGTAGCACCTTTGTG
AGCGCAGGGAAAGTAGAGGCTAGGCCTTCATTGGCGCCCCAACAAGACCAACCCAAGGCG
AATAAAACCGCCCCAGTTCAGAGAAATTCGCGGACGAAACACCACTCGTGAACAACCATA
TCATGAGTAGAGTGGCCACATAGACGGGAACAACAAAAACGTAAGTGAATGGCTGGACTT
>Rosalind_0868
CTCTGAGTGGGCACACACTGCGGTGCCACCGCAGCTAGCAAGCGTGATACCACTATTCTT
CAGTGCTCCCTGCAGATGTGGCATACCTTGCTTTAATTTCGTTTAAGGGCGGCATGGCTT
TAACTGTTCTACATGCGTATATTGATCATCCAATGCCGCGCGTGCACAGTTCAAAATTAG
TCAGTTCCCACCCGACAATCTTCCCAGTCACTTCGATAAAAACGGCAGAGAATTTTGCTG
AGCAGAGTGACCATCAATATGGCTTGCGACTTACTTAAGTTTCCTCCCAGGTTATACATT
AATAGCGTCAGCATGCATTCCAGCATGAAGTTCCCAGATTCGCTCTCGCCTCAACTAAAG
CAGAAGCCACCACCGACCACCGCATGTTGTTTTTGGATAGCTACTATTCACACAGAGAAG
CTGTTTCGATTATTTGTGATTTGCACCGATTGAAGATTCGGCTCGATAGGGACTCTCGGA
CAGACTGTACCGGTTAGGGGATCTTTATTTACTATGTTACTATTATGTCTTCCCTAATAC
GCCTCTGCTAGTAGCTAAGGTTCCAGATTAAAACCCGGAGACGTGCGGTCGTACCGATCG
GCGGCCATCACAATGATCTTATTTAATTACACGTAGGCCATTGTCTTCGTCAATTTGCAG
GGCTTTGACTAGGACACACGAACGGCTTGAGGGGAAACCCGGCAACGTGCGCGAATATTC
TTTAGGCATTTTGGAGTGGTCATTTCAGGTCCTACCCCGAACCTGAAAGCGGGTAGGGGC
GTGGAATGCAGCAAACGATGCTTGAGGTCGCTCAAGCGGGCCCAATGTCAAGGGTTACCT
GCGAGAGGCGGAAGTGCAAAGAACCAGCGAAGGATATTGGCTATTCCCTAGTCATGAGGT
>Rosalind_6517
AGCTATTTGGGGTTTCACAATAGAGTTTCGAGGCTTAAGATAGACACCAGGCATAGACGT
CGGCAATCCTTTTACTTCAATATAGATATTATCCAAATTTTAAGCCACTCTTTCCGGTCA
GTTCCGCATCGGCCACCTCTCCTGGCCGCCACATTAAACGACCCTTTCTGTGGTCTTGGA
CTACCTCGCCTGCCATAGCCTACATACAATTGACAGATCTCGCTATTCCGCAAGTGTTGG
GCTAAACAAGGCAAGGATACTCATCTTCGTGCGCGATGGAAGTTATTCCTCTGTCGATGT
CCCAAGTCTGAATTGGAATGCATCAGACTAGTGCTGTCAGACCGCAGCTGGCTCATATGT
GAATCCATTCTTGAACGAGCGCGTCTATGTCTTCGGACTCCTGGGACTATTTACCCGCCA
AATGAGTACGGTATTGTTGCCGCATCACGCGAACACGTAGTGGGGCAAGTTAGGACATAT
GGGTTCCATCATACGTTTGCGAGGCAGCGGTATGGTATAACTCCAGCTAAGGAAGTCGCC
ACGGTTGCTTCGTCAACGAAGGCTGTGATGGACGCAGTCGTGTAGCAAATACTGACAAAA
CACTGAGTTGGCCACAGAAGCGGCTAAAATTAATCATCGTCTTGAAAATGTCGCCTTGAA
ATTGGTACAGTATGTTATGAGCTCGCACGGGGTTGGAGGATAACGAGTTTAAGTTACCTG
CCACGCAAAACATTGAACTCGAAACTTCGTTTTGAGGAGTATCTTTATCAATCGCGTTGG
GTGATTTATGCTGAGGGTATGAGATAATAATGCGATGAACTAGGAAAGCGGAGTTTCTAT
TGGCAGTATGGTCGCTTTATCGTCCATGTCTAAAATCCTTAGTTAGTGAGTTAAATGCAA
'''
seqs = read_fasta(data)
# matrix
m = []
length = 0
for s in seqs:
r = []
length = len(s.sequence)
for c in s.sequence:
r.append(c)
m.append(r)
a = [0] * length
c = [0] * length
g = [0] * length
t = [0] * length
for row in m:
col = 0
for e in row:
if (e == 'A'):
a[col] += 1
elif (e == 'C'):
c[col] += 1
elif (e == 'G'):
g[col] += 1
else:
t[col] += 1
col += 1
profile = list()
for i in range(length):
if (a[i] >= c[i] and a[i] >= g[i] and a[i] >= t[i]):
profile.append('A')
elif (c[i] >= a[i] and c[i] >= g[i] and c[i] >= t[i]):
profile.append('C')
elif (g[i] >= c[i] and g[i] >= a[i] and g[i] >= t[i]):
profile.append('G')
        elif (t[i] >= c[i] and t[i] >= a[i] and t[i] >= g[i]):
profile.append('T')
    # Emit the consensus string, then the per-base counts of the profile.
    sys.stdout.write(''.join(profile) + '\n')
    print('A:', *a)
    print('C:', *c)
    print('G:', *g)
    print('T:', *t)
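The loops above implement the classic consensus-and-profile computation: tally each base per column, then take the per-column majority. The consensus step can be sketched compactly with `collections.Counter`; the toy sequences here are the small Rosalind sample set, not the dataset embedded above.

```python
from collections import Counter

seqs = ['ATCCAGCT', 'GGGCAACT', 'ATGGATCT', 'AAGCAACC',
        'TTGGAACT', 'ATGCCATT', 'ATGGCACT']

# zip(*seqs) iterates over columns; most_common(1) picks the majority base.
consensus = ''.join(Counter(col).most_common(1)[0][0] for col in zip(*seqs))
print(consensus)  # ATGCAACT
```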
|
The GRACE project demonstrates large-scale miscanthus and hemp production on land with low productivity, contaminated soil, or soil that has been abandoned. The aim is to ensure the supply of raw materials for the European bioeconomy.
In the project, ten different demonstration cases are used to show how biomass cultivation can be linked to the near-industrial-scale production of various biobased products.
Miscanthus is a perennial C4 grass originating from South-East Asia. It is a very resource-efficient crop which, after a two-year establishment period, can yield up to 25 tonnes of dry matter a year in Central Europe. It can be harvested annually over a cultivation period of up to twenty years. The application of herbicides is only necessary during the establishment phase. Its biomass can be used for a wide range of utilization pathways including combustion, conversion to bioethanol, production of building materials and of basic chemicals.
MOGU manufactures materials and technologies that are based on fungi as their main constituent.
|
from tensorflow.contrib.learn.python.learn.datasets.mnist import read_data_sets
import numpy as np
import tensorflow as tf
import matplotlib.pyplot as plt
mnist = read_data_sets('data/', one_hot=True)
train_pixels, train_list_values = mnist.train.next_batch(100)
test_pixels, test_list_values = mnist.train.next_batch(10)
train_pixel_tensor = tf.placeholder(dtype=tf.float32, shape=[None, 784])
test_pixels_tensor = tf.placeholder(dtype=tf.float32, shape=[784])
# L1 (Manhattan) distance between each training image and the test image.
distance = tf.reduce_sum(
tf.abs(
tf.add(
train_pixel_tensor,
tf.negative(test_pixels_tensor)
)
),
reduction_indices=1
)
pred = tf.argmin(distance, 0)  # index of the nearest training example
accuracy = 0
init = tf.global_variables_initializer()
with tf.Session() as session:
session.run(init)
for i in range(len(test_list_values)):
nn_index = session.run(
pred,
feed_dict={
train_pixel_tensor: train_pixels,
test_pixels_tensor: test_pixels[i, :]
}
)
trained_value = train_list_values[nn_index]
        true_value = test_list_values[i]  # ground truth comes from the test batch
trained_value_number = np.argmax(trained_value)
true_value_number = np.argmax(true_value)
print('test N %s Predicted Class: %s, True Class: %s, %s'
% (i,
trained_value_number,
true_value_number,
trained_value_number == true_value_number
)
)
if trained_value_number == true_value_number:
accuracy += 1.0 / len(test_pixels)
image = np.reshape(train_pixels[nn_index], [28, 28])
plt.imshow(image)
plt.show()
print('Accuracy = %s' % accuracy)
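The graph above is simply a 1-nearest-neighbour classifier under the L1 metric. The same decision rule can be sketched in plain NumPy; the tiny training set here is illustrative, not MNIST.

```python
import numpy as np

train = np.array([[0., 0.], [1., 1.], [2., 2.]])  # training points
labels = np.array([0, 1, 2])                      # their classes
test = np.array([0.9, 1.1])                       # query point

# Sum of absolute differences per training row, then index of the minimum.
nn = int(np.argmin(np.abs(train - test).sum(axis=1)))
print(labels[nn])  # 1
```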
|
36.3° Enjoyed teaching today. Tired hit right as the kids left. I made it through the afternoon with distracting activities. We had a fun family evening. An early bedtime tonight.
|
# python3
# Copyright 2018 DeepMind Technologies Limited. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Tests for the environment loop."""
from typing import Optional
from absl.testing import absltest
from absl.testing import parameterized
from acme import environment_loop
from acme import specs
from acme import types
from acme.testing import fakes
import numpy as np
EPISODE_LENGTH = 10
# Discount specs
F32_2_MIN_0_MAX_1 = specs.BoundedArray(
dtype=np.float32, shape=(2,), minimum=0.0, maximum=1.0)
F32_2x1_MIN_0_MAX_1 = specs.BoundedArray(
dtype=np.float32, shape=(2, 1), minimum=0.0, maximum=1.0)
TREE_MIN_0_MAX_1 = {'a': F32_2_MIN_0_MAX_1, 'b': F32_2x1_MIN_0_MAX_1}
# Reward specs
F32 = specs.Array(dtype=np.float32, shape=())
F32_1x3 = specs.Array(dtype=np.float32, shape=(1, 3))
TREE = {'a': F32, 'b': F32_1x3}
TEST_CASES = (
('scalar_discount_scalar_reward', None, None),
('vector_discount_scalar_reward', F32_2_MIN_0_MAX_1, F32),
('matrix_discount_matrix_reward', F32_2x1_MIN_0_MAX_1, F32_1x3),
('tree_discount_tree_reward', TREE_MIN_0_MAX_1, TREE),
)
class EnvironmentLoopTest(parameterized.TestCase):
@parameterized.named_parameters(*TEST_CASES)
def test_one_episode(self, discount_spec, reward_spec):
_, loop = _parameterized_setup(discount_spec, reward_spec)
result = loop.run_episode()
self.assertIn('episode_length', result)
self.assertEqual(EPISODE_LENGTH, result['episode_length'])
self.assertIn('episode_return', result)
self.assertIn('steps_per_second', result)
@parameterized.named_parameters(*TEST_CASES)
def test_run_episodes(self, discount_spec, reward_spec):
actor, loop = _parameterized_setup(discount_spec, reward_spec)
# Run the loop. There should be EPISODE_LENGTH update calls per episode.
loop.run(num_episodes=10)
self.assertEqual(actor.num_updates, 10 * EPISODE_LENGTH)
@parameterized.named_parameters(*TEST_CASES)
def test_run_steps(self, discount_spec, reward_spec):
actor, loop = _parameterized_setup(discount_spec, reward_spec)
# Run the loop. This will run 2 episodes so that total number of steps is
# at least 15.
loop.run(num_steps=EPISODE_LENGTH + 5)
self.assertEqual(actor.num_updates, 2 * EPISODE_LENGTH)
def _parameterized_setup(discount_spec: Optional[types.NestedSpec] = None,
reward_spec: Optional[types.NestedSpec] = None):
"""Common setup code that, unlike self.setUp, takes arguments.
Args:
discount_spec: None, or a (nested) specs.BoundedArray.
reward_spec: None, or a (nested) specs.Array.
Returns:
    actor, loop
"""
env_kwargs = {'episode_length': EPISODE_LENGTH}
if discount_spec:
env_kwargs['discount_spec'] = discount_spec
if reward_spec:
env_kwargs['reward_spec'] = reward_spec
environment = fakes.DiscreteEnvironment(**env_kwargs)
actor = fakes.Actor(specs.make_environment_spec(environment))
loop = environment_loop.EnvironmentLoop(environment, actor)
return actor, loop
if __name__ == '__main__':
absltest.main()
|
Brookside Apartments combine the comfort, beauty, and serenity of a rural living space with the luxury, convenience, and amenities of a city apartment building, within minutes of Manhattan, located in historic and beautiful Bloomfield, New Jersey.
We feature 74 unique apartments in three sizes, each a sophisticated living space, making us a tremendous fit for families as well as single individuals looking for a comfortable and beautiful place to live.
Our apartments are perfectly located less than a mile from the Garden State Parkway and 12 miles from the Lincoln Tunnel, making your commute easy and painless. Our living spaces also feature easy accessibility via NJ Transit and the historic Bloomfield Railroad Station, only two stops away from the Hoboken Terminal, which features access to New York City and beyond. We are within walking distance of several major malls, restaurants, and historic downtown Bloomfield.
Don't compromise another minute - call Brookside Apartments today to schedule a showing.
Call us today at (973) 748-1820. Office hours are from 9 am - 6 pm.
For a limited time, Brookside Apartments is featuring recently renovated & remodeled apartments that are currently available for a premium. See more at Our Apartments page.
|
import sqlite3
import os
from pacha.util import get_db_file, get_db_dir
REPOS_TABLE = """CREATE TABLE IF NOT EXISTS repos(
id integer primary key,
path TEXT,
permissions TEXT,
type TEXT,
timestamp TEXT
)"""
METADATA_TABLE = """CREATE TABLE IF NOT EXISTS metadata(
id integer primary key,
path TEXT,
owner TEXT,
grp TEXT,
permissions INT,
ftype TEXT
)"""
DB_FILE = get_db_file()
DB_DIR = get_db_dir()
class Worker(object):
"""CRUD Database operations"""
def __init__(self, db = DB_FILE):
self.db = db
self.conn = sqlite3.connect(self.db)
self.c = self.conn.cursor()
self.c.execute(REPOS_TABLE)
self.c.execute(METADATA_TABLE)
def is_tracked(self):
repo = [i for i in self.get_repo(DB_DIR)]
if repo:
return True
return False
def closedb(self):
"""Make sure the db is closed"""
self.conn.close()
def insert(self, path=None, permissions=None, type=None, timestamp=None):
"""Puts a new repo in the database and checks if the record
is not already there"""
if not timestamp:
stat = os.lstat(path)
timestamp = int(stat.st_mtime)
values = (path, permissions, type, timestamp, path)
command = 'INSERT INTO repos(path, permissions, type, timestamp) select ?,?,?,? WHERE NOT EXISTS(SELECT 1 FROM repos WHERE path=?)'
self.c.execute(command, values)
self.conn.commit()
def insert_meta(self, path, owner, grp, permissions, ftype):
"""Gets the metadata into the corresponding table"""
values = (path, owner, grp, permissions, ftype, path)
command = 'INSERT INTO metadata(path, owner, grp, permissions, ftype) select ?,?,?,?,? WHERE NOT EXISTS(SELECT 1 FROM metadata WHERE path=?)'
self.c.execute(command, values)
self.conn.commit()
def get_meta(self, path):
"""Gets metadata for a specific file"""
values = (path,)
command = "SELECT * FROM metadata WHERE path = (?)"
return self.c.execute(command, values)
def update_timestamp(self, path, timestamp):
"""Updates the timestamp for a repo that got modified"""
values = (timestamp, path)
command = 'UPDATE repos SET timestamp=? WHERE path=?'
self.c.execute(command, values)
self.conn.commit()
def remove(self, path):
"""Removes a repo from the database"""
values = (path,)
command = "DELETE FROM repos WHERE path = (?)"
self.c.execute(command, values)
self.conn.commit()
def get_repos(self):
"""Gets all the hosts"""
command = "SELECT * FROM repos"
return self.c.execute(command)
def get_repo(self, host):
"""Gets attributes for a specific repo"""
values = (host,)
command = "SELECT * FROM repos WHERE path = (?)"
return self.c.execute(command, values)
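The `INSERT ... SELECT ... WHERE NOT EXISTS` idiom used by `insert` and `insert_meta` can be exercised on its own. A minimal sketch against an in-memory database (the table and helper here are illustrative, not pacha's real schema):

```python
import sqlite3

# In-memory database so the sketch leaves nothing behind on disk
conn = sqlite3.connect(":memory:")
c = conn.cursor()
c.execute("CREATE TABLE repos(id integer primary key, path TEXT)")

def insert_repo(path):
    # Same conditional-insert idiom as Worker.insert: the row is only
    # added when no existing row has this path
    c.execute(
        "INSERT INTO repos(path) SELECT ? "
        "WHERE NOT EXISTS(SELECT 1 FROM repos WHERE path=?)",
        (path, path),
    )
    conn.commit()

insert_repo("/srv/repo")
insert_repo("/srv/repo")  # duplicate call is silently skipped
count = c.execute("SELECT COUNT(*) FROM repos").fetchone()[0]
print(count)  # -> 1
```

The same pattern avoids a separate SELECT-then-INSERT round trip and stays race-free within a single connection.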
|
LittleBigPlanet 2 begins with a unique story mode that provides a beautifully reshaped world for Sackboy to explore and play. Each story level is influenced by cultural high points in history.
Players can remap the controller buttons for any object and change the rules of any level through an ability called Direct Control, while multiplayer abilities expand the types of games possible for a social, competitive experience. Users are able to link levels together to provide longer gameplay experiences without the interruption of going back to the Pod, and can see which LittleBigPlanet 2 levels and games their friends are playing via activity streams.
LittleBigPlanet 2 is a platformer and adventure game developed by Media Molecule and published by Sony Interactive Entertainment.
|
#
# -*- coding: utf-8 -*-
# Copyright 2019 Red Hat
# GNU General Public License v3.0+
# (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
"""
The vyos_lldp_interfaces class
It is in this file that the current configuration (as a dict)
is compared to the provided configuration (as a dict) and the command set
necessary to bring the current configuration to its desired end state is
created
"""
from __future__ import absolute_import, division, print_function
__metaclass__ = type
from ansible_collections.ansible.netcommon.plugins.module_utils.network.common.cfg.base import (
ConfigBase,
)
from ansible_collections.vyos.vyos.plugins.module_utils.network.vyos.facts.facts import (
Facts,
)
from ansible_collections.ansible.netcommon.plugins.module_utils.network.common.utils import (
to_list,
dict_diff,
)
from ansible.module_utils.six import iteritems
from ansible_collections.vyos.vyos.plugins.module_utils.network.vyos.utils.utils import (
search_obj_in_list,
search_dict_tv_in_list,
key_value_in_dict,
is_dict_element_present,
)
class Lldp_interfaces(ConfigBase):
"""
The vyos_lldp_interfaces class
"""
gather_subset = [
"!all",
"!min",
]
gather_network_resources = [
"lldp_interfaces",
]
params = ["enable", "location", "name"]
def __init__(self, module):
super(Lldp_interfaces, self).__init__(module)
def get_lldp_interfaces_facts(self):
""" Get the 'facts' (the current configuration)
:rtype: A dictionary
:returns: The current configuration as a dictionary
"""
facts, _warnings = Facts(self._module).get_facts(
self.gather_subset, self.gather_network_resources
)
lldp_interfaces_facts = facts["ansible_network_resources"].get(
"lldp_interfaces"
)
if not lldp_interfaces_facts:
return []
return lldp_interfaces_facts
def execute_module(self):
""" Execute the module
:rtype: A dictionary
:returns: The result from module execution
"""
result = {"changed": False}
commands = list()
warnings = list()
existing_lldp_interfaces_facts = self.get_lldp_interfaces_facts()
commands.extend(self.set_config(existing_lldp_interfaces_facts))
if commands:
if self._module.check_mode:
resp = self._connection.edit_config(commands, commit=False)
else:
resp = self._connection.edit_config(commands)
result["changed"] = True
result["commands"] = commands
if self._module._diff:
result["diff"] = resp["diff"] if result["changed"] else None
changed_lldp_interfaces_facts = self.get_lldp_interfaces_facts()
result["before"] = existing_lldp_interfaces_facts
if result["changed"]:
result["after"] = changed_lldp_interfaces_facts
result["warnings"] = warnings
return result
def set_config(self, existing_lldp_interfaces_facts):
""" Collect the configuration from the args passed to the module,
collect the current configuration (as a dict from facts)
:rtype: A list
:returns: the commands necessary to migrate the current configuration
to the desired configuration
"""
want = self._module.params["config"]
have = existing_lldp_interfaces_facts
resp = self.set_state(want, have)
return to_list(resp)
def set_state(self, want, have):
""" Select the appropriate function based on the state provided
:param want: the desired configuration as a dictionary
:param have: the current configuration as a dictionary
:rtype: A list
:returns: the commands necessary to migrate the current configuration
to the desired configuration
"""
commands = []
state = self._module.params["state"]
if state in ("merged", "replaced", "overridden") and not want:
self._module.fail_json(
msg="value of config parameter must not be empty for state {0}".format(
state
)
)
if state == "overridden":
commands.extend(self._state_overridden(want=want, have=have))
elif state == "deleted":
if want:
for item in want:
name = item["name"]
have_item = search_obj_in_list(name, have)
commands.extend(
self._state_deleted(want=None, have=have_item)
)
else:
for have_item in have:
commands.extend(
self._state_deleted(want=None, have=have_item)
)
else:
for want_item in want:
name = want_item["name"]
have_item = search_obj_in_list(name, have)
if state == "merged":
commands.extend(
self._state_merged(want=want_item, have=have_item)
)
else:
commands.extend(
self._state_replaced(want=want_item, have=have_item)
)
return commands
def _state_replaced(self, want, have):
""" The command generator when state is replaced
:rtype: A list
:returns: the commands necessary to migrate the current configuration
to the desired configuration
"""
commands = []
if have:
commands.extend(self._state_deleted(want, have))
commands.extend(self._state_merged(want, have))
return commands
def _state_overridden(self, want, have):
""" The command generator when state is overridden
:rtype: A list
:returns: the commands necessary to migrate the current configuration
to the desired configuration
"""
commands = []
for have_item in have:
lldp_name = have_item["name"]
lldp_in_want = search_obj_in_list(lldp_name, want)
if not lldp_in_want:
commands.append(
self._compute_command(have_item["name"], remove=True)
)
for want_item in want:
name = want_item["name"]
lldp_in_have = search_obj_in_list(name, have)
commands.extend(self._state_replaced(want_item, lldp_in_have))
return commands
def _state_merged(self, want, have):
""" The command generator when state is merged
:rtype: A list
:returns: the commands necessary to merge the provided into
the current configuration
"""
commands = []
if have:
commands.extend(self._render_updates(want, have))
else:
commands.extend(self._render_set_commands(want))
return commands
def _state_deleted(self, want, have):
""" The command generator when state is deleted
:rtype: A list
:returns: the commands necessary to remove the current configuration
of the provided objects
"""
commands = []
if want:
params = Lldp_interfaces.params
for attrib in params:
if attrib == "location":
commands.extend(
self._update_location(have["name"], want, have)
)
elif have:
commands.append(self._compute_command(have["name"], remove=True))
return commands
def _render_updates(self, want, have):
commands = []
lldp_name = have["name"]
commands.extend(self._configure_status(lldp_name, want, have))
commands.extend(self._add_location(lldp_name, want, have))
return commands
def _render_set_commands(self, want):
commands = []
have = {}
lldp_name = want["name"]
params = Lldp_interfaces.params
commands.extend(self._add_location(lldp_name, want, have))
for attrib in params:
value = want[attrib]
if value:
if attrib == "location":
commands.extend(self._add_location(lldp_name, want, have))
elif attrib == "enable":
if not value:
commands.append(
self._compute_command(lldp_name, value="disable")
)
else:
commands.append(self._compute_command(lldp_name))
return commands
def _configure_status(self, name, want_item, have_item):
commands = []
if is_dict_element_present(have_item, "enable"):
temp_have_item = False
else:
temp_have_item = True
if want_item["enable"] != temp_have_item:
if want_item["enable"]:
commands.append(
self._compute_command(name, value="disable", remove=True)
)
else:
commands.append(self._compute_command(name, value="disable"))
return commands
def _add_location(self, name, want_item, have_item):
commands = []
have_dict = {}
have_ca = {}
set_cmd = name + " location "
want_location_type = want_item.get("location") or {}
have_location_type = have_item.get("location") or {}
if want_location_type["coordinate_based"]:
want_dict = want_location_type.get("coordinate_based") or {}
if is_dict_element_present(have_location_type, "coordinate_based"):
have_dict = have_location_type.get("coordinate_based") or {}
location_type = "coordinate-based"
updates = dict_diff(have_dict, want_dict)
for key, value in iteritems(updates):
if value:
commands.append(
self._compute_command(
set_cmd + location_type, key, str(value)
)
)
elif want_location_type["civic_based"]:
location_type = "civic-based"
want_dict = want_location_type.get("civic_based") or {}
want_ca = want_dict.get("ca_info") or []
if is_dict_element_present(have_location_type, "civic_based"):
have_dict = have_location_type.get("civic_based") or {}
have_ca = have_dict.get("ca_info") or []
if want_dict["country_code"] != have_dict["country_code"]:
commands.append(
self._compute_command(
set_cmd + location_type,
"country-code",
str(want_dict["country_code"]),
)
)
else:
commands.append(
self._compute_command(
set_cmd + location_type,
"country-code",
str(want_dict["country_code"]),
)
)
commands.extend(self._add_civic_address(name, want_ca, have_ca))
elif want_location_type["elin"]:
location_type = "elin"
if is_dict_element_present(have_location_type, "elin"):
if want_location_type.get("elin") != have_location_type.get(
"elin"
):
commands.append(
self._compute_command(
set_cmd + location_type,
value=str(want_location_type["elin"]),
)
)
else:
commands.append(
self._compute_command(
set_cmd + location_type,
value=str(want_location_type["elin"]),
)
)
return commands
def _update_location(self, name, want_item, have_item):
commands = []
del_cmd = name + " location"
want_location_type = want_item.get("location") or {}
have_location_type = have_item.get("location") or {}
if want_location_type["coordinate_based"]:
want_dict = want_location_type.get("coordinate_based") or {}
if is_dict_element_present(have_location_type, "coordinate_based"):
have_dict = have_location_type.get("coordinate_based") or {}
location_type = "coordinate-based"
for key, value in iteritems(have_dict):
only_in_have = key_value_in_dict(key, value, want_dict)
if not only_in_have:
commands.append(
self._compute_command(
del_cmd + location_type, key, str(value), True
)
)
else:
commands.append(self._compute_command(del_cmd, remove=True))
elif want_location_type["civic_based"]:
want_dict = want_location_type.get("civic_based") or {}
want_ca = want_dict.get("ca_info") or []
if is_dict_element_present(have_location_type, "civic_based"):
have_dict = have_location_type.get("civic_based") or {}
have_ca = have_dict.get("ca_info")
commands.extend(
self._update_civic_address(name, want_ca, have_ca)
)
else:
commands.append(self._compute_command(del_cmd, remove=True))
else:
if is_dict_element_present(have_location_type, "elin"):
if want_location_type.get("elin") != have_location_type.get(
"elin"
):
commands.append(
self._compute_command(del_cmd, remove=True)
)
else:
commands.append(self._compute_command(del_cmd, remove=True))
return commands
def _add_civic_address(self, name, want, have):
commands = []
for item in want:
ca_type = item["ca_type"]
ca_value = item["ca_value"]
obj_in_have = search_dict_tv_in_list(
ca_type, ca_value, have, "ca_type", "ca_value"
)
if not obj_in_have:
commands.append(
self._compute_command(
key=name + " location civic-based ca-type",
attrib=str(ca_type) + " ca-value",
value=ca_value,
)
)
return commands
def _update_civic_address(self, name, want, have):
commands = []
for item in have:
ca_type = item["ca_type"]
ca_value = item["ca_value"]
in_want = search_dict_tv_in_list(
ca_type, ca_value, want, "ca_type", "ca_value"
)
if not in_want:
commands.append(
self._compute_command(
name,
"location civic-based ca-type",
str(ca_type),
remove=True,
)
)
return commands
def _compute_command(self, key, attrib=None, value=None, remove=False):
if remove:
cmd = "delete service lldp interface "
else:
cmd = "set service lldp interface "
cmd += key
if attrib:
cmd += " " + attrib
if value:
cmd += " '" + value + "'"
return cmd
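To make the shapes of the generated CLI lines concrete, here is a standalone copy of `_compute_command` written as a free function (for illustration only; the class method above is the authoritative version):

```python
def compute_command(key, attrib=None, value=None, remove=False):
    # Build one vyos 'set' or 'delete' line for the lldp interface
    # subtree, mirroring Lldp_interfaces._compute_command
    cmd = "delete service lldp interface " if remove else "set service lldp interface "
    cmd += key
    if attrib:
        cmd += " " + attrib
    if value:
        cmd += " '" + value + "'"
    return cmd

print(compute_command("eth1"))
# -> set service lldp interface eth1
print(compute_command("eth1", "location elin", "0000000911"))
# -> set service lldp interface eth1 location elin '0000000911'
print(compute_command("eth2", remove=True))
# -> delete service lldp interface eth2
```

Every command the state handlers emit is ultimately one of these single-line `set`/`delete` strings, which is what lets `execute_module` hand the whole list to `edit_config` unchanged.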
|
It is true: the kitchen is the hub of the home and deserves that something special that makes it look so. Here at Ceramo we can offer you a huge range of kitchen splashback options that will do just that. Whether it is a glass splashback look or a classic subway tile, we have it all. Contact us now to see any of the following types of kitchen splashback options.
|
##############################################################################
# Copyright (c) 2017-2018, The VOTCA Development Team (http://www.votca.org)
#
# This file is part of Spack.
# Created by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://github.com/spack/spack
# Please also see the NOTICE and LICENSE files for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU Lesser General Public License (as
# published by the Free Software Foundation) version 2.1, February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from spack import *
class VotcaTools(CMakePackage):
"""Versatile Object-oriented Toolkit for Coarse-graining
Applications (VOTCA) is a package intended to reduce the amount of
routine work when doing systematic coarse-graining of various
systems. The core is written in C++.
This package contains the basic tools library of VOTCA.
"""
homepage = "http://www.votca.org"
url = "https://github.com/votca/tools/tarball/v1.4"
git = "https://github.com/votca/tools.git"
version('develop', branch='master')
    version('1.4.1', '3176b72f8a41ec053cc740a5398e7dc4')
    version('1.4', 'cd47868e9f28e2c7b9d01f95aa0185ca')
depends_on("cmake@2.8:", type='build')
depends_on("expat")
depends_on("fftw")
depends_on("gsl", when="@:1.4.9999")
depends_on("eigen@3.3:", when="@1.5:")
depends_on("boost")
depends_on("sqlite")
def cmake_args(self):
args = [
'-DWITH_RC_FILES=OFF'
]
return args
|
Making sure children feel safe and cared for. If you are worried about the welfare of a child - report it!
Free school meals and the council's catering services provided by MetroFresh.
Find a school, admissions, term dates, attendance, governors, travel, exclusions, educational trips and information for newly qualified teachers.
Applying for a secondary school place for September 2019.
Egress Switch is an email system we use to send and receive emails securely.
Sexually transmitted infections (STIs), emergency contraception and contraceptive advice.
Local people providing friendship, respite and activities for adults who need support due to disability, age or illness.
Our skilled sign makers can make all kinds of signs from road traffic to safety signs and corporate branding.
Report offensive smells and odours to us.
Determine whether or not your house is in a smoke control area.
If you are having difficulties with undertaking everyday tasks you may be able to get help with care and support.
Find out more about local leisure centres, sport facilities, getting outdoors, losing weight and more.
Start Well Family Centres provide activities and services for children under five and their families.
Health services that schools can access to help and support pupils.
Starting Point Plus can refer you for help with a wide range of services for older people across Wigan borough.
Produced annually, this gives clear information about the council's finances.
The council's cleansing team provide a regular programme of street and pavement cleaning.
Tell us about a fault with a street light or enquire about a street lighting issue.
Apply for a new address, what's the process for getting an address, postcodes and street name plates.
Send an invoice, check if we received it, change your bank details and find out about our payment terms.
For 6 weeks we can offer care and support to help you keep living in your own home.
How to get help if you care for a relative or friend with a disability or illness.
Community services, dementia, support for carers and help for those with sight or hearing loss.
Services to help young people with a range of things, including health, education and employment.
What supported living is, housing options, benefits and managing your money, transition from child to adult and more.
|
import copy
book1 = {'isbn': 1234,
'title': 'The book',
'authors': ['Bob Author', 'Helio Author'],
'pages': 500,
'format': 'Slippery back',
'publisher': 'Crazy dude publishing',
'publication_date': '1820 01 02',
'description': 'a book',
'thumbnail': 'a thumbnail'}
book2 = {'isbn': 1235,
'title': 'Great book',
'authors': ['Jane Author'],
'pages': 123,
'room_id': 2,
'format': 'Sturdy thing',
'publisher': 'Sane gal publishing',
'publication_date': '2016 12 31',
'description': 'Another book',
'thumbnail': 'another thumbnail'}
book3 = {'isbn': 1236,
'title': 'Great Songs',
'authors': ['Jane Author'],
'pages': 100,
'format': 'Sturdy thing',
'publisher': 'Sane gal publishing',
'publication_date': '2000 01 01',
'description':
'A very nice book about songs! All the best artists',
'thumbnail': 'another thumbnail'}
book4 = {'isbn': 1237,
'title': 'Great Poems',
'authors': ['Jane Author'],
'pages': 3,
'format': 'Sturdy thing',
'publisher': 'Sane gal publishing',
'publication_date': '1999 12 31',
'description':
'A very nice book about poems! All the best poets',
'thumbnail': 'another thumbnail'}
def get_book(book, book_id=1, room_id=1):
    # Note: copy.copy is a shallow copy, so mutable values such as the
    # 'authors' list remain shared with the original book dict
    temp_book = copy.copy(book)
temp_book['book_id'] = book_id
temp_book['room_id'] = room_id
temp_book['loaned'] = False
return temp_book
def get_descriptor(book, num_copies=1):
temp_book = copy.copy(book)
temp_book['number_of_copies'] = num_copies
return temp_book
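Because `get_book` and `get_descriptor` use `copy.copy`, the copies share any mutable values with the original fixture. A minimal sketch of the shallow-versus-deep difference (standalone dict, not the fixtures above):

```python
import copy

book = {'isbn': 1234, 'authors': ['Bob Author']}

shallow = copy.copy(book)       # top-level keys duplicated, values shared
deep = copy.deepcopy(book)      # nested structures duplicated too

shallow['authors'].append('New Author')   # mutates the shared list
print(book['authors'])  # -> ['Bob Author', 'New Author']

deep['authors'].append('Other Author')    # only touches the deep copy
print(len(book['authors']))  # -> 2, unchanged by the deep copy's append
```

If tests mutate the `authors` list of a copied book, switching to `copy.deepcopy` keeps the fixtures independent between tests.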
|
This memo outlines internal audit's observations related to a location's FSC cash applications process. It discusses process map corrections, design effectiveness observations, and operating effectiveness observations.
Recommendations focus on systems and applications usage, the company's cash receipts sub-process, and system access. Design effectiveness observations relate to downloading flat files from the bank's website and un-posting payments.
|
import uuid
from aiohttp import web
from rororo import openapi_context, OperationTableDef
from rororo.openapi.exceptions import ObjectDoesNotExist
from .data import ENVIRONMENT_VARS, GITHUB_REPOSITORIES
from .decorators import login_required
operations = OperationTableDef()
@operations.register
@login_required
async def create_repository(request: web.Request) -> web.Response:
with openapi_context(request) as context:
return web.json_response(
{
**context.data,
"uid": str(uuid.uuid4()),
"jobs": ["test", "deploy"],
"status": "cloning",
},
status=201,
)
@operations.register
async def list_all_references(request: web.Request) -> web.Response:
return web.json_response({"default_env": {"CI": "1", "HOBOTNICA": "1"}})
@operations.register
@login_required
async def list_favorites_repositories(request: web.Request) -> web.Response:
with openapi_context(request) as context:
return web.json_response(
status=204, headers={"X-Order": context.parameters.query["order"]}
)
@operations.register
@login_required
async def list_owner_repositories(request: web.Request) -> web.Response:
with openapi_context(request) as context:
username = context.security["basic"].login
return web.json_response(
list((GITHUB_REPOSITORIES.get(username) or {}).values())
)
@operations.register
@login_required
async def list_repositories(request: web.Request) -> web.Response:
with openapi_context(request) as context:
username = context.parameters.header["X-GitHub-Username"]
return web.json_response(
list((GITHUB_REPOSITORIES.get(username) or {}).values())
)
@operations.register
@login_required
async def retrieve_owner_env(request: web.Request) -> web.Response:
with openapi_context(request) as context:
owner = context.parameters.path["owner"]
return web.json_response(ENVIRONMENT_VARS.get(owner) or {})
@operations.register
@login_required
async def retrieve_repository(request: web.Request) -> web.Response:
with openapi_context(request) as context:
owner = context.parameters.path["owner"]
repository = (GITHUB_REPOSITORIES.get(owner) or {}).get(
context.parameters.path["name"]
)
if not repository:
raise ObjectDoesNotExist("Repository")
return web.json_response(repository)
@operations.register
@login_required
async def retrieve_repository_env(request: web.Request) -> web.Response:
with openapi_context(request) as context:
owner = context.parameters.path["owner"]
name = context.parameters.path["name"]
env_key = f"{owner}/{name}"
return web.json_response(ENVIRONMENT_VARS.get(env_key) or {})
|
The East African Community has begun phasing-out imports of second-hand clothing to promote the development of the domestic garment sector. Using trade data and information obtained from the exporters, this study produces the first estimate of disaggregated imports of second-hand clothing in Tanzania. The net import of used clothing is estimated at over 540 million pieces per year, compared to a domestic production of new clothing of 20 million pieces and import of 177 million pieces of new clothing. This study assesses the short-term impact of the phase-out on the domestic garment sector. Depending on the substitutability between new and used clothing, the phase-out could prompt increased import of new clothing. It could also prompt employment losses and generate costs for the poorest consumers. In the longer term, the phase-out is unlikely to promote the development of the garment sector unless the existing constraints are properly addressed.
The phase-out of second-hand clothing imports: what impact for Tanzania?
|
import json
import logging
import matplotlib.pyplot as plt
import numpy as np
import scipy.cluster.hierarchy as hac
log = logging.getLogger(__name__)
def analyze(data):
# Convert this to python data for us to be able to run ML algorithms
json_to_python = json.loads(data)
per_size = dict() # IP-Response size
hostlist = dict()
# Data pre-processing here:
for y in json_to_python:
hostlist[y['HOST']] = 1
if y['HOST'] in per_size:
per_size[y['HOST']].append(int(y['SIZE']))
else:
per_size[y['HOST']] = [int(y['SIZE'])]
##Data pre-processing ends here
    log.debug(
        "*** Printing Input to analysis - 4 (1): hierarchical clustering on IP and average response size ****"
    )
#####*****SIZE******####
#### Analysis #4 (1): IP address - Size of response received feature
    # Note: X is seeded with a placeholder [0.0, 0.0] row that is never
    # removed, so the origin shows up as a spurious point in the clustering
    X = np.array([[0.00, 0.00]])
for x in hostlist:
avg_size = mean(per_size[x])
log.debug(x + ": " + str(avg_size))
        # Zero-pad each IPv4 octet to three digits and concatenate,
        # e.g. "10.0.2.15" -> "010000002015"
        ip = "".join(octet.zfill(3) for octet in x.split("."))
# log.debug( str(float(float(ip)/1000)) + ": " + str(avg_size))
le = [float(float(ip) / 1000), avg_size]
X = np.vstack([X, le])
log.info(
"******** Printing Analysis #4: IP-Address and Response Size received: Centroid and Median Hierarchical Clustering ********\nCheck 'test-centroid-median.png' for more info!"
)
### Analysis 4 (9): ###### CENTROID AND MEDIAN HAC*****#########
fig, axes23 = plt.subplots(2, 3)
for method, axes in zip(['centroid', 'median'], axes23):
z = hac.linkage(X, method=method)
# Plotting
axes[0].plot(range(1, len(z) + 1), z[::-1, 2])
knee = np.diff(z[::-1, 2], 2)
axes[0].plot(range(2, len(z)), knee)
num_clust1 = knee.argmax() + 2
knee[knee.argmax()] = 0
num_clust2 = knee.argmax() + 2
axes[0].text(num_clust1, z[::-1, 2][num_clust1 - 1],
'possible\n<- knee point')
part1 = hac.fcluster(z, num_clust1, 'maxclust')
part2 = hac.fcluster(z, num_clust2, 'maxclust')
clr = [
'#2200CC', '#D9007E', '#FF6600', '#FFCC00', '#ACE600', '#0099CC',
'#8900CC', '#FF0000', '#FF9900', '#FFFF00', '#00CC01', '#0055CC'
]
for part, ax in zip([part1, part2], axes[1:]):
for cluster in set(part):
ax.scatter(
X[part == cluster, 0],
X[part == cluster, 1],
color=clr[cluster % 10])
m = '\n(method: {})'.format(method)
plt.setp(
axes[0],
title='Screeplot{}'.format(m),
xlabel='partition',
ylabel='{}\ncluster distance'.format(m))
plt.setp(axes[1], title='{} Clusters'.format(num_clust1))
plt.setp(axes[2], title='{} Clusters'.format(num_clust2))
plt.tight_layout()
##plt.show()
plt.savefig('test-centroid-median.png')
def mean(numbers):
return float(sum(numbers)) / max(len(numbers), 1)
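The octet-padding in `analyze` turns an IPv4 address into a fixed-width numeric feature before stacking it into `X`. The same encoding can be written compactly with `str.zfill` (a sketch equivalent to the loop, with a hypothetical helper name):

```python
def ip_to_feature(ip):
    # Pad each octet to three digits, concatenate, then divide by 1000,
    # exactly as the encoding step in analyze() does
    packed = "".join(octet.zfill(3) for octet in ip.split("."))
    return float(packed) / 1000

# "10.0.2.15" packs to "010000002015" before scaling
print(ip_to_feature("10.0.2.15"))
```

Fixed-width packing keeps the lexicographic order of addresses within each octet, but note that it treats the IP as one large number, so distances between these features do not reflect network topology.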
|
The Batavia Blue Devils will need to take down Joe Girard and Glens Falls for the Class B state championship.
The hype building for the NYSPHSAA Class B state championship game is palpable. On one end you have the No. 1 ranked team in the state, the Batavia Blue Devils who sit at an impressive 12-0.
On the other, you have the Glens Falls Indians, who won a state championship in 2016. Both are led by spectacular superstars who have played their best ball in the biggest moments.
Blue Devils running back Ray Leach broke multiple state tournament records with 450 all-purpose yards and eight touchdowns in the Far West Regional. He shattered and tied his own record with 494 all-purpose yards and eight more touchdowns in the semifinal against Skaneateles.
The Indians are led by Syracuse basketball commit and New York's all-time leading scorer Joe Girard. He started on the Indians' 2016 state championship team. In Glens Falls' 48-28 semifinal win, the future Orangeman threw for more than 300 yards and scored four times, including a defensive touchdown.
The Leach vs. Girard matchup is a huge storyline, but neither team is a one-man show and there is plenty of talent on both sides.
Here are four keys for Batavia in the state title game, which will kick off at noon Saturday at the Carrier Dome. Admission is $10.
One thing Batavia should have going for it is the fact the Blue Devils just faced another athletic quarterback who will play Division I in another sport. Yale lacrosse commit Patrick Hackler torched Batavia's defense with 402 yards of offense, including 147 on the ground. Girard poses a similar threat with 379 rushing yards this season and 1,424 for his career.
The Blue Devils know how dangerous an athlete like Girard can be if he gets in space. It may be wise for Batavia to use Leach or linebacker Alex Rood as a spy at all times. Girard is more than capable of making throws from inside the pocket, but if Batavia can limit his running lanes it should give its pass rush more time to get home.
Can Leach run for 500 or more yards? At this point, anything is possible for the senior who has rushed for 366, 417, and 474 yards in the last three weeks alone. The Blue Devils needed every bit of his 474 against Skaneateles as the Lakers were one of the few teams to get the better of the Batavia defense. The Indians are averaging 41 points a game and the best defense against this may be to let Leach eat up time of possession.
In an early-season loss to Class A Burnt Hills, the Indians allowed two running backs to produce nearly 300 yards on the ground. With Batavia's offensive line and Leach's talent, it isn't hard to fathom another epic performance on the ground.
With two high-powered offenses dueling, there could be a lot of points scored. That means the game could come down to who has the ball last or who is able to win the turnover battle. Batavia didn't turn the ball over against Skaneateles and two key interceptions from Leach and Andrew Francis were crucial in the Blue Devils reaching the Carrier Dome.
Francis' interception came on the first play of the second and allowed Batavia to build a two-score lead which the Lakers couldn't overcome and Leach broke up what could've been a score. Girard is completing 63 percent of his attempts this season, so it won't be easy to force him into mistakes.
At this point, the Indians will be gearing up everything they can to be the first team to stop Leach. With that in mind, Batavia should have ample opportunities to take advantage of an overly aggressive defense. Ethan Biscaro hasn't needed to throw much in the past month or so, but the state championship would be an ideal time to let the signal caller try to pick apart the Indians' defense.
Biscaro is also a rushing threat, evidenced by his 30-yard scramble against Skaneateles. Read options and run-pass-options could generate big plays for Batavia.
|
# encoding: utf-8
"""
Implementation of the NeedUpdate plugin.
"""
import ckan.plugins as plugins
import ckan.plugins.toolkit as toolkit
import logging
from controller import NeedupdateController
# logs
logger = logging.getLogger(__name__)
def get_plugins_list():
"""
    Return the list of plugins installed on the platform.
Args:
- None.
Returns:
- list()
"""
c = NeedupdateController()
return c.get_list_of_repos()
class NeedupdatePlugin(plugins.SingletonPlugin):
plugins.implements(plugins.IConfigurer)
plugins.implements(plugins.interfaces.IRoutes, inherit=True)
plugins.implements(plugins.IResourceView, inherit=True)
plugins.implements(plugins.ITemplateHelpers)
def info(self):
return {'name': 'NeedUpdate',
'title': 'NU',
'icon': 'file-text',
'default_title': 'NU',
}
def update_config(self, config_):
toolkit.add_ckan_admin_tab(config_, 'ext_status_dashboard', 'My Plugins')
toolkit.add_template_directory(config_, 'templates')
toolkit.add_public_directory(config_, 'public')
toolkit.add_resource('fanstatic', 'needupdate')
def before_map(self, m):
return m
def after_map(self, m):
m.connect('ext_status_api',
'/ext_status.json',
controller='ckanext.needupdate.plugin:NeedupdateController',
action='ext_status')
m.connect('ext_status_dashboard',
'/my_extensions',
controller='ckanext.needupdate.plugin:NeedupdateController',
action='dashboard_ext')
return m
def get_helpers(self):
"""
Register the "needupdate_get_plugins_list()" template helper.
Returns:
- dict. Maps the helper name to its function.
"""
# Template helper function names should begin with the name of the
# extension they belong to, to avoid clashing with functions from
# other extensions.
return {'needupdate_get_plugins_list': get_plugins_list}
|
Thousands of motorists are choosing to pay tolls to get ahead on the Washington region’s traffic-choked highways.
On Interstates 495 and 95, where more than 45 miles of toll lanes have opened within the past six years, drivers choose when to take the express lanes, with only a small share — 5 percent in the D.C. area — using the lanes every day, according to a new survey.
The majority of frequent Express Lanes users, defined as those who choose the lanes at least once a week, are more likely to use them for commuting to work and are willing to pay when they need to get to an important meeting and traffic is bad in the regular lanes, according to the survey commissioned by Transurban, the company that operates the 495 and 95 Express Lanes.
More than half the Washington-area drivers surveyed — from a sample of 1,732 motorists — said they’ve used the 95 and 495 Express Lanes. The most popular reason? They view them as a way to beat traffic and save time.
“These are our neighbors. They are busy professionals who have kids and are trying to get to work on time,” said Elisa Bell, marketing director at Transurban North America.
Occasional users, defined as those who drive the lanes only about once a month, are more likely to use them for specific travel such as a family road trip down the I-95 corridor or to get to the airport, according to the 2018 Transurban State of the Lanes report.
Despite HOT lanes' disparaging nickname of “Lexus lanes,” most 495 and 95 express users are not affluent, according to the survey, and many of them work for employers who subsidize their commutes. About 60 percent of the frequent users said they have household incomes of less than $100,000, and a similar share have a bachelor’s degree or higher. About one-third of those users said they don’t mind the tolls because their employers pick up the bill, according to the survey.
More than half the lane users said they live in Virginia, while 31 percent are Maryland residents, and 16 percent are D.C. residents, according to the data.
The typical user is younger than 45 (82 percent), has young children and relies on home services such as grocery delivery and house cleaning. They are loyal Amazon customers who get a package from the online retailer at least once a month.
“They don’t mind paying a fee for convenience services and similarly don’t mind paying for tolls,” Bell said.
An average of 45,000 daily trips are made on the 495 Express Lanes, where tolls average $5.40. The 95 Express Lanes have an average of 51,000 daily trips, with tolls averaging $8.45, according to Transurban. Tolls have topped $30 for the 495 lanes, which are used more for commuting.
Drivers who have an E-ZPass Flex and drive with at least two passengers can use the lanes free. The lanes have a variable tolling system, meaning rates change based on volume to keep traffic flowing at a target speed of 65 mph.
There is no cap on the tolls. The highest toll recorded for the 495 lanes last year was $32.30; the highest for the 95 express lanes was $46.25. In the case of the 495 lanes, a crash in the lanes between Interstate 66 and Route 7 on June 8, 2017, caused delays; fewer than 30 people paid the highest toll, Transurban said. The 95 toll hit its high point on Nov. 15, 2017, when a crash in the lanes in the Lorton area caused a backup. Fewer than 10 motorists paid $46.25.
The two express lane systems are part of a growing network of toll lanes in Northern Virginia that is expected to grow to 90 miles by 2022. The newest entrant, the 66 Express Lanes, opened in December, with 10 miles of rush-hour, peak direction toll lanes that have yielded some of the highest tolls in the country — $47 one way. That system is directly operated by the state.
The 495 Express Lanes, stretching 14 miles, were the first high-occupancy toll lanes in the region when they opened in 2012. The state is studying an expansion to the American Legion Bridge, which will add three more miles to Virginia’s system and connect to a proposed system across the Potomac River into Maryland.
The 95 Express Lanes opened four years ago, spanning 29 miles from just north of Garrisonville Road in Stafford to the vicinity of Edsall Road on Interstate 395 in Fairfax County. An expansion in Stafford added two miles last year. And work is underway on a 10-mile extension from Garrisonville Road (Route 610) in Stafford to Route 17 in the Fredericksburg area, a project expected to be completed in 2022.
Before that, Virginia is slated to add another eight miles of toll lanes on I-395. The state is converting the HOV lanes into toll lanes as part of a $480-million project slated to open next year. The 395 Express Lanes will essentially amount to an extension of the 95 Express Lanes.
The state is also building another 22 miles of toll lanes on I-66 outside the Capital Beltway. With that addition, the 395 Express Lanes and the Fredericksburg extension, these projects will deliver the next major milestone in the state’s vision to create a network of more than 90 miles of HOT lanes in Northern Virginia by 2022.
|
# -*- coding: utf-8 -*-
# Form implementation generated from reading ui file 'layersbyfielddialogbase.ui'
#
# Created: Fri Mar 28 11:23:31 2014
# by: PyQt4 UI code generator 4.8.3
#
# WARNING! All changes made in this file will be lost!
from PyQt4 import QtCore, QtGui
try:
_fromUtf8 = QtCore.QString.fromUtf8
except AttributeError:
_fromUtf8 = lambda s: s
class Ui_LayersByFieldDialog(object):
def setupUi(self, LayersByFieldDialog):
LayersByFieldDialog.setObjectName(_fromUtf8("LayersByFieldDialog"))
LayersByFieldDialog.resize(307, 232)
LayersByFieldDialog.setLocale(QtCore.QLocale(QtCore.QLocale.English, QtCore.QLocale.UnitedKingdom))
self.verticalLayout = QtGui.QVBoxLayout(LayersByFieldDialog)
self.verticalLayout.setObjectName(_fromUtf8("verticalLayout"))
self.label = QtGui.QLabel(LayersByFieldDialog)
self.label.setLocale(QtCore.QLocale(QtCore.QLocale.English, QtCore.QLocale.UnitedKingdom))
self.label.setObjectName(_fromUtf8("label"))
self.verticalLayout.addWidget(self.label)
self.inputLayerCombo = QtGui.QComboBox(LayersByFieldDialog)
self.inputLayerCombo.setObjectName(_fromUtf8("inputLayerCombo"))
self.verticalLayout.addWidget(self.inputLayerCombo)
self.label_2 = QtGui.QLabel(LayersByFieldDialog)
self.label_2.setLocale(QtCore.QLocale(QtCore.QLocale.English, QtCore.QLocale.UnitedKingdom))
self.label_2.setObjectName(_fromUtf8("label_2"))
self.verticalLayout.addWidget(self.label_2)
self.splitFieldCombo = QtGui.QComboBox(LayersByFieldDialog)
self.splitFieldCombo.setObjectName(_fromUtf8("splitFieldCombo"))
self.verticalLayout.addWidget(self.splitFieldCombo)
self.label_3 = QtGui.QLabel(LayersByFieldDialog)
self.label_3.setEnabled(False)
self.label_3.setLocale(QtCore.QLocale(QtCore.QLocale.English, QtCore.QLocale.UnitedKingdom))
self.label_3.setObjectName(_fromUtf8("label_3"))
self.label_3.setVisible(False)
self.verticalLayout.addWidget(self.label_3)
self.progressBar = QtGui.QProgressBar(LayersByFieldDialog)
self.progressBar.setLocale(QtCore.QLocale(QtCore.QLocale.English, QtCore.QLocale.UnitedKingdom))
self.progressBar.setProperty(_fromUtf8("value"), 0)
self.progressBar.setObjectName(_fromUtf8("progressBar"))
self.verticalLayout.addWidget(self.progressBar)
self.buttonBox = QtGui.QDialogButtonBox(LayersByFieldDialog)
self.buttonBox.setLocale(QtCore.QLocale(QtCore.QLocale.English, QtCore.QLocale.UnitedKingdom))
self.buttonBox.setOrientation(QtCore.Qt.Horizontal)
self.buttonBox.setStandardButtons(QtGui.QDialogButtonBox.Close|QtGui.QDialogButtonBox.Ok)
self.buttonBox.setObjectName(_fromUtf8("buttonBox"))
self.verticalLayout.addWidget(self.buttonBox)
self.retranslateUi(LayersByFieldDialog)
QtCore.QObject.connect(self.buttonBox, QtCore.SIGNAL(_fromUtf8("accepted()")), LayersByFieldDialog.accept)
QtCore.QObject.connect(self.buttonBox, QtCore.SIGNAL(_fromUtf8("rejected()")), LayersByFieldDialog.reject)
QtCore.QMetaObject.connectSlotsByName(LayersByFieldDialog)
def retranslateUi(self, LayersByFieldDialog):
LayersByFieldDialog.setWindowTitle(QtGui.QApplication.translate("LayersByFieldDialog", "Layers from field", None, QtGui.QApplication.UnicodeUTF8))
self.label.setText(QtGui.QApplication.translate("LayersByFieldDialog", "Input layer", None, QtGui.QApplication.UnicodeUTF8))
self.label_2.setText(QtGui.QApplication.translate("LayersByFieldDialog", "Split by field", None, QtGui.QApplication.UnicodeUTF8))
self.label_3.setText(QtGui.QApplication.translate("LayersByFieldDialog", "Save to", None, QtGui.QApplication.UnicodeUTF8))
|
SPEC engineers will walk your plant with you to provide their expertise on your process system optimization to help you achieve operational excellence. SPEC specializes in efficient process engineering and process management by assisting with elimination of loss and waste, as well as saving energy. SPEC will adjust your current operations or build a fully optimized new process in order to minimize cost and maximize throughput. SPEC will design to enhance equipment utilization, yield, product quality, safety, performance, operating procedures, and control optimization.
By optimizing your process, you will save money over time. Your facility will cost less to operate and your equipment will last longer. SPEC engineers can assist in reducing energy consumption, raw material consumption, and inventory levels.
Contact SPEC today to optimize your process.
|
# The Fibonacci sequence starts with 0 and 1. Each subsequent term is obtained by adding the previous two,
# such that the first seven terms are: 0, 1, 1, 2, 3, 5, 8.
# Using recursion, write a function that, given an integer n, returns the nth Fibonacci number.
# For example:
# given n = 0, the function should return 0
# if n = 1, it should return 1
# if n = 2, it should return 1
# if n = 4, it should return 3
# if n = 8, it should return 21
# Be sure to write automated tests for your solution.
# Hint:
# fibonacci(0) = 0
# fibonacci(1) = 1
# fibonacci(n) = fibonacci(n - 1) + fibonacci(n - 2)
def fibonacci(n):
if n == 0 or n == 1:
return n
else:
return fibonacci(n - 2) + fibonacci(n - 1)
def test_fibonacci_equals(arg, expected):
observed = fibonacci(arg)
if observed == expected:
print('Thumbs up.')
else:
print('Thumbs down. Expected %i but got %i' % (expected, observed))
test_fibonacci_equals(0, 0)
test_fibonacci_equals(1, 1)
test_fibonacci_equals(2, 1)
test_fibonacci_equals(3, 2)
test_fibonacci_equals(4, 3)
test_fibonacci_equals(5, 5)
test_fibonacci_equals(6, 8)
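The naive recursion above recomputes the same subproblems exponentially many times. As an illustrative alternative (not part of the original exercise), an iterative version produces the same sequence in linear time and constant space:

```python
def fibonacci_iterative(n):
    # Carry the last two terms forward instead of recursing;
    # after the loop, a holds fibonacci(n).
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a
```

It passes the same checks, e.g. `fibonacci_iterative(8)` returns 21.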
|
Creative Writing: Putting a New Spin on Old Fables | Latin Alive!
Our 6th and 7th grade classes end the year with this creative writing project. They are asked to rewrite a familiar fable. I intentionally choose a story that is very familiar and one which has repetitive lines. Such stories have included The Three Little Pigs, The Little Red Hen, and Goldilocks and the Three Bears. The students have heard each of these stories a hundred times in their childhood and could probably recite them from memory. This allows them to comfortably rewrite the story in their own “Latin” words. I also allow them to put their own creative spin on the story as long as they stay true to the main story line. Once again, however, I begin the project by setting some important parameters that will end up making the project more enjoyable.
I usually break the class up into teams and assign each team a scene in the story. The three stories I mentioned above can all be broken down into smaller pieces. This keeps the project from being too overwhelming, and allows us to easily keep the time to a week or less.
The first rule is K.I.S.S. = Keep It Simple Sweetie; especially when it comes to grammar. I insist they use only certain tenses, moods, and voices where verbs are concerned, or declensions and cases for nouns and adjectives. They can only use the grammar that they have mastered. This seems like a no-brainer, but your over-achievers will try to do something fancy to impress their friends and teacher.
Once the story has been written and edited, the class puts together a final draft complete with illustrations. In some years, time permitting, we have then put on a play for the younger classes. Because the story line is so familiar, the students are able to follow the story even if they do not know all of the words. In fact, they have a really good time listening for words they do know. The familiar repetitive lines also help the audience follow along and learn some fun new Latin. The project has been enjoyed so much by the performers and audiences that it has become a year-end tradition at Grace Academy.
|
"""
Django settings for project_template project.
Generated by 'django-admin startproject' using Django 1.9.
For more information on this file, see
https://docs.djangoproject.com/en/1.9/topics/settings/
For the full list of settings and their values, see
https://docs.djangoproject.com/en/1.9/ref/settings/
"""
import os
# Build paths inside the project like this: os.path.join(BASE_DIR, ...)
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
# Quick-start development settings - unsuitable for production
# See https://docs.djangoproject.com/en/1.9/howto/deployment/checklist/
# SECURITY WARNING: keep the secret key used in production secret!
SECRET_KEY = '<SOME-SECRET-KEY>'
ALLOWED_HOSTS = []
# Application definition
INSTALLED_APPS = [
'django.contrib.auth',
'django.contrib.contenttypes',
'django.contrib.sessions',
'django.contrib.messages',
'django.contrib.staticfiles',
]
MIDDLEWARE_CLASSES = [
'django.middleware.security.SecurityMiddleware',
'django.contrib.sessions.middleware.SessionMiddleware',
'django.middleware.common.CommonMiddleware',
'django.middleware.csrf.CsrfViewMiddleware',
'django.contrib.auth.middleware.AuthenticationMiddleware',
'django.contrib.auth.middleware.SessionAuthenticationMiddleware',
'django.contrib.messages.middleware.MessageMiddleware',
'django.middleware.clickjacking.XFrameOptionsMiddleware',
]
ROOT_URLCONF = 'project_template.urls'
TEMPLATES = [
{
'BACKEND': 'django.template.backends.django.DjangoTemplates',
'DIRS': [],
'APP_DIRS': True,
'OPTIONS': {
'context_processors': [
'django.template.context_processors.debug',
'django.template.context_processors.request',
'django.contrib.auth.context_processors.auth',
'django.contrib.messages.context_processors.messages',
],
},
},
]
WSGI_APPLICATION = 'project_template.wsgi.application'
# Password validation
# https://docs.djangoproject.com/en/1.9/ref/settings/#auth-password-validators
AUTH_PASSWORD_VALIDATORS = [
{
'NAME': 'django.contrib.auth.'
'password_validation.UserAttributeSimilarityValidator',
},
{
'NAME': 'django.contrib.auth.'
'password_validation.MinimumLengthValidator',
},
{
'NAME': 'django.contrib.auth.'
'password_validation.CommonPasswordValidator',
},
{
'NAME': 'django.contrib.auth.'
'password_validation.NumericPasswordValidator',
},
]
# Internationalization
# https://docs.djangoproject.com/en/1.9/topics/i18n/
LANGUAGE_CODE = 'en-us'
TIME_ZONE = 'UTC'
USE_I18N = True
USE_L10N = True
USE_TZ = True
# Static files (CSS, JavaScript, Images)
# https://docs.djangoproject.com/en/1.9/howto/static-files/
STATIC_URL = '/static/'
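As a sketch of how the hard-coded key above can be kept out of version control (not part of the generated template; the `DJANGO_SECRET_KEY` variable name is an assumption), the key can be read from the environment with the placeholder as a development-only fallback:

```python
import os

# Prefer an environment-supplied secret key; fall back to the
# insecure development placeholder only when the variable is unset.
SECRET_KEY = os.environ.get('DJANGO_SECRET_KEY', '<SOME-SECRET-KEY>')
```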
|
G8 Properties are pleased to offer this Immaculate 3/4 Bed House on Hampton Road, Ilford. Property comprises: Ground Floor: Through Lounge Reception, Fitted Kitchen, 2nd Reception which can be used as a Dining Room, Family Bathroom and Beautiful Garden. 1st Floor: One Master Bedroom with Fitted Wardrobes, One Double Bedroom with Fitted Wardrobes, One Good-sized Single Bedroom with Fitted Wardrobes and Toilet with Sink. The property is located in a very nice residential area of Ilford, close to local amenities and Schools. For more information or to arrange an appointment to view this property please contact G8 Properties.
All administration/credit reference fees are due at the start of the application process once the offer has been verbally agreed, and will secure the property for you subject to satisfactory references being received on behalf of all applicable tenants.
|
# Copyright 2015 OpenStack Foundation
# Copyright (c) 2015 Hewlett-Packard Development Company, L.P.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import abc
from neutron_lib.api import extensions
from neutron.api.v2 import resource_helper
from networking_l2gw.services.l2gateway.common import constants
RESOURCE_ATTRIBUTE_MAP = {
constants.L2_GATEWAYS_CONNECTION: {
'id': {'allow_post': False, 'allow_put': False,
'is_visible': True},
'l2_gateway_id': {'allow_post': True, 'allow_put': False,
'validate': {'type:string': None},
'is_visible': True, 'default': ''},
'network_id': {'allow_post': True, 'allow_put': False,
'validate': {'type:string': None},
'is_visible': True},
'segmentation_id': {'allow_post': True, 'allow_put': False,
'validate': {'type:string': None},
'is_visible': True, 'default': ''},
'tenant_id': {'allow_post': True, 'allow_put': False,
'validate': {'type:string': None},
'required_by_policy': True,
'is_visible': True}
},
}
class L2gatewayconnection(extensions.ExtensionDescriptor):
"""API extension for Layer-2 Gateway connection support."""
@classmethod
def get_name(cls):
return "L2 Gateway connection"
@classmethod
def get_alias(cls):
return "l2-gateway-connection"
@classmethod
def get_description(cls):
return "Connects Neutron networks with external networks at layer 2."
@classmethod
def get_updated(cls):
return "2014-01-01T00:00:00-00:00"
@classmethod
def get_resources(cls):
"""Returns Ext Resources."""
mem_actions = {}
plural_mappings = resource_helper.build_plural_mappings(
{}, RESOURCE_ATTRIBUTE_MAP)
resources = resource_helper.build_resource_info(plural_mappings,
RESOURCE_ATTRIBUTE_MAP,
constants.L2GW,
action_map=mem_actions,
register_quota=True,
translate_name=True)
return resources
def get_extended_resources(self, version):
if version == "2.0":
return RESOURCE_ATTRIBUTE_MAP
else:
return {}
class L2GatewayConnectionPluginBase(object):
@abc.abstractmethod
def delete_l2_gateway_connection(self, context, l2_gateway_id,
network_mapping_list):
pass
@abc.abstractmethod
def create_l2_gateway_connection(self, context, l2_gateway_id,
network_mapping_list):
pass
@abc.abstractmethod
def get_l2_gateway_connections(self, context, filters=None,
fields=None,
sorts=None, limit=None, marker=None,
page_reverse=False):
pass
@abc.abstractmethod
def get_l2_gateway_connection(self, context, id, fields=None):
pass
|
On a scale from 0-10, how likely are you to recommend Steven Young to a friend?
Dr. Young is honest but understands the need for humanity even in giving bad news.... He has given us options for our procedures and is always patient to explain those options, the risks, and the benefits to us.
He was amazing! He would explain lab results in detail. Dr. Young had a sense of humor and provided a comfortable setting when the process was so stressful, but you would always leave with peace of mind that the path you were on was the right treatment.... Dr. Young would call whenever there was reason to explain any lab result. He was completely understanding and listened, then gave his advice. He was always realistic and optimistic.
|
from geoserver.support import ResourceInfo, bbox, write_bbox, \
write_string, xml_property, url
def _maybe_text(n):
if n is None:
return None
else:
return n.text
def _layer_list(node, element):
if node is not None:
return [_maybe_text(n.find("name")) for n in node.findall(element)]
def _style_list(node):
if node is not None:
return [_maybe_text(n.find("name")) for n in node.findall("style")]
def _write_layers(builder, layers, parent, element, attributes):
builder.start(parent, dict())
for l in layers:
builder.start(element, attributes or dict())
if l is not None:
builder.start("name", dict())
builder.data(l)
builder.end("name")
builder.end(element)
builder.end(parent)
def _write_styles(builder, styles):
builder.start("styles", dict())
for s in styles:
builder.start("style", dict())
if s is not None:
builder.start("name", dict())
builder.data(s)
builder.end("name")
builder.end("style")
builder.end("styles")
class LayerGroup(ResourceInfo):
"""
Represents a layer group in geoserver
"""
resource_type = "layerGroup"
save_method = "PUT"
def __init__(self, catalog, name):
super(LayerGroup, self).__init__()
assert isinstance(name, basestring)
self.catalog = catalog
self.name = name
# the XML format changed in 2.3.x - the element listing all the layers
# and the entries themselves have changed
if self.catalog.gsversion() == "2.2.x":
parent, element, attributes = "layers", "layer", None
else:
parent, element, attributes = "publishables", "published", {'type':'layer'}
self._layer_parent = parent
self._layer_element = element
self._layer_attributes = attributes
self.writers = dict(
name = write_string("name"),
styles = _write_styles,
layers = lambda b,l: _write_layers(b, l, parent, element, attributes),
bounds = write_bbox("bounds")
)
@property
def href(self):
return url(self.catalog.service_url, ["layergroups", self.name + ".xml"])
styles = xml_property("styles", _style_list)
bounds = xml_property("bounds", bbox)
def _layers_getter(self):
if "layers" in self.dirty:
return self.dirty["layers"]
else:
if self.dom is None:
self.fetch()
node = self.dom.find(self._layer_parent)
return _layer_list(node, self._layer_element) if node is not None else None
def _layers_setter(self, value):
self.dirty["layers"] = value
def _layers_delete(self):
self.dirty["layers"] = None
layers = property(_layers_getter, _layers_setter, _layers_delete)
def __str__(self):
return "<LayerGroup %s>" % self.name
__repr__ = __str__
class UnsavedLayerGroup(LayerGroup):
save_method = "POST"
def __init__(self, catalog, name, layers, styles, bounds):
super(UnsavedLayerGroup, self).__init__(catalog, name)
bounds = bounds if bounds is not None else ("-180","180","-90","90","EPSG:4326")
self.dirty.update(name = name, layers = layers, styles = styles, bounds = bounds)
@property
def href(self):
return "%s/layergroups?name=%s" % (self.catalog.service_url, self.name)
|
The bonus episode for Life is Strange: Before the Storm is out now on PS4.
This additional episode, dubbed ‘Farewell’ sees the wonderful drama series reunite players with Chloe and Max years before the latter leaves Arcadia Bay, and sees Ashly Burch and Hannah Telle reprising their roles as the duo after Burch was replaced in Before the Storm by Rihanna DeVries during the voice actors strike.
The episode is available in Before the Storm’s Season Pass. It launched for the PS4 at the same time as other platforms. So none of this faffing about waiting for another breath of that salty Arcadia Bay air.
In addition to that, a new 1GB+ patch for the game, version 1.05, has been deployed. Check out the full Life is Strange: Before the Storm version 1.05 patch notes below.
Life is Strange: Before the Storm is also getting a couple of new physical releases to coincide with Farewell, including a hip vinyl edition.
We love all things Life is Strange here at PlayStation Universe, and Before the Storm is no exception. We were full of praise for the mini-series when it concluded towards the end of 2017.
|
try:
from itertools import izip
except ImportError:
izip = zip
import numpy as np
from .accuracy_cython import compute_rank_violations
def read_analogy_file(filename):
"""
Read the analogy task test set from a file.
"""
section = None
with open(filename, 'r') as questions_file:
for line in questions_file:
if line.startswith(':'):
section = line[2:].replace('\n', '')
continue
else:
words = line.replace('\n', '').split(' ')
yield section, words
def construct_analogy_test_set(test_examples, dictionary, ignore_missing=False):
"""
Construct the analogy test set by mapping the words to their
word vector ids.
Arguments:
- test_examples: iterable of 4-word iterables
- dictionary: a mapping from words to ids
- boolean ignore_missing: if True, words in the test set
that are not in the dictionary
will be dropped.
Returns:
- a N by 4 numpy matrix.
"""
test = []
for example in test_examples:
try:
test.append([dictionary[word] for word in example])
except KeyError:
if ignore_missing:
pass
else:
raise
try:
test = np.array(test, dtype=np.int32)
except ValueError as e:
# This should use raise ... from ... in Python 3.
raise ValueError('Each row of the test set should contain '
'4 integer word ids', e)
return test
def analogy_rank_score(analogies, word_vectors, no_threads=1):
"""
Calculate the analogy rank score for the given set of analogies.
A rank of zero denotes a perfect score; with random word vectors
we would expect a rank of 0.5.
Arguments:
- analogies: a numpy array holding the ids of the words in the analogy tasks,
as constructed by `construct_analogy_test_set`.
- word_vectors: numpy array holding the word vectors to use.
- num_threads: number of parallel threads to use in the calculation.
Returns:
- ranks: a numpy array holding the normalized rank of the target word
in each analogy task. Rank 0 means that the target words was
returned first; rank 1 means it was returned last.
"""
# The mean of the vectors for the
# second, third, and the negative of
# the first word.
input_vectors = (word_vectors[analogies[:, 1]]
+ word_vectors[analogies[:, 2]]
- word_vectors[analogies[:, 0]])
word_vector_norms = np.linalg.norm(word_vectors,
axis=1)
# Pre-allocate the array storing the rank violations
rank_violations = np.zeros(input_vectors.shape[0], dtype=np.int32)
compute_rank_violations(word_vectors,
word_vector_norms,
input_vectors,
analogies[:, 3],
analogies,
rank_violations,
no_threads)
return rank_violations / float(word_vectors.shape[0])
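For reference, the rank computation delegated to the Cython kernel above can be sketched in pure NumPy (an illustrative reimplementation under the same interface, not the module's actual code): for each analogy it counts how many vocabulary words score strictly higher than the target under the norm-scaled dot product.

```python
import numpy as np

def rank_violations_numpy(word_vectors, analogies):
    # Combine each analogy a : b :: c : d into the query vector b + c - a.
    inp = (word_vectors[analogies[:, 1]]
           + word_vectors[analogies[:, 2]]
           - word_vectors[analogies[:, 0]])
    norms = np.linalg.norm(word_vectors, axis=1)
    # Norm-scaled dot product of every query against every vocabulary word.
    sims = inp.dot(word_vectors.T) / norms
    # Similarity of each query to its own target word d.
    target = sims[np.arange(len(analogies)), analogies[:, 3]]
    # A violation is a vocabulary word ranked strictly above the target.
    violations = (sims > target[:, None]).sum(axis=1)
    return violations / float(word_vectors.shape[0])
```

With word vectors in which the analogy holds exactly, the normalized rank is 0, matching the "perfect score" described in the docstring.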
|
Hilary Whitehall Putnam (July 31, 1926 – March 13, 2016) was an American philosopher, mathematician, and computer scientist. He was a central figure in analytic philosophy from the 1960s. He worked in philosophy of mind, philosophy of language, philosophy of mathematics, and philosophy of science. Until his death, Putnam was Cogan University Professor Emeritus at Harvard University.
Putnam was born in Chicago, Illinois. He studied at Harvard University, at University of Pennsylvania and at the University of California, Los Angeles. Putnam died on March 13, 2016 from mesothelioma at his home in Boston, Massachusetts. He was aged 89.
↑ "Boston Globe Obituaries". Retrieved 13 March 2016.
This page was last changed on 14 November 2016, at 12:20.
|
# Copyright (c) 2015 Jinxiong Tan
# GNU General public licence
import aerospike as aero
import multiprocessing
import time
import datetime
import sys
# Usage example:
# records = [Record('key_1', {'bin': 'value_1'}), Record('key_2', {'bin': 'value_2'}), Record('key_3', {'bin': 'value_3'})]
# aerospike_client = aerospike_client.AsyncClient([(host_1, port_1), (host_2, port_2)], 'namespace', 'set', 604800)
# success_count, failure_records = aerospike_client.put(records)
class Record():
def __init__(self, key, bins):
"""
:param key: Aerospike key, should be a string
:param bins: Aerospike bins, should be a dictionary
:return: None
"""
if type(bins) is dict:
self.key = key
self.bins = bins
else:
raise TypeError('Wrong types for bins')
class Client(object):
def __init__(self, cluster, namespace, set_name, ttl, retry_limit, logger):
self._cluster = cluster
self._namespace = namespace
self._set_name = set_name
self._ttl = ttl
self._retry_limit = retry_limit
self._logger = logger
def put(self, records):
raise NotImplementedError
def get(self, key):
raise NotImplementedError
def close(self):
raise NotImplementedError
def _log(self, content):
log = '{timestamp}: {content}'.format(timestamp=datetime.datetime.fromtimestamp(time.time()).strftime('%Y-%m-%d %H:%M:%S'), content=content)
if self._logger is None:
print log
else:
self._logger.logging(log)
class SyncClient(Client):
def __init__(self, cluster, namespace, set_name, ttl, retry_limit=3, logger=None):
"""
:param cluster: Aerospike cluster, should have the following format, [(host_1, port_1), (host_2, port_2), ..., (host_n, port_n)]
:param namespace: Aerospike namespace
:param set_name: Aerospike set
:param ttl: time to live for records
:return: None
"""
super(SyncClient, self).__init__(cluster, namespace, set_name, ttl, retry_limit, logger)
self.aerospike_dao = []
for node in cluster:
self.aerospike_dao.append(_AerospikeDao(node, namespace, set_name, ttl))
def put(self, records):
failure_records = []
total = len(records)
put_count = 0
self._log('Loading records to {0}'.format(self._cluster))
for record in records:
if not isinstance(record, Record):
raise Exception('Wrong type for aerospike object')
if put_count % 1000 == 0 and put_count > 0:
self._log('Finished {0}%'.format(int(float(put_count)/total*100)))
for aerospike_dao in self.aerospike_dao:
for attempt in xrange(1 + self._retry_limit):
try:
aerospike_dao.put(record.key, record.bins)
except Exception as e:
print e
else:
break
else:
failure_records.append(record.key)
put_count += 1
self._log('Finished 100%')
return len(records) - len(failure_records), failure_records
def get(self, key):
try:
bins = self.aerospike_dao[0].get(key)
except Exception as e:
print e
return None
else:
return bins
def close(self):
for aerospike_dao in self.aerospike_dao:
aerospike_dao.close()
class AsyncClient(Client):
def __init__(self, cluster, namespace, set_name, ttl, retry_limit=3, logger=None, pool_size=4, queue_size=256):
"""
:param cluster: Aerospike cluster, should have the following format, [(host_1, port_1), (host_2, port_2), ..., (host_n, port_n)]
:param namespace: Aerospike namespace
:param set_name: Aerospike set
:param ttl: time to live for records
:param retry_limit: limit for retrying times for failure records
:param pool_size: number of processes to load records
:param queue_size: the maximum capacity of blocking queue, by default it is set to 256
:return: None
"""
super(AsyncClient, self).__init__(cluster, namespace, set_name, ttl, retry_limit, logger)
self._pool_size = pool_size
self._queue_size = queue_size
self._task_queue = multiprocessing.JoinableQueue()
self._failure_queue = multiprocessing.Queue()
def put(self, records):
"""
:param records: Record object collection
:return: success record count and collection of failure records (after retries)
"""
processors = [_Processor(self._cluster, self._namespace, self._set_name, self._ttl, self._task_queue, self._failure_queue, self._retry_limit) for i in xrange(self._pool_size)]
for processor in processors:
processor.start()
total = len(records)
put_count = 0
self._log('Loading records to {0}'.format(self._cluster))
for record in records:
while True:
if self._task_queue.qsize() < self._queue_size:
break
time.sleep(0.1)
if not isinstance(record, Record):
raise Exception('Wrong type for aerospike object')
if put_count % 1000 == 0 and put_count > 0:
self._log('Finished {0}%'.format(int(float(put_count)/total*100)))
for node_index in xrange(len(self._cluster)):
self._task_queue.put(_Put(node_index, record))
put_count += 1
self._log('Finished 100%')
self._task_queue.join()
for i in xrange(self._pool_size):
self._task_queue.put(None)
self._failure_queue.put(None)
failure_records = []
while True:
failure_record = self._failure_queue.get()
if failure_record is not None:
failure_records.append(failure_record)
else:
break
for processor in processors:
processor.join()
processor.terminate()
return len(records) * len(self._cluster) - len(failure_records), failure_records
def get(self, key):
pass
def close(self):
pass
class _Processor(multiprocessing.Process):
def __init__(self, cluster, namespace, set_name, ttl, task_queue, failure_queue, retry_limit):
"""
:param task_queue: process-shared queue to contain tasks
:param failure_queue: process-shared queue to contain failure records after retries
:return: None
"""
super(_Processor, self).__init__()
self._task_queue = task_queue
self._failure_queue = failure_queue
self._retry_limit = retry_limit
self.aerospike_dao = []
for node in cluster:
self.aerospike_dao.append(_AerospikeDao(node, namespace, set_name, ttl))
def run(self):
while True:
next_task = self._task_queue.get()
if next_task is None:
self._task_queue.task_done()
break
result = next_task(self)
if (not result) and next_task.retry_counter < self._retry_limit:
next_task.retry()
self._task_queue.put(next_task)
elif not result:
self._failure_queue.put(next_task.record.key)
# task_done() should be called after appending records to failure queue since processes should be blocked until all failure records are captured
self._task_queue.task_done()
return
def close(self):
for dao in self.aerospike_dao:
dao.close()
def __del__(self):
self.close()
class _Put():
def __init__(self, dao_index, record):
"""
:param dao_index: unique index for each node's aerospike-dao
:param record: record to put
:return: None
"""
self.dao_index = dao_index
self.record = record
self.retry_counter = 0
def retry(self):
self.retry_counter += 1
def __call__(self, processor):
return processor.aerospike_dao[self.dao_index].put(self.record.key, self.record.bins)
def __str__(self):
return 'key={key},bins={bins}'.format(key=self.record.key, bins=self.record.bins)
class _AerospikeDao():
def __init__(self, host, namespace, set_name, ttl):
"""
:param host:
:param namespace:
:param set_name:
:param ttl:
:return:
"""
self._namespace = namespace
self._set_name = set_name
self._ttl = ttl
for attempt in xrange(3):
try:
self._aerospike_client = aero.client({'hosts': [host]}).connect()
except Exception as e:
print e
else:
break
else:
raise Exception('[Error] 3 failed attempts for connecting to {host}'.format(host=host))
def put(self, key, bins):
"""
:param key:
:param bins:
:return:
"""
try:
self._aerospike_client.put((self._namespace, self._set_name, key), bins, meta={'ttl': self._ttl})
except Exception as e:
print e
return False
else:
return True
def get(self, key):
"""
:param key:
:return:
"""
try:
(key, meta, bins) = self._aerospike_client.get((self._namespace, self._set_name, key))
except Exception as e:
print e
return None
else:
return bins
def close(self):
self._aerospike_client.close()
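The connection and put paths above lean on Python's for/else retry idiom: the `else` clause of a `for` loop runs only when the loop finishes without hitting `break`, which makes it a natural place for "all attempts failed" handling. A minimal, self-contained sketch of that pattern (the `Flaky` helper is hypothetical, for illustration only):

```python
def call_with_retries(fn, retry_limit=3):
    """Return fn()'s result, retrying up to retry_limit extra times.

    Mirrors the for/else pattern used by _AerospikeDao.__init__ above:
    the loop's else branch only runs if every attempt raised (no break).
    """
    for attempt in range(1 + retry_limit):
        try:
            result = fn()
        except Exception:
            continue          # this attempt failed; try again
        else:
            break             # success: skip the loop's else clause
    else:
        raise RuntimeError('all %d attempts failed' % (1 + retry_limit))
    return result


class Flaky(object):
    # Hypothetical callable: fails the first `failures` calls, then succeeds.
    def __init__(self, failures):
        self.remaining = failures

    def __call__(self):
        if self.remaining > 0:
            self.remaining -= 1
            raise IOError('transient failure')
        return 'ok'
```

With `retry_limit=3` this makes at most four attempts, matching the `xrange(1 + self._retry_limit)` loops in the client code.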
|
Overview: Bantu refers to a large, complex linguistic grouping of peoples in Africa. The Central-Congo Bantu people cluster encompasses multiple Bantu ethnic groups primarily found in the Congo region of Africa, which today comprises the Democratic Republic of the Congo (Congo-Kinshasa) and the Republic of the Congo (Congo-Brazzaville). More than one hundred Bantu people groups are found within this geographic region.
|
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at https://mozilla.org/MPL/2.0/.
import json
import os.path
import re
import sys
BASE = os.path.dirname(__file__.replace('\\', '/'))
sys.path.insert(0, os.path.join(BASE, "Mako-0.9.1.zip"))
sys.path.insert(0, BASE) # For importing `data.py`
from mako import exceptions
from mako.lookup import TemplateLookup
from mako.template import Template
import data
RE_PYTHON_ADDR = re.compile(r'<.+? object at 0x[0-9a-fA-F]+>')
OUT_DIR = os.environ.get("OUT_DIR", "")
STYLE_STRUCT_LIST = [
"background",
"border",
"box",
"column",
"counters",
"effects",
"font",
"inherited_box",
"inherited_table",
"inherited_text",
"inherited_ui",
"inherited_svg",
"list",
"margin",
"outline",
"padding",
"position",
"table",
"text",
"ui",
"svg",
"xul",
]
def main():
usage = ("Usage: %s [ servo | gecko ] [ style-crate | geckolib <template> | html ]" %
sys.argv[0])
if len(sys.argv) < 3:
abort(usage)
product = sys.argv[1]
output = sys.argv[2]
if product not in ["servo", "gecko"] or output not in ["style-crate", "geckolib", "html"]:
abort(usage)
properties = data.PropertiesData(product=product)
files = {}
for kind in ["longhands", "shorthands"]:
files[kind] = {}
for struct in STYLE_STRUCT_LIST:
file_name = os.path.join(BASE, kind, "{}.mako.rs".format(struct))
if kind == "shorthands" and not os.path.exists(file_name):
files[kind][struct] = ""
continue
files[kind][struct] = render(
file_name,
product=product,
data=properties,
)
properties_template = os.path.join(BASE, "properties.mako.rs")
files["properties"] = render(
properties_template,
product=product,
data=properties,
__file__=properties_template,
OUT_DIR=OUT_DIR,
)
if output == "style-crate":
write(OUT_DIR, "properties.rs", files["properties"])
for kind in ["longhands", "shorthands"]:
for struct in files[kind]:
write(
os.path.join(OUT_DIR, kind),
"{}.rs".format(struct),
files[kind][struct],
)
if product == "gecko":
template = os.path.join(BASE, "gecko.mako.rs")
rust = render(template, data=properties)
write(OUT_DIR, "gecko_properties.rs", rust)
elif output == "geckolib":
if len(sys.argv) < 4:
abort(usage)
template = sys.argv[3]
header = render(template, data=properties)
sys.stdout.write(header)
elif output == "html":
write_html(properties)
def abort(message):
    # render() reports errors as utf8-encoded bytes; decode before printing
    # so we never concatenate str and bytes.
    if isinstance(message, bytes):
        message = message.decode("utf-8", "replace")
    sys.stderr.write(message + "\n")
    sys.exit(1)
def render(filename, **context):
try:
lookup = TemplateLookup(directories=[BASE],
input_encoding="utf8",
strict_undefined=True)
template = Template(open(filename, "rb").read(),
filename=filename,
input_encoding="utf8",
lookup=lookup,
strict_undefined=True)
# Uncomment to debug generated Python code:
# write("/tmp", "mako_%s.py" % os.path.basename(filename), template.code)
return template.render(**context).encode("utf8")
except Exception:
# Uncomment to see a traceback in generated Python code:
# raise
abort(exceptions.text_error_template().render().encode("utf8"))
def write(directory, filename, content):
if not os.path.exists(directory):
os.makedirs(directory)
full_path = os.path.join(directory, filename)
open(full_path, "wb").write(content)
python_addr = RE_PYTHON_ADDR.search(content)
if python_addr:
abort("Found \"{}\" in {} ({})".format(python_addr.group(0), filename, full_path))
def write_html(properties):
properties = dict(
(p.name, {
"flag": p.servo_pref,
"shorthand": hasattr(p, "sub_properties")
})
for p in properties.longhands + properties.shorthands
)
doc_servo = os.path.join(BASE, "..", "..", "..", "target", "doc", "servo")
html = render(os.path.join(BASE, "properties.html.mako"), properties=properties)
write(doc_servo, "css-properties.html", html)
write(doc_servo, "css-properties.json", json.dumps(properties, indent=4))
if __name__ == "__main__":
main()
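The `RE_PYTHON_ADDR` guard in `write()` exists to catch a default Python object repr (e.g. `<Foo object at 0x7f...>`) leaking into the generated Rust source, which usually means a template interpolated an object instead of a string. A quick self-contained check of that pattern:

```python
import re

# Same pattern as RE_PYTHON_ADDR above: matches default object reprs
# such as "<object object at 0x7f3a2c1b9e50>".
RE_PYTHON_ADDR = re.compile(r'<.+? object at 0x[0-9a-fA-F]+>')

leaked = "pub fn foo() {} // %s" % repr(object())
clean = "pub fn foo() {}"

assert RE_PYTHON_ADDR.search(leaked) is not None
assert RE_PYTHON_ADDR.search(clean) is None
```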
|
The next four days are about recovery for me after a great holiday with lots of family in town. We have had a busy few weeks and now it is time to get back to reality.
My house from Christmas, between decorations, presents and visitors our house is a bit cluttered.
My body from night after night of rich food and drinks.
My schedule from two weeks of visitors and the boys being out of school.
My office from being a bedroom and workspace for others.
My brain from not thinking about work while family was in town. I am all mixed up on what day of the week it is.
My sleep from late nights, early mornings and many thoughts running through my brain.
My goal by the end of the weekend is to feel like we have returned to normal, just in time for school to start again and work to ramp up.
|
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
# pylint: disable=invalid-name
"""Utility function to get information from graph."""
from __future__ import absolute_import as _abs
import tvm
from . import graph_attr
from ..graph import create
from ..symbol import Group, ones_like
def infer_shape(graph, **shape):
"""Infer the shape given the shape of inputs.
Parameters
----------
graph : Graph
The graph to perform shape inference from
shape : dict of str to tuple
The specific input shape.
Returns
-------
in_shape : list of tuple
Shape of inputs
out_shape: list of tuple
Shape of outputs
"""
graph = graph_attr.set_shape_inputs(graph, shape)
graph = graph.apply("InferShape")
shape = graph.json_attr("shape")
index = graph.index
input_shape = [shape[index.entry_id(x)] for x in index.input_names]
output_shape = [shape[index.entry_id(x)] for x in index.output_entries]
return input_shape, output_shape
def infer_dtype(graph, **dtype):
    """Infer the dtype given the dtypes of inputs.
Parameters
----------
graph : Graph
The graph to perform type inference from
dtype : dict of str to dtype
The specific input data type.
Returns
-------
in_dtype : list of tuple
Dtype of inputs
out_dtype: list of tuple
Dtype of outputs
"""
graph = graph_attr.set_dtype_inputs(graph, dtype)
graph = graph.apply("InferType")
dtype = graph.json_attr("dtype")
index = graph.index
input_dtype = [graph_attr.TCODE_TO_DTYPE[dtype[index.entry_id(x)]]
for x in index.input_names]
output_dtype = [graph_attr.TCODE_TO_DTYPE[dtype[index.entry_id(x)]]
for x in index.output_entries]
return input_dtype, output_dtype
_deep_compare = tvm.get_global_func("nnvm.graph.DeepCompare")
def check_graph_equal(grapha, graphb, compare_variable_attrs=False):
"""Check if two graphs have equal structure.
Parameters
----------
grapha : Graph
The first graph
graphb : Graph
The second graph
compare_variable_attrs : bool, optional
Whether we want to compare attributes(names) on variables.
Usually it is safe to skip it unless we want input name
to exactly match
Raises
------
ValueError
ValueError is raised with error message when graph not equal
"""
err = _deep_compare(grapha, graphb, compare_variable_attrs)
if err:
raise ValueError("Graph compare error: " + err)
def get_gradient_graph(ys, xs, grad_ys=None):
"""Create gradient graph of ys with respect to xs.
Parameters
----------
ys : Symbol or list of Symbol
Symbols from which the gradient is calculated.
xs : Symbol or list of Symbol
Symbols the gradient respect to.
For group symbol, gradients for all outputs will be calculated.
grad_ys : Symbol or list of Symbol
Head gradients for ys.
Returns
-------
ret : Graph
Generated gradient graph.
"""
if isinstance(ys, list):
ys = Group(ys)
g = create(ys)
g._set_symbol_list_attr('grad_ys', ys)
g._set_symbol_list_attr('grad_xs', xs)
ny = len(ys.list_output_names())
if grad_ys is None:
grad_ys = [ones_like(ys[i]) for i in range(ny)]
g._set_symbol_list_attr('grad_ys_out_grad', grad_ys)
return g.apply('Gradient')
def gradients(ys, xs, grad_ys=None):
"""Create gradient symbol of ys respect to xs.
Parameters
----------
ys : Symbol or list of Symbol
Symbols from which the gradient is calculated.
xs : Symbol or list of Symbol
Symbols the gradient respect to.
For group symbol, gradients for all outputs will be calculated.
grad_ys : Symbol or list of Symbol
Head gradients for ys.
Returns
-------
ret : list of Symbol
Generated gradient symbol. For each xs,
all gradients from ys are merged into a single symbol.
"""
grad_g = get_gradient_graph(ys, xs, grad_ys)
nx = len(Group(xs).list_output_names()) \
if isinstance(xs, list) else len(xs.list_output_names())
ret = [grad_g.symbol[i] for i in range(nx)]
return ret
|
Different blogs focus on providing you Natural Beauty Tips. Apart from this, there are certain sections that focus on best-of-ten lists, such as the Top 10 Places. However, very few blogs and websites provide the most relevant information about the Latest Health Tips. If you find one that highlights this important aspect too, then you should bookmark it. Here are the 3 most popular Latest Health Tips.
The first rule of a healthy life is a balanced diet. You should have three meals a day, at fixed intervals. Never overeat, and avoid junk food. Eat hygienic food cooked in good conditions, preferably home-made food. The stomach is, after all, the key factor that determines your health. If you are unable to keep it healthy, it will not let the other parts of the body function properly either.
You must sleep at least 6 hours per night. If you compromise on your sleep, the body never recovers from the fatigue you cause it. In addition, waking up with a tired body and mind will not let you function properly, and the same cycle continues. If a deficiency of sleep is known to cause problems and health issues, the same holds true for excessive sleep: in excess, sleeping can be a major cause of obesity. Therefore, you need to sleep an appropriate amount.
You need to make time for physical exercise, be it brisk walking in the morning or evening or going to the gym. You may also try yoga if you have enough time. Failing that, you can prefer walking more wherever possible, as it is one of the most natural exercises.
All in all, these are the 3 most popular Latest Health Tips. There are umpteen other ways to maintain your health, but these, though categorized as the latest, are still the most conventional.
|
# -*- coding: utf-8 -*-
import codecs
from ..internal.Container import Container
from ..internal.misc import encode
class TXT(Container):
__name__ = "TXT"
__type__ = "container"
__version__ = "0.21"
__status__ = "testing"
__pattern__ = r'.+\.(txt|text)$'
__config__ = [("activated", "bool", "Activated", True),
("use_premium", "bool", "Use premium account if available", True),
("folder_per_package", "Default;Yes;No",
"Create folder for each package", "Default"),
("flush", "bool", "Flush list after adding", False),
("encoding", "str", "File encoding", "utf-8")]
__description__ = """Read link lists in plain text formats"""
__license__ = "GPLv3"
__authors__ = [("spoob", "spoob@pyload.org"),
("jeix", "jeix@hasnomail.com")]
def decrypt(self, pyfile):
try:
encoding = codecs.lookup(self.config.get('encoding')).name
except Exception:
encoding = "utf-8"
fs_filename = encode(pyfile.url)
txt = codecs.open(fs_filename, 'r', encoding)
curPack = "Parsed links from %s" % pyfile.name
packages = {curPack: [], }
for link in txt.readlines():
link = link.strip()
if not link:
continue
if link.startswith(";"):
continue
if link.startswith("[") and link.endswith("]"):
#: New package
curPack = link[1:-1]
packages[curPack] = []
continue
packages[curPack].append(link)
txt.close()
#: Empty packages fix
for key, value in packages.items():
if not value:
packages.pop(key, None)
if self.config.get('flush'):
try:
txt = open(fs_filename, 'wb')
txt.close()
except IOError:
self.log_warning(_("Failed to flush list"))
for name, links in packages.items():
self.packages.append((name, links, name))
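The `decrypt()` method above implements a simple line-oriented format: blank lines and `;`-comments are skipped, a `[name]` line starts a new package, and every other line is treated as a link. A standalone sketch of the same parsing rules, operating on an in-memory list of lines instead of a file:

```python
def parse_link_list(lines, default_pack):
    """Parse a pyLoad-style plain-text link list (sketch of TXT.decrypt)."""
    cur = default_pack
    packages = {cur: []}
    for line in lines:
        link = line.strip()
        if not link or link.startswith(";"):
            continue  # blank line or comment
        if link.startswith("[") and link.endswith("]"):
            cur = link[1:-1]      # new package header
            packages[cur] = []
            continue
        packages[cur].append(link)
    # Drop empty packages, as the plugin does in its "empty packages fix".
    return {name: links for name, links in packages.items() if links}
```

For example, a file containing a comment, one bare link, and a `[Pack B]` section yields two packages, and a `[Empty]` header with no links under it is discarded.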
|
Research from Cardiff University and the Institute of Occupational Safety and Health has demonstrated that miners who are excluded from the decision-making process in regards to safety and health matters face disproportionate safety risks while at work.
From respiratory diseases caused by dust kicked up in underground operations to the impact of collapsing ceilings and malfunctioning machinery, mining continues to be a dangerous career for many of the workers on the front lines of one of the world’s most profitable industries.
The inherent dangers of the industry are amplified in countries where government regulation and corporate infrastructure are less developed. Strike action at Gold Field’s South Deep mines, for example, spiralled from industrial action to waves of violence and arson attacks, as the South African mining industry edges closer to a third successive year of increased fatalities at operations.
In response to these risks, scientists from Cardiff University, backed by the Institute of Occupational Safety and Health (IOSH) conducted research into the relationship between the extent to which miners were empowered to participate in decision-making processes with regard to safety, and safety records, between 2016 and 2018. The study, ‘The role and effects of representing miners in arrangements of safety and health in coal mining: a global study’, investigated coal mines in five countries – Australia, Canada, India, Indonesia and South Africa – and found that greater worker empowerment and tighter regulations contributed to more effective occupational safety and health (OSH) practices.
The researchers conducted interviews and convened workshops with miners in the target countries to learn what they thought about OSH practices at work, and produced a literature review that assessed the regulatory framework in place around the workers. By reviewing these legal apparatuses in tandem with the experiences of individual miners, it became clear that, while all of the countries insisted workers participate in health and safety activities, Indian and Indonesian workers felt like these processes were largely meaningless, and were not contributing to the safety of operations as a whole.
Indonesia in particular has strict guidelines on mining safety incorporated into its law, covering risk management, employee health programmes and company regulation and assessment – yet 28 people were killed in Indonesia’s Grasberg mine, the world’s second-largest copper project, in 2013.
State-owned miner PT Indonesia Asahan Aluminium purchased Rio Tinto’s 40% stake in the mine in July 2018, despite the project’s poor safety history and widespread accusations of environmental mismanagement that have resulted in a reported 200,000 tonnes of tailings being dumped into local water sources every day. The mine has endangered both its workers and local residents, and in spite of government and company commitments to ensuring safety, the state is only becoming more closely tied to the project.
Ogungbeje argued that responsibility falls on all parties involved, from individual workers to national governments. “I don’t think responsibility should fall on one type of actor, each has a part to play, be it governments or private companies or workers themselves,” she said.
She added that governments are integral in ensuring standards are maintained across companies within a country, and that safety standards and recognition of the importance of OSH practices need to permeate all levels of the mining industry.
IOSH plans to revisit the study in a year, and then again in five years, to measure its impact over a longer period of time. While soft criteria, such as the extent to which workers feel involved in the decision-making process with regards to OSH matters, are difficult to quantify, it is hoped the work will raise awareness of the importance of safety procedures, and the effect that involving workers in those procedures can have on operational safety. Ogungbeje said a key motivation behind the study was a lack of literature on the subject of worker engagement in relation to safety performance, which the work done by Cardiff University aims to rectify.
While industry operators and regulators have not yet shown a similar concern, the variety of groups to have already expressed interest suggests that a greater range of companies and decision-makers will be involved in OSH practices in the future. The study demonstrated that a lack of variety in viewpoints related to safety, and an exclusion of all relevant parties from the management process, can have negative consequences on an operation’s safety record. It is hoped the study will encourage positive steps towards improving OSH performance.
|
from __future__ import division
from math import floor
def get_cidr_from_subnet(subnet):
    if (not validate_ipv4(subnet)):
        raise ValueError, 'Subnet must be valid.'
    cidr = 0
    for octet in subnet.split('.'):
        # bin() yields e.g. '0b11111111'; counting '1' characters gives
        # the number of set bits in the octet.
        cidr += bin(int(octet)).count('1')
    return cidr
def get_subnet_from_cidr(cidr):
if (not ((type(cidr) is int) or (type(cidr) is long))):
raise TypeError, 'Value must be an integer or a long.'
num = 0
for i in range(0, cidr):
num = (num | (2 ** (31 - i)))
return get_ip_from_int(num)
def get_ip_from_int(num):
if (not ((type(num) is int) or (type(num) is long))):
raise TypeError, 'Value must be an integer or a long.'
one = floor((num // (2 ** 24)))
two = floor(((num - (one * (2 ** 24))) // (2 ** 16)))
three = floor((((num - (one * (2 ** 24))) - (two * (2 ** 16))) // (2 ** 8)))
four = (((num - (one * (2 ** 24))) - (two * (2 ** 16))) - (three * (2 ** 8)))
if validate_ipv4(('%d.%d.%d.%d' % (one, two, three, four))):
return ('%d.%d.%d.%d' % (one, two, three, four))
else:
return False
def get_ip_from_hex_str(data):
if (not (isinstance(data, str) or isinstance(data, unicode))):
raise TypeError, 'Must supply a hex string.'
if (len(data) != 8):
raise ValueError, 'Hex string must be in 8 characters in length'
one = ((int(data[0], 16) * 16) + int(data[1], 16))
two = ((int(data[2], 16) * 16) + int(data[3], 16))
three = ((int(data[4], 16) * 16) + int(data[5], 16))
four = ((int(data[6], 16) * 16) + int(data[7], 16))
if validate_ipv4(('%s.%s.%s.%s' % (one, two, three, four))):
return ('%s.%s.%s.%s' % (one, two, three, four))
else:
return False
def get_int_from_ip(ip):
if (not validate_ipv4(ip)):
raise ValueError, 'IP must be valid.'
splitwork = ip.split('.')
if (len(splitwork) != 4):
return ip
return ((((int(splitwork[0]) * (2 ** 24)) + (int(splitwork[1]) * (2 ** 16))) + (int(splitwork[2]) * (2 ** 8))) + int(splitwork[3]))
def expand_ipv6(address):
if (not validate_ipv6(address)):
raise ValueError, 'Address must be a IPv6 notation.'
half = address.split('::')
if (len(half) == 2):
half[0] = half[0].split(':')
half[1] = half[1].split(':')
nodes = ((half[0] + (['0'] * (8 - (len(half[0]) + len(half[1]))))) + half[1])
else:
nodes = half[0].split(':')
return ':'.join((('%04x' % int((i or '0'), 16)) for i in nodes))
def get_broadcast_from_subnet(ip, subnet):
if (not ((type(subnet) is str) or (type(subnet) is unicode))):
raise TypeError, 'Subnet must be a string representation.'
if (not validate_ipv4_subnet(subnet)):
raise TypeError, 'Subnet must be a valid subnet mask.'
if (not ((type(ip) is str) or (type(ip) is unicode))):
raise TypeError, 'IP must be a string representation.'
if (not validate_ipv4(ip)):
raise TypeError, 'IP must be a valid IP address.'
network = get_network_from_subnet(ip, subnet)
net_split = network.split('.')
sub_split = subnet.split('.')
broadcast = []
for i in range(0, 4):
broadcast.append(str((int(net_split[i]) | (int(sub_split[i]) ^ 255))))
return '.'.join(broadcast)
def get_network_from_subnet(ip, subnet):
if (not ((type(subnet) is str) or (type(subnet) is unicode))):
raise TypeError, 'Subnet must be a string representation.'
if (not validate_ipv4_subnet(subnet)):
raise TypeError, 'Subnet must be a valid subnet mask.'
if (not ((type(ip) is str) or (type(ip) is unicode))):
raise TypeError, 'IP must be a string representation.'
if (not validate_ipv4(ip)):
raise TypeError, 'IP must be a valid IP address.'
ip_split = ip.split('.')
sub_split = subnet.split('.')
network = []
for i in range(0, 4):
network.append(str((int(ip_split[i]) & int(sub_split[i]))))
return '.'.join(network)
def validate_ipv4_subnet(subnet):
if (not ((type(subnet) is str) or (type(subnet) is unicode))):
raise TypeError, 'Subnet must be a string representation.'
if (not validate_ipv4(subnet)):
return False
found_zero = False
for item in subnet.split('.'):
if ((not found_zero) and (item == '255')):
continue
if (found_zero and (not (item == '0'))):
return False
digit = int(item)
for i in range(0, 8):
if ((digit & (2 ** (7 - i))) == 0):
found_zero = True
elif found_zero:
return False
return True
def validate_ipv4(ip):
if (not ((type(ip) is str) or (type(ip) is unicode))):
raise TypeError, 'IP must be a string representation.'
octets = ip.split('.')
if (len(octets) != 4):
return False
for octet in octets:
try:
i = int(octet)
except ValueError:
return False
if ((i < 0) or (i > 255)):
return False
else:
return True
def validate_ipv6(ip):
if (not ((type(ip) is str) or (type(ip) is unicode))):
raise TypeError, 'IP must be a string representation.'
nodes = ip.split('%')
if (len(nodes) not in [1, 2]):
return False
addr = nodes[0]
if (len(nodes) == 2):
try:
int(nodes[1])
except ValueError:
return False
if (addr.count('::') > 1):
return False
groups = addr.split(':')
if ((len(groups) > 8) or (len(groups) < 3)):
return False
for group in groups:
if (group == ''):
continue
try:
i = int(group, 16)
except ValueError:
return False
if ((i < 0) or (i > 65535)):
return False
else:
return True
def validate(ip):
if (not ((type(ip) is str) or (type(ip) is unicode))):
raise TypeError, 'IP must be a string representation.'
if (':' in ip):
return validate_ipv6(ip)
elif ('.' in ip):
return validate_ipv4(ip)
else:
return False
def validate_port(port):
if (not ((type(port) is int) or (type(port) is long))):
raise TypeError, 'Port must be an int or long representation.'
if ((port >= 0) and (port <= 65535)):
return True
return False
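On Python 3, several of the conversions above can be cross-checked against the stdlib `ipaddress` module. A sketch of the equivalent calls (an illustration, not a drop-in replacement for the Python 2 code above):

```python
import ipaddress

def cidr_from_mask(mask):
    # Counterpart of get_cidr_from_subnet: '255.255.255.0' -> 24.
    return ipaddress.IPv4Network('0.0.0.0/%s' % mask).prefixlen

def mask_from_cidr(cidr):
    # Counterpart of get_subnet_from_cidr: 24 -> '255.255.255.0'.
    return str(ipaddress.IPv4Network('0.0.0.0/%d' % cidr).netmask)

def network_and_broadcast(ip, mask):
    # Counterparts of get_network_from_subnet / get_broadcast_from_subnet.
    net = ipaddress.IPv4Network('%s/%s' % (ip, mask), strict=False)
    return str(net.network_address), str(net.broadcast_address)
```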
|
Accelerate above your set speed whenever you wish, then release the throttle and your bike speed automatically drops back to the previously set speed. Touch the front or rear brake or engage the clutch and the cruise control cuts out instantly.
If you’ve ever suffered from a cramped right wrist, a stiff neck or an aching back you really should fit a MotorCycle Electronic Cruise control.
Basically the same as a car cruise control - except it performs even better.
Simply accelerate to the desired speed (30mph / 50kph upwards).
From then on the MotorCycle Cruise control will maintain your speed up and downhill, with or against the wind. You can vary your set speed by as little as 1mph (1.5kph) by a single touch on a button.
Safety features ensure that the cruise control will not resume speed if it has been turned off by the control switch or the bike ignition switch.
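The behaviour described above (engage above a minimum speed, 1 mph adjustments, instant cut-out on brake or clutch, no resume after switch-off) amounts to a small state machine. A hypothetical sketch of those rules, purely for illustration and not the actual controller firmware:

```python
class CruiseControl:
    """Toy model of the cruise-control rules described above."""
    MIN_SET_SPEED = 30  # mph; threshold assumed from the text

    def __init__(self):
        self.set_speed = None
        self.engaged = False

    def set(self, speed):
        # Engage only at or above the minimum set speed.
        if speed >= self.MIN_SET_SPEED:
            self.set_speed = speed
            self.engaged = True

    def nudge(self, delta=1):
        # Adjust the set speed in 1 mph steps via the button.
        if self.engaged:
            self.set_speed += delta

    def brake_or_clutch(self):
        # Touching the brake or engaging the clutch cuts out instantly;
        # the set speed is retained for a later resume.
        self.engaged = False

    def switch_off(self):
        # Turning off the control (or the ignition) also clears the set
        # speed, so the unit cannot resume the old speed by itself.
        self.engaged = False
        self.set_speed = None
```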
|
# -*- encoding: utf-8 -*-
from supriya.tools.ugentools.MultiOutUGen import MultiOutUGen
class Pan4(MultiOutUGen):
r'''A four-channel equal-power panner.
::
>>> source = ugentools.In.ar(bus=0)
>>> pan_4 = ugentools.Pan4.ar(
... gain=1,
... source=source,
... x_position=0,
... y_position=0,
... )
>>> pan_4
UGenArray({4})
'''
### CLASS VARIABLES ###
__documentation_section__ = 'Spatialization UGens'
__slots__ = ()
_ordered_input_names = (
'source',
'x_position',
'y_position',
'gain',
)
_valid_calculation_rates = None
### INITIALIZER ###
def __init__(
self,
calculation_rate=None,
gain=1,
source=None,
x_position=0,
y_position=0,
):
MultiOutUGen.__init__(
self,
calculation_rate=calculation_rate,
channel_count=4,
gain=gain,
source=source,
x_position=x_position,
y_position=y_position,
)
### PUBLIC METHODS ###
@classmethod
def ar(
cls,
gain=1,
source=None,
x_position=0,
y_position=0,
):
r'''Constructs an audio-rate Pan4.
::
>>> source = ugentools.In.ar(bus=0)
>>> pan_4 = ugentools.Pan4.ar(
... gain=1,
... source=source,
... x_position=0,
... y_position=0,
... )
>>> pan_4
UGenArray({4})
Returns ugen graph.
'''
from supriya.tools import synthdeftools
calculation_rate = synthdeftools.CalculationRate.AUDIO
ugen = cls._new_expanded(
calculation_rate=calculation_rate,
gain=gain,
source=source,
x_position=x_position,
y_position=y_position,
)
return ugen
@classmethod
def kr(
cls,
gain=1,
source=None,
x_position=0,
y_position=0,
):
r'''Constructs a control-rate Pan4.
::
>>> source = ugentools.In.ar(bus=0)
>>> pan_4 = ugentools.Pan4.kr(
... gain=1,
... source=source,
... x_position=0,
... y_position=0,
... )
>>> pan_4
UGenArray({4})
Returns ugen graph.
'''
from supriya.tools import synthdeftools
calculation_rate = synthdeftools.CalculationRate.CONTROL
ugen = cls._new_expanded(
calculation_rate=calculation_rate,
gain=gain,
source=source,
x_position=x_position,
y_position=y_position,
)
return ugen
### PUBLIC PROPERTIES ###
@property
def gain(self):
r'''Gets `gain` input of Pan4.
::
>>> source = ugentools.In.ar(bus=0)
>>> pan_4 = ugentools.Pan4.ar(
... gain=1,
... source=source,
... x_position=0,
... y_position=0,
... )
>>> pan_4[0].source.gain
1.0
Returns ugen input.
'''
index = self._ordered_input_names.index('gain')
return self._inputs[index]
@property
def source(self):
r'''Gets `source` input of Pan4.
::
>>> source = ugentools.In.ar(bus=0)
>>> pan_4 = ugentools.Pan4.ar(
... gain=1,
... source=source,
... x_position=0,
... y_position=0,
... )
>>> pan_4[0].source.source
OutputProxy(
source=In(
bus=0.0,
calculation_rate=CalculationRate.AUDIO,
channel_count=1
),
output_index=0
)
Returns ugen input.
'''
index = self._ordered_input_names.index('source')
return self._inputs[index]
@property
def x_position(self):
r'''Gets `x_position` input of Pan4.
::
>>> source = ugentools.In.ar(bus=0)
>>> pan_4 = ugentools.Pan4.ar(
... gain=1,
... source=source,
... x_position=0,
... y_position=0,
... )
>>> pan_4[0].source.x_position
0.0
Returns ugen input.
'''
index = self._ordered_input_names.index('x_position')
return self._inputs[index]
@property
def y_position(self):
r'''Gets `y_position` input of Pan4.
::
>>> source = ugentools.In.ar(bus=0)
>>> pan_4 = ugentools.Pan4.ar(
... gain=1,
... source=source,
... x_position=0,
... y_position=0,
... )
>>> pan_4[0].source.y_position
0.0
Returns ugen input.
'''
index = self._ordered_input_names.index('y_position')
return self._inputs[index]
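Pan4 distributes a source across four speakers from an (x, y) position. A rough sketch of how four-channel equal-power gains can be computed by factoring the 2D pan into two sin/cos crossfades; this illustrates the general technique, not supriya's or SuperCollider's exact formula:

```python
import math

def pan4_gains(x, y, gain=1.0):
    """Return (lf, rf, lb, rb) gains for x, y in [-1, 1].

    Each axis is an equal-power crossfade over a quarter circle, so the
    four squared gains always sum to gain**2 (constant perceived power).
    """
    # Map [-1, 1] to an angle in [0, pi/2] on each axis.
    ax = (x + 1) * math.pi / 4
    ay = (y + 1) * math.pi / 4
    left, right = math.cos(ax), math.sin(ax)
    back, front = math.cos(ay), math.sin(ay)
    return (gain * left * front,   # left-front
            gain * right * front,  # right-front
            gain * left * back,    # left-back
            gain * right * back)   # right-back
```

Because cos² + sin² = 1 on each axis, the power identity holds for any position, e.g. `pan4_gains(-1.0, 1.0)` sends all of the signal to the left-front channel.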
|
Geoff Lawrence is a seasoned government-relations official, legislative director, accountant, economist and financial analyst. Prior to joining WeedTV, and its parent company, Players Network, Geoff spent a decade as a think-tank public policy analyst then worked as the policy director for the majority caucus in the Nevada Legislature. Then, he served as Nevada's Assistant State Controller, where he oversaw the state's external financial reporting and focused on accountability over the use of tax dollars. Lawrence joined Players Network in July 2017 as Chief Financial Officer and Chief Compliance Officer. He has a beautiful wife and two wonderful children.
|
#!/usr/bin/env python3
"""
req v3.1
Copyright (c) 2016, 2017, 2018 Eugene Y. Q. Shen.
req is free software: you can redistribute it and/or
modify it under the terms of the GNU General Public License
as published by the Free Software Foundation, either version
3 of the License, or (at your option) any later version.
req is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty
of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
See the GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program. If not, see http://www.gnu.org/licenses/.
"""
import os
import sys
from json import dumps
YEAR = '2017'
DUMP = 'req.json'
UBCPATH = 'data/ubc/'
INPATH = '/courses/'
if len(sys.argv) == 1:
COURSES = UBCPATH + YEAR + INPATH
else:
COURSES = sys.argv[1] + INPATH
# Generic class to store course data
class Course():
# Initialize all variables
def __init__(self, code):
self.code = code # course code, e.g. 'CPSC 121'
self.name = '' # name, e.g. 'Models of Computation'
self.desc = '' # course description in UBC Calendar
self.prer = '' # raw prereqs, e.g. 'Either (a) CPSC ...'
self.crer = '' # raw coreqs, e.g. 'All of CPSC 213 ...'
self.preq = [] # prereq tree, e.g. ['or', 'CPSC 221', ...]
self.creq = [] # coreq tree, e.g. ['and', 'CPSC 213', ...]
self.excl = ['or'] # exclusions, e.g. ['or', 'STAT 200', ...]
self.term = set() # terms offered, e.g. {'2017S', '2017W'}
self.cred = set() # possible credits, e.g. {3.0, 6.0}
# Set course parameters
def set_params(self, param, value):
if param == 'name':
self.name = value
elif param == 'desc':
self.desc = value
elif param == 'prer':
self.prer = value
elif param == 'crer':
self.crer = value
elif param == 'preq':
self.preq = get_reqs(value.split())
elif param == 'creq':
self.creq = get_reqs(value.split())
elif param == 'excl':
self.excl.extend(
[''.join(e.strip().split()) for e in value.split(',')])
elif param == 'term':
self.term.update({t.strip() for t in value[:-1].split(',')})
elif param == 'cred':
self.cred.update({float(c.strip()) for c in value.split(',')})
else:
print('Error: parameter not recognized:', param)
# Get course parameters
def get_params(self, param=''):
params = {'code': self.code, 'name': self.name, 'desc': self.desc,
'prer': self.prer, 'crer': self.crer,
'preq': self.preq, 'creq': self.creq, 'excl': self.excl,
'term': list(self.term), 'cred': list(self.cred)}
if param in params.keys():
return params[param]
else:
return params
# Turn requisites into list format; all entries must be true to satisfy
def get_reqs(value):
reqs = []
course = []
group = []
depth = 0
operator = 'and'
for term in value:
if depth < 0:
print('Error: mismatched parentheses.')
# Outside of parens, only terms are course names, and, or
if depth == 0:
if term.startswith('('):
depth = term.count('(')
group.append(term[1:])
elif term == 'and' or term == 'or':
operator = term
if course:
reqs.append(''.join(course))
course = []
else:
course.append(term)
# Call get_reqs again on anything inside parens
else:
if term.startswith('('):
depth += term.count('(')
elif term.endswith(')'):
depth -= term.count(')')
if depth == 0:
group.append(term[:-1])
reqs.append(get_reqs(group))
group = []
else:
group.append(term)
# Add final course after last operator
if course:
reqs.append(''.join(course))
reqs.insert(0, operator)
return reqs
if __name__ == '__main__':
# Parse all files in COURSES as Courses
courses = {}
for name in os.listdir(COURSES):
if name.endswith('.txt'):
with open(os.path.join(COURSES, name), encoding='utf8') as f:
for line in f:
split = line.split(':')
if len(split) > 1:
param = split[0].strip()
value = ':'.join(split[1:]).strip()
if param == 'code':
code = ''.join(value.split())
if code in courses.keys():
course = courses[code]
else:
course = Course(code)
courses[code] = course
else:
course.set_params(param, value)
# Dump courses into JSON file for JavaScript frontend
json = {}
for code, course in courses.items():
params = courses[code].get_params()
# Ignore courses with no name
if not params['name']:
continue
json[code] = params
with open(DUMP, 'w', encoding='utf8') as f:
f.write(dumps(json))
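The requisite trees produced by get_reqs are nested lists whose head is an operator and whose remaining entries are course codes or sub-trees. As an illustration of how a frontend might consume that format, here is a small, self-contained evaluator; it is not part of req itself, and the course codes are made up:

```python
# Evaluate a requisite tree like those built by get_reqs:
# a string leaf is a course code; a list is [operator, subtree, ...].
def satisfied(req, taken):
    if isinstance(req, str):
        return req in taken
    op, *subs = req
    results = [satisfied(s, taken) for s in subs]
    return all(results) if op == 'and' else any(results)

# 'CPSC 110 and (CPSC 121 or MATH 180)' parses to the tree below.
tree = ['and', 'CPSC110', ['or', 'CPSC121', 'MATH180']]
print(satisfied(tree, {'CPSC110', 'MATH180'}))  # True
print(satisfied(tree, {'CPSC110'}))             # False
```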
|
This report shows the usage statistics of eCommerce Product Catalog vs. Blastness vs. PencilBlue as content management systems on the web. See the technologies overview for explanations of the methodologies used in the surveys. Our reports are updated daily.
|
import os
class Toolchain:
MCU_TYPES = {
"stm32": "arm",
"atsam": "arm",
"at91sam": "arm",
"atmega": "avr",
"attiny": "avr",
"msp430": "msp430"
}
@classmethod
def mcu_type(cls, mcu):
for prefix in cls.MCU_TYPES:
if mcu.startswith(prefix):
return cls.MCU_TYPES[prefix]
return "unknown"
@classmethod
def find_toolchain(cls, mcu):
family = cls.mcu_type(mcu)  # avoid shadowing the built-in 'type'
if family == "arm":
from mcu_info_util.toolchain_arm import ToolchainARM
return ToolchainARM()
if family == "avr":
from mcu_info_util.toolchain_avr import ToolchainAVR
return ToolchainAVR()
if family == "msp430":
from mcu_info_util.toolchain_msp430 import ToolchainMSP430
return ToolchainMSP430()
return None
def find_compiler(self):
return ""
def find_prefix(self):
compiler = self.find_compiler()
if compiler.endswith("gcc"):
return compiler[0:-3]
elif compiler.endswith("gcc.exe"):
return compiler[0:-7]
return os.path.dirname(compiler) + os.sep
def get_flags(self, mcu):
return []
def generate_header(self, mcu, filename=None):
return False
def generate_linker_script(self, mcu, filename=None):
return False
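The prefix rule in find_prefix can be exercised in isolation: strip a trailing "gcc" or "gcc.exe" from the compiler name to obtain the cross-toolchain prefix. A standalone sketch of the same rule (toolchain names are illustrative):

```python
# Derive the cross-toolchain prefix from a compiler name, mirroring
# find_prefix: strip a trailing "gcc"/"gcc.exe", keeping the dash prefix.
def toolchain_prefix(compiler):
    for suffix in ("gcc.exe", "gcc"):
        if compiler.endswith(suffix):
            return compiler[:-len(suffix)]
    return ""

print(toolchain_prefix("arm-none-eabi-gcc"))  # arm-none-eabi-
print(toolchain_prefix("msp430-elf-gcc"))     # msp430-elf-
```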
|
Elephant’s ears are large, evergreen perennials from damp sites in south and southeast Asia. They are grown for their big, veined, arrow- or heart-shaped leaves, which may be marked with black, dark purple, or bronze. The flowers are fairly insignificant. These rhizomatous or tuberous-rooted plants have a striking presence in the shady garden or large containers, or in a warm greenhouse or as a houseplant. Some species can reach over 15 feet tall. The tubers can be lifted and stored over winter.
Noteworthy Characteristics: Large, heavily veined, arrow-shaped leaves. Sap may irritate skin and all plant parts may cause mild stomach upset if ingested.
Care: Likes moist but well-drained soil of moderate fertility in partial shade. Indoors, grow in filtered light. Provide high humidity, ample water, and fertilizer every 2 to 3 weeks during the growing season. Cut back on water in the winter.
Propagation: As soon as the seed is ripe, sow at 73°F. Divide the rhizomes, separate offsets, or root stem cuttings in spring or summer.
Problems: Mealybugs and scale insects can affect garden plants, while fungal and bacterial leaf diseases are common under glass.
|
from auth import requires_auth
from db_helper import IdUrlField, update_model
from flask.ext.restful import Resource, fields, reqparse, marshal, abort
__author__ = 'wojtowpj'
from google.appengine.ext import db
class BeerGlass(db.Model):
name = db.StringProperty(required=True)
description = db.StringProperty(required=False)
capacity = db.FloatProperty(required=False)
glass_fields = {
'name': fields.String,
'description': fields.String,
'capacity': fields.Float,
'uri': IdUrlField('beer_glass', absolute=True),
}
class BeerGlassListApi(Resource):
def __init__(self):
self.reqparse = reqparse.RequestParser()
self.reqparse.add_argument('name', type=str, required=True, help="Beer Glass Name is required")
self.reqparse.add_argument('description', type=str)
self.reqparse.add_argument('capacity', type=float)
super(BeerGlassListApi, self).__init__()
@requires_auth
def get(self):
glass_list = []
for g in db.Query(BeerGlass):
glass_list.append(g)
return {'beer_glasses': map(lambda g: marshal(g, glass_fields), glass_list)}
@requires_auth
def post(self):
args = self.reqparse.parse_args()
g = BeerGlass.all(keys_only=True).filter('name', args['name']).get()
if g:
abort(409, message="Beer glass with name %s already exists" % args['name'])
g = BeerGlass(name=args['name'],
description=args.get('description'),
capacity=args.get('capacity'))
g.put()
return {'beer_glass': marshal(g, glass_fields)}
class BeerGlassApi(Resource):
def __init__(self):
self.reqparse = reqparse.RequestParser()
self.reqparse.add_argument('name', type=str)
self.reqparse.add_argument('description')
self.reqparse.add_argument('capacity')
super(BeerGlassApi, self).__init__()
@requires_auth
def get(self, id):
g = BeerGlass.get_by_id(id)
if not g:
abort(404)
return {'beer_glass': marshal(g, glass_fields)}
@requires_auth
def put(self, id):
args = self.reqparse.parse_args()
g = BeerGlass.get_by_id(id)
if not g:
abort(404)
u = {k: v for k, v in args.items() if v is not None}  # keep only supplied fields
update_model(g, u)
g.put()
return {'beer_glass': marshal(g, glass_fields)}
@requires_auth
def delete(self, id):
g = BeerGlass.get_by_id(id)
if g:
g.delete()
return {"beer_glass": marshal(g, glass_fields), 'action': 'deleted'}
abort(404)
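The PUT handler above only applies the fields the client actually sent: parsed arguments that were omitted come back as None and are dropped before update_model runs. That filtering step in isolation (the sample values are made up):

```python
# Keep only the arguments that were actually supplied (non-None),
# as the PUT handler does before updating the entity.
args = {'name': 'Pint', 'description': None, 'capacity': 0.568}
update = {k: v for k, v in args.items() if v is not None}
print(sorted(update))  # ['capacity', 'name']
```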
|
An open-source choreography language for developing correct adaptive distributed systems from a global viewpoint.
AIOCJ is an open-source choreography programming language for developing adaptive systems. Choreography languages describe in a single view the global interaction among all entities (roles) in a system. Systems specified in AIOCJ benefit from this approach and are also deadlock-free by construction. Moreover, AIOCJ choreographies adapt. Through scopes the developer can specify which fragments of the interaction can change with respect to applicable rules, provided by a compliant repository. Rules are specified as AIOCJ choreographies and after the adaptation the system remains deadlock-free. AIOCJ choreographies can also make use of functions provided by external services (e.g., as done here).
AIOCJ provides a projection function. AIOCJ choreographies are projected into a set of separate programs that enact the specified interaction and can be distributed. Rules are projected into a distributable standalone repository.
AIOCJ Presentation Paper: Dalla Preda, M., Giallorenzo, S., Lanese, I., Mauro, J., & Gabbrielli, M. (2014). AIOCJ: A choreographic framework for safe adaptive distributed applications . In Software Language Engineering (pp. 161-170). Springer International Publishing.
Foundational Theory: Dalla Preda, M., Gabbrielli, M., Giallorenzo, S., Lanese, I., & Mauro, J. (2015, June). Dynamic Choreographies. In Coordination Models and Languages (pp. 67-82). Springer International Publishing.
Comprehensive Journal Paper: Dalla Preda, M., Giallorenzo, S., Lanese, I., Mauro, J., & Gabbrielli, M. Dynamic Choreographies: Theory and Implementation. Accepted at Logical Methods in Computer Science.
AIOCJ Tutorial: Giallorenzo, S., Lanese, I., Mauro, J., & Gabbrielli, M. Programming Adaptive Microservice Applications: an AIOCJ Tutorial. Behavioural Types: from Theory to Tools.
Let us write a first example of an adaptable choreography in AIOCJ. u1 sends the message "Hello" to u2. The rule applies because it matches the property N.scope_name of the scope, and the result of the execution will display the message "Hello World!" to u2.
For further details, please check out the AIOCJ Syntax.
Building on the previous example, we add iteration of adaptation and conditions on environmental variables in rules (prefixed by E) to better explain how to write several rules and check their applicability at runtime. As in the previous example, u1 sends the message "Hello World" to u2. The whole choreography is enclosed in a while loop, controlled by u1.
msg@u1 = "Bonjour le Monde"
The last code block in the choreography (Lines 12-15) uses the parallel operator to show the msg to u2 and, at the same time, ask u1 whether to continue. In AIOCJ, brackets define sub-choreographies. Here we use this feature to specify that the code at Lines 12-15 executes only after u1 has sent the msg to u2 (Line 11). Without the brackets, the instruction at Line 14 would execute in parallel with the scope.
We define two rules (Lines 1-8 and 10-17, respectively). Both satisfy the condition on the non-functional property scope_name of the scope, but they differ in the condition on the environmental variable E.lang. The environment can change at runtime.
The current implementation of AIOCJ ships with a program called environment that simulates an environment which can be queried for environmental variables. This program is part of the adaptation middleware. Environmental variables can be set at runtime by interacting with the shell in which the service is running. Back to our example: if we set a new environmental variable lang to the value it, the first rule applies; likewise, if we set lang to fr, the second rule applies.
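The applicability check described above can be pictured as follows. This is an illustrative Python sketch of the semantics, not AIOCJ code, and the dictionary encoding of rules is invented for the example:

```python
# A rule applies when its conditions on the scope's non-functional
# properties (N.*) and on the environment variables (E.*) both hold.
def applicable(rule, scope_props, env):
    return (all(scope_props.get(k) == v for k, v in rule['N'].items())
            and all(env.get(k) == v for k, v in rule['E'].items()))

rules = [
    {'N': {'scope_name': 'greet'}, 'E': {'lang': 'it'}, 'msg': 'Ciao Mondo'},
    {'N': {'scope_name': 'greet'}, 'E': {'lang': 'fr'}, 'msg': 'Bonjour le Monde'},
]
env = {'lang': 'fr'}  # set at runtime via the environment program
chosen = [r['msg'] for r in rules
          if applicable(r, {'scope_name': 'greet'}, env)]
print(chosen)  # ['Bonjour le Monde']
```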
Itemis also offers a repository of Eclipse releases with Xtext already installed.
In Eclipse, click Help > Install New Software.... Deselect the check box Group items by category and enter the site address http://www.cs.unibo.it/projects/jolie/aiocj/ in the Work with combo box. The feature AIOC Language Plugin Feature should become visible in the list of available software.
Check the box for the plugin. Click Next for requirement collection, then Next again to confirm the install details. Select the radio button to accept the License Agreement terms and click Finish.
Eclipse will now install AIOCJ-ecl. At the end of the installation Eclipse will reboot and the plugin will be installed.
From now on, we will refer to the launched Eclipse instance as AIOCJ-ecl.
Example choreographies provided in the paper "AIOCJ: A Choreographic Framework for Safe Adaptive Distributed Applications"
Tutorial choreographies provided in the book chapter "Programming Adaptive Microservice Systems: an AIOCJ Tutorial"
Once an example is selected, let us project the AIOC first.
The projection creates a folder named "epp_aioc" in the project. In case Eclipse does not show it, refresh the view in the package explorer. Alternatively, you can open the project folder at its path with a file manager.
Similarly to the AIOC projection, projecting the rule(s) requires the creation of a new file.
project via "Jolie Endpoint Projection" button . The process creates a new folder named "epp_rules" containing the projected files.
To ease testing the projected choreography locally, the projection creates several batch files that open a shell for each component of the choreography. The main batch script choreography_launcher.sh takes care of launching all other batch scripts in the proper order, i.e., the adaptation manager, the environment, the external services, the adaptation server, and the roles of the choreography. The script can be launched from a shell with the command bash choreography_launcher.sh. We give a brief description of the sub-scripts launched by the main script to ease customisation and help to understand the structure of AIOCJ programs.
In the folder epp_aioc, the script aioc_launcher.sh starts the starter role of the choreography. It then waits for an input from the user to proceed with the execution of the remaining roles. Pressing [Enter] launches the last role, which begins the execution of the choreography.
AIOCJ is released under the GNU Lesser General Public License v2.1 and its sources are available on Github.
|
# Copyright 2014, Sandia Corporation. Under the terms of Contract
# DE-AC04-94AL85000 with Sandia Corporation, the U.S. Government retains certain
# rights in this software.
from __future__ import absolute_import
from __future__ import division
import cairo
import toyplot.cairo
import toyplot.svg
try:
import cStringIO as StringIO
except: # pragma: no cover
import StringIO
def render(canvas, fobj=None, width=None, height=None, scale=None):
"""Render the PNG bitmap representation of a canvas using Cairo.
By default, canvas dimensions in CSS pixels are mapped directly to pixels in
the output PNG image. Use one of `width`, `height`, or `scale` to override
this behavior.
Parameters
----------
canvas: :class:`toyplot.canvas.Canvas`
Canvas to be rendered.
fobj: file-like object or string, optional
The file to write. Use a string filepath to write data directly to disk.
If `None` (the default), the PNG data will be returned to the caller
instead.
width: number, optional
Specify the width of the output image in pixels.
height: number, optional
Specify the height of the output image in pixels.
scale: number, optional
Ratio of output image pixels to `canvas` pixels.
Returns
-------
png: PNG image data, or `None`
PNG representation of `canvas`, or `None` if the caller specifies the
`fobj` parameter.
"""
svg = toyplot.svg.render(canvas)
scale = canvas._pixel_scale(width=width, height=height, scale=scale)
surface = cairo.ImageSurface(
cairo.FORMAT_ARGB32, int(scale * canvas._width), int(scale * canvas._height))
context = cairo.Context(surface)
context.scale(scale, scale)
toyplot.cairo.render(svg, context)
if fobj is None:
buffer = StringIO.StringIO()
surface.write_to_png(buffer)
return buffer.getvalue()
else:
surface.write_to_png(fobj)
def render_frames(canvas, width=None, height=None, scale=None):
"""Render a canvas as a sequence of PNG images using Cairo.
By default, canvas dimensions in CSS pixels are mapped directly to pixels in
the output PNG images. Use one of `width`, `height`, or `scale` to override
this behavior.
Parameters
----------
canvas: :class:`toyplot.canvas.Canvas`
Canvas to be rendered.
width: number, optional
Specify the width of the output image in pixels.
height: number, optional
Specify the height of the output image in pixels.
scale: number, optional
Ratio of output image pixels to `canvas` pixels.
Returns
-------
frames: Python generator expression that returns each PNG image in the sequence.
The caller must iterate over the returned frames and is responsible for all
subsequent processing, including disk I/O, video compression, etc.
Examples
--------
>>> for frame, png in enumerate(render_frames(canvas)):
... open("frame-%s.png" % frame, "wb").write(png)
"""
svg, svg_animation = toyplot.svg.render(canvas, animation=True)
scale = canvas._pixel_scale(width=width, height=height, scale=scale)
for time, changes in sorted(svg_animation.items()):
toyplot.svg.apply_changes(svg, changes)
surface = cairo.ImageSurface(
cairo.FORMAT_ARGB32, int(scale * canvas._width), int(scale * canvas._height))
context = cairo.Context(surface)
context.scale(scale, scale)
toyplot.cairo.render(svg, context)
fobj = StringIO.StringIO()
surface.write_to_png(fobj)
yield fobj.getvalue()
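Both functions resolve the output resolution the same way through Canvas._pixel_scale. The mapping implied by the width/height/scale parameters can be sketched as follows; the internals of _pixel_scale are assumed here, not taken from its source:

```python
# Assumed resolution rule: width or height, if given, fix the scale
# relative to the canvas size; otherwise an explicit scale or 1.0 is used.
def pixel_scale(canvas_width, canvas_height, width=None, height=None, scale=None):
    if width is not None:
        return float(width) / canvas_width
    if height is not None:
        return float(height) / canvas_height
    return scale if scale is not None else 1.0

print(pixel_scale(600, 400, width=300))  # 0.5
print(pixel_scale(600, 400))             # 1.0
```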
|
Going on a bike ride around Reinfeld is truly one of the best ways to experience more of this area, although it’s often tough to know where to go. To solve that problem, we bring you the top 12 bike rides around Reinfeld — all you’ve got to do is pick the one that’s right for you.
If you are interested in routes for other sport types around Reinfeld, check out our guides on Hiking around Reinfeld, or Road Cycling Routes around Reinfeld.
|
# encoding: utf-8
# module ldb
# from /usr/lib/python2.7/dist-packages/ldb.so
# by generator 1.135
""" An interface to LDB, a LDAP-like API that can either to talk an embedded database (TDB-based) or a standards-compliant LDAP server. """
# no imports
# Variables with simple values
CHANGETYPE_ADD = 1
CHANGETYPE_DELETE = 2
CHANGETYPE_MODIFY = 3
CHANGETYPE_NONE = 0
ERR_ADMIN_LIMIT_EXCEEDED = 11
ERR_AFFECTS_MULTIPLE_DSAS = 71
ERR_ALIAS_DEREFERINCING_PROBLEM = 36
ERR_ALIAS_PROBLEM = 33
ERR_ATTRIBUTE_OR_VALUE_EXISTS = 20
ERR_AUTH_METHOD_NOT_SUPPORTED = 7
ERR_BUSY = 51
ERR_COMPARE_FALSE = 5
ERR_COMPARE_TRUE = 6
ERR_CONFIDENTIALITY_REQUIRED = 13
ERR_CONSTRAINT_VIOLATION = 19
ERR_ENTRY_ALREADY_EXISTS = 68
ERR_INAPPROPRIATE_AUTHENTICATION = 48
ERR_INAPPROPRIATE_MATCHING = 18
ERR_INSUFFICIENT_ACCESS_RIGHTS = 50
ERR_INVALID_ATTRIBUTE_SYNTAX = 21
ERR_INVALID_CREDENTIALS = 49
ERR_INVALID_DN_SYNTAX = 34
ERR_LOOP_DETECT = 54
ERR_NAMING_VIOLATION = 64
ERR_NOT_ALLOWED_ON_NON_LEAF = 66
ERR_NOT_ALLOWED_ON_RDN = 67
ERR_NO_SUCH_ATTRIBUTE = 16
ERR_NO_SUCH_OBJECT = 32
ERR_OBJECT_CLASS_MODS_PROHIBITED = 69
ERR_OBJECT_CLASS_VIOLATION = 65
ERR_OPERATIONS_ERROR = 1
ERR_OTHER = 80
ERR_PROTOCOL_ERROR = 2
ERR_REFERRAL = 10
ERR_SASL_BIND_IN_PROGRESS = 14
ERR_SIZE_LIMIT_EXCEEDED = 4
ERR_STRONG_AUTH_REQUIRED = 8
ERR_TIME_LIMIT_EXCEEDED = 3
ERR_UNAVAILABLE = 52
ERR_UNDEFINED_ATTRIBUTE_TYPE = 17
ERR_UNSUPPORTED_CRITICAL_EXTENSION = 12
ERR_UNWILLING_TO_PERFORM = 53
FLAG_MOD_ADD = 1
FLAG_MOD_DELETE = 3
FLAG_MOD_REPLACE = 2
FLG_NOMMAP = 8
FLG_NOSYNC = 2
FLG_RDONLY = 1
FLG_RECONNECT = 4
OID_COMPARATOR_AND = '1.2.840.113556.1.4.803'
OID_COMPARATOR_OR = '1.2.840.113556.1.4.804'
SCOPE_BASE = 0
SCOPE_DEFAULT = -1
SCOPE_ONELEVEL = 1
SCOPE_SUBTREE = 2
SEQ_HIGHEST_SEQ = 0
SEQ_HIGHEST_TIMESTAMP = 1
SEQ_NEXT = 2
SUCCESS = 0
SYNTAX_BOOLEAN = '1.3.6.1.4.1.1466.115.121.1.7'
SYNTAX_DIRECTORY_STRING = '1.3.6.1.4.1.1466.115.121.1.15'
SYNTAX_DN = '1.3.6.1.4.1.1466.115.121.1.12'
SYNTAX_INTEGER = '1.3.6.1.4.1.1466.115.121.1.27'
SYNTAX_OCTET_STRING = '1.3.6.1.4.1.1466.115.121.1.40'
SYNTAX_UTC_TIME = '1.3.6.1.4.1.1466.115.121.1.53'
__docformat__ = 'restructuredText'
__version__ = '1.1.17'
# functions
def binary_decode(string): # real signature unknown; restored from __doc__
"""
S.binary_decode(string) -> string
Perform a RFC2254 binary decode on a string
"""
return ""
def binary_encode(string): # real signature unknown; restored from __doc__
"""
S.binary_encode(string) -> string
Perform a RFC2254 binary encoding on a string
"""
return ""
def open(): # real signature unknown; restored from __doc__
"""
S.open() -> Ldb
Open a new LDB context.
"""
return Ldb
def register_module(module): # real signature unknown; restored from __doc__
"""
S.register_module(module) -> None
Register a LDB module.
"""
pass
def string_to_time(string): # real signature unknown; restored from __doc__
"""
S.string_to_time(string) -> int
Parse a LDAP time string into a UNIX timestamp.
"""
return 0
def timestring(p_int): # real signature unknown; restored from __doc__
"""
S.timestring(int) -> string
Generate a LDAP time string from a UNIX timestamp
"""
return ""
def valid_attr_name(name): # real signature unknown; restored from __doc__
"""
S.valid_attr_name(name) -> bool
Check whether the supplied name is a valid attribute name.
"""
return False
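timestring and string_to_time convert between UNIX timestamps and LDAP time strings. A stdlib-only approximation of that round trip, for illustration; the exact "YYYYMMDDHHMMSS.0Z" output format is assumed from LDB's behavior rather than taken from this stub:

```python
import calendar
import time

# Approximate ldb.timestring / ldb.string_to_time with the stdlib,
# using the assumed LDAP time format "YYYYMMDDHHMMSS.0Z" in UTC.
def ldap_timestring(ts):
    return time.strftime("%Y%m%d%H%M%S.0Z", time.gmtime(ts))

def ldap_string_to_time(s):
    return calendar.timegm(time.strptime(s.split('.')[0], "%Y%m%d%H%M%S"))

stamp = ldap_timestring(0)
print(stamp)                       # 19700101000000.0Z
print(ldap_string_to_time(stamp))  # 0
```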
# classes
from object import object
class Control(object):
""" LDB control. """
def __getattribute__(self, name): # real signature unknown; restored from __doc__
""" x.__getattribute__('name') <==> x.name """
pass
def __init__(self, *args, **kwargs): # real signature unknown
pass
@staticmethod # known case of __new__
def __new__(S, *more): # real signature unknown; restored from __doc__
""" T.__new__(S, ...) -> a new object with type S, a subtype of T """
pass
def __str__(self): # real signature unknown; restored from __doc__
""" x.__str__() <==> str(x) """
pass
critical = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
oid = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
from object import object
class Dn(object):
""" A LDB distinguished name. """
def add_base(self, dn): # real signature unknown; restored from __doc__
"""
S.add_base(dn) -> None
Add a base DN to this DN.
"""
pass
def add_child(self, dn): # real signature unknown; restored from __doc__
"""
S.add_child(dn) -> None
Add a child DN to this DN.
"""
pass
def canonical_ex_str(self): # real signature unknown; restored from __doc__
"""
S.canonical_ex_str() -> string
Canonical version of this DN (like a posix path, with terminating newline).
"""
return ""
def canonical_str(self): # real signature unknown; restored from __doc__
"""
S.canonical_str() -> string
Canonical version of this DN (like a posix path).
"""
return ""
def check_special(self, name): # real signature unknown; restored from __doc__
"""
S.check_special(name) -> bool
Check if name is a special DN name
"""
return False
def extended_str(self, mode=1): # real signature unknown; restored from __doc__
"""
S.extended_str(mode=1) -> string
Extended version of this DN
"""
return ""
def get_casefold(self, *args, **kwargs): # real signature unknown
pass
def get_component_name(self, num): # real signature unknown; restored from __doc__
"""
S.get_component_name(num) -> string
get the attribute name of the specified component
"""
return ""
def get_component_value(self, num): # real signature unknown; restored from __doc__
"""
S.get_component_value(num) -> string
get the attribute value of the specified component as a binary string
"""
return ""
def get_extended_component(self, name): # real signature unknown; restored from __doc__
"""
S.get_extended_component(name) -> string
returns a DN extended component as a binary string
"""
return ""
def get_linearized(self, *args, **kwargs): # real signature unknown
pass
def get_rdn_name(self): # real signature unknown; restored from __doc__
"""
S.get_rdn_name() -> string
get the RDN attribute name
"""
return ""
def get_rdn_value(self): # real signature unknown; restored from __doc__
"""
S.get_rdn_value() -> string
get the RDN attribute value as a binary string
"""
return ""
def is_child_of(self, basedn): # real signature unknown; restored from __doc__
"""
S.is_child_of(basedn) -> int
Returns True if this DN is a child of basedn
"""
return 0
def is_null(self, *args, **kwargs): # real signature unknown
""" Check whether this is a null DN. """
pass
def is_special(self): # real signature unknown; restored from __doc__
"""
S.is_special() -> bool
Check whether this is a special LDB DN.
"""
return False
def is_valid(self): # real signature unknown; restored from __doc__
""" S.is_valid() -> bool """
return False
def parent(self): # real signature unknown; restored from __doc__
"""
S.parent() -> dn
Get the parent for this DN.
"""
pass
def remove_base_components(self, p_int): # real signature unknown; restored from __doc__
"""
S.remove_base_components(int) -> bool
Remove a number of DN components from the base of this DN.
"""
return False
def set_component(self, *args, **kwargs): # real signature unknown
"""
S.set_component(num, name, value) -> None
set the attribute name and value of the specified component
"""
pass
def set_extended_component(self, name, value): # real signature unknown; restored from __doc__
"""
S.set_extended_component(name, value) -> None
set a DN extended component as a binary string
"""
pass
def validate(self): # real signature unknown; restored from __doc__
"""
S.validate() -> bool
Validate DN is correct.
"""
return False
def __add__(self, y): # real signature unknown; restored from __doc__
""" x.__add__(y) <==> x+y """
pass
def __cmp__(self, y): # real signature unknown; restored from __doc__
""" x.__cmp__(y) <==> cmp(x,y) """
pass
def __init__(self, *args, **kwargs): # real signature unknown
pass
def __len__(self): # real signature unknown; restored from __doc__
""" x.__len__() <==> len(x) """
pass
@staticmethod # known case of __new__
def __new__(S, *more): # real signature unknown; restored from __doc__
""" T.__new__(S, ...) -> a new object with type S, a subtype of T """
pass
def __repr__(self): # real signature unknown; restored from __doc__
""" x.__repr__() <==> repr(x) """
pass
def __str__(self): # real signature unknown; restored from __doc__
""" x.__str__() <==> str(x) """
pass
from object import object
class Ldb(object):
""" Connection to a LDB database. """
def add(self, message, controls=None): # real signature unknown; restored from __doc__
"""
S.add(message, controls=None) -> None
Add an entry.
"""
pass
def connect(self, url, flags=0, options=None): # real signature unknown; restored from __doc__
"""
S.connect(url, flags=0, options=None) -> None
Connect to a LDB URL.
"""
pass
def delete(self, dn, controls=None): # real signature unknown; restored from __doc__
"""
S.delete(dn, controls=None) -> None
Remove an entry.
"""
pass
def get_config_basedn(self, *args, **kwargs): # real signature unknown
pass
def get_default_basedn(self, *args, **kwargs): # real signature unknown
pass
def get_opaque(self, name): # real signature unknown; restored from __doc__
"""
S.get_opaque(name) -> value
Get an opaque value set on this LDB connection.
:note: The returned value may not be useful in Python.
"""
pass
def get_root_basedn(self, *args, **kwargs): # real signature unknown
pass
def get_schema_basedn(self, *args, **kwargs): # real signature unknown
pass
def modify(self, message, controls=None, validate=False): # real signature unknown; restored from __doc__
"""
S.modify(message, controls=None, validate=False) -> None
Modify an entry.
"""
pass
def modules(self): # real signature unknown; restored from __doc__
"""
S.modules() -> list
Return the list of modules on this LDB connection
"""
return []
def msg_diff(self, Message): # real signature unknown; restored from __doc__
"""
S.msg_diff(Message) -> Message
Return an LDB Message of the difference between two Message objects.
"""
return Message
def parse_ldif(self, ldif): # real signature unknown; restored from __doc__
"""
S.parse_ldif(ldif) -> iter(messages)
Parse a string formatted using LDIF.
"""
pass
def rename(self, old_dn, new_dn, controls=None): # real signature unknown; restored from __doc__
"""
S.rename(old_dn, new_dn, controls=None) -> None
Rename an entry.
"""
pass
def schema_attribute_add(self, *args, **kwargs): # real signature unknown
pass
def schema_attribute_remove(self, *args, **kwargs): # real signature unknown
pass
def schema_format_value(self, *args, **kwargs): # real signature unknown
pass
def search(self, base=None, scope=None, expression=None, attrs=None, controls=None): # real signature unknown; restored from __doc__
"""
S.search(base=None, scope=None, expression=None, attrs=None, controls=None) -> msgs
Search in a database.
:param base: Optional base DN to search
:param scope: Search scope (SCOPE_BASE, SCOPE_ONELEVEL or SCOPE_SUBTREE)
:param expression: Optional search expression
:param attrs: Attributes to return (defaults to all)
:param controls: Optional list of controls
:return: Iterator over Message objects
"""
pass
def sequence_number(self, type): # real signature unknown; restored from __doc__
"""
S.sequence_number(type) -> value
Return the value of the sequence according to the requested type
"""
pass
def setup_wellknown_attributes(self, *args, **kwargs): # real signature unknown
pass
def set_create_perms(self, mode): # real signature unknown; restored from __doc__
"""
S.set_create_perms(mode) -> None
Set mode to use when creating new LDB files.
"""
pass
def set_debug(self, callback): # real signature unknown; restored from __doc__
"""
S.set_debug(callback) -> None
Set callback for LDB debug messages.
The callback should accept a debug level and debug text.
"""
pass
def set_modules_dir(self, path): # real signature unknown; restored from __doc__
"""
S.set_modules_dir(path) -> None
Set path LDB should search for modules
"""
pass
def set_opaque(self, name, value): # real signature unknown; restored from __doc__
"""
S.set_opaque(name, value) -> None
Set an opaque value on this LDB connection.
:note: Passing incorrect values may cause crashes.
"""
pass
def transaction_cancel(self): # real signature unknown; restored from __doc__
"""
S.transaction_cancel() -> None
cancel a new transaction.
"""
pass
def transaction_commit(self): # real signature unknown; restored from __doc__
"""
S.transaction_commit() -> None
commit a new transaction.
"""
pass
def transaction_prepare_commit(self): # real signature unknown; restored from __doc__
"""
S.transaction_prepare_commit() -> None
prepare to commit a new transaction (2-stage commit).
"""
pass
def transaction_start(self): # real signature unknown; restored from __doc__
"""
S.transaction_start() -> None
Start a new transaction.
"""
pass
def write_ldif(self, message, changetype): # real signature unknown; restored from __doc__
"""
S.write_ldif(message, changetype) -> ldif
Print the message as a string formatted using LDIF.
"""
pass
def __contains__(self, y): # real signature unknown; restored from __doc__
""" x.__contains__(y) <==> y in x """
pass
def __getattribute__(self, name): # real signature unknown; restored from __doc__
""" x.__getattribute__('name') <==> x.name """
pass
def __init__(self, *args, **kwargs): # real signature unknown
pass
@staticmethod # known case of __new__
def __new__(S, *more): # real signature unknown; restored from __doc__
""" T.__new__(S, ...) -> a new object with type S, a subtype of T """
pass
def __repr__(self): # real signature unknown; restored from __doc__
""" x.__repr__() <==> repr(x) """
pass
firstmodule = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
from Exception import Exception
class LdbError(Exception):
# no doc
def __init__(self, *args, **kwargs): # real signature unknown
pass
__weakref__ = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
"""list of weak references to the object (if defined)"""
from object import object
class Message(object):
""" A LDB Message """
def add(self, *args, **kwargs): # real signature unknown
"""
S.append(element)
Add an element to this message.
"""
pass
def elements(self, *args, **kwargs): # real signature unknown
pass
@classmethod
def from_dict(cls, ldb, dict, mod_flag=None): # real signature unknown; restored from __doc__
"""
Message.from_dict(ldb, dict, mod_flag=FLAG_MOD_REPLACE) -> ldb.Message
Class method to create ldb.Message object from Dictionary.
mod_flag is one of FLAG_MOD_ADD, FLAG_MOD_REPLACE or FLAG_MOD_DELETE.
"""
pass
def get(self, name, default=None, idx=None): # real signature unknown; restored from __doc__
"""
msg.get(name,default=None,idx=None) -> string
idx is the index into the values array
if idx is None, then a list is returned
if idx is not None, then the element with that index is returned
if you pass the special name 'dn' then the DN object is returned
"""
return ""
def items(self, *args, **kwargs): # real signature unknown
pass
def keys(self): # real signature unknown; restored from __doc__
"""
S.keys() -> list
Return sequence of all attribute names.
"""
return []
def remove(self, name): # real signature unknown; restored from __doc__
"""
S.remove(name)
Remove all entries for attributes with the specified name.
"""
pass
def __cmp__(self, y): # real signature unknown; restored from __doc__
""" x.__cmp__(y) <==> cmp(x,y) """
pass
def __delitem__(self, y): # real signature unknown; restored from __doc__
""" x.__delitem__(y) <==> del x[y] """
pass
def __getitem__(self, y): # real signature unknown; restored from __doc__
""" x.__getitem__(y) <==> x[y] """
pass
def __init__(self, *args, **kwargs): # real signature unknown
pass
def __iter__(self): # real signature unknown; restored from __doc__
""" x.__iter__() <==> iter(x) """
pass
def __len__(self): # real signature unknown; restored from __doc__
""" x.__len__() <==> len(x) """
pass
@staticmethod # known case of __new__
def __new__(S, *more): # real signature unknown; restored from __doc__
""" T.__new__(S, ...) -> a new object with type S, a subtype of T """
pass
def __repr__(self): # real signature unknown; restored from __doc__
""" x.__repr__() <==> repr(x) """
pass
def __setitem__(self, i, y): # real signature unknown; restored from __doc__
""" x.__setitem__(i, y) <==> x[i]=y """
pass
dn = property(lambda self: object(), lambda self, v: None, lambda self: None) # default
from object import object
class MessageElement(object):
""" An element of a Message """
def flags(self, *args, **kwargs): # real signature unknown
pass
def get(self, *args, **kwargs): # real signature unknown
pass
def set_flags(self, *args, **kwargs): # real signature unknown
pass
def __cmp__(self, y): # real signature unknown; restored from __doc__
""" x.__cmp__(y) <==> cmp(x,y) """
pass
def __getitem__(self, y): # real signature unknown; restored from __doc__
""" x.__getitem__(y) <==> x[y] """
pass
def __init__(self, *args, **kwargs): # real signature unknown
pass
def __iter__(self): # real signature unknown; restored from __doc__
""" x.__iter__() <==> iter(x) """
pass
def __len__(self): # real signature unknown; restored from __doc__
""" x.__len__() <==> len(x) """
pass
@staticmethod # known case of __new__
def __new__(S, *more): # real signature unknown; restored from __doc__
""" T.__new__(S, ...) -> a new object with type S, a subtype of T """
pass
def __repr__(self): # real signature unknown; restored from __doc__
""" x.__repr__() <==> repr(x) """
pass
def __str__(self): # real signature unknown; restored from __doc__
""" x.__str__() <==> str(x) """
pass
from object import object
class Module(object):
""" LDB module (extension) """
def add(self, *args, **kwargs): # real signature unknown
pass
def delete(self, *args, **kwargs): # real signature unknown
pass
def del_transaction(self, *args, **kwargs): # real signature unknown
pass
def end_transaction(self, *args, **kwargs): # real signature unknown
pass
def modify(self, *args, **kwargs): # real signature unknown
pass
def rename(self, *args, **kwargs): # real signature unknown
pass
def search(self, *args, **kwargs): # real signature unknown
pass
def start_transaction(self, *args, **kwargs): # real signature unknown
pass
def __init__(self, *args, **kwargs): # real signature unknown
pass
def __repr__(self): # real signature unknown; restored from __doc__
""" x.__repr__() <==> repr(x) """
pass
def __str__(self): # real signature unknown; restored from __doc__
""" x.__str__() <==> str(x) """
pass
from object import object
class Tree(object):
""" A search tree """
def __init__(self, *args, **kwargs): # real signature unknown
pass
|
William Logan, Our Savage Art: Poetry and the Civil Tongue, Columbia University Press, 2009, 346 pgs. Disclosure 1: I haven’t read much Logan previously, though I know he is notorious for poking holes in inflated poetic reputations; Disclosure 2: I don’t read a lot of poetry criticism because most of it is ego-stroking blather aimed to curry favor with the poet reviewed; Disclosure 3: I wanted to read this book because Logan includes two essays on later novels by Thomas Pynchon: Mason & Dixon and Against the Day, respectively.
“Well-meaning, often charming, sincere as a traffic sign, he has all the gifts that education and rationality can provide; but you never feel he’s actually moved to write.” Logan gives us all the “good” qualities of Pinsky while making them seem inadequate for poetry, where the important thing is the passion or feeling that compels composition. While it would be naïve to suggest that one writes out of emotion (even Wordsworth said poems were based on “emotion recollected in tranquility”), we still have to concede that a problem with Pinsky is how controlled and deliberate it all is. No rapture, no divine afflatus.
Or, on Ashbery: “when you read Ashbery you have to forget much of what you know about reading poetry. You have to take satisfaction where pleasures are rarely given and never let yourself wish for what isn’t there. (There’s so much that isn’t there.)” While I can’t imagine someone saying that “pleasures are rarely given” in Ashbery’s poems (if there’s any poet writing today who seems to live by Stevens’ dictum “It Must Give Pleasure,” that poet must be Ashbery), I do concede that the kinds of pleasures Logan means may well be rare in Ashbery—“so much that isn’t there” (echoing, it seems to me, Stevens’ “the nothing that is not there and the nothing that is”) calls to mind the things other poets do that readers of such poets seem to like. Which is to say that the pleasures of Ashbery are the pleasures of Ashbery; you’ll never confuse him with Lowell, or Auden, or Larkin, or any of the other poets that Logan uses as a measuring rod.
Crane was no innovative genius like Whitman; he was perhaps closer to a peasant poet like John Clare, an outsider too susceptible to praise and other vices of the city. Defensive about his lack of education, a Midwestern striver out of a Sinclair Lewis novel, Crane tried to make it among the big-city literary men, a rum in one hand and a copy of The Waste Land in the other. Had beauty been enough, he might even have succeeded.
This review apparently brought down much complaint upon Logan’s head, for it’s the only review here that is followed by a response to critics of his criticism. The objections to what Logan has to say about Crane are easily imaginable—is it worth mentioning that someone is “no Whitman”? But what’s instructive here is how Logan makes Crane critique-able. He raises an issue that is often lost sight of when we try to appraise those (seemingly) secure in the canon: how much of what they did is truly remarkable, how much of it achieved what was intended? Logan’s assessment of Crane—that he was too ambitious for his abilities, that he was out of his league with his intentions, that he was a writer of gorgeous lines rather than completely satisfying poems—is accurate, as far as it goes. And that’s far enough to offset the outrageous claims for Crane as one who achieved more than he did. But, though I’m sympathetic to Logan’s effort to be even-handed here (and entertaining—that rum and Eliot remark is funny but also sadly true: you don’t become the next Eliot by worrying so much about the current Eliot, and drinking to escape your inadequacies), I also find his appraisal to be ungenerous, not simply to Crane, but to the value of beauty in poetry. No, it’s not enough simply to be gorgeous, but Crane, arguably, is never simply gorgeous—the beauty he courts comes, when it does, at considerable risk, costing, it may be, “not less than everything”—including the kind of sense that Logan would like more of.
And it’s here that I can say I grew tired of Logan when read at such length. If we find it hard to imagine Pinsky being moved to write, we also find it hard to imagine Logan ever being transported by the pleasures of poetry, or simply overwhelmed by beauty. Logan is Lowellian, it seems, and that puts him off to the side of the leading taste of our day, I’d say, but I share his admiration for Life Studies and feel it’s the rare poet who can achieve as much as Lowell does in such deceptively simple diction. But the chaos and crazed ambition that lurk everywhere in Lowell’s work inspire, it seems to me, a bit more acceptance of a poet like Crane who wrestles with many of the same problems—a Lowell who never got from Lord Weary to Life Studies, let’s say. Logan, as a critic, is too much enamored of his Johnsonian parallels—reading Logan’s criticism at length makes one feel trapped in an apothegm factory—and too little concerned with poems as affective experience (which requires, I’d suggest, assuming a bit more of what the poem assumes).
But, that said, Logan is to be praised for doing what he does with such aplomb, wit, and succinctness. The book opens with a reflective essay on his work as a critic, “The Bowl of Diogenes; or, The End of Criticism,” where Logan claims that the critic’s “besetting vice is generosity,” so I suppose it’s pointless to rebuke him for showing too little vice, and the essay is valuable for showing what Logan thinks of criticism, which he seems to regard as largely a necessary vice. How else to decide what is worth our time? We can’t read everything, so we look to critics to give us some idea of what we’re missing, maybe making claims that send us to things we’d otherwise avoid or convincing us to avoid something we’d otherwise waste time with.
In an interview included here, Logan, a poet, modestly refuses to claim company with grander poet/critics (such as Eliot and Jarrell), and that seems more than fitting. Logan, as critic, has the assured and captious tone of the entertaining friend one values for his ability to find fault with disarming confidence. One rarely feels antagonized by his pronouncements, and even more rarely does one feel challenged to delve more deeply into his meaning. His is the strength of the surface assessment; it’s often enough for him to quote a few damning lines of a lackluster poem to convince us that poetry is often simply the name for willful idiosyncrasy in writing, but the effect is more like punching buttons on a radio to see if one catches a sound that will make one stay and listen. Logan gives us a pretty good idea of what he’s hearing, but apparently doesn’t feel he has to bother to spell out what he’s listening for—which Eliot and Jarrell were not so reticent about.
And what about the Pynchon reviews? I was pleased to find that Logan admires the audacious pleasures of Pynchon’s style, though as critic he also has to provide a caveat (on Mason & Dixon): “This intensity of imagery, this continual and immodest word-by-word invention, ruptures the plain understandings most fiction now requires.” And this assessment comes fully informed by the challenges even a sympathetic reader of Pynchon is apt to find: “Joyce and Proust offered character in lieu of plot, and many novelists substitute plot in lieu of character. It’s difficult for a novel, even a novel everywhere touched by brilliance, to offer so little of either.” And Logan is even less accepting of Against the Day (as were most). The point, we might say, contra this judgment, is that a writer like Pynchon wants us to get out of the habit of thinking in terms of plot and character as the mainstays of what the fictive reading experience offers, and I would like to think that dedicated readers of Pynchon have done so. And yet there is much justice in Logan’s assessment, but, as is often the case when one tries to hold the willfully slippery still long enough to deliver one’s plodding objection, his criticism boils down to wanting Pynchon to stop goofing around and simply give us the story.
Pynchon may have conceived Mason & Dixon as a supreme fiction, a poetic act freed of the slavery of plot and character; but conventions are cruel to those who betray them. As his stand-up comedy becomes merely a seven-hundred-page improvisation, the jokes grow hollow as the Earth itself. Here Pynchon’s poetics have seduced him: it hardly matters if most poems mean what they say. Poetry is the saying, but fiction (the drama, the action, the consequence, the regret) is the having said.
As a statement this can’t be argued against (except that M&D is the one Pynchon novel where “the regret” becomes palpable in the character of Mason). But IF M&D is a seven-hundred-page improv, then it’s all about the jokes and that might well grow tedious, but what’s at issue is what Pynchon is joking about (the thematics of the work) and part of what he’s joking about are the very conventions that, to Logan’s mind, he has “betrayed.” But is mocking, lampooning, satirizing, tickling, poking, needling, and slapping in the face with a custard pie the same as “betraying”? And, while it may sound wonderfully Johnsonian to say “poetry is saying and fiction the having said,” it only makes sense to the degree that poetry is a form valued for its immediacy and fiction a form valued for its ability to impose order on what has occurred. But poetry’s order and fiction’s order are likewise impositions, the more so when convention becomes determinate for what can be said or shown. Logan wants more matter, less art, and certainly understands that Pynchon writes from a perspective in which that distinction becomes indistinct. No one can fault a critic for saying “something too much of this,” and Logan earns respect for reading Pynchon carefully; if at times he sounds like a school teacher trying to hold his most irreverent student to the standard of his “best students,” so be it.
Again, there is no deficiency in Logan’s position, it simply isn’t one that best serves the work under discussion. If poetry is the saying, and fiction the having said, I suppose that criticism is having one’s say. If not always saying much, Logan’s say is always well-said, and that’s saying something.
|
from FireGirlOptimizer import *
from FireGirlStats import *
# Keep track of file numbers so they don't repeat
server_file_counter = 0
def file_number_str():
    """Return a fresh file-number string so generated image names never repeat."""
    global server_file_counter
    server_file_counter += 1
    return str(server_file_counter)
def initialize():
"""
Return the initialization object for the FireGirl domain.
"""
return {
"reward": [
{"name": "Discount",
"description":"The per-year discount",
"current_value": 1, "max": 1, "min": 0, "units": "~"},
{"name": "Suppression Fixed Cost",
"description":"cost per day of suppression",
"current_value": 500, "max": 999999, "min": 0, "units": "$"},
{"name": "Suppression Variable Cost",
"description":"cost per hectare of suppression",
"current_value": 500, "max": 999999, "min": 0, "units": "$"}
],
"transition": [
{"name": "Years to simulate",
"description": "how far to look into the future",
"current_value": 10, "max": 150, "min": 0, "units": "Y"},
{"name": "Futures to simulate",
"description": "how many stochastic futures to generate",
"current_value": 25, "max": 1000, "min": 0, "units": "#"},
{"name": "Landscape Size",
"description": "how many cells wide and tall should the landscape be. Min:9, Max:129",
"current_value": 21, "max": 129, "min": 9, "units": "#"},
{"name": "Harvest Percent",
"description": "timber harvest rate as a percent of annual increment",
"current_value": 0.95, "max": 1, "min": 0, "units": "%"},
{"name": "Minimum Timber Value",
"description":"the minimum timber value required before harvest is allowed",
"current_value": 50, "max":9999, "min": 0, "units": "$"},
{"name": "Slash Remaining",
"description": "the amount of fuel load (slash) left after a harvest",
"current_value": 10, "max":9999, "min": 0, "units": "#"},
{"name": "Fuel Accumulation",
"description": "the amount of fuel load that accumulates each year",
"current_value": 2, "max":9999, "min": 0, "units": "#"},
{"name": "Suppression Effect",
"description": "the reduction in fire spread rate as the result of suppression",
"current_value": 0.5, "max":1, "min": 0, "units": "%"},
{"name": "Use Original Bugs",
"description": "set to 0 to use original bugs. 1 (or non-zero) to use the patches.",
"current_value": 0, "max":1, "min": 0, "units": "~"},
{"name": "Growth Model",
"description": "set to 1 to use original model; or 2 for updated model.",
"current_value": 1, "max":2, "min": 1, "units": "~"}
],
"policy": [
{"name": "Constant",
"description":"for the intercept",
"current_value": 0, "max": 10, "min":-10, "units": ""},
{"name": "Date",
"description":"for each day of the year",
"current_value": 0, "max": 10, "min":-10, "units": ""},
{"name": "Days Left",
"description":"for each day left in the year",
"current_value": 0, "max": 10, "min":-10, "units": ""},
{"name":"Temperature",
"description":"for air temperature at the time of an ignition",
"current_value": 0, "max": 10, "min":-10, "units": ""},
{"name": "Wind Speed",
"description":"for wind speed at the time of an ignition",
"current_value": 0, "max": 10, "min":-10, "units": ""},
{"name": "Timber Value",
"description":"for the timber value at an ignition location",
"current_value": 0, "max": 10, "min":-10, "units": ""},
{"name": "Timber Value 8",
"description":"for the average timber value in the 8 neighboring stands",
"current_value": 0, "max": 10, "min":-10, "units": ""},
{"name": "Timber Value 24",
"description":"for the average timber value in the 24 neighboring stands",
"current_value": 0, "max": 10, "min":-10, "units": ""},
{"name": "Fuel Load",
"description":"for the fuel load at an ignition location",
"current_value": 0, "max": 10, "min":-10, "units": ""},
{"name": "Fuel Load 8",
"description":"for the average fuel load in the 8 neighboring stands",
"current_value": 0, "max": 10, "min":-10, "units": ""},
{"name": "Fuel Load 24",
"description":"for the average fuel load in the 24 neighboring stands",
"current_value": 0, "max": 10, "min":-10, "units": ""}
]
}
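The parameter sections returned by `initialize()` are lists of descriptor dicts, while `optimize()`, `rollouts()`, and `state()` read flat name-to-value mappings out of the query. A minimal sketch of the flattening a client might do (the `current_values` helper is hypothetical, and only a reduced subset of the structure is reproduced here):

```python
# Hypothetical helper, not part of the FireGirl API: flatten one section of
# the initialize() structure into the {name: current_value} mapping that the
# query dicts use.
def current_values(section):
    return {p["name"]: p["current_value"] for p in section}

# A reduced copy of the "transition" section, for illustration only.
init_transition = [
    {"name": "Years to simulate", "current_value": 10, "max": 150, "min": 0, "units": "Y"},
    {"name": "Futures to simulate", "current_value": 25, "max": 1000, "min": 0, "units": "#"},
]

query_transition = current_values(init_transition)
# query_transition == {"Years to simulate": 10, "Futures to simulate": 25}
```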
def optimize(query):
"""
Return a newly optimized query.
"""
dict_reward = query["reward"]
dict_transition = query["transition"]
dict_policy = query["policy"]
#some variables
#pathway_count = 5 #how many pathways to use in the optimization
#years = 5 #how many years to simulate for each pathway
pathway_count = dict_transition["Futures to simulate"]
years = dict_transition["Years to simulate"]
#creating optimization objects
opt = FireGirlPolicyOptimizer()
#giving the simulation parameters to opt, so that it can pass
# them on to its pathways as it creates them
opt.setFireGirlModelParameters(dict_transition, dict_reward)
#setting policy as well
#TODO make this robust to FireWoman policies
pol = FireGirlPolicy()
pol.setParams([dict_policy["Constant"],
dict_policy["Date"],
dict_policy["Days Left"],
dict_policy["Temperature"],
dict_policy["Wind Speed"],
dict_policy["Timber Value"],
dict_policy["Timber Value 8"],
dict_policy["Timber Value 24"],
dict_policy["Fuel Load"],
dict_policy["Fuel Load 8"],
dict_policy["Fuel Load 24"],
])
#assigning the policy to opt, so that it can use it in simulations.
opt.setPolicy(pol)
#creating pathways
opt.createFireGirlPathways(int(pathway_count),int(years))
#set desired objective function
if "Objective Function" in dict_transition.keys():
opt.setObjFn(dict_transition["Objective Function"])
#doing one round of optimization
opt.optimizePolicy()
#pulling the policy variables back out
learned_params = opt.Policy.getParams()
#TODO make this robust to FireWoman policies
dict_new_pol = {}
dict_new_pol["Constant"] = learned_params[0]
dict_new_pol["Date"] = learned_params[1]
dict_new_pol["Days Left"] = learned_params[2]
dict_new_pol["Temperature"] = learned_params[3]
dict_new_pol["Wind Speed"] = learned_params[4]
dict_new_pol["Timber Value"] = learned_params[5]
dict_new_pol["Timber Value 8"] = learned_params[6]
dict_new_pol["Timber Value 24"] = learned_params[7]
dict_new_pol["Fuel Load"] = learned_params[8]
dict_new_pol["Fuel Load 8"] = learned_params[9]
dict_new_pol["Fuel Load 24"] = learned_params[10]
return dict_new_pol
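The eleven-feature ordering passed to `pol.setParams()` above, and unpacked again from `getParams()`, is written out verbatim three times in this module. A sketch of how it could be centralized (the `POLICY_FEATURES` constant and both helpers are hypothetical, not part of FireGirl):

```python
# Hypothetical helpers, not part of the FireGirl API: keep the feature
# ordering used by FireGirlPolicy.setParams()/getParams() in one place.
POLICY_FEATURES = [
    "Constant", "Date", "Days Left", "Temperature", "Wind Speed",
    "Timber Value", "Timber Value 8", "Timber Value 24",
    "Fuel Load", "Fuel Load 8", "Fuel Load 24",
]

def policy_dict_to_params(dict_policy):
    # same order as the literal list passed to pol.setParams() above
    return [dict_policy[name] for name in POLICY_FEATURES]

def params_to_policy_dict(params):
    # inverse mapping, as used when returning the learned policy
    return dict(zip(POLICY_FEATURES, params))
```

With these, each call site becomes `pol.setParams(policy_dict_to_params(dict_policy))`, and the tail of `optimize()` reduces to `return params_to_policy_dict(learned_params)`.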
def rollouts(query):
"""
Return a set of rollouts for the given parameters.
"""
dict_reward = query["reward"]
dict_transition = query["transition"]
dict_policy = query["policy"]
pathway_count = int(dict_transition["Futures to simulate"])
years = int(dict_transition["Years to simulate"])
start_ID = 0
#generate 100 rollouts
opt = FireGirlPolicyOptimizer()
opt.setObjFn("J1")
#opt.setObjFn("J2")
opt.SILENT = True
#setting policy...
#This is brittle, and will not work directly with FireWoman data... or with future versions
# of FireGirl if new features get added...
pol = FireGirlPolicy()
pol.setParams([dict_policy["Constant"],
dict_policy["Date"],
dict_policy["Days Left"],
dict_policy["Temperature"],
dict_policy["Wind Speed"],
dict_policy["Timber Value"],
dict_policy["Timber Value 8"],
dict_policy["Timber Value 24"],
dict_policy["Fuel Load"],
dict_policy["Fuel Load 8"],
dict_policy["Fuel Load 24"],
])
#setting the policy in the optimizer, which will pass it to each created pathway
opt.setPolicy(pol)
#giving the optimizer custom model parameters
opt.setFireGirlModelParameters(dict_transition,dict_reward)
#creating landscapes. The function will enforce the custom model parameters
opt.createFireGirlPathways(pathway_count,years,start_ID)
#outermost list to collect one sub-list for each pathway, etc...
return_list = []
#parse the data needed...
for pw in opt.pathway_set:
#new ignition events list for this pathway
year_values = []
for ign in pw.ignition_events:
#get the dictionary representation of the ignition
features = ign.getDictionary()
#fill the total's dictionary
features["Harvest Value"] = pw.getHarvest(ign.year)
#features["Suppression Cost"] = pw.getSuppressionCost(ign.year) #already reported in ign.getDictionary()
features["Growth"] = pw.getGrowth(ign.year)
#TODO - Fix for Discount Rate
features["Discounted Reward"] = features["Harvest Value"] - features["Suppression Cost"]
features["Event Number"] = ign.year
#NOTE: This will be the same number for all ignitions in this pathway. It's the
# id number that a pathway uses to instantiate its random seed
features["Pathway Number"] = pw.ID_number
#adding cumulative measurements, from the start, up to this year
features["Cumulative Harvest Value"] = pw.getHarvestFrom(0, ign.year)
features["Cumulative Growth"] = pw.getGrowthFrom(0, ign.year)
features["Cumulative Timber Loss"] = pw.getTimberLossFrom(0, ign.year)
features["Cumulative Suppression Cost"] = pw.getSuppressionFrom(0, ign.year)
#add this ignition event + year details to this pathway's list of dictionaries
year_values.append(features)
#the events list for this pathway has been filled, so add it to the return list
return_list.append(year_values)
#done with all pathways
return return_list
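Each element of the list returned by `rollouts()` is itself a list of per-ignition feature dicts. A sketch of a downstream summary a client might compute, mean discounted reward per pathway (the helper name and the sample data are made up for illustration; only the "Discounted Reward" key is taken from the structure built above):

```python
# Hypothetical client-side summary, not part of the FireGirl API: average
# the "Discounted Reward" entries within each pathway's event list.
def mean_reward_per_pathway(rollout_list):
    return [
        sum(ev["Discounted Reward"] for ev in pathway) / len(pathway)
        for pathway in rollout_list
    ]

# Made-up data in the same nested shape as return_list above.
sample = [
    [{"Discounted Reward": 100.0}, {"Discounted Reward": 300.0}],
    [{"Discounted Reward": -50.0}],
]
# mean_reward_per_pathway(sample) -> [200.0, -50.0]
```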
def state(query):
"""
Return a series of images up to the requested event number.
"""
event_number = int(query["Event Number"])
pathway_number = int(query["Pathway Number"])
dict_reward = query["reward"]
dict_transition = query["transition"]
dict_policy = query["policy"]
show_count = 50
step = 1
if "Past Events to Show" in query.keys():
show_count = 1 + int(query["Past Events to Show"])
if "Past Events to Step Over" in query.keys():
step = 1 + int(query["Past Events to Step Over"])
#sanitizing
if step < 1: step = 1
if show_count < 1: show_count = 1
#creating optimization objects
opt = FireGirlPolicyOptimizer()
#giving the simulation parameters to opt, so that it can pass
# them on to its pathways as it creates them
opt.setFireGirlModelParameters(dict_transition, dict_reward)
#setting policy as well
#TODO make this robust to FireWoman policies
pol = FireGirlPolicy()
pol.setParams([dict_policy["Constant"],
dict_policy["Date"],
dict_policy["Days Left"],
dict_policy["Temperature"],
dict_policy["Wind Speed"],
dict_policy["Timber Value"],
dict_policy["Timber Value 8"],
dict_policy["Timber Value 24"],
dict_policy["Fuel Load"],
dict_policy["Fuel Load 8"],
dict_policy["Fuel Load 24"],
])
#assigning the policy to opt, so that it can use it in simulations.
opt.setPolicy(pol)
#Setting opt to tell its pathway(s) to remember their histories
#un-needed, since we're just re-creating the pathway of interest anyway
#opt.PATHWAYS_RECORD_HISTORIES = True
opt.SILENT = True
#creating image name list
names = [[],[],[],[]]
#creating pathway with no years... this will generate the underlying landscape and set
# all the model parameters that were assigned earlier.
opt.createFireGirlPathways(1, 0, pathway_number)
#now incrementing the years
#because we start with the final year, and then skip backward showing every few landscapes,
#we may have to skip over several of the first landscapes before we start showing any
start = event_number - (step * (show_count -1))
#checking for negative numbers, in case the user has specified too many past landscapes to show
while start < 0:
start += step
#manually telling the pathway to do the first set of years
opt.pathway_set[0].doYears(start)
#get new names
timber_name = "static/timber_" + str(file_number_str()) + ".png"
fuel_name = "static/fuel_" + str(file_number_str()) + ".png"
composite_name = "static/composite_" + str(file_number_str()) + ".png"
burn_name = "static/burn_" + str(file_number_str()) + ".png"
#and save it's images
opt.pathway_set[0].saveImage(timber_name, "timber")
opt.pathway_set[0].saveImage(fuel_name, "fuel")
opt.pathway_set[0].saveImage(composite_name, "composite")
opt.pathway_set[0].saveImage(burn_name, "timber", 10)
#add these names to the lists
names[0].append(timber_name)
names[1].append(fuel_name)
names[2].append(composite_name)
names[3].append(burn_name)
#now loop through the rest of the states
for i in range(start, event_number, step): # stop before event_number: each pass advances one more step
#do the next set of years
opt.pathway_set[0].doYears(step)
#create new image filenames
timber_name = "static/timber_" + str(file_number_str()) + ".png"
fuel_name = "static/fuel_" + str(file_number_str()) + ".png"
composite_name = "static/composite_" + str(file_number_str()) + ".png"
burn_name = "static/burn_" + str(file_number_str()) + ".png"
#save the images
opt.pathway_set[0].saveImage(timber_name, "timber")
opt.pathway_set[0].saveImage(fuel_name, "fuel")
opt.pathway_set[0].saveImage(composite_name, "composite")
opt.pathway_set[0].saveImage(burn_name, "timber", 10)
#add these names to the lists
names[0].append(timber_name)
names[1].append(fuel_name)
names[2].append(composite_name)
names[3].append(burn_name)
timber_stats = pathway_summary(opt.pathway_set[0],"timber")
fuel_stats = pathway_summary(opt.pathway_set[0],"fuel")
total_growth = opt.pathway_set[0].getGrowthTotal()
total_suppression = opt.pathway_set[0].getSuppressionTotal()
total_harvest = opt.pathway_set[0].getHarvestTotal()
total_timber_loss = opt.pathway_set[0].getTimberLossTotal()
returnObj = {
"statistics": {
"Event Number": int(query["Event Number"]),
"Pathway Number": int(query["Pathway Number"]),
"Average Timber Value": int(timber_stats[0]),
"Timber Value Std.Dev.": int(timber_stats[1]),
"Average Timber Value - Center": int(timber_stats[2]),
"Timber Value Std.Dev. - Center": int(timber_stats[3]),
"Average Fuel Load": int(fuel_stats[0]),
"Fuel Load Std.Dev.": int(fuel_stats[1]),
"Average Fuel Load - Center": int(fuel_stats[2]),
"Fuel Load Std.Dev. - Center": int(fuel_stats[3]),
"Cumulative Harvest":total_harvest,
"Cumulative Suppression Cost": total_suppression,
"Cumulative Timber Loss":total_timber_loss,
"Cumulative Timber Growth":total_growth,
},
"images": names
}
return returnObj
|
Belief plays an important part in hypnosis, and in the field of medicine as well. Subconscious belief is the most powerful factor in successful living: it determines all behaviour patterns. However, our belief systems are imposed upon our unwary minds during childhood, way before our critical factors are developed enough to reject harmful ideas that we would reject at a later period in our lives.
Our subconscious beliefs determine whether we will succeed or fail, be happy or unhappy, sick or healthy, and even the length of our lives.
When the subconscious mind is convinced, it starts to act. As a hypnotherapist, my function is to convince the subconscious mind of the benefits of change. While my client is under hypnosis, the more the conscious mind recedes, the more accessible the subconscious mind becomes – but consciousness does not disappear at any level.
Hypnosis is a concentration of the mind upon a single idea, and the exclusion of all other thoughts in which the consciousness agreeably slows down to a single focus.
|
# -*- coding: utf-8 -*-
"""
example of how to plot decoded sensor data from a Crazyflie
@author: jsschell
"""
import cfusdlog
import matplotlib.pyplot as plt
import re
import argparse
parser = argparse.ArgumentParser()
parser.add_argument("filename")
args = parser.parse_args()
# decode binary log data
logData = cfusdlog.decode(args.filename)
#only focus on regular logging
logData = logData['fixedFrequency']
# set window background to white
plt.rcParams['figure.facecolor'] = 'w'
# number of columns and rows for subplot
plotCols = 1
plotRows = 1
# let's see which keys exist in the current data set
keys = ""
for k in logData:
keys += k
# get plot config from user
plotGyro = 0
if re.search('gyro', keys):
inStr = input("plot gyro data? ([Y]es / [n]o): ")
if ((re.search('^[Yy]', inStr)) or (inStr == '')):
plotGyro = 1
plotRows += 1
plotAccel = 0
if re.search('acc', keys):
inStr = input("plot accel data? ([Y]es / [n]o): ")
if ((re.search('^[Yy]', inStr)) or (inStr == '')):
plotAccel = 1
plotRows += 1
plotBaro = 0
if re.search('baro', keys):
inStr = input("plot barometer data? ([Y]es / [n]o): ")
if ((re.search('^[Yy]', inStr)) or (inStr == '')):
plotBaro = 1
plotRows += 1
plotCtrl = 0
if re.search('ctrltarget', keys):
inStr = input("plot control data? ([Y]es / [n]o): ")
if ((re.search('^[Yy]', inStr)) or (inStr == '')):
plotCtrl = 1
plotRows += 1
plotStab = 0
if re.search('stabilizer', keys):
inStr = input("plot stabilizer data? ([Y]es / [n]o): ")
if ((re.search('^[Yy]', inStr)) or (inStr == '')):
plotStab = 1
plotRows += 1
# current plot for simple subplot usage
plotCurrent = 0
# new figure
plt.figure(0)
if plotGyro:
plotCurrent += 1
plt.subplot(plotRows, plotCols, plotCurrent)
plt.plot(logData['timestamp'], logData['gyro.x'], '-', label='X')
plt.plot(logData['timestamp'], logData['gyro.y'], '-', label='Y')
plt.plot(logData['timestamp'], logData['gyro.z'], '-', label='Z')
plt.xlabel('timestamp [ms]')
plt.ylabel('Gyroscope [°/s]')
plt.legend(loc=9, ncol=3, borderaxespad=0.)
if plotAccel:
plotCurrent += 1
plt.subplot(plotRows, plotCols, plotCurrent)
plt.plot(logData['timestamp'], logData['acc.x'], '-', label='X')
plt.plot(logData['timestamp'], logData['acc.y'], '-', label='Y')
plt.plot(logData['timestamp'], logData['acc.z'], '-', label='Z')
plt.xlabel('timestamp [ms]')
plt.ylabel('Accelerometer [g]')
plt.legend(loc=9, ncol=3, borderaxespad=0.)
if plotBaro:
plotCurrent += 1
plt.subplot(plotRows, plotCols, plotCurrent)
plt.plot(logData['timestamp'], logData['baro.pressure'], '-')
plt.xlabel('timestamp [ms]')
plt.ylabel('Pressure [hPa]')
plotCurrent += 1
plt.subplot(plotRows, plotCols, plotCurrent)
plt.plot(logData['timestamp'], logData['baro.temp'], '-')
plt.xlabel('timestamp [ms]')
plt.ylabel('Temperature [degC]')
if plotCtrl:
plotCurrent += 1
plt.subplot(plotRows, plotCols, plotCurrent)
plt.plot(logData['timestamp'], logData['ctrltarget.roll'], '-', label='roll')
plt.plot(logData['timestamp'], logData['ctrltarget.pitch'], '-', label='pitch')
plt.plot(logData['timestamp'], logData['ctrltarget.yaw'], '-', label='yaw')
plt.xlabel('timestamp [ms]')
plt.ylabel('Control')
plt.legend(loc=9, ncol=3, borderaxespad=0.)
if plotStab:
plotCurrent += 1
plt.subplot(plotRows, plotCols, plotCurrent)
plt.plot(logData['timestamp'], logData['stabilizer.roll'], '-', label='roll')
plt.plot(logData['timestamp'], logData['stabilizer.pitch'], '-', label='pitch')
plt.plot(logData['timestamp'], logData['stabilizer.yaw'], '-', label='yaw')
plt.plot(logData['timestamp'], logData['stabilizer.thrust'], '-', label='thrust')
plt.xlabel('timestamp [ms]')
plt.ylabel('Stabilizer')
plt.legend(loc=9, ncol=4, borderaxespad=0.)
plt.show()
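The key scan in the script concatenates every key name into one string and substring-matches against it, which can misfire on overlapping names (for example, 'acc' would also match inside a hypothetical 'accumulator' key). A more precise group check is sketched below; the helper name `has_group` and the sample dict are illustrative, not from the original script:

```python
def has_group(log_data, prefix):
    """Return True if any logged variable belongs to the given group.

    Matches 'gyro' against 'gyro.x' but not against e.g. 'gyroscope2'.
    """
    return any(k == prefix or k.startswith(prefix + '.') for k in log_data)

# hypothetical sample of the logData dict shape used above
sample = {'timestamp': [], 'gyro.x': [], 'gyro.y': [], 'acc.z': []}
```

With this helper, `if re.search('gyro', keys):` becomes `if has_group(logData, 'gyro'):`, and the `re` import is no longer needed for key detection.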
|
Our designer furniture has a new horizon: the IMM Cologne trade fair, to be held in Germany from 18 to 20 January.
We will travel to the first international event dedicated to furniture and interior design with a selection of our contemporary design proposals.
The main product innovation that CARMENES will showcase internationally in Cologne is the new Royale sofa, which will be accompanied by other pieces from our furniture catalogue with which it shares the same subtle and sophisticated essence.
The Loft sofa and the Armand armchair by Lluís Codina will share space at the fair with the Downtown tables, the Majestic sofa and the Heritage sideboard by La Mamba Studio, which has opened a new chapter in our history devoted to upholstered and occasional furniture.
All of these pieces of furniture are evidence of the know-how, experience and craftsmanship of our company. Our innovative vocation and commitment to design have given rise to a transversal and elegant selection of designer furniture, naturally well suited to any space, whether residential or contract.
Merkato is a new food and wine tasting market in the central district of l’Eixample in the city of Valencia. It is a gastronomic project of the restaurant owner Valentín Sánchez Arrieta who has made the most of the industrial essence of the aircraft hangar that hosts it.
|
from typing import List


# Definition for singly-linked list.
class ListNode:
    def __init__(self, x):
        self.val = x
        self.next = None


class Solution:
    def mergeKLists(self, lists: List[ListNode]) -> ListNode:
        l = len(lists)
        if l == 0:
            return None
        if l == 1:
            return lists[0]
        if l == 2:
            # base case: merge two sorted lists using a dummy head
            l1, l2 = lists[0], lists[1]
            start = curr = ListNode(0)
            while l1 is not None and l2 is not None:
                if l1.val <= l2.val:
                    curr.next = l1
                    l1 = l1.next
                else:
                    curr.next = l2
                    l2 = l2.next
                curr = curr.next
            # attach whichever list still has nodes left
            curr.next = l1 if l1 is not None else l2
            return start.next
        # divide and conquer: merge each half, then merge the two results
        split_idx = l // 2
        return self.mergeKLists(
            [
                self.mergeKLists(lists[:split_idx]),
                self.mergeKLists(lists[split_idx:])
            ]
        )


def make_link_list(*vals):
    if not vals:
        return None
    start = curr = ListNode(vals[0])
    for idx in range(1, len(vals)):
        curr.next = ListNode(vals[idx])
        curr = curr.next
    return start


def display_link_list(node):
    vals = []
    while node is not None:
        vals.append(node.val)
        node = node.next
    return " -> ".join(str(val) for val in vals)


if __name__ == '__main__':
    sol = Solution()
    l1 = make_link_list(1, 4, 5)
    l2 = make_link_list(1, 3, 4)
    l3 = make_link_list(2, 6)
    print(display_link_list(l1))
    print(display_link_list(l2))
    print(display_link_list(l3))
    merged_list = sol.mergeKLists([l1, l2, l3])
    print(display_link_list(merged_list))
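The solution above merges by divide and conquer, which runs in O(n log k) for n total nodes across k lists. A common alternative keeps the current head of each list in a min-heap, also O(n log k); the sketch below is self-contained (the names `merge_k_lists_heap`, `from_values` and `to_values` are illustrative, not from the original solution):

```python
import heapq


class ListNode:
    # redefined locally so this sketch runs on its own
    def __init__(self, x):
        self.val = x
        self.next = None


def merge_k_lists_heap(lists):
    # seed the heap with the head of each non-empty list; the index i
    # breaks ties so ListNode objects themselves are never compared
    heap = [(node.val, i, node) for i, node in enumerate(lists) if node]
    heapq.heapify(heap)
    start = curr = ListNode(0)
    while heap:
        _, i, node = heapq.heappop(heap)
        curr.next = node
        curr = curr.next
        if node.next:
            heapq.heappush(heap, (node.next.val, i, node.next))
    return start.next


def from_values(*vals):
    # build a linked list from plain values
    start = curr = ListNode(0)
    for v in vals:
        curr.next = ListNode(v)
        curr = curr.next
    return start.next


def to_values(node):
    # flatten a linked list back into a Python list
    out = []
    while node:
        out.append(node.val)
        node = node.next
    return out
```

Each node is pushed and popped once, and every heap operation costs O(log k), which is where the overall bound comes from.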
|
Tough quest. Not your average tank-and-spank bosses; they all have nasty fire AoE. You're going to need a good, healthy 5-man team to take this on.
Varedis has the most HP out of all of them so he will be the toughest to fight. I'd recommend fighting him first, and if your group can kill him then the other guys will be a breeze.
Using the book takes quite a while so be prepared for that.
This quest is actually easier than it sounds; 3 of the 4 elites you need to kill are quite easy and will go down quickly.
The only one that's a bit hard is Varedis himself; I'm sure this can be 3-manned with the right classes.
Just to sum it up, all of their spells listed are by Pess, not me.
I'm sick of seeing 2-hander quest rewards.
I thought I would add my group's experience doing this quest, in case someone might find it useful. Our group make up was hunter, rogue, mage, and druid tank (me). You might notice we are lacking a dedicated healer.
As people have already mentioned, three of the four elites are not very tough. They aren't anything special. The only one that was problematic was Varedis himself.
If you have dps and a druid tank but no healer, then what you can do is clear out the staircase that leads up to Varedis. When your druid tank gets to around half health, he jumps straight down next to the foot of the staircase, which should give him enough time to heal himself.
Then just catch him back on the staircase and bring him back upstairs.
Oh, and don't forget to use the book of fel names.
Possible to solo this entire quest as a hunter.
You must take advantage of the railing on the stairs that is between Alandien and Varedis. The key feature of it is the rocky formations at one end of it that allows mobs to run up onto the railing structure and the downward stairs at the opposite end of the rocky part. When you get the demon hunter near the railing with that feature it is possible to kite them back and forth by jumping onto the rail and then down, making the demon hunter run all the way around to reach you.
Basically, the stairs that lead towards Varedis from Alandien have a railing that holds three braziers, with one end where mobs can walk onto it and another, downward staircase where you can jump up onto it; thanks to pathing limitations, mobs must run around.
Alandien is the easiest, as the railing I used is right above her: just shoot her and she's basically in position.
Netharel is out of reach of this, and I happened to luck out when there were a bunch of farmers around, to the point where I could simply kite him out the front of the ruins.
Varedis and Theras can be reached by clearing a path towards the railing you use and kiting them to it.
Edit (Patch 2.3.3): Visited the location, and it seems Blizzard changed the terrain; now both sides of the railing are pathable by mobs. It is still possible to execute the kite, but it is MUCH harder and more complicated.
I can verify the above post. All four of these mobs are soloable using the strategy of exploiting the stair pathing mechanism. Soloed by a 70 lock, 41/5/15, 1150 spell damage, ~8k hp, ~6.5k mana, used imp pet, no buffs. Died twice learning the method and mastering the jump. Died once to accidental trash pulling. Lost the tap twice learning the timing.
To clarify the above post a bit, you want to use the railing that runs north-south. It is the first railing above Alandien, to the east. It is distinguishable by a large mass of rocks on the southern end. This is the area that the quest mobs will path through to bridge the gap between the "floor" and the railing. It somewhat resembles the masses of rocks and eggs in the spider boss portion of ZG.
All of the quest mobs have a mana burn ability. I was not able to successfully avoid this by moving out of LoS. I imagine it would be very difficult if not impossible for any class except hunters and warlocks to use this strategy effectively. Also take note of the fact that Theras has a spell reflect ability that he uses every 30-45 second - it lasts for approximately 10 seconds, and resembles a blue mana shield. It does not have a cast time, but makes a distinguishable sound.
For Theras, Alandien, and Varedis, what you want to do is jump from the upper portion of the staircase onto the flat portion of the railing between the far right brazier and the middle brazier. It's difficult to describe. I found a good measure was to jump from the second stair on the upper portion of the staircase, with a healthy angle towards the railing. You're aiming right for the middle of the flat portion. If you fail mid-fight, it is possible to make a "straight" jump from the edge of the railing onto the top of the railing, but it is chancy and more difficult, especially when you're getting hit (and possibly dazed).
I strongly recommend practicing this method on some of the trash nearby before attempting it on a quest mob. There is definitely a learning curve associated with making the jump.
As a lock, remember to put your imp on stay so that he does not attempt to continue pathing, causing you to continuously lose and regain your stam buff.
It is possible to solo Netharel using this method, but you have to use a different spot. Netharel has a few points in his path that are near a short staircase leading down. If you clear the 6 mobs close to this staircase, he can be pulled away from them. Jump from the short staircase onto the railing. Netharel will path down the long staircase and around at the bottom. This path is much longer and seemingly more complex for Netharel to follow, bringing him out of LoS for 5+ seconds. It's sometimes difficult to maintain the tap. An affliction lock can handle it with some practice - it may be more difficult for any other ranged class.
Note that Varedis does have to be in LoS to begin using the book, but he does NOT have to be in LoS when the book finishes its cast. This comes into play because there is a ledge in the railing (where the middle brazier is) that can impede LoS when Varedis is on top of the railing. Knowing ahead of time that you don't have to have LoS when the book finishes can relieve some of the anxiety and you should have plenty of time to bring him out of demon form at 50%. I found that the best way to use the book was to assign it to a hotkey, wait until he was close to me on the "ground" level, then jump onto the railing and immediately use the book - he'll still be in LoS here for a moment before going out of LoS behind the railing, and you can adjust your camera so that you can see him pathing around.
If you need to try the book more than once mid-fight, remember that you can maintain the tap by casting spells even when he goes "immune" to spells and abilities.
Please download, because the livestream has a sound bug.
As with the Medallion of Karabor line, this chain starts different depending on whether you're Aldor (A) or Scryer (S). All the quests are rated level 70 - though the chain is available at level 68, it sends you to the Shadow Labyrinth to kill Blackheart, which you might want to hold off on until level 70.
Now it's time for a venture into the Shadow Labyrinth to kill Blackheart! Once that's done and his body has been looted for the book, it's back to Altruis, where again the chains diverge depending on your allegiance.
Varedis Must Be Stopped is the final part. It's a 5-man group quest that sends you to the terrace outside the Black Temple to kill four elites: Varedis, Netharel, Theras and Alandien. Look them up for co-ordinates and tips on killing them - they're pretty straightforward, really.
When they're dead it's a simple matter of returning to the questgiver to choose a prize.
For Druids, I'd recommend taking Hauberk of Karabor over Wildcaller. It's easy to get a weapon replacement like Fleshling Simulation Staff or Dreamer's Dragonstaff. Even staying with Staff of Beasts for some time won't hurt ya. But Hauberk of Karabor is IMHO the second best pre-raid and pre-heroic chest you can get. It's bested only by Shadowprowler's Chestguard, which will take more time and money to grind: you'll either pay a hell of a lot for the Bolts of Soulcloth or be doing Karazhan for the Soul Essences before you get it.
For those who don't need anything from the rewards, http://www.wowhead.com/?item=31010 is the most expensive!
Soloed with a level 70 warlock. Just kite on top of the stairs at 71,50 and put your pet on Varedis after he changes form, to buy some time to cast the book. Basically similar to the others apart from that.
Oh, and make sure you clear the mobs on the stairs just before you pull Varedis; it's gonna take a bit of time.
I'm at level 68 and Exarch Onaala doesn't have this quest. Maybe the minimum is actually higher?
Hm... tried to solo it today, but that b*tch ran the other way. Well, that's not the problem; the main problem is mana burn. I can't kill him without mana.
So I'm not sure; this is probably soloable for resto druids, but it will take ages.
And adds will spawn... so no idea.
Varedis was easily 3-manned by an enhancement shaman, a holy priest, and an arcane mage. The shammy used his elemental to tank and Varedis was downed quickly.
The shammy left, and I (the arcane mage) tanked/dps'd the other 3 elites in BG gear (nothing special... 10k/10k with 220 resilience) while the priest healed. Took a lil' bit, but do-able.
You probably haven't done all the prereqs.
Soloed Netharel, Theras and Alandien.
Had to ask a guildie for help with Varedis and 2-manned it.
Holy paladin: judge Wisdom, Seal of Righteousness for judging, Holy Shock and Consecration until oom (I was oom at around 60%). Just keep yourself up with the little mana you get while auto-attacking. Mana Tap x3 and Arcane Torrent just after a mana burn to do some damage. Didn't use the tome.
Soloed Varedis as a protection pally; tough fight though, took over 10 minutes.
*Keep LoH off cooldown, in case of accident.
1. Lure 3-4 initiates with you when starting.
2. Move around the entire fight to lure more initiates into it; try to avoid caster mobs.
4. Leave your bubble until 50%, because you will need a full 10-second cast time to use the Book of Fel Names.
5. Finish the remaining half, same as the first half.
Key to success: always keep moving, both to get more adds and to minimize damage taken from flame wave and flame attacks.
I wiped 5 times on this guy; I tried again and again because on the first try I already got him down to 20k.
Some suggestions were very helpful, but a few confused me a little, so just to be sure: use the book when he transforms. I died the first time because I thought a valid tactic could be to not use it, but he becomes immune to magic when he transforms. All my nice DK abilities missed him completely and my army of skellies only stood around looking stupid while he was in demon form.
So for people who, like me, want to solo him at level 80: whack him to half health, use the book and whack some more. At level 80 none of his abilities hit very hard, even the flame thingy can be ignored, and he is very simple to kill then.
Soloed as lvl 85 hunter with pet. Very easy. Pet kept him occupied while using book.
For those who are looking for Netharel at 69:53: he patrols a wide area around those coords. I found him on the lower deck below the coords, so he is not standing on top of Alandien as the given coords imply.
Level 69 Guardian druid, pretty easy to solo. Obviously Wildcaller is a pretty cool staff for a bear-themed druid, so I had to go after this!
Just keep tapping on Rejuvenation and make sure your Survival Instinct and other damage-reducing moves are up as frequently as possible.
Varedis himself is definitely a little trickier than the others, but stay out of his Flame Wall, have massive amounts of patience, and it's almost as easy as pumpkin pie!
And this quest, my friend Altruis, is why I didn't choose you in Legion. You betrayed us once; you won't betray us again.
Exarch Onaala wants you to go to the ruins of Karabor and slay Alandien, Theras, Netharel and Varedis. Use the Book of Fel Names when Varedis uses Metamorphosis to weaken him. Return to Exarch Onaala with the Book of Fel Names after you've completed this task.
Completing quests for the Aldor will cause your Scryers reputation to decrease.
We must dispose of this foul artifact, but not before we use it to dispose of Varedis's powers.
We need to permanently shut down the Ruins of Karabor training grounds, which will mean killing not only Varedis but also the three night elf masters that assist him: Alandien, Theras and Netharel.
Gather a suitable force and return to the Ruins of Karabor. Follow Altruis's instructions for defeating Varedis. For all of our sakes, I hope they work.
Excellent work, <name>! Your victory over Varedis will have a tremendous impact on our war against Illidan.
You've proven yourself not just to the Aldor but to all of Outland. Allow me to destroy what remains of that horrible book.
|